Sample records for distributed simulation technology

  1. NASA Constellation Distributed Simulation Middleware Trade Study

    NASA Technical Reports Server (NTRS)

    Hasan, David; Bowman, James D.; Fisher, Nancy; Cutts, Dannie; Crues, Edwin Z.

    2008-01-01

    This paper presents the results of a trade study designed to assess three distributed simulation middleware technologies for support of the NASA Constellation Distributed Space Exploration Simulation (DSES) project and Test and Verification Distributed System Integration Laboratory (DSIL). The technologies are the High Level Architecture (HLA), the Test and Training Enabling Architecture (TENA), and an XML-based variant of Distributed Interactive Simulation (DIS-XML) coupled with the Extensible Messaging and Presence Protocol (XMPP). According to the criteria and weights determined in this study, HLA scores better than the other two for DSES as well as the DSIL.
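
    The weighted-criteria scoring that such a trade study relies on can be sketched in a few lines. The criteria, weights, and raw scores below are hypothetical placeholders, since this record does not list the study's actual values.

```python
# Weighted-sum trade study sketch. The criteria, weights, and raw scores are
# hypothetical placeholders, not the study's actual values.

CRITERIA_WEIGHTS = {
    "interoperability": 0.30,
    "performance": 0.25,
    "maturity": 0.25,
    "cost_of_adoption": 0.20,
}

# Raw scores per candidate middleware on a 1-5 scale (hypothetical).
SCORES = {
    "HLA":          {"interoperability": 5, "performance": 4, "maturity": 5, "cost_of_adoption": 3},
    "TENA":         {"interoperability": 4, "performance": 4, "maturity": 4, "cost_of_adoption": 3},
    "DIS-XML/XMPP": {"interoperability": 3, "performance": 3, "maturity": 3, "cost_of_adoption": 4},
}

def weighted_score(scores):
    """Combine raw criterion scores into a single weighted total."""
    return sum(CRITERIA_WEIGHTS[criterion] * value for criterion, value in scores.items())

if __name__ == "__main__":
    ranking = sorted(SCORES, key=lambda name: weighted_score(SCORES[name]), reverse=True)
    for name in ranking:
        print(f"{name:13s} {weighted_score(SCORES[name]):.2f}")
```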

  2. Research on Collaborative Technology in Distributed Virtual Reality System

    NASA Astrophysics Data System (ADS)

    Lei, ZhenJiang; Huang, JiJie; Li, Zhao; Wang, Lei; Cui, JiSheng; Tang, Zhi

    2018-01-01

    Distributed virtual reality technology applied to joint training simulation needs CSCW (Computer Supported Cooperative Work) terminal multicast technology for display and HLA (High Level Architecture) technology to ensure the temporal and spatial consistency of the simulation, so that collaborative display and collaborative computing can be achieved. In this paper, CSCW terminal multicast technology is used to modify and extend the HLA implementation framework. During simulation initialization, the HLA declaration and object management service interfaces are used to establish and manage the CSCW network topology, and the HLA data filtering mechanism is used to build the corresponding mesh tree for each federate. During simulation execution, a new thread and the CSCW real-time multicast interaction technology are added to the RTI, so that the RTI can also use the window message mechanism to notify the application to update the display. Through many applications to immersive simulation training for substations operating within a large power grid, the collaborative technology used in this distributed virtual reality simulation is shown to achieve a satisfactory training effect.

  3. A Distributed Simulation Facility to Support Human Factors Research in Advanced Air Transportation Technology

    NASA Technical Reports Server (NTRS)

    Amonlirdviman, Keith; Farley, Todd C.; Hansman, R. John, Jr.; Ladik, John F.; Sherer, Dana Z.

    1998-01-01

    A distributed real-time simulation of the civil air traffic environment developed to support human factors research in advanced air transportation technology is presented. The distributed environment is based on a custom simulation architecture designed for simplicity and flexibility in human experiments. Standard Internet protocols are used to create the distributed environment, linking an advanced cockpit simulator, an Air Traffic Control simulator, and a pseudo-aircraft control and simulation management station. The pseudo-aircraft control station also functions as a scenario design tool for coordinating human factors experiments. This station incorporates a pseudo-pilot interface designed to reduce workload for human operators piloting multiple aircraft simultaneously in real time. The application of this distributed simulation facility to support a study of the effect of shared information (via air-ground datalink) on pilot/controller shared situation awareness and re-route negotiation is also presented.
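
    As a rough illustration of the "standard Internet protocols" approach to linking simulators, the sketch below exchanges an aircraft state vector over UDP as JSON. The message fields, port number, and broadcast scheme are assumptions for illustration, not the facility's actual protocol.

```python
# Sketch of exchanging simulator state over standard Internet protocols
# (UDP broadcast + JSON). The message fields and port number are illustrative
# assumptions, not the facility's actual protocol.
import json
import socket

PORT = 47800  # hypothetical port

def send_state(sock, callsign, lat, lon, alt_ft):
    """Broadcast one aircraft state update to every listener on the subnet."""
    msg = json.dumps({"callsign": callsign, "lat": lat, "lon": lon, "alt_ft": alt_ft})
    sock.sendto(msg.encode(), ("255.255.255.255", PORT))

def receive_states(timeout_s=0.1):
    """Collect whatever state updates arrive within the timeout."""
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("", PORT))
    rx.settimeout(timeout_s)
    states = []
    try:
        while True:
            data, _addr = rx.recvfrom(4096)
            states.append(json.loads(data))
    except socket.timeout:
        pass
    return states

if __name__ == "__main__":
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    tx.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    send_state(tx, "N123AB", 37.62, -122.38, 12000.0)
```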

  4. Middleware Trade Study for NASA Domain

    NASA Technical Reports Server (NTRS)

    Bowman, Dan

    2007-01-01

    This presentation gives preliminary results of a trade study designed to assess three distributed simulation middleware technologies for support of the NASA Constellation Distributed Space Exploration Simulation (DSES) project and the Test and Verification Distributed System Integration Laboratory (DSIL). The technologies are the High Level Architecture (HLA), the Test and Training Enabling Architecture (TENA), and an XML-based variant of Distributed Interactive Simulation (DIS-XML) coupled with the Extensible Messaging and Presence Protocol (XMPP). According to the criteria and weights determined in this study, HLA scores better than the other two for DSES as well as the DSIL.

  5. Verification, Validation, and Accreditation Challenges of Distributed Simulation for Space Exploration Technology

    NASA Technical Reports Server (NTRS)

    Thomas, Danny; Hartway, Bobby; Hale, Joe

    2006-01-01

    Throughout its rich history, NASA has invested heavily in sophisticated simulation capabilities. These capabilities reside in NASA facilities across the country - and with partners around the world. NASA's Exploration Systems Mission Directorate (ESMD) has the opportunity to leverage these considerable investments to resolve technical questions relating to its missions. The distributed nature of the assets, both in terms of geography and organization, presents challenges to their combined and coordinated use, but precedents of geographically distributed real-time simulations exist. This paper will show how technological advances in simulation can be employed to address the issues associated with netting NASA simulation assets.

  6. Internet of Things: a possible change in the distributed modeling and simulation architecture paradigm

    NASA Astrophysics Data System (ADS)

    Riecken, Mark; Lessmann, Kurt; Schillero, David

    2016-05-01

    The Data Distribution Service (DDS) was started by the Object Management Group (OMG) in 2004. Currently, DDS is one of the contenders to support the Internet of Things (IoT) and the Industrial IoT (IIoT). DDS has also been used as a distributed simulation architecture. Given the anticipated proliferation of IoT and IIoT devices, along with the explosive growth of sensor technology, can we expect this to have an impact on the broader community of distributed simulation? If it does, what is the impact and which distributed simulation domains will be most affected? DDS shares many of the same goals and characteristics of distributed simulation, such as the need to support scale and an emphasis on Quality of Service (QoS) that can be tailored to meet the end user's needs. In addition, DDS has some built-in features such as security that are not present in traditional distributed simulation protocols. If the IoT and IIoT realize their potential application, we predict a large base of technology will be built around this distributed data paradigm, much of which could be directly beneficial to the distributed M&S community. In this paper we compare some of the perceived gaps and shortfalls of current distributed M&S technology to the emerging capabilities of DDS built around the IoT. Although some trial work has been conducted in this area, we propose a more focused examination of the potential of these new technologies and their applicability to current and future problems in distributed M&S. The Internet of Things (IoT) and its data communications mechanisms such as the Data Distribution Service (DDS) share properties in common with distributed modeling and simulation (M&S) and its protocols such as the High Level Architecture (HLA) and the Test and Training Enabling Architecture (TENA). This paper proposes a framework based on the sensor use case for how the two communities of practice (CoP) can benefit from one another and achieve greater capability in practical distributed computing.
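
    The topic-based publish/subscribe with tailorable Quality of Service that makes DDS attractive for distributed data exchange can be illustrated with a toy in-process analogue. This is not the DDS API; it only sketches the topic, history-depth QoS, and late-joiner (durability-like) behaviour under assumed semantics.

```python
# Toy in-process publish/subscribe with a QoS-like setting. This is NOT the
# DDS API; it only illustrates the topic + QoS + late-joiner idea.
from collections import deque
from dataclasses import dataclass, field
from typing import Any, Callable

@dataclass
class Topic:
    name: str
    history_depth: int = 10                      # QoS: how many samples to retain
    _history: deque = field(default_factory=deque)
    _subscribers: list = field(default_factory=list)

    def publish(self, sample: Any) -> None:
        """Retain the sample (bounded by history_depth) and push it to subscribers."""
        self._history.append(sample)
        while len(self._history) > self.history_depth:
            self._history.popleft()
        for callback in self._subscribers:
            callback(sample)

    def subscribe(self, callback: Callable[[Any], None]) -> None:
        """Register a subscriber; late joiners replay retained history (durability-like)."""
        self._subscribers.append(callback)
        for sample in self._history:
            callback(sample)

if __name__ == "__main__":
    track = Topic("sensor/track", history_depth=5)
    track.publish({"id": 1, "x": 10.0})                       # published before anyone subscribes
    track.subscribe(lambda s: print("late joiner received", s))
```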

  7. Using detailed inter-network simulation and model abstraction to investigate and evaluate joint battlespace infosphere (JBI) support technologies

    NASA Astrophysics Data System (ADS)

    Green, David M.; Dallaire, Joel D.; Reaper, Jerome H.

    2004-08-01

    The Joint Battlespace Infosphere (JBI) program is performing a technology investigation into global communications, data mining and warehousing, and data fusion technologies by focusing on techniques and methodologies that support twenty-first-century military distributed collaboration. Advancement of these technologies is vitally important if military decision makers are to have the right data, in the right format, at the right time and place to support making the right decisions within available timelines. A quantitative understanding of the individual and combined effects arising from the application of technologies within a framework is presently far too complex to evaluate at more than a cursory depth. In order to facilitate quantitative analysis under these circumstances, the Distributed Information Enterprise Modeling and Simulation (DIEMS) team was formed to apply modeling and simulation (M&S) techniques to help address JBI analysis challenges. The DIEMS team has been tasked with utilizing collaborative distributed M&S architectures to quantitatively evaluate JBI technologies and tradeoffs. This paper first presents a high level view of the DIEMS project. Once this approach has been established, a more concentrated view of the detailed communications simulation techniques used in generating the underlying support data sets is presented.

  8. An Orion/Ares I Launch and Ascent Simulation: One Segment of the Distributed Space Exploration Simulation (DSES)

    NASA Technical Reports Server (NTRS)

    Chung, Victoria I.; Crues, Edwin Z.; Blum, Mike G.; Alofs, Cathy; Busto, Juan

    2007-01-01

    This paper describes the architecture and implementation of a distributed launch and ascent simulation of NASA's Orion spacecraft and Ares I launch vehicle. This simulation is one segment of the Distributed Space Exploration Simulation (DSES) Project. The DSES project is a research and development collaboration between NASA centers which investigates technologies and processes for distributed simulation of complex space systems in support of NASA's Exploration Initiative. DSES is developing an integrated end-to-end simulation capability to support NASA development and deployment of new exploration spacecraft and missions. This paper describes the first in a collection of simulation capabilities that DSES will support.

  9. Hybrid Communication Architectures for Distributed Smart Grid Applications

    DOE PAGES

    Zhang, Jianhua; Hasandka, Adarsh; Wei, Jin; ...

    2018-04-09

    Wired and wireless communications both play an important role in the blend of communications technologies necessary to enable future smart grid communications. Hybrid networks exploit independent mediums to extend network coverage and improve performance. However, although individual technologies have been applied in simulation networks, to our knowledge only limited attention has been paid to developing a suite of hybrid communication simulation models for communications system design. Hybrid simulation models are needed to capture the mixed communication technologies and IP address mechanisms in one simulation. To close this gap, we have developed a suite of hybrid communication system simulation models to validate the critical system design criteria for a distributed solar photovoltaic (PV) communications system, including a single-trip latency of 300 ms, throughput of 9.6 kbps, and a packet loss rate of 1%. The results show that three low-power wireless personal area network (LoWPAN)-based hybrid architectures can satisfy the three performance metrics that are critical for distributed energy resource communications.

  10. Hybrid Communication Architectures for Distributed Smart Grid Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jianhua; Hasandka, Adarsh; Wei, Jin

    Wired and wireless communications both play an important role in the blend of communications technologies necessary to enable future smart grid communications. Hybrid networks exploit independent mediums to extend network coverage and improve performance. However, although individual technologies have been applied in simulation networks, to our knowledge only limited attention has been paid to developing a suite of hybrid communication simulation models for communications system design. Hybrid simulation models are needed to capture the mixed communication technologies and IP address mechanisms in one simulation. To close this gap, we have developed a suite of hybrid communication system simulation models to validate the critical system design criteria for a distributed solar photovoltaic (PV) communications system, including a single-trip latency of 300 ms, throughput of 9.6 kbps, and a packet loss rate of 1%. The results show that three low-power wireless personal area network (LoWPAN)-based hybrid architectures can satisfy the three performance metrics that are critical for distributed energy resource communications.

  11. Hardware-in-the-Loop Simulation of a Distribution System with Air Conditioners under Model Predictive Control: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sparn, Bethany F; Ruth, Mark F; Krishnamurthy, Dheepak

    Many have proposed that responsive load provided by distributed energy resources (DERs) and demand response (DR) is an option for providing flexibility to the grid, and especially to distribution feeders. However, because responsive load involves a complex interplay between tariffs and DER and DR technologies, it is challenging to test and evaluate options without negatively impacting customers. This paper describes a hardware-in-the-loop (HIL) simulation system that has been developed to reduce the cost of evaluating the impact of advanced controllers (e.g., model predictive controllers) and technologies (e.g., responsive appliances). The HIL simulation system combines large-scale software simulation with a small set of representative building equipment hardware. It is used to perform HIL simulation of a distribution feeder and the loads on it under various tariff structures. In the reported HIL simulation, loads include many simulated air conditioners and one physical air conditioner. Independent model predictive controllers manage the operation of all air conditioners under a time-of-use tariff. Results from this HIL simulation and a discussion of future development work on the system are presented.
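
    A minimal stand-in for the simulated side of such an HIL loop is sketched below: a fleet of simulated air conditioners is stepped under an assumed time-of-use price, with a simple price-aware thermostat in place of the paper's model predictive controllers. In the real system one unit would be physical hardware whose power draw is measured rather than computed; all parameters here are illustrative.

```python
# Toy stand-in for the simulated side of an HIL loop: a fleet of simulated air
# conditioners stepped under an assumed time-of-use price, with a price-aware
# thermostat in place of model predictive controllers. All parameters are
# illustrative; in the real system one unit is physical hardware whose power
# draw is measured instead of computed.
import random

TOU_PRICE = {h: (0.30 if 14 <= h < 20 else 0.10) for h in range(24)}  # $/kWh, assumed

class AirConditioner:
    def __init__(self, setpoint_c=24.0):
        self.temp_c = setpoint_c + random.uniform(-1.0, 1.0)
        self.setpoint_c = setpoint_c
        self.on = False

    def step(self, hour, outdoor_c, dt_h=0.25):
        """Advance one time step; return energy used in kWh."""
        band = 0.5 if TOU_PRICE[hour] < 0.20 else 1.5  # widen deadband when power is expensive
        if self.temp_c > self.setpoint_c + band:
            self.on = True
        elif self.temp_c < self.setpoint_c - band:
            self.on = False
        cooling_c_per_h = 4.0 if self.on else 0.0
        self.temp_c += dt_h * (0.3 * (outdoor_c - self.temp_c) - cooling_c_per_h)
        return 3.0 * dt_h if self.on else 0.0  # assumed 3 kW compressor

if __name__ == "__main__":
    fleet = [AirConditioner() for _ in range(500)]  # simulated feeder loads
    for step in range(96):  # one day in 15-minute steps
        hour = (step * 15 // 60) % 24
        feeder_kwh = sum(ac.step(hour, outdoor_c=33.0) for ac in fleet)
        print(f"t={step * 0.25:5.2f} h  price=${TOU_PRICE[hour]:.2f}/kWh  load={feeder_kwh / 0.25:6.1f} kW")
```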

  12. Survey of student attitudes towards digital simulation technologies at a dental school in China.

    PubMed

    Ren, Q; Wang, Y; Zheng, Q; Ye, L; Zhou, X D; Zhang, L L

    2017-08-01

    Digital simulation technologies have become widespread in healthcare education, especially in dentistry; these technologies include digital X-ray images, digital microscopes, virtual pathology slides and other types of simulation. This study aimed to assess students' attitudes towards digital simulation technologies at a large, top-ranked dental school in China, as well as find out how students compare the digital technologies with traditional training methods. In April 2015, a custom-designed questionnaire was distributed to a total of 389 students who had received digital technology and simulation-based training in West China Dental School during 2012-2014. Results of a cross-sectional survey show that most students accept digital simulation technology; they report that the technology is stimulating and facilitates self-directed and self-paced learning. These findings, together with the objective advantages of digital technology, suggest that digital simulation training offers significant potential for dental education, highlighting the need for further research and more widespread implementation. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  13. Comparing Natural Gas Leakage Detection Technologies Using an Open-Source "Virtual Gas Field" Simulator.

    PubMed

    Kemp, Chandler E; Ravikumar, Arvind P; Brandt, Adam R

    2016-04-19

    We present a tool for modeling the performance of methane leak detection and repair programs that can be used to evaluate the effectiveness of detection technologies and proposed mitigation policies. The tool uses a two-state Markov model to simulate the evolution of methane leakage from an artificial natural gas field. Leaks are created stochastically, drawing from the current understanding of the frequency and size distributions at production facilities. Various leak detection and repair programs can be simulated to determine the rate at which each would identify and repair leaks. Integrating the methane leakage over time enables a meaningful comparison between technologies, using both economic and environmental metrics. We simulate four existing or proposed detection technologies: flame ionization detection, manual infrared camera, automated infrared drone, and distributed detectors. Comparing these four technologies, we found that over 80% of simulated leakage could be mitigated with a positive net present value, although the maximum benefit is realized by selectively targeting larger leaks. Our results show that low-cost leak detection programs can rely on high-cost technology, as long as it is applied in a way that allows for rapid detection of large leaks. Any strategy to reduce leakage should require a careful consideration of the differences between low-cost technologies and low-cost programs.
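
    The two-state Markov idea can be sketched as follows: components flip stochastically into a leaking state with sizes drawn from a heavy-tailed distribution, and a periodic survey repairs the leaks it can detect. The rates, size distribution, survey interval, and detection threshold below are illustrative assumptions, not the paper's calibrated values.

```python
# Two-state (leaking / not leaking) Markov sketch of a component population.
# Leak-start rate, size distribution, survey interval, and detection threshold
# are illustrative assumptions, not the paper's calibrated values.
import random

P_START = 0.002          # per-day probability a non-leaking component starts leaking
SURVEY_INTERVAL = 90     # days between detection surveys
DETECT_THRESHOLD = 1.0   # g/s; smaller leaks are missed by the surveyed technology

def new_leak_size():
    """Draw a leak size in g/s from an assumed heavy-tailed (lognormal) distribution."""
    return random.lognormvariate(-1.0, 1.5)

def simulate(n_components=1000, days=365):
    """Return total methane emitted (grams) over the simulated period."""
    leak_sizes = [0.0] * n_components  # 0.0 means the component is not leaking
    total_g = 0.0
    for day in range(days):
        for i in range(n_components):
            if leak_sizes[i] == 0.0 and random.random() < P_START:
                leak_sizes[i] = new_leak_size()
            total_g += leak_sizes[i] * 86400  # grams emitted this day
        if day % SURVEY_INTERVAL == 0:
            # Survey and repair: every leak above the detection threshold is fixed.
            leak_sizes = [s if s < DETECT_THRESHOLD else 0.0 for s in leak_sizes]
    return total_g

if __name__ == "__main__":
    print(f"Emitted roughly {simulate() / 1e6:.1f} tonnes of methane in one year")
```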

  14. The Distributed Space Exploration Simulation (DSES)

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Chung, Victoria I.; Blum, Mike G.; Bowman, James D.

    2007-01-01

    The paper describes the Distributed Space Exploration Simulation (DSES) Project, a research and development collaboration between NASA centers which focuses on the investigation and development of technologies, processes, and integrated simulations related to the collaborative distributed simulation of complex space systems in support of NASA's Exploration Initiative. This paper describes the three major components of DSES: network infrastructure, software infrastructure, and simulation development. In the network work area, DSES is developing a Distributed Simulation Network that will provide agency-wide support for distributed simulation between all NASA centers. In the software work area, DSES is developing a collection of software models, tools, and procedures that eases the burden of developing distributed simulations and provides a consistent interoperability infrastructure for agency-wide participation in integrated simulation. Finally, for simulation development, DSES is developing an integrated end-to-end simulation capability to support NASA development of new exploration spacecraft and missions. This paper presents current status and plans for each of these work areas, with specific examples of simulations that support NASA's exploration initiatives.

  15. Interaction and Impact Studies for Distributed Energy Resource, Transactive Energy, and Electric Grid, using High Performance Computing-based Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelley, B. M.

    The electric utility industry is undergoing significant transformations in its operating model, including a greater emphasis on automation, monitoring technologies, and distributed energy resource management systems (DERMS). While these changes and new technologies drive greater efficiency and reliability, they may also introduce new vectors of cyber attack. The appropriate cybersecurity controls to address and mitigate these newly introduced attack vectors and potential vulnerabilities are still largely unknown, and the performance of the controls is difficult to vet. This proposal argues that modeling and simulation (M&S) is a necessary tool for addressing and better understanding these problems introduced by emerging technologies for the grid. M&S will provide electric utilities with a platform to model their transmission and distribution systems and run various simulations against the model to better understand the operational impact and performance of cybersecurity controls.

  16. An Overview of the Distributed Space Exploration Simulation (DSES) Project

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Chung, Victoria I.; Blum, Michael G.; Bowman, James D.

    2007-01-01

    This paper describes the Distributed Space Exploration Simulation (DSES) Project, a research and development collaboration between NASA centers which investigates technologies and processes related to integrated, distributed simulation of complex space systems in support of NASA's Exploration Initiative. In particular, it describes the three major components of DSES: network infrastructure, software infrastructure, and simulation development. With regard to network infrastructure, DSES is developing a Distributed Simulation Network for use by all NASA centers. With regard to software, DSES is developing software models, tools, and procedures that streamline distributed simulation development and provide an interoperable infrastructure for agency-wide integrated simulation. Finally, with regard to simulation development, DSES is developing an integrated end-to-end simulation capability to support NASA development of new exploration spacecraft and missions. This paper presents the current status and plans for these three areas, including examples of specific simulations.

  17. The Osseus platform: a prototype for advanced web-based distributed simulation

    NASA Astrophysics Data System (ADS)

    Franceschini, Derrick; Riecken, Mark

    2016-05-01

    Recent technological advances in web-based distributed computing and database technology have made possible a deeper and more transparent integration of some modeling and simulation applications. Despite these advances towards true integration of capabilities, disparate systems, architectures, and protocols will remain in the inventory for some time to come. These disparities present interoperability challenges for distributed modeling and simulation, whether the application is training, experimentation, or analysis. Traditional approaches call for building gateways to bridge between disparate protocols and retaining interoperability specialists. Challenges in reconciling data models also persist. These challenges and their traditional mitigation approaches directly contribute to higher costs, schedule delays, and frustration for the end users. Osseus is a prototype software platform originally funded as a research project by the Defense Modeling & Simulation Coordination Office (DMSCO) to examine interoperability alternatives using modern, web-based technology and taking inspiration from the commercial sector. Osseus provides tools and services that let nonexpert users connect simulations, reducing the time and specialized skillset needed to successfully connect disparate systems. The Osseus platform presents a web services interface that allows simulation applications to exchange data efficiently over local or wide area networks using modern techniques. Further, it provides service-oriented architecture capabilities such that finer-grained components, such as individual models, can contribute to a simulation with minimal effort.

  18. Optimal spinneret layout in Von Koch curves of fractal theory based needleless electrospinning process

    NASA Astrophysics Data System (ADS)

    Yang, Wenxiu; Liu, Yanbo; Zhang, Ligai; Cao, Hong; Wang, Yang; Yao, Jinbo

    2016-06-01

    Needleless electrospinning technology is considered a better avenue for producing nanofibrous materials at large scale, and electric field intensity and its distribution play an important role in controlling nanofiber diameter and the quality of the nanofibrous web during electrospinning. In the current study, a novel needleless electrospinning method based on Von Koch curves of fractal configuration was proposed, and simulation and analysis of the electric field intensity and distribution in the new electrospinning process were performed with the finite element analysis software COMSOL Multiphysics 4.4, based on linear and nonlinear Von Koch fractal curves (hereafter called fractal models). The simulation and analysis indicated that the second-level fractal structure is the optimal linear electrospinning spinneret in terms of field intensity and uniformity. Further simulation and analysis showed that the circular type of fractal spinneret has better field intensity and distribution than the spiral type in the nonlinear fractal electrospinning technology. An electrospinning apparatus with the optimal Von Koch fractal spinneret was set up to verify the theoretical analysis results from the COMSOL simulation, achieving more uniform electric field distribution and lower energy cost compared to current needle and needleless electrospinning technologies.
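
    The Von Koch geometry itself is straightforward to generate; the sketch below computes the vertices of a Koch curve at a chosen fractal level (e.g., the "second level" structure), which could then be used to lay out spinneret positions. The electric field simulation, done in COMSOL in the study, is not reproduced here.

```python
# Generate the vertices of a Von Koch curve at a chosen fractal level, which
# could be used to lay out spinneret positions along the curve. Purely
# geometric; the electric field simulation (COMSOL in the study) is not
# reproduced here.
import math

def koch_curve(p0, p1, level):
    """Return the ordered list of points of a Koch curve from p0 to p1."""
    if level == 0:
        return [p0, p1]
    (x0, y0), (x1, y1) = p0, p1
    dx, dy = (x1 - x0) / 3.0, (y1 - y0) / 3.0
    a = (x0 + dx, y0 + dy)          # one-third point
    b = (x0 + 2 * dx, y0 + 2 * dy)  # two-thirds point
    side = math.hypot(dx, dy)
    ang = math.atan2(dy, dx) + math.pi / 3.0
    apex = (a[0] + side * math.cos(ang), a[1] + side * math.sin(ang))  # apex of the bump
    points = []
    for q0, q1 in [(p0, a), (a, apex), (apex, b), (b, p1)]:
        points.extend(koch_curve(q0, q1, level - 1)[:-1])
    points.append(p1)
    return points

if __name__ == "__main__":
    second_level = koch_curve((0.0, 0.0), (9.0, 0.0), level=2)  # the "second level" structure
    print(len(second_level), "candidate spinneret points")
```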

  19. Leveraging Simulation Against the F-16 Flying Training Gap

    DTIC Science & Technology

    2005-11-01

    must leverage emerging simulation technology into combined flight training to counter mission employment complexity created by technology itself...two or more of these stand-alone simulators creates a mission training center (MTC), which when further networked create distributed mission...operations (DMO). Ultimately, the grand operational vision of DMO is to interconnect non-collocated users creating a “virtual” joint training environment

  20. Distributed collaborative environments for virtual capability-based planning

    NASA Astrophysics Data System (ADS)

    McQuay, William K.

    2003-09-01

    Distributed collaboration is an emerging technology that will significantly change how decisions are made in the 21st century. Collaboration involves two or more geographically dispersed individuals working together to share and exchange data, information, knowledge, and actions. The marriage of information, collaboration, and simulation technologies provides the decision maker with a collaborative virtual environment for planning and decision support. This paper reviews research focused on applying an open-standards, agent-based framework with integrated modeling and simulation to a new Air Force initiative in capability-based planning, and on the ability to implement it in a distributed virtual environment. The Virtual Capability Planning effort will provide decision-quality knowledge for Air Force resource allocation and investment planning, including the examination of proposed capabilities and the cost of alternative approaches, the impact of technologies, the identification of primary risk drivers, and the creation of executable acquisition strategies. The transformed Air Force business processes are enabled by iterative use of constructive and virtual modeling, simulation, and analysis together with information technology. These tools are applied collaboratively via a technical framework by all the affected stakeholders - warfighter, laboratory, product center, logistics center, test center, and primary contractor.

  1. Technologies to Increase PV Hosting Capacity in Distribution Feeders: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, Fei; Mather, Barry; Gotseff, Peter

    This paper studies the distributed photovoltaic (PV) hosting capacity in distribution feeders using a stochastic analysis approach. Multiple scenario simulations are conducted to analyze several factors that affect PV hosting capacity, including the existence of a voltage regulator, PV location, the power factor of the PV inverter, and Volt/VAR control. Based on the conclusions obtained from the simulation results, three approaches are then proposed to increase distributed PV hosting capacity, which can be formulated as an optimization problem to obtain the optimal solution. All technologies investigated in this paper utilize only existing assets in the feeder and are therefore implementable at low cost. Additionally, the tool developed for these studies is described.
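
    The stochastic hosting-capacity idea can be sketched as a Monte Carlo loop over random PV placements at each candidate penetration level, checking a voltage constraint in each scenario. The toy "feeder model" below is a crude proxy invented for illustration; the study itself uses full distribution power-flow simulations.

```python
# Monte Carlo sketch of stochastic hosting-capacity analysis: for each
# candidate PV penetration level, sample many random PV placements and check a
# proxy voltage constraint. The "feeder model" is a crude illustration; the
# study uses full distribution power-flow simulations.
import random

N_BUSES = 50
V_LIMIT = 1.05     # per-unit upper voltage limit
SCENARIOS = 200    # random placements tried per penetration level

def max_feeder_voltage(pv_kw_by_bus):
    """Crude proxy: voltage rise grows with PV power and electrical distance."""
    return 1.0 + sum(kw * (bus / N_BUSES) * 1e-5 for bus, kw in pv_kw_by_bus.items())

def violation_probability(total_pv_kw):
    violations = 0
    for _ in range(SCENARIOS):
        buses = random.sample(range(N_BUSES), k=10)  # random PV locations
        placement = {bus: total_pv_kw / len(buses) for bus in buses}
        if max_feeder_voltage(placement) > V_LIMIT:
            violations += 1
    return violations / SCENARIOS

if __name__ == "__main__":
    for pv_kw in range(0, 10001, 1000):
        print(f"{pv_kw:6d} kW -> {violation_probability(pv_kw):.0%} of scenarios violate the limit")
```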

  2. Technologies to Increase PV Hosting Capacity in Distribution Feeders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, Fei; Mather, Barry; Gotseff, Peter

    This paper studies the distributed photovoltaic (PV) hosting capacity in distribution feeders using a stochastic analysis approach. Multiple scenario simulations are conducted to analyze several factors that affect PV hosting capacity, including the existence of a voltage regulator, PV location, the power factor of the PV inverter, and Volt/VAR control. Based on the conclusions obtained from the simulation results, three approaches are then proposed to increase distributed PV hosting capacity, which can be formulated as an optimization problem to obtain the optimal solution. All technologies investigated in this paper utilize only existing assets in the feeder and are therefore implementable at low cost. Additionally, the tool developed for these studies is described.

  3. Implementation of quantum key distribution network simulation module in the network simulator NS-3

    NASA Astrophysics Data System (ADS)

    Mehic, Miralem; Maurhart, Oliver; Rass, Stefan; Voznak, Miroslav

    2017-10-01

    As research in quantum key distribution (QKD) technology grows larger and more complex, highly accurate and scalable simulation technologies become important for assessing practical feasibility and foreseeing difficulties in the practical implementation of theoretical achievements. Because a QKD link requires both an optical and an Internet connection between the network nodes, deploying a complete testbed containing multiple network hosts and links to validate and verify a certain network algorithm or protocol would be very costly. Network simulators in these circumstances save vast amounts of money and time in accomplishing such a task. The simulation environment offers the creation of complex network topologies, a high degree of control, and repeatable experiments, which in turn allows researchers to conduct experiments and confirm their results. In this paper, we describe the design of the QKD network simulation module, which was developed in the network simulator version 3 (NS-3). The module supports simulation of a QKD network in an overlay mode or in a single TCP/IP mode. It can therefore also be used to simulate other network technologies regardless of QKD.

  4. Development of a Design Tool for Planning Aqueous Amendment Injection Systems Permanganate Design Tool

    DTIC Science & Technology

    2010-08-01

    CSTR, continuously stirred tank reactors; CT, contact time; EDB, ethylene dibromide; ESTCP, Environmental Security Technology Certification Program ... 6.2 Simulating Oxidant Distribution Using a Series of CSTRs ... 6.2.1 Model Development: The transport and consumption of permanganate are simulated within the ...

  5. Implementation of Parallel Dynamic Simulation on Shared-Memory vs. Distributed-Memory Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Shuangshuang; Chen, Yousu; Wu, Di

    2015-12-09

    Power system dynamic simulation computes the system response to a sequence of large disturbances, such as sudden changes in generation or load, or a network short circuit followed by a protective branch switching operation. It consists of a large set of differential and algebraic equations, which is computationally intensive and challenging to solve using a single-processor-based dynamic simulation solution. High-performance computing (HPC) based parallel computing is a very promising technology to speed up the computation and facilitate the simulation process. This paper presents two different parallel implementations of power grid dynamic simulation using Open Multi-Processing (OpenMP) on a shared-memory platform and Message Passing Interface (MPI) on distributed-memory clusters, respectively. The differences between the parallel simulation algorithms and architectures of the two HPC technologies are illustrated, and their performance for running parallel dynamic simulation is compared and demonstrated.
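
    The distributed-memory (MPI) pattern the paper describes can be sketched with mpi4py: each rank advances its own block of state variables and the full state is exchanged every step. This is a toy ODE with global coupling, not the paper's differential-algebraic power system solver.

```python
# Distributed-memory pattern sketched with mpi4py (run with: mpiexec -n 4 python demo.py).
# Each rank integrates its own block of state variables and the full state is
# exchanged every step. This is a toy ODE with global coupling, not the
# paper's differential-algebraic power system solver.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N = 1000                                  # total number of state variables
local = np.full(N // size, float(rank))   # this rank's block of the state vector
dt = 0.01

for step in range(100):
    # Gather the full state so every rank can evaluate its coupled dynamics.
    full_state = np.concatenate(comm.allgather(local))
    coupling = full_state.mean()
    local = local + dt * (-local + 0.1 * coupling)  # toy dynamics with global coupling

if rank == 0:
    print(f"finished {step + 1} steps on {size} ranks")
```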

  6. Distributed Generation Market Demand Model (dGen): Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sigrin, Benjamin; Gleason, Michael; Preus, Robert

    The Distributed Generation Market Demand model (dGen) is a geospatially rich, bottom-up, market-penetration model that simulates the potential adoption of distributed energy resources (DERs) for residential, commercial, and industrial entities in the continental United States through 2050. The National Renewable Energy Laboratory (NREL) developed dGen to analyze the key factors that will affect future market demand for distributed solar, wind, storage, and other DER technologies in the United States. The new model builds off, extends, and replaces NREL's SolarDS model (Denholm et al. 2009a), which simulates the market penetration of distributed PV only. Unlike the SolarDS model, dGen can model various DER technologies under one platform--it currently can simulate the adoption of distributed solar (the dSolar module) and distributed wind (the dWind module) and link with the ReEDS capacity expansion model (Appendix C). The underlying algorithms and datasets in dGen, which improve the representation of customer decision making as well as the spatial resolution of analyses (Figure ES-1), are also improvements over SolarDS.

  7. Massively Parallel Processing for Fast and Accurate Stamping Simulations

    NASA Astrophysics Data System (ADS)

    Gress, Jeffrey J.; Xu, Siguang; Joshi, Ramesh; Wang, Chuan-tao; Paul, Sabu

    2005-08-01

    The competitive automotive market drives automotive manufacturers to speed up vehicle development cycles and reduce lead time. Fast tooling development is one of the key areas supporting fast, short vehicle development programs (VDPs). In the past ten years, stamping simulation has become the most effective validation tool for predicting and resolving all potential formability and quality problems before the dies are physically made. Stamping simulation and formability analysis have become a critical business segment in GM's math-based die engineering process. As simulation becomes one of the major production tools in the engineering factory, simulation speed and accuracy are two of the most important measures of stamping simulation technology. The speed and time-in-system of forming analysis become even more critical to supporting fast VDPs and tooling readiness. Since 1997, the General Motors Die Center has been working jointly with our software vendor to develop and implement a parallel version of simulation software for mass production analysis applications. By 2001, this technology had matured in the form of distributed memory processing (DMP) of draw die simulations in a networked distributed-memory computing environment. In 2004, this technology was refined to massively parallel processing (MPP) and extended to line die forming analysis (draw, trim, flange, and associated spring-back) running on a dedicated computing environment. The evolution of this technology and the insight gained through the implementation of DMP/MPP technology, as well as performance benchmarks, are discussed in this publication.

  8. Design Tool for Planning Permanganate Injection Systems

    DTIC Science & Technology

    2010-08-01

    Chemical Spill 10; CSTR, continuously stirred tank reactors; CT, contact time; EDB, ethylene dibromide; ESTCP, Environmental Security Technology Certification Program ... 6.2 Simulating Oxidant Distribution Using a Series of CSTRs ... ER-0625 ... 6.2.1 Model Development: The transport and consumption of permanganate ...

  9. Injection of Contaminants into a Simulated Water Distribution System Equipped with Continuous Multi-Parameter Water Monitors

    EPA Science Inventory

    The U.S. EPA’s Technology Testing and Evaluation Program has been charged by EPA to evaluate the performance of commercially available water security-related technologies. Multi-parameter water monitors for distribution systems have been evaluated as such a water security techn...

  10. Simulated and measured neutron/gamma light output distribution for poly-energetic neutron/gamma sources

    NASA Astrophysics Data System (ADS)

    Hosseini, S. A.; Zangian, M.; Aghabozorgi, S.

    2018-03-01

    In the present paper, the light output distribution due to a poly-energetic neutron/gamma (neutron or gamma) source was calculated using the developed MCNPX-ESUT-PE (MCNPX-Energy engineering of Sharif University of Technology-Poly Energetic version) computational code. The simulation of the light output distribution includes modeling of the particle transport, calculation of the scintillation photons induced by charged particles, simulation of the scintillation photon transport, and consideration of the light resolution obtained from experiment. The developed computational code is able to simulate the light output distribution due to any neutron/gamma source. In the experimental step of the present study, neutron-gamma discrimination based on the light output distribution was performed using the zero-crossing method. As a case study, a 241Am-9Be source was considered and the simulated and measured neutron/gamma light output distributions were compared. There is an acceptable agreement between the discriminated neutron/gamma light output distributions obtained from the simulation and the experiment.

  11. Technology Development Risk Assessment for Space Transportation Systems

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Godsell, Aga M.; Go, Susie

    2006-01-01

    A new approach for assessing the development risk associated with technology development projects is presented. The method represents technology evolution in terms of sector-specific discrete development stages. A Monte Carlo simulation is used to generate development probability distributions based on statistical models of the discrete transitions. Development risk is derived from the resulting probability distributions and specific program requirements. Two sample cases are discussed to illustrate the approach: a single rocket engine development and a three-technology space transportation portfolio.
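
    A minimal version of the Monte Carlo step is sketched below: each development stage is given an uncertain duration, repeated sampling yields a completion-time distribution, and risk is read off against a program deadline. The stage names and triangular duration parameters are hypothetical stand-ins for the sector-specific statistical transition models the paper uses.

```python
# Monte Carlo sketch: each development stage gets an uncertain duration,
# repeated sampling builds a completion-time distribution, and risk is read
# off against a program deadline. Stage names and triangular parameters are
# hypothetical stand-ins for the sector-specific transition statistics.
import random

STAGES = [  # (name, min_years, most_likely_years, max_years)
    ("component test", 0.5, 1.0, 2.0),
    ("subsystem demo", 1.0, 2.0, 4.0),
    ("system demo",    1.0, 2.5, 5.0),
    ("flight qual",    0.5, 1.5, 3.0),
]

def sample_development_time():
    """Draw one end-to-end development duration in years."""
    return sum(random.triangular(low, high, mode) for _, low, mode, high in STAGES)

def completion_risk(deadline_years, trials=20000):
    """Probability the technology is NOT ready by the deadline."""
    late = sum(1 for _ in range(trials) if sample_development_time() > deadline_years)
    return late / trials

if __name__ == "__main__":
    for deadline in (5, 7, 9):
        print(f"P(late | {deadline}-year program) = {completion_risk(deadline):.1%}")
```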

  12. Software To Secure Distributed Propulsion Simulations

    NASA Technical Reports Server (NTRS)

    Blaser, Tammy M.

    2003-01-01

    Distributed-object computing systems are presented with many security threats, including network eavesdropping, message tampering, and communications middleware masquerading. NASA Glenn Research Center and its industry partners have taken an active role in mitigating the security threats associated with developing and operating their proprietary aerospace propulsion simulations. In particular, they are developing a collaborative Common Object Request Broker Architecture (CORBA) Security (CORBASec) test bed to secure their distributed aerospace propulsion simulations. Glenn has been working with its aerospace propulsion industry partners to deploy the Numerical Propulsion System Simulation (NPSS) object-based technology. NPSS is a program focused on reducing the cost and time of developing aerospace propulsion engines.

  13. The research of distributed interactive simulation based on HLA in coal mine industry inherent safety

    NASA Astrophysics Data System (ADS)

    Dou, Zhi-Wu

    2010-08-01

    To solve the inherent safety problem puzzling the coal mining industry, and based on an analysis of the characteristics and applications of distributed interactive simulation based on the High Level Architecture (DIS/HLA), a new method is proposed for developing coal mining industry inherent safety distributed interactive simulation using HLA technology. After researching the function and structure of the system, a simple coal mining industry inherent safety scenario is modeled with HLA, the FOM and SOM are developed, and the mathematical models are proposed. The results of the case study show that HLA plays an important role in developing distributed interactive simulation of complicated distributed systems and that the method is valid for solving the problem puzzling the coal mining industry. For the coal mining industry, the conclusions show that a simulation system based on HLA plays an important role in identifying sources of hazard, developing measures against accidents, and improving the level of management.

  14. Multi-KW dc distribution system technology research study

    NASA Technical Reports Server (NTRS)

    Dawson, S. G.

    1978-01-01

    The Multi-KW DC Distribution System Technology Research Study is the third phase of the NASA/MSFC study program. The purpose of this contract was to complete the design of the integrated technology test facility, provide test planning, support test operations and evaluate test results. The subject of this study is a continuation of this contract. The purpose of this continuation is to study and analyze high voltage system safety, to determine optimum voltage levels versus power, to identify power distribution system components which require development for higher voltage systems and finally to determine what modifications must be made to the Power Distribution System Simulator (PDSS) to demonstrate 300 Vdc distribution capability.

  15. Beowulf Distributed Processing and the United States Geological Survey

    USGS Publications Warehouse

    Maddox, Brian G.

    2002-01-01

    Introduction In recent years, the United States Geological Survey's (USGS) National Mapping Discipline (NMD) has expanded its scientific and research activities. Work is being conducted in areas such as emergency response research, scientific visualization, urban prediction, and other simulation activities. Custom-produced digital data have become essential for these types of activities. High-resolution, remotely sensed datasets are also seeing increased use. Unfortunately, the NMD is also finding that it lacks the resources required to perform some of these activities. Many of these projects require large amounts of computer processing resources. Complex urban-prediction simulations, for example, involve large amounts of processor-intensive calculations on large amounts of input data. This project was undertaken to learn and understand the concepts of distributed processing. Experience was needed in developing these types of applications. The idea was that this type of technology could significantly aid the needs of the NMD scientific and research programs. Porting a numerically intensive application currently being used by an NMD science program to run in a distributed fashion would demonstrate the usefulness of this technology. There are several benefits that this type of technology can bring to the USGS's research programs. Projects can be performed that were previously impossible due to a lack of computing resources. Other projects can be performed on a larger scale than previously possible. For example, distributed processing can enable urban dynamics research to perform simulations on larger areas without making huge sacrifices in resolution. The processing can also be done in a more reasonable amount of time than with traditional single-threaded methods (a scaled version of Chester County, Pennsylvania, took about fifty days to finish its first calibration phase with a single-threaded program). This paper has several goals regarding distributed processing technology. It will describe the benefits of the technology. Real data about a distributed application will be presented as an example of the benefits that this technology can bring to USGS scientific programs. Finally, some of the issues with distributed processing that relate to USGS work will be discussed.

  16. Distributed Observer Network

    NASA Technical Reports Server (NTRS)

    2008-01-01

    NASA's advanced visual simulations are essential for analyses associated with life cycle planning, design, training, testing, operations, and evaluation. Kennedy Space Center, in particular, uses simulations for ground services and space exploration planning in an effort to reduce risk and costs while improving safety and performance. However, it has been difficult to circulate and share the results of simulation tools among the field centers, and distance and travel expenses have made timely collaboration even harder. In response, NASA joined with Valador Inc. to develop the Distributed Observer Network (DON), a collaborative environment that leverages game technology to bring 3-D simulations to conventional desktop and laptop computers. DON enables teams of engineers working on design and operations to view and collaborate on 3-D representations of data generated by authoritative tools. DON takes models and telemetry from these sources and, using commercial game engine technology, displays the simulation results in a 3-D visual environment. Multiple widely dispersed users, working individually or in groups, can view and analyze simulation results on desktop and laptop computers in real time.

  17. Research on Intelligent Synthesis Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Lobeck, William E.

    2002-01-01

    Four research activities related to Intelligent Synthesis Environment (ISE) have been performed under this grant. The four activities are: 1) non-deterministic approaches that incorporate technologies such as intelligent software agents, visual simulations and other ISE technologies; 2) virtual labs that leverage modeling, simulation and information technologies to create an immersive, highly interactive virtual environment tailored to the needs of researchers and learners; 3) advanced learning modules that incorporate advanced instructional, user interface and intelligent agent technologies; and 4) assessment and continuous improvement of engineering team effectiveness in distributed collaborative environments.

  18. Research on Intelligent Synthesis Environments

    NASA Astrophysics Data System (ADS)

    Noor, Ahmed K.; Loftin, R. Bowen

    2002-12-01

    Four research activities related to Intelligent Synthesis Environment (ISE) have been performed under this grant. The four activities are: 1) non-deterministic approaches that incorporate technologies such as intelligent software agents, visual simulations and other ISE technologies; 2) virtual labs that leverage modeling, simulation and information technologies to create an immersive, highly interactive virtual environment tailored to the needs of researchers and learners; 3) advanced learning modules that incorporate advanced instructional, user interface and intelligent agent technologies; and 4) assessment and continuous improvement of engineering team effectiveness in distributed collaborative environments.

  19. A simulation-based efficiency comparison of AC and DC power distribution networks in commercial buildings

    DOE PAGES

    Gerber, Daniel L.; Vossos, Vagelis; Feng, Wei; ...

    2017-06-12

    Direct current (DC) power distribution has recently gained traction in buildings research due to the proliferation of on-site electricity generation and battery storage, and an increasing prevalence of internal DC loads. The research discussed in this paper uses Modelica-based simulation to compare the efficiency of DC building power distribution with an equivalent alternating current (AC) distribution. The buildings are all modeled with solar generation, battery storage, and loads that are representative of the most efficient building technology. A variety of parametric simulations determine how and when DC distribution proves advantageous. These simulations also validate previous studies that use simpler approaches and arithmetic efficiency models. This work shows that using DC distribution can be considerably more efficient: a medium-sized office building using DC distribution has an expected baseline of 12% savings, but may also save up to 18%. In these results, the baseline simulation parameters are for a zero net energy (ZNE) building that can island as a microgrid. DC is most advantageous in buildings with large solar capacity, large battery capacity, and high voltage distribution.
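
    The arithmetic efficiency models that these simulations are compared against amount to multiplying the conversion-stage efficiencies along each delivery path, as in the sketch below. The stage efficiencies are assumed round numbers, not values from the Modelica model.

```python
# Back-of-the-envelope efficiency-chain comparison: multiply the conversion
# stage efficiencies along each delivery path from PV to a DC load. The stage
# efficiencies are assumed round numbers, not values from the Modelica model.
AC_PATH = {            # PV -> inverter -> AC wiring -> load power supply
    "pv_inverter": 0.96,
    "ac_wiring": 0.99,
    "load_rectifier": 0.92,
}
DC_PATH = {            # PV -> DC/DC converter -> DC wiring -> load converter
    "pv_dcdc": 0.98,
    "dc_wiring": 0.99,
    "load_dcdc": 0.96,
}

def path_efficiency(stages):
    """Overall efficiency of a chain of conversion stages."""
    efficiency = 1.0
    for stage_eff in stages.values():
        efficiency *= stage_eff
    return efficiency

if __name__ == "__main__":
    ac, dc = path_efficiency(AC_PATH), path_efficiency(DC_PATH)
    print(f"AC path {ac:.1%}, DC path {dc:.1%}, relative savings {(dc - ac) / ac:.1%}")
```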

  20. A simulation-based efficiency comparison of AC and DC power distribution networks in commercial buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Daniel L.; Vossos, Vagelis; Feng, Wei

    Direct current (DC) power distribution has recently gained traction in buildings research due to the proliferation of on-site electricity generation and battery storage, and an increasing prevalence of internal DC loads. The research discussed in this paper uses Modelica-based simulation to compare the efficiency of DC building power distribution with an equivalent alternating current (AC) distribution. The buildings are all modeled with solar generation, battery storage, and loads that are representative of the most efficient building technology. A variety of parametric simulations determine how and when DC distribution proves advantageous. These simulations also validate previous studies that use simpler approaches and arithmetic efficiency models. This work shows that using DC distribution can be considerably more efficient: a medium-sized office building using DC distribution has an expected baseline of 12% savings, but may also save up to 18%. In these results, the baseline simulation parameters are for a zero net energy (ZNE) building that can island as a microgrid. DC is most advantageous in buildings with large solar capacity, large battery capacity, and high voltage distribution.

  1. CROSS-DISCIPLINARY PHYSICS AND RELATED AREAS OF SCIENCE AND TECHNOLOGY: Analysis and finite element simulation of electromagnetic heating in the nitride MOCVD reactor

    NASA Astrophysics Data System (ADS)

    Li, Zhi-Ming; Hao, Yue; Zhang, Jin-Cheng; Xu, Sheng-Rui; Ni, Jin-Yu; Zhou, Xiao-Wei

    2009-11-01

    Electromagnetic field distribution in the vertical metal organic chemical vapour deposition (MOCVD) reactor is simulated by using the finite element method (FEM). The effects of alternating current frequency, intensity, coil turn number and the distance between the coil turns on the distribution of the Joule heat are analysed separately, and their relations to the value of Joule heat are also investigated. The temperature distribution on the susceptor is also obtained. It is observed that the results of the simulation are in good agreement with previous measurements.

  2. Cryospheric Research in China

    DTIC Science & Technology

    2015-03-30

    marine monitoring for environment and security, using satellite Earth observation technologies), the WCRP/CliC Project (an international cooperative...BIOME4) to simulate the responses of biome distribution to future climate change in China. The simulation results suggest that regional climate

  3. Control and Communication for a Secure and Reconfigurable Power Distribution System

    NASA Astrophysics Data System (ADS)

    Giacomoni, Anthony Michael

    A major transformation is taking place throughout the electric power industry to overlay existing electric infrastructure with advanced sensing, communications, and control system technologies. This transformation to a smart grid promises to enhance system efficiency, increase system reliability, support the electrification of transportation, and provide customers with greater control over their electricity consumption. Upgrading control and communication systems for the end-to-end electric power grid, however, will present many new security challenges that must be dealt with before extensive deployment and implementation of these technologies can begin. In this dissertation, a comprehensive systems approach is taken to minimize and prevent cyber-physical disturbances to electric power distribution systems using sensing, communications, and control system technologies. To accomplish this task, an intelligent distributed secure control (IDSC) architecture is presented and validated in silico for distribution systems to provide greater adaptive protection, with the ability to proactively reconfigure, and rapidly respond to disturbances. Detailed descriptions of functionalities at each layer of the architecture as well as the whole system are provided. To compare the performance of the IDSC architecture with that of other control architectures, an original simulation methodology is developed. The simulation model integrates aspects of cyber-physical security, dynamic price and demand response, sensing, communications, intermittent distributed energy resources (DERs), and dynamic optimization and reconfiguration. Applying this comprehensive systems approach, performance results for the IEEE 123 node test feeder are simulated and analyzed. The results show the trade-offs between system reliability, operational constraints, and costs for several control architectures and optimization algorithms. Additional simulation results are also provided. In particular, the advantages of an IDSC architecture are highlighted when an intermittent DER is present on the system.

  4. Architecture for distributed design and fabrication

    NASA Astrophysics Data System (ADS)

    McIlrath, Michael B.; Boning, Duane S.; Troxel, Donald E.

    1997-01-01

    We describe a flexible, distributed system architecture capable of supporting collaborative design and fabrication of semi-conductor devices and integrated circuits. Such capabilities are of particular importance in the development of new technologies, where both equipment and expertise are limited. Distributed fabrication enables direct, remote, physical experimentation in the development of leading edge technology, where the necessary manufacturing resources are new, expensive, and scarce. Computational resources, software, processing equipment, and people may all be widely distributed; their effective integration is essential in order to achieve the realization of new technologies for specific product requirements. Our architecture leverages current vendor and consortia developments to define software interfaces and infrastructure based on existing and emerging networking, CIM, and CAD standards. Process engineers and product designers access processing and simulation results through a common interface and collaborate across the distributed manufacturing environment.

  5. High performance real-time flight simulation at NASA Langley

    NASA Technical Reports Server (NTRS)

    Cleveland, Jeff I., II

    1994-01-01

    In order to meet the stringent time-critical requirements for real-time man-in-the-loop flight simulation, computer processing operations must be deterministic and be completed in as short a time as possible. This includes simulation mathematical model computation and data input/output to the simulators. In 1986, in response to increased demands for flight simulation performance, personnel at NASA's Langley Research Center (LaRC), working with the contractor, developed extensions to a standard input/output system to provide for high bandwidth, low latency data acquisition and distribution. The Computer Automated Measurement and Control technology (IEEE standard 595) was extended to meet the performance requirements for real-time simulation. This technology extension increased the effective bandwidth by a factor of ten and increased the performance of modules necessary for simulator communications. This technology is being used by more than 80 leading technological developers in the United States, Canada, and Europe. Included among the commercial applications of this technology are nuclear process control, power grid analysis, process monitoring, real-time simulation, and radar data acquisition. Personnel at LaRC have completed the development of the use of supercomputers for simulation mathematical model computation to support real-time flight simulation. This includes the development of a real-time operating system and the development of specialized software and hardware for the CAMAC simulator network. This work, coupled with the use of an open systems software architecture, has advanced the state of the art in real-time flight simulation. The data acquisition technology innovation and experience with recent developments in this technology are described.

  6. Distributed Observer Network (DON), Version 3.0, User's Guide

    NASA Technical Reports Server (NTRS)

    Mazzone, Rebecca A.; Conroy, Michael P.

    2015-01-01

    The Distributed Observer Network (DON) is a data presentation tool developed by the National Aeronautics and Space Administration (NASA) to distribute and publish simulation results. Leveraging the display capabilities inherent in modern gaming technology, DON places users in a fully navigable 3-D environment containing graphical models and allows the users to observe how those models evolve and interact over time in a given scenario. Each scenario is driven with data that has been generated by authoritative NASA simulation tools and exported in accordance with a published data interface specification. This decoupling of the data from the source tool enables DON to faithfully display a simulator's results and ensure that every simulation stakeholder will view the exact same information every time.

  7. Immersive Simulations for Smart Classrooms: Exploring Evolutionary Concepts in Secondary Science

    ERIC Educational Resources Information Center

    Lui, Michelle; Slotta, James D.

    2014-01-01

    This article presents the design of an immersive simulation and inquiry activity for technology-enhanced classrooms. Using a co-design method, researchers worked with a high school biology teacher to create a rainforest simulation, distributed across several large displays in the room to immerse students in the environment. The authors created and…

  8. Mission Simulation Facility: Simulation Support for Autonomy Development

    NASA Technical Reports Server (NTRS)

    Pisanich, Greg; Plice, Laura; Neukom, Christian; Flueckiger, Lorenzo; Wagner, Michael

    2003-01-01

    The Mission Simulation Facility (MSF) supports research in autonomy technology for planetary exploration vehicles. Using HLA (High Level Architecture) across distributed computers, the MSF connects users' autonomy algorithms with provided or third-party simulations of robotic vehicles and planetary surface environments, including onboard components and scientific instruments. Simulation fidelity is variable to meet changing needs as autonomy technology advances in Technology Readiness Level (TRL). A virtual robot operating in a virtual environment offers numerous advantages over actual hardware, including availability, simplicity, and risk mitigation. The MSF is in use by researchers at NASA Ames Research Center (ARC) and has demonstrated basic functionality. Continuing work will support the needs of a broader user base.

  9. Connecting Research to Teaching: Using Data to Motivate the Use of Empirical Sampling Distributions

    ERIC Educational Resources Information Center

    Lee, Hollylynne S.; Starling, Tina T.; Gonzalez, Marggie D.

    2014-01-01

    Research shows that students often struggle with understanding empirical sampling distributions. Using hands-on and technology-based models and simulations of problems generated by real data helps students begin to make connections between repeated sampling, sample size, distribution, variation, and center. A task to assist teachers in implementing…

  10. A better sequence-read simulator program for metagenomics.

    PubMed

    Johnson, Stephen; Trost, Brett; Long, Jeffrey R; Pittet, Vanessa; Kusalik, Anthony

    2014-01-01

    There are many programs available for generating simulated whole-genome shotgun sequence reads. The data generated by many of these programs follow predefined models, which limits their use to the authors' original intentions. For example, many models assume that read lengths follow a uniform or normal distribution. Other programs generate models from actual sequencing data, but are limited to reads from single-genome studies. To our knowledge, there are no programs that allow a user to generate simulated data following non-parametric read-length distributions and quality profiles based on empirically-derived information from metagenomics sequencing data. We present BEAR (Better Emulation for Artificial Reads), a program that uses a machine-learning approach to generate reads with lengths and quality values that closely match empirically-derived distributions. BEAR can emulate reads from various sequencing platforms, including Illumina, 454, and Ion Torrent. BEAR requires minimal user input, as it automatically determines appropriate parameter settings from user-supplied data. BEAR also uses a unique method for deriving run-specific error rates, and extracts useful statistics from the metagenomic data itself, such as quality-error models. Many existing simulators are specific to a particular sequencing technology; however, BEAR is not restricted in this way. Because of its flexibility, BEAR is particularly useful for emulating the behaviour of technologies like Ion Torrent, for which no dedicated sequencing simulators are currently available. BEAR is also the first metagenomic sequencing simulator program that automates the process of generating abundances, which can be an arduous task. BEAR is useful for evaluating data processing tools in genomics. It has many advantages over existing comparable software, such as generating more realistic reads and being independent of sequencing technology, and has features particularly useful for metagenomics work.
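
    As a rough illustration of the non-parametric read-length idea described above (not BEAR's actual implementation), the sketch below resamples read lengths directly from an empirical length distribution; the observed lengths and function names are hypothetical.

```python
import numpy as np

def empirical_length_sampler(observed_lengths, rng=None):
    """Build a sampler that draws read lengths from the empirical
    (non-parametric) distribution of previously observed lengths."""
    rng = rng or np.random.default_rng()
    lengths, counts = np.unique(np.asarray(observed_lengths), return_counts=True)
    probs = counts / counts.sum()
    return lambda n: rng.choice(lengths, size=n, p=probs)

# Hypothetical read lengths harvested from a real metagenomic run.
observed = [151, 151, 149, 302, 151, 287, 151, 150, 298, 151]
draw = empirical_length_sampler(observed)
print(draw(5))   # five simulated read lengths, e.g. [151 151 298 151 149]
```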

  11. Simulating Synchronous Processors

    DTIC Science & Technology

    1988-06-01

    MIT Laboratory for Computer Science technical report MIT/LCS/TM-359, "Simulating Synchronous Processors," by Jennifer Lundelius Welch. From the abstract: in this paper we show how a distributed system with synchronous processors and asynchronous message delays can…

  12. Distributed simulation using a real-time shared memory network

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Mattern, Duane L.; Wong, Edmond; Musgrave, Jeffrey L.

    1993-01-01

    The Advanced Control Technology Branch of the NASA Lewis Research Center performs research in the area of advanced digital controls for aeronautic and space propulsion systems. This work requires the real-time implementation of both control software and complex dynamical models of the propulsion system. We are implementing these systems in a distributed, multi-vendor computer environment. Therefore, a need exists for real-time communication and synchronization between the distributed multi-vendor computers. A shared memory network is a potential solution which offers several advantages over other real-time communication approaches. A candidate shared memory network was tested for basic performance. The shared memory network was then used to implement a distributed simulation of a ramjet engine. The accuracy and execution time of the distributed simulation were measured and compared to the performance of the non-partitioned simulation. The ease of partitioning the simulation, the minimal time required to develop the communication between the processors, and the resulting execution time all indicate that the shared memory network is a real-time communication technique worthy of serious consideration.

  13. Shadow Mode Assessment Using Realistic Technologies for the National Airspace (SMART NAS)

    NASA Technical Reports Server (NTRS)

    Kopardekar, Parimal H.

    2014-01-01

    Develop a simulation and modeling capability that: (a) assesses multiple parallel universes, (b) accepts data feeds, (c) allows for a live, virtual, constructive distributed environment, and (d) enables integrated examinations of concepts, algorithms, technologies, and National Airspace System (NAS) architectures.

  14. Advanced manned space flight simulation and training: An investigation of simulation host computer system concepts

    NASA Technical Reports Server (NTRS)

    Montag, Bruce C.; Bishop, Alfred M.; Redfield, Joe B.

    1989-01-01

    The findings of a preliminary investigation by Southwest Research Institute (SwRI) of simulation host computer concepts are presented. It is designed to aid NASA in evaluating simulation technologies for use in spaceflight training. The focus of the investigation is on the next generation of space simulation systems that will be utilized in training personnel for Space Station Freedom operations. SwRI concludes that NASA should pursue a distributed simulation host computer system architecture for the Space Station Training Facility (SSTF) rather than a centralized mainframe-based arrangement. A distributed system offers many advantages and is seen by SwRI as the only architecture that will allow NASA to achieve established functional goals and operational objectives over the life of the Space Station Freedom program. Several distributed, parallel computing systems are available today that offer real-time capabilities for time-critical, man-in-the-loop simulation. These systems are flexible in terms of connectivity and configurability, and are easily scaled to meet increasing demands for more computing power.

  15. Distributed dynamic simulations of networked control and building performance applications.

    PubMed

    Yahiaoui, Azzedine

    2018-02-01

    The use of computer-based automation and control systems for smart sustainable buildings, often so-called Automated Buildings (ABs), has become an effective way to automatically control, optimize, and supervise a wide range of building performance applications over a network while achieving the minimum energy consumption possible, and in doing so generally refers to Building Automation and Control Systems (BACS) architecture. Instead of costly and time-consuming experiments, this paper focuses on using distributed dynamic simulations to analyze the real-time performance of network-based building control systems in ABs and improve the functions of the BACS technology. The paper also presents the development and design of a distributed dynamic simulation environment with the capability of representing the BACS architecture in simulation by run-time coupling two or more different software tools over a network. The application and capability of this new dynamic simulation environment are demonstrated by an experimental design in this paper.
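
    The run-time coupling idea can be sketched in a few lines: two stand-in tools (a building model and a controller) exchange values over a loopback socket once per coupled time step. This is only a minimal illustration of network coupling, not the environment described in the paper; the port number, dynamics, and control law are invented for the example.

```python
import socket
import struct
import threading
import time

PORT = 50007  # hypothetical loopback port for the coupled run

def building_model(steps):
    """Stand-in building simulator: publishes a zone temperature and applies
    the heating signal received from the controller each coupled time step."""
    srv = socket.socket()
    srv.bind(("127.0.0.1", PORT))
    srv.listen(1)
    conn, _ = srv.accept()
    temp = 18.0
    for _ in range(steps):
        conn.sendall(struct.pack("!d", temp))        # publish current state
        heat, = struct.unpack("!d", conn.recv(8))    # receive control signal
        temp += 0.1 * heat - 0.05 * (temp - 15.0)    # toy thermal dynamics
    conn.close()
    srv.close()

def controller(steps, setpoint=21.0):
    """Stand-in BACS controller coupled to the building model over the network."""
    sock = socket.create_connection(("127.0.0.1", PORT))
    for _ in range(steps):
        temp, = struct.unpack("!d", sock.recv(8))
        heat = max(0.0, 2.0 * (setpoint - temp))     # simple proportional control
        sock.sendall(struct.pack("!d", heat))
    sock.close()

server = threading.Thread(target=building_model, args=(50,))
server.start()
time.sleep(0.2)   # give the stand-in building model time to start listening
controller(50)
server.join()
```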

  16. Distributed dynamic simulations of networked control and building performance applications

    PubMed Central

    Yahiaoui, Azzedine

    2017-01-01

    The use of computer-based automation and control systems for smart sustainable buildings, often so-called Automated Buildings (ABs), has become an effective way to automatically control, optimize, and supervise a wide range of building performance applications over a network while achieving the minimum energy consumption possible, and in doing so generally refers to Building Automation and Control Systems (BACS) architecture. Instead of costly and time-consuming experiments, this paper focuses on using distributed dynamic simulations to analyze the real-time performance of network-based building control systems in ABs and improve the functions of the BACS technology. The paper also presents the development and design of a distributed dynamic simulation environment with the capability of representing the BACS architecture in simulation by run-time coupling two or more different software tools over a network. The application and capability of this new dynamic simulation environment are demonstrated by an experimental design in this paper. PMID:29568135

  17. Development of a Free-Flight Simulation Infrastructure

    NASA Technical Reports Server (NTRS)

    Miles, Eric S.; Wing, David J.; Davis, Paul C.

    1999-01-01

    In anticipation of a projected rise in demand for air transportation, NASA and the FAA are researching new air-traffic-management (ATM) concepts that fall under the paradigm known broadly as "free flight". This paper documents the software development and engineering efforts in progress by Seagull Technology to develop a free-flight simulation (FFSIM) that is intended to help NASA researchers test mature-state concepts for free flight, otherwise referred to in this paper as distributed air/ground traffic management (DAG TM). Under development is a distributed, human-in-the-loop simulation tool that is comprehensive in its consideration of current and envisioned communication, navigation and surveillance (CNS) components, and will allow evaluation of critical air and ground traffic management technologies from an overall systems perspective. The FFSIM infrastructure is designed to incorporate all three major components of the ATM triad: aircraft flight decks, air traffic control (ATC), and (eventually) airline operational control (AOC) centers.

  18. How uncertain is the future of electric vehicle market: Results from Monte Carlo simulations using a nested logit model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Changzheng; Oak Ridge National Lab.; Lin, Zhenhong

    Plug-in electric vehicles (PEVs) are widely regarded as an important component of the technology portfolio designed to accomplish policy goals in sustainability and energy security. However, the market acceptance of PEVs in the future remains largely uncertain from today's perspective. By integrating a consumer choice model based on nested multinomial logit with Monte Carlo simulation, this study analyzes the uncertainty of PEV market penetration. Results suggest that the future market for PEVs is highly uncertain and there is a substantial risk of low penetration in the early and midterm market. Top factors contributing to market share variability are price sensitivities, energy cost, range limitation, and charging availability. The results also illustrate the potential effect of public policies in promoting PEVs through investment in battery technology and infrastructure deployment. Here, continued improvement of battery technologies and deployment of charging infrastructure alone do not necessarily reduce the spread of market share distributions, but may shift distributions toward the right, i.e., increase the probability of great market success.
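
    A minimal sketch of the approach, assuming invented utilities, nests, and parameter distributions (it is not the study's calibrated model): a two-level nested logit computes vehicle choice shares, and Monte Carlo sampling of uncertain inputs yields a distribution of PEV market share.

```python
import numpy as np

rng = np.random.default_rng(0)

def nested_logit_shares(utilities, nests, lambdas):
    """Two-level nested logit choice shares. utilities maps alternative -> V,
    nests maps nest -> list of alternatives, lambdas maps nest -> dissimilarity
    parameter in (0, 1]."""
    iv = {k: np.log(sum(np.exp(utilities[a] / lambdas[k]) for a in alts))
          for k, alts in nests.items()}
    denom = sum(np.exp(lambdas[k] * iv[k]) for k in nests)
    shares = {}
    for k, alts in nests.items():
        p_nest = np.exp(lambdas[k] * iv[k]) / denom
        within = sum(np.exp(utilities[a] / lambdas[k]) for a in alts)
        for a in alts:
            shares[a] = p_nest * np.exp(utilities[a] / lambdas[k]) / within
    return shares

nests = {"conventional": ["gasoline"], "plug-in": ["phev", "bev"]}
lambdas = {"conventional": 1.0, "plug-in": 0.6}

# Monte Carlo over uncertain price sensitivity and fuel-cost advantage (all invented).
pev_share = []
for _ in range(5000):
    beta_price = rng.normal(-0.4, 0.1)        # utility per $1000 of extra price
    fuel_advantage = rng.uniform(0.5, 2.0)    # relative energy-cost benefit of PEVs
    v = {"gasoline": 0.0,
         "phev": beta_price * 8 + 0.8 * fuel_advantage,
         "bev": beta_price * 12 + 1.2 * fuel_advantage - 0.5}  # range-limitation penalty
    shares = nested_logit_shares(v, nests, lambdas)
    pev_share.append(shares["phev"] + shares["bev"])

print("PEV share percentiles (10/50/90):", np.percentile(pev_share, [10, 50, 90]))
```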

  19. How uncertain is the future of electric vehicle market: Results from Monte Carlo simulations using a nested logit model

    DOE PAGES

    Liu, Changzheng; Oak Ridge National Lab.; Lin, Zhenhong; ...

    2016-12-08

    Plug-in electric vehicles (PEVs) are widely regarded as an important component of the technology portfolio designed to accomplish policy goals in sustainability and energy security. However, the market acceptance of PEVs in the future remains largely uncertain from today's perspective. By integrating a consumer choice model based on nested multinomial logit with Monte Carlo simulation, this study analyzes the uncertainty of PEV market penetration. Results suggest that the future market for PEVs is highly uncertain and there is a substantial risk of low penetration in the early and midterm market. Top factors contributing to market share variability are price sensitivities, energy cost, range limitation, and charging availability. The results also illustrate the potential effect of public policies in promoting PEVs through investment in battery technology and infrastructure deployment. Here, continued improvement of battery technologies and deployment of charging infrastructure alone do not necessarily reduce the spread of market share distributions, but may shift distributions toward the right, i.e., increase the probability of great market success.

  20. Research on simulation based material delivery system for an automobile company with multi logistics center

    NASA Astrophysics Data System (ADS)

    Luo, D.; Guan, Z.; Wang, C.; Yue, L.; Peng, L.

    2017-06-01

    Distribution of different parts to the assembly lines is significant for companies seeking to improve production. The current research investigates the problem of optimizing the distribution method of a logistics system in a third-party logistics company that provides professional services to an automobile manufacturing case company in China. It examines logistics leveling of material distribution and the unloading platform of the automobile logistics enterprise, and proposes a logistics distribution strategy, a material classification method, and logistics scheduling. Moreover, the Simio simulation technology is employed on the assembly-line logistics system, which helps to find and validate an optimized distribution scheme through simulation experiments. Experimental results indicate that the proposed scheme solves the logistics leveling and unloading-platform congestion problems more efficiently than the original method employed by the case company.

  1. Variability-aware compact modeling and statistical circuit validation on SRAM test array

    NASA Astrophysics Data System (ADS)

    Qiao, Ying; Spanos, Costas J.

    2016-03-01

    Variability modeling at the compact transistor model level can enable statistically optimized designs in view of limitations imposed by the fabrication technology. In this work we propose a variability-aware compact model characterization methodology based on stepwise parameter selection. Transistor I-V measurements are obtained from a bit-transistor-accessible SRAM test array fabricated using a collaborating foundry's 28 nm FDSOI technology. Our in-house customized Monte Carlo simulation bench can incorporate these statistical compact models, and simulation results on SRAM writability performance are very close to measurements in distribution estimation. Our proposed statistical compact model parameter extraction methodology also has the potential of predicting non-Gaussian behavior in statistical circuit performances through mixtures of Gaussian distributions.
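
    The mixture-of-Gaussians idea can be illustrated with a short Monte Carlo sketch; the parameter values and the toy write-margin expression below are invented for illustration and are not the extracted compact-model values.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_vth(n, weights, means, sigmas):
    """Draw threshold-voltage samples from a mixture of Gaussians, a simple
    stand-in for a non-Gaussian statistical compact-model parameter."""
    component = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(np.asarray(means)[component], np.asarray(sigmas)[component])

# Hypothetical two-component mixture with a heavier local-variation tail.
vth = sample_vth(100_000, weights=[0.9, 0.1], means=[0.35, 0.42], sigmas=[0.02, 0.05])

# Toy write-margin metric that degrades nonlinearly with Vth, so its Monte Carlo
# distribution is visibly non-Gaussian.
write_margin = 0.30 - 0.4 * (vth - 0.35) - 1.5 * (vth - 0.35) ** 2

print("P(write margin < 0.2 V):", np.mean(write_margin < 0.2))
```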

  2. A Process for Comparing Dynamics of Distributed Space Systems Simulations

    NASA Technical Reports Server (NTRS)

    Cures, Edwin Z.; Jackson, Albert A.; Morris, Jeffery C.

    2009-01-01

    The paper describes a process that was developed for comparing the primary orbital dynamics behavior between space systems distributed simulations. This process is used to characterize and understand the fundamental fidelities and compatibilities of the modeling of orbital dynamics between spacecraft simulations. This is required for high-latency distributed simulations such as NASA's Integrated Mission Simulation and must be understood when reporting results from simulation executions. This paper presents 10 principal comparison tests along with their rationale and examples of the results. The Integrated Mission Simulation (IMSim) (formerly known as the Distributed Space Exploration Simulation (DSES)) is a NASA research and development project focusing on the technologies and processes that are related to the collaborative simulation of complex space systems involved in the exploration of our solar system. Currently, the NASA centers that are actively participating in the IMSim project are the Ames Research Center, the Jet Propulsion Laboratory (JPL), the Johnson Space Center (JSC), the Kennedy Space Center, the Langley Research Center and the Marshall Space Flight Center. In concept, each center participating in IMSim has its own set of simulation models and environment(s). These simulation tools are used to build the various simulation products that are used for scientific investigation, engineering analysis, system design, training, planning, operations and more. Working individually, these production simulations provide important data to various NASA projects.

  3. TechTuning: Stress Management For 3D Through-Silicon-Via Stacking Technologies

    NASA Astrophysics Data System (ADS)

    Radojcic, Riko; Nowak, Matt; Nakamoto, Mark

    2011-09-01

    The concerns with managing mechanical stress distributions, and the consequent effects on device performance and material integrity, for advanced TSV-based 3D technologies are outlined. A model- and simulation-based Design For Manufacturability (DFM) type of flow for managing the mechanical stresses throughout Si die, stack, and package design is proposed. The key attributes of the models and simulators required to fuel the proposed flow are summarized. Finally, some of the essential infrastructure and Supply Chain support items are described.

  4. Minimization of Blast furnace Fuel Rate by Optimizing Burden and Gas Distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dr. Chenn Zhou

    2012-08-15

    The goal of the research is to improve the competitive edge of steel mills by using advanced CFD technology to optimize the gas and burden distributions inside a blast furnace for achieving the best gas utilization. A state-of-the-art 3-D CFD model has been developed for simulating the gas distribution inside a blast furnace at given burden conditions, burden distribution, and blast parameters. The comprehensive 3-D CFD model has been validated by plant measurement data from an actual blast furnace. Validation of the sub-models is also achieved. The user-friendly software package named Blast Furnace Shaft Simulator (BFSS) has been developed to simulate the blast furnace shaft process. The research has significant benefits to the steel industry through high productivity, low energy consumption, and an improved environment.

  5. Design and Development of a 200-kW Turbo-Electric Distributed Propulsion Testbed

    NASA Technical Reports Server (NTRS)

    Papathakis, Kurt V.

    2017-01-01

    There are a few NASA-funded electric and hybrid-electric projects at different NASA Centers, including the NASA Armstrong Flight Research Center (AFRC) (Edwards, California). Each project identifies a specific technology gap that is currently inhibiting the growth and proliferation of relevant technologies in commercial aviation. This paper describes the design and development of a turbo-electric distributed propulsion (TeDP) hardware-in-the-loop (HIL) simulation bench, which is a test bed for discovering turbo-electric control, distributed electric control, power management control, and integration competencies while providing risk mitigation for future turbo-electric flying demonstrators.

  6. Local-Area-Network Simulator

    NASA Technical Reports Server (NTRS)

    Gibson, Jim; Jordan, Joe; Grant, Terry

    1990-01-01

    Local Area Network Extensible Simulator (LANES) computer program provides a method for simulating the performance of high-speed local-area-network (LAN) technology. Developed as a design and analysis software tool for networking computers on board the proposed Space Station. Load, network, link, and physical layers of the layered network architecture are all modeled. Models networks mathematically according to two different lower-layer protocols: Fiber Distributed Data Interface (FDDI) and Star*Bus. Written in FORTRAN 77.

  7. From biological neural networks to thinking machines: Transitioning biological organizational principles to computer technology

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.

    1991-01-01

    The three-dimensional organization of the vestibular macula is under study by computer-assisted reconstruction and simulation methods as a model for more complex neural systems. One goal of this research is to transition knowledge of biological neural network architecture and functioning to computer technology, to contribute to the development of thinking computers. Maculas are organized as weighted neural networks for parallel distributed processing of information. The network is characterized by non-linearity of its terminal/receptive fields. Wiring appears to develop through constrained randomness. A further property is the presence of two main circuits, highly channeled and distributed modifying, that are connected through feedforward-feedback collaterals and a biasing subcircuit. Computer simulations demonstrate that differences in the geometry of the feedback (afferent) collaterals affect the timing and the magnitude of voltage changes delivered to the spike initiation zone. Feedforward (efferent) collaterals act as voltage followers and likely inhibit neurons of the distributed modifying circuit. These results illustrate the importance of feedforward-feedback loops, of timing, and of inhibition in refining neural network output. They also suggest that it is the distributed modifying network that is most involved in adaptation, memory, and learning. Tests of macular adaptation, through hyper- and microgravitational studies, support this hypothesis since synapses in the distributed modifying circuit, but not the channeled circuit, are altered. Transitioning knowledge of biological systems to computer technology, however, remains problematical.

  8. Simulation of ICESat-2 canopy height retrievals for different ecosystems

    NASA Astrophysics Data System (ADS)

    Neuenschwander, A. L.

    2016-12-01

    Slated for launch in late 2017 (or early 2018), the ICESat-2 satellite will provide a global distribution of geodetic measurements from a space-based laser altimeter of both the terrain surface and relative canopy heights, which will provide a significant benefit to society through a variety of applications ranging from improved global digital terrain models to distributions of above-ground vegetation structure. The ATLAS instrument designed for ICESat-2 will utilize a different technology than what is found on most laser mapping systems. The photon-counting technology of the ATLAS instrument onboard ICESat-2 will record the arrival time associated with a single photon detection. That detection can occur anywhere within the vertical distribution of the reflected signal, that is, anywhere within the vertical distribution of the canopy. This uncertainty of where the photon will be returned from within the vegetation layer is referred to as the vertical sampling error. Preliminary simulation studies to estimate vertical sampling error have been conducted for several ecosystems including woodland savanna, montane conifers, temperate hardwoods, tropical forest, and boreal forest. The results from these simulations indicate that the canopy heights reported on the ATL08 data product will underestimate the top canopy height in the range of 1-4 m. Although simulation results indicate that ICESat-2 will underestimate top canopy height, there is a strong correlation between ICESat-2 heights and relative canopy height metrics (e.g., RH75, RH90). In tropical forest, simulation results indicate the ICESat-2 height correlates strongly with RH90. Similarly, in temperate broadleaf forest, the simulated ICESat-2 heights were also strongly correlated with RH90. In boreal forest, the simulated ICESat-2 heights are strongly correlated with RH75 heights. It is hypothesized that the correlations between simulated ICESat-2 heights and canopy height metrics are a function of both canopy cover and vegetation physiology (e.g., leaf size/shape), which contributes to the horizontal and vertical structure of the vegetation.
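
    The relative-height metrics referred to above (RH75, RH90, etc.) are simply percentiles of the terrain-relative canopy photon heights; the sketch below computes them for an invented photon cloud and compares RH100 to an assumed true top-of-canopy height. All values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

def relative_height_metrics(photon_heights, percentiles=(75, 90, 100)):
    """Relative canopy-height metrics (e.g. RH75, RH90) computed as percentiles
    of canopy photon heights above the interpolated terrain surface."""
    values = np.percentile(photon_heights, percentiles)
    return {f"RH{p}": v for p, v in zip(percentiles, values)}

# Hypothetical photon returns for one segment: terrain-relative heights drawn
# from a crude two-layer canopy (understory plus upper canopy).
canopy_photons = np.concatenate([rng.uniform(0, 8, 40), rng.uniform(12, 28, 25)])
metrics = relative_height_metrics(canopy_photons)
assumed_true_top = 30.0
print(metrics, "underestimate of top height:", assumed_true_top - metrics["RH100"])
```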

  9. The Distributed Geothermal Market Demand Model (dGeo): Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCabe, Kevin; Mooney, Meghan E; Sigrin, Benjamin O

    The National Renewable Energy Laboratory (NREL) developed the Distributed Geothermal Market Demand Model (dGeo) as a tool to explore the potential role of geothermal distributed energy resources (DERs) in meeting thermal energy demands in the United States. The dGeo model simulates the potential for deployment of geothermal DERs in the residential and commercial sectors of the continental United States for two specific technologies: ground-source heat pumps (GHP) and geothermal direct use (DU) for district heating. To quantify the opportunity space for these technologies, dGeo leverages a highly resolved geospatial database and robust bottom-up, agent-based modeling framework. This design is consistent with others in the family of Distributed Generation Market Demand models (dGen; Sigrin et al. 2016), including the Distributed Solar Market Demand (dSolar) and Distributed Wind Market Demand (dWind) models. dGeo is intended to serve as a long-term scenario-modeling tool. It has the capability to simulate the technical potential, economic potential, market potential, and technology deployment of GHP and DU through the year 2050 under a variety of user-defined input scenarios. Through these capabilities, dGeo can provide substantial analytical value to various stakeholders interested in exploring the effects of various techno-economic, macroeconomic, financial, and policy factors related to the opportunity for GHP and DU in the United States. This report documents the dGeo modeling design, methodology, assumptions, and capabilities.

  10. Advanced Distributed Simulation Technology Advanced Rotary Wing Aircraft. System/Segment Specification. Volume 1. Simulation System Module

    DTIC Science & Technology

    1994-03-31

    Only OCR fragments of the specification text survive, including requirements for overhead water sprinklers in enclosed personnel areas not already protected by existing facility fire suppression systems, provisions to facilitate future changes and updates so the system remains current with the application aircraft, and availability (section 3.4.4) requirements stating that the ARWA SS shall be designed and constructed to…

  11. Investigation of Probability Distributions Using Dice Rolling Simulation

    ERIC Educational Resources Information Center

    Lukac, Stanislav; Engel, Radovan

    2010-01-01

    Dice are considered one of the oldest gambling devices and thus many mathematicians have been interested in various dice gambling games in the past. Dice have been used to teach probability, and dice rolls can be effectively simulated using technology. The National Council of Teachers of Mathematics (NCTM) recommends that teachers use simulations…

  12. The application of simulation modeling to the cost and performance ranking of solar thermal power plants

    NASA Technical Reports Server (NTRS)

    Rosenberg, L. S.; Revere, W. R.; Selcuk, M. K.

    1981-01-01

    A computer simulation code was employed to evaluate several generic types of solar power systems (up to 10 MWe). Details of the simulation methodology and the solar plant concepts are given along with cost and performance results. The Solar Energy Simulation computer code (SESII) was used, which optimizes the size of the collector field and energy storage subsystem for given engine-generator and energy-transport characteristics. Nine plant types were examined which employed combinations of different technology options, such as: distributed or central receivers with one- or two-axis tracking or no tracking; point- or line-focusing concentrators; central or distributed power conversion; Rankine, Brayton, or Stirling thermodynamic cycles; and thermal or electrical storage. Optimal cost curves were plotted as a function of levelized busbar energy cost and annualized plant capacity. Point-focusing distributed receiver systems were found to be most efficient (17-26 percent).
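
    Levelized busbar energy cost is, in essence, annualized cost divided by annual energy delivered at the busbar; the sketch below shows one common formulation using a capital recovery factor. All plant numbers are hypothetical and the formula is a generic simplification, not the SESII cost model.

```python
def levelized_busbar_cost(capital, fixed_om, variable_om, annual_energy_kwh,
                          discount_rate=0.08, life_years=30):
    """Levelized busbar energy cost in $/kWh: annualized capital (via a capital
    recovery factor) plus O&M, divided by annual energy delivered at the busbar."""
    crf = (discount_rate * (1 + discount_rate) ** life_years /
           ((1 + discount_rate) ** life_years - 1))
    annual_cost = capital * crf + fixed_om + variable_om * annual_energy_kwh
    return annual_cost / annual_energy_kwh

# Hypothetical 10 MWe plant at a 30% annual capacity factor.
capacity_kw, capacity_factor = 10_000, 0.30
annual_energy = capacity_kw * capacity_factor * 8760
print(f"{levelized_busbar_cost(3.5e7, 6.0e5, 0.002, annual_energy):.3f} $/kWh")
```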

  13. Intracranial hemorrhage alters scalp potential distribution in bioimpedance cerebral monitoring: Preliminary results from FEM simulation on a realistic head model and human subjects

    PubMed Central

    Atefi, Seyed Reza; Seoane, Fernando; Kamalian, Shervin; Rosenthal, Eric S.; Lev, Michael H.; Bonmassar, Giorgio

    2016-01-01

    Purpose: Current diagnostic neuroimaging for detection of intracranial hemorrhage (ICH) is limited to fixed scanners requiring patient transport and extensive infrastructure support. ICH diagnosis would therefore benefit from a portable diagnostic technology, such as electrical bioimpedance (EBI). Through simulations and patient observation, the authors assessed the influence of unilateral ICH hematomas on quasisymmetric scalp potential distributions in order to establish the feasibility of EBI technology as a potential tool for early diagnosis. Methods: Finite element method (FEM) simulations and experimental left–right hemispheric scalp potential differences of healthy and damaged brains were compared with respect to the asymmetry caused by ICH lesions on quasisymmetric scalp potential distributions. In numerical simulations, this asymmetry was measured at 25 kHz and visualized on the scalp as the normalized potential difference between the healthy and ICH-damaged models. Proof-of-concept simulations were extended in a pilot study of experimental scalp potential measurements recorded between 0 and 50 kHz with the authors’ custom-made bioimpedance spectrometer. Mean left–right scalp potential differences recorded from the frontal, central, and parietal brain regions of ten healthy controls and six patients suffering from acute/subacute ICH were compared. The observed differences were assessed at the 5% level of significance using the two-sample Welch t-test. Results: The 3D anatomically accurate FEM simulations showed that the normalized scalp potential difference between the damaged and healthy brain models is zero everywhere on the head surface, except in the vicinity of the lesion, where it can vary up to 5%. The authors’ preliminary experimental results also confirmed that the left–right scalp potential difference in patients with ICH (e.g., 64 mV) is significantly larger than in healthy subjects (e.g., 20.8 mV; P < 0.05). Conclusions: Realistic, proof-of-concept simulations confirmed that ICH affects quasisymmetric scalp potential distributions. Pilot clinical observations with the authors’ custom-made bioimpedance spectrometer also showed higher left–right potential differences in the presence of ICH, similar to those of their simulations, that may help to distinguish healthy subjects from ICH patients. Although these pilot clinical observations are in agreement with the computer simulations, the small sample size of this study lacks statistical power to exclude the influence of other possible confounders such as age, sex, and electrode positioning. The agreement with previously published simulation-based and clinical results, however, suggests that EBI technology may be potentially useful for ICH detection. PMID:26843231
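
    The group comparison described above reduces to a two-sample Welch t-test on left-right potential differences; below is a minimal sketch with synthetic data loosely matching the reported group means (the generated samples are not the study's measurements).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic left-right scalp potential differences (mV), loosely mirroring the
# reported group means: healthy controls near 20.8 mV, ICH patients near 64 mV.
healthy = rng.normal(20.8, 8.0, size=10)
ich = rng.normal(64.0, 20.0, size=6)

# Two-sample Welch t-test (unequal variances assumed).
t_stat, p_value = stats.ttest_ind(ich, healthy, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # significant at the 5% level if p < 0.05
```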

  14. Study of a Compression-Molding Process for Ultraviolet Light-Emitting Diode Exposure Systems via Finite-Element Analysis

    PubMed Central

    Wu, Kuo-Tsai; Hwang, Sheng-Jye; Lee, Huei-Huang

    2017-01-01

    Although wafer-level camera lenses are a very promising technology, problems such as warpage with time and non-uniform thickness of products still exist. In this study, finite element simulation was performed to simulate the compression molding process for acquiring the pressure distribution on the product on completion of the process and predicting the deformation with respect to the pressure distribution. Results show that the single-gate compression molding process significantly increases the pressure at the center of the product, whereas the multi-gate compressing molding process can effectively distribute the pressure. This study evaluated the non-uniform thickness of product and changes in the process parameters through computer simulations, which could help to improve the compression molding process. PMID:28617315

  15. Development of Virtual Airspace Simulation Technology - Real-Time (VAST-RT) Capability 2 and Experimental Plans

    NASA Technical Reports Server (NTRS)

    Lehmer, R.; Ingram, C.; Jovic, S.; Alderete, J.; Brown, D.; Carpenter, D.; LaForce, S.; Panda, R.; Walker, J.; Chaplin, P.

    2006-01-01

    The Virtual Airspace Simulation Technology - Real-Time (VAST-RT) Project, an element of NASA's Virtual Airspace Modeling and Simulation (VAMS) Project, has been developing a distributed simulation capability that supports an extensible and expandable real-time, human-in-the-loop airspace simulation environment. The VAST-RT system architecture is based on the DoD High Level Architecture (HLA) and the VAST-RT HLA Toolbox, a common interface implementation that incorporates a number of novel design features. The scope of the initial VAST-RT integration activity (Capability 1) included the high-fidelity human-in-the-loop simulation facilities located at NASA/Ames Research Center and medium-fidelity pseudo-piloted target generators, such as the Airspace Traffic Generator (ATG) being developed as part of VAST-RT, as well as other real-time tools. This capability has been demonstrated in a gate-to-gate simulation. VAST-RT Capability 2A has recently been completed, and this paper will discuss the improved integration of the real-time assets into VAST-RT, including the development of tools to integrate data collected across the simulation environment into a single data set for the researcher. Current plans for the completion of the VAST-RT distributed simulation environment (Capability 2B) and its use to evaluate future airspace capacity-enhancing concepts being developed by VAMS will be discussed. Additionally, the simulation environment's application to other airspace and airport research projects is addressed.

  16. A Demonstration of Delay and Constructive Modeling Effects in Distributed Interactive Simulation.

    DTIC Science & Technology

    1998-02-01

    Only OCR fragments of the report text survive, noting work with the Armstrong Laboratory Design Technology Branch, Veda Incorporated, and Science Applications International Corporation (SAIC); acknowledgements to Mr. Dave O'Quinn of Veda Incorporated for simulation engineering support; and a description of the study platform, the Engineering Design Simulator (EDSM), a single-seat simulator developed by Veda Inc.

  17. A First Look at the Upcoming SISO Space Reference FOM

    NASA Technical Reports Server (NTRS)

    Mueller, Bjorn; Crues, Edwin Z.; Dexter, Dan; Garro, Alfredo; Skuratovskiy, Anton; Vankov, Alexander

    2016-01-01

    Spaceflight is difficult, dangerous and expensive; human spaceflight even more so. In order to mitigate some of the danger and expense, professionals in the space domain have relied, and continue to rely, on computer simulation. Simulation is used at every level including concept, design, analysis, construction, testing, training and ultimately flight. As space systems have grown more complex, new simulation technologies have been developed, adopted and applied. Distributed simulation is one of those technologies. Distributed simulation provides a base technology for segmenting these complex space systems into smaller, and usually simpler, component systems or subsystems. This segmentation also supports the separation of responsibilities between participating organizations. This segmentation is particularly useful for complex space systems like the International Space Station (ISS), which is composed of many elements from many nations along with visiting vehicles from many nations. This is likely to be the case for future human space exploration activities. Over the years, a number of distributed simulations have been built within the space domain. While many use the High Level Architecture (HLA) to provide the infrastructure for interoperability, HLA without a Federation Object Model (FOM) is insufficient by itself to ensure interoperability. As a result, the Simulation Interoperability Standards Organization (SISO) is developing a Space Reference FOM. The Space Reference FOM Product Development Group is composed of members from several countries. They contribute experiences from projects within NASA, ESA and other organizations and represent government, academia and industry. The initial version of the Space Reference FOM is focusing on time and space and will provide the following: (i) a flexible positioning system using reference frames for arbitrary bodies in space, (ii) naming conventions for well-known reference frames, (iii) definitions of common time scales, (iv) federation agreements for common types of time management with focus on time stepped simulation, and (v) support for physical entities, such as space vehicles and astronauts. The Space Reference FOM is expected to make collaboration politically, contractually and technically easier. It is also expected to make collaboration easier to manage and extend.

  18. Simulation and curriculum design: a global survey in dental education.

    PubMed

    Perry, S; Burrow, M F; Leung, W K; Bridges, S M

    2017-12-01

    Curriculum reforms are being driven by globalization and international standardization. Although new information technologies such as dental haptic virtual reality (VR) simulation systems have provided potential new possibilities for clinical learning in dental curricula, infusion into curricula requires careful planning. This study aimed to identify current patterns in the role and integration of simulation in dental degree curricula internationally. An original internet survey was distributed by invitation to clinical curriculum leaders in dental schools in Asia, Europe, North America, and Oceania (Australia and New Zealand). The results (N = 62) showed Asia, Europe and Oceania tended towards integrated curriculum designs with North America having a higher proportion of traditional curricula. North America had limited implementation of haptic VR simulation technology but reported the highest number of scheduled simulation hours. Australia and New Zealand were the most likely regions to incorporate haptic VR simulation technology. This survey indicated considerable variation in curriculum structure with regionally-specific preferences being evident in terms of curriculum structure, teaching philosophies and motivation for incorporation of VR haptic simulation into curricula. This study illustrates the need for an improved evidence base on dental simulations to inform curriculum designs and psychomotor skill learning in dentistry. © 2017 Australian Dental Association.

  19. CFD-PBM coupled simulation of a nanobubble generator with honeycomb structure

    NASA Astrophysics Data System (ADS)

    Ren, F.; Noda, N. A.; Ueda, T.; Sano, Y.; Takase, Y.; Umekage, T.; Yonezawa, Y.; Tanaka, H.

    2018-06-01

    In recent years, nanobubble technologies have drawn great attention due to their wide applications in many fields of science and technology. Nitrogen nanobubble water circulation can be used to slow the progression of oxidation and spoilage during long-term seafood storage. From previous studies, a honeycomb structure for high-efficiency nanobubble generation has been proposed. In this paper, the bubbly flow in the honeycomb structure was studied. The numerical simulations of the honeycomb structure were performed using a coupled computational fluid dynamics–population balance model (CFD-PBM). The numerical model was based on the Eulerian multiphase model, and the population balance model (PBM) was used to calculate the gas bubble size distribution. Bubble coalescence and breakage were included. Considering the effect of bubble diameter on the fluid flow, the phase interactions were coupled with the PBM. The bubble size distributions in the honeycomb structure under different work conditions were predicted. The experimental results were compared with the simulation predictions.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuruganti, Phani Teja

    The smart grid is a combined process of revitalizing traditional power grid applications and introducing new applications to improve the efficiency of power generation, transmission, and distribution. This can be achieved by leveraging advanced communication and networking technologies. Therefore, the selection of the appropriate communication technology for different smart grid applications has been debated a lot in the recent past. After comparing different possible technologies, a recent research study arrived at the conclusion that 3G cellular technology is the right choice for distribution-side smart grid applications like smart metering, advanced distribution automation, and demand response management systems. In this paper, we argue that the current 3G/4G cellular technologies are not an appropriate choice for smart grid distribution applications and propose a Hybrid Spread Spectrum (HSS) based Advanced Metering Infrastructure (AMI) as one of the alternatives to 3G/4G technologies. We present a preliminary PHY and MAC layer design of an HSS-based AMI network and evaluate their performance using MATLAB and NS2 simulations. Also, we propose a time-hierarchical scheme that can significantly reduce the volume of random access traffic generated during blackouts and the delay in power outage reporting.
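
    The load-spreading intent of a time-hierarchical scheme can be illustrated with a toy calculation (a sketch of the idea only, not the paper's PHY/MAC design): meters are assigned to hierarchy levels and defer their outage reports to their level's slot, so the peak random access load drops roughly by the number of levels. All numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
n_meters, n_levels, slot_ms = 10_000, 10, 50

# Without a hierarchy: every meter tries to report the outage in the first slot.
peak_flat = n_meters

# With a time-hierarchical scheme: each meter is assigned a level and defers its
# report to that level's slot, spreading the random access load over n_levels slots.
levels = rng.integers(0, n_levels, size=n_meters)
attempts_per_slot = np.bincount(levels, minlength=n_levels)

print("peak simultaneous attempts:", peak_flat, "->", attempts_per_slot.max(),
      "spread over", n_levels * slot_ms, "ms")
```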

  1. Multi-agent systems and their applications

    DOE PAGES

    Xie, Jing; Liu, Chen-Ching

    2017-07-14

    The number of distributed energy components and devices continues to increase globally. As a result, distributed control schemes are desirable for managing and utilizing these devices, together with the large amount of data. In recent years, agent-based technology has become a powerful tool for engineering applications. As a computational paradigm, multi-agent systems (MASs) provide a good solution for distributed control. Here in this paper, MASs and applications are discussed. A state-of-the-art literature survey is conducted on the system architecture, consensus algorithm, and multi-agent platform, framework, and simulator. In addition, a distributed under-frequency load shedding (UFLS) scheme is proposed using the MAS. Simulation results for a case study are presented. The future of MASs is discussed in the conclusion.
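
    A staged under-frequency load shedding rule of the kind an agent might apply locally can be sketched as follows; the thresholds, block sizes, and function name are hypothetical (not the paper's scheme), and the neighbour coordination step is omitted.

```python
def ufls_decision(frequency_hz, shed_fraction_applied,
                  stages=((59.3, 0.10), (59.0, 0.15), (58.7, 0.20))):
    """Return the additional fraction of local load an agent should shed,
    given the measured frequency and what it has already shed. The staged
    thresholds and block sizes are hypothetical."""
    target = 0.0
    for threshold, block in stages:
        if frequency_hz <= threshold:
            target += block
    return max(0.0, target - shed_fraction_applied)

# Each agent would apply the rule on its own bus measurements and then exchange
# shed amounts with neighbours to agree on the system-wide total (not shown).
print(ufls_decision(59.1, 0.00))   # 0.10 -> only the first stage triggers
print(ufls_decision(58.9, 0.10))   # 0.15 -> the second stage adds to the total
```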

  2. Multi-agent systems and their applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Jing; Liu, Chen-Ching

    The number of distributed energy components and devices continues to increase globally. As a result, distributed control schemes are desirable for managing and utilizing these devices, together with the large amount of data. In recent years, agent-based technology has become a powerful tool for engineering applications. As a computational paradigm, multi-agent systems (MASs) provide a good solution for distributed control. Here in this paper, MASs and applications are discussed. A state-of-the-art literature survey is conducted on the system architecture, consensus algorithm, and multi-agent platform, framework, and simulator. In addition, a distributed under-frequency load shedding (UFLS) scheme is proposed using the MAS. Simulation results for a case study are presented. The future of MASs is discussed in the conclusion.

  3. A Network Scheduling Model for Distributed Control Simulation

    NASA Technical Reports Server (NTRS)

    Culley, Dennis; Thomas, George; Aretskin-Hariton, Eliot

    2016-01-01

    Distributed engine control is a hardware technology that radically alters the architecture for aircraft engine control systems. Of its own accord, it does not change the function of control; rather, it seeks to address the implementation issues for weight-constrained vehicles that can limit overall system performance and increase life-cycle cost. However, an inherent feature of this technology, digital communication networks, alters the flow of information between critical elements of the closed-loop control. Whereas control information has been available continuously in conventional centralized control architectures by virtue of analog signaling, moving forward, it will be transmitted digitally in serial fashion over the network(s) in distributed control architectures. An underlying effect is that all of the control information arrives asynchronously and may not be available every loop interval of the controller; therefore it must be scheduled. This paper proposes a methodology for modeling the nominal data flow over these networks and examines the resulting impact for an aero turbine engine system simulation.
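
    The scheduling consequence described above (control data arriving asynchronously and not every loop interval) can be illustrated with a toy hold-last-value loop; the arrival probability, sensor model, and control law are invented for the example and do not represent the paper's network model.

```python
import random

random.seed(0)

def run_control_loop(steps, arrival_prob=0.7):
    """Toy hold-last-value scheduling: when no fresh sensor sample arrives
    within a loop interval, the controller reuses the last received value."""
    last_speed, stale_intervals = 0.0, 0
    for k in range(steps):
        if random.random() < arrival_prob:       # a packet arrived this interval
            last_speed = 1000.0 + 5.0 * k        # stand-in sensor reading
        else:
            stale_intervals += 1                 # control acts on held data
        command = 0.01 * (1100.0 - last_speed)   # stand-in control law (unused here)
    return stale_intervals

print("loop intervals run on stale data:", run_control_loop(100), "of 100")
```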

  4. Multimode marine engine room simulation system based on field bus technology

    NASA Astrophysics Data System (ADS)

    Zheng, Huayao; Deng, Linlin; Guo, Yi

    2003-09-01

    Developing multi-mode MER (Marine Engine Room) labs is the main work of the Marine Simulation Center, which is a key lab of the Communication Ministry of China. It includes FPP (Fixed Pitch Propeller) and CPP (Controllable Pitch Propeller) mode MER simulation systems, an integrated electrical propulsion mode MER simulation system, a physical mode MER lab, etc. The FPP-mode simulation system, which was oriented to a large container ship, was completed in 1999 and received a second-level Shanghai Municipal Science and Technology Progress award. This paper mainly introduces the recent development and achievements of the Marine Simulation Center. Based on the LonWorks field bus, the structural characteristics and control strategies of a completely distributed intelligent control network are discussed. The experimental mode of a multi-node field bus detection and control system is described. In addition, intelligent fault diagnosis technology for some of the mechatronic integrated control systems explored is also covered.

  5. Simulation of a microgrid

    NASA Astrophysics Data System (ADS)

    Dulǎu, Lucian Ioan

    2015-12-01

    This paper describes the simulation of a microgrid system with storage technologies. The microgrid comprises 6 distributed generators (DGs), 3 loads, and a 150 kW storage unit. The installed capacity of the generators is 1100 kW, while the total load demand is 900 kW. The simulation is performed by using SCADA software, considering the power generation costs, the loads' demand, and the system's power losses. The generators access the system in order of their power generation cost. The simulation is performed for the entire day.
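
    Dispatching generators in order of generation cost (merit order) until demand plus losses is met can be sketched as below; the unit list, capacities, and costs are hypothetical, chosen only to echo the 1100 kW installed / 900 kW demand totals, and this is not the paper's SCADA implementation.

```python
def merit_order_dispatch(units, demand_kw, losses_kw=0.0):
    """Dispatch generators in ascending order of generation cost until demand
    plus losses is covered; returns the output of each unit in kW."""
    remaining = demand_kw + losses_kw
    schedule = {}
    for name, capacity_kw, cost in sorted(units, key=lambda u: u[2]):
        output = min(capacity_kw, max(0.0, remaining))
        schedule[name] = output
        remaining -= output
    if remaining > 1e-9:
        raise ValueError("installed capacity is insufficient for the demand")
    return schedule

# Hypothetical unit data (name, capacity kW, cost $/kWh); only the totals echo
# the paper's 1100 kW installed capacity and 900 kW demand.
units = [("pv", 200, 0.05), ("wind", 150, 0.06), ("chp", 300, 0.09),
         ("microturbine", 100, 0.12), ("diesel1", 200, 0.14), ("diesel2", 150, 0.16)]
print(merit_order_dispatch(units, demand_kw=900, losses_kw=20))
```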

  6. System Engineering Strategy for Distributed Multi-Purpose Simulation Architectures

    NASA Technical Reports Server (NTRS)

    Bhula, Dlilpkumar; Kurt, Cindy Marie; Luty, Roger

    2007-01-01

    This paper describes the system engineering approach used to develop distributed multi-purpose simulations. The multi-purpose simulation architecture focuses on user needs, operations, flexibility, cost and maintenance. This approach was used to develop an International Space Station (ISS) simulator, which is called the International Space Station Integrated Simulation (ISIS). The ISIS runs unmodified ISS flight software, system models, and the astronaut command and control interface in an open system design that allows for rapid integration of multiple ISS models. The initial intent of ISIS was to provide a distributed system that allows access to ISS flight software and models for the creation, test, and validation of crew and ground controller procedures. This capability reduces the cost and scheduling issues associated with utilizing standalone simulators in fixed locations, and facilitates discovering unknowns and errors earlier in the development lifecycle. Since its inception, the flexible architecture of the ISIS has allowed its purpose to evolve to include ground operator system and display training, flight software modification testing, and as a realistic test bed for Exploration automation technology research and development.

  7. A unified framework for spiking and gap-junction interactions in distributed neuronal network simulations.

    PubMed

    Hahne, Jan; Helias, Moritz; Kunkel, Susanne; Igarashi, Jun; Bolten, Matthias; Frommer, Andreas; Diesmann, Markus

    2015-01-01

    Contemporary simulators for networks of point and few-compartment model neurons come with a plethora of ready-to-use neuron and synapse models and support complex network topologies. Recent technological advancements have broadened the spectrum of application further to the efficient simulation of brain-scale networks on supercomputers. In distributed network simulations the amount of spike data that accrues per millisecond and process is typically low, such that a common optimization strategy is to communicate spikes at relatively long intervals, where the upper limit is given by the shortest synaptic transmission delay in the network. This approach is well-suited for simulations that employ only chemical synapses but it has so far impeded the incorporation of gap-junction models, which require instantaneous neuronal interactions. Here, we present a numerical algorithm based on a waveform-relaxation technique which allows for network simulations with gap junctions in a way that is compatible with the delayed communication strategy. Using a reference implementation in the NEST simulator, we demonstrate that the algorithm and the required data structures can be smoothly integrated with existing code such that they complement the infrastructure for spiking connections. To show that the unified framework for gap-junction and spiking interactions achieves high performance and delivers high accuracy in the presence of gap junctions, we present benchmarks for workstations, clusters, and supercomputers. Finally, we discuss limitations of the novel technology.
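
    The waveform-relaxation idea can be sketched for two passive membranes coupled by a gap junction: within one communication interval, each neuron repeatedly integrates against the other's waveform from the previous iteration until the waveforms stop changing. This toy Jacobi iteration with forward Euler only illustrates the technique, not the NEST algorithm; all parameters are invented.

```python
import numpy as np

def waveform_relaxation(v1_0, v2_0, g=0.5, tau=10.0, dt=0.1, T=5.0, iters=6):
    """Jacobi waveform relaxation for two leaky membranes coupled by a gap
    junction over one communication interval [0, T]: each neuron integrates
    against the other's waveform from the previous iteration."""
    n = int(T / dt)
    v1 = np.full(n + 1, v1_0)      # initial guess: constant waveforms
    v2 = np.full(n + 1, v2_0)
    for _ in range(iters):
        v1_new = np.empty_like(v1)
        v2_new = np.empty_like(v2)
        v1_new[0], v2_new[0] = v1_0, v2_0
        for k in range(n):         # forward Euler against the *previous* waveforms
            v1_new[k + 1] = v1_new[k] + dt * (-v1_new[k] / tau + g * (v2[k] - v1_new[k]))
            v2_new[k + 1] = v2_new[k] + dt * (-v2_new[k] / tau + g * (v1[k] - v2_new[k]))
        v1, v2 = v1_new, v2_new
    return v1, v2

v1, v2 = waveform_relaxation(-65.0, -55.0)
print(v1[-1], v2[-1])   # successive iterates converge toward the coupled solution
```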

  8. A unified framework for spiking and gap-junction interactions in distributed neuronal network simulations

    PubMed Central

    Hahne, Jan; Helias, Moritz; Kunkel, Susanne; Igarashi, Jun; Bolten, Matthias; Frommer, Andreas; Diesmann, Markus

    2015-01-01

    Contemporary simulators for networks of point and few-compartment model neurons come with a plethora of ready-to-use neuron and synapse models and support complex network topologies. Recent technological advancements have broadened the spectrum of application further to the efficient simulation of brain-scale networks on supercomputers. In distributed network simulations the amount of spike data that accrues per millisecond and process is typically low, such that a common optimization strategy is to communicate spikes at relatively long intervals, where the upper limit is given by the shortest synaptic transmission delay in the network. This approach is well-suited for simulations that employ only chemical synapses but it has so far impeded the incorporation of gap-junction models, which require instantaneous neuronal interactions. Here, we present a numerical algorithm based on a waveform-relaxation technique which allows for network simulations with gap junctions in a way that is compatible with the delayed communication strategy. Using a reference implementation in the NEST simulator, we demonstrate that the algorithm and the required data structures can be smoothly integrated with existing code such that they complement the infrastructure for spiking connections. To show that the unified framework for gap-junction and spiking interactions achieves high performance and delivers high accuracy in the presence of gap junctions, we present benchmarks for workstations, clusters, and supercomputers. Finally, we discuss limitations of the novel technology. PMID:26441628

  9. Technology Developments Integrating a Space Network Communications Testbed

    NASA Technical Reports Server (NTRS)

    Kwong, Winston; Jennings, Esther; Clare, Loren; Leang, Dee

    2006-01-01

    As future manned and robotic space exploration missions involve more complex systems, it is essential to verify, validate, and optimize such systems through simulation and emulation in a low-cost testbed environment. The goal of such a testbed is to perform detailed testing of advanced space and ground communications networks, technologies, and client applications that are essential for future space exploration missions. We describe the development of new technologies enhancing our Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) that enable its integration in a distributed space communications testbed. MACHETE combines orbital modeling, link analysis, and protocol and service modeling to quantify system performance based on comprehensive considerations of different aspects of space missions. It can simulate entire networks and can interface with external (testbed) systems. The key technology developments enabling the integration of MACHETE into a distributed testbed are the Monitor and Control module and the QualNet IP Network Emulator module. Specifically, the Monitor and Control module establishes a standard interface mechanism to centralize the management of each testbed component. The QualNet IP Network Emulator module allows externally generated network traffic to be passed through MACHETE to experience simulated network behaviors such as propagation delay, data loss, orbital effects and other communications characteristics, including entire network behaviors. We report a successful integration of MACHETE with a space communication testbed modeling a lunar exploration scenario. This document comprises the viewgraph slides of the presentation.

  10. Distributed Engine Control Empirical/Analytical Verification Tools

    NASA Technical Reports Server (NTRS)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and providing decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to easily assemble a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration of all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating distributed engine control system (DCS) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and the hardware simulator provide the capability to simulate virtual subcomponents, as well as to swap actual subcomponents in for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the C-MAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique capabilities to study the effects of a given change to the control system in the context of the distributed paradigm. The simulation tool can support treatment of all components within the control system, both virtual and real; these include the communication data network, smart sensor and actuator nodes, the centralized control system (the FADEC, or full authority digital engine control), and the aircraft engine itself. The DECsim tool allows simulation-based prototyping of control laws, control architectures, and decentralization strategies before hardware is integrated into the system. With the configuration specified, the simulator allows a variety of key factors to be systematically assessed, including control system performance, reliability, weight, and bandwidth utilization.

  11. Component Technology for High-Performance Scientific Simulation Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Epperly, T; Kohn, S; Kumfert, G

    2000-11-09

    We are developing scientific software component technology to manage the complexity of modern, parallel simulation software and increase the interoperability and re-use of scientific software packages. In this paper, we describe a language interoperability tool named Babel that enables the creation and distribution of language-independent software libraries using interface definition language (IDL) techniques. We have created a scientific IDL that focuses on the unique interface description needs of scientific codes, such as complex numbers, dense multidimensional arrays, complicated data types, and parallelism. Preliminary results indicate that in addition to language interoperability, this approach provides useful tools for thinking about the design of modern object-oriented scientific software libraries. Finally, we also describe a web-based component repository called Alexandria that facilitates the distribution, documentation, and re-use of scientific components and libraries.

  12. Experimental and numerical studies of micro PEM fuel cell

    NASA Astrophysics Data System (ADS)

    Peng, Rong-Gui; Chung, Chen-Chung; Chen, Chiun-Hsun

    2011-10-01

    A single micro proton exchange membrane fuel cell (PEMFC) has been produced using micro-electromechanical systems (MEMS) technology with an active area of 2.5 cm2 and a channel depth of about 500 µm. A theoretical analysis is performed in this study for a novel MEMS-based design of a micro PEMFC. The model consists of the conservation equations of mass, momentum, species and electric current in a fully integrated finite-volume solver using the CFD-ACE+ commercial code. The simulated polarization curves correlate well with the experimental data. Three-dimensional simulations are carried out to predict and analyze the micro PEMFC temperature, current density and water distributions at two different fuel flow rates (15 cm3/min and 40 cm3/min). Simulation results show that the temperature distribution within the micro PEMFC is affected by the water distribution in the membrane, and indicate that a low and uniform temperature distribution in the membrane at low fuel flow rates leads to increased water distribution in the membrane and yields a superior micro PEMFC current density distribution at a 0.4 V operating voltage. Model predictions are well within those known for experimental mechanism phenomena.
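
    For reference, finite-volume solvers of this kind discretize conservation equations that can all be cast in a generic transport form; the equation below is the standard textbook form, not quoted from the paper, with the scalar standing for mass, a momentum component, a species mass fraction, or charge.

```latex
% Generic steady transport equation discretized by finite-volume CFD solvers
% (standard textbook form; not quoted from the cited paper):
\nabla \cdot \left( \rho\, \mathbf{u}\, \phi \right)
   = \nabla \cdot \left( \Gamma_{\phi}\, \nabla \phi \right) + S_{\phi}
% where \rho is the density, \mathbf{u} the velocity field, \Gamma_{\phi} the
% diffusion coefficient of the conserved scalar \phi, and S_{\phi} a source
% term (e.g. the electrochemical reaction terms in a PEMFC model).
```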

  13. Design and Development of a 200-kW Turbo-Electric Distributed Propulsion Testbed

    NASA Technical Reports Server (NTRS)

    Papathakis, Kurt V.; Kloesel, Kurt J.; Lin, Yohan; Clarke, Sean; Ediger, Jacob J.; Ginn, Starr

    2016-01-01

    The National Aeronautics and Space Administration (NASA) Armstrong Flight Research Center (AFRC) (Edwards, California) is developing the Hybrid-Electric Integrated Systems Testbed (HEIST) as part of the HEIST Project, to study power management and transition complexities, modular architectures, and flight control laws for turbo-electric distributed propulsion technologies using representative hardware and piloted simulations. Capabilities are being developed to assess the flight readiness of hybrid electric and distributed electric vehicle architectures. Additionally, NASA will leverage experience gained and assets developed from HEIST to assist in flight-test proposal development, flight-test vehicle design, and evaluation of hybrid electric and distributed electric concept vehicles for flight safety. The HEIST test equipment will include three trailers supporting a distributed electric propulsion wing, a battery system and turbogenerator, dynamometers, and supporting power and communication infrastructure, all connected to the AFRC Core simulation. Plans call for 18 high-performance electric motors that will be powered by the batteries and the turbogenerator, and commanded by a piloted simulation. Flight control algorithms will be developed on the turbo-electric distributed propulsion system.

  14. The spatial distribution of the thickness of polymer powder coatings for ultrasonic sensors

    NASA Astrophysics Data System (ADS)

    Gavrilova, V. A.; Fazlyyyakhmatov, M. G.; Kashapov, N. F.

    2014-11-01

    The objects of this research are coatings and the technology for applying them to piezoelectric elements for ultrasonic sensors. Results of studies of the coating thickness distribution for different modes of the coating process are presented. The simulation results for the motion of the gas suspension in the electrostatic field of the "needle - plane" electrode system are confirmed experimentally.

  15. Power quality and protection of electric distribution systems with small, dispersed generation devices

    NASA Astrophysics Data System (ADS)

    Rizy, D. T.; Jewell, W. T.

    1984-10-01

    There are several operational problems associated with the connection of small power sources, such as wind turbines and photovoltaic (PV) arrays, to an electric distribution system. In one study the harmonic distortion produced by a subdivision of PV arrays connected through line-commutated inverters was simulated. A second simulation study evaluated protection problems associated with the operation of dispersed ac generators. The purpose of these studies was to examine the adequacy of the electric utility industry's traditional practices and hardware for the operation of dispersed power sources. The results of these simulation studies are discussed and recommendations are given for hardware and system operation needed for accommodating this new technology.

  16. A "Skylight" Simulator for HWIL Simulation of Hyperspectral Remote Sensing.

    PubMed

    Zhao, Huijie; Cui, Bolun; Jia, Guorui; Li, Xudong; Zhang, Chao; Zhang, Xinyang

    2017-12-06

    Even though digital simulation technology has been widely used in the last two decades, hardware-in-the-loop (HWIL) simulation is still an indispensable method for spectral uncertainty research of ground targets. However, previous facilities mainly focus on the simulation of panchromatic imaging. Therefore, neither the spectral nor the spatial performance is enough for hyperspectral simulation. To improve the accuracy of illumination simulation, a new dome-like skylight simulator is designed and developed to fit the spatial distribution and spectral characteristics of a real skylight for the wavelength from 350 nm to 2500 nm. The simulator's performance was tested using a spectroradiometer with different accessories. The spatial uniformity is greater than 0.91. The spectral mismatch decreases to 1/243 of the spectral mismatch of the Imagery Simulation Facility (ISF). The spatial distribution of radiance can be adjusted, and the accuracy of the adjustment is greater than 0.895. The ability of the skylight simulator is also demonstrated by comparing radiometric quantities measured in the skylight simulator with those in a real skylight in Beijing.
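
    The record quotes a spatial uniformity figure without defining the metric; purely as a loose illustration, one common figure of merit for light-source uniformity is the ratio of minimum to maximum irradiance over the test plane, sketched below in Python. The metric definition and the grid values are assumptions, not taken from the paper.

```python
# Illustration only: one common figure of merit for light-source uniformity is
# the ratio of minimum to maximum irradiance over the test plane.  The metric
# actually used in the cited paper is not reproduced here; the grid values are
# hypothetical normalized readings.
import numpy as np

def spatial_uniformity(irradiance_map):
    """Return E_min / E_max over the measured grid (1.0 = perfectly uniform)."""
    e = np.asarray(irradiance_map, dtype=float)
    return e.min() / e.max()

grid = np.array([[0.96, 0.98, 0.97],
                 [0.99, 1.00, 0.98],
                 [0.95, 0.97, 0.96]])
print(f"uniformity = {spatial_uniformity(grid):.3f}")
```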

  17. Simulation of the Velocity and Temperature Distribution of Inhalation Thermal Injury in a Human Upper Airway Model by Application of Computational Fluid Dynamics.

    PubMed

    Chang, Yang; Zhao, Xiao-zhuo; Wang, Cheng; Ning, Fang-gang; Zhang, Guo-an

    2015-01-01

    Inhalation injury is an important cause of death after thermal burns. This study was designed to simulate the velocity and temperature distribution of inhalation thermal injury in the upper airway in humans using computational fluid dynamics. Cervical computed tomography images of three Chinese adults were imported to Mimics software to produce three-dimensional models. After grids were established and boundary conditions were defined, the simulation time was set at 1 minute and the gas temperature was set to 80 to 320°C using ANSYS software (ANSYS, Canonsburg, PA) to simulate the velocity and temperature distribution of inhalation thermal injury. Cross-sections were cut at 2-mm intervals, and maximum airway temperature and velocity were recorded for each cross-section. The maximum velocity peaked in the lower part of the nasal cavity and then decreased with air flow. The velocities in the epiglottis and glottis were higher than those in the surrounding areas. Further, the maximum airway temperature decreased from the nasal cavity to the trachea. Computational fluid dynamics technology can be used to simulate the velocity and temperature distribution of inhaled heated air.
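
    The cross-section post-processing step described above (recording the maximum temperature and velocity every 2 mm) can be sketched as follows; this is a hypothetical stand-in for the authors' ANSYS workflow, with made-up node data and a hypothetical helper per_slice_maxima.

```python
# Hypothetical post-processing sketch (not the authors' ANSYS workflow): given
# node positions and fields from a CFD solution, record the maximum temperature
# and speed on cross-sections cut at 2-mm intervals along the airway axis.
import numpy as np

def per_slice_maxima(z_mm, temperature, speed, slice_mm=2.0):
    z = np.asarray(z_mm, float)
    t = np.asarray(temperature, float)
    u = np.asarray(speed, float)
    idx = np.floor((z - z.min()) / slice_mm).astype(int)   # slice index per node
    rows = []
    for i in np.unique(idx):
        m = idx == i
        rows.append((z.min() + i * slice_mm, t[m].max(), u[m].max()))
    return rows   # (slice start [mm], max temperature, max speed)

# made-up node data: axial position [mm], temperature [deg C], speed [m/s]
rng = np.random.default_rng(0)
z = rng.uniform(0.0, 100.0, 5000)
T = 80.0 - 0.4 * z + rng.normal(0.0, 1.0, z.size)
u = np.abs(rng.normal(3.0, 1.0, z.size))
for z0, tmax, umax in per_slice_maxima(z, T, u)[:3]:
    print(f"slice @ {z0:5.1f} mm: Tmax = {tmax:5.1f} degC, |u|max = {umax:4.2f} m/s")
```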

  18. AI and simulation: What can they learn from each other

    NASA Technical Reports Server (NTRS)

    Colombano, Silvano P.

    1988-01-01

    Simulation and Artificial Intelligence share fertile common ground from both a practical and a conceptual point of view. Strengths and weaknesses of both Knowledge Based Systems and Modeling and Simulation are examined, and three types of systems that combine the strengths of both technologies are discussed. These types of systems are a practical starting point; however, the real strengths of both technologies will be exploited only when they are combined in a common knowledge representation paradigm. From an even deeper conceptual point of view, one might argue that the ability to reason from a set of facts (i.e., an Expert System) is less representative of human reasoning than the ability to make a model of the world, change it as required, and derive conclusions about the expected behavior of world entities. This is a fundamental problem in AI, and Modeling Theory can contribute to its solution. The application of Knowledge Engineering technology to a Distributed Processing Network Simulator (DPNS) is discussed.

  19. A “Skylight” Simulator for HWIL Simulation of Hyperspectral Remote Sensing

    PubMed Central

    Zhao, Huijie; Cui, Bolun; Li, Xudong; Zhang, Chao; Zhang, Xinyang

    2017-01-01

    Even though digital simulation technology has been widely used in the last two decades, hardware-in-the-loop (HWIL) simulation is still an indispensable method for spectral uncertainty research of ground targets. However, previous facilities mainly focus on the simulation of panchromatic imaging. Therefore, neither the spectral nor the spatial performance is enough for hyperspectral simulation. To improve the accuracy of illumination simulation, a new dome-like skylight simulator is designed and developed to fit the spatial distribution and spectral characteristics of a real skylight for the wavelength from 350 nm to 2500 nm. The simulator’s performance was tested using a spectroradiometer with different accessories. The spatial uniformity is greater than 0.91. The spectral mismatch decreases to 1/243 of the spectral mismatch of the Imagery Simulation Facility (ISF). The spatial distribution of radiance can be adjusted, and the accuracy of the adjustment is greater than 0.895. The ability of the skylight simulator is also demonstrated by comparing radiometric quantities measured in the skylight simulator with those in a real skylight in Beijing. PMID:29211004

  20. Towards Reconfigurable, Separable and Hard Real-Time Hybrid Simulation and Test Systems

    NASA Astrophysics Data System (ADS)

    Quartier, F.; Delatte, B.; Joubert, M.

    2009-05-01

    Formation flight needs several new technologies, new disciplines, new approaches and, above all, more concurrent engineering by more players. One of the problems to be addressed is more complex simulation and test systems that are easy to reconfigure to include parts of the target hardware and that can provide sufficient power to handle simulation cores requiring one to two orders of magnitude more processing power than current technology provides. Critical technologies already addressed by CNES and Spacebel are study model reuse and simulator reconfigurability (Basiles), model portability (SMP2) and the federation of several simulators using HLA. Two further critical issues, concerning time engineering and management, are addressed in ongoing R&D work by CNES and Spacebel and are covered by this paper. The first issue concerns separability (characterisation, identification and handling of separable subsystems) and its consequences for practical systems. Experiments on the Pleiades operational simulator have shown that precise simulation of instruments such as Doris and the Star Tracker can be added without significantly impacting overall performance. Improved time analysis leads to better system understanding and testability. The second issue concerns architectures for distributed hybrid simulator systems that provide hard real-time capabilities and can react with a relative time precision and jitter in the 10 to 50 µsecond range using mainstream PCs and mainstream operating systems. This opens a way to make smaller, economical hardware test systems that can be reconfigured into large hardware test systems without restarting development. Although such systems were considered next to impossible until now, distributed hard real-time systems come within reach when modern but mainstream electronics are used and when processor cores can be isolated and reserved for real-time tasks. This requires a complete rethinking of the overall system, but needs very few overall changes. Automated identification of potential parallel simulation capability might become possible in the not so distant future.
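
    The core-isolation idea mentioned above can be sketched in a few lines; the snippet below is a Linux-only illustration, not the authors' system, that pins the current process to an assumed reserved core and measures scheduling jitter of a 50-microsecond cycle with a busy-wait loop.

```python
# Linux-only sketch under assumptions (not the authors' system): pin the
# current process to a core assumed to be reserved for real-time work, then
# measure scheduling jitter of a 50-microsecond cycle with a busy-wait loop.
# Actual core isolation (e.g. kernel boot parameters) is outside this sketch.
import os, time

RESERVED_CORE = 3                       # assumption: core 3 is reserved
os.sched_setaffinity(0, {RESERVED_CORE})

PERIOD_NS = 50_000                      # 50 us cycle, the upper end of the quoted range
worst_jitter_ns = 0
next_deadline = time.perf_counter_ns() + PERIOD_NS
for _ in range(10_000):
    while time.perf_counter_ns() < next_deadline:
        pass                            # busy-wait to avoid scheduler wake-up latency
    worst_jitter_ns = max(worst_jitter_ns, time.perf_counter_ns() - next_deadline)
    next_deadline += PERIOD_NS
print(f"worst observed jitter: {worst_jitter_ns / 1000:.1f} us")
```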

  1. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    NASA Astrophysics Data System (ADS)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of millions of computers on the Internet, and use them to run large-scale environmental simulations and models that serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to those of native applications and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language such as JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Users can easily enable their websites so that visitors can volunteer their computer resources to help run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational units. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.
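
    The abstract mentions a relational database used for queue management of the distributed computing nodes; the following is a minimal sketch under assumptions (SQLite, illustrative table and column names), not the authors' framework, showing how small spatial work units might be handed out and their results collected.

```python
# Minimal sketch, not the authors' framework: a relational task queue that
# hands out small spatial work units to volunteer nodes and records results.
# Table and column names are illustrative assumptions.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE tasks (
    id INTEGER PRIMARY KEY, tile TEXT,
    status TEXT DEFAULT 'pending', result REAL)""")
db.executemany("INSERT INTO tasks (tile) VALUES (?)",
               [(f"tile_{i}",) for i in range(4)])

def checkout_task():
    """Hand the next pending tile to a volunteer node (single-process sketch)."""
    row = db.execute(
        "SELECT id, tile FROM tasks WHERE status = 'pending' LIMIT 1").fetchone()
    if row is not None:
        db.execute("UPDATE tasks SET status = 'running' WHERE id = ?", (row[0],))
    return row

def submit_result(task_id, value):
    """Record the value computed by the volunteer (e.g. simulated runoff)."""
    db.execute("UPDATE tasks SET status = 'done', result = ? WHERE id = ?",
               (value, task_id))

task = checkout_task()
submit_result(task[0], 42.0)
print(db.execute("SELECT tile, status, result FROM tasks").fetchall())
```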

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Timothy M.; Palmintier, Bryan; Suryanarayanan, Siddharth

    As more Smart Grid technologies (e.g., distributed photovoltaics, spatially distributed electric vehicle charging) are integrated into distribution grids, static distribution simulations are no longer sufficient for performing modeling and analysis. GridLAB-D is an agent-based distribution system simulation environment that allows fine-grained end-user models, including geospatial and network topology detail. A problem exists in that, without outside intervention, once the GridLAB-D simulation begins execution, it will run to completion without allowing the real-time interaction of Smart Grid controls, such as home energy management systems and aggregator control. We address this lack of runtime interaction by designing a flexible communication interface, Bus.py (pronounced bus-dot-pie), that uses Python to pass messages between one or more GridLAB-D instances and a Smart Grid simulator. This work describes the design and implementation of Bus.py, discusses its usefulness in terms of some Smart Grid scenarios, and provides an example of an aggregator-based residential demand response system interacting with GridLAB-D through Bus.py. The small-scale example demonstrates the validity of the interface and shows that an aggregator using the interface is able to control residential loads in GridLAB-D during runtime to reduce the peak load on the distribution system in both (a) peak reduction and (b) time-of-use pricing cases.
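
    The snippet below is a conceptual sketch only, not the Bus.py API: it illustrates the message-passing pattern the abstract describes, in which a distribution simulation pauses at each step, forwards load measurements to an aggregator controller, and applies the returned setpoints. All names and numbers are assumptions.

```python
# Conceptual sketch only -- NOT the Bus.py API.  It illustrates a message
# bridge in which a distribution simulation pauses at each step, forwards load
# measurements to an aggregator controller, and applies the returned setpoints.
import queue

to_controller, to_grid = queue.Queue(), queue.Queue()

def aggregator_step():
    """Toy demand-response aggregator: curtail houses drawing above 5 kW."""
    loads = to_controller.get()
    to_grid.put({house: (0.8 if kw > 5.0 else 1.0) for house, kw in loads.items()})

def distribution_sim_step(t, loads):
    """Stand-in for one distribution-simulation step exchanging messages."""
    to_controller.put(loads)
    aggregator_step()
    setpoints = to_grid.get()
    return {house: kw * setpoints[house] for house, kw in loads.items()}

print(distribution_sim_step(0, {"house_1": 6.2, "house_2": 3.1}))
```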

  3. ModSAF Programmers Reference Manual. Volume 1

    DTIC Science & Technology

    1993-12-20

    Advanced Distributed Simulation Technology (ADST) report ADST-TR-W003268 (CDRL A001), prepared by the ADST Program Office for the U.S. Army Simulation, Training, and Instrumentation Command (STRICOM), 12350 Research Parkway, Orlando, FL 32826-3276. Only the report front matter is recoverable from this record.

  4. Monitoring the distribution of prompt gamma rays in boron neutron capture therapy using a multiple-scattering Compton camera: A Monte Carlo simulation study

    NASA Astrophysics Data System (ADS)

    Lee, Taewoong; Lee, Hyounggun; Lee, Wonho

    2015-10-01

    This study evaluated the use of Compton imaging technology to monitor prompt gamma rays emitted by 10B in boron neutron capture therapy (BNCT) applied to a computerized human phantom. The Monte Carlo method, including particle-tracking techniques, was used for simulation. The distribution of prompt gamma rays emitted by the phantom during irradiation with neutron beams is closely associated with the distribution of boron in the phantom. The maximum likelihood expectation maximization (MLEM) method was applied to the information obtained from the detected prompt gamma rays to reconstruct the distribution of the tumor, including the boron uptake regions (BURs). The reconstructed Compton images of the prompt gamma rays were combined with the cross-sectional images of the human phantom. Quantitative analysis of the intensity curves showed that all combined images matched the predetermined conditions of the simulation. The tumors including the BURs were distinguishable if they were more than 2 cm apart.
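
    For readers unfamiliar with MLEM, the sketch below shows the standard multiplicative update on a toy problem; the random system matrix merely stands in for the Compton-camera response, which the cited study models in far greater detail.

```python
# Sketch of the standard MLEM multiplicative update on a toy problem.  The
# random system matrix merely stands in for the Compton-camera response, which
# the cited study models in far greater detail.
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_meas = 64, 256
A = rng.random((n_meas, n_pix))           # P(count in bin i | emission in pixel j)
x_true = np.zeros(n_pix)
x_true[20:24] = 1.0                       # a small boron-uptake-like region
y = rng.poisson(A @ x_true * 50.0)        # noisy prompt-gamma measurements

x = np.ones(n_pix)                        # uniform initial estimate
sensitivity = A.sum(axis=0)               # A^T 1
for _ in range(50):                       # MLEM: x <- x/(A^T 1) * A^T( y / (A x) )
    proj = A @ x
    x *= (A.T @ (y / np.maximum(proj, 1e-12))) / sensitivity
print("reconstruction peaks at pixel", int(np.argmax(x)))
```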

  5. Secure Large-Scale Airport Simulations Using Distributed Computational Resources

    NASA Technical Reports Server (NTRS)

    McDermott, William J.; Maluf, David A.; Gawdiak, Yuri; Tran, Peter; Clancy, Dan (Technical Monitor)

    2001-01-01

    To fully conduct research that will support the far-term concepts, technologies and methods required to improve the safety of Air Transportation a simulation environment of the requisite degree of fidelity must first be in place. The Virtual National Airspace Simulation (VNAS) will provide the underlying infrastructure necessary for such a simulation system. Aerospace-specific knowledge management services such as intelligent data-integration middleware will support the management of information associated with this complex and critically important operational environment. This simulation environment, in conjunction with a distributed network of supercomputers, and high-speed network connections to aircraft, and to Federal Aviation Administration (FAA), airline and other data-sources will provide the capability to continuously monitor and measure operational performance against expected performance. The VNAS will also provide the tools to use this performance baseline to obtain a perspective of what is happening today and of the potential impact of proposed changes before they are introduced into the system.

  6. Computer Modeling to Evaluate the Impact of Technology Changes on Resident Procedural Volume.

    PubMed

    Grenda, Tyler R; Ballard, Tiffany N S; Obi, Andrea T; Pozehl, William; Seagull, F Jacob; Chen, Ryan; Cohn, Amy M; Daskin, Mark S; Reddy, Rishindra M

    2016-12-01

    As resident "index" procedures change in volume due to advances in technology or reliance on simulation, it may be difficult to ensure trainees meet case requirements. Training programs are in need of metrics to determine how many residents their institutional volume can support. As a case study of how such metrics can be applied, we evaluated a case distribution simulation model to examine program-level mediastinoscopy and endobronchial ultrasound (EBUS) volumes needed to train thoracic surgery residents. A computer model was created to simulate case distribution based on annual case volume, number of trainees, and rotation length. Single institutional case volume data (2011-2013) were applied, and 10 000 simulation years were run to predict the likelihood (95% confidence interval) of all residents (4 trainees) achieving board requirements for operative volume during a 2-year program. The mean annual mediastinoscopy volume was 43. In a simulation of pre-2012 board requirements (thoracic pathway, 25; cardiac pathway, 10), there was a 6% probability of all 4 residents meeting requirements. Under post-2012 requirements (thoracic, 15; cardiac, 10), however, the likelihood increased to 88%. When EBUS volume (mean 19 cases per year) was concurrently evaluated in the post-2012 era (thoracic, 10; cardiac, 0), the likelihood of all 4 residents meeting case requirements was only 23%. This model provides a metric to predict the probability of residents meeting case requirements in an era of changing volume by accounting for unpredictable and inequitable case distribution. It could be applied across operations, procedures, or disease diagnoses and may be particularly useful in developing resident curricula and schedules.
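
    The exact case-assignment rules of the cited model are not given in this record; the sketch below is a simplified Monte Carlo stand-in that assigns each year's cases uniformly at random among the residents and estimates the probability that all of them reach a requirement, using the mediastinoscopy numbers quoted above.

```python
# Simplified stand-in for the cited model (whose exact assignment rules are not
# given in this record): each year's cases are assigned uniformly at random
# among the residents, and many simulated 2-year cohorts estimate the
# probability that every resident reaches the requirement.
import numpy as np

rng = np.random.default_rng(1)
ANNUAL_CASES, N_RESIDENTS, YEARS = 43, 4, 2   # mediastinoscopy figures from the abstract
REQUIREMENT = 15                              # post-2012 thoracic pathway
N_SIMS = 10_000

def all_meet_requirement():
    counts = np.zeros(N_RESIDENTS, dtype=int)
    for _ in range(YEARS):
        # unpredictable (and possibly inequitable) distribution of the year's cases
        assignment = rng.integers(0, N_RESIDENTS, size=ANNUAL_CASES)
        counts += np.bincount(assignment, minlength=N_RESIDENTS)
    return bool((counts >= REQUIREMENT).all())

p = np.mean([all_meet_requirement() for _ in range(N_SIMS)])
print(f"estimated P(all {N_RESIDENTS} residents reach {REQUIREMENT} cases) = {p:.2f}")
```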

  7. Application of Microsoft's ActiveX and DirectX technologies to the visualization of physical system dynamics

    NASA Astrophysics Data System (ADS)

    Mann, Christopher; Narasimhamurthi, Natarajan

    1998-08-01

    This paper discusses a specific implementation of a web- and component-based simulation system. The overall simulation container is implemented within a web page viewed with Microsoft's Internet Explorer 4.0 web browser. Microsoft's ActiveX/Distributed Component Object Model (DCOM) object interfaces are used in conjunction with the Microsoft DirectX graphics APIs to provide visualization functionality for the simulation. The MathWorks' Matlab computer-aided control system design program is used as an ActiveX automation server to provide the compute engine for the simulations.

  8. Investigating the impact of visuohaptic simulations for the conceptual understanding of electric field for distributed charges

    NASA Astrophysics Data System (ADS)

    Shaikh, Uzma Abdul Sattar

    The present study assessed the benefits of a multisensory intervention on the conceptual understanding of the electric field for distributed charges in engineering and technology undergraduate students. A novel visuohaptic intervention was proposed, which focused on exploring the forces around three electric field configurations for distributed charges, namely a point charge, an infinitely long line of charge, and a uniformly charged ring. The before and after effects of the visuohaptic intervention, which includes instructional scaffolding, are compared. Three single-group studies were conducted to investigate the effect among three different populations: (a) undergraduate engineering students, (b) undergraduate technology students, and (c) undergraduate engineering technology students from a different demographic setting. The findings from the three studies suggest that the haptic-modality intervention provides beneficial effects by allowing students to improve their conceptual understanding of the electric field for distributed charges, although students from groups (b) and (c) showed a statistically significant increase in conceptual understanding.
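
    For reference, the field magnitudes for the three configurations studied are standard electrostatics results; they are stated below for context and are not taken from the study itself.

```latex
% Standard electrostatics results for the three configurations (reference only;
% not taken from the cited study):
E_{\mathrm{point}}(r) = \frac{1}{4\pi\varepsilon_{0}}\,\frac{q}{r^{2}}, \qquad
E_{\mathrm{line}}(r)  = \frac{\lambda}{2\pi\varepsilon_{0}\, r}, \qquad
E_{\mathrm{ring}}(z)  = \frac{1}{4\pi\varepsilon_{0}}\,
                        \frac{Q\, z}{\left( z^{2} + R^{2} \right)^{3/2}}
% E_line: field at distance r from an infinitely long line with charge density
% \lambda.  E_ring: on-axis field at distance z from the center of a uniformly
% charged ring of radius R and total charge Q.
```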

  9. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    PubMed

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols.
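
    The Distributome itself is a web/HTML5 infrastructure; purely as an illustration of the kinds of tasks it supports (inspecting analytical properties, sampling, and model fitting), a few lines of Python with scipy.stats are sketched below. This is not the Distributome API.

```python
# Not the Distributome toolset (which is web/HTML5 based); a small Python
# illustration of the kinds of tasks it supports: inspecting analytical
# properties of a distribution, sampling from it, and fitting it to data.
from scipy import stats

dist = stats.gamma(a=2.0, scale=3.0)            # choose a family and parameters
print("mean, variance:", dist.mean(), dist.var())
print("P(X <= 10) =", dist.cdf(10.0))

data = dist.rvs(size=1000, random_state=0)      # simulation / sampling
a_hat, loc_hat, scale_hat = stats.gamma.fit(data, floc=0)   # model fitting
print(f"fitted shape = {a_hat:.2f}, scale = {scale_hat:.2f}")
```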

  10. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions

    PubMed Central

    Dinov, Ivo D.; Siegrist, Kyle; Pearl, Dennis K.; Kalinin, Alexandr; Christou, Nicolas

    2015-01-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols. PMID:27158191

  11. Lunar Regolith Simulant Materials: Recommendations for Standardization, Production, and Usage

    NASA Technical Reports Server (NTRS)

    Sibille, L.; Carpenter, P.; Schlagheck, R.; French, R. A.

    2006-01-01

    Experience gained during the Apollo program demonstrated the need for extensive testing of surface systems in relevant environments, including regolith materials similar to those encountered on the lunar surface. As NASA embarks on a return to the Moon, it is clear that the current lunar sample inventory is not only insufficient to support lunar surface technology and system development, but its scientific value is too great to be consumed by destructive studies. Every effort must be made to utilize standard simulant materials, which will allow developers to reduce the cost, development, and operational risks to surface systems. The Lunar Regolith Simulant Materials Workshop held in Huntsville, AL, on January 24-26, 2005, identified the need for widely accepted standard reference lunar simulant materials to perform research and development of technologies required for lunar operations. The workshop also established a need for a common, traceable, and repeatable process regarding the standardization, characterization, and distribution of lunar simulants. This document presents recommendations for the standardization, production and usage of lunar regolith simulant materials.

  12. Distributed decision support for the 21st century mission space

    NASA Astrophysics Data System (ADS)

    McQuay, William K.

    2002-07-01

    The past decade has produced significant changes in the conduct of military operations: increased humanitarian missions, asymmetric warfare, reliance on coalitions and allies, stringent rules of engagement, concern about casualties, and the need for sustained air operations. Future mission commanders will need to assimilate a tremendous amount of information, make quick-response decisions, and quantify the effects of those decisions in the face of uncertainty. Integral to this process is creating situational assessment (understanding the mission space), simulation to analyze alternative futures, current capabilities, planning assessments, course-of-action assessments, and a common operational picture (keeping everyone on the same sheet of paper). Decision support tools in a distributed collaborative environment offer the capability of decomposing these complex multitask processes and distributing them over a dynamic set of execution assets. Decision support technologies can semi-automate activities that have a reasonably well-defined process, such as planning an operation, and provide machine-level interfaces to refine the myriad of information that is not currently fused. The marriage of information and simulation technologies provides the mission commander with a collaborative virtual environment for planning and decision support.

  13. A study and simulation of the impact of high-order aberrations to overlay error distribution

    NASA Astrophysics Data System (ADS)

    Sun, G.; Wang, F.; Zhou, C.

    2011-03-01

    With the reduction of design rules, a number of corresponding new technologies, such as i-HOPC, HOWA and DBO, have been proposed and applied to eliminate overlay error. When these technologies are in use, any high-order error distribution needs to be clearly distinguished in order to remove the underlying causes. Lens aberrations are normally thought to mainly impact the Matching Machine Overlay (MMO). However, when using image-based overlay (IBO) measurement tools, aberrations become the dominant influence on single machine overlay (SMO) and even on stage repeatability performance. In this paper, several measurements of the error distributions of the lens of the SMEE SSB600/10 prototype exposure tool are presented. Models that characterize the primary influence of lens magnification, high-order distortion, coma aberration and telecentricity are shown. The contribution to stage repeatability (as measured with IBO tools) from the above errors was predicted with a simulator and compared to experiments. Finally, the drift of each lens distortion term that impacts SMO was monitored over several days and matched with the measurement results.

  14. Distributed automatic control of technological processes in conditions of weightlessness

    NASA Technical Reports Server (NTRS)

    Kukhtenko, A. I.; Merkulov, V. I.; Samoylenko, Y. I.; Ladikov-Royev, Y. P.

    1986-01-01

    Some problems associated with the automatic control of liquid metal and plasma systems under conditions of weightlessness are examined, with particular reference to the problem of stability of liquid equilibrium configurations. The theoretical fundamentals of automatic control of processes in electrically conducting continuous media are outlined, and means of using electromagnetic fields for simulating technological processes in a space environment are discussed.

  15. LDRD project final report : hybrid AI/cognitive tactical behavior framework for LVC.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Djordjevich, Donna D.; Xavier, Patrick Gordon; Brannon, Nathan Gregory

    This Lab-Directed Research and Development (LDRD) project sought to develop technology that enhances scenario construction speed, entity behavior robustness, and scalability in Live-Virtual-Constructive (LVC) simulation. We investigated issues in both simulation architecture and behavior modeling. We developed path-planning technology that improves the ability to express intent in the planning task while still permitting an efficient search algorithm. An LVC simulation demonstrated how this enables 'one-click' layout of squad tactical paths, as well as dynamic re-planning for simulated squads and for real and simulated mobile robots. We identified human response latencies that can be exploited in parallel/distributed architectures. We did an experimental study to determine where parallelization would be productive in Umbra-based force-on-force (FOF) simulations. We developed and implemented a data-driven simulation composition approach that solves entity class hierarchy issues and supports assurance of simulation fairness. Finally, we proposed a flexible framework to enable integration of multiple behavior modeling components that model working memory phenomena with different degrees of sophistication.

  16. SAMICS marketing and distribution model

    NASA Technical Reports Server (NTRS)

    1978-01-01

    SAMICS (Solar Array Manufacturing Industry Costing Standards) was formulated as a computer simulation model. Given a proper description of the manufacturing technology as input, this model computes the manufacturing price of solar arrays for a broad range of production levels. This report presents a model for computing the associated marketing and distribution costs, the end point of the model being the loading dock of the final manufacturer.

  17. Trans-oceanic Remote Power Hardware-in-the-Loop: Multi-site Hardware, Integrated Controller, and Electric Network Co-simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lundstrom, Blake R.; Palmintier, Bryan S.; Rowe, Daniel

    Electric system operators are increasingly concerned with the potential system-wide impacts of the large-scale integration of distributed energy resources (DERs), including voltage control, protection coordination, and equipment wear. This prompts a need for new simulation techniques that can simultaneously capture all the components of these large integrated smart grid systems. This paper describes a novel platform that combines three emerging research areas: power systems co-simulation, power hardware-in-the-loop (PHIL) simulation, and lab-lab links. The platform is distributed, real-time capable, allows for easy internet-based connection from geographically dispersed participants, and is software platform agnostic. We demonstrate its utility by studying real-time PHIL co-simulation of coordinated solar PV firming control of two inverters connected in multiple electric distribution network models, prototypical of U.S. and Australian systems. Here, the novel trans-pacific closed-loop system simulation was conducted in real time using a power network simulator and a physical PV/battery inverter at power at the National Renewable Energy Laboratory in Golden, CO, USA and a physical PV inverter at power at the Commonwealth Scientific and Industrial Research Organisation's Energy Centre in Newcastle, NSW, Australia. This capability enables smart grid researchers throughout the world to leverage their unique simulation capabilities for multi-site collaborations that can effectively simulate and validate emerging smart grid technology solutions.

  18. Trans-oceanic Remote Power Hardware-in-the-Loop: Multi-site Hardware, Integrated Controller, and Electric Network Co-simulation

    DOE PAGES

    Lundstrom, Blake R.; Palmintier, Bryan S.; Rowe, Daniel; ...

    2017-07-24

    Electric system operators are increasingly concerned with the potential system-wide impacts of the large-scale integration of distributed energy resources (DERs), including voltage control, protection coordination, and equipment wear. This prompts a need for new simulation techniques that can simultaneously capture all the components of these large integrated smart grid systems. This paper describes a novel platform that combines three emerging research areas: power systems co-simulation, power hardware-in-the-loop (PHIL) simulation, and lab-lab links. The platform is distributed, real-time capable, allows for easy internet-based connection from geographically dispersed participants, and is software platform agnostic. We demonstrate its utility by studying real-time PHIL co-simulation of coordinated solar PV firming control of two inverters connected in multiple electric distribution network models, prototypical of U.S. and Australian systems. Here, the novel trans-pacific closed-loop system simulation was conducted in real time using a power network simulator and a physical PV/battery inverter at power at the National Renewable Energy Laboratory in Golden, CO, USA and a physical PV inverter at power at the Commonwealth Scientific and Industrial Research Organisation's Energy Centre in Newcastle, NSW, Australia. This capability enables smart grid researchers throughout the world to leverage their unique simulation capabilities for multi-site collaborations that can effectively simulate and validate emerging smart grid technology solutions.

  19. Advanced distributed simulation technology: Digital Voice Gateway Reference Guide

    NASA Astrophysics Data System (ADS)

    Vanhook, Dan; Stadler, Ed

    1994-01-01

    The Digital Voice Gateway (referred to as the 'DVG' in this document) transmits and receives four full-duplex encoded speech channels over the Ethernet. The information in this document applies only to DVGs running firmware of the version listed on the title page. This document, previously issued as the Digital Voice Gateway Reference Guide, BBN Systems and Technologies Corporation, Cambridge, MA 02138, was revised for revision 2.00. The new revision changes the network protocol used by the DVG to comply with the SINCGARS radio simulation (for SIMNET 6.6.1). Because of the extensive changes in revision 2.00, a separate document was created rather than supplying change pages.

  20. Stay Alive--Simulation for Situational Safety Awareness

    NASA Technical Reports Server (NTRS)

    Ruder, Michelle

    2008-01-01

    STAY ALIVE is an idea for a safety awareness simulation prototype, powered by gaming technology, that would make safety training enlightening, engaging and fun. Recalling initial instructions and using situational awareness principles, participants would escape a fire by choosing the appropriate door. Escape times would be measured while stressors increased. This presentation describes how STAY ALIVE utilizes first person point of view (PoV), a generic scenario, immersion- and presence-enhancing design, and ease of distribution to provide more people opportunity to realize, review, analyze and practice effective awareness behaviors. The goals for this prototype include facilitating interest in first-person PoV safety training and eliciting further suggestions on prevention technologies.

  1. On validating remote sensing simulations using coincident real data

    NASA Astrophysics Data System (ADS)

    Wang, Mingming; Yao, Wei; Brown, Scott; Goodenough, Adam; van Aardt, Jan

    2016-05-01

    The remote sensing community often requires data simulation, either via spectral/spatial downsampling or through virtual, physics-based models, to assess systems and algorithms. The Digital Imaging and Remote Sensing Image Generation (DIRSIG) model is one such first-principles, physics-based model for simulating imagery for a range of modalities. Complex simulation of vegetation environments has subsequently become possible as scene rendering technology and software have advanced. This in turn has raised questions about the validity of such complex models, with potential multiple scattering, bidirectional reflectance distribution function (BRDF), and related phenomena that could impact results in the case of complex vegetation scenes. We selected three sites located in the Pacific Southwest domain (Fresno, CA) of the National Ecological Observatory Network (NEON). These sites represent oak savanna, hardwood forests, and conifer-manzanita mixed forests. We constructed corresponding virtual scenes, using airborne LiDAR and imaging spectroscopy data from NEON, ground-based LiDAR data, and field-collected spectra to characterize the scenes. Imaging spectroscopy data for these virtual sites were then generated using the DIRSIG simulation environment. This simulated imagery was compared to real AVIRIS imagery (15 m spatial resolution; 12 pixels/scene) and NEON Airborne Observation Platform (AOP) data (1 m spatial resolution; 180 pixels/scene). These tests were performed using a distribution-comparison approach for select spectral statistics, e.g., statistics that establish the spectra's shape, for each simulated versus real distribution pair. The initial comparison of the spectral distributions indicated that the shapes of the spectra at the virtual and real sites were closely matched.
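
    The record does not specify which comparison statistic was used; the sketch below uses a two-sample Kolmogorov-Smirnov test purely as an illustrative stand-in for comparing the simulated and real distributions of a per-pixel spectral statistic. The data are synthetic.

```python
# Illustrative stand-in only: the record does not name the comparison
# statistic, so a two-sample Kolmogorov-Smirnov test is used here to compare
# the simulated and real distributions of a per-pixel spectral statistic
# (e.g. a band ratio).  The data below are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
simulated_stat = rng.normal(0.45, 0.05, size=180)   # e.g. DIRSIG pixels of one site
real_stat = rng.normal(0.46, 0.06, size=180)        # e.g. NEON AOP pixels of the site

ks_stat, p_value = stats.ks_2samp(simulated_stat, real_stat)
print(f"KS statistic = {ks_stat:.3f}, p = {p_value:.3f}")
# A large p-value indicates the two distributions are statistically
# indistinguishable at the chosen level, i.e. the simulated spectra's shape
# matches that of the real imagery.
```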

  2. Ship electric propulsion simulator based on networking technology

    NASA Astrophysics Data System (ADS)

    Zheng, Huayao; Huang, Xuewu; Chen, Jutao; Lu, Binquan

    2006-11-01

    In response to new shipbuilding trends, a novel electric propulsion simulator (EPS) has been developed at the Marine Simulation Center of SMU. The architecture, software functions and FCS network technology of the EPS and its integrated power system (IPS) are described. A dedicated physical model was built for the ship's POD propeller. The POD power is supplied from a simulated 6.6 kV medium-voltage main switchboard, and its control can be realized in local or remote mode. Through a LAN, the simulated feature information of the EPS is passed to the physical POD model, which reflects the real thruster working status in different sea conditions. The software includes a vessel-propeller mathematical module, the thruster control system, distribution and emergency integrated management, a double closed-loop control system, vessel static water resistance and dynamics software, and the instructor's main control software. The monitoring and control system is realized by a real-time data collection system and CAN bus technology. During construction, most devices, such as monitor panels and intelligent meters, were developed in the laboratory based on embedded microcomputer systems with CAN interfaces to link to the network. They have also been used successfully in practice and will be suitable for the future demands of ship digitalization.

  3. A 3D radiative transfer model based on lidar data and its application on hydrological and ecosystem modeling

    NASA Astrophysics Data System (ADS)

    Li, W.; Su, Y.; Harmon, T. C.; Guo, Q.

    2013-12-01

    Light Detection and Ranging (lidar) is an optical remote sensing technology that measures properties of scattered light to find the range and/or other information about a distant object. Due to its ability to generate 3-dimensional data with high spatial resolution and accuracy, lidar technology is being increasingly used in ecology, geography, geology, geomorphology, seismology, remote sensing, and atmospheric physics. In this study we construct a 3-dimensional (3D) radiative transfer model (RTM) using lidar data to simulate the spatial distribution of solar radiation (direct and diffuse) on the surface of water and mountain forests. The model includes three sub-models: a light model simulating the light source, a sensor model simulating the camera, and a scene model simulating the landscape. We use ground-based and airborne lidar data to characterize the 3D structure of the study area and generate a detailed 3D scene model. The interactions between light and objects are simulated using the Monte Carlo Ray Tracing (MCRT) method. A large number of rays are generated from the light source. For each individual ray, the full traveling path is traced until it is absorbed or escapes from the scene boundary. By locating the sensor at different positions and directions, we can simulate the spatial distribution of solar energy at the ground, vegetation and water surfaces. These outputs can then be incorporated into meteorological drivers for hydrologic and energy balance models to improve our understanding of hydrologic processes and ecosystem functions.
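
    The tracing loop described above can be sketched compactly; the snippet below is a heavily simplified illustration (homogeneous medium, isotropic scattering, box-shaped scene boundary), not the model used in the study, and all parameter values are assumptions.

```python
# Heavily simplified Monte Carlo ray-tracing sketch (homogeneous medium,
# isotropic scattering, box-shaped boundary) illustrating the tracing loop
# described above: each ray travels until it is absorbed or escapes.  All
# parameter values are assumptions, not those of the study.
import numpy as np

rng = np.random.default_rng(2)
SIGMA_T, ALBEDO, BOUND = 0.5, 0.8, 10.0   # extinction [1/m], scattering albedo, half-size [m]

def trace_ray():
    pos = np.zeros(3)
    direction = np.array([0.0, 0.0, -1.0])             # launched downward from the source
    while True:
        step = -np.log(1.0 - rng.random()) / SIGMA_T    # sampled free-path length
        pos = pos + step * direction
        if np.any(np.abs(pos) > BOUND):
            return "escaped"
        if rng.random() > ALBEDO:
            return "absorbed"
        v = rng.normal(size=3)                          # new isotropic direction
        direction = v / np.linalg.norm(v)

results = [trace_ray() for _ in range(10_000)]
print("absorbed fraction:", results.count("absorbed") / len(results))
```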

  4. Proceedings of the Organization of 1990 Meeting of International Neural Network Society Jointed with IEEE Held in Washington, DC on January 15 - 19, 1990. Volume 2. Applications Track.

    DTIC Science & Technology

    1990-11-30

    Recoverable table-of-contents entries from this record include: Simonotto (Universita' di Genova), "Learning from Natural Selection in an Artificial Environment"; Ethem Alpaydin (Swiss Federal Institute of Technology), "Framework for Distributed Artificial Neural System Simulation"; and David Y. Fong (Lockheed Missiles and Space Co.) with Christopher Tocci (Raytheon Co.), "Simulation of Artificial Neural..."

  5. Rapid Technology Assessment via Unified Deployment of Global Optical and Virtual Diagnostics

    NASA Technical Reports Server (NTRS)

    Jordan, Jeffrey D.; Watkins, A. Neal; Fleming, Gary A.; Leighty, Bradley D.; Schwartz, Richard J.; Ingram, JoAnne L.; Grinstead, Keith D., Jr.; Oglesby, Donald M.; Tyler, Charles

    2003-01-01

    This paper discusses recent developments in rapid technology assessment resulting from an active collaboration between researchers at the Air Force Research Laboratory (AFRL) at Wright Patterson Air Force Base (WPAFB) and the NASA Langley Research Center (LaRC). This program targets the unified development and deployment of global measurement technologies coupled with a virtual diagnostic interface to enable the comparative evaluation of experimental and computational results. Continuing efforts focus on the development of seamless data translation methods to enable integration of data sets of disparate file format in a common platform. Results from a successful low-speed wind tunnel test at WPAFB in which global surface pressure distributions were acquired simultaneously with model deformation and geometry measurements are discussed and comparatively evaluated with numerical simulations. Intensity- and lifetime-based pressure-sensitive paint (PSP) and projection moire interferometry (PMI) results are presented within the context of rapid technology assessment to enable simulation-based R&D.

  6. Effects of welding technology on welding stress based on the finite element method

    NASA Astrophysics Data System (ADS)

    Fu, Jianke; Jin, Jun

    2017-01-01

    The finite element method is used to simulate the welding process under four different conditions for welded flat butt joints. Welding seams are simulated with birth-and-death elements. The size and distribution of the welding residual stress are obtained for the four welding conditions for a Q345 manganese steel plate butt-joint work piece. The results show that two-layer welding reduces the longitudinal and transverse residual stresses, and that welding from the middle toward both sides changes the residual stress distribution and reduces the residual stress in the middle of the work piece.

  7. Simulations and experiments on gas adsorption in novel microporous polymers

    NASA Astrophysics Data System (ADS)

    Larsen, Gregory Steven

    Microporous materials represent a fascinating class of materials with a broad range of applications. The work presented here focuses on the use of a novel class of microporous materials known as polymers of intrinsic microporosity, or PIMs, in gas separation and storage technologies. The aim of this research is to develop a detailed understanding of the relationship between the monomeric structure and the adsorptive performance of PIMs. First, a generalizable structure generation technique was developed such that simulation samples of PIM-1 reproduced experimental densities, scattering, surface areas, pore size distributions, and adsorption isotherms. After validation, the simulations were applied as virtual experiments on several new PIMs with the intent to screen their capabilities as adsorbent materials and elucidate design principles for linear PIMs. The simulations are useful in understanding unique properties such as the pore size distribution and scattering observed experimentally.

  8. Formation Algorithms and Simulation Testbed

    NASA Technical Reports Server (NTRS)

    Wette, Matthew; Sohl, Garett; Scharf, Daniel; Benowitz, Edward

    2004-01-01

    Formation flying for spacecraft is a rapidly developing field that will enable a new era of space science. For one of its missions, the Terrestrial Planet Finder (TPF) project has selected a formation flying interferometer design to detect Earth-like planets orbiting distant stars. In order to advance the technology needed for the TPF formation flying interferometer, the TPF project has been developing a distributed real-time testbed to demonstrate end-to-end operation of formation flying with TPF-like functionality and precision: the Formation Algorithms and Simulation Testbed (FAST). FAST was conceived to bring out issues in timing, data fusion, inter-spacecraft communication, inter-spacecraft sensing and system-wide formation robustness. In this paper we describe FAST and show results from a two-spacecraft formation scenario. The two-spacecraft simulation is the first time that precision end-to-end formation flying operation has been demonstrated in a distributed real-time simulation environment.

  9. Collaborative environments for capability-based planning

    NASA Astrophysics Data System (ADS)

    McQuay, William K.

    2005-05-01

    Distributed collaboration is an emerging technology for the 21st century that will significantly change how business is conducted in the defense and commercial sectors. Collaboration involves two or more geographically dispersed entities working together to create a "product" by sharing and exchanging data, information, and knowledge. A product is defined broadly to include, for example, writing a report, creating software, designing hardware, or implementing robust systems engineering and capability planning processes in an organization. Collaborative environments provide the framework and integrate models, simulations, domain-specific tools, and virtual test beds to facilitate collaboration between the multiple disciplines needed in the enterprise. The Air Force Research Laboratory (AFRL) is conducting a leading-edge program in developing distributed collaborative technologies targeted at the Air Force's implementation of systems engineering for simulation-aided acquisition and capability-based planning. The research is focusing on the open systems agent-based framework, product and process modeling, structural architecture, and the integration technologies, the glue that integrates the software components. In the past four years, two live assessment events have been conducted to demonstrate the technology in support of research for the Air Force Agile Acquisition initiatives. The AFRL Collaborative Environment concept will foster a major cultural change in how the acquisition, training, and operational communities conduct business.

  10. Sensitivity field distributions for segmental bioelectrical impedance analysis based on real human anatomy

    NASA Astrophysics Data System (ADS)

    Danilov, A. A.; Kramarenko, V. K.; Nikolaev, D. V.; Rudnev, S. G.; Salamatova, V. Yu; Smirnov, A. V.; Vassilevski, Yu V.

    2013-04-01

    In this work, an adaptive unstructured tetrahedral mesh generation technology is applied to the simulation of segmental bioimpedance measurements using a high-resolution whole-body model of the Visible Human Project man. Sensitivity field distributions for a conventional tetrapolar scheme, as well as for eight- and ten-electrode measurement configurations, are obtained. Based on the ten-electrode configuration, we suggest an algorithm for monitoring changes in the upper lung area.
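
    For readers unfamiliar with sensitivity fields, the sketch below illustrates the standard lead-field (Geselowitz) formulation often used to compute such maps: the pointwise sensitivity is the dot product of the current-injection and voltage-measurement lead fields. The arrays are random placeholders standing in for a finite-element solution on the body mesh; this is not code from the study.

```python
# Minimal sketch (not from the study): pointwise sensitivity of a tetrapolar
# impedance measurement from two lead fields, following the standard
# Geselowitz lead-field formulation. The current-density arrays are placeholders
# that would normally come from a finite-element solve on the body mesh.
import numpy as np

rng = np.random.default_rng(0)
n_cells = 1000                             # tetrahedra in the (hypothetical) mesh
J_inject = rng.normal(size=(n_cells, 3))   # A/m^2, current-injection lead field
J_measure = rng.normal(size=(n_cells, 3))  # A/m^2, voltage-measurement lead field
I_inject = I_measure = 1.0                 # lead currents, A

# Sensitivity per cell: S = (J_inject . J_measure) / (I_inject * I_measure)
S = np.einsum("ij,ij->i", J_inject, J_measure) / (I_inject * I_measure)

# A resistivity perturbation d_rho (ohm*m) in cells of volume vol (m^3)
# changes the measured impedance by approximately sum(d_rho * S * vol).
vol = np.full(n_cells, 1e-6)
d_rho = np.zeros(n_cells); d_rho[:10] = 5.0
print("approx. impedance change [ohm]:", float(np.sum(d_rho * S * vol)))
```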

  11. Technology evaluation, assessment, modeling, and simulation: the TEAMS capability

    NASA Astrophysics Data System (ADS)

    Holland, Orgal T.; Stiegler, Robert L.

    1998-08-01

    The United States Marine Corps' Technology Evaluation, Assessment, Modeling and Simulation (TEAMS) capability, located at the Naval Surface Warfare Center in Dahlgren, Virginia, provides an environment for detailed test, evaluation, and assessment of live and simulated sensor and sensor-to-shooter systems for the joint warfare community. Frequent use of modeling and simulation allows for cost-effective testing, benchmarking, and evaluation of various levels of sensors and sensor-to-shooter engagements. Interconnectivity to live, instrumented equipment operating in real battle space environments and to remote modeling and simulation facilities participating in advanced distributed simulation (ADS) exercises is available to support a wide range of situational assessment requirements. TEAMS provides a valuable resource for a variety of users. Engineers, analysts, and other technology developers can use TEAMS to evaluate, assess, and analyze tactically relevant phenomenological data on tactical situations. Expeditionary warfare and USMC concept developers can use the facility to support and execute advanced warfighting experiments (AWE) to better assess operational maneuver from the sea (OMFTS) concepts, doctrines, and technology developments. Developers can use the facility to support sensor system hardware, software and algorithm development as well as combat development, acquisition, and engineering processes. Test and evaluation specialists can use the facility to plan, assess, and augment their processes. This paper presents an overview of the TEAMS capability and focuses specifically on the technical challenges associated with the integration of live sensor hardware into a synthetic environment and how those challenges are being met. Existing sensors, recent experiments and facility specifications are featured.

  12. Development of MCNPX-ESUT computer code for simulation of neutron/gamma pulse height distribution

    NASA Astrophysics Data System (ADS)

    Abolfazl Hosseini, Seyed; Vosoughi, Naser; Zangian, Mehdi

    2015-05-01

    In this paper, the development of the MCNPX-ESUT (MCNPX-Energy Engineering of Sharif University of Technology) computer code for the simulation of neutron/gamma pulse height distributions is reported. Since liquid organic scintillators like NE-213 are well suited and routinely used for spectrometry in mixed neutron/gamma fields, this type of detector was selected for simulation in the present study. The proposed simulation algorithm includes four main steps. The first step is the modeling of neutron/gamma particle transport and their interactions with the materials in the environment and detector volume. In the second step, the number of scintillation photons produced by charged particles such as electrons, alphas, protons and carbon nuclei in the scintillator material is calculated. In the third step, the transport of scintillation photons in the scintillator and light guide is simulated. Finally, the resolution corresponding to the experiment is applied in the last step of the simulation. Unlike similar computer codes such as SCINFUL, NRESP7 and PHRESP, the developed computer code is applicable to both neutron and gamma sources. Hence, the discrimination of neutrons and gammas in mixed fields may be performed using the MCNPX-ESUT computer code. The main feature of the MCNPX-ESUT computer code is that the neutron/gamma pulse height simulation may be performed without any post-processing. In the present study, the pulse height distributions due to a monoenergetic neutron/gamma source in an NE-213 detector are simulated using the MCNPX-ESUT computer code. The simulated neutron pulse height distributions are validated through comparison with experimental data (Gohil et al., Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 664 (2012) 304-309) and with results obtained from similar computer codes such as SCINFUL, NRESP7 and Geant4. The simulated gamma pulse height distribution for a 137Cs source is also compared with experimental data.
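
    As an illustration of the final step described above, the sketch below smears an idealized light-output spectrum with an energy-dependent Gaussian resolution function of a commonly used form; the resolution parameters and spectrum are placeholders, not values from MCNPX-ESUT.

```python
# Minimal sketch (not MCNPX-ESUT itself): applying detector energy resolution to
# an ideal light-output spectrum by Gaussian smearing. The resolution parameters
# a, b, c and the spectrum are illustrative placeholders.
import numpy as np

def broaden(edges_mev_ee, counts, a=0.10, b=0.08, c=0.02):
    """Apply dL/L = sqrt(a^2 + b^2/L + c^2/L^2) Gaussian broadening per bin."""
    centers = 0.5 * (edges_mev_ee[:-1] + edges_mev_ee[1:])
    out = np.zeros_like(counts, dtype=float)
    for L, n in zip(centers, counts):
        if n == 0 or L <= 0:
            continue
        sigma = L * np.sqrt(a**2 + b**2 / L + c**2 / L**2) / 2.355  # FWHM -> sigma
        gauss = np.exp(-0.5 * ((centers - L) / sigma) ** 2)
        out += n * gauss / gauss.sum()
    return out

edges = np.linspace(0.0, 2.0, 201)        # light output, MeVee
ideal = np.zeros(200); ideal[120] = 1e4   # idealized mono-energetic peak
print("broadened peak counts near max:", broaden(edges, ideal).max().round(1))
```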

  13. Dry Volume Fracturing Simulation of Shale Gas Reservoir

    NASA Astrophysics Data System (ADS)

    Xu, Guixi; Wang, Shuzhong; Luo, Xiangrong; Jing, Zefeng

    2017-11-01

    Application of CO2 dry fracturing technology to shale gas reservoir development in China has the advantages of no water consumption, little reservoir damage and promotion of CH4 desorption. This paper uses Meyer simulation to study the extension and distribution characteristics of the complex fracture network in shale gas reservoirs during the CO2 dry volume fracturing process. The simulation results prove the validity of the modified CO2 dry fracturing fluid used in shale volume fracturing and provide a theoretical basis for the following study on interval optimization of shale reservoir dry volume fracturing.

  14. Distributed Observer Network

    NASA Technical Reports Server (NTRS)

    Conroy, Michael; Mazzone, Rebecca; Little, William; Elfrey, Priscilla; Mann, David; Mabie, Kevin; Cuddy, Thomas; Loundermon, Mario; Spiker, Stephen; McArthur, Frank

    2010-01-01

    The Distributed Observer Network (DON) is a NASA collaborative environment that leverages game technology to bring three-dimensional simulations to conventional desktop and laptop computers, allowing teams of engineers working on design and operations, either individually or in groups, to view and collaborate on 3D representations of data generated by authoritative tools such as Delmia Envision, Pro/Engineer, or Maya. The DON takes models and telemetry from these sources and, using commercial game engine technology, displays the simulation results in a 3D visual environment. DON has been designed to enhance accessibility and the user's ability to observe and analyze visual simulations in real time. Experiments were conducted with a variety of NASA mission segment simulations, including Synergistic Engineering Environment (SEE) data, NASA Enterprise Visualization Analysis (NEVA) ground processing simulations, the DSS simulation for lunar operations, and the Johnson Space Center (JSC) TRICK tool for guidance, navigation, and control analysis. Desired functionalities were targeted, including TiVo-like playback functions, the capability to communicate textually or via Voice-over-Internet Protocol (VoIP) among team members, and the ability to write and save notes for later access. The resulting DON application was slated for early 2008 release to support simulation use for the Constellation Program and its teams. Those using the DON connect through a client that runs on their PC or Mac. This enables them to observe and analyze the simulation data as their schedule allows, and to review it as frequently as desired. DON team members can move freely within the virtual world. Preset camera points can be established, enabling team members to jump to specific views. This improves opportunities for shared analysis of options, design reviews, tests, operations, training, and evaluations, and improves prospects for verification of requirements, issues, and approaches among dispersed teams.

  15. A Guide for Developing Human-Robot Interaction Experiments in the Robotic Interactive Visualization and Experimentation Technology (RIVET) Simulation

    DTIC Science & Technology

    2016-05-01

    ARL-TR-7683 (May 2016), US Army Research Laboratory: a guide for developing human-robot interaction experiments in the Robotic Interactive Visualization and Experimentation Technology (RIVET) simulation; distribution is unlimited. The indexed excerpt cites Kunkler (2006) on similarities between computer simulation tools and robotic surgery systems (e.g., mechanized feedback) and Davies' review of robotics in surgery (Proceedings of the Institution of Mechanical Engineers, Part H).

  16. Experimental and numerical modeling research of rubber material during microwave heating process

    NASA Astrophysics Data System (ADS)

    Chen, Hailong; Li, Tao; Li, Kunling; Li, Qingling

    2018-05-01

    This paper investigates the heating behavior of block rubber by experimental and numerical methods. The COMSOL Multiphysics 5.0 software was used for the numerical simulation work. The effects of microwave frequency, power, and sample size on the temperature distribution are examined. The effect of frequency on the temperature distribution is pronounced: the maximum and minimum temperatures of the block rubber first increase and then decrease with increasing frequency. The microwave heating efficiency is highest at a frequency of 2450 MHz, although a more uniform temperature distribution is obtained at the other frequencies studied. The influence of microwave power on the temperature distribution is also significant: the smaller the power, the more uniform the temperature distribution in the block rubber, while the effect of power on heating efficiency is small. The effect of sample size on the temperature distribution is also evident: the smaller the sample, the more uniform the temperature distribution but the lower the microwave heating efficiency. The results can serve as a reference for research on heating rubber materials by microwave technology.
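
    As a toy counterpart to the full COMSOL model, the sketch below solves 1D transient heat conduction in a slab with a uniform volumetric microwave heating term; the material properties and absorbed power density are rough, assumed values for illustration only.

```python
# Minimal sketch (not the COMSOL model): 1D transient heat conduction in a rubber
# slab with a uniform volumetric microwave heating term. Material properties and
# the absorbed power density are rough placeholders for illustration.
import numpy as np

L, nx = 0.02, 101                  # slab thickness (m), grid points
k, rho, cp = 0.25, 1100.0, 1700.0  # W/m/K, kg/m^3, J/kg/K (rubber-like values)
q_abs = 5.0e5                      # absorbed microwave power density, W/m^3 (assumed)
alpha = k / (rho * cp)

dx = L / (nx - 1)
dt = 0.4 * dx**2 / alpha           # explicit stability limit with margin
T = np.full(nx, 20.0)              # initial temperature, deg C

t = 0.0
while t < 60.0:                    # heat for 60 s
    lap = (T[:-2] - 2 * T[1:-1] + T[2:]) / dx**2
    T[1:-1] += dt * (alpha * lap + q_abs / (rho * cp))
    T[0], T[-1] = T[1], T[-2]      # insulated (zero-flux) boundaries
    t += dt

print(f"after 60 s: min {T.min():.1f} C, max {T.max():.1f} C")
```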

  19. Effects of system size and cooling rate on the structure and properties of sodium borosilicate glasses from molecular dynamics simulations.

    PubMed

    Deng, Lu; Du, Jincheng

    2018-01-14

    Borosilicate glasses form an important glass forming system in both glass science and technologies. The structure and property changes of borosilicate glasses as a function of thermal history in terms of cooling rate during glass formation and simulation system sizes used in classical molecular dynamics (MD) simulation were investigated with recently developed composition dependent partial charge potentials. Short and medium range structural features such as boron coordination, Si and B Qn distributions, and ring size distributions were analyzed to elucidate the effects of cooling rate and simulation system size on these structure features and selected glass properties such as glass transition temperature, vibration density of states, and mechanical properties. Neutron structure factors, neutron broadened pair distribution functions, and vibrational density of states were calculated and compared with results from experiments as well as ab initio calculations to validate the structure models. The results clearly indicate that both cooling rate and system size play an important role on the structures of these glasses, mainly by affecting the 3B and 4B distributions and consequently properties of the glasses. It was also found that different structure features and properties converge at different sizes or cooling rates; thus convergence tests are needed in simulations of the borosilicate glasses depending on the targeted properties. The results also shed light on the complex thermal history dependence on structure and properties in borosilicate glasses and the protocols in MD simulations of these and other glass materials.
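
    A minimal sketch of the kind of coordination analysis mentioned above is given below: boron atoms are classified as 3B or 4B by counting oxygen neighbors within a cutoff in a periodic snapshot. The coordinates are random placeholders and the ~1.9 Å B-O cutoff is an assumed typical value; this is not the published analysis.

```python
# Minimal sketch (not the published analysis): classifying boron atoms as 3B or 4B
# by counting oxygen neighbors within a cutoff distance in an MD snapshot.
# Coordinates are random placeholders; a B-O cutoff of ~1.9 Angstrom is assumed.
import numpy as np

rng = np.random.default_rng(1)
box = 30.0                                   # cubic box edge, Angstrom
b_xyz = rng.uniform(0, box, size=(50, 3))    # boron positions (placeholder)
o_xyz = rng.uniform(0, box, size=(400, 3))   # oxygen positions (placeholder)
cutoff = 1.9                                 # B-O bond cutoff, Angstrom

def n_oxygen_neighbors(b, oxygens, box, cutoff):
    d = oxygens - b
    d -= box * np.round(d / box)             # minimum-image convention
    return int(np.sum(np.linalg.norm(d, axis=1) < cutoff))

coord = np.array([n_oxygen_neighbors(b, o_xyz, box, cutoff) for b in b_xyz])
n3, n4 = np.sum(coord == 3), np.sum(coord == 4)
print(f"3B fraction: {n3 / len(coord):.2f}, 4B fraction: {n4 / len(coord):.2f}")
```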

  20. Effects of system size and cooling rate on the structure and properties of sodium borosilicate glasses from molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Deng, Lu; Du, Jincheng

    2018-01-01

    Borosilicate glasses form an important glass forming system in both glass science and technologies. The structure and property changes of borosilicate glasses as a function of thermal history in terms of cooling rate during glass formation and simulation system sizes used in classical molecular dynamics (MD) simulation were investigated with recently developed composition dependent partial charge potentials. Short and medium range structural features such as boron coordination, Si and B Qn distributions, and ring size distributions were analyzed to elucidate the effects of cooling rate and simulation system size on these structure features and selected glass properties such as glass transition temperature, vibration density of states, and mechanical properties. Neutron structure factors, neutron broadened pair distribution functions, and vibrational density of states were calculated and compared with results from experiments as well as ab initio calculations to validate the structure models. The results clearly indicate that both cooling rate and system size play an important role on the structures of these glasses, mainly by affecting the 3B and 4B distributions and consequently properties of the glasses. It was also found that different structure features and properties converge at different sizes or cooling rates; thus convergence tests are needed in simulations of the borosilicate glasses depending on the targeted properties. The results also shed light on the complex thermal history dependence on structure and properties in borosilicate glasses and the protocols in MD simulations of these and other glass materials.

  1. Distributed Monte Carlo production for DZero

    NASA Astrophysics Data System (ADS)

    Snow, Joel; DØ Collaboration

    2010-04-01

    The DZero collaboration uses a variety of resources on four continents to pursue a strategy of flexibility and automation in the generation of simulation data. This strategy provides a resilient and opportunistic system which ensures an adequate and timely supply of simulation data to support DZero's physics analyses. A mixture of facilities, dedicated and opportunistic, specialized and generic, large and small, grid job enabled and not, are used to provide a production system that has adapted to newly developing technologies. This strategy has increased the event production rate by a factor of seven and the data production rate by a factor of ten in the last three years despite diminishing manpower. Common to all production facilities is the SAM (Sequential Access to Metadata) data-grid. Job submission to the grid uses SAMGrid middleware which may forward jobs to the OSG, the WLCG, or native SAMGrid sites. The distributed computing and data handling system used by DZero will be described and the results of MC production since the deployment of grid technologies will be presented.

  2. Hermite-Gaussian beams with self-forming spiral phase distribution

    NASA Astrophysics Data System (ADS)

    Zinchik, Alexander A.; Muzychenko, Yana B.

    2014-05-01

    Spiral laser beams are a family of laser beams that preserve their structural stability up to scaling and rotation as they propagate. The properties of spiral beams are of practical interest for laser technology, medicine and biotechnology; researchers use spiral beams for the movement and manipulation of microparticles. Spiral beams have a complicated phase distribution in cross section. This paper describes the results of analytical and computer simulation of Hermite-Gaussian beams with a self-forming spiral phase distribution. The simulation used a laser beam consisting of the sum of two HG modes, TEMnm and TEMn1m1, with the coefficients n1, n, m1, m varied. An additional phase depending on the coefficients n, m, m1, n1 was imposed on the resulting beam. As a result, a Hermite-Gaussian beam was formed whose phase distribution takes the form of a spiral as the beam propagates. VirtualLab 5.0 (LightTrans GmbH) was used for the modeling.
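
    A minimal sketch of how a superposition of two Hermite-Gaussian modes can carry a spiral phase is given below; it uses the well-known quadrature combination HG10 + i*HG01 at the beam waist, not the exact mode pairs or phases studied in the paper.

```python
# Minimal sketch (not the exact beams of the paper): the transverse phase of a
# quadrature superposition of two Hermite-Gaussian modes, HG10 + i*HG01, which
# carries a spiral (vortex) phase. Waist size and grid are illustrative.
import numpy as np
from scipy.special import eval_hermite

def hg_mode(n, m, x, y, w0=1.0):
    """Hermite-Gaussian mode profile at the beam waist (unnormalized)."""
    return (eval_hermite(n, np.sqrt(2) * x / w0)
            * eval_hermite(m, np.sqrt(2) * y / w0)
            * np.exp(-(x**2 + y**2) / w0**2))

x = np.linspace(-2, 2, 201)
X, Y = np.meshgrid(x, x)

field = hg_mode(1, 0, X, Y) + 1j * hg_mode(0, 1, X, Y)  # HG10 + i*HG01
phase = np.angle(field)                                  # spiral phase in [-pi, pi]

# The phase winds by 2*pi around the beam axis, the signature of a spiral beam.
print("phase range [rad]:", float(phase.min()), float(phase.max()))
```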

  3. CAD and CAE Analysis for Siphon Jet Toilet

    NASA Astrophysics Data System (ADS)

    Wang, Yuhua; Xiu, Guoji; Tan, Haishu

    A high-precision 3D laser scanner based on dual-CCD technology was used to measure the original design sample of a siphon jet toilet. The digital toilet model was constructed from the measured point-cloud data using curve- and surface-fitting techniques and CAD/CAE systems. The realizable k-ɛ two-equation turbulence model and the VOF multiphase flow model were used to simulate the flushing flow in the digital toilet model. By simulating and analyzing the distribution of the flushing flow's total pressure and the flow speed at the toilet-basin surface and in the siphoning bent tube, the toilet performance can be evaluated efficiently and conveniently. The method of "establishing the digital model, simulating the flushing flow, evaluating performance, and modifying the functional shape" provides a high-efficiency approach to developing new water-saving toilets.

  4. Autonomous Information Fading and Provision to Achieve High Response Time in Distributed Information Systems

    NASA Astrophysics Data System (ADS)

    Lu, Xiaodong; Arfaoui, Helene; Mori, Kinji

    In the highly dynamic electronic commerce environment, the need for adaptability and rapid response time in information service systems has become increasingly important. In order to cope with the continuously changing conditions of service provision and utilization, the Faded Information Field (FIF) has been proposed. FIF is a distributed information service system architecture, sustained by push/pull mobile agents, that brings high assurance of services through a recursive, demand-oriented provision of the most popular information closer to the users, trading off the cost of information service allocation against the cost of access. In this paper, based on an analysis of the relationship among the user distribution, information provision and access time, we propose a technology for FIF design that resolves the competing requirements of users and providers and improves users' access time. In addition, to achieve dynamic load balancing under changing user preferences, an autonomous information reallocation technology is proposed. We demonstrate the effectiveness of the proposed technology through simulation and comparison with a conventional system.

  5. Autonomous Decentralized Voltage Profile Control of Super Distributed Energy System using Multi-agent Technology

    NASA Astrophysics Data System (ADS)

    Tsuji, Takao; Hara, Ryoichi; Oyama, Tsutomu; Yasuda, Keiichiro

    A super distributed energy system is a future energy system in which a large part of the demand is fed by a huge number of distributed generators. At some times certain nodes in the super distributed energy system behave as loads, while at other times they behave as generators; the characteristic of each node depends on the customers' decisions. In such a situation, it is very difficult to regulate the voltage profile over the system because of the complexity of the power flows. This paper proposes a novel control method for distributed generators that achieves autonomous decentralized voltage profile regulation using multi-agent technology. The proposed multi-agent system employs two types of agents: control agents and mobile agents. Control agents generate or consume reactive power to regulate the voltage profile of neighboring nodes, and mobile agents transmit the information necessary for VQ control among the control agents. The proposed control method is tested through numerical simulations.
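
    A toy sketch of the decentralized idea is given below: each control agent on a linearized feeder adjusts its reactive-power output in proportion to its own voltage error. The voltage-sensitivity matrix, voltages, and gain are invented for illustration and do not reproduce the paper's controller.

```python
# Minimal sketch (not the paper's controller): control agents on a toy linearized
# feeder each nudge their reactive-power output in proportion to the local
# voltage error. The sensitivity matrix S and all values are illustrative only.
import numpy as np

n = 5                                    # nodes with controllable DG
S = 0.02 * (np.eye(n) + 0.3)             # dV/dQ sensitivity (p.u. per p.u.), made up
v_base = np.array([1.04, 1.03, 0.96, 0.95, 1.02])   # uncontrolled voltages, p.u.
v_ref, k = 1.00, 5.0                     # target voltage and agent gain (illustrative)

q = np.zeros(n)
for _ in range(50):                      # iterative, decentralized updates
    v = v_base + S @ q                   # network response (simulated centrally here)
    q += k * (v_ref - v)                 # each agent uses only its own voltage error

print("final voltages [p.u.]:", np.round(v_base + S @ q, 3))
```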

  6. Simulation in bronchoscopy: current and future perspectives.

    PubMed

    Nilsson, Philip Mørkeberg; Naur, Therese Maria Henriette; Clementsen, Paul Frost; Konge, Lars

    2017-01-01

    To provide an overview of the current literature on how to approach simulation training in bronchoscopy and to discuss how findings from other simulation research can inform the use of simulation in bronchoscopy training. We conducted a literature search on simulation training of bronchoscopy and divided the relevant studies into three categories: 1) structuring simulation training in bronchoscopy, 2) assessment of competence in bronchoscopy training, and 3) development of cheap alternatives for bronchoscopy simulation. Bronchoscopy simulation is effective, and the training should be structured as distributed practice with mastery learning criteria (i.e., training until a certain level of competence is achieved). Dyad practice (training in pairs) is possible and may increase the utility of available simulators. Trainee performance should be assessed with assessment tools with established validity. Three-dimensional printing is a promising new technology that opens possibilities for developing cheap simulators with innovative features.

  17. MEETING REPORT: OMG Technical Committee Meeting in Orlando, FL, sees significant enhancement to CORBA

    NASA Astrophysics Data System (ADS)

    1998-06-01

    The Object Management Group (OMG) Platform Technology Committee (PTC) ratified its support for a new asynchronous messaging service for CORBA at OMG's recent Technical Committee Meeting in Orlando, FL. The meeting, held from 8 - 12 June, saw the PTC send the Messaging Service out for a final vote among the OMG membership. The Messaging Service, which will integrate Message Oriented Middleware (MOM) with CORBA, will give CORBA a true asynchronous messaging capability - something of great interest to users and developers. Formal adoption of the specification will most likely occur by the end of the year. The Messaging Service, when adopted, will be the world's first standard for Message Oriented Middleware and will give CORBA a true asynchronous messaging capability. Asynchronous messaging allows developers to build simpler, richer client environments. With asynchronous messaging there is less need for multi-threaded clients because the Asynchronous Method Invocation is non-blocking, meaning the client thread can continue work while the application waits for a reply. David Curtis, Director of Platform Technology for OMG, said: `This messaging service is one of the more valuable additions to CORBA. It enhances CORBA's existing asynchronous messaging capabilities which is a feature of many popular message oriented middleware products. This service will allow better integration between ORBs and MOM products. This enhanced messaging capability will only make CORBA more valuable for builders of distributed object systems.' The Messaging Service is one of sixteen technologies currently being worked on by the PTC. Additionally, seventeen Revision Task Forces (RTFs) are working on keeping OMG specifications up to date. The purpose of these Revision Task Forces is to take input from the implementors of OMG specifications and clarify or make necessary changes based on the implementor's input. The RTFs also ensure that the specifications remain up to date with changes in the OMA and with industry advances in general. Thirty-eight technology processes are ongoing in the Domain Technology Committee (DTC). These range over a wide variety of industries, including healthcare, telecommunications, life sciences, manufacturing, business objects, electronic commerce, finance, transportation, utilities, and distributed simulation. These processes aim to enhance CORBA's value and provide interoperability for specific vertical industries. At the Orlando meeting, the Domain Technology Committee issued the following requests to industry: Telecom Wireless Access Request For Information (RFI); Statistics RFI; Clinical Image Access Service Request For Proposal (RFP); Distributed Simulation Request For Comment (RFC). The newly-formed Statistics group at OMG plans to standardize interfaces for Statistical Services in CORBA, and their RFI, to which any person or company can respond, asks for input and guidance as they start this work which will impact the broad spectrum of industries and processes which use statistics. The Clinical Image Access Service will standardize access to important medical images including digital x-rays, MRI scans, and other formats. The Distributed Simulation RFC, when complete, will establish the Distributed Simulation High-Level Architecture of the US Defense Modeling and Simulation Office as an OMG standard. For the next 90 days any person or company, not only OMG members, may submit their comments on the submission.
The OMG looks forward to its next meeting to be held in Helsinki, Finland, on 27 - 31 July and hosted by Nokia. OMG encourages anyone considering OMG membership to attend the meeting as a guest. For more information on attending call +1-508-820-4300 or e-mail info@omg.org. Note: descriptions for all RFPs, RFIs and RFCs in progress are available for viewing on the OMG Website at http://www.omg.org/schedule.htm, or contact OMG for a copy of the `Work in Progress' document. For more information on the OMG Technology Process please call Jeurgen Boldt, OMG Process Manager, at +1-508-820-4300 or email jeurgen@omg.org.
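
    As a generic illustration of the non-blocking behavior that Asynchronous Method Invocation provides (not the CORBA Messaging API itself), the sketch below lets a client thread keep working while a stand-in "remote" call completes in the background.

```python
# Generic illustration (not the CORBA Messaging API): a non-blocking method
# invocation in the spirit of Asynchronous Method Invocation, where the client
# thread keeps working while the "remote" call completes in the background.
import time
from concurrent.futures import ThreadPoolExecutor

def remote_call(request: str) -> str:
    """Stand-in for a remote object invocation with network latency."""
    time.sleep(1.0)
    return f"reply to {request!r}"

with ThreadPoolExecutor(max_workers=1) as pool:
    future = pool.submit(remote_call, "get_quote")   # returns immediately
    for i in range(3):                               # client thread keeps working
        print("client doing other work", i)
        time.sleep(0.2)
    print(future.result())                           # collect the reply when ready
```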

  11. Central East Pacific Flight Routing

    NASA Technical Reports Server (NTRS)

    Grabbe, Shon; Sridhar, Banavar; Kopardekar, Parimal; Cheng, Nadia

    2006-01-01

    With the introduction of the Federal Aviation Administration's Advanced Technology and Oceanic Procedures system at the Oakland Oceanic Center, a level of automation now exists in the oceanic environment to potentially begin accommodating increased user-preferred routing requests. This paper presents the results of an initial feasibility assessment which examines the potential benefits of transitioning from the fixed Central East Pacific routes to user-preferred routes. As a surrogate for the actual user-provided routing requests, a minimum-travel-time, wind-optimal dynamic programming algorithm was developed and utilized in this paper. After first describing the characteristics (e.g., origin airport, destination airport, vertical distribution and temporal distribution) of the westbound flights utilizing the Central East Pacific routes on Dec. 14-16 and 19-20, the results of both a flight-plan-based simulation and a wind-optimal-based simulation are presented. Whereas the lateral and longitudinal distribution of the aircraft trajectories in these two simulations varied dramatically, the number of simulated first-loss-of-separation events remained relatively constant. One area of concern that was uncovered in this initial analysis was a potential workload issue associated with the redistribution of traffic in the oceanic sectors due to the prevailing wind patterns.
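
    The paper's algorithm is not reproduced here; the sketch below is only a toy dynamic program in the same spirit, choosing a minimum-travel-time lateral path across longitude stages given an invented tailwind field.

```python
# Minimal sketch (not the paper's implementation): a toy dynamic program that
# picks a minimum-travel-time lateral path across a grid of longitude "stages",
# where each cell has a different tailwind. All numbers are illustrative.
import numpy as np

n_stages, n_lat = 20, 9            # longitude steps, lateral track options
leg_dist = 100.0                   # nautical miles per longitude step
tas = 480.0                        # true airspeed, knots
rng = np.random.default_rng(2)
tailwind = rng.uniform(-40, 60, size=(n_stages, n_lat))   # knots, made up

def leg_time(stage, lat):
    return leg_dist / (tas + tailwind[stage, lat])         # hours

# cost[j] = minimum time to finish from lateral position j at the current stage
cost = np.zeros(n_lat)
for stage in range(n_stages - 1, -1, -1):
    new_cost = np.full(n_lat, np.inf)
    for j in range(n_lat):
        for nj in (j - 1, j, j + 1):                        # allow one-track shifts
            if 0 <= nj < n_lat:
                new_cost[j] = min(new_cost[j], leg_time(stage, j) + cost[nj])
    cost = new_cost

print(f"wind-optimal time from mid-track start: {cost[n_lat // 2]:.2f} h")
```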

  12. CORBASec Used to Secure Distributed Aerospace Propulsion Simulations

    NASA Technical Reports Server (NTRS)

    Blaser, Tammy M.

    2003-01-01

    The NASA Glenn Research Center and its industry partners are developing a Common Object Request Broker (CORBA) Security (CORBASec) test bed to secure their distributed aerospace propulsion simulations. Glenn has been working with its aerospace propulsion industry partners to deploy the Numerical Propulsion System Simulation (NPSS) object-based technology. NPSS is a program focused on reducing the cost and time in developing aerospace propulsion engines. It was developed by Glenn and is being managed by the NASA Ames Research Center as the lead center reporting directly to NASA Headquarters' Aerospace Technology Enterprise. Glenn is an active domain member of the Object Management Group: an open membership, not-for-profit consortium that produces and manages computer industry specifications (i.e., CORBA) for interoperable enterprise applications. When NPSS is deployed, it will assemble a distributed aerospace propulsion simulation scenario from proprietary analytical CORBA servers and execute them with security afforded by the CORBASec implementation. The NPSS CORBASec test bed was initially developed with the TPBroker Security Service product (Hitachi Computer Products (America), Inc., Waltham, MA) using the Object Request Broker (ORB), which is based on the TPBroker Basic Object Adaptor, and using NPSS software across different firewall products. The test bed has been migrated to the Portable Object Adaptor architecture using the Hitachi Security Service product based on the VisiBroker 4.x ORB (Borland, Scotts Valley, CA) and on the Orbix 2000 ORB (Dublin, Ireland, with U.S. headquarters in Waltham, MA). Glenn, GE Aircraft Engines, and Pratt & Whitney Aircraft are the initial industry partners contributing to the NPSS CORBASec test bed. The test bed uses Security SecurID (RSA Security Inc., Bedford, MA) two-factor token-based authentication together with Hitachi Security Service digital-certificate-based authentication to validate the various NPSS users. The test bed is expected to demonstrate NPSS CORBASec-specific policy functionality, confirm adequate performance, and validate the required Internet configuration in a distributed collaborative aerospace propulsion environment.

  13. Requirement of spatiotemporal resolution for imaging intracellular temperature distribution

    NASA Astrophysics Data System (ADS)

    Hiroi, Noriko; Tanimoto, Ryuichi; Ii, Kaito; Ozeki, Mitsunori; Mashimo, Kota; Funahashi, Akira

    2017-04-01

    Intracellular temperature distribution is an emerging target in biology. Because thermal diffusion is fast compared with molecular diffusion, a spatiotemporally high-resolution imaging technology is needed to capture this phenomenon. Based on simulation results for thermal diffusion from a nucleus into the cytosol, we demonstrate that time-lapse imaging consisting of single-shot 3D volume images acquired at high-speed camera frame rates is required for imaging intracellular thermal diffusion.
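
    A back-of-envelope calculation makes the timescale argument concrete; it assumes a water-like thermal diffusivity and a roughly 10 um cell, values not taken from the paper.

```python
# Back-of-envelope sketch (assumes water-like properties, not values from the
# paper): the characteristic thermal diffusion time across a ~10 um cell, which
# motivates single-shot volumetric imaging at high-speed camera rates.
L = 10e-6           # characteristic cell size, m
alpha = 1.4e-7      # thermal diffusivity of water, m^2/s (approximate)

tau = L**2 / alpha  # characteristic diffusion time, s
print(f"thermal diffusion time ~ {tau * 1e3:.2f} ms "
      f"-> roughly {1 / tau:.0f} volumes/s are needed to follow it")
```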

  14. Ultranarrow bandwidth spectral filtering for long-range free-space quantum key distribution at daytime.

    PubMed

    Höckel, David; Koch, Lars; Martin, Eugen; Benson, Oliver

    2009-10-15

    We describe a Fabry-Perot-based spectral filter for free-space quantum key distribution (QKD). A multipass etalon filter was built, and its performance was studied. The whole filter setup was carefully optimized to add less than 2 dB attenuation to a signal beam but block stray light by 21 dB. Simulations show that such a filter might be sufficient to allow QKD satellite downlinks during daytime with the current technology.

  15. A framework for stochastic simulation of distribution practices for hotel reservations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halkos, George E.; Tsilika, Kyriaki D.

    The focus of this study is primarily on the Greek hotel industry. The objective is to design and develop a framework for stochastic simulation of reservation requests, reservation arrivals, cancellations and hotel occupancy with a planning horizon of a tourist season. In the Greek hospitality industry there were two competing policies for the reservation planning process up to 2003: reservations coming directly from customers and reservations management relying on tour operator(s). Recently the Internet, along with other emerging technologies, has offered the potential to disrupt enduring distribution arrangements. The focus of the study is on the choice of distribution intermediaries. We present an empirical model for the hotel reservation planning process that makes use of a symbolic simulation (Monte Carlo method), since requests for reservations, cancellations, and arrival rates are all sources of uncertainty. We consider as a case study the problem of determining the optimal booking strategy for a medium-size hotel on Skiathos Island, Greece. Probability distributions and parameter estimates result from the historical data available and from suggestions made in the relevant literature. The results of this study may assist hotel managers in defining distribution strategies for hotel rooms and evaluating the performance of the reservations management system.
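
    A minimal sketch of such a Monte Carlo season simulation is shown below; the arrival rate, cancellation probability, stay length, and capacity are illustrative placeholders rather than the authors' fitted distributions.

```python
# Minimal sketch (not the authors' model): a Monte Carlo simulation of one season
# of reservation requests, cancellations, and resulting occupancy for a
# medium-size hotel. All rates and capacities are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(3)
n_rooms, season_days, n_runs = 120, 180, 200
mean_requests_per_day = 5.0   # Poisson rate of booking requests per day
p_cancel = 0.12               # probability a request is later cancelled
mean_stay = 4                 # mean length of stay, nights

occupancies = []
for _ in range(n_runs):
    nights_booked = np.zeros(season_days)
    for day in range(season_days):
        for _ in range(rng.poisson(mean_requests_per_day)):
            if rng.random() < p_cancel:
                continue                          # booking cancelled, no stay
            stay = max(1, rng.poisson(mean_stay))
            end = min(season_days, day + stay)
            if np.all(nights_booked[day:end] < n_rooms):
                nights_booked[day:end] += 1       # accept the booking
    occupancies.append(nights_booked.mean() / n_rooms)

print(f"mean seasonal occupancy over {n_runs} runs: {np.mean(occupancies):.1%}")
```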

  16. Development of a Dynamically Configurable,Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation

    NASA Technical Reports Server (NTRS)

    Afjeh, Abdollah A.; Reed, John A.

    2003-01-01

    This research is aimed at developing a new and advanced simulation framework that will significantly improve the overall efficiency of aerospace systems design and development. This objective will be accomplished through an innovative integration of object-oriented and Web-based technologies with both new and proven simulation methodologies. The basic approach involves three major areas of research: (1) aerospace system and component representation using a hierarchical object-oriented component model which enables the use of multimodels and enforces component interoperability; (2) a collaborative software environment that streamlines the process of developing, sharing and integrating aerospace design and analysis models; and (3) development of a distributed infrastructure which enables Web-based exchange of models to simplify the collaborative design process and to support computationally intensive aerospace design and analysis processes. Research for the first year dealt with the design of the basic architecture and supporting infrastructure, an initial implementation of that design, and a demonstration of its application to an example aircraft engine system simulation.

  17. Dual-Use Space Technology Transfer Conference and Exhibition. Volume 2

    NASA Technical Reports Server (NTRS)

    Krishen, Kumar (Compiler)

    1994-01-01

    This is the second volume of papers presented at the Dual-Use Space Technology Transfer Conference and Exhibition held at the Johnson Space Center February 1-3, 1994. Possible technology transfers covered during the conference were in the areas of information access; innovative microwave and optical applications; materials and structures; marketing and barriers; intelligent systems; human factors and habitation; communications and data systems; business process and technology transfer; software engineering; biotechnology and advanced bioinstrumentation; communications signal processing and analysis; medical care; applications derived from control center data systems; human performance evaluation; technology transfer methods; mathematics, modeling, and simulation; propulsion; software analysis and decision tools; systems/processes in human support technology; networks, control centers, and distributed systems; power; rapid development; perception and vision technologies; integrated vehicle health management; automation technologies; advanced avionics; and robotics technologies.

  18. Output characteristics of a series three-port axial piston pump

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaogang; Quan, Long; Yang, Yang; Wang, Chengbin; Yao, Liwei

    2012-05-01

    Driving a hydraulic cylinder directly by a closed-loop hydraulic pump is currently a key research area in the field of electro-hydraulic control technology, and it is the most direct means to improve the energy efficiency of an electro-hydraulic control system. So far, this technology has been well applied to the pump-controlled symmetric hydraulic cylinder. However, for the differential cylinder that is widely used in hydraulic technology, satisfactory results have not yet been achieved, due to the asymmetric flow constraint. Therefore, based on the principle of the asymmetric valve controlled asymmetric cylinder in valve controlled cylinder technology, an innovative idea for an asymmetric pump controlled asymmetric cylinder is put forward to address this problem. The scheme proposes to transform the oil suction window of the existing axial piston pump into two series windows. When in use, one window is connected to the rod chamber of the hydraulic cylinder and the other is linked with a low-pressure oil tank. This allows the differential cylinders to be directly controlled by changing the displacement or rotation speed of the pumps. Compared with the loop principle of offsetting the area difference of the differential cylinder through hydraulic valve using existing technology, this method may simplify the circuits and increase the energy efficiency of the system. With the software SimulationX, a hydraulic pump simulation model is set up, which examines the movement characteristics of an individual piston and the compressibility of oil, as well as the flow distribution area as it changes with the rotation angle. The pump structure parameters, especially the size of the unloading groove of the valve plate, are determined through digital simulation. All of the components of the series arranged three distribution-window axial piston pump are designed, based on the simulation analysis of the flow pulse characteristics of the pump, and then the prototype pump is made. The basic characteristics, such as the pressure, flow and noise of the pumps under different rotation speeds, are measured on the test bench. The test results verify the correctness of the principle. The proposed research lays a theoretical foundation for the further development of a new pump-controlled cylinder system.

  19. Running SW4 On New Commodity Technology Systems (CTS-1) Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodgers, Arthur J.; Petersson, N. Anders; Pitarka, Arben

    We have recently been running earthquake ground motion simulations with SW4 on the new capacity computing systems, called the Commodity Technology Systems - 1 (CTS-1) at Lawrence Livermore National Laboratory (LLNL). SW4 is a fourth order time domain finite difference code developed by LLNL and distributed by the Computational Infrastructure for Geodynamics (CIG). SW4 simulates seismic wave propagation in complex three-dimensional Earth models including anelasticity and surface topography. We are modeling near-fault earthquake strong ground motions for the purposes of evaluating the response of engineered structures, such as nuclear power plants and other critical infrastructure. Engineering analysis of structures requires the inclusion of high frequencies which can cause damage, but are often difficult to include in simulations because of the need for large memory to model fine grid spacing on large domains.
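
    A rule-of-thumb estimate illustrates why high frequencies are memory-hungry in finite-difference ground-motion runs: the grid spacing must resolve the shortest wavelength, roughly h = v_min / (P * f_max). The points-per-wavelength count, domain size, and per-point memory below are assumptions for illustration, not LLNL's actual settings.

```python
# Rule-of-thumb sketch (illustrative values, not LLNL's actual settings): why
# higher frequencies demand fine grids and large memory in finite-difference
# ground-motion runs. Assumes P grid points per minimum wavelength and a rough
# per-point memory cost typical of 3D seismic codes.
f_max = 5.0                # highest resolved frequency, Hz
v_min = 500.0              # minimum shear-wave speed in the model, m/s
ppw = 8                    # grid points per minimum wavelength (assumed)
bytes_per_point = 8 * 40   # ~40 double-precision variables per point (assumed)

h = v_min / (ppw * f_max)                        # required grid spacing, m
nx, ny, nz = 100e3 / h, 100e3 / h, 30e3 / h      # 100 x 100 x 30 km domain
n_points = nx * ny * nz
print(f"grid spacing ~ {h:.1f} m, ~{n_points:.2e} points, "
      f"~{n_points * bytes_per_point / 1e12:.1f} TB of memory")
```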

  20. JANNAF 25th Airbreathing Propulsion Subcommittee, 37th Combustion Subcommittee and 1st Modeling and Simulation Subcommittee Joint Meeting. Volume 1

    NASA Technical Reports Server (NTRS)

    Fry, Ronald S.; Becker, Dorothy L.

    2000-01-01

    Volume I, the first of three volumes, is a compilation of 24 unclassified/unlimited-distribution technical papers presented at the Joint Army-Navy-NASA-Air Force (JANNAF) 25th Airbreathing Propulsion Subcommittee, 37th Combustion Subcommittee and 1st Modeling and Simulation Subcommittee (MSS) meeting held jointly with the 19th Propulsion Systems Hazards Subcommittee. The meeting was held 13-17 November 2000 at the Naval Postgraduate School and Hyatt Regency Hotel, Monterey, California. Topics covered include: a Keynote Address on Future Combat Systems, a review of the new JANNAF Modeling and Simulation Subcommittee, and technical papers on Hyper-X propulsion development and verification; GTX airbreathing launch vehicles; Hypersonic technology development, including program overviews, fuels for advanced propulsion, ramjet and scramjet research, hypersonic test medium effects; and RBCC engine design and performance, and PDE and UCAV advanced and combined cycle engine technologies.

  1. The Deflector Selector: A Machine Learning Framework for Prioritizing Hazardous Object Deflection Technology Development

    NASA Astrophysics Data System (ADS)

    Nesvold, Erika; Greenberg, Adam; Erasmus, Nicolas; Van Heerden, Elmarie; Galache, J. L.; Dahlstrom, Eric; Marchis, Franck

    2018-01-01

    Several technologies have been proposed for deflecting a hazardous Solar System object on a trajectory that would otherwise impact the Earth. The effectiveness of each technology depends on several characteristics of the given object, including its orbit and size. The distribution of these parameters in the likely population of Earth-impacting objects can thus determine which of the technologies are most likely to be useful in preventing a collision with the Earth. None of the proposed deflection technologies has been developed and fully tested in space. Developing every proposed technology is currently prohibitively expensive, so determining now which technologies are most likely to be effective would allow us to prioritize a subset of proposed deflection technologies for funding and development. We will present a new model, the Deflector Selector, that takes as its input the characteristics of a hazardous object or population of such objects and predicts which technology would be able to perform a successful deflection. The model consists of a machine-learning algorithm trained on data produced by N-body integrations simulating the deflections. We will describe the model and present the results of tests of the effectiveness of nuclear explosives, kinetic impactors, and gravity tractors on three simulated populations of hazardous objects.
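
    In the spirit of the model described above, the sketch below trains a classifier on synthetic object characteristics and made-up deflection outcomes; it is not the authors' code, data, or feature set.

```python
# Minimal sketch (synthetic data, not the authors' code): training a classifier
# to predict whether a deflection succeeds from basic object characteristics,
# in the spirit of the Deflector Selector described above.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 5000
diameter_m = rng.uniform(20, 500, n)          # object size
warning_yr = rng.uniform(0.5, 20, n)          # time before impact
ecc = rng.uniform(0, 0.6, n)                  # orbital eccentricity

# Made-up "ground truth": small objects with long warning times are deflectable.
success = (warning_yr * 50 > diameter_m * (1 + ecc)).astype(int)

X = np.column_stack([diameter_m, warning_yr, ecc])
X_tr, X_te, y_tr, y_te = train_test_split(X, success, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy on synthetic data: {clf.score(X_te, y_te):.2f}")
```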

  2. The Deflector Selector: A machine learning framework for prioritizing hazardous object deflection technology development

    NASA Astrophysics Data System (ADS)

    Nesvold, E. R.; Greenberg, A.; Erasmus, N.; van Heerden, E.; Galache, J. L.; Dahlstrom, E.; Marchis, F.

    2018-05-01

    Several technologies have been proposed for deflecting a hazardous Solar System object on a trajectory that would otherwise impact the Earth. The effectiveness of each technology depends on several characteristics of the given object, including its orbit and size. The distribution of these parameters in the likely population of Earth-impacting objects can thus determine which of the technologies are most likely to be useful in preventing a collision with the Earth. None of the proposed deflection technologies has been developed and fully tested in space. Developing every proposed technology is currently prohibitively expensive, so determining now which technologies are most likely to be effective would allow us to prioritize a subset of proposed deflection technologies for funding and development. We present a new model, the Deflector Selector, that takes as its input the characteristics of a hazardous object or population of such objects and predicts which technology would be able to perform a successful deflection. The model consists of a machine-learning algorithm trained on data produced by N-body integrations simulating the deflections. We describe the model and present the results of tests of the effectiveness of nuclear explosives, kinetic impactors, and gravity tractors on three simulated populations of hazardous objects.

  3. Distributed Fiber Optic Sensor for On-Line Monitoring of Coal Gasifier Refractory Health

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Anbo; Yu, Zhihao

    This report summarizes technical progress on the program “Distributed Fiber Optic Sensor for On-Line Monitoring of Coal Gasifier Refractory Health,” funded by the National Energy Technology Laboratory of the U.S. Department of Energy, and performed by the Center for Photonics Technology of the Bradley Department of Electrical and Computer Engineering at Virginia Tech. The scope of work entails analyses of traveling grating generation technologies in an optical fiber, as well as the interrogation of the gratings to infer a distributed temperature along the fiber, for the purpose of developing a real-time refractory health condition monitoring technology for coal gasifiers. During the project period (2011-2015), three different sensing principles were studied: four-wave mixing (FWM), coherent optical time-domain reflectometry (C-OTDR) and Brillouin optical time-domain analysis (BOTDA). After comparing the three methods, BOTDA was selected for further development into a complete bench-top sensing system for the proposed high-temperature sensing application. Based on input from Eastman Chemical, the industrial collaborator on this project, a cylindrical furnace was designed and constructed to simulate typical gasifier refractory temperature conditions in the laboratory and to verify the sensor's capability to fully monitor refractory conditions on the back side at temperatures up to 1000°C. In the later stages of the project, the sensing system was tested in the simulated environment for its sensing performance and high-temperature survivability. Through theoretical analyses and experimental research on the different factors affecting the sensor performance, a sensor field deployment strategy was proposed for possible future sensor field implementations.
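
    As an illustration of how a BOTDA interrogator converts measurements to temperature, the sketch below maps a hypothetical Brillouin frequency-shift profile to temperature using a typical literature coefficient of about 1 MHz per degree C; this is not the project's calibration.

```python
# Minimal sketch (typical literature coefficients, not the project's calibration):
# converting a measured Brillouin frequency-shift profile along the fiber into a
# distributed temperature profile, as a BOTDA interrogator would.
import numpy as np

nu_b_ref = 10.85e9      # Brillouin frequency at the reference temperature, Hz
T_ref = 25.0            # reference temperature, deg C
c_T = 1.0e6             # temperature coefficient, Hz per deg C (typical value)

# Hypothetical measured Brillouin frequencies at five positions along the fiber
nu_b = np.array([10.850e9, 10.900e9, 11.100e9, 11.500e9, 11.800e9])

temperature = T_ref + (nu_b - nu_b_ref) / c_T
print("estimated temperatures [deg C]:", np.round(temperature, 1))
```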

  4. Aeronautical-Satellite-Assisted Process Being Developed for Information Exchange Through Network Technologies (Aero-SAPIENT)

    NASA Technical Reports Server (NTRS)

    Zernic, Michael J.

    2001-01-01

    Communications technologies are being developed to address safety issues during aviation travel. Some of these technologies would enable an aircraft to be in constant bidirectional communication with necessary systems, people, and other aircraft, a capability not currently in place today. Networking technologies, wireless datalinks, and advanced avionics techniques are areas of particular importance to which the NASA Glenn Research Center has contributed. Glenn, in conjunction with the NASA Ames Research Center, NASA Dryden Flight Research Center, and NASA Langley Research Center, is investigating methods and applications that would utilize these communications technologies. In mid-June 2000, the flight readiness of the network and communications technologies was demonstrated via a simulated aircraft. A van simulating an aircraft was equipped with advanced phased-array antennas (from the Advanced Communications/Air Traffic Management (AC/ATM) Advanced Air Transportation Technologies (AATT) project) that used commercial Ku-band satellite communications to connect Glenn, Dryden, and Ames in a combined system ground test. This test simulated air-ground bidirectional transport of real-time digital audio, text, and video data via a hybrid network configuration and demonstrated the flight readiness of the network and communications technologies. Specifically, a Controller Pilot Data Link Communications application was used with other applications to demonstrate a multiprotocol capability via Internet-protocol-encapsulated ATN (Aeronautical Telecommunications Network) data packets. The significance of this combined ground test is its contribution to the Aero Information Technology Base Program Level I milestone (Software Technology investment area) of a real-time data link for the National Airspace System. The objective of this milestone was to address multiprotocol technology applicable to real-time data links between aircraft, a satellite, and the ground, as well as the ability to distribute flight data with multilevel priorities among several sites.

  5. National Renewable Energy Laboratory (NREL) Topic 2 Final Report: End-to-End Communication and Control System to Support Clean Energy Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hudgins, Andrew P.; Carrillo, Ismael M.; Jin, Xin

    This document is the final report of a two-year development, test, and demonstration project, 'Cohesive Application of Standards-Based Connected Devices to Enable Clean Energy Technologies.' The project was part of the National Renewable Energy Laboratory's (NREL's) Integrated Network Testbed for Energy Grid Research and Technology (INTEGRATE) initiative hosted at the Energy Systems Integration Facility (ESIF). The project demonstrated techniques to control distribution grid events by coordinating traditional distribution grid devices with high-penetration renewable resources and demand response. Using standard communication protocols and semantic standards, the project examined use cases of high/low distribution voltage, requests for volt-ampere-reactive (VAR) power support, and transactive energy strategies using Volttron. Open-source software, written by EPRI to control distributed energy resources (DER) and demand response (DR), was used by an advanced distribution management system (ADMS) to abstract the reporting resources into a collection of capabilities rather than requiring knowledge of specific resource types. This architecture allows for scaling both horizontally and vertically. Several new technologies were developed and tested. Messages from the ADMS based on the common information model (CIM) were developed to control the DER and DR management systems. The OpenADR standard was used to help manage grid events by turning loads off and on. Volttron technology was used to simulate a homeowner choosing the price at which to enter the demand response market. Finally, the ADMS used newly developed algorithms to coordinate these resources with a capacitor bank and voltage regulator to respond to grid events.
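
    As a rough illustration of the transactive behavior described above, the sketch below simulates homeowner agents that each bid a price and curtail flexible load only when the posted demand-response price meets that bid. It is a self-contained toy model, not the project's Volttron or OpenADR code; every class name and number is invented for illustration.

      from dataclasses import dataclass

      @dataclass
      class HomeownerAgent:
          """Toy transactive agent: curtails its flexible load when the posted
          demand-response price reaches the homeowner's chosen threshold."""
          bid_price: float          # $/kWh at which the homeowner agrees to curtail
          flexible_load_kw: float   # load that can be shed during an event

          def respond(self, posted_price):
              """Return the load reduction (kW) offered at this price signal."""
              return self.flexible_load_kw if posted_price >= self.bid_price else 0.0

      # A small neighborhood with different willingness-to-curtail prices.
      agents = [HomeownerAgent(0.15, 1.2), HomeownerAgent(0.25, 2.0), HomeownerAgent(0.40, 0.8)]

      for posted in (0.10, 0.20, 0.30, 0.45):   # $/kWh event prices from the market layer
          shed = sum(a.respond(posted) for a in agents)
          print(f"price ${posted:.2f}/kWh -> total curtailment {shed:.1f} kW")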

  6. A Perspective on Coupled Multiscale Simulation and Validation in Nuclear Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    M. P. Short; D. Gaston; C. R. Stanek

    2014-01-01

    The field of nuclear materials encompasses numerous opportunities to address and ultimately solve longstanding industrial problems by improving the fundamental understanding of materials through the integration of experiments with multiscale modeling and high-performance simulation. A particularly noteworthy example is an ongoing study of axial power distortions in a nuclear reactor induced by corrosion deposits, known as CRUD (Chalk River unidentified deposits). We describe how progress is being made toward achieving scientific advances and technological solutions on two fronts. Specifically, the study of thermal conductivity of CRUD phases has augmented missing data as well as revealed new mechanisms. Additionally, the development of a multiscale simulation framework shows potential for the validation of a new capability to predict the power distribution of a reactor, in effect direct evidence of technological impact. The material- and system-level challenges identified in the study of CRUD are similar to other well-known vexing problems in nuclear materials, such as irradiation accelerated corrosion, stress corrosion cracking, and void swelling; they all involve connecting materials science fundamentals at the atomistic- and mesoscales to technology challenges at the macroscale.

  7. From MetroII to Metronomy, Designing Contract-based Function-Architecture Co-simulation Framework for Timing Verification of Cyber-Physical Systems

    DTIC Science & Technology

    2015-03-13

    A. Lee, “A Programming Model for Time-Synchronized Distributed Real-Time Systems,” in Proceedings of the Real-Time and Embedded Technology and Applications Symposium, 2007, pp. 259–268 (reference excerpted from the report's indexed text; no abstract available in the record).

  8. An application of sedimentation simulation in Tahe oilfield

    NASA Astrophysics Data System (ADS)

    Tingting, He; Lei, Zhao; Xin, Tan; Dongxu, He

    2017-12-01

    A braided river delta developed in the Triassic lower oil formation of Block 9 of the Tahe oilfield, but its sedimentary evolution process has been unclear. Using sedimentation simulation technology, the sedimentation process and distribution of the braided river delta are studied based on geological parameters including sequence stratigraphic division, the initial sedimentation environment, relative lake level and accommodation change, source supply, and the sedimentary transport pattern. The simulation results show that the error between simulated and actual strata thickness is small, and that the single-well analysis from the simulation is highly consistent with the actual analysis, indicating that the model is reliable. The study area records a retrogradational braided river delta evolution, which provides a favorable basis for detailed reservoir description and prediction.

  9. Reprint of “Performance analysis of a model-sized superconducting DC transmission system based VSC-HVDC transmission technologies using RTDS”

    NASA Astrophysics Data System (ADS)

    Dinh, Minh-Chau; Ju, Chang-Hyeon; Kim, Sung-Kyu; Kim, Jin-Geun; Park, Minwon; Yu, In-Keun

    2013-01-01

    The combination of a high-temperature superconducting DC power cable and a voltage source converter based HVDC (VSC-HVDC) system creates a new option for transmitting power with multiple collection and distribution points for long-distance, bulk power transmission. It offers several advantages over HVAC or conventional HVDC transmission systems and is well suited for the grid integration of renewable energy sources into existing distribution or transmission systems. For this reason, a superconducting DC transmission system based on VSC-HVDC transmission technologies is planned for the Jeju power system, Korea. Before applying this system to the real power system on Jeju Island, system analysis should be performed through real-time testing. In this paper, a model-sized superconducting VSC-HVDC system, consisting of a small model-sized VSC-HVDC converter connected to a 2 m YBCO HTS DC model cable, is implemented. The authors performed a real-time simulation that incorporates the model-sized superconducting VSC-HVDC system into the simulated Jeju power system using a Real Time Digital Simulator (RTDS). The performance of the superconducting VSC-HVDC system was verified on the proposed test platform, and the results are discussed in detail.

  10. Performance analysis of a model-sized superconducting DC transmission system based VSC-HVDC transmission technologies using RTDS

    NASA Astrophysics Data System (ADS)

    Dinh, Minh-Chau; Ju, Chang-Hyeon; Kim, Sung-Kyu; Kim, Jin-Geun; Park, Minwon; Yu, In-Keun

    2012-08-01

    The combination of a high-temperature superconducting DC power cable and a voltage source converter based HVDC (VSC-HVDC) system creates a new option for transmitting power with multiple collection and distribution points for long-distance, bulk power transmission. It offers several advantages over HVAC or conventional HVDC transmission systems and is well suited for the grid integration of renewable energy sources into existing distribution or transmission systems. For this reason, a superconducting DC transmission system based on VSC-HVDC transmission technologies is planned for the Jeju power system, Korea. Before applying this system to the real power system on Jeju Island, system analysis should be performed through real-time testing. In this paper, a model-sized superconducting VSC-HVDC system, consisting of a small model-sized VSC-HVDC converter connected to a 2 m YBCO HTS DC model cable, is implemented. The authors performed a real-time simulation that incorporates the model-sized superconducting VSC-HVDC system into the simulated Jeju power system using a Real Time Digital Simulator (RTDS). The performance of the superconducting VSC-HVDC system was verified on the proposed test platform, and the results are discussed in detail.

  11. Advanced Distributed Simulation Technology II (ADST-II) LAM Task Force DO #14 CDRL ABO3 After Action Report

    DTIC Science & Technology

    1997-01-17

    Excerpt of the report's show-control, production, and staging hardware lists: SHOWDirect control systems; Betacam SP players (video backup) and recorders (show record); CRV laser disc recorder/players; 1K scoops and 1K DPs; a Schedule 40 light pole (flown); control console, dimming cables, and distribution; a Sony Betacam SP shooters package; a Folsom high-resolution video scan converter; Betacam SP videotapes; and a custom screen divider/support.

  12. Parallel task processing of very large datasets

    NASA Astrophysics Data System (ADS)

    Romig, Phillip Richardson, III

    This research concerns the use of distributed computing technologies for the analysis and management of very large datasets. Improvements in sensor technology, an emphasis on global change research, and greater access to data warehouses all increase the number of non-traditional users of remotely sensed data. We present a framework for distributed solutions to the challenges of datasets which exceed the online storage capacity of individual workstations. This framework, called parallel task processing (PTP), incorporates both the task- and data-level parallelism exemplified by many image processing operations. An implementation based on the principles of PTP, called Tricky, is also presented. Additionally, we describe the challenges and practical issues in modeling the performance of parallel task processing with large datasets. We present a mechanism for estimating the running time of each unit of work within a system and an algorithm that uses these estimates to simulate the execution environment and produce estimated runtimes. Finally, we describe and discuss experimental results which validate the design. Specifically, the system (a) is able to perform computation on datasets which exceed the capacity of any one disk, (b) provides a reduction of overall computation time as a result of the task distribution even with the additional cost of data transfer and management, and (c) in the simulation mode accurately predicts the performance of the real execution environment.
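
    The runtime-prediction idea described above can be pictured as a simple list-scheduling simulation: estimate each work unit's cost from its data size and an assumed per-node throughput, assign units greedily to the least-loaded node, and report the simulated makespan. The sketch below is an illustrative stand-in for the paper's Tricky implementation; the cost model, throughput, and workload numbers are invented.

      import heapq

      def estimate_runtime(chunk_bytes, mb_per_sec=25.0, overhead_s=0.5):
          """Crude cost model: time proportional to chunk size plus a fixed
          per-task overhead (all constants are illustrative assumptions)."""
          return overhead_s + chunk_bytes / (mb_per_sec * 1024 * 1024)

      def simulate_makespan(chunk_sizes, n_workers):
          """Greedy list scheduling: always give the next chunk to the worker
          that becomes free earliest; return the simulated total running time."""
          workers = [0.0] * n_workers          # each worker's accumulated busy time
          heapq.heapify(workers)
          for size in sorted(chunk_sizes, reverse=True):
              earliest = heapq.heappop(workers)
              heapq.heappush(workers, earliest + estimate_runtime(size))
          return max(workers)

      chunks = [512 * 1024 * 1024] * 8 + [128 * 1024 * 1024] * 16   # a mixed workload
      for n in (1, 4, 8):
          print(f"{n} workers -> simulated runtime {simulate_makespan(chunks, n):.1f} s")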

  13. System-of-Systems Approach for Integrated Energy Systems Modeling and Simulation: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mittal, Saurabh; Ruth, Mark; Pratt, Annabelle

    Today’s electricity grid is the most complex system ever built, and the future grid is likely to be even more complex because it will incorporate distributed energy resources (DERs) such as wind, solar, and various other sources of generation and energy storage. The complexity is further augmented by the possible evolution to new retail market structures that provide incentives to owners of DERs to support the grid. To understand and test new retail market structures and technologies such as DERs, demand-response equipment, and energy management systems while providing reliable electricity to all customers, an Integrated Energy System Model (IESM) is being developed at NREL. The IESM is composed of a power flow simulator (GridLAB-D), home energy management systems implemented using GAMS/Pyomo, a market layer, and hardware-in-the-loop simulation (testing appliances such as HVAC, dishwashers, etc.). The IESM is a system-of-systems (SoS) simulator wherein the constituent systems are brought together in a virtual testbed. We describe an SoS approach for developing a distributed simulation environment and elaborate on the methodology and the control mechanisms used in the co-simulation, illustrated by a case study.
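
    The system-of-systems coupling described above can be pictured as a time-stepped co-simulation loop in which a grid model, a set of home energy management systems, and a market layer exchange values each interval. The sketch below is a generic stand-in for that pattern; it does not use the GridLAB-D or Pyomo APIs, and every class, coefficient, and price in it is invented for illustration.

      import random

      class MarketLayer:
          def clear(self, total_demand_kw):
              """Toy price curve: price rises with aggregate demand ($/kWh)."""
              return 0.10 + 0.002 * total_demand_kw

      class HomeEMS:
          def __init__(self, base_kw, flex_kw, max_price):
              self.base_kw, self.flex_kw, self.max_price = base_kw, flex_kw, max_price

          def demand(self, price):
              """Shed the flexible portion of load when price exceeds the threshold."""
              return self.base_kw + (0.0 if price > self.max_price else self.flex_kw)

      class GridModel:
          def voltage_pu(self, total_demand_kw):
              """Crude feeder model: voltage sags linearly with loading."""
              return 1.02 - 0.0015 * total_demand_kw

      random.seed(0)
      homes = [HomeEMS(random.uniform(1, 3), random.uniform(0.5, 2), random.uniform(0.12, 0.3))
               for _ in range(20)]
      market, grid = MarketLayer(), GridModel()

      price = 0.12
      for step in range(4):                                  # four co-simulation intervals
          demand = sum(h.demand(price) for h in homes)       # EMS layer responds to last price
          price = market.clear(demand)                       # market layer clears on demand
          volts = grid.voltage_pu(demand)                    # power-flow layer evaluates the feeder
          print(f"t={step}: demand={demand:.1f} kW, price=${price:.3f}/kWh, V={volts:.3f} pu")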

  14. 40th Combat Aviation Brigade and USS Ponce conduct interoperability training exercise

    Science.gov Websites

  15. Judge Advocate (NGB-JA) - Personal Staff - Joint Staff - The National Guard

    Science.gov Websites

  16. Senior Enlisted Advisor to the CNGB - The National Guard

    Science.gov Websites

  17. Principal Assistant Responsible for Contracting (PARC) - J8 - The National Guard

    Science.gov Websites

  18. Long before smartphones, National Guard responded to nationwide muster in

    Science.gov Websites

  19. Massachusetts Air National Guard dad deploys with his son for final time

    Science.gov Websites

  20. National Guard Bureau Office of Legislative Liaison - The National Guard

    Science.gov Websites

  1. Distributing Planning and Control for Teams of Cooperating Mobile Robots

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parker, L.E.

    2004-07-19

    This CRADA project involved the cooperative research of investigators in ORNL's Center for Engineering Science Advanced Research (CESAR) with researchers at Caterpillar, Inc. The subject of the research was the development of cooperative control strategies for autonomous vehicles performing applications of interest to Caterpillar customers. The project involved three phases of research, conducted over the time period of November 1998 through December 2001. This project led to the successful development of several technologies and demonstrations in realistic simulation that illustrated the effectiveness of our control approaches for distributed planning and cooperation in multi-robot teams. The primary objectives of this research project were to: (1) develop autonomous control technologies to enable multiple vehicles to work together cooperatively, (2) provide the foundational capabilities for a human operator to exercise oversight and guidance during multi-vehicle task execution, and (3) integrate these capabilities into the ALLIANCE-based autonomous control approach for multi-robot teams. These objectives were successfully met, with the results implemented and demonstrated in a near real-time multi-vehicle simulation of up to four vehicles performing mission-relevant tasks.

  2. Research on radiation characteristics of dipole antenna modulation by sub-wavelength inhomogeneous plasma layer

    NASA Astrophysics Data System (ADS)

    Kong, Fanrong; Chen, Peiqi; Nie, Qiuyue; Zhang, Xiaoning; Zhang, Zhen; Jiang, Binhao

    2018-02-01

    The modulation and enhancement of compact antenna radiation by sub-wavelength plasma structures offers clear technological advantages and has seen considerable progress. To extend the applicability of this technology to complex, realistic environments with inhomogeneous plasma structures, a numerical simulation analysis based on the finite element method is conducted in this paper. The modulation of the antenna radiation by a sub-wavelength plasma layer located at different positions was investigated, and inhomogeneous plasma layers with multiple electron density distribution profiles were employed to explore the effect of the plasma density distribution on the antenna radiation. It is revealed that the optical near-field modulation distance and a reduced plasma distribution are more beneficial for enhancing the radiation. On this basis, an application-focused study of communication through the plasma sheath surrounding a hypersonic vehicle was carried out, aimed at identifying an effective communication window. The results provide guidance for antenna radiation modulation and enhancement, as well as for the development of communication technology for hypersonic flight.

  3. A distributed system for fast alignment of next-generation sequencing data.

    PubMed

    Srimani, Jaydeep K; Wu, Po-Yen; Phan, John H; Wang, May D

    2010-12-01

    We developed a scalable distributed computing system using the Berkeley Open Infrastructure for Network Computing (BOINC) to align next-generation sequencing (NGS) data quickly and accurately. NGS technology is emerging as a promising platform for gene expression analysis due to its high sensitivity compared to traditional genomic microarray technology. However, despite the benefits, NGS datasets can be prohibitively large, requiring significant computing resources to obtain sequence alignment results. Moreover, as the data and alignment algorithms become more prevalent, it will become necessary to examine the effect of the multitude of alignment parameters on various NGS systems. We validate the distributed software system by (1) computing simple timing results to show the speed-up gained by using multiple computers, (2) optimizing alignment parameters using simulated NGS data, and (3) computing NGS expression levels for a single biological sample using optimal parameters and comparing these expression levels to those of a microarray sample. Results indicate that the distributed alignment system achieves an approximately linear speed-up and correctly distributes sequence data to and gathers alignment results from multiple compute clients.
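
    A minimal way to picture the work distribution is to split the read set into chunks, align the chunks in parallel worker processes, and merge the results; the wall-clock ratio against a single worker then gives the kind of speed-up reported above. The sketch below uses Python's standard multiprocessing pool with a dummy "alignment" function; it is not BOINC code, and the seed-counting step is only a placeholder for a real aligner.

      import time
      from multiprocessing import Pool

      def align_chunk(reads):
          """Placeholder for a real aligner: count occurrences of a fixed 8-mer seed."""
          seed = "ACGTACGT"
          return sum(sum(r[i:i + 8] == seed for i in range(len(r) - 7)) for r in reads)

      def run(n_workers, chunks):
          """Scatter chunks to a worker pool, gather results, return wall-clock time."""
          start = time.perf_counter()
          with Pool(n_workers) as pool:
              pool.map(align_chunk, chunks)
          return time.perf_counter() - start

      if __name__ == "__main__":
          reads = ["ACGT" * 250] * 4000              # synthetic 1 kb reads
          chunks = [reads[i::8] for i in range(8)]   # eight roughly equal chunks
          t1, t4 = run(1, chunks), run(4, chunks)
          print(f"serial {t1:.2f} s, 4 workers {t4:.2f} s, speed-up {t1 / t4:.1f}x")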

  4. Dual-Use Space Technology Transfer Conference and Exhibition. Volume 1

    NASA Technical Reports Server (NTRS)

    Krishen, Kumar (Compiler)

    1994-01-01

    This document contains papers presented at the Dual-Use Space Technology Transfer Conference and Exhibition held at the Johnson Space Center February 1-3, 1994. Possible technology transfers covered during the conference were in the areas of information access; innovative microwave and optical applications; materials and structures; marketing and barriers; intelligent systems; human factors and habitation; communications and data systems; business process and technology transfer; software engineering; biotechnology and advanced bioinstrumentation; communications signal processing and analysis; new ways of doing business; medical care; applications derived from control center data systems; human performance evaluation; technology transfer methods; mathematics, modeling, and simulation; propulsion; software analysis and decision tools; systems/processes in human support technology; networks, control centers, and distributed systems; power; rapid development; perception and vision technologies; integrated vehicle health management; automation technologies; advanced avionics; and robotics technologies. More than 77 papers, 20 presentations, and 20 exhibits covering various disciplines were presented by experts from NASA, universities, and industry.

  5. Monte Carlo simulation as a tool to predict blasting fragmentation based on the Kuz Ram model

    NASA Astrophysics Data System (ADS)

    Morin, Mario A.; Ficarazzo, Francesco

    2006-04-01

    Rock fragmentation is considered the most important aspect of production blasting because of its direct effects on the costs of drilling and blasting and on the economics of the subsequent operations of loading, hauling, and crushing. Over the past three decades, significant progress has been made in the development of new technologies for blasting applications, including increasingly sophisticated computer models for blast design and blast performance prediction. Rock fragmentation depends on many variables, such as rock mass properties, site geology, in situ fracturing, and blasting parameters, and as such has no complete theoretical solution for its prediction. However, empirical models for estimating the size distribution of rock fragments have been developed. In this study, a Monte Carlo-based blast fragmentation simulator, built on the Kuz-Ram fragmentation model, has been developed to predict the entire fragment size distribution, taking into account intact and jointed rock properties, the type and properties of the explosives, and the drilling pattern. Results produced by this simulator compared quite favorably with real fragmentation data obtained from a quarry blast. It is anticipated that the use of Monte Carlo simulation will increase our understanding of the effects of rock mass and explosive properties on rock fragmentation by blasting, as well as our confidence in these empirical models. This understanding will translate into improvements in blasting operations, their costs, and the overall economics of open pit mines and rock quarries.
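
    To make the approach concrete, the sketch below samples uncertain inputs (rock factor, powder factor, charge mass) and pushes each draw through a commonly quoted form of the Kuznetsov mean-size equation and a Rosin-Rammler size distribution. The exponents, the uniformity index, and the parameter ranges are textbook-style illustrations, not values from the paper, and should be checked against the original Kuz-Ram references before any real use.

      import math
      import random

      def kuznetsov_mean_size(A, K, Q, rws):
          """Mean fragment size X50 (cm), in a commonly quoted Kuznetsov form:
          X50 = A * K^-0.8 * Q^(1/6) * (115/RWS)^(19/30), where A is the rock
          factor, K the powder factor (kg/m^3), Q the charge per hole (kg), and
          RWS the relative weight strength of the explosive (ANFO = 100)."""
          return A * K ** -0.8 * Q ** (1.0 / 6.0) * (115.0 / rws) ** (19.0 / 30.0)

      def fraction_passing(x_cm, x50_cm, n):
          """Rosin-Rammler cumulative fraction passing size x, anchored at X50."""
          xc = x50_cm / (math.log(2.0) ** (1.0 / n))
          return 1.0 - math.exp(-((x_cm / xc) ** n))

      random.seed(1)
      passing_30cm = []
      for _ in range(10000):                      # Monte Carlo over uncertain inputs
          A = random.uniform(6.0, 10.0)           # rock factor (illustrative range)
          K = random.uniform(0.5, 0.8)            # powder factor, kg/m^3
          Q = random.uniform(80.0, 120.0)         # explosive mass per hole, kg
          x50 = kuznetsov_mean_size(A, K, Q, rws=100.0)
          passing_30cm.append(fraction_passing(30.0, x50, n=1.5))

      passing_30cm.sort()
      print(f"median fraction passing 30 cm: {passing_30cm[len(passing_30cm) // 2]:.2f}")
      print(f"10th-90th percentile: {passing_30cm[1000]:.2f} - {passing_30cm[9000]:.2f}")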

  6. Distributed Grooming in Multi-Domain IP/MPLS-DWDM Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Qing

    2009-12-01

    This paper studies distributed multi-domain, multilayer provisioning (grooming) in IP/MPLS-DWDM networks. Although many multi-domain studies have emerged over the years, these have primarily considered 'homogeneous' network layers. Meanwhile, most grooming studies have assumed idealized settings with 'global' link state across all layers. Hence there is a critical need to develop practical distributed grooming schemes for real-world networks consisting of multiple domains and technology layers. Along these lines, a detailed hierarchical framework is proposed to implement inter-layer routing, distributed grooming, and setup signaling. The performance of this solution is analyzed in detail using simulation studies, and future work directions are also highlighted.

  7. Finite Element Simulation of the Shear Effect of Ultrasonic on Heat Exchanger Descaling

    NASA Astrophysics Data System (ADS)

    Lu, Shaolv; Wang, Zhihua; Wang, Hehui

    2018-03-01

    The shear effect at the interface between a metal plate and its attached scale is an important mechanism of ultrasonic descaling, caused by the different propagation speeds of the ultrasonic wave in the two mediums. The propagation of the ultrasonic wave on the shell is simulated using ANSYS/LS-DYNA explicit dynamic analysis. The distribution of shear stress along different paths under ultrasonic vibration is obtained through the finite element analysis, revealing the main descaling mechanism of the shear effect. The simulation results inform the rational design and application of ultrasonic descaling technology for heat exchangers.

  8. A Geospatial Comparison of Distributed Solar Heat and Power in Europe and the US

    PubMed Central

    Norwood, Zack; Nyholm, Emil; Otanicar, Todd; Johnsson, Filip

    2014-01-01

    The global trends for the rapid growth of distributed solar heat and power in the last decade will likely continue as the levelized cost of production for these technologies continues to decline. To be able to compare the economic potential of solar technologies one must first quantify the types and amount of solar resource that each technology can utilize; second, estimate the technological performance potential based on that resource; and third, compare the costs of each technology across regions. In this analysis, we have performed the first two steps in this process. We use physical and empirically validated models of a total of 8 representative solar system types: non-tracking photovoltaics, 2d-tracking photovoltaics, high concentration photovoltaics, flat-plate thermal, evacuated tube thermal, concentrating trough thermal, concentrating solar combined heat and power, and hybrid concentrating photovoltaic/thermal. These models are integrated into a simulation that uses typical meteorological year weather data to create a yearly time series of heat and electricity production for each system over 12,846 locations in Europe and 1,020 locations in the United States. Through this simulation, systems composed of various permutations of collector-types and technologies can be compared geospatially and temporally in terms of their typical production in each location. For example, we see that silicon solar cells show a significant advantage in yearly electricity production over thin-film cells in the colder climatic regions, but that advantage is lessened in regions that have high average irradiance. In general, the results lead to the conclusion that comparing solar technologies across technology classes simply on cost per peak watt, as is usually done, misses these often significant regional differences in annual performance. These results have implications for both solar power development and energy systems modeling of future pathways of the electricity system. PMID:25474632
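
    The per-location yearly production compared above comes from driving system models with typical-meteorological-year irradiance and temperature series. The sketch below shows the shape of such a calculation for a single non-tracking PV system; the efficiency, temperature-coefficient, and cell-heating numbers are generic illustrative values, not the validated model parameters used in the study.

      def pv_energy_kwh(ghi_wm2, temp_air_c, area_m2=1.6, eff_stc=0.19,
                        temp_coeff=-0.004, noct_rise=0.03):
          """Sum hourly DC energy for a fixed-tilt PV panel from hourly global
          irradiance (W/m^2) and air temperature (deg C). Illustrative constants:
          19% STC efficiency, -0.4%/K power coefficient, crude cell-heating model."""
          energy_wh = 0.0
          for ghi, t_air in zip(ghi_wm2, temp_air_c):
              t_cell = t_air + noct_rise * ghi                 # simple cell-heating model
              eff = eff_stc * (1.0 + temp_coeff * (t_cell - 25.0))
              energy_wh += ghi * area_m2 * max(eff, 0.0)       # one-hour time step
          return energy_wh / 1000.0

      # Toy two-day example standing in for a full 8760-hour TMY series.
      ghi = [0, 0, 100, 400, 700, 800, 700, 400, 100, 0, 0, 0] * 2
      temp = [10, 10, 12, 15, 18, 20, 21, 19, 16, 13, 12, 11] * 2
      print(f"{pv_energy_kwh(ghi, temp):.2f} kWh over the sample period")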

  9. A geospatial comparison of distributed solar heat and power in Europe and the US.

    PubMed

    Norwood, Zack; Nyholm, Emil; Otanicar, Todd; Johnsson, Filip

    2014-01-01

    The global trends for the rapid growth of distributed solar heat and power in the last decade will likely continue as the levelized cost of production for these technologies continues to decline. To be able to compare the economic potential of solar technologies one must first quantify the types and amount of solar resource that each technology can utilize; second, estimate the technological performance potential based on that resource; and third, compare the costs of each technology across regions. In this analysis, we have performed the first two steps in this process. We use physical and empirically validated models of a total of 8 representative solar system types: non-tracking photovoltaics, 2d-tracking photovoltaics, high concentration photovoltaics, flat-plate thermal, evacuated tube thermal, concentrating trough thermal, concentrating solar combined heat and power, and hybrid concentrating photovoltaic/thermal. These models are integrated into a simulation that uses typical meteorological year weather data to create a yearly time series of heat and electricity production for each system over 12,846 locations in Europe and 1,020 locations in the United States. Through this simulation, systems composed of various permutations of collector-types and technologies can be compared geospatially and temporally in terms of their typical production in each location. For example, we see that silicon solar cells show a significant advantage in yearly electricity production over thin-film cells in the colder climatic regions, but that advantage is lessened in regions that have high average irradiance. In general, the results lead to the conclusion that comparing solar technologies across technology classes simply on cost per peak watt, as is usually done, misses these often significant regional differences in annual performance. These results have implications for both solar power development and energy systems modeling of future pathways of the electricity system.

  10. Indian emissions of technology-linked NMVOCs with chemical speciation: An evaluation of the SAPRC99 mechanism with WRF-CAMx simulations

    NASA Astrophysics Data System (ADS)

    Sarkar, M.; Venkataraman, C.; Guttikunda, S.; Sadavarte, P.

    2016-06-01

    Non-methane volatile organic compounds (NMVOCs) are important precursors of reactions producing tropospheric ozone and secondary organic aerosols. The present work uses a detailed technology-linked NMVOC emission database for India, along with a standard method for mapping to measured NMVOC profiles, to develop speciated NMVOC emissions, which are aggregated into the multiple chemical mechanisms used in chemical transport models. The fully speciated NMVOC emissions inventory, with 423 constituent species, was regrouped into model-ready reactivity classes of the RADM2, SAPRC99, and CB-IV chemical mechanisms and spatially distributed at 25 × 25 km2 resolution using source-specific spatial proxies. Emissions were considered from four major sectors (industry, transport, agriculture, and residential) and from non-combustion activities (use of solvents and paints). It was found that residential cooking with biomass fuels, followed by agricultural residue burning in fields and on-road transport, were the largest contributors to the highest-reactivity group of NMVOC emissions from India. The emissions were evaluated using WRF-CAMx simulations with the SAPRC99 photochemical mechanism over India for the contrasting months of April, July, and October 2010. Modelled columnar abundances of NO2, CO, and O3 agreed well with satellite observations in both magnitude and spatial distribution in the three contrasting months. Evaluation of monthly and spatial differences between model predictions and observations indicates the need for further refinement of the spatial distribution of NOX emissions and of the spatio-temporal distribution of agricultural residue burning emissions.
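
    The regrouping step described above amounts to mapping each species' emission mass to a mechanism reactivity class and summing within classes. The snippet below illustrates that bookkeeping with a tiny invented species list and an invented SAPRC-style class mapping; it is not the paper's speciation table, and the class names are placeholders.

      from collections import defaultdict

      # Invented per-species annual emissions (Gg/yr), for illustration only.
      species_emissions = {"ethene": 120.0, "propene": 45.0, "toluene": 60.0,
                           "m-xylene": 30.0, "isoprene": 15.0, "ethanol": 80.0}

      # Invented mapping from individual species to mechanism reactivity classes
      # (a real SAPRC99 assignment comes from published speciation profiles).
      species_to_class = {"ethene": "ETHE", "propene": "OLE1", "toluene": "ARO1",
                          "m-xylene": "ARO2", "isoprene": "ISOP", "ethanol": "ALK1"}

      def aggregate(emissions, mapping):
          """Sum speciated emissions into model-ready mechanism classes."""
          classes = defaultdict(float)
          for species, mass in emissions.items():
              classes[mapping[species]] += mass
          return dict(classes)

      print(aggregate(species_emissions, species_to_class))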

  11. Three-dimensional magnetohydrodynamical simulation of expanding magnetic flux ropes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnold, L.; Dreher, J.; Grauer, R.

    Three-dimensional, time-dependent numerical simulations of the dynamics of magnetic flux ropes are presented. The simulations are targeted towards an experiment previously conducted at the California Institute of Technology [P. M. Bellan and J. F. Hansen, Phys. Plasmas 5, 1991 (1998)] which aimed at simulating solar prominence eruptions in the laboratory. The plasma dynamics is described by ideal magnetohydrodynamics using different models for the evolution of the mass density. The initial current distribution represents the situation at the plasma creation phase, and it is not increased during the simulation. Key features of the reported experimental observations, such as pinching of the current loop, its expansion, and its distortion into a helical shape, are reproduced in the numerical simulations. Details of the final structure depend on the choice of a specific model for the mass density.

  12. Study of multipactor suppression of microwave components using perforated waveguide technology for space applications

    NASA Astrophysics Data System (ADS)

    Ye, Ming; Li, Yun; He, Yongning; Daneshmand, Mojgan

    2017-05-01

    With the development of space technology, microwave components with increased power handling capability and reduced weight are urgently required. In this work, perforated waveguide technology is proposed to suppress the multipactor effect in high-power microwave components. This method also has the advantage of reducing component weight, which gives it great potential for space applications. The perforated part of a waveguide component can be treated as an electron absorber (its effective total electron emission yield is zero), since most of the electrons impacting this part exit the component. Based on thoroughly benchmarked numerical simulation procedures, we simulated an S-band and an X-band waveguide transformer to conceptually verify this idea. Both electron-dynamics simulations and electrical-loss simulations demonstrate that the perforation technology can improve the multipactor threshold by at least ~8 dB while maintaining an acceptable insertion loss level compared with the un-perforated components. We also found that components with a larger minimum gap achieve multipactor suppression more easily; this effect is interpreted with a parallel-plate waveguide model. Furthermore, to improve the multipactor threshold of the X-band waveguide transformer with a minimum gap of ~0.1 mm, we proposed a perforation structure with sloped edges and explained its mechanism. Future study will focus on further optimization of the perforation structure, size, and distribution to maximize the overall performance of microwave components.

  13. Electrical properties study under radiation of the 3D-open-shell-electrode detector

    NASA Astrophysics Data System (ADS)

    Liu, Manwen; Li, Zheng

    2018-05-01

    With the 3D-Open-Shell-Electrode Detector (3DOSED) proposed and its structure optimized, it is important to study the 3DOSED's electrical properties to determine the detector's performance, especially in heavy radiation environments such as the Large Hadron Collider (LHC) and its upgrade, the High Luminosity LHC (HL-LHC) at CERN. In this work, full 3D technology computer-aided design (TCAD) simulations have been performed on this novel silicon detector structure. Simulated detector properties include the electric field distribution, the electric potential distribution, current-voltage (I-V) characteristics, capacitance-voltage (C-V) characteristics, charge collection, and the full depletion voltage. Through analysis of the calculations and simulation results, we find that the 3DOSED's electric field and potential distributions are very uniform, with only small perturbations in the tiny region near the shell openings. The novel detector fits its design purpose of collecting charges generated by particles/light, with a well-defined funnel-shaped electric potential distribution that makes these charges drift towards the central collection electrode. Furthermore, by analyzing the I-V and C-V characteristics, the charge collection, and the full depletion voltage, we expect that the novel detector will perform well even in heavy radiation environments.

  14. Securing Sensitive Flight and Engine Simulation Data Using Smart Card Technology

    NASA Technical Reports Server (NTRS)

    Blaser, Tammy M.

    2003-01-01

    NASA Glenn Research Center has developed a smart card prototype capable of encrypting and decrypting disk files required to run a distributed aerospace propulsion simulation. Triple Data Encryption Standard (3DES) encryption is used to secure the sensitive intellectual property on disk before, during, and after simulation execution. The prototype operates as a secure system and maintains its authorized state by safely storing and permanently retaining the encryption keys only on the smart card. The prototype is capable of authenticating a single smart card user and includes pre-simulation and post-simulation tools for analysis and training purposes. The prototype's design is highly generic and can be used to protect any sensitive disk files, with growth capability to run multiple simulations. The NASA computer engineer developed the prototype in an interoperable programming environment to enable porting to other Numerical Propulsion System Simulation (NPSS)-capable operating system environments.
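
    The core operation described, encrypting and decrypting simulation files with 3DES, can be sketched with a standard cryptography library as below. This is a generic illustration, not the NASA prototype's code: in the prototype the key material stays on the smart card, whereas here it is simply a local variable, the file name is invented, the pyca/cryptography package is an assumed dependency, and recent releases of that library mark 3DES as a deprecated legacy algorithm.

      import os
      from cryptography.hazmat.primitives import padding
      from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

      def encrypt_file_3des(path, key):
          """Write path + '.enc' containing IV || 3DES-CBC ciphertext of the file."""
          iv = os.urandom(8)                       # 3DES uses a 64-bit block size
          padder = padding.PKCS7(64).padder()
          with open(path, "rb") as f:
              data = padder.update(f.read()) + padder.finalize()
          enc = Cipher(algorithms.TripleDES(key), modes.CBC(iv)).encryptor()
          with open(path + ".enc", "wb") as f:
              f.write(iv + enc.update(data) + enc.finalize())

      def decrypt_file_3des(path_enc, key):
          """Return the decrypted plaintext of a file written by encrypt_file_3des."""
          with open(path_enc, "rb") as f:
              iv, blob = f.read(8), f.read()
          dec = Cipher(algorithms.TripleDES(key), modes.CBC(iv)).decryptor()
          unpadder = padding.PKCS7(64).unpadder()
          return unpadder.update(dec.update(blob) + dec.finalize()) + unpadder.finalize()

      # In the prototype the 24-byte key would come from the smart card; here it is
      # generated locally, and the demo file stands in for a simulation input deck.
      key = os.urandom(24)
      with open("engine_deck.dat", "wb") as f:
          f.write(b"proprietary simulation inputs")
      encrypt_file_3des("engine_deck.dat", key)
      print(decrypt_file_3des("engine_deck.dat.enc", key))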

  15. Exploring High School Students Beginning Reasoning about Significance Tests with Technology

    ERIC Educational Resources Information Center

    García, Víctor N.; Sánchez, Ernesto

    2017-01-01

    In the present study we analyze how students reason about or make inferences in a particular hypothesis-testing problem (without having studied formal methods of statistical inference) when using Fathom. They use Fathom to create an empirical sampling distribution through computer simulation. It is found that most students' reasoning relies on…
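
    The empirical sampling distribution the students build in Fathom can be reproduced in a few lines of simulation: repeatedly sample under the null hypothesis, record the statistic, and compare the observed value against the simulated distribution. The sketch below is a generic randomization-style example with made-up numbers, not the task used in the study.

      import random

      random.seed(0)

      # Suppose a coin-like process is claimed fair (p = 0.5) and we observe
      # 34 successes in 50 trials; simulate the sampling distribution under H0.
      n_trials, observed = 50, 34
      simulated = [sum(random.random() < 0.5 for _ in range(n_trials))
                   for _ in range(10000)]

      # Empirical p-value: how often does chance alone do at least this well?
      p_value = sum(s >= observed for s in simulated) / len(simulated)
      print(f"simulated mean = {sum(simulated) / len(simulated):.1f}, "
            f"empirical one-sided p-value = {p_value:.4f}")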

  16. Director of the Air National Guard - Lieutenant General L. Scott Rice - The National Guard

    Science.gov Websites

  17. ONR Europe Reports. Computer Science/Computer Engineering in Central Europe: A Report on Czechoslovakia, Hungary, and Poland

    DTIC Science & Technology

    1992-08-01

    Excerpt of references cited in the report: Rychlik J., “Simulation of distributed control systems,” research report of the Institute of Technology in Pilsen, no. 209-07-85, Jun. 1985; Kocur P., “Sensitivity analysis of reliability parameters,” Proceedings of Conf. FTSD, Brno, Jun. 1986, pp. 97-101; Smrha P., Kocur P., Racek S. (title truncated in the record).

  18. Cybercare 2.0: meeting the challenge of the global burden of disease in 2030.

    PubMed

    Rosen, Joseph M; Kun, Luis; Mosher, Robyn E; Grigg, Elliott; Merrell, Ronald C; Macedonia, Christian; Klaudt-Moreau, Julien; Price-Smith, Andrew; Geiling, James

    In this paper, we propose to advance and transform today's healthcare system using a model of networked health care called Cybercare. Cybercare means "health care in cyberspace" - for example, doctors consulting with patients via videoconferencing across a distributed network; or patients receiving care locally - in neighborhoods, "minute clinics," and homes - using information technologies such as telemedicine, smartphones, and wearable sensors to link to tertiary medical specialists. This model contrasts with traditional health care, in which patients travel (often a great distance) to receive care from providers in a central hospital. The Cybercare model shifts health care provision from hospital to home; from specialist to generalist; and from treatment to prevention. Cybercare employs advanced technology to deliver services efficiently across the distributed network - for example, using telemedicine, wearable sensors and cell phones to link patients to specialists and upload their medical data in near-real time; using information technology (IT) to rapidly detect, track, and contain the spread of a global pandemic; or using cell phones to manage medical care in a disaster situation. Cybercare uses seven "pillars" of technology to provide medical care: genomics; telemedicine; robotics; simulation, including virtual and augmented reality; artificial intelligence (AI), including intelligent agents; the electronic medical record (EMR); and smartphones. All these technologies are evolving and blending. The technologies are integrated functionally because they underlie the Cybercare network, and/or form part of the care for patients using that distributed network. Moving health care provision to a networked, distributed model will save money, improve outcomes, facilitate access, improve security, increase patient and provider satisfaction, and may mitigate the international global burden of disease. In this paper we discuss how Cybercare is being implemented now, and envision its growth by 2030.

  19. A radar-enabled collaborative sensor network integrating COTS technology for surveillance and tracking.

    PubMed

    Kozma, Robert; Wang, Lan; Iftekharuddin, Khan; McCracken, Ernest; Khan, Muhammad; Islam, Khandakar; Bhurtel, Sushil R; Demirer, R Murat

    2012-01-01

    The feasibility of using Commercial Off-The-Shelf (COTS) sensor nodes is studied in a distributed network, aiming at dynamic surveillance and tracking of ground targets. Data acquisition by low-cost (<$50 US) miniature low-power radar through a wireless mote is described. We demonstrate the detection, ranging and velocity estimation, classification and tracking capabilities of the mini-radar, and compare results to simulations and manual measurements. Furthermore, we supplement the radar output with other sensor modalities, such as acoustic and vibration sensors. This method provides innovative solutions for detecting, identifying, and tracking vehicles and dismounts over a wide area in noisy conditions. This study presents a step towards distributed intelligent decision support and demonstrates effectiveness of small cheap sensors, which can complement advanced technologies in certain real-life scenarios.

  20. Application of distributed optical fiber sensing technologies to the monitoring of leakage and abnormal disturbance of oil pipeline

    NASA Astrophysics Data System (ADS)

    Yang, Xiaojun; Zhu, Xiaofei; Deng, Chi; Li, Junyi; Liu, Cheng; Yu, Wenpeng; Luo, Hui

    2017-10-01

    To improve the management and monitoring of leakage and abnormal disturbances along long-distance oil pipelines, a distributed optical fiber temperature and vibration sensing system was employed to test its feasibility for health monitoring of a domestic oil pipeline. Simulated leakage and abnormal-disturbance events on the pipeline were staged in the experiment. It is demonstrated that leakage and abnormal disturbances can be detected and located accurately with the distributed optical fiber sensing system, which exhibits good sensitivity, reliability, and ease of operation and maintenance, and shows good prospects for market application.

  1. Modeling fire-induced smoke spread and carbon monoxide transportation in a long channel: Fire Dynamics Simulator comparisons with measured data.

    PubMed

    Hu, L H; Fong, N K; Yang, L Z; Chow, W K; Li, Y Z; Huo, R

    2007-02-09

    Smoke and toxic gases, such as carbon monoxide, are the most fatal factors in fires. This paper models fire-induced smoke spread and carbon monoxide transportation in an 88 m long channel using the Fire Dynamics Simulator (FDS) with large eddy simulation (LES). FDS is a well-established fire dynamics computational fluid dynamics (CFD) program developed by the National Institute of Standards and Technology (NIST). Two full-scale experiments with fire sizes of 0.75 and 1.6 MW were conducted in this channel to validate the program. The spread of the fire-induced smoke flow, the smoke temperature distribution along the channel, and the carbon monoxide concentration at an assigned position were measured. The FDS simulation results were compared with the experimental data, with fairly good agreement demonstrated. The validation work is then extended to numerically study the carbon monoxide concentration distribution, both vertically and longitudinally, in this long channel. Results showed that carbon monoxide concentration increases linearly with height above the floor and decreases exponentially with distance away from the fire source.

  2. Analysis of exposure to electromagnetic fields in a healthcare environment: simulation and experimental study.

    PubMed

    de Miguel-Bilbao, Silvia; Martín, Miguel Angel; Del Pozo, Alejandro; Febles, Victor; Hernández, José A; de Aldecoa, José C Fernández; Ramos, Victoria

    2013-11-01

    Recent advances in wireless technologies have led to an increase in the wireless instrumentation present in healthcare centers. This paper presents an analytical method for characterizing electric field (E-field) exposure within these environments. The E-field levels of the different wireless communications systems have been measured on two floors of the Canary University Hospital Consortium (CUHC). The electromagnetic (EM) conditions detected with the experimental measurements have been estimated using the software EFC-400-Telecommunications (Narda Safety Test Solutions, Sandwiesenstrasse 7, 72793 Pfullingen, Germany). The experimental and simulated results are represented as 2D contour maps and have been compared with the recommended safety and exposure thresholds. The maximum value obtained is much lower than the 3 V/m established in the International Electrotechnical Commission standard for electromedical devices. Results show a high correlation, in terms of the E-field cumulative distribution function (CDF), between the experimental and simulation results. In general, the CDFs of each pair of experimental and simulated samples follow a lognormal distribution with the same mean.

  3. Experimental and numerical study of impact of voltage fluctuate, flicker and power factor wave electric generator to local distribution

    NASA Astrophysics Data System (ADS)

    Hadi, Nik Azran Ab; Rashid, Wan Norhisyam Abd; Hashim, Nik Mohd Zarifie; Mohamad, Najmiah Radiah; Kadmin, Ahmad Fauzan

    2017-10-01

    Electricity is one of the most important forms of energy in the world. Engineers and technologists have cooperated to develop new low-cost, carbon-free technologies, since carbon emissions are a major concern due to global warming. Renewable energy sources such as hydro, wind, and wave power are becoming widespread as a means of reducing carbon emissions; compared with coal-based power, however, this effort requires new methods, techniques, and technologies. The power quality of renewable sources needs in-depth and continuing research to improve renewable energy technologies. The aim of this project is to investigate the impact of a wave-driven electric generator on its local distribution system. A wave power farm was designed to connect to the local distribution system and was investigated and analyzed to ensure that the energy supplied to customers is clean. MATLAB tools are used for the overall analysis. At the end of the project, a summary identifying various sources of voltage fluctuation is presented in terms of voltage flicker, and suggestions from the analysis of the impact of wave power generation on the local distribution system are presented for the development of wave generator farms.

  4. Developing an Integration Infrastructure for Distributed Engine Control Technologies

    NASA Technical Reports Server (NTRS)

    Culley, Dennis; Zinnecker, Alicia; Aretskin-Hariton, Eliot; Kratz, Jonathan

    2014-01-01

    Turbine engine control technology is poised to make the first revolutionary leap forward since the advent of full authority digital engine control in the mid-1980s. This change aims squarely at overcoming the physical constraints that have historically limited control system hardware on aero-engines to a federated architecture. Distributed control architecture allows complex analog interfaces existing between system elements and the control unit to be replaced by standardized digital interfaces. Embedded processing, enabled by high temperature electronics, provides for digitization of signals at the source and network communications resulting in a modular system at the hardware level. While this scheme simplifies the physical integration of the system, its complexity appears in other ways. In fact, integration now becomes a shared responsibility among suppliers and system integrators. While these are the most obvious changes, there are additional concerns about performance, reliability, and failure modes due to distributed architecture that warrant detailed study. This paper describes the development of a new facility intended to address the many challenges of the underlying technologies of distributed control. The facility is capable of performing both simulation and hardware studies ranging from component to system level complexity. Its modular and hierarchical structure allows the user to focus their interaction on specific areas of interest.

  5. Modern Grid Initiative Distribution Taxonomy Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schneider, Kevin P.; Chen, Yousu; Chassin, David P.

    2008-11-01

    This is the final report for the development of a taxonomy of prototypical electrical distribution feeders. Two of the primary goals of the Department of Energy's (DOE) Modern Grid Initiative (MGI) are 'to accelerate the modernization of our nation's electricity grid' and to 'support demonstrations of systems of key technologies that can serve as the foundation for an integrated, modern power grid'. A key component to realizing these goals is the effective implementation of new, as well as existing, 'smart grid technologies'. Possibly the largest barrier identified in the deployment of smart grid technologies is the inability to evaluate how their deployment will affect the electricity infrastructure, both locally and on a regional scale. This inability is primarily due to the lack of detailed electrical distribution feeder information. While detailed distribution feeder information does reside with the various distribution utilities, there is no central repository of information that can be openly accessed. The role of Pacific Northwest National Laboratory (PNNL) in the MGI for FY08 was to collect distribution feeder models, in the SynerGEE® format, from electric utilities around the nation so that they could be analyzed to identify regional differences in feeder design and operation. Based on this analysis, PNNL developed a taxonomy of 24 prototypical feeder models in the GridLAB-D simulation environment that contain the fundamental characteristics of non-urban-core, radial distribution feeders from the various regions of the U.S. Weighting factors for these feeders are also presented so that they can be used to generate a representative sample for various regions within the United States. The final product presented in this report is a toolset that enables the evaluation of new smart grid technologies, with the ability to aggregate their effects to regional and national levels. The distribution feeder models presented in this report are based on actual utility models but do not contain any proprietary or system-specific information. As a result, the models discussed in this report can be openly distributed to industry, academia, or any interested entity in order to facilitate the evaluation of smart grid technologies.

  6. A novel method for energy harvesting simulation based on scenario generation

    NASA Astrophysics Data System (ADS)

    Wang, Zhe; Li, Taoshen; Xiao, Nan; Ye, Jin; Wu, Min

    2018-06-01

    Energy harvesting networks (EHNs) are a new form of computer network: they convert ambient energy into usable electric energy and supply it as a primary or secondary power source to communication devices. However, most EHN studies use an analytical probability distribution function to describe the energy harvesting process, which cannot accurately capture the actual situation. We propose an EHN simulation method based on scenario generation in this paper. First, instead of setting a probability distribution in advance, it uses optimal scenario reduction technology to generate representative single-period scenarios based on historical data of the harvested energy. Second, it uses a homogeneous simulated annealing algorithm to generate optimal daily energy harvesting scenario sequences, giving a more accurate simulation of the random characteristics of the energy harvesting network. Taking actual wind power data as an example, the accuracy and stability of the method are verified by comparison with the real data. Finally, an example of optimizing network throughput indicates the feasibility and effectiveness of the proposed method in energy harvesting simulation.
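
    The single-period scenario-reduction step can be illustrated with a simple probability-weighted reduction: start with every historical observation as an equally likely scenario, then repeatedly delete the scenario that is cheapest to remove (its probability times its distance to the nearest survivor) and hand its probability to that survivor. This is a generic greedy sketch with synthetic data, not the paper's algorithm or parameters.

      import random

      random.seed(3)
      # Synthetic historical harvested-energy values for one period (Wh).
      history = [max(0.0, random.gauss(50, 20)) for _ in range(200)]
      scenarios = {i: (v, 1.0 / len(history)) for i, v in enumerate(history)}

      def reduce_scenarios(scen, keep):
          """Greedy reduction: delete the scenario with the smallest probability-
          weighted distance to its nearest neighbour, transferring its probability
          mass to that neighbour, until only `keep` scenarios remain."""
          scen = dict(scen)
          while len(scen) > keep:
              best = None
              for i, (vi, pi) in scen.items():
                  j, dist = min(((k, abs(vi - vk)) for k, (vk, _) in scen.items() if k != i),
                                key=lambda t: t[1])
                  cost = pi * dist
                  if best is None or cost < best[0]:
                      best = (cost, i, j)
              _, i, j = best
              vj, pj = scen[j]
              scen[j] = (vj, pj + scen[i][1])       # transfer deleted probability
              del scen[i]
          return scen

      for value, prob in sorted(reduce_scenarios(scenarios, keep=5).values()):
          print(f"scenario {value:6.1f} Wh with probability {prob:.3f}")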

  7. Engineering research, development and technology FY99

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langland, R T

    The growth of computer power and connectivity, together with advances in wireless sensing and communication technologies, is transforming the field of complex distributed systems. The ability to deploy large numbers of sensors with a rapid, broadband communication system will enable high-fidelity, near real-time monitoring of complex systems. These technological developments will provide unprecedented insight into the actual performance of engineered and natural environment systems, enable the evolution of many new types of engineered systems for monitoring and detection, and enhance our ability to perform improved and validated large-scale simulations of complex systems. One of the challenges facing engineering is to develop methodologies to exploit the emerging information technologies. Particularly important will be the ability to assimilate measured data into the simulation process in a way which is much more sophisticated than current, primarily ad hoc procedures. The reports contained in this section on the Center for Complex Distributed Systems describe activities related to the integrated engineering of large complex systems. The first three papers describe recent developments for each link of the integrated engineering process for large structural systems. These include (1) the development of model-based signal processing algorithms which will formalize the process of coupling measurements and simulation and provide a rigorous methodology for validation and update of computational models; (2) collaborative efforts with faculty at the University of California at Berkeley on the development of massive simulation models for the earth and large bridge structures; and (3) the development of wireless data acquisition systems which provide a practical means of monitoring large systems like the National Ignition Facility (NIF) optical support structures. These successful developments are coming to a confluence in the next year with applications to NIF structural characterizations and analysis of large bridge structures for the State of California. Initial feasibility investigations into the development of monitoring and detection systems are described in the papers on imaging of underground structures with ground-penetrating radar, and the use of live insects as sensor platforms. These efforts are establishing the basic performance characteristics essential to the decision process for future development of sensor arrays for information gathering related to national security.

  8. Role and challenges of simulation in undergraduate curriculum.

    PubMed

    Nuzhat, Ayesha; Salem, Raneem Osama; Al Shehri, Fatimah Nasser; Al Hamdan, Nasser

    2014-04-01

    Medical simulation is a relatively novel technology widely utilized for teaching and assessing students' clinical skills. Students and faculty face many challenges when simulation sessions are introduced into an undergraduate curriculum. The aim of this study is to obtain the opinion of undergraduate medical students and faculty regarding the role of simulation in the undergraduate curriculum, the simulation modalities used, and the perceived barriers to implementing simulation sessions. A self-administered, pilot-tested questionnaire with 18 items using a 5-point Likert scale was distributed to undergraduate male (n = 125) and female (n = 70) students as well as to faculty members (n = 14) at King Fahad Medical City, King Saud Bin Abdul Aziz University of Health Sciences, Saudi Arabia. Survey elements addressed the role of simulation, the simulation modalities used, and perceived challenges to the implementation of simulation sessions. Various learning outcomes are achieved and improved through technology-enhanced simulation sessions, such as communication skills, diagnostic skills, procedural skills, self-confidence, and the integration of basic and clinical sciences. The use of high-fidelity simulators, simulated patients, and task trainers was considered desirable by both students and faculty for teaching and learning as well as for evaluation. According to most of the students, institutional support in terms of resources, staff, and session duration was adequate; however, motivation to participate in the sessions and the provision of adequate feedback by staff were constraints. The use of the simulation laboratory is of great benefit to students and a valuable teaching tool for staff to ensure students learn various skills.

  9. ρ-VOF: An interface sharpening method for gas-liquid flow simulation

    NASA Astrophysics Data System (ADS)

    Wang, Jiantao; Liu, Gang; Jiang, Xiong; Mou, Bin

    2018-05-01

    The simulation of compressible gas-liquid flow remains an open problem. Popular methods are either confined to the incompressible flow regime or inevitably smear the free interface. A new finite volume method for compressible two-phase flow simulation is contributed for this subject. First, the “heterogeneous equilibrium” assumption is introduced for the control volume; by employing free-interface reconstruction technology, the distribution of each component in the control volume is obtained. Next, the AUSM+-up (advection upstream splitting method) scheme is employed to calculate the convective fluxes and pressure fluxes, with the contact discontinuity characteristic considered, followed by the update of the whole flow field. The new method features a density-based formulation and the interface reconstruction technology of VOF (volume of fluid), hence the name “ρ-VOF method”. Inheriting from the AUSM family and VOF, ρ-VOF behaves as an all-speed method, capable of simulating shocks in gas-liquid flow while preserving the sharpness of the free interface. A gas-liquid shock tube is simulated to evaluate the method; good agreement is obtained between the predicted results and those of the cited literature, and a sharper free interface is obtained. These results support the capability and validity of the ρ-VOF method for compressible gas-liquid flow simulation.
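
    To see the kind of interface smearing this method is designed to avoid, the following minimal sketch (a generic illustration, not the ρ-VOF scheme) advects a volume-fraction step with a first-order upwind finite-volume update and reports how far the initially sharp gas-liquid interface spreads; the grid size, CFL number, and step count are arbitrary.

    ```python
    import numpy as np

    # First-order upwind advection of a volume fraction alpha in 1D,
    # illustrating numerical smearing of an initially sharp interface.
    nx, cfl, u = 200, 0.5, 1.0          # cells, CFL number, constant velocity
    dx = 1.0 / nx
    dt = cfl * dx / u
    alpha = np.where(np.arange(nx) * dx < 0.3, 1.0, 0.0)  # liquid left, gas right

    for _ in range(100):                # advect the step to the right
        flux = u * alpha                # upwind flux (u > 0)
        alpha[1:] -= dt / dx * (flux[1:] - flux[:-1])

    smeared = np.count_nonzero((alpha > 0.01) & (alpha < 0.99))
    print(f"interface now spans ~{smeared} cells instead of 1")
    ```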

  10. Adaptive spatial filtering for daytime satellite quantum key distribution

    NASA Astrophysics Data System (ADS)

    Gruneisen, Mark T.; Sickmiller, Brett A.; Flanagan, Michael B.; Black, James P.; Stoltenberg, Kurt E.; Duchane, Alexander W.

    2014-11-01

    The rate of secure key generation (SKG) in quantum key distribution (QKD) is adversely affected by optical noise and loss in the quantum channel. In a free-space atmospheric channel, the scattering of sunlight into the channel can lead to quantum bit error ratios (QBERs) sufficiently large to preclude SKG. Furthermore, atmospheric turbulence limits the degree to which spatial filtering can reduce sky noise without introducing signal losses. A system simulation quantifies the potential benefit of tracking and higher-order adaptive optics (AO) technologies to SKG rates in a daytime satellite engagement scenario. The simulations are performed assuming propagation from a low-Earth orbit (LEO) satellite to a terrestrial receiver that includes an AO system comprised of a Shack-Hartmann wave-front sensor (SHWFS) and a continuous-face-sheet deformable mirror (DM). The effects of atmospheric turbulence, tracking, and higher-order AO on the photon capture efficiency are simulated using statistical representations of turbulence and a time-domain waveoptics hardware emulator. Secure key generation rates are then calculated for the decoy state QKD protocol as a function of the receiver field of view (FOV) for various pointing angles. The results show that at FOVs smaller than previously considered, AO technologies can enhance SKG rates in daylight and even enable SKG where it would otherwise be prohibited as a consequence of either background optical noise or signal loss due to turbulence effects.

  11. Remote observatory access via the Advanced Communications Technology Satellite

    NASA Technical Reports Server (NTRS)

    Horan, Stephen; Anderson, Kurt; Georghiou, Georghios

    1992-01-01

    An investigation of the potential for using the ACTS to provide the data distribution network for a distributed set of users of an astronomical observatory has been conducted. The investigation consisted of gathering the data and interface standards for the ACTS network and the observatory instrumentation and telecommunications devices. A simulation based on COMNET was then developed to test data transport configurations for real-time suitability. The investigation showed that the ACTS network should support the real-time requirements and allow for growth in the observatory needs for data transport.

  12. Methodology For Evaluation Of Technology Impacts In Space Electric Power Systems

    NASA Technical Reports Server (NTRS)

    Holda, Julie

    2004-01-01

    The Analysis and Management branch of the Power and Propulsion Office at NASA Glenn Research Center is responsible for performing complex analyses of the space power and in-space propulsion products developed by GRC. This work quantifies the benefits of the advanced technologies to support on-going advocacy efforts. The Power and Propulsion Office is committed to understanding how advancement in space technologies could benefit future NASA missions. They support many diverse projects and missions throughout NASA as well as industry and academia. The area of work that we are concentrating on is space technology investment strategies. Our goal is to develop a Monte Carlo-based tool to investigate technology impacts in space electric power systems. The framework is being developed at this stage, which will be used to set up a computer simulation of a space electric power system (EPS). The outcome is expected to be a probabilistic assessment of critical technologies and potential development issues. We are developing methods for integrating existing spreadsheet-based tools into the simulation tool. Also, work is being done on defining interface protocols to enable rapid integration of future tools. The first task was to select Monte Carlo-based simulation software for statistical modeling of the EPS model: I decided to learn and evaluate Palisade's @Risk and Risk Optimizer software and utilize their capabilities for the EPS model, and I also evaluated similar software packages (JMP, SPSS, Crystal Ball, VenSim, Analytica) available from other suppliers. The second task was to develop the framework for the tool, in which we had to define technology characteristics using weighting factors and probability distributions, define the simulation space, and add hard and soft constraints to the model. The third task is to incorporate (preliminary) cost factors into the model. A final task is developing a cross-platform solution of this framework.
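
    A minimal sketch of the kind of Monte Carlo framework described above (generic, with hypothetical characteristics and weighting factors, not the Glenn EPS model or the @Risk implementation): each technology characteristic is drawn from an assumed triangular distribution and combined into a system-level figure of merit whose distribution is then summarized.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Assumed technology characteristics: (low, mode, high) triangular distributions.
    specific_power = rng.triangular(120, 150, 200, n)      # W/kg, hypothetical
    efficiency     = rng.triangular(0.22, 0.28, 0.32, n)
    degradation    = rng.triangular(0.005, 0.01, 0.02, n)  # per year, hypothetical

    # Hypothetical weighting factors expressing relative importance.
    weights = {"specific_power": 0.5, "efficiency": 0.3, "degradation": 0.2}

    def norm(x, lo, hi):
        """Normalize a characteristic to [0, 1]."""
        return (x - lo) / (hi - lo)

    fom = (weights["specific_power"] * norm(specific_power, 120, 200)
           + weights["efficiency"]   * norm(efficiency, 0.22, 0.32)
           + weights["degradation"]  * (1.0 - norm(degradation, 0.005, 0.02)))

    print(f"figure of merit: mean={fom.mean():.3f}, "
          f"5th-95th pct=({np.percentile(fom, 5):.3f}, {np.percentile(fom, 95):.3f})")
    ```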

  13. Distributed collaborative environments for predictive battlespace awareness

    NASA Astrophysics Data System (ADS)

    McQuay, William K.

    2003-09-01

    The past decade has produced significant changes in the conduct of military operations: asymmetric warfare, the reliance on dynamic coalitions, stringent rules of engagement, increased concern about collateral damage, and the need for sustained air operations. Mission commanders need to assimilate a tremendous amount of information, make quick-response decisions, and quantify the effects of those decisions in the face of uncertainty. Situational assessment is crucial in understanding the battlespace. Decision support tools in a distributed collaborative environment offer the capability of decomposing complex multitask processes and distributing them over a dynamic set of execution assets that include modeling, simulations, and analysis tools. Decision support technologies can semi-automate activities, such as analysis and planning, that have a reasonably well-defined process and provide machine-level interfaces to refine the myriad of information that the commander must fuse. Collaborative environments provide the framework and integrate models, simulations, and domain-specific decision support tools for the sharing and exchanging of data, information, knowledge, and actions. This paper describes ongoing AFRL research efforts in applying distributed collaborative environments to predictive battlespace awareness.

  14. Learning from Massive Distributed Data Sets (Invited)

    NASA Astrophysics Data System (ADS)

    Kang, E. L.; Braverman, A. J.

    2013-12-01

    Technologies for remote sensing and ever-expanding computer experiments in climate science are generating massive data sets. Meanwhile, it has become common in all areas of large-scale science to have these 'big data' distributed over multiple physical locations, and moving large amounts of data can be impractical. In this talk, we will discuss efficient ways to summarize and learn from distributed data. We formulate a graphical model to mimic the main characteristics of a distributed-data network, including the size of the data sets and the speed of moving data. With this nominal model, we investigate the trade-off between prediction accuracy and the cost of data movement, theoretically and through simulation experiments. We will also discuss new implementations of spatial and spatio-temporal statistical methods optimized for distributed data.
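
    A toy version of the accuracy-versus-data-movement trade-off (an illustration only, not the authors' graphical model): each node ships a random subsample of its local data to a central site, and both the bytes moved and the error of a global quantile estimate are tracked as the subsample size grows; the node count and data sizes are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_nodes, n_local = 20, 50_000
    # Heterogeneous local data sets (different means per node).
    data = [rng.normal(loc=m, scale=1.0, size=n_local)
            for m in rng.uniform(-2, 2, n_nodes)]
    true_q90 = np.quantile(np.concatenate(data), 0.9)

    for k in (100, 1_000, 10_000, n_local):          # samples shipped per node
        shipped = np.concatenate([rng.choice(d, size=k, replace=False) for d in data])
        est = np.quantile(shipped, 0.9)
        bytes_moved = shipped.size * 8               # float64
        print(f"k={k:6d}  moved {bytes_moved/1e6:7.2f} MB  "
              f"|error of 90th percentile| = {abs(est - true_q90):.4f}")
    ```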

  15. Virtual reality as a tool for cross-cultural communication: an example from military team training

    NASA Astrophysics Data System (ADS)

    Downes-Martin, Stephen; Long, Mark; Alexander, Joanna R.

    1992-06-01

    A major problem with communication across cultures, whether professional or national, is that simple language translation is often insufficient to communicate the concepts. This is especially true when the communicators come from highly specialized fields of knowledge or from national cultures with long histories of divergence. This problem becomes critical when the goal of the communication is national negotiation dealing with such high-risk items as arms negotiation or trade wars. Virtual Reality technology has considerable potential for facilitating communication across cultures, by immersing the communicators within multiple visual representations of the concepts, and providing control over those representations. Military distributed team training provides a model for virtual reality suitable for cross-cultural communication such as negotiation. In both team training and negotiation, the participants must cooperate, agree on a set of goals, and achieve mastery over the concepts being negotiated. Team training technologies suitable for supporting cross-cultural negotiation exist (branch wargaming, computer image generation and visualization, distributed simulation), and have developed along different lines than traditional virtual reality technology. Team training de-emphasizes the realism of physiological interfaces between the human and the virtual reality, and emphasizes the interaction of humans with each other and with intelligent simulated agents within the virtual reality. This approach to virtual reality is suggested as being more fruitful for future work.

  16. Influence of the Distribution of Tag IDs on RFID Memoryless Anti-Collision Protocols

    PubMed Central

    Cmiljanic, Nikola; Landaluce, Hugo; Perallos, Asier; Arjona, Laura

    2017-01-01

    In recent years, Radio Frequency Identification (RFID) has become very popular. The main feature of this technology is that RFID tags do not require close handling, and no line of sight is required between the reader and the tags. RFID is a technology that uses radio frequencies to identify tags, which do not need to be positioned accurately relative to the reader. Tags share the communication channel, increasing the likelihood of a problem, viz., a message collision. Tree-based protocols can resolve these collisions, but require a uniform tag ID distribution; this means they are very dependent on the distribution of the IDs of the tags. Tag IDs are written in the tag and contain a predefined bit string of data. A study of the influence of the tag ID distribution on the protocols’ behaviour is proposed here. A new protocol, called the Flexible Query window Tree (FQwT), is presented to estimate the tag ID distribution, taking into consideration the type of distribution. The aim is to create a flexible anti-collision protocol in order to identify a set of tags that constitute an ID distribution. As a result, the reader classifies tags into groups determined by using a distribution estimator. Simulations show that the FQwT protocol contributes to significant reductions in identification time and energy consumption regardless of the type of ID distribution. PMID:28817070

  17. Influence of the Distribution of Tag IDs on RFID Memoryless Anti-Collision Protocols.

    PubMed

    Cmiljanic, Nikola; Landaluce, Hugo; Perallos, Asier; Arjona, Laura

    2017-08-17

    In recent years, Radio Frequency Identification (RFID) has become very popular. The main feature of this technology is that RFID tags do not require close handling, and no line of sight is required between the reader and the tags. RFID is a technology that uses radio frequencies to identify tags, which do not need to be positioned accurately relative to the reader. Tags share the communication channel, increasing the likelihood of a problem, viz., a message collision. Tree-based protocols can resolve these collisions, but require a uniform tag ID distribution; this means they are very dependent on the distribution of the IDs of the tags. Tag IDs are written in the tag and contain a predefined bit string of data. A study of the influence of the tag ID distribution on the protocols' behaviour is proposed here. A new protocol, called the Flexible Query window Tree (FQwT), is presented to estimate the tag ID distribution, taking into consideration the type of distribution. The aim is to create a flexible anti-collision protocol in order to identify a set of tags that constitute an ID distribution. As a result, the reader classifies tags into groups determined by using a distribution estimator. Simulations show that the FQwT protocol contributes to significant reductions in identification time and energy consumption regardless of the type of ID distribution.
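
    To see why memoryless tree protocols are sensitive to the ID distribution, the sketch below simulates the classic Query Tree protocol (the baseline such work builds on, not the FQwT itself) and counts reader queries for two different ID distributions; the tag count and ID length are hypothetical.

    ```python
    import random

    def query_tree_queries(ids):
        """Count reader queries needed by the basic Query Tree protocol."""
        queries, stack = 0, [""]
        while stack:
            prefix = stack.pop()
            queries += 1
            matching = [t for t in ids if t.startswith(prefix)]
            if len(matching) <= 1:                       # idle slot or one tag identified
                continue
            stack.extend((prefix + "0", prefix + "1"))   # collision: split the query
        return queries

    random.seed(1)
    n_tags, bits = 200, 32
    uniform   = random.sample(range(2**bits), n_tags)    # IDs spread over the full space
    clustered = random.sample(range(1024), n_tags)       # IDs packed into a narrow range
    to_bits = lambda v: format(v, f"0{bits}b")

    print("uniform IDs:  ", query_tree_queries([to_bits(v) for v in uniform]), "queries")
    print("clustered IDs:", query_tree_queries([to_bits(v) for v in clustered]), "queries")
    ```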

  18. Characteristics of the mixing volume model with the interactions among spatially distributed particles for Lagrangian simulations of turbulent mixing

    NASA Astrophysics Data System (ADS)

    Watanabe, Tomoaki; Nagata, Koji

    2016-11-01

    The mixing volume model (MVM), a mixing model for molecular diffusion in Lagrangian simulations of turbulent mixing problems, is proposed based on the interactions among spatially distributed particles in a finite volume. The mixing timescale in the MVM is derived by comparing the model with the subgrid-scale scalar variance equation. An a priori test of the MVM is conducted based on direct numerical simulations (DNS) of planar jets. The MVM is shown to predict the mean effects of molecular diffusion well under various conditions. However, the predicted value of the molecular diffusion term is positively correlated with the exact value in the DNS only when the number of mixing particles is larger than two. Furthermore, the MVM is tested in a hybrid implicit large-eddy-simulation/Lagrangian-particle-simulation (ILES/LPS). The ILES/LPS with the present mixing model predicts the decay of the scalar variance in planar jets well. This work was supported by JSPS KAKENHI Nos. 25289030 and 16K18013. The numerical simulations presented in this manuscript were carried out on the high performance computing system (NEC SX-ACE) in the Japan Agency for Marine-Earth Science and Technology.
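
    For readers unfamiliar with Lagrangian mixing models, the sketch below implements the classical IEM (interaction by exchange with the mean) closure, a much simpler relative of the MVM in which particle scalars in a cell relax toward the cell mean over a mixing timescale; it is not the MVM formulation itself, and the timescale and particle count are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Scalar values carried by notional particles in one sub-grid cell.
    phi = rng.choice([0.0, 1.0], size=64)      # initially unmixed (pure 0 or pure 1)
    tau_mix, dt, nsteps = 1.0e-3, 1.0e-4, 50   # mixing timescale and time step (assumed)

    for _ in range(nsteps):
        mean = phi.mean()
        phi += -dt / tau_mix * (phi - mean)    # IEM: relax each particle toward the mean

    print(f"scalar variance decayed to {phi.var():.2e} (initially ~0.25)")
    ```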

  19. Diffuse characteristics study of laser target board using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Yang, Pengling; Wu, Yong; Wang, Zhenbao; Tao, Mengmeng; Wu, Junjie; Wang, Ping; Yan, Yan; Zhang, Lei; Feng, Gang; Zhu, Jinghui; Feng, Guobin

    2013-05-01

    In this paper, the Torrance-Sparrow and Oren-Nayar models are adopted to study the diffuse characteristics of a laser target board. The models, which are based on geometric optics, assume that rough surfaces are made up of a series of symmetric V-groove cavities with different slopes at the microscopic level. The distribution of the slopes of the V-grooves is modeled with a Beckmann distribution function, and every microfacet of a V-groove cavity is assumed to behave like a perfect mirror, meaning that reflection at the microfacet follows the Fresnel law. The masking and shadowing effects of the rough surface are also taken into account through a geometric attenuation factor. The Monte Carlo method is used to simulate the diffuse reflectance distribution of laser target boards with different materials and processing technology, and all the calculated results are verified by experiment. It is shown that the profile of the bidirectional reflectance distribution curve is lobe-shaped, with the maximum lying along the mirror reflection direction. The profile is narrower for a lower roughness value and broader for a higher roughness value. The refractive index of the target material also influences the intensity and distribution of the diffuse reflectance of the laser target surface.
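
    The sketch below reproduces the in-plane essence of such a microfacet Monte Carlo calculation in simplified form (no Fresnel weighting and no masking/shadowing attenuation factor): facet tilt angles are drawn from a Gaussian slope distribution standing in for the Beckmann model, each ray is mirror-reflected about its facet normal, and the outgoing angles are histogrammed to show the lobe around the specular direction. Roughness and incidence angle are assumed values.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_rays = 200_000
    theta_i, sigma = np.deg2rad(30.0), np.deg2rad(8.0)   # incidence angle, assumed roughness

    alpha = rng.normal(0.0, sigma, n_rays)               # facet tilt angles (slope distribution)
    d = np.array([np.sin(theta_i), -np.cos(theta_i)])    # incident direction (downward)
    n = np.stack([np.sin(alpha), np.cos(alpha)])         # per-facet unit normals, shape (2, N)

    # Mirror reflection about each microfacet normal: r = d - 2 (d . n) n
    d_dot_n = d[0] * n[0] + d[1] * n[1]
    r = d[:, None] - 2.0 * d_dot_n * n

    keep = r[1] > 0                                      # discard rays sent below the surface
    theta_out = np.degrees(np.arctan2(r[0, keep], r[1, keep]))

    hist, edges = np.histogram(theta_out, bins=np.arange(-90, 91, 5))
    peak_bin = edges[np.argmax(hist)]
    print(f"reflected lobe peaks near {peak_bin:+.0f} deg "
          f"(specular direction is {np.degrees(theta_i):+.0f} deg)")
    ```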

  20. A Functional Comparison of Lunar Regoliths and Their Simulants

    NASA Technical Reports Server (NTRS)

    Rickman, D.; Edmunson, J.; McLemore, C.

    2012-01-01

    Lunar regolith simulants are essential to the development of technology for human exploration of the Moon. Any equipment that will interact with the surface environment must be tested with simulant to mitigate risk. To reduce the greatest amount of risk, the simulant must replicate the lunar surface as well as possible. To quantify the similarities and differences between simulants, the Figures of Merit were developed. The Figures of Merit software compares simulants and regolith by particle size, particle shape, density, and bulk chemistry and mineralogy; these four properties dictate the majority of the remaining characteristics of a geologic material. There are limitations to both the current Figures of Merit approach and simulants in general. The effect of particle textures is lacking in the Figures of Merit software, and research into this topic has only recently begun with applications to simulants. In addition, not all of the properties of lunar regolith are defined sufficiently for simulant reproduction or comparison; for example, the size distribution of particles greater than 1 centimeter and the makeup of particles less than 10 micrometers are not well known. For simulants, contamination by terrestrial weathering products or undesired trace phases in feedstock material is a major issue. Vapor-deposited rims have not yet been created for simulants. Fortunately, previous limitations such as the lack of agglutinates in simulants have been addressed, and commercial companies are now making agglutinate material for simulants. Despite some limitations, the Figures of Merit sufficiently quantify the comparison between simulants and regolith for useful application in lunar surface technology. Over time, the compilation and analysis of simulant user data will add an advantageous predictive capability to the Figures of Merit, accurately relating Figures of Merit characteristics to simulant user parameters.

  1. Simulation of SiO2 etching in an inductively coupled CF4 plasma

    NASA Astrophysics Data System (ADS)

    Xu, Qing; Li, Yu-Xing; Li, Xiao-Ning; Wang, Jia-Bin; Yang, Fan; Yang, Yi; Ren, Tian-Ling

    2017-02-01

    Plasma etching technology is an indispensable processing method in the manufacturing of semiconductor devices. Because of its high fluorine/carbon ratio, CF4 gas is often used for etching SiO2. The commercial software ESI-CFD is used to simulate the plasma etching process with an inductively coupled plasma model: CFD-ACE is used to simulate the chamber, and CFD-TOPO is used to simulate the surface of the sample. The effects of chamber pressure, bias voltage, and ICP power on the reactant particles were investigated, and the etching profiles of SiO2 were obtained. Simulation can be used to predict the effects of reaction conditions on the density, energy, and angular distributions of reactant particles, which helps guide the etching process.

  2. Effects of Home Energy Management Systems on Distribution Utilities and Feeders Under Various Market Structures: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruth, Mark; Pratt, Annabelle; Lunacek, Monte

    2015-07-17

    The combination of distributed energy resources (DER) and retail tariff structures to provide benefits to both utility consumers and the utilities is poorly understood. To improve understanding, an Integrated Energy System Model (IESM) is being developed to simulate the physical and economic aspects of DER technologies, the buildings where they reside, and the feeders servicing them. The IESM was used to simulate 20 houses with home energy management systems (HEMS) on a single feeder under a time-of-use tariff to estimate economic and physical impacts on both the households and the distribution utilities. HEMS reduce consumers' electric bills by precooling houses in the hours before peak electricity pricing. Household savings are greater than the reduction in utility net revenue, indicating that HEMS can provide a societal benefit, provided tariffs are structured so that utilities remain solvent. Utilization of HEMS reduces peak loads during high-price hours but shifts load to hours with off-peak and shoulder prices, resulting in a higher overall peak load.
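
    The bill-versus-peak mechanism can be reproduced in a toy calculation (an illustration only, not the IESM): a portion of the cooling energy is shifted from the peak-price hours to the hours immediately before, and the household bill and the hourly peak are recomputed under an assumed three-tier time-of-use tariff with hypothetical loads.

    ```python
    import numpy as np

    hours = np.arange(24)
    price = np.where((hours >= 16) & (hours < 20), 0.30,          # peak, $/kWh (assumed)
            np.where((hours >= 12) & (hours < 16), 0.15, 0.08))   # shoulder / off-peak

    load = np.full(24, 1.0)                     # kWh per hour baseline (hypothetical)
    load[12:21] += 1.5                          # afternoon/evening cooling load

    precool = load.copy()
    shifted = 0.6 * load[16:20].sum()           # portion of peak-hour load moved earlier
    precool[16:20] *= 0.4
    precool[12:16] += shifted / 4               # pre-cool in the four shoulder hours

    for name, l in (("no HEMS", load), ("precooling HEMS", precool)):
        print(f"{name:16s} bill = ${np.dot(l, price):5.2f},  "
              f"peak hourly load = {l.max():.2f} kWh")
    ```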

  3. Development of the electromagnetic technology for broken rail detection from a mobile platform

    NASA Astrophysics Data System (ADS)

    Plotnikov, Yuri; Raghunathan, Arun; Kumar, Ajith; Noffsinger, Joseph; Fries, Jeffrey; Ehret, Steven; Frangieh, Tannous; Palanganda, Samhitha

    2016-02-01

    Timely detection of breaks in running rails remains a topic of significant importance for the railroad industry. GE has been investigating new ideas for Rail Integrity Monitoring (RIM) technology that can be implemented on a wide range of rolling stock platforms, including locomotives and passenger and freight cars. The focus of the project is to establish a simple, non-contact, and inexpensive means of nondestructive inspection by fusing known solutions with new technology development, resulting in detection with high reliability. A scaled-down model of a typical locomotive-track system has been developed at GE Global Research for detailed study of the detection process. In addition, a finite element model has been established and used to understand the distribution of the magnetic field and currents in such a system. Both models use the rail and wheel-axle geometry to provide electric current and magnetic field distributions close to the real-world phenomenon. Initial magnetic field maps were obtained by scanning a 1:15 model constructed of steel bars using a 3D scanner and an inductive coil. Sensitivity to a broken rail located between two locomotive axles, simulated by an opening in this metallic frame, was demonstrated. Further investigation and optimization were conducted on a larger, 1:3 scale, physical model and by running mathematical simulations. Special attention was paid to consistency between the finite element and physical model results. The obtained results allowed establishment of a working frequency range, inductive injection of current into the rail-wheel-axle loop, and measurement of the electromagnetic response to a broken rail. Verification and full-scale system prototype tests follow the laboratory experiments and mathematical simulations.

  4. Perspectives on the Future of CFD

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan

    2000-01-01

    This viewgraph presentation gives an overview of the future of computational fluid dynamics (CFD), which in the past has pioneered the field of flow simulation. Over time, CFD has progressed along with computing power. Numerical methods have advanced as CPU and memory capacity have increased. Complex configurations are routinely computed now, and direct numerical simulations (DNS) and large eddy simulations (LES) are used to study turbulence. As computing resources have shifted to parallel and distributed platforms, computer science aspects such as scalability (algorithmic and implementation), portability, and transparent coding have advanced. Examples of potential future (or current) challenges include risk assessment, limitations of heuristic models, and the development of CFD and information technology (IT) tools.

  5. An Overview of the Formation and Attitude Control System for the Terrestrial Planet Finder Formation Flying Interferometer

    NASA Technical Reports Server (NTRS)

    Scharf, Daniel P.; Hadaegh, Fred Y.; Rahman, Zahidul H.; Shields, Joel F.; Singh, Gurkipal; Wette, Matthew R.

    2004-01-01

    The Terrestrial Planet Finder formation flying Interferometer (TPF-I) will be a five-spacecraft, precision formation operating near the second Sun-Earth Lagrange point. As part of technology development for TPF-I, a formation and attitude control system (FACS) is being developed that achieves the precision and functionality needed for the TPF-I formation and that will be demonstrated in a distributed, real-time simulation environment. In this paper we present an overview of FACS and discuss in detail its formation estimation, guidance and control architectures and algorithms. Since FACS is currently being integrated into a high-fidelity simulation environment, component simulations demonstrating algorithm performance are presented.

  6. An Overview of the Formation and Attitude Control System for the Terrestrial Planet Finder Formation Flying Interferometer

    NASA Technical Reports Server (NTRS)

    Scharf, Daniel P.; Hadaegh, Fred Y.; Rahman, Zahidul H.; Shields, Joel F.; Singh, Gurkipal

    2004-01-01

    The Terrestrial Planet Finder formation flying Interferometer (TPF-I) will be a five-spacecraft, precision formation operating near a Sun-Earth Lagrange point. As part of technology development for TPF-I, a formation and attitude control system (FACS) is being developed that achieves the precision and functionality associated with the TPF-I formation. This FACS will be demonstrated in a distributed, real-time simulation environment. In this paper we present an overview of the FACS and discuss in detail its constituent formation estimation, guidance and control architectures and algorithms. Since the FACS is currently being integrated into a high-fidelity simulation environment, component simulations demonstrating algorithm performance are presented.

  7. Environments for online maritime simulators with cloud computing capabilities

    NASA Astrophysics Data System (ADS)

    Raicu, Gabriel; Raicu, Alexandra

    2016-12-01

    This paper presents the cloud computing environments, network principles, and methods for graphical development in realistic naval simulation, naval robotics, and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open source solutions designed for educational purposes. Realistic rendering of maritime environments requires near real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts, coupled with the latest achievements in virtual and augmented reality, will enhance the overall experience, leading to new developments and innovations. We have to deal with a multiprocessing situation using advanced technologies and distributed applications with remote ship scenarios and automation of ship operations.

  8. A network-based distributed, media-rich computing and information environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, R.L.

    1995-12-31

    Sunrise is a Los Alamos National Laboratory (LANL) project started in October 1993. It is intended to be a prototype National Information Infrastructure development project. A main focus of Sunrise is to tie together enabling technologies (networking, object-oriented distributed computing, graphical interfaces, security, multi-media technologies, and data-mining technologies) with several specific applications. A diverse set of application areas was chosen to ensure that the solutions developed in the project are as generic as possible. Some of the application areas are materials modeling, medical records and image analysis, transportation simulations, and K-12 education. This paper provides a description of Sunrise and a view of the architecture and objectives of this evolving project. The primary objectives of Sunrise are three-fold: (1) To develop common information-enabling tools for advanced scientific research and its applications to industry; (2) To enhance the capabilities of important research programs at the Laboratory; (3) To define a new way of collaboration between computer science and industrially-relevant research.

  9. Multiple technologies applied to characterization of the porosity and permeability of the Biscayne aquifer, Florida

    USGS Publications Warehouse

    Cunningham, K.J.; Sukop, M.C.

    2011-01-01

    Research is needed to determine how seepage-control actions planned by the Comprehensive Everglades Restoration Plan (CERP) will affect recharge, groundwater flow, and discharge within the dual-porosity karstic Biscayne aquifer where it extends eastward from the Everglades to Biscayne Bay. A key issue is whether the plan can be accomplished without causing urban flooding in adjacent populated areas and diminishing coastal freshwater flow needed in the restoration of the ecologic systems. Predictive simulation of groundwater flow is a prudent approach to understanding hydrologic change and potential ecologic impacts. A fundamental problem to simulation of karst groundwater flow is how best to represent aquifer heterogeneity. Currently, U.S. Geological Survey (USGS) researchers and academic partners are applying multiple innovative technologies to characterize the spatial distribution of porosity and permeability within the Biscayne aquifer.

  10. Geographic Visualization of Power-Grid Dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas R.

    2015-06-18

    The visualization enables the simulation analyst to see changes in frequency through time and space. With this technology, the analyst has a bird's eye view of the frequency at loads and generators as the simulated power system responds to the loss of a generator, spikes in load, and other contingencies. The significance of a contingency to the operation of an electrical power system depends critically on how the resulting transients evolve in time and space. Consequently, these dynamic events can only be understood when seen in their proper geographic context. This understanding is indispensable to engineers working on the next generation of distributed sensing and control systems for the smart grid. By making possible a natural and intuitive presentation of dynamic behavior, our new visualization technology is a situational-awareness tool for power-system engineers.

  11. Enhancing audiovisual experience with haptic feedback: a survey on HAV.

    PubMed

    Danieau, F; Lecuyer, A; Guillotel, P; Fleureau, J; Mollet, N; Christie, M

    2013-01-01

    Haptic technology has been widely employed in applications ranging from teleoperation and medical simulation to art and design, including entertainment, flight simulation, and virtual reality. Today there is a growing interest among researchers in integrating haptic feedback into audiovisual systems. A new medium emerges from this effort: haptic-audiovisual (HAV) content. This paper presents the techniques, formalisms, and key results pertinent to this medium. We first review the three main stages of the HAV workflow: the production, distribution, and rendering of haptic effects. We then highlight the pressing necessity for evaluation techniques in this context and discuss the key challenges in the field. By building on existing technologies and tackling the specific challenges of the enhancement of audiovisual experience with haptics, we believe the field presents exciting research perspectives whose financial and societal stakes are significant.

  12. Using spatial principles to optimize distributed computing for enabling the physical science discoveries

    PubMed Central

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-01-01

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779

  13. Using spatial principles to optimize distributed computing for enabling the physical science discoveries.

    PubMed

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-04-05

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century.

  14. Magnetic Flux Distribution of Linear Machines with Novel Three-Dimensional Hybrid Magnet Arrays

    PubMed Central

    Yao, Nan; Yan, Liang; Wang, Tianyi; Wang, Shaoping

    2017-01-01

    The objective of this paper is to propose a novel tubular linear machine with hybrid permanent magnet arrays and multiple movers, which could be employed for either actuation or sensing technology. The hybrid magnet array produces flux distribution on both sides of windings, and thus helps to increase the signal strength in the windings. The multiple movers are important for airspace technology, because they can improve the system’s redundancy and reliability. The proposed design concept is presented, and the governing equations are obtained based on source free property and Maxwell equations. The magnetic field distribution in the linear machine is thus analytically formulated by using Bessel functions and harmonic expansion of magnetization vector. Numerical simulation is then conducted to validate the analytical solutions of the magnetic flux field. It is proved that the analytical model agrees with the numerical results well. Therefore, it can be utilized for the formulation of signal or force output subsequently, depending on its particular implementation. PMID:29156577

  15. Magnetic Flux Distribution of Linear Machines with Novel Three-Dimensional Hybrid Magnet Arrays.

    PubMed

    Yao, Nan; Yan, Liang; Wang, Tianyi; Wang, Shaoping

    2017-11-18

    The objective of this paper is to propose a novel tubular linear machine with hybrid permanent magnet arrays and multiple movers, which could be employed for either actuation or sensing technology. The hybrid magnet array produces flux distribution on both sides of windings, and thus helps to increase the signal strength in the windings. The multiple movers are important for airspace technology, because they can improve the system's redundancy and reliability. The proposed design concept is presented, and the governing equations are obtained based on source free property and Maxwell equations. The magnetic field distribution in the linear machine is thus analytically formulated by using Bessel functions and harmonic expansion of magnetization vector. Numerical simulation is then conducted to validate the analytical solutions of the magnetic flux field. It is proved that the analytical model agrees with the numerical results well. Therefore, it can be utilized for the formulation of signal or force output subsequently, depending on its particular implementation.

  16. Knowledge Management for Distributed Tracking

    DTIC Science & Technology

    2008-11-01

    ...bearing cross-fix algorithm. Modeling and simulation was used to generate test data that intelligent agents ingested to search for information that... potential of this approach to reduce information overload for the operator. The KMDT approach demonstrates how knowledge management technologies can be...

  17. Contamination and Micropropulsion Technology

    DTIC Science & Technology

    2012-07-01


  18. CPU and GPU-based Numerical Simulations of Combustion Processes

    DTIC Science & Technology

    2012-04-27

    Magnetohydrodynamic Augmentation of the Pulse Detonation Rocket Engines... Pulse Detonation Rocket-Induced MHD Ejector (PDRIME): energy extraction from the exhaust flow by an MHD generator; seeded air stream acceleration by an MHD... accelerator for thrust enhancement and control. Alternative concept: magnetic piston, in which MHD extracts energy during the PDE blowdown process...

  19. Bibliography--Unclassified Technical Reports, Special Reports, and Technical Notes: FY 1982.

    DTIC Science & Technology

    1982-11-01

    ...in each category are listed in chronological order under seven areas: manpower management, personnel administration, organization management, education... Technical reports listed that have unlimited distribution can also be obtained from the National Technical Information Service, 5285 Port Royal... simulations of manpower systems. This research exploits the technology of computer-managed large-scale data bases.

  20. Technology developments integrating a space network communications testbed

    NASA Technical Reports Server (NTRS)

    Kwong, Winston; Jennings, Esther; Clare, Loren; Leang, Dee

    2006-01-01

    As future manned and robotic space exploration missions involve more complex systems, it is essential to verify, validate, and optimize such systems through simulation and emulation in a low-cost testbed environment. The goal of such a testbed is to perform detailed testing of advanced space and ground communications networks, technologies, and client applications that are essential for future space exploration missions. We describe the development of new technologies enhancing our Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) that enables its integration in a distributed space communications testbed. MACHETE combines orbital modeling, link analysis, and protocol and service modeling to quantify system performance based on comprehensive considerations of different aspects of space missions.

  1. Efficient scatter model for simulation of ultrasound images from computed tomography data

    NASA Astrophysics Data System (ADS)

    D'Amato, J. P.; Lo Vercio, L.; Rubi, P.; Fernandez Vera, E.; Barbuzza, R.; Del Fresno, M.; Larrabide, I.

    2015-12-01

    Background and motivation: Real-time ultrasound simulation refers to the process of computationally creating fully synthetic ultrasound images instantly. Due to the high value of specialized low-cost training for healthcare professionals, there is a growing interest in the use of this technology and the development of high-fidelity systems that simulate the acquisition of echographic images. The objective is to create an efficient and reproducible simulator that can run either on notebooks or desktops using low-cost devices. Materials and methods: We present an interactive ultrasound simulator based on CT data. The simulator is based on ray-casting and provides real-time interaction capabilities. The simulation of scattering that is coherent with the transducer position in real time is also introduced. Such noise is produced using a simplified model of multiplicative noise and convolution with point spread functions (PSF) tailored for this purpose. Results: The computation of scattering maps was revised for improved performance. This allowed a more efficient simulation of coherent scattering in the synthetic echographic images while providing highly realistic results. We describe quality and performance metrics to validate these results, with a performance of up to 55 fps achieved. Conclusion: The proposed technique for real-time scattering modeling provides realistic yet computationally efficient scatter distributions. The error between the original image and the simulated scattering image was compared for the proposed method and the state of the art, showing negligible differences in its distribution.
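
    A minimal sketch of the scatter-texture idea described above, i.e., multiplicative noise convolved with a point spread function (with an assumed separable Gaussian-times-cosine PSF and a random echogenicity map standing in for the CT-derived one; not the authors' simulator):

    ```python
    import numpy as np
    from scipy.signal import fftconvolve, hilbert

    rng = np.random.default_rng(0)
    nz, nx = 256, 128                                     # axial x lateral samples

    echogenicity = 0.5 + 0.5 * rng.random((nz, nx))       # stand-in for the CT-derived map
    scatter = echogenicity * rng.rayleigh(1.0, (nz, nx))  # multiplicative scatter model

    # Separable PSF: Gaussian envelope modulated axially by the pulse carrier.
    z = np.arange(-16, 17)[:, None]
    x = np.arange(-6, 7)[None, :]
    psf = np.exp(-(z / 6.0) ** 2 - (x / 3.0) ** 2) * np.cos(2 * np.pi * z / 8.0)

    rf = fftconvolve(scatter, psf, mode="same")           # simulated RF image
    envelope = np.abs(hilbert(rf, axis=0))                # axial envelope detection
    bmode = 20 * np.log10(envelope / envelope.max() + 1e-6)  # log-compressed B-mode, dB

    print(f"B-mode dynamic range used: {bmode.min():.1f} dB to {bmode.max():.1f} dB")
    ```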

  2. A "total parameter estimation" method in the verification of distributed hydrological models

    NASA Astrophysics Data System (ADS)

    Wang, M.; Qin, D.; Wang, H.

    2011-12-01

    Conventionally, hydrological models are used for runoff or flood forecasting, and model parameters are therefore commonly estimated from discharge measurements at catchment outlets. With advances in hydrological science and computer technology, distributed hydrological models based on physical mechanisms, such as SWAT, MIKESHE, and WEP, have gradually become the mainstream models in hydrology. However, the assessment of distributed hydrological models and the determination of model parameters still rely on runoff and, occasionally, groundwater level measurements. It is essential in many countries, including China, to understand the local and regional water cycle: not only do we need to simulate the runoff generation process for flood forecasting in wet areas, we also need to grasp the water cycle pathways and the consumption and transformation processes in arid and semi-arid regions for conservation and integrated water resources management. Because a distributed hydrological model can simulate the physical processes within a catchment, it can provide a more realistic representation of the actual water cycle. Runoff is the combined result of various hydrological processes, so using runoff alone for parameter estimation is inherently problematic, and the accuracy of the resulting model is difficult to assess. In particular, in arid areas such as the Haihe River Basin in China, runoff accounts for only 17% of rainfall and is concentrated in the rainy season from June to August each year; during other months, many of the perennial rivers within the basin dry up. Thus, runoff simulation alone does not fully exploit the distributed hydrological model in arid and semi-arid regions. This paper proposes a "total parameter estimation" method to verify distributed hydrological models against multiple water cycle processes, including runoff, evapotranspiration, groundwater, and soil water, and applies it to the Haihe River Basin in China. The application results demonstrate that this comprehensive testing method is very useful in the development of a distributed hydrological model and provides a new way of thinking in the hydrological sciences.

  3. Optothermal transfer simulation in laser-irradiated human dentin.

    PubMed

    Moriyama, Eduardo H; Zangaro, Renato A; Lobo, Paulo D C; Villaverde, Antonio Balbin; Pacheco, Marcos T; Watanabe, Ii-Sei; Vitkin, Alex

    2003-04-01

    Laser technology has been studied as a potential replacement for the conventional dental drill. However, to prevent pulpal cell damage, information related to safety parameters for using high-power lasers on oral mineralized tissues is needed. In this study, the heat distribution profiles at the surface and subsurface regions of human dentin samples irradiated with a Nd:YAG laser were simulated using the Crank-Nicolson finite difference method for different laser energies and pulse durations. The heat distribution throughout the dentin layer, from the external dentin surface to the pulp chamber wall, was calculated in each case to investigate the details of pulsed laser-hard dental tissue interactions. The results showed that the final temperatures at the pulp chamber wall and at the dentin surface are strongly dependent on the pulse duration, exposure time, and the energy contained in each pulse.
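
    For reference, a minimal 1D Crank-Nicolson diffusion solver of the kind used for such heat-transfer calculations; the thermal diffusivity, slab thickness, boundary temperatures, and time step are assumed placeholder values, not the paper's laser parameters.

    ```python
    import numpy as np
    from scipy.linalg import solve_banded

    # 1D heat conduction through a slab (e.g., a dentin layer), Crank-Nicolson scheme.
    a_diff = 1.9e-7         # thermal diffusivity, m^2/s (assumed representative value)
    L, nx = 2.0e-3, 101     # slab thickness 2 mm, grid points
    dx = L / (nx - 1)
    dt = 1.0e-3             # s
    r = a_diff * dt / (2 * dx**2)

    T = np.full(nx, 37.0)   # initial temperature, deg C
    T[0] = 90.0             # fixed heated-surface temperature (hypothetical)

    # Banded matrix for (I - r*D2) on interior nodes (Dirichlet boundaries).
    n = nx - 2
    ab = np.zeros((3, n))
    ab[0, 1:] = -r          # upper diagonal
    ab[1, :] = 1 + 2 * r    # main diagonal
    ab[2, :-1] = -r         # lower diagonal

    for _ in range(500):    # 0.5 s of simulated time
        rhs = T[1:-1] + r * (T[2:] - 2 * T[1:-1] + T[:-2])
        rhs[0] += r * T[0]          # boundary contributions (T[0], T[-1] held fixed)
        rhs[-1] += r * T[-1]
        T[1:-1] = solve_banded((1, 1), ab, rhs)

    idx = int(round(0.2e-3 / dx))   # node 0.2 mm below the heated surface
    print(f"temperature 0.2 mm below the surface after 0.5 s: {T[idx]:.1f} deg C")
    ```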

  4. Wavelength tunable InGaN/GaN nano-ring LEDs via nano-sphere lithography

    PubMed Central

    Wang, Sheng-Wen; Hong, Kuo-Bin; Tsai, Yu-Lin; Teng, Chu-Hsiang; Tzou, An-Jye; Chu, You-Chen; Lee, Po-Tsung; Ku, Pei-Cheng; Lin, Chien-Chung; Kuo, Hao-Chung

    2017-01-01

    In this research, nano-ring light-emitting diodes (NRLEDs) with different wall width (120 nm, 80 nm and 40 nm) were fabricated by specialized nano-sphere lithography technology. Through the thinned wall, the effective bandgaps of nano-ring LEDs can be precisely tuned by reducing the strain inside the active region. Photoluminescence (PL) and time-resolved PL measurements indicated the lattice-mismatch induced strain inside the active region was relaxed when the wall width is reduced. Through the simulation, we can understand the strain distribution of active region inside NRLEDs. The simulation results not only revealed the exact distribution of strain but also predicted the trend of wavelength-shifted behavior of NRLEDs. Finally, the NRLEDs devices with four-color emission on the same wafer were demonstrated. PMID:28256529

  5. Simulation in paediatric urology and surgery, part 2: An overview of simulation modalities and their applications.

    PubMed

    Nataraja, R M; Webb, N; Lopez, P J

    2018-04-01

    Surgical training has changed radically in the last few decades. The traditional Halstedian model of time-bound apprenticeship has been replaced with competency-based training. In our previous article, we presented an overview of learning theory relevant to clinical teaching; a summary for the busy paediatric surgeon and urologist. We introduced the concepts underpinning current changes in surgical education and training. In this next article, we give an overview of the various modalities of surgical simulation, the educational principles that underlie them, and potential applications in clinical practice. These modalities include; open surgical models and trainers, laparoscopic bench trainers, virtual reality trainers, simulated patients and role-play, hybrid simulation, scenario-based simulation, distributed simulation, virtual reality, and online simulation. Specific examples of technology that may be used for these modalities are included but this is not a comprehensive review of all available products. Copyright © 2018 Journal of Pediatric Urology Company. Published by Elsevier Ltd. All rights reserved.

  6. A Radar-Enabled Collaborative Sensor Network Integrating COTS Technology for Surveillance and Tracking

    PubMed Central

    Kozma, Robert; Wang, Lan; Iftekharuddin, Khan; McCracken, Ernest; Khan, Muhammad; Islam, Khandakar; Bhurtel, Sushil R.; Demirer, R. Murat

    2012-01-01

    The feasibility of using Commercial Off-The-Shelf (COTS) sensor nodes is studied in a distributed network, aiming at dynamic surveillance and tracking of ground targets. Data acquisition by low-cost (<$50 US) miniature low-power radar through a wireless mote is described. We demonstrate the detection, ranging and velocity estimation, classification and tracking capabilities of the mini-radar, and compare results to simulations and manual measurements. Furthermore, we supplement the radar output with other sensor modalities, such as acoustic and vibration sensors. This method provides innovative solutions for detecting, identifying, and tracking vehicles and dismounts over a wide area in noisy conditions. This study presents a step towards distributed intelligent decision support and demonstrates effectiveness of small cheap sensors, which can complement advanced technologies in certain real-life scenarios. PMID:22438713

  7. An Agent-Based Dynamic Model for Analysis of Distributed Space Exploration Architectures

    NASA Astrophysics Data System (ADS)

    Sindiy, Oleg V.; DeLaurentis, Daniel A.; Stein, William B.

    2009-07-01

    A range of complex challenges, but also potentially unique rewards, underlie the development of exploration architectures that use a distributed, dynamic network of resources across the solar system. From a methodological perspective, the prime challenge is to systematically model the evolution (and quantify comparative performance) of such architectures, under uncertainty, to effectively direct further study of specialized trajectories, spacecraft technologies, concept of operations, and resource allocation. A process model for System-of-Systems Engineering is used to define time-varying performance measures for comparative architecture analysis and identification of distinguishing patterns among interoperating systems. Agent-based modeling serves as the means to create a discrete-time simulation that generates dynamics for the study of architecture evolution. A Solar System Mobility Network proof-of-concept problem is introduced representing a set of longer-term, distributed exploration architectures. Options within this set revolve around deployment of human and robotic exploration and infrastructure assets, their organization, interoperability, and evolution, i.e., a system-of-systems. Agent-based simulations quantify relative payoffs for a fully distributed architecture (which can be significant over the long term), the latency period before they are manifest, and the up-front investment (which can be substantial compared to alternatives). Verification and sensitivity results provide further insight on development paths and indicate that the framework and simulation modeling approach may be useful in architectural design of other space exploration mass, energy, and information exchange settings.

  8. Competition and Cooperation of Distributed Generation and Power System

    NASA Astrophysics Data System (ADS)

    Miyake, Masatoshi; Nanahara, Toshiya

    Advances in distributed generation technologies, together with the deregulation of the electric power industry, can lead to a massive introduction of distributed generation. Since most distributed generation will be interconnected to a power system, coordination and competition between distributed generators and large-scale power sources would be a vital issue in realizing a more desirable energy system in the future. This paper analyzes competition between electric utilities and cogenerators from the viewpoints of economics and energy efficiency, based on simulation results for an energy system that includes a cogeneration system. First, we examine the best-response correspondence of an electric utility and a cogenerator with a noncooperative game approach and obtain a Nash equilibrium point. Secondly, we examine, through global optimization, the optimum strategy that attains the highest social surplus and the highest energy efficiency.
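
    A compact illustration of finding a Nash equilibrium by iterating best responses, using a generic Cournot-style quantity game as a stand-in for the utility/cogenerator interaction; the demand and cost numbers are hypothetical, not those of the paper's energy-system model.

    ```python
    # Two players (an electric utility and a cogenerator) choose output quantities.
    # Inverse demand: p = a - b*(q1 + q2); marginal costs c1, c2.
    # Best response of player i: q_i = (a - c_i - b*q_j) / (2*b).
    a, b = 100.0, 1.0
    c_utility, c_cogen = 20.0, 30.0       # hypothetical marginal costs

    q_u, q_c = 0.0, 0.0
    for _ in range(100):                  # best-response iteration converges here
        q_u = max(0.0, (a - c_utility - b * q_c) / (2 * b))
        q_c = max(0.0, (a - c_cogen - b * q_u) / (2 * b))

    price = a - b * (q_u + q_c)
    print(f"Nash equilibrium: utility q={q_u:.2f}, cogenerator q={q_c:.2f}, price={price:.2f}")
    # Analytical Cournot equilibrium for comparison:
    # q_u = (a - 2*c_utility + c_cogen)/(3*b) = 30, q_c = (a - 2*c_cogen + c_utility)/(3*b) = 20
    ```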

  9. Online location of a break in water distribution systems

    NASA Astrophysics Data System (ADS)

    Liang, Jianwen; Xiao, Di; Zhao, Xinhua; Zhang, Hongwei

    2003-08-01

    Breaks often occur in urban water distribution systems under severely cold weather, or due to corrosion of pipes, deformation of the ground, etc., and they cannot easily be located, especially immediately after the event. This paper develops a methodology to locate a break in a water distribution system by monitoring water pressure online at selected nodes in the system. For the purpose of online monitoring, supervisory control and data acquisition (SCADA) technology can readily be used. A neural network-based inverse analysis method is constructed for locating the break based on the variation of water pressure. The neural network is trained using analytically simulated data from the water distribution system and validated using a set of data never used in the training. It is found that the methodology provides a quick, effective, and practical way to locate a break in a water distribution system.
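
    A minimal sketch of the inverse-analysis idea (not the authors' network or hydraulic model): pressure drops at a few monitored nodes are synthesized for breaks at each candidate node with a crude distance-decay rule plus noise, and a small neural network is trained to map the pressure pattern back to the break location; the network layout and decay constants are invented for illustration.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_nodes, monitored = 30, [2, 9, 17, 25]          # hypothetical network and sensor nodes
    xy = rng.uniform(0, 1000, (n_nodes, 2))          # node coordinates, m

    def pressure_signature(break_node):
        """Crude stand-in for a hydraulic solver: drop decays with distance, plus noise."""
        d = np.linalg.norm(xy[monitored] - xy[break_node], axis=1)
        return -30.0 * np.exp(-d / 400.0) + rng.normal(0, 0.5, len(monitored))

    X = np.array([pressure_signature(j) for j in range(n_nodes) for _ in range(200)])
    y = np.repeat(np.arange(n_nodes), 200)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=1)
    clf.fit(X_tr, y_tr)
    print(f"break-location accuracy on held-out simulated data: {clf.score(X_te, y_te):.2f}")
    ```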

  10. Computational Methods for HSCT-Inlet Controls/CFD Interdisciplinary Research

    NASA Technical Reports Server (NTRS)

    Cole, Gary L.; Melcher, Kevin J.; Chicatelli, Amy K.; Hartley, Tom T.; Chung, Joongkee

    1994-01-01

    A program aimed at facilitating the use of computational fluid dynamics (CFD) simulations by the controls discipline is presented. The objective is to reduce the development time and cost for propulsion system controls by using CFD simulations to obtain high-fidelity system models for control design and as numerical test beds for control system testing and validation. An interdisciplinary team has been formed to develop analytical and computational tools in three discipline areas: controls, CFD, and computational technology. The controls effort has focused on specifying requirements for an interface between the controls specialist and CFD simulations and a new method for extracting linear, reduced-order control models from CFD simulations. Existing CFD codes are being modified to permit time accurate execution and provide realistic boundary conditions for controls studies. Parallel processing and distributed computing techniques, along with existing system integration software, are being used to reduce CFD execution times and to support the development of an integrated analysis/design system. This paper describes: the initial application for the technology being developed, the high speed civil transport (HSCT) inlet control problem; activities being pursued in each discipline area; and a prototype analysis/design system in place for interactive operation and visualization of a time-accurate HSCT-inlet simulation.

  11. Integration of advanced technologies to enhance problem-based learning over distance: Project TOUCH.

    PubMed

    Jacobs, Joshua; Caudell, Thomas; Wilks, David; Keep, Marcus F; Mitchell, Steven; Buchanan, Holly; Saland, Linda; Rosenheimer, Julie; Lozanoff, Beth K; Lozanoff, Scott; Saiki, Stanley; Alverson, Dale

    2003-01-01

    Distance education delivery has increased dramatically in recent years as a result of the rapid advancement of communication technology. The National Computational Science Alliance's Access Grid represents a significant advancement in communication technology with potential for distance medical education. The purpose of this study is to provide an overview of the TOUCH project (Telehealth Outreach for Unified Community Health; http://hsc.unm.edu/touch) with special emphasis on the process of problem-based learning case development for distribution over the Access Grid. The objective of the TOUCH project is to use emerging Internet-based technology to overcome geographic barriers for delivery of tutorial sessions to medical students pursuing rotations at remote sites. The TOUCH project also is aimed at developing a patient simulation engine and an immersive virtual reality environment to achieve a realistic health care scenario enhancing the learning experience. A traumatic head injury case is developed and distributed over the Access Grid as a demonstration of the TOUCH system. Project TOUCH serves as an example of a computer-based learning system for developing and implementing problem-based learning cases within the medical curriculum, but this system should be easily applied to other educational environments and disciplines involving functional and clinical anatomy. Future phases will explore PC versions of the TOUCH cases for increased distribution. Copyright 2003 Wiley-Liss, Inc.

  12. Particle Simulations of the Guard Electrode Effects on the Photoelectron Distribution Around an Electric Field Sensor

    NASA Astrophysics Data System (ADS)

    Miyake, Y.; Usui, H.; Kojima, H.

    2010-12-01

    In a tenuous space plasma environment, photoelectrons emitted due to solar illumination produce a high-density photoelectron cloud localized in the vicinity of the spacecraft body and the electric field sensor. The photoelectron current emitted from the sensor has also received considerable attention because it becomes a primary factor in determining the floating potentials of the sunlit spacecraft and sensor bodies. Because an asymmetric photoelectron distribution between the sunlit and sunless sides of the spacecraft occasionally causes a spurious sunward electric field, a quantitative, numerical evaluation of the photoelectron distribution around the spacecraft and its influence on electric field measurements is required. In the current study, we applied Particle-in-Cell (PIC) plasma simulation to the analysis of the photoelectron environment around a spacecraft. PIC modeling treats the plasma kinetics self-consistently, which enables us to simulate the formation of the photoelectron cloud as well as the charging of the spacecraft and sensor. We report progress on an analysis of the photoelectron environment around MEFISTO, an electric field instrument for the BepiColombo/MMO spacecraft bound for Mercury's magnetosphere. The photoelectron guard electrode is a key technology for ensuring an optimum photoelectron environment. We show simulation results on the effect of the guard electrode on the surrounding photoelectrons and discuss the guard operating conditions that produce the optimum photoelectron environment. We also address another important issue: how the guard electrode can mitigate the undesirable influence of an asymmetric photoelectron distribution on electric field measurements.
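
    As a hedged illustration of the Particle-in-Cell machinery mentioned above, the sketch below shows only the 1D cloud-in-cell scatter/gather step (charge deposition to the grid and field interpolation back to the particles); the field solve and the authors' full 3D spacecraft-charging model are omitted.

```python
# Hedged sketch: the cloud-in-cell (CIC) weighting at the core of a PIC code, in
# 1D. Particle charge is deposited to the two nearest grid points, and the field
# is interpolated back with the same weights. This is only the scatter/gather
# step, not the authors' full 3D spacecraft-charging simulation.
import numpy as np

nx, dx = 64, 1.0
x_particles = np.array([3.2, 10.75, 40.5])     # positions in units of dx
q = -1.0                                       # photoelectron charge (arbitrary units)

# Scatter: deposit charge onto the grid
rho = np.zeros(nx)
idx = np.floor(x_particles / dx).astype(int)
frac = x_particles / dx - idx
np.add.at(rho, idx % nx, q * (1.0 - frac) / dx)
np.add.at(rho, (idx + 1) % nx, q * frac / dx)

# (A field solve, e.g. an FFT Poisson solver, would go here; use a dummy field.)
E_grid = np.sin(2 * np.pi * np.arange(nx) / nx)

# Gather: interpolate the field back to particle positions with the same weights
E_particles = E_grid[idx % nx] * (1.0 - frac) + E_grid[(idx + 1) % nx] * frac
print("total deposited charge:", rho.sum() * dx, " fields at particles:", E_particles)
```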

  13. A parallel implementation of an off-lattice individual-based model of multicellular populations

    NASA Astrophysics Data System (ADS)

    Harvey, Daniel G.; Fletcher, Alexander G.; Osborne, James M.; Pitt-Francis, Joe

    2015-07-01

    As computational models of multicellular populations include ever more detailed descriptions of biophysical and biochemical processes, the computational cost of simulating such models limits their ability to generate novel scientific hypotheses and testable predictions. While developments in microchip technology continue to increase the power of individual processors, parallel computing offers an immediate increase in available processing power. To make full use of parallel computing technology, it is necessary to develop specialised algorithms. To this end, we present a parallel algorithm for a class of off-lattice individual-based models of multicellular populations. The algorithm divides the spatial domain between computing processes and comprises communication routines that ensure the model is correctly simulated on multiple processors. The parallel algorithm is shown to accurately reproduce the results of a deterministic simulation performed using a pre-existing serial implementation. We test the scaling of computation time, memory use and load balancing as more processes are used to simulate a cell population of fixed size. We find approximate linear scaling of both speed-up and memory consumption on up to 32 processor cores. Dynamic load balancing is shown to provide speed-up for non-regular spatial distributions of cells in the case of a growing population.
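
    A hedged sketch of the spatial decomposition described above: the domain is divided into strips along x, each notionally owned by one process, and the "halo" cells that would have to be received from neighbouring strips are identified as those within one interaction radius of a strip boundary. This illustrates the bookkeeping only; the authors' actual parallel implementation and communication routines are not reproduced.

```python
# Hedged sketch: divide a 2D cell population into strips along x, one strip per
# process, and identify the halo cells each process would need from its
# neighbours (cells within one interaction radius of the strip boundary). In the
# real parallel code each strip lives on a separate MPI rank.
import numpy as np

rng = np.random.default_rng(2)
positions = rng.uniform(0.0, 10.0, size=(500, 2))    # (x, y) cell centres
n_procs, interaction_radius = 4, 0.3
edges = np.linspace(0.0, 10.0, n_procs + 1)          # strip boundaries along x

owner = np.clip(np.searchsorted(edges, positions[:, 0], side="right") - 1,
                0, n_procs - 1)

for p in range(n_procs):
    lo, hi = edges[p], edges[p + 1]
    mine = owner == p
    # halo cells owned by neighbours but within the interaction radius of my strip
    halo = (~mine) & (positions[:, 0] > lo - interaction_radius) \
                   & (positions[:, 0] < hi + interaction_radius)
    print(f"process {p}: owns {mine.sum():3d} cells, needs {halo.sum():3d} halo cells")
```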

  14. Electrical System Technology Working Group (WG) Report

    NASA Technical Reports Server (NTRS)

    Silverman, S.; Ford, F. E.

    1984-01-01

    The technology needs for space power systems (military, public, commercial) were assessed for the period 1995 to 2005 in the areas of power management and distribution, components, circuits, subsystems, controls and autonomy, and modeling and simulation. There was general agreement that military requirements for pulse power would be the dominant factor in the growth of power systems. However, the growth of conventional power to the 100 to 250 kW range would be in the public sector, with low Earth orbit needs being the driver toward large 100 kW systems. An overall philosophy for large power system development is also described.

  15. «Smart Grid» Concept As A Modern Technology For The Power Industry Development

    NASA Astrophysics Data System (ADS)

    Vidyaev, Igor G.; Ivashutenko, Alexandr S.; Samburskaya, Maria A.

    2017-01-01

    The article discusses the main problems of the power industry and of energy supply to distribution networks. One of the suggested solutions to these problems is the use of intelligent energy networks based on digital simulation of reality, in particular the «Smart Grid» concept. The article presents the basic points of the concept and the peculiarities of its application at enterprises. It is demonstrated that the use of this technology eliminates power shortages, reduces energy intensity, and improves energy efficiency across the operation of an enterprise as a whole.

  16. Power and Thermal Technology for Air and Space-Scientific Research Program Delivery Order 0003: Electrical Technology Component Development

    DTIC Science & Technology

    2007-03-01

    [Fragmentary DTIC search excerpt: figure-list entries on the specific contact resistivity of Ti/AlNi/Au, a full-view 3D model of the IGBT, and the 2D temperature distribution of the SiC device; body text notes that the representative geometry of a Si insulated-gate bipolar transistor (IGBT), comprised of multiple materials, was chosen for the initial simulation, and that samples annealed at 650°C for 30 minutes, in either the tube furnace with an oxygen gettering system or in the vacuum chamber, represented the superior ... (excerpt truncated).]

  17. Communication Simulations for Power System Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fuller, Jason C.; Ciraci, Selim; Daily, Jeffrey A.

    2013-05-29

    New smart grid technologies and concepts, such as dynamic pricing, demand response, dynamic state estimation, and wide area monitoring, protection, and control, are expected to require considerable communication resources. As the cost of retrofit can be high, future power grids will require the integration of high-speed, secure connections with legacy communication systems, while still providing adequate system control and security. While considerable work has been performed to create co-simulators for the power domain with load models and market operations, limited work has been performed in integrating communications directly into a power domain solver. The simulation of communication and power systems will become more important as the two systems become more inter-related. This paper will discuss ongoing work at Pacific Northwest National Laboratory to create a flexible, high-speed power and communication system co-simulator for smart grid applications. The framework for the software will be described, including architecture considerations for modular, high performance computing and large-scale scalability (serialization, load balancing, partitioning, cross-platform support, etc.). The current simulator supports the ns-3 (telecommunications) and GridLAB-D (distribution systems) simulators. Ongoing and future work will be described, including planned future expansions for a traditional transmission solver. A test case using the co-simulator, utilizing a transactive demand response system created for the Olympic Peninsula and AEP gridSMART demonstrations, requiring two-way communication between distributed and centralized market devices, will be used to demonstrate the value and intended purpose of the co-simulation environment.
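
    A hedged sketch of the coupling pattern in such a co-simulator: at each synchronization step the market/power side publishes a price signal, a toy communication model delays its delivery, and the demand-response load reacts only once the message arrives. The classes and numbers below are illustrative; they are not the GridLAB-D or ns-3 interfaces.

```python
# Hedged sketch of the co-simulation coupling pattern: a "power" model publishes
# a price signal that travels through a toy "communication" model with latency
# before demand-response loads react. Loop structure only, not the real tools.
from collections import deque

class CommNetwork:
    """Delivers messages after a fixed latency (in co-simulation steps)."""
    def __init__(self, latency_steps):
        self.queue = deque()
        self.latency = latency_steps
    def send(self, step, msg):
        self.queue.append((step + self.latency, msg))
    def deliver(self, step):
        out = [m for t, m in self.queue if t <= step]
        self.queue = deque((t, m) for t, m in self.queue if t > step)
        return out

def feeder_load(base, price):
    """Toy demand-response model: load drops as the delivered price rises."""
    return base * (1.0 - 0.2 * min(price, 1.0))

comm = CommNetwork(latency_steps=2)
price_seen = 0.0
for step in range(10):                              # fixed-step time synchronization
    price_signal = 1.0 if 4 <= step < 7 else 0.2    # market side publishes a price
    comm.send(step, price_signal)
    for msg in comm.deliver(step):                  # loads see the price after the delay
        price_seen = msg
    print(step, round(feeder_load(100.0, price_seen), 1))
```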

  18. Quantified, Interactive Simulation of AMCW ToF Camera Including Multipath Effects

    PubMed Central

    Lambers, Martin; Kolb, Andreas

    2017-01-01

    In the last decade, Time-of-Flight (ToF) range cameras have gained increasing popularity in robotics, automotive industry, and home entertainment. Despite technological developments, ToF cameras still suffer from error sources such as multipath interference or motion artifacts. Thus, simulation of ToF cameras, including these artifacts, is important to improve camera and algorithm development. This paper presents a physically-based, interactive simulation technique for amplitude modulated continuous wave (AMCW) ToF cameras, which, among other error sources, includes single bounce indirect multipath interference based on an enhanced image-space approach. The simulation accounts for physical units down to the charge level accumulated in sensor pixels. Furthermore, we present the first quantified comparison for ToF camera simulators. We present bidirectional reference distribution function (BRDF) measurements for selected, purchasable materials in the near-infrared (NIR) range, craft real and synthetic scenes out of these materials and quantitatively compare the range sensor data. PMID:29271888

  19. Quantified, Interactive Simulation of AMCW ToF Camera Including Multipath Effects.

    PubMed

    Bulczak, David; Lambers, Martin; Kolb, Andreas

    2017-12-22

    In the last decade, Time-of-Flight (ToF) range cameras have gained increasing popularity in robotics, automotive industry, and home entertainment. Despite technological developments, ToF cameras still suffer from error sources such as multipath interference or motion artifacts. Thus, simulation of ToF cameras, including these artifacts, is important to improve camera and algorithm development. This paper presents a physically-based, interactive simulation technique for amplitude modulated continuous wave (AMCW) ToF cameras, which, among other error sources, includes single bounce indirect multipath interference based on an enhanced image-space approach. The simulation accounts for physical units down to the charge level accumulated in sensor pixels. Furthermore, we present the first quantified comparison for ToF camera simulators. We present bidirectional reference distribution function (BRDF) measurements for selected, purchasable materials in the near-infrared (NIR) range, craft real and synthetic scenes out of these materials and quantitatively compare the range sensor data.
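
    For context, the sketch below shows the textbook four-bucket demodulation of an AMCW ToF pixel, i.e., how four phase-offset correlation samples map to a range estimate for a single direct return. It is only a hedged illustration of the measurement principle; the paper's simulator additionally models BRDFs, multipath, and pixel charge levels.

```python
# Hedged sketch: textbook four-bucket demodulation of an AMCW ToF pixel for a
# single direct return (no multipath, no noise).
import numpy as np

c = 299_792_458.0          # speed of light [m/s]
f_mod = 20e6               # modulation frequency [Hz]
true_distance = 3.7        # [m]

phi_true = 4 * np.pi * f_mod * true_distance / c      # round-trip phase shift
offsets = np.array([0.0, 0.5, 1.0, 1.5]) * np.pi      # 0, 90, 180, 270 degrees
A = 1.0 + 0.5 * np.cos(phi_true + offsets)            # ideal correlation samples A0..A3

phi = np.arctan2(A[3] - A[1], A[0] - A[2])            # demodulated phase
distance = (phi % (2 * np.pi)) * c / (4 * np.pi * f_mod)
print(f"recovered distance: {distance:.3f} m "
      f"(unambiguous range {c / (2 * f_mod):.1f} m)")
```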

  20. Faster than Real-Time Dynamic Simulation for Large-Size Power System with Detailed Dynamic Models using High-Performance Computing Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Renke; Jin, Shuangshuang; Chen, Yousu

    This paper presents a faster-than-real-time dynamic simulation software package that is designed for large-size power system dynamic simulation. It was developed on the GridPACKTM high-performance computing (HPC) framework. The key features of the developed software package include (1) faster-than-real-time dynamic simulation for a WECC system (17,000 buses) with different types of detailed generator, controller, and relay dynamic models, (2) a decoupled parallel dynamic simulation algorithm with optimized computation architecture to better leverage HPC resources and technologies, (3) options for HPC-based linear and iterative solvers, (4) hidden HPC details, such as data communication and distribution, to enable development centered on mathematical models and algorithms rather than on computational details for power system researchers, and (5) easy integration of new dynamic models and related algorithms into the software package.

  1. Study on key technologies of optimization of big data for thermal power plant performance

    NASA Astrophysics Data System (ADS)

    Mao, Mingyang; Xiao, Hong

    2018-06-01

    Thermal power generation accounts for 70% of China's electricity generation and for roughly 40% of the corresponding pollutant emissions, so optimizing thermal power efficiency requires monitoring and understanding the whole process of coal combustion and pollutant migration, while power-system performance data are growing explosively. The purpose of this work is to study the integration of numerical simulation with big data technology and to develop a thermal power plant efficiency optimization platform and a nitrogen oxide emission reduction system, providing reliable technical support for improving efficiency, saving energy, and reducing emissions in thermal power plants. The method is big data technology, represented by multi-source heterogeneous data integration, distributed storage of large data sets, and high-performance real-time and offline computing, which can greatly enhance a thermal power plant's capacity to manage energy consumption data and its level of intelligent decision-making. Data mining algorithms are then used to establish a mathematical model of boiler combustion and to mine boiler efficiency data, which, combined with numerical simulation, reveal the rules of boiler combustion and pollutant generation and the influence of combustion parameters on them. The result is a set of optimized boiler combustion parameters, which can achieve energy savings.

  2. Designing a Distributed Space Systems Simulation in Accordance with the Simulation Interoperability Standards Organization (SISO)

    NASA Technical Reports Server (NTRS)

    Cowen, Benjamin

    2011-01-01

    Simulations are essential for engineering design. These virtual environments provide characteristic data that help scientists and engineers understand the details and complications of the desired mission. A standard simulation development package known as Trick is used to develop source code that models a component (a federate, in HLA terms). The runtime executive is integrated into an HLA-based distributed simulation. TrickHLA is used to extend a Trick simulation for federation execution, to develop source code for communication between federates, and to handle data input and output. The project incorporates international cooperation along with team collaboration. Interactions among federates occur throughout the simulation, thereby relying on simulation interoperability. Participants communicated throughout the semester to work out how to create this data exchange. The NASA intern team is designing a Lunar Rover federate and a Lunar Shuttle federate. The Lunar Rover federate supports transportation across the lunar surface and is essential for fostering interactions with other federates on the lunar surface (Lunar Shuttle, Lunar Base Supply Depot and Mobile ISRU Plant) as well as for transporting materials to the desired locations. The Lunar Shuttle federate transports materials to and from lunar orbit. Materials that it takes to the supply depot include fuel and cargo necessary to continue moon-base operations. This project analyzes modeling and simulation technologies as well as simulation interoperability. Each team from the participating universities will work on and engineer its own federate(s) to participate in the SISO Spring 2011 Workshop SIW Smackdown in Boston, Massachusetts. This paper focuses on the Lunar Rover federate.

  3. Simulation of cooling efficiency via miniaturised channels in multilayer LTCC for power electronics

    NASA Astrophysics Data System (ADS)

    Pietrikova, Alena; Girasek, Tomas; Lukacs, Peter; Welker, Tilo; Müller, Jens

    2017-03-01

    The aim of this paper is a detailed investigation, by simulation, of the thermal resistance, coolant flow and distribution, and temperature distribution inside multilayer LTCC substrates with embedded channels for power electronic devices. For this purpose, four different internal channel structures in the multilayer LTCC substrates were designed and simulated. The impact of the volume flow, the channel structure, and the power loss of the chip was simulated, calculated, and analyzed using the simulation software Mentor Graphics FloEFD. The structure, size, and location of the channels have a significant impact on the thermal resistance, the coolant pressure, and the effectiveness of cooling the power components (chips) placed on top of the LTCC substrate. The main contribution of this paper is the thermal analysis, optimization, and comparison of four different cooling channel structures embedded in a multilayer LTCC structure. The paper investigates the effect of the volume flow in the cooling channels on minimizing the thermal resistance of an LTCC substrate loaded with power chips, and also shows the impact of the first chip's thermal load on the second chip. If realized in practice, this new technology could ensure effective cooling and increase the reliability of high-power modules.

  4. Lessons Learned from Numerical Simulations of the F-16XL Aircraft at Flight Conditions

    NASA Technical Reports Server (NTRS)

    Rizzi, Arthur; Jirasek, Adam; Lamar, John; Crippa, Simone; Badcock, Kenneth; Boelens, Oklo

    2009-01-01

    Nine groups participating in the Cranked Arrow Wing Aerodynamics Project International (CAWAPI) project have contributed steady and unsteady viscous simulations of a full-scale, semi-span model of the F-16XL aircraft. Three different categories of flight Reynolds/Mach number combinations were computed and compared with flight-test measurements for the purpose of code validation and improved understanding of the flight physics. Steady-state simulations are done with several turbulence models of different complexity with no topology information required and which overcome Boussinesq-assumption problems in vortical flows. Detached-eddy simulation (DES) and its successor delayed detached-eddy simulation (DDES) have been used to compute the time accurate flow development. Common structured and unstructured grids as well as individually-adapted unstructured grids were used. Although discrepancies are observed in the comparisons, overall reasonable agreement is demonstrated for surface pressure distribution, local skin friction and boundary velocity profiles at subsonic speeds. The physical modeling, steady or unsteady, and the grid resolution both contribute to the discrepancies observed in the comparisons with flight data, but at this time it cannot be determined how much each part contributes to the whole. Overall it can be said that the technology readiness of CFD-simulation technology for the study of vehicle performance has matured since 2001 such that it can be used today with a reasonable level of confidence for complex configurations.

  5. Multi-INT and Information Operations Simulation and Training Technologies (MIISTT)

    DTIC Science & Technology

    2010-11-01

    [Fragmentary DTIC record excerpt: Final Report, November 2010; contributors include Michelle Caisse, Ronnie F. Silber (Lockheed Martin), and Raymond T. Tillman (L-3 Communications); DISTRIBUTION A, approved for public release; work unit number 2830HXA1. The remainder of the excerpt is report-documentation-page boilerplate.]

  6. Scale-dependent diffusion anisotropy in nanoporous silicon

    PubMed Central

    Kondrashova, Daria; Lauerer, Alexander; Mehlhorn, Dirk; Jobic, Hervé; Feldhoff, Armin; Thommes, Matthias; Chakraborty, Dipanjan; Gommes, Cedric; Zecevic, Jovana; de Jongh, Petra; Bunde, Armin; Kärger, Jörg; Valiullin, Rustem

    2017-01-01

    Nanoporous silicon produced by electrochemical etching of highly B-doped p-type silicon wafers can be prepared with tubular pores imbedded in a silicon matrix. Such materials have found many technological applications and provide a useful model system for studying phase transitions under confinement. This paper reports a joint experimental and simulation study of diffusion in such materials, covering displacements from molecular dimensions up to tens of micrometers with carefully selected probe molecules. In addition to mass transfer through the channels, diffusion (at much smaller rates) is also found to occur in directions perpendicular to the channels, thus providing clear evidence of connectivity. With increasing displacements, propagation in both axial and transversal directions is progressively retarded, suggesting a scale-dependent, hierarchical distribution of transport resistances (“constrictions” in the channels) and of shortcuts (connecting “bridges”) between adjacent channels. The experimental evidence from these studies is confirmed by molecular dynamics (MD) simulation in the range of atomistic displacements and rationalized with a simple model of statistically distributed “constrictions” and “bridges” for displacements in the micrometer range via dynamic Monte Carlo (DMC) simulation. Both ranges are demonstrated to be mutually transferrable by DMC simulations based on the pore space topology determined by electron tomography. PMID:28106047
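
    A hedged toy version of the dynamic Monte Carlo picture described above: walkers move along parallel channels, randomly placed "constrictions" reduce the acceptance of axial hops, and rare "bridges" permit hops to a neighbouring channel. All probabilities are invented, and the lattice is not the authors' tomography-derived pore network; the point is only that axial displacement greatly exceeds transverse displacement.

```python
# Hedged sketch of the DMC picture: channels along x, "constrictions" that reduce
# axial hop acceptance, and rare "bridges" allowing hops between adjacent
# channels. Probabilities and sizes are illustrative only.
import numpy as np

rng = np.random.default_rng(3)
L, n_channels, n_walkers, n_steps = 2000, 50, 400, 5000
constriction = rng.random((n_channels, L)) < 0.05     # sites with a narrow neck
bridge = rng.random((n_channels, L)) < 0.01           # sites connected to next channel

x = np.full(n_walkers, L // 2)
y = rng.integers(0, n_channels, n_walkers)
x0, y0 = x.copy(), y.copy()

for _ in range(n_steps):
    step = rng.choice([-1, 1], n_walkers)
    blocked = constriction[y, (x + step) % L] & (rng.random(n_walkers) < 0.9)
    x = np.where(blocked, x, x + step)                 # axial hop unless constricted
    hop = bridge[y, x % L] & (rng.random(n_walkers) < 0.5)
    y = np.where(hop, (y + rng.choice([-1, 1], n_walkers)) % n_channels, y)

msd_axial = np.mean((x - x0) ** 2)
msd_transverse = np.mean(((y - y0 + n_channels // 2) % n_channels - n_channels // 2) ** 2)
print(f"axial MSD {msd_axial:.0f}  >>  transverse MSD {msd_transverse:.1f}")
```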

  7. In Situ Distribution Guided Analysis and Visualization of Transonic Jet Engine Simulations.

    PubMed

    Dutta, Soumya; Chen, Chun-Ming; Heinlein, Gregory; Shen, Han-Wei; Chen, Jen-Ping

    2017-01-01

    Study of flow instability in turbine engine compressors is crucial to understanding the inception and evolution of engine stall. Aerodynamics experts have been working on detecting the early signs of stall in order to devise novel stall suppression technologies. A state-of-the-art Navier-Stokes based, time-accurate computational fluid dynamics simulator, TURBO, has been developed at NASA to enhance the understanding of flow phenomena undergoing rotating stall. Despite the proven high modeling accuracy of TURBO, the sheer volume of simulation data makes post-hoc analysis prohibitive in both storage and I/O time. To address these issues and allow the expert to perform scalable stall analysis, we have designed an in situ distribution guided stall analysis technique. Our method summarizes statistics of important properties of the simulation data in situ using a probabilistic data modeling scheme. This data summarization enables statistical anomaly detection for flow instability in post analysis, which reveals the spatiotemporal trends of rotating stall for the expert to conceive new hypotheses. Furthermore, the verification of the hypotheses and exploratory visualization using the summarized data are realized using probabilistic visualization techniques such as uncertain isocontouring. Positive feedback from the domain scientist has indicated the efficacy of our system in exploratory stall analysis.
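
    A hedged, generic sketch of distribution-based in situ summarization: each spatial block keeps only a histogram of a flow variable, and in post analysis blocks whose distribution deviates strongly from a reference are flagged. This is not the TURBO pipeline or the authors' probabilistic model; the data and threshold are synthetic.

```python
# Hedged sketch: per-block histograms as in situ summaries, then flag blocks whose
# distribution deviates from a reference. Synthetic data, illustrative threshold.
import numpy as np

rng = np.random.default_rng(4)
field = rng.normal(0.0, 1.0, size=(64, 64))
field[48:64, 48:64] += rng.normal(2.5, 0.5, size=(16, 16))   # an "anomalous" region

block, bins = 16, np.linspace(-4, 8, 25)
ref_hist, _ = np.histogram(field, bins=bins, density=True)   # reference distribution

def block_histograms(data):
    """In situ summary: one normalized histogram per block (the only data kept)."""
    h = {}
    for i in range(0, data.shape[0], block):
        for j in range(0, data.shape[1], block):
            h[(i, j)], _ = np.histogram(data[i:i + block, j:j + block],
                                        bins=bins, density=True)
    return h

for (i, j), hist in block_histograms(field).items():
    # total-variation distance between the block distribution and the reference
    divergence = 0.5 * np.abs(hist - ref_hist).sum() * np.diff(bins)[0]
    if divergence > 0.3:
        print(f"block at ({i},{j}) flagged, TV distance {divergence:.2f}")
```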

  8. Extending the Capabilities of Closed-loop Distributed Engine Control Simulations Using LAN Communication

    NASA Technical Reports Server (NTRS)

    Aretskin-Hariton, Eliot D.; Zinnecker, Alicia Mae; Culley, Dennis E.

    2014-01-01

    Distributed Engine Control (DEC) is an enabling technology that has the potential to advance the state-of-the-art in gas turbine engine control. To analyze the capabilities that DEC offers, a Hardware-In-the-Loop (HIL) test bed is being developed at NASA Glenn Research Center. This test bed will support a systems-level analysis of control capabilities in closed-loop engine simulations. The structure of the HIL emulates a virtual test cell by implementing the operator functions, control system, and engine on three separate computers. This implementation increases the flexibility and extensibility of the HIL. Here, a method is discussed for implementing these interfaces by connecting the three platforms over a dedicated Local Area Network (LAN). This approach is verified using the Commercial Modular Aero-Propulsion System Simulation 40k (C-MAPSS40k), which is typically implemented on one computer. There are marginal differences between the results from simulation of the typical and the three-computer implementation. Additional analysis of the LAN network, including characterization of network load, packet drop, and latency, is presented. The three-computer setup supports the incorporation of complex control models and proprietary engine models into the HIL framework.
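
    A hedged sketch of the kind of LAN interface described above: two simulation components exchange JSON messages over a TCP socket in lockstep, here looped back on localhost with threads standing in for the separate computers. The port number, message fields, and "engine response" are hypothetical; the actual HIL protocol and C-MAPSS40k interfaces are not reproduced.

```python
# Hedged sketch: a minimal TCP link of the kind used to split a closed-loop
# simulation across machines on a LAN. The "engine" returns a sensor value for
# each actuator command it receives; all names and values are hypothetical.
import json, socket, threading, time

HOST, PORT = "127.0.0.1", 50551   # hypothetical loopback endpoint

def engine_server():
    with socket.create_server((HOST, PORT)) as srv:
        conn, _ = srv.accept()
        with conn:
            for _ in range(3):
                cmd = json.loads(conn.recv(1024).decode())
                sensor = {"n_fan": 2000.0 + 30.0 * cmd["fuel_flow"]}  # toy response
                conn.sendall((json.dumps(sensor) + "\n").encode())

threading.Thread(target=engine_server, daemon=True).start()
time.sleep(0.2)                                     # give the listener time to start

with socket.create_connection((HOST, PORT)) as ctrl:   # the "controller" computer
    for step in range(3):
        ctrl.sendall(json.dumps({"fuel_flow": 1.0 + 0.1 * step}).encode())
        print("sensor packet:", ctrl.recv(1024).decode().strip())
```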

  9. Distributed Planning and Control for Teams of Cooperating Mobile Robots

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parker, L.E.

    2004-06-15

    This CRADA project involved the cooperative research of investigators in ORNL's Center for Engineering Science Advanced Research (CESAR) with researchers at Caterpillar, Inc. The subject of the research was the development of cooperative control strategies for autonomous vehicles performing applications of interest to Caterpillar customers. The project involved three Phases of research, conducted over the time period of November 1998 through December 2001. This project led to the successful development of several technologies and demonstrations in realistic simulation that illustrated the effectiveness of the control approaches for distributed planning and cooperation in multi-robot teams.

  10. MATSIM: Development of a Voxel Model of the MATROSHKA Astronaut Dosimetric Phantom

    NASA Astrophysics Data System (ADS)

    Beck, Peter; Zechner, Andrea; Rollet, Sofia; Berger, Thomas; Bergmann, Robert; Hajek, Michael; Hranitzky, Christian; Latocha, Marcin; Reitz, Günther; Stadtmann, Hannes; Vana, Norbert; Wind, Michael

    2011-08-01

    The AIT Austrian Institute of Technology coordinates the project MATSIM (MATROSHKA Simulation) in collaboration with the Vienna University of Technology and the German Aerospace Center, to perform FLUKA Monte Carlo simulations of the MATROSHKA numerical phantom irradiated under reference radiation field conditions as well as for the radiation environment at the International Space Station (ISS). MATSIM is carried out as co-investigation of the ESA ELIPS projects SORD and RADIS (commonly known as MATROSHKA), an international collaboration of more than 18 research institutes and space agencies from all over the world, under the science and project lead of the German Aerospace Center. During MATSIM a computer tomography scan of the MATROSHKA phantom has been converted into a high resolution 3-dimensional voxel model. The energy imparted and absorbed dose distribution inside the model is determined for various radiation fields. The major goal of the MATSIM project is the validation of the numerical model under reference radiation conditions and further investigations under the radiation environment at ISS. In this report we compare depth dose distributions inside the phantom measured with thermoluminescence detectors (TLDs) and an ionization chamber with FLUKA Monte Carlo particle transport simulations due to 60Co photon exposure. Further reference irradiations with neutrons, protons and heavy ions are planned. The fully validated numerical model MATSIM will provide a perfect tool to assess the radiation exposure to humans during current and future space missions to ISS, Moon, Mars and beyond.

  11. Replicative manufacturing of complex lighting optics by non-isothermal glass molding

    NASA Astrophysics Data System (ADS)

    Kreilkamp, Holger; Vu, Anh Tuan; Dambon, Olaf; Klocke, Fritz

    2016-09-01

    The advantages of LED lighting, especially its energy efficiency and the long service life have led to a wide distribution of LED technology in the world. However, in order to make fully use of the great potential that LED lighting offers, complex optics are required to distribute the emitted light from the LED efficiently. Nowadays, many applications use polymer optics which can be manufactured at low costs. However, due to ever increasing luminous power, polymer optics reach their technological limits. Due to its outstanding properties, especially its temperature resistance, resistance against UV radiation and its long term stability, glass is the alternative material of choice for the use in LED optics. This research is introducing a new replicative glass manufacturing approach, namely non-isothermal glass molding (NGM) which is able to manufacture complex lighting optics in high volumes at competitive prices. The integration of FEM simulation at the early stage of the process development is presented and helps to guarantee a fast development cycle. A coupled thermo-mechanical model is used to define the geometry of the glass preform as well as to define the mold surface geometry. Furthermore, simulation is used to predict main process outcomes, especially in terms of resulting form accuracy of the molded optics. Experiments conducted on a commercially available molding machine are presented to validate the developed simulation model. Finally, the influence of distinct parameters on important process outcomes like form accuracy, surface roughness, birefringence, etc. is discussed.

  12. Using OPC technology to support the study of advanced process control.

    PubMed

    Mahmoud, Magdi S; Sabih, Muhammad; Elshafei, Moustafa

    2015-03-01

    OPC, originally the Object Linking and Embedding (OLE) for Process Control, brings a broad communication opportunity between different kinds of control systems. This paper investigates the use of OPC technology for the study of distributed control systems (DCS) as a cost effective and flexible research tool for the development and testing of advanced process control (APC) techniques in university research centers. Co-Simulation environment based on Matlab, LabVIEW and TCP/IP network is presented here. Several implementation issues and OPC based client/server control application have been addressed for TCP/IP network. A nonlinear boiler model is simulated as OPC server and OPC client is used for closed loop model identification, and to design a Model Predictive Controller. The MPC is able to control the NOx emissions in addition to drum water level and steam pressure. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  13. 26th JANNAF Airbreathing Propulsion Subcommittee Meeting. Volume 1

    NASA Technical Reports Server (NTRS)

    Fry, Ronald S. (Editor); Gannaway, Mary T. (Editor)

    2002-01-01

    This volume, the first of four volumes, is a collection of 28 unclassified/unlimited-distribution papers which were presented at the Joint Army-Navy-NASA-Air Force (JANNAF) 26th Airbreathing Propulsion Subcommittee (APS) was held jointly with the 38th Combustion Subcommittee (CS), 20th Propulsion Systems Hazards Subcommittee (PSHS), and 2nd Modeling and Simulation Subcommittee. The meeting was held 8-12 April 2002 at the Bayside Inn at The Sandestin Golf & Beach Resort and Eglin Air Force Base, Destin, Florida. Topics covered include: scramjet and ramjet R&D program overviews; tactical propulsion; space access; NASA GTX status; PDE technology; actively cooled engine structures; modeling and simulation of complex hydrocarbon fuels and unsteady processes; and component modeling and simulation.

  14. The investigation on mirrors maladjustment for RLG

    NASA Astrophysics Data System (ADS)

    He, Xiao-qing; Gao, Ai-hua; Hu, Shang-bin; Lu, Zhi-guo

    2011-06-01

    In order to meet the high demand of the entire technology processing, the error compensation method is usually used to correct them and is premised on a good understanding of error sources and the law of the errors. In this paper, based on the theories of Collins's Integral and Collins's EIKONAL Function and the MATLAB software, we simulated and calculated the spatial distribution of optical beam in the cavity of the ring laser gyro under the resonator's maladjustment caused by the technology processing. From the simulation results, we can get that to the small-gain lasers, the same amount of disorders in the different structures have different effects on the spatial distribution of the beam, and the structures using the spherical mirrors relatively have the small impact on the beam; under the same disorder in the same cavity shape, the signal light and the calibration light which are respectively detected from the mirror M1 and M4 are different; under the same structures, different mirrors with the same amount of disorder will cause the different beat frequency difference; because of the disorders, the spot centers of clockwise and counterclockwise waves happen shift and will seriously affect the normal operation of the laser gyro if the imbalance reaches a certain degree. This work has a guiding role in the mirror adjustment of the laser gyros' technology processing, and has a reference value to the survival rate of the laser gyros and the improvement of measurement accuracy.

  15. Low-cost real-time 3D PC distributed-interactive-simulation (DIS) application for C4I

    NASA Astrophysics Data System (ADS)

    Gonthier, David L.; Veron, Harry

    1998-04-01

    A 3D Distributed Interactive Simulation (DIS) application was developed and demonstrated in a PC environment. The application is capable of running in the stealth mode or as a player which includes battlefield simulations, such as ModSAF. PCs can be clustered together, but not necessarily collocated, to run a simulation or training exercise on their own. A 3D perspective view of the battlefield is displayed that includes terrain, trees, buildings and other objects supported by the DIS application. Screen update rates of 15 to 20 frames per second have been achieved with fully lit and textured scenes thus providing high quality and fast graphics. A complete PC system can be configured for under $2,500. The software runs under Windows95 and WindowsNT. It is written in C++ and uses a commercial API called RenderWare for 3D rendering. The software uses Microsoft Foundation classes and Microsoft DirectPlay for joystick input. The RenderWare libraries enhance the performance through optimization for MMX and the Pentium Pro processor. The RenderWare and the Righteous 3D graphics board from Orchid Technologies with an advertised rendering rate of up to 2 million texture mapped triangles per second. A low-cost PC DIS simulator that can partake in a real-time collaborative simulation with other platforms is thus achieved.

  16. High Fidelity Simulations of Large-Scale Wireless Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onunkwo, Uzoma; Benz, Zachary

    The worldwide proliferation of wirelessly connected devices continues to accelerate. There are tens of billions of wireless links across the planet, with an additional explosion of new wireless usage anticipated as the Internet of Things develops. Wireless technologies not only provide convenience for mobile applications but are also extremely cost-effective to deploy. Thus, this trend towards wireless connectivity will only continue, and Sandia must develop the necessary simulation technology to proactively analyze the associated emerging vulnerabilities. Wireless networks are marked by mobility and proximity-based connectivity. The de facto standard for exploratory studies of wireless networks is discrete event simulation (DES). However, the simulation of large-scale wireless networks is extremely difficult due to prohibitively large turnaround times. A path forward is to expedite simulations with parallel discrete event simulation (PDES) techniques. The mobility and distance-based connectivity associated with wireless simulations, however, typically doom PDES and fail to scale (e.g., the OPNET and ns-3 simulators). We propose a PDES-based tool aimed at reducing the communication overhead between processors. The proposed solution will use light-weight processes to dynamically distribute the computation workload while mitigating the communication overhead associated with synchronizations. This work is vital to the analytics and validation capabilities of simulation and emulation at Sandia. We have years of experience in Sandia's simulation and emulation projects (e.g., MINIMEGA and FIREWHEEL). Sandia's current highly regarded capabilities in large-scale emulation have focused on wired networks, where two assumptions prevent scalable wireless studies: (a) the connections between objects are mostly static and (b) the nodes have fixed locations.
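
    For reference, the sketch below shows the kernel of a discrete event simulation, the de facto standard mentioned above: a priority queue of timestamped events processed in time order, with handlers that may schedule further events. It is generic and does not represent ns-3, OPNET, or the proposed PDES tool.

```python
# Hedged sketch of a serial discrete event simulation (DES) kernel: a priority
# queue of timestamped events, processed in order; handlers may schedule more.
import heapq

events = []          # entries: (time, sequence, description, action)
seq = 0

def schedule(t, description, action=None):
    global seq
    heapq.heappush(events, (t, seq, description, action))
    seq += 1

def transmit(now):
    # toy wireless hop: the packet arrives at the neighbour after 2 ms
    schedule(now + 0.002, "packet received at node B")

schedule(0.000, "packet sent from node A", transmit)
schedule(0.010, "simulation end")

while events:
    now, _, description, action = heapq.heappop(events)
    print(f"t={now * 1000:6.2f} ms  {description}")
    if action:
        action(now)
```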

  17. Flow Simulation of N3-X Hybrid Wing-Body Configuration

    NASA Technical Reports Server (NTRS)

    Kim, Hyoungjin; Liou, Meng-Sing

    2013-01-01

    System studies show that a N3-X hybrid wing-body aircraft with a turboelectric distributed propulsion system using a mail-slot inlet/nozzle nacelle can meet the environmental and performance goals for N+3 generation transports (three generations beyond the current air transport technology level) set by NASA's Subsonic Fixed Wing Project. In this study, a Navier-Stokes flow simulation of the N3-X on hybrid unstructured meshes was conducted, including the mail-slot propulsor. The geometry of the mail-slot propulsor was generated by a CAD (Computer-Aided Design)-free shape parameterization. A body force approach was used for a more realistic and efficient simulation of the turning and loss effects of the fan blades and the inlet-fan interactions. Flow simulation results for the N3-X demonstrate the validity of the present approach.

  18. The theoretical simulation on electrostatic distribution of 1st proximity region in proximity focusing low-light-level image intensifier

    NASA Astrophysics Data System (ADS)

    Zhang, Liandong; Bai, Xiaofeng; Song, De; Fu, Shencheng; Li, Ye; Duanmu, Qingduo

    2015-03-01

    Low-light-level night vision technology amplifies a low-light signal until it is strong enough to be seen by the naked eye, using photons and photoelectrons as the information carriers. The invention of the micro-channel plate made high performance and miniaturization of low-light-level night vision devices possible. The device considered here is a double-proximity-focusing low-light-level image intensifier, in which a micro-channel plate is placed close to both the photocathode and the phosphor screen. The advantages of proximity-focusing low-light-level night vision are small size, light weight, low power consumption, no distortion, fast response, wide dynamic range, and so on. The micro-channel plate (with a metal electrode on each side), the photocathode, and the phosphor screen are placed parallel to one another. When the image intensifier operates, a voltage is applied between the photocathode and the input face of the micro-channel plate. Photoelectrons excited on the photocathode move toward the micro-channel plate under the electric field in the first proximity-focusing region and are then multiplied in the micro-channels. Once the distribution of the electrostatic field and its equipotential lines in the first proximity-focusing region is determined, the trajectories of the emitted electrons can be calculated and simulated, and the resolution of the image tube can then be determined. However, the electrostatic field and equipotential-line distributions are complex because of the many micro-channels in the micro-channel plate. This paper simulates the electrostatic distribution of the first proximity region in a double-proximity-focusing low-light-level image intensifier with the finite element simulation software Ansoft Maxwell 3D. The electrostatic field distributions of the first proximity region are compared as the micro-channel plate's pore size, spacing, and inclination angle are varied. We believe that the electron beam trajectories in the first proximity region can be better simulated once the electrostatic fields have been obtained.
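
    A hedged sketch of the underlying field problem: a finite-difference Laplace solve for the potential in a simplified 2D slice of the first proximity gap, with the photocathode and the MCP input face as Dirichlet boundaries. The geometry and voltage are illustrative and the micro-channel structure is not represented; this is a conceptual stand-in for, not a reproduction of, the Ansoft Maxwell 3D model.

```python
# Hedged sketch: finite-difference Laplace solve for the potential in a simplified
# 2D slice of the first proximity gap (photocathode at -200 V on the left, MCP
# input face at 0 V on the right). Illustrative geometry only.
import numpy as np

nx, ny, V_cathode = 80, 40, -200.0
phi = np.zeros((ny, nx))
phi[:, 0] = V_cathode            # photocathode boundary
phi[:, -1] = 0.0                 # MCP input face boundary

for _ in range(5000):            # Jacobi iteration on interior points
    phi[1:-1, 1:-1] = 0.25 * (phi[1:-1, :-2] + phi[1:-1, 2:] +
                              phi[:-2, 1:-1] + phi[2:, 1:-1])
    phi[:, 0], phi[:, -1] = V_cathode, 0.0            # re-impose Dirichlet BCs
    phi[0, :], phi[-1, :] = phi[1, :], phi[-2, :]     # insulating top/bottom (Neumann)

Ex = -np.gradient(phi, axis=1)   # field component driving electrons toward the MCP
print("potential at mid-gap:", phi[ny // 2, nx // 2].round(1), "V;",
      "field at mid-gap:", Ex[ny // 2, nx // 2].round(2), "V per cell")
```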

  19. Electrohydrodynamic simulation of an electrospray in a colloid thruster

    NASA Astrophysics Data System (ADS)

    Jugroot, Manish; Forget, Martin; Malardier-Jugroot, Cecile

    2012-02-01

    A precise understanding of electrosprays is of great interest as the complexity of micro-technology systems (such as nano-material processing, spacecraft propulsion, and mass spectrometers) increases. A multi-component CFD-based model coupling fluid dynamics, charged species dynamics, and the electric field is developed. The simulations describe the charged fluid interface with emphasis on Taylor cone formation and the cone-jet transition under the effect of an electric field. The goal is to recapture the transition from a rounded liquid interface into a Taylor cone starting from an initially uniform distribution, without making assumptions about the behaviour, geometry, or charge distribution of the system. The time evolution of the interface highlights the close interaction among space charge, coulombic forces, and surface tension, which appear as governing and competing processes in the transition. The results from the coupled formalism provide valuable insights into the physical phenomena and will be applied to a colloid thruster for small spacecraft.

  20. The simulation of decontamination works in premises of the research reactor in NRC 'Kurchatov institute'

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Danilovich, Alexey; Ivanov, Oleg; Potapov, Victor

    2013-07-01

    Application of remote sensing methods using a spectrometric collimated system allows information to be obtained about how radiation fields form in contaminated premises. This information helps in preparing a phased plan for dismantling contaminated equipment. A remote-controlled collimated spectrometric system was used during the survey of the technological premises of the research reactor at the Russian Research Centre 'Kurchatov Institute', and the surveyed premises were scanned with it. As a result of this work, the distribution pattern of radionuclide activity was reconstructed. The decontamination work was then simulated, and maps of the distribution of activity and dose rate for the surveyed premises were plotted and superimposed on photographs of the premises for the situations before and after decontamination. Use of the obtained results will allow a significant reduction in the radiation dose to staff during dismantling work. (authors)

  1. Red mud flocculation process in alumina production

    NASA Astrophysics Data System (ADS)

    Fedorova, E. R.; Firsov, A. Yu

    2018-05-01

    The process of thickening and washing red mud is a bottleneck of alumina production. Existing automated control systems for the thickening process involve stabilizing the parameters of the primary technological circuits of the thickener. A current direction of research is the creation and improvement of models and of model-based control systems for the thickening process. However, the known models do not fully consider perturbing effects, in particular the particle size distribution in the feed and the size distribution of floccules after aggregation in the feed barrel. This article is devoted to the basic concepts and terms used in formulating the population balance algorithm. The population balance model is implemented in the MATLAB environment. The result of the simulation is the particle size distribution after the flocculation process. The model makes it possible to predict the size distribution of floccules after the aggregation of red mud in the feed barrel. Red mud from Jamaican bauxite served as the industrial sample; a Cytec Industries HX-3000 series flocculant at a concentration of 0.5% was used. The simulations used model constants obtained in a tubular tank in the laboratories of CSIRO (Australia).
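
    A hedged sketch of a population-balance calculation of the kind the paper implements in MATLAB: the discrete Smoluchowski aggregation equation with a constant kernel, integrated with an explicit Euler step. The kernel value, time step, initial condition, and size cutoff are illustrative, not the CSIRO-fitted constants.

```python
# Hedged sketch of a population balance (Smoluchowski aggregation) model: number
# density N[k] of floccules containing k primary particles, constant kernel.
# All parameter values are illustrative only.
import numpy as np

k_max, beta, dt, n_steps = 60, 1e-3, 0.1, 400
N = np.zeros(k_max + 1)
N[1] = 1000.0                      # start from primary red-mud particles only

for _ in range(n_steps):
    birth = np.zeros_like(N)
    for k in range(2, k_max + 1):  # floccules of size k formed from i + (k - i)
        i = np.arange(1, k)
        birth[k] = 0.5 * beta * np.sum(N[i] * N[k - i])
    death = beta * N * N.sum()     # loss of size k by aggregating with anything
    N = np.maximum(N + dt * (birth - death), 0.0)   # sizes beyond k_max are neglected

sizes = np.arange(k_max + 1)
print("mean floccule size (primary particles):", (sizes * N).sum() / N.sum())
```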

  2. The Steinberg-Bernstein Centre for Minimally Invasive Surgery at McGill University.

    PubMed

    Fried, Gerald M

    2005-12-01

    Surgical skills and simulation centers have been developed in recent years to meet the educational needs of practicing surgeons, residents, and students. The rapid pace of innovation in surgical procedures and technology, as well as the overarching desire to enhance patient safety, have driven the development of simulation technology and new paradigms for surgical education. McGill University has implemented an innovative approach to surgical education in the field of minimally invasive surgery. The goal is to measure surgical performance in the operating room using practical, reliable, and valid metrics, which allow the educational needs of the learner to be established and enable feedback and performance to be tracked over time. The GOALS system and the MISTELS program have been developed to measure operative performance and minimally invasive surgical technical skills in the inanimate skills lab, respectively. The MISTELS laparoscopic simulation-training program has been incorporated as the manual skills education and evaluation component of the Fundamentals of Laparoscopic Surgery program distributed by the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES) and the American College of Surgeons.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruth, M.; Pratt, A.; Lunacek, M.

    The combination of distributed energy resources (DER) and retail tariff structures to provide benefits to both utility consumers and the utilities is not well understood. To improve understanding, an Integrated Energy System Model (IESM) is being developed to simulate the physical and economic aspects of DER technologies, the buildings where they reside, and the feeders servicing them. The IESM was used to simulate 20 houses with home energy management systems on a single feeder under a time-of-use (TOU) tariff to estimate the economic and physical impacts on both the households and the distribution utilities. Home energy management systems (HEMS) reduce consumers' electric bills by precooling houses in the hours before peak electricity pricing. Utilization of HEMS reduces peak load during high-price hours but shifts it to hours with off-peak and shoulder prices, resulting in a higher overall peak load.
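
    A hedged toy illustration of the precooling behaviour described above: a first-order house thermal model under a made-up TOU tariff, with a HEMS rule that lowers the cooling setpoint in the two hours before the on-peak window and relaxes it during the peak. All parameters are invented; this is not the IESM, GridLAB-D, or the actual tariff.

```python
# Hedged sketch: first-order house thermal model under an invented TOU tariff.
# The HEMS precools before the peak window, so on-peak cooling falls while
# shoulder-hour cooling rises. Everything here is illustrative.
R, C = 2.0, 5.0                 # thermal resistance [C/kW] and capacitance [kWh/C]
T_out, ac_kw = 35.0, 8.0        # outdoor temperature [C], cooling capacity [kW thermal]

def tou_price(hour):            # illustrative tariff: on-peak 15:00-19:00
    return 0.30 if 15 <= hour < 19 else 0.10

def setpoint(hour, smart):
    if smart and 13 <= hour < 15:
        return 22.0             # precool before the peak
    if smart and 15 <= hour < 19:
        return 26.0             # coast through the peak window
    return 24.0

for smart in (False, True):
    T, cost, peak_kwh = 24.0, 0.0, 0.0
    for hour in range(24):
        cooling = ac_kw if T > setpoint(hour, smart) else 0.0   # bang-bang thermostat
        T += ((T_out - T) / R - cooling) / C                    # one-hour explicit step
        cost += cooling * tou_price(hour)
        peak_kwh += cooling if 15 <= hour < 19 else 0.0
    print(f"smart={smart}: daily cooling cost ${cost:.2f}, on-peak cooling {peak_kwh:.1f} kWh")
```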

  4. Simulation studies promote technological development of radiofrequency phased array hyperthermia.

    PubMed

    Wust, P; Seebass, M; Nadobny, J; Deuflhard, P; Mönich, G; Felix, R

    1996-01-01

    A treatment planning program package for radiofrequency hyperthermia has been developed. It consists of software modules for processing three-dimensional computerized tomography (CT) data sets, manual segmentation, generation of tetrahedral grids, numerical calculation and optimisation of three-dimensional E field distributions using a volume surface integral equation algorithm as well as of temperature distributions using an adaptive multilevel finite-element code, and graphical tools for simultaneous representation of CT data and simulation results. Heat treatments are limited by hot spots in healthy tissue caused by E field maxima at electrical interfaces (bone/muscle). In order to reduce or avoid hot spots, suitable objective functions are derived from power deposition patterns and temperature distributions and are used to optimise the antenna parameters (phases, amplitudes). The simulation and optimisation tools have been applied to estimate the improvements that could be achieved by upgrades of the clinically used SIGMA-60 applicator (consisting of a single ring of four antenna pairs). The investigated upgrades are an increased number of antennas and channels (a triple ring of 3 x 8 antennas) and variation of antenna inclination. A significant improvement in index temperatures (1-2 degrees C) is achieved by upgrading the single ring to a triple ring with free phase selection for every antenna or antenna pair. Antenna amplitudes and inclinations proved to be less important parameters.
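
    A hedged sketch of the phase-optimization idea: given the complex E-field contribution of each antenna at a target point, setting each phase to the negative of that contribution's argument ("phase conjugation") makes all contributions add coherently there. The paper's objective functions additionally penalize hot spots; that part, and the field computation itself, are omitted.

```python
# Hedged sketch of antenna phase optimization for an annular phased array: choose
# each phase as the negative of the argument of that antenna's E-field
# contribution at the target, so contributions add in phase. The per-antenna
# field values below are random placeholders, not a computed E-field solution.
import numpy as np

rng = np.random.default_rng(5)
n_antennas = 24                                   # e.g. a triple ring of 3 x 8
E_target = rng.standard_normal(n_antennas) + 1j * rng.standard_normal(n_antennas)

phases_uniform = np.zeros(n_antennas)
phases_focused = -np.angle(E_target)              # phase-conjugate setting

def deposited_power(phases, amplitudes=1.0):
    field = np.sum(amplitudes * np.exp(1j * phases) * E_target)
    return np.abs(field) ** 2                     # proportional to local SAR

print("uniform phases :", deposited_power(phases_uniform).round(1))
print("focused phases :", deposited_power(phases_focused).round(1))
```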

  5. Web-based multimedia courseware for emergency cardiac patient management simulations.

    PubMed

    Ambrosiadou, V; Compton, T; Panchal, T; Polovina, S

    2000-01-01

    This is a multidisciplinary, inter-departmental/faculty project between the departments of computer science; electronic, communications and electrical engineering; and nursing and paramedic sciences. The objective is to develop a web-based multimedia front end to existing simulations of cardiac emergency scenarios. It will be used first in the teaching of nurses. The University of Hertfordshire is the only university in Britain using simulations of cardiac emergency scenarios for nurse and paramedic science education; this project will therefore add the multimedia dimension to distributed courses over the web and will assess the improvement in the educational process. The use of network and multimedia technologies provides interactive learning, immediate feedback to students' responses, individually tailored instruction, objective testing, and entertaining delivery. The end product of this project will serve as interactive material to enhance experiential learning for nursing students using the simulations of cardiac emergency scenarios. The emergency treatment simulations have been developed using VisSim and may be compiled as C code. The objective of the project is to provide a web-based, user-friendly multimedia interface that demonstrates the way in which patients may be managed in critical situations by applying advanced technological equipment and drug administration. The user will then be able to better appreciate the concepts involved by running the VisSim simulations. The evaluation group for the proposed software will be the Department of Nursing and Paramedic Sciences. About 200 nurses use the simulations every year for training purposes as part of their course requirements.

  6. Effects of Droplet Size on Intrusion of Sub-Surface Oil Spills

    NASA Astrophysics Data System (ADS)

    Adams, Eric; Chan, Godine; Wang, Dayang

    2014-11-01

    We explore effects of droplet size on droplet intrusion and transport in sub-surface oil spills. Negatively buoyant glass beads released continuously to a stratified ambient simulate oil droplets in a rising multiphase plume, and distributions of settled beads are used to infer signatures of surfacing oil. Initial tests used quiescent conditions, while ongoing tests simulate currents by towing the source and a bottom sled. Without current, deposited beads have a Gaussian distribution, with variance increasing with decreasing particle size. Distributions agree with a model assuming first order particle loss from an intrusion layer of constant thickness, and empirically determined flow rate. With current, deposited beads display a parabolic distribution similar to that expected from a source in uniform flow; we are currently comparing observed distributions with similar analytical models. Because chemical dispersants have been used to reduce oil droplet size, our study provides one measure of their effectiveness. Results are applied to conditions from the 'Deep Spill' field experiment, and the recent Deepwater Horizon oil spill, and are being used to provide "inner boundary conditions" for subsequent far field modeling of these events. This research was made possible by grants from Chevron Energy Technology Co., through the Chevron-MITEI University Partnership Program, and BP/The Gulf of Mexico Research Initiative, GISR.

  7. Security Assessment Simulation Toolkit (SAST) Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meitzler, Wayne D.; Ouderkirk, Steven J.; Hughes, Chad O.

    2009-11-15

    The Department of Defense Technical Support Working Group (DoD TSWG) investment in the Pacific Northwest National Laboratory (PNNL) Security Assessment Simulation Toolkit (SAST) research planted a technology seed that germinated into a suite of follow-on Research and Development (R&D) projects culminating in software that is used by multiple DoD organizations. The DoD TSWG technology transfer goal for SAST is already in progress. The Defense Information Systems Agency (DISA), the Defense-wide Information Assurance Program (DIAP), the Marine Corps, the Office of Naval Research (ONR) National Center for Advanced Secure Systems Research (NCASSR), and the Office of the Secretary of Defense International Exercise Program (OSD NII) are currently investing to take SAST to the next level. PNNL currently distributes the software to over 6 government organizations and 30 DoD users. For the past five DoD-wide Bulwark Defender exercises, the adoption of this new technology created an expanding role for SAST. In 2009, SAST was also used in the OSD NII International Exercise and is currently scheduled for use in 2010.

  8. Material Protection, Accounting, and Control Technologies (MPACT): Modeling and Simulation Roadmap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cipiti, Benjamin; Dunn, Timothy; Durbin, Samual

    The development of sustainable advanced nuclear fuel cycles is a long-term goal of the Office of Nuclear Energy’s (DOE-NE) Fuel Cycle Technologies program. The Material Protection, Accounting, and Control Technologies (MPACT) campaign is supporting research and development (R&D) of advanced instrumentation, analysis tools, and integration methodologies to meet this goal. This advanced R&D is intended to facilitate safeguards and security by design of fuel cycle facilities. The lab-scale demonstration of a virtual facility, distributed test bed, that connects the individual tools being developed at National Laboratories and university research establishments, is a key program milestone for 2020. These tools will consist of instrumentation and devices as well as computer software for modeling. To aid in framing its long-term goal, during FY16, a modeling and simulation roadmap is being developed for three major areas of investigation: (1) radiation transport and sensors, (2) process and chemical models, and (3) shock physics and assessments. For each area, current modeling approaches are described, and gaps and needs are identified.

  9. Data management and analysis for the Earth System Grid

    NASA Astrophysics Data System (ADS)

    Williams, D. N.; Ananthakrishnan, R.; Bernholdt, D. E.; Bharathi, S.; Brown, D.; Chen, M.; Chervenak, A. L.; Cinquini, L.; Drach, R.; Foster, I. T.; Fox, P.; Hankin, S.; Henson, V. E.; Jones, P.; Middleton, D. E.; Schwidder, J.; Schweitzer, R.; Schuler, R.; Shoshani, A.; Siebenlist, F.; Sim, A.; Strand, W. G.; Wilhelmi, N.; Su, M.

    2008-07-01

    The international climate community is expected to generate hundreds of petabytes of simulation data within the next five to seven years. This data must be accessed and analyzed by thousands of analysts worldwide in order to provide accurate and timely estimates of the likely impact of climate change on physical, biological, and human systems. Climate change is thus not only a scientific challenge of the first order but also a major technological challenge. In order to address this technological challenge, the Earth System Grid Center for Enabling Technologies (ESG-CET) has been established within the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC)-2 program, with support from the offices of Advanced Scientific Computing Research and Biological and Environmental Research. ESG-CET's mission is to provide climate researchers worldwide with access to the data, information, models, analysis tools, and computational capabilities required to make sense of enormous climate simulation datasets. Its specific goals are to (1) make data more useful to climate researchers by developing Grid technology that enhances data usability; (2) meet specific distributed database, data access, and data movement needs of national and international climate projects; (3) provide a universal and secure web-based data access portal for broad multi-model data collections; and (4) provide a wide-range of Grid-enabled climate data analysis tools and diagnostic methods to international climate centers and U.S. government agencies. Building on the successes of the previous Earth System Grid (ESG) project, which has enabled thousands of researchers to access tens of terabytes of data from a small number of ESG sites, ESG-CET is working to integrate a far larger number of distributed data providers, high-bandwidth wide-area networks, and remote computers in a highly collaborative problem-solving environment.

  10. An Attack-Resilient Middleware Architecture for Grid Integration of Distributed Energy Resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Yifu; Mendis, Gihan J.; He, Youbiao

    In recent years, the increasing penetration of Distributed Energy Resources (DERs) has made an impact on the operation of the electric power systems. In the grid integration of DERs, data acquisition systems and communications infrastructure are crucial technologies to maintain system economic efficiency and reliability. Since most of these generators are relatively small, dedicated communications investments for every generator are capital cost prohibitive. Combining real-time attack-resilient communications middleware with Internet of Things (IoTs) technologies allows for the use of existing infrastructure. In our paper, we propose an intelligent communication middleware that utilizes the Quality of Experience (QoE) metrics to complement the conventional Quality of Service (QoS) evaluation. Furthermore, our middleware employs deep learning techniques to detect and defend against congestion attacks. The simulation results illustrate the efficiency of our proposed communications middleware architecture.
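
    The following minimal sketch (not the paper's middleware or its deep-learning model) illustrates the underlying idea of classifying a link as congested or under attack from QoS/QoE-style features; a tiny logistic-regression classifier is trained on synthetic latency, jitter, and packet-loss samples, and every feature name and threshold is illustrative.

    ```python
    # Minimal sketch: classify "congested/attacked" vs. "normal" links from
    # QoS/QoE-style features with a tiny logistic-regression model trained on
    # synthetic data. Feature names and thresholds are illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)

    def synth_samples(n, attacked):
        # features: [latency_ms, jitter_ms, packet_loss_frac]
        if attacked:
            return np.column_stack([rng.normal(120, 30, n),
                                    rng.normal(25, 8, n),
                                    rng.uniform(0.05, 0.3, n)])
        return np.column_stack([rng.normal(30, 10, n),
                                rng.normal(5, 2, n),
                                rng.uniform(0.0, 0.02, n)])

    X = np.vstack([synth_samples(200, False), synth_samples(200, True)])
    y = np.concatenate([np.zeros(200), np.ones(200)])

    # standardize features, then fit logistic regression by gradient descent
    mu, sd = X.mean(0), X.std(0)
    Xs = (X - mu) / sd
    w, b = np.zeros(3), 0.0
    for _ in range(2000):
        p = 1.0 / (1.0 + np.exp(-(Xs @ w + b)))
        g = p - y
        w -= 0.1 * (Xs.T @ g) / len(y)
        b -= 0.1 * g.mean()

    probe = (np.array([[140.0, 30.0, 0.2]]) - mu) / sd   # a suspicious measurement
    print("attack probability:", (1.0 / (1.0 + np.exp(-(probe @ w + b))))[0])
    ```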

  11. Summary of Previous Chamber or Controlled Anthrax Studies and Recommendations for Possible Additional Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piepel, Gregory F.; Amidan, Brett G.; Morrow, Jayne B.

    2010-12-29

    This report and an associated Excel file(a) summarize the investigations and results of previous chamber and controlled studies(b) to characterize the performance of methods for collecting, storing and/or transporting, extracting, and analyzing samples from surfaces contaminated by Bacillus anthracis (BA) or related simulants. This report and the Excel file are the joint work of the Pacific Northwest National Laboratory (PNNL) and the National Institute of Standards and Technology (NIST) for the Department of Homeland Security, Science and Technology Directorate. The report was originally released as PNNL-SA-69338, Rev. 0 in November 2009 with limited distribution, but was subsequently cleared for release with unlimited distribution in this Rev. 1. Only minor changes were made to Rev. 0 to yield Rev. 1. A more substantial update (including summarizing data from other studies and more condensed summary tables of data) is underway.

  12. Statistical Maps of Ground Magnetic Disturbance Derived from Global Geospace Models

    NASA Astrophysics Data System (ADS)

    Rigler, E. J.; Wiltberger, M. J.; Love, J. J.

    2017-12-01

    Electric currents in space are the principal driver of magnetic variations measured at Earth's surface. These in turn induce geoelectric fields that present a natural hazard for technological systems like high-voltage power distribution networks. Modern global geospace models can reasonably simulate large-scale geomagnetic response to solar wind variations, but they are less successful at deterministic predictions of intense localized geomagnetic activity that most impacts technological systems on the ground. Still, recent studies have shown that these models can accurately reproduce the spatial statistical distributions of geomagnetic activity, suggesting that their physics are largely correct. Since the magnetosphere is a largely externally driven system, most model-measurement discrepancies probably arise from uncertain boundary conditions. So, with realistic distributions of solar wind parameters to establish its boundary conditions, we use the Lyon-Fedder-Mobarry (LFM) geospace model to build a synthetic multivariate statistical model of gridded ground magnetic disturbance. From this, we analyze the spatial modes of geomagnetic response, regress on available measurements to fill in unsampled locations on the grid, and estimate the global probability distribution of extreme magnetic disturbance. The latter offers a prototype geomagnetic "hazard map", similar to those used to characterize better-known geophysical hazards like earthquakes and floods.
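
    The mode/regression idea can be illustrated with the following sketch (synthetic fields only, not the LFM-based model): an ensemble of gridded disturbances is decomposed into spatial modes with an SVD (EOF analysis), and the leading-mode amplitudes are regressed onto a handful of "measured" grid points to fill in unsampled locations.

    ```python
    # Minimal sketch of the spatial-mode/regression idea; all fields are synthetic.
    import numpy as np

    rng = np.random.default_rng(1)
    nlat, nlon, nens = 20, 40, 300
    lat = np.linspace(50, 80, nlat)[:, None]
    lon = np.linspace(0, 360, nlon)[None, :]

    # synthetic ensemble: two broad spatial patterns with random amplitudes + noise
    p1 = np.exp(-((lat - 68) / 6) ** 2) * np.cos(np.radians(lon))
    p2 = np.exp(-((lat - 62) / 8) ** 2) * np.sin(2 * np.radians(lon))
    amps = rng.normal(size=(nens, 2)) * [300.0, 150.0]     # nT-scale amplitudes
    ens = amps[:, 0, None, None] * p1 + amps[:, 1, None, None] * p2
    ens = ens + rng.normal(0, 20, size=(nens, nlat, nlon))

    # EOFs of the ensemble (rows = ensemble members, columns = grid points)
    A = ens.reshape(nens, -1)
    A -= A.mean(0)
    U, S, Vt = np.linalg.svd(A, full_matrices=False)
    modes = Vt[:2]                                          # leading spatial modes

    # regress mode amplitudes onto 15 randomly chosen "station" observations
    obs_idx = rng.choice(A.shape[1], 15, replace=False)
    truth = A[0]                                            # pretend member 0 is "today"
    coef, *_ = np.linalg.lstsq(modes[:, obs_idx].T, truth[obs_idx], rcond=None)
    recon = coef @ modes                                    # filled-in full grid

    print("reconstruction correlation:", np.corrcoef(truth, recon)[0, 1].round(3))
    ```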

  13. The implementation of microstructural and heat treatment models to development of forming technology of critical aluminum-alloy parts

    NASA Astrophysics Data System (ADS)

    Biba, Nikolay; Alimov, Artem; Shitikov, Andrey; Stebunov, Sergei

    2018-05-01

    The demand for high-performance and energy-efficient transportation systems has boosted interest in lightweight design solutions. To achieve maximum weight reductions, it is not enough just to replace steel parts with their aluminium analogues; it is necessary to change the entire concept of vehicle design. In this case we must develop methods for manufacturing a variety of critical parts with unusual and difficult-to-produce shapes. The mechanical properties of the material in these parts must also be optimised and tightly controlled to provide the best distribution within the part volume. The only way to achieve these goals is to implement technology development methods based on simulation of the entire manufacturing chain, from preparing a billet through the forming operations and heat treatment of the product. The paper presents an approach to such technology development. The simulation of the technological chain starts with extruding a round billet. Depending on the extrusion process parameters, the billet can have different levels of material workout and variation of grain size throughout the volume. After extrusion, the billet is formed into the required shape in a forging process. The main requirements at this stage are to get the near net shape of the product without defects and to provide a proper configuration of grain flow that strengthens the product in the most critical direction. Then the product undergoes solution treatment, quenching and ageing. The simulation of all these stages is performed by the QForm FEM code, which provides coupled thermo-mechanical simulation of material deformation during extrusion and forging. To provide microstructure and heat treatment simulation, special subroutines have been developed by the authors. The proposed approach is illustrated by an industrial case study.

  14. Numerical simulations of epitaxial growth process in MOVPE reactor as a tool for design of modern semiconductors for high power electronics

    NASA Astrophysics Data System (ADS)

    Skibinski, Jakub; Caban, Piotr; Wejrzanowski, Tomasz; Kurzydlowski, Krzysztof J.

    2014-10-01

    In the present study, numerical simulations of epitaxial growth of gallium nitride in the AIX-200/4RF-S Metal Organic Vapor Phase Epitaxy reactor are addressed. Epitaxial growth means crystal growth that progresses while inheriting the laminar structure and the orientation of the substrate crystals. One of the technological problems is obtaining a homogeneous growth rate over the main deposit area. Since there are many factors influencing the reaction over the crystal area, such as temperature, pressure, gas flow or reactor geometry, it is difficult to design an optimal process. Because it is impossible to determine experimentally the exact distribution of heat and mass transfer inside the reactor during crystal growth, modeling is the only way to understand the process precisely. Numerical simulations make it possible to understand the epitaxial process through calculation of the heat and mass transfer distribution during growth of gallium nitride. Including chemical reactions in the numerical model makes it possible to calculate the growth rate on the substrate and to estimate the optimal process conditions for obtaining the most homogeneous product.

  15. US Geological Survey National Computer Technology Meeting; Proceedings, Phoenix, Arizona, November 14-18, 1988

    USGS Publications Warehouse

    Balthrop, Barbara H.; Terry, J.E.

    1991-01-01

    The U.S. Geological Survey National Computer Technology Meetings (NCTM) are sponsored by the Water Resources Division and provide a forum for the presentation of technical papers and the sharing of ideas or experiences related to computer technology. This report serves as a proceedings of the meeting held in November 1988 at the Crescent Hotel in Phoenix, Arizona. The meeting was attended by more than 200 technical and managerial people representing all Divisions of the U.S. Geological Survey. Scientists in every Division of the U.S. Geological Survey rely heavily upon state-of-the-art computer technology (both hardware and software). Today the goals of each Division are pursued in an environment where high-speed computers, distributed communications, distributed data bases, high-technology input/output devices, and very sophisticated simulation tools are used regularly. Therefore, information transfer and the sharing of advances in technology are very important issues that must be addressed regularly. This report contains complete papers and abstracts of papers that were presented at the 1988 NCTM. The report is divided into topical sections that reflect common areas of interest and application. In each section, papers are presented first, followed by abstracts. For these proceedings, the publication of a complete paper or only an abstract was at the discretion of the author, although complete papers were encouraged. Some papers presented at the 1988 NCTM are not published in these proceedings.

  16. Research on the application of vehicle network in optimization of the automobile supply chain

    NASA Astrophysics Data System (ADS)

    Jing, Xuelei; Jia, Baoxian

    2017-09-01

    Intelligent transportation, environmental monitoring, goods tracking, and the smart grid are four key application areas of vehicle networking with great development potential, and vehicle networking is a core supporting technology for many of these applications. In order to improve the adaptability of data distribution so that it can be used in urban, rural, highway and other vehicle networking scenarios, this study examines technical means of accurately estimating the parameters that characterize different vehicle network scenarios, so that different scenarios can adopt different distribution strategies. Taking into account the limited resources available for data distribution over the vehicle network, the paper uses a customer-oriented idea to optimize the simulation.

  17. Online monitoring of seismic damage in water distribution systems

    NASA Astrophysics Data System (ADS)

    Liang, Jianwen; Xiao, Di; Zhao, Xinhua; Zhang, Hongwei

    2004-07-01

    Water distribution systems can be damaged by earthquakes, and the resulting damage cannot easily be located, especially immediately after the events. Experience from past earthquakes shows that accurate and quick location of seismic damage is critical to the emergency response of water distribution systems. This paper develops a methodology to locate seismic damage -- multiple breaks in a water distribution system -- by monitoring water pressure online at a limited number of positions in the system. For the purpose of online monitoring, supervisory control and data acquisition (SCADA) technology is well suited. A neural network-based inverse analysis method is constructed for locating the seismic damage based on the variation of water pressure. The neural network is trained using analytically simulated data from the water distribution system, and validated using a set of data never used in the training. It is found that the methodology provides an effective and practical way to accurately and quickly locate seismic damage in a water distribution system.
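
    A minimal sketch of a neural-network inverse analysis of this kind is shown below (not the paper's hydraulic model or network): a small classifier is trained on crude synthetic pressure-drop signatures to return the index of the broken pipe, with all signatures and dimensions invented for illustration.

    ```python
    # Minimal sketch: map simulated pressure drops at a few monitored nodes to the
    # index of the broken pipe. The "hydraulic model" is a crude synthetic stand-in
    # for analytically simulated training data.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(2)
    n_pipes, n_sensors = 12, 4

    # each pipe break produces a characteristic pressure-drop signature at the sensors
    signatures = rng.uniform(0.1, 1.0, size=(n_pipes, n_sensors))

    def simulate(break_id, noise=0.05):
        return signatures[break_id] + rng.normal(0, noise, n_sensors)

    X = np.array([simulate(k) for k in range(n_pipes) for _ in range(200)])
    y = np.repeat(np.arange(n_pipes), 200)

    clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
    clf.fit(X, y)

    # "measured" pressure variation after an earthquake, never seen in training
    measured = simulate(break_id=7, noise=0.08)
    print("predicted damaged pipe:", clf.predict([measured])[0])
    ```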

  18. Simulating Coupling Complexity in Space Plasmas: First Results from a new code

    NASA Astrophysics Data System (ADS)

    Kryukov, I.; Zank, G. P.; Pogorelov, N. V.; Raeder, J.; Ciardo, G.; Florinski, V. A.; Heerikhuisen, J.; Li, G.; Petrini, F.; Shematovich, V. I.; Winske, D.; Shaikh, D.; Webb, G. M.; Yee, H. M.

    2005-12-01

    The development of codes that embrace 'coupling complexity' via the self-consistent incorporation of multiple physical scales and multiple physical processes in models has been identified by the NRC Decadal Survey in Solar and Space Physics as a crucial necessary development in simulation/modeling technology for the coming decade. The National Science Foundation, through its Information Technology Research (ITR) Program, is supporting our efforts to develop a new class of computational code for plasmas and neutral gases that integrates multiple scales and multiple physical processes and descriptions. We are developing a highly modular, parallelized, scalable code that incorporates multiple scales by synthesizing 3 simulation technologies: 1) Computational fluid dynamics (hydrodynamics or magnetohydrodynamics, MHD) for the large-scale plasma; 2) direct Monte Carlo simulation of atoms/neutral gas, and 3) transport code solvers to model highly energetic particle distributions. We are constructing the code so that a fourth simulation technology, hybrid simulations for microscale structures and particle distributions, can be incorporated in future work, but for the present, this aspect will be addressed at a test-particle level. This synthesis will provide a computational tool that will advance our understanding of the physics of neutral and charged gases enormously. Besides making major advances in basic plasma physics and neutral gas problems, this project will address 3 Grand Challenge space physics problems that reflect our research interests: 1) To develop a temporal global heliospheric model which includes the interaction of solar and interstellar plasma with neutral populations (hydrogen, helium, etc., and dust), test-particle kinetic pickup ion acceleration at the termination shock, anomalous cosmic ray production, interaction with galactic cosmic rays, while incorporating the time variability of the solar wind and the solar cycle. 2) To develop a coronal mass ejection and interplanetary shock propagation model for the inner and outer heliosphere, including, at a test-particle level, wave-particle interactions and particle acceleration at traveling shock waves and compression regions. 3) To develop an advanced Geospace General Circulation Model (GGCM) capable of realistically modeling space weather events, in particular the interaction with CMEs and geomagnetic storms. Furthermore, by implementing scalable run-time supports and sophisticated off- and on-line prediction algorithms, we anticipate important advances in the development of automatic and intelligent system software to optimize a wide variety of 'embedded' computations on parallel computers. Finally, public domain MHD and hydrodynamic codes have had a transforming effect on space and astrophysics. We expect that our new generation, open source, public domain multi-scale code will have a similar transformational effect in a variety of disciplines, opening up new classes of problems to physicists and engineers alike.

  19. Prospects of second generation artificial intelligence tools in calibration of chemical sensors.

    PubMed

    Braibanti, Antonio; Rao, Rupenaguntla Sambasiva; Ramam, Veluri Anantha; Rao, Gollapalli Nageswara; Rao, Vaddadi Venkata Panakala

    2005-05-01

    Multivariate data-driven calibration models with neural networks (NNs) are developed for binary (Cu++ and Ca++) and quaternary (K+, Ca++, NO3- and Cl-) ion-selective electrode (ISE) data. The response profiles of ISEs with concentrations are non-linear and sub-Nernstian. This task represents function approximation of multi-variate, multi-response, correlated, non-linear data with unknown noise structure, i.e. multi-component calibration/prediction in chemometric parlance. Radial basis function (RBF) and Fuzzy-ARTMAP-NN models implemented in the software packages TRAJAN and Professional II are employed for the calibration. The optimum NN models reported are based on residuals in concentration space. Being a data-driven information technology, NN does not require a model, a prior or posterior distribution of the data, or a noise structure. Missing information, spikes or newer trends in different concentration ranges can be modeled through novelty detection. Two simulated data sets generated from mathematical functions are modeled as a function of the number of data points and network parameters such as the number of neurons and nearest neighbors. The success of RBF and Fuzzy-ARTMAP-NNs in developing adequate calibration models for experimental data and function approximation models for more complex simulated data sets establishes AI2 (artificial intelligence, 2nd generation) as a promising technology in quantitation.
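
    The RBF idea can be sketched as follows (a generic Gaussian-RBF least-squares fit, not the TRAJAN or Professional II implementations); the electrode response below is synthetic and single-ion, purely for illustration.

    ```python
    # Minimal sketch of radial basis function (RBF) calibration: fit Gaussian RBF
    # weights by least squares to map a simulated non-linear, sub-Nernstian
    # electrode response back to concentration.
    import numpy as np

    rng = np.random.default_rng(3)

    # synthetic single-ion calibration: potential (mV) vs. log10(concentration)
    logc = np.linspace(-5, -1, 60)
    emf = 25.0 * logc + 4.0 * np.tanh(logc + 3) + rng.normal(0, 0.3, logc.size)

    # Gaussian RBF design matrix with centres spread over the measured potentials
    centres = np.linspace(emf.min(), emf.max(), 10)
    width = (emf.max() - emf.min()) / 10

    def design(x):
        return np.exp(-((x[:, None] - centres[None, :]) / width) ** 2)

    w, *_ = np.linalg.lstsq(design(emf), logc, rcond=None)   # calibration weights

    # predict concentration for a new measured potential
    new_emf = np.array([-70.0])
    print("predicted log10 concentration:", (design(new_emf) @ w)[0])
    ```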

  20. Virtual patient simulator for distributed collaborative medical education.

    PubMed

    Caudell, Thomas P; Summers, Kenneth L; Holten, Jim; Hakamata, Takeshi; Mowafi, Moad; Jacobs, Joshua; Lozanoff, Beth K; Lozanoff, Scott; Wilks, David; Keep, Marcus F; Saiki, Stanley; Alverson, Dale

    2003-01-01

    Project TOUCH (Telehealth Outreach for Unified Community Health; http://hsc.unm.edu/touch) investigates the feasibility of using advanced technologies to enhance education in an innovative problem-based learning format currently being used in medical school curricula, applying specific clinical case models, and deploying to remote sites/workstations. The University of New Mexico's School of Medicine and the John A. Burns School of Medicine at the University of Hawai'i face similar health care challenges in providing and delivering services and training to remote and rural areas. Recognizing that health care needs are local and require local solutions, both states are committed to improving health care delivery to their unique populations by sharing information and experiences through emerging telehealth technologies by using high-performance computing and communications resources. The purpose of this study is to describe the deployment of a problem-based learning case distributed over the National Computational Science Alliance's Access Grid. Emphasis is placed on the underlying technical components of the TOUCH project, including the virtual reality development tool Flatland, the artificial intelligence-based simulation engine, the Access Grid, high-performance computing platforms, and the software that connects them all. In addition, educational and technical challenges for Project TOUCH are identified. Copyright 2003 Wiley-Liss, Inc.

  1. A design of wireless sensor networks for a power quality monitoring system.

    PubMed

    Lim, Yujin; Kim, Hak-Man; Kang, Sanggil

    2010-01-01

    Power grids deal with the business of generation, transmission, and distribution of electric power. Recently, interest in power quality in electrical distribution systems has increased rapidly. In Korea, the communication network to deliver voltage, current, and temperature measurements gathered from pole transformers to remote monitoring centers employs cellular mobile technology. Due to the high cost of cellular mobile technology, power quality monitoring measurements are limited and data-gathering intervals are large. This causes difficulties in providing the power quality monitoring service. To alleviate these problems, in this paper we present a communication infrastructure to provide low-cost, reliable data delivery. The communication infrastructure consists of wired connections between substations and monitoring centers, and wireless connections between pole transformers and substations. For the wireless connection, we employ a wireless sensor network and design its corresponding data forwarding protocol to improve the quality of data delivery. For the design, we adopt a tree-based data forwarding protocol in order to customize the distribution pattern of the power quality information. We verify the performance of the proposed data forwarding protocol quantitatively using the NS-2 network simulator.
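
    A minimal sketch of tree-based forwarding (not the paper's NS-2 protocol) is given below: a shortest-path tree rooted at a substation gateway is built with BFS and a pole-transformer reading is forwarded hop by hop toward the root; the topology and node names are hypothetical.

    ```python
    # Minimal sketch: build a shortest-path tree rooted at the substation gateway
    # with BFS, then forward each pole-transformer measurement toward the root.
    from collections import deque

    links = {                     # hypothetical pole-transformer radio neighbourhood
        "gw": ["n1", "n2"],
        "n1": ["gw", "n3", "n4"],
        "n2": ["gw", "n4"],
        "n3": ["n1"],
        "n4": ["n1", "n2", "n5"],
        "n5": ["n4"],
    }

    def build_tree(root):
        parent, q = {root: None}, deque([root])
        while q:
            u = q.popleft()
            for v in links[u]:
                if v not in parent:          # first visit = shortest hop count
                    parent[v] = u
                    q.append(v)
        return parent

    def forward(node, reading, parent):
        path = [node]
        while parent[node] is not None:      # climb the tree to the gateway
            node = parent[node]
            path.append(node)
        return path, reading

    tree = build_tree("gw")
    path, reading = forward("n5", {"voltage_V": 228.4, "temp_C": 41.0}, tree)
    print("route:", " -> ".join(path), "| delivered:", reading)
    ```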

  2. Reconfiguring practice: the interdependence of experimental procedure and computing infrastructure in distributed earthquake engineering.

    PubMed

    De La Flor, Grace; Ojaghi, Mobin; Martínez, Ignacio Lamata; Jirotka, Marina; Williams, Martin S; Blakeborough, Anthony

    2010-09-13

    When transitioning local laboratory practices into distributed environments, the interdependent relationship between experimental procedure and the technologies used to execute experiments becomes highly visible and a focal point for system requirements. We present an analysis of ways in which this reciprocal relationship is reconfiguring laboratory practices in earthquake engineering as a new computing infrastructure is embedded within three laboratories in order to facilitate the execution of shared experiments across geographically distributed sites. The system has been developed as part of the UK Network for Earthquake Engineering Simulation e-Research project, which links together three earthquake engineering laboratories at the universities of Bristol, Cambridge and Oxford. We consider the ways in which researchers have successfully adapted their local laboratory practices through the modification of experimental procedure so that they may meet the challenges of coordinating distributed earthquake experiments.

  3. Predicting the effectiveness of depth-based technologies to prevent salmon lice infection using a dispersal model.

    PubMed

    Samsing, Francisca; Johnsen, Ingrid; Stien, Lars Helge; Oppedal, Frode; Albretsen, Jon; Asplin, Lars; Dempster, Tim

    2016-07-01

    Salmon lice are one of the major parasitic problems affecting wild and farmed salmonid species. The planktonic larval stages of these marine parasites can survive for extended periods without a host and are transported long distances by water masses. Salmon lice larvae have limited swimming capacity, but can influence their horizontal transport by vertical positioning. Here, we adapted a coupled biological-physical model to calculate the distribution of farm-produced salmon lice (Lepeophtheirus salmonis) during winter on the southwest coast of Norway. We tested 4 model simulations to see which best represented empirical data from two sources: (1) observed lice infection levels reported by farms; and (2) experimental data from a vertical exposure experiment where fish were forced to swim at different depths with a lice-barrier technology. The model simulations tested differed in the development time to the infective stage (35 or 50°-days) and in whether temperature-controlled vertical behaviour of the early planktonic (naupliar) stages was included. The best model fit occurred with a 35°-day development time to the infective stage and temperature-controlled vertical behaviour. We applied this model to predict the effectiveness of depth-based preventive lice-barrier technologies. Both simulated and experimental data revealed that hindering fish from swimming close to the surface efficiently reduced lice infection. Moreover, while our model simulation predicted that this preventive technology is widely applicable, its effectiveness will depend on environmental conditions. Low salinity surface waters reduce the effectiveness of this technology because salmon lice avoid these conditions, and can encounter the fish as they sink deeper in the water column. Correctly parameterized and validated salmon lice dispersal models can predict the impact of preventive approaches to control this parasite and become an essential tool in lice management strategies. Copyright © 2016 Elsevier B.V. All rights reserved.
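
    The degree-day and vertical-behaviour ingredients can be sketched as follows (a toy particle model, not the coupled biological-physical model used in the paper): the temperature profile, currents, and behaviour rule are illustrative assumptions, while the 35 degree-day development threshold follows the best-fitting simulation described above.

    ```python
    # Minimal sketch: particles accumulate degree-days toward the infective stage
    # while their depth, and hence the current advecting them, depends on a crude
    # temperature profile. All temperatures and currents are illustrative.
    import numpy as np

    rng = np.random.default_rng(4)
    n, days, dt = 500, 10, 1.0 / 24.0            # particles, duration, 1-hour steps

    def temperature(depth_m):                     # warmer at depth in winter
        return 6.0 + 0.2 * depth_m

    def current(depth_m):                         # surface layer moves faster
        return np.where(depth_m < 5.0, 0.15, 0.05)   # m/s along-coast

    depth = rng.uniform(0, 10, n)
    x = np.zeros(n)                               # along-coast position (m)
    degree_days = np.zeros(n)

    for _ in range(int(days / dt)):
        t = temperature(depth)
        # temperature-controlled behaviour: nauplii sink toward warmer water
        depth = np.clip(depth + np.where(t < 7.0, 0.5, -0.5) * dt, 0.0, 20.0)
        x += current(depth) * 3600.0 * 24.0 * dt  # advection over one step
        degree_days += t * dt

    infective = degree_days >= 35.0               # 35 degree-day threshold
    print(f"{infective.mean():.0%} infective; mean drift {x[infective].mean()/1000:.1f} km")
    ```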

  4. Computer simulation of low-temperature composites sintering processes for additive technologies

    NASA Astrophysics Data System (ADS)

    Tovpinets, A. O.; Leytsin, V. N.; Dmitrieva, M. A.

    2017-12-01

    This work investigates the impact of raw mixture component characteristics on the structure formation of low-temperature composites during the sintering process. The results show that determining the structure of the initial compacts obtained after thermal destruction of the polymer binder makes it possible to quantify the concentrations of the main components and of the refractory crystalline product of thermal destruction. Accounting for the distribution of this refractory product allows the forecast of thermal stresses in the matrix of the sintered composite to be refined. The presented results can be considered a basis for optimizing the initial compositions of multilayer low-temperature composites produced by additive technologies.

  5. Why Isn't There More High-fidelity Simulation Training in Diagnostic Radiology? Results of a Survey of Academic Radiologists.

    PubMed

    Cook, Tessa S; Hernandez, Jessica; Scanlon, Mary; Langlotz, Curtis; Li, Chun-Der L

    2016-07-01

    Despite its increasing use in training other medical specialties, high-fidelity simulation to prepare diagnostic radiology residents for call remains an underused educational resource. To attempt to characterize the barriers to adoption of this technology, we conducted a survey of academic radiologists and radiology trainees. An Institutional Review Board-approved survey was distributed to Association of University Radiologists members via e-mail. Survey results were collected electronically, tabulated, and analyzed. A total of 68 survey responses representing 51 programs were received from program directors, department chairs, chief residents, and program administrators. The most common form of educational activity for resident call preparation was lectures. Faculty-supervised "baby call" was also widely reported. Actual simulated call environments were quite rare, with only three programs reporting this type of educational activity. Barriers to the use of simulation include lack of faculty time, lack of faculty expertise, and lack of perceived need. High-fidelity simulation can be used to mimic the high-stress, high-stakes independent call environment that the typical radiology resident encounters during the second year of training, and can provide objective data for program directors to assess the Accreditation Council for Graduate Medical Education milestones. We predict that this technology will begin to supplement traditional diagnostic radiology teaching methods and to improve patient care and safety in the next decade. Published by Elsevier Inc.

  6. Integrated Mission Simulation (IMSim): Multiphase Initialization Design with Late Joiners, Rejoiners and Federation Save & Restore

    NASA Technical Reports Server (NTRS)

    Dexter, Daniel E.; Varesic, Tony E.

    2015-01-01

    This document describes the design of the Integrated Mission Simulation (IMSim) federate multiphase initialization process. The main goal of multiphase initialization is to allow for data interdependencies during the federate initialization process. IMSim uses the High Level Architecture (HLA) IEEE 1516 [1] to provide the communication and coordination between the distributed parts of the simulation. These are implemented using the Runtime Infrastructure (RTI) from Pitch Technologies AB. This document assumes a basic understanding of IEEE 1516 HLA and C++ programming. In addition, there are several subtle points in working with IEEE 1516 and the Pitch RTI that need to be understood, which are covered in Appendix A. Please note that the C++ code samples shown in this document are for the IEEE 1516-2000 standard.
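
    As a language-neutral illustration of the multiphase idea (hypothetical classes, not the IEEE 1516 or Pitch RTI API), the sketch below has three federates publish data in each phase and then wait at a barrier, a stand-in for an HLA synchronization point, so that data interdependencies can be resolved before the next phase begins.

    ```python
    # Minimal sketch of multiphase initialization: publish, then synchronize, so
    # later phases can read what peers published earlier. threading.Barrier is a
    # stand-in for an HLA synchronization point; all names are hypothetical.
    import threading

    PHASES = ["publish_initial_state", "resolve_dependencies", "confirm_ready"]
    shared_state = {}
    lock = threading.Lock()

    class Federate(threading.Thread):
        def __init__(self, name, barrier):
            super().__init__()
            self.name, self.barrier = name, barrier

        def run(self):
            for phase in PHASES:
                with lock:
                    # each federate can read what others published in earlier phases
                    peers = {k: v for k, v in shared_state.items() if k != self.name}
                    shared_state[self.name] = f"{phase} done (saw {len(peers)} peers)"
                self.barrier.wait()          # stand-in for an HLA sync point

    barrier = threading.Barrier(3)
    feds = [Federate(n, barrier) for n in ("vehicle", "environment", "ground_ops")]
    for f in feds: f.start()
    for f in feds: f.join()
    print(shared_state)
    ```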

  7. ORAC: a molecular dynamics simulation program to explore free energy surfaces in biomolecular systems at the atomistic level.

    PubMed

    Marsili, Simone; Signorini, Giorgio Federico; Chelli, Riccardo; Marchi, Massimo; Procacci, Piero

    2010-04-15

    We present the new release of the ORAC engine (Procacci et al., Comput Chem 1997, 18, 1834), a FORTRAN suite to simulate complex biosystems at the atomistic level. The previous release of the ORAC code included multiple time-step integration, the smooth particle mesh Ewald method, and constant-pressure and constant-temperature simulations. The present release has been supplemented with the most advanced techniques for enhanced sampling in atomistic systems, including replica exchange with solute tempering, metadynamics and steered molecular dynamics. All these computational technologies have been implemented for parallel architectures using the standard MPI communication protocol. ORAC is an open-source program distributed free of charge under the GNU general public license (GPL) at http://www.chim.unifi.it/orac. 2009 Wiley Periodicals, Inc.
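
    For example, the replica-exchange swap criterion used by parallel-tempering schemes can be written in a few lines; the sketch below uses random placeholder energies and a temperature ladder chosen for illustration, not output of the ORAC engine.

    ```python
    # Minimal sketch of the replica-exchange (parallel tempering) swap criterion;
    # energies are random placeholders, not output of any real simulation.
    import math, random

    random.seed(0)
    kB = 0.0019872041                      # kcal/(mol K)

    temps = [300.0, 320.0, 340.0, 360.0]   # replica temperature ladder (K)
    energies = [random.uniform(-1200.0, -1100.0) for _ in temps]  # potential energies

    def try_swap(i, j):
        """Metropolis acceptance for exchanging configurations of replicas i and j."""
        beta_i, beta_j = 1.0 / (kB * temps[i]), 1.0 / (kB * temps[j])
        delta = (beta_i - beta_j) * (energies[i] - energies[j])
        if delta >= 0 or random.random() < math.exp(delta):
            energies[i], energies[j] = energies[j], energies[i]
            return True
        return False

    for i in range(len(temps) - 1):
        print(f"swap {temps[i]:.0f}K <-> {temps[i+1]:.0f}K:", try_swap(i, i + 1))
    ```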

  8. Active illuminated space object imaging and tracking simulation

    NASA Astrophysics Data System (ADS)

    Yue, Yufang; Xie, Xiaogang; Luo, Wen; Zhang, Feizhou; An, Jianzhu

    2016-10-01

    Ground-based optical imaging simulation of a space target in orbit and its extraction under laser illumination conditions are discussed. Based on the orbit and corresponding attitude of a satellite, a 3D imaging rendering was built. A general simulation platform was developed that adapts to different 3D satellite models and to the relative position between the satellite and the ground detector system. A unified parallel projection technology is proposed in this paper. Furthermore, we note that the random optical distribution under laser-illuminated conditions is a challenge for object discrimination, with the strong randomness of the active-illumination laser speckle as the primary factor. The combined effects of a multi-frame accumulation process and several tracking methods, such as mean-shift tracking, contour poid, and filter deconvolution, were simulated. Comparison of the results shows that the combination of multi-frame accumulation and contour poid is recommended for laser actively illuminated images, offering high tracking precision and stability across multiple object attitudes.
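
    The multi-frame accumulation step can be illustrated with the following sketch (a synthetic Gaussian target with multiplicative speckle, not the paper's imaging chain): several speckled frames are averaged and an intensity-weighted centroid is taken as a simple stand-in for the contour-based lock.

    ```python
    # Minimal sketch: average several speckled frames of a synthetic target to
    # suppress illumination randomness, then take the intensity-weighted centroid.
    import numpy as np

    rng = np.random.default_rng(5)
    size, true_pos, n_frames = 64, (40, 22), 16

    yy, xx = np.mgrid[0:size, 0:size]
    target = np.exp(-(((yy - true_pos[0]) ** 2 + (xx - true_pos[1]) ** 2) / 30.0))

    def speckled_frame():
        speckle = rng.exponential(1.0, (size, size))      # multiplicative speckle
        return target * speckle + rng.normal(0, 0.05, (size, size))

    accum = np.mean([speckled_frame() for _ in range(n_frames)], axis=0)
    mask = accum > 0.3 * accum.max()                      # keep the bright core
    weights = accum * mask
    cy = (yy * weights).sum() / weights.sum()
    cx = (xx * weights).sum() / weights.sum()
    print(f"estimated centroid: ({cy:.1f}, {cx:.1f}) vs. true {true_pos}")
    ```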

  9. Failure Analysis of a Sheet Metal Blanking Process Based on Damage Coupling Model

    NASA Astrophysics Data System (ADS)

    Wen, Y.; Chen, Z. H.; Zang, Y.

    2013-11-01

    In this paper, a blanking process of sheet metal is studied by numerical simulation and experimental observation. The effects of varying technological parameters related to the quality of products are investigated. An elastoplastic constitutive equation accounting for isotropic ductile damage is implemented into the finite element code ABAQUS with a user-defined material subroutine UMAT. The simulations of the damage evolution and ductile fracture in a sheet metal blanking process have been carried out by the FEM. In order to guarantee computation accuracy and avoid numerical divergence during large plastic deformation, a specified remeshing technique is successively applied when severe element distortion occurs. In the simulation, the evolution of damage at different stages of the blanking process has been evaluated, and the damage distributions obtained from simulation are in good agreement with the experimental results.
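
    For orientation, a Lemaitre-type isotropic damage update of the general kind used in such analyses is sketched below in plain Python; the constants and loading history are invented, and this is not the paper's ABAQUS/UMAT implementation.

    ```python
    # Minimal sketch of a generic Lemaitre-type isotropic ductile damage update:
    # damage D grows with accumulated plastic strain once a threshold is passed,
    # and the effective stress is sigma / (1 - D). All constants are made up.
    S, s_exp, p_threshold, D_crit = 2.0, 1.0, 0.05, 0.3   # illustrative constants

    def damage_increment(Y, p, dp, D):
        """Y: energy release rate, p: accumulated plastic strain, dp: its increment."""
        if p < p_threshold or D >= D_crit:
            return 0.0
        return (Y / S) ** s_exp * dp

    # drive the model with a fictitious monotonic plastic strain history
    D, p = 0.0, 0.0
    for _ in range(200):
        dp = 0.005
        sigma_eff = 300.0 / (1.0 - D)            # effective (undamaged-area) stress, MPa
        Y = sigma_eff ** 2 / (2 * 70000.0)       # crude energy release rate, E ~ 70 GPa
        D = min(D + damage_increment(Y, p, dp, D), D_crit)
        p += dp
        if D >= D_crit:
            print(f"fracture initiation predicted at p = {p:.3f}")
            break
    ```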

  10. The Emergence of Agent-Based Technology as an Architectural Component of Serious Games

    NASA Technical Reports Server (NTRS)

    Phillips, Mark; Scolaro, Jackie; Scolaro, Daniel

    2010-01-01

    The evolution of games as an alternative to traditional simulations in the military context has been gathering momentum over the past five years, even though the exploration of their use in the serious sense has been ongoing since the mid-nineties. Much of the focus has been on the aesthetics of the visuals provided by the core game engine as well as the artistry provided by talented development teams to produce not only breathtaking artwork, but highly immersive game play. Consideration of game technology is now so much a part of the modeling and simulation landscape that it is becoming difficult to distinguish traditional simulation solutions from game-based approaches. But games have yet to provide the much needed interactive free play that has been the domain of semi-autonomous forces (SAF). The component-based middleware architecture that game engines provide promises a great deal in terms of options for the integration of agent solutions to support the development of non-player characters that engage the human player without the deterministic nature of scripted behaviors. However, there are a number of hard-learned lessons on the modeling and simulation side of the equation that game developers have yet to learn, such as: correlation of heterogeneous systems, scalability of both terrain and numbers of non-player entities, and the bi-directional nature of simulation to game interaction provided by Distributed Interactive Simulation (DIS) and High Level Architecture (HLA).

  11. Collaborative enterprise and virtual prototyping (CEVP): a product-centric approach to distributed simulation

    NASA Astrophysics Data System (ADS)

    Saunders, Vance M.

    1999-06-01

    The downsizing of the Department of Defense (DoD) and the associated reduction in budgets have re-emphasized the need for commonality, reuse, and standards with respect to the way DoD does business. DoD has implemented significant changes in how it buys weapon systems. The new emphasis is on concurrent engineering with Integrated Product and Process Development and collaboration with Integrated Product Teams. The new DoD vision includes Simulation Based Acquisition (SBA), a process supported by robust, collaborative use of simulation technology that is integrated across acquisition phases and programs. This paper discusses the Air Force Research Laboratory's efforts to use Modeling and Simulation (M&S) resources within a Collaborative Enterprise Environment to support SBA and other Collaborative Enterprise and Virtual Prototyping (CEVP) applications. The paper will discuss four technology areas: (1) a Processing Ontology that defines a hierarchically nested set of collaboration contexts needed to organize and support multi-disciplinary collaboration using M&S, (2) a partial taxonomy of intelligent agents needed to manage different M&S resource contributions to advancing the state of product development, (3) an agent-based process for interfacing disparate M&S resources into a CEVP framework, and (4) a Model-View-Control-based approach to defining 'a new way of doing business' for users of CEVP frameworks/systems.

  12. Are surgery training programs ready for virtual reality? A survey of program directors in general surgery.

    PubMed

    Haluck, R S; Marshall, R L; Krummel, T M; Melkonian, M G

    2001-12-01

    The use of advanced technology, such as virtual environments and computer-based simulators (VR/CBS), in training has been well established by both industry and the military. In contrast the medical profession, including surgery, has been slow to incorporate such technology in its training. In an attempt to identify factors limiting the regular incorporation of this technology into surgical training programs, a survey was developed and distributed to all general surgery program directors in the United States. A 22-question survey was sent to 254 general surgery program directors. The survey was designed to reflect attitudes of the program directors regarding the use of computer-based simulation in surgical training. Questions were scaled from 1 to 5 with 1 = strongly disagree and 5 = strongly agree. A total of 139 responses (55%) were returned. The majority of respondents (58%) had seen VR/CBS, but only 19% had "hands-on" experience with these systems. Respondents strongly agreed that there is a need for learning opportunities outside of the operating room and a role for VR/CBS in surgical training. Respondents believed both staff and residents would support this type of training. Concerns included VR/CBS' lack of validation and potential requirements for frequent system upgrades. Virtual environments and computer-based simulators, although well established training tools in other fields, have not been widely incorporated into surgical education. Our results suggest that program directors believe this type of technology would be beneficial in surgical education, but they lack adequate information regarding VR/CBS. Developers of this technology may need to focus on educating potential users and addressing their concerns.

  13. [Dynamic road vehicle emission inventory simulation study based on real time traffic information].

    PubMed

    Huang, Cheng; Liu, Juan; Chen, Chang-Hong; Zhang, Jian; Liu, Deng-Guo; Zhu, Jing-Yu; Huang, Wei-Ming; Chao, Yuan

    2012-11-01

    A vehicle activity survey covering traffic flow distribution, driving conditions, and vehicle technologies was conducted in Shanghai, and databases of vehicle flow, VSP distribution and vehicle categories were established from the surveyed data. Based on this, a dynamic vehicle emission inventory simulation method was designed using real-time traffic information such as traffic flow and average speed. Several roads in Shanghai were selected for an hourly vehicle emission simulation as a case study. The survey results show that light-duty passenger cars and taxis are the major vehicle types on the roads of Shanghai, accounting for 48% - 72% and 15% - 43% of the total hourly flow, respectively. The VSP distribution is strongly related to average speed: its peak moves toward the high-load section and becomes lower as average speed increases. Vehicles meeting the Euro 2 and Euro 3 standards make up the majority of the current vehicle population in Shanghai; based on the calibration of vehicle travel mileage data, Euro 2 and Euro 3 vehicles account for 11% - 70% and 17% - 51% of real-world activity, respectively. The emission simulation results indicate that the peak-to-valley ratios for CO, VOC, NO(x) and PM emissions are 3.7, 4.6, 9.6 and 19.8, respectively. CO and VOC emissions mainly come from light-duty passenger cars and taxis and correlate well with traffic flow, whereas NO(x) and PM emissions mainly come from heavy-duty buses and public buses and are concentrated in the morning and evening peak hours. The established dynamic vehicle emission simulation method can reflect changes in actual road emissions and identify high-emission road sections and hours in real time, providing an important technical means and decision-making basis for transportation environment management.
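
    The bottom-up accounting behind such a method can be sketched as follows (all flows, fleet shares, VSP-bin distributions and emission rates below are invented, not the Shanghai survey data): hourly link emissions are the product of traffic flow, fleet composition, the speed-dependent VSP-bin time distribution and bin-specific emission rates.

    ```python
    # Minimal sketch of a bottom-up hourly emission estimate for one road link.
    import numpy as np

    hours = [7, 8, 9]
    flow = {7: 1800, 8: 2600, 9: 2100}              # vehicles/hour on the link
    fleet_share = {"light_passenger": 0.6, "taxi": 0.3, "heavy_bus": 0.1}

    # fraction of driving time per VSP bin (low/medium/high), keyed by speed regime
    vsp_dist = {"congested": [0.7, 0.2, 0.1], "free_flow": [0.3, 0.4, 0.3]}
    speed = {7: "congested", 8: "congested", 9: "free_flow"}

    # NOx emission rates (g per vehicle-hour) per category and VSP bin
    rate = {"light_passenger": [0.5, 1.0, 2.0],
            "taxi":            [0.6, 1.2, 2.4],
            "heavy_bus":       [4.0, 9.0, 18.0]}

    for h in hours:
        dist = np.array(vsp_dist[speed[h]])
        total = sum(flow[h] * share * float(dist @ np.array(rate[cat]))
                    for cat, share in fleet_share.items())
        print(f"hour {h:02d}: NOx ~ {total/1000:.1f} kg")
    ```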

  14. Carbon Tetrachloride Flow and Transport in the Subsurface of the 216-Z-9 Trench at the Hanford Site

    NASA Astrophysics Data System (ADS)

    Oostrom, M.; Rockhold, M.; Truex, M.; Thorne, P.; Last, G.; Rohay, V.

    2006-12-01

    Three-dimensional modeling was conducted with layered and heterogeneous models to enhance the conceptual model of CT distribution in the vertical and lateral direction beneath the 216-Z-9 trench and to investigate the effects of soil vapor extraction (SVE). This work supports the U.S. Department of Energy's (DOE's) efforts to characterize the nature and distribution of CT in the 200 West Area and subsequently select an appropriate final remedy. Simulations targeted migration of dense, nonaqueous phase liquid (DNAPL) consisting of CT and co-disposed organics in the subsurface beneath the 216-Z-9 trench as a function of the properties and distribution of subsurface sediments and of the properties and disposal history of the waste. Simulations of CT migration were conducted using the Subsurface Transport Over Multiple Phases (STOMP) simulator. Simulation results support a conceptual model for CT distribution where CT in the DNAPL phase is expected to have migrated primarily in a vertical direction below the disposal trench. Presence of small-scale heterogeneities tends to limit the extent of vertical migration of CT DNAPL due to enhanced retention of DNAPL compared to more homogeneous conditions, but migration is still predominantly in the vertical direction. Results also show that the Cold Creek units retain more CT DNAPL within the vadose zone than other hydrologic unit during SVE. A considerable amount of the disposed CT DNAPL may have partitioned to the vapor and subsequently water and sorbed phases. Presence of small-scale heterogeneities tends to increase the amount of volatilization. Any continued migration of CT from the vadose zone to the groundwater is likely through interaction of vapor phase CT with the groundwater and not through continued DNAPL migration. The results indicated that SVE appears to be an effective technology for vadose zone remediation, but additional effort is needed to improve simulation of the SVE process.

  15. Impact of Simulation Technology on Die and Stamping Business

    NASA Astrophysics Data System (ADS)

    Stevens, Mark W.

    2005-08-01

    Over the last ten years, we have seen an explosion in the use of simulation-based techniques to improve the engineering, construction, and operation of GM production tools. The impact has been as profound as the overall switch to CAD/CAM from the old manual design and construction methods. The changeover to N/C machining from duplicating milling machines brought advances in accuracy and speed to our construction activity. It also brought significant reductions in fitting sculptured surfaces. Changing over to CAD design brought similar advances in accuracy, and today's use of solid modeling has enhanced that accuracy gain while finally leading to the reduction in lead time and cost through the development of parametric techniques. Elimination of paper drawings for die design, along with the process of blueprinting and distribution, provided the savings required to install high capacity computer servers, high-speed data transmission lines and integrated networks. These historic changes in the application of CAE technology in manufacturing engineering paved the way for the implementation of simulation to all aspects of our business. The benefits are being realized now, and the future holds even greater promise as the simulation techniques mature and expand. Every new line of dies is verified prior to casting for interference free operation. Sheet metal forming simulation validates the material flow, eliminating the high costs of physical experimentation dependent on trial and error methods of the past. Integrated forming simulation and die structural analysis and optimization has led to a reduction in die size and weight on the order of 30% or more. The latest techniques in factory simulation enable analysis of automated press lines, including all stamping operations with corresponding automation. This leads to manufacturing lines capable of running at higher levels of throughput, with actual results providing the capability of two or more additional strokes per minute. As we spread these simulation techniques to the balance of our business, from blank de-stacking to the racking of parts, we anticipate continued reduction in lead-time and engineering expense while improving quality and start-up execution. The author will provide an overview of technology and business evolution of the math-based process that brought an historical transition and revitalization to the die and stamping industry in the past decade. Finally, the author will give an outlook for future business needs and technology development directions.

  16. Material Protection, Accounting, and Control Technologies (MPACT) Advanced Integration Roadmap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Mike; Cipiti, Ben; Demuth, Scott Francis

    2017-01-30

    The development of sustainable advanced nuclear fuel cycles is a long-term goal of the Office of Nuclear Energy’s (DOE-NE) Fuel Cycle Technologies program. The Material Protection, Accounting, and Control Technologies (MPACT) campaign is supporting research and development (R&D) of advanced instrumentation, analysis tools, and integration methodologies to meet this goal (Miller, 2015). This advanced R&D is intended to facilitate safeguards and security by design of fuel cycle facilities. The lab-scale demonstration of a virtual facility (distributed test bed) that connects the individual tools being developed at National Laboratories and university research establishments is a key program milestone for 2020. These tools will consist of instrumentation and devices as well as computer software for modeling, simulation and integration.

  17. Material Protection, Accounting, and Control Technologies (MPACT) Advanced Integration Roadmap

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Durkee, Joe W.; Cipiti, Ben; Demuth, Scott Francis

    The development of sustainable advanced nuclear fuel cycles is a long-term goal of the Office of Nuclear Energy’s (DOE-NE) Fuel Cycle Technologies program. The Material Protection, Accounting, and Control Technologies (MPACT) campaign is supporting research and development (R&D) of advanced instrumentation, analysis tools, and integration methodologies to meet this goal (Miller, 2015). This advanced R&D is intended to facilitate safeguards and security by design of fuel cycle facilities. The lab-scale demonstration of a virtual facility (distributed test bed) that connects the individual tools being developed at National Laboratories and university research establishments is a key program milestone for 2020. These tools will consist of instrumentation and devices as well as computer software for modeling, simulation and integration.

  18. Improving combustion characteristics and NO(x) emissions of a down-fired 350 MW(e) utility boiler with multiple injection and multiple staging.

    PubMed

    Kuang, Min; Li, Zhengqi; Xu, Shantian; Zhu, Qunyi

    2011-04-15

    Within a Mitsui Babcock Energy Limited down-fired pulverized-coal 350 MW(e) utility boiler, in situ experiments were performed, with measurements taken of gas temperatures in the burner and near the right-wall regions, and of gas concentrations (O(2) and NO) from the near-wall region. Large combustion differences between zones near the front and rear walls and particularly high NO(x) emissions were found in the boiler. With focus on minimizing these problems, a new technology based on multiple-injection and multiple-staging has been developed. Combustion improvements and NO(x) reductions were validated by investigating three aspects. First, numerical simulations of the pulverized-coal combustion process and NO(x) emissions were compared in both the original and new technologies. Good agreement was found between simulations and in situ measurements with the original technology. Second, with the new technology, gas temperature and concentration distributions were found to be symmetric near the front and rear walls. A relatively low-temperature and high-oxygen-concentration zone formed in the near-wall region that helps mitigate slagging in the lower furnace. Third, NO(x) emissions were found to have decreased by as much as 50%, yielding a slight decrease in the levels of unburnt carbon in the fly ash.

  19. Enabling the Distributed Generation Market of High Temperature Fuel Cell and Absorption Chiller Systems to Support Critical and Commercial Loads

    NASA Astrophysics Data System (ADS)

    DiMola, Ashley M.

    Buildings account for over 18% of the world's anthropogenic Greenhouse Gas (GHG) emissions. As a result, a technology that can offset GHG emissions associated with buildings has the potential to save over 9 Giga-tons of GHG emissions per year. High temperature fuel cell and absorption chiller (HTFC/AC) technology offers a relatively low-carbon option for meeting cooling and electric loads for buildings while producing almost no criteria pollutants. GHG emissions in the state of California would decrease by 7.48 million metric tons per year if every commercial building in the State used HTFC/AC technology to meet its power and cooling requirements. In order to realize the benefits of HTFC/AC technology on a wide scale, the distributed generation market needs to be exposed to the technology and informed of its economic viability and real-world potential. This work characterizes the economics associated with HTFC/AC technology using select scenarios that are representative of realistic applications. The financial impacts of various input factors are evaluated and the HTFC/AC simulations are compared to the economics of traditional building utilities. It is shown that, in addition to the emissions reductions derived from the systems, HTFC/AC technology is financially preferable in all of the scenarios evaluated. This work also presents the design of a showcase environment, centered on a beta-test application, that presents (1) system operating data gathered using a custom data acquisition module, and (2) HTFC/AC technology in a clear and approachable manner in order to serve the target audience of market stakeholders.
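
    A simple annual operating-cost comparison of the kind such scenario studies perform is sketched below; every load, price and efficiency in it is a hypothetical placeholder rather than an input or result of this work.

    ```python
    # Minimal sketch: annual operating cost of an HTFC/AC system vs. grid electricity
    # plus a conventional electric chiller, for fixed electric and cooling loads.
    elec_load_kw, cooling_load_kw, hours = 400.0, 300.0, 8760

    # conventional case: buy all electricity from the grid, cool with an electric chiller
    grid_price = 0.16                          # $/kWh (assumed)
    chiller_cop = 4.0
    conventional = (elec_load_kw + cooling_load_kw / chiller_cop) * hours * grid_price

    # HTFC/AC case: fuel cell covers the electric load, exhaust heat drives the
    # absorption chiller; any remaining cooling comes from the electric chiller.
    gas_price = 0.035                          # $/kWh of natural gas (assumed)
    fc_elec_eff, heat_to_cooling = 0.50, 0.70  # electrical efficiency, heat recovery
    fuel_kw = elec_load_kw / fc_elec_eff
    absorption_cooling_kw = (fuel_kw - elec_load_kw) * heat_to_cooling
    shortfall_kw = max(0.0, cooling_load_kw - absorption_cooling_kw)
    htfc_ac = (fuel_kw * gas_price + shortfall_kw / chiller_cop * grid_price) * hours

    print(f"conventional: ${conventional:,.0f}/yr   HTFC/AC: ${htfc_ac:,.0f}/yr")
    ```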

  20. Toward production from gas hydrates: Current status, assessment of resources, and simulation-based evaluation of technology and potential

    USGS Publications Warehouse

    Moridis, G.J.; Collett, T.S.; Boswell, R.; Kurihara, M.; Reagan, M.T.; Koh, C.; Sloan, E.D.

    2009-01-01

    Gas hydrates (GHs) are a vast energy resource with global distribution in the permafrost and in the oceans. Even if conservative estimates are considered and only a small fraction is recoverable, the sheer size of the resource is so large that it demands evaluation as a potential energy source. In this review paper, we discuss the distribution of natural GH accumulations, the status of the primary international research and development (R&D) programs, and the remaining science and technological challenges facing the commercialization of production. After a brief examination of GH accumulations that are well characterized and appear to be models for future development and gas production, we analyze the role of numerical simulation in the assessment of the hydrate-production potential, identify the data needs for reliable predictions, evaluate the status of knowledge with regard to these needs, discuss knowledge gaps and their impact, and reach the conclusion that the numerical-simulation capabilities are quite advanced and that the related gaps either are not significant or are being addressed. We review the current body of literature relevant to potential productivity from different types of GH deposits and determine that there are consistent indications of a large production potential at high rates across long periods from a wide variety of hydrate deposits. Finally, we identify (a) features, conditions, geology and techniques that are desirable in potential production targets; (b) methods to maximize production; and (c) some of the conditions and characteristics that render certain GH deposits undesirable for production. Copyright © 2009 Society of Petroleum Engineers.

  1. Toward production from gas hydrates: Current status, assessment of resources, and simulation-based evaluation of technology and potential

    USGS Publications Warehouse

    Moridis, G.J.; Collett, T.S.; Boswell, R.; Kurihara, M.; Reagan, M.T.; Koh, C.; Sloan, E.D.

    2008-01-01

    Gas hydrates are a vast energy resource with global distribution in the permafrost and in the oceans. Even if conservative estimates are considered and only a small fraction is recoverable, the sheer size of the resource is so large that it demands evaluation as a potential energy source. In this review paper, we discuss the distribution of natural gas hydrate accumulations, the status of the primary international R&D programs, and the remaining science and technological challenges facing commercialization of production. After a brief examination of gas hydrate accumulations that are well characterized and appear to be models for future development and gas production, we analyze the role of numerical simulation in the assessment of the hydrate production potential, identify the data needs for reliable predictions, evaluate the status of knowledge with regard to these needs, discuss knowledge gaps and their impact, and reach the conclusion that the numerical simulation capabilities are quite advanced and that the related gaps are either not significant or are being addressed. We review the current body of literature relevant to potential productivity from different types of gas hydrate deposits, and determine that there are consistent indications of a large production potential at high rates over long periods from a wide variety of hydrate deposits. Finally, we identify (a) features, conditions, geology and techniques that are desirable in potential production targets, (b) methods to maximize production, and (c) some of the conditions and characteristics that render certain gas hydrate deposits undesirable for production. Copyright 2008, Society of Petroleum Engineers.

  2. Numerical Analysis of Shear Thickening Fluids for Blast Mitigation Applications

    DTIC Science & Technology

    2011-12-01

    integrate with other types of physics simulation technologies (ANSYS, 2011). One well-known product offered by ANSYS is the ANSYS CFX. The ANSYS CFD...centered. The ANSYS CFX solver uses coupled algebraic multigrid to achieve its solutions, and its engineered scalability ensures a linear increase in CPU...on the user-defined distribution and size. As the numerical analysis focused on the behavior of each individual particle, the ANSYS CFX Rigid Body

  3. Advanced Distributed Simulation Technology II (ADST-II) Dismounted Warrior Network Front End Analysis Experiments

    DTIC Science & Technology

    1997-12-19

    Resource Consultants Inc. (RCI), Science Applications International Corp. (SAIC), Veda Inc., Virtual Space Devices (VSD). 1.1 Background: The Land Warrior...network. The VICs included: • VIC Alpha - a fully immersive Dismounted Soldier System developed by Veda under a STRICOM applied research effort...consists of the Dismounted Soldier System (DSS), which is characterized as follows: • Developed by Veda under a STRICOM applied research effort

  4. Electric Vehicle Modeling and Simulation.

    DTIC Science & Technology

    1983-08-01

    AD-A139 709: Electric Vehicle Modeling and Simulation (U). Air Force Institute of Technology, Wright-Patterson AFB, OH, School of Engineering, A. R. Demispelare, Aug...Approved for Public Release; Distribution Unlimited. School of Engineering, Air Force Institute of Technology, Wright-Patterson Air Force Base, Ohio. Table of...Engineering, 49: 49-51 (27 August 1979). 36. Renner-Smith, S. "Battery-Saving Flywheel Gives Electric Car Freeway Zip," Popular Science, 215(10): 82-84

  5. Why simulation can be efficient: on the preconditions of efficient learning in complex technology based practices.

    PubMed

    Hofmann, Bjørn

    2009-07-23

    It is important to demonstrate learning outcomes of simulation in technology based practices, such as in advanced health care. Although many studies show skills improvement and self-reported change to practice, there are few studies demonstrating patient outcome and societal efficiency. The objective of the study is to investigate if and why simulation can be effective and efficient in a hi-tech health care setting. This is important in order to decide whether and how to design simulation scenarios and outcome studies. Core theoretical insights in Science and Technology Studies (STS) are applied to analyze the field of simulation in hi-tech health care education. In particular, a process-oriented framework where technology is characterized by its devices, methods and its organizational setting is applied. The analysis shows how advanced simulation can address core characteristics of technology beyond the knowledge of technology's functions. Simulation's ability to address skilful device handling as well as purposive aspects of technology provides a potential for effective and efficient learning. However, as technology is also constituted by organizational aspects, such as technology status, disease status, and resource constraints, the success of simulation depends on whether these aspects can be integrated in the simulation setting as well. This represents a challenge for future development of simulation and for demonstrating its effectiveness and efficiency. Assessing the outcome of simulation in education in hi-tech health care settings is worthwhile if core characteristics of medical technology are addressed. This challenges the traditional technical versus non-technical divide in simulation, as organizational aspects appear to be part of technology's core characteristics.

  6. A Community-Based Approach to Leading the Nation in Smart Energy Use

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    2013-12-31

    Project Objectives The AEP Ohio gridSMART® Demonstration Project (Project) achieved the following objectives: • Built a secure, interoperable, and integrated smart grid infrastructure in northeast central Ohio that demonstrated the ability to maximize distribution system efficiency and reliability and consumer use of demand response programs that reduced energy consumption, peak demand, and fossil fuel emissions. • Actively attracted, educated, enlisted, and retained consumers in innovative business models that provided tools and information reducing consumption and peak demand. • Provided the U.S. Department of Energy (DOE) information to evaluate technologies and preferred smart grid business models to be extended nationally. Project Description Ohio Power Company (the surviving company of a merger with Columbus Southern Power Company), doing business as AEP Ohio (AEP Ohio), took a community-based approach and incorporated a full suite of advanced smart grid technologies for 110,000 consumers in an area selected for its concentration and diversity of distribution infrastructure and consumers. It was organized and aligned around: • Technology, implementation, and operations • Consumer and stakeholder acceptance • Data management and benefit assessment Combined, these functional areas served as the foundation of the Project to integrate commercially available products, innovative technologies, and new consumer products and services within a secure two-way communication network between the utility and consumers. The Project included Advanced Metering Infrastructure (AMI), Distribution Management System (DMS), Distribution Automation Circuit Reconfiguration (DACR), Volt VAR Optimization (VVO), and Consumer Programs (CP). These technologies were combined with two-way consumer communication and information sharing, demand response, dynamic pricing, and consumer products, such as plug-in electric vehicles and smart appliances. In addition, the Project incorporated comprehensive cyber security capabilities, interoperability, and a data assessment that, with grid simulation capabilities, made the demonstration results an adaptable, integrated solution for AEP Ohio and the nation.

  7. Study and simulation of the TTEthernet protocol on a flight management subsystem, and adaptation of task scheduling for simulation purposes

    NASA Astrophysics Data System (ADS)

    Abidi, Dhafer

    TTEthernet is a deterministic network technology that enhances Layer 2 Quality of Service (QoS) for Ethernet. The components that implement its services enrich Ethernet functionality with distributed fault-tolerant synchronization, robust temporal partitioning of bandwidth, and synchronous communication with fixed latency and low jitter. TTEthernet services can facilitate the design of scalable, robust, less complex, fault-tolerant distributed systems and architectures. Simulation is nowadays an essential step in the design process of critical systems and represents a valuable support for validation and performance evaluation. CoRE4INET is a project bringing together all TTEthernet simulation models currently available; it is based on extending the models of the OMNeT++ INET framework. Our objective is to study and simulate the TTEthernet protocol on a flight management subsystem (FMS). The idea is to use CoRE4INET to design the simulation model of the target system. The problem is that CoRE4INET does not offer a task scheduling tool for TTEthernet networks. To overcome this problem, we propose an adaptation, for simulation purposes, of a task scheduling approach based on a formal specification of network constraints. The use of the Yices solver allowed the formal specification to be translated into an executable program that generates the desired transmission plan. A final case study allowed us to assess the impact of the arrangement of Time-Triggered frame offsets on the performance of each type of traffic in the system.
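
    To make the constraint-based scheduling idea above concrete, the following is a minimal sketch of offset computation for time-triggered frames on a single link. It is not the thesis tool: the thesis uses the Yices solver, whereas this sketch uses the Z3 SMT solver's Python bindings, and the frame set, durations, and cycle length are hypothetical.

```python
# Illustrative sketch (not the thesis tool): computing offsets for time-triggered
# frames on a single link with an SMT solver. Frame parameters are hypothetical.
from z3 import Int, Solver, Or, sat

period_us = 10_000                               # common cluster cycle (assumed)
frames = {"tt1": 120, "tt2": 80, "tt3": 200}     # frame name -> transmission time (us)

offsets = {name: Int(f"offset_{name}") for name in frames}
s = Solver()

# Each frame must be transmitted entirely within the cycle.
for name, duration in frames.items():
    s.add(offsets[name] >= 0, offsets[name] + duration <= period_us)

# No two frames may overlap on the link.
names = list(frames)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        a, b = names[i], names[j]
        s.add(Or(offsets[a] + frames[a] <= offsets[b],
                 offsets[b] + frames[b] <= offsets[a]))

if s.check() == sat:
    m = s.model()
    plan = {name: m[offsets[name]].as_long() for name in frames}
    print("transmission plan (us offsets):", plan)
else:
    print("no feasible schedule for these constraints")
```

    Additional constraints (end-to-end latency bounds, multi-hop paths, rate-constrained traffic windows) would be added to the same solver in the same style.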

  8. New consumer load prototype for electricity theft monitoring

    NASA Astrophysics Data System (ADS)

    Abdullateef, A. I.; Salami, M. J. E.; Musse, M. A.; Onasanya, M. A.; Alebiosu, M. I.

    2013-12-01

    Illegal connection, i.e., direct connection to the distribution feeder, and tampering with energy meters have been identified as the major means by which nefarious consumers steal electricity on low-voltage distribution systems. This has contributed enormously to the revenue losses incurred by power and energy providers. A Consumer Load Prototype (CLP) is constructed and proposed in this study in order to understand the patterns through which the stealing process is carried out in real-life power consumption. The construction of the consumer load prototype will facilitate real-time simulation and data collection for the monitoring and detection of electricity theft on low-voltage distribution systems. The prototype involves the electrical design and construction of consumer loads in accordance with the relevant standard regulations of the Institution of Engineering and Technology (IET), formerly known as the Institution of Electrical Engineers (IEE). The LabVIEW platform was used for data acquisition, and the data show a good representation of the connected loads. The prototype will assist researchers and power utilities, who currently face challenges in obtaining real-time data, in studying and monitoring electricity theft. The simulation of electricity theft in real time is one of the contributions of this prototype. Similarly, the power and energy community, including students, will appreciate the practical approach the prototype provides for obtaining real-time information, rather than the software simulation that has hitherto been used in the study of electricity theft.

  9. Comparing LOPES measurements of air-shower radio emission with REAS 3.11 and CoREAS simulations

    NASA Astrophysics Data System (ADS)

    Apel, W. D.; Arteaga-Velázquez, J. C.; Bähren, L.; Bekk, K.; Bertaina, M.; Biermann, P. L.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Cantoni, E.; Chiavassa, A.; Daumiller, K.; de Souza, V.; di Pierro, F.; Doll, P.; Engel, R.; Falcke, H.; Fuchs, B.; Fuhrmann, D.; Gemmeke, H.; Grupen, C.; Haungs, A.; Heck, D.; Hörandel, J. R.; Horneffer, A.; Huber, D.; Huege, T.; Isar, P. G.; Kampert, K.-H.; Kang, D.; Krömer, O.; Kuijpers, J.; Link, K.; Łuczak, P.; Ludwig, M.; Mathes, H. J.; Melissas, M.; Morello, C.; Oehlschläger, J.; Palmieri, N.; Pierog, T.; Rautenberg, J.; Rebel, H.; Roth, M.; Rühle, C.; Saftoiu, A.; Schieler, H.; Schmidt, A.; Schröder, F. G.; Sima, O.; Toma, G.; Trinchero, G. C.; Weindl, A.; Wochele, J.; Zabierowski, J.; Zensus, J. A.

    2013-12-01

    Cosmic ray air showers emit radio pulses at MHz frequencies, which can be measured with radio antenna arrays - like LOPES at the Karlsruhe Institute of Technology in Germany. To improve the understanding of the radio emission, we test theoretical descriptions with measured data. The observables used for these tests are the absolute amplitude of the radio signal, and the shape of the radio lateral distribution. We compare lateral distributions of more than 500 LOPES events with two recent and public Monte Carlo simulation codes, REAS 3.11 and CoREAS (v 1.0). The absolute radio amplitudes predicted by REAS 3.11 are in good agreement with the LOPES measurements. The amplitudes predicted by CoREAS are lower by a factor of two, and marginally compatible with the LOPES measurements within the systematic scale uncertainties. In contrast to any previous versions of REAS, REAS 3.11 and CoREAS now reproduce the shape of the measured lateral distributions correctly. This reflects remarkable progress compared to the situation a few years ago, and it seems that the main processes for the radio emission of air showers are now understood: The emission is mainly due to the geomagnetic deflection of the electrons and positrons in the shower. Less important but not negligible is the Askaryan effect (net charge variation). Moreover, we confirm that the refractive index of the air plays an important role, since it changes the coherence conditions for the emission: Only the new simulations including the refractive index can reproduce rising lateral distributions which we observe in a few LOPES events. Finally, we show that the lateral distribution is sensitive to the energy and the mass of the primary cosmic ray particles.

  10. Evidence of Long Range Dependence and Self-similarity in Urban Traffic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thakur, Gautam S; Helmy, Ahmed; Hui, Pan

    2015-01-01

    Transportation simulation technologies should accurately model traffic demand, distribution, and assignment parameters for urban environment simulation. These three parameters significantly impact the transportation engineering benchmarking process and are also critical in realizing realistic traffic modeling situations. In this paper, we model and characterize the traffic density distribution of thousands of locations around the world. The traffic densities are generated from millions of images collected over several years and processed using computer vision techniques. The resulting traffic density distribution time series are then analyzed. It is found using the goodness-of-fit test that the traffic density distributions follow heavy-tail models such as Log-gamma, Log-logistic, and Weibull in over 90% of analyzed locations. Moreover, a heavy tail gives rise to long-range dependence and self-similarity, which we studied by estimating the Hurst exponent (H). Our analysis based on seven different Hurst estimators strongly indicates that the traffic distribution patterns are stochastically self-similar (0.5 < H < 1.0). We believe this is an important finding that will influence the design and development of the next generation of traffic simulation techniques and also aid in accurately modeling traffic engineering of urban systems. In addition, it shall provide a much needed input for the development of smart cities.
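
    As an illustration of the Hurst-exponent estimation mentioned above, the sketch below implements one classic estimator, rescaled-range (R/S) analysis. The paper combines seven different estimators; the synthetic white-noise series used here (expected H ≈ 0.5) merely stands in for the measured traffic-density time series, which the authors found to lie in 0.5 < H < 1.0.

```python
# Illustrative R/S (rescaled range) estimate of the Hurst exponent.
# The synthetic series is a stand-in for a traffic-density time series.
import numpy as np

rng = np.random.default_rng(3)
series = rng.normal(size=4096)   # white noise: expected H near 0.5

def rs_hurst(x, min_chunk=16):
    """Estimate H from the slope of log(R/S) versus log(window size)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs_vals = [], []
    size = min_chunk
    while size <= n // 2:
        rs_per_window = []
        for start in range(0, n - size + 1, size):
            w = x[start:start + size]
            dev = np.cumsum(w - w.mean())     # mean-adjusted cumulative deviations
            r = dev.max() - dev.min()         # range
            s = w.std()                       # standard deviation
            if s > 0:
                rs_per_window.append(r / s)
        sizes.append(size)
        rs_vals.append(np.mean(rs_per_window))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return slope

print(f"estimated Hurst exponent H = {rs_hurst(series):.2f}")
```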

  11. A Hierarchical Modulation Coherent Communication Scheme for Simultaneous Four-State Continuous-Variable Quantum Key Distribution and Classical Communication

    NASA Astrophysics Data System (ADS)

    Yang, Can; Ma, Cheng; Hu, Linxi; He, Guangqiang

    2018-06-01

    We present a hierarchical modulation coherent communication protocol, which simultaneously achieves classical optical communication and continuous-variable quantum key distribution. Our hierarchical modulation scheme consists of a quadrature phase-shifting keying modulation for classical communication and a four-state discrete modulation for continuous-variable quantum key distribution. The simulation results based on practical parameters show that it is feasible to transmit both quantum information and classical information on a single carrier. We obtained a secure key rate of 10^{-3} bits/pulse to 10^{-1} bits/pulse within 40 kilometers, and in the meantime the maximum bit error rate for classical information is about 10^{-7}. Because continuous-variable quantum key distribution protocol is compatible with standard telecommunication technology, we think our hierarchical modulation scheme can be used to upgrade the digital communication systems to extend system function in the future.
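
    The following sketch illustrates the hierarchical constellation described in the abstract: a coarse QPSK point carries the classical bits, and a much weaker four-state displacement superimposed on it carries the CV-QKD modulation. The amplitudes and bit mapping are hypothetical choices for illustration, not the authors' parameters.

```python
# Illustrative hierarchical-modulation constellation: coarse QPSK (classical)
# plus a small four-state displacement (quantum). Amplitudes are hypothetical.
import numpy as np

qpsk_amp = 1.0        # classical (coarse) amplitude, assumed
quantum_amp = 0.05    # CV-QKD (fine) amplitude, assumed much weaker

qpsk = qpsk_amp * np.exp(1j * (np.pi / 4 + np.pi / 2 * np.arange(4)))
four_state = quantum_amp * np.exp(1j * (np.pi / 4 + np.pi / 2 * np.arange(4)))

def modulate(classical_bits, quantum_symbol):
    """Map 2 classical bits plus 1 quantum symbol index to one transmitted amplitude."""
    coarse = qpsk[classical_bits[0] * 2 + classical_bits[1]]
    return coarse + four_state[quantum_symbol]

# Example: classical bits (1, 0) combined with quantum symbol index 3
print(modulate((1, 0), 3))
```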

  12. Numerical simulation of MEMS-based blade load distribution control in centrifugal compressor surge suppression

    NASA Astrophysics Data System (ADS)

    Beneda, Károly

    2012-11-01

    The utilization of turbomachines requires up-to-date technologies to ensure safe operation throughout the widest possible range, which makes novel ideas necessary to cope with classic problems. One of the most dangerous instabilities in compression systems is surge, which has to be suppressed before its onset to avoid structural damage as well as other adverse consequences in the system. As surge occurs at low delivered mass flow rates, the conventional, widely used surge control is based on bypassing the excess airflow back to the atmosphere. This method has been implemented on a large number of aircraft and provides robust suppression of compressor surge, but it creates a significant efficiency loss. This paper deals with a concept originally designed as a fixed geometry that could be realized using up-to-date MEMS technology, resulting in moderate losses but comparable stability enhancement. Previously the author established a one-dimensional mathematical model of the concept, but it is indispensable, before the real instrument can be developed, to carry out a detailed numerical simulation of the device. The aim of the paper is to present the effort of this CFD simulation.

  13. Modeling of luminance distribution in CAVE-type virtual reality systems

    NASA Astrophysics Data System (ADS)

    Meironke, Michał; Mazikowski, Adam

    2017-08-01

    At present, some of the most advanced virtual reality systems are CAVE-type (Cave Automatic Virtual Environment) installations. Such systems usually consist of four, five, or six projection screens, which in the case of six screens are arranged in the form of a cube. Providing the user with a high level of immersion in such systems depends largely on the optical properties of the system. Modeling of physical phenomena now plays a major role in most fields of science and technology; it allows the operation of a device to be simulated without making any changes to its physical construction. In this paper, the distribution of luminance in CAVE-type virtual reality systems is modeled. Calculations were performed for a model of a 6-walled CAVE-type installation, based on the Immersive 3D Visualization Laboratory situated at the Faculty of Electronics, Telecommunications and Informatics at the Gdańsk University of Technology. Tests were carried out for two different scattering distributions of the screen material in order to check how these characteristics influence the luminance distribution of the whole CAVE. The basic assumptions and simplifications of the modeled CAVE-type installation, together with the results, are presented. A brief discussion of the results and of the usefulness of the developed model is also provided.

  14. Adoption and supply of a distributed energy technology

    NASA Astrophysics Data System (ADS)

    Strachan, Neil Douglas

    2000-12-01

    Technical and economic developments in distributed generation (DG) represent an opportunity for a radically different energy market paradigm, and potentially significant cuts in global carbon emissions. This thesis investigates DG along two interrelated themes: (1) Early adoption and supply of the DG technology of internal combustion (IC) engine cogeneration. (2) Private and social cost implications of DG for private investors and within an energy system. IC engine cogeneration of both power and heat has been a remarkable success in the Netherlands, with over 5,000 installations and 1,500 MWe of installed capacity by 1997. However, the technology has struggled in the UK with an installed capacity of 110 MWe, fulfilling only 10% of its large estimated potential. An investment simulation model of DG investments in the UK and Netherlands was used, together with analysis of site-level data on all DG adoptions from 1985 through 1997. In the UK over 60% of the early installations were sized too small (<140 kWe) to be economically attractive (suppliers made their money with maintenance contracts). In the Netherlands, most facilities were sized well above the economic size threshold of 100 kWe (lower due to reduced operating and grid connection costs). Institutional players were key in improved sizing of DG. Aided by energy market and CO2 reduction regulatory policy, Dutch distribution utilities played a proactive role in DG. This involved joint ventures with engine cogen suppliers and users, offering improved electricity buy-back tariffs and lower connection costs. This has allowed flexible operation of distributed generation, especially in electricity sales to the grid. Larger units can be sized for on-site heat requirements, with electricity export providing revenue and aiding in the management of energy networks. A comparison of internal and external costs of three distributed and three centralized generation technologies over a range of heat-to-power ratios (HPR) was made. Micro-turbines were found to be the lowest-cost technology, especially at higher heat loads. Engines are also very competitive provided their NOx and CO emissions are controlled. A cost optimization program was used to develop an optimal green-field supply mix for Florida and New York. (Abstract shortened by UMI.)

  15. Improvement of the Processes of Liquid-Phase Epitaxial Growth of Nanoheteroepitaxial Structures

    NASA Astrophysics Data System (ADS)

    Maronchuk, I. I.; Sanikovich, D. D.; Potapkov, P. V.; Vel'chenko, A. A.

    2018-05-01

    We have revealed the shortcomings of equipment and technological approaches in growing nanoheteroepitaxial structures with quantum dots by liquid-phase epitaxy. We have developed and fabricated a new vertical barrel-type cassette for growing quantum dots and epitaxial layers of various thicknesses in one technological process. A physico-mathematical simulation has been carried out of the processes of liquid-phase epitaxial growth of quantum-dimensional structures with the use of the program product SolidWorks (FlowSimulation program). Analysis has revealed the presence of negative factors influencing the growth process of the above structures. The mathematical model has been optimized, and the equipment has been modernized without additional experiments and measurements. The flow dynamics of the process gas in the reactor at various flow rates has been investigated. A method for tuning the thermal equipment has been developed. The calculated and experimental temperature distributions in the process of growing structures with high reproducibility are in good agreement, which confirms the validity of the modernization made.

  16. 38th JANNAF Combustion Subcommittee Meeting. Volume 1

    NASA Technical Reports Server (NTRS)

    Fry, Ronald S. (Editor); Eggleston, Debra S. (Editor); Gannaway, Mary T. (Editor)

    2002-01-01

    This volume, the first of two volumes, is a collection of 55 unclassified/unlimited-distribution papers which were presented at the Joint Army-Navy-NASA-Air Force (JANNAF) 38th Combustion Subcommittee (CS), 26th Airbreathing Propulsion Subcommittee (APS), 20th Propulsion Systems Hazards Subcommittee (PSHS), and 21st Modeling and Simulation Subcommittee. The meeting was held 8-12 April 2002 at the Bayside Inn at The Sandestin Golf & Beach Resort and Eglin Air Force Base, Destin, Florida. Topics cover five major technology areas including: 1) Combustion - Propellant Combustion, Ingredient Kinetics, Metal Combustion, Decomposition Processes and Material Characterization, Rocket Motor Combustion, and Liquid & Hybrid Combustion; 2) Liquid Rocket Engines - Low Cost Hydrocarbon Liquid Rocket Engines, Liquid Propulsion Turbines, Liquid Propulsion Pumps, and Staged Combustion Injector Technology; 3) Modeling & Simulation - Development of Multi-Disciplinary RBCC Modeling, Gun Modeling, and Computational Modeling for Liquid Propellant Combustion; 4) Guns - Gun Propelling Charge Design and ETC Gun Propulsion; and 5) Airbreathing - Scramjet and Ramjet S&T Program Overviews.

  17. Advanced Engineering Environments: Implications for Aerospace Manufacturing

    NASA Technical Reports Server (NTRS)

    Thomas, D.

    2001-01-01

    There are significant challenges facing today's aerospace industry. Global competition, more complex products, geographically distributed design teams, demands for lower cost, higher reliability and safer vehicles, and the need to incorporate the latest technologies more quickly all confront the developer of aerospace systems. New information technologies offer promising opportunities to develop advanced engineering environments (AEEs) to meet these challenges. Significant advances in the state of the art of aerospace engineering practice are envisioned in the areas of engineering design and analytical tools, cost and risk tools, collaborative engineering, and high-fidelity simulations early in the development cycle. These advances will enable modeling and simulation of manufacturing methods, which will in turn allow manufacturing considerations to be included much earlier in the system development cycle. Significant cost savings, increased quality, and decreased manufacturing cycle time are expected to result. This paper gives an overview of NASA's Intelligent Synthesis Environment, the agency initiative to develop an AEE, with a focus on the anticipated benefits in aerospace manufacturing.

  18. Orbital Angular Momentum Multiplexing over Visible Light Communication Systems

    NASA Astrophysics Data System (ADS)

    Tripathi, Hardik Rameshchandra

    This thesis proposes and explores the possibility of using Orbital Angular Momentum multiplexing in a Visible Light Communication system. Orbital Angular Momentum is mainly applied to laser and optical fiber transmissions, while Visible Light Communication is a technology that uses light as a carrier for wireless communication. In this research, a study of the state of the art and experiments showing some results on multiplexing based on Orbital Angular Momentum over a Visible Light Communication system were carried out. After completion of the initial stage, research work and simulations were performed on spatial multiplexing over Li-Fi channel modeling. Simulation scenarios allowing evaluation of the Signal-to-Noise Ratio, Received Power Distribution, Intensity, and Illuminance were defined and developed.

  19. Current Trends in Higher Education Technology: Simulation

    ERIC Educational Resources Information Center

    Damewood, Andrea M.

    2016-01-01

    This paper is focused on how technology in use changes over time, and the current trend of simulation technology as a supported classroom technology. Simulation-based training as a learning tool is discussed within the context of adult learning theories, as is the technology used and how today's higher education technology administrators support…

  20. Open-source framework for power system transmission and distribution dynamics co-simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Renke; Fan, Rui; Daily, Jeff

    The promise of the smart grid entails more interactions between the transmission and distribution networks, and there is an immediate need for tools to provide the comprehensive modelling and simulation required to integrate operations at both transmission and distribution levels. Existing electromagnetic transient simulators can perform simulations with integration of transmission and distribution systems, but the computational burden is high for large-scale system analysis. For transient stability analysis, currently there are only separate tools for simulating transient dynamics of the transmission and distribution systems. In this paper, we introduce an open source co-simulation framework “Framework for Network Co-Simulation” (FNCS), together with the decoupled simulation approach that links existing transmission and distribution dynamic simulators through FNCS. FNCS is a middleware interface and framework that manages the interaction and synchronization of the transmission and distribution simulators. Preliminary testing results show the validity and capability of the proposed open-source co-simulation framework and the decoupled co-simulation methodology.
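
    The decoupled approach described above can be pictured with the following minimal sketch of a broker-style co-simulation loop that alternately advances a transmission and a distribution solver and exchanges boundary quantities each step. It does not use the FNCS API; the two solver classes and their dynamics are placeholders invented for illustration.

```python
# Illustrative decoupled transmission/distribution co-simulation loop (NOT FNCS).
# A "broker" advances two placeholder solvers and exchanges interface quantities.
class TransmissionSim:
    def __init__(self):
        self.boundary_voltage = 1.0              # per-unit voltage at the T&D interface
    def advance(self, dt, boundary_load):
        # placeholder dynamics: interface voltage sags slightly with load
        self.boundary_voltage = 1.0 - 0.05 * boundary_load
        return self.boundary_voltage

class DistributionSim:
    def __init__(self):
        self.load = 0.6                          # aggregate feeder load (per unit)
    def advance(self, dt, boundary_voltage):
        # placeholder dynamics: load responds weakly to interface voltage
        self.load = 0.6 * boundary_voltage ** 2
        return self.load

def cosimulate(t_end=1.0, dt=0.01):
    tsim, dsim = TransmissionSim(), DistributionSim()
    voltage, load = 1.0, 0.6
    t = 0.0
    while t < t_end:
        # broker role: synchronize both simulators at every time step
        voltage = tsim.advance(dt, load)
        load = dsim.advance(dt, voltage)
        t += dt
    return voltage, load

print(cosimulate())
```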

  1. Adaptive spatial filtering of daytime sky noise in a satellite quantum key distribution downlink receiver

    NASA Astrophysics Data System (ADS)

    Gruneisen, Mark T.; Sickmiller, Brett A.; Flanagan, Michael B.; Black, James P.; Stoltenberg, Kurt E.; Duchane, Alexander W.

    2016-02-01

    Spatial filtering is an important technique for reducing sky background noise in a satellite quantum key distribution downlink receiver. Atmospheric turbulence limits the extent to which spatial filtering can reduce sky noise without introducing signal losses. Using atmospheric propagation and compensation simulations, the potential benefit of adaptive optics (AO) to secure key generation (SKG) is quantified. Simulations are performed assuming optical propagation from a low-Earth-orbit satellite to a terrestrial receiver that includes AO. Higher-order AO correction is modeled assuming a Shack-Hartmann wavefront sensor and a continuous-face-sheet deformable mirror. The effects of atmospheric turbulence, tracking, and higher-order AO on the photon capture efficiency are simulated using statistical representations of turbulence and a time-domain wave-optics hardware emulator. SKG rates are calculated for a decoy-state protocol as a function of the receiver field of view for various strengths of turbulence, sky radiances, and pointing angles. The results show that at fields of view smaller than those discussed by others, AO technologies can enhance SKG rates in daylight and enable SKG where it would otherwise be prohibited as a consequence of background optical noise and signal loss due to propagation and turbulence effects.

  2. Application of computer virtual simulation technology in 3D animation production

    NASA Astrophysics Data System (ADS)

    Mo, Can

    2017-11-01

    With the continuous development of computer technology, the application systems of virtual simulation technology have been further optimized and improved. The technology has also been widely used in various fields of social development, such as city construction, interior design, industrial simulation, and tourism teaching. This paper mainly introduces the virtual simulation technology used in 3D animation. Based on an analysis of the characteristics of virtual simulation technology, the ways and means of applying this technology in 3D animation are researched. The purpose is to provide a reference for future improvement of 3D effects.

  3. Standards in Modeling and Simulation: The Next Ten Years MODSIM World Paper 2010

    NASA Technical Reports Server (NTRS)

    Collins, Andrew J.; Diallo, Saikou; Sherfey, Solomon R.; Tolk, Andreas; Turnitsa, Charles D.; Petty, Mikel; Wiesel, Eric

    2011-01-01

    The world has moved on since the introduction of the Distributed Interactive Simulation (DIS) standard in the early 1980s. The cold war may be over, but there is still a requirement to train for and analyze the next generation of threats that face the free world. The emergence of new and more powerful computer technology and techniques means that modeling and simulation (M&S) has become an important and growing part of satisfying this requirement. As an industry grows, the benefits from standardization within that industry grow with it. For example, it is difficult to imagine what the USA would be like without the 110-volt standard for domestic electricity supply. This paper contains an overview of the outcomes from a recent workshop to investigate the possible future of M&S standards within the federal government.

  4. Temporal Coherence: A Model for Non-Stationarity in Natural and Simulated Wind Records

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rinker, Jennifer M.; Gavin, Henri P.; Clifton, Andrew

    We present a novel methodology for characterizing and simulating non-stationary stochastic wind records. In this new method, non-stationarity is characterized and modelled via temporal coherence, which is quantified in the discrete frequency domain by probability distributions of the differences in phase between adjacent Fourier components. Temporal coherence can also be used to quantify non-stationary characteristics in wind data. Three case studies are presented that analyze the non-stationarity of turbulent wind data obtained at the National Wind Technology Center near Boulder, Colorado, USA. The first study compares the temporal and spectral characteristics of a stationary wind record and a non-stationary wind record in order to highlight their differences in temporal coherence. The second study examines the distribution of one of the proposed temporal coherence parameters and uses it to quantify the prevalence of non-stationarity in the dataset. The third study examines how temporal coherence varies with a range of atmospheric parameters to determine what conditions produce more non-stationarity.
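
    A minimal sketch of the phase-difference idea described above is shown below: the discrete Fourier transform of a wind record is taken, the differences in phase between adjacent Fourier components are computed, and a simple concentration statistic summarizes how tightly they cluster. The synthetic record and the summary statistic are illustrative assumptions, not the authors' exact formulation.

```python
# Illustrative quantification of temporal coherence via phase differences
# between adjacent Fourier components of a (synthetic) wind record.
import numpy as np

rng = np.random.default_rng(0)
fs = 20.0                                    # sampling rate, Hz (assumed)
t = np.arange(0, 600, 1 / fs)                # 10-minute record
u = 8.0 + rng.normal(0, 1.0, t.size)         # synthetic wind-speed record, m/s

U = np.fft.rfft(u - u.mean())
phase = np.angle(U)

# Differences in phase between adjacent Fourier components, wrapped to (-pi, pi]
dphi = np.angle(np.exp(1j * np.diff(phase)))

# Simple concentration statistic: magnitude of the mean resultant vector.
# Near 0: phase differences scattered uniformly (stationary-like behaviour);
# near 1: strongly clustered phase differences (non-stationary-like behaviour).
rho = np.abs(np.mean(np.exp(1j * dphi)))
print(f"temporal coherence statistic rho = {rho:.3f}")
```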

  5. Modeling of projection electron lithography

    NASA Astrophysics Data System (ADS)

    Mack, Chris A.

    2000-07-01

    Projection Electron Lithography (PEL) has recently become a leading candidate for the next generation of lithography systems after the successful demonstration of SCAPEL by Lucent Technologies and PREVAIL by IBM. These systems use a scattering membrane mask followed by a lens with limited angular acceptance range to form an image of the mask when illuminated by high energy electrons. This paper presents an initial modeling system for such types of projection electron lithography systems. Monte Carlo modeling of electron scattering within the mask structure creates an effective mask 'diffraction' pattern, to borrow the standard optical terminology. A cutoff of this scattered pattern by the imaging 'lens' provides an electron energy distribution striking the wafer. This distribution is then convolved with a 'point spread function,' the results of a Monte Carlo scattering calculation of a point beam of electrons striking the resist coated substrate and including the effects of beam blur. Resist exposure and development models from standard electron beam lithography simulation are used to simulate the final three-dimensional resist profile.

  6. Numerical Investigation of the Ability of Salt Tracers to Represent the Residence Time Distribution of Fluidized Catalytic Cracking Particles

    DOE PAGES

    Lu, Liqiang; Gao, Xi; Li, Tingwen; ...

    2017-11-02

    For a long time, salt tracers have been used to measure the residence time distribution (RTD) of fluidized catalytic cracking (FCC) particles. However, due to limitations in experimental measurements and simulation methods, the ability of salt tracers to faithfully represent RTDs has never been directly investigated. Our current simulation results, using coarse-grained computational fluid dynamics coupled with the discrete element method (CFD-DEM) with filtered drag models, show that the residence time of salt tracers with the same terminal velocity as FCC particles is slightly larger than that of FCC particles. This research also demonstrates the ability of filtered drag models to predict the correct RTD curve for FCC particles, while the homogeneous drag model may only be used in the dilute riser flow of Geldart type B particles. The RTD of large-scale reactors can then be efficiently investigated with our proposed numerical method as well as by using the old-fashioned salt tracer technology.

  7. Thermal and optical design analyses, optimizations, and experimental verification for a novel glare-free LED lamp for household applications.

    PubMed

    Khan, M Nisa

    2015-07-20

    Light-emitting diode (LED) technologies are undergoing very fast developments to enable household lamp products with improved energy efficiency and lighting properties at lower cost. Although many LED replacement lamps are claimed to provide similar or better lighting quality at lower electrical wattage compared with general-purpose incumbent lamps, certain lighting characteristics important to human vision are neglected in this comparison, which include glare-free illumination and omnidirectional or sufficiently broad light distribution with adequate homogeneity. In this paper, we comprehensively investigate the thermal and lighting performance and trade-offs for several commercial LED replacement lamps for the most popular Edison incandescent bulb. We present simulations and analyses for thermal and optical performance trade-offs for various LED lamps at the chip and module granularity levels. In addition, we present a novel, glare-free, and production-friendly LED lamp design optimized to produce very desirable light distribution properties as demonstrated by our simulation results, some of which are verified by experiments.

  8. Multiphase flow simulations of a moving fluidized bed regenerator in a carbon capture unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarkar, Avik; Pan, Wenxiao; Suh, Dong-Myung

    2014-10-01

    To accelerate the commercialization and deployment of carbon capture technologies, computational fluid dynamics (CFD)-based tools may be used to model and analyze the performance of carbon capture devices. This work presents multiphase CFD-based flow simulations for the regeneration device responsible for extracting CO2 from CO2-loaded sorbent particles before the particles are recycled. The use of solid particle sorbents in this design is a departure from previously reported systems, where aqueous sorbents are employed. Another new feature is the inclusion of a series of perforated plates along the regenerator height. The influence of these plates on sorbent distribution is examined for varying sorbent holdup, fluidizing gas velocity, and particle size. The residence time distribution of sorbents is also measured to classify the flow regime as plug flow or well-mixed flow. The purpose of this work is to better understand the sorbent flow characteristics before reaction kinetics of CO2 desorption can be implemented.

  9. Research on Service Platform of Internet of Things for Smart City

    NASA Astrophysics Data System (ADS)

    Wang, W.; He, Z.; Huang, D.; Zhang, X.

    2014-04-01

    The application of the Internet of Things in the surveying and mapping industry is basically at the exploration stage and has not yet formed a unified standard. The Chongqing Institute of Surveying and Mapping (CQISM) launched the research project "Research on the Technology of Internet of Things for Smart City". The project focuses on the key technologies of information transmission and exchange on the Internet of Things platform. The data standards of the Internet of Things are designed. Real-time acquisition, mass storage, and distributed data service for large numbers of sensors are realized. On this basis, CQISM has deployed a prototype Internet of Things platform. A simulation application in connected cars proves that the platform design is scientific and practical.

  10. Mid-Infrared Trace Gas Sensor Technology Based on Intracavity Quartz-Enhanced Photoacoustic Spectroscopy.

    PubMed

    Wojtas, Jacek; Gluszek, Aleksander; Hudzikowski, Arkadiusz; Tittel, Frank K

    2017-03-04

    The application of compact, inexpensive trace gas sensor technology to mid-infrared nitric oxide (NO) detection using intracavity quartz-enhanced photoacoustic spectroscopy (I-QEPAS) is reported. A minimum detection limit of 4.8 ppbv within a 30 ms integration time was demonstrated by using a room-temperature, continuous-wave, distributed-feedback quantum cascade laser (QCL) emitting at 5.263 µm (1900.08 cm-1) and a new compact design of a high-finesse bow-tie optical cavity with an integrated resonant quartz tuning fork (QTF). The optimum configuration of the bow-tie cavity was simulated using custom software. Measurements were performed with a wavelength modulation (WM) scheme using a 2f detection procedure.

  11. Wind cannot be Directed but Sails can be Adjusted for Malaysian Renewable Energy Progress

    NASA Astrophysics Data System (ADS)

    Palanichamy, C.; Nasir, Meseret; Veeramani, S.

    2015-04-01

    Wind energy has been a promising energy technology since the 1980s in terms of the yearly percentage growth of installed capacity. However, the progress of wind energy has not been evenly distributed around the world. In particular, in South East Asian countries like Malaysia and Singapore, although the governments are keen on promoting wind energy technology, it is not widely practiced due to low wind speeds. Owing to recent advancements in wind turbine designs, even Malaysia is well suited for wind energy given a proper choice of wind turbines. As evidence, this paper presents successful wind turbines with simulated study outcomes to encourage wind power development in Malaysia.

  12. Radiotracer Technology in Mixing Processes for Industrial Applications

    PubMed Central

    Othman, N.; Kamarudin, S. K.

    2014-01-01

    Many problems associated with the mixing process remain unsolved and result in poor mixing performance. The residence time distribution (RTD) and the mixing time are the most important parameters that determine the homogenisation that is achieved in the mixing vessel and are discussed in detail in this paper. In addition, this paper reviews the current problems associated with conventional tracers, mathematical models, and computational fluid dynamics simulations involved in radiotracer experiments and hybrid of radiotracer. PMID:24616642

  13. A method of distributed avionics data processing based on SVM classifier

    NASA Astrophysics Data System (ADS)

    Guo, Hangyu; Wang, Jinyan; Kang, Minyang; Xu, Guojing

    2018-03-01

    In a system-level combat environment, in order to solve the problem of managing and analyzing the massive heterogeneous data of multi-platform avionics systems, this paper proposes a management solution called the avionics "resource cloud", based on big data technology, and designs an aided-decision classifier based on the SVM algorithm. We design an experiment with an STK simulation; the results show that this method has high accuracy and broad application prospects.
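
    Since the abstract gives no implementation details, the following is only a minimal sketch of an SVM-based aided-decision classifier of the kind described, using scikit-learn with synthetic stand-in features and labels rather than real avionics data.

```python
# Minimal SVM aided-decision classifier sketch with synthetic "avionics" features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 8))                 # 8 hypothetical sensor-derived features
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)  # hypothetical decision label

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Standardize features, then fit an RBF-kernel SVM
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```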

  14. A Generic Simulation Framework for Non-Entangled based Experimental Quantum Cryptography and Communication: Quantum Cryptography and Communication Simulator (QuCCs)

    NASA Astrophysics Data System (ADS)

    Buhari, Abudhahir; Zukarnain, Zuriati Ahmad; Khalid, Roszelinda; Zakir Dato', Wira Jaafar Ahmad

    2016-11-01

    Applications of quantum information science are moving to ever greater heights for next-generation technology. In particular, in the fields of quantum cryptography and quantum computation, the world has already witnessed various ground-breaking tangible products and promising results. Quantum cryptography is one of the more mature fields arising from quantum mechanics and is already available on the market. Nevertheless, quantum cryptography is still under active research in order to reach the maturity of digital cryptography. Its complexity is higher due to the combination of hardware and software. The lack of an effective simulation tool to design and analyze quantum cryptography experiments delays progress. In this paper, we propose a framework for an effective non-entanglement-based quantum cryptography simulation tool. We apply a hybrid simulation technique, i.e., discrete event, continuous event, and system dynamics. We also highlight the limitations of experiments based on a commercial photonic simulation tool. Finally, we discuss ideas for achieving a one-stop simulation package for quantum-based secure key distribution experiments. All modules of the simulation framework are viewed from a computer science perspective.

  15. Real-time global MHD simulation of the solar wind interaction with the earth’s magnetosphere

    NASA Astrophysics Data System (ADS)

    Shimazu, H.; Kitamura, K.; Tanaka, T.; Fujita, S.; Nakamura, M. S.; Obara, T.

    2008-11-01

    We have developed a real-time global MHD (magnetohydrodynamics) simulation of the solar wind interaction with the earth's magnetosphere. By adopting the real-time solar wind parameters and interplanetary magnetic field (IMF) observed routinely by the ACE (Advanced Composition Explorer) spacecraft, responses of the magnetosphere are calculated with the MHD code. The simulation is carried out routinely on the supercomputer system at the National Institute of Information and Communications Technology (NICT), Japan. The visualized images of the magnetic field lines around the earth, the pressure distribution on the meridian plane, and the conductivity of the polar ionosphere can be viewed on the web site (http://www2.nict.go.jp/y/y223/simulation/realtime/). The results show that various magnetospheric activities are reproduced almost qualitatively. They also give us information on how geomagnetic disturbances develop in the magnetosphere in relation to the ionosphere. From the viewpoint of space weather, the real-time simulation helps us understand the overall picture of the current condition of the magnetosphere. To evaluate the simulation results, we compare the AE indices derived from the simulation and from observations. The simulation and observations agree well for quiet days and isolated substorm cases in general.

  16. Vehicle Technology Simulation and Analysis Tools | Transportation Research

    Science.gov Websites

    NREL Vehicle Technology Simulation and Analysis Tools: NREL develops simulation and analysis tools for evaluating vehicle technologies with the potential to achieve significant fuel savings and emission reductions, including the Automotive Deployment Options Projection Tool (ADOPT), a modeling tool that estimates vehicle technology...

  17. JELC-LITE: Unconventional Instructional Design for Special Operations Training

    NASA Technical Reports Server (NTRS)

    Friedman, Mark

    2012-01-01

    Current special operations staff training is based on the Joint Event Life Cycle (JELC). It addresses operational level tasks in multi-week, live military exercises which are planned over a 12 to 18 month timeframe. As the military experiences changing global mission sets, shorter training events using distributed technologies will increasingly be needed to augment traditional training. JELC-Lite is a new approach for providing relevant training between large scale exercises. This new streamlined, responsive training model uses distributed and virtualized training technologies to establish simulated scenarios. It keeps proficiency levels closer to optimal levels -- thereby reducing the performance degradation inherent in periodic training. It can be delivered to military as well as under-reached interagency groups to facilitate agile, repetitive training events. JELC-Lite is described by four phases paralleling the JELC, differing mostly in scope and scale. It has been successfully used with a Theater Special Operations Command and fits well within the current environment of reduced personnel and financial resources.

  18. Advanced Power Electronic Interfaces for Distributed Energy Systems, Part 2: Modeling, Development, and Experimental Evaluation of Advanced Control Functions for Single-Phase Utility-Connected Inverter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chakraborty, S.; Kroposki, B.; Kramer, W.

    Integrating renewable energy and distributed generation into the Smart Grid architecture requires power electronics (PE) for energy conversion. The key to reaching successful Smart Grid implementation is to develop interoperable, intelligent, and advanced PE technology that improves and accelerates the use of distributed energy resource systems. This report describes the simulation, design, and testing of a single-phase DC-to-AC inverter developed to operate in both islanded and utility-connected mode. It provides results on both the simulations and the experiments conducted, demonstrating the ability of the inverter to provide advanced control functions such as power flow and VAR/voltage regulation. This report also analyzes two different techniques used for digital signal processor (DSP) code generation. Initially, the DSP code was written in the C programming language using Texas Instrument's Code Composer Studio. In a later stage of the research, the Simulink DSP toolbox was used to self-generate code for the DSP. The successful tests using Simulink self-generated DSP codes show promise for fast prototyping of PE controls.

  19. Distributed and collaborative synthetic environments

    NASA Technical Reports Server (NTRS)

    Bajaj, Chandrajit L.; Bernardini, Fausto

    1995-01-01

    Fast graphics workstations and increased computing power, together with improved interface technologies, have created new and diverse possibilities for developing and interacting with synthetic environments. A synthetic environment system is generally characterized by input/output devices that constitute the interface between the human senses and the synthetic environment generated by the computer; and a computation system running a real-time simulation of the environment. A basic need of a synthetic environment system is that of giving the user a plausible reproduction of the visual aspect of the objects with which he is interacting. The goal of our Shastra research project is to provide a substrate of geometric data structures and algorithms which allow the distributed construction and modification of the environment, efficient querying of objects attributes, collaborative interaction with the environment, fast computation of collision detection and visibility information for efficient dynamic simulation and real-time scene display. In particular, we address the following issues: (1) A geometric framework for modeling and visualizing synthetic environments and interacting with them. We highlight the functions required for the geometric engine of a synthetic environment system. (2) A distribution and collaboration substrate that supports construction, modification, and interaction with synthetic environments on networked desktop machines.

  20. Climate change, fisheries management and fishing aptitude affecting spatial and temporal distributions of the Barents Sea cod fishery.

    PubMed

    Eide, Arne

    2017-12-01

    Climate change is expected to influence spatial and temporal distributions of fish stocks. The aim of this paper is to compare climate change impact on a fishery with other factors impacting the performance of fishing fleets. The fishery in question is the Northeast Arctic cod fishery, a well-documented fishery where data on spatial and temporal distributions are available. A cellular automata model is developed for the purpose of mimicking possible distributional patterns and different management alternatives are studied under varying assumptions on the fleets' fishing aptitude. Fisheries management and fishing aptitude, also including technological development and local knowledge, turn out to have the greatest impact on the spatial distribution of the fishing effort, when comparing the IPCC's SRES A1B scenario with repeated sequences of the current environmental situation over a period of 45 years. In both cases, the highest profits in the simulation period of 45 years are obtained at low exploitation levels and moderate fishing aptitude.
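
    The cellular-automata idea described above can be illustrated with the following sketch, which is not the author's model: cells hold a stock density, fishing effort is reallocated toward denser cells with a "fishing aptitude" exponent, and the stock grows, diffuses, and is harvested each step. All parameters and functional forms are hypothetical.

```python
# Illustrative cellular-automata sketch of a spatial fishery (not the paper's model).
import numpy as np

rng = np.random.default_rng(1)
grid = rng.uniform(0.5, 1.5, size=(20, 20))   # cod density per cell (arbitrary units)
total_effort = 50.0
aptitude = 4.0                                 # higher -> effort tracks density more tightly
catchability, growth = 0.05, 0.08

def step(density):
    # Allocate effort proportionally to density**aptitude (fleet "knowledge")
    weights = density ** aptitude
    effort = total_effort * weights / weights.sum()
    catch = catchability * effort * density
    # Local logistic growth plus diffusion to the four neighbours (migration)
    diffusion = 0.1 * (np.roll(density, 1, 0) + np.roll(density, -1, 0) +
                       np.roll(density, 1, 1) + np.roll(density, -1, 1) - 4 * density)
    new_density = density + growth * density * (1 - density / 2.0) + diffusion - catch
    return np.clip(new_density, 0, None), catch.sum()

for year in range(5):
    grid, annual_catch = step(grid)
    print(f"year {year}: total catch = {annual_catch:.2f}")
```

    Varying the aptitude exponent in such a toy model changes how strongly effort concentrates on the densest cells, which is the kind of sensitivity the study examines alongside management and climate scenarios.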

  1. Room temperature solid-state quantum emitters in the telecom range.

    PubMed

    Zhou, Yu; Wang, Ziyu; Rasmita, Abdullah; Kim, Sejeong; Berhane, Amanuel; Bodrog, Zoltán; Adamo, Giorgio; Gali, Adam; Aharonovich, Igor; Gao, Wei-Bo

    2018-03-01

    On-demand, single-photon emitters (SPEs) play a key role across a broad range of quantum technologies. In quantum networks and quantum key distribution protocols, where photons are used as flying qubits, telecom wavelength operation is preferred because of the reduced fiber loss. However, despite the tremendous efforts to develop various triggered SPE platforms, a robust source of triggered SPEs operating at room temperature and the telecom wavelength is still missing. We report a triggered, optically stable, room temperature solid-state SPE operating at telecom wavelengths. The emitters exhibit high photon purity (~5% multiphoton events) and a record-high brightness of ~1.5 MHz. The emission is attributed to localized defects in a gallium nitride (GaN) crystal. The high-performance SPEs embedded in a technologically mature semiconductor are promising for on-chip quantum simulators and practical quantum communication technologies.

  2. Vitrification of radioactive contaminated soil by means of microwave energy

    NASA Astrophysics Data System (ADS)

    Yuan, Xun; Qing, Qi; Zhang, Shuai; Lu, Xirui

    2017-03-01

    Simulated radioactive contaminated soil was successfully vitrified by microwave sintering technology, and the solidified body was systematically studied by Raman, XRD and SEM-EDX. The Raman results show that the solidified body transformed to an amorphous structure better at higher temperature (1200 °C). The XRD results show that metamictization was significantly enhanced by the prolonged holding time at 1200 °C under microwave sintering, while with conventional sintering technology other crystal diffraction peaks, besides that of silica at 2θ = 27.830°, still exist after treatment at 1200 °C for a much longer time. The SEM-EDX discloses the micro-morphology of the sample and the uniform distribution of the Nd element. All the results show that microwave technology performs vitrification better than the conventional sintering method in solidifying radioactive contaminated soil.

  3. Reference-Frame-Independent and Measurement-Device-Independent Quantum Key Distribution Using One Single Source

    NASA Astrophysics Data System (ADS)

    Li, Qian; Zhu, Changhua; Ma, Shuquan; Wei, Kejin; Pei, Changxing

    2018-04-01

    Measurement-device-independent quantum key distribution (MDI-QKD) is immune to all detector side-channel attacks. However, practical implementations of MDI-QKD, which require two-photon interferences from separated independent single-photon sources and a nontrivial reference alignment procedure, are still challenging with current technologies. Here, we propose a scheme that significantly reduces the experimental complexity of two-photon interferences and eliminates reference frame alignment by the combination of plug-and-play and reference frame independent MDI-QKD. Simulation results show that the secure communication distance can be up to 219 km in the finite-data case and the scheme has good potential for practical MDI-QKD systems.

  4. Determining Water Content and Distribution in PEMFCs to Predict Aging While in Storage

    DOE PAGES

    Stariha, Sarah; Wilson, Mahlon Scott; LaManna, Jacob M.; ...

    2017-08-24

    Proton exchange membrane fuel cells (PEMFCs) have the potential to be long-term backup power sources with a startup time on the order of seconds. Water management is the key issue in being able to successfully store PEMFCs for extended periods of time. In this work, custom-made PEMFCs were humidified at various relative humidities (%RH) and subsequently stored for different lengths of time. The fuel cells' water content was then imaged at the National Institute of Standards and Technology (NIST) neutron imaging facility. Finally, the cells' startup performances were measured under simulated quick-startup conditions to define the effect of different water distributions.

  5. Qualitative Description of Electric Power System Future States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardy, Trevor D.; Corbin, Charles D.

    The simulation and evaluation of transactive systems depends to a large extent on the context in which those efforts are performed. Assumptions regarding the composition of the electric power system, the regulatory and policy environment, the distribution of renewable and other distributed energy resources (DERs), technological advances, and consumer engagement all contribute to, and affect, the evaluation of any given transactive system, regardless of its design. It is our position that the assumptions made about the state of the future power grid will determine, to some extent, the systems ultimately deployed, and that the transactive system itself may play an important role in the evolution of the power system.

  6. Determining Water Content and Distribution in PEMFCs to Predict Aging While in Storage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stariha, Sarah; Wilson, Mahlon Scott; LaManna, Jacob M.

    Proton exchange membrane fuel cells (PEMFCs) have the potential to be long-term backup power sources with a startup time on the order of seconds. Water management is the key issue in being able to successfully store PEMFCs for extended periods of time. In this work, custom-made PEMFCs were humidified at various relative humidities (%RH) and subsequently stored for different lengths of time. The fuel cells' water content was then imaged at the National Institute of Standards and Technology (NIST) neutron imaging facility. Finally, the cells' startup performances were measured under simulated quick-startup conditions to define the effect of different water distributions.

  7. Electrical Capacitance Volume Tomography for the Packed Bed Reactor ISS Flight Experiment

    NASA Technical Reports Server (NTRS)

    Marashdeh, Qussai; Motil, Brian; Wang, Aining; Liang-Shih, Fan

    2013-01-01

    Fixed packed bed reactors are compact, require minimum power and maintenance to operate, and are highly reliable. These features make this technology a highly desirable unit operation for long-duration life support systems in space. NASA is developing an ISS experiment to address this technology with particular focus on water reclamation and air revitalization. Earlier research and development efforts funded by NASA have resulted in two hydrodynamic models which require validation with appropriate instrumentation in an extended microgravity environment. To validate these models, the instantaneous distribution of the gas and liquid phases must be measured. Electrical Capacitance Volume Tomography (ECVT) is a non-invasive imaging technology recently developed for multi-phase flow applications. It is based on distributing flexible capacitance plates on the periphery of a flow column and collecting real-time measurements of inter-electrode capacitances. Capacitance measurements here are directly related to the dielectric constant distribution, a physical property that is also related to material distribution in the imaging domain. Reconstruction algorithms are employed to map volume images of the dielectric distribution in the imaging domain, which is in turn related to phase distribution. ECVT is suitable for imaging interacting materials of different dielectric constants, typical in multi-phase flow systems. ECVT is being used extensively for measuring flow variables in various gas-liquid and gas-solid flow systems. Recent applications of ECVT include flows in risers and exit regions of circulating fluidized beds, gas-liquid and gas-solid bubble columns, trickle beds, and slurry bubble columns. ECVT is also used to validate flow models and CFD simulations. The technology is uniquely qualified for imaging phase concentrations in packed bed reactors for the ISS flight experiments as it exhibits favorable features of compact size, low-profile sensors, high imaging speed, and flexibility to fit around columns of various shapes and sizes. ECVT is also safer than other commonly used imaging modalities as it operates at low frequencies (1 MHz) and does not emit ionizing radiation. In this effort, ECVT is being used to image flow parameters in a packed bed reactor for an ISS flight experiment.
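
    As an illustration of the reconstruction step described above, the sketch below applies linear back-projection, a standard and deliberately simple ECT/ECVT reconstruction, assuming a precomputed sensitivity matrix. The matrix and calibration data here are random placeholders so the example runs; a real sensor would supply measured values.

```python
# Minimal linear back-projection (LBP) sketch for capacitance tomography.
# S is a placeholder sensitivity matrix (n_measurements x n_voxels).
import numpy as np

rng = np.random.default_rng(7)
n_meas, n_vox = 66, 1000                     # e.g. 12 plates -> 66 capacitance pairs (assumed)
S = rng.uniform(0, 1, size=(n_meas, n_vox))  # placeholder sensitivity matrix

def lbp_reconstruct(c_meas, c_low, c_high):
    """Back-project normalized capacitances into a permittivity-distribution image."""
    c_norm = (c_meas - c_low) / (c_high - c_low)      # normalize against calibration frames
    g = S.T @ c_norm                                   # linear back projection
    g /= S.T @ np.ones(n_meas)                         # normalize by sensitivity coverage
    return np.clip(g, 0.0, 1.0)

# Hypothetical calibration (empty / full column) and one measurement frame
c_low = rng.uniform(1.0, 2.0, n_meas)
c_high = c_low + rng.uniform(0.5, 1.0, n_meas)
c_meas = c_low + 0.3 * (c_high - c_low)
image = lbp_reconstruct(c_meas, c_low, c_high)
print("mean normalized permittivity:", image.mean())
```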

  8. Efficient numerical simulation of heat storage in subsurface georeservoirs

    NASA Astrophysics Data System (ADS)

    Boockmeyer, A.; Bauer, S.

    2015-12-01

    The transition of the German energy market towards renewable energy sources, e.g. wind or solar power, requires energy storage technologies to compensate for their fluctuating production. Large amounts of energy could be stored in georeservoirs such as porous formations in the subsurface. One possibility here is to store heat at high temperatures of up to 90°C through borehole heat exchangers (BHEs), since more than 80% of the total energy consumption in German households is used for heating and hot water supply. Within the ANGUS+ project, potential environmental impacts of such heat storage are assessed and quantified. Numerical simulations are performed to predict storage capacities, storage cycle times, and induced effects. For simulation of these highly dynamic storage sites, detailed high-resolution models are required. We set up a model that accounts for all components of the BHE and verified it using experimental data. The model ensures accurate simulation results but also leads to large numerical meshes and thus high simulation times. In this work, we therefore present a numerical model for each type of BHE (single U, double U and coaxial) that reduces the number of elements and the simulation time significantly for use in larger-scale simulations. The numerical model includes all BHE components and represents the temporal and spatial temperature distribution with an accuracy of less than 2% deviation from the fully discretized model. By changing the BHE geometry and using equivalent parameters, the simulation time is reduced by a factor of ~10 for single U-tube BHEs, ~20 for double U-tube BHEs and ~150 for coaxial BHEs. Results of a sensitivity study that quantifies the effects of different design and storage formation parameters on temperature distribution and storage efficiency for heat storage using multiple BHEs are then shown. It is found that storage efficiency strongly depends on the number of BHEs composing the storage site, their spacing, and the cycle time. The temperature distribution is most sensitive to the thermal conductivity of both the borehole grouting and the storage formation, while storage efficiency is mainly controlled by the thermal conductivity of the storage formation.

  9. Identifying the potential of changes to blood sample logistics using simulation.

    PubMed

    Jørgensen, Pelle; Jacobsen, Peter; Poulsen, Jørgen Hjelm

    2013-01-01

    Using simulation as an approach to display and improve internal logistics at hospitals has great potential. This study shows how a simulation model of the morning blood-taking round at a Danish public hospital can be developed and utilized with the aim of improving the logistics. The focus of the simulation was to evaluate changes made to the transportation of blood samples between wards and the laboratory. The average (AWT) and maximum (MWT) waiting times from when a blood sample was drawn at the ward until it was received at the laboratory, together with the distribution of blood-sample arrivals at the laboratory, were used as the evaluation criteria. Four different scenarios were tested and compared with the current approach: (1) using AGVs (mobile robots), (2) using a pneumatic tube system, (3) using porters that are called upon, or (4) using porters that come to the wards every 45 minutes. Furthermore, each scenario was tested to determine what amount of resources would give the optimal result. The simulations showed a large improvement potential in implementing a new technology/means for transporting the blood samples. The pneumatic tube system showed the biggest potential, lowering the AWT and MWT by approx. 36% and 18%, respectively. Additionally, all of the scenarios gave a more even distribution of arrivals except for porters coming to the wards every 45 min. As a consequence of the results obtained in the study, the hospital decided to implement a pneumatic tube system.
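
    A toy waiting-time comparison along the lines of the study's evaluation criteria is sketched below; the draw-time distribution, sample count and transit times are assumptions, not the hospital's data.

```python
import random
import statistics

def waiting_times_porter(draw_times, interval=45.0, transit=5.0):
    """Samples wait at the ward for the next scheduled porter round
    (every `interval` minutes), then take `transit` minutes to the lab."""
    return [(interval - (t % interval)) + transit for t in draw_times]

def waiting_times_tube(draw_times, transit=2.0):
    """Pneumatic tube: each sample is dispatched immediately."""
    return [transit for _ in draw_times]

random.seed(1)
# Assumed morning round: 300 samples drawn uniformly over 3 hours (minutes).
draws = sorted(random.uniform(0, 180) for _ in range(300))
for name, wt in [("porter/45min", waiting_times_porter(draws)),
                 ("pneumatic tube", waiting_times_tube(draws))]:
    print(f"{name:>15}: AWT={statistics.mean(wt):5.1f} min, MWT={max(wt):5.1f} min")
```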

  10. Supporting Shared Resource Usage for a Diverse User Community: the OSG Experience and Lessons Learned

    NASA Astrophysics Data System (ADS)

    Garzoglio, Gabriele; Levshina, Tanya; Rynge, Mats; Sehgal, Chander; Slyz, Marko

    2012-12-01

    The Open Science Grid (OSG) supports a diverse community of new and existing users in adopting and making effective use of the Distributed High Throughput Computing (DHTC) model. The LHC user community has deep local support within the experiments. For other smaller communities and individual users the OSG provides consulting and technical services through the User Support area. We describe these sometimes successful and sometimes not so successful experiences and analyze lessons learned that are helping us improve our services. The services offered include forums to enable shared learning and mutual support, tutorials and documentation for new technology, and troubleshooting of problematic or systemic failure modes. For new communities and users, we bootstrap their use of the distributed high throughput computing technologies and resources available on the OSG by following a phased approach. We first adapt the application and run a small production campaign on a subset of “friendly” sites. Only then do we move the user to run full production campaigns across the many remote sites on the OSG, adding to the community resources up to hundreds of thousands of CPU hours per day. This scaling up generates new challenges - like no determinism in the time to job completion, and diverse errors due to the heterogeneity of the configurations and environments - so some attention is needed to get good results. We cover recent experiences with image simulation for the Large Synoptic Survey Telescope (LSST), small-file large volume data movement for the Dark Energy Survey (DES), civil engineering simulation with the Network for Earthquake Engineering Simulation (NEES), and accelerator modeling with the Electron Ion Collider group at BNL. We will categorize and analyze the use cases and describe how our processes are evolving based on lessons learned.

  11. Air Pollution Monitoring and Mining Based on Sensor Grid in London

    PubMed Central

    Ma, Yajie; Richards, Mark; Ghanem, Moustafa; Guo, Yike; Hassard, John

    2008-01-01

    In this paper, we present a distributed infrastructure based on wireless sensor networks and Grid computing technology for air pollution monitoring and mining, which aims to develop low-cost and ubiquitous sensor networks to collect real-time, large-scale and comprehensive environmental data from road traffic emissions for air pollution monitoring in urban environments. The main informatics challenges with respect to constructing the high-throughput sensor Grid are discussed in this paper. We present a two-layer network framework, a P2P e-Science Grid architecture, and a distributed data mining algorithm as solutions to address these challenges. We simulated the system in TinyOS to examine the operation of each sensor as well as the networking performance. We also present the distributed data mining results to examine the effectiveness of the algorithm. PMID:27879895

  12. Air Pollution Monitoring and Mining Based on Sensor Grid in London.

    PubMed

    Ma, Yajie; Richards, Mark; Ghanem, Moustafa; Guo, Yike; Hassard, John

    2008-06-01

    In this paper, we present a distributed infrastructure based on wireless sensor networks and Grid computing technology for air pollution monitoring and mining, which aims to develop low-cost and ubiquitous sensor networks to collect real-time, large-scale and comprehensive environmental data from road traffic emissions for air pollution monitoring in urban environments. The main informatics challenges with respect to constructing the high-throughput sensor Grid are discussed in this paper. We present a two-layer network framework, a P2P e-Science Grid architecture, and a distributed data mining algorithm as solutions to address these challenges. We simulated the system in TinyOS to examine the operation of each sensor as well as the networking performance. We also present the distributed data mining results to examine the effectiveness of the algorithm.

  13. Using dCache in Archiving Systems oriented to Earth Observation

    NASA Astrophysics Data System (ADS)

    Garcia Gil, I.; Perez Moreno, R.; Perez Navarro, O.; Platania, V.; Ozerov, D.; Leone, R.

    2012-04-01

    The object of the LAST activity (Long term data Archive Study on new Technologies) is to perform an independent study on best practices and an assessment of different archiving technologies that are mature for operation in the short and mid-term time frame, or available in the long term, with emphasis on technologies best suited to satisfy the requirements of ESA, LTDP and other European and Canadian EO partners in terms of digital information preservation and data accessibility and exploitation. During the last phase of the project, several archiving solutions were tested in order to evaluate their suitability. In particular, dCache aims to provide a file-system tree view of the data repository, exchanging data with backend (tertiary) storage systems and providing space management, pool attraction, dataset replication, hot-spot determination and recovery from disk or node failures. Connected to a tertiary storage system, dCache simulates unlimited direct-access storage space; data exchanges to and from the underlying HSM are performed automatically and invisibly to the user. dCache was created to meet the requirements of large computing centres and universities with large amounts of data, which pooled their efforts and founded EMI (the European Middleware Initiative). At present, dCache is mature enough to be implemented and is used by several research centres of relevance (e.g. the LHC, storing up to 50 TB/day). This solution has not been used so far in Earth Observation, and the results of the study are summarized in this article, focusing on its capabilities in a simulated environment to meet the ESA requirements for geographically distributed storage. The challenge of a geographically distributed storage system can be summarized as providing maximum quality for storage and dissemination services at minimum cost.

  14. Collaborative modeling: the missing piece of distributed simulation

    NASA Astrophysics Data System (ADS)

    Sarjoughian, Hessam S.; Zeigler, Bernard P.

    1999-06-01

    The Department of Defense's overarching goal of performing distributed simulation by overcoming geographic and time constraints has brought the problem of distributed modeling to the forefront. The High Level Architecture standard is primarily intended for simulation interoperability. However, as indicated, the existence of a distributed modeling infrastructure plays a fundamental and central role in supporting the development of distributed simulations. In this paper, we describe some fundamental distributed modeling concepts and their implications for constructing successful distributed simulations. In addition, we discuss the Collaborative DEVS Modeling environment that has been devised to enable geographically dispersed modelers to collaborate and synthesize modular and hierarchical models. We provide an actual example of the use of the Collaborative DEVS Modeler in a project involving corporate partners developing an HLA-compliant distributed simulation exercise.

  15. Research on techniques for computer three-dimensional simulation of satellites and night sky

    NASA Astrophysics Data System (ADS)

    Yan, Guangwei; Hu, Haitao

    2007-11-01

    To study space attack-defense technology, a simulation of satellites is needed. We design and implement a 3D satellite simulation system in which the satellites are rendered against a night-sky background. The system structure is as follows: one computer simulates the satellite orbits, while the other computers render the 3D simulation scene. To achieve a realistic effect, a three-channel multi-projector display system is constructed. We use MultiGen Creator to construct the satellite and star models and MultiGen Distributed Vega to render the three-channel scene. There are one master and three slaves; the master controls the three slaves to render the three channels separately. To obtain the satellites' positions and attitudes, the master communicates with the satellite orbit simulator over TCP/IP. It then calculates the observer's position, the satellites' positions, and the positions of the moon and the sun, and transmits the data to the slaves. To obtain a smooth trajectory for the target satellites, an orbit prediction method is used. Because the target-satellite and attack-satellite data packets cannot stay synchronized on the network, the target satellite dithers when the scene is rendered; to resolve this problem, an anti-dithering algorithm is designed. To render the night-sky background, a file storing the stars' positions and brightness is used. According to its brightness, each star is assigned a magnitude, and the star model is scaled accordingly. All the stars are distributed on a celestial sphere. Experiments show that the whole system runs correctly and the frame rate reaches 30 Hz. The system can be used in the field of space attack-defense simulation.
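
    The abstract does not spell out the anti-dithering algorithm, so the sketch below shows one common way to smooth a remote entity between asynchronous packets (dead reckoning plus exponential blending); the state values and timings are assumed for illustration.

```python
import numpy as np

def extrapolate_state(last_pos, last_vel, t_packet, t_now):
    """Dead-reckon the target position from the most recent packet."""
    return last_pos + last_vel * (t_now - t_packet)

def smooth_update(displayed_pos, predicted_pos, alpha=0.2):
    """Blend the currently displayed position toward the prediction instead of
    jumping to it, which suppresses frame-to-frame dithering."""
    return (1.0 - alpha) * displayed_pos + alpha * predicted_pos

# Example: packets arrive irregularly; the renderer runs at 30 Hz.
pos, vel, t_pkt = np.array([7000.0, 0.0, 0.0]), np.array([0.0, 7.5, 0.0]), 0.0
shown = pos.copy()
for frame in range(1, 4):
    t = frame / 30.0
    shown = smooth_update(shown, extrapolate_state(pos, vel, t_pkt, t))
    print(f"t={t:.3f}s shown={shown.round(3)}")
```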

  16. Simulation of rotor blade element turbulence

    NASA Technical Reports Server (NTRS)

    Mcfarland, R. E.; Duisenberg, Ken

    1995-01-01

    A piloted, motion-based simulation of Sikorsky's Black Hawk helicopter was used as a platform for the investigation of rotorcraft responses to vertical turbulence. By using an innovative temporal and geometrical distribution algorithm that preserved the statistical characteristics of the turbulence over the rotor disc, stochastic velocity components were applied at each of twenty blade-element stations. This model was implemented on NASA Ames' Vertical Motion Simulator (VMS), and ten test pilots were used to establish that the model created realistic cues. The objectives of this research included establishing a simulation-technology basis for future investigations into real-time turbulence modeling. This goal was achieved; our extensive additions to the rotor model added less than 10 percent computational overhead, and on a VAX 9000 computer the entire simulation required a cycle time of less than 12 msec. Pilot opinion during this simulation was generally quite favorable. For low-speed flight the consensus was that SORBET (acronym for title) was better than the conventional body-fixed model used for comparison, which was judged too violent (like a washboard). For high-speed flight the pilots could not identify differences between the models. These opinions were something of a surprise because only the vertical turbulence component was applied to the rotor system in SORBET. Because of the finite-element distribution of the inputs, induced outputs were observed in all translational and rotational axes. Extensive post-simulation spectral analyses of the SORBET model suggest that proper rotorcraft turbulence modeling requires that vertical atmospheric disturbances not be superimposed at the vehicle center of gravity but, rather, be input into the rotor system, where the rotor-to-body transfer function severely attenuates high-frequency rotorcraft responses.
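
    As a hedged illustration of applying stochastic vertical gusts to blade-element stations, the sketch below uses a first-order shaping filter and a simple radial weighting; it is not the SORBET distribution algorithm, and all parameters are assumed.

```python
import numpy as np

def dryden_like_gust(n_steps, dt, sigma_w=1.0, tau=0.5, seed=0):
    """First-order filtered white noise as a stand-in for a vertical turbulence
    time history (a simplified Dryden-style shaping filter, not SORBET itself)."""
    rng = np.random.default_rng(seed)
    w = np.zeros(n_steps)
    a = np.exp(-dt / tau)
    scale = sigma_w * np.sqrt(1.0 - a * a)
    for k in range(1, n_steps):
        w[k] = a * w[k - 1] + scale * rng.standard_normal()
    return w

def distribute_over_blade(gust_value, n_stations=20, radius=8.18, phase=0.0):
    """Assign the gust to each blade-element station; a simple radial and
    azimuthal weighting stands in for the temporal/geometric distribution."""
    r = np.linspace(radius / n_stations, radius, n_stations)
    return gust_value * (0.5 + 0.5 * np.cos(phase)) * (r / radius)

gust = dryden_like_gust(n_steps=300, dt=0.012)       # ~12 ms cycle time
station_w = distribute_over_blade(gust[-1])
print(station_w.round(3))
```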

  17. Integration of environmental simulation models with satellite remote sensing and geographic information systems technologies: case studies

    USGS Publications Warehouse

    Steyaert, Louis T.; Loveland, Thomas R.; Brown, Jesslyn F.; Reed, Bradley C.

    1993-01-01

    Environmental modelers are testing and evaluating a prototype land cover characteristics database for the conterminous United States developed by the EROS Data Center of the U.S. Geological Survey and the University of Nebraska Center for Advanced Land Management Information Technologies. This database was developed from multitemporal, 1-kilometer advanced very high resolution radiometer (AVHRR) data for 1990 and various ancillary data sets such as elevation, ecological regions, and selected climatic normals. Several case studies using this database were analyzed to illustrate the integration of satellite remote sensing and geographic information systems technologies with land-atmosphere interaction models at a variety of spatial and temporal scales. The case studies are representative of contemporary environmental simulation modeling at local to regional levels in global change research, land and water resource management, and environmental risk assessment. The case studies feature land surface parameterizations for atmospheric mesoscale and global climate models; biogenic-hydrocarbon emissions models; distributed-parameter watershed and other hydrological models; and various ecological models covering ecosystem dynamics, biogeochemical cycles, ecotone variability, and equilibrium vegetation. The case studies demonstrate the importance of multitemporal AVHRR data for developing and maintaining a flexible, near-realtime land cover characteristics database. Moreover, such a flexible database is needed to derive various vegetation classification schemes, to aggregate data for nested models, to develop remote sensing algorithms, and to provide data on dynamic landscape characteristics. The case studies illustrate how such a database supports research on spatial heterogeneity, land use, sensitivity analysis, and scaling issues involving regional extrapolations and parameterizations of dynamic land processes within simulation models.

  18. Research of Ancient Architectures in Jin-Fen Area Based on GIS&BIM Technology

    NASA Astrophysics Data System (ADS)

    Jia, Jing; Zheng, Qiuhong; Gao, Huiying; Sun, Hai

    2017-05-01

    Shanxi Province has the largest share of well-preserved ancient buildings in China, about 18,418, of which 9,053 have a wooden frame structure. The value of applying BIM (Building Information Modeling) and GIS (Geographic Information System) is gradually being probed and demonstrated in the corresponding fields of spatial distribution information management of ancient architecture, routine maintenance and special conservation & restoration, and the evaluation and simulation of related disasters such as earthquakes. The research objects are ancient architectures in the Jin-Fen area, which were first investigated by Sicheng LIANG and recorded in his "Chinese ancient architectures survey report". The research objects include those in Sicheng LIANG's investigation, with further adjustments made through the authors' on-site investigation and literature search and collection. During this research, the spatial distribution geodatabase of the research objects was established using GIS. The BIM component library for ancient buildings was formed by combining on-site investigation data with precedent classic works such as "Yingzao Fashi", a treatise on architectural methods of the Song Dynasty, the "Yongle Encyclopedia", and "Gongcheng Zuofa Zeli", case collections of engineering practice by the Ministry of Construction of the Qing Dynasty. A building of Guangsheng temple in Hongtong county is selected as an example to elaborate the BIM model construction process based on this component library. Based on the foregoing spatial distribution data, feature attribute data, 3D graphic information and parametric building information models, an information management system for ancient architectures in the Jin-Fen area, utilizing GIS & BIM technology, can be constructed to support further research on seismic disaster analysis and seismic performance simulation.

  19. Use of Simulation Technology in Dental Education.

    ERIC Educational Resources Information Center

    Buchanan, Judith Ann

    2001-01-01

    Discusses the impact of current simulation laboratories on dental education and reviews advanced technology simulation that has recently become available or is in the developmental stage. Addresses the abilities of advanced technology simulation, its advantages and disadvantages, and its potential to affect dental education. (EV)

  20. Scaling to diversity: The DERECHOS distributed infrastructure for analyzing and sharing data

    NASA Astrophysics Data System (ADS)

    Rilee, M. L.; Kuo, K. S.; Clune, T.; Oloso, A.; Brown, P. G.

    2016-12-01

    Integrating Earth Science data from diverse sources such as satellite imagery and simulation output can be expensive and time-consuming, limiting scientific inquiry and the quality of our analyses. Reducing these costs will improve innovation and quality in science. The current Earth Science data infrastructure focuses on downloading data based on requests formed from the search and analysis of associated metadata. And while the data products provided by archives may use the best available data sharing technologies, scientist end-users generally do not have such resources (including staff) available to them. Furthermore, only once an end-user has received the data from multiple diverse sources and has integrated them can the actual analysis and synthesis begin. The cost of getting from idea to where synthesis can start dramatically slows progress. In this presentation we discuss a distributed computational and data storage framework that eliminates much of the aforementioned cost. The SciDB distributed array database is central, as it is optimized for scientific computing involving very large arrays and performs better than less specialized frameworks like Spark. Adding spatiotemporal functions to SciDB creates a powerful platform for analyzing and integrating massive, distributed datasets. SciDB allows Big Earth Data analysis to be performed "in place" without the need for expensive downloads and end-user resources. Spatiotemporal indexing technologies such as the hierarchical triangular mesh enable the compute and storage affinity needed to efficiently perform co-located and conditional analyses while minimizing data transfers. These technologies automate the integration of diverse data sources using the framework, a critical step beyond current metadata search and analysis. Instead of downloading data into their idiosyncratic local environments, end-users can generate and share data products integrated from diverse multiple sources using a common shared environment, turning distributed active archive centers (DAACs) from warehouses into distributed active analysis centers.
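
    The compute/storage affinity argument can be illustrated with a toy spatiotemporal chunk key; this is not the hierarchical triangular mesh used by the authors, and the cell and time-bin sizes are arbitrary assumptions.

```python
def spatiotemporal_key(lat, lon, t_hours, level=4, t_bin_hours=6):
    """Toy spatiotemporal index (not the HTM used by the authors): combine a
    coarse lat/lon cell id with a time bin so that nearby observations map to
    the same storage chunk, giving compute/storage affinity for joins."""
    n = 2 ** level
    i = min(int((lat + 90.0) / 180.0 * n), n - 1)
    j = min(int((lon + 180.0) / 360.0 * n), n - 1)
    t = int(t_hours // t_bin_hours)
    return (t << (2 * level)) | (i << level) | j

# Two nearby observations from different instruments land in the same chunk:
print(spatiotemporal_key(38.90, -77.04, t_hours=13.2),
      spatiotemporal_key(38.95, -77.10, t_hours=12.1))
```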

  1. Learning style and laparoscopic experience in psychomotor skill performance using a virtual reality surgical simulator.

    PubMed

    Windsor, John A; Diener, Scott; Zoha, Farah

    2008-06-01

    People learn in different ways, and training techniques and technologies should accommodate individual learning needs. This pilot study looks at the relationship between learning style, as measured with the Multiple Intelligences Developmental Assessment Scales (MIDAS), laparoscopic surgery experience, and psychomotor skill performance using the MIST VR surgical simulator. Five groups of volunteer subjects were selected from undergraduate tertiary students, medical students, novice surgical trainees, advanced surgical trainees and experienced laparoscopic surgeons. Each group was administered the MIDAS followed by two simulated surgical tasks on the MIST VR simulator. There was a striking homogeneity of learning styles amongst experienced laparoscopic surgeons. Significant differences in the distribution of primary learning styles were found (P < .01) between subjects with minimal surgical training and those with considerable experience. A bodily-kinesthetic learning style, irrespective of experience, was associated with the best performance of the laparoscopic tasks. This is the first study to highlight the relationship between learning style, psychomotor skill and laparoscopic surgical experience, with implications for surgeon selection, training and credentialing.

  2. The Numerical Propulsion System Simulation: An Overview

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    2000-01-01

    Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, databases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  3. Jet aircraft hydrocarbon fuels technology

    NASA Technical Reports Server (NTRS)

    Longwell, J. P. (Editor)

    1978-01-01

    A broad-specification referee fuel was proposed for research and development. This fuel has a lower, closely specified hydrogen content and a higher final boiling point and freezing point than ASTM Jet A. The workshop recommended various priority items for fuel research and development. Key items include prediction of tradeoffs among fuel refining, distribution, and aircraft operating costs; combustor liner temperature and emissions studies; and practical simulator investigations of the effect of high-freezing-point and low-thermal-stability fuels on aircraft fuel systems.

  4. Real-Time Characterization of Aerospace Structures Using Onboard Strain Measurement Technologies and Inverse Finite Element Method

    DTIC Science & Technology

    2011-09-01

    strain data provided by in-situ strain sensors. The application focus is on the strain data obtained from FBG (Fiber Bragg Grating) sensor arrays...sparsely distributed lines to simulate strain data from FBG (Fiber Bragg Grating) arrays that provide either single-core (axial) or rosette (tri...when the measured strain data are sparse, as is often the case when FBG sensors are used. For an inverse element without strain-sensor data, the

  5. Finite element analysis of the upsetting of a 5056 aluminum alloy sample with consideration of its microstructure

    NASA Astrophysics Data System (ADS)

    Voronin, S. V.; Chaplygin, K. K.

    2017-12-01

    Computer simulation of the upsetting of finite element models (FEMs) of a 5056 aluminum alloy sample, both as an isotropic material and with its microstructure taken into account, is carried out. The stress and strain distribution patterns at different process stages are obtained. The strain required for the deformation of the FEMs of the 5056 alloy samples is determined. The influence of the material microstructure on the stress-strain behavior and on the technological parameters is demonstrated.

  6. Influence of a Small Fraction of Individuals with Enhanced Mutations on a Population Genetic Pool

    NASA Astrophysics Data System (ADS)

    Cebrat, S.; Stauffer, D.

    It has been observed that a higher mutation load could be introduced into the genomes of children conceived by assisted reproduction technology (in-vitro fertilization). This generates two effects: slightly higher mutational pressure on the whole genetic pool of the population, and inhomogeneity of the mutation distributions in the genetic pool. Computer simulations of the Penna ageing model suggest that even a small fraction of births with an enhanced number of new mutations can negatively influence the whole population.
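
    A minimal sketch of a Penna-type bit-string simulation with a small fraction of high-mutation births is given below; the genome length, mutation threshold, Verhulst limit and other parameters are illustrative assumptions, not those of the cited study.

```python
import random

# Penna bit-string ageing model sketch: each individual is a 32-bit genome; bit i
# set means a deleterious mutation expressed at age i. Death occurs when the
# number of expressed mutations reaches the threshold T, at the maximum age, or
# through the Verhulst (crowding) factor. A fraction of births carries extra
# mutations, standing in for the assisted-reproduction effect.
GENOME_BITS, T, BIRTH_AGE, M_NORMAL, M_ART, N_MAX = 32, 3, 8, 1, 4, 2000

def new_genome(parent, n_mut):
    g = parent
    for _ in range(n_mut):
        g |= 1 << random.randrange(GENOME_BITS)   # add a random mutation
    return g

def step(population, art_fraction=0.05):
    survivors = []
    verhulst = 1.0 - len(population) / N_MAX
    for age, genome in population:
        expressed = bin(genome & ((1 << (age + 1)) - 1)).count("1")
        if expressed >= T or age >= GENOME_BITS - 1 or random.random() > verhulst:
            continue                              # death
        survivors.append((age + 1, genome))
        if age + 1 >= BIRTH_AGE:                  # reproduce
            m = M_ART if random.random() < art_fraction else M_NORMAL
            survivors.append((0, new_genome(genome, m)))
    return survivors

random.seed(2)
pop = [(0, 0)] * 200
for year in range(50):
    pop = step(pop)
print("population after 50 steps:", len(pop))
```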

  7. Data sets for manuscript titled Unexpected benefits of reducing aerosol cooling effects

    EPA Pesticide Factsheets

    These data sets were created using extensive model simulation results from the WRF-CMAQ model, population distributions, and a health impact assessment model; see the manuscript for details. This dataset is associated with the following publication: Xing, J., J. Wang, R. Mathur, J. Pleim, S. Wang, C. Hogrefe, C. Gan, D. Wong, and J. Hao. Unexpected Benefits of Reducing Aerosol Cooling Effects. ENVIRONMENTAL SCIENCE & TECHNOLOGY. American Chemical Society, Washington, DC, USA, 50(14): 7527–7534, (2016).

  8. Research of the application of the new communication technologies for distribution automation

    NASA Astrophysics Data System (ADS)

    Zhong, Guoxin; Wang, Hao

    2018-03-01

    Communication networks are a key factor in distribution automation. In recent years, new communication technologies for distribution automation have developed rapidly in China. This paper introduces the traditional communication technologies used for distribution automation and analyzes their shortcomings. It then gives a detailed analysis of several new communication technologies for distribution automation, covering both wired and wireless communication, and offers suggestions for applying these new technologies.

  9. Traffic Flow Density Distribution Based on FEM

    NASA Astrophysics Data System (ADS)

    Ma, Jing; Cui, Jianming

    In the analysis of normal traffic flow, static or dynamic models based on fluid mechanics are usually employed for numerical analysis. Such an approach, however, involves extensive modeling and data handling, and its accuracy is not high. The Finite Element Method (FEM) grew out of the combination of modern mathematics, mechanics and computer technology, and it has been widely applied in domains such as engineering. Based on existing traffic flow theory, ITS and the development of FEM, a simulation theory of the FEM that addresses the problems existing in traffic flow analysis is put forward. Based on this theory, and using existing Finite Element Analysis (FEA) software, the traffic flow is simulated and analyzed with fluid mechanics and dynamics. The problem of processing massive data through manual modeling and numerical analysis is solved, and the authenticity of the simulation is enhanced.

  10. Numerical Simulation of Polysilicon Solid-liquid Interface Transmogrification in Heat Transfer Process

    NASA Astrophysics Data System (ADS)

    Yang, Xi; Ma, Wenhui; Lv, Guoqiang; Zhang, Mingyu

    2018-01-01

    The shape of the solid-liquid interface during the directional solidification process, which is difficult to observe and measure in actual processes, controls the grain orientation and grain size of the polysilicon ingot. We carried out numerical calculations of the directional solidification of polycrystalline silicon and investigated means of treating the latent heat of solidification in the numerical simulation. The distributions of the temperature field of the melt during the crystallization process, as well as the transformation of the solid-liquid interface, were obtained. The simulation results are consistent with the experimental outcomes. The results show that the curvature of the solid-liquid interface is small and stable, and larger-grained columnar crystals can be grown in the laboratory-scale furnace at a solidification rate of 10 μm·s-1. This provides an important theoretical basis for the metallurgical process and polysilicon production technology.
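
    One standard way to treat latent heat in such simulations is the apparent heat capacity method, sketched below with illustrative silicon-like values; the abstract does not state which treatment the authors adopted.

```python
import numpy as np

def apparent_heat_capacity(T, cp_solid=710.0, cp_liquid=910.0,
                           T_melt=1687.0, dT=5.0, latent=1.8e6):
    """Apparent (effective) heat capacity method: smear the latent heat of
    solidification over a small melting interval [T_melt-dT, T_melt+dT] so a
    standard transient heat-conduction solver can capture the phase change.
    Values are illustrative for silicon, not those of the cited simulation."""
    cp = np.where(T < T_melt - dT, cp_solid,
         np.where(T > T_melt + dT, cp_liquid,
                  0.5 * (cp_solid + cp_liquid) + latent / (2.0 * dT)))
    return cp

T = np.array([1600.0, 1685.0, 1687.0, 1690.0, 1750.0])   # temperatures in K
print(apparent_heat_capacity(T))
```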

  11. Design and simulation analysis of a novel pressure sensor based on graphene film

    NASA Astrophysics Data System (ADS)

    Nie, M.; Xia, Y. H.; Guo, A. Q.

    2018-02-01

    A novel pressure sensor structure based on a graphene film as the sensitive membrane is proposed in this paper, addressing the problem of measuring low pressures with high sensitivity. Moreover, the fabrication process was designed to be compatible with CMOS IC fabrication technology. Finite element analysis has been used to simulate the displacement distribution of the thin movable graphene film of the designed pressure sensor under different pressures and for different dimensions. From the simulation results, an optimized structure has been obtained which can be applied in the low measurement range from 10 hPa to 60 hPa. The length and thickness of the graphene film could be designed as 100 μm and 0.2 μm, respectively. The maximum mechanical stress on the edge of the sensitive membrane was 1.84 kPa, which is far below the breaking strength of the silicon nitride and graphene films.
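
    As a rough cross-check on the reported membrane behaviour, the sketch below evaluates the classical small-deflection formula for a clamped circular plate under uniform pressure; the elastic constants and the circular geometry are assumptions, not the paper's FEA setup.

```python
import math

def plate_deflection(p, radius, thickness, E, nu):
    """Center deflection w_max = p*a^4 / (64*D), with flexural rigidity
    D = E*t^3 / (12*(1-nu^2)). Only valid while w_max << thickness."""
    D = E * thickness**3 / (12.0 * (1.0 - nu**2))
    return p * radius**4 / (64.0 * D)

# Assumed values: 100 um wide film (radius 50 um), 0.2 um thick, 60 hPa applied.
w = plate_deflection(p=6000.0, radius=50e-6, thickness=0.2e-6, E=1.0e12, nu=0.16)
print(f"center deflection ~ {w * 1e6:.2f} um")
```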

  12. A holistic approach to SIM platform and its application to early-warning satellite system

    NASA Astrophysics Data System (ADS)

    Sun, Fuyu; Zhou, Jianping; Xu, Zheyao

    2018-01-01

    This study proposes a new simulation platform named Simulation Integrated Management (SIM) for the analysis of parallel and distributed systems. The platform eases the process of designing and testing both applications and architectures. The main characteristics of SIM are flexibility, scalability, and expandability. To improve the efficiency of project development, new models of an early-warning satellite system were designed based on the SIM platform. Finally, through a series of experiments, the correctness of the SIM platform and the aforementioned early-warning satellite models was validated, and systematic analyses of the orbit-determination precision of a ballistic missile during its entire flight, as well as of the deviation of the launch/landing point, are presented. Furthermore, the causes of the deviation and prevention methods are fully explained. The simulation platform and the models lay the foundations for further validation of autonomy technology in space attack-defense architecture research.

  13. Simulation of Cooling Rate Effects on Ti-48Al-2Cr-2Nb Crack Formation in Direct Laser Deposition

    NASA Astrophysics Data System (ADS)

    Yan, Lei; Li, Wei; Chen, Xueyang; Zhang, Yunlu; Newkirk, Joe; Liou, Frank; Dietrich, David

    2017-03-01

    Transient temperature history is vital in direct laser deposition (DLD) as it reveals the cooling rate at specific temperatures. The cooling rate directly relates to phase transformation and the types of microstructure formed in deposits. In this paper, finite element analysis was employed to study the transient temperature history and cooling rate for different experimental setups in the Ti-48Al-2Cr-2Nb DLD process. An innovative prediction strategy was developed to model the process with a moving Gaussian heat source and the element birth-and-death technique in ANSYS®, and to fabricate crack-free deposits. This approach helps to understand and analyze the impact of cooling rate and also explains phase information gathered from x-ray diffraction.
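
    The moving Gaussian heat source can be sketched as a surface flux of the common form q = 2*eta*P/(pi*r^2)*exp(-2*d^2/r^2); the laser power, efficiency, spot radius and scan speed below are assumed values, not the experimental settings.

```python
import numpy as np

def gaussian_heat_flux(x, y, t, power=300.0, efficiency=0.4,
                       radius=0.5e-3, speed=5e-3, y0=0.0):
    """Surface heat flux [W/m^2] of a moving Gaussian laser source centered at
    (speed*t, y0); a common model form for laser deposition, with illustrative
    parameters rather than the paper's process settings."""
    d2 = (x - speed * t) ** 2 + (y - y0) ** 2
    return 2.0 * efficiency * power / (np.pi * radius**2) * np.exp(-2.0 * d2 / radius**2)

# Flux sampled along the scan line at t = 1 s:
x = np.linspace(0.0, 10e-3, 5)
print(gaussian_heat_flux(x, 0.0, t=1.0).round(1))
```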

  14. Guidelines for developing distributed virtual environment applications

    NASA Astrophysics Data System (ADS)

    Stytz, Martin R.; Banks, Sheila B.

    1998-08-01

    We have conducted a variety of projects that served to investigate the limits of virtual environments and distributed virtual environment (DVE) technology for the military and medical professions. The projects include an application that allows the user to interactively explore a high-fidelity, dynamic scale model of the Solar System and a high-fidelity, photorealistic, rapidly reconfigurable aircraft simulator. Additional projects are a project for observing, analyzing, and understanding the activity in a military distributed virtual environment, a project to develop a distributed threat simulator for training Air Force pilots, a virtual spaceplane to determine user interface requirements for a planned military spaceplane system, and an automated wingman for use in supplementing or replacing human-controlled systems in a DVE. The last two projects are a virtual environment user interface framework; and a project for training hospital emergency department personnel. In the process of designing and assembling the DVE applications in support of these projects, we have developed rules of thumb and insights into assembling DVE applications and the environment itself. In this paper, we open with a brief review of the applications that were the source for our insights and then present the lessons learned as a result of these projects. The lessons we have learned fall primarily into five areas. These areas are requirements development, software architecture, human-computer interaction, graphical database modeling, and construction of computer-generated forces.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Happenny, Sean F.

    The United States’ power infrastructure is aging, underfunded, and vulnerable to cyber attack. Emerging smart grid technologies may take some of the burden off of existing systems and make the grid as a whole more efficient, reliable, and secure. The Pacific Northwest National Laboratory (PNNL) is funding research into several aspects of smart grid technology and grid security, creating a software simulation tool that will allow researchers to test power distribution networks utilizing different smart grid technologies to determine how the grid and these technologies react under different circumstances. Demonstrating security in embedded systems is another research area PNNL is tackling. Many of the systems controlling the U.S. critical infrastructure, such as the power grid, lack integrated security, and the networks protecting them are becoming easier to breach. Providing a virtual power substation network to each student team at the National Collegiate Cyber Defense Competition, thereby supporting the education of future cyber security professionals, is another way PNNL is helping to strengthen the security of the nation’s power infrastructure.

  16. Increasing the resilience and security of the United States' power infrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Happenny, Sean F.

    2015-08-01

    The United States' power infrastructure is aging, underfunded, and vulnerable to cyber attack. Emerging smart grid technologies may take some of the burden off of existing systems and make the grid as a whole more efficient, reliable, and secure. The Pacific Northwest National Laboratory (PNNL) is funding research into several aspects of smart grid technology and grid security, creating a software simulation tool that will allow researchers to test power infrastructure control and distribution paradigms by utilizing different smart grid technologies to determine how the grid and these technologies react under different circumstances. Understanding how these systems behave in real-world conditions will lead to new ways to make our power infrastructure more resilient and secure. Demonstrating security in embedded systems is another research area PNNL is tackling. Many of the systems controlling the U.S. critical infrastructure, such as the power grid, lack integrated security, and the aging networks protecting them are becoming easier to attack.

  17. Loss Estimations due to Earthquakes and Secondary Technological Hazards

    NASA Astrophysics Data System (ADS)

    Frolova, N.; Larionov, V.; Bonnin, J.

    2009-04-01

    Expected loss and damage assessment due to natural and technological disasters is of primary importance for emergency management just after a disaster, as well as for the development and implementation of preventive-measure plans. The paper addresses the procedures and simulation models for loss estimation due to strong earthquakes and secondary technological accidents. The mathematical models for shaking intensity distribution, damage to buildings and structures, debris volume, and the number of fatalities and injuries due to earthquakes and technological accidents at fire- and chemically-hazardous facilities are considered; these models are used in geographical information systems designed for these purposes. The criteria for the occurrence of technological accidents are developed on the basis of engineering analysis of the consequences of past events. The paper provides the results of estimating the consequences of scenario earthquakes and of individual seismic risk assessment taking into account secondary technological hazards at regional and urban levels. The individual risk is understood as the probability of death (or injury) due to a possible hazardous event within one year in a given territory. It is determined through the mathematical expectation of social losses, taking into account the number of inhabitants of the considered settlement and the probability of natural and/or technological disaster.
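
    A minimal sketch of the individual-risk computation as a mathematical expectation is given below; the scenario probabilities, fatality estimates and population are invented for illustration only.

```python
# Individual risk approximated as the annual expectation of fatalities divided
# by the number of inhabitants of the settlement (a simplified reading of the
# formulation described in the abstract).
def individual_risk(scenarios, population):
    """scenarios: list of (annual_probability, expected_fatalities) pairs."""
    expected_loss = sum(p * n for p, n in scenarios)
    return expected_loss / population

# Example: one scenario earthquake plus a secondary chemical accident.
print(individual_risk([(0.002, 120.0), (0.0005, 40.0)], population=50_000))
```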

  18. MO-H-19A-03: Patient Specific Bolus with 3D Printing Technology for Electron Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zou, W; Swann, B; Siderits, R

    2014-06-15

    Purpose: Bolus is widely used in electron radiotherapy to achieve the desired dose distribution. 3D printing technologies provide clinicians with easy access to fabricate patient-specific bolus accommodating patient body-surface irregularities and tissue inhomogeneity. This study presents the design and the clinical workflow of 3D printed bolus for patient electron therapy in our clinic. Methods: Patient simulation CT images free of bolus were exported from the treatment planning system (TPS) to an in-house developed software package. A bolus with known material properties was designed in the software package and then exported back to the TPS as a structure. Dose calculation was carried out to examine the coverage of the target. After a satisfactory dose distribution was achieved, the bolus structure was transferred in Standard Tessellation Language (STL) file format to the 3D printer to generate the machine codes for printing. Upon receiving the printed bolus, a quick quality assurance was performed, with the patient re-simulated with the bolus in place, to verify the bolus dosimetric properties before treatment started. Results: A patient-specific bolus for electron radiotherapy was designed and fabricated on a Form 1 3D printer with methacrylate photopolymer resin. A satisfactory dose distribution was achieved in the patient with the bolus setup. Treatment was successfully completed for one patient with the 3D printed bolus. Conclusion: Electron bolus fabrication with 3D printing technology was successfully implemented in clinical practice.

  19. Simulating the effect of ignition source type on forest fire statistics

    NASA Astrophysics Data System (ADS)

    Krenn, Roland; Hergarten, Stefan

    2010-05-01

    Forest fires belong to the most frightening natural hazards and have long-term ecological and economic effects on the regions involved. Their frequency-area distributions have been found to show power-law behaviour under a wide variety of conditions, which has led to their interpretation as a self-organised critical phenomenon. In computer simulations, self-organised critical behaviour manifests itself in simple cellular automaton models. With respect to ignition source, forest fires can be categorised as lightning-induced or as a result of human activity. Lightning fires are considered natural, whereas "man-made" fires are frequently caused by some sort of technological accident, such as sparks from train wheels, the rupture of overhead electrical lines, or the misuse of electrical or mechanical devices. Taking into account that such events rarely occur deep in the woods, man-made fires should start preferentially on the edge of a forest or where the forest is not very dense. We present a modification of the self-organised critical Drossel-Schwabl forest fire model that takes these two different triggering mechanisms into account and increases the scaling exponent of the frequency-area distribution by about 1/3. Combined simulations further predict a dependence of the overall event-size distribution on the ratio of lightning-induced to man-made fires, as well as a splitting of their partial distributions. Lightning is identified as the dominant mechanism in the regime of the largest fires. The results are confirmed by an analysis of the Canadian Large Fire Database and suggest that lightning-induced and man-made forest fires cannot be treated separately in wildfire modelling, hazard assessment and forest management.
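
    A minimal Drossel-Schwabl-style cellular automaton with the two ignition mechanisms can be sketched as follows; the lattice size, growth and ignition probabilities, and the "sparse neighbourhood" criterion for man-made ignitions are assumptions, not the paper's parameters.

```python
import random

# Trees grow with probability P_GROW; lightning strikes a random cell, while
# "man-made" ignitions are only attempted on sparsely forested cells (few tree
# neighbours), mimicking fires starting at the forest edge.
L, P_GROW, F_LIGHTNING, F_HUMAN = 48, 0.05, 1e-4, 1e-4

def neighbours(i, j):
    return [((i + di) % L, (j + dj) % L)
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]

def burn_cluster(grid, i, j):
    """Burn the whole connected tree cluster instantly; return its size."""
    stack, size = [(i, j)], 0
    while stack:
        x, y = stack.pop()
        if grid[x][y] != 1:
            continue
        grid[x][y], size = 0, size + 1
        stack.extend(neighbours(x, y))
    return size

def step(grid, sizes):
    for i in range(L):
        for j in range(L):
            if grid[i][j] == 0 and random.random() < P_GROW:
                grid[i][j] = 1
    i, j = random.randrange(L), random.randrange(L)      # lightning attempt
    if grid[i][j] == 1 and random.random() < F_LIGHTNING * L * L:
        sizes["lightning"].append(burn_cluster(grid, i, j))
    i, j = random.randrange(L), random.randrange(L)      # man-made attempt
    n_trees = sum(grid[x][y] for x, y in neighbours(i, j))
    if grid[i][j] == 1 and n_trees <= 2 and random.random() < F_HUMAN * L * L:
        sizes["man_made"].append(burn_cluster(grid, i, j))

random.seed(3)
grid = [[0] * L for _ in range(L)]
sizes = {"lightning": [], "man_made": []}
for _ in range(1500):
    step(grid, sizes)
print({k: (len(v), max(v) if v else 0) for k, v in sizes.items()})
```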

  20. Orbital transfer rocket engine technology program: Soft wear ring seal technology

    NASA Technical Reports Server (NTRS)

    Lariviere, Brian W.

    1992-01-01

    Liquid oxygen (LOX) compatibility tests, including autogenous ignition, promoted ignition, LOX impact tests, and friction and wear tests on different PV products were conducted for several polymer materials as verification for the implementation of soft wear ring seals in advanced rocket engine turbopumps. Thermoplastics, polyimide based materials, and polyimide-imide base materials were compared for oxygen compatibility, specific wear coefficient, wear debris production, and heat dissipation mechanisms. A thermal model was generated that simulated the frictional heating input and calculated the surface temperature and temperature distribution within the seal. The predictions were compared against measured values. Heat loads in the model were varied to better match the test data and determine the difference between the measured and the calculated coefficients of friction.

  1. Design of fast signal processing readout front-end electronics implemented in CMOS 40 nm technology

    NASA Astrophysics Data System (ADS)

    Kleczek, Rafal

    2016-12-01

    The author presents considerations on the design of fast readout front-end electronics implemented in a CMOS 40 nm technology, with an emphasis on system dead time, noise performance and power dissipation. The designed processing channel consists of a charge-sensitive amplifier with different feedback types (Krummenacher, resistive and constant-current blocks), a threshold-setting block, a discriminator and a counter with logic circuitry. The results of schematic and post-layout simulations with input pulses randomly generated in the time domain according to a Poisson distribution are presented and analyzed. A dead time below 20 ns is achievable while keeping the noise at ENC ≈ 90 e- for a detector capacitance CDET = 160 fF.
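
    The Poisson-distributed input pulses and the effect of a finite dead time can be illustrated with the sketch below; the 5 Mcps rate and the non-paralyzable dead-time model are assumptions for illustration, not the simulated channel.

```python
import numpy as np

def poisson_pulse_times(rate_hz, duration_s, seed=0):
    """Generate pulse arrival times with exponentially distributed gaps,
    i.e. a Poisson process of the given mean rate."""
    rng = np.random.default_rng(seed)
    gaps = rng.exponential(1.0 / rate_hz, size=int(rate_hz * duration_s * 2))
    t = np.cumsum(gaps)
    return t[t < duration_s]

def counted_fraction(times, dead_time_s):
    """Non-paralyzable dead-time model: a pulse is lost if it arrives within
    the dead time after the last *counted* pulse."""
    counted, last = 0, -np.inf
    for t in times:
        if t - last >= dead_time_s:
            counted, last = counted + 1, t
    return counted / len(times)

# Illustrative numbers (assumed): 5 Mcps mean pulse rate, 20 ns dead time.
pulses = poisson_pulse_times(rate_hz=5e6, duration_s=0.01)
print(f"fraction of pulses counted: {counted_fraction(pulses, 20e-9):.3f}")
```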

  2. Mid-Infrared Trace Gas Sensor Technology Based on Intracavity Quartz-Enhanced Photoacoustic Spectroscopy

    PubMed Central

    Wojtas, Jacek; Gluszek, Aleksander; Hudzikowski, Arkadiusz; Tittel, Frank K.

    2017-01-01

    The application of compact, inexpensive trace-gas sensor technology to mid-infrared nitric oxide (NO) detection using intracavity quartz-enhanced photoacoustic spectroscopy (I-QEPAS) is reported. A minimum detection limit of 4.8 ppbv within a 30 ms integration time was demonstrated by using a room-temperature, continuous-wave, distributed-feedback quantum cascade laser (QCL) emitting at 5.263 µm (1900.08 cm−1) and a new compact design of a high-finesse bow-tie optical cavity with an integrated resonant quartz tuning fork (QTF). The optimum configuration of the bow-tie cavity was simulated using custom software. Measurements were performed with a wavelength modulation (WM) scheme using a 2f detection procedure. PMID:28273836

  3. Stochastic modeling to identify requirements for centralized monitoring of distributed wastewater treatment.

    PubMed

    Hug, T; Maurer, M

    2012-01-01

    Distributed (decentralized) wastewater treatment can, in many situations, be a valuable alternative to a centralized sewer network and wastewater treatment plant. However, it is critical for its acceptance whether the same overall treatment performance can be achieved without on-site staff, and whether that performance can be measured. In this paper we argue and illustrate that the system performance depends not only on the design performance and reliability of the individual treatment units, but also significantly on the monitoring scheme, i.e. on the reliability of the process information. For this purpose, we present a simple model of a fleet of identical treatment units. Their performance depends on four stochastic variables: the reliability of the treatment unit, the response time for the repair of failed units, the reliability of on-line sensors, and the frequency of routine inspections. The simulated scenarios show a significant difference between the true performance and what is observed via the sensors and inspections. The results also illustrate the trade-off between investing in reactor and sensor technology and in human interventions in order to achieve a certain target performance. Modeling can quantify such effects and thereby support the identification of requirements for the centralized monitoring of distributed treatment units. The model approach is generic and can be extended and applied to various distributed wastewater treatment technologies and contexts.
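
    A minimal Monte Carlo sketch of such a fleet, contrasting true and observed availability, is given below; the failure, sensor and inspection parameters are invented for illustration and are not the paper's calibrated values.

```python
import random

# A fleet of identical treatment units that fail at random, are detected either
# by an on-line sensor (which itself may fail) or at the next periodic
# inspection, and are repaired after a fixed response time. True downtime starts
# at failure; observed downtime starts only at detection.
def simulate_fleet(n_units=100, days=365, p_fail=0.01, p_sensor_works=0.8,
                   inspection_every=14, repair_days=2, seed=0):
    random.seed(seed)
    true_down = observed_down = 0
    for _ in range(n_units):
        failed_at = None
        for day in range(days):
            if failed_at is None and random.random() < p_fail:
                failed_at = day
                if random.random() < p_sensor_works:
                    detect_day = day
                else:
                    detect_day = ((day // inspection_every) + 1) * inspection_every
            if failed_at is not None:
                true_down += 1
                if day >= detect_day:
                    observed_down += 1
                if day >= detect_day + repair_days:
                    failed_at = None             # repaired
    total = n_units * days
    return 1 - true_down / total, 1 - observed_down / total

true_perf, observed_perf = simulate_fleet()
print(f"true availability {true_perf:.3f}, observed availability {observed_perf:.3f}")
```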

  4. Room temperature solid-state quantum emitters in the telecom range

    PubMed Central

    Bodrog, Zoltán; Adamo, Giorgio; Gali, Adam

    2018-01-01

    On-demand, single-photon emitters (SPEs) play a key role across a broad range of quantum technologies. In quantum networks and quantum key distribution protocols, where photons are used as flying qubits, telecom wavelength operation is preferred because of the reduced fiber loss. However, despite the tremendous efforts to develop various triggered SPE platforms, a robust source of triggered SPEs operating at room temperature and the telecom wavelength is still missing. We report a triggered, optically stable, room temperature solid-state SPE operating at telecom wavelengths. The emitters exhibit high photon purity (~5% multiphoton events) and a record-high brightness of ~1.5 MHz. The emission is attributed to localized defects in a gallium nitride (GaN) crystal. The high-performance SPEs embedded in a technologically mature semiconductor are promising for on-chip quantum simulators and practical quantum communication technologies. PMID:29670945

  5. Maturation of biomass-to-biofuels conversion technology pathways for rapid expansion of biofuels production: A system dynamics perspective

    DOE PAGES

    Vimmerstedt, Laura J.; Bush, Brian W.; Hsu, Dave D.; ...

    2014-08-12

    The Biomass Scenario Model (BSM) is a system-dynamics simulation model intended to explore the potential for rapid expansion of the biofuels industry. The model is not predictive; it uses scenario assumptions based on various types of data to simulate industry development, emphasizing how incentives and technological learning-by-doing might accelerate industry growth. The BSM simulates major sectors of the biofuels industry, including feedstock production and logistics, conversion, distribution, and end uses, as well as interactions among sectors. The model represents conversion of biomass to biofuels as a set of technology pathways, each of which has allowable feedstocks, capital and operating costs, allowable products, and other defined characteristics. This study and the BSM address bioenergy modeling analytic needs that were identified in recent literature reviews. Simulations indicate that investments are most effective at expanding biofuels production through learning-by-doing when they are coordinated with respect to timing, pathway, and target sector within the biofuels industry. Effectiveness metrics include timing and magnitude of increased production, incentive cost and cost effectiveness, and avoidance of windfall profits. Investment costs and optimal investment targets have inherent risks and uncertainties, such as the relative value of investment in more-mature versus less-mature pathways. These can be explored through scenarios, but cannot be precisely predicted. Dynamic competition, including competition for cellulosic feedstocks and ethanol market shares, intensifies during times of rapid growth. Ethanol production increases rapidly, even up to Renewable Fuel Standards-targeted volumes of biofuel, in simulations that allow higher blending proportions of ethanol in gasoline-fueled vehicles. Published 2014. This document is a U.S. Government work and is in the public domain in the USA. Biofuels, Bioproducts, Biorefining published by John Wiley & Sons, Ltd on behalf of Society of Chemical Industry.

  6. Simulation study on the impact of air distribution on formaldehyde pollutant distribution in room

    NASA Astrophysics Data System (ADS)

    Wu, Jingtao; Wang, Jun; Cheng, Zhu

    2017-01-01

    In this paper, a physical and mathematical model of a room was established based on the Airpak software. The velocity distribution, air age distribution, formaldehyde concentration distribution and the Predicted Mean Vote (PMV) and Predicted Percentage Dissatisfied (PPD) distributions in a hospital ward were simulated. In addition, the supply air volume was doubled and the resulting changes in the indoor pollutant concentration distribution and in the air age were simulated. The simulation can help determine the position of the air supply port, which is necessary to increase the comfort of the staff in the room. Finally, the simulation of the pollutant concentration distribution shows that when the concentration of indoor pollutants is high, the supply air flow rate should be increased appropriately so that indoor pollutants are discharged as soon as possible, which is beneficial to human health.
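
    The effect of doubling the supply air flow can be illustrated with a simple well-mixed mass balance (a far cruder model than the Airpak CFD used in the paper); the emission rate, room volume and air change rates below are assumptions.

```python
import numpy as np

# Well-mixed room balance: dC/dt = S/V - ACH*C, so the steady-state
# concentration is S/(ACH*V). Doubling the air change rate (ACH) halves the
# steady-state formaldehyde level.
def concentration(t_hours, source_mg_per_h, volume_m3, ach_per_h, c0=0.0):
    c_ss = source_mg_per_h / (ach_per_h * volume_m3)       # mg/m^3
    return c_ss + (c0 - c_ss) * np.exp(-ach_per_h * t_hours)

t = np.linspace(0.0, 8.0, 5)                               # hours
for ach in (2.0, 4.0):
    c = concentration(t, source_mg_per_h=5.0, volume_m3=60.0, ach_per_h=ach)
    print(f"ACH={ach}: {c.round(3)}")
```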

  7. Simulation, design and fabrication of a planar micro thermoelectric generator

    NASA Astrophysics Data System (ADS)

    Pelegrini, S.; Adami, A.; Collini, C.; Conci, P.; Lorenzelli, L.; Pasa, A. A.

    2013-05-01

    This study describes the design, simulation, and microfabrication of a micro thermoelectric generator (μTEG) based on planar technology using constantan (CuNi) and copper (Cu) thermocouples deposited electrochemically (ECD) on a silicon substrate. The present thin-film technology can be manufactured over large areas and also on flexible substrates at low production cost, and can be used to exploit waste heat from equipment or hot surfaces in general. In the current implementation, the silicon structure has been designed and optimized with analytical models and FE simulations in order to exploit the different thermal conductivities of silicon and air gaps to produce the maximum temperature difference on a planar surface. The results showed that a temperature difference of 10 K across the structure creates a temperature difference of 5.3 K across the thermocouples, thus providing a thermal distribution efficiency of up to 55%, depending on the heat convection at the surface. The efficiency of the module has been experimentally tested under different working conditions, showing the dependence of the module output on the external heat exchange (natural and forced convection). The maximum generated potential at 6 m/s airflow is 5.7 V/(m2 K) and the thermoelectric efficiency is 1.9 μW K-2 m-2.
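
    The module output can be roughed out from the Seebeck relation, as sketched below; the couple count, Seebeck coefficient and internal resistance are assumed values, not the fabricated device's measured parameters.

```python
# Open-circuit voltage V = N * alpha_couple * dT and matched-load power
# P = V^2 / (4 * R_internal) for a planar thermoelectric generator.
def teg_output(n_couples, seebeck_couple_V_per_K, dT_K, r_internal_ohm):
    v_oc = n_couples * seebeck_couple_V_per_K * dT_K
    p_matched = v_oc**2 / (4.0 * r_internal_ohm)
    return v_oc, p_matched

# Assumed: Cu/CuNi couple ~ 40 uV/K, 200 couples, 5.3 K across junctions, 2 kOhm.
v, p = teg_output(200, 40e-6, 5.3, 2000.0)
print(f"V_oc = {v * 1e3:.1f} mV, P_matched = {p * 1e6:.3f} uW")
```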

  8. Multi-agent coordination algorithms for control of distributed energy resources in smart grids

    NASA Astrophysics Data System (ADS)

    Cortes, Andres

    Sustainable energy is a top-priority for researchers these days, since electricity and transportation are pillars of modern society. Integration of clean energy technologies such as wind, solar, and plug-in electric vehicles (PEVs), is a major engineering challenge in operation and management of power systems. This is due to the uncertain nature of renewable energy technologies and the large amount of extra load that PEVs would add to the power grid. Given the networked structure of a power system, multi-agent control and optimization strategies are natural approaches to address the various problems of interest for the safe and reliable operation of the power grid. The distributed computation in multi-agent algorithms addresses three problems at the same time: i) it allows for the handling of problems with millions of variables that a single processor cannot compute, ii) it allows certain independence and privacy to electricity customers by not requiring any usage information, and iii) it is robust to localized failures in the communication network, being able to solve problems by simply neglecting the failing section of the system. We propose various algorithms to coordinate storage, generation, and demand resources in a power grid using multi-agent computation and decentralized decision making. First, we introduce a hierarchical vehicle-one-grid (V1G) algorithm for coordination of PEVs under usage constraints, where energy only flows from the grid in to the batteries of PEVs. We then present a hierarchical vehicle-to-grid (V2G) algorithm for PEV coordination that takes into consideration line capacity constraints in the distribution grid, and where energy flows both ways, from the grid in to the batteries, and from the batteries to the grid. Next, we develop a greedy-like hierarchical algorithm for management of demand response events with on/off loads. Finally, we introduce distributed algorithms for the optimal control of distributed energy resources, i.e., generation and storage in a microgrid. The algorithms we present are provably correct and tested in simulation. Each algorithm is assumed to work on a particular network topology, and simulation studies are carried out in order to demonstrate their convergence properties to a desired solution.
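
    A minimal sketch of hierarchical charging coordination in the V1G spirit is given below; the aggregator scaling rule, feeder capacity and vehicle parameters are assumptions for illustration, not the dissertation's provably correct algorithms.

```python
# An aggregator repeatedly asks PEVs for their desired charging power, then
# scales the requests so the feeder capacity is respected; each PEV applies its
# own charger limit and remaining-energy need. Energy flows one way (grid to
# vehicle), as in the V1G setting.
def pev_request(remaining_kwh, max_kw, dt_h):
    return min(max_kw, remaining_kwh / dt_h)

def coordinate_step(remaining, max_kw, feeder_cap_kw, dt_h=0.25):
    requests = [pev_request(r, m, dt_h) for r, m in zip(remaining, max_kw)]
    total = sum(requests)
    scale = min(1.0, feeder_cap_kw / total) if total > 0 else 0.0
    powers = [scale * q for q in requests]
    remaining = [max(0.0, r - p * dt_h) for r, p in zip(remaining, powers)]
    return remaining, powers

remaining = [8.0, 3.0, 12.0]          # kWh still needed per PEV (assumed)
max_kw = [7.2, 3.6, 7.2]              # charger limits (assumed)
for step in range(3):
    remaining, powers = coordinate_step(remaining, max_kw, feeder_cap_kw=10.0)
    print(f"t={step * 15:3d} min  powers={[round(p, 2) for p in powers]}")
```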

  9. Advanced EMT and Phasor-Domain Hybrid Simulation with Simulation Mode Switching Capability for Transmission and Distribution Systems

    DOE PAGES

    Huang, Qiuhua; Vittal, Vijay

    2018-05-09

    Conventional electromagnetic transient (EMT) and phasor-domain hybrid simulation approaches presently exist for transmission-system-level studies. Their simulation efficiency is generally constrained by the EMT simulation. With an increasing number of distributed energy resources and non-conventional loads being installed in distribution systems, it is imperative to extend the hybrid simulation application to include distribution systems and integrated transmission and distribution systems. Meanwhile, it is equally important to improve the simulation efficiency as the modeling scope and complexity of the detailed system in the EMT simulation increase. To meet both requirements, this paper introduces an advanced EMT and phasor-domain hybrid simulation approach. This approach has two main features: 1) a comprehensive phasor-domain modeling framework which supports positive-sequence, three-sequence, three-phase and mixed three-sequence/three-phase representations, and 2) a robust and flexible simulation mode switching scheme. The developed scheme enables switching from hybrid simulation mode back to pure phasor-domain dynamic simulation mode to achieve significantly improved simulation efficiency. The proposed method has been tested on integrated transmission and distribution systems. The results show that with the developed simulation switching feature, the total computational time is significantly reduced compared to running the hybrid simulation for the whole simulation period, while maintaining good simulation accuracy.

  10. Advanced EMT and Phasor-Domain Hybrid Simulation with Simulation Mode Switching Capability for Transmission and Distribution Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Qiuhua; Vittal, Vijay

    Conventional electromagnetic transient (EMT) and phasor-domain hybrid simulation approaches presently exist for transmission system level studies. Their simulation efficiency is generally constrained by the EMT simulation. With an increasing number of distributed energy resources and non-conventional loads being installed in distribution systems, it is imperative to extend the hybrid simulation application to include distribution systems and integrated transmission and distribution systems. Meanwhile, it is equally important to improve the simulation efficiency as the modeling scope and complexity of the detailed system in the EMT simulation increases. To meet both requirements, this paper introduces an advanced EMT and phasor-domain hybrid simulation approach. This approach has two main features: 1) a comprehensive phasor-domain modeling framework which supports positive-sequence, three-sequence, three-phase and mixed three-sequence/three-phase representations and 2) a robust and flexible simulation mode switching scheme. The developed scheme enables simulation switching from hybrid simulation mode back to pure phasor-domain dynamic simulation mode to achieve significantly improved simulation efficiency. The proposed method has been tested on integrated transmission and distribution systems. In conclusion, the results show that with the developed simulation switching feature, the total computational time is significantly reduced compared to running the hybrid simulation for the whole simulation period, while maintaining good simulation accuracy.

  11. Modeling spallation reactions in tungsten and uranium targets with the Geant4 toolkit

    NASA Astrophysics Data System (ADS)

    Malyshkin, Yury; Pshenichnov, Igor; Mishustin, Igor; Greiner, Walter

    2012-02-01

    We study primary and secondary reactions induced by 600 MeV proton beams in monolithic cylindrical targets made of natural tungsten and uranium by using Monte Carlo simulations with the Geant4 toolkit [1-3]. Bertini intranuclear cascade model, Binary cascade model and IntraNuclear Cascade Liège (INCL) with ABLA model [4] were used as calculational options to describe nuclear reactions. Fission cross sections, neutron multiplicity and mass distributions of fragments for 238U fission induced by 25.6 and 62.9 MeV protons are calculated and compared to recent experimental data [5]. Time distributions of neutron leakage from the targets and heat depositions are calculated. This project is supported by Siemens Corporate Technology.

  12. Aerospace Applications of Weibull and Monte Carlo Simulation with Importance Sampling

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.

    1998-01-01

    Recent developments in reliability modeling and computer technology have made it practical to use the Weibull time to failure distribution to model the system reliability of complex fault-tolerant computer-based systems. These system models are becoming increasingly popular in space systems applications as a result of mounting data that support the decreasing Weibull failure distribution and the expectation of increased system reliability. This presentation introduces the new reliability modeling developments and demonstrates their application to a novel space system application. The application is a proposed guidance, navigation, and control (GN&C) system for use in a long duration manned spacecraft for a possible Mars mission. Comparisons to the constant failure rate model are presented and the ramifications of doing so are discussed.
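
    The contrast between a Weibull time-to-failure model and the constant-failure-rate (exponential) model can be illustrated in a few lines; the shape and scale values below are purely illustrative and not taken from the presentation.

    ```python
    import numpy as np

    # Weibull reliability R(t) = exp(-(t/eta)^beta); beta < 1 gives the
    # decreasing failure rate referred to in the abstract.  Values are assumed.
    beta, eta = 0.8, 50_000.0                       # shape, scale (hours)
    t = np.array([1_000.0, 10_000.0, 43_800.0])     # e.g. up to ~5 years of operation

    R_weibull = np.exp(-(t / eta) ** beta)

    # Constant-failure-rate (exponential) model matched to the same scale.
    lam = 1.0 / eta
    R_exponential = np.exp(-lam * t)

    for ti, rw, re in zip(t, R_weibull, R_exponential):
        print(f"t = {ti:8.0f} h   R_weibull = {rw:.4f}   R_exp = {re:.4f}")
    ```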

  13. Physical mechanisms affecting hot carrier-induced degradation in gallium nitride HEMTs

    NASA Astrophysics Data System (ADS)

    Mukherjee, Shubhajit

    Gallium nitride (GaN)-based high electron mobility transistors (HEMTs) are currently among the most promising device technologies for several key military and civilian applications due to their excellent high-power as well as high-frequency performance. Even though the performance figures are outstanding, GaN-based HEMTs are not as mature as some competing technologies, which means that establishing the reliability of the technology is important to enable use in critical applications. The objective of this research is to understand the physical mechanisms affecting the reliability of GaN HEMTs at moderate drain biases (typically VDS < 30 V in the devices considered here). The degradation in device performance is believed to be due to the formation or modification of charged defects near the interface by hydrogen depassivation processes (due to electron-activated hydrogen removal) from energetic carriers. A rate equation describing the defect generation process is formulated based on this assumption. A combination of ensemble Monte-Carlo (EMC) simulation statistics, ab-initio density functional theory (DFT) calculations, and accelerated stress experiments is used to relate the candidate defects to the overall degradation behavior (VT and gm). The focus of this work is on the 'semi-ON' mode of transistor operation, in which the degradation is usually observed to be at its highest. This semi-ON state is reasonably close to the biasing region of class-AB high power amplifiers, which are popular because of the combination of high efficiency and low distortion associated with this configuration. The carrier-energy distributions are obtained using an EMC simulator that was developed specifically for III-V HFETs. The rate equation is used to model the degradation at different operating conditions as well as longer stress times from the result of one short-duration stress test, by utilizing the carrier-energy distribution obtained from EMC simulations for one baseline condition. This work also attempts to identify the spatial location of these defects, and how this impacts the VT shift and gm degradation of the devices.
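
    As a rough illustration of the modeling chain described above, the sketch below integrates a first-order defect-generation rate equation whose rate constant is weighted by a carrier-energy distribution. The functional forms, threshold energy, and constants are assumptions for illustration, not the thesis's calibrated model.

    ```python
    import numpy as np

    # Hypothetical carrier-energy distribution f(E) standing in for an EMC result.
    E = np.linspace(0.0, 5.0, 500)              # carrier energy (eV)
    f = np.exp(-E / 0.8)                        # illustrative hot-carrier tail
    f /= np.trapz(f, E)                         # normalize to a probability density

    # Illustrative rate constant: carriers above a threshold energy can
    # depassivate hydrogen with probability proportional to (E - E_th).
    E_th, A = 1.5, 1e-3                         # threshold (eV) and prefactor (1/s), assumed
    k = A * np.trapz(np.where(E > E_th, (E - E_th) * f, 0.0), E)

    # First-order rate equation dN/dt = k * (N_max - N) for the defect density N.
    N_max, dt, steps = 1e12, 10.0, 10_000       # cm^-2, s, number of Euler steps
    N = np.zeros(steps)
    for i in range(1, steps):
        N[i] = N[i - 1] + dt * k * (N_max - N[i - 1])

    print(f"defect density after {steps * dt:.0f} s: {N[-1]:.3e} cm^-2")
    ```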

  14. Distributed Multiple Access Control for the Wireless Mesh Personal Area Networks

    NASA Astrophysics Data System (ADS)

    Park, Moo Sung; Lee, Byungjoo; Rhee, Seung Hyong

    Mesh networking technologies for both high-rate and low-rate wireless personal area networks (WPANs) are under development by several standardization bodies. They are considering adopting distributed TDMA MAC protocols to provide seamless user mobility as well as good peer-to-peer QoS in WPAN mesh. It has been pointed out, however, that the absence of a central controller in the wireless TDMA MAC may cause severe performance degradation: e.g., fair allocation, service differentiation, and admission control may be hard to achieve or cannot be provided. In this paper, we suggest a new framework of resource allocation for the distributed MAC protocols in WPANs. Simulation results show that our algorithm achieves both a fair resource allocation and flexible service differentiation in a fully distributed way for mesh WPANs where the devices have high mobility and various requirements. We also provide an analytical model to discuss its unique equilibrium and to compute the lengths of reserved time slots at the stable point.

  15. Impact of distributions on the archetypes and prototypes in heterogeneous nanoparticle ensembles.

    PubMed

    Fernandez, Michael; Wilson, Hugh F; Barnard, Amanda S

    2017-01-05

    The magnitude and complexity of the structural and functional data available on nanomaterials requires data analytics, statistical analysis and information technology to drive discovery. We demonstrate that multivariate statistical analysis can recognise the sets of truly significant nanostructures and their most relevant properties in heterogeneous ensembles with different probability distributions. The prototypical and archetypal nanostructures of five virtual ensembles of Si quantum dots (SiQDs) with Boltzmann, frequency, normal, Poisson and random distributions are identified using clustering and archetypal analysis, where we find that their diversity is defined by size and shape, regardless of the type of distribution. At the convex hull of the SiQD ensembles, simple configuration archetypes can efficiently describe a large number of SiQDs, whereas more complex shapes are needed to represent the average ordering of the ensembles. This approach provides a route towards the characterisation of computationally intractable virtual nanomaterial spaces, which can convert big data into smart data, and significantly reduce the workload to simulate experimentally relevant virtual samples.

  16. High-Fidelity Coupled Monte-Carlo/Thermal-Hydraulics Calculations

    NASA Astrophysics Data System (ADS)

    Ivanov, Aleksandar; Sanchez, Victor; Ivanov, Kostadin

    2014-06-01

    Monte Carlo methods have been used as reference reactor physics calculation tools worldwide. Advances in computer technology allow the calculation of detailed flux distributions in both space and energy. In most cases, however, those calculations are done under the assumption of homogeneous material density and temperature distributions. The aim of this work is to develop a consistent methodology for providing realistic three-dimensional thermal-hydraulic distributions by coupling the in-house developed sub-channel code SUBCHANFLOW with the standard Monte-Carlo transport code MCNP. In addition to the innovative technique of on-the-fly material definition, a flux-based weight-window technique has been introduced to improve both the magnitude and the distribution of the relative errors. Finally, a coupled code system for the simulation of steady-state reactor physics problems has been developed. Besides the problem of effective feedback data interchange between the codes, the treatment of the temperature dependence of the continuous energy nuclear data has been investigated.
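
    A toy version of the coupling strategy, with simplistic one-dimensional stand-ins for the neutronics and sub-channel solvers (not the real MCNP/SUBCHANFLOW interfaces), showing an under-relaxed fixed-point (Picard) iteration between the power and temperature fields:

    ```python
    import numpy as np

    def neutronics(fuel_temp):
        """Normalized axial power shape with a crude Doppler-like feedback:
        hotter nodes produce slightly less power (coefficient assumed)."""
        shape = np.sin(np.pi * (np.arange(len(fuel_temp)) + 0.5) / len(fuel_temp))
        shape *= 1.0 - 5e-5 * (fuel_temp - 600.0)
        return shape / shape.sum()

    def thermal_hydraulics(power, inlet_temp=560.0, rise=80.0):
        """Fuel temperatures from cumulative coolant heat-up plus a local term."""
        coolant = inlet_temp + rise * np.cumsum(power)
        return coolant + 400.0 * power / power.max()   # fuel-to-coolant delta, assumed

    n_nodes, relax, tol = 20, 0.5, 1e-6
    fuel_temp = np.full(n_nodes, 600.0)                # initial uniform guess (K)

    for it in range(50):                               # Picard (fixed-point) iteration
        power = neutronics(fuel_temp)
        new_temp = thermal_hydraulics(power)
        update = relax * new_temp + (1 - relax) * fuel_temp   # under-relaxation
        converged = np.max(np.abs(update - fuel_temp)) < tol
        fuel_temp = update
        if converged:
            break

    print(f"converged in {it + 1} iterations; peak fuel T = {fuel_temp.max():.1f} K")
    ```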

  17. Level crossings and excess times due to a superposition of uncorrelated exponential pulses

    NASA Astrophysics Data System (ADS)

    Theodorsen, A.; Garcia, O. E.

    2018-01-01

    A well-known stochastic model for intermittent fluctuations in physical systems is investigated. The model is given by a superposition of uncorrelated exponential pulses, and the degree of pulse overlap is interpreted as an intermittency parameter. Expressions for excess time statistics, that is, the rate of level crossings above a given threshold and the average time spent above the threshold, are derived from the joint distribution of the process and its derivative. Limits of both high and low intermittency are investigated and compared to previously known results. In the case of a strongly intermittent process, the distribution of times spent above threshold is obtained analytically. This expression is verified numerically, and the distribution of times above threshold is explored for other intermittency regimes. The numerical simulations compare favorably to known results for the distribution of times above the mean threshold for an Ornstein-Uhlenbeck process. This contribution generalizes the excess time statistics for the stochastic model, which find applications in a wide diversity of natural and technological systems.
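
    The excess-time quantities described above are easy to estimate directly from a synthesized realization of the process; in the sketch below a superposition of uncorrelated exponential pulses is generated and the up-crossing rate and mean time above a threshold are counted. Pulse parameters and the threshold choice are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthesize a shot-noise process: Poisson pulse arrivals, exponentially
    # distributed amplitudes, and one-sided exponential pulse shapes.
    T, dt, tau_d = 500.0, 0.01, 1.0          # record length, time step, pulse duration
    gamma = 5.0                              # intermittency parameter: pulses per tau_d (assumed)
    t = np.arange(0.0, T, dt)
    n_pulses = rng.poisson(gamma * T / tau_d)
    arrivals = rng.uniform(0.0, T, n_pulses)
    amps = rng.exponential(1.0, n_pulses)

    signal = np.zeros_like(t)
    for t0, a in zip(arrivals, amps):
        mask = t >= t0
        signal[mask] += a * np.exp(-(t[mask] - t0) / tau_d)

    # Excess-time statistics for a threshold: number of up-crossings and the
    # average time spent above the threshold per crossing.
    threshold = signal.mean() + 2 * signal.std()
    above = signal > threshold
    up_crossings = np.count_nonzero(~above[:-1] & above[1:])
    time_above = above.sum() * dt
    print(f"up-crossing rate: {up_crossings / T:.4f} per unit time")
    print(f"mean excess time: {time_above / max(up_crossings, 1):.3f}")
    ```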

  18. CROSS-DISCIPLINARY PHYSICS AND RELATED AREAS OF SCIENCE AND TECHNOLOGY: Note on Two-Phase Phenomena in Financial Markets

    NASA Astrophysics Data System (ADS)

    Jiang, Shi-Mei; Cai, Shi-Min; Zhou, Tao; Zhou, Pei-Ling

    2008-06-01

    The two-phase behaviour in financial markets actually means the bifurcation phenomenon, which represents the change of the conditional probability from an unimodal to a bimodal distribution. We investigate the bifurcation phenomenon in Hang-Seng index. It is observed that the bifurcation phenomenon in financial index is not universal, but specific under certain conditions. For Hang-Seng index and randomly generated time series, the phenomenon just emerges when the power-law exponent of absolute increment distribution is between 1 and 2 with appropriate period. Simulations on a randomly generated time series suggest the bifurcation phenomenon itself is subject to the statistics of absolute increment, thus it may not be able to reflect essential financial behaviours. However, even under the same distribution of absolute increment, the range where bifurcation phenomenon occurs is far different from real market to artificial data, which may reflect certain market information.

  19. [Micro-simulation of firms' heterogeneity on pollution intensity and regional characteristics].

    PubMed

    Zhao, Nan; Liu, Yi; Chen, Ji-Ning

    2009-11-01

    Within the same industrial sector, pollution intensity is heterogeneous among firms. Errors arise if the sector's average pollution intensity, calculated from the limited number of firms in the environmental statistics database, is used to represent the sector's regional economic-environmental status. Based on a production function that includes environmental depletion as an input, a micro-simulation model of firms' operational decision making is proposed, so that the heterogeneity of firms' pollution intensity can be described mechanistically. Taking the mechanical manufacturing sector of Deyang city in 2005 as the case, the model's parameters were estimated, and the actual COD emission intensities of the firms in the environmental statistics database were properly matched by the simulation. The model's results also show that the regional average COD emission intensity calculated from the environmental statistics firms (0.002 6 t per 10 000 yuan fixed asset, 0.001 5 t per 10 000 yuan production value) is lower than the regional average intensity calculated from all the firms in the region (0.003 0 t per 10 000 yuan fixed asset, 0.002 3 t per 10 000 yuan production value). The difference among the average intensities of the six counties is significant as well. These regional characteristics of pollution intensity are attributable to the sector's inner structure (firms' scale distribution, technology distribution) and its spatial deviation.

  20. Comparison of CFD simulations to non-rotating MEXICO blades experiment in the LTT wind tunnel of TUDelft

    NASA Astrophysics Data System (ADS)

    Zhang, Ye; van Zuijlen, Alexander; van Bussel, Gerard

    2014-06-01

    In this paper, three-dimensional flow over non-rotating MEXICO blades is simulated by CFD methods. The numerical results are compared with the latest MEXICO wind turbine blade measurements obtained in the low speed low turbulence (LTT) wind tunnel of Delft University of Technology. This study aims to validate CFD codes by using these experimental data measured in well controlled conditions. In order to avoid the use of wind tunnel corrections, both the blades and the wind tunnel test section are modelled in the simulations. The ability of Menter's k - ω shear stress transport (SST) turbulence model is investigated for both attached flow and massively separated flow cases. Steady state Reynolds averaged Navier Stokes (RANS) equations are solved in these computations. The pressure distributions at three measured sections are compared under the conditions of different inflow velocities and a range of angles of attack. The comparison shows that at attached flow conditions, good agreement can be obtained for all three airfoil sections. Even with massively separated flow, fairly good pressure distribution comparisons can still be found for the DU and NACA airfoil sections, although the RISØ section shows poor agreement. In the near-stall case, considerable deviations exist on the forward half of the upper surface for all three sections.

  1. [The Diagnostics of Detonation Flow External Field Based on Multispectral Absorption Spectroscopy Technology].

    PubMed

    Lü, Xiao-jing; Li, Ning; Weng, Chun-sheng

    2016-03-01

    Compared with traditional sampling-based sensing methods, absorption spectroscopy technology is well suited to detonation flow diagnostics, since it provides a fast-response, nonintrusive, and sensitive solution for in situ measurement of multiple flow-field parameters. With traditional absorption spectroscopy, the temperature and concentration results are average values along the laser path; because the boundary of the detonation flow external field is unknown and changes continuously while the detonation engine operates, traditional absorption spectroscopy is no longer suitable for detonation diagnostics. The trend of line strength with temperature varies among absorption lines, so by increasing the number of absorption lines in the test path, more information about the non-uniform flow field can be obtained. In this paper, based on multispectral absorption technology, a reconstruction model of the detonation flow external field distribution was established according to the simulation results of the space-time conservation element and solution element method, and a diagnostic method for the detonation flow external field is given. The model deviation and the calculation error of the adopted least squares method were studied by simulation; the maximum concentration and temperature calculation errors were 20.1% and 3.2%, respectively. Four absorption lines of H2O were chosen and the detonation flow was scanned simultaneously. A detonation external flow testing system was set up for a valveless gas-liquid continuous pulse detonation engine with a diameter of 80 mm. By scanning the H2O absorption lines at a high repetition rate of 10 kHz, on-line detection of the detonation external flow was realized by the direct absorption method combined with time-division multiplexing technology, and the reconstruction of the dynamic temperature distribution was realized as well for the first time, both verifying the feasibility of the test method. The test results show that both the temperature and the H2O concentration rose with the arrival of the detonation wave. With increasing vertical distance between the detonation tube nozzle and the laser path, the time for the temperature and concentration to reach their peaks was delayed, and the temperature variation trend tended to slow down. At 20 cm from the detonation tube nozzle, the maximum temperature reached 1 329 K and the maximum H2O concentration of 0.19 occurred at 4 ms after ignition. This research supports extending detonation testing with absorption spectroscopy technology, and can also help to promote detonation mechanism research and to enhance the level of detonation engine control technology.
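
    A heavily simplified sketch of the underlying two-line ratio thermometry idea: the ratio of integrated absorbances of two lines with different lower-state energies is monotonic in temperature, so a root solve recovers the path-averaged temperature. The line parameters are invented for illustration (not the paper's H2O lines), the line-strength scaling neglects partition-function and stimulated-emission corrections, and SciPy is assumed to be available.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    HC_K = 1.4388                              # second radiation constant hc/k (cm*K)

    def line_strength(S0, E_low, T, T0=296.0):
        """Simplified line-strength scaling: only the Boltzmann population factor
        and a 1/T partition-function approximation are kept."""
        return S0 * (T0 / T) * np.exp(-HC_K * E_low * (1.0 / T - 1.0 / T0))

    # Two hypothetical lines: reference strengths and lower-state energies.
    S0 = np.array([1.0e-21, 3.0e-22])          # cm^-1/(molecule cm^-2), illustrative
    E_low = np.array([100.0, 1500.0])          # cm^-1, illustrative

    def absorbance_ratio(T):
        return line_strength(S0[0], E_low[0], T) / line_strength(S0[1], E_low[1], T)

    measured_ratio = 1.8                       # "measured" integrated-absorbance ratio

    # The ratio decreases monotonically with T for lines with different E_low,
    # so a bracketed root solve recovers the path-averaged temperature.
    T_est = brentq(lambda T: absorbance_ratio(T) - measured_ratio, 300.0, 3000.0)
    print(f"inferred temperature: {T_est:.0f} K")
    ```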

  2. Dynamic Systems for Individual Tracking via Heterogeneous Information Integration and Crowd Source Distributed Simulation

    DTIC Science & Technology

    2015-12-04

    ...simulations executing on mobile computing platforms, an area not widely studied to date in the distributed simulation research community. ... These initial studies focused on two conservative synchronization algorithms widely used in the distributed simulation field.

  3. [New simulation technologies in neurosurgery].

    PubMed

    Byvaltsev, V A; Belykh, E G; Konovalov, N A

    2016-01-01

    The article presents a literature review on the current state of simulation technologies in neurosurgery, a brief description of the basic technology and the classification of simulation models, and examples of simulation models and skills simulators used in neurosurgery. Basic models for the development of physical skills, the spectrum of available computer virtual simulators, and their main characteristics are described. It would be instructive to include microneurosurgical training and a cadaver course of neurosurgical approaches in neurosurgery training programs and to extend the use of three-dimensional imaging. Technologies for producing three-dimensional anatomical models and patient-specific computer simulators as well as improvement of tactile feedback systems and display quality of virtual models are promising areas. Continued professional education necessitates further research for assessing the validity and practical use of simulators and physical models.

  4. Design and performance test of NIRS-based spinal cord lesion detector

    NASA Astrophysics Data System (ADS)

    Li, Nanxi; Li, Ting

    2018-02-01

    Spinal cord lesions can cause a series of severe complications, which can even lead to paralysis with high mortality. However, the traditional diagnosis of spinal cord lesions relies on complicated imaging modalities and other invasive and risky methods. Here, we have designed a small monitor based on NIRS technology for noninvasive monitoring of spinal cord lesions. The development of the instrument system includes the design of the hardware circuits and the programming of the software. In terms of hardware, the OPT1011 is selected as the light detector, and the probe layout is chosen according to the results of a Monte Carlo simulation. A capable microcontroller serves as the system's central processing chip in the circuit design, and the data are transmitted over a serial port to the host computer for post-processing. Finally, we verify the stability and feasibility of the instrument system. The spinal signal could be clearly detected by the system, which indicates that our NIRS-based monitor has the potential to monitor spinal lesions.

  5. A New Numerical Simulation technology of Multistage Fracturing in Horizontal Well

    NASA Astrophysics Data System (ADS)

    Cheng, Ning; Kang, Kaifeng; Li, Jianming; Liu, Tao; Ding, Kun

    2017-11-01

    Horizontal multi-stage fracturing is recognized as an effective development technology for unconventional oil resources. Geomechanics occupies a very important position in the numerical simulation of hydraulic fracturing; compared with conventional numerical simulation technology, the new approach explicitly accounts for its influence. The new numerical simulation of hydraulic fracturing can therefore more effectively optimize the fracturing design and evaluate production after fracturing. This study is based on a three-dimensional stress and rock physics parameter model and uses the latest fluid-solid coupling numerical simulation technology to capture the fracture propagation process, describe the change of the stress field during fracturing, and finally predict production.

  6. The Shale Hills Critical Zone Observatory for Embedded Sensing and Simulation

    NASA Astrophysics Data System (ADS)

    Duffy, C.; Davis, K.; Kane, T.; Boyer, E.

    2009-04-01

    The future of environmental observing systems will utilize embedded sensor networks with continuous real-time measurement of hydrologic, atmospheric, biogeochemical, and ecological variables across diverse terrestrial environments. Embedded environmental sensors, benefitting from advances in information sciences, networking technology, materials science, computing capacity, and data synthesis methods, are undergoing revolutionary change. It is now possible to field spatially-distributed, multi-node sensor networks that provide density and spatial coverage previously accessible only via numerical simulation. At the same time, computational tools are advancing rapidly to the point where it is now possible to simulate the physical processes controlling individual parcels of water and solutes through the complete terrestrial water cycle. Our goal for the Penn State Critical Zone Observatory is to apply environmental sensor arrays and integrated hydrologic models deployed and coordinated at a testbed within the Penn State Experimental Forest. The NSF-funded CZO is designed to observe the detailed space and time complexities of the water and energy cycle for a watershed and ultimately the river basin for all physical states and fluxes (groundwater, soil moisture, temperature, streamflow, latent heat, snowmelt, chemistry, isotopes, etc.). Presently, fully-coupled physical models are being developed that link the atmosphere-land-vegetation-subsurface system into a fully-coupled distributed system. During the last 5 years the Penn State Integrated Hydrologic Modeling System has been under development as an open-source community modeling project funded by NSF EAR/GEO and NSF CBET/ENG. PIHM represents a strategy for the formulation and solution of fully-coupled process equations at the watershed and river basin scales, and includes a tightly coupled GIS tool for data handling, domain decomposition, optimal unstructured grid generation, and model parameterization. (PIHM; http://sourceforge.net/projects/pihmmodel/; http://sourceforge.net/projects/pihmgis/) The CZO sensor and simulation system is being developed to have the following elements: 1) extensive, spatially-distributed smart sensor networks to gather intensive soil, geologic, hydrologic, geochemical and isotopic data; 2) spatially-explicit multiphysics models/solutions of the land-subsurface-vegetation-atmosphere system; 3) parallel/distributed, adaptive algorithms for rapidly simulating the states of the watershed at high resolution; and 4) signal processing tools for data mining and parameter estimation. The proposed prototype sensor array and simulation system is demonstrated with preliminary results from our first year.

  7. Interpreting ecological diversity indices applied to terminal restriction fragment length polymorphism data: insights from simulated microbial communities.

    PubMed

    Blackwood, Christopher B; Hudleston, Deborah; Zak, Donald R; Buyer, Jeffrey S

    2007-08-01

    Ecological diversity indices are frequently applied to molecular profiling methods, such as terminal restriction fragment length polymorphism (T-RFLP), in order to compare diversity among microbial communities. We performed simulations to determine whether diversity indices calculated from T-RFLP profiles could reflect the true diversity of the underlying communities despite potential analytical artifacts. These include multiple taxa generating the same terminal restriction fragment (TRF) and rare TRFs being excluded by a relative abundance (fluorescence) threshold. True community diversity was simulated using the lognormal species abundance distribution. Simulated T-RFLP profiles were generated by assigning each species a TRF size based on an empirical or modeled TRF size distribution. With a typical threshold (1%), the only consistently useful relationship was between Smith and Wilson evenness applied to T-RFLP data (TRF-E(var)) and true Shannon diversity (H'), with correlations between 0.71 and 0.81. TRF-H' and true H' were well correlated in the simulations using the lowest number of species, but this correlation declined substantially in simulations using greater numbers of species, to the point where TRF-H' cannot be considered a useful statistic. The relationships between TRF diversity indices and true indices were sensitive to the relative abundance threshold, with greatly improved correlations observed using a 0.1% threshold, which was investigated for comparative purposes but is not possible to consistently achieve with current technology. In general, the use of diversity indices on T-RFLP data provides inaccurate estimates of true diversity in microbial communities (with the possible exception of TRF-E(var)). We suggest that, where significant differences in T-RFLP diversity indices were found in previous work, these should be reinterpreted as a reflection of differences in community composition rather than a true difference in community diversity.
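
    The indices discussed above are straightforward to reproduce numerically; the sketch below draws a lognormal community, collapses species into shared TRF sizes, applies a 1% relative-abundance threshold, and compares Shannon H' and Smith and Wilson E_var before and after. All community and fragment-size parameters are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def shannon(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def evar(abundances):
        """Smith and Wilson evenness E_var, computed on ln-abundances."""
        x = np.log(abundances[abundances > 0])
        return 1.0 - (2.0 / np.pi) * np.arctan(np.var(x))

    # "True" community: lognormal species-abundance distribution.
    n_species = 500
    true_abund = rng.lognormal(mean=0.0, sigma=1.5, size=n_species)
    true_p = true_abund / true_abund.sum()

    # Simulated T-RFLP profile: each species is assigned a terminal restriction
    # fragment (TRF) size, so several species can collapse into one TRF bin.
    trf_sizes = rng.integers(50, 500, size=n_species)      # bp, illustrative
    profile = np.zeros(500)
    np.add.at(profile, trf_sizes, true_p)

    # Apply a 1% relative-fluorescence threshold, as in typical T-RFLP analysis.
    profile[profile < 0.01 * profile.sum()] = 0.0
    trf_p = profile[profile > 0] / profile[profile > 0].sum()

    print(f"true H' = {shannon(true_p):.2f}   TRF H' = {shannon(trf_p):.2f}")
    print(f"true E_var = {evar(true_abund):.2f}   TRF E_var = {evar(profile):.2f}")
    ```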

  8. New light field camera based on physical based rendering tracing

    NASA Astrophysics Data System (ADS)

    Chung, Ming-Han; Chang, Shan-Ching; Lee, Chih-Kung

    2014-03-01

    Even though light field technology was first invented more than 50 years ago, it did not gain popularity due to the limitations imposed by the computation technology of the time. With the rapid advancement of computer technology over the last decade, that limitation has been lifted and light field technology has quickly returned to the spotlight of the research stage. In this paper, PBRT (Physical Based Rendering Tracing) was introduced to overcome the limitation of using traditional optical simulation approaches to study light field camera technology. More specifically, traditional optical simulation approaches can only present light energy distributions but typically lack the capability to present pictures of realistic scenes. By using PBRT, which was developed to create virtual scenes, 4D light field information was obtained to conduct initial data analysis and calculation. This PBRT approach was also used to explore the potential of light field data calculation in creating realistic photos. Furthermore, we integrated optical experimental measurement results with PBRT in order to place the real measurement results into the virtually created scenes. In other words, our approach provided us with a way to establish a link between the virtual scene and the real measurement results. Several images developed based on the above-mentioned approaches were analyzed and discussed to verify the pros and cons of the newly developed PBRT-based light field camera technology. It will be shown that this newly developed light field camera approach can circumvent the loss of spatial resolution associated with adopting a micro-lens array in front of the image sensors. Operational constraints, performance metrics, computation resources needed, etc., associated with this newly developed light field camera technique are presented in detail.

  9. Focus on Games & Simulations: Trends+Technologies+Case Studies

    ERIC Educational Resources Information Center

    Weinstein, Margery

    2011-01-01

    A changing mindset combined with changing technology is driving the use of games and simulations. People are becoming more open to using games and simulations for learning, and, at the same time, the technologies are making the development of games and simulations easier and faster than a mere five years ago. Together, the changing mindset and the…

  10. Simulation in International Relations Education.

    ERIC Educational Resources Information Center

    Starkey, Brigid A.; Blake, Elizabeth L.

    2001-01-01

    Discusses the educational implications of simulations in international relations. Highlights include the development of international relations simulations; the role of technology; the International Communication and Negotiation Simulations (ICONS) project at the University of Maryland; evolving information technology; and simulating real-world…

  11. Quark fragmentation functions in NJL-jet model

    NASA Astrophysics Data System (ADS)

    Bentz, Wolfgang; Matevosyan, Hrayr; Thomas, Anthony

    2014-09-01

    We report on our studies of quark fragmentation functions in the Nambu-Jona-Lasinio (NJL) - jet model. The results of Monte-Carlo simulations for the fragmentation functions to mesons and nucleons, as well as to pion and kaon pairs (dihadron fragmentation functions) are presented. The important role of intermediate vector meson resonances for those semi-inclusive deep inelastic production processes is emphasized. Our studies are very relevant for the extraction of transverse momentum dependent quark distribution functions from measured scattering cross sections. Supported by Grant in Aid for Scientific Research, Japanese Ministry of Education, Culture, Sports, Science and Technology, Project No. 20168769.

  12. Performance of two differently designed permeable reactive barriers with sulfate and zinc solutions.

    PubMed

    Pérez, Norma; Schwarz, Alex O; Barahona, Esteban; Sanhueza, Pamela; Diaz, Isabel; Urrutia, Homero

    2018-06-18

    For the first time, this laboratory-scale study evaluates the feasibility of incorporating diffusive exchange in permeable reactive barriers. To this end, the performance of two permeable reactive barriers (PRBs) with different internal substrate arrangements was compared during the administration of a sulfate solution without metals (for 163 days) and with metals (for 60 days), simulating groundwater contaminated with acid mine drainage (AMD). In order to simulate a traditional PRB, a homogeneous distribution was implemented in the first reactor, while the other reactor utilized diffusion-active technology (DAPRB). In the DAPRB, the reactive material was interspersed with the conductive material. The measurements in the internal ports showed that transverse sulfide gradients formed in the DAPRB, causing the diffusion of sulfide from the substrate toward the layer interface, which is where the sulfide reacts by forming complexes with the metal. The DAPRB prevents the microorganisms from direct contact with AMD. This protection resulted in greater activity (sulfide production).

  13. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) Comprehensive and consistent set of location independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location independent user authentication and authorization, and overall system security services The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. Examples of these problems include: (1) Coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) Use of widely distributed, federated data archives (e.g., simultaneous access to metrological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation systems}; (3) Coupling large-scale computing and data systems to scientific and engineering instruments (e.g., realtime interaction with experiments through real-time data analysis and interpretation presented to the experimentalist in ways that allow direct interaction with the experiment (instead of just with instrument control); (5) Highly interactive, augmented reality and virtual reality remote collaborations (e.g., Ames / Boeing Remote Help Desk providing field maintenance use of coupled video and NDI to a remote, on-line airframe structures expert who uses this data to index into detailed design databases, and returns 3D internal aircraft geometry to the field); (5) Single computational problems too large for any single system (e.g. the rotocraft reference calculation). Grids also have the potential to provide pools of resources that could be called on in extraordinary / rapid response situations (such as disaster response) because they can provide common interfaces and access mechanisms, standardized management, and uniform user authentication and authorization, for large collections of distributed resources (whether or not they normally function in concert). IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focussed primarily on two types of users: the scientist / design engineer whose primary interest is problem solving (e.g. 
determining wing aerodynamic characteristics in many different operating environments), and whose primary interface to IPG will be through various sorts of problem solving frameworks. The second type of user is the tool designer: the computational scientists who convert physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. The results of the analysis of the needs of these two types of users provides a broad set of requirements that gives rise to a general set of required capabilities. The IPG project is intended to address all of these requirements. In some cases the required computing technology exists, and in some cases it must be researched and developed. The project is using available technology to provide a prototype set of capabilities in a persistent distributed computing testbed. Beyond this, there are required capabilities that are not immediately available, and whose development spans the range from near-term engineering development (one to two years) to much longer term R&D (three to six years). Additional information is contained in the original.

  14. SU-F-18C-09: Assessment of OSL Dosimeter Technology in the Validation of a Monte Carlo Radiation Transport Code for CT Dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carver, D; Kost, S; Pickens, D

    Purpose: To assess the utility of optically stimulated luminescent (OSL) dosimeter technology in calibrating and validating a Monte Carlo radiation transport code for computed tomography (CT). Methods: Exposure data were taken using both a standard CT 100-mm pencil ionization chamber and a series of 150-mm OSL CT dosimeters. Measurements were made at system isocenter in air as well as in standard 16-cm (head) and 32-cm (body) CTDI phantoms at isocenter and at the 12 o'clock positions. Scans were performed on a Philips Brilliance 64 CT scanner for 100 and 120 kVp at 300 mAs with a nominal beam width of 40 mm. A radiation transport code to simulate the CT scanner conditions was developed using the GEANT4 physics toolkit. The imaging geometry and associated parameters were simulated for each ionization chamber and phantom combination. Simulated absorbed doses were compared to both CTDI100 values determined from the ion chamber and to CTDI100 values reported from the OSLs. The dose profiles from each simulation were also compared to the physical OSL dose profiles. Results: CTDI100 values reported by the ion chamber and OSLs are generally in good agreement (average percent difference of 9%), and provide a suitable way to calibrate doses obtained from simulation to real absorbed doses. Simulated and real CTDI100 values agree to within 10% or less, and the simulated dose profiles also predict the physical profiles reported by the OSLs. Conclusion: Ionization chambers are generally considered the standard for absolute dose measurements. However, OSL dosimeters may also serve as a useful tool with the significant benefit of also assessing the radiation dose profile. This may offer an advantage to those developing simulations for assessing radiation dosimetry such as verification of spatial dose distribution and beam width.

  15. IP-Based Video Modem Extender Requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pierson, L G; Boorman, T M; Howe, R E

    2003-12-16

    Visualization is one of the keys to understanding large complex data sets such as those generated by the large computing resources purchased and developed by the Advanced Simulation and Computing program (aka ASCI). In order to be convenient to researchers, visualization data must be distributed to offices and large complex visualization theaters. Currently, local distribution of the visual data is accomplished by distance limited modems and RGB switches that simply do not scale to hundreds of users across the local, metropolitan, and WAN distances without incurring large costs in fiber plant installation and maintenance. Wide Area application over the DOE Complex is infeasible using these limited distance RGB extenders. On the other hand, Internet Protocols (IP) over Ethernet is a scalable well-proven technology that can distribute large volumes of data over these distances. Visual data has been distributed at lower resolutions over IP in industrial applications. This document describes requirements of the ASCI program in visual signal distribution for the purpose of identifying industrial partners willing to develop products to meet ASCI's needs.

  16. Data Driven Smart Proxy for CFD Application of Big Data Analytics & Machine Learning in Computational Fluid Dynamics, Report Two: Model Building at the Cell Level

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ansari, A.; Mohaghegh, S.; Shahnam, M.

    To ensure the usefulness of simulation technologies in practice, their credibility needs to be established with Uncertainty Quantification (UQ) methods. In this project, a smart proxy is introduced to significantly reduce the computational cost of conducting the large number of multiphase CFD simulations typically required for non-intrusive UQ analysis. Smart proxies for CFD models are developed using the pattern recognition capabilities of Artificial Intelligence (AI) and Data Mining (DM) technologies. Several CFD simulation runs with different inlet air velocities for a rectangular fluidized bed are used to create a smart CFD proxy that is capable of replicating the CFD results for the entire geometry and inlet velocity range. The smart CFD proxy is validated with blind CFD runs (CFD runs that have not played any role during the development of the smart CFD proxy). The developed and validated smart CFD proxy generates its results in seconds with reasonable error (less than 10%). Upon completion of this project, UQ studies that rely on hundreds or thousands of smart CFD proxy runs can be accomplished in minutes. The following figure demonstrates a validation example (blind CFD run) showing the results from the MFiX simulation and the smart CFD proxy for the pressure distribution across a fluidized bed at a given time-step (the layer number corresponds to the vertical location in the bed).
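
    A minimal sketch of the smart-proxy idea at the cell level, using a synthetic pressure field in place of real MFiX output and an off-the-shelf regressor (scikit-learn assumed available); the project's actual feature set and training procedure are not described in this abstract.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(3)

    # Stand-in "CFD" data set: pressure at each cell as a function of inlet
    # velocity and normalized bed height.  The field below is invented.
    n_samples = 5000
    inlet_v = rng.uniform(0.5, 3.0, n_samples)       # m/s
    height = rng.uniform(0.0, 1.0, n_samples)        # normalized bed height
    pressure = 1000.0 * (1 - height) + 50.0 * inlet_v + 20.0 * np.sin(6 * height) * inlet_v
    pressure += rng.normal(0.0, 5.0, n_samples)      # "numerical noise"

    X = np.column_stack([inlet_v, height])
    X_train, X_test, y_train, y_test = train_test_split(X, pressure, random_state=0)

    # Cell-level proxy: a small regressor that replicates the field in milliseconds.
    proxy = RandomForestRegressor(n_estimators=100, random_state=0)
    proxy.fit(X_train, y_train)

    rel_err = np.abs(proxy.predict(X_test) - y_test) / np.abs(y_test)
    print(f"median relative error on blind samples: {100 * np.median(rel_err):.1f}%")
    ```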

  17. MATSIM -The Development and Validation of a Numerical Voxel Model based on the MATROSHKA Phantom

    NASA Astrophysics Data System (ADS)

    Beck, Peter; Rollet, Sofia; Berger, Thomas; Bergmann, Robert; Hajek, Michael; Latocha, Marcin; Vana, Norbert; Zechner, Andrea; Reitz, Guenther

    The AIT Austrian Institute of Technology coordinates the project MATSIM (MATROSHKA Simulation) in collaboration with the Vienna University of Technology and the German Aerospace Center. The aim of the project is to develop a voxel-based model of the MATROSHKA anthropomorphic torso used at the International Space Station (ISS) as a foundation for performing Monte Carlo high-energy particle transport simulations for different irradiation conditions. Funded by the Austrian Space Applications Programme (ASAP), MATSIM is a co-investigation with the European Space Agency (ESA) ELIPS project MATROSHKA, an international collaboration of more than 18 research institutes and space agencies from all over the world, under the science and project lead of the German Aerospace Center. The MATROSHKA facility is designed to determine the radiation exposure of an astronaut onboard ISS and especially during an extravehicular activity. The numerical model developed in the frame of MATSIM is validated by reference measurements. In this report we give an overview of the model development and compare photon and neutron irradiations of the detector-equipped phantom torso with Monte Carlo simulations using FLUKA. Exposure to Co-60 photons was realized in the standard irradiation laboratory at Seibersdorf, while investigations with neutrons were performed at the thermal column of the Vienna TRIGA Mark-II reactor. The phantom was loaded with passive thermoluminescence dosimeters. In addition, first results of the calculated dose distribution within the torso are presented for a simulated exposure in low-Earth orbit.

  18. Fission Surface Power Technology Development Update

    NASA Technical Reports Server (NTRS)

    Palac, Donald T.; Mason, Lee S.; Houts, Michael G.; Harlow, Scott

    2011-01-01

    Power is a critical consideration in planning exploration of the surfaces of the Moon, Mars, and places beyond. Nuclear power is an important option, especially for locations in the solar system where sunlight is limited or environmental conditions are challenging (e.g., extreme cold, dust storms). NASA and the Department of Energy are maintaining the option for fission surface power for the Moon and Mars by developing and demonstrating technology for a fission surface power system. The Fission Surface Power Systems project has focused on subscale component and subsystem demonstrations to address the feasibility of a low-risk, low-cost approach to space nuclear power for surface missions. Laboratory demonstrations of the liquid metal pump, reactor control drum drive, power conversion, heat rejection, and power management and distribution technologies have validated that the fundamental characteristics and performance of these components and subsystems are consistent with a Fission Surface Power preliminary reference concept. In addition, subscale versions of a non-nuclear reactor simulator, using electric resistance heating in place of the reactor fuel, have been built and operated with liquid metal sodium-potassium and helium/xenon gas heat transfer loops, demonstrating the viability of establishing system-level performance and characteristics of fission surface power technologies without requiring a nuclear reactor. While some component and subsystem testing will continue through 2011 and beyond, the results to date provide sufficient confidence to proceed with system level technology readiness demonstration. To demonstrate the system level readiness of fission surface power in an operationally relevant environment (the primary goal of the Fission Surface Power Systems project), a full scale, 1/4 power Technology Demonstration Unit (TDU) is under development. The TDU will consist of a non-nuclear reactor simulator, a sodium-potassium heat transfer loop, a power conversion unit with electrical controls, and a heat rejection system with a multi-panel radiator assembly. Testing is planned at the Glenn Research Center Vacuum Facility 6 starting in 2012, with vacuum and liquid-nitrogen cold walls to provide simulation of operationally relevant environments. A nominal two-year test campaign is planned including a Phase 1 reactor simulator and power conversion test followed by a Phase 2 integrated system test with radiator panel heat rejection. The testing is expected to demonstrate the readiness and availability of fission surface power as a viable power system option for NASA's exploration needs. In addition to surface power, technology development work within this project is also directly applicable to in-space fission power and propulsion systems.

  19. Towards end to end technology modeling: Carbon nanotube and thermoelectric devices

    NASA Astrophysics Data System (ADS)

    Salamat, Shuaib

    The goal of this work is to demonstrate the feasibility of end-to-end ("atoms to applications") technology modeling. Two different technologies were selected to drive this work. The first technology is carbon nanotube field-effect transistors (CNTFETs), and the goal is to model device level variability and identify the origin of variations in these devices. Recently, there has been significant progress in understanding the physics of carbon nanotube electronic devices and in identifying their potential applications. For nanotubes, the carrier mobility is high, so low bias transport across several hundred nanometers is nearly ballistic, and the deposition of high-k gate dielectrics does not degrade the carrier mobility. The conduction and valence bands are symmetric (useful for complementary applications) and the bandstructure is direct (enables optical emission). Because of these striking features, carbon nanotubes (CNTs) have received much attention. Carbon nanotube field-effect transistors (CNTFETs) are one of the main potential candidates for large-area electronics. In this research, systematic simulation approaches are applied to understand the intrinsic performance variability in CNTFETs. It is shown that control over the diameter distribution is a critically important process parameter for attaining high performance transistors and circuits with characteristics rivaling those of state-of-the-art Si technology. The second technology driver concerns the development of a multi-scale framework for thermoelectric device design. An essential step in the development of new materials and devices for thermoelectrics is to develop accurate, efficient, and realistic models. The ready availability of user friendly ab-initio codes and the ever-increasing computing power have made band structure calculations routine. Thermoelectric device design, however, is still largely done at the effective mass level. Tools that allow device designers to make use of sophisticated electronic structure and phonon dispersion calculations are needed. We have developed a proof-of-concept, integrated, multi-scale design framework for TE technology. Beginning from full electronic and phonon dispersions, the Landauer approach is used to evaluate the temperature-dependent thermoelectric transport parameters needed for device simulation. A comprehensive SPICE-based model for electro-thermal transport has also been developed to serve as a bridge between the materials and device level descriptions and the system level simulations. This prototype framework has been used to design a thermoelectric cooler for managing hot spots in integrated circuit chips. In addition, as a byproduct of this research, a suite of educational and simulation resources has been developed and deployed on the nanoHUB.org science gateway to serve as a resource for the TE community.
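
    The Landauer evaluation of thermoelectric transport parameters mentioned above can be sketched as a set of energy integrals over a transmission (modes) function weighted by the Fermi window; the parabolic-band transmission and numbers below are illustrative, not results of the thesis.

    ```python
    import numpy as np

    kB, q = 8.617e-5, 1.0                      # Boltzmann constant (eV/K), charge in eV/V units

    def fermi_window(E, mu, T):
        """-df/dE for the Fermi-Dirac distribution."""
        x = (E - mu) / (kB * T)
        return np.exp(x) / (kB * T * (1.0 + np.exp(x)) ** 2)

    def landauer_coefficients(E, Tbar, mu, T):
        """Conductance-like factor I0 and Seebeck coefficient S from a
        transmission ("number of modes") function Tbar(E)."""
        w = fermi_window(E, mu, T) * Tbar
        I0 = np.trapz(w, E)
        I1 = np.trapz(w * (E - mu), E)
        S = -I1 / (q * T * I0)                 # Seebeck coefficient (V/K)
        return I0, S

    # Illustrative transmission: linear in energy above a band edge at 0 eV.
    E = np.linspace(-0.3, 1.0, 4000)
    Tbar = np.where(E > 0.0, E, 0.0)           # arbitrary units

    for mu in (-0.1, 0.0, 0.1):                # chemical potential relative to band edge (eV)
        _, S = landauer_coefficients(E, Tbar, mu, T=300.0)
        print(f"mu = {mu:+.2f} eV   S = {S * 1e6:8.1f} uV/K")
    ```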

  20. Intelligent distributed medical image management

    NASA Astrophysics Data System (ADS)

    Garcia, Hong-Mei C.; Yun, David Y.

    1995-05-01

    The rapid advancements in high performance global communication have accelerated cooperative image-based medical services to a new frontier. Traditional image-based medical services such as radiology and diagnostic consultation can now fully utilize multimedia technologies in order to provide novel services, including remote cooperative medical triage, distributed virtual simulation of operations, as well as cross-country collaborative medical research and training. Fast (efficient) and easy (flexible) retrieval of relevant images remains a critical requirement for the provision of remote medical services. This paper describes the database system requirements, identifies technological building blocks for meeting the requirements, and presents a system architecture for our target image database system, MISSION-DBS, which has been designed to fulfill the goals of Project MISSION (medical imaging support via satellite integrated optical network) -- an experimental high performance gigabit satellite communication network with access to remote supercomputing power, medical image databases, and 3D visualization capabilities in addition to medical expertise anywhere and anytime around the country. The MISSION-DBS design employs a synergistic fusion of techniques in distributed databases (DDB) and artificial intelligence (AI) for storing, migrating, accessing, and exploring images. The efficient storage and retrieval of voluminous image information is achieved by integrating DDB modeling and AI techniques for image processing while the flexible retrieval mechanisms are accomplished by combining attribute- based and content-based retrievals.

  1. Studies of the 3D surface roughness height

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Avisane, Anita; Rudzitis, Janis; Kumermanis, Maris

    2013-12-16

    Nowadays nano-coatings occupy an increasingly significant place in technology. Innovative, functional coatings acquire new aspects from the point of view of modern technologies, considering the aggregate of physical properties that can be achieved by manipulating, in the production process, the properties of the coatings' surfaces on the micro- and nano-level. Nano-coatings are applied on machine parts, friction surfaces, contacting parts, corrosion surfaces, transparent conducting films (TCF), etc. The equipment available at present for the production of transparent conducting oxide (TCO) coatings of the highest quality is based on expensive indium tin oxide (ITO) material; therefore cheaper alternatives are being sought. One such alternative is zinc oxide (ZnO) nano-coatings. Evaluating the TCF physical and mechanical properties, and in view of the new ISO standard (EN ISO 25178) on the introduction of surface texture (3D surface roughness) into engineering calculations, it is necessary to examine the height of the 3D surface roughness, which is one of the most significant roughness parameters. The present paper studies the average values of the 3D surface roughness height under the most often applied distribution laws: the normal distribution and the Rayleigh distribution. The 3D surface is simulated by a normal random field.
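
    A minimal sketch of the simulation approach named in the last sentence: a normal (Gaussian) random field is generated by spectral filtering of white noise and the EN ISO 25178 height parameters Sa and Sq are computed from it. Grid size, correlation length, and target Sq are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Simulate an areal (3D) surface as a normal random field: white noise
    # smoothed by a Gaussian spectral filter, then scaled to a chosen Sq.
    n, corr_len, target_sq = 512, 8.0, 5.0          # grid points, px, nm (illustrative)
    noise = rng.normal(size=(n, n))
    k = np.fft.fftfreq(n)
    KX, KY = np.meshgrid(k, k)
    kernel = np.exp(-2 * (np.pi * corr_len) ** 2 * (KX ** 2 + KY ** 2))
    surface = np.real(np.fft.ifft2(np.fft.fft2(noise) * kernel))
    surface *= target_sq / surface.std()

    # Areal height parameters from EN ISO 25178: Sa (mean absolute) and Sq (RMS).
    z = surface - surface.mean()
    Sa, Sq = np.abs(z).mean(), np.sqrt((z ** 2).mean())
    print(f"Sa = {Sa:.2f} nm   Sq = {Sq:.2f} nm   Sq/Sa = {Sq / Sa:.3f}")
    # For a Gaussian height distribution, Sq/Sa approaches sqrt(pi/2) ~ 1.25.
    ```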

  2. Single-intensity-recording optical encryption technique based on phase retrieval algorithm and QR code

    NASA Astrophysics Data System (ADS)

    Wang, Zhi-peng; Zhang, Shuai; Liu, Hong-zhao; Qin, Yi

    2014-12-01

    Based on a phase retrieval algorithm and the QR code, a new optical encryption technology that only needs to record one intensity distribution is proposed. In this encryption process, firstly, the QR code is generated from the information to be encrypted; the generated QR code is then placed in the input plane of a 4-f system for double random phase encryption. Because only one intensity distribution in the output plane is recorded as the ciphertext, the encryption process is greatly simplified. In the decryption process, the corresponding QR code is retrieved using a phase retrieval algorithm. A priori information about the QR code is used as a support constraint in the input plane, which helps solve the stagnation problem. The original information can be recovered without distortion by scanning the QR code. The encryption process can be implemented either optically or digitally, and the decryption process uses a digital method. In addition, the security of the proposed optical encryption technology is analyzed. Theoretical analysis and computer simulations show that this optical encryption system is invulnerable to various attacks and suitable for harsh transmission conditions.
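
    The double random phase encoding step can be sketched with two FFTs; the code below uses a random binary image as a stand-in for a real QR symbol and verifies the pipeline with both keys available, whereas the paper recovers the input from the recorded intensity alone via iterative phase retrieval with the QR code as a support constraint.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Stand-in for the QR code: a random binary image (a real implementation
    # would generate an actual QR symbol from the plaintext with a QR library).
    N = 64
    qr = rng.integers(0, 2, size=(N, N)).astype(float)

    # Double random phase encoding in a 4-f architecture: one random phase mask
    # in the input plane and one in the Fourier plane.
    mask1 = np.exp(1j * 2 * np.pi * rng.random((N, N)))
    mask2 = np.exp(1j * 2 * np.pi * rng.random((N, N)))

    field = np.fft.ifft2(np.fft.fft2(qr * mask1) * mask2)   # output-plane complex field
    ciphertext = np.abs(field) ** 2                          # only this intensity is recorded

    # Sanity check: with the full complex field and both keys the input is
    # recovered exactly; the paper instead works from `ciphertext` alone.
    recovered = np.abs(np.fft.ifft2(np.fft.fft2(field) * np.conj(mask2)))
    print("binary error rate with full keys:", np.mean((recovered > 0.5) != (qr > 0.5)))
    ```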

  3. Distributed cooperating processes in a mobile robot control system

    NASA Technical Reports Server (NTRS)

    Skillman, Thomas L., Jr.

    1988-01-01

    A mobile inspection robot has been proposed for the NASA Space Station. It will be a free flying autonomous vehicle that will leave a berthing unit to accomplish a variety of inspection tasks around the Space Station, and then return to its berth to recharge, refuel, and transfer information. The Flying Eye robot will receive voice communication to change its attitude, move at a constant velocity, and move to a predefined location along a self generated path. This mobile robot control system requires integration of traditional command and control techniques with a number of AI technologies. Speech recognition, natural language understanding, task and path planning, sensory abstraction and pattern recognition are all required for successful implementation. The interface between the traditional numeric control techniques and the symbolic processing to the AI technologies must be developed, and a distributed computing approach will be needed to meet the real time computing requirements. To study the integration of the elements of this project, a novel mobile robot control architecture and simulation based on the blackboard architecture was developed. The control system operation and structure is discussed.

  4. A study using a Monte Carlo method of the optimal configuration of a distribution network in terms of power loss sensing.

    PubMed

    Moon, Hyun Ho; Lee, Jong Joo; Choi, Sang Yule; Cha, Jae Sang; Kang, Jang Mook; Kim, Jong Tae; Shin, Myong Chul

    2011-01-01

    Recently there have been many studies of power systems with a focus on "New and Renewable Energy" as part of the "New Growth Engine Industry" promoted by the Korean government. New and renewable energy, especially wind energy, solar energy, and fuel cells that will replace conventional fossil fuels, is part of the Power-IT sector, which is the basis of the SmartGrid. A SmartGrid is a highly efficient, intelligent electricity network that allows interactivity (two-way communication) between suppliers and consumers by utilizing information technology in electricity production, transmission, distribution, and consumption. The New and Renewable Energy Program has been driven, through intensive studies by public and private institutions, with the goal of developing and spreading new and renewable energy systems which, unlike conventional systems, are operated through connections with various kinds of distributed power generation systems. Considerable research on smart grids has been pursued in the United States and Europe. In the United States, a variety of research activities on the smart power grid have been conducted within EPRI's IntelliGrid research program. The European Union (EU), which represents Europe's Smart Grid policy, has focused on the expansion of distributed (decentralized) generation and power trade between countries with improved environmental protection. Thus, there is a current need for studies that assess the economic efficiency of such distributed generation systems. In this paper, based on the cost of distributed power generation capacity, the best obtainable profits were calculated by a Monte Carlo simulation. Monte Carlo simulations, which rely on repeated random sampling to compute their results, take into account the cost of electricity production, daily loads, and the cost of sales, and generate results faster than closed-form mathematical computation. In addition, we suggest an optimal design that considers the distribution loss associated with power distribution systems, with a focus on the sensing aspect, and distributed power generation.
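
    A hedged sketch of that kind of calculation (the paper's actual cost model is not reproduced here): daily energy sold, sale price, and generation cost are drawn repeatedly from assumed distributions, and a profit distribution is accumulated. Every number below is illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)
n_trials = 100_000

# Illustrative assumptions (not from the paper): daily energy sold, price, and
# generation cost are uncertain, so they are drawn from simple distributions.
energy_mwh = rng.normal(loc=24.0, scale=4.0, size=n_trials).clip(min=0)
price_per_mwh = rng.normal(loc=85.0, scale=10.0, size=n_trials)
cost_per_mwh = rng.normal(loc=60.0, scale=5.0, size=n_trials)
fixed_daily_cost = 300.0

profit = energy_mwh * (price_per_mwh - cost_per_mwh) - fixed_daily_cost

print(f"expected daily profit: {profit.mean():.1f}")
print(f"5th/95th percentiles : {np.percentile(profit, [5, 95])}")
```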

  5. Earth System Grid II (ESG): Turning Climate Model Datasets Into Community Resources

    NASA Astrophysics Data System (ADS)

    Williams, D.; Middleton, D.; Foster, I.; Nevedova, V.; Kesselman, C.; Chervenak, A.; Bharathi, S.; Drach, B.; Cinquni, L.; Brown, D.; Strand, G.; Fox, P.; Garcia, J.; Bernholdte, D.; Chanchio, K.; Pouchard, L.; Chen, M.; Shoshani, A.; Sim, A.

    2003-12-01

    High-resolution, long-duration simulations performed with advanced DOE SciDAC/NCAR climate models will produce tens of petabytes of output. To be useful, this output must be made available to global change impacts researchers nationwide, both at national laboratories and at universities, other research laboratories, and other institutions. To this end, we propose to create a new Earth System Grid, ESG-II - a virtual collaborative environment that links distributed centers, users, models, and data. ESG-II will provide scientists with virtual proximity to the distributed data and resources that they require to perform their research. The creation of this environment will significantly increase the scientific productivity of U.S. climate researchers by turning climate datasets into community resources. In creating ESG-II, we will integrate and extend a range of Grid and collaboratory technologies, including the DODS remote access protocols for environmental data, Globus Toolkit technologies for authentication, resource discovery, and resource access, and Data Grid technologies developed in other projects. We will develop new technologies for (1) creating and operating "filtering servers" capable of performing sophisticated analyses, and (2) delivering results to users. In so doing, we will simultaneously contribute to climate science and advance the state of the art in collaboratory technology. We expect our results to be useful to numerous other DOE projects. The three-year R&D program will be undertaken by a talented and experienced team of computer scientists at five laboratories (ANL, LBNL, LLNL, NCAR, ORNL) and one university (ISI), working in close collaboration with climate scientists at several sites.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dana L. Kelly

    Typical engineering systems in applications with high failure consequences such as nuclear reactor plants often employ redundancy and diversity of equipment in an effort to lower the probability of failure and therefore risk. However, it has long been recognized that dependencies exist in these redundant and diverse systems. Some dependencies, such as common sources of electrical power, are typically captured in the logic structure of the risk model. Others, usually referred to as intercomponent dependencies, are treated implicitly by introducing one or more statistical parameters into the model. Such common-cause failure models have limitations in a simulation environment. In addition, substantial subjectivity is associated with parameter estimation for these models. This paper describes an approach in which system performance is simulated by drawing samples from the joint distributions of dependent variables. The approach relies on the notion of a copula distribution, a notion which has been employed by the actuarial community for ten years or more, but which has seen only limited application in technological risk assessment. The paper also illustrates how equipment failure data can be used in a Bayesian framework to estimate the parameter values in the copula model. This approach avoids much of the subjectivity required to estimate parameters in traditional common-cause failure models. Simulation examples are presented for failures in time. The open-source software package R is used to perform the simulations. The open-source software package WinBUGS is used to perform the Bayesian inference via Markov chain Monte Carlo sampling.
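
    For illustration only (the paper uses R and WinBUGS; this is a Python sketch with assumed parameters): dependent failure times for two redundant components are drawn through a Gaussian copula with Weibull margins, and the probability that both fail within a mission time is estimated. The Bayesian parameter estimation step is not shown.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_samples = 50_000

# Gaussian copula: correlated standard normals -> uniforms -> Weibull margins.
rho = 0.6                                   # illustrative inter-component dependence
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n_samples)
u = stats.norm.cdf(z)                       # copula samples in [0, 1]^2

# Marginal failure-time distributions for two redundant components (assumed).
t1 = stats.weibull_min.ppf(u[:, 0], c=1.5, scale=1000.0)   # hours
t2 = stats.weibull_min.ppf(u[:, 1], c=1.5, scale=1000.0)

mission = 500.0                              # mission time, hours
p_both_fail = np.mean((t1 < mission) & (t2 < mission))
print(f"P(both components fail within {mission:.0f} h) = {p_both_fail:.4f}")
```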

  7. Assessment of Clinical Skills Using Simulator Technologies

    ERIC Educational Resources Information Center

    Srinivasan, Malathi; Hwang, Judith C.; West, Daniel; Yellowlees, Peter M.

    2006-01-01

    Objective: Simulation technologies are used to assess and teach competencies through the provision of reproducible stimuli. They have exceptional utility in assessing responses to clinical stimuli that occur sporadically or infrequently. In this article, the authors describe the utility of emerging simulation technologies, and discuss critical…

  8. An Object Oriented Extensible Architecture for Affordable Aerospace Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.

    2003-01-01

    Driven by a need to explore and develop propulsion systems that exceeded current computing capabilities, NASA Glenn embarked on a novel strategy leading to the development of an architecture that enables propulsion simulations never thought possible before. Full engine 3 Dimensional Computational Fluid Dynamic propulsion system simulations were deemed impossible due to the impracticality of the hardware and software computing systems required. However, with a software paradigm shift and an embracing of parallel and distributed processing, an architecture was designed to meet the needs of future propulsion system modeling. The author suggests that the architecture designed at the NASA Glenn Research Center for propulsion system modeling has potential for impacting the direction of development of affordable weapons systems currently under consideration by the Applied Vehicle Technology Panel (AVT).

  9. 1H line width dependence on MAS speed in solid state NMR - Comparison of experiment and simulation

    NASA Astrophysics Data System (ADS)

    Sternberg, Ulrich; Witter, Raiker; Kuprov, Ilya; Lamley, Jonathan M.; Oss, Andres; Lewandowski, Józef R.; Samoson, Ago

    2018-06-01

    Recent developments in magic angle spinning (MAS) technology permit spinning frequencies of ≥100 kHz. We examine the effect of such fast MAS rates upon nuclear magnetic resonance proton line widths in the multi-spin system of a β-Asp-Ala crystal. We perform powder pattern simulations employing the Fokker-Planck approach with periodic boundary conditions and 1H chemical shift tensors calculated using bond polarization theory. The theoretical predictions mirror the experimental results well. Both approaches demonstrate that homogeneous broadening has a linear-quadratic dependence on the inverse of the MAS spinning frequency and that, at the faster end of the spinning frequencies, the residual spectral line broadening becomes dominated by chemical shift distributions and susceptibility effects even for crystalline systems.
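
    As a concrete reading of that statement, the sketch below fits a linear-quadratic model in the inverse spinning frequency, Δν = a/ν_r + b/ν_r² + c, to synthetic line-width data; the frequencies and widths are invented for illustration and are not the paper's measurements.

```python
import numpy as np

# Illustrative MAS frequencies (kHz) and proton line widths (Hz); not measured data.
nu_mas = np.array([40.0, 60.0, 80.0, 100.0, 120.0, 150.0])
linewidth = np.array([520.0, 330.0, 245.0, 200.0, 172.0, 150.0])

# Linear-quadratic model in the inverse spinning frequency:
#   linewidth = a / nu + b / nu**2 + c
x = 1.0 / nu_mas
A = np.column_stack([x, x**2, np.ones_like(x)])
(a, b, c), *_ = np.linalg.lstsq(A, linewidth, rcond=None)

print(f"a = {a:.1f} Hz*kHz, b = {b:.1f} Hz*kHz^2, residual offset c = {c:.1f} Hz")
```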

  10. Large-Eddy Simulation of the Base Flow of a Cylindrical Space Vehicle Configuration

    NASA Astrophysics Data System (ADS)

    Meiß, J.-H.; Schröder, W.

    2009-01-01

    A large-eddy simulation (LES) is performed to investigate the high Reynolds number base flow of an axisymmetric rocket-like configuration with an underexpanded nozzle flow. The subsonic base region of low pressure is characterized and bounded by the interaction of the Mach 5.3 freestream and the wide plume of the hot Mach 3.8 exhaust jet. An analysis of the base flow shows that the system of base-area vortices determines the highly time-dependent pressure distribution and causes an upstream convection of hot exhaust gas. A comparison of the results with experiments conducted at the German Aerospace Center (DLR) Cologne shows good agreement. The investigation is part of the German RESPACE Program, which focuses on Key Technologies for Reusable Space Systems.

  11. Electro-thermo-mechanical coupling analysis of deep drawing with resistance heating for aluminum matrix composites sheet

    NASA Astrophysics Data System (ADS)

    Zhang, Kaifeng; Zhang, Tuoda; Wang, Bo

    2013-05-01

    Recently, electro-plastic forming has become a focus of attention in materials hot-processing research, because it is an energy-saving, highly efficient, and green manufacturing technology. An electro-thermo-mechanical model can be adopted to carry out a sequential simulation of aluminum matrix composite sheet deep drawing via electro-thermal coupling and thermo-mechanical coupling methods. The first step of the process is resistance heating of the sheet; the power is then turned off, and the second step is deep drawing. The temperature distribution of the SiCp/2024Al composite sheet during resistance heating and the deformation during sheet deep drawing were analyzed. In the simulation, the effects of contact resistances and of the temperature coefficient of resistance of the electrode material and the SiCp/2024Al composite on the temperature distribution were considered together. The simulation results demonstrate that the SiCp/2024Al composite sheet can be rapidly heated to 400 °C in 30 s using resistance heating and that the sheet temperature can be controlled by adjusting the current density. The physical properties of the electrode materials significantly affect the composite sheet temperature distribution. The temperature difference between the center and the edge of the sheet is proportional to the thermal conductivity of the electrode, principally because heat transfers from the sheet to the electrode. A SiCp/2024Al thin-walled part can be manufactured intact at a strain rate of 0.08 s^-1 with the sheet thickness thinning rate limited to within 20%, which corresponds well with the experimental result.
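
    A back-of-envelope check of the heating step (not from the paper): with nominal, assumed material properties for a SiCp/2024Al sheet, a lumped adiabatic Joule-heating estimate shows how the current density sets the heating rate; heat loss to the electrodes and dies is ignored. The roughly 430 K rise it gives for 30 s of heating is of the same order as the reported heating to about 400 °C.

```python
# Lumped-parameter estimate of resistance (Joule) heating of a composite sheet.
# All values are nominal assumptions, not taken from the paper.
rho_e = 6.0e-8        # electrical resistivity of SiCp/2024Al, ohm*m (assumed)
rho_m = 2.9e3         # density, kg/m^3 (assumed)
cp = 900.0            # specific heat, J/(kg*K) (assumed)
j = 25e6              # current density, A/m^2 (the controllable quantity)
t_heat = 30.0         # heating time, s

# Volumetric heating power p = rho_e * j^2 (W/m^3); adiabatic temperature rise:
delta_T = rho_e * j**2 / (rho_m * cp) * t_heat
print(f"adiabatic temperature rise after {t_heat:.0f} s: {delta_T:.0f} K")
```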

  12. Power flow analysis and optimal locations of resistive type superconducting fault current limiters.

    PubMed

    Zhang, Xiuchang; Ruiz, Harold S; Geng, Jianzhao; Shen, Boyang; Fu, Lin; Zhang, Heng; Coombs, Tim A

    2016-01-01

    In conventional approaches to integrating resistive-type superconducting fault current limiters (SFCLs) into electric distribution networks, SFCL models largely rely on the insertion of a step or exponential resistance determined by a predefined quenching time. In this paper, we expand the scope of these models by considering the actual behaviour of an SFCL in terms of the temperature-dependent power-law relation between the electric field and the current density that is characteristic of high-temperature superconductors. Our results are compared to the step-resistance models for the sake of discussion and clarity of the conclusions. Both SFCL models were integrated into a power system model built on the UK power standard to study the impact of these protection strategies on the performance of the overall electricity network. As a representative renewable energy source, a 90 MVA wind farm was considered for the simulations. Three fault conditions were simulated, and the fault current reductions predicted by both fault-current-limiting models were compared across multiple current measuring points and allocation strategies. We show that incorporating the E-J characteristics and thermal properties of the superconductor at the simulation level of electric power systems is crucial for estimating reliability and determining the optimal locations of resistive-type SFCLs in distributed power networks. Our results may help decision making by distribution network operators regarding investment in and promotion of SFCL technologies, as it is possible to determine the maximum number of SFCLs necessary to protect against different fault conditions at multiple locations.
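
    A simplified sketch of the distinction the abstract draws (not the authors' model): a single-loop fault circuit is integrated with an SFCL element whose voltage follows the power law E = E0 (J/Jc)^n, and the limited peak is compared with the prospective fault current. All circuit and superconductor parameters are illustrative, and the thermal dynamics of the quench are omitted.

```python
import numpy as np

# Illustrative single-loop fault circuit with a resistive SFCL element.
V_peak, omega = 11e3 * np.sqrt(2) / np.sqrt(3), 2 * np.pi * 50   # 11 kV (L-L) system, 50 Hz
R_line, L_line = 0.5, 5e-3                                        # assumed source impedance

# Superconductor E-J power law: E = E0 * (J / Jc)**n (index n, critical density Jc).
E0, n, Jc = 1e-4, 25, 1e7          # V/m, -, A/m^2 (illustrative values)
length, area = 50.0, 1e-4          # conductor length (m) and cross-section (m^2)

def sfcl_voltage(i):
    """Voltage across the power-law SFCL element for instantaneous current i."""
    J = abs(i) / area
    return np.sign(i) * E0 * (J / Jc) ** n * length

# Explicit-Euler integration of L di/dt = v(t) - R*i - v_sfcl(i) during a bolted fault.
dt, t_end = 1e-6, 0.04
t = np.arange(0.0, t_end, dt)
i = np.zeros_like(t)
for k in range(len(t) - 1):
    v = V_peak * np.sin(omega * t[k])
    di = (v - R_line * i[k] - sfcl_voltage(i[k])) / L_line
    i[k + 1] = i[k] + dt * di

print(f"prospective peak ~ {V_peak / np.hypot(R_line, omega * L_line):.0f} A, "
      f"limited peak = {np.abs(i).max():.0f} A")
```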

  13. SU-E-I-88: Realistic Pathological Simulations of the NCAT and Zubal Anthropomorphic Models, Based on Clinical PET/CT Data.

    PubMed

    Papadimitroulas, P; Loudos, G; Le Maitre, A; Efthimiou, N; Visvikis, D; Nikiforidis, G; Kagadis, G C

    2012-06-01

    In the present study a patient-specific dataset of realistic PET simulations was created, taking into account the variability of clinical oncology data. Tumor variability was tested in the simulated results. The produced simulated data were compared to clinical PET/CT data for validation and evaluation of the procedure. Clinical PET/CT data of oncology patients were used as the basis of the simulated variability, inserting patient-specific characteristics into the NCAT and Zubal anthropomorphic phantoms. The GATE Monte Carlo toolkit was used to simulate a commercial PET scanner. The standard computational anthropomorphic phantoms were adapted to the CT data (organ shapes) using a fitting algorithm. The activity map was derived from the PET images. Patient tumors were segmented and inserted into the phantom using different activity distributions. The produced simulated data were reconstructed using the STIR open-source software and compared to the original clinical data. The accuracy of the procedure was tested in four different oncology cases. Each pathological situation was illustrated by simulating (a) a healthy body, (b) insertion of the clinical tumor with homogeneous activity, and (c) insertion of the clinical tumor with variable activity (voxel by voxel) based on the clinical PET data. The accuracy of the presented dataset was compared to the original PET/CT data. Partial volume correction (PVC) was also applied to the simulated data. In this study patient-specific characteristics were used in computational anthropomorphic models to simulate realistic pathological patients. A voxel-by-voxel activity distribution with PVC within the tumor gives the most accurate results. Radiotherapy applications can utilize the benefits of accurate, realistic imaging simulations using the anatomical and biological information of each patient. Further work will incorporate the development of analytical anthropomorphic models with motion and cardiac correction, combined with pathological patients, to achieve high accuracy in tumor imaging. This research was supported by the Joint Research and Technology Program between Greece and France, 2009-2011 (protocol ID: 09FR103). © 2012 American Association of Physicists in Medicine.

  14. 75 FR 35689 - System Personnel Training Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-23

    ... using realistic simulations. ... In Order No. ... development process to: (1) include formal training requirements for reliability coordinators similar to those ... simulation technology such as a simulator, virtual technology, or other technology in their emergency ...

  15. Proceedings of the Spacecraft Charging Technology Conference: Executive Summary

    NASA Technical Reports Server (NTRS)

    Pike, C. P.; Whipple, E. C., Jr.; Stevens, N. J.; Minges, M. L.; Lehn, W. L.; Bunn, M. H.

    1977-01-01

    Aerospace environments are reviewed in reference to spacecraft charging. Modelling, a theoretical scheme which can be used to describe the structure of the sheath around the spacecraft and to calculate the charging currents within, is discussed. Materials characterization is considered for experimental determination of the behavior of typical spacecraft materials when exposed to simulated geomagnetic substorm conditions. Materials development is also examined for controlling and minimizing spacecraft charging or at least for distributing the charge in an equipotential manner, using electrical conductive surfaces for materials exposed to space environment.

  16. Capability and Interface Assessment of Gaming Technologies for Future Multi-Unmanned Air Vehicle Systems

    DTIC Science & Technology

    2011-08-01

    ... resource management games (e.g., Sim City 2000), board game simulations (e.g., VASSAL), and abstract games (e.g., Tetris). The second purpose of the ... [excerpted lists of game genres and input devices:] games in which ... occur simultaneously (e.g., Starcraft); board game: a computer game that emulates a board game (e.g., Archon); 2D side view: a game ...; input devices: a mouse; a joypad (e.g., a PlayStation/Xbox controller); an accelerometer (e.g., a Wii controller); touch ...

  17. Distribution of Monochrome Screen Luminance in the CTOL Visual Technology Research Simulator.

    DTIC Science & Technology

    1980-11-01

    [Excerpted screen-luminance values, in foot-lamberts:] runway lines 3.8 / 4.2; carrier runway 2.5 / 3.5; FLOLS* meatball** 2.0 / 2.4; FLOLS background 0.68 / 1.3. ... landing display FOV (foot-lamberts): carrier deck runway lighting 1.5; carrier runway area 0.048; FLOLS meatball 0.6. Figure 5 also shows the ... (*Fresnel Lens Optical Landing System, the standard U.S. Navy carrier optical landing device. **The meatball is the light source of the FLOLS which the pilot uses for glideslope information in a carrier ...)

  18. The State of Simulations: Soft-Skill Simulations Emerge as a Powerful New Form of E-Learning.

    ERIC Educational Resources Information Center

    Aldrich, Clark

    2001-01-01

    Presents responses of leaders from six simulation companies about challenges and opportunities of soft-skills simulations in e-learning. Discussion includes: evaluation metrics; role of subject matter experts in developing simulations; video versus computer graphics; technology needed to run simulations; technology breakthroughs; pricing;…

  19. Evaluation of methods of reducing community noise impact around San Jose municipal airport

    NASA Technical Reports Server (NTRS)

    Glick, J. M.; Shevell, R. S.; Bowles, J. V.

    1975-01-01

    A computer simulation of the airport noise impact on the surrounding communities was used to evaluate alternate operational procedures, improved technology, and land use conversion as methods of reducing community noise impact in the airport vicinity. In addition, a constant density population distribution was analyzed for possible application to other airport communities with fairly uniform population densities and similar aircraft operational patterns. The introduction of sound absorption material (SAM) was found to reduce community noise annoyance by over 25 percent, and the introduction of refan was found to reduce community annoyance by over 60 percent. Replacing the present aircraft was found to reduce the noise problem to very small proportions, and the introduction of an advanced technology twin was found to essentially eliminate the community noise problem.

  20. Dynamic clustering scheme based on the coordination of management and control in multi-layer and multi-region intelligent optical network

    NASA Astrophysics Data System (ADS)

    Niu, Xiaoliang; Yuan, Fen; Huang, Shanguo; Guo, Bingli; Gu, Wanyi

    2011-12-01

    A dynamic clustering scheme based on the coordination of management and control is proposed to reduce the network congestion rate and improve the blocking performance of hierarchical routing in multi-layer and multi-region intelligent optical networks. Its implementation relies on mobile agent (MA) technology, which has the advantages of efficiency, flexibility, functionality, and scalability. The paper's major contribution is to dynamically adjust domains when the performance of the working network is not ideal. The combination of centralized NMS and distributed MA control technology migrates the computing process to control-plane nodes, which relieves the burden on the NMS and improves processing efficiency. Experiments are conducted on the Multi-layer and Multi-region Simulation Platform for Optical Network (MSPON) to assess the performance of the scheme.

  1. Mr. Vetro: A Collective Simulation for Teaching Health Science

    ERIC Educational Resources Information Center

    Ioannidou, Andri; Repenning, Alexander; Webb, David; Keyser, Diane; Luhn, Lisa; Daetwyler, Christof

    2010-01-01

    Why has technology become prevalent in science education without fundamentally improving test scores or student attitudes? We claim that the core of the problem is "how" technology is being used. Technologies such as simulations are currently not used to their full potential. For instance, physiology simulations often follow textbooks by…

  2. Lessons for continuing medical education from simulation research in undergraduate and graduate medical education: effectiveness of continuing medical education: American College of Chest Physicians Evidence-Based Educational Guidelines.

    PubMed

    McGaghie, William C; Siddall, Viva J; Mazmanian, Paul E; Myers, Janet

    2009-03-01

    Simulation technology is widely used in undergraduate and graduate medical education as well as for personnel training and evaluation in other healthcare professions. Simulation provides safe and effective opportunities for learners at all levels to practice and acquire clinical skills needed for patient care. A growing body of research evidence documents the utility of simulation technology for educating healthcare professionals. However, simulation has not been widely endorsed or used for continuing medical education (CME). This article reviews and evaluates evidence from studies on simulation technology in undergraduate and graduate medical education and addresses its implications for CME. The Agency for Healthcare Research and Quality Evidence Report suggests that simulation training is effective, especially for psychomotor and communication skills, but that the strength of the evidence is low. In another review, the Best Evidence Medical Education collaboration supported the use of simulation technology, focusing on high-fidelity medical simulations under specific conditions. Other studies enumerate best practices that include mastery learning, deliberate practice, and recognition and attention to cultural barriers within the medical profession that present obstacles to wider use of this technology. Simulation technology is a powerful tool for the education of physicians and other healthcare professionals at all levels. Its educational effectiveness depends on informed use for trainees, including providing feedback, engaging learners in deliberate practice, integrating simulation into an overall curriculum, as well as on the instruction and competence of faculty in its use. Medical simulation complements, but does not replace, educational activities based on real patient-care experiences.

  3. An AST-ELM Method for Eliminating the Influence of Charging Phenomenon on ECT.

    PubMed

    Wang, Xiaoxin; Hu, Hongli; Jia, Huiqin; Tang, Kaihao

    2017-12-09

    Electrical capacitance tomography (ECT) is a promising technology for imaging permittivity distributions in multiphase flow. To reduce the effect of the charging phenomenon on ECT measurement, an improved extreme learning machine method combined with adaptive soft-thresholding (AST-ELM) is presented and studied for image reconstruction. This method provides a nonlinear mapping model between the capacitance values and the medium distribution by machine learning rather than by an electromagnetic-sensitivity mechanism. Both simulation and experimental tests are carried out to validate the performance of the presented method, and the reconstructed images are evaluated by relative error and correlation coefficient. The results illustrate that the image reconstruction accuracy of the proposed AST-ELM method is greatly improved over that of conventional methods in the presence of a charged object.
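
    A minimal sketch of the ELM part of such a method (the paper's adaptive thresholding rule and training data are not reproduced; the layer sizes, threshold, and ridge term below are assumptions): a random hidden layer maps capacitance vectors to features, a regularized linear readout is fitted, and soft-thresholding sparsifies the readout weights.

```python
import numpy as np

rng = np.random.default_rng(3)

def soft_threshold(x, tau):
    """Element-wise soft-thresholding used to sparsify the output weights."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def train_elm(X, Y, n_hidden=200, tau=1e-3, ridge=1e-3):
    """Basic extreme learning machine: random hidden layer + linear readout.

    A fixed soft-thresholding of the learned output weights stands in for the
    adaptive soft-thresholding (AST) of the paper.
    """
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                                   # hidden activations
    beta = np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ Y)
    return W, b, soft_threshold(beta, tau)

def predict_elm(model, X):
    W, b, beta = model
    return np.tanh(X @ W + b) @ beta

# Toy data: 66 "capacitance" inputs -> 812 "pixel" outputs (sizes are illustrative).
X_train = rng.standard_normal((500, 66))
Y_train = np.tanh(X_train @ rng.standard_normal((66, 812)))
model = train_elm(X_train, Y_train)
print(predict_elm(model, X_train[:2]).shape)
```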

  4. An AST-ELM Method for Eliminating the Influence of Charging Phenomenon on ECT

    PubMed Central

    Wang, Xiaoxin; Hu, Hongli; Jia, Huiqin; Tang, Kaihao

    2017-01-01

    Electrical capacitance tomography (ECT) is a promising technology for imaging permittivity distributions in multiphase flow. To reduce the effect of the charging phenomenon on ECT measurement, an improved extreme learning machine method combined with adaptive soft-thresholding (AST-ELM) is presented and studied for image reconstruction. This method provides a nonlinear mapping model between the capacitance values and the medium distribution by machine learning rather than by an electromagnetic-sensitivity mechanism. Both simulation and experimental tests are carried out to validate the performance of the presented method, and the reconstructed images are evaluated by relative error and correlation coefficient. The results illustrate that the image reconstruction accuracy of the proposed AST-ELM method is greatly improved over that of conventional methods in the presence of a charged object. PMID:29232850

  5. Physics-based simulation of EM and SM in TSV-based 3D IC structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kteyan, Armen; Sukharev, Valeriy; Zschech, Ehrenfried

    2014-06-19

    The evolution of stresses in through-silicon vias (TSVs) and in the TSV landing pad due to the stress migration (SM) and electromigration (EM) phenomena is considered. It is shown that the initial stress distribution existing in a TSV depends on its architecture and copper fill technology. We demonstrate that, with proper copper annealing, the SM-induced redistribution of atoms results in uniform distributions of the hydrostatic stress and the vacancy concentration along each segment. In this case, applied EM stressing generates atom migration whose kinetics depend on the preexisting equilibrium concentration of vacancies. Stress-induced voiding in the TSV is considered, and EM-induced voiding in the TSV landing pad is analyzed in detail.

  6. Surgical simulation: Current practices and future perspectives for technical skills training.

    PubMed

    Bjerrum, Flemming; Thomsen, Ann Sofia Skou; Nayahangan, Leizl Joy; Konge, Lars

    2018-06-17

    Simulation-based training (SBT) has become a standard component of modern surgical education, yet successful implementation of evidence-based training programs remains challenging. In this narrative review, we use Kern's framework for curriculum development to describe where we are now and what lies ahead for SBT within surgery with a focus on technical skills in operative procedures. Despite principles for optimal SBT (proficiency-based, distributed, and deliberate practice) having been identified, massed training with fixed time intervals or a fixed number of repetitions is still being extensively used, and simulators are generally underutilized. SBT should be part of surgical training curricula, including theoretical, technical, and non-technical skills, and be based on relevant needs assessments. Furthermore, training should follow evidence-based theoretical principles for optimal training, and the effect of training needs to be evaluated using relevant outcomes. There is a larger, still unrealized potential of surgical SBT, which may be realized in the near future as simulator technologies evolve, more evidence-based training programs are implemented, and cost-effectiveness and impact on patient safety is clearly demonstrated.

  7. Model of load balancing using reliable algorithm with multi-agent system

    NASA Astrophysics Data System (ADS)

    Afriansyah, M. F.; Somantri, M.; Riyadi, M. A.

    2017-04-01

    Rapid technology development goes hand in hand with the growth of Internet users, which increases network traffic and the load on the system. The use of a reliable algorithm and mobile agents in distributed load balancing is a viable solution to handle the load issue in a large-scale system. A mobile agent collects resource information and can migrate according to a given task. We propose a reliable load-balancing algorithm using least time first byte (LFB) combined with information from the mobile agent. The methodology consisted of defining the system identification, the requirements specification, the network topology, and the design of the system infrastructure. The simulation sent 1800 requests over 10 s from users to the servers and collected the data for analysis. The software simulation was based on Apache JMeter, observing the response time and reliability of each server, and the results were compared with an existing method. The simulation results show that the LFB method with mobile agents can balance the load efficiently across all backend servers without bottlenecks, with a low risk of server overload, and reliably.
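
    A toy sketch of that selection rule (the server names, window size, and synthetic measurements are invented; the paper's agent migration and JMeter setup are not reproduced): each backend's recent time-to-first-byte samples, as an agent might report them, are averaged, and every request is routed to the backend with the lowest average.

```python
import random
from collections import defaultdict

class LFBBalancer:
    """Route each request to the backend with the lowest recent time-to-first-byte.

    In the paper's system the per-server measurements would come from mobile
    agents; here they are simply numbers fed in by the caller.
    """

    def __init__(self, servers, window=20):
        self.servers = list(servers)
        self.window = window
        self.samples = defaultdict(list)          # server -> recent TTFB samples (s)

    def report_ttfb(self, server, ttfb):
        samples = self.samples[server]
        samples.append(ttfb)
        if len(samples) > self.window:
            samples.pop(0)

    def pick(self):
        def score(server):
            samples = self.samples[server]
            return sum(samples) / len(samples) if samples else 0.0
        return min(self.servers, key=score)

# Simple usage example with synthetic measurements: backend-2 is made slower.
balancer = LFBBalancer(["backend-1", "backend-2", "backend-3"])
for _ in range(100):
    server = balancer.pick()
    ttfb = random.uniform(0.05, 0.2) if server == "backend-2" else random.uniform(0.01, 0.05)
    balancer.report_ttfb(server, ttfb)
print("preferred after warm-up:", balancer.pick())
```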

  8. Theoretical Technology Research for the International Solar Terrestrial Physics (ISTP) Program

    NASA Technical Reports Server (NTRS)

    Ashour-Abdalla, Maha; Curtis, Steve (Technical Monitor)

    2002-01-01

    During the last four years the UCLA (University of California, Los Angeles) IGPP (Institute of Geophysics and Planetary Physics) Space Plasma Simulation Group has continued its theoretical effort to develop a Mission Oriented Theory (MOT) for the International Solar Terrestrial Physics (ISTP) program. This effort has been based on a combination of approaches: analytical theory, large-scale kinetic (LSK) calculations, global magnetohydrodynamic (MHD) simulations and self-consistent plasma kinetic (SCK) simulations. These models have been used to formulate a global interpretation of local measurements made by the ISTP spacecraft. The regions of applications of the MOT cover most of the magnetosphere: solar wind, low- and high- latitude magnetospheric boundary, near-Earth and distant magnetotail, and auroral region. Most recent investigations include: plasma processes in the electron foreshock, response of the magnetospheric cusp, particle entry in the magnetosphere, sources of observed distribution functions in the magnetotail, transport of oxygen ions, self-consistent evolution of the magnetotail, substorm studies, effects of explosive reconnection, and auroral acceleration simulations. A complete list of the activities completed under the grant follow.

  9. Ab-initio simulations on adhesion and material transfer between contacting Al and TiN surfaces

    NASA Astrophysics Data System (ADS)

    Feldbauer, Gregor; Wolloch, Michael; Mohn, Peter; Redinger, Josef; Vernes, Andras

    2014-03-01

    Contacts of surfaces at the atomic scale are crucial in many modern applications from analytical techniques like indentation or AFM experiments to technologies such as nano- and micro-electro-mechanical-systems (N-/M-EMS). Furthermore, detailed insights into such contacts are fundamental for a better understanding of tribological processes like wear. A series of simulations is performed within the framework of Density Functional Theory (DFT) to investigate the approaching, contact and subsequent separation of two atomically flat surfaces consisting of different materials. Aluminum (Al) and titanium-nitride (TiN) slabs have been chosen as a model system representing the interaction between a soft and a hard material. The approaching and separation is simulated by moving one slab in discrete steps and allowing for electronic and ionic relaxations after each one. The simulations reveal the influences of different surface orientations ((001), (011), (111)) and alignments of the surfaces with respect to each other on the adhesion, equilibrium distance, charge distribution and material transfer between the surfaces. Material transfer is observed for configurations where the interface is stronger than the softer material.

  10. Today's Business Simulation Industry

    ERIC Educational Resources Information Center

    Summers, Gary J.

    2004-01-01

    New technologies are transforming the business simulation industry. The technologies come from research in computational fields of science, and they endow simulations with new capabilities and qualities. These capabilities and qualities include computerized behavioral simulations, online feedback and coaching, advanced interfaces, learning on…

  11. Simulation Environment Synchronizing Real Equipment for Manufacturing Cell

    NASA Astrophysics Data System (ADS)

    Inukai, Toshihiro; Hibino, Hironori; Fukuda, Yoshiro

    Recently, manufacturing industries have faced various problems such as shorter product life cycles and more diversified customer needs. In this situation, it is very important to reduce the lead time of manufacturing system construction. At the manufacturing system implementation stage, it is important to rapidly create and evaluate facility control programs for a manufacturing cell, such as ladder programs for programmable logic controllers (PLCs). However, methods to evaluate the facility control programs before the manufacturing systems are implemented, while mixing and synchronizing real equipment and virtual factory models on computers, have not been developed. This difficulty is caused by the complexity of manufacturing systems composed of a great variety of equipment, and it has prevented precise and rapid support of the manufacturing engineering process. In this paper, a manufacturing engineering environment (MEE) that supports manufacturing engineering processes using simulation technologies is proposed. MEE consists of a manufacturing cell simulation environment (MCSE) and a distributed simulation environment (DSE). MCSE, which consists of a manufacturing cell simulator and a soft-wiring system, is described in detail. MCSE makes it possible to create and evaluate facility control programs using virtual factory models on computers before the manufacturing systems are implemented.

  12. Simulation of the Impact of New Aircraft and Satellite-Based Ocean Surface Wind Measurements on H*Wind Analyses

    NASA Technical Reports Server (NTRS)

    Miller, TImothy L.; Atlas, R. M.; Black, P. G.; Case, J. L.; Chen, S. S.; Hood, R. E.; Johnson, J. W.; Jones, L.; Ruf, C. S.; Uhlborn, E. W.

    2008-01-01

    Accurate observations of surface ocean vector winds (OVW) with high spatial and temporal resolution are required for understanding and predicting tropical cyclones. As NASA's QuikSCAT and the Navy's WindSat operate beyond their design life, many members of the weather and climate science communities recognize the importance of developing new observational technologies and strategies to meet the essential need for OVW information to improve hurricane intensity and location forecasts. The Hurricane Imaging Radiometer (HIRAD) is an innovative technology development that offers new and unique remotely sensed satellite observations of both extreme oceanic wind events and strong precipitation. It is based on the airborne Stepped Frequency Microwave Radiometer (SFMR), which is the only proven remote sensing technique for observing tropical cyclone (TC) ocean surface wind speeds and rain rates. The proposed HIRAD instrument advances beyond the current nadir-viewing SFMR to an equivalent wide-swath SFMR imager using passive microwave synthetic thinned aperture radiometer (STAR) technology. This sensor will operate over 4-7 GHz (C-band frequencies), where the required TC remote sensing physics has been validated by both the SFMR and WindSat radiometers. The instrument is described in more detail in a paper by Jones et al. presented to the Tropical Meteorology Special Symposium at this AMS Annual Meeting. Simulated HIRAD passes through a simulation of Hurricane Frances are being developed to demonstrate HIRAD estimation of surface wind speed over a wide swath in the presence of heavy rain. These are currently being used in "quick" OSSEs (Observing System Simulation Experiments) with H*Wind analyses as the discriminating tool. The H*Wind analysis, a product of the Hurricane Research Division of NOAA's Atlantic Oceanographic and Meteorological Laboratory, brings together wind measurements from a variety of observation platforms into an objective analysis of the distribution of wind speeds in a tropical cyclone. This product is designed to improve understanding of the extent and strength of the wind field, and to improve the assessment of hurricane intensity. See http://www.aoml.noaa.gov/hrd/data_sub/wind.html. Observations have been simulated from both aircraft altitudes and space. The simulated flight patterns for the aircraft platform cases have been designed to duplicate the timing and flight patterns used in routine NOAA and USAF hurricane surveillance flights, and the spaceborne case simulates a TRMM orbit and altitude.

  13. Report of the AAPM Task Group No. 105: Issues associated with clinical implementation of Monte Carlo-based photon and electron external beam treatment planning.

    PubMed

    Chetty, Indrin J; Curran, Bruce; Cygler, Joanna E; DeMarco, John J; Ezzell, Gary; Faddegon, Bruce A; Kawrakow, Iwan; Keall, Paul J; Liu, Helen; Ma, C M Charlie; Rogers, D W O; Seuntjens, Jan; Sheikh-Bagheri, Daryoush; Siebers, Jeffrey V

    2007-12-01

    The Monte Carlo (MC) method has been shown through many research studies to calculate accurate dose distributions for clinical radiotherapy, particularly in heterogeneous patient tissues where the effects of electron transport cannot be accurately handled with conventional, deterministic dose algorithms. Despite its proven accuracy and the potential for improved dose distributions to influence treatment outcomes, the long calculation times previously associated with MC simulation rendered this method impractical for routine clinical treatment planning. However, the development of faster codes optimized for radiotherapy calculations and improvements in computer processor technology have substantially reduced calculation times to, in some instances, within minutes on a single processor. These advances have motivated several major treatment planning system vendors to embark upon the path of MC techniques. Several commercial vendors have already released or are currently in the process of releasing MC algorithms for photon and/or electron beam treatment planning. Consequently, the accessibility and use of MC treatment planning algorithms may well become widespread in the radiotherapy community. With MC simulation, dose is computed stochastically using first principles; this method is therefore quite different from conventional dose algorithms. Issues such as statistical uncertainties, the use of variance reduction techniques, the ability to account for geometric details in the accelerator treatment head simulation, and other features, are all unique components of a MC treatment planning algorithm. Successful implementation by the clinical physicist of such a system will require an understanding of the basic principles of MC techniques. The purpose of this report, while providing education and review on the use of MC simulation in radiotherapy planning, is to set out, for both users and developers, the salient issues associated with clinical implementation and experimental verification of MC dose algorithms. As the MC method is an emerging technology, this report is not meant to be prescriptive. Rather, it is intended as a preliminary report to review the tenets of the MC method and to provide the framework upon which to build a comprehensive program for commissioning and routine quality assurance of MC-based treatment planning systems.
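
    To make the statistical side of this concrete, here is a deliberately crude sketch (in no way representative of a clinical MC dose engine): photons are tracked through a 1D voxelized phantom with a low-density insert using Woodcock (delta) tracking, all energy is deposited at the first interaction, and the batch method estimates how the per-voxel statistical uncertainty falls with the number of histories. The geometry, attenuation coefficient, and scoring are all simplified assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# 1D voxelized phantom: 30 cm of tissue with a low-density "lung" slab in the middle.
n_vox, dx = 60, 0.5                       # 60 voxels of 0.5 cm
density = np.ones(n_vox)                  # g/cm^3
density[20:40] = 0.3                      # lung-like region
mu_per_rho = 0.05                         # mass attenuation coefficient, cm^2/g (illustrative)
mu = mu_per_rho * density                 # linear attenuation per voxel, 1/cm
mu_max = mu.max()

def run_histories(n_hist):
    """Crude absorb-on-first-interaction photon MC using Woodcock (delta) tracking."""
    edep = np.zeros(n_vox)
    for _ in range(n_hist):
        x = 0.0
        while True:
            x += rng.exponential(1.0 / mu_max)        # flight to a tentative collision
            if x >= n_vox * dx:
                break                                  # photon escapes the phantom
            v = int(x / dx)
            if rng.random() < mu[v] / mu_max:          # accept as a real interaction
                edep[v] += 1.0                         # deposit the photon's energy locally
                break
    return edep / (density * dx)                       # "dose": energy per unit mass (a.u.)

for n in (1_000, 10_000, 100_000):
    batches = np.stack([run_histories(n // 10) for _ in range(10)])
    mean = batches.mean(axis=0)
    rel_err = batches.std(axis=0, ddof=1) / (np.sqrt(10) * np.maximum(mean, 1e-12))
    print(f"{n:>7} histories: mean relative uncertainty ~ {rel_err[mean > 0].mean():.1%}")
```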

  14. Challenges of future aircraft propulsion: A review of distributed propulsion technology and its potential application for the all electric commercial aircraft

    NASA Astrophysics Data System (ADS)

    Gohardani, Amir S.; Doulgeris, Georgios; Singh, Riti

    2011-07-01

    This paper highlights the role of distributed propulsion technology for future commercial aircraft. After an initial historical perspective on the conceptual aspects of distributed propulsion technology and a glimpse at numerous aircraft that have taken distributed propulsion technology to flight, the focal point of the review is shifted towards a potential role this technology may entail for future commercial aircraft. Technological limitations and challenges of this specific technology are also considered in combination with an all electric aircraft concept, as means of predicting the challenges associated with the design process of a next generation commercial aircraft.

  15. Teaching Engineering Statistics with Technology, Group Learning, Contextual Projects, Simulation Models and Student Presentations

    ERIC Educational Resources Information Center

    Romeu, Jorge Luis

    2008-01-01

    This article discusses our teaching approach in graduate level Engineering Statistics. It is based on the use of modern technology, learning groups, contextual projects, simulation models, and statistical and simulation software to entice student motivation. The use of technology to facilitate group projects and presentations, and to generate,…

  16. Implementation and Evaluation of Multiple Adaptive Control Technologies for a Generic Transport Aircraft Simulation

    NASA Technical Reports Server (NTRS)

    Campbell, Stefan F.; Kaneshige, John T.; Nguyen, Nhan T.; Krishakumar, Kalmanje S.

    2010-01-01

    Presented here is the evaluation of multiple adaptive control technologies for a generic transport aircraft simulation. For this study, seven model reference adaptive control (MRAC) based technologies were considered. Each technology was integrated into an identical dynamic-inversion control architecture and tuned using a methodology based on metrics and specific design requirements. Simulation tests were then performed to evaluate each technology's sensitivity to time delay, flight condition, model uncertainty, and artificially induced cross-coupling. The resulting robustness and performance characteristics were used to identify potential strengths, weaknesses, and integration challenges of the individual adaptive control technologies.
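
    For readers unfamiliar with the class of methods being compared, the sketch below shows a textbook MRAC loop for a scalar plant: adaptive feedback and feedforward gains are driven by the error between the plant and a reference model. It is not one of the seven evaluated technologies, and the plant, reference model, and adaptation gain are illustrative.

```python
import numpy as np

# Scalar plant x' = a*x + b*u with unknown a; reference model xm' = am*xm + bm*r.
a_true, b = -0.2, 1.0          # "true" plant (unknown to the controller)
am, bm = -2.0, 2.0             # desired reference-model dynamics
gamma = 10.0                   # adaptation gain (illustrative)

dt, t_end = 1e-3, 10.0
x = xm = 0.0
kx, kr = 0.0, 0.0              # adaptive feedback and feedforward gains
for k in range(int(t_end / dt)):
    t = k * dt
    r = 1.0 if t < 5.0 else -1.0            # square reference command
    u = kx * x + kr * r                     # MRAC control law
    e = x - xm                              # tracking error
    # Lyapunov-based adaptive laws: gain' = -gamma * e * regressor * sign(b)
    kx += dt * (-gamma * e * x * np.sign(b))
    kr += dt * (-gamma * e * r * np.sign(b))
    x += dt * (a_true * x + b * u)          # plant (explicit Euler)
    xm += dt * (am * xm + bm * r)           # reference model
print(f"final tracking error |x - xm| = {abs(x - xm):.4f}")
```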

  17. A research program to assess the impact of the electromagnetic pulse on electric power systems

    NASA Astrophysics Data System (ADS)

    McConnell, B. W.; Barnes, P. R.

    A strong electromagnetic pulse (EMP) with an electric-field component on the order of tens of kilovolts per meter is produced by a nuclear detonation in or above the atmosphere. This paper presents an overview and a summary of the results to date of a program formulated to address the research and development of technologies and systems required to assess and reduce the impact of EMP on electric power systems. The technologies and systems being considered include simulation models, methods of assessment, definition of required experiments and data, development of protective hardware, and the creation or revision of operating and control procedures. Results to date include the development of relatively simple unclassified EMP environment models, the development of methods for extending EMP coupling models to the large transmission and distribution network associated with the electric power system, and the performance of a parametric study of HEMP induced surges using an appropriate EMP environment. An experiment to investigate the effect of corona on the coupling of EMP to conductors has been defined and has been performed in an EMP simulator. Experiments to determine the response of key components to simulated EMP surges and an investigation of the impact of steep-front, short-duration impulse on a selected number of the insulation systems used in electric power systems apparatus are being performed.

  18. Cyber-Physical Test Platform for Microgrids: Combining Hardware, Hardware-in-the-Loop, and Network-Simulator-in-the-Loop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, Austin; Chakraborty, Sudipta; Wang, Dexin

    This paper presents a cyber-physical testbed developed to investigate the complex interactions between emerging microgrid technologies such as grid-interactive power sources, control systems, and a wide variety of communication platforms and bandwidths. The cyber-physical testbed consists of three major components for testing and validation: real-time models of a distribution feeder with microgrid assets that are integrated into the National Renewable Energy Laboratory's (NREL) power hardware-in-the-loop (PHIL) platform; real-time capable network-simulator-in-the-loop (NSIL) models; and physical hardware including inverters and a simple system controller. Several load profiles and microgrid configurations were tested to examine the effect on system performance of increasing channel delays and router processing delays in the network simulator. Testing demonstrated that the controller's ability to maintain a target grid-import power band was severely diminished with increasing network delays and laid the foundation for future testing of more complex cyber-physical systems.
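
    A toy illustration of the kind of effect the testbed measured (entirely invented numbers, not NREL's models): a simple controller tries to hold grid import inside a target band while its commands arrive after a communication delay, and the fraction of time steps outside the band rises sharply as the delay grows.

```python
import random
from collections import deque

def run(delay_steps, n_steps=2_000, target=100.0, band=10.0, gain=0.2, seed=1):
    """Toy grid-import controller whose commands arrive after a network delay."""
    rng = random.Random(seed)
    load, pv, setpoint = 150.0, 30.0, 0.0          # kW (illustrative)
    pending = deque([setpoint] * delay_steps, maxlen=delay_steps or 1)
    violations = 0
    for _ in range(n_steps):
        load += rng.gauss(0.0, 2.0)                # slowly varying load
        pv = max(0.0, pv + rng.gauss(0.0, 1.0))    # fluctuating PV output
        applied = pending[0] if delay_steps else setpoint
        grid_import = load - pv - applied          # local storage/DG covers the setpoint
        if abs(grid_import - target) > band:
            violations += 1
        setpoint += gain * (grid_import - target)  # controller update
        if delay_steps:
            pending.append(setpoint)               # command queued behind the delay
    return violations / n_steps

for d in (0, 5, 20, 60):
    print(f"delay = {d:3d} steps -> band violations in {run(d):.0%} of steps")
```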

  19. Optimal Shape Design of Mail-Slot Nacelle on N3-X Hybrid Wing-Body Configuration

    NASA Technical Reports Server (NTRS)

    Kim, Hyoungjin; Liou, Meng-Sing

    2013-01-01

    System studies show that an N3-X hybrid wing-body aircraft with a turboelectric distributed propulsion system using a mail-slot inlet/nozzle nacelle can meet the environmental and performance goals for N+3 generation transports (three generations beyond the current air transport technology level) set by NASA's Subsonic Fixed Wing Project. In this study, a Navier-Stokes flow simulation of the N3-X on hybrid unstructured meshes was conducted, including the mail-slot propulsor. The geometry of the mail-slot propulsor was generated by a CAD (computer-aided design)-free shape parameterization. A novel body-force model generation approach was suggested for a more realistic and efficient simulation of the flow turning, pressure rise, and loss effects of the fan blades and the inlet-fan interactions. Flow simulation results for the N3-X demonstrate the validity of the present approach. An optimal shape design of the mail-slot nacelle surface was conducted to reduce the strength of shock waves and flow separation on the cowl surface.

  20. Cybersim: geographic, temporal, and organizational dynamics of malware propagation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santhi, Nandakishore; Yan, Guanhua; Eidenbenz, Stephan

    2010-01-01

    Cyber-infractions into a nation's strategic security envelope pose a constant and daunting challenge. We present the modular CyberSim tool, which has been developed in response to the need to realistically simulate, at a national level, software vulnerabilities and the resulting malware propagation in online social networks. The CyberSim suite (a) can generate realistic scale-free networks from a database of geocoordinated computers to closely model social networks arising from personal and business email contacts and online communities; (b) maintains for each host a list of installed software, along with the latest published vulnerabilities; (d) allows designated initial nodes where malware gets introduced; (e) simulates, using distributed discrete event-driven technology, the spread of malware exploiting a specific vulnerability, with packet delay and user online behavior models; (f) provides a graphical visualization of the spread of infection, its severity, businesses affected, etc. to the analyst. We present sample simulations on a national-level network with millions of computers.
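
    A small sketch of the core propagation idea (not the CyberSim implementation; the graph size, infection rate, and vulnerable fraction are invented): malware spreads over a scale-free contact network as a discrete-event simulation, with exponentially distributed exploit delays on edges and only vulnerable hosts susceptible.

```python
import heapq
import random
import networkx as nx

random.seed(0)

# Scale-free email/contact graph (a stand-in for the geocoordinated host database).
G = nx.barabasi_albert_graph(n=10_000, m=3, seed=0)

rate = 0.05                      # per-edge infection rate (illustrative)
vulnerable = {node for node in G if random.random() < 0.4}   # hosts running the flawed software

infected_at = {}
events = []                      # (time, node) min-heap of pending infection events
for seed_node in random.sample(sorted(vulnerable), 5):        # designated initial nodes
    heapq.heappush(events, (0.0, seed_node))

while events:
    t, node = heapq.heappop(events)
    if node in infected_at:
        continue                 # already infected by an earlier event
    infected_at[node] = t
    for neighbor in G.neighbors(node):
        if neighbor in vulnerable and neighbor not in infected_at:
            delay = random.expovariate(rate)      # time until the exploit lands
            heapq.heappush(events, (t + delay, neighbor))

print(f"{len(infected_at)} of {len(vulnerable)} vulnerable hosts infected; "
      f"last infection at t = {max(infected_at.values()):.1f}")
```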
