Sample records for factory automation simulation

  1. Arcnet(R) On-Fiber -- A Viable Factory Automation Alternative

    NASA Astrophysics Data System (ADS)

    Karlin, Geof; Tucker, Carol S.

    1987-01-01

Manufacturers need to improve their operating methods and increase their productivity so they can compete successfully in the marketplace. This goal can be achieved through factory automation, and the key to this automation is successful data base management and factory integration. However, large scale factory automation and integration requires effective communications, and this has given rise to an interest in various Local Area Networks or LANs. In a completely integrated and automated factory, the entire organization must have access to the data base, and all departments and functions must be able to communicate with each other. Traditionally, these departments and functions use incompatible equipment, and the ability to make such equipment communicate presents numerous problems. ARCNET, a token-passing LAN which has a significant presence in the office environment today, coupled with fiber optic cable, the cable of the future, provides an effective, low-cost solution to a number of these problems.
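The token-passing access method ARCNET uses can be illustrated with a toy round-robin model. This is a sketch only: the node IDs and queue contents are invented, and real ARCNET adds details such as network reconfiguration and token timeouts.

```python
# Toy sketch of token passing: stations transmit only while holding the
# token, which circulates in node-ID order (all values invented).
def token_rounds(stations, queued, rounds=1):
    """stations: node IDs in ring order; queued: dict id -> frames waiting.
    Returns the order in which nodes transmit over `rounds` circulations."""
    sent = []
    for _ in range(rounds):
        for node in stations:
            if queued.get(node, 0) > 0:
                sent.append(node)   # transmit one frame, then pass the token
                queued[node] -= 1
    return sent

# Nodes 1 and 7 have traffic; node 3 stays silent but still sees the token
print(token_rounds([1, 3, 7], {1: 2, 7: 1}, rounds=2))  # -> [1, 7, 1]
```

Because every station receives the token once per circulation, worst-case access latency is bounded, which is the property that makes token passing attractive on a factory floor.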

  2. West Europe Report, Science and Technology.

    DTIC Science & Technology

    1986-04-15

    BLICK DURCH DIE WIRTSCHAFT, 21 Feb 86) SEIAF: ELSAG/IBM's New Creation in Factory Automation (Mauro Flego interview; AUTOMAZIONE INTEGRATA...SEIAF: ELSAG/IBM'S NEW CREATION IN FACTORY AUTOMATION, Milan AUTOMAZIONE INTEGRATA in Italian, Apr 85, pp 110-112 [Interview with Mauro Flego...objectives of SEIAF? [Answer] SEIAF, or better, the joint venture ELSAG/IBM, concerns itself with electronic and computer systems for factory automation

  3. Cognitive performance deficits in a simulated climb of Mount Everest - Operation Everest II

    NASA Technical Reports Server (NTRS)

    Kennedy, R. S.; Dunlap, W. P.; Banderet, L. E.; Smith, M. G.; Houston, C. S.

    1989-01-01

    Cognitive function at simulated altitude was investigated in a repeated-measures within-subject study of performance by seven volunteers in a hypobaric chamber, in which atmospheric pressure was systematically lowered over a period of 40 d to finally reach a pressure equivalent to 8845 m, the approximate height of Mount Everest. The automated performance test system employed compact computer design; automated test administrations, data storage, and retrieval; psychometric properties of stability and reliability; and factorial richness. Significant impairments of cognitive function were seen for three of the five tests in the battery; on two tests, grammatical reasoning and pattern comparison, every subject showed a substantial decrement.

  4. Automation U.S.A.: Overcoming Barriers to Automation.

    ERIC Educational Resources Information Center

    Brody, Herb

    1985-01-01

    Although labor unions and inadequate technology play minor roles, the principal barrier to factory automation is "fear of change." Related problems include long-term benefits, nontechnical executives, and uncertainty of factory cost accounting. Industry support for university programs is helping to educate engineers to design, implement, and…

  5. Replicating systems concepts: Self-replicating lunar factory and demonstration

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Automation of lunar mining and manufacturing facility maintenance and repair is addressed. Designing the factory as an automated, multiproduct, remotely controlled, reprogrammable Lunar Manufacturing Facility capable of constructing duplicates of itself which would themselves be capable of further replication is proposed.

  6. Automated workflows for modelling chemical fate, kinetics and toxicity.

    PubMed

    Sala Benito, J V; Paini, Alicia; Richarz, Andrea-Nicole; Meinl, Thorsten; Berthold, Michael R; Cronin, Mark T D; Worth, Andrew P

    2017-12-01

    Automation is universal in today's society, from operating equipment such as machinery, in factory processes, to self-parking automobile systems. While these examples show the efficiency and effectiveness of automated mechanical processes, automated procedures that support the chemical risk assessment process are still in their infancy. Future human safety assessments will rely increasingly on the use of automated models, such as physiologically based kinetic (PBK) and dynamic models and the virtual cell based assay (VCBA). These biologically-based models will be coupled with chemistry-based prediction models that also automate the generation of key input parameters such as physicochemical properties. The development of automated software tools is an important step in harmonising and expediting the chemical safety assessment process. In this study, we illustrate how the KNIME Analytics Platform can be used to provide a user-friendly graphical interface for these biokinetic models, such as PBK models and VCBA, which simulate the fate of chemicals in vivo within the body and in vitro test systems respectively. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
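The PBK models mentioned here track many tissues and processes; purely as a sketch of the kinetic idea behind them, and with invented parameter values, a one-compartment model with first-order elimination reduces to an exponential decay:

```python
import math

# One-compartment sketch: after a bolus dose, concentration decays as
# C(t) = (dose / V) * exp(-k_e * t). Real PBK models are far more detailed.
def concentration(dose_mg, volume_l, k_e_per_h, t_h):
    c0 = dose_mg / volume_l                # initial concentration, mg/L
    return c0 * math.exp(-k_e_per_h * t_h)

# Invented example: 100 mg dose, 40 L distribution volume, k_e = 0.1 /h
print(concentration(100, 40, 0.1, 0))               # 2.5 mg/L at t = 0
print(round(concentration(100, 40, 0.1, 12), 3))    # ~0.753 mg/L after 12 h
```

A workflow platform such as KNIME wraps this kind of computation in graphical nodes so that non-programmers can chain dosing, kinetics, and reporting steps.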

  7. Engineering the smart factory

    NASA Astrophysics Data System (ADS)

    Harrison, Robert; Vera, Daniel; Ahmad, Bilal

    2016-10-01

    The fourth industrial revolution promises to create what has been called the smart factory. The vision is that within such modular structured smart factories, cyber-physical systems monitor physical processes, create a virtual copy of the physical world and make decentralised decisions. This paper provides a view of this initiative from an automation systems perspective. In this context it considers how future automation systems might be effectively configured and supported through their lifecycles and how integration, application modelling, visualisation and reuse of such systems might be best achieved. The paper briefly describes limitations in current engineering methods, and new emerging approaches including the cyber physical systems (CPS) engineering tools being developed by the automation systems group (ASG) at Warwick Manufacturing Group, University of Warwick, UK.

  8. Gearing up to the factory of the future

    NASA Astrophysics Data System (ADS)

    Godfrey, D. E.

    1985-01-01

    The features of factories and manufacturing techniques and tools of the near future are discussed. The spur to incorporate new technologies on the factory floor will originate in management, who must guide the interfacing of computer-enhanced equipment with traditional manpower, materials and machines. Electronic control with responsiveness and flexibility will be the key concept in an integrated approach to processing materials. Microprocessor controlled laser and fluid cutters add accuracy to cutting operations. Unattended operation will become feasible when automated inspection is added to a work station through developments in robot vision. Optimum shop management will be achieved through AI programming of parts manufacturing, optimized work flows, and cost accounting. The automation enhancements will allow designers to affect directly parts being produced on the factory floor.

  9. Automated Sample Exchange Robots for the Structural Biology Beam Lines at the Photon Factory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hiraki, Masahiko; Watanabe, Shokei; Yamada, Yusuke

    2007-01-19

    We are now developing automated sample exchange robots for high-throughput protein crystallographic experiments for onsite use at synchrotron beam lines. It is part of the fully automated robotics systems being developed at the Photon Factory, for the purposes of protein crystallization, monitoring crystal growth, harvesting and freezing crystals, mounting the crystals inside a hutch and for data collection. We have already installed the sample exchange robots based on the SSRL automated mounting system at our insertion device beam lines BL-5A and AR-NW12A at the Photon Factory. In order to reduce the time required for sample exchange further, a prototype of a double-tonged system was developed. As a result of preliminary experiments with double-tonged robots, the sample exchange time was successfully reduced from 70 seconds to 10 seconds with the exception of the time required for pre-cooling and warming up the tongs.

  10. JPRS Report, Science & Technology, Europe & Latin America.

    DTIC Science & Technology

    1988-01-22

    Rex Malik; ZERO UN INFORMATIQUE, 31 Aug 87) FACTORY AUTOMATION, ROBOTICS: West Europe Seeks To Halt Japanese Inroads in Machine Tool Sector...aircraft. CSO: 3698/A014 FACTORY AUTOMATION, ROBOTICS: WEST EUROPE SEEKS TO HALT JAPANESE INROADS IN MACHINE TOOL SECTOR...Trumpf, by the same journalist; first paragraph is L'USINE NOUVELLE introduction] [Excerpts] European machine-tool builders are stepping up mutual

  11. Integrating PCLIPS into ULowell's Lincoln Logs: Factory of the future

    NASA Technical Reports Server (NTRS)

    Mcgee, Brenda J.; Miller, Mark D.; Krolak, Patrick; Barr, Stanley J.

    1990-01-01

    We are attempting to show how independent but cooperating expert systems, executing within a parallel production system (PCLIPS), can operate and control a completely automated, fault tolerant prototype of a factory of the future (The Lincoln Logs Factory of the Future). The factory consists of a CAD system for designing the Lincoln Log Houses, two workcells, and a materials handling system. A workcell consists of two robots, part feeders, and a frame mounted vision system.

  12. The National Shipbuilding Research Program, Computer Aided Process Planning for Shipyards

    DTIC Science & Technology

    1986-08-01

    Factory Simulation with Conventional Factory Planning Techniques; Financial Justification of State-of-the-Art Investment: A Study Using CAPP...and engineer to order." "Factory Simulation: Approach to Integration of Computer-Based Factory Simulation with Conventional Factory Planning Techniques

  13. Robotics and Automation Education: Developing the Versatile, Practical Lab.

    ERIC Educational Resources Information Center

    Stenerson, Jon

    1986-01-01

    Elements of the development of a robotics and automation laboratory are discussed. These include the benefits of upgrading current staff, ways to achieve this staff development, formation of a robotics factory automation committee, topics to be taught with a robot, elements of a laboratory, laboratory funding, and design safety. (CT)

  14. AUTOMATING ASSET KNOWLEDGE WITH MTCONNECT.

    PubMed

    Venkatesh, Sid; Ly, Sidney; Manning, Martin; Michaloski, John; Proctor, Fred

    2016-01-01

    In order to maximize assets, manufacturers should use real-time knowledge garnered from ongoing and continuous collection and evaluation of factory-floor machine status data. In discrete parts manufacturing, factory machine monitoring has been difficult, due primarily to closed, proprietary automation equipment that makes integration difficult. Recently, there has been a push in applying the data acquisition concepts of MTConnect to the real-time acquisition of machine status data. MTConnect is an open, free specification aimed at overcoming the "Islands of Automation" dilemma on the shop floor. With automated asset analysis, manufacturers can improve production to become lean, efficient, and effective. The focus of this paper will be on the deployment of MTConnect to collect real-time machine status to automate asset management. In addition, we will leverage the ISO 22400 standard, which defines an asset and quantifies asset performance metrics. In conjunction with these goals, the deployment of MTConnect in a large aerospace manufacturing facility will be studied with emphasis on asset management and understanding the impact of machine Overall Equipment Effectiveness (OEE) on manufacturing.
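The OEE figure mentioned above is conventionally the product of availability, performance, and quality rates (consistent with the ISO 22400 definitions); a minimal sketch, with invented shift numbers standing in for the MTConnect-collected data:

```python
# OEE = availability x performance x quality, from one shift's counters.
def oee(planned_min, runtime_min, ideal_cycle_min, total_count, good_count):
    availability = runtime_min / planned_min
    performance = (ideal_cycle_min * total_count) / runtime_min
    quality = good_count / total_count
    return availability * performance * quality

# Invented shift: 480 min planned, 400 min running, 0.5 min/part ideal,
# 700 parts made, 680 good
print(round(oee(480, 400, 0.5, 700, 680), 3))  # -> 0.708
```

In a deployment like the one described, the runtime and part counters would come from MTConnect data streams rather than being entered by hand.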

  15. The Learning Basis of Automated Factories: The Case of FIAT. Training Discussion Paper No. 86.

    ERIC Educational Resources Information Center

    Araujo e Oliveira, Joao Batista

    As part of a study on the impact of automation on training, extensive interviews were conducted at two of Fiat's plants, Termoli and Cassino, Italy. At Termoli, a plant built in the mid-1980s with automation in mind, production of engines and gearboxes was highly integrated through automation devices. Cassino produced some individual components but was…

  16. Advances in greenhouse automation and controlled environment agriculture: A transition to plant factories and urban farming

    USDA-ARS?s Scientific Manuscript database

    Greenhouse cultivation has evolved from simple covered rows of open-field crops to highly sophisticated controlled environment agriculture (CEA) facilities that project the image of plant factories for urban farming. The advances and improvements in CEA have promoted scientific solutions for ...

  17. Minifactory: a precision assembly system adaptable to the product life cycle

    NASA Astrophysics Data System (ADS)

    Muir, Patrick F.; Rizzi, Alfred A.; Gowdy, Jay W.

    1997-12-01

    Automated product assembly systems are traditionally designed with the intent that they will be operated with few significant changes for as long as the product is being manufactured. This approach to factory design and programming has many undesirable qualities which have motivated the development of more 'flexible' systems. In an effort to improve agility, different types of flexibility have been integrated into factory designs. Specifically, automated assembly systems have been endowed with the ability to assemble differing products by means of computer-controlled robots, and to accommodate variations in parts locations and dimensions by means of sensing. The product life cycle (PLC) is a standard four-stage model of the performance of a product from the time that it is first introduced in the marketplace until the time that it is discontinued. Manufacturers can improve their return on investment by adapting the production process to the PLC. We are developing two concepts to enable manufacturers to more readily achieve this goal: the agile assembly architecture (AAA), an abstract framework for distributed modular automation; and minifactory, our physical instantiation of this architecture for the assembly of precision electro-mechanical devices. By examining the requirements which each PLC stage places upon the production system, we identify characteristics of factory design and programming which are appropriate for that stage. As the product transitions from one stage to the next, the factory design and programming should also transition from one embodiment to the next in order to achieve the best return on investment. Modularity of the factory components, highly flexible product transport mechanisms, and a high level of distributed intelligence are key characteristics of minifactory that enable this adaptation.

  18. Development of integrated control system for smart factory in the injection molding process

    NASA Astrophysics Data System (ADS)

    Chung, M. J.; Kim, C. Y.

    2018-03-01

    In this study, we propose an integrated control system for automation of the injection molding process required for the construction of a smart factory. The injection molding process consists of heating, tool close, injection, cooling, tool open, and take-out. A take-out robot controller, an image processing module, and a process data acquisition interface module were developed and assembled into the integrated control system. By adopting the integrated control system, the injection molding process can be simplified and the cost of constructing a smart factory kept low.
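The six steps listed form one repeating cycle; a trivial sketch of that sequencing (step names are taken from the abstract, the helper function is invented):

```python
# Injection molding cycle as listed in the abstract, in order.
CYCLE = ["heating", "tool close", "injection", "cooling", "tool open", "take-out"]

def next_step(current):
    """Return the step that follows `current`, wrapping back to the start."""
    i = CYCLE.index(current)
    return CYCLE[(i + 1) % len(CYCLE)]

print(next_step("injection"))  # -> cooling
print(next_step("take-out"))   # -> heating (the cycle restarts)
```

An integrated controller of the kind described would advance through such a cycle while the acquisition module logs timing and sensor data at each step.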

  19. Automation; The New Industrial Revolution.

    ERIC Educational Resources Information Center

    Arnstein, George E.

    Automation is a word that describes the workings of computers and the innovations of automatic transfer machines in the factory. As the hallmark of the new industrial revolution, computers displace workers and create a need for new skills and retraining programs. With improved communication between industry and the educational community to…

  20. Automation Training Tools of the Future.

    ERIC Educational Resources Information Center

    Rehg, James

    1986-01-01

    Manufacturing isn't what it used to be, and the United States must ensure its position in the world trade market by educating factory workers in new automated systems. A computer manufacturing engineer outlines the training requirements of a modern workforce and details robotic training devices suitable for classroom use. (JN)

  21. Reviving the Rural Factory: Automation and Work in the South. Executive Summary.

    ERIC Educational Resources Information Center

    Rosenfeld, Stuart A.; And Others

    This document is the executive summary for a two volume report on technological innovation and southern rural industrial development. The first volume examines public and private factors that influence investment decisions in new technologies and the outcomes of those decisions; effects of automation on employment and the workplace; outcomes of…

  22. Reviving the Rural Factory: Automation and Work in the South. Volumes 1 and 2.

    ERIC Educational Resources Information Center

    Rosenfeld, Stuart A.; And Others

    These two volumes examine how the public sector can help revitalize southern rural counties adversely affected by global competition and technological advances. The first volume examines public and private factors that influence investment decisions in new technologies and outcomes of those decisions; effects of automation on employment and the…

  23. AUTOMATING ASSET KNOWLEDGE WITH MTCONNECT

    PubMed Central

    Venkatesh, Sid; Ly, Sidney; Manning, Martin; Michaloski, John; Proctor, Fred

    2017-01-01

    In order to maximize assets, manufacturers should use real-time knowledge garnered from ongoing and continuous collection and evaluation of factory-floor machine status data. In discrete parts manufacturing, factory machine monitoring has been difficult, due primarily to closed, proprietary automation equipment that makes integration difficult. Recently, there has been a push in applying the data acquisition concepts of MTConnect to the real-time acquisition of machine status data. MTConnect is an open, free specification aimed at overcoming the “Islands of Automation” dilemma on the shop floor. With automated asset analysis, manufacturers can improve production to become lean, efficient, and effective. The focus of this paper will be on the deployment of MTConnect to collect real-time machine status to automate asset management. In addition, we will leverage the ISO 22400 standard, which defines an asset and quantifies asset performance metrics. In conjunction with these goals, the deployment of MTConnect in a large aerospace manufacturing facility will be studied with emphasis on asset management and understanding the impact of machine Overall Equipment Effectiveness (OEE) on manufacturing. PMID:28691121

  24. Advanced MAP for Real-Time Process Control

    NASA Astrophysics Data System (ADS)

    Shiobara, Yasuhisa; Matsudaira, Takayuki; Sashida, Yoshio; Chikuma, Makoto

    1987-10-01

    MAP, a communications protocol for factory automation proposed by General Motors [1], has been accepted by users throughout the world and is rapidly becoming a user standard. In fact, it is now a LAN standard for factory automation. MAP is intended to interconnect different devices, such as computers and programmable devices, made by different manufacturers, enabling them to exchange information. It is based on the OSI intercomputer communications protocol standard under development by the ISO. With progress and standardization, MAP is being investigated for application to process control fields other than factory automation [2]. The transmission response time of the network system and centralized management of data exchanged with various devices for distributed control are important in the case of a real-time process control with programmable controllers, computers, and instruments connected to a LAN system. MAP/EPA and MINI MAP aim at reduced overhead in protocol processing and enhanced transmission response. If applied to real-time process control, a protocol based on point-to-point and request-response transactions limits throughput and transmission response. This paper describes an advanced MAP LAN system applied to real-time process control by adding a new data transmission control that performs multicasting communication voluntarily and periodically in the priority order of data to be exchanged.
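The added control, multicasting periodically in priority order, might be sketched very loosely as a scheduling loop over data tags. Everything below (tag names, periods, the heap-based scheduler) is invented for illustration and is not from the paper:

```python
import heapq

def schedule(tags, cycles):
    """tags: (priority, period, name) tuples, lower priority number = more
    urgent; ties on due time break by priority. Returns the multicast order."""
    heap = [(0, prio, name, period) for prio, period, name in tags]
    heapq.heapify(heap)
    order = []
    for _ in range(cycles):
        due, prio, name, period = heapq.heappop(heap)
        order.append(name)
        heapq.heappush(heap, (due + period, prio, name, period))  # next deadline
    return order

# Urgent 'alarm' data multicast twice as often as background 'trend' data
print(schedule([(0, 1, "alarm"), (1, 2, "trend")], 4))
```

Sending each tag on its own period, rather than waiting for request-response transactions, is what lets such a scheme bound the update latency of high-priority data.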

  25. Computers in Manufacturing.

    ERIC Educational Resources Information Center

    Hudson, C. A.

    1982-01-01

    Advances in factory computerization (computer-aided design and computer-aided manufacturing) are reviewed, including discussions of robotics, human factors engineering, and the sociological impact of automation. (JN)

  26. Machine vision for various manipulation tasks

    NASA Astrophysics Data System (ADS)

    Domae, Yukiyasu

    2017-03-01

    Bin-picking, re-grasping, pick-and-place, kitting: there are many manipulation tasks in the fields of factory and warehouse automation. The main problem for automation is that the target objects (items/parts) have various shapes, weights, and surface materials. In this talk, I will show the latest machine vision systems and algorithms that address this problem.

  27. Intelligent robot trends for factory automation

    NASA Astrophysics Data System (ADS)

    Hall, Ernest L.

    1997-09-01

    An intelligent robot is a remarkably useful combination of a manipulator, sensors and controls. The use of these machines in factory automation can improve productivity, increase product quality and improve competitiveness. This paper presents a discussion of recent economic and technical trends. The robotics industry now has a billion-dollar market in the U.S. and is growing. Feasibility studies are presented which also show unaudited healthy rates of return for a variety of robotic applications. Technically, the machines are faster, cheaper, more repeatable, more reliable and safer. The knowledge base of inverse kinematic and dynamic solutions and intelligent controls is increasing. More attention is being given by industry to robots, vision and motion controls. New areas of usage are emerging for service robots, remote manipulators and automated guided vehicles. However, the road from inspiration to successful application is still long and difficult, often taking decades to achieve a new product. More cooperation between government, industry and universities is needed to speed the development of intelligent robots that will benefit both industry and society.

  28. Automated production of plant-based vaccines and pharmaceuticals.

    PubMed

    Wirz, Holger; Sauer-Budge, Alexis F; Briggs, John; Sharpe, Aaron; Shu, Sudong; Sharon, Andre

    2012-12-01

    A fully automated "factory" was developed that uses tobacco plants to produce large quantities of vaccines and other therapeutic biologics within weeks. This first-of-a-kind factory takes advantage of a plant viral vector technology to produce specific proteins within the leaves of rapidly growing plant biomass. The factory's custom-designed robotic machines plant seeds, nurture the growing plants, introduce a viral vector that directs the plant to produce a target protein, and harvest the biomass once the target protein has accumulated in the plants-all in compliance with Food and Drug Administration (FDA) guidelines (e.g., current Good Manufacturing Practices). The factory was designed to be time, cost, and space efficient. The plants are grown in custom multiplant trays. Robots ride up and down a track, servicing the plants and delivering the trays from the lighted, irrigated growth modules to each processing station as needed. Using preprogrammed robots and processing equipment eliminates the need for human contact, preventing potential contamination of the process and economizing the operation. To quickly produce large quantities of protein-based medicines, we transformed a laboratory-based biological process and scaled it into an industrial process. This enables quick, safe, and cost-effective vaccine production that would be required in case of a pandemic.

  29. Fully Automated Data Collection Using PAM and the Development of PAM/SPACE Reversible Cassettes

    NASA Astrophysics Data System (ADS)

    Hiraki, Masahiko; Watanabe, Shokei; Chavas, Leonard M. G.; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Wakatsuki, Soichi; Fujihashi, Masahiro; Miki, Kunio; Baba, Seiki; Ueno, Go; Yamamoto, Masaki; Suzuki, Mamoru; Nakagawa, Atsushi; Watanabe, Nobuhisa; Tanaka, Isao

    2010-06-01

    To remotely control and automatically collect data in high-throughput X-ray data collection experiments, the Structural Biology Research Center at the Photon Factory (PF) developed and installed sample exchange robots PAM (PF Automated Mounting system) at PF macromolecular crystallography beamlines; BL-5A, BL-17A, AR-NW12A and AR-NE3A. We developed and installed software that manages the flow of the automated X-ray experiments; sample exchanges, loop-centering and X-ray diffraction data collection. The fully automated data collection function has been available since February 2009. To identify sample cassettes, PAM employs a two-dimensional bar code reader. New beamlines, BL-1A at the Photon Factory and BL32XU at SPring-8, are currently under construction as part of Targeted Proteins Research Program (TPRP) by the Ministry of Education, Culture, Sports, Science and Technology of Japan. However, different robots, PAM and SPACE (SPring-8 Precise Automatic Cryo-sample Exchanger), will be installed at BL-1A and BL32XU, respectively. For the convenience of the users of both facilities, pins and cassettes for PAM and SPACE are developed as part of the TPRP.

  30. A research factory for polymer microdevices: muFac

    NASA Astrophysics Data System (ADS)

    Anthony, Brian W.; Hardt, David E.; Hale, Melinda; Zarrouati, Nadege

    2010-02-01

    As part of our research on the manufacturing science of micron scale polymer-based devices, an automated production cell has been developed to explore its use in a volume manufacturing environment. This "micro-factory" allows the testing of models and hardware that have resulted from research on material characterization and simulation, tooling and equipment design and control, and process control and metrology. More importantly it has allowed us to identify the problems that exist between and within unit-processes. This paper details our efforts to produce basic micro-fluidic products in high volume at acceptable production rates and quality levels. The device chosen for our first product is a simple binary micromixer with 40×50 micron channel cross section manufactured by embossing of PMMA. The processes in the cell include laser cutting and drilling, hot embossing, thermal bonding and high-speed inspection of the components. Our goal is to create a "lights-out" factory that can make long production runs (e.g. an 8 hour shift) at high rates (Takt time of less than 3 minutes) with consistent quality. This contrasts with device foundries where prototypes in limited quantities but with high variety are the goal. Accordingly, rate and yield are dominant factors in this work, along with the need for precise material handling strategies. Production data will be presented to include process run charts, sampled functional testing of the products and measures of the overall system throughput.
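The stated target translates into a concrete rate: by the standard Takt definition (available production time divided by demand), a Takt time under 3 minutes over an 8-hour shift means at least 160 parts per shift. A quick check:

```python
# Takt time relates shift length to required output (standard definition;
# the shift length and Takt target come from the abstract above).
def takt_time_minutes(shift_hours, demand_parts):
    return shift_hours * 60 / demand_parts

def parts_for_takt(shift_hours, takt_minutes):
    return shift_hours * 60 / takt_minutes

print(parts_for_takt(8, 3))       # -> 160.0 parts to hit a 3-minute Takt
print(takt_time_minutes(8, 160))  # -> 3.0 minutes at 160 parts/shift
```

This is why rate and yield, rather than flexibility, dominate the design of such a cell: every unit process and handling step must fit inside that per-part budget.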

  31. The historical development and basis of human factors guidelines for automated systems in aeronautical operations

    NASA Technical Reports Server (NTRS)

    Ciciora, J. A.; Leonard, S. D.; Johnson, N.; Amell, J.

    1984-01-01

    In order to derive general design guidelines for automated systems, a study was conducted on the utilization and acceptance of existing automated systems as currently employed in several commercial fields. Four principal study areas were investigated by means of structured interviews, and in some cases questionnaires. The study areas were aviation, both scheduled airline and general commercial aviation; process control and factory applications; office automation; and automation in the power industry. The results of over eighty structured interviews were analyzed and responses categorized as various human factors issues for use by both designers and users of automated equipment. These guidelines address such items as general physical features of automated equipment; personnel orientation, acceptance, and training; and both personnel and system reliability.

  32. Robots and the Economy.

    ERIC Educational Resources Information Center

    Albus, James S.

    1984-01-01

    Spectacular advances in microcomputers are forging new technological frontiers in robotics. For example, many factories will be totally automated. Economic implications of the new technology of robotics for the future are examined. (RM)

  33. Automated Scheduling Via Artificial Intelligence

    NASA Technical Reports Server (NTRS)

    Biefeld, Eric W.; Cooper, Lynne P.

    1991-01-01

    Artificial-intelligence software that automates scheduling developed in Operations Mission Planner (OMP) research project. Software used in both generation of new schedules and modification of existing schedules in view of changes in tasks and/or available resources. Approach based on iterative refinement. Although project focused upon scheduling of operations of scientific instruments and other equipment aboard spacecraft, also applicable to such terrestrial problems as scheduling production in factory.
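Iterative refinement can be illustrated with a toy repair loop that improves an initial greedy assignment step by step. The OMP scheduler is far richer; the two-resource load-balancing below is entirely invented for illustration:

```python
def iterative_repair(tasks):
    """Balance task loads across two resources: start with everything on one,
    then repeatedly move a task whenever doing so shrinks the imbalance."""
    a, b = list(tasks.items()), []
    improved = True
    while improved:
        improved = False
        la, lb = sum(v for _, v in a), sum(v for _, v in b)
        heavy, light = (a, b) if la >= lb else (b, a)
        diff = abs(la - lb)
        for item in sorted(heavy, key=lambda kv: -kv[1]):
            if item[1] < diff:          # moving strictly reduces imbalance
                heavy.remove(item)
                light.append(item)
                improved = True
                break
    return sum(v for _, v in a), sum(v for _, v in b)

print(iterative_repair({"t1": 5, "t2": 3, "t3": 2, "t4": 2}))  # -> (7, 5)
```

The appeal of repair-style scheduling, on spacecraft or a factory floor, is that a change in tasks or resources perturbs an existing schedule instead of forcing a rebuild from scratch.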

  34. Manufacturing data analytics using a virtual factory representation.

    PubMed

    Jain, Sanjay; Shao, Guodong; Shin, Seung-Jun

    2017-01-01

    Large manufacturers have been using simulation to support decision-making for design and production. However, with the advancement of technologies and the emergence of big data, simulation can be utilised to perform and support data analytics for associated performance gains. This requires not only significant model development expertise, but also huge data collection and analysis efforts. This paper presents an approach within the frameworks of Design Science Research Methodology and prototyping to address the challenge of increasing the use of modelling, simulation and data analytics in manufacturing via reduction of the development effort. The use of manufacturing simulation models is presented as data analytics applications themselves and for supporting other data analytics applications by serving as data generators and as a tool for validation. The virtual factory concept is presented as the vehicle for manufacturing modelling and simulation. Virtual factory goes beyond traditional simulation models of factories to include multi-resolution modelling capabilities and thus allowing analysis at varying levels of detail. A path is proposed for implementation of the virtual factory concept that builds on developments in technologies and standards. A virtual machine prototype is provided as a demonstration of the use of a virtual representation for manufacturing data analytics.

  35. Effects of Self-Instructional Methods and Above Real Time Training (ARTT) for Maneuvering Tasks on a Flight Simulator

    NASA Technical Reports Server (NTRS)

    Ali, Syed Firasat; Khan, Javed Khan; Rossi, Marcia J.; Crane, Peter; Heath, Bruce E.; Knighten, Tremaine; Culpepper, Christi

    2003-01-01

    Personal computer based flight simulators are expanding opportunities for providing low-cost pilot training. One advantage of these devices is the opportunity to incorporate instructional features into training scenarios that might not be cost effective with earlier systems. Research was conducted to evaluate the utility of different instructional features using a coordinated level turn as an aircraft maneuvering task. In study I, a comparison was made between automated computer grades of performance and certified flight instructors' grades. Each of the six student volunteers conducted a flight with level turns at two different bank angles. The automated computer grades were based on prescribed tolerances on bank angle, airspeed and altitude. Two certified flight instructors independently examined the video tapes of heads-up and instrument displays of the flights and graded them. The comparison of automated grades with the instructors' grades was based on correlations between them. In study II, a 2x2 between-subjects factorial design was used to devise and conduct an experiment. Comparison was made between real time training and above real time training and between feedback and no feedback in training. The performance measure used to monitor progress in training was based on deviations in bank angle and altitude. The performance measure was developed after completion of the experiment, including the training and test flights; it was not envisaged before the experiment. The experiment did not include self-instructions as originally planned, although feedback by the experimenter to the trainee was included in the study.
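The automated grading described, prescribed tolerances on bank angle, airspeed, and altitude, can be sketched as the fraction of flight samples held within bounds. The tolerance values and samples below are invented, not taken from the study:

```python
def grade(samples, bank_cmd, alt_cmd, bank_tol=5.0, alt_tol=100.0):
    """samples: (bank_deg, altitude_ft) pairs; returns fraction in tolerance."""
    ok = sum(1 for bank, alt in samples
             if abs(bank - bank_cmd) <= bank_tol and abs(alt - alt_cmd) <= alt_tol)
    return ok / len(samples)

# Invented level-turn samples against a 30-degree, 5000 ft command
flight = [(29, 5010), (33, 5050), (38, 5200), (30, 4990)]
print(grade(flight, bank_cmd=30, alt_cmd=5000))  # -> 0.75
```

Correlating such a machine score against instructor grades, as study I did, is the natural way to validate that the tolerances capture what instructors actually reward.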

  16. Effects of Self-Instructional Methods and Above Real Time Training (ARTT) for Maneuvering Tasks on a Flight Simulator

    NASA Technical Reports Server (NTRS)

    Norlin, Ken (Technical Monitor); Ali, Syed Firasat; Khan, M. Javed; Rossi, Marcia J.; Crane, Peter; Heath, Bruce E.; Knighten, Tremaine; Culpepper, Christi

    2003-01-01

    Personal computer based flight simulators are expanding opportunities for providing low-cost pilot training. One advantage of these devices is the opportunity to incorporate instructional features into training scenarios that might not have been cost effective with earlier systems. Research was conducted to evaluate the utility of different instructional features using a coordinated level turn as an aircraft maneuvering task. In study I, automated computer grades of performance were compared with the grades assigned by certified flight instructors. Each of the six student volunteers conducted a flight with level turns at two different bank angles. The automated computer grades were based on prescribed tolerances on bank angle, airspeed and altitude. Two certified flight instructors independently examined videotapes of the head-up and instrument displays of the flights and graded them. Automated grades were compared with instructor grades by computing the correlations between them. In study II, an experiment was devised and conducted using a 2×2 between-subjects factorial design, comparing real-time training with above-real-time training, and training with feedback against training without it. The performance measure used to monitor progress in training was based on deviations in bank angle and altitude; it was developed only after the training and test flights of the experiment were completed, not envisaged beforehand. Contrary to the original plan, the experiment did not include self-instruction, although feedback from the experimenter to the trainee was included.

  17. An Automated and Continuous Plant Weight Measurement System for Plant Factory

    PubMed Central

    Chen, Wei-Tai; Yeh, Yu-Hui F.; Liu, Ting-Yu; Lin, Ta-Te

    2016-01-01

    In plant factories, plants are usually cultivated in nutrient solution under a controllable environment, and plant quality and growth are closely monitored and precisely controlled. For evaluating plant growth, plant weight is an important and commonly used indicator, but traditional plant weight measurements are destructive and laborious. In order to measure and record plant weight during growth, an automated measurement system was designed and developed herein. It comprises a weight measurement device and an imaging system. The weight measurement device consists of a top disk, a bottom disk, a plant holder and a load cell. The load cell, with a resolution of 0.1 g, converts the plant weight on the plant holder disk to an analog electrical signal for precise measurement. The top and bottom disks are designed to accommodate different plant sizes, so plant weight can be measured continuously throughout the whole growth period without hindering plant growth. The results show that plant weights measured by the weight measurement device are highly correlated with the weights estimated by the stereo-vision imaging system; hence, plant weight can be measured by either method. The weight growth of selected vegetables growing in the National Taiwan University plant factory was monitored and measured using our automated plant growth weight measurement system. The experimental results demonstrate the functionality, stability and durability of this system. The information gathered by this system can be valuable for hydroponic plant monitoring and agricultural research applications. PMID:27066040
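
    The load-cell conversion step described above — turning a digitised sensor reading into a weight quantised to the cell's 0.1 g resolution — can be sketched minimally; the calibration constants below are hypothetical, since the record does not specify the device's signal chain:

    ```python
    def counts_to_grams(adc_counts, offset_counts, counts_per_gram, resolution_g=0.1):
        """Convert a load-cell ADC reading to grams.

        offset_counts is the tare (empty plant holder) reading and counts_per_gram
        the calibration slope; both are illustrative assumptions. The result is
        quantised to the cell's stated 0.1 g resolution.
        """
        grams = (adc_counts - offset_counts) / counts_per_gram
        return round(grams / resolution_g) * resolution_g
    ```

    Logging this value at fixed intervals would yield the continuous weight-growth curve that the system records over the growth period.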

  18. An Automated and Continuous Plant Weight Measurement System for Plant Factory.

    PubMed

    Chen, Wei-Tai; Yeh, Yu-Hui F; Liu, Ting-Yu; Lin, Ta-Te

    2016-01-01

    In plant factories, plants are usually cultivated in nutrient solution under a controllable environment, and plant quality and growth are closely monitored and precisely controlled. For evaluating plant growth, plant weight is an important and commonly used indicator, but traditional plant weight measurements are destructive and laborious. In order to measure and record plant weight during growth, an automated measurement system was designed and developed herein. It comprises a weight measurement device and an imaging system. The weight measurement device consists of a top disk, a bottom disk, a plant holder and a load cell. The load cell, with a resolution of 0.1 g, converts the plant weight on the plant holder disk to an analog electrical signal for precise measurement. The top and bottom disks are designed to accommodate different plant sizes, so plant weight can be measured continuously throughout the whole growth period without hindering plant growth. The results show that plant weights measured by the weight measurement device are highly correlated with the weights estimated by the stereo-vision imaging system; hence, plant weight can be measured by either method. The weight growth of selected vegetables growing in the National Taiwan University plant factory was monitored and measured using our automated plant growth weight measurement system. The experimental results demonstrate the functionality, stability and durability of this system. The information gathered by this system can be valuable for hydroponic plant monitoring and agricultural research applications.

  19. Improvement of an automated protein crystal exchange system PAM for high-throughput data collection

    PubMed Central

    Hiraki, Masahiko; Yamada, Yusuke; Chavas, Leonard M. G.; Wakatsuki, Soichi; Matsugaki, Naohiro

    2013-01-01

    Photon Factory Automated Mounting system (PAM) protein crystal exchange systems are available at the following Photon Factory macromolecular beamlines: BL-1A, BL-5A, BL-17A, AR-NW12A and AR-NE3A. The beamline AR-NE3A has been constructed for high-throughput macromolecular crystallography and is dedicated to structure-based drug design. The PAM liquid-nitrogen Dewar can store a maximum of three SSRL cassettes, so users have to interrupt their experiments and replace the cassettes when using four or more of them during their beam time. An investigation of beamline usage showed that four or more cassettes were used at AR-NE3A alone. For continuous automated data collection, the liquid-nitrogen Dewar for the AR-NE3A PAM was therefore enlarged, doubling its capacity. Calibration experiments with the new Dewar and cassette stand were performed repeatedly; compared with the current system, the parameters of the new system are shown to be stable. PMID:24121334

  20. Effects of Levels of Automation for Advanced Small Modular Reactors: Impacts on Performance, Workload, and Situation Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johanna Oxstrand; Katya Le Blanc

    The Human-Automation Collaboration (HAC) research effort is part of the Department of Energy (DOE) sponsored Advanced Small Modular Reactor (AdvSMR) program conducted at Idaho National Laboratory (INL). The DOE AdvSMR program focuses on plant design and management, reduction of capital costs as well as plant operations and maintenance (O&M) costs, and the cost benefits of factory production.

  1. Research on the ITOC based scheduling system for ship piping production

    NASA Astrophysics Data System (ADS)

    Li, Rui; Liu, Yu-Jun; Hamada, Kunihiro

    2010-12-01

    Manufacturing of ship piping systems is one of the major production activities in shipbuilding, and the schedule of pipe production has an important impact on the master schedule of shipbuilding. In this research, the ITOC concept was introduced to solve the scheduling problems of a piping factory, and an intelligent scheduling system was developed. The system, which integrates a product model, an operation model, a factory model, and a knowledge database of piping production, automates process planning and production scheduling; these points are discussed in detail. Moreover, an application of the system in a piping factory is demonstrated, achieving a higher level of performance as measured by tardiness, lead time, and inventory.

  2. Methodology on Investigating the Influences of Automated Material Handling System in Automotive Assembly Process

    NASA Astrophysics Data System (ADS)

    Saffar, Seha; Azni Jafar, Fairul; Jamaludin, Zamberi

    2016-02-01

    A case study was selected as the method for collecting data on the actual industry situation. The study aims to assess the influence of an automated material handling system in the automotive industry by proposing a new integrated system design through simulation and analysing its significant effects. The modelling tools are the CAD-based simulation packages Delmia and Quest. In phase 1, preliminary data gathering collects all relevant data from the actual industry setting; this is expected to produce guidelines and constraints for designing the new integrated system. In phase 2, a design concept is developed using ten principles of design for manufacturing. A full factorial design of experiments is then used to compare the measured performance of the integrated system with that of the current system in the case study. Finally, an ANOVA of the experimental results is performed on the measured performance, so that the influence of the improvements made to the system can be identified.
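
    The full-factorial design and ANOVA steps outlined above can be sketched generically; the factor names and levels below are hypothetical, and a one-way F statistic stands in for whatever ANOVA the study would actually run:

    ```python
    from itertools import product

    def full_factorial(factors):
        """Enumerate every combination of factor levels (a full-factorial design).
        `factors` maps factor name -> list of levels; names and levels here are
        illustrative, not the study's."""
        names = list(factors)
        return [dict(zip(names, levels)) for levels in product(*factors.values())]

    def one_way_anova_f(groups):
        """F statistic for a one-way ANOVA over lists of response measurements."""
        all_obs = [x for g in groups for x in g]
        grand = sum(all_obs) / len(all_obs)
        k, n = len(groups), len(all_obs)
        # Between-group and within-group sums of squares.
        ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
        ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
        ms_between = ss_between / (k - 1)
        ms_within = ss_within / (n - k)
        return ms_between / ms_within
    ```

    For example, `full_factorial({"conveyor_speed": [0.5, 1.0], "buffer_size": [2, 4]})` yields the four runs of a 2×2 design; a large F over the per-treatment throughput measurements would flag a significant factor effect.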

  3. Parametric analysis of plastic strain and force distribution in single pass metal spinning

    NASA Astrophysics Data System (ADS)

    Choudhary, Shashank; Tejesh, Chiruvolu Mohan; Regalla, Srinivasa Prakash; Suresh, Kurra

    2013-12-01

    Metal spinning, also known as spin forming, is a sheet metal working process by which an axisymmetric part can be formed from a flat sheet metal blank. Parts are produced by pressing a blunt-edged tool or roller onto the blank, which in turn is mounted on a rotating mandrel. This paper discusses the setup of a 3-D finite element simulation of single-pass metal spinning in LS-DYNA. Four parameters were considered, namely blank thickness, roller nose radius, feed ratio and mandrel speed, and the variation in forces and plastic strain was analysed using a full-factorial design of experiments (DOE) of simulation runs. For some of these DOE runs, physical experiments on extra deep drawing (EDD) sheet metal were carried out on a lathe using an En31 tool. The simulation results predict the zone of unsafe thinning in the sheet and the high forming forces, which point to the need for less expensive, semi-automated machine tools to help the household and small-scale spinning workers widely prevalent in India.

  4. Automated array assembly

    NASA Technical Reports Server (NTRS)

    Daiello, R. V.

    1977-01-01

    A general technology assessment and manufacturing cost analysis is presented. A near-term (1982) factory design is described, and the results of an experimental production study for the large-scale production of flat-panel silicon solar-cell arrays are detailed.

  5. Optimizing ELISAs for precision and robustness using laboratory automation and statistical design of experiments.

    PubMed

    Joelsson, Daniel; Moravec, Phil; Troutman, Matthew; Pigeon, Joseph; DePhillips, Pete

    2008-08-20

    Transferring manual ELISAs to automated platforms requires optimizing the assays for each particular robotic platform. These optimization experiments are often time consuming and difficult to perform using a traditional one-factor-at-a-time strategy. In this manuscript we describe the development of an automated process using statistical design of experiments (DOE) to quickly optimize immunoassays for precision and robustness on the Tecan EVO liquid handler. By using fractional factorials and a split-plot design, five incubation time variables and four reagent concentration variables can be optimized in a short period of time.

  6. Asleep at the automated wheel-Sleepiness and fatigue during highly automated driving.

    PubMed

    Vogelpohl, Tobias; Kühn, Matthias; Hummel, Thomas; Vollrath, Mark

    2018-03-20

    Due to the lack of active involvement in the driving situation and due to monotonous driving environments, drivers with automation may be prone to becoming fatigued faster than manual drivers (e.g. Schömig et al., 2015). However, little is known about the progression of fatigue during automated driving and its effects on the ability to take back manual control after a take-over request. In this driving simulator study with N = 60 drivers we used a three-factorial 2 × 2 × 12 mixed design to analyze the progression (12 × 5 min; within subjects) of driver fatigue in drivers with automation compared to manual drivers (between subjects). Driver fatigue was induced as either mainly sleep-related or mainly task-related fatigue (between subjects). Additionally, we investigated the drivers' reactions to a take-over request in a critical driving scenario to gain insights into the ability of fatigued drivers to regain manual control and situation awareness after automated driving. Drivers in the automated driving condition exhibited facial indicators of fatigue after 15 to 35 min of driving. Manual drivers only showed similar indicators of fatigue if they suffered from a lack of sleep, and then only after a longer period of driving (approx. 40 min). Several drivers in the automated condition closed their eyes for extended periods of time. In the automated driving condition, mean automation deactivation times after a take-over request were slower for a certain percentage (about 30%) of the drivers with a lack of sleep (M = 3.2 s; SD = 2.1 s) compared to the reaction times after a long drive (M = 2.4 s; SD = 0.9 s). Drivers with automation also took longer than manual drivers to first glance at the speed display after a take-over request and were more likely to stay behind a braking lead vehicle instead of overtaking it. Drivers are unable to stay alert during extended periods of automated driving without non-driving related tasks. Fatigued drivers could pose a serious hazard in complex take-over situations where situation awareness is required to prepare for threats. Driver fatigue monitoring or controllable distraction through non-driving tasks could be necessary to ensure alertness and availability during highly automated driving. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Virtual Collaborative Simulation Environment for Integrated Product and Process Development

    NASA Technical Reports Server (NTRS)

    Gulli, Michael A.

    1997-01-01

    Deneb Robotics is a leader in the development of commercially available, leading-edge three-dimensional simulation software tools for virtual prototyping, simulation-based design, manufacturing process simulation, and factory floor simulation and training applications. Deneb has developed and commercially released a preliminary Virtual Collaborative Engineering (VCE) capability for Integrated Product and Process Development (IPPD). This capability allows distributed, real-time visualization and evaluation of design concepts, manufacturing processes, and entire factories and enterprises in one seamless simulation environment.

  8. An intelligent CNC machine control system architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, D.J.; Loucks, C.S.

    1996-10-01

    Intelligent, agile manufacturing relies on automated programming of digitally controlled processes. Currently, processes such as Computer Numerically Controlled (CNC) machining are difficult to automate because of highly restrictive controllers and poor software environments. It is also difficult to utilize sensors and process models for adaptive control, or to integrate machining processes with other tasks within a factory floor setting. As part of a Laboratory Directed Research and Development (LDRD) program, a CNC machine control system architecture based on object-oriented design and graphical programming has been developed to address some of these problems and to demonstrate automated agile machining applications using platform-independent software.

  9. Engineering biological systems using automated biofoundries

    PubMed Central

    Chao, Ran; Mishra, Shekhar; Si, Tong; Zhao, Huimin

    2017-01-01

    Engineered biological systems such as genetic circuits and microbial cell factories have promised to solve many challenges in the modern society. However, the artisanal processes of research and development are slow, expensive, and inconsistent, representing a major obstacle in biotechnology and bioengineering. In recent years, biological foundries or biofoundries have been developed to automate design-build-test engineering cycles in an effort to accelerate these processes. This review summarizes the enabling technologies for such biofoundries as well as their early successes and remaining challenges. PMID:28602523

  10. Middle Level Learning Number 48

    ERIC Educational Resources Information Center

    Freedman, Eric B.; Roberts, Scott L.

    2013-01-01

    Two articles are presented in this column. "Life in an Auto Factory: Simulating how Labor and Management Interact" by Eric B. Freedman describes a classroom simulation of management and labor relations in an auto factory. Classroom handouts are included. The next article, "Women of Action and County Names: Mary Musgrove County--Why…

  11. MapFactory - Towards a mapping design pattern for big geospatial data

    NASA Astrophysics Data System (ADS)

    Rautenbach, Victoria; Coetzee, Serena

    2018-05-01

    With big geospatial data emerging, cartographers and geographic information scientists have to find new ways of dealing with the volume, variety, velocity, and veracity (4Vs) of the data. This requires the development of tools that allow processing, filtering, analysing, and visualising of big data through multidisciplinary collaboration. In this paper, we present the MapFactory design pattern that will be used for the creation of different maps according to the (input) design specification for big geospatial data. The design specification is based on elements from ISO19115-1:2014 Geographic information - Metadata - Part 1: Fundamentals that would guide the design and development of the map or set of maps to be produced. The results of the exploratory research suggest that the MapFactory design pattern will help with software reuse and communication. The MapFactory design pattern will aid software developers to build the tools that are required to automate map making with big geospatial data. The resulting maps would assist cartographers and others to make sense of big geospatial data.
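
    The factory design pattern that MapFactory builds on — a creator that returns different concrete map products through a common interface, driven by the input design specification — can be sketched generically. The class names and map types below are illustrative only, not the paper's API:

    ```python
    from abc import ABC, abstractmethod

    class Map(ABC):
        """Common interface every concrete map product implements."""
        @abstractmethod
        def render(self) -> str: ...

    class ChoroplethMap(Map):
        def render(self) -> str:
            return "choropleth"

    class HeatMap(Map):
        def render(self) -> str:
            return "heatmap"

    class MapFactory:
        """Creates a concrete Map from a metadata-driven design specification.
        The spec dict stands in for the ISO 19115-based specification."""
        _registry = {"choropleth": ChoroplethMap, "heatmap": HeatMap}

        @classmethod
        def create(cls, spec: dict) -> Map:
            map_type = spec.get("map_type")
            try:
                return cls._registry[map_type]()
            except KeyError:
                raise ValueError(f"unsupported map type: {map_type}")
    ```

    Callers depend only on the `Map` interface, so new map types can be registered without touching client code — the reuse benefit the abstract attributes to the pattern.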

  12. A Data Envelopment Analysis Model for Selecting Material Handling System Designs

    NASA Astrophysics Data System (ADS)

    Liu, Fuh-Hwa Franklin; Kuo, Wan-Ting

    The material handling system under design is an unmanned job shop with an automated guided vehicle that transports loads among the processing machines. The engineering task is to select among design alternatives that are combinations of four design factors: the ratio of production time to transportation time, the mean job arrival rate to the system, the input/output buffer capacities at each processing machine, and the vehicle control strategies. Each design alternative is simulated to collect the upper and lower bounds of five performance indices. We develop a Data Envelopment Analysis (DEA) model to assess the 180 designs with imprecise data on the five indices. A three-way factorial experiment analysis of the assessment results indicates that buffer capacity, and the interaction of job arrival rate and buffer capacity, affect performance significantly.

  13. Preparing Students for "The End of Work".

    ERIC Educational Resources Information Center

    Rifkin, Jeremy

    1997-01-01

    With workerless factories, virtual companies, and shrinking governments becoming reality, nations will be hard-pressed to employ millions of "surplus" young people in an increasingly automated global economy. An elitist knowledge sector cannot accommodate enough displaced workers. To advance the goals of civil education, educators must…

  14. The Plight of Manufacturing: What Can Be Done?

    ERIC Educational Resources Information Center

    Cyert, Richard M.

    1985-01-01

    Proposes that full automation is the best current option for the United States' manufacturing industries. Advocates increased use of electronics, robotics, and computers in the establishment of unmanned factories. Implications of this movement are examined in terms of labor, management, and the structure of the economy. (ML)

  15. Robotics: Past, Present, and Future.

    ERIC Educational Resources Information Center

    Dunne, Maurice J.

    Robots are finally receiving wide-spread attention as a means to realize the goal of automating factories. In the 1960's robot use was limited by unfavorable acquisition and operating costs and the affordable control technology limiting applications to relatively simple jobs. During the 1970's productivity of manufacturing organizations declined…

  16. The RAVEN Toolbox and Its Use for Generating a Genome-scale Metabolic Model for Penicillium chrysogenum

    PubMed Central

    Agren, Rasmus; Liu, Liming; Shoaie, Saeed; Vongsangnak, Wanwipa; Nookaew, Intawat; Nielsen, Jens

    2013-01-01

    We present the RAVEN (Reconstruction, Analysis and Visualization of Metabolic Networks) Toolbox: a software suite that allows for semi-automated reconstruction of genome-scale metabolic models. It makes use of published models and/or the KEGG database, coupled with extensive gap-filling and quality control features. The suite also contains methods for visualizing simulation results and omics data, as well as a range of methods for performing simulations and analyzing the results. The software is a useful tool for system-wide data analysis in a metabolic context and for streamlined reconstruction of metabolic networks based on protein homology. The RAVEN Toolbox workflow was applied to reconstruct a genome-scale metabolic model for the important microbial cell factory Penicillium chrysogenum Wisconsin 54-1255. The model was validated in a bibliomic study of 440 references in total, and it comprises 1471 unique biochemical reactions and 1006 ORFs. It was then used to study the roles of ATP and NADPH in the biosynthesis of penicillin, and to identify potential metabolic engineering targets for maximizing penicillin production. PMID:23555215

  17. Linking structural biology with genome research: Beamlines for the Berlin ``Protein Structure Factory'' initiative

    NASA Astrophysics Data System (ADS)

    Illing, Gerd; Saenger, Wolfram; Heinemann, Udo

    2000-06-01

    The Protein Structure Factory will be established to characterize proteins encoded by human genes or cDNAs, selected by criteria of potential structural novelty or of medical or biotechnological usefulness. It represents an integrative approach to structure analysis that combines bioinformatics techniques, automated gene expression and purification of gene products, generation of a biophysical fingerprint of the proteins, and determination of their three-dimensional structures by either NMR spectroscopy or X-ray diffraction. The use of synchrotron radiation will be crucial to the Protein Structure Factory: high brilliance and tunable wavelengths are prerequisites for fast data collection, the use of small crystals, and multiwavelength anomalous diffraction (MAD) phasing. With the opening of BESSY II, direct access to a third-generation XUV storage ring source with excellent conditions is available nearby. An insertion device with two MAD beamlines and one constant-energy station will be set up by 2001.

  18. Current status and future prospects of an automated sample exchange system PAM for protein crystallography

    NASA Astrophysics Data System (ADS)

    Hiraki, M.; Yamada, Y.; Chavas, L. M. G.; Matsugaki, N.; Igarashi, N.; Wakatsuki, S.

    2013-03-01

    To achieve fully automated and/or remote data collection in high-throughput X-ray experiments, the Structural Biology Research Centre at the Photon Factory (PF) has installed the PF automated mounting system (PAM) sample exchange robots at the PF macromolecular crystallography beamlines BL-1A, BL-5A, BL-17A, AR-NW12A and AR-NE3A. We are upgrading the experimental systems, including the PAM, for stable and efficient operation. To prevent human error in automated data collection, we installed a two-dimensional barcode reader for identification of the cassettes and sample pins. Because no liquid-nitrogen pipeline is installed in the PF experimental hutch, users commonly add liquid nitrogen from a small Dewar. To address this issue, an automated liquid-nitrogen filling system that links a 100-liter tank to the robot Dewar has been installed on the PF macromolecular beamline. Here we describe this new implementation, as well as future prospects.

  19. Visiting An "Egg Factory" on the Farm: A Resource Unit.

    ERIC Educational Resources Information Center

    Ediger, Marlow

    The resource unit indicates how elementary school teachers can use contemporary poultry farming to teach the concepts of change and specialization in American society and to show the effects of automation of American farms. The unit lists general objectives for students: to develop an understanding of farm specialization, especially in egg…

  20. Effective application of multiple locus variable number of tandem repeats analysis to tracing Staphylococcus aureus in food-processing environment.

    PubMed

    Rešková, Z; Koreňová, J; Kuchta, T

    2014-04-01

    A total of 256 isolates of Staphylococcus aureus were isolated from 98 samples (34 swabs and 64 food samples) obtained from small or medium meat- and cheese-processing plants in Slovakia. The strains were genotypically characterized by multiple locus variable number of tandem repeats analysis (MLVA), involving multiplex polymerase chain reaction (PCR) with subsequent separation of the amplified DNA fragments by an automated flow-through gel electrophoresis. With the panel of isolates, MLVA produced 31 profile types, which was a sufficient discrimination to facilitate the description of spatial and temporal aspects of contamination. Further data on MLVA discrimination were obtained by typing a subpanel of strains by multiple locus sequence typing (MLST). MLVA coupled to automated electrophoresis proved to be an effective, comparatively fast and inexpensive method for tracing S. aureus contamination of food-processing factories. Subspecies genotyping of microbial contaminants in food-processing factories may facilitate identification of spatial and temporal aspects of the contamination. This may help to properly manage the process hygiene. With S. aureus, multiple locus variable number of tandem repeats analysis (MLVA) proved to be an effective method for the purpose, being sufficiently discriminative, yet comparatively fast and inexpensive. The application of automated flow-through gel electrophoresis to separation of DNA fragments produced by multiplex PCR helped to improve the accuracy and speed of the method. © 2013 The Society for Applied Microbiology.
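
    An MLVA profile is essentially the tuple of tandem-repeat copy numbers observed at each VNTR locus, and isolates sharing the same tuple belong to the same profile type. A minimal sketch of that grouping step, with made-up isolate names and repeat counts (the study's 31 actual profile types are not reproduced here):

    ```python
    from collections import defaultdict

    def mlva_profiles(isolates):
        """Group isolates by their MLVA profile.

        `isolates` maps isolate name -> list of tandem-repeat copy numbers,
        one per VNTR locus (hypothetical data). Returns a dict mapping each
        distinct profile tuple to the isolates that share it.
        """
        groups = defaultdict(list)
        for name, repeat_counts in isolates.items():
            groups[tuple(repeat_counts)].append(name)
        return dict(groups)
    ```

    Tracking which profile types recur across sampling sites and dates is what lets such typing reveal the spatial and temporal structure of contamination.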

  1. @neurIST complex information processing toolchain for the integrated management of cerebral aneurysms

    PubMed Central

    Villa-Uriol, M. C.; Berti, G.; Hose, D. R.; Marzo, A.; Chiarini, A.; Penrose, J.; Pozo, J.; Schmidt, J. G.; Singh, P.; Lycett, R.; Larrabide, I.; Frangi, A. F.

    2011-01-01

    Cerebral aneurysms are a multi-factorial disease with severe consequences. A core part of the European project @neurIST was the physical characterization of aneurysms to find candidate risk factors associated with aneurysm rupture. The project investigated measures based on morphological, haemodynamic and aneurysm wall structure analyses for more than 300 cases of ruptured and unruptured aneurysms, extracting descriptors suitable for statistical studies. This paper deals with the unique challenges associated with this task, and the implemented solutions. The consistency of results required by the subsequent statistical analyses, given the heterogeneous image data sources and multiple human operators, was met by a highly automated toolchain combined with training. A testimonial of the successful automation is the positive evaluation of the toolchain by over 260 clinicians during various hands-on workshops. The specification of the analyses required thorough investigations of modelling and processing choices, discussed in a detailed analysis protocol. Finally, an abstract data model governing the management of the simulation-related data provides a framework for data provenance and supports future use of data and toolchain. This is achieved by enabling the easy modification of the modelling approaches and solution details through abstract problem descriptions, removing the need of repetition of manual processing work. PMID:22670202

  2. Virtual Planning, Control, and Machining for a Modular-Based Automated Factory Operation in an Augmented Reality Environment

    PubMed Central

    Pai, Yun Suen; Yap, Hwa Jen; Md Dawal, Siti Zawiah; Ramesh, S.; Phoon, Sin Ye

    2016-01-01

    This study presents a modular-based implementation of augmented reality to provide an immersive experience in learning or teaching the planning phase, control system, and machining parameters of a fully automated work cell. The architecture of the system consists of three code modules that can operate independently or combined to create a complete system that is able to guide engineers from the layout planning phase to the prototyping of the final product. The layout planning module determines the best possible arrangement in a layout for the placement of various machines, in this case a conveyor belt for transportation, a robot arm for pick-and-place operations, and a computer numerical control milling machine to generate the final prototype. The robotic arm module simulates the pick-and-place operation offline from the conveyor belt to a computer numerical control (CNC) machine utilising collision detection and inverse kinematics. Finally, the CNC module performs virtual machining based on the Uniform Space Decomposition method and axis aligned bounding box collision detection. The conducted case study revealed that given the situation, a semi-circle shaped arrangement is desirable, whereas the pick-and-place system and the final generated G-code produced the highest deviation of 3.83 mm and 5.8 mm respectively. PMID:27271840
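
    The axis-aligned bounding box (AABB) test used by the CNC module is a standard collision check: two boxes intersect exactly when their extents overlap on all three axes. A minimal sketch (not the paper's code):

    ```python
    from dataclasses import dataclass

    @dataclass
    class AABB:
        """Axis-aligned bounding box given by its min and max corners."""
        min_x: float
        min_y: float
        min_z: float
        max_x: float
        max_y: float
        max_z: float

    def aabb_overlap(a: AABB, b: AABB) -> bool:
        """Two AABBs collide iff their intervals overlap on every axis;
        a separation on any single axis rules out a collision."""
        return (a.min_x <= b.max_x and a.max_x >= b.min_x
                and a.min_y <= b.max_y and a.max_y >= b.min_y
                and a.min_z <= b.max_z and a.max_z >= b.min_z)
    ```

    Because a single separated axis suffices to reject a pair, the test is cheap enough to run every simulation frame between the tool, workpiece and machine geometry.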

  3. Virtual Planning, Control, and Machining for a Modular-Based Automated Factory Operation in an Augmented Reality Environment.

    PubMed

    Pai, Yun Suen; Yap, Hwa Jen; Md Dawal, Siti Zawiah; Ramesh, S; Phoon, Sin Ye

    2016-06-07

    This study presents a modular-based implementation of augmented reality to provide an immersive experience in learning or teaching the planning phase, control system, and machining parameters of a fully automated work cell. The architecture of the system consists of three code modules that can operate independently or combined to create a complete system that is able to guide engineers from the layout planning phase to the prototyping of the final product. The layout planning module determines the best possible arrangement in a layout for the placement of various machines, in this case a conveyor belt for transportation, a robot arm for pick-and-place operations, and a computer numerical control milling machine to generate the final prototype. The robotic arm module simulates the pick-and-place operation offline from the conveyor belt to a computer numerical control (CNC) machine utilising collision detection and inverse kinematics. Finally, the CNC module performs virtual machining based on the Uniform Space Decomposition method and axis aligned bounding box collision detection. The conducted case study revealed that given the situation, a semi-circle shaped arrangement is desirable, whereas the pick-and-place system and the final generated G-code produced the highest deviation of 3.83 mm and 5.8 mm respectively.

  4. Virtual Planning, Control, and Machining for a Modular-Based Automated Factory Operation in an Augmented Reality Environment

    NASA Astrophysics Data System (ADS)

    Pai, Yun Suen; Yap, Hwa Jen; Md Dawal, Siti Zawiah; Ramesh, S.; Phoon, Sin Ye

    2016-06-01

    This study presents a modular-based implementation of augmented reality to provide an immersive experience in learning or teaching the planning phase, control system, and machining parameters of a fully automated work cell. The architecture of the system consists of three code modules that can operate independently or be combined to create a complete system able to guide engineers from the layout planning phase to the prototyping of the final product. The layout planning module determines the best possible arrangement in a layout for the placement of various machines, in this case a conveyor belt for transportation, a robot arm for pick-and-place operations, and a computer numerical control milling machine to generate the final prototype. The robotic arm module simulates the pick-and-place operation offline from the conveyor belt to a computer numerical control (CNC) machine utilising collision detection and inverse kinematics. Finally, the CNC module performs virtual machining based on the Uniform Space Decomposition method and axis-aligned bounding box collision detection. The conducted case study revealed that, given the situation, a semi-circle shaped arrangement is desirable, whereas the pick-and-place system and the final generated G-code produced maximum deviations of 3.83 mm and 5.8 mm, respectively.
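Both the robotic-arm and CNC modules in this record rely on axis-aligned bounding box (AABB) collision detection. The paper gives no code; as a minimal illustrative sketch, the core test is a per-axis interval overlap (the `Box` type and the coordinates below are invented for the example, not taken from the paper):

```python
from dataclasses import dataclass

@dataclass
class Box:
    # Axis-aligned bounding box given by its min and max corners (x, y, z).
    lo: tuple
    hi: tuple

def aabb_overlap(a: Box, b: Box) -> bool:
    # Two AABBs collide iff their intervals overlap on every axis.
    return all(a.lo[i] <= b.hi[i] and b.lo[i] <= a.hi[i] for i in range(3))

# Hypothetical tool and workpiece volumes.
tool = Box((0, 0, 0), (1, 1, 1))
part = Box((0.5, 0.5, 0.5), (2, 2, 2))
clear = Box((3, 3, 3), (4, 4, 4))
```

Because each axis is tested independently, the check costs only six comparisons per pair of boxes, which is why AABBs are a common broad-phase filter before more expensive exact-geometry collision checks.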

  5. Rapid prototyping 3D virtual world interfaces within a virtual factory environment

    NASA Technical Reports Server (NTRS)

    Kosta, Charles Paul; Krolak, Patrick D.

    1993-01-01

    On-going work into user requirements analysis using CLIPS (NASA/JSC) expert systems as an intelligent event simulator has led to research into three-dimensional (3D) interfaces. Previous work involved CLIPS and two-dimensional (2D) models. Integral to this work was the development of the University of Massachusetts Lowell parallel version of CLIPS, called PCLIPS. This allowed us to create both a Software Bus and a group problem-solving environment for expert systems development. By shifting the PCLIPS paradigm to use the VEOS messaging protocol we have merged VEOS (HITL/Seattle) and CLIPS into a distributed virtual worlds prototyping environment (VCLIPS). VCLIPS uses the VEOS protocol layer to allow multiple experts to cooperate on a single problem. We have begun to look at the control of a virtual factory. In the virtual factory there are actors and objects as found in our Lincoln Logs Factory of the Future project. In this artificial reality architecture there are three VCLIPS entities in action. One entity is responsible for display and user events in the 3D virtual world. Another is responsible for either simulating the virtual factory or communicating with the real factory. The third is a user interface expert. The interface expert maps user input levels, within the current prototype, to control information for the factory. The interface to the virtual factory is based on a camera paradigm. The graphics subsystem generates camera views of the factory on standard X-Window displays. The camera allows for view control and object control. Control of the factory is accomplished by the user reaching into the camera views to perform object interactions. All communication between the separate CLIPS expert systems is done through VEOS.

  6. Engineering biological systems using automated biofoundries.

    PubMed

    Chao, Ran; Mishra, Shekhar; Si, Tong; Zhao, Huimin

    2017-07-01

    Engineered biological systems such as genetic circuits and microbial cell factories have promised to solve many challenges in modern society. However, the artisanal processes of research and development are slow, expensive, and inconsistent, representing a major obstacle in biotechnology and bioengineering. In recent years, biological foundries or biofoundries have been developed to automate design-build-test engineering cycles in an effort to accelerate these processes. This review summarizes the enabling technologies for such biofoundries as well as their early successes and remaining challenges. Copyright © 2017 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.

  7. Automated multiplex genome-scale engineering in yeast

    PubMed Central

    Si, Tong; Chao, Ran; Min, Yuhao; Wu, Yuying; Ren, Wen; Zhao, Huimin

    2017-01-01

    Genome-scale engineering is indispensable in understanding and engineering microorganisms, but the current tools are mainly limited to bacterial systems. Here we report an automated platform for multiplex genome-scale engineering in Saccharomyces cerevisiae, an important eukaryotic model and widely used microbial cell factory. Standardized genetic parts encoding overexpression and knockdown mutations of >90% yeast genes are created in a single step from a full-length cDNA library. With the aid of CRISPR-Cas, these genetic parts are iteratively integrated into the repetitive genomic sequences in a modular manner using robotic automation. This system allows functional mapping and multiplex optimization on a genome scale for diverse phenotypes including cellulase expression, isobutanol production, glycerol utilization and acetic acid tolerance, and may greatly accelerate future genome-scale engineering endeavours in yeast. PMID:28469255

  8. VizieR Online Data Catalog: Absorption velocities for 21 super-luminous SNe Ic (Liu+, 2017)

    NASA Astrophysics Data System (ADS)

    Liu, Y.-Q.; Modjaz, M.; Bianco, F. B.

    2018-04-01

    We have collected the spectra of all available super-luminous supernovae (SLSNe) Ic that have a date of maximum light published before April of 2016. These SLSNe Ic were mainly discovered and observed by the All-Sky Automated Survey for Supernovae (ASAS-SN), the Catalina Real-Time Transient Survey, the Dark Energy Survey (DES), the Hubble Space Telescope Cluster Supernova Survey, the Pan-STARRS1 Medium Deep Survey (PS1), the Public ESO Spectroscopic Survey of Transient Objects (PESSTO), the Intermediate Palomar Transient Factory (iPTF) as well as the Palomar Transient Factory (PTF), and the Supernova Legacy Survey (SNLS). See table 1. (2 data files).

  9. Pion Production for Neutrino Factory-challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Breton, Florian; Le Couedic, Clement; Soler, F. J. P.

    2011-10-06

    One of the key issues in the design of a Neutrino Factory target station is the determination of the optimum kinetic energy of the proton beam due to the large uncertainties in simulations of protons impinging on nuclear targets. In this paper we have developed a procedure to correct GEANT4 simulations for the HARP data, and we have determined the yield of muons expected at the front-end of a Neutrino Factory as a function of target material (Be, C, Al, Ta and Pb) and energy (3-12 GeV). The maximum muon yield is found between 5 and 8 GeV for high Z targets and 3 GeV for low Z targets.

  10. Impact of Simulation Technology on Die and Stamping Business

    NASA Astrophysics Data System (ADS)

    Stevens, Mark W.

    2005-08-01

    Over the last ten years, we have seen an explosion in the use of simulation-based techniques to improve the engineering, construction, and operation of GM production tools. The impact has been as profound as the overall switch to CAD/CAM from the old manual design and construction methods. The changeover to N/C machining from duplicating milling machines brought advances in accuracy and speed to our construction activity. It also brought significant reductions in fitting sculptured surfaces. Changing over to CAD design brought similar advances in accuracy, and today's use of solid modeling has enhanced that accuracy gain while finally leading to the reduction in lead time and cost through the development of parametric techniques. Elimination of paper drawings for die design, along with the process of blueprinting and distribution, provided the savings required to install high capacity computer servers, high-speed data transmission lines and integrated networks. These historic changes in the application of CAE technology in manufacturing engineering paved the way for the implementation of simulation to all aspects of our business. The benefits are being realized now, and the future holds even greater promise as the simulation techniques mature and expand. Every new line of dies is verified prior to casting for interference free operation. Sheet metal forming simulation validates the material flow, eliminating the high costs of physical experimentation dependent on trial and error methods of the past. Integrated forming simulation and die structural analysis and optimization has led to a reduction in die size and weight on the order of 30% or more. The latest techniques in factory simulation enable analysis of automated press lines, including all stamping operations with corresponding automation. This leads to manufacturing lines capable of running at higher levels of throughput, with actual results providing the capability of two or more additional strokes per minute. As we spread these simulation techniques to the balance of our business, from blank de-stacking to the racking of parts, we anticipate continued reduction in lead-time and engineering expense while improving quality and start-up execution. The author will provide an overview of technology and business evolution of the math-based process that brought an historical transition and revitalization to the die and stamping industry in the past decade. Finally, the author will give an outlook for future business needs and technology development directions.

  11. Putting Automated Visual Inspection Systems To Work On The Factory Floor: What's Missing?

    NASA Astrophysics Data System (ADS)

    Waltz, Frederick M.; Snyder, Michael A.; Batchelor, Bruce G.

    1990-02-01

    Machine vision systems and other automated visual inspection (AVI) systems have been proving their usefulness in factories for more than a decade. In spite of this, the number of installed systems is far below the number that could profitably be employed. In the opinion of the authors, the primary reason for this is the high cost of customizing vision systems to meet applications requirements. A three-part approach to this problem has proven to be useful: 1. A multi-phase paradigm for customer interaction, system specification, system development, and system installation; 2. A powerful and easy-to-use system development environment, including a. a flexible laboratory lighting setup, plus software-based tools to assist in the design of image acquisition systems, b. an image processing environment with a very large repertoire of image processing and feature extraction operations and an easy-to-use command interpreter having macro capabilities, and c. an image analysis environment with high-level constructs, a flexible and powerful syntax, and a "seamless" interface to the image processing level; and 3. A moderately-priced high-speed "target" system fully compatible with the development environment, so that algorithms developed thereon can be transferred directly to the factory environment without further development costs or reprogramming. Items 1 and 2 are covered in other papers [1,2,3,4,5] and are touched on here only briefly. Item 3 is the main subject of this paper. Our major motivation in presenting this paper is to offer suggestions to vendors developing commercial boards and systems, in hopes that the special needs of industrial inspection can be met.

  12. USSR Report, Chemistry.

    DTIC Science & Technology

    1985-01-17

    potassium oxides. Only then does the mixture react to form ammonia. A method for synthesizing ammonium from nitrogen and hydrogen, along with a...manufactured by this method, most of which is used in the synthesis of nitrogen fertilizers. A modern ammonia factory is a complex, highly automated...V. Karyakin; ZHURNAL ANALITICHESKOY KHIMII, No 8, Aug 84) 5 CATALYSTS Ammonia Synthesis and Homogenous Catalysts (O. Yefimov; LENINSKOYE ZNAMYA

  13. Optimization-based Approach to Cross-layer Resource Management in Wireless Networked Control Systems

    DTIC Science & Technology

    2013-05-01

    interest from both academia and industry [37], finding applications in unmanned robotic vehicles, automated highways and factories, smart homes and...is stable when the scaler varies slowly. The algorithm is further extended to utilize the slack resource in the network, which leads to the...model; Optimal sampling rate allocation formulation; Price-based algorithm

  14. A Single RF Emitter-Based Indoor Navigation Method for Autonomous Service Robots.

    PubMed

    Sherwin, Tyrone; Easte, Mikala; Chen, Andrew Tzer-Yeu; Wang, Kevin I-Kai; Dai, Wenbin

    2018-02-14

    Location-aware services are one of the key elements of modern intelligent applications. Numerous real-world applications such as factory automation, indoor delivery, and even search and rescue scenarios require autonomous robots to have the ability to navigate in an unknown environment and reach mobile targets with minimal or no prior infrastructure deployment. This research investigates and proposes a novel approach of dynamic target localisation using a single RF emitter, which will be used as the basis of allowing autonomous robots to navigate towards and reach a target. Through the use of multiple directional antennae, Received Signal Strength (RSS) is compared to determine the most probable direction of the targeted emitter, which is combined with the distance estimates to improve the localisation performance. The accuracy of the position estimate is further improved using a particle filter to mitigate the fluctuating nature of real-time RSS data. Based on the direction information, a motion control algorithm is proposed, using Simultaneous Localisation and Mapping (SLAM) and A* path planning to enable navigation through unknown complex environments. A number of navigation scenarios were developed in the context of factory automation applications to demonstrate and evaluate the functionality and performance of the proposed system.

  15. A Single RF Emitter-Based Indoor Navigation Method for Autonomous Service Robots

    PubMed Central

    Sherwin, Tyrone; Easte, Mikala; Wang, Kevin I-Kai; Dai, Wenbin

    2018-01-01

    Location-aware services are one of the key elements of modern intelligent applications. Numerous real-world applications such as factory automation, indoor delivery, and even search and rescue scenarios require autonomous robots to have the ability to navigate in an unknown environment and reach mobile targets with minimal or no prior infrastructure deployment. This research investigates and proposes a novel approach of dynamic target localisation using a single RF emitter, which will be used as the basis of allowing autonomous robots to navigate towards and reach a target. Through the use of multiple directional antennae, Received Signal Strength (RSS) is compared to determine the most probable direction of the targeted emitter, which is combined with the distance estimates to improve the localisation performance. The accuracy of the position estimate is further improved using a particle filter to mitigate the fluctuating nature of real-time RSS data. Based on the direction information, a motion control algorithm is proposed, using Simultaneous Localisation and Mapping (SLAM) and A* path planning to enable navigation through unknown complex environments. A number of navigation scenarios were developed in the context of factory automation applications to demonstrate and evaluate the functionality and performance of the proposed system. PMID:29443906
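The direction-finding step described in these two records compares RSS readings across multiple directional antennae to pick the most probable bearing of the emitter, before a particle filter smooths the noisy estimate. A minimal sketch of the antenna-comparison step (the antenna bearings, readings, and the power-weighted circular mean are illustrative assumptions, not the authors' exact algorithm):

```python
import math

def most_probable_direction(bearings_deg, rss_dbm):
    """Estimate the emitter bearing from directional-antenna RSS readings
    via a power-weighted circular mean. Returns degrees in [0, 360)."""
    # Convert dBm to linear power so the weighting is physically meaningful.
    weights = [10 ** (r / 10.0) for r in rss_dbm]
    sx = sum(w * math.cos(math.radians(b)) for b, w in zip(bearings_deg, weights))
    sy = sum(w * math.sin(math.radians(b)) for b, w in zip(bearings_deg, weights))
    return math.degrees(math.atan2(sy, sx)) % 360

# Four directional antennae at 0/90/180/270 degrees; the strongest reading
# (-55 dBm) is on the 90-degree antenna, so the estimate lands near 90.
est = most_probable_direction([0, 90, 180, 270], [-70, -55, -80, -75])
```

The circular mean avoids the 359°/1° wrap-around problem that a plain arithmetic average of bearings would have, and the dominant antenna still controls the result because linear power differences are large.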

  16. Automation Applications in an Advanced Air Traffic Management System : Volume 5B. DELTA Simulation Model - Programmer's Guide.

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 5 describes the DELTA Simulation Model. It includes all documentation of the DELTA (Determine Effective Levels of Task Automation) computer simulation developed by TRW for use in the Automation Applications Study. Volume 5A includes a user's m...

  17. Transition to manual: Comparing simulator with on-road control transitions.

    PubMed

    Eriksson, A; Banks, V A; Stanton, N A

    2017-05-01

    Whilst previous research has explored how driver behaviour in simulators may transfer to the open road, there has been relatively little research showing the same transfer within the field of driving automation. As a consequence, most research into human-automation interaction has primarily been carried out in a research laboratory or on closed-circuit test tracks. The aim of this study was to assess whether research into non-critical control transitions in highly automated vehicles performed in driving simulators correlates with road driving conditions. Twenty six drivers drove a highway scenario using an automated driving mode in the simulator and twelve drivers drove on a public motorway in a Tesla Model S with the Autopilot activated. Drivers were asked to relinquish, or resume control from the automation when prompted by the vehicle interface in both the simulator and on-road conditions. Drivers were generally faster to resume control in the on-road driving condition. However, strong positive correlations were found between the simulator and on-road driving conditions for drivers transferring control to and from automation. No significant differences were found with regard to workload, perceived usefulness and satisfaction between the simulator and on-road drives. The results indicate high levels of relative validity of driving simulators as a research tool for automated driving research. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. The Clone Factory

    ERIC Educational Resources Information Center

    Stoddard, Beryl

    2005-01-01

    Have humans been cloned? Is it possible? Immediate interest is sparked when students are asked these questions. In response to their curiosity, the clone factory activity was developed to help them understand the process of cloning. In this activity, students reenact the cloning process, in a very simplified simulation. After completing the…

  19. Education as Simulation Game: A Critical Hermeneutic.

    ERIC Educational Resources Information Center

    Palermo, James

    1979-01-01

    This paper examines a specific educational game called "Popcorn Factory." First, it gives a detailed description of the game, then shifts the description into a critical hermeneutical framework, analyzing the deep structures at work in the "Popcorn Factory" according to the theories of Freud and Marcuse. (Author/SJL)

  20. Aft segment dome-to-stiffener factory joint insulation void elimination

    NASA Technical Reports Server (NTRS)

    Jensen, S. K.

    1991-01-01

    Since the detection of voids in the internal insulation of the dome-to-stiffener factory joint of the 15B aft segment, all aft segment dome-to-stiffener factory joints were x-rayed and all were found to contain voids. Using a full-scale process simulation article (PSA), the objective was to demonstrate that the proposed changes in the insulation layup and vacuum bagging processes will greatly reduce or eliminate voids without adversely affecting the configuration or performance of the insulation which serves as a primary seal over the factory joint. The PSA-8 aft segment was insulated and cured using standard production processes.

  1. Improvement of productivity in low volume production industry layout by using witness simulation software

    NASA Astrophysics Data System (ADS)

    Jaffrey, V.; Mohamed, N. M. Z. N.; Rose, A. N. M.

    2017-10-01

    In almost all manufacturing industries, increased productivity and better efficiency of the production line are the most important goals. Most factories, especially small-scale ones, have little awareness of manufacturing system optimization and rely on traditional management methods. Problems commonly identified in such factories are high labour idle time and low production output. This study was done in a Small and Medium Enterprises (SME) low-volume production company. Data were collected and problems affecting productivity and efficiency were identified. In this study, Witness simulation software is used to simulate the layout, with the output focusing on improvement of the layout in terms of productivity and efficiency. The layout is rearranged by reducing the travel time from one workstation to another. The improved layout is then modelled, and machine and labour statistics for both the original and improved layouts are taken. Productivity and efficiency are calculated for both layouts and then compared.
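The productivity and efficiency measures compared in this study are standard ratios of output to labour time and of busy time to available time. A minimal sketch of the before/after comparison (the throughput and time figures are invented for illustration, not taken from the paper):

```python
def productivity(units_out, labour_hours):
    # Units produced per labour hour.
    return units_out / labour_hours

def utilisation(busy_time, total_time):
    # Fraction of available time a machine or worker is actually busy.
    return busy_time / total_time

# Hypothetical figures from simulation runs of the original and improved layouts.
before = productivity(120, 40)              # 3.0 units per labour hour
after = productivity(150, 40)               # 3.75 units per labour hour
gain_pct = 100 * (after - before) / before  # 25% productivity gain
```

Simulation packages such as Witness report the underlying busy/idle statistics directly; the value of the model is in predicting these ratios for a candidate layout before any physical rearrangement is made.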

  2. iPTF report of bright transients

    NASA Astrophysics Data System (ADS)

    Cannella, Chris; Kuesters, Daniel; Ferretti, Raphael; Blagorodnova, Nadejda; Adams, Scott; Kupfer, Thomas; Neill, James D.; Walters, Richard; Yan, Lin; Kulkarni, Shri

    2017-02-01

    The intermediate Palomar Transient Factory (iPTF; ATel #4807) reports the following bright transients. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R), and RB5 (Wozniak et al. 2013AAS...22143105W).

  3. Korean Affairs Report No. 286

    DTIC Science & Technology

    1983-06-08

    will attempt to destroy factories and industrial establishments in Pusan, Masan, Taegu, Ulsan, Pohang, Kwangju and other major cities, it said... industry is expected to be intensified in the months to come with the Daewoo Electronic Co developing many new models since it took over the electric...The projected optical fiber communications systems will mark a milestone in the annals of the Korean electronic industry because automation in the

  4. Muon Accelerator Program (MAP) | Neutrino Factory | Research Goals

    Science.gov Websites

    Navigation listing from the Muon Accelerator Program (MAP) website covering Neutrino Factory research goals: research and development, design and simulation, technology development, and systems demonstrations. Related activities include MASS, muon cooling, the MuCool Test Area, the MICE experiment, MERIT, and Muon Collider research goals, including why muons are of interest at the energy frontier and how a Neutrino Factory works.

  5. Virtual Factory Framework for Supporting Production Planning and Control.

    PubMed

    Kibira, Deogratias; Shao, Guodong

    2017-01-01

    Developing optimal production plans for smart manufacturing systems is challenging because shop floor events change dynamically. A virtual factory incorporating engineering tools, simulation, and optimization generates and communicates performance data to guide wise decision making for different control levels. This paper describes such a platform specifically for production planning. We also discuss verification and validation of the constituent models. A case study of a machine shop is used to demonstrate data generation for production planning in a virtual factory.

  6. DNA synthesis security.

    PubMed

    Nouri, Ali; Chyba, Christopher F

    2012-01-01

    It is generally assumed that genetic engineering advances will, inevitably, facilitate the misapplication of biotechnology toward the production of biological weapons. Unexpectedly, however, some of these very advances in the areas of DNA synthesis and sequencing may enable the implementation of automated and nonintrusive safeguards to avert the illicit applications of biotechnology. In the case of DNA synthesis, automated DNA screening tools could be built into DNA synthesizers in order to block the synthesis of hazardous agents. In addition, a comprehensive safety and security regime for dual-use genetic engineering research could include nonintrusive monitoring of DNA sequencing. This is increasingly feasible as laboratories outsource this service to just a few centralized sequencing factories. The adoption of automated, nonintrusive monitoring and surveillance of the DNA synthesis and sequencing pipelines may avert many risks associated with dual-use biotechnology. Here, we describe the historical background and current challenges associated with dual-use biotechnologies and propose strategies to address these challenges.
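The automated screening the authors propose amounts to checking each synthesis order against a database of sequences of concern before the synthesizer proceeds. A deliberately simplified sketch (exact substring matching only; real screening tools use homology search against curated databases, and the hazard entries below are placeholders, not real sequences of concern):

```python
HAZARD_DB = {
    # Placeholder entries; a real database would hold curated sequences of concern.
    "toxin_fragment_A": "ATGGCCTTTAAACGG",
    "toxin_fragment_B": "GGGTTTCCCAAATTT",
}

def screen_order(sequence: str, db=HAZARD_DB):
    """Return the names of hazard entries found in an ordered sequence.
    Case-insensitive exact substring match; homology search is out of scope."""
    seq = sequence.upper()
    return [name for name, frag in db.items() if frag in seq]

# An order embedding a flagged fragment is caught before synthesis.
hits = screen_order("ccccATGGCCTTTAAACGGcccc")
```

In the regime the authors envisage, a check like this would run inside the DNA synthesizer itself, blocking the run (or flagging it for review) whenever `hits` is non-empty.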

  7. Development and Validation of an Automated Simulation Capability in Support of Integrated Demand Management

    NASA Technical Reports Server (NTRS)

    Arneson, Heather; Evans, Antony D.; Li, Jinhua; Wei, Mei Yueh

    2017-01-01

    Integrated Demand Management (IDM) is a near- to mid-term NASA concept that proposes to address mismatches in air traffic system demand and capacity by using strategic flow management capabilities to pre-condition demand into the more tactical Time-Based Flow Management System (TBFM). This paper describes an automated simulation capability to support IDM concept development. The capability closely mimics existing human-in-the-loop (HITL) capabilities, automating both the human components and collaboration between operational systems, and speeding up the real-time aircraft simulations. Such a capability allows for parametric studies that will inform the HITL simulations, identifying breaking points and parameter values at which significant changes in system behavior occur. This paper also describes the initial validation of individual components of the automated simulation capability, and an example application comparing the performance of the IDM concept under two TBFM scheduling paradigms. The results and conclusions from this simulation compare closely to those from previous HITL simulations using similar scenarios, providing an initial validation of the automated simulation capability.

  8. An ODE-Based Wall Model for Turbulent Flow Simulations

    NASA Technical Reports Server (NTRS)

    Berger, Marsha J.; Aftosmis, Michael J.

    2017-01-01

    Fully automated meshing for Reynolds-Averaged Navier-Stokes (RANS) simulations: mesh generation for complex geometry continues to be the biggest bottleneck in the RANS simulation process. Fully automated Cartesian methods are routinely used for inviscid simulations about arbitrarily complex geometry, but these methods lack an obvious and robust way to achieve near-wall anisotropy. The goal is to extend these methods for RANS simulation without sacrificing automation, at an affordable cost. Note: nothing here is limited to Cartesian methods, and much becomes simpler in a body-fitted setting.
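ODE-based wall models of this kind typically integrate a thin, constant-stress boundary-layer momentum equation in wall units with an algebraic eddy-viscosity closure. The sketch below uses a van Driest-damped mixing length, which is a common closure choice but an assumption here, not a detail taken from the abstract. With total stress normalized to one, (1 + nu_t+) du+/dy+ = 1 and nu_t+ = l+^2 du+/dy+ give du+/dy+ = 2 / (1 + sqrt(1 + 4 l+^2)):

```python
import math

def uplus_profile(yplus_max=100.0, n=10000, kappa=0.41, aplus=26.0):
    """March du+/dy+ = 2 / (1 + sqrt(1 + 4 l+^2)) from the wall outward,
    where l+ = kappa * y+ * (1 - exp(-y+/A+)) is a van Driest-damped
    mixing length. Returns u+ at y+ = yplus_max."""
    dy = yplus_max / n
    u, y = 0.0, 0.0
    for _ in range(n):
        ym = y + 0.5 * dy  # midpoint evaluation for better accuracy
        l = kappa * ym * (1.0 - math.exp(-ym / aplus))
        dudy = 2.0 / (1.0 + math.sqrt(1.0 + 4.0 * l * l))
        u += dudy * dy
        y += dy
    return u
```

Near the wall this recovers the viscous-sublayer behaviour u+ ≈ y+, and by y+ = 100 it approaches the log law u+ ≈ ln(y+)/0.41 + 5, which is the property a wall model needs to supply the outer RANS mesh with a wall-shear boundary condition.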

  9. CIM at GE's factory of the future

    NASA Astrophysics Data System (ADS)

    Waldman, H.

    Functional features of a highly automated aircraft component batch processing factory are described. The system has processing, working, and methodology components. A rotating parts operation installed 20 yr ago features a high density of numerically controlled machines, and is connected to a hierarchical network of data communications and apparatus for moving the rotating parts and tools of engines. Designs produced at one location in the country are sent by telephone link to other sites for development of manufacturing plans, tooling, numerical control programs, and process instructions for the rotating parts. Direct numerical control is implemented at the work stations, which have instructions stored on tape for back-up in case the host computer goes down. Each machine is automatically monitored at 48 points and notice of failure can originate from any point in the system.

  10. Impact of synthetic biology and metabolic engineering on industrial production of fine chemicals.

    PubMed

    Jullesson, David; David, Florian; Pfleger, Brian; Nielsen, Jens

    2015-11-15

    Industrial bio-processes for fine chemical production are increasingly relying on cell factories developed through metabolic engineering and synthetic biology. The use of high throughput techniques and automation for the design of cell factories, and especially platform strains, has played an important role in the transition from laboratory research to industrial production. Model organisms such as Saccharomyces cerevisiae and Escherichia coli remain widely used host strains for industrial production due to their robust and desirable traits. This review describes some of the bio-based fine chemicals that have reached the market, key metabolic engineering tools that have allowed this to happen and some of the companies that are currently utilizing these technologies for developing industrial production processes. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Development of sample exchange robot PAM-HC for beamline BL-1A at the photon factory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hiraki, Masahiko, E-mail: masahiko.hiraki@kek.jp; Department of Accelerator Science, SOKENDAI; Matsugaki, Naohiro

    A macromolecular crystallography beamline, BL-1A, has been built at the Photon Factory (PF) for low energy experiments and has been operational since 2010. We have installed a sample exchange robot, PAM (PF Automated Mounting system), similar to other macromolecular crystallography beamlines. However, following the installation of a helium chamber to reduce the absorption of the diffraction signal by air, we developed a new sample exchange robot to replace PAM. The new robot, named PAM-HC (Helium Chamber), is designed with the goal of minimizing leakage of helium gas from the chamber. Here, the PAM-HC hardware and the flow of its movement are described. Furthermore, measurements of temperature changes during sample exchange are presented in this paper.

  12. Mission simulation as an approach to develop requirements for automation in Advanced Life Support Systems

    NASA Technical Reports Server (NTRS)

    Erickson, J. D.; Eckelkamp, R. E.; Barta, D. J.; Dragg, J.; Henninger, D. L. (Principal Investigator)

    1996-01-01

    This paper examines mission simulation as an approach to develop requirements for automation and robotics for Advanced Life Support Systems (ALSS). The focus is on requirements and applications for command and control, control and monitoring, situation assessment and response, diagnosis and recovery, adaptive planning and scheduling, and other automation applications in addition to mechanized equipment and robotics applications to reduce the excessive human labor requirements to operate and maintain an ALSS. Based on principles of systems engineering, an approach is proposed to assess requirements for automation and robotics using mission simulation tools. First, the story of a simulated mission is defined in terms of processes with attendant types of resources needed, including options for use of automation and robotic systems. Next, systems dynamics models are used in simulation to reveal the implications for selected resource allocation schemes in terms of resources required to complete operational tasks. The simulations not only help establish ALSS design criteria, but also may offer guidance to ALSS research efforts by identifying gaps in knowledge about procedures and/or biophysical processes. Simulations of a planned one-year mission with 4 crewmembers in a Human Rated Test Facility are presented as an approach to evaluation of mission feasibility and definition of automation and robotics requirements.

  13. The study design elements employed by researchers in preclinical animal experiments from two research domains and implications for automation of systematic reviews.

    PubMed

    O'Connor, Annette M; Totton, Sarah C; Cullen, Jonah N; Ramezani, Mahmood; Kalivarapu, Vijay; Yuan, Chaohui; Gilbert, Stephen B

    2018-01-01

    Systematic reviews are increasingly using data from preclinical animal experiments in evidence networks. Further, there are ever-increasing efforts to automate aspects of the systematic review process. When assessing systematic bias and unit-of-analysis errors in preclinical experiments, it is critical to understand the study design elements employed by investigators. Such information can also inform prioritization of automation efforts that allow the identification of the most common issues. The aim of this study was to identify the design elements used by investigators in preclinical research in order to inform unique aspects of assessment of bias and error in preclinical research. Using 100 preclinical experiments each related to brain trauma and toxicology, we assessed design elements described by the investigators. We evaluated Methods and Materials sections of reports for descriptions of the following design elements: 1) use of comparison group, 2) unit of allocation of the interventions to study units, 3) arrangement of factors, 4) method of factor allocation to study units, 5) concealment of the factors during allocation and outcome assessment, 6) independence of study units, and 7) nature of factors. Many investigators reported using design elements that suggested the potential for unit-of-analysis errors, i.e., descriptions of repeated measurements of the outcome (94/200) and descriptions of potential for pseudo-replication (99/200). Use of complex factor arrangements was common, with 112 experiments using some form of factorial design (complete, incomplete or split-plot-like). In the toxicology dataset, 20 of the 100 experiments appeared to use a split-plot-like design, although no investigators used this term. The common use of repeated measures and factorial designs means understanding bias and error in preclinical experimental design might require greater expertise than simple parallel designs. 
Similarly, use of complex factor arrangements creates novel challenges for accurate automation of data extraction and bias and error assessment in preclinical experiments.
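A tally of coded design elements like the one reported (e.g., repeated measures in 94/200 experiments) is straightforward to automate; the record schema and sample data below are hypothetical, not the authors' extraction form:

```python
from collections import Counter

# Hypothetical coded records for three experiments.
experiments = [
    {"id": 1, "repeated_measures": True,  "pseudo_replication": False, "factorial": "complete"},
    {"id": 2, "repeated_measures": False, "pseudo_replication": True,  "factorial": None},
    {"id": 3, "repeated_measures": True,  "pseudo_replication": True,  "factorial": "split-plot"},
]

def summarize(records):
    """Report each element's frequency as (n, total), plus a breakdown of
    factorial arrangement types."""
    total = len(records)
    return {
        "repeated_measures": (sum(r["repeated_measures"] for r in records), total),
        "pseudo_replication": (sum(r["pseudo_replication"] for r in records), total),
        "factorial": dict(Counter(r["factorial"] for r in records if r["factorial"])),
    }
```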

  14. Evolution of solid rocket booster component testing

    NASA Technical Reports Server (NTRS)

    Lessey, Joseph A.

    1989-01-01

    This paper describes the evolution of one of the new generation of test sets developed for the Solid Rocket Booster of the U.S. Space Transportation System. Requirements leading to factory checkout of the test set are explained, including the evolution from manual to semiautomated toward fully automated status. Individual improvements in the built-in test equipment, self-calibration, and software flexibility are addressed, and the insertion of fault detection to improve reliability is discussed.

  15. Development and Validation of a Low Cost, Flexible, Open Source Robot for Use as a Teaching and Research Tool across the Educational Spectrum

    ERIC Educational Resources Information Center

    Howell, Abraham L.

    2012-01-01

    In today's high-tech factories, robots perform tasks spanning a wide spectrum, from high-speed automated assembly of cell phones, laptops, and other electronic devices to the compounding, filling, packaging, and distribution of life-saving pharmaceuticals. As robot usage continues to…

  16. JPRS report: Science and Technology. Europe and Latin America

    NASA Astrophysics Data System (ADS)

    1988-01-01

    Articles from the popular and trade press are included on the following subjects: advanced materials, aerospace industry, automotive industry, biotechnology, computers, factory automation and robotics, microelectronics, and science and technology policy. The aerospace articles discuss briefly and in a nontechnical way the SAGEM bubble memories for space applications, Ariane V new testing facilities, innovative technologies of TDF-1 satellite, and the restructuring of the Aviation Division at France's Aerospatiale.

  17. Permethrin-Treated Clothing as Protection against the Dengue Vector, Aedes aegypti: Extent and Duration of Protection

    PubMed Central

    DeRaedt Banks, Sarah; Orsborne, James; Gezan, Salvador A.; Kaur, Harparkash; Wilder-Smith, Annelies; Lindsey, Steve W.; Logan, James G.

    2015-01-01

    Introduction Dengue transmission by the mosquito vector, Aedes aegypti, occurs indoors and outdoors during the day. Personal protection of individuals, particularly when outside, is challenging. Here we assess the efficacy and durability of different types of insecticide-treated clothing on laboratory-reared Ae. aegypti. Methods Standardised World Health Organisation Pesticide Evaluation Scheme (WHOPES) cone tests and arm-in-cage assays were used to assess knockdown (KD) and mortality of Ae. aegypti tested against factory-treated fabric, home-dipped fabric and microencapsulated fabric. Based on the testing of these three treatment types, the most protective was selected for further analysis using arm-in-cage assays, with the effects of washing, ultraviolet light, and ironing investigated using high-pressure liquid chromatography. Results Efficacy varied between the microencapsulated and factory-dipped fabrics in cone testing. Factory-dipped clothing showed the greatest effect on KD (3 min: 38.1%; 1 hour: 96.5%) and mortality (97.1%), with no significant difference between this and the factory-dipped school uniforms, and was therefore selected for further testing. Factory-dipped clothing provided a 59% (95% CI = 49.2%–66.9%) reduction in landing and a 100% reduction in biting in arm-in-cage tests. Washing duration and technique had a significant effect, with insecticidal longevity greater after machine washing (LW50 = 33.4) than after simulated hand washing (LW50 = 17.6). Ironing significantly reduced permethrin content after 1 week of simulated use, with a 96.7% decrease after 3 months, although UV exposure did not significantly reduce permethrin content within clothing after 3 months of simulated use. Conclusion Permethrin-treated clothing may be a promising intervention in reducing dengue transmission. However, our findings also suggest that clothing may provide only short-term protection because of the effects of washing and ironing, highlighting the need for improved fabric treatment techniques. PMID:26440967

  18. Permethrin-Treated Clothing as Protection against the Dengue Vector, Aedes aegypti: Extent and Duration of Protection.

    PubMed

    DeRaedt Banks, Sarah; Orsborne, James; Gezan, Salvador A; Kaur, Harparkash; Wilder-Smith, Annelies; Lindsey, Steve W; Logan, James G

    2015-01-01

    Dengue transmission by the mosquito vector, Aedes aegypti, occurs indoors and outdoors during the day. Personal protection of individuals, particularly when outside, is challenging. Here we assess the efficacy and durability of different types of insecticide-treated clothing on laboratory-reared Ae. aegypti. Standardised World Health Organisation Pesticide Evaluation Scheme (WHOPES) cone tests and arm-in-cage assays were used to assess knockdown (KD) and mortality of Ae. aegypti tested against factory-treated fabric, home-dipped fabric and microencapsulated fabric. Based on the testing of these three treatment types, the most protective was selected for further analysis using arm-in-cage assays, with the effects of washing, ultraviolet light, and ironing investigated using high-pressure liquid chromatography. Efficacy varied between the microencapsulated and factory-dipped fabrics in cone testing. Factory-dipped clothing showed the greatest effect on KD (3 min: 38.1%; 1 hour: 96.5%) and mortality (97.1%), with no significant difference between this and the factory-dipped school uniforms, and was therefore selected for further testing. Factory-dipped clothing provided a 59% (95% CI = 49.2%–66.9%) reduction in landing and a 100% reduction in biting in arm-in-cage tests. Washing duration and technique had a significant effect, with insecticidal longevity greater after machine washing (LW50 = 33.4) than after simulated hand washing (LW50 = 17.6). Ironing significantly reduced permethrin content after 1 week of simulated use, with a 96.7% decrease after 3 months, although UV exposure did not significantly reduce permethrin content within clothing after 3 months of simulated use. Permethrin-treated clothing may be a promising intervention in reducing dengue transmission. However, our findings also suggest that clothing may provide only short-term protection because of the effects of washing and ironing, highlighting the need for improved fabric treatment techniques.

  19. Microelectronics Revolution And The Impact Of Automation In The New Industrialized Countries

    NASA Astrophysics Data System (ADS)

    Baranauskas, Vitor

    1984-08-01

    A brief review of some important historical points on the origin of factories and the Industrial Revolution is presented, with emphasis on the social problems related to the automation of human labor. Until World War I, the social changes provoked by the Industrial Revolution divided the world into developed and underdeveloped countries. After that period, the less developed nations began their industrialization mainly through multinational corporations (MCs). These enterprises were very important to the production and export of utilities and manufactured goods in general, mainly products that required intensive, direct human labor. At present, with the pervasiveness of microelectronics in automation, this age seems to be reaching an end, because all continuous industrial processes tend economically toward total automation. This will cause a retraction in long-term investments and, beyond massive unemployment, a tendency for these MC industries to return to their home countries. The most promising alternative to avoid these outcomes, and perhaps the only one, is to foster autonomous development in areas of high technology such as microelectronics itself.

  20. Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation

    DTIC Science & Technology

    2018-01-01

    ARL-TR-8284 ● JAN 2018 ● US Army Research Laboratory. Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation.

  1. Robot graphic simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.

    1991-01-01

    The objective of this research was twofold. First, the basic capabilities of ROBOSIM (graphical simulation system) were improved and extended by taking advantage of advanced graphic workstation technology and artificial intelligence programming techniques. Second, the scope of the graphic simulation testbed was extended to include general problems of Space Station automation. Hardware support for 3-D graphics and high processing performance make high resolution solid modeling, collision detection, and simulation of structural dynamics computationally feasible. The Space Station is a complex system with many interacting subsystems. Design and testing of automation concepts demand modeling of the affected processes, their interactions, and that of the proposed control systems. The automation testbed was designed to facilitate studies in Space Station automation concepts.

  2. Applying Semantic Web Services and Wireless Sensor Networks for System Integration

    NASA Astrophysics Data System (ADS)

    Berkenbrock, Gian Ricardo; Hirata, Celso Massaki; de Oliveira Júnior, Frederico Guilherme Álvares; de Oliveira, José Maria Parente

    In environments such as factories, buildings, and homes, automation services tend to change often during their lifetime. Changes concern business rules, process optimization, cost reduction, and so on. It is important to provide a smooth and straightforward way to deal with these changes so that they can be handled quickly and at low cost. Some prominent solutions use the flexibility of Wireless Sensor Networks and the meaningful description of Semantic Web Services to provide service integration. In this work, we give an overview of current solutions for machinery integration that combine both technologies, as well as a discussion of some perspectives and open issues when applying Wireless Sensor Networks and Semantic Web Services to automation service integration.

  3. Advances in Composites Technology

    NASA Technical Reports Server (NTRS)

    Tenney, D. R.; Dexter, H. B.

    1985-01-01

    A significant level of research is currently focused on the development of tough resins and high strain fibers in an effort to gain improved damage tolerance. Moderate success has been achieved with the development of new resins such as PEEK and additional improvements look promising with new thermoplastic resins. Development of innovative material forms such as 2-D and 3-D woven fabrics and braided structural subelements is also expected to improve damage tolerance and durability of composite hardware. The new thrust in composites is to develop low cost manufacturing and design concepts to lower the cost of composite hardware. Processes being examined include automated material placement, filament winding, pultrusion, and thermoforming. The factory of the future will likely incorporate extensive automation in all aspects of manufacturing composite components.

  4. Automated Decomposition of Model-based Learning Problems

    NASA Technical Reports Server (NTRS)

    Williams, Brian C.; Millar, Bill

    1996-01-01

    A new generation of sensor-rich, massively distributed autonomous systems is being developed that has the potential for unprecedented performance, such as smart buildings, reconfigurable factories, adaptive traffic systems and remote earth ecosystem monitoring. To achieve high performance these massive systems will need to accurately model themselves and their environment from sensor information. Accomplishing this on a grand scale requires automating the art of large-scale modeling. This paper presents a formalization of decompositional model-based learning (DML), a method developed by observing a modeler's expertise at decomposing large-scale model estimation tasks. The method exploits a striking analogy between learning and consistency-based diagnosis. Moriarty, an implementation of DML, has been applied to thermal modeling of a smart building, demonstrating a significant improvement in learning rate.

  5. Toroidal magnetized iron neutrino detector for a neutrino factory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bross, A.; Wands, R.; Bayes, R.

    2013-08-01

    A neutrino factory has unparalleled physics reach for the discovery and measurement of CP violation in the neutrino sector. A far detector for a neutrino factory must have good charge identification with excellent background rejection and a large mass. An elegant solution is to construct a magnetized iron neutrino detector (MIND) along the lines of MINOS, where iron plates provide a toroidal magnetic field and scintillator planes provide 3D space points. In this report, the current status of a simulation of a toroidal MIND for a neutrino factory is discussed in light of the recent measurements of large $\theta_{13}$. The response and performance using the 10 GeV neutrino factory configuration are presented. It is shown that this setup has equivalent $\delta_{CP}$ reach to a MIND with a dipole field and is sensitive to the discovery of CP violation over 85% of the values of $\delta_{CP}$.

  6. Planning for the semiconductor manufacturer of the future

    NASA Technical Reports Server (NTRS)

    Fargher, Hugh E.; Smith, Richard A.

    1992-01-01

    Texas Instruments (TI) is currently contracted by the Air Force Wright Laboratory and the Defense Advanced Research Projects Agency (DARPA) to develop the next generation flexible semiconductor wafer fabrication system called Microelectronics Manufacturing Science & Technology (MMST). Several revolutionary concepts are being pioneered on MMST, including the following: new single-wafer rapid thermal processes, in-situ sensors, cluster equipment, and advanced Computer Integrated Manufacturing (CIM) software. The objective of the project is to develop a manufacturing system capable of achieving an order of magnitude improvement in almost all aspects of wafer fabrication. TI was awarded the contract in Oct., 1988, and will complete development with a fabrication facility demonstration in April, 1993. An important part of MMST is development of the CIM environment responsible for coordinating all parts of the system. The CIM architecture being developed is based on a distributed object oriented framework made of several cooperating subsystems. The software subsystems include the following: process control for dynamic control of factory processes; modular processing system for controlling the processing equipment; generic equipment model which provides an interface between processing equipment and the rest of the factory; specification system which maintains factory documents and product specifications; simulator for modelling the factory for analysis purposes; scheduler for scheduling work on the factory floor; and the planner for planning and monitoring of orders within the factory. This paper first outlines the division of responsibility between the planner, scheduler, and simulator subsystems. It then describes the approach to incremental planning and the way in which uncertainty is modelled within the plan representation. Finally, current status and initial results are described.

  7. Factorial versus multi-arm multi-stage designs for clinical trials with multiple treatments.

    PubMed

    Jaki, Thomas; Vasileiou, Despina

    2017-02-20

    When several treatments are available for evaluation in a clinical trial, different design options are available. We compare multi-arm multi-stage with factorial designs, and in particular, we will consider a 2 × 2 factorial design, where groups of patients will either take treatments A, B, both or neither. We investigate the performance and characteristics of both types of designs under different scenarios and compare them using both theory and simulations. For the factorial designs, we construct appropriate test statistics to test the hypothesis of no treatment effect against the control group with overall control of the type I error. We study the effect of the choice of the allocation ratios on the critical value and sample size requirements for a target power. We also study how the possibility of an interaction between the two treatments A and B affects type I and type II errors when testing for significance of each of the treatment effects. We present both simulation results and a case study on an osteoarthritis clinical trial. We discover that in an optimal factorial design in terms of minimising the associated critical value, the corresponding allocation ratios differ substantially to those of a balanced design. We also find evidence of potentially big losses in power in factorial designs for moderate deviations from the study design assumptions and little gain compared with multi-arm multi-stage designs when the assumptions hold. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
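The design comparison sketched in the abstract can be explored with a minimal Monte Carlo simulation of a balanced 2 × 2 factorial trial. The z-contrast below and all parameter values are simplifying assumptions for illustration, not the authors' test statistics or allocation ratios:

```python
import numpy as np

def factorial_power(effect_a=0.5, effect_b=0.5, interaction=0.0,
                    n_per_group=50, sigma=1.0, n_sims=2000, seed=0):
    """Monte Carlo rejection rate for the main effect of A in a balanced
    2x2 factorial (arms: control, A, B, A+B) using a two-sided z-contrast."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_sims):
        ctrl = rng.normal(0.0, sigma, n_per_group)
        a = rng.normal(effect_a, sigma, n_per_group)
        b = rng.normal(effect_b, sigma, n_per_group)
        ab = rng.normal(effect_a + effect_b + interaction, sigma, n_per_group)
        # Main-effect contrast for A: mean of A-containing arms minus the rest.
        est = (a.mean() + ab.mean()) / 2.0 - (ctrl.mean() + b.mean()) / 2.0
        se = sigma / np.sqrt(n_per_group)  # exact SE of this contrast (known sigma)
        if abs(est / se) > 1.959964:       # two-sided 5% level
            rejections += 1
    return rejections / n_sims
```

Comparing `factorial_power()` with `factorial_power(interaction=-0.5)` illustrates the loss of power under an interaction that the abstract warns about, while `factorial_power(effect_a=0.0, effect_b=0.0)` recovers the nominal type I error.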

  8. Comparison of occupational exposure to carbon disulphide in a viscose rayon factory before and after technical adjustments.

    PubMed

    Bulat, Petar; Daemen, Edgard; Van Risseghem, Marleen; De Bacquer, Dirk; Tan, Xiaodong; Braeckman, Lutgart; Vanhoorne, Michel

    2002-01-01

    The objective of this follow-up study was to verify the efficacy of the technical adjustments gradually introduced in departments of a viscose rayon factory from 1989 onward. Personal exposure to carbon disulphide was assessed by means of personal monitoring through active sampling. Six job titles in three departments of the factory were sampled. Geometric means were calculated and used as estimates of time-weighted average (TWA) concentrations. The results from the present study were compared with similar measurements from a previous study in the same factory. Due to organizational changes, only three job titles (spinner, first spinner, and viscose preparator) could be compared directly. Two new job titles were identified, although the tasks performed in these two job titles already existed. The measurements from one job title could not be compared, due to a substantial reorganization and automation of the tasks carried out in the department. The comparison before and after the technical improvements shows that personal exposure of the spinner and first spinner has been substantially reduced. Even the geometric means of measurements outside the fresh-air mask are below the TWA-TLV (Threshold Limit Value). Despite the difficulties in comparing the results from the two studies, it is concluded that the technical measures reduced personal exposure to carbon disulphide by up to tenfold, and that personal protection reduced it further by a factor of two.

  9. iPTF Discoveries of Recent Type Ia Supernovae

    NASA Astrophysics Data System (ADS)

    Papadogiannakis, S.; Taddia, F.; Petrushevska, T.; Ferretti, R.; Fremling, C.; Karamehmetoglu, E.; Nyholm, A.; Roy, R.; Hangard, L.; Vreeswijk, P.; Horesh, A.; Manulis, I.; Rubin, A.; Yaron, O.; Leloudas, G.; Khazov, D.; Soumagnac, M.; Knezevic, S.; Johansson, J.; Nir, G.; Cao, Y.; Blagorodnova, N.; Kulkarni, S.

    2016-05-01

    The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artefacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).

  10. iPTF Discoveries of Recent Core-Collapse Supernovae

    NASA Astrophysics Data System (ADS)

    Taddia, F.; Ferretti, R.; Papadogiannakis, S.; Petrushevska, T.; Fremling, C.; Karamehmetoglu, E.; Nyholm, A.; Roy, R.; Hangard, L.; Horesh, A.; Khazov, D.; Knezevic, S.; Johansson, J.; Leloudas, G.; Manulis, I.; Rubin, A.; Soumagnac, M.; Vreeswijk, P.; Yaron, O.; Bar, I.; Cao, Y.; Kulkarni, S.; Blagorodnova, N.

    2016-05-01

    The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following core-collapse SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).

  11. JPRS report: Science and technology. Europe and Latin America

    NASA Astrophysics Data System (ADS)

    1988-01-01

    Articles from the popular and trade press of Western Europe and Latin America are presented on advanced materials, aerospace and civil aviation, computers, defense industries, factory automation and robotics, lasers, sensors, optics, microelectronics, science and technology policy, biotechnology, marine technology, and nuclear developments. The aerospace articles include an overview of Austrian space activities and plans and a report on a panel of West German experts recommending against self-sufficiency for the Airbus.

  12. European Science Notes. Volume 39, Number 11.

    DTIC Science & Technology

    1985-11-01

    ...factory by means of a mixed culture of bacteria... is being studied under continuous flow conditions, especially as it depends on solar radiation... of 300 million operations per second and occupying less than a cubic... chromium layers are plated from aqueous solutions by an automated... as small as 10 a in diameter, allowing measurement in layers as thin as 20... characteristics of additives is needed to determine their effects on microbial growth.

  13. iPTF Discoveries of Recent Core-Collapse Supernovae

    NASA Astrophysics Data System (ADS)

    Taddia, F.; Ferretti, R.; Fremling, C.; Karamehmetoglu, E.; Nyholm, A.; Papadogiannakis, S.; Petrushevska, T.; Roy, R.; Hangard, L.; De Cia, A.; Vreeswijk, P.; Horesh, A.; Manulis, I.; Sagiv, I.; Rubin, A.; Yaron, O.; Leloudas, G.; Khazov, D.; Soumagnac, M.; Bilgi, P.

    2015-04-01

    The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Core-Collapse SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).

  14. iPTF Discoveries of Recent Type Ia Supernova

    NASA Astrophysics Data System (ADS)

    Petrushevska, T.; Ferretti, R.; Fremling, C.; Hangard, L.; Karamehmetoglu, E.; Nyholm, A.; Papadogiannakis, S.; Roy, R.; Horesh, A.; Khazov, D.; Knezevic, S.; Johansson, J.; Leloudas, G.; Manulis, I.; Rubin, A.; Soumagnac, M.; Vreeswijk, P.; Yaron, O.; Bilgi, P.; Cao, Y.; Duggan, G.; Lunnan, R.; Andreoni, I.

    2015-10-01

    The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).

  15. iPTF Discoveries of Recent SNe Ia

    NASA Astrophysics Data System (ADS)

    Ferretti, R.; Fremling, C.; Johansson, J.; Karamehmetoglu, E.; Migotto, K.; Nyholm, A.; Papadogiannakis, S.; Taddia, F.; Petrushevska, T.; Roy, R.; Ben-Ami, S.; De Cia, A.; Dzigan, Y.; Horesh, A.; Khazov, D.; Manulis, I.; Rubin, A.; Sagiv, I.; Vreeswijk, P.; Yaron, O.; Bilgi, P.; Cao, Y.; Duggan, G.

    2015-02-01

    The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).

  16. iPTF Discoveries of Recent Type Ia Supernovae

    NASA Astrophysics Data System (ADS)

    Papadogiannakis, S.; Taddia, F.; Ferretti, R.; Fremling, C.; Karamehmetoglu, E.; Petrushevska, T.; Nyholm, A.; Roy, R.; Hangard, L.; Vreeswijk, P.; Horesh, A.; Manulis, I.; Rubin, A.; Yaron, O.; Leloudas, G.; Khazov, D.; Soumagnac, M.; Knezevic, S.; Johansson, J.; Lunnan, R.; Blagorodnova, N.; Cao, Y.; Cenk, S. B.

    2016-01-01

    The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).

  17. iPTF Discoveries of Recent Type Ia Supernovae

    NASA Astrophysics Data System (ADS)

    Ferretti, R.; Fremling, C.; Hangard, L.; Karamehmetoglu, E.; Nyholm, A.; Papadogiannakis, S.; Petrushevska, T.; Roy, R.; Taddia, F.; Horesh, A.; Khazov, D.; Knezevic, S.; Leloudas, G.; Manulis, I.; Rubin, A.; Soumagnac, M.; Vreeswijk, P.; Yaron, O.; Cao, Y.; Duggan, G.; Lunnan, R.; Blagorodnova, N.

    2015-11-01

    The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).

  18. iPTF Discoveries of Recent Core-Collapse Supernovae

    NASA Astrophysics Data System (ADS)

    Taddia, F.; Ferretti, R.; Fremling, C.; Karamehmetoglu, E.; Nyholm, A.; Papadogiannakis, S.; Petrushevska, T.; Roy, R.; Hangard, L.; Vreeswijk, P.; Horesh, A.; Manulis, I.; Rubin, A.; Yaron, O.; Leloudas, G.; Khazov, D.; Soumagnac, M.; Knezevic, S.; Johansson, J.; Duggan, G.; Lunnan, R.; Cao, Y.

    2015-09-01

    The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Core-Collapse SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).

  19. iPTF Discovery of Recent Type Ia Supernovae

    NASA Astrophysics Data System (ADS)

    Hangard, L.; Ferretti, R.; Fremling, C.; Karamehmetoglu, E.; Nyholm, A.; Papadogiannakis, S.; Petrushevska, T.; Roy, R.; Bar, I.; Horesh, A.; Johansson, J.; Khazov, D.; Knezevic, S.; Leloudas, G.; Manulis, I.; Rubin, A.; Soumagnac, M.; Vreeswijk, P.; Yaron, O.; Cao, Y.; Kulkarni, S.; Lunnan, R.; Ravi, V.; Vedantham, H. K.; Yan, L.

    2016-04-01

    The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).

  20. iPTF Discoveries of Recent Core-Collapse Supernovae

    NASA Astrophysics Data System (ADS)

    Taddia, F.; Ferretti, R.; Fremling, C.; Karamehmetoglu, E.; Nyholm, A.; Papadogiannakis, S.; Petrushevska, T.; Roy, R.; Hangard, L.; Vreeswijk, P.; Horesh, A.; Manulis, I.; Rubin, A.; Yaron, O.; Leloudas, G.; Khazov, D.; Soumagnac, M.; Knezevic, S.; Johansson, J.; Lunnan, R.; Cao, Y.; Miller, A.

    2015-11-01

    The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Core-Collapse SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).

  1. iPTF Discoveries of Recent Type Ia Supernovae

    NASA Astrophysics Data System (ADS)

    Petrushevska, T.; Ferretti, R.; Fremling, C.; Hangard, L.; Karamehmetoglu, E.; Nyholm, A.; Papadogiannakis, S.; Roy, R.; Horesh, A.; Khazov, D.; Knezevic, S.; Johansson, J.; Leloudas, G.; Manulis, I.; Rubin, A.; Soumagnac, M.; Vreeswijk, P.; Yaron, O.; Bilgi, P.; Cao, Y.; Duggan, G.; Lunnan, R.

    2016-02-01

    The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).

  2. iPTF Discovery of Recent Type Ia Supernovae

    NASA Astrophysics Data System (ADS)

    Hangard, L.; Taddia, F.; Ferretti, R.; Papadogiannakis, S.; Petrushevska, T.; Fremling, C.; Karamehmetoglu, E.; Nyholm, A.; Roy, R.; Horesh, A.; Khazov, D.; Knezevic, S.; Johansson, J.; Leloudas, G.; Manulis, I.; Rubin, A.; Soumagnac, M.; Vreeswijk, P.; Yaron, O.; Bar, I.; Lunnan, R.; Cenko, S. B.

    2016-02-01

    The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).

  3. iPTF Discoveries of Recent Type Ia Supernovae

    NASA Astrophysics Data System (ADS)

    Papadogiannakis, S.; Fremling, C.; Hangard, L.; Karamehmetoglu, E.; Nyholm, A.; Ferretti, R.; Petrushevska, T.; Roy, R.; Taddia, F.; Bar, I.; Horesh, A.; Johansson, J.; Knezevic, S.; Leloudas, G.; Manulis, I.; Nir, G.; Rubin, A.; Soumagnac, M.; Vreeswijk, P.; Yaron, O.; Arcavi, I.; Howell, D. A.; McCully, C.; Hosseinzadeh, G.; Valenti, S.; Blagorodnova, N.; Cao, Y.; Duggan, G.; Ravi, V.; Lunnan, R.

    2016-03-01

    The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).

  4. iPTF discoveries of recent type Ia supernovae

    NASA Astrophysics Data System (ADS)

    Papadogiannakis, S.; Ferretti, R.; Fremling, C.; Hangard, L.; Karamehmetoglu, E.; Nyholm, A.; Petrushevska, T.; Roy, R.; De Cia, A.; Vreeswijk, P.; Horesh, A.; Manulis, I.; Sagiv, I.; Rubin, A.; Yaron, O.; Leloudas, G.; Khazov, D.; Soumagnac, M.; Knezevic, S.; Cenko, S. B.; Capone, J.; Bartakk, M.

    2015-09-01

    The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).

  5. iPTF Discovery of Recent Type Ia Supernova

    NASA Astrophysics Data System (ADS)

    Hangard, L.; Petrushevska, T.; Papadogiannakis, S.; Ferretti, R.; Fremling, C.; Karamehmetoglu, E.; Nyholm, A.; Roy, R.; Horesh, A.; Khazov, D.; Knezevic, S.; Johansson, J.; Leloudas, G.; Manulis, I.; Rubin, A.; Soumagnac, M.; Vreeswijk, P.; Yaron, O.; Kasliwal, M.

    2015-10-01

    The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).

  6. iPTF Discoveries of Recent Type Ia Supernovae

    NASA Astrophysics Data System (ADS)

    Petrushevska, T.; Ferretti, R.; Fremling, C.; Hangard, L.; Karamehmetoglu, E.; Nyholm, A.; Papadogiannakis, S.; Roy, R.; Horesh, A.; Khazov, D.; Knezevic, S.; Johansson, J.; Leloudas, G.; Manulis, I.; Rubin, A.; Soumagnac, M.; Vreeswijk, P.; Yaron, O.; Bilgi, P.; Cao, Y.; Duggan, G.; Lunnan, R.; Neill, J. D.; Walters, R.

    2016-04-01

    The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).

  7. iPTF Discoveries of Recent Type Ia Supernovae

    NASA Astrophysics Data System (ADS)

    Papadogiannakis, S.; Taddia, F.; Petrushevska, T.; Fremling, C.; Hangard, L.; Johansson, J.; Karamehmetoglu, E.; Migotto, K.; Nyholm, A.; Roy, R.; Ben-Ami, S.; De Cia, A.; Dzigan, Y.; Horesh, A.; Khazov, D.; Soumagnac, M.; Manulis, I.; Rubin, A.; Sagiv, I.; Vreeswijk, P.; Yaron, O.; Bond, H.; Bilgi, P.; Cao, Y.; Duggan, G.

    2015-03-01

    The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).

  8. iPTF Discovery of Recent Type Ia Supernovae

    NASA Astrophysics Data System (ADS)

    Hangard, L.; Ferretti, R.; Papadogiannakis, S.; Petrushevska, T.; Fremling, C.; Karamehmetoglu, E.; Nyholm, A.; Roy, R.; Horesh, A.; Khazov, D.; Knezevic, S.; Johansson, J.; Leloudas, G.; Manulis, I.; Rubin, A.; Soumagnac, M.; Vreeswijk, P.; Yaron, O.; Cook, D.

    2015-12-01

    The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).

  9. iPTF Discoveries of Recent Type Ia Supernova

    NASA Astrophysics Data System (ADS)

    Petrushevska, T.; Ferretti, R.; Fremling, C.; Hangard, L.; Karamehmetoglu, E.; Nyholm, A.; Papadogiannakis, S.; Roy, R.; Horesh, A.; Khazov, D.; Knezevic, S.; Johansson, J.; Leloudas, G.; Manulis, I.; Rubin, A.; Soumagnac, M.; Vreeswijk, P.; Yaron, O.; Bilgi, P.; Cao, Y.; Duggan, G.; Lunnan, R.; Jencson, J.

    2015-11-01

    The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).

  10. Program For Simulation Of Trajectories And Events

    NASA Technical Reports Server (NTRS)

    Gottlieb, Robert G.

    1992-01-01

    Universal Simulation Executive (USE) program accelerates and eases generation of application programs for numerical simulation of continuous trajectories interrupted by or containing discrete events. Developed for simulation of multiple spacecraft trajectories with such events as one spacecraft crossing the equator, two spacecraft meeting or parting, or firing of a rocket engine. USE also simulates operation of a chemical batch-processing factory. Written in Ada.
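    The kind of simulation USE performs, continuous propagation interrupted by discrete events such as an equator crossing, can be sketched as follows (Python rather than Ada, with deliberately simplified linear dynamics; nothing below comes from the USE program itself):

```python
# Sketch of event-interrupted simulation: step a state forward in time
# and stop when a discrete event condition (a sign change in z, read as
# an "equator crossing") is detected. Dynamics are illustrative only.

def simulate_until_event(z0, vz, dt=0.01, t_max=10.0):
    """Step z(t) = z0 + vz*t forward until z changes sign; return the
    (linearly interpolated) event time, or None if no event occurs."""
    t, z = 0.0, z0
    while t < t_max:
        z_next = z + vz * dt
        if z * z_next <= 0.0:            # event detected in this step
            if z == z_next:              # degenerate: already at zero
                return t
            frac = z / (z - z_next)      # linear interpolation
            return t + frac * dt
        t, z = t + dt, z_next
    return None

if __name__ == "__main__":
    # Descending at 0.5 units/s from z = 1: crossing near t = 2 s.
    print(simulate_until_event(z0=1.0, vz=-0.5))
```

    A production executive like USE would generalize this loop to many state variables, many event predicates, and restartable integration after each event.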

  11. Fatigue and voluntary utilization of automation in simulated driving.

    PubMed

    Neubauer, Catherine; Matthews, Gerald; Langheim, Lisa; Saxby, Dyani

    2012-10-01

    A driving simulator was used to assess the impact on fatigue, stress, and workload of full vehicle automation that was initiated by the driver. Previous studies have shown that mandatory use of full automation induces a state of "passive fatigue" associated with loss of alertness. By contrast, voluntary use of automation may enhance the driver's perceptions of control and ability to manage fatigue. Participants were assigned to one of two experimental conditions, automation optional (AO) and nonautomation (NA), and then performed a 35 min, monotonous simulated drive. In the last 5 min, automation was unavailable and drivers were required to respond to an emergency event. Subjective state and workload were evaluated before and after the drive. Making automation available to the driver failed to alleviate fatigue and stress states induced by driving in monotonous conditions. Drivers who were fatigued prior to the drive were more likely to choose to use automation, but automation use increased distress, especially in fatigue-prone drivers. Drivers in the AO condition were slower to initiate steering responses to the emergency event, suggesting optional automation may be distracting. Optional, driver-controlled automation appears to pose the same dangers to task engagement and alertness as externally initiated automation. Drivers of automated vehicles may be vulnerable to fatigue that persists when normal vehicle control is restored. It is important to evaluate automated systems' impact on driver fatigue, to seek design solutions to the issue of maintaining driver engagement, and to address the vulnerabilities of fatigue-prone drivers.

  12. Fast-time Simulation of an Automated Conflict Detection and Resolution Concept

    NASA Technical Reports Server (NTRS)

    Windhorst, Robert; Erzberger, Heinz

    2006-01-01

    This paper investigates the effect on the National Airspace System of reducing air traffic controller workload by automating conflict detection and resolution. The Airspace Concept Evaluation System is used to perform simulations of the Cleveland Center with conventional and with automated conflict detection and resolution concepts. Results show that the automated conflict detection and resolution concept significantly decreases the growth of delay as traffic demand is increased in en-route airspace.

  13. MESA: An Interactive Modeling and Simulation Environment for Intelligent Systems Automation

    NASA Technical Reports Server (NTRS)

    Charest, Leonard

    1994-01-01

    This report describes MESA, a software environment for creating applications that automate NASA mission operations. MESA enables intelligent automation by utilizing model-based reasoning techniques developed in the field of Artificial Intelligence. Model-based reasoning is realized in MESA through native support for causal modeling and discrete event simulation.

  14. Systems engineering: A formal approach. Part 1: System concepts

    NASA Astrophysics Data System (ADS)

    Vanhee, K. M.

    1993-03-01

    Engineering is the scientific discipline focused on the creation of new artifacts that are supposed to be of some use to our society. Different types of artifacts require different engineering approaches. However, in all these disciplines the development of a new artifact is divided into stages. Three stages can always be recognized: Analysis, Design, and Realization. The book considers only the first two stages of the development process. It focuses on a specific type of artifact, called discrete dynamic systems. These systems consist of active components, or actors, that consume and produce passive components, or tokens. Three subtypes are studied in more detail: business systems (such as a factory or restaurant), information systems (whether automated or not), and automated systems (systems that are controlled by an automated information system). The first subtype is studied by industrial engineers, the last by software engineers and electrical engineers, whereas the second is a battlefield for all three disciplines. The union of these disciplines is called systems engineering.
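    The actor/token view of discrete dynamic systems described above can be illustrated with a minimal transition-firing sketch; the restaurant-style place names and token counts are invented for illustration:

```python
# Minimal actor/token sketch: an actor fires when its input places
# hold enough tokens, consuming from inputs and producing into
# outputs (a Petri-net-style transition). Names are illustrative.

def fire(marking, consumes, produces):
    """Fire one actor: return the new marking, or None if disabled."""
    if any(marking.get(p, 0) < n for p, n in consumes.items()):
        return None                       # actor not enabled
    new = dict(marking)
    for p, n in consumes.items():
        new[p] -= n
    for p, n in produces.items():
        new[p] = new.get(p, 0) + n
    return new

if __name__ == "__main__":
    # A "kitchen" actor in a restaurant: consumes an order token and
    # an ingredient token, produces a meal token.
    m = {"orders": 2, "ingredients": 1, "meals": 0}
    m = fire(m, consumes={"orders": 1, "ingredients": 1},
             produces={"meals": 1})
    print(m)  # {'orders': 1, 'ingredients': 0, 'meals': 1}
```

    The same firing rule covers all three subtypes: only the interpretation of actors and tokens (machines and parts, clerks and forms, controllers and messages) changes.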

  15. Continuous-flow automation and hemolysis index: a crucial combination.

    PubMed

    Lippi, Giuseppe; Plebani, Mario

    2013-04-01

    A paradigm shift has occurred in the role and organization of laboratory diagnostics over the past decades, wherein consolidation or networking of small laboratories into larger factories and point-of-care testing have simultaneously evolved and now seem to favorably coexist. There is now evidence, however, that the growing implementation of continuous-flow automation, especially in closed systems, has not eased the identification of hemolyzed specimens, since the integration of preanalytical and analytical workstations would hide them from visual scrutiny, with an inherent risk that unreliable test results may be released to the stakeholders. Along with other technical breakthroughs, the new generation of laboratory instrumentation is increasingly equipped with systems that can systematically and automatically test samples for a broad series of interferences, the so-called serum indices, which also include the hemolysis index. The routine implementation of these technical tools in clinical laboratories equipped with continuous-flow automation carries several advantages and some drawbacks, which are discussed in this article.
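    Operationally, serum-index screening of this kind amounts to comparing each specimen's measured hemolysis index against analyte-specific rejection thresholds. A minimal sketch with invented threshold values (real cut-offs are analyzer- and analyte-dependent):

```python
# Sketch of automated hemolysis-index screening. Threshold values are
# illustrative only: real cut-offs depend on analyzer and analyte.
HIL_LIMITS = {"potassium": 30, "LDH": 20, "troponin": 100}  # arbitrary units

def flag_tests(h_index, ordered_tests, limits=HIL_LIMITS):
    """Return the ordered tests whose results should be suppressed."""
    return [t for t in ordered_tests
            if t in limits and h_index > limits[t]]

if __name__ == "__main__":
    print(flag_tests(45, ["potassium", "troponin"]))  # ['potassium']
```

    On a continuous-flow line this check would run on every tube, replacing the visual inspection that closed systems make impossible.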

  16. High-speed autoverifying technology for printed wiring boards

    NASA Astrophysics Data System (ADS)

    Ando, Moritoshi; Oka, Hiroshi; Okada, Hideo; Sakashita, Yorihiro; Shibutani, Nobumi

    1996-10-01

    We have developed an automated pattern verification technique. The output of an automated optical inspection system contains many false alarms, so verification is needed to distinguish between minor irregularities and serious defects. In the past, this verification was usually done manually, which led to unsatisfactory product quality. The goal of our new automated verification system is to detect pattern features on surface mount technology boards. In our system, we employ a new illumination method that uses multiple colors and multiple directions of illumination. Images are captured with a CCD camera. We have developed a new algorithm that uses CAD data for both pattern matching and pattern structure determination. This helps to search for patterns around a defect and to examine defect definition rules. These are processed with a high-speed workstation and hard-wired circuits. The system can verify a defect within 1.5 seconds. The verification system was tested in a factory, where it verified 1,500 defective samples and detected all significant defects with a false-alarm rate of only 0.1 percent.

  17. Simulations of Highway Traffic With Various Degrees of Automation

    DOT National Transportation Integrated Search

    1996-01-01

    A traffic simulator to study highway traffic under various degrees of automation is being developed at Argonne National Laboratory. The key components of this simulator include a global and a local Expert Driver Model, a human factor study and a grap...

  18. IFC BIM-Based Methodology for Semi-Automated Building Energy Performance Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bazjanac, Vladimir

    2008-07-01

    Building energy performance (BEP) simulation is still rarely used in building design, commissioning and operations. The process is too costly and too labor intensive, and it takes too long to deliver results. Its quantitative results are not reproducible due to arbitrary decisions and assumptions made in simulation model definition, and can be trusted only under special circumstances. A methodology to semi-automate BEP simulation preparation and execution makes this process much more effective. It incorporates principles of information science and aims to eliminate inappropriate human intervention that results in subjective and arbitrary decisions. This is achieved by automating every part of the BEP modeling and simulation process that can be automated, by relying on data from original sources, and by making any necessary data transformation rule-based and automated. This paper describes the new methodology and its relationship to IFC-based BIM and software interoperability. It identifies five steps that are critical to its implementation, and shows what part of the methodology can be applied today. The paper concludes with a discussion of application to simulation with EnergyPlus, and describes data transformation rules embedded in the new Geometry Simplification Tool (GST).
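    The rule-based, automated data transformation the methodology prescribes can be sketched as a table of rules applied mechanically to source data. The field names and conversion rules below are invented; real IFC-to-EnergyPlus mappings, such as those embedded in the GST, are far richer:

```python
# Sketch of rule-based data transformation: each rule is a pure
# function applied automatically, with no ad hoc human decisions.
# Field names and conversion rules are illustrative only.

RULES = [
    ("area_mm2", "area_m2", lambda v: v / 1_000_000),    # unit conversion
    ("name", "zone_name", lambda v: v.strip().upper()),  # normalization
]

def transform(record, rules=RULES):
    """Apply every applicable (source, target, fn) rule to a record."""
    out = {}
    for src, dst, fn in rules:
        if src in record:
            out[dst] = fn(record[src])
    return out

if __name__ == "__main__":
    print(transform({"area_mm2": 2_500_000, "name": " lobby "}))
    # {'area_m2': 2.5, 'zone_name': 'LOBBY'}
```

    Because each rule is deterministic, two analysts running the pipeline on the same source data obtain identical simulation inputs, which is exactly the reproducibility argument the paper makes.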

  19. Software Quality Control at Belle II

    NASA Astrophysics Data System (ADS)

    Ritter, M.; Kuhr, T.; Hauth, T.; Gebard, T.; Kristof, M.; Pulvermacher, C.; Belle II Software Group

    2017-10-01

    Over the last seven years the software stack of the next generation B factory experiment Belle II has grown to over one million lines of C++ and Python code, counting only the part included in offline software releases. There are several thousand commits to the central repository by about 100 individual developers per year. Keeping the software stack coherent and of sufficiently high quality that it can be sustained and used efficiently for data acquisition, simulation, reconstruction, and analysis over the lifetime of the Belle II experiment is a challenge. A set of tools is employed to monitor the quality of the software and provide fast feedback to the developers. They are integrated into a machinery that is controlled by a buildbot master and automates the quality checks. The tools include different compilers, cppcheck, the clang static analyzer, valgrind memcheck, doxygen, a geometry overlap checker, a check for missing or extra library links, unit tests, steering-file-level tests, a sophisticated high-level validation suite, and an issue tracker. The technological development infrastructure is complemented by organizational means to coordinate the development.

  20. Aviation Safety: Modeling and Analyzing Complex Interactions between Humans and Automated Systems

    NASA Technical Reports Server (NTRS)

    Rungta, Neha; Brat, Guillaume; Clancey, William J.; Linde, Charlotte; Raimondi, Franco; Seah, Chin; Shafto, Michael

    2013-01-01

    The on-going transformation from the current US Air Traffic System (ATS) to the Next Generation Air Traffic System (NextGen) will force the introduction of new automated systems and most likely will cause automation to migrate from ground to air. This will yield new function allocations between humans and automation and therefore change the roles and responsibilities in the ATS. Yet, safety in NextGen is required to be at least as good as in the current system. We therefore need techniques to evaluate the safety of the interactions between humans and automation. We think that current human factor studies and simulation-based techniques will fall short in the face of the ATS complexity, and that we need to add more automated techniques to simulations, such as model checking, which offers exhaustive coverage of the non-deterministic behaviors in nominal and off-nominal scenarios. In this work, we present a verification approach based both on simulations and on model checking for evaluating the roles and responsibilities of humans and automation. Models are created using Brahms (a multi-agent framework) and we show that the traditional Brahms simulations can be integrated with automated exploration techniques based on model checking, thus offering a complete exploration of the behavioral space of the scenario. Our formal analysis supports the notion of beliefs and probabilities to reason about human behavior. We demonstrate the technique with the Ueberlingen accident, since it exemplifies authority problems that arise when conflicting advice is received from human and automated systems.

  1. A Computer Program Functional Design of the Simulation Subsystem of an Automated Central Flow Control System

    DOT National Transportation Integrated Search

    1976-08-01

    This report contains a functional design for the simulation of a future automation concept in support of the ATC Systems Command Center. The simulation subsystem performs airport airborne arrival delay predictions and computes flow control tables for...

  2. iPTF Discoveries of Recent Type Ia Supernovae

    NASA Astrophysics Data System (ADS)

    Petrushevska, T.; Ferretti, R.; Fremling, C.; Hangard, L.; Johansson, J.; Migotto, K.; Nyholm, A.; Papadogiannakis, S.; Ben-Ami, S.; De Cia, A.; Dzigan, Y.; Horesh, A.; Leloudas, G.; Manulis, I.; Rubin, A.; Sagiv, I.; Vreeswijk, P.; Yaron, O.; Cao, Y.; Perley, D.; Miller, A.; Waszczak, A.; Kasliwal, M. M.; Hosseinzadeh, G.; Cenko, S. B.; Quimby, R.

    2015-05-01

    The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W). See ATel #7112 for additional details.

  3. Development and Validation of an Automated Simulation Capability in Support of Integrated Demand Management

    NASA Technical Reports Server (NTRS)

    Arneson, Heather; Evans, Antony D.; Li, Jinhua; Wei, Mei Yueh

    2017-01-01

    Integrated Demand Management (IDM) is a near- to mid-term NASA concept that proposes to address mismatches in air traffic system demand and capacity by using strategic flow management capabilities to pre-condition demand into the more tactical Time-Based Flow Management System (TBFM). This paper describes an automated simulation capability to support IDM concept development. The capability closely mimics existing human-in-the-loop (HITL) capabilities, while automating both the human components and collaboration between operational systems, and speeding up the real-time aircraft simulations. Such a capability allows for parametric studies to be carried out that can inform the HITL simulations, identifying breaking points and parameter values at which significant changes in system behavior occur. The paper describes the initial validation of the automated simulation capability against results from previous IDM HITL experiments, quantifying the differences. The simulator is then used to explore the performance of the IDM concept under the simple scenario of a capacity constrained airport under a wide range of wind conditions.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurt Derr; Milos Manic

    Time and location data play a very significant role in a variety of factory automation scenarios, from automated vehicles and robots and their navigation, tracking, and monitoring, to optimization and security services. In addition, pervasive wireless capabilities combined with time and location information are enabling new applications in areas such as transportation systems, health care, elder care, military, emergency response, critical infrastructure, and law enforcement. A person or object in proximity to certain areas for specific durations of time may pose a risk either to themselves, others, or the environment. This paper presents DSTiPE, a novel fuzzy-based method for calculating the spatio-temporal risk that an object with wireless communications presents to the environment. The presented Matlab-based application for fuzzy spatio-temporal risk cluster extraction is verified on a diagonal vehicle movement example.

  5. Effects of imperfect automation on decision making in a simulated command and control task.

    PubMed

    Rovira, Ericka; McGarry, Kathleen; Parasuraman, Raja

    2007-02-01

    Effects of four types of automation support and two levels of automation reliability were examined. The objective was to examine the differential impact of information and decision automation and to investigate the costs of automation unreliability. Research has shown that imperfect automation can lead to differential effects of stages and levels of automation on human performance. Eighteen participants performed a "sensor to shooter" targeting simulation of command and control. Dependent variables included accuracy and response time of target engagement decisions, secondary task performance, and subjective ratings of mental workload, trust, and self-confidence. Compared with manual performance, reliable automation significantly reduced decision times. Unreliable automation led to a greater cost in decision-making accuracy under the higher automation reliability condition for three different forms of decision automation relative to information automation. At low automation reliability, however, there was a cost in performance for both information and decision automation. The results are consistent with a model of human-automation interaction that requires evaluation of the different stages of information processing to which automation support can be applied. If fully reliable decision automation cannot be guaranteed, designers should provide users with information automation support or other tools that allow for inspection and analysis of raw data.

  6. Automated Metrics in a Virtual-Reality Myringotomy Simulator: Development and Construct Validity.

    PubMed

    Huang, Caiwen; Cheng, Horace; Bureau, Yves; Ladak, Hanif M; Agrawal, Sumit K

    2018-06-15

    The objectives of this study were: 1) to develop and implement a set of automated performance metrics into the Western myringotomy simulator, and 2) to establish construct validity. Prospective simulator-based assessment study. The Auditory Biophysics Laboratory at Western University, London, Ontario, Canada. Eleven participants were recruited from the Department of Otolaryngology-Head & Neck Surgery at Western University: four senior otolaryngology consultants and seven junior otolaryngology residents. Educational simulation. Discrimination between expert and novice participants on five primary automated performance metrics: 1) time to completion, 2) surgical errors, 3) incision angle, 4) incision length, and 5) the magnification of the microscope. Automated performance metrics were developed, programmed, and implemented into the simulator. Participants were given a standardized simulator orientation and instructions on myringotomy and tube placement. Each participant then performed 10 procedures and automated metrics were collected. The metrics were analyzed using the Mann-Whitney U test with Bonferroni correction. All metrics discriminated senior otolaryngologists from junior residents with a significance of p < 0.002. Junior residents had 2.8 times more errors compared with the senior otolaryngologists. Senior otolaryngologists took significantly less time to completion compared with junior residents. The senior group also had significantly longer incision lengths, more accurate incision angles, and lower magnification keeping both the umbo and annulus in view. Automated quantitative performance metrics were successfully developed and implemented, and construct validity was established by discriminating between expert and novice participants.
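    The statistical comparison used above, a Mann-Whitney U test with a Bonferroni correction across the five metrics, can be sketched in pure Python with invented data (the study itself presumably used standard statistical software):

```python
# Sketch of the Mann-Whitney U statistic with a Bonferroni-adjusted
# alpha for five metrics, mirroring the study design. Data invented.

def mann_whitney_u(x, y):
    """U statistic for sample x versus sample y (ties count 0.5)."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

if __name__ == "__main__":
    experts = [52, 48, 55, 50]        # e.g. seconds to completion
    novices = [80, 95, 77, 88, 91]
    print(mann_whitney_u(novices, experts))  # 20.0: every novice slower
    alpha = 0.05 / 5                  # Bonferroni-adjusted significance
```

    With five primary metrics, the familywise 0.05 level becomes a per-metric threshold of 0.01, consistent with the p < 0.002 results reported being well inside it.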

  7. Rapid toxicity detection in water quality control utilizing automated multispecies biomonitoring for permanent space stations

    NASA Technical Reports Server (NTRS)

    Morgan, E. L.; Young, R. C.; Smith, M. D.; Eagleson, K. W.

    1986-01-01

    The objective of this study was to evaluate proposed design characteristics and applications of automated biomonitoring devices for real-time toxicity detection in water quality control on board permanent space stations. Downlink transmissions of automated biomonitoring data to Earth-receiving stations were simulated using satellite data transmissions from remote Earth-based stations.

  8. Advanced Simulation in Undergraduate Pilot Training: Automatic Instructional System. Final Report for the Period March 1971-January 1975.

    ERIC Educational Resources Information Center

    Faconti, Victor; Epps, Robert

    The Advanced Simulator for Undergraduate Pilot Training (ASUPT) was designed to investigate the role of simulation in the future Undergraduate Pilot Training (UPT) program. The Automated Instructional System designed for the ASUPT simulator was described in this report. The development of the Automated Instructional System for ASUPT was based upon…

  9. Development of automation and robotics for space via computer graphic simulation methods

    NASA Technical Reports Server (NTRS)

    Fernandez, Ken

    1988-01-01

    A robot simulation system, has been developed to perform automation and robotics system design studies. The system uses a procedure-oriented solid modeling language to produce a model of the robotic mechanism. The simulator generates the kinematics, inverse kinematics, dynamics, control, and real-time graphic simulations needed to evaluate the performance of the model. Simulation examples are presented, including simulation of the Space Station and the design of telerobotics for the Orbital Maneuvering Vehicle.
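    The kinematics layer of such a simulator can be illustrated with the forward kinematics of a two-link planar arm, a deliberately tiny stand-in for the spacecraft and telerobot models the report describes; the link lengths are assumptions:

```python
import math

# Forward kinematics of a two-link planar arm: given joint angles,
# compute the end-effector position. Link lengths are illustrative.

def forward_kinematics(theta1, theta2, l1=1.0, l2=1.0):
    """End-effector (x, y) for joint angles in radians."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

if __name__ == "__main__":
    # Fully extended along x: both angles zero.
    print(forward_kinematics(0.0, 0.0))        # (2.0, 0.0)
    # Elbow bent 90 degrees.
    x, y = forward_kinematics(0.0, math.pi / 2)
    print(round(x, 6), round(y, 6))            # 1.0 1.0
```

    A simulator like the one described layers inverse kinematics, dynamics, and control on top of exactly this kind of geometric model, then drives a real-time graphic display from the resulting joint trajectories.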

  10. Muscle activity and mood state during simulated plant factory work in individuals with cervical spinal cord injury

    PubMed Central

    Okahara, Satoshi; Kataoka, Masataka; Okuda, Kuniharu; Shima, Masato; Miyagaki, Keiko; Ohara, Hitoshi

    2016-01-01

    [Purpose] The present study investigated the physical and mental effects of plant factory work in individuals with cervical spinal cord injury and the use of a newly developed agricultural working environment. [Subjects] Six males with C5–C8 spinal cord injuries and 10 healthy volunteers participated. [Methods] Plant factory work involved three simulated repetitive tasks: sowing, transplantation, and harvesting. Surface electromyography was performed in the dominant upper arm, upper trapezius, anterior deltoid, and biceps brachii muscles. Subjects’ moods were monitored using the Profile of Mood States. [Results] Five males with C6–C8 injuries performed the same tasks as healthy persons; a male with a C5 injury performed fewer repetitions of tasks because it took longer. Regarding muscle activity during transplantation and harvesting, subjects with spinal cord injury had higher values for the upper trapezius and anterior deltoid muscles compared with healthy persons. The Profile of Mood States vigor scores were significantly higher after tasks in subjects with spinal cord injury. [Conclusion] Individuals with cervical spinal cord injury completed the plant factory work, though it required increased time and muscle activity. For individuals with C5–C8 injuries, it is necessary to develop an appropriate environment and assistive devices to facilitate their work. PMID:27134377

  11. Development of a Prototype Automation Simulation Scenario Generator for Air Traffic Management Software Simulations

    NASA Technical Reports Server (NTRS)

    Khambatta, Cyrus F.

    2007-01-01

    A technique for the automated development of scenarios for use in Multi-Center Traffic Management Advisor (McTMA) software simulations is described. The resulting software is designed and implemented to automate the generation of simulation scenarios, with the intent of reducing the time it currently takes using an observational approach. The software program is effective in achieving this goal. The scenarios created for use in the McTMA simulations are based on data taken from data files from the McTMA system, and were manually edited before incorporation into the simulations to ensure accuracy. Despite the software's overall favorable performance, several key software issues are identified. Proposed solutions to these issues are discussed. Future enhancements to the scenario generator software may address the limitations identified in this paper.

  12. Automated Guideway Ground Transportation Network Simulation

    DOT National Transportation Integrated Search

    1975-08-01

    The report discusses some automated guideway management problems relating to ground transportation systems and provides an outline of the types of models and algorithms that could be used to develop simulation tools for evaluating system performance....

  13. Optimizing the balance between task automation and human manual control in simulated submarine track management.

    PubMed

    Chen, Stephanie I; Visser, Troy A W; Huf, Samuel; Loft, Shayne

    2017-09-01

    Automation can improve operator performance and reduce workload, but can also degrade operator situation awareness (SA) and the ability to regain manual control. In 3 experiments, we examined the extent to which automation could be designed to benefit performance while ensuring that individuals maintained SA and could regain manual control. Participants completed a simulated submarine track management task under varying task load. The automation was designed to facilitate information acquisition and analysis, but did not make task decisions. Relative to a condition with no automation, the continuous use of automation improved performance and reduced subjective workload, but degraded SA. Automation that was engaged and disengaged by participants as required (adaptable automation) moderately improved performance and reduced workload relative to no automation, but degraded SA. Automation engaged and disengaged based on task load (adaptive automation) provided no benefit to performance or workload, and degraded SA relative to no automation. Automation never led to significant return-to-manual deficits. However, all types of automation led to degraded performance on a nonautomated task that shared information processing requirements with automated tasks. Given these outcomes, further research is urgently required to establish how to design automation to maximize performance while keeping operators cognitively engaged. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  14. Impact of pharmacy automation on patient waiting time: an application of computer simulation.

    PubMed

    Tan, Woan Shin; Chua, Siang Li; Yong, Keng Woh; Wu, Tuck Seng

    2009-06-01

    This paper aims to illustrate the use of computer simulation in evaluating the impact of a prototype automated dispensing system on waiting time in an outpatient pharmacy and its potential as a routine tool in pharmacy management. A discrete event simulation model was developed to investigate the impact of a prototype automated dispensing system on operational efficiency and service standards in an outpatient pharmacy. The simulation results suggest that automating the prescription-filing function using a prototype that picks and packs at 20 seconds per item will not assist the pharmacy in achieving the waiting time target of 30 minutes for all patients. Regardless of the state of automation, to meet the waiting time target, 2 additional pharmacists are needed to overcome the process bottleneck at the point of medication dispense. However, if the automated dispensing is the preferred option, the speed of the system needs to be twice as fast as the current configuration to facilitate the reduction of the 95th percentile patient waiting time to below 30 minutes. The faster processing speed will concomitantly allow the pharmacy to reduce the number of pharmacy technicians from 11 to 8. Simulation was found to be a useful and low cost method that allows an otherwise expensive and resource intensive evaluation of new work processes and technology to be completed within a short time.
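    The record above describes a discrete event simulation without reproducing its model. As a minimal sketch of the same idea, a multi-server queue can estimate the 95th-percentile waiting time used as the service target; the arrival rate, service time, and staffing below are hypothetical stand-ins, not the pharmacy's data.

```python
import heapq
import random

def simulate_pharmacy(n_patients=2000, arrival_rate=0.5,
                      service_mean=5.0, n_pharmacists=3, seed=42):
    """Toy multi-server queue: Poisson arrivals (per minute), exponential
    dispense times; returns each patient's waiting time in minutes."""
    rng = random.Random(seed)
    t = 0.0
    free_at = [0.0] * n_pharmacists   # time each pharmacist next becomes free
    heapq.heapify(free_at)
    waits = []
    for _ in range(n_patients):
        t += rng.expovariate(arrival_rate)       # next patient arrives
        start = max(t, free_at[0])               # earliest available pharmacist
        waits.append(start - t)
        heapq.heapreplace(free_at, start + rng.expovariate(1.0 / service_mean))
    return waits

waits = simulate_pharmacy()
p95 = sorted(waits)[int(0.95 * len(waits))]      # 95th-percentile wait
```

    A "below 30 minutes for the 95th percentile" target of the kind the study uses can then be checked directly against `p95`, and re-run with an extra pharmacist or a faster service time to compare configurations.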

  15. The simulation of air recirculation and fire/explosion phenomena within a semiconductor factory.

    PubMed

    I, Yet-Pole; Chiu, Yi-Long; Wu, Shi-Jen

    2009-04-30

    The semiconductor industry is the collection of capital-intensive firms that employ a variety of hazardous chemicals and engage in the design and fabrication of semiconductor devices. Owing to its processing characteristics, the fully confined structure of the fabrication area (fab) and the vertical airflow ventilation design restrict the applications of traditional consequence analysis techniques that are commonly used in other industries. The adverse situation also limits the advancement of a fire/explosion prevention design for the industry. In this research, a realistic model of a semiconductor factory with a fab, sub-fabrication area, supply air plenum, and return air plenum structures was constructed and the computational fluid dynamics algorithm was employed to simulate the possible fire/explosion range and its severity. The semiconductor factory has fan module units with high efficiency particulate air filters that can keep the airflow uniform within the cleanroom. This condition was modeled by 25 fans, three layers of porous ceiling, and one layer of porous floor. The obtained results predicted very well the real airflow pattern in the semiconductor factory. Different released gases, leak locations, and leak rates were applied to investigate their influence on the hazard range and severity. Common mitigation measures such as a water spray system and a pressure relief panel were also provided to study their potential effectiveness to relieve thermal radiation and overpressure hazards within a fab. The semiconductor industry can use this simulation procedure as a reference on how to implement a consequence analysis for a flammable gas release accident within an air recirculation cleanroom.

  16. Parameter Studies, time-dependent simulations and design with automated Cartesian methods

    NASA Technical Reports Server (NTRS)

    Aftosmis, Michael

    2005-01-01

    Over the past decade, NASA has made a substantial investment in developing adaptive Cartesian grid methods for aerodynamic simulation. Cartesian-based methods played a key role in both the Space Shuttle accident investigation and in NASA's return-to-flight activities. The talk will provide an overview of recent technological developments, focusing on the generation of large-scale aerodynamic databases, automated CAD-based design, and time-dependent simulations of bodies in relative motion. Automation, scalability, and robustness underlie all of these applications, and research in each of these topics will be presented.

  17. Determining production level under uncertainty using fuzzy simulation and bootstrap technique, a case study

    NASA Astrophysics Data System (ADS)

    Hamidi, Mohammadreza; Shahanaghi, Kamran; Jabbarzadeh, Armin; Jahani, Ehsan; Pousti, Zahra

    2017-12-01

    In every production plant, it is necessary to estimate the production level, and many parameters can affect this estimate. In this paper, we seek an appropriate estimate of the production level, in an uncertain environment, for an industrial factory called Barez. We consider a part of the production line that has different production times for different kinds of products, which introduces both environmental and system uncertainty. To solve the problem we simulated the line, and because of the uncertainty in the times, fuzzy simulation is used. The required fuzzy numbers are estimated by means of the bootstrap technique. The results were used by factory experts in the production planning process, with satisfying outcomes, and the experts' opinions about the efficiency of this methodology are included.
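    The record does not spell out its estimator; one plausible sketch of combining bootstrap resampling with a fuzzy description is below. The triangular membership shape and the sample cycle times are illustrative assumptions, not the paper's data.

```python
import random
import statistics

def bootstrap_triangular(times, n_boot=1000, seed=1):
    """Resample observed cycle times with replacement; return (a, m, b),
    a triangular fuzzy number whose core is the sample mean and whose
    support spans the spread of the bootstrap means."""
    rng = random.Random(seed)
    boot_means = []
    for _ in range(n_boot):
        resample = [rng.choice(times) for _ in times]
        boot_means.append(statistics.mean(resample))
    return min(boot_means), statistics.mean(times), max(boot_means)

# Hypothetical cycle times (minutes) for one product type.
times = [4.2, 4.8, 5.1, 4.5, 5.6, 4.9, 5.0, 4.4]
a, m, b = bootstrap_triangular(times)
```

    The fuzzy number (a, m, b) can then feed a fuzzy simulation of the line, so the production-level estimate carries the sampling uncertainty instead of a single point value.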

  18. Dynamic mapping of EDDL device descriptions to OPC UA

    NASA Astrophysics Data System (ADS)

    Atta Nsiah, Kofi; Schappacher, Manuel; Sikora, Axel

    2017-07-01

    OPC UA (Open Platform Communications Unified Architecture) is already a well-known concept used widely in the automation industry. In the area of factory automation, OPC UA models the underlying field devices, such as sensors and actuators, in an OPC UA server, allowing connected OPC UA clients to access device-specific information via a standardized information model. One of the requirements for the OPC UA server to represent field device data using its information model is to have advance knowledge about the properties of the field devices in the form of device descriptions. The international standard IEC 61804 specifies EDDL (Electronic Device Description Language) as a generic language for describing the properties of field devices. In this paper, the authors describe a possibility to dynamically map and integrate field device descriptions based on EDDL into OPC UA.
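    As an illustration only, the mapping step can be pictured as turning a device-description record into OPC UA node attributes. The structures below are invented stand-ins: the real IEC 61804 EDDL grammar and the OPC UA address-space model are far richer, and the type correspondence shown is assumed, not normative.

```python
# Hypothetical, much-simplified EDDL-like device description.
edd = {
    "device": "FlowSensor",
    "variables": [
        {"name": "flow_rate", "type": "FLOAT", "unit": "l/min"},
        {"name": "status", "type": "UINT8"},
    ],
}

# Illustrative EDDL-type -> OPC UA DataType correspondence (assumed).
TYPE_MAP = {"FLOAT": "Double", "UINT8": "Byte"}

def map_to_opcua_nodes(edd, ns=2):
    """Produce one dict per variable, standing in for an OPC UA Variable node."""
    nodes = []
    for var in edd["variables"]:
        nodes.append({
            "NodeId": f"ns={ns};s={edd['device']}.{var['name']}",
            "BrowseName": var["name"],
            "DataType": TYPE_MAP.get(var["type"], "BaseDataType"),
            "Unit": var.get("unit"),
        })
    return nodes

nodes = map_to_opcua_nodes(edd)
```

    A dynamic integration, as the paper proposes, would perform this translation at server run time whenever a new device description is loaded, rather than at server build time.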

  19. Level of Automation and Failure Frequency Effects on Simulated Lunar Lander Performance

    NASA Technical Reports Server (NTRS)

    Marquez, Jessica J.; Ramirez, Margarita

    2014-01-01

    A human-in-the-loop experiment was conducted at the NASA Ames Research Center Vertical Motion Simulator, where instrument-rated pilots completed a simulated terminal descent phase of a lunar landing. Ten pilots participated in a 2 x 2 mixed design experiment, with level of automation as the within-subjects factor and failure frequency as the between-subjects factor. The two evaluated levels of automation were high (fully automated landing) and low (manually controlled landing). During test trials, participants were exposed to either a high number of failures (75% failure frequency) or a low number of failures (25% failure frequency). In order to investigate the pilots' sensitivity to changes in level of automation and failure frequency, the dependent measure selected for this experiment was accuracy of failure diagnosis, from which D Prime and Decision Criterion were derived. For each of the dependent measures, no significant difference was found for level of automation, and no significant interaction was detected between level of automation and failure frequency. A significant effect was identified for failure frequency, suggesting that failure frequency affects pilots' sensitivity to failure detection and diagnosis. Participants were more likely to correctly identify and diagnose failures if they experienced the higher level of failures, regardless of level of automation.

  20. Optimization of the Neutrino Factory, revisited

    NASA Astrophysics Data System (ADS)

    Agarwalla, Sanjib K.; Huber, Patrick; Tang, Jian; Winter, Walter

    2011-01-01

    We perform the baseline and energy optimization of the Neutrino Factory including the latest simulation results on the magnetized iron detector (MIND). We also consider the impact of τ decays, generated by νμ → ντ or νe → ντ appearance, on the mass hierarchy, CP violation, and θ₁₃ discovery reaches, which we find to be negligible for the considered detector. For the baseline-energy optimization for small sin²2θ₁₃, we qualitatively recover the results with earlier simulations of the MIND detector. We find optimal baselines of about 2500 km to 5000 km for the CP violation measurement, where now values of Eμ as low as about 12 GeV may be possible. However, for large sin²2θ₁₃, we demonstrate that the lower threshold and the backgrounds reconstructed at lower energies in fact allow for muon energies as low as 5 GeV at considerably shorter baselines, such as FNAL-Homestake. This implies that with the latest MIND analysis, low- and high-energy versions of the Neutrino Factory are just two different versions of the same experiment optimized for different parts of the parameter space. Apart from a green-field study of the updated detector performance, we discuss specific implementations for the two-baseline Neutrino Factory, where the considered detector sites are taken to be currently discussed underground laboratories. We find that reasonable setups can be found for the Neutrino Factory source in Asia, Europe, and North America, and that a triangular-shaped storage ring is possible in all cases based on geometrical arguments only.

  1. BPM Breakdown Potential in the PEP-II B-factory Storage Ring Collider

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weathersby, Stephen; Novokhatski, Alexander; /SLAC

    2010-02-10

    High current B-Factory BPM designs incorporate a button type electrode which introduces a small gap between the button and the beam chamber. For achievable currents and bunch lengths, simulations indicate that electric potentials can be induced in this gap which are comparable to the breakdown voltage. This study characterizes beam induced voltages in the existing PEP-II storage ring collider BPM as a function of bunch length and beam current.

  2. Knowledge Based Simulation: An Artificial Intelligence Approach to System Modeling and Automating the Simulation Life Cycle.

    DTIC Science & Technology

    1988-04-13

    Knowledge Based Simulation: An Artificial Intelligence Approach to System Modeling and Automating the Simulation Life Cycle. Mark S. Fox, Nizwer Husain, Malcolm McRoberts and Y. V. Reddy. CMU-RI-TR-88-5, Intelligent Systems Laboratory, The Robotics Institute, Carnegie Mellon University, Pittsburgh, Pennsylvania. ...years of research in the application of Artificial Intelligence to Simulation. Our focus has been in two areas: the use of AI knowledge representation

  3. Exploiting Self-organization in Bioengineered Systems: A Computational Approach.

    PubMed

    Davis, Delin; Doloman, Anna; Podgorski, Gregory J; Vargis, Elizabeth; Flann, Nicholas S

    2017-01-01

    The productivity of bioengineered cell factories is limited by inefficiencies in nutrient delivery and waste and product removal. Current solution approaches explore changes in the physical configurations of the bioreactors. This work investigates the possibilities of exploiting self-organizing vascular networks to support producer cells within the factory. A computational model simulates de novo vascular development of endothelial-like cells and the resultant network functioning to deliver nutrients and extract product and waste from the cell culture. Microbial factories with vascular networks are evaluated for their scalability, robustness, and productivity compared to the cell factories without a vascular network. Initial studies demonstrate that at least an order of magnitude increase in production is possible, the system can be scaled up, and the self-organization of an efficient vascular network is robust. The work suggests that bioengineered multicellularity may offer efficiency improvements difficult to achieve with physical engineering approaches.

  4. Team play with a powerful and independent agent: a full-mission simulation study.

    PubMed

    Sarter, N B; Woods, D D

    2000-01-01

    One major problem with pilot-automation interaction on modern flight decks is a lack of mode awareness; that is, a lack of knowledge and understanding of the current and future status and behavior of the automation. A lack of mode awareness is not simply a pilot problem; rather, it is a symptom of a coordination breakdown between humans and machines. Recent changes in automation design can therefore be expected to have an impact on the nature of problems related to mode awareness. To examine how new automation properties might affect pilot-automation coordination, we performed a full-mission simulation study on one of the most advanced automated aircraft, the Airbus A-320. The results of this work indicate that mode errors and "automation surprises" still occur on these advanced aircraft. However, there appear to be more opportunities for delayed or missing interventions with undesirable system activities, possibly because of higher system autonomy and coupling.

  5. A Simple Method for Automated Equilibration Detection in Molecular Simulations.

    PubMed

    Chodera, John D

    2016-04-12

    Molecular simulations intended to compute equilibrium properties are often initiated from configurations that are highly atypical of equilibrium samples, a practice which can generate a distinct initial transient in mechanical observables computed from the simulation trajectory. Traditional practice in simulation data analysis recommends this initial portion be discarded to equilibration, but no simple, general, and automated procedure for this process exists. Here, we suggest a conceptually simple automated procedure that does not make strict assumptions about the distribution of the observable of interest in which the equilibration time is chosen to maximize the number of effectively uncorrelated samples in the production timespan used to compute equilibrium averages. We present a simple Python reference implementation of this procedure and demonstrate its utility on typical molecular simulation data.
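    The selection rule described above is concrete enough to sketch: choose the discard point t₀ that maximizes the number of effectively uncorrelated samples Neff(t₀) = (N − t₀)/g, where g is the statistical inefficiency of the remaining data. The truncated-autocorrelation estimate of g below is a simple stand-in, not the paper's estimator, and the test series is synthetic.

```python
import math
import random

def statistical_inefficiency(x):
    """g = 1 + 2 * (sum of the normalized autocorrelation function),
    truncated at the first non-positive term."""
    n = len(x)
    mean = sum(x) / n
    d = [v - mean for v in x]
    var = sum(v * v for v in d) / n
    if var == 0:
        return 1.0
    g = 1.0
    for lag in range(1, n):
        c = sum(d[i] * d[i + lag] for i in range(n - lag)) / ((n - lag) * var)
        if c <= 0:
            break
        g += 2.0 * c
    return max(g, 1.0)

def detect_equilibration(series, stride=10):
    """Pick the discard point t0 maximizing Neff = (N - t0) / g(series[t0:])."""
    best_t0, best_neff = 0, 0.0
    for t0 in range(0, len(series) - 2, stride):
        neff = (len(series) - t0) / statistical_inefficiency(series[t0:])
        if neff > best_neff:
            best_t0, best_neff = t0, neff
    return best_t0, best_neff

# Synthetic observable: decaying initial transient plus noise.
rng = random.Random(0)
data = [5.0 * math.exp(-i / 20.0) + rng.gauss(0.0, 0.5) for i in range(400)]
t0, neff = detect_equilibration(data)
```

    The paper provides its own Python reference implementation with a more careful estimator of g; this sketch only mirrors the selection rule.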

  6. Communications among elements of a space construction ensemble

    NASA Technical Reports Server (NTRS)

    Davis, Randal L.; Grasso, Christopher A.

    1989-01-01

    Space construction projects will require careful coordination between managers, designers, manufacturers, operators, astronauts, and robots with large volumes of information of varying resolution, timeliness, and accuracy flowing between the distributed participants over computer communications networks. Within the CSC Operations Branch, we are researching the requirements and options for such communications. Based on our work to date, we feel that communications standards being developed by the International Standards Organization, the CCITT, and other groups can be applied to space construction. We are currently studying in depth how such standards can be used to communicate with robots and automated construction equipment used in a space project. Specifically, we are looking at how the Manufacturing Automation Protocol (MAP) and the Manufacturing Message Specification (MMS), which tie together computers and machines in automated factories, might be applied to space construction projects. Together with our CSC industrial partner Computer Technology Associates, we are developing a MAP/MMS companion standard for space construction and we will produce software to allow the MAP/MMS protocol to be used in our CSC operations testbed.

  7. A simple method for automated equilibration detection in molecular simulations

    PubMed Central

    Chodera, John D.

    2016-01-01

    Molecular simulations intended to compute equilibrium properties are often initiated from configurations that are highly atypical of equilibrium samples, a practice which can generate a distinct initial transient in mechanical observables computed from the simulation trajectory. Traditional practice in simulation data analysis recommends this initial portion be discarded to equilibration, but no simple, general, and automated procedure for this process exists. Here, we suggest a conceptually simple automated procedure that does not make strict assumptions about the distribution of the observable of interest, in which the equilibration time is chosen to maximize the number of effectively uncorrelated samples in the production timespan used to compute equilibrium averages. We present a simple Python reference implementation of this procedure, and demonstrate its utility on typical molecular simulation data. PMID:26771390

  8. Using Modeling and Simulation to Predict Operator Performance and Automation-Induced Complacency With Robotic Automation: A Case Study and Empirical Validation.

    PubMed

    Wickens, Christopher D; Sebok, Angelia; Li, Huiyang; Sarter, Nadine; Gacy, Andrew M

    2015-09-01

    The aim of this study was to develop and validate a computational model of the automation complacency effect, as operators work on a robotic arm task, supported by three different degrees of automation. Some computational models of complacency in human-automation interaction exist, but those are formed and validated within the context of fairly simplified monitoring failures. This research extends model validation to a much more complex task, so that system designers can establish, without need for human-in-the-loop (HITL) experimentation, merits and shortcomings of different automation degrees. We developed a realistic simulation of a space-based robotic arm task that could be carried out with three different levels of trajectory visualization and execution automation support. Using this simulation, we performed HITL testing. Complacency was induced via several trials of correctly performing automation and then was assessed on trials when automation failed. Following a cognitive task analysis of the robotic arm operation, we developed a multicomponent model of the robotic operator and his or her reliance on automation, based in part on visual scanning. The comparison of model predictions with empirical results revealed that the model accurately predicted routine performance and predicted the responses to these failures after complacency developed. However, the scanning models do not account for the entire attention allocation effects of complacency. Complacency modeling can provide a useful tool for predicting the effects of different types of imperfect automation. The results from this research suggest that focus should be given to supporting situation awareness in automation development. © 2015, Human Factors and Ergonomics Society.

  9. An innovative approach for modeling and simulation of an automated industrial robotic arm operated electro-pneumatically

    NASA Astrophysics Data System (ADS)

    Popa, L.; Popa, V.

    2017-08-01

    The article focuses on modeling an automated industrial robotic arm operated electro-pneumatically and on simulating the robotic arm's operation. The graphic language FBD (Function Block Diagram) is used to program the robotic arm on a Zelio Logic automation controller. The innovative modeling and simulation procedures address specific problems regarding the development of a new type of technical product in the field of robotics. Thus, new applications were identified for a Programmable Logic Controller (PLC) as a specialized computer performing control functions with a variety of high levels of complexity.

  10. Automation-induced monitoring inefficiency: role of display location.

    PubMed

    Singh, I L; Molloy, R; Parasuraman, R

    1997-01-01

    Operators can be poor monitors of automation if they are engaged concurrently in other tasks. However, in previous studies of this phenomenon the automated task was always presented in the periphery, away from the primary manual tasks that were centrally displayed. In this study we examined whether centrally locating an automated task would boost monitoring performance during a flight-simulation task consisting of system monitoring, tracking and fuel resource management sub-tasks. Twelve nonpilot subjects were required to perform the tracking and fuel management tasks manually while watching the automated system monitoring task for occasional failures. The automation reliability was constant at 87.5% for six subjects and variable (alternating between 87.5% and 56.25%) for the other six subjects. Each subject completed four 30 min sessions over a period of 2 days. In each automation reliability condition the automation routine was disabled for the last 20 min of the fourth session in order to simulate catastrophic automation failure (0 % reliability). Monitoring for automation failure was inefficient when automation reliability was constant but not when it varied over time, replicating previous results. Furthermore, there was no evidence of resource or speed accuracy trade-off between tasks. Thus, automation-induced failures of monitoring cannot be prevented by centrally locating the automated task.

  11. Automation-induced monitoring inefficiency: role of display location

    NASA Technical Reports Server (NTRS)

    Singh, I. L.; Molloy, R.; Parasuraman, R.

    1997-01-01

    Operators can be poor monitors of automation if they are engaged concurrently in other tasks. However, in previous studies of this phenomenon the automated task was always presented in the periphery, away from the primary manual tasks that were centrally displayed. In this study we examined whether centrally locating an automated task would boost monitoring performance during a flight-simulation task consisting of system monitoring, tracking and fuel resource management sub-tasks. Twelve nonpilot subjects were required to perform the tracking and fuel management tasks manually while watching the automated system monitoring task for occasional failures. The automation reliability was constant at 87.5% for six subjects and variable (alternating between 87.5% and 56.25%) for the other six subjects. Each subject completed four 30 min sessions over a period of 2 days. In each automation reliability condition the automation routine was disabled for the last 20 min of the fourth session in order to simulate catastrophic automation failure (0 % reliability). Monitoring for automation failure was inefficient when automation reliability was constant but not when it varied over time, replicating previous results. Furthermore, there was no evidence of resource or speed accuracy trade-off between tasks. Thus, automation-induced failures of monitoring cannot be prevented by centrally locating the automated task.

  12. Understanding reliance on automation: effects of error type, error distribution, age and experience

    PubMed Central

    Sanchez, Julian; Rogers, Wendy A.; Fisk, Arthur D.; Rovira, Ericka

    2015-01-01

    An obstacle detection task supported by “imperfect” automation was used with the goal of understanding the effects of automation error types and age on automation reliance. Sixty younger and sixty older adults interacted with a multi-task simulation of an agricultural vehicle (i.e. a virtual harvesting combine). The simulator included an obstacle detection task and a fully manual tracking task. A micro-level analysis provided insight into the way reliance patterns change over time. The results indicated that there are distinct patterns of reliance that develop as a function of error type. A prevalence of automation false alarms led participants to under-rely on the automation during alarm states while over relying on it during non-alarms states. Conversely, a prevalence of automation misses led participants to over-rely on automated alarms and under-rely on the automation during non-alarm states. Older adults adjusted their behavior according to the characteristics of the automation similarly to younger adults, although it took them longer to do so. The results of this study suggest the relationship between automation reliability and reliance depends on the prevalence of specific errors and on the state of the system. Understanding the effects of automation detection criterion settings on human-automation interaction can help designers of automated systems make predictions about human behavior and system performance as a function of the characteristics of the automation. PMID:25642142

  13. Understanding reliance on automation: effects of error type, error distribution, age and experience.

    PubMed

    Sanchez, Julian; Rogers, Wendy A; Fisk, Arthur D; Rovira, Ericka

    2014-03-01

    An obstacle detection task supported by "imperfect" automation was used with the goal of understanding the effects of automation error types and age on automation reliance. Sixty younger and sixty older adults interacted with a multi-task simulation of an agricultural vehicle (i.e. a virtual harvesting combine). The simulator included an obstacle detection task and a fully manual tracking task. A micro-level analysis provided insight into the way reliance patterns change over time. The results indicated that there are distinct patterns of reliance that develop as a function of error type. A prevalence of automation false alarms led participants to under-rely on the automation during alarm states while over relying on it during non-alarms states. Conversely, a prevalence of automation misses led participants to over-rely on automated alarms and under-rely on the automation during non-alarm states. Older adults adjusted their behavior according to the characteristics of the automation similarly to younger adults, although it took them longer to do so. The results of this study suggest the relationship between automation reliability and reliance depends on the prevalence of specific errors and on the state of the system. Understanding the effects of automation detection criterion settings on human-automation interaction can help designers of automated systems make predictions about human behavior and system performance as a function of the characteristics of the automation.

  14. The Living Cell as a Multi-agent Organisation: A Compositional Organisation Model of Intracellular Dynamics

    NASA Astrophysics Data System (ADS)

    Jonker, C. M.; Snoep, J. L.; Treur, J.; Westerhoff, H. V.; Wijngaards, W. C. A.

    Within the areas of Computational Organisation Theory and Artificial Intelligence, techniques have been developed to simulate and analyse dynamics within organisations in society. Usually these modelling techniques are applied to factories and to the internal organisation of their process flows, thus obtaining models of complex organisations at various levels of aggregation. The dynamics in living cells are often interpreted in terms of well-organised processes, a bacterium being considered a (micro)factory. This suggests that organisation modelling techniques may also benefit their analysis. Using the example of Escherichia coli, it is shown how agent-based organisational modelling techniques can indeed be used to simulate and analyse E. coli's intracellular dynamics. Exploiting the abstraction levels entailed by this perspective, a concise model is obtained that is readily simulated and analysed at the various levels of aggregation, yet shows the cell's essential dynamic patterns.

  15. Lattice Commissioning Strategy Simulation for the B Factory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, M.; Whittum, D.; Yan, Y.

    2011-08-26

    To prepare for the PEP-II turn on, we have studied one commissioning strategy with simulated lattice errors. Features such as difference and absolute orbit analysis and correction are discussed. To prepare for the commissioning of the PEP-II injection line and high energy ring (HER), we have developed a system for on-line orbit analysis by merging two existing codes: LEGO and RESOLVE. With the LEGO-RESOLVE system, we can study the problem of finding quadrupole alignment and beam position monitor (BPM) offset errors with simulated data. We have increased the speed and versatility of the orbit analysis process by using a command file written in a script language designed specifically for RESOLVE. In addition, we have interfaced the LEGO-RESOLVE system to the control system of the B-Factory. In this paper, we describe online analysis features of the LEGO-RESOLVE system and present examples of practical applications.

  16. The Center-TRACON Automation System: Simulation and field testing

    NASA Technical Reports Server (NTRS)

    Denery, Dallas G.; Erzberger, Heinz

    1995-01-01

    A new concept for air traffic management in the terminal area, implemented as the Center-TRACON Automation System, has been under development at NASA Ames in a cooperative program with the FAA since 1991. The development has been strongly influenced by concurrent simulation and field site evaluations. The role of simulation and field activities in the development process will be discussed. Results of recent simulation and field tests will be presented.

  17. Stages and levels of automation in support of space teleoperations.

    PubMed

    Li, Huiyang; Wickens, Christopher D; Sarter, Nadine; Sebok, Angelia

    2014-09-01

    This study examined the impact of stage of automation on the performance and perceived workload during simulated robotic arm control tasks in routine and off-nominal scenarios. Automation varies with respect to the stage of information processing it supports and its assigned level of automation. Making appropriate choices in terms of stages and levels of automation is critical to ensure robust joint system performance. To date, this issue has been empirically studied in domains such as aviation and medicine but not extensively in the context of space operations. A total of 36 participants played the role of a payload specialist and controlled a simulated robotic arm. Participants performed fly-to tasks with two types of automation (camera recommendation and trajectory control automation) of varying stage. Tasks were performed during routine scenarios and in scenarios in which either the trajectory control automation or a hazard avoidance automation failed. Increasing the stage of automation progressively improved performance and lowered workload when the automation was reliable, but incurred severe performance costs when the system failed. The results from this study support concerns about automation-induced complacency and automation bias when later stages of automation are introduced. The benefits of such automation are offset by the risk of catastrophic outcomes when system failures go unnoticed or become difficult to recover from. A medium stage of automation seems preferable as it provides sufficient support during routine operations and helps avoid potentially catastrophic outcomes in circumstances when the automation fails.

  18. Using Bayesian variable selection to analyze regular resolution IV two-level fractional factorial designs

    DOE PAGES

    Chipman, Hugh A.; Hamada, Michael S.

    2016-06-02

    Regular two-level fractional factorial designs have complete aliasing in which the associated columns of multiple effects are identical. Here, we show how Bayesian variable selection can be used to analyze experiments that use such designs. In addition to sparsity and hierarchy, Bayesian variable selection naturally incorporates heredity. This prior information is used to identify the most likely combinations of active terms. We also demonstrate the method on simulated and real experiments.
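
    The complete aliasing the abstract refers to is easy to see by construction. The sketch below (an illustration of the aliasing structure, not of the authors' Bayesian method) builds a 2^(4-1) resolution IV design from the generator D = ABC and checks that the AB and CD interaction columns coincide:

    ```python
    from itertools import product

    # Build the 2^(4-1) design: full factorial in A, B, C (coded -1/+1) with
    # generator D = A*B*C, giving defining relation I = ABCD (resolution IV).
    runs = [{"A": a, "B": b, "C": c, "D": a * b * c}
            for a, b, c in product((-1, 1), repeat=3)]

    def contrast(*factors):
        """Design-matrix column for a main effect or interaction."""
        col = []
        for run in runs:
            v = 1
            for f in factors:
                v *= run[f]
            col.append(v)
        return col

    # Complete aliasing: the AB and CD columns are identical, so least squares
    # alone cannot separate these effects -- prior information such as
    # sparsity, hierarchy, and heredity is what breaks the tie.
    assert contrast("A", "B") == contrast("C", "D")
    ```

    The same check succeeds for the other aliased pairs implied by I = ABCD, e.g. AC with BD and AD with BC.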

  19. Using Bayesian variable selection to analyze regular resolution IV two-level fractional factorial designs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chipman, Hugh A.; Hamada, Michael S.

    Regular two-level fractional factorial designs have complete aliasing in which the associated columns of multiple effects are identical. Here, we show how Bayesian variable selection can be used to analyze experiments that use such designs. In addition to sparsity and hierarchy, Bayesian variable selection naturally incorporates heredity. This prior information is used to identify the most likely combinations of active terms. We also demonstrate the method on simulated and real experiments.

  20. One of My Favorite Assignments: Automated Teller Machine Simulation.

    ERIC Educational Resources Information Center

    Oberman, Paul S.

    2001-01-01

    Describes an assignment for an introductory computer science class that requires the student to write a software program that simulates an automated teller machine. Highlights include an algorithm for the assignment; sample file contents; language features used; assignment variations; and discussion points. (LRW)
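
    A minimal sketch of such an assignment is shown below. The account numbers, method names, and cents-based balances are hypothetical choices for illustration, not the article's actual specification:

    ```python
    class ATM:
        """Toy automated teller machine simulation."""

        def __init__(self, accounts):
            self.accounts = dict(accounts)  # account id -> balance in cents

        def balance(self, acct):
            return self.accounts[acct]

        def deposit(self, acct, amount):
            if amount <= 0:
                raise ValueError("deposit must be positive")
            self.accounts[acct] += amount
            return self.accounts[acct]

        def withdraw(self, acct, amount):
            if amount <= 0:
                raise ValueError("withdrawal must be positive")
            if amount > self.accounts[acct]:
                raise ValueError("insufficient funds")
            self.accounts[acct] -= amount
            return self.accounts[acct]

    # A short session: open with $50.00, deposit $25.00, withdraw $10.00.
    atm = ATM({"1001": 50_00})
    atm.deposit("1001", 25_00)
    atm.withdraw("1001", 10_00)
    ```

    Typical assignment variations mentioned in such exercises (file-backed account data, transaction logs, PIN checks) would extend this skeleton.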

  1. Comparing effects of fire modeling methods on simulated fire patterns and succession: a case study in the Missouri Ozarks

    Treesearch

    Jian Yang; Hong S. He; Brian R. Sturtevant; Brian R. Miranda; Eric J. Gustafson

    2008-01-01

    We compared four fire spread simulation methods (completely random, dynamic percolation, size-based minimum travel time algorithm, and duration-based minimum travel time algorithm) and two fire occurrence simulation methods (Poisson fire frequency model and hierarchical fire frequency model) using a two-way factorial design. We examined these treatment effects on...

  2. Effects of automation of information-processing functions on teamwork.

    PubMed

    Wright, Melanie C; Kaber, David B

    2005-01-01

    We investigated the effects of automation as applied to different stages of information processing on team performance in a complex decision-making task. Forty teams of 2 individuals performed a simulated Theater Defense Task. Four automation conditions were simulated with computer assistance applied to realistic combinations of information acquisition, information analysis, and decision selection functions across two levels of task difficulty. Multiple measures of team effectiveness and team coordination were used. Results indicated different forms of automation have different effects on teamwork. Compared with a baseline condition, an increase in automation of information acquisition led to an increase in the ratio of information transferred to information requested; an increase in automation of information analysis resulted in higher team coordination ratings; and automation of decision selection led to better team effectiveness under low levels of task difficulty but at the cost of higher workload. The results support the use of early and intermediate forms of automation related to acquisition and analysis of information in the design of team tasks. Decision-making automation may provide benefits in more limited contexts. Applications of this research include the design and evaluation of automation in team environments.

  3. Responses of southeast Alaska understory species to variation in light and simulated herbivory

    Treesearch

    Thomas A. Hanley; Jeffrey C. Barnard

    2014-01-01

    Responses to variation in light intensity, simulated herbivory by clipping, and their interaction were studied over three seasons in a factorial experimental design. Six major species of southeast Alaska forest understories were studied, each as a separate experiment: bunchberry, Cornus canadensis L.; threeleaf foamflower, Tiarella...

  4. Skin diseases in workers at a perfume factory.

    PubMed

    Schubert, Hans-Jürgen

    2006-08-01

    The aim of this study was to determine the causes of skin diseases affecting one-third of the staff of a perfume factory in which 10 different perfume sprays were manufactured. Methods included site inspection, dermatological examination, and patch testing of all 26 persons at risk with 4 perfume oils and 30 of their ingredients. Six bottlers were found to be suffering from allergic contact dermatitis and 2 from irritant contact dermatitis, while 12 workers showed strong reactions of varying degree to various fragrances. The main causes of allergic contact dermatitis were 2 perfume oils (12 cases) and their ingredients geraniol (12 cases), benzaldehyde (9), cinnamic aldehyde (6), linalool, neroli oil, and terpenes of lemon oil and orange oil (4 each). Nobody tested positive to balsam of Peru. Job changes for office workers, packers, or printers to other rooms, where they no longer had contact with fragrances, led to resolution of symptoms. In conclusion, automation, replacement of glass bottles by cartridges made of non-fragile materials, and the use of gloves may minimize the risk.

  5. The eldercare factory.

    PubMed

    Sharkey, Noel; Sharkey, Amanda

    2012-01-01

    Rapid advances in service robotics together with dramatic shifts in population demographics have led to the notion that technology may be the answer to our eldercare problems. Robots are being developed for feeding, washing, lifting, carrying and mobilising the elderly as well as monitoring their health. They are also being proposed as a substitute for companionship. While these technologies could accrue major benefits for society and empower the elderly, we must balance their use with the ethical costs. These include a potential reduction in human contact, increased feeling of objectification and loss of control, loss of privacy and personal freedom as well as deception and infantilisation. With appropriate guidelines in place before the introduction of robots en masse into the care system, robots could improve the lives of the elderly, reducing their dependence and creating more opportunities for social interaction. Without forethought, the elderly may find themselves in a barren world of machines, a world of automated care: a factory for the elderly. Copyright © 2011 S. Karger AG, Basel.

  6. Development of Moire machine vision

    NASA Technical Reports Server (NTRS)

    Harding, Kevin G.

    1987-01-01

    Three dimensional perception is essential to the development of versatile robotics systems in order to handle complex manufacturing tasks in future factories and in providing high accuracy measurements needed in flexible manufacturing and quality control. A program is described which will develop the potential of Moire techniques to provide this capability in vision systems and automated measurements, and demonstrate artificial intelligence (AI) techniques to take advantage of the strengths of Moire sensing. Moire techniques provide a means of optically manipulating the complex visual data in a three dimensional scene into a form which can be easily and quickly analyzed by computers. This type of optical data manipulation provides high productivity through integrated automation, producing a high quality product while reducing computer and mechanical manipulation requirements and thereby the cost and time of production. This nondestructive evaluation is developed to be able to make full field range measurement and three dimensional scene analysis.

  7. Automated system for definition of life-cycle resources of electromechanical equipment

    NASA Astrophysics Data System (ADS)

    Zhukovskiy, Y.; Koteleva, N.

    2017-02-01

    The frequency of maintenance of electromechanical equipment depends on the plant that uses and runs this equipment. Very often the maintenance frequency is poorly correlated with the actual state of the electromechanical equipment. Furthermore, traditional methods of diagnosis sometimes cannot work without stopping the process (for example, for equipment located in hard-to-reach places), and so maintenance costs are increased. This problem can be solved using indirect methods of diagnosing electromechanical equipment. The indirect methods often use parameters available in real time and seldom use the parameters of traditional diagnostic methods to determine the life-cycle resources of electromechanical equipment. This article is dedicated to developing the structure of a special automated control system. This system must use the large flow of information about the direct and indirect parameters of equipment state, gathered from plants in different areas of industry and from factories that produce the electromechanical equipment.

  8. Development of Moire machine vision

    NASA Astrophysics Data System (ADS)

    Harding, Kevin G.

    1987-10-01

    Three dimensional perception is essential to the development of versatile robotics systems in order to handle complex manufacturing tasks in future factories and in providing high accuracy measurements needed in flexible manufacturing and quality control. A program is described which will develop the potential of Moire techniques to provide this capability in vision systems and automated measurements, and demonstrate artificial intelligence (AI) techniques to take advantage of the strengths of Moire sensing. Moire techniques provide a means of optically manipulating the complex visual data in a three dimensional scene into a form which can be easily and quickly analyzed by computers. This type of optical data manipulation provides high productivity through integrated automation, producing a high quality product while reducing computer and mechanical manipulation requirements and thereby the cost and time of production. This nondestructive evaluation is developed to be able to make full field range measurement and three dimensional scene analysis.

  9. Situation Awareness and Levels of Automation: Empirical Assessment of Levels of Automation in the Commercial Cockpit

    NASA Technical Reports Server (NTRS)

    Kaber, David B.; Schutte, Paul C. (Technical Monitor)

    2000-01-01

    This report has been prepared to close out a NASA grant to Mississippi State University (MSU) for research into situation awareness (SA) and automation in the advanced commercial aircraft cockpit. The grant was divided into two obligations, including $60,000 for the period from May 11, 2000 to December 25, 2000. The information presented in this report summarizes work completed through this obligation. It also details work to be completed with the balance of the current obligation and unobligated funds amounting to $50,043, which are to be granted to North Carolina State University for completion of the research project from July 31, 2000 to May 10, 2001. This research was to involve investigation of the effects of a broad spectrum of degrees of automation of complex systems on human-machine performance and SA. The work was to empirically assess the effect of theoretical levels of automation (LOAs), described in a taxonomy developed by Endsley & Kaber (1999), on naive and experienced subject performance and SA in simulated flight tasks. The study was to be conducted in the context of a realistic simulation of aircraft flight control. The objective of this work was to identify LOAs that effectively integrate humans and machines under normal operating conditions and failure modes. In general, the work was to provide insight into the design of automation in the commercial aircraft cockpit. Both laboratory and field investigations were to be conducted. At this point in time, a high-fidelity flight simulator of the McDonnell Douglas MD-11 aircraft has been completed. The simulator integrates a reconfigurable flight simulator developed by the Georgia Institute of Technology and stand-alone simulations of MD-11 autoflight systems developed at MSU. Use of the simulator has been integrated into a study plan for the laboratory research, and it is expected that the simulator will also be used in the field study with actual commercial pilots.
In addition to the flight simulator, an electronic version of the Situation Awareness Global Assessment Technique (SAGAT) has been completed for measuring commercial pilot SA in flight tasks. The SAGAT is to be used in both the lab and field studies. Finally, the lab study has been designed and subjects have been recruited for participation in experiments. This study will investigate the effects of five levels of automation, described in Endsley & Kaber's (1999) taxonomy and applied to the MD-11 autoflight system, on private pilot performance and SA in basic flight tasks by using the MD-11 simulator. The field study remains to be planned and executed.

  10. Impact of Automation on Drivers' Performance in Agricultural Semi-Autonomous Vehicles.

    PubMed

    Bashiri, B; Mann, D D

    2015-04-01

    Drivers' inadequate mental workload has been reported as one of the negative effects of driving assistant systems and in-vehicle automation. The increasing trend of automation in agricultural vehicles raises some concerns about drivers' mental workload in such vehicles. Thus, a human factors perspective is needed to identify the consequences of such automated systems. In this simulator study, the effects of vehicle steering task automation (VSTA) and implement control and monitoring task automation (ICMTA) were investigated using a tractor-air seeder system as a case study. Two performance parameters (reaction time and accuracy of actions) were measured to assess drivers' perceived mental workload. Experiments were conducted using the tractor driving simulator (TDS) located in the Agricultural Ergonomics Laboratory at the University of Manitoba. Study participants were university students with tractor driving experience. According to the results, reaction time and number of errors made by drivers both decreased as the automation level increased. Correlations were found among performance parameters and subjective mental workload reported by the drivers.

  11. Evaluating an automated haptic simulator designed for veterinary students to learn bovine rectal palpation.

    PubMed

    Baillie, Sarah; Crossan, Andrew; Brewster, Stephen A; May, Stephen A; Mellor, Dominic J

    2010-10-01

    Simulators provide a potential solution to some of the challenges faced when teaching internal examinations to medical or veterinary students. A virtual reality simulator, the Haptic Cow, has been developed to teach bovine rectal palpation to veterinary students, and significant training benefits have been demonstrated. However, the training needs to be delivered by an instructor, a requirement that limits availability. This article describes the development and evaluation of an automated version that students could use on their own. An automated version was developed based on a recording of an expert's examination. The performance of two groups of eight students was compared. All students had undergone the traditional training in the course, namely lectures and laboratory practicals, and then group S used the simulator whereas group R had no additional training. The students were set the task of finding the uterus when examining cows. The simulator was then made available to students, and feedback about the "usability" was gathered with a questionnaire. The group whose training had been supplemented with a simulator session were significantly better at finding the uterus. The questionnaire feedback was positive and indicated that students found the simulator easy to use. The automated simulator equipped students with useful skills for examining cows. In addition, a simulator that does not need the presence of an instructor will increase the availability of training for students and be a more sustainable option for institutions.

  12. Automated Seat Cushion for Pressure Ulcer Prevention Using Real-Time Mapping, Offloading, and Redistribution of Interface Pressure

    DTIC Science & Technology

    2016-10-01

    The bubble actuator's mechanical behavior at varying loads and internal pressures was characterized both by experimental testing and by finite element simulation. Automation and control testing has been completed on a 5x5 array of bubble actuators to verify pressure... A finite element (FE) model of the bubble actuator was developed in the commercial software ANSYS in order to determine the deformation of the

  13. Rapid Automated Aircraft Simulation Model Updating from Flight Data

    NASA Technical Reports Server (NTRS)

    Brian, Geoff; Morelli, Eugene A.

    2011-01-01

    Techniques to identify aircraft aerodynamic characteristics from flight measurements and compute corrections to an existing simulation model of a research aircraft were investigated. The purpose of the research was to develop a process enabling rapid automated updating of aircraft simulation models using flight data and apply this capability to all flight regimes, including flight envelope extremes. The process presented has the potential to improve the efficiency of envelope expansion flight testing, revision of control system properties, and the development of high-fidelity simulators for pilot training.
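
    One elementary form of such a model correction can be sketched as a least-squares fit of the model-vs-flight residual. This is only a toy illustration under invented numbers (the coefficient values, bias, and noise level below are hypothetical), not the authors' actual identification process:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic "flight" data: the aircraft's true pitching-moment coefficient
    # differs from the baseline simulation model by a bias and a slope error
    # in angle of attack (all values hypothetical, for illustration only).
    alpha = rng.uniform(-5.0, 10.0, 200)                  # angle of attack, deg
    cm_sim = -0.020 * alpha                               # baseline model output
    cm_flight = -0.025 * alpha + 0.010 + rng.normal(0.0, 0.002, 200)

    # Identify an additive correction d0 + d1*alpha from the residual
    # between flight measurements and the existing simulation model.
    X = np.column_stack([np.ones_like(alpha), alpha])
    d0, d1 = np.linalg.lstsq(X, cm_flight - cm_sim, rcond=None)[0]

    cm_updated = cm_sim + d0 + d1 * alpha                 # corrected model output
    ```

    Because the correction is estimated rather than hand-tuned, the same fit can be re-run automatically as new flight data arrive, which is the spirit of the rapid updating process the abstract describes.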

  14. Distributed dynamic simulations of networked control and building performance applications.

    PubMed

    Yahiaoui, Azzedine

    2018-02-01

    The use of computer-based automation and control systems for smart sustainable buildings, often called Automated Buildings (ABs), has become an effective way to automatically control, optimize, and supervise a wide range of building performance applications over a network while achieving the minimum possible energy consumption; this approach is generally referred to as the Building Automation and Control Systems (BACS) architecture. Instead of costly and time-consuming experiments, this paper focuses on using distributed dynamic simulations to analyze the real-time performance of network-based building control systems in ABs and improve the functions of the BACS technology. The paper also presents the development and design of a distributed dynamic simulation environment capable of representing the BACS architecture in simulation by run-time coupling of two or more different software tools over a network. The application and capability of this new dynamic simulation environment are demonstrated by an experimental design in this paper.
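
    The run-time coupling idea can be illustrated in-process with two toy models that exchange values every timestep; in a real BACS co-simulation that exchange would cross a network between separate tools. All physical constants below are invented round numbers, not from the paper:

    ```python
    # Two coupled "tools": a lumped-capacitance room thermal model and a
    # simple on/off thermostat, exchanging temperature and heating power
    # once per timestep (the coupling point of the co-simulation).
    def room_step(T, power, dt=60.0, T_out=5.0):
        C, UA = 1.0e6, 50.0       # heat capacity J/K, loss coefficient W/K (toy)
        return T + dt * (power - UA * (T - T_out)) / C

    def controller_step(T, setpoint=21.0, power=3000.0):
        return power if T < setpoint else 0.0   # bang-bang thermostat

    T, history = 15.0, []
    for _ in range(600):          # 10 simulated hours at 60 s steps
        p = controller_step(T)    # value that would be sent over the network
        T = room_step(T, p)       # value returned to the controller
        history.append(T)
    ```

    Even this lockstep scheme shows why timing matters: each model only sees the other's state from the previous exchange, which is exactly the kind of network-induced delay the distributed simulation environment is built to study.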

  15. Building Airport Surface HITL Simulation Capability

    NASA Technical Reports Server (NTRS)

    Chinn, Fay Cherie

    2016-01-01

    FutureFlight Central (FFC) is a high-fidelity, real-time simulator designed to study surface operations and automation. As an air traffic control tower simulator, FFC allows stakeholders such as the FAA, controllers, pilots, airports, and airlines to develop and test advanced surface and terminal-area concepts and automation, including NextGen and beyond automation concepts and tools. These technologies will improve the safety, capacity, and environmental issues facing the National Airspace System. FFC also has extensive video streaming capabilities, which, combined with its 3-D database capability, make the facility ideal for any research needing an immersive virtual and/or video environment. FutureFlight Central allows human-in-the-loop testing, which accommodates human interactions and errors, giving a more complete picture than fast-time simulations. This presentation describes FFC's capabilities and the components necessary to build an airport surface human-in-the-loop simulation capability.

  16. Distributed dynamic simulations of networked control and building performance applications

    PubMed Central

    Yahiaoui, Azzedine

    2017-01-01

    The use of computer-based automation and control systems for smart sustainable buildings, often called Automated Buildings (ABs), has become an effective way to automatically control, optimize, and supervise a wide range of building performance applications over a network while achieving the minimum possible energy consumption; this approach is generally referred to as the Building Automation and Control Systems (BACS) architecture. Instead of costly and time-consuming experiments, this paper focuses on using distributed dynamic simulations to analyze the real-time performance of network-based building control systems in ABs and improve the functions of the BACS technology. The paper also presents the development and design of a distributed dynamic simulation environment capable of representing the BACS architecture in simulation by run-time coupling of two or more different software tools over a network. The application and capability of this new dynamic simulation environment are demonstrated by an experimental design in this paper. PMID:29568135

  17. Experimental Evaluation of an Integrated Datalink and Automation-Based Strategic Trajectory Concept

    NASA Technical Reports Server (NTRS)

    Mueller, Eric

    2007-01-01

    This paper presents research on the interoperability of trajectory-based automation concepts and technologies with the modern Flight Management Systems and datalink communication available on many of today's commercial aircraft. A tight integration of trajectory-based ground automation systems with the aircraft Flight Management System through datalink will enable mid-term and far-term benefits from trajectory-based automation methods. A two-way datalink connection between the trajectory-based automation resident in the Center/TRACON Automation System and the Future Air Navigation System-1 integrated FMS/datalink in the NASA Ames B747-400 Level D simulator has been established, and extensive simulation of the use of datalink messages to generate strategic trajectories has been completed. A strategic trajectory is defined as an aircraft deviation needed to solve a conflict or honor a route request that then merges the aircraft back to its nominal preferred trajectory using a single continuous trajectory clearance. Engineers on the ground side of the datalink generated lateral and vertical trajectory clearances and transmitted them to the Flight Management System of the 747; the airborne automation then flew the new trajectory without human intervention, requiring the flight crew only to review and accept the trajectory. This simulation established the protocols needed for a significant majority of the trajectory change types required to solve a traffic conflict or deviate around weather. This demonstration provides a basis for understanding the requirements for integration of trajectory-based automation with current Flight Management Systems and datalink to support future National Airspace System operations.

  18. The combination of simulation and response methodology and its application in an aggregate production plan

    NASA Astrophysics Data System (ADS)

    Chen, Zhiming; Feng, Yuncheng

    1988-08-01

    This paper describes an algorithmic structure for combining simulation and optimization techniques, both in theory and in practice. Response surface methodology is used to optimize the decision variables in the simulation environment. Simulation-optimization software has been developed and successfully implemented, and its application to an aggregate production planning simulation-optimization model is reported. The model's objective is to minimize the production cost and to generate an optimal production plan and inventory control strategy for an aircraft factory.
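
    The response-surface step can be sketched as follows: sample the decision variables, fit a second-order surface to the noisy simulation output, and solve for the stationary point. The cost function here is a stand-in with a known optimum at (2, -1), not the paper's production model:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_cost(x1, x2):
        """Stand-in for the production-plan simulation: noisy cost response."""
        return (x1 - 2.0) ** 2 + 2.0 * (x2 + 1.0) ** 2 + rng.normal(0.0, 0.1)

    # Grid sample of the two decision variables around the region of interest.
    pts = [(x1, x2) for x1 in np.linspace(0, 4, 5) for x2 in np.linspace(-3, 1, 5)]
    y = np.array([simulate_cost(x1, x2) for x1, x2 in pts])

    # Fit the response surface y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2.
    X = np.array([[1, x1, x2, x1 * x1, x2 * x2, x1 * x2] for x1, x2 in pts])
    b = np.linalg.lstsq(X, y, rcond=None)[0]

    # Stationary point of the fitted quadratic: set its gradient to zero.
    A = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
    x_opt = np.linalg.solve(A, -np.array([b[1], b[2]]))
    ```

    In a full simulation-optimization loop, the surface would be refit around `x_opt` and the procedure repeated until the plan stops improving.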

  19. A distribution-free multi-factorial profiler for harvesting information from high-density screenings.

    PubMed

    Besseris, George J

    2013-01-01

    Data screening is an indispensable phase in initiating the scientific discovery process. Fractional factorial designs offer quick and economical options for engineering highly-dense structured datasets. Maximum information content is harvested when a selected fractional factorial scheme is driven to saturation while data gathering is suppressed to no replication. A novel multi-factorial profiler is presented that allows screening of saturated-unreplicated designs by decomposing the examined response to its constituent contributions. Partial effects are sliced off systematically from the investigated response to form individual contrasts using simple robust measures. By isolating, one at a time, the disturbance attributed solely to a single controlling factor, the Wilcoxon-Mann-Whitney rank statistics are employed to assign significance. We demonstrate that the proposed profiler possesses its own self-checking mechanism for detecting a potential influence due to fluctuations attributed to the remaining unexplainable error. Main benefits of the method are: 1) easy to grasp, 2) well-explained test-power properties, 3) distribution-free, 4) sparsity-free, 5) calibration-free, 6) simulation-free, 7) easy to implement, and 8) expanded usability to any type and size of multi-factorial screening designs. The method is elucidated with a benchmarked profiling effort for a water filtration process.
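
    The rank comparison at the heart of such a profiler can be sketched with a hand-rolled Mann-Whitney U statistic applied to one factor's contrast. The response values below are illustrative numbers, not data from the paper:

    ```python
    def mann_whitney_u(xs, ys):
        """U statistic: number of (x, y) pairs with x > y; ties count half."""
        u = 0.0
        for x in xs:
            for y in ys:
                if x > y:
                    u += 1.0
                elif x == y:
                    u += 0.5
        return u

    # Contrast for one factor of a screening design: responses at the
    # factor's high setting versus its low setting.
    high = [8.2, 7.9, 8.5, 8.1]
    low = [6.4, 6.9, 6.1, 6.6]
    u = mann_whitney_u(high, low)
    # All 16 pairs favor "high": the most extreme U possible for two samples
    # of four, which a rank test would flag as an active factor.
    ```

    Because U depends only on ranks, the comparison is distribution-free, which is the property the abstract emphasizes.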

  20. A Distribution-Free Multi-Factorial Profiler for Harvesting Information from High-Density Screenings

    PubMed Central

    Besseris, George J.

    2013-01-01

    Data screening is an indispensable phase in initiating the scientific discovery process. Fractional factorial designs offer quick and economical options for engineering highly-dense structured datasets. Maximum information content is harvested when a selected fractional factorial scheme is driven to saturation while data gathering is suppressed to no replication. A novel multi-factorial profiler is presented that allows screening of saturated-unreplicated designs by decomposing the examined response to its constituent contributions. Partial effects are sliced off systematically from the investigated response to form individual contrasts using simple robust measures. By isolating, one at a time, the disturbance attributed solely to a single controlling factor, the Wilcoxon-Mann-Whitney rank statistics are employed to assign significance. We demonstrate that the proposed profiler possesses its own self-checking mechanism for detecting a potential influence due to fluctuations attributed to the remaining unexplainable error. Main benefits of the method are: 1) easy to grasp, 2) well-explained test-power properties, 3) distribution-free, 4) sparsity-free, 5) calibration-free, 6) simulation-free, 7) easy to implement, and 8) expanded usability to any type and size of multi-factorial screening designs. The method is elucidated with a benchmarked profiling effort for a water filtration process. PMID:24009744

  1. Automated parameterization of intermolecular pair potentials using global optimization techniques

    NASA Astrophysics Data System (ADS)

    Krämer, Andreas; Hülsmann, Marco; Köddermann, Thorsten; Reith, Dirk

    2014-12-01

    In this work, different global optimization techniques are assessed for the automated development of molecular force fields, as used in molecular dynamics and Monte Carlo simulations. The quest of finding suitable force field parameters is treated as a mathematical minimization problem. Intricate problem characteristics such as extremely costly and even abortive simulations, noisy simulation results, and especially multiple local minima naturally lead to the use of sophisticated global optimization algorithms. Five diverse algorithms (pure random search, recursive random search, CMA-ES, differential evolution, and taboo search) are compared to our own tailor-made solution named CoSMoS. CoSMoS is an automated workflow. It models the parameters' influence on the simulation observables to detect a globally optimal set of parameters. It is shown how and why this approach is superior to other algorithms. Applied to suitable test functions and simulations for phosgene, CoSMoS effectively reduces the number of required simulations and real time for the optimization task.
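
    Pure random search, the simplest of the five algorithms compared, is easy to sketch against a noisy multimodal stand-in objective. The function below is an invented surrogate with many local minima and a global minimum near the origin, not an actual force-field calibration:

    ```python
    import math
    import random

    random.seed(42)

    def noisy_objective(p):
        """Stand-in for a force-field quality score: multimodal plus noise."""
        x, y = p
        # Sine terms create many local minima; global minimum near (0, 0).
        return x * x + y * y + 3.0 * abs(math.sin(3 * x) * math.sin(3 * y)) \
            + random.gauss(0.0, 0.05)

    def pure_random_search(obj, bounds, n=5000):
        """Sample uniformly within bounds, keep the best point seen."""
        best_p, best_f = None, float("inf")
        for _ in range(n):
            p = tuple(random.uniform(lo, hi) for lo, hi in bounds)
            f = obj(p)
            if f < best_f:
                best_p, best_f = p, f
        return best_p, best_f

    p, f = pure_random_search(noisy_objective, [(-2.0, 2.0), (-2.0, 2.0)])
    ```

    Each sample here stands in for a full molecular simulation, which is why the abstract's point about reducing the number of required simulations matters: smarter samplers such as CMA-ES or the authors' CoSMoS aim to find the same basin with far fewer evaluations.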

  2. Automating NEURON Simulation Deployment in Cloud Resources.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2017-01-01

    Simulations in neuroscience are performed on local servers or High Performance Computing (HPC) facilities. Recently, cloud computing has emerged as a potential computational platform for neuroscience simulation. In this paper we compare and contrast HPC and cloud resources for scientific computation, then report how we deployed NEURON, a widely used simulator of neuronal activity, in three clouds: Chameleon Cloud, a hybrid private academic cloud for cloud technology research based on the OpenStack software; Rackspace, a public commercial cloud, also based on OpenStack; and Amazon Elastic Cloud Computing, based on Amazon's proprietary software. We describe the manual procedures and how to automate cloud operations. We describe extending our simulation automation software called NeuroManager (Stockton and Santamaria, Frontiers in Neuroinformatics, 2015), so that the user is capable of recruiting private cloud, public cloud, HPC, and local servers simultaneously with a simple common interface. We conclude by performing several studies in which we examine speedup, efficiency, total session time, and cost for sets of simulations of a published NEURON model.

  3. Automating NEURON Simulation Deployment in Cloud Resources

    PubMed Central

    Santamaria, Fidel

    2016-01-01

    Simulations in neuroscience are performed on local servers or High Performance Computing (HPC) facilities. Recently, cloud computing has emerged as a potential computational platform for neuroscience simulation. In this paper we compare and contrast HPC and cloud resources for scientific computation, then report how we deployed NEURON, a widely used simulator of neuronal activity, in three clouds: Chameleon Cloud, a hybrid private academic cloud for cloud technology research based on the OpenStack software; Rackspace, a public commercial cloud, also based on OpenStack; and Amazon Elastic Cloud Computing, based on Amazon's proprietary software. We describe the manual procedures and how to automate cloud operations. We describe extending our simulation automation software called NeuroManager (Stockton and Santamaria, Frontiers in Neuroinformatics, 2015), so that the user is capable of recruiting private cloud, public cloud, HPC, and local servers simultaneously with a simple common interface. We conclude by performing several studies in which we examine speedup, efficiency, total session time, and cost for sets of simulations of a published NEURON model. PMID:27655341

  4. Data management and database framework for the MICE experiment

    NASA Astrophysics Data System (ADS)

    Martyniak, J.; Nebrensky, J. J.; Rajaram, D.; MICE Collaboration

    2017-10-01

    The international Muon Ionization Cooling Experiment (MICE), currently operating at the Rutherford Appleton Laboratory in the UK, is designed to demonstrate the principle of muon ionization cooling for application to a future Neutrino Factory or Muon Collider. We present the status of the framework for the movement and curation of both raw and reconstructed data. A raw data-mover has been designed to safely upload data files onto permanent tape storage as soon as they have been written out. The process has been automated, and checks have been built in to ensure the integrity of data at every stage of the transfer. The data processing framework has been recently redesigned in order to provide fast turnaround of reconstructed data for analysis. The automated reconstruction is performed on a dedicated machine in the MICE control room and any reprocessing is done at Tier-2 Grid sites. In conjunction with this redesign, a new reconstructed-data-mover has been designed and implemented. We also review the implementation of a robust database system that has been designed for MICE. The processing of data, whether raw or Monte Carlo, requires accurate knowledge of the experimental conditions. MICE has several complex elements ranging from beamline magnets to particle identification detectors to superconducting magnets. A Configuration Database, which contains information about the experimental conditions (magnet currents, absorber material, detector calibrations, etc.) at any given time, has been developed to ensure accurate and reproducible simulation and reconstruction. A fully replicated, hot-standby database system has been implemented with a firewall-protected read-write master running in the control room, and a read-only slave running at a different location. The actual database is hidden from end users by a Web Service layer, which provides platform- and programming-language-independent access to the data.

  5. An Examination of Sampling Characteristics of Some Analytic Factor Transformation Techniques.

    ERIC Educational Resources Information Center

    Skakun, Ernest N.; Hakstian, A. Ralph

    Two population raw data matrices were constructed by computer simulation techniques. Each consisted of 10,000 subjects and 12 variables, and each was constructed according to an underlying factorial model consisting of four major common factors, eight minor common factors, and 12 unique factors. The computer simulation techniques were employed to…
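A population raw data matrix of this kind could be generated along the following lines; this sketch keeps only the four major common factors and twelve unique factors, and the loading values are illustrative, not those of the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_vars, n_common = 10_000, 12, 4

# Illustrative loading matrix: each variable loads 0.7 on exactly one
# of the four major common factors (three variables per factor).
loadings = np.zeros((n_vars, n_common))
for v in range(n_vars):
    loadings[v, v % n_common] = 0.7

# Unique-factor weights chosen so every variable has unit variance.
uniqueness = np.sqrt(1.0 - (loadings ** 2).sum(axis=1))

common = rng.standard_normal((n_subjects, n_common))  # common factor scores
unique = rng.standard_normal((n_subjects, n_vars))    # unique factor scores
data = common @ loadings.T + unique * uniqueness      # population raw data
```

Factoring `data` should then recover loadings close to the ones it was built from, which is what makes simulated matrices useful for studying transformation techniques.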

  6. System Operations Studies for Automated Guideway Transit Systems : Discrete Event Simulation Model Programmer's Manual

    DOT National Transportation Integrated Search

    1982-07-01

    In order to examine specific automated guideway transit (AGT) developments and concepts, UMTA undertook a program of studies and technology investigations called Automated Guideway Transit Technology (AGTT) Program. The objectives of one segment of t...

  7. Simulation Environment Synchronizing Real Equipment for Manufacturing Cell

    NASA Astrophysics Data System (ADS)

    Inukai, Toshihiro; Hibino, Hironori; Fukuda, Yoshiro

Recently, manufacturing industries have faced various problems, such as shorter product life cycles and more diversified customer needs. In this situation, it is very important to reduce the lead-time of manufacturing system construction. At the manufacturing system implementation stage, it is important to make and evaluate facility control programs for a manufacturing cell, such as ladder programs for programmable logic controllers (PLCs), rapidly. However, no methods have been available to evaluate the facility control programs for the equipment before the manufacturing systems are implemented, while mixing and synchronizing real equipment and virtual factory models on computers. This difficulty is caused by the complexity of a manufacturing system composed of a great variety of equipment, and it has prevented precise and rapid support of the manufacturing engineering process. In this paper, a manufacturing engineering environment (MEE) to support manufacturing engineering processes using simulation technologies is proposed. MEE consists of a manufacturing cell simulation environment (MCSE) and a distributed simulation environment (DSE). MCSE, which consists of a manufacturing cell simulator and a soft-wiring system, is described in detail. MCSE enables making and evaluating facility control programs using virtual factory models on computers before manufacturing systems are implemented.

  8. Intelligent robot trends and predictions for the .net future

    NASA Astrophysics Data System (ADS)

    Hall, Ernest L.

    2001-10-01

An intelligent robot is a remarkably useful combination of a manipulator, sensors and controls. The use of these machines in factory automation can improve productivity, increase product quality and improve competitiveness. This paper presents a discussion of recent and future technical and economic trends. During the past twenty years the use of industrial robots that are equipped not only with precise motion control systems but also with sensors such as cameras, laser scanners, or tactile sensors that permit adaptation to a changing environment has increased dramatically. Intelligent robot products have been developed in many cases for factory automation and for some hospital and home applications. To reach an even higher degree of application, the addition of learning may be required. Recently, learning theories such as the adaptive critic have been proposed. In this type of learning, a critic provides a grade to the controller of an action module such as a robot. The adaptive critic is a good model for human learning. In general, the critic may be considered to be the human with the teach pendant, plant manager, line supervisor, quality inspector or the consumer. If the ultimate critic is the consumer, then the quality inspector must model the consumer's decision-making process and use this model in the design and manufacturing operations. Can the adaptive critic be used to advance intelligent robots? Intelligent robots have historically taken decades to be developed and reduced to practice. Methods for speeding this development include technologies such as rapid prototyping and product development, and government, industry and university cooperation.

  9. Developing Automated Feedback Materials for a Training Simulator: An Interaction between Users and Researchers.

    ERIC Educational Resources Information Center

    Shlechter, Theodore M.; And Others

    This paper focuses upon the research and development (R&D) process associated with developing automated feedback materials for the SIMulation NETworking (SIMNET) training system. This R&D process involved a partnership among instructional developers, practitioners, and researchers. Users' input has been utilized to help: (1) design the…

  10. Pion Production from 5-15 GeV Beam for the Neutrino Factory Front-End Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prior, Gersende

    2010-03-30

For the neutrino factory front-end study, the production of pions from a proton beam of 5-8 and 14 GeV kinetic energy on a Hg jet target has been simulated. The pion yields for two versions of the MARS15 code and two different field configurations have been compared. The particles have also been tracked from the target position down to the end of the cooling channel using the ICOOL code and the neutrino factory baseline lattice. The momentum-angle region of pions producing muons that survived until the end of the cooling channel has been compared with the region covered by HARP data, and the number of pions/muons as a function of the incoming beam energy is also reported.

  11. Constrained Local UniversE Simulations: a Local Group factory

    NASA Astrophysics Data System (ADS)

    Carlesi, Edoardo; Sorce, Jenny G.; Hoffman, Yehuda; Gottlöber, Stefan; Yepes, Gustavo; Libeskind, Noam I.; Pilipenko, Sergey V.; Knebe, Alexander; Courtois, Hélène; Tully, R. Brent; Steinmetz, Matthias

    2016-05-01

Near-field cosmology is practised by studying the Local Group (LG) and its neighbourhood. This paper describes a framework for simulating the `near field' on the computer. Assuming the Λ cold dark matter (ΛCDM) model as a prior and applying the Bayesian tools of the Wiener filter and constrained realizations of Gaussian fields to the Cosmicflows-2 (CF2) survey of peculiar velocities, constrained simulations of our cosmic environment are performed. The aim of these simulations is to reproduce the LG and its local environment. Our main result is that the LG is likely a robust outcome of the ΛCDM scenario when subjected to the constraint derived from CF2 data, emerging in an environment akin to the observed one. Three levels of criteria are used to define the simulated LGs. At the base level, pairs of haloes must obey specific isolation, mass and separation criteria. At the second level, the orbital angular momentum and energy are constrained, and on the third one the phase of the orbit is constrained. Out of the 300 constrained simulations, 146 LGs obey the first set of criteria, 51 the second and 6 the third. The robustness of our LG `factory' enables the construction of a large ensemble of simulated LGs. Suitable candidates for high-resolution hydrodynamical simulations of the LG can be drawn from this ensemble, which can be used to perform comprehensive studies of the formation of the LG.
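The base-level pair selection could be expressed as a simple filter over halo pairs; the thresholds below are placeholders for illustration, not the paper's actual cuts:

```python
def passes_base_criteria(m1, m2, separation, nearest_massive):
    """Base-level Local Group candidate test (illustrative thresholds).
    Masses are in units of 1e12 solar masses; distances are in Mpc."""
    total_mass_ok = 1.0 < m1 + m2 < 5.0      # combined mass window
    separation_ok = 0.35 < separation < 1.5  # pair separation window
    isolated = nearest_massive > 2.0         # no massive neighbour nearby
    return total_mass_ok and separation_ok and isolated

# Keep only pairs passing the base level; the second and third levels
# would further constrain orbital angular momentum, energy and phase.
pairs = [(1.2, 1.0, 0.8, 3.0),   # MW/M31-like pair, isolated
         (1.2, 1.0, 0.8, 0.5),   # same pair, but with a close neighbour
         (0.1, 0.1, 0.8, 3.0)]   # pair far too light
candidates = [p for p in pairs if passes_base_criteria(*p)]
```

Each successive criterion level filters the surviving candidates further, which matches the reported 146/51/6 counts shrinking at each stage.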

  12. 27th International Conference on CADCAM, Robotics and Factories of the Future 2014

    NASA Astrophysics Data System (ADS)

    Karamanoglu, Mehmet; Yang, Xin-She; Zivanovic, Aleksandar; Smith, Martin; Loureiro, Rui

    2014-07-01

It is a great pleasure to welcome you to the 27th International Conference on CADCAM, Robotics and Factories of the Future, sponsored by the International Society for Productivity Enhancement, Middlesex University, Festo Limited GB, National Instruments UK & Ireland, the Sector Skills Council for Science, Engineering and Manufacturing Technologies and our proceedings publisher, Institute of Physics Publications. This is the second time Middlesex University has played host to this longstanding international conference, the last time being the 12th edition in 1996. The subject content of the conference remains current, focusing on cutting-edge developments in research. The conference this year is divided into seven themes: Product Development and Sustainability; Modelling and Simulation; Automation, Robotics and Handling Systems; Advanced Quality Systems Tools and Quality Management; Human Aspects in Engineering Activities; Emerging Scenarios in Engineering Education and Training; and Emerging Technologies in Factories of the Future. The conference is organised into seven sessions running in parallel over three days, providing a platform to speakers from 16 different countries. The programme also features four eminent keynote speakers and a hands-on workshop organised by National Instruments. Organising an event such as this would not be possible without the help of many colleagues. I am grateful to the members of the Organising Committee, the International Scientific Committee, our sponsors and all those colleagues who helped in the review of many abstracts and consequently full papers. This required meticulous attention to detail and strict adherence to very tight deadlines. However large or small a conference is, the effort required to make the local arrangements work for all is not insignificant.
The conference organisers acknowledge the particular efforts of Miss Mita Vaghi, who provided her expertise in event management and her diligent support, and of Anete Ashton of IoP Publications for her guidance and help in producing the conference proceedings and online listing. The organisers also recognise the support provided by our sponsors, in particular Richard Roberts and David Baker from National Instruments, and Babak Jahanbani and Phil Holmes from Festo Ltd. Their continued support over the course of the planning period and also during the event itself is very much appreciated. We are also indebted to all the contributors to the conference, particularly the researchers and practitioners.

Professor Mehmet Karamanoglu
Conference Chair

Further details, including keynote speakers and committees, are available in the PDF

  13. Using Simulation as an Investigational Methodology to Explore the Impact of Technology on Team Communication and Patient Management: A Pilot Evaluation of the Effect of an Automated Compression Device.

    PubMed

    Gittinger, Matthew; Brolliar, Sarah M; Grand, James A; Nichol, Graham; Fernandez, Rosemarie

    2017-06-01

    This pilot study used a simulation-based platform to evaluate the effect of an automated mechanical chest compression device on team communication and patient management. Four-member emergency department interprofessional teams were randomly assigned to perform manual chest compressions (control, n = 6) or automated chest compressions (intervention, n = 6) during a simulated cardiac arrest with 2 phases: phase 1 baseline (ventricular tachycardia), followed by phase 2 (ventricular fibrillation). Patient management was coded using an Advanced Cardiovascular Life Support-based checklist. Team communication was categorized in the following 4 areas: (1) teamwork focus; (2) huddle events, defined as statements focused on re-establishing situation awareness, reinforcing existing plans, and assessing the need to adjust the plan; (3) clinical focus; and (4) profession of team member. Statements were aggregated for each team. At baseline, groups were similar with respect to total communication statements and patient management. During cardiac arrest, the total number of communication statements was greater in teams performing manual compressions (median, 152.3; interquartile range [IQR], 127.6-181.0) as compared with teams using an automated compression device (median, 105; IQR, 99.5-123.9). Huddle events were more frequent in teams performing automated chest compressions (median, 4.0; IQR, 3.1-4.3 vs. 2.0; IQR, 1.4-2.6). Teams randomized to the automated compression intervention had a delay to initial defibrillation (median, 208.3 seconds; IQR, 153.3-222.1 seconds) as compared with control teams (median, 63.2 seconds; IQR, 30.1-397.2 seconds). Use of an automated compression device may impact both team communication and patient management. Simulation-based assessments offer important insights into the effect of technology on healthcare teams.

  14. Simulation-based Randomized Comparative Assessment of Out-of-Hospital Cardiac Arrest Resuscitation Bundle Completion by Emergency Medical Service Teams Using Standard Life Support or an Experimental Automation-assisted Approach.

    PubMed

    Choi, Bryan; Asselin, Nicholas; Pettit, Catherine C; Dannecker, Max; Machan, Jason T; Merck, Derek L; Merck, Lisa H; Suner, Selim; Williams, Kenneth A; Jay, Gregory D; Kobayashi, Leo

    2016-12-01

Effective resuscitation of out-of-hospital cardiac arrest (OHCA) patients is challenging. Alternative resuscitative approaches using electromechanical adjuncts may improve provider performance. Investigators applied simulation to study the effect of an experimental automation-assisted, goal-directed OHCA management protocol on EMS providers' resuscitation performance relative to standard protocols and equipment. Two-provider (emergency medical technician (EMT)-B and EMT-I/C/P) teams were randomized to a control or experimental group. Each team engaged in 3 simulations: a baseline simulation (standard roles); a repeat simulation (standard roles); and an abbreviated repeat simulation (reversed roles, i.e., the basic life support provider performing ALS tasks). Control teams used standard OHCA protocols and equipment (with a high-performance cardiopulmonary resuscitation training intervention); for the second and third simulations, experimental teams performed chest compression, defibrillation, airway, pulmonary ventilation, vascular access, medication, and transport tasks with a goal-directed protocol and resuscitation-automating devices. Videorecorders and simulator logs collected resuscitation data. Ten control and ten experimental teams comprised 20 EMT-Bs, 1 EMT-I, 8 EMT-Cs, and 11 EMT-Ps; study groups were not fully matched. Both groups suboptimally performed chest compressions and ventilations at baseline. For their second simulations, control teams performed similarly except for reduced on-scene time, and experimental teams improved their chest compressions (P=0.03), pulmonary ventilations (P<0.01), and medication administration (P=0.02); changes in their performance of chest compression, defibrillation, airway, and transport tasks did not attain significance against control teams' changes. Experimental teams maintained performance improvements during reversed-role simulations.
Simulation-based investigation into OHCA resuscitation revealed considerable variability and improvable deficiencies in small EMS teams. Goal-directed, automation-assisted OHCA management augmented select resuscitation bundle element performance without comprehensive improvement.

  15. Fleet Sizing of Automated Material Handling Using Simulation Approach

    NASA Astrophysics Data System (ADS)

    Wibisono, Radinal; Ai, The Jin; Ratna Yuniartha, Deny

    2018-03-01

Automated material handling tends to be chosen over human power for material handling activities on the production floors of manufacturing companies. One critical issue in implementing automated material handling is the design phase, which must ensure that material handling is efficient in terms of cost. Fleet sizing is one of the topics in this design phase. In this research, a simulation approach is used to solve the fleet sizing problem in flow shop production so as to reach an optimum situation, meaning minimum flow time and maximum capacity on the production floor. A simulation approach is used because a flow shop can be modelled as a queuing network and the inter-arrival time does not follow an exponential distribution. The contribution of this research is therefore solving the fleet sizing problem with multiple objectives in flow shop production using a simulation approach with ARENA software.
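A toy version of such a fleet-sizing experiment, assuming jobs that each occupy one vehicle for a fixed service time, can be written as follows; this is a deliberately simplified stand-in for the ARENA queuing-network model, with illustrative names:

```python
import heapq

def simulate(fleet_size, arrival_times, service_time):
    """Toy fleet-sizing model: each job needs one vehicle for
    `service_time`; jobs wait if the whole fleet is busy.
    Returns the mean flow time (waiting + service) per job."""
    free_at = [0.0] * fleet_size       # next-free time of each vehicle
    heapq.heapify(free_at)
    total_flow = 0.0
    for t in arrival_times:
        vehicle_free = heapq.heappop(free_at)  # earliest-available vehicle
        start = max(t, vehicle_free)
        finish = start + service_time
        total_flow += finish - t
        heapq.heappush(free_at, finish)
    return total_flow / len(arrival_times)
```

Running this for increasing fleet sizes shows the flow-time/fleet-cost trade-off that the fleet sizing problem balances: adding vehicles reduces mean flow time until the fleet is no longer the bottleneck.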

  16. Automatic insertion of simulated microcalcification clusters in a software breast phantom

    NASA Astrophysics Data System (ADS)

    Shankla, Varsha; Pokrajac, David D.; Weinstein, Susan P.; DeLeo, Michael; Tuite, Catherine; Roth, Robyn; Conant, Emily F.; Maidment, Andrew D.; Bakic, Predrag R.

    2014-03-01

    An automated method has been developed to insert realistic clusters of simulated microcalcifications (MCs) into computer models of breast anatomy. This algorithm has been developed as part of a virtual clinical trial (VCT) software pipeline, which includes the simulation of breast anatomy, mechanical compression, image acquisition, image processing, display and interpretation. An automated insertion method has value in VCTs involving large numbers of images. The insertion method was designed to support various insertion placement strategies, governed by probability distribution functions (pdf). The pdf can be predicated on histological or biological models of tumor growth, or estimated from the locations of actual calcification clusters. To validate the automated insertion method, a 2-AFC observer study was designed to compare two placement strategies, undirected and directed. The undirected strategy could place a MC cluster anywhere within the phantom volume. The directed strategy placed MC clusters within fibroglandular tissue on the assumption that calcifications originate from epithelial breast tissue. Three radiologists were asked to select between two simulated phantom images, one from each placement strategy. Furthermore, questions were posed to probe the rationale behind the observer's selection. The radiologists found the resulting cluster placement to be realistic in 92% of cases, validating the automated insertion method. There was a significant preference for the cluster to be positioned on a background of adipose or mixed adipose/fibroglandular tissues. Based upon these results, this automated lesion placement method will be included in our VCT simulation pipeline.
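The two placement strategies amount to sampling an insertion site from different distributions over the phantom volume. A minimal sketch, assuming a voxel list with hypothetical tissue labels rather than the study's actual phantom representation:

```python
import random

# Hypothetical voxelized phantom: (position, tissue label) pairs.
voxels = [((0, 0, 0), "adipose"),
          ((1, 0, 0), "fibroglandular"),
          ((2, 0, 0), "fibroglandular")]

def sample_insertion_site(voxels, directed=True):
    """Pick a voxel for the centre of a simulated MC cluster.
    Undirected: uniform over the whole phantom volume.
    Directed: uniform over fibroglandular voxels only, reflecting
    the assumption that calcifications arise in epithelial tissue."""
    if directed:
        candidates = [v for v in voxels if v[1] == "fibroglandular"]
    else:
        candidates = list(voxels)
    position, _tissue = random.choice(candidates)
    return position
```

Other placement strategies, such as ones based on tumor-growth models, would simply replace the uniform choice with sampling from the corresponding pdf.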

  17. Using simulated fluorescence cell micrographs for the evaluation of cell image segmentation algorithms.

    PubMed

    Wiesmann, Veit; Bergler, Matthias; Palmisano, Ralf; Prinzen, Martin; Franz, Daniela; Wittenberg, Thomas

    2017-03-18

Manual assessment and evaluation of fluorescent micrograph cell experiments is time-consuming and tedious. Automated segmentation pipelines can ensure efficient and reproducible evaluation and analysis with constant high quality for all images of an experiment. Such cell segmentation approaches are usually validated and rated in comparison to manually annotated micrographs. Nevertheless, manual annotations are prone to errors and display inter- and intra-observer variability, which influences the validation results of automated cell segmentation pipelines. We present a new approach to simulate fluorescent cell micrographs that provides an objective ground truth for the validation of cell segmentation methods. The cell simulation was evaluated twofold: (1) an expert observer study shows that the proposed approach generates realistic fluorescent cell micrograph simulations; (2) an automated segmentation pipeline on the simulated fluorescent cell micrographs reproduces segmentation performances of that pipeline on real fluorescent cell micrographs. The proposed simulation approach produces realistic fluorescent cell micrographs with corresponding ground truth. The simulated data are suited to evaluate image segmentation pipelines more efficiently and reproducibly than is possible on manually annotated real micrographs.

  18. Automation of block assignment planning using a diagram-based scenario modeling method

    NASA Astrophysics Data System (ADS)

    Hwang, In Hyuck; Kim, Youngmin; Lee, Dong Kun; Shin, Jong Gye

    2014-03-01

Most shipbuilding scheduling research so far has focused on the load level on the dock plan. This is because the dock is the least extendable resource in shipyards, and its overloading is difficult to resolve. However, once dock scheduling is completed, making a plan that makes the best use of the rest of the resources in the shipyard to minimize any additional cost is also important. Block assignment planning is one of the midterm planning tasks; it assigns a block to the facility (factory/shop or surface plate) that will actually manufacture the block according to the block characteristics and current situation of the facility. It is one of the most heavily loaded midterm planning tasks and is carried out manually by experienced workers. In this study, a method of representing the block assignment rules using a diagram was suggested through analysis of the existing manual process. A block allocation program was developed which automated the block assignment process according to the rules represented by the diagram. The planning scenario was validated through a case study that compared the manual assignment and two automated block assignment results.
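A diagram of assignment rules of this kind reduces to an ordered list of predicates checked against each block's characteristics. A minimal sketch with hypothetical rules, block attributes, and facility names:

```python
def assign_block(block, facilities):
    """Assign a block to the first facility whose rule it satisfies.
    `block` is a dict of block characteristics (hypothetical keys);
    `facilities` is an ordered list of (name, rule) pairs, each rule a
    predicate on the block, i.e. a decision node of the diagram."""
    for name, rule in facilities:
        if rule(block):
            return name
    return "surface plate"  # fallback facility for unmatched blocks

# Illustrative rules: highly curved blocks go to a dedicated shop,
# light blocks to the flat-panel factory, the rest to surface plates.
facilities = [
    ("curved-panel shop", lambda b: b["curvature"] == "high"),
    ("flat-panel factory", lambda b: b["weight_t"] <= 50),
]
```

Encoding the rules as data rather than hard-coded logic is what lets a diagram editor drive the automated planner.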

  19. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation

    NASA Technical Reports Server (NTRS)

    Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelly, J. H.; Depkovich, T. M.

    1984-01-01

A generic computer simulation for manipulator systems (ROBSIM) was implemented and the specific technologies necessary to increase the role of automation in various missions were developed. The specific items developed are: (1) capability for definition of a manipulator system consisting of multiple arms, load objects, and an environment; (2) capability for kinematic analysis, requirements analysis, and response simulation of manipulator motion; (3) postprocessing options such as graphic replay of simulated motion and manipulator parameter plotting; (4) investigation and simulation of various control methods including manual force/torque and active compliance control; (5) evaluation and implementation of three obstacle avoidance methods; (6) video simulation and edge detection; and (7) software simulation validation.

  20. Simulation in production of open rotor propellers: from optimal surface geometry to automated control of mechanical treatment

    NASA Astrophysics Data System (ADS)

    Grinyok, A.; Boychuk, I.; Perelygin, D.; Dantsevich, I.

    2018-03-01

A complex method for the simulation and production design of open rotor propellers was studied. An end-to-end diagram was proposed for evaluating, designing and experimentally testing the optimal geometry of the propeller surface, for generating the machine control path, and for simulating the force conditions in the cutting zone and their relationship with the treatment accuracy, which is defined by the elastic deformation of the propeller. The simulation data provided the realization of combined automated path control of the cutting tool.

  1. Automation Applications in an Advanced Air Traffic Management System : Volume 5A. DELTA Simulation Model - User's Guide

    DOT National Transportation Integrated Search

    1974-08-01

Volume 4 describes the automation requirements. A presentation of automation requirements is made for an advanced air traffic management system in terms of controller work force, computer resources, controller productivity, system manning, failure ef...

  2. On The Modeling of Educational Systems: II

    ERIC Educational Resources Information Center

    Grauer, Robert T.

    1975-01-01

    A unified approach to model building is developed from the separate techniques of regression, simulation, and factorial design. The methodology is applied in the context of a suburban school district. (Author/LS)

  3. Automated Simulation For Analysis And Design

    NASA Technical Reports Server (NTRS)

    Cantwell, E.; Shenk, Tim; Robinson, Peter; Upadhye, R.

    1992-01-01

Design Assistant Workstation (DAWN) software is being developed to facilitate simulation of qualitative and quantitative aspects of the behavior of a life-support system in a spacecraft, a chemical-processing plant, the heating and cooling system of a large building, or any of a variety of systems including interacting process streams and processes. It is used to analyze alternative design scenarios or specific designs of such systems. The expert system will automate part of the design analysis: it will reason independently by simulating design scenarios and return to the designer with overall evaluations and recommendations.

  4. Driving comfort, enjoyment and acceptance of automated driving - effects of drivers' age and driving style familiarity.

    PubMed

    Hartwich, Franziska; Beggiato, Matthias; Krems, Josef F

    2018-02-23

Automated driving has the potential to improve the safety and efficiency of future traffic and to extend elderly people's driving life, provided it is perceived as comfortable and joyful and is accepted by drivers. Driving comfort could be enhanced by familiar automated driving styles based on drivers' manual driving styles. In a two-stage driving simulator study, effects of driving automation and driving style familiarity on driving comfort, enjoyment and system acceptance were examined. Twenty younger and 20 older drivers performed a manual and four automated drives of different driving style familiarity. Acceptance, comfort and enjoyment were assessed after driving with standardised questionnaires, discomfort during driving via handset control. Automation increased both age groups' comfort, but decreased younger drivers' enjoyment. Younger drivers showed higher comfort, enjoyment and acceptance with familiar automated driving styles, whereas older drivers preferred unfamiliar, automated driving styles tending to be faster than their age-affected manual driving styles. Practitioner Summary: Automated driving needs to be comfortable and enjoyable to be accepted by drivers, which could be enhanced by driving style individualisation. This approach was evaluated in a two-stage driving simulator study for different age groups. Younger drivers preferred familiar driving styles, whereas older drivers preferred driving styles unaffected by age.

  5. Transitioning Resolution Responsibility between the Controller and Automation Team in Simulated NextGen Separation Assurance

    NASA Technical Reports Server (NTRS)

Cabrall, C.; Gomez, A.; Homola, J.; Hunt, S.; Martin, L.; Mercer, J.; Prevot, T.

    2013-01-01

As part of an ongoing research effort on separation assurance and functional allocation in NextGen, a controller-in-the-loop study with ground-based automation was conducted at NASA Ames' Airspace Operations Laboratory in August 2012 to investigate the potential impact of introducing self-separating aircraft in progressively advanced NextGen timeframes. From this larger study, the current exploratory analysis of controller-automation interaction styles focuses on the last and most far-term time frame. Measurements were recorded that firstly verified the continued operational validity of this iteration of the ground-based functional allocation automation concept in forecast traffic densities up to 2x that of current day high altitude en-route sectors. Additionally, with greater levels of fully automated conflict detection and resolution as well as the introduction of intervention functionality, objective and subjective analyses showed a range of passive to active controller-automation interaction styles between the participants. Not only did the controllers work with the automation to meet their safety and capacity goals in the simulated future NextGen timeframe, they did so in different ways and with different attitudes of trust/use of the automation. Taken as a whole, the results showed that the prototyped controller-automation functional allocation framework was very flexible and successful overall.

  6. Plant responses to simulated hurricane impacts in a subtropical wet forest, Puerto Rico

    Treesearch

    Aaron B. Shiels; Jess K. Zimmerman; Diana C. García-Montiel; Inge Jonckheere; Jennifer Holm; David Horton; Nicholas Brokaw

    2010-01-01

    1. We simulated two key components of severe hurricane disturbance, canopy openness and detritus deposition, to determine the independent and interactive effects of these components on woody plant recruitment and forest structure. 2. We increased canopy openness by trimming branches and added or subtracted canopy detritus in a factorial design. Plant responses were...

  7. A Laboratory Glass-Cockpit Flight Simulator for Automation and Communications Research

    NASA Technical Reports Server (NTRS)

    Pisanich, Gregory M.; Heers, Susan T.; Shafto, Michael G. (Technical Monitor)

    1995-01-01

A laboratory glass-cockpit flight simulator supporting research on advanced commercial flight deck and Air Traffic Control (ATC) automation and communication interfaces has been developed at the Aviation Operations Branch at the NASA Ames Research Center. This system provides independent and integrated flight and ATC simulator stations, party line voice and datalink communications, along with video and audio monitoring and recording capabilities. Over the last several years, it has been used to support the investigation of flight human factors research issues involving: communication modality; message content and length; graphical versus textual presentation of information; and human accountability for automation. This paper updates the status of this simulator, describing new functionality in the areas of flight management system, EICAS display, and electronic checklist integration. It also provides an overview of several experiments performed using this simulator, including their application areas and results. Finally, future enhancements to its ATC (integration of CTAS software) and flight deck (full crew operations) functionality are described.

  8. Automated Sequence Processor: Something Old, Something New

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara; Schrock, Mitchell; Fisher, Forest; Himes, Terry

    2012-01-01

High productivity is required for operations teams to meet schedules, and risk must be minimized, so scripting is used to automate processes that perform essential operations functions. The Automated Sequence Processor (ASP) was a grass-roots task built to automate the command uplink process, and a system engineering task for ASP revitalization was organized. ASP is a set of approximately 200 scripts written in Perl, C Shell, AWK and other scripting languages. ASP processes, checks, and packages non-interactive commands automatically. Non-interactive commands are guaranteed to be safe and have been checked by hardware or software simulators. ASP checks that commands are non-interactive, processes them through a command simulator, and then packages them if there are no errors. ASP must be active 24 hours a day, 7 days a week.

  9. Driver Vigilance in Automated Vehicles: Hazard Detection Failures Are a Matter of Time.

    PubMed

    Greenlee, Eric T; DeLucia, Patricia R; Newton, David C

    2018-06-01

    The primary aim of the current study was to determine whether monitoring the roadway for hazards during automated driving results in a vigilance decrement. Although automated vehicles are relatively novel, the nature of human-automation interaction within them has the classic hallmarks of a vigilance task. Drivers must maintain attention for prolonged periods of time to detect and respond to rare and unpredictable events, for example, roadway hazards that automation may be ill equipped to detect. Given the similarity with traditional vigilance tasks, we predicted that drivers of a simulated automated vehicle would demonstrate a vigilance decrement in hazard detection performance. Participants "drove" a simulated automated vehicle for 40 minutes. During that time, their task was to monitor the roadway for roadway hazards. As predicted, hazard detection rate declined precipitously, and reaction times slowed as the drive progressed. Further, subjective ratings of workload and task-related stress indicated that sustained monitoring is demanding and distressing and it is a challenge to maintain task engagement. Monitoring the roadway for potential hazards during automated driving results in workload, stress, and performance decrements similar to those observed in traditional vigilance tasks. To the degree that vigilance is required of automated vehicle drivers, performance errors and associated safety risks are likely to occur as a function of time on task. Vigilance should be a focal safety concern in the development of vehicle automation.

  10. Proceedings of the Automated Weather Support Technical Exchange Conference (6th). Held at the U.S. Naval Academy, Annapolis, Maryland on 21-24 September 1970.

    DTIC Science & Technology

    forecasting, Tailored meteorological support, Automated processing and application of satellite data, and Environmental simulation. The sixth and final session was devoted to a panel discussion on Automated meteorological support.

  11. Computer experimental analysis of the CHP performance of a 100 kW e SOFC Field Unit by a factorial design

    NASA Astrophysics Data System (ADS)

    Calì, M.; Santarelli, M. G. L.; Leone, P.

    Gas Turbine Technologies (GTT) and Politecnico di Torino, both located in Torino (Italy), have been involved in the design and installation of a SOFC laboratory in order to analyse the operation, in cogenerative configuration, of the CHP 100 kWe SOFC Field Unit, built by Siemens-Westinghouse Power Corporation (SWPC), which is at present (May 2005) starting its operation and which will supply electric and thermal power to the GTT factory. In order to take full advantage of the analysis of the on-site operation, and especially to correctly design the scheduled experimental tests on the system, we developed a mathematical model and ran a simulated experimental campaign, applying a rigorous statistical approach to the analysis of the results. The aim of this work is the computer experimental analysis, through a statistical methodology (2^k factorial experiments), of the CHP 100 performance. First, the mathematical model was calibrated with the results acquired during the first CHP100 demonstration at EDB/ELSAM in Westerwoort. Afterwards, the simulated tests were performed in the form of a computer experimental session, with measurement uncertainties simulated by perturbations imposed on the model's independent variables. The statistical methodology used for the computer experimental analysis is factorial design (Yates' technique): using the ANOVA technique, the effects of the main independent variables (air utilization factor U_ox, fuel utilization factor U_F, internal fuel and air preheating, and anodic recycling flow rate) were investigated in a rigorous manner. The analysis accounts for the effects of these parameters on stack electric power, thermal recovered power, single-cell voltage, cell operative temperature, consumed fuel flow, and steam-to-carbon ratio. Each main effect and interaction effect is shown, with particular attention to generated electric power and stack heat recovered.
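    The 2^k factorial methodology the abstract applies estimates each factor's main effect as the difference between the average response at its high and low levels. A toy illustration of Yates-style main-effect estimation for two coded factors (e.g., the air and fuel utilization factors); the responses are invented numbers, not output of the authors' SOFC model:

    ```python
    # Toy 2^2 factorial analysis: main effects from responses at the four
    # combinations of two factors coded -1/+1. Data are invented.
    import itertools

    levels = list(itertools.product([-1, 1], repeat=2))   # coded design points
    response = {(-1, -1): 96.0, (1, -1): 104.0, (-1, 1): 90.0, (1, 1): 98.0}

    def main_effect(factor_index):
        """Average response at the +1 level minus average at the -1 level."""
        hi = [response[c] for c in levels if c[factor_index] == 1]
        lo = [response[c] for c in levels if c[factor_index] == -1]
        return sum(hi) / len(hi) - sum(lo) / len(lo)

    effect_a = main_effect(0)   # +8.0 for this toy data
    effect_b = main_effect(1)   # -6.0
    ```

    With more factors the same contrast logic applies to every main effect and interaction, which is what makes ANOVA on a full 2^k design straightforward.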

  12. Demonstration of the feasibility of automated silicon solar cell fabrication

    NASA Technical Reports Server (NTRS)

    Thornhill, J. W.; Taylor, W. E.

    1976-01-01

    An analysis of estimated costs indicates that for an annual output of 4,747,000 hexagonal cells (38 mm on a side), a total factory cost of $0.866 per cell could be achieved. For cells with 14% efficiency at AM0 intensity (1353 watts per square meter), this annual production rate is equivalent to 3,373 kilowatts and a manufacturing cost of $1.22 per watt of electrical output. A laboratory model of such a facility was operated to produce a series of demonstration runs, producing hexagonal cells, 2 x 2 cm cells, and 2 x 4 cm cells.

  13. The Development of Design Tools for Fault Tolerant Quantum Dot Cellular Automata Based Logic

    NASA Technical Reports Server (NTRS)

    Armstrong, Curtis D.; Humphreys, William M.

    2003-01-01

    We are developing software to explore the fault tolerance of quantum dot cellular automata gate architectures in the presence of manufacturing variations and device defects. The Topology Optimization Methodology using Applied Statistics (TOMAS) framework extends the capabilities of AQUINAS (A Quantum Interconnected Network Array Simulator) by adding front-end and back-end software and creating an environment that integrates all of these components. The front-end tools establish all simulation parameters, configure the simulation system, automate the Monte Carlo generation of simulation files, and execute the simulation of these files. The back-end tools perform automated data parsing, statistical analysis and report generation.
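    The Monte Carlo generation of simulation inputs described for the front-end can be sketched simply: perturb nominal device parameters to model manufacturing variation and emit one input set per trial. The parameter names and tolerances below are invented, not TOMAS's actual configuration:

    ```python
    # Sketch of a Monte Carlo front-end: draw each trial's parameters from a
    # Gaussian around nominal values. Names and sigmas are illustrative only.
    import random

    def generate_trials(nominal, sigma, n, seed=0):
        """Return n parameter dicts with Gaussian perturbations applied."""
        rng = random.Random(seed)   # seeded for reproducible trial sets
        return [{k: rng.gauss(v, sigma[k]) for k, v in nominal.items()}
                for _ in range(n)]

    nominal = {"dot_spacing_nm": 20.0, "cell_pitch_nm": 100.0}
    sigma = {"dot_spacing_nm": 0.5, "cell_pitch_nm": 2.0}
    trials = generate_trials(nominal, sigma, 1000)
    ```

    Each trial dict would then be written out as a simulator input file and executed; the back-end aggregates the per-trial results into fault-tolerance statistics.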

  14. Real-time simulations for automated rendezvous and capture

    NASA Technical Reports Server (NTRS)

    Cuseo, John A.

    1991-01-01

    Although the individual technologies for automated rendezvous and capture (AR&C) exist, they have not yet been integrated to produce a working system in the United States. Thus, real-time integrated systems simulations are critical to the development and pre-flight demonstration of an AR&C capability. Real-time simulations require a level of development more typical of a flight system compared to purely analytical methods, thus providing confidence in derived design concepts. This presentation will describe Martin Marietta's Space Operations Simulation (SOS) Laboratory, a state-of-the-art real-time simulation facility for AR&C, along with an implementation for the Satellite Servicer System (SSS) Program.

  15. Simulation Assessment Validation Environment (SAVE). Software User’s Manual

    DTIC Science & Technology

    2000-09-01

    requirements and decisions are made. The integration is leveraging work from other DoD organizations so that high-end results are attainable much faster than...planning through the modeling and simulation data capture and visualization process. The planners can complete the manufacturing process plan with a high ...technologies. This tool is also used to perform “high level” factory process simulation prior to full CAD model development and help define feasible

  16. Accomplishments of the heavy electron particle accelerator program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neuffer, D.; Stratakis, D.; Palmer, M.

    The Muon Accelerator Program (MAP) has completed a four-year study on the feasibility of muon colliders and on using stored muon beams for neutrinos. That study was broadly successful in its goals, establishing the feasibility of heavy lepton colliders (HLCs) from the 125 GeV Higgs Factory to more than 10 TeV, as well as exploring using a μ storage ring (MSR) for neutrinos, and establishing that MSRs could provide factory-level intensities of νe (ν̄e) and ν̄μ (νμ) beams. The key components of the collider and neutrino factory systems were identified. Feasible designs and detailed simulations of all of these components have been obtained, including some initial hardware component tests, setting the stage for future implementation where resources are available and the precise physics goals become apparent.

  17. Muon Sources for Particle Physics - Accomplishments of the Muon Accelerator Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neuffer, D.; Stratakis, D.; Palmer, M.

    The Muon Accelerator Program (MAP) completed a four-year study on the feasibility of muon colliders and on using stored muon beams for neutrinos. That study was broadly successful in its goals, establishing the feasibility of lepton colliders from the 125 GeV Higgs Factory to more than 10 TeV, as well as exploring using a μ storage ring (MSR) for neutrinos, and establishing that MSRs could provide factory-level intensities of νe (ν̄e) and ν̄μ (νμ) beams. The key components of the collider and neutrino factory systems were identified. Feasible designs and detailed simulations of all of these components were obtained, including some initial hardware component tests, setting the stage for future implementation where resources are available and clearly associated physics goals become apparent.

  18. Exploring time series retrieved from cardiac implantable devices for optimizing patient follow-up

    PubMed Central

    Guéguin, Marie; Roux, Emmanuel; Hernández, Alfredo I; Porée, Fabienne; Mabo, Philippe; Graindorge, Laurence; Carrault, Guy

    2008-01-01

    Current cardiac implantable devices (ID) are equipped with a set of sensors that can provide useful information to improve patient follow-up and to prevent health deterioration in the postoperative period. In this paper, data obtained from an ID with two such sensors (a transthoracic impedance sensor and an accelerometer) are analyzed in order to evaluate their potential application for the follow-up of patients treated with a cardiac resynchronization therapy (CRT). A methodology combining spatio-temporal fuzzy coding and multiple correspondence analysis (MCA) is applied in order to: i) reduce the dimensionality of the data and provide new synthetic indices based on the “factorial axes” obtained from MCA, ii) interpret these factorial axes in physiological terms and iii) analyze the evolution of the patient’s status by projecting the acquired data into the plane formed by the first two factorial axes named “factorial plane”. In order to classify the different evolution patterns, a new similarity measure is proposed and validated on simulated datasets, and then used to cluster observed data from 41 CRT patients. The obtained clusters are compared with the annotations on each patient’s medical record. Two areas on the factorial plane are identified, one being correlated with a health degradation of patients and the other with a stable clinical state. PMID:18838359

  19. Multilevel Factorial Experiments for Developing Behavioral Interventions: Power, Sample Size, and Resource Considerations†

    PubMed Central

    Dziak, John J.; Nahum-Shani, Inbal; Collins, Linda M.

    2012-01-01

    Factorial experimental designs have many potential advantages for behavioral scientists. For example, such designs may be useful in building more potent interventions, by helping investigators to screen several candidate intervention components simultaneously and decide which are likely to offer greater benefit before evaluating the intervention as a whole. However, sample size and power considerations may challenge investigators attempting to apply such designs, especially when the population of interest is multilevel (e.g., when students are nested within schools, or employees within organizations). In this article we examine the feasibility of factorial experimental designs with multiple factors in a multilevel, clustered setting (i.e., of multilevel multifactor experiments). We conduct Monte Carlo simulations to demonstrate how design elements such as the number of clusters, the number of lower-level units, and the intraclass correlation affect power. Our results suggest that multilevel, multifactor experiments are feasible for factor-screening purposes, because of the economical properties of complete and fractional factorial experimental designs. We also discuss resources for sample size planning and power estimation for multilevel factorial experiments. These results are discussed from a resource management perspective, in which the goal is to choose a design that maximizes the scientific benefit using the resources available for an investigation. PMID:22309956

  20. Multilevel factorial experiments for developing behavioral interventions: power, sample size, and resource considerations.

    PubMed

    Dziak, John J; Nahum-Shani, Inbal; Collins, Linda M

    2012-06-01

    Factorial experimental designs have many potential advantages for behavioral scientists. For example, such designs may be useful in building more potent interventions by helping investigators to screen several candidate intervention components simultaneously and to decide which are likely to offer greater benefit before evaluating the intervention as a whole. However, sample size and power considerations may challenge investigators attempting to apply such designs, especially when the population of interest is multilevel (e.g., when students are nested within schools, or when employees are nested within organizations). In this article, we examine the feasibility of factorial experimental designs with multiple factors in a multilevel, clustered setting (i.e., of multilevel, multifactor experiments). We conduct Monte Carlo simulations to demonstrate how design elements, such as the number of clusters, the number of lower-level units, and the intraclass correlation, affect power. Our results suggest that multilevel, multifactor experiments are feasible for factor-screening purposes because of the economical properties of complete and fractional factorial experimental designs. We also discuss resources for sample size planning and power estimation for multilevel factorial experiments. These results are discussed from a resource management perspective, in which the goal is to choose a design that maximizes the scientific benefit using the resources available for an investigation. (c) 2012 APA, all rights reserved
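    The kind of Monte Carlo power study described can be sketched compactly for one cluster-randomized two-level factor: simulate clustered outcomes whose shared cluster variance is set by the intraclass correlation (ICC), analyze at the cluster-mean level, and count significant replications. The effect size, cluster counts, and ICC values below are arbitrary illustration values, not the article's scenarios:

    ```python
    # Minimal Monte Carlo power sketch for a cluster-randomized factor.
    # Total outcome variance is normalized to 1 and split by the ICC.
    import random, statistics

    def simulate_power(n_clusters=40, cluster_size=10, icc=0.05, effect=0.4,
                       reps=300, seed=1):
        rng = random.Random(seed)
        tau = icc ** 0.5               # between-cluster SD
        sigma = (1 - icc) ** 0.5       # within-cluster SD
        hits = 0
        for _ in range(reps):
            means, arms = [], []
            for j in range(n_clusters):
                arm = j % 2            # alternate clusters between factor levels
                u = rng.gauss(0, tau)  # shared cluster effect
                ys = [arm * effect + u + rng.gauss(0, sigma)
                      for _ in range(cluster_size)]
                means.append(sum(ys) / cluster_size)   # analyze cluster means
                arms.append(arm)
            g1 = [m for m, a in zip(means, arms) if a == 1]
            g0 = [m for m, a in zip(means, arms) if a == 0]
            diff = statistics.mean(g1) - statistics.mean(g0)
            se = (statistics.variance(g1) / len(g1)
                  + statistics.variance(g0) / len(g0)) ** 0.5
            if abs(diff / se) > 1.96:  # approximate z-test on cluster means
                hits += 1
        return hits / reps

    power = simulate_power()
    ```

    Rerunning with a larger ICC shows the power loss the article emphasizes: more of the variance sits at the cluster level, so each cluster mean is noisier and the same number of clusters detects less.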

  1. Virtual commissioning of automated micro-optical assembly

    NASA Astrophysics Data System (ADS)

    Schlette, Christian; Losch, Daniel; Haag, Sebastian; Zontar, Daniel; Roßmann, Jürgen; Brecher, Christian

    2015-02-01

    In this contribution, we present a novel approach to enable virtual commissioning for process developers in micro-optical assembly. Our approach aims at supporting micro-optics experts to effectively develop assisted or fully automated assembly solutions without detailed prior experience in programming, while at the same time enabling them to easily implement their own libraries of expert schemes and algorithms for handling optical components. Virtual commissioning is enabled by a 3D simulation and visualization system in which the functionalities and properties of automated systems are modeled, simulated and controlled based on multi-agent systems. For process development, our approach supports event-, state- and time-based visual programming techniques for the agents and allows for their kinematic motion simulation in combination with looped-in simulation results for the optical components. First results have been achieved by simply switching the agents to command the real hardware setup after successful process implementation and validation in the virtual environment. We evaluated and adapted our system to meet the requirements set by industrial partners: laser manufacturers as well as hardware suppliers of assembly platforms. The concept is applied to the automated assembly of optical components for optically pumped semiconductor lasers and the positioning of optical components for beam shaping.

  2. Controlling Air Traffic (Simulated) in the Presence of Automation (CATS PAu) 1995: A Study of Measurement Techniques for Situation Awareness in Air Traffic Control

    NASA Technical Reports Server (NTRS)

    French, Jennifer R.

    1995-01-01

    As automated systems proliferate in aviation systems, human operators are taking on less and less of an active role in the jobs they once performed, often reducing what should be important jobs to tasks barely more complex than monitoring machines. When operators are forced into these roles, they risk slipping into hazardous states of awareness, which can lead to reduced skills, lack of vigilance, and the inability to react quickly and competently when there is a machine failure. Using Air Traffic Control (ATC) as a model, the present study developed tools for conducting tests focusing on levels of automation as they relate to situation awareness. Subjects participated in a two-and-a-half hour experiment that consisted of a training period followed by a simulation of air traffic control similar to the system presently used by the FAA, then an additional simulation employing automated assistance. Through an iterative design process utilizing numerous revisions and three experimental sessions, several measures for situational awareness in a simulated Air Traffic Control System were developed and are prepared for use in future experiments.

  3. Electronic prototyping

    NASA Technical Reports Server (NTRS)

    Hopcroft, J.

    1987-01-01

    The potential benefits of automation in space are significant. The science base needed to support this automation not only will help control costs and reduce lead-time in the earth-based design and construction of space stations, but also will advance the nation's capability for computer design, simulation, testing, and debugging of sophisticated objects electronically. Progress in automation will require the ability to electronically represent, reason about, and manipulate objects. Discussed here is the development of representations, languages, editors, and model-driven simulation systems to support electronic prototyping. In particular, it identifies areas where basic research is needed before further progress can be made.

  4. Estimating post-marketing exposure to pharmaceutical products using ex-factory distribution data.

    PubMed

    Telfair, Tamara; Mohan, Aparna K; Shahani, Shalini; Klincewicz, Stephen; Atsma, Willem Jan; Thomas, Adrian; Fife, Daniel

    2006-10-01

    The pharmaceutical industry has an obligation to identify adverse reactions to drug products during all phases of drug development, including the post-marketing period. Estimates of population exposure to pharmaceutical products are important to the post-marketing surveillance of drugs, and provide a context for assessing the various risks and benefits, including drug safety, associated with drug treatment. This paper describes a systematic approach to estimating post-marketing drug exposure using ex-factory shipment data to estimate the quantity of medication available, and dosage information (stratified by indication or other factors as appropriate) to convert that quantity of medication into person-time of exposure. Unlike the non-standardized methods often used to estimate exposure, this approach provides estimates whose calculations are explicit, documented, and consistent across products and over time. The methods can readily be carried out by an individual or small group specializing in this function, and lend themselves to automation. The present estimation approach is practical and relatively uncomplicated to implement. We believe it is a useful innovation. Copyright 2006 John Wiley & Sons, Ltd.
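    The quantity-to-person-time conversion the paper describes is simple arithmetic once shipment and dosage figures are in hand. A worked example with invented numbers (not data from the paper):

    ```python
    # Convert ex-factory shipment quantity into person-time of exposure
    # using an assumed dosage. All figures below are invented.

    def person_years(units_shipped, tablets_per_unit, tablets_per_day):
        """Total tablets shipped divided by daily dose gives days of therapy."""
        therapy_days = units_shipped * tablets_per_unit / tablets_per_day
        return therapy_days / 365.25

    # 500,000 bottles of 30 tablets, taken one tablet per day:
    exposure = person_years(500_000, 30, 1)   # about 41,068 person-years
    ```

    Stratifying by indication, as the paper suggests, just means running this calculation per stratum with the dosage appropriate to each and summing the results.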

  5. A hybrid simulation approach for integrating safety behavior into construction planning: An earthmoving case study.

    PubMed

    Goh, Yang Miang; Askar Ali, Mohamed Jawad

    2016-08-01

    One of the key challenges in improving construction safety and health is the management of safety behavior. From a system point of view, workers work unsafely due to system level issues such as poor safety culture, excessive production pressure, inadequate allocation of resources and time and lack of training. These systemic issues should be eradicated or minimized during planning. However, there is a lack of detailed planning tools to help managers assess the impact of their upstream decisions on worker safety behavior. Even though simulation had been used in construction planning, the review conducted in this study showed that construction safety management research had not been exploiting the potential of simulation techniques. Thus, a hybrid simulation framework is proposed to facilitate integration of safety management considerations into construction activity simulation. The hybrid framework consists of discrete event simulation (DES) as the core, but heterogeneous, interactive and intelligent (able to make decisions) agents replace traditional entities and resources. In addition, some of the cognitive processes and physiological aspects of agents are captured using system dynamics (SD) approach. The combination of DES, agent-based simulation (ABS) and SD allows a more "natural" representation of the complex dynamics in construction activities. The proposed hybrid framework was demonstrated using a hypothetical case study. In addition, due to the lack of application of factorial experiment approach in safety management simulation, the case study demonstrated sensitivity analysis and factorial experiment to guide future research. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. On Improved Least Squares Regression and Artificial Neural Network Meta-Models for Simulation via Control Variates

    DTIC Science & Technology

    2016-09-15

    18] under the context of robust parameter design for simulation. Bellucci’s technique is used in this research, primarily because the interior-point...Fundamentals of Radial Basis Neural Network (RBNN) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28 1.2.2.2 Design of Experiments...with Neural Nets . . . . . . . . . . . . . 31 1.2.2.3 Factorial Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32 1.2.2.4

  7. Technology Pathway Partnership Final Scientific Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hall, John C. Dr.; Godby, Larry A.

    2012-04-26

    This report covers the scientific progress and results made in the development of high-efficiency multijunction solar cells and the light-concentrating non-imaging optics for the commercial generation of renewable solar energy. During the contract period, the efficiency of the multijunction solar cell was raised from 36.5% to 40% in commercially available, fully qualified cells. In addition, significant strides were made in automating the production process for these cells in order to meet the costs required to compete with commercial electricity. Concurrent with the cell effort, Boeing also developed a non-imaging optical system to raise the light intensity at the photovoltaic cell to the range of 800 to 900 suns. Solar module efficiencies greater than 30% were consistently demonstrated. The technology and its manufacturing were matured to a projected price of < $0.015 per kWh and demonstrated by automated assembly in a robotic factory with a throughput of 2 MWh/yr. The technology was demonstrated in a 100 kW power plant erected at California State University Northridge, CA.

  8. Test/score/report: Simulation techniques for automating the test process

    NASA Technical Reports Server (NTRS)

    Hageman, Barbara H.; Sigman, Clayton B.; Koslosky, John T.

    1994-01-01

    A Test/Score/Report capability is currently being developed for the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) system, which will automate testing of the Goddard Space Flight Center (GSFC) Payload Operations Control Center (POCC) and Mission Operations Center (MOC) software in three areas: telemetry decommutation, spacecraft command processing, and spacecraft memory load and dump processing. Automated computer control of the acceptance test process is one of the primary goals of a test team. With the proper simulation tools and user interface, acceptance testing, regression testing, and the repetition of specific test procedures of a ground data system become simpler tasks. Ideally, complete automation would mean plugging the operational deliverable into the simulator, pressing the start button, executing the test procedure, accumulating and analyzing the data, scoring the results, and reporting the results, along with a go/no-go recommendation, to the test team. In practice, this may not be possible because of inadequate test tools, schedule pressures, limited resources, etc. Most tests are accomplished using a certain degree of automation together with test procedures that are labor intensive. This paper discusses some simulation techniques that can improve the automation of the test process. The TASS system tests the POCC/MOC software and provides a score based on the test results. It displays statistics on the success of the POCC/MOC system processing in each of the three areas, as well as event messages pertaining to the Test/Score/Report processing. The TASS system also provides formatted reports documenting each step performed during the tests and the results of each step. A prototype of the Test/Score/Report capability is available and is currently being used to test some POCC/MOC software deliveries. When this capability is fully operational, it should greatly reduce the time necessary to test a POCC/MOC software delivery, as well as improve the quality of the test process.

  9. Development Status: Automation Advanced Development Space Station Freedom Electric Power System

    NASA Technical Reports Server (NTRS)

    Dolce, James L.; Kish, James A.; Mellor, Pamela A.

    1990-01-01

    Electric power system automation for Space Station Freedom is intended to operate in a loop. Data from the power system is used for diagnosis and security analysis to generate Operations Management System (OMS) requests, which are sent to an arbiter, which sends a plan to a command generator connected to the electric power system. This viewgraph presentation profiles automation software for diagnosis, scheduling, and constraint interfaces, and simulation to support automation development. The automation development process is diagrammed, and the process of creating Ada and ART versions of the automation software is described.

  10. Multi-disciplinary optimization of railway wheels

    NASA Astrophysics Data System (ADS)

    Nielsen, J. C. O.; Fredö, C. R.

    2006-06-01

    A numerical procedure for multi-disciplinary optimization of railway wheels, based on Design of Experiments (DOE) methodology and automated design, is presented. The target is a wheel design that meets the requirements for fatigue strength, while minimizing the unsprung mass and rolling noise. A 3-level full factorial (3LFF) DOE is used to collect data points required to set up Response Surface Models (RSM) relating design and response variables in the design space. Computationally efficient simulations are thereafter performed using the RSM to identify the solution that best fits the design target. A demonstration example, including four geometric design variables in a parametric finite element (FE) model, is presented. The design variables are wheel radius, web thickness, lateral offset between rim and hub, and radii at the transitions rim/web and hub/web, but more variables (including material properties) can be added if needed. To improve further the performance of the wheel design, a constrained layer damping (CLD) treatment is applied on the web. For a given load case, compared to a reference wheel design without CLD, a combination of wheel shape and damping optimization leads to the conclusion that a reduction in the wheel component of A-weighted rolling noise of 11 dB can be achieved if a simultaneous increase in wheel mass of 14 kg is accepted.
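    The 3-level full factorial workflow described, enumerating coded design points, evaluating response models, and selecting the point that best meets the design target, can be sketched with a toy surrogate. The response functions and constraint values below are invented stand-ins, not the authors' finite element wheel model:

    ```python
    # Sketch of a 3-level full factorial (3LFF) search over four coded design
    # variables: evaluate invented noise/mass surrogates at all 3^4 points and
    # keep the quietest design under a mass cap. All coefficients are made up.
    import itertools

    def surrogate(radius, web, offset, fillet):
        """Invented linear response-surface stand-ins on coded levels -1/0/+1."""
        noise = 100 - 3 * web - 2 * fillet + radius    # dB-like score
        mass = 300 + 8 * radius + 5 * web + 2 * offset  # kg-like score
        return noise, mass

    best = None
    for point in itertools.product([-1, 0, 1], repeat=4):
        noise, mass = surrogate(*point)
        if mass <= 310 and (best is None or noise < best[0]):
            best = (noise, mass, point)
    ```

    In the actual procedure the surrogates are Response Surface Models fitted to FE and acoustic simulation results at the 3LFF points, which is what makes exhaustive evaluation of the design space computationally cheap.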

  11. Analysis of trust in autonomy for convoy operations

    NASA Astrophysics Data System (ADS)

    Gremillion, Gregory M.; Metcalfe, Jason S.; Marathe, Amar R.; Paul, Victor J.; Christensen, James; Drnec, Kim; Haynes, Benjamin; Atwater, Corey

    2016-05-01

    With growing use of automation in civilian and military contexts that engage cooperatively with humans, the operator's level of trust in the automated system is a major factor in determining the efficacy of human-autonomy teams. Suboptimal levels of human trust in autonomy (TiA) can be detrimental to joint team performance. This miscalibrated trust can manifest in several ways, such as distrust and complete disuse of the autonomy, or complacency, which results in an unsupervised autonomous system. This work investigates human behaviors that may reflect TiA in the context of an automated driving task, with the goal of improving team performance. Subjects performed a simulated leader-follower driving task with an automated driving assistant and could choose to engage an automated lane-keeping and active cruise control system of varying performance levels. Analysis of the experimental data was performed to identify contextual features of the simulation environment that correlated to instances of automation engagement and disengagement. Furthermore, behaviors that potentially indicate inappropriate TiA levels were identified in the subject trials using estimates of momentary risk and agent performance as functions of these contextual features. Inter-subject and intra-subject trends in automation usage and performance were also identified. This analysis indicated that for poorer performing automation, TiA decreases with time, while higher performing automation induces less drift toward diminishing usage and in some cases increases in TiA. Subject use of automation was also found to be largely influenced by course features.

  12. Prox-1 Automated Proximity Operations

    DTIC Science & Technology

    2016-01-13

    K.J., and Veto, M., "Automated Trajectory Control for On-Orbit Inspection in the Prox-1 Mission," Journal of Spacecraft and Rockets, in review...the operability of all commands needed for the minimum mission. A Simulated Communications Test was performed that demonstrated long-range uplink...Guidance, Navigation and Control subsystem 6 DOF simulation, and delivered for flight coding. Validation of the system’s capability to meet full

  13. Thermal photons in heavy ion collisions at 158 A GeV

    NASA Astrophysics Data System (ADS)

    Dutt, Sunil

    2018-05-01

    The essence of experimental ultra-relativistic heavy-ion collision physics is the production and study of strongly interacting matter at extreme energy densities and temperatures, and the consequent search for the equation of state of nuclear matter. The focus of the analysis has been to examine pseudo-rapidity distributions obtained for the γ-like particles in a pre-shower photon multiplicity detector. This allows the extension of scaled factorial moment analysis to bin sizes smaller than those accessible to other experimental techniques. Scaled factorial moments are calculated using corrected horizontal and vertical analyses. The results are compared with a simulation analysis using the VENUS event generator.
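    The scaled factorial moment of order q for a bin is the event average of the falling factorial of the multiplicity, normalized by the mean multiplicity raised to q: F_q = <n(n-1)...(n-q+1)> / <n>^q. A toy computation for a single bin (the multiplicities are invented; real analyses use detector data and average over many bins):

    ```python
    # Toy event-averaged scaled factorial moment for one pseudo-rapidity bin:
    # F_q = <n(n-1)...(n-q+1)> / <n>^q. Multiplicities below are invented.

    def factorial_moment(counts_per_event, q):
        """Event-averaged scaled factorial moment of order q for one bin."""
        def falling(n, order):
            prod = 1
            for i in range(order):
                prod *= (n - i)
            return prod
        mean_fall = sum(falling(n, q) for n in counts_per_event) / len(counts_per_event)
        mean_n = sum(counts_per_event) / len(counts_per_event)
        return mean_fall / mean_n ** q

    f2 = factorial_moment([3, 4, 2, 5, 3, 4], 2)
    ```

    Purely Poissonian multiplicity fluctuations give F_q = 1 at every bin size; a rise of F_q as bins shrink is the intermittency signal such analyses look for.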

  14. Automated Run-Time Mission and Dialog Generation

    DTIC Science & Technology

    2007-03-01

    Processing, Social Network Analysis, Simulation, Automated Scenario Generation 16. PRICE CODE 17. SECURITY CLASSIFICATION OF REPORT Unclassified...9 D. SOCIAL NETWORKS...13 B. MISSION AND DIALOG GENERATION.................................................13 C. SOCIAL NETWORKS

  15. Sample Size Requirements and Study Duration for Testing Main Effects and Interactions in Completely Randomized Factorial Designs When Time to Event is the Outcome

    PubMed Central

    Moser, Barry Kurt; Halabi, Susan

    2013-01-01

    In this paper we develop the methodology for designing clinical trials with any factorial arrangement when the primary outcome is time to event. We provide a matrix formulation for calculating the sample size and study duration necessary to test any effect with a pre-specified type I error rate and power. Assuming that a time to event follows an exponential distribution, we describe the relationships between the effect size, the power, and the sample size. We present examples for illustration purposes. We provide a simulation study to verify the numerical calculations of the expected number of events and the duration of the trial. The change in the power produced by a reduced number of observations or by accruing no patients to certain factorial combinations is also described. PMID:25530661
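    Under the exponential assumption, the familiar Schoenfeld-style relationship between effect size (log hazard ratio), type I error, and power determines the required number of events for a two-arm comparison; this is a standard formula consistent with the setting described, not the paper's full matrix formulation for arbitrary factorial arrangements:

    ```python
    # Schoenfeld-style required events for a two-arm time-to-event comparison
    # with 1:1 allocation: d = 4 (z_{1-a/2} + z_{power})^2 / (log HR)^2.
    import math
    from statistics import NormalDist

    def required_events(hazard_ratio, alpha=0.05, power=0.80):
        z = NormalDist().inv_cdf
        return 4 * (z(1 - alpha / 2) + z(power)) ** 2 / math.log(hazard_ratio) ** 2

    d = required_events(0.67)   # roughly 196 events for a hazard ratio of 0.67
    ```

    The sample size and study duration then follow from how quickly accrual and follow-up accumulate that many events, which is where the exponential-distribution assumption enters the paper's calculations.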

  16. Method and Tool for Design Process Navigation and Automatic Generation of Simulation Models for Manufacturing Systems

    NASA Astrophysics Data System (ADS)

    Nakano, Masaru; Kubota, Fumiko; Inamori, Yutaka; Mitsuyuki, Keiji

    Manufacturing system designers should concentrate on designing and planning manufacturing systems instead of spending their effort on creating the simulation models needed to verify the design. This paper proposes a method and a supporting tool that navigate designers through the engineering process and generate the simulation model automatically from the design results. The design agent also supports collaborative design projects among different companies or divisions with distributed engineering and distributed simulation techniques. The idea was implemented and applied to a factory planning process.

  17. Jdpd: an open java simulation kernel for molecular fragment dissipative particle dynamics.

    PubMed

    van den Broek, Karina; Kuhn, Hubert; Zielesny, Achim

    2018-05-21

    Jdpd is an open Java simulation kernel for Molecular Fragment Dissipative Particle Dynamics with parallelizable force calculation, efficient caching options and fast property calculations. It is characterized by an interface and factory-pattern driven design for simple code changes and may help to avoid problems of polyglot programming. Detailed input/output communication, parallelization and process control as well as internal logging capabilities for debugging purposes are supported. The new kernel may be utilized in different simulation environments ranging from flexible scripting solutions up to fully integrated "all-in-one" simulation systems.

  18. How a Fully Automated eHealth Program Simulates Three Therapeutic Processes: A Case Study.

    PubMed

    Holter, Marianne T S; Johansen, Ayna; Brendryen, Håvar

    2016-06-28

    eHealth programs may be better understood by breaking down the components of one particular program and discussing its potential for interactivity and tailoring in regard to concepts from face-to-face counseling. In the search for the efficacious elements within eHealth programs, it is important to understand how a program using lapse management may simultaneously support working alliance, internalization of motivation, and behavior maintenance. These processes have been applied to fully automated eHealth programs individually. However, given their significance in face-to-face counseling, it may be important to simulate the processes simultaneously in interactive, tailored programs. We propose a theoretical model for how fully automated behavior change eHealth programs may be more effective by simulating a therapist's support of a working alliance, internalization of motivation, and managing lapses. We show how the model is derived from theory and its application to Endre, a fully automated smoking cessation program that engages the user in several "counseling sessions" about quitting. A descriptive case study based on tools from the intervention mapping protocol shows how each therapeutic process is simulated. The program supports the user's working alliance through alliance factors, the nonembodied relational agent Endre and computerized motivational interviewing. Computerized motivational interviewing also supports internalized motivation to quit, whereas a lapse management component responds to lapses. The description operationalizes working alliance, internalization of motivation, and managing lapses, in terms of eHealth support of smoking cessation. A program may simulate working alliance, internalization of motivation, and lapse management through interactivity and individual tailoring, potentially making fully automated eHealth behavior change programs more effective.

  19. How a Fully Automated eHealth Program Simulates Three Therapeutic Processes: A Case Study

    PubMed Central

    Johansen, Ayna; Brendryen, Håvar

    2016-01-01

    Background eHealth programs may be better understood by breaking down the components of one particular program and discussing its potential for interactivity and tailoring in regard to concepts from face-to-face counseling. In the search for the efficacious elements within eHealth programs, it is important to understand how a program using lapse management may simultaneously support working alliance, internalization of motivation, and behavior maintenance. These processes have been applied to fully automated eHealth programs individually. However, given their significance in face-to-face counseling, it may be important to simulate the processes simultaneously in interactive, tailored programs. Objective We propose a theoretical model for how fully automated behavior change eHealth programs may be more effective by simulating a therapist’s support of a working alliance, internalization of motivation, and managing lapses. Methods We show how the model is derived from theory and its application to Endre, a fully automated smoking cessation program that engages the user in several “counseling sessions” about quitting. A descriptive case study based on tools from the intervention mapping protocol shows how each therapeutic process is simulated. Results The program supports the user’s working alliance through alliance factors, the nonembodied relational agent Endre and computerized motivational interviewing. Computerized motivational interviewing also supports internalized motivation to quit, whereas a lapse management component responds to lapses. The description operationalizes working alliance, internalization of motivation, and managing lapses, in terms of eHealth support of smoking cessation. Conclusions A program may simulate working alliance, internalization of motivation, and lapse management through interactivity and individual tailoring, potentially making fully automated eHealth behavior change programs more effective. PMID:27354373

  20. Application of Hybrid Real-Time Power System Simulator for Designing and Researching of Relay Protection and Automation

    NASA Astrophysics Data System (ADS)

    Borovikov, Yu S.; Sulaymanov, A. O.; Andreev, M. V.

    2015-10-01

    Development, research, and operation of smart grids (SG) with active-adaptive networks (AAS) are pressing tasks today. The planned integration of high-speed FACTS devices greatly complicates the dynamic properties of power systems, significantly changing the operating conditions of their equipment. This creates a new problem: developing and researching relay protection and automation (RPA) that can operate adequately in SGs and adapt to their regimes. The effectiveness of any solution depends on the tools used, namely simulators of electric power systems. Analysis of the best-known and most widely used simulators led to the conclusion that they cannot solve the stated problem. At Tomsk Polytechnic University, a prototype of a hybrid multiprocessor software and hardware system, the Hybrid Real-Time Power System Simulator (HRTSim), has been developed. Because of its unique features, this simulator can be used for the tasks mentioned. This article introduces the concept of developing and researching relay protection and automation using HRTSim.

  1. Predicting Flows of Rarefied Gases

    NASA Technical Reports Server (NTRS)

    LeBeau, Gerald J.; Wilmoth, Richard G.

    2005-01-01

    DSMC Analysis Code (DAC) is a flexible, highly automated, easy-to-use computer program for predicting flows of rarefied gases -- especially flows of upper-atmospheric, propulsion, and vented gases impinging on spacecraft surfaces. DAC implements the direct simulation Monte Carlo (DSMC) method, which is widely recognized as standard for simulating flows at densities so low that the continuum-based equations of computational fluid dynamics are invalid. DAC enables users to model complex surface shapes and boundary conditions quickly and easily. The discretization of a flow field into computational grids is automated, thereby relieving the user of a traditionally time-consuming task while ensuring (1) appropriate refinement of grids throughout the computational domain, (2) determination of optimal settings for temporal discretization and other simulation parameters, and (3) satisfaction of the fundamental constraints of the method. In so doing, DAC ensures an accurate and efficient simulation. In addition, DAC can utilize parallel processing to reduce computation time. The domain decomposition needed for parallel processing is completely automated, and the software employs a dynamic load-balancing mechanism to ensure optimal parallel efficiency throughout the simulation.

  2. Automated and dynamic scheduling for geodetic VLBI - A simulation study for AuScope and global networks

    NASA Astrophysics Data System (ADS)

    Iles, E. J.; McCallum, L.; Lovell, J. E. J.; McCallum, J. N.

    2018-02-01

    As we move into the next era of geodetic VLBI, the scheduling process is one focus for improvement in terms of increased flexibility and the ability to react with changing conditions. A range of simulations were conducted to ascertain the impact of scheduling on geodetic results such as Earth Orientation Parameters (EOPs) and station coordinates. The potential capabilities of new automated scheduling modes were also simulated, using the so-called 'dynamic scheduling' technique. The primary aim was to improve efficiency for both cost and time without losing geodetic precision, particularly to maximise the uses of the Australian AuScope VLBI array. We show that short breaks in observation will not significantly degrade the results of a typical 24 h experiment, whereas simply shortening observing time degrades precision exponentially. We also confirm the new automated, dynamic scheduling mode is capable of producing the same standard of result as a traditional schedule, with close to real-time flexibility. Further, it is possible to use the dynamic scheduler to augment the 3 station Australian AuScope array and thereby attain EOPs of the current global precision with only intermittent contribution from 2 additional stations. We thus confirm automated, dynamic scheduling bears great potential for flexibility and automation in line with aims for future continuous VLBI operations.

  3. An automated methodology development. [software design for combat simulation

    NASA Technical Reports Server (NTRS)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating non-used subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  4. Vacuum mechatronics

    NASA Technical Reports Server (NTRS)

    Hackwood, Susan; Belinski, Steven E.; Beni, Gerardo

    1989-01-01

    The discipline of vacuum mechatronics is defined as the design and development of vacuum-compatible computer-controlled mechanisms for manipulating, sensing and testing in a vacuum environment. The importance of vacuum mechatronics is growing with the increased application of vacuum in space studies and in manufacturing for material processing, medicine, microelectronics, emission studies, lyophilization (freeze drying) and packaging. The quickly developing field of vacuum mechatronics will also be the driving force for the realization of an advanced era of totally enclosed clean manufacturing cells. High-technology manufacturing has increasingly demanding requirements for precision manipulation, in situ process monitoring and contamination-free environments. To remove the contamination problems associated with human workers, the tendency in many manufacturing processes is to move towards total automation; this will become a requirement in the near future for, e.g., microelectronics manufacturing. Automation in ultra-clean manufacturing environments is evolving into the concept of self-contained and fully enclosed manufacturing. A Self Contained Automated Robotic Factory (SCARF) is being developed as a flexible research facility for totally enclosed manufacturing. The construction and successful operation of a SCARF will provide a novel, flexible, self-contained, clean, vacuum manufacturing environment. SCARF also requires very high reliability and intelligent control. The trends in vacuum mechatronics and some of the key research issues are reviewed.

  5. Vehicle automation: a remedy for driver stress?

    PubMed

    Funke, G; Matthews, G; Warm, J S; Emo, A K

    2007-08-01

    The present study addressed the effects of stress, vehicle automation and subjective state on driver performance and mood in a simulated driving task. A total of 168 college students participated. Participants in the stress-induction condition completed a 'winter' drive, which included periodic loss of control episodes. Participants in the no-stress-induction condition were not exposed to loss of control. An additional, independent manipulation of vehicle speed was also conducted, consisting of two control conditions requiring manual speed regulation and a third in which vehicle speed was automatically regulated by the simulation. Stress and automation both influenced subjective distress, but the two factors did not interact. Driver performance data indicated that vehicle automation impacted performance similarly in the stress and no-stress conditions. Individual differences in subjective stress response and performance were also investigated. Resource theory provides a framework that partially but not completely explains the relationship between vehicle automation and driver stress. Implications for driver workload, safety and training are discussed.

  6. Human-Automation Cooperation for Separation Assurance in Future NextGen Environments

    NASA Technical Reports Server (NTRS)

    Mercer, Joey; Homola, Jeffrey; Cabrall, Christopher; Martin, Lynne; Morey, Susan; Gomez, Ashley; Prevot, Thomas

    2014-01-01

    A 2012 Human-In-The-Loop air traffic control simulation investigated a gradual paradigm-shift in the allocation of functions between operators and automation. Air traffic controllers staffed five adjacent high-altitude en route sectors and, during the course of a two-week experiment, worked traffic under different function-allocation approaches aligned with four increasingly mature NextGen operational environments. These NextGen time-frames ranged from near current-day operations to nearly fully-automated control, in which the ground system's automation was responsible for detecting conflicts, issuing strategic and tactical resolutions, and alerting the controller to exceptional circumstances. Results indicate that overall performance was best in the most automated NextGen environment. Safe operations were achieved in this environment for twice today's peak airspace capacity, while being rated by the controllers as highly acceptable. However, results show that sector operations were not always safe; separation violations did in fact occur. This paper describes the simulation in detail and discusses important results and their implications.

  7. Automated CD-SEM recipe creation technology for mass production using CAD data

    NASA Astrophysics Data System (ADS)

    Kawahara, Toshikazu; Yoshida, Masamichi; Tanaka, Masashi; Ido, Sanyu; Nakano, Hiroyuki; Adachi, Naokaka; Abe, Yuichi; Nagatomo, Wataru

    2011-03-01

    Critical Dimension Scanning Electron Microscope (CD-SEM) recipe creation requires preparing a sample for pattern-matching registration and then creating the recipe on the CD-SEM using that sample, which hinders the reduction of test-production cost and time in semiconductor manufacturing factories. From the perspective of cost reduction and test-production efficiency, automated CD-SEM recipe creation without sample preparation or manual operation has become important in production lines. For automated CD-SEM recipe creation, we have introduced RecipeDirector (RD), which enables recipe creation using Computer-Aided Design (CAD) data and text data that includes measurement information. We have developed a system that automatically creates the CAD data and text data necessary for recipe creation on RD; and, to eliminate manual operation, we have enhanced RD so that all measurement information can be specified in the text data. As a result, we have established an automated CD-SEM recipe creation system that requires neither sample preparation nor manual operation. For the introduction of this system to the production lines, the accuracy of the pattern matching was an issue: the design templates for matching, created from the CAD data, differed in appearance from the SEM images. A robust pattern-matching algorithm that accounts for this shape difference was therefore needed. Adding image processing of the matching templates and shape processing of the CAD patterns in the lower layer has enabled robust pattern matching. This paper describes the automated CD-SEM recipe creation technology for production lines, without sample preparation or manual operation, using RD as applied in Sony Semiconductor Kyusyu Corporation Kumamoto Technology Center (SCK Corporation Kumamoto TEC).

  8. Simulator evaluation of the final approach spacing tool

    NASA Technical Reports Server (NTRS)

    Davis, Thomas J.; Erzberger, Heinz; Green, Steven M.

    1990-01-01

    The design and simulator evaluation of an automation tool for assisting terminal radar approach controllers in sequencing and spacing traffic onto the final approach course is described. The automation tool, referred to as the Final Approach Spacing Tool (FAST), displays speed and heading advisories for arrivals as well as sequencing information on the controller's radar display. The main functional elements of FAST are a scheduler that schedules and sequences the traffic, a 4-D trajectory synthesizer that generates the advisories, and a graphical interface that displays the information to the controller. FAST was implemented on a high performance workstation. It can be operated as a stand-alone in the Terminal Radar Approach Control (TRACON) Facility or as an element of a system integrated with automation tools in the Air Route Traffic Control Center (ARTCC). FAST was evaluated by experienced TRACON controllers in a real-time air traffic control simulation. Simulation results show that FAST significantly reduced controller workload and demonstrated a potential for an increase in landing rate.

  9. Design and evaluation of an air traffic control Final Approach Spacing Tool

    NASA Technical Reports Server (NTRS)

    Davis, Thomas J.; Erzberger, Heinz; Green, Steven M.; Nedell, William

    1991-01-01

    This paper describes the design and simulator evaluation of an automation tool for assisting terminal radar approach controllers in sequencing and spacing traffic onto the final approach course. The automation tool, referred to as the Final Approach Spacing Tool (FAST), displays speed and heading advisories for arriving aircraft as well as sequencing information on the controller's radar display. The main functional elements of FAST are a scheduler that schedules and sequences the traffic, a four-dimensional trajectory synthesizer that generates the advisories, and a graphical interface that displays the information to the controller. FAST has been implemented on a high-performance workstation. It can be operated as a stand-alone in the terminal radar approach control facility or as an element of a system integrated with automation tools in the air route traffic control center. FAST was evaluated by experienced air traffic controllers in a real-time air traffic control simulation. Simulation results summarized in the paper show that the automation tools significantly reduced controller workload and demonstrated a potential for an increase in landing rate.

  10. Automated longwall guidance and control systems, phase 1

    NASA Technical Reports Server (NTRS)

    Rybak, S. C.

    1978-01-01

    Candidate vertical control systems (VCS) and face advancement systems (FAS) required to satisfactorily automate the longwall system were analyzed and simulated in order to develop an overall longwall system configuration for preliminary design.

  11. Functional Design of an Automated Instructional Support System for Operational Flight Trainers. Final Report, June 1976 through September 1977.

    ERIC Educational Resources Information Center

    Semple, Clarence A.; And Others

    Functional requirements for a highly automated, flexible, instructional support system for aircrew training simulators are presented. Automated support modes and associated features and capabilities are described, along with hardware and software functional requirements for implementing a baseline system in an operational flight training context.…

  12. Simulation based optimization on automated fibre placement process

    NASA Astrophysics Data System (ADS)

    Lei, Shi

    2018-02-01

    In this paper, a software-simulation-based method (Autodesk TruPlan & TruFiber) is proposed to optimize the automated fibre placement (AFP) process. Different types of manufacturability analysis are introduced to predict potential defects. Advanced fibre path generation algorithms are compared with respect to geometrically different parts. Major manufacturing data are taken into consideration prior to tool path generation to achieve a high manufacturing success rate.

  13. Nexus: A modular workflow management system for quantum simulation codes

    NASA Astrophysics Data System (ADS)

    Krogel, Jaron T.

    2016-01-01

    The management of simulation workflows represents a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.
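The dependency-aware job management described above can be illustrated with a minimal Python sketch. This is a generic, hypothetical illustration of the idea, not the actual Nexus interface; the job names and commands are made up:

```python
from dataclasses import dataclass, field

@dataclass
class Job:
    """One simulation job and the jobs whose output it depends on."""
    name: str
    command: str
    dependencies: list = field(default_factory=list)

def run_order(jobs):
    """Topologically order jobs so every job runs after its dependencies."""
    ordered, seen = [], set()
    def visit(job):
        if job.name in seen:
            return
        seen.add(job.name)
        for dep in job.dependencies:
            visit(dep)
        ordered.append(job)
    for job in jobs:
        visit(job)
    return ordered

# A hypothetical chain: a DFT relaxation feeding an SCF run feeding a QMC run
relax = Job("relax", "pw.x -in relax.in")
scf = Job("scf", "pw.x -in scf.in", dependencies=[relax])
qmc = Job("qmc", "qmcpack vmc.xml", dependencies=[scf])
print([j.name for j in run_order([qmc])])  # -> ['relax', 'scf', 'qmc']
```

A workflow manager built on this idea can then submit each job to a workstation or supercomputer queue once its dependencies have completed, which is the kind of automation the abstract credits with reducing time to solution and human error.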

  14. Highly automated driving, secondary task performance, and driver state.

    PubMed

    Merat, Natasha; Jamson, A Hamish; Lai, Frank C H; Carsten, Oliver

    2012-10-01

    A driving simulator study compared the effect of changes in workload on performance in manual and highly automated driving. Changes in driver state were also observed by examining variations in blink patterns. With the addition of a greater number of advanced driver assistance systems in vehicles, the driver's role is likely to alter in the future from an operator in manual driving to a supervisor of highly automated cars. Understanding the implications of such advancements on drivers and road safety is important. A total of 50 participants were recruited for this study and drove the simulator in both manual and highly automated mode. As well as comparing the effect of adjustments in driving-related workload on performance, the effect of a secondary Twenty Questions Task was also investigated. In the absence of the secondary task, drivers' response to critical incidents was similar in manual and highly automated driving conditions. The worst performance was observed when drivers were required to regain control of driving in the automated mode while distracted by the secondary task. Blink frequency patterns were more consistent for manual than automated driving but were generally suppressed during conditions of high workload. Highly automated driving did not have a deleterious effect on driver performance, when attention was not diverted to the distracting secondary task. As the number of systems implemented in cars increases, an understanding of the implications of such automation on drivers' situation awareness, workload, and ability to remain engaged with the driving task is important.

  15. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation, appendix A

    NASA Technical Reports Server (NTRS)

    Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelly, J. H.; Depkovich, T. M.

    1984-01-01

    A generic computer simulation for manipulator systems (ROBSIM) was implemented and the specific technologies necessary to increase the role of automation in various missions were developed. The specific items developed were: (1) Capability for definition of a manipulator system consisting of multiple arms, load objects, and an environment; (2) Capability for kinematic analysis, requirements analysis, and response simulation of manipulator motion; (3) Postprocessing options such as graphic replay of simulated motion and manipulator parameter plotting; (4) Investigation and simulation of various control methods including manual force/torque and active compliance control; (5) Evaluation and implementation of three obstacle avoidance methods; (6) Video simulation and edge detection; and (7) Software simulation validation. This appendix is the user's guide and includes examples of program runs and outputs as well as instructions for program use.

  16. Introduction of home electronics for the future

    NASA Astrophysics Data System (ADS)

    Yoshimoto, Hideyuki; Shirai, Iwao

    The development of electronics has accelerated automation and labor saving at factories and offices. Home electronics is also expected to be needed more and more in Japan towards the 21st century, as the advanced information society and the aging society accelerate and women's participation in social affairs increases. The Resources Council, the advisory organ of the Minister of State for Science and Technology, forecast to what extent home electronics will be popularized by the year 2010. The Council expects home electronics to be promoted, because resource and energy saving should be accelerated and people should be able to enjoy their individual lives at home much more.

  17. CIM for 300-mm semiconductor fab

    NASA Astrophysics Data System (ADS)

    Luk, Arthur

    1997-08-01

    Five years ago, factory automation (F/A) was not prevalent in the fab. Today, facing a drastically changed market and intense competition, management requests that plant-floor data be forwarded to their desktop computers. This increased demand has rapidly pushed F/A toward computer-integrated manufacturing (CIM). Through personalization, computer size was successfully reduced so that computers fit on our desktops; the PC initiated a new computing era. With the advent of the network, the network computer (NC) creates fresh problems for us. As we plan to invest more than $3 billion to build a new 300-mm fab, the next-generation technology raises the bar.

  18. Response Surface Modeling of Combined-Cycle Propulsion Components using Computational Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Steffen, C. J., Jr.

    2002-01-01

    Three examples of response surface modeling with CFD are presented for combined-cycle propulsion components: a mixed-compression inlet during hypersonic flight, a hydrogen-fueled scramjet combustor during hypersonic flight, and a ducted-rocket nozzle during all-rocket flight. Three different experimental strategies were examined, including full factorial, fractionated central-composite, and D-optimal with embedded Plackett-Burman designs. The response variables have been confined to integral data extracted from multidimensional CFD results. Careful attention has been paid to uncertainty assessment and modeling bias. The importance of automating experimental setup and effectively communicating statistical results is emphasized.
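The full factorial strategy named above simply enumerates every combination of coded factor levels; the following is a minimal sketch, assuming two-level factors coded as -1/+1 (the factor examples are illustrative, not taken from the paper):

```python
from itertools import product

def full_factorial(levels_per_factor):
    """Return every run of a general full factorial design:
    one tuple of factor levels per experimental run."""
    return list(product(*levels_per_factor))

# A 2^3 design: three two-level factors coded -1/+1, e.g. low/high
# settings of Mach number, equivalence ratio, and inlet ramp angle.
design = full_factorial([(-1, 1)] * 3)
print(len(design))  # -> 8 runs
```

Fractionated central-composite and Plackett-Burman designs instead select a structured subset of (and, for the composite design, points beyond) these runs, reducing the number of CFD evaluations needed to fit a response surface.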

  19. Automated Knowledge Discovery from Simulators

    NASA Technical Reports Server (NTRS)

    Burl, Michael C.; DeCoste, D.; Enke, B. L.; Mazzoni, D.; Merline, W. J.; Scharenbroich, L.

    2006-01-01

    In this paper, we explore one aspect of knowledge discovery from simulators, the landscape characterization problem, where the aim is to identify regions in the input/ parameter/model space that lead to a particular output behavior. Large-scale numerical simulators are in widespread use by scientists and engineers across a range of government agencies, academia, and industry; in many cases, simulators provide the only means to examine processes that are infeasible or impossible to study otherwise. However, the cost of simulation studies can be quite high, both in terms of the time and computational resources required to conduct the trials and the manpower needed to sift through the resulting output. Thus, there is strong motivation to develop automated methods that enable more efficient knowledge extraction.
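As a toy illustration of the landscape characterization problem, the sketch below grid-samples a cheap stand-in "simulator" and collects the input region producing a target output behavior. Exhaustive grid sampling stands in here for the more efficient automated methods the paper motivates; all names are illustrative:

```python
import math

def simulator(x, y):
    """Toy stand-in for an expensive numerical simulator."""
    return math.sin(x) * math.cos(y)

def characterize(threshold=0.5, n=50):
    """Sample the input space on an n-by-n grid and return the points
    whose output exhibits the behavior of interest (output > threshold)."""
    region = []
    for i in range(n):
        for j in range(n):
            x, y = 2 * math.pi * i / n, 2 * math.pi * j / n
            if simulator(x, y) > threshold:
                region.append((x, y))
    return region

region = characterize()
print(f"{len(region)} of {50 * 50} sampled points show the target behavior")
```

For a real simulator each evaluation is expensive, which is exactly why the exhaustive loop above is infeasible and automated, sample-efficient knowledge extraction becomes attractive.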

  20. LAMMPS integrated materials engine (LIME) for efficient automation of particle-based simulations: application to equation of state generation

    NASA Astrophysics Data System (ADS)

    Barnes, Brian C.; Leiter, Kenneth W.; Becker, Richard; Knap, Jaroslaw; Brennan, John K.

    2017-07-01

    We describe the development, accuracy, and efficiency of an automation package for molecular simulation, the large-scale atomic/molecular massively parallel simulator (LAMMPS) integrated materials engine (LIME). Heuristics and algorithms employed for equation of state (EOS) calculation using a particle-based model of a molecular crystal, hexahydro-1,3,5-trinitro-s-triazine (RDX), are described in detail. The simulation method for the particle-based model is energy-conserving dissipative particle dynamics, but the techniques used in LIME are generally applicable to molecular dynamics simulations with a variety of particle-based models. The newly created tool set is tested through use of its EOS data in plate impact and Taylor anvil impact continuum simulations of solid RDX. The coarse-grain model results from LIME provide an approach to bridge the scales from atomistic simulations to continuum simulations.

  1. "Fab 13": The Learning Factory.

    ERIC Educational Resources Information Center

    Crooks, Steven M.; Eucker, Tom R.

    2001-01-01

    Describes how situated learning theory was employed in the design of Fab 13, a four-day simulation-based learning experience for manufacturing professionals at Intel Corporation. Presents a conceptual framework for understanding situated learning and discusses context, content, anchored instruction, facilitation, scaffolding, collaborating,…

  2. Translations from Kommunist, Number 13, September 1978

    DTIC Science & Technology

    1978-10-30

    programmed machine tool here is merely a component of a more complex reprogrammable technological system. This includes the robot machine tools with...sufficient possibilities for changing technological operations and processes and automated technological lines. 52 The reprogrammable automated sets will...simulate the possibilities of such sets. A new technological level will be developed in industry related to reprogrammable automated sets, their design

  3. The Effect of Appropriately and Inappropriately Applied Automation for the Control of Unmanned Systems on Operator Performance

    DTIC Science & Technology

    2009-09-01

    2.1 Participants Twelve civilians (7 men and 5 women) with no prior experience with the Robotic NCO simulation participated in this study. The mean...operators in a multitasking environment. 15. SUBJECT TERMS design guidelines, robotics, simulation, unmanned systems, automation 16. SECURITY...model of operator performance, or a hybrid method which combines one or more of these different invocation techniques (e.g., critical events and

  4. MAGE (M-file/Mif Automatic GEnerator): A graphical interface tool for automatic generation of Object Oriented Micromagnetic Framework configuration files and Matlab scripts for results analysis

    NASA Astrophysics Data System (ADS)

    Chęciński, Jakub; Frankowski, Marek

    2016-10-01

    We present a tool for fully automated generation of both simulation configuration files (Mif) and Matlab scripts for automated data analysis, dedicated to the Object Oriented Micromagnetic Framework (OOMMF). We introduce an extended graphical user interface (GUI) that allows fast, error-proof and easy creation of Mifs, without the programming skills usually required for manual Mif writing. With MAGE we provide OOMMF extensions that complement it with magnetoresistance and spin-transfer-torque calculations, as well as selection of local magnetization data for output. Our software allows the creation of advanced simulation conditions such as simultaneous parameter sweeps and synchronized excitation application. Furthermore, since the output of such simulations can be long and complicated, we provide another GUI for automated creation of Matlab scripts suitable for analysis of such data with Fourier and wavelet transforms as well as user-defined operations.

  5. Biological monitoring of workers exposed to 4,4'-methylenediphenyl diisocyanate (MDI) in 19 French polyurethane industries.

    PubMed

    Robert, A; Ducos, P; Francin, J M; Marsan, P

    2007-04-01

    To study the range of urinary levels of 4,4'-methylenedianiline (MDA), a metabolite of methylenediphenyl diisocyanate (MDI), across factories in the polyurethane industries and to evaluate the validity of this biomarker to assess MDI occupational exposure. Workers exposed to MDI, as well as non-occupationally exposed subjects, were studied and pre- and post-shift urine samples were collected from 169 workers of 19 French factories and 120 controls. Details on work activities and practices were collected by a questionnaire and workers were classified into three job categories. The identification and quantification of the total urinary MDA were performed by high-performance liquid chromatography with electrochemical detection (HPLC/EC). For all the factories, MDA was detectable in 73% of the post-shift urine samples. These post-shift values, in the range of <0.10 (detection limit)-23.60 microg/l, were significantly higher than those of the pre-shift samples. Urinary MDA levels in the control group were in the range of < 0.10-0.80 microg/l. The degree of automation of the mixing operation (polyols and MDI) appears to be a determinant of the exposure levels. The highest amounts of MDA in urine were found in the spraying or hot processes. The excretion levels of the workers directly exposed to the hardener containing the MDI monomer were significantly higher than those of the other workers. In addition, skin exposure to MDI monomer or to polyurethane resin during the curing step was always associated with significant MDA levels in urine. Total MDA in post-shift urine samples is a reliable biomarker to assess occupational exposure to MDI in various industrial applications and to help factories to improve their manufacturing processes and working practices. A biological guiding value not exceeding 7 microg/l (5 microg/g creatinine) could be proposed in France.

  6. Automated electric power management and control for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Dolce, James L.; Mellor, Pamela A.; Kish, James A.

    1990-01-01

    A comprehensive automation design is being developed for Space Station Freedom's electric power system. It strives to increase station productivity by applying expert systems and conventional algorithms to automate power system operation. An integrated approach to the power system command and control problem is defined and used to direct technology development in: diagnosis, security monitoring and analysis, battery management, and cooperative problem-solving for resource allocation. The prototype automated power system is developed using simulations and test-beds.

  7. Experimental cosmic statistics - I. Variance

    NASA Astrophysics Data System (ADS)

    Colombi, Stéphane; Szapudi, István; Jenkins, Adrian; Colberg, Jörg

    2000-04-01

    Counts-in-cells are measured in the τCDM Virgo Hubble Volume simulation. This large N-body experiment has 10⁹ particles in a cubic box of size 2000 h⁻¹ Mpc. The unprecedented combination of size and resolution allows, for the first time, a realistic numerical analysis of the cosmic errors and cosmic correlations of statistics related to counts-in-cells measurements, such as the probability distribution function P_N itself, its factorial moments F_k and the related cumulants ξ̄ and S_N. These statistics are extracted from the whole simulation cube, as well as from 4096 subcubes of size 125 h⁻¹ Mpc, each representing a virtual random realization of the local universe. The measurements and their scatter over the subvolumes are compared to the theoretical predictions of Colombi, Bouchet & Schaeffer for P_0, and of Szapudi & Colombi and Szapudi, Colombi & Bernardeau for the factorial moments and the cumulants. The general behaviour of experimental variance and cross-correlations as functions of scale and order is well described by theoretical predictions, with a few per cent accuracy in the weakly non-linear regime for the cosmic error on factorial moments. On highly non-linear scales, however, all variants of the hierarchical model used by SC and SCB to describe clustering appear to become increasingly approximate, which leads to a slight overestimation of the error, by about a factor of two in the worst case. Because of the needed supplementary perturbative approach, the theory is less accurate for non-linear estimators, such as cumulants, than for factorial moments. The cosmic bias is evaluated as well, and, in agreement with SCB, is found to be insignificant compared with the cosmic variance in all regimes investigated. While higher order statistics were previously evaluated in several simulations, this work presents textbook quality measurements of S_N, 3 ≤ N ≤ 10, in an unprecedented dynamic range of 0.05 ≲ ξ̄ ≲ 50. In the weakly non-linear regime the results confirm previous findings and agree remarkably well with perturbation theory predictions including the one-loop corrections based on spherical collapse by Fosalba & Gaztañaga. Extended perturbation theory is confirmed on all scales.
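    The factorial moments F_k mentioned in this record are straightforward to estimate from raw cell counts: F_k = ⟨N(N−1)···(N−k+1)⟩, the mean falling factorial of the counts. The sketch below is illustrative only; the function name and the Poisson toy data are assumptions, not material from the paper. For a Poisson sample F_k = λ^k, so departures from that relation diagnose clustering.

```python
import numpy as np

def factorial_moments(counts, kmax=4):
    """Estimate factorial moments F_k = <N(N-1)...(N-k+1)> from cell counts."""
    counts = np.asarray(counts, dtype=float)
    moments = {}
    for k in range(1, kmax + 1):
        falling = np.ones_like(counts)
        for j in range(k):
            falling *= counts - j      # build the falling factorial N(N-1)...(N-k+1)
        moments[k] = falling.mean()
    return moments

# Unclustered (Poisson) toy counts: F_k should approach lambda**k.
rng = np.random.default_rng(0)
cells = rng.poisson(lam=5.0, size=200_000)
F = factorial_moments(cells)
```

    In a clustering analysis, the normalized cumulants (the S_N of the abstract) are then built from ratios of these F_k; the Poisson baseline above is only a sanity check.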

  8. Status of the MIND simulation and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cervera Villanueva, A.; Martin-Albo, J.; Laing, A.

    2010-03-30

    A realistic simulation of the Neutrino Factory detectors is required in order to fully understand the sensitivity of such a facility to the remaining parameters and degeneracies of the neutrino mixing matrix. We describe the status of a modular software framework being developed to accommodate such a study, and give the results of initial studies of the reconstruction software and the expected efficiency curves in the context of the golden channel.

  9. Enhancing Scheduling Performance for a Wafer Fabrication Factory: The Biobjective Slack-Diversifying Nonlinear Fluctuation-Smoothing Rule

    PubMed Central

    Chen, Toly; Wang, Yu Cheng

    2012-01-01

    A biobjective slack-diversifying nonlinear fluctuation-smoothing rule (biSDNFS) is proposed in the present work to improve the scheduling performance of a wafer fabrication factory. This rule was derived from a one-factor bi-objective nonlinear fluctuation-smoothing rule (1f-biNFS) by dynamically maximizing the standard deviation of the slack, which has been shown to benefit scheduling performance by several previous studies. The efficacy of the biSDNFS was validated with a simulated case; evidence was found to support its effectiveness. We also suggested several directions in which it can be exploited in the future. PMID:23509446

  10. Solenoid Fringe Field Effects for the Neutrino Factory Linac - MAD-X Investigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aslaninejad, M.; Bontoiu, C.; Pasternak, J.; Pozimski, J.; Bogacz, Alex

    2010-05-01

    The International Design Study for the Neutrino Factory (IDS-NF) assumes that the first stage of muon acceleration (up to 900 MeV) will be implemented with a solenoid-based Linac. The Linac consists of three styles of cryo-modules containing focusing solenoids and a varying number of SRF cavities for acceleration. Fringe fields of the solenoids and the focusing effects in the SRF cavities have a significant impact on the transverse beam dynamics. The effects of the fringe fields are studied in MAD-X using an analytical formula. The resulting betatron functions are compared with the results of beam dynamics simulations using the OptiM code.

  11. Evolution of cooperative behavior in simulation agents

    NASA Astrophysics Data System (ADS)

    Stroud, Phillip D.

    1998-03-01

    A simulated automobile factory paint shop is used as a testbed for exploring the emulation of human decision-making behavior. A discrete-events simulation of the paint shop as a collection of interacting Java actors is described. An evolutionary cognitive architecture is under development for building software actors to emulate humans in simulations of human- dominated complex systems. In this paper, the cognitive architecture is extended by implementing a persistent population of trial behaviors with an incremental fitness valuation update strategy, and by allowing a group of cognitive actors to share information. A proof-of-principle demonstration is presented.

  12. Real-time simulation of an F110/STOVL turbofan engine

    NASA Technical Reports Server (NTRS)

    Drummond, Colin K.; Ouzts, Peter J.

    1989-01-01

    A traditional F110-type turbofan engine model was extended to include a ventral nozzle and two thrust-augmenting ejectors for Short Take-Off Vertical Landing (STOVL) aircraft applications. Development of the real-time F110/STOVL simulation required special attention to the modeling of component performance maps, the low-pressure turbine exit mixing region, and the tailpipe dynamic approximation. The simulation was validated by comparing output from the ADSIM simulation with output from a validated F110/STOVL General Electric Aircraft Engines FORTRAN deck. General Electric substantiated basic engine component characteristics through factory testing and full-scale ejector data.

  13. An optimal model-based trajectory following architecture synthesising the lateral adaptive preview strategy and longitudinal velocity planning for highly automated vehicle

    NASA Astrophysics Data System (ADS)

    Cao, Haotian; Song, Xiaolin; Zhao, Song; Bao, Shan; Huang, Zhi

    2017-08-01

    Automated driving has received broad attention from academia and industry, since it can greatly reduce the severity of potential traffic accidents and advance automobile safety and comfort. This paper presents an optimal model-based trajectory-following architecture for highly automated vehicles in driving tasks such as automated guidance or lane keeping; it comprises a velocity-planning module, a steering controller and a velocity-tracking controller. The velocity-planning module, which considers time efficiency and passenger comfort simultaneously, generates a smooth velocity profile. The robust sliding mode control (SMC) steering controller with an adaptive preview time strategy not only tracks the target path well, but also avoids large lateral accelerations during path tracking, thanks to the fuzzy-adaptive preview time mechanism introduced. In addition, an SMC controller with an input-output linearisation method for velocity tracking is built and validated. Simulation results, compared against Driver-in-the-Loop simulations performed by an experienced driver and a novice driver, show that this trajectory-following architecture is effective and feasible for highly automated driving: it plans a satisfying longitudinal speed profile and tracks the target path well and safely across different road geometries, ensuring good time efficiency and driving comfort simultaneously.

  14. A Fast-Time Simulation Tool for Analysis of Airport Arrival Traffic

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz; Meyn, Larry A.; Neuman, Frank

    2004-01-01

    The basic objective of arrival sequencing in air traffic control automation is to match traffic demand and airport capacity while minimizing delays. The performance of an automated arrival scheduling system, such as the Traffic Management Advisor developed by NASA for the FAA, can be studied by a fast-time simulation that does not involve running expensive and time-consuming real-time simulations. The fast-time simulation models runway configurations, the characteristics of arrival traffic, deviations from predicted arrival times, as well as the arrival sequencing and scheduling algorithm. This report reviews the development of the fast-time simulation method used originally by NASA in the design of the sequencing and scheduling algorithm for the Traffic Management Advisor. The utility of this method of simulation is demonstrated by examining the effect on delays of altering arrival schedules at a hub airport.

  15. Nexus: a modular workflow management system for quantum simulation codes

    DOE PAGES

    Krogel, Jaron T.

    2015-08-24

    The management of simulation workflows is a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.

  16. A Sidekick for Membrane Simulations: Automated Ensemble Molecular Dynamics Simulations of Transmembrane Helices

    PubMed Central

    Hall, Benjamin A; Halim, Khairul Abd; Buyan, Amanda; Emmanouil, Beatrice; Sansom, Mark S P

    2016-01-01

    The interactions of transmembrane (TM) α-helices with the phospholipid membrane and with one another are central to understanding the structure and stability of integral membrane proteins. These interactions may be analysed via coarse-grained molecular dynamics (CGMD) simulations. To obtain statistically meaningful analysis of TM helix interactions, large (N ca. 100) ensembles of CGMD simulations are needed. To facilitate the running and analysis of such ensembles of simulations we have developed Sidekick, an automated pipeline software for performing high throughput CGMD simulations of α-helical peptides in lipid bilayer membranes. Through an end-to-end approach, which takes as input a helix sequence and outputs analytical metrics derived from CGMD simulations, we are able to predict the orientation and likelihood of insertion into a lipid bilayer of a given helix of a family of helix sequences. We illustrate this software via analysis of insertion into a membrane of short hydrophobic TM helices containing a single cationic arginine residue positioned at different positions along the length of the helix. From analysis of these ensembles of simulations we estimate apparent energy barriers to insertion which are comparable to experimentally determined values. In a second application we use CGMD simulations to examine self-assembly of dimers of TM helices from the ErbB1 receptor tyrosine kinase, and analyse the numbers of simulation repeats necessary to obtain convergence of simple descriptors of the mode of packing of the two helices within a dimer. Our approach offers a proof-of-principle platform for the further employment of automation in large ensemble CGMD simulations of membrane proteins. PMID:26580541

  17. Sidekick for Membrane Simulations: Automated Ensemble Molecular Dynamics Simulations of Transmembrane Helices.

    PubMed

    Hall, Benjamin A; Halim, Khairul Bariyyah Abd; Buyan, Amanda; Emmanouil, Beatrice; Sansom, Mark S P

    2014-05-13

    The interactions of transmembrane (TM) α-helices with the phospholipid membrane and with one another are central to understanding the structure and stability of integral membrane proteins. These interactions may be analyzed via coarse grained molecular dynamics (CGMD) simulations. To obtain statistically meaningful analysis of TM helix interactions, large (N ca. 100) ensembles of CGMD simulations are needed. To facilitate the running and analysis of such ensembles of simulations, we have developed Sidekick, an automated pipeline software for performing high throughput CGMD simulations of α-helical peptides in lipid bilayer membranes. Through an end-to-end approach, which takes as input a helix sequence and outputs analytical metrics derived from CGMD simulations, we are able to predict the orientation and likelihood of insertion into a lipid bilayer of a given helix of a family of helix sequences. We illustrate this software via analyses of insertion into a membrane of short hydrophobic TM helices containing a single cationic arginine residue positioned at different positions along the length of the helix. From analyses of these ensembles of simulations, we estimate apparent energy barriers to insertion which are comparable to experimentally determined values. In a second application, we use CGMD simulations to examine the self-assembly of dimers of TM helices from the ErbB1 receptor tyrosine kinase and analyze the numbers of simulation repeats necessary to obtain convergence of simple descriptors of the mode of packing of the two helices within a dimer. Our approach offers a proof-of-principle platform for the further employment of automation in large ensemble CGMD simulations of membrane proteins.

  18. Suitability of Synthetic Driving Profiles from Traffic Micro-Simulation for Real-World Energy Analysis: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hou, Yunfei; Wood, Eric; Burton, Evan

    A shift towards increased levels of driving automation is generally expected to result in improved safety and traffic congestion outcomes. However, little empirical data exists to estimate the impact that automated driving could have on energy consumption and greenhouse gas emissions. In the absence of empirical data on differences between drive cycles from present day vehicles (primarily operated by humans) and future vehicles (partially or fully operated by computers), one approach is to model both situations over identical traffic conditions. Such an exercise requires traffic micro-simulation to not only accurately model vehicle operation under high levels of automation, but also (and potentially more challenging) vehicle operation under present day human drivers. This work seeks to quantify the ability of a commercial traffic micro-simulation program to accurately model real-world drive cycles in vehicles operated primarily by humans in terms of driving speed, acceleration, and simulated fuel economy. Synthetic profiles from models of freeway and arterial facilities near Atlanta, Georgia, are compared to empirical data collected from real-world drivers on the same facilities. Empirical and synthetic drive cycles are then simulated in a powertrain efficiency model to enable comparison on the basis of fuel economy. Synthetic profiles from traffic micro-simulation were found to exhibit low levels of transient behavior relative to the empirical data. Even with these differences, the synthetic and empirical data in this study agree well in terms of driving speed and simulated fuel economy. The differences in transient behavior between simulated and empirical data suggest that larger stochastic contributions in traffic micro-simulation (relative to those present in the traffic micro-simulation tool used in this study) are required to fully capture the arbitrary elements of human driving. Interestingly, the lack of stochastic contributions from models of human drivers in this study did not result in a significant discrepancy between fuel economy simulations based on synthetic and empirical data; a finding with implications for the potential energy efficiency gains of automated vehicle technology.

  19. Effects of ATC automation on precision approaches to closely spaced parallel runways

    NASA Technical Reports Server (NTRS)

    Slattery, R.; Lee, K.; Sanford, B.

    1995-01-01

    Improved navigational technology (such as the Microwave Landing System and the Global Positioning System) installed in modern aircraft will enable air traffic controllers to better utilize available airspace. Consequently, arrival traffic can fly approaches to parallel runways separated by smaller distances than are currently allowed. Previous simulation studies of advanced navigation approaches have found that controller workload is increased when there is a combination of aircraft that are capable of following advanced navigation routes and aircraft that are not. Research into Air Traffic Control automation at Ames Research Center has led to the development of the Center-TRACON Automation System (CTAS). The Final Approach Spacing Tool (FAST) is the component of the CTAS used in the TRACON area. The work in this paper examines, via simulation, the effects of FAST used for aircraft landing on closely spaced parallel runways. The simulation contained various combinations of aircraft, equipped and unequipped with advanced navigation systems. A set of simulations was run both manually and with an augmented set of FAST advisories to sequence aircraft, assign runways, and avoid conflicts. The results of the simulations are analyzed, measuring the airport throughput, aircraft delay, loss of separation, and controller workload.

  20. Data Handling and Communication

    NASA Astrophysics Data System (ADS)

    Hemmer, Frédéric; Innocenti, Pier Giorgio

    The following sections are included: * Introduction * Computing Clusters and Data Storage: The New Factory and Warehouse * Local Area Networks: Organizing Interconnection * High-Speed Worldwide Networking: Accelerating Protocols * Detector Simulation: Events Before the Event * Data Analysis and Programming Environment: Distilling Information * World Wide Web: Global Networking * References

  1. Use of an embedded, micro-randomised trial to investigate non-compliance in telehealth interventions.

    PubMed

    Law, Lisa M; Edirisinghe, Nuwani; Wason, James M S

    2016-08-01

    Many types of telehealth interventions rely on activity from the patient in order to have a beneficial effect on their outcome. Remote monitoring systems require the patient to record regular measurements at home, for example, blood pressure, so clinicians can see whether the patient's health changes over time and intervene if necessary. A big problem in this type of intervention is non-compliance. Most telehealth trials report compliance rates, but they rarely compare compliance among various options of telehealth delivery, of which there may be many. Optimising telehealth delivery is vital for improving compliance and, therefore, clinical outcomes. We propose a trial design which investigates ways of improving compliance. For efficiency, this trial is embedded in a larger trial for evaluating clinical effectiveness. It employs a technique called micro-randomisation, where individual patients are randomised multiple times throughout the study. The aims of this article are (1) to verify whether the presence of an embedded secondary trial still allows valid analysis of the primary research and (2) to demonstrate the usefulness of the micro-randomisation technique for comparing compliance interventions. Simulation studies were used to simulate a large number of clinical trials, in which no embedded trial was used, a micro-randomised embedded trial was used, and a factorial embedded trial was used. Each simulation recorded the operating characteristics of the primary and secondary trials. We show that the type I error rate of the primary analysis was not affected by the presence of an embedded secondary trial. Furthermore, we show that micro-randomisation is superior to a factorial design as it reduces the variation caused by within-patient correlation. 
It therefore requires smaller sample sizes - our simulations showed a requirement of 128 patients for a micro-randomised trial versus 760 patients for a factorial design, in the presence of within-patient correlation. We believe that an embedded, micro-randomised trial is a feasible technique that can potentially be highly useful in telehealth trials. © The Author(s) 2016.
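    The micro-randomisation idea, where a patient is re-randomised at every measurement occasion rather than once at study entry, can be sketched as follows. This is a minimal illustration, not the trial's actual design: the prompt types and compliance probabilities are invented for the example.

```python
import random

def micro_randomised_trial(n_patients=100, n_periods=10, effects=None, seed=1):
    """Sketch: each patient is re-randomised at every period to a prompt type.
    `effects` maps prompt -> probability the patient complies that period
    (hypothetical values)."""
    if effects is None:
        effects = {"none": 0.50, "sms": 0.65, "call": 0.80}
    rng = random.Random(seed)
    arms = list(effects)
    tallies = {a: [0, 0] for a in arms}   # [compliant periods, total periods]
    for _ in range(n_patients):
        for _ in range(n_periods):
            arm = rng.choice(arms)        # within-patient randomisation
            tallies[arm][1] += 1
            if rng.random() < effects[arm]:
                tallies[arm][0] += 1
    return {a: c / t for a, (c, t) in tallies.items()}

rates = micro_randomised_trial()
```

    A real analysis would model within-patient correlation explicitly; the design's appeal, as the record notes, is that each patient contributes periods to every arm and so serves as their own control, which is where the sample-size advantage over a factorial design comes from.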

  2. Prototype space station automation system delivered and demonstrated at NASA

    NASA Technical Reports Server (NTRS)

    Block, Roger F.

    1987-01-01

    The Automated Subsystem Control for Life Support System (ASCLSS) program has successfully developed and demonstrated a generic approach to the automation and control of Space Station subsystems. The hierarchical, distributed real-time control system places the required control authority at every level of the automation system architecture. As a demonstration of the automation technique, the ASCLSS system automated the Air Revitalization Group (ARG) of the Space Station regenerative Environmental Control and Life Support System (ECLSS) using real-time, high-fidelity simulators of the ARG processes. This automation system represents an early flight prototype and an important test bed for evaluating Space Station controls technology, including the future application of Ada software in real-time control and the development and demonstration of embedded artificial intelligence and expert systems (AI/ES) in distributed automation and control systems.

  3. Cooperative Vehicle–Highway Automation (CVHA) Technology : Simulation of Benefits and Operational Issues

    DOT National Transportation Integrated Search

    2017-03-01

    The past few years have witnessed a rapidly growing market in assistive driving technologies, designed to improve safety and operations by supporting driver performance. Often referred to as cooperative vehicle-highway automation (CVHA) systems, th...

  4. Automation reliability in unmanned aerial vehicle control: a reliance-compliance model of automation dependence in high workload.

    PubMed

    Dixon, Stephen R; Wickens, Christopher D

    2006-01-01

    Two experiments were conducted in which participants navigated a simulated unmanned aerial vehicle (UAV) through a series of mission legs while searching for targets and monitoring system parameters. The goal of the study was to highlight the qualitatively different effects of automation false alarms and misses as they relate to operator compliance and reliance, respectively. Background data suggest that automation false alarms cause reduced compliance, whereas misses cause reduced reliance. In two studies, 32 and 24 participants, including some licensed pilots, performed in-lab UAV simulations that presented the visual world and collected dependent measures. Results indicated that with the low-reliability aids, false alarms correlated with poorer performance in the system failure task, whereas misses correlated with poorer performance in the concurrent tasks. Compliance and reliance do appear to be affected by false alarms and misses, respectively, and are relatively independent of each other. Practical implications are that automated aids must be fairly reliable to provide global benefits and that false alarms and misses have qualitatively different effects on performance.

  5. Psychosocial factors associated with intended use of automated vehicles: A simulated driving study.

    PubMed

    Buckley, Lisa; Kaye, Sherrie-Anne; Pradhan, Anuj K

    2018-06-01

    This study applied the Theory of Planned Behavior (TPB) and the Technology Acceptance Model (TAM) to assess drivers' intended use of automated vehicles (AVs) after undertaking a simulated driving task. In addition, this study explored the potential for trust to account for additional variance to the psychosocial factors in TPB and TAM. Seventy-four participants (51% female) aged between 25 and 64 years (M = 42.8, SD = 12.9) undertook a 20 min simulated experimental drive in which participants experienced periods of automated driving and manual control. A survey task followed. A hierarchical regression analysis revealed that TPB constructs; attitude toward the behavior, subjective norms, and perceived behavioral control, were significant predictors of intentions to use AV. In addition, there was partial support for the test of TAM, with ease of use (but not usefulness) predicting intended use of AV (SAE Level 3). Trust contributed variance to both models beyond TPB or TAM constructs. The findings provide an important insight into factors that might reflect intended use of vehicles that are primarily automated (longitudinal, lateral, and manoeuvre controls) but require and allow drivers to have periods of manual control. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. AR-NE3A, a New Macromolecular Crystallography Beamline for Pharmaceutical Applications at the Photon Factory

    NASA Astrophysics Data System (ADS)

    Yamada, Yusuke; Hiraki, Masahiko; Sasajima, Kumiko; Matsugaki, Naohiro; Igarashi, Noriyuki; Amano, Yasushi; Warizaya, Masaichi; Sakashita, Hitoshi; Kikuchi, Takashi; Mori, Takeharu; Toyoshima, Akio; Kishimoto, Shunji; Wakatsuki, Soichi

    2010-06-01

    Recent advances in high-throughput techniques for macromolecular crystallography have highlighted the importance of structure-based drug design (SBDD), and the demand for synchrotron use by pharmaceutical researchers has increased. Thus, in collaboration with Astellas Pharma Inc., we have constructed a new high-throughput macromolecular crystallography beamline, AR-NE3A, which is dedicated to SBDD. At AR-NE3A, a photon flux up to three times higher than those at the existing high-throughput beamlines at the Photon Factory, AR-NW12A and BL-5A, can be realized at the same sample positions. Installed in the experimental hutch are a high-precision diffractometer, a fast-readout high-gain CCD detector, and a sample-exchange robot capable of handling more than two hundred cryo-cooled samples stored in a Dewar. To facilitate the high-throughput data collection required for pharmaceutical research, fully automated data collection and processing systems have been developed. Thus, sample exchange, centering, data collection, and data processing are automatically carried out based on the user's pre-defined schedule. Although Astellas Pharma Inc. has priority access to AR-NE3A, the remaining beam time is allocated to general academic and other industrial users.

  7. Simulations of Continuous Descent Operations with Arrival-management Automation and Mixed Flight-deck Interval Management Equipage

    NASA Technical Reports Server (NTRS)

    Callantine, Todd J.; Kupfer, Michael; Martin, Lynne Hazel; Prevot, Thomas

    2013-01-01

    Air traffic management simulations conducted in the Airspace Operations Laboratory at NASA Ames Research Center have addressed the integration of trajectory-based arrival-management automation, controller tools, and Flight-Deck Interval Management avionics to enable Continuous Descent Operations (CDOs) during periods of sustained high traffic demand. The simulations are devoted to maturing the integrated system for field demonstration, and refining the controller tools, clearance phraseology, and procedures specified in the associated concept of operations. The results indicate a variety of factors impact the concept's safety and viability from a controller's perspective, including en-route preconditioning of arrival flows, useable clearance phraseology, and the characteristics of airspace, routes, and traffic-management methods in use at a particular site. Clear understanding of automation behavior and required shifts in roles and responsibilities is important for controller acceptance and realizing potential benefits. This paper discusses the simulations, drawing parallels with results from related European efforts. The most recent study found en-route controllers can effectively precondition arrival flows, which significantly improved route conformance during CDOs. Controllers found the tools acceptable, in line with previous studies.

  8. Decision support automation research in the en route air traffic control environment

    DOT National Transportation Integrated Search

    2002-01-01

    This study examined the effect of automated decision support on Certified Professional Controller (CPC) behavior. Sixteen CPCs from Air Route Traffic Control Centers participated in human-in-the-loop simulations. CPCs controlled two levels of tra...

  9. The Japanese Positron Factory

    NASA Astrophysics Data System (ADS)

    Okada, S.; Sunaga, H.; Kaneko, H.; Takizawa, H.; Kawasuso, A.; Yotsumoto, K.; Tanaka, R.

    1999-06-01

    The Positron Factory has been planned at the Japan Atomic Energy Research Institute (JAERI). The factory is expected to produce linac-based monoenergetic positron beams with the world's highest intensities of more than 10^10 e+/sec, to be applied in R&D for materials science, biotechnology, and basic physics and chemistry. In this article, results of the design studies are presented for the following essential components of the facility: 1) conceptual design of a high-power electron linac with a beam energy of 100 MeV and an averaged beam power of 100 kW; 2) performance tests of the RF window in the high-power klystron and of the electron beam window; 3) development of a self-driven rotating electron-to-positron converter and its performance tests; 4) proposal of a multi-channel beam generation system for monoenergetic positrons, with a series of moderator assemblies based on a newly developed Monte Carlo simulation and a demonstrative experiment; 5) proposal of highly efficient moderator structures; 6) conceptual design of a local shield to suppress the surrounding radiation and activation levels.

  10. Rank-based permutation approaches for non-parametric factorial designs.

    PubMed

    Umlauft, Maria; Konietschke, Frank; Pauly, Markus

    2017-11-01

    Inference methods for null hypotheses formulated in terms of distribution functions in general non-parametric factorial designs are studied. The methods can be applied to continuous, ordinal or even ordered categorical data in a unified way, and are based only on ranks. In this setup, Wald-type and ANOVA-type statistics are the current state of the art. The former is asymptotically exact but a rather liberal statistical testing procedure for small to moderate sample sizes, while the latter is only an approximation which does not possess the correct asymptotic α level under the null. To bridge these gaps, a novel permutation approach is proposed which can be seen as a flexible generalization of the Kruskal-Wallis test to all kinds of factorial designs with independent observations. It is proven that the permutation principle is asymptotically correct while keeping its finite exactness property when data are exchangeable. The results of extensive simulation studies support these theoretical findings. A real data set exemplifies its applicability. © 2017 The British Psychological Society.
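
    The permutation principle described above can be illustrated with a minimal sketch. This is illustrative only, not the authors' Wald- or ANOVA-type statistics: group labels are permuted and a Kruskal-Wallis-style statistic on pooled ranks is recomputed each time.

```python
import numpy as np

def rank_permutation_test(groups, n_perm=5000, seed=0):
    """Permutation p-value for a Kruskal-Wallis-style statistic on pooled ranks.

    Illustrative sketch only; ties are broken arbitrarily rather than mid-ranked.
    """
    rng = np.random.default_rng(seed)
    data = np.concatenate(groups)
    labels = np.concatenate([np.full(len(g), i) for i, g in enumerate(groups)])
    ranks = data.argsort().argsort() + 1.0  # pooled ranks 1..N

    def statistic(lab):
        # Between-group dispersion of mean ranks around the grand mean rank
        grand = ranks.mean()
        return sum(
            np.sum(lab == i) * (ranks[lab == i].mean() - grand) ** 2
            for i in range(len(groups))
        )

    observed = statistic(labels)
    exceed = sum(statistic(rng.permutation(labels)) >= observed for _ in range(n_perm))
    return (exceed + 1) / (n_perm + 1)  # standard permutation p-value
```

    With clearly separated groups the p-value approaches its minimum of 1/(n_perm + 1); for exchangeable data it is approximately uniform, which is the finite-exactness property the abstract refers to.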

  11. System reliability, performance and trust in adaptable automation.

    PubMed

    Chavaillaz, Alain; Wastell, David; Sauer, Jürgen

    2016-01-01

    The present study examined the effects of reduced system reliability on operator performance and automation management in an adaptable automation environment. 39 operators were randomly assigned to one of three experimental groups: low (60%), medium (80%), and high (100%) reliability of automation support. The support system provided five incremental levels of automation which operators could freely select according to their needs. After 3 h of training on a simulated process control task (AutoCAMS) in which the automation worked infallibly, operator performance and automation management were measured during a 2.5-h testing session. Trust and workload were also assessed through questionnaires. Results showed that although reduced system reliability resulted in lower levels of trust towards automation, there were no corresponding differences in the operators' reliance on automation. While operators showed overall a noteworthy ability to cope with automation failure, there were, however, decrements in diagnostic speed and prospective memory with lower reliability. Copyright © 2015. Published by Elsevier Ltd.

  12. A control strategy for grid-side converter of DFIG under unbalanced condition based on DIgSILENT/PowerFactory

    NASA Astrophysics Data System (ADS)

    Han, Pingping; Zhang, Haitian; Chen, Lingqi; Zhang, Xiaoan

    2018-01-01

    Models of the doubly fed induction generator (DFIG) and its grid-side converter (GSC) are established under unbalanced grid conditions in DIgSILENT/PowerFactory. From the mathematical model, vector equations for the positive- and negative-sequence voltages and currents are derived in the positive-sequence synchronous rotating d-q-0 reference frame, taking the characteristics of the simulation software fully into account. The reference values of the GSC current components in the positive-sequence d-q-0 frame under unbalanced conditions are then obtained, improving on the traditional GSC control by incorporating national limits on unbalanced current. Simulation results indicate that the control strategy effectively suppresses the negative-sequence current and the double-frequency power oscillation on the GSC's AC side, and that the DC-bus voltage can be held constant to ensure uninterrupted DFIG operation under unbalanced grid conditions.
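
    The positive/negative-sequence decomposition underlying this kind of control is the classical Fortescue (symmetrical-component) transform; a minimal sketch follows, illustrative only and not the authors' DIgSILENT/PowerFactory model.

```python
import numpy as np

a = np.exp(2j * np.pi / 3)  # 120-degree rotation operator

def symmetrical_components(va, vb, vc):
    """Return (zero, positive, negative) sequence phasors of a three-phase set."""
    A = np.array([[1, 1, 1],
                  [1, a, a ** 2],
                  [1, a ** 2, a]]) / 3.0
    return A @ np.array([va, vb, vc])

# A balanced a-b-c set maps entirely onto the positive sequence;
# any unbalance shows up as nonzero negative- (and/or zero-) sequence terms.
v0, v1, v2 = symmetrical_components(1.0, a ** 2, a)
```

    A controller of the type described extracts the negative-sequence component this way (in a rotating reference frame) and drives it toward zero.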

  13. Lidar/DIAL detection of bomb factories

    NASA Astrophysics Data System (ADS)

    Fiorani, Luca; Puiu, Adriana; Rosa, Olga; Palucci, Antonio

    2013-10-01

    One of the aims of the project BONAS (BOmb factory detection by Networks of Advanced Sensors) is to develop a lidar/DIAL (differential absorption lidar) instrument to detect precursors employed in the manufacture of improvised explosive devices (IEDs). First, a spectroscopic study was carried out: the infrared (IR) gas-phase spectrum of acetone, one of the most important IED precursors, was procured from available databases and checked with cell measurements. Then, the feasibility of a lidar/DIAL for the detection of acetone vapors was demonstrated in the laboratory, simulating the experimental conditions of a field campaign. Eventually, with measurements in a real scenario in mind, an interference study was performed, looking for all known compounds that share with acetone an IR absorption in the spectral band selected for its detection. Possible interfering species were investigated by simulating both urban and industrial atmospheres, and the limits of acetone detection in both environments were identified. This study confirmed that a lidar/DIAL can detect low concentrations of acetone at considerable distances.

  14. Intelligent robot trends for 1998

    NASA Astrophysics Data System (ADS)

    Hall, Ernest L.

    1998-10-01

    An intelligent robot is a remarkably useful combination of a manipulator, sensors, and controls. The use of these machines in factory automation can improve productivity, increase product quality, and improve competitiveness. This paper presents a discussion of recent technical and economic trends. Technically, the machines are faster, cheaper, more repeatable, more reliable, and safer. The knowledge base of inverse kinematic and dynamic solutions and intelligent controls is increasing. More attention is being given by industry to robots, vision, and motion controls. New areas of usage are emerging for service robots, remote manipulators, and automated guided vehicles. Economically, the robotics industry now has a 1.1-billion-dollar market in the U.S. and is growing. Feasibility-study results are presented which also show decreasing costs for robots and unaudited healthy rates of return for a variety of robotic applications. However, the road from inspiration to successful application can be long and difficult, often taking decades to achieve a new product. A greater emphasis on mechatronics is needed in our universities. Certainly, more cooperation between government, industry, and universities is needed to speed the development of intelligent robots that will benefit industry and society.

  15. A multi-level analysis of the effects of age and gender stereotypes on trust in anthropomorphic technology by younger and older adults.

    PubMed

    Pak, Richard; McLaughlin, Anne Collins; Bass, Brock

    2014-01-01

    Previous research has shown that gender stereotypes, elicited by the appearance of the anthropomorphic technology, can alter perceptions of system reliability. The current study examined whether stereotypes about the perceived age and gender of anthropomorphic technology interacted with reliability to affect trust in such technology. Participants included a cross-section of younger and older adults. Through a factorial survey, participants responded to health-related vignettes containing anthropomorphic technology with a specific age, gender, and level of past reliability by rating their trust in the system. Trust in the technology was affected by the age and gender of the user as well as by its appearance and reliability. Perceptions of anthropomorphic technology can be affected by pre-existing stereotypes about the capability of a specific age or gender. The perceived age and gender of automation can alter perceptions of the anthropomorphic technology, such as trust. Thus, designers of automation should design anthropomorphic interfaces with an awareness that the perceived age and gender will interact with the user’s age and gender.

  16. Kepler Planet Detection Metrics: Automatic Detection of Background Objects Using the Centroid Robovetter

    NASA Technical Reports Server (NTRS)

    Mullally, Fergal

    2017-01-01

    We present an automated method of identifying background eclipsing binaries masquerading as planet candidates in the Kepler planet candidate catalogs. We codify the manual vetting process for Kepler Objects of Interest (KOIs) described in Bryson et al. (2013) with a series of measurements and tests that can be performed algorithmically. We compare our automated results with a sample of manually vetted KOIs from the catalog of Burke et al. (2014) and find excellent agreement. We test the performance on a set of simulated transits and find our algorithm correctly identifies simulated false positives approximately 50% of the time, and correctly identifies 99% of simulated planet candidates.

  17. Design of a Screen Based Simulation for Training and Automated Assessment of Teamwork Skills

    DTIC Science & Technology

    2017-08-01

    AWARD NUMBER: W81XWH-16-1-0308 TITLE: Design of a Screen-Based Simulation for Training and Automated Assessment of Teamwork Skills PRINCIPAL... The views expressed are those of the author(s) and should not be construed as an official Department of the Army position, policy or decision unless so designated by other... REPORT DATE: August 2017; REPORT TYPE: Annual.

  18. Information prioritization for control and automation of space operations

    NASA Technical Reports Server (NTRS)

    Ray, Asock; Joshi, Suresh M.; Whitney, Cynthia K.; Jow, Hong N.

    1987-01-01

    The applicability of a real-time information prioritization technique to the development of a decision support system for control and automation of Space Station operations is considered. The steps involved in the technique are described, including the definition of abnormal scenarios and of attributes, measures of individual attributes, formulation and optimization of a cost function, simulation of test cases on the basis of the cost function, and examination of the simulation scenarios. A list is given comparing the intrinsic importances of various Space Station information data.
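
    The cost-function step can be sketched as a simple weighted sum over normalized attribute measures. The attribute names, weights, and message contents below are hypothetical, since the abstract does not specify the actual cost function used.

```python
# Hypothetical attributes and weights for illustration only; not the
# cost function actually used in the Space Station study.
def priority_score(message, weights):
    """Linear cost function over normalized (0..1) attribute measures."""
    return sum(weights[k] * message[k] for k in weights)

weights = {"severity": 0.5, "time_criticality": 0.3, "crew_relevance": 0.2}
messages = [
    {"id": "cabin_pressure", "severity": 0.9, "time_criticality": 0.8, "crew_relevance": 1.0},
    {"id": "telemetry_lag", "severity": 0.2, "time_criticality": 0.4, "crew_relevance": 0.1},
]
# Highest-cost (most urgent) information is presented first.
ranked = sorted(messages, key=lambda m: priority_score(m, weights), reverse=True)
```

    Optimizing such a function then amounts to choosing the weights so that the resulting ordering matches expert judgments on the simulated abnormal scenarios.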

  19. 'SON-GO-KU' : a dream of automated library

    NASA Astrophysics Data System (ADS)

    Sato, Mamoru; Kishimoto, Juji

    In the process of automating libraries, the retrieval of books through the browsing of shelves tends to be overlooked. The telematic library is a document-based DBMS that can deliver the contents of books by simulating the browsing process. The retrieval simulates the process a person would use in selecting a book in a real library, with a visual presentation on a graphic display substituted for the physical shelves. The characteristics of "Son-Go-Ku", a prototype system for such retrieval implemented in 1988, are described.

  20. Comparison of oral surgery task performance in a virtual reality surgical simulator and an animal model using objective measures.

    PubMed

    Ioannou, Ioanna; Kazmierczak, Edmund; Stern, Linda

    2015-01-01

    The use of virtual reality (VR) simulation for surgical training has gathered much interest in recent years. Despite increasing popularity and usage, limited work has been carried out in the use of automated objective measures to quantify the extent to which performance in a simulator resembles performance in the operating theatre, and the effects of simulator training on real world performance. To this end, we present a study exploring the effects of VR training on the performance of dentistry students learning a novel oral surgery task. We compare the performance of trainees in a VR simulator and in a physical setting involving ovine jaws, using a range of automated metrics derived by motion analysis. Our results suggest that simulator training improved the motion economy of trainees without adverse effects on task outcome. Comparison of surgical technique on the simulator with the ovine setting indicates that simulator technique is similar, but not identical to real world technique.
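
    One common automated motion-economy metric in such studies is the total path length of the tracked instrument tip; a minimal sketch is shown below (the abstract does not enumerate the specific metrics used, so this is only a representative example).

```python
import math

def path_length(points):
    """Total 3-D path length of a tracked instrument tip, a simple
    motion-economy measure (shorter paths indicate more economical motion)."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))
```

    Comparing this measure before and after simulator training, or between simulator and physical settings, gives the kind of objective comparison the study describes.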

  1. Keep Your Scanners Peeled: Gaze Behavior as a Measure of Automation Trust During Highly Automated Driving.

    PubMed

    Hergeth, Sebastian; Lorenz, Lutz; Vilimek, Roman; Krems, Josef F

    2016-05-01

    The feasibility of measuring drivers' automation trust via gaze behavior during highly automated driving was assessed with eye tracking and validated against self-reported automation trust in a driving simulator study. Earlier research from other domains indicates that drivers' automation trust might be inferred from gaze behavior, such as monitoring frequency. The gaze behavior and self-reported automation trust of 35 participants attending to a visually demanding non-driving-related task (NDRT) during highly automated driving were evaluated. The relationships of dispositional, situational, and learned automation trust with gaze behavior were compared. Overall, there was a consistent relationship between drivers' automation trust and gaze behavior: participants reporting higher automation trust tended to monitor the automation less frequently. Further analyses revealed that higher automation trust was associated with lower monitoring frequency of the automation during NDRTs, and that an increase in trust over the experimental session was connected with a decrease in monitoring frequency. We suggest that (a) the current results indicate a negative relationship between drivers' self-reported automation trust and monitoring frequency, (b) gaze behavior provides a more direct measure of automation trust than other behavioral measures, and (c) with further refinement, drivers' automation trust during highly automated driving might be inferred from gaze behavior. Potential applications of this research include the estimation of drivers' automation trust and reliance during highly automated driving. © 2016, Human Factors and Ergonomics Society.
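
    A monitoring-frequency measure of the kind described can be derived from a labeled gaze stream by counting entries into an area of interest (AOI) per unit time. The sample format and AOI label below are hypothetical; the study's actual eye-tracking pipeline is not described in the abstract.

```python
def monitoring_frequency(samples, aoi="instrument_cluster"):
    """Glances (entries into the AOI) per minute of recording.

    `samples` is a hypothetical list of (timestamp_seconds, aoi_label) pairs
    in chronological order.
    """
    glances = 0
    inside = False
    for _, label in samples:
        if label == aoi and not inside:
            glances += 1  # count only transitions into the AOI
        inside = (label == aoi)
    duration_min = (samples[-1][0] - samples[0][0]) / 60.0
    return glances / duration_min
```

    Lower values of this measure would correspond to the higher-trust drivers in the study, who monitored the automation less frequently.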

  2. Systems Operations Studies for Automated Guideway Transit Systems : Discrete Event Simulation Model User's Manual.

    DOT National Transportation Integrated Search

    1982-06-01

    In order to examine specific Automated Guideway Transit (AGT) developments and concepts, and to build a better knowledge base for future decision-making, the Urban Mass Transportation Administration (UMTA) undertook a new program of studies and techn...

  3. Initial Wave-Type Identification with Neural Networks and its Contribution to Automated Processing in IMS Version 3.0

    DTIC Science & Technology

    1993-12-10

    applied to the 3-component IRIS/IDA data under simulated operational conditions. The result was a reduction of about 60% in the number of false alarms produced by the automated processing and interpretation system.

  4. Software technology insertion: A study of success factors

    NASA Technical Reports Server (NTRS)

    Lydon, Tom

    1990-01-01

    Managing software development in large organizations has become increasingly difficult due to increasing technical complexity, stricter government standards, a shortage of experienced software engineers, competitive pressure for improved productivity and quality, the need to co-develop hardware and software together, and the rapid changes in both hardware and software technology. The 'software factory' approach to software development minimizes risks while maximizing productivity and quality through standardization, automation, and training. However, in practice, this approach is relatively inflexible when adopting new software technologies. The methods that a large multi-project software engineering organization can use to increase the likelihood of successful software technology insertion (STI), especially in a standardized engineering environment, are described.

  5. iPTF discovery and identification of bright transients

    NASA Astrophysics Data System (ADS)

    Adams, Scott; Karamehmetoglu, Emir; Roy, Rupak; Neill, James D.; Walters, Richard; Cook, Dave; Kupfer, Thomas; Cannella, Chris; Blagorodnova, Nadejda; Yan, Lin; Kasliwal, Mansi; Kulkarni, Shri

    2017-02-01

    The intermediate Palomar Transient Factory (ATel #4807) reports the discovery of the following bright transients. We report as ATel alerts all objects brighter than 19 mag. Our discoveries are reported in two filters: sdss-g and Mould-I, denoted as g and I. All magnitudes are obtained using difference image photometry based on the PTFIDE pipeline described in Masci et al. 2016. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R), and RB5 (Wozniak et al. 2013AAS...22143105W).

  6. FY2017 Report on NISC Measurements and Detector Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, Madison Theresa; Meierbachtol, Krista Cruse; Jordan, Tyler Alexander

    FY17 work focused on automation of both the measurement analysis and the comparison with simulations. The experimental apparatus was relocated, and weeks of continuous measurements of the spontaneous fission source 252Cf were performed. Programs were developed to automate the conversion of measurements into ROOT data-framework files with a simple terminal input. The complete analysis of a measurement (which includes energy calibration and the identification of correlated counts) can now likewise be completed through a documented process involving a single execution line. Finally, the hurdle of slow MCNP simulations yielding low statistics has been overcome with the generation of multi-run suites that use the high-performance computing resources at LANL. Preliminary comparisons of measurements and simulations have been performed and will be the focus of FY18 work.

  7. The ability of healthy volunteers to simulate a neurologic field defect on automated perimetry.

    PubMed

    Ghate, Deepta; Bodnarchuk, Brian; Sanders, Sheila; Deokule, Sunil; Kedar, Sachin

    2014-03-01

    To determine if volunteers can simulate and reproduce 3 types of neurologic field defects: hemianopia, quadrantanopia, and central scotoma. Cross-sectional study. Thirty healthy volunteers new to perimetry (including automated perimetry). After informed consent, volunteers were randomized to 1 of the 3 visual field defects listed above. All visual field testing was performed on the right eye using the Humphrey Field Analyzer (HFA; Carl Zeiss Meditec, Dublin, CA) SITA Fast 24-2 protocol. Each volunteer was provided with standard new patient instructions and was shown a diagram of the defect to be simulated. Two sets of visual fields were performed on the right eye with 10 minutes between tests. Three experts used the Ocular Hypertension Treatment Study reading center criteria and determined if the simulation was successful. Proportion of volunteers able to simulate the assigned visual field. All 10 volunteers (100%) successfully simulated a hemianopia on the first and second fields. All 10 volunteers (100%) simulated a quadrantanopia on the first field and 9 (90%) did so on the second field. Eight volunteers (80%) successfully simulated a central scotoma in the first field and all 10 (100%) did so in the second field. Reliability criteria were excellent. Forty-seven fields (78%) had 0 fixation losses, 48 (80%) had 0 false-positive results, and 44 (73%) had 0 false-negative results. It is easy to simulate reproducible and reliable neurologic field defects on automated perimetry using the HFA. Copyright © 2014 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  8. Development of microcomputer-based mental acuity tests.

    PubMed

    Turnage, J J; Kennedy, R S; Smith, M G; Baltzley, D R; Lane, N E

    1992-10-01

    Recent disasters have focused attention on performance problems due to the use of alcohol and controlled substances in the workplace. Environmental stressors such as thermal extremes, mixed gases, noise, motion, and vibration also have adverse effects on human performance and operator efficiency. However, the lack of a standardized, sensitive, human performance assessment battery has probably delayed the systematic study of the deleterious effects of various toxic chemicals and drugs at home and in the workplace. The collective goal of the research reported here is the development of a menu of tests embedded in a coherent package of hardware and software that may be useful in repeated-measures studies of a broad range of agents that can degrade human performance. A menu of 40 tests from the Automated Performance Test System (APTS) is described, and the series of interlocking studies supporting its development is reviewed. The APTS tests, which run on several versions of laptop portables and desktop personal computers, have been shown to be stable, reliable, and factorially rich, and to have predictive validities with holistic measures of intelligence and simulator performances. In addition, sensitivity studies have been conducted in which performance changes due to stressors, agents, and treatments were demonstrated. We believe that tests like those described here have prospective use as an adjunct to urine testing for the screening for performance loss of individuals who are granted access to workplaces and stations that impact public safety.

  9. Development of microcomputer-based mental acuity tests

    NASA Technical Reports Server (NTRS)

    Turnage, J. J.; Kennedy, R. S.; Smith, M. G.; Baltzley, D. R.; Lane, N. E.

    1992-01-01

    Recent disasters have focused attention on performance problems due to the use of alcohol and controlled substances in the workplace. Environmental stressors such as thermal extremes, mixed gases, noise, motion, and vibration also have adverse effects on human performance and operator efficiency. However, the lack of a standardized, sensitive, human performance assessment battery has probably delayed the systematic study of the deleterious effects of various toxic chemicals and drugs at home and in the workplace. The collective goal of the research reported here is the development of a menu of tests embedded in a coherent package of hardware and software that may be useful in repeated-measures studies of a broad range of agents that can degrade human performance. A menu of 40 tests from the Automated Performance Test System (APTS) is described, and the series of interlocking studies supporting its development is reviewed. The APTS tests, which run on several versions of laptop portables and desktop personal computers, have been shown to be stable, reliable, and factorially rich, and to have predictive validities with holistic measures of intelligence and simulator performances. In addition, sensitivity studies have been conducted in which performance changes due to stressors, agents, and treatments were demonstrated. We believe that tests like those described here have prospective use as an adjunct to urine testing for the screening for performance loss of individuals who are granted access to workplaces and stations that impact public safety.

  10. The relationship between cell phone use and management of driver fatigue: It's complicated.

    PubMed

    Saxby, Dyani Juanita; Matthews, Gerald; Neubauer, Catherine

    2017-06-01

    Voice communication may enhance performance during monotonous, potentially fatiguing driving conditions (Atchley & Chan, 2011); however, it is unclear whether safety benefits of conversation are outweighed by costs. The present study tested whether personalized conversations intended to simulate hands-free cell phone conversation may counter objective and subjective fatigue effects elicited by vehicle automation. A passive fatigue state (Desmond & Hancock, 2001), characterized by disengagement from the task, was induced using full vehicle automation prior to drivers resuming full control over the driving simulator. A conversation was initiated shortly after reversion to manual control. During the conversation an emergency event occurred. The fatigue manipulation produced greater task disengagement and slower response to the emergency event, relative to a control condition. Conversation did not mitigate passive fatigue effects; rather, it added worry about matters unrelated to the driving task. Conversation moderately improved vehicle control, as measured by SDLP, but it failed to counter fatigue-induced slowing of braking in response to an emergency event. Finally, conversation appeared to have a hidden danger in that it reduced drivers' insights into performance impairments when in a state of passive fatigue. Automation induced passive fatigue, indicated by loss of task engagement; yet, simulated cell phone conversation did not counter the subjective automation-induced fatigue. Conversation also failed to counter objective loss of performance (slower braking speed) resulting from automation. Cell phone conversation in passive fatigue states may impair drivers' awareness of their performance deficits. Practical applications: Results suggest that conversation, even using a hands-free device, may not be a safe way to reduce fatigue and increase alertness during transitions from automated to manual vehicle control. 
Copyright © 2017 Elsevier Ltd and National Safety Council. All rights reserved.

  11. Response of Spartina Alterniflora to Sea Level Rise, Changing Precipitation Patterns, and Eutrophication

    EPA Science Inventory

    Sea level rise, precipitation, and eutrophication (3 X 3 X 2 factorial design) were simulated in tidal mesocosms in the US EPA Narragansett greenhouse. Each precipitation treatment (storm, drought, ambient rain) was represented in one of two tanks (control, fertilized). The contr...

  12. Configuring the Orion Guidance, Navigation, and Control Flight Software for Automated Sequencing

    NASA Technical Reports Server (NTRS)

    Odegard, Ryan G.; Siliwinski, Tomasz K.; King, Ellis T.; Hart, Jeremy J.

    2010-01-01

    The Orion Crew Exploration Vehicle is being designed with greater automation capabilities than any other crewed spacecraft in NASA's history. The Guidance, Navigation, and Control (GN&C) flight software architecture is designed to provide a flexible and evolvable framework that accommodates increasing levels of automation over time. Within the GN&C flight software, a data-driven approach is used to configure software. This approach allows data reconfiguration and updates to automated sequences without requiring recompilation of the software. Because of the great dependency of the automation and the flight software on the configuration data, the data management is a vital component of the processes for software certification, mission design, and flight operations. To enable the automated sequencing and data configuration of the GN&C subsystem on Orion, a desktop database configuration tool has been developed. The database tool allows the specification of the GN&C activity sequences, the automated transitions in the software, and the corresponding parameter reconfigurations. These aspects of the GN&C automation on Orion are all coordinated via data management, and the database tool provides the ability to test the automation capabilities during the development of the GN&C software. In addition to providing the infrastructure to manage the GN&C automation, the database tool has been designed with capabilities to import and export artifacts for simulation analysis and documentation purposes. Furthermore, the database configuration tool, currently used to manage simulation data, is envisioned to evolve into a mission planning tool for generating and testing GN&C software sequences and configurations.
A key enabler of the GN&C automation design, the database tool allows both the creation and maintenance of the data artifacts, as well as serving the critical role of helping to manage, visualize, and understand the data-driven parameters both during software development and throughout the life of the Orion project.

  13. Automated electric valve for electrokinetic separation in a networked microfluidic chip.

    PubMed

    Cui, Huanchun; Huang, Zheng; Dutta, Prashanta; Ivory, Cornelius F

    2007-02-15

    This paper describes an automated electric valve system designed to reduce dispersion and sample loss into a side channel when an electrokinetically mobilized concentration zone passes a T-junction in a networked microfluidic chip. One way to reduce dispersion is to control current streamlines since charged species are driven along them in the absence of electroosmotic flow. Computer simulations demonstrate that dispersion and sample loss can be reduced by applying a constant additional electric field in the side channel to straighten current streamlines in linear electrokinetic flow (zone electrophoresis). This additional electric field was provided by a pair of platinum microelectrodes integrated into the chip in the vicinity of the T-junction. Both simulations and experiments of this electric valve with constant valve voltages were shown to provide unsatisfactory valve performance during nonlinear electrophoresis (isotachophoresis). On the basis of these results, however, an automated electric valve system was developed with improved valve performance. Experiments conducted with this system showed decreased dispersion and increased reproducibility as protein zones isotachophoretically passed the T-junction. Simulations of the automated electric valve offer further support that the desired shape of current streamlines was maintained at the T-junction during isotachophoresis. Valve performance was evaluated at different valve currents based on statistical variance due to dispersion. With the automated control system, two integrated microelectrodes provide an effective way to manipulate current streamlines, thus acting as an electric valve for charged species in electrokinetic separations.

  14. NextGen Technologies on the FAA's Standard Terminal Automation Replacement System

    NASA Technical Reports Server (NTRS)

    Witzberger, Kevin; Swenson, Harry; Martin, Lynne; Lin, Melody; Cheng, Jinn-Hwei

    2014-01-01

    This paper describes the integration, evaluation, and results from a high-fidelity human-in-the-loop (HITL) simulation of key NASA Air Traffic Management Technology Demonstration-1 (ATD-1) technologies implemented in an enhanced version of the FAA's Standard Terminal Automation Replacement System (STARS) platform. These ATD-1 technologies include: (1) a NASA enhanced version of the FAA's Time-Based Flow Management, (2) a NASA ground-based automation technology known as controller-managed spacing (CMS), and (3) a NASA advanced avionics airborne technology known as flight-deck interval management (FIM). These ATD-1 technologies have been extensively tested in large-scale HITL simulations using general-purpose workstations to study air transportation technologies. These general-purpose workstations perform multiple functions and are collectively referred to as the Multi-Aircraft Control System (MACS). Researchers at NASA Ames Research Center and Raytheon collaborated to augment the STARS platform by including CMS and FIM advisory tools to validate the feasibility of integrating these automation enhancements into the current FAA automation infrastructure. NASA Ames acquired three STARS terminal controller workstations, and then integrated the ATD-1 technologies. HITL simulations were conducted to evaluate the ATD-1 technologies when using the STARS platform. These results were compared with the results obtained when the ATD-1 technologies were tested in the MACS environment. Results collected from the numerical data show acceptably minor differences, and, together with the subjective controller questionnaires showing a trend towards preferring STARS, validate the ATD-1/STARS integration.

  15. Highway Traffic Simulations on Multi-Processor Computers

    DOT National Transportation Integrated Search

    1997-01-01

    A computer model has been developed to simulate highway traffic for various degrees of automation with a high degree of fidelity in regard to driver control and vehicle characteristics. The model simulates vehicle maneuvering in a multi-lane highway ...

  16. Simulation Models for the Electric Power Requirements in a Guideway Transit System

    DOT National Transportation Integrated Search

    1980-04-01

    This report describes a computer simulation model developed at the Transportation Systems Center to study the electrical power distribution characteristics of Automated Guideway Transit (AGT) systems. The objective of this simulation effort is to pro...

  17. Work Practice Simulation of Complex Human-Automation Systems in Safety Critical Situations: The Brahms Generalized Überlingen Model

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Linde, Charlotte; Seah, Chin; Shafto, Michael

    2013-01-01

    The transition from the current air traffic system to the next generation air traffic system will require the introduction of new automated systems, including transferring some functions from air traffic controllers to on-board automation. This report describes a new design verification and validation (V&V) methodology for assessing aviation safety. The approach involves a detailed computer simulation of work practices that includes people interacting with flight-critical systems. The research is part of an effort to develop new modeling and verification methodologies that can assess the safety of flight-critical systems, system configurations, and operational concepts. The 2002 Überlingen mid-air collision was chosen for analysis and modeling because one of the main causes of the accident was one crew's response to a conflict between the instructions of the air traffic controller and those of TCAS, the automated on-board Traffic Alert and Collision Avoidance System. It thus furnishes an example of the problem of authority versus autonomy. It provides a starting point for exploring authority/autonomy conflict in the larger system of organization, tools, and practices in which the participants' moment-by-moment actions take place. We have developed a general air traffic system model (not a specific simulation of Überlingen events), called the Brahms Generalized Ueberlingen Model (Brahms-GUeM). Brahms is a multi-agent simulation system that models people, tools, facilities/vehicles, and geography to simulate the current air transportation system as a collection of distributed, interactive subsystems (e.g., airports, air-traffic control towers and personnel, aircraft, automated flight systems and air-traffic tools, instruments, crew).
Brahms-GUeM can be configured in different ways, called scenarios, such that anomalous events that contributed to the Überlingen accident can be modeled as functioning according to requirements or in an anomalous condition, as occurred during the accident. Brahms-GUeM thus implicitly defines a class of scenarios, which include as an instance what occurred at Überlingen. Brahms-GUeM is a modeling framework enabling "what if" analysis of alternative work system configurations and thus facilitating design of alternative operations concepts. It enables subsequent adaptation (reusing simulation components) for modeling and simulating NextGen scenarios. This project demonstrates that BRAHMS provides the capacity to model the complexity of air transportation systems, going beyond idealized and simple flights to include, for example, the interaction of pilots and ATCOs. The research shows clearly that verification and validation must include the entire work system: on the one hand, to check that mechanisms exist to handle failures of communication and alerting subsystems and/or failures of people to notice, comprehend, or communicate problematic (unsafe) situations; on the other, to understand how people must use their own judgment in relating fallible systems like TCAS to other sources of information, and thus to evaluate how the unreliability of automation affects system safety. The simulation shows in particular that distributed agents (people and automated systems) acting without knowledge of each other's actions can create a complex, dynamic system whose interactive behavior is unexpected and is changing too quickly to comprehend and control.

  18. Automated alignment of a reconfigurable optical system using focal-plane sensing and Kalman filtering.

    PubMed

    Fang, Joyce; Savransky, Dmitry

    2016-08-01

    Automation of alignment tasks can provide improved efficiency and greatly increase the flexibility of an optical system. Current optical systems with automated alignment capabilities are typically designed to include a dedicated wavefront sensor. Here, we demonstrate a self-aligning method for a reconfigurable system using only focal plane images. We define a two lens optical system with 8 degrees of freedom. Images are simulated given misalignment parameters using ZEMAX software. We perform a principal component analysis on the simulated data set to obtain Karhunen-Loève modes, which form the basis set whose weights are the system measurements. A model function, which maps the state to the measurement, is learned using nonlinear least-squares fitting and serves as the measurement function for the nonlinear estimator (extended and unscented Kalman filters) used to calculate control inputs to align the system. We present and discuss simulated and experimental results of the full system in operation.
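    The estimator loop described above can be sketched with a minimal scalar Kalman filter. The measurement model, noise levels, and single misalignment state below are hypothetical stand-ins for the paper's 8-degree-of-freedom system and its extended/unscented filters; only the predict/update cycle is illustrated.

```python
import random

def kalman_align(measurements, q=1e-6, r=0.04):
    """Scalar Kalman filter: estimate a (nearly) constant misalignment
    from noisy focal-plane-derived measurements.
    q: process noise variance, r: measurement noise variance."""
    x, p = 0.0, 1.0          # initial state estimate and covariance
    for z in measurements:
        p += q               # predict (the misalignment is assumed constant)
        k = p / (p + r)      # Kalman gain
        x += k * (z - x)     # update with the innovation
        p *= (1.0 - k)
        # an alignment controller would now command a correction of -x
    return x

random.seed(0)
true_misalignment = 0.35     # hypothetical tilt, arbitrary units
zs = [true_misalignment + random.gauss(0.0, 0.2) for _ in range(200)]
estimate = kalman_align(zs)
```

    In the full system the state would be the vector of misalignment parameters and the measurement function the learned mapping from state to Karhunen-Loève mode weights; the scalar version only shows how the filter smooths noisy measurements into a control input.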

  19. Simulation and Automation of Microwave Frequency Control in Dynamic Nuclear Polarization for Solid Polarized Targets

    NASA Astrophysics Data System (ADS)

    Perera, Gonaduwage; Johnson, Ian; Keller, Dustin

    2017-09-01

    Dynamic Nuclear Polarization (DNP) is used in most solid polarized target scattering experiments. The target materials must be irradiated with microwaves at a frequency determined by the difference between the nuclear Larmor and electron paramagnetic resonance (EPR) frequencies. But the resonance frequency changes with time as a result of radiation damage, so the microwave frequency must be adjusted accordingly. Manually adjusting the frequency can be difficult, and improper adjustments negatively impact the polarization. To overcome these difficulties, two controllers were developed that automate the process of seeking and maintaining the optimal frequency: one a standalone controller for a traditional DC motor, the other a LabVIEW VI for a stepper motor configuration. Further, a Monte Carlo simulation was developed that can accurately model the polarization over time as a function of microwave frequency. In this talk, analysis of the simulated data and recent improvements to the automated system will be presented. DOE.
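    As an illustration only (not the authors' controller), seeking the optimal microwave frequency can be sketched as a hill climb on the measured polarization; the response curve, peak location, and step sizes below are made up.

```python
def seek_frequency(polarization, f0, step=0.01, iters=60):
    """Hill-climbing frequency seek: nudge the microwave frequency in
    whichever direction increases the measured polarization, reversing
    and halving the step whenever a move overshoots the peak."""
    f = f0
    best = polarization(f)
    direction = 1.0
    for _ in range(iters):
        trial = f + direction * step
        p = polarization(trial)
        if p > best:
            f, best = trial, p   # accept the improving move
        else:
            direction = -direction
            step *= 0.5          # overshoot: reverse and refine
    return f

# hypothetical polarization response peaking at 140.2 (arbitrary GHz-like units)
peak = 140.2
resp = lambda f: 1.0 - (f - peak) ** 2
found = seek_frequency(resp, f0=140.0)
```

    A real controller would also have to average noisy polarization readings and track the peak as it drifts with radiation dose; the shrinking-step reversal is one simple way to settle near the optimum.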

  20. An investigation of sensory information, levels of automation, and piloting experience on unmanned aircraft pilot performance.

    DOT National Transportation Integrated Search

    2012-03-01

    "The current experiment was intended to examine the effect of sensory information on pilot reactions to system failures within a UAS control station simulation. This research also investigated the level of automation used in controlling the aircr...

  1. Assessing Orchestrated Simulation Through Modeling to Quantify the Benefits of Unmanned-Teaming in a Tactical ASW Scenario

    DTIC Science & Technology

    2018-03-01

    Results are compared to a previous study using a similar design of experiments but different simulation software. The baseline scenario for exploring the...behaviors are mimicked in this research, enabling Solem’s MANA results to be compared to our LITMUS’ results. By design, the principal difference...missions when using the second order NOLH, and compares favorably with the over six million in the full factorial design. 3. Advantages of Cluster

  2. The Impact of Automation Reliability and Operator Fatigue on Performance and Reliance

    DTIC Science & Technology

    2016-09-23

    Reliability of automation is a key factor in an operator’s reliance on automation. Previous work has shown that... Performance in a complex multiple-task environment during a laboratory-based simulation of occasional night work. Human Factors: The Journal of the

  3. Advances in the simulation and automated measurement of well-sorted granular material: 1. Simulation

    USGS Publications Warehouse

    Daniel Buscombe,; Rubin, David M.

    2012-01-01

    1. In this, the first of a pair of papers which address the simulation and automated measurement of well-sorted natural granular material, a method is presented for simulation of two-phase (solid, void) assemblages of discrete non-cohesive particles. The purpose is to have a flexible, yet computationally and theoretically simple, suite of tools with well constrained and well known statistical properties, in order to simulate realistic granular material as a discrete element model with realistic size and shape distributions, for a variety of purposes. The stochastic modeling framework is based on three-dimensional tessellations with variable degrees of order in particle-packing arrangement. Examples of sediments with a variety of particle size distributions and spatial variability in grain size are presented. The relationship between particle shape and porosity conforms to published data. The immediate application is testing new algorithms for automated measurements of particle properties (mean and standard deviation of particle sizes, and apparent porosity) from images of natural sediment, as detailed in the second of this pair of papers. The model could also prove useful for simulating specific depositional structures found in natural sediments, the result of physical alterations to packing and grain fabric, using discrete particle flow models. While the principal focus here is on naturally occurring sediment and sedimentary rock, the methods presented might also be useful for simulations of similar granular or cellular material encountered in engineering, industrial and life sciences.
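    A point-sampling estimate of apparent porosity, one of the particle properties mentioned above, can be sketched for a toy two-phase (solid, void) assemblage. The 2-D disk packing below is a made-up stand-in for the paper's 3-D tessellation-based model.

```python
import random

def apparent_porosity(disks, box=1.0, samples=20000, seed=2):
    """Monte Carlo estimate of apparent (2-D) porosity: the fraction of
    random probe points falling in the void phase, i.e. outside every
    disk. disks: list of (x, y, radius) in a square box of side `box`."""
    rng = random.Random(seed)
    void = 0
    for _ in range(samples):
        px, py = rng.uniform(0, box), rng.uniform(0, box)
        if all((px - x) ** 2 + (py - y) ** 2 > r * r for x, y, r in disks):
            void += 1
    return void / samples

# toy "sediment": four non-overlapping grains in a unit box
# (solid area = 4 * pi * 0.2**2 ~= 0.50, so porosity ~= 0.50)
disks = [(0.25, 0.25, 0.2), (0.75, 0.25, 0.2),
         (0.25, 0.75, 0.2), (0.75, 0.75, 0.2)]
porosity = apparent_porosity(disks)
```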

  4. Complacency and Automation Bias in the Use of Imperfect Automation.

    PubMed

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  5. Adaptive function allocation reduces performance costs of static automation

    NASA Technical Reports Server (NTRS)

    Parasuraman, Raja; Mouloua, Mustapha; Molloy, Robert; Hilburn, Brian

    1993-01-01

    Adaptive automation offers the option of flexible function allocation between the pilot and on-board computer systems. One of the important claims for the superiority of adaptive over static automation is that such systems do not suffer from some of the drawbacks associated with conventional function allocation. Several experiments designed to test this claim are reported in this article. The efficacy of adaptive function allocation was examined using a laboratory flight-simulation task involving multiple functions of tracking, fuel-management, and systems monitoring. The results show that monitoring inefficiency represents one of the performance costs of static automation. Adaptive function allocation can reduce the performance cost associated with long-term static automation.

  6. Space lab system analysis

    NASA Technical Reports Server (NTRS)

    Rives, T. B.; Ingels, F. M.

    1988-01-01

    An analysis of the Automated Booster Assembly Checkout System (ABACS) has been conducted. A computer simulation of the ETHERNET LAN has been written. The simulation allows one to investigate different structures of the ABACS system. The simulation code is in PASCAL and is VAX compatible.

  7. Deep convolutional neural networks as strong gravitational lens detectors

    NASA Astrophysics Data System (ADS)

    Schaefer, C.; Geiger, M.; Kuntzer, T.; Kneib, J.-P.

    2018-03-01

    Context: Future large-scale surveys with high-resolution imaging will provide us with approximately 10^5 new strong galaxy-scale lenses. These strong-lensing systems, however, will be contained in data volumes beyond the capacity of human experts to classify visually in an unbiased way. Aims: We present a new strong gravitational lens finder based on convolutional neural networks (CNNs). The method was applied to the strong-lensing challenge organized by the Bologna Lens Factory. It achieved first and third place, respectively, on the space-based data set and the ground-based data set. The goal was to find a fully automated lens finder for ground-based and space-based surveys that minimizes human inspection. Methods: We compared the results of our CNN architecture and three new variations ("invariant," "views," and "residual") on the simulated data of the challenge. Each method was trained separately five times on 17 000 simulated images, cross-validated using 3000 images, and then applied to a test set with 100 000 images. We used two different metrics for evaluation: the area under the receiver operating characteristic curve (AUC) score, and the recall with no false positive (Recall0FP). Results: For ground-based data, our best method achieved an AUC score of 0.977 and a Recall0FP of 0.50. For space-based data, our best method achieved an AUC score of 0.940 and a Recall0FP of 0.32. Adding dihedral invariance to the CNN architecture diminished the overall score on space-based data, but achieved a higher no-contamination recall. We found that using committees of five CNNs produced the best recall at zero contamination and consistently scored better AUC than a single CNN. Conclusions: For every variation of our CNN lens finder, we achieved AUC scores within 6% of 1, and a deeper network did not outperform simpler CNN models. This indicates that more complex networks are not needed to model the simulated lenses. To verify this, more realistic lens simulations with more lens-like structures (spiral galaxies or ring galaxies) are needed to compare the performance of deeper and shallower networks.
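    The two evaluation metrics used above can be computed directly from classifier scores and ground-truth labels; the toy data below is illustrative, not drawn from the challenge sets.

```python
def auc_score(scores, labels):
    """Rank-based ROC AUC (Mann-Whitney statistic): the probability that
    a randomly chosen positive outscores a randomly chosen negative,
    counting ties as half a win."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def recall_0fp(scores, labels):
    """Recall with no false positive: the fraction of positives scoring
    strictly above the highest-scoring negative."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    top_neg = max(s for s, y in zip(scores, labels) if y == 0)
    return sum(p > top_neg for p in pos) / len(pos)

# toy scores and ground-truth labels (1 = lens, 0 = non-lens)
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
labels = [1, 1, 0, 1, 0, 0]
```

    On this toy set the AUC is 8/9 (eight of nine positive-negative pairs are correctly ordered) and Recall0FP is 2/3 (two of three positives outscore every negative).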

  8. Examining single- and multiple-process theories of trust in automation.

    PubMed

    Rice, Stephen

    2009-07-01

    The author examined the effects of human responses to automation alerts and nonalerts. Previous research has shown that automation false alarms and misses have differential effects on human trust (i.e., automation false alarms tend to affect operator compliance, whereas automation misses tend to affect operator reliance). Participants performed a simulated combat task, whereby they examined aerial photographs for the presence of enemy targets. A diagnostic aid provided a recommendation during each trial. The author manipulated the reliability and response bias of the aid to provide appropriate data for state-trace analyses. The analyses provided strong evidence that only a multiple-process theory of operator trust can explain the effects of automation errors on human dependence behaviors. The author discusses the theoretical and practical implications of this finding.

  9. Automated subsystems control development. [for life support systems of space station

    NASA Technical Reports Server (NTRS)

    Block, R. F.; Heppner, D. B.; Samonski, F. H., Jr.; Lance, N., Jr.

    1985-01-01

    NASA has the objective to launch a Space Station in the 1990s. It has been found that the success of the Space Station engineering development, the achievement of initial operational capability (IOC), and the operation of a productive Space Station will depend heavily on the implementation of an effective automation and control approach. For the development of technology needed to implement the required automation and control function, a contract entitled 'Automated Subsystems Control for Life Support Systems' (ASCLSS) was awarded to two American companies. The present paper provides a description of the ASCLSS program. Attention is given to an automation and control architecture study, a generic automation and control approach for hardware demonstration, a standard software approach, application of Air Revitalization Group (ARG) process simulators, and a generic man-machine interface.

  10. The Interdependence of Computers, Robots, and People.

    ERIC Educational Resources Information Center

    Ludden, Laverne; And Others

    Computers and robots are becoming increasingly more advanced, with smaller and cheaper computers now doing jobs once reserved for huge multimillion dollar computers and with robots performing feats such as painting cars and using television cameras to simulate vision as they perform factory tasks. Technicians expect computers to become even more…

  11. Teaching Cockpit Automation in the Classroom

    NASA Technical Reports Server (NTRS)

    Casner, Stephen M.

    2003-01-01

    This study explores the idea of teaching fundamental cockpit automation concepts and skills to aspiring professional pilots in a classroom setting, without the use of sophisticated aircraft or equipment simulators. Pilot participants from a local professional pilot academy completed eighteen hours of classroom instruction that placed a strong emphasis on understanding the underlying principles of cockpit automation systems and their use in a multi-crew cockpit. The instructional materials consisted solely of a single textbook. Pilots received no hands-on instruction or practice during their training. At the conclusion of the classroom instruction, pilots completed a written examination testing their mastery of what had been taught during the classroom meetings. Following the written exam, each pilot was given a check flight in a full-mission Level D simulator of a Boeing 747-400 aircraft. Pilots were given the opportunity to fly one practice leg, and were then tested on all concepts and skills covered in the class during a second leg. The results of the written exam and simulator checks strongly suggest that instruction delivered in a traditional classroom setting can lead to high levels of preparation without the need for expensive airplane or equipment simulators.

  12. On the independence of compliance and reliance: are automation false alarms worse than misses?

    PubMed

    Dixon, Stephen R; Wickens, Christopher D; McCarley, Jason S

    2007-08-01

    Participants performed a tracking task and system monitoring task while aided by diagnostic automation. The goal of the study was to examine operator compliance and reliance as affected by automation failures and to clarify claims regarding independence of these two constructs. Background data revealed a trend toward nonindependence of the compliance-reliance constructs. Thirty-two undergraduate students performed the simulation that presented the visual display while dependent measures were collected. False alarm-prone automation hurt overall performance more than miss-prone automation. False alarm-prone automation also clearly affected both operator compliance and reliance, whereas miss-prone automation appeared to affect only operator reliance. Compliance and reliance do not appear to be entirely independent of each other. False alarms appear to be more damaging to overall performance than misses, and designers must take the compliance-reliance constructs into consideration.

  13. Automation in future air traffic management: effects of decision aid reliability on controller performance and mental workload.

    PubMed

    Metzger, Ulla; Parasuraman, Raja

    2005-01-01

    Future air traffic management concepts envisage shared decision-making responsibilities between controllers and pilots, necessitating that controllers be supported by automated decision aids. Even as automation tools are being introduced, however, their impact on the air traffic controller is not well understood. The present experiments examined the effects of an aircraft-to-aircraft conflict decision aid on performance and mental workload of experienced, full-performance level controllers in a simulated Free Flight environment. Performance was examined with both reliable (Experiment 1) and inaccurate automation (Experiment 2). The aid improved controller performance and reduced mental workload when it functioned reliably. However, detection of a particular conflict was better under manual conditions than under automated conditions when the automation was imperfect. Potential or actual applications of the results include the design of automation and procedures for future air traffic control systems.

  14. The EB factory project. II. Validation with the Kepler field in preparation for K2 and TESS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parvizi, Mahmoud; Paegert, Martin; Stassun, Keivan G., E-mail: mahmoud.parvizi@vanderbilt.edu

    Large repositories of high precision light curve data, such as the Kepler data set, provide the opportunity to identify astrophysically important eclipsing binary (EB) systems in large quantities. However, the rate of classical “by eye” human analysis restricts complete and efficient mining of EBs from these data using classical techniques. To prepare for mining EBs from the upcoming K2 mission as well as other current missions, we developed an automated end-to-end computational pipeline—the Eclipsing Binary Factory (EBF)—that automatically identifies EBs and classifies them into morphological types. The EBF has been previously tested on ground-based light curves. To assess the performance of the EBF in the context of space-based data, we apply the EBF to the full set of light curves in the Kepler “Q3” Data Release. We compare the EBs identified from this automated approach against the human generated Kepler EB Catalog of ∼2600 EBs. When we require EB classification with ≥90% confidence, we find that the EBF correctly identifies and classifies eclipsing contact (EC), eclipsing semi-detached (ESD), and eclipsing detached (ED) systems with a false positive rate of only 4%, 4%, and 8%, while complete to 64%, 46%, and 32%, respectively. When classification confidence is relaxed, the EBF identifies and classifies ECs, ESDs, and EDs with a slightly higher false positive rate of 6%, 16%, and 8%, while much more complete to 86%, 74%, and 62%, respectively. Through our processing of the entire Kepler “Q3” data set, we also identify 68 new candidate EBs that may have been missed by the human generated Kepler EB Catalog. We discuss the EBF's potential application to light curve classification for periodic variable stars more generally for current and upcoming surveys like K2 and the Transiting Exoplanet Survey Satellite.
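    Completeness and false-positive-rate figures like those quoted above can be reproduced from raw counts. The definitions and the counts below are illustrative assumptions, not taken from the paper.

```python
def classifier_metrics(n_catalog, n_recovered, n_spurious):
    """Completeness: fraction of catalog EBs the pipeline recovers.
    False positive rate (one common definition): fraction of the
    pipeline's detections that are spurious."""
    completeness = n_recovered / n_catalog
    false_positive_rate = n_spurious / (n_recovered + n_spurious)
    return completeness, false_positive_rate

# hypothetical counts for one morphological class
comp, fpr = classifier_metrics(n_catalog=100, n_recovered=64, n_spurious=3)
```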

  15. The EB Factory Project. II. Validation with the Kepler Field in Preparation for K2 and TESS

    NASA Astrophysics Data System (ADS)

    Parvizi, Mahmoud; Paegert, Martin; Stassun, Keivan G.

    2014-12-01

    Large repositories of high precision light curve data, such as the Kepler data set, provide the opportunity to identify astrophysically important eclipsing binary (EB) systems in large quantities. However, the rate of classical “by eye” human analysis restricts complete and efficient mining of EBs from these data using classical techniques. To prepare for mining EBs from the upcoming K2 mission as well as other current missions, we developed an automated end-to-end computational pipeline—the Eclipsing Binary Factory (EBF)—that automatically identifies EBs and classifies them into morphological types. The EBF has been previously tested on ground-based light curves. To assess the performance of the EBF in the context of space-based data, we apply the EBF to the full set of light curves in the Kepler “Q3” Data Release. We compare the EBs identified from this automated approach against the human generated Kepler EB Catalog of ∼2600 EBs. When we require EB classification with ≥90% confidence, we find that the EBF correctly identifies and classifies eclipsing contact (EC), eclipsing semi-detached (ESD), and eclipsing detached (ED) systems with a false positive rate of only 4%, 4%, and 8%, while complete to 64%, 46%, and 32%, respectively. When classification confidence is relaxed, the EBF identifies and classifies ECs, ESDs, and EDs with a slightly higher false positive rate of 6%, 16%, and 8%, while much more complete to 86%, 74%, and 62%, respectively. Through our processing of the entire Kepler “Q3” data set, we also identify 68 new candidate EBs that may have been missed by the human generated Kepler EB Catalog. We discuss the EBF's potential application to light curve classification for periodic variable stars more generally for current and upcoming surveys like K2 and the Transiting Exoplanet Survey Satellite.

  16. Automated sampling assessment for molecular simulations using the effective sample size

    PubMed Central

    Zhang, Xin; Bhatt, Divesh; Zuckerman, Daniel M.

    2010-01-01

    To quantify the progress in the development of algorithms and forcefields used in molecular simulations, a general method for the assessment of the sampling quality is needed. Statistical mechanics principles suggest the populations of physical states characterize equilibrium sampling in a fundamental way. We therefore develop an approach for analyzing the variances in state populations, which quantifies the degree of sampling in terms of the effective sample size (ESS). The ESS estimates the number of statistically independent configurations contained in a simulated ensemble. The method is applicable to both traditional dynamics simulations as well as more modern (e.g., multicanonical) approaches. Our procedure is tested in a variety of systems from toy models to atomistic protein simulations. We also introduce a simple automated procedure to obtain approximate physical states from dynamic trajectories: this allows sample-size estimation in systems for which physical states are not known in advance. PMID:21221418
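    The population-variance idea can be sketched with a simple block estimator: for statistically independent samples, the variance of a state's population across blocks is p(1-p)/n_eff, which can be inverted for the effective sample size. This is an illustrative stand-in, not the paper's exact estimator.

```python
import random

def effective_sample_size(trajectory, in_state, n_blocks=20):
    """Block estimator of the effective sample size (ESS): split the
    trajectory into equal blocks, measure the variance of one state's
    population across blocks, and invert var = p(1-p)/n_eff to get
    the per-block ESS; scale by the number of blocks."""
    n = len(trajectory) // n_blocks
    fracs = [sum(in_state(x) for x in trajectory[b * n:(b + 1) * n]) / n
             for b in range(n_blocks)]
    p = sum(fracs) / n_blocks
    var = sum((f - p) ** 2 for f in fracs) / (n_blocks - 1)
    return n_blocks * p * (1.0 - p) / var

random.seed(1)
traj = [random.random() for _ in range(20000)]   # fully independent samples
ess = effective_sample_size(traj, in_state=lambda x: x < 0.5)
```

    For independent samples the estimate should come out near the trajectory length (20000 here, up to the statistical noise of a 20-block variance estimate); for a correlated molecular dynamics trajectory it would be far smaller.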

  17. Local-feature analysis for automated coarse-graining of bulk-polymer molecular dynamics simulations.

    PubMed

    Xue, Y; Ludovice, P J; Grover, M A

    2012-12-01

    A method for automated coarse-graining of bulk polymers is presented, using the data-mining tool of local feature analysis. Most existing methods for polymer coarse-graining define superatoms based on their covalent bonding topology along the polymer backbone, but here superatoms are defined based only on their correlated motions, as observed in molecular dynamics simulations. Correlated atomic motions are identified in the simulation data using local feature analysis, between atoms in the same or in different polymer chains. Groups of highly correlated atoms constitute the superatoms in the coarse-graining scheme, and the positions of their seed coordinates are then projected forward in time. Based on only the seed positions, local feature analysis enables the full reconstruction of all atomic positions. This reconstruction suggests an iterative scheme that reduces the computational cost of the simulations: project the seed positions forward in time, use them to initialize another short molecular dynamics simulation, identify new superatoms, and project forward again.
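    The core idea, defining superatoms by correlated motion rather than bonding topology, can be sketched by grouping coordinates whose trajectories are highly correlated. The threshold, the toy trajectories, and the single-linkage (union-find) grouping below are illustrative assumptions, not the paper's local feature analysis.

```python
def pearson(a, b):
    """Pearson correlation between two equal-length trajectories."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def group_by_correlation(trajs, threshold=0.9):
    """Group coordinates whose motions are highly correlated into
    'superatoms': single-linkage clustering over |pearson| > threshold,
    implemented with union-find."""
    k = len(trajs)
    parent = list(range(k))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path compression
            i = parent[i]
        return i
    for i in range(k):
        for j in range(i + 1, k):
            if abs(pearson(trajs[i], trajs[j])) > threshold:
                parent[find(i)] = find(j)
    groups = {}
    for i in range(k):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())

# toy trajectories: coordinates 0 and 1 move together, coordinate 2 oscillates
t = list(range(10))
trajs = [
    [float(x) for x in t],        # coordinate 0
    [2.0 * x + 1.0 for x in t],   # coordinate 1: perfectly correlated with 0
    [(-1.0) ** x for x in t],     # coordinate 2: nearly uncorrelated
]
groups = group_by_correlation(trajs)
```

    Coordinates 0 and 1 form one superatom, coordinate 2 its own; one member of each group would then serve as the seed whose position is projected forward.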

  18. Pilot interaction with automated airborne decision making systems

    NASA Technical Reports Server (NTRS)

    Rouse, W. B.; Hammer, J. M.; Mitchell, C. M.; Morris, N. M.; Lewis, C. M.; Yoon, W. C.

    1985-01-01

    Progress was made in the three following areas. In the rule-based modeling area, two papers related to identification and significance testing of rule-based models were presented. In the area of operator aiding, research focused on aiding operators in novel failure situations; a discrete control modeling approach to aiding PLANT operators was developed; and a set of guidelines was developed for implementing automation. In the area of flight simulator hardware and software, the hardware will be completed within two months and initial simulation software will then be integrated and tested.

  19. Automated Simulation Updates based on Flight Data

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.; Ward, David G.

    2007-01-01

    A statistically-based method for using flight data to update aerodynamic data tables used in flight simulators is explained and demonstrated. A simplified wind-tunnel aerodynamic database for the F/A-18 aircraft is used as a starting point. Flight data from the NASA F-18 High Alpha Research Vehicle (HARV) is then used to update the data tables so that the resulting aerodynamic model characterizes the aerodynamics of the F-18 HARV. Prediction cases are used to show the effectiveness of the automated method, which requires no ad hoc adjustments by the analyst.
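    As a sketch of such a table update (the ridge-regularized least-squares formulation and all names here are assumptions, not the paper's exact statistical method), additive increments at the breakpoints of a 1-D lookup table can be estimated from flight-derived coefficients:

```python
import numpy as np

def update_table(breakpoints, table_vals, alpha_flight, coeff_flight, lam=1.0):
    """Least-squares update of a 1-D aerodynamic lookup table.

    Estimates additive increments at the table breakpoints so that the
    linearly interpolated table matches flight-derived coefficients,
    with ridge regularization (lam) keeping sparsely observed
    breakpoints near their wind-tunnel values.
    """
    n = len(breakpoints)
    # Interpolation basis: row k holds the weights mapping breakpoint
    # values to the interpolated value at alpha_flight[k].
    A = np.zeros((len(alpha_flight), n))
    for k, a in enumerate(alpha_flight):
        i = int(np.clip(np.searchsorted(breakpoints, a) - 1, 0, n - 2))
        w = (a - breakpoints[i]) / (breakpoints[i + 1] - breakpoints[i])
        A[k, i], A[k, i + 1] = 1 - w, w
    resid = coeff_flight - A @ table_vals          # flight minus table
    delta = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ resid)
    return table_vals + delta
```

    The same structure extends to multi-dimensional tables by replacing the 1-D interpolation basis with a multilinear one.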

  20. Improving the driver-automation interaction: an approach using automation uncertainty.

    PubMed

    Beller, Johannes; Heesen, Matthias; Vollrath, Mark

    2013-12-01

    The aim of this study was to evaluate whether communicating automation uncertainty improves the driver-automation interaction. A false system understanding of infallibility may provoke automation misuse and can lead to severe consequences in case of automation failure. The presentation of automation uncertainty may prevent this false system understanding and, as was shown by previous studies, may have numerous benefits. Few studies, however, have clearly shown the potential of communicating uncertainty information in driving. The current study fills this gap. We conducted a driving simulator experiment, varying the presented uncertainty information between participants (no uncertainty information vs. uncertainty information) and the automation reliability (high vs. low) within participants. Participants interacted with a highly automated driving system while engaging in secondary tasks and were required to cooperate with the automation to drive safely. Quantile regressions and multilevel modeling showed that the presentation of uncertainty information increases the time to collision in the case of automation failure. Furthermore, the data indicated improved situation awareness and better knowledge of fallibility for the experimental group. Consequently, the automation with the uncertainty symbol received higher trust ratings and increased acceptance. The presentation of automation uncertainty through a symbol improves overall driver-automation cooperation. Most automated systems in driving could benefit from displaying reliability information. This display might improve the acceptance of fallible systems and further enhance driver-automation cooperation.

  1. Displaying contextual information reduces the costs of imperfect decision automation in rapid retasking of ISR assets.

    PubMed

    Rovira, Ericka; Cross, Austin; Leitch, Evan; Bonaceto, Craig

    2014-09-01

    The impact of a decision support tool designed to embed contextual mission factors was investigated. Contextual information may enable operators to infer the appropriateness of the data underlying the automation's algorithm. Research has shown that the costs of imperfect automation are more detrimental than those of perfectly reliable automation when operators are provided with decision support tools. Operators may trust and rely on the automation more appropriately if they understand the automation's algorithm. The need to develop decision support tools that are understandable to the operator provides the rationale for the current experiment. A total of 17 participants performed a simulated rapid retasking of intelligence, surveillance, and reconnaissance (ISR) assets task under manual control, decision automation, or contextual decision automation, at two levels of task demand (low or high). Automation reliability was set at 80%, so participants experienced a mixture of reliable trials and automation-failure trials. Dependent variables included ISR coverage and the response time for replanning routes. Reliable automation significantly improved ISR coverage when compared with manual performance. Although performance suffered under imperfect automation, contextual decision automation helped to reduce some of the decrements in performance. Contextual information helps overcome the costs of imperfect decision automation. Designers may mitigate some of the performance decrements experienced with imperfect automation by providing operators with interfaces that display contextual information, that is, the state of the factors that affect the reliability of the automation's recommendation.

  2. Decision Making In A High-Tech World: Automation Bias and Countermeasures

    NASA Technical Reports Server (NTRS)

    Mosier, Kathleen L.; Skitka, Linda J.; Burdick, Mark R.; Heers, Susan T.; Rosekind, Mark R. (Technical Monitor)

    1996-01-01

    Automated decision aids and decision support systems have become essential tools in many high-tech environments. In aviation, for example, flight management system computers not only fly the aircraft, but also calculate fuel efficient paths, detect and diagnose system malfunctions and abnormalities, and recommend or carry out decisions. Air Traffic Controllers will soon be utilizing decision support tools to help them predict and detect potential conflicts and to generate clearances. Other fields as disparate as nuclear power plants and medical diagnostics are similarly becoming more and more automated. Ideally, the combination of human decision maker and automated decision aid should result in a high-performing team, maximizing the advantages of additional cognitive and observational power in the decision-making process. In reality, however, the presence of these aids often short-circuits the way that even very experienced decision makers have traditionally handled tasks and made decisions, and introduces opportunities for new decision heuristics and biases. Results of recent research investigating the use of automated aids have indicated the presence of automation bias, that is, errors made when decision makers rely on automated cues as a heuristic replacement for vigilant information seeking and processing. Automation commission errors, i.e., errors made when decision makers inappropriately follow an automated directive, or automation omission errors, i.e., errors made when humans fail to take action or notice a problem because an automated aid fails to inform them, can result from this tendency. Evidence of the tendency to make automation-related omission and commission errors has been found in pilot self reports, in studies using pilots in flight simulations, and in non-flight decision making contexts with student samples.
Considerable research has found that increasing social accountability can successfully ameliorate a broad array of cognitive biases and resultant errors. To what extent these effects generalize to performance situations is not yet empirically established. The two studies to be presented represent concurrent efforts, with student and professional pilot samples, to determine the effects of accountability pressures on automation bias and on the verification of the accurate functioning of automated aids. Students (Experiment 1) and commercial pilots (Experiment 2) performed simulated flight tasks using automated aids. In both studies, participants who perceived themselves as accountable for their strategies of interaction with the automation were significantly more likely to verify its correctness, and committed significantly fewer automation-related errors than those who did not report this perception.

  3. A Human-Automation Interface Model to Guide Automation of System Functions: A Way to Achieve Manning Goals in New Systems

    DTIC Science & Technology

    2006-06-01

    levels of automation applied as per Figure 13. ... models generated for this thesis were set to run for 60 minutes. To run the simulation for the set time, the analyst provides a random number seed to ... (1984). The IMPRINT workload value of 60 has been used by a consensus of workload modeling SMEs to represent the 'high' threshold, while the ...

  4. Automating Nuclear-Safety-Related SQA Procedures with Custom Applications

    DOE PAGES

    Freels, James D.

    2016-01-01

    Nuclear safety-related procedures are rigorous for good reason. Small design mistakes can quickly turn into unwanted failures. Researchers at Oak Ridge National Laboratory worked with COMSOL to define a simulation app that automates the software quality assurance (SQA) verification process and provides results in less than 24 hours.

  5. ASUPT Automated Objective Performance Measurement System.

    ERIC Educational Resources Information Center

    Waag, Wayne L.; And Others

    To realize its full research potential, a need exists for the development of an automated objective pilot performance evaluation system for use in the Advanced Simulation in Undergraduate Pilot Training (ASUPT) facility. The present report documents the approach taken for the development of performance measures and also presents data collected…

  6. Multiple Robots Localization Via Data Sharing

    DTIC Science & Technology

    2015-09-01

    multiple humans, each with specialized skills complementing each other, work to create the solution. Hence, there is a motivation to think in terms of ... pygame.Color(255,255,255); COLORBLACK = pygame.Color(0,0,0) ... F. AUTOMATE.PY: The automate.py file is a helper file to assist in running multiple simulation ...

  7. Predictive validity of driving-simulator assessments following traumatic brain injury: a preliminary study.

    PubMed

    Lew, Henry L; Poole, John H; Lee, Eun Ha; Jaffe, David L; Huang, Hsiu-Chen; Brodd, Edward

    2005-03-01

    To evaluate whether driving simulator and road test evaluations can predict long-term driving performance, we conducted a prospective study on 11 patients with moderate to severe traumatic brain injury. Sixteen healthy subjects were also tested to provide normative values on the simulator at baseline. At their initial evaluation (time-1), subjects' driving skills were measured during a 30-minute simulator trial using an automated 12-measure Simulator Performance Index (SPI), while a trained observer also rated their performance using a Driving Performance Inventory (DPI). In addition, patients were evaluated on the road by a certified driving evaluator. Ten months later (time-2), family members observed patients driving for at least 3 hours over 4 weeks and rated their driving performance using the DPI. At time-1, patients were significantly impaired on automated SPI measures of driving skill, including: speed and steering control, accidents, and vigilance to a divided-attention task. These simulator indices significantly predicted the following aspects of observed driving performance at time-2: handling of automobile controls, regulation of vehicle speed and direction, higher-order judgment and self-control, as well as a trend-level association with car accidents. Automated measures of simulator skill (SPI) were more sensitive and accurate than observational measures of simulator skill (DPI) in predicting actual driving performance. To our surprise, the road test results at time-1 showed no significant relation to driving performance at time-2. Simulator-based assessment of patients with brain injuries can provide ecologically valid measures that, in some cases, may be more sensitive than a traditional road test as predictors of long-term driving performance in the community.

  8. Development of a generic GMCC simulator.

    DOT National Transportation Integrated Search

    2001-11-01

    This document describes the development and current status of a high-fidelity, human-in-the-loop simulator for Airway Facilities Maintenance Control Centers and Operations Control Centers. Applications include Event Manager, Maintenance Automation ...

  9. Voxel-based morphometry and automated lobar volumetry: The trade-off between spatial scale and statistical correction

    PubMed Central

    Voormolen, Eduard H.J.; Wei, Corie; Chow, Eva W.C.; Bassett, Anne S.; Mikulis, David J.; Crawley, Adrian P.

    2011-01-01

    Voxel-based morphometry (VBM) and automated lobar region of interest (ROI) volumetry are comprehensive and fast methods to detect differences in overall brain anatomy on magnetic resonance images. However, VBM and automated lobar ROI volumetry have detected dissimilar gray matter differences within identical image sets in our own experience and in previous reports. To gain more insight into how diverging results arise and to attempt to establish whether one method is superior to the other, we investigated how differences in spatial scale and in the need to statistically correct for multiple spatial comparisons influence the relative sensitivity of either technique to group differences in gray matter (GM) volumes. We assessed the performance of both techniques on a small dataset containing simulated gray matter deficits and additionally on a dataset of 22q11-deletion syndrome patients with schizophrenia (22q11DS-SZ) vs. matched controls. VBM was more sensitive to simulated focal deficits compared to automated ROI volumetry, and could detect global cortical deficits equally well. Moreover, theoretical calculations of VBM and ROI detection sensitivities to focal deficits showed that at increasing ROI size, ROI volumetry suffers more from loss in sensitivity than VBM. Furthermore, VBM and automated ROI volumetry found corresponding GM deficits in 22q11DS-SZ patients, except in the parietal lobe. Here, automated lobar ROI volumetry found a significant deficit only after a smaller subregion of interest was employed. Thus, sensitivity to focal differences is impaired relatively more by averaging over larger volumes in automated ROI methods than by the correction for multiple comparisons in VBM. These findings indicate that VBM is to be preferred over automated lobar-scale ROI volumetry for assessing gray matter volume differences between groups. PMID:19619660
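    The trade-off can be illustrated with a toy independent-voxel model (an assumption for illustration, not the paper's derivation): averaging over an ROI dilutes a focal effect by v/V while reducing noise only by 1/sqrt(V), so the expected test statistic falls as the ROI grows, whereas voxelwise testing pays a multiple-comparison penalty that grows only slowly with the number of voxels.

```python
import math

def roi_z(d, v, V, sigma, n):
    """Expected two-sample z for a focal deficit averaged over an ROI.

    Toy independent-voxel model: d = per-voxel deficit, v = deficit
    extent in voxels, V = ROI size in voxels, sigma = per-voxel
    between-subject SD, n = subjects per group. The effect is diluted
    by v/V, noise shrinks by 1/sqrt(V), so z scales as v/sqrt(V).
    """
    return (d * v / V) / (sigma / math.sqrt(V)) * math.sqrt(n / 2)

def voxel_threshold(alpha, m):
    """Bonferroni-corrected one-sided z threshold for m comparisons."""
    target = alpha / m
    lo, hi = 0.0, 10.0
    for _ in range(80):                  # bisect the normal upper tail
        mid = (lo + hi) / 2
        tail = 0.5 * math.erfc(mid / math.sqrt(2))   # P(Z > mid)
        lo, hi = (mid, hi) if tail > target else (lo, mid)
    return lo
```

    In this model, quadrupling the ROI size halves the expected z for a fixed focal deficit, while moving from roughly 10 lobar ROIs to 10^5 voxels raises the Bonferroni threshold only from about 2.6 to about 4.9, consistent with the conclusion that averaging costs more than correction.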

  10. A comparison of adaptive and adaptable automation under different levels of environmental stress.

    PubMed

    Sauer, Juergen; Kao, Chung-Shan; Wastell, David

    2012-01-01

    The effectiveness of different forms of adaptive and adaptable automation was examined under low- and high-stress conditions, in the form of different levels of noise. Thirty-six participants were assigned to one of the three types of variable automation (adaptive event-based, adaptive performance-based and adaptable serving as a control condition). Participants received 3 h of training on a simulation of a highly automated process control task and were subsequently tested during a 4-h session under noise exposure and quiet conditions. The results for performance suggested no clear benefits of one automation control mode over the other two. However, it emerged that participants under adaptable automation adopted a more active system management strategy and reported higher levels of self-confidence than in the two adaptive control modes. Furthermore, the results showed higher levels of perceived workload, fatigue and anxiety for performance-based adaptive automation control than the other two modes. This study compared two forms of adaptive automation (where the automated system flexibly allocates tasks between human and machine) with adaptable automation (where the human allocates the tasks). The adaptable mode showed marginal advantages. This is of relevance, given that this automation mode may also be easier to design.

  11. Technology demonstration of space intravehicular automation and robotics

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Barker, L. Keith

    1994-01-01

    Automation and robotic technologies are being developed and capabilities demonstrated which would increase the productivity of microgravity science and materials processing in the space station laboratory module, especially when the crew is not present. The Automation Technology Branch at NASA Langley has been working in the area of intravehicular automation and robotics (IVAR) to provide a user-friendly development facility, to determine customer requirements for automated laboratory systems, and to improve the quality and efficiency of commercial production and scientific experimentation in space. This paper will describe the IVAR facility and present the results of a demonstration using a simulated protein crystal growth experiment inside a full-scale mockup of the space station laboratory module using a unique seven-degree-of-freedom robot.

  12. Assessing the applicability of the Taguchi design method to an interrill erosion study

    NASA Astrophysics Data System (ADS)

    Zhang, F. B.; Wang, Z. L.; Yang, M. Y.

    2015-02-01

    Full-factorial experimental designs have been used in soil erosion studies, but are time, cost and labor intensive, and sometimes they are impossible to conduct due to the increasing number of factors and their levels to consider. The Taguchi design is a simple, economical and efficient statistical tool that only uses a portion of the total possible factorial combinations to obtain the results of a study. Soil erosion studies that use the Taguchi design are scarce and no comparisons with full-factorial designs have been made. In this paper, a series of simulated rainfall experiments using a full-factorial design of five slope lengths (0.4, 0.8, 1.2, 1.6, and 2 m), five slope gradients (18%, 27%, 36%, 48%, and 58%), and five rainfall intensities (48, 62.4, 102, 149, and 170 mm h-1) were conducted. Validation of the applicability of a Taguchi design to interrill erosion experiments was achieved by extracting data from the full dataset according to a theoretical Taguchi design. The statistical parameters for the mean quasi-steady state erosion and runoff rates of each test, the optimum conditions for producing maximum erosion and runoff, and the main effect and percentage contribution of each factor obtained from the full-factorial and Taguchi designs were compared. Both designs generated almost identical results. Using the experimental data from the Taguchi design, it was possible to accurately predict the erosion and runoff rates under the conditions that had been excluded from the Taguchi design. All of the results obtained from analyzing the experimental data for both designs indicated that the Taguchi design could be applied to interrill erosion studies and could replace full-factorial designs. This would save time, labor and costs by generally reducing the number of tests to be conducted. Further work should test the applicability of the Taguchi design to a wider range of conditions.
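    The subset idea can be illustrated with an L25(5^3) orthogonal array built from the factor levels above. One standard construction is sketched below; the array actually used in the paper may assign levels differently:

```python
from itertools import product

# Factor levels from the rainfall experiments described above
lengths     = [0.4, 0.8, 1.2, 1.6, 2.0]      # slope length, m
gradients   = [18, 27, 36, 48, 58]           # slope gradient, %
intensities = [48, 62.4, 102, 149, 170]      # rainfall intensity, mm h-1

full_factorial = list(product(lengths, gradients, intensities))  # 125 runs

# L25(5^3) orthogonal array: rows (i, j, (i + j) mod 5) have strength 2,
# i.e. every pair of levels of any two factors co-occurs exactly once.
taguchi = [(lengths[i], gradients[j], intensities[(i + j) % 5])
           for i in range(5) for j in range(5)]                  # 25 runs
```

    Because every pair of factor levels is balanced, main effects and percentage contributions estimated from the 25 runs approximate those from all 125, which is why the two designs produced nearly identical results here.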

  13. Chemical Transformation Simulator

    EPA Science Inventory

    The Chemical Transformation Simulator (CTS) is a web-based, high-throughput screening tool that automates the calculation and collection of physicochemical properties for an organic chemical of interest and its predicted products resulting from transformations in environmental sy...

  14. Faster Aerodynamic Simulation With Cart3D

    NASA Technical Reports Server (NTRS)

    2003-01-01

    A NASA-developed aerodynamic simulation tool is ensuring the safety of future space operations while providing designers and engineers with an automated, highly accurate computer simulation suite. Cart3D, co-winner of NASA's 2002 Software of the Year award, is the result of over 10 years of research and software development conducted by Michael Aftosmis and Dr. John Melton of Ames Research Center and Professor Marsha Berger of the Courant Institute at New York University. Cart3D offers a revolutionary approach to computational fluid dynamics (CFD), the computer simulation of how fluids and gases flow around an object of a particular design. By fusing technological advancements in diverse fields such as mineralogy, computer graphics, computational geometry, and fluid dynamics, the software provides a new industrial geometry processing and fluid analysis capability with unsurpassed automation and efficiency.

  15. Situation Awareness and Levels of Automation

    NASA Technical Reports Server (NTRS)

    Kaber, David B.

    1999-01-01

    During the first year of this project, a taxonomy of theoretical levels of automation (LOAs) was applied to advanced commercial aircraft by categorizing actual modes of McDonnell Douglas MD-11 autoflight system operation in terms of the taxonomy. In addition, high LOAs included in the taxonomy (e.g., supervisory control) were modeled in the context of MD-11 autoflight systems through development of a virtual flight simulator. The flight simulator was an integration of a re-configurable simulator developed by the Georgia Institute of Technology and new software prototypes of autoflight system modules found in the MD-11 cockpit. In addition to this work, a version of the Situation Awareness Global Assessment Technique (SAGAT) was developed for application to commercial piloting tasks. A software package was developed to deliver the SAGAT and was integrated with the virtual flight simulator.

  16. Reaction control system/remote manipulator system automation

    NASA Technical Reports Server (NTRS)

    Hiers, Harry K.

    1990-01-01

    The objectives of this project are to evaluate the capability of the Procedural Reasoning System (PRS) in a typical real-time space shuttle application and to assess its potential for use on Space Station Freedom. PRS, developed by SRI International, is a result of research in automating the monitoring and control of spacecraft systems. The particular application selected for the present work is the automation of malfunction handling procedures for the Shuttle Remote Manipulator System (SRMS). The SRMS malfunction procedures will be encoded within the PRS framework, a crew interface appropriate to the RMS application will be developed, and the real-time data interface software will be developed. The resulting PRS will then be integrated with the high-fidelity On-orbit Simulation of the NASA Johnson Space Center's Systems Engineering Simulator, and tests under various SRMS fault scenarios will be conducted.

  17. CAT/RF Simulation Lessons Learned

    DTIC Science & Technology

    2003-06-11

    IVSS-2003-MAS-7, CAT/RF Simulation Lessons Learned. Christopher Mocnik, Vetronics Technology Area, RDECOM TARDEC; Tim Lee, DCS Corporation ... developed a re-configurable Unmanned Ground Vehicle (UGV) simulation for the Crew integration and Automation Test bed (CAT) and Robotics Follower (RF) ... Advanced Technology Demonstration (ATD) experiments. This simulation was developed as a component of the Embedded Simulation System (ESS) of the CAT

  18. A Modular and Extensible Architecture Integrating Sensors, Dynamic Displays of Anatomy and Physiology, and Automated Instruction for Innovations in Clinical Education

    ERIC Educational Resources Information Center

    Nelson, Douglas Allen, Jr.

    2017-01-01

    Adoption of simulation in healthcare education has increased tremendously over the past two decades. However, the resources necessary to perform simulation are immense. Simulators are large capital investments and require specialized training for both instructors and simulation support staff to develop curriculum using the simulator and to use the…

  19. AR-NE3A, a New Macromolecular Crystallography Beamline for Pharmaceutical Applications at the Photon Factory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamada, Yusuke; Hiraki, Masahiko; Sasajima, Kumiko

    2010-06-23

    Recent advances in high-throughput techniques for macromolecular crystallography have highlighted the importance of structure-based drug design (SBDD), and the demand for synchrotron use by pharmaceutical researchers has increased. Thus, in collaboration with Astellas Pharma Inc., we have constructed a new high-throughput macromolecular crystallography beamline, AR-NE3A, which is dedicated to SBDD. At AR-NE3A, a photon flux up to three times higher than those at the existing high-throughput beamlines at the Photon Factory, AR-NW12A and BL-5A, can be realized at the same sample positions. Installed in the experimental hutch are a high-precision diffractometer, a fast-readout, high-gain CCD detector, and a sample exchange robot capable of handling more than two hundred cryo-cooled samples stored in a Dewar. To facilitate the high-throughput data collection required for pharmaceutical research, fully automated data collection and processing systems have been developed. Thus, sample exchange, centering, data collection, and data processing are automatically carried out based on the user's pre-defined schedule. Although Astellas Pharma Inc. has priority access to AR-NE3A, the remaining beam time is allocated to general academic and other industrial users.

  20. The SuperNova Integral Field Spectrograph

    NASA Astrophysics Data System (ADS)

    Aldering, Gregory S.; Supernova Factory, Nearby

    2007-05-01

    The SuperNova Integral Field Spectrograph (SNIFS) is operated at the University of Hawaii 2.2 meter telescope on Mauna Kea by the Nearby Supernova Factory. The IFU has a 6x6 arcsecond field of view, and the combined blue and red channels simultaneously cover the full optical (320-1000 nm) spectral range. SNIFS was designed to allow spectrophotometry of supernovae under both photometric and non-photometric conditions. SNIFS is operated entirely remotely, in a quasi-automated mode, from as nearby as Hilo, Hawaii and as far away as Paris, France. Being mounted at the south bent Cassegrain focus of the UH 2.2-m, SNIFS is always available, either for regular Nearby Supernova Factory observations, or any of a range of programs conducted by astronomers at the University of Hawaii Institute for Astronomy. We illustrate some of the unique features of SNIFS and some of the science programs that have been undertaken using it. This work is supported in part by the Director, Office of Science, Office of High Energy and Nuclear Physics, of the U.S. Department of Energy under Contracts No. DE-FG0-92ER40704, by a grant from the Gordon & Betty Moore Foundation, and in France by CNRS/IN2P3, CNRS/INSU and PNC.

  1. Simulated sugar factory wastewater remediation kinetics using algal-bacterial raceway reactor promoted by polyacrylate polyalcohol.

    PubMed

    Memon, Abdul Rehman; Andresen, John; Habib, Muddasar; Jaffar, Muhammad

    2014-04-01

    The remediation kinetics of simulated sugar factory wastewater (SFW) using an algal-bacterial culture (ABC) of Chlorella vulgaris in association with Pseudomonas putida in a raceway reactor was found to be enhanced by 89% with the addition of 80 ppm of the copolymer polyacrylate polyalcohol (PAPA). This was achieved by efficient suspension of the ABC throughout the water body, maintaining optimum pH and dissolved oxygen, which led to rapid COD removal and improved algal biomass production. Suspension of the ABC using the co-polymer PAPA maintained a DO of 8-10 mg l(-1), compared to 2-3 mg l(-1) when not suspended. As a result, the non-suspended ABC achieved only a 50% reduction in COD after 96 h, compared to an 89% COD removal using the 80 ppm PAPA suspension. In addition, the algal biomass production rate increased from 0.4 g l(-1) d(-1) for the non-suspended ABC to 1.1 g l(-1) d(-1) when suspended using 80 ppm PAPA. Copyright © 2014 Elsevier Ltd. All rights reserved.
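    Assuming apparent first-order kinetics, which is an illustrative reading of the reported removal figures rather than the kinetic model fitted in the paper, the implied rate constants can be computed directly:

```python
import math

def first_order_k(removal_frac, hours):
    """Apparent first-order rate constant from fractional COD removal.

    Assumes C(t) = C0 * exp(-k t); an illustrative model, not the one
    fitted in the paper.
    """
    return -math.log(1.0 - removal_frac) / hours

k_suspended     = first_order_k(0.89, 96)   # 80 ppm PAPA: ~0.023 h^-1
k_non_suspended = first_order_k(0.50, 96)   # no suspension: ~0.0072 h^-1
```

    Under this reading, PAPA-aided suspension roughly triples the apparent COD removal rate constant.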

  2. Automation and Robotics for Space-Based Systems, 1991

    NASA Technical Reports Server (NTRS)

    Williams, Robert L., II (Editor)

    1992-01-01

    The purpose of this in-house workshop was to assess the state-of-the-art of automation and robotics for space operations from an LaRC perspective and to identify areas of opportunity for future research. Over half of the presentations came from the Automation Technology Branch, covering telerobotic control, extravehicular activity (EVA) and intra-vehicular activity (IVA) robotics, hand controllers for teleoperation, sensors, neural networks, and automated structural assembly, all applied to space missions. Other talks covered the Remote Manipulator System (RMS) active damping augmentation, space crane work, modeling, simulation, and control of large, flexible space manipulators, and virtual passive controller designs for space robots.

  3. A simulation evaluation of a pilot interface with an automatic terminal approach system

    NASA Technical Reports Server (NTRS)

    Hinton, David A.

    1987-01-01

    The pilot-machine interface with cockpit automation is a critical factor in achieving the benefits of automation and reducing pilot blunders. To improve this interface, an automatic terminal approach system (ATAS) was conceived that can automatically fly a published instrument approach by using stored instrument approach data to automatically tune airplane radios and control an airplane autopilot and autothrottle. The emphasis in the ATAS concept is a reduction in pilot blunders and workload by improving the pilot-automation interface. A research prototype of an ATAS was developed and installed in the Langley General Aviation Simulator. A piloted simulation study of the ATAS concept showed fewer pilot blunders, but no significant change in workload, when compared with a baseline heading-select autopilot mode. With the baseline autopilot, pilot blunders tended to involve loss of navigational situational awareness or instrument misinterpretation. With the ATAS, pilot blunders tended to involve a lack of awareness of the current ATAS mode state or deficiencies in the pilots' mental model of how the system operated. The ATAS display provided adequate approach status data to maintain situational awareness.

  4. The Flatworld Simulation Control Architecture (FSCA): A Framework for Scalable Immersive Visualization Systems

    DTIC Science & Technology

    2004-12-01

    handling using the X10 home automation protocol. Each 3D graphics client renders its scene according to an assigned virtual camera position. By having ... control protocol. DMX is a versatile and robust framework which overcomes limitations of the X10 home automation protocol which we are currently using

  5. Expertise Development With Different Types of Automation: A Function of Different Cognitive Abilities.

    PubMed

    Jipp, Meike

    2016-02-01

    I explored whether different cognitive abilities (information-processing ability, working-memory capacity) are needed for expertise development when different types of automation (information vs. decision automation) are employed. It is well documented that expertise development and the employment of automation lead to improved performance. Here, it is argued that a learner's ability to reason about an activity may be hindered by the employment of information automation. Additional feedback needs to be processed, thus increasing the load on working memory and decelerating expertise development. By contrast, the employment of decision automation may stimulate reasoning, increase the initial load on information-processing ability, and accelerate expertise development. Authors of past research have not investigated the interrelations between automation assistance, individual differences, and expertise development. Sixty-one naive learners controlled simulated air traffic with two types of automation: information automation and decision automation. Their performance was captured across 16 trials. Well-established tests were used to assess information-processing ability and working-memory capacity. As expected, learners' performance benefited from expertise development and decision automation. Furthermore, individual differences moderated the effect of the type of automation on expertise development: The employment of only information automation increased the load on working memory during later expertise development. The employment of decision automation initially increased the need to process information. These findings highlight the importance of considering individual differences and expertise development when investigating human-automation interaction. The results are relevant for selecting automation configurations for expertise development. © 2015, Human Factors and Ergonomics Society.

  6. Evaluation of High Density Air Traffic Operations with Automation for Separation Assurance, Weather Avoidance and Schedule Conformance

    NASA Technical Reports Server (NTRS)

    Prevot, Thomas; Mercer, Joey S.; Martin, Lynne Hazel; Homola, Jeffrey R.; Cabrall, Christopher D.; Brasil, Connie L.

    2011-01-01

    In this paper we discuss the development and evaluation of our prototype technologies and procedures for far-term air traffic control operations with automation for separation assurance, weather avoidance and schedule conformance. Controller-in-the-loop simulations in the Airspace Operations Laboratory at the NASA Ames Research Center in 2010 have shown very promising results. We found the operations to provide high airspace throughput, excellent efficiency and schedule conformance. The simulation also highlighted areas for improvements: Short-term conflict situations sometimes resulted in separation violations, particularly for transitioning aircraft in complex traffic flows. The combination of heavy metering and growing weather resulted in an increased number of aircraft penetrating convective weather cells. To address these shortcomings technologies and procedures have been improved and the operations are being re-evaluated with the same scenarios. In this paper we will first describe the concept and technologies for automating separation assurance, weather avoidance, and schedule conformance. Second, the results from the 2010 simulation will be reviewed. We report human-systems integration aspects, safety and efficiency results as well as airspace throughput, workload, and operational acceptability. Next, improvements will be discussed that were made to address identified shortcomings. We conclude that, with further refinements, air traffic control operations with ground-based automated separation assurance can routinely provide currently unachievable levels of traffic throughput in the en route airspace.

  7. An automated procedure for calculating system matrices from perturbation data generated by an EAI Pacer 100 hybrid computer system

    NASA Technical Reports Server (NTRS)

    Milner, E. J.; Krosel, S. M.

    1977-01-01

    Techniques are presented for determining the elements of the A, B, C, and D state variable matrices for systems simulated on an EAI Pacer 100 hybrid computer. An automated procedure systematically generates disturbance data necessary to linearize the simulation model and stores these data on a floppy disk. A separate digital program verifies this data, calculates the elements of the system matrices, and prints these matrices appropriately labeled. The partial derivatives forming the elements of the state variable matrices are approximated by finite difference calculations.
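
    The perturbation procedure described above can be sketched in a few lines: perturb each state and each input in turn, difference the resulting state derivatives and outputs, and assemble the partial derivatives into the A, B, C, and D matrices. This is an illustrative sketch, not the original hybrid-computer code; the function names and the choice of central differences are assumptions.

```python
import numpy as np

def linearize(f, g, x0, u0, eps=1e-6):
    """Approximate the state-variable matrices A, B, C, D about the
    operating point (x0, u0) by central finite differences, where
    f(x, u) returns the state derivatives and g(x, u) the outputs."""
    n, m, p = len(x0), len(u0), len(g(x0, u0))
    A, B = np.zeros((n, n)), np.zeros((n, m))
    C, D = np.zeros((p, n)), np.zeros((p, m))
    for j in range(n):                      # perturb each state in turn
        dx = np.zeros(n); dx[j] = eps
        A[:, j] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
        C[:, j] = (g(x0 + dx, u0) - g(x0 - dx, u0)) / (2 * eps)
    for j in range(m):                      # perturb each input in turn
        du = np.zeros(m); du[j] = eps
        B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
        D[:, j] = (g(x0, u0 + du) - g(x0, u0 - du)) / (2 * eps)
    return A, B, C, D
```

    On a simulation that is already linear, the procedure recovers the true matrices to within rounding error, which makes a convenient self-check before applying it to a nonlinear model.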

  8. Influencing Trust for Human-Automation Collaborative Scheduling of Multiple Unmanned Vehicles.

    PubMed

    Clare, Andrew S; Cummings, Mary L; Repenning, Nelson P

    2015-11-01

    We examined the impact of priming on operator trust and system performance when supervising a decentralized network of heterogeneous unmanned vehicles (UVs). Advances in autonomy have enabled a future vision of single-operator control of multiple heterogeneous UVs. Real-time scheduling for multiple UVs in uncertain environments requires the computational ability of optimization algorithms combined with the judgment and adaptability of human supervisors. Because of system and environmental uncertainty, appropriate operator trust will be instrumental to maintain high system performance and prevent cognitive overload. Three groups of operators experienced different levels of trust priming prior to conducting simulated missions in an existing, multiple-UV simulation environment. Participants who play computer and video games frequently were found to have a higher propensity to overtrust automation. By priming gamers to lower their initial trust to a more appropriate level, system performance was improved by 10% as compared to gamers who were primed to have higher trust in the automation. Priming was successful at adjusting the operator's initial and dynamic trust in the automated scheduling algorithm, which had a substantial impact on system performance. These results have important implications for personnel selection and training for futuristic multi-UV systems under human supervision. Although gamers may bring valuable skills, they may also be potentially prone to automation bias. Priming during training and regular priming throughout missions may be one potential method for overcoming this propensity to overtrust automation. © 2015, Human Factors and Ergonomics Society.

  9. Determination of tailored filter sets to create rayfiles including spatial and angular resolved spectral information.

    PubMed

    Rotscholl, Ingo; Trampert, Klaus; Krüger, Udo; Perner, Martin; Schmidt, Franz; Neumann, Cornelius

    2015-11-16

    To simulate and optimize optical designs regarding perceived color and homogeneity in commercial ray tracing software, realistic light source models are needed. Spectral rayfiles provide angularly and spatially varying spectral information. We propose a spectral reconstruction method that requires a minimum of time-consuming goniophotometric near-field measurements with optical filters for the purpose of creating spectral rayfiles. Our discussion focuses on the selection of the ideal optical filter combination for any arbitrary spectrum out of a given filter set, considering measurement uncertainties with Monte Carlo simulations. We minimize the simulation time by a preselection of all filter combinations, which is based on a factorial design.
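
    The preselection idea can be illustrated with a toy sketch: enumerate the candidate filter subsets and rank them by the condition number of the resulting measurement matrix, since a well-conditioned subset amplifies measurement noise least during spectral reconstruction. The function name and the condition-number criterion are invented stand-ins; the authors' actual preselection rests on a factorial design with Monte Carlo uncertainty analysis.

```python
import itertools
import numpy as np

def preselect_filter_sets(T, k, keep=3):
    """Rank all k-filter subsets of a transmission matrix T
    (rows = filters, columns = wavelength samples) by the condition
    number of the resulting measurement matrix, and return the `keep`
    best (condition number, row indices) pairs. Only these would then
    be passed to a full Monte Carlo uncertainty simulation."""
    scored = []
    for combo in itertools.combinations(range(T.shape[0]), k):
        cond = np.linalg.cond(T[list(combo), :])   # SVD-based, works for rectangular matrices
        scored.append((cond, combo))
    scored.sort(key=lambda pair: pair[0])
    return scored[:keep]
```

    For example, if two filters have nearly identical transmission curves, any subset containing both scores a large condition number and is discarded before the expensive simulation stage.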

  10. Transforming BIM to BEM: Generation of Building Geometry for the NASA Ames Sustainability Base BIM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Donnell, James T.; Maile, Tobias; Rose, Cody

    Typical processes of whole Building Energy simulation Model (BEM) generation are subjective, labor intensive, time intensive and error prone. Essentially, these typical processes reproduce already existing data, i.e. building models already created by the architect. Accordingly, Lawrence Berkeley National Laboratory (LBNL) developed a semi-automated process that enables reproducible conversions of Building Information Model (BIM) representations of building geometry into a format required by building energy modeling (BEM) tools. This is a generic process that may be applied to all building energy modeling tools but to date has only been used for EnergyPlus. This report describes and demonstrates each stage in the semi-automated process for building geometry, using the recently constructed NASA Ames Sustainability Base throughout. This example uses ArchiCAD (Graphisoft, 2012) as the originating CAD tool and EnergyPlus as the concluding whole building energy simulation tool. It is important to note that the process is also applicable for professionals that use other CAD tools such as Revit ("Revit Architecture," 2012) and DProfiler (Beck Technology, 2012) and can be extended to provide geometry definitions for BEM tools other than EnergyPlus. The Geometry Simplification Tool (GST) was used during the NASA Ames project and was the enabling software that facilitated semi-automated data transformations. GST has since been superseded by the Space Boundary Tool (SBT-1) and will be referred to as SBT-1 throughout this report.
    The benefits of this semi-automated process are fourfold: 1) reduce the amount of time and cost required to develop a whole building energy simulation model, 2) enable rapid generation of design alternatives, 3) improve the accuracy of BEMs, and 4) result in significantly better performing buildings with significantly lower energy consumption than those created using the traditional design process, especially if the simulation model is used as a predictive benchmark during operation. Developing BIM-based criteria to support the semi-automated process should result in significant, reliable improvements and time savings in the development of BEMs. In order to define successful BIMs, CAD export of IFC-based BIMs for BEM must adhere to a standard Model View Definition (MVD) for simulation as provided by the concept design BIM MVD (buildingSMART, 2011). In order to ensure wide-scale adoption, companies would also need to develop their own material libraries to support automated activities and undertake a pilot project to improve understanding of modeling conventions and design tool features and limitations.

  11. Managing complexity in simulations of land surface and near-surface processes

    DOE PAGES

    Coon, Ethan T.; Moulton, J. David; Painter, Scott L.

    2016-01-12

    Increasing computing power and the growing role of simulation in Earth systems science have led to an increase in the number and complexity of processes in modern simulators. We present a multiphysics framework that specifies interfaces for coupled processes and automates weak and strong coupling strategies to manage this complexity. Process management is enabled by viewing the system of equations as a tree, where individual equations are associated with leaf nodes and coupling strategies with internal nodes. A dynamically generated dependency graph connects a variable to its dependencies, streamlining and automating model evaluation, easing model development, and ensuring models are modular and flexible. Additionally, the dependency graph is used to ensure that data requirements are consistent between all processes in a given simulation. Here we discuss the design and implementation of these concepts within the Arcos framework, and demonstrate their use for verification testing and hypothesis evaluation in numerical experiments.
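
    The dependency-graph idea can be sketched in a few lines of Python: each variable registers the variables it depends on and an evaluation function, and requesting a variable recursively evaluates its dependencies exactly once. This is a minimal illustration of the concept, not the Arcos API.

```python
class DependencyGraph:
    """Toy dependency-graph evaluator: nodes are named variables,
    each with a list of dependency names and a function that computes
    the variable from its dependencies' values. Evaluation is demand-
    driven and memoized, so each variable is computed at most once."""
    def __init__(self):
        self.nodes = {}    # name -> (dependency names, evaluation function)
        self.cache = {}    # name -> computed value
    def add(self, name, deps, func):
        self.nodes[name] = (deps, func)
    def evaluate(self, name):
        if name in self.cache:
            return self.cache[name]
        deps, func = self.nodes[name]
        args = [self.evaluate(d) for d in deps]   # recurse into dependencies
        value = func(*args)
        self.cache[name] = value
        return value
```

    Because evaluation is driven by the declared dependencies rather than by a hand-written call sequence, adding or swapping a process model only changes the node registrations, which is the modularity benefit the abstract describes.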

  12. Hardware fault insertion and instrumentation system: Mechanization and validation

    NASA Technical Reports Server (NTRS)

    Benson, J. W.

    1987-01-01

    Automated test capability for extensive low-level hardware fault insertion testing is developed. The test capability is used to calibrate fault detection coverage and associated latency times as relevant to projecting overall system reliability. Described are modifications made to the NASA Ames Reconfigurable Flight Control System (RDFCS) Facility to fully automate the total test loop involving the Draper Laboratories' Fault Injector Unit. The automated capability provided included the application of sequences of simulated low-level hardware faults, the precise measurement of fault latency times, the identification of fault symptoms, and bulk storage of test case results. A PDP-11/60 served as a test coordinator, and a PDP-11/04 as an instrumentation device. The fault injector was controlled by applications test software in the PDP-11/60, rather than by manual commands from a terminal keyboard. The time base was especially developed for this application to use a variety of signal sources in the system simulator.
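
    The core of such a test loop, injecting a fault and then polling the detection flag while recording the elapsed time, can be sketched as follows. The simulator interface (inject, step, time, fault_detected) is hypothetical, standing in for the PDP-11-based harness described above.

```python
def measure_fault_latency(sim, timeout=1.0, dt=0.001):
    """Inject a simulated low-level fault, then advance the simulation
    and poll the fault-detection flag until it trips or a timeout
    expires. Returns the measured fault latency in seconds, or None if
    the fault was never detected (a fault-coverage miss)."""
    sim.inject()
    t0 = sim.time
    while sim.time - t0 < timeout:
        sim.step(dt)
        if sim.fault_detected:
            return sim.time - t0
    return None
```

    Run over a large batch of fault cases, the collected latencies and the fraction of None results give exactly the coverage and latency statistics the abstract says are needed for projecting system reliability.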

  13. Highly Automated Arrival Management and Control System Suitable for Early NextGen

    NASA Technical Reports Server (NTRS)

    Swenson, Harry N.; Jung, Jaewoo

    2013-01-01

    This is a presentation of previously published work conducted in the development of the Terminal Area Precision Scheduling and Spacing (TAPSS) system. Included are concept and technical descriptions of the TAPSS system and results from human-in-the-loop simulations conducted at Ames Research Center. The TAPSS system has been demonstrated, through research and extensive high-fidelity simulation studies, to provide benefits in airport arrival throughput, supporting efficient arrival descents, and enabling mixed aircraft navigation capability operations during periods of high congestion. NASA is currently porting the TAPSS system into the FAA TBFM and STARS system prototypes to ensure its ability to operate in the FAA automation infrastructure. The NASA ATM Demonstration Project is using the TAPSS technologies to provide the ground-based automation tools to enable airborne Interval Management (IM) capabilities. NASA and the FAA have initiated a Research Transition Team to enable potential TAPSS and IM technology transfer.

  14. Development of computer-aided design system of elastic sensitive elements of automatic metering devices

    NASA Astrophysics Data System (ADS)

    Kalinkina, M. E.; Kozlov, A. S.; Labkovskaia, R. I.; Pirozhnikova, O. I.; Tkalich, V. L.; Shmakov, N. A.

    2018-05-01

    The object of research is the element base of control and automation system devices, including annular elastic sensitive elements, methods of their modeling, calculation algorithms, and software complexes for automating their design processes. The article is devoted to the development of a computer-aided design system for the elastic sensitive elements used in weight- and force-measuring automation devices. Based on mathematical modeling of deformation processes in a solid, as well as the results of static and dynamic analysis, the calculation of elastic elements is presented using the capabilities of modern software systems based on numerical simulation. In the simulation, the model was discretized into a hexagonal grid of finite elements with a maximum element size not exceeding 2.5 mm. The results of the modal and dynamic analyses are presented in this article.

  15. Model-centric distribution automation: Capacity, reliability, and efficiency

    DOE PAGES

    Onen, Ahmet; Jung, Jaesung; Dilek, Murat; ...

    2016-02-26

    A series of analyses along with field validations that evaluate efficiency, reliability, and capacity improvements of model-centric distribution automation are presented. With model-centric distribution automation, the same model is used from design to real-time control calculations. A 14-feeder system with 7 substations is considered. The analyses involve hourly time-varying loads and annual load growth factors. Phase balancing and capacitor redesign modifications are used to better prepare the system for distribution automation, where the designs are performed considering time-varying loads. Coordinated control of load tap changing transformers, line regulators, and switched capacitor banks is considered. In evaluating distribution automation versus traditional system design and operation, quasi-steady-state power flow analysis is used. In evaluating distribution automation performance for substation transformer failures, reconfiguration for restoration analysis is performed. In evaluating distribution automation for storm conditions, Monte Carlo simulations coupled with reconfiguration for restoration calculations are used. As a result, the evaluations demonstrate that model-centric distribution automation has positive effects on system efficiency, capacity, and reliability.
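
    As a toy illustration of the storm-condition evaluation, a Monte Carlo loop can estimate how many feeders remain out of service with and without automated restoration. All parameters and the restoration model below are invented for illustration; they are not taken from the study, which couples its Monte Carlo sampling to full reconfiguration-for-restoration calculations.

```python
import random

def storm_outage_mc(n_feeders, p_fail, restore_frac, n_trials=10000, seed=1):
    """Estimate the expected number of feeders left out of service after
    a storm: each feeder fails independently with probability p_fail,
    and automated reconfiguration is assumed to restore a fraction
    restore_frac of the failed feeders."""
    rng = random.Random(seed)
    total_out = 0.0
    for _ in range(n_trials):
        failed = sum(1 for _ in range(n_feeders) if rng.random() < p_fail)
        total_out += failed * (1.0 - restore_frac)
    return total_out / n_trials
```

    Comparing restore_frac = 0 (no automation) against a positive value mimics, in miniature, the before/after reliability comparison the abstract describes.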

  17. ProtSqueeze: simple and effective automated tool for setting up membrane protein simulations.

    PubMed

    Yesylevskyy, Semen O

    2007-01-01

    The major challenge in setting up membrane protein simulations is embedding the protein into the pre-equilibrated lipid bilayer. Several techniques were proposed to achieve optimal packing of the lipid molecules around the protein. However, all of them possess serious disadvantages, which limit their applicability and discourage the users of simulation packages from using them. In the present work, we analyzed existing approaches and proposed a new procedure of protein insertion into the lipid bilayer, which is implemented in the ProtSqueeze software. The advantages of ProtSqueeze are as follows: (1) the insertion algorithm is simple, understandable, and controllable; (2) the software can work with virtually any simulation package on virtually any platform; (3) no modification of the source code of the simulation package is needed; (4) the procedure of insertion is as automated as possible; (5) ProtSqueeze is distributed for free under a general public license. In this work, we present the architecture and the algorithm of ProtSqueeze and demonstrate its usage in case studies.

  18. The Challenge of Grounding Planning in Simulation with an Interactive Model Development Environment

    NASA Technical Reports Server (NTRS)

    Clement, Bradley J.; Frank, Jeremy D.; Chachere, John M.; Smith, Tristan B.; Swanson, Keith J.

    2011-01-01

    A principal obstacle to fielding automated planning systems is the difficulty of modeling. Physical systems are conventionally modeled based on specification documents and the modeler's understanding of the system. Thus, the model is developed in a way that is disconnected from the system's actual behavior and is vulnerable to manual error. Another obstacle to fielding planners is testing and validation. For a space mission, generated plans must often be validated by translating them into command sequences that are run in a simulation testbed. Testing in this way is complex and onerous because of the large number of possible plans and states of the spacecraft. If used as a source of domain knowledge, however, the simulator can ease validation. This paper poses a challenge: to ground planning models in the system physics represented by simulation. A proposed interactive model development environment illustrates the integration of planning and simulation to meet the challenge. This integration reveals research paths for automated model construction and validation.

  19. Automating the Fireshed Assessment Process with ArcGIS

    Treesearch

    Alan Ager; Klaus Barber

    2006-01-01

    A library of macros was developed to automate the Fireshed process within ArcGIS. The macros link a number of vegetation simulation and wildfire behavior models (FVS, SVS, FARSITE, and FlamMap) with ESRI geodatabases, desktop software (Access, Excel), and ArcGIS. The macros provide for (1) an interactive linkage between digital imagery, vegetation data, FVS-FFE, and...

  20. Design of a solar array simulator for the NASA EOS testbed

    NASA Technical Reports Server (NTRS)

    Butler, Steve J.; Sable, Dan M.; Lee, Fred C.; Cho, Bo H.

    1992-01-01

    The present spacecraft solar array simulator addresses both dc and ac characteristics as well as changes in illumination and temperature and performance degradation over the course of array service life. The computerized control system used allows simulation of a complete orbit cycle, in addition to automated diagnostics. The simulator is currently interfaced with the NASA EOS testbed.

  1. Light Attenuation in a 14-year-old Loblolly Pine Stand as Influenced by Fertilization and Irrigation

    Treesearch

    D.A. Sampson; H. Lee Allen

    1998-01-01

    We examined empirical and simulated estimates of canopy light attenuation at SETRES (Southeast Tree Research and Education Site), a 2x2 factorial study of water and nutrients. Fertilized plots had significantly lower under-canopy PAR transmittance (Tc) when compared to non-fertilized plots. Light interception efficiency as measured by the...

  2. A Study of Interior Wiring, Color Coding, and Switching Principles by Simulation and Practice.

    ERIC Educational Resources Information Center

    McCormick, B. G.; McCormick, Robert S.

    After a preliminary introduction and a chapter on wiring and electricity safety procedures, this study text proceeds to offer a general coverage of single and polyphase alternating current electrical systems used to power factories, farms, small businesses, and homes. Electrical power, from its generation to its application, is discussed, with the…

  3. The Future of Architecture Collaborative Information Sharing: DoDAF Version 2.03 Updates

    DTIC Science & Technology

    2012-04-30

    [Table fragment from the original report listing architecture tools, their vendors, and supported notations, including: Salamander (Select Solution Factory, Select Business Solutions; BPMN, UML), SimonTool (Simon Labs), SimProcess (CACI; BPMN), System Architecture Management tooling for DoDAF, Mega (UML), Metastorm ProVision (Metastorm; BPMN), Naval Simulation System - 4 Aces (METRON), NetViz (CA), and OPNET (OPNET). Column headers: Tool Name, Vendor, Primary.]

  4. Cervical screening programmes: can automation help? Evidence from systematic reviews, an economic analysis and a simulation modelling exercise applied to the UK.

    PubMed

    Willis, B H; Barton, P; Pearmain, P; Bryan, S; Hyde, C

    2005-03-01

    To assess the effectiveness and cost-effectiveness of adding automated image analysis to cervical screening programmes. Searching of all major electronic databases to the end of 2000 was supplemented by a detailed survey for unpublished UK literature. Four systematic reviews were conducted according to recognised guidance. The review of 'clinical effectiveness' included studies assessing reproducibility and impact on health outcomes and processes in addition to evaluations of test accuracy. A discrete event simulation model was developed, although the economic evaluation ultimately relied on a cost-minimisation analysis. The predominant finding from the systematic reviews was the very limited amount of rigorous primary research. None of the included studies refers to the only commercially available automated image analysis device in 2002, the AutoPap Guided Screening (GS) System. The results of the included studies were debatably most compatible with automated image analysis being equivalent in test performance to manual screening. Concerning process, there was evidence that automation does lead to reductions in average slide processing times. In the PRISMATIC trial this was reduced from 10.4 to 3.9 minutes, a statistically significant and practically important difference. The economic evaluation tentatively suggested that the AutoPap GS System may be efficient. The key proviso is that credible data become available to support that the AutoPap GS System has test performance and processing times equivalent to those obtained for PAPNET. The available evidence is still insufficient to recommend implementation of automated image analysis systems. The priority for action remains further research, particularly the 'clinical effectiveness' of the AutoPap GS System. Assessing the cost-effectiveness of introducing automation alongside other approaches is also a priority.

  5. Automated finite element meshing of the lumbar spine: Verification and validation with 18 specimen-specific models.

    PubMed

    Campbell, J Q; Coombs, D J; Rao, M; Rullkoetter, P J; Petrella, A J

    2016-09-06

    The purpose of this study was to seek broad verification and validation of human lumbar spine finite element models created using a previously published automated algorithm. The automated algorithm takes segmented CT scans of lumbar vertebrae, automatically identifies important landmarks and contact surfaces, and creates a finite element model. Mesh convergence was evaluated by examining changes in key output variables in response to mesh density. Semi-direct validation was performed by comparing experimental results for a single specimen to the automated finite element model results for that specimen with calibrated material properties from a prior study. Indirect validation was based on a comparison of results from automated finite element models of 18 individual specimens, all using one set of generalized material properties, to a range of data from the literature. A total of 216 simulations were run and compared to 186 experimental data ranges in all six primary bending modes up to 7.8 N·m with follower loads up to 1000 N. Mesh convergence results showed less than a 5% difference in key variables when the original mesh density was doubled. The semi-direct validation results showed that the automated method produced results comparable to manual finite element modeling methods. The indirect validation results showed a wide range of outcomes due to variations in the geometry alone. The studies showed that the automated models can be used to reliably evaluate lumbar spine biomechanics, specifically within our intended context of use: in pure bending modes, under relatively low non-injurious simulated in vivo loads, to predict torque rotation response, disc pressures, and facet forces. Copyright © 2016 Elsevier Ltd. All rights reserved.
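
    The convergence criterion reported above (less than 5% change in key output variables when mesh density is doubled) reduces to a simple check, sketched here with invented names:

```python
def mesh_converged(coarse, fine, tol=0.05):
    """Accept the coarse mesh if every key output variable changes by
    less than `tol` (relative to the fine-mesh value) when the mesh
    density is doubled. `coarse` and `fine` are parallel sequences of
    the same output variables from the two meshes."""
    return all(abs(f - c) / abs(f) < tol for c, f in zip(coarse, fine))
```

    Such a check would be run per specimen over outputs like torque-rotation response, disc pressure, and facet force before trusting the coarser, cheaper mesh.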

  6. Synthesis of ethological studies on behavioural adaptation of the astronaut to space flight conditions

    NASA Astrophysics Data System (ADS)

    Tafforin, Carole

    The motor behaviour of the astronaut as revealed in his movement, posture and orientation is treated as observable evidence of the subject's adaptation to space flight conditions. In addition to the conservative physiological homeostasies, the quantitative description of the astronaut's motor activity in microgravity is postulated in terms of an innovative regulation, within a temporal dynamic. The proposed ethological method consists of first drawing up a specific behavioural repertoire and then of using video recordings of space missions to describe each of the behavioural units observed in the ongoing flux context in which it occurred. Finally, the data are quantified into frequencies of occurrence, transition and association, and completed with factorial correlation analysis. Comparison of ground training (g = 1) and space flight (g = 0), from the first and last day of a mission up to return to Earth gravity simulated by an anti-orthostatic decubitus experiment, reveals the nature of the adaptive strategies implemented. These strategies are evidence of changes in the behavioural repertoire, including the search for predominantly visual environmental cues and the progression of motor skill during the flight. The pre-flight period is defined as a phase involving automation of motor patterns and the post-flight period as rehabituation of strategies which have already been acquired. The phenomena observed are discussed in terms of the new spatial representation and the body image constructed by the astronaut during his adaptation. They are considered to be optimizing for the subject's relation to his environment.

  7. Explicit control of adaptive automation under different levels of environmental stress.

    PubMed

    Sauer, Jürgen; Kao, Chung-Shan; Wastell, David; Nickel, Peter

    2011-08-01

    This article examines the effectiveness of three different forms of explicit control of adaptive automation under low- and high-stress conditions, operationalised by different levels of noise. In total, 60 participants were assigned to one of three types of automation design (free, prompted and forced choice). They were trained for 4 h on a highly automated simulation of a process control environment, called AutoCAMS. This was followed by a 4-h testing session under noise exposure and quiet conditions. Measures of performance, psychophysiology and subjective reactions were taken. The results showed that all three modes of explicit control of adaptive automation modes were able to attenuate the negative effects of noise. This was partly due to the fact that operators opted for higher levels of automation under noise. It also emerged that forced choice showed marginal advantages over the two other automation modes. Statement of Relevance: This work is relevant to the design of adaptive automation since it emphasises the need to consider the impact of work-related stressors during task completion. During the presence of stressors, different forms of operator support through automation may be required than under more favourable working conditions.

  8. Experience of automation failures in training: effects on trust, automation bias, complacency and performance.

    PubMed

    Sauer, Juergen; Chavaillaz, Alain; Wastell, David

    2016-06-01

    This work examined the effects of operators' exposure to various types of automation failures in training. Forty-five participants were trained for 3.5 h on a simulated process control environment. During training, participants either experienced a fully reliable, automatic fault repair facility (i.e. faults detected and correctly diagnosed), a misdiagnosis-prone one (i.e. faults detected but not correctly diagnosed) or a miss-prone one (i.e. faults not detected). One week after training, participants were tested for 3 h, experiencing two types of automation failures (misdiagnosis, miss). The results showed that automation bias was very high when operators trained on miss-prone automation encountered a failure of the diagnostic system. Operator errors resulting from automation bias were much higher when automation misdiagnosed a fault than when it missed one. Differences in trust levels that were instilled by the different training experiences disappeared during the testing session. Practitioner Summary: The experience of automation failures during training has some consequences. A greater potential for operator errors may be expected when an automatic system failed to diagnose a fault than when it failed to detect one.

  9. FESetup: Automating Setup for Alchemical Free Energy Simulations.

    PubMed

    Loeffler, Hannes H; Michel, Julien; Woods, Christopher

    2015-12-28

    FESetup is a new pipeline tool which can be used flexibly within larger workflows. The tool aims to support fast and easy setup of alchemical free energy simulations for molecular simulation packages such as AMBER, GROMACS, Sire, or NAMD. Post-processing methods like MM-PBSA and LIE can be set up as well. Ligands are automatically parametrized with AM1-BCC, and atom mappings for a single topology description are computed with a maximum common substructure search (MCSS) algorithm. An abstract molecular dynamics (MD) engine can be used for equilibration prior to free energy setup or standalone. Currently, all modern AMBER force fields are supported. Ease of use, robustness of the code, and automation where it is feasible are the main development goals. The project follows an open development model, and we welcome contributions.

  10. Sharing control between humans and automation using haptic interface: primary and secondary task performance benefits.

    PubMed

    Griffiths, Paul G; Gillespie, R Brent

    2005-01-01

    This paper describes a paradigm for human/automation control sharing in which the automation acts through a motor coupled to a machine's manual control interface. The manual interface becomes a haptic display, continually informing the human about automation actions. While monitoring by feel, users may choose either to conform to the automation or override it and express their own control intentions. This paper's objective is to demonstrate that adding automation through haptic display can be used not only to improve performance on a primary task but also to reduce perceptual demands or free attention for a secondary task. Results are presented from three experiments in which 11 participants completed a lane-following task using a motorized steering wheel on a fixed-base driving simulator. The automation behaved like a copilot, assisting with lane following by applying torques to the steering wheel. Results indicate that haptic assist improves lane following by at least 30%, p < .0001, while reducing visual demand by 29%, p < .0001, or improving reaction time in a secondary tone localization task by 18 ms, p = .0009. Potential applications of this research include the design of automation interfaces based on haptics that support human/automation control sharing better than traditional push-button automation interfaces.
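
    The control-sharing scheme can be caricatured as a simple torque sum: the automation contributes a corrective torque on the steering wheel as a function of lane-keeping error, and the driver's torque adds to it, so the driver can either conform to the assistance or override it. The gains, names, and control law below are invented for illustration; they are not the paper's controller.

```python
def shared_steering_torque(lane_error, heading_error, human_torque,
                           k_lane=2.0, k_heading=1.0):
    """Net torque at the steering wheel under haptic shared control:
    the automation applies a corrective torque proportional to lane
    and heading errors, and the human's applied torque simply sums
    with it at the wheel."""
    automation_torque = -(k_lane * lane_error + k_heading * heading_error)
    return human_torque + automation_torque
```

    With no human input the wheel is driven back toward the lane center; a sufficiently strong human torque dominates the sum, which is the override behavior described above.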

  11. An Automated Tool for Supporting FMEAs of Digital Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yue,M.; Chu, T.-L.; Martinez-Guridi, G.

    2008-09-07

    Although designs of digital systems can be very different from each other, they typically use many of the same types of generic digital components. Determining the impacts of the failure modes of these generic components on a digital system can be used to support development of a reliability model of the system. A novel approach was proposed for such a purpose by decomposing the system to the level of generic digital components and propagating failure modes up to the system level, which generally is time-consuming and difficult to implement. To overcome the issues associated with implementing the proposed FMEA approach, an automated tool for a digital feedwater control system (DFWCS) has been developed in this study. The automated FMEA tool is essentially a simulation platform, developed by using or recreating the original source code of the different software modules, interfaced through input and output variables that represent the physical signals exchanged between modules, the system, and the controlled process. For any given failure mode, its impacts on the associated signals are determined first, and the variables that correspond to these signals are modified accordingly by the simulation. Criteria are also developed, as part of the simulation platform, to determine whether the system has lost its automatic control function, which is defined as a system failure in this study. The conceptual development of the automated FMEA support tool can be generalized and applied to support FMEAs for reliability assessment of complex digital systems.
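The failure-injection idea in this abstract, corrupting the variables that stand in for physical signals and then checking a system-failure criterion, can be shown in miniature. Everything below (the toy level controller, the bounds, the two failure modes) is an invented stand-in for the DFWCS model, not the actual tool:

```python
def run_fmea_case(failure_mode=None, steps=500):
    """Toy feedwater-like loop: a proportional controller holds a level at
    its setpoint; a failure mode corrupts the sensed signal. The criterion
    for 'loss of automatic control' is the level leaving assumed bounds."""
    level, setpoint = 50.0, 50.0
    for _ in range(steps):
        # inject the failure mode into the sensed signal
        sensed = {"stuck_low": 0.0, "stuck_high": 100.0}.get(failure_mode, level)
        valve = max(-1.0, min(1.0, 0.5 * (setpoint - sensed)))  # saturating actuator
        level += 0.2 * valve - 0.05   # process: valve inflow vs. constant demand
        if not 40.0 <= level <= 60.0:
            return "system failure"
    return "ok"

for mode in (None, "stuck_low", "stuck_high"):
    print(mode, "->", run_fmea_case(mode))
```

Running every failure mode of every generic component through such a harness, and recording which ones trip the criterion, is the FMEA table the abstract describes.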

  12. Aiding Vertical Guidance Understanding

    NASA Technical Reports Server (NTRS)

    Feary, Michael; McCrobie, Daniel; Alkin, Martin; Sherry, Lance; Polson, Peter; Palmer, Everett; McQuinn, Noreen

    1998-01-01

    A two-part study was conducted to evaluate modern flight deck automation and interfaces. In the first part, a survey was performed to validate the existence of automation surprises with current pilots. Results indicated that pilots were often surprised by the behavior of the automation. There were several surprises that were reported more frequently than others. An experimental study was then performed to evaluate (1) the reduction of automation surprises through training specifically for the vertical guidance logic, and (2) a new display that describes the flight guidance in terms of aircraft behaviors instead of control modes. The study was performed in a simulator that was used to run a complete flight with actual airline pilots. Three groups were used to evaluate the guidance display and training. In the training condition, participants went through a training program for vertical guidance before flying the simulation. In the display condition, participants ran through the same training program and then flew the experimental scenario with the new Guidance-Flight Mode Annunciator (G-FMA). Results showed improved pilot performance when given training specifically for the vertical guidance logic and greater improvements when given the training and the new G-FMA. Using actual behavior of the avionics to design pilot training and FMA is feasible, and when the automated vertical guidance mode of the Flight Management System is engaged, the display of the guidance mode and targets yields improved pilot performance.

  13. Effects of Secondary Task Modality and Processing Code on Automation Trust and Utilization During Simulated Airline Luggage Screening

    NASA Technical Reports Server (NTRS)

    Phillips, Rachel; Madhavan, Poornima

    2010-01-01

    The purpose of this research was to examine the impact of environmental distractions on human trust and utilization of automation during the process of visual search. Participants performed a computer-simulated airline luggage screening task with the assistance of a 70% reliable automated decision aid (called DETECTOR) both with and without environmental distractions. The distraction was implemented as a secondary task in either a competing modality (visual) or non-competing modality (auditory). The secondary task processing code either competed with the luggage screening task (spatial code) or with the automation's textual directives (verbal code). We measured participants' system trust, perceived reliability of the system (when a target weapon was present and absent), compliance, reliance, and confidence when agreeing and disagreeing with the system under both distracted and undistracted conditions. Results revealed that system trust was lower in the visual-spatial and auditory-verbal conditions than in the visual-verbal and auditory-spatial conditions. Perceived reliability of the system (when the target was present) was significantly higher when the secondary task was visual rather than auditory. Compliance with the aid increased in all conditions except for the auditory-verbal condition, where it decreased. Similar to the pattern for trust, reliance on the automation was lower in the visual-spatial and auditory-verbal conditions than in the visual-verbal and auditory-spatial conditions. Confidence when agreeing with the system decreased with the addition of any kind of distraction; however, confidence when disagreeing increased with the addition of an auditory secondary task but decreased with the addition of a visual task. A model was developed to represent the research findings and demonstrate the relationship between secondary task modality, processing code, and automation use. 
Results suggest that the nature of environmental distractions influences interaction with automation via significant effects on trust and system utilization. These findings have implications for both automation design and operator training.

  14. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience

    PubMed Central

    Stockton, David B.; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification; automatic organization, labeling, and time-stamping of data and results; and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project. PMID:26528175
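The bookkeeping this abstract describes (automatic labeling, time-stamping, and per-resource tracking of submissions) can be caricatured in a few lines. NeuroManager itself is written in MATLAB; the class, field names, and simulator/resource names below are inventions for illustration, not its API:

```python
import itertools
import time

class SimManager:
    """Toy simulation-management engine: each submission gets an ID, a
    label, a timestamp, and a parameter snapshot for provenance."""

    def __init__(self):
        self._ids = itertools.count(1)
        self.jobs = []

    def submit(self, simulator, resource, params):
        job_id = next(self._ids)
        job = {
            "id": job_id,
            "label": f"{simulator}_{resource}_{job_id:04d}",  # automatic labeling
            "submitted": time.strftime("%Y-%m-%dT%H:%M:%S"),  # time-stamping
            "resource": resource,
            "params": dict(params),  # snapshot, so later edits cannot rewrite history
            "status": "queued",
        }
        self.jobs.append(job)
        return job

    def by_resource(self, resource):
        return [j for j in self.jobs if j["resource"] == resource]

mgr = SimManager()
mgr.submit("neuron", "desktop", {"dt": 0.025})
mgr.submit("genesis", "cluster", {"dt": 0.01})
print([j["label"] for j in mgr.jobs])
```

The parameter snapshot per job is the kernel of the provenance tracking the abstract emphasizes.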

  16. Why do drivers maintain short headways in fog? A driving-simulator study evaluating feeling of risk and lateral control during automated and manual car following.

    PubMed

    Saffarian, M; Happee, R; Winter, J C F de

    2012-01-01

    Drivers in fog tend to maintain short headways, but the reasons behind this phenomenon are not well understood. This study evaluated the effect of headway on lateral control and feeling of risk in both foggy and clear conditions. Twenty-seven participants completed four sessions in a driving simulator: clear automated (CA), clear manual (CM), fog automated (FA) and fog manual (FM). In CM and FM, the drivers used the steering wheel, throttle and brake pedals. In CA and FA, a controller regulated the distance to the lead car, and the driver only had to steer. Drivers indicated how much risk they felt on a touchscreen. Consistent with our hypothesis, feeling of risk and steering activity were elevated when the lead car was not visible. These results might explain why drivers adopt short headways in fog. Practitioner Summary: Fog poses a serious road safety hazard. Our driving-simulator study provides the first experimental evidence to explain the role of risk-feeling and lateral control in headway reduction. These results are valuable for devising effective driver assistance and support systems.

  17. Intelligent robot trends and predictions for the new millennium

    NASA Astrophysics Data System (ADS)

    Hall, Ernest L.; Mundhenk, Terrell N.

    1999-08-01

    An intelligent robot is a remarkably useful combination of a manipulator, sensors and controls. The current use of these machines in outer space, medicine, hazardous materials, defense applications and industry is being pursued with vigor but little funding. In factory automation such robotic machines can improve productivity, increase product quality and improve competitiveness. The computer and the robot have both been developed during recent times. The intelligent robot combines both technologies and requires a thorough understanding and knowledge of mechatronics. In honor of the new millennium, this paper will present a discussion of futuristic trends and predictions. However, in keeping with technical tradition, a new technique for 'Follow the Leader' will also be presented in the hope of it becoming a new, useful and non-obvious technique.

  18. Intelligent Processing Equipment Research and Development Programs of the Department of Commerce

    NASA Technical Reports Server (NTRS)

    Simpson, J. A.

    1992-01-01

    The intelligent processing equipment (IPE) research and development (R&D) programs of the Department of Commerce are carried out within the National Institute of Standards and Technology (NIST). This institute has had work in support of industrial productivity as part of its mission since its founding in 1901. With the advent of factory automation these efforts have increasingly turned to R&D in IPE. The Manufacturing Engineering Laboratory (MEL) of NIST devotes a major fraction of its efforts to this end, while other elements within the organization, notably the Material Science and Engineering Laboratory, have smaller but significant programs. An inventory of all such programs at NIST and a representative selection of projects that at least demonstrate the scope of the efforts are presented.

  19. An architecture and model for cognitive engineering simulation analysis - Application to advanced aviation automation

    NASA Technical Reports Server (NTRS)

    Corker, Kevin M.; Smith, Barry R.

    1993-01-01

    The process of designing crew stations for large-scale, complex automated systems is made difficult because of the flexibility of roles that the crew can assume, and by the rapid rate at which system designs become fixed. Modern cockpit automation frequently involves multiple layers of control and display technology in which human operators must exercise equipment in augmented, supervisory, and fully automated control modes. In this context, we maintain that effective human-centered design is dependent on adequate models of human/system performance in which representations of the equipment, the human operator(s), and the mission tasks are available to designers for manipulation and modification. The joint Army-NASA Aircrew/Aircraft Integration (A3I) Program, with its attendant Man-machine Integration Design and Analysis System (MIDAS), was initiated to meet this challenge. MIDAS provides designers with a test bed for analyzing human-system integration in an environment in which both cognitive human function and 'intelligent' machine function are described in similar terms. This distributed object-oriented simulation system, its architecture and assumptions, and our experiences from its application in advanced aviation crew stations are described.

  20. Thermal depth profiling of vascular lesions: automated regularization of reconstruction algorithms

    NASA Astrophysics Data System (ADS)

    Verkruysse, Wim; Choi, Bernard; Zhang, Jenny R.; Kim, Jeehyun; Nelson, J. Stuart

    2008-03-01

    Pulsed photo-thermal radiometry (PPTR) is a non-invasive, non-contact diagnostic technique used to locate cutaneous chromophores such as melanin (epidermis) and hemoglobin (vascular structures). Clinical utility of PPTR is limited because it typically requires trained user intervention to regularize the inversion solution. Herein, the feasibility of automated regularization was studied. A second objective of this study was to depart from modeling port wine stain (PWS), a vascular skin lesion frequently studied with PPTR, as a strictly layered structure, since this may influence conclusions regarding PPTR reconstruction quality. Average blood vessel depths, diameters and densities derived from histology of 30 PWS patients were used to generate 15 randomized lesion geometries for which we simulated PPTR signals. Reconstruction accuracy for subjective regularization was compared with that for automated regularization methods. The objective regularization approach performed better. However, the average difference was much smaller than the variation between the 15 simulated profiles. Reconstruction quality depended more on the actual profile to be reconstructed than on the reconstruction algorithm or regularization method. Reconstructions of similar or better accuracy can be achieved with an automated regularization procedure, which enhances prospects for user-friendly implementation of PPTR to optimize laser therapy on an individual patient basis.
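Automated regularization of the kind studied here usually means choosing the Tikhonov parameter without a human in the loop. The sketch below is generic, not the authors' PPTR inversion: it selects the parameter on a grid via the discrepancy principle (residual norm matched to the noise norm) for a deliberately ill-conditioned two-parameter least-squares problem; the forward model and all numbers are invented:

```python
import math
import random

random.seed(0)

# Ill-conditioned forward model: rows of A are nearly collinear.
x_true = (1.0, 2.0)
A = [(1.0, 1.0 + 0.01 * k) for k in (-1, 0, 1, 2)]
noise = [random.gauss(0.0, 0.05) for _ in A]
b = [r[0] * x_true[0] + r[1] * x_true[1] + n for r, n in zip(A, noise)]
delta = math.sqrt(sum(n * n for n in noise))  # noise norm (known in this toy)

def tikhonov(lam):
    """Solve (A^T A + lam*I) x = A^T b for 2 unknowns via Cramer's rule."""
    s11 = sum(r[0] * r[0] for r in A) + lam
    s12 = sum(r[0] * r[1] for r in A)
    s22 = sum(r[1] * r[1] for r in A) + lam
    t1 = sum(r[0] * bi for r, bi in zip(A, b))
    t2 = sum(r[1] * bi for r, bi in zip(A, b))
    det = s11 * s22 - s12 * s12
    return ((t1 * s22 - t2 * s12) / det, (s11 * t2 - s12 * t1) / det)

def residual(x):
    return math.sqrt(sum((r[0] * x[0] + r[1] * x[1] - bi) ** 2
                         for r, bi in zip(A, b)))

# Discrepancy principle: pick the lambda whose residual best matches delta.
grid = [10 ** (e / 2) for e in range(-16, 3)]
lam = min(grid, key=lambda l: abs(residual(tikhonov(l)) - delta))
x0, x_reg = tikhonov(0.0), tikhonov(lam)
print(lam, x0, x_reg)
```

Raising the parameter trades a larger residual for a smaller, more stable solution norm; the automated rule simply picks that trade-off objectively, which is the substitution for trained user intervention that the abstract investigates.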

  1. Prior Familiarization With Takeover Requests Affects Drivers' Takeover Performance and Automation Trust.

    PubMed

    Hergeth, Sebastian; Lorenz, Lutz; Krems, Josef F

    2017-05-01

    The objective for this study was to investigate the effects of prior familiarization with takeover requests (TORs) during conditional automated driving on drivers' initial takeover performance and automation trust. System-initiated TORs are one of the biggest concerns for conditional automated driving and have been studied extensively in the past. Most, but not all, of these studies have included training sessions to familiarize participants with TORs. This makes them hard to compare and might obscure first-failure-like effects on takeover performance and automation trust formation. A driving simulator study compared drivers' takeover performance in two takeover situations across four prior familiarization groups (no familiarization, description, experience, description and experience) and automation trust before and after experiencing the system. As hypothesized, prior familiarization with TORs had a more positive effect on takeover performance in the first than in a subsequent takeover situation. In all groups, automation trust increased after participants experienced the system. Participants who were given no prior familiarization with TORs reported highest automation trust both before and after experiencing the system. The current results extend earlier findings suggesting that prior familiarization with TORs during conditional automated driving will be most relevant for takeover performance in the first takeover situation and that it lowers drivers' automation trust. Potential applications of this research include different approaches to familiarize users with automated driving systems, better integration of earlier findings, and sophistication of experimental designs.

  2. A new framework for analysing automated acoustic species-detection data: occupancy estimation and optimization of recordings post-processing

    USGS Publications Warehouse

    Chambert, Thierry A.; Waddle, J. Hardin; Miller, David A.W.; Walls, Susan; Nichols, James D.

    2018-01-01

    The development and use of automated species-detection technologies, such as acoustic recorders, for monitoring wildlife are rapidly expanding. Automated classification algorithms provide a cost- and time-effective means to process information-rich data, but often at the cost of additional detection errors. Appropriate methods are necessary to analyse such data while dealing with the different types of detection errors. We developed a hierarchical modelling framework for estimating species occupancy from automated species-detection data. We explore design and optimization of data post-processing procedures to account for detection errors and generate accurate estimates. Our proposed method accounts for both imperfect detection and false positive errors and utilizes information about both occurrence and abundance of detections to improve estimation. Using simulations, we show that our method provides much more accurate estimates than models ignoring the abundance of detections. The same findings are reached when we apply the methods to two real datasets on North American frogs surveyed with acoustic recorders. When false positives occur, estimator accuracy can be improved when a subset of detections produced by the classification algorithm is post-validated by a human observer. We use simulations to investigate the relationship between accuracy and effort spent on post-validation, and found that very accurate occupancy estimates can be obtained with as little as 1% of data being validated. Automated monitoring of wildlife provides opportunities and challenges. Our methods for analysing automated species-detection data help to meet key challenges unique to these data and will prove useful for many wildlife monitoring programs.
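Why false positives matter here, and why post-validation helps, can be seen in a toy Monte Carlo. This is only a bias illustration, not the authors' hierarchical occupancy model, and all the rates are invented:

```python
import random

random.seed(42)

psi, p, f = 0.4, 0.7, 0.3     # true occupancy, detection and false-positive rates
n_sites, n_visits = 2000, 5

occupied = [random.random() < psi for _ in range(n_sites)]

def visit(occ):
    """One recording: a true detection (if occupied) and/or a false positive."""
    true_det = occ and random.random() < p
    false_det = random.random() < f
    return true_det or false_det, true_det

naive_hits = validated_hits = 0
for occ in occupied:
    visits = [visit(occ) for _ in range(n_visits)]
    # naive: call the site occupied if the classifier fired on any visit
    naive_hits += any(det for det, _ in visits)
    # post-validation: a human screens the detections and keeps only true ones
    validated_hits += any(true for _, true in visits)

naive_psi = naive_hits / n_sites
validated_psi = validated_hits / n_sites
print(naive_psi, validated_psi)  # the naive estimate is badly inflated
```

With even a modest false-positive rate, the naive estimate saturates toward 1 as visits accumulate, while screened detections recover something close to the true occupancy; the paper's contribution is to get that correction statistically, from a small validated subset, instead of screening everything.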

  3. Automated Rendezvous and Capture System Development and Simulation for NASA

    NASA Technical Reports Server (NTRS)

    Roe, Fred D.; Howard, Richard T.; Murphy, Leslie

    2004-01-01

    The United States does not have an Automated Rendezvous and Capture/Docking (AR&C) capability and is reliant on manned control for rendezvous and docking of orbiting spacecraft. This reliance on the labor-intensive manned interface for control of rendezvous and docking vehicles has a significant impact on the cost of operating the International Space Station (ISS) and precludes the use of any U.S. expendable launch capabilities for Space Station resupply. The Marshall Space Flight Center (MSFC) has conducted pioneering research in the development of an automated rendezvous and capture (or docking) (AR&C) system for U.S. space vehicles. This AR&C system was tested extensively using hardware-in-the-loop simulations in the Flight Robotics Laboratory, and a rendezvous sensor, the Video Guidance Sensor, was developed and successfully flown on the Space Shuttle on flights STS-87 and STS-95, proving the concept of a video-based sensor. Further developments in sensor technology and vehicle and target configuration have led to continued improvements and changes in AR&C system development and simulation. A new Advanced Video Guidance Sensor (AVGS) with target will be utilized as the primary navigation sensor on the Demonstration of Autonomous Rendezvous Technologies (DART) flight experiment in 2004. Real-time closed-loop simulations will be performed to validate the improved AR&C systems prior to flight.

  4. Adaptive automation of human-machine system information-processing functions.

    PubMed

    Kaber, David B; Wright, Melanie C; Prinzel, Lawrence J; Clamann, Michael P

    2005-01-01

    The goal of this research was to describe the ability of human operators to interact with adaptive automation (AA) applied to various stages of complex systems information processing, defined in a model of human-automation interaction. Forty participants operated a simulation of an air traffic control task. Automated assistance was adaptively applied to information acquisition, information analysis, decision making, and action implementation aspects of the task based on operator workload states, which were measured using a secondary task. The differential effects of the forms of automation were determined and compared with a manual control condition. Results of two 20-min trials of AA or manual control revealed a significant effect of the type of automation on performance, particularly during manual control periods as part of the adaptive conditions. Humans appear to better adapt to AA applied to sensory and psychomotor information-processing functions (action implementation) than to AA applied to cognitive functions (information analysis and decision making), and AA is superior to completely manual control. Potential applications of this research include the design of automation to support air traffic controller information processing.
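The triggering logic in adaptive-automation studies of this kind, a secondary-task measure as the workload probe and automation toggled per information-processing stage, can be sketched as follows. The thresholds, the stage ordering, and the one-stage-at-a-time policy are assumptions for illustration, not the experiment's actual algorithm:

```python
STAGES = ["information acquisition", "information analysis",
          "decision making", "action implementation"]

def adapt(automated, secondary_task_hit_rate, low=0.60, high=0.85):
    """Toggle automation of one stage based on a secondary-task workload
    probe: poor secondary-task performance means high workload, so
    automate more; good performance means spare capacity, so hand back."""
    automated = list(automated)
    if secondary_task_hit_rate < low:
        for stage in STAGES:          # automate the next still-manual stage
            if stage not in automated:
                automated.append(stage)
                break
    elif secondary_task_hit_rate > high and automated:
        automated.pop()               # workload is low: return a stage to manual
    return automated

state = []
for hit_rate in (0.50, 0.55, 0.90, 0.70):
    state = adapt(state, hit_rate)
    print(hit_rate, state)
```

The study's finding that operators cope better when the automated stage is sensory or psychomotor rather than cognitive would, in such a scheme, argue for ordering the stage list by how gracefully humans re-enter the loop.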

  5. The influence of highly automated driving on the self-perception of drivers in the context of Conduct-by-Wire.

    PubMed

    Kauer, Michaela; Franz, Benjamin; Maier, Alexander; Bruder, Ralph

    2015-01-01

    Today, new driving paradigms are being introduced that aim to reduce the number of standalone driver assistance systems by combining these into one overarching system. This is done to reduce the demands on drivers but often leads to a higher degree of automation. Feasibility and driver behaviour are often the subject of studies, but this is contrasted by a lack of research into the influence of highly automated driving on the self-perception of drivers. This article begins to close this gap by investigating the influences of one highly automated driving concept--Conduct-by-Wire--on the self-perception of drivers via a combined driving simulator and interview study. The aim of this work is to identify changes in the role concept of drivers indicated by highly automated driving, to evaluate these changes from the drivers' point of view and to give suggestions of possible improvements to the design of highly automated vehicles.

  6. Transportation Planning with Immune System Derived Approach

    NASA Astrophysics Data System (ADS)

    Sugiyama, Kenji; Yaji, Yasuhito; Ootsuki, John Takuya; Fujimoto, Yasutaka; Sekiguchi, Takashi

    This paper presents an immune system derived approach for planning transportation of materials between manufacturing processes in the factory. Transportation operations are modeled by Petri Net, and divided into submodels. Transportation orders are derived from the firing sequences of those submodels through convergence calculation by the immune system derived excitation and suppression operations. Basic evaluation of this approach is conducted by simulation-based investigation.
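A transportation submodel of the kind described can be written as a small Petri net. The two-transition net below is made up (one AGV moving material from process A to process B), but it shows the marking-and-firing machinery that the immune-inspired excitation/suppression calculation would drive:

```python
# Petri net: a marking (tokens per place) plus transitions with
# pre- and post-condition token counts.
marking = {"material_at_A": 1, "agv_idle": 1, "loaded": 0, "material_at_B": 0}
transitions = {
    "load_and_move": ({"material_at_A": 1, "agv_idle": 1}, {"loaded": 1}),
    "unload_at_B":   ({"loaded": 1}, {"material_at_B": 1, "agv_idle": 1}),
}

def enabled(name):
    pre, _ = transitions[name]
    return all(marking[p] >= n for p, n in pre.items())

def fire(name):
    assert enabled(name), f"{name} is not enabled"
    pre, post = transitions[name]
    for p, n in pre.items():
        marking[p] -= n
    for p, n in post.items():
        marking[p] += n

sequence = []
# Fire any enabled transition until the net is dead. This trivial scheduler
# stands in for the paper's immune-inspired convergence calculation, which
# is what actually chooses the firing sequence.
while True:
    ready = [t for t in transitions if enabled(t)]
    if not ready:
        break
    fire(ready[0])
    sequence.append(ready[0])
print(sequence, marking)
```

The firing sequence is the transportation order: in the paper's framework, the submodels' candidate sequences are what the excitation and suppression dynamics arbitrate between.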

  7. 2,4,6-Trinitrotoluene in soil and groundwater under a waste lagoon at the former Explosives Factory Maribyrnong (EFM), Victoria, Australia

    NASA Astrophysics Data System (ADS)

    Martel, Richard; Robertson, Timothy James; Doan, Minh Quan; Thiboutot, Sonia; Ampleman, Guy; Provatas, Arthur; Jenkins, Thomas

    2008-01-01

    Energetic materials contamination was investigated at the former Explosives Factory Maribyrnong, Victoria, Australia. Spectrophotometric/high performance liquid chromatography (HPLC) analysis was utilised to delineate a 5 tonne crystalline 2,4,6-trinitrotoluene (TNT) source in a former process waste lagoon that was found to be supplying contaminant leachate to the surficial clay aquitard, with a maximum recorded concentration of 7.0 ppm TNT. Groundwater within underlying sand and gravel aquifers was found to be uncontaminated due to upward hydraulic gradients resulting in slow plume development and propagation. Adsorption and microcosm test results from a parallel study were used as input parameters to simulate aqueous TNT transport in the clay aquitard using ATRANS20 software. The simulated TNT plume was localised within a few metres of the source and was at steady state, though leaching rate calculations suggest that without mitigation or other changes to the system, persistence of the source would be approximately 2,000 years. Remediation strategies may involve removal of the near-surface source zone and infilling with an impermeable capping to impede leaching while facilitating ongoing natural attenuation by anaerobic degradation.
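The ~2,000-year persistence figure is, at heart, a mass-over-flux estimate. As a back-of-envelope check: the leaching rate below is not a reported number; it is back-calculated from the abstract's 5 tonne source and 2,000-year result (the paper derives its rate from site measurements):

```python
source_mass_kg = 5_000           # ~5 tonne crystalline TNT source (from the abstract)
# Hypothetical leaching rate, chosen only to be consistent with the
# reported ~2,000-year persistence.
leach_rate_kg_per_year = 2.5
persistence_years = source_mass_kg / leach_rate_kg_per_year
print(persistence_years)  # → 2000.0
```

The point of the check is the scale: a multi-tonne source drained at kilograms per year persists for millennia, which is why source removal rather than natural attenuation alone is on the table.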

  8. Modeling of fire smoke movement in multizone garments building using two open source platforms

    NASA Astrophysics Data System (ADS)

    Khandoker, Md. Arifur Rahman; Galib, Musanna; Islam, Adnan; Rahman, Md. Ashiqur

    2017-06-01

    Casualties among garment factory workers from factory fires in Bangladesh are a recurring tragedy. Smoke, which is often more fatal than the fire itself, propagates through different pathways from lower to upper floors during a building fire. Among the toxic gases produced by a building fire, carbon monoxide (CO) can be deadly even in small amounts. This paper models the propagation and transport of fire-induced smoke (CO) resulting from the burning of synthetic polyester fibers using two open source platforms, CONTAM and Fire Dynamics Simulator (FDS). Smoke migration in a generic multistoried garment factory building in Bangladesh is modeled using CONTAM, where each floor is compartmentalized into different zones. The elevator and stairway shafts are modeled by phantom zones to simulate contaminant (CO) transport from one floor to upper floors. The FDS analysis involves burning two different stacks of polyester jackets, six feet in height, with a maximum heat release rate per unit area of 1500 kW/m2 over storage areas of 50 m2 and 150 m2, respectively. The resulting CO generation and removal rates from FDS are used in CONTAM to predict fire-borne CO propagation in different zones of the garment building. Findings of the study show that the contaminant flow rate is a strong function of the building geometry, the location where the fire starts, the amount of burnt material, the presence of an air handling unit (AHU), and the CO generation and removal rates at the source location. The transport of fire smoke in the building's hallways, stairways and lifts is also investigated in detail to examine the safe egress of occupants in case of fire.
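The zone-by-zone contaminant bookkeeping that a multizone tool like CONTAM performs can be caricatured by a two-zone, well-mixed CO mass balance: a fire floor feeding an upper floor through a stair-shaft airflow. All volumes, flows, and the source term below are invented for illustration:

```python
# Two well-mixed zones in series: fire floor (1) and an upper floor (2),
# connected by a constant airflow Q through a stair shaft, with CO-free
# make-up air into zone 1 and exhaust from zone 2.
V1, V2 = 500.0, 500.0   # zone volumes, m^3 (assumed)
Q = 0.5                 # interzonal airflow, m^3/s (assumed)
G = 0.01                # CO generation in zone 1, m^3/s (assumed)
dt = 1.0                # time step, s
c1 = c2 = 0.0           # CO volume fractions

for _ in range(3600):   # one hour, explicit Euler
    c1 += dt * (G + Q * 0.0 - Q * c1) / V1   # make-up air carries no CO
    c2 += dt * (Q * c1 - Q * c2) / V2        # upper floor fed via the shaft
print(c1, c2)  # the upper floor lags the fire floor; both tend toward G/Q
```

Even this crude balance reproduces the abstract's qualitative finding: the contaminant history in a zone is set by the flow network (here Q and the series topology) and by the source's generation and removal rates.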

  9. Designing automation for complex work environments under different levels of stress.

    PubMed

    Sauer, Juergen; Nickel, Peter; Wastell, David

    2013-01-01

    This article examines the effectiveness of different forms of static and adaptable automation under low- and high-stress conditions. Forty participants were randomly assigned to one of four experimental conditions, comparing three levels of static automation (low, medium and high) and one level of adaptable automation, with the environmental stressor (noise) being varied as a within-subjects variable. Participants were trained for 4 h on a simulation of a process control environment, called AutoCAMS, followed by a 2.5-h testing session. Measures of performance, psychophysiology and subjective reactions were taken. The results showed that operators preferred higher levels of automation under noise than under quiet conditions. A number of parameters indicated negative effects of noise exposure, such as performance impairments, physiological stress reactions and higher mental workload. It also emerged that adaptable automation provided advantages over low and intermediate static automation, with regard to mental workload, effort expenditure and diagnostic performance. The article concludes that for the design of automation a wider range of operational scenarios reflecting adverse as well as ideal working conditions needs to be considered. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  10. Automated generation of virtual scenarios in driving simulator from highway design data.

    DOT National Transportation Integrated Search

    2010-09-01

    In 2008, the Texas Transportation Institute (TTI) began using a desktop driving simulator made by Realtime : Technologies, Inc. This system comes with a library of different roadway segment types that can be pieced : together to create driving scenar...

  11. Spec2Harv: Converting Spectrum output to HARVEST input

    Treesearch

    Eric J. Gustafson; Luke V. Rasmussen; Larry A. Leefers

    2003-01-01

    Spec2Harv was developed to automate the conversion of harvest schedules generated by the Spectrum model into script files that can be used by the HARVEST simulation model to simulate the implementation of the Spectrum schedules in a spatially explicit way.

  12. Modeling and Simulation in Support of Testing and Evaluation

    DTIC Science & Technology

    1997-03-01

    contains standardized automated test methodology, synthetic stimuli and environments based on TECOM Ground Truth data and physics. The VPG is a distributed...Systems Acquisition Management (FSAM) coursebook, Defense Systems Management College, January 1994. Crocker, Charles M. "Application of the Simulation

  13. What's skill got to do with it? Vehicle automation and driver mental workload.

    PubMed

    Young, M S; Stanton, N A

    2007-08-01

    Previous research has found that vehicle automation systems can reduce driver mental workload, with implications for attentional resources that can be detrimental to performance. The present paper considers how the development of automaticity within the driving task may influence performance in underload situations. Driver skill and vehicle automation were manipulated in a driving simulator, with four levels of each variable. Mental workload was assessed using a secondary task measure and eye movements were recorded to infer attentional capacity. The effects of automation on driver mental workload were quite robust across skill levels, but the most intriguing findings were from the eye movement data. It was found that, with little exception, attentional capacity and mental workload were directly related at all levels of driver skill, consistent with earlier studies. The results are discussed with reference to applied theories of cognition and the design of automation.

  14. NASA Systems Autonomy Demonstration Project - Development of Space Station automation technology

    NASA Technical Reports Server (NTRS)

    Bull, John S.; Brown, Richard; Friedland, Peter; Wong, Carla M.; Bates, William

    1987-01-01

    A 1984 Congressional expansion of the 1958 National Aeronautics and Space Act mandated that NASA conduct programs, as part of the Space Station program, which will yield the U.S. material benefits, particularly in the areas of advanced automation and robotics systems. Demonstration programs are scheduled for automated systems such as the thermal control, expert system coordination of Station subsystems, and automation of multiple subsystems. The programs focus the R&D efforts and provide a gateway for transfer of technology to industry. The NASA Office of Aeronautics and Space Technology is responsible for directing, funding and evaluating the Systems Autonomy Demonstration Project, which will include simulated interactions between novice personnel and astronauts and several automated, expert subsystems to explore the effectiveness of the man-machine interface being developed. Features and progress on the TEXSYS prototype thermal control system expert system are outlined.

  15. Birth of Industry 5.0: Making Sense of Big Data with Artificial Intelligence, "The Internet of Things" and Next-Generation Technology Policy.

    PubMed

    Özdemir, Vural; Hekim, Nezih

    2018-01-01

    Driverless cars with artificial intelligence (AI) and automated supermarkets run by collaborative robots (cobots) working without human supervision have sparked off new debates: what will be the impacts of extreme automation, turbocharged by the Internet of Things (IoT), AI, and the Industry 4.0, on Big Data and omics implementation science? The IoT builds on (1) broadband wireless internet connectivity, (2) miniaturized sensors embedded in animate and inanimate objects ranging from the house cat to the milk carton in your smart fridge, and (3) AI and cobots making sense of Big Data collected by sensors. Industry 4.0 is a high-tech strategy for manufacturing automation that employs the IoT, thus creating the Smart Factory. Extreme automation until "everything is connected to everything else" poses, however, vulnerabilities that have been little considered to date. First, highly integrated systems are vulnerable to systemic risks such as total network collapse in the event of failure of one of its parts, for example, by hacking or Internet viruses that can fully invade integrated systems. Second, extreme connectivity creates new social and political power structures. If left unchecked, they might lead to authoritarian governance by one person in total control of network power, directly or through her/his connected surrogates. We propose Industry 5.0 that can democratize knowledge coproduction from Big Data, building on the new concept of symmetrical innovation. Industry 5.0 utilizes IoT, but differs from predecessor automation systems by having three-dimensional (3D) symmetry in innovation ecosystem design: (1) a built-in safe exit strategy in case of demise of hyperconnected entrenched digital knowledge networks. 
Importantly, such safe exits are orthogonal, in that they allow "digital detox" by employing pathways unrelated/unaffected by automated networks, for example, electronic patient records versus material/article trails on vital medical information; (2) equal emphasis on both acceleration and deceleration of innovation if diminishing returns become apparent; and (3) next-generation social science and humanities (SSH) research for global governance of emerging technologies: "Post-ELSI Technology Evaluation Research" (PETER). Importantly, PETER considers the technology opportunity costs, ethics, ethics-of-ethics, framings (epistemology), independence, and reflexivity of SSH research in technology policymaking. Industry 5.0 is poised to harness extreme automation and Big Data with safety, innovative technology policy, and responsible implementation science, enabled by 3D symmetry in innovation ecosystem design.

  16. Evaluation of an automated ultraviolet-C light disinfection device and patient hand hygiene for reduction of pathogen transfer from interactive touchscreen computer kiosks.

    PubMed

    Alhmidi, Heba; Cadnum, Jennifer L; Piedrahita, Christina T; John, Amrita R; Donskey, Curtis J

    2018-04-01

    Touchscreens are a potential source of pathogen transmission. In our facility, patients and visitors rarely perform hand hygiene after using interactive touchscreen computer kiosks. An automated ultraviolet-C touchscreen disinfection device was effective in reducing bacteriophage MS2, bacteriophage ϕX174, methicillin-resistant Staphylococcus aureus, and Clostridium difficile spores inoculated onto a touchscreen. In simulations, an automated ultraviolet-C touchscreen disinfection device alone or in combination with hand hygiene reduced transfer of the viruses from contaminated touchscreens to fingertips. Published by Elsevier Inc.

  17. Automated Meta-Aircraft Operations for a More Efficient and Responsive Air Transportation System

    NASA Technical Reports Server (NTRS)

    Hanson, Curt

    2015-01-01

    A brief overview is given of the ongoing NASA Automated Cooperative Trajectories (ACT) project. Current status and upcoming work are previewed. The motivating factors and innovative aspects of ACT are discussed along with technical challenges and the expected system-level impacts if the project is successful. Preliminary results from the NASA G-III hardware-in-the-loop simulation are included.

  18. An automated system for terrain database construction

    NASA Technical Reports Server (NTRS)

    Johnson, L. F.; Fretz, R. K.; Logan, T. L.; Bryant, N. A.

    1987-01-01

    An automated Terrain Database Preparation System (TDPS) for the construction and editing of terrain databases used in computerized wargaming simulation exercises has been developed. The TDPS system operates under the TAE executive, and it integrates VICAR/IBIS image processing and Geographic Information System software with CAD/CAM data capture and editing capabilities. The terrain database includes such features as roads, rivers, vegetation, and terrain roughness.

  19. The Automation of Nowcast Model Assessment Processes

    DTIC Science & Technology

    2016-09-01

    that will automate real-time WRE-N model simulations, collect and quality-control check weather observations for assimilation and verification, and...domains centered near White Sands Missile Range, New Mexico, where the Meteorological Sensor Array (MSA) will be located. The MSA will provide...observations and performing quality-control checks for the pre-forecast data assimilation period. 2. Run the WRE-N model to generate model forecast data

  20. Power subsystem automation study

    NASA Technical Reports Server (NTRS)

    Tietz, J. C.; Sewy, D.; Pickering, C.; Sauers, R.

    1984-01-01

    The purpose of phase 2 of the power subsystem automation study was to demonstrate the feasibility of using computer software to manage an aspect of the electrical power subsystem on a space station. The state of the art in expert systems software was investigated in this study. This effort resulted in the demonstration of prototype expert system software for managing one aspect of a simulated space station power subsystem.

  1. SiMon: Simulation Monitor for Computational Astrophysics

    NASA Astrophysics Data System (ADS)

    Xuran Qian, Penny; Cai, Maxwell Xu; Portegies Zwart, Simon; Zhu, Ming

    2017-09-01

    Scientific discovery via numerical simulations is important in modern astrophysics. This relatively new branch of astrophysics has become possible due to the development of reliable numerical algorithms and the high performance of modern computing technologies. These enable the analysis of large collections of observational data and the acquisition of new data via simulations at unprecedented accuracy and resolution. Ideally, simulations run until they reach some pre-determined termination condition, but often other factors cause extensive numerical approaches to break down at an earlier stage. In those cases, processes tend to be interrupted due to unexpected events in the software or the hardware, and the scientist handles the interrupt manually, which is time-consuming and prone to errors. We present the Simulation Monitor (SiMon) to automate the farming of large and extensive simulation processes. Our method is light-weight, fully automates the entire workflow management, operates concurrently across multiple platforms, and can be installed in user space. Inspired by the process of crop farming, we perceive each simulation as a crop in the field, and running a simulation becomes analogous to growing crops. With the development of SiMon we relax the technical aspects of simulation management. The initial package was developed for extensive parameter searches in numerical simulations, but it turns out to work equally well for automating the computational processing and reduction of observational data.
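
    The "simulation farming" behavior described above might be sketched as follows. This is a hypothetical interface, not SiMon's actual API: poll a set of runs, restart any that stopped before reaching their termination condition, and collect exit codes once every run has finished.

```python
import subprocess
import sys
import time

def farm(commands, reached_termination, max_restarts=3, poll_seconds=0.5):
    """Run each named command, restarting interrupted runs up to a limit."""
    procs = {name: subprocess.Popen(cmd) for name, cmd in commands.items()}
    restarts = {name: 0 for name in commands}
    exit_codes = {}
    while procs:
        time.sleep(poll_seconds)
        for name, proc in list(procs.items()):
            code = proc.poll()
            if code is None:
                continue  # still running
            if reached_termination(name) or restarts[name] >= max_restarts:
                exit_codes[name] = code   # finished (or gave up); stop tracking
                del procs[name]
            else:
                restarts[name] += 1       # interrupted early: resume it
                procs[name] = subprocess.Popen(commands[name])
    return exit_codes

# Example: one trivial "simulation" that finishes immediately.
result = farm({"run1": [sys.executable, "-c", "pass"]},
              reached_termination=lambda name: True,
              poll_seconds=0.05)
```

    In a real deployment, `reached_termination` would inspect the run's output files for its pre-determined stopping condition rather than always returning true.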

  2. Automation of a Wave-Optics Simulation and Image Post-Processing Package on Riptide

    NASA Astrophysics Data System (ADS)

    Werth, M.; Lucas, J.; Thompson, D.; Abercrombie, M.; Holmes, R.; Roggemann, M.

    Detailed wave-optics simulations and image post-processing algorithms are computationally expensive and benefit from the massively parallel hardware available at supercomputing facilities. We created an automated system that interfaces with the Maui High Performance Computing Center (MHPCC) Distributed MATLAB® Portal to submit massively parallel wave-optics simulations to the IBM iDataPlex (Riptide) supercomputer. This system subsequently post-processes the output images with an improved version of physically constrained iterative deconvolution (PCID) and analyzes the results using a series of modular algorithms written in Python. With this architecture, a single person can simulate thousands of unique scenarios and produce analyzed, archived, and briefing-compatible output products with very little effort. This research was developed with funding from the Defense Advanced Research Projects Agency (DARPA). The views, opinions, and/or findings expressed are those of the author(s) and should not be interpreted as representing the official views or policies of the Department of Defense or the U.S. Government.

  3. A Tool for Parameter-space Explorations

    NASA Astrophysics Data System (ADS)

    Murase, Yohsuke; Uchitane, Takeshi; Ito, Nobuyasu

    A software package for managing simulation jobs and results, named "OACIS", is presented. It controls a large number of simulation jobs executed in various remote servers, keeps these results in an organized way, and manages the analyses on these results. The software has a web browser front end, and users can submit various jobs to appropriate remote hosts from a web browser easily. After these jobs are finished, all the result files are automatically downloaded from the computational hosts and stored in a traceable way together with the logs of the date, host, and elapsed time of the jobs. Some visualization functions are also provided so that users can easily grasp the overview of the results distributed in a high-dimensional parameter space. Thus, OACIS is especially beneficial for complex simulation models having many parameters, for which a lot of parameter searches are required. By using the OACIS API, it is easy to write code that automates parameter selection depending on previous simulation results. A few examples of automated parameter selection are also demonstrated.
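
    The automated parameter selection described above can be sketched generically. Here `run` is a hypothetical stand-in for a job submitted through the OACIS API; the refinement strategy is an illustration, not one of the paper's demonstrated examples.

```python
def search(run, lo, hi, rounds=5):
    """Iteratively narrow a 1-D parameter interval around the best result.

    `run(x)` submits a simulation at parameter x and returns its scalar
    objective (lower is better); each round samples the interval endpoints
    and midpoint, then re-centers a smaller interval on the best point.
    """
    best_x, best_y = None, float("inf")
    for _ in range(rounds):
        for x in (lo, (lo + hi) / 2, hi):
            y = run(x)
            if y < best_y:
                best_x, best_y = x, y
        span = (hi - lo) / 4
        lo, hi = best_x - span, best_x + span  # refine around the best point
    return best_x, best_y

# Toy objective standing in for a simulation: minimum at x = 3.
best_x, best_y = search(lambda x: (x - 3) ** 2, 0.0, 10.0)
```

    The point of driving this through an API, as the abstract notes, is that each `run` call reuses the stored, traceable results rather than re-launching jobs by hand.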

  4. An expert system for simulating electric loads aboard Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Kukich, George; Dolce, James L.

    1990-01-01

    Space Station Freedom will provide an infrastructure for space experimentation. This environment will feature regulated access to any resources required by an experiment. Automated systems are being developed to manage the electric power so that researchers can have the flexibility to modify their experiment plan for contingencies or for new opportunities. To define these flexible power management characteristics for Space Station Freedom, a simulation is required that captures the dynamic nature of space experimentation; namely, an investigator is allowed to restructure his experiment and to modify its execution. This changes the energy demands for the investigator's range of options. An expert system competent in the domain of cryogenic fluid management experimentation was developed. It will be used to help design and test automated power scheduling software for Freedom's electric power system. The expert system allows experiment planning and experiment simulation. The former evaluates experimental alternatives and offers advice on the details of the experiment's design. The latter provides a real-time simulation of the experiment replete with appropriate resource consumption.

  5. Automated chemical kinetic modeling via hybrid reactive molecular dynamics and quantum chemistry simulations.

    PubMed

    Döntgen, Malte; Schmalz, Felix; Kopp, Wassja A; Kröger, Leif C; Leonhard, Kai

    2018-06-13

    An automated scheme for obtaining chemical kinetic models from scratch using reactive molecular dynamics and quantum chemistry simulations is presented. This methodology combines the phase space sampling of reactive molecular dynamics with the thermochemistry and kinetics prediction capabilities of quantum mechanics. This scheme provides the NASA polynomial and modified Arrhenius equation parameters for all species and reactions that are observed during the simulation and supplies them in the ChemKin format. The ab initio level of theory for predictions is easily exchangeable and the presently used G3MP2 level of theory is found to reliably reproduce hydrogen and methane oxidation thermochemistry and kinetics data. Chemical kinetic models obtained with this approach are ready-to-use for, e.g., ignition delay time simulations, as shown for hydrogen combustion. The presented extension of the ChemTraYzer approach can be used as a basis for methodologically advancing chemical kinetic modeling schemes and as a black-box approach to generate chemical kinetic models.
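
    The modified Arrhenius form supplied in ChemKin format, as described above, can be sketched as a short computation. Parameter values here are illustrative only, not taken from the paper.

```python
import math

R = 8.314462618  # universal gas constant, J/(mol K)

def rate_constant(A, n, Ea, T):
    """Modified Arrhenius rate constant: k(T) = A * T**n * exp(-Ea / (R*T))."""
    return A * T ** n * math.exp(-Ea / (R * T))

# With n = 0 the expression reduces to the classic Arrhenius equation.
k = rate_constant(A=1.0e10, n=0.0, Ea=50_000.0, T=1000.0)
```

    A ChemKin-format mechanism lists exactly these three parameters (A, n, Ea) per reaction, which is why the scheme can emit ready-to-use kinetic models.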

  6. The Walking Interventions Through Texting (WalkIT) Trial: Rationale, Design, and Protocol for a Factorial Randomized Controlled Trial of Adaptive Interventions for Overweight and Obese, Inactive Adults.

    PubMed

    Hurley, Jane C; Hollingshead, Kevin E; Todd, Michael; Jarrett, Catherine L; Tucker, Wesley J; Angadi, Siddhartha S; Adams, Marc A

    2015-09-11

    Walking is a widely accepted and frequently targeted health promotion approach to increase physical activity (PA). Interventions to increase PA have produced only small improvements. Stronger and more potent behavioral intervention components are needed to increase time spent in PA, improve cardiometabolic risk markers, and optimize health. Our aim is to present the rationale and methods from the WalkIT Trial, a 4-month factorial randomized controlled trial (RCT) in inactive, overweight/obese adults. The main purpose of the study was to evaluate whether intensive adaptive components result in greater improvements to adults' PA compared to the static intervention components. Participants enrolled in a 2×2 factorial RCT and were assigned to one of four semi-automated, text message-based walking interventions. Experimental components included adaptive versus static steps/day goals, and immediate versus delayed reinforcement. Principles of percentile shaping and behavioral economics were used to operationalize experimental components. A Fitbit Zip measured the main outcome: participants' daily physical activity (steps and cadence) over the 4-month duration of the study. Secondary outcomes included self-reported PA, psychosocial outcomes, aerobic fitness, and cardiorespiratory risk factors assessed pre/post in a laboratory setting. Participants were recruited through email listservs and websites affiliated with the university campus, community businesses and local government, social groups, and social media advertising. This study completed data collection in December 2014, but data cleaning and preliminary analyses are still in progress. We expect to complete analysis of the main outcomes in late 2015 to early 2016. The Walking Interventions through Texting (WalkIT) Trial will further the understanding of theory-based intervention components to increase the PA of men and women who are healthy, insufficiently active, and overweight or obese. 
WalkIT is one of the first studies focusing on the individual components of combined goal setting and reward structures in a factorial design to increase walking. The trial is expected to produce results useful to future research interventions and perhaps industry initiatives, primarily focused on mHealth, goal setting, and those looking to promote behavior change through performance-based incentives. ClinicalTrials.gov NCT02053259; https://clinicaltrials.gov/ct2/show/NCT02053259 (Archived by WebCite at http://www.webcitation.org/6b65xLvmg).
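
    The percentile-shaping principle named above can be sketched as follows: the next steps/day goal is set to a chosen rank-percentile of the participant's own recent history, so goals adapt to observed behavior. The window length and percentile here are hypothetical, not the trial's actual protocol settings.

```python
def adaptive_goal(recent_daily_steps, percentile=0.6):
    """Set the next goal to the given rank-percentile of recent step counts."""
    ranked = sorted(recent_daily_steps)
    index = min(int(percentile * len(ranked)), len(ranked) - 1)
    return ranked[index]

# Ten days of (hypothetical) step counts; the goal is modestly above typical
# performance, which is the behavioral point of percentile shaping.
goal = adaptive_goal([3200, 4100, 5000, 4500, 3800, 6100, 5200, 4700, 3900, 5500])
```

    A static-goal arm, by contrast, would return the same fixed target regardless of the history passed in.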

  9. Decadal trends in the seasonal-cycle amplitude of terrestrial CO 2 exchange resulting from the ensemble of terrestrial biosphere models

    DOE PAGES

    Ito, Akihiko; Inatomi, Motoko; Huntzinger, Deborah N.; ...

    2016-05-12

    The seasonal-cycle amplitude (SCA) of the atmosphere–ecosystem carbon dioxide (CO2) exchange rate is a useful metric of the responsiveness of the terrestrial biosphere to environmental variations. It is unclear, however, what underlying mechanisms are responsible for the observed increasing trend of SCA in atmospheric CO2 concentration. Using output data from the Multi-scale Terrestrial Model Intercomparison Project (MsTMIP), we investigated how well the SCA of atmosphere–ecosystem CO2 exchange was simulated with 15 contemporary terrestrial ecosystem models during the period 1901–2010. We also attempted to evaluate the contributions of potential mechanisms such as atmospheric CO2, climate, land-use, and nitrogen deposition through factorial experiments using different combinations of forcing data. Under contemporary conditions, the simulated global-scale SCA of the cumulative net ecosystem carbon flux of most models was comparable in magnitude with the SCA of atmospheric CO2 concentrations. Results from factorial simulation experiments showed that elevated atmospheric CO2 exerted a strong influence on the seasonality amplification. When the models considered not only climate change but also land-use and atmospheric CO2 changes, the majority showed amplification trends of the SCAs of photosynthesis, respiration, and net ecosystem production (+0.19 % to +0.50 % yr⁻¹). In the case of land-use change, it was difficult to separate the contribution of agricultural management to SCA because of inadequacies in both the data and the models. The simulated amplification of SCA was approximately consistent with the observational evidence of the SCA in atmospheric CO2 concentrations. Large inter-model differences remained, however, in the simulated global tendencies and spatial patterns of CO2 exchanges. 
    Further studies are required to identify a consistent explanation for the simulated and observed amplification trends, including their underlying mechanisms. Nevertheless, this study implies that monitoring of ecosystem seasonality would provide useful insights concerning ecosystem dynamics.
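
    The seasonal-cycle amplitude and its trend, as discussed above, might be computed along these lines. These definitions are simplified assumptions for illustration, not MsTMIP's exact processing.

```python
def seasonal_cycle_amplitude(monthly_flux):
    """SCA of one year of monthly fluxes: max minus min of the 12 values."""
    assert len(monthly_flux) == 12
    return max(monthly_flux) - min(monthly_flux)

def sca_trend_percent_per_year(annual_sca):
    """Trend of a yearly SCA series in % yr^-1 via a least-squares slope."""
    n = len(annual_sca)
    x_mean = (n - 1) / 2
    y_mean = sum(annual_sca) / n
    slope = (sum((i - x_mean) * (y - y_mean) for i, y in enumerate(annual_sca))
             / sum((i - x_mean) ** 2 for i in range(n)))
    return 100.0 * slope / y_mean  # normalize by the mean amplitude

amp = seasonal_cycle_amplitude(
    [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 6.0, 5.0, 4.0, 3.0, 2.0, 1.0])
trend = sca_trend_percent_per_year([10.0, 10.1, 10.2, 10.3])
```

    Applied per model to the cumulative net ecosystem carbon flux, a trend in the reported +0.19 to +0.50 % yr⁻¹ range would indicate the amplification the study describes.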

  10. The Loci Multidisciplinary Simulation System Overview and Status

    NASA Technical Reports Server (NTRS)

    Luke, Edward A.; Tong, Xiao-Ling; Tang, Lin

    2002-01-01

    This paper will discuss the Loci system, an innovative tool for developing tightly coupled multidisciplinary three-dimensional simulations. It will overview some of the unique capabilities of the Loci system to automate the assembly of numerical simulations from libraries of fundamental computational components. We will discuss the demonstration of the Loci system on coupled fluid-structure problems related to RBCC propulsion systems.

  11. A Methodology to Assess UrbanSim Scenarios

    DTIC Science & Technology

    2012-09-01

    Education LOE – Line of Effort MMOG – Massively Multiplayer Online Game MC3 – Maneuver Captain’s Career Course MSCCC – Maneuver Support...augmented reality simulations, increased automation and artificial intelligence simulation, and massively multiplayer online games (MMOG), among...distribution is unlimited 12b. DISTRIBUTION CODE 13. ABSTRACT (maximum 200 words) Turn-based strategy games and simulations are vital tools for military

  12. Increasing the Number of Replications in Item Response Theory Simulations: Automation through SAS and Disk Operating System

    ERIC Educational Resources Information Center

    Gagne, Phill; Furlow, Carolyn; Ross, Terris

    2009-01-01

    In item response theory (IRT) simulation research, it is often necessary to use one software package for data generation and a second software package to conduct the IRT analysis. Because this can substantially slow down the simulation process, it is sometimes offered as a justification for using very few replications. This article provides…

  13. On the application of hybrid meshes in hydraulic machinery CFD simulations

    NASA Astrophysics Data System (ADS)

    Schlipf, M.; Tismer, A.; Riedelbauch, S.

    2016-11-01

    The application of two different hybrid mesh types for the simulation of a Francis runner in automated optimization processes without user input is investigated. Those mesh types are applied to simplified test cases of reduced complexity, such as flow around NACA airfoils and rotating cascade flows as they occur in a turbomachine runner channel, to identify the particular mesh-resolution effects. The analysis includes the application of those different meshes to the geometries while keeping defined quality criteria, and explores the influences on the simulation results. All results are compared with reference values gained from simulations with block-structured hexahedron meshes and the same numerical scheme. This avoids additional inaccuracies caused by further numerical and experimental measurement methods. The results show that a simulation with hybrid meshes built up from a block-structured domain with hexahedrons around the blade, in combination with a tetrahedral far field in the channel, is sufficient to get results almost as accurate as those gained from the reference simulation. Furthermore, this method is robust enough for automated processes without user input and enables comparable meshes in size, distribution and quality for different similar geometries, as occur in optimization processes.

  14. Supporting skill acquisition in cochlear implant surgery through virtual reality simulation.

    PubMed

    Copson, Bridget; Wijewickrema, Sudanthi; Zhou, Yun; Piromchai, Patorn; Briggs, Robert; Bailey, James; Kennedy, Gregor; O'Leary, Stephen

    2017-03-01

    To evaluate the effectiveness of a virtual reality (VR) temporal bone simulator in training for cochlear implant surgery. We compared the performance of 12 otolaryngology registrars conducting simulated cochlear implant surgery before (pre-test) and after (post-tests) receiving training on a VR temporal bone surgery simulator with automated performance feedback. The post-test tasks were two temporal bones: one a mirror image of the temporal bone used as the pre-test, and the other a novel temporal bone. Participant performances were assessed by an otologist with a validated cochlear implant competency assessment tool. Structural damage was derived from an automatically generated simulator metric and compared between time points. A Wilcoxon signed-rank test showed a significant improvement with a large effect size in the total performance scores between the pre-test (PT) and both the first and second post-tests (PT1, PT2) (PT-PT1: P = 0.007, r = 0.78; PT-PT2: P = 0.005, r = 0.82). The results of the study indicate that VR simulation with automated guidance can effectively be used to train surgeons in complex temporal bone surgeries such as cochlear implantation.

  15. Application of multi-factorial design of experiments to successfully optimize immunoassays for robust measurements of therapeutic proteins.

    PubMed

    Ray, Chad A; Patel, Vimal; Shih, Judy; Macaraeg, Chris; Wu, Yuling; Thway, Theingi; Ma, Mark; Lee, Jean W; Desilva, Binodh

    2009-02-20

    Developing a process that generates robust immunoassays that can be used to support studies with tight timelines is a common challenge for bioanalytical laboratories. Design of experiments (DOE) is a tool that has been used by many industries for the purpose of optimizing processes. The approach is capable of identifying critical factors and their interactions with a minimal number of experiments. The challenge in implementing this tool in the bioanalytical laboratory is to develop a user-friendly approach that scientists can understand and apply. We have successfully addressed these challenges by eliminating the screening design, introducing automation, and applying a simple mathematical approach for the output parameter. A modified central composite design (CCD) was applied to three ligand binding assays. The intra-plate factors selected were coating, detection antibody concentration, and streptavidin-HRP concentrations. The inter-plate factors included incubation times for each step. The objective was to maximize the log signal-to-blank ratio (logS/B) of the low standard relative to the blank. The maximum desirable conditions were determined using JMP 7.0. To verify the validity of the predictions, the logS/B prediction was compared against the observed logS/B during pre-study validation experiments. The three assays were optimized using the multi-factorial DOE. The total error for all three methods was less than 20%, which indicated method robustness. DOE identified interactions in one of the methods. The model predictions for logS/B were within 25% of the observed pre-study validation values for all methods tested. The comparison between the CCD and hybrid screening design yielded comparable parameter estimates. The user-friendly design enables effective application of multi-factorial DOE to optimize ligand binding assays for therapeutic proteins. 
The approach allows for identification of interactions between factors, consistency in optimal parameter determination, and reduced method development time.
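
    A minimal sketch of the optimization response described above, assuming the blank signal is averaged across replicate wells; variable names are hypothetical, not the paper's notation.

```python
import math

def log_signal_to_blank(low_standard_signal, blank_signals):
    """logS/B: log10 of the low standard's signal over the mean blank signal."""
    mean_blank = sum(blank_signals) / len(blank_signals)
    return math.log10(low_standard_signal / mean_blank)

# Illustrative optical-density readings: one low standard, three blank wells.
log_sb = log_signal_to_blank(0.45, [0.05, 0.06, 0.04])
```

    In the DOE, this scalar is the response maximized over the intra-plate and inter-plate factors; using the log keeps the response roughly linear across assay conditions.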

  16. Automated Air Traffic Control Operations with Weather and Time-Constraints: A First Look at (Simulated) Far-Term Control Room Operations

    NASA Technical Reports Server (NTRS)

    Prevot, Thomas; Homola, Jeffrey R.; Martin, Lynne H.; Mercer, Joey S.; Cabrall, Christopher C.

    2011-01-01

    In this paper we discuss results from a recent high fidelity simulation of air traffic control operations with automated separation assurance in the presence of weather and time-constraints. We report findings from a human-in-the-loop study conducted in the Airspace Operations Laboratory (AOL) at the NASA Ames Research Center. During four afternoons in early 2010, fifteen active and recently retired air traffic controllers and supervisors controlled high levels of traffic in a highly automated environment during three-hour long scenarios, For each scenario, twelve air traffic controllers operated eight sector positions in two air traffic control areas and were supervised by three front line managers, Controllers worked one-hour shifts, were relieved by other controllers, took a 3D-minute break, and worked another one-hour shift. On average, twice today's traffic density was simulated with more than 2200 aircraft per traffic scenario. The scenarios were designed to create peaks and valleys in traffic density, growing and decaying convective weather areas, and expose controllers to heavy and light metering conditions. This design enabled an initial look at a broad spectrum of workload, challenge, boredom, and fatigue in an otherwise uncharted territory of future operations. In this paper we report human/system integration aspects, safety and efficiency results as well as airspace throughput, workload, and operational acceptability. We conclude that, with further refinements. air traffic control operations with ground-based automated separation assurance can be an effective and acceptable means to routinely provide very high traffic throughput in the en route airspace.

  17. Advances in the simulation and automated measurement of well-sorted granular material: 2. Direct measures of particle properties

    USGS Publications Warehouse

    Buscombe, Daniel D.; Rubin, David M.

    2012-01-01

    In this, the second of a pair of papers on the structure of well-sorted natural granular material (sediment), new methods are described for automated measurements from images of sediment, of: 1) particle-size standard deviation (arithmetic sorting) with and without apparent void fraction; and 2) mean particle size in material with void fraction. A variety of simulations of granular material are used for testing purposes, in addition to images of natural sediment. Simulations are also used to establish that the effects on automated particle sizing of grains visible through the interstices of the grains at the very surface of a granular material continue to a depth of approximately 4 grain diameters and that this is independent of mean particle size. Ensemble root-mean-squared error between observed and estimated arithmetic sorting coefficients for 262 images of natural silts, sands and gravels (drawn from 8 populations) is 31%, which reduces to 27% if adjusted for bias (slope correction between observed and estimated values). These methods allow non-intrusive and fully automated measurements of surfaces of unconsolidated granular material. With no tunable parameters or empirically derived coefficients, they should be broadly universal in appropriate applications. However, empirical corrections may need to be applied for the most accurate results. Finally, analytical formulas are derived for the one-step pore-particle transition probability matrix, estimated from the image's autocorrelogram, from which void fraction of a section of granular material can be estimated directly. This model gives excellent predictions of bulk void fraction yet imperfect predictions of pore-particle transitions.
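    The paper's closing idea, estimating void fraction from a one-step pore-particle transition probability matrix, can be illustrated with a toy two-state Markov chain. The sketch below is not the authors' derivation (which estimates the matrix from the image's autocorrelogram); it simply counts transitions between adjacent pixels of a binarized image and reads the pore fraction off the chain's stationary distribution. All function names are illustrative.

```python
import numpy as np

def pore_particle_transition_matrix(binary_image):
    """Count one-step transitions between pore (0) and particle (1)
    states along each row of a binarized image, then normalize the
    rows of the count matrix into probabilities."""
    counts = np.zeros((2, 2))
    for row in binary_image:
        for a, b in zip(row[:-1], row[1:]):
            counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def void_fraction(P):
    """Bulk void fraction as the stationary probability of the pore
    state: solve pi @ P = pi subject to pi summing to 1."""
    A = np.vstack([P.T - np.eye(2), np.ones(2)])
    b = np.array([0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi[0]
```

    For a strictly alternating pore/particle row the stationary distribution is (0.5, 0.5), i.e. an estimated void fraction of one half, as expected.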

  18. Advances in the simulation and automated measurement of well-sorted granular material: 2. Direct measures of particle properties

    NASA Astrophysics Data System (ADS)

    Buscombe, D.; Rubin, D. M.

    2012-06-01

    In this, the second of a pair of papers on the structure of well-sorted natural granular material (sediment), new methods are described for automated measurements from images of sediment, of: 1) particle-size standard deviation (arithmetic sorting) with and without apparent void fraction; and 2) mean particle size in material with void fraction. A variety of simulations of granular material are used for testing purposes, in addition to images of natural sediment. Simulations are also used to establish that the effects on automated particle sizing of grains visible through the interstices of the grains at the very surface of a granular material continue to a depth of approximately 4 grain diameters and that this is independent of mean particle size. Ensemble root-mean-squared error between observed and estimated arithmetic sorting coefficients for 262 images of natural silts, sands and gravels (drawn from 8 populations) is 31%, which reduces to 27% if adjusted for bias (slope correction between observed and estimated values). These methods allow non-intrusive and fully automated measurements of surfaces of unconsolidated granular material. With no tunable parameters or empirically derived coefficients, they should be broadly universal in appropriate applications. However, empirical corrections may need to be applied for the most accurate results. Finally, analytical formulas are derived for the one-step pore-particle transition probability matrix, estimated from the image's autocorrelogram, from which void fraction of a section of granular material can be estimated directly. This model gives excellent predictions of bulk void fraction yet imperfect predictions of pore-particle transitions.

  19. [Design of Complex Cavity Structure in Air Route System of Automated Peritoneal Dialysis Machine].

    PubMed

    Quan, Xiaoliang

    2017-07-30

    This paper addresses a gap in the design of automated peritoneal dialysis (APD) machines: the structural design of the complex cavities in the air route system. To study the flow characteristics of this special structure, ANSYS CFX software is applied with the k-ε turbulence model as the fluid-mechanics basis. After the complex cavity model is imported into the ANSYS CFX module, numerical simulation yields the internal flow field, showing the flow-field distribution inside the complex cavities and the flow characteristic parameters. These results provide an important design reference for the APD machine.

  20. Managing human error in aviation.

    PubMed

    Helmreich, R L

    1997-05-01

    Crew resource management (CRM) programs were developed to address team and leadership aspects of piloting modern airplanes. The goal is to reduce errors through teamwork. Human factors research and social, cognitive, and organizational psychology are used to develop programs tailored for individual airlines. Flight crews study accident case histories, group dynamics, and human error. Simulators provide pilots with the opportunity to solve complex flight problems. CRM in the simulator is called line-oriented flight training (LOFT). In automated cockpits, CRM promotes the idea of automation as a crew member. Cultural aspects of aviation include professional, business, and national culture. The aviation CRM model has been adapted for training surgeons and operating room staff in human factors.

  1. The Use of Computer Simulation Methods to Reach Data for Economic Analysis of Automated Logistic Systems

    NASA Astrophysics Data System (ADS)

    Neradilová, Hana; Fedorko, Gabriel

    2016-12-01

    Automated logistic systems are becoming more widely used within enterprise logistics processes. Their main advantage is that they increase the efficiency and reliability of logistics processes. In evaluating their effectiveness, it is necessary to take into account the economic aspect of the entire process. However, many users ignore or underestimate this area, which is a mistake. One reason the economic aspect is overlooked is that obtaining information for such an analysis is not easy. The aim of this paper is to present the possibilities of computer simulation methods for obtaining data for a full-scale economic analysis.

  2. Procedural errors in air traffic control: effects of traffic density, expertise, and automation.

    PubMed

    Di Nocera, Francesco; Fabrizi, Roberto; Terenzi, Michela; Ferlazzo, Fabio

    2006-06-01

    Air traffic management requires operators to frequently shift between multiple tasks and/or goals with different levels of accomplishment. Procedural errors can occur when a controller accomplishes one of the tasks before the entire operation has been completed. The present study had two goals: first, to verify the occurrence of post-completion errors in air traffic control (ATC) tasks; and second, to assess the effects on performance of medium-term conflict detection (MTCD) tools. Eighteen military controllers performed a simulated ATC task with and without automation support (MTCD vs. manual) in high and low air traffic density conditions. During the task, which consisted of managing several simulated flights in an en route ATC scenario, a trace suddenly disappeared "after" the operator took the aircraft in charge, "during" the management of the trace, or "before" the pilot's first contact. In the manual condition, only the fault type "during" was found to be significantly different from the other two. By contrast, in the MTCD condition, the fault type "after" generated significantly fewer errors than the fault type "before." Additionally, automation was found to affect the performance of junior controllers, whereas seniors' performance was not affected. Procedural errors can happen in ATC, but automation can mitigate this effect. The lack of benefit for the "before" fault type may be due to the fact that operators extend their reliance to a part of the task that is unsupported by the automated system.

  3. Automation bias: empirical results assessing influencing factors.

    PubMed

    Goddard, Kate; Roudsari, Abdul; Wyatt, Jeremy C

    2014-05-01

    To investigate the rate of automation bias (the propensity of people to over-rely on automated advice) and the factors associated with it. Tested factors were attitudinal (trust and confidence), non-attitudinal (decision support experience and clinical experience), and environmental (task difficulty). The paradigm of simulated decision support advice within a prescribing context was used. The study employed a within-participant before-after design, whereby 26 UK NHS General Practitioners were shown 20 hypothetical prescribing scenarios with prevalidated correct and incorrect answers; advice was incorrect in 6 scenarios. They were asked to prescribe for each case and were then shown simulated advice. Participants were then asked whether they wished to change their prescription, and the post-advice prescription was recorded. The rate of overall decision switching was captured. Automation bias was measured by negative consultations: correct-to-incorrect prescription switching. Participants changed prescriptions in 22.5% of scenarios. The pre-advice accuracy rate of the clinicians was 50.38%, which improved to 58.27% post-advice. The CDSS improved decision accuracy in 13.1% of prescribing cases. The rate of automation bias, as measured by decision switches from correct pre-advice to incorrect post-advice, was 5.2% of all cases, a net improvement of 8%. More immediate factors such as trust in the specific CDSS, decision confidence, and task difficulty influenced the rate of decision switching. Lower clinical experience was associated with more decision switching. Age, DSS experience, and trust in CDSS generally were not significantly associated with decision switching. This study adds to the literature surrounding automation bias in terms of its potential frequency and influencing factors. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  4. Repeated Induction of Inattentional Blindness in a Simulated Aviation Environment

    NASA Technical Reports Server (NTRS)

    Kennedy, Kellie D.; Stephens, Chad L.; Williams, Ralph A.; Schutte, Paul C.

    2017-01-01

    The study reported herein is a subset of a larger investigation of the role of automation on the flight deck and used a fixed-base, human-in-the-loop simulator. This paper explored the relationship between automation and inattentional blindness (IB) occurrences in a repeated-induction paradigm using two types of runway incursions. The critical stimuli for both runway incursions were directly relevant to primary task performance. Sixty non-pilot participants performed the final five minutes of a landing scenario twice in one of three automation conditions: full automation (FA), partial automation (PA), and no automation (NA). The first induction resulted in a 70 percent (42 of 60) detection failure rate, with those in the PA condition significantly more likely to detect the incursion than those in the FA or NA conditions. The second induction yielded a 50 percent detection failure rate. Although detection improved (detection failure rates declined) in all conditions, those in the FA condition demonstrated the greatest improvement, with doubled detection rates. Detection behavior in the first trial did not preclude a failed detection in the second induction. Participants in the FA condition showed greater improvement than those in the NA condition and rated the Mental Demand and Effort subscales of the NASA-TLX (NASA Task Load Index) significantly higher at Time 2 compared to Time 1. Participants in the FA condition used the experience of IB exposure to improve task performance whereas those in the NA condition did not, indicating the availability and reallocation of attentional resources in the FA condition. These findings support the role of engagement in operational attention detriment and the consideration of attentional failure causation to determine appropriate mitigation strategies.

  5. TEAM (Technologies Enabling Agile Manufacturing) shop floor control requirements guide: Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-03-28

    TEAM will create a shop floor control system (SFC) to link pre-production planning to shop floor execution. SFC must meet the requirements of a multi-facility corporation, where control must be maintained between co-located facilities down to individual workstations within each facility. SFC must also meet the requirements of a small corporation, where there may be only one small facility. A hierarchical architecture is required to meet these diverse needs. The hierarchy contains the following levels: Enterprise, Factory, Cell, Station, and Equipment. SFC is focused on the top three levels. Each level of the hierarchy is divided into three basic functions: Scheduler, Dispatcher, and Monitor. The requirements of each function depend on the hierarchical level in which it is to be used. For example, the scheduler at the Enterprise level must allocate production to individual factories and assign due-dates; the scheduler at the Cell level must provide detailed start and stop times of individual operations. Finally, the system shall be distributed and have an open architecture. Open-architecture software is required so that the appropriate technology can be used at each level of the SFC hierarchy, and even at different instances within the same hierarchical level (for example, Factory A uses discrete-event simulation scheduling software, and Factory B uses an optimization-based scheduler). A distributed implementation is required to reduce the computational burden of the overall system and to allow for localized control. A distributed, open-architecture implementation will also require standards for communication between hierarchical levels.
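    The five-level hierarchy and its three per-level functions can be pictured with a small data structure. This is a hypothetical sketch of the guide's concepts, not code from TEAM; the level names follow the text, and the pluggable scheduler callables stand in for the open-architecture requirement (e.g. Factory A's discrete-event scheduler vs. Factory B's optimization-based one).

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional

# Hierarchy levels as named in the requirements guide
LEVELS = ["Enterprise", "Factory", "Cell", "Station", "Equipment"]

@dataclass
class ControlNode:
    """One node of the shop-floor control hierarchy (illustrative).

    Each node carries its own pluggable Scheduler/Dispatcher/Monitor,
    so different factories may plug in different technology."""
    name: str
    level: str
    scheduler: Optional[Callable] = None
    dispatcher: Optional[Callable] = None
    monitor: Optional[Callable] = None
    children: List["ControlNode"] = field(default_factory=list)

    def add(self, child: "ControlNode") -> "ControlNode":
        # A child must sit exactly one level below its parent
        assert LEVELS.index(child.level) == LEVELS.index(self.level) + 1
        self.children.append(child)
        return child

# Two factories with different (toy) scheduling strategies
enterprise = ControlNode("Enterprise HQ", "Enterprise")
factory_a = enterprise.add(
    ControlNode("Factory A", "Factory", scheduler=lambda jobs: sorted(jobs)))
factory_b = enterprise.add(
    ControlNode("Factory B", "Factory", scheduler=lambda jobs: list(reversed(jobs))))
```

    The per-node callables are the design point: swapping Factory B's scheduler requires no change to the Enterprise level, which is what the open-architecture requirement asks for.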

  6. Extended System Operations Studies for Automated Guideway Transit Systems : Plan for Task 5--DPM Failure Management

    DOT National Transportation Integrated Search

    1981-06-01

    The purpose of Task 5 in the Extended System Operations Studies Project, DPM Failure Management, is to enhance the capabilities of the Downtown People Mover Simulation (DPMS) and the Discrete Event Simulation Model (DESM) by increasing the failure mo...

  7. Response Surface Modeling (RSM) Tool Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, Andrew; Lawrence, Earl

    The Response Surface Modeling (RSM) Tool Suite is a collection of three codes used to generate an empirical interpolation function for a collection of drag coefficient calculations computed with Test Particle Monte Carlo (TPMC) simulations. The first code, "Automated RSM", automates the generation of a drag coefficient RSM for a particular object to a single command. "Automated RSM" first creates a Latin Hypercube Sample (LHS) of 1,000 ensemble members to explore the global parameter space. For each ensemble member, a TPMC simulation is performed and the object drag coefficient is computed. In the next step of the "Automated RSM" code, a Gaussian process is used to fit the TPMC simulations. In the final step, Markov Chain Monte Carlo (MCMC) is used to evaluate the non-analytic probability distribution function from the Gaussian process. The second code, "RSM Area", creates a look-up table for the projected area of the object based on input limits on the minimum and maximum allowed pitch and yaw angles and pitch and yaw angle intervals. The projected area from the look-up table is used to compute the ballistic coefficient of the object based on its pitch and yaw angle. An accurate ballistic coefficient is crucial in accurately computing the drag on an object. The third code, "RSM Cd", uses the RSM generated by the "Automated RSM" code and the projected area look-up table generated by the "RSM Area" code to accurately compute the drag coefficient and ballistic coefficient of the object. The user can modify the object velocity, object surface temperature, the translational temperature of the gas, the species concentrations of the gas, and the pitch and yaw angles of the object. Together, these codes allow for the accurate derivation of an object's drag coefficient and ballistic coefficient under any conditions with only knowledge of the object's geometry and mass.
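    The first step the abstract describes, a 1,000-member Latin Hypercube Sample exploring the parameter space, can be sketched in a few lines. This is a generic LHS routine, not the "Automated RSM" code itself; the parameter names and bounds in the example (object speed, surface temperature) are assumptions for illustration.

```python
import numpy as np

def latin_hypercube(n_samples, bounds, seed=None):
    """Latin Hypercube Sample: each dimension is cut into n_samples
    equal strata, one point is drawn per stratum, and strata are
    paired across dimensions at random."""
    rng = np.random.default_rng(seed)
    d = len(bounds)
    # Jittered stratum positions in [0, 1): row i lies in stratum i
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, d))) / n_samples
    # Shuffle each column independently to randomize the pairing
    for j in range(d):
        u[:, j] = rng.permutation(u[:, j])
    lows = np.array([lo for lo, hi in bounds])
    highs = np.array([hi for lo, hi in bounds])
    return lows + u * (highs - lows)

# Assumed illustrative bounds: object speed (m/s), surface temperature (K)
samples = latin_hypercube(1000, [(7000.0, 8000.0), (300.0, 2000.0)], seed=0)
```

    By construction, each of the 1,000 strata in each dimension contains exactly one sample, which is what makes LHS more space-filling than plain random sampling at the same budget.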

  8. Automation for Accommodating Fuel-Efficient Descents in Constrained Airspace

    NASA Technical Reports Server (NTRS)

    Coppenbarger, Richard A.

    2010-01-01

    Continuous descents at low engine power are desired to reduce fuel consumption, emissions, and noise during arrival operations. The challenge is to allow airplanes to fly these types of efficient descents without interruption during busy traffic conditions. During busy conditions today, airplanes are commonly forced to fly inefficient, step-down descents as air traffic controllers work to ensure separation and maximize throughput. NASA, in collaboration with government and industry partners, is developing new automation to help controllers accommodate continuous descents in the presence of complex traffic and airspace constraints. This automation relies on accurate trajectory predictions to compute strategic maneuver advisories. The talk will describe the concept behind this new automation and provide an overview of the simulations and flight testing used to develop and refine its underlying technology.

  9. Preface to the special section on human factors and automation in vehicles: designing highly automated vehicles with the driver in mind.

    PubMed

    Merat, Natasha; Lee, John D

    2012-10-01

    This special section brings together diverse research regarding driver interaction with advanced automotive technology to guide design of increasingly automated vehicles. Rapidly evolving vehicle automation will likely change cars and trucks more in the next 5 years than the preceding 50, radically redefining what it means to drive. This special section includes 10 articles from European and North American researchers reporting simulator and naturalistic driving studies. Little research has considered the consequences of fully automated driving, with most focusing on lane-keeping and speed control systems individually. The studies reveal two underlying design philosophies: automate driving versus support driving. Results of several studies, consistent with previous research in other domains, suggest that the automate philosophy can delay driver responses to incidents in which the driver has to intervene and take control from the automation. Understanding how to orchestrate the transfer or sharing of control between the system and the driver, particularly in critical incidents, emerges as a central challenge. Designers should not assume that automation can substitute seamlessly for a human driver, nor can they assume that the driver can safely accommodate the limitations of automation. Designers, policy makers, and researchers must give careful consideration to what role the person should have in highly automated vehicles and how to support the driver if the driver is to be responsible for vehicle control. As in other domains, driving safety increasingly depends on the combined performance of the human and automation, and successful designs will depend on recognizing and supporting the new roles of the driver.

  10. Neutrino Factory Targets and the MICE Beam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walaron, Kenneth Andrew

    2007-01-01

    The future of particle physics in the next 30 years must include detailed study of neutrinos. The first proof of physics beyond the Standard Model of particle physics is evident in results from recent neutrino experiments which imply that neutrinos have mass and flavour mixing. The Neutrino Factory is the leading contender to measure precisely the neutrino mixing parameters to probe beyond-the-Standard-Model physics. Significantly, one must look to measure the mixing angle θ13 and investigate the possibility of leptonic CP violation. If found, this may provide a key insight into the origins of the matter/anti-matter asymmetry seen in the universe, through the mechanism of leptogenesis. The Neutrino Factory will be a large international multi-billion-dollar experiment combining novel new accelerator and long-baseline detector technology. Arguably the most important and costly features of this facility are the proton driver and cooling channel. This thesis will present simulation work focused on determining the optimal proton driver energy to maximise pion production and also simulation of the transport of this pion flux through some candidate transport lattices. Benchmarking of pion cross-sections calculated by MARS and GEANT4 codes against measured data from the HARP experiment is also presented. The cooling channel aims to reduce the phase-space volume of the decayed muon beam to a level that can be efficiently injected into the accelerator system. The Muon Ionisation Cooling Experiment (MICE), hosted by the Rutherford Appleton Laboratory, UK, is a proof-of-principle experiment aimed at measuring ionisation cooling. The experiment will run parasitically to the ISIS accelerator and will produce muons from pion decay. The MICE beamline provides muon beams of variable emittance and momentum to the MICE experiment to enable measurement of cooling over a wide range of beam conditions. Simulation work in the design of this beamline is presented in this thesis, as are results from an experiment to estimate the flux from the target into the beamline acceptance.

  11. Platform for real-time simulation of dynamic systems and hardware-in-the-loop for control algorithms.

    PubMed

    de Souza, Isaac D T; Silva, Sergio N; Teles, Rafael M; Fernandes, Marcelo A C

    2014-10-15

    The development of new embedded algorithms for automation and control of industrial equipment usually requires the use of real-time testing. However, the equipment required is often expensive, which means that such tests are often not viable. The objective of this work was therefore to develop an embedded platform for the distributed real-time simulation of dynamic systems. This platform, called the Real-Time Simulator for Dynamic Systems (RTSDS), could be applied in both industrial and academic environments. In industrial applications, the RTSDS could be used to optimize embedded control algorithms. In the academic sphere, it could be used to support research into new embedded solutions for automation and control and could also be used as a tool to assist in undergraduate and postgraduate teaching related to the development of projects concerning on-board control systems.
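    The core of any such real-time simulator is a fixed-step loop that advances the model and then sleeps until the next frame deadline. The sketch below is a generic soft real-time loop, not the RTSDS implementation; the toy first-order plant and all names are illustrative.

```python
import time

def run_realtime(step_fn, dt=0.01, steps=50):
    """Fixed-step soft real-time loop: advance the model once per
    frame, then sleep until the next deadline.  Returns the number
    of frames whose computation overran its dt budget."""
    start = time.perf_counter()
    overruns = 0
    for k in range(steps):
        step_fn(k * dt)
        remaining = start + (k + 1) * dt - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)   # wait out the rest of the frame
        else:
            overruns += 1           # the step took longer than dt
    return overruns

# Toy plant: a first-order lag driven toward a setpoint of 1.0
state = {"x": 0.0}
def plant(t, dt=0.01, tau=0.1, setpoint=1.0):
    state["x"] += dt * (setpoint - state["x"]) / tau

overruns = run_realtime(plant, dt=0.01, steps=50)  # ~0.5 s of wall time
```

    Keying each deadline off the loop's start time rather than the previous frame prevents timing error from accumulating; a hardware-in-the-loop platform would additionally exchange signals with the device under test inside each frame.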

  12. Platform for Real-Time Simulation of Dynamic Systems and Hardware-in-the-Loop for Control Algorithms

    PubMed Central

    de Souza, Isaac D. T.; Silva, Sergio N.; Teles, Rafael M.; Fernandes, Marcelo A. C.

    2014-01-01

    The development of new embedded algorithms for automation and control of industrial equipment usually requires the use of real-time testing. However, the equipment required is often expensive, which means that such tests are often not viable. The objective of this work was therefore to develop an embedded platform for the distributed real-time simulation of dynamic systems. This platform, called the Real-Time Simulator for Dynamic Systems (RTSDS), could be applied in both industrial and academic environments. In industrial applications, the RTSDS could be used to optimize embedded control algorithms. In the academic sphere, it could be used to support research into new embedded solutions for automation and control and could also be used as a tool to assist in undergraduate and postgraduate teaching related to the development of projects concerning on-board control systems. PMID:25320906

  13. Qgui: A high-throughput interface for automated setup and analysis of free energy calculations and empirical valence bond simulations in biological systems.

    PubMed

    Isaksen, Geir Villy; Andberg, Tor Arne Heim; Åqvist, Johan; Brandsdal, Bjørn Olav

    2015-07-01

    Structural information and activity data have increased rapidly for many protein targets during the last decades. In this paper, we present a high-throughput interface (Qgui) for automated free energy and empirical valence bond (EVB) calculations that use molecular dynamics (MD) simulations for conformational sampling. Applications to ligand binding using both the linear interaction energy (LIE) method and the free energy perturbation (FEP) technique are given using the estrogen receptor (ERα) as a model system. Examples of free energy profiles obtained using the EVB method for the rate-limiting step of the enzymatic reaction catalyzed by trypsin are also shown. In addition, we present calculation of high-precision Arrhenius plots to obtain the thermodynamic activation enthalpy and entropy with Qgui from running a large number of EVB simulations. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Automated planning of tangential breast intensity-modulated radiotherapy using heuristic optimization.

    PubMed

    Purdie, Thomas G; Dinniwell, Robert E; Letourneau, Daniel; Hill, Christine; Sharpe, Michael B

    2011-10-01

    To present an automated technique for two-field tangential breast intensity-modulated radiotherapy (IMRT) treatment planning. A total of 158 patients with Stage 0, I, and II breast cancer treated using whole-breast IMRT were retrospectively replanned using automated treatment planning tools. The tools developed are integrated into the existing clinical treatment planning system (Pinnacle3) and are designed to perform the manual volume delineation, beam placement, and IMRT treatment planning steps carried out by the treatment planning radiation therapist. The automated algorithm, using only the radio-opaque markers placed at CT simulation as inputs, optimizes the tangential beam parameters to geometrically minimize the amount of lung and heart treated while covering the whole-breast volume. The IMRT parameters are optimized according to the automatically delineated whole-breast volume. The mean time to generate a complete treatment plan was 6 min, 50 s ± 1 min 12 s. For the automated plans, 157 of 158 plans (99%) were deemed clinically acceptable, and 138 of 158 plans (87%) were deemed clinically improved or equal to the corresponding clinical plan when reviewed in a randomized, double-blinded study by one experienced breast radiation oncologist. In addition, overall the automated plans were dosimetrically equivalent to the clinical plans when scored for target coverage and lung and heart doses. We have developed robust and efficient automated tools for fully inverse-planned tangential breast IMRT that can be readily integrated into clinical practice. The tools produce clinically acceptable plans using only the common anatomic landmarks from the CT simulation process as input. We anticipate the tools will improve patient access to high-quality IMRT treatment by simplifying the planning process and will reduce the effort and cost of incorporating more advanced planning into clinical practice. Crown Copyright © 2011. Published by Elsevier Inc. All rights reserved.

  15. Evaluation of Trajectory Errors in an Automated Terminal-Area Environment

    NASA Technical Reports Server (NTRS)

    Oseguera-Lohr, Rosa M.; Williams, David H.

    2003-01-01

    A piloted simulation experiment was conducted to document the trajectory errors associated with use of an airplane's Flight Management System (FMS) in conjunction with a ground-based ATC automation system, the Center/TRACON Automation System (CTAS), in the terminal area. Three different arrival procedures were compared: current-day (vectors from ATC), modified (current-day with minor updates), and data link with FMS lateral navigation. Six active airline pilots flew simulated arrivals in a fixed-base simulator. The FMS-datalink procedure resulted in the smallest time and path-distance errors, indicating that use of this procedure could reduce the CTAS arrival-time prediction error by about half relative to the current-day procedure. Significant sources of error contributing to the arrival-time error were cross-track errors and early speed reduction in the last 2-4 miles before the final approach fix. Pilot comments were all very positive, indicating the FMS-datalink procedure was easy to understand and use, and that the increased head-down time and workload did not detract from the benefit. Issues that need to be resolved before this method of operation would be ready for commercial use include development of procedures acceptable to controllers, better speed-conformance monitoring, and FMS database procedures to support the approach transitions.

  16. ICEG2D (v2.0) - An Integrated Software Package for Automated Prediction of Flow Fields for Single-Element Airfoils With Ice Accretion

    NASA Technical Reports Server (NTRS)

    Thompson, David S.; Soni, Bharat K.

    2001-01-01

    An integrated geometry/grid/simulation software package, ICEG2D, is being developed to automate computational fluid dynamics (CFD) simulations for single- and multi-element airfoils with ice accretions. The current version, ICEG2D (v2.0), was designed to automatically perform four primary functions: (1) generate a grid-ready surface definition based on the geometrical characteristics of the iced airfoil surface, (2) generate high-quality structured and generalized grids starting from a defined surface definition, (3) generate the input and restart files needed to run the structured-grid CFD solver NPARC or the generalized-grid CFD solver HYBFL2D, and (4) using the flow solutions, generate solution-adaptive grids. ICEG2D (v2.0) can be operated either in batch mode using a script file or in an interactive mode by entering directives from a command line within a Unix shell. This report summarizes activities completed in the first two years of a three-year research and development program to address automation issues related to CFD simulations for airfoils with ice accretions. As well as describing the technology employed in the software, this document serves as a user's manual providing installation and operating instructions. An evaluation of the software is also presented.

  17. Training driving ability in a traumatic brain-injured individual using a driving simulator: a case report.

    PubMed

    Imhoff, Sarah; Lavallière, Martin; Germain-Robitaille, Mathieu; Teasdale, Normand; Fait, Philippe

    2017-01-01

    Traumatic brain injury (TBI) causes functional deficits that may significantly interfere with numerous activities of daily living, such as driving. We report the case of a 20-year-old woman who lost her driver's license after sustaining a moderate TBI. We aimed to evaluate the effectiveness of an in-simulator training program with automated feedback on driving performance in an individual with TBI. The participant underwent an initial and a final in-simulator driving assessment and 11 in-simulator training sessions with driving-specific automated feedback. Driving performance (simulation duration, speed regulation, and lateral positioning) was measured in the driving simulator. Speeding duration decreased during training sessions from 1.50 ± 0.80 min (4.16 ± 2.22%) to 0.45 ± 0.15 min (0.44 ± 0.42%) but returned to the initial duration after removal of feedback for the final assessment. Proper lateral positioning improved with training and was maintained at the final assessment. Time spent in an incorrect lateral position decreased from 18.85 min (53.61%) in the initial assessment to 1.51 min (4.64%) in the final assessment. Driving simulators represent an interesting therapeutic avenue. Considerable research efforts are needed to confirm the effectiveness of this method for driving rehabilitation of individuals who have sustained a TBI.

  18. Piloted simulation of a ground-based time-control concept for air traffic control

    NASA Technical Reports Server (NTRS)

    Davis, Thomas J.; Green, Steven M.

    1989-01-01

    A concept for aiding air traffic controllers in efficiently spacing traffic and meeting scheduled arrival times at a metering fix was developed and tested in a real-time simulation. The automation aid, referred to as the ground-based 4-D descent advisor (DA), is based on accurate models of aircraft performance and weather conditions. The DA generates suggested clearances, including both top-of-descent-point and speed-profile data, for one or more aircraft in order to achieve specific time or distance separation objectives. The DA algorithm is used by the air traffic controller to resolve conflicts and issue advisories to arrival aircraft. A joint simulation was conducted using a piloted simulator and an advanced-concept air traffic control simulation to study the acceptability and accuracy of the DA automation aid from both the pilot's and the air traffic controller's perspectives. The results of the piloted simulation are examined. In the piloted simulation, airline crews executed controller-issued descent advisories along standard curved-path arrival routes and were able to achieve an arrival time precision of ±20 sec at the metering fix. An analysis of errors generated in turns resulted in further enhancements of the algorithm to improve its predictive accuracy. Evaluations by pilots indicate general support for the concept and provide specific recommendations for improvement.

  19. Scenario management and automated scenario generation

    NASA Astrophysics Data System (ADS)

    McKeever, William; Gilmour, Duane; Lehman, Lynn; Stirtzinger, Anthony; Krause, Lee

    2006-05-01

    The military planning process utilizes simulation to determine the appropriate course of action (COA) that will achieve a campaign end state. However, due to the difficulty of developing and generating simulation-level COAs, only a few COAs are simulated. This may have been appropriate for traditional conflicts, but the evolution of warfare from attrition-based to effects-based strategies, as well as the complexities of 4th-generation warfare and asymmetric adversaries, has placed additional demands on military planners and simulation. To keep pace with this dynamic, changing environment, planners must be able to perform continuous, multiple, "what-if" COA analysis. Scenario management and generation are critical elements to achieving this goal. An effects-based scenario generation research project demonstrated the feasibility of automated scenario generation techniques that support multiple stove-pipe and emerging broad-scope simulations. This paper discusses a case study in which the scenario generation capability was employed to support COA simulations to identify plan effectiveness. The study demonstrated the value of using multiple simulation runs to evaluate the effectiveness of alternate COAs in achieving the overall campaign (metrics-based) objectives. The paper also discusses how scenario generation technology can be employed to allow military commanders and mission planning staff to understand the impact of command decisions on the battlespace of tomorrow.

  20. Automatic optimization of well locations in a North Sea fractured chalk reservoir using a front tracking reservoir simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rian, D.T.; Hage, A.

    1994-12-31

    A numerical simulator is often used as a reservoir management tool. One of its main purposes is to aid in the evaluation of the number of wells, well locations and start times for wells. Traditionally, the optimization of a field development is done by a manual trial-and-error process. In this paper, an example of an automated technique is given. The core of the automation process is the reservoir simulator Frontline. Frontline is based on front tracking techniques, which makes it fast and accurate compared to traditional finite difference simulators. Due to its CPU efficiency, the simulator has been coupled with an optimization module, which enables automatic optimization of the location of wells, the number of wells and start-up times. The simulator was used as an alternative method in the evaluation of waterflooding in a North Sea fractured chalk reservoir. Since Frontline, in principle, is 2D, Buckley-Leverett pseudo functions were used to represent the third dimension. The areal full-field simulation model was run with up to 25 wells for 20 years in less than one minute of Vax 9000 CPU time. The automatic Frontline evaluation indicated that a peripheral waterflood could double incremental recovery compared to a central pattern drive.
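    The automated well-placement idea above amounts to a search loop wrapped around a fast simulator. In this minimal sketch, `simulate_recovery` is a toy surrogate standing in for Frontline (whose internals are not reproduced here), and an exhaustive search over candidate locations stands in for the coupled optimization module.

    ```python
    from itertools import combinations

    def simulate_recovery(wells):
        """Toy surrogate: recovery grows with well spread, with diminishing returns."""
        if len(wells) < 2:
            return 0.0
        spread = sum(abs(a[0] - b[0]) + abs(a[1] - b[1])
                     for a, b in combinations(wells, 2))
        return spread / (1.0 + 0.1 * spread)

    def optimize_wells(candidates, n_wells):
        """Try every combination of candidate locations and keep the best one."""
        best, best_recovery = None, float("-inf")
        for wells in combinations(candidates, n_wells):
            r = simulate_recovery(wells)
            if r > best_recovery:
                best, best_recovery = wells, r
        return best, best_recovery

    # Candidate drilling sites on a coarse 3x3 areal grid.
    grid = [(x, y) for x in range(3) for y in range(3)]
    best, recovery = optimize_wells(grid, 2)
    ```

    A front-tracking simulator's speed is what makes this brute-force loop affordable; with a conventional finite-difference simulator, each candidate evaluation would be far too expensive.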

  1. A Controller-in-the Loop Simulation of Ground-Based Automated Separation Assurance in a NextGen Environment

    NASA Technical Reports Server (NTRS)

    Homola, J.; Prevot, Thomas; Mercer, Joey S.; Brasil, Connie L.; Martin, Lynne Hazel; Cabrall, C.

    2010-01-01

    A controller-in-the-loop simulation was conducted in the Airspace Operations Laboratory (AOL) at the NASA Ames Research Center to investigate the functional allocation aspects associated with ground-based automated separation assurance in a far-term NextGen environment. In this concept, ground-based automation handled the detection and resolution of strategic and tactical conflicts and alerted the controller to deferred situations. The controller was responsible for monitoring the automation and managing situations by exception. This was done in conditions both with and without arrival time constraints across two levels of traffic density. Results showed that although workload increased with an increase in traffic density, it was still manageable in most situations. The number of conflicts increased similarly, with a related increase in the issuance of resolution clearances. Although over 99% of conflicts were resolved, operational errors did occur but were tied to local sector complexities. Feedback from the participants revealed that they thought they maintained reasonable situation awareness in this environment, felt that operations were highly acceptable at the lower traffic density level but less so as density increased, and felt overall that the concept as introduced here was a positive step toward accommodating the more complex environment envisioned as part of NextGen.
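    The conflict-detection side of ground-based separation assurance can be sketched, under strong simplifying assumptions, as a pairwise closest-point-of-approach test over straight-line trajectory predictions. This is my own illustration, not the AOL implementation.

    ```python
    def closest_approach(p1, v1, p2, v2, horizon):
        """Return (t*, distance) at the closest approach of two aircraft,
        assuming constant-velocity 2-D motion, clamped to [0, horizon]."""
        dx, dy = p2[0] - p1[0], p2[1] - p1[1]
        dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]
        dv2 = dvx * dvx + dvy * dvy
        t = 0.0 if dv2 == 0 else max(0.0, min(horizon, -(dx * dvx + dy * dvy) / dv2))
        cx, cy = dx + dvx * t, dy + dvy * t
        return t, (cx * cx + cy * cy) ** 0.5

    def detect_conflicts(aircraft, separation, horizon):
        """Flag every pair predicted to close within `separation` (position units)."""
        conflicts = []
        for i in range(len(aircraft)):
            for j in range(i + 1, len(aircraft)):
                (p1, v1), (p2, v2) = aircraft[i], aircraft[j]
                t, d = closest_approach(p1, v1, p2, v2, horizon)
                if d < separation:
                    conflicts.append((i, j, t, d))
        return conflicts

    # Two head-on aircraft plus one distant, stationary one (invented data).
    traffic = [((0.0, 0.0), (1.0, 0.0)),
               ((10.0, 0.0), (-1.0, 0.0)),
               ((0.0, 100.0), (0.0, 0.0))]
    conflicts = detect_conflicts(traffic, separation=5.0, horizon=10.0)
    ```

    A real strategic probe would work on 4-D trajectories with uncertainty bounds; the clamped straight-line test above is only the kernel of the idea.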

  2. Comparison of simulator fidelity model predictions with in-simulator evaluation data

    NASA Technical Reports Server (NTRS)

    Parrish, R. V.; Mckissick, B. T.; Ashworth, B. R.

    1983-01-01

    A full-factorial in-simulator experiment on a single-axis, multiloop, compensatory pitch-tracking task is described. The experiment was conducted to provide data to validate extensions to an analytic, closed-loop model of a real-time digital simulation facility. The results of the experiment, encompassing various simulation fidelity factors such as visual delay, digital integration algorithms, computer iteration rates, control loading bandwidths and proprioceptive cues, and g-seat kinesthetic cues, are compared with predictions obtained from the analytic model incorporating an optimal control model of the human pilot. The in-simulator results demonstrate more sensitivity to the g-seat and to the control loader conditions than was predicted by the model. However, the model predictions are generally upheld, although the predicted magnitudes of the states and of the error terms are sometimes off considerably. Of particular concern is the large sensitivity difference for one control loader condition, as well as the model/in-simulator mismatch in the magnitude of the plant states when the other states match.

  3. Validation of a digital mammographic unit model for an objective and highly automated clinical image quality assessment.

    PubMed

    Perez-Ponce, Hector; Daul, Christian; Wolf, Didier; Noel, Alain

    2013-08-01

    In mammography, image quality assessment has to be directly related to the detectability of breast cancer indicators (e.g. microcalcifications). Recently, we proposed an X-ray source/digital detector (XRS/DD) model leading to such an assessment. This model simulates very realistic contrast-detail phantom (CDMAM) images, leading to gold disc (representing microcalcifications) detectability thresholds that are very close to those of real images taken under the simulated acquisition conditions. The detection step was performed with a mathematical observer. The aim of this contribution is to include human observers in the disc detection process in real and virtual images to validate the simulation framework based on the XRS/DD model. Mathematical criteria (contrast-detail curves, image quality factor, etc.) are used to assess and to compare, from the statistical point of view, the cancer indicator detectability in real and virtual images. The quantitative results given in this paper show that the images simulated by the XRS/DD model are useful for image quality assessment under all studied exposure conditions, using either human or automated scoring. This paper also confirms that with the XRS/DD model the image quality assessment can be automated and the overall duration of the procedure can be drastically reduced: compared to standard quality assessment methods, the number of images to be acquired is divided by a factor of eight.

  4. Carryover effects of highly automated convoy driving on subsequent manual driving performance.

    PubMed

    Skottke, Eva-Maria; Debus, Günter; Wang, Lei; Huestegge, Lynn

    2014-11-01

    In the present study, we tested to what extent highly automated convoy driving involving small spacing ("platooning") may affect time headway (THW) and the standard deviation of lateral position (SDLP) during subsequent manual driving. Although many previous studies have reported beneficial effects of automated driving, some research has also highlighted potential drawbacks, such as increased speed and reduced THW during the activation of semiautomated driving systems. Here, we focused instead on the question of whether switching from automated to manual driving may produce unwanted carryover effects on safety-relevant driving performance. We utilized a pre-post simulator design to measure THW and SDLP after highly automated driving and compared the data with those of a control group (manual driving throughout). Our data revealed that THW was reduced and SDLP increased after leaving the automation mode. A closer inspection of the data suggested that the effect on THW in particular is likely due to sensory and/or cognitive adaptation processes. Behavioral adaptation effects need to be taken into account in future implementations of automated convoy systems. Potential application areas of this research comprise automated freight traffic (truck convoys) and the design of driver assistance systems in general. Potential countermeasures against short-distance following as a behavioral adaptation should be considered.
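    The two dependent measures above are simple statistics over logged simulator samples: THW is the bumper-to-bumper gap divided by own speed, and SDLP is the standard deviation of lateral position. A minimal sketch with hypothetical numbers (not the study's data):

    ```python
    from statistics import pstdev

    def time_headway(gap_m, speed_mps):
        """Time headway in seconds; undefined (infinite) when stationary."""
        return float("inf") if speed_mps == 0 else gap_m / speed_mps

    def sdlp(lateral_positions_m):
        """Standard deviation of lateral position over a drive segment."""
        return pstdev(lateral_positions_m)

    thw = time_headway(25.0, 25.0)                 # 25 m gap at 25 m/s -> 1.0 s
    weave = sdlp([0.0, 0.2, -0.2, 0.1, -0.1])      # hypothetical lane samples
    ```

    Lower THW after automation indicates closer following; higher SDLP indicates more lane weaving, which is why both are treated as safety-relevant.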

  5. Simulation of a Schema Theory-Based Knowledge Delivery System for Scientists.

    ERIC Educational Resources Information Center

    Vaughan, W. S., Jr.; Mavor, Anne S.

    A future, automated, interactive, knowledge delivery system for use by researchers was tested using a manual cognitive model. Conceptualized from schema/frame/script theories in cognitive psychology and artificial intelligence, this hypothetical system was simulated by two psychologists who interacted with four researchers in microbiology to…

  6. Producing gallium arsenide crystals in space

    NASA Technical Reports Server (NTRS)

    Randolph, R. L.

    1984-01-01

    The production of high quality crystals in space is a promising near-term application of microgravity processing. Gallium arsenide is the selected material for initial commercial production because of its inherent superior electronic properties, wide range of market applications, and broad base of on-going device development effort. Plausible product prices can absorb the high cost of space transportation for the initial flights provided by the Space Transportation System. The next step for bulk crystal growth, beyond the STS, is planned to come later with the use of free flyers or a space station, where real benefits are foreseen. The use of these vehicles, together with refinement and increasing automation of space-based crystal growth factories, will bring down costs and will support growing demands for high quality GaAs and other specialty electronic and electro-optical crystals grown in space.

  7. Spectral imaging applications: Remote sensing, environmental monitoring, medicine, military operations, factory automation and manufacturing

    NASA Technical Reports Server (NTRS)

    Gat, N.; Subramanian, S.; Barhen, J.; Toomarian, N.

    1996-01-01

    This paper reviews the activities at OKSI related to imaging spectroscopy presenting current and future applications of the technology. The authors discuss the development of several systems including hardware, signal processing, data classification algorithms and benchmarking techniques to determine algorithm performance. Signal processing for each application is tailored by incorporating the phenomenology appropriate to the process, into the algorithms. Pixel signatures are classified using techniques such as principal component analyses, generalized eigenvalue analysis and novel very fast neural network methods. The major hyperspectral imaging systems developed at OKSI include the Intelligent Missile Seeker (IMS) demonstration project for real-time target/decoy discrimination, and the Thermal InfraRed Imaging Spectrometer (TIRIS) for detection and tracking of toxic plumes and gases. In addition, systems for applications in medical photodiagnosis, manufacturing technology, and for crop monitoring are also under development.

  8. The CADSS design automation system. [computerized design language for small digital systems

    NASA Technical Reports Server (NTRS)

    Franke, E. A.

    1973-01-01

    This research was designed to implement and extend a previously defined design automation system for the design of small digital structures. A description is included of the higher-level language developed to describe systems as a sequence of register transfer operations. The system simulator, which is used to determine whether the original description is correct, is also discussed. The design automation system produces tables describing the state transitions of the system and the operation of all registers. In addition, all Boolean equations specifying system operation are minimized and converted to NAND gate structures. Suggestions for further extensions to the system are also given.

  9. Pharmacy students' retention of knowledge and skills following training in automated external defibrillator use.

    PubMed

    Kopacek, Karen Birckelbaw; Dopp, Anna Legreid; Dopp, John M; Vardeny, Orly; Sims, J Jason

    2010-08-10

    To assess pharmacy students' retention of knowledge about appropriate automated external defibrillator use and counseling points following didactic training and simulated experience. Following a lecture on sudden cardiac arrest and automated external defibrillator use, second-year doctor of pharmacy (PharmD) students were assessed on their ability to perform basic life support and deliver a shock at baseline, 3 weeks, and 4 months. Students completed a questionnaire to evaluate recall of counseling points for laypeople/the public. Mean time to shock delivery at baseline was 74 ± 25 seconds, which improved significantly at 3 weeks (50 ± 17 seconds, p < 0.001) and was maintained at 4 months (47 ± 18 seconds, p < 0.001). Recall of all signs and symptoms of sudden cardiac arrest and automated external defibrillator counseling points was diminished after 4 months. Pharmacy students can use automated external defibrillators to quickly deliver a shock and are able to retain this ability after 4 months. Refresher training/courses will be required to improve students' retention of automated external defibrillator counseling points to ensure their ability to deliver appropriate patient education.

  10. Analysis on energy use in reuse cement silo for campus building

    NASA Astrophysics Data System (ADS)

    Fidiya Nugrahani, Elita; Winda Murti, Izzati; Arifianti, Qurrotin M. O.

    2018-03-01

    Semen Gresik, the first cement factory in Indonesia, was owned by the government and operated from 1957 until around 1997. The owner, PT Semen Indonesia (Persero), intended to reuse the cement factory as a campus building for Universitas Internasional Semen Indonesia (UISI). This research analyzed the future Energy Use Intensity (EUI) and developed energy-efficiency recommendations for renovating the silo through simulation. The future EUI of the existing building was 234 kWh/m2.year. Scenarios were created to reduce energy use through window shades, window material, infiltration, daylighting, plug load, air-conditioning and the operation schedule. The lowest EUI, estimated at 98.27 kWh/m2.year, was achieved by using 2/3 window shades, triple low-emission window glass, a lighting efficiency of 3.23 W/m2, maximized daylighting and occupancy control, infiltration minimized to 0.17 ACH, and a 12/5 operation schedule.
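    EUI is simply annual energy use per unit floor area. This small sketch shows the computation and the relative saving implied by the study's baseline (234 kWh/m2.year) and best-case (98.27 kWh/m2.year) figures; the absolute energy and floor-area values below are invented for illustration.

    ```python
    def eui(annual_energy_kwh, floor_area_m2):
        """Energy Use Intensity in kWh/m2.year."""
        return annual_energy_kwh / floor_area_m2

    # Illustrative values: 234,000 kWh/year over 1,000 m2 gives the baseline EUI.
    baseline = eui(234_000, 1_000)        # 234 kWh/m2.year (study's baseline)
    best_case = 98.27                      # kWh/m2.year (study's best scenario)
    saving = 1.0 - best_case / baseline    # fraction of energy saved, ~58%
    ```

    The combined measures thus cut simulated energy use by roughly 58% relative to the unrenovated silo.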

  11. Automated Design Tools for Integrated Mixed-Signal Microsystems (NeoCAD)

    DTIC Science & Technology

    2005-02-01

    The program developed Model Order Reduction (MOR) tools; system-level, mixed-signal circuit synthesis and optimization tools; and parasitic extraction tools. Mission Area: Command and Control. Keywords: mixed-signal circuit simulation, parasitic extraction, time-domain simulation, IC design flow, model order reduction. The report covers the overall program milestones and fast time-domain mixed-signal circuit simulation, including the HAARSPICE algorithms and their mathematical background.

  12. Pilots' monitoring strategies and performance on automated flight decks: an empirical study combining behavioral and eye-tracking data.

    PubMed

    Sarter, Nadine B; Mumaw, Randall J; Wickens, Christopher D

    2007-06-01

    The objective of the study was to examine pilots' automation monitoring strategies and performance on highly automated commercial flight decks. A considerable body of research and operational experience has documented breakdowns in pilot-automation coordination on modern flight decks. These breakdowns are often considered symptoms of monitoring failures even though, to date, only limited and mostly anecdotal data exist concerning pilots' monitoring strategies and performance. Twenty experienced B-747-400 airline pilots flew a 1-hr scenario involving challenging automation-related events on a full-mission simulator. Behavioral, mental model, and eye-tracking data were collected. The findings from this study confirm that pilots monitor basic flight parameters to a much greater extent than visual indications of the automation configuration. More specifically, they frequently fail to verify manual mode selections or notice automatic mode changes. In other cases, they do not process mode annunciations in sufficient depth to understand their implications for aircraft behavior. Low system observability and gaps in pilots' understanding of complex automation modes were shown to contribute to these problems. Our findings describe and explain shortcomings in pilots' automation monitoring strategies and performance based on converging behavioral, eye-tracking, and mental model data. They confirm that monitoring failures are one major contributor to breakdowns in pilot-automation interaction. The findings from this research can inform the design of improved training programs and automation interfaces that support more effective system monitoring.

  13. Photosynthesis and oxidative stress in the restinga plant species Eugenia uniflora L. exposed to simulated acid rain and iron ore dust deposition: potential use in environmental risk assessment.

    PubMed

    Neves, Natália Rust; Oliva, Marco Antonio; da Cruz Centeno, Danilo; Costa, Alan Carlos; Ribas, Rogério Ferreira; Pereira, Eduardo Gusmão

    2009-06-01

    The Brazilian sandy coastal plain named restinga is frequently subjected to particulate and gaseous emissions from iron ore factories. These gases may come into contact with atmospheric moisture and produce acid rain. The effects of the acid rain on vegetation, combined with iron excess in the soil, can lead to the disappearance of sensitive species and decrease restinga biodiversity. The effects of iron ore dust deposition and simulated acid rain on photosynthesis and on antioxidant enzymes were investigated in Eugenia uniflora, a representative shrub species of the restinga. This study aimed to determine the possible utility of this species in environmental risk assessment. After the application of iron ore dust as iron solid particulate matter (SPM(Fe)) and simulated acid rain (pH 3.1), the 18-month-old plants displayed brown spots and necrosis, typical symptoms of iron toxicity and injuries caused by acid rain, respectively. The acidity of the rain intensified leaf iron accumulation, which reached phytotoxic levels, mainly in plants exposed to iron ore dust. These plants showed the lowest values for net photosynthesis, stomatal conductance, transpiration, chlorophyll a content and electron transport rate through photosystem II (PSII). Catalase and superoxide dismutase activities were decreased by simulated acid rain. Peroxidase activity and membrane injury increased following exposure to acid rain and simultaneous SPM(Fe) application. Eugenia uniflora exhibited impaired photosynthetic and antioxidative metabolism in response to combined iron and acid rain stresses. This species could become a valuable tool in environmental risk assessment in restinga areas near iron ore pelletizing factories. Non-invasive evaluations of visual injuries, photosynthesis and chlorophyll a fluorescence, as well as invasive biochemical analysis, could be used as markers.

  14. Factorials of real negative and imaginary numbers - A new perspective.

    PubMed

    Thukral, Ashwani K

    2014-01-01

    Presently, factorials of real negative numbers and imaginary numbers, except for zero and negative integers, are interpolated using Euler's gamma function. In the present paper, the concept of factorials has been generalised as applicable to real and imaginary numbers, and to multifactorials. New functions based on Euler's factorial function have been proposed for the factorials of real negative and imaginary numbers. As per the present concept, the factorials of real negative numbers are complex numbers. The factorials of real negative integers have their imaginary part equal to zero, and thus are real numbers. Similarly, the factorials of imaginary numbers are complex numbers. The moduli of the complex factorials of real negative numbers and imaginary numbers are equal to their respective real positive number factorials. Fractional factorials and multifactorials have been defined in a new perspective. The proposed concept has also been extended to Euler's gamma function for real negative numbers and imaginary numbers, and to the beta function.
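    For reference, the conventional interpolation the abstract starts from is x! = Γ(x + 1), which is defined for negative non-integer reals but diverges at the poles of the gamma function (zero and the negative integers). A minimal sketch of that standard interpolation (the paper's proposed complex-valued extension is not reproduced here):

    ```python
    import math

    def real_factorial(x):
        """x! via Gamma(x + 1); raises ValueError at the poles (negative integers)."""
        return math.gamma(x + 1)

    half = real_factorial(0.5)        # 0.5!    = Gamma(1.5) = sqrt(pi) / 2
    neg_half = real_factorial(-0.5)   # (-0.5)! = Gamma(0.5) = sqrt(pi)
    ```

    Calling `real_factorial(-1)` raises `ValueError`, which is exactly the gap in the gamma-based interpolation that the paper's generalisation addresses.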

  15. The application of FLUENT in simulating outcomes from chlorine leakage accidents in a typical chemical factory.

    PubMed

    Li, Jianfeng; Zhang, Bin; Tang, Sichuang; Tong, Ruipeng

    2016-05-01

    To improve market competitiveness, long-established chemical enterprises have expanded and reconstructed on the basis of their original equipment. Because such reconstruction builds on existing production equipment, it inevitably raises reutilization problems in existing pipelines and equipment. A simplified typical chemical factory was modeled with reference to an actual workshop layout, and credible accident scenarios were constructed to reveal the diffusion process. At a larger leakage rate, the downwind area affected by the chlorine leak became somewhat larger, and the lethal scope grew quickly in a relatively shorter time, posing greater threats to the lives and property in the vicinity of the factories. Furthermore, the heavier-than-air behavior of chlorine does not inevitably produce a higher concentration at a lower surface than at a higher one: within a certain height range, a relatively higher monitoring surface can show a larger diffusion range and a higher concentration than a relatively lower surface. It can be inferred that, within a certain height, the chlorine diffusion rate closer to the ground is slower due to turbulence or the relative resistance of the ground.

  16. Translations on North Korea No. 622

    DTIC Science & Technology

    1978-10-13

    Contents cover the Pyongyang Power Station; 5 July Electric Factory; Hamhung Machine Tool Factory; Kosan Plastic Pipe Factory; Sog'wangea Plastic Pipe Factory; 8 August Factory (Double Chollima); Hamhung Disabled Veterans' Plastic Goods Factory; Mangyongdae Machine Tool Factory; Kangso Coal Mine; and Tongdaewon Garment Factory. Notes: innovating in machine tool production (NC 21 Jul 78 p 2); 40 days of the "100 days of combat" raised coal production 10 percent (NC 21 Jul 78 p 4).

  17. Simulation of a combined-cycle engine

    NASA Technical Reports Server (NTRS)

    Vangerpen, Jon

    1991-01-01

    A FORTRAN computer program was developed to simulate the performance of combined-cycle engines. These engines combine features of both gas turbines and reciprocating engines. The computer program can simulate both design-point and off-design operation. Widely varying engine configurations can be evaluated for their power, performance, and efficiency, as well as the influence of altitude and airspeed. Although the program was developed to simulate aircraft engines, it can be used with equal success for stationary and automotive applications.
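    The appeal of a combined cycle can be seen from a textbook idealization: a topping cycle of efficiency η_t rejects its waste heat into a bottoming cycle of efficiency η_b, giving a combined efficiency η_cc = η_t + (1 − η_t)·η_b. A hedged sketch with illustrative component efficiencies (the FORTRAN program above models far more detail than this one-line relation):

    ```python
    def combined_efficiency(eta_top, eta_bottom):
        """Ideal combined-cycle efficiency: the bottoming cycle recovers a
        fraction eta_bottom of the heat the topping cycle rejects."""
        return eta_top + (1.0 - eta_top) * eta_bottom

    # Illustrative values only: a 35%-efficient topping cycle with a
    # 30%-efficient bottoming cycle yields 54.5% combined.
    eta = combined_efficiency(0.35, 0.30)
    ```

    The combined figure exceeds either component efficiency alone, which is the basic motivation for pairing a gas turbine with a reciprocating (or steam) bottoming stage.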

  18. Development of the Automated AFAPL Engine Simulator Test for Lubricant Evaluation.

    DTIC Science & Technology

    1981-05-01

    This technical report has been reviewed and is approved for publication. Oil flow is jetted into the front and rear of the simulator gearbox to provide additional cooling. A heat exchanger is used to cool the oil flow to the gearbox, and additional heat exchangers are used in the simulator and gearbox oil return lines to the external sump.

  19. Modelling and simulating the forming of new dry automated lay-up reinforcements for primary structures

    NASA Astrophysics Data System (ADS)

    Bouquerel, Laure; Moulin, Nicolas; Drapier, Sylvain; Boisse, Philippe; Beraud, Jean-Marc

    2017-10-01

    While weight has so far been the main driver for the development of prepreg-based composite solutions for aeronautics, a new weight-cost trade-off tends to drive choices for next-generation aircraft. As a response, Hexcel has designed a new dry reinforcement type for aircraft primary structures, which combines the benefits of automation, out-of-autoclave process cost-effectiveness, and mechanical performance competitive with prepreg solutions: HiTape® is a unidirectional (UD) dry carbon reinforcement with a thermoplastic veil on each side designed for aircraft primary structures [1-3]. One privileged process route for HiTape® in high-volume automated processes consists in forming initially flat dry reinforcement stacks before resin infusion [4] or injection. Simulation of the forming step aims at predicting the geometry and mechanical properties of the formed stack (the so-called preform) for process optimisation. Extensive work has been carried out on the forming behaviour and simulation of prepregs and dry woven fabrics, but interest in dry non-woven reinforcements has emerged more recently. Some work has been achieved on non-crimp fabrics, but studies on the forming behaviour of UDs are scarce and deal with UD prepregs only. Tension and bending in the fibre direction, along with inter-ply friction, have been identified as the main mechanisms controlling the HiTape® response during forming. Bending has been characterised using a modified Peirce's flexometer [5], and an inter-ply friction study is under development. Anisotropic hyperelastic constitutive models have been selected to represent the assumed decoupled deformation mechanisms. Model parameters are then identified from the associated experimental results. For forming simulation, a continuous approach at the macroscopic scale has been selected first, and simulation is carried out in the Zset framework [6] using proper shell finite elements.

  20. Automated Extraction of Flow Features

    NASA Technical Reports Server (NTRS)

    Dorney, Suzanne (Technical Monitor); Haimes, Robert

    2005-01-01

    Computational Fluid Dynamics (CFD) simulations are routinely performed as part of the design process of most fluid handling devices. In order to use the results of a CFD simulation efficiently and effectively, visualization tools are often used. These tools are used in all stages of the CFD simulation, including pre-processing, interim-processing, and post-processing, to interpret the results. Each of these stages requires visualization tools that allow one to examine the geometry of the device as well as the partial or final results of the simulation. An engineer will typically generate a series of contour and vector plots to better understand the physics of how the fluid is interacting with the physical device. Of particular interest is detecting features such as shocks, re-circulation zones, and vortices (which highlight areas of stress and loss). As the demand for CFD analyses continues to increase, the need for automated feature extraction capabilities has become vital. In the past, feature extraction and identification were interesting concepts but not required for understanding the physics of a steady flow field. This is because the results of the more traditional tools, like iso-surfaces, cuts, and streamlines, were more interactive and easily abstracted so they could be presented to the investigator. These tools worked and properly conveyed the collected information, but at the expense of a great deal of interaction. For unsteady flow fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required to point out areas of potential interest contained within the flow. This must not impose a heavy compute burden (the visualization should not significantly slow down the solution procedure in co-processing environments). Methods must be developed to abstract the feature of interest and display it in a manner that physically makes sense.
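    One of the simplest automated detectors for the vortex features mentioned above thresholds the 2-D vorticity field, ω = ∂v/∂x − ∂u/∂y, computed by central differences. This is a toy sketch (not the authors' method), checked on a solid-body rotation field, where the analytic vorticity is 2 everywhere.

    ```python
    def vorticity(u, v, dx=1.0, dy=1.0):
        """Central-difference vorticity on the interior cells of 2-D velocity grids."""
        ny, nx = len(u), len(u[0])
        w = [[0.0] * nx for _ in range(ny)]
        for j in range(1, ny - 1):
            for i in range(1, nx - 1):
                dvdx = (v[j][i + 1] - v[j][i - 1]) / (2 * dx)
                dudy = (u[j + 1][i] - u[j - 1][i]) / (2 * dy)
                w[j][i] = dvdx - dudy
        return w

    def flag_vortices(u, v, threshold):
        """Return interior (j, i) cells whose |vorticity| exceeds the threshold."""
        w = vorticity(u, v)
        return [(j, i) for j in range(1, len(u) - 1)
                       for i in range(1, len(u[0]) - 1)
                       if abs(w[j][i]) > threshold]

    # Solid-body rotation about the grid centre: u = -(y - yc), v = x - xc.
    u = [[float(-(j - 2)) for i in range(5)] for j in range(5)]
    v = [[float(i - 2) for i in range(5)] for j in range(5)]
    flags = flag_vortices(u, v, threshold=1.5)  # flags all 9 interior cells
    ```

    Production feature extractors use more robust criteria (e.g. critical-point or λ2-style analyses), but all share this structure: a local field computation followed by a cheap per-cell test that can run alongside the solver.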
