West Europe Report, Science and Technology.
1986-04-15
(BLICK DURCH DIE WIRTSCHAFT, 21 Feb 86) SEIAF: ELSAG/IBM's New Creation in Factory Automation (Mauro Flego Interview; Milan AUTOMAZIONE INTEGRATA in Italian, Apr 85, pp 110-112) [Interview with Mauro Flego ...objectives of SEIAF?] [Answer] SEIAF, or better, the joint venture ELSAG/IBM, concerns itself with electronic and computer systems for factory automation
ERIC Educational Resources Information Center
Hudson, C. A.
1982-01-01
Advances in factory computerization (computer-aided design and computer-aided manufacturing) are reviewed, including discussions of robotics, human factors engineering, and the sociological impact of automation. (JN)
Automation; The New Industrial Revolution.
ERIC Educational Resources Information Center
Arnstein, George E.
Automation is a word that describes the workings of computers and the innovations of automatic transfer machines in the factory. As the hallmark of the new industrial revolution, computers displace workers and create a need for new skills and retraining programs. With improved communication between industry and the educational community to…
Gearing up to the factory of the future
NASA Astrophysics Data System (ADS)
Godfrey, D. E.
1985-01-01
The features of factories and manufacturing techniques and tools of the near future are discussed. The spur to incorporate new technologies on the factory floor will originate in management, who must guide the interfacing of computer-enhanced equipment with traditional manpower, materials and machines. Electronic control with responsiveness and flexibility will be the key concept in an integrated approach to processing materials. Microprocessor controlled laser and fluid cutters add accuracy to cutting operations. Unattended operation will become feasible when automated inspection is added to a work station through developments in robot vision. Optimum shop management will be achieved through AI programming of parts manufacturing, optimized work flows, and cost accounting. The automation enhancements will allow designers to directly affect parts being produced on the factory floor.
Automation Training Tools of the Future.
ERIC Educational Resources Information Center
Rehg, James
1986-01-01
Manufacturing isn't what it used to be, and the United States must ensure its position in the world trade market by educating factory workers in new automated systems. A computer manufacturing engineer outlines the training requirements of a modern workforce and details robotic training devices suitable for classroom use. (JN)
Advanced Map For Real-Time Process Control
NASA Astrophysics Data System (ADS)
Shiobara, Yasuhisa; Matsudaira, Takayuki; Sashida, Yoshio; Chikuma, Makoto
1987-10-01
MAP, a communications protocol for factory automation proposed by General Motors [1], has been accepted by users throughout the world and is rapidly becoming a user standard. In fact, it is now a LAN standard for factory automation. MAP is intended to interconnect different devices, such as computers and programmable devices, made by different manufacturers, enabling them to exchange information. It is based on the OSI intercomputer communications protocol standard under development by the ISO. With progress and standardization, MAP is being investigated for application to process control fields other than factory automation [2]. The transmission response time of the network system and centralized management of data exchanged with various devices for distributed control are important in the case of a real-time process control with programmable controllers, computers, and instruments connected to a LAN system. MAP/EPA and MINI MAP aim at reduced overhead in protocol processing and enhanced transmission response. If applied to real-time process control, a protocol based on point-to-point and request-response transactions limits throughput and transmission response. This paper describes an advanced MAP LAN system applied to real-time process control by adding a new data transmission control that performs multicasting communication voluntarily and periodically in the priority order of data to be exchanged.
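The abstract's central mechanism, periodic transmission in priority order of the data to be exchanged, can be sketched abstractly as a scheduler that re-releases each data item once its period elapses and sends the highest-priority pending item each cycle. The item names, priorities, and periods below are invented for illustration; this is a sketch of the idea, not the paper's protocol implementation.

```python
import heapq

class PeriodicPublisher:
    """Toy scheduler: each data item has a priority and a period (in ticks).
    At every tick, items whose period has elapsed are re-released, and the
    highest-priority pending item is multicast (here: simply returned)."""

    def __init__(self, items):
        # items: list of (name, priority, period); lower number = more urgent
        self.items = items
        self.next_release = {name: 0 for name, _, _ in items}
        self.pending = []  # heap of (priority, name)

    def tick(self, now):
        for name, prio, period in self.items:
            if now >= self.next_release[name]:
                heapq.heappush(self.pending, (prio, name))
                self.next_release[name] = now + period
        if self.pending:
            _, name = heapq.heappop(self.pending)
            return name  # the item sent on this cycle
        return None

pub = PeriodicPublisher([("alarm", 0, 4), ("sensor", 1, 2), ("log", 2, 4)])
schedule = [pub.tick(t) for t in range(4)]
print(schedule)  # ['alarm', 'sensor', 'sensor', 'log']
```

Note how the low-priority "log" item is deferred but never lost, which is the throughput advantage over strict request-response exchanges that the abstract alludes to.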
Arcnet(R) On-Fiber -- A Viable Factory Automation Alternative
NASA Astrophysics Data System (ADS)
Karlin, Geof; Tucker, Carol S.
1987-01-01
Manufacturers need to improve their operating methods and increase their productivity so they can compete successfully in the marketplace. This goal can be achieved through factory automation, and the key to this automation is successful data base management and factory integration. However, large scale factory automation and integration requires effective communications, and this has given rise to an interest in various Local Area Networks or LANs. In a completely integrated and automated factory, the entire organization must have access to the data base, and all departments and functions must be able to communicate with each other. Traditionally, these departments and functions use incompatible equipment, and the ability to make such equipment communicate presents numerous problems. ARCNET, a token-passing LAN which has a significant presence in the office environment today, coupled with fiber optic cable, the cable of the future, provides an effective, low-cost solution to a number of these problems.
The Plight of Manufacturing: What Can Be Done?
ERIC Educational Resources Information Center
Cyert, Richard M.
1985-01-01
Proposes that full automation is the best current option for the United States' manufacturing industries. Advocates increased use of electronics, robotics, and computers in the establishment of unmanned factories. Implications of this movement are examined in terms of labor, management, and the structure of the economy. (ML)
Minifactory: a precision assembly system adaptable to the product life cycle
NASA Astrophysics Data System (ADS)
Muir, Patrick F.; Rizzi, Alfred A.; Gowdy, Jay W.
1997-12-01
Automated product assembly systems are traditionally designed with the intent that they will be operated with few significant changes for as long as the product is being manufactured. This approach to factory design and programming has many undesirable qualities which have motivated the development of more 'flexible' systems. In an effort to improve agility, different types of flexibility have been integrated into factory designs. Specifically, automated assembly systems have been endowed with the ability to assemble differing products by means of computer-controlled robots, and to accommodate variations in parts locations and dimensions by means of sensing. The product life cycle (PLC) is a standard four-stage model of the performance of a product from the time that it is first introduced in the marketplace until the time that it is discontinued. Manufacturers can improve their return on investment by adapting the production process to the PLC. We are developing two concepts to enable manufacturers to more readily achieve this goal: the agile assembly architecture (AAA), an abstract framework for distributed modular automation; and minifactory, our physical instantiation of this architecture for the assembly of precision electro-mechanical devices. By examining the requirements which each PLC stage places upon the production system, we identify characteristics of factory design and programming which are appropriate for that stage. As the product transitions from one stage to the next, the factory design and programming should also transition from one embodiment to the next in order to achieve the best return on investment. Modularity of the factory components, highly flexible product transport mechanisms, and a high level of distributed intelligence are key characteristics of minifactory that enable this adaptation.
Automation U.S.A.: Overcoming Barriers to Automation.
ERIC Educational Resources Information Center
Brody, Herb
1985-01-01
Although labor unions and inadequate technology play minor roles, the principal barrier to factory automation is "fear of change." Related problems include long-term benefits, nontechnical executives, and uncertainty of factory cost accounting. Industry support for university programs is helping to educate engineers to design, implement, and…
An intelligent CNC machine control system architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, D.J.; Loucks, C.S.
1996-10-01
Intelligent, agile manufacturing relies on automated programming of digitally controlled processes. Currently, processes such as Computer Numerically Controlled (CNC) machining are difficult to automate because of highly restrictive controllers and poor software environments. It is also difficult to utilize sensors and process models for adaptive control, or to integrate machining processes with other tasks within a factory floor setting. As part of a Laboratory Directed Research and Development (LDRD) program, a CNC machine control system architecture based on object-oriented design and graphical programming has been developed to address some of these problems and to demonstrate automated agile machining applications using platform-independent software.
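As a rough illustration of the object-oriented controller design the abstract describes, the sketch below defines an abstract axis interface plus a controller with a hook for sensor-based adaptive feed correction. All class and method names are hypothetical; the actual LDRD architecture is not reproduced here.

```python
from abc import ABC, abstractmethod

class Axis(ABC):
    """One controlled machine axis; concrete subclasses would wrap real drives."""
    @abstractmethod
    def move_to(self, position_mm: float) -> None: ...
    @abstractmethod
    def position(self) -> float: ...

class SimulatedAxis(Axis):
    """Stand-in for a drive, useful for offline testing of part programs."""
    def __init__(self):
        self._pos = 0.0
    def move_to(self, position_mm):
        self._pos = position_mm
    def position(self):
        return self._pos

class CncController:
    """Coordinates named axes; feed_correction is the adaptive-control hook
    where a sensor/process model could adjust commanded targets."""
    def __init__(self, axes, feed_correction=None):
        self.axes = axes
        self.feed_correction = feed_correction or (lambda target: target)
    def goto(self, **targets):
        for name, target in targets.items():
            self.axes[name].move_to(self.feed_correction(target))

ctl = CncController({"x": SimulatedAxis(), "y": SimulatedAxis()})
ctl.goto(x=12.5, y=-3.0)
print(ctl.axes["x"].position(), ctl.axes["y"].position())  # 12.5 -3.0
```

The point of the abstract base class is exactly the platform independence the abstract claims: the same controller logic runs against simulated or real axes.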
CIM for 300-mm semiconductor fab
NASA Astrophysics Data System (ADS)
Luk, Arthur
1997-08-01
Five years ago, factory automation (F/A) was not prevalent in the fab. Today, facing a drastically changed market and intense competition, management asks that plant-floor data be forwarded to its desktop computers. This growing demand has rapidly pushed F/A toward computer-integrated manufacturing (CIM). Through personalization, computer size was reduced until computers could sit on the desktop, and the PC opened a new computing era. With the advent of the network, the network computer (NC) creates fresh problems for us. As we plan to invest more than $3 billion to build a new 300 mm fab, the next-generation technology raises a challenging bar.
Replicating systems concepts: Self-replicating lunar factory and demonstration
NASA Technical Reports Server (NTRS)
1982-01-01
Automation of lunar mining and manufacturing facility maintenance and repair is addressed. Designing the factory as an automated, multiproduct, remotely controlled, reprogrammable Lunar Manufacturing Facility capable of constructing duplicates of itself which would themselves be capable of further replication is proposed.
Communications among elements of a space construction ensemble
NASA Technical Reports Server (NTRS)
Davis, Randal L.; Grasso, Christopher A.
1989-01-01
Space construction projects will require careful coordination between managers, designers, manufacturers, operators, astronauts, and robots with large volumes of information of varying resolution, timeliness, and accuracy flowing between the distributed participants over computer communications networks. Within the CSC Operations Branch, we are researching the requirements and options for such communications. Based on our work to date, we feel that communications standards being developed by the International Standards Organization, the CCITT, and other groups can be applied to space construction. We are currently studying in depth how such standards can be used to communicate with robots and automated construction equipment used in a space project. Specifically, we are looking at how the Manufacturing Automation Protocol (MAP) and the Manufacturing Message Specification (MMS), which tie together computers and machines in automated factories, might be applied to space construction projects. Together with our CSC industrial partner Computer Technology Associates, we are developing a MAP/MMS companion standard for space construction and we will produce software to allow the MAP/MMS protocol to be used in our CSC operations testbed.
Cognitive performance deficits in a simulated climb of Mount Everest - Operation Everest II
NASA Technical Reports Server (NTRS)
Kennedy, R. S.; Dunlap, W. P.; Banderet, L. E.; Smith, M. G.; Houston, C. S.
1989-01-01
Cognitive function at simulated altitude was investigated in a repeated-measures within-subject study of performance by seven volunteers in a hypobaric chamber, in which atmospheric pressure was systematically lowered over a period of 40 d to finally reach a pressure equivalent to 8845 m, the approximate height of Mount Everest. The automated performance test system employed compact computer design; automated test administrations, data storage, and retrieval; psychometric properties of stability and reliability; and factorial richness. Significant impairments of cognitive function were seen for three of the five tests in the battery; on two tests, grammatical reasoning and pattern comparison, every subject showed a substantial decrement.
JPRS report: Science and Technology. Europe and Latin America
NASA Astrophysics Data System (ADS)
1988-01-01
Articles from the popular and trade press are included on the following subjects: advanced materials, aerospace industry, automotive industry, biotechnology, computers, factory automation and robotics, microelectronics, and science and technology policy. The aerospace articles discuss briefly and in a nontechnical way the SAGEM bubble memories for space applications, Ariane V new testing facilities, innovative technologies of TDF-1 satellite, and the restructuring of the Aviation Division at France's Aerospatiale.
NASA Astrophysics Data System (ADS)
Harrison, Robert; Vera, Daniel; Ahmad, Bilal
2016-10-01
The fourth industrial revolution promises to create what has been called the smart factory. The vision is that within such modular structured smart factories, cyber-physical systems monitor physical processes, create a virtual copy of the physical world and make decentralised decisions. This paper provides a view of this initiative from an automation systems perspective. In this context it considers how future automation systems might be effectively configured and supported through their lifecycles and how integration, application modelling, visualisation and reuse of such systems might be best achieved. The paper briefly describes limitations in current engineering methods, and new emerging approaches including the cyber physical systems (CPS) engineering tools being developed by the automation systems group (ASG) at Warwick Manufacturing Group, University of Warwick, UK.
JPRS report: Science and technology. Europe and Latin America
NASA Astrophysics Data System (ADS)
1988-01-01
Articles from the popular and trade press of Western Europe and Latin America are presented on advanced materials, aerospace and civil aviation, computers, defense industries, factory automation and robotics, lasers, sensors, optics, microelectronics, science and technology policy, biotechnology, marine technology, and nuclear developments. The aerospace articles include an overview of Austrian space activities and plans and a report on a panel of West German experts recommending against self-sufficiency for the Airbus.
Development of Moire machine vision
NASA Technical Reports Server (NTRS)
Harding, Kevin G.
1987-01-01
Three dimensional perception is essential to the development of versatile robotics systems in order to handle complex manufacturing tasks in future factories and in providing high accuracy measurements needed in flexible manufacturing and quality control. A program is described which will develop the potential of Moire techniques to provide this capability in vision systems and automated measurements, and demonstrate artificial intelligence (AI) techniques to take advantage of the strengths of Moire sensing. Moire techniques provide a means of optically manipulating the complex visual data in a three dimensional scene into a form which can be easily and quickly analyzed by computers. This type of optical data manipulation provides high productivity through integrated automation, producing a high quality product while reducing computer and mechanical manipulation requirements and thereby the cost and time of production. This nondestructive evaluation is developed to be able to make full field range measurement and three dimensional scene analysis.
Automated Sample Exchange Robots for the Structural Biology Beam Lines at the Photon Factory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hiraki, Masahiko; Watanabe, Shokei; Yamada, Yusuke
2007-01-19
We are now developing automated sample exchange robots for high-throughput protein crystallographic experiments for onsite use at synchrotron beam lines. It is part of the fully automated robotics systems being developed at the Photon Factory, for the purposes of protein crystallization, monitoring crystal growth, harvesting and freezing crystals, mounting the crystals inside a hutch and for data collection. We have already installed the sample exchange robots based on the SSRL automated mounting system at our insertion device beam lines BL-5A and AR-NW12A at the Photon Factory. In order to reduce the time required for sample exchange further, a prototype of a double-tonged system was developed. As a result of preliminary experiments with double-tonged robots, the sample exchange time was successfully reduced from 70 seconds to 10 seconds with the exception of the time required for pre-cooling and warming up the tongs.
CIM at GE's factory of the future
NASA Astrophysics Data System (ADS)
Waldman, H.
Functional features of a highly automated aircraft component batch processing factory are described. The system has processing, working, and methodology components. A rotating parts operation installed 20 years ago features a high density of numerically controlled machines, and is connected to a hierarchical network of data communications and apparatus for moving the rotating parts and tools of engines. Designs produced at one location in the country are sent by telephone link to other sites for development of manufacturing plans, tooling, numerical control programs, and process instructions for the rotating parts. Direct numerical control is implemented at the work stations, which have instructions stored on tape for back-up in case the host computer goes down. Each machine is automatically monitored at 48 points and notice of failure can originate from any point in the system.
JPRS Report, Science & Technology, Europe & Latin America.
1988-01-22
FACTORY AUTOMATION, ROBOTICS: West Europe Seeks To Halt Japanese Inroads in Machine Tool Sector (Rex Malik; ZERO UN INFORMATIQUE, 31 Aug 87) [Interview ... Trumpf, by the same journalist; first paragraph is L'USINE NOUVELLE introduction] [Excerpts] European machine-tool builders are stepping up mutual
Integrating PCLIPS into ULowell's Lincoln Logs: Factory of the future
NASA Technical Reports Server (NTRS)
Mcgee, Brenda J.; Miller, Mark D.; Krolak, Patrick; Barr, Stanley J.
1990-01-01
We are attempting to show how independent but cooperating expert systems, executing within a parallel production system (PCLIPS), can operate and control a completely automated, fault tolerant prototype of a factory of the future (The Lincoln Logs Factory of the Future). The factory consists of a CAD system for designing the Lincoln Log Houses, two workcells, and a materials handling system. A workcell consists of two robots, part feeders, and a frame mounted vision system.
Robotics and Automation Education: Developing the Versatile, Practical Lab.
ERIC Educational Resources Information Center
Stenerson, Jon
1986-01-01
Elements of the development of a robotics and automation laboratory are discussed. These include the benefits of upgrading current staff, ways to achieve this staff development, formation of a robotics factory automation committee, topics to be taught with a robot, elements of a laboratory, laboratory funding, and design safety. (CT)
AUTOMATING ASSET KNOWLEDGE WITH MTCONNECT.
Venkatesh, Sid; Ly, Sidney; Manning, Martin; Michaloski, John; Proctor, Fred
2016-01-01
In order to maximize assets, manufacturers should use real-time knowledge garnered from ongoing and continuous collection and evaluation of factory-floor machine status data. In discrete parts manufacturing, factory machine monitoring has been difficult, due primarily to closed, proprietary automation equipment that makes integration difficult. Recently, there has been a push in applying the data acquisition concepts of MTConnect to the real-time acquisition of machine status data. MTConnect is an open, free specification aimed at overcoming the "Islands of Automation" dilemma on the shop floor. With automated asset analysis, manufacturers can improve production to become lean, efficient, and effective. The focus of this paper will be on the deployment of MTConnect to collect real-time machine status to automate asset management. In addition, we will leverage the ISO 22400 standard, which defines an asset and quantifies asset performance metrics. In conjunction with these goals, the deployment of MTConnect in a large aerospace manufacturing facility will be studied with emphasis on asset management and understanding the impact of machine Overall Equipment Effectiveness (OEE) on manufacturing.
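MTConnect agents publish machine status as XML over plain HTTP, which is what makes the "Islands of Automation" problem tractable. The sketch below parses a hand-written, heavily simplified sample of the kind of document a /current request might return; real responses are namespaced and far richer, so treat the element names and sample values here as illustrative.

```python
import xml.etree.ElementTree as ET

# Trimmed, hand-written example of the XML an MTConnect agent's /current
# endpoint might return (real responses carry namespaces and many more items).
SAMPLE = """
<MTConnectStreams>
  <Streams>
    <DeviceStream name="Mill-1">
      <ComponentStream component="Controller">
        <Events>
          <Availability dataItemId="avail">AVAILABLE</Availability>
          <Execution dataItemId="exec">ACTIVE</Execution>
        </Events>
      </ComponentStream>
    </DeviceStream>
  </Streams>
</MTConnectStreams>
"""

def machine_status(xml_text):
    """Map device name -> (availability, execution) from a streams document."""
    root = ET.fromstring(xml_text)
    status = {}
    for device in root.iter("DeviceStream"):
        avail = device.find(".//Availability")
        execution = device.find(".//Execution")
        status[device.get("name")] = (
            avail.text if avail is not None else None,
            execution.text if execution is not None else None,
        )
    return status

print(machine_status(SAMPLE))  # {'Mill-1': ('AVAILABLE', 'ACTIVE')}
```

A monitoring loop would poll each agent periodically and feed such tuples into OEE calculations of the kind ISO 22400 defines.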
The Learning Basis of Automated Factories: The Case of FIAT. Training Discussion Paper No. 86.
ERIC Educational Resources Information Center
Araujo e Oliveira, Joao Batista
As part of a study on the impact of automation on training, extensive interviews were conducted at two of Fiat's plants, Termoli and Cassino, Italy. At Termoli, a plant built in the mid-1980s with automation in mind, production of engines and gear boxes was very much integrated by automation devices. Cassino produced some individual components but was…
The National Shipbuilding Research Program, Computer Aided Process Planning for Shipyards
1986-08-01
Contents include "Factory Simulation: Approach to Integration of Computer-Based Factory Simulation with Conventional Factory Planning Techniques" and "Financial Justification of State-of-the-Art Investment: A Study Using CAPP."
USDA-ARS?s Scientific Manuscript database
Greenhouse cultivation has evolved from simple covered rows of open-field crops to highly sophisticated controlled environment agriculture (CEA) facilities that project the image of plant factories for urban farming. The advances and improvements in CEA have promoted scientific solutions for ...
Development of integrated control system for smart factory in the injection molding process
NASA Astrophysics Data System (ADS)
Chung, M. J.; Kim, C. Y.
2018-03-01
In this study, we proposed an integrated control system for automating the injection molding process, as required for construction of a smart factory. The injection molding process consists of heating, tool close, injection, cooling, tool open, and take-out. A take-out robot controller, an image processing module, and a process data acquisition interface module were developed and assembled into the integrated control system. By adopting the integrated control system, the injection molding process can be simplified and the cost of constructing a smart factory kept low.
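The six process stages named in the abstract suggest a simple cyclic state machine. The toy controller below steps through them in order; the dispatch style and class names are invented for illustration, not taken from the paper's system.

```python
# The six stages listed in the abstract, in process order.
STAGES = ["heating", "tool close", "injection", "cooling", "tool open", "take-out"]

class InjectionCycle:
    """Toy cyclic controller: each step() advances one stage and wraps around."""
    def __init__(self):
        self.index = 0
        self.log = []
    def step(self):
        stage = STAGES[self.index]
        self.log.append(stage)  # where a real controller would command hardware
        self.index = (self.index + 1) % len(STAGES)
        return stage

cycle = InjectionCycle()
one_shot = [cycle.step() for _ in range(len(STAGES))]
print(one_shot)  # one full molding cycle, in order
```

An integrated system like the paper's would hang the robot controller, camera module, and data-acquisition calls off the individual stage transitions.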
Response Surface Modeling of Combined-Cycle Propulsion Components using Computational Fluid Dynamics
NASA Technical Reports Server (NTRS)
Steffen, C. J., Jr.
2002-01-01
Three examples of response surface modeling with CFD are presented for combined cycle propulsion components. The examples include a mixed-compression-inlet during hypersonic flight, a hydrogen-fueled scramjet combustor during hypersonic flight, and a ducted-rocket nozzle during all-rocket flight. Three different experimental strategies were examined, including full factorial, fractionated central-composite, and D-optimal with embedded Plackett-Burman designs. The response variables have been confined to integral data extracted from multidimensional CFD results. Careful attention to uncertainty assessment and modeling bias has been addressed. The importance of automating experimental setup and effectively communicating statistical results are emphasized.
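Response surface modeling generally means fitting a low-order polynomial to responses observed at designed factor settings. A minimal sketch on synthetic data follows; the two coded factors, the 3x3 full-factorial design, and the generating coefficients are all invented for illustration and are not the paper's cases.

```python
import numpy as np

# Hypothetical data: a 3x3 full-factorial design in two coded factors, with a
# known quadratic response and no noise, so the least-squares fit is exact.
levels = [-1.0, 0.0, 1.0]
X1, X2 = np.meshgrid(levels, levels)
x1, x2 = X1.ravel(), X2.ravel()
y = 5.0 + 2.0 * x1 - 1.0 * x2 + 0.5 * x1**2 + 0.25 * x1 * x2

# Design matrix for the full quadratic response surface model:
# y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(coef, 6))  # recovers [5, 2, -1, 0.5, 0, 0.25]
```

In practice each row of `y` would come from a CFD run at that factor setting, and the fitted surface replaces the expensive simulation inside an optimizer.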
NASA Technical Reports Server (NTRS)
Ali, Syed Firasat; Khan, Javed Khan; Rossi, Marcia J.; Crane, Peter; Heath, Bruce E.; Knighten, Tremaine; Culpepper, Christi
2003-01-01
Personal computer based flight simulators are expanding opportunities for providing low-cost pilot training. One advantage of these devices is the opportunity to incorporate instructional features into training scenarios that might not be cost effective with earlier systems. Research was conducted to evaluate the utility of different instructional features using a coordinated level turn as an aircraft maneuvering task. In study I, automated computer grades of performance were compared with certified flight instructors' grades. Each of the six student volunteers conducted a flight with level turns at two different bank angles. The automated computer grades were based on prescribed tolerances on bank angle, airspeed, and altitude. Two certified flight instructors independently examined the video tapes of heads-up and instrument displays of the flights and graded them. The comparison of automated grades with the instructors' grades was based on correlations between them. In study II, a 2x2 between-subjects factorial design was used to devise and conduct an experiment comparing real-time training with above-real-time training, and feedback with no feedback in training. The performance measure used to monitor progress in training was based on deviations in bank angle and altitude; it was developed after completion of the experiment, including the training and test flights, and was not envisaged before the experiment. The experiment did not include self-instruction as originally planned, although feedback by the experimenter to the trainee was included in the study.
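The study's automated grading idea, scoring against prescribed tolerances and then correlating automated grades with instructor grades, can be sketched as follows. The tolerance values, targets, and sample flight data are invented; the study's actual grading formula is not given in the abstract.

```python
# Grade a flight as the fraction of samples within prescribed tolerances on
# bank angle, altitude, and airspeed; all targets/tolerances are hypothetical.
def within(value, target, tol):
    return abs(value - target) <= tol

def auto_grade(samples, target_bank=30.0, target_alt=5000.0, target_speed=120.0,
               tol_bank=5.0, tol_alt=100.0, tol_speed=10.0):
    ok = [within(b, target_bank, tol_bank) and within(a, target_alt, tol_alt)
          and within(s, target_speed, tol_speed) for b, a, s in samples]
    return sum(ok) / len(ok)

def pearson(xs, ys):
    """Pearson correlation, for comparing automated and instructor grades."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

# Samples are (bank_deg, altitude_ft, airspeed_kt); the middle one busts bank tolerance.
flight = [(29, 5020, 121), (36, 5150, 118), (31, 4980, 125)]
print(auto_grade(flight))  # 2 of 3 samples within all tolerances
print(round(pearson([1, 2, 3], [2, 4, 6]), 6))  # 1.0 for perfectly linear grades
```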
Reviving the Rural Factory: Automation and Work in the South. Executive Summary.
ERIC Educational Resources Information Center
Rosenfeld, Stuart A.; And Others
This document is the executive summary for a two volume report on technological innovation and southern rural industrial development. The first volume examines public and private factors that influence investment decisions in new technologies and the outcomes of those decisions; effects of automation on employment and the workplace; outcomes of…
Reviving the Rural Factory: Automation and Work in the South. Volumes 1 and 2.
ERIC Educational Resources Information Center
Rosenfeld, Stuart A.; And Others
These two volumes examine how the public sector can help revitalize southern rural counties adversely affected by global competition and technological advances. The first volume examines public and private factors that influence investment decisions in new technologies and outcomes of those decisions; effects of automation on employment and the…
Intelligent Control in Automation Based on Wireless Traffic Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurt Derr; Milos Manic
2007-09-01
Wireless technology is a central component of many factory automation infrastructures in both the commercial and government sectors, providing connectivity among various components in industrial realms (distributed sensors, machines, mobile process controllers). However wireless technologies provide more threats to computer security than wired environments. The advantageous features of Bluetooth technology resulted in Bluetooth units shipments climbing to five million per week at the end of 2005 [1, 2]. This is why the real-time interpretation and understanding of Bluetooth traffic behavior is critical in both maintaining the integrity of computer systems and increasing the efficient use of this technology in control type applications. Although neuro-fuzzy approaches have been applied to wireless 802.11 behavior analysis in the past, a significantly different Bluetooth protocol framework has not been extensively explored using this technology. This paper presents a new neurofuzzy traffic analysis algorithm of this still new territory of Bluetooth traffic. Further enhancements of this algorithm are presented along with the comparison against the traditional, numerical approach. Through test examples, interesting Bluetooth traffic behavior characteristics were captured, and the comparative elegance of this computationally inexpensive approach was demonstrated. This analysis can be used to provide directions for future development and use of this prevailing technology in various control type applications, as well as making the use of it more secure.
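As a toy illustration of the fuzzy half of a neuro-fuzzy classifier, the sketch below assigns packet inter-arrival times to linguistic classes via triangular membership functions. The class names and breakpoints are invented; the paper's actual rule base and its neural component are not reproduced here.

```python
def triangular(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical fuzzy sets over packet inter-arrival time in milliseconds.
SETS = {"short": (0.0, 0.0, 10.0),
        "medium": (5.0, 15.0, 25.0),
        "long": (20.0, 40.0, 40.0)}

def classify(interval_ms):
    """Return the winning linguistic label and all membership degrees."""
    memberships = {name: triangular(interval_ms, *abc) for name, abc in SETS.items()}
    return max(memberships, key=memberships.get), memberships

label, degrees = classify(3.0)
print(label)  # 'short'
```

In a full neuro-fuzzy system the breakpoints and rule weights would be tuned from captured traffic rather than fixed by hand as they are here.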
Machine vision for various manipulation tasks
NASA Astrophysics Data System (ADS)
Domae, Yukiyasu
2017-03-01
Bin-picking, re-grasping, pick-and-place, kitting, etc.: there are many manipulation tasks in the automation of factories, warehouses, and similar facilities. The main problem in automating them is that the target objects (items/parts) have various shapes, weights, and surface materials. In my talk, I will show the latest machine vision systems and algorithms that address this problem.
Intelligent robot trends for factory automation
NASA Astrophysics Data System (ADS)
Hall, Ernest L.
1997-09-01
An intelligent robot is a remarkably useful combination of a manipulator, sensors and controls. The use of these machines in factory automation can improve productivity, increase product quality and improve competitiveness. This paper presents a discussion of recent economic and technical trends. The robotics industry now has a billion-dollar market in the U.S. and is growing. Feasibility studies are presented which also show unaudited healthy rates of return for a variety of robotic applications. Technically, the machines are faster, cheaper, more repeatable, more reliable and safer. The knowledge base of inverse kinematic and dynamic solutions and intelligent controls is increasing. More attention is being given by industry to robots, vision and motion controls. New areas of usage are emerging for service robots, remote manipulators and automated guided vehicles. However, the road from inspiration to successful application is still long and difficult, often taking decades to achieve a new product. More cooperation between government, industry and universities is needed to speed the development of intelligent robots that will benefit both industry and society.
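The "knowledge base of inverse kinematic and dynamic solutions" mentioned above can be illustrated with the textbook closed-form case: a minimal sketch of inverse kinematics for a two-link planar arm (one of the two elbow solutions), with arbitrary link lengths chosen for the example.

```python
import math

# Closed-form inverse kinematics for a 2-link planar arm.
# Given target (x, y) and link lengths l1, l2, find joint angles.

def ik_2link(x, y, l1, l2):
    d2 = x * x + y * y
    cos_t2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)  # law of cosines
    if not -1.0 <= cos_t2 <= 1.0:
        raise ValueError("target out of reach")
    t2 = math.acos(cos_t2)                              # elbow angle
    t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2),
                                       l1 + l2 * math.cos(t2))
    return t1, t2

# Verify by forward kinematics: the arm tip should land on the target.
t1, t2 = ik_2link(1.0, 1.0, 1.0, 1.0)
fx = math.cos(t1) + math.cos(t1 + t2)
fy = math.sin(t1) + math.sin(t1 + t2)
print(round(fx, 6), round(fy, 6))  # → 1.0 1.0
```

Real industrial arms have six or more joints and require either more elaborate closed forms or numerical solvers, but the structure (geometry in, joint angles out, checked by forward kinematics) is the same.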
Automated production of plant-based vaccines and pharmaceuticals.
Wirz, Holger; Sauer-Budge, Alexis F; Briggs, John; Sharpe, Aaron; Shu, Sudong; Sharon, Andre
2012-12-01
A fully automated "factory" was developed that uses tobacco plants to produce large quantities of vaccines and other therapeutic biologics within weeks. This first-of-a-kind factory takes advantage of a plant viral vector technology to produce specific proteins within the leaves of rapidly growing plant biomass. The factory's custom-designed robotic machines plant seeds, nurture the growing plants, introduce a viral vector that directs the plant to produce a target protein, and harvest the biomass once the target protein has accumulated in the plants, all in compliance with Food and Drug Administration (FDA) guidelines (e.g., current Good Manufacturing Practices). The factory was designed to be time, cost, and space efficient. The plants are grown in custom multiplant trays. Robots ride up and down a track, servicing the plants and delivering the trays from the lighted, irrigated growth modules to each processing station as needed. Using preprogrammed robots and processing equipment eliminates the need for human contact, preventing potential contamination of the process and economizing the operation. To quickly produce large quantities of protein-based medicines, we transformed a laboratory-based biological process and scaled it into an industrial process. This enables quick, safe, and cost-effective vaccine production that would be required in case of a pandemic.
CIM's bridge from CADD to CAM: Data management requirements for manufacturing engineering
NASA Technical Reports Server (NTRS)
Ford, S. J.
1984-01-01
Manufacturing engineering represents the crossroads of technical data management in a Computer Integrated Manufacturing (CIM) environment. Process planning, numerical control programming, and tool design are the key functions which translate information from "as engineered" to "as assembled." In order to transition data from engineering to manufacturing, it is necessary to introduce a series of product interpretations which add interim technical parameters. The current automation of the product definition and the production process places manufacturing engineering in the center of CAD/CAM, with the responsibility of communicating design data to the factory floor via a manufacturing model of the data. A close look at data management requirements for manufacturing engineering is necessary in order to establish the overall specifications for CADD output, CAM input, and CIM integration. The functions and issues associated with the orderly evolution of computer aided engineering and manufacturing are examined.
Cyber physical systems role in manufacturing technologies
NASA Astrophysics Data System (ADS)
Al-Ali, A. R.; Gupta, Ragini; Nabulsi, Ahmad Al
2018-04-01
Empowered by recent developments in single System-on-Chip, Internet of Things, and cloud computing technologies, cyber physical systems are evolving as a major controller during and after the product manufacturing process. In addition to their real physical space, cyber products nowadays have a virtual space. A product's virtual space is a digital twin that is attached to it to enable manufacturers and their clients to better manufacture, monitor, maintain, and operate it throughout its life cycle, i.e., from the product manufacturing date, through operation, to the end of its lifespan. Each product is equipped with a tiny microcontroller that has a unique identification number, an access code, and WiFi connectivity, so that it can be accessed anytime and anywhere during its life cycle. This paper presents the cyber physical systems architecture and its role in manufacturing. It also highlights the role of Internet of Things and cloud computing in industrial manufacturing and factory automation.
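The digital-twin idea above reduces, at its simplest, to a virtual record addressed by the product's unique ID that accumulates telemetry over the product's life cycle. A toy sketch, with class and field names invented for illustration (not an API from the paper):

```python
# Minimal digital-twin registry: one twin record per physical product,
# keyed by the product's unique ID. Names and fields are hypothetical.

from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    product_id: str
    state: dict = field(default_factory=dict)    # latest known values
    history: list = field(default_factory=list)  # full telemetry log

    def update(self, reading: dict):
        """Record telemetry pushed by the product's microcontroller."""
        self.state.update(reading)
        self.history.append(reading)

registry = {}

def twin_for(product_id: str) -> DigitalTwin:
    """Look up (or lazily create) the twin for a product ID."""
    return registry.setdefault(product_id, DigitalTwin(product_id))

twin = twin_for("SN-000123")
twin.update({"temperature_c": 41.5, "hours_run": 102})
print(twin_for("SN-000123").state["hours_run"])  # → 102
```

A production system would sit behind a cloud or IoT message broker rather than an in-process dictionary, but the lookup-by-unique-ID pattern is the same.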
Fully Automated Data Collection Using PAM and the Development of PAM/SPACE Reversible Cassettes
NASA Astrophysics Data System (ADS)
Hiraki, Masahiko; Watanabe, Shokei; Chavas, Leonard M. G.; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Wakatsuki, Soichi; Fujihashi, Masahiro; Miki, Kunio; Baba, Seiki; Ueno, Go; Yamamoto, Masaki; Suzuki, Mamoru; Nakagawa, Atsushi; Watanabe, Nobuhisa; Tanaka, Isao
2010-06-01
To remotely control and automatically collect data in high-throughput X-ray data collection experiments, the Structural Biology Research Center at the Photon Factory (PF) developed and installed the sample exchange robot PAM (PF Automated Mounting system) at the PF macromolecular crystallography beamlines BL-5A, BL-17A, AR-NW12A and AR-NE3A. We developed and installed software that manages the flow of the automated X-ray experiments: sample exchanges, loop-centering and X-ray diffraction data collection. The fully automated data collection function has been available since February 2009. To identify sample cassettes, PAM employs a two-dimensional bar code reader. New beamlines, BL-1A at the Photon Factory and BL32XU at SPring-8, are currently under construction as part of the Targeted Proteins Research Program (TPRP) of the Ministry of Education, Culture, Sports, Science and Technology of Japan. However, different robots, PAM and SPACE (SPring-8 Precise Automatic Cryo-sample Exchanger), will be installed at BL-1A and BL32XU, respectively. For the convenience of the users of both facilities, pins and cassettes compatible with both PAM and SPACE are being developed as part of the TPRP.
Quantum computation with realistic magic-state factories
NASA Astrophysics Data System (ADS)
O'Gorman, Joe; Campbell, Earl T.
2017-03-01
Leading approaches to fault-tolerant quantum computation dedicate a significant portion of the hardware to computational factories that churn out high-fidelity ancillas called magic states. Consequently, efficient and realistic factory design is of paramount importance. Here we present the most detailed resource assessment to date of magic-state factories within a surface code quantum computer, along the way introducing a number of techniques. We show that the block codes of Bravyi and Haah [Phys. Rev. A 86, 052329 (2012), 10.1103/PhysRevA.86.052329] have been systematically undervalued; we track correlated errors both numerically and analytically, providing fidelity estimates without appeal to the union bound. We also introduce a subsystem code realization of these protocols with constant time and low ancilla cost. Additionally, we confirm that magic-state factories have space-time costs that scale as a constant factor of surface code costs. We find that the magic-state factory required for postclassical factoring can be as small as 6.3 million data qubits, ignoring ancilla qubits, assuming 10-4 error gates and the availability of long-range interactions.
Pai, Yun Suen; Yap, Hwa Jen; Md Dawal, Siti Zawiah; Ramesh, S.; Phoon, Sin Ye
2016-01-01
This study presents a modular-based implementation of augmented reality to provide an immersive experience in learning or teaching the planning phase, control system, and machining parameters of a fully automated work cell. The architecture of the system consists of three code modules that can operate independently or be combined to create a complete system that is able to guide engineers from the layout planning phase to the prototyping of the final product. The layout planning module determines the best possible arrangement in a layout for the placement of various machines, in this case a conveyor belt for transportation, a robot arm for pick-and-place operations, and a computer numerical control milling machine to generate the final prototype. The robotic arm module simulates the pick-and-place operation offline from the conveyor belt to a computer numerical control (CNC) machine utilising collision detection and inverse kinematics. Finally, the CNC module performs virtual machining based on the Uniform Space Decomposition method and axis-aligned bounding box collision detection. The conducted case study revealed that, given the situation, a semicircular arrangement is desirable, while the pick-and-place system and the final generated G-code produced maximum deviations of 3.83 mm and 5.8 mm, respectively. PMID:27271840
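The axis-aligned bounding box (AABB) test named in the abstract is a standard primitive: two boxes intersect exactly when their intervals overlap on every axis. A minimal sketch, where the box representation ((min corner), (max corner)) is an assumption made for the example:

```python
# AABB overlap test in 3D: boxes collide iff they overlap on x, y and z.
# Box format ((min_x, min_y, min_z), (max_x, max_y, max_z)) is assumed.

def aabb_collide(a, b):
    (amin, amax), (bmin, bmax) = a, b
    return all(amin[i] <= bmax[i] and bmin[i] <= amax[i] for i in range(3))

tool  = ((0, 0, 0), (2, 2, 2))
part  = ((1, 1, 1), (3, 3, 3))   # overlaps tool on every axis
clear = ((5, 5, 5), (6, 6, 6))   # separated on every axis
print(aabb_collide(tool, part), aabb_collide(tool, clear))  # → True False
```

Because the test is a handful of comparisons, it is commonly used as a cheap broad-phase filter before more exact (and expensive) mesh-level collision checks in simulated machining.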
NASA Technical Reports Server (NTRS)
Ciciora, J. A.; Leonard, S. D.; Johnson, N.; Amell, J.
1984-01-01
In order to derive general design guidelines for automated systems, a study was conducted on the utilization and acceptance of existing automated systems as currently employed in several commercial fields. Four principal study areas were investigated by means of structured interviews and, in some cases, questionnaires. The study areas were aviation, including both scheduled airline and general commercial aviation; process control and factory applications; office automation; and automation in the power industry. The results of over eighty structured interviews were analyzed and responses categorized as various human factors issues for use by both designers and users of automated equipment. These guidelines address such items as general physical features of automated equipment; personnel orientation, acceptance, and training; and both personnel and system reliability.
ERIC Educational Resources Information Center
Albus, James S.
1984-01-01
Spectacular advances in microcomputers are forging new technological frontiers in robotics. For example, many factories will be totally automated. Economic implications of the new technology of robotics for the future are examined. (RM)
Automated Scheduling Via Artificial Intelligence
NASA Technical Reports Server (NTRS)
Biefeld, Eric W.; Cooper, Lynne P.
1991-01-01
Artificial-intelligence software that automates scheduling developed in Operations Mission Planner (OMP) research project. Software used in both generation of new schedules and modification of existing schedules in view of changes in tasks and/or available resources. Approach based on iterative refinement. Although project focused upon scheduling of operations of scientific instruments and other equipment aboard spacecraft, also applicable to such terrestrial problems as scheduling production in factory.
NASA Technical Reports Server (NTRS)
Hackwood, Susan; Belinski, Steven E.; Beni, Gerardo
1989-01-01
The discipline of vacuum mechatronics is defined as the design and development of vacuum-compatible computer-controlled mechanisms for manipulating, sensing and testing in a vacuum environment. The importance of vacuum mechatronics is growing with the increased application of vacuum in space studies and in manufacturing for material processing, medicine, microelectronics, emission studies, lyophilisation (freeze drying) and packaging. The quickly developing field of vacuum mechatronics will also be the driving force for the realization of an advanced era of totally enclosed clean manufacturing cells. High technology manufacturing has increasingly demanding requirements for precision manipulation, in situ process monitoring and contamination-free environments. To remove the contamination problems associated with human workers, the tendency in many manufacturing processes is to move towards total automation. This will become a requirement in the near future for, for example, microelectronics manufacturing. Automation in ultra-clean manufacturing environments is evolving into the concept of self-contained and fully enclosed manufacturing. A Self Contained Automated Robotic Factory (SCARF) is being developed as a flexible research facility for totally enclosed manufacturing. The construction and successful operation of a SCARF will provide a novel, flexible, self-contained, clean, vacuum manufacturing environment. SCARF also requires very high reliability and intelligent control. The trends in vacuum mechatronics and some of the key research issues are reviewed.
An Automated and Continuous Plant Weight Measurement System for Plant Factory
Chen, Wei-Tai; Yeh, Yu-Hui F.; Liu, Ting-Yu; Lin, Ta-Te
2016-01-01
In plant factories, plants are usually cultivated in nutrient solution under a controllable environment. Plant quality and growth are closely monitored and precisely controlled. For plant growth evaluation, plant weight is an important and commonly used indicator. Traditional plant weight measurements are destructive and laborious. In order to measure and record the plant weight during plant growth, an automated measurement system was designed and developed herein. The weight measurement system comprises a weight measurement device and an imaging system. The weight measurement device consists of a top disk, a bottom disk, a plant holder and a load cell. The load cell with a resolution of 0.1 g converts the plant weight on the plant holder disk to an analog electrical signal for a precise measurement. The top disk and bottom disk are designed to be durable for different plant sizes, so plant weight can be measured continuously throughout the whole growth period, without hindering plant growth. The results show that plant weights measured by the weight measurement device are highly correlated with the weights estimated by the stereo-vision imaging system; hence, plant weight can be measured by either method. The weight growth of selected vegetables growing in the National Taiwan University plant factory were monitored and measured using our automated plant growth weight measurement system. The experimental results demonstrate the functionality, stability and durability of this system. The information gathered by this weight system can be valuable and beneficial for hydroponic plants monitoring research and agricultural research applications. PMID:27066040
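The two measurement paths described above (a load cell quantized to the stated 0.1 g resolution, and stereo-vision weight estimates validated against it) can be sketched as follows. The sample weights are invented for illustration; the correlation function is ordinary Pearson correlation, not the authors' specific analysis.

```python
# Sketch: quantize raw weights to the load cell's 0.1 g resolution and
# correlate scale readings with vision-based estimates (sample data is
# hypothetical).

def quantize(grams, resolution=0.1):
    """Quantize a raw weight to the load cell's resolution."""
    return round(grams / resolution) * resolution

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

scale  = [quantize(w) for w in [12.34, 18.91, 25.07, 33.62]]  # load cell
vision = [12.8, 18.2, 25.9, 32.9]                             # stereo vision
print(round(pearson(scale, vision), 3))  # high agreement between paths
```

A correlation near 1 is what licenses the paper's claim that either method can stand in for the other during continuous, non-destructive monitoring.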
Improvement of an automated protein crystal exchange system PAM for high-throughput data collection
Hiraki, Masahiko; Yamada, Yusuke; Chavas, Leonard M. G.; Wakatsuki, Soichi; Matsugaki, Naohiro
2013-01-01
Photon Factory Automated Mounting system (PAM) protein crystal exchange systems are available at the following Photon Factory macromolecular beamlines: BL-1A, BL-5A, BL-17A, AR-NW12A and AR-NE3A. The beamline AR-NE3A has been constructed for high-throughput macromolecular crystallography and is dedicated to structure-based drug design. The PAM liquid-nitrogen Dewar can store a maximum of three SSRL cassettes; therefore, users have to interrupt their experiments and replace the cassettes when using four or more of them during their beam time. An investigation found that four or more cassettes were used at AR-NE3A alone. For continuous automated data collection, the size of the liquid-nitrogen Dewar for the AR-NE3A PAM was increased, doubling the capacity. To check the calibration with the new Dewar and the cassette stand, calibration experiments were repeatedly performed. Compared with the current system, the parameters of the new system are shown to be stable. PMID:24121334
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johanna Oxstrand; Katya Le Blanc
The Human-Automation Collaboration (HAC) research effort is a part of the Department of Energy (DOE) sponsored Advanced Small Modular Reactor (AdvSMR) program conducted at Idaho National Laboratory (INL). The DOE AdvSMR program focuses on plant design and management, reduction of capital costs as well as plant operations and maintenance (O&M) costs, and factory production cost benefits.
Research on the ITOC based scheduling system for ship piping production
NASA Astrophysics Data System (ADS)
Li, Rui; Liu, Yu-Jun; Hamada, Kunihiro
2010-12-01
Manufacturing of ship piping systems is one of the major production activities in shipbuilding. The schedule of pipe production has an important impact on the master schedule of shipbuilding. In this research, the ITOC concept was introduced to solve the scheduling problems of a piping factory, and an intelligent scheduling system was developed. The system, in which a product model, an operation model, a factory model, and a knowledge database of piping production were integrated, automated the planning process and production scheduling. Details of the above points were discussed. Moreover, an application of the system in a piping factory, which achieved a higher level of performance as measured by tardiness, lead time, and inventory, was demonstrated.
NASA Technical Reports Server (NTRS)
Daiello, R. V.
1977-01-01
A general technology assessment and manufacturing cost analysis is presented. A near-term (1982) factory design is described, and the results of an experimental production study for the large-scale production of flat-panel silicon solar-cell arrays are detailed.
Joelsson, Daniel; Moravec, Phil; Troutman, Matthew; Pigeon, Joseph; DePhillips, Pete
2008-08-20
Transferring manual ELISAs to automated platforms requires optimizing the assays for each particular robotic platform. These optimization experiments are often time consuming and difficult to perform using a traditional one-factor-at-a-time strategy. In this manuscript we describe the development of an automated process using statistical design of experiments (DOE) to quickly optimize immunoassays for precision and robustness on the Tecan EVO liquid handler. By using fractional factorials and a split-plot design, five incubation time variables and four reagent concentration variables can be optimized in a short period of time.
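The design-of-experiments approach above screens many factors in few runs by using fractional factorials. As a concrete (generic) illustration, a two-level 2^(5-1) design for five factors can be built with the common generator E = ABCD; this is a standard construction, not the authors' specific design matrix, and the factor labels are placeholders.

```python
# Generate a 2^(5-1) fractional factorial design: 16 runs for 5 factors
# at levels -1/+1, with the fifth factor aliased as E = A*B*C*D.

from itertools import product

def fractional_factorial_2_5_1():
    """Return the 16 runs (a, b, c, d, e) with e = a*b*c*d."""
    return [(a, b, c, d, a * b * c * d)
            for a, b, c, d in product((-1, 1), repeat=4)]

design = fractional_factorial_2_5_1()
print(len(design))  # → 16 (instead of the 2**5 = 32 full-factorial runs)
print(all(a * b * c * d * e == 1 for a, b, c, d, e in design))  # → True
```

Halving the run count is exactly what makes such screening designs attractive when each run is a time-consuming robotic immunoassay.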
Intelligent robot trends and predictions for the new millennium
NASA Astrophysics Data System (ADS)
Hall, Ernest L.; Mundhenk, Terrell N.
1999-08-01
An intelligent robot is a remarkably useful combination of a manipulator, sensors and controls. The current use of these machines in outer space, medicine, hazardous materials, defense applications and industry is being pursued with vigor but little funding. In factory automation such robotics machines can improve productivity, increase product quality and improve competitiveness. The computer and the robot have both been developed during recent times. The intelligent robot combines both technologies and requires a thorough understanding and knowledge of mechatronics. In honor of the new millennium, this paper will present a discussion of futuristic trends and predictions. However, in keeping with technical tradition, a new technique for 'Follow the Leader' will also be presented in the hope of it becoming a new, useful and non-obvious technique.
Automated CD-SEM recipe creation technology for mass production using CAD data
NASA Astrophysics Data System (ADS)
Kawahara, Toshikazu; Yoshida, Masamichi; Tanaka, Masashi; Ido, Sanyu; Nakano, Hiroyuki; Adachi, Naokaka; Abe, Yuichi; Nagatomo, Wataru
2011-03-01
Critical Dimension Scanning Electron Microscope (CD-SEM) recipe creation requires sample preparation for pattern-matching registration, and recipe creation on the CD-SEM using that sample, which hinders the reduction of test production cost and time in semiconductor manufacturing factories. From the perspective of cost reduction and improvement of test production efficiency, automated CD-SEM recipe creation without sample preparation and manual operation has become important in the production lines. For automated CD-SEM recipe creation, we have introduced RecipeDirector (RD), which enables recipe creation by using Computer-Aided Design (CAD) data and text data that includes measurement information. We have developed a system that automatically creates the CAD data and the text data necessary for recipe creation on RD; and, to eliminate manual operation, we have enhanced RD so that all measurement information can be specified in the text data. As a result, we have established an automated CD-SEM recipe creation system without sample preparation or manual operation. For the introduction of the CD-SEM recipe creation system using RD to the production lines, the accuracy of the pattern matching was an issue. The design templates for matching, created from the CAD data, differ in appearance from the SEM images. Thus, a robust pattern-matching algorithm that accounts for this shape difference was needed. Adding image processing of the matching templates and shape processing of the CAD patterns in the lower layer has enabled robust pattern matching. This paper describes the automated CD-SEM recipe creation technology for the production lines, without sample preparation or manual operation, using RD as applied in the Sony Semiconductor Kyusyu Corporation Kumamoto Technology Center (SCK Corporation Kumamoto TEC).
Engineering biological systems using automated biofoundries
Chao, Ran; Mishra, Shekhar; Si, Tong; Zhao, Huimin
2017-01-01
Engineered biological systems such as genetic circuits and microbial cell factories have promised to solve many challenges in the modern society. However, the artisanal processes of research and development are slow, expensive, and inconsistent, representing a major obstacle in biotechnology and bioengineering. In recent years, biological foundries or biofoundries have been developed to automate design-build-test engineering cycles in an effort to accelerate these processes. This review summarizes the enabling technologies for such biofoundries as well as their early successes and remaining challenges. PMID:28602523
Midlands Teaching Factory, LTD.
ERIC Educational Resources Information Center
Midlands Technical Coll., Columbia, SC.
In 1987, Midlands Technical College (MTC), in Columbia, South Carolina, initiated a Computer Integrated Manufacturing (CIM) project, the Midlands Teaching Factory, LTD, which integrated various college departments with the goal of manufacturing a high quality, saleable product. The faculty developed a teaching factory model which was designed to…
MapFactory - Towards a mapping design pattern for big geospatial data
NASA Astrophysics Data System (ADS)
Rautenbach, Victoria; Coetzee, Serena
2018-05-01
With big geospatial data emerging, cartographers and geographic information scientists have to find new ways of dealing with the volume, variety, velocity, and veracity (4Vs) of the data. This requires the development of tools that allow processing, filtering, analysing, and visualising of big data through multidisciplinary collaboration. In this paper, we present the MapFactory design pattern that will be used for the creation of different maps according to the (input) design specification for big geospatial data. The design specification is based on elements from ISO19115-1:2014 Geographic information - Metadata - Part 1: Fundamentals that would guide the design and development of the map or set of maps to be produced. The results of the exploratory research suggest that the MapFactory design pattern will help with software reuse and communication. The MapFactory design pattern will aid software developers to build the tools that are required to automate map making with big geospatial data. The resulting maps would assist cartographers and others to make sense of big geospatial data.
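The "factory" in MapFactory refers to the classic factory design pattern: a registry maps a map type named in the (input) design specification to the component that renders it. A minimal sketch in that spirit; the map types, class names, and spec fields below are hypothetical, not taken from the paper.

```python
# Factory design pattern sketch: the design spec names a map type, and
# the factory instantiates the matching renderer. All names are invented.

class ChoroplethMap:
    def render(self, spec):
        return f"choropleth of {spec['attribute']}"

class HeatMap:
    def render(self, spec):
        return f"heat map of {spec['attribute']}"

class MapFactory:
    _registry = {"choropleth": ChoroplethMap, "heat": HeatMap}

    @classmethod
    def create(cls, spec):
        """Instantiate the map type requested by the design spec."""
        return cls._registry[spec["type"]]()

spec = {"type": "heat", "attribute": "population density"}
print(MapFactory.create(spec).render(spec))  # → heat map of population density
```

The pattern's appeal for big-data mapping is that new map types can be registered without touching the pipeline that consumes the design specification, which is the software-reuse benefit the abstract alludes to.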
Preparing Students for "The End of Work".
ERIC Educational Resources Information Center
Rifkin, Jeremy
1997-01-01
With workerless factories, virtual companies, and shrinking governments becoming reality, nations will be hard-pressed to employ millions of "surplus" young people in an increasingly automated global economy. An elitist knowledge sector cannot accommodate enough displaced workers. To advance the goals of civil education, educators must…
Translational Bounds for Factorial n and the Factorial Polynomial
ERIC Educational Resources Information Center
Mahmood, Munir; Edwards, Phillip
2009-01-01
During the period 1729-1826, Bernoulli, Euler, Goldbach and Legendre developed expressions for defining and evaluating "n"! and the related gamma function. Expressions related to "n"! and the gamma function are a common feature in computer science and engineering applications. In the modern computer age, two common tests to…
Electronic Data Interchange in Procurement
1990-04-01
contract management and order processing systems. This conversion of automated information to paper and back to automated form is not only slow and...automated purchasing computer and the contractor’s order processing computer through telephone lines, as illustrated in Figure 1-1. Computer-to-computer...into the contractor’s order processing or contract management system. This approach - converting automated information to paper and back to automated
A Fog Computing and Cloudlet Based Augmented Reality System for the Industry 4.0 Shipyard.
Fernández-Caramés, Tiago M; Fraga-Lamas, Paula; Suárez-Albela, Manuel; Vilar-Montesinos, Miguel
2018-06-02
Augmented Reality (AR) is one of the key technologies pointed out by Industry 4.0 as a tool for enhancing the next generation of automated and computerized factories. AR can also help shipbuilding operators, since they usually need to interact with information (e.g., product datasheets, instructions, maintenance procedures, quality control forms) that could be handled easily and more efficiently through AR devices. This is the reason why Navantia, one of the 10 largest shipbuilders in the world, is studying the application of AR (among other technologies) in different shipyard environments in a project called "Shipyard 4.0". This article presents Navantia's industrial AR (IAR) architecture, which is based on cloudlets and on the fog computing paradigm. Both technologies are ideal for supporting physically-distributed, low-latency and QoS-aware applications that decrease the network traffic and the computational load of traditional cloud computing systems. The proposed IAR communications architecture is evaluated in real-world scenarios with payload sizes according to demanding Microsoft HoloLens applications and when using a cloud, a cloudlet and a fog computing system. The results show that, in terms of response delay, the fog computing system is the fastest when transferring small payloads (less than 128 KB), while for larger file sizes, the cloudlet solution is faster than the others. Moreover, under high loads (with many concurrent IAR clients), the cloudlet in some cases is more than four times faster than the fog computing system in terms of response delay.
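The reported crossover suggests a simple payload-based routing rule. The sketch below is an illustration of that finding only, not part of Navantia's IAR architecture; the 128 KB threshold mirrors the figure quoted in the abstract, and the function name is invented:

```python
# Illustrative tier selection for an IAR request, based on the paper's finding
# that fog computing had the lowest response delay for payloads under 128 KB
# while a cloudlet was faster for larger transfers. Not part of the real system.
FOG_THRESHOLD_BYTES = 128 * 1024

def choose_tier(payload_bytes: int) -> str:
    if payload_bytes < FOG_THRESHOLD_BYTES:
        return "fog"       # lowest response delay for small payloads
    return "cloudlet"      # faster than fog/cloud for larger file sizes

print(choose_tier(64 * 1024))    # fog
print(choose_tier(512 * 1024))   # cloudlet
```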
Robotics: Past, Present, and Future.
ERIC Educational Resources Information Center
Dunne, Maurice J.
Robots are finally receiving widespread attention as a means to realize the goal of automating factories. In the 1960s, robot use was limited by unfavorable acquisition and operating costs, and affordable control technology restricted applications to relatively simple jobs. During the 1970s, productivity of manufacturing organizations declined…
Automated workflows for modelling chemical fate, kinetics and toxicity.
Sala Benito, J V; Paini, Alicia; Richarz, Andrea-Nicole; Meinl, Thorsten; Berthold, Michael R; Cronin, Mark T D; Worth, Andrew P
2017-12-01
Automation is universal in today's society, from operating machinery and factory processes to self-parking automobile systems. While these examples show the efficiency and effectiveness of automated mechanical processes, automated procedures that support the chemical risk assessment process are still in their infancy. Future human safety assessments will rely increasingly on the use of automated models, such as physiologically based kinetic (PBK) and dynamic models and the virtual cell based assay (VCBA). These biologically-based models will be coupled with chemistry-based prediction models that also automate the generation of key input parameters such as physicochemical properties. The development of automated software tools is an important step in harmonising and expediting the chemical safety assessment process. In this study, we illustrate how the KNIME Analytics Platform can be used to provide a user-friendly graphical interface for these biokinetic models, such as PBK models and VCBA, which simulate the fate of chemicals in vivo within the body and in vitro test systems respectively. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Employment Opportunities for the Handicapped in Programmable Automation.
ERIC Educational Resources Information Center
Swift, Richard; Leneway, Robert
A Computer Integrated Manufacturing System may make it possible for severely disabled people to custom design, machine, and manufacture either wood or metal parts. Programmable automation merges computer aided design, computer aided manufacturing, computer aided engineering, and computer integrated manufacturing systems with automated production…
NASA Astrophysics Data System (ADS)
Illing, Gerd; Saenger, Wolfram; Heinemann, Udo
2000-06-01
The Protein Structure Factory will be established to characterize proteins encoded by human genes or cDNAs, which will be selected by criteria of potential structural novelty or medical or biotechnological usefulness. It represents an integrative approach to structure analysis combining bioinformatics techniques, automated gene expression and purification of gene products, generation of a biophysical fingerprint of the proteins and the determination of their three-dimensional structures either by NMR spectroscopy or by X-ray diffraction. The use of synchrotron radiation will be crucial to the Protein Structure Factory: high brilliance and tunable wavelengths are prerequisites for fast data collection, the use of small crystals and multiwavelength anomalous diffraction (MAD) phasing. With the opening of BESSY II, direct access to a third-generation XUV storage ring source with excellent conditions is available nearby. An insertion device with two MAD beamlines and one constant energy station will be set up by 2001.
The EB factory project. II. Validation with the Kepler field in preparation for K2 and TESS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parvizi, Mahmoud; Paegert, Martin; Stassun, Keivan G., E-mail: mahmoud.parvizi@vanderbilt.edu
Large repositories of high precision light curve data, such as the Kepler data set, provide the opportunity to identify astrophysically important eclipsing binary (EB) systems in large quantities. However, the rate of classical “by eye” human analysis restricts complete and efficient mining of EBs from these data using classical techniques. To prepare for mining EBs from the upcoming K2 mission as well as other current missions, we developed an automated end-to-end computational pipeline—the Eclipsing Binary Factory (EBF)—that automatically identifies EBs and classifies them into morphological types. The EBF has been previously tested on ground-based light curves. To assess the performance of the EBF in the context of space-based data, we apply the EBF to the full set of light curves in the Kepler “Q3” Data Release. We compare the EBs identified from this automated approach against the human generated Kepler EB Catalog of ∼2600 EBs. When we require EB classification with ⩾90% confidence, we find that the EBF correctly identifies and classifies eclipsing contact (EC), eclipsing semi-detached (ESD), and eclipsing detached (ED) systems with a false positive rate of only 4%, 4%, and 8%, while complete to 64%, 46%, and 32%, respectively. When classification confidence is relaxed, the EBF identifies and classifies ECs, ESDs, and EDs with a slightly higher false positive rate of 6%, 16%, and 8%, while much more complete to 86%, 74%, and 62%, respectively. Through our processing of the entire Kepler “Q3” data set, we also identify 68 new candidate EBs that may have been missed by the human generated Kepler EB Catalog. We discuss the EBF's potential application to light curve classification for periodic variable stars more generally for current and upcoming surveys like K2 and the Transiting Exoplanet Survey Satellite.
The Eb Factory Project. Ii. Validation With the Kepler Field in Preparation for K2 and Tess
NASA Astrophysics Data System (ADS)
Parvizi, Mahmoud; Paegert, Martin; Stassun, Keivan G.
2014-12-01
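The completeness and false-positive figures quoted for the EBF can be computed per morphological class from labelled predictions. The sketch below shows generic metric definitions consistent with the abstract's usage; the function name and the toy labels are invented, and nothing here reproduces the EBF pipeline itself:

```python
# Per-class completeness and false-positive (contamination) rate for a
# classifier, illustrated on invented EB morphological labels (EC/ESD/ED).
def class_metrics(truth, predicted, cls):
    tp = sum(1 for t, p in zip(truth, predicted) if t == cls and p == cls)
    fp = sum(1 for t, p in zip(truth, predicted) if t != cls and p == cls)
    fn = sum(1 for t, p in zip(truth, predicted) if t == cls and p != cls)
    completeness = tp / (tp + fn)         # fraction of true members recovered
    false_positive_rate = fp / (tp + fp)  # contamination of the predicted class
    return completeness, false_positive_rate

# Hypothetical toy labels, purely for demonstration
truth     = ["EC", "EC", "ESD", "ED", "EC", "ED"]
predicted = ["EC", "ESD", "ESD", "ED", "EC", "EC"]
c, f = class_metrics(truth, predicted, "EC")
print(round(c, 2), round(f, 2))
```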
NASA Astrophysics Data System (ADS)
Hiraki, M.; Yamada, Y.; Chavas, L. M. G.; Matsugaki, N.; Igarashi, N.; Wakatsuki, S.
2013-03-01
To achieve fully-automated and/or remote data collection in high-throughput X-ray experiments, the Structural Biology Research Centre at the Photon Factory (PF) has installed the PF automated mounting system (PAM) of sample exchange robots at PF macromolecular crystallography beamlines BL-1A, BL-5A, BL-17A, AR-NW12A and AR-NE3A. We are upgrading the experimental systems, including the PAM, for stable and efficient operation. To prevent human error in automated data collection, we installed a two-dimensional barcode reader for identification of the cassettes and sample pins. Because no liquid nitrogen pipeline is installed in the PF experimental hutch, users commonly add liquid nitrogen using a small Dewar. To address this issue, an automated liquid nitrogen filling system that links a 100-liter tank to the robot Dewar has been installed on the PF macromolecular beamline. Here we describe this new implementation, as well as future prospects.
Visiting An "Egg Factory" on the Farm: A Resource Unit.
ERIC Educational Resources Information Center
Ediger, Marlow
The resource unit indicates how elementary school teachers can use contemporary poultry farming to teach the concepts of change and specialization in American society and to show the effects of automation of American farms. The unit lists general objectives for students: to develop an understanding of farm specialization, especially in egg…
Rešková, Z; Koreňová, J; Kuchta, T
2014-04-01
A total of 256 isolates of Staphylococcus aureus were isolated from 98 samples (34 swabs and 64 food samples) obtained from small or medium meat- and cheese-processing plants in Slovakia. The strains were genotypically characterized by multiple locus variable number of tandem repeats analysis (MLVA), involving multiplex polymerase chain reaction (PCR) with subsequent separation of the amplified DNA fragments by automated flow-through gel electrophoresis. With the panel of isolates, MLVA produced 31 profile types, which was a sufficient discrimination to facilitate the description of spatial and temporal aspects of contamination. Further data on MLVA discrimination were obtained by typing a subpanel of strains by multiple locus sequence typing (MLST). MLVA coupled to automated electrophoresis proved to be an effective, comparatively fast and inexpensive method for tracing S. aureus contamination of food-processing factories. Subspecies genotyping of microbial contaminants in food-processing factories may facilitate identification of spatial and temporal aspects of the contamination. This may help to properly manage the process hygiene. With S. aureus, multiple locus variable number of tandem repeats analysis (MLVA) proved to be an effective method for the purpose, being sufficiently discriminative, yet comparatively fast and inexpensive. The application of automated flow-through gel electrophoresis to separation of DNA fragments produced by multiplex PCR helped to improve the accuracy and speed of the method. © 2013 The Society for Applied Microbiology.
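The discriminatory power of a typing scheme such as MLVA is conventionally summarised with Simpson's index of diversity. The sketch below applies that standard formula to invented profile counts; neither the use of this particular index nor the numbers are taken from the study itself:

```python
# Simpson's index of diversity D for a typing method: the probability that two
# randomly chosen isolates fall into different profile types. Counts are invented.
from collections import Counter

def simpsons_diversity(profile_assignments):
    """D = 1 - sum(n_j * (n_j - 1)) / (N * (N - 1)) over profile types j."""
    n = len(profile_assignments)
    counts = Counter(profile_assignments).values()
    return 1 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# Hypothetical example: 10 isolates falling into 4 MLVA profile types
profiles = ["A", "A", "A", "B", "B", "C", "C", "C", "C", "D"]
print(round(simpsons_diversity(profiles), 3))
```

Higher D means finer discrimination between strains, which is what makes a typing method useful for tracing contamination routes.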
Engineering biological systems using automated biofoundries.
Chao, Ran; Mishra, Shekhar; Si, Tong; Zhao, Huimin
2017-07-01
Engineered biological systems such as genetic circuits and microbial cell factories have promised to solve many challenges in the modern society. However, the artisanal processes of research and development are slow, expensive, and inconsistent, representing a major obstacle in biotechnology and bioengineering. In recent years, biological foundries or biofoundries have been developed to automate design-build-test engineering cycles in an effort to accelerate these processes. This review summarizes the enabling technologies for such biofoundries as well as their early successes and remaining challenges. Copyright © 2017 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.
Training Workers for the Factory of the Future.
ERIC Educational Resources Information Center
Clancy, J. Anthony
1989-01-01
In the factory of the future, emphasis on quality and increased productivity creates a competitive advantage. People and computers work together in all major activities. Training is a major factor in creating that competitive advantage. (JOW)
Exploiting Self-organization in Bioengineered Systems: A Computational Approach.
Davis, Delin; Doloman, Anna; Podgorski, Gregory J; Vargis, Elizabeth; Flann, Nicholas S
2017-01-01
The productivity of bioengineered cell factories is limited by inefficiencies in nutrient delivery and waste and product removal. Current solution approaches explore changes in the physical configurations of the bioreactors. This work investigates the possibilities of exploiting self-organizing vascular networks to support producer cells within the factory. A computational model simulates de novo vascular development of endothelial-like cells and the resultant network functioning to deliver nutrients and extract product and waste from the cell culture. Microbial factories with vascular networks are evaluated for their scalability, robustness, and productivity compared to the cell factories without a vascular network. Initial studies demonstrate that at least an order of magnitude increase in production is possible, the system can be scaled up, and the self-organization of an efficient vascular network is robust. The work suggests that bioengineered multicellularity may offer efficiency improvements difficult to achieve with physical engineering approaches.
Automated multiplex genome-scale engineering in yeast
Si, Tong; Chao, Ran; Min, Yuhao; Wu, Yuying; Ren, Wen; Zhao, Huimin
2017-01-01
Genome-scale engineering is indispensable in understanding and engineering microorganisms, but the current tools are mainly limited to bacterial systems. Here we report an automated platform for multiplex genome-scale engineering in Saccharomyces cerevisiae, an important eukaryotic model and widely used microbial cell factory. Standardized genetic parts encoding overexpression and knockdown mutations of >90% yeast genes are created in a single step from a full-length cDNA library. With the aid of CRISPR-Cas, these genetic parts are iteratively integrated into the repetitive genomic sequences in a modular manner using robotic automation. This system allows functional mapping and multiplex optimization on a genome scale for diverse phenotypes including cellulase expression, isobutanol production, glycerol utilization and acetic acid tolerance, and may greatly accelerate future genome-scale engineering endeavours in yeast. PMID:28469255
VizieR Online Data Catalog: Absorption velocities for 21 super-luminous SNe Ic (Liu+, 2017)
NASA Astrophysics Data System (ADS)
Liu, Y.-Q.; Modjaz, M.; Bianco, F. B.
2018-04-01
We have collected the spectra of all available super-luminous supernovae (SLSNe) Ic that have a date of maximum light published before April of 2016. These SLSNe Ic were mainly discovered and observed by the All-Sky Automated Survey for Supernovae (ASAS-SN), the Catalina Real-Time Transient Survey, the Dark Energy Survey (DES), the Hubble Space Telescope Cluster Supernova Survey, the Pan-STARRS1 Medium Deep Survey (PS1), the Public ESO Spectroscopic Survey of Transient Objects (PESSTO), the Intermediate Palomar Transient Factory (iPTF) as well as the Palomar Transient Factory (PTF), and the Supernova Legacy Survey (SNLS). See table 1. (2 data files).
Analyzing the Cohesion of English Text and Discourse with Automated Computer Tools
ERIC Educational Resources Information Center
Jeon, Moongee
2014-01-01
This article investigates the lexical and discourse features of English text and discourse with automated computer technologies. Specifically, this article examines the cohesion of English text and discourse with automated computer tools, Coh-Metrix and TEES. Coh-Metrix is a text analysis computer tool that can analyze English text and discourse…
SISYPHUS: A high performance seismic inversion factory
NASA Astrophysics Data System (ADS)
Gokhberg, Alexey; Simutė, Saulė; Boehm, Christian; Fichtner, Andreas
2016-04-01
In the recent years the massively parallel high performance computers became the standard instruments for solving the forward and inverse problems in seismology. The respective software packages dedicated to forward and inverse waveform modelling specially designed for such computers (SPECFEM3D, SES3D) became mature and widely available. These packages achieve significant computational performance and provide researchers with an opportunity to solve problems of bigger size at higher resolution within a shorter time. However, a typical seismic inversion process contains various activities that are beyond the common solver functionality. They include management of information on seismic events and stations, 3D models, observed and synthetic seismograms, pre-processing of the observed signals, computation of misfits and adjoint sources, minimization of misfits, and process workflow management. These activities are time consuming, seldom sufficiently automated, and therefore represent a bottleneck that can substantially offset performance benefits provided by even the most powerful modern supercomputers. Furthermore, a typical system architecture of modern supercomputing platforms is oriented towards the maximum computational performance and provides limited standard facilities for automation of the supporting activities. We present a prototype solution that automates all aspects of the seismic inversion process and is tuned for the modern massively parallel high performance computing systems. We address several major aspects of the solution architecture, which include (1) design of an inversion state database for tracing all relevant aspects of the entire solution process, (2) design of an extensible workflow management framework, (3) integration with wave propagation solvers, (4) integration with optimization packages, (5) computation of misfits and adjoint sources, and (6) process monitoring. 
The inversion state database represents a hierarchical structure with branches for the static process setup, inversion iterations, and solver runs, each branch specifying information at the event, station and channel levels. The workflow management framework is based on an embedded scripting engine that allows definition of various workflow scenarios using a high-level scripting language and provides access to all available inversion components represented as standard library functions. At present the SES3D wave propagation solver is integrated in the solution; the work is in progress for interfacing with SPECFEM3D. A separate framework is designed for interoperability with an optimization module; the workflow manager and optimization process run in parallel and cooperate by exchanging messages according to a specially designed protocol. A library of high-performance modules implementing signal pre-processing, misfit and adjoint computations according to established good practices is included. Monitoring is based on information stored in the inversion state database and at present implements a command line interface; design of a graphical user interface is in progress. The software design fits well into the common massively parallel system architecture featuring a large number of computational nodes running distributed applications under control of batch-oriented resource managers. The solution prototype has been implemented on the "Piz Daint" supercomputer provided by the Swiss Supercomputing Centre (CSCS).
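The workflow-management idea described above, inversion components exposed as library functions to a scripting layer that executes a named scenario, can be reduced to a minimal sketch. Every name and value below is a placeholder; this is not SISYPHUS code:

```python
# Minimal workflow-manager sketch: components registered as library functions,
# a scenario executed in order, with each step recorded in the inversion state.
def preprocess(state):      state["signals"] = "filtered"
def compute_misfit(state):  state["misfit"] = 0.42          # placeholder value
def adjoint_sources(state): state["adjoints"] = "computed"

REGISTRY = {
    "preprocess": preprocess,
    "misfit": compute_misfit,
    "adjoint": adjoint_sources,
}

def run_workflow(scenario, state):
    """Execute the named steps in order, logging each into the state record."""
    for step in scenario:
        REGISTRY[step](state)
        state.setdefault("log", []).append(step)
    return state

state = run_workflow(["preprocess", "misfit", "adjoint"], {"iteration": 1})
print(state["log"])
```

In the real system the state would live in the hierarchical inversion state database rather than a dict, and the scenario would come from the embedded scripting engine.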
Putting Automated Visual Inspection Systems To Work On The Factory Floor: What's Missing?
NASA Astrophysics Data System (ADS)
Waltz, Frederick M.; Snyder, Michael A.; Batchelor, Bruce G.
1990-02-01
Machine vision systems and other automated visual inspection (AVI) systems have been proving their usefulness in factories for more than a decade. In spite of this, the number of installed systems is far below the number that could profitably be employed. In the opinion of the authors, the primary reason for this is the high cost of customizing vision systems to meet application requirements. A three-part approach to this problem has proven to be useful: 1. A multi-phase paradigm for customer interaction, system specification, system development, and system installation; 2. A powerful and easy-to-use system development environment, including a. a flexible laboratory lighting setup, plus software-based tools to assist in the design of image acquisition systems, b. an image processing environment with a very large repertoire of image processing and feature extraction operations and an easy-to-use command interpreter having macro capabilities, and c. an image analysis environment with high-level constructs, a flexible and powerful syntax, and a "seamless" interface to the image processing level; and 3. A moderately-priced high-speed "target" system fully compatible with the development environment, so that algorithms developed thereon can be transferred directly to the factory environment without further development costs or reprogramming. Items 1 and 2 are covered in other papers [1,2,3,4,5] and are touched on here only briefly. Item 3 is the main subject of this paper. Our major motivation in presenting this paper is to offer suggestions to vendors developing commercial boards and systems, in hopes that the special needs of industrial inspection can be met.
1985-01-17
potassium oxides. Only then does the mixture react to form ammonia. A method for synthesizing ammonia from nitrogen and hydrogen, along with a...manufactured by this method, most of which is used in the synthesis of nitrogen fertilizers. A modern ammonia factory is a complex, highly automated...V. Karyakin; ZHURNAL ANALITICHESKOY KHIMII, No 8, Aug 84) 5 CATALYSTS Ammonia Synthesis and Homogeneous Catalysts (O. Yefimov; LENINSKOYE ZNAMYA
Optimization-based Approach to Cross-layer Resource Management in Wireless Networked Control Systems
2013-05-01
interest from both academia and industry [37], finding applications in unmanned robotic vehicles, automated highways and factories, smart homes and...is stable when the scaler varies slowly. The algorithm is further extended to utilize the slack resource in the network, which leads to the...model...Optimal sampling rate allocation formulation...Price-based algorithm
A Single RF Emitter-Based Indoor Navigation Method for Autonomous Service Robots.
Sherwin, Tyrone; Easte, Mikala; Chen, Andrew Tzer-Yeu; Wang, Kevin I-Kai; Dai, Wenbin
2018-02-14
Location-aware services are one of the key elements of modern intelligent applications. Numerous real-world applications such as factory automation, indoor delivery, and even search and rescue scenarios require autonomous robots to have the ability to navigate in an unknown environment and reach mobile targets with minimal or no prior infrastructure deployment. This research investigates and proposes a novel approach of dynamic target localisation using a single RF emitter, which will be used as the basis of allowing autonomous robots to navigate towards and reach a target. Through the use of multiple directional antennae, Received Signal Strength (RSS) is compared to determine the most probable direction of the targeted emitter, which is combined with the distance estimates to improve the localisation performance. The accuracy of the position estimate is further improved using a particle filter to mitigate the fluctuating nature of real-time RSS data. Based on the direction information, a motion control algorithm is proposed, using Simultaneous Localisation and Mapping (SLAM) and A* path planning to enable navigation through unknown complex environments. A number of navigation scenarios were developed in the context of factory automation applications to demonstrate and evaluate the functionality and performance of the proposed system.
A Single RF Emitter-Based Indoor Navigation Method for Autonomous Service Robots
Sherwin, Tyrone; Easte, Mikala; Wang, Kevin I-Kai; Dai, Wenbin
2018-01-01
PMID:29443906
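The antenna-comparison step described in the abstract can be illustrated in a few lines. This is a hedged sketch only: the bearings and RSS values are invented, and the simple exponential smoother stands in for the paper's particle filter rather than reproducing it:

```python
# Illustrative direction estimation: pick the bearing of the directional antenna
# with the strongest Received Signal Strength (RSS), then smooth the noisy
# estimate. Numbers and the smoothing scheme are assumptions, not the paper's.
ANTENNA_BEARINGS = [0, 90, 180, 270]   # degrees; one directional antenna each

def most_probable_bearing(rss_dbm):
    """Return the bearing whose antenna reports the strongest signal."""
    best = max(range(len(rss_dbm)), key=lambda i: rss_dbm[i])
    return ANTENNA_BEARINGS[best]

def smooth(prev_estimate, measurement, alpha=0.3):
    """Exponential smoothing to damp RSS fluctuation (particle-filter stand-in)."""
    return (1 - alpha) * prev_estimate + alpha * measurement

bearing = most_probable_bearing([-70.0, -55.0, -80.0, -75.0])
print(bearing)  # 90
```

The real system fuses this directional cue with distance estimates and feeds both through a particle filter before SLAM-based path planning takes over.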
The Change to Administrative Computing in Schools.
ERIC Educational Resources Information Center
Brown, Daniel J.
1984-01-01
Describes a study of the process of school office automation which focuses on personnel reactions to administrative computing, what users view as advantages and disadvantages of the automation, perceived barriers and facilitators of the change to automation, school personnel's views of long-term effects, and implications for school computer policy.…
Fundamentals of Library Automation and Technology. Participant Workbook.
ERIC Educational Resources Information Center
Bridge, Frank; Walton, Robert
This workbook presents outlines of topics to be covered during a two-day workshop on the fundamentals for library automation. Topics for the first day include: (1) Introduction; (2) Computer Technology--A Historical Overview; (3) Evolution of Library Automation; (4) Computer Hardware Technology--An Introduction; (5) Computer Software…
System for Computer Automated Typesetting (SCAT) of Computer Authored Texts.
ERIC Educational Resources Information Center
Keeler, F. Laurence
This description of the System for Computer Automated Typesetting (SCAT), an automated system for typesetting text and inserting special graphic symbols in programmed instructional materials created by the computer aided authoring system AUTHOR, provides an outline of the design architecture of the system and an overview including the component…
Pilot interaction with automated airborne decision making systems
NASA Technical Reports Server (NTRS)
Rouse, W. B.; Chu, Y. Y.; Greenstein, J. S.; Walden, R. S.
1976-01-01
An investigation was made of the interaction between a human pilot and automated on-board decision making systems. Research was initiated on the topic of pilot problem solving in automated and semi-automated flight management systems, and attempts were made to develop a model of human decision making in a multi-task situation. A study was made of the allocation of responsibility between human and computer, and various pilot performance parameters under varying degrees of automation were discussed. Optimal allocation of responsibility between human and computer was considered, and some theoretical results found in the literature were presented. The pilot as a problem solver was discussed. Finally, the design of displays, controls, procedures, and computer aids for problem solving tasks in automated and semi-automated systems was considered.
Smartphone attachment for stethoscope recording.
Thompson, Jeff
2015-01-01
With the ubiquity of smartphones and the rising technology of 3D printing, novel devices can be developed that leverage the "computer in your pocket" and rapid prototyping technologies for scientific, medical, engineering, and creative purposes. This paper describes such a device: a simple 3D-printed extension for Apple's iPhone that allows the sound from an off-the-shelf acoustic stethoscope to be recorded using the phone's built-in microphone. The attachment's digital 3D files can be easily shared, modified for similar phones and devices capable of recording audio, and in combination with 3D printing technology allow for fabrication of a durable device without need for an entire factory of expensive and specialized machining tools. It is hoped that by releasing this device as an open source set of printable files that can be downloaded and reproduced cheaply, others can make use of these developments where access to cost-prohibitive, specialized medical instruments is not available. Coupled with specialized smartphone software ("apps"), more sophisticated and automated diagnostics may also be possible on-site.
Problem Solving Software for Math Classes.
ERIC Educational Resources Information Center
Troutner, Joanne
1987-01-01
Described are 10 computer software programs for problem solving related to mathematics. Programs described are: (1) Box Solves Story Problems; (2) Safari Search; (3) Puzzle Tanks; (4) The King's Rule; (5) The Factory; (6) The Royal Rules; (7) The Enchanted Forest; (8) Gears; (9) The Super Factory; and (10) Creativity Unlimited. (RH)
Data Processing Factory for the Sloan Digital Sky Survey
NASA Astrophysics Data System (ADS)
Stoughton, Christopher; Adelman, Jennifer; Annis, James T.; Hendry, John; Inkmann, John; Jester, Sebastian; Kent, Steven M.; Kuropatkin, Nickolai; Lee, Brian; Lin, Huan; Peoples, John, Jr.; Sparks, Robert; Tucker, Douglas; Vanden Berk, Dan; Yanny, Brian; Yocum, Dan
2002-12-01
The Sloan Digital Sky Survey (SDSS) data handling presents two challenges: large data volume and timely production of spectroscopic plates from imaging data. A data processing factory, using technologies both old and new, handles this flow. Distribution to end users is via disk farms, to serve corrected images and calibrated spectra, and a database, to efficiently process catalog queries. For distribution of modest amounts of data from Apache Point Observatory to Fermilab, scripts use rsync to update files, while larger data transfers are accomplished by shipping magnetic tapes commercially. All data processing pipelines are wrapped in scripts to address consecutive phases: preparation, submission, checking, and quality control. We constructed the factory by chaining these pipelines together while using an operational database to hold processed imaging catalogs. The science database catalogs all imaging and spectroscopic objects, with pointers to the various external files associated with them. Diverse computing systems address particular processing phases. UNIX computers handle tape reading and writing, as well as calibration steps that require access to a large amount of data with relatively modest computational demands. Commodity CPUs process steps that require access to a limited amount of data with more demanding computational requirements. Disk servers optimized for cost per Gbyte serve terabytes of processed data, while servers optimized for disk read speed run SQLServer software to process queries on the catalogs. This factory produced data for the SDSS Early Data Release in June 2001, and it is currently producing Data Release One, scheduled for January 2003.
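The phase-chaining pattern the abstract describes — each pipeline wrapped so that preparation, submission, checking, and quality control run in order, with the chain gated on the checks — can be sketched as follows. This is an illustrative skeleton only; the names (`Pipeline`, `run_factory`) are hypothetical, and the actual SDSS factory used site-specific scripts plus an operational database, not this code.

```python
# Sketch of chaining pipelines through consecutive phases, with the
# check/quality-control phase acting as a gate on the chain.
from typing import Callable, List


class Pipeline:
    """One wrapped pipeline: prepare -> submit -> check (hypothetical shape)."""

    def __init__(self, name: str,
                 prepare: Callable[[], None],
                 submit: Callable[[], None],
                 check: Callable[[], bool]):
        self.name = name
        self.prepare = prepare
        self.submit = submit
        self.check = check


def run_factory(pipelines: List[Pipeline]) -> List[str]:
    """Run each pipeline's phases in order; stop the chain if a check fails."""
    completed = []
    for p in pipelines:
        p.prepare()                # stage inputs for this phase
        p.submit()                 # hand work to the batch system
        if not p.check():          # quality-control gate before the next stage
            break
        completed.append(p.name)
    return completed
```

A downstream pipeline (e.g. spectroscopic plate design) then only ever sees imaging catalogs that passed the upstream quality gate.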
Code of Federal Regulations, 2010 CFR
2010-10-01
.... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...
Code of Federal Regulations, 2013 CFR
2013-10-01
.... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...
Code of Federal Regulations, 2014 CFR
2014-10-01
.... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...
Code of Federal Regulations, 2012 CFR
2012-10-01
.... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...
Code of Federal Regulations, 2011 CFR
2011-10-01
.... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...
Code of Federal Regulations, 2012 CFR
2012-10-01
..., financial records, and automated data systems; (ii) The data are free from computational errors and are... records, financial records, and automated data systems; (ii) The data are free from computational errors... records, and automated data systems; (ii) The data are free from computational errors and are internally...
iPTF report of bright transients
NASA Astrophysics Data System (ADS)
Cannella, Chris; Kuesters, Daniel; Ferretti, Raphael; Blagorodnova, Nadejda; Adams, Scott; Kupfer, Thomas; Neill, James D.; Walters, Richard; Yan, Lin; Kulkarni, Shri
2017-02-01
The intermediate Palomar Transient Factory (iPTF; ATel #4807) reports the following bright transients. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R), and RB5 (Wozniak et al. 2013AAS...22143105W).
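The real/bogus score the iPTF telegrams refer to is a classifier output in [0, 1], with 1.0 meaning "looks like a real astrophysical source" and 0.0 meaning "detector or subtraction artifact". The sketch below is a deliberately toy, hand-written stand-in for such a score — RB2/RB4/RB5 are trained machine-learning classifiers over many detection features, and the two feature names here are assumptions for illustration only.

```python
# Toy real/bogus scoring sketch (NOT the iPTF classifiers): combine two
# hypothetical detection features into a score in [0, 1].
def real_bogus_score(fwhm_ratio: float, neg_pixel_frac: float) -> float:
    """fwhm_ratio: candidate PSF width / image PSF width (real sources ~ 1.0).
    neg_pixel_frac: fraction of negative pixels near the candidate
    (subtraction artifacts tend to leave dipole-like negative residuals)."""
    score = 1.0
    score *= max(0.0, 1.0 - abs(fwhm_ratio - 1.0))   # penalize wrong PSF width
    score *= max(0.0, 1.0 - 2.0 * neg_pixel_frac)    # penalize negative residuals
    return min(1.0, max(0.0, score))


def is_candidate(score: float, threshold: float = 0.5) -> bool:
    """Vetting gate: only candidates above threshold go to human scanners."""
    return score >= threshold
```

In the real pipeline the score comes from a trained model (e.g. a random forest) and the threshold trades completeness against the number of bogus candidates a human must scan.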
1983-06-08
will attempt to destroy factories and industrial establishments in Pusan, Masan, Taegu, Ulsan, Pohang, Kwangju and other major cities, it said... industry is expected to be intensified in the months to come with the Daewoo Electronic Co developing many new models since it took over the electric...The projected optical fiber communications systems will mark a milestone in the annals of the Korean electronic industry because automation in the
Automated image quality assessment for chest CT scans.
Reeves, Anthony P; Xie, Yiting; Liu, Shuang
2018-02-01
Medical image quality needs to be maintained at standards sufficient for effective clinical reading. Automated computer analytic methods may be applied to medical images for quality assessment. For chest CT scans in a lung cancer screening context, an automated quality assessment method is presented that characterizes image noise and image intensity calibration. This is achieved by image measurements in three automatically segmented homogeneous regions of the scan: external air, trachea lumen air, and descending aorta blood. Profiles of CT scanner behavior are also computed. The method has been evaluated on both phantom and real low-dose chest CT scans and results show that repeatable noise and calibration measures may be realized by automated computer algorithms. Noise and calibration profiles show relevant differences between different scanners and protocols. Automated image quality assessment may be useful for quality control for lung cancer screening and may enable performance improvements to automated computer analysis methods. © 2017 American Association of Physicists in Medicine.
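The quality measures described in this abstract reduce, per segmented region, to two statistics: the mean CT number (in Hounsfield units) as a calibration check against the region's known nominal value, and the standard deviation as a noise estimate. The sketch below assumes the region masks already exist; the paper's method segments external air, trachea lumen, and descending aorta automatically, and the function names here are illustrative.

```python
# Region-based QA measures: mean HU checks calibration, standard
# deviation estimates noise (assumes the region is already segmented).
import statistics


def region_stats(hu_values):
    """Return (mean HU, noise) for the HU samples in one homogeneous region."""
    return statistics.mean(hu_values), statistics.pstdev(hu_values)


def calibration_error(measured_mean_hu, expected_hu):
    """Deviation from the region's nominal value; e.g. air should read
    near -1000 HU and aortic blood near +40 HU (nominal values)."""
    return measured_mean_hu - expected_hu
```

Tracking these two numbers per scanner and protocol over time is what yields the "profiles of CT scanner behavior" the abstract mentions.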
Nouri, Ali; Chyba, Christopher F
2012-01-01
It is generally assumed that genetic engineering advances will, inevitably, facilitate the misapplication of biotechnology toward the production of biological weapons. Unexpectedly, however, some of these very advances in the areas of DNA synthesis and sequencing may enable the implementation of automated and nonintrusive safeguards to avert the illicit applications of biotechnology. In the case of DNA synthesis, automated DNA screening tools could be built into DNA synthesizers in order to block the synthesis of hazardous agents. In addition, a comprehensive safety and security regime for dual-use genetic engineering research could include nonintrusive monitoring of DNA sequencing. This is increasingly feasible as laboratories outsource this service to just a few centralized sequencing factories. The adoption of automated, nonintrusive monitoring and surveillance of the DNA synthesis and sequencing pipelines may avert many risks associated with dual-use biotechnology. Here, we describe the historical background and current challenges associated with dual-use biotechnologies and propose strategies to address these challenges.
The Interdependence of Computers, Robots, and People.
ERIC Educational Resources Information Center
Ludden, Laverne; And Others
Computers and robots are becoming increasingly more advanced, with smaller and cheaper computers now doing jobs once reserved for huge multimillion dollar computers and with robots performing feats such as painting cars and using television cameras to simulate vision as they perform factory tasks. Technicians expect computers to become even more…
Impact of synthetic biology and metabolic engineering on industrial production of fine chemicals.
Jullesson, David; David, Florian; Pfleger, Brian; Nielsen, Jens
2015-11-15
Industrial bio-processes for fine chemical production are increasingly relying on cell factories developed through metabolic engineering and synthetic biology. The use of high throughput techniques and automation for the design of cell factories, and especially platform strains, has played an important role in the transition from laboratory research to industrial production. Model organisms such as Saccharomyces cerevisiae and Escherichia coli remain widely used host strains for industrial production due to their robust and desirable traits. This review describes some of the bio-based fine chemicals that have reached the market, key metabolic engineering tools that have allowed this to happen and some of the companies that are currently utilizing these technologies for developing industrial production processes. Copyright © 2015 Elsevier Inc. All rights reserved.
Development of sample exchange robot PAM-HC for beamline BL-1A at the photon factory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hiraki, Masahiko, E-mail: masahiko.hiraki@kek.jp; Department of Accelerator Science, SOKENDAI; Matsugaki, Naohiro
A macromolecular crystallography beamline, BL-1A, has been built at the Photon Factory (PF) for low energy experiments and has been operational since 2010. We have installed a sample exchange robot, PAM (PF Automated Mounting system), similar to other macromolecular crystallography beamlines. However, following the installation of a helium chamber to reduce the absorption of the diffraction signal by air, we developed a new sample exchange robot to replace PAM. The new robot, named PAM-HC (Helium Chamber), is designed with the goal of minimizing leakage of helium gas from the chamber. Here, the PAM-HC hardware and the flow of its movement are described. Furthermore, measurements of temperature changes during sample exchange are presented in this paper.
Automated Help System For A Supercomputer
NASA Technical Reports Server (NTRS)
Callas, George P.; Schulbach, Catherine H.; Younkin, Michael
1994-01-01
Expert-system software developed to provide automated system of user-helping displays in supercomputer system at Ames Research Center Advanced Computer Facility. Users located at remote computer terminals connected to supercomputer and each other via gateway computers, local-area networks, telephone lines, and satellite links. Automated help system answers routine user inquiries about how to use services of computer system. Available 24 hours per day and reduces burden on human experts, freeing them to concentrate on helping users with complicated problems.
Computer Programs For Automated Welding System
NASA Technical Reports Server (NTRS)
Agapakis, John E.
1993-01-01
Computer programs developed for use in controlling automated welding system described in MFS-28578. Together with control computer, computer input and output devices and control sensors and actuators, provide flexible capability for planning and implementation of schemes for automated welding of specific workpieces. Developed according to macro- and task-level programming schemes, which increases productivity and consistency by reducing amount of "teaching" of system by technician. System provides for three-dimensional mathematical modeling of workpieces, work cells, robots, and positioners.
USSR Report: Cybernetics, Computers and Automation Technology. No. 69.
1983-05-06
computers in multiprocessor and multistation design, control and scientific research automation systems. The results of comparing the efficiency of...Podvizhnaya, Scientific Research Institute of Control Computers, Severodonetsk] [Text] The most significant change in the design of the SM-2M compared to...UPRAVLYAYUSHCHIYE SISTEMY I MASHINY, Nov-Dec 82) 95 APPLICATIONS Kiev Automated Control System, Design Features and Prospects for Development (V. A
O'Connor, Annette M; Totton, Sarah C; Cullen, Jonah N; Ramezani, Mahmood; Kalivarapu, Vijay; Yuan, Chaohui; Gilbert, Stephen B
2018-01-01
Systematic reviews are increasingly using data from preclinical animal experiments in evidence networks. Further, there are ever-increasing efforts to automate aspects of the systematic review process. When assessing systematic bias and unit-of-analysis errors in preclinical experiments, it is critical to understand the study design elements employed by investigators. Such information can also inform prioritization of automation efforts that allow the identification of the most common issues. The aim of this study was to identify the design elements used by investigators in preclinical research in order to inform unique aspects of assessment of bias and error in preclinical research. Using 100 preclinical experiments each related to brain trauma and toxicology, we assessed design elements described by the investigators. We evaluated Methods and Materials sections of reports for descriptions of the following design elements: 1) use of comparison group, 2) unit of allocation of the interventions to study units, 3) arrangement of factors, 4) method of factor allocation to study units, 5) concealment of the factors during allocation and outcome assessment, 6) independence of study units, and 7) nature of factors. Many investigators reported using design elements that suggested the potential for unit-of-analysis errors, i.e., descriptions of repeated measurements of the outcome (94/200) and descriptions of potential for pseudo-replication (99/200). Use of complex factor arrangements was common, with 112 experiments using some form of factorial design (complete, incomplete or split-plot-like). In the toxicology dataset, 20 of the 100 experiments appeared to use a split-plot-like design, although no investigators used this term. The common use of repeated measures and factorial designs means understanding bias and error in preclinical experimental design might require greater expertise than simple parallel designs. Similarly, use of complex factor arrangements creates novel challenges for accurate automation of data extraction and bias and error assessment in preclinical experiments.
Evolution of solid rocket booster component testing
NASA Technical Reports Server (NTRS)
Lessey, Joseph A.
1989-01-01
The evolution of one of the new generation of test sets developed for the Solid Rocket Booster of the U.S. Space Transportation System is described. Requirements leading to factory checkout of the test set are explained, including the evolution from manual to semiautomated toward fully automated status. Individual improvements in the built-in test equipment, self-calibration, and software flexibility are addressed, and the insertion of fault detection to improve reliability is discussed.
ERIC Educational Resources Information Center
Howell, Abraham L.
2012-01-01
In the high tech factories of today robots can be used to perform various tasks that span a wide spectrum that encompasses the act of performing high-speed, automated assembly of cell phones, laptops and other electronic devices to the compounding, filling, packaging and distribution of life-saving pharmaceuticals. As robot usage continues to…
Automated validation of a computer operating system
NASA Technical Reports Server (NTRS)
Dervage, M. M.; Milberg, B. A.
1970-01-01
Programs apply selected input/output loads to complex computer operating system and measure performance of that system under such loads. Technique lends itself to checkout of computer software designed to monitor automated complex industrial systems.
Operating a production pilot factory serving several scientific domains
NASA Astrophysics Data System (ADS)
Sfiligoi, I.; Würthwein, F.; Andrews, W.; Dost, J. M.; MacNeill, I.; McCrea, A.; Sheripon, E.; Murphy, C. W.
2011-12-01
Pilot infrastructures are becoming prominent players in the Grid environment. One of the major advantages is represented by the reduced effort required by the user communities (also known as Virtual Organizations or VOs) due to the outsourcing of the Grid interfacing services, i.e. the pilot factory, to Grid experts. One such pilot factory, based on the glideinWMS pilot infrastructure, is being operated by the Open Science Grid at University of California San Diego (UCSD). This pilot factory is serving multiple VOs from several scientific domains. Currently the three major clients are the analysis operations of the HEP experiment CMS, the community VO HCC, which serves mostly math, biology and computer science users, and the structural biology VO NEBioGrid. The UCSD glidein factory allows the served VOs to use Grid resources distributed over 150 sites in North and South America, in Europe, and in Asia. This paper presents the steps taken to create a production quality pilot factory, together with the challenges encountered along the road.
Microelectronics Revolution And The Impact Of Automation In The New Industrialized Countries
NASA Astrophysics Data System (ADS)
Baranauskas, Vitor
1984-08-01
A brief review of some important historical points on the origin of factories and the Industrial Revolution is presented, with emphasis on the social problems related to the automation of human labor. Until World War I, the social changes provoked by the Industrial Revolution divided the world into developed and underdeveloped countries. After that period, the less developed nations began their industrialization mainly through multinational corporations (MCs). These enterprises were very important to the production and exportation of utilities and manufactures in general, mainly in those products which required intensive and direct human labor. At the present time, with the pervasiveness of microelectronics in automation, this age seems to be reaching an end, because all continuous processes in industry tend economically toward total automation. This fact will cause a retraction in long-term investments and, beyond massive unemployment, there is a tendency for these MC industries to return to their original countries. The most promising alternative to avoid these events, and perhaps the only one, is to incentivize autonomous development in areas of high technology, such as microelectronics itself.
In-Factory Learning - Qualification For The Factory Of The Future
NASA Astrophysics Data System (ADS)
Quint, Fabian; Mura, Katharina; Gorecky, Dominic
2015-07-01
The Industry 4.0 vision anticipates that internet technologies will find their way into future factories replacing traditional components by dynamic and intelligent cyber-physical systems (CPS) that combine the physical objects with their digital representation. Reducing the gap between the real and digital world makes the factory environment more flexible, more adaptive, but also more complex for the human workers. Future workers require interdisciplinary competencies from engineering, information technology, and computer science in order to understand and manage the diverse interrelations between physical objects and their digital counterpart. This paper proposes a mixed-reality based learning environment, which combines physical objects and visualisation of digital content via Augmented Reality. It uses reality-based interaction in order to make the dynamic interrelations between real and digital factory visible and tangible. We argue that our learning system does not work as a stand-alone solution, but should fit into existing academic and advanced training curricula.
Intermediate Palomar Transient Factory: Realtime Image Subtraction Pipeline
Cao, Yi; Nugent, Peter E.; Kasliwal, Mansi M.
2016-09-28
A fast-turnaround pipeline for realtime data reduction plays an essential role in discovering young supernovae and fast-evolving transients in modern time-domain surveys and in permitting follow-up observations of them. In this paper, we present the realtime image subtraction pipeline in the intermediate Palomar Transient Factory. By using high-performance computing, efficient databases, and machine-learning algorithms, this pipeline manages to reliably deliver transient candidates within 10 minutes of images being taken. Our experience in using high-performance computing resources to process big data in astronomy serves as a trailblazer for dealing with data from large-scale time-domain facilities in the near future.
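The core operation of such a pipeline, difference imaging, subtracts a deep reference image from each new exposure so that anything that changed (a new supernova, a variable star) stands out as a residual. The sketch below is a toy illustration of that single step; a real pipeline first astrometrically aligns the images and matches their point-spread functions before subtracting, and both steps are omitted here.

```python
# Toy difference imaging: subtract a reference frame from a new frame,
# then flag pixels whose residual flux exceeds a detection threshold.
# (Real pipelines align images and match PSFs first; skipped here.)
def subtract(new_image, reference):
    """Pixel-wise difference of two equally sized 2-D images (lists of rows)."""
    return [[n - r for n, r in zip(new_row, ref_row)]
            for new_row, ref_row in zip(new_image, reference)]


def detect(diff, threshold):
    """Return (row, col) positions whose residual exceeds the threshold;
    these become the candidates fed to real/bogus vetting."""
    return [(i, j) for i, row in enumerate(diff)
            for j, value in enumerate(row) if value > threshold]
```

Everything static cancels in the difference, which is why the candidate list stays small enough to vet within the 10-minute budget the abstract cites.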
Applying Semantic Web Services and Wireless Sensor Networks for System Integration
NASA Astrophysics Data System (ADS)
Berkenbrock, Gian Ricardo; Hirata, Celso Massaki; de Oliveira Júnior, Frederico Guilherme Álvares; de Oliveira, José Maria Parente
In environments like factories, buildings, and homes, automation services tend to change often during their lifetime. Changes concern business rules, process optimization, cost reduction, and so on. It is important to provide a smooth and straightforward way to deal with these changes so that they can be handled in a fast and low-cost manner. Some prominent solutions use the flexibility of Wireless Sensor Networks and the meaningful description of Semantic Web Services to provide service integration. In this work, we give an overview of current solutions for machinery integration that combine both technologies, as well as a discussion of some perspectives and open issues in applying Wireless Sensor Networks and Semantic Web Services to automation services integration.
Advances in Composites Technology
NASA Technical Reports Server (NTRS)
Tenney, D. R.; Dexter, H. B.
1985-01-01
A significant level of research is currently focused on the development of tough resins and high strain fibers in an effort to gain improved damage tolerance. Moderate success has been achieved with the development of new resins such as PEEK and additional improvements look promising with new thermoplastic resins. Development of innovative material forms such as 2-D and 3-D woven fabrics and braided structural subelements is also expected to improve damage tolerance and durability of composite hardware. The new thrust in composites is to develop low cost manufacturing and design concepts to lower the cost of composite hardware. Processes being examined include automated material placement, filament winding, pultrusion, and thermoforming. The factory of the future will likely incorporate extensive automation in all aspects of manufacturing composite components.
Automated Decomposition of Model-based Learning Problems
NASA Technical Reports Server (NTRS)
Williams, Brian C.; Millar, Bill
1996-01-01
A new generation of sensor rich, massively distributed autonomous systems is being developed that has the potential for unprecedented performance, such as smart buildings, reconfigurable factories, adaptive traffic systems and remote earth ecosystem monitoring. To achieve high performance these massive systems will need to accurately model themselves and their environment from sensor information. Accomplishing this on a grand scale requires automating the art of large-scale modeling. This paper presents a formalization of decompositional model-based learning (DML), a method developed by observing a modeler's expertise at decomposing large scale model estimation tasks. The method exploits a striking analogy between learning and consistency-based diagnosis. Moriarty, an implementation of DML, has been applied to thermal modeling of a smart building, demonstrating a significant improvement in learning rate.
The Effect of Computer Automation on Institutional Review Board (IRB) Office Efficiency
ERIC Educational Resources Information Center
Oder, Karl; Pittman, Stephanie
2015-01-01
Companies purchase computer systems to make their processes more efficient through automation. Some academic medical centers (AMC) have purchased computer systems for their institutional review boards (IRB) to increase efficiency and compliance with regulations. IRB computer systems are expensive to purchase, deploy, and maintain. An AMC should…
Possible Computer Vision Systems and Automated or Computer-Aided Edging and Trimming
Philip A. Araman
1990-01-01
This paper discusses research which is underway to help our industry reduce costs, increase product volume and value recovery, and market more accurately graded and described products. The research is part of a team effort to help the hardwood sawmill industry automate with computer vision systems, and computer-aided or computer controlled processing. This paper...
Computers Launch Faster, Better Job Matching
ERIC Educational Resources Information Center
Stevenson, Gloria
1976-01-01
Employment Security Automation Project (ESAP), a five-year program sponsored by the Employment and Training Administration, features an innovative computer-assisted job matching system and instantaneous computer-assisted service for unemployment insurance claimants. ESAP will also consolidate existing automated employment security systems to…
Bulat, Petar; Daemen, Edgard; Van Risseghem, Marleen; De Bacquer, Dirk; Tan, Xiaodong; Braeckman, Lutgart; Vanhoorne, Michel
2002-01-01
The objective of this follow-up study was to verify the efficacy of the technical adjustments gradually introduced in departments of a viscose rayon factory from 1989 onward. Personal exposure to carbon disulphide was assessed by means of personal monitoring through active sampling. Six job titles in three departments of the factory were sampled. Geometric means were calculated and used as estimates of time-weighted average (TWA) concentrations. The results from the present study were compared with similar measurements from a previous study in the same factory. Due to organizational changes, only three job titles (spinner, first spinner, and viscose preparator) could be compared directly. Two new job titles were identified, although the tasks performed in these two job titles already existed. The measurements from one job title could not be compared, due to a substantial reorganization and automation of the tasks carried out in the department. The comparison before and after technical improvements shows that personal exposure of the spinner and first spinner has been substantially reduced. Even the geometric means of measurements outside the fresh air mask are below the TWA-TLV (Threshold Limit Value). Despite the difficulties in comparing the results from the two studies, it is concluded that the technical measures reduced personal exposure to carbon disulphide up to tenfold, and personal protection reduced it further by a factor of two.
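The exposure summary used above can be made concrete: personal samples for a job title are combined by geometric mean (which suits the roughly log-normal distribution of occupational exposure data) and that mean is taken as the TWA concentration estimate. A minimal sketch, with illustrative function names:

```python
# Geometric mean of personal air samples as the TWA concentration
# estimate, plus a before/after reduction factor for the comparison.
import math


def geometric_mean(concentrations):
    """Geometric mean = exp(mean of logs); concentrations must be > 0."""
    logs = [math.log(c) for c in concentrations]
    return math.exp(sum(logs) / len(logs))


def reduction_factor(before_gm, after_gm):
    """How many times lower the exposure is after the intervention;
    e.g. a value of 10 corresponds to the tenfold reduction reported."""
    return before_gm / after_gm
```

The geometric mean is pulled down less by a single high sample than the arithmetic mean is pulled up, which is why it is the conventional summary for skewed exposure data.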
iPTF Discoveries of Recent Type Ia Supernovae
NASA Astrophysics Data System (ADS)
Papadogiannakis, S.; Taddia, F.; Petrushevska, T.; Ferretti, R.; Fremling, C.; Karamehmetoglu, E.; Nyholm, A.; Roy, R.; Hangard, L.; Vreeswijk, P.; Horesh, A.; Manulis, I.; Rubin, A.; Yaron, O.; Leloudas, G.; Khazov, D.; Soumagnac, M.; Knezevic, S.; Johansson, J.; Nir, G.; Cao, Y.; Blagorodnova, N.; Kulkarni, S.
2016-05-01
The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artefacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).
iPTF Discoveries of Recent Core-Collapse Supernovae
NASA Astrophysics Data System (ADS)
Taddia, F.; Ferretti, R.; Papadogiannakis, S.; Petrushevska, T.; Fremling, C.; Karamehmetoglu, E.; Nyholm, A.; Roy, R.; Hangard, L.; Horesh, A.; Khazov, D.; Knezevic, S.; Johansson, J.; Leloudas, G.; Manulis, I.; Rubin, A.; Soumagnac, M.; Vreeswijk, P.; Yaron, O.; Bar, I.; Cao, Y.; Kulkarni, S.; Blagorodnova, N.
2016-05-01
The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following core-collapse SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).
European Science Notes. Volume 39, Number 11.
1985-11-01
bacteria is factory by means of a mixed culture of being studied under continuous flow conditions, especially as it depends on solar radiation...of 300 million operations per contraction chromium layers are plated second and occupying less than a cubic from aqueous solutions by an automated...small as 10 a in diame- tics of additives is needed to determine ter, allowing measurement in layers as their effects on microbial growth. thin as 20
iPTF Discoveries of Recent Core-Collapse Supernovae
NASA Astrophysics Data System (ADS)
Taddia, F.; Ferretti, R.; Fremling, C.; Karamehmetoglu, E.; Nyholm, A.; Papadogiannakis, S.; Petrushevska, T.; Roy, R.; Hangard, L.; De Cia, A.; Vreeswijk, P.; Horesh, A.; Manulis, I.; Sagiv, I.; Rubin, A.; Yaron, O.; Leloudas, G.; Khazov, D.; Soumagnac, M.; Bilgi, P.
2015-04-01
The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Core-Collapse SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).
iPTF Discoveries of Recent Type Ia Supernova
NASA Astrophysics Data System (ADS)
Petrushevska, T.; Ferretti, R.; Fremling, C.; Hangard, L.; Karamehmetoglu, E.; Nyholm, A.; Papadogiannakis, S.; Roy, R.; Horesh, A.; Khazov, D.; Knezevic, S.; Johansson, J.; Leloudas, G.; Manulis, I.; Rubin, A.; Soumagnac, M.; Vreeswijk, P.; Yaron, O.; Bilgi, P.; Cao, Y.; Duggan, G.; Lunnan, R.; Andreoni, I.
2015-10-01
The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).
iPTF Discoveries of Recent SNe Ia
NASA Astrophysics Data System (ADS)
Ferretti, R.; Fremling, C.; Johansson, J.; Karamehmetoglu, E.; Migotto, K.; Nyholm, A.; Papadogiannakis, S.; Taddia, F.; Petrushevska, T.; Roy, R.; Ben-Ami, S.; De Cia, A.; Dzigan, Y.; Horesh, A.; Khazov, D.; Manulis, I.; Rubin, A.; Sagiv, I.; Vreeswijk, P.; Yaron, O.; Bilgi, P.; Cao, Y.; Duggan, G.
2015-02-01
The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).
iPTF Discoveries of Recent Type Ia Supernovae
NASA Astrophysics Data System (ADS)
Papadogiannakis, S.; Taddia, F.; Ferretti, R.; Fremling, C.; Karamehmetoglu, E.; Petrushevska, T.; Nyholm, A.; Roy, R.; Hangard, L.; Vreeswijk, P.; Horesh, A.; Manulis, I.; Rubin, A.; Yaron, O.; Leloudas, G.; Khazov, D.; Soumagnac, M.; Knezevic, S.; Johansson, J.; Lunnan, R.; Blagorodnova, N.; Cao, Y.; Cenko, S. B.
2016-01-01
The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).
iPTF Discoveries of Recent Type Ia Supernovae
NASA Astrophysics Data System (ADS)
Ferretti, R.; Fremling, C.; Hangard, L.; Karamehmetoglu, E.; Nyholm, A.; Papadogiannakis, S.; Petrushevska, T.; Roy, R.; Taddia, F.; Horesh, A.; Khazov, D.; Knezevic, S.; Leloudas, G.; Manulis, I.; Rubin, A.; Soumagnac, M.; Vreeswijk, P.; Yaron, O.; Cao, Y.; Duggan, G.; Lunnan, R.; Blagorodnova, N.
2015-11-01
The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).
iPTF Discoveries of Recent Core-Collapse Supernovae
NASA Astrophysics Data System (ADS)
Taddia, F.; Ferretti, R.; Fremling, C.; Karamehmetoglu, E.; Nyholm, A.; Papadogiannakis, S.; Petrushevska, T.; Roy, R.; Hangard, L.; Vreeswijk, P.; Horesh, A.; Manulis, I.; Rubin, A.; Yaron, O.; Leloudas, G.; Khazov, D.; Soumagnac, M.; Knezevic, S.; Johansson, J.; Duggan, G.; Lunnan, R.; Cao, Y.
2015-09-01
The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Core-Collapse SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).
iPTF Discovery of Recent Type Ia Supernovae
NASA Astrophysics Data System (ADS)
Hangard, L.; Ferretti, R.; Fremling, C.; Karamehmetoglu, E.; Nyholm, A.; Papadogiannakis, S.; Petrushevska, T.; Roy, R.; Bar, I.; Horesh, A.; Johansson, J.; Khazov, D.; Knezevic, S.; Leloudas, G.; Manulis, I.; Rubin, A.; Soumagnac, M.; Vreeswijk, P.; Yaron, O.; Cao, Y.; Kulkarni, S.; Lunnan, R.; Ravi, V.; Vedantham, H. K.; Yan, L.
2016-04-01
The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).
iPTF Discoveries of Recent Core-Collapse Supernovae
NASA Astrophysics Data System (ADS)
Taddia, F.; Ferretti, R.; Fremling, C.; Karamehmetoglu, E.; Nyholm, A.; Papadogiannakis, S.; Petrushevska, T.; Roy, R.; Hangard, L.; Vreeswijk, P.; Horesh, A.; Manulis, I.; Rubin, A.; Yaron, O.; Leloudas, G.; Khazov, D.; Soumagnac, M.; Knezevic, S.; Johansson, J.; Lunnan, R.; Cao, Y.; Miller, A.
2015-11-01
The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Core-Collapse SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).
iPTF Discoveries of Recent Type Ia Supernovae
NASA Astrophysics Data System (ADS)
Petrushevska, T.; Ferretti, R.; Fremling, C.; Hangard, L.; Karamehmetoglu, E.; Nyholm, A.; Papadogiannakis, S.; Roy, R.; Horesh, A.; Khazov, D.; Knezevic, S.; Johansson, J.; Leloudas, G.; Manulis, I.; Rubin, A.; Soumagnac, M.; Vreeswijk, P.; Yaron, O.; Bilgi, P.; Cao, Y.; Duggan, G.; Lunnan, R.
2016-02-01
The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).
iPTF Discovery of Recent Type Ia Supernovae
NASA Astrophysics Data System (ADS)
Hangard, L.; Taddia, F.; Ferretti, R.; Papadogiannakis, S.; Petrushevska, T.; Fremling, C.; Karamehmetoglu, E.; Nyholm, A.; Roy, R.; Horesh, A.; Khazov, D.; Knezevic, S.; Johansson, J.; Leloudas, G.; Manulis, I.; Rubin, A.; Soumagnac, M.; Vreeswijk, P.; Yaron, O.; Bar, I.; Lunnan, R.; Cenko, S. B.
2016-02-01
The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).
iPTF Discoveries of Recent Type Ia Supernovae
NASA Astrophysics Data System (ADS)
Papadogiannakis, S.; Fremling, C.; Hangard, L.; Karamehmetoglu, E.; Nyholm, A.; Ferretti, R.; Petrushevska, T.; Roy, R.; Taddia, F.; Bar, I.; Horesh, A.; Johansson, J.; Knezevic, S.; Leloudas, G.; Manulis, I.; Nir, G.; Rubin, A.; Soumagnac, M.; Vreeswijk, P.; Yaron, O.; Arcavi, I.; Howell, D. A.; McCully, C.; Hosseinzadeh, G.; Valenti, S.; Blagorodnova, N.; Cao, Y.; Duggan, G.; Ravi, V.; Lunnan, R.
2016-03-01
The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).
iPTF discoveries of recent type Ia supernovae
NASA Astrophysics Data System (ADS)
Papadogiannakis, S.; Ferretti, R.; Fremling, C.; Hangard, L.; Karamehmetoglu, E.; Nyholm, A.; Petrushevska, T.; Roy, R.; De Cia, A.; Vreeswijk, P.; Horesh, A.; Manulis, I.; Sagiv, I.; Rubin, A.; Yaron, O.; Leloudas, G.; Khazov, D.; Soumagnac, M.; Knezevic, S.; Cenko, S. B.; Capone, J.; Bartakk, M.
2015-09-01
The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).
iPTF Discovery of Recent Type Ia Supernova
NASA Astrophysics Data System (ADS)
Hangard, L.; Petrushevska, T.; Papadogiannakis, S.; Ferretti, R.; Fremling, C.; Karamehmetoglu, E.; Nyholm, A.; Roy, R.; Horesh, A.; Khazov, D.; Knezevic, S.; Johansson, J.; Leloudas, G.; Manulis, I.; Rubin, A.; Soumagnac, M.; Vreeswijk, P.; Yaron, O.; Kasliwal, M.
2015-10-01
The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).
iPTF Discoveries of Recent Type Ia Supernovae
NASA Astrophysics Data System (ADS)
Petrushevska, T.; Ferretti, R.; Fremling, C.; Hangard, L.; Karamehmetoglu, E.; Nyholm, A.; Papadogiannakis, S.; Roy, R.; Horesh, A.; Khazov, D.; Knezevic, S.; Johansson, J.; Leloudas, G.; Manulis, I.; Rubin, A.; Soumagnac, M.; Vreeswijk, P.; Yaron, O.; Bilgi, P.; Cao, Y.; Duggan, G.; Lunnan, R.; Neill, J. D.; Walters, R.
2016-04-01
The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).
iPTF Discoveries of Recent Type Ia Supernovae
NASA Astrophysics Data System (ADS)
Papadogiannakis, S.; Taddia, F.; Petrushevska, T.; Fremling, C.; Hangard, L.; Johansson, J.; Karamehmetoglu, E.; Migotto, K.; Nyholm, A.; Roy, R.; Ben-Ami, S.; De Cia, A.; Dzigan, Y.; Horesh, A.; Khazov, D.; Soumagnac, M.; Manulis, I.; Rubin, A.; Sagiv, I.; Vreeswijk, P.; Yaron, O.; Bond, H.; Bilgi, P.; Cao, Y.; Duggan, G.
2015-03-01
The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).
iPTF Discovery of Recent Type Ia Supernovae
NASA Astrophysics Data System (ADS)
Hangard, L.; Ferretti, R.; Papadogiannakis, S.; Petrushevska, T.; Fremling, C.; Karamehmetoglu, E.; Nyholm, A.; Roy, R.; Horesh, A.; Khazov, D.; Knezevic, S.; Johansson, J.; Leloudas, G.; Manulis, I.; Rubin, A.; Soumagnac, M.; Vreeswijk, P.; Yaron, O.; Cook, D.
2015-12-01
The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).
iPTF Discoveries of Recent Type Ia Supernova
NASA Astrophysics Data System (ADS)
Petrushevska, T.; Ferretti, R.; Fremling, C.; Hangard, L.; Karamehmetoglu, E.; Nyholm, A.; Papadogiannakis, S.; Roy, R.; Horesh, A.; Khazov, D.; Knezevic, S.; Johansson, J.; Leloudas, G.; Manulis, I.; Rubin, A.; Soumagnac, M.; Vreeswijk, P.; Yaron, O.; Bilgi, P.; Cao, Y.; Duggan, G.; Lunnan, R.; Jencson, J.
2015-11-01
The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W).
InPRO: Automated Indoor Construction Progress Monitoring Using Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Hamledari, Hesam
In this research, an envisioned intelligent robotic solution for automated indoor data collection and inspection that employs a series of unmanned aerial vehicles (UAVs), entitled "InPRO", is presented. InPRO consists of four stages, namely: 1) automated path planning; 2) autonomous UAV-based indoor inspection; 3) automated computer vision-based assessment of progress; and 4) automated updating of 4D building information models (BIM). The work presented in this thesis addresses the third stage of InPRO. A series of computer vision-based methods that automate the assessment of construction progress using images captured at indoor sites are introduced. The proposed methods employ computer vision and machine learning techniques to detect the components of under-construction indoor partitions. In particular, framing (studs), insulation, electrical outlets, and different states of drywall sheets (installing, plastering, and painting) are automatically detected in digital images. High accuracy rates, real-time performance, and operation without a priori information indicate the methods' promising performance.
1991-09-01
Table of contents excerpt: … an Experimental Design; Selection of Variables; Defining Measures of Effectiveness; Specification of Required Number of Replications; Modification of Scenario Files; Analysis of the Main Effects of a Two-Level Factorial Design; Analysis of the Interaction Effects of a Two-Level Factorial Design; Yates's Algorithm.
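The contents above mention Yates's algorithm for two-level factorial designs. As a minimal sketch (not drawn from this record), the algorithm turns treatment totals in standard order into effect contrasts by repeated pairwise sums and differences:

```python
def yates(responses):
    """Yates's algorithm for a 2^k factorial experiment.

    `responses` lists the treatment totals in standard (Yates) order:
    (1), a, b, ab, c, ac, bc, abc, ...  Returns the contrast column:
    grand total first, then the contrast of each effect in standard order.
    """
    n = len(responses)
    k = n.bit_length() - 1
    assert 1 << k == n, "length must be a power of two"
    col = list(responses)
    for _ in range(k):
        # First half: sums of adjacent pairs; second half: differences
        # (second element minus first). Repeat k times.
        sums = [col[i] + col[i + 1] for i in range(0, n, 2)]
        diffs = [col[i + 1] - col[i] for i in range(0, n, 2)]
        col = sums + diffs
    return col

# 2^2 example in standard order (1), a, b, ab:
contrasts = yates([1, 2, 3, 4])          # [total, A, B, AB] = [10, 2, 4, 0]
effects = [c / 2 for c in contrasts[1:]]  # divide by 2^(k-1) for effects
```

Dividing the contrasts by 2^(k-1) gives the main and interaction effects; here A = 1, B = 2, and AB = 0, matching the direct high-minus-low averages.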
Computer vision in the poultry industry
USDA-ARS?s Scientific Manuscript database
Computer vision is becoming increasingly important in the poultry industry due to increasing use and speed of automation in processing operations. Growing awareness of food safety concerns has helped add food safety inspection to the list of tasks that automated computer vision can assist. Researc...
Computing and Office Automation: Changing Variables.
ERIC Educational Resources Information Center
Staman, E. Michael
1981-01-01
Trends in computing and office automation and their applications, including planning, institutional research, and general administrative support in higher education, are discussed. Changing aspects of information processing and an increasingly larger user community are considered. The computing literacy cycle may involve programming, analysis, use…
ADP Analysis project for the Human Resources Management Division
NASA Technical Reports Server (NTRS)
Tureman, Robert L., Jr.
1993-01-01
The ADP (Automated Data Processing) Analysis Project was conducted for the Human Resources Management Division (HRMD) of NASA's Langley Research Center. The three major areas of work in the project were computer support, automated inventory analysis, and an ADP study for the Division. The goal of the computer support work was to determine automation needs of Division personnel and help them solve computing problems. The goal of automated inventory analysis was to find a way to analyze installed software and usage on a Macintosh. Finally, the ADP functional systems study for the Division was designed to assess future HRMD needs concerning ADP organization and activities.
Automated computation of autonomous spectral submanifolds for nonlinear modal analysis
NASA Astrophysics Data System (ADS)
Ponsioen, Sten; Pedergnana, Tiemo; Haller, George
2018-04-01
We discuss an automated computational methodology for computing two-dimensional spectral submanifolds (SSMs) in autonomous nonlinear mechanical systems of arbitrary degrees of freedom. In our algorithm, SSMs, the smoothest nonlinear continuations of modal subspaces of the linearized system, are constructed up to arbitrary orders of accuracy, using the parameterization method. An advantage of this approach is that the construction of the SSMs does not break down when the SSM folds over its underlying spectral subspace. A further advantage is an automated a posteriori error estimation feature that enables a systematic increase in the orders of the SSM computation until the required accuracy is reached. We find that the present algorithm provides a major speed-up, relative to numerical continuation methods, in the computation of backbone curves, especially in higher-dimensional problems. We illustrate the accuracy and speed of the automated SSM algorithm on lower- and higher-dimensional mechanical systems.
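The parameterization method mentioned in this record is usually stated as solving an invariance equation; a standard form, under the assumed first-order notation ż = Az + f(z) for the mechanical system, is:

```latex
% Assumed notation: \dot{z} = A z + f(z) is the first-order form of the
% mechanical system, W(p) parameterizes the SSM over reduced coordinates p,
% and R(p) is the reduced dynamics on the SSM. The parameterization method
% solves the invariance equation
\[
  A\,W(p) + f\bigl(W(p)\bigr) = DW(p)\,R(p),
\]
% expanding W and R as polynomials in p up to the chosen order; the residual
% left at higher orders is what drives the a posteriori error estimate.
```

Because W is solved order by order rather than as a graph over the spectral subspace, the construction survives folding of the SSM over that subspace, as the abstract notes.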
45 CFR 310.1 - What definitions apply to this part?
Code of Federal Regulations, 2010 CFR
2010-10-01
... existing automated data processing computer system through an Intergovernmental Service Agreement; (4...) Office Automation means a generic adjunct component of a computer system that supports the routine... timely and satisfactory; (iv) Assurances that information in the computer system as well as access, use...
45 CFR 310.1 - What definitions apply to this part?
Code of Federal Regulations, 2013 CFR
2013-10-01
... existing automated data processing computer system through an Intergovernmental Service Agreement; (4...) Office Automation means a generic adjunct component of a computer system that supports the routine... timely and satisfactory; (iv) Assurances that information in the computer system as well as access, use...
45 CFR 310.1 - What definitions apply to this part?
Code of Federal Regulations, 2014 CFR
2014-10-01
... existing automated data processing computer system through an Intergovernmental Service Agreement; (4...) Office Automation means a generic adjunct component of a computer system that supports the routine... timely and satisfactory; (iv) Assurances that information in the computer system as well as access, use...
45 CFR 310.1 - What definitions apply to this part?
Code of Federal Regulations, 2011 CFR
2011-10-01
... existing automated data processing computer system through an Intergovernmental Service Agreement; (4...) Office Automation means a generic adjunct component of a computer system that supports the routine... timely and satisfactory; (iv) Assurances that information in the computer system as well as access, use...
45 CFR 310.1 - What definitions apply to this part?
Code of Federal Regulations, 2012 CFR
2012-10-01
... existing automated data processing computer system through an Intergovernmental Service Agreement; (4...) Office Automation means a generic adjunct component of a computer system that supports the routine... timely and satisfactory; (iv) Assurances that information in the computer system as well as access, use...
Systems engineering: A formal approach. Part 1: System concepts
NASA Astrophysics Data System (ADS)
Vanhee, K. M.
1993-03-01
Engineering is the scientific discipline focused on the creation of new artifacts that are supposed to be of some use to our society. Different types of artifacts require different engineering approaches. However, in all these disciplines the development of a new artifact is divided into stages. Three stages can always be recognized: Analysis, Design, and Realization. The book considers only the first two stages of the development process. It focuses on a specific type of artifact, called discrete dynamic systems. These systems consist of active components, or actors, that consume and produce passive components, or tokens. Three subtypes are studied in more detail: business systems (like a factory or restaurant), information systems (whether automated or not), and automated systems (systems that are controlled by an automated information system). The first subtype is studied by industrial engineers, the last by software engineers and electrical engineers, whereas the second is a battlefield for all three disciplines. The union of these disciplines is called systems engineering.
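The actor/token view of a discrete dynamic system described above can be sketched in a few lines; the class names and the toy "restaurant" net are illustrative assumptions, not taken from the book:

```python
from collections import Counter

# Minimal sketch of a discrete dynamic system in the actor/token sense:
# actors consume tokens and produce tokens (a Petri-net-like view).

class Actor:
    def __init__(self, consumes, produces):
        self.consumes = Counter(consumes)   # tokens required per firing
        self.produces = Counter(produces)   # tokens emitted per firing

    def enabled(self, marking):
        """An actor may fire only if all required tokens are present."""
        return all(marking[t] >= n for t, n in self.consumes.items())

    def fire(self, marking):
        """Consume input tokens and produce output tokens (in place)."""
        assert self.enabled(marking)
        marking.subtract(self.consumes)
        marking.update(self.produces)

# Toy "restaurant" system: the kitchen turns an order plus ingredients
# into a meal.
kitchen = Actor(consumes={"order": 1, "ingredients": 1},
                produces={"meal": 1})
marking = Counter({"order": 2, "ingredients": 1})
kitchen.fire(marking)   # afterwards: one order left, no ingredients, one meal
```

After the single firing the kitchen is no longer enabled (no ingredients remain), which is exactly the kind of state-dependent behavior the analysis stage of such systems examines.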
Continuous-flow automation and hemolysis index: a crucial combination.
Lippi, Giuseppe; Plebani, Mario
2013-04-01
A paradigm shift has occurred in the role and organization of laboratory diagnostics over the past decades, wherein consolidation or networking of small laboratories into larger factories and point-of-care testing have simultaneously evolved and now seem to favorably coexist. There is now evidence, however, that the growing implementation of continuous-flow automation, especially in closed systems, has not eased the identification of hemolyzed specimens since the integration of preanalytical and analytical workstations would hide them from visual scrutiny, with an inherent risk that unreliable test results may be released to the stakeholders. Along with other technical breakthroughs, the new generation of laboratory instrumentation is increasingly equipped with systems that can systematically and automatically be tested for a broad series of interferences, the so-called serum indices, which also include the hemolysis index. The routine implementation of these technical tools in clinical laboratories equipped with continuous-flow automation carries several advantages and some drawbacks that are discussed in this article.
High-speed autoverifying technology for printed wiring boards
NASA Astrophysics Data System (ADS)
Ando, Moritoshi; Oka, Hiroshi; Okada, Hideo; Sakashita, Yorihiro; Shibutani, Nobumi
1996-10-01
We have developed an automated pattern verification technique. The output of an automated optical inspection system contains many false alarms, so verification is needed to distinguish between minor irregularities and serious defects. In the past, this verification was usually done manually, which led to unsatisfactory product quality. The goal of our new automated verification system is to detect pattern features on surface mount technology boards. In our system, we employ a new illumination method that uses multiple colors and multiple illumination directions. Images are captured with a CCD camera. We have developed a new algorithm that uses CAD data for both pattern matching and pattern structure determination. This helps to search for patterns around a defect and to examine defect definition rules. These are processed with a high-speed workstation and hard-wired circuits. The system can verify a defect within 1.5 seconds. The verification system was tested in a factory, where it verified 1,500 defective samples and detected all significant defects with a false-alarm error rate of only 0.1 percent.
17 CFR 38.156 - Automated trade surveillance system.
Code of Federal Regulations, 2014 CFR
2014-04-01
... potential trade practice violations. The automated system must load and process daily orders and trades no... anomalies; compute, retain, and compare trading statistics; compute trade gains, losses, and futures...
17 CFR 38.156 - Automated trade surveillance system.
Code of Federal Regulations, 2013 CFR
2013-04-01
... potential trade practice violations. The automated system must load and process daily orders and trades no... anomalies; compute, retain, and compare trading statistics; compute trade gains, losses, and futures...
ERIC Educational Resources Information Center
Majchrzak, Ann
A study was conducted of the training programs used by plants with computer-aided design/computer-aided manufacturing (CAD/CAM) to help their employees adapt to automated manufacturing. The study sought to determine the relative priorities of manufacturing establishments for training certain workers in certain skills; the status of…
Computer Proficiency for Online Learning: Factorial Invariance of Scores among Teachers
ERIC Educational Resources Information Center
Martin, Amy L.; Reeves, Todd D.; Smith, Thomas J.; Walker, David A.
2016-01-01
Online learning is variously employed in K-12 education, including for teacher professional development. However, the use of computer-based technologies for learning purposes assumes learner computer proficiency, making this construct an important domain of procedural knowledge in formal and informal online learning contexts. Addressing this…
Using satellite communications for a mobile computer network
NASA Technical Reports Server (NTRS)
Wyman, Douglas J.
1993-01-01
The topics discussed include the following: patrol car automation, mobile computer network, network requirements, network design overview, MCN mobile network software, MCN hub operation, mobile satellite software, hub satellite software, the benefits of patrol car automation, the benefits of satellite mobile computing, and national law enforcement satellite.
iPTF Discoveries of Recent Type Ia Supernovae
NASA Astrophysics Data System (ADS)
Petrushevska, T.; Ferretti, R.; Fremling, C.; Hangard, L.; Johansson, J.; Migotto, K.; Nyholm, A.; Papadogiannakis, S.; Ben-Ami, S.; De Cia, A.; Dzigan, Y.; Horesh, A.; Leloudas, G.; Manulis, I.; Rubin, A.; Sagiv, I.; Vreeswijk, P.; Yaron, O.; Cao, Y.; Perley, D.; Miller, A.; Waszczak, A.; Kasliwal, M. M.; Hosseinzadeh, G.; Cenko, S. B.; Quimby, R.
2015-05-01
The intermediate Palomar Transient Factory (ATel #4807) reports the discovery and classification of the following Type Ia SNe. Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R) and RB5 (Wozniak et al. 2013AAS...22143105W). See ATel #7112 for additional details.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurt Derr; Milos Manic
Time and location data play a very significant role in a variety of factory automation scenarios, from automated vehicles and robots (their navigation, tracking, and monitoring) to optimization and security services. In addition, pervasive wireless capabilities combined with time and location information are enabling new applications in areas such as transportation systems, health care, elder care, military, emergency response, critical infrastructure, and law enforcement. A person or object in proximity to certain areas for specific durations of time may pose a risk hazard to themselves, to others, or to the environment. This paper presents DSTiPE, a novel fuzzy-based method for calculating the spatio-temporal risk that an object with wireless communications presents to the environment. The presented Matlab-based application for fuzzy spatio-temporal risk cluster extraction is verified on a diagonal vehicle movement example.
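A fuzzy spatio-temporal risk of the kind described can be illustrated with membership functions over proximity and dwell time combined by a fuzzy AND; the membership shapes and breakpoints below are made-up assumptions, not the actual DSTiPE formulation:

```python
# Illustrative fuzzy spatio-temporal risk: grade how "near" an object is
# to a hazardous area and how "long" it has dwelled there, then take the
# minimum (fuzzy AND) of the two membership grades as the risk.

def triangular(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def risk(distance_m, dwell_s):
    # "Near" peaks at 0 m and fades out by 50 m; "long dwell" ramps up
    # linearly to full membership at 600 s. Breakpoints are hypothetical.
    near = triangular(distance_m, -1.0, 0.0, 50.0)
    long_dwell = min(dwell_s / 600.0, 1.0)
    return min(near, long_dwell)   # fuzzy AND via minimum

# An object parked at the hazard for ten minutes is maximal risk;
# one passing 100 m away carries none.
```

Clustering such per-sample risk values over a trajectory, as the abstract's "risk cluster extraction" suggests, would then flag sustained high-risk segments.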
NASA Technical Reports Server (NTRS)
Boyle, W. G.; Barton, G. W.
1979-01-01
The feasibility of computerized automation of the Analytical Laboratories Section at NASA's Lewis Research Center was considered. Since that laboratory's duties are not routine, the automation goals were set with that in mind. Four instruments were selected as the most likely automation candidates: an atomic absorption spectrophotometer, an emission spectrometer, an X-ray fluorescence spectrometer, and an X-ray diffraction unit. Two options for computer automation were described: a time-shared central computer and a system with microcomputers for each instrument connected to a central computer. A third option, presented for future planning, expands the microcomputer version. Costs and benefits for each option were considered. It was concluded that the microcomputer version best fits the goals and duties of the laboratory and that such an automated system is needed to meet the laboratory's future requirements.
ERIC Educational Resources Information Center
Dikli, Semire
2006-01-01
The impacts of computers on writing have been widely studied for three decades. Even basic computer functions, i.e., word processing, have been of great assistance to writers in modifying their essays. The research on Automated Essay Scoring (AES) has revealed that computers have the capacity to function as a more effective cognitive tool (Attali,…
Computer-Aided Instruction in Automated Instrumentation.
ERIC Educational Resources Information Center
Stephenson, David T.
1986-01-01
Discusses functions of automated instrumentation systems, i.e., systems which combine electrical measuring instruments and a controlling computer to measure responses of a unit under test. The computer-assisted tutorial then described is programmed for use on such a system--a modern microwave spectrum analyzer--to introduce engineering students to…
An Analysis for Capital Expenditure Decisions at a Naval Regional Medical Center.
1981-12-01
Service: 1. Portable defibrillator and cardioscope; 2. ECG cart; 3. Gas system sterilizer; 4. Automated blood cell counter; 5. Computed tomographic scanner. Equipment Review Committee: 1. Computed tomographic scanner; 2. Automated blood cell counter; 3. Gas system sterilizer; 4. Portable defibrillator and cardioscope; 5. ECG cart. …dictating and automated typing) systems. e. Filing equipment. f. Automatic data processing equipment including data communications equipment. g
Cameo: A Python Library for Computer Aided Metabolic Engineering and Optimization of Cell Factories.
Cardoso, João G R; Jensen, Kristian; Lieven, Christian; Lærke Hansen, Anne Sofie; Galkina, Svetlana; Beber, Moritz; Özdemir, Emre; Herrgård, Markus J; Redestig, Henning; Sonnenschein, Nikolaus
2018-04-20
Computational systems biology methods enable rational design of cell factories on a genome-scale and thus accelerate the engineering of cells for the production of valuable chemicals and proteins. Unfortunately, the majority of these methods' implementations are either not published, rely on proprietary software, or do not provide documented interfaces, which has precluded their mainstream adoption in the field. In this work we present cameo, a platform-independent software that enables in silico design of cell factories and targets both experienced modelers as well as users new to the field. It is written in Python and implements state-of-the-art methods for enumerating and prioritizing knockout, knock-in, overexpression, and down-regulation strategies and combinations thereof. Cameo is an open source software project and is freely available under the Apache License 2.0. A dedicated Web site including documentation, examples, and installation instructions can be found at http://cameo.bio . Users can also give cameo a try at http://try.cameo.bio .
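The knockout-enumeration idea that cameo implements on genome-scale models can be conveyed with a toy pathway; the network, capacities, and function names below are illustrative assumptions, and this is not cameo's actual API:

```python
from itertools import combinations

# Toy sketch of knockout screening for a cell factory: a tiny linear
# product pathway with per-reaction capacities, where achievable product
# flux is the bottleneck capacity. Real tools solve a genome-scale linear
# program instead of this hand computation.

capacities = {"uptake": 10.0, "glycolysis": 8.0, "byproduct_branch": 5.0,
              "product_synthesis": 6.0}
pathway = ["uptake", "glycolysis", "product_synthesis"]  # route to product

def product_flux(knockouts=()):
    """Bottleneck flux through the product pathway after knockouts."""
    caps = [0.0 if r in knockouts else capacities[r] for r in pathway]
    return min(caps)

def screen(max_size=1):
    """Enumerate knockout sets up to max_size and record product flux."""
    results = {}
    for size in range(max_size + 1):
        for ko in combinations(capacities, size):
            results[ko] = product_flux(ko)
    return results

results = screen()
# Knocking out any pathway reaction kills production, while knocking out
# the off-pathway branch leaves the 6.0 bottleneck flux unchanged here.
```

Strain-design methods of the kind the abstract lists (knockout, knock-in, overexpression, down-regulation) generalize this search to combinations of interventions ranked by predicted production.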
An, Gao; Hong, Li; Zhou, Xiao-Bing; Yang, Qiong; Li, Mei-Qing; Tang, Xiang-Yang
2017-03-01
We investigated and compared the functionality of two 3D visualization software packages provided by a CT vendor and a third-party vendor, respectively. Using surgical anatomical measurement as a baseline, we evaluated the accuracy of 3D visualization and verified their utility in computer-aided anatomical analysis. The study cohort consisted of 50 adult cadavers fixed with the classical formaldehyde method. The computer-aided anatomical analysis was based on CT images (in DICOM format) acquired by helical scan with contrast enhancement, using a CT-vendor-provided 3D visualization workstation (Syngo) and a third-party 3D visualization software package (Mimics) installed on a PC. Automated and semi-automated segmentations were utilized in the 3D visualization workstation and software, respectively. The functionality and efficiency of the automated and semi-automated segmentation methods were compared. Using surgical anatomical measurement as a baseline, the accuracy of 3D visualization based on automated and semi-automated segmentations was quantitatively compared. In semi-automated segmentation, the Mimics 3D visualization software outperformed the Syngo 3D visualization workstation. No significant difference was observed in anatomical data measurement between the Syngo 3D visualization workstation and the Mimics 3D visualization software (P>0.05). Both the Syngo 3D visualization workstation provided by the CT vendor and the Mimics 3D visualization software from the third-party vendor possessed the functionality, efficiency and accuracy needed for computer-aided anatomical analysis.
Computer vision challenges and technologies for agile manufacturing
NASA Astrophysics Data System (ADS)
Molley, Perry A.
1996-02-01
Sandia National Laboratories, a Department of Energy laboratory, is responsible for maintaining the safety, security, reliability, and availability of the nuclear weapons stockpile for the United States. Because of the changing national and global political climates and inevitable budget cuts, Sandia is changing the methods and processes it has traditionally used in the product realization cycle for weapon components. Because of the increasing age of the nuclear stockpile, it is certain that the reliability of these weapons will degrade with time unless eventual action is taken to repair, requalify, or renew them. Furthermore, due to the downsizing of the DOE weapons production sites and loss of technical personnel, the new product realization process is being focused on developing and deploying advanced automation technologies in order to maintain the capability for producing new components. The goal of Sandia's technology development program is to create a product realization environment that is cost effective, with improved quality and reduced cycle time for small lot sizes. The new environment will rely less on the expertise of humans and more on intelligent systems and automation to perform the production processes. The systems will be robust in order to provide maximum flexibility and responsiveness for rapidly changing component or product mixes. An integrated enterprise will allow ready access to and use of information for effective and efficient product and process design. Concurrent engineering methods will allow a speedup of the product realization cycle, reduce costs, and dramatically lessen the dependency on creating and testing physical prototypes. Virtual manufacturing will allow production processes to be designed, integrated, and programmed off-line before a piece of hardware ever moves. The overriding goal is to be able to build a large variety of new weapons parts on short notice.
Many of the technologies being developed are also applicable to commercial production processes and applications. Computer vision will play a critical role in the new agile production environment for automation of processes such as inspection, assembly, welding, material dispensing, and other process control tasks. Although many academic and commercial solutions have been developed, none has seen widespread adoption, considering the huge number of applications that could benefit from this technology. The reason for this slow adoption is that the advantages of computer vision for automation can be a double-edged sword. The benefits can be lost if the vision system requires an inordinate amount of time for reprogramming by a skilled operator to account for different parts, changes in lighting conditions, background clutter, changes in optics, etc. Commercially available solutions typically require an operator to manually program the vision system with the features used for recognition. In a recent survey, we asked a number of commercial manufacturers and machine vision companies the question, 'What prevents machine vision systems from being more useful in factories?' The number one (and unanimous) response was that vision systems require too much skill to set up and program to be cost effective.
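The lighting-sensitivity problem described in this abstract can be seen in a toy matching example. This is a generic sketch (not code from the survey or from any particular vision system): a plain sum-of-squared-differences score treats a uniformly brightened patch as a poor match, while a normalized cross-correlation score, which subtracts the mean, still recognizes it.

```python
import numpy as np

def ssd_score(patch, template):
    # Sum of squared differences: lower is better, but sensitive to lighting.
    return float(np.sum((patch - template) ** 2))

def ncc_score(patch, template):
    # Normalized cross-correlation: mean subtraction and normalization make
    # the score invariant to uniform brightness and contrast changes.
    p = patch - patch.mean()
    t = template - template.mean()
    return float((p * t).sum() / (np.linalg.norm(p) * np.linalg.norm(t) + 1e-12))

template = np.array([[0., 1., 0.], [1., 2., 1.], [0., 1., 0.]])
same = template.copy()
brighter = template + 10.0   # simulated change in scene lighting

# SSD scores the brighter patch as a much worse match than the identical one...
assert ssd_score(brighter, template) > ssd_score(same, template)
# ...while NCC still scores it as a perfect match.
assert abs(ncc_score(brighter, template) - 1.0) < 1e-9
```

The same fragility is why handcrafted feature setups need operator reprogramming whenever lighting or optics change.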
ERDC MSRC Resource. High Performance Computing for the Warfighter. Fall 2006
2006-01-01
to as Aggregated Combat Modeling, putting us at the campaign level).” Incorporating UIT within DAC The DAC system is written in Python and uses...API calls with two Python classes, UITConnectionFactory and UITConnection. UITConnectionFactory supports Kerberos authentication and establishes a...API calls within these Python classes, we insulated the DAC code from the Python SOAP interface requirements and details of the ERDC MSRC Resource
NASA Technical Reports Server (NTRS)
Sidik, S. M.
1973-01-01
An algorithm and computer program are presented for generating all the distinct 2(p-q) fractional factorial designs. Some applications of this algorithm to the construction of tables of designs and of designs for nonstandard situations and its use in Bayesian design are discussed. An appendix includes a discussion of an actual experiment whose design was facilitated by the algorithm.
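The construction underlying such designs can be sketched briefly. This is not Sidik's algorithm itself, only a minimal illustration of how a single 2^(p-q) fraction is built: run a full factorial in the p-q base factors and define each of the remaining q factors as a product (generator) of base factors.

```python
from itertools import product

def fractional_factorial(p, generators):
    """Build one 2^(p-q) fractional factorial design in -1/+1 coding.

    p: total number of factors; q = len(generators).
    Each generator lists the base-factor indices whose product defines
    one added factor, e.g. [0, 1, 2] means D = ABC.
    """
    base = p - len(generators)
    runs = []
    for levels in product([-1, 1], repeat=base):
        row = list(levels)
        for gen in generators:
            prod = 1
            for i in gen:
                prod *= row[i]       # added factor = product of base factors
            row.append(prod)
        runs.append(row)
    return runs

# Classic 2^(4-1) half fraction with defining relation I = ABCD
design = fractional_factorial(4, [[0, 1, 2]])
assert len(design) == 8                                  # 8 runs instead of 16
assert all(a * b * c * d == 1 for a, b, c, d in design)  # I = ABCD holds
```

Enumerating and de-duplicating all distinct fractions (the subject of the record above) is the harder combinatorial problem built on top of this basic construction.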
NASA Technical Reports Server (NTRS)
Brice, R.; Mosley, J.; Willis, D.; Coleman, K.; Martin, C.; Shelby, L.; Kelley, U.; Renfro, E.; Griffith, G.; Warsame, A.
1989-01-01
In a continued effort to design a surface-based factory on Mars for the production of oxygen and water, the Design Group at Prairie View A&M University made a preliminary study of the surface and atmospheric composition of Mars and determined the mass densities of the various gases in the Martian atmosphere. Based on the initial studies, the design group determined oxygen and water to be the two products that could be produced economically under Martian conditions. Studies were also made of present production techniques for water and oxygen. Analyses were made to evaluate the current methods of production that were adaptable to Martian conditions. The detailed report was contained in an Interim Report submitted to NASA/USRA in August 1986. Even though the initial effort was the production of oxygen and water, we found it necessary to produce diluent gases that can be mixed with oxygen to constitute 'breathable' air. In Phase 2--Task 1A, the Prairie View A&M University team completed the conceptual design of a breathable-air manufacturing system, a means of drilling for underground water, and storage of water for future use. The design objective of the team for the 1987-1988 academic year was the conceptual design of an integrated system for the supply of quality water for biological consumption, farming, and residential and industrial use. That design has also been completed. Phase 2--Task 1C is the present task for the Prairie View Design Team. It continues the previous work, investigating the extraction of water from beneath the surface and, where accessible, an alternative method of extraction from ice formations on the surface of Mars. In addition to the investigation of water extraction, a system for computer control of extraction and treatment was developed, with emphasis on fully automated control with robotic repair and maintenance.
It is expected that oxygen- and water-producing plants on Mars will be limited in the amount of human control that will be available to operate large and/or isolated plants. Therefore, it is imperative that computers be integrated into plant operation with the capability to maintain life support systems and analyze and replace defective parts or systems with no human interface.
Automated computer grading of hardwood lumber
P. Klinkhachorn; J.P. Franklin; Charles W. McMillin; R.W. Conners; H.A. Huber
1988-01-01
This paper describes an improved computer program to grade hardwood lumber. The program was created as part of a system to automate various aspects of the hardwood manufacturing industry. It enhances previous efforts by considering both faces of the board and provides easy application of species dependent rules. The program can be readily interfaced with a computer...
Computer Assisted School Automation (CASA) in Japan.
ERIC Educational Resources Information Center
Sakamoto, Takashi; Nakanome, Naoaki
1991-01-01
This assessment of the status of computer assisted school automation (CASA) in Japan begins by describing the structure of the Japanese educational system and the roles of CASA in that system. Statistics on various aspects of computers in Japanese schools and the findings of several surveys are cited to report on the present state of educational…
DOT National Transportation Integrated Search
1974-08-01
Volume 4 describes the automation requirements. A presentation of automation requirements is made for an advanced air traffic management system in terms of controller work force, computer resources, controller productivity, system manning, failure ef...
Intelligent robot trends and predictions for the first year of the new millennium
NASA Astrophysics Data System (ADS)
Hall, Ernest L.
2000-10-01
An intelligent robot is a remarkably useful combination of a manipulator, sensors and controls. The current use of these machines in outer space, medicine, hazardous materials, defense applications and industry is being pursued with vigor. In factory automation, industrial robots can improve productivity, increase product quality and improve competitiveness. The computer and the robot have both been developed during recent times. The intelligent robot combines both technologies and requires a thorough understanding and knowledge of mechatronics. Today's robotic machines are faster, cheaper, more repeatable, more reliable and safer than ever. The knowledge base of inverse kinematic and dynamic solutions and intelligent controls is increasing. More attention is being given by industry to robots, vision and motion controls. New areas of usage are emerging for service robots, remote manipulators and automated guided vehicles. Economically, the robotics industry now has more than a billion-dollar market in the U.S. and is growing. Feasibility studies show decreasing costs for robots and unaudited healthy rates of return for a variety of robotic applications. However, the road from inspiration to successful application can be long and difficult, often taking decades to achieve a new product. A greater emphasis on mechatronics is needed in our universities. Certainly, more cooperation between government, industry and universities is needed to speed the development of intelligent robots that will benefit industry and society. The fearful robot stories may help us prevent future disaster. The inspirational robot ideas may inspire the scientists of tomorrow. However, the intelligent robot ideas, which can be reduced to practice, will change the world.
ERIC Educational Resources Information Center
Collins, Michael J.; Vitz, Ed
1988-01-01
Examines two computer interfaced lab experiments: 1) discusses the automation of a Perkin Elmer 337 infrared spectrophotometer noting the mechanical and electronic changes needed; 2) uses the Gouy method and Lotus Measure software to automate magnetic susceptibility determinations. Methodology is described. (MVL)
Safety in the Automated Office.
ERIC Educational Resources Information Center
Graves, Pat R.; Greathouse, Lillian R.
1990-01-01
Office automation has introduced new hazards to the workplace: electrical hazards related to computer wiring, musculoskeletal problems resulting from use of computer terminals and design of work stations, and environmental concerns related to ventilation, noise levels, and office machine chemicals. (SK)
Computer automation for feedback system design
NASA Technical Reports Server (NTRS)
1975-01-01
Mathematical techniques and explanations of the various steps used by an automated computer program to design feedback systems are summarized. Special attention was given to refining the automatic evaluation of suboptimal loop transmissions and the translation of time-domain specifications into frequency-domain specifications.
Understanding and enhancing user acceptance of computer technology
NASA Technical Reports Server (NTRS)
Rouse, William B.; Morris, Nancy M.
1986-01-01
Technology-driven efforts to implement computer technology often encounter problems due to lack of acceptance or begrudging acceptance of the personnel involved. It is argued that individuals' acceptance of automation, in terms of either computerization or computer aiding, is heavily influenced by their perceptions of the impact of the automation on their discretion in performing their jobs. It is suggested that desired levels of discretion reflect needs to feel in control and achieve self-satisfaction in task performance, as well as perceptions of inadequacies of computer technology. Discussion of these factors leads to a structured set of considerations for performing front-end analysis, deciding what to automate, and implementing the resulting changes.
Automation to improve efficiency of field expedient injury prediction screening.
Teyhen, Deydre S; Shaffer, Scott W; Umlauf, Jon A; Akerman, Raymond J; Canada, John B; Butler, Robert J; Goffar, Stephen L; Walker, Michael J; Kiesel, Kyle B; Plisky, Phillip J
2012-07-01
Musculoskeletal injuries are a primary source of disability in the U.S. Military. Physical training and sports-related activities account for up to 90% of all injuries, and 80% of these injuries are considered overuse in nature. As a result, there is a need to develop an evidence-based musculoskeletal screen that can assist with injury prevention. The purpose of this study was to assess the capability of an automated system to improve the efficiency of field expedient tests that may help predict injury risk and provide corrective strategies for deficits identified. The field expedient tests include survey questions and measures of movement quality, balance, trunk stability, power, mobility, and foot structure and mobility. Data entry for these tests was automated using handheld computers, barcode scanning, and netbook computers. An automated algorithm for injury risk stratification and mitigation techniques was run on a server computer. Without automation support, subjects were assessed in 84.5 ± 9.1 minutes per subject compared with 66.8 ± 6.1 minutes per subject with automation and 47.1 ± 5.2 minutes per subject with automation and process improvement measures (p < 0.001). The average time to manually enter the data was 22.2 ± 7.4 minutes per subject. An additional 11.5 ± 2.5 minutes per subject was required to manually assign an intervention strategy. Automation of this injury prevention screening protocol using handheld devices and netbook computers allowed for real-time data entry and enhanced the efficiency of injury screening, risk stratification, and prescription of a risk mitigation strategy.
Industrial systems biology and its impact on synthetic biology of yeast cell factories.
Fletcher, Eugene; Krivoruchko, Anastasia; Nielsen, Jens
2016-06-01
Engineering industrial cell factories to effectively yield a desired product while dealing with industrially relevant stresses is usually the most challenging step in the development of industrial production of chemicals using microbial fermentation processes. Using synthetic biology tools, microbial cell factories such as Saccharomyces cerevisiae can be engineered to express synthetic pathways for the production of fuels, biopharmaceuticals, fragrances, and food flavors. However, directing fluxes through these synthetic pathways towards the desired product can be demanding due to complex regulation or poor gene expression. Systems biology, which applies computational tools and mathematical modeling to understand complex biological networks, can be used to guide synthetic biology design. Here, we present our perspective on how systems biology can impact synthetic biology towards the goal of developing improved yeast cell factories. Biotechnol. Bioeng. 2016;113: 1164-1170. © 2015 Wiley Periodicals, Inc.
Undergraduate students' initial conceptions of factorials
NASA Astrophysics Data System (ADS)
Lockwood, Elise; Erickson, Sarah
2017-05-01
Counting problems offer rich opportunities for students to engage in mathematical thinking, but they can be difficult for students to solve. In this paper, we present a study that examines student thinking about one concept within counting, factorials, which are a key aspect of many combinatorial ideas. In an effort to better understand students' conceptions of factorials, we conducted interviews with 20 undergraduate students. We present a key distinction between computational versus combinatorial conceptions, and we explore three aspects of data that shed light on students' conceptions (their initial characterizations, their definitions of 0!, and their responses to Likert-response questions). We present implications this may have for mathematics educators both within and separate from combinatorics.
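The computational-versus-combinatorial distinction drawn above can be made concrete in a few lines. The sketch below (illustrative only, not from the study) shows the computational view of n! as an iterated product, and the combinatorial view of n! as a count of arrangements, which also explains the convention 0! = 1.

```python
from itertools import permutations

def factorial(n):
    """Computational view: n! as the product n * (n-1) * ... * 1."""
    result = 1                 # empty product, so factorial(0) == 1
    for k in range(2, n + 1):
        result *= k
    return result

# Combinatorial view: n! counts the orderings of n distinct objects.
assert factorial(4) == len(list(permutations("abcd")))   # 24 orderings

# There is exactly one arrangement of zero objects (the empty arrangement),
# which is the combinatorial reason 0! is defined to be 1.
assert factorial(0) == len(list(permutations(""))) == 1
```

Students holding only the computational conception often see 0! = 1 as an arbitrary rule; the counting interpretation gives it content.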
Dynamic mapping of EDDL device descriptions to OPC UA
NASA Astrophysics Data System (ADS)
Atta Nsiah, Kofi; Schappacher, Manuel; Sikora, Axel
2017-07-01
OPC UA (Open Platform Communications Unified Architecture) is already a well-known concept used widely in the automation industry. In the area of factory automation, OPC UA models the underlying field devices, such as sensors and actuators, in an OPC UA server, allowing connected OPC UA clients to access device-specific information via a standardized information model. One of the requirements for the OPC UA server to represent field device data in its information model is advance knowledge of the properties of the field devices, in the form of device descriptions. The international standard IEC 61804 specifies EDDL (Electronic Device Description Language) as a generic language for describing the properties of field devices. In this paper, the authors describe a possibility to dynamically map and integrate field device descriptions based on EDDL into OPC UA.
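The mapping idea can be illustrated with a deliberately simplified sketch. Everything here is hypothetical: the "EDDL" fragment below is an invented toy syntax, not real IEC 61804 EDDL, and a plain dict stands in for an OPC UA server's address space; only the `ns=<index>;s=<string>` node-id notation follows OPC UA convention, and the namespace index 2 is arbitrary.

```python
import re

# Invented, simplified device-description syntax (NOT real IEC 61804 EDDL).
edd_text = '''
VARIABLE temperature { TYPE FLOAT; UNIT "degC"; }
VARIABLE pressure { TYPE FLOAT; UNIT "bar"; }
'''

def map_edd_to_address_space(text, device="Device1"):
    """Parse the toy description and build a dict standing in for an
    OPC UA address space: string NodeId -> node attributes."""
    pattern = re.compile(r'VARIABLE (\w+) \{ TYPE (\w+); UNIT "([^"]+)"; \}')
    nodes = {}
    for name, dtype, unit in pattern.findall(text):
        node_id = f"ns=2;s={device}.{name}"   # OPC UA string NodeId notation
        nodes[node_id] = {"BrowseName": name,
                          "DataType": dtype,
                          "EngineeringUnits": unit}
    return nodes

space = map_edd_to_address_space(edd_text)
assert "ns=2;s=Device1.temperature" in space
assert space["ns=2;s=Device1.pressure"]["EngineeringUnits"] == "bar"
```

A dynamic mapper of the kind the paper describes would perform this translation at runtime, whenever a new device description is loaded, rather than requiring the server's information model to be built in advance.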
de novo computational enzyme design.
Zanghellini, Alexandre
2014-10-01
Recent advances in systems and synthetic biology as well as metabolic engineering are poised to transform industrial biotechnology by allowing us to design cell factories for the sustainable production of valuable fuels and chemicals. To deliver on their promises, such cell factories, as much as their brick-and-mortar counterparts, will require appropriate catalysts, especially for classes of reactions that are not known to be catalyzed by enzymes in natural organisms. A recently developed methodology, de novo computational enzyme design, can be used to create enzymes catalyzing novel reactions. Here we review the different classes of chemical reactions for which active protein catalysts have been designed, as well as the results of detailed biochemical and structural characterization studies. We also discuss how combining de novo computational enzyme design with more traditional protein engineering techniques can alleviate the shortcomings of state-of-the-art computational design techniques and create novel enzymes with catalytic proficiencies on par with natural enzymes. Copyright © 2014 Elsevier Ltd. All rights reserved.
Automated clinical system for chromosome analysis
NASA Technical Reports Server (NTRS)
Castleman, K. R.; Friedan, H. J.; Johnson, E. T.; Rennie, P. A.; Wall, R. J. (Inventor)
1978-01-01
An automatic chromosome analysis system is provided wherein a suitably prepared slide with chromosome spreads thereon is placed on the stage of an automated microscope. The automated microscope stage is computer operated to move the slide to enable detection of chromosome spreads on the slide. The X and Y location of each chromosome spread that is detected is stored. The computer measures the chromosomes in a spread, classifies them by group or by type and also prepares a digital karyotype image. The computer system can also prepare a patient report summarizing the result of the analysis and listing suspected abnormalities.
NASA Astrophysics Data System (ADS)
Calì, M.; Santarelli, M. G. L.; Leone, P.
Gas Turbine Technologies (GTT) and Politecnico di Torino, both located in Torino (Italy), have been involved in the design and installation of a SOFC laboratory in order to analyse the operation, in cogenerative configuration, of the CHP 100 kWe SOFC Field Unit, built by Siemens-Westinghouse Power Corporation (SWPC), which is at present (May 2005) starting its operation and which will supply electric and thermal power to the GTT factory. In order to take better advantage of the analysis of the on-site operation, and especially to correctly design the scheduled experimental tests on the system, we developed a mathematical model and ran a simulated experimental campaign, applying a rigorous statistical approach to the analysis of the results. The aim of this work is the computer experimental analysis, through a statistical methodology (2^k factorial experiments), of the CHP 100 performance. First, the mathematical model was calibrated with the results acquired during the first CHP100 demonstration at EDB/ELSAM in Westerwoort. Afterwards, the simulated tests were performed in the form of a computer experimental session, and the measurement uncertainties were simulated with perturbations imposed on the model's independent variables. The statistical methodology used for the computer experimental analysis is factorial design (Yates' technique): using the ANOVA technique, the effect of the main independent variables (air utilization factor U_ox, fuel utilization factor U_F, internal fuel and air preheating, and anodic recycling flow rate) was investigated in a rigorous manner. The analysis accounts for the effects of the parameters on stack electric power, thermal recovered power, single-cell voltage, cell operating temperature, consumed fuel flow, and steam-to-carbon ratio. Each main effect and interaction effect of the parameters is shown, with particular attention to generated electric power and stack heat recovered.
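Yates' technique mentioned in this abstract reduces, for a 2^k design, to k passes of pairwise sums and differences over the responses listed in standard order. The generic sketch below (not the authors' SOFC model) recovers the grand mean and the main/interaction effects from those passes.

```python
def yates_effects(y):
    """Yates' algorithm for a 2^k factorial design.

    y: responses in standard (Yates) order, length 2^k.
    Returns (grand mean, [effects in standard order, e.g. A, B, AB, ...]).
    """
    n = len(y)
    k = n.bit_length() - 1
    assert 1 << k == n, "length must be a power of two"
    col = list(y)
    for _ in range(k):
        # One pass: first half = pairwise sums, second half = differences.
        sums = [col[i] + col[i + 1] for i in range(0, n, 2)]
        diffs = [col[i + 1] - col[i] for i in range(0, n, 2)]
        col = sums + diffs
    mean = col[0] / n
    effects = [c / (n / 2) for c in col[1:]]
    return mean, effects

# 2^2 check with responses from y = 10 + 2*A + 3*B (coded -1/+1, no AB term),
# in standard order (1), a, b, ab:
mean, effects = yates_effects([5, 9, 11, 15])
assert mean == 10.0
assert effects == [4.0, 6.0, 0.0]   # A effect 2*2, B effect 2*3, AB effect 0
```

In an ANOVA, the sum of squares for each effect then follows directly from the corresponding contrast, which is how the per-parameter significance results reported above are obtained.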
Preliminary Results from the Application of Automated Adjoint Code Generation to CFL3D
NASA Technical Reports Server (NTRS)
Carle, Alan; Fagan, Mike; Green, Lawrence L.
1998-01-01
This report describes preliminary results obtained using an automated adjoint code generator for Fortran to augment a widely-used computational fluid dynamics flow solver to compute derivatives. These preliminary results with this augmented code suggest that, even in its infancy, the automated adjoint code generator can accurately and efficiently deliver derivatives for use in transonic Euler-based aerodynamic shape optimization problems with hundreds to thousands of independent design variables.
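The idea behind an adjoint (reverse-mode) derivative sweep of the kind such a generator emits can be shown in miniature. The toy below is a sketch of the concept only, unrelated to the actual Fortran tool or to CFL3D: each operation records its local partial derivatives during the forward pass, and a reverse sweep accumulates adjoints from the output back to the inputs, giving all input sensitivities in one pass, which is what makes the approach attractive for hundreds to thousands of design variables.

```python
class Var:
    """Minimal reverse-mode autodiff node (toy sketch)."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # pairs of (parent Var, local partial)
        self.grad = 0.0

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

    def backward(self, seed=1.0):
        # Reverse sweep: push the adjoint down every recorded edge.
        # Naive recursion; adequate for small expression trees like this one.
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x = Var(4.0)
y = x * x + x * 3.0      # y = x^2 + 3x
y.backward()
assert y.value == 28.0   # 4^2 + 3*4
assert x.grad == 11.0    # dy/dx = 2x + 3 = 11 at x = 4
```

A source-to-source adjoint generator does the analogous thing at compile time, producing augmented Fortran that propagates adjoints through the flow solver's statements.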
1977-01-26
Sisteme Matematicheskogo Obespecheniya YeS EVM [Applied Programs in the Software System for the Unified System of Computers], by A. Ye. Fateyev, A. I...computerized systems are most effective in large production complexes, in which the level of utilization of computers can be as high as 500,000...performance of these tasks could be furthered by the complex introduction of electronic computers in automated control systems. The creation of ASU
Automation of the aircraft design process
NASA Technical Reports Server (NTRS)
Heldenfels, R. R.
1974-01-01
The increasing use of the computer to automate the aerospace product development and engineering process is examined with emphasis on structural analysis and design. Examples of systems of computer programs in aerospace and other industries are reviewed and related to the characteristics of aircraft design in its conceptual, preliminary, and detailed phases. Problems with current procedures are identified, and potential improvements from optimum utilization of integrated disciplinary computer programs by a man/computer team are indicated.
AN ULTRAVIOLET-VISIBLE SPECTROPHOTOMETER AUTOMATION SYSTEM. PART I: FUNCTIONAL SPECIFICATIONS
This document contains the project definition, the functional requirements, and the functional design for a proposed computer automation system for scanning spectrophotometers. The system will be implemented on a Data General computer using the BASIC language. The system is a rea...
Automated Intelligent Agents: Are They Trusted Members of Military Teams?
2008-12-01
...agent. All teams played a computer-based team firefighting game (C3Fire). The order of presentation of the two trials (human-human vs. human-automation) was...
Ranking of Air Force Heating Plants Relative to the Economic Benefit of Coal Utilization
1989-11-01
5.2.2 HTHW Output Capacity; Combustion Technologies; 5.3 Computer Model for LCC Analysis ...and field-erected units have been examined. The packaged units are factory-built, shell (fire-tube) boilers that are small enough to be shipped by... 40 MBtu/h with a thermal energy capacity factor of about 65% if used as a baseload heating plant. A water-tube boiler with a steam rating of 1200
A Cross-Cultural Validation Study of the Computer Attitude Scale.
ERIC Educational Resources Information Center
Kim, JinGyu; And Others
The reliability and factorial validity of the Computer Attitudes Scale (CAS) was assessed with college students in South Korea. The CAS was developed for use with high school students, but has been used in higher education in the United States. It is a Likert-type scale of 30 positive and negative statements about the use of computers, and is one…
DOT National Transportation Integrated Search
1976-08-01
This report contains a functional design for the simulation of a future automation concept in support of the ATC Systems Command Center. The simulation subsystem performs airport airborne arrival delay predictions and computes flow control tables for...
Computerized Manufacturing Automation. Employment, Education, and the Workplace. Summary.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. Office of Technology Assessment.
The application of programmable automation (PA) offers new opportunities to enhance and streamline manufacturing processes. Five PA technologies are examined in this report: computer-aided design, robots, numerically controlled machine tools, flexible manufacturing systems, and computer-integrated manufacturing. Each technology is in a relatively…
32 CFR 806b.35 - Balancing protection.
Code of Federal Regulations, 2014 CFR
2014-07-01
..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...
32 CFR 806b.35 - Balancing protection.
Code of Federal Regulations, 2013 CFR
2013-07-01
..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...
32 CFR 806b.35 - Balancing protection.
Code of Federal Regulations, 2012 CFR
2012-07-01
..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...
32 CFR 806b.35 - Balancing protection.
Code of Federal Regulations, 2011 CFR
2011-07-01
..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...
32 CFR 806b.35 - Balancing protection.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...
Schmidt, Taly Gilat; Wang, Adam S; Coradi, Thomas; Haas, Benjamin; Star-Lack, Josh
2016-10-01
The overall goal of this work is to develop a rapid, accurate, and automated software tool to estimate patient-specific organ doses from computed tomography (CT) scans using simulations to generate dose maps combined with automated segmentation algorithms. This work quantified the accuracy of organ dose estimates obtained by an automated segmentation algorithm. We hypothesized that the autosegmentation algorithm is sufficiently accurate to provide organ dose estimates, since small errors delineating organ boundaries will have minimal effect when computing mean organ dose. A leave-one-out validation study of the automated algorithm was performed with 20 head-neck CT scans expertly segmented into nine regions. Mean organ doses of the automatically and expertly segmented regions were computed from Monte Carlo-generated dose maps and compared. The automated segmentation algorithm estimated the mean organ dose to be within 10% of the expert segmentation for regions other than the spinal canal, with the median error for each organ region below 2%. In the spinal canal region, the median error was −7%, with a maximum absolute error of 28% for the single-atlas approach and 11% for the multiatlas approach. The results demonstrate that the automated segmentation algorithm can provide accurate organ dose estimates despite some segmentation errors.
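The study's central hypothesis, that small boundary errors barely move the mean organ dose, can be seen in a toy calculation. The data below are synthetic (not the study's Monte Carlo dose maps): over a smoothly varying dose map, shifting an organ mask by one voxel changes the mean dose by well under 2%.

```python
import numpy as np

# Synthetic 2D dose map with a mild spatial gradient (arbitrary units).
rows = np.arange(64).reshape(-1, 1)
dose = 1.0 + 0.01 * rows * np.ones((64, 64))

# "Expert" organ mask: a square region; "automated" mask: the same region
# shifted by one voxel, mimicking a small segmentation boundary error.
expert = np.zeros((64, 64), dtype=bool)
expert[20:40, 20:40] = True
auto = np.zeros((64, 64), dtype=bool)
auto[21:41, 20:40] = True

mean_expert = dose[expert].mean()   # mean dose over the expert region
mean_auto = dose[auto].mean()       # mean dose over the automated region
rel_err = abs(mean_auto - mean_expert) / mean_expert

# A one-voxel boundary shift perturbs the mean organ dose only slightly.
assert rel_err < 0.02
```

Averaging over the organ volume smooths out boundary disagreements, which is why mean dose is forgiving of segmentation error in a way that, say, maximum point dose is not; the spinal canal, being thin, is the region where this averaging helps least, consistent with the larger errors reported there.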
Automated data acquisition and processing for a Hohlraum reflectometer
NASA Technical Reports Server (NTRS)
Difilippo, Frank; Mirtich, Michael J.
1988-01-01
A computer and data acquisition board were used to automate a Perkin-Elmer Model 13 spectrophotometer with a Hohlraum reflectivity attachment. Additional electronic circuitry was necessary for amplification, filtering, and debouncing. The computer was programmed to calculate spectral emittance from 1.7 to 14.7 micrometers and also total emittance versus temperature. Automation of the Hohlraum reflectometer reduced the time required to determine total emittance versus temperature from about three hours to about 40 minutes.
Computer automation of ultrasonic testing. [inspection of ultrasonic welding
NASA Technical Reports Server (NTRS)
Yee, B. G. W.; Kerlin, E. E.; Gardner, A. H.; Dunmyer, D.; Wells, T. G.; Robinson, A. R.; Kunselman, J. S.; Walker, T. C.
1974-01-01
Report describes a prototype computer-automated ultrasonic system developed for the inspection of weldments. This system can be operated in three modes: manual, automatic, and computer-controlled. In the computer-controlled mode, the system will automatically acquire, process, analyze, store, and display ultrasonic inspection data in real-time. Flaw size (in cross-section), location (depth), and type (porosity-like or crack-like) can be automatically discerned and displayed. The results and pertinent parameters are recorded.
Does Automated Feedback Improve Writing Quality?
ERIC Educational Resources Information Center
Wilson, Joshua; Olinghouse, Natalie G.; Andrada, Gilbert N.
2014-01-01
The current study examines data from students in grades 4-8 who participated in a statewide computer-based benchmark writing assessment that featured automated essay scoring and automated feedback. We examined whether the use of automated feedback was associated with gains in writing quality across revisions to an essay, and with transfer effects…
Workshop on Office Automation and Telecommunication: Applying the Technology.
ERIC Educational Resources Information Center
Mitchell, Bill
This document contains 12 outlines that forecast the office of the future. The outlines cover the following topics: (1) office automation definition and objectives; (2) functional categories of office automation software packages for mini and mainframe computers; (3) office automation-related software for microcomputers; (4) office automation…
Computer-assisted design in perceptual-motor skills research
NASA Technical Reports Server (NTRS)
Rogers, C. A., Jr.
1974-01-01
A categorization was made of independent variables previously found to be potent in simple perceptual-motor tasks. A computer was then used to generate hypothetical factorial designs. These were evaluated in terms of literature trends and pragmatic criteria. Potential side-effects of machine-assisted research strategy were discussed.
Translations on USSR Science and Technology Physical Sciences and Technology No. 18
1977-09-19
...and Avetik Gukasyan discuss component arrangement alternatives. CYBERNETICS, COMPUTERS AND AUTOMATION TECHNOLOGY: 'PROYEKT' COMPUTER-ASSISTED DESIGN SYSTEM...throughout the world are struggling. The "Proyekt" system, produced in the Institute of Cybernetics, assists in automating the design and manufacture of
Sensory-based expert monitoring and control
NASA Astrophysics Data System (ADS)
Yen, Gary G.
1999-03-01
Field operators use their eyes, ears, and noses to detect process behavior and to trigger corrective control actions. For instance, in daily practice the experienced operator in sulfuric acid treatment of phosphate rock may observe froth color or bubble character to control process material in-flow. Similarly, he or she may use the acoustic signature of cavitation or boiling/flashing to increase or decrease material flow rates and tank levels. By contrast, process control computers continue to be limited to taking action on P, T, F, and A signals. Yet there is ample evidence from the field that visual and acoustic information can be used for control and identification. Smart in-situ sensors have provided a potential mechanism for factory automation with promising industrial applicability. In response to these critical needs, a generic, structured health-monitoring approach is proposed. The system assumes a given sensor suite will act as an on-line health and usage monitor and, at best, provide real-time control autonomy. The sensor suite can incorporate various types of sensory devices, from vibration accelerometers, directional microphones, machine-vision CCDs, and pressure gauges to temperature indicators. The decision can be shown on a visual on-board display or fed to the control block to invoke controller reconfiguration.
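The sensor-suite decision step described in this abstract can be illustrated with a minimal sketch; all channel names and alarm thresholds below are hypothetical, not taken from the paper.

```python
# Minimal sketch of a multi-channel health monitor of the kind the abstract
# describes. Channel names and limits are invented for illustration.
HEALTH_LIMITS = {
    "vibration_g": 2.5,      # accelerometer, peak g
    "acoustic_db": 95.0,     # directional microphone, dB
    "pressure_kpa": 180.0,   # pressure gauge
    "temperature_c": 110.0,  # temperature indicator
}

def assess_health(readings):
    """Return (status, violations) for one snapshot of sensor readings."""
    violations = [name for name, limit in HEALTH_LIMITS.items()
                  if readings.get(name, 0.0) > limit]
    status = "ALARM" if violations else "OK"
    return status, violations

snapshot = {"vibration_g": 1.1, "acoustic_db": 101.3,
            "pressure_kpa": 140.0, "temperature_c": 88.0}
status, violations = assess_health(snapshot)
print(status, violations)  # acoustic channel exceeds its limit
```

A real monitor would use trend analysis and sensor fusion rather than fixed per-channel limits; this sketch only shows the shape of the decision step that feeds the on-board display or the controller-reconfiguration block.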
Automated Quantification of Pneumothorax in CT
Do, Synho; Salvaggio, Kristen; Gupta, Supriya; Kalra, Mannudeep; Ali, Nabeel U.; Pien, Homer
2012-01-01
An automated, computer-aided diagnosis (CAD) algorithm for the quantification of pneumothoraces from Multidetector Computed Tomography (MDCT) images has been developed. Algorithm performance was evaluated through comparison to manual segmentation by expert radiologists. A combination of two-dimensional and three-dimensional processing techniques was incorporated to reduce required processing time by two-thirds (as compared to similar techniques). Volumetric measurements on relative pneumothorax size were obtained and the overall performance of the automated method shows an average error of just below 1%. PMID:23082091
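The reported average error of just below 1% is a relative volume error against expert manual segmentation. A hedged sketch of that comparison follows; the volumes are invented for illustration, not taken from the study.

```python
def percent_error(automated_ml, manual_ml):
    """Relative volume error of an automated segmentation versus expert
    manual segmentation, in percent. Inputs are volumes in milliliters."""
    return abs(automated_ml - manual_ml) / manual_ml * 100.0

# Hypothetical (automated, manual) pneumothorax volumes for three cases.
cases = [(412.0, 415.2), (88.5, 89.1), (230.0, 228.4)]
errors = [percent_error(a, m) for a, m in cases]
print(round(sum(errors) / len(errors), 2))  # → 0.71 (mean error below 1%)
```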
A Review of Developments in Computer-Based Systems to Image Teeth and Produce Dental Restorations
Rekow, E. Dianne; Erdman, Arthur G.; Speidel, T. Michael
1987-01-01
Computer-aided design and manufacturing (CAD/CAM) make it possible to automate the creation of dental restorations, providing dentistry with a new, alternative technique for producing them. Currently practiced techniques are described, and three automated systems under development are described and compared. With CAD/CAM it is possible to produce dental restorations automatically that meet or exceed current requirements for fit and occlusion.
Skin diseases in workers at a perfume factory.
Schubert, Hans-Jürgen
2006-08-01
The aim of this study was to find the causes of skin diseases affecting one-third of the staff of a perfume factory in which 10 different perfume sprays were being manufactured. Methods comprised site inspection, dermatological examination, and patch testing of all 26 persons at risk with 4 perfume oils and 30 of their ingredients. Six bottlers were found to be suffering from allergic contact dermatitis and 2 from irritant contact dermatitis, while 12 workers showed strong reactions of varying degree to various fragrances. The main causes of allergic contact dermatitis were 2 perfume oils (12 cases) and their ingredients geraniol (12 cases), benzaldehyde (9), cinnamic aldehyde (6), and linalool, neroli oil, and terpenes of lemon oil and orange oil (4 each). No one tested positive to balsam of Peru. Job changes to work as office workers, packers, or printers in other rooms, where they no longer had contact with fragrances, led to resolution of the symptoms. In conclusion, automation, replacement of glass bottles by cartridges of non-fragile materials, and the use of gloves may minimize the risk.
Sharkey, Noel; Sharkey, Amanda
2012-01-01
Rapid advances in service robotics together with dramatic shifts in population demographics have led to the notion that technology may be the answer to our eldercare problems. Robots are being developed for feeding, washing, lifting, carrying and mobilising the elderly as well as monitoring their health. They are also being proposed as a substitute for companionship. While these technologies could accrue major benefits for society and empower the elderly, we must balance their use with the ethical costs. These include a potential reduction in human contact, increased feeling of objectification and loss of control, loss of privacy and personal freedom as well as deception and infantilisation. With appropriate guidelines in place before the introduction of robots en masse into the care system, robots could improve the lives of the elderly, reducing their dependence and creating more opportunities for social interaction. Without forethought, the elderly may find themselves in a barren world of machines, a world of automated care: a factory for the elderly. Copyright © 2011 S. Karger AG, Basel.
Final-Approach-Spacing Subsystem For Air Traffic
NASA Technical Reports Server (NTRS)
Davis, Thomas J.; Erzberger, Heinz; Bergeron, Hugh
1992-01-01
Automation subsystem of computers, computer workstations, communication equipment, and radar helps air-traffic controllers in terminal radar approach-control (TRACON) facility manage sequence and spacing of arriving aircraft for both efficiency and safety. Called FAST (Final Approach Spacing Tool), subsystem enables controllers to choose among various levels of automation.
ERIC Educational Resources Information Center
Kibirige, Harry M.
1991-01-01
Discussion of the potential effects of fiber optic-based communication technology on information networks and systems design highlights library automation. Topics discussed include computers and telecommunications systems, the importance of information in national economies, microcomputers, local area networks (LANs), national computer networks,…
An Introduction to Archival Automation: A RAMP Study with Guidelines.
ERIC Educational Resources Information Center
Cook, Michael
Developed under a contract with the International Council on Archives, these guidelines are designed to emphasize the role of automation techniques in archives and records services, provide an indication of existing computer systems used in different archives services and of specific computer applications at various stages of archives…
Automated system for definition of life-cycle resources of electromechanical equipment
NASA Astrophysics Data System (ADS)
Zhukovskiy, Y.; Koteleva, N.
2017-02-01
The frequency of maintenance of electromechanical equipment depends on the plant that uses and runs the equipment. Very often the maintenance frequency is poorly correlated with the actual state of the equipment. Furthermore, traditional diagnostic methods sometimes cannot be applied without stopping the process (for example, for equipment located in hard-to-reach places), which increases maintenance costs. This problem can be solved using indirect methods of diagnosing electromechanical equipment. The indirect methods often use real-time parameters and seldom use the parameters of traditional diagnostic methods to determine the life-cycle resource of the equipment. This article is dedicated to developing the structure of a special automated control system. This system must handle a large flow of information about the direct and indirect parameters of equipment state, drawn from plants in different areas of industry and from the factories that produce the electromechanical equipment.
A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
Safety Metrics for Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Leveson, Nancy G; Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
Philip A. Araman; Janice K. Wiedenbeck
1995-01-01
Automated lumber grading and yield optimization using computer controlled saws will be plausible for hardwoods if and when lumber scanning systems can reliably identify all defects by type. Existing computer programs could then be used to grade the lumber, identify the best cut-up solution, and control the sawing machines. The potential value of a scanning grading...
Legaz-García, María Del Carmen; Dentler, Kathrin; Fernández-Breis, Jesualdo Tomás; Cornet, Ronald
2017-01-01
ArchMS is a framework that represents clinical information and knowledge using ontologies in OWL, which facilitates semantic interoperability and thereby the exploitation and secondary use of clinical data. However, it does not yet support the automated assessment of quality of care. CLIF is a stepwise method to formalize quality indicators. The method has been implemented in the CLIF tool which supports its users in generating computable queries based on a patient data model which can be based on archetypes. To enable the automated computation of quality indicators using ontologies and archetypes, we tested whether ArchMS and the CLIF tool can be integrated. We successfully automated the process of generating SPARQL queries from quality indicators that have been formalized with CLIF and integrated them into ArchMS. Hence, ontologies and archetypes can be combined for the execution of formalized quality indicators.
Multiphasic Health Testing in the Clinic Setting
LaDou, Joseph
1971-01-01
The economy of automated multiphasic health testing (amht) activities patterned after the high-volume Kaiser program can be realized in low-volume settings. amht units have been operated at daily volumes of 20 patients in three separate clinical environments. These programs have displayed economics entirely compatible with cost figures published by the established high-volume centers. This experience, plus the expanding capability of small, general purpose, digital computers (minicomputers) indicates that a group of six or more physicians generating 20 laboratory appraisals per day can economically justify a completely automated multiphasic health testing facility. This system would reside in the clinic or hospital where it is used and can be configured to do analyses such as electrocardiography and generate laboratory reports, and communicate with large computer systems in university medical centers. Experience indicates that the most effective means of implementing these benefits of automation is to make them directly available to the medical community with the physician playing the central role. Economic justification of a dedicated computer through low-volume health testing then allows, as a side benefit, automation of administrative as well as other diagnostic activities—for example, patient billing, computer-aided diagnosis, and computer-aided therapeutics. PMID:4935771
Babbin, Steven F.; Yin, Hui-Qing; Rossi, Joseph S.; Redding, Colleen A.; Paiva, Andrea L.; Velicer, Wayne F.
2015-01-01
The Self-Efficacy Scale for Sun Protection consists of two correlated factors with three items each for Sunscreen Use and Avoidance. This study evaluated two crucial psychometric assumptions, factorial invariance and scale reliability, with a sample of adults (N = 1356) participating in a computer-tailored, population-based intervention study. A measure has factorial invariance when the model is the same across subgroups. Three levels of invariance were tested, from least to most restrictive: (1) Configural Invariance (nonzero factor loadings unconstrained); (2) Pattern Identity Invariance (equal factor loadings); and (3) Strong Factorial Invariance (equal factor loadings and measurement errors). Strong Factorial Invariance was a good fit for the model across seven grouping variables: age, education, ethnicity, gender, race, skin tone, and Stage of Change for Sun Protection. Internal consistency coefficient Alpha and factor rho scale reliability, respectively, were .84 and .86 for Sunscreen Use, .68 and .70 for Avoidance, and .78 and .78 for the global (total) scale. The psychometric evidence demonstrates strong empirical support that the scale is consistent, has internal validity, and can be used to assess population-based adult samples. PMID:26457203
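Coefficient alpha, one of the two reliability statistics reported above, can be computed directly from an item-by-respondent score matrix. The following sketch uses an invented three-item subscale scored by five respondents; the data are illustrative, not from the study.

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Coefficient alpha for a scale. `items` is a list of per-item score
    lists, all of equal length (one entry per respondent); population
    variances are used throughout."""
    k = len(items)
    respondents = list(zip(*items))           # rows: one respondent's scores
    totals = [sum(row) for row in respondents]
    item_var_sum = sum(pvariance(item) for item in items)
    return (k / (k - 1)) * (1 - item_var_sum / pvariance(totals))

# Hypothetical three-item subscale (e.g., sunscreen use), five respondents.
items = [
    [3, 4, 2, 5, 4],
    [3, 5, 2, 4, 4],
    [2, 4, 3, 5, 3],
]
print(round(cronbach_alpha(items), 2))  # → 0.87
```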
RECAL: A Computer Program for Selecting Sample Days for Recreation Use Estimation
D.L. Erickson; C.J. Liu; H. Ken Cordell; W.L. Chen
1980-01-01
Recreation Calendar (RECAL) is a computer program in PL/I for drawing a sample of days for estimating recreation use. With RECAL, a sampling period of any length may be chosen; simple random, stratified random, and factorial designs can be accommodated. The program randomly allocates days to strata and locations.
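The core of such a program, random allocation of days within strata, is straightforward to sketch. RECAL itself was written in PL/I and supported richer designs; this illustration uses Python, and the weekday/weekend strata, dates, and sample sizes are hypothetical.

```python
import random
from datetime import date, timedelta

def stratified_sample_days(start, end, n_per_stratum, seed=None):
    """Draw a stratified random sample of days from [start, end].
    Weekdays and weekend days form the two strata in this sketch."""
    rng = random.Random(seed)
    days = [start + timedelta(d) for d in range((end - start).days + 1)]
    strata = {"weekday": [d for d in days if d.weekday() < 5],
              "weekend": [d for d in days if d.weekday() >= 5]}
    return {name: sorted(rng.sample(pool, n_per_stratum))
            for name, pool in strata.items()}

sample = stratified_sample_days(date(1980, 6, 1), date(1980, 8, 31), 4, seed=7)
for stratum, chosen in sample.items():
    print(stratum, [d.isoformat() for d in chosen])
```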
An Examination of Sampling Characteristics of Some Analytic Factor Transformation Techniques.
ERIC Educational Resources Information Center
Skakun, Ernest N.; Hakstian, A. Ralph
Two population raw data matrices were constructed by computer simulation techniques. Each consisted of 10,000 subjects and 12 variables, and each was constructed according to an underlying factorial model consisting of four major common factors, eight minor common factors, and 12 unique factors. The computer simulation techniques were employed to…
Using Software Tools to Automate the Assessment of Student Programs.
ERIC Educational Resources Information Center
Jackson, David
1991-01-01
Argues that advent of computer-aided instruction (CAI) systems for teaching introductory computer programing makes it imperative that software be developed to automate assessment and grading of student programs. Examples of typical student programing problems are given, and application of the Unix tools Lex and Yacc to the automatic assessment of…
In-House Automation of a Small Library Using a Mainframe Computer.
ERIC Educational Resources Information Center
Waranius, Frances B.; Tellier, Stephen H.
1986-01-01
An automated library routine management system was developed in-house to create a system unique to the Library and Information Center of the Lunar and Planetary Institute, Houston, Texas. A modular approach was used to allow continuity in operations and services as the system was implemented. Acronyms, computer accounts, and file names are appended.
Automated Management Of Documents
NASA Technical Reports Server (NTRS)
Boy, Guy
1995-01-01
Report presents main technical issues involved in computer-integrated documentation. Problems associated with automation of management and maintenance of documents analyzed from perspectives of artificial intelligence and human factors. Technologies that may prove useful in computer-integrated documentation reviewed: these include conventional approaches to indexing and retrieval of information, use of hypertext, and knowledge-based artificial-intelligence systems.
ERIC Educational Resources Information Center
Nakamura, Christopher M.; Murphy, Sytil K.; Christel, Michael G.; Stevens, Scott M.; Zollman, Dean A.
2016-01-01
Computer-automated assessment of students' text responses to short-answer questions represents an important enabling technology for online learning environments. We have investigated the use of machine learning to train computer models capable of automatically classifying short-answer responses and assessed the results. Our investigations are part…
ERIC Educational Resources Information Center
Epstein, A. H.; And Others
The first phase of an ongoing library automation project at Stanford University is described. Project BALLOTS (Bibliographic Automation of Large Library Operations Using a Time-Sharing System) seeks to automate the acquisition and cataloging functions of a large library using an on-line time-sharing computer. The main objectives are to control…
The automation of an inlet mass flow control system
NASA Technical Reports Server (NTRS)
Supplee, Frank; Tcheng, Ping; Weisenborn, Michael
1989-01-01
The automation of a closed-loop computer-controlled system for the inlet mass flow system (IMFS) developed for a wind tunnel facility at Langley Research Center is presented. This new PC-based control system is intended to replace the manual control system presently in use, in order to fully automate the plug positioning of the IMFS during wind tunnel testing. Provision is also made for communication between the PC and a host computer in order to allow total automation of the plug positioning and data acquisition during the complete sequence of predetermined plug locations. As extensive running time is programmed for the IMFS, this new automated system will save both manpower and tunnel running time.
NMR-based automated protein structure determination.
Würz, Julia M; Kazemi, Sina; Schmidt, Elena; Bagaria, Anurag; Güntert, Peter
2017-08-15
NMR spectra analysis for protein structure determination can now in many cases be performed by automated computational methods. This overview of the computational methods for NMR protein structure analysis presents recent automated methods for signal identification in multidimensional NMR spectra, sequence-specific resonance assignment, collection of conformational restraints, and structure calculation, as implemented in the CYANA software package. These algorithms are sufficiently reliable and integrated into one software package to enable the fully automated structure determination of proteins starting from NMR spectra without manual interventions or corrections at intermediate steps, with an accuracy of 1-2 Å backbone RMSD in comparison with manually solved reference structures. Copyright © 2017 Elsevier Inc. All rights reserved.
1981-02-01
Battlefield automated systems; human-computer interaction; design criteria; system...Report (this report): In-Depth Analyses of Individual Systems. A. Tactical Fire Direction System (TACFIRE) (RP 81-26); B. Tactical Computer Terminal...select the design features and operating procedures of the human-computer interface which best match the requirements and capabilities of anticipated
A Computational Workflow for the Automated Generation of Models of Genetic Designs.
Misirli, Göksel; Nguyen, Tramy; McLaughlin, James Alastair; Vaidyanathan, Prashant; Jones, Timothy S; Densmore, Douglas; Myers, Chris; Wipat, Anil
2018-06-05
Computational models are essential to engineer predictable biological systems and to scale up this process for complex systems. Computational modeling often requires expert knowledge and data to build models. Clearly, manual creation of models is not scalable for large designs. Despite several automated model construction approaches, computational methodologies to bridge knowledge in design repositories and the process of creating computational models have still not been established. This paper describes a workflow for automatic generation of computational models of genetic circuits from data stored in design repositories using existing standards. This workflow leverages the software tool SBOLDesigner to build structural models that are then enriched by the Virtual Parts Repository API using Systems Biology Open Language (SBOL) data fetched from the SynBioHub design repository. The iBioSim software tool is then utilized to convert this SBOL description into a computational model encoded using the Systems Biology Markup Language (SBML). Finally, this SBML model can be simulated using a variety of methods. This workflow provides synthetic biologists with easy to use tools to create predictable biological systems, hiding away the complexity of building computational models. This approach can further be incorporated into other computational workflows for design automation.
Asleep at the automated wheel-Sleepiness and fatigue during highly automated driving.
Vogelpohl, Tobias; Kühn, Matthias; Hummel, Thomas; Vollrath, Mark
2018-03-20
Due to the lack of active involvement in the driving situation and due to monotonous driving environments, drivers with automation may be prone to become fatigued faster than manual drivers (e.g. Schömig et al., 2015). However, little is known about the progression of fatigue during automated driving and its effects on the ability to take back manual control after a take-over request. In this driving simulator study with N = 60 drivers we used a three-factorial 2 × 2 × 12 mixed design to analyze the progression (12 × 5 min; within subjects) of driver fatigue in drivers with automation compared to manual drivers (between subjects). Driver fatigue was induced as either mainly sleep-related or mainly task-related fatigue (between subjects). Additionally, we investigated the drivers' reactions to a take-over request in a critical driving scenario to gain insights into the ability of fatigued drivers to regain manual control and situation awareness after automated driving. Drivers in the automated driving condition exhibited facial indicators of fatigue after 15 to 35 min of driving. Manual drivers only showed similar indicators of fatigue if they suffered from a lack of sleep, and then only after a longer period of driving (approx. 40 min). Several drivers in the automated condition closed their eyes for extended periods of time. In the driving-with-automation condition, mean automation deactivation times after a take-over request were slower for a certain percentage (about 30%) of the drivers with a lack of sleep (M = 3.2; SD = 2.1 s) compared to the reaction times after a long drive (M = 2.4; SD = 0.9 s). Drivers with automation also took longer than manual drivers to first glance at the speed display after a take-over request and were more likely to stay behind a braking lead vehicle instead of overtaking it. Drivers are unable to stay alert during extended periods of automated driving without non-driving related tasks.
Fatigued drivers could pose a serious hazard in complex take-over situations where situation awareness is required to prepare for threats. Driver fatigue monitoring or controllable distraction through non-driving tasks could be necessary to ensure alertness and availability during highly automated driving. Copyright © 2018 Elsevier Ltd. All rights reserved.
Computer-Assisted Monitoring Of A Complex System
NASA Technical Reports Server (NTRS)
Beil, Bob J.; Mickelson, Eric M.; Sterritt, John M.; Costantino, Rob W.; Houvener, Bob C.; Super, Mike A.
1995-01-01
Propulsion System Advisor (PSA) computer-based system assists engineers and technicians in analyzing masses of sensory data indicative of operating conditions of space shuttle propulsion system during pre-launch and launch activities. Designed solely for monitoring; does not perform any control functions. Although PSA developed for highly specialized application, serves as prototype of noncontrolling, computer-based subsystems for monitoring other complex systems like electric-power-distribution networks and factories.
Intelligent robot trends and predictions for the .net future
NASA Astrophysics Data System (ADS)
Hall, Ernest L.
2001-10-01
An intelligent robot is a remarkably useful combination of a manipulator, sensors and controls. The use of these machines in factory automation can improve productivity, increase product quality and improve competitiveness. This paper presents a discussion of recent and future technical and economic trends. During the past twenty years the use of industrial robots that are equipped not only with precise motion control systems but also with sensors such as cameras, laser scanners, or tactile sensors that permit adaptation to a changing environment has increased dramatically. Intelligent robot products have been developed in many cases for factory automation and for some hospital and home applications. To reach an even higher degree of applications, the addition of learning may be required. Recently, learning theories such as the adaptive critic have been proposed. In this type of learning, a critic provides a grade to the controller of an action module such as a robot. The adaptive critic is a good model for human learning. In general, the critic may be considered to be the human with the teach pendant, plant manager, line supervisor, quality inspector or the consumer. If the ultimate critic is the consumer, then the quality inspector must model the consumer's decision-making process and use this model in the design and manufacturing operations. Can the adaptive critic be used to advance intelligent robots? Intelligent robots have historically taken decades to be developed and reduced to practice. Methods for speeding this development include technology such as rapid prototyping and product development and government, industry and university cooperation.
A research factory for polymer microdevices: muFac
NASA Astrophysics Data System (ADS)
Anthony, Brian W.; Hardt, David E.; Hale, Melinda; Zarrouati, Nadege
2010-02-01
As part of our research on the manufacturing science of micron scale polymer-based devices, an automated production cell has been developed to explore its use in a volume manufacturing environment. This "micro-factory" allows the testing of models and hardware that have resulted from research on material characterization and simulation, tooling and equipment design and control, and process control and metrology. More importantly it has allowed us to identify the problems that exist between and within unit-processes. This paper details our efforts to produce basic micro-fluidic products in high volume at acceptable production rates and quality levels. The device chosen for our first product is a simple binary micromixer with 40×50 micron channel cross section manufactured by embossing of PMMA. The processes in the cell include laser cutting and drilling, hot embossing, thermal bonding and high-speed inspection of the components. Our goal is to create a "lights-out" factory that can make long production runs (e.g. an 8 hour shift) at high rates (Takt time of less than 3 minutes) with consistent quality. This contrasts with device foundries where prototypes in limited quantities but with high variety are the goal. Accordingly, rate and yield are dominant factors in this work, along with the need for precise material handling strategies. Production data will be presented to include process run charts, sampled functional testing of the products and measures of the overall system throughput.
NASA Astrophysics Data System (ADS)
Fiorini, Rodolfo A.; Dacquino, Gianfranco
2005-03-01
GEOGINE (GEOmetrical enGINE), a state-of-the-art OMG (Ontological Model Generator) based on n-D Tensor Invariants for n-dimensional shape/texture optimal synthetic representation, description and learning, was presented at previous conferences. Improved computational algorithms based on the computational invariant theory of finite groups in Euclidean space and a demo application are presented here. Progressive automatic model generation is discussed. GEOGINE can be used as an efficient computational kernel for fast, reliable application development and delivery, mainly in the advanced biomedical engineering, biometric, intelligent computing, target recognition, content image retrieval, and data mining technological areas. An ontology can be regarded as a logical theory accounting for the intended meaning of a formal dictionary, i.e., its ontological commitment to a particular conceptualization of the world object. According to this approach, "n-D Tensor Calculus" can be considered a "Formal Language" to reliably compute optimized "n-Dimensional Tensor Invariants" as specific object "invariant parameter and attribute words" for automated n-dimensional shape/texture optimal synthetic object description by incremental model generation. The class of those "invariant parameter and attribute words" can be thought of as a specific "Formal Vocabulary" learned from a "Generalized Formal Dictionary" of the "Computational Tensor Invariants" language. Even object chromatic attributes can be effectively and reliably computed from object geometric parameters into robust colour shape-invariant characteristics. As a matter of fact, any highly sophisticated application needing effective, robust object geometric/colour invariant attribute capture and parameterization features, for reliable automated object learning and discrimination, can deeply benefit from the GEOGINE progressive automated model generation computational kernel.
Main operational advantages over previous, similar approaches are: 1) Progressive Automated Invariant Model Generation, 2) Invariant Minimal Complete Description Set for computational efficiency, 3) Arbitrary Model Precision for robust object description and identification.
Automation of block assignment planning using a diagram-based scenario modeling method
NASA Astrophysics Data System (ADS)
Hwang, In Hyuck; Kim, Youngmin; Lee, Dong Kun; Shin, Jong Gye
2014-03-01
Most shipbuilding scheduling research so far has focused on the load level on the dock plan. This is because the dock is the least extendable resource in shipyards, and its overloading is difficult to resolve. However, once dock scheduling is completed, making a plan that makes the best use of the rest of the resources in the shipyard to minimize any additional cost is also important. Block assignment planning is one of the midterm planning tasks; it assigns a block to the facility (factory/shop or surface plate) that will actually manufacture the block according to the block characteristics and current situation of the facility. It is one of the most heavily loaded midterm planning tasks and is carried out manually by experienced workers. In this study, a method of representing the block assignment rules using a diagram was suggested through analysis of the existing manual process. A block allocation program was developed which automated the block assignment process according to the rules represented by the diagram. The planning scenario was validated through a case study that compared the manual assignment and two automated block assignment results.
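The paper's diagram notation is not reproduced in the abstract; a minimal sketch of the underlying pattern (rules map block attributes to candidate facilities, and the least-loaded candidate with spare capacity wins) might look like this, with all facility names and capacities invented for illustration:

```python
# Hypothetical facilities: allowed block types, capacity, and current load.
facilities = {"shop_A": {"types": {"curved"}, "capacity": 2, "load": 0},
              "shop_B": {"types": {"flat", "curved"}, "capacity": 3, "load": 0},
              "plate_1": {"types": {"flat"}, "capacity": 2, "load": 0}}

def assign(block_type):
    """Assign one block to the least-loaded facility that can build it."""
    candidates = [(f["load"], name) for name, f in facilities.items()
                  if block_type in f["types"] and f["load"] < f["capacity"]]
    if not candidates:
        return None                      # overload: needs manual replanning
    _, name = min(candidates)            # least loaded (ties broken by name)
    facilities[name]["load"] += 1
    return name

plan = [assign(t) for t in ["flat", "curved", "flat", "curved", "flat"]]
print(plan)
```

A real block assignment planner would layer many more rules (block weight, stage dates, transport constraints) on the same assign-and-update loop.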
DOT National Transportation Integrated Search
1974-08-01
Volume 4 describes the automation requirements. A presentation of automation requirements is made for an advanced air traffic management system in terms of controller work force, computer resources, controller productivity, system manning, failure ef...
Large-Scale Document Automation: The Systems Integration Issue.
ERIC Educational Resources Information Center
Kalthoff, Robert J.
1985-01-01
Reviews current technologies for electronic imaging and its recording and transmission, including digital recording, optical data disks, automated image-delivery micrographics, high-density-magnetic recording, and new developments in telecommunications and computers. The role of the document automation systems integrator, who will bring these…
Computer-controlled attenuator.
Mitov, D; Grozev, Z
1991-01-01
Various possibilities for applying electronic computer-controlled attenuators to the automation of physiological experiments are considered. A detailed description is given of the design of a 4-channel computer-controlled attenuator in which the output signal can change by a linear step in two of the channels and by a logarithmic step in the other two. This, together with additional programmable timers, makes it possible to automate a wide range of studies in different spheres of physiology and psychophysics, including vision and hearing.
Analysis of Delays in Transmitting Time Code Using an Automated Computer Time Distribution System
1999-12-01
jlevine@clock.bldrdoc.gov Abstract An automated computer time distribution system broadcasts standard time to users using computers and modems via... contributed to delays - software platform (50% of the delay), transmission speed of time-codes (25%), telephone network (15%), modem and others (10%). The... modems, and telephone lines. Users dial the ACTS server to receive time traceable to the national time scale of Singapore, UTC(PSB). The users can in
ERIC Educational Resources Information Center
Divilbiss, J. L., Ed.
To help the librarian in negotiating with vendors of automated library services, nine authors have presented methods of dealing with a specific service or situation. Paper topics include computer services, network contracts, innovative service, data processing, automated circulation, a turn-key system, data base sharing, online data base services,…
Automated Detection of Heuristics and Biases among Pathologists in a Computer-Based System
ERIC Educational Resources Information Center
Crowley, Rebecca S.; Legowski, Elizabeth; Medvedeva, Olga; Reitmeyer, Kayse; Tseytlin, Eugene; Castine, Melissa; Jukic, Drazen; Mello-Thoms, Claudia
2013-01-01
The purpose of this study is threefold: (1) to develop an automated, computer-based method to detect heuristics and biases as pathologists examine virtual slide cases, (2) to measure the frequency and distribution of heuristics and errors across three levels of training, and (3) to examine relationships of heuristics to biases, and biases to…
Students' Perceived Usefulness of Formative Feedback for a Computer-Adaptive Test
ERIC Educational Resources Information Center
Lilley, Mariana; Barker, Trevor
2007-01-01
In this paper we report on research related to the provision of automated feedback based on a computer adaptive test (CAT), used in formative assessment. A cohort of 76 second year university undergraduates took part in a formative assessment with a CAT and were provided with automated feedback on their performance. A sample of students responded…
Identifying and locating surface defects in wood: Part of an automated lumber processing system
Richard W. Conners; Charles W. McMillin; Kingyao Lin; Ramon E. Vasquez-Espinosa
1983-01-01
Continued increases in the cost of materials and labor make it imperative for furniture manufacturers to control costs by improved yield and increased productivity. This paper describes an Automated Lumber Processing System (ALPS) that employs computer tomography, optical scanning technology, the calculation of an optimum cutting strategy, and 1 computer-driven laser...
Automated Estimation Of Software-Development Costs
NASA Technical Reports Server (NTRS)
Roush, George B.; Reini, William
1993-01-01
COSTMODL is an automated software-development cost-estimation tool. It yields a significant reduction in the risk of cost overruns and failed projects. It accepts a description of the software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of a defined set of development life-cycle phases. Written for IBM PC(R)-compatible computers.
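COSTMODL's internal model is not given in the abstract; estimation tools of this era commonly built on Boehm's COCOMO, so as a hedged sketch, the basic COCOMO form for an "organic" project class turns a size description into effort and schedule estimates:

```python
# Hedged sketch: basic COCOMO ("organic" class constants), not COSTMODL
# itself. Size in thousands of delivered source lines (KLOC).
def cocomo_organic(kloc):
    effort_pm = 2.4 * kloc ** 1.05         # effort, person-months
    schedule_mo = 2.5 * effort_pm ** 0.38  # calendar schedule, months
    return effort_pm, schedule_mo

effort, months = cocomo_organic(32)        # a hypothetical 32 KLOC product
print(f"{effort:.1f} person-months over {months:.1f} months")
```

Average staffing then falls out as effort divided by schedule, which is how such tools derive staffing profiles per life-cycle phase.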
ERIC Educational Resources Information Center
Klein, David C.
2014-01-01
As advancements in automation continue to alter the systemic behavior of computer systems in a wide variety of industrial applications, human-machine interactions are increasingly becoming supervisory in nature, with less hands-on human involvement. This maturing of the human role within the human-computer relationship is relegating operations…
ERIC Educational Resources Information Center
Federal Information Processing Standards Publication, 1976
1976-01-01
These guidelines provide a basis for determining the content and extent of documentation for computer programs and automated data systems. Content descriptions of ten document types plus examples of how management can determine when to use the various types are included. The documents described are (1) functional requirements documents, (2) data…
Science and Technology Review, January-February 1997
DOE Office of Scientific and Technical Information (OSTI.GOV)
Table of contents: accelerators at Livermore; the B-Factory and the Big Bang; assessing exposure to radiation; next generation of computer storage; and a powerful new tool to detect clandestine nuclear tests.
Girolami, Antonio; Napolitano, Fabio; Faraone, Daniela; Di Bello, Gerardo; Braghieri, Ada
2014-01-01
The object of the investigation was the appearance of Lucanian dry sausage, understood as its color and visible fat ratio. The study was carried out on dry sausages produced in 10 different salami factories and seasoned for 18 days on average. We studied the effect of the origin of the raw material (5 producers used meat bought from the market and the other 5 used meat from pigs bred on their own farms) and of the salami factory, or brand, on meat color, fat color, and visible fat ratio in the dry sausages. The sausage slices were photographed and the images were analysed with a computer vision system to measure changes in the colorimetric characteristics L*, a*, b*, hue, and chroma and in the visible fat area ratio. The latter parameter was assessed on the slice surface using image binarization. A consumer test was conducted to determine the relationship between the perception of visible fat on the slice surface and the acceptability and preference of the product. The consumers were asked to look carefully at six sausage slices in a photo, paying attention to the presence of fat, and to identify (a) the slices they considered unacceptable for consumption and (b) the slice they preferred. The results show that the color of the lean part of the sausage varies with the raw material employed and with the producer or brand (P<0.001). Moreover, the sausage meat color is not uniform in some salami factories (P<0.05-0.001). In all salami factories the sausages show high uniformity in fat color. The visible fat ratio of the sausage slices is higher (P<0.001) in the product from salami factories without a pig-breeding farm. The fat percentage is highly variable (P<0.001) among the sausages of each salami factory. On the whole, the product that consumers consider acceptable and are inclined to eat has a low fat percentage (P<0.001). About 70% of our consumers prefer the leaner slices (P<0.001). Women, in particular, show a higher preference for the leanest slices (P<0.001).
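The study's own imaging pipeline is not public here, but the binarization step it describes is simple to sketch: fat pixels are much lighter than lean meat, so thresholding the lightness channel and counting foreground pixels yields the visible fat area ratio (threshold and pixel values below are illustrative assumptions):

```python
import numpy as np

def visible_fat_ratio(lightness, threshold=70.0):
    """Fraction of pixels classified as fat by thresholding L* lightness.

    lightness: 2-D array of L* values inside the slice outline.
    """
    fat = lightness > threshold
    return fat.sum() / lightness.size

# Synthetic slice: darker lean pixels plus a light "fat" band on the left.
rng = np.random.default_rng(0)
slice_img = rng.normal(45, 5, size=(60, 60))          # lean meat, L* ~ 45
slice_img[:, :15] = rng.normal(80, 3, size=(60, 15))  # fat band, L* ~ 80
print(round(visible_fat_ratio(slice_img), 2))
```

Real slices need the outline segmented first so that background pixels do not inflate the denominator.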
Effects of Computer Animation Instructional Package on Students' Achievement in Practical Biology
ERIC Educational Resources Information Center
Hamzat, Abdulrasaq; Bello, Ganiyu; Abimbola, Isaac Olakanmi
2017-01-01
This study examined the effects of computer animation instructional package on secondary school students' achievement in practical biology in Ilorin, Nigeria. The study adopted a pre-test, post-test, control group, non-randomised and nonequivalent quasi-experimental design, with a 2x2x3 factorial design. Two intact classes from two secondary…
Automating NEURON Simulation Deployment in Cloud Resources.
Stockton, David B; Santamaria, Fidel
2017-01-01
Simulations in neuroscience are performed on local servers or High Performance Computing (HPC) facilities. Recently, cloud computing has emerged as a potential computational platform for neuroscience simulation. In this paper we compare and contrast HPC and cloud resources for scientific computation, then report how we deployed NEURON, a widely used simulator of neuronal activity, in three clouds: Chameleon Cloud, a hybrid private academic cloud for cloud technology research based on the OpenStack software; Rackspace, a public commercial cloud, also based on OpenStack; and Amazon Elastic Cloud Computing, based on Amazon's proprietary software. We describe the manual procedures and how to automate cloud operations. We describe extending our simulation automation software called NeuroManager (Stockton and Santamaria, Frontiers in Neuroinformatics, 2015), so that the user is capable of recruiting private cloud, public cloud, HPC, and local servers simultaneously with a simple common interface. We conclude by performing several studies in which we examine speedup, efficiency, total session time, and cost for sets of simulations of a published NEURON model.
Computational Analysis of Behavior.
Egnor, S E Roian; Branson, Kristin
2016-07-08
In this review, we discuss the emerging field of computational behavioral analysis-the use of modern methods from computer science and engineering to quantitatively measure animal behavior. We discuss aspects of experiment design important to both obtaining biologically relevant behavioral data and enabling the use of machine vision and learning techniques for automation. These two goals are often in conflict. Restraining or restricting the environment of the animal can simplify automatic behavior quantification, but it can also degrade the quality or alter important aspects of behavior. To enable biologists to design experiments to obtain better behavioral measurements, and computer scientists to pinpoint fruitful directions for algorithm improvement, we review known effects of artificial manipulation of the animal on behavior. We also review machine vision and learning techniques for tracking, feature extraction, automated behavior classification, and automated behavior discovery, the assumptions they make, and the types of data they work best with.
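The tracking step this review surveys can be sketched minimally: subtract a static background, threshold, and take the foreground centroid as the animal's position per frame (all values here are synthetic; real pipelines add filtering, identity maintenance, and pose features on top):

```python
import numpy as np

def track_centroids(frames, background, thresh=30):
    """Per-frame foreground centroid via background subtraction."""
    positions = []
    for frame in frames:
        fg = np.abs(frame.astype(int) - background.astype(int)) > thresh
        ys, xs = np.nonzero(fg)
        positions.append((xs.mean(), ys.mean()) if len(xs) else None)
    return positions

bg = np.zeros((50, 50), dtype=np.uint8)
f1 = bg.copy(); f1[10:14, 20:24] = 255   # "animal" blob, frame 1
f2 = bg.copy(); f2[30:34, 5:9] = 255     # blob has moved, frame 2
print(track_centroids([f1, f2], bg))
```

This also illustrates the review's point about restraint: the simpler and more static the arena, the better this naive subtraction works, at the cost of naturalistic behavior.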
The Computer Aided Aircraft-design Package (CAAP)
NASA Technical Reports Server (NTRS)
Yalif, Guy U.
1994-01-01
The preliminary design of an aircraft is a complex, labor-intensive, and creative process. Since the 1970's, many computer programs have been written to help automate preliminary airplane design. Time and resource analyses have identified 'a substantial decrease in project duration with the introduction of an automated design capability'. Proof-of-concept studies have been completed which establish 'a foundation for a computer-based airframe design capability'. Unfortunately, today's design codes exist in many different languages on many, often expensive, hardware platforms. Through the use of a module-based system architecture, the Computer Aided Aircraft-design Package (CAAP) will eventually bring together many of the most useful features of existing programs. Through the use of an expert system, it will add a feature that could be described as indispensable to entry-level engineers and students: the incorporation of 'expert' knowledge into the automated design process.
Automated Training Evaluation (ATE). Final Report.
ERIC Educational Resources Information Center
Charles, John P.; Johnson, Robert M.
The automation of weapons system training presents the potential for significant savings in training costs in terms of manpower, time, and money. The demonstration of the technical feasibility of automated training through the application of advanced digital computer techniques and advanced training techniques is essential before the application…
Progress in Fully Automated Abdominal CT Interpretation
Summers, Ronald M.
2016-01-01
OBJECTIVE Automated analysis of abdominal CT has advanced markedly over just the last few years. Fully automated assessment of organs, lymph nodes, adipose tissue, muscle, bowel, spine, and tumors are some examples where tremendous progress has been made. Computer-aided detection of lesions has also improved dramatically. CONCLUSION This article reviews the progress and provides insights into what is in store in the near future for automated analysis for abdominal CT, ultimately leading to fully automated interpretation. PMID:27101207
Anterior Tibial Translation in Collegiate Athletes with Normal Anterior Cruciate Ligament Integrity
Rosene, John M.; Fogarty, Tracey D.
1999-01-01
Objective: To examine differences in anterior tibial translation (ATT) among sports, sex, and leg dominance in collegiate athletes with normal anterior cruciate ligament integrity. Design and Setting: Subjects from various athletic teams were measured for ATT in right and left knees. Subjects: Sixty subjects were measured for ATT with a KT-1000 knee arthrometer. Measurements: Statistical analyses were computed for each sex and included a 2 × 3 × 4 mixed-factorial analysis of variance (ANOVA) for anterior cruciate ligament displacement, right and left sides, and force and sport. A 2 × 2 × 3 mixed-factorial ANOVA was computed to compare means for sex and force. A 2 × 3 mixed-factorial ANOVA was computed to compare sex differences across 3 forces. Results: For males and females, no significant interactions were found among leg, force, and sport for mean ATT, for leg and sport or leg and force, or for translation values between dominant and nondominant legs. Males had a significant interaction for force and sport, and a significant difference was found for side of body, since the right side had less translation than the left side. Females had greater ATT than males at all forces. Conclusions: Sex differences exist for ATT, and differences in ATT exist among sports for both sexes. Differences between the right and left sides of the body should be expected when making comparisons of ligamentous laxity. PMID:16558565
Proof-of-concept automation of propellant processing
NASA Technical Reports Server (NTRS)
Ramohalli, Kumar; Schallhorn, P. A.
1989-01-01
For space-based propellant production, automation of the process is needed. Currently, all phases of terrestrial production have some form of human interaction. A mixer was acquired to help perform the tasks of automation. A heating system to be used with the mixer was designed, built, and installed. Tests performed on the heating system verify the design criteria. An IBM PS/2 personal computer was acquired for the future automation work. It is hoped that some of the mixing process itself will be automated. This is a concept demonstration task, proving that propellant production can be automated reliably.
Nursing operations automation and health care technology innovations: 2025 and beyond.
Suby, ChrysMarie
2013-01-01
This article reviews why nursing operations automation is important, reviews the impact of computer technology on nursing from a historical perspective, and considers the future of nursing operations automation and health care technology innovations in 2025 and beyond. The increasing automation in health care organizations will benefit patient care, staffing and scheduling systems and central staffing offices, census control, and measurement of patient acuity.
Automation of Educational Tasks for Academic Radiology.
Lamar, David L; Richardson, Michael L; Carlson, Blake
2016-07-01
The process of education involves a variety of repetitious tasks. We believe that appropriate computer tools can automate many of these chores, and allow both educators and their students to devote a lot more of their time to actual teaching and learning. This paper details tools that we have used to automate a broad range of academic radiology-specific tasks on Mac OS X, iOS, and Windows platforms. Some of the tools we describe here require little expertise or time to use; others require some basic knowledge of computer programming. We used TextExpander (Mac, iOS) and AutoHotKey (Win) for automated generation of text files, such as resident performance reviews and radiology interpretations. Custom statistical calculations were performed using TextExpander and the Python programming language. A workflow for automated note-taking was developed using Evernote (Mac, iOS, Win) and Hazel (Mac). Automated resident procedure logging was accomplished using Editorial (iOS) and Python. We created three variants of a teaching session logger using Drafts (iOS) and Pythonista (iOS). Editorial and Drafts were used to create flashcards for knowledge review. We developed a mobile reference management system for iOS using Editorial. We used the Workflow app (iOS) to automatically generate a text message reminder for daily conferences. Finally, we developed two separate automated workflows, one with Evernote (Mac, iOS, Win) and one with Python (Mac, Win), that generate simple automated teaching file collections. We have beta-tested these workflows, techniques, and scripts on several of our fellow radiologists. All of them expressed enthusiasm for these tools and were able to use one or more of them to automate their own educational activities. Appropriate computer tools can automate many educational tasks, and thereby allow both educators and their students to devote a lot more of their time to actual teaching and learning.
Automation of a DXA-based finite element tool for clinical assessment of hip fracture risk.
Luo, Yunhua; Ahmed, Sharif; Leslie, William D
2018-03-01
Finite element analysis of medical images is a promising tool for assessing hip fracture risk. Although a number of finite element models have been developed for this purpose, none of them have been routinely used in the clinic. The main reason is that the computer programs that implement the finite element models have not been completely automated, and heavy training is required before clinicians can effectively use them. By using information embedded in clinical dual energy X-ray absorptiometry (DXA), we completely automated a DXA-based finite element (FE) model that we previously developed for predicting hip fracture risk. The automated FE tool can be run as a standalone computer program with the subject's raw hip DXA image as input. The automated FE tool had greatly improved short-term precision compared with the semi-automated version. To validate the automated FE tool, a clinical cohort consisting of 100 prior hip fracture cases and 300 matched controls was obtained from a local community clinical center. Both the automated FE tool and femoral bone mineral density (BMD) were applied to discriminate the fracture cases from the controls. Femoral BMD is the gold standard reference recommended by the World Health Organization for screening for osteoporosis and for assessing hip fracture risk. The accuracy was measured by the area under the ROC curve (AUC) and the odds ratio (OR). Compared with femoral BMD (AUC = 0.71, OR = 2.07), the automated FE tool had considerably improved accuracy (AUC = 0.78, OR = 2.61 at the trochanter). This work is a large step toward applying our DXA-based FE model as a routine clinical tool for the assessment of hip fracture risk. Furthermore, the automated computer program can be embedded into a web site as an internet application.
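The AUC metric used in this validation has a simple computational definition: it equals the probability that a randomly chosen case scores above a randomly chosen control (the Mann-Whitney U formulation). A minimal sketch, with hypothetical risk scores:

```python
def auc(case_scores, control_scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    fraction of case/control pairs the case wins (ties count half)."""
    wins = 0.0
    for c in case_scores:
        for k in control_scores:
            wins += 1.0 if c > k else (0.5 if c == k else 0.0)
    return wins / (len(case_scores) * len(control_scores))

cases = [0.9, 0.8, 0.6, 0.55]      # hypothetical FE-tool risk scores
controls = [0.7, 0.5, 0.4, 0.3]
print(auc(cases, controls))
```

An AUC of 0.5 means the score is no better than chance at separating cases from controls; 1.0 means perfect separation, which puts the paper's 0.71 vs. 0.78 comparison in context.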
Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian
2011-08-30
Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome, and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources, and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing.
What's New in the Library Automation Arena?
ERIC Educational Resources Information Center
Breeding, Marshall
1998-01-01
Reviews trends in library automation based on vendors at the 1998 American Library Association Annual Conference. Discusses the major industry trend, a move from host-based computer systems to the new generation of client/server, object-oriented, open systems-based automation. Includes a summary of developments for 26 vendors. (LRW)
Funding for Library Automation.
ERIC Educational Resources Information Center
Thompson, Ronelle K. H.
This paper provides a brief overview of planning and implementing a project to fund library automation. It is suggested that: (1) proposal budgets should include all costs of a project, such as furniture needed for computer terminals, costs for modifying library procedures, initial supplies, or ongoing maintenance; (2) automation does not save…
The Nature of Automated Jobs and Their Educational and Training Requirements.
ERIC Educational Resources Information Center
Fine, S.A.
Objective information concerning the impact of automation on educational and training requirements was obtained for 132 employees engaged in electron tube, computer, and steel manufacturing processes through management questionnaire responses, analysis of job functions, and employer interviews before and after the introduction of automation. The…
Analyzing Automated Instructional Systems: Metaphors from Related Design Professions.
ERIC Educational Resources Information Center
Jonassen, David H.; Wilson, Brent G.
Noting that automation has had an impact on virtually every manufacturing and information operation in the world, including instructional design (ID), this paper suggests three basic metaphors for automating instructional design activities: (1) computer-aided design and manufacturing (CAD/CAM) systems; (2) expert system advisor systems; and (3)…
Planning for the Automation of School Library Media Centers.
ERIC Educational Resources Information Center
Caffarella, Edward P.
1996-01-01
Geared for school library media specialists whose centers are in the early stages of automation or conversion to a new system, this article focuses on major components of media center automation: circulation control; online public access catalogs; machine readable cataloging; retrospective conversion of print catalog cards; and computer networks…
The Historical Evolution of Educational Software.
ERIC Educational Resources Information Center
Troutner, Joanne
This paper establishes the roots of computers and automated teaching in the field of psychology and describes Dr. S. L. Pressey's presentation of the teaching machine; B. F. Skinner's teaching machine; Meyer's steps in composing a program for the automated teaching machine; IBM's beginning research on automated courses and the development of the…
5 CFR 293.107 - Special safeguards for automated records.
Code of Federal Regulations, 2014 CFR
2014-01-01
... for automated records. (a) In addition to following the security requirements of § 293.106 of this... security safeguards for data about individuals in automated records, including input and output documents, reports, punched cards, magnetic tapes, disks, and on-line computer storage. The safeguards must be in...
DOT National Transportation Integrated Search
1974-08-01
Volume 5 describes the DELTA Simulation Model. It includes all documentation of the DELTA (Determine Effective Levels of Task Automation) computer simulation developed by TRW for use in the Automation Applications Study. Volume 5A includes a user's m...
NASA Technical Reports Server (NTRS)
Harrison, Cecil A.
1986-01-01
The efforts to automate the electromagnetic compatibility (EMC) test facilities at Marshall Space Flight Center were examined. A battery of nine standard tests is to be integrated by means of a desktop computer-controller in order to provide near real-time data assessment, store the data acquired during testing on flexible disk, and provide computer production of the certification report.
Preliminary Full-Scale Tests of the Center for Automated Processing of Hardwoods' Auto-Image
Philip A. Araman; Janice K. Wiedenbeck
1995-01-01
Automated lumber grading and yield optimization using computer controlled saws will be plausible for hardwoods if and when lumber scanning systems can reliably identify all defects by type. Existing computer programs could then be used to grade the lumber, identify the best cut-up solution, and control the sawing machines. The potential value of a scanning grading...
Integral flange design program. [procedure for computing stresses
NASA Technical Reports Server (NTRS)
Wilson, J. F.
1974-01-01
An automated interactive flange design program utilizing an electronic desktop calculator is presented. The program calculates the operating and seating stresses for circular flanges of the integral or optional type subjected to internal pressure. The required input information is documented. The program provides an automated procedure for computing stresses in selected flange geometries for comparison with the allowable code values.
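The actual ASME flange formulas the program implements (hub, ring, and bolt-load terms) are not reproduced in the abstract; as a hedged sketch of the same compute-then-compare pattern, here is the far simpler thin-walled hoop-stress check, with all numeric values invented for illustration:

```python
# Not the program's flange code formulas; a minimal compute-and-compare
# sketch using the thin-walled cylinder hoop stress, sigma = p * r / t.
def thin_wall_hoop_stress(pressure, radius, thickness):
    return pressure * radius / thickness

def acceptable(stress, allowable):
    """Pass/fail check against the allowable code value."""
    return stress <= allowable

# Hypothetical case: 2 MPa internal pressure, 150 mm radius, 6 mm wall.
sigma = thin_wall_hoop_stress(pressure=2.0, radius=150.0, thickness=6.0)
print(sigma, acceptable(sigma, allowable=120.0))
```

The flange program follows the same shape: compute each governing stress from the geometry and pressure, then report whether every one stays under its code allowable.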
Predictors of Interpersonal Trust in Virtual Distributed Teams
2008-09-01
understand systems that are very complex in nature. Such understanding is essential to facilitate building or maintaining operators' mental models of the... a significant impact on overall system performance. Specifically, the level of automation that combined human generation of options with computer... and/or computer servers had a significant impact on automated system performance. Additionally, Parasuraman, Sheridan, & Wickens (2000) proposed
NASA Technical Reports Server (NTRS)
Roske-Hofstrand, Renate J.
1990-01-01
The man-machine interface and its influence on the characteristics of computer displays in automated air traffic is discussed. The graphical presentation of spatial relationships and the problems it poses for air traffic control, and the solution of such problems are addressed. Psychological factors involved in the man-machine interface are stressed.
ERIC Educational Resources Information Center
Fridge, Evorell; Bagui, Sikha
2016-01-01
The goal of this research was to investigate the effects of automated testing software on levels of student reflection and student performance. This was a self-selecting, between subjects design that examined the performance of students in introductory computer programming classes. Participants were given the option of using the Web-CAT…
Almost human: Anthropomorphism increases trust resilience in cognitive agents.
de Visser, Ewart J; Monfort, Samuel S; McKendrick, Ryan; Smith, Melissa A B; McKnight, Patrick E; Krueger, Frank; Parasuraman, Raja
2016-09-01
We interact daily with computers that appear and behave like humans. Some researchers propose that people apply the same social norms to computers as they do to humans, suggesting that social psychological knowledge can be applied to our interactions with computers. In contrast, theories of human–automation interaction postulate that humans respond to machines in unique and specific ways. We believe that anthropomorphism—the degree to which an agent exhibits human characteristics—is the critical variable that may resolve this apparent contradiction across the formation, violation, and repair stages of trust. Three experiments were designed to examine these opposing viewpoints by varying the appearance and behavior of automated agents. Participants received advice that deteriorated gradually in reliability from a computer, avatar, or human agent. Our results showed (a) that anthropomorphic agents were associated with greater trust resilience, a higher resistance to breakdowns in trust; (b) that these effects were magnified by greater uncertainty; and (c) that incorporating human-like trust repair behavior largely erased differences between the agents. Automation anthropomorphism is therefore a critical variable that should be carefully incorporated into any general theory of human–agent trust as well as novel automation design. PsycINFO Database Record (c) 2016 APA, all rights reserved
Automated Measurement of Patient-Specific Tibial Slopes from MRI
Amerinatanzi, Amirhesam; Summers, Rodney K.; Ahmadi, Kaveh; Goel, Vijay K.; Hewett, Timothy E.; Nyman, Edward
2017-01-01
Background: Multi-planar proximal tibial slopes may be associated with increased likelihood of osteoarthritis and anterior cruciate ligament injury, due in part to their role in checking the anterior-posterior stability of the knee. Established methods suffer repeatability limitations and lack computational efficiency for intuitive clinical adoption. The aims of this study were to develop a novel automated approach and to compare the repeatability and computational efficiency of the approach against previously established methods. Methods: Tibial slope geometries were obtained via MRI and measured using an automated Matlab-based approach. Data were compared for repeatability and evaluated for computational efficiency. Results: Mean lateral tibial slope (LTS) for females (7.2°) was greater than for males (1.66°). Mean LTS in the lateral concavity zone was greater for females (7.8° for females, 4.2° for males). Mean medial tibial slope (MTS) for females was greater (9.3° vs. 4.6°). Along the medial concavity zone, female subjects demonstrated greater MTS. Conclusion: The automated method was more repeatable and computationally efficient than previously identified methods and may aid in the clinical assessment of knee injury risk, inform surgical planning, and support implant design efforts. PMID:28952547
Demonstration of the feasibility of automated silicon solar cell fabrication
NASA Technical Reports Server (NTRS)
Thornhill, J. W.; Taylor, W. E.
1976-01-01
An analysis of estimated costs indicates that for an annual output of 4,747,000 hexagonal cells (38 mm on a side), a total factory cost of $0.866 per cell could be achieved. For cells with 14% efficiency at AMO intensity (1353 watts per square meter), this annual production rate is equivalent to 3,373 kilowatts and a manufacturing cost of $1.22 per watt of electrical output. A laboratory model of such a facility was operated to produce a series of demonstration runs, producing hexagonal cells, 2 x 2 cm cells, and 2 x 4 cm cells.
Xu, Yang; Liu, Yuan-Zhi; Boppart, Stephen A; Carney, P Scott
2016-03-10
In this paper, we introduce an algorithm framework for the automation of interferometric synthetic aperture microscopy (ISAM). Under this framework, common processing steps such as dispersion correction, Fourier domain resampling, and computational adaptive optics aberration correction are carried out as metrics-assisted parameter search problems. We further present the results of this algorithm applied to phantom and biological tissue samples and compare with manually adjusted results. With the automated algorithm, near-optimal ISAM reconstruction can be achieved without manual adjustment. At the same time, the technical barrier for the nonexpert using ISAM imaging is also significantly lowered.
A Computational Architecture for Programmable Automation Research
NASA Astrophysics Data System (ADS)
Taylor, Russell H.; Korein, James U.; Maier, Georg E.; Durfee, Lawrence F.
1987-03-01
This short paper describes recent work at the IBM T. J. Watson Research Center directed at developing a highly flexible computational architecture for research on sensor-based programmable automation. The system described here has been designed with a focus on dynamic configurability, layered user interfaces, and incorporation of sensor-based real-time operations into new commands. It is these features which distinguish it from earlier work. The system is currently being implemented at IBM for research purposes and internal use and is an outgrowth of programmable automation research which has been ongoing since 1972 [e.g., 1, 2, 3, 4, 5, 6].
Alhmidi, Heba; Cadnum, Jennifer L; Piedrahita, Christina T; John, Amrita R; Donskey, Curtis J
2018-04-01
Touchscreens are a potential source of pathogen transmission. In our facility, patients and visitors rarely perform hand hygiene after using interactive touchscreen computer kiosks. An automated ultraviolet-C touchscreen disinfection device was effective in reducing bacteriophage MS2, bacteriophage ϕX174, methicillin-resistant Staphylococcus aureus, and Clostridium difficile spores inoculated onto a touchscreen. In simulations, an automated ultraviolet-C touchscreen disinfection device alone or in combination with hand hygiene reduced transfer of the viruses from contaminated touchscreens to fingertips. Published by Elsevier Inc.
Toward an automated parallel computing environment for geosciences
NASA Astrophysics Data System (ADS)
Zhang, Huai; Liu, Mian; Shi, Yaolin; Yuen, David A.; Yan, Zhenzhen; Liang, Guoping
2007-08-01
Software for geodynamic modeling has not kept up with the fast growing computing hardware and network resources. In the past decade supercomputing power has become available to most researchers in the form of affordable Beowulf clusters and other parallel computer platforms. However, to take full advantage of such computing power requires developing parallel algorithms and associated software, a task that is often too daunting for geoscience modelers whose main expertise is in geosciences. We introduce here an automated parallel computing environment built on open-source algorithms and libraries. Users interact with this computing environment by specifying the partial differential equations, solvers, and model-specific properties using an English-like modeling language in the input files. The system then automatically generates the finite element codes that can be run on distributed or shared memory parallel machines. This system is dynamic and flexible, allowing users to address different problems in geosciences. It is capable of providing web-based services, enabling users to generate source codes online. This unique feature will facilitate high-performance computing to be integrated with distributed data grids in the emerging cyber-infrastructures for geosciences. In this paper we discuss the principles of this automated modeling environment and provide examples to demonstrate its versatility.
NASA Technical Reports Server (NTRS)
Stricker, L. T.
1975-01-01
The LOVES computer program was employed to analyze the geosynchronous portion of NASA's 1973 automated satellite mission model from 1980 to 1990. The objectives of the analyses were: (1) to demonstrate the capability of the LOVES code to provide the depth and accuracy of data required to support the analyses; and (2) to trade off the concept of space servicing automated satellites composed of replaceable modules against the concept of replacing expendable satellites upon failure. The computer code proved to be an invaluable tool in analyzing the logistic requirements of the various test cases required in the tradeoff. It is indicated that the concept of space servicing offers the potential for substantial savings in the cost of operating automated satellite systems.
A Simple Method for Automated Equilibration Detection in Molecular Simulations.
Chodera, John D
2016-04-12
Molecular simulations intended to compute equilibrium properties are often initiated from configurations that are highly atypical of equilibrium samples, a practice which can generate a distinct initial transient in mechanical observables computed from the simulation trajectory. Traditional practice in simulation data analysis recommends this initial portion be discarded to equilibration, but no simple, general, and automated procedure for this process exists. Here, we suggest a conceptually simple automated procedure that does not make strict assumptions about the distribution of the observable of interest in which the equilibration time is chosen to maximize the number of effectively uncorrelated samples in the production timespan used to compute equilibrium averages. We present a simple Python reference implementation of this procedure and demonstrate its utility on typical molecular simulation data.
A simple method for automated equilibration detection in molecular simulations
Chodera, John D.
2016-01-01
Molecular simulations intended to compute equilibrium properties are often initiated from configurations that are highly atypical of equilibrium samples, a practice which can generate a distinct initial transient in mechanical observables computed from the simulation trajectory. Traditional practice in simulation data analysis recommends this initial portion be discarded to equilibration, but no simple, general, and automated procedure for this process exists. Here, we suggest a conceptually simple automated procedure that does not make strict assumptions about the distribution of the observable of interest, in which the equilibration time is chosen to maximize the number of effectively uncorrelated samples in the production timespan used to compute equilibrium averages. We present a simple Python reference implementation of this procedure, and demonstrate its utility on typical molecular simulation data. PMID:26771390
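The procedure in the two records above has a compact algorithmic core: for each candidate equilibration time t0, estimate the statistical inefficiency g of the remaining trajectory and keep the t0 that maximizes the effective sample count (N - t0)/g. The sketch below is a minimal illustration of that idea; the function names and the simple autocorrelation estimator are assumptions for illustration, not the paper's reference implementation (which lives in the `pymbar.timeseries` module).

```python
# Minimal sketch of automated equilibration detection: discard the
# initial transient by maximizing the number of effectively
# uncorrelated samples in the production region.
import numpy as np

def statistical_inefficiency(a):
    """Estimate g = 1 + 2*sum of positive autocorrelations of a 1-D series."""
    a = np.asarray(a, dtype=float)
    n = len(a)
    da = a - a.mean()
    var = da.dot(da) / n
    if var == 0.0:
        return 1.0
    g = 1.0
    for t in range(1, n - 1):
        c = da[:n - t].dot(da[t:]) / ((n - t) * var)
        if c <= 0.0:  # truncate the sum at the first non-positive lag
            break
        g += 2.0 * c * (1.0 - t / n)
    return max(g, 1.0)

def detect_equilibration(a, step=1):
    """Return (t0, g, n_eff) maximizing the effective sample count."""
    n = len(a)
    best = (0, 1.0, 0.0)
    for t0 in range(0, n - 2, step):
        g = statistical_inefficiency(a[t0:])
        n_eff = (n - t0) / g
        if n_eff > best[2]:
            best = (t0, g, n_eff)
    return best

# Example: a decaying initial transient on top of stationary noise.
rng = np.random.default_rng(0)
series = 5.0 * np.exp(-np.arange(2000) / 100.0) + rng.normal(size=2000)
t0, g, n_eff = detect_equilibration(series, step=10)
```

On data like this, the detected t0 falls after the bulk of the transient, and the production region behaves like nearly uncorrelated noise (g close to 1).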
A review of automated image understanding within 3D baggage computed tomography security screening.
Mouton, Andre; Breckon, Toby P
2015-01-01
Baggage inspection is the principal safeguard against the transportation of prohibited and potentially dangerous materials at airport security checkpoints. Although traditionally performed by 2D X-ray based scanning, increasingly stringent security regulations have led to a growing demand for more advanced imaging technologies. The role of X-ray Computed Tomography is thus rapidly expanding beyond the traditional materials-based detection of explosives. The development of computer vision and image processing techniques for the automated understanding of 3D baggage-CT imagery is, however, complicated by poor image resolution, image clutter, and high levels of noise and artefacts. We discuss the recent and most pertinent advancements and identify topics for future research within the challenging domain of automated image understanding for baggage security screening CT.
Vonhofen, Geraldine; Evangelista, Tonya; Lordeon, Patricia
2012-04-01
The traditional method of administering radioactive isotopes to pediatric patients undergoing ictal brain single photon emission computed tomography testing has been by manual injections. This method presents certain challenges for nursing, including time requirements and safety risks. This quality improvement project discusses the implementation of an automated injection system for isotope administration and its impact on staffing, safety, and nursing satisfaction. It was conducted in an epilepsy monitoring unit at a large urban pediatric facility. Results of this project showed a decrease in the number of nurses exposed to radiation and improved nursing satisfaction with the use of the automated injection system. In addition, there was a decrease in the number of nursing hours required during ictal brain single photon emission computed tomography testing.
NASA Technical Reports Server (NTRS)
Amling, G. E.; Holms, A. G.
1973-01-01
A computer program is described that performs a statistical multiple-decision procedure called chain pooling. It uses a number of mean squares assigned to error variance that is conditioned on the relative magnitudes of the mean squares. The model selection is done according to user-specified levels of type 1 or type 2 error probabilities.
Gega, Lina; Swift, Louise; Barton, Garry; Todd, Gillian; Reeve, Nesta; Bird, Kelly; Holland, Richard; Howe, Amanda; Wilson, Jon; Molle, Jo
2012-08-27
Computerised cognitive behaviour therapy (cCBT) involves standardised, automated, interactive self-help programmes delivered via a computer. Randomised controlled trials (RCTs) and observational studies have shown that cCBT reduces depressive symptoms as much as face-to-face therapy and more than waiting lists or treatment as usual. cCBT's efficacy and acceptability may be influenced by the "human" support offered as an adjunct to it, which can vary in duration and can be offered by people with different levels of training and expertise. This is a two-by-two factorial RCT investigating the effectiveness, cost-effectiveness and acceptability of cCBT supplemented with 12 weekly phone support sessions that are either brief (5-10 min) or extended (20-30 min) and are offered by either an expert clinician or an assistant with no clinical training. Adults with non-suicidal depression in primary care can self-refer into the study by completing and posting to the research team a standardised questionnaire. Following an assessment interview, eligible referrals have access to an 8-session cCBT programme called Beating the Blues and are randomised to one of four types of support: brief-assistant, extended-assistant, brief-clinician or extended-clinician. A sample size of 35 per group (total 140) is sufficient to detect a moderate effect size with 90% power on our primary outcome measure (Work and Social Adjustment Scale); assuming a 30% attrition rate, 200 patients will be randomised. Secondary outcome measures include the Beck Depression and Anxiety Inventories and the PHQ-9 and GAD-7. Data on clinical outcomes, treatment usage and patient experiences are collected in three ways: by post via self-report questionnaires at week 0 (randomisation) and at weeks 12 and 24 post-randomisation; electronically by the cCBT system every time patients log in; and by phone during assessments, support sessions and exit interviews.
The study's factorial design increases its efficiency by allowing the concurrent investigation of two types of adjunct support for cCBT with a single sample of participants. Difficulties in recruitment, uptake and retention of participants are anticipated because of the nature of the targeted clinical problem (depression impairs motivation) and of the studied interventions (lack of face-to-face contact because referrals, assessments, interventions and data collection are completed by phone, computer or post). Current Controlled Trials ISRCTN98677176.
ERIC Educational Resources Information Center
Chen, Jing; Zhang, Mo; Bejar, Isaac I.
2017-01-01
Automated essay scoring (AES) generally computes essay scores as a function of macrofeatures derived from a set of microfeatures extracted from the text using natural language processing (NLP). In the "e-rater"® automated scoring engine, developed at "Educational Testing Service" (ETS) for the automated scoring of essays, each…
Logistics Automation Master Plan (LAMP). Better Logistics Support through Automation.
1983-06-01
office micro-computers, positioned throughout the command chain, by providing real time links between LCA and all users: 2. Goals: Assist HQDA staff in...field i.e., Airland Battle 2000. Section V: CONCEPT OF EXECUTION Supply (Retail) A. System Description. 1. The Division Logistics Property Book...7. Divisional Direct Support Unit Automated Supply System (DDASS)/Direct Support Level Supply Automation (DLSA). DDASS and DLSA are system development
1981-06-30
manpower needs as to quantity, quality and timing; all the internal functions of the personnel service are tapped to help meet these ends. Manpower...Program ACOS - Automated Computation of Service ACQ - Acquisition ACSAC - Assistant Chief of Staff for Automation and Comunications ACT - Automated...ARSTAF - Army Staff ARSTAFF - Army Staff ARTEP - Army Training and Evaluation Program ASI - Additional Skill Identifier ASVAB - Armed Services
Precision Departure Release Capability (PDRC) Overview and Results: NASA to FAA Research Transition
NASA Technical Reports Server (NTRS)
Engelland, Shawn; Davis, Tom
2013-01-01
NASA researchers developed the Precision Departure Release Capability (PDRC) concept to improve the tactical departure scheduling process. The PDRC system is comprised of: 1) a surface automation system that computes ready time predictions and departure runway assignments; 2) an en route scheduling automation tool that uses this information to estimate ascent trajectories to the merge point and compute release times; and 3) an interface that provides two-way communication between the two systems. To minimize technology transfer issues and facilitate its adoption by Traffic Management Coordinators (TMC) and Frontline Managers (FLM), NASA developed the PDRC prototype using the Surface Decision Support System (SDSS) as the tower surface automation tool, a research version of the FAA TMA (RTMA) as the en route automation tool, and a digital interface between the two DSTs to facilitate coordination.
SU-F-I-45: An Automated Technique to Measure Image Contrast in Clinical CT Images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanders, J; Abadi, E; Meng, B
Purpose: To develop and validate an automated technique for measuring image contrast in chest computed tomography (CT) exams. Methods: An automated computer algorithm was developed to measure the distribution of Hounsfield units (HUs) inside four major organs: the lungs, liver, aorta, and bones. These organs were first segmented or identified using computer vision and image processing techniques. Regions of interest (ROIs) were automatically placed inside the lungs, liver, and aorta, and histograms of the HUs inside the ROIs were constructed. The mean and standard deviation of each histogram were computed for each CT dataset. Comparison of the means and standard deviations of the HUs in the different organs provides different contrast values. The ROI for the bones is simply the segmentation mask of the bones. Since the histogram for bones does not follow a Gaussian distribution, the 25th and 75th percentiles were computed instead of the mean. The sensitivity and accuracy of the algorithm were investigated by comparing the automated measurements with manual measurements. Fifteen contrast-enhanced and fifteen non-contrast-enhanced chest CT clinical datasets were examined in the validation procedure. Results: The algorithm successfully measured the histograms of the four organs in both contrast-enhanced and non-contrast-enhanced chest CT exams. The automated measurements were in agreement with manual measurements. The algorithm has sufficient sensitivity, as indicated by the near-unity slope of the automated versus manual measurement plots. Furthermore, the algorithm has sufficient accuracy, as indicated by the high coefficient of determination (R2) values, ranging from 0.879 to 0.998. Conclusion: Patient-specific image contrast can be measured from clinical datasets. The algorithm can be run on both contrast-enhanced and non-enhanced clinical datasets. The method can be applied to automatically assess the contrast characteristics of clinical chest CT images and quantify dependencies that may not be captured in phantom data.
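The measurement step of the abstract above reduces to simple descriptive statistics per organ: mean and standard deviation of the HUs for lungs, liver, and aorta, and the 25th/75th percentiles for bone, whose histogram is not Gaussian. The sketch below illustrates only that step on a toy labeled volume; the upstream segmentation is assumed to have produced the masks, and all names are illustrative.

```python
# Hedged sketch: summarize HU distributions per organ from given masks.
import numpy as np

def organ_contrast_stats(hu_volume, masks):
    """masks: dict of organ name -> boolean array matching hu_volume."""
    stats = {}
    for organ, mask in masks.items():
        values = hu_volume[mask]
        if organ == "bones":  # non-Gaussian histogram: use quartiles
            stats[organ] = {"p25": float(np.percentile(values, 25)),
                            "p75": float(np.percentile(values, 75))}
        else:
            stats[organ] = {"mean": float(values.mean()),
                            "std": float(values.std())}
    return stats

# Toy volume with three constant-HU regions standing in for organs.
vol = np.zeros((4, 32, 32))
vol[:, :16, :] = -800.0   # lung-like HUs
vol[:, 16:, :16] = 60.0   # liver-like HUs
vol[:, 16:, 16:] = 700.0  # bone-like HUs
masks = {"lungs": vol == -800.0, "liver": vol == 60.0, "bones": vol == 700.0}

stats = organ_contrast_stats(vol, masks)
contrast = stats["liver"]["mean"] - stats["lungs"]["mean"]  # 860.0
```

Differences between per-organ means (like `contrast` here) give the kind of organ-to-organ contrast values the abstract describes.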
High-reliability computing for the smarter planet
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quinn, Heather M; Graham, Paul; Manuzzato, Andrea
2010-01-01
The geometric rate of improvement of transistor size and integrated circuit performance, known as Moore's Law, has been an engine of growth for our economy, enabling new products and services, creating new value and wealth, increasing safety, and removing menial tasks from our daily lives. Affordable, highly integrated components have enabled both life-saving technologies and rich entertainment applications. Anti-lock brakes, insulin monitors, and GPS-enabled emergency response systems save lives. Cell phones, internet appliances, virtual worlds, realistic video games, and mp3 players enrich our lives and connect us together. Over the past 40 years of silicon scaling, the increasing capabilities of inexpensive computation have transformed our society through automation and ubiquitous communications. In this paper, we will present the concept of the smarter planet, how reliability failures affect current systems, and methods that can be used to increase the reliable adoption of new automation in the future. We will illustrate these issues using a number of different electronic devices in a couple of different scenarios. Recently IBM has been presenting the idea of a 'smarter planet.' In smarter planet documents, IBM discusses increased computer automation of roadways, banking, healthcare, and infrastructure, as automation could create more efficient systems. A necessary component of the smarter planet concept is to ensure that these new systems have very high reliability. Even extremely rare reliability problems can easily escalate to problematic scenarios when implemented at very large scales. For life-critical systems, such as automobiles, infrastructure, medical implantables, and avionic systems, unmitigated failures could be dangerous. As more automation moves into these types of critical systems, reliability failures will need to be managed. As computer automation continues to increase in our society, the need for greater radiation reliability grows. Already critical infrastructure is failing too frequently. In this paper, we will introduce the Cross-Layer Reliability concept for designing more reliable computer systems.
An Automated Method for High-Definition Transcranial Direct Current Stimulation Modeling*
Huang, Yu; Su, Yuzhuo; Rorden, Christopher; Dmochowski, Jacek; Datta, Abhishek; Parra, Lucas C.
2014-01-01
Targeted transcranial stimulation with electric currents requires accurate models of the current flow from scalp electrodes to the human brain. Idiosyncratic anatomy of individual brains and heads leads to significant variability in such current flows across subjects, thus, necessitating accurate individualized head models. Here we report on an automated processing chain that computes current distributions in the head starting from a structural magnetic resonance image (MRI). The main purpose of automating this process is to reduce the substantial effort currently required for manual segmentation, electrode placement, and solving of finite element models. In doing so, several weeks of manual labor were reduced to no more than 4 hours of computation time and minimal user interaction, while current-flow results for the automated method deviated by less than 27.9% from the manual method. Key facilitating factors are the addition of three tissue types (skull, scalp and air) to a state-of-the-art automated segmentation process, morphological processing to correct small but important segmentation errors, and automated placement of small electrodes based on easily reproducible standard electrode configurations. We anticipate that such an automated processing will become an indispensable tool to individualize transcranial direct current stimulation (tDCS) therapy. PMID:23367144
Estimating post-marketing exposure to pharmaceutical products using ex-factory distribution data.
Telfair, Tamara; Mohan, Aparna K; Shahani, Shalini; Klincewicz, Stephen; Atsma, Willem Jan; Thomas, Adrian; Fife, Daniel
2006-10-01
The pharmaceutical industry has an obligation to identify adverse reactions to drug products during all phases of drug development, including the post-marketing period. Estimates of population exposure to pharmaceutical products are important to the post-marketing surveillance of drugs, and provide a context for assessing the various risks and benefits, including drug safety, associated with drug treatment. This paper describes a systematic approach to estimating post-marketing drug exposure using ex-factory shipment data to estimate the quantity of medication available, and dosage information (stratified by indication or other factors as appropriate) to convert the quantity of medication to person time of exposure. Unlike the non-standardized methods often used to estimate exposure, this approach provides estimates whose calculations are explicit, documented, and consistent across products and over time. The methods can readily be carried out by an individual or small group specializing in this function, and lend themselves to automation. The present estimation approach is practical and relatively uncomplicated to implement. We believe it is a useful innovation. Copyright 2006 John Wiley & Sons, Ltd.
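The estimation approach described above converts shipped quantity of medication into person-time of exposure using dosage information stratified by indication. The sketch below is an illustrative calculation under assumed figures; the pack sizes, dose shares, and all names are hypothetical, not taken from the paper.

```python
# Hedged sketch: ex-factory shipments -> person-years of exposure,
# stratified by indication-specific daily dose.
def person_years_exposure(packs_shipped, tablets_per_pack, daily_dose_shares):
    """daily_dose_shares: list of (share_of_use, tablets_per_day) pairs."""
    total_tablets = packs_shipped * tablets_per_pack
    person_days = sum(share * total_tablets / tablets_per_day
                      for share, tablets_per_day in daily_dose_shares)
    return person_days / 365.25

# Hypothetical example: 1,000,000 packs of 30 tablets; 70% of use at
# 1 tablet/day (indication A), 30% at 2 tablets/day (indication B).
py = person_years_exposure(1_000_000, 30, [(0.7, 1.0), (0.3, 2.0)])
# person_days = 0.7*30e6/1 + 0.3*30e6/2 = 25,500,000 -> ~69,800 person-years
```

Because the inputs and arithmetic are explicit, the same calculation can be documented and repeated consistently across products and over time, which is the standardization benefit the paper emphasizes.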
Computer program CDCID: an automated quality control program using CDC update
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singer, G.L.; Aguilar, F.
1984-04-01
A computer program, CDCID, has been developed in coordination with a quality control program to provide a highly automated method of documenting changes to computer codes at EG and G Idaho, Inc. The method uses the standard CDC UPDATE program in such a manner that updates and their associated documentation are easily made and retrieved in various formats. The method allows each card image of a source program to point to the document which describes it, who created the card, and when it was created. The method described is applicable to the quality control of computer programs in general. The computer program described is executable only on CDC computing systems, but the program could be modified and applied to any computing system with an adequate updating program.
1988-10-01
overview of the complexity analysis tool (CAT), an automated tool which will analyze mission critical computer resources (MCCR) software. CAT is based...CAT automates the metric for BASIC (HP-71), ATLAS (EQUATE), Ada (subset...UNIX 5.2). CAT analyzes source code and computes complexity on a module basis. CAT also generates graphic representations of the logic flow paths and
Artificial intelligence issues related to automated computing operations
NASA Technical Reports Server (NTRS)
Hornfeck, William A.
1989-01-01
Large data processing installations represent target systems for effective applications of artificial intelligence (AI) constructs. The system organization of a large data processing facility at the NASA Marshall Space Flight Center is presented. The methodology and the issues which are related to AI application to automated operations within a large-scale computing facility are described. Problems to be addressed and initial goals are outlined.
Automated Measurement of Facial Expression in Infant-Mother Interaction: A Pilot Study
ERIC Educational Resources Information Center
Messinger, Daniel S.; Mahoor, Mohammad H.; Chow, Sy-Miin; Cohn, Jeffrey F.
2009-01-01
Automated facial measurement using computer vision has the potential to objectively document continuous changes in behavior. To examine emotional expression and communication, we used automated measurements to quantify smile strength, eye constriction, and mouth opening in two 6-month-old infant-mother dyads who each engaged in a face-to-face…
AUTOMATED LITERATURE PROCESSING HANDLING AND ANALYSIS SYSTEM--FIRST GENERATION.
ERIC Educational Resources Information Center
Redstone Scientific Information Center, Redstone Arsenal, AL.
The report presents a summary of the development and the characteristics of the first generation of the Automated Literature Processing, Handling and Analysis (ALPHA-1) system. Descriptions of the computer technology of ALPHA-1 and the use of this automated library technique are presented. Each of the subsystems and modules now in operation are…
Ercan, Ertuğrul; Kırılmaz, Bahadır; Kahraman, İsmail; Bayram, Vildan; Doğan, Hüseyin
2012-11-01
Flow-mediated dilation (FMD) is used to evaluate endothelial function. Computer-assisted analysis utilizing edge detection permits continuous measurements along the vessel wall. We have developed a new fully automated software program to allow accurate and reproducible measurement. FMD was measured and analyzed in 18 coronary artery disease (CAD) patients and 17 controls, both manually and with the newly developed (computer-supported) software. The agreement between methods was assessed by Bland-Altman analysis. The mean age, body mass index and cardiovascular risk factors were higher in the CAD group. Automated FMD% measurement was 18.3±8.5 for the control subjects and 6.8±6.5 for the CAD group (p=0.0001). The intraobserver and interobserver correlations for automated measurement were high (r=0.974, r=0.981, r=0.937, r=0.918, respectively). Manual FMD% at the 60th second correlated with automated FMD% (r=0.471, p=0.004). The new fully automated software can be used for precise measurement of FMD, with lower intra- and interobserver variability than manual assessment.
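Underlying both the manual and automated methods is the standard FMD% definition: the peak post-occlusion diameter relative to the baseline diameter, expressed as a percentage. The sketch below illustrates only that computation; in the software the diameter series would come from edge detection along the vessel wall, whereas here it is synthetic, and all names are illustrative.

```python
# Hedged sketch of the core FMD% computation:
# FMD% = (peak post-occlusion diameter - baseline) / baseline * 100.
def fmd_percent(baseline_mm, diameters_mm):
    peak = max(diameters_mm)
    return (peak - baseline_mm) / baseline_mm * 100.0

baseline = 4.0                               # mm, resting artery diameter
post_occlusion = [4.0, 4.2, 4.5, 4.6, 4.4]   # mm, diameters over time
fmd = fmd_percent(baseline, post_occlusion)  # (4.6 - 4.0) / 4.0 * 100 = 15%
```

Automating the diameter extraction removes the main source of intra- and interobserver variability, since the formula itself is deterministic once the diameter series is fixed.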
Development of microcomputer-based mental acuity tests for repeated-measures studies
NASA Technical Reports Server (NTRS)
Kennedy, R. S.; Wilkes, R. L.; Baltzley, D. R.; Fowlkes, J. E.
1990-01-01
The purpose of this report is to detail the development of the Automated Performance Test System (APTS), a computer battery of mental acuity tests that can be used to assess human performance in the presence of toxic elements and environmental stressors. There were four objectives in the development of APTS. First, the technical requirements for developing APTS followed the tenets of the classical theory of mental tests, which requires that tests meet set criteria like stability and reliability (the lack of which constitutes insensitivity). To be employed in the study of the exotic conditions of protracted space flight, a battery with multiple parallel forms is required. The second criterion was for the battery to have factorial multidimensionality, and the third was for the battery to be sensitive to factors known to compromise performance. A fourth objective was for the tests to converge on the abilities entailed in mission specialist tasks. A series of studies is reported in which candidate APTS tests were subjected to an examination of their psychometric properties for repeated-measures testing. From this work, tests were selected that possessed the requisite metric properties of stability, reliability, and factor richness. In addition, studies are reported which demonstrate the predictive validity of the tests against holistic measures of intelligence.
Multi-disciplinary optimization of railway wheels
NASA Astrophysics Data System (ADS)
Nielsen, J. C. O.; Fredö, C. R.
2006-06-01
A numerical procedure for multi-disciplinary optimization of railway wheels, based on Design of Experiments (DOE) methodology and automated design, is presented. The target is a wheel design that meets the requirements for fatigue strength, while minimizing the unsprung mass and rolling noise. A 3-level full factorial (3LFF) DOE is used to collect data points required to set up Response Surface Models (RSM) relating design and response variables in the design space. Computationally efficient simulations are thereafter performed using the RSM to identify the solution that best fits the design target. A demonstration example, including four geometric design variables in a parametric finite element (FE) model, is presented. The design variables are wheel radius, web thickness, lateral offset between rim and hub, and radii at the transitions rim/web and hub/web, but more variables (including material properties) can be added if needed. To improve further the performance of the wheel design, a constrained layer damping (CLD) treatment is applied on the web. For a given load case, compared to a reference wheel design without CLD, a combination of wheel shape and damping optimization leads to the conclusion that a reduction in the wheel component of A-weighted rolling noise of 11 dB can be achieved if a simultaneous increase in wheel mass of 14 kg is accepted.
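The DOE/RSM workflow described above can be sketched compactly: evaluate the responses on a 3-level full factorial grid, fit a quadratic response surface by least squares, then search the fitted surface for the best design. The sketch below uses two coded design variables and a simple analytic stand-in for the FE simulations; the function names and the toy response are assumptions for illustration, not the paper's actual wheel model.

```python
# Hedged sketch: 3-level full factorial DOE + quadratic response
# surface model (RSM), searched for the minimizing design point.
import itertools
import numpy as np

def quad_features(x1, x2):
    """Quadratic RSM basis in two coded design variables."""
    return [1.0, x1, x2, x1 * x2, x1**2, x2**2]

def fit_rsm(points, responses):
    X = np.array([quad_features(*p) for p in points])
    coef, *_ = np.linalg.lstsq(X, np.array(responses), rcond=None)
    return coef

def predict(coef, x1, x2):
    return float(np.dot(coef, quad_features(x1, x2)))

# 3-level full factorial in coded units {-1, 0, +1}: 3^2 = 9 runs.
levels = [-1.0, 0.0, 1.0]
points = list(itertools.product(levels, levels))
# Stand-in for the expensive FE response; minimum at (0.3, -0.2).
true_response = lambda x1, x2: (x1 - 0.3) ** 2 + 2.0 * (x2 + 0.2) ** 2
responses = [true_response(*p) for p in points]

coef = fit_rsm(points, responses)
# Cheap dense search on the fitted surface instead of more FE runs.
grid = np.linspace(-1.0, 1.0, 101)
best = min((predict(coef, a, b), a, b) for a in grid for b in grid)
```

The point of the RSM is exactly this cost shift: the nine factorial runs are the only expensive simulations, after which the design space can be explored on the fitted polynomial at negligible cost.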
Interactive visualization of Earth and Space Science computations
NASA Technical Reports Server (NTRS)
Hibbard, William L.; Paul, Brian E.; Santek, David A.; Dyer, Charles R.; Battaiola, Andre L.; Voidrot-Martinez, Marie-Francoise
1994-01-01
Computers have become essential tools for scientists simulating and observing nature. Simulations are formulated as mathematical models but are implemented as computer algorithms to simulate complex events. Observations are also analyzed and understood in terms of mathematical models, but the number of these observations usually dictates that we automate analyses with computer algorithms. In spite of their essential role, computers are also barriers to scientific understanding. Unlike hand calculations, automated computations are invisible and, because of the enormous numbers of individual operations in automated computations, the relation between an algorithm's input and output is often not intuitive. This problem is illustrated by the behavior of meteorologists responsible for forecasting weather. Even in this age of computers, many meteorologists manually plot weather observations on maps, then draw isolines of temperature, pressure, and other fields by hand (special pads of maps are printed for just this purpose). Similarly, radiologists use computers to collect medical data but are notoriously reluctant to apply image-processing algorithms to that data. To these scientists with life-and-death responsibilities, computer algorithms are black boxes that increase rather than reduce risk. The barrier between scientists and their computations can be bridged by techniques that make the internal workings of algorithms visible and that allow scientists to experiment with their computations. Here we describe two interactive systems developed at the University of Wisconsin-Madison Space Science and Engineering Center (SSEC) that provide these capabilities to Earth and space scientists.
Toward the Factory of the Future.
ERIC Educational Resources Information Center
Hazony, Yehonathan
1983-01-01
Computer-integrated manufacturing (CIM) involves use of data processing technology as the vehicle for full integration of the total manufacturing process. A prototype research and educational facility for CIM developed with industrial sponsorship at Princeton University is described. (JN)
Technology Pathway Partnership Final Scientific Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hall, John C. Dr.; Godby, Larry A.
2012-04-26
This report covers the scientific progress and results made in the development of high-efficiency multijunction solar cells and the light-concentrating non-imaging optics for the commercial generation of renewable solar energy. During the contract period the efficiency of the multijunction solar cell was raised from 36.5% to 40% in commercially available, fully qualified cells. In addition, significant strides were made in automating the production process for these cells in order to meet the costs required to compete with commercial electricity. Concurrent with the cell effort, Boeing also developed non-imaging optical systems to raise the light intensity at the photovoltaic cell to the range of 800 to 900 suns. Solar module efficiencies greater than 30% were consistently demonstrated. The technology and its manufacturing were matured to a projected price of < $0.015 per kWh and demonstrated by automated assembly in a robotic factory with a throughput of 2 MWh/yr. The technology was demonstrated in a 100 kW power plant erected at California State University, Northridge, CA.
Predicting Flows of Rarefied Gases
NASA Technical Reports Server (NTRS)
LeBeau, Gerald J.; Wilmoth, Richard G.
2005-01-01
DSMC Analysis Code (DAC) is a flexible, highly automated, easy-to-use computer program for predicting flows of rarefied gases -- especially flows of upper-atmospheric, propulsion, and vented gases impinging on spacecraft surfaces. DAC implements the direct simulation Monte Carlo (DSMC) method, which is widely recognized as standard for simulating flows at densities so low that the continuum-based equations of computational fluid dynamics are invalid. DAC enables users to model complex surface shapes and boundary conditions quickly and easily. The discretization of a flow field into computational grids is automated, thereby relieving the user of a traditionally time-consuming task while ensuring (1) appropriate refinement of grids throughout the computational domain, (2) determination of optimal settings for temporal discretization and other simulation parameters, and (3) satisfaction of the fundamental constraints of the method. In so doing, DAC ensures an accurate and efficient simulation. In addition, DAC can utilize parallel processing to reduce computation time. The domain decomposition needed for parallel processing is completely automated, and the software employs a dynamic load-balancing mechanism to ensure optimal parallel efficiency throughout the simulation.
Evolutionary and biological metaphors for engineering design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jakiela, M.
1994-12-31
Since computing became generally available, there has been strong interest in using computers to assist and automate engineering design processes. Specifically, for design optimization and automation, nonlinear programming and artificial intelligence techniques have been extensively studied. New computational techniques, based upon the natural processes of evolution, adaptation, and learning, are showing promise because of their generality and robustness. This presentation will describe the use of two such techniques, genetic algorithms and classifier systems, for a variety of engineering design problems. Structural topology optimization, meshing, and general engineering optimization are shown as example applications.
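The genetic algorithms mentioned above encode a design as a string, then repeatedly apply selection, crossover, and mutation. A minimal sketch under invented assumptions: the fitness function here (count of ones in a bit string) is a toy stand-in for an engineering objective such as structural topology merit.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def fitness(bits):
    # Toy objective: maximize the number of 1-bits ("onemax").
    return sum(bits)

def evolve(n_bits=20, pop_size=30, generations=60, p_mut=0.02):
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        # Tournament selection: keep the fitter of two random individuals.
        parents = [max(random.sample(pop, 2), key=fitness) for _ in range(pop_size)]
        nxt = []
        for i in range(0, pop_size, 2):
            a, b = parents[i], parents[i + 1]
            cut = random.randrange(1, n_bits)            # one-point crossover
            for child in (a[:cut] + b[cut:], b[:cut] + a[cut:]):
                # Bit-flip mutation with probability p_mut per bit.
                nxt.append([bit ^ (random.random() < p_mut) for bit in child])
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
```

For a real design problem, the bit string would decode to design parameters (e.g., material presence in a topology grid) and `fitness` would call an analysis code.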
NASA Astrophysics Data System (ADS)
Wang, S.; Huang, G. H.; Veawab, A.
2013-03-01
This study proposes a sequential factorial analysis (SFA) approach for supporting regional air quality management under uncertainty. SFA is capable not only of examining the interactive effects of input parameters, but also of analyzing the effects of constraints. When there are too many factors involved in practical applications, SFA has the advantage of conducting a sequence of factorial analyses for characterizing the effects of factors in a systematic manner. The factor-screening strategy employed in SFA is effective in greatly reducing the computational effort. The proposed SFA approach is applied to a regional air quality management problem for demonstrating its applicability. The results indicate that the effects of factors are evaluated quantitatively, which can help decision makers identify the key factors that have significant influence on system performance and explore the valuable information that may be veiled beneath their interrelationships.
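Factorial analysis of the kind SFA builds on quantifies main and interaction effects by contrasting responses across coded factor levels. A minimal sketch for a two-level, two-factor design with invented response values (SFA itself handles many more factors, sequentially):

```python
# 2^2 design: responses y indexed by (factor A level, factor B level), coded -1/+1.
# The response values are invented for illustration.
runs = {(-1, -1): 52.0, (+1, -1): 60.0, (-1, +1): 54.0, (+1, +1): 70.0}

def effect(contrast):
    """Average response difference for a given contrast of the coded levels."""
    return sum(contrast(a, b) * y for (a, b), y in runs.items()) / (len(runs) / 2)

main_A = effect(lambda a, b: a)        # main effect of factor A
main_B = effect(lambda a, b: b)        # main effect of factor B
inter_AB = effect(lambda a, b: a * b)  # A x B interaction effect
```

Comparing the magnitudes of such effects is what lets a factor-screening strategy discard unimportant factors early and cut computational effort.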
Automated Reporting of DXA Studies Using a Custom-Built Computer Program.
England, Joseph R; Colletti, Patrick M
2018-06-01
Dual-energy x-ray absorptiometry (DXA) scans are a critical population health tool and relatively simple to interpret but can be time consuming to report, often requiring manual transfer of bone mineral density and associated statistics into commercially available dictation systems. We describe here a custom-built computer program for automated reporting of DXA scans using Pydicom, an open-source package built in the Python computer language, and regular expressions to mine DICOM tags for patient information and bone mineral density statistics. This program, easy to emulate by any novice computer programmer, has doubled our efficiency at reporting DXA scans and has eliminated dictation errors.
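The abstract describes mining DICOM tags with regular expressions for patient information and BMD statistics. A minimal sketch of the regex step alone, using an invented report string (field names and layout are hypothetical; a real scanner's output and the actual Pydicom tag handling would differ):

```python
import re

# Hypothetical free-text fragment of the kind a DXA scanner might embed.
raw = "Patient: DOE^JANE  Region: L1-L4  BMD: 0.986 g/cm2  T-score: -1.3  Z-score: -0.4"

# Pull out the numeric statistics with regular expressions.
bmd = float(re.search(r"BMD:\s*([\d.]+)\s*g/cm2", raw).group(1))
t_score = float(re.search(r"T-score:\s*(-?[\d.]+)", raw).group(1))

# Assemble a report sentence from the mined values.
report = f"Lumbar spine BMD {bmd:.3f} g/cm2, T-score {t_score:.1f}."
```

In the described program, the same idea is applied to strings retrieved from DICOM tags via Pydicom, and the assembled text replaces manual transfer into the dictation system.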
Automation of multi-agent control for complex dynamic systems in heterogeneous computational network
NASA Astrophysics Data System (ADS)
Oparin, Gennady; Feoktistov, Alexander; Bogdanova, Vera; Sidorov, Ivan
2017-01-01
The rapid progress of high-performance computing entails new challenges related to solving large scientific problems for various subject domains in a heterogeneous distributed computing environment (e.g., a network, Grid system, or Cloud infrastructure). Specialists in the field of parallel and distributed computing pay special attention to the scalability of applications for problem solving. Effective management of a scalable application in a heterogeneous distributed computing environment is still a non-trivial issue, and control systems that operate in networks are especially affected by it. We propose a new approach to multi-agent management of scalable applications in a heterogeneous computational network. The fundamentals of our approach are the integrated use of conceptual programming, simulation modeling, network monitoring, multi-agent management, and service-oriented programming. We developed a special framework for automating the problem solving. Advantages of the proposed approach are demonstrated on the example of parametric synthesis of a static linear regulator for complex dynamic systems. Benefits of the scalable application for solving this problem include automation of multi-agent control for the systems in a parallel mode with various degrees of detail.
NASA Tech Briefs, May 1994. Volume 18, No. 5
NASA Technical Reports Server (NTRS)
1994-01-01
Topics covered include: Robotics/Automation; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.
Genetic algorithms in teaching artificial intelligence (automated generation of specific algebras)
NASA Astrophysics Data System (ADS)
Habiballa, Hashim; Jendryscik, Radek
2017-11-01
The problem of teaching essential Artificial Intelligence (AI) methods is an important task for an educator in the branch of soft computing. The key focus is often on a proper understanding of the principles of AI methods in two essential points: why we use soft-computing methods at all, and how we apply these methods to generate reasonable results in a sensible time. We present an interesting problem solved in non-educational research concerning the automated generation of specific algebras in a huge search space. We emphasize the above-mentioned points through an educational case study of this problem in the automated generation of specific algebras.
Agent Models for Self-Motivated Home-Assistant Bots
NASA Astrophysics Data System (ADS)
Merrick, Kathryn; Shafi, Kamran
2010-01-01
Modern society increasingly relies on technology to support everyday activities. In the past, this technology has focused on automation, using computer technology embedded in physical objects. More recently, there is an expectation that this technology will not just embed reactive automation, but also embed intelligent, proactive automation in the environment. That is, there is an emerging desire for novel technologies that can monitor, assist, inform or entertain when required, and not just when requested. This paper presents three self-motivated, home-assistant bot applications using different self-motivated agent models. Self-motivated agents use a computational model of motivation to generate goals proactively. Technologies based on self-motivated agents can thus respond autonomously and proactively to stimuli from their environment. Three prototypes of different self-motivated agent models, using different computational models of motivation, are described to demonstrate these concepts.
2011-01-01
Background Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. Results We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. Conclusion The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high throughput data processing. PMID:21878105
Wickens, Christopher D; Sebok, Angelia; Li, Huiyang; Sarter, Nadine; Gacy, Andrew M
2015-09-01
The aim of this study was to develop and validate a computational model of the automation complacency effect, as operators work on a robotic arm task, supported by three different degrees of automation. Some computational models of complacency in human-automation interaction exist, but those are formed and validated within the context of fairly simplified monitoring failures. This research extends model validation to a much more complex task, so that system designers can establish, without need for human-in-the-loop (HITL) experimentation, merits and shortcomings of different automation degrees. We developed a realistic simulation of a space-based robotic arm task that could be carried out with three different levels of trajectory visualization and execution automation support. Using this simulation, we performed HITL testing. Complacency was induced via several trials of correctly performing automation and then was assessed on trials when automation failed. Following a cognitive task analysis of the robotic arm operation, we developed a multicomponent model of the robotic operator and his or her reliance on automation, based in part on visual scanning. The comparison of model predictions with empirical results revealed that the model accurately predicted routine performance and predicted the responses to these failures after complacency developed. However, the scanning models do not account for the entire attention allocation effects of complacency. Complacency modeling can provide a useful tool for predicting the effects of different types of imperfect automation. The results from this research suggest that focus should be given to supporting situation awareness in automation development. © 2015, Human Factors and Ergonomics Society.
NASA Astrophysics Data System (ADS)
Murga, Alicia; Sano, Yusuke; Kawamoto, Yoichi; Ito, Kazuhide
2017-10-01
Mechanical and passive ventilation strategies directly impact indoor air quality. Passive ventilation has recently become widespread owing to its ability to reduce energy demand in buildings, such as the case of natural or cross ventilation. To understand the effect of natural ventilation on indoor environmental quality, outdoor-indoor flow paths need to be analyzed as functions of urban atmospheric conditions, topology of the built environment, and indoor conditions. Wind-driven natural ventilation (e.g., cross ventilation) can be calculated through the wind pressure coefficient distributions of outdoor wall surfaces and openings of a building, allowing the study of indoor air parameters and airborne contaminant concentrations. Variations in outside parameters will directly impact indoor air quality and residents' health. Numerical modeling can contribute to comprehend these various parameters because it allows full control of boundary conditions and sampling points. In this study, numerical weather prediction modeling was used to calculate wind profiles/distributions at the atmospheric scale, and computational fluid dynamics was used to model detailed urban and indoor flows, which were then integrated into a dynamic downscaling analysis to predict specific urban wind parameters from the atmospheric to built-environment scale. Wind velocity and contaminant concentration distributions inside a factory building were analyzed to assess the quality of the human working environment by using a computer simulated person. The impact of cross ventilation flows and its variations on local average contaminant concentration around a factory worker, and inhaled contaminant dose, were then discussed.
7 CFR 247.25 - Allowable uses of administrative funds and other funds.
Code of Federal Regulations, 2013 CFR
2013-01-01
... equipment include automated information systems, automated data processing equipment, and other computer... sale of packing containers or pallets, and the salvage of commodities. Program income does not include...
7 CFR 247.25 - Allowable uses of administrative funds and other funds.
Code of Federal Regulations, 2014 CFR
2014-01-01
... equipment include automated information systems, automated data processing equipment, and other computer... sale of packing containers or pallets, and the salvage of commodities. Program income does not include...
7 CFR 247.25 - Allowable uses of administrative funds and other funds.
Code of Federal Regulations, 2011 CFR
2011-01-01
... equipment include automated information systems, automated data processing equipment, and other computer... sale of packing containers or pallets, and the salvage of commodities. Program income does not include...
7 CFR 247.25 - Allowable uses of administrative funds and other funds.
Code of Federal Regulations, 2012 CFR
2012-01-01
... equipment include automated information systems, automated data processing equipment, and other computer... sale of packing containers or pallets, and the salvage of commodities. Program income does not include...
Topics in programmable automation. [for materials handling, inspection, and assembly
NASA Technical Reports Server (NTRS)
Rosen, C. A.
1975-01-01
Topics explored in the development of integrated programmable automation systems include: numerically controlled and computer controlled machining; machine intelligence and the emulation of human-like capabilities; large scale semiconductor integration technology applications; and sensor technology for asynchronous local computation without burdening the executive minicomputer which controls the whole system. The role and development of training aids, and the potential application of these aids to augmented teleoperator systems are discussed.
[Automated processing of data from the 1985 population and housing census].
Cholakov, S
1987-01-01
The author describes the method of automated data processing used in the 1985 census of Bulgaria. He notes that the computerization of the census involves decentralization and the use of regional computing centers as well as data processing at the Central Statistical Office's National Information Computer Center. Special attention is given to problems concerning the projection and programming of census data. (SUMMARY IN ENG AND RUS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haugen, G.R.; Bystroff, R.I.; Downey, R.M.
1975-09-01
In the area of automation and instrumentation, progress in the following studies is reported: computer automation of the Cary model 17I spectrophotometer; a new concept for monitoring the concentration of water in gases; on-line gas analysis for a gas circulation experiment; and a count-rate-discriminator technique for measuring grain-boundary composition. In the area of analytical methodology and measurements, progress is reported in the following studies: separation of molecular species by radiation pressure; study of the vaporization of U(thd)4 (thd = 2,2,6,6-tetramethylheptane-3,5-dione); study of the vaporization of U(C8H8)2; determination of ethylenic unsaturation in polyimide resins; and semimicrodetermination of hydroxyl and amino groups with pyromellitic dianhydride (PMDA). (JGB)
Summers, Ronald M; Baecher, Nicolai; Yao, Jianhua; Liu, Jiamin; Pickhardt, Perry J; Choi, J Richard; Hill, Suvimol
2011-01-01
To show the feasibility of calculating the bone mineral density (BMD) from computed tomographic colonography (CTC) scans using fully automated software. Automated BMD measurement software was developed that measures the BMD of the first and second lumbar vertebrae on computed tomography and calculates the mean of the 2 values to provide a per patient BMD estimate. The software was validated in a reference population of 17 consecutive women who underwent quantitative computed tomography and in a population of 475 women from a consecutive series of asymptomatic patients enrolled in a CTC screening trial conducted at 3 medical centers. The mean (SD) BMD was 133.6 (34.6) mg/mL (95% confidence interval, 130.5-136.7; n = 475). In women aged 42 to 60 years (n = 316) and 61 to 79 years (n = 159), the mean (SD) BMDs were 143.1 (33.5) and 114.7 (28.3) mg/mL, respectively (P < 0.0001). Fully automated BMD measurements were reproducible for a given patient with 95% limits of agreement of -9.79 to 8.46 mg/mL for the mean difference between paired assessments on supine and prone CTC. Osteoporosis screening can be performed simultaneously with screening for colorectal polyps.
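The reproducibility figure above is a Bland-Altman style 95% limits-of-agreement calculation (mean paired difference ± 1.96 SD of the differences). A minimal sketch with invented supine-minus-prone BMD differences, not the study's data:

```python
import math

# Hypothetical supine-minus-prone BMD differences (mg/mL) for paired scans.
diffs = [-4.1, 2.3, -0.8, 5.0, -2.7, 1.1, -3.5, 0.4]

n = len(diffs)
mean_d = sum(diffs) / n
# Sample standard deviation of the paired differences.
sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
# 95% limits of agreement: mean difference +/- 1.96 SD.
loa_low, loa_high = mean_d - 1.96 * sd_d, mean_d + 1.96 * sd_d
```

If roughly 95% of paired supine/prone differences fall inside these limits, the automated measurement is considered reproducible for a given patient.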
Automation Problems of 1968; Papers Presented at the Meeting...October 4-5, 1968.
ERIC Educational Resources Information Center
Andrews, Theodora, Ed.
Librarians and their concerned colleagues met to give, hear and discuss papers on library automation, primarily by computers. Noted at this second meeting on library automation were: (1) considerably more sophistication and casualness about the techniques involved, (2) considerably more assurance of what and where things can be applied and (3)…
Using Automated Scores of Student Essays to Support Teacher Guidance in Classroom Inquiry
ERIC Educational Resources Information Center
Gerard, Libby F.; Linn, Marcia C.
2016-01-01
Computer scoring of student written essays about an inquiry topic can be used to diagnose student progress both to alert teachers to struggling students and to generate automated guidance. We identify promising ways for teachers to add value to automated guidance to improve student learning. Three teachers from two schools and their 386 students…
MARC and the Library Service Center: Automation at Bargain Rates.
ERIC Educational Resources Information Center
Pearson, Karl M.
Despite recent research and development in the field of library automation, libraries have been unable to reap the benefits promised by technology due to the high cost of building and maintaining their own computer-based systems. Time-sharing and disc mass storage devices will bring automation costs, if spread over a number of users, within the…
The Automated Logistics Element Planning System (ALEPS)
NASA Technical Reports Server (NTRS)
Schwaab, Douglas G.
1991-01-01
The design and functions of the Automated Logistics Element Planning System (ALEPS) are described. ALEPS is a computer system that will automate planning and decision support for Space Station Freedom Logistical Elements (LEs) resupply and return operations. ALEPS provides data management, planning, analysis, monitoring, interfacing, and flight certification in support of LE flight load planning activities. The prototype ALEPS algorithm development is described.
Fink, Christine; Uhlmann, Lorenz; Klose, Christina; Haenssle, Holger A
2018-05-17
Reliable and accurate assessment of severity in psoriasis is very important in order to meet indication criteria for initiation of systemic treatment or to evaluate treatment efficacy. The most acknowledged tool for measuring the extent of psoriatic skin changes is the Psoriasis Area and Severity Index (PASI). However, the calculation of PASI can be tedious and subjective and high intraobserver and interobserver variability is an important concern. Therefore, there is a great need for a standardised and objective method that guarantees a reproducible PASI calculation. Within this study we will investigate the precision and reproducibility of automated, computer-guided PASI measurements in comparison to trained physicians to address these limitations. Non-interventional analyses of PASI calculations by either physicians in a prospective versus retrospective setting or an automated computer-guided algorithm in 120 patients with plaque psoriasis. All retrospective PASI calculations by physicians or by the computer algorithm are based on total body digital images. The primary objective of this study is comparison of automated computer-guided PASI measurements by means of digital image analysis versus conventional, prospective or retrospective physicians' PASI assessments. Secondary endpoints include (1) the assessment of physicians' interobserver variance in PASI calculations, (2) the assessment of physicians' intraobserver variance in PASI assessments of the same patients' images after a time interval of at least 4 weeks, (3) the assessment of the deviation between physicians' prospective versus retrospective PASI calculations, and (4) the reproducibility of automated computer-guided PASI measurements by assessment of two sets of total body digital images of the same patients taken at one time point. Ethical approval was provided by the Ethics Committee of the Medical Faculty of the University of Heidelberg (ethics approval number S-379/2016). DRKS00011818; Results. 
Long-term pulmonary complications of chemical weapons exposure in former poison gas factory workers.
Nishimura, Yoshifumi; Iwamoto, Hiroshi; Ishikawa, Nobuhisa; Hattori, Noboru; Horimasu, Yasushi; Ohshimo, Shinichiro; Fujitaka, Kazunori; Kondo, Keiichi; Hamada, Hironobu; Awai, Kazuo; Kohno, Nobuoki
2016-07-01
Sulfur mustard (SM) and lewisite are vesicant chemical warfare agents that can cause skin blistering and chronic lung complications. During 1929-1945, a Japanese factory produced poisonous gases, which included SM, lewisite and other chemical weapons. The aim of this study was to investigate the chest computed tomography (CT) findings among long-term survivors who worked at this factory. During 2009-2012, we evaluated chest CT findings from 346 long-term survivors who worked at the poison gas factory. Skin lesions were used as an indicator of significant exposure to vesicant agents. Among the 346 individuals, 53 (15%) individuals experienced skin lesions while working at the factory, and chest CT revealed abnormal findings in 179 individuals (52%). Emphysema was the most common CT finding and was observed in 75 individuals (22%), while honeycombing was observed in 8 individuals (2%). Emphysema and honeycombing were more prevalent among individuals with skin lesions, compared to individuals without skin lesions. Multivariate analyses revealed significant associations between the presence of emphysema and skin lesions (p = 0.008). Among individuals who never smoked, individuals with skin lesions (n = 26) exhibited a significantly higher rate of emphysema, compared to individuals without skin lesions (n = 200) (35% versus 7%, respectively; p < 0.001). Among the long-term survivors who worked at the poison gas factory, a history of skin lesions was associated with the presence of emphysema, even among never smokers, which suggests that emphysema might be a long-term complication of exposure to chemical warfare agents.
Planning for the semiconductor manufacturer of the future
NASA Technical Reports Server (NTRS)
Fargher, Hugh E.; Smith, Richard A.
1992-01-01
Texas Instruments (TI) is currently contracted by the Air Force Wright Laboratory and the Defense Advanced Research Projects Agency (DARPA) to develop the next generation flexible semiconductor wafer fabrication system called Microelectronics Manufacturing Science & Technology (MMST). Several revolutionary concepts are being pioneered on MMST, including the following: new single-wafer rapid thermal processes, in-situ sensors, cluster equipment, and advanced Computer Integrated Manufacturing (CIM) software. The objective of the project is to develop a manufacturing system capable of achieving an order of magnitude improvement in almost all aspects of wafer fabrication. TI was awarded the contract in Oct., 1988, and will complete development with a fabrication facility demonstration in April, 1993. An important part of MMST is development of the CIM environment responsible for coordinating all parts of the system. The CIM architecture being developed is based on a distributed object oriented framework made of several cooperating subsystems. The software subsystems include the following: process control for dynamic control of factory processes; modular processing system for controlling the processing equipment; generic equipment model which provides an interface between processing equipment and the rest of the factory; specification system which maintains factory documents and product specifications; simulator for modelling the factory for analysis purposes; scheduler for scheduling work on the factory floor; and the planner for planning and monitoring of orders within the factory. This paper first outlines the division of responsibility between the planner, scheduler, and simulator subsystems. It then describes the approach to incremental planning and the way in which uncertainty is modelled within the plan representation. Finally, current status and initial results are described.
A Low Cost Micro-Computer Based Local Area Network for Medical Office and Medical Center Automation
Epstein, Mel H.; Epstein, Lynn H.; Emerson, Ron G.
1984-01-01
A low-cost microcomputer-based local area network for medical office automation is described; it makes use of an array of multiple, different personal computers interconnected by a local area network. Each computer on the network functions as a fully potent workstation for data entry and report generation. The network allows each workstation complete access to the entire database. Additionally, designated computers may serve as access ports for remote terminals. Through “Gateways” the network may serve as a front end for a large mainframe, or may interface with another network. The system provides for the medical office environment the expandability and flexibility of a multi-terminal mainframe system at a far lower cost without sacrifice of performance.
NASA Technical Reports Server (NTRS)
Freitas, R. A., Jr. (Editor); Carlson, P. A. (Editor)
1983-01-01
Adoption of an aggressive computer science research and technology program within NASA will: (1) enable new mission capabilities such as autonomous spacecraft, reliability and self-repair, and low-bandwidth intelligent Earth sensing; (2) lower manpower requirements, especially in the areas of Space Shuttle operations, by making fuller use of control center automation, technical support, and internal utilization of state-of-the-art computer techniques; (3) reduce project costs via improved software verification, software engineering, enhanced scientist/engineer productivity, and increased managerial effectiveness; and (4) significantly improve internal operations within NASA with electronic mail, managerial computer aids, an automated bureaucracy and uniform program operating plans.
Computational methods for structural load and resistance modeling
NASA Technical Reports Server (NTRS)
Thacker, B. H.; Millwater, H. R.; Harren, S. V.
1991-01-01
An automated capability for computing structural reliability considering uncertainties in both load and resistance variables is presented. The computations are carried out using an automated Advanced Mean Value iteration algorithm (AMV +) with performance functions involving load and resistance variables obtained by both explicit and implicit methods. A complete description of the procedures used is given as well as several illustrative examples, verified by Monte Carlo Analysis. In particular, the computational methods described in the paper are shown to be quite accurate and efficient for a material nonlinear structure considering material damage as a function of several primitive random variables. The results show clearly the effectiveness of the algorithms for computing the reliability of large-scale structural systems with a maximum number of resolutions.
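The Monte Carlo verification step mentioned above can be illustrated with a minimal sketch for the simplest load/resistance performance function, g = R − L; the distribution parameters here are illustrative assumptions, not values from the paper.

```python
import random

def failure_probability(n_samples=100_000, seed=42):
    """Monte Carlo estimate of P(g < 0) for g = R - L, where resistance R
    and load L are independent normal random variables. The means and
    standard deviations below are illustrative only."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        resistance = rng.gauss(10.0, 1.0)   # R ~ N(10, 1)
        load = rng.gauss(6.0, 1.5)          # L ~ N(6, 1.5)
        if resistance - load < 0:           # limit state violated
            failures += 1
    return failures / n_samples

# R - L ~ N(4, sqrt(1^2 + 1.5^2)), so the exact probability is
# Phi(-4/1.803) ~ 0.013; the estimate should land near that value.
```

Methods such as AMV+ aim to approximate this same probability with far fewer performance-function evaluations than brute-force sampling requires.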
Ogata, Y; Nishizawa, K
1995-10-01
An automated smear counting and data processing system for a life science laboratory was developed to facilitate routine surveys and eliminate human errors by using a notebook computer. This system was composed of a personal computer, a liquid scintillation counter and a well-type NaI(Tl) scintillation counter. The radioactivity of smear samples was automatically measured by these counters. The personal computer received raw signals from the counters through an interface of RS-232C. The software for the computer evaluated the surface density of each radioisotope and printed out that value along with other items as a report. The software was programmed in Pascal language. This system was successfully applied to routine surveys for contamination in our facility.
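The surface-density evaluation such software performs can be sketched with the conventional health-physics formula; the counting efficiency, smear removal factor, and smear area below are assumed illustrative values, not parameters of the described system.

```python
def surface_density(gross_cpm, background_cpm, efficiency,
                    removal_factor=0.1, area_cm2=100.0):
    """Surface contamination density in Bq/cm^2 from a smear count.
    removal_factor is the fraction of removable activity picked up by
    the smear; defaults are conventional assumptions, not the paper's."""
    net_cps = (gross_cpm - background_cpm) / 60.0   # counts per second
    activity_bq = net_cps / efficiency              # activity on the smear
    return activity_bq / (removal_factor * area_cm2)

# 660 cpm gross, 60 cpm background, 25% efficiency:
# net = 10 cps -> 40 Bq on the smear -> 4.0 Bq/cm^2
print(surface_density(660, 60, 0.25))  # -> 4.0
```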
NASA Technical Reports Server (NTRS)
Hockaday, Stephen; Kuhlenschmidt, Sharon (Editor)
1991-01-01
The objective of the workshop was to explore the role of human factors in facilitating the introduction of artificial intelligence (AI) to advanced air traffic control (ATC) automation concepts. AI is an umbrella term that continually expands to cover a variety of techniques in which machines perform actions based upon dynamic, external stimuli. AI methods can be implemented using more traditional programming languages such as LISP or PROLOG, or they can be implemented using state-of-the-art techniques such as object-oriented programming, neural nets (hardware or software), and knowledge-based expert systems. As this technology advances and as increasingly powerful computing platforms become available, the use of AI to enhance ATC systems can be realized. Substantial efforts along these lines are already being undertaken at the FAA Technical Center, NASA Ames Research Center, academic institutions, industry, and elsewhere. Although it is clear that the technology is ripe for bringing computer automation to ATC systems, the proper scope and role of automation are not at all apparent. The major concern is how to combine human controllers with computer technology. A wide spectrum of options exists, ranging from using automation only to provide extra tools to augment decision making by human controllers, to turning over moment-by-moment control to automated systems and using humans as supervisors and system managers. Across this spectrum, it is now obvious that the difficulties that occur when tying human and automated systems together must be resolved so that automation can be introduced safely and effectively. The focus of the workshop was to further explore the role of injecting AI into ATC systems and to identify the human factors that need to be considered for successful application of the technology to present and future ATC systems.
Budgeting for Computer Technology in the Small College Library
ERIC Educational Resources Information Center
Axford, H. William
1978-01-01
Addresses the need for liberal arts colleges to use available technology/automation to help secure their survival. Some factors to be considered in planning and budgeting for automation are discussed. (Author/MBR)
Long-Term Pavement Performance Automated Faulting Measurement
DOT National Transportation Integrated Search
2015-02-01
This study focused on identifying transverse joint locations on jointed plain concrete pavements using an automated joint detection algorithm and computing faulting at these locations using Long-Term Pavement Performance (LTPP) Program profile data c...
ERIC Educational Resources Information Center
Library Computing, 1985
1985-01-01
Special supplement to "Library Journal" and "School Library Journal" covers topics of interest to school, public, academic, and special libraries planning for automation: microcomputer use, readings in automation, online searching, databases of microcomputer software, public access to microcomputers, circulation, creating a…
Changing technology in transportation : automated vehicles in freight.
DOT National Transportation Integrated Search
2017-06-27
The world of transportation is on the verge of undergoing an impactful transformation. Over the past decade, automotive computing technology has progressed far more rapidly than anticipated. Most major auto manufacturers integrated automated features...
Computing technology in the 1980's. [computers]
NASA Technical Reports Server (NTRS)
Stone, H. S.
1978-01-01
Advances in computing technology have been led by consistently improving semiconductor technology. The semiconductor industry has turned out ever faster, smaller, and less expensive devices since transistorized computers were first introduced 20 years ago. For the next decade, there appear to be new advances possible, with the rate of introduction of improved devices at least equal to the historic trends. The implication of these projections is that computers will enter new markets and will truly be pervasive in business, home, and factory as their cost diminishes and their computational power expands to new levels. The computer industry as we know it today will be greatly altered in the next decade, primarily because the raw computer system will give way to computer-based turn-key information and control systems.
ERIC Educational Resources Information Center
Boggis, Jean J.
2001-01-01
Interviews and observations in a British clothing factory that introduced a new computer numerical control system and teamwork/empowerment showed that "teamwork" actually meant little worker control over daily work; deployment of workers often disrupted group cohesiveness. Worker responses included increased absence and turnover.…
Introduction of home electronics for the future
NASA Astrophysics Data System (ADS)
Yoshimoto, Hideyuki; Shirai, Iwao
Development of electronics has accelerated automation and labor saving at factories and offices. Home electronics is also expected to be needed more and more in Japan toward the 21st century, as the advanced information society and the aging society develop and women's participation in social affairs increases. The Resources Council, which is the advisory organ of the Minister of State for Science and Technology, forecast to what extent home electronics will be popularized by the year 2010. The Council expects home electronics to be promoted, because resource and energy saving should be accelerated and people should be able to enjoy their individual lives at home much more.
The acceptability of computer applications to group practices.
Zimmerman, J; Gordon, R S; Tao, D K; Boxerman, S B
1978-01-01
Of the 72 identified group practices in a midwest urban environment, 39 were found to use computers. The practices had been influenced strongly by vendors in their selection of an automated system or service, and had usually spent less than a work-month analyzing their needs and reviewing alternate ways in which those needs could be met. Ninety-seven percent of the practices had some financial applications and 64% had administrative applications, but only 2.5% had medical applications. For half the practices at least 2 months elapsed from the time the automated applications were put into operation until they were considered to be integrated into the office routine. Advantages experienced by at least a third of the practices using computers were that the work was done faster, information was more readily available, and costs were reduced. The most common disadvantage was inflexibility. Most (89%) of the practices believed that automation was preferable to their previous manual system.
Recent advances in automated protein design and its future challenges.
Setiawan, Dani; Brender, Jeffrey; Zhang, Yang
2018-04-25
Protein function is determined by protein structure which is in turn determined by the corresponding protein sequence. If the rules that cause a protein to adopt a particular structure are understood, it should be possible to refine or even redefine the function of a protein by working backwards from the desired structure to the sequence. Automated protein design attempts to calculate the effects of mutations computationally with the goal of more radical or complex transformations than are accessible by experimental techniques. Areas covered: The authors give a brief overview of the recent methodological advances in computer-aided protein design, showing how methodological choices affect final design and how automated protein design can be used to address problems considered beyond traditional protein engineering, including the creation of novel protein scaffolds for drug development. Also, the authors address specifically the future challenges in the development of automated protein design. Expert opinion: Automated protein design holds potential as a protein engineering technique, particularly in cases where screening by combinatorial mutagenesis is problematic. Considering solubility and immunogenicity issues, automated protein design is initially more likely to make an impact as a research tool for exploring basic biology in drug discovery than in the design of protein biologics.
Off the Shelf Cloud Robotics for the Smart Home: Empowering a Wireless Robot through Cloud Computing
Ramírez De La Pinta, Javier; Maestre Torreblanca, José María; Jurado, Isabel; Reyes De Cozar, Sergio
2017-01-01
In this paper, we explore the possibilities offered by the integration of home automation systems and service robots. In particular, we examine how advanced computationally expensive services can be provided by using a cloud computing approach to overcome the limitations of the hardware available at the user’s home. To this end, we integrate two wireless low-cost, off-the-shelf systems in this work, namely, the service robot Rovio and the home automation system Z-wave. Cloud computing is used to enhance the capabilities of these systems so that advanced sensing and interaction services based on image processing and voice recognition can be offered. PMID:28272305
Advanced computer architecture specification for automated weld systems
NASA Technical Reports Server (NTRS)
Katsinis, Constantine
1994-01-01
This report describes the requirements for an advanced automated weld system and the associated computer architecture, and defines the overall system specification from a broad perspective. According to the requirements of welding procedures as they relate to an integrated multiaxis motion control and sensor architecture, the computer system requirements are developed based on a proven multiple-processor architecture with an expandable, distributed-memory, single global bus architecture, containing individual processors which are assigned to specific tasks that support sensor or control processes. The specified architecture is sufficiently flexible to integrate previously developed equipment, be upgradable and allow on-site modifications.
Computer-Generated Feedback on Student Writing
ERIC Educational Resources Information Center
Ware, Paige
2011-01-01
A distinction must be made between "computer-generated scoring" and "computer-generated feedback". Computer-generated scoring refers to the provision of automated scores derived from mathematical models built on organizational, syntactic, and mechanical aspects of writing. In contrast, computer-generated feedback, the focus of this article, refers…
ERIC Educational Resources Information Center
Wang, Pei-Yu; Huang, Chung-Kai
2015-01-01
This study aims to explore the impact of learner grade, visual cueing, and control design on children's reading achievement of audio e-books with tablet computers. This research was a three-way factorial design where the first factor was learner grade (grade four and six), the second factor was e-book visual cueing (word-based, line-based, and…
ERIC Educational Resources Information Center
Husby, Ole
1990-01-01
The challenges and potential benefits of automating university libraries are reviewed, with special attention given to cooperative systems. Aspects discussed include database size, the role of the university computer center, storage modes, multi-institutional systems, resource sharing, cooperative system management, networking, and intelligent…
Jasko, D J; Lein, D H; Foote, R H
1990-01-01
Two commercially available computer-automated semen analysis instruments (CellSoft Automated Semen Analyzer and HTM-2000 Motility Analyzer) were compared for their ability to report similar results based on the analysis of pre-recorded video tapes of extended, motile stallion semen. The determinations of the percentage of motile cells by these instruments were more similar to each other than were the comparisons between subjective estimates and either instrument. However, mean values obtained from the same sample may still differ by as much as 30 percentage units between instruments. The instruments varied with regard to the determinations of mean sperm curvilinear velocity and sperm concentration, but mean sperm linearity determinations were similar between the instruments. We concluded that the determinations of sperm motion characteristics by subjective estimation, the CellSoft Automated Semen Analyzer, and the HTM-2000 Motility Analyzer are often dissimilar, making direct comparisons of results difficult.
Automated procedures for sizing aerospace vehicle structures /SAVES/
NASA Technical Reports Server (NTRS)
Giles, G. L.; Blackburn, C. L.; Dixon, S. C.
1972-01-01
Results from a continuing effort to develop automated methods for structural design are described. A system of computer programs presently under development, called SAVES, is intended to automate the preliminary structural design of a complete aerospace vehicle. Each step in the automated design process of the SAVES system of programs is discussed, with emphasis placed on the use of automated routines for generation of finite-element models. The versatility of these routines is demonstrated by structural models generated for a space shuttle orbiter, an advanced technology transport, and a hydrogen-fueled Mach 3 transport. Illustrative numerical results are presented for the Mach 3 transport wing.
Microbial Consortia Engineering for Cellular Factories: in vitro to in silico systems
Bernstein, Hans C; Carlson, Ross P
2012-01-01
This mini-review discusses the current state of experimental and computational microbial consortia engineering with a focus on cellular factories. A discussion of promising ecological theories central to community resource usage is presented to facilitate interpretation of consortial designs. Recent case studies exemplifying different resource usage motifs and consortial assembly templates are presented. The review also highlights in silico approaches to design and to analyze consortia with an emphasis on stoichiometric modeling methods. The discipline of microbial consortia engineering possesses a widely accepted potential to generate highly novel and effective bio-catalysts for applications from biofuels to specialty chemicals to enhanced mineral recovery. PMID:24688677
Altering users' acceptance of automation through prior automation exposure.
Bekier, Marek; Molesworth, Brett R C
2017-06-01
Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that after exposure to a task with automation assistance, user acceptance of high(er) levels of automation (the 'tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. the 'tipping point') is constant or can be manipulated. The results revealed that after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.
Texas A & M University at Galveston: College and University Computing Environment.
ERIC Educational Resources Information Center
CAUSE/EFFECT, 1986
1986-01-01
Texas A & M University at Galveston is the only marine- and maritime-oriented university in the Southwest. Its computing policy/direction, academic computing, administrative computing, and library automation are described, and hurricane emergency plans are also discussed. (MLW)
Prakash, Jaya; Yalavarthy, Phaneendra K
2013-03-01
The objective was to develop a computationally efficient automated method for the optimal choice of regularization parameter in diffuse optical tomography. The least-squares QR (LSQR)-type method that uses Lanczos bidiagonalization is known to be computationally efficient in performing the reconstruction procedure in diffuse optical tomography. The same is effectively deployed via an optimization procedure that uses the simplex method to find the optimal regularization parameter. The proposed LSQR-type method is compared with traditional methods such as the L-curve, generalized cross-validation (GCV), and the recently proposed minimal residual method (MRM)-based choice of regularization parameter, using numerical and experimental phantom data. The results indicate that the proposed LSQR-type and MRM-based methods perform similarly in terms of reconstructed image quality, and both are superior to the L-curve and GCV-based methods. The proposed method's computational complexity is at least five times lower than that of the MRM-based method, making it an optimal technique. The LSQR-type method was able to overcome the inherently computationally expensive nature of the MRM-based automated way of finding the optimal regularization parameter in diffuse optical tomographic imaging, making the method more suitable for deployment in real time.
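The underlying idea, damped least squares plus an automated sweep over the regularization parameter, can be sketched with NumPy. The `tikhonov_solve` and `pick_lambda` helpers below are illustrative stand-ins, not the paper's LSQR/simplex implementation; the "oracle" selection assumes a known phantom, which the paper's data-driven criterion does not require.

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Solve min ||Ax - b||^2 + lam^2 ||x||^2 via the augmented system,
    the same damped least-squares problem LSQR solves with damp=lam."""
    n = A.shape[1]
    A_aug = np.vstack([A, lam * np.eye(n)])
    b_aug = np.concatenate([b, np.zeros(n)])
    return np.linalg.lstsq(A_aug, b_aug, rcond=None)[0]

def pick_lambda(A, b, x_true, lams):
    """Illustrative 'oracle' choice: the lambda in `lams` minimizing the
    error against a known phantom x_true (an assumption for this sketch;
    a real scheme would minimize a data-driven criterion instead)."""
    errs = [np.linalg.norm(tikhonov_solve(A, b, lam) - x_true)
            for lam in lams]
    return lams[int(np.argmin(errs))]
```

For an ill-conditioned system with noisy data, sweeping `lams` over a few decades and keeping the minimizer mimics the automated parameter selection, with each candidate solve remaining cheap.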
Robert, A; Ducos, P; Francin, J M; Marsan, P
2007-04-01
To study the range of urinary levels of 4,4'-methylenedianiline (MDA), a metabolite of methylenediphenyl diisocyanate (MDI), across factories in the polyurethane industries and to evaluate the validity of this biomarker to assess MDI occupational exposure. Workers exposed to MDI, as well as non-occupationally exposed subjects, were studied and pre- and post-shift urine samples were collected from 169 workers of 19 French factories and 120 controls. Details on work activities and practices were collected by a questionnaire and workers were classified into three job categories. The identification and quantification of the total urinary MDA were performed by high-performance liquid chromatography with electrochemical detection (HPLC/EC). For all the factories, MDA was detectable in 73% of the post-shift urine samples. These post-shift values, in the range of <0.10 (detection limit)-23.60 microg/l, were significantly higher than those of the pre-shift samples. Urinary MDA levels in the control group were in the range of < 0.10-0.80 microg/l. The degree of automation of the mixing operation (polyols and MDI) appears as a determinant in the extent of exposure levels. The highest amounts of MDA in urine were found in the spraying or hot processes. The excretion levels of the workers directly exposed to the hardener containing the MDI monomer were significantly higher than those of the other workers. In addition, skin exposure to MDI monomer or to polyurethane resin during the curing step were always associated with significant MDA levels in urine. Total MDA in post-shift urine samples is a reliable biomarker to assess occupational exposure to MDI in various industrial applications and to help factories to improve their manufacturing processes and working practices. A biological guiding value not exceeding 7 microg/l (5 microg/g creatinine) could be proposed in France.
Panuccio, Giuseppe; Torsello, Giovanni Federico; Pfister, Markus; Bisdas, Theodosios; Bosiers, Michel J; Torsello, Giovanni; Austermann, Martin
2016-12-01
To assess the usability of a fully automated fusion imaging engine prototype matching preinterventional computed tomography with intraoperative fluoroscopic angiography during endovascular aortic repair. From June 2014 to February 2015, all patients treated electively for abdominal and thoracoabdominal aneurysms were enrolled prospectively. Before each procedure, preoperative planning was performed with a fully automated fusion engine prototype based on computed tomography angiography, creating a mesh model of the aorta. In a second step, this three-dimensional dataset was registered with the two-dimensional intraoperative fluoroscopy. The main outcome measure was the applicability of the fully automated fusion engine. Secondary outcomes were freedom from failure of automatic segmentation or of automatic registration, as well as accuracy of the mesh model, measured as deviations from intraoperative angiography in millimeters, where applicable. Twenty-five patients were enrolled in this study. The fusion imaging engine could be used successfully in 92% of the cases (n = 23). Freedom from failure of automatic segmentation was 44% (n = 11). Freedom from failure of automatic registration was 76% (n = 19); the median error of the automatic registration process was 0 mm (interquartile range, 0-5 mm). The fully automated fusion imaging engine was found to be applicable in most cases, although in several cases fully automated data processing was not possible and manual intervention was required. The accuracy of the automatic registration yielded excellent results and promises a useful and simple-to-use technology. Copyright © 2016 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
"First generation" automated DNA sequencing technology.
Slatko, Barton E; Kieleczawa, Jan; Ju, Jingyue; Gardner, Andrew F; Hendrickson, Cynthia L; Ausubel, Frederick M
2011-10-01
Beginning in the 1980s, automation of DNA sequencing has greatly increased throughput, reduced costs, and enabled large projects to be completed more easily. The development of automation technology paralleled the development of other aspects of DNA sequencing: better enzymes and chemistry, separation and imaging technology, sequencing protocols, robotics, and computational advancements (including base-calling algorithms with quality scores, database developments, and sequence analysis programs). Despite the emergence of high-throughput sequencing platforms, automated Sanger sequencing technology remains useful for many applications. This unit provides background and a description of the "First-Generation" automated DNA sequencing technology. It also includes protocols for using the current Applied Biosystems (ABI) automated DNA sequencing machines. © 2011 by John Wiley & Sons, Inc.
Econophysics of a ranked demand and supply resource allocation problem
NASA Astrophysics Data System (ADS)
Priel, Avner; Tamir, Boaz
2018-01-01
We present a two-sided resource allocation problem, between demands and supplies, where both parties are ranked. For example, in Big Data problems where a set of different computational tasks is divided between a set of computers, each with its own resources, or between employees and employers where both parties are ranked, the employees by their fitness and the employers by their package benefits. The allocation process can be viewed as a repeated game where in each iteration the strategy is decided by a meta-rule, based on the ranks of both parties and the results of the previous games. We show the existence of a phase transition between an absorbing state, where all demands are satisfied, and an active one where part of the demands are always left unsatisfied. The phase transition is governed by the ratio between supplies and demands. In a job allocation problem we find a positive correlation between the rank of the workers and the rank of the factories; higher-ranked workers are usually allocated to higher-ranked factories. All of these suggest global emergent properties stemming from local variables. To demonstrate the global versus local relations, we introduce a local inertial force that increases the rank of employees in proportion to their persistence time in the same factory. We show that such a local force induces nontrivial global effects, mostly to the benefit of the lower-ranked employees.
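The rank-based allocation dynamic can be caricatured with a greedy one-shot meta-rule, a toy assumption far simpler than the paper's repeated game: the best-ranked unplaced worker always takes the best-ranked factory with spare capacity. This already produces assortative matching and, when demand exceeds capacity, the unsatisfied "active" phase.

```python
def allocate(worker_ranks, factory_ranks, capacity):
    """Greedy rank-based allocation (toy meta-rule, not the paper's).
    Ranks are integers with lower = better. Returns the worker->factory
    matching and the list of workers left unplaced."""
    slots = {f: capacity for f in factory_ranks}
    matching, unplaced = {}, []
    for w in sorted(worker_ranks):                    # best worker first
        free = sorted(f for f in slots if slots[f] > 0)
        if free:
            best = free[0]                            # best free factory
            matching[w] = best
            slots[best] -= 1
        else:
            unplaced.append(w)                        # active phase
    return matching, unplaced

# 5 workers, 2 factories with 2 slots each: 4 placed assortatively,
# the lowest-ranked worker is left unsatisfied.
print(allocate(list(range(5)), [0, 1], 2))
```

With total capacity above demand the `unplaced` list is empty (the absorbing state), matching the phase-transition picture governed by the supply/demand ratio.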
Computer Applications in the Design Process.
ERIC Educational Resources Information Center
Winchip, Susan
Computer Assisted Design (CAD) and Computer Assisted Manufacturing (CAM) are emerging technologies now being used in home economics and interior design applications. A microcomputer in a computer network system is capable of executing computer graphic functions such as three-dimensional modeling, as well as utilizing office automation packages to…
Vision 20/20: Automation and advanced computing in clinical radiation oncology.
Moore, Kevin L; Kagadis, George C; McNutt, Todd R; Moiseenko, Vitali; Mutic, Sasa
2014-01-01
This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.
Data Handling and Communication
NASA Astrophysics Data System (ADS)
Hemmer, Frédéric; Innocenti, Pier Giorgio
The following sections are included: * Introduction * Computing Clusters and Data Storage: The New Factory and Warehouse * Local Area Networks: Organizing Interconnection * High-Speed Worldwide Networking: Accelerating Protocols * Detector Simulation: Events Before the Event * Data Analysis and Programming Environment: Distilling Information * World Wide Web: Global Networking * References
ANL statement of site strategy for computing workstations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fenske, K.R.; Boxberger, L.M.; Amiot, L.W.
1991-11-01
This Statement of Site Strategy describes the procedure at Argonne National Laboratory for defining, acquiring, using, and evaluating scientific and office workstations and related equipment and software in accord with DOE Order 1360.1A (5-30-85) and Laboratory policy. It is Laboratory policy to promote the installation and use of computing workstations to improve productivity and communications for both programmatic and support personnel, to ensure that computing workstation acquisitions meet the expressed need in a cost-effective manner, and to ensure that acquisitions of computing workstations are in accord with Laboratory and DOE policies. The overall computing site strategy at ANL is to develop a hierarchy of integrated computing system resources to address the current and future computing needs of the Laboratory. The major system components of this hierarchical strategy are: supercomputers, parallel computers, centralized general purpose computers, distributed multipurpose minicomputers, and computing workstations and office automation support systems. Computing workstations include personal computers, scientific and engineering workstations, computer terminals, microcomputers, word processing and office automation electronic workstations, and associated software and peripheral devices costing less than $25,000 per item.
Methodology for Prototyping Increased Levels of Automation for Spacecraft Rendezvous Functions
NASA Technical Reports Server (NTRS)
Hart, Jeremy J.; Valasek, John
2007-01-01
The Crew Exploration Vehicle necessitates higher levels of automation than previous NASA vehicles, due to program requirements for automation, including Automated Rendezvous and Docking. Studies of spacecraft development often point to the locus of decision-making authority between humans and computers (i.e. automation) as a prime driver for cost, safety, and mission success. Therefore, a critical component in the Crew Exploration Vehicle development is the determination of the correct level of automation. To identify the appropriate levels of automation and autonomy to design into a human space flight vehicle, NASA has created the Function-specific Level of Autonomy and Automation Tool. This paper develops a methodology for prototyping increased levels of automation for spacecraft rendezvous functions. This methodology is used to evaluate the accuracy of the Function-specific Level of Autonomy and Automation Tool specified levels of automation, via prototyping. Spacecraft rendezvous planning tasks are selected and then prototyped in Matlab using Fuzzy Logic techniques and existing Space Shuttle rendezvous trajectory algorithms.
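The fuzzy-logic flavor of such prototyping can be sketched minimally; the membership shapes, the single rule pair, and the automation-level anchors below are invented for illustration and are not from the NASA tool or the Shuttle rendezvous algorithms.

```python
def tri(x, a, b, c):
    """Triangular membership function peaked at b on support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def automation_level(crew_workload, task_criticality):
    """Toy Mamdani-style inference (illustrative assumptions throughout):
    high workload on a low-criticality task pushes toward automation;
    low workload on a critical task pushes toward manual control.
    Inputs are in [0, 1]; output is a level from 1 (manual) to 5 (full)."""
    high_load = tri(crew_workload, 0.3, 1.0, 1.7)
    critical = tri(task_criticality, 0.3, 1.0, 1.7)
    automate = min(high_load, 1.0 - critical)   # rule 1 firing strength
    manual = min(1.0 - high_load, critical)     # rule 2 firing strength
    total = automate + manual
    # Defuzzify as a weighted average of the level anchors 5 and 1.
    return (5 * automate + 1 * manual) / total if total else 3.0
```

Evaluating such rule sets over candidate task allocations is one lightweight way to prototype "level of automation" recommendations before committing to a design.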
Automation of Periodic Reports
DOT National Transportation Integrated Search
1975-06-01
The manual is a user's guide to the automation of the 'Summary of National Transportation Statistics.' The System is stored on the in-house PDP-10 computer to provide ready access and retrieval of the data. The information stored in the system includ...
Human Factors Considerations in System Design
NASA Technical Reports Server (NTRS)
Mitchell, C. M. (Editor); Vanbalen, P. M. (Editor); Moe, K. L. (Editor)
1983-01-01
Human factors considerations in system design were examined. Human factors in automated command and control, in the efficiency of the human-computer interface, and in system effectiveness are outlined. The following topics are discussed: human factors aspects of control room design; design of interactive systems; human-computer dialogue, interaction tasks and techniques; guidelines on ergonomic aspects of control rooms and highly automated environments; system engineering for control by humans; conceptual models of information processing; and information display and interaction in real-time environments.
A Computational Approach for Automated Posturing of a Human Finite Element Model
2016-07-01
Memorandum report (July 2016) by Justin McKee and Adam Sokolow. Posture affects ... protection by influencing the path that loading will be transferred into the body and is a major source of variability. The development of a finite element ... Keywords: posture, human body, finite element, leg, spine. Approved for public release.
Automated High-Temperature Hall-Effect Apparatus
NASA Technical Reports Server (NTRS)
Parker, James B.; Zoltan, Leslie D.
1992-01-01
The automated apparatus takes Hall-effect measurements of specimens of thermoelectric materials at temperatures from ambient to 1,200 K, using computer control to obtain better resolution of data and more data points about three times as fast as before. Four-probe electrical-resistance measurements are taken in 12 electrical and 2 magnetic orientations to characterize specimens at each temperature. The computer acquires data and controls the apparatus via three feedback loops: one for temperature, one for magnetic field, and one for electrical-potential data.
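The feedback loops are described only at block-diagram level. As an illustrative sketch of how a discrete proportional loop settles a controlled quantity onto its setpoint (the first-order plant model, gain, and step count below are assumptions, not the apparatus's actual dynamics):

```python
# Hypothetical proportional temperature loop: each control step corrects
# by gain * error, so the error decays geometrically by (1 - gain) per step.

def settle_temperature(setpoint_k, initial_k, gain=0.3, steps=60):
    t = initial_k
    for _ in range(steps):
        t += gain * (setpoint_k - t)   # heater command proportional to error
    return t
```

With gain 0.3, the residual error after 60 steps is the initial error times 0.7^60, i.e. negligible for any practical tolerance.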
NASA Astrophysics Data System (ADS)
Gatti, Vijay; Hill, Jason; Mitra, Sunanda; Nutter, Brian
2014-03-01
Despite the current availability in resource-rich regions of advanced scanning and 3-D imaging technologies in ophthalmology practice, worldwide screening tests for early detection and progression of glaucoma still consist of a variety of simple tools, including fundus image-based parameters such as CDR (cup to disc diameter ratio) and CAR (cup to disc area ratio), especially in resource-poor regions. Reliable automated computation of the relevant parameters from fundus image sequences requires robust non-rigid registration and segmentation techniques. Recent research demonstrated that proper non-rigid registration of multi-view monocular fundus image sequences can result in acceptable segmentation of cup boundaries for automated computation of CAR and CDR. This work introduces a composite diffeomorphic demons registration algorithm for segmentation of cup boundaries from a sequence of monocular images and compares the resulting CAR and CDR values with those computed manually by experts and from 3-D visualization of stereo pairs. Our preliminary results show that automated computation of CDR and CAR from composite diffeomorphic segmentation of monocular image sequences yields values comparable with those from the other two techniques and thus may provide global healthcare with a cost-effective yet accurate tool for management of glaucoma in its early stage.
The interaction of representation and reasoning.
Bundy, Alan
2013-09-08
Automated reasoning is an enabling technology for many applications of informatics. These applications include verifying that a computer program meets its specification, enabling a robot to form a plan to achieve a task, and answering questions by combining information from diverse sources, e.g. on the Internet. How is automated reasoning possible? Firstly, knowledge of a domain must be stored in a computer, usually in the form of logical formulae. This knowledge might, for instance, have been entered manually, retrieved from the Internet, or perceived in the environment via sensors such as cameras. Secondly, rules of inference are applied to old knowledge to derive new knowledge. Automated reasoning techniques have been adapted from logic, a branch of mathematics that was originally designed to formalize the reasoning of humans, especially mathematicians. My special interest is in the way that representation and reasoning interact. Successful reasoning depends both on appropriate representation of knowledge and on successful methods of reasoning. Failures of reasoning can suggest changes of representation. This process of representational change can also be automated. We illustrate the automation of representational change by drawing on recent work in my research group.
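The two-step recipe above (store knowledge, then apply rules of inference to derive new knowledge) can be illustrated with a toy forward-chaining loop. This is a deliberately simplified sketch: real automated reasoners operate over logical formulae with unification rather than flat strings, and the robot facts and rules below are invented examples.

```python
# Toy forward chaining: facts are strings, rules are (premises, conclusion)
# pairs; rules fire repeatedly until no new knowledge can be derived.

def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)   # derive new knowledge
                changed = True
    return facts

# Invented illustrative knowledge base:
rules = [
    (["robot_at_door", "door_open"], "robot_can_enter"),
    (["robot_can_enter", "task_inside"], "plan_enter_room"),
]
derived = forward_chain(["robot_at_door", "door_open", "task_inside"], rules)
```

The loop reaches a fixed point because each iteration either adds a fact or terminates, and the set of derivable facts is finite.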
Integrating the Allen Brain Institute Cell Types Database into Automated Neuroscience Workflow.
Stockton, David B; Santamaria, Fidel
2017-10-01
We developed software tools to download, extract features, and organize the Cell Types Database from the Allen Brain Institute (ABI) in order to integrate its whole cell patch clamp characterization data into the automated modeling/data analysis cycle. To expand the potential user base we employed both Python and MATLAB. The basic set of tools downloads selected raw data and extracts cell, sweep, and spike features, using ABI's feature extraction code. To facilitate data manipulation we added a tool to build a local specialized database of raw data plus extracted features. Finally, to maximize automation, we extended our NeuroManager workflow automation suite to include these tools plus a separate investigation database. The extended suite allows the user to integrate ABI experimental and modeling data into an automated workflow deployed on heterogeneous computer infrastructures, from local servers, to high performance computing environments, to the cloud. Since our approach is focused on workflow procedures our tools can be modified to interact with the increasing number of neuroscience databases being developed to cover all scales and properties of the nervous system.
The use of interactive computer vision and robot hand controllers for enhancing manufacturing safety
NASA Technical Reports Server (NTRS)
Marzwell, Neville I.; Jacobus, Charles J.; Peurach, Thomas M.; Mitchell, Brian T.
1994-01-01
Currently available robotic systems provide limited support for CAD-based model-driven visualization, sensing algorithm development and integration, and automated graphical planning systems. This paper describes ongoing work that provides the functionality necessary to apply advanced robotics to automated manufacturing and assembly operations. An interface has been built which incorporates 6-DOF tactile manipulation, displays for three-dimensional graphical models, and automated tracking functions that depend on automated machine vision. A set of tools for single and multiple focal-plane sensor image processing and understanding has been demonstrated which utilizes object recognition models. The resulting tool will enable sensing and planning from computationally simple graphical objects. A synergistic interplay between the human operator's vision and machine vision is created from programmable feedback received from the controller. This approach can be used as the basis for implementing enhanced safety in automated robotics manufacturing, assembly, repair, and inspection tasks in both ground and space applications. Thus, an interactive capability has been developed to match the modeled environment to the real task environment for safe and predictable task execution.
Refurbishment and Automation of Thermal Vacuum Facilities at NASA/GSFC
NASA Technical Reports Server (NTRS)
Dunn, Jamie; Gomez, Carlos; Donohue, John; Johnson, Chris; Palmer, John; Sushon, Janet
1999-01-01
The thermal vacuum facilities located at the Goddard Space Flight Center (GSFC) have supported both manned and unmanned space flight since the 1960s. Of the eleven facilities, currently ten of the systems are scheduled for refurbishment or replacement as part of a five-year implementation. Expected return on investment includes the reduction in test schedules, improvements in safety of facility operations, and reduction in the personnel support required for a test. Additionally, GSFC will become a global resource renowned for expertise in thermal engineering, mechanical engineering, and for the automation of thermal vacuum facilities and tests. Automation of the thermal vacuum facilities includes the utilization of Programmable Logic Controllers (PLCs), the use of Supervisory Control and Data Acquisition (SCADA) systems, and the development of a centralized Test Data Management System. These components allow the computer control and automation of mechanical components such as valves and pumps. The project of refurbishment and automation began in 1996 and has resulted in complete computer control of one facility (Facility 281), and the integration of electronically controlled devices and PLCs in multiple others.
McKenzie, Kirsten; Walker, Sue; Tong, Shilu
It remains unclear whether the change from a manual to an automated coding system (ACS) for deaths has significantly affected the consistency of Australian mortality data. The underlying causes of 34,000 deaths registered in 1997 in Australia were dual coded in ICD-9, both manually and using an automated computer coding program. The diseases most affected by the change from manual coding to the ACS were senile/presenile dementia and pneumonia. The most common code to which a manually assigned underlying cause of senile dementia was assigned by the ACS was unspecified psychoses (37.2%). Only 12.5% of codes assigned by the ACS as senile dementia were coded the same by manual coders. This study indicates some important differences in mortality rates when comparing mortality data coded manually with data coded using an automated computer coding program. These differences may be related both to different interpretation of ICD coding rules between manual and automated coding, and to different co-morbidities or co-existing conditions among demographic groups.
Applications of Automation Methods for Nonlinear Fracture Test Analysis
NASA Technical Reports Server (NTRS)
Allen, Phillip A.; Wells, Douglas N.
2013-01-01
Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; 4. and providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10 - Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; 2. ASTM F2815 - Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; 3. ASTM E2807 - Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in test standards. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.
1974-07-01
automated manufacturing processes and a rough technoeconomic evaluation of those concepts. Our evaluation is largely based on estimates; therefore, the...must be subjected to thorough analysis and experimental verification before they can be considered definitive. They are being published at this time...hardware and sensor technology, manufacturing engineering, automation, and economic analysis. Members of this team inspected over thirty manufacturing
1980-07-25
matrix (DTM) and digital planimetric data, combined and integrated into so-called "data bases." I'll say more about this later. AUTOMATION OF...projection with mechanical inversors to maintain the Scheimpflug condition. Some automation has been achieved, with computer control to determine rectifier...matrix (DTM) form that is not necessarily collected from the same photography as that from which the orthophoto is being produced. Because they are
Nakanishi, Rine; Sankaran, Sethuraman; Grady, Leo; Malpeso, Jenifer; Yousfi, Razik; Osawa, Kazuhiro; Ceponiene, Indre; Nazarat, Negin; Rahmani, Sina; Kissel, Kendall; Jayawardena, Eranthi; Dailing, Christopher; Zarins, Christopher; Koo, Bon-Kwon; Min, James K; Taylor, Charles A; Budoff, Matthew J
2018-03-23
Our goal was to evaluate the efficacy of a fully automated method for assessing the image quality (IQ) of coronary computed tomography angiography (CCTA). The machine learning method was trained using 75 CCTA studies by mapping features (noise, contrast, misregistration scores, and un-interpretability index) to an IQ score based on manual ground truth data. The automated method was validated on a set of 50 CCTA studies and subsequently tested on a new set of 172 CCTA studies against visual IQ scores on a 5-point Likert scale. The area under the curve in the validation set was 0.96. In the 172 CCTA studies, our method yielded a Cohen's kappa statistic for the agreement between automated and visual IQ assessment of 0.67 (p < 0.01). In the group where good to excellent (n = 163), fair (n = 6), and poor visual IQ scores (n = 3) were graded, 155, 5, and 2 of the patients received an automated IQ score > 50 %, respectively. Fully automated assessment of the IQ of CCTA data sets by machine learning was reproducible and provided similar results compared with visual analysis within the limits of inter-operator variability. • The proposed method enables automated and reproducible image quality assessment. • Machine learning and visual assessments yielded comparable estimates of image quality. • Automated assessment potentially allows for more standardised image quality. • Image quality assessment enables standardization of clinical trial results across different datasets.
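The agreement statistic reported above (Cohen's kappa of 0.67) corrects the observed agreement between automated and visual grading for the agreement expected by chance from each rater's marginal label frequencies. A minimal sketch of the computation (the four-item example is invented, not the study's data):

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: (observed - expected) / (1 - expected), where the
    expected agreement comes from each rater's label frequencies."""
    n = len(rater_a)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_exp = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / (n * n)
    return (p_obs - p_exp) / (1 - p_exp)
```

For example, raters labeling ['a', 'a', 'b', 'b'] and ['a', 'a', 'b', 'a'] agree on 3 of 4 items (0.75 observed) with 0.5 expected by chance, giving kappa = 0.5.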
ERIC Educational Resources Information Center
STONE, PHILIP J.
Automated language processing (content analysis) is engaged in new ventures in computer dialog as a result of new techniques in categorizing responses. A computer "need-achievement" scoring system has been developed. A set of computer programs, labeled "The General Inquirer," will score computer inputs with responses fed from…
Manufacturing engineering: Principles for optimization
NASA Astrophysics Data System (ADS)
Koenig, Daniel T.
Various subjects in the area of manufacturing engineering are addressed. The topics considered include: manufacturing engineering organization concepts and management techniques, factory capacity and loading techniques, capital equipment programs, machine tool and equipment selection and implementation, producibility engineering, methods, planning and work management, and process control engineering in job shops. Also discussed are: maintenance engineering, numerical control of machine tools, fundamentals of computer-aided design/computer-aided manufacture, computer-aided process planning and data collection, group technology basis for plant layout, environmental control and safety, and the Integrated Productivity Improvement Program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Andrew; Lawrence, Earl
The Response Surface Modeling (RSM) Tool Suite is a collection of three codes used to generate an empirical interpolation function for a collection of drag coefficient calculations computed with Test Particle Monte Carlo (TPMC) simulations. The first code, "Automated RSM", automates the generation of a drag coefficient RSM for a particular object to a single command. "Automated RSM" first creates a Latin Hypercube Sample (LHS) of 1,000 ensemble members to explore the global parameter space. For each ensemble member, a TPMC simulation is performed and the object drag coefficient is computed. In the next step of the "Automated RSM" code, a Gaussian process is used to fit the TPMC simulations. In the final step, Markov Chain Monte Carlo (MCMC) is used to evaluate the non-analytic probability distribution function from the Gaussian process. The second code, "RSM Area", creates a look-up table for the projected area of the object based on input limits on the minimum and maximum allowed pitch and yaw angles and pitch and yaw angle intervals. The projected area from the look-up table is used to compute the ballistic coefficient of the object based on its pitch and yaw angle. An accurate ballistic coefficient is crucial in accurately computing the drag on an object. The third code, "RSM Cd", uses the RSM generated by the "Automated RSM" code and the projected area look-up table generated by the "RSM Area" code to accurately compute the drag coefficient and ballistic coefficient of the object. The user can modify the object velocity, object surface temperature, the translational temperature of the gas, the species concentrations of the gas, and the pitch and yaw angles of the object. Together, these codes allow for the accurate derivation of an object's drag coefficient and ballistic coefficient under any conditions with only knowledge of the object's geometry and mass.
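The first step, a Latin Hypercube Sample, stratifies each input axis into N equal bins and draws exactly one point per bin, so 1,000 ensemble members cover the parameter space far more evenly than plain random sampling. A minimal sketch (the two parameter ranges are invented placeholders, not the suite's actual inputs):

```python
import random

def latin_hypercube(n, bounds, seed=0):
    """n samples in len(bounds) dimensions: each axis is split into n
    equal strata, one point is drawn uniformly inside each stratum, and
    the strata are shuffled independently per axis."""
    rng = random.Random(seed)
    columns = []
    for lo, hi in bounds:
        strata = list(range(n))
        rng.shuffle(strata)
        width = (hi - lo) / n
        columns.append([lo + (s + rng.random()) * width for s in strata])
    return list(zip(*columns))

# Hypothetical two-parameter space (e.g. velocity in m/s, temperature in K):
ensemble = latin_hypercube(1000, [(6000.0, 9000.0), (100.0, 2000.0)])
```

Each marginal distribution is guaranteed to hit every one of the 1,000 strata exactly once, which is what gives LHS its global coverage.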
Lean coding machine. Facilities target productivity and job satisfaction with coding automation.
Rollins, Genna
2010-07-01
Facilities are turning to coding automation to help manage the volume of electronic documentation, streamlining workflow, boosting productivity, and increasing job satisfaction. As EHR adoption increases, computer-assisted coding may become a necessity, not an option.
Automated drafting system uses computer techniques
NASA Technical Reports Server (NTRS)
Millenson, D. H.
1966-01-01
The automated drafting system produces schematic and block diagrams from the design engineer's freehand sketches. This system codes conventional drafting symbols and their coordinate locations on standard-size drawings for entry on tapes that are used to drive a high-speed photocomposition machine.
Automated tetraploid genotype calling by hierarchical clustering
USDA-ARS?s Scientific Manuscript database
SNP arrays are transforming breeding and genetics research for autotetraploids. To fully utilize these arrays, however, the relationship between signal intensity and allele dosage must be inferred independently for each marker. We developed an improved computational method to automate this process, ...
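In one dimension, agglomerative (single-linkage) hierarchical clustering reduces to cutting the sorted signal intensities at the largest gaps; for an autotetraploid, up to five clusters map to allele dosages 0 through 4. The sketch below illustrates this idea only and is not the authors' method; the ratio values are toy data.

```python
def call_tetraploid_dosages(ratios, n_classes=5):
    """Cut the sorted allele-signal ratios at the n_classes - 1 largest
    gaps; clusters ordered by mean ratio map to dosages 0..4. Equivalent
    to single-linkage hierarchical clustering in 1-D."""
    order = sorted(range(len(ratios)), key=lambda i: ratios[i])
    xs = [ratios[i] for i in order]
    gap_positions = sorted(range(1, len(xs)),
                           key=lambda i: xs[i] - xs[i - 1], reverse=True)
    cuts = sorted(gap_positions[:n_classes - 1])
    dosage = [0] * len(ratios)
    cls, start = 0, 0
    for cut in cuts + [len(xs)]:
        for idx in order[start:cut]:
            dosage[idx] = cls   # assign this cluster's dosage class
        cls += 1
        start = cut
    return dosage
```

Ten samples whose ratios bunch around 0.0, 0.25, 0.5, 0.75, and 1.0 are assigned dosages 0 through 4 accordingly.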
AUTOMATION OF EXPERIMENTS WITH A HAND-HELD PROGRAMMABLE CALCULATOR
Technological developments have dramatically reduced the cost of data collection, experimental control and computation. Products are now available which allow automation of experiments both in the laboratory and in the field at substantially lower cost and with less technical exp...
The 3D Euler solutions using automated Cartesian grid generation
NASA Technical Reports Server (NTRS)
Melton, John E.; Enomoto, Francis Y.; Berger, Marsha J.
1993-01-01
Viewgraphs on 3-dimensional Euler solutions using automated Cartesian grid generation are presented. Topics covered include: computational fluid dynamics (CFD) and the design cycle; Cartesian grid strategy; structured body fit; grid generation; prolate spheroid; and ONERA M6 wing.
Post-Fisherian Experimentation: From Physical to Virtual
Jeff Wu, C. F.
2014-04-24
Fisher's pioneering work in design of experiments has inspired further work with broader applications, especially in industrial experimentation. Three topics in physical experiments are discussed: the principles of effect hierarchy, sparsity, and heredity for factorial designs; a new method called CME for de-aliasing aliased effects; and robust parameter design. The recent emergence of virtual experiments on a computer is reviewed. Here, some major challenges in computer experiments, which must go beyond Fisherian principles, are outlined.
NASA Astrophysics Data System (ADS)
van Leunen, J. A. J.; Dreessen, J.
1984-05-01
The result of a measurement of the modulation transfer function (MTF) is only useful as long as it is accompanied by a complete description of all relevant measuring conditions. For this reason it is necessary to file a full description of the relevant measuring conditions together with the results. In earlier times some of our results were rendered useless because some of the relevant measuring conditions were accidentally not written down and were forgotten. This was mainly due to the lack of consensus about which measuring conditions had to be filed together with the result of a measurement. One way to secure uniform and complete archiving of measuring conditions and results is to automate the data handling. An attendant advantage of automating the data handling is that it does away with the time-consuming correction of rough measuring results. The automation of the data handling was accomplished with rather cheap desktop computers, which were nevertheless powerful enough to allow us to automate the measurement as well. After automation of the data handling we started with automatic collection of rough measurement data. Step by step we extended the automation by letting the desktop computer control more and more of the measuring set-up. At present the desktop computer controls all the electrical and most of the mechanical measuring conditions; further, it controls and reads the MTF measuring instrument. Focusing and orientation optimization can be fully automatic, semi-automatic, or completely manual. MTF measuring results can be collected automatically, but they can also be typed in by hand. Thanks to the automation we are able to implement proper archiving of measuring results together with all necessary measuring conditions. The improved measuring efficiency made it possible to increase the number of routine measurements done in the same time period by an order of magnitude. To our surprise the measuring accuracy also improved by a factor of two. This was due to the much better reproducibility of the automatic optimization, which resulted in better reproducibility of the measurement result. Another advantage of the automation is that the programs that control the data handling and the automatic measurement are user-friendly: they guide the operator through the measuring procedure using information from earlier measurements of equivalent test specimens. This makes it possible to have routine measurements done by much less skilled assistants, and it removes much of the tedious routine labour normally involved in MTF measurements. It can be concluded that automating MTF measurements as described above enhances the usefulness of MTF results as well as reducing the cost of MTF measurements.
Concepts and algorithms for terminal-area traffic management
NASA Technical Reports Server (NTRS)
Erzberger, H.; Chapel, J. D.
1984-01-01
The nation's air-traffic-control system is the subject of an extensive modernization program, including the planned introduction of advanced automation techniques. This paper gives an overview of a concept for automating terminal-area traffic management. Four-dimensional (4D) guidance techniques, which play an essential role in the automated system, are reviewed. One technique, intended for on-board computer implementation, is based on application of optimal control theory. The second technique is a simplified approach to 4D guidance intended for ground computer implementation. It generates advisory messages to help the controller maintain scheduled landing times of aircraft not equipped with on-board 4D guidance systems. An operational system for the second technique, recently evaluated in a simulation, is also described.
Automated Illustration of Patients Instructions
Bui, Duy; Nakamura, Carlos; Bray, Bruce E.; Zeng-Treitler, Qing
2012-01-01
A picture can be a powerful communication tool. However, creating pictures to illustrate patient instructions can be a costly and time-consuming task. Building on our prior research in this area, we developed a computer application that automatically converts text to pictures using natural language processing and computer graphics techniques. After iterative testing, the automated illustration system was evaluated using 49 previously unseen cardiology discharge instructions. The completeness of the system-generated illustrations was assessed by three raters using a three-level scale. The average inter-rater agreement for text correctly represented in the pictograph was about 66 percent. Since illustration in this context is intended to enhance rather than replace text, these results support the feasibility of conducting automated illustration. PMID:23304392
Computer-Aided Software Engineering - An approach to real-time software development
NASA Technical Reports Server (NTRS)
Walker, Carrie K.; Turkovich, John J.
1989-01-01
A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.
An automated approach to the design of decision tree classifiers
NASA Technical Reports Server (NTRS)
Argentiero, P.; Chin, R.; Beaudet, P.
1982-01-01
An automated technique is presented for designing effective decision tree classifiers predicated only on a priori class statistics. The procedure relies on linear feature extractions and Bayes table look-up decision rules. Associated error matrices are computed and utilized to provide an optimal design of the decision tree at each so-called 'node'. A by-product of this procedure is a simple algorithm for computing the global probability of correct classification assuming the statistical independence of the decision rules. Attention is given to a more precise definition of decision tree classification, the mathematical details on the technique for automated decision tree design, and an example of a simple application of the procedure using class statistics acquired from an actual Landsat scene.
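The "simple algorithm for computing the global probability of correct classification" follows directly from the independence assumption: per-node rule accuracies multiply along each class's root-to-leaf path, and class priors weight the sum. A sketch (the classes, accuracies, and priors below are invented placeholders, not statistics from the Landsat example):

```python
def global_p_correct(class_paths, priors):
    """Weighted sum over classes of the product of per-node decision-rule
    accuracies along each class's path; independence of the rules lets
    the probabilities multiply."""
    total = 0.0
    for cls, node_accuracies in class_paths.items():
        p = 1.0
        for acc in node_accuracies:
            p *= acc
        total += priors[cls] * p
    return total

# Hypothetical three-class tree: "water" splits off at the root node,
# "forest" at the second node, "urban" at the third.
paths = {"water": [0.98], "forest": [0.98, 0.95], "urban": [0.98, 0.95, 0.90]}
priors = {"water": 0.2, "forest": 0.5, "urban": 0.3}
```

With these numbers the global probability is 0.2(0.98) + 0.5(0.98)(0.95) + 0.3(0.98)(0.95)(0.90) = 0.91287.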
Computer aided fixture design - A case based approach
NASA Astrophysics Data System (ADS)
Tanji, Shekhar; Raiker, Saiesh; Mathew, Arun Tom
2017-11-01
Automated fixture design plays an important role in process planning and in the integration of CAD and CAM. An automated fixture setup design system is developed in which, once the fixturing surfaces and points are described, modular fixture components are automatically selected to generate fixture units and are placed into position subject to the assembly conditions. In the past, various knowledge-based systems have been developed to implement CAFD in practice. In this paper, to obtain an acceptable automated machining fixture design, a case-based reasoning method with a developed retrieval system is proposed. The Visual Basic (VB) programming language is used, integrated with the SolidWorks API (Application Programming Interface) module, for a better retrieval procedure with reduced computational time. These properties are incorporated in numerical simulation to determine the best fit for practical use.
Continuous stacking computational approach based automated microscope slide scanner
NASA Astrophysics Data System (ADS)
Murali, Swetha; Adhikari, Jayesh Vasudeva; Jagannadh, Veerendra Kalyan; Gorthi, Sai Siva
2018-02-01
Cost-effective and automated acquisition of whole slide images is a bottleneck for wide-scale deployment of digital pathology. In this article, a computation augmented approach for the development of an automated microscope slide scanner is presented. The realization of a prototype device built using inexpensive off-the-shelf optical components and motors is detailed. The applicability of the developed prototype to clinical diagnostic testing is demonstrated by generating good quality digital images of malaria-infected blood smears. Further, the acquired slide images have been processed to identify and count the number of malaria-infected red blood cells and thereby perform quantitative parasitemia level estimation. The presented prototype would enable cost-effective deployment of slide-based cyto-diagnostic testing in endemic areas.
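Counting infected cells in a thresholded smear image reduces to counting connected components, and the parasitemia level is then the infected fraction of all red blood cells. The paper does not describe its pipeline at this level of detail, so the pure-Python sketch below (toy binary grid, invented counts) only illustrates the counting step:

```python
def count_blobs(grid):
    """Count 4-connected components of 1s in a binary image; a stand-in
    for counting segmented cells in a thresholded smear."""
    seen = set()
    rows, cols = len(grid), len(grid[0])
    blobs = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and (r, c) not in seen:
                blobs += 1
                stack = [(r, c)]          # flood-fill this component
                while stack:
                    y, x = stack.pop()
                    if ((y, x) in seen or not (0 <= y < rows)
                            or not (0 <= x < cols) or not grid[y][x]):
                        continue
                    seen.add((y, x))
                    stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return blobs

def parasitemia_percent(infected_cells, total_cells):
    """Parasitemia level as the percentage of red blood cells infected."""
    return 100.0 * infected_cells / total_cells
```

For example, 3 infected cells among 150 detected red blood cells gives a parasitemia level of 2%.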
Computers and Technological Forecasting
ERIC Educational Resources Information Center
Martino, Joseph P.
1971-01-01
Forecasting is becoming increasingly automated, thanks in large measure to the computer. It is now possible for a forecaster to submit his data to a computation center and call for the appropriate program. (No knowledge of statistics is required.) (Author)
NASA Astrophysics Data System (ADS)
Yamada, Yusuke; Hiraki, Masahiko; Sasajima, Kumiko; Matsugaki, Naohiro; Igarashi, Noriyuki; Amano, Yasushi; Warizaya, Masaichi; Sakashita, Hitoshi; Kikuchi, Takashi; Mori, Takeharu; Toyoshima, Akio; Kishimoto, Shunji; Wakatsuki, Soichi
2010-06-01
Recent advances in high-throughput techniques for macromolecular crystallography have highlighted the importance of structure-based drug design (SBDD), and the demand for synchrotron use by pharmaceutical researchers has increased. Thus, in collaboration with Astellas Pharma Inc., we have constructed a new high-throughput macromolecular crystallography beamline, AR-NE3A, which is dedicated to SBDD. At AR-NE3A, a photon flux up to three times higher than those at existing high-throughput beams at the Photon Factory, AR-NW12A and BL-5A, can be realized at the same sample positions. Installed in the experimental hutch are a high-precision diffractometer, fast-readout, high-gain CCD detector, and sample exchange robot capable of handling more than two hundred cryo-cooled samples stored in a Dewar. To facilitate high-throughput data collection required for pharmaceutical research, fully automated data collection and processing systems have been developed. Thus, sample exchange, centering, data collection, and data processing are automatically carried out based on the user's pre-defined schedule. Although Astellas Pharma Inc. has a priority access to AR-NE3A, the remaining beam time is allocated to general academic and other industrial users.
Computer Administering of the Psychological Investigations: Set-Relational Representation
NASA Astrophysics Data System (ADS)
Yordzhev, Krasimir
Computer administering of a psychological investigation is the computer representation of the entire procedure of psychological assessment - test construction, test implementation, results evaluation, storage and maintenance of the developed database, and its statistical processing, analysis and interpretation. A mathematical description of psychological assessment with the aid of personality tests is discussed in this article. Set theory and relational algebra are used in this description. A relational model of the data needed to design a computer system for automation of certain psychological assessments is given. Some finite sets, and relations on them, which are necessary for creating a personality psychological test, are described. The described model could be used to develop real software for computer administering of any psychological test, with full automation of the whole process: test construction, test implementation, result evaluation, storage of the developed database, statistical processing, analysis and interpretation. A software project for computer administering of personality psychological tests is suggested.
Ramadas, Gisela C V; Rocha, Ana Maria A C; Fernandes, Edite M G P
2015-01-01
This paper addresses the challenging task of computing multiple roots of a system of nonlinear equations. A repulsion algorithm that invokes the Nelder-Mead (N-M) local search method and uses a penalty-type merit function based on the error function, known as 'erf', is presented. In the N-M algorithm context, different strategies are proposed to enhance the quality of the solutions and improve the overall efficiency. The main goal of this paper is to use a two-level factorial design of experiments to analyze the statistical significance of the observed differences in selected performance criteria produced when testing different strategies in the N-M based repulsion algorithm.
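One plausible form of such a repulsion scheme is to re-minimize, with Nelder-Mead from random starts, a merit function that adds an erf-shaped bump at each root already found, so the search is pushed away from known solutions. This is a sketch only: the example system, the additive penalty form, and all parameter values are assumptions, not the authors' exact formulation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import erf

def F(x):
    # Illustrative 2x2 nonlinear system with exactly two real roots
    return np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]**2])

def merit(x, roots, beta=1.0, rho=5.0):
    # Squared residual plus an erf-shaped repulsion bump at each found root
    m = np.sum(F(x)**2)
    for r in roots:
        m += beta * (1.0 - erf(rho * np.linalg.norm(x - r)))
    return m

roots = []
rng = np.random.default_rng(0)
for _ in range(40):  # repeated Nelder-Mead local searches from random starts
    res = minimize(merit, rng.uniform(-2.0, 2.0, size=2), args=(roots,),
                   method="Nelder-Mead",
                   options={"xatol": 1e-10, "fatol": 1e-14, "maxiter": 2000})
    x = res.x
    # Accept only genuine, previously unseen roots
    if np.sum(F(x)**2) < 1e-9 and all(np.linalg.norm(x - r) > 1e-3 for r in roots):
        roots.append(x)
```

Because `1 - erf(rho*d)` decays rapidly with distance `d`, the penalty perturbs the landscape only near roots already found, leaving undiscovered basins essentially untouched.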
Industrial applications of automated X-ray inspection
NASA Astrophysics Data System (ADS)
Shashishekhar, N.
2015-03-01
Many industries require that 100% of manufactured parts be X-ray inspected. Factors such as high production rates, focus on inspection quality, operator fatigue and inspection cost reduction translate to an increasing need for automating the inspection process. Automated X-ray inspection involves the use of image processing algorithms and computer software for analysis and interpretation of X-ray images. This paper presents industrial applications and illustrative case studies of automated X-ray inspection in areas such as automotive castings, fuel plates, air-bag inflators and tires. It is usually necessary to employ application-specific automated inspection strategies and techniques, since each application has unique characteristics and interpretation requirements.
NASA Technical Reports Server (NTRS)
Clancey, William J.
2003-01-01
A human-centered approach to computer systems design involves reframing analysis in terms of people interacting with each other, not only human-machine interaction. The primary concern is not how people can interact with computers, but how shall we design computers to help people work together? An analysis of astronaut interactions with CapCom on Earth during one traverse of Apollo 17 shows what kind of information was conveyed and what might be automated today. A variety of agent and robotic technologies are proposed that deal with recurrent problems in communication and coordination during the analyzed traverse.
ERIC Educational Resources Information Center
Popovich, Donna
This descriptive study surveys the staff of all 18 founding member libraries of OhioLINK to see whether or not they prefer the new system or the old one and why. The purpose of the study is to determine if resistance to change, computer anxiety and technostress can be found in libraries converting their automated systems over to the OhioLINK…
Software For Computer-Security Audits
NASA Technical Reports Server (NTRS)
Arndt, Kate; Lonsford, Emily
1994-01-01
Information relevant to potential breaches of security gathered efficiently. Automated Auditing Tools for VAX/VMS program includes following automated software tools performing noted tasks: Privileged ID Identification, program identifies users and their privileges to circumvent existing computer security measures; Critical File Protection, critical files not properly protected identified; Inactive ID Identification, identifications of users no longer in use found; Password Lifetime Review, maximum lifetimes of passwords of all identifications determined; and Password Length Review, minimum allowed length of passwords of all identifications determined. Written in DEC VAX DCL language.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The ARES (Automated Residential Energy Standard) User's Guide is designed to help the user successfully operate the ARES computer program. This guide assumes that the user is familiar with basic PC skills such as using a keyboard and loading a disk drive. The ARES computer program was designed to assist building code officials in creating a residential energy standard based on local climate and costs.
Automated CPX support system preliminary design phase
NASA Technical Reports Server (NTRS)
Bordeaux, T. A.; Carson, E. T.; Hepburn, C. D.; Shinnick, F. M.
1984-01-01
The development of the Distributed Command and Control System (DCCS) is discussed. The development of an automated C2 system stimulated the development of an automated command post exercise (CPX) support system to provide a more realistic stimulus to DCCS than could be achieved with the existing manual system. An automated CPX system to support corps-level exercise was designed. The effort comprised four tasks: (1) collecting and documenting user requirements; (2) developing a preliminary system design; (3) defining a program plan; and (4) evaluating the suitability of the TRASANA FOURCE computer model.
CFD Process Pre- and Post-processing Automation in Support of Space Propulsion
NASA Technical Reports Server (NTRS)
Dorney, Suzanne M.
2003-01-01
The use of Computational Fluid Dynamics or CFD has become standard practice in the design and analysis of the major components used for space propulsion. In an attempt to standardize and improve the CFD process a series of automated tools have been developed. Through the use of these automated tools the application of CFD to the design cycle has been improved and streamlined. This paper presents a series of applications in which deficiencies were identified in the CFD process and corrected through the development of automated tools.
Computer-Assisted Automated Scoring of Polysomnograms Using the Somnolyzer System
Punjabi, Naresh M.; Shifa, Naima; Dorffner, Georg; Patil, Susheel; Pien, Grace; Aurora, Rashmi N.
2015-01-01
Study Objectives: Manual scoring of polysomnograms is a time-consuming and tedious process. To expedite the scoring of polysomnograms, several computerized algorithms for automated scoring have been developed. The overarching goal of this study was to determine the validity of the Somnolyzer system, an automated system for scoring polysomnograms. Design: The analysis sample comprised 97 sleep studies. Each polysomnogram was manually scored by certified technologists from four sleep laboratories and concurrently subjected to automated scoring by the Somnolyzer system. Agreement between manual and automated scoring was examined. Sleep staging and scoring of disordered breathing events were conducted using the 2007 American Academy of Sleep Medicine criteria. Setting: Clinical sleep laboratories. Measurements and Results: A high degree of agreement was noted between manual and automated scoring of the apnea-hypopnea index (AHI). The average correlation between the manually scored AHI across the four clinical sites was 0.92 (95% confidence interval: 0.90–0.93). Similarly, the average correlation between the manual and Somnolyzer-scored AHI values was 0.93 (95% confidence interval: 0.91–0.96). Thus, the interscorer correlation between the manually scored results was no different from that derived from manual and automated scoring. Substantial concordance in the arousal index, total sleep time, and sleep efficiency between manual and automated scoring was also observed. In contrast, differences were noted between the manually and automatically scored percentages of sleep stages N1, N2, and N3. Conclusion: Automated analysis of polysomnograms using the Somnolyzer system provides results that are comparable to manual scoring for commonly used metrics in sleep medicine.
Although differences exist between manual versus automated scoring for specific sleep stages, the level of agreement between manual and automated scoring is not significantly different than that between any two human scorers. In light of the burden associated with manual scoring, automated scoring platforms provide a viable complement of tools in the diagnostic armamentarium of sleep medicine. Citation: Punjabi NM, Shifa N, Dorffner G, Patil S, Pien G, Aurora RN. Computer-assisted automated scoring of polysomnograms using the Somnolyzer system. SLEEP 2015;38(10):1555–1566. PMID:25902809
Model-Based Design of Air Traffic Controller-Automation Interaction
NASA Technical Reports Server (NTRS)
Romahn, Stephan; Callantine, Todd J.; Palmer, Everett A.; Null, Cynthia H. (Technical Monitor)
1998-01-01
A model of controller and automation activities was used to design the controller-automation interactions necessary to implement a new terminal area air traffic management concept. The model was then used to design a controller interface that provides the requisite information and functionality. Using data from a preliminary study, the Crew Activity Tracking System (CATS) was used to help validate the model as a computational tool for describing controller performance.
Precision Relative Positioning for Automated Aerial Refueling from a Stereo Imaging System
2015-03-01
Thesis by Kyle P. Werner, 2Lt, USAF; report no. AFIT-ENG-MS-15-M-048. Presented to the Faculty, Department of Electrical and Computer Engineering, Graduate School of… Approved for public release; distribution unlimited.
An Automated Weather Research and Forecasting (WRF)-Based Nowcasting System: Software Description
2013-10-01
A Web service/Web interface software package has been engineered to address the need for an automated means to run the Weather Research and Forecasting (WRF) model for nowcasting. By Stephen F. Kirby, Brian P. Reen, and Robert E. Dumais Jr., Computational and Information Sciences…
Vision 20/20: Automation and advanced computing in clinical radiation oncology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, Kevin L., E-mail: kevinmoore@ucsd.edu; Moiseenko, Vitali; Kagadis, George C.
2014-01-15
This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.
Faster Aerodynamic Simulation With Cart3D
NASA Technical Reports Server (NTRS)
2003-01-01
A NASA-developed aerodynamic simulation tool is ensuring the safety of future space operations while providing designers and engineers with an automated, highly accurate computer simulation suite. Cart3D, co-winner of NASA's 2002 Software of the Year award, is the result of over 10 years of research and software development conducted by Michael Aftosmis and Dr. John Melton of Ames Research Center and Professor Marsha Berger of the Courant Institute at New York University. Cart3D offers a revolutionary approach to computational fluid dynamics (CFD), the computer simulation of how fluids and gases flow around an object of a particular design. By fusing technological advancements in diverse fields such as mineralogy, computer graphics, computational geometry, and fluid dynamics, the software provides a new industrial geometry processing and fluid analysis capability with unsurpassed automation and efficiency.
77 FR 67381 - Government-Owned Inventions; Availability for Licensing
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-09
"Computational and Experimental RNA Nanoparticle Design," in Automation in Genomics and Proteomics: An Engineering… Development Stage: Prototype; pre-clinical in vitro data available. Inventors: Robert J. Crouch and Yutaka…
Program for improved electrical harness documentation and fabrication
NASA Technical Reports Server (NTRS)
1971-01-01
Computer program provides automated print-out of harness interconnection table and automated cross-check of reciprocal pin/connector assignments, and improves accuracy and reliability of final documented data. Programs and corresponding library tapes are successfully and continuously employed on Nimbus spacecraft programs.
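The reciprocal cross-check described above amounts to verifying that every from/to entry in the harness interconnection table has a matching entry in the opposite direction. A minimal sketch of the idea follows; the data layout and names are assumed for illustration, not taken from the Nimbus program:

```python
def check_reciprocal(wire_list):
    """Return wire-list entries whose reciprocal entry is missing or wrong.

    Each entry maps one end point (connector, pin) to the end point it is
    wired to; a consistent harness table lists every connection twice,
    once from each end.
    """
    table = dict(wire_list)
    return [(a, b) for a, b in table.items() if table.get(b) != a]

wire_list = [
    (("J1", 1), ("J2", 5)),   # consistent pair ...
    (("J2", 5), ("J1", 1)),   # ... listed from both ends
    (("J3", 2), ("J4", 1)),   # inconsistent: J4 pin 1 claims a different mate
    (("J4", 1), ("J5", 9)),
]
errors = check_reciprocal(wire_list)
```

Flagged entries can then be printed alongside the interconnection table for manual correction before the documentation is released.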
Intelligent robot trends for 1998
NASA Astrophysics Data System (ADS)
Hall, Ernest L.
1998-10-01
An intelligent robot is a remarkably useful combination of a manipulator, sensors and controls. The use of these machines in factory automation can improve productivity, increase product quality and improve competitiveness. This paper presents a discussion of recent technical and economic trends. Technically, the machines are faster, cheaper, more repeatable, more reliable and safer. The knowledge base of inverse kinematic and dynamic solutions and intelligent controls is increasing. More attention is being given by industry to robots, vision and motion controls. New areas of usage are emerging for service robots, remote manipulators and automated guided vehicles. Economically, the robotics industry now has a 1.1-billion-dollar market in the U.S. and is growing. Feasibility study results are presented which also show decreasing costs for robots and unaudited healthy rates of return for a variety of robotic applications. However, the road from inspiration to successful application can be long and difficult, often taking decades to achieve a new product. A greater emphasis on mechatronics is needed in our universities. Certainly, more cooperation between government, industry and universities is needed to speed the development of intelligent robots that will benefit industry and society.
Pak, Richard; McLaughlin, Anne Collins; Bass, Brock
2014-01-01
Previous research has shown that gender stereotypes, elicited by the appearance of the anthropomorphic technology, can alter perceptions of system reliability. The current study examined whether stereotypes about the perceived age and gender of anthropomorphic technology interacted with reliability to affect trust in such technology. Participants included a cross-section of younger and older adults. Through a factorial survey, participants responded to health-related vignettes containing anthropomorphic technology with a specific age, gender, and level of past reliability by rating their trust in the system. Trust in the technology was affected by the age and gender of the user as well as its appearance and reliability. Perceptions of anthropomorphic technology can be affected by pre-existing stereotypes about the capability of a specific age or gender. The perceived age and gender of automation can alter perceptions of the anthropomorphic technology such as trust. Thus, designers of automation should design anthropomorphic interfaces with an awareness that the perceived age and gender will interact with the user’s age and gender
NASA Astrophysics Data System (ADS)
Saffar, Seha; Azni Jafar, Fairul; Jamaludin, Zamberi
2016-02-01
A case study was selected as the method to collect data in an actual industry situation. The study aimed to assess the influence of an automated material handling system in the automotive industry by proposing a new design of the integration system through simulation, and by analyzing the significant effects and influence of the system. The modelling and simulation tools are CAD software (Delmia and Quest). The preliminary data gathering in phase 1 will collect all related data from the actual industry situation; it is expected to produce a guideline and limitations for designing the new integration system later. In phase 2, a design concept will be developed using 10 principles of design consideration for manufacturing. A full factorial design will be used as the design of experiments in order to analyze the measured performance of the integration system against the current system in the case study. From the results of the experiment, an ANOVA analysis will be performed to study the measured performance. Thus, the influence of the improvements made to the system is expected to be observable.
Computer Science and Technology Publications. NBS Publications List 84.
ERIC Educational Resources Information Center
National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.
This bibliography lists publications of the Institute for Computer Sciences and Technology of the National Bureau of Standards. Publications are listed by subject in the areas of computer security, computer networking, and automation technology. Sections list publications of: (1) current Federal Information Processing Standards; (2) computer…
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2012 CFR
2012-01-01
…Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated… DOE means the Department of Energy, including the National Nuclear Security Administration…
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2014 CFR
2014-01-01
…Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated… DOE means the Department of Energy, including the National Nuclear Security Administration…
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2013 CFR
2013-01-01
…Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated… DOE means the Department of Energy, including the National Nuclear Security Administration…
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2011 CFR
2011-01-01
…Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated… DOE means the Department of Energy, including the National Nuclear Security Administration…
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2010 CFR
2010-01-01
…Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated… DOE means the Department of Energy, including the National Nuclear Security Administration…
Advanced computer architecture for large-scale real-time applications.
DOT National Transportation Integrated Search
1973-04-01
Air traffic control automation is identified as a crucial problem which provides a complex, real-time computer application environment. A novel computer architecture in the form of a pipeline associative processor is conceived to achieve greater perf...
Computer grading of examinations
NASA Technical Reports Server (NTRS)
Frigerio, N. A.
1969-01-01
A method, using IBM cards and computer processing, automates examination grading and recording and permits use of computational problems. The student generates his own answers, and the instructor has much greater freedom in writing questions than is possible with multiple choice examinations.
Networked Microcomputers--The Next Generation in College Computing.
ERIC Educational Resources Information Center
Harris, Albert L.
The evolution of computer hardware for college computing has mirrored the industry's growth. When computers were introduced into the educational environment, they had limited capacity and served one user at a time. Then came large mainframes with many terminals sharing the resource. Next, the use of computers in office automation emerged. As…
ERIC Educational Resources Information Center
Micro-Ideas, Glenview, IL.
The 47 papers in these proceedings describe computer technology and its many applications to the educational process. Topics discussed include computer literacy, networking, word processing, automated instructional management, computer conferencing, career information services, computer-aided drawing/design, and robotics. Programming languages…
Tomorrow Is Today at Silver Ridge.
ERIC Educational Resources Information Center
Wise, B. J.
1994-01-01
Describes a Washington State school's efforts to forego factory-model education for a boldly restructured curriculum dependent on new technologies, such as computer networks, two-year classrooms, ongoing staff development and planning sessions, and an innovative onsite day-care program for staff and students. The school has succeeded in…
Automated aortic calcification detection in low-dose chest CT images
NASA Astrophysics Data System (ADS)
Xie, Yiting; Htwe, Yu Maw; Padgett, Jennifer; Henschke, Claudia; Yankelevitz, David; Reeves, Anthony P.
2014-03-01
The extent of aortic calcification has been shown to be a risk indicator for vascular events including cardiac events. We have developed a fully automated computer algorithm to segment and measure aortic calcification in low-dose, non-contrast, non-ECG-gated chest CT scans. The algorithm first segments the aorta using a pre-computed Anatomy Label Map (ALM). Then, based on the segmented aorta, aortic calcification is detected and measured in terms of the Agatston score, mass score, and volume score. The automated scores are compared with reference scores obtained from manual markings. For aorta segmentation, the aorta is modeled as a series of discrete overlapping cylinders and the aortic centerline is determined using a cylinder-tracking algorithm. Then the aortic surface location is detected using the centerline and a triangular mesh model. The segmented aorta is used as a mask for the detection of aortic calcification. For calcification detection, the image is first filtered; then an elevated threshold of 160 Hounsfield units (HU) is used within the aorta mask region to reduce the effect of noise in low-dose scans; and finally, non-aortic calcification voxels (bony structures, calcification in other organs) are eliminated. The remaining candidates are considered true aortic calcification. The computer algorithm was evaluated on 45 low-dose non-contrast CT scans. Using linear regression, the automated Agatston score is 98.42% correlated with the reference Agatston score. The automated mass and volume scores are, respectively, 98.46% and 98.28% correlated with the reference mass and volume scores.
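The scoring stage can be illustrated with a simplified per-slice Agatston computation. This is a sketch under stated assumptions: the 160 HU threshold comes from the abstract and the density weighting is the standard Agatston scheme, but the single-lesion handling and array layout are illustrative, not the authors' full pipeline with centerline tracking and lesion grouping:

```python
import numpy as np

def agatston_weight(peak_hu):
    # Standard Agatston density weighting by the lesion's peak attenuation
    if peak_hu >= 400:
        return 4
    if peak_hu >= 300:
        return 3
    if peak_hu >= 200:
        return 2
    return 1

def slice_scores(hu, aorta_mask, pixel_area_mm2, threshold=160):
    """Agatston and area contributions of one slice within the aorta mask."""
    calc = (hu >= threshold) & aorta_mask
    if not calc.any():
        return 0.0, 0.0
    area = calc.sum() * pixel_area_mm2
    return area * agatston_weight(hu[calc].max()), area

# Synthetic 10x10 slice: a 2x2 calcified plaque of 450 HU inside the mask
hu = np.zeros((10, 10))
hu[4:6, 4:6] = 450.0
mask = np.zeros((10, 10), dtype=bool)
mask[2:8, 2:8] = True
agatston, area = slice_scores(hu, mask, pixel_area_mm2=0.25)
```

A whole-scan score would sum these contributions over all slices, after the filtering and non-aortic voxel removal described in the abstract; the volume score similarly accumulates calcified voxel volume.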
Pilots of the future - Human or computer?
NASA Technical Reports Server (NTRS)
Chambers, A. B.; Nagel, D. C.
1985-01-01
In connection with the occurrence of aircraft accidents and the evolution of the air-travel system, questions arise regarding the computer's potential for making fundamental contributions to improving the safety and reliability of air travel. An important result of an analysis of the causes of aircraft accidents is the conclusion that humans - 'pilots and other personnel' - are implicated in well over half of the accidents which occur. Over 70 percent of the incident reports contain evidence of human error. In addition, almost 75 percent show evidence of an 'information-transfer' problem. Thus, the question arises whether improvements in air safety could be achieved by removing humans from control situations. In an attempt to answer this question, it is important to take into account also certain advantages which humans have in comparison to computers. Attention is given to human error and the effects of technology, the motivation to automate, aircraft automation at the crossroads, the evolution of cockpit automation, and pilot factors.
A Novel Automated Method for Analyzing Cylindrical Computed Tomography Data
NASA Technical Reports Server (NTRS)
Roth, D. J.; Burke, E. R.; Rauser, R. W.; Martin, R. E.
2011-01-01
A novel software method is presented that is applicable for analyzing cylindrical and partially cylindrical objects inspected using computed tomography. This method involves unwrapping and re-slicing data so that the CT data from the cylindrical object can be viewed as a series of 2-D sheets in the vertical direction, in addition to the volume rendering and normal plane views provided by traditional CT software. The method is based on interior and exterior surface edge detection and, under proper conditions, is fully automated and requires no input from the user except the correct voxel dimension from the CT scan. The software is available from NASA in 32- and 64-bit versions that can be applied to gigabyte-sized data sets, processing data either in random access memory or primarily on the computer hard drive. Please inquire with the presenting author for further details. This software differentiates itself from other possible re-slicing software solutions through its complete automation and its advanced processing and analysis capabilities.
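The unwrap-and-re-slice idea can be sketched as resampling one cylindrical shell of the volume onto a flat (z, theta) sheet. This is an illustrative sketch only; the NASA software's automated interior/exterior surface edge detection, which selects the shells, is not reproduced here:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def unwrap_shell(volume, center_yx, radius, n_theta=360):
    """Resample a (z, y, x) volume on a cylinder of the given radius,
    producing a flat (z, theta) sheet via bilinear interpolation."""
    nz = volume.shape[0]
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    ys = center_yx[0] + radius * np.sin(theta)
    xs = center_yx[1] + radius * np.cos(theta)
    sheet = np.empty((nz, n_theta))
    for z in range(nz):
        # Sample each axial slice along the circle of this radius
        sheet[z] = map_coordinates(volume[z], [ys, xs], order=1)
    return sheet

# Synthetic test volume: voxel value equals distance from the cylinder axis,
# so the unwrapped sheet at radius 10 should be approximately constant at 10
yy, xx = np.mgrid[0:64, 0:64]
dist = np.sqrt((yy - 32.0)**2 + (xx - 32.0)**2)
volume = np.repeat(dist[None, :, :], 8, axis=0)
sheet = unwrap_shell(volume, (32.0, 32.0), radius=10.0)
```

Stepping the radius from the detected interior surface to the exterior surface yields the stack of 2-D sheets the abstract describes.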
Satellite freeze forecast system: Executive summary
NASA Technical Reports Server (NTRS)
Martsolf, J. D. (Principal Investigator)
1983-01-01
A satellite-based temperature monitoring and prediction system, consisting of a computer-controlled acquisition, processing, and display system and the ten automated weather stations called by that computer, was developed and transferred to the National Weather Service. This satellite freeze forecasting system (SFFS) acquires satellite data from either of two sources and surface data from 10 sites; displays the observed data in the form of color-coded thermal maps and in tables of automated weather station temperatures; computes predicted thermal maps when requested and displays such maps either automatically or manually; archives the data acquired; and makes comparisons with historical data. Except for the last function, SFFS handles these tasks in a highly automated fashion if the user so directs. The predicted thermal maps are the result of two models: one a physical energy budget of the soil-atmosphere interface, and the other a statistical relationship between the sites at which the physical model predicts temperatures and each of the pixels of the satellite thermal map.
Development of microcomputer-based mental acuity tests.
Turnage, J J; Kennedy, R S; Smith, M G; Baltzley, D R; Lane, N E
1992-10-01
Recent disasters have focused attention on performance problems due to the use of alcohol and controlled substances in the workplace. Environmental stressors such as thermal extremes, mixed gases, noise, motion, and vibration also have adverse effects on human performance and operator efficiency. However, the lack of a standardized, sensitive, human performance assessment battery has probably delayed the systematic study of the deleterious effects of various toxic chemicals and drugs at home and in the workplace. The collective goal of the research reported here is the development of a menu of tests embedded in a coherent package of hardware and software that may be useful in repeated-measures studies of a broad range of agents that can degrade human performance. A menu of 40 tests from the Automated Performance Test System (APTS) is described, and the series of interlocking studies supporting its development is reviewed. The APTS tests, which run on several versions of laptop portables and desktop personal computers, have been shown to be stable, reliable, and factorially rich, and to have predictive validities with holistic measures of intelligence and simulator performances. In addition, sensitivity studies have been conducted in which performance changes due to stressors, agents, and treatments were demonstrated. We believe that tests like those described here have prospective use as an adjunct to urine testing for the screening for performance loss of individuals who are granted access to workplaces and stations that impact public safety.
Computational needs survey of NASA automation and robotics missions. Volume 1: Survey and results
NASA Technical Reports Server (NTRS)
Davis, Gloria J.
1991-01-01
NASA's operational use of advanced processor technology in space systems lags behind its commercial development by more than eight years. One of the factors contributing to this is that mission computing requirements are frequently unknown, unstated, misrepresented, or simply not available in a timely manner. NASA must provide clear common requirements to make better use of available technology, to cut development lead time on deployable architectures, and to increase the utilization of new technology. A preliminary set of advanced mission computational processing requirements of automation and robotics (A&R) systems is provided for use by NASA, industry, and academic communities. These results were obtained in an assessment of the computational needs of current projects throughout NASA. A high percentage of responses indicated a general need for enhanced computational capabilities beyond the currently available 80386 and 68020 processor technology. Because of the need for faster processors and more memory, 90 percent of the polled automation projects have reduced or will reduce the scope of their implementation capabilities. The requirements are presented with respect to their targeted environment, identifying the applications required, the system performance levels necessary to support them, and the degree to which they are met under typical programmatic constraints. Volume one includes the survey and results. Volume two contains the appendixes.
Computational needs survey of NASA automation and robotics missions. Volume 2: Appendixes
NASA Technical Reports Server (NTRS)
Davis, Gloria J.
1991-01-01
NASA's operational use of advanced processor technology in space systems lags behind its commercial development by more than eight years. One of the factors contributing to this is that mission computing requirements are frequently unknown, unstated, misrepresented, or simply not available in a timely manner. NASA must provide clear common requirements to make better use of available technology, to cut development lead time on deployable architectures, and to increase the utilization of new technology. Here, NASA, industry, and academic communities are provided with a preliminary set of advanced mission computational processing requirements of automation and robotics (A&R) systems. The results were obtained in an assessment of the computational needs of current projects throughout NASA. A high percentage of responses indicated a general need for enhanced computational capabilities beyond the currently available 80386 and 68020 processor technology. Because of the need for faster processors and more memory, 90 percent of the polled automation projects have reduced or will reduce the scope of their implemented capabilities. The requirements are presented with respect to their targeted environment, identifying the applications required, the system performance levels necessary to support them, and the degree to which they are met under typical programmatic constraints. Here, the appendixes are provided.
Laboratory systems integration: robotics and automation.
Felder, R A
1991-01-01
Robotic technology is going to have a profound impact on the clinical laboratory of the future. Faced with increased pressure to reduce health care spending yet increase services to patients, many laboratories are looking for alternatives to the inflexible or "fixed" automation found in many clinical analyzers. Robots are being examined by many clinical pathologists as an attractive technology which can adapt to the constant changes in laboratory testing. Already, laboratory designs are being altered to accommodate robotics and automated specimen processors. However, the use of robotics and computer intelligence in the clinical laboratory is still in its infancy. Successful examples of robotic automation exist in several laboratories. Investigators have used robots to automate endocrine testing, high-performance liquid chromatography, and specimen transportation. Large commercial laboratories are investigating the use of specimen processors which combine fixed automation and robotics. Robotics has also reduced the exposure of medical technologists to specimens infected with viral pathogens. The successful examples of clinical robotics applications were a result of the cooperation of clinical chemists, engineers, and medical technologists. At the University of Virginia we have designed and implemented a robotic critical care laboratory. Initial clinical experience suggests that robotic performance is reliable; however, staff acceptance and utilization require continuing education. We are also developing a robotic cyclosporine assay, which promises to greatly reduce the labor costs of this analysis. The future will bring lab-wide automation that fully integrates computer artificial intelligence and robotics. Specimens will be transported by mobile robots. Specimen processing, aliquotting, and scheduling will be automated.
The application of automated operations at the Institutional Processing Center
NASA Technical Reports Server (NTRS)
Barr, Thomas H.
1993-01-01
The JPL Institutional and Mission Computing Division, Communications, Computing and Network Services Section, with its mission contractor, OAO Corporation, has for some time been applying automation to the operation of JPL's Information Processing Center (IPC). Automation does not come in one easy-to-use package. Automation for a data processing center is made up of many different software and hardware products supported by trained personnel. The IPC automation effort formally began with console automation and has since spiraled out to include production scheduling, data entry, report distribution, online reporting, failure reporting and resolution, documentation, library storage, and operator and user education, while requiring the interaction of multi-vendor and locally developed software. To begin the process, automation goals are determined. Then a team including operations personnel is formed to research and evaluate available options. By acquiring knowledge of current products and those in development, taking an active role in industry organizations, and learning from other data centers' experiences, a forecast can be developed as to what direction technology is moving. With IPC management's approval, an implementation plan is developed and resources are identified to test or implement new systems. As an example, IPC's new automated data entry system was researched by Data Entry, Production Control, and Advance Planning personnel. A proposal was then submitted to management for review. A determination to implement the new system was made, and the elements and personnel involved with the initial planning performed the implementation. The final steps of the implementation were educating data entry personnel in the areas affected and making the procedural changes necessary for the successful operation of the new system.
Software technology insertion: A study of success factors
NASA Technical Reports Server (NTRS)
Lydon, Tom
1990-01-01
Managing software development in large organizations has become increasingly difficult due to increasing technical complexity, stricter government standards, a shortage of experienced software engineers, competitive pressure for improved productivity and quality, the need to co-develop hardware and software, and the rapid changes in both hardware and software technology. The 'software factory' approach to software development minimizes risks while maximizing productivity and quality through standardization, automation, and training. In practice, however, this approach is relatively inflexible when adopting new software technologies. The methods that a large multi-project software engineering organization can use to increase the likelihood of successful software technology insertion (STI), especially in a standardized engineering environment, are described.
iPTF discovery and identification of bright transients
NASA Astrophysics Data System (ADS)
Adams, Scott; Karamehmetoglu, Emir; Roy, Rupak; Neill, James D.; Walters, Richard; Cook, Dave; Kupfer, Thomas; Cannella, Chris; Blagorodnova, Nadejda; Yan, Lin; Kasliwal, Mansi; Kulkarni, Shri
2017-02-01
The intermediate Palomar Transient Factory (ATel #4807) reports the discovery of the following bright transients. We report as ATel alerts all objects brighter than 19 mag. Our discoveries are reported in two filters: sdss-g and Mould-I, denoted as g and I. All magnitudes are obtained using difference image photometry based on the PTFIDE pipeline described in Masci et al. 2016.Our automated candidate vetting to distinguish a real astrophysical source (1.0) from bogus artifacts (0.0) is powered by three generations of machine learning algorithms: RB2 (Brink et al. 2013MNRAS.435.1047B), RB4 (Rebbapragada et al. 2015AAS...22543402R), and RB5 (Wozniak et al. 2013AAS...22143105W).
Career Education via Data Processing
ERIC Educational Resources Information Center
Wagner, Gerald E.
1975-01-01
A data processing instructional program should provide students with career awareness, exploration, and orientation. This can be accomplished by establishing three objectives: (1) familiarization with automation terminology; (2) understanding the influence of the cultural and social impact of computers and automation; and (3) the kinds of job…
Office Automation in Student Affairs.
ERIC Educational Resources Information Center
Johnson, Sharon L.; Hamrick, Florence A.
1987-01-01
Offers recommendations to assist in introducing or expanding computer assistance in student affairs. Describes need for automation and considers areas of choosing hardware and software, funding and competitive bidding, installation and training, and system management. Cites greater efficiency in handling tasks and data and increased levels of…
Development of an automated pre-sampling plan for construction projects : final report.
DOT National Transportation Integrated Search
1983-03-01
The development of an automated pre-sampling plan was undertaken to free the district construction personnel from the cumbersome and time-consuming task of preparing such plans manually. A computer program was written and linked to a data file which ...
Automated Induction Of Rule-Based Neural Networks
NASA Technical Reports Server (NTRS)
Smyth, Padhraic J.; Goodman, Rodney M.
1994-01-01
Prototype expert systems that are implemented in software and are functionally equivalent to neural networks can be set up automatically and placed into operation within minutes by following an information-theoretic approach to automated acquisition of knowledge from large example data bases. The approach is based largely on use of the ITRULE computer program.
USSR Report, Cybernetics, Computers and Automation Technology.
1987-03-02
Studies in the Area of EPR of Non-Ordered Solids: Spectral Recording, Processing and Analysis System (A.N. Bals, L.M. Kuzmina; AVTOMETRIYA, No 2, Feb...) [A.N. Bals, L.M. Kuzmina, Riga] [Abstract] An automated system has been developed for electron paramagnetic resonance studies, oriented toward achievement of...
One of My Favorite Assignments: Automated Teller Machine Simulation.
ERIC Educational Resources Information Center
Oberman, Paul S.
2001-01-01
Describes an assignment for an introductory computer science class that requires the student to write a software program that simulates an automated teller machine. Highlights include an algorithm for the assignment; sample file contents; language features used; assignment variations; and discussion points. (LRW)
An anatomy of industrial robots and their controls
NASA Astrophysics Data System (ADS)
Luh, J. Y. S.
1983-02-01
The modernization of manufacturing facilities by means of automation represents an approach for increasing productivity in industry. The three existing types of automation are continuous process control, the use of transfer conveyor methods, and the employment of programmable automation for the low-volume batch production of discrete parts. Industrial robots, which are defined as computer-controlled mechanical manipulators, belong to the area of programmable automation. Typically, the robots perform tasks of arc welding, paint spraying, or foundry operation. One may assign a robot to a variety of job assignments simply by changing the appropriate computer program. The present investigation is concerned with an evaluation of the potential of the robot on the basis of its basic structure and controls. It is found that robots function well in limited areas of industry. If the range of tasks which robots can perform is to be expanded, it is necessary to provide multiple-task sensors, special tooling, or even automatic tooling.
Archuleta, Christy-Ann M.; Gonzales, Sophia L.; Maltby, David R.
2012-01-01
The U.S. Geological Survey (USGS), in cooperation with the Texas Commission on Environmental Quality, developed computer scripts and applications to automate the delineation of watershed boundaries and compute watershed characteristics for more than 3,000 surface-water-quality monitoring stations in Texas that were active during 2010. Microsoft Visual Basic applications were developed using ArcGIS ArcObjects to format the source input data required to delineate watershed boundaries. Several automated scripts and tools were developed or used to calculate watershed characteristics using Python, Microsoft Visual Basic, and the RivEX tool. Automated methods were augmented by the use of manual methods, including those done using ArcMap software. Watershed boundaries delineated for the monitoring stations are limited to the extent of the Subbasin boundaries in the USGS Watershed Boundary Dataset, which may not include the total watershed boundary from the monitoring station to the headwaters.
Computation of Flow Through Water-Control Structures Using Program DAMFLO.2
Sanders, Curtis L.; Feaster, Toby D.
2004-01-01
As part of its mission to collect, analyze, and store streamflow data, the U.S. Geological Survey computes flow through several dam structures throughout the country. Flows are computed using hydraulic equations that describe flow through sluice and Tainter gates, crest gates, lock gates, spillways, locks, pumps, and siphons, which are calibrated using flow measurements. The program DAMFLO.2 was written to compute, tabulate, and plot flow through dam structures using data that describe the physical properties of dams and various hydraulic parameters and ratings that use time-varying data, such as lake elevations or gate openings. The program uses electronic computer files of time-varying data, such as lake elevation or gate openings, retrieved from the U.S. Geological Survey Automated Data Processing System. Computed time-varying flow data from DAMFLO.2 are output in flat files, which can be entered into the Automated Data Processing System database. All computations are made in units of feet and seconds. DAMFLO.2 uses the procedures and language developed by the SAS Institute Inc.
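The abstract does not give DAMFLO.2's actual rating equations, but the free-flow sluice-gate relation it alludes to, Q = Cd · A · √(2gh), is easy to sketch. The function below is a hedged illustration rather than DAMFLO.2 code: the discharge coefficient `cd` would come from the calibration flow measurements the abstract mentions, and units follow the program's feet-and-seconds convention.

```python
import math

def sluice_gate_flow(cd, gate_width_ft, gate_opening_ft, upstream_head_ft, g=32.2):
    """Free-flow discharge through a sluice gate: Q = Cd * b * w * sqrt(2 g h).

    All quantities are in feet and seconds, matching the program's unit
    convention; cd is a calibrated, dimensionless discharge coefficient.
    """
    area = gate_width_ft * gate_opening_ft        # flow area under the gate
    return cd * area * math.sqrt(2.0 * g * upstream_head_ft)

# Hypothetical example: a 10 ft wide gate opened 2 ft under 9 ft of head
q = sluice_gate_flow(0.6, 10.0, 2.0, 9.0)         # discharge in ft^3/s
```

A time series of gate openings and lake elevations, like the Automated Data Processing System records the abstract describes, would simply be fed through this function row by row.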
Medical Information Processing by Computer.
ERIC Educational Resources Information Center
Kleinmuntz, Benjamin
The use of the computer for medical information processing was introduced about a decade ago. Considerable inroads have now been made toward its applications to problems in medicine. Present uses of the computer, both as a computational and noncomputational device include the following: automated search of patients' files; on-line clinical data…
ERIC Educational Resources Information Center
Clyde, Anne
1999-01-01
Discussion of the Year 2000 (Y2K) problem, the computer-code problem that affects computer programs or computer chips, focuses on the impact on teacher-librarians. Topics include automated library systems, access to online information services, library computers and software, and other electronic equipment such as photocopiers and fax machines.…
Automating Disk Forensic Processing with SleuthKit, XML and Python
2009-05-01
Automating Disk Forensic Processing with SleuthKit, XML and Python. Simson L. Garfinkel. Abstract: We have developed a program called fiwalk which...files themselves. We show how it is relatively simple to create automated disk forensic applications using a Python module we have written that reads...software that the portable device may contain. Keywords: Computer Forensics; XML; Sleuth Kit; Python. I. INTRODUCTION: In recent years we have found many...
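The abstract's idea, consuming fiwalk's XML file listing from Python rather than re-parsing the disk image, can be sketched with the standard library alone. The element names below imitate the kind of DFXML output fiwalk produces, but they are illustrative assumptions, not the actual schema, and the helper is not the authors' module.

```python
import xml.etree.ElementTree as ET

# A tiny, hypothetical DFXML-like listing; the real fiwalk output is richer,
# so treat these element names as assumptions for illustration only.
LISTING = """\
<dfxml>
  <fileobject><filename>report.doc</filename><filesize>2048</filesize></fileobject>
  <fileobject><filename>notes.txt</filename><filesize>120</filesize></fileobject>
</dfxml>
"""

def files_larger_than(xml_text, min_bytes):
    """Return the names of files whose recorded size exceeds min_bytes."""
    root = ET.fromstring(xml_text)
    return [fo.findtext("filename")
            for fo in root.iter("fileobject")
            if int(fo.findtext("filesize", "0")) > min_bytes]

big = files_larger_than(LISTING, 1000)
```

The point of the paper's design is visible even in this toy: the forensic application never touches SleuthKit directly, only the XML.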
1982-01-27
...Visible; 3.3 Earth Location, Colocation, and Normalization; 4. IMAGE ANALYSIS; 4.1 Interactive Capabilities; 4.2 Examples; 5. AUTOMATED CLOUD...computer Interactive Data Access System (McIDAS) before image analysis and algorithm development were done. Earth-location is an automated procedure to...the factor l/s in (SSE) toward the gain settings given in Table 5. 4. IMAGE ANALYSIS: 4.1 Interactive Capabilities. The development of automated...
Cost considerations in automating the library.
Bolef, D
1987-01-01
The purchase price of a computer and its software is but a part of the cost of any automated system. There are many additional costs, including one-time costs of terminals, printers, multiplexors, microcomputers, consultants, workstations and retrospective conversion, and ongoing costs of maintenance and maintenance contracts for the equipment and software, telecommunications, and supplies. This paper examines those costs in an effort to produce a more realistic picture of an automated system. PMID:3594021
NASA Tech Briefs, June 1996. Volume 20, No. 6
NASA Technical Reports Server (NTRS)
1996-01-01
Topics: New Computer Hardware; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Books and Reports.
Automation of electromagnetic compatability (EMC) test facilities
NASA Technical Reports Server (NTRS)
Harrison, C. A.
1986-01-01
Efforts to automate electromagnetic compatibility (EMC) test facilities at Marshall Space Flight Center are discussed. The present facility is used to accomplish a battery of nine standard tests (with limited variations) designed to certify EMC of Shuttle payload equipment. Prior to this project, some EMC tests were partially automated, but others were performed manually. Software was developed to integrate all testing by means of a desktop computer-controller. Near-real-time data reduction and onboard graphics capabilities permit immediate assessment of test results. Provisions for disk storage of test data permit computer production of the test engineer's certification report. Software flexibility permits variation in the test procedures, the ability to examine more closely those frequency bands which indicate compatibility problems, and the capability to incorporate additional test procedures.
A Computational Framework for Automation of Point Defect Calculations
NASA Astrophysics Data System (ADS)
Goyal, Anuj; Gorai, Prashun; Peng, Haowei; Lany, Stephan; Stevanovic, Vladan; National Renewable Energy Laboratory, Golden, Colorado 80401 Collaboration
A complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory has been developed. The framework provides an effective and efficient method for defect structure generation and the creation of simple yet customizable workflows to analyze defect calculations. The package provides the capability to compute widely accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band-filling correction for shallow defects. Using Si, ZnO, and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology. We believe that a robust automated tool like this will enable the materials-by-design community to assess the impact of point defects on materials performance.
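Of the three finite-size corrections the abstract names, the image-charge term is the easiest to illustrate. The sketch below implements only the classic leading-order (monopole) Makov-Payne estimate, E_corr ≈ q²α_M / (2εL) in atomic units; this is an assumed stand-in for illustration, since the package may implement a more sophisticated scheme.

```python
HARTREE_TO_EV = 27.211386  # conversion from hartree to electronvolt

def image_charge_correction_ev(q, alpha_m, eps, cell_length_bohr):
    """Leading-order Makov-Payne image-charge correction for a charged
    defect in a periodic supercell: E_corr = q^2 * alpha_M / (2 * eps * L).

    q: defect charge state (units of e); alpha_m: Madelung constant of the
    supercell lattice (about 2.837 for simple cubic); eps: static dielectric
    constant of the host; cell_length_bohr: linear cell dimension in bohr.
    Returns the correction in eV.
    """
    return (q ** 2) * alpha_m / (2.0 * eps * cell_length_bohr) * HARTREE_TO_EV

# Hypothetical numbers: a 2+ defect in a 20-bohr cubic cell with eps = 10
e_corr = image_charge_correction_ev(2, 2.837, 10.0, 20.0)
```

The correction shrinks as 1/L, which is why such schemes matter most for the small supercells typical of DFT defect studies.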
NASA Astrophysics Data System (ADS)
Srivastava, Vishal; Dalal, Devjyoti; Kumar, Anuj; Prakash, Surya; Dalal, Krishna
2018-06-01
Moisture content is an important feature of fruits and vegetables. Since about 80% of an apple's content is water, decreasing the moisture content will degrade the quality of apples (Golden Delicious). The computational and texture features of the apples were extracted from optical coherence tomography (OCT) images. A support vector machine with a Gaussian kernel model was used to perform automated classification. For evaluating the quality of wax-coated apples during storage in vivo, our proposed method opens up the possibility of fully automated quantitative analysis based on the morphological features of apples. Our results demonstrate that the analysis of the computational and texture features of OCT images may be a good non-destructive method for the assessment of the quality of apples.
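The abstract does not give the trained model or its features, so the sketch below only illustrates the Gaussian (RBF) kernel at the heart of such a classifier, using made-up 2-D texture features and a simple mean-similarity decision rule as a hypothetical stand-in for the paper's SVM.

```python
import math

def rbf_kernel(x, y, gamma=0.5):
    """Gaussian (RBF) kernel: exp(-gamma * ||x - y||^2)."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

def kernel_classify(sample, labelled):
    """Assign the class whose training points have the largest mean kernel
    similarity to the sample -- a toy stand-in for a trained SVM decision."""
    scores = {}
    for features, label in labelled:
        scores.setdefault(label, []).append(rbf_kernel(sample, features))
    return max(scores, key=lambda lbl: sum(scores[lbl]) / len(scores[lbl]))

# Hypothetical 2-D texture features (e.g. contrast, homogeneity) per apple
train = [((0.9, 0.1), "fresh"), ((0.8, 0.2), "fresh"),
         ((0.2, 0.9), "degraded"), ((0.3, 0.8), "degraded")]
label = kernel_classify((0.85, 0.15), train)
```

In practice one would extract such features from the OCT images and fit a proper max-margin SVM; the kernel, however, is exactly the Gaussian form the abstract names.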
STAMPS: Software Tool for Automated MRI Post-processing on a supercomputer.
Bigler, Don C; Aksu, Yaman; Miller, David J; Yang, Qing X
2009-08-01
This paper describes a Software Tool for Automated MRI Post-processing (STAMP) of multiple types of brain MRIs on a workstation and for parallel processing on a supercomputer (STAMPS). This software tool enables the automation of nonlinear registration for a large image set and for multiple MR image types. The tool uses standard brain MRI post-processing tools (such as SPM, FSL, and HAMMER) for multiple MR image types in a pipeline fashion. It also contains novel MRI post-processing features. The STAMP image outputs can be used to perform brain analysis using Statistical Parametric Mapping (SPM) or single-/multi-image modality brain analysis using Support Vector Machines (SVMs). Since STAMPS is PBS-based, the supercomputer may be a multi-node computer cluster or one of the latest multi-core computers.
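Since STAMPS is PBS-based, each pipeline stage would be submitted as a batch job. The script below is a generic PBS job-file sketch only: the job name, resource line, and the `stamp` command with its flags are hypothetical, as the tool's actual command-line interface is not described in the abstract.

```shell
#!/bin/sh
# Hypothetical PBS job script for one STAMP registration stage; queue,
# resources, and the stamp invocation are illustrative assumptions.
#PBS -N stamp_register
#PBS -l nodes=1:ppn=8
#PBS -l walltime=04:00:00
#PBS -j oe

cd "$PBS_O_WORKDIR"
./stamp --pipeline register --subject subj001 --modalities t1,t2,flair
```

Running one such job per subject is what lets a PBS scheduler spread a large image set across the nodes of a cluster.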
Providing security for automated process control systems at hydropower engineering facilities
NASA Astrophysics Data System (ADS)
Vasiliev, Y. S.; Zegzhda, P. D.; Zegzhda, D. P.
2016-12-01
This article suggests the concept of a cyberphysical system to manage computer security of automated process control systems at hydropower engineering facilities. According to the authors, this system consists of a set of information processing tools and computer-controlled physical devices. Examples of cyber attacks on power engineering facilities are provided, and a strategy for improving the cybersecurity of hydropower engineering systems is suggested. The architecture of the multilevel protection of the automated process control system (APCS) of power engineering facilities is given, including security systems, control systems, access control, encryption, and a secure virtual private network of subsystems for monitoring and analysis of security events. The distinctive aspect of the approach is consideration of the interrelations and cyber threats that arise when SCADA is integrated with the unified enterprise information system.
A test matrix sequencer for research test facility automation
NASA Technical Reports Server (NTRS)
Mccartney, Timothy P.; Emery, Edward F.
1990-01-01
The hardware and software configuration of the Test Matrix Sequencer, a general-purpose test matrix profiler that was developed for research test facility automation at the NASA Lewis Research Center, is described. The system provides set points to controllers and contact closures to data systems during the course of a test. The Test Matrix Sequencer consists of a microprocessor-controlled system which is operated from a personal computer. The software program, which is the main element of the overall system, is interactive and menu-driven, with pop-up windows and help screens. Analog and digital input/output channels can be controlled from a personal computer using the software program. The Test Matrix Sequencer provides more efficient use of aeronautics test facilities by automating repetitive tasks that were once done manually.
Nguyen, Su; Zhang, Mengjie; Tan, Kay Chen
2017-09-01
Automated design of dispatching rules for production systems has been an interesting research topic over the last several years. Machine learning, especially genetic programming (GP), has been a powerful approach to dealing with this design problem. However, intensive computational requirements, accuracy, and interpretability are still its limitations. This paper aims at developing a new surrogate-assisted GP to help improve the quality of the evolved rules without significant computational costs. The experiments have verified the effectiveness and efficiency of the proposed algorithms as compared to those in the literature. Furthermore, new simplification and visualisation approaches have also been developed to improve the interpretability of the evolved rules. These approaches have shown great potential and proved to be a critical part of the automated design system.
Adaptive function allocation reduces performance costs of static automation
NASA Technical Reports Server (NTRS)
Parasuraman, Raja; Mouloua, Mustapha; Molloy, Robert; Hilburn, Brian
1993-01-01
Adaptive automation offers the option of flexible function allocation between the pilot and on-board computer systems. One of the important claims for the superiority of adaptive over static automation is that such systems do not suffer from some of the drawbacks associated with conventional function allocation. Several experiments designed to test this claim are reported in this article. The efficacy of adaptive function allocation was examined using a laboratory flight-simulation task involving multiple functions of tracking, fuel-management, and systems monitoring. The results show that monitoring inefficiency represents one of the performance costs of static automation. Adaptive function allocation can reduce the performance cost associated with long-term static automation.
The interaction of representation and reasoning
Bundy, Alan
2013-01-01
Automated reasoning is an enabling technology for many applications of informatics. These applications include verifying that a computer program meets its specification; enabling a robot to form a plan to achieve a task and answering questions by combining information from diverse sources, e.g. on the Internet, etc. How is automated reasoning possible? Firstly, knowledge of a domain must be stored in a computer, usually in the form of logical formulae. This knowledge might, for instance, have been entered manually, retrieved from the Internet or perceived in the environment via sensors, such as cameras. Secondly, rules of inference are applied to old knowledge to derive new knowledge. Automated reasoning techniques have been adapted from logic, a branch of mathematics that was originally designed to formalize the reasoning of humans, especially mathematicians. My special interest is in the way that representation and reasoning interact. Successful reasoning is dependent on appropriate representation of both knowledge and successful methods of reasoning. Failures of reasoning can suggest changes of representation. This process of representational change can also be automated. We will illustrate the automation of representational change by drawing on recent work in my research group. PMID:24062623
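The two steps described above, storing knowledge as logical formulae and applying rules of inference to derive new knowledge, can be illustrated with a minimal forward-chaining loop over propositional facts. This toy sketch is mine, not Bundy's, and the rules and facts are invented examples.

```python
def forward_chain(facts, rules):
    """Repeatedly apply modus-ponens-style rules (premises -> conclusion)
    until no new facts can be derived (a fixed point is reached)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)   # derive new knowledge from old
                changed = True
    return facts

# Hypothetical knowledge base: if it is rainy and we are outdoors, we get wet
rules = [({"rainy", "outdoors"}, "wet"),
         ({"wet"}, "needs_towel")]
derived = forward_chain({"rainy", "outdoors"}, rules)
```

Real automated reasoners work over full first-order (or richer) logics with unification, but the derive-until-fixed-point loop is the same in spirit.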
Object-Oriented Algorithm For Evaluation Of Fault Trees
NASA Technical Reports Server (NTRS)
Patterson-Hine, F. A.; Koen, B. V.
1992-01-01
Algorithm for direct evaluation of fault trees incorporates techniques of object-oriented programming. Reduces number of calls needed to solve trees with repeated events. Provides significantly improved software environment for such computations as quantitative analyses of safety and reliability of complicated systems of equipment (e.g., spacecraft or factories).
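As a hedged illustration of the idea, the object-oriented sketch below evaluates a small fault tree of AND/OR gates and caches each node's result per evaluation, so a repeated (shared) event is computed only once. It assumes independent branches, so with shared events the top probability is an approximation rather than an exact cut-set result, and none of this is the authors' actual algorithm.

```python
class Event:
    """Basic event with a fixed failure probability."""
    def __init__(self, p):
        self.p = p
    def prob(self, cache):
        if id(self) not in cache:          # repeated events hit the cache
            cache[id(self)] = self.p
        return cache[id(self)]

class AndGate:
    """Fails only if all children fail (independence assumed)."""
    def __init__(self, *children):
        self.children = children
    def prob(self, cache):
        if id(self) not in cache:
            result = 1.0
            for child in self.children:
                result *= child.prob(cache)
            cache[id(self)] = result
        return cache[id(self)]

class OrGate:
    """Fails if any child fails (independence assumed)."""
    def __init__(self, *children):
        self.children = children
    def prob(self, cache):
        if id(self) not in cache:
            survive = 1.0
            for child in self.children:
                survive *= 1.0 - child.prob(cache)
            cache[id(self)] = 1.0 - survive
        return cache[id(self)]

# A repeated event (pump) shared by two branches -- evaluated once per call
pump, valve = Event(0.1), Event(0.2)
top = OrGate(AndGate(pump, valve), pump)
p_top = top.prob({})
```

Passing a fresh cache dict per evaluation is what reduces the number of calls for trees with repeated events, the property the abstract highlights.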
Why Machine-Information Metaphors Are Bad for Science and Science Education
ERIC Educational Resources Information Center
Pigliucci, Massimo; Boudry, Maarten
2011-01-01
Genes are often described by biologists using metaphors derived from computational science: they are thought of as carriers of information, as being the equivalent of "blueprints" for the construction of organisms. Likewise, cells are often characterized as "factories" and organisms themselves become analogous to machines. Accordingly, when the…
An automated digital imaging system for environmental monitoring applications
Bogle, Rian; Velasco, Miguel; Vogel, John
2013-01-01
Recent improvements in the affordability and availability of high-resolution digital cameras, data loggers, embedded computers, and radio/cellular modems have advanced the development of sophisticated automated systems for remote imaging. Researchers have successfully placed and operated automated digital cameras in remote locations and in extremes of temperature and humidity, ranging from the islands of the South Pacific to the Mojave Desert and the Grand Canyon. With the integration of environmental sensors, these automated systems are able to respond to local conditions and modify their imaging regimes as needed. In this report we describe in detail the design of one type of automated imaging system developed by our group. It is easily replicated, low-cost, highly robust, and is a stand-alone automated camera designed to be placed in remote locations, without wireless connectivity.
Development of a plan for automating integrated circuit processing
NASA Technical Reports Server (NTRS)
1971-01-01
The operations analysis and equipment evaluations pertinent to the design of an automated production facility capable of manufacturing beam-lead CMOS integrated circuits are reported. The overall plan shows approximate cost of major equipment, production rate and performance capability, flexibility, and special maintenance requirements. Direct computer control is compared with supervisory-mode operations. The plan is limited to wafer processing operations from the starting wafer to the finished beam-lead die after separation etching. The work already accomplished in implementing various automation schemes, and the type of equipment which can be found for instant automation are described. The plan is general, so that small shops or large production units can perhaps benefit. Examples of major types of automated processing machines are shown to illustrate the general concepts of automated wafer processing.
Automating quantum experiment control
NASA Astrophysics Data System (ADS)
Stevens, Kelly E.; Amini, Jason M.; Doret, S. Charles; Mohler, Greg; Volin, Curtis; Harter, Alexa W.
2017-03-01
The field of quantum information processing is rapidly advancing. As the control of quantum systems approaches the level needed for useful computation, the physical hardware underlying the quantum systems is becoming increasingly complex. It is already becoming impractical to manually code control for the larger hardware implementations. In this chapter, we will employ an approach to the problem of system control that parallels compiler design for a classical computer. We will start with a candidate quantum computing technology, the surface electrode ion trap, and build a system instruction language which can be generated from a simple machine-independent programming language via compilation. We incorporate compile time generation of ion routing that separates the algorithm description from the physical geometry of the hardware. Extending this approach to automatic routing at run time allows for automated initialization of qubit number and placement and additionally allows for automated recovery after catastrophic events such as qubit loss. To show that these systems can handle real hardware, we present a simple demonstration system that routes two ions around a multi-zone ion trap and handles ion loss and ion placement. While we will mainly use examples from transport-based ion trap quantum computing, many of the issues and solutions are applicable to other architectures.
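The run-time ion routing described above reduces, at its simplest, to shortest-path search over the trap's zone connectivity graph. The sketch below is a generic illustration of that idea, assuming a trap modeled as an adjacency map of named zones; the zone names and topology are hypothetical and not taken from the chapter's demonstration system.

```python
from collections import deque

def route(adjacency: dict, start: str, goal: str):
    """Breadth-first search for a shortest zone-to-zone transport path.

    `adjacency` maps each trap zone to the zones an ion can be
    shuttled to directly. Returns the list of zones from `start`
    to `goal`, or None if the goal is unreachable.
    """
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in adjacency[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None
```

Computing routes at run time, rather than hard-coding them, is what allows the controller to re-plan after an event such as ion loss: the lost ion's qubit can be reassigned and a fresh path generated from wherever a replacement ion currently sits.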
Farahani, Navid; Liu, Zheng; Jutt, Dylan; Fine, Jeffrey L
2017-10-01
- Pathologists' computer-assisted diagnosis (pCAD) is a proposed framework for alleviating challenges through the automation of their routine sign-out work. Currently, hypothetical pCAD is based on a triad of advanced image analysis, deep integration with heterogeneous information systems, and a concrete understanding of traditional pathology workflow. Prototyping is an established method for designing complex new computer systems such as pCAD. - To describe, in detail, a prototype of pCAD for the sign-out of a breast cancer specimen. - Deidentified glass slides and data from breast cancer specimens were used. Slides were digitized into whole-slide images with an Aperio ScanScope XT, and screen captures were created by using vendor-provided software. The advanced workflow prototype was constructed by using PowerPoint software. - We modeled an interactive, computer-assisted workflow: pCAD previews whole-slide images in the context of integrated, disparate data and predefined diagnostic tasks and subtasks. Relevant regions of interest (ROIs) would be automatically identified and triaged by the computer. A pathologist's sign-out work would consist of an interactive review of important ROIs, driven by required diagnostic tasks. The interactive session would generate a pathology report automatically. - Using animations and real ROIs, the pCAD prototype demonstrates the hypothetical sign-out in a stepwise fashion, illustrating various interactions and explaining how steps can be automated. The file is publicly available and should be widely compatible. This mock-up is intended to spur discussion and to help usher in the next era of digitization for pathologists by providing desperately needed and long-awaited automation.
History of a Building Automation System.
ERIC Educational Resources Information Center
Martin, Anthony A.
1984-01-01
Having successfully used computer control in the solar-heated and cooled Terraset School, the Fairfax County, VA, Public Schools are now computerizing all their facilities. This article discusses the configuration and use of a countywide control system, reasons for the project's success, and problems of facility automation. (MCG)
Automated Instructional Management Systems (AIMS) Version III, Users Manual.
ERIC Educational Resources Information Center
New York Inst. of Tech., Old Westbury.
This document sets forth the procedures necessary to utilize and understand the operating characteristics of the Automated Instructional Management System - Version III, a computer-based system for management of educational processes. Directions for initialization, including internal and user files; system and operational input requirements;…
Ringling School of Art and Design Builds a CASTLE.
ERIC Educational Resources Information Center
Morse, Yvonne; Davis, Wendy
1984-01-01
Describes the development and installation of the Computer Automated Software for the Total Library Environment System (CASTLE), which uses a microcomputer to automate the operations of a small academic library in six main areas: circulation, online catalog, inventory and file maintenance, audiovisual equipment, accounting, and information and…
Air Force Tech Order Management System (AFTOMS). Automation Plan-Final Report. Version 1.0
DOT National Transportation Integrated Search
1988-02-01
Computer aided Acquisition and Logistics Support (CALS) is a Department of Defense (DoD) program designed to improve weapon systems support through digital automation. In June 1985, the joint industry/DoD Task Force on CALS issued a five volume repor...
An Overview of Automated Scoring of Essays
ERIC Educational Resources Information Center
Dikli, Semire
2006-01-01
Automated Essay Scoring (AES) is defined as the computer technology that evaluates and scores the written prose (Shermis & Barrera, 2002; Shermis & Burstein, 2003; Shermis, Raymat, & Barrera, 2003). AES systems are mainly used to overcome time, cost, reliability, and generalizability issues in writing assessment (Bereiter, 2003; Burstein,…
Providing Access to Library Automation Systems for Students with Disabilities.
ERIC Educational Resources Information Center
California Community Colleges, Sacramento. High-Tech Center for the Disabled.
This document provides information on the integration of assistive computer technologies and library automation systems at California Community Colleges in order to ensure access for students with disabilities. Topics covered include planning, upgrading, purchasing, implementing and using these technologies with library systems. As information…