Electronic Data Interchange in Procurement
1990-04-01
contract management and order processing systems. This conversion of automated information to paper and back to automated form is not only slow and...automated purchasing computer and the contractor’s order processing computer through telephone lines, as illustrated in Figure 1-1. Computer-to-computer...into the contractor’s order processing or contract management system. This approach - converting automated information to paper and back to automated
Proof-of-concept automation of propellant processing
NASA Technical Reports Server (NTRS)
Ramohalli, Kumar; Schallhorn, P. A.
1989-01-01
For space-based propellant production, automation of the process is needed. Currently, all phases of terrestrial production involve some form of human interaction. A mixer was acquired to help perform the tasks of automation. A heating system to be used with the mixer was designed, built, and installed, and tests performed on the heating system verify the design criteria. An IBM PS/2 personal computer was acquired for the future automation work. It is hoped that some of the mixing process itself will be automated. This is a concept-demonstration task, proving that propellant production can be automated reliably.
Film/Adhesive Processing Module for Fiber-Placement Processing of Composites
NASA Technical Reports Server (NTRS)
Hulcher, A. Bruce
2007-01-01
An automated apparatus has been designed and constructed that enables the automated lay-up of composite structures incorporating films, foils, and adhesives during the automated fiber-placement process. This apparatus, denoted a film module, could be used to deposit materials in film or thin sheet form either simultaneously when laying down the fiber composite article or in an independent step.
The role of behavioral decision theory for cockpit information management
NASA Technical Reports Server (NTRS)
Jonsson, Jon E.
1991-01-01
The focus of this report is the consideration of one form of cognition, judgment and decision making, while examining some information management issues associated with the implementation of new forms of automation. As technology matures and more tasks become suitable to automation, human factors researchers will have to consider the effect that increasing automation will have on operator performance. Current technology allows flight deck designers the opportunity to automate activities involving substantially more cognitive processing.
Effects of automation of information-processing functions on teamwork.
Wright, Melanie C; Kaber, David B
2005-01-01
We investigated the effects of automation as applied to different stages of information processing on team performance in a complex decision-making task. Forty teams of 2 individuals performed a simulated Theater Defense Task. Four automation conditions were simulated with computer assistance applied to realistic combinations of information acquisition, information analysis, and decision selection functions across two levels of task difficulty. Multiple measures of team effectiveness and team coordination were used. Results indicated different forms of automation have different effects on teamwork. Compared with a baseline condition, an increase in automation of information acquisition led to an increase in the ratio of information transferred to information requested; an increase in automation of information analysis resulted in higher team coordination ratings; and automation of decision selection led to better team effectiveness under low levels of task difficulty but at the cost of higher workload. The results support the use of early and intermediate forms of automation related to acquisition and analysis of information in the design of team tasks. Decision-making automation may provide benefits in more limited contexts. Applications of this research include the design and evaluation of automation in team environments.
Laboratory automation: total and subtotal.
Hawker, Charles D
2007-12-01
Worldwide, perhaps 2000 or more clinical laboratories have implemented some form of laboratory automation, either a modular automation system, such as for front-end processing, or a total laboratory automation system. This article provides descriptions and examples of these various types of automation. It also presents an outline of how a clinical laboratory that is contemplating automation should approach its decision and the steps it should follow to ensure a successful implementation. Finally, the role of standards in automation is reviewed.
A comparison of adaptive and adaptable automation under different levels of environmental stress.
Sauer, Juergen; Kao, Chung-Shan; Wastell, David
2012-01-01
The effectiveness of different forms of adaptive and adaptable automation was examined under low- and high-stress conditions, in the form of different levels of noise. Thirty-six participants were assigned to one of the three types of variable automation (adaptive event-based, adaptive performance-based and adaptable serving as a control condition). Participants received 3 h of training on a simulation of a highly automated process control task and were subsequently tested during a 4-h session under noise exposure and quiet conditions. The results for performance suggested no clear benefits of one automation control mode over the other two. However, it emerged that participants under adaptable automation adopted a more active system management strategy and reported higher levels of self-confidence than in the two adaptive control modes. Furthermore, the results showed higher levels of perceived workload, fatigue and anxiety for performance-based adaptive automation control than the other two modes. This study compared two forms of adaptive automation (where the automated system flexibly allocates tasks between human and machine) with adaptable automation (where the human allocates the tasks). The adaptable mode showed marginal advantages. This is of relevance, given that this automation mode may also be easier to design.
NASA Technical Reports Server (NTRS)
Doane, William J.; Hall, Ronald G.
1992-01-01
This paper describes the design and process development of low-cost structural parts made by a modified resin transfer molding process. Innovative application of elastomeric tooling to increase laminate fiber volume and automated forming of fiber preforms are discussed, as applied to fabrication of a representative section of a cruise missile fuselage.
NASA Technical Reports Server (NTRS)
1984-01-01
Automation requirements were developed for two manufacturing concepts: (1) Gallium Arsenide Electroepitaxial Crystal Production and Wafer Manufacturing Facility, and (2) Gallium Arsenide VLSI Microelectronics Chip Processing Facility. A functional overview of the ultimate design concept incorporating the two manufacturing facilities on the space station is provided. The concepts were selected to facilitate an in-depth analysis of manufacturing automation requirements in the form of process mechanization, teleoperation and robotics, sensors, and artificial intelligence. While the cost-effectiveness of these facilities was not analyzed, both appear entirely feasible for the year 2000 timeframe.
NASA Technical Reports Server (NTRS)
Erzberger, Heinz
1990-01-01
Information on the design of human-centered tools for terminal area air traffic control (ATC) is given in viewgraph form. Information is given on payoffs and products, guidelines, ATC as a team process, automation tools for ATC, and the traffic management advisor.
Adaptive automation of human-machine system information-processing functions.
Kaber, David B; Wright, Melanie C; Prinzel, Lawrence J; Clamann, Michael P
2005-01-01
The goal of this research was to describe the ability of human operators to interact with adaptive automation (AA) applied to various stages of complex systems information processing, defined in a model of human-automation interaction. Forty participants operated a simulation of an air traffic control task. Automated assistance was adaptively applied to information acquisition, information analysis, decision making, and action implementation aspects of the task based on operator workload states, which were measured using a secondary task. The differential effects of the forms of automation were determined and compared with a manual control condition. Results of two 20-min trials of AA or manual control revealed a significant effect of the type of automation on performance, particularly during manual control periods as part of the adaptive conditions. Humans appear to better adapt to AA applied to sensory and psychomotor information-processing functions (action implementation) than to AA applied to cognitive functions (information analysis and decision making), and AA is superior to completely manual control. Potential applications of this research include the design of automation to support air traffic controller information processing.
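The workload-based triggering described in this abstract can be sketched as a simple threshold rule. The logic below is assumed for illustration only and is not the study's actual adaptation algorithm; the stage names follow the four information-processing functions mentioned above, and the hysteresis thresholds are invented:

```python
# Minimal sketch of workload-triggered adaptive automation (assumed logic,
# not the study's actual algorithm): when secondary-task performance drops
# below a threshold, assistance is engaged for one information-processing
# stage; when it recovers, control returns to the operator.

STAGES = ["acquisition", "analysis", "decision", "action"]

def allocate(secondary_hit_rate, stage, low=0.6, high=0.8, automated=False):
    """Return True if `stage` should be automated this cycle.

    Hysteresis (separate low/high thresholds) avoids rapid toggling
    between manual and automated control.
    """
    if stage not in STAGES:
        raise ValueError(f"unknown stage: {stage}")
    if secondary_hit_rate < low:
        return True          # operator overloaded: engage automation
    if secondary_hit_rate > high:
        return False         # workload recovered: return manual control
    return automated         # in the dead band, keep the current mode

# Simulated secondary-task hit rates over five measurement cycles.
mode = False
for hit_rate in [0.9, 0.7, 0.5, 0.7, 0.85]:
    mode = allocate(hit_rate, "action", automated=mode)
```

Here automation engages on the third cycle (hit rate 0.5), persists through the dead band, and disengages once the hit rate exceeds the upper threshold.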
COINS: A composites information database system
NASA Technical Reports Server (NTRS)
Siddiqi, Shahid; Vosteen, Louis F.; Edlow, Ralph; Kwa, Teck-Seng
1992-01-01
An automated data abstraction form (ADAF) was developed to collect information on advanced fabrication processes and their related costs. The information will be collected for all components being fabricated as part of the ACT program and included in a COmposites INformation System (COINS) database. The aim of the COINS development effort is to provide future airframe preliminary design and fabrication teams with a tool through which production cost can become a deterministic variable in the design optimization process. The effort was initiated by the Structures Technology Program Office (STPO) of the NASA LaRC to implement the recommendations of a working group comprised of representatives from the commercial airframe companies. The principal working group recommendation was to re-institute collection of composite part fabrication data in a format similar to the DOD/NASA Structural Composites Fabrication Guide. The fabrication information collection form was automated with current user-friendly computer technology. This work-in-progress paper describes the new automated form and the features that make it easy to use by an aircraft structural design-manufacturing team.
75 FR 38087 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-01
... Repatriation Reporting System, DD Form 2585, OMB Control Number 0704- 0334. Needs and Uses: This information... from the DD Form 2585 is entered into an automated system; a series of reports is accessible to DoD... DD Form 2585, ``Repatriation Processing Center Processing Sheet'', has numerous functions, but is...
Lehmann, R; Gallert, C; Roddelkopf, T; Junginger, S; Wree, A; Thurow, K
2016-08-01
Cancer is a common disease of the population, caused by ageing and increased harmful environmental influences. Hence, new therapeutic strategies and compound screenings are necessary. Regular 2D cultivation has to be replaced by three-dimensional (3D) cell culturing for better simulation of in vivo conditions. 3D cultivation with an alginate matrix is an appropriate method for encapsulating cells to form cancer constructs. The automated manufacturing of alginate beads might be an ultimate method for large-scale manufacturing of constructs similar to cancer tissue. The aim of this study was the integration of fully automated systems for the production, cultivation and screening of 3D cell cultures. We compared the automated methods with the regular manual processes. Furthermore, we investigated the influence of antibiotics on these 3D cell culture systems. The alginate beads were formed by automated and manual procedures. The automated steps were processed by the Biomek(®) Cell Workstation (celisca, Rostock, Germany). The proliferation and toxicity were manually and automatically evaluated at days 14 and 35 of cultivation. The results visualized an accumulation and expansion of cell aggregates over the period of incubation. However, proliferation and toxicity were slightly, and in part significantly, decreased on day 35 compared to day 14. The comparison of the manual and automated methods displayed similar results. We conclude that the manual production process could be replaced by automation. Using automation, 3D cell cultures can be produced on an industrial scale and improve drug development and screening to treat serious illnesses like cancer.
aMCfast: automation of fast NLO computations for PDF fits
NASA Astrophysics Data System (ADS)
Bertone, Valerio; Frederix, Rikkert; Frixione, Stefano; Rojo, Juan; Sutton, Mark
2014-08-01
We present the interface between MadGraph5_aMC@NLO, a self-contained program that calculates cross sections up to next-to-leading-order accuracy in an automated manner, and APPLgrid, a code that parametrises such cross sections in the form of look-up tables which can be used for the fast computations needed in the context of PDF fits. The main characteristic of this interface, which we dub aMCfast, is that it is fully automated as well, which removes the need to extract manually the process-specific information for additional physics processes, as is the case with other matrix-element calculators, and renders it straightforward to include any new process in the PDF fits. We demonstrate this by studying several cases which are easily measured at the LHC, have a good constraining power on PDFs, and some of which were previously unavailable in the form of a fast interface.
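The fast-interface idea in this abstract can be sketched in a deliberately simplified toy form. The grid below is only a stand-in for the real APPLgrid tables (which also bin in Q² and in per-flavour parton luminosities); the point it illustrates is that the expensive calculation fills the weight table once, after which any candidate PDF can be convolved with it cheaply during a fit:

```python
# Toy sketch of a fast-interface look-up table (not the real APPLgrid
# format): the expensive NLO run is done once, filling a grid of weights;
# evaluating the cross section for a new PDF set is then just a cheap
# double sum over grid nodes.

X_NODES = [0.01, 0.1, 0.5]  # hypothetical momentum-fraction grid nodes

def fill_grid(events):
    """One-time step: bin weighted events onto the (x1, x2) grid."""
    n = len(X_NODES)
    grid = [[0.0] * n for _ in range(n)]
    for x1, x2, w in events:
        # Nearest-node binning; real grids use smooth interpolation.
        i = min(range(n), key=lambda k: abs(X_NODES[k] - x1))
        j = min(range(n), key=lambda k: abs(X_NODES[k] - x2))
        grid[i][j] += w  # weight already includes matrix element and phase space
    return grid

def convolve(grid, pdf):
    """Fast step: contract the stored weights with any candidate PDF."""
    return sum(grid[i][j] * pdf(X_NODES[i]) * pdf(X_NODES[j])
               for i in range(len(X_NODES)) for j in range(len(X_NODES)))

events = [(0.01, 0.1, 2.0), (0.5, 0.5, 1.0)]  # (x1, x2, weight), invented
grid = fill_grid(events)
sigma = convolve(grid, lambda x: 1.0 / x)     # toy PDF f(x) = 1/x
```

In a fit, `convolve` would be called thousands of times with trial PDFs, while `fill_grid` (the NLO run) is never repeated.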
Optimized and Automated design of Plasma Diagnostics for Additive Manufacture
NASA Astrophysics Data System (ADS)
Stuber, James; Quinley, Morgan; Melnik, Paul; Sieck, Paul; Smith, Trevor; Chun, Katherine; Woodruff, Simon
2016-10-01
Despite having mature designs, diagnostics are usually custom designed for each experiment. Most of the design can now be automated to reduce costs (engineering labor and capital cost). We present results from scripted physics modeling and parametric engineering design for common optical and mechanical components found in many plasma diagnostics, and outline the process for automated design optimization that employs scripts to communicate data from online forms through proprietary and open-source CAD and FE codes to provide a design that can be sent directly to a printer. As a demonstration of design automation, an optical beam dump, baffle and optical components are designed via an automated process and printed. Supported by DOE SBIR Grant DE-SC0011858.
Palttala, Iida; Heinämäki, Jyrki; Honkanen, Outi; Suominen, Risto; Antikainen, Osmo; Hirvonen, Jouni; Yliruusi, Jouko
2013-03-01
To date, little is known about the applicability of different types of pharmaceutical dosage forms to an automated high-speed multi-dose dispensing process. The purpose of the present study was to identify and further investigate various process-induced and/or product-related limitations associated with the multi-dose dispensing process. The rates of product defects and dose-dispensing errors in automated multi-dose dispensing were retrospectively investigated during a 6-month follow-up period. The study was based on the analysis of process data from a total of nine automated high-speed multi-dose dispensing systems. Special attention was paid to the dependence of multi-dose dispensing errors/product defects on pharmaceutical tablet properties (such as shape, dimensions, weight, scored lines, coatings, etc.) to profile the forms of tablets most suitable for automated dose dispensing systems. The relationship between the risk of dose-dispensing errors and tablet characteristics was visualized by creating a principal component analysis (PCA) model for the outcome of dispensed tablets. The two most common process-induced failures identified in multi-dose dispensing are predisposal of tablet defects and unexpected product transitions in the medication cassette (dose-dispensing errors). The tablet defects are product-dependent failures, while the tablet transitions depend on the automated multi-dose dispensing systems used. The occurrence of tablet defects is approximately twice as common as that of tablet transitions. An optimal tablet for high-speed multi-dose dispensing would be a round, relatively small- to middle-sized, film-coated tablet without any scored line. Commercial tablet products can be profiled and classified based on their suitability for a high-speed multi-dose dispensing process.
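The PCA modelling step mentioned in this abstract can be sketched as follows. The tablet descriptors and values below are entirely hypothetical (the study's actual variables and data are not reproduced); the sketch only shows the mechanics of standardising descriptors and projecting products onto principal components:

```python
import numpy as np

# Hypothetical tablet descriptors: diameter (mm), weight (mg),
# has_score_line (0/1), is_film_coated (0/1) for five invented products.
X = np.array([
    [7.0, 150.0, 0.0, 1.0],
    [8.0, 200.0, 0.0, 1.0],
    [12.0, 600.0, 1.0, 0.0],
    [11.0, 550.0, 1.0, 0.0],
    [9.0, 250.0, 0.0, 1.0],
])

# Standardise each descriptor, then diagonalise the covariance matrix.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
cov = np.cov(Xs, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]        # re-sort descending

# Project products onto the first two principal components; nearby points
# in this score plot would correspond to tablets with similar profiles.
scores = Xs @ eigvecs[:, order[:2]]
explained = eigvals[order] / eigvals.sum()
```

Because the invented descriptors are strongly correlated, the first component captures most of the variance, which is the situation in which a two-component score plot is informative.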
MOD control center automated information systems security evolution
NASA Technical Reports Server (NTRS)
Owen, Rich
1991-01-01
The role of the technology infusion process in future Control Center Automated Information Systems (AIS) is highlighted. The following subject areas are presented in the form of viewgraphs: goals, background, threat, MOD's AISS program, TQM, SDLC integration, payback, future challenges, and bottom line.
24 CFR 908.104 - Requirements.
Code of Federal Regulations, 2010 CFR
2010-04-01
... contracts with a service bureau to provide the system, the software must be periodically updated to.... Housing agencies that currently use automated software packages to transmit Forms HUD-50058 and HUD-50058... software required to develop and maintain an in-house automated data processing system (ADP) used to...
77 FR 6125 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-07
... services requesting participation in the Medicare program. This form initiates the process of obtaining a... promotes data reduction or introduction to and retrieval from the Automated Survey Process Environment... clinic data as a part of the RHC certification process. Therefore, the revised title is ``Form CMS-29...
Recent advancements in low cost solar cell processing
NASA Technical Reports Server (NTRS)
Ralph, E. L.
1975-01-01
A proof-of-concept solar cell process has been developed that is adaptable to automation. This involved the development of a new contact system, a new antireflection coating system, a drift field cell design and a new contoured surface treatment. All these processes are performed without the use of vacuum chambers and expensive masking techniques, thus providing the possibility of reduced costs by automation using conventional semiconductor processing machinery. The contacts were printed on the cells by conventional silk screen machinery. The P(+) back field was formed by diffusing in aluminum from a printed aluminum back contact. The antireflection coating was formed by spinning on and baking a TiO2-SiO2 glass film. Air-mass-zero efficiencies of over 10% were achieved using this completely vacuum-free process.
Influence of prepreg characteristics on stamp consolidation
NASA Astrophysics Data System (ADS)
Slange, T. K.; Warnet, L. L.; Grouve, W. J. B.; Akkerman, R.
2017-10-01
Stamp forming is a rapid manufacturing technology used to shape flat blanks of thermoplastic composite material into three-dimensional components. The development of automated lay-up technologies further extends the applicability of stamp forming by allowing rapid lay-up of tailored blanks and partial preconsolidation. This partial preconsolidation makes the influence of the prepreg more critical compared to conventional preconsolidation methods, which provide full preconsolidation. This paper aims to highlight consolidation challenges that can appear when stamp forming blanks manufactured by automated lay-up. Important prepreg characteristics were identified based on an experimental study in which various prepregs were compared in their as-received, deconsolidated and stamp-consolidated states. It was found that the accumulation of small thickness variations across the width of a prepreg when stacking plies into a blank by automated lay-up can cause non-uniform consolidation. Additionally, deconsolidation of the prepreg does not seem to obstruct interlaminar bonding, while intralaminar voids initially present in a prepreg cannot be removed during stamp forming. An additional preconsolidation step after automated lay-up seems necessary to remove blank thickness variations and intralaminar voids for the current prepregs. Eliminating this process step, and thus successfully combining rapid automated lay-up and stamp forming, requires prepregs which are void-free and have less thickness variation.
NASA Astrophysics Data System (ADS)
Bouquerel, Laure; Moulin, Nicolas; Drapier, Sylvain; Boisse, Philippe; Beraud, Jean-Marc
2017-10-01
While weight has so far been the main driver for the development of prepreg-based composite solutions for aeronautics, a new weight-cost trade-off tends to drive choices for next-generation aircraft. As a response, Hexcel has designed a new dry reinforcement type for aircraft primary structures, which combines the benefits of automation, the cost-effectiveness of out-of-autoclave processing, and mechanical performance competitive with prepreg solutions: HiTape® is a unidirectional (UD) dry carbon reinforcement with a thermoplastic veil on each side, designed for aircraft primary structures [1-3]. One privileged process route for HiTape® in high-volume automated processes consists in forming initially flat dry reinforcement stacks before resin infusion [4] or injection. Simulation of the forming step aims at predicting the geometry and mechanical properties of the formed stack (so-called preform) for process optimisation. Extensive work has been carried out on the forming behaviour and simulation of prepregs and dry woven fabrics, but interest in dry non-woven reinforcements has emerged more recently. Some work has been done on non-crimp fabrics, but studies on the forming behaviour of UDs are scarce and deal with UD prepregs only. Tension and bending in the fibre direction, along with inter-ply friction, have been identified as the main mechanisms controlling the HiTape® response during forming. Bending has been characterised using a modified Peirce's flexometer [5], and an inter-ply friction study is under development. Anisotropic hyperelastic constitutive models have been selected to represent the assumed decoupled deformation mechanisms. Model parameters are then identified from the associated experimental results. For forming simulation, a continuous approach at the macroscopic scale has been selected first, and simulation is carried out in the Zset framework [6] using proper shell finite elements.
Explicit control of adaptive automation under different levels of environmental stress.
Sauer, Jürgen; Kao, Chung-Shan; Wastell, David; Nickel, Peter
2011-08-01
This article examines the effectiveness of three different forms of explicit control of adaptive automation under low- and high-stress conditions, operationalised by different levels of noise. In total, 60 participants were assigned to one of three types of automation design (free, prompted and forced choice). They were trained for 4 h on a highly automated simulation of a process control environment, called AutoCAMS. This was followed by a 4-h testing session under noise exposure and quiet conditions. Measures of performance, psychophysiology and subjective reactions were taken. The results showed that all three modes of explicit control of adaptive automation modes were able to attenuate the negative effects of noise. This was partly due to the fact that operators opted for higher levels of automation under noise. It also emerged that forced choice showed marginal advantages over the two other automation modes. Statement of Relevance: This work is relevant to the design of adaptive automation since it emphasises the need to consider the impact of work-related stressors during task completion. During the presence of stressors, different forms of operator support through automation may be required than under more favourable working conditions.
NASA Out-of-Autoclave Process Technology Development
NASA Technical Reports Server (NTRS)
Johnston, Norman, J.; Clinton, R. G., Jr.; McMahon, William M.
2000-01-01
Polymer matrix composites (PMCs) will play a significant role in the construction of large reusable launch vehicles (RLVs), mankind's future major access to low Earth orbit and the International Space Station. PMCs are lightweight and offer attractive economies of scale and automated fabrication methodology. Fabrication of large RLV structures will require non-autoclave methods which have yet to be matured, including (1) thermoplastic forming: heated-head robotic tape placement, sheet extrusion, pultrusion, molding and forming; (2) electron beam curing: bulk and ply-by-ply automated placement; and (3) RTM and VARTM. Research sponsored by NASA in industrial and NASA laboratories on automated placement techniques involving the first two categories will be presented.
Automation of experimental research of waveguide paths induction soldering
NASA Astrophysics Data System (ADS)
Tynchenko, V. S.; Petrenko, V. E.; Kukartsev, V. V.; Tynchenko, V. V.; Antamoshkin, O. A.
2018-05-01
The article presents an automated system for experimental studies of the induction soldering of waveguide paths. The system is additional software for a complex that provides automated control of the technological process of induction soldering of thin-walled aluminum-alloy waveguide paths, expanding its capabilities. The structure of the software product, the general appearance of the controls, and the potential applications are presented. The utility of the developed application was demonstrated through a series of field experiments. The experimental research system makes it possible to improve the process under consideration by enabling fine-tuning of the control regulators and by recording soldering-process statistics in a form convenient for analysis.
Designing Automated Guidance to Promote Productive Revision of Science Explanations
ERIC Educational Resources Information Center
Tansomboon, Charissa; Gerard, Libby F.; Vitale, Jonathan M.; Linn, Marcia C.
2017-01-01
Supporting students to revise their written explanations in science can help students to integrate disparate ideas and develop a coherent, generative account of complex scientific topics. Using natural language processing to analyze student written work, we compare forms of automated guidance designed to motivate productive revision and help…
A review on automated sorting of source-separated municipal solid waste for recycling.
Gundupalli, Sathish Paulraj; Hait, Subrata; Thakur, Atul
2017-02-01
Sorting of useful materials from source-separated municipal solid waste (MSW) is a crucial prerequisite for recycling, which forms an integral part of MSW management. Researchers have been exploring automated sorting techniques to improve the overall efficiency of the recycling process. This paper reviews recent advances in the physical processes, sensors, and actuators used, as well as control- and autonomy-related issues, in the area of automated sorting and recycling of source-separated MSW. We believe that this paper will provide a comprehensive overview of the state of the art and will help future system designers in the area. We also present research challenges in the field of automated waste sorting and recycling.
Automation of NDE on RSRM Metal Components
NASA Technical Reports Server (NTRS)
Hartman, John; Kirby, Mark; McCool, Alex (Technical Monitor)
2002-01-01
An automated eddy current system has been designed and built, and is being implemented to inspect RSRM (Space Shuttle) metal components. The system provides a significant increase in inspection reliability, as well as other benefits such as data storage, chemical waste reduction and reduction in overall process time. This paper is in viewgraph form.
Development of a beam builder for automatic fabrication of large composite space structures
NASA Technical Reports Server (NTRS)
Bodle, J. G.
1979-01-01
The composite-material beam builder, which will produce triangular beams from pre-consolidated graphite/glass/thermoplastic composite material through automated mechanical processes, is presented; these processes include side-member storage, feed and positioning, ultrasonic welding, and beam cutoff. Each process lends itself to modular subsystem development. Initial development is concentrated on the key processes of roll forming and ultrasonic welding of composite thermoplastic materials. The construction and test of an experimental roll-forming machine and ultrasonic welding process-control techniques are described.
Advanced Airframe Structural Materials: A Primer and Cost Estimating Methodology
1991-01-01
laying machines for larger, mildly contoured parts such as wing and stabilizer skins. For such parts, automated tape laying machines can operate many...heat guns (90-130°F). However, thermoplastics require as much as 650°F for forming. Automated tape laying machines for these materials use warm...cycles to properly seat the plies onto the tool. This time-consuming process can sometimes be eliminated or reduced by the use of automated tape laying procedures
The Impact of Anonymization for Automated Essay Scoring
ERIC Educational Resources Information Center
Shermis, Mark D.; Lottridge, Sue; Mayfield, Elijah
2015-01-01
This study investigated the impact of anonymizing text on predicted scores made by two kinds of automated scoring engines: one that incorporates elements of natural language processing (NLP) and one that does not. Eight data sets (N = 22,029) were used to form both training and test sets in which the scoring engines had access to both text and…
Uncertainties in the Item Parameter Estimates and Robust Automated Test Assembly
ERIC Educational Resources Information Center
Veldkamp, Bernard P.; Matteucci, Mariagiulia; de Jong, Martijn G.
2013-01-01
Item response theory parameters have to be estimated, and because of the estimation process, they do have uncertainty in them. In most large-scale testing programs, the parameters are stored in item banks, and automated test assembly algorithms are applied to assemble operational test forms. These algorithms treat item parameters as fixed values,…
Automated knowledge generation
NASA Technical Reports Server (NTRS)
Myler, Harley R.; Gonzalez, Avelino J.
1988-01-01
The general objectives of the NASA/UCF Automated Knowledge Generation Project were the development of an intelligent software system that could access CAD design data bases, interpret them, and generate a diagnostic knowledge base in the form of a system model. The initial area of concentration is the diagnosis of the process control system using the Knowledge-based Autonomous Test Engineer (KATE) diagnostic system. A secondary objective was the study of general problems of automated knowledge generation. A prototype was developed, based on an object-oriented language (Flavors).
NASA Technical Reports Server (NTRS)
1984-01-01
The two manufacturing concepts developed represent innovative, technologically advanced manufacturing schemes. The concepts were selected to facilitate an in-depth analysis of manufacturing automation requirements in the form of process mechanization, teleoperation and robotics, and artificial intelligence. While the cost-effectiveness of these facilities has not been analyzed as part of this study, both appear entirely feasible for the year 2000 timeframe. The growing demand for high-quality gallium arsenide microelectronics may warrant the ventures.
NASA Technical Reports Server (NTRS)
Jackson, L. Neal; Crenshaw, John, Sr.; Hambright, R. N.; Nedungadi, A.; Mcfayden, G. M.; Tsuchida, M. S.
1989-01-01
A significant emphasis upon automation within the Space Biology Initiative hardware appears justified in order to conserve crew labor and crew training effort. Two generic forms of automation were identified: automation of data and information handling and decision making, and the automation of material handling, transfer, and processing. The use of automatic data acquisition, expert systems, robots, and machine vision will increase the volume of experiments and quality of results. The automation described may also influence efforts to miniaturize and modularize the large array of SBI hardware identified to date. The cost and benefit model developed appears to be a useful guideline for SBI equipment specifiers and designers. Additional refinements would enhance the validity of the model. Two NASA automation pilot programs, 'The Principal Investigator in a Box' and 'Rack Mounted Robots' were investigated and found to be quite appropriate for adaptation to the SBI program. There are other in-house NASA efforts that provide technology that may be appropriate for the SBI program. Important data is believed to exist in advanced medical labs throughout the U.S., Japan, and Europe. The information and data processing in medical analysis equipment is highly automated and future trends reveal continued progress in this area. However, automation of material handling and processing has progressed in a limited manner because the medical labs are not affected by the power and space constraints that Space Station medical equipment is faced with. Therefore, NASA's major emphasis in automation will require a lead effort in the automation of material handling to achieve optimal crew utilization.
Discovery informatics in biological and biomedical sciences: research challenges and opportunities.
Honavar, Vasant
2015-01-01
New discoveries in biological, biomedical and health sciences are increasingly being driven by our ability to acquire, share, integrate, and analyze data, and to construct and simulate predictive models of biological systems. While much attention has focused on automating routine aspects of management and analysis of "big data", realizing the full potential of "big data" to accelerate discovery calls for automating many other aspects of the scientific process that have so far largely resisted automation: identifying gaps in the current state of knowledge; generating and prioritizing questions; designing studies; designing, prioritizing, planning, and executing experiments; interpreting results; forming hypotheses; drawing conclusions; replicating studies; validating claims; documenting studies; communicating results; reviewing results; and integrating results into the larger body of knowledge in a discipline. Against this background, the PSB workshop on Discovery Informatics in Biological and Biomedical Sciences explores the opportunities and challenges of automating discovery, or assisting humans in discovery, through advances in (i) understanding, formalization, and information-processing accounts of the entire scientific process; (ii) design, development, and evaluation of the computational artifacts (representations, processes) that embody such understanding; and (iii) application of the resulting artifacts and systems to advance science (by augmenting individual or collective human efforts, or by fully automating science).
The application of automated operations at the Institutional Processing Center
NASA Technical Reports Server (NTRS)
Barr, Thomas H.
1993-01-01
The JPL Institutional and Mission Computing Division, Communications, Computing and Network Services Section, with its mission contractor, OAO Corporation, have for some time been applying automation to the operation of JPL's Information Processing Center (IPC). Automation does not come in one easy-to-use package. Automation for a data processing center is made up of many different software and hardware products supported by trained personnel. The IPC automation effort formally began with console automation and has since spiraled out to include production scheduling, data entry, report distribution, online reporting, failure reporting and resolution, documentation, library storage, and operator and user education, while requiring the interaction of multi-vendor and locally developed software. To begin the process, automation goals are determined. Then a team including operations personnel is formed to research and evaluate available options. By acquiring knowledge of current products and those in development, taking an active role in industry organizations, and learning from other data centers' experiences, a forecast can be developed as to what direction technology is moving. With IPC management's approval, an implementation plan is developed and resources are identified to test or implement new systems. As an example, IPC's new automated data entry system was researched by Data Entry, Production Control, and Advance Planning personnel. A proposal was then submitted to management for review. A determination to implement the new system was made, and the elements and personnel involved with the initial planning performed the implementation. The final steps of the implementation were educating data entry personnel in the affected areas and making the procedural changes necessary for the successful operation of the new system.
Vorberg, Ellen; Fleischer, Heidi; Junginger, Steffen; Liu, Hui; Stoll, Norbert; Thurow, Kerstin
2016-10-01
Life science areas require specific sample pretreatment to increase the concentration of the analytes and/or to convert the analytes into an appropriate form for the detection and separation systems. Various workstations are commercially available, allowing for automated biological sample pretreatment. Nevertheless, due to the required temperature, pressure, and volume conditions in typical element- and structure-specific measurements, these automated platforms are not suitable for such analytical processes. Thus, the purpose of the presented investigation was the design, realization, and evaluation of an automated system ensuring high-precision sample preparation for a variety of analytical measurements. The developed system has to enable system adaption and high performance flexibility. Furthermore, the system has to be capable of dealing with the wide range of required vessels simultaneously, allowing for less costly and less time-consuming process steps. The system's functionality has been confirmed in various validation sequences. Using element-specific measurements, the automated system was up to 25% more precise than the manual procedure, and it was as precise as the manual procedure using structure-specific measurements. © 2015 Society for Laboratory Automation and Screening.
The Virtual Mission Operations Center
NASA Technical Reports Server (NTRS)
Moore, Mike; Fox, Jeffrey
1994-01-01
Spacecraft management is becoming more human-intensive as spacecraft become more complex, and operations costs are growing accordingly. Several automation approaches have been proposed to lower these costs. However, most of these approaches are not flexible enough in the operations processes and levels of automation that they support. This paper presents a concept called the Virtual Mission Operations Center (VMOC) that provides highly flexible support for dynamic spacecraft management processes and automation. In a VMOC, operations personnel can be shared among missions, the operations team can change personnel and their locations, and automation can be added and removed as appropriate. The VMOC employs a form of on-demand supervisory control called management by exception to free operators from having to actively monitor their system. The VMOC extends management by exception, however, so that distributed, dynamic teams can work together. The VMOC uses work-group computing concepts and groupware tools to provide a team infrastructure, and it employs user agents to allow operators to define and control system automation.
Automated UMLS-Based Comparison of Medical Forms
Dugas, Martin; Fritz, Fleur; Krumm, Rainer; Breil, Bernhard
2013-01-01
Medical forms are very heterogeneous: on a European scale there are thousands of data items in several hundred different systems. To enable data exchange for clinical care and research purposes, there is a need to develop interoperable documentation systems with harmonized forms for data capture. A prerequisite in this harmonization process is the comparison of forms. So far – to our knowledge – an automated method for comparison of medical forms is not available. A form contains a list of data items with corresponding medical concepts. An automatic comparison needs data types, item names and, especially, items annotated with unique concept codes from medical terminologies. The scope of the proposed method is a comparison of these items by comparing their concept codes (coded in UMLS). Each data item is represented by item name, concept code and value domain. Two items are called identical if item name, concept code and value domain are the same. Two items are called matching if only concept code and value domain are the same. Two items are called similar if their concept codes are the same but the value domains are different. Based on these definitions, an open-source implementation for automated comparison of medical forms in ODM format with UMLS-based semantic annotations was developed. It is available as the package compareODM from http://cran.r-project.org. To evaluate this method, it was applied to a set of 7 real medical forms with 285 data items from a large public ODM repository with forms for different medical purposes (research, quality management, routine care). Comparison results were visualized with grid images and dendrograms. Automated comparison of semantically annotated medical forms is feasible. Dendrograms allow a view of clustered similar forms. The approach is scalable for a large set of real medical forms. PMID:23861827
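The abstract's three-way item classification (identical, matching, similar) can be summarized in a short sketch. This is an illustration of the stated definitions only; the `Item` class and its field values are invented and do not reflect the actual compareODM data structures.

```python
# Sketch of the item-comparison rules described in the abstract.
# Field values (names, UMLS concept codes, value domains) are illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class Item:
    name: str     # item name as printed on the form
    concept: str  # UMLS concept code (CUI), e.g. "C0871470"
    domain: str   # value domain, e.g. "integer", "boolean"

def compare(a: Item, b: Item) -> str:
    """Classify a pair of data items per the abstract's definitions."""
    if a.concept != b.concept:
        return "different"  # no shared concept: outside the three classes
    if a.domain != b.domain:
        return "similar"    # same concept code, different value domains
    if a.name == b.name:
        return "identical"  # name, concept code and value domain all agree
    return "matching"       # concept code and value domain agree, name differs
```

For example, two blood-pressure items labeled "Systolic BP" and "SBP" but sharing a concept code and integer domain would classify as "matching".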
AI and workflow automation: The prototype electronic purchase request system
NASA Technical Reports Server (NTRS)
Compton, Michael M.; Wolfe, Shawn R.
1994-01-01
Automating 'paper' workflow processes with electronic forms and email can dramatically improve the efficiency of those processes. However, applications that involve complex forms used for a variety of purposes, or that require numerous and varied approvals, often require additional software tools to ensure that (1) the electronic form is correctly and completely filled out, and (2) the form is routed to the proper individuals and organizations for approval. The prototype electronic purchase request (PEPR) system, which has been in pilot use at NASA Ames Research Center since December 1993, seamlessly links a commercial electronic forms package and a CLIPS-based knowledge system that first ensures that electronic forms are correct and complete, and then generates an 'electronic routing slip' that is used to route the form to the people who must sign it. The PEPR validation module is context-sensitive and can apply different validation rules at each step in the approval process. The PEPR system is form-independent and has been applied to several different types of forms. The system employs a version of CLIPS that has been extended to support AppleScript, a recently released scripting language for the Macintosh. This 'scriptability' provides both a transparent, flexible interface between the two programs and a means by which a single copy of the knowledge base can be utilized by numerous remote users.
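The validate-then-route pattern the abstract describes can be sketched roughly as follows. This is not the actual PEPR/CLIPS rule base; the field names, approval roles, and cost threshold are all invented for illustration.

```python
# Hypothetical sketch of form validation followed by routing-slip generation.
# Fields, roles, and the $5000 threshold are assumptions, not PEPR's rules.

REQUIRED_FIELDS = ("requester", "item", "cost", "charge_code")

def validate(form: dict) -> list:
    """Return a list of validation errors; an empty list means the form
    is complete enough to route."""
    return [f"missing field: {f}" for f in REQUIRED_FIELDS if not form.get(f)]

def routing_slip(form: dict) -> list:
    """Derive an ordered 'electronic routing slip' from the form content,
    applying context-sensitive rules per field values."""
    approvers = ["group_lead"]         # every request starts here
    if form["cost"] > 5000:            # hypothetical cost threshold
        approvers.append("division_chief")
    if form.get("hazardous"):          # extra approval for hazardous items
        approvers.append("safety_office")
    approvers.append("procurement")    # final stop for all requests
    return approvers
```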
The interaction of representation and reasoning.
Bundy, Alan
2013-09-08
Automated reasoning is an enabling technology for many applications of informatics. These applications include verifying that a computer program meets its specification; enabling a robot to form a plan to achieve a task and answering questions by combining information from diverse sources, e.g. on the Internet, etc. How is automated reasoning possible? Firstly, knowledge of a domain must be stored in a computer, usually in the form of logical formulae. This knowledge might, for instance, have been entered manually, retrieved from the Internet or perceived in the environment via sensors, such as cameras. Secondly, rules of inference are applied to old knowledge to derive new knowledge. Automated reasoning techniques have been adapted from logic, a branch of mathematics that was originally designed to formalize the reasoning of humans, especially mathematicians. My special interest is in the way that representation and reasoning interact. Successful reasoning is dependent on appropriate representation of both knowledge and successful methods of reasoning. Failures of reasoning can suggest changes of representation. This process of representational change can also be automated. We will illustrate the automation of representational change by drawing on recent work in my research group.
General Tool for Evaluating High-Contrast Coronagraphic Telescope Performance Error Budgets
NASA Technical Reports Server (NTRS)
Marchen, Luis F.
2011-01-01
The Coronagraph Performance Error Budget (CPEB) tool automates many of the key steps required to evaluate the scattered starlight contrast in the dark hole of a space-based coronagraph. The tool uses a Code V prescription of the optical train, and uses MATLAB programs to call ray-trace code that generates linear beam-walk and aberration sensitivity matrices for motions of the optical elements and line-of-sight pointing, with and without controlled fine-steering mirrors (FSMs). The sensitivity matrices are imported by macros into Excel 2007, where the error budget is evaluated. The user specifies the particular optics of interest, and chooses the quality of each optic from a predefined set of PSDs. The spreadsheet creates a nominal set of thermal and jitter motions, and combines that with the sensitivity matrices to generate an error budget for the system. CPEB also contains a combination of form and ActiveX controls with Visual Basic for Applications code to allow for user interaction in which the user can perform trade studies such as changing engineering requirements, and identifying and isolating stringent requirements. It contains summary tables and graphics that can be instantly used for reporting results in view graphs. The entire process to obtain a coronagraphic telescope performance error budget has been automated into three stages: conversion of optical prescription from Zemax or Code V to MACOS (in-house optical modeling and analysis tool), a linear models process, and an error budget tool process. The first process was improved by developing a MATLAB package based on the Class Constructor Method with a number of user-defined functions that allow the user to modify the MACOS optical prescription. The second process was modified by creating a MATLAB package that contains user-defined functions that automate the process. 
The user interfaces with the process by utilizing an initialization file where the user defines the parameters of the linear model computations. Other than this, the process is fully automated. The third process was developed based on the Terrestrial Planet Finder coronagraph Error Budget Tool, but was fully automated by using VBA code, form, and ActiveX controls.
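The roll-up step the abstract outlines, combining linear sensitivity matrices with assumed thermal and jitter motions into a contrast error budget, might be sketched as below. The root-sum-square combination and the array shapes are assumptions for illustration, not CPEB internals.

```python
# Illustrative error-budget roll-up (not the CPEB implementation):
# multiply a linear sensitivity matrix by assumed rms optic motions,
# then combine the per-term contrast contributions by root-sum-square.
import numpy as np

def contrast_budget(sensitivity: np.ndarray, motions: np.ndarray) -> float:
    """sensitivity: (n_terms, n_dof) contrast change per unit motion;
    motions: (n_dof,) assumed rms motion of each degree of freedom."""
    contributions = sensitivity @ motions              # per-term deltas
    return float(np.sqrt(np.sum(contributions ** 2)))  # RSS roll-up
```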
Effects of imperfect automation on decision making in a simulated command and control task.
Rovira, Ericka; McGarry, Kathleen; Parasuraman, Raja
2007-02-01
Effects of four types of automation support and two levels of automation reliability were examined. The objective was to examine the differential impact of information and decision automation and to investigate the costs of automation unreliability. Research has shown that imperfect automation can lead to differential effects of stages and levels of automation on human performance. Eighteen participants performed a "sensor to shooter" targeting simulation of command and control. Dependent variables included accuracy and response time of target engagement decisions, secondary task performance, and subjective ratings of mental workload, trust, and self-confidence. Compared with manual performance, reliable automation significantly reduced decision times. Under the higher automation reliability condition, unreliable automation imposed a greater cost on decision-making accuracy for the three forms of decision automation than for information automation. At low automation reliability, however, there was a cost in performance for both information and decision automation. The results are consistent with a model of human-automation interaction that requires evaluation of the different stages of information processing to which automation support can be applied. If fully reliable decision automation cannot be guaranteed, designers should provide users with information automation support or other tools that allow for inspection and analysis of raw data.
ClinicalTrials.gov as a data source for semi-automated point-of-care trial eligibility screening.
Pfiffner, Pascal B; Oh, JiWon; Miller, Timothy A; Mandl, Kenneth D
2014-01-01
Implementing semi-automated processes to efficiently match patients to clinical trials at the point of care requires both detailed patient data and authoritative information about open studies. To evaluate the utility of the ClinicalTrials.gov registry as a data source for semi-automated trial eligibility screening, eligibility criteria and metadata for 437 trials open for recruitment in four different clinical domains were identified in ClinicalTrials.gov. Trials were evaluated for up-to-date recruitment status, and eligibility criteria were evaluated for obstacles to automated interpretation. Finally, phone or email outreach to coordinators at a subset of the trials was made to assess the accuracy of contact details and recruitment status. 24% (104 of 437) of trials declaring an open recruitment status list a study completion date in the past, indicating out-of-date records. Substantial barriers to automated eligibility interpretation in free-form text are present in 81% to 94% of all trials. We were unable to contact coordinators at 31% (45 of 146) of the trials in the subset, either by phone or by email. Only 53% (74 of 146) would confirm that they were still recruiting patients. Because ClinicalTrials.gov has entries on most US and many international trials, the registry could be repurposed as a comprehensive trial-matching data source. Semi-automated point-of-care recruitment would be facilitated by matching the registry's eligibility criteria against clinical data from electronic health records. But the current entries fall short. Ultimately, improved techniques in natural language processing will facilitate semi-automated complex matching. As immediate next steps, we recommend augmenting ClinicalTrials.gov data entry forms to capture key eligibility criteria in a simple, structured format.
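The staleness check behind the 24% figure can be expressed in a few lines. The field names below are illustrative and do not reflect the actual ClinicalTrials.gov record schema.

```python
# Hedged sketch of the out-of-date-record check described in the abstract:
# a record declaring an open ("Recruiting") status while listing a study
# completion date in the past is flagged as stale.
from datetime import date

def is_stale(status: str, completion: date, today: date) -> bool:
    """True if the declared status and the completion date contradict."""
    return status.lower() == "recruiting" and completion < today
```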
Automated Array Assembly, Phase 2
NASA Technical Reports Server (NTRS)
Carbajal, B. G.
1979-01-01
The Automated Array Assembly Task, Phase 2 of the Low Cost Silicon Solar Array Project is a process development task. The contract provides for the fabrication of modules from large area tandem junction cells (TJC). During this quarter, effort was focused on the design of a large area, approximately 36 sq cm, TJC and process verification runs. The large area TJC design was optimized for minimum I²R power losses. In the TJM activity, the cell-module interfaces were defined, module substrates were formed and heat treated, and clad metal interconnect strips were fabricated.
NASA Goddard Space Flight Center Robotic Processing System Program Automation Systems, volume 2
NASA Technical Reports Server (NTRS)
Dobbs, M. E.
1991-01-01
Topics related to robot operated materials processing in space (RoMPS) are presented in view graph form. Some of the areas covered include: (1) mission requirements; (2) automation management system; (3) Space Transportation System (STS) Hitchhiker Payload; (4) Spacecraft Command Language (SCL) scripts; (5) SCL software components; (6) RoMPS EasyLab Command & Variable summary for rack stations and annealer module; (7) support electronics assembly; (8) SCL uplink packet definition; (9) SC-4 EasyLab System Memory Map; (10) Servo Axis Control Logic Suppliers; and (11) annealing oven control subsystem.
Complacency and Automation Bias in the Use of Imperfect Automation.
Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L
2015-08-01
We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.
Automated Quantitative Rare Earth Elements Mineralogy by Scanning Electron Microscopy
NASA Astrophysics Data System (ADS)
Sindern, Sven; Meyer, F. Michael
2016-09-01
Increasing industrial demand for rare earth elements (REEs) stems from the central role they play in advanced technologies and the accelerating move away from carbon-based fuels. However, REE production is often hampered by the chemical, mineralogical as well as textural complexity of the ores, with a need for better understanding of their salient properties. This is not only essential for in-depth genetic interpretations but also for a robust assessment of ore quality and economic viability. The design of energy- and cost-efficient processing of REE ores depends heavily on information about REE element deportment that can be made available employing automated quantitative process mineralogy. Quantitative mineralogy assigns numeric values to compositional and textural properties of mineral matter. Scanning electron microscopy (SEM) combined with a suitable software package for acquisition of backscatter electron and X-ray signals, phase assignment and image analysis is one of the most efficient tools for quantitative mineralogy. The four different SEM-based automated quantitative mineralogy systems that are commercially available, i.e. FEI QEMSCAN and MLA, Tescan TIMA and Zeiss Mineralogic Mining, are briefly characterized. Using examples of quantitative REE mineralogy, this chapter illustrates capabilities and limitations of automated SEM-based systems. Chemical variability of REE minerals and analytical uncertainty can reduce performance of phase assignment. This is shown for the REE phases parisite and synchysite. In another example from a monazite REE deposit, the quantitative mineralogical parameters surface roughness and mineral association derived from image analysis are applied for automated discrimination of apatite formed in a breakdown reaction of monazite and apatite formed by metamorphism prior to monazite breakdown.
SEM-based automated mineralogy fulfils all requirements for characterization of complex unconventional REE ores that will become increasingly important for supply of REEs in the future.
NASA Technical Reports Server (NTRS)
1975-01-01
The extent to which experiment hardware and operational requirements can be met by automatic control and material handling devices was investigated, and payload and system concepts that make extensive use of automation technology were defined. Hardware requirements for each experiment were established and tabulated, and investigations of applicable existing hardware were documented. The capabilities and characteristics of industrial automation equipment, controls, and techniques are presented in the form of a summary of applicable equipment characteristics in three basic mutually supporting formats. Facilities for performing groups of experiments are defined, along with four levitation groups and three furnace groups; the major hardware elements required to implement them were identified. A conceptual design definition of ten different automated processing facilities is presented, along with the specific equipment to implement each facility and the design layouts of the different units. Constraints and packaging, weight, and power requirements for six payloads postulated for shuttle missions in the 1979 to 1982 time period were examined.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agarwal, Vivek; Oxstrand, Johanna H.; Le Blanc, Katya L.
The work management process in current fleets of national nuclear power plants is so highly dependent on large technical staffs and on the quality of paper-based work instructions that it puts nuclear energy at a long-term economic disadvantage and increases the possibility of human error. Technologies like mobile portable devices and computer-based procedures can play a key role in improving the plant work management process, thereby increasing productivity and decreasing cost. Automated work packages are fundamentally an enabling technology for improving worker productivity and human performance in nuclear power plant work activities, because virtually every plant work activity is accomplished using some form of a work package. As part of this year's research effort, an automated work package architecture was identified, along with an initial set of requirements that are essential for implementation of automated work packages in nuclear power plants.
Management of processes of electrochemical dimensional processing
NASA Astrophysics Data System (ADS)
Akhmetov, I. D.; Zakirova, A. R.; Sadykov, Z. B.
2017-09-01
In many industries, high-precision parts are produced from scarce, difficult-to-machine materials. Such parts can be formed only by non-contact processing, or with minimal mechanical force, for example by electrochemical machining. At the present stage of development of metalworking processes, the management and automation of electrochemical machining are important issues. This article presents some indicators and factors of the electrochemical machining process.
NASA Astrophysics Data System (ADS)
A novel production technology in aircraft construction was developed for manufacturing parts of shapes and dimensions that are required only in small quantities for one machine. The process, called computerized integrated and automated manufacturing (CIAM), makes it possible to produce ready-to-install sheet-metal parts for all types of aircraft. All of the system's job sequences, which include milling the flat sheet-metal parts in stacks, deburring, heat treatment, and forming under the high-pressure rubber-pad press, are automated. The CIAM production center, called SIAM Forming, fulfills the prerequisites for the cost-effective production of sheet-metal parts made of aluminum alloys, titanium, or steel. The SIAM procedure results in negligible material loss through computerizing both component-contour nesting of the sheet-metal parts and contour milling.
Automated Solvent Seaming of Large Polyimide Membranes
NASA Technical Reports Server (NTRS)
Rood, Robert; Moore, James D.; Talley, Chris; Gierow, Paul A.
2006-01-01
A solvent-based welding process enables the joining of precise, cast polyimide membranes at their edges to form larger precise membranes. The process creates a homogeneous, optical-quality seam between abutting membranes, with no overlap and with only a very localized area of figure disturbance. The seam retains 90 percent of the strength of the parent material. The process was developed for original use in the fabrication of wide-aperture membrane optics, with areal densities of less than 1 kg/m2, for lightweight telescopes, solar concentrators, antennas, and the like to be deployed in outer space. The process is just as well applicable to the fabrication of large precise polyimide membranes for flat or inflatable solar concentrators and antenna reflectors for terrestrial applications. The process is applicable to cast membranes made of CP1 (or equivalent) polyimide. The process begins with the precise fitting together and fixturing of two membrane segments. The seam is formed by applying a metered amount of a doped solution of the same polyimide along the abutting edges of the membrane segments. After the solution has been applied, the fixtured films are allowed to dry and are then cured by convective heating. The weld material is the same as the parent material, so that what is formed is a homogeneous, strong joint that is almost indistinguishable from the parent material. The success of the process is highly dependent on formulation of the seaming solution from the correct proportion of the polyimide in a suitable solvent. In addition, the formation of reliable seams depends on the deposition of a precise amount of the seaming solution along the seam line. To ensure the required precision, deposition is performed by use of an automated apparatus comprising a modified commercially available, large-format, ink-jet print head on an automated positioning table. 
The printing head jets the seaming solution into the seam area at a rate controlled in coordination with the movement of the positioning table.
Designing Domain-Specific HUMS Architectures: An Automated Approach
NASA Technical Reports Server (NTRS)
Mukkamala, Ravi; Agarwal, Neha; Kumar, Pramod; Sundaram, Parthiban
2004-01-01
The HUMS automation system automates the design of HUMS architectures. The automated design process involves selection of solutions from a large space of designs as well as pure synthesis of designs. Hence the objective is to efficiently search for or synthesize designs, or parts of designs, in the database and to integrate them to form the entire system design. The automation system adopts two approaches to produce the designs: (a) a bottom-up approach and (b) a top-down approach. Both approaches are endowed with a suite of qualitative and quantitative techniques that enable (a) the selection of matching component instances, (b) the determination of design parameters, (c) the evaluation of candidate designs at the component level and at the system level, (d) the performance of cost-benefit analyses, (e) the performance of trade-off analyses, etc. In short, the automation system attempts to capitalize on the knowledge developed from years of experience in engineering, system design, and operation of HUMS systems in order to economically produce optimal, domain-specific designs.
Integrating Test-Form Formatting into Automated Test Assembly
ERIC Educational Resources Information Center
Diao, Qi; van der Linden, Wim J.
2013-01-01
Automated test assembly uses the methodology of mixed integer programming to select an optimal set of items from an item bank. Automated test-form generation uses the same methodology to optimally order the items and format the test form. From an optimization point of view, production of fully formatted test forms directly from the item pool using…
NASA Astrophysics Data System (ADS)
Makarycheva, A. I.; Faerman, V. A.
2017-02-01
An analysis of automation patterns is performed, and a programming solution for automated processing and storage of chromatographic data is developed using a software package built on Mathcad and MS Excel spreadsheets. The approach allows the data-processing algorithm to be modified without the participation of programming experts. It provides measurement of retention times and retention volumes, specific retention volumes, differential molar free energy of adsorption, partial molar solution enthalpies, and isosteric heats of adsorption. The solution is aimed at small research groups and was tested on a series of new gas-chromatography sorbents. Retention parameters and thermodynamic sorption quantities were calculated for more than 20 analytes. The resulting data are presented in a form suitable for comparative analysis, making it possible to identify sorbents with the most favorable properties for specific analytical problems.
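The thermodynamic quantities above follow from standard gas-chromatography relations; a minimal Python sketch (hypothetical retention data, flow rate, and sorbent mass, not the authors' Mathcad/Excel implementation) might look like:

```python
import math

R = 8.314  # J/(mol*K), universal gas constant

def specific_retention_volume(t_r, t_0, flow_ml_min, mass_g, t_col_k):
    """Net retention time -> specific retention volume Vg (mL/g),
    corrected to 273.15 K as in standard GC practice."""
    v_net = (t_r - t_0) * flow_ml_min            # net retention volume, mL
    return v_net * 273.15 / (t_col_k * mass_g)   # per gram, T-corrected

def adsorption_free_energy(vg, t_col_k):
    """Differential molar free energy of adsorption, kJ/mol
    (more negative = stronger retention)."""
    return -R * t_col_k * math.log(vg) / 1000.0

# hypothetical analyte on a test sorbent: retention 412 s, dead time 35 s,
# 30 mL/min carrier flow, 2.1 g sorbent, 120 C column
vg = specific_retention_volume(t_r=412.0, t_0=35.0, flow_ml_min=30.0,
                               mass_g=2.1, t_col_k=393.15)
dg = adsorption_free_energy(vg, 393.15)
```

Repeating the calculation at several column temperatures would give the van 't Hoff data needed for the enthalpies and isosteric heats mentioned above.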
Towards a characterization of information automation systems on the flight deck
NASA Astrophysics Data System (ADS)
Dudley, Rachel Feddersen
This thesis summarizes research to investigate the characteristics that define information automation systems used on aircraft flight decks and the significant impacts that these characteristics have on pilot performance. Major accomplishments of the work include the development of a set of characteristics that describe information automation systems on the flight deck and an experiment designed to study a subset of these characteristics. Information automation systems on the flight deck are responsible for the collection, processing, analysis, and presentation of data to the flight crew. These systems pose human factors issues and challenges that must be considered by their designers. Based on a previously developed formal definition of information automation for aircraft flight deck systems, an analysis process was developed and conducted to reach a refined set of information automation characteristics. In this work, characteristics are defined as a set of properties or attributes that describe an information automation system's operation or behavior, which can be used to identify and assess potential human factors issues. Hypotheses were formed for a subset of the characteristics: Automation Visibility, Information Quality, and Display Complexity. An experimental investigation was developed to measure performance impacts related to these characteristics, which yielded a mix of expected and surprising findings, with many interactions. A set of recommendations was then developed based on the experimental observations. Ensuring that the right information is presented to pilots at the right time and in the appropriate manner is the job of flight deck system designers. This work provides a foundation for developing recommendations and guidelines specific to information automation on the flight deck, with the goal of improving the design and evaluation of information automation systems before they are implemented.
Wiemken, Timothy L; Furmanek, Stephen P; Mattingly, William A; Haas, Janet; Ramirez, Julio A; Carrico, Ruth M
2018-06-01
Hand hygiene is one of the most important interventions in the quest to eliminate healthcare-associated infections, and rates in healthcare facilities are markedly low. Since hand hygiene observation and feedback are critical to improve adherence, we created an easy-to-use, platform-independent hand hygiene data collection process and an automated, on-demand reporting engine. A 3-step approach was used for this project: 1) creation of a data collection form using Google Forms, 2) transfer of data from the form to a spreadsheet using Google Spreadsheets, and 3) creation of an automated, cloud-based analytics platform for report generation using R and RStudio Shiny software. A video tutorial of all steps in the creation and use of this free tool can be found on our YouTube channel: https://www.youtube.com/watch?v=uFatMR1rXqU&t. The on-demand reporting tool can be accessed at: https://crsp.louisville.edu/shiny/handhygiene. This data collection and automated analytics engine provides an easy-to-use environment for evaluating hand hygiene data; it also provides rapid feedback to healthcare workers. By reducing some of the data management workload required of the infection preventionist, more focused interventions may be instituted to increase global hand hygiene rates and reduce infection. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
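The reporting side of such a pipeline reduces to computing adherence rates from the collected observations; a minimal Python sketch (hypothetical unit names and records, not the authors' R/RStudio Shiny code) could be:

```python
from collections import defaultdict

def adherence_by_unit(observations):
    """observations: iterable of (unit, performed) tuples, with
    performed True when hand hygiene was carried out.
    Returns {unit: adherence fraction}."""
    counts = defaultdict(lambda: [0, 0])  # unit -> [performed, total]
    for unit, performed in observations:
        counts[unit][0] += int(performed)
        counts[unit][1] += 1
    return {u: done / total for u, (done, total) in counts.items()}

# hypothetical observation records, as they might arrive from the form
obs = [("ICU", True), ("ICU", False), ("ICU", True), ("Med-Surg", True)]
rates = adherence_by_unit(obs)
```

In the published workflow the records would come from the Google Spreadsheet and the rates would feed the on-demand Shiny report rather than a Python dict.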
NASA Astrophysics Data System (ADS)
Nabavi, N.
2018-07-01
The author investigates monitoring methods for fine adjustment of a previously proposed on-chip architecture for frequency multiplication and translation of harmonics by design. Digital signal processing (DSP) algorithms are used to optimize the functionality of a microwave photonic integrated circuit toward automated frequency multiplication. The implemented DSP algorithms are based on the discrete Fourier transform and on optimization algorithms (greedy and gradient-based), which are analytically derived and numerically compared in terms of accuracy and speed of convergence.
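The DFT part of such a monitoring scheme can be sketched in Python with NumPy (synthetic signal and hypothetical sample rate; the actual on-chip tuning loop and photonic model are not reproduced here):

```python
import numpy as np

fs = 10_000.0                    # sample rate, Hz (hypothetical)
t = np.arange(0, 1.0, 1 / fs)
f0 = 100.0                       # fundamental, Hz

# hypothetical photodetector output: fundamental plus a 3rd harmonic
signal = 1.0 * np.sin(2 * np.pi * f0 * t) + 0.4 * np.sin(2 * np.pi * 3 * f0 * t)

# single-sided amplitude spectrum via the real FFT
spectrum = np.abs(np.fft.rfft(signal)) / (len(t) / 2)
freqs = np.fft.rfftfreq(len(t), 1 / fs)

def harmonic_amplitude(n):
    """Amplitude of the n-th harmonic, read off the DFT bin nearest n*f0."""
    return spectrum[np.argmin(np.abs(freqs - n * f0))]
```

A greedy or gradient-based tuning loop, as in the abstract, would adjust the circuit's control parameters to maximize `harmonic_amplitude` at the desired multiple while suppressing the others.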
Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0
Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; ...
2008-01-01
The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments present a challenge to managing and processing the information. Simply characterizing the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.
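The scripted correlation analysis mentioned above can be illustrated, much simplified, in a few lines of NumPy (hypothetical per-trial metrics; PerfExplorer's own scripting API is not shown):

```python
import numpy as np

# hypothetical per-trial metrics: columns = [time_s, cache_misses, flops]
trials = np.array([
    [12.1, 3.2e6, 8.0e9],
    [10.4, 2.9e6, 8.1e9],
    [15.8, 4.4e6, 7.9e9],
    [11.0, 3.0e6, 8.2e9],
])

# correlate each metric with every other across trials, a scripted
# stand-in for the framework's correlation analysis
corr = np.corrcoef(trials, rowvar=False)
time_vs_misses = corr[0, 1]   # runtime tracks cache misses in this data
```

In PerfExplorer such a computation would be captured once as a script and rerun automatically over each new batch of experiments.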
Functional-to-form mapping for assembly design automation
NASA Astrophysics Data System (ADS)
Xu, Z. G.; Liu, W. M.; Shen, W. D.; Yang, D. Y.; Liu, T. T.
2017-11-01
Assembly-level function-to-form mapping is the most effective procedure towards design automation. The research work mainly includes the assembly-level function definitions, the product network model, and the two-step mapping mechanism. The function-to-form mapping is divided into two steps: function-to-behavior mapping (the first step) and behavior-to-structure mapping (the second step). After the first mapping step, the three-dimensional transmission chain (or 3D sketch) is studied, and feasible design computing tools are developed. The mapping procedure is relatively easy to implement interactively but quite difficult to complete automatically, so manual, semi-automatic and automatic modes, together with interactive modification of the mapping model, are studied. A function-to-form mapping process for a mechanical hand is illustrated to verify the design methodology.
Designing automation for complex work environments under different levels of stress.
Sauer, Juergen; Nickel, Peter; Wastell, David
2013-01-01
This article examines the effectiveness of different forms of static and adaptable automation under low- and high-stress conditions. Forty participants were randomly assigned to one of four experimental conditions, comparing three levels of static automation (low, medium and high) and one level of adaptable automation, with the environmental stressor (noise) being varied as a within-subjects variable. Participants were trained for 4 h on a simulation of a process control environment, called AutoCAMS, followed by a 2.5-h testing session. Measures of performance, psychophysiology and subjective reactions were taken. The results showed that operators preferred higher levels of automation under noise than under quiet conditions. A number of parameters indicated negative effects of noise exposure, such as performance impairments, physiological stress reactions and higher mental workload. It also emerged that adaptable automation provided advantages over low and intermediate static automation, with regard to mental workload, effort expenditure and diagnostic performance. The article concludes that for the design of automation a wider range of operational scenarios reflecting adverse as well as ideal working conditions needs to be considered. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.
NASA Technical Reports Server (NTRS)
Mardesich, N.; Garcia, A.; Bunyan, S.; Pepe, A.
1979-01-01
The technological readiness of the proposed process sequence was reviewed. Process steps evaluated include: (1) plasma etching to establish a standard surface; (2) forming junctions by diffusion from an N-type polymeric spray-on source; (3) forming a p+ back contact by firing a screen printed aluminum paste; (4) forming screen printed front contacts after cleaning the back aluminum and removing the diffusion oxide; (5) cleaning the junction by a laser scribe operation; (6) forming an antireflection coating by baking a polymeric spray-on film; (7) ultrasonically tin padding the cells; and (8) assembling cell strings into solar circuits using ethylene vinyl acetate as an encapsulant and laminating medium.
NASA Technical Reports Server (NTRS)
Hayashi, Miwa; Ravinder, Ujwala; McCann, Robert S.; Beutter, Brent; Spirkovska, Lily
2009-01-01
Performance enhancements associated with selected forms of automation were quantified in a recent human-in-the-loop evaluation of two candidate operational concepts for fault management on next-generation spacecraft. The baseline concept, called Elsie, featured a full-suite of "soft" fault management interfaces. However, operators were forced to diagnose malfunctions with minimal assistance from the standalone caution and warning system. The other concept, called Besi, incorporated a more capable C&W system with an automated fault diagnosis capability. Results from analyses of participants' eye movements indicate that the greatest empirical benefit of the automation stemmed from eliminating the need for text processing on cluttered, text-rich displays.
Automated Test-Form Generation
ERIC Educational Resources Information Center
van der Linden, Wim J.; Diao, Qi
2011-01-01
In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…
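A toy illustration of the selection problem (brute force over combinations rather than mixed-integer programming, with hypothetical item parameters) is:

```python
from itertools import combinations

def assemble_form(items, length, max_time):
    """Pick `length` items maximizing total information subject to a
    testing-time limit -- a brute-force stand-in for the MIP model.
    items: list of dicts with 'info' (Fisher information at the target
    ability) and 'time' (expected seconds) keys."""
    best, best_info = None, -1.0
    for combo in combinations(range(len(items)), length):
        info = sum(items[i]["info"] for i in combo)
        time = sum(items[i]["time"] for i in combo)
        if time <= max_time and info > best_info:
            best, best_info = combo, info
    return best, best_info

# hypothetical 4-item bank
bank = [{"info": 0.9, "time": 60}, {"info": 0.7, "time": 30},
        {"info": 0.8, "time": 90}, {"info": 0.5, "time": 20}]
picked, info = assemble_form(bank, length=2, max_time=100)
```

Real ATA replaces the enumeration with an MIP solver so that banks of thousands of items and dozens of content constraints remain tractable; formatting constraints, as in the article, simply add more variables to the same model.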
JWST Associations overview: automated generation of combined products
NASA Astrophysics Data System (ADS)
Alexov, Anastasia; Swade, Daryl; Bushouse, Howard; Diaz, Rosa; Eisenhamer, Jonathan; Hack, Warren; Kyprianou, Mark; Levay, Karen; Rahmani, Christopher; Swam, Mike; Valenti, Jeff
2018-01-01
We present the design of the James Webb Space Telescope (JWST) Data Management System (DMS) automated processing of Associations. An Association captures the relationship between exposures and higher-level data products, such as combined mosaics created from dithered and tiled observations. The astronomer's intent is captured within the Proposal Planning System (PPS) and provided to DMS as candidate associations. These candidates are converted into Association Pools and Association Generator Tables that serve as input to the automated processing that creates the combined data products. Association Pools capture a list of exposures that could potentially form associations and provide relevant information about those exposures. Using grouping definitions, the Association Generator creates one or more Association Tables from a single input Association Pool. Each Association Table defines a set of exposures to be combined and the ruleset of the combination to be performed; the calibration software creates associated data products based on these input tables. The initial design produces automated Associations within a proposal. The overall design is also conducive to eventually producing Associations for observations from multiple proposals, similar to the Hubble Legacy Archive (HLA).
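The pooling step can be illustrated with a simplified sketch (hypothetical exposure records and a made-up grouping key; the real rulesets consider many more observation parameters):

```python
from collections import defaultdict

def build_candidate_associations(exposures):
    """Group exposures that could be combined into one product.
    Here the grouping key is (target, filter) -- a simplified stand-in
    for the candidate-association rules delivered by PPS."""
    pools = defaultdict(list)
    for exp in exposures:
        pools[(exp["target"], exp["filter"])].append(exp["id"])
    # only groups with more than one exposure yield a combined product
    return {key: ids for key, ids in pools.items() if len(ids) > 1}

# hypothetical exposure pool
exposures = [
    {"id": "e1", "target": "NGC-1365", "filter": "F200W"},
    {"id": "e2", "target": "NGC-1365", "filter": "F200W"},
    {"id": "e3", "target": "NGC-1365", "filter": "F444W"},
]
tables = build_candidate_associations(exposures)
```

In the actual pipeline each surviving group would become an Association Table carrying the ruleset the calibration software applies when combining the exposures.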
An atmosphere protection subsystem in the thermal power station automated process control system
NASA Astrophysics Data System (ADS)
Parchevskii, V. M.; Kislov, E. A.
2014-03-01
The development of methodological and mathematical support for an atmosphere protection subsystem within a thermal power station's automated process control system is considered, using as an example the control of nitrogen oxide emissions at a gas- and oil-fired thermal power station. The combined environmental-and-economic characteristics of the boilers, which relate the cost of suppressing emissions to the boiler steam load and the mass discharge of nitrogen oxides in analytic form, serve as the main tool for optimal control. A procedure for constructing and applying these environmental-and-economic characteristics on the basis of the technical facilities available in modern instrumentation and control systems is presented.
Automated structure solution, density modification and model building.
Terwilliger, Thomas C
2002-11-01
The approaches that form the basis of automated structure solution in SOLVE and RESOLVE are described. The use of a scoring scheme to convert decision making in macromolecular structure solution to an optimization problem has proven very useful, and in many cases a single clear heavy-atom solution can be obtained and used for phasing. Statistical density modification is well suited to an automated approach to structure solution because the method is relatively insensitive to choices of numbers of cycles and solvent content. The detection of non-crystallographic symmetry (NCS) in heavy-atom sites and checking of potential NCS operations against the electron-density map has proven to be a reliable method for identification of NCS in most cases. Automated model building beginning with an FFT-based search for helices and sheets has been successful for maps with resolutions as low as 3 Å. The entire process can be carried out in a fully automatic fashion in many cases.
77 FR 1093 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-09
..., including the use of automated collection techniques or other forms of information technology. The application process obtains information from an applicant about their marital history, work history, military service, and benefits...
Cerebellum engages in automation of verb-generation skill.
Yang, Zhi; Wu, Paula; Weng, Xuchu; Bandettini, Peter A
2014-03-01
Numerous studies have shown cerebellar involvement in item-specific association, a form of explicit learning. However, very few have demonstrated cerebellar participation in automation of non-motor cognitive tasks. Applying fMRI to a repeated verb-generation task, we sought to distinguish cerebellar involvement in learning of item-specific noun-verb association and automation of verb generation skill. The same set of nouns was repeated in six verb-generation blocks so that subjects practiced generating verbs for the nouns. The practice was followed by a novel block with a different set of nouns. The cerebellar vermis (IV/V) and the right cerebellar lobule VI showed decreased activation following practice; activation in the right cerebellar Crus I was significantly lower in the novel challenge than in the initial verb-generation task. Furthermore, activation in this region during well-practiced blocks strongly correlated with improvement of behavioral performance in both the well-practiced and the novel blocks, suggesting its role in the learning of general mental skills not specific to the practiced noun-verb pairs. Therefore, the cerebellum processes both explicit verbal associative learning and automation of cognitive tasks. Different cerebellar regions predominate in this processing: lobule VI during the acquisition of item-specific association, and Crus I during automation of verb-generation skills through practice.
Automated inspection of gaps on the free-form shape parts by laser scanning technologies
NASA Astrophysics Data System (ADS)
Zhou, Sen; Xu, Jian; Tao, Lei; An, Lu; Yu, Yan
2018-01-01
In industrial manufacturing processes, the dimensional inspection of gaps on free-form shape parts is critical and challenging, and directly affects subsequent assembly and final product quality. In this paper, a fast measuring method for automated gap inspection based on laser scanning technologies is presented. The proposed measuring method consists of three steps: firstly, the relative position is determined according to the geometric feature of the measured gap, taking into account the constraints of a laser scanning operation. Secondly, in order to acquire a complete gap profile, a fast and effective scanning path is designed. Finally, the gap dimensions, including width, depth and flush, are described in a virtual environment. In the future, a machine based on the proposed method will be developed for on-line dimensional inspection of gaps on automobile or aerospace production lines.
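Extracting gap width and depth from a single scanned profile can be sketched as follows (synthetic profile and an illustrative threshold; the paper's full method also handles flush and the 3-D scanning pose):

```python
import numpy as np

def gap_metrics(x, z, drop=0.5):
    """Width and depth of a gap in a laser-line profile (units: mm).
    The gap is wherever z falls more than `drop` below the surface level."""
    surface = np.median(z)          # robust surface-height estimate
    idx = np.flatnonzero(z < surface - drop)
    if idx.size == 0:
        return 0.0, 0.0
    width = x[idx[-1]] - x[idx[0]]  # extent of the sub-surface region
    depth = surface - z[idx].min()  # deepest point below the surface
    return width, depth

# synthetic profile: a 2 mm wide, 2 mm deep slot between x = 4 and 6 mm
x = np.linspace(0.0, 10.0, 101)
z = np.zeros_like(x)
z[40:61] = -2.0
w, d = gap_metrics(x, z)
```

A production system would fuse many such profiles along the designed scanning path to reconstruct the full gap geometry.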
Robotics in space-age manufacturing
NASA Technical Reports Server (NTRS)
Jones, Chip
1991-01-01
Robotics technologies are developed to improve manufacturing of space hardware. The following applications of robotics are covered: (1) welding for the space shuttle and space station Freedom programs; (2) manipulation of high-pressure water for shuttle solid rocket booster refurbishment; (3) automating the application of insulation materials; (4) precision application of sealants; and (5) automation of inspection procedures. Commercial robots are used for these development programs, but they are teamed with advanced sensors, process controls, and computer simulation to form highly productive manufacturing systems. Many of the technologies are also being actively pursued in private sector manufacturing operations.
Optoelectronic image processing for cervical cancer screening
NASA Astrophysics Data System (ADS)
Narayanswamy, Ramkumar; Sharpe, John P.; Johnson, Kristina M.
1994-05-01
Automation of the Pap-smear cervical screening method is highly desirable, as it relieves tedium for the human operators, reduces cost, and should increase accuracy and provide repeatability. We present here the design of a high-throughput optoelectronic system that forms the first stage of a two-stage system to automate Pap-smear screening. We use a mathematical morphological technique called the hit-or-miss transform to identify the suspicious areas on a Pap-smear slide. This algorithm is implemented using a VanderLugt architecture and a time-sequential ANDing smart pixel array.
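A digital reference implementation of the hit-or-miss transform (plain NumPy, not the optical VanderLugt implementation described above) might look like:

```python
import numpy as np

def hit_or_miss(img, hit, miss):
    """Binary hit-or-miss transform: a pixel fires when the `hit` mask
    fits the foreground and the `miss` mask fits the background."""
    kh, kw = hit.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))  # zero-pad the borders
    out = np.zeros_like(img)
    for r in range(img.shape[0]):
        for c in range(img.shape[1]):
            win = padded[r:r + kh, c:c + kw]
            if np.all(win[hit == 1] == 1) and np.all(win[miss == 1] == 0):
                out[r, c] = 1
    return out

# structuring pair that detects isolated single-pixel objects
hit = np.array([[0, 0, 0], [0, 1, 0], [0, 0, 0]])
miss = np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]])
img = np.zeros((5, 5), dtype=int)
img[2, 2] = 1              # isolated pixel -> detected
img[0, 0] = img[0, 1] = 1  # touching pixels -> rejected
res = hit_or_miss(img, hit, miss)
```

Screening for suspicious nuclei would use larger structuring pairs shaped like the targets of interest; the optical system evaluates the same logic in parallel across the whole slide image.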
Computer-Aided Software Engineering - An approach to real-time software development
NASA Technical Reports Server (NTRS)
Walker, Carrie K.; Turkovich, John J.
1989-01-01
A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.
NASA Astrophysics Data System (ADS)
Kuhtz, M.; Maron, B.; Hornig, A.; Müller, M.; Langkamp, A.; Gude, M.
2018-05-01
Textile-reinforced thermoplastic composites are predestined for highly automated medium- and high-volume production processes. The presented work focuses on experimental studies of different types of glass fibre reinforced polypropylene (GF-PP) semi-finished thermoplastic textiles to characterise their forming behaviour. The main deformation modes, fabric shear, tension, through-thickness compression and bending, are investigated with special emphasis on the influence of the textile structure and the temperature and rate dependence of deformation. Understanding the fundamental forming behaviour is required to allow FEM-based assessment and improvement of thermoforming process chains.
University of Arizona: College and University Systems Environment.
ERIC Educational Resources Information Center
CAUSE/EFFECT, 1985
1985-01-01
The University of Arizona has begun to reorganize campus computing. Six working groups were formed to address six areas of computing: academic computing, library automation, administrative data processing and information systems, writing and graphics, video and audio services, and outreach and public service. (MLW)
24 CFR 908.104 - Requirements.
Code of Federal Regulations, 2014 CFR
2014-04-01
... ELECTRONIC TRANSMISSION OF REQUIRED FAMILY DATA FOR PUBLIC HOUSING, INDIAN HOUSING, AND THE SECTION 8 RENTAL.... Housing agencies that currently use automated software packages to transmit Forms HUD-50058 and HUD-50058-FSS information by tape or diskette to the Department's data processing contractor must convert to...
NASA Astrophysics Data System (ADS)
Kerekes, Ryan A.; Gleason, Shaun S.; Trivedi, Niraj; Solecki, David J.
2010-03-01
Segmentation, tracking, and tracing of neurons in video imagery are important steps in many neuronal migration studies and can be inaccurate and time-consuming when performed manually. In this paper, we present an automated method for tracing the leading and trailing processes of migrating neurons in time-lapse image stacks acquired with a confocal fluorescence microscope. In our approach, we first locate and track the soma of the cell of interest by smoothing each frame and tracking the local maxima through the sequence. We then trace the leading process in each frame by starting at the center of the soma and stepping repeatedly in the most likely direction of the leading process. This direction is found at each step by examining second derivatives of fluorescent intensity along curves of constant radius around the current point. Tracing terminates after a fixed number of steps or when fluorescent intensity drops below a fixed threshold. We evolve the resulting trace to form an improved trace that more closely follows the approximate centerline of the leading process. We apply a similar algorithm to the trailing process of the cell by starting the trace in the opposite direction. We demonstrate our algorithm on two time-lapse confocal video sequences of migrating cerebellar granule neurons (CGNs). We show that the automated traces closely approximate ground truth traces to within 1 or 2 pixels on average. Additionally, we compute line intensity profiles of fluorescence along the automated traces and quantitatively demonstrate their similarity to manually generated profiles in terms of fluorescence peak locations.
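The soma-tracking step described above (smooth each frame, then follow the local maximum) can be sketched as follows (synthetic image stack; a separable triangular filter stands in for the paper's smoothing):

```python
import numpy as np

def smooth(frame):
    """Separable triangular smoothing, a stand-in for Gaussian blur."""
    k = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
    k /= k.sum()
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, frame)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, tmp)

def track_soma(stack):
    """Return the (row, col) of the brightest smoothed spot in each frame."""
    return [tuple(np.unravel_index(np.argmax(smooth(f)), f.shape))
            for f in stack]

# hypothetical stack: one bright soma drifting one pixel per frame
stack = np.zeros((3, 32, 32))
for i in range(3):
    stack[i, 10 + i, 15] = 100.0
path = track_soma(stack)
```

In the published method the tracked soma position then seeds the step-wise tracing of the leading and trailing processes in each frame.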
Automated delineation and characterization of drumlins using a localized contour tree approach
NASA Astrophysics Data System (ADS)
Wang, Shujie; Wu, Qiusheng; Ward, Dylan
2017-10-01
Drumlins are ubiquitous landforms in previously glaciated regions, formed through a series of complex subglacial processes operating underneath paleo-ice sheets. Accurate delineation and characterization of drumlins are essential for understanding their formation mechanism as well as the flow behaviors and basal conditions of paleo-ice sheets. Automated mapping of drumlins is particularly important for examining their distribution patterns across large spatial scales. This paper presents an automated vector-based approach to mapping drumlins from high-resolution light detection and ranging (LiDAR) data. The rationale is to extract sets of concentric contours by building localized contour trees and establishing topological relationships. This automated method overcomes shortcomings of previous manual and automated methods for mapping drumlins, for instance, the azimuthal biases introduced during the generation of shaded-relief images. A case study was carried out over a portion of the New York Drumlin Field. Overall, 1181 drumlins were identified from the LiDAR-derived DEM across the study region, a number that had been underestimated in previous literature. The delineation results were visually and statistically compared to manual digitization results. The morphology of the drumlins was characterized by quantifying length, width, elongation ratio, height, area, and volume. Statistical and spatial analyses were conducted to examine the distribution pattern and spatial variability of drumlin size and form. The drumlins and their morphologic characteristics exhibit significant spatial clustering rather than randomly distributed patterns. The form of the drumlins varies from ovoid to spindle shapes towards the downstream direction of paleo-ice flows, along with a decrease in width, area, and volume.
This observation is in line with previous studies, which may be explained by the variations in sediment thickness and/or the velocity increases of ice flows towards ice front.
Work process and task-based design of intelligent assistance systems in German textile industry
NASA Astrophysics Data System (ADS)
Löhrer, M.; Ziesen, N.; Altepost, A.; Saggiomo, M.; Gloy, Y. S.
2017-10-01
The German textile industry, shaped by mid-sized companies, must face social challenges such as demographic change and changing technical processes. Interaction with intelligent systems (on machines) and increasing automation change processes, working structures and employees' tasks on all levels. Work contents are becoming more complex, resulting in the need for diversified and enhanced competencies. Mobile devices like tablets and smartphones are increasingly finding their way into the workplace. Employees who grew up with new forms of media have certain advantages in using modern technologies compared to older employees. Therefore, it is necessary to design new systems that help adapt the competencies of both younger and older employees to new automated production processes in the digital work environment. The key to successful integration of technical assistance systems is user-oriented design and development that includes concepts for competency development under consideration of, e.g., ethical and legal aspects.
Dry Ribbon for Heated Head Automated Fiber Placement
NASA Technical Reports Server (NTRS)
Hulcher, A. Bruce; Marchello, Joseph M.; Hinkley, Jeffrey A.; Johnston, Norman J.; Lamontia, Mark A.
2000-01-01
Ply-by-ply in situ processes involving automated heated head deposition are being developed for fabrication of high performance, high temperature composite structures from low volatile content polymer matrices. This technology requires (1) dry carbon fiber towpreg, (2) consolidation of towpreg to quality, placement-grade unidirectional ribbon or tape, and (3) rapid, in situ, accurate, ply-by-ply robotic placement and consolidation of this material to fabricate a composite structure. In this study, the physical properties of a candidate thermoplastic ribbon, PIXA/IM7, were evaluated and screened for suitability in robotic placement. Specifically, towpreg was prepared from PIXA powder. Various conditions (temperatures) were used to convert the powder-coated towpreg to ribbons with varying degrees of processability. Ribbon within preset specifications was fabricated at 3 temperatures: 390, 400 and 410 C. Ribbon was also produced out-of-spec by purposely overheating the material to a processing temperature of 450 C. Automated placement equipment at Cincinnati Milacron and NASA Langley was used to fabricate laminates from these experimental ribbons. Ribbons were placed at 405 and 450 C by both sets of equipment. Double cantilever beam and wedge peel tests were used to determine the quality of the laminates and, especially, the interlaminar bond formed during the placement process. Ribbon made under conditions expected to be non-optimal (overheated) resulted in poor placeability and composites with weak interlaminar bond strengths, regardless of placement conditions. Ribbon made under conditions expected to be ideal showed good processability and produced well-consolidated laminates. Results were consistent from machine to machine and demonstrated the importance of ribbon quality in heated-head placement of dry material forms. Preliminary screening criteria for the development and evaluation of ribbon from new matrix materials were validated.
NASA Astrophysics Data System (ADS)
Mozgovoy, Dmitry K.; Hnatushenko, Volodymyr V.; Vasyliev, Volodymyr V.
2018-04-01
Vegetation and water bodies are a fundamental element of urban ecosystems, and water mapping is critical for urban and landscape planning and management. A methodology is proposed for automated recognition of vegetation and water bodies on the territory of megacities in satellite images of sub-meter spatial resolution in the visible and IR bands. By processing multispectral images from the SuperView-1A satellite, vector layers of recognized vegetation and water objects were obtained. Analysis of the image processing results showed sufficiently accurate delineation of the boundaries of recognized objects and good separation of classes. The developed methodology significantly increases the efficiency and reliability of updating maps of large cities while reducing financial costs. Owing to its high degree of automation, the proposed methodology can be implemented as a geo-information web service serving a wide range of public services and commercial institutions.
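Index-based masking of vegetation and water from multispectral bands can be sketched as follows (illustrative reflectances and thresholds using the standard NDVI and green/NIR NDWI indices; the paper's actual classification rules are not specified here):

```python
import numpy as np

def classify(nir, red, green, ndvi_thresh=0.3, ndwi_thresh=0.3):
    """Per-pixel vegetation and water masks from multispectral bands.
    Thresholds are illustrative, not tuned for SuperView-1A."""
    eps = 1e-9  # avoid division by zero in dark pixels
    ndvi = (nir - red) / (nir + red + eps)      # vegetation index
    ndwi = (green - nir) / (green + nir + eps)  # water index
    return ndvi > ndvi_thresh, ndwi > ndwi_thresh

# hypothetical reflectances: a vegetated pixel and a water pixel
nir   = np.array([0.50, 0.05])
red   = np.array([0.08, 0.04])
green = np.array([0.10, 0.15])
veg, water = classify(nir, red, green)
```

Vectorizing the resulting masks (e.g. tracing connected regions into polygons) would produce the vector layers mentioned in the abstract.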
1990 censuses to increase use of automation.
Ward, S E
1988-12-01
This article summarizes information from selected reports presented at the 12th Population Census Conference. Ward reports that plans for the 1990 census in many countries of Asia and the Pacific call for increased use of automation, with applications ranging from computer-generated maps of enumeration areas and optical mark readers for data processing to desktop publishing and electronic mail for disseminating the results. Recent advances in automation offer opportunities for improved accuracy and speed of census operations while reducing the need for clerical personnel. Most of the technologies discussed at the conference are designed to make the planning, editing, processing, analysis, and publication of census data more reliable and efficient. Technology alone cannot overcome the high rates of illiteracy that preclude having respondents complete the census forms themselves, but it enables even China, India, Indonesia, and Pakistan - countries with huge populations and limited financial resources - to make significant improvements in their forthcoming censuses.
Automated Processing of Plasma Samples for Lipoprotein Separation by Rate-Zonal Ultracentrifugation.
Peters, Carl N; Evans, Iain E J
2016-12-01
Plasma lipoproteins are the primary means of lipid transport among tissues. Defining alterations in lipid metabolism is critical to our understanding of disease processes. However, lipoprotein measurement is limited to specialized centers. Preparation for ultracentrifugation involves the formation of complex density gradients, a process that is both laborious and subject to handling errors. We created a fully automated device capable of forming the required gradient. The design has been made freely available for download by the authors. It is inexpensive relative to commercial density-gradient formers, which generally create linear gradients unsuitable for rate-zonal ultracentrifugation. The design can easily be modified to suit user requirements and any potential future improvements. Evaluation of the device showed reliable peristaltic-pump accuracy and precision for fluid delivery. We also demonstrate accurate fluid layering, with less mixing at the gradient layers than is achieved in usual practice by experienced laboratory personnel. Reduction in layer mixing is of critical importance, as it is crucial for reliable lipoprotein separation. The automated device significantly reduces laboratory staff input and the likelihood of error. Overall, this device is a simple and effective solution to the formation of complex density gradients. © 2015 Society for Laboratory Automation and Screening.
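As a sketch of the kind of control logic such a gradient former needs: the dispense time for each step-gradient layer follows directly from the layer volume and the peristaltic pump's flow rate (the densities, volumes, and flow rate below are illustrative, not the published protocol):

```python
def layering_schedule(layers, flow_rate_ml_min):
    """Pump run time for each layer of a step density gradient.

    layers: (density_g_ml, volume_ml) pairs, densest first, so each
    successive, lighter solution is floated on top of the previous one.
    Returns (density, volume, pump_seconds) per layer. All numbers in
    the example are illustrative assumptions.
    """
    schedule = []
    for density, volume in layers:
        seconds = round(volume / flow_rate_ml_min * 60.0, 1)
        schedule.append((density, volume, seconds))
    return schedule
```

Gentle, constant-rate delivery at the layer boundary is what keeps the interfaces sharp, which is why pump accuracy matters for the separation.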
Zhang, Airong; Zhang, Song; Bian, Cuirong
2018-02-01
Cortical bone provides the main form of support against various forces in humans and other vertebrates, so capturing its mechanical properties is important. In this study, the mechanical properties of cortical bone were investigated using automated ball indentation and graphics processing at both the macroscopic and microstructural levels under dry conditions. First, all polished samples were photographed under a metallographic microscope, and the area ratio of the circumferential lamellae and osteons was calculated through the graphics-processing method. Second, fully computer-controlled automated ball indentation (ABI) tests were performed to explore the micro-mechanical properties of the cortical bone at room temperature and a constant indenter speed. The indentation defects were examined with a scanning electron microscope. Finally, the macroscopic mechanical properties of the cortical bone were estimated with the graphics-processing method and the rule of mixtures. Combining ABI and graphics processing proved to be an effective tool for obtaining the mechanical properties of cortical bone, and the indenter size had a significant effect on the measurement. The methods presented in this paper provide an innovative approach to acquiring the macroscopic mechanical properties of cortical bone in a nondestructive manner. Copyright © 2017 Elsevier Ltd. All rights reserved.
Process development for automated solar cell and module production. Task 4: Automated array assembly
NASA Technical Reports Server (NTRS)
1980-01-01
A process sequence which can be used in conjunction with automated equipment for the mass production of solar cell modules for terrestrial use was developed. The process sequence was then critically analyzed from a technical and economic standpoint to determine the technological readiness of certain process steps for implementation. The steps receiving analysis were: back contact metallization, automated cell array layup/interconnect, and module edge sealing. For automated layup/interconnect, both hard automation and programmable automation (using an industrial robot) were studied. The programmable automation system was then selected for actual hardware development.
Acoustic-sensor-based detection of damage in composite aircraft structures
NASA Astrophysics Data System (ADS)
Foote, Peter; Martin, Tony; Read, Ian
2004-03-01
Acoustic emission detection is a well-established method of locating and monitoring crack development in metal structures, and the technique has been adapted to test facilities for non-destructive testing applications. Deployment as an operational, on-line automated damage-detection technology in vehicles poses greater challenges. A clear requirement of potential end users of such systems is a level of automation capable of delivering low-level diagnosis information, with output in the form of "go"/"no-go" indications of structural integrity or immediate maintenance actions. This level of automation requires significant data reduction and processing. This paper describes recent trials of acoustic emission detection technology for the diagnosis of damage in composite aerospace structures. The technology comprises low-profile detection sensors using piezoelectric wafers encapsulated in polymer film, and optical sensors. Sensors are bonded to the structure's surface and enable acoustic events from the loaded structure to be located by triangulation. Instrumentation has been developed to capture and parameterise the sensor data in a form suitable for low-bandwidth storage and transmission.
Evaluation in context: ATC automation in the field
NASA Technical Reports Server (NTRS)
Harwood, Kelly; Sanford, Beverly
1994-01-01
The process for incorporating advanced technologies into complex aviation systems is as important as the final product itself. This paper describes a process that is currently being applied to the development and assessment of an advanced ATC automation system, CTAS. The key element of the process is field exposure early in the system development cycle. The process deviates from current established practices of system development -- where field testing is an implementation endpoint -- and has been deemed necessary by the FAA for streamlining development and bringing system functions to a level of stability and usefulness. Methods and approaches for field assessment are borrowed from human factors engineering, cognitive engineering, and usability engineering and are tailored to the constraints of an operational ATC environment. To date, the focus has been on the qualitative assessment of the match between TMA capabilities and the context for their use. Capturing the users' experience with the automation tool and understanding tool use in the context of the operational environment is important, not only for developing a tool that is an effective problem-solving instrument but also for defining meaningful operational requirements. Such requirements form the basis for certifying the safety and efficiency of the system. CTAS is the first U.S. advanced ATC automation system of its scope and complexity to undergo this field development and assessment process. With the rapid advances in aviation technologies and our limited understanding of their impact on system performance, it is time we opened our eyes to new possibilities for developing, validating, and ultimately certifying complex aviation systems.
NASA Technical Reports Server (NTRS)
Lum, Henry, Jr.
1988-01-01
Information on systems autonomy is given in viewgraph form. Information is given on space systems integration, intelligent autonomous systems, automated systems for in-flight mission operations, the Systems Autonomy Demonstration Project on the Space Station Thermal Control System, the architecture of an autonomous intelligent system, artificial intelligence research issues, machine learning, and real-time image processing.
Issues in Retrospective Conversion for a Small Special Collection: A Case Study.
ERIC Educational Resources Information Center
Hieb, Fern
1997-01-01
Small special collections present unique problems for retrospective conversion of catalogs to machine-readable form. Examines retrospective conversion using the Moravian Music Foundation as a case study. Discusses advantages to automation, options for conversion process, quantifying conversion effort, costs, in-house conversion, national standards…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-04
... program helps to ensure that requested data can be provided in the desired format, reporting burden (time... coal mining industry with a standardized reporting format that expedites the certification process... appropriate automated, electronic, mechanical, or other technological collection techniques or other forms of...
Polymer Light-Emitting Diode (PLED) Process Development
2003-12-01
conclusions and recommendations for Phase II of the Flexible Display Program. 15. SUBJECT TERMS LIGHT EMITTING DIODES LIQUID CRYSTAL DISPLAY SYSTEMS...space for Phase I and II confined by backplane complexity and substrate form...12 Figure 6. Semi automated I-V curve measurement setup consisting of Keithley power supply, computer and
NASA Technical Reports Server (NTRS)
Dewberry, Brandon S.
1990-01-01
The Environmental Control and Life Support System (ECLSS) is a Freedom Station distributed system with inherent applicability to advanced automation primarily due to the comparatively large reaction times of its subsystem processes. This allows longer contemplation times in which to form a more intelligent control strategy and to detect or prevent faults. The objective of the ECLSS Advanced Automation Project is to reduce the flight and ground manpower needed to support the initial and evolutionary ECLS system. The approach is to search out and make apparent those processes in the baseline system which are in need of more automatic control and fault detection strategies, to influence the ECLSS design by suggesting software hooks and hardware scars which will allow easy adaptation to advanced algorithms, and to develop complex software prototypes which fit into the ECLSS software architecture and will be shown in an ECLSS hardware testbed to increase the autonomy of the system. Covered here are the preliminary investigation and evaluation process, aimed at searching the ECLSS for candidate functions for automation and providing a software hooks and hardware scars analysis. This analysis shows changes needed in the baselined system for easy accommodation of knowledge-based or other complex implementations which, when integrated in flight or ground sustaining engineering architectures, will produce a more autonomous and fault tolerant Environmental Control and Life Support System.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-02
... language)-enabled Adobe PDF form, Form FDA 3331--Automated to submit new drug application (NDA) and...-enabled Adobe PDF form, Form FDA 3331--Automated, will be available for piloting between May 1, 2013, and... modernize the FAR submission and review pathway using an XML-enabled PDF form to enable integration with...
Web-Based Requesting and Scheduling Use of Facilities
NASA Technical Reports Server (NTRS)
Yeager, Carolyn M.
2010-01-01
Automated User's Training Operations Facility Utilization Request (AutoFUR) is prototype software that administers a Web-based system for requesting and allocating facilities and equipment for astronaut-training classes in conjunction with scheduling the classes. AutoFUR also has potential for similar use in such applications as scheduling flight-simulation equipment and instructors in commercial airplane-pilot training, managing preventive-maintenance facilities, and scheduling operating rooms, doctors, nurses, and medical equipment for surgery. Whereas requesting and allocating facilities was previously a manual process that entailed examination of documents (including paper drawings) from different sources, AutoFUR partly automates the process and makes all of the relevant information available via the requester's computer. By use of AutoFUR, an instructor can fill out a facility-utilization request (FUR) form on line, consult the applicable flight manifest(s) to determine what equipment is needed and where it should be placed in the training facility, reserve the corresponding hardware listed in a training-hardware inventory database, search for alternative hardware if necessary, submit the FUR for processing, and cause paper forms to be printed. AutoFUR also maintains a searchable archive of prior FURs.
The automated Army ROTC Questionnaire (ARQ)
NASA Technical Reports Server (NTRS)
Young, David L. H.
1991-01-01
The Reserve Officer Training Corps Cadet Command (ROTCCC) takes applications for its officer training program from college students and Army enlisted personnel worldwide. Each applicant is required to complete a set of application forms prior to acceptance into the ROTC program. These forms are covered by several regulations that govern the eligibility of potential applicants and guide the applicant through the application process. Eligibility criteria change as Army regulations are periodically revised, and outdated information results in a loss of applications attributable to frustration and error. ROTCCC asked for an inexpensive and reliable way of automating its application process. After reviewing the process, it was determined that an expert system with good end-user interface capabilities could solve a large part of the problem. The system captures the knowledge contained within the regulations, enables the quick distribution and implementation of eligibility-criteria changes, and distributes the expertise of the admissions personnel to the education centers and colleges. The expert system uses a modified version of CLIPS that was streamlined to make the most efficient use of its capabilities. A user interface with windowing capabilities provides the applicant with a simple and effective way to input his or her personal data.
Selecting automation for the clinical chemistry laboratory.
Melanson, Stacy E F; Lindeman, Neal I; Jarolim, Petr
2007-07-01
Laboratory automation proposes to improve the quality and efficiency of laboratory operations, and may provide a solution to the quality demands and staff shortages faced by today's clinical laboratories. Several vendors offer automation systems in the United States, with both subtle and obvious differences. Arriving at a decision to automate, and the ensuing evaluation of available products, can be time-consuming and challenging. Although considerable discussion concerning the decision to automate has been published, relatively little attention has been paid to the process of evaluating and selecting automation systems. To outline a process for evaluating and selecting automation systems as a reference for laboratories contemplating laboratory automation. Our Clinical Chemistry Laboratory staff recently evaluated all major laboratory automation systems in the United States, with their respective chemistry and immunochemistry analyzers. Our experience is described and organized according to the selection process, the important considerations in clinical chemistry automation, decisions and implementation, and we give conclusions pertaining to this experience. Including the formation of a committee, workflow analysis, submitting a request for proposal, site visits, and making a final decision, the process of selecting chemistry automation took approximately 14 months. We outline important considerations in automation design, preanalytical processing, analyzer selection, postanalytical storage, and data management. Selecting clinical chemistry laboratory automation is a complex, time-consuming process. Laboratories considering laboratory automation may benefit from the concise overview and narrative and tabular suggestions provided.
NASA Technical Reports Server (NTRS)
Lyubashevskiy, G. S.
1973-01-01
Fourier processing of automatic signals converts direct-current voltage into numerical form through bandpass filtration in time-pulse multiplying devices. It is shown that the ratio of the interference energy to the useful-signal energy is inversely proportional to the square of the product of the pulse-width-modulation depth and the ratio of the averaging time constant to that of the cross-multiplied signals.
AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsai, Yingssu; Stanford University, 333 Campus Drive, Mudd Building, Stanford, CA 94305-5080; McPhillips, Scott E.
New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully demonstrated. This workflow was run once on the same 96 samples that the group had examined manually; the workflow cycled successfully through all of the samples, collected data from the same samples that were selected manually and located the same peaks of unmodeled density in the resulting difference Fourier maps.
NASA Technical Reports Server (NTRS)
Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam;
2009-01-01
The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has processed tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver-related application programs: "Pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.
Vavadi, Hamed; Zhu, Quing
2016-01-01
Imaging-guided near-infrared diffuse optical tomography (DOT) has demonstrated great potential as an adjunct modality for differentiating malignant from benign breast lesions and for monitoring the treatment response of breast cancers. However, diffused-light measurements are sensitive to artifacts caused by outliers and measurement errors due to probe-tissue coupling, patient and probe motion, and tissue heterogeneity. In general, experienced users must pre-process the measurements to manually remove these outliers and thereby reduce imaging artifacts. An automated method of outlier removal, data selection, and filtering for diffuse optical tomography is introduced in this manuscript. The method first combines several data sets collected from the same patient at the contralateral normal breast into a single robust reference data set, using statistical tests and linear fitting of the measurements. A second step improves the perturbation measurements by filtering outliers out of the lesion-site measurements using model-based analysis. Results from 20 malignant and benign cases show performance similar to manual data processing, with an improvement of about 27% in the malignant-to-benign tissue-characterization ratio. PMID:27867711
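The abstract does not specify which statistical tests are used; a minimal sketch of how repeated reference scans might be fused into one robust reference with per-channel outlier rejection (median/MAD screening here is an assumed stand-in for the authors' actual tests):

```python
import numpy as np

def build_reference(scans, z_thresh=2.5):
    """Fuse repeated reference scans into one robust reference data set.

    scans: 2-D array (n_scans, n_channels), one row per measurement of
    the contralateral normal breast. Per channel, samples whose robust
    z-score (median/MAD based) exceeds the threshold are discarded as
    outliers before averaging. The threshold is an illustrative choice.
    """
    data = np.asarray(scans, dtype=float)
    med = np.median(data, axis=0)
    mad = np.median(np.abs(data - med), axis=0) + 1e-12  # avoid /0
    z = 0.6745 * (data - med) / mad   # 0.6745 scales MAD to ~std dev
    clean = np.where(np.abs(z) <= z_thresh, data, np.nan)
    return np.nanmean(clean, axis=0)  # robust per-channel reference
```

A reference built this way is insensitive to the occasional bad-coupling scan, which is the failure mode the paper targets.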
NASA Technical Reports Server (NTRS)
Newcomb, J. S.
1975-01-01
The present paper describes an automated system for measuring stellar proper motions on the basis of information contained in photographic plates. In this system, the images on a star plate are digitized by a scanning microdensitometer using light from a He-Ne gas laser, and a special-purpose computer arranges the measurements in computer-compatible form on magnetic tape. The scanning and image-reconstruction processes are briefly outlined, and the image-evaluation techniques are discussed. It is shown that the present system has been especially successful in measuring the proper motions of low-luminosity stars, including 119 stars with less than 1/10,000 of the solar bolometric luminosity. Plans for measurements of high-density Milky Way star plates are noted.
Advanced tow placement of composite fuselage structure
NASA Technical Reports Server (NTRS)
Anderson, Robert L.; Grant, Carroll G.
1992-01-01
The Hercules NASA ACT program was established to demonstrate and validate the low-cost potential of the automated tow placement process for fabrication of aircraft primary structures. The program is currently being conducted as a cooperative program in collaboration with the Boeing ATCAS Program. The Hercules advanced tow placement process has been in development since 1982 and was developed specifically for composite aircraft structures. The second-generation machine, now in operation at Hercules, is a production-ready machine that uses a low-cost prepreg tow material form to produce structures with laminate properties equivalent to prepreg tape layup. Current program activities are focused on demonstration of the automated tow placement process for fabrication of subsonic transport aircraft fuselage crown quadrants. We are working with Boeing Commercial Aircraft and Douglas Aircraft during this phase of the program. The Douglas demonstration panel has co-cured skin/stringers, and the Boeing demonstration panel is an intricately bonded part with co-cured skin/stringers and co-bonded frames. Other aircraft structures that were evaluated for the automated tow placement process include engine nacelle components, fuselage pressure bulkheads, and fuselage tail cones. Because of the cylindrical shape of these structures, multiple parts can be fabricated on one tow placement tool, thus reducing the cost per pound of the finished part.
Automated detection of exudates for diabetic retinopathy screening
NASA Astrophysics Data System (ADS)
Fleming, Alan D.; Philip, Sam; Goatman, Keith A.; Williams, Graeme J.; Olson, John A.; Sharp, Peter F.
2007-12-01
Automated image analysis is being widely sought to reduce the workload required for grading images resulting from diabetic retinopathy screening programmes. The recognition of exudates in retinal images is an important goal for automated analysis since these are one of the indicators that the disease has progressed to a stage requiring referral to an ophthalmologist. Candidate exudates were detected using a multi-scale morphological process. Based on local properties, the likelihoods of a candidate being a member of classes exudate, drusen or background were determined. This leads to a likelihood of the image containing exudates which can be thresholded to create a binary decision. Compared to a clinical reference standard, images containing exudates were detected with sensitivity 95.0% and specificity 84.6% in a test set of 13 219 images of which 300 contained exudates. Depending on requirements, this method could form part of an automated system to detect images showing either any diabetic retinopathy or referable diabetic retinopathy.
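The abstract names a multi-scale morphological process but not its parameters; a numpy-only sketch of the generic multi-scale top-hat idea behind bright-lesion candidate detection (window sizes and the threshold are illustrative assumptions, not the published values):

```python
import numpy as np

def _min_filter(img, size):
    """Naive grey-scale erosion: sliding-window minimum."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + size, j:j + size].min()
    return out

def _max_filter(img, size):
    """Naive grey-scale dilation: sliding-window maximum."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + size, j:j + size].max()
    return out

def detect_bright_candidates(image, scales=(5, 9), thresh=0.1):
    """Multi-scale morphological top-hat for bright candidate lesions.

    An opening (erosion then dilation) removes bright structures
    smaller than the window; subtracting it from the image leaves them
    as a residue. Taking the strongest residue over several window
    sizes catches candidates of different sizes.
    """
    image = np.asarray(image, dtype=float)
    residue = np.zeros_like(image)
    for size in scales:
        opened = _max_filter(_min_filter(image, size), size)
        residue = np.maximum(residue, image - opened)
    return residue > thresh  # binary candidate mask
```

In the paper, candidates found at this stage are then classified as exudate, drusen, or background from local properties; the sketch covers only the candidate-generation step.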
Automation and control of off-planet oxygen production processes
NASA Technical Reports Server (NTRS)
Marner, W. J.; Suitor, J. W.; Schooley, L. S.; Cellier, F. E.
1990-01-01
This paper addresses several aspects of the automation and control of off-planet production processes. First, a general approach to process automation and control is discussed from the viewpoint of translating human process control procedures into automated procedures. Second, the control issues for the automation and control of off-planet oxygen processes are discussed. Sensors, instruments, and components are defined and discussed in the context of off-planet applications, and the need for 'smart' components is clearly established.
Choice of mathematical models for technological process of glass rod drawing
NASA Astrophysics Data System (ADS)
Alekseeva, L. B.
2017-10-01
The technological process of drawing glass rods (light guides) is considered. Automated control of the drawing process reduces to making decisions that ensure a given quality. The drawing process is considered as a control system comprising the drawing device (control device) and the optical-fiber forming zone (control object). To study the processes occurring in the formation zone, mathematical models based on the fundamentals of continuum mechanics are proposed. To assess the influence of disturbances, a transfer function is derived from the wave equation. The regression equation obtained also adequately describes the drawing process.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-26
... Project- Based Section 8 Contracts AGENCY: Office of the Chief Information Officer, HUD. ACTION: Notice... through the use of appropriate automated collection techniques or other forms of information technology, e... Section 8 project-based assistance contracts are renewed. The Section 8 contract renewal process is an...
NASA Astrophysics Data System (ADS)
Rahman, Nur Aira Abd; Yussup, Nolida; Salim, Nazaratul Ashifa Bt. Abdullah; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh@Shaari, Syirrazie Bin Che; Azman, Azraf B.; Ismail, Nadiah Binti
2015-04-01
Neutron Activation Analysis (NAA) has been established at Nuclear Malaysia since the 1980s. Most of the established procedures were performed manually, including sample registration: samples were recorded manually in a logbook and given an ID number, and all samples, standards, SRMs, and blanks were then recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures, carried out by the NAA laboratory personnel, were time-consuming and inefficient. Sample registration software was developed as part of the IAEA/CRP project 'Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)'. The objective of the project is to create PC-based data-entry software for the sample preparation stage. This is an effective replacement for the redundant manual data entries that laboratory personnel previously had to complete. The software automatically generates a sample code for each sample in a batch, creates printable registration forms for administrative purposes, and stores selected parameters that are passed to the sample analysis program. The software was developed using National Instruments LabVIEW 8.6.
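As a sketch of the registration step described above, batch sample codes and a printable form can be generated in a few lines (the code format, prefix, and CSV layout are assumptions for illustration, not the Agency's actual convention):

```python
import csv
import io
from datetime import date

def register_batch(batch_id, sample_names, prefix="NAA"):
    """Assign sequential IDs to every sample in a batch and emit a
    printable registration form (CSV). The naming scheme is an
    illustrative assumption.
    """
    records = []
    for n, name in enumerate(sample_names, start=1):
        code = f"{prefix}-{batch_id}-{n:03d}"   # e.g. NAA-B07-001
        records.append({"code": code, "sample": name,
                        "registered": date.today().isoformat()})
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["code", "sample", "registered"])
    writer.writeheader()
    writer.writerows(records)           # one row per sample
    return records, buf.getvalue()
```

The returned records would be the "selected parameters" handed on to the sample analysis program, while the CSV text stands in for the printable administration form.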
1980-03-06
performing the present NPFC tasks. Potential automation technologies may include order processing mechanization, demand printing from micrographic or...effort and documented in this volume included the following: a. Functional description of the order processing activities as they currently operate. b...covered under each analysis area. It is obvious from the exhibit that the functional description of order processing operations was to include COG I
Hot Forging of a Cladded Component by Automated GMAW Process
NASA Astrophysics Data System (ADS)
Rafiq, Muhammad; Langlois, Laurent; Bigot, Régis
2011-01-01
Weld cladding is employed to improve the service life of engineering components by increasing corrosion and wear resistance while reducing cost. An acceptable multi-bead cladding layer depends on single-bead geometry; hence, the relationship between the input process parameters and single-bead geometry is studied first, followed by a comprehensive study of multi-bead clad-layer deposition. This paper highlights an experimental study carried out to deposit single-layer cladding by an automated Gas Metal Arc Welding (GMAW) process and to explore the feasibility of hot forming the cladded workpiece into a final, improved hot-formed structure. GMAW is an arc-welding process that uses an arc between a consumable electrode and the weld pool under an external shielding gas; cladding is performed by depositing weld beads side by side. The single-bead experiments were conducted by varying the three main process parameters - wire feed rate, arc voltage, and welding speed - while keeping other parameters such as nozzle-to-work distance, shielding gas and its flow rate, and torch angle constant. The effect of bead spacing and torch orientation on single-layer cladding quality was studied using the results of the single-bead deposition. The effect of dilution rate and nominal energy on the hot-bending quality of the cladded layer was also evaluated at different temperatures.
Jipp, Meike
2016-02-01
I explored whether different cognitive abilities (information-processing ability, working-memory capacity) are needed for expertise development when different types of automation (information vs. decision automation) are employed. It is well documented that expertise development and the employment of automation lead to improved performance. Here, it is argued that a learner's ability to reason about an activity may be hindered by the employment of information automation. Additional feedback needs to be processed, thus increasing the load on working memory and decelerating expertise development. By contrast, the employment of decision automation may stimulate reasoning, increase the initial load on information-processing ability, and accelerate expertise development. Authors of past research have not investigated the interrelations between automation assistance, individual differences, and expertise development. Sixty-one naive learners controlled simulated air traffic with two types of automation: information automation and decision automation. Their performance was captured across 16 trials. Well-established tests were used to assess information-processing ability and working-memory capacity. As expected, learners' performance benefited from expertise development and decision automation. Furthermore, individual differences moderated the effect of the type of automation on expertise development: The employment of only information automation increased the load on working memory during later expertise development. The employment of decision automation initially increased the need to process information. These findings highlight the importance of considering individual differences and expertise development when investigating human-automation interaction. The results are relevant for selecting automation configurations for expertise development. © 2015, Human Factors and Ergonomics Society.
Comparison on Human Resource Requirement between Manual and Automated Dispensing Systems.
Noparatayaporn, Prapaporn; Sakulbumrungsil, Rungpetch; Thaweethamcharoen, Tanita; Sangseenil, Wunwisa
2017-05-01
This study was conducted to compare human resource requirements among manual, automated, and modified automated dispensing systems. Data were collected from the pharmacy department at a 2,100-bed university hospital (Siriraj Hospital, Bangkok, Thailand). Data regarding the duration of the medication distribution process were collected by using self-reported forms for 1 month. The data on the automated dispensing machine (ADM) system were obtained from 1 piloted inpatient ward, whereas those on the manual system were the average of other wards. Data on dispensing, returned unused medication, and stock management processes under the traditional manual system and the ADM system were from actual activities, whereas the modified ADM system was modeled. The full-time equivalent (FTE) of each model was estimated for comparison. The results showed that the manual system required 46.84 FTEs of pharmacists and 132.66 FTEs of pharmacy technicians. By adding pharmacist roles in screening and verification under the ADM system, the ADM system required 117.61 FTEs of pharmacists. Replacing the counting and filling of medication by the ADM decreased the number of pharmacy technicians to 55.38 FTEs. After the modified ADM system eliminated the return of unused medication, the FTE requirements for pharmacists and pharmacy technicians decreased to 69.78 and 51.90 FTEs, respectively. The ADM system decreased the workload of pharmacy technicians, whereas it required more time from pharmacists. However, the increased workload of pharmacists was associated with more comprehensive patient care functions resulting from the redesigned work process. Copyright © 2017. Published by Elsevier Inc.
Human-centered automation: Development of a philosophy
NASA Technical Reports Server (NTRS)
Graeber, Curtis; Billings, Charles E.
1990-01-01
Information on a human-centered automation philosophy is given in outline/viewgraph form. It is asserted that automation of aircraft control will continue in the future, but that automation should supplement, not supplant, the human management and control function in civil air transport.
76 FR 53763 - Immigration Benefits Business Transformation, Increment I
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-29
The Department of Homeland Security (DHS) is amending its regulations to enable U.S. Citizenship and Immigration Services (USCIS) to migrate from a paper file-based, non-integrated systems environment to an electronic, customer-focused, centralized case management environment for benefit processing. This transformation will allow USCIS to streamline benefit processing, eliminate the capture and processing of redundant data, and reduce the number of its forms while automating them. The transformation will be a phased, multi-year initiative to restructure USCIS business processes and related information technology systems. DHS is removing references to form numbers, form titles, expired regulatory provisions, and descriptions of internal procedures, many of which will change during transformation. DHS is also finalizing interim rules that permitted submission of benefit requests with an electronic signature when such requests are submitted in an electronic format rather than on a paper form, and that removed references to filing locations for immigration benefits. In addition, in this rule DHS is publishing the final rule for six other interim rules published during the past several years, most of which received no public comments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, M.G.
1983-07-20
The Joint Atomic Information Exchange Group (JAIEG) was formed in 1959. JAIEG's charter was to be the agent that determines the transmissibility of atomic information to foreign governments or regional defense organizations. JAIEG, operating under the joint policy direction of the Deputy Assistant Secretary for Military Application in the Department of Energy and the Assistant to the Secretary of Defense for Atomic Energy in the Department of Defense, determines in a uniform and consistent manner the transmissibility of Restricted Data (RD) and Formerly Restricted Data (FRD). JAIEG also maintains records of all transmittals of such data and maintains record copies of all information reviewed. This information exists only on paper and must be processed manually. JAIEG's workload has grown continuously with the size of its filing system, which has prompted an investigation of the potential benefits of automating JAIEG activities. Science Applications, Inc. (SAI) has studied JAIEG procedures and has identified several areas where JAIEG information processing activities can be automated to achieve greater system efficiency.
78 FR 68824 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-15
... Number: Automated Repatriation Reporting System; DD Form 2585; OMB Control Number 0704-0334. Needs and... States. The information obtained from the DD Form 2585 is entered into an automated system; a series of... the protection, evacuation and repatriation of U.S. citizens in threatened areas overseas. The DD Form...
Research on Internal Controls and Auditing. Navy Financial Management Improvement Program. Volume 3.
1980-06-01
facilities. NARDAC Pensacola was formed in October 1977 with the transfer of the Naval Education and Training Information Support Activity (NETISA)...to NAVDAC and the additional transfer of four other geographically separate data processing centers. Somewhere between 25 and 50 percent of the Navy's...may change the jobs within the framework of the time and processing resources available. The automated system tells the operations personnel the setup
Generating Test Templates via Automated Theorem Proving
NASA Technical Reports Server (NTRS)
Kancherla, Mani Prasad
1997-01-01
Testing can be used during the software development process to maintain fidelity between evolving specifications, program designs, and code implementations. We use a form of specification-based testing that employs an automated theorem prover to generate test templates. A similar approach was developed using a model checker on state-intensive systems; the present method applies to systems with functional rather than state-based behaviors. This approach allows incomplete specifications to aid in the generation of tests for potential failure cases. We illustrate the technique on the canonical triangle testing problem and discuss its use in the analysis of a spacecraft scheduling system.
The Automation of Reserve Processing.
ERIC Educational Resources Information Center
Self, James
1985-01-01
Describes an automated reserve processing system developed locally at Clemons Library, University of Virginia. Discussion covers developments in the reserve operation at Clemons Library, automation of the processing and circulation functions of reserve collections, and changes in reserve operation performance and staffing needs due to automation.
Optimization of a Tube Hydroforming Process
NASA Astrophysics Data System (ADS)
Abedrabbo, Nader; Zafar, Naeem; Averill, Ron; Pourboghrat, Farhang; Sidhu, Ranny
2004-06-01
An approach is presented to optimize a tube hydroforming process using a Genetic Algorithm (GA) search method. The goal of the study is to maximize formability by identifying the optimal internal hydraulic pressure and feed rate while satisfying the forming limit diagram (FLD). The optimization software HEEDS is used in combination with the nonlinear structural finite element code LS-DYNA to carry out the investigation. In particular, a sub-region of a circular tube blank is formed into a square die. Compared to the best results of a manual optimization procedure, a 55% increase in expansion was achieved when using the pressure and feed profiles identified by the automated optimization procedure.
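The search loop described in this abstract can be illustrated with a minimal genetic algorithm over the two design variables. The objective function, forming-limit check, variable ranges, and GA settings below are smooth toy stand-ins invented for illustration; they are not the actual HEEDS configuration or the LS-DYNA simulation.

```python
import random

# Hypothetical smooth stand-in for the FE-based expansion objective:
# expansion peaks at an interior (pressure, feed) combination.
def expansion(pressure, feed):
    return -((pressure - 30.0) ** 2) - ((feed - 12.0) ** 2)

def violates_fld(pressure, feed):
    # Toy forming-limit check: rule out overly aggressive combinations.
    return pressure + feed > 60.0

def fitness(ind):
    p, f = ind
    return float("-inf") if violates_fld(p, f) else expansion(p, f)

def genetic_search(pop_size=40, generations=60, seed=1):
    rng = random.Random(seed)
    pop = [(rng.uniform(0, 50), rng.uniform(0, 25)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]                 # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)   # crossover
            child = (child[0] + rng.gauss(0, 1.0),            # mutation
                     child[1] + rng.gauss(0, 0.5))
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = genetic_search()
```

The population converges toward the feasible optimum of the toy objective; in the study, each fitness evaluation would instead be a nonlinear FE simulation, which is why an efficient search strategy matters.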
Neubert, Sebastian; Göde, Bernd; Gu, Xiangyu; Stoll, Norbert; Thurow, Kerstin
2017-04-01
Modern business process management (BPM) is increasingly interesting for laboratory automation. End-to-end workflow automation and improved top-level systems integration for information technology (IT) and automation systems are especially prominent objectives. With the ISO standard Business Process Model and Notation (BPMN) 2.X, a system-independent graphical process control notation, accepted across disciplines, is provided, allowing process analysis while also being executable. The transfer of BPM solutions to structured laboratory automation places novel demands, for example concerning real-time-critical process and systems integration. The article discusses the potential of laboratory execution systems (LESs) for an easier implementation of a business process management system (BPMS) in hierarchical laboratory automation. In particular, complex application scenarios, including long process chains based on, for example, several distributed automation islands and mobile laboratory robots for material transport, are difficult to handle in BPMSs. The presented approach deals with the displacement of workflow control tasks into life-science-specialized LESs, the reduction of the numerous different interfaces between BPMSs and subsystems, and the simplification of complex process modeling. Thus, the integration effort for complex laboratory workflows can be significantly reduced for strictly structured automation solutions. An example application, consisting of a mixture of manual and automated subprocesses, is demonstrated by the presented BPMS-LES approach.
Advanced automation for in-space vehicle processing
NASA Technical Reports Server (NTRS)
Sklar, Michael; Wegerif, D.
1990-01-01
The primary objective of this planned 3-year study is to assure that the fully evolved Space Station Freedom (SSF) can support automated processing of exploratory mission vehicles. Current study assessments show that the extravehicular activity (EVA) and, to some extent, intravehicular activity (IVA) manpower required for processing tasks far exceeds the available manpower. Furthermore, many processing tasks are either hazardous operations or exceed EVA capability. Thus, automation is essential for SSF transportation node functionality. Here, advanced automation represents the replacement of human-performed tasks beyond the planned baseline automated tasks. Both physical tasks, such as manipulation, assembly, and actuation, and cognitive tasks, such as visual inspection, monitoring and diagnosis, and task planning, are considered. During this first year of activity, both the Phobos/Gateway Mars Expedition and Lunar Evolution missions proposed by the Office of Exploration have been evaluated. A methodology for choosing optimal tasks to be automated has been developed, and processing tasks for both missions have been ranked on the basis of automation potential. The underlying concept in evaluating and describing processing tasks has been the use of a common set of 'primitive' task descriptions. Primitive, or standard, tasks have been developed both for manual or crew processing and for automated machine processing.
NASA Astrophysics Data System (ADS)
Gorlov, A. P.; Averchenkov, V. I.; Rytov, M. Yu; Eryomenko, V. T.
2017-01-01
The article is concerned with mathematical simulation of protection-level assessment for complex organizational and technical systems of industrial enterprises by creating an automated system whose main functions are: information security (IS) audit, forming of the enterprise threat model, recommendations concerning creation of the information protection system, and a set of organizational-administrative documentation.
Application of Intelligent Tutoring Technology to an Apparently Mechanical Task.
ERIC Educational Resources Information Center
Newman, Denis
The increasing automation of many occupations leads to jobs that involve understanding and monitoring the operation of complex computer systems. One case is PATRIOT, an air defense surface-to-air missile system deployed by the U.S. Army. Radar information is processed and presented to the operators in highly abstract form. The system identifies…
Homogenisation of the strain distribution in stretch formed parts to improve part properties
NASA Astrophysics Data System (ADS)
Schmitz, Roman; Winkelmann, Mike; Bailly, David; Hirt, Gerhard
2018-05-01
Inhomogeneous strain and sheet thickness distributions can be observed in complex sheet metal parts manufactured by stretch forming. In the literature, this problem is solved by flexible clampings adapted to the part geometry. In this paper, an approach that does not rely on extensive tooling is presented. The strain distribution in the sheet is influenced by means of hole patterns. Holes are introduced into the sheet area between the clamping and the part, next to areas where high strains are expected. When the sheet is deformed, high strains are shifted out of the part area. In a local area around the holes, high strains concentrate perpendicular to the drawing direction. Thus, high strains in the part area are reduced and the strain distribution is homogenised. To verify this approach, an FE model of a stretch forming process for a conical part is implemented in LS-Dyna. The model is validated by corresponding experiments. In the first step, the positioning of the holes is applied manually based on the numerically determined strain distribution and experience. In order to automate the positioning of the holes, an optimisation method is applied in a second step. The presented approach, implemented in LS-OPT, uses the response surface method to identify the positioning and radius of the holes that homogenise the strain in a defined area of the sheet. Due to the nonlinear increase of computational complexity with the number of holes, the maximum number of holes is set to three. With both the manual and the automated methods, hole patterns were found that allow for a relative reduction of maximum strains and for a homogenisation of the strain distribution. Comparing the manual and automated positioning of holes, the pattern determined by automated optimisation shows better results in terms of homogenising the strain distribution.
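The automated hole-positioning step rests on the response surface method: fit a cheap surrogate model to a few expensive FE evaluations, then optimise the surrogate instead of the simulation. A minimal one-variable sketch, with assumed sample points standing in for LS-DYNA strain results (not data from the paper):

```python
import numpy as np

# Hypothetical 1-D stand-in: maximum strain in the part area as a function
# of hole position x (mm), sampled at a few "simulation" points.
x = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
strain = np.array([0.42, 0.31, 0.27, 0.30, 0.40])  # assumed FE results

# Response surface method: fit a quadratic surrogate to the samples, then
# minimise it analytically at the vertex of the parabola.
c2, c1, c0 = np.polyfit(x, strain, 2)
x_opt = -c1 / (2.0 * c2)   # minimiser, valid since c2 > 0 here
```

LS-OPT applies the same idea in higher dimensions (hole positions and radii), iteratively refining the surrogate around promising designs.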
Utilization of Induction Bonding for Automated Fabrication of TIGR
NASA Technical Reports Server (NTRS)
Hinkley, Jeffrey A.; Johnston, Norman J.; Hulcher, A. Bruce; Marchello, Joseph M.; Messier, Bernadette C.
1999-01-01
A laboratory study of magnetic induction heat bonding of titanium foil and graphite fiber reinforced polymer prepreg tape, TiGr, demonstrated that the process is a viable candidate for low cost fabrication of aircraft structure made of this new material form. Data were obtained on weld bonding of PIXA and PETI-5 prepreg to titanium. Both the foil and honeycomb forms of titanium were investigated. The process relies on magnetic susceptor heating of titanium, not on high frequency heating of graphite fiber. The experiments showed that with a toroid magnet configuration, good weld bonds might be obtained with heating times of a few seconds. These results suggest the potential is good for the induction heating process to achieve acceptable commercial production rates.
Adaptive Algorithms for Automated Processing of Document Images
2011-01-01
Adaptive Algorithms for Automated Processing of Document Images. Dissertation by Mudit Agrawal, Doctor of Philosophy, 2011; submitted to the Faculty of the Graduate School of the University.
Automated processing of endoscopic surgical instruments.
Roth, K; Sieber, J P; Schrimm, H; Heeg, P; Buess, G
1994-10-01
This paper deals with the requirements for automated processing of endoscopic surgical instruments. After a brief analysis of the current problems, solutions are discussed. Test procedures have been developed to validate the automated processing so that the cleaning results are guaranteed and reproducible. A device for testing and cleaning, called TC-MIC, was also designed together with Netzsch Newamatic and PCI to automate processing and reduce manual work.
The Automation and Exoplanet Orbital Characterization from the Gemini Planet Imager Exoplanet Survey
NASA Astrophysics Data System (ADS)
Jinfei Wang, Jason; Graham, James; Perrin, Marshall; Pueyo, Laurent; Savransky, Dmitry; Kalas, Paul; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Ruffio, Jean-Baptiste; Sivaramakrishnan, Anand; Gemini Planet Imager Exoplanet Survey Collaboration
2018-01-01
The Gemini Planet Imager (GPI) Exoplanet Survey (GPIES) is a multi-year 600-star survey to discover and characterize young Jovian exoplanets and their planet forming environments. For large surveys like GPIES, it is critical to have a uniform dataset processed with the latest techniques and calibrations. I will describe the GPI Data Cruncher, an automated data processing framework that is able to generate fully reduced data minutes after the data are taken and can also reprocess the entire campaign in a single day on a supercomputer. The Data Cruncher integrates into a larger automated data processing infrastructure which syncs, logs, and displays the data. I will discuss the benefits of the GPIES data infrastructure, including optimizing observing strategies, finding planets, characterizing instrument performance, and constraining giant planet occurrence. I will also discuss my work in characterizing the exoplanets we have imaged in GPIES through monitoring their orbits. Using advanced data processing algorithms and GPI's precise astrometric calibration, I will show that GPI can achieve one milliarcsecond astrometry on the extensively-studied planet Beta Pic b. With GPI, we can confidently rule out a possible transit of Beta Pic b, but have precise timings on a Hill sphere transit, and I will discuss efforts to search for transiting circumplanetary material this year. I will also discuss the orbital monitoring of other exoplanets as part of GPIES.
A solvent-extraction module for cyclotron production of high-purity technetium-99m.
Martini, Petra; Boschi, Alessandra; Cicoria, Gianfranco; Uccelli, Licia; Pasquali, Micòl; Duatti, Adriano; Pupillo, Gaia; Marengo, Mario; Loriggiola, Massimo; Esposito, Juan
2016-12-01
The design and fabrication of a fully automated, remotely controlled module for the extraction and purification of technetium-99m (Tc-99m), produced by proton bombardment of enriched molybdenum-100 metallic targets in a low-energy medical cyclotron, is described here. After dissolution of the irradiated solid target in hydrogen peroxide, Tc-99m was obtained in the chemical form of pertechnetate (99mTcO4-), in high radionuclidic and radiochemical purity, by solvent extraction with methyl ethyl ketone (MEK). The extraction process was accomplished inside a glass column-shaped vial especially designed to allow easy automation of the whole procedure. Recovery yields were always >90% of the loaded activity. The final pertechnetate saline solution (Na99mTcO4), purified using the automated module described here, is within the Pharmacopoeia quality-control parameters and is therefore a valid alternative to generator-produced 99mTc. The resulting automated module is cost-effective and easily replicable for in-house production of high-purity Tc-99m by cyclotrons. Copyright © 2016 Elsevier Ltd. All rights reserved.
Automation in biological crystallization.
Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen
2014-06-01
Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given.
Applications of Automation Methods for Nonlinear Fracture Test Analysis
NASA Technical Reports Server (NTRS)
Allen, Phillip A.; Wells, Douglas N.
2013-01-01
As fracture mechanics material testing evolves, the governing test standards continue to be refined to better reflect the latest understanding of the physics of the fracture processes involved. The traditional format of ASTM fracture testing standards, utilizing equations expressed directly in the text of the standard to assess the experimental result, is self-limiting in the complexity that can be reasonably captured. The use of automated analysis techniques to draw upon a rich, detailed solution database for assessing fracture mechanics tests provides a foundation for a new approach to testing standards that enables routine users to obtain highly reliable assessments of tests involving complex, non-linear fracture behavior. Herein, the case for automating the analysis of tests of surface cracks in tension in the elastic-plastic regime is utilized as an example of how such a database can be generated and implemented for use in the ASTM standards framework. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.
Automated Chromium Plating Line for Gun Barrels
1979-09-01
consistent pretreatments and bath dwell times. Some of the advantages of automated processing include increased productivity (average of 20%) due to...when automated processing procedures are used. The current method of applying chromium electrodeposits to gun tubes is a manual, batch operation...currently practiced with rotary swaged gun tubes would substantially reduce the difficulties in automated processing.
Modeling Multiple Human-Automation Distributed Systems using Network-form Games
NASA Technical Reports Server (NTRS)
Brat, Guillaume
2012-01-01
The paper describes at a high-level the network-form game framework (based on Bayes net and game theory), which can be used to model and analyze safety issues in large, distributed, mixed human-automation systems such as NextGen.
Automation in School Library Media Centers.
ERIC Educational Resources Information Center
Driver, Russell W.; Driver, Mary Anne
1982-01-01
Surveys the historical development of automated technical processing in schools and notes the impact of this automation in a number of cases. Speculations about the future involvement of school libraries in automated processing and networking are included. Thirty references are listed. (BBM)
Sédille-Mostafaie, Nazanin; Engler, Hanna; Lutz, Susanne; Korte, Wolfgang
2013-06-01
Laboratories today face increasing pressure to automate operations due to increasing workloads and the need to reduce expenditure. Few studies to date have focused on the laboratory automation of preanalytical coagulation specimen processing. In the present study, we examined whether a clinical chemistry automation protocol meets the preanalytical requirements for coagulation analyses. During the implementation of laboratory automation, we began to operate a pre- and postanalytical automation system. The preanalytical unit processes blood specimens for chemistry, immunology, and coagulation by automated specimen processing. As the production of platelet-poor plasma is highly dependent on optimal centrifugation, we examined specimen handling under different centrifugation conditions in order to produce optimal platelet-poor plasma specimens. To this end, manually processed models centrifuged at 1500 g for 5 and 20 min were compared to an automated centrifugation model at 3000 g for 7 min. For analytical assays that are performed frequently enough to be targets for full automation, Passing-Bablok regression analysis showed close agreement between the different centrifugation methods, with a correlation coefficient between 0.98 and 0.99 and a bias between -5% and +6%. For seldom-performed assays that do not mandate full automation, Passing-Bablok regression analysis showed acceptable to poor agreement between the different centrifugation methods. A full automation solution is suitable and can be recommended for frequent haemostasis testing.
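Passing-Bablok regression, used above to compare centrifugation protocols, estimates the slope as a shifted median of all pairwise slopes, making it robust to outliers and to error in both methods. A simplified sketch (ignoring the tie handling and confidence intervals of the full method), with made-up paired measurements:

```python
import statistics

def passing_bablok(x, y):
    """Simplified Passing-Bablok slope/intercept. Assumes no ties in x and
    no pairwise slope exactly equal to -1 (cases the full method handles
    specially)."""
    slopes = []
    n = len(x)
    for i in range(n):
        for j in range(i + 1, n):
            if x[j] != x[i]:
                slopes.append((y[j] - y[i]) / (x[j] - x[i]))
    slopes.sort()
    k = sum(1 for s in slopes if s < -1)   # offset correcting for negative slopes
    m = len(slopes)
    if m % 2 == 1:                          # shifted median of pairwise slopes
        b = slopes[(m - 1) // 2 + k]
    else:
        b = 0.5 * (slopes[m // 2 - 1 + k] + slopes[m // 2 + k])
    a = statistics.median(yi - b * xi for xi, yi in zip(x, y))
    return b, a

# Illustrative values only: near-identical results from two protocols.
manual = [1.0, 2.0, 3.0, 4.0, 5.0]
auto_ = [1.02, 1.98, 3.05, 3.96, 5.01]
slope, intercept = passing_bablok(manual, auto_)
```

Close agreement between methods shows up as a slope near 1 and an intercept near 0, which is how the study's "close agreement" conclusion is read off the regression.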
Lommen, Arjen; van der Kamp, Henk J; Kools, Harrie J; van der Lee, Martijn K; van der Weg, Guido; Mol, Hans G J
2012-11-09
A new alternative data processing tool set, metAlignID, has been developed for automated pre-processing and library-based identification and concentration estimation of target compounds after analysis by comprehensive two-dimensional gas chromatography with mass spectrometric detection. The tool set has been developed for and tested on LECO data. The software runs multi-threaded (one thread per processor core) on a standard PC (personal computer) under different operating systems and is thus capable of processing multiple data sets simultaneously. Raw data files are converted into netCDF (network Common Data Form) format using a fast conversion tool. They are then preprocessed using previously developed algorithms originating from the metAlign software. Next, the resulting reduced data files are searched against a user-composed library (derived from user or commercial NIST-compatible libraries; NIST = National Institute of Standards and Technology), and the identified compounds, including an indicative concentration, are reported in Excel format. Data can be processed batch-wise. The overall time needed for conversion together with processing and searching of 30 raw data sets for 560 compounds is routinely within an hour. The screening performance is evaluated for detection of pesticides and contaminants in raw data obtained after analysis of soil and plant samples. Results are compared to the existing data-handling routine based on proprietary software (LECO ChromaTOF). The developed software tool set, which is freely downloadable at www.metalign.nl, greatly accelerates data analysis and offers more options for fine-tuning automated identification toward specific application needs. The quality of the results obtained is slightly better than with the standard processing, and the tool also adds a quantitative estimate.
The software tool set in combination with two-dimensional gas chromatography coupled to time-of-flight mass spectrometry shows great potential as a highly-automated and fast multi-residue instrumental screening method. Copyright © 2012 Elsevier B.V. All rights reserved.
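The library-search step in such screening pipelines can be illustrated with a basic spectral dot-product match. The compounds, spectra, and threshold below are invented for illustration and do not reflect metAlignID's actual matching algorithm or library format.

```python
import math

# Toy centroided mass spectra: {nominal m/z: relative intensity}.
# Library entries are illustrative placeholders only.
library = {
    "compound_A": {97: 100, 197: 80, 199: 60, 314: 40},
    "compound_B": {200: 100, 215: 70, 58: 50, 173: 30},
}

def cosine_score(spec_a, spec_b):
    """Dot-product (cosine) similarity between two spectra."""
    mz = set(spec_a) | set(spec_b)
    dot = sum(spec_a.get(m, 0) * spec_b.get(m, 0) for m in mz)
    na = math.sqrt(sum(v * v for v in spec_a.values()))
    nb = math.sqrt(sum(v * v for v in spec_b.values()))
    return dot / (na * nb)

def search(spectrum, threshold=0.8):
    """Return (compound, score) hits above the match threshold, best first."""
    hits = [(name, cosine_score(spectrum, ref)) for name, ref in library.items()]
    return sorted([h for h in hits if h[1] >= threshold],
                  key=lambda h: h[1], reverse=True)

unknown = {97: 95, 197: 85, 199: 55, 314: 35}
hits = search(unknown)
```

Batch-wise operation is then just a loop over many extracted spectra, with each spectrum scored against every library entry; multi-threading such a loop is what makes 30 data sets against 560 compounds feasible within an hour.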
Zhan, Mei; Crane, Matthew M; Entchev, Eugeni V; Caballero, Antonio; Fernandes de Abreu, Diana Andrea; Ch'ng, QueeLim; Lu, Hang
2015-04-01
Quantitative imaging has become a vital technique in biological discovery and clinical diagnostics; a plethora of tools have recently been developed to enable new and accelerated forms of biological investigation. Increasingly, the capacity for high-throughput experimentation provided by new imaging modalities, contrast techniques, microscopy tools, microfluidics and computer controlled systems shifts the experimental bottleneck from the level of physical manipulation and raw data collection to automated recognition and data processing. Yet, despite their broad importance, image analysis solutions to address these needs have been narrowly tailored. Here, we present a generalizable formulation for autonomous identification of specific biological structures that is applicable for many problems. The process flow architecture we present here utilizes standard image processing techniques and the multi-tiered application of classification models such as support vector machines (SVM). These low-level functions are readily available in a large array of image processing software packages and programming languages. Our framework is thus both easy to implement at the modular level and provides specific high-level architecture to guide the solution of more complicated image-processing problems. We demonstrate the utility of the classification routine by developing two specific classifiers as a toolset for automation and cell identification in the model organism Caenorhabditis elegans. To serve a common need for automated high-resolution imaging and behavior applications in the C. elegans research community, we contribute a ready-to-use classifier for the identification of the head of the animal under bright field imaging. Furthermore, we extend our framework to address the pervasive problem of cell-specific identification under fluorescent imaging, which is critical for biological investigation in multicellular organisms or tissues. 
Using these examples as a guide, we envision the broad utility of the framework for diverse problems across different length scales and imaging methods.
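The multi-tiered classification idea described above, with a support vector machine labeling candidate regions from precomputed image features, can be sketched as follows. The two-dimensional descriptors and labels are synthetic placeholders, not features from the published classifiers.

```python
from sklearn import svm

# Assumed precomputed intensity/shape descriptors for candidate regions,
# labeled e.g. "head" (1) vs "background" (0). Values are synthetic.
features = [
    [0.9, 0.1], [0.8, 0.2], [0.85, 0.15],   # head-like regions
    [0.2, 0.9], [0.1, 0.8], [0.15, 0.95],   # background regions
]
labels = [1, 1, 1, 0, 0, 0]

# Train an SVM on the labeled examples, then classify a new region.
clf = svm.SVC(kernel="linear")
clf.fit(features, labels)

candidate = [0.88, 0.12]               # descriptor of a new candidate region
predicted = clf.predict([candidate])[0]
```

In the multi-tiered scheme, the output of one such classifier (e.g. head vs not-head under bright field) gates the next stage (e.g. cell-specific identification under fluorescence), so each model stays simple and separately trainable.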
Demonstration of the feasibility of automated silicon solar cell fabrication
NASA Technical Reports Server (NTRS)
Taylor, W. E.; Schwartz, F. M.
1975-01-01
A study effort was undertaken to determine the process steps and design requirements of an automated silicon solar cell production facility. The key process steps were identified, and a laboratory model was conceptually designed to demonstrate the feasibility of automating the silicon solar cell fabrication process. A detailed laboratory model was designed to demonstrate those functions most critical to the feasibility of automating solar cell fabrication. The study and conceptual design have established the technical feasibility of automating the solar cell manufacturing process to produce low-cost solar cells with improved performance. Estimates predict an automated process throughput of 21,973 kilograms of silicon a year on a three-shift, 49-week basis, producing 4,747,000 hexagonal cells (38 mm/side), a total of 3,373 kilowatts, at an estimated manufacturing cost of $0.866 per cell or $1.22 per watt.
Automated Space Processing Payloads Study. Volume 1: Executive Summary
NASA Technical Reports Server (NTRS)
1975-01-01
An investigation is described which examined the extent to which the experiment hardware and operational requirements can be met by automatic control and material handling devices; payload and system concepts are defined which make extensive use of automation technology. Topics covered include experiment requirements and hardware data, capabilities and characteristics of industrial automation equipment and controls, payload grouping, automated payload conceptual design, space processing payload preliminary design, automated space processing payloads for early shuttle missions, and cost and scheduling.
Information management - Assessing the demand for information
NASA Technical Reports Server (NTRS)
Rogers, William H.
1991-01-01
Information demand is defined in terms of both information content (what information) and form (when, how, and where it is needed). Providing the information richness required for flight crews to be informed without overwhelming their information processing capabilities will require a great deal of automated intelligence. It is seen that the essence of this intelligence is comprehending and capturing the demand for information.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rahman, Nur Aira Abd, E-mail: nur-aira@nuclearmalaysia.gov.my; Yussup, Nolida; Ibrahim, Maslina Bt. Mohd
Neutron Activation Analysis (NAA) has been established at Nuclear Malaysia since the 1980s. Most of the established procedures, including sample registration, were performed manually: samples were recorded by hand in a logbook and given an ID number, and then all samples, standards, SRMs and blanks were recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures, carried out by the NAA laboratory personnel, were time-consuming and inefficient. Sample registration software was therefore developed as part of the IAEA/CRP project 'Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)'. The objective of the project is to create PC-based data entry software for the sample preparation stage, an effective replacement for the redundant manual data entries that laboratory personnel previously had to complete. The software automatically generates a sample code for each sample in a batch, creates printable registration forms for administration purposes, and stores selected parameters that are passed to the sample analysis program. The software was developed using National Instruments LabVIEW 8.6.
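The original registration software was written in LabVIEW; as a language-neutral illustration of the batch sample-coding step it describes, here is a hypothetical Python sketch (the code format and type tags are invented for this example):

```python
def generate_sample_codes(batch_id, samples):
    """Assign a sequential code to every item in one irradiation batch.

    Hypothetical code format: <batch>-<type>-<running number>, covering
    samples as well as standards, SRMs and blanks, mirroring the manual
    logbook entries described above.
    """
    codes = {}
    for i, (name, kind) in enumerate(samples, start=1):
        codes[name] = f"{batch_id}-{kind}-{i:03d}"
    return codes

# One illustrative batch: a sample, a standard reference material, a blank.
batch = [("soil A", "SMP"), ("IAEA-Soil-7", "SRM"), ("empty vial", "BLK")]
print(generate_sample_codes("NAA2024-01", batch))
```

Generating codes programmatically removes the duplicate transcription onto vials and forms that the manual procedure required.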
NASA Astrophysics Data System (ADS)
Hoffman, Kenneth J.
1995-10-01
Few information systems create a standardized clinical patient record containing discrete, concise observations of patient problems and their resolution. Clinical notes are usually narratives that do not support aggregate, systematic outcome analysis. Many programs collect information on diagnoses and coded procedures but are not focused on patient problems. Integrated definition (IDEF) methodology has been accepted by the Department of Defense as part of the Corporate Information Management Initiative and serves as the foundation that establishes a need for automation. We used IDEF modeling to describe present and idealized patient care activities. A logical IDEF data model was created to support those activities. The modeling process allows for accurate cost estimates based upon performed activities, efficient collection of relevant information, and outputs that allow real-time assessments of process and outcomes. This model forms the foundation for a prototype automated clinical information system (ACIS).
Automated Classification of Phonological Errors in Aphasic Language
Ahuja, Sanjeev B.; Reggia, James A.; Berndt, Rita S.
1984-01-01
Using heuristically-guided state space search, a prototype program has been developed to simulate and classify phonemic errors occurring in the speech of neurologically-impaired patients. Simulations are based on an interchangeable rule/operator set of elementary errors which represent a theory of phonemic processing faults. This work introduces and evaluates a novel approach to error simulation and classification, it provides a prototype simulation tool for neurolinguistic research, and it forms the initial phase of a larger research effort involving computer modelling of neurolinguistic processes.
Understanding human management of automation errors
McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.
2013-01-01
Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042
Santos, R S; Malheiros, S M F; Cavalheiro, S; de Oliveira, J M Parente
2013-03-01
Cancer is the leading cause of death in economically developed countries and the second leading cause of death in developing countries. Malignant brain neoplasms are among the most devastating and incurable forms of cancer, and their treatment may be excessively complex and costly. Public health decision makers require significant amounts of analytical information to manage public treatment programs for these patients. Data mining, a technology that is used to produce analytically useful information, has been employed successfully with medical data. However, the large-scale adoption of this technique has been limited thus far because it is difficult to use, especially for non-expert users. One way to facilitate data mining by non-expert users is to automate the process. Our aim is to present an automated data mining system that allows public health decision makers to access analytical information regarding brain tumors. The emphasis in this study is the use of ontology in an automated data mining process. The non-experts who tried the system obtained useful information about the treatment of brain tumors. These results suggest that future work should be conducted in this area. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Schmitt, J Eric; Scanlon, Mary H; Servaes, Sabah; Levin, Dayna; Cook, Tessa S
2015-10-01
The advent of the ACGME's Next Accreditation System represents a significant new challenge for residencies and fellowships, owing to its requirements for more complex and detailed information. We developed a system of online assessment tools to provide comprehensive coverage of the twelve ACGME Milestones and digitized them using freely available cloud-based productivity tools. These tools include a combination of point-of-care procedural assessments, electronic quizzes, online modules, and other data entry forms. Using free statistical analytic tools, we also developed an automated system for management, processing, and data reporting. After one year of use, our Milestones project has resulted in the submission of over 20,000 individual data points. The use of automated statistical methods to generate resident-specific profiles has allowed for dynamic reports of individual residents' progress. These profiles both summarize data and also allow program directors access to more granular information as needed. Informatics-driven strategies for data assessment and processing represent feasible solutions to Milestones assessment and analysis, reducing the potential administrative burden for program directors, residents, and staff. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Brouwer, Albert; Brown, David; Tomuta, Elena
2017-04-01
To detect nuclear explosions, waveform data from over 240 SHI stations world-wide flows into the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), located in Vienna, Austria. A complex pipeline of software applications processes this data in numerous ways to form event hypotheses. The software codebase comprises over 2 million lines of code, reflects decades of development, and is subject to frequent enhancement and revision. Since processing must run continuously and reliably, software changes are subjected to thorough testing before being put into production. To overcome the limitations and cost of manual testing, the Continuous Automated Testing System (CATS) has been created. CATS provides an isolated replica of the IDC processing environment, and is able to build and test different versions of the pipeline software directly from code repositories that are placed under strict configuration control. Test jobs are scheduled automatically when code repository commits are made. Regressions are reported. We present the CATS design choices and test methods. Particular attention is paid to how the system accommodates the individual testing of strongly interacting software components that lack test instrumentation.
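The commit-triggered scheduling and regression reporting that CATS performs can be sketched in miniature. The following simulation is purely illustrative; the function and data names are hypothetical, not the actual CATS interfaces:

```python
def schedule_tests(seen, commits, run_suite):
    """Queue a test job for every not-yet-tested commit and flag regressions.

    `run_suite` maps a commit id to {test_name: passed}; `seen` carries a
    "baseline" result set so regressions (pass -> fail) can be reported.
    """
    reports = []
    for commit in commits:
        if commit in seen:
            continue                      # already tested this commit
        results = run_suite(commit)
        baseline = seen.get("baseline", {})
        regressions = [t for t, ok in results.items()
                       if not ok and baseline.get(t, False)]
        seen[commit] = results
        reports.append((commit, regressions))
    return reports

# Simulated pipeline: commit "b" breaks the previously passing "assoc" test.
suites = {"a": {"detect": True, "assoc": True},
          "b": {"detect": True, "assoc": False}}
seen = {"baseline": {"detect": True, "assoc": True}}
print(schedule_tests(seen, ["a", "b"], suites.get))  # → [('a', []), ('b', ['assoc'])]
```

The essential design point mirrors the article: testing is driven automatically by repository commits, and the output of interest is the delta against a known-good baseline rather than raw pass/fail counts.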
On the modeling of separation foils in thermoforming simulations
NASA Astrophysics Data System (ADS)
Margossian, Alexane; Bel, Sylvain; Hinterhölzl, Roland
2016-10-01
Composite forming simulations model the forming process of composite components to anticipate potential flaws such as out-of-plane wrinkles and fibre re-orientation. Forming methods are often automated processes in which flat composite blanks are forced to comply with tool geometries. Although Finite Element forming simulations require modelling of all participating components (blankholder, tooling and composite blank), consumables such as separation films are often not considered. Used in thermoforming processes, these films are placed between tooling and composite to ease part removal after forming; they are also used to decrease tool/ply friction and thus enhance forming quality. This work presents thermoforming simulations of pre-impregnated carbon fibre thermoplastic blanks in which separation films are modelled in the same manner as composite layers, i.e. by a layer of shell elements. The mechanical properties of the films are also characterised at the forming temperature. The proposed approach is finally compared with the usual modelling method, in which separation films are not modelled explicitly and their influence is considered only within the friction coefficient between tooling and blank.
Automation of 3D cell culture using chemically defined hydrogels.
Rimann, Markus; Angres, Brigitte; Patocchi-Tenzer, Isabel; Braum, Susanne; Graf-Hausner, Ursula
2014-04-01
Drug development relies on high-throughput screening involving cell-based assays. Most of the assays are still based on cells grown in monolayer rather than in three-dimensional (3D) formats, although cells behave more in vivo-like in 3D. To exemplify the adoption of 3D techniques in drug development, this project investigated the automation of a hydrogel-based 3D cell culture system using a liquid-handling robot. The hydrogel technology used offers high flexibility of gel design due to a modular composition of a polymer network and bioactive components. The cell inert degradation of the gel at the end of the culture period guaranteed the harmless isolation of live cells for further downstream processing. Human colon carcinoma cells HCT-116 were encapsulated and grown in these dextran-based hydrogels, thereby forming 3D multicellular spheroids. Viability and DNA content of the cells were shown to be similar in automated and manually produced hydrogels. Furthermore, cell treatment with toxic Taxol concentrations (100 nM) had the same effect on HCT-116 cell viability in manually and automated hydrogel preparations. Finally, a fully automated dose-response curve with the reference compound Taxol showed the potential of this hydrogel-based 3D cell culture system in advanced drug development.
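Dose-response experiments like the automated Taxol curve described above are conventionally summarised with a four-parameter logistic (Hill) curve. The following numpy-only sketch uses illustrative parameters, not measured values from the study:

```python
import numpy as np

def hill(conc, bottom, top, ic50, slope):
    """Four-parameter logistic dose-response curve (viability vs. dose)."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** slope)

# Illustrative parameters only: viability falls from 100% to 10% around 8 nM.
doses_nM = np.logspace(-1, 3, 9)          # 0.1 nM ... 1000 nM
viability = hill(doses_nM, bottom=10.0, top=100.0, ic50=8.0, slope=1.2)

# Crude IC50 read-out: the tested dose where viability is nearest the midpoint.
half = (100.0 + 10.0) / 2.0
idx = int(np.argmin(np.abs(viability - half)))
print(f"estimated IC50 = {doses_nM[idx]:.1f} nM")
```

In a fully automated workflow, the same curve fit would be run on the plate-reader viability values produced for each hydrogel condition.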
Liyanage, Kishan Andre; Steward, Christopher; Moffat, Bradford Armstrong; Opie, Nicholas Lachlan; Rind, Gil Simon; John, Sam Emmanuel; Ronayne, Stephen; May, Clive Newton; O'Brien, Terence John; Milne, Marjorie Eileen; Oxley, Thomas James
2016-01-01
Segmentation is the process of partitioning an image into subdivisions and can be applied to medical images to isolate anatomical or pathological areas for further analysis. This process can be done manually or automated with image processing software. Atlas-based segmentation automates the process by using a pre-labelled template and a registration algorithm. We developed an ovine brain atlas that can be used as a model for neurological conditions such as Parkinson's disease and focal epilepsy. Seventeen female Corriedale ovine brains were imaged in vivo in a 1.5 T (low-resolution) MRI scanner. Thirteen of the low-resolution images were combined using a template construction algorithm to form a low-resolution template. The template was labelled to form an atlas and tested by comparing manual with atlas-based segmentations against the remaining four low-resolution images. The comparisons used similarity metrics from previous segmentation research: Dice Similarity Coefficients were utilised to determine the degree of overlap between eight independent, manual and atlas-based segmentations, with values ranging from 0 (no overlap) to 1 (complete overlap). For seven of these eight segmented areas, we achieved a Dice Similarity Coefficient of 0.5-0.8. The amygdala was difficult to segment due to its variable location and similar intensity to surrounding tissues, resulting in Dice Coefficients of 0.0-0.2. We developed a low-resolution ovine brain atlas with eight clinically relevant areas labelled. This brain atlas performed comparably to prior human atlases described in the literature and to intra-observer error, providing an atlas that can guide further research using ovine brains as a model; it is hosted online for public access.
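The Dice Similarity Coefficient used above has a direct implementation; a short numpy sketch on toy binary masks:

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice Similarity Coefficient between two binary segmentation masks:
    2*|A and B| / (|A| + |B|), from 0 (no overlap) to 1 (identical masks)."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: define as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom

# Toy "manual" vs "atlas-based" masks: same size, shifted by one row.
manual = np.zeros((8, 8), dtype=bool); manual[2:6, 2:6] = True   # 16 voxels
atlas = np.zeros((8, 8), dtype=bool);  atlas[3:7, 2:6] = True    # 16 voxels
print(dice(manual, atlas))  # 2*12 / (16+16) = 0.75
```

A one-row misregistration of a small structure already drops the coefficient to 0.75, which helps explain why a small, variably located structure like the amygdala scores so poorly.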
Automation of Space Processing Applications Shuttle payloads
NASA Technical Reports Server (NTRS)
Crosmer, W. E.; Neau, O. T.; Poe, J.
1975-01-01
The Space Processing Applications Program is examining the effect of weightlessness on key industrial materials processes, such as crystal growth, fine-grain casting of metals, and production of unique and ultra-pure glasses. Because of safety and in order to obtain optimum performance, some of these processes lend themselves to automation. Automation can increase the number of potential Space Shuttle flight opportunities and increase the overall productivity of the program. Five automated facility design concepts and overall payload combinations incorporating these facilities are presented.
Development of a plan for automating integrated circuit processing
NASA Technical Reports Server (NTRS)
1971-01-01
The operations analysis and equipment evaluations pertinent to the design of an automated production facility capable of manufacturing beam-lead CMOS integrated circuits are reported. The overall plan gives the approximate cost of major equipment, production rate and performance capability, flexibility, and special maintenance requirements. Direct computer control is compared with supervisory-mode operation. The plan is limited to wafer processing operations, from the starting wafer to the finished beam-lead die after separation etching. The work already accomplished in implementing various automation schemes, and the types of equipment available for immediate automation, are described. The plan is general, so that both small shops and large production units may benefit. Examples of major types of automated processing machines illustrate the general concepts of automated wafer processing.
The Hyperspectral Imager for the Coastal Ocean (HICO): Sensor and Data Processing Overview
2010-01-20
NRL developed the Automated Processing System (APS), which processes satellite data into ocean color data products (backscattering coefficients, among others). APS is a collection of methods used for ocean color processing that provides the tools for the automated processing of satellite imagery [1]. Several of the software modules described here will be developed within APS.
10 CFR 1017.28 - Processing on Automated Information Systems (AIS).
Code of Federal Regulations, 2010 CFR
2010-01-01
Title 10 (Energy), Safeguarding of Unclassified Controlled Nuclear Information, Physical Protection Requirements, § 1017.28, Processing on Automated Information Systems (AIS): UCNI may be processed or produced on any AIS that complies with the guidance in OMB...
2018-01-01
ARL-TR-8270 ● JAN 2018 ● US Army Research Laboratory: An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform, by Kwok F Tom (Sensors and Electron...). Reporting period: 1 October 2016-30 September 2017.
Nyholm, Sven
2017-07-18
Many ethicists writing about automated systems (e.g. self-driving cars and autonomous weapons systems) attribute agency to these systems. Not only that; they seemingly attribute an autonomous or independent form of agency to these machines. This leads some ethicists to worry about responsibility-gaps and retribution-gaps in cases where automated systems harm or kill human beings. In this paper, I consider what sorts of agency it makes sense to attribute to most current forms of automated systems, in particular automated cars and military robots. I argue that whereas it indeed makes sense to attribute different forms of fairly sophisticated agency to these machines, we ought not to regard them as acting on their own, independently of any human beings. Rather, the right way to understand the agency exercised by these machines is in terms of human-robot collaborations, where the humans involved initiate, supervise, and manage the agency of their robotic collaborators. This means, I argue, that there is much less room for justified worries about responsibility-gaps and retribution-gaps than many ethicists think.
Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation
2018-01-01
ARL-TR-8284 ● JAN 2018 ● US Army Research Laboratory: Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation.
Gene ARMADA: an integrated multi-analysis platform for microarray data implemented in MATLAB.
Chatziioannou, Aristotelis; Moulos, Panagiotis; Kolisis, Fragiskos N
2009-10-27
The microarray data analysis realm is ever growing through the development of various tools, open source and commercial. However, there is an absence of predefined, rational algorithmic analysis workflows or standardized batch processing to incorporate all steps, from raw data import up to the derivation of significantly differentially expressed gene lists. This absence obfuscates the analytical procedure and obstructs the massive comparative processing of genomic microarray datasets. Moreover, the solutions provided depend heavily on the programming skills of the user, whereas GUI-embedded solutions do not provide direct support for various raw image analysis formats or a versatile yet flexible combination of signal processing methods. We describe here Gene ARMADA (Automated Robust MicroArray Data Analysis), a MATLAB-implemented platform with a Graphical User Interface. This suite integrates all steps of microarray data analysis, including automated data import, noise correction and filtering, normalization, statistical selection of differentially expressed genes, clustering, classification and annotation. In its current version, Gene ARMADA fully supports two-colour cDNA and Affymetrix oligonucleotide arrays, plus custom arrays for which experimental details are given in tabular form (Excel spreadsheet, comma-separated values, or tab-delimited text formats). It also supports the analysis of already processed results through its versatile import editor. Besides being fully automated, Gene ARMADA incorporates numerous functionalities of the Statistics and Bioinformatics Toolboxes of MATLAB. In addition, it provides numerous visualization and exploration tools, plus customizable export data formats for seamless integration with other analysis tools or MATLAB for further processing. Gene ARMADA requires MATLAB 7.4 (R2007a) or higher and is also distributed as a stand-alone application with the MATLAB Component Runtime.
Gene ARMADA provides a highly adaptable, integrative, yet flexible tool which can be used for automated quality control, analysis, annotation and visualization of microarray data, constituting a starting point for further data interpretation and integration with numerous other tools.
Continuous Calibration of Trust in Automated Systems
2014-01-01
Misuse and disuse can have fatal consequences; for example, inappropriate automation reliance has been implicated in the recent crash of Asiana Airlines Flight 214 in San Francisco. Therefore, understanding how users form, lose, and recover trust in imperfect automation is of critical importance.
Out-of-autoclave manufacturing of a stiffened thermoplastic carbon fibre PEEK panel
NASA Astrophysics Data System (ADS)
Flanagan, M.; Goggins, J.; Doyle, A.; Weafer, B.; Ward, M.; Bizeul, M.; Canavan, R.; O'Bradaigh, C.; Doyle, K.; Harrison, N.
2017-10-01
Out-of-Autoclave manufacturing methods, specifically Automated Tape Placement (ATP) and induction welding, used in the fabrication of a stiffened thermoplastic demonstrator panel, are presented in this study. The demonstrator panel consists of two stiffeners induction welded to a flat skin, to form a typical load bearing aerospace sub-component. The skin of the panel is manufactured from uni-directional Carbon Fibre (CF) Polyetheretherkeytone (PEEK) using laser assisted Automated Tape Placement (ATP) and the stiffeners are press formed from woven CF-PEEK. The stiffeners are fusion bonded to the skin using a continuous induction welding process. A susceptor material is used at the interface to ensure the required heating is concentrated at the weldline. Microscopy was used to examine the manufactured coupons for defects. Destructive testing was carried out to evaluate the strength of the overall assembly. The work shows that assemblies manufactured using continuous induction welding and ATP are suitable for load bearing aerospace applications.
Sequence-of-events-driven automation of the deep space network
NASA Technical Reports Server (NTRS)
Hill, R., Jr.; Fayyad, K.; Smyth, C.; Santos, T.; Chen, R.; Chien, S.; Bevan, R.
1996-01-01
In February 1995, sequence-of-events (SOE)-driven automation technology was demonstrated for a Voyager telemetry downlink track at DSS 13. This demonstration entailed automated generation of an operations procedure (in the form of a temporal dependency network) from project SOE information using artificial intelligence planning technology and automated execution of the temporal dependency network using the link monitor and control operator assistant system. This article describes the overall approach to SOE-driven automation that was demonstrated, identifies gaps in SOE definitions and project profiles that hamper automation, and provides detailed measurements of the knowledge engineering effort required for automation.
Methods for Preparing Nanoparticle-Containing Thermoplastic Composite Laminates
NASA Technical Reports Server (NTRS)
Jensen, Brian J. (Inventor); Cano, Roberto J. (Inventor); Gruber, Mark B. (Inventor)
2016-01-01
High quality thermoplastic composites and composite laminates containing nanoparticles and/or nanofibers, and methods of producing such composites and laminates are disclosed. The composites comprise a thermoplastic polymer and a plurality of nanoparticles, and may include a fibrous structural reinforcement. The composite laminates are formed from a plurality of nanoparticle-containing composite layers and may be fused to one another via an automated process.
Creating ISO/EN 13606 archetypes based on clinical information needs.
Rinner, Christoph; Kohler, Michael; Hübner-Bloder, Gudrun; Saboor, Samrend; Ammenwerth, Elske; Duftschmid, Georg
2011-01-01
Archetypes model individual EHR contents and build the basis of the dual-model approach used in the ISO/EN 13606 EHR architecture. We present an approach to create archetypes using an iterative development process. It includes automated generation of electronic case report forms from archetypes. We evaluated our approach by developing 128 archetypes which represent 446 clinical information items from the diabetes domain.
USSR Report: Machine Tools and Metalworking Equipment.
1986-01-23
between the satellite stop and the camshaft of the programmer unit. The line has 23 positions, including 12 automatic ones. Specification of line: Number... The network topologies (for technological processes, automated research, etc.) are as follows: a monochannel based on a shared trunk line; ring; star; and tree (polychannel...). Line or ring networks based on decentralized control of data exchange between subscribers are very robust. A tree-form network has a star structure
An Intelligent Automation Platform for Rapid Bioprocess Design.
Wu, Tianyi; Zhou, Yuhong
2014-08-01
Bioprocess development is very labor-intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet-lab experiments, (2) standalone analysis devices acquire the data, and (3) specific software is used for data analysis and experiment design given the user's inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multi-agent intelligent architecture has been developed, incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design, together with the ability to generate scripts that run the experiments automatically, eliminating human involvement. A first-generation prototype has been established and demonstrated on a lysozyme precipitation process design; all procedures in the case study were fully automated through the intelligent automation platform. The realization of automated data analysis, experiment design, and script programming for experimental procedures has the potential to increase lab productivity. © 2013 Society for Laboratory Automation and Screening.
Automated Student Aid Processing: The Challenge and Opportunity.
ERIC Educational Resources Information Center
St. John, Edward P.
1985-01-01
To utilize automated technology for student aid processing, it is necessary to work with multiple institutional offices (student aid, admissions, registration, and business) and to develop automated interfaces with external processing systems at state and federal agencies, and perhaps at need-analysis organizations and lenders. (MLW)
[The use of programmed microcalculators for automation of leukocyte count].
Plykin, D L
1989-01-01
Soviet programmed microcalculators are recommended for calculating the leukocyte formula in serial blood analyses at clinical laboratories. The suggested program completely automates the tallying of the leukocyte types detectable on microscopic examination of blood smears; the results may be obtained as a percentage ratio of the cells (the most prevalent form nowadays) and as their quantity per microliter of blood. Service elements in the program essentially simplify the work, making it convenient for an untrained user of the microcalculator. Since commercial Soviet programmed microcalculators differ somewhat in their systems of program steps, two variants of the program are suggested, adapted to the two most prevalent designs.
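The calculation the microcalculator program performed is simple to state; a hypothetical Python sketch of the same two report forms (illustrative cell counts, not clinical data):

```python
def differential_count(tallies, wbc_per_ul):
    """Convert raw leukocyte tallies from a smear into the two report forms
    described above: percentage of each cell type and absolute count per uL."""
    total = sum(tallies.values())
    report = {}
    for cell, n in tallies.items():
        pct = 100.0 * n / total
        report[cell] = (round(pct, 1), round(wbc_per_ul * pct / 100.0))
    return report

# A 100-cell manual differential with a total WBC of 6000 per uL.
tallies = {"neutrophils": 60, "lymphocytes": 30, "monocytes": 6, "eosinophils": 4}
print(differential_count(tallies, 6000))
```

For example, 60 neutrophils out of 100 tallied cells reports as 60.0% and, at 6000 WBC per microliter, an absolute count of 3600 per microliter.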
Knowledge-based requirements analysis for automating software development
NASA Technical Reports Server (NTRS)
Markosian, Lawrence Z.
1988-01-01
We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable, and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications, and eventually implementations, are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention takes the form of providing problem- and domain-specific engineering knowledge, not writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.
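As a toy analogue of deriving a buffer implementation from declarative requirements, the following sketch maps a small requirement spec to working code. The spec keys and overflow policies are invented for illustration and bear no relation to the prototype's actual specification language:

```python
def synthesize_buffer(spec):
    """Toy transformational step: derive a FIFO buffer class from a
    declarative requirement (capacity plus an overflow policy)."""
    capacity = spec["capacity"]
    drop_oldest = spec.get("on_overflow") == "drop-oldest"

    class Buffer:
        def __init__(self):
            self.items = []

        def put(self, x):
            if len(self.items) >= capacity:
                if drop_oldest:
                    self.items.pop(0)   # ring-buffer behavior
                else:
                    raise OverflowError("buffer full")
            self.items.append(x)

        def get(self):
            return self.items.pop(0)

    return Buffer

# "Requirement": capacity 2, oldest element discarded on overflow.
RingBuffer = synthesize_buffer({"capacity": 2, "on_overflow": "drop-oldest"})
b = RingBuffer()
for x in (1, 2, 3):
    b.put(x)
print(b.items)  # → [2, 3]: the oldest element was dropped
```

The point mirrored from the paradigm is that the human supplies only the declarative knowledge (capacity, overflow policy); the code shape is produced mechanically.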
Advances in Composites Technology
NASA Technical Reports Server (NTRS)
Tenney, D. R.; Dexter, H. B.
1985-01-01
A significant level of research is currently focused on the development of tough resins and high strain fibers in an effort to gain improved damage tolerance. Moderate success has been achieved with the development of new resins such as PEEK and additional improvements look promising with new thermoplastic resins. Development of innovative material forms such as 2-D and 3-D woven fabrics and braided structural subelements is also expected to improve damage tolerance and durability of composite hardware. The new thrust in composites is to develop low cost manufacturing and design concepts to lower the cost of composite hardware. Processes being examined include automated material placement, filament winding, pultrusion, and thermoforming. The factory of the future will likely incorporate extensive automation in all aspects of manufacturing composite components.
NASA Astrophysics Data System (ADS)
Chan-Amaya, Alejandro; Anaya-Pérez, María Elena; Benítez-Baltazar, Víctor Hugo
2017-08-01
Companies are constantly looking for improvements in productivity to increase their competitiveness, and the use of automation technologies is a tool that has been proven effective for achieving this. Some companies are not familiar with the process of acquiring automation technologies; they therefore abstain from investment and miss the opportunity to take advantage of it. The present document proposes a methodology to determine the level of automation appropriate for the production process, thus minimizing automation where it is not needed and improving production, taking the ergonomics factor into consideration.
CFD Process Pre- and Post-processing Automation in Support of Space Propulsion
NASA Technical Reports Server (NTRS)
Dorney, Suzanne M.
2003-01-01
The use of Computational Fluid Dynamics (CFD) has become standard practice in the design and analysis of the major components used for space propulsion. In an attempt to standardize and improve the CFD process, a series of automated tools has been developed. Through the use of these automated tools, the application of CFD to the design cycle has been improved and streamlined. This paper presents a series of applications in which deficiencies were identified in the CFD process and corrected through the development of automated tools.
Comparability of automated human induced pluripotent stem cell culture: a pilot study.
Archibald, Peter R T; Chandra, Amit; Thomas, Dave; Chose, Olivier; Massouridès, Emmanuelle; Laâbi, Yacine; Williams, David J
2016-12-01
Consistent and robust manufacturing is essential for the translation of cell therapies, and the utilisation of automation throughout the manufacturing process may allow for improvements in quality control, scalability, reproducibility, and economics of the process. The aim of this study was to measure and establish the comparability between alternative process steps for the culture of hiPSCs. Consequently, the effects of manual centrifugation and automated non-centrifugation process steps, performed using TAP Biosystems' CompacT SelecT automated cell culture platform, upon the culture of a human induced pluripotent stem cell (hiPSC) line (VAX001024c07) were compared. This study has demonstrated that comparable morphologies and cell diameters were observed in hiPSCs cultured using either manual or automated process steps. However, non-centrifugation hiPSC populations exhibited greater cell yields, greater aggregate rates, increased pluripotency marker expression, and decreased differentiation marker expression compared to centrifugation hiPSCs. A trend for decreased variability in cell yield was also observed after the utilisation of the automated process step. This study also highlights the detrimental effect of the cryopreservation and thawing processes upon the growth and characteristics of hiPSC cultures, and demonstrates that automated hiPSC manufacturing protocols can be successfully transferred between independent laboratories.
The automated system for technological process of spacecraft's waveguide paths soldering
NASA Astrophysics Data System (ADS)
Tynchenko, V. S.; Murygin, A. V.; Emilova, O. A.; Bocharov, A. N.; Laptenok, V. D.
2016-11-01
The paper addresses automated process control of the soldering of spacecraft waveguide paths by means of induction heating. The peculiarities of the induction soldering process are analyzed, and the need to automate the information-control system is identified. The developed automated system controls the heating of the product by varying the power supplied to the inductor on the basis of information about the soldering-zone temperature, stabilizing the temperature in a narrow range above the melting point of the solder but below the melting point of the waveguide. Automating the soldering process in this way improves the quality of the waveguides and eliminates burn-throughs. The article shows a block diagram of the software system, consisting of five modules, and describes its main algorithm. The operation of the automated waveguide-path soldering system is also described, explaining the basic functions and limitations of the system. The developed software allows setting up the measurement equipment, setting and changing parameters of the soldering process, and viewing graphs of the temperatures recorded by the system. Results of experimental studies are presented that demonstrate high-quality control of the soldering process and the system's applicability to automation tasks.
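The stabilization strategy described above can be sketched as a simple hysteresis controller; the temperatures, plant model, and rates below are illustrative assumptions, not the authors' actual control law:

```python
# Minimal sketch (not the paper's controller): hysteresis control that
# switches inductor power to hold the joint temperature in a narrow band
# above the solder melting point but below the waveguide melting point.

T_SOLDER_MELT = 220.0         # degC, hypothetical solder melting point
T_WAVEGUIDE_MELT = 660.0      # degC, hypothetical waveguide material limit
T_LOW, T_HIGH = 230.0, 250.0  # target band between the two limits

def control_step(temp, power_on):
    """Return the next inductor power state given the measured temperature."""
    if temp < T_LOW:
        return True           # below band: heat
    if temp > T_HIGH:
        return False          # above band: coast
    return power_on           # inside band: keep current state (hysteresis)

def simulate(steps=200, dt=0.1):
    """Toy thermal plant: constant heating rate when on, cooling when off."""
    temp, power_on, history = 20.0, True, []
    for _ in range(steps):
        power_on = control_step(temp, power_on)
        temp += (25.0 if power_on else -8.0) * dt
        history.append(temp)
    return history

trace = simulate()
assert max(trace) < T_WAVEGUIDE_MELT  # never endangers the waveguide
```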
Practical automated glass selection and the design of apochromats with large field of view.
Siew, Ronian
2016-11-10
This paper presents an automated approach to the selection of optical glasses for the design of an apochromatic lens with a large field of view, based on a design originally provided by Yang et al. [Appl. Opt. 55, 5977 (2016)]. Starting from this reference's preliminary optimized structure, it is shown that the effort of glass selection is significantly reduced by using the global optimization feature in the Zemax optical design program. The glass selection process is very fast, completing within minutes; the key lies in automating the substitution of glasses found by the global search without simultaneously optimizing any other lens parameter during the glass search. The result is an alternate optimized version of the lens from the above reference, possessing zero axial secondary color within the visible spectrum and a large field of view. Supplementary material is provided in the form of Zemax and text files, before and after final optimization.
Satellite freeze forecast system: Executive summary
NASA Technical Reports Server (NTRS)
Martsolf, J. D. (Principal Investigator)
1983-01-01
A satellite-based temperature monitoring and prediction system, consisting of a computer-controlled acquisition, processing, and display system and the ten automated weather stations polled by that computer, was developed and transferred to the National Weather Service. This satellite freeze forecast system (SFFS) acquires satellite data from either of two sources and surface data from 10 sites, displays the observed data as color-coded thermal maps and as tables of automated weather station temperatures, computes predicted thermal maps on request and displays them either automatically or manually, archives the acquired data, and makes comparisons with historical data. Except for the last function, SFFS handles these tasks in a highly automated fashion if the user so directs. The predicted thermal maps are the result of two models: one a physical energy budget of the soil-atmosphere interface, the other a statistical relationship between the sites at which the physical model predicts temperatures and each pixel of the satellite thermal map.
Automation of a high risk medication regime algorithm in a home health care population.
Olson, Catherine H; Dierich, Mary; Westra, Bonnie L
2014-10-01
Create an automated algorithm for predicting elderly patients' medication-related risks for readmission and validate it by comparing results with a manual analysis of the same patient population. Outcome and Assessment Information Set (OASIS) and medication data were reused from a previous, manual study of 911 patients from 15 Medicare-certified home health care agencies. The medication data were converted into standardized drug codes using APIs managed by the National Library of Medicine (NLM), and then integrated into an automated algorithm that calculates patients' high risk medication regime scores (HRMRs). A comparison of the results between the algorithm and the manual process was conducted to determine how frequently the algorithm derived the HRMR scores that are predictive of readmission. HRMR scores are composed of polypharmacy (number of drugs), Potentially Inappropriate Medications (PIM) (drugs risky to the elderly), and the Medication Regimen Complexity Index (MRCI) (complex dose forms, instructions, or administration). The algorithm produced polypharmacy, PIM, and MRCI scores that matched 99%, 87%, and 99% of the scores, respectively, from the manual analysis. Imperfect match rates resulted from discrepancies in how drugs were classified and coded by the manual analysis vs. the automated algorithm. HRMR rules lack clarity, resulting in clinical judgments for manual coding that were difficult to replicate in the automated analysis. The high comparison rates for the three measures suggest that an automated clinical tool could use patients' medication records to predict their risks of avoidable readmissions. Copyright © 2014 Elsevier Inc. All rights reserved.
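A minimal sketch of composing the three HRMR measures follows; the drug list, MRCI values, and field names are hypothetical, and the study's actual rules (which rely on NLM drug-coding APIs) are considerably more involved:

```python
# Illustrative sketch of a high risk medication regime (HRMR) score built
# from the three measures named in the abstract. All thresholds and the
# PIM list are invented for illustration, not taken from the study.

HYPOTHETICAL_PIM_LIST = {"diazepam", "amitriptyline", "ketorolac"}

def hrmr_score(medications):
    """medications: list of dicts with 'name' and 'mrci' (complexity) entries."""
    polypharmacy = len(medications)                    # number of drugs
    pim = sum(1 for m in medications
              if m["name"].lower() in HYPOTHETICAL_PIM_LIST)
    mrci = sum(m.get("mrci", 1) for m in medications)  # regimen complexity
    return {"polypharmacy": polypharmacy, "pim": pim, "mrci": mrci}

meds = [{"name": "Diazepam", "mrci": 3},
        {"name": "Lisinopril", "mrci": 1}]
assert hrmr_score(meds) == {"polypharmacy": 2, "pim": 1, "mrci": 4}
```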
Electrohydraulic Forming of Near-Net Shape Automotive Panels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Golovaschenko, Sergey F.
2013-09-26
The objective of this project was to develop the electrohydraulic forming (EHF) process as a near-net shape automotive panel manufacturing technology that simultaneously reduces the energy embedded in vehicles and the energy consumed while producing automotive structures. Pulsed pressure is created via a shockwave generated by the discharge of high voltage capacitors through a pair of electrodes in a liquid-filled chamber. The shockwave in the liquid, initiated by the expansion of the plasma channel formed between two electrodes, propagates towards the blank and causes the blank to be deformed into a one-sided die cavity. The numerical model of the EHF process was validated experimentally and was successfully applied to the design of the electrode system and to a multi-electrode EHF chamber for full scale validation of the process. The numerical model was able to predict stresses in the dies during pulsed forming and was validated by the experimental study of the die insert failure mode for corner filling operations. The electrohydraulic forming process and its major subsystems, including durable electrodes, an EHF chamber, a water/air management system, a pulse generator and integrated process controls, were validated to be capable of operating in a fully automated, computer controlled mode for forming of a portion of a full-scale sheet metal component in laboratory conditions. Additionally, the novel processes of electrohydraulic trimming and electrohydraulic calibration were demonstrated at a reduced-scale component level. Furthermore, a hybrid process combining conventional stamping with EHF was demonstrated as a laboratory process for a full-scale automotive panel formed out of AHSS material. The economic feasibility of the developed EHF processes was defined by developing a cost model of the EHF process in comparison to the conventional stamping process.
NASA Technical Reports Server (NTRS)
Pham, Timothy T.; Machuzak, Richard J.; Bedrossian, Alina; Kelly, Richard M.; Liao, Jason C.
2012-01-01
This software provides an automated capability to measure and qualify the frequency stability performance of the Deep Space Network (DSN) ground system, using daily spacecraft tracking data. The results help to verify that the DSN performance is meeting its specification, thereby ensuring commitments to flight missions, in particular the radio science investigations. The rich set of data also helps the DSN Operations and Maintenance team to identify trends and patterns, allowing them to identify the antennas of lower performance and implement corrective action in a timely manner. Unlike the traditional approach, where the performance can only be obtained from special calibration sessions that are both time-consuming and require manual setup, the new method taps into the daily spacecraft tracking data. This new approach significantly increases the amount of data available for analysis, roughly by two orders of magnitude, making it possible to conduct trend analysis with good confidence. The software is built with automation in mind for end-to-end processing. From input gathering to computational analysis and, later, data visualization of the results, all steps are done automatically, making the data production at near zero cost. This allows the limited engineering resource to focus on high-level assessment and to follow up with the exceptions/deviations. To make it possible to process the continual stream of daily incoming data without much effort, and to understand the results quickly, the processing needs to be automated and the data summarized at a high level. Special attention needs to be given to data gathering, input validation, handling anomalous conditions, computation, and presenting the results in a visual form that makes it easy to spot items of exception/deviation so that further analysis can be directed and corrective actions followed.
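The abstract does not name the stability metric, but frequency stability is conventionally quantified with the Allan deviation; a minimal non-overlapping estimator might look like this (an illustration, not the DSN software):

```python
import numpy as np

def allan_deviation(y, m=1):
    """Non-overlapping Allan deviation of fractional-frequency samples y,
    averaged over blocks of m samples (averaging time tau = m * tau0)."""
    n = len(y) // m
    blocks = np.mean(np.reshape(y[:n * m], (n, m)), axis=1)
    diffs = np.diff(blocks)
    return np.sqrt(0.5 * np.mean(diffs ** 2))

# For white frequency noise the ADEV falls with longer averaging time.
rng = np.random.default_rng(0)
y = rng.normal(0.0, 1e-12, 10000)   # synthetic fractional-frequency data
assert allan_deviation(y, 100) < allan_deviation(y, 1)
```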
[Algorithm for the automated processing of rheosignals].
Odinets, G S
1988-01-01
An algorithm for rheosignal recognition on a microprocessor device with a display apparatus and with automated and manual cursor control is examined. The algorithm makes it possible to automate rheosignal registration and processing while taking their variability into account.
Designing Automated Guidance for Concept Diagrams in Inquiry Instruction
ERIC Educational Resources Information Center
Ryoo, Kihyun; Linn, Marcia C.
2016-01-01
Advances in automated scoring technologies have the potential to support student learning during inquiry instruction by providing timely and adaptive guidance on individual students' responses. To identify which forms of automated guidance can be beneficial for inquiry learning, we compared reflective guidance to directive guidance for…
Work and Programmable Automation.
ERIC Educational Resources Information Center
DeVore, Paul W.
A new industrial era based on electronics and the microprocessor has arrived, an era that is being called intelligent automation. Intelligent automation, in the form of robots, replaces workers, and the new products, using microelectronic devices, require significantly less labor to produce than the goods they replace. The microprocessor thus…
Space station automation study. Volume 1: Executive summary. Autonomous systems and assembly
NASA Technical Reports Server (NTRS)
1984-01-01
The purpose of the space station automation study (SSAS) was to develop informed technical guidance for NASA personnel in the use of autonomy and autonomous systems to implement space station functions. The initial step taken by NASA in organizing the SSAS was to form and convene a panel of recognized expert technologists in automation, space sciences, and aerospace engineering to produce a space station automation plan.
Flexible End2End Workflow Automation of Hit-Discovery Research.
Holzmüller-Laue, Silke; Göde, Bernd; Thurow, Kerstin
2014-08-01
The article considers a new approach to more complex laboratory automation at the workflow layer. The authors propose the automation of end2end workflows. Combining all relevant subprocesses, whether automated or manually performed, independently of the organizational unit in which they occur, results in end2end processes that include all result dependencies. The end2end approach focuses not only on the classical experiments in synthesis or screening, but also on auxiliary processes such as the production and storage of chemicals, cell culturing, and maintenance, as well as preparatory activities and analyses of experiments. Furthermore, connecting control flow and data flow in the same process model reduces the effort of data transfer between the systems involved, including the necessary data transformations. This end2end laboratory automation can be realized effectively with modern methods of business process management (BPM). The approach is based on a new standardization of the process-modeling notation, Business Process Model and Notation 2.0. In drug discovery, several scientific disciplines act together, with manifold modern methods, technologies, and a wide range of automated instruments, in the discovery and design of target-based drugs. The article discusses the novel BPM-based automation concept with an implemented example of high-throughput screening of previously synthesized compound libraries. © 2014 Society for Laboratory Automation and Screening.
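The end2end idea of carrying control flow and data flow in one process model can be caricatured in a few lines; BPMN 2.0 is a graphical notation, so this dict-based model is only an illustration with invented step names:

```python
# Toy sketch: one model holds both the step ordering (control flow) and a
# shared context that each step's outputs feed into (data flow), so no
# separate data-transfer glue between systems is modeled.

WORKFLOW = [
    ("synthesize_compounds", {"batch": "lib-42"}),
    ("store_in_freezer", {}),
    ("screen_plates", {"assay": "kinase"}),
]

def run(workflow):
    context = {}                      # shared data flow
    for step, params in workflow:     # control flow: sequential here
        context[step] = {"params": params, "done": True}
    return context

result = run(WORKFLOW)
assert all(v["done"] for v in result.values())
```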
Design of Tailored Non-Crimp Fabrics Based on Stitching Geometry
NASA Astrophysics Data System (ADS)
Krieger, Helga; Gries, Thomas; Stapleton, Scott E.
2018-02-01
Automation of the preforming process brings up two opposing requirements for the engineering fabric used. On the one hand, the fabric requires sufficient drapeability, or low shear stiffness, for forming into double-curved geometries; on the other hand, it requires high form stability, or high shear stiffness, for automated handling. To meet both requirements, tailored non-crimp fabrics (TNCFs) are proposed. While the stitching has little structural influence on the final part, it virtually dictates the TNCF's local capability to shear and drape over a mold during preforming. The shear stiffness of TNCFs is designed by defining the local stitching geometry. NCFs with a chain stitch have a comparatively high shear stiffness, and NCFs with a stitch angle close to the symmetry stitch angle have a very low shear stiffness. A method to design the component-specific local stitching parameters of TNCFs is discussed. For validation of the method, NCFs with designed tailored stitching parameters were manufactured and compared to benchmark NCFs with uniform stitching parameters. In drape experiments over an elongated hemisphere, the designed TNCFs showed both high overall form stability and good drapeability in the locally required zones.
NASA Astrophysics Data System (ADS)
Yolken, H. T.; Mehrabian, R.
1985-12-01
These are the proceedings of the workshop "A National Forum on the Future of Automated Materials Processing in U.S. Industry - The Role of Sensors," the first of two workshops sponsored by the Industrial Research Institute and the White House Office of Science and Technology Policy, Committee on Materials Working Group on Automation of Materials Processing. The second workshop will address the other two key components required for automated materials processing: process models, and artificial intelligence coupled with computer integration of the system. The objective of these workshops is to identify and assess important issues affecting the competitive position of U.S. industry related to its ability to automate production processes for basic and advanced materials, and to develop approaches for improved capability through cooperative R&D and associated efforts.
NASA Technical Reports Server (NTRS)
1992-01-01
This Handbook sets forth information for the guidance of all users of the NASA Forms Management Program System. It is issued in accordance with the Federal Information Resources Management Regulation (FIRMR), Subpart 201-9.1. This Handbook sets forth an alpha-functional index of NASA-prefixed forms by title, identifying number, and unit of issue. The automated processing two-letter code (NF) has been substituted for the spelling out of the NASA form-prefix preceding the form number. To indicate a description in lieu of a distinct title, the entire reference under the Form Title/Description column has been enclosed in parentheses. A list of current forms, shown by number and page, is included for cross-reference and to preclude the ordering of those forms which have been deleted from the system. This Handbook will be updated, as appropriate. NHB 1420.2H, dated July 1986, is cancelled.
Nonanalytic Laboratory Automation: A Quarter Century of Progress.
Hawker, Charles D
2017-06-01
Clinical laboratory automation has blossomed since the 1989 AACC meeting, at which Dr. Masahide Sasaki first showed a western audience what his laboratory had implemented. Many diagnostics and other vendors are now offering a variety of automated options for laboratories of all sizes. Replacing manual processing and handling procedures with automation was embraced by the laboratory community because of the obvious benefits of labor savings and improvement in turnaround time and quality. Automation was also embraced by the diagnostics vendors, who saw automation as a means of incorporating the analyzers purchased by their customers into larger systems in which the benefits of automation were integrated with the analyzers. This report reviews the options that are available to laboratory customers. These options include so-called task-targeted automation: modules that range from single-function devices that automate single tasks (e.g., decapping or aliquoting) to multifunction workstations that incorporate several of the functions of a laboratory sample processing department. The options also include total laboratory automation systems that use conveyors to link sample processing functions to analyzers and often include postanalytical features such as refrigerated storage and sample retrieval. Most importantly, this report reviews a recommended process for evaluating the need for new automation, identifying the specific requirements of a laboratory, and developing solutions that can meet those requirements. The report also discusses some of the practical considerations facing a laboratory in a new implementation and reviews the concept of machine vision to replace human inspections. © 2017 American Association for Clinical Chemistry.
Fully automated processing of fMRI data in SPM: from MRI scanner to PACS.
Maldjian, Joseph A; Baer, Aaron H; Kraft, Robert A; Laurienti, Paul J; Burdette, Jonathan H
2009-01-01
Here we describe the Wake Forest University Pipeline, a fully automated method for the processing of fMRI data using SPM. The method includes fully automated data transfer and archiving from the point of acquisition, real-time batch script generation, distributed grid processing, interface to SPM in MATLAB, error recovery and data provenance, DICOM conversion and PACS insertion. It has been used for automated processing of fMRI experiments, as well as for the clinical implementation of fMRI and spin-tag perfusion imaging. The pipeline requires no manual intervention, and can be extended to any studies requiring offline processing.
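A pipeline of this kind (chained stages, error recovery, provenance) can be sketched generically; the stage names below are illustrative, and the actual Wake Forest pipeline drives SPM in MATLAB and handles DICOM/PACS transfer:

```python
# Generic sketch of a fully automated processing pipeline: stages run in
# order, each stage's output feeds the next, failures are caught and
# logged, and a provenance record is kept for every stage.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_pipeline(session, stages):
    """Run each stage in order; record provenance; stop on failure."""
    provenance = []
    for stage in stages:
        try:
            session = stage(session)
            provenance.append((stage.__name__, "ok"))
        except Exception as exc:
            provenance.append((stage.__name__, f"failed: {exc}"))
            log.error("stage %s failed: %s", stage.__name__, exc)
            break
    return session, provenance

# Hypothetical stages standing in for DICOM conversion, SPM batch
# processing, and PACS insertion.
def convert_dicom(s):  s["nifti"] = True; return s
def preprocess(s):     s["preprocessed"] = True; return s
def archive(s):        s["archived"] = True; return s

result, prov = run_pipeline({"subject": "sub-01"},
                            [convert_dicom, preprocess, archive])
assert all(status == "ok" for _, status in prov)
```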
Automated transient detection in the STEREO Heliospheric Imagers.
NASA Astrophysics Data System (ADS)
Barnard, Luke; Scott, Chris; Owens, Mat; Lockwood, Mike; Tucker-Hood, Kim; Davies, Jackie
2014-05-01
Since the launch of the twin STEREO satellites, the heliospheric imagers (HI) have been used, with good results, in tracking transients of solar origin, such as Coronal Mass Ejections (CMEs), out far into the heliosphere. A frequently used approach is to build a "J-map", in which multiple elongation profiles along a constant position angle are stacked in time, building an image in which radially propagating transients form curved tracks. From this, the time-elongation profile of a solar transient can be manually identified. This is a time-consuming and laborious process, and the results are subjective, depending on the skill and expertise of the investigator. Therefore, it is desirable to develop an automated algorithm for the detection and tracking of the transient features observed in HI data. This is to some extent previously covered ground, as similar problems have been encountered in the analysis of coronagraph data and have led to the development of products such as CACTus. We present the results of our investigation into the automated detection of solar transients observed in J-maps formed from HI data. We use edge and line detection methods to identify transients in the J-maps, and then use kinematic models of solar transient propagation (such as the fixed-phi and harmonic mean geometric models) to estimate the transients' properties, such as speed and propagation direction, from the time-elongation profile. The effectiveness of this process is assessed by comparison of our results with a set of manually identified CMEs, extracted and analysed by the Solar Storm Watch project. Solar Storm Watch is a citizen science project in which solar transients are identified in J-maps formed from HI data and tracked multiple times by different users.
This allows the calculation of a consensus time-elongation profile for each event, and therefore does not suffer from the potential subjectivity of an individual researcher tracking an event. Furthermore, we present preliminary results regarding the estimation of the ambient solar wind speed from the automated analysis of the HI J-maps, by the tracking of numerous small scale features entrained into the ambient solar wind, which can only be tracked out to small elongations.
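The fixed-phi geometric model mentioned above has a standard closed form: a point transient travelling radially at a fixed angle phi from the observer-Sun line is seen at elongation epsilon when its heliocentric distance is r_obs * sin(eps) / sin(eps + phi). A sketch, with illustrative values:

```python
import math

# Fixed-phi geometric model: converts an observed elongation angle into a
# heliocentric distance, assuming a point transient moving radially at a
# fixed angle phi from the observer-Sun line.

def fixed_phi_distance(elongation_deg, phi_deg, r_obs_au=1.0):
    """Heliocentric distance (AU) of a transient seen at the given
    elongation by an observer r_obs_au from the Sun."""
    eps = math.radians(elongation_deg)
    phi = math.radians(phi_deg)
    return r_obs_au * math.sin(eps) / math.sin(eps + phi)

# A transient propagating at phi = 60 deg, observed at 30 deg elongation:
d = fixed_phi_distance(30.0, 60.0)
assert abs(d - 0.5) < 1e-9  # sin(30)/sin(90) = 0.5 AU
```

Applying this conversion along a time-elongation track yields a time-distance profile, from which a speed estimate follows by fitting.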
Automation, Manpower, and Education.
ERIC Educational Resources Information Center
Rosenberg, Jerry M.
Each group in our population will be affected by automation and other forms of technological advancement. This book seeks to identify the needs of these various groups, and to present ways in which educators can best meet them. The author corrects certain prevalent misconceptions concerning manpower utilization and automation. Based on the…
41 CFR 102-194.25 - What is an automated Standard or Optional format?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Property Management Regulations System (Continued) FEDERAL MANAGEMENT REGULATION ADMINISTRATIVE PROGRAMS 194-STANDARD AND OPTIONAL FORMS MANAGEMENT PROGRAM § 102-194.25 What is an automated Standard or... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false What is an automated...
More steps towards process automation for optical fabrication
NASA Astrophysics Data System (ADS)
Walker, David; Yu, Guoyu; Beaucamp, Anthony; Bibby, Matt; Li, Hongyu; McCluskey, Lee; Petrovic, Sanja; Reynolds, Christina
2017-06-01
In the context of Industrie 4.0, we have previously described the roles of robots in optical processing, and their complementarity with classical CNC machines, providing both processing and automation functions. After having demonstrated robotic moving of parts between a CNC polisher and metrology station, and auto-fringe-acquisition, we have moved on to automate the wash-down operation. This is part of a wider strategy we describe in this paper, leading towards automating the decision-making operations required before and throughout an optical manufacturing cycle.
NASA Astrophysics Data System (ADS)
Miller, David P.; Bonaccorsi, Rosalba; Davis, Kiel
2008-10-01
Mars Astrobiology Research and Technology Experiment (MARTE) investigators used an automated drill and sample processing hardware to detect and categorize life-forms found in subsurface rock at Río Tinto, Spain. For the science to be successful, it was necessary for the biomass from other sources -- whether from previously processed samples (cross contamination) or the terrestrial environment (forward contamination) -- to be insignificant. The hardware and practices used in MARTE were designed around this problem. Here, we describe some of the design issues that were faced and classify them into problems that are unique to terrestrial tests versus problems that would also exist for a system that was flown to Mars. Assessment of the biomass at various stages in the sample handling process revealed mixed results; the instrument design seemed to minimize cross contamination, but contamination from the surrounding environment sometimes made its way onto the surface of samples. Techniques used during the MARTE Río Tinto project, such as facing the sample, appear to remove this environmental contamination without introducing significant cross contamination from previous samples.
Design and implementation of a Windows NT network to support CNC activities
NASA Technical Reports Server (NTRS)
Shearrow, C. A.
1996-01-01
The Manufacturing, Materials, & Processes Technology Division is undergoing dramatic changes to bring its manufacturing practices current with today's technological revolution. The Division is developing Computer Automated Design and Computer Automated Manufacturing (CAD/CAM) abilities. The development of resource tracking is underway in the form of an accounting software package called Infisy. These two efforts will bring the division into the 1980's in relation to manufacturing processes. Computer Integrated Manufacturing (CIM) is the final phase of change to be implemented. This document is a qualitative study of a CIM application capable of finishing the changes necessary to bring the manufacturing practices into the 1990's. The documentation provided in this qualitative research effort includes discovery of the current status of manufacturing in the Manufacturing, Materials, & Processes Technology Division, including the software, hardware, network, and mode of operation. The proposed direction of research included a network design, computers to be used, software to be used, machine-to-computer connections, a timeline estimate for implementation, and a cost estimate. Recommendations for the division's improvement include actions to be taken, software to utilize, and computer configurations.
Automated Subsystem Control for Life Support System (ASCLSS)
NASA Technical Reports Server (NTRS)
Block, Roger F.
1987-01-01
The Automated Subsystem Control for Life Support Systems (ASCLSS) program has successfully developed and demonstrated a generic approach to the automation and control of space station subsystems. The automation system features a hierarchical and distributed real-time control architecture which places maximum controls authority at the lowest or process control level, which enhances system autonomy. The ASCLSS demonstration system pioneered many automation and control concepts currently being considered in the space station data management system (DMS). Heavy emphasis is placed on controls hardware and software commonality implemented in accepted standards. The approach successfully demonstrates the application of real-time process control and accountability with the subsystem or process developer. The ASCLSS system completely automates a space station subsystem (the air revitalization group of the ASCLSS), which moves the crew/operator into a role of supervisory control authority. The ASCLSS program developed over 50 lessons learned which will aid future space station developers in the area of automation and controls.
Improvements to the Processing and Characterization of Needled Composite Laminates
2014-01-01
the semi-automated processing equipment are shown and discussed. The modifications allow better spatial control at the penetration sites and the ability to...semi-automated processing equipment, commercial off-the-shelf (COTS) needles and COTS aramid mat designed for other applications. Needled material
Knowledge Representation Artifacts for Use in Sensemaking Support Systems
2015-03-12
and manual processing must be replaced by automated processing wherever it makes sense and is possible. Clearly, given the data and cognitive...knowledge-centric view to situation analysis and decision-making as previously discussed, has led to the development of several automated processing components...for use in sensemaking support systems [6-11]. In turn, automated processing has required the development of appropriate knowledge
Command and Control Common Semantic Core Required to Enable Net-centric Operations
2008-05-20
automated processing capability. A former US Marine Corps component C4 director during Operation Iraqi Freedom identified the problems of 1) uncertainty...interoperability improvements to warfighter community processes, thanks to ubiquitous automated processing, are likely high and somewhat easier to quantify. A...synchronized with the actions of other partners / warfare communities. This requires high-quality information, rapid sharing and automated processing – which
ERIC Educational Resources Information Center
Naclerio, Nick
1979-01-01
Clerical personnel may be able to climb career ladders as a result of office automation and expanded job opportunities in the word processing area. Suggests opportunities in an automated office system and lists books and periodicals on word processing for counselors and teachers. (MF)
An Intelligent Automation Platform for Rapid Bioprocess Design
Wu, Tianyi
2014-01-01
Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user’s inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. PMID:24088579
Validation of an automated colony counting system for group A Streptococcus.
Frost, H R; Tsoi, S K; Baker, C A; Laho, D; Sanderson-Smith, M L; Steer, A C; Smeesters, P R
2016-02-08
The practice of counting bacterial colony forming units on agar plates has long been used as a method to estimate the concentration of live bacteria in culture. However, due to the laborious and potentially error prone nature of this measurement technique, an alternative method is desirable. Recent technologic advancements have facilitated the development of automated colony counting systems, which reduce errors introduced during the manual counting process and recording of information. An additional benefit is the significant reduction in time taken to analyse colony counting data. Whilst automated counting procedures have been validated for a number of microorganisms, the process has not been successful for all bacteria due to the requirement for a relatively high contrast between bacterial colonies and growth medium. The purpose of this study was to validate an automated counting system for use with group A Streptococcus (GAS). Twenty-one different GAS strains, representative of major emm-types, were selected for assessment. In order to introduce the required contrast for automated counting, 2,3,5-triphenyl-2H-tetrazolium chloride (TTC) dye was added to Todd-Hewitt broth with yeast extract (THY) agar. Growth on THY agar with TTC was compared with growth on blood agar and THY agar to ensure the dye was not detrimental to bacterial growth. Automated colony counts using a ProtoCOL 3 instrument were compared with manual counting to confirm accuracy over the stages of the growth cycle (latent, mid-log and stationary phases) and in a number of different assays. The average percentage differences between plating and counting methods were analysed using the Bland-Altman method. A percentage difference of ±10 % was determined as the cut-off for a critical difference between plating and counting methods. All strains measured had an average difference of less than 10 % when plated on THY agar with TTC. 
This consistency was also observed over all phases of the growth cycle and when plated in blood following bactericidal assays. Agreement between these methods suggests that the use of an automated colony counting technique for GAS will significantly reduce time spent counting bacteria, enabling a more efficient and accurate measurement of bacterial concentration in culture.
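The ±10 % acceptance criterion described above can be sketched in a few lines. The counts below are hypothetical, and the per-pair percentage difference relative to the pair mean is one common Bland-Altman-style convention; the paper's exact computation may differ.

```python
def percent_differences(manual, automated):
    """Per-sample percentage difference of automated vs. manual counts,
    relative to the mean of each pair (Bland-Altman style)."""
    diffs = []
    for m, a in zip(manual, automated):
        pair_mean = (m + a) / 2.0
        diffs.append(100.0 * (a - m) / pair_mean)
    return diffs

def within_cutoff(manual, automated, cutoff=10.0):
    """True when the average percentage difference lies inside +/-cutoff,
    the acceptance criterion described in the study."""
    diffs = percent_differences(manual, automated)
    avg = sum(diffs) / len(diffs)
    return abs(avg) <= cutoff

# Hypothetical colony counts for one strain over several plates.
manual = [112, 98, 205, 150]
auto = [108, 101, 198, 155]
print(within_cutoff(manual, auto))  # True: the two methods agree closely
```

A strain whose automated counts drift systematically (say, 50 % high) would fail the same check.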
An Automated Method for High-Definition Transcranial Direct Current Stimulation Modeling*
Huang, Yu; Su, Yuzhuo; Rorden, Christopher; Dmochowski, Jacek; Datta, Abhishek; Parra, Lucas C.
2014-01-01
Targeted transcranial stimulation with electric currents requires accurate models of the current flow from scalp electrodes to the human brain. Idiosyncratic anatomy of individual brains and heads leads to significant variability in such current flows across subjects, thus, necessitating accurate individualized head models. Here we report on an automated processing chain that computes current distributions in the head starting from a structural magnetic resonance image (MRI). The main purpose of automating this process is to reduce the substantial effort currently required for manual segmentation, electrode placement, and solving of finite element models. In doing so, several weeks of manual labor were reduced to no more than 4 hours of computation time and minimal user interaction, while current-flow results for the automated method deviated by less than 27.9% from the manual method. Key facilitating factors are the addition of three tissue types (skull, scalp and air) to a state-of-the-art automated segmentation process, morphological processing to correct small but important segmentation errors, and automated placement of small electrodes based on easily reproducible standard electrode configurations. We anticipate that such an automated processing will become an indispensable tool to individualize transcranial direct current stimulation (tDCS) therapy. PMID:23367144
E-Services quality assessment framework for collaborative networks
NASA Astrophysics Data System (ADS)
Stegaru, Georgiana; Danila, Cristian; Sacala, Ioan Stefan; Moisescu, Mihnea; Mihai Stanescu, Aurelian
2015-08-01
In a globalised networked economy, collaborative networks (CNs) are formed to take advantage of new business opportunities. Collaboration involves shared resources and capabilities, such as e-Services that can be dynamically composed to automate CN participants' business processes. Quality is essential for the success of business process automation. Current approaches mostly focus on quality of service (QoS)-based service selection and ranking algorithms, overlooking the process of service composition which requires interoperable, adaptable and secure e-Services to ensure seamless collaboration, data confidentiality and integrity. Lack of assessment of these quality attributes can result in e-Service composition failure. The quality of e-Service composition relies on the quality of each e-Service and on the quality of the composition process. Therefore, there is a need for a framework that addresses quality from both views: product and process. We propose a quality of e-Service composition (QoESC) framework for quality assessment of e-Service composition for CNs which comprises a quality model for e-Service evaluation and guidelines for quality of the e-Service composition process. We implemented a prototype considering a simplified telemedicine use case which involves a CN in the e-Healthcare domain. To validate the proposed quality-driven framework, we analysed service composition reliability with and without using the proposed framework.
Han, Zhenyu; Sun, Shouzheng; Fu, Hongya; Fu, Yunzhong
2017-01-01
Automated fiber placement (AFP) process includes a variety of energy forms and multi-scale effects. This contribution proposes a novel multi-scale low-entropy method aiming at optimizing processing parameters in an AFP process, where multi-scale effect, energy consumption, energy utilization efficiency and mechanical properties of micro-system could be taken into account synthetically. Taking a carbon fiber/epoxy prepreg as an example, mechanical properties of macro–meso–scale are obtained by Finite Element Method (FEM). A multi-scale energy transfer model is then established to input the macroscopic results into the microscopic system as its boundary condition, which can communicate with different scales. Furthermore, microscopic characteristics, mainly micro-scale adsorption energy, diffusion coefficient entropy–enthalpy values, are calculated under different processing parameters based on molecular dynamics method. Low-entropy region is then obtained in terms of the interrelation among entropy–enthalpy values, microscopic mechanical properties (interface adsorbability and matrix fluidity) and processing parameters to guarantee better fluidity, stronger adsorption, lower energy consumption and higher energy quality collaboratively. Finally, nine groups of experiments are carried out to verify the validity of the simulation results. The results show that the low-entropy optimization method can reduce void content effectively, and further improve the mechanical properties of laminates. PMID:28869520
Development Status: Automation Advanced Development Space Station Freedom Electric Power System
NASA Technical Reports Server (NTRS)
Dolce, James L.; Kish, James A.; Mellor, Pamela A.
1990-01-01
Electric power system automation for Space Station Freedom is intended to operate in a loop. Data from the power system is used for diagnosis and security analysis to generate Operations Management System (OMS) requests, which are sent to an arbiter, which sends a plan to a command generator connected to the electric power system. This viewgraph presentation profiles automation software for diagnosis, scheduling, and constraint interfaces, and simulation to support automation development. The automation development process is diagrammed, and the process of creating Ada and ART versions of the automation software is described.
Using machine learning techniques to automate sky survey catalog generation
NASA Technical Reports Server (NTRS)
Fayyad, Usama M.; Roden, J. C.; Doyle, R. J.; Weir, Nicholas; Djorgovski, S. G.
1993-01-01
We describe the application of machine classification techniques to the development of an automated tool for the reduction of a large scientific data set. The 2nd Palomar Observatory Sky Survey provides comprehensive photographic coverage of the northern celestial hemisphere. The photographic plates are being digitized into images containing on the order of 10^7 galaxies and 10^8 stars. Since the size of this data set precludes manual analysis and classification of objects, our approach is to develop a software system which integrates independently developed techniques for image processing and data classification. Image processing routines are applied to identify and measure features of sky objects. Selected features are used to determine the classification of each object. GID3* and O-BTree, two inductive learning techniques, are used to automatically learn classification decision trees from examples. We describe the techniques used, the details of our specific application, and the initial encouraging results which indicate that our approach is well-suited to the problem. The benefits of the approach are increased data reduction throughput, consistency of classification, and the automated derivation of classification rules that will form an objective, examinable basis for classifying sky objects. Furthermore, astronomers will be freed from the tedium of an intensely visual task to pursue more challenging analysis and interpretation problems given automatically cataloged data.
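GID3* and O-BTree are not specified in detail in this abstract, but the core step shared by such inductive tree learners, choosing the attribute whose split most reduces label entropy, can be sketched in pure Python. The feature names and toy examples below are invented for illustration only.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(examples, attr):
    """Reduction in label entropy from splitting on one attribute.
    Each example is a (features_dict, class_label) pair."""
    labels = [label for _, label in examples]
    base = entropy(labels)
    remainder = 0.0
    for v in {f[attr] for f, _ in examples}:
        subset = [label for f, label in examples if f[attr] == v]
        remainder += len(subset) / len(examples) * entropy(subset)
    return base - remainder

# Hypothetical sky-object features reduced to coarse bins.
data = [
    ({"brightness": "high", "extended": "no"},  "star"),
    ({"brightness": "low",  "extended": "yes"}, "galaxy"),
    ({"brightness": "high", "extended": "yes"}, "galaxy"),
    ({"brightness": "low",  "extended": "no"},  "star"),
]
best = max(["brightness", "extended"], key=lambda a: information_gain(data, a))
print(best)  # "extended": it perfectly separates stars from galaxies here
```

A full tree learner applies this split selection recursively to each subset until the leaves are pure.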
A Neural-Network-Based Semi-Automated Geospatial Classification Tool
NASA Astrophysics Data System (ADS)
Hale, R. G.; Herzfeld, U. C.
2014-12-01
North America's largest glacier system, the Bering Bagley Glacier System (BBGS) in Alaska, surged in 2011-2013, as shown by rapid mass transfer, elevation change, and heavy crevassing. Little is known about the physics controlling surge glaciers' semi-cyclic patterns; therefore, it is crucial to collect and analyze as much data as possible so that predictive models can be made. In addition, physical signs frozen in ice in the form of crevasses may help serve as a warning for future surges. The BBGS surge provided an opportunity to develop an automated classification tool for crevasse classification based on imagery collected from small aircraft. The classification allows one to link image classification to geophysical processes associated with ice deformation. The tool uses an approach that employs geostatistical functions and a feed-forward perceptron with error back-propagation. The connectionist-geostatistical approach uses directional experimental (discrete) variograms to parameterize images into a form that the Neural Network (NN) can recognize. In an application analyzing airborne videographic data from the surge of the BBGS, an NN was able to distinguish 18 different crevasse classes with 95 percent or higher accuracy over 3,000 images. Recognizing that each surge wave results in different crevasse types and that environmental conditions affect the appearance in imagery, we designed the tool's semi-automated pre-training algorithm to be adaptable. The tool can be optimized to specific settings and variables of image analysis (airborne and satellite imagery, different camera types, observation altitude, number and types of classes, and resolution).
The generalization of the classification tool brings three important advantages: (1) multiple types of problems in geophysics can be studied, (2) the training process is sufficiently formalized to allow non-experts in neural nets to perform the training process, and (3) the time required to manually pre-sort imagery into classes is greatly reduced.
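An experimental variogram of the kind described reduces an image transect to a handful of lag statistics that a neural network can take as input features. A minimal 1-D sketch, with a made-up brightness profile, might look like this:

```python
def experimental_variogram(values, max_lag):
    """Experimental variogram of a 1-D sequence of pixel values:
    gamma(h) = half the mean squared increment at lag h."""
    gammas = {}
    for h in range(1, max_lag + 1):
        pairs = [(values[i], values[i + h]) for i in range(len(values) - h)]
        gammas[h] = sum((b - a) ** 2 for a, b in pairs) / (2 * len(pairs))
    return gammas

# A hypothetical brightness profile across a crevassed surface:
profile = [10, 12, 30, 11, 13, 31, 10, 12, 29]
print(experimental_variogram(profile, 3))
```

The variogram values at small lags (and, in 2-D, their variation with direction) then serve as the parameterized input the abstract describes.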
Effectiveness of a web-based automated cell distribution system.
Niland, Joyce C; Stiller, Tracey; Cravens, James; Sowinski, Janice; Kaddis, John; Qian, Dajun
2010-01-01
In recent years, industries have turned to the field of operations research to help improve the efficiency of production and distribution processes. Largely absent is the application of this methodology to biological materials, such as the complex and costly procedure of human pancreas procurement and islet isolation. Pancreatic islets are used for basic science research and in a promising form of cell replacement therapy for a subset of patients afflicted with severe type 1 diabetes mellitus. Having an accurate and reliable system for cell distribution is therefore crucial. The Islet Cell Resource Center Consortium was formed in 2001 as the first and largest cooperative group of islet production and distribution facilities in the world. We previously reported on the development of a Matching Algorithm for Islet Distribution (MAID), an automated web-based tool used to optimize the distribution of human pancreatic islets by matching investigator requests to islet characteristics. This article presents an assessment of that algorithm and compares it to the manual distribution process used prior to MAID. A comparison was done using an investigator's ratio of the number of islets received divided by the number requested pre- and post-MAID. Although the supply of islets increased between the pre- versus post-MAID period, the median received-to-requested ratio remained around 60% due to an increase in demand post-MAID. A significantly smaller variation in the received-to-requested ratio was achieved in the post- versus pre-MAID period. In particular, the undesirable outcome of providing users with more islets than requested, ranging up to four times their request, was greatly reduced through the algorithm. In conclusion, this analysis demonstrates, for the first time, the effectiveness of using an automated web-based cell distribution system to facilitate efficient and consistent delivery of human pancreatic islets by enhancing the islet matching process.
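The MAID algorithm itself is not reproduced in this abstract. As a hedged illustration of the matching problem it addresses, a simple proportional allocation that never exceeds any investigator's request (the over-supply failure mode the article says was reduced) could look like this; the function name and request figures are invented:

```python
def allocate_islets(available, requests):
    """Proportionally allocate a limited islet supply across requests,
    never exceeding any investigator's requested amount."""
    total = sum(requests.values())
    if total <= available:
        return dict(requests)  # supply covers demand: fill every request
    scale = available / total
    return {who: int(amount * scale) for who, amount in requests.items()}

requests = {"lab_a": 4000, "lab_b": 2000, "lab_c": 2000}
print(allocate_islets(4000, requests))  # each lab receives half its request
```

Capping each share at the requested amount is what rules out the pre-MAID outcome of labs receiving up to four times what they asked for.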
Semi-automated ontology generation and evolution
NASA Astrophysics Data System (ADS)
Stirtzinger, Anthony P.; Anken, Craig S.
2009-05-01
Extending the notion of data models or object models, ontology can provide rich semantic definition not only to the meta-data but also to the instance data of domain knowledge, making these semantic definitions available in machine readable form. However, the generation of an effective ontology is a difficult task involving considerable labor and skill. This paper discusses an Ontology Generation and Evolution Processor (OGEP) aimed at automating this process, only requesting user input when unresolvable ambiguities occur. OGEP directly attacks the main barrier which prevents automated (or self learning) ontology generation: the ability to understand the meaning of artifacts and the relationships the artifacts have to the domain space. OGEP leverages existing lexical-to-ontological mappings in the form of WordNet and the Suggested Upper Merged Ontology (SUMO), integrated with a semantic pattern-based structure referred to as the Semantic Grounding Mechanism (SGM) and implemented as a Corpus Reasoner. The OGEP processing is initiated by a Corpus Parser performing a lexical analysis of the corpus, reading in a document (or corpus) and preparing it for processing by annotating words and phrases. After the Corpus Parser is done, the Corpus Reasoner uses the parts of speech output to determine the semantic meaning of a word or phrase. The Corpus Reasoner is the crux of the OGEP system, analyzing, extrapolating, and evolving data from free text into cohesive semantic relationships. The Semantic Grounding Mechanism provides a basis for identifying and mapping semantic relationships. By blending together the WordNet lexicon and the SUMO ontological layout, the SGM is given breadth and depth in its ability to extrapolate semantic relationships between domain entities. The combination of all these components results in an innovative approach to user assisted semantic-based ontology generation. 
This paper will describe the OGEP technology in the context of the architectural components referenced above and identify a potential technology transition path to Scott AFB's Tanker Airlift Control Center (TACC) which serves as the Air Operations Center (AOC) for the Air Mobility Command (AMC).
Containerless automated processing of intermetallic compounds and composites
NASA Technical Reports Server (NTRS)
Johnson, D. R.; Joslin, S. M.; Reviere, R. D.; Oliver, B. F.; Noebe, R. D.
1993-01-01
An automated containerless processing system has been developed to directionally solidify high temperature materials, intermetallic compounds, and intermetallic/metallic composites. The system incorporates a wide range of ultra-high purity chemical processing conditions. The utilization of image processing for automated control negates the need for temperature measurements for process control. The list of recent systems that have been processed includes Cr, Mo, Mn, Nb, Ni, Ti, V, and Zr containing aluminides. Possible uses of the system, process control approaches, and properties and structures of recently processed intermetallics are reviewed.
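The image-based control idea above (regulating the process from what the camera sees rather than from temperature measurements) can be illustrated with a toy proportional-control step. The zone-width signal, target, and gain below are invented for the sketch, not taken from the system described:

```python
def control_step(zone_width_px, target_px, power_w, gain=0.05):
    """One proportional-control update: adjust heater power from the
    measured molten-zone width in the image, not from a thermocouple."""
    error = target_px - zone_width_px
    return power_w + gain * error

power = 500.0
for measured in [180, 190, 198, 200]:  # zone widens toward the 200 px target
    power = control_step(measured, target_px=200, power_w=power)
print(round(power, 2))  # power settles at 501.6 W once the target is reached
```

A real directional-solidification controller would add integral and derivative terms and image segmentation to extract the zone width, but the feedback structure is the same.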
Investigation of the Formability of TRIP780 Steel Sheets
NASA Astrophysics Data System (ADS)
Song, Yang
The formability of a metal sheet is dependent on its work hardening behaviour and its forming limits; and both aspects must be carefully determined in order to accurately simulate a particular forming process. This research aims to characterize the formability of a TRIP780 sheet steel using advanced experimental testing and analysis techniques. A series of flat rolling and tensile tests, as well as shear tests were conducted to determine the large deformation work hardening behaviour of this TRIP780 steel. Nakazima tests were carried out up to fracture to determine the forming limits of this sheet material. A highly-automated method for generating a robust FLC for sheet materials from DIC strain measurements was created with the help of finite element simulations, and evaluated against the conventional method. A correction algorithm that aims to compensate for the process dependent effects in the Nakazima test was implemented and tested with some success.
Information Fusion for Feature Extraction and the Development of Geospatial Information
2004-07-01
of automated processing. 2. Requirements for Geospatial Information. Accurate, timely geospatial information is critical for many military...this evaluation illustrates some of the difficulties in comparing manual and automated processing results (figure 5). The automated delineation of
Code of Federal Regulations, 2010 CFR
2010-10-01
.... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...
Code of Federal Regulations, 2013 CFR
2013-10-01
.... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...
Code of Federal Regulations, 2014 CFR
2014-10-01
.... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...
Code of Federal Regulations, 2012 CFR
2012-10-01
.... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...
Code of Federal Regulations, 2011 CFR
2011-10-01
.... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moeglein, W. A.; Griswold, R.; Mehdi, B. L.
In-situ (scanning) transmission electron microscopy (S/TEM) is being developed for numerous applications in the study of nucleation and growth under electrochemical driving forces. For this type of experiment, one of the key parameters is to identify when nucleation initiates. Typically the process of identifying the moment that crystals begin to form is a manual process requiring the user to perform an observation and respond accordingly (adjust focus, magnification, translate the stage etc.). However, as the speed of the cameras being used to perform these observations increases, the ability of a user to “catch” the important initial stage of nucleation decreases (there is more information that is available in the first few milliseconds of the process). Here we show that video shot boundary detection (SBD) can automatically detect frames where a change in the image occurs. We show that this method can be applied to quickly and accurately identify points of change during crystal growth. This technique allows for automated segmentation of a digital stream for further analysis and the assignment of arbitrary time stamps for the initiation of processes that are independent of the user’s ability to observe and react.
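A minimal shot-boundary detector of the kind described compares successive frame histograms and flags large jumps. The sketch below uses flat lists of 8-bit grey values as "frames" and an L1 histogram distance; the bin count and threshold are arbitrary choices for illustration, not values from the work above.

```python
def shot_boundaries(frames, bins=8, threshold=0.5):
    """Flag frame indices where the grey-level histogram changes sharply
    versus the previous frame: a minimal shot-boundary detector."""
    def hist(frame):
        h = [0] * bins
        for px in frame:  # px assumed in 0..255
            h[min(px * bins // 256, bins - 1)] += 1
        n = len(frame)
        return [c / n for c in h]

    boundaries = []
    prev = hist(frames[0])
    for i, frame in enumerate(frames[1:], start=1):
        cur = hist(frame)
        # L1 distance between normalized histograms lies in [0, 2]
        if sum(abs(a - b) for a, b in zip(prev, cur)) > threshold:
            boundaries.append(i)
        prev = cur
    return boundaries

# Hypothetical 4-pixel frames: uniform dark, then crystals brighten frame 2.
frames = [[10, 12, 11, 13], [11, 12, 10, 14], [10, 200, 210, 220]]
print(shot_boundaries(frames))  # [2]: the nucleation-like change is flagged
```

Each flagged index can then be stamped with its acquisition time, giving the automated segmentation of the stream that the abstract describes.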
Automated workflows for modelling chemical fate, kinetics and toxicity.
Sala Benito, J V; Paini, Alicia; Richarz, Andrea-Nicole; Meinl, Thorsten; Berthold, Michael R; Cronin, Mark T D; Worth, Andrew P
2017-12-01
Automation is universal in today's society, from operating equipment such as machinery, in factory processes, to self-parking automobile systems. While these examples show the efficiency and effectiveness of automated mechanical processes, automated procedures that support the chemical risk assessment process are still in their infancy. Future human safety assessments will rely increasingly on the use of automated models, such as physiologically based kinetic (PBK) and dynamic models and the virtual cell based assay (VCBA). These biologically-based models will be coupled with chemistry-based prediction models that also automate the generation of key input parameters such as physicochemical properties. The development of automated software tools is an important step in harmonising and expediting the chemical safety assessment process. In this study, we illustrate how the KNIME Analytics Platform can be used to provide a user-friendly graphical interface for these biokinetic models, such as PBK models and VCBA, which simulate the fate of chemicals in vivo within the body and in vitro test systems respectively. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
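As a toy stand-in for the PBK models mentioned (not the actual KNIME workflows), a one-compartment model with first-order elimination, integrated by forward Euler, looks like this; the dose, volume, and rate constant are arbitrary illustration values:

```python
def one_compartment(dose_mg, volume_l, k_elim_per_h, hours, dt=0.01):
    """Forward-Euler simulation of plasma concentration for a bolus dose
    with first-order elimination: dC/dt = -k * C."""
    conc = dose_mg / volume_l  # instantaneous distribution assumption
    t = 0.0
    trace = [(t, conc)]
    while t < hours:
        conc += -k_elim_per_h * conc * dt
        t += dt
        trace.append((round(t, 4), conc))
    return trace

trace = one_compartment(dose_mg=100, volume_l=40, k_elim_per_h=0.2, hours=10)
print(trace[0][1], trace[-1][1])  # starts at 2.5 mg/L, decays over 10 h
```

Real PBK models chain many such compartments (gut, liver, plasma, tissue) with flow-limited transfer terms, but each compartment's update has this same form.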
Cockpit Adaptive Automation and Pilot Performance
NASA Technical Reports Server (NTRS)
Parasuraman, Raja
2001-01-01
The introduction of high-level automated systems in the aircraft cockpit has provided several benefits, e.g., new capabilities, enhanced operational efficiency, and reduced crew workload. At the same time, conventional 'static' automation has sometimes degraded human operator monitoring performance, increased workload, and reduced situation awareness. Adaptive automation represents an alternative to static automation. In this approach, task allocation between human operators and computer systems is flexible and context-dependent rather than static. Adaptive automation, or adaptive task allocation, is thought to provide for regulation of operator workload and performance, while preserving the benefits of static automation. In previous research we have reported beneficial effects of adaptive automation on the performance of both pilots and non-pilots of flight-related tasks. For adaptive systems to be viable, however, such benefits need to be examined jointly in the context of a single set of tasks. The studies carried out under this project evaluated a systematic method for combining different forms of adaptive automation. A model for effective combination of different forms of adaptive automation, based on matching adaptation to operator workload, was proposed and tested. The model was evaluated in studies using IFR-rated pilots flying a general-aviation simulator. Performance, subjective, and physiological (heart rate variability, eye scan-paths) measures of workload were recorded. The studies compared workload-based adaptation to non-adaptive control conditions and found evidence for systematic benefits of adaptive automation. The research provides an empirical basis for evaluating the effectiveness of adaptive automation in the cockpit. The results contribute to the development of design principles and guidelines for the implementation of adaptive automation in the cockpit, particularly in general aviation, and in other human-machine systems. 
Project goals were met or exceeded. The results of the research extended knowledge of automation-related performance decrements in pilots and demonstrated the positive effects of adaptive task allocation. In addition, several practical implications for cockpit automation design were drawn from the research conducted. A total of 12 articles deriving from the project were published.
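The workload-matched adaptation this abstract describes can be illustrated with a small sketch. This is not the project's actual algorithm; the workload index, the thresholds, and the hysteresis scheme below are all illustrative assumptions:

```python
class AdaptiveAllocator:
    """Toy workload-based adaptive task allocation with hysteresis.

    The workload index (0..1) might be derived from measures such as
    heart rate variability; the threshold values are purely illustrative.
    """

    def __init__(self, engage_at=0.75, release_at=0.45):
        self.engage_at = engage_at    # above this, hand the task to automation
        self.release_at = release_at  # below this, hand the task back
        self.automated = False

    def update(self, workload_index):
        """Return True if the task should be automated this cycle."""
        if not self.automated and workload_index > self.engage_at:
            self.automated = True     # offload the task to automation
        elif self.automated and workload_index < self.release_at:
            self.automated = False    # return the task to the pilot
        return self.automated
```

The gap between the two thresholds (hysteresis) keeps the allocation from oscillating when the workload estimate hovers near a single cut-off.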
DOE Office of Scientific and Technical Information (OSTI.GOV)
Purdie, Thomas G., E-mail: Tom.Purdie@rmp.uhn.on.ca; Department of Radiation Oncology, University of Toronto, Toronto, Ontario; Techna Institute, University Health Network, Toronto, Ontario
Purpose: To demonstrate the large-scale clinical implementation and performance of an automated treatment planning methodology for tangential breast intensity modulated radiation therapy (IMRT). Methods and Materials: Automated planning was used to prospectively plan tangential breast IMRT treatment for 1661 patients between June 2009 and November 2012. The automated planning method emulates the manual steps performed by the user during treatment planning, including anatomical segmentation, beam placement, optimization, dose calculation, and plan documentation. The user specifies clinical requirements of the plan to be generated through a user interface embedded in the planning system. The automated method uses heuristic algorithms to define and simplify the technical aspects of the treatment planning process. Results: Automated planning was used in 1661 of 1708 patients receiving tangential breast IMRT during the time interval studied. Therefore, automated planning was applicable in greater than 97% of cases. The time for treatment planning using the automated process is routinely 5 to 6 minutes on standard commercially available planning hardware. We have shown a consistent reduction in plan rejections from plan reviews through the standard quality control process or weekly quality review multidisciplinary breast rounds as we have automated the planning process for tangential breast IMRT. Clinical plan acceptance increased from 97.3% using our previous semiautomated inverse method to 98.9% using the fully automated method. Conclusions: Automation has become the routine standard method for treatment planning of tangential breast IMRT at our institution and is clinically feasible on a large scale. The method has wide clinical applicability and can add tremendous efficiency, standardization, and quality to the current treatment planning process.
The use of automated methods can allow centers to more rapidly adopt IMRT and enhance access to the documented improvements in care for breast cancer patients, using technologies that are widely available and already in clinical use.
Purdie, Thomas G; Dinniwell, Robert E; Fyles, Anthony; Sharpe, Michael B
2014-11-01
To demonstrate the large-scale clinical implementation and performance of an automated treatment planning methodology for tangential breast intensity modulated radiation therapy (IMRT). Automated planning was used to prospectively plan tangential breast IMRT treatment for 1661 patients between June 2009 and November 2012. The automated planning method emulates the manual steps performed by the user during treatment planning, including anatomical segmentation, beam placement, optimization, dose calculation, and plan documentation. The user specifies clinical requirements of the plan to be generated through a user interface embedded in the planning system. The automated method uses heuristic algorithms to define and simplify the technical aspects of the treatment planning process. Automated planning was used in 1661 of 1708 patients receiving tangential breast IMRT during the time interval studied. Therefore, automated planning was applicable in greater than 97% of cases. The time for treatment planning using the automated process is routinely 5 to 6 minutes on standard commercially available planning hardware. We have shown a consistent reduction in plan rejections from plan reviews through the standard quality control process or weekly quality review multidisciplinary breast rounds as we have automated the planning process for tangential breast IMRT. Clinical plan acceptance increased from 97.3% using our previous semiautomated inverse method to 98.9% using the fully automated method. Automation has become the routine standard method for treatment planning of tangential breast IMRT at our institution and is clinically feasible on a large scale. The method has wide clinical applicability and can add tremendous efficiency, standardization, and quality to the current treatment planning process. 
The use of automated methods can allow centers to more rapidly adopt IMRT and enhance access to the documented improvements in care for breast cancer patients, using technologies that are widely available and already in clinical use. Copyright © 2014 Elsevier Inc. All rights reserved.
Trust in automation: designing for appropriate reliance.
Lee, John D; See, Katrina A
2004-01-01
Automation is often problematic because people fail to rely upon it appropriately. Because people respond to technology socially, trust influences reliance on automation. In particular, trust guides reliance when complexity and unanticipated situations make a complete understanding of the automation impractical. This review considers trust from the organizational, sociological, interpersonal, psychological, and neurological perspectives. It considers how the context, automation characteristics, and cognitive processes affect the appropriateness of trust. The context in which the automation is used influences automation performance and provides a goal-oriented perspective to assess automation characteristics along a dimension of attributional abstraction. These characteristics can influence trust through analytic, analogical, and affective processes. The challenges of extrapolating the concept of trust in people to trust in automation are discussed. A conceptual model integrates research regarding trust in automation and describes the dynamics of trust, the role of context, and the influence of display characteristics. Actual or potential applications of this research include improved designs of systems that require people to manage imperfect automation.
Development of a menu of performance tests self-administered on a portable microcomputer
NASA Technical Reports Server (NTRS)
Wilkes, Robert L.; Kuntz, Lois-Ann; Kennedy, Robert S.
1987-01-01
Eighteen cognitive, motor, and information processing performance subtests were screened for self-administration over 10 trials by 16 subjects. When altered presentation forms of the same test were collectively considered, the battery composition was reduced to 10 distinctly different measures. A fully automated microcomputer-based testing system was employed in presenting the battery of subtests. Successful self-administration of the battery provided for the field testing of the automated system and facilitated convenient data collection. Total test administration time was 47.2 minutes for each session. Results indicated that nine of the tests stabilized, but for a short battery of tests only five are recommended for use in repeated-measures research. The five recommended tests include: the Tapping series, Number Comparison, Short-term Memory, Grammatical Reasoning, and 4-Choice Reaction Time. These tests can be expected to reveal three factors: (1) cognition, (2) processing quickness, and (3) motor. All the tests stabilized in 24 minutes, or approximately two 12-minute sessions.
Automated geospatial Web Services composition based on geodata quality requirements
NASA Astrophysics Data System (ADS)
Cruz, Sérgio A. B.; Monteiro, Antonio M. V.; Santos, Rafael
2012-10-01
Service-Oriented Architecture and Web Services technologies improve the performance of activities involved in geospatial analysis with a distributed computing architecture. However, the design of the geospatial analysis process on this platform, by combining component Web Services, presents some open issues. The automated construction of these compositions represents an important research topic. Some approaches to solving this problem are based on AI planning methods coupled with semantic service descriptions. This work presents a new approach using AI planning methods to improve the robustness of the produced geospatial Web Services composition. For this purpose, we use semantic descriptions of geospatial data quality requirements in a rule-based form. These rules allow the semantic annotation of geospatial data and, coupled with the conditional planning method, this approach represents more precisely the situations of nonconformities with geodata quality that may occur during the execution of the Web Service composition. The service compositions produced by this method are more robust, thus improving process reliability when working with a composition of chained geospatial Web Services.
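The idea of rule-based geodata quality requirements driving a conditional plan can be sketched as follows. The rule names, thresholds, and metadata fields are hypothetical illustrations, not taken from the paper:

```python
# Each rule is a predicate over a data set's quality metadata; a service
# step executes only when all rules hold, otherwise the conditional plan
# follows a contingency branch (e.g., a repair or alternative service).

QUALITY_RULES = [
    ("positional_accuracy", lambda m: m["accuracy_m"] <= 30.0),
    ("completeness",        lambda m: m["coverage"] >= 0.95),
]

def first_violation(metadata):
    """Return the name of the first violated rule, or None if all pass."""
    for name, predicate in QUALITY_RULES:
        if not predicate(metadata):
            return name
    return None

def execute_step(step, metadata):
    """Dispatch one composition step along the nominal or contingency branch."""
    violated = first_violation(metadata)
    if violated is None:
        return "run:" + step          # nominal branch of the plan
    return "repair:" + violated       # contingency branch of the plan
```

Annotating data with such rules lets the planner anticipate nonconformities at composition time rather than failing mid-execution.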
Automated manufacturing process for DEAP stack-actuators
NASA Astrophysics Data System (ADS)
Tepel, Dominik; Hoffstadt, Thorben; Maas, Jürgen
2014-03-01
Dielectric elastomers (DE) are thin polymer films belonging to the class of electroactive polymers (EAP), coated with compliant and conductive electrodes on each side. Under an applied electric field, dielectric elastomers undergo large deformations. In this contribution, an automated manufacturing process for stack-actuators based on dielectric electroactive polymers (DEAP) is presented. First, the specific design of the considered stack-actuator is explained; afterwards, the development, construction, and realization of the automated manufacturing process are presented in detail. By applying this automated process, stack-actuators with reproducible and homogeneous properties can be manufactured. Finally, the first DEAP actuator modules fabricated by this process are validated experimentally.
An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques
2018-01-09
ARL-TR-8272 ● JAN 2018 ● US Army Research Laboratory. An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques.
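Only the report's title survives extraction, but the generic technique it names (an energy detector refined by statistical thresholding and morphological cleanup) can be sketched. The threshold rule and gap-closing step below are illustrative, not the report's algorithm:

```python
import numpy as np

def detect(power_db, k=3.0, gap=2):
    """Flag spectrum bins whose energy exceeds mean + k*std (statistical
    step), then bridge gaps of up to `gap` bins between detections (a 1-D
    morphological closing) so fragmented detections merge."""
    mask = power_db > power_db.mean() + k * power_db.std()
    closed = mask.copy()
    idx = np.flatnonzero(mask)
    for a, b in zip(idx[:-1], idx[1:]):
        if 1 < b - a <= gap + 1:
            closed[a:b] = True   # fill short gap between adjacent detections
    return closed
```

A real detector would estimate the noise floor more carefully (e.g., robustly, per sub-band), but the two-stage structure (threshold, then morphology) is the same.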
Aviation safety/automation program overview
NASA Technical Reports Server (NTRS)
Morello, Samuel A.
1990-01-01
The goal is to provide a technology base leading to improved safety of the national airspace system through the development and integration of human-centered automation technologies for aircraft crews and air traffic controllers. Information on the problems, specific objectives, human-automation interaction, intelligent error-tolerant systems, and air traffic control/cockpit integration is given in viewgraph form.
NASA Technical Reports Server (NTRS)
Nevill, Gale E., Jr.
1988-01-01
The goal of the Fall 1987 class of EGM 4000 was the investigation of engineering aspects contributing to the development of NASA's Controlled Ecological Life Support System (CELSS). The areas investigated were the geometry of plant growth chambers, automated seeding of plants, remote sensing of plant health, and processing of grain into edible forms. The group investigating variable spacing of individual soybean plants designed growth trays consisting of three dimensional trapezoids arranged in a compact circular configuration. The automated seed manipulation and planting group investigated the electrical and mechanical properties of wheat seeds and developed three seeding concepts based upon these properties. The plant health and disease sensing group developed a list of reliable plant health indicators and investigated potential detection technologies.
Oxygen-controlled automated neural differentiation of mouse embryonic stem cells.
Mondragon-Teran, Paul; Tostoes, Rui; Mason, Chris; Lye, Gary J; Veraitch, Farlan S
2013-03-01
Automation and oxygen tension control are two tools that provide significant improvements to the reproducibility and efficiency of stem cell production processes. The aim of this study was to establish a novel automation platform capable of controlling oxygen tension during both the cell-culture and liquid-handling steps of neural differentiation processes. We built a bespoke automation platform, which enclosed a liquid-handling platform in a sterile, oxygen-controlled environment. An airtight connection was used to transfer cell culture plates to and from an automated oxygen-controlled incubator. Our results demonstrate that our system yielded comparable cell numbers, viabilities, metabolism profiles and differentiation efficiencies when compared with traditional manual processes. Interestingly, eliminating exposure to ambient conditions during the liquid-handling stage resulted in significant improvements in the yield of MAP2-positive neural cells, indicating that this level of control can improve differentiation processes. This article describes, for the first time, an automation platform capable of maintaining oxygen tension control during both the cell-culture and liquid-handling stages of a 2D embryonic stem cell differentiation process.
Augmenting team cognition in human-automation teams performing in complex operational environments.
Cuevas, Haydee M; Fiore, Stephen M; Caldwell, Barrett S; Strater, Laura
2007-05-01
There is a growing reliance on automation (e.g., intelligent agents, semi-autonomous robotic systems) to effectively execute increasingly cognitively complex tasks. Successful team performance for such tasks has become even more dependent on team cognition, addressing both human-human and human-automation teams. Team cognition can be viewed as the binding mechanism that produces coordinated behavior within experienced teams, emerging from the interplay between each team member's individual cognition and team process behaviors (e.g., coordination, communication). In order to better understand team cognition in human-automation teams, team performance models need to address issues surrounding the effect of human-agent and human-robot interaction on critical team processes such as coordination and communication. Toward this end, we present a preliminary theoretical framework illustrating how the design and implementation of automation technology may influence team cognition and team coordination in complex operational environments. Integrating constructs from organizational and cognitive science, our proposed framework outlines how information exchange and updating between humans and automation technology may affect lower-level (e.g., working memory) and higher-level (e.g., sense making) cognitive processes as well as teams' higher-order "metacognitive" processes (e.g., performance monitoring). Issues surrounding human-automation interaction are discussed and implications are presented within the context of designing automation technology to improve task performance in human-automation teams.
Agile based "Semi-"Automated Data ingest process : ORNL DAAC example
NASA Astrophysics Data System (ADS)
Santhana Vannan, S. K.; Beaty, T.; Cook, R. B.; Devarakonda, R.; Hook, L.; Wei, Y.; Wright, D.
2015-12-01
The ORNL DAAC archives and publishes data and information relevant to biogeochemical, ecological, and environmental processes. The data archived at the ORNL DAAC must be well formatted, self-descriptive, and documented, as well as referenced in a peer-reviewed publication. The ORNL DAAC ingest team curates diverse data sets from multiple data providers simultaneously. To streamline the ingest process, the data set submission process at the ORNL DAAC has been recently updated to use an agile process, and a semi-automated workflow system has been developed to provide a consistent data provider experience and to create a uniform data product. The goals of the semi-automated agile ingest process are to: (1) provide the ability to track a data set from acceptance to publication; (2) automate steps that can be automated to improve efficiencies and reduce redundancy; (3) update legacy ingest infrastructure; and (4) provide a centralized system to manage the various aspects of ingest. This talk will cover the agile methodology, workflow, and tools developed through this system.
Automation Bias: Decision Making and Performance in High-Tech Cockpits
NASA Technical Reports Server (NTRS)
Mosier, Kathleen L.; Skitka, Linda J.; Heers, Susan; Burdick, Mark; Rosekind, Mark R. (Technical Monitor)
1997-01-01
Automated aids and decision support tools are rapidly becoming indispensible tools in high-technology cockpits, and are assuming increasing control of "cognitive" flight tasks, such as calculating fuel-efficient routes, navigating, or detecting and diagnosing system malfunctions and abnormalities. This study was designed to investigate "automation bias," a recently documented factor in the use of automated aids and decision support systems. The term refers to omission and commission errors resulting from the use of automated cues as a heuristic replacement for vigilant information seeking and processing. Glass-cockpit pilots flew flight scenarios involving automation "events," or opportunities for automation-related omission and commission errors. Pilots who perceived themselves as "accountable" for their performance and strategies of interaction with the automation were more likely to double-check automated functioning against other cues, and less likely to commit errors. Pilots were also likely to erroneously "remember" the presence of expected cues when describing their decision-making processes.
Implementation of and experiences with new automation
Mahmud, Ifte; Kim, David
2000-01-01
In an environment where cost, timeliness, and quality drive the business, it is essential to look for answers in technology where these challenges can be met. In the Novartis Pharmaceutical Quality Assurance Department, automation and robotics have become just the tools to meet these challenges. Although automation is a relatively new concept in our department, we have fully embraced it within just a few years. As our company went through a merger, there was a significant reduction in the workforce within the Quality Assurance Department through voluntary and involuntary separations. However, the workload remained constant or in some cases actually increased. So even with the reduction in laboratory personnel, we were challenged internally and from the headquarters in Basle to improve productivity while maintaining integrity in quality testing. Benchmark studies indicated the Suffern site to be the choice manufacturing site above other facilities. This is attributed to the Suffern facility employees' commitment to reduce cycle time, improve efficiency, and maintain a high level of regulatory compliance. One of the stronger contributing factors was automation technology in the laboratories, and this technology will continue to help the site's status in the future. The Automation Group was originally formed about 2 years ago to meet the demands of high quality assurance testing throughput and to bring our testing group up to standard with the industry. Automation began with only two people in the group and now we have three people who are the next generation automation scientists. Even with such a small staff, we have made great strides in laboratory automation as we have worked extensively with each piece of equipment brought in. The implementation process of each project was often difficult because the second generation automation group came from the laboratory without much automation experience.
However, with the involvement from the users at ‘get-go’, we were able to successfully bring in many automation technologies. Our first experience with automation was SFA/SDAS, and then Zymark TPWII followed by Zymark Multi-dose. The future of product testing lies in automation, and we shall continue to explore the possibilities of improving the testing methodologies so that the chemists will be less burdened with repetitive and mundane daily tasks and be more focused on bringing quality into our products. PMID:18924695
Code of Federal Regulations, 2010 CFR
2010-10-01
45 CFR § 310.5 (Requirements for Computerized Tribal IV-D Systems and Office Automation): What options are available for Computerized Tribal IV-D Systems and office automation? The section addresses conducting automated data processing and recordkeeping activities through office automation.
An Automation Survival Guide for Media Centers.
ERIC Educational Resources Information Center
Whaley, Roger E.
1989-01-01
Reviews factors that should affect the decision to automate a school media center and offers suggestions for the automation process. Topics discussed include getting the library collection ready for automation, deciding what automated functions are needed, evaluating software vendors, selecting software, and budgeting. (CLB)
Space Station Freedom ECLSS: A step toward autonomous regenerative life support systems
NASA Technical Reports Server (NTRS)
Dewberry, Brandon S.
1990-01-01
The Environmental Control and Life Support System (ECLSS) is a Freedom Station distributed system with inherent applicability to extensive automation primarily due to its comparatively long control system latencies. These allow longer contemplation times in which to form a more intelligent control strategy and to prevent and diagnose faults. The regenerative nature of the Space Station Freedom ECLSS will contribute closed loop complexities never before encountered in life support systems. A study to determine ECLSS automation approaches has been completed. The ECLSS baseline software and system processes could be augmented with more advanced fault management and regenerative control systems for a more autonomous evolutionary system, as well as serving as a firm foundation for future regenerative life support systems. Emerging advanced software technology and tools can be successfully applied to fault management, but a fully automated life support system will require research and development of regenerative control systems and models. The baseline Environmental Control and Life Support System utilizes ground tests in development of batch chemical and microbial control processes. Long duration regenerative life support systems will require more active chemical and microbial feedback control systems which, in turn, will require advancements in regenerative life support models and tools. These models can be verified using ground and on orbit life support test and operational data, and used in the engineering analysis of proposed intelligent instrumentation feedback and flexible process control technologies for future autonomous regenerative life support systems, including the evolutionary Space Station Freedom ECLSS.
Fleischer, Heidi; Ramani, Kinjal; Blitti, Koffi; Roddelkopf, Thomas; Warkentin, Mareike; Behrend, Detlef; Thurow, Kerstin
2018-02-01
Automation systems are well established in industries and life science laboratories, especially in bioscreening and high-throughput applications. An increasing demand for automation solutions can be seen in the field of analytical measurement in chemical synthesis, quality control, and medical and pharmaceutical fields, as well as research and development. In this study, an automation solution was developed and optimized for the investigation of new biliary endoprostheses (stents), which should reduce clogging after implantation in the human body. The material inside the stents (incrustations) has to be controlled regularly and under identical conditions. The elemental composition is one criterion to be monitored in stent development. The manual procedure was transferred to an automated process including sample preparation, elemental analysis using inductively coupled plasma mass spectrometry (ICP-MS), and data evaluation. Due to safety issues, microwave-assisted acid digestion was executed outside of the automation system. The performance of the automated process was determined and validated. The measurement results and the processing times were compared for both the manual and the automated procedure. Finally, real samples of stent incrustations and pig bile were analyzed using the automation system.
Tests of Spectral Cloud Classification Using DMSP Fine Mode Satellite Data.
1980-06-02
Image processing techniques of potential value were examined; Fourier spectral analysis was identified as the most promising technique to upgrade automated processing of DMSP fine mode satellite imagery, whose resolution on the Earth's surface is 0.3 n mi. Cited work includes Pickett, R.M., and Blackman, E.S. (1976), Automated Processing of Satellite Imagery Data at the Air Force Global Weather Central, and a 1977 follow-on, Automated Processing of Satellite Imagery Data at the Air Force Global Weather Central: Demonstrations of Spectral Analysis.
Fully automated corneal endothelial morphometry of images captured by clinical specular microscopy
NASA Astrophysics Data System (ADS)
Bucht, Curry; Söderberg, Per; Manneberg, Göran
2009-02-01
The corneal endothelium serves as the posterior barrier of the cornea. Factors such as clarity and refractive properties of the cornea are in direct relationship to the quality of the endothelium. The endothelial cell density is considered the most important morphological factor. Morphometry of the corneal endothelium is presently done by semi-automated analysis of pictures captured by a Clinical Specular Microscope (CSM). Because of the occasional need for operator involvement, this process can be tedious, having a negative impact on sampling size. This study was dedicated to the development of fully automated analysis of images of the corneal endothelium, captured by CSM, using Fourier analysis. Software was developed in the mathematical programming language Matlab. Pictures of the corneal endothelium, captured by CSM, were read into the analysis software. The software automatically performed digital enhancement of the images. The digitally enhanced images of the corneal endothelium were transformed, using the fast Fourier transform (FFT). Tools were developed and applied for identification and analysis of relevant characteristics of the Fourier transformed images. The data obtained from each Fourier transformed image was used to calculate the mean cell density of its corresponding corneal endothelium. The calculation was based on well-known diffraction theory. Results in the form of estimated cell density of the corneal endothelium were obtained using the fully automated analysis software on images captured by CSM. The cell density obtained by the fully automated analysis was compared to the cell density obtained from classical, semi-automated analysis, and a relatively high correlation was found.
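The Fourier approach in this abstract can be sketched in a few lines: the quasi-regular cell mosaic produces a ring of energy in the power spectrum whose radius gives the mean cell spacing. The hexagonal-packing conversion from ring radius to density, and all parameter choices, are assumptions for illustration rather than the authors' published method:

```python
import numpy as np

def estimate_cell_density(img, pixel_size_um):
    """Estimate cell density (cells/mm^2) of a quasi-regular mosaic in a
    square image from the dominant spatial frequency of its power spectrum."""
    n = img.shape[0]
    img = img - img.mean()                             # suppress the DC term
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    y, x = np.indices(spec.shape)
    r = np.hypot(y - n // 2, x - n // 2).astype(int)   # radius in cycles/image
    radial = np.bincount(r.ravel(), weights=spec.ravel())
    radial[:2] = 0.0                                   # ignore residual low bins
    k = int(radial[: n // 2].argmax())                 # dominant ring radius
    spacing_um = n * pixel_size_um / k                 # mean cell spacing
    # Assume hexagonal packing: density = 2 / (sqrt(3) * spacing^2)
    return 2.0 / (np.sqrt(3.0) * (spacing_um / 1000.0) ** 2)
```

Radially binning the spectrum makes the estimate insensitive to the orientation of the mosaic, which is one reason a Fourier method needs no operator involvement.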
Xu, Wei
2007-12-01
This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.
Applications of Automation Methods for Nonlinear Fracture Test Analysis
NASA Technical Reports Server (NTRS)
Allen, Phillip A.; Wells, Douglas N.
2013-01-01
Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: (1) allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; (2) eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; (3) lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; and (4) providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: ASTM C1340/C1340M-10, Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; ASTM F2815, Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; and ASTM E2807, Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the validity of equations included in a test standard. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.
NASA Technical Reports Server (NTRS)
1984-01-01
The electroepitaxial process and the Very Large Scale Integration (VLSI) circuits (chips) facilities were chosen because each requires a very high degree of automation and therefore involves extensive use of teleoperators, robotics, process mechanization, and artificial intelligence. Both cover a raw-materials process and a sophisticated multi-step process and are therefore highly representative of the kinds of difficult operation, maintenance, and repair challenges that can be expected for any type of space manufacturing facility. Generic areas requiring significant further study were identified. The initial design will be based on terrestrial state-of-the-art hard automation. One hundred candidate missions were evaluated on the basis of automation potential and availability of meaningful knowledge. The design requirements and unconstrained design concepts developed for the two missions are presented.
REDIR: Automated Static Detection of Obfuscated Anti-Debugging Techniques
2014-03-27
analyzing code samples that resist other forms of analysis. 2.5.6 RODS and HASTI: Software Engineering Cognitive Support Software Engineering (SE) is another...and (c) this method is resistant to common obfuscation techniques. To achieve this goal, the Data/Frame sensemaking theory guides the process of...No Starch Press, 2012. [46] C.-W. Hsu, S. W. Shieh et al., “Divergence Detector: A Fine-Grained Approach to Detecting VM-Awareness Malware,” in
Deburring die-castings by wet vibratory plant
NASA Astrophysics Data System (ADS)
Loeschbart, H. M.
1980-02-01
A wet vibratory procedure for the removal of burrs from die castings is described. In this process synthetic abrasive chips and detergent solutions are agitated with the work in such a way as to produce a spiral circulatory movement. Details of various forms of vibrator basin and shapes of abrasive are illustrated. The automation of deburring is illustrated through the application of vibrators of spiral design in combination with transport and drying devices.
Rice, Stephen; McCarley, Jason S
2011-12-01
Automated diagnostic aids prone to false alarms often produce poorer human performance in signal detection tasks than equally reliable miss-prone aids. However, it is not yet clear whether this is attributable to differences in the perceptual salience of the automated aids' misses and false alarms or is the result of inherent differences in operators' cognitive responses to different forms of automation error. The present experiments therefore examined the effects of automation false alarms and misses on human performance under conditions in which the different forms of error were matched in their perceptual characteristics. Young adult participants performed a simulated baggage x-ray screening task while assisted by an automated diagnostic aid. Judgments from the aid were rendered as text messages presented at the onset of each trial, and every trial was followed by a second text message providing response feedback. Thus, misses and false alarms from the aid were matched for their perceptual salience. Experiment 1 found that even under these conditions, false alarms from the aid produced poorer human performance and engendered lower automation use than misses from the aid. Experiment 2, however, found that the asymmetry between misses and false alarms was reduced when the aid's false alarms were framed as neutral messages rather than explicit misjudgments. Results suggest that automation false alarms and misses differ in their inherent cognitive salience and imply that changes in diagnosis framing may allow designers to encourage better use of imperfectly reliable automated aids.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-15
.../APR), and two additional, automated versions of form: an Excel version and a version on HUD's Energy... same information, but have different burdens due to the automated capabilities of the Excel and EPIC versions. These automated enhancements make the Excel and EPIC versions easier and faster to complete than...
Cockpit avionics integration and automation
NASA Technical Reports Server (NTRS)
Pischke, Keith M.
1990-01-01
Information on cockpit avionics integration and automation is given in viewgraph form, with a number of photographs. The benefits of cockpit integration are listed. The MD-11 flight guidance/flight deck system is illustrated.
Improved compliance by BPM-driven workflow automation.
Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin
2014-12-01
Using methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of the model-driven development is the improved transparency of processes and the alignment of process diagrams and technical code. First experiences with the business process model and notation (BPMN) show that easy-to-read graphical process models are achievable and provide standardization of laboratory workflows. The model-based development allows one to change processes quickly and to adapt easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs of quality assurance. BPMN 2.0, as an automation language able to control every kind of activity or subprocess, is directed at complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (i.e., screening procedures or analytical processes). With the BPM standard, laboratories thus also gain a common method for sharing process knowledge. © 2014 Society for Laboratory Automation and Screening.
Slepoy, A; Peters, M D; Thompson, A P
2007-11-30
Molecular dynamics and other molecular simulation methods rely on a potential energy function based only on the relative coordinates of the atomic nuclei. Such a function, called a force field, approximately represents the electronic structure interactions of a condensed matter system. Developing such approximate functions and fitting their parameters remains an arduous, time-consuming process relying on expert physical intuition. To address this problem, a functional programming methodology was developed that may enable automated discovery of entirely new force-field functional forms while simultaneously fitting parameter values. The method uses a combination of genetic programming, Metropolis Monte Carlo importance sampling, and parallel tempering to efficiently search a large space of candidate functional forms and parameters. The methodology was tested using a nontrivial problem with a well-defined globally optimal solution: a small set of atomic configurations was generated and the energy of each configuration was calculated using the Lennard-Jones pair potential. Starting with a population of random functions, our fully automated, massively parallel implementation of the method reproducibly discovered the original Lennard-Jones pair potential by searching for several hours on 100 processors, sampling only a minuscule portion of the total search space. This result indicates that, with further improvement, the method may be suitable for unsupervised development of more accurate force fields with completely new functional forms. Copyright (c) 2007 Wiley Periodicals, Inc.
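As a toy illustration of the parameter-fitting half of this approach (the genetic-programming search over functional forms and the parallel tempering are omitted), the sketch below uses Metropolis Monte Carlo to recover the two Lennard-Jones parameters from synthetic pair energies. All data, names, and settings are invented for illustration and are not the authors' implementation.

```python
import math
import random

def lennard_jones(r, eps=1.0, sigma=1.0):
    """Standard 12-6 Lennard-Jones pair potential."""
    sr6 = (sigma / r) ** 6
    return 4.0 * eps * (sr6 * sr6 - sr6)

def fit_metropolis(distances, targets, steps=30000, beta=50.0, seed=1):
    """Metropolis Monte Carlo search over (eps, sigma) minimizing squared error."""
    rng = random.Random(seed)
    eps, sigma = 2.0, 1.5                      # deliberately wrong initial guess
    def cost(e, s):
        return sum((lennard_jones(r, e, s) - t) ** 2
                   for r, t in zip(distances, targets))
    c = cost(eps, sigma)
    best = (eps, sigma, c)
    for _ in range(steps):
        e_new = eps + rng.gauss(0.0, 0.02)
        s_new = sigma + rng.gauss(0.0, 0.02)
        if e_new <= 0.0 or s_new <= 0.0:
            continue
        c_new = cost(e_new, s_new)
        # accept downhill moves always, uphill moves with Boltzmann probability
        if c_new < c or rng.random() < math.exp(-beta * (c_new - c)):
            eps, sigma, c = e_new, s_new, c_new
            if c < best[2]:
                best = (eps, sigma, c)
    return best[0], best[1]

# synthetic "training" set: pair distances and exact Lennard-Jones energies
dists = [0.95 + 0.05 * i for i in range(12)]   # 0.95 .. 1.50
energies = [lennard_jones(r) for r in dists]
eps_fit, sigma_fit = fit_metropolis(dists, energies)
```

Starting from a deliberately wrong guess, the sampler walks downhill to the true parameters (ε = σ = 1), mirroring in miniature how the full method rediscovered the Lennard-Jones form.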
Problems of Automation and Management Principles Information Flow in Manufacturing
NASA Astrophysics Data System (ADS)
Grigoryuk, E. N.; Bulkin, V. V.
2017-07-01
Automated control systems for technological processes are complex systems characterized by elements with a common purpose, by the systemic nature of the implemented algorithms for the exchange and processing of information, and by a large number of functional subsystems. The article gives examples of automatic control systems and automated control systems for technological processes, drawing parallels between them and identifying their strengths and weaknesses. A non-standard control system for a technological process is also proposed.
Public Library Automation Report: 1984.
ERIC Educational Resources Information Center
Gotanda, Masae
Data processing was introduced to public libraries in Hawaii in 1973 with a feasibility study which outlined the candidate areas for automation. Since then, the Office of Library Services has automated the order procedures for one of the largest book processing centers for public libraries in the country; created one of the first COM…
Conflict-free trajectory planning for air traffic control automation
NASA Technical Reports Server (NTRS)
Slattery, Rhonda; Green, Steve
1994-01-01
As the traffic demand continues to grow within the National Airspace System (NAS), the need for long-range planning (30 minutes plus) of arrival traffic increases greatly. Research into air traffic control (ATC) automation at NASA Ames Research Center (ARC) has led to the development of the Center-TRACON Automation System (CTAS). CTAS determines optimum landing schedules for arrival traffic and assists controllers in meeting those schedules safely and efficiently. One crucial element in the development of CTAS is the capability to perform long-range (20 minutes) and short-range (5 minutes) conflict prediction and resolution once landing schedules are determined. The determination of conflict-free trajectories within the Center airspace is particularly difficult because of large variations in speed and altitude. The paper describes the current design and implementation of the conflict prediction and resolution tools used to generate CTAS advisories in Center airspace. Conflict criteria (separation requirements) are defined and the process of separation prediction is described. The major portion of the paper describes the current implementation of CTAS conflict resolution algorithms in terms of the degrees of freedom for resolutions as well as resolution search techniques. The tools described in this paper have been implemented in a research system designed to rapidly develop and evaluate prototype concepts and will form the basis for an operational ATC automation system.
Development of an automated fuzing station for the future armored resupply vehicle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chesser, J.B.; Jansen, J.F.; Lloyd, P.D.
1995-03-01
The US Army is developing the Advanced Field Artillery System (AFAS), a next-generation armored howitzer. The Future Armored Resupply Vehicle (FARV) will be its companion ammunition resupply vehicle. The FARV will automate the supply of ammunition and fuel to the AFAS, which will increase capabilities over the current system. One of the functions being considered for automation is ammunition processing. Oak Ridge National Laboratory is developing equipment to demonstrate automated ammunition processing. One of the key operations to be automated is fuzing. The projectiles are initially unfuzed, and a fuze must be inserted and threaded into the projectile as part of the processing. A constraint on the design solution is that the ammunition cannot be modified to simplify automation. The problem was analyzed to determine the alignment requirements. Using the results of the analysis, ORNL designed, built, and tested a test stand to verify the selected design solution.
PLACE: an open-source python package for laboratory automation, control, and experimentation.
Johnson, Jami L; Tom Wörden, Henrik; van Wijk, Kasper
2015-02-01
In modern laboratories, software can drive the full experimental process from data acquisition to storage, processing, and analysis. The automation of laboratory data acquisition is an important consideration for every laboratory. When implementing a laboratory automation scheme, important parameters include its reliability, time to implement, adaptability, and compatibility with software used at other stages of experimentation. In this article, we present an open-source, flexible, and extensible Python package for Laboratory Automation, Control, and Experimentation (PLACE). The package uses modular organization and clear design principles; therefore, it can be easily customized or expanded to meet the needs of diverse laboratories. We discuss the organization of PLACE, data-handling considerations, and then present an example using PLACE for laser-ultrasound experiments. Finally, we demonstrate the seamless transition to post-processing and analysis with Python through the development of an analysis module for data produced by PLACE automation. © 2014 Society for Laboratory Automation and Screening.
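The modular organization described above can be sketched as a minimal driver interface: each instrument implements config/update/cleanup hooks, and a scan loop drives all registered instruments in lockstep. This is an illustrative pattern only; the class and method names are hypothetical and do not reproduce the actual PLACE API.

```python
class Instrument:
    """Minimal modular-driver interface in the spirit of a PLACE-style package.

    (Illustrative only; not the actual PLACE API.)
    """
    def config(self, settings):   # called once before a scan begins
        raise NotImplementedError
    def update(self, step):       # called at every scan step; returns a data dict
        raise NotImplementedError
    def cleanup(self):            # called once after the scan ends
        pass

class DummyStage(Instrument):
    """Pretend motion stage that reports its position at each step."""
    def config(self, settings):
        self.start = settings["start"]
        self.increment = settings["increment"]
    def update(self, step):
        return {"position": self.start + step * self.increment}

def run_scan(instruments, settings, steps):
    """Drive every registered instrument through config/update/cleanup."""
    for inst in instruments:
        inst.config(settings)
    records = []
    for step in range(steps):
        frame = {}
        for inst in instruments:
            frame.update(inst.update(step) or {})
        records.append(frame)
    for inst in instruments:
        inst.cleanup()
    return records

data = run_scan([DummyStage()], {"start": 0.0, "increment": 0.5}, steps=5)
```

Because every device answers the same three hooks, swapping a laser, stage, or oscilloscope driver in or out does not touch the scan loop, which is the essence of the extensibility the package advertises.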
NASA Technical Reports Server (NTRS)
Doggett, William R.
1992-01-01
The topics are presented in viewgraph form and include: automated structures assembly facility current control hierarchy; automated structures assembly facility proposed control hierarchy; end-effector software state transition diagram; block diagram for ideal install composite; and conclusions.
Biomek 3000: the workhorse in an automated accredited forensic genetic laboratory.
Stangegaard, Michael; Meijer, Per-Johan; Børsting, Claus; Hansen, Anders J; Morling, Niels
2012-10-01
We have implemented and validated automated protocols for a wide range of processes such as sample preparation, PCR setup, and capillary electrophoresis setup using small, simple, and inexpensive automated liquid handlers. The flexibility and ease of programming enable the Biomek 3000 to be used in many parts of the laboratory process in a modern forensic genetics laboratory with low to medium sample throughput. In conclusion, we demonstrated that sample processing for accredited forensic genetic DNA typing can be implemented on small automated liquid handlers, leading to the reduction of manual work as well as increased quality and throughput.
Subspace Dimensionality: A Tool for Automated QC in Seismic Array Processing
NASA Astrophysics Data System (ADS)
Rowe, C. A.; Stead, R. J.; Begnaud, M. L.
2013-12-01
Because of the great resolving power of seismic arrays, the application of automated processing to array data is critically important in treaty verification work. A significant problem in array analysis is the inclusion of bad sensor channels in the beamforming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beamforming in routine event detection or location processing. The idea stems from methods used for large computer servers, where monitoring traffic at enormous numbers of nodes is impractical on a node-by-node basis, so the dimensionality of the node traffic is instead monitored for anomalies that could represent malware, cyber-attacks, or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. In the established template application, a detector functions in a manner analogous to waveform cross-correlation, applying a statistical test to assess the similarity of the incoming data stream to known templates for events of interest. In our approach, we seek not to detect matching signals; instead, we examine the signal subspace dimensionality in much the same way that the method addresses node traffic anomalies in large computer systems. Signal anomalies recorded on seismic arrays affect the dimensional structure of the array-wide time-series. 
We have shown previously that this observation is useful in identifying real seismic events, either by looking at the raw signal or derivatives thereof (entropy, kurtosis), but here we explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect for identifying bad array elements through a jackknifing process to isolate the anomalous channels, so that an automated analysis system might discard them prior to FK analysis and beamforming on events of interest.
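The jackknifing idea can be sketched with a simplified stand-in: instead of a full subspace/principal-component dimension estimate, the toy below scores array-wide coherence by mean pairwise correlation and flags the channel whose removal most improves it. The functions and the simulated data are illustrative only, not the authors' algorithm.

```python
import math
import random

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def mean_coherence(channels):
    """Average absolute pairwise correlation across all channel pairs."""
    vals = [abs(pearson(channels[i], channels[j]))
            for i in range(len(channels))
            for j in range(i + 1, len(channels))]
    return sum(vals) / len(vals)

def jackknife_bad_channel(channels):
    """Flag the channel whose removal most improves array-wide coherence."""
    scores = []
    for k in range(len(channels)):
        rest = channels[:k] + channels[k + 1:]
        scores.append(mean_coherence(rest))
    return max(range(len(scores)), key=scores.__getitem__)

# simulate a 6-channel array: a common signal plus sensor noise,
# with one malfunctioning (pure-noise) channel
rng = random.Random(0)
signal = [math.sin(0.1 * t) for t in range(400)]
channels = [[s + rng.gauss(0.0, 0.1) for s in signal] for _ in range(6)]
channels[3] = [rng.gauss(0.0, 1.0) for _ in range(400)]   # dead sensor
bad = jackknife_bad_channel(channels)
```

Leaving out the broken channel raises the coherence of what remains, so the jackknife singles it out, which is the same leverage the full dimensional analysis exploits before FK analysis and beamforming.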
A machine-learning apprentice for the completion of repetitive forms
NASA Technical Reports Server (NTRS)
Hermens, Leonard A.; Schlimmer, Jeffrey C.
1994-01-01
Forms of all types are used in businesses and government agencies, and most of them are filled in by hand. Yet much time and effort has been expended to automate form-filling by programming specific systems or computers. The high cost of programmers and other resources prohibits many organizations from benefiting from efficient office automation. A learning apprentice can be used for such repetitious form-filling tasks. In this paper, we establish the need for learning apprentices, describe a framework for such a system, explain the difficulties of form-filling, and present empirical results of a form-filling system used in our department from September 1991 to April 1992. The form-filling apprentice saves up to 87 percent in keystroke effort and correctly predicts nearly 90 percent of the values on the form.
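A minimal caricature of such an apprentice: suggest each field's most frequent past value and count the keystrokes saved when the suggestion matches what the user would have typed. The actual system learned far richer predictive structure; this sketch, with invented form data, only conveys the keystroke-savings accounting.

```python
from collections import Counter, defaultdict

class FormApprentice:
    """Toy learning apprentice: suggests each field's most frequent past value.

    (Illustrative only; the paper's system used richer learned predictors.)
    """
    def __init__(self):
        self.history = defaultdict(Counter)

    def predict(self, field):
        counts = self.history[field]
        return counts.most_common(1)[0][0] if counts else None

    def observe(self, record):
        for field, value in record.items():
            self.history[field][value] += 1

apprentice = FormApprentice()
forms = [
    {"department": "Physics", "building": "Webster"},
    {"department": "Physics", "building": "Webster"},
    {"department": "Chemistry", "building": "Webster"},
]
saved = typed = 0
for form in forms:
    for field, value in form.items():
        typed += len(value)
        if apprentice.predict(field) == value:
            saved += len(value)   # accepted suggestion: no keystrokes needed
    apprentice.observe(form)
```

Even this frequency-only predictor saves keystrokes on repeated fields; the paper's learned apprentice pushed the same accounting to 87 percent savings.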
An Automated HIV-1 Env-Pseudotyped Virus Production for Global HIV Vaccine Trials
Fuss, Martina; Mazzotta, Angela S.; Sarzotti-Kelsoe, Marcella; Ozaki, Daniel A.; Montefiori, David C.; von Briesen, Hagen; Zimmermann, Heiko; Meyerhans, Andreas
2012-01-01
Background Infections with HIV still represent a major human health problem worldwide and a vaccine is the only long-term option to fight efficiently against this virus. Standardized assessments of HIV-specific immune responses in vaccine trials are essential for prioritizing vaccine candidates in preclinical and clinical stages of development. With respect to neutralizing antibodies, assays with HIV-1 Env-pseudotyped viruses are a high priority. To cover the increasing demands of HIV pseudoviruses, a complete cell culture and transfection automation system has been developed. Methodology/Principal Findings The automation system for HIV pseudovirus production comprises a modified Tecan-based Cellerity system. It covers an area of 5×3 meters and includes a robot platform, a cell counting machine, a CO2 incubator for cell cultivation and a media refrigerator. The processes for cell handling, transfection and pseudovirus production have been implemented according to manual standard operating procedures and are controlled and scheduled autonomously by the system. The system is housed in a biosafety level II cabinet that guarantees protection of personnel, environment and the product. HIV pseudovirus stocks in a scale from 140 ml to 1000 ml have been produced on the automated system. Parallel manual production of HIV pseudoviruses and comparisons (bridging assays) confirmed that the automated produced pseudoviruses were of equivalent quality as those produced manually. In addition, the automated method was fully validated according to Good Clinical Laboratory Practice (GCLP) guidelines, including the validation parameters accuracy, precision, robustness and specificity. Conclusions An automated HIV pseudovirus production system has been successfully established. It allows the high quality production of HIV pseudoviruses under GCLP conditions. 
In its present form, the installed module enables the production of 1000 ml of virus-containing cell culture supernatant per week. Thus, this novel automation facilitates standardized large-scale productions of HIV pseudoviruses for ongoing and upcoming HIV vaccine trials. PMID:23300558
Shah, Ami P; Cobb, Benjamin T; Lower, Darla R; Shaikh, Nader; Rasmussen, Jayne; Hoberman, Alejandro; Wald, Ellen R; Rosendorff, Adam; Hickey, Robert W
2014-03-01
Urinary tract infections (UTI) are the most common serious bacterial infection in febrile infants. Urinalysis (UA) is a screening test for preliminary diagnosis of UTI. UA can be performed manually or using automated techniques. We sought to compare manual versus automated UA for urine specimens obtained via catheterization in the pediatric emergency department. In this prospective study, we processed catheterized urine samples from infants with suspected UTI by both the manual method (enhanced UA) and the automated method. We defined a positive enhanced UA as ≥ 10 white blood cells per cubic millimeter and presence of any bacteria per 10 oil immersion fields on a Gram-stained smear. We defined a positive automated UA as ≥ 2 white blood cells per high-powered field and presence of any bacteria using the IRIS iQ200 ELITE. We defined a positive urine culture as growth of ≥ 50,000 colony-forming units per milliliter of a single uropathogen. We analyzed data using SPSS software. A total of 703 specimens were analyzed. Prevalence of UTI was 7%. For pyuria, the sensitivity and positive predictive value (PPV) of the enhanced UA in predicting positive urine culture were 83.6% and 52.5%, respectively; corresponding values for the automated UA were 79.5% and 37.5%, respectively. For bacteriuria, the sensitivity and PPV of a Gram-stained smear (enhanced UA) were 83.6% and 59.4%, respectively; corresponding values for the automated UA were 73.4%, and 26.2%, respectively. Using criteria of both pyuria and bacteriuria for the enhanced UA resulted in a sensitivity of 77.5% and a PPV of 84.4%; corresponding values for the automated UA were 63.2% and 51.6%, respectively. Combining automated pyuria (≥ 2 white blood cells/high-powered microscopic field) with a Gram-stained smear resulted in a sensitivity of 75.5% and a PPV of 84%. Automated UA is comparable with manual UA for detection of pyuria in young children with suspected UTI. 
Bacteriuria detected by automated UA is less sensitive and specific for UTI when compared with a Gram-stained smear. We recommend using either manual or automated measurement of pyuria in combination with Gram-stained smear as the preferred technique for UA of catheterized specimens obtained from children in an acute care setting.
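The screening metrics quoted above follow directly from confusion-matrix counts. A minimal sketch (with made-up counts for illustration, not the study's data):

```python
def sensitivity(tp, fn):
    """Fraction of actual positives that the test flags (a.k.a. recall)."""
    return tp / (tp + fn)

def ppv(tp, fp):
    """Positive predictive value: fraction of positive calls that are correct."""
    return tp / (tp + fp)

# hypothetical counts: 80 true positives, 20 missed cases, 120 false alarms
sens = sensitivity(tp=80, fn=20)
prec = ppv(tp=80, fp=120)
```

With a low-prevalence condition such as UTI, even a sensitive screen can have modest PPV, which is why the abstract pairs pyuria criteria with the Gram-stained smear rather than relying on either alone.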
Takahashi; Nakazawa; Watanabe; Konagaya
1999-01-01
We have developed the automated processing algorithms for 2-dimensional (2-D) electrophoretograms of genomic DNA based on RLGS (Restriction Landmark Genomic Scanning) method, which scans the restriction enzyme recognition sites as the landmark and maps them onto a 2-D electrophoresis gel. Our powerful processing algorithms realize the automated spot recognition from RLGS electrophoretograms and the automated comparison of a huge number of such images. In the final stage of the automated processing, a master spot pattern, on which all the spots in the RLGS images are mapped at once, can be obtained. The spot pattern variations which seemed to be specific to the pathogenic DNA molecular changes can be easily detected by simply looking over the master spot pattern. When we applied our algorithms to the analysis of 33 RLGS images derived from human colon tissues, we successfully detected several colon tumor specific spot pattern changes.
Studies on Automated Manufacturing of High Performance Composites
NASA Technical Reports Server (NTRS)
Cano, R. J.; Belvin, H. L.; Hulcher, A. B.; Grenoble, R. W.
2001-01-01
The NASA Langley Research Center fiber placement facility has proven to be a valuable asset for obtaining data, experience, and insights into the automated fabrication of high performance composites. The facility consists of two automated devices: an Asea Brown Boveri (ABB) robotic arm with a modified heated head capable of hot gas and focused infrared heating and a 7' x 17' gantry containing a feeder head, rotating platform, focused infrared lamp and e-beam gun. While uncured thermoset tow and tape, e.g., epoxy and cyanate prepreg, can be placed with a robot, the placement facility's most powerful attribute is the ability to place thermoplastic and e-beam curable material to net shape. In recent years, ribbonizing techniques have been developed to make high quality thermoplastic and thermoset dry material forms to the standards required for robotic placement. A variety of composites have been fabricated from these ribbons by heated head tow and tape placement including both flat plates and cylinders. Composite mechanical property values of the former were between 85 and 100 percent of those obtained by hand lay-up/autoclave processing.
ActionMap: A web-based software that automates loci assignments to framework maps.
Albini, Guillaume; Falque, Matthieu; Joets, Johann
2003-07-01
Genetic linkage computation may be a repetitive and time-consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms were designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/).
ActionMap: a web-based software that automates loci assignments to framework maps
Albini, Guillaume; Falque, Matthieu; Joets, Johann
2003-01-01
Genetic linkage computation may be a repetitive and time-consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms were designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/). PMID:12824426
Nanobiocatalysis for protein digestion in proteomic analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Jungbae; Kim, Byoung Chan; Lopez-Ferrer, Daniel
2010-02-01
The process of protein digestion is a critical step for successful protein identification in bottom-up proteomic analysis. To replace the present practice of in-solution protein digestion, which is slow, tedious, and difficult to automate, much effort has been devoted to the development of rapid, recyclable, and automated digestion systems. Recent advances in nanobiocatalytic approaches have improved the performance of protein digestion by using various nanomaterials such as nanoporous materials, magnetic nanoparticles, and polymer nanofibers. In particular, the unprecedented success of trypsin stabilization in the form of trypsin-coated nanofibers, showing no activity decrease under repeated use for one year and retaining good resistance to proteolysis, has demonstrated great potential for the development of automated, high-throughput, and on-line digestion systems. This review discusses recent developments of nanobiocatalytic approaches for improved protein digestion in speed, detection sensitivity, recyclability, and trypsin stability. In addition, we introduce protein digestion under unconventional energy inputs for protein denaturation and the development of microfluidic enzyme reactors that can benefit from recent successes of these nanobiocatalytic approaches.
Complacency and bias in human use of automation: an attentional integration.
Parasuraman, Raja; Manzey, Dietrich H
2010-06-01
Our aim was to review empirical studies of complacency and bias in human interaction with automated and decision support systems and provide an integrated theoretical model for their explanation. Automation-related complacency and automation bias have typically been considered separately and independently. Studies on complacency and automation bias were analyzed with respect to the cognitive processes involved. Automation complacency occurs under conditions of multiple-task load, when manual tasks compete with the automated task for the operator's attention. Automation complacency is found in both naive and expert participants and cannot be overcome with simple practice. Automation bias results in making both omission and commission errors when decision aids are imperfect. Automation bias occurs in both naive and expert participants, cannot be prevented by training or instructions, and can affect decision making in individuals as well as in teams. While automation bias has been conceived of as a special case of decision bias, our analysis suggests that it also depends on attentional processes similar to those involved in automation-related complacency. Complacency and automation bias represent different manifestations of overlapping automation-induced phenomena, with attention playing a central role. An integrated model of complacency and automation bias shows that they result from the dynamic interaction of personal, situational, and automation-related characteristics. The integrated model and attentional synthesis provide a heuristic framework for further research on complacency and automation bias and design options for mitigating such effects in automated and decision support systems.
NASA Astrophysics Data System (ADS)
Gleason, J. L.; Hillyer, T. N.; Wilkins, J.
2012-12-01
The CERES Science Team integrates data from 5 CERES instruments onboard the Terra, Aqua and NPP missions. The processing chain fuses CERES observations with data from 19 other unique sources. The addition of CERES Flight Model 5 (FM5) onboard NPP, coupled with ground processing system upgrades further emphasizes the need for an automated job-submission utility to manage multiple processing streams concurrently. The operator-driven, legacy-processing approach relied on manually staging data from magnetic tape to limited spinning disk attached to a shared memory architecture system. The migration of CERES production code to a distributed, cluster computing environment with approximately one petabyte of spinning disk containing all precursor input data products facilitates the development of a CERES-specific, automated workflow manager. In the cluster environment, I/O is the primary system resource in contention across jobs. Therefore, system load can be maximized with a throttling workload manager. This poster discusses a Java and Perl implementation of an automated job management tool tailored for CERES processing.
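The throttling idea, capping the number of jobs that contend for shared I/O at once, can be sketched with a semaphore-gated job pool. This is a hedged Python sketch of the general pattern only, not the CERES team's Java/Perl implementation; the workload and all names are invented.

```python
import threading
from concurrent.futures import ThreadPoolExecutor

class ThrottledManager:
    """Admit at most `max_concurrent` jobs into the contended section at once,
    mirroring a workload manager that throttles I/O-bound processing streams.

    (Illustrative sketch; not the CERES production tool.)
    """
    def __init__(self, max_concurrent):
        self.gate = threading.Semaphore(max_concurrent)
        self.peak = 0          # highest number of jobs seen running together
        self.active = 0
        self.lock = threading.Lock()

    def run_job(self, job_id):
        with self.gate:        # throttle: block here if too many jobs running
            with self.lock:
                self.active += 1
                self.peak = max(self.peak, self.active)
            result = sum(i * i for i in range(10000))   # stand-in workload
            with self.lock:
                self.active -= 1
            return job_id, result

manager = ThrottledManager(max_concurrent=3)
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(manager.run_job, range(12)))
```

Even with eight worker threads available, the semaphore keeps at most three jobs inside the contended section, which is the load-maximizing-without-thrashing behavior the poster attributes to the throttling workload manager.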
Gokce, Sertan Kutal; Guo, Samuel X.; Ghorashian, Navid; Everett, W. Neil; Jarrell, Travis; Kottek, Aubri; Bovik, Alan C.; Ben-Yakar, Adela
2014-01-01
Femtosecond laser nanosurgery has been widely accepted as an axonal injury model, enabling nerve regeneration studies in the small model organism, Caenorhabditis elegans. To overcome the time limitations of manual worm handling techniques, automation and new immobilization technologies must be adopted to improve throughput in these studies. While new microfluidic immobilization techniques have been developed that promise to reduce the time required for axotomies, there is a need for automated procedures to minimize the required amount of human intervention and accelerate the axotomy processes crucial for high-throughput. Here, we report a fully automated microfluidic platform for performing laser axotomies of fluorescently tagged neurons in living Caenorhabditis elegans. The presented automation process reduces the time required to perform axotomies within individual worms to ∼17 s/worm, at least one order of magnitude faster than manual approaches. The full automation is achieved with a unique chip design and an operation sequence that is fully computer controlled and synchronized with efficient and accurate image processing algorithms. The microfluidic device includes a T-shaped architecture and three-dimensional microfluidic interconnects to serially transport, position, and immobilize worms. The image processing algorithms can identify and precisely position axons targeted for ablation. There were no statistically significant differences observed in reconnection probabilities between axotomies carried out with the automated system and those performed manually with anesthetics. The overall success rate of automated axotomies was 67.4±3.2% of the cases (236/350) at an average processing rate of 17.0±2.4 s. This fully automated platform establishes a promising methodology for prospective genome-wide screening of nerve regeneration in C. elegans in a truly high-throughput manner. PMID:25470130
Marsolo, Keith; Margolis, Peter A; Forrest, Christopher B; Colletti, Richard B; Hutton, John J
2015-01-01
We collaborated with the ImproveCareNow Network to create a proof-of-concept architecture for a network-based Learning Health System. This collaboration involved transitioning an existing registry to one that is linked to the electronic health record (EHR), enabling a "data in once" strategy. We sought to automate a series of reports that support care improvement while also demonstrating the use of observational registry data for comparative effectiveness research. We worked with three leading EHR vendors to create EHR-based data collection forms. We automated many of ImproveCareNow's analytic reports and developed an application for storing protected health information and tracking patient consent. Finally, we deployed a cohort identification tool to support feasibility studies and hypothesis generation. There is ongoing uptake of the system. To date, 31 centers have adopted the EHR-based forms and 21 centers are uploading data to the registry. Usage of the automated reports remains high and investigators have used the cohort identification tools to respond to several clinical trial requests. The current process for creating EHR-based data collection forms requires groups to work individually with each vendor. A vendor-agnostic model would allow for more rapid uptake. We believe that interfacing network-based registries with the EHR would allow them to serve as a source of decision support. Additional standards are needed in order for this vision to be achieved, however. We have successfully implemented a proof-of-concept Learning Health System while providing a foundation on which others can build. We have also highlighted opportunities where sponsors could help accelerate progress.
2010-04-01
NRL Stennis Space Center (NRL-SSC) for further processing using the NRL SSC Automated Processing System (APS). APS was developed for processing...have not previously developed automated processing for 73 hyperspectral ocean color data. The hyperspectral processing branch includes several
A novel method for automated assessment of megakaryocyte differentiation and proplatelet formation.
Salzmann, M; Hoesel, B; Haase, M; Mussbacher, M; Schrottmaier, W C; Kral-Pointner, J B; Finsterbusch, M; Mazharian, A; Assinger, A; Schmid, J A
2018-06-01
Transfusion of platelet concentrates represents an important treatment for various bleeding complications. However, the short half-life and frequent contaminations with bacteria restrict the availability of platelet concentrates and raise a clear demand for platelets generated ex vivo. Therefore, in vitro platelet generation from megakaryocytes represents an important research topic. A vital step in this process is the accurate analysis of thrombopoiesis and proplatelet formation, which is usually conducted manually. We aimed to develop a novel method for automated classification and analysis of proplatelet-forming megakaryocytes (MKs) in vitro. After fluorescent labelling of surface and nucleus, MKs were automatically categorized and analysed with a novel pipeline of the open source software CellProfiler. Our new workflow is able to detect and quantify four subtypes of megakaryocytes undergoing thrombopoiesis: proplatelet-forming, spreading, pseudopodia-forming and terminally differentiated, anucleated megakaryocytes. Furthermore, we were able to characterize the inhibitory effect of dasatinib on thrombopoiesis in more detail. Our new workflow enabled rapid, unbiased, quantitative and qualitative in-depth analysis of proplatelet formation based on morphological characteristics. Clinicians and basic researchers alike will benefit from this novel technique that allows reliable and unbiased quantification of proplatelet formation. It thereby provides a valuable tool for the development of methods to generate platelets ex vivo and to detect effects of drugs on megakaryocyte differentiation.
ERIC Educational Resources Information Center
Evanini, Keelan; Heilman, Michael; Wang, Xinhao; Blanchard, Daniel
2015-01-01
This report describes the initial automated scoring results that were obtained using the constructed responses from the Writing and Speaking sections of the pilot forms of the "TOEFL Junior"® Comprehensive test administered in late 2011. For all of the items except one (the edit item in the Writing section), existing automated scoring…
Multi-Dimensional Signal Processing Research Program
1981-09-30
applications to real-time image processing and analysis. A specific long-range application is the automated processing of aerial reconnaissance imagery...Non-supervised image segmentation is a potentially important operation in the automated processing of aerial reconnaissance photographs since it
Generative Representations for Automated Design of Robots
NASA Technical Reports Server (NTRS)
Homby, Gregory S.; Lipson, Hod; Pollack, Jordan B.
2007-01-01
A method of automated design of complex, modular robots involves an evolutionary process in which generative representations of designs are used. The term generative representations as used here signifies, loosely, representations that consist of or include algorithms, computer programs, and the like, wherein encoded designs can reuse elements of their encoding and thereby evolve toward greater complexity. Automated design of robots through synthetic evolutionary processes has already been demonstrated, but it is not clear whether genetically inspired search algorithms can yield designs that are sufficiently complex for practical engineering. The ultimate success of such algorithms as tools for automation of design depends on the scaling properties of representations of designs. A nongenerative representation (one in which each element of the encoded design is used at most once in translating to the design) scales linearly with the number of elements. Search algorithms that use nongenerative representations quickly become intractable (search times vary approximately exponentially with numbers of design elements), and thus are not amenable to scaling to complex designs. Generative representations are compact representations and were devised as means to circumvent the above-mentioned fundamental restriction on scalability. In the present method, a robot is defined by a compact programmatic form (its generative representation) and the evolutionary variation takes place on this form. The evolutionary process is an iterative one, wherein each cycle consists of the following steps: 1. Generative representations are generated in an evolutionary subprocess. 2. Each generative representation is a program that, when compiled, produces an assembly procedure. 3. In a computational simulation, a constructor executes an assembly procedure to generate a robot. 4. A physical-simulation program tests the performance of a simulated constructed robot, evaluating the performance according to a fitness criterion to yield a figure of merit that is fed back into the evolutionary subprocess of the next iteration. In comparison with prior approaches to automated evolutionary design of robots, the use of generative representations offers two advantages: First, a generative representation enables the reuse of components in regular and hierarchical ways and thereby serves as a systematic means of creating more complex modules out of simpler ones. Second, the evolved generative representation may capture intrinsic properties of the design problem, so that variations in the representations move through the design space more effectively than do equivalent variations in a nongenerative representation. This method has been demonstrated by using it to design some robots that move, variously, by walking, rolling, or sliding. Some of the robots were built (see figure). Although these robots are very simple, in comparison with robots designed by humans, their structures are more regular, modular, hierarchical, and complex than are those of evolved designs of comparable functionality synthesized by use of nongenerative representations.
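The four-step cycle above can be sketched as a toy search loop over a generative (rule-rewriting) representation. Every name here is invented for illustration, the genome is a simple string-rewriting system far simpler than the paper's robot encodings, and a greedy hill-climb stands in for the full evolutionary subprocess:

```python
import random

def expand(genome, depth=3):
    """'Compile' a generative representation: rewrite rules are reused at
    every depth, so a tiny genome can encode a large structure."""
    s = "A"
    for _ in range(depth):
        s = "".join(genome.get(c, c) for c in s)
    return s

def fitness(structure, target_len=20):
    """Toy figure of merit: how close the built structure is to a target size."""
    return -abs(len(structure) - target_len)

def mutate(genome, symbols="AB"):
    """Vary the representation, not the expanded structure."""
    g = dict(genome)
    key = random.choice(list(g))
    g[key] = "".join(random.choice(symbols) for _ in range(random.randint(1, 3)))
    return g

def evolve(generations=200, seed=0):
    """Iterate: vary the genome, expand it, score the result, keep the best."""
    random.seed(seed)
    best = {"A": "AB", "B": "A"}
    best_fit = fitness(expand(best))
    for _ in range(generations):
        cand = mutate(best)
        f = fitness(expand(cand))
        if f >= best_fit:   # greedy acceptance stands in for selection
            best, best_fit = cand, f
    return best, best_fit
```

The key property the sketch preserves is that variation operates on the compact program while fitness is measured on the expanded structure.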
NASA Astrophysics Data System (ADS)
Minh Ha, Thien; Niggeler, Dieter; Bunke, Horst; Clarinval, Jose
1995-08-01
Although giro forms are used by many people in daily life for money remittance in Switzerland, the processing of these forms at banks and post offices is only partly automated. We describe an ongoing project for building an automatic system that is able to recognize various items printed or written on a giro form. The system comprises three main components, namely, an automatic form feeder, a camera system, and a computer. These components are connected in such a way that the system is able to process a bunch of forms without any human interaction. We present two real applications of our system in the field of payment services, which require the reading of both machine-printed and handwritten information that may appear on a giro form. One particular feature of giro forms is their flexible layout, i.e., information items are located differently from one form to another, thus requiring an additional analysis step to localize them before recognition. A commercial optical character recognition software package is used for recognition of machine-printed information, whereas handwritten information is read by our own algorithms, the details of which are presented. The system is implemented by using a client/server architecture providing a high degree of flexibility for change. Preliminary results are reported supporting our claim that the system is usable in practice.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-12
... Relating to Establishing an Automated Service for the Processing of Transfers, Replacements, and Exchanges... (the ``Act'').\\1\\ The proposed rule change allows NSCC to add a new automated service to process... offer a new automated service for the transfer, replacement, or exchange (collectively referred to as a...
AUTOMATED LITERATURE PROCESSING HANDLING AND ANALYSIS SYSTEM--FIRST GENERATION.
ERIC Educational Resources Information Center
Redstone Scientific Information Center, Redstone Arsenal, AL.
THE REPORT PRESENTS A SUMMARY OF THE DEVELOPMENT AND THE CHARACTERISTICS OF THE FIRST GENERATION OF THE AUTOMATED LITERATURE PROCESSING, HANDLING AND ANALYSIS (ALPHA-1) SYSTEM. DESCRIPTIONS OF THE COMPUTER TECHNOLOGY OF ALPHA-1 AND THE USE OF THIS AUTOMATED LIBRARY TECHNIQUE ARE PRESENTED. EACH OF THE SUBSYSTEMS AND MODULES NOW IN OPERATION ARE…
2008-09-01
automated processing of images for color correction, segmentation of foreground targets from sediment and classification of targets to taxonomic category...element in the development of HabCam as a tool for habitat characterization is the automated processing of images for color correction, segmentation of
Classification Trees for Quality Control Processes in Automated Constructed Response Scoring.
ERIC Educational Resources Information Center
Williamson, David M.; Hone, Anne S.; Miller, Susan; Bejar, Isaac I.
As the automated scoring of constructed responses reaches operational status, the issue of monitoring the scoring process becomes a primary concern, particularly when the goal is to have automated scoring operate completely unassisted by humans. Using a vignette from the Architectural Registration Examination and data for 326 cases with both human…
Development of a Logistics Support Framework for Defense Mapping Agency (DMA) Automated Systems
1990-09-01
cycle of a particular system. This research identified principles of management, design or system life cycle processes, and ILS elements needed to...Delphi results gathered from DMA expert opinions. The principles of management, depicted in the Logistics Systems Management Matrix (LSMM) portrayed...review were used to form the Delphi survey questions in Chapter III. As shown in Figure 2, the LSMM is a three-dimensional model with the principles of management on
Centralized and distributed control architectures under Foundation Fieldbus network.
Persechini, Maria Auxiliadora Muanis; Jota, Fábio Gonçalves
2013-01-01
This paper aims at discussing possible automation and control system architectures based on fieldbus networks in which the controllers can be implemented either in a centralized or in a distributed form. An experimental setup is used to demonstrate some of the addressed issues. The control and automation architecture is composed of a supervisory system, a programmable logic controller and various other devices connected to a Foundation Fieldbus H1 network. The procedures used in the network configuration, in the process modelling and in the design and implementation of controllers are described. The specificities of each one of the considered logical organizations are also discussed. Finally, experimental results are analysed using an algorithm for the assessment of control loops to compare the performances between the centralized and the distributed implementations. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
Degree of anisotropy as an automated indicator of rip channels in high resolution bathymetric models
NASA Astrophysics Data System (ADS)
Trimble, S. M.; Houser, C.; Bishop, M. P.
2017-12-01
A rip current is a concentrated seaward flow of water that forms in the surf zone of a beach as a result of alongshore variations in wave breaking. Rips can carry swimmers swiftly into deep water, and they are responsible for hundreds of fatal drownings and thousands of rescues worldwide each year. These currents form regularly alongside hard structures like piers and jetties, and can also form along sandy coasts when there is a three-dimensional bar morphology. This latter rip type tends to be variable in strength and location, making it arguably the most dangerous to swimmers and the most difficult to identify. These currents form in characteristic rip channels in surf zone bathymetry, in which the primary axis of self-similarity is oriented shore-normal. This paper demonstrates a new method for automating identification of such rip channels in bathymetric digital surface models (DSMs) using bathymetric data collected by various remote sensing methods. Degree of anisotropy is used to detect rip channels and distinguishes between sandbars, rip channels, and other beach features. This has implications for coastal geomorphology theory and safety practices. As technological advances increase access and accuracy of topobathy mapping methods in the surf zone, frequent nearshore bathymetric DSMs could be more easily captured and processed, then analyzed with this method to result in localized, automated, and frequent detection of rip channels. This could ultimately reduce rip-related fatalities worldwide (i) in present mitigation, by identifying the present location of rip channels, (ii) in forecasting, by tracking the channel's evolution through multiple DSMs, and (iii) in rip education, by improving local lifeguard knowledge of the rip hazard. Although this paper only applies analysis of degree of anisotropy to the identification of rip channels, this parameter can be applied to multiple facets of barrier island morphological analysis.
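One common way to compute a degree of anisotropy for a gridded surface is from the structure tensor of its gradients. The abstract does not give the authors' exact formulation, so the function below is a generic stand-in: a value near 1 means elevation varies strongly along one axis (channel-like), near 0 means the patch is isotropic:

```python
import numpy as np

def degree_of_anisotropy(dsm):
    """Structure-tensor anisotropy of a bathymetric patch:
    (lam_max - lam_min) / (lam_max + lam_min) of the summed
    gradient outer products. A generic sketch, not the paper's metric."""
    gy, gx = np.gradient(dsm.astype(float))
    # Accumulate the 2x2 structure tensor over the whole patch
    J = np.array([[np.sum(gx * gx), np.sum(gx * gy)],
                  [np.sum(gx * gy), np.sum(gy * gy)]])
    lam = np.linalg.eigvalsh(J)          # ascending eigenvalues
    if lam.sum() == 0:
        return 0.0                       # flat patch: no directionality
    return float((lam[1] - lam[0]) / (lam[1] + lam[0]))
```

A shore-normal channel, whose depth varies only cross-shore, scores near 1 under this measure, while a flat or randomly rough patch scores near 0.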
NASA Astrophysics Data System (ADS)
Herr, J.; Bhatnagar, T.; Goldfarb, S.; Irrer, J.; McKee, S.; Neal, H. A.
2008-07-01
Large scientific collaborations as well as universities have a growing need for multimedia archiving of meetings and courses. Collaborations need to disseminate training and news to their wide-ranging members, and universities seek to provide their students with more useful studying tools. The University of Michigan ATLAS Collaboratory Project has been involved in the recording and archiving of multimedia lectures since 1999. Our software and hardware architecture has been used to record events for CERN, ATLAS, many units inside the University of Michigan, Fermilab, the American Physical Society and the International Conference on Systems Biology at Harvard. Until 2006 our group functioned primarily as a tiny research/development team with special commitments to the archiving of certain ATLAS events. In 2006 we formed the MScribe project, using a larger scale, and highly automated recording system to record and archive eight University courses in a wide array of subjects. Several robotic carts are wheeled around campus by unskilled student helpers to automatically capture and post to the Web audio, video, slides and chalkboard images. The advances the MScribe project has made in automation of these processes, including a robotic camera operator and automated video processing, are now being used to record ATLAS Collaboration events, making them available more quickly than before and enabling the recording of more events.
Abstracts of AF Materials Laboratory Reports
1975-09-01
NO: TITLE: AUTHOR(S): CONTRACT NO; CONTRACTOR: AFML-TR-73-307 200,397 IMPROVED AUTOMATED TAPE LAYING MACHINE M. Poullos, W. J. Murray, D.L...AUTOMATED IMPROVED AUTOMATED TAPE LAYING MACHINE AUTOMATION AUTOMATION OF COATING PROCESSES FOR GAS TURBINE BLADES AND VANES 203222/111 203072...IMPROVED TAPE LAYING MACHINE IMPROVED AUTOMATED TAPE LAYING MACHINE A STUDY OF THE STRESS-STRAIN BEHAVIOR OF GRAPHITE
DOE Program on Seismic Characterization for Regions of Interest to CTBT Monitoring,
1995-08-14
processing of the monitoring network data). While developing and testing the corrections and other parameters needed by the automated processing systems...the secondary network. Parameters tabulated in the knowledge base must be appropriate for routine automated processing of network data, and must also...operation of the PNDC, as well as to results of investigations of "special events" (i.e., those events that fail to locate or discriminate during automated
Automated imaging system for single molecules
Schwartz, David Charles; Runnheim, Rodney; Forrest, Daniel
2012-09-18
There is provided a high throughput automated single molecule image collection and processing system that requires minimal initial user input. The unique features embodied in the present disclosure allow automated collection and initial processing of optical images of single molecules and their assemblies. Correct focus may be automatically maintained while images are collected. Uneven illumination in fluorescence microscopy is accounted for, and an overall robust imaging operation is provided yielding individual images prepared for further processing in external systems. Embodiments described herein are useful in studies of any macromolecules such as DNA, RNA, peptides and proteins. The automated image collection and processing system and method of same may be implemented and deployed over a computer network, and may be ergonomically optimized to facilitate user interaction.
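The disclosure above says uneven fluorescence illumination "is accounted for" but does not give a formula; the textbook flat-field correction it most likely resembles can be sketched as follows (all array names are illustrative):

```python
import numpy as np

def flat_field_correct(raw, flat, dark):
    """Standard flat-field correction for uneven illumination:
    corrected = (raw - dark) * mean(flat - dark) / (flat - dark).
    'flat' is an image of a uniform target; 'dark' is taken with no light."""
    num = raw.astype(float) - dark
    den = flat.astype(float) - dark
    den[den <= 0] = np.nan               # guard pixels with no usable flat signal
    gain = np.nanmean(den)               # renormalize to the mean flat level
    return num * gain / den
```

If the detector response is proportional to illumination, this recovers a spatially uniform signal from an unevenly lit field of single-molecule images.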
Automated Sequence Processor: Something Old, Something New
NASA Technical Reports Server (NTRS)
Streiffert, Barbara; Schrock, Mitchell; Fisher, Forest; Himes, Terry
2012-01-01
High productivity is required for operations teams to meet schedules, and risk must be minimized. Scripting is used to automate processes, and scripts perform essential operations functions. The Automated Sequence Processor (ASP) was a grass-roots task built to automate the command uplink process; a system engineering task for ASP revitalization was organized. ASP is a set of approximately 200 scripts written in Perl, C Shell, AWK and other scripting languages. ASP processes, checks, and packages non-interactive commands automatically. Non-interactive commands are guaranteed to be safe and have been checked by hardware or software simulators. ASP checks that commands are non-interactive, processes the commands through a command simulator, and then packages them if there are no errors. ASP must be active 24 hours/day, 7 days/week.
Batch manufacturing: Six strategic needs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ash, R.H.; Chappell, D.A.
1995-08-01
Since the advent of industrial digital control systems in the mid-1970s, industry has had the promise of integrated, configurable digital batch control systems to replace the morass of electromechanical devices like relays and stepping switches, recorders, and indicators which comprised the components of previous generations of batch control systems - the "monolithic monsters" of the 1960s and earlier. To help fulfill that promise, there have been many wide-ranging proprietary automation solutions for batch control since 1975, many of them technically excellent. However, even the best examples suffered from the lack of a common language and unifying concept permitting separate systems to be interconnected and work together. Today, some 20 years after the digital revolution began, industry has microprocessors, memory chips, data highways, and other marvelous technology to help automate the control of discontinuous processes. They also are on the way to having an accepted standard for batch automation, ISA S88. Batching systems are at once conceptually simple but executionally complex. The notion of adding ingredients one at a time to a vat, mixing, and then processing into final form is as old as the stone age. Every homemaker on earth, male or female, is familiar with how to follow a recipe to create some sumptuous item of culinary delight. Food recipes, so familiar and ubiquitous, are really just microcosms of the S88 recipe standard. They contain the same components: (1) Header (name and description of item being prepared, sometimes serving size); (2) Formula (list and amount of ingredients); (3) Equipment requirements (pans, mixing and cooking equipment); (4) Procedure (description of order of ingredient addition, mixing and other processing steps, baking/cooling time, and other processing steps); and (5) Other information (safety, cautions, and other miscellaneous instructions).
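The five recipe components enumerated above map naturally onto a plain data structure. The field names below are illustrative, not normative S88 terminology, and the bread example simply echoes the abstract's cooking analogy:

```python
from dataclasses import dataclass, field

@dataclass
class Recipe:
    """The five S88-style recipe components named in the text."""
    header: dict                                 # name, description, batch size
    formula: dict                                # ingredient -> amount
    equipment: list                              # required equipment classes
    procedure: list                              # ordered processing steps
    other: dict = field(default_factory=dict)    # safety notes, cautions

# A food recipe as a "microcosm" of the S88 recipe model
bread = Recipe(
    header={"name": "Bread", "serves": 8},
    formula={"flour_g": 500, "water_ml": 320, "yeast_g": 7, "salt_g": 10},
    equipment=["mixer", "oven"],
    procedure=["mix", "knead", "proof", "bake 35 min at 220 C"],
    other={"caution": "oven is hot"},
)
```

In an actual S88 system the procedure would itself decompose into unit procedures, operations, and phases; the flat list here is only the top-level analogy.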
NASA Astrophysics Data System (ADS)
Weckenmann, Albert A.; Gall, P.; Gabbia, A.
2005-02-01
Modern forming technology allows the production of highly sophisticated free-form sheet material components, affording great flexibility to the design and manufacturing processes across a wide range of industries. This increased design and manufacturing potential places an ever-growing demand on the accompanying inspection metrology. As a consequence of their surface shape, these parts undergo a reversible geometrical deformation caused by variations of the material and the manufacturing process, as well as by gravity. This distortion is removed during the assembly process, usually performed in automated robotic processes. For this reason, the part's tolerated parameters have to be inspected in a defined state, simulating the assembly process' boundary conditions. Thus, the inspection process chain consists of six steps: picking the workpiece up, manual fixation of the workpiece, tactile measurement of the surface's coordinates using a defined measurement strategy, manual removal of the fixation and removal of the workpiece from the inspection area. These steps are both laborious and time consuming (for example, the inspection of a car door can take up to a working day to complete). Using optical measuring systems and virtual distortion compensation, this process chain can be dramatically shortened. Optical measuring systems provide as a measurement result a point cloud representing a sample of all nearest surfaces in the measuring range containing the measurand. From this data, a surface model of the measurand can be determined, independent of its position in the measuring range. For thin sheet material parts an approximating finite element model can be deduced from such a surface model. By means of pattern recognition, assembly relevant features of the measurand can be identified and located on this model. 
Together with the boundary conditions given by the assembly process, the shape of the surface in its assembled state can be calculated using the finite elements method. In application these methods culminate in a shortened inspection process chain (which can now also be automated): picking the workpiece up, placing it in the measuring range, optical measurement, virtual distortion compensation and removal of the workpiece from the inspection area. This work discusses the methodology of our approach in detail and also provides and analyses experimental results. The underlying research was gratefully funded by the German Research Foundation (DFG).
Standard Terminal Automation Replacement System Human Factors Review Volume 1
DOT National Transportation Integrated Search
1997-12-18
The Federal Aviation Administration formed a Standard Terminal Automation Replacement : System (STARS) Working Group to identify and resolve human factors concerns with the Early : Display Capability (EDC) system before it is introduced in the field....
Managing laboratory automation
Saboe, Thomas J.
1995-01-01
This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A BIG picture, or continuum view, is presented and some of the reasons for success or failure of the various examples cited are explored. Finally, some comments on future automation needs are discussed. PMID:18925018
NASA Technical Reports Server (NTRS)
Pokras, V. M.; Yevdokimov, V. P.; Maslov, V. D.
1978-01-01
The structure and potential of the information reference system OZhUR designed for the automated data processing systems of scientific space vehicles (SV) is considered. The system OZhUR ensures control of the extraction phase of processing with respect to a concrete SV and the exchange of data between phases. The practical application of the system OZhUR is exemplified in the construction of a data processing system for satellites of the Cosmos series. As a result of automating the operations of exchange and control, the volume of manual preparation of data is significantly reduced, and there is no longer any need for individual logs which fix the status of data processing. The system OZhUR is included in the automated data processing system Nauka, which is realized in the PL-1 language on a binary one-address, one-state (BOS OS) electronic computer.
Managing mapping data using commercial data base management software.
Elassal, A.A.
1985-01-01
Electronic computers are involved in almost every aspect of the map making process. This involvement has become so thorough that it is practically impossible to find a recently developed process or device in the mapping field which does not employ digital processing in some form or another. This trend, which has been evolving over two decades, is accelerated by the significant improvements in capability, reliability, and cost-effectiveness of electronic devices. Computerized mapping processes and devices share a common need for machine readable data. Integrating groups of these components into automated mapping systems requires careful planning for data flow amongst them. Exploring the utility of commercial data base management software to assist in this task is the subject of this paper. -Author
Manufacturing development of pultruded composite panels
NASA Technical Reports Server (NTRS)
Meade, L. E.
1989-01-01
The weight-savings potential of graphite-epoxy composites for secondary and medium-primary aircraft structures was demonstrated. One of the greatest challenges facing the aircraft industry is to reduce the acquisition costs for composite structures to a level below that of metal structures. The pultrusion process, wherein reinforcing fibers, after being passed through a resin bath, are drawn through a die to form and cure the desired cross-section, is an automated low-cost manufacturing process for composite structures. The Lockheed Aeronautical Systems Company (LASC) Composites Development Center designed, characterized materials for, fabricated, and tested a stiffened cover concept compatible with the continuous pultrusion process. The procedures used and the results obtained are presented.
Jensen, Erik C.; Stockton, Amanda M.; Chiesl, Thomas N.; Kim, Jungkyu; Bera, Abhisek; Mathies, Richard A.
2013-01-01
A digitally programmable microfluidic Automaton consisting of a 2-dimensional array of pneumatically actuated microvalves is programmed to perform new multiscale mixing and sample processing operations. Large (µL-scale) volume processing operations are enabled by precise metering of multiple reagents within individual nL-scale valves followed by serial repetitive transfer to programmed locations in the array. A novel process exploiting new combining valve concepts is developed for continuous rapid and complete mixing of reagents in less than 800 ms. Mixing, transfer, storage, and rinsing operations are implemented combinatorially to achieve complex assay automation protocols. The practical utility of this technology is demonstrated by performing automated serial dilution for quantitative analysis as well as the first demonstration of on-chip fluorescent derivatization of biomarker targets (carboxylic acids) for microchip capillary electrophoresis on the Mars Organic Analyzer. A language is developed to describe how unit operations are combined to form a microfluidic program. Finally, this technology is used to develop a novel microfluidic 6-sample processor for combinatorial mixing of large sets (>26 unique combinations) of reagents. The digitally programmable microfluidic Automaton is a versatile programmable sample processor for a wide range of process volumes, for multiple samples, and for different types of analyses. PMID:23172232
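A unit-operation "language" of the kind this abstract describes might look like an ordered list of operations consumed by a small interpreter. The op names (`meter`, `transfer`, `mix`), well labels, and volumes below are invented for illustration and are not taken from the paper:

```python
def run_program(program):
    """Tiny interpreter for a hypothetical microfluidic unit-operation
    language: tracks reagent volumes (nL) per well as ops execute."""
    wells = {}
    for op, *args in program:
        if op == "meter":                 # meter a reagent volume into a well
            reagent, well, vol = args
            wells.setdefault(well, {})
            wells[well][reagent] = wells[well].get(reagent, 0) + vol
        elif op == "transfer":            # move entire well contents
            src, dst = args
            wells.setdefault(dst, {})
            for r, v in wells.pop(src, {}).items():
                wells[dst][r] = wells[dst].get(r, 0) + v
        elif op == "mix":                 # mixing changes no composition totals
            pass
        else:
            raise ValueError(f"unknown op: {op}")
    return wells

# One 1:1 dilution step of a serial dilution, expressed as a program
program = [
    ("meter", "sample", "A", 50),
    ("meter", "buffer", "A", 50),
    ("mix", "A"),
    ("transfer", "A", "B"),
]
```

Combinatorial protocols like the 6-sample processor would then just be longer programs generated from a product over reagent sets.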
A service oriented approach for guidelines-based clinical decision support using BPMN.
Rodriguez-Loya, Salvador; Aziz, Ayesha; Chatwin, Chris
2014-01-01
Evidence-based medical practice requires that clinical guidelines be documented in such a way that they represent a clinical workflow in its most accessible form. In order to optimize clinical processes and improve clinical outcomes, we propose a Service Oriented Architecture (SOA) based approach for implementing clinical guidelines that can be accessed from an Electronic Health Record (EHR) application, with a Web Services enabled communication mechanism to the Enterprise Service Bus. We have used Business Process Modelling Notation (BPMN) for modelling and presenting the clinical pathway in the form of a workflow. The aim of this study is to produce spontaneous alerts in the healthcare workflow for the diagnosis of Chronic Obstructive Pulmonary Disease (COPD). The use of BPMN as a tool to automate clinical guidelines has not previously been employed for providing Clinical Decision Support (CDS).
Algorithm of the automated choice of points of the acupuncture for EHF-therapy
NASA Astrophysics Data System (ADS)
Lyapina, E. P.; Chesnokov, I. A.; Anisimov, Ya. E.; Bushuev, N. A.; Murashov, E. P.; Eliseev, Yu. Yu.; Syuzanna, H.
2007-05-01
An algorithm for the automated selection of acupuncture points for EHF-therapy is presented. The prescription formed by the algorithm for acupunctural actions is advisory in character. Clinical investigations showed that applying the developed algorithm in EHF-therapy normalizes the energetic state of the meridians and effectively addresses many problems of organism functioning.
1980-07-25
matrix (DTM) and digital planimetric data, combined and integrated into so-called "data bases." I’ll say more about this later. AUTOMATION OF...projection with mechanical inversors to maintain the Scheimpflug condition. Some automation has been achieved, with computer control to determine rectifier... matrix (DTM) form that is not necessarily collected from the same photography as that from which the orthophoto is being produced. Because they are
Automated solar cell assembly team process research
NASA Astrophysics Data System (ADS)
Nowlan, M. J.; Hogan, S. J.; Darkazalli, G.; Breen, W. F.; Murach, J. M.; Sutherland, S. F.; Patterson, J. S.
1994-06-01
This report describes work done under the Photovoltaic Manufacturing Technology (PVMaT) project, Phase 3A, which addresses problems that are generic to the photovoltaic (PV) industry. Spire's objective during Phase 3A was to use its light soldering technology and experience to design and fabricate solar cell tabbing and interconnecting equipment to develop new, high-yield, high-throughput, fully automated processes for tabbing and interconnecting thin cells. Areas that were addressed include processing rates, process control, yield, throughput, material utilization efficiency, and increased use of automation. Spire teamed with Solec International, a PV module manufacturer, and the University of Massachusetts at Lowell's Center for Productivity Enhancement (CPE), automation specialists, who are lower-tier subcontractors. A number of other PV manufacturers, including Siemens Solar, Mobil Solar, Solar Web, and Texas Instruments, agreed to evaluate the processes developed under this program.
Industrial applications of automated X-ray inspection
NASA Astrophysics Data System (ADS)
Shashishekhar, N.
2015-03-01
Many industries require that 100% of manufactured parts be X-ray inspected. Factors such as high production rates, focus on inspection quality, operator fatigue and inspection cost reduction translate to an increasing need for automating the inspection process. Automated X-ray inspection involves the use of image processing algorithms and computer software for analysis and interpretation of X-ray images. This paper presents industrial applications and illustrative case studies of automated X-ray inspection in areas such as automotive castings, fuel plates, air-bag inflators and tires. It is usually necessary to employ application-specific automated inspection strategies and techniques, since each application has unique characteristics and interpretation requirements.
Dropwise additive manufacturing of pharmaceutical products for melt-based dosage forms.
Içten, Elçin; Giridhar, Arun; Taylor, Lynne S; Nagy, Zoltan K; Reklaitis, Gintaras V
2015-05-01
The US Food and Drug Administration introduced the quality by design approach and process analytical technology guidance to encourage innovation and efficiency in pharmaceutical development, manufacturing, and quality assurance. As part of this renewed emphasis on the improvement of manufacturing, the pharmaceutical industry has begun to develop more efficient production processes with more intensive use of online measurement and sensing, real-time quality control, and process control tools. Here, we present dropwise additive manufacturing of pharmaceutical products (DAMPP) as an alternative to conventional pharmaceutical manufacturing methods. This mini-manufacturing process for the production of pharmaceuticals utilizes drop on demand printing technology for automated and controlled deposition of melt-based formulations onto edible substrates. The advantages of drop-on-demand technology, including reproducible production of small droplets, adjustable drop sizing, high placement accuracy, and flexible use of different formulations, enable production of individualized dosing even for low-dose and high-potency drugs. In this work, DAMPP is used to produce solid oral dosage forms from hot melts of an active pharmaceutical ingredient and a polymer. The dosage forms are analyzed to show the reproducibility of dosing and the dissolution behavior of different formulations. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
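To make the dosing arithmetic concrete, here is a minimal sketch of how dose per dosage form and dosing reproducibility (percent RSD) might be computed from printed drop masses. The function names and the mass-fraction formulation are assumptions for illustration, not code from the DAMPP work.

```python
import statistics

def dose_per_form(drop_masses_mg, api_mass_fraction):
    """Total API dose (mg) on one substrate: summed drop mass times the
    API mass fraction of the melt formulation."""
    return sum(drop_masses_mg) * api_mass_fraction

def percent_rsd(doses_mg):
    """Percent relative standard deviation, a common dosing-reproducibility metric."""
    return 100.0 * statistics.stdev(doses_mg) / statistics.mean(doses_mg)

# Ten 5 mg drops of a 20% API melt give a 10 mg dose per form.
print(dose_per_form([5.0] * 10, 0.2))
print(percent_rsd([9.9, 10.0, 10.1]))
```

Individualized dosing then reduces to changing the drop count or drop size per substrate, which is exactly the flexibility drop-on-demand printing provides.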
Data processing and analysis with the autoPROC toolbox.
Vonrhein, Clemens; Flensburg, Claus; Keller, Peter; Sharff, Andrew; Smart, Oliver; Paciorek, Wlodek; Womack, Thomas; Bricogne, Gérard
2011-04-01
A typical diffraction experiment will generate many images and data sets from different crystals in a very short time. This creates a challenge for the high-throughput operation of modern synchrotron beamlines as well as for the subsequent data processing. Novice users in particular may feel overwhelmed by the tables, plots and numbers that the different data-processing programs and software packages present to them. Here, some of the more common problems that a user has to deal with when processing a set of images that will finally make up a processed data set are shown, concentrating on difficulties that may often show up during the first steps along the path of turning the experiment (i.e. data collection) into a model (i.e. interpreted electron density). Difficulties such as unexpected crystal forms, issues in crystal handling and suboptimal choices of data-collection strategies can often be dealt with, or at least diagnosed, by analysing specific data characteristics during processing. In the end, one wants to distinguish problems over which one has no immediate control once the experiment is finished from problems that can be remedied a posteriori. A new software package, autoPROC, is also presented that combines third-party processing programs with new tools and an automated workflow script that is intended to provide users with both guidance and insight into the offline processing of data affected by the difficulties mentioned above, with particular emphasis on the automated treatment of multi-sweep data sets collected on multi-axis goniostats.
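autoPROC itself is not shown here, but one of the standard data-quality statistics such processing pipelines report can be sketched directly. R_merge sums the absolute deviations of symmetry-equivalent intensity observations from their group mean, normalized by the total intensity; the function below is an illustrative, simplified version.

```python
def r_merge(reflection_groups):
    """R_merge = sum_hkl sum_i |I_i - <I>| / sum_hkl sum_i I_i, computed
    over groups of symmetry-equivalent intensity observations."""
    numerator = denominator = 0.0
    for intensities in reflection_groups:
        mean_i = sum(intensities) / len(intensities)
        numerator += sum(abs(i - mean_i) for i in intensities)
        denominator += sum(intensities)
    return numerator / denominator

# Identical observations merge perfectly (R_merge = 0); scatter raises it.
print(r_merge([[100.0, 100.0], [90.0, 110.0]]))
```

Anomalies in statistics like this, examined per resolution shell or per image range, are one way the crystal-handling and strategy problems mentioned in the abstract are diagnosed during processing.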
Dangerous intersections? A review of studies of fatigue and distraction in the automated vehicle.
Matthews, Gerald; Neubauer, Catherine; Saxby, Dyani J; Wohleber, Ryan W; Lin, Jinchao
2018-04-10
The impacts of fatigue on the vehicle driver may change with technological advancements including automation and the increasing prevalence of potentially distracting in-car systems. This article reviews the authors' simulation studies of how fatigue, automation, and distraction may intersect as threats to safety. Distinguishing between states of active and passive fatigue supports understanding of fatigue and the development of countermeasures. Active fatigue is a stress-like state driven by overload of cognitive capabilities. Passive fatigue is produced by underload and monotony, and is associated with loss of task engagement and alertness. Our studies show that automated driving reliably elicits subjective symptoms of passive fatigue and also loss of alertness that persists following manual takeover. Passive fatigue also impairs attention and automation use in operators of Remotely Piloted Vehicles (RPVs). Use of in-vehicle media has been proposed as a countermeasure to fatigue, but such media may also be distracting. Studies tested whether various forms of phone-based media interacted with automation-induced fatigue, but effects were complex and dependent on task configuration. Selection of fatigue countermeasures should be guided by an understanding of the form of fatigue confronting the operator. System design, regulation of level of automation, managing distraction, and selection of fatigue-resilient personnel are all possible interventions for passive fatigue, but careful evaluation of interventions is necessary prior to deployment. Copyright © 2018. Published by Elsevier Ltd.
Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David
2018-04-01
Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.
FRAME (Force Review Automation Environment): MATLAB-based AFM data processor.
Partola, Kostyantyn R; Lykotrafitis, George
2016-05-03
Data processing of force-displacement curves generated by atomic force microscopes (AFMs) for elastic moduli and unbinding event measurements is very time consuming and susceptible to user error or bias. There is an evident need for consistent, dependable, and easy-to-use AFM data processing software. We have developed an open-source software application, the force review automation environment (or FRAME), that provides users with an intuitive graphical user interface, automating data processing, and tools for expediting manual processing. We did not observe a significant difference between manually processed and automatically processed results from the same data sets. Copyright © 2016 Elsevier Ltd. All rights reserved.
Accelerated design of bioconversion processes using automated microscale processing techniques.
Lye, Gary J; Ayazi-Shamlou, Parviz; Baganz, Frank; Dalby, Paul A; Woodley, John M
2003-01-01
Microscale processing techniques are rapidly emerging as a means to increase the speed of bioprocess design and reduce material requirements. Automation of these techniques can reduce labour intensity and enable a wider range of process variables to be examined. This article examines recent research on various individual microscale unit operations including microbial fermentation, bioconversion and product recovery techniques. It also explores the potential of automated whole process sequences operated in microwell formats. The power of the whole process approach is illustrated by reference to a particular bioconversion, namely the Baeyer-Villiger oxidation of bicyclo[3.2.0]hept-2-en-6-one for the production of optically pure lactones.
Automating Acquisitions: The Planning Process.
ERIC Educational Resources Information Center
Bryant, Bonita
1984-01-01
Account of process followed at large academic library in preparing for automation of acquisition and fund accounting functions highlights planning criteria, local goals, planning process elements (selecting participants, assigning tasks, devising timetable, providing foundations, evaluating systems, determining costs, formulating recommendations).…
EOS Terra: EOS DAM Automation Constellation MOWG
NASA Technical Reports Server (NTRS)
Mantziaras, Dimitrios C.
2017-01-01
Brief summary of the decision factors considered and the process improvement steps taken to evolve the ESMO debris avoidance maneuver process into a more automated one. The presentation responds to an action item/question received at a prior MOWG meeting.
Driver compliance to take-over requests with different auditory outputs in conditional automation.
Forster, Yannick; Naujoks, Frederik; Neukum, Alexandra; Huestegge, Lynn
2017-12-01
Conditionally automated driving (CAD) systems are expected to improve traffic safety. Whenever the CAD system exceeds its limit of operation, designers of the system need to ensure a safe and timely enough transition from automated to manual mode. An existing visual Human-Machine Interface (HMI) was supplemented by different auditory outputs. The present work compares the effects of different auditory outputs in form of (1) a generic warning tone and (2) additional semantic speech output on driver behavior for the announcement of an upcoming take-over request (TOR). We expect the information carried by means of speech output to lead to faster reactions and better subjective evaluations by the drivers compared to generic auditory output. To test this assumption, N=17 drivers completed two simulator drives, once with a generic warning tone ('Generic') and once with additional speech output ('Speech+generic'), while they were working on a non-driving related task (NDRT; i.e., reading a magazine). Each drive incorporated one transition from automated to manual mode when yellow secondary lanes emerged. Different reaction time measures, relevant for the take-over process, were assessed. Furthermore, drivers evaluated the complete HMI regarding usefulness, ease of use and perceived visual workload just after experiencing the take-over. They gave comparative ratings on usability and acceptance at the end of the experiment. Results revealed that reaction times, reflecting information processing time (i.e., hands on the steering wheel, termination of NDRT), were shorter for 'Speech+generic' compared to 'Generic' while reaction time, reflecting allocation of attention (i.e., first glance ahead), did not show this difference. Subjective ratings were in favor of the system with additional speech output. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Savant, Vaibhav; Smith, Niall
2016-07-01
We report on the current status in the development of a pilot automated data acquisition and reduction pipeline based around the operation of two nodes of remotely operated robotic telescopes based in California, USA and Cork, Ireland. The observatories are primarily used as a testbed for automation and instrumentation and as a tool to facilitate STEM (Science Technology Engineering Mathematics) promotion. The Ireland node is situated at Blackrock Castle Observatory (operated by Cork Institute of Technology) and consists of two optical telescopes - 6" and 16" OTAs housed in two separate domes while the node in California is its 6" replica. Together they form a pilot Telescope ARrAy known as TARA. QuickPhot is an automated data reduction pipeline designed primarily to throw more light on the microvariability of blazars employing precision optical photometry and using data from the TARA telescopes as they constantly monitor predefined targets whenever observing conditions are favourable. After carrying out aperture photometry, if any variability above a given threshold is observed, the reporting telescope will communicate the source concerned and the other nodes will follow up with multi-band observations, taking advantage that they are located in strategically separated time-zones. Ultimately we wish to investigate the applicability of Shock-in-Jet and Geometric models. These try to explain the processes at work in AGNs which result in the formation of jets, by looking for temporal and spectral variability in TARA multi-band observations. We are also experimenting with using a Two-channel Optical PHotometric Imaging CAMera (TOΦCAM) that we have developed and which has been optimised for simultaneous two-band photometry on our 16" OTA.
Executive system software design and expert system implementation
NASA Technical Reports Server (NTRS)
Allen, Cheryl L.
1992-01-01
The topics are presented in viewgraph form and include: software requirements; design layout of the automated assembly system; menu display for automated composite command; expert system features; complete robot arm state diagram and logic; and expert system benefits.
Standard Terminal Automation Replacement System Human Factors Review Supporting Documents Volume 2
DOT National Transportation Integrated Search
1997-12-18
The Federal Aviation Administration formed a Standard Terminal Automation Replacement System (STARS) Working Group to identify and resolve human factors concerns with the Early Display Capability (EDC) system before it is introduced in the field....
A system-level approach to automation research
NASA Technical Reports Server (NTRS)
Harrison, F. W.; Orlando, N. E.
1984-01-01
Automation is the application of self-regulating mechanical and electronic devices to processes that can be accomplished with the human organs of perception, decision, and actuation. The successful application of automation to a system process should reduce man/system interaction and the perceived complexity of the system, or should increase affordability, productivity, quality control, and safety. The expense, time constraints, and risk factors associated with extravehicular activities have led the Automation Technology Branch (ATB), as part of the NASA Automation Research and Technology Program, to investigate the use of robots and teleoperators as automation aids in the context of space operations. The ATB program addresses three major areas: (1) basic research in autonomous operations, (2) human factors research on man-machine interfaces with remote systems, and (3) the integration and analysis of automated systems. This paper reviews the current ATB research in the area of robotics and teleoperators.
First Annual Workshop on Space Operations Automation and Robotics (SOAR 87)
NASA Technical Reports Server (NTRS)
Griffin, Sandy (Editor)
1987-01-01
Several topics relative to automation and robotics technology are discussed. Automation of checkout, ground support, and logistics; automated software development; man-machine interfaces; neural networks; systems engineering and distributed/parallel processing architectures; and artificial intelligence/expert systems are among the topics covered.
Automation and decision support in interactive consumer products.
Sauer, J; Rüttinger, B
2007-06-01
This article presents two empirical studies (n = 30, n = 48) that are concerned with different forms of automation in interactive consumer products. The goal of the studies was to evaluate the effectiveness of two types of automation: perceptual augmentation (i.e. supporting users' information acquisition and analysis); and control integration (i.e. supporting users' action selection and implementation). Furthermore, the effectiveness of on-product information (i.e. labels attached to product) in supporting automation design was evaluated. The findings suggested greater benefits for automation in control integration than in perceptual augmentation alone, which may be partly due to the specific requirements of consumer product usage. If employed appropriately, on-product information can be a helpful means of information conveyance. The article discusses the implications of automation design in interactive consumer products while drawing on automation models from the work environment.
Application of automation and information systems to forensic genetic specimen processing.
Leclair, Benoît; Scholl, Tom
2005-03-01
During the last 10 years, the introduction of PCR-based DNA typing technologies in forensic applications has been highly successful. This technology has become pervasive throughout forensic laboratories and it continues to grow in prevalence. For many criminal cases, it provides the most probative evidence. Criminal genotype data banking and victim identification initiatives that follow mass-fatality incidents have benefited the most from the introduction of automation for sample processing and data analysis. Attributes of offender specimens including large numbers, high quality and identical collection and processing are ideal for the application of laboratory automation. The magnitude of kinship analysis required by mass-fatality incidents necessitates the application of computing solutions to automate the task. More recently, the development activities of many forensic laboratories are focused on leveraging experience from these two applications to casework sample processing. The trend toward increased prevalence of forensic genetic analysis will continue to drive additional innovations in high-throughput laboratory automation and information systems.
Automation bias: decision making and performance in high-tech cockpits.
Mosier, K L; Skitka, L J; Heers, S; Burdick, M
1997-01-01
Automated aids and decision support tools are rapidly becoming indispensable in high-technology cockpits and are assuming increasing control of "cognitive" flight tasks, such as calculating fuel-efficient routes, navigating, or detecting and diagnosing system malfunctions and abnormalities. This study was designed to investigate automation bias, a recently documented factor in the use of automated aids and decision support systems. The term refers to omission and commission errors resulting from the use of automated cues as a heuristic replacement for vigilant information seeking and processing. Glass-cockpit pilots flew flight scenarios involving automation events or opportunities for automation-related omission and commission errors. Although experimentally manipulated accountability demands did not significantly impact performance, post hoc analyses revealed that pilots who reported an internalized perception of "accountability" for their performance and strategies of interaction with the automation were significantly more likely to double-check automated functioning against other cues and less likely to commit errors than those who did not share this perception. Pilots were also likely to erroneously "remember" the presence of expected cues when describing their decision-making processes.
Programmable logic controller optical fibre sensor interface module
NASA Astrophysics Data System (ADS)
Allwood, Gary; Wild, Graham; Hinckley, Steven
2011-12-01
Most automated industrial processes use Distributed Control Systems (DCSs) or Programmable Logic Controllers (PLCs) for automated control. PLCs tend to be more common as they have much of the functionality of DCSs, although they are generally cheaper to install and maintain. PLCs in conjunction with a human machine interface form the basis of Supervisory Control And Data Acquisition (SCADA) systems, combined with communication infrastructure and Remote Terminal Units (RTUs). RTUs convert different sensor measurands into digital data that is sent back to the PLC or supervisory system. Optical fibre sensors are becoming more common in industrial processes because of their many advantageous properties. Being small, lightweight, highly sensitive, and immune to electromagnetic interference, they are an ideal solution for a variety of diverse sensing applications. Here, we have developed a PLC Optical Fibre Sensor Interface Module (OFSIM), in which an optical fibre is connected directly to the OFSIM located next to the PLC. The embedded fibre Bragg grating sensors are highly sensitive and can detect a number of different measurands, such as temperature, pressure and strain, without the need for a power supply.
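A minimal sketch of the fibre Bragg grating (FBG) arithmetic behind such a sensor interface: the Bragg condition lambda_B = 2 * n_eff * Lambda sets the reflected wavelength, and a measured wavelength shift maps to a temperature change through a sensitivity coefficient. The ~10 pm/degC figure near 1550 nm is a typical textbook value assumed for illustration, not one taken from this paper.

```python
def bragg_wavelength_nm(n_eff, grating_period_nm):
    """Bragg condition: lambda_B = 2 * n_eff * Lambda."""
    return 2.0 * n_eff * grating_period_nm

def temperature_change_degC(shift_pm, sensitivity_pm_per_degC=10.0):
    """Convert a Bragg wavelength shift (pm) into a temperature change,
    using an assumed sensitivity (~10 pm/degC is typical near 1550 nm)."""
    return shift_pm / sensitivity_pm_per_degC

# An effective index of 1.45 and a 534.5 nm period reflect near 1550 nm.
print(bragg_wavelength_nm(1.45, 534.5))
# A 25 pm shift then corresponds to a 2.5 degC change.
print(temperature_change_degC(25.0))
```

An interface module like the OFSIM would perform this wavelength-to-measurand conversion in place of a conventional RTU's analogue-to-digital step.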
Farine, Damien R.; Firth, Josh A.; Aplin, Lucy M.; Crates, Ross A.; Culina, Antica; Garroway, Colin J.; Hinde, Camilla A.; Kidd, Lindall R.; Milligan, Nicole D.; Psorakis, Ioannis; Radersma, Reinder; Verhelst, Brecht; Voelkl, Bernhard; Sheldon, Ben C.
2015-01-01
Both social and ecological factors influence population process and structure, with resultant consequences for phenotypic selection on individuals. Understanding the scale and relative contribution of these two factors is thus a central aim in evolutionary ecology. In this study, we develop a framework using null models to identify the social and spatial patterns that contribute to phenotypic structure in a wild population of songbirds. We used automated technologies to track 1053 individuals that formed 73 737 groups from which we inferred a social network. Our framework identified that both social and spatial drivers contributed to assortment in the network. In particular, groups had a more even sex ratio than expected and exhibited a consistent age structure that suggested local association preferences, such as preferential attachment or avoidance. By contrast, recent immigrants were spatially partitioned from locally born individuals, suggesting differential dispersal strategies by phenotype. Our results highlight how different scales of social decision-making, ranging from post-natal dispersal settlement to fission–fusion dynamics, can interact to drive phenotypic structure in animal populations. PMID:26064644
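The null-model idea in this abstract can be sketched as a simple permutation test: compute an assortment statistic on the observed groups, then recompute it many times after shuffling individuals across groups while holding group sizes fixed. The statistic below (fraction of within-group pairs sharing the same sex) is an illustrative stand-in for the paper's actual network measures.

```python
import random

def assortativity_by_sex(groups):
    """Fraction of within-group pairs whose members share the same sex."""
    same = total = 0
    for g in groups:
        for i in range(len(g)):
            for j in range(i + 1, len(g)):
                total += 1
                same += g[i] == g[j]
    return same / total

def null_distribution(groups, n_perm=1000, seed=1):
    """Permutation null: shuffle individuals across groups while keeping
    group sizes fixed, recomputing the statistic for each permutation."""
    rng = random.Random(seed)
    labels = [s for g in groups for s in g]
    sizes = [len(g) for g in groups]
    out = []
    for _ in range(n_perm):
        rng.shuffle(labels)
        it = iter(labels)
        out.append(assortativity_by_sex([[next(it) for _ in range(k)] for k in sizes]))
    return out

# One-sided p-value: how often does the null match or exceed the observation?
obs = assortativity_by_sex([['M', 'M'], ['F', 'F']])
null = null_distribution([['M', 'M'], ['F', 'F']], n_perm=200)
p = sum(x >= obs for x in null) / len(null)
```

Comparing the observed statistic against this null distribution is what lets social preferences (e.g. an unusually even sex ratio) be separated from patterns expected by chance.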
Examining single- and multiple-process theories of trust in automation.
Rice, Stephen
2009-07-01
The author examined the effects of human responses to automation alerts and nonalerts. Previous research has shown that automation false alarms and misses have differential effects on human trust (i.e., automation false alarms tend to affect operator compliance, whereas automation misses tend to affect operator reliance). Participants performed a simulated combat task, whereby they examined aerial photographs for the presence of enemy targets. A diagnostic aid provided a recommendation during each trial. The author manipulated the reliability and response bias of the aid to provide appropriate data for state-trace analyses. The analyses provided strong evidence that only a multiple-process theory of operator trust can explain the effects of automation errors on human dependence behaviors. The author discusses the theoretical and practical implications of this finding.
Robotics for Nuclear Material Handling at LANL: Capabilities and Needs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harden, Troy A; Lloyd, Jane A; Turner, Cameron J
Nuclear material processing operations present numerous challenges for effective automation. Confined spaces, hazardous materials and processes, particulate contamination, radiation sources, and corrosive chemical operations are but a few of the significant hazards. However, automated systems represent a significant safety advance when deployed in place of manual tasks performed by human workers. The replacement of manual operations with automated systems has been desirable for nearly 40 years, yet only recently are automated systems becoming increasingly common for nuclear materials handling applications. This paper reviews several automation systems which are deployed or about to be deployed at Los Alamos National Laboratory for nuclear material handling operations. Highlighted are the current social and technological challenges faced in deploying automated systems into hazardous material handling environments and the opportunities for future innovations.
Automation of Cassini Support Imaging Uplink Command Development
NASA Technical Reports Server (NTRS)
Ly-Hollins, Lisa; Breneman, Herbert H.; Brooks, Robert
2010-01-01
"Support imaging" is imagery requested by other Cassini science teams to aid in the interpretation of their data. The generation of the spacecraft command sequences for these images is performed by the Cassini Instrument Operations Team. The process initially established for doing this was very labor-intensive, tedious and prone to human error. Team management recognized this process as one that could easily benefit from automation. Team members were tasked to document the existing manual process, develop a plan and strategy to automate the process, implement the plan and strategy, test and validate the new automated process, and deliver the new software tools and documentation to Flight Operations for use during the Cassini extended mission. In addition to the goals of higher efficiency and lower risk in the processing of support imaging requests, an effort was made to maximize adaptability of the process to accommodate uplink procedure changes and the potential addition of new capabilities outside the scope of the initial effort.
9 CFR 381.307 - Record review and maintenance.
Code of Federal Regulations, 2014 CFR
2014-01-01
... be identified by production date, container code, processing vessel number or other designation and... review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and... applicable requirements of § 381.306. (c) Container closure records. Written records of all container closure...
Jones, Gillian; Matthews, Roger; Cunningham, Richard; Jenks, Peter
2011-07-01
The sensitivity of automated culture of Staphylococcus aureus from flocked swabs versus that of manual culture of fiber swabs was prospectively compared using nasal swabs from 867 patients. Automated culture from flocked swabs significantly increased the detection rate, by 13.1% for direct culture and 10.2% for enrichment culture.
Perspectives on bioanalytical mass spectrometry and automation in drug discovery.
Janiszewski, John S; Liston, Theodore E; Cole, Mark J
2008-11-01
The use of high speed synthesis technologies has resulted in a steady increase in the number of new chemical entities active in the drug discovery research stream. Large organizations can have thousands of chemical entities in various stages of testing and evaluation across numerous projects on a weekly basis. Qualitative and quantitative measurements made using LC/MS are integrated throughout this process from early stage lead generation through candidate nomination. Nearly all analytical processes and procedures in modern research organizations are automated to some degree. This includes both hardware and software automation. In this review we discuss bioanalytical mass spectrometry and automation as components of the analytical chemistry infrastructure in pharma. Analytical chemists are presented as members of distinct groups with similar skillsets that build automated systems, manage test compounds, assays and reagents, and deliver data to project teams. The ADME-screening process in drug discovery is used as a model to highlight the relationships between analytical tasks in drug discovery. Emerging software and process automation tools are described that can potentially address gaps and link analytical chemistry related tasks. The role of analytical chemists and groups in modern 'industrialized' drug discovery is also discussed.
Exponential error reduction in pretransfusion testing with automation.
South, Susan F; Casina, Tony S; Li, Lily
2012-08-01
Protecting the safety of blood transfusion is the top priority of transfusion service laboratories. Pretransfusion testing is a critical element of the entire transfusion process to enhance vein-to-vein safety. Human error associated with manual pretransfusion testing is a cause of transfusion-related mortality and morbidity and most human errors can be eliminated by automated systems. However, the uptake of automation in transfusion services has been slow and many transfusion service laboratories around the world still use manual blood group and antibody screen (G&S) methods. The goal of this study was to compare error potentials of commonly used manual (e.g., tiles and tubes) versus automated (e.g., ID-GelStation and AutoVue Innova) G&S methods. Routine G&S processes in seven transfusion service laboratories (four with manual and three with automated G&S methods) were analyzed using failure modes and effects analysis to evaluate the corresponding error potentials of each method. Manual methods contained a higher number of process steps ranging from 22 to 39, while automated G&S methods only contained six to eight steps. Corresponding to the number of the process steps that required human interactions, the risk priority number (RPN) of the manual methods ranged from 5304 to 10,976. In contrast, the RPN of the automated methods was between 129 and 436 and also demonstrated a 90% to 98% reduction of the defect opportunities in routine G&S testing. This study provided quantitative evidence on how automation could transform pretransfusion testing processes by dramatically reducing error potentials and thus would improve the safety of blood transfusion. © 2012 American Association of Blood Banks.
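The FMEA arithmetic behind the RPN figures can be sketched in a few lines; the step names and 1-10 severity/occurrence/detection scores below are hypothetical placeholders, not the study's actual ratings.

```python
# Illustrative sketch of a failure modes and effects analysis (FMEA)
# risk-priority-number (RPN) tally, of the kind used to compare manual
# vs. automated group-and-screen (G&S) workflows. All scores are invented.

def rpn(severity, occurrence, detection):
    """RPN for one failure mode: each factor is scored 1-10."""
    return severity * occurrence * detection

# (step, severity, occurrence, detection) -- hypothetical scores
manual_steps = [
    ("label tube by hand", 9, 5, 6),
    ("pipette reagent",    8, 4, 7),
    ("read agglutination", 9, 3, 8),
]
automated_steps = [
    ("load barcoded sample",  9, 2, 2),
    ("review flagged result", 8, 2, 3),
]

manual_total = sum(rpn(s, o, d) for _, s, o, d in manual_steps)
auto_total = sum(rpn(s, o, d) for _, s, o, d in automated_steps)
print(manual_total, auto_total)
```

With fewer human-touched steps, the automated tally drops sharply, mirroring the 90% to 98% RPN reduction the study reports.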
Technological advances for studying human behavior
NASA Technical Reports Server (NTRS)
Roske-Hofstrand, Renate J.
1990-01-01
Technological advances for studying human behavior are noted in viewgraph form. It is asserted that performance-aiding systems are proliferating without a fundamental understanding of how they would interact with the humans who must control them. Two views of automation research, the hardware view and the human-centered view, are listed. Other viewgraphs give information on vital elements for human-centered research, a continuum of the research process, available technologies, new technologies for persistent problems, a sample research infrastructure, the need for metrics, and examples of data-link technology.
NASA Technical Reports Server (NTRS)
Jones, Michael K.
1998-01-01
Various issues associated with interoperability for space mission monitor and control are presented in viewgraph form. Specific topics include: 1) Space Project Mission Operations Control Architecture (SuperMOCA) goals and methods for achieving them; 2) Specifics on the architecture: open standards and layering, enhancing interoperability, and promoting commercialization; 3) An advertisement; 4) Status of the task: government/industry cooperation and architecture and technology demonstrations; and 5) Key features of messaging services and virtual devices.
Beyond the Black Box: An Assessment of Strategic War Gaming
1987-12-01
they are frequently expressed in the form of a model. In contrast, gaming is process-oriented. Replication may not be as important as realism, so games...will base its decisions in part on knowledge of opponent attitudes, intentions, and resources. Thus the flow of...Institute Press, 1987. Bennett, Bruce W. and Davis, Paul K., The Role of Automated War Gaming in Strategic Analysis, Rand Corporation Report P-7053, December
NASA Technical Reports Server (NTRS)
Baumann, P. R. (Principal Investigator)
1979-01-01
Three computer quantitative techniques for determining urban land cover patterns are evaluated. The techniques examined deal with the selection of training samples by an automated process, the overlaying of two scenes from different seasons of the year, and the use of individual pixels as training points. Evaluation is based on the number and type of land cover classes generated and the marks obtained from an accuracy test. New Orleans, Louisiana and its environs form the study area.
Automation for deep space vehicle monitoring
NASA Technical Reports Server (NTRS)
Schwuttke, Ursula M.
1991-01-01
Information on automation for deep space vehicle monitoring is given in viewgraph form. Information is given on automation goals and strategy; the Monitor Analyzer of Real-time Voyager Engineering Link (MARVEL); intelligent input data management; decision theory for making tradeoffs; dynamic tradeoff evaluation; evaluation of anomaly detection results; evaluation of data management methods; system level analysis with cooperating expert systems; the distributed architecture of multiple expert systems; and event driven response.
Updating National Topographic Data Base Using Change Detection Methods
NASA Astrophysics Data System (ADS)
Keinan, E.; Felus, Y. A.; Tal, Y.; Zilberstien, O.; Elihai, Y.
2016-06-01
The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time and the development of specialized procedures. In many National Mapping and Cadaster Agencies (NMCA), the updating cycle takes a few years. Today, the reality is dynamic and changes occur every day; therefore, users expect the existing database to portray the current reality. Global mapping projects that are based on community volunteers, such as OSM, update their database every day based on crowdsourcing. In order to fulfil users' requirements for rapid updating, a new methodology should be developed that maps major areas of interest while preserving associated decoding information. Until recently, automated processes did not yield satisfactory results; a typical process consisted of comparing images from different periods. The success rates in identifying objects were low, and most detections were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, developments in mapping technologies, advances in image processing algorithms and computer vision, together with digital aerial cameras with a NIR band and Very High Resolution satellites, have allowed the implementation of a cost-effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, Multi Spectral (MS) classification, MS segmentation, object analysis and shape-forming algorithms. This article reviews the results of a novel change detection methodology as a first step for updating the NTDB at the Survey of Israel.
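One ingredient of such a pipeline, differencing two Digital Surface Models to flag candidate changes, can be sketched as follows; the arrays and the 2 m threshold are illustrative assumptions, not the Survey of Israel's parameters.

```python
# Toy change-detection step: flag pixels whose height changed by more than
# a threshold between two equally sized DSM grids (values are invented).

def change_mask(dsm_old, dsm_new, threshold=2.0):
    """Per-pixel candidate-change mask from two DSM grids."""
    return [
        [abs(new - old) > threshold for old, new in zip(row_old, row_new)]
        for row_old, row_new in zip(dsm_old, dsm_new)
    ]

dsm_2014 = [[10.0, 10.2], [10.1, 10.0]]
dsm_2016 = [[10.1, 14.5], [10.0, 10.2]]   # a new structure in one cell
print(change_mask(dsm_2014, dsm_2016))
```

In a real pipeline this mask would then be refined with MS classification and object analysis to suppress false alarms.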
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ulsh, M.; Wheeler, D.; Protopappas, P.
The U.S. Department of Energy (DOE) is interested in supporting manufacturing research and development (R&D) for fuel cell systems in the 10-1,000 kilowatt (kW) power range relevant to stationary and distributed combined heat and power applications, with the intent to reduce manufacturing costs and increase production throughput. To assist in future decision-making, DOE requested that the National Renewable Energy Laboratory (NREL) provide a baseline understanding of the current levels of adoption of automation in manufacturing processes and flow, as well as of continuous processes. NREL identified and visited or interviewed key manufacturers, universities, and laboratories relevant to the study using a standard questionnaire. The questionnaire covered the current level of vertical integration, the importance of quality control developments for automation, the current level of automation and source of automation design, critical balance of plant issues, potential for continuous cell manufacturing, key manufacturing steps or processes that would benefit from DOE support for manufacturing R&D, the potential for cell or stack design changes to support automation, and the relationship between production volume and decisions on automation.
Workload Capacity: A Response Time-Based Measure of Automation Dependence.
Yamani, Yusuke; McCarley, Jason S
2016-05-01
An experiment used the workload capacity measure C(t) to quantify the processing efficiency of human-automation teams and identify operators' automation usage strategies in a speeded decision task. Although response accuracy rates and related measures are often used to measure the influence of an automated decision aid on human performance, aids can also influence response speed. Mean response times (RTs), however, conflate the influence of the human operator and the automated aid on team performance and may mask changes in the operator's performance strategy under aided conditions. The present study used a measure of parallel processing efficiency, or workload capacity, derived from empirical RT distributions as a novel gauge of human-automation performance and automation dependence in a speeded task. Participants performed a speeded probabilistic decision task with and without the assistance of an automated aid. RT distributions were used to calculate two variants of a workload capacity measure, COR(t) and CAND(t). Capacity measures gave evidence that a diagnosis from the automated aid speeded human participants' responses, and that participants did not moderate their own decision times in anticipation of diagnoses from the aid. Workload capacity provides a sensitive and informative measure of human-automation performance and operators' automation dependence in speeded tasks. © 2016, Human Factors and Ergonomics Society.
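The capacity coefficient described above can be sketched in a few lines: C_OR(t) = H_AB(t) / (H_A(t) + H_B(t)), where H(t) = -ln S(t) is the cumulative hazard estimated from an empirical survivor function. The RT samples below are made up for illustration; real analyses use many trials and proper estimators (e.g., Nelson-Aalen), so this is not the study's estimator.

```python
# Minimal sketch of the OR workload-capacity coefficient from empirical RTs.
import math

def cumulative_hazard(rts, t):
    """H(t) = -ln S(t), with S(t) = proportion of RTs exceeding t."""
    survivors = sum(1 for rt in rts if rt > t) / len(rts)
    return -math.log(survivors) if survivors > 0 else float("inf")

def c_or(rts_team, rts_human, rts_aid, t):
    denom = cumulative_hazard(rts_human, t) + cumulative_hazard(rts_aid, t)
    return cumulative_hazard(rts_team, t) / denom

# hypothetical response times (seconds)
team  = [0.40, 0.50, 0.60, 0.70, 0.80]   # human aided by automation
human = [0.60, 0.70, 0.80, 0.90, 1.00]   # human alone
aid   = [0.50, 0.60, 0.70, 0.80, 0.90]   # automated aid alone
print(round(c_or(team, human, aid, 0.65), 2))  # C_OR(t) > 1: efficient teaming
```

Values of C_OR(t) above 1 indicate the team processes faster than an unlimited-capacity parallel baseline, which is how aided speedups are read off the measure.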
Applying machine learning classification techniques to automate sky object cataloguing
NASA Astrophysics Data System (ADS)
Fayyad, Usama M.; Doyle, Richard J.; Weir, W. Nick; Djorgovski, Stanislav
1993-08-01
We describe the application of Artificial Intelligence machine learning techniques to the development of an automated tool for the reduction of a large scientific data set. The 2nd Mt. Palomar Northern Sky Survey is nearly completed. This survey provides comprehensive coverage of the northern celestial hemisphere in the form of photographic plates. The plates are being transformed into digitized images whose quality will probably not be surpassed in the next ten to twenty years. The images are expected to contain on the order of 10^7 galaxies and 10^8 stars. Astronomers wish to determine which of these sky objects belong to various classes of galaxies and stars. Unfortunately, the size of this data set precludes analysis in an exclusively manual fashion. Our approach is to develop a software system which integrates the functions of independently developed techniques for image processing and data classification. Digitized sky images are passed through image processing routines to identify sky objects and to extract a set of features for each object. These routines are used to help select a useful set of attributes for classifying sky objects. Then GID3 (Generalized ID3) and O-B Tree, two inductive learning techniques, learn classification decision trees from examples. These classifiers will then be applied to new data. This development process is highly interactive, with astronomer input playing a vital role. Astronomers refine the feature set used to construct sky object descriptions, and evaluate the performance of the automated classification technique on new data. This paper gives an overview of the machine learning techniques with an emphasis on their general applicability, describes the details of our specific application, and reports the initial encouraging results. The results indicate that our machine learning approach is well-suited to the problem. The primary benefit of the approach is increased data reduction throughput.
Another benefit is consistency of classification. The classification rules which are the product of the inductive learning techniques will form an objective, examinable basis for classifying sky objects. A final, not to be underestimated benefit is that astronomers will be freed from the tedium of an intensely visual task to pursue more challenging analysis and interpretation problems based on automatically catalogued data.
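The inductive-learning step described above can be illustrated with a toy information-gain split of the kind ID3-family learners perform; the features, values, and labels below are invented for illustration and are not survey data or the GID3/O-B Tree implementations.

```python
# Toy sketch of choosing a decision-tree split by information gain,
# in the spirit of ID3-style inductive learning from examples.
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_split(rows, labels):
    """Return (feature_index, threshold) with the highest information gain."""
    base, best = entropy(labels), (None, None, -1.0)
    for f in range(len(rows[0])):
        for thr in sorted({r[f] for r in rows}):
            left = [l for r, l in zip(rows, labels) if r[f] <= thr]
            right = [l for r, l in zip(rows, labels) if r[f] > thr]
            if not left or not right:
                continue
            gain = base - (len(left) / len(rows)) * entropy(left) \
                        - (len(right) / len(rows)) * entropy(right)
            if gain > best[2]:
                best = (f, thr, gain)
    return best[0], best[1]

# hypothetical (area, ellipticity) features extracted by image processing
rows   = [(3.0, 0.1), (2.5, 0.2), (9.0, 0.6), (8.0, 0.5), (2.8, 0.15)]
labels = ["star", "star", "galaxy", "galaxy", "star"]
print(best_split(rows, labels))
```

A full learner applies this split selection recursively; the resulting tree of thresholds is the examinable classification rule the passage mentions.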
ARES - A New Airborne Reflective Emissive Spectrometer
2005-10-01
Information and Management System (DIMS), an automated processing environment with robot archive interface as established for the handling of satellite data...consisting of geocoded ground reflectance data. All described processing steps will be integrated in the automated processing environment DIMS to assure a
9 CFR 318.307 - Record review and maintenance.
Code of Federal Regulations, 2013 CFR
2013-01-01
... temperature/time recording devices shall be identified by production date, container code, processing vessel... made available to Program employees for review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and recordkeeping systems shall be designed and operated in a manner that will...
Siotto, Mariacristina; Pasqualetti, Patrizio; Marano, Massimo; Squitti, Rosanna
2014-10-01
Ceruloplasmin (Cp) is a serum ferroxidase that plays an essential role in iron metabolism. It is routinely tested by immunoturbidimetric assays that quantify the concentration of the protein in both its active and inactive forms. Cp activity is generally analyzed manually; the process is time-consuming, has limited repeatability, and is not suitable for a clinical setting. To overcome these inconveniences, we automated the o-dianisidine Cp activity assay on a Cobas Mira Plus apparatus. The automation was rapid and repeatable, and the data were provided in IU/L. The assay was adapted for human sera and showed good precision [coefficient of variation (CV) 3.7%] and a low limit of detection (LoD 11.58 IU/L). The simultaneous analysis of Cp concentration and activity in the same run allowed us to calculate the Cp-specific activity, which provides a better index of overall Cp status. To test the usefulness of this automation, we applied the assay to 104 healthy volunteers and 36 patients with Wilson's disease, hepatic encephalopathy, and chronic liver disease. Cp activity and specific activity distinguished patients between groups better than Cp concentration alone, providing support for the clinical investigation of neurological diseases in which liver failure is one of the clinical hallmarks.
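The specific-activity index is simply enzymatic activity normalized by protein concentration; a small sketch follows, with illustrative units and sample values rather than clinical reference ranges.

```python
# Sketch of the Cp-specific-activity calculation: ferroxidase activity
# (IU/L) per mg of ceruloplasmin protein. Input values are illustrative.

def cp_specific_activity(activity_iu_per_l, concentration_mg_dl):
    """IU of Cp activity per mg of Cp protein."""
    concentration_mg_l = concentration_mg_dl * 10  # mg/dL -> mg/L
    return activity_iu_per_l / concentration_mg_l

print(cp_specific_activity(100.0, 25.0))  # 100 IU/L at 25 mg/dL
```

Running activity and concentration in the same analyzer run, as the study does, is what makes this ratio cheap to compute per sample.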
Development of simplified process for environmentally resistant cells
NASA Technical Reports Server (NTRS)
King, W. J.
1980-01-01
This report describes a program to develop a simple, foolproof, all-vacuum solar cell manufacturing process which can be completely automated and which results in medium-efficiency cells which are inherently environmentally resistant. All components of the completed cells are integrated into a monolithic structure with no material interfaces. The exposed materials (Si, Al2O3, Al, Ni) are all resistant to atmospheric attack and the junction, per se, is passivated to prevent long term degradation. Such cells are intended to be incorporated into a simple module consisting basically of a press-formed metallic superstructure with a separated glass cover for missile, etc., protection.
You're a What? Automation Technician
ERIC Educational Resources Information Center
Mullins, John
2010-01-01
Many people think of automation as laborsaving technology, but it sure keeps Jim Duffell busy. Defined simply, automation is a technique for making a device run or a process occur with minimal direct human intervention. But the functions and technologies involved in automated manufacturing are complex. Nearly all functions, from orders coming in…
Opportunities for Automation of Student Aid Processing in Postsecondary Institutions.
ERIC Educational Resources Information Center
St. John, Edward P.
1986-01-01
An overview of the options and opportunities postsecondary institutions should consider when developing plans for student aid automation is provided. The role of automation in the financial aid office, interfaces with institutional and external systems, alternative approaches to automation, and the need for an institutional strategy for automation…
Stage Evolution of Office Automation Technological Change and Organizational Learning.
ERIC Educational Resources Information Center
Sumner, Mary
1985-01-01
A study was conducted to identify stage characteristics in terms of technology, applications, the role and responsibilities of the office automation organization, and planning and control strategies; and to describe the respective roles of data processing professionals, office automation analysts, and users in office automation systems development…
Records Management Handbook; Source Data Automation Equipment Guide.
ERIC Educational Resources Information Center
National Archives and Records Service (GSA), Washington, DC. Office of Records Management.
A detailed guide to selecting appropriate source data automation equipment is presented. Source data automation equipment is used to prepare data for electronic data processing or computerized recordkeeping. The guide contains specifications, performance data cost, and pictures of the major types of machines used in source data automation.…
The Influence of Cultural Factors on Trust in Automation
ERIC Educational Resources Information Center
Chien, Shih-Yi James
2016-01-01
Human interaction with automation is a complex process that requires both skilled operators and complex system designs to effectively enhance overall performance. Although automation has successfully managed complex systems throughout the world for over half a century, inappropriate reliance on automation can still occur, such as the recent…
Rasmussen, Luke V; Peissig, Peggy L; McCarty, Catherine A; Starren, Justin
2012-06-01
Although the penetration of electronic health records is increasing rapidly, much of the historical medical record is only available in handwritten notes and forms, which require labor-intensive, human chart abstraction for some clinical research. The few previous studies on automated extraction of data from these handwritten notes have focused on monolithic, custom-developed recognition systems or third-party systems that require proprietary forms. We present an optical character recognition processing pipeline, which leverages the capabilities of existing third-party optical character recognition engines, and provides the flexibility offered by a modular custom-developed system. The system was configured and run on a selected set of form fields extracted from a corpus of handwritten ophthalmology forms. The processing pipeline allowed multiple configurations to be run, with the optimal configuration consisting of the Nuance and LEADTOOLS engines running in parallel with a positive predictive value of 94.6% and a sensitivity of 13.5%. While limitations exist, preliminary experience from this project yielded insights on the generalizability and applicability of integrating multiple, inexpensive general-purpose third-party optical character recognition engines in a modular pipeline.
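The agreement-based use of parallel engines, and the PPV/sensitivity scoring reported above, can be sketched generically; the "engines" below are stand-in functions, not the Nuance or LEADTOOLS APIs, and the form fields are invented examples.

```python
# Sketch of a modular OCR pipeline: two hypothetical engines run on each
# form field, a value is emitted only when they agree, and the pipeline is
# scored by positive predictive value (PPV) and sensitivity.

def engine_a(field_image):
    return field_image.get("a")   # placeholder recognizer output

def engine_b(field_image):
    return field_image.get("b")

def recognize(field_image):
    """Emit a value only when the two engines agree; otherwise abstain."""
    a, b = engine_a(field_image), engine_b(field_image)
    return a if a is not None and a == b else None

def score(fields, truth):
    emitted = [(recognize(f), t) for f, t in zip(fields, truth)]
    tp = sum(1 for v, t in emitted if v is not None and v == t)
    fp = sum(1 for v, t in emitted if v is not None and v != t)
    return tp / (tp + fp), tp / len(truth)   # (PPV, sensitivity)

fields = [{"a": "OD", "b": "OD"}, {"a": "OS", "b": "0S"}, {"a": "20/40", "b": "20/40"}]
truth  = ["OD", "OS", "20/20"]
print(score(fields, truth))
```

Requiring agreement trades sensitivity for PPV, which matches the study's profile of high PPV (94.6%) with low sensitivity (13.5%).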
Integration of disabled people in an automated work process
NASA Astrophysics Data System (ADS)
Jalba, C. K.; Muminovic, A.; Epple, S.; Barz, C.; Nasui, V.
2017-05-01
Automation processes are entering more and more areas of life and production, and people with disabilities can hardly keep pace with this change. In sheltered workshops in Germany, people with physical and mental disabilities receive dedicated help to be integrated into work processes. This work shows that cooperation between disabled people and industrial robots, by means of industrial image processing, can successfully result in the production of highly complex products. We describe how high-pressure hydraulic pumps are assembled by people with disabilities in cooperation with industrial robots in a sheltered workshop. After the assembly process, the pumps are checked for leaks at very high pressures in a completely automated process.
CIS-lunar space infrastructure lunar technologies: Executive summary
NASA Technical Reports Server (NTRS)
Faller, W.; Hoehn, A.; Johnson, S.; Moos, P.; Wiltberger, N.
1989-01-01
Technologies necessary for the creation of a cis-Lunar infrastructure, namely: (1) automation and robotics; (2) life support systems; (3) fluid management; (4) propulsion; and (5) rotating technologies, are explored. The technological focal point is on the development of automated and robotic systems for the implementation of a Lunar Oasis produced by Automation and Robotics (LOAR). Under direction from the NASA Office of Exploration, automation and robotics were extensively utilized as an initiating stage in the return to the Moon. A pair of autonomous rovers, modular in design and built from interchangeable and specialized components, is proposed. Utilizing a buddy system, these rovers will be able to support each other and to enhance their individual capabilities. One rover primarily explores and maps while the second rover tests the feasibility of various materials-processing techniques. The automated missions emphasize availability and potential uses of Lunar resources, and the deployment and operations of the LOAR program. An experimental bio-volume is put into place as the precursor to a Lunar environmentally controlled life support system. The bio-volume will determine the reproduction, growth and production characteristics of various life forms housed on the Lunar surface. Physicochemical regenerative technologies and stored resources will be used to buffer biological disturbances of the bio-volume environment. The in situ Lunar resources will be both tested and used within this bio-volume. Second phase development on the Lunar surface calls for manned operations. Repairs and re-configuration of the initial framework will ensue. An autonomously-initiated manned Lunar oasis can become an essential component of the United States space program.
The paper crisis: from hospitals to medical practices.
Park, Gregory; Neaveill, Rodney S
2009-01-01
Hospitals, not unlike physician practices, are faced with an increasing burden of managing piles of hard copy documents including insurance forms, requests for information, and advance directives. Healthcare organizations are moving to transform paper-based forms and documents into digitized files in order to save time and money and to have those documents available at a moment's notice. The cost of these document management/imaging systems can be easily justified with the significant savings of resources realized from the implementation of these systems. This article illustrates the enormity of the "paper problem" in healthcare and outlines just a few of the required processes that could be improved with the use of automated document management/imaging systems.
Foundations of the Bandera Abstraction Tools
NASA Technical Reports Server (NTRS)
Hatcliff, John; Dwyer, Matthew B.; Pasareanu, Corina S.; Robby
2003-01-01
Current research is demonstrating that model-checking and other forms of automated finite-state verification can be effective for checking properties of software systems. Due to the exponential costs associated with model-checking, multiple forms of abstraction are often necessary to obtain system models that are tractable for automated checking. The Bandera Tool Set provides multiple forms of automated support for compiling concurrent Java software systems to models that can be supplied to several different model-checking tools. In this paper, we describe the foundations of Bandera's data abstraction mechanism which is used to reduce the cardinality (and the program's state-space) of data domains in software to be model-checked. From a technical standpoint, the form of data abstraction used in Bandera is simple, and it is based on classical presentations of abstract interpretation. We describe the mechanisms that Bandera provides for declaring abstractions, for attaching abstractions to programs, and for generating abstracted programs and properties. The contributions of this work are the design and implementation of various forms of tool support required for effective application of data abstraction to software components written in a programming language like Java which has a rich set of linguistic features.
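The kind of data abstraction described can be illustrated with the classical "signs" abstract domain, which collapses the integers to a small finite domain so a model checker has fewer states to explore. This is a generic abstract-interpretation sketch, not Bandera's actual mechanism.

```python
# Signs abstraction: concrete ints map to {neg, zero, pos}; abstract
# operations soundly over-approximate, with "top" meaning "any sign".

NEG, ZERO, POS, TOP = "neg", "zero", "pos", "top"

def alpha(n):
    """Abstraction function: concrete int -> sign token."""
    return NEG if n < 0 else POS if n > 0 else ZERO

def abs_add(a, b):
    """Abstract addition over signs; precision is lost when signs differ."""
    if ZERO in (a, b):
        return b if a == ZERO else a
    return a if a == b else TOP

print(abs_add(alpha(3), alpha(4)))    # pos + pos stays pos
print(abs_add(alpha(-2), alpha(5)))   # neg + pos: sign unknown
```

Replacing an unbounded integer variable with this three-valued domain is exactly the cardinality reduction the passage describes, at the cost of some precision.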
Elements of EAF automation processes
NASA Astrophysics Data System (ADS)
Ioana, A.; Constantin, N.; Dragna, E. C.
2017-01-01
Our article presents elements of Electric Arc Furnace (EAF) automation. We present and analyze in detail two automation schemes: the scheme of the electrical EAF automation system and the scheme of the thermal EAF automation system. Applying these automation schemes results in a significant reduction in the specific consumption of electrical energy by the Electric Arc Furnace, increased productivity of the Electric Arc Furnace, improved quality of the steel produced, and increased durability of the structural elements of the Electric Arc Furnace.
Sheng, Xi
2012-07-01
The thesis studies an automated replenishment algorithm for the medical supplies supply chain in hospitals. The mathematical model and algorithm for automated replenishment of medical supplies are designed using practical data from a hospital, on the basis of inventory theory, a greedy algorithm, and a partition algorithm. The automated replenishment algorithm is shown to compute medical supplies distribution amounts automatically and to optimize the distribution scheme. It can be concluded that, if applied in the medical supplies circulation field, the model and algorithm of inventory theory can provide theoretical and technological support for realizing automated replenishment of medical supplies in the hospital supply chain.
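An order-up-to replenishment rule, a standard inventory-theory building block of the kind such a model might use, can be sketched as follows; the item names, stock levels, and thresholds are hypothetical.

```python
# Sketch of a per-item order-up-to replenishment rule: when on-hand stock
# falls to the reorder point, greedily top the item up to its target level.

def replenishment_order(items):
    """items: dict name -> (on_hand, reorder_point, order_up_to)."""
    order = {}
    for name, (on_hand, rop, target) in items.items():
        if on_hand <= rop:                    # greedy trigger per item
            order[name] = target - on_hand    # distribution amount
    return order

stock = {
    "syringes": (40, 50, 200),    # below reorder point -> replenish
    "gloves":   (500, 300, 800),  # above reorder point -> skip
    "gauze":    (30, 30, 120),    # at reorder point -> replenish
}
print(replenishment_order(stock))
```

A fuller model would set the reorder point and target from demand forecasts and lead times, which is where the thesis's inventory theory and partition algorithm come in.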
NASA Technical Reports Server (NTRS)
Knight, Russell; Donnellan, Andrea; Green, Joseph J.
2013-01-01
A challenge for any proposed mission is to demonstrate convincingly that the proposed systems will in fact deliver the science promised. Funding agencies and mission design personnel are becoming ever more skeptical of the abstractions that form the basis of the current state of the practice with respect to approximating science return. To address this, we have been using automated planning and scheduling technology to provide actual coverage campaigns that provide better predictive performance with respect to science return for a given mission design and set of mission objectives given implementation uncertainties. Specifically, we have applied an adaptation of ASPEN and SPICE to the Eagle-Eye domain that demonstrates the performance of the mission design with respect to coverage of science imaging targets that address climate change and disaster response. Eagle-Eye is an Earth-imaging telescope that has been proposed to fly aboard the International Space Station (ISS).
GT-CATS: Tracking Operator Activities in Complex Systems
NASA Technical Reports Server (NTRS)
Callantine, Todd J.; Mitchell, Christine M.; Palmer, Everett A.
1999-01-01
Human operators of complex dynamic systems can experience difficulties supervising advanced control automation. One remedy is to develop intelligent aiding systems that can provide operators with context-sensitive advice and reminders. The research reported herein proposes, implements, and evaluates a methodology for activity tracking, a form of intent inferencing that can supply the knowledge required for an intelligent aid by constructing and maintaining a representation of operator activities in real time. The methodology was implemented in the Georgia Tech Crew Activity Tracking System (GT-CATS), which predicts and interprets the actions performed by Boeing 757/767 pilots navigating using autopilot flight modes. This report first describes research on intent inferencing and complex modes of automation. It then provides a detailed description of the GT-CATS methodology, knowledge structures, and processing scheme. The results of an experimental evaluation using airline pilots are given. The results show that GT-CATS was effective in predicting and interpreting pilot actions in real time.
Door recognition in cluttered building interiors using imagery and lidar data
NASA Astrophysics Data System (ADS)
Díaz-Vilariño, L.; Martínez-Sánchez, J.; Lagüela, S.; Armesto, J.; Khoshelham, K.
2014-06-01
Indoor building reconstruction is an active research topic due to the wide range of applications it serves, from architecture and furniture design, to movie and video game editing, and even crime scene investigation. Among the constructive elements defining the inside of a building, doors are important entities in applications like routing and navigation, and their automated recognition is advantageous, e.g., in large multi-storey buildings with many office rooms. The inherent complexity of automating the recognition process is increased by the presence of clutter and occlusions, which are difficult to avoid in indoor scenes. In this work, we present a pipeline of techniques for the reconstruction and interpretation of building interiors using information acquired in the form of point clouds and images. The methodology goes in depth into door detection and labelling as either opened, closed, or furniture (false positives).
Automated high-throughput flow-through real-time diagnostic system
Regan, John Frederick
2012-10-30
An automated real-time flow-through system capable of processing multiple samples in an asynchronous, simultaneous, and parallel fashion for nucleic acid extraction and purification, followed by assay assembly, genetic amplification, multiplex detection, analysis, and decontamination. The system is able to hold and access an unlimited number of fluorescent reagents that may be used to screen samples for the presence of specific sequences. The apparatus works by associating extracted and purified sample with a series of reagent plugs that have been formed in a flow channel and delivered to a flow-through real-time amplification detector that has a multiplicity of optical windows, to which the sample-reagent plugs are placed in an operative position. The diagnostic apparatus includes sample multi-position valves, a master sample multi-position valve, a master reagent multi-position valve, reagent multi-position valves, and an optical amplification/detection system.
Real-time automated failure identification in the Control Center Complex (CCC)
NASA Technical Reports Server (NTRS)
Kirby, Sarah; Lauritsen, Janet; Pack, Ginger; Ha, Anhhoang; Jowers, Steven; Mcnenny, Robert; Truong, The; Dell, James
1993-01-01
A system which will provide real-time failure management support to the Space Station Freedom program is described. The system's use of a simplified form of model based reasoning qualifies it as an advanced automation system. However, it differs from most such systems in that it was designed from the outset to meet two sets of requirements. First, it must provide a useful increment to the fault management capabilities of the Johnson Space Center (JSC) Control Center Complex (CCC) Fault Detection Management system. Second, it must satisfy CCC operational environment constraints such as cost, computer resource requirements, and verification and validation. The need to meet both requirement sets presents a much greater design challenge than would have been the case had functionality been the sole design consideration. The choice of technology is overviewed, along with aspects of that choice and the process for migrating the system into the control center.
Jones, Gillian; Matthews, Roger; Cunningham, Richard; Jenks, Peter
2011-01-01
The sensitivity of automated culture of Staphylococcus aureus from flocked swabs versus that of manual culture of fiber swabs was prospectively compared using nasal swabs from 867 patients. Automated culture from flocked swabs significantly increased the detection rate, by 13.1% for direct culture and 10.2% for enrichment culture. PMID:21525218
78 FR 70929 - Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-27
... through the use of automated collection techniques or other forms of information technology. DATES... information collection technology and to connect with ongoing initiatives to implement automated systems to... Department of the Army announces a proposed public information collection and seeks public comment on the...
Laboratory automation: trajectory, technology, and tactics.
Markin, R S; Whalen, S A
2000-05-01
Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation, are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations is small and the precision with which operational data have been collected to determine payback is suboptimal.
The trend in automation has moved from total laboratory automation to a modular approach, from a hardware-driven system to process control, from a one-of-a-kind novelty toward a standardized product, and from an in vitro diagnostics novelty to a marketing tool. Multiple vendors are present in the marketplace, many of whom are in vitro diagnostics manufacturers providing an automation solution coupled with their instruments, whereas others are focused automation companies. Automation technology continues to advance, acceptance continues to climb, and payback and cost justification methods are developing.
Small cities face greater impact from automation.
Frank, Morgan R; Sun, Lijun; Cebrian, Manuel; Youn, Hyejin; Rahwan, Iyad
2018-02-01
The city has proved to be the most successful form of human agglomeration and provides wide employment opportunities for its dwellers. As advances in robotics and artificial intelligence revive concerns about the impact of automation on jobs, a question looms: how will automation affect employment in cities? Here, we provide a comparative picture of the impact of automation across US urban areas. Small cities will undertake greater adjustments, such as worker displacement and job content substitutions. We demonstrate that large cities exhibit increased occupational and skill specialization due to increased abundance of managerial and technical professions. These occupations are not easily automatable, and, thus, reduce the potential impact of automation in large cities. Our results pass several robustness checks including potential errors in the estimation of occupational automation and subsampling of occupations. Our study provides the first empirical law connecting two societal forces: urban agglomeration and automation's impact on employment. © 2018 The Authors.
Summary of astronaut inputs on automation and robotics for Space Station Freedom
NASA Technical Reports Server (NTRS)
Weeks, David J.
1990-01-01
Astronauts and payload specialists present specific recommendations in the form of an overview that relate to the use of automation and robotics on the Space Station Freedom. The inputs are based on on-orbit operations experience, time requirements for crews, and similar crew-specific knowledge that address the impacts of automation and robotics on productivity. Interview techniques and specific questionnaire results are listed, and the majority of the responses indicate that incorporating automation and robotics to some extent and with human backup can improve productivity. Specific support is found for the use of advanced automation and EVA robotics on the Space Station Freedom and for the use of advanced automation on ground-based stations. Ground-based control of in-flight robotics is required, and Space Station activities and crew tasks should be analyzed to assess the systems engineering approach for incorporating automation and robotics.
Recent Developments in Advanced Automated Post-Processing at AMOS
2014-09-01
Borelli, K. (KJS Consulting); Thompson, Lisa (Air Force Research Laboratory). ABSTRACT: A new automated post-processing system has been developed to...the existing algorithms, in addition to the development of new data processing features.
IDAPS (Image Data Automated Processing System) System Description
1988-06-24
This document describes the physical configuration and components used in the image processing system referred to as IDAPS (Image Data Automated Processing System). This system was developed by the Environmental Research Institute of Michigan (ERIM) for Eglin Air Force Base. The system is designed
ERIC Educational Resources Information Center
Moore, John
2007-01-01
In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools that have taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…
2012-02-09
The calibrated data are then sent to NRL Stennis Space Center (NRL-SSC) for further processing using the NRL SSC Automated Processing System (APS...hyperspectral sensor in space we have not previously developed automated processing for hyperspectral ocean color data. The hyperspectral processing branch
Minor, K S; Willits, J A; Marggraf, M P; Jones, M N; Lysaker, P H
2018-04-25
Conveying information cohesively is an essential element of communication that is disrupted in schizophrenia. These disruptions are typically expressed through disorganized symptoms, which have been linked to neurocognitive, social cognitive, and metacognitive deficits. Automated analysis can objectively assess disorganization within sentences, between sentences, and across paragraphs by comparing explicit communication to a large text corpus. Little work in schizophrenia has tested: (1) links between disorganized symptoms measured via automated analysis and neurocognition, social cognition, or metacognition; and (2) whether automated analysis explains incremental variance in cognitive processes beyond clinician-rated scales. Disorganization was measured in schizophrenia (n = 81) with Coh-Metrix 3.0, an automated program that calculates basic and complex language indices. Trained staff also assessed neurocognition, social cognition, metacognition, and clinician-rated disorganization. Findings showed that all three cognitive processes were significantly associated with at least one automated index of disorganization. When automated analysis was compared with a clinician-rated scale, it accounted for significant variance in neurocognition and metacognition beyond the clinician-rated measure. When combined, these two methods explained 28-31% of the variance in neurocognition, social cognition, and metacognition. This study illustrated how automated analysis can highlight the specific role of disorganization in neurocognition, social cognition, and metacognition. Generally, those with poor cognition also displayed more disorganization in their speech, making it difficult for listeners to process essential information needed to tie the speaker's ideas together. Our findings showcase how implementing a mixed-methods approach in schizophrenia can explain substantial variance in cognitive processes.
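The idea of scoring cohesion automatically can be illustrated with a toy metric. This is a simplified sketch only (Coh-Metrix computes far richer indices): cosine overlap between adjacent sentences' word counts, so speech with little sentence-to-sentence overlap scores lower.

```python
from collections import Counter
import math

def cosine(a, b):
    """Cosine similarity between two word-count Counters."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def cohesion(text):
    """Mean similarity of adjacent sentence pairs."""
    sents = [Counter(s.lower().split()) for s in text.split(".") if s.strip()]
    pairs = list(zip(sents, sents[1:]))
    return sum(cosine(a, b) for a, b in pairs) / len(pairs)

organized = "The dog chased the ball. The dog caught the ball. The ball was red."
disorganized = "The dog chased the ball. My uncle hates trains. Soup is loud."
print(cohesion(organized) > cohesion(disorganized))  # True
```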
Automated matching software for clinical trials eligibility: measuring efficiency and flexibility.
Penberthy, Lynne; Brown, Richard; Puma, Federico; Dahman, Bassam
2010-05-01
Clinical trials (CT) serve as the media that translates clinical research into standards of care. Low or slow recruitment leads to delays in delivery of new therapies to the public. Determination of eligibility in all patients is one of the most important factors to assure unbiased results from the clinical trials process and represents the first step in addressing the issue of under representation and equal access to clinical trials. This is a pilot project evaluating the efficiency, flexibility, and generalizibility of an automated clinical trials eligibility screening tool across 5 different clinical trials and clinical trial scenarios. There was a substantial total savings during the study period in research staff time spent in evaluating patients for eligibility ranging from 165h to 1329h. There was a marked enhancement in efficiency with the automated system for all but one study in the pilot. The ratio of mean staff time required per eligible patient identified ranged from 0.8 to 19.4 for the manual versus the automated process. The results of this study demonstrate that automation offers an opportunity to reduce the burden of the manual processes required for CT eligibility screening and to assure that all patients have an opportunity to be evaluated for participation in clinical trials as appropriate. The automated process greatly reduces the time spent on eligibility screening compared with the traditional manual process by effectively transferring the load of the eligibility assessment process to the computer. Copyright (c) 2010 Elsevier Inc. All rights reserved.
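The core of automated eligibility screening can be sketched as rules applied to structured patient records. This is a hedged illustration; the study's actual tool, criteria, and data model are not reproduced, and the field names below are made up.

```python
# Each trial is represented as a list of predicate rules; a patient is a
# candidate only if every rule passes, shifting the screening load from
# staff to the computer.
def eligible(patient, rules):
    return all(rule(patient) for rule in rules)

trial_rules = [
    lambda p: 18 <= p["age"] <= 75,          # age window
    lambda p: p["diagnosis"] == "NSCLC",     # target diagnosis
    lambda p: not p["prior_chemo"],          # exclusion criterion
]

patients = [
    {"id": 1, "age": 64, "diagnosis": "NSCLC", "prior_chemo": False},
    {"id": 2, "age": 80, "diagnosis": "NSCLC", "prior_chemo": False},
    {"id": 3, "age": 55, "diagnosis": "NSCLC", "prior_chemo": True},
]

matches = [p["id"] for p in patients if eligible(p, trial_rules)]
print(matches)  # [1]
```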
Fananapazir, Ghaneh; Bashir, Mustafa R; Marin, Daniele; Boll, Daniel T
2015-06-01
To evaluate the performance of a prototype, fully-automated post-processing solution for whole-liver and lobar segmentation based on MDCT datasets. A polymer liver phantom was used to assess accuracy of post-processing applications, comparing phantom volumes determined via Archimedes' principle with MDCT segmented datasets. For the IRB-approved, HIPAA-compliant study, 25 patients were enrolled. Volumetry performance compared the manual approach with the automated prototype, assessing intraobserver variability and interclass correlation for whole-organ and lobar segmentation using ANOVA comparison. Fidelity of segmentation was evaluated qualitatively. Phantom volume was 1581.0 ± 44.7 mL; manually segmented datasets estimated 1628.0 ± 47.8 mL (a mean overestimation of 3.0%); automatically segmented datasets estimated 1601.9 ± 0 mL (a mean overestimation of 1.3%). Whole-liver and segmental volumetry demonstrated no significant intraobserver variability for either manual or automated measurements. For whole-liver volumetry, automated measurement repetitions resulted in identical values; reproducible whole-organ volumetry was also achieved with manual segmentation, p(ANOVA) 0.98. For lobar volumetry, automated segmentation improved reproducibility over the manual approach, without significant measurement differences for either methodology, p(ANOVA) 0.95-0.99. Whole-organ and lobar segmentation results from manual and automated segmentation showed no significant differences, p(ANOVA) 0.96-1.00. Assessment of segmentation fidelity found that segments I-IV/VI showed greater segmentation inaccuracies compared to the remaining right hepatic lobe segments.
Fully automated whole-liver segmentation was non-inferior to manual segmentation, with improved reproducibility and shorter post-processing duration; automated dual-seed lobar segmentation showed a slight tendency to underestimate right hepatic lobe volume and greater variability in edge detection for the left hepatic lobe compared to manual segmentation.
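Volume estimates like those reported above are ultimately derived by counting labeled voxels and multiplying by the voxel volume. The following is a minimal sketch with made-up values, not the prototype's algorithm:

```python
import numpy as np

def segmented_volume_ml(mask, spacing_mm):
    """mask: boolean 3D array; spacing_mm: (z, y, x) voxel size in mm."""
    voxel_mm3 = float(np.prod(spacing_mm))
    return mask.sum() * voxel_mm3 / 1000.0   # mm^3 -> mL

# Synthetic segmentation: a 6x6x6 block of 1 mm isotropic voxels.
mask = np.zeros((10, 10, 10), dtype=bool)
mask[2:8, 2:8, 2:8] = True                   # 216 voxels
print(segmented_volume_ml(mask, (1.0, 1.0, 1.0)))  # 0.216 mL
```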
75 FR 42828 - Proposed Collection; Comment Request for Form 13560
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-22
... handling. This form serves as supporting documentation for any funds returned by an HPA and clarifies where... of automated collection techniques or other forms of information technology; and (e) estimates of...
48 CFR 2452.239-70 - Background investigations for sensitive automated systems/applications.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Regulations System DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT CLAUSES AND FORMS SOLICITATION PROVISIONS AND..., the contractor shall submit the following completed forms: Standard Form (SF) 85P, Questionnaire for...
48 CFR 2452.239-70 - Background investigations for sensitive automated systems/applications.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Regulations System DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT CLAUSES AND FORMS SOLICITATION PROVISIONS AND..., the contractor shall submit the following completed forms: Standard Form (SF) 85P, Questionnaire for...
The development of a post-test diagnostic system for rocket engines
NASA Technical Reports Server (NTRS)
Zakrajsek, June F.
1991-01-01
An effort was undertaken by NASA to develop an automated post-test, post-flight diagnostic system for rocket engines. The automated system is designed to be generic and to automate the rocket engine data review process. A modular, distributed architecture with a generic software core was chosen to meet the design requirements. The diagnostic system is initially being applied to the Space Shuttle Main Engine data review process. The system modules currently under development are the session/message manager, and portions of the applications section, the component analysis section, and the intelligent knowledge server. An overview is presented of a rocket engine data review process, the design requirements and guidelines, the architecture and modules, and the projected benefits of the automated diagnostic system.
CoLiTec software - detection of the near-zero apparent motion
NASA Astrophysics Data System (ADS)
Khlamov, Sergii V.; Savanevych, Vadym E.; Briukhovetskyi, Olexandr B.; Pohorelov, Artem V.
2017-06-01
In this article we describe the CoLiTec software for fully automated frame processing. CoLiTec can process the big data of observation results, as well as data that is formed continuously during observation. The tasks it solves include frame brightness equalization, moving-object detection, astrometry, photometry, etc. Along with highly efficient big-data processing, CoLiTec also ensures high measurement accuracy. A comparative analysis of functional characteristics and positional accuracy was performed between the CoLiTec and Astrometrica software, showing the benefits of CoLiTec on wide-field and low-quality frames. The efficiency of the CoLiTec software has been proved by about 700,000 observations and over 1,500 preliminary discoveries.
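The moving-object detection step can be illustrated at its simplest by frame differencing. This is a greatly simplified sketch (CoLiTec's actual pipeline also performs brightness equalization, astrometry, and photometry): pixels that change between aligned frames flag candidate movers.

```python
import numpy as np

def moving_pixels(frame_a, frame_b, threshold=10):
    """Return (row, col) coordinates where aligned frames differ."""
    diff = np.abs(frame_b.astype(int) - frame_a.astype(int))
    return np.argwhere(diff > threshold)

a = np.zeros((5, 5), dtype=np.uint8)
b = a.copy()
b[2, 3] = 200          # an "object" appears at row 2, col 3
print(moving_pixels(a, b).tolist())  # [[2, 3]]
```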
NASA Astrophysics Data System (ADS)
Sawall, Mathias; von Harbou, Erik; Moog, Annekathrin; Behrens, Richard; Schröder, Henning; Simoneau, Joël; Steimers, Ellen; Neymeyr, Klaus
2018-04-01
Spectral data preprocessing is an integral and sometimes inevitable part of chemometric analyses. For Nuclear Magnetic Resonance (NMR) spectra a possible first preprocessing step is a phase correction which is applied to the Fourier transformed free induction decay (FID) signal. This preprocessing step can be followed by a separate baseline correction step. Especially if series of high-resolution spectra are considered, then automated and computationally fast preprocessing routines are desirable. A new method is suggested that applies the phase and the baseline corrections simultaneously in an automated form without manual input, which distinguishes this work from other approaches. The underlying multi-objective optimization or Pareto optimization provides improved results compared to consecutively applied correction steps. The optimization process uses an objective function which applies strong penalty constraints and weaker regularization conditions. The new method includes an approach for the detection of zero baseline regions. The baseline correction uses a modified Whittaker smoother. The functionality of the new method is demonstrated for experimental NMR spectra. The results are verified against gravimetric data. The method is compared to alternative preprocessing tools. Additionally, the simultaneous correction method is compared to a consecutive application of the two correction steps.
NASA Astrophysics Data System (ADS)
Bitner, Rex M.; Koller, Susan C.
2002-06-01
The automated high throughput purification of genomic DNA from plant materials can be performed using MagneSil paramagnetic particles on the Beckman-Coulter FX, BioMek 2000, and Tecan Genesis robots. Similar automated methods are available for DNA purification from animal blood. These methods eliminate organic extractions, lengthy incubations, and cumbersome filter plates. The DNA is suitable for applications such as PCR and RAPD analysis. Methods are described for processing traditionally difficult samples, such as those containing large amounts of polyphenolics or oils, while still maintaining a high level of DNA purity. The robotic protocols have been optimized for agricultural applications such as marker-assisted breeding, seed-quality testing, and SNP discovery and scoring. In addition to high-yield purification of DNA from plant samples or animal blood, the use of Promega's DNA-IQ purification system is also described. This method allows for the purification of a narrow range of DNA regardless of the amount of additional DNA present in the initial sample. This simultaneous isolation and quantification of DNA allows the DNA to be used directly in applications such as PCR, SNP analysis, and RAPD, without the need for separate quantitation of the DNA.
Development of an interactive data base management system for capturing large volumes of data.
Moritz, T E; Ellis, N K; VillaNueva, C B; Steeger, J E; Ludwig, S T; Deegan, N I; Shroyer, A L; Henderson, W G; Sethi, G K; Grover, F L
1995-10-01
Accurate collection and successful management of data are problems common to all scientific studies. For studies in which large quantities of data are collected by means of questionnaires and/or forms, data base management becomes quite laborious and time consuming. Data base management comprises data collection, data entry, data editing, and data base maintenance. In this article, the authors describe the development of an interactive data base management (IDM) system for the collection of more than 1,400 variables from a targeted population of 6,000 patients undergoing heart surgery requiring cardiopulmonary bypass. The goals of the IDM system are to increase the accuracy and efficiency with which this large amount of data is collected and processed, to reduce research nurse work load through automation of certain administrative and clerical activities, and to improve the process for implementing a uniform study protocol, standardized forms, and definitions across sites.
Automating Media Centers and Small Libraries: A Microcomputer-Based Approach.
ERIC Educational Resources Information Center
Meghabghab, Dania Bilal
Although the general automation process can be applied to most libraries, small libraries and media centers require a customized approach. Using a systematic approach, this guide covers each step and aspect of automation in a small library setting, and combines the principles of automation with field-tested activities. After discussing needs…
ERIC Educational Resources Information Center
Spaulding, Trent Joseph
2011-01-01
The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…
NASA Astrophysics Data System (ADS)
Schug, Alexander; Winkelbauer, Jonas; Hinterhölzl, Roland; Drechsler, Klaus
2017-10-01
The aim of this study was to analyse the forming behaviour of glass fibre reinforced polypropylene and to identify the influence of several process parameters on the resulting part quality. For this purpose, a complex forming tool was designed, consisting of several areas with single and double curvature. The specimens were produced from unidirectional (UD) tape using the Fiberforge RELAY2000® automated tape laying machine and a subsequent consolidation step. They were then fixed in a support frame, pre-heated in an infrared oven, and formed in the forming tool, which was mounted into a hydraulic heating press. The investigated process parameters were the number and force of the springs in the support frame, the tool temperature and the forming pressure and speed. The layups of the specimens were [0/90/0/90/0¯]s and [0/45/90/-45/0¯]s. After the forming process, the parts were analysed in terms of their quality, with a special focus on wrinkles, undulations, gaps and surface roughness. In addition to optical analysis with a statistical evaluation of the results, 3D scans of the specimens at different steps of the forming process were made to gain an impression of the forming mechanisms and the development of failures. The ATOS system of GOM was used for these 3D scans. The results show that the undulations were influenced by the tool temperature and the spring force. By contrast, the surface quality was most strongly dependent on the forming pressure, which also influenced the size and the number of gaps. The forming speed affected the gaps as well. The tool temperature had the largest influence on the development of wrinkles. As expected, the quasi-isotropic layup showed distinctly more wrinkles and undulations, but it also presented a better surface quality than the orthotropic layup.
Measuring Performance with Library Automated Systems.
ERIC Educational Resources Information Center
O'Farrell, John P.
2000-01-01
Investigates the capability of three library automated systems to generate some of the datasets necessary to form the ISO (International Organization for Standardization) standard on performance measurement within libraries, based on research at Liverpool John Moores University (United Kingdom). Concludes that the systems are weak in generating the…
NASA Astrophysics Data System (ADS)
Steposhina, S. V.; Fedonin, O. N.
2018-03-01
Dependencies have been developed that make it possible to automate force calculation during surface plastic deformation (SPD) processing and, thus, to shorten the time required for the technological preparation of production.
Engineering visualization utilizing advanced animation
NASA Technical Reports Server (NTRS)
Sabionski, Gunter R.; Robinson, Thomas L., Jr.
1989-01-01
Engineering visualization is the use of computer graphics to depict engineering analysis and simulation in visual form, from project planning through documentation. Graphics displays let engineers see data represented dynamically, which permits the quick evaluation of results. The current state of graphics hardware and software generally allows the creation of two types of 3D graphics. The use of animated video as an engineering visualization tool is presented. The engineering, animation, and videography aspects of animated video production are each discussed. Specific issues include the integration of staffing expertise, hardware, software, and the various production processes. A detailed explanation of the animation process reveals the capabilities of this unique engineering visualization method. Automation of animation and video production processes is covered, and future directions are proposed.
Blastocyst microinjection automation.
Mattos, Leonardo S; Grant, Edward; Thresher, Randy; Kluckman, Kimberly
2009-09-01
Blastocyst microinjections are routinely involved in the process of creating genetically modified mice for biomedical research, but their efficiency is highly dependent on the skills of the operators. As a consequence, much time and resources are required for training microinjection personnel. This situation has been aggravated by the rapid growth of genetic research, which has increased the demand for mutant animals. Therefore, increased productivity and efficiency in this area are highly desired. Here, we pursue these goals through the automation of a previously developed teleoperated blastocyst microinjection system. This included the design of a new system setup to facilitate automation, the definition of rules for automatic microinjections, the implementation of video processing algorithms to extract feedback information from microscope images, and the creation of control algorithms for process automation. Experiments conducted with this new system, with operator assistance during the cell-delivery phase, demonstrated a 75% microinjection success rate. In addition, implantation of the successfully injected blastocysts resulted in a 53% birth rate and a 20% yield of chimeras. These results proved that the developed system was capable of automatic blastocyst penetration and retraction, demonstrating the success of major steps toward full process automation.
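The abstract does not spell out the vision-feedback algorithms; the following is a minimal, hypothetical Python sketch of one such step, locating a bright target (e.g. the blastocyst) in a grayscale frame and producing a pixel-space error for a positioning controller. The function names and threshold are illustrative assumptions, not the paper's method.

```python
import numpy as np

def blob_centroid(frame, thresh):
    """Return the (row, col) centroid of pixels above an intensity threshold,
    a stand-in for locating a target in a microscope frame."""
    mask = frame > thresh
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

def steering_error(frame, target_rc, thresh=128):
    """Pixel-space error vector that a positioning controller could consume."""
    c = blob_centroid(frame, thresh)
    if c is None:
        return None
    return (target_rc[0] - c[0], target_rc[1] - c[1])

# Synthetic 64x64 frame with one bright square blob
frame = np.zeros((64, 64), dtype=np.uint8)
frame[20:30, 40:50] = 200
err = steering_error(frame, target_rc=(32, 32))
```

A real system would feed this error into a motion controller each frame; here it simply quantifies how far the detected blob sits from the desired position.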
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-28
... Inventory Schedules; DD Form 1639, Scrap Warranty; DD Form 1640, Request for Plant Clearance; DD Form 1641... information collection on respondents, including the use of automated collection techniques or other forms of... Government property;
Automated System for Early Breast Cancer Detection in Mammograms
NASA Technical Reports Server (NTRS)
Bankman, Isaac N.; Kim, Dong W.; Christens-Barry, William A.; Weinberg, Irving N.; Gatewood, Olga B.; Brody, William R.
1993-01-01
The increasing demand on mammographic screening for early breast cancer detection, and the subtlety of early breast cancer signs on mammograms, suggest an automated image processing system that can serve as a diagnostic aid in radiology clinics. We present a fully automated algorithm for detecting clusters of microcalcifications that are the most common signs of early, potentially curable breast cancer. By using the contour map of the mammogram, the algorithm circumvents some of the difficulties encountered with standard image processing methods. The clinical implementation of an automated instrument based on this algorithm is also discussed.
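The contour-map algorithm itself is not given in the abstract; below is a simplified, generic sketch of microcalcification cluster detection (thresholding, small-spot segmentation, proximity grouping), assuming SciPy is available. All thresholds, sizes, and names are illustrative assumptions, not the authors' method.

```python
import numpy as np
from scipy import ndimage

def detect_clusters(img, thresh, max_spot_px=20, link_dist=10.0, min_spots=3):
    """Toy cluster detector: segment bright spots no larger than max_spot_px
    pixels, then greedily group spots whose centroids fall within link_dist
    of an existing group member; keep groups with >= min_spots spots
    (microcalcifications are clinically significant in clusters).
    Note: a single greedy pass can miss chain merges; fine for a sketch."""
    mask = img > thresh
    labels, n = ndimage.label(mask)
    if n == 0:
        return []
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    coms = ndimage.center_of_mass(mask, labels, range(1, n + 1))
    spots = [c for c, s in zip(coms, sizes) if s <= max_spot_px]
    groups = []
    for c in spots:
        for g in groups:
            if any(np.hypot(c[0] - p[0], c[1] - p[1]) <= link_dist for p in g):
                g.append(c)
                break
        else:
            groups.append([c])
    return [g for g in groups if len(g) >= min_spots]

# Synthetic patch: three nearby 3x3 spots (a cluster) and one isolated spot
img = np.zeros((100, 100))
for r, c in [(10, 10), (10, 18), (16, 14), (70, 70)]:
    img[r:r + 3, c:c + 3] = 1.0
clusters = detect_clusters(img, thresh=0.5)
```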
Results from the first fully automated PBS-mask process and pelliclization
NASA Astrophysics Data System (ADS)
Oelmann, Andreas B.; Unger, Gerd M.
1994-02-01
Automation is widely discussed in IC and mask manufacturing and partially realized everywhere. The idea for this automation goes back to 1978, when it turned out that the operators of the then newly installed PBS process line (the first in Europe) would have to be trained to behave like robots to reduce particle counts and thus achieve lower defect densities on the masks. More than this goal has been achieved. It turned out recently that the automation, with its dedicated work routes and detailed documentation of every lot (individual mask or reticle), made it easy to obtain the CEEC certificate, which includes ISO 9001.
NASA Astrophysics Data System (ADS)
Moreno, R.; Bazán, A. M.
2017-10-01
The main purpose of this work is to study improvements to the teaching of technical drawing and descriptive geometry by taking exercises that are usually solved manually with traditional techniques and solving them through automated processes assisted by high-level CAD templates (HLCts). Just as an exercise can be solved step by step with traditional procedures, as detailed in technical drawing and descriptive geometry manuals, CAD applications allow the same solution to be built and later generalized by incorporating references. Traditional teaching content has largely been relegated from current curricula as obsolete; however, it can still be applied in certain automation processes. The use of geometric references (via variables in script languages) and their incorporation into HLCts allows the automation of drawing processes. Instead of repeatedly creating similar exercises or modifying data in the same exercises, users can employ HLCts to generate future variations of these exercises. This paper introduces the automation process for generating exercises based on CAD script files, aided by parametric geometry calculation tools. The proposed method allows new exercises to be designed without user intervention. The integration of CAD, mathematics, and descriptive geometry facilitates their joint learning. Automating the generation of exercises not only saves time but also increases the quality of the problem statements and reduces the possibility of human error.
NASA Astrophysics Data System (ADS)
Lyakhovets, M. V.; Wenger, K. G.; Myshlyaev, L. P.; Shipunov, M. V.; Grachev, V. V.; Melkozerov, M. Yu; Fairoshin, Sh A.
2018-05-01
The experience of modernizing the process automation control system at the "Barzasskoye Tovarischestvo" LLC preparation plant (Berezovsky) under conditions of technical re-equipment is considered. The automated process control systems (APCS), the modernization goals, and the ways to achieve them are indicated; the main subsystems of the integrated APCS are presented; and the enlarged functional and technical structure of the upgraded system is given. The procedure for commissioning the upgraded system is described.
Automated fiber placement: Evolution and current demonstrations
NASA Technical Reports Server (NTRS)
Grant, Carroll G.; Benson, Vernon M.
1993-01-01
The automated fiber placement process has been in development at Hercules since 1980. Fiber placement is being developed specifically for aircraft and other high performance structural applications. Several major milestones have been achieved during process development. These milestones are discussed in this paper. The automated fiber placement process is currently being demonstrated on the NASA ACT program. All demonstration projects to date have focused on fiber placement of transport aircraft fuselage structures. Hercules has worked closely with Boeing and Douglas on these demonstration projects. This paper gives a description of demonstration projects and results achieved.
NASA Technical Reports Server (NTRS)
Prinzel, Lawrence J., III; Parasuraman, Raja; Freeman, Frederick G.; Scerbo, Mark W.; Mikulka, Peter J.; Pope, Alan T.
2003-01-01
Adaptive automation represents an advanced form of human-centered automation design. The approach to automation provides for real-time and model-based assessments of human-automation interaction, determines whether the human has entered into a hazardous state of awareness and then modulates the task environment to keep the operator in-the-loop , while maintaining an optimal state of task engagement and mental alertness. Because adaptive automation has not matured, numerous challenges remain, including what the criteria are, for determining when adaptive aiding and adaptive function allocation should take place. Human factors experts in the area have suggested a number of measures including the use of psychophysiology. This NASA Technical Paper reports on three experiments that examined the psychophysiological measures of event-related potentials, electroencephalogram, and heart-rate variability for real-time adaptive automation. The results of the experiments confirm the efficacy of these measures for use in both a developmental and operational role for adaptive automation design. The implications of these results and future directions for psychophysiology and human-centered automation design are discussed.
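This research program is commonly associated with an EEG engagement index of the form beta / (alpha + theta); the sketch below assumes that form and a simple deadband rule for switching between manual and automated modes. It is a schematic illustration of the adaptive-allocation idea, not the experiments' actual implementation.

```python
def engagement_index(theta, alpha, beta):
    """EEG engagement index beta / (alpha + theta), one index reported in
    this line of research (arguments are band powers)."""
    return beta / (alpha + theta)

def allocate(index, baseline, deadband=0.1, manual=True):
    """Adaptive allocation with a deadband: return True for manual control.
    If engagement falls below baseline*(1-deadband), hand the task to the
    human to restore engagement; if it rises above baseline*(1+deadband),
    automate to reduce load; otherwise keep the current mode."""
    if index < baseline * (1 - deadband):
        return True
    if index > baseline * (1 + deadband):
        return False
    return manual

idx = engagement_index(theta=4.0, alpha=4.0, beta=4.0)  # low engagement
mode_manual = allocate(idx, baseline=1.0)               # task goes to the human
```

The deadband prevents rapid mode oscillation near the baseline, a practical concern any real adaptive-automation controller must address.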
Jurrus, Elizabeth; Watanabe, Shigeki; Giuly, Richard J.; Paiva, Antonio R. C.; Ellisman, Mark H.; Jorgensen, Erik M.; Tasdizen, Tolga
2013-01-01
Neuroscientists are developing new imaging techniques and generating large volumes of data in an effort to understand the complex structure of the nervous system. The complexity and size of these data make human interpretation a labor-intensive task. To aid in the analysis, new segmentation techniques for identifying neurons in these feature-rich datasets are required. This paper presents a method for neuron boundary detection and nonbranching process segmentation in electron microscopy images, and for visualizing them in three dimensions. It combines automated segmentation techniques with a graphical user interface for correcting mistakes in the automated process. The automated stage first uses machine learning and image processing techniques to identify the neuron membranes that delineate the cells in each two-dimensional section. To segment nonbranching processes, the cell regions in each two-dimensional section are connected in 3D using correlation of regions between sections. The combination of this method with a graphical user interface designed specially for this purpose enables users to quickly segment cellular processes in large volumes. PMID:22644867
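As a rough illustration of the region-correlation linking step described above, the sketch below connects labeled regions in adjacent 2-D sections by pixel overlap. The overlap criterion and data shapes are assumptions for illustration, not the paper's algorithm.

```python
import numpy as np

def link_sections(labels_a, labels_b, min_overlap=0.5):
    """Link segmented regions across adjacent 2-D sections: region i in
    section A connects to region j in section B when their pixel overlap
    covers at least min_overlap of the smaller region (label 0 = background).
    Chains of such links across many sections would form a 3-D process."""
    links = []
    for i in np.unique(labels_a):
        if i == 0:
            continue
        a = labels_a == i
        for j in np.unique(labels_b):
            if j == 0:
                continue
            b = labels_b == j
            if np.logical_and(a, b).sum() >= min_overlap * min(a.sum(), b.sum()):
                links.append((int(i), int(j)))
    return links

# Two toy sections: region 1 drifts slightly; region 2 appears only in B
sec_a = np.zeros((10, 10), dtype=int)
sec_a[2:6, 2:6] = 1
sec_b = np.zeros((10, 10), dtype=int)
sec_b[3:7, 3:7] = 1
sec_b[8:10, 8:10] = 2
links = link_sections(sec_a, sec_b)
```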
Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N
2017-08-01
To target bacterial pathogens that invade and proliferate inside host cells, it is necessary to design intervention strategies directed against bacterial attachment, cellular invasion and intracellular proliferation. We present an automated microscopy-based, fast, high-throughput method for analyzing size and number of intracellular bacterial colonies in infected tissue culture cells. Cells are seeded in 48-well plates and infected with a GFP-expressing bacterial pathogen. Following gentamicin treatment to remove extracellular pathogens, cells are fixed and cell nuclei stained. This is followed by automated microscopy and subsequent semi-automated spot detection to determine the number of intracellular bacterial colonies, their size distribution, and the average number per host cell. Multiple 48-well plates can be processed sequentially and the procedure can be completed in one working day. As a model we quantified intracellular bacterial colonies formed by uropathogenic Escherichia coli (UPEC) during infection of human kidney cells (HKC-8). Urinary tract infections caused by UPEC are among the most common bacterial infectious diseases in humans. UPEC can colonize tissues of the urinary tract and is responsible for acute, chronic, and recurrent infections. In the bladder, UPEC can form intracellular quiescent reservoirs, thought to be responsible for recurrent infections. In the kidney, UPEC can colonize renal epithelial cells and pass to the blood stream, either via epithelial cell disruption or transcellular passage, to cause sepsis. Intracellular colonies are known to be clonal, originating from single invading UPEC. In our experimental setup, we found UPEC CFT073 intracellular bacterial colonies to be heterogeneous in size and present in nearly one third of the HKC-8 cells. 
This high-throughput experimental format substantially reduces experimental time and enables fast screening of the intracellular bacterial load and cellular distribution of multiple bacterial isolates. This will be a powerful experimental tool facilitating the study of bacterial invasion, drug resistance, and the development of new therapeutics. Copyright © 2017 Elsevier B.V. All rights reserved.
78 FR 52236 - Proposed Collection; Comment Request for Form 13560
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-22
... return of funds in order to ensure proper handling. This form serves as supporting documentation for any..., including through the use of automated collection techniques or other forms of information technology; and...
76 FR 55743 - Proposed Collection; Comment Request for Form 941-M
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-08
...; and (b) social security and Medicare taxes. Current Actions: There are no changes being made to Form... automated collection techniques or other forms of information technology; and (e) estimates of capital or...
Code of Federal Regulations, 2014 CFR
2014-10-01
..., development or installation of a statewide automated application processing and information retrieval system.... (2) The system is compatible with the claims processing and information retrieval systems used in the... in the title IV-A (AFDC) Automated Application Processing and Information Retrieval System Guide...
Code of Federal Regulations, 2013 CFR
2013-10-01
..., development or installation of a statewide automated application processing and information retrieval system.... (2) The system is compatible with the claims processing and information retrieval systems used in the... in the title IV-A (AFDC) Automated Application Processing and Information Retrieval System Guide...
Code of Federal Regulations, 2012 CFR
2012-10-01
..., development or installation of a statewide automated application processing and information retrieval system.... (2) The system is compatible with the claims processing and information retrieval systems used in the... in the title IV-A (AFDC) Automated Application Processing and Information Retrieval System Guide...
Code of Federal Regulations, 2011 CFR
2011-10-01
..., development or installation of a statewide automated application processing and information retrieval system.... (2) The system is compatible with the claims processing and information retrieval systems used in the... in the title IV-A (AFDC) Automated Application Processing and Information Retrieval System Guide...
Code of Federal Regulations, 2010 CFR
2010-10-01
..., development or installation of a statewide automated application processing and information retrieval system.... (2) The system is compatible with the claims processing and information retrieval systems used in the... in the title IV-A (AFDC) Automated Application Processing and Information Retrieval System Guide...
Data processing and analysis with the autoPROC toolbox
Vonrhein, Clemens; Flensburg, Claus; Keller, Peter; Sharff, Andrew; Smart, Oliver; Paciorek, Wlodek; Womack, Thomas; Bricogne, Gérard
2011-01-01
A typical diffraction experiment will generate many images and data sets from different crystals in a very short time. This creates a challenge for the high-throughput operation of modern synchrotron beamlines as well as for the subsequent data processing. Novice users in particular may feel overwhelmed by the tables, plots and numbers that the different data-processing programs and software packages present to them. Here, some of the more common problems that a user has to deal with when processing a set of images that will finally make up a processed data set are shown, concentrating on difficulties that may often show up during the first steps along the path of turning the experiment (i.e. data collection) into a model (i.e. interpreted electron density). Difficulties such as unexpected crystal forms, issues in crystal handling and suboptimal choices of data-collection strategies can often be dealt with, or at least diagnosed, by analysing specific data characteristics during processing. In the end, one wants to distinguish problems over which one has no immediate control once the experiment is finished from problems that can be remedied a posteriori. A new software package, autoPROC, is also presented that combines third-party processing programs with new tools and an automated workflow script that is intended to provide users with both guidance and insight into the offline processing of data affected by the difficulties mentioned above, with particular emphasis on the automated treatment of multi-sweep data sets collected on multi-axis goniostats. PMID:21460447
Kottawatta, Kottawattage S A; Van Bergen, Marcel A P; Abeynayake, Preeni; Wagenaar, Jaap A; Veldman, Kees T; Kalupahana, Ruwani S
2017-11-29
Broiler meat can become contaminated with Campylobacter of intestinal origin during processing. The present study aimed to identify the prevalence of Campylobacter in broiler flocks and meat contamination at retail shops, and to determine the influence of semi-automated and wet-market processing on Campylobacter contamination of neck skin samples. Samples were collected from semi-automated plants (n = 102) and wet markets (n = 25). From each batch of broilers, pooled caecal samples and neck skin samples were tested for Campylobacter. Broiler meat purchased from retail outlets (n = 37) was also tested. The prevalence of Campylobacter-colonized broiler flocks was 67%. The contamination of meat at retail was 59%. Semi-automated and wet-market processing contaminated broiler neck skins at levels of 27.4% and 48%, respectively. When Campylobacter-free broiler flocks were processed in semi-automated facilities, 15% (5/33) of neck skin samples became contaminated by the end of processing, whereas 25% (2/8) became contaminated after wet-market processing. Characterization of isolates revealed a higher proportion of C. coli compared with C. jejuni. High proportions of isolates were resistant to important antimicrobials. This study shows the importance of Campylobacter in the poultry industry in Sri Lanka and the need for controlling antimicrobial resistance.
ERIC Educational Resources Information Center
Cupoli, Patricia Dymkar
1981-01-01
Provides an introduction to various reference sources which are useful in dealing with the areas of data processing, office automation, and communications technologies. A bibliography with vendor listings is included. (FM)
NASA Astrophysics Data System (ADS)
Carson, John C.
1990-11-01
Various papers on materials, devices, techniques, and applications for Z-plane focal plane array technology are presented. Individual topics addressed include: application of Z-plane technology to the remote sensing of the earth from GEO, applications of smart neuromorphic focal planes, image processing with Z-plane technology, neural network Z-plane implementation with very high interconnection rates, using a small IR surveillance satellite for tactical applications, and establishing requirements for homing applications of Z-plane technology. Also discussed are: on-array spike suppression signal processing, algorithms for on-focal-plane gamma circumvention and time-delay integration, current HYMOSS Z-technology, packaging of electronics for on- and off-FPA signal processing, space/performance qualification of tape-automated-bonded devices, automation in tape automated bonding, high-speed/high-volume radiometric testing of Z-technology focal planes, 128-layer HYMOSS-module fabrication issues, and automation of IRFPA production processes.
Feasibility of Carbon Fiber/PEEK Composites for Cryogenic Fuel Tank Applications
NASA Astrophysics Data System (ADS)
Doyle, K.; Doyle, A.; O Bradaigh, C. M.; Jaredson, D.
2012-07-01
This paper investigates the feasibility of CF/PEEK composites for manufacture of cryogenic fuel tanks for Next Generation Space Launchers. The material considered is CF/PEEK tape from Suprem SA and the proposed manufacturing process for the fuel tank is Automated Tape Placement. Material characterization was carried out on test laminates manufactured in an autoclave and also by Automated Tape Placement with in-situ consolidation. The results of the two processes were compared to establish if there is any knock down in properties for the automated tape placement process. A permeability test rig was setup with a helium leak detector and the effect of thermal cycling on the permeability properties of CF/PEEK was measured. A 1/10th scale demonstrator was designed and manufactured consisting of a cylinder manufactured by automated tape placement and an upper dome manufactured by autoclave processing. The assembly was achieved by Amorphous Interlayer Bonding with PEI.
An intelligent CNC machine control system architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, D.J.; Loucks, C.S.
1996-10-01
Intelligent, agile manufacturing relies on automated programming of digitally controlled processes. Currently, processes such as Computer Numerically Controlled (CNC) machining are difficult to automate because of highly restrictive controllers and poor software environments. It is also difficult to utilize sensors and process models for adaptive control, or to integrate machining processes with other tasks within a factory floor setting. As part of a Laboratory Directed Research and Development (LDRD) program, a CNC machine control system architecture based on object-oriented design and graphical programming has been developed to address some of these problems and to demonstrate automated agile machining applications using platform-independent software.
Mowry, C.D.; Blair, D.S.; Rodacy, P.J.; Reber, S.D.
1999-07-13
An apparatus and process for the continuous, near real-time monitoring of low-level concentrations of organic compounds in a liquid, and, more particularly, a water stream. A small liquid volume of flow from a liquid process stream containing organic compounds is diverted by an automated process to a heated vaporization capillary where the liquid volume is vaporized to a gas that flows to an automated gas chromatograph separation column to chromatographically separate the organic compounds. Organic compounds are detected and the information transmitted to a control system for use in process control. Concentrations of organic compounds less than one part per million are detected in less than one minute. 7 figs.
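The patent's detection-to-control hand-off can be caricatured as a thresholded monitoring loop. The sketch below assumes each GC analysis cycle yields one concentration value and applies a short moving average before flagging cycles for the control system; neither detail is specified in the abstract.

```python
from collections import deque

def monitor_stream(samples_ppm, alarm_ppm=1.0, window=3):
    """Near real-time check on successive GC analyses: flag any cycle whose
    moving-average concentration reaches the alarm level, as a stand-in for
    the 'transmit to control system' step. Returns flagged cycle indices."""
    buf, alarms = deque(maxlen=window), []
    for cycle, ppm in enumerate(samples_ppm):
        buf.append(ppm)
        if sum(buf) / len(buf) >= alarm_ppm:
            alarms.append(cycle)
    return alarms

# Five one-minute cycles; a transient spike at cycle 3 raises the average
alarms = monitor_stream([0.2, 0.3, 0.4, 2.8, 0.1])
```

The moving average trades a little latency for robustness against single-cycle noise, a common choice in process monitoring.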
Code of Federal Regulations, 2010 CFR
2010-10-01
... IV-D Systems and Office Automation? 310.15 Section 310.15 Public Welfare Regulations Relating to... AND OFFICE AUTOMATION Requirements for Computerized Tribal IV-D Systems and Office Automation § 310.15... ensure the security and privacy of Computerized Tribal IV-D Systems and Office Automation? (a...
Automation of Acquisition Records and Routine in the University Library, Newcastle upon Tyne
ERIC Educational Resources Information Center
Line, Maurice B.
2006-01-01
Purpose: Reports on the trial of an automated order routine for the University Library in Newcastle which began in April 1966. Design/methodology/approach: Presents the author's experiences of the manual order processing system, and the impetus for trialling an automated system. The stages of the automated system are described in detail. Findings:…
Employee Engagement Is Vital for the Successful Selection of a Total Laboratory Automation System.
Yu, Hoi-Ying E; Wilkerson, Myra L
2017-11-08
To concretely outline a process for selecting a total laboratory automation system that connects clinical chemistry, hematology, and coagulation analyzers, and to serve as a reference for other laboratories. In Phase I, a committee including the laboratory's directors and technologists conducted a review of five systems based on a formal request-for-information process, site visits, and vendor presentations. We developed evaluation criteria and selected the two highest-performing systems. In Phase II, we executed a detailed comparison of the two vendors based on cost, instrument layout, workflow design, and future potential. In addition to selecting a laboratory automation system, we used the process to ensure employee engagement in preparation for implementation. Selecting a total laboratory automation system is a complicated process. This paper provides a practical guide to how a thorough selection process can be conducted with the participation of key stakeholders. © American Society for Clinical Pathology, 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
de Brouwer, Hans; Stegeman, Gerrit
2011-02-01
To maximize utilization of expensive laboratory instruments and to make the most effective use of skilled human resources, the entire chain of data processing, calculation, and reporting needed to transform raw NMR data into meaningful results was automated. The LEAN process-improvement tools were used to identify non-value-added steps in the existing process. These steps were eliminated using an in-house-developed software package, which allowed us to meet the key requirement of improving quality and reliability compared with the existing process while freeing up valuable human resources and increasing productivity. Reliability and quality were improved by the consistent data treatment performed by the software and the uniform administration of results. Automating a single NMR spectrometer led to a reduction in operator time of 35%, a doubling of the annual sample throughput from 1400 to 2800, and a reduction of the turnaround time from 6 days to less than 2. Copyright © 2011 Society for Laboratory Automation and Screening. Published by Elsevier Inc. All rights reserved.
Real time automated inspection
Fant, Karl M.; Fundakowski, Richard A.; Levitt, Tod S.; Overland, John E.; Suresh, Bindinganavle R.; Ulrich, Franz W.
1985-01-01
A method and apparatus relating to the real time automatic detection and classification of characteristic type surface imperfections occurring on the surfaces of material of interest such as moving hot metal slabs produced by a continuous steel caster. A data camera transversely scans continuous lines of such a surface to sense light intensities of scanned pixels and generates corresponding voltage values. The voltage values are converted to corresponding digital values to form a digital image of the surface which is subsequently processed to form an edge-enhanced image having scan lines characterized by intervals corresponding to the edges of the image. The edge-enhanced image is thresholded to segment out the edges and objects formed by the edges are segmented out by interval matching and bin tracking. Features of the objects are derived and such features are utilized to classify the objects into characteristic type surface imperfections.
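A minimal sketch of the per-scan-line stages (edge enhancement, thresholding, interval extraction) might look as follows; the gradient operator and threshold are placeholders, and the subsequent interval-matching and bin-tracking stages are omitted.

```python
import numpy as np

def edge_enhance(line):
    """1-D gradient magnitude along one scan line, a simple stand-in for
    the edge-enhancement step."""
    return np.abs(np.diff(line.astype(int)))

def edge_intervals(line, thresh):
    """Threshold the enhanced scan line and return (start, end) index pairs
    of supra-threshold runs; successive lines' intervals would then be
    matched and bin-tracked into candidate surface-imperfection objects."""
    mask = edge_enhance(line) > thresh
    idx = np.nonzero(mask)[0]
    if idx.size == 0:
        return []
    splits = np.nonzero(np.diff(idx) > 1)[0]
    starts = np.concatenate(([idx[0]], idx[splits + 1]))
    ends = np.concatenate((idx[splits], [idx[-1]]))
    return list(zip(starts.tolist(), ends.tolist()))

# One scan line crossing a bright defect: two edges, one per transition
intervals = edge_intervals(np.array([10, 10, 200, 200, 10, 10, 10]), thresh=100)
```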
Ask the experts: automation: part I.
Allinson, John L; Blick, Kenneth E; Cohen, Lucinda; Higton, David; Li, Ming
2013-08-01
Bioanalysis invited a selection of leading researchers to express their views on automation in the bioanalytical laboratory. The topics discussed include the challenges that the modern bioanalyst faces when integrating automation into existing drug-development processes, the impact of automation and how they envision the modern bioanalytical laboratory changing in the near future. Their enlightening responses provide a valuable insight into the impact of automation and the future of the constantly evolving bioanalytical laboratory.
Automating Disk Forensic Processing with SleuthKit, XML and Python
2009-05-01
Automating Disk Forensic Processing with SleuthKit, XML and Python. Simson L. Garfinkel. Abstract: We have developed a program called fiwalk which...files themselves. We show how it is relatively simple to create automated disk forensic applications using a Python module we have written that reads...software that the portable device may contain. Keywords: Computer Forensics; XML; Sleuth Kit; Python. Introduction: In recent years we have found many...
Office Automation: A Look Beyond Word Processing.
1983-06-01
...implying the type and nature of work performed by white-collar employees, and "products" will denote the techniques and type of equipment necessary...look at the problems of implementing an automated office and the possible impact it can have on human office workers. The purpose of this thesis is thus...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chernoguzov, Alexander; Markham, Thomas R.; Haridas, Harshal S.
A method includes generating at least one access vector associated with a specified device in an industrial process control and automation system. The specified device has one of multiple device roles. The at least one access vector is generated based on one or more communication policies defining communications between one or more pairs of device roles in the industrial process control and automation system, where each pair of device roles includes the device role of the specified device. The method also includes providing the at least one access vector to at least one of the specified device and one or more other devices in the industrial process control and automation system in order to control communications to or from the specified device.
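As a purely hypothetical illustration of deriving access vectors from role-pair communication policies (the patent text gives no data shapes; all names below are invented), one might write:

```python
def access_vectors(device_role, policies):
    """Build a per-device access vector from role-pair communication
    policies. `policies` maps (role_a, role_b) -> list of allowed protocols;
    the vector for a device collects, per peer role, what it may use.
    Entirely illustrative data shapes, not the patented method."""
    vec = {}
    for (a, b), protos in policies.items():
        if not protos:
            continue  # no communication allowed for this role pair
        if a == device_role:
            vec.setdefault(b, set()).update(protos)
        if b == device_role:
            vec.setdefault(a, set()).update(protos)
    return vec

policies = {
    ("controller", "sensor"): ["modbus"],
    ("controller", "historian"): ["opc-ua"],
    ("sensor", "historian"): [],
}
v = access_vectors("controller", policies)
```

The resulting vector could then be distributed to the device and its peers to gate traffic, mirroring the "provide the access vector to control communications" step.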
Performance Evaluation of the UT Automated Road Maintenance Machine
DOT National Transportation Integrated Search
1997-10-01
This final report focuses mainly on evaluating the overall performance of The University of Texas' Automated Road Maintenance Machine (ARMM). It was concluded that the introduction of automated methods to the pavement crack-sealing process will impro...
Enhanced Automated Guidance System for Horizontal Auger Boring Based on Image Processing
Wu, Lingling; Wen, Guojun; Wang, Yudan; Huang, Lei; Zhou, Jiang
2018-01-01
Horizontal auger boring (HAB) is a widely used trenchless technology for the high-accuracy installation of gravity or pressure pipelines on line and grade. Differing from other pipeline installations, HAB requires a more precise and automated guidance system for use in a practical project. This paper proposes an economical, enhanced automated optical guidance system based on optimization of a light-emitting diode (LED) light target and five automated image-processing bore-path deviation algorithms. The LED target was optimized with respect to several characteristics, including light color, filter plate color, luminous intensity, and LED layout. The image preprocessing algorithm, feature extraction algorithm, angle measurement algorithm, deflection detection algorithm, and auto-focus algorithm, compiled in MATLAB, are used to automate image processing for deflection computation and judgment. After multiple indoor experiments, the guidance system was applied in a hot-water pipeline installation project, with accuracy controlled to within 2 mm over a 48-m distance, providing accurate line and grade controls and verifying the feasibility and reliability of the guidance system. PMID:29462855
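The deflection-detection step can be sketched as a centroid offset between the LED target in the camera image and the optical axis, scaled by a calibration constant. The threshold, calibration value, and sign conventions below are illustrative assumptions, not the paper's algorithms.

```python
import numpy as np

def bore_deviation(img, mm_per_px, center=None):
    """Line-and-grade deviation from an LED target image: centroid of bright
    pixels vs. the optical axis, scaled to millimetres (mm_per_px would come
    from camera calibration). Returns (dx, dy): horizontal = line error,
    vertical = grade error."""
    if center is None:
        center = ((img.shape[0] - 1) / 2.0, (img.shape[1] - 1) / 2.0)
    rows, cols = np.nonzero(img > img.max() * 0.5)
    dy = (rows.mean() - center[0]) * mm_per_px  # vertical (grade)
    dx = (cols.mean() - center[1]) * mm_per_px  # horizontal (line)
    return dx, dy

# Synthetic 21x21 frame: LED blob below and left of the optical axis
img = np.zeros((21, 21))
img[12:15, 6:9] = 255
dx, dy = bore_deviation(img, mm_per_px=0.5)
```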
Using artificial intelligence to automate remittance processing.
Adams, W T; Snow, G M; Helmick, P M
1998-06-01
The consolidated business office of the Allegheny Health Education Research Foundation (AHERF), a large integrated healthcare system based in Pittsburgh, Pennsylvania, sought to improve its cash-related business office activities by implementing an automated remittance processing system that uses artificial intelligence. The goal was to create a completely automated system whereby all monies it processed would be tracked, automatically posted, analyzed, monitored, controlled, and reconciled through a central database. Using a phased approach, the automated payment system has become the central repository for all of the remittances for seven of the hospitals in the AHERF system and has allowed for the complete integration of these hospitals' existing billing systems, document imaging system, and intranet, as well as the new automated payment posting and electronic cash tracking and reconciling systems. For such new technology, which is designed to bring about major change, factors contributing to the project's success were adequate planning, clearly articulated objectives, marketing, end-user acceptance, and post-implementation plan revision.
Uncertainty and Motivation to Seek Information from Pharmacy Automated Communications.
Bones, Michelle; Nunlee, Martin
2018-05-28
Pharmacy personnel often answer telephones to respond to pharmacy customers (subjects) who received messages from automated systems. This research examines the communication process in terms of how users interact and engage with pharmacies after receiving automated messages. No study has directly addressed automated telephone calls and subjects' interactions. The purpose of this study is to test the interpersonal communication (IC) process of uncertainty in subjects in receipt of automated telephone calls (ATCs) from pharmacies. Subjects completed a survey of validated scales for Satisfaction (S), Relevance (R), Quality (Q), and Need for Cognitive Closure (NFC). Relationships between S, R, Q, NFC, and subject preference for ATCs were analyzed to determine whether subjects contacting pharmacies display information-seeking behavior. Results demonstrated that seeking information occurs if subjects: are dissatisfied with the content of the ATC; perceive that the Q of the ATC is high and like receiving ATCs; or have a high NFC and do not like receiving ATCs. Other interactions presented complexities among uncertainty and NFC tolerance within the IC process.
Kim, Jungkyu; Jensen, Erik C; Stockton, Amanda M; Mathies, Richard A
2013-08-20
A fully integrated multilayer microfluidic chemical analyzer for automated sample processing and labeling, as well as analysis using capillary zone electrophoresis is developed and characterized. Using lifting gate microfluidic control valve technology, a microfluidic automaton consisting of a two-dimensional microvalve cellular array is fabricated with soft lithography in a format that enables facile integration with a microfluidic capillary electrophoresis device. The programmable sample processor performs precise mixing, metering, and routing operations that can be combined to achieve automation of complex and diverse assay protocols. Sample labeling protocols for amino acid, aldehyde/ketone and carboxylic acid analysis are performed automatically followed by automated transfer and analysis by the integrated microfluidic capillary electrophoresis chip. Equivalent performance to off-chip sample processing is demonstrated for each compound class; the automated analysis resulted in a limit of detection of ~16 nM for amino acids. Our microfluidic automaton provides a fully automated, portable microfluidic analysis system capable of autonomous analysis of diverse compound classes in challenging environments.
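The "microfluidic automaton" concept above, composing diverse assay protocols from primitive mixing, metering, and routing operations on a 2D valve array, has a natural software analogue: a programmable sequence of valve states. The sketch below is a toy illustration under that reading; the class names, the single-valve peristaltic mixing cycle, and the grid layout are assumptions, not the authors' control software.

```python
from typing import Dict, List, Tuple

class ValveArray:
    """Toy model of a 2D microvalve array: True = valve open."""
    def __init__(self, rows: int, cols: int):
        self.state = {(r, c): False for r in range(rows) for c in range(cols)}
        self.log: List[Dict[Tuple[int, int], bool]] = []

    def actuate(self, opens: List[Tuple[int, int]]):
        """Open exactly the listed valves, close all others, record the step."""
        wanted = set(opens)
        for key in self.state:
            self.state[key] = key in wanted
        self.log.append(dict(self.state))

def peristaltic_mix(array: ValveArray, path: List[Tuple[int, int]], cycles: int = 3):
    """Mix by opening valves along a path one at a time, like a peristaltic pump."""
    for _ in range(cycles):
        for valve in path:
            array.actuate([valve])

def route(array: ValveArray, path: List[Tuple[int, int]]):
    """Route fluid by opening a whole path of valves simultaneously."""
    array.actuate(list(path))

arr = ValveArray(4, 4)
peristaltic_mix(arr, [(0, 0), (0, 1), (0, 2)], cycles=2)   # 6 actuation steps
route(arr, [(0, 2), (1, 2), (2, 2)])                       # 1 actuation step
```

Because each protocol is just a sequence of valve-state vectors, complex assays can be scripted and replayed, which is the essence of the programmable sample processor described above.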
75 FR 51533 - Proposed Collection; Comment Request for Form 14134 and Form 14135
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-20
... through the use of automated collection techniques or other forms of information technology; and (e... Form 14134, Application for Certificate of Subordination of Federal Tax Lien, and Form 14135, Application for Certificate of Discharge of Property from Federal Tax Lien. DATES: Written comments should be...
Enhancing Users' Participation in Business Process Modeling through Ontology-Based Training
NASA Astrophysics Data System (ADS)
Macris, A.; Malamateniou, F.; Vassilacopoulos, G.
Successful business process design requires the active participation of users who are familiar with organizational activities and business process modelling concepts. Hence, there is a need to provide users with reusable, flexible, agile and adaptable training material in order to enable them to instil their knowledge and expertise in business process design and automation activities. Knowledge reusability is of paramount importance in designing training material on process modelling, since it enables users to participate actively in process design/redesign activities stimulated by the changing business environment. This paper presents a prototype approach for the design and use of training material that provides significant advantages to both the designer (knowledge-content reusability and semantic web enablement) and the user (semantic search, knowledge navigation and knowledge dissemination). The approach is based on externalizing domain knowledge in the form of ontology-based knowledge networks (i.e. training scenarios serving specific training needs) so that it is made reusable.
WISE: Automated support for software project management and measurement. M.S. Thesis
NASA Technical Reports Server (NTRS)
Ramakrishnan, Sudhakar
1995-01-01
One important aspect of software development and IV&V is measurement. Unless a software development effort is measured in some way, it is difficult to judge the effectiveness of current efforts and predict future performance. Collection of metrics and adherence to a process are difficult tasks in a software project. Change activity is a powerful indicator of project status. Automated systems that can handle change requests, issues, and other process documents provide an excellent platform for tracking the status of the project. A World Wide Web based architecture is developed for (a) making metrics collection an implicit part of the software process, (b) providing metric analysis dynamically, (c) supporting automated tools that can complement current practices of in-process improvement, and (d) overcoming geographical barriers. An operational system (WISE) instantiates this architecture, allowing for the improvement of the software process in a realistic environment. The tool tracks issues in the software development process, provides informal communication between users with different roles, supports to-do lists (TDL), and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis, and captures software change data. Automated tools like WISE focus on understanding and managing the software process. The goal is improvement through measurement.
41 CFR 101-30.401-2 - Automated catalog data output.
Code of Federal Regulations, 2012 CFR
2012-07-01
... available from the Federal Catalog System. (b) Regular file maintenance (RFM). This form of the file... Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401-2 Automated catalog data output. As a...
41 CFR 101-30.401-2 - Automated catalog data output.
Code of Federal Regulations, 2011 CFR
2011-07-01
... available from the Federal Catalog System. (b) Regular file maintenance (RFM). This form of the file... Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401-2 Automated catalog data output. As a...
41 CFR 101-30.401-2 - Automated catalog data output.
Code of Federal Regulations, 2010 CFR
2010-07-01
... available from the Federal Catalog System. (b) Regular file maintenance (RFM). This form of the file... Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401-2 Automated catalog data output. As a...
41 CFR 101-30.401-2 - Automated catalog data output.
Code of Federal Regulations, 2013 CFR
2013-07-01
... available from the Federal Catalog System. (b) Regular file maintenance (RFM). This form of the file... Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401-2 Automated catalog data output. As a...
41 CFR 101-30.401-2 - Automated catalog data output.
Code of Federal Regulations, 2014 CFR
2014-07-01
... available from the Federal Catalog System. (b) Regular file maintenance (RFM). This form of the file... Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 30-FEDERAL CATALOG SYSTEM 30.4-Use of the Federal Catalog System § 101-30.401-2 Automated catalog data output. As a...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-18
... Activities: Automated Commercial Environment Trade Survey AGENCY: U.S. Customs and Border Protection... Environment Trade Survey. This document is published to obtain comments from the public and affected agencies... Environment Trade Survey. OMB Number: Will be assigned upon approval. Form Number: None. Abstract: CBP plans...
Towards Automated Screening of Two-dimensional Crystals
Cheng, Anchi; Leung, Albert; Fellmann, Denis; Quispe, Joel; Suloway, Christian; Pulokas, James; Carragher, Bridget; Potter, Clinton S.
2007-01-01
Screening trials to determine the presence of two-dimensional (2D) protein crystals suitable for three-dimensional structure determination using electron crystallography is a very labor-intensive process. Methods compatible with fully automated screening have been developed for the process of crystal production by dialysis and for producing negatively stained grids of the resulting trials. Further automation via robotic handling of the EM grids, and semi-automated transmission electron microscopic imaging and evaluation of the trial grids is also possible. We, and others, have developed working prototypes for several of these tools and tested and evaluated them in a simple screen of 24 crystallization conditions. While further development of these tools is certainly required for a turn-key system, the goal of fully automated screening appears to be within reach. PMID:17977016
Automation of testing modules of controller ELSY-TMK
NASA Astrophysics Data System (ADS)
Dolotov, A. E.; Dolotova, R. G.; Petuhov, D. V.; Potapova, A. P.
2017-01-01
In modern life, there are means for automating various processes that make it possible to ensure high quality standards for released products and to raise labour efficiency. This paper presents data on the automation of the test process for the ELSY-TMK controller [1]. The ELSY-TMK programmable logic controller is an effective modular platform for the construction of automation systems for small and medium-sized industrial production. The controller's modern, functional communication standard and open environment provide a powerful tool for a wide spectrum of industrial automation applications. The algorithm allows controller modules to be tested, by operating the switching system and external devices, faster and with higher quality than a human could achieve without such means.
Comparing clinical automated, medical record, and hybrid data sources for diabetes quality measures.
Kerr, Eve A; Smith, Dylan M; Hogan, Mary M; Krein, Sarah L; Pogach, Leonard; Hofer, Timothy P; Hayward, Rodney A
2002-10-01
Little is known about the relative reliability of medical record and clinical automated data, sources commonly used to assess diabetes quality of care. The agreement between diabetes quality measures constructed from clinical automated versus medical record data sources was compared, and the performance of hybrid measures derived from a combination of the two data sources was examined. Medical records were abstracted for 1,032 patients with diabetes who received care from 21 facilities in 4 Veterans Integrated Service Networks. Automated data were obtained from a central Veterans Health Administration diabetes registry containing information on laboratory tests and medication use. Success rates were higher for process measures derived from medical record data than from automated data, but no substantial differences among data sources were found for the intermediate outcome measures. Agreement for measures derived from the medical record compared with automated data was moderate for process measures but high for intermediate outcome measures. Hybrid measures yielded success rates similar to those of medical record-based measures but would have required about 50% fewer chart reviews. Agreement between medical record and automated data was generally high. Yet even in an integrated health care system with sophisticated information technology, automated data tended to underestimate the success rate in technical process measures for diabetes care and yielded different quartile performance rankings for facilities. Applying hybrid methodology yielded results consistent with the medical record but required less data to come from medical record reviews.
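Agreement statistics like those reported above are typically chance-corrected. As an illustration, Cohen's kappa for a binary process measure (e.g. "HbA1c test performed") determined from chart review versus an automated registry can be computed as follows. The per-patient data are made up, chosen so that the automated source misses two tests, mirroring the abstract's observation that automated data tended to underestimate success rates.

```python
def cohens_kappa(a, b):
    """Chance-corrected agreement between two binary raters/data sources."""
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    pa1, pb1 = sum(a) / n, sum(b) / n
    # Expected agreement if the two sources were statistically independent
    expected = pa1 * pb1 + (1 - pa1) * (1 - pb1)
    return (observed - expected) / (1 - expected)

# Hypothetical per-patient determinations (1 = measure met)
chart     = [1, 1, 1, 0, 1, 0, 1, 1, 0, 1]
automated = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]  # registry misses two tests
kappa = cohens_kappa(chart, automated)
```

Here observed agreement is 0.8 and chance agreement 0.5, giving kappa = 0.6, whereas raw percent agreement alone would read as a more flattering 80%.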
NASA Astrophysics Data System (ADS)
Nikolaev, V. N.; Titov, D. V.; Syryamkin, V. I.
2018-05-01
A comparative assessment is made of the channel capacity of different variants of the structural organization of automated information processing systems. A model for assessing information processing time, depending on the type of standard elements used and their structural organization, is developed.
Prototyping an automated lumber processing system
Powsiri Klinkhachorn; Ravi Kothari; Henry A. Huber; Charles W. McMillin; K. Mukherjee; V. Barnekov
1993-01-01
The Automated Lumber Processing System (ALPS) is a multi-disciplinary continuing effort directed toward increasing the yield obtained from hardwood lumber boards during their remanufacture into secondary products (furniture, etc.). ALPS proposes a nondestructive vision system to scan a board for its dimensions and the location and expanse of surface defects on...
Simple and Efficient Technique for Spatial/Temporal Composite Imagery
2007-08-01
visible spectrum between 412 nm and 869 nm, three bands at 500 m and two bands at 250 m. The MODIS data was processed using the Automated Processing System...Version 3.6 developed by the Naval Research Laboratory (NRL). The Automated Processing System (APS) is a collection of software programs assembled
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jurrus, Elizabeth R.; Watanabe, Shigeki; Giuly, Richard J.
2013-01-01
Neuroscientists are developing new imaging techniques and generating large volumes of data in an effort to understand the complex structure of the nervous system. The complexity and size of this data makes human interpretation a labor-intensive task. To aid in the analysis, new segmentation techniques for identifying neurons in these feature-rich datasets are required. This paper presents a method for neuron boundary detection and nonbranching process segmentation in electron microscopy images, and for visualizing them in three dimensions. It combines automated segmentation techniques with a graphical user interface for correction of mistakes in the automated process. The automated process first uses machine learning and image processing techniques to identify neuron membranes that delineate the cells in each two-dimensional section. To segment nonbranching processes, the cell regions in each two-dimensional section are connected in 3D using correlation of regions between sections. The combination of this method with a graphical user interface specially designed for this purpose enables users to quickly segment cellular processes in large volumes.
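The section-linking step above, connecting cell regions in 3D via correlation of regions between sections, can be sketched as matching each labeled region to the region in the adjacent slice with the largest pixel overlap. This is a schematic reading with an assumed overlap threshold, not the paper's exact correlation measure; the label images are assumed to come from an earlier membrane-detection stage.

```python
import numpy as np

def link_regions(labels_a, labels_b, min_overlap=0.5):
    """Map each region id in slice A to the id in slice B sharing the most pixels.

    A link is kept only if the overlap covers at least `min_overlap`
    of region A (a hypothetical threshold); label 0 is background.
    """
    links = {}
    for rid in np.unique(labels_a):
        if rid == 0:
            continue
        mask = labels_a == rid
        partners, counts = np.unique(labels_b[mask], return_counts=True)
        keep = partners != 0
        partners, counts = partners[keep], counts[keep]
        if counts.size and counts.max() >= min_overlap * mask.sum():
            links[int(rid)] = int(partners[counts.argmax()])
    return links

# Two toy 5x5 sections: region 1 persists (shifted by one pixel), region 2 vanishes
a = np.zeros((5, 5), int); a[1:3, 1:3] = 1; a[4, 4] = 2
b = np.zeros((5, 5), int); b[1:3, 2:4] = 7
links = link_regions(a, b)
```

Chaining such links slice after slice yields the nonbranching 3D processes; ambiguous or sub-threshold links are exactly the cases the paper's graphical interface lets a user correct.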
Pesavento, James J; Garcia, Benjamin A; Streeky, James A; Kelleher, Neil L; Mizzen, Craig A
2007-09-01
Recent developments in top down mass spectrometry have enabled closely related histone variants and their modified forms to be identified and quantitated with unprecedented precision, facilitating efforts to better understand how histones contribute to the epigenetic regulation of gene transcription and other nuclear processes. It is therefore crucial that intact MS profiles accurately reflect the levels of variants and modified forms present in a given cell type or cell state for the full benefit of such efforts to be realized. Here we show that partial oxidation of Met and Cys residues in histone samples prepared by conventional methods, together with oxidation that can accrue during storage or during chip-based automated nanoflow electrospray ionization, confounds MS analysis by altering the intact MS profile as well as hindering posttranslational modification localization after MS/MS. We also describe an optimized performic acid oxidation procedure that circumvents these problems without catalyzing additional oxidations or altering the levels of posttranslational modifications common in histones. MS and MS/MS of HeLa cell core histones confirmed that Met and Cys were the only residues oxidized and that complete oxidation restored true intact abundance ratios and significantly enhanced MS/MS data quality. This allowed for the unequivocal detection, at the intact molecule level, of novel combinatorially modified forms of H4 that would have been missed otherwise. Oxidation also enhanced the separation of human core histones by reverse phase chromatography and decreased the levels of salt-adducted forms observed in ESI-FTMS. This method represents a simple and easily automated means for enhancing the accuracy and sensitivity of top down analyses of combinatorially modified forms of histones that may also be of benefit for top down or bottom up analyses of other proteins.
Study of Automated Module Fabrication for Lightweight Solar Blanket Utilization
NASA Technical Reports Server (NTRS)
Gibson, C. E.
1979-01-01
Cost-effective automated techniques for accomplishing the titled purpose, based on existing in-house capability, are described. As a measure of the considered automation, the production of a 50-kilowatt solar array blanket, exclusive of support and deployment structure, within an eight-month fabrication period was used. Solar cells considered for this blanket were 2 x 4 x .02 cm wrap-around cells and 2 x 2 x .005 cm and 3 x 3 x .005 cm standard bar contact thin cells, all with welded contacts. Existing fabrication processes are described, the rationale for each process is discussed, and the capability for further automation is assessed.
19 CFR 128.23 - Entry requirements.
Code of Federal Regulations, 2010 CFR
2010-04-01
... utilizing the procedures in this part shall comply with the requirements of the Customs Automated Commercial System (ACS). These requirements include those under the Automated Manifest System (AMS), Cargo Selectivity, Statement Processing, the Automated Broker Interface System (ABI), and enhancements of ACS. (2...
Banks, Victoria A; Stanton, Neville A
2015-01-01
Automated assistance in driving emergencies aims to improve the safety of our roads by avoiding or mitigating the effects of accidents. However, the behavioural implications of such systems remain unknown. This paper introduces the driver decision-making in emergencies (DDMiEs) framework to investigate how the level and type of automation may affect driver decision-making and subsequent responses to critical braking events using network analysis to interrogate retrospective verbalisations. Four DDMiE models were constructed to represent different levels of automation within the driving task and its effects on driver decision-making. Findings suggest that whilst automation does not alter the decision-making pathway (e.g. the processes between hazard detection and response remain similar), it does appear to significantly weaken the links between information-processing nodes. This reflects an unintended yet emergent property within the task network that could mean that we may not be improving safety in the way we expect. This paper contrasts models of driver decision-making in emergencies at varying levels of automation using the Southampton University Driving Simulator. Network analysis of retrospective verbalisations indicates that increasing the level of automation in driving emergencies weakens the link between information-processing nodes essential for effective decision-making.
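The network analysis of retrospective verbalisations described above can be pictured as counting transitions between coded information-processing nodes and comparing link strengths across automation levels. The node labels and coded sequences below are entirely hypothetical, intended only to show the mechanics, not the study's coding scheme.

```python
from collections import Counter

def link_strengths(sequences):
    """Count directed transitions between consecutive coded nodes."""
    counts = Counter()
    for seq in sequences:
        counts.update(zip(seq, seq[1:]))  # adjacent pairs = network links
    return counts

# Hypothetical coded verbalisations: manual vs. highly automated driving
manual = [["perceive", "assess", "decide", "brake"],
          ["perceive", "assess", "decide", "steer"]]
automated = [["perceive", "decide", "brake"],
             ["perceive", "brake"]]

m, a = link_strengths(manual), link_strengths(automated)
```

Comparing the two Counters exposes a link (perceive to assess) that is present in the manual condition but absent under automation, which is the kind of weakened information-processing link the DDMiE models describe.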
Bayesian ISOLA: new tool for automated centroid moment tensor inversion
NASA Astrophysics Data System (ADS)
Vackář, Jiří; Burjánek, Jan; Gallovič, František; Zahradník, Jiří; Clinton, John
2017-04-01
Focal mechanisms are important for understanding the seismotectonics of a region, and they serve as a basic input for seismic hazard assessment. Usually, the point source approximation and the moment tensor (MT) are used. We have developed a new, fully automated tool for centroid moment tensor (CMT) inversion in a Bayesian framework. It includes automated data retrieval, data selection in which station components with instrumental disturbances or poor signal-to-noise ratios are rejected, and full-waveform inversion in a space-time grid around a provided hypocenter. The method is innovative in the following aspects: (i) The CMT inversion is fully automated; no user interaction is required, although the details of the process can be visually inspected later in the many figures that are automatically plotted. (ii) The automated process includes detection of disturbances based on the MouseTrap code, so disturbed recordings do not affect the inversion. (iii) A data covariance matrix calculated from pre-event noise yields an automated weighting of the station recordings according to their noise levels and also serves as an automated frequency filter suppressing noisy frequencies. (iv) A Bayesian approach is used, so not only the best solution is obtained, but also the posterior probability density function. (v) A space-time grid search, effectively combined with least-squares inversion of the moment tensor components, speeds up the inversion and allows more accurate results to be obtained than with stochastic methods. The method has been tested on synthetic and observed data, and validated by comparison with manually processed moment tensors of all events with M≥3 in the Swiss catalogue over 16 years, using data available at the Swiss data center (http://arclink.ethz.ch). The quality of the results of the presented automated process is comparable with careful manual processing of data.
The software package, programmed in Python, has been designed to be as versatile as possible in order to be applicable in various networks ranging from local to regional. The method can be applied either to the everyday network data flow or to large pre-existing earthquake catalogues and data sets.
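For a fixed trial source position and time, the least-squares moment-tensor step of point (v), weighted by the noise covariance of point (iii), reduces to generalized least squares: m = (GᵀC⁻¹G)⁻¹GᵀC⁻¹d. The NumPy sketch below uses random stand-in matrices, not real Green's functions or waveform data, purely to show how noisy channels are down-weighted by C⁻¹.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear problem: d = G m + noise, 6 independent MT components
n_data, n_mt = 60, 6
G = rng.standard_normal((n_data, n_mt))         # stand-in Green's functions
m_true = np.array([1.0, -0.5, 0.2, 0.0, 0.3, -0.1])
C = np.diag(rng.uniform(0.01, 0.04, n_data))    # noise covariance (diagonal here)
d = G @ m_true + rng.multivariate_normal(np.zeros(n_data), C)

# Generalized least squares: channels with larger noise variance count less
Cinv = np.linalg.inv(C)
m_hat = np.linalg.solve(G.T @ Cinv @ G, G.T @ Cinv @ d)
residual = d - G @ m_hat
```

In the real tool, this solve is repeated at every node of the space-time grid, and the resulting misfits feed the Bayesian posterior rather than just the single best solution.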
Results Of Automating A Photolithography Cell In A Clean Tunnel
NASA Astrophysics Data System (ADS)
June, David H.
1987-01-01
A prototype automated photobay was installed in an existing fab area utilizing flexible material handling techniques within a clean tunnel. The project objective was to prove design concepts of automated cassette-to-cassette handling within a clean tunnel that isolated operators from the wafers being processed. Material handling was by monorail track transport system to feed cassettes to pick and place robots. The robots loaded and unloaded cassettes of wafers to each of the various pieces of process equipment. The material handling algorithms, recipe downloading and statistical process control functions were all performed by custom software on the photobay cell controller.
NASA Astrophysics Data System (ADS)
Hopp, T.; Zapf, M.; Ruiter, N. V.
2014-03-01
An essential processing step for comparison of Ultrasound Computer Tomography images to other modalities, as well as for use in further image processing, is to segment the breast from the background. In this work we present a (semi-)automated 3D segmentation method which is based on the detection of the breast boundary in coronal slice images and a subsequent surface fitting. The method was evaluated using a software phantom and in-vivo data. The fully automatically processed phantom results showed that a segmentation of approx. 10% of the slices of a dataset is sufficient to recover the overall breast shape. Application to 16 in-vivo datasets was performed successfully using semi-automated processing, i.e. using a graphical user interface for manual corrections of the automated breast boundary detection. The processing time for the segmentation of an in-vivo dataset could be reduced by a factor of four compared to a fully manual segmentation. Comparison to manually segmented images identified a smoother surface for the semi-automated segmentation, with an average of 11% differing voxels and an average surface deviation of 2 mm. Limitations of the edge detection may be overcome by future updates of the KIT USCT system, allowing fully automated use of our segmentation approach.
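The finding that roughly 10% of slices suffice rests on the smoothness of the breast surface: boundaries detected in a sparse subset of coronal slices can be interpolated across the remaining slice positions. A toy 1D analogue (one boundary radius per slice, with an assumed smooth profile in arbitrary units, standing in for the paper's 3D surface fit):

```python
import numpy as np

n_slices = 100
z = np.arange(n_slices)
# Assumed smooth "breast boundary radius" per coronal slice
true_radius = 40.0 + 10.0 * np.sin(np.pi * z / (n_slices - 1))

# Detect the boundary in only ~10% of the slices (plus the final one)...
sampled = np.r_[z[::10], n_slices - 1]
detected = true_radius[sampled]

# ...and recover the remaining slices by interpolation
# (the real method fits a 3D surface instead of a 1D profile)
recovered = np.interp(z, sampled, detected)
max_err = np.max(np.abs(recovered - true_radius))
```

Even with nine of every ten slices skipped, the interpolated profile stays within a small fraction of a unit of the true one in this toy setting, which is why sparse boundary detection is enough to recover the overall shape.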
Multi-step high-throughput conjugation platform for the development of antibody-drug conjugates.
Andris, Sebastian; Wendeler, Michaela; Wang, Xiangyang; Hubbuch, Jürgen
2018-07-20
Antibody-drug conjugates (ADCs) form a rapidly growing class of biopharmaceuticals which attracts a lot of attention throughout the industry due to its high potential for cancer therapy. They combine the specificity of a monoclonal antibody (mAb) with the cell-killing capacity of highly cytotoxic small-molecule drugs. Site-specific conjugation approaches involve a multi-step process for covalent linkage of antibody and drug via a linker. Despite the range of parameters that have to be investigated, high-throughput methods are scarcely used so far in ADC development. In this work, an automated high-throughput platform for a site-specific multi-step conjugation process on a liquid-handling station is presented, using a model conjugation system. A high-throughput solid-phase buffer exchange was successfully incorporated for reagent removal by utilization of a batch cation exchange step. To ensure accurate screening of conjugation parameters, an intermediate UV/Vis-based concentration determination was established, including feedback to the process. For conjugate characterization, a high-throughput-compatible reversed-phase chromatography method with a runtime of 7 min and no sample preparation was developed. Two case studies illustrate its efficient use for mapping the operating space of a conjugation process. Due to the degree of automation and parallelization, the platform is capable of significantly reducing process development efforts and material demands and of shortening development timelines for antibody-drug conjugates. Copyright © 2018 Elsevier B.V. All rights reserved.
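The intermediate UV/Vis concentration determination with feedback can be sketched with the Beer-Lambert law, c = A/(ε·l), followed by a c₁V₁ = c₂V₂ dilution so that the next conjugation step starts at its target concentration. The extinction coefficient, path length, and volumes below are illustrative values of typical magnitude, not those of the study's antibody.

```python
def concentration_mg_ml(a280: float, ext_coeff: float = 1.4, path_cm: float = 1.0) -> float:
    """Beer-Lambert: A = eps * l * c  ->  c = A / (eps * l).

    ext_coeff is a hypothetical mass extinction coefficient in (mg/mL)^-1 cm^-1,
    of the magnitude typical for an IgG at 280 nm.
    """
    return a280 / (ext_coeff * path_cm)

def dilution_volume_ul(volume_ul: float, c_now: float, c_target: float) -> float:
    """Buffer volume to add so c drops from c_now to c_target (c1*V1 = c2*V2)."""
    if c_now <= c_target:
        return 0.0                       # already at/below target: no feedback action
    return volume_ul * (c_now / c_target - 1.0)

c = concentration_mg_ml(a280=2.8)        # measured absorbance -> concentration
add = dilution_volume_ul(100.0, c, 1.0)  # buffer to add to a 100 uL well
```

On a liquid-handling station, the computed `add` volume is what closes the feedback loop: the robot pipettes that amount of buffer before proceeding to the next conjugation step.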
Automated Production of Movies on a Cluster of Computers
NASA Technical Reports Server (NTRS)
Nail, Jasper; Le, Duong; Nail, William L.; Nail, William
2008-01-01
A method of accelerating and facilitating production of video and film motion-picture products, and software and generic designs of computer hardware to implement the method, are undergoing development. The method provides for automation of most of the tedious and repetitive tasks involved in editing and otherwise processing raw digitized imagery into final motion-picture products. The method was conceived to satisfy requirements, in industrial and scientific testing, for rapid processing of multiple streams of simultaneously captured raw video imagery into documentation in the form of edited video imagery and video-derived data products for technical review and analysis. In the production of such video technical documentation, unlike in production of motion-picture products for entertainment, (1) it is often necessary to produce multiple video-derived data products, (2) there are usually no second chances to repeat acquisition of raw imagery, (3) it is often desired to produce final products within minutes rather than hours, days, or months, and (4) consistency and quality, rather than aesthetics, are the primary criteria for judging the products. In the present method, the workflow has both serial and parallel aspects: processing can begin before all the raw imagery has been acquired, each video stream can be subjected to different stages of processing simultaneously on different computers that may be grouped into one or more cluster(s), and the final product may consist of multiple video streams. Results of processing on different computers are shared, so that workers can collaborate effectively.
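The serial-plus-parallel workflow described above, with independent streams moving through successive processing stages, can be miniaturized as a worker pool applying a per-stream pipeline. The stage functions below are trivial placeholders (the real system runs video-editing stages on a cluster of computers), and a thread pool stands in for that cluster.

```python
from concurrent.futures import ThreadPoolExecutor

def decode(frames):      # placeholder stage standing in for real video decoding
    return [f * 2 for f in frames]

def annotate(frames):    # placeholder stage standing in for overlay/edit steps
    return [f + 1 for f in frames]

def pipeline(stream):
    """Run one raw stream through every stage; streams are independent."""
    name, frames = stream
    return name, annotate(decode(frames))

# Three simultaneously captured raw streams (toy integer 'frames')
raw = [("cam_a", [1, 2]), ("cam_b", [3]), ("cam_c", [4, 5, 6])]
with ThreadPoolExecutor(max_workers=3) as pool:
    products = dict(pool.map(pipeline, raw))
```

Because each stream's pipeline is independent, streams can be dispatched as soon as they are captured, matching the point that processing can begin before all raw imagery has been acquired.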
The logic of automated glycan assembly.
Seeberger, Peter H
2015-05-19
Carbohydrates are the most abundant biopolymers on earth and part of every living creature. Glycans are essential as materials for nutrition and for information transfer in biological processes. To date, a detailed correlation between glycan structure and glycan function has been established in only a few cases. A molecular understanding of glycan function will require pure glycans for biological, immunological, and structural studies. Given the immense structural complexity of glycans found in living organisms and the lack of amplification methods or expression systems, chemical synthesis is the only means to access usable quantities of pure glycan molecules. While the solid-phase synthesis of DNA and peptides has been routine for decades, access to glycans has been technically difficult, time-consuming, and confined to a few expert laboratories. In this Account, the development of a comprehensive approach to the automated synthesis of all classes of mammalian glycans, including glycosaminoglycans and glycosylphosphatidylinositol (GPI) anchors, as well as bacterial and plant carbohydrates, is described. A conceptual advance concerning the logic of glycan assembly was required in order to enable automated execution of the synthetic process. Based on the central glycosidic bond forming reaction, a general concept for the protecting groups and leaving groups has been developed. Building blocks were identified that can be procured on a large scale, are stable for prolonged periods of time, and upon activation result in high yields and selectivities. A coupling-capping and deprotection cycle was invented that can be executed by an automated synthesis instrument. Straightforward postsynthetic protocols for cleavage from the solid support as well as purification of conjugation-ready oligosaccharides have been established.
Introduction of methods to install selectively a wide variety of glycosidic linkages has enabled the rapid assembly of linear and branched oligo- and polysaccharides as large as 30-mers. Fast, reliable access to defined glycans that are ready for conjugation has given rise to glycan arrays, glycan probes, and synthetic glycoconjugate vaccines. While an ever-increasing variety of glycans is accessible by automated synthesis, further methodological advances in carbohydrate chemistry are needed to reach all possible glycans found in nature. These tools are beginning to fundamentally impact not only the medical but also the materials aspects of the glycosciences.
NASA Technical Reports Server (NTRS)
Mulhall, B. D. L.
1980-01-01
The functions are identified and described in chart form as a tree in which the basic functions, to 'Provide National Identification Service,' are shown at the top. The lower levels of the tree branch out to indicate functions and sub-functions. Symbols are used to indicate whether or not a function was automated in the AIDS 1 or 2 system or is planned to be automated in the AIDS 3 system. The tree chart is shown in detail.
Polonchuk, Liudmila
2014-01-01
Patch-clamping is a powerful technique for investigating ion channel function and regulation. However, its low throughput has hampered profiling of large compound series in early drug development. Fortunately, automation has revolutionized the area of experimental electrophysiology over the past decade. Whereas the first automated patch-clamp instruments using planar patch-clamp technology demonstrated rather moderate throughput, a few second-generation automated platforms recently launched by various companies have significantly increased the ability to form a high number of high-resistance seals. Among them is SyncroPatch(®) 96 (Nanion Technologies GmbH, Munich, Germany), a fully automated giga-seal patch-clamp system with the highest throughput on the market. By recording from up to 96 cells simultaneously, the SyncroPatch(®) 96 makes it possible to substantially increase throughput without compromising data quality. This chapter describes features of this innovative automated electrophysiology system and the protocols used for a successful transfer of the established hERG assay to this high-throughput automated platform.
78 FR 1219 - Agency Information Collection Activities: Delivery Ticket
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-08
... appropriate automated, electronic, mechanical, or other technological techniques or other forms of information... imported merchandise. CBP Form 6043 is filled out by warehouse proprietors...
Wess, Bernard P.; Jacobson, Gary
1987-01-01
In the process of forming a new medical malpractice reinsurance company, the authors analyzed thousands of medical malpractice cases, settlements, and verdicts. The evidence of those analyses indicated that the medical malpractice crisis is (1) emerging nation- and world-wide, (2) exacerbated by but not primarily a result of “predatory” legal action, (3) statistically determined by a small percentage of physicians and procedures, (4) overburdened with data but poor on information, and (5) subject to classic forms of quality control and automation. The management information system developed to address this problem features a tiered database architecture to accommodate the medical, administrative, procedural, statistical, and actuarial analyses necessary to predict claims from untoward events, not merely to report them.
NASA Technical Reports Server (NTRS)
Chien, Steve A.
1996-01-01
A key obstacle hampering fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must compare favorably in terms of software lifecycle costs to other means of automation, such as scripts or rule-based expert systems. This paper describes a planning application for automated image processing and our overall approach to knowledge acquisition for this application.
1990-03-29
This is our final report on the Audit of Automated Data Processing Support of Investigative and Security Missions at the Defense Investigative Service for your information and use. Comments on a draft of this report were considered in preparing the final report. The audit was made from May through October 1989. The objectives of the audit were to determine if the Defense Investigative Service (DIS) was effectively managing automated data
Spitzer Telemetry Processing System
NASA Technical Reports Server (NTRS)
Stanboli, Alice; Martinez, Elmain M.; McAuley, James M.
2013-01-01
The Spitzer Telemetry Processing System (SirtfTlmProc) was designed to address the objectives of JPL's Multi-mission Image Processing Lab (MIPL) in processing spacecraft telemetry and distributing the resulting data to the science community. To minimize costs and maximize operability, the software design focused on automated error recovery, performance, and information management. The system processes telemetry from the Spitzer spacecraft and delivers Level 0 products to the Spitzer Science Center. SirtfTlmProc is a unique system, featuring automated error notification and recovery and a real-time continuous service that can go quiescent after periods of inactivity. The software can process 2 GB of telemetry and deliver Level 0 science products to the end user in four hours. It provides analysis tools so the operator can manage the system and troubleshoot problems. It automates telemetry processing in order to reduce staffing costs.
Computer transformation of partial differential equations into any coordinate system
NASA Technical Reports Server (NTRS)
Sullivan, R. D.
1977-01-01
The use of tensors to provide a compact way of writing partial differential equations in a form valid in all coordinate systems is discussed. In order to find solutions to the equations and their boundary conditions, they must be expressed in terms of the coordinate system under consideration. The process of arriving at these expressions from the tensor formulation was automated by a software system, TENSR. An allied system that analyzes the resulting expressions term by term and drops those that are negligible is also described.
Towards automated visual flexible endoscope navigation.
van der Stap, Nanda; van der Heijden, Ferdinand; Broeders, Ivo A M J
2013-10-01
The design of flexible endoscopes has not changed significantly in the past 50 years. A trend is observed towards wider application of flexible endoscopes, with an increasing role in complex intraluminal therapeutic procedures. The nonintuitive and nonergonomic steering mechanism now forms a barrier to extending flexible endoscope applications. Automating the navigation of endoscopes could be a solution to this problem. This paper summarizes the current state of the art in image-based navigation algorithms. The objectives are to find the most promising navigation system(s) to date and to indicate fields for further research. A systematic literature search was performed using three general search terms in two medical-technological literature databases. Papers were included according to the inclusion criteria. A total of 135 papers were analyzed; ultimately, 26 were included. Navigation often is based on visual information, which means steering the endoscope using the images that the endoscope produces. Two main techniques are described: lumen centralization and visual odometry. Although the research results are promising, no successful, commercially available automated flexible endoscopy system exists to date. Automated systems that employ conventional flexible endoscopes show the most promising prospects in terms of cost and applicability. To produce such a system, the research focus should lie on finding low-cost mechatronics and technologically robust steering algorithms. Additional functionality and increased efficiency can be obtained through software development. The first priority is to find real-time, robust steering algorithms. These algorithms need to handle bubbles, motion blur, and other image artifacts without disrupting the steering process.
Onan, Arif; Simsek, Nurettin
2017-04-01
Automated external defibrillators are pervasive computing devices designed for the treatment and management of sudden cardiac arrest. This study aims to explain users' actual use behavior in teams formed from different professions after a short span of interaction with an automated external defibrillator. Before the intervention, all the participants were certified under the American Heart Association Basic Life Support program for healthcare providers. A statistically significant difference was revealed in mean individual automated external defibrillator technical skills between uniprofessional and interprofessional groups. The technical automated external defibrillator team scores were greater for groups with interprofessional than for those with uniprofessional education. The nontechnical automated external defibrillator skills of interprofessional and uniprofessional teams differed in favor of the interprofessional teams. Students positively accept automated external defibrillators if well-defined and validated training opportunities to use them expertly are available. Uniprofessional teams were successfully supported by their members and thereby used the automated external defibrillator effectively. Furthermore, the interprofessional approach resulted in teamwork as effective as that of the uniprofessional approach.
ICECAP: an integrated, general-purpose, automation-assisted IC50/EC50 assay platform.
Li, Ming; Chou, Judy; King, Kristopher W; Jing, Jing; Wei, Dong; Yang, Liyu
2015-02-01
IC50 and EC50 values are commonly used to evaluate drug potency. Mass spectrometry (MS)-centric bioanalytical and biomarker labs are now conducting IC50/EC50 assays, which, if done manually, are tedious and error-prone. Existing bioanalytical sample-preparation automation systems cannot meet IC50/EC50 assay throughput demand. A general-purpose, automation-assisted IC50/EC50 assay platform was developed to automate the calculation of spiking-solution and matrix-solution preparation schemes, the actual preparation of spiking and matrix solutions, and the flexible sample-extraction procedures after incubation. In addition, the platform automates the data extraction, nonlinear regression curve fitting, computation of IC50/EC50 values, graphing, and reporting. The automation-assisted IC50/EC50 assay platform can process the whole class of assays of varying assay conditions. In each run, the system can handle up to 32 compounds and up to 10 concentration levels per compound, and it greatly improves IC50/EC50 assay experimental productivity and data processing efficiency. © 2014 Society for Laboratory Automation and Screening.
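The curve-fitting step mentioned in the abstract can be illustrated with a minimal sketch. The four-parameter logistic (4PL) model below is the standard functional form for IC50/EC50 curves; the crude grid search stands in for the platform's actual nonlinear regression engine, and all names and data here are hypothetical, not part of the published system.

```python
def four_pl(c, bottom, top, ic50, hill):
    """Four-parameter logistic model: response at concentration c.
    At c == ic50 the response is exactly midway between top and bottom."""
    return bottom + (top - bottom) / (1.0 + (c / ic50) ** hill)

def fit_ic50(concs, responses, bottom=0.0, top=100.0, hill=1.0):
    """Crude 1-D grid search for IC50 with the other parameters fixed.
    A production system would fit all four parameters by nonlinear
    least squares; this is only an illustration of the objective."""
    candidates = [10 ** (e / 10.0) for e in range(-40, 41)]  # 1e-4 .. 1e4
    def sse(ic50):
        return sum((r - four_pl(c, bottom, top, ic50, hill)) ** 2
                   for c, r in zip(concs, responses))
    return min(candidates, key=sse)

# synthetic dose-response data generated from a known IC50 of 1.0
concs = [0.01, 0.1, 0.3, 1.0, 3.0, 10.0, 100.0]
responses = [four_pl(c, 0.0, 100.0, 1.0, 1.0) for c in concs]
print(fit_ic50(concs, responses))  # → 1.0
```

The grid search recovers the generating IC50 because the sum of squared errors vanishes exactly at the true value.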
Automating expert role to determine design concept in Kansei Engineering
NASA Astrophysics Data System (ADS)
Lokman, Anitawati Mohd; Haron, Mohammad Bakri Che; Abidin, Siti Zaleha Zainal; Khalid, Noor Elaiza Abd
2016-02-01
Affect has become imperative in product quality. In the affective design field, Kansei Engineering (KE) has been recognized as a technology that enables discovery of consumers' emotions and formulation of guidelines for designing products that win consumers in the competitive market. Albeit a powerful technology, there is no rule of thumb for its analysis and interpretation process. KE expertise is required to determine sets of related Kansei and the significant concepts of emotion. Many research endeavors are handicapped by the limited number of available and accessible KE experts. This work simulates the role of experts with the Natphoric algorithm, thus providing a sound solution to the complexity and flexibility in KE. The algorithm is designed to learn the process using training datasets taken from previous KE research works. A framework for automated KE is then designed to realize the development of an automated KE system. A comparative analysis is performed to determine the feasibility of the developed prototype to automate the process. The results show that the significant Kansei determined by manual KE implementation and by the automated process are highly similar. KE research advocates will benefit from this system to automatically determine significant design concepts.
Automated data acquisition technology development: Automated modeling and control development
NASA Technical Reports Server (NTRS)
Romine, Peter L.
1995-01-01
This report documents the completion of, and improvements made to, the software developed for automated data acquisition and automated modeling and control development on the Texas Micro rack-mounted PCs. This research was initiated because the Metal Processing Branch of NASA Marshall Space Flight Center identified a need for a mobile data acquisition and data analysis system customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument strictly for data acquisition and data analysis. In addition to the data acquisition functions described in this report, WMS also supports many functions associated with process control. The hardware and software requirements for an automated acquisition system for welding process parameters, welding equipment checkout, and welding process modeling were determined in 1992. From these recommendations, NASA purchased the necessary hardware and software. The new welding acquisition system is designed to collect welding parameter data and perform analysis to determine the voltage-versus-current arc-length relationship for VPPA welding. Once the results of this analysis are obtained, they can be used to develop a RAIL function to control welding startup and shutdown without torch crashing.
Workflow Automation: A Collective Case Study
ERIC Educational Resources Information Center
Harlan, Jennifer
2013-01-01
Knowledge management has proven to be a sustainable competitive advantage for many organizations. Knowledge management systems are abundant, with multiple functionalities. The literature reinforces the use of workflow automation with knowledge management systems to benefit organizations; however, it was not known if process automation yielded…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-28
... Collection Activities: Business Transformation--Automated Integrated Operating Environment (IOE), New... Transformation--Integrated Operating Environment (IOE); OMB Control No. 1615-NEW. SUMMARY: USCIS is developing an automated Integrated Operating Environment (IOE) to process benefit applications. The IOE will collect...
[Complex automatic data processing in multi-profile hospitals].
Dovzhenko, Iu M; Panov, G D
1990-01-01
The computerization of data processing in multi-disciplinary hospitals is the key factor in raising the quality of medical care provided to the population, intensifying the work of the personnel, improving the curative and diagnostic process, and improving the use of resources. Even limited experience with complex computerization at the Botkin Hospital indicates that, owing to the automated system, the quality of data processing is being improved, a high level of patient examination is being provided, speedy training of young specialists is being achieved, and conditions are being created for the continuing education of physicians through the analysis of their own activity. At large hospitals, a comprehensive solution of administrative and curative-diagnostic tasks on the basis of a hospital-wide display network and a hospital-wide data bank is the most promising form of computerization.
Modular workcells: modern methods for laboratory automation.
Felder, R A
1998-12-01
Laboratory automation is beginning to become an indispensable survival tool for laboratories facing difficult market competition. However, estimates suggest that only 8% of laboratories will be able to afford total laboratory automation systems. Therefore, automation vendors have developed alternative hardware configurations called 'modular automation' to fit the smaller laboratory. Modular automation consists of consolidated analyzers, integrated analyzers, modular workcells, and pre- and post-analytical automation. These terms will be defined in this paper. Using a modular automation model, the automated core laboratory will become a site where laboratory data is evaluated by trained professionals to provide diagnostic information to practising physicians. Modern software information management and process control tools will complement modular hardware. Proper standardization that will allow vendor-independent modular configurations will assure the success of this revolutionary new technology.
Fully automated corneal endothelial morphometry of images captured by clinical specular microscopy
NASA Astrophysics Data System (ADS)
Bucht, Curry; Söderberg, Per; Manneberg, Göran
2010-02-01
The corneal endothelium serves as the posterior barrier of the cornea. Factors such as clarity and refractive properties of the cornea are in direct relationship to the quality of the endothelium. The endothelial cell density is considered the most important morphological factor of the corneal endothelium. Pathological conditions and physical trauma may threaten the endothelial cell density to such an extent that the optical property of the cornea, and thus clear eyesight, is threatened. Diagnosis of the corneal endothelium through morphometry is an important part of several clinical applications. Morphometry of the corneal endothelium is presently carried out by semi-automated analysis of pictures captured by a Clinical Specular Microscope (CSM). Because of the occasional need for operator involvement, this process can be tedious, having a negative impact on sampling size. This study was dedicated to the development and use of fully automated analysis of a very large range of images of the corneal endothelium, captured by CSM, using Fourier analysis. Software was developed in the mathematical programming language Matlab. Pictures of the corneal endothelium, captured by CSM, were read into the analysis software. The software automatically performed digital enhancement of the images, normalizing lights and contrasts. The digitally enhanced images of the corneal endothelium were Fourier transformed, using the fast Fourier transform (FFT), and stored as new images. Tools were developed and applied for identification and analysis of relevant characteristics of the Fourier transformed images. The data obtained from each Fourier transformed image was used to calculate the mean cell density of its corresponding corneal endothelium. The calculation was based on well-known diffraction theory. Results in the form of estimated cell density of the corneal endothelium were obtained, using fully automated analysis software on 292 images captured by CSM.
The cell density obtained by the fully automated analysis was compared to the cell density obtained from classical, semi-automated analysis, and a relatively strong correlation was found.
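The diffraction-theory idea behind the abstract — a regular cell mosaic produces a peak in the Fourier magnitude spectrum at a spatial frequency inversely proportional to the cell spacing, so density scales as one over spacing squared — can be sketched as follows. This is an illustrative NumPy reconstruction under stated assumptions, not the authors' Matlab software; the synthetic stripe pattern and the function name are hypothetical.

```python
import numpy as np

def estimate_cell_pitch(img):
    """Estimate the dominant cell spacing (in pixels) from the peak of the
    2-D FFT magnitude spectrum. Cell density then scales as 1 / pitch**2
    once the pixel-to-micrometre scale of the microscope is applied."""
    ny, nx = img.shape
    spectrum = np.abs(np.fft.fft2(img - img.mean()))
    spectrum[0, 0] = 0.0                      # suppress any residual DC term
    half = spectrum[: ny // 2 + 1, : nx // 2 + 1]  # non-negative frequencies
    fy, fx = np.unravel_index(np.argmax(half), half.shape)
    radial = np.hypot(fy / ny, fx / nx)       # peak frequency, cycles/pixel
    return 1.0 / radial                       # pixels per cell

# synthetic "endothelial mosaic": vertical stripes with a 16-pixel pitch
n, pitch = 128, 16
x = np.arange(n)
img = np.tile(np.cos(2 * np.pi * x / pitch), (n, 1))
est = estimate_cell_pitch(img)
print(est)  # → 16.0
```

A real endothelial image yields a ring rather than a single peak, so the published method integrates over orientations; the single-peak case above only demonstrates the frequency-to-spacing relationship.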
Automated classification of Acid Rock Drainage potential from Corescan drill core imagery
NASA Astrophysics Data System (ADS)
Cracknell, M. J.; Jackson, L.; Parbhakar-Fox, A.; Savinova, K.
2017-12-01
Classification of the acid forming potential of waste rock is important for managing environmental hazards associated with mining operations. Current methods for the classification of acid rock drainage (ARD) potential usually involve labour intensive and subjective assessment of drill core and/or hand specimens. Manual methods are subject to operator bias, human error and the amount of material that can be assessed within a given time frame is limited. The automated classification of ARD potential documented here is based on the ARD Index developed by Parbhakar-Fox et al. (2011). This ARD Index involves the combination of five indicators: A - sulphide content; B - sulphide alteration; C - sulphide morphology; D - primary neutraliser content; and E - sulphide mineral association. Several components of the ARD Index require accurate identification of sulphide minerals. This is achieved by classifying Corescan Red-Green-Blue true colour images into the presence or absence of sulphide minerals using supervised classification. Subsequently, sulphide classification images are processed and combined with Corescan SWIR-based mineral classifications to obtain information on sulphide content, indices representing sulphide textures (disseminated versus massive and degree of veining), and spatially associated minerals. This information is combined to calculate ARD Index indicator values that feed into the classification of ARD potential. Automated ARD potential classifications of drill core samples associated with a porphyry Cu-Au deposit are compared to manually derived classifications and those obtained by standard static geochemical testing and X-ray diffractometry analyses. Results indicate a high degree of similarity between automated and manual ARD potential classifications. Major differences between approaches are observed in sulphide and neutraliser mineral percentages, likely due to the subjective nature of manual estimates of mineral content. 
The automated approach presented here for the classification of ARD potential offers rapid, repeatable and accurate outcomes comparable to manually derived classifications. Methods for automated ARD classifications from digital drill core data represent a step-change for geoenvironmental management practices in the mining industry.
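The ARD Index described above combines five indicator scores (A through E) into a single value that is then mapped to an ARD-potential class. A minimal sketch of that combine-then-threshold structure is given below; the equal weighting, 0-10 scaling, and cut-off values are illustrative assumptions, not the published scheme of Parbhakar-Fox et al. (2011).

```python
def ard_index(a, b, c, d, e):
    """Combine the five indicator scores (assumed pre-scaled to 0-10 each)
    by simple summation. Weighting here is illustrative only."""
    return a + b + c + d + e

def classify_ard(index, thresholds=((10, "low"), (25, "medium"), (50, "high"))):
    """Map an index value to an ARD-potential class using illustrative
    cut-offs; a real deployment would calibrate these against static
    geochemical test results."""
    for cutoff, label in thresholds:
        if index < cutoff:
            return label
    return "extreme"

# e.g. low sulphide content and good neutraliser availability
print(classify_ard(ard_index(2, 1, 1, 3, 2)))  # → low
```

The point of the structure is that each indicator (sulphide content, alteration, morphology, neutraliser content, mineral association) can be derived automatically from classified core imagery and fed into the same deterministic combination, which is what makes the classification repeatable.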
Fast and objective detection and analysis of structures in downhole images
NASA Astrophysics Data System (ADS)
Wedge, Daniel; Holden, Eun-Jung; Dentith, Mike; Spadaccini, Nick
2017-09-01
Downhole acoustic and optical televiewer images, and formation microimager (FMI) logs, are important datasets for structural and geotechnical analyses in the mineral and petroleum industries. Within these data, dipping planar structures appear as sinusoids, often in incomplete form and in abundance. Their detection is a labour intensive and hence expensive task, and as such is a significant bottleneck in data processing, as companies may have hundreds of kilometres of logs to process each year. We present an image analysis system that harnesses the power of automated image analysis and provides an interactive user interface to support the analysis of televiewer images by users with different objectives. Our algorithm rapidly produces repeatable, objective results. We have embedded it in an interactive workflow to complement geologists' intuition and experience in interpreting data, to improve efficiency and assist, rather than replace, the geologist. The main contributions include a new image quality assessment technique for highlighting image areas most suited to automated structure detection and for detecting boundaries of geological zones, and a novel sinusoid detection algorithm for detecting and selecting sinusoids with given confidence levels. Further tools support rapid analysis and additional detection of structures, e.g., detection limited to specific orientations.
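The reason dipping planes appear as sinusoids is geometric: a plane intersecting a cylindrical borehole, unwrapped to azimuth-versus-depth coordinates, traces depth = a·sin θ + b·cos θ + c, which is linear in (a, b, c) and can be fitted by ordinary least squares. The sketch below illustrates only that geometric fact under stated assumptions; it is not the paper's detection algorithm, which must additionally handle incomplete sinusoids, image artifacts, and confidence levels.

```python
import numpy as np

def fit_sinusoid(theta, depth):
    """Least-squares fit of depth(theta) = a*sin(theta) + b*cos(theta) + c.
    hypot(a, b) is the sinusoid amplitude, which together with the borehole
    radius gives the dip angle of the structure; c is its mean depth."""
    A = np.column_stack([np.sin(theta), np.cos(theta), np.ones_like(theta)])
    (a, b, c), *_ = np.linalg.lstsq(A, depth, rcond=None)
    return a, b, c

# synthetic trace of a plane crossing the borehole wall at mean depth 100
theta = np.linspace(0, 2 * np.pi, 60, endpoint=False)
depth = 3.0 * np.sin(theta) + 4.0 * np.cos(theta) + 100.0
a, b, c = fit_sinusoid(theta, depth)
print(round(float(np.hypot(a, b)), 3), round(float(c), 3))  # → 5.0 100.0
```

Because the model is linear in its parameters, each candidate sinusoid can be fitted in closed form, which is what makes exhaustive, repeatable detection over kilometres of log computationally feasible.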
An automated workflow for parallel processing of large multiview SPIM recordings
Schmied, Christopher; Steinbach, Peter; Pietzsch, Tobias; Preibisch, Stephan; Tomancak, Pavel
2016-01-01
Summary: Selective Plane Illumination Microscopy (SPIM) makes it possible to image developing organisms in 3D at unprecedented temporal resolution over long periods of time. The resulting massive amounts of raw image data require extensive processing, performed interactively via dedicated graphical user interface (GUI) applications. The consecutive processing steps can be easily automated and the individual time points can be processed independently, which lends itself to trivial parallelization on a high performance computing (HPC) cluster. Here, we introduce an automated workflow for processing large multiview, multichannel, multiillumination time-lapse SPIM data on a single workstation or in parallel on an HPC cluster. The pipeline relies on snakemake to resolve dependencies among consecutive processing steps and can be easily adapted to any cluster environment for processing SPIM data in a fraction of the time required to collect it. Availability and implementation: The code is distributed free and open source under the MIT license http://opensource.org/licenses/MIT. The source code can be downloaded from github: https://github.com/mpicbg-scicomp/snakemake-workflows. Documentation can be found here: http://fiji.sc/Automated_workflow_for_parallel_Multiview_Reconstruction. Contact: schmied@mpi-cbg.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26628585
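The key property the abstract relies on — each time point can be processed independently — is what makes the parallelization trivial. A minimal sketch of that pattern is shown below using Python's standard `concurrent.futures`; the actual pipeline uses snakemake rules on an HPC cluster, and the stand-in function name here is hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor

def process_timepoint(t):
    """Stand-in for one per-timepoint processing step (e.g. registration
    or fusion). Each call depends only on its own time point's data, so
    the map below can run the calls concurrently without coordination."""
    return f"timepoint_{t:03d}_fused"

timepoints = range(8)
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(process_timepoint, timepoints))
print(results[0])  # → timepoint_000_fused
```

A workflow engine such as snakemake adds what this sketch lacks: declared file dependencies between consecutive steps, resumption after failures, and dispatch of each task to a cluster scheduler instead of a local thread pool.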
Automated Sequence Generation Process and Software
NASA Technical Reports Server (NTRS)
Gladden, Roy
2007-01-01
"Automated sequence generation" (autogen) signifies both a process and software used to automatically generate sequences of commands to operate various spacecraft. The autogen software comprises the autogen script plus the Activity Plan Generator (APGEN) program. APGEN can be used for planning missions and command sequences.
1993-12-10
applied to the 3-component IRIS/IDA data under simulated operational conditions. The result was a reduction in the number of false alarms produced by the automated processing and interpretation system by about 60%.