DOE Office of Scientific and Technical Information (OSTI.GOV)
Aurah, Mirwaise Y.; Roberts, Mark A.
Washington River Protection Solutions (WRPS), operator of the High-Level Radioactive Waste (HLW) Tank Farms at the Hanford Site, is making a technological leap of more than 20 years: replacing systems that were monitored with clipboards and obsolete computer systems, and solving major operations and maintenance hurdles in process automation and information management. While WRPS is fully compliant with procedures and regulations, its current systems are not integrated and do not share data efficiently, hampering how information is obtained and managed.
Vonhofen, Geraldine; Evangelista, Tonya; Lordeon, Patricia
2012-04-01
The traditional method of administering radioactive isotopes to pediatric patients undergoing ictal brain single photon emission computed tomography testing has been by manual injections. This method presents certain challenges for nursing, including time requirements and safety risks. This quality improvement project discusses the implementation of an automated injection system for isotope administration and its impact on staffing, safety, and nursing satisfaction. It was conducted in an epilepsy monitoring unit at a large urban pediatric facility. Results of this project showed a decrease in the number of nurses exposed to radiation and improved nursing satisfaction with the use of the automated injection system. In addition, there was a decrease in the number of nursing hours required during ictal brain single photon emission computed tomography testing.
Ogata, Y; Nishizawa, K
1995-10-01
An automated smear-counting and data-processing system for a life-science laboratory was developed to facilitate routine surveys and eliminate human error, using a notebook computer. The system comprised a personal computer, a liquid scintillation counter, and a well-type NaI(Tl) scintillation counter. The radioactivity of smear samples was measured automatically by these counters. The personal computer received raw signals from the counters through an RS-232C interface. Its software evaluated the surface density of each radioisotope and printed that value, along with other items, as a report. The software was written in Pascal. The system was applied successfully to routine contamination surveys in our facility.
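The surface-density evaluation the report describes can be sketched as below. The counting efficiency, smear area, and removal fraction (and the function name) are illustrative assumptions, not values from the published system, which was implemented in Pascal.

```python
# Hypothetical sketch of a smear surface-density evaluation. The
# efficiency, smear-area, and removal-fraction defaults are assumed
# for illustration, not taken from the original system.

def surface_density(gross_cpm, background_cpm, efficiency,
                    smear_area_cm2=100.0, removal_fraction=0.1):
    """Return removable surface contamination in Bq/cm^2.

    gross_cpm / background_cpm : counts per minute from the counter
    efficiency                 : counting efficiency (counts per decay)
    smear_area_cm2             : area wiped by the smear
    removal_fraction           : fraction of activity picked up by the smear
    """
    net_cps = max(gross_cpm - background_cpm, 0.0) / 60.0  # counts per second
    activity_bq = net_cps / efficiency                     # decays/s on the smear
    return activity_bq / (smear_area_cm2 * removal_fraction)

# One report line for a sample counted at 660 cpm over a 60 cpm background:
d = surface_density(gross_cpm=660.0, background_cpm=60.0, efficiency=0.25)
print(f"smear: {d:.3f} Bq/cm^2")
```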
Joseph, Leena; Das, A P; Ravindra, Anuradha; Kulkarni, D B; Kulkarni, M S
2018-07-01
The 4πβ-γ coincidence method is a powerful and widely used method for determining the absolute activity concentration of radioactive solutions. A new automated liquid-scintillator-based coincidence system has been designed, developed, tested, and established as an absolute standard for radioactivity measurements. Automation is achieved using a PLC (programmable logic controller) and SCADA (supervisory control and data acquisition). A radioactive solution of 60Co was standardized to compare the performance of the automated system with the proportional-counter-based absolute standard maintained in the laboratory. The activity concentrations determined by the two systems were in very good agreement; the new automated system can therefore be used for absolute measurement of the activity concentration of radioactive solutions. Copyright © 2018. Published by Elsevier Ltd.
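The core relation behind the method is that, with background-corrected beta, gamma, and coincidence count rates, the source disintegration rate is independent of the detection efficiencies. A minimal sketch of this ideal relation, omitting the dead-time and resolving-time corrections a real coincidence system applies:

```python
# Ideal 4pi(beta)-gamma coincidence relation: with beta rate Nb, gamma
# rate Ng and coincidence rate Nc (all background-corrected), the
# disintegration rate is N0 = Nb*Ng/Nc, since the efficiencies cancel.
# Dead-time and resolving-time corrections are deliberately omitted.

def coincidence_activity(n_beta, n_gamma, n_coinc):
    """Disintegration rate (Bq) from background-corrected rates (1/s)."""
    return n_beta * n_gamma / n_coinc

def activity_concentration(n_beta, n_gamma, n_coinc, source_mass_g):
    """Activity concentration (Bq/g) for a weighed aliquot of solution."""
    return coincidence_activity(n_beta, n_gamma, n_coinc) / source_mass_g

# If the beta channel sees 80% and the gamma channel 30% of the decays
# of a 1000 Bq source, then Nb=800, Ng=300, Nc=240, and N0 = 1000 Bq
# is recovered; for a 20 mg aliquot this gives the concentration in Bq/g:
print(activity_concentration(800.0, 300.0, 240.0, source_mass_g=0.020))
```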
Stimson, D H R; Pringle, A J; Maillet, D; King, A R; Nevin, S T; Venkatachalam, T K; Reutens, D C; Bhalla, R
2016-09-01
The emphasis on the reduction of gaseous radioactive effluent associated with PET radiochemistry laboratories has increased. Various radioactive gas capture strategies have been employed historically, including expensive automated compression systems. We have implemented a new cost-effective strategy employing gas capture bags with electronic feedback that are integrated with the cyclotron safety system. Our strategy is suitable for multiple automated 18F radiosynthesis modules and individual automated 11C radiosynthesis modules. We describe novel gas capture systems that minimize the risk of human error and are routinely used in our facility.
NASA Technical Reports Server (NTRS)
Matijevic, Jacob R.; Zimmerman, Wayne F.; Dolinsky, Shlomo
1990-01-01
Assembly of electromechanical and electronic equipment (including computers) constitutes test bed for development of advanced robotic systems for remote manipulation. Combines features not found in commercial systems. Its architecture allows easy growth in complexity and level of automation. System national resource for validation of new telerobotic technology. Intended primarily for robots used in outer space, test bed adapted to development of advanced terrestrial telerobotic systems for handling radioactive materials, dangerous chemicals, and explosives.
Dooraghi, Alex A.; Carroll, Lewis; Collins, Jeffrey; ...
2016-03-09
Automated protocols for measuring and dispensing solutions containing radioisotopes are essential not only for providing a safe environment for radiation workers but also for ensuring the accuracy of dispensed radioactivity and an efficient workflow. For this purpose, we have designed ARAS, an automated radioactivity aliquoting system for dispensing solutions containing positron-emitting radioisotopes, with particular focus on fluorine-18 (18F). The key to the system is the combination of a radiation detector measuring radioactivity concentration, in line with a peristaltic pump dispensing known volumes. Results show the combined system demonstrates volume variation within 5% for dispensed volumes of 20 μL or greater. For volumes of 20 μL or greater, the delivered radioactivity agrees with the requested amount, as measured independently with a dose calibrator, to within 2% on average. In conclusion, the integration of the detector and pump in an in-line system leads to a flexible and compact approach that can accurately dispense solutions with radioactivity concentrations ranging from the high values typical of [18F]fluoride produced directly from a cyclotron (~0.1-1 mCi/μL) to the low values typical of batches of [18F]fluoride-labeled radiotracers intended for preclinical mouse scans (~1-10 μCi/μL).
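The dispensing arithmetic behind a detector-plus-pump system of this kind can be sketched as below: the in-line detector supplies a radioactivity concentration, and the pumped volume is chosen so the aliquot holds the requested activity at dispense time. The function names and the simple exponential decay correction are assumptions for illustration, not the authors' implementation.

```python
import math

F18_HALF_LIFE_MIN = 109.77  # fluorine-18 half-life in minutes

def decay_corrected_concentration(conc_uci_per_ul, elapsed_min,
                                  half_life_min=F18_HALF_LIFE_MIN):
    """Concentration now, given a measurement made elapsed_min ago."""
    return conc_uci_per_ul * math.exp(-math.log(2) * elapsed_min / half_life_min)

def volume_to_dispense(requested_uci, conc_uci_per_ul, elapsed_min=0.0):
    """Volume (uL) the pump must deliver for the requested activity."""
    return requested_uci / decay_corrected_concentration(conc_uci_per_ul,
                                                         elapsed_min)

# 100 uCi from a batch measured at 5 uCi/uL one half-life (109.77 min) ago:
print(round(volume_to_dispense(100.0, 5.0, elapsed_min=F18_HALF_LIFE_MIN), 1))
```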
Krauser, Joel; Walles, Markus; Wolf, Thierry; Graf, Daniel; Swart, Piet
2012-01-01
Generation and interpretation of biotransformation data on drugs, i.e., identifying physiologically relevant metabolites, defining metabolic pathways, and elucidating metabolite structures, have become increasingly important to the drug development process. Profiling using a 14C or 3H radiolabel is defined as the chromatographic separation and quantification of drug-related material in a given biological sample derived from an in vitro, preclinical in vivo, or clinical study. Metabolite profiling is a very time-intensive activity, particularly for preclinical in vivo or clinical studies, which have defined limitations on radiation burden and exposure levels. A clear gap exists for certain studies that do not require specialized high-volume automation technologies yet would still clearly benefit from automation. The use of radiolabeled compounds in preclinical and clinical ADME studies, specifically for metabolite profiling and identification, is a very good example. The current lack of automation for measuring low-level radioactivity in metabolite profiling requires substantial capacity, personal attention, and resources from laboratory scientists. To help address these challenges and improve efficiency, we have innovated, developed, and implemented a novel and flexible automation platform that integrates a robotic plate-handling platform, an HPLC or UPLC system, a mass spectrometer, and an automated fraction collector. PMID:22723932
Flow Mapping in a Gas-Solid Riser via Computer Automated Radioactive Particle Tracking (CARPT)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muthanna Al-Dahhan; Milorad P. Dudukovic; Satish Bhusarapu
2005-06-04
Statement of the Problem: Developing and disseminating a general, experimentally validated model of turbulent multiphase fluid dynamics suitable for engineering design in industrial-scale applications of riser reactors and pneumatic conveying requires reliable data on solids trajectories, velocities (averaged and instantaneous), solids holdup distribution, and solids fluxes in the riser as a function of operating conditions. Such data are currently not available on the same system. The Multiphase Fluid Dynamics Research Consortium (MFDRC) was established to address these issues on a chosen example of a circulating fluidized bed (CFB) reactor, which is widely used in the petroleum and chemical industry, including coal combustion. This project addresses the lack of reliable data needed to advance CFB technology. Project Objectives: The objective of this project is to advance understanding of the solids flow pattern and mixing in a well-developed flow region of a gas-solid riser, operated at different gas flow rates and solids loadings, using state-of-the-art non-intrusive measurements. This work creates insight and a reliable database of local solids fluid-dynamic quantities in a pilot-plant-scale CFB, which can then be used to validate and develop phenomenological models of the riser. The study also provides benchmark data for validation of Computational Fluid Dynamics (CFD) codes and their current closures. Technical Approach: The non-invasive Computer Automated Radioactive Particle Tracking (CARPT) technique provides the complete Eulerian solids flow field (time-averaged velocity map and various turbulence parameters such as the Reynolds stresses, turbulent kinetic energy, and eddy diffusivities). It also directly gives the Lagrangian information of the solids flow and yields the true solids residence time distribution (RTD).
Another radiation-based technique, Computed Tomography (CT), yields detailed time-averaged local holdup profiles at various planes. Together, these two techniques can provide the needed local solids flow dynamics for the same setup under identical operating conditions, and the data obtained can be used as a benchmark for the development and refinement of appropriate riser models. For these reasons, the two techniques were implemented in this study on a fully developed section of the riser. To derive global mixing information in the riser, an accurate solids RTD is needed; it was obtained by monitoring the entry and exit of a single radioactive tracer. Other global parameters, such as the Cycle Time Distribution (CTD), the overall solids holdup in the riser, and the solids recycle percentage at the bottom section of the riser, were evaluated from different solids travel-time distributions. In addition, a novel method was applied to measure the overall solids mass flux accurately and in situ.
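The RTD extracted from a tracer entry/exit record is normally reduced by standard moment analysis: the normalized tracer curve is the distribution E(t), and its first moment is the mean solids residence time. A sketch with a trapezoid integration; the sample pulse is illustrative, not data from this study:

```python
# First-moment analysis of a tracer response curve:
#   tau = integral(t * C(t) dt) / integral(C(t) dt)
# computed here with simple trapezoid integration.

def mean_residence_time(times, counts):
    """Mean residence time (same units as `times`) from a tracer curve."""
    def trapz(ys):
        return sum((ys[i] + ys[i + 1]) * (times[i + 1] - times[i]) / 2.0
                   for i in range(len(times) - 1))
    return trapz([t * c for t, c in zip(times, counts)]) / trapz(counts)

# A symmetric pulse centred at t = 2 s has a mean residence time of 2 s:
t = [0.0, 1.0, 2.0, 3.0, 4.0]
c = [0.0, 1.0, 2.0, 1.0, 0.0]
print(mean_residence_time(t, c))
```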
Electronic Data Interchange in Procurement
1990-04-01
contract management and order processing systems. This conversion of automated information to paper and back to automated form is not only slow and...automated purchasing computer and the contractor’s order processing computer through telephone lines, as illustrated in Figure 1-1. Computer-to-computer...into the contractor’s order processing or contract management system. This approach - converting automated information to paper and back to automated
Employment Opportunities for the Handicapped in Programmable Automation.
ERIC Educational Resources Information Center
Swift, Richard; Leneway, Robert
A Computer Integrated Manufacturing System may make it possible for severely disabled people to custom design, machine, and manufacture either wood or metal parts. Programmable automation merges computer aided design, computer aided manufacturing, computer aided engineering, and computer integrated manufacturing systems with automated production…
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1960-09-01
Papers presented at the All-Union Conference on Industrial Applications of Radioactive Isotopes and Nuclear Emissions in the National Economy of the USSR, held April 12 to 16, 1960, in Riga, are surveyed. Short summaries are given on applications of radioactive isotopes and nuclear emissions in prospecting, developing mineral resources, metallurgy, ore-enrichment processes, machine-construction technology, agriculture, food processing, and medicine. Sources of alpha, beta, and gamma radiation for control and automation of processes are also discussed. The full reports from the conference will be published in 1960. (R.V.J.)
Analyzing the Cohesion of English Text and Discourse with Automated Computer Tools
ERIC Educational Resources Information Center
Jeon, Moongee
2014-01-01
This article investigates the lexical and discourse features of English text and discourse with automated computer technologies. Specifically, this article examines the cohesion of English text and discourse with automated computer tools, Coh-Metrix and TEES. Coh-Metrix is a text analysis computer tool that can analyze English text and discourse…
Automation of GIS-based population data-collection for transportation risk analysis
DOT National Transportation Integrated Search
1999-11-01
Estimation of the potential radiological risks associated with highway transport of radioactive materials (RAM) requires input data describing population densities adjacent to all portions of the route to be traveled. Previously, aggregated risks...
Research on robotics by principal investigators of the Robotics Technology Development Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrigan, R.W.
The U.S. Department of Energy's Office of Technology Development has been developing robotics and automation technologies for the clean-up and handling of hazardous and radioactive waste through one of its major elements, Cross Cutting and Advanced Technology (CC&AT) development. CC&AT university research and development programs recognize the strong technology base resident in the university community and sponsor a focused technology research and development program that stresses close interaction between the university sector and the DOE community. This report contains a compilation of research articles by each of 14 principal investigators supported by CC&AT to develop robotics and automation technologies for the clean-up and handling of hazardous and radioactive waste. This research has led to innovative solutions for waste clean-up problems, and it has moved technology out of university laboratories into functioning systems, allowing early evaluation by site technologists.
The Change to Administrative Computing in Schools.
ERIC Educational Resources Information Center
Brown, Daniel J.
1984-01-01
Describes a study of the process of school office automation which focuses on personnel reactions to administrative computing, what users view as advantages and disadvantages of the automation, perceived barriers and facilitators of the change to automation, school personnel view of long term effects, and implications for school computer policy.…
Fundamentals of Library Automation and Technology. Participant Workbook.
ERIC Educational Resources Information Center
Bridge, Frank; Walton, Robert
This workbook presents outlines of topics to be covered during a two-day workshop on the fundamentals for library automation. Topics for the first day include: (1) Introduction; (2) Computer Technology--A Historical Overview; (3) Evolution of Library Automation; (4) Computer Hardware Technology--An Introduction; (5) Computer Software…
System for Computer Automated Typesetting (SCAT) of Computer Authored Texts.
ERIC Educational Resources Information Center
Keeler, F. Laurence
This description of the System for Automated Typesetting (SCAT), an automated system for typesetting text and inserting special graphic symbols in programmed instructional materials created by the computer aided authoring system AUTHOR, provides an outline of the design architecture of the system and an overview including the component…
Pilot interaction with automated airborne decision making systems
NASA Technical Reports Server (NTRS)
Rouse, W. B.; Chu, Y. Y.; Greenstein, J. S.; Walden, R. S.
1976-01-01
An investigation was made of interaction between a human pilot and automated on-board decision making systems. Research was initiated on the topic of pilot problem solving in automated and semi-automated flight management systems and attempts were made to develop a model of human decision making in a multi-task situation. A study was made of allocation of responsibility between human and computer, and discussed were various pilot performance parameters with varying degrees of automation. Optimal allocation of responsibility between human and computer was considered and some theoretical results found in the literature were presented. The pilot as a problem solver was discussed. Finally the design of displays, controls, procedures, and computer aids for problem solving tasks in automated and semi-automated systems was considered.
Code of Federal Regulations, 2010 CFR
2010-10-01
.... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...
Code of Federal Regulations, 2013 CFR
2013-10-01
.... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...
Code of Federal Regulations, 2014 CFR
2014-10-01
.... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...
Code of Federal Regulations, 2012 CFR
2012-10-01
.... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...
Code of Federal Regulations, 2011 CFR
2011-10-01
.... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...
Code of Federal Regulations, 2012 CFR
2012-10-01
..., financial records, and automated data systems; (ii) The data are free from computational errors and are... records, financial records, and automated data systems; (ii) The data are free from computational errors... records, and automated data systems; (ii) The data are free from computational errors and are internally...
Automated image quality assessment for chest CT scans.
Reeves, Anthony P; Xie, Yiting; Liu, Shuang
2018-02-01
Medical image quality needs to be maintained at standards sufficient for effective clinical reading, and automated computer analytic methods may be applied to medical images for quality assessment. For chest CT scans in a lung cancer screening context, an automated quality assessment method is presented that characterizes image noise and image intensity calibration. This is achieved by image measurements in three automatically segmented homogeneous regions of the scan: external air, trachea lumen air, and descending aorta blood. Profiles of CT scanner behavior are also computed. The method has been evaluated on both phantom and real low-dose chest CT scans, and the results show that repeatable noise and calibration measures may be realized by automated computer algorithms. Noise and calibration profiles show relevant differences between scanners and protocols. Automated image quality assessment may be useful for quality control in lung cancer screening and may enable performance improvements to automated computer analysis methods. © 2017 American Association of Physicists in Medicine.
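The region-based measurement this method performs can be sketched as below: mean CT number and noise (standard deviation) are computed in each automatically segmented homogeneous region and compared with that material's expected value. The target HU values, tolerance, and function name are illustrative assumptions, not the published method's parameters.

```python
import statistics

# Assumed nominal CT numbers for the three homogeneous regions
# (air is -1000 HU by definition; the blood value is an approximation).
EXPECTED_HU = {"external_air": -1000.0,
               "trachea_air": -1000.0,
               "aorta_blood": 50.0}

def region_quality(name, voxels, hu_tolerance=50.0):
    """Return (mean_hu, noise, calibrated?) for one segmented region."""
    mean_hu = statistics.fmean(voxels)
    noise = statistics.pstdev(voxels)          # std dev as the noise measure
    calibrated = abs(mean_hu - EXPECTED_HU[name]) <= hu_tolerance
    return mean_hu, noise, calibrated

mean_hu, noise, ok = region_quality("external_air", [-1005, -998, -1002, -995])
print(mean_hu, round(noise, 2), ok)
```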
FUNCTION GENERATOR FOR ANALOGUE COMPUTERS
Skramstad, H.K.; Wright, J.H.; Taback, L.
1961-12-12
An improved analogue computer is designed which can be used to determine the final ground position of radioactive fallout particles in an atomic cloud. The computer determines the fallout pattern on the basis of known wind velocity and direction at various altitudes, and intensity of radioactivity in the mushroom cloud as a function of particle size and initial height in the cloud. The output is then displayed on a cathode-ray tube so that the average or total luminance of the tube screen at any point represents the intensity of radioactive fallout at the geographical location represented by that point. (AEC)
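The computation the analogue machine performed can be recast digitally: a particle released at some altitude falls through stacked wind layers, and its ground displacement is the wind drift accumulated during the time spent falling through each layer. The constant per-particle settling speed, the one-dimensional wind (speed only, no direction), and the sample layer data are simplifying assumptions for illustration.

```python
# Digital sketch of the fallout ground-position calculation: horizontal
# drift is summed over altitude layers, each contributing
# wind_speed * (layer thickness / settling speed).

def ground_displacement(release_alt_m, settling_speed_ms, wind_layers):
    """wind_layers: list of (layer_top_m, layer_bottom_m, wind_speed_ms).

    Returns horizontal drift (m) accumulated from release to the ground.
    """
    drift = 0.0
    for top, bottom, wind in wind_layers:
        upper = min(top, release_alt_m)  # part of this layer actually traversed
        if upper <= bottom:
            continue
        fall_time = (upper - bottom) / settling_speed_ms
        drift += wind * fall_time
    return drift

layers = [(10000.0, 5000.0, 20.0),   # 5-10 km: 20 m/s wind
          (5000.0, 0.0, 10.0)]       # 0-5 km: 10 m/s wind
# A particle released at 10 km falling at 5 m/s drifts 30 km downwind:
print(ground_displacement(10000.0, 5.0, layers))
```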
BEARS: Radioactive Ion Beams at Berkeley
DOE Office of Scientific and Technical Information (OSTI.GOV)
Powell, J.; Joosten, R.; Donahue, C.A.
2000-03-14
A light-isotope radioactive ion beam capability has been added to the 88-Inch Cyclotron at Lawrence Berkeley National Laboratory by coupling to the production cyclotron of the Berkeley Isotope Facility. The connection required the development and construction of a 350 m gas transport system between the two accelerators, as well as automated cryogenic separation of the produced activity. The first beam developed, 11C, has been successfully accelerated with an on-target intensity of 1 × 10^8 ions/sec at energies of around 10 MeV/u.
Automated Help System For A Supercomputer
NASA Technical Reports Server (NTRS)
Callas, George P.; Schulbach, Catherine H.; Younkin, Michael
1994-01-01
Expert-system software developed to provide automated system of user-helping displays in supercomputer system at Ames Research Center Advanced Computer Facility. Users located at remote computer terminals connected to supercomputer and each other via gateway computers, local-area networks, telephone lines, and satellite links. Automated help system answers routine user inquiries about how to use services of computer system. Available 24 hours per day and reduces burden on human experts, freeing them to concentrate on helping users with complicated problems.
Computer Programs For Automated Welding System
NASA Technical Reports Server (NTRS)
Agapakis, John E.
1993-01-01
Computer programs developed for use in controlling automated welding system described in MFS-28578. Together with control computer, computer input and output devices and control sensors and actuators, provide flexible capability for planning and implementation of schemes for automated welding of specific workpieces. Developed according to macro- and task-level programming schemes, which increases productivity and consistency by reducing amount of "teaching" of system by technician. System provides for three-dimensional mathematical modeling of workpieces, work cells, robots, and positioners.
Rapid screening of radioactivity in food for emergency response.
Bari, A; Khan, A J; Semkow, T M; Syed, U-F; Roselan, A; Haines, D K; Roth, G; West, L; Arndt, M
2011-06-01
This paper describes the development of methods for the rapid screening of gross alpha (GA) and gross beta (GB) radioactivity in liquid foods, specifically Tang drink mix, apple juice, and milk, as well as screening of GA, GB, and gamma radioactivity from surface deposition on apples. Detailed procedures were developed for spiking the matrices with (241)Am (alpha radioactivity), (90)Sr/(90)Y (beta radioactivity), and (60)Co, (137)Cs, and (241)Am (gamma radioactivity). Matrix stability studies were performed for 43 days after spiking. The method for liquid foods is based upon rapid digestion, evaporation, and flaming, followed by gas proportional (GP) counting. For the apple matrix, surface radioactivity was acid-leached, followed by GP counting and/or gamma spectrometry. The average leaching recoveries from four different apple brands were between 63% and 96%, and have been interpreted on the basis of ion transport through the apple cuticle. The minimum detectable concentrations (MDCs) were calculated from either the background or method-blank (MB) measurements. They were found to satisfy the U.S. FDA's Derived Intervention Levels (DILs) in all but one case. The newly developed methods can perform radioactivity screening in foods within a few hours and have the potential for increased capacity with further automation. They are especially applicable to emergency response following accidental or intentional contamination of food with radioactivity. Published by Elsevier Ltd.
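An MDC computed from a background or method-blank count typically follows the widely used Currie expression, MDA = (2.71 + 4.65·sqrt(B)) / (efficiency · count time), divided by the sample size to give a concentration. A sketch under that assumption; the efficiency, counting time, and sample-size values are illustrative, not those of the published method:

```python
import math

def mdc(background_counts, count_time_s, efficiency, sample_size_g):
    """Minimum detectable concentration (Bq/g) via the Currie expression.

    background_counts : counts in the background or method-blank measurement
    count_time_s      : counting time in seconds
    efficiency        : counting efficiency (counts per decay)
    sample_size_g     : mass of the counted sample aliquot
    """
    mda_bq = (2.71 + 4.65 * math.sqrt(background_counts)) \
             / (efficiency * count_time_s)
    return mda_bq / sample_size_g

# 100 background counts in 600 s at 30% efficiency, for a 10 g aliquot:
print(round(mdc(100, 600.0, 0.30, 10.0), 5))
```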
USSR Report: Cybernetics, Computers and Automation Technology. No. 69.
1983-05-06
computers in multiprocessor and multistation design, control and scientific research automation systems. The results of comparing the efficiency of...Podvizhnaya, Scientific Research Institute of Control Computers, Severodonetsk] [Text] The most significant change in the design of the SM-2M compared to...UPRAVLYAYUSHCHIYE SISTEMY I MASHINY, Nov-Dec 82) 95 APPLICATIONS Kiev Automated Control System, Design Features and Prospects for Development (V. A
Automated validation of a computer operating system
NASA Technical Reports Server (NTRS)
Dervage, M. M.; Milberg, B. A.
1970-01-01
Programs apply selected input/output loads to complex computer operating system and measure performance of that system under such loads. Technique lends itself to checkout of computer software designed to monitor automated complex industrial systems.
Li, Mengshi; Zhang, Xiuli; Quinn, Thomas P; Lee, Dongyoul; Liu, Dijie; Kunkel, Falk; Zimmerman, Brian E; McAlister, Daniel; Olewein, Keith; Menda, Yusuf; Mirzadeh, Saed; Copping, Roy; Johnson, Frances L; Schultz, Michael K
2017-09-01
A method for preparation of Pb-212 and Pb-203 labeled chelator-modified peptide-based radiopharmaceuticals for cancer imaging and radionuclide therapy has been developed and adapted for automated clinical production. Pre-concentration and isolation of radioactive Pb2+ from interfering metals in dilute hydrochloric acid was optimized using a commercially-available Pb-specific chromatography resin packed in disposable plastic columns. The pre-concentrated radioactive Pb2+ is eluted in NaOAc buffer directly to the reaction vessel containing chelator-modified peptides. Radiolabeling was found to proceed efficiently at 85°C (45 min; pH 5.5). The specific activity of radiolabeled conjugates was optimized by separation of radiolabeled conjugates from unlabeled peptide via HPLC. Preservation of bioactivity was confirmed by in vivo biodistribution of Pb-203 and Pb-212 labeled peptides in melanoma-tumor-bearing mice. The approach has been found to be robustly adaptable to automation and a cassette-based fluid-handling system (Modular Lab Pharm Tracer) has been customized for clinical radiopharmaceutical production. Our findings demonstrate that the Pb-203/Pb-212 combination is a promising elementally-matched radionuclide pair for image-guided radionuclide therapy for melanoma, neuroendocrine tumors, and potentially other cancers. Copyright © 2017 Elsevier Ltd. All rights reserved.
The Effect of Computer Automation on Institutional Review Board (IRB) Office Efficiency
ERIC Educational Resources Information Center
Oder, Karl; Pittman, Stephanie
2015-01-01
Companies purchase computer systems to make their processes more efficient through automation. Some academic medical centers (AMC) have purchased computer systems for their institutional review boards (IRB) to increase efficiency and compliance with regulations. IRB computer systems are expensive to purchase, deploy, and maintain. An AMC should…
Possible Computer Vision Systems and Automated or Computer-Aided Edging and Trimming
Philip A. Araman
1990-01-01
This paper discusses research which is underway to help our industry reduce costs, increase product volume and value recovery, and market more accurately graded and described products. The research is part of a team effort to help the hardwood sawmill industry automate with computer vision systems, and computer-aided or computer controlled processing. This paper...
NASA Astrophysics Data System (ADS)
Dooraghi, Alex Abreu
Positron Emission Tomography (PET) is a noninvasive molecular imaging tool that requires the use of a radioactive compound or radiotracer which targets a molecular pathway of interest. We have developed and employed three beta particle radiation detection systems to advance PET. Specifically, the goals of these systems are to: 1. Automate dispensing of solutions containing a positron emitting isotope. 2. Monitor radioactivity on-chip during synthesis of a positron emitting radiotracer. 3. Assay cellular uptake on-chip of a positron emitting radiotracer. Automated protocols for measuring and dispensing solutions containing radioisotopes are essential not only for providing an optimum environment for radiation workers, but also to ensure a quantitatively accurate workflow. For the first project, we describe the development and performance of a system for automated radioactivity distribution of beta particle emitting radioisotopes such as fluorine-18 (F-18). Key to the system is a radiation detector in-line with a peristaltic pump. The system demonstrates volume accuracy within 5% for volumes of 20 μL or greater. When considering volumes of 20 μL or greater, delivered radioactivity is in agreement with the requested radioactivity as measured with the dose calibrator. The integration of the detector and pump leads to a flexible system that can accurately dispense solutions containing F-18 in radioactivity concentrations directly produced from a cyclotron (~0.1-1 mCi/μL), to low activity concentrations intended for preclinical mouse scans (~1-10 μCi/μL), and anywhere in between. Electrowetting on dielectric (EWOD) is an attractive microfluidic platform for batch synthesis of PET radiotracers. Visualization of radioisotopes on-chip is critical for synthesis optimization and technological development. For the second project, we describe the development and performance of a Cerenkov/real-time imaging system for PET radiotracer synthesis on EWOD.
We also investigate fundamental physical characteristics of Cerenkov photon yield at different stages of [F-18]FDG synthesis on the EWOD platform. We are able to use this imaging system to optimize the mixing protocol as well as identify and correct for loss of radioactivity due to the migration of radioactive vapor outside of the EWOD heater, enabling an overall increase in the crude radiochemical yield from 50 ± 3% (n = 3) to 72 ± 13% (n = 5). Clinical use of PET has proven to be a critical tool for monitoring cancer treatment response. For the third project, we describe the redesign and performance of Betabox, a specialized device that incorporates PET radiotracers in an assay that gives clinicians and researchers the ability to assess the effectiveness of a drug therapy in vitro by isolating small samples of patient tumor cells incubated in a polydimethylsiloxane (PDMS) microfluidic chip. We find that Betabox is a high-sensitivity, low-noise charged particle imaging system that can operate without significant impairment of its performance at both room and elevated temperatures, such as those suitable for cell culture. The dark count rate is within range of the expected signal from cosmic rays, dictating a low detection limit that allows quantitative imaging of very small amounts of radioactivity. This system demonstrates the potential of direct cellular radioassay of small samples of cells (~100 cells per measurement).
Computers Launch Faster, Better Job Matching
ERIC Educational Resources Information Center
Stevenson, Gloria
1976-01-01
Employment Security Automation Project (ESAP), a five-year program sponsored by the Employment and Training Administration, features an innovative computer-assisted job matching system and instantaneous computer-assisted service for unemployment insurance claimants. ESAP will also consolidate existing automated employment security systems to…
InPRO: Automated Indoor Construction Progress Monitoring Using Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Hamledari, Hesam
In this research, an envisioned automated intelligent robotic solution for automated indoor data collection and inspection that employs a series of unmanned aerial vehicles (UAV), entitled "InPRO", is presented. InPRO consists of four stages, namely: 1) automated path planning; 2) autonomous UAV-based indoor inspection; 3) automated computer vision-based assessment of progress; and, 4) automated updating of 4D building information models (BIM). The work presented in this thesis addresses the third stage of InPRO. A series of computer vision-based methods that automate the assessment of construction progress using images captured at indoor sites are introduced. The proposed methods employ computer vision and machine learning techniques to detect the components of under-construction indoor partitions. In particular, framing (studs), insulation, electrical outlets, and different states of drywall sheets (installing, plastering, and painting) are automatically detected using digital images. High accuracy rates, real-time performance, and operation without a priori information are indicators of the methods' promising performance.
Computer vision in the poultry industry
USDA-ARS?s Scientific Manuscript database
Computer vision is becoming increasingly important in the poultry industry due to increasing use and speed of automation in processing operations. Growing awareness of food safety concerns has helped add food safety inspection to the list of tasks that automated computer vision can assist. Researc...
Computing and Office Automation: Changing Variables.
ERIC Educational Resources Information Center
Staman, E. Michael
1981-01-01
Trends in computing and office automation and their applications, including planning, institutional research, and general administrative support in higher education, are discussed. Changing aspects of information processing and an increasingly larger user community are considered. The computing literacy cycle may involve programming, analysis, use…
ADP Analysis project for the Human Resources Management Division
NASA Technical Reports Server (NTRS)
Tureman, Robert L., Jr.
1993-01-01
The ADP (Automated Data Processing) Analysis Project was conducted for the Human Resources Management Division (HRMD) of NASA's Langley Research Center. The three major areas of work in the project were computer support, automated inventory analysis, and an ADP study for the Division. The goal of the computer support work was to determine automation needs of Division personnel and help them solve computing problems. The goal of automated inventory analysis was to find a way to analyze installed software and usage on a Macintosh. Finally, the ADP functional systems study for the Division was designed to assess future HRMD needs concerning ADP organization and activities.
Automated computation of autonomous spectral submanifolds for nonlinear modal analysis
NASA Astrophysics Data System (ADS)
Ponsioen, Sten; Pedergnana, Tiemo; Haller, George
2018-04-01
We discuss an automated computational methodology for computing two-dimensional spectral submanifolds (SSMs) in autonomous nonlinear mechanical systems of arbitrary degrees of freedom. In our algorithm, SSMs, the smoothest nonlinear continuations of modal subspaces of the linearized system, are constructed up to arbitrary orders of accuracy, using the parameterization method. An advantage of this approach is that the construction of the SSMs does not break down when the SSM folds over its underlying spectral subspace. A further advantage is an automated a posteriori error estimation feature that enables a systematic increase in the orders of the SSM computation until the required accuracy is reached. We find that the present algorithm provides a major speed-up, relative to numerical continuation methods, in the computation of backbone curves, especially in higher-dimensional problems. We illustrate the accuracy and speed of the automated SSM algorithm on lower- and higher-dimensional mechanical systems.
Automated management of radioactive sources in Saudi Arabia
NASA Astrophysics Data System (ADS)
Al-Kheliewi, Abdullah S.; Jamil, M. F.; Basar, M. R.; Tuwaili, W. R.
2014-09-01
For usage of radioactive substances, any facility must register and obtain a license from the relevant authority of the country in which it operates. In the Kingdom of Saudi Arabia (KSA), the authority for managing radioactive sources and licensing organizations for their usage is the National Center of Radiation Protection (NCRP). This paper describes the system that automates the NCRP's registration and licensing process. To provide 24×7 access to all NCRP customers, the system is developed as a web-based application that allows users to register online, request a license, renew a license, check request status, and view historical data and reports; these and other features are provided as Electronic Services accessible via the internet. The system was also designed to streamline and optimize NCRP's internal operations, besides providing ease of access to its customers, by implementing a defined workflow through which every registration and license request is routed. In addition to a manual payment option, the system is integrated with SADAD (an online payment system), avoiding the lengthy and cumbersome procedures associated with manual payment. Using the SADAD option, license fees can be paid through the internet, an ATM, or a branch of any designated bank; payments are instantly reported to NCRP, so delays in funds transfer and invoice verification are avoided. SADAD integration is discussed later in the document.
Infrastructure development for radioactive materials at the NSLS-II
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sprouster, D. J.; Weidner, R.; Ghose, S. K.
2018-02-01
The X-ray Powder Diffraction (XPD) Beamline at the National Synchrotron Light Source-II is a multipurpose instrument designed for high-resolution, high-energy X-ray scattering techniques. In this article, the capabilities, opportunities and recent developments in the characterization of radioactive materials at XPD are described. The overarching goal of this work is to provide researchers access to advanced synchrotron techniques suited to the structural characterization of materials for advanced nuclear energy systems. XPD is a new beamline providing high photon flux for X-ray Diffraction, Pair Distribution Function analysis and Small Angle X-ray Scattering. The infrastructure and software described here extend the existing capabilities at XPD to accommodate radioactive materials. Such techniques will contribute crucial information to the characterization and quantification of advanced materials for nuclear energy applications. We describe the automated radioactive sample collection capabilities and recent X-ray Diffraction and Small Angle X-ray Scattering results from neutron irradiated reactor pressure vessel steels and oxide dispersion strengthened steels.
45 CFR 310.1 - What definitions apply to this part?
Code of Federal Regulations, 2010 CFR
2010-10-01
... existing automated data processing computer system through an Intergovernmental Service Agreement; (4...) Office Automation means a generic adjunct component of a computer system that supports the routine... timely and satisfactory; (iv) Assurances that information in the computer system as well as access, use...
45 CFR 310.1 - What definitions apply to this part?
Code of Federal Regulations, 2013 CFR
2013-10-01
... existing automated data processing computer system through an Intergovernmental Service Agreement; (4...) Office Automation means a generic adjunct component of a computer system that supports the routine... timely and satisfactory; (iv) Assurances that information in the computer system as well as access, use...
45 CFR 310.1 - What definitions apply to this part?
Code of Federal Regulations, 2014 CFR
2014-10-01
... existing automated data processing computer system through an Intergovernmental Service Agreement; (4...) Office Automation means a generic adjunct component of a computer system that supports the routine... timely and satisfactory; (iv) Assurances that information in the computer system as well as access, use...
45 CFR 310.1 - What definitions apply to this part?
Code of Federal Regulations, 2011 CFR
2011-10-01
... existing automated data processing computer system through an Intergovernmental Service Agreement; (4...) Office Automation means a generic adjunct component of a computer system that supports the routine... timely and satisfactory; (iv) Assurances that information in the computer system as well as access, use...
45 CFR 310.1 - What definitions apply to this part?
Code of Federal Regulations, 2012 CFR
2012-10-01
... existing automated data processing computer system through an Intergovernmental Service Agreement; (4...) Office Automation means a generic adjunct component of a computer system that supports the routine... timely and satisfactory; (iv) Assurances that information in the computer system as well as access, use...
17 CFR 38.156 - Automated trade surveillance system.
Code of Federal Regulations, 2014 CFR
2014-04-01
... potential trade practice violations. The automated system must load and process daily orders and trades no... anomalies; compute, retain, and compare trading statistics; compute trade gains, losses, and futures...
17 CFR 38.156 - Automated trade surveillance system.
Code of Federal Regulations, 2013 CFR
2013-04-01
... potential trade practice violations. The automated system must load and process daily orders and trades no... anomalies; compute, retain, and compare trading statistics; compute trade gains, losses, and futures...
ERIC Educational Resources Information Center
Majchrzak, Ann
A study was conducted of the training programs used by plants with Computer Automated Design/Computer Automated Manufacturing (CAD/CAM) to help their employees adapt to automated manufacturing. The study sought to determine the relative priorities of manufacturing establishments for training certain workers in certain skills; the status of…
Radioactive Nanomaterials for Multimodality Imaging
Chen, Daiqin; Dougherty, Casey A.; Yang, Dongzhi; Wu, Hongwei; Hong, Hao
2016-01-01
Nuclear imaging techniques, including primarily positron emission tomography (PET) and single-photon emission computed tomography (SPECT), can provide quantitative information for a biological event in vivo with ultra-high sensitivity, however, the comparatively low spatial resolution is their major limitation in clinical application. By convergence of nuclear imaging with other imaging modalities like computed tomography (CT), magnetic resonance imaging (MRI) and optical imaging, the hybrid imaging platforms can overcome the limitations from each individual imaging technique. Possessing versatile chemical linking ability and good cargo-loading capacity, radioactive nanomaterials can serve as ideal imaging contrast agents. In this review, we provide a brief overview about current state-of-the-art applications of radioactive nanomaterials in the circumstances of multimodality imaging. We present strategies for incorporation of radioisotope(s) into nanomaterials along with applications of radioactive nanomaterials in multimodal imaging. Advantages and limitations of radioactive nanomaterials for multimodal imaging applications are discussed. Finally, a future perspective of possible radioactive nanomaterial utilization is presented for improving diagnosis and patient management in a variety of diseases. PMID:27227167
Using satellite communications for a mobile computer network
NASA Technical Reports Server (NTRS)
Wyman, Douglas J.
1993-01-01
The topics discussed include the following: patrol car automation, mobile computer network, network requirements, network design overview, MCN mobile network software, MCN hub operation, mobile satellite software, hub satellite software, the benefits of patrol car automation, the benefits of satellite mobile computing, and national law enforcement satellite.
NASA Technical Reports Server (NTRS)
Boyle, W. G.; Barton, G. W.
1979-01-01
The feasibility of computerized automation of the Analytical Laboratories Section at NASA's Lewis Research Center was considered. Since that laboratory's duties are not routine, the automation goals were set with that in mind. Four instruments were selected as the most likely automation candidates: an atomic absorption spectrophotometer, an emission spectrometer, an X-ray fluorescence spectrometer, and an X-ray diffraction unit. Two options for computer automation were described: a time-shared central computer and a system with microcomputers for each instrument connected to a central computer. A third option, presented for future planning, expands the microcomputer version. Costs and benefits for each option were considered. It was concluded that the microcomputer version best fits the goals and duties of the laboratory and that such an automated system is needed to meet the laboratory's future requirements.
ERIC Educational Resources Information Center
Dikli, Semire
2006-01-01
The impacts of computers on writing have been widely studied for three decades. Even basic computer functions, i.e., word processing, have been of great assistance to writers in modifying their essays. The research on Automated Essay Scoring (AES) has revealed that computers have the capacity to function as a more effective cognitive tool (Attali,…
Computer-Aided Instruction in Automated Instrumentation.
ERIC Educational Resources Information Center
Stephenson, David T.
1986-01-01
Discusses functions of automated instrumentation systems, i.e., systems which combine electrical measuring instruments and a controlling computer to measure responses of a unit under test. The computer-assisted tutorial then described is programmed for use on such a system--a modern microwave spectrum analyzer--to introduce engineering students to…
Computed tomography of radioactive objects and materials
NASA Astrophysics Data System (ADS)
Sawicka, B. D.; Murphy, R. V.; Tosello, G.; Reynolds, P. W.; Romaniszyn, T.
1990-12-01
Computed tomography (CT) has been performed on a number of radioactive objects and materials. Several unique technical problems are associated with CT of radioactive specimens. These include general safety considerations, techniques to reduce background-radiation effects on CT images and selection criteria for the CT source to permit object penetration and to reveal accurate values of material density. In the present paper, three groups of experiments will be described, for objects with low, medium and high levels of radioactivity. CT studies on radioactive specimens will be presented. They include the following: (1) examination of individual ceramic reactor-fuel (uranium dioxide) pellets, (2) examination of fuel samples from the Three Mile Island reactor, (3) examination of a CANDU (CANada Deuterium Uranium: registered trademark) nuclear-fuel bundle which underwent a simulated loss-of-coolant accident resulting in high-temperature damage and (4) examination of a PWR nuclear-reactor fuel assembly.
An Analysis for Capital Expenditure Decisions at a Naval Regional Medical Center.
1981-12-01
Service / Equipment Review Committee rankings:
1. Portable defibrillator and cardioscope / Computed tomographic scanner
2. ECG cart / Automated blood cell counter
3. Gas system sterilizer / Gas system sterilizer
4. Automated blood cell counter / Portable defibrillator and cardioscope
5. Computed tomographic scanner / ECG cart
...dictating and automated typing) systems. e. Filing equipment f. Automatic data processing equipment including data communications equipment. g
An, Gao; Hong, Li; Zhou, Xiao-Bing; Yang, Qiong; Li, Mei-Qing; Tang, Xiang-Yang
2017-03-01
We investigated and compared the functionality of two 3D visualization software packages, provided by a CT vendor and a third-party vendor, respectively. Using surgical anatomical measurement as baseline, we evaluated the accuracy of 3D visualization and verified their utility in computer-aided anatomical analysis. The study cohort consisted of 50 adult cadavers fixed with the classical formaldehyde method. The computer-aided anatomical analysis was based on CT images (in DICOM format) acquired by helical scan with contrast enhancement, using a CT vendor provided 3D visualization workstation (Syngo) and a third-party 3D visualization software (Mimics) that was installed on a PC. Automated and semi-automated segmentations were utilized in the 3D visualization workstation and software, respectively. The functionality and efficiency of automated and semi-automated segmentation methods were compared. Using surgical anatomical measurement as a baseline, the accuracy of 3D visualization based on automated and semi-automated segmentations was quantitatively compared. In semi-automated segmentation, the Mimics 3D visualization software outperformed the Syngo 3D visualization workstation. No significant difference was observed in anatomical data measurement between the Syngo 3D visualization workstation and the Mimics 3D visualization software (P > 0.05). Both the Syngo 3D visualization workstation provided by a CT vendor and the Mimics 3D visualization software by a third-party vendor possessed the needed functionality, efficiency and accuracy for computer-aided anatomical analysis. Copyright © 2016 Elsevier GmbH. All rights reserved.
In vivo traffic of indium-111-oxine labeled human lymphocytes collected by automated apheresis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Read, E.J.; Keenan, A.M.; Carter, C.S.
1990-06-01
The in vivo traffic patterns of autologous lymphocytes were studied in five normal human volunteers using lymphocytes obtained by automated apheresis, separated on Ficoll-Hypaque gradients, and labeled ex vivo with {sup 111}In-oxine. Final lymphocyte infusions contained 1.8-3.1 X 10(9) cells and 270-390 microCi (9.99-14.43 MBq) {sup 111}In, or 11-17 microCi (0.41-0.63 MBq) per 10(8) lymphocytes. Gamma imaging showed transient lung uptake and significant retention of radioactivity in the liver and spleen. Progressive uptake of activity in normal, nonpalpable axillary and inguinal lymph nodes was seen from 24 to 96 hr. Accumulation of radioactivity also was demonstrated at the forearm skin test site, as well as in its associated epitrochlear and axillary lymph nodes, in a subject who had been tested for delayed hypersensitivity with tetanus toxoid. Indium-111-oxine labeled human lymphocytes may provide a useful tool for future studies of normal and abnormal lymphocyte traffic.
Rodríguez, Rogelio; Avivar, Jessica; Ferrer, Laura; Leal, Luz O; Cerdà, Victor
2012-07-15
A novel lab-on-valve system has been developed for strontium determination in environmental samples. The miniaturized lab-on-valve system potentially accommodates a wide range of chemical and physical processes, including fluidic and microcarrier bead control, homogeneous reaction and liquid-solid interaction. A rapid, inexpensive and fully automated method for the separation and preconcentration of total and radioactive strontium, using a solid phase extraction material (Sr-Resin), has been developed. Total strontium concentrations are determined by ICP-OES and (90)Sr activities by a low background proportional counter. The method has been successfully applied to different water samples of environmental interest. The proposed system offers minimization of sample handling, drastic reduction of reagent volume, improvement of reproducibility and sample throughput, and a significant decrease in both time and cost per analysis. The LLD for total Sr is 1.8 ng, and the minimum detectable activity for (90)Sr is 0.008 Bq. The repeatability of the separation procedure is 1.2% (n = 10). Copyright © 2011 Elsevier B.V. All rights reserved.
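Minimum detectable activity figures like the 0.008 Bq quoted above are conventionally derived from the counter's background and efficiency; a sketch using the standard Currie detection-limit expression (the parameter values here are illustrative, not taken from the paper):

```python
import math

def currie_mda_bq(background_counts, efficiency, count_time_s):
    """Minimum detectable activity (Bq) from Currie's detection limit:
    L_D = 2.71 + 4.65 * sqrt(B) counts, converted to activity by dividing
    by detection efficiency and counting time."""
    detection_limit_counts = 2.71 + 4.65 * math.sqrt(background_counts)
    return detection_limit_counts / (efficiency * count_time_s)

# e.g. 5 background counts, 40% counting efficiency, 1-hour count
mda = currie_mda_bq(5.0, 0.40, 3600.0)
```

With these illustrative inputs the result lands in the same order of magnitude as the reported figure, which is why background and count time dominate the achievable limit.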
Automated computer grading of hardwood lumber
P. Klinkhachorn; J.P. Franklin; Charles W. McMillin; R.W. Conners; H.A. Huber
1988-01-01
This paper describes an improved computer program to grade hardwood lumber. The program was created as part of a system to automate various aspects of the hardwood manufacturing industry. It enhances previous efforts by considering both faces of the board and provides easy application of species dependent rules. The program can be readily interfaced with a computer...
Computer Assisted School Automation (CASA) in Japan.
ERIC Educational Resources Information Center
Sakamoto, Takashi; Nakanome, Naoaki
1991-01-01
This assessment of the status of computer assisted school automation (CASA) in Japan begins by describing the structure of the Japanese educational system and the roles of CASA in that system. Statistics on various aspects of computers in Japanese schools and the findings of several surveys are cited to report on the present state of educational…
An Excel[TM] Model of a Radioactive Series
ERIC Educational Resources Information Center
Andrews, D. G. H.
2009-01-01
A computer model of the decay of a radioactive series, written in Visual Basic in Excel[TM], is presented. The model is based on the random selection of cells in an array. The results compare well with the theoretical equations. The model is a useful tool in teaching this aspect of radioactivity. (Contains 4 figures.)
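The random-cell-selection method the abstract describes is a simple Monte Carlo simulation; a sketch of the same idea outside Excel, for a hypothetical two-member series A -> B -> stable C (all parameters are illustrative):

```python
import random

def simulate_series(n_atoms=10000, p_decay_a=0.03, p_decay_b=0.01,
                    steps=300, seed=42):
    """Two-member decay series A -> B -> stable C, simulated atom by atom.

    Each time step, every A atom decays to B with probability p_decay_a,
    and every B atom decays to stable C with probability p_decay_b.
    Returns the (A, B, C) population history, one tuple per step.
    """
    rng = random.Random(seed)
    a, b, c = n_atoms, 0, 0
    history = [(a, b, c)]
    for _ in range(steps):
        decayed_a = sum(1 for _ in range(a) if rng.random() < p_decay_a)
        decayed_b = sum(1 for _ in range(b) if rng.random() < p_decay_b)
        a -= decayed_a
        b += decayed_a - decayed_b
        c += decayed_b
        history.append((a, b, c))
    return history

hist = simulate_series()
```

As in the spreadsheet model, the stochastic populations track the theoretical exponential solutions closely once the atom count is large.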
DOT National Transportation Integrated Search
1974-08-01
Volume 4 describes the automation requirements. A presentation of automation requirements is made for an advanced air traffic management system in terms of controller work force, computer resources, controller productivity, system manning, failure ef...
ERIC Educational Resources Information Center
Collins, Michael J.; Vitz, Ed
1988-01-01
Examines two computer interfaced lab experiments: 1) discusses the automation of a Perkin Elmer 337 infrared spectrophotometer noting the mechanical and electronic changes needed; 2) uses the Gouy method and Lotus Measure software to automate magnetic susceptibility determinations. Methodology is described. (MVL)
Safety in the Automated Office.
ERIC Educational Resources Information Center
Graves, Pat R.; Greathouse, Lillian R.
1990-01-01
Office automation has introduced new hazards to the workplace: electrical hazards related to computer wiring, musculoskeletal problems resulting from use of computer terminals and design of work stations, and environmental concerns related to ventilation, noise levels, and office machine chemicals. (SK)
Computer automation for feedback system design
NASA Technical Reports Server (NTRS)
1975-01-01
Mathematical techniques and explanations of various steps used by an automated computer program to design feedback systems are summarized. Special attention was given to refining the automatic evaluation of suboptimal loop transmission and the translation of time-domain to frequency-domain specifications.
Understanding and enhancing user acceptance of computer technology
NASA Technical Reports Server (NTRS)
Rouse, William B.; Morris, Nancy M.
1986-01-01
Technology-driven efforts to implement computer technology often encounter problems due to lack of acceptance or begrudging acceptance of the personnel involved. It is argued that individuals' acceptance of automation, in terms of either computerization or computer aiding, is heavily influenced by their perceptions of the impact of the automation on their discretion in performing their jobs. It is suggested that desired levels of discretion reflect needs to feel in control and achieve self-satisfaction in task performance, as well as perceptions of inadequacies of computer technology. Discussion of these factors leads to a structured set of considerations for performing front-end analysis, deciding what to automate, and implementing the resulting changes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Husler, R.O.; Weir, T.J.
1991-01-01
An enhanced maintenance program is being established to characterize and monitor cables, components, and process response at the Savannah River Site, Defense Waste Processing Facility. This facility was designed and constructed to immobilize the radioactive waste currently stored in underground storage tanks and is expected to begin operation in 1993. The plant is initiating the program to baseline and monitor instrument and control (I&C) and electrical equipment, remote process equipment, embedded instrument and control cables, and in-cell jumper cables used in the facility. This program is based on the electronic characterization and diagnostic (ECAD) system which was modified to include process response analysis and to meet rigid Department of Energy equipment requirements. The system consists of computer-automated, state-of-the-art electronics. The data that are gathered are stored in a computerized database for analysis, trending, and troubleshooting. It is anticipated that the data which are gathered and trended will aid in life extension for the facility.
New problems and opportunities of oil spill monitoring systems
NASA Astrophysics Data System (ADS)
Barenboim, G. M.; Borisov, V. M.; Golosov, V. N.; Saveca, A. Yu.
2015-04-01
Emergency oil and oil products spills represent a great danger to the environment, including ecosystems, and to the population. New problems posed by such dangerous spills and methods of early detection are discussed in this paper. It is proposed to assess the biological hazards of such spills on the basis of data on the distribution of individual oil hydrocarbons within the column of the water body and computer predictions of their toxicity. Oil radioactivity, which is associated with uranium and thorium, is seen as an important aspect of the oil spill danger, especially in watercourses. The need for an automated monitoring system for the early detection of oil spills in water bodies is analysed. The proposed system consists of three subsystems. The first, a remote sensing subsystem, is based on powerful fluorescent lidars; experimental results on lidar registration of oil pollution of water are reported. The second subsystem uses a network of automatic monitoring stations with contact detectors. The third subsystem is a combined sensor system based on remote and contact technologies.
Automation to improve efficiency of field expedient injury prediction screening.
Teyhen, Deydre S; Shaffer, Scott W; Umlauf, Jon A; Akerman, Raymond J; Canada, John B; Butler, Robert J; Goffar, Stephen L; Walker, Michael J; Kiesel, Kyle B; Plisky, Phillip J
2012-07-01
Musculoskeletal injuries are a primary source of disability in the U.S. Military. Physical training and sports-related activities account for up to 90% of all injuries, and 80% of these injuries are considered overuse in nature. As a result, there is a need to develop an evidence-based musculoskeletal screen that can assist with injury prevention. The purpose of this study was to assess the capability of an automated system to improve the efficiency of field expedient tests that may help predict injury risk and provide corrective strategies for deficits identified. The field expedient tests include survey questions and measures of movement quality, balance, trunk stability, power, mobility, and foot structure and mobility. Data entry for these tests was automated using handheld computers, barcode scanning, and netbook computers. An automated algorithm for injury risk stratification and mitigation techniques was run on a server computer. Without automation support, subjects were assessed in 84.5 ± 9.1 minutes per subject compared with 66.8 ± 6.1 minutes per subject with automation and 47.1 ± 5.2 minutes per subject with automation and process improvement measures (p < 0.001). The average time to manually enter the data was 22.2 ± 7.4 minutes per subject. An additional 11.5 ± 2.5 minutes per subject was required to manually assign an intervention strategy. Automation of this injury prevention screening protocol using handheld devices and netbook computers allowed for real-time data entry and enhanced the efficiency of injury screening, risk stratification, and prescription of a risk mitigation strategy.
DOT National Transportation Integrated Search
1974-08-01
Volume 4 describes the automation requirements. A presentation of automation requirements is made for an advanced air traffic management system in terms of controller work force, computer resources, controller productivity, system manning, failure ef...
Automated clinical system for chromosome analysis
NASA Technical Reports Server (NTRS)
Castleman, K. R.; Friedan, H. J.; Johnson, E. T.; Rennie, P. A.; Wall, R. J. (Inventor)
1978-01-01
An automatic chromosome analysis system is provided wherein a suitably prepared slide with chromosome spreads thereon is placed on the stage of an automated microscope. The automated microscope stage is computer operated to move the slide to enable detection of chromosome spreads on the slide. The X and Y location of each chromosome spread that is detected is stored. The computer measures the chromosomes in a spread, classifies them by group or by type and also prepares a digital karyotype image. The computer system can also prepare a patient report summarizing the result of the analysis and listing suspected abnormalities.
Preliminary Results from the Application of Automated Adjoint Code Generation to CFL3D
NASA Technical Reports Server (NTRS)
Carle, Alan; Fagan, Mike; Green, Lawrence L.
1998-01-01
This report describes preliminary results obtained using an automated adjoint code generator for Fortran to augment a widely-used computational fluid dynamics flow solver to compute derivatives. These preliminary results with this augmented code suggest that, even in its infancy, the automated adjoint code generator can accurately and efficiently deliver derivatives for use in transonic Euler-based aerodynamic shape optimization problems with hundreds to thousands of independent design variables.
1977-01-26
Sisteme Matematicheskogo Obespecheniya YeS EVM [Applied Programs in the Software System for the Unified System of Computers], by A. Ye. Fateyev, A. I... computerized systems are most effective in large production complexes, in which the level of utilization of computers can be as high as 500,000... performance of these tasks could be furthered by the complex introduction of electronic computers in automated control systems. The creation of ASU
Automation of the aircraft design process
NASA Technical Reports Server (NTRS)
Heldenfels, R. R.
1974-01-01
The increasing use of the computer to automate the aerospace product development and engineering process is examined with emphasis on structural analysis and design. Examples of systems of computer programs in aerospace and other industries are reviewed and related to the characteristics of aircraft design in its conceptual, preliminary, and detailed phases. Problems with current procedures are identified, and potential improvements from optimum utilization of integrated disciplinary computer programs by a man/computer team are indicated.
AN ULTRAVIOLET-VISIBLE SPECTROPHOTOMETER AUTOMATION SYSTEM. PART I: FUNCTIONAL SPECIFICATIONS
This document contains the project definition, the functional requirements, and the functional design for a proposed computer automation system for scanning spectrophotometers. The system will be implemented on a Data General computer using the BASIC language. The system is a rea...
Automated Intelligent Agents: Are They Trusted Members of Military Teams?
2008-12-01
All teams played a computer-based team firefighting game (C3Fire). The order of presentation of the two trials (human-human vs. human-automation) was... agent.
DOT National Transportation Integrated Search
1976-08-01
This report contains a functional design for the simulation of a future automation concept in support of the ATC Systems Command Center. The simulation subsystem performs airport airborne arrival delay predictions and computes flow control tables for...
Computerized Manufacturing Automation. Employment, Education, and the Workplace. Summary.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. Office of Technology Assessment.
The application of programmable automation (PA) offers new opportunities to enhance and streamline manufacturing processes. Five PA technologies are examined in this report: computer-aided design, robots, numerically controlled machine tools, flexible manufacturing systems, and computer-integrated manufacturing. Each technology is in a relatively…
32 CFR 806b.35 - Balancing protection.
Code of Federal Regulations, 2014 CFR
2014-07-01
..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...
32 CFR 806b.35 - Balancing protection.
Code of Federal Regulations, 2013 CFR
2013-07-01
..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...
32 CFR 806b.35 - Balancing protection.
Code of Federal Regulations, 2012 CFR
2012-07-01
..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...
32 CFR 806b.35 - Balancing protection.
Code of Federal Regulations, 2011 CFR
2011-07-01
..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...
32 CFR 806b.35 - Balancing protection.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...
Schmidt, Taly Gilat; Wang, Adam S; Coradi, Thomas; Haas, Benjamin; Star-Lack, Josh
2016-10-01
The overall goal of this work is to develop a rapid, accurate, and automated software tool to estimate patient-specific organ doses from computed tomography (CT) scans using simulations to generate dose maps combined with automated segmentation algorithms. This work quantified the accuracy of organ dose estimates obtained by an automated segmentation algorithm. We hypothesized that the autosegmentation algorithm is sufficiently accurate to provide organ dose estimates, since small errors delineating organ boundaries will have minimal effect when computing mean organ dose. A leave-one-out validation study of the automated algorithm was performed with 20 head-neck CT scans expertly segmented into nine regions. Mean organ doses of the automatically and expertly segmented regions were computed from Monte Carlo-generated dose maps and compared. The automated segmentation algorithm estimated the mean organ dose to be within 10% of the expert segmentation for regions other than the spinal canal, with the median error for each organ region below 2%. In the spinal canal region, the median error was −7%, with a maximum absolute error of 28% for the single-atlas approach and 11% for the multiatlas approach. The results demonstrate that the automated segmentation algorithm can provide accurate organ dose estimates despite some segmentation errors.
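The hypothesis above, that small boundary-delineation errors barely shift a mean organ dose, can be illustrated with a toy 1-D example. The numbers and masks below are hypothetical, not the study's data: when dose varies smoothly near an organ boundary, swapping one boundary voxel changes the masked mean only slightly.

```python
def mean_dose(dose_map, mask):
    # mean dose over the voxels selected by a binary mask
    vals = [d for d, keep in zip(dose_map, mask) if keep]
    return sum(vals) / len(vals)

def percent_error(auto, expert):
    # signed percent difference of the automated estimate vs. the expert one
    return 100.0 * (auto - expert) / expert

# toy 1-D "organ" region; dose varies smoothly near the boundary
dose_map = [10.0, 10.2, 9.8, 10.1, 9.5]
expert_mask = [1, 1, 1, 1, 0]   # expert includes voxel 3, excludes voxel 4
auto_mask = [1, 1, 1, 0, 1]     # automated mask swaps one boundary voxel

err = percent_error(mean_dose(dose_map, auto_mask),
                    mean_dose(dose_map, expert_mask))
```

Here the single-voxel disagreement moves the mean dose by well under 2%, consistent with the intuition the abstract relies on; larger errors (as in the spinal canal) arise when the dose gradient at the boundary is steep.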
Automated data acquisition and processing for a Hohlraum reflectometer
NASA Technical Reports Server (NTRS)
Difilippo, Frank; Mirtich, Michael J.
1988-01-01
A computer and data acquisition board were used to automate a Perkin-Elmer Model 13 spectrophotometer with a Hohlraum reflectivity attachment. Additional electronic circuitry was necessary for amplification, filtering, and debouncing. The computer was programmed to calculate spectral emittance from 1.7 to 14.7 micrometers and also total emittance versus temperature. Automation of the Hohlraum reflectometer reduced the time required to determine total emittance versus temperature from about three hours to about 40 minutes.
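The total-emittance calculation the program performed is, in standard form, a Planck-weighted band average of the spectral emittance. The sketch below is an illustration of that textbook calculation, not the original instrument code; the band limits match the 1.7 to 14.7 micrometer range quoted above, and c2 is the second radiation constant in micrometer-kelvins.

```python
import math

def planck(lam_um, temp_k):
    # blackbody spectral radiance up to a constant factor; lam in micrometers
    c2 = 14388.0  # second radiation constant, um*K
    return lam_um ** -5 / (math.exp(c2 / (lam_um * temp_k)) - 1.0)

def total_emittance(eps, temp_k, lam_lo=1.7, lam_hi=14.7, n=500):
    # trapezoidal Planck-weighted average of spectral emittance eps(lam)
    num = den = 0.0
    dl = (lam_hi - lam_lo) / n
    for i in range(n + 1):
        lam = lam_lo + i * dl
        w = planck(lam, temp_k) * (0.5 if i in (0, n) else 1.0)
        num += eps(lam) * w
        den += w
    return num / den
```

A gray body (constant spectral emittance) recovers the same value as its total emittance, which is a quick sanity check on the weighting.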
Computer automation of ultrasonic testing. [inspection of ultrasonic welding
NASA Technical Reports Server (NTRS)
Yee, B. G. W.; Kerlin, E. E.; Gardner, A. H.; Dunmyer, D.; Wells, T. G.; Robinson, A. R.; Kunselman, J. S.; Walker, T. C.
1974-01-01
Report describes a prototype computer-automated ultrasonic system developed for the inspection of weldments. This system can be operated in three modes: manual, automatic, and computer-controlled. In the computer-controlled mode, the system will automatically acquire, process, analyze, store, and display ultrasonic inspection data in real-time. Flaw size (in cross-section), location (depth), and type (porosity-like or crack-like) can be automatically discerned and displayed. The results and pertinent parameters are recorded.
Does Automated Feedback Improve Writing Quality?
ERIC Educational Resources Information Center
Wilson, Joshua; Olinghouse, Natalie G.; Andrada, Gilbert N.
2014-01-01
The current study examines data from students in grades 4-8 who participated in a statewide computer-based benchmark writing assessment that featured automated essay scoring and automated feedback. We examined whether the use of automated feedback was associated with gains in writing quality across revisions to an essay, and with transfer effects…
Workshop on Office Automation and Telecommunication: Applying the Technology.
ERIC Educational Resources Information Center
Mitchell, Bill
This document contains 12 outlines that forecast the office of the future. The outlines cover the following topics: (1) office automation definition and objectives; (2) functional categories of office automation software packages for mini and mainframe computers; (3) office automation-related software for microcomputers; (4) office automation…
Translations on USSR Science and Technology Physical Sciences and Technology No. 18
1977-09-19
and Avetik Gukasyan discuss component arrangement alternatives. COPYRIGHT: Notice not available 8545 CSO: 1870 CYBERNETICS, COMPUTERS AND... 1974. CYBERNETICS, COMPUTERS AND AUTOMATION TECHNOLOGY: 'PROYEKT' COMPUTER-ASSISTED DESIGN SYSTEM... throughout the world are struggling. The "Proyekt" system, produced in the Institute of Cybernetics, assists in automating the design and manufacture of
NASA Astrophysics Data System (ADS)
Markelov, A. Y.; Shiryaevskii, V. L.; Kudrinskiy, A. A.; Anpilov, S. V.; Bobrakov, A. N.
2017-11-01
A computational method of analysis of physical and chemical processes of high-temperature mineralizing of low-level radioactive waste in gas stream in the process of plasma treatment of radioactive waste in shaft furnaces was introduced. It was shown that the thermodynamic simulation method allows fairly adequately describing the changes in the composition of the pyrogas withdrawn from the shaft furnace at different waste treatment regimes. This offers a possibility of developing environmentally and economically viable technologies and small-sized low-cost facilities for plasma treatment of radioactive waste to be applied at currently operating nuclear power plants.
ELECTRONIC ANALOG COMPUTER FOR DETERMINING RADIOACTIVE DISINTEGRATION
Robinson, H.P.
1959-07-14
A computer is presented for determining growth and decay curves for elements in a radioactive disintegration series wherein one unstable element decays to form a second unstable element or isotope, which in turn forms a third element, etc. The growth and decay curves of radioactive elements are simulated by the charge and discharge curves of a resistance-capacitance network. Several such networks having readily adjustable values are connected in series with an amplifier between each successive pair. The time constant of each of the various networks is set proportional to the half-life of a corresponding element in the series represented and the charge and discharge curves of each of the networks simulates the element growth and decay curve.
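The analog correspondence described in this patent record, RC time constant set proportional to half-life, can be checked numerically. The sketch below is an illustrative digital stand-in for the circuit, not the patented design: each buffered stage discharges with time constant tau = 1/lambda and is charged by the previous stage, exactly mirroring dN/dt = lambda_prev*N_prev − lambda*N.

```python
import math

def simulate_rc_chain(v0, taus, t_end, dt=1e-3):
    # Forward-Euler integration of buffered RC stages: stage i discharges
    # with time constant taus[i] and (for i > 0) is charged by stage i-1,
    # mimicking a decay chain with decay constant lambda_i = 1/taus[i].
    v = [v0] + [0.0] * (len(taus) - 1)
    for _ in range(int(round(t_end / dt))):
        dv = []
        for i, tau in enumerate(taus):
            inflow = v[i - 1] / taus[i - 1] if i > 0 else 0.0
            dv.append(inflow - v[i] / tau)
        v = [vi + dt * dvi for vi, dvi in zip(v, dv)]
    return v
```

With a parent of half-life 1 (tau = 1/ln 2) feeding a nearly stable daughter (very large tau), one half-life leaves about half the "atoms" in each stage, and the total is conserved, just as the charge/discharge curves of the networks track growth and decay curves.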
Automated Quantification of Pneumothorax in CT
Do, Synho; Salvaggio, Kristen; Gupta, Supriya; Kalra, Mannudeep; Ali, Nabeel U.; Pien, Homer
2012-01-01
An automated, computer-aided diagnosis (CAD) algorithm for the quantification of pneumothoraces from Multidetector Computed Tomography (MDCT) images has been developed. Algorithm performance was evaluated through comparison to manual segmentation by expert radiologists. A combination of two-dimensional and three-dimensional processing techniques was incorporated to reduce required processing time by two-thirds (as compared to similar techniques). Volumetric measurements on relative pneumothorax size were obtained and the overall performance of the automated method shows an average error of just below 1%. PMID:23082091
A Review of Developments in Computer-Based Systems to Image Teeth and Produce Dental Restorations
Rekow, E. Dianne; Erdman, Arthur G.; Speidel, T. Michael
1987-01-01
Computer-aided design and manufacturing (CAD/CAM) make it possible to automate the creation of dental restorations. Currently practiced techniques are described. Three automated systems currently under development are described and compared. Advances in computer-aided design and computer-aided manufacturing (CAD/CAM) provide a new option for dentistry, creating an alternative technique for producing dental restorations. It is possible to create dental restorations that are automatically produced and meet or exceed current requirements for fit and occlusion.
Automation; The New Industrial Revolution.
ERIC Educational Resources Information Center
Arnstein, George E.
Automation is a word that describes the workings of computers and the innovations of automatic transfer machines in the factory. As the hallmark of the new industrial revolution, computers displace workers and create a need for new skills and retraining programs. With improved communication between industry and the educational community to…
Final-Approach-Spacing Subsystem For Air Traffic
NASA Technical Reports Server (NTRS)
Davis, Thomas J.; Erzberger, Heinz; Bergeron, Hugh
1992-01-01
Automation subsystem of computers, computer workstations, communication equipment, and radar helps air-traffic controllers in terminal radar approach-control (TRACON) facility manage sequence and spacing of arriving aircraft for both efficiency and safety. Called FAST (Final Approach Spacing Tool), subsystem enables controllers to choose among various levels of automation.
ERIC Educational Resources Information Center
Kibirige, Harry M.
1991-01-01
Discussion of the potential effects of fiber optic-based communication technology on information networks and systems design highlights library automation. Topics discussed include computers and telecommunications systems, the importance of information in national economies, microcomputers, local area networks (LANs), national computer networks,…
An Introduction to Archival Automation: A RAMP Study with Guidelines.
ERIC Educational Resources Information Center
Cook, Michael
Developed under a contract with the International Council on Archives, these guidelines are designed to emphasize the role of automation techniques in archives and records services, provide an indication of existing computer systems used in different archives services and of specific computer applications at various stages of archives…
Kinetics analysis and quantitative calculations for the successive radioactive decay process
NASA Astrophysics Data System (ADS)
Zhou, Zhiping; Yan, Deyue; Zhao, Yuliang; Chai, Zhifang
2015-01-01
The general radioactive decay kinetics equations with branching were developed and the analytical solutions were derived by the Laplace transform method. The time dependence of all the nuclide concentrations can be easily obtained by applying the equations to any known radioactive decay series. Taking the thorium decay series as an example, the time evolution of the concentrations of the various member nuclides of the family was obtained by quantitative numerical calculation on a computer. The method can be applied to quantitative prediction and analysis of the daughter nuclides in successive decay with branching in complicated radioactive processes, such as the natural radioactive decay series, nuclear reactors, nuclear waste disposal, nuclear spallation, synthesis and identification of superheavy nuclides, radioactive ion beam physics and chemistry, etc.
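The branching-free core of such Laplace-transform solutions is the classical Bateman formula. As an illustrative sketch (the authors' code is not given in the abstract), a linear chain with distinct decay constants can be evaluated directly:

```python
import math

def bateman(n1_0, lambdas, t):
    """Atoms of the n-th member of a linear decay chain at time t,
    given n1_0 parent atoms at t = 0 and distinct decay constants
    `lambdas` (per unit time); branching would sum such chains."""
    n = len(lambdas)
    coeff = n1_0
    for lam in lambdas[:-1]:      # production rates along the chain
        coeff *= lam
    total = 0.0
    for i in range(n):
        denom = 1.0
        for j in range(n):
            if j != i:
                denom *= lambdas[j] - lambdas[i]
        total += math.exp(-lambdas[i] * t) / denom
    return coeff * total
```

For a single nuclide this reduces to simple exponential decay; a stable end member (decay constant 0) accumulates everything the parent loses. Repeated (non-distinct) decay constants need the degenerate form of the formula, which this sketch does not handle.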
A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
Safety Metrics for Human-Computer Controlled Systems
NASA Technical Reports Server (NTRS)
Leveson, Nancy G; Hatanaka, Iwao
2000-01-01
The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.
Philip A. Araman; Janice K. Wiedenbeck
1995-01-01
Automated lumber grading and yield optimization using computer controlled saws will be plausible for hardwoods if and when lumber scanning systems can reliably identify all defects by type. Existing computer programs could then be used to grade the lumber, identify the best cut-up solution, and control the sawing machines. The potential value of a scanning grading...
DOT National Transportation Integrated Search
1994-04-30
The Transportation Legislative Data Base (TLDB) is a computer-based information service containing summaries of federal, state and certain local government statutes and regulations relating to the transportation of radioactive materials in the United...
Legaz-García, María Del Carmen; Dentler, Kathrin; Fernández-Breis, Jesualdo Tomás; Cornet, Ronald
2017-01-01
ArchMS is a framework that represents clinical information and knowledge using ontologies in OWL, which facilitates semantic interoperability and thereby the exploitation and secondary use of clinical data. However, it does not yet support the automated assessment of quality of care. CLIF is a stepwise method to formalize quality indicators. The method has been implemented in the CLIF tool which supports its users in generating computable queries based on a patient data model which can be based on archetypes. To enable the automated computation of quality indicators using ontologies and archetypes, we tested whether ArchMS and the CLIF tool can be integrated. We successfully automated the process of generating SPARQL queries from quality indicators that have been formalized with CLIF and integrated them into ArchMS. Hence, ontologies and archetypes can be combined for the execution of formalized quality indicators.
Multiphasic Health Testing in the Clinic Setting
LaDou, Joseph
1971-01-01
The economy of automated multiphasic health testing (AMHT) activities patterned after the high-volume Kaiser program can be realized in low-volume settings. AMHT units have been operated at daily volumes of 20 patients in three separate clinical environments. These programs have displayed economics entirely compatible with cost figures published by the established high-volume centers. This experience, plus the expanding capability of small, general-purpose digital computers (minicomputers), indicates that a group of six or more physicians generating 20 laboratory appraisals per day can economically justify a completely automated multiphasic health testing facility. This system would reside in the clinic or hospital where it is used and can be configured to perform analyses such as electrocardiography, generate laboratory reports, and communicate with large computer systems in university medical centers. Experience indicates that the most effective means of implementing these benefits of automation is to make them directly available to the medical community with the physician playing the central role. Economic justification of a dedicated computer through low-volume health testing then allows, as a side benefit, automation of administrative as well as other diagnostic activities, for example, patient billing, computer-aided diagnosis, and computer-aided therapeutics. PMID:4935771
Measurement of fission product gases in the atmosphere
NASA Astrophysics Data System (ADS)
Schell, W. R.; Tobin, M. J.; Marsan, D. J.; Schell, C. W.; Vives-Batlle, J.; Yoon, S. R.
1997-01-01
The ability to quickly detect and assess the magnitude of releases of fission-produced radioactive material is of significant importance for ongoing operations of any conventional nuclear power plant or other activities with a potential for fission product release. In most instances, the control limits for the release of airborne radioactivity are low enough to preclude direct air sampling as a means of detection, especially for fission gases that decay by beta or electron emission. It is, therefore, customary to concentrate the major gaseous fission products (krypton, xenon and iodine) by cryogenic adsorption for subsequent separation and measurement. This study summarizes our initial efforts to develop an automated portable system for on-line separation and concentration with the potential for measuring environmental levels of radioactive gases, including 85Kr, 131,133,135Xe, 14C, 3H, 35S, 125,131I, etc., without using cryogenic fluids. Bench top and prototype models were constructed using the principle of heatless fractionation of the gases in a pressure swing system. This method removes the requirement for cryogenic fluids to concentrate gases and, with suitable electron and gamma ray detectors, provides for remote use under automatic computer control. Early results using 133Xe tracer show that kinetic chromatography, i.e., high pressure adsorption of xenon and low pressure desorption of air, using specific types of molecular sieves, permits the separation and quantification of xenon isotopes from large volume air samples. We are now developing the ability to measure the presence and amounts of fission-produced xenon isotopes that decay by internal conversion electrons and beta radiation with short half-lives, namely 131mXe, 11.8 d, 133mXe, 2.2 d, 133Xe, 5.2 d and 135Xe, 9.1 h. 
The ratio of the isotopic concentrations measured can be used to determine unequivocally the amount of fission gas and time of release of an air parcel many kilometers downwind from a nuclear activity where the fission products were discharged.
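The release-time determination from an isotopic ratio can be sketched from the half-lives quoted above. This is a simplified illustration, not the authors' procedure: it treats the 133mXe/133Xe ratio as decaying with the difference of the two decay constants and ignores the isomeric-transition feeding of the ground state.

```python
import math

HALF_LIFE_XE133M = 2.2  # days, as quoted in the abstract
HALF_LIFE_XE133 = 5.2   # days

def decay_const(half_life_days):
    return math.log(2.0) / half_life_days

def release_age(ratio_now, ratio_at_release):
    # R(t) = R0 * exp(-(lam_m - lam_g) * t)  =>  solve for t in days
    lam_m = decay_const(HALF_LIFE_XE133M)
    lam_g = decay_const(HALF_LIFE_XE133)
    return math.log(ratio_at_release / ratio_now) / (lam_m - lam_g)
```

Because the metastable state decays faster, the measured ratio shrinks with age, so a downwind sample's ratio together with an assumed fission-yield ratio at release dates the discharge.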
ERIC Educational Resources Information Center
Journal of College Science Teaching, 1976
1976-01-01
Presents information on storage of radioactive wastes, the AAAS February meeting, an NSF publication on doctoral scientists and engineers, the Research Associateship programs of the National Research Council, the international congress on cybernetics, the effects of nuclear war on noncombatants, radioactivity in drinking water, and computer based…
Using Software Tools to Automate the Assessment of Student Programs.
ERIC Educational Resources Information Center
Jackson, David
1991-01-01
Argues that advent of computer-aided instruction (CAI) systems for teaching introductory computer programing makes it imperative that software be developed to automate assessment and grading of student programs. Examples of typical student programing problems are given, and application of the Unix tools Lex and Yacc to the automatic assessment of…
In-House Automation of a Small Library Using a Mainframe Computer.
ERIC Educational Resources Information Center
Waranius, Frances B.; Tellier, Stephen H.
1986-01-01
An automated library routine management system was developed in-house to create system unique to the Library and Information Center, Lunar and Planetary Institute, Houston, Texas. A modular approach was used to allow continuity in operations and services as system was implemented. Acronyms and computer accounts and file names are appended.…
Automated Management Of Documents
NASA Technical Reports Server (NTRS)
Boy, Guy
1995-01-01
Report presents main technical issues involved in computer-integrated documentation. Problems associated with automation of management and maintenance of documents analyzed from perspectives of artificial intelligence and human factors. Technologies that may prove useful in computer-integrated documentation reviewed: these include conventional approaches to indexing and retrieval of information, use of hypertext, and knowledge-based artificial-intelligence systems.
ERIC Educational Resources Information Center
Nakamura, Christopher M.; Murphy, Sytil K.; Christel, Michael G.; Stevens, Scott M.; Zollman, Dean A.
2016-01-01
Computer-automated assessment of students' text responses to short-answer questions represents an important enabling technology for online learning environments. We have investigated the use of machine learning to train computer models capable of automatically classifying short-answer responses and assessed the results. Our investigations are part…
ERIC Educational Resources Information Center
Epstein, A. H.; And Others
The first phase of an ongoing library automation project at Stanford University is described. Project BALLOTS (Bibliographic Automation of Large Library Operations Using a Time-Sharing System) seeks to automate the acquisition and cataloging functions of a large library using an on-line time-sharing computer. The main objectives are to control…
The automation of an inlet mass flow control system
NASA Technical Reports Server (NTRS)
Supplee, Frank; Tcheng, Ping; Weisenborn, Michael
1989-01-01
The automation of a closed-loop computer-controlled system for the inlet mass flow system (IMFS) developed for a wind tunnel facility at Langley Research Center is presented. This new PC-based control system is intended to replace the manual control system presently in use in order to fully automate the plug positioning of the IMFS during wind tunnel testing. Provision is also made for communication between the PC and a host computer in order to allow total automation of the plug positioning and data acquisition during the complete sequence of predetermined plug locations. As extensive running time is programmed for the IMFS, this new automated system will save both manpower and tunnel running time.
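The closed-loop plug positioning described above can be caricatured as a minimal feedback loop. This is an illustrative sketch only; the actual IMFS control law, gains, and hardware interface are not described in the abstract, and the proportional update and names below are assumptions.

```python
def step_plug(position, setpoint, gain=0.3):
    # one proportional-control update toward the commanded plug position
    return position + gain * (setpoint - position)

def settle(position, setpoint, tol=1e-3, max_steps=200):
    # iterate the loop until the plug is within tolerance of the setpoint,
    # as the automated system would do at each predetermined plug location
    for step in range(max_steps):
        if abs(setpoint - position) < tol:
            return position, step
        position = step_plug(position, setpoint)
    return position, max_steps
```

Sequencing such a loop through a list of predetermined setpoints, with a data-acquisition call at each settled position, is the essence of the automation that replaced manual plug positioning.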
NMR-based automated protein structure determination.
Würz, Julia M; Kazemi, Sina; Schmidt, Elena; Bagaria, Anurag; Güntert, Peter
2017-08-15
NMR spectra analysis for protein structure determination can now in many cases be performed by automated computational methods. This overview of the computational methods for NMR protein structure analysis presents recent automated methods for signal identification in multidimensional NMR spectra, sequence-specific resonance assignment, collection of conformational restraints, and structure calculation, as implemented in the CYANA software package. These algorithms are sufficiently reliable and integrated into one software package to enable the fully automated structure determination of proteins starting from NMR spectra without manual interventions or corrections at intermediate steps, with an accuracy of 1-2 Å backbone RMSD in comparison with manually solved reference structures. Copyright © 2017 Elsevier Inc. All rights reserved.
1981-02-01
Battlefield automated systems; human-computer interaction; design criteria; system... Report (this report) In-Depth Analyses of Individual Systems: A. Tactical Fire Direction System (TACFIRE) (RP 81-26); B. Tactical Computer Terminal... select the design features and operating procedures of the human-computer interface which best match the requirements and capabilities of anticipated
A Computational Workflow for the Automated Generation of Models of Genetic Designs.
Misirli, Göksel; Nguyen, Tramy; McLaughlin, James Alastair; Vaidyanathan, Prashant; Jones, Timothy S; Densmore, Douglas; Myers, Chris; Wipat, Anil
2018-06-05
Computational models are essential to engineer predictable biological systems and to scale up this process for complex systems. Computational modeling often requires expert knowledge and data to build models. Clearly, manual creation of models is not scalable for large designs. Despite several automated model construction approaches, computational methodologies to bridge knowledge in design repositories and the process of creating computational models have still not been established. This paper describes a workflow for automatic generation of computational models of genetic circuits from data stored in design repositories using existing standards. This workflow leverages the software tool SBOLDesigner to build structural models that are then enriched by the Virtual Parts Repository API using Systems Biology Open Language (SBOL) data fetched from the SynBioHub design repository. The iBioSim software tool is then utilized to convert this SBOL description into a computational model encoded using the Systems Biology Markup Language (SBML). Finally, this SBML model can be simulated using a variety of methods. This workflow provides synthetic biologists with easy to use tools to create predictable biological systems, hiding away the complexity of building computational models. This approach can further be incorporated into other computational workflows for design automation.
Computer Series, 13: Bits and Pieces, 11.
ERIC Educational Resources Information Center
Moore, John W., Ed.
1982-01-01
Describes computer programs (with ordering information) on various topics including, among others, modeling of thermodynamics and economics of solar energy, radioactive decay simulation, stoichiometry drill/tutorial (in Spanish), computer-generated safety quiz, medical chemistry computer game, medical biochemistry question bank, generation of…
NASA Astrophysics Data System (ADS)
Kawai, Ryosuke; Hara, Takeshi; Katafuchi, Tetsuro; Ishihara, Tadahiko; Zhou, Xiangrong; Muramatsu, Chisako; Abe, Yoshiteru; Fujita, Hiroshi
2015-03-01
MIBG (iodine-123-meta-iodobenzylguanidine) is a radioactive medicine that is used to help diagnose not only myocardial diseases but also Parkinson's disease (PD) and dementia with Lewy bodies (DLB). The difficulty of segmentation around the myocardium often reduces the consistency of measurement results. One of the most common measurement methods is the ratio of the uptake values of the heart to mediastinum (H/M). This ratio is stable, independent of the operator, when the uptake value in the myocardium region is clearly higher than that in the background; however, it becomes an unreliable index when the myocardium region is unclear because of low uptake values. This study aims to develop a new measurement method that uses the image fusion of three modalities, MIBG scintigrams, 201-Tl scintigrams, and chest radiograms, to increase the reliability of the H/M measurement results. Our automated method consists of the following steps: (1) construct a left ventricular (LV) map from a 201-Tl myocardium image database, (2) determine the heart region in chest radiograms, (3) determine the mediastinum region in chest radiograms, (4) perform image fusion of chest radiograms and MIBG scintigrams, and (5) perform H/M measurements on MIBG scintigrams by using the locations of the heart and mediastinum determined on the chest radiograms. We collected 165 cases with 201-Tl scintigrams and chest radiograms to construct the LV map. Another 65 cases with MIBG scintigrams and chest radiograms were collected for the measurements. Four radiological technologists (RTs) manually measured the H/M in the MIBG images. We compared the four RTs' results with our computer outputs by using Pearson's correlation, the Bland-Altman method, and the equivalency test method. As a result, the correlations of the H/M between the four RTs and the computer were 0.85 to 0.88. We confirmed systematic errors between the four RTs and the computer as well as among the four RTs.
The variation range of the H/M among the four RTs was obtained as 0.22 based on the equivalency test method. The computer outputs fell within this range. We concluded that our image fusion method yields H/M values equivalent to those measured by the RTs.
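The H/M index described above reduces to a ratio of mean counts in two regions of interest. A minimal sketch with hypothetical pixel counts; the hard part the paper automates, delineating the two regions on the fused images, is not shown:

```python
def heart_to_mediastinum(heart_pixels, mediastinum_pixels):
    """Mean-count ratio of the heart ROI to the mediastinum ROI (H/M).

    Arguments are flat lists of pixel counts from the two regions, which
    in the paper's method are located automatically on the fused chest
    radiogram / MIBG scintigram.
    """
    h = sum(heart_pixels) / len(heart_pixels)
    m = sum(mediastinum_pixels) / len(mediastinum_pixels)
    return h / m

heart = [220, 240, 230, 250]        # hypothetical uptake counts
mediastinum = [100, 95, 105, 100]
print(round(heart_to_mediastinum(heart, mediastinum), 2))  # → 2.35
```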
NASA Astrophysics Data System (ADS)
Fiorini, Rodolfo A.; Dacquino, Gianfranco
2005-03-01
GEOGINE (GEOmetrical enGINE), a state-of-the-art OMG (Ontological Model Generator) based on n-D Tensor Invariants for n-Dimensional shape/texture optimal synthetic representation, description and learning, was presented at previous conferences. Improved computational algorithms based on the computational invariant theory of finite groups in Euclidean space and a demo application are presented. Progressive automatic model generation is discussed. GEOGINE can be used as an efficient computational kernel for fast, reliable application development and delivery in advanced biomedical engineering, biometrics, intelligent computing, target recognition, content image retrieval, and data mining. Ontology can be regarded as a logical theory accounting for the intended meaning of a formal dictionary, i.e., its ontological commitment to a particular conceptualization of the world object. According to this approach, "n-D Tensor Calculus" can be considered a "Formal Language" to reliably compute optimized "n-Dimensional Tensor Invariants" as specific object "invariant parameter and attribute words" for automated n-Dimensional shape/texture optimal synthetic object description by incremental model generation. The class of those "invariant parameter and attribute words" can be thought of as a specific "Formal Vocabulary" learned from a "Generalized Formal Dictionary" of the "Computational Tensor Invariants" language. Even object chromatic attributes can be effectively and reliably computed from object geometric parameters into robust colour shape invariant characteristics. As a matter of fact, any highly sophisticated application needing effective, robust object geometric/colour invariant attribute capture and parameterization features for reliable automated object learning and discrimination can deeply benefit from GEOGINE's progressive automated model generation computational kernel performance.
Main operational advantages over previous, similar approaches are: 1) Progressive Automated Invariant Model Generation, 2) Invariant Minimal Complete Description Set for computational efficiency, 3) Arbitrary Model Precision for robust object description and identification.
DOT National Transportation Integrated Search
1974-08-01
Volume 4 describes the automation requirements. A presentation of automation requirements is made for an advanced air traffic management system in terms of controller work force, computer resources, controller productivity, system manning, failure ef...
Large-Scale Document Automation: The Systems Integration Issue.
ERIC Educational Resources Information Center
Kalthoff, Robert J.
1985-01-01
Reviews current technologies for electronic imaging and its recording and transmission, including digital recording, optical data disks, automated image-delivery micrographics, high-density-magnetic recording, and new developments in telecommunications and computers. The role of the document automation systems integrator, who will bring these…
Computer-controlled attenuator.
Mitov, D; Grozev, Z
1991-01-01
Various possibilities for applying electronic computer-controlled attenuators to the automation of physiological experiments are considered. A detailed description is given of the design of a 4-channel computer-controlled attenuator, in two of whose channels the output signal changes by a linear step, and in the other two by a logarithmic step. This, together with additional programmable timers, makes it possible to automate a wide range of studies in different spheres of physiology and psychophysics, including vision and hearing.
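The two channel types can be sketched as follows. This is an illustrative model only, not the instrument's control code, and the assumption that the logarithmic channel steps in fixed decibels is ours:

```python
def linear_steps(start, step, n):
    """Output levels for a channel whose signal changes by a linear step."""
    return [start + step * i for i in range(n)]

def log_steps(start, db_step, n):
    """Output levels for a channel attenuated in fixed-decibel (logarithmic)
    steps: each step multiplies the amplitude by 10**(-db_step / 20)."""
    factor = 10 ** (-db_step / 20)
    return [start * factor ** i for i in range(n)]

print([round(v, 2) for v in linear_steps(1.0, -0.1, 4)])   # → [1.0, 0.9, 0.8, 0.7]
print([round(v, 3) for v in log_steps(1.0, 6.0, 4)])       # → [1.0, 0.501, 0.251, 0.126]
```

A logarithmic (constant-dB) progression matches how loudness and brightness are perceived, which is why such channels are useful in vision and hearing studies.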
Analysis of Delays in Transmitting Time Code Using an Automated Computer Time Distribution System
1999-12-01
jlevine@clock.bldrdoc.gov Abstract: An automated computer time distribution system broadcasts standard time to users using computers and modems via... contributed to delays: software platform (50% of the delay), transmission speed of time-codes (25%), telephone network (15%), modem and others (10%). The... modems, and telephone lines. Users dial the ACTS server to receive time traceable to the national time scale of Singapore, UTC(PSB). The users can in
ERIC Educational Resources Information Center
Divilbiss, J. L., Ed.
To help the librarian in negotiating with vendors of automated library services, nine authors have presented methods of dealing with a specific service or situation. Paper topics include computer services, network contracts, innovative service, data processing, automated circulation, a turn-key system, data base sharing, online data base services,…
Automated Detection of Heuristics and Biases among Pathologists in a Computer-Based System
ERIC Educational Resources Information Center
Crowley, Rebecca S.; Legowski, Elizabeth; Medvedeva, Olga; Reitmeyer, Kayse; Tseytlin, Eugene; Castine, Melissa; Jukic, Drazen; Mello-Thoms, Claudia
2013-01-01
The purpose of this study is threefold: (1) to develop an automated, computer-based method to detect heuristics and biases as pathologists examine virtual slide cases, (2) to measure the frequency and distribution of heuristics and errors across three levels of training, and (3) to examine relationships of heuristics to biases, and biases to…
Students' Perceived Usefulness of Formative Feedback for a Computer-Adaptive Test
ERIC Educational Resources Information Center
Lilley, Mariana; Barker, Trevor
2007-01-01
In this paper we report on research related to the provision of automated feedback based on a computer adaptive test (CAT), used in formative assessment. A cohort of 76 second year university undergraduates took part in a formative assessment with a CAT and were provided with automated feedback on their performance. A sample of students responded…
Identifying and locating surface defects in wood: Part of an automated lumber processing system
Richard W. Conners; Charles W. McMillin; Kingyao Lin; Ramon E. Vasquez-Espinosa
1983-01-01
Continued increases in the cost of materials and labor make it imperative for furniture manufacturers to control costs by improved yield and increased productivity. This paper describes an Automated Lumber Processing System (ALPS) that employs computer tomography, optical scanning technology, the calculation of an optimum cutting strategy, and a computer-driven laser...
Automated Estimation Of Software-Development Costs
NASA Technical Reports Server (NTRS)
Roush, George B.; Reini, William
1993-01-01
COSTMODL is an automated software-development estimation tool. Yields significant reduction in risk of cost overruns and failed projects. Accepts description of software product to be developed and computes estimates of effort required to produce it, calendar schedule required, and distribution of effort and staffing as function of defined set of development life-cycle phases. Written for IBM PC(R)-compatible computers.
ERIC Educational Resources Information Center
Klein, David C.
2014-01-01
As advancements in automation continue to alter the systemic behavior of computer systems in a wide variety of industrial applications, human-machine interactions are increasingly becoming supervisory in nature, with less hands-on human involvement. This maturing of the human role within the human-computer relationship is relegating operations…
ERIC Educational Resources Information Center
Federal Information Processing Standards Publication, 1976
1976-01-01
These guidelines provide a basis for determining the content and extent of documentation for computer programs and automated data systems. Content descriptions of ten document types plus examples of how management can determine when to use the various types are included. The documents described are (1) functional requirements documents, (2) data…
Automated model-based quantitative analysis of phantoms with spherical inserts in FDG PET scans.
Ulrich, Ethan J; Sunderland, John J; Smith, Brian J; Mohiuddin, Imran; Parkhurst, Jessica; Plichta, Kristin A; Buatti, John M; Beichel, Reinhard R
2018-01-01
Quality control plays an increasingly important role in quantitative PET imaging and is typically performed using phantoms. The purpose of this work was to develop and validate a fully automated analysis method for two common PET/CT quality assurance phantoms: the NEMA NU-2 IQ and SNMMI/CTN oncology phantom. The algorithm was designed to only utilize the PET scan to enable the analysis of phantoms with thin-walled inserts. We introduce a model-based method for automated analysis of phantoms with spherical inserts. Models are first constructed for each type of phantom to be analyzed. A robust insert detection algorithm uses the model to locate all inserts inside the phantom. First, candidates for inserts are detected using a scale-space detection approach. Second, candidates are given an initial label using a score-based optimization algorithm. Third, a robust model fitting step aligns the phantom model to the initial labeling and fixes incorrect labels. Finally, the detected insert locations are refined and measurements are taken for each insert and several background regions. In addition, an approach for automated selection of NEMA and CTN phantom models is presented. The method was evaluated on a diverse set of 15 NEMA and 20 CTN phantom PET/CT scans. NEMA phantoms were filled with radioactive tracer solution at 9.7:1 activity ratio over background, and CTN phantoms were filled with 4:1 and 2:1 activity ratio over background. For quantitative evaluation, an independent reference standard was generated by two experts using PET/CT scans of the phantoms. In addition, the automated approach was compared against manual analysis, which represents the current clinical standard approach, of the PET phantom scans by four experts. The automated analysis method successfully detected and measured all inserts in all test phantom scans. 
It is a deterministic algorithm (zero variability), and the insert detection RMS error (i.e., bias) was 0.97, 1.12, and 1.48 mm for phantom activity ratios 9.7:1, 4:1, and 2:1, respectively. For all phantoms and at all contrast ratios, the average RMS error was found to be significantly lower for the proposed automated method compared to the manual analysis of the phantom scans. The uptake measurements produced by the automated method showed high correlation with the independent reference standard (R² ≥ 0.9987). In addition, the average computing time for the automated method was 30.6 s and was found to be significantly lower (P ≪ 0.001) compared to manual analysis (mean: 247.8 s). The proposed automated approach was found to have less error when measured against the independent reference than the manual approach. It can be easily adapted to other phantoms with spherical inserts. In addition, it eliminates inter- and intraoperator variability in PET phantom analysis and is significantly more time efficient, and therefore, represents a promising approach to facilitate and simplify PET standardization and harmonization efforts. © 2017 American Association of Physicists in Medicine.
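The insert-detection RMS error quoted above is the root-mean-square of the distances between detected and reference insert centers. A minimal sketch with hypothetical center coordinates:

```python
import math

def detection_rms_error(detected, reference):
    """RMS of Euclidean distances between detected and reference insert
    centers (mm). Inputs are parallel lists of (x, y, z) coordinates."""
    assert len(detected) == len(reference)
    sq = [sum((d - r) ** 2 for d, r in zip(p, q))
          for p, q in zip(detected, reference)]
    return math.sqrt(sum(sq) / len(sq))

# Hypothetical centers for three spherical inserts (mm), each detected
# 1 mm away from its reference position.
ref = [(10.0, 0.0, 0.0), (0.0, 20.0, 0.0), (0.0, 0.0, 30.0)]
det = [(11.0, 0.0, 0.0), (0.0, 21.0, 0.0), (0.0, 0.0, 31.0)]
print(round(detection_rms_error(det, ref), 2))  # → 1.0
```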
ERIC Educational Resources Information Center
Kay, Jack G.; And Others
1988-01-01
Describes two applications of the microcomputer for laboratory exercises. Explores radioactive decay using the Batemen equations on a Macintosh computer. Provides examples and screen dumps of data. Investigates polymer configurations using a Monte Carlo simulation on an IBM personal computer. (MVL)
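The Bateman equations used in the exercise above give closed-form populations for a radioactive decay chain. A minimal Python sketch for the two-member case, with hypothetical half-lives; the general n-member solution follows the same pattern of summed exponentials:

```python
import math

def bateman_two_chain(n1_0, lam1, lam2, t):
    """Populations of a two-member decay chain (Bateman equations).

    Returns (N1(t), N2(t)) for parent -> daughter -> (stable), given the
    initial parent population n1_0 and decay constants lam1, lam2, with
    N2(0) = 0. Requires lam1 != lam2.
    """
    n1 = n1_0 * math.exp(-lam1 * t)
    n2 = n1_0 * lam1 / (lam2 - lam1) * (math.exp(-lam1 * t) - math.exp(-lam2 * t))
    return n1, n2

# Hypothetical example: parent half-life 10 h, daughter half-life 1 h,
# evaluated 5 h after starting with a pure parent sample.
lam1 = math.log(2) / 10.0
lam2 = math.log(2) / 1.0
n1, n2 = bateman_two_chain(1e6, lam1, lam2, t=5.0)
print(round(n1), round(n2))
```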
Automating NEURON Simulation Deployment in Cloud Resources.
Stockton, David B; Santamaria, Fidel
2017-01-01
Simulations in neuroscience are performed on local servers or High Performance Computing (HPC) facilities. Recently, cloud computing has emerged as a potential computational platform for neuroscience simulation. In this paper we compare and contrast HPC and cloud resources for scientific computation, then report how we deployed NEURON, a widely used simulator of neuronal activity, in three clouds: Chameleon Cloud, a hybrid private academic cloud for cloud technology research based on the OpenStack software; Rackspace, a public commercial cloud, also based on OpenStack; and Amazon Elastic Cloud Computing, based on Amazon's proprietary software. We describe the manual procedures and how to automate cloud operations. We describe extending our simulation automation software called NeuroManager (Stockton and Santamaria, Frontiers in Neuroinformatics, 2015), so that the user is capable of recruiting private cloud, public cloud, HPC, and local servers simultaneously with a simple common interface. We conclude by performing several studies in which we examine speedup, efficiency, total session time, and cost for sets of simulations of a published NEURON model.
Computational Analysis of Behavior.
Egnor, S E Roian; Branson, Kristin
2016-07-08
In this review, we discuss the emerging field of computational behavioral analysis-the use of modern methods from computer science and engineering to quantitatively measure animal behavior. We discuss aspects of experiment design important to both obtaining biologically relevant behavioral data and enabling the use of machine vision and learning techniques for automation. These two goals are often in conflict. Restraining or restricting the environment of the animal can simplify automatic behavior quantification, but it can also degrade the quality or alter important aspects of behavior. To enable biologists to design experiments to obtain better behavioral measurements, and computer scientists to pinpoint fruitful directions for algorithm improvement, we review known effects of artificial manipulation of the animal on behavior. We also review machine vision and learning techniques for tracking, feature extraction, automated behavior classification, and automated behavior discovery, the assumptions they make, and the types of data they work best with.
The Computer Aided Aircraft-design Package (CAAP)
NASA Technical Reports Server (NTRS)
Yalif, Guy U.
1994-01-01
The preliminary design of an aircraft is a complex, labor-intensive, and creative process. Since the 1970's, many computer programs have been written to help automate preliminary airplane design. Time and resource analyses have identified 'a substantial decrease in project duration with the introduction of an automated design capability'. Proof-of-concept studies have been completed which establish 'a foundation for a computer-based airframe design capability'. Unfortunately, today's design codes exist in many different languages on many, often expensive, hardware platforms. Through the use of a module-based system architecture, the Computer Aided Aircraft-design Package (CAAP) will eventually bring together many of the most useful features of existing programs. Through the use of an expert system, it will add an additional feature that could be described as indispensable to entry level engineers and students: the incorporation of 'expert' knowledge into the automated design process.
Automated Training Evaluation (ATE). Final Report.
ERIC Educational Resources Information Center
Charles, John P.; Johnson, Robert M.
The automation of weapons system training presents the potential for significant savings in training costs in terms of manpower, time, and money. The demonstration of the technical feasibility of automated training through the application of advanced digital computer techniques and advanced training techniques is essential before the application…
Progress in Fully Automated Abdominal CT Interpretation
Summers, Ronald M.
2016-01-01
OBJECTIVE Automated analysis of abdominal CT has advanced markedly over just the last few years. Fully automated assessment of organs, lymph nodes, adipose tissue, muscle, bowel, spine, and tumors are some examples where tremendous progress has been made. Computer-aided detection of lesions has also improved dramatically. CONCLUSION This article reviews the progress and provides insights into what is in store in the near future for automated analysis for abdominal CT, ultimately leading to fully automated interpretation. PMID:27101207
Miller, M; Galloway, B; VanDerpoel, G; Johnson, E; Copland, J; Salazar, M
2000-02-01
Numerous sites in the United States and around the world are contaminated with depleted uranium (DU) in various forms. A prevalent form is fragmented DU originating from various scientific tests involving high explosives and DU during weapon development programs, at firing practice ranges, or in war theaters where DU was used in armor-piercing projectiles. The contamination at these sites is typically very heterogeneous, with discrete, visually identifiable DU fragments mixed with native soil. That is, the bulk-averaged DU activity is quite low, while specific DU fragments, which are distinct from the soil matrix, have much higher specific activity. DU is best known as a dark, black metal that is nearly twice as dense as lead, but DU in the environment readily weathers (oxidizes) to a distinctive bright yellow color that is readily visible. While the specific activity (amount of radioactivity per mass of soil) of DU is relatively low and presents only a minor radiological hazard, the fact that it is radioactive and visually identifiable makes it desirable to remove the DU "contamination" from the environment. The typical approach to conducting this DU remediation is to use radiation detection instruments to identify the contaminant and separate it from the adjacent soil, packaging it for disposal as radioactive waste. This process can be performed manually or by specialized, automated equipment. Alternatively, in certain situations a more cost-effective approach might be simple mechanical or gravimetric separation of the DU fragments from the host soil matrix. At SNL/NM, both the automated and simple mechanical approaches have recently been employed. This paper discusses the pros and cons of the two approaches.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T.; Jessee, Matthew Anderson
The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE's graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.
NASA Astrophysics Data System (ADS)
Na, M.; Lee, S.; Kim, G.; Kim, H. S.; Rho, J.; Ok, J. G.
2017-12-01
Detecting and mapping the spatial distribution of radioactive materials is of great importance for environmental and security issues. We design and present a novel hemispherical rotational modulation collimator (H-RMC) system which can visualize the location of a radiation source by collecting signals from incident rays that pass through collimator masks. The H-RMC system comprises a servo motor-controlled rotating module and a hollow heavy-metallic hemisphere with slits/slats equally spaced at the same angle subtended from the main axis. In addition, we designed an auxiliary instrument to test the imaging performance of the H-RMC system, comprising a high-precision x- and y-axis staging station on which one can mount radiation sources of various shapes. We fabricated the H-RMC system, which can be operated in a fully automated fashion through the computer-based controller, and verified the accuracy and reproducibility of the system by measuring the rotational and linear positions with respect to the programmed values. Our H-RMC system may provide a pivotal tool for spatial radiation imaging with high reliability and accuracy.
Proof-of-concept automation of propellant processing
NASA Technical Reports Server (NTRS)
Ramohalli, Kumar; Schallhorn, P. A.
1989-01-01
For space-based propellant production, automation of the process is needed. Currently, all phases of terrestrial production have some form of human interaction. A mixer was acquired to help perform the tasks of automation. A heating system to be used with the mixer was designed, built, and installed. Tests performed on the heating system verify the design criteria. An IBM PS/2 personal computer was acquired for the future automation work. It is hoped that some of the mixing process itself will be automated. This is a concept demonstration task: proving that propellant production can be automated reliably.
Nursing operations automation and health care technology innovations: 2025 and beyond.
Suby, ChrysMarie
2013-01-01
This article reviews why nursing operations automation is important, reviews the impact of computer technology on nursing from a historical perspective, and considers the future of nursing operations automation and health care technology innovations in 2025 and beyond. The increasing automation in health care organizations will benefit patient care, staffing and scheduling systems and central staffing offices, census control, and measurement of patient acuity.
Automation of Educational Tasks for Academic Radiology.
Lamar, David L; Richardson, Michael L; Carlson, Blake
2016-07-01
The process of education involves a variety of repetitious tasks. We believe that appropriate computer tools can automate many of these chores, and allow both educators and their students to devote a lot more of their time to actual teaching and learning. This paper details tools that we have used to automate a broad range of academic radiology-specific tasks on Mac OS X, iOS, and Windows platforms. Some of the tools we describe here require little expertise or time to use; others require some basic knowledge of computer programming. We used TextExpander (Mac, iOS) and AutoHotKey (Win) for automated generation of text files, such as resident performance reviews and radiology interpretations. Custom statistical calculations were performed using TextExpander and the Python programming language. A workflow for automated note-taking was developed using Evernote (Mac, iOS, Win) and Hazel (Mac). Automated resident procedure logging was accomplished using Editorial (iOS) and Python. We created three variants of a teaching session logger using Drafts (iOS) and Pythonista (iOS). Editorial and Drafts were used to create flashcards for knowledge review. We developed a mobile reference management system for iOS using Editorial. We used the Workflow app (iOS) to automatically generate a text message reminder for daily conferences. Finally, we developed two separate automated workflows-one with Evernote (Mac, iOS, Win) and one with Python (Mac, Win)-that generate simple automated teaching file collections. We have beta-tested these workflows, techniques, and scripts on several of our fellow radiologists. All of them expressed enthusiasm for these tools and were able to use one or more of them to automate their own educational activities. Appropriate computer tools can automate many educational tasks, and thereby allow both educators and their students to devote a lot more of their time to actual teaching and learning. Copyright © 2016 The Association of University Radiologists. 
Published by Elsevier Inc. All rights reserved.
Automation of a DXA-based finite element tool for clinical assessment of hip fracture risk.
Luo, Yunhua; Ahmed, Sharif; Leslie, William D
2018-03-01
Finite element analysis of medical images is a promising tool for assessing hip fracture risk. Although a number of finite element models have been developed for this purpose, none of them have been routinely used in clinic. The main reason is that the computer programs that implement the finite element models have not been completely automated, and heavy training is required before clinicians can effectively use them. By using information embedded in clinical dual energy X-ray absorptiometry (DXA), we completely automated a DXA-based finite element (FE) model that we previously developed for predicting hip fracture risk. The automated FE tool can be run as a standalone computer program with the subject's raw hip DXA image as input. The automated FE tool had greatly improved short-term precision compared with the semi-automated version. To validate the automated FE tool, a clinical cohort consisting of 100 prior hip fracture cases and 300 matched controls was obtained from a local community clinical center. Both the automated FE tool and femoral bone mineral density (BMD) were applied to discriminate the fracture cases from the controls. Femoral BMD is the gold standard reference recommended by the World Health Organization for screening osteoporosis and for assessing hip fracture risk. The accuracy was measured by the area under ROC curve (AUC) and odds ratio (OR). Compared with femoral BMD (AUC = 0.71, OR = 2.07), the automated FE tool had a considerably improved accuracy (AUC = 0.78, OR = 2.61 at the trochanter). This work made a large step toward applying our DXA-based FE model as a routine clinical tool for the assessment of hip fracture risk. Furthermore, the automated computer program can be embedded into a web-site as an internet application. Copyright © 2017 Elsevier B.V. All rights reserved.
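The AUC figures quoted above measure how well a score separates fracture cases from controls: AUC equals the probability that a randomly chosen case outscores a randomly chosen control (the Mann-Whitney statistic). A minimal sketch with hypothetical risk scores:

```python
def roc_auc(case_scores, control_scores):
    """Area under the ROC curve via the Mann-Whitney U statistic: the
    fraction of case/control pairs in which the case scores higher,
    with ties counting one half."""
    wins = 0.0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

# Hypothetical fracture-risk scores (higher = predicted higher risk).
cases = [0.9, 0.8, 0.6, 0.55]
controls = [0.7, 0.5, 0.4, 0.3]
print(roc_auc(cases, controls))  # → 0.875
```

An AUC of 0.5 means the score is no better than chance; 1.0 means perfect separation, so the improvement from 0.71 (BMD) to 0.78 (FE tool) is a meaningful gain in discrimination.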
Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian
2011-08-30
Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines that distribute pre-packaged, pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome, and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources, and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports the use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lower the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing.
What's New in the Library Automation Arena?
ERIC Educational Resources Information Center
Breeding, Marshall
1998-01-01
Reviews trends in library automation based on vendors at the 1998 American Library Association Annual Conference. Discusses the major industry trend, a move from host-based computer systems to the new generation of client/server, object-oriented, open systems-based automation. Includes a summary of developments for 26 vendors. (LRW)
Funding for Library Automation.
ERIC Educational Resources Information Center
Thompson, Ronelle K. H.
This paper provides a brief overview of planning and implementing a project to fund library automation. It is suggested that: (1) proposal budgets should include all costs of a project, such as furniture needed for computer terminals, costs for modifying library procedures, initial supplies, or ongoing maintenance; (2) automation does not save…
The Nature of Automated Jobs and Their Educational and Training Requirements.
ERIC Educational Resources Information Center
Fine, S.A.
Objective information concerning the impact of automation on educational and training requirements was obtained for 132 employees engaged in electron tube, computer, and steel manufacturing processes through management questionnaire responses, analysis of job functions, and employer interviews before and after the introduction of automation. The…
Analyzing Automated Instructional Systems: Metaphors from Related Design Professions.
ERIC Educational Resources Information Center
Jonassen, David H.; Wilson, Brent G.
Noting that automation has had an impact on virtually every manufacturing and information operation in the world, including instructional design (ID), this paper suggests three basic metaphors for automating instructional design activities: (1) computer-aided design and manufacturing (CAD/CAM) systems; (2) expert system advisor systems; and (3)…
Planning for the Automation of School Library Media Centers.
ERIC Educational Resources Information Center
Caffarella, Edward P.
1996-01-01
Geared for school library media specialists whose centers are in the early stages of automation or conversion to a new system, this article focuses on major components of media center automation: circulation control; online public access catalogs; machine readable cataloging; retrospective conversion of print catalog cards; and computer networks…
The Historical Evolution of Educational Software.
ERIC Educational Resources Information Center
Troutner, Joanne
This paper establishes the roots of computers and automated teaching in the field of psychology and describes Dr. S. L. Pressey's presentation of the teaching machine; B. F. Skinner's teaching machine; Meyer's steps in composing a program for the automated teaching machine; IBM's beginning research on automated courses and the development of the…
5 CFR 293.107 - Special safeguards for automated records.
Code of Federal Regulations, 2014 CFR
2014-01-01
... for automated records. (a) In addition to following the security requirements of § 293.106 of this... security safeguards for data about individuals in automated records, including input and output documents, reports, punched cards, magnetic tapes, disks, and on-line computer storage. The safeguards must be in...
5 CFR 293.107 - Special safeguards for automated records.
Code of Federal Regulations, 2013 CFR
2013-01-01
... for automated records. (a) In addition to following the security requirements of § 293.106 of this... security safeguards for data about individuals in automated records, including input and output documents, reports, punched cards, magnetic tapes, disks, and on-line computer storage. The safeguards must be in...
5 CFR 293.107 - Special safeguards for automated records.
Code of Federal Regulations, 2012 CFR
2012-01-01
... for automated records. (a) In addition to following the security requirements of § 293.106 of this... security safeguards for data about individuals in automated records, including input and output documents, reports, punched cards, magnetic tapes, disks, and on-line computer storage. The safeguards must be in...
5 CFR 293.107 - Special safeguards for automated records.
Code of Federal Regulations, 2011 CFR
2011-01-01
... for automated records. (a) In addition to following the security requirements of § 293.106 of this... security safeguards for data about individuals in automated records, including input and output documents, reports, punched cards, magnetic tapes, disks, and on-line computer storage. The safeguards must be in...
DOT National Transportation Integrated Search
1974-08-01
Volume 5 describes the DELTA Simulation Model. It includes all documentation of the DELTA (Determine Effective Levels of Task Automation) computer simulation developed by TRW for use in the Automation Applications Study. Volume 5A includes a user's m...
NASA Technical Reports Server (NTRS)
Harrison, Cecil A.
1986-01-01
The efforts to automate the electromagnetic compatibility (EMC) test facilities at Marshall Space Flight Center were examined. A battery of nine standard tests is to be integrated by means of a desktop computer-controller in order to provide near real-time data assessment, store the data acquired during testing on flexible disk, and provide computer production of the certification report.
Preliminary Full-Scale Tests of the Center for Automated Processing of Hardwoods' Auto-Image
Philip A. Araman; Janice K. Wiedenbeck
1995-01-01
Automated lumber grading and yield optimization using computer controlled saws will be plausible for hardwoods if and when lumber scanning systems can reliably identify all defects by type. Existing computer programs could then be used to grade the lumber, identify the best cut-up solution, and control the sawing machines. The potential value of a scanning grading...
Integral flange design program. [procedure for computing stresses
NASA Technical Reports Server (NTRS)
Wilson, J. F.
1974-01-01
An automated interactive flange design program utilizing an electronic desktop calculator is presented. The program calculates the operating and seating stresses for circular flanges of the integral or optional type subjected to internal pressure. The required input information is documented. The program provides an automated procedure for computing stresses in selected flange geometries for comparison to the allowable code values.
Predictors of Interpersonal Trust in Virtual Distributed Teams
2008-09-01
understand systems that are very complex in nature. Such understanding is essential to facilitate building or maintaining operators' mental models of the...a significant impact on overall system performance. Specifically, the level of automation that combined human generation of options with computer...and/or computer servers had a significant impact on automated system performance. Additionally, Parasuraman, Sheridan, & Wickens (2000) proposed
NASA Technical Reports Server (NTRS)
Roske-Hofstrand, Renate J.
1990-01-01
The man-machine interface and its influence on the characteristics of computer displays in automated air traffic is discussed. The graphical presentation of spatial relationships and the problems it poses for air traffic control, and the solution of such problems are addressed. Psychological factors involved in the man-machine interface are stressed.
ERIC Educational Resources Information Center
Fridge, Evorell; Bagui, Sikha
2016-01-01
The goal of this research was to investigate the effects of automated testing software on levels of student reflection and student performance. This was a self-selecting, between subjects design that examined the performance of students in introductory computer programming classes. Participants were given the option of using the Web-CAT…
Almost human: Anthropomorphism increases trust resilience in cognitive agents.
de Visser, Ewart J; Monfort, Samuel S; McKendrick, Ryan; Smith, Melissa A B; McKnight, Patrick E; Krueger, Frank; Parasuraman, Raja
2016-09-01
We interact daily with computers that appear and behave like humans. Some researchers propose that people apply the same social norms to computers as they do to humans, suggesting that social psychological knowledge can be applied to our interactions with computers. In contrast, theories of human–automation interaction postulate that humans respond to machines in unique and specific ways. We believe that anthropomorphism—the degree to which an agent exhibits human characteristics—is the critical variable that may resolve this apparent contradiction across the formation, violation, and repair stages of trust. Three experiments were designed to examine these opposing viewpoints by varying the appearance and behavior of automated agents. Participants received advice that deteriorated gradually in reliability from a computer, avatar, or human agent. Our results showed (a) that anthropomorphic agents were associated with greater trust resilience, a higher resistance to breakdowns in trust; (b) that these effects were magnified by greater uncertainty; and (c) that incorporating human-like trust repair behavior largely erased differences between the agents. Automation anthropomorphism is therefore a critical variable that should be carefully incorporated into any general theory of human–agent trust as well as novel automation design. PsycINFO Database Record (c) 2016 APA, all rights reserved
Automated Measurement of Patient-Specific Tibial Slopes from MRI
Amerinatanzi, Amirhesam; Summers, Rodney K.; Ahmadi, Kaveh; Goel, Vijay K.; Hewett, Timothy E.; Nyman, Edward
2017-01-01
Background: Multi-planar proximal tibial slopes may be associated with an increased likelihood of osteoarthritis and anterior cruciate ligament injury, due in part to their role in checking the anterior-posterior stability of the knee. Established methods suffer from repeatability limitations and lack the computational efficiency needed for intuitive clinical adoption. The aims of this study were to develop a novel automated approach and to compare the repeatability and computational efficiency of the approach against previously established methods. Methods: Tibial slope geometries were obtained via MRI and measured using an automated Matlab-based approach. Data were compared for repeatability and evaluated for computational efficiency. Results: Mean lateral tibial slope (LTS) for females (7.2°) was greater than for males (1.66°). Mean LTS in the lateral concavity zone was greater for females (7.8° for females, 4.2° for males). Mean medial tibial slope (MTS) for females was greater (9.3° vs. 4.6°). Along the medial concavity zone, female subjects demonstrated greater MTS. Conclusion: The automated method was more repeatable and computationally efficient than previously identified methods and may aid in the clinical assessment of knee injury risk and inform surgical planning and implant design efforts. PMID:28952547
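The abstract does not detail the Matlab routine itself. As a hedged illustration of one common building block of automated slope measurement, a tibial slope angle can be estimated by a least-squares line fit to surface points sampled in a sagittal plane; the function name and inputs below are assumptions for illustration, not the authors' method.

```python
import numpy as np

def slope_angle_deg(x, z):
    """Fit a line z = a*x + b to sagittal surface points (x = anterior-
    posterior position, z = height) and return the slope angle in degrees."""
    a, _b = np.polyfit(x, z, 1)
    return float(np.degrees(np.arctan(a)))

# Points lying exactly on a 45-degree incline (synthetic data)
angle = slope_angle_deg([0.0, 1.0, 2.0], [0.0, 1.0, 2.0])
```

Repeatability then follows from the fit being deterministic for a given point set, unlike manual landmark placement.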
ERIC Educational Resources Information Center
Hudson, C. A.
1982-01-01
Advances in factory computerization (computer-aided design and computer-aided manufacturing) are reviewed, including discussions of robotics, human factors engineering, and the sociological impact of automation. (JN)
Xu, Yang; Liu, Yuan-Zhi; Boppart, Stephen A; Carney, P Scott
2016-03-10
In this paper, we introduce an algorithm framework for the automation of interferometric synthetic aperture microscopy (ISAM). Under this framework, common processing steps such as dispersion correction, Fourier domain resampling, and computational adaptive optics aberration correction are carried out as metrics-assisted parameter search problems. We further present the results of this algorithm applied to phantom and biological tissue samples and compare with manually adjusted results. With the automated algorithm, near-optimal ISAM reconstruction can be achieved without manual adjustment. At the same time, the technical barrier for the nonexpert using ISAM imaging is also significantly lowered.
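As a sketch of the "metrics-assisted parameter search" idea (a simplified stand-in, not the authors' ISAM implementation), a one-dimensional image-quality metric can be maximized over a correction parameter with a golden-section search; the `sharpness` function below is a toy surrogate whose peak location (0.3) is a hypothetical value.

```python
import numpy as np

def golden_section_max(metric, lo, hi, tol=1e-6):
    """Maximize a unimodal image-quality metric over one parameter
    by golden-section search on [lo, hi]."""
    phi = (np.sqrt(5) - 1) / 2
    a, b = lo, hi
    c, d = b - phi * (b - a), a + phi * (b - a)
    while b - a > tol:
        if metric(c) > metric(d):
            b, d = d, c          # maximum lies in [a, d]
            c = b - phi * (b - a)
        else:
            a, c = c, d          # maximum lies in [c, b]
            d = a + phi * (b - a)
    return (a + b) / 2

# Toy stand-in for a sharpness metric, peaked at the "true" coefficient 0.3
sharpness = lambda p: -(p - 0.3) ** 2
best = golden_section_max(sharpness, 0.0, 1.0)
```

In practice the metric would be evaluated on the reconstructed image (e.g., a focus or entropy measure) rather than on a closed-form function.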
A Computational Architecture for Programmable Automation Research
NASA Astrophysics Data System (ADS)
Taylor, Russell H.; Korein, James U.; Maier, Georg E.; Durfee, Lawrence F.
1987-03-01
This short paper describes recent work at the IBM T. J. Watson Research Center directed at developing a highly flexible computational architecture for research on sensor-based programmable automation. The system described here has been designed with a focus on dynamic configurability, layered user interfaces, and incorporation of sensor-based real time operations into new commands. It is these features which distinguish it from earlier work. The system is currently being implemented at IBM for research purposes and internal use and is an outgrowth of programmable automation research which has been ongoing since 1972 [e.g., 1, 2, 3, 4, 5, 6].
Alhmidi, Heba; Cadnum, Jennifer L; Piedrahita, Christina T; John, Amrita R; Donskey, Curtis J
2018-04-01
Touchscreens are a potential source of pathogen transmission. In our facility, patients and visitors rarely perform hand hygiene after using interactive touchscreen computer kiosks. An automated ultraviolet-C touchscreen disinfection device was effective in reducing bacteriophage MS2, bacteriophage ϕX174, methicillin-resistant Staphylococcus aureus, and Clostridium difficile spores inoculated onto a touchscreen. In simulations, an automated ultraviolet-C touchscreen disinfection device alone or in combination with hand hygiene reduced transfer of the viruses from contaminated touchscreens to fingertips. Published by Elsevier Inc.
Toward an automated parallel computing environment for geosciences
NASA Astrophysics Data System (ADS)
Zhang, Huai; Liu, Mian; Shi, Yaolin; Yuen, David A.; Yan, Zhenzhen; Liang, Guoping
2007-08-01
Software for geodynamic modeling has not kept up with the fast growing computing hardware and network resources. In the past decade supercomputing power has become available to most researchers in the form of affordable Beowulf clusters and other parallel computer platforms. However, to take full advantage of such computing power requires developing parallel algorithms and associated software, a task that is often too daunting for geoscience modelers whose main expertise is in geosciences. We introduce here an automated parallel computing environment built on open-source algorithms and libraries. Users interact with this computing environment by specifying the partial differential equations, solvers, and model-specific properties using an English-like modeling language in the input files. The system then automatically generates the finite element codes that can be run on distributed or shared memory parallel machines. This system is dynamic and flexible, allowing users to address different problems in geosciences. It is capable of providing web-based services, enabling users to generate source codes online. This unique feature will facilitate high-performance computing to be integrated with distributed data grids in the emerging cyber-infrastructures for geosciences. In this paper we discuss the principles of this automated modeling environment and provide examples to demonstrate its versatility.
Néri-Quiroz, José; Canto, Fabrice; Guillerme, Laurent; Couston, Laurent; Magnaldo, Alastair; Dugas, Vincent
2016-10-01
A miniaturized and automated approach for the determination of free acidity in solutions containing uranium (VI) is presented. The measurement technique is based on the concept of sequential injection analysis with on-line spectroscopic detection. The proposed methodology relies on the complexation and alkalimetric titration of nitric acid using a pH 5.6 sodium oxalate solution. The titration process is followed by UV/VIS detection at 650 nm thanks to the addition of Congo red as a universal pH indicator. The mixing sequence as well as the method's validity was investigated by numerical simulation. This new analytical design allows fast (2.3 min), reliable, and accurate free acidity determination of low volume samples (10 µL) containing a uranium/[H(+)] mole ratio of 1:3, with a relative standard deviation of <7.0% (n = 11). The linearity range of the free nitric acid measurement is excellent up to 2.77 mol L(-1), with a correlation coefficient (R(2)) of 0.995. The method is specific: the presence of actinide ions up to 0.54 mol L(-1) does not interfere with the determination of free nitric acid. In addition to automation, the developed sequential injection analysis method greatly improves on the standard off-line oxalate complexation and alkalimetric titration method by reducing the required sample volume a thousandfold, the nuclear waste per analysis fortyfold, and the analysis time eightfold. These analytical parameters are important, especially in nuclear-related applications, to improve laboratory safety, reduce personnel exposure to radioactive samples, and drastically reduce environmental impacts and analytical radioactive waste. Copyright © 2016 Elsevier B.V. All rights reserved.
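The figures of merit quoted above (linearity R² and relative standard deviation) are standard calibration statistics. A minimal sketch of how they are computed from replicate and calibration data follows; the numeric values are illustrative, not from the paper.

```python
import numpy as np

def calibration(conc, signal):
    """Least-squares calibration line and its coefficient of
    determination R^2."""
    conc = np.asarray(conc, dtype=float)
    signal = np.asarray(signal, dtype=float)
    slope, intercept = np.polyfit(conc, signal, 1)
    resid = signal - (slope * conc + intercept)
    ss_res = float(np.sum(resid ** 2))
    ss_tot = float(np.sum((signal - signal.mean()) ** 2))
    return slope, intercept, 1.0 - ss_res / ss_tot

def rsd_percent(replicates):
    """Relative standard deviation (sample std / mean), in percent."""
    x = np.asarray(replicates, dtype=float)
    return 100.0 * x.std(ddof=1) / x.mean()

# Illustrative calibration points lying on signal = 0.5*conc + 0.1
slope, intercept, r2 = calibration([0, 1, 2, 3], [0.10, 0.60, 1.10, 1.60])
```

An RSD below 7% over n = 11 replicates, as reported, corresponds to `rsd_percent` of the 11 measured acidities.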
NASA Technical Reports Server (NTRS)
Stricker, L. T.
1975-01-01
The LOVES computer program was employed to analyze the geosynchronous portion of NASA's 1973 automated satellite mission model from 1980 to 1990. The objectives of the analyses were: (1) to demonstrate the capability of the LOVES code to provide the depth and accuracy of data required to support the analyses; and (2) to trade off the concept of space servicing automated satellites composed of replaceable modules against the concept of replacing expendable satellites upon failure. The computer code proved to be an invaluable tool in analyzing the logistic requirements of the various test cases required in the tradeoff. The results indicate that the concept of space servicing offers the potential for substantial savings in the cost of operating automated satellite systems.
A Simple Method for Automated Equilibration Detection in Molecular Simulations.
Chodera, John D
2016-04-12
Molecular simulations intended to compute equilibrium properties are often initiated from configurations that are highly atypical of equilibrium samples, a practice which can generate a distinct initial transient in mechanical observables computed from the simulation trajectory. Traditional practice in simulation data analysis recommends this initial portion be discarded to equilibration, but no simple, general, and automated procedure for this process exists. Here, we suggest a conceptually simple automated procedure that does not make strict assumptions about the distribution of the observable of interest in which the equilibration time is chosen to maximize the number of effectively uncorrelated samples in the production timespan used to compute equilibrium averages. We present a simple Python reference implementation of this procedure and demonstrate its utility on typical molecular simulation data.
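The procedure described in the abstract (choose the equilibration time t0 that maximizes the number of effectively uncorrelated samples, (T - t0)/g(t0), in the production region) can be sketched as follows. This is a simplified reimplementation for illustration, not the authors' reference code; the autocorrelation-based estimate of the statistical inefficiency g is one common choice.

```python
import numpy as np

def statistical_inefficiency(x):
    """g = 1 + 2 * sum of positive-lag autocorrelations (with a triangular
    window, truncated at the first non-positive value), so n_eff = n / g."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    dx = x - x.mean()
    var = float(np.dot(dx, dx)) / n
    if var == 0.0:
        return 1.0
    g = 1.0
    for t in range(1, n):
        c = float(np.dot(dx[:-t], dx[t:])) / ((n - t) * var)
        if c <= 0.0:
            break
        g += 2.0 * c * (1.0 - t / n)
    return max(g, 1.0)

def detect_equilibration(x, step=1):
    """Pick the t0 that maximizes the effective number of uncorrelated
    samples in x[t0:], discarding the initial transient automatically."""
    x = np.asarray(x, dtype=float)
    best_t0, best_neff = 0, 0.0
    for t0 in range(0, len(x) - 2, step):
        neff = (len(x) - t0) / statistical_inefficiency(x[t0:])
        if neff > best_neff:
            best_t0, best_neff = t0, neff
    return best_t0

# Synthetic trace: a decaying transient followed by stationary noise
rng = np.random.default_rng(1)
trace = np.concatenate([np.linspace(10.0, 0.0, 50), rng.normal(0.0, 0.1, 200)])
t0 = detect_equilibration(trace)
```

On the synthetic trace, the detected t0 falls near the end of the 50-sample transient, so equilibrium averages are computed only over the stationary region.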
A simple method for automated equilibration detection in molecular simulations
Chodera, John D.
2016-01-01
Molecular simulations intended to compute equilibrium properties are often initiated from configurations that are highly atypical of equilibrium samples, a practice which can generate a distinct initial transient in mechanical observables computed from the simulation trajectory. Traditional practice in simulation data analysis recommends this initial portion be discarded to equilibration, but no simple, general, and automated procedure for this process exists. Here, we suggest a conceptually simple automated procedure that does not make strict assumptions about the distribution of the observable of interest, in which the equilibration time is chosen to maximize the number of effectively uncorrelated samples in the production timespan used to compute equilibrium averages. We present a simple Python reference implementation of this procedure, and demonstrate its utility on typical molecular simulation data. PMID:26771390
A review of automated image understanding within 3D baggage computed tomography security screening.
Mouton, Andre; Breckon, Toby P
2015-01-01
Baggage inspection is the principal safeguard against the transportation of prohibited and potentially dangerous materials at airport security checkpoints. Although traditionally performed by 2D X-ray based scanning, increasingly stringent security regulations have led to a growing demand for more advanced imaging technologies. The role of X-ray Computed Tomography is thus rapidly expanding beyond the traditional materials-based detection of explosives. The development of computer vision and image processing techniques for the automated understanding of 3D baggage-CT imagery is, however, complicated by poor image resolution, image clutter, and high levels of noise and artefacts. We discuss the recent and most pertinent advancements and identify topics for future research within the challenging domain of automated image understanding for baggage security screening CT.
NASA Astrophysics Data System (ADS)
Ingale, S. V.; Datta, D.
2010-10-01
The consequence of an accidental release of radioactivity from a nuclear power plant is assessed in terms of exposure or dose to members of the public. Risk assessment is based on this dose computation, which in turn depends on the underlying dose assessment model and the exposure pathways. One of the exposure pathways is the ingestion of contaminated food. The aim of the present paper is to compute the uncertainty associated with the risk to members of the public due to the ingestion of contaminated food. Because the governing parameters of the ingestion dose assessment model are imprecise, we apply evidence theory to compute bounds on the risk. The uncertainty is expressed through the belief and plausibility fuzzy measures.
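In evidence (Dempster-Shafer) theory, belief and plausibility bound the probability of an event given basic probability masses assigned to interval-valued "focal elements". A minimal sketch of that standard construction follows; the dose intervals and masses below are illustrative, not values from the paper.

```python
def belief_plausibility(focal, query):
    """Belief and plausibility of the interval `query` under Dempster-
    Shafer theory. `focal` is a list of ((lo, hi), mass) pairs; belief
    sums masses of focal intervals contained in the query, plausibility
    sums masses of focal intervals intersecting it."""
    q_lo, q_hi = query
    bel = sum(m for (lo, hi), m in focal if q_lo <= lo and hi <= q_hi)
    pl = sum(m for (lo, hi), m in focal if lo <= q_hi and q_lo <= hi)
    return bel, pl

# Hypothetical dose intervals (arbitrary units) with expert-assigned masses
focal = [((0.0, 1.0), 0.5), ((0.5, 2.0), 0.3), ((3.0, 4.0), 0.2)]
bel, pl = belief_plausibility(focal, (0.0, 2.0))
```

Belief never exceeds plausibility; the gap between them quantifies the epistemic (imprecision-driven) uncertainty in the risk bound.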
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tichler, J.L.
Information on release of radioactive materials in airborne and liquid effluents, solid waste shipments and selected operating information from commercial nuclear power plants in the United States is maintained in a computer data base at Brookhaven National Laboratory (BNL) for the United States Nuclear Regulatory Commission (USNRC). The information entered into the data base is obtained from semiannual reports submitted by the operators of the plants to the USNRC in compliance with USNRC Regulatory Guide 1.21, ''Measuring, Evaluating, and Reporting Radioactivity in Solid Wastes and Releases of Radioactive Materials in Liquid and Gaseous Effluents from Light-Water-Cooled Nuclear Power Plants.'' The data on releases in the calendar year 1986 include information from 69 plants representing 87 reactors and contain approximately 19,000 entries. Since all the information is contained in a computer data base management system, it is possible to rapidly respond to inquiries about the data set and to generate computer readable subsets of the data. Such a subset is used as input to the computer program which generates the annual report, ''Population Dose Commitments Due to Radioactive Releases from Nuclear Power Plant Sites,'' prepared by Pacific Northwest Laboratory for the USNRC. BNL began maintaining this data base for the USNRC with the 1978 information and has added information to the data base for each succeeding year. An annual report summarizing the information for each year, prepared by BNL and published by the USNRC, is available to the general public. Prior to 1978, annual reports were prepared by the USNRC and are available for the years 1972--1977; however, the information for these years is not in a computer accessible data base.
Computer Series, 67: Bits and Pieces, 27.
ERIC Educational Resources Information Center
Moore, John W., Ed.
1986-01-01
Discusses a computer interfacing course using Commodore 64 microcomputers; a computer program for radioactive equilibrium; analysis of near infrared spectrum of hydrochloric acid molecules using Apple II microcomputers; microcomputer approach to conductivity titrations; balancing equations with Commodore 64's; formulation of mathematical…
ERIC Educational Resources Information Center
Chen, Jing; Zhang, Mo; Bejar, Isaac I.
2017-01-01
Automated essay scoring (AES) generally computes essay scores as a function of macrofeatures derived from a set of microfeatures extracted from the text using natural language processing (NLP). In the "e-rater"® automated scoring engine, developed at "Educational Testing Service" (ETS) for the automated scoring of essays, each…
Logistics Automation Master Plan (LAMP). Better Logistics Support through Automation.
1983-06-01
office micro-computers, positioned throughout the command chain, by providing real time links between LCA and all users: 2. Goals: Assist HQDA staff in...field i.e., Airland Battle 2000. IV-27 Section V: CONCEPT OF EXECUTION Supply (Retail) A. System Description. 1. The Division Logistics Property Book...7. Divisional Direct Support Unit Automated Supply System (DDASS)/Direct Support Level Supply Automation (DLSA). DDASS and DLSA are system development
1981-06-30
manpower needs as to quantity, quality and timing; all the internal functions of the personnel service are tapped to help meet these ends. Manpower...Program ACOS - Automated Computation of Service ACQ - Acquisition ACSAC - Assistant Chief of Staff for Automation and Communications ACT - Automated...ARSTAF - Army Staff ARSTAFF - Army Staff ARTEP - Army Training and Evaluation Program ASI - Additional Skill Identifier ASVAB - Armed Services
Precision Departure Release Capability (PDRC) Overview and Results: NASA to FAA Research Transition
NASA Technical Reports Server (NTRS)
Engelland, Shawn; Davis, Tom.
2013-01-01
NASA researchers developed the Precision Departure Release Capability (PDRC) concept to improve the tactical departure scheduling process. The PDRC system comprises: 1) a surface automation system that computes ready time predictions and departure runway assignments; 2) an en route scheduling automation tool that uses this information to estimate ascent trajectories to the merge point and computes release times; and 3) an interface that provides two-way communication between the two systems. To minimize technology transfer issues and facilitate its adoption by TMCs and Frontline Managers (FLM), NASA developed the PDRC prototype using the Surface Decision Support System (SDSS) as the Tower surface automation tool, a research version of the FAA TMA (RTMA) as the en route automation tool, and a digital interface between the two DSTs to facilitate coordination.
SU-F-I-45: An Automated Technique to Measure Image Contrast in Clinical CT Images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanders, J; Abadi, E; Meng, B
Purpose: To develop and validate an automated technique for measuring image contrast in chest computed tomography (CT) exams. Methods: An automated computer algorithm was developed to measure the distribution of Hounsfield units (HUs) inside four major organs: the lungs, liver, aorta, and bones. These organs were first segmented or identified using computer vision and image processing techniques. Regions of interest (ROIs) were automatically placed inside the lungs, liver, and aorta, and histograms of the HUs inside the ROIs were constructed. The mean and standard deviation of each histogram were computed for each CT dataset. Comparison of the mean and standard deviation of the HUs in the different organs provides different contrast values. The ROI for the bones is simply the segmentation mask of the bones. Since the histogram for bones does not follow a Gaussian distribution, the 25th and 75th percentiles were computed instead of the mean. The sensitivity and accuracy of the algorithm were investigated by comparing the automated measurements with manual measurements. Fifteen contrast-enhanced and fifteen non-contrast-enhanced chest CT clinical datasets were examined in the validation procedure. Results: The algorithm successfully measured the histograms of the four organs in both contrast-enhanced and non-contrast-enhanced chest CT exams. The automated measurements were in agreement with manual measurements. The algorithm has sufficient sensitivity, as indicated by the near-unity slope of the automated versus manual measurement plots. Furthermore, the algorithm has sufficient accuracy, as indicated by high coefficient of determination (R2) values ranging from 0.879 to 0.998. Conclusion: Patient-specific image contrast can be measured from clinical datasets. The algorithm can be run on both contrast-enhanced and non-enhanced clinical datasets. The method can be applied to automatically assess the contrast characteristics of clinical chest CT images and quantify dependencies that may not be captured in phantom data.
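Once organ masks are available, the per-organ HU statistics described in the abstract reduce to a few array operations. The sketch below (an illustration of the described steps, not the authors' code) computes mean/standard deviation for soft-tissue ROIs, 25th/75th percentiles for bone, and a simple mean-difference contrast; the tiny image is synthetic.

```python
import numpy as np

def roi_stats(hu_image, mask):
    """Mean and standard deviation of Hounsfield units inside a binary ROI."""
    vals = np.asarray(hu_image, dtype=float)[np.asarray(mask, dtype=bool)]
    return float(vals.mean()), float(vals.std())

def bone_percentiles(hu_image, bone_mask):
    """25th/75th HU percentiles for bone, whose histogram is not Gaussian."""
    vals = np.asarray(hu_image, dtype=float)[np.asarray(bone_mask, dtype=bool)]
    return float(np.percentile(vals, 25)), float(np.percentile(vals, 75))

def contrast(mean_a, mean_b):
    """A simple organ-to-organ contrast: difference of mean HU."""
    return mean_a - mean_b

# Synthetic 2x2 "image": top row lung-like air, bottom row liver-like tissue
img = np.array([[-1000.0, -1000.0], [50.0, 60.0]])
lung_mean, lung_std = roi_stats(img, [[1, 1], [0, 0]])
liver_mean, liver_std = roi_stats(img, [[0, 0], [1, 1]])
```

In a real pipeline the masks would come from the segmentation step, and the statistics would be tracked per dataset.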
High-reliability computing for the smarter planet
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quinn, Heather M; Graham, Paul; Manuzzato, Andrea
2010-01-01
The geometric rate of improvement of transistor size and integrated circuit performance, known as Moore's Law, has been an engine of growth for our economy, enabling new products and services, creating new value and wealth, increasing safety, and removing menial tasks from our daily lives. Affordable, highly integrated components have enabled both life-saving technologies and rich entertainment applications. Anti-lock brakes, insulin monitors, and GPS-enabled emergency response systems save lives. Cell phones, internet appliances, virtual worlds, realistic video games, and mp3 players enrich our lives and connect us together. Over the past 40 years of silicon scaling, the increasing capabilities of inexpensive computation have transformed our society through automation and ubiquitous communications. In this paper, we will present the concept of the smarter planet, how reliability failures affect current systems, and methods that can be used to increase the reliable adoption of new automation in the future. We will illustrate these issues using a number of different electronic devices in a couple of different scenarios. Recently IBM has been presenting the idea of a 'smarter planet.' In smarter planet documents, IBM discusses increased computer automation of roadways, banking, healthcare, and infrastructure, as automation could create more efficient systems. A necessary component of the smarter planet concept is to ensure that these new systems have very high reliability. Even extremely rare reliability problems can easily escalate to problematic scenarios when implemented at very large scales. For life-critical systems, such as automobiles, infrastructure, medical implantables, and avionic systems, unmitigated failures could be dangerous. As more automation moves into these types of critical systems, reliability failures will need to be managed. As computer automation continues to increase in our society, the need for greater radiation reliability grows. Already, critical infrastructure is failing too frequently. In this paper, we will introduce the Cross-Layer Reliability concept for designing more reliable computer systems.
An Automated Method for High-Definition Transcranial Direct Current Stimulation Modeling*
Huang, Yu; Su, Yuzhuo; Rorden, Christopher; Dmochowski, Jacek; Datta, Abhishek; Parra, Lucas C.
2014-01-01
Targeted transcranial stimulation with electric currents requires accurate models of the current flow from scalp electrodes to the human brain. Idiosyncratic anatomy of individual brains and heads leads to significant variability in such current flows across subjects, thus, necessitating accurate individualized head models. Here we report on an automated processing chain that computes current distributions in the head starting from a structural magnetic resonance image (MRI). The main purpose of automating this process is to reduce the substantial effort currently required for manual segmentation, electrode placement, and solving of finite element models. In doing so, several weeks of manual labor were reduced to no more than 4 hours of computation time and minimal user interaction, while current-flow results for the automated method deviated by less than 27.9% from the manual method. Key facilitating factors are the addition of three tissue types (skull, scalp and air) to a state-of-the-art automated segmentation process, morphological processing to correct small but important segmentation errors, and automated placement of small electrodes based on easily reproducible standard electrode configurations. We anticipate that such an automated processing will become an indispensable tool to individualize transcranial direct current stimulation (tDCS) therapy. PMID:23367144
Computer program CDCID: an automated quality control program using CDC update
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singer, G.L.; Aguilar, F.
1984-04-01
A computer program, CDCID, has been developed in coordination with a quality control program to provide a highly automated method of documenting changes to computer codes at EG and G Idaho, Inc. The method uses the standard CDC UPDATE program in such a manner that updates and their associated documentation are easily made and retrieved in various formats. The method allows each card image of a source program to point to the document which describes it, who created the card, and when it was created. The method described is applicable to the quality control of computer programs in general. The computer program described is executable only on CDC computing systems, but the program could be modified and applied to any computing system with an adequate updating program.
1988-10-01
Overview of the complexity analysis tool (CAT), an automated tool which will analyze mission critical computer resources (MCCR) software. CAT is based... CAT automates the metric for BASIC (HP-71), ATLAS (EQUATE), Ada (subset... UNIX 5.2). CAT analyzes source code and computes complexity on a module basis. CAT also generates graphic representations of the logic flow paths and
Artificial intelligence issues related to automated computing operations
NASA Technical Reports Server (NTRS)
Hornfeck, William A.
1989-01-01
Large data processing installations represent target systems for effective applications of artificial intelligence (AI) constructs. The system organization of a large data processing facility at the NASA Marshall Space Flight Center is presented. The methodology and the issues which are related to AI application to automated operations within a large-scale computing facility are described. Problems to be addressed and initial goals are outlined.
Automated Measurement of Facial Expression in Infant-Mother Interaction: A Pilot Study
ERIC Educational Resources Information Center
Messinger, Daniel S.; Mahoor, Mohammad H.; Chow, Sy-Miin; Cohn, Jeffrey F.
2009-01-01
Automated facial measurement using computer vision has the potential to objectively document continuous changes in behavior. To examine emotional expression and communication, we used automated measurements to quantify smile strength, eye constriction, and mouth opening in two 6-month-old infant-mother dyads who each engaged in a face-to-face…
AUTOMATED LITERATURE PROCESSING HANDLING AND ANALYSIS SYSTEM--FIRST GENERATION.
ERIC Educational Resources Information Center
Redstone Scientific Information Center, Redstone Arsenal, AL.
THE REPORT PRESENTS A SUMMARY OF THE DEVELOPMENT AND THE CHARACTERISTICS OF THE FIRST GENERATION OF THE AUTOMATED LITERATURE PROCESSING, HANDLING AND ANALYSIS (ALPHA-1) SYSTEM. DESCRIPTIONS OF THE COMPUTER TECHNOLOGY OF ALPHA-1 AND THE USE OF THIS AUTOMATED LIBRARY TECHNIQUE ARE PRESENTED. EACH OF THE SUBSYSTEMS AND MODULES NOW IN OPERATION ARE…
Ercan, Ertuğrul; Kırılmaz, Bahadır; Kahraman, İsmail; Bayram, Vildan; Doğan, Hüseyin
2012-11-01
Flow-mediated dilation (FMD) is used to evaluate endothelial function. Computer-assisted analysis utilizing edge detection permits continuous measurements along the vessel wall. We have developed a new fully automated software program to allow accurate and reproducible measurement. FMD was measured and analyzed in 18 coronary artery disease (CAD) patients and 17 controls, both manually and with the newly developed (computer-supported) software. The agreement between methods was assessed by Bland-Altman analysis. The mean age, body mass index and cardiovascular risk factors were higher in the CAD group. Automated FMD% was 18.3±8.5 for the control subjects and 6.8±6.5 for the CAD group (p=0.0001). The intraobserver and interobserver correlations for automated measurement were high (r=0.974, r=0.981, r=0.937, r=0.918, respectively). Manual FMD% at the 60th second was correlated with automated FMD% (r=0.471, p=0.004). The new fully automated software can be used for precise measurement of FMD, with lower intra- and interobserver variability than manual assessment.
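The FMD percentage reported in this abstract is conventionally computed as the percent change of arterial diameter from baseline to peak dilation. A minimal sketch of that calculation (the function name and sample diameters are illustrative, not taken from the study):

```python
def fmd_percent(baseline_mm: float, peak_mm: float) -> float:
    """Flow-mediated dilation as percent change from baseline diameter."""
    if baseline_mm <= 0:
        raise ValueError("baseline diameter must be positive")
    return (peak_mm - baseline_mm) / baseline_mm * 100.0

# Example: a 4.0 mm brachial artery dilating to 4.3 mm
print(round(fmd_percent(4.0, 4.3), 1))  # 7.5
```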
Interactive visualization of Earth and Space Science computations
NASA Technical Reports Server (NTRS)
Hibbard, William L.; Paul, Brian E.; Santek, David A.; Dyer, Charles R.; Battaiola, Andre L.; Voidrot-Martinez, Marie-Francoise
1994-01-01
Computers have become essential tools for scientists simulating and observing nature. Simulations are formulated as mathematical models but are implemented as computer algorithms to simulate complex events. Observations are also analyzed and understood in terms of mathematical models, but the number of these observations usually dictates that we automate analyses with computer algorithms. In spite of their essential role, computers are also barriers to scientific understanding. Unlike hand calculations, automated computations are invisible and, because of the enormous numbers of individual operations in automated computations, the relation between an algorithm's input and output is often not intuitive. This problem is illustrated by the behavior of meteorologists responsible for forecasting weather. Even in this age of computers, many meteorologists manually plot weather observations on maps, then draw isolines of temperature, pressure, and other fields by hand (special pads of maps are printed for just this purpose). Similarly, radiologists use computers to collect medical data but are notoriously reluctant to apply image-processing algorithms to that data. To these scientists with life-and-death responsibilities, computer algorithms are black boxes that increase rather than reduce risk. The barrier between scientists and their computations can be bridged by techniques that make the internal workings of algorithms visible and that allow scientists to experiment with their computations. Here we describe two interactive systems developed at the University of Wisconsin-Madison Space Science and Engineering Center (SSEC) that provide these capabilities to Earth and space scientists.
Nuclear sensor signal processing circuit
Kallenbach, Gene A [Bosque Farms, NM; Noda, Frank T [Albuquerque, NM; Mitchell, Dean J [Tijeras, NM; Etzkin, Joshua L [Albuquerque, NM
2007-02-20
An apparatus and method are disclosed for a compact and temperature-insensitive nuclear sensor that can be calibrated with a non-hazardous radioactive sample. The nuclear sensor includes a gamma ray sensor that generates tail pulses from radioactive samples. An analog conditioning circuit conditions the tail-pulse signals from the gamma ray sensor, and a tail-pulse simulator circuit generates a plurality of simulated tail-pulse signals. A computer system processes the tail pulses from the gamma ray sensor and the simulated tail pulses from the tail-pulse simulator circuit. The nuclear sensor is calibrated under the control of the computer. The offset is adjusted using the simulated tail pulses. Since the offset is set to zero or near zero, the sensor gain can be adjusted with a non-hazardous radioactive source such as, for example, naturally occurring radiation and potassium chloride.
NASA Technical Reports Server (NTRS)
Hubler, M.; Souders, J. E.; Shade, E. D.; Hlastala, M. P.; Polissar, N. L.; Glenny, R. W.
1999-01-01
The aim of the study was to validate a nonradioactive method for relative blood flow measurements in severely injured lungs that avoids labor-intensive tissue processing. The use of fluorescent-labeled microspheres was compared with the standard radiolabeled-microsphere method. In seven sheep, lung injury was established by using oleic acid. Five pairs of radio- and fluorescent-labeled microspheres were injected before and after established lung injury. Across all animals, 175 pieces were selected randomly. The radioactivity of each piece was determined by using a scintillation counter. The fluorescent dye was extracted from each piece with a solvent without digestion or filtering. The fluorescence was determined with an automated fluorescent spectrophotometer. Perfusion was calculated for each piece from both the radioactivity and fluorescence and volume normalized. Correlations between flow determined by the two methods were in the range from 0.987 +/- 0.007 (SD) to 0.991 +/- 0.002 (SD) after 9 days of soaking. Thus the fluorescent microsphere technique is a valuable tool for investigating regional perfusion in severely injured lungs and can replace the radiolabeled-microsphere method.
Predicting Flows of Rarefied Gases
NASA Technical Reports Server (NTRS)
LeBeau, Gerald J.; Wilmoth, Richard G.
2005-01-01
DSMC Analysis Code (DAC) is a flexible, highly automated, easy-to-use computer program for predicting flows of rarefied gases -- especially flows of upper-atmospheric, propulsion, and vented gases impinging on spacecraft surfaces. DAC implements the direct simulation Monte Carlo (DSMC) method, which is widely recognized as standard for simulating flows at densities so low that the continuum-based equations of computational fluid dynamics are invalid. DAC enables users to model complex surface shapes and boundary conditions quickly and easily. The discretization of a flow field into computational grids is automated, thereby relieving the user of a traditionally time-consuming task while ensuring (1) appropriate refinement of grids throughout the computational domain, (2) determination of optimal settings for temporal discretization and other simulation parameters, and (3) satisfaction of the fundamental constraints of the method. In so doing, DAC ensures an accurate and efficient simulation. In addition, DAC can utilize parallel processing to reduce computation time. The domain decomposition needed for parallel processing is completely automated, and the software employs a dynamic load-balancing mechanism to ensure optimal parallel efficiency throughout the simulation.
Evolutionary and biological metaphors for engineering design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jakiela, M.
1994-12-31
Since computing became generally available, there has been strong interest in using computers to assist and automate engineering design processes. Specifically, for design optimization and automation, nonlinear programming and artificial intelligence techniques have been extensively studied. New computational techniques, based upon the natural processes of evolution, adaptation, and learning, are showing promise because of their generality and robustness. This presentation will describe the use of two such techniques, genetic algorithms and classifier systems, for a variety of engineering design problems. Structural topology optimization, meshing, and general engineering optimization are shown as example applications.
Automated Reporting of DXA Studies Using a Custom-Built Computer Program.
England, Joseph R; Colletti, Patrick M
2018-06-01
Dual-energy x-ray absorptiometry (DXA) scans are a critical population health tool and relatively simple to interpret but can be time-consuming to report, often requiring manual transfer of bone mineral density and associated statistics into commercially available dictation systems. We describe here a custom-built computer program for automated reporting of DXA scans using Pydicom, an open-source package built in the Python computer language, and regular expressions to mine DICOM tags for patient information and bone mineral density statistics. This program, easy to emulate by any novice computer programmer, has doubled our efficiency at reporting DXA scans and has eliminated dictation errors.
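The abstract names Pydicom for tag access and regular expressions for mining the statistics. A dependency-free sketch of just the regex-mining step follows; the report layout, field names, and sample values are hypothetical, since the actual tag contents vary by vendor and are not given in the abstract:

```python
import re

# Hypothetical DXA report text; real DICOM tag contents differ by vendor.
report = """Patient: DOE^JANE
Region: AP Spine L1-L4
BMD: 0.987 g/cm2  T-score: -1.3  Z-score: -0.4"""

def mine_dxa(text: str) -> dict:
    """Pull BMD and T-score out of free-text report fields with regexes."""
    bmd = re.search(r"BMD:\s*([\d.]+)\s*g/cm2", text)
    t = re.search(r"T-score:\s*(-?[\d.]+)", text)
    return {
        "bmd_g_cm2": float(bmd.group(1)) if bmd else None,
        "t_score": float(t.group(1)) if t else None,
    }

print(mine_dxa(report))  # {'bmd_g_cm2': 0.987, 't_score': -1.3}
```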
Automation of multi-agent control for complex dynamic systems in heterogeneous computational network
NASA Astrophysics Data System (ADS)
Oparin, Gennady; Feoktistov, Alexander; Bogdanova, Vera; Sidorov, Ivan
2017-01-01
The rapid progress of high-performance computing entails new challenges related to solving large scientific problems for various subject domains in a heterogeneous distributed computing environment (e.g., a network, Grid system, or Cloud infrastructure). Specialists in the field of parallel and distributed computing pay special attention to the scalability of applications for problem solving. Effective management of a scalable application in a heterogeneous distributed computing environment is still a non-trivial issue, and control systems that operate in networks are especially affected by it. We propose a new approach to multi-agent management of scalable applications in a heterogeneous computational network. The fundamentals of our approach are the integrated use of conceptual programming, simulation modeling, network monitoring, multi-agent management, and service-oriented programming. We developed a special framework for automating problem solving. Advantages of the proposed approach are demonstrated on an example: the parametric synthesis of a static linear regulator for complex dynamic systems. Benefits of the scalable application for solving this problem include automation of multi-agent control for the systems in a parallel mode with various degrees of detailed elaboration.
NASA Tech Briefs, May 1994. Volume 18, No. 5
NASA Technical Reports Server (NTRS)
1994-01-01
Topics covered include: Robotics/Automation; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.
Genetic algorithms in teaching artificial intelligence (automated generation of specific algebras)
NASA Astrophysics Data System (ADS)
Habiballa, Hashim; Jendryscik, Radek
2017-11-01
The problem of teaching essential Artificial Intelligence (AI) methods is an important task for an educator in the branch of soft computing. The key focus is often on proper understanding of the principles of AI methods in two essential points: why we use soft-computing methods at all, and how we apply these methods to generate reasonable results in sensible time. We present an interesting problem solved in non-educational research concerning the automated generation of specific algebras in a huge search space. We emphasize the above-mentioned points through an educational case study of this problem in automated generation of specific algebras.
Agent Models for Self-Motivated Home-Assistant Bots
NASA Astrophysics Data System (ADS)
Merrick, Kathryn; Shafi, Kamran
2010-01-01
Modern society increasingly relies on technology to support everyday activities. In the past, this technology has focused on automation, using computer technology embedded in physical objects. More recently, there is an expectation that this technology will not just embed reactive automation, but also embed intelligent, proactive automation in the environment. That is, there is an emerging desire for novel technologies that can monitor, assist, inform or entertain when required, and not just when requested. This paper presents three self-motivated, home-assistant bot applications using different self-motivated agent models. Self-motivated agents use a computational model of motivation to generate goals proactively. Technologies based on self-motivated agents can thus respond autonomously and proactively to stimuli from their environment. Three prototypes of different self-motivated agent models, using different computational models of motivation, are described to demonstrate these concepts.
2011-01-01
Background Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines distributed pre-packaged with pre-configured software. Results We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. Conclusion The CloVR VM and associated architecture lower the barrier to entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing. PMID:21878105
Wickens, Christopher D; Sebok, Angelia; Li, Huiyang; Sarter, Nadine; Gacy, Andrew M
2015-09-01
The aim of this study was to develop and validate a computational model of the automation complacency effect, as operators work on a robotic arm task, supported by three different degrees of automation. Some computational models of complacency in human-automation interaction exist, but those are formed and validated within the context of fairly simplified monitoring failures. This research extends model validation to a much more complex task, so that system designers can establish, without need for human-in-the-loop (HITL) experimentation, merits and shortcomings of different automation degrees. We developed a realistic simulation of a space-based robotic arm task that could be carried out with three different levels of trajectory visualization and execution automation support. Using this simulation, we performed HITL testing. Complacency was induced via several trials of correctly performing automation and then was assessed on trials when automation failed. Following a cognitive task analysis of the robotic arm operation, we developed a multicomponent model of the robotic operator and his or her reliance on automation, based in part on visual scanning. The comparison of model predictions with empirical results revealed that the model accurately predicted routine performance and predicted the responses to these failures after complacency developed. However, the scanning models do not account for the entire attention allocation effects of complacency. Complacency modeling can provide a useful tool for predicting the effects of different types of imperfect automation. The results from this research suggest that focus should be given to supporting situation awareness in automation development. © 2015, Human Factors and Ergonomics Society.
7 CFR 247.25 - Allowable uses of administrative funds and other funds.
Code of Federal Regulations, 2013 CFR
2013-01-01
... equipment include automated information systems, automated data processing equipment, and other computer... sale of packing containers or pallets, and the salvage of commodities. Program income does not include...
7 CFR 247.25 - Allowable uses of administrative funds and other funds.
Code of Federal Regulations, 2014 CFR
2014-01-01
... equipment include automated information systems, automated data processing equipment, and other computer... sale of packing containers or pallets, and the salvage of commodities. Program income does not include...
7 CFR 247.25 - Allowable uses of administrative funds and other funds.
Code of Federal Regulations, 2011 CFR
2011-01-01
... equipment include automated information systems, automated data processing equipment, and other computer... sale of packing containers or pallets, and the salvage of commodities. Program income does not include...
7 CFR 247.25 - Allowable uses of administrative funds and other funds.
Code of Federal Regulations, 2012 CFR
2012-01-01
... equipment include automated information systems, automated data processing equipment, and other computer... sale of packing containers or pallets, and the salvage of commodities. Program income does not include...
Topics in programmable automation. [for materials handling, inspection, and assembly
NASA Technical Reports Server (NTRS)
Rosen, C. A.
1975-01-01
Topics explored in the development of integrated programmable automation systems include: numerically controlled and computer controlled machining; machine intelligence and the emulation of human-like capabilities; large scale semiconductor integration technology applications; and sensor technology for asynchronous local computation without burdening the executive minicomputer which controls the whole system. The role and development of training aids, and the potential application of these aids to augmented teleoperator systems are discussed.
[Automated processing of data from the 1985 population and housing census].
Cholakov, S
1987-01-01
The author describes the method of automated data processing used in the 1985 census of Bulgaria. He notes that the computerization of the census involves decentralization and the use of regional computing centers as well as data processing at the Central Statistical Office's National Information Computer Center. Special attention is given to problems concerning the projection and programming of census data. (SUMMARY IN ENG AND RUS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haugen, G.R.; Bystroff, R.I.; Downey, R.M.
1975-09-01
In the area of automation and instrumentation, progress in the following studies is reported: computer automation of the Cary model 17I spectrophotometer; a new concept for monitoring the concentration of water in gases; on-line gas analysis for a gas circulation experiment; and a count-rate-discriminator technique for measuring grain-boundary composition. In the area of analytical methodology and measurements, progress is reported in the following studies: separation of molecular species by radiation pressure; study of the vaporization of U(thd)4 (thd = 2,2,6,6-tetramethylheptane-3,5-dione); study of the vaporization of U(C8H8)2; determination of ethylenic unsaturation in polyimide resins; and semimicrodetermination of hydroxyl and amino groups with pyromellitic dianhydride (PMDA). (JGB)
Summers, Ronald M; Baecher, Nicolai; Yao, Jianhua; Liu, Jiamin; Pickhardt, Perry J; Choi, J Richard; Hill, Suvimol
2011-01-01
To show the feasibility of calculating the bone mineral density (BMD) from computed tomographic colonography (CTC) scans using fully automated software. Automated BMD measurement software was developed that measures the BMD of the first and second lumbar vertebrae on computed tomography and calculates the mean of the 2 values to provide a per patient BMD estimate. The software was validated in a reference population of 17 consecutive women who underwent quantitative computed tomography and in a population of 475 women from a consecutive series of asymptomatic patients enrolled in a CTC screening trial conducted at 3 medical centers. The mean (SD) BMD was 133.6 (34.6) mg/mL (95% confidence interval, 130.5-136.7; n = 475). In women aged 42 to 60 years (n = 316) and 61 to 79 years (n = 159), the mean (SD) BMDs were 143.1 (33.5) and 114.7 (28.3) mg/mL, respectively (P < 0.0001). Fully automated BMD measurements were reproducible for a given patient with 95% limits of agreement of -9.79 to 8.46 mg/mL for the mean difference between paired assessments on supine and prone CTC. Osteoporosis screening can be performed simultaneously with screening for colorectal polyps.
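The per-patient estimate described above is simply the mean of the L1 and L2 values, and the quoted 95% limits of agreement for supine versus prone scans follow the standard Bland-Altman construction (mean difference ± 1.96 × SD of the paired differences). A sketch of both calculations, using made-up paired values rather than study data:

```python
import statistics

def patient_bmd(l1: float, l2: float) -> float:
    """Per-patient BMD estimate: mean of first and second lumbar vertebrae."""
    return (l1 + l2) / 2.0

def limits_of_agreement(supine, prone):
    """Bland-Altman 95% limits: mean difference +/- 1.96 * SD of differences."""
    diffs = [s - p for s, p in zip(supine, prone)]
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)
    return mean_d - 1.96 * sd_d, mean_d + 1.96 * sd_d

# Illustrative paired supine/prone measurements (mg/mL), not study data
supine = [130.2, 141.5, 118.9, 152.3]
prone = [131.0, 139.8, 120.1, 151.7]
lo, hi = limits_of_agreement(supine, prone)
print(patient_bmd(133.0, 114.0))  # 123.5
```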
Automation Problems of 1968; Papers Presented at the Meeting...October 4-5, 1968.
ERIC Educational Resources Information Center
Andrews, Theodora, Ed.
Librarians and their concerned colleagues met to give, hear and discuss papers on library automation, primarily by computers. Noted at this second meeting on library automation were: (1) considerably more sophistication and casualness about the techniques involved, (2) considerably more assurance of what and where things can be applied and (3)…
Using Automated Scores of Student Essays to Support Teacher Guidance in Classroom Inquiry
ERIC Educational Resources Information Center
Gerard, Libby F.; Linn, Marcia C.
2016-01-01
Computer scoring of student written essays about an inquiry topic can be used to diagnose student progress both to alert teachers to struggling students and to generate automated guidance. We identify promising ways for teachers to add value to automated guidance to improve student learning. Three teachers from two schools and their 386 students…
MARC and the Library Service Center: Automation at Bargain Rates.
ERIC Educational Resources Information Center
Pearson, Karl M.
Despite recent research and development in the field of library automation, libraries have been unable to reap the benefits promised by technology due to the high cost of building and maintaining their own computer-based systems. Time-sharing and disc mass storage devices will bring automation costs, if spread over a number of users, within the…
The Automated Logistics Element Planning System (ALEPS)
NASA Technical Reports Server (NTRS)
Schwaab, Douglas G.
1991-01-01
The design and functions of ALEPS (Automated Logistics Element Planning System) is a computer system that will automate planning and decision support for Space Station Freedom Logistical Elements (LEs) resupply and return operations. ALEPS provides data management, planning, analysis, monitoring, interfacing, and flight certification for support of LE flight load planning activities. The prototype ALEPS algorithm development is described.
Fink, Christine; Uhlmann, Lorenz; Klose, Christina; Haenssle, Holger A
2018-05-17
Reliable and accurate assessment of severity in psoriasis is very important in order to meet indication criteria for initiation of systemic treatment or to evaluate treatment efficacy. The most acknowledged tool for measuring the extent of psoriatic skin changes is the Psoriasis Area and Severity Index (PASI). However, the calculation of PASI can be tedious and subjective and high intraobserver and interobserver variability is an important concern. Therefore, there is a great need for a standardised and objective method that guarantees a reproducible PASI calculation. Within this study we will investigate the precision and reproducibility of automated, computer-guided PASI measurements in comparison to trained physicians to address these limitations. Non-interventional analyses of PASI calculations by either physicians in a prospective versus retrospective setting or an automated computer-guided algorithm in 120 patients with plaque psoriasis. All retrospective PASI calculations by physicians or by the computer algorithm are based on total body digital images. The primary objective of this study is comparison of automated computer-guided PASI measurements by means of digital image analysis versus conventional, prospective or retrospective physicians' PASI assessments. Secondary endpoints include (1) the assessment of physicians' interobserver variance in PASI calculations, (2) the assessment of physicians' intraobserver variance in PASI assessments of the same patients' images after a time interval of at least 4 weeks, (3) the assessment of the deviation between physicians' prospective versus retrospective PASI calculations, and (4) the reproducibility of automated computer-guided PASI measurements by assessment of two sets of total body digital images of the same patients taken at one time point. Ethical approval was provided by the Ethics Committee of the Medical Faculty of the University of Heidelberg (ethics approval number S-379/2016). DRKS00011818; Results. 
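The PASI that this protocol centres on is a fixed weighted sum over four body regions: for each region, the sum of the erythema, induration, and desquamation severities (0-4 each) is multiplied by an area score (0-6) and a region weight. The sketch below implements the published formula only; the study's image-analysis algorithm itself is not described in enough detail to reproduce:

```python
# Region weights from the published PASI definition
WEIGHTS = {"head": 0.1, "upper_limbs": 0.2, "trunk": 0.3, "lower_limbs": 0.4}

def pasi(scores: dict) -> float:
    """scores maps region -> (erythema, induration, desquamation, area),
    with severities scored 0-4 and area 0-6 per the standard index."""
    total = 0.0
    for region, weight in WEIGHTS.items():
        erythema, induration, desquamation, area = scores[region]
        total += weight * (erythema + induration + desquamation) * area
    return round(total, 1)

# Worst case (all severities 4, all areas 6) gives the index maximum of 72
worst = {region: (4, 4, 4, 6) for region in WEIGHTS}
print(pasi(worst))  # 72.0
```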
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sweatman, W.J.; Brandon, D.R.; Cranstone, S.
The preparation of indium-111 tropolonate-radiolabeled guinea pig peripheral mixed white cells (greater than 80% neutrophils) is described. Autologous rather than homologous cells are required to provide a population of labeled, functional cells on reintroduction to the animals. Surgery has been shown to result in a profound neutropenia from which the animals must recover before removal of blood for cell preparation. The response of radiolabeled cells parallels that of the unlabeled cell population to a chemotaxin, leukotriene B4. This material causes a profound neutropenia of rapid onset accompanied by a parallel fall in blood radioactivity. The fall in circulating radiolabel is accompanied by an increase in radioactivity in the thoracic region. These changes have been monitored externally using an automated isotope monitoring system.
A Low Cost Micro-Computer Based Local Area Network for Medical Office and Medical Center Automation
Epstein, Mel H.; Epstein, Lynn H.; Emerson, Ron G.
1984-01-01
A low-cost microcomputer-based local area network for medical office automation is described which makes use of an array of multiple and different personal computers interconnected by a local area network. Each computer on the network functions as a fully potent workstation for data entry and report generation. The network allows each workstation complete access to the entire database. Additionally, designated computers may serve as access ports for remote terminals. Through "Gateways" the network may serve as a front end for a large mainframe, or may interface with another network. The system provides for the medical office environment the expandability and flexibility of a multi-terminal mainframe system at a far lower cost without sacrifice of performance.
NASA Technical Reports Server (NTRS)
Freitas, R. A., Jr. (Editor); Carlson, P. A. (Editor)
1983-01-01
Adoption of an aggressive computer science research and technology program within NASA will: (1) enable new mission capabilities such as autonomous spacecraft, reliability and self-repair, and low-bandwidth intelligent Earth sensing; (2) lower manpower requirements, especially in the areas of Space Shuttle operations, by making fuller use of control center automation, technical support, and internal utilization of state-of-the-art computer techniques; (3) reduce project costs via improved software verification, software engineering, enhanced scientist/engineer productivity, and increased managerial effectiveness; and (4) significantly improve internal operations within NASA with electronic mail, managerial computer aids, an automated bureaucracy and uniform program operating plans.
Computational methods for structural load and resistance modeling
NASA Technical Reports Server (NTRS)
Thacker, B. H.; Millwater, H. R.; Harren, S. V.
1991-01-01
An automated capability for computing structural reliability considering uncertainties in both load and resistance variables is presented. The computations are carried out using an automated Advanced Mean Value iteration algorithm (AMV +) with performance functions involving load and resistance variables obtained by both explicit and implicit methods. A complete description of the procedures used is given as well as several illustrative examples, verified by Monte Carlo Analysis. In particular, the computational methods described in the paper are shown to be quite accurate and efficient for a material nonlinear structure considering material damage as a function of several primitive random variables. The results show clearly the effectiveness of the algorithms for computing the reliability of large-scale structural systems with a maximum number of resolutions.
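The load/resistance reliability computation that the AMV+ results are verified against can be illustrated with a plain Monte Carlo sketch; the normal distributions and their parameters below are illustrative assumptions, not values from the report:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Illustrative (assumed) distributions for resistance R and load L.
R = rng.normal(loc=120.0, scale=10.0, size=n)
L = rng.normal(loc=80.0, scale=15.0, size=n)

# Performance (limit-state) function: failure occurs when g = R - L < 0.
g = R - L
p_f = float(np.mean(g < 0.0))   # Monte Carlo estimate of failure probability
```

For independent normals, g is itself normal, so the estimate can be checked in closed form: here the mean of g is 40 and its standard deviation is sqrt(10² + 15²) ≈ 18.0, giving a failure probability near 0.013.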
NASA Technical Reports Server (NTRS)
Hockaday, Stephen; Kuhlenschmidt, Sharon (Editor)
1991-01-01
The objective of the workshop was to explore the role of human factors in facilitating the introduction of artificial intelligence (AI) to advanced air traffic control (ATC) automation concepts. AI is an umbrella term which is continually expanding to cover a variety of techniques in which machines perform actions based upon dynamic, external stimuli. AI methods can be implemented using more traditional programming languages such as LISP or PROLOG, or they can be implemented using state-of-the-art techniques such as object-oriented programming, neural nets (hardware or software), and knowledge-based expert systems. As this technology advances and as increasingly powerful computing platforms become available, the use of AI to enhance ATC systems can be realized. Substantial efforts along these lines are already being undertaken at the FAA Technical Center, NASA Ames Research Center, academic institutions, industry, and elsewhere. Although it is clear that the technology is ripe for bringing computer automation to ATC systems, the proper scope and role of automation are not at all apparent. The major concern is how to combine human controllers with computer technology. A wide spectrum of options exists, ranging from using automation only to provide extra tools to augment decision making by human controllers to turning over moment-by-moment control to automated systems and using humans as supervisors and system managers. Across this spectrum, it is now obvious that the difficulties that occur when tying human and automated systems together must be resolved so that automation can be introduced safely and effectively. The focus of the workshop was to further explore the role of injecting AI into ATC systems and to identify the human factors that need to be considered for successful application of the technology to present and future ATC systems.
Budgeting for Computer Technology in the Small College Library
ERIC Educational Resources Information Center
Axford, H. William
1978-01-01
Addresses the need for liberal arts colleges to use available technology/automation to help secure their survival. Some factors to be considered in planning and budgeting for automation are discussed. (Author/MBR)
Long-Term Pavement Performance Automated Faulting Measurement
DOT National Transportation Integrated Search
2015-02-01
This study focused on identifying transverse joint locations on jointed plain concrete pavements using an automated joint detection algorithm and computing faulting at these locations using Long-Term Pavement Performance (LTPP) Program profile data c...
ERIC Educational Resources Information Center
Library Computing, 1985
1985-01-01
Special supplement to "Library Journal" and "School Library Journal" covers topics of interest to school, public, academic, and special libraries planning for automation: microcomputer use, readings in automation, online searching, databases of microcomputer software, public access to microcomputers, circulation, creating a…
Changing technology in transportation : automated vehicles in freight.
DOT National Transportation Integrated Search
2017-06-27
The world of transportation is on the verge of undergoing an impactful transformation. Over the past decade, automotive computing technology has progressed far more rapidly than anticipated. Most major auto manufacturers integrated automated features...
Population array and agricultural data arrays for the Los Alamos National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacobson, K.W.; Duffy, S.; Kowalewsky, K.
1998-07-01
To quantify or estimate the environmental and radiological impacts from man-made sources of radioactive effluents, certain dose assessment procedures were developed by various government and regulatory agencies. Some of these procedures encourage the use of computer simulations (models) to calculate air dispersion, environmental transport, and subsequent human exposure to radioactivity. Such assessment procedures are frequently used to demonstrate compliance with Department of Energy (DOE) and US Environmental Protection Agency (USEPA) regulations. Knowledge of the density and distribution of the population surrounding a source is an essential component in assessing the impacts from radioactive effluents. Also, as an aid to calculating the dose to a given population, agricultural data relevant to the dose assessment procedure (or computer model) are often required. This report provides such population and agricultural data for the area surrounding Los Alamos National Laboratory.
Automatic measurements and computations for radiochemical analyses
Rosholt, J.N.; Dooley, J.R.
1960-01-01
In natural radioactive sources the most important radioactive daughter products useful for geochemical studies are protactinium-231, the alpha-emitting thorium isotopes, and the radium isotopes. To resolve the abundances of these thorium and radium isotopes by their characteristic decay and growth patterns, a large number of repeated alpha activity measurements on the two chemically separated elements were made over extended periods of time. Alpha scintillation counting with automatic measurements and sample changing is used to obtain the basic count data. Generation of the required theoretical decay and growth functions, varying with time, and the least squares solution of the overdetermined simultaneous count rate equations are done with a digital computer. Examples of the complex count rate equations which may be solved and results of a natural sample containing four α-emitting isotopes of thorium are illustrated. These methods facilitate the determination of the radioactive sources on the large scale required for many geochemical investigations.
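The least-squares solution of the overdetermined count-rate equations can be sketched as a linear fit for the initial activities, since the count rate at time t is a sum of exponentials with known decay constants. The decay constants, measurement times, and noise level below are illustrative assumptions, not the paper's thorium/radium values:

```python
import numpy as np

# Hypothetical decay constants (1/day) for two alpha-emitting components.
lam = np.array([0.0332, 0.190])
t = np.linspace(0.0, 40.0, 25)           # measurement times (days)

true_A = np.array([500.0, 120.0])        # assumed true initial count rates (cpm)
design = np.exp(-np.outer(t, lam))       # each column: decay curve of one isotope
rng = np.random.default_rng(1)
counts = design @ true_A + rng.normal(0.0, 2.0, t.size)  # noisy measurements

# Least-squares solution of the overdetermined simultaneous count-rate equations.
A_hat, *_ = np.linalg.lstsq(design, counts, rcond=None)
```

With many more measurement times than unknown activities, the system is overdetermined and the least-squares fit averages out counting noise, which is the role the digital computer plays in the method described above.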
Uranium Glass: A Glowing Alternative to Conventional Sources of Radioactivity
ERIC Educational Resources Information Center
Boot, Roeland
2017-01-01
There is a relatively simple way of using radioactive material in classroom experiments: uranium glass, which provides teachers with a suitable substance. By using the right computer software and a radiation sensor, it can be demonstrated that uranium glass emits radiation at a greater rate than the background radiation and with the aid of UV…
The acceptability of computer applications to group practices.
Zimmerman, J; Gordon, R S; Tao, D K; Boxerman, S B
1978-01-01
Of the 72 identified group practices in a midwest urban environment, 39 were found to use computers. The practices had been influenced strongly by vendors in their selection of an automated system or service, and had usually spent less than a work-month analyzing their needs and reviewing alternate ways in which those needs could be met. Ninety-seven percent of the practices had some financial applications and 64% had administrative applications, but only 2.5% had medical applications. For half the practices at least 2 months elapsed from the time the automated applications were put into operation until they were considered to be integrated into the office routine. Advantages experienced by at least a third of the practices using computers were that the work was done faster, information was more readily available, and costs were reduced. The most common disadvantage was inflexibility. Most (89%) of the practices believed that automation was preferable to their previous manual system.
Recent advances in automated protein design and its future challenges.
Setiawan, Dani; Brender, Jeffrey; Zhang, Yang
2018-04-25
Protein function is determined by protein structure which is in turn determined by the corresponding protein sequence. If the rules that cause a protein to adopt a particular structure are understood, it should be possible to refine or even redefine the function of a protein by working backwards from the desired structure to the sequence. Automated protein design attempts to calculate the effects of mutations computationally with the goal of more radical or complex transformations than are accessible by experimental techniques. Areas covered: The authors give a brief overview of the recent methodological advances in computer-aided protein design, showing how methodological choices affect final design and how automated protein design can be used to address problems considered beyond traditional protein engineering, including the creation of novel protein scaffolds for drug development. Also, the authors address specifically the future challenges in the development of automated protein design. Expert opinion: Automated protein design holds potential as a protein engineering technique, particularly in cases where screening by combinatorial mutagenesis is problematic. Considering solubility and immunogenicity issues, automated protein design is initially more likely to make an impact as a research tool for exploring basic biology in drug discovery than in the design of protein biologics.
Ramírez De La Pinta, Javier; Maestre Torreblanca, José María; Jurado, Isabel; Reyes De Cozar, Sergio
2017-03-06
In this paper, we explore the possibilities offered by the integration of home automation systems and service robots. In particular, we examine how advanced computationally expensive services can be provided by using a cloud computing approach to overcome the limitations of the hardware available at the user's home. To this end, we integrate two wireless low-cost, off-the-shelf systems in this work, namely, the service robot Rovio and the home automation system Z-wave. Cloud computing is used to enhance the capabilities of these systems so that advanced sensing and interaction services based on image processing and voice recognition can be offered.
Off the Shelf Cloud Robotics for the Smart Home: Empowering a Wireless Robot through Cloud Computing
Ramírez De La Pinta, Javier; Maestre Torreblanca, José María; Jurado, Isabel; Reyes De Cozar, Sergio
2017-01-01
In this paper, we explore the possibilities offered by the integration of home automation systems and service robots. In particular, we examine how advanced computationally expensive services can be provided by using a cloud computing approach to overcome the limitations of the hardware available at the user’s home. To this end, we integrate two wireless low-cost, off-the-shelf systems in this work, namely, the service robot Rovio and the home automation system Z-wave. Cloud computing is used to enhance the capabilities of these systems so that advanced sensing and interaction services based on image processing and voice recognition can be offered. PMID:28272305
Advanced computer architecture specification for automated weld systems
NASA Technical Reports Server (NTRS)
Katsinis, Constantine
1994-01-01
This report describes the requirements for an advanced automated weld system and the associated computer architecture, and defines the overall system specification from a broad perspective. According to the requirements of welding procedures as they relate to an integrated multiaxis motion control and sensor architecture, the computer system requirements are developed based on a proven multiple-processor architecture with an expandable, distributed-memory, single global bus architecture, containing individual processors which are assigned to specific tasks that support sensor or control processes. The specified architecture is sufficiently flexible to integrate previously developed equipment, be upgradable and allow on-site modifications.
Computer-Generated Feedback on Student Writing
ERIC Educational Resources Information Center
Ware, Paige
2011-01-01
A distinction must be made between "computer-generated scoring" and "computer-generated feedback". Computer-generated scoring refers to the provision of automated scores derived from mathematical models built on organizational, syntactic, and mechanical aspects of writing. In contrast, computer-generated feedback, the focus of this article, refers…
ERIC Educational Resources Information Center
Husby, Ole
1990-01-01
The challenges and potential benefits of automating university libraries are reviewed, with special attention given to cooperative systems. Aspects discussed include database size, the role of the university computer center, storage modes, multi-institutional systems, resource sharing, cooperative system management, networking, and intelligent…
Jasko, D J; Lein, D H; Foote, R H
1990-01-01
Two commercially available computer-automated semen analysis instruments (CellSoft Automated Semen Analyzer and HTM-2000 Motion Analyzer) were compared for their ability to report similar results based on the analysis of pre-recorded video tapes of extended, motile stallion semen. The determinations of the percentage of motile cells by these instruments were more similar than the comparisons between subjective estimates and either instrument. However, mean values obtained from the same sample may still differ by as much as 30 percentage units between instruments. Instruments varied with regard to the determinations of mean sperm curvilinear velocity and sperm concentration, but mean sperm linearity determinations were similar between the instruments. We concluded that the determinations of sperm motion characteristics by subjective estimation, CellSoft Automated Semen Analyzer, and HTM-2000 Motility Analyzer are often dissimilar, making direct comparisons of results difficult.
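The motion characteristics such instruments report (curvilinear velocity, linearity) reduce to simple arithmetic on tracked head positions; the coordinates and frame rate below are illustrative assumptions, not instrument data:

```python
import numpy as np

# Tracked sperm-head positions (x, y) in micrometres at a fixed frame interval.
frame_dt = 1.0 / 30.0                      # 30 Hz video (assumed)
xy = np.array([[0.0, 0.0], [3.0, 1.0], [6.0, -1.0], [9.0, 1.0], [12.0, 0.0]])

steps = np.diff(xy, axis=0)
path_len = np.sum(np.hypot(steps[:, 0], steps[:, 1]))   # total curvilinear path
straight = np.hypot(*(xy[-1] - xy[0]))                  # straight-line displacement
duration = frame_dt * (len(xy) - 1)

vcl = path_len / duration      # curvilinear velocity (um/s)
vsl = straight / duration      # straight-line velocity (um/s)
lin = vsl / vcl                # linearity, between 0 and 1
```

A perfectly straight track gives linearity 1; the zig-zag track above gives a linearity below 0.9, which is the kind of quantity on which the two instruments disagreed.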
Automated procedures for sizing aerospace vehicle structures /SAVES/
NASA Technical Reports Server (NTRS)
Giles, G. L.; Blackburn, C. L.; Dixon, S. C.
1972-01-01
Results from a continuing effort to develop automated methods for structural design are described. A system of computer programs presently under development called SAVES is intended to automate the preliminary structural design of a complete aerospace vehicle. Each step in the automated design process of the SAVES system of programs is discussed, with emphasis placed on use of automated routines for generation of finite-element models. The versatility of these routines is demonstrated by structural models generated for a space shuttle orbiter, an advanced technology transport, and a hydrogen-fueled Mach 3 transport. Illustrative numerical results are presented for the Mach 3 transport wing.
Altering users' acceptance of automation through prior automation exposure.
Bekier, Marek; Molesworth, Brett R C
2017-06-01
Air navigation service providers worldwide see increased use of automation as one solution to overcome the capacity constraints embedded in the present air traffic management (ATM) system. However, increased use of automation within any system is dependent on user acceptance. The present research sought to determine if the point at which an individual is no longer willing to accept or cooperate with automation can be manipulated. Forty participants underwent training on a computer-based air traffic control programme, followed by two ATM exercises (order counterbalanced), one with and one without the aid of automation. Results revealed that after exposure to a task with automation assistance, user acceptance of high(er) levels of automation (the 'tipping point') decreased, suggesting it is indeed possible to alter automation acceptance. Practitioner Summary: This paper investigates whether the point at which a user of automation rejects automation (i.e. the 'tipping point') is constant or can be manipulated. The results revealed that after exposure to a task with automation assistance, user acceptance of high(er) levels of automation decreased, suggesting it is possible to alter automation acceptance.
Texas A & M University at Galveston: College and University Computing Environment.
ERIC Educational Resources Information Center
CAUSE/EFFECT, 1986
1986-01-01
Texas A & M University at Galveston is the only marine and maritime-oriented university in the Southwest. Its computing policy/direction, academic computing, administrative computing, and library automation are described, and hurricane emergency plans are also discussed. (MLW)
Prakash, Jaya; Yalavarthy, Phaneendra K
2013-03-01
Developing a computationally efficient automated method for the optimal choice of regularization parameter in diffuse optical tomography. The least-squares QR (LSQR)-type method that uses Lanczos bidiagonalization is known to be computationally efficient in performing the reconstruction procedure in diffuse optical tomography. The same is effectively deployed via an optimization procedure that uses the simplex method to find the optimal regularization parameter. The proposed LSQR-type method is compared with traditional methods such as the L-curve, generalized cross-validation (GCV), and the recently proposed minimal residual method (MRM)-based choice of regularization parameter, using numerical and experimental phantom data. The results indicate that the proposed LSQR-type and MRM-based methods perform similarly in terms of reconstructed image quality, and both are superior to the L-curve and GCV-based methods. The computational complexity of the proposed method is at least five times lower than that of the MRM-based method, making it an optimal technique. The LSQR-type method overcomes the inherent limitation of the computationally expensive MRM-based automated way of finding the optimal regularization parameter in diffuse optical tomographic imaging, making this method more suitable for real-time deployment.
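A minimal sketch of automated regularization-parameter selection for a damped least-squares reconstruction, under stated assumptions: a toy ill-conditioned forward model stands in for the diffuse-optical sensitivity matrix, and a discrepancy-style criterion with a simple parameter scan replaces the paper's simplex search (LSQR's `damp` option computes the same damped solution iteratively):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy ill-conditioned forward model (illustrative, not a DOT Jacobian).
n = 40
U, _ = np.linalg.qr(rng.normal(size=(n, n)))
V, _ = np.linalg.qr(rng.normal(size=(n, n)))
A = U @ np.diag(np.logspace(0, -6, n)) @ V.T
x_true = V[:, :8] @ rng.normal(size=8)      # recoverable: leading modes only
noise = 1e-4 * rng.normal(size=n)
b = A @ x_true + noise
delta = np.linalg.norm(noise)               # assumed-known noise level

def damped_solve(damp):
    """Damped least squares: min ||Ax - b||^2 + damp^2 ||x||^2."""
    A_aug = np.vstack([A, damp * np.eye(n)])
    b_aug = np.concatenate([b, np.zeros(n)])
    return np.linalg.lstsq(A_aug, b_aug, rcond=None)[0]

# Automated selection: pick the damp whose residual best matches the noise
# level (discrepancy principle), scanning a log-spaced grid of candidates.
damps = np.logspace(-7, 0, 60)
scores = [abs(np.linalg.norm(A @ damped_solve(d) - b) - delta) for d in damps]
damp_opt = damps[int(np.argmin(scores))]
x_rec = damped_solve(damp_opt)
rel_err = np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true)
```

The point of automating the choice is visible here: too small a damp fits the noise, too large a damp over-smooths, and the selection criterion lands the reconstruction error in between without manual tuning.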
Panuccio, Giuseppe; Torsello, Giovanni Federico; Pfister, Markus; Bisdas, Theodosios; Bosiers, Michel J; Torsello, Giovanni; Austermann, Martin
2016-12-01
To assess the usability of a fully automated fusion imaging engine prototype, matching preinterventional computed tomography with intraoperative fluoroscopic angiography during endovascular aortic repair. From June 2014 to February 2015, all patients treated electively for abdominal and thoracoabdominal aneurysms were enrolled prospectively. Before each procedure, preoperative planning was performed with a fully automated fusion engine prototype based on computed tomography angiography, creating a mesh model of the aorta. In a second step, this three-dimensional dataset was registered with the two-dimensional intraoperative fluoroscopy. The main outcome measure was the applicability of the fully automated fusion engine. Secondary outcomes were freedom from failure of automatic segmentation or of the automatic registration, as well as accuracy of the mesh model, measured as deviations from intraoperative angiography in millimeters, where applicable. Twenty-five patients were enrolled in this study. The fusion imaging engine could be used successfully in 92% of the cases (n = 23). Freedom from failure of automatic segmentation was 44% (n = 11). Freedom from failure of the automatic registration was 76% (n = 19); the median error of the automatic registration process was 0 mm (interquartile range, 0-5 mm). The fully automated fusion imaging engine was found to be applicable in most cases, although in several cases fully automated data processing was not possible and manual intervention was required. The accuracy of the automatic registration yielded excellent results and promises a useful and simple-to-use technology. Copyright © 2016 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
Global risk of radioactive fallout after nuclear reactor accidents
NASA Astrophysics Data System (ADS)
Lelieveld, J.; Kunkel, D.; Lawrence, M. G.
2011-11-01
Reactor core meltdowns of nuclear power plants are rare, yet the consequences are catastrophic. But what is meant by "rare"? And what can be learned from the Chernobyl and Fukushima incidents? Here we assess the risk of exposure to radioactivity due to atmospheric dispersion of gases and particles following severe nuclear accidents, using particulate 137Cs and gaseous 131I as proxies for the fallout. It appears that previously the occurrence of major accidents and the risks of radioactive contamination have been underestimated. Using a global model of the atmosphere we compute that on average, in the event of a core melt of any nuclear power plant worldwide, more than 90% of emitted 137Cs would be transported beyond 50km and about 50% beyond 1000 km distance. This corroborates that such accidents have large-scale and trans-boundary impacts. Although the emission strengths and atmospheric removal processes of 137Cs and 131I are quite different, the radioactive contamination patterns over land and the human deposition exposure are computed to be similar. High human exposure risks occur around reactors in densely populated regions, notably in southern Asia where a core melt can subject 55 million people to radioactive contamination. The recent decision by Germany to phase out its nuclear reactors will reduce the national risk, though a large risk will still remain from the reactors in neighbouring countries.
Global risk of radioactive fallout after nuclear reactor accidents
NASA Astrophysics Data System (ADS)
Kunkel, D.; Lelieveld, J.; Lawrence, M. G.
2012-04-01
Reactor core meltdowns of nuclear power plants are rare, yet the consequences are catastrophic. But what is meant by "rare"? And what can be learned from the Chernobyl and Fukushima incidents? Here we assess the risk of exposure to radioactivity due to atmospheric dispersion of gases and particles following severe nuclear accidents, using particulate 137Cs and gaseous 131I as proxies for the fallout. It appears that previously the occurrence of major accidents and the risks of radioactive contamination have been underestimated. Using a global model of the atmosphere we compute that on average, in the event of a core melt of any nuclear power plant worldwide, more than 90 % of emitted 137Cs would be transported beyond 50 km and about 50 % beyond 1000 km distance. This corroborates that such accidents have large-scale and trans-boundary impacts. Although the emission strengths and atmospheric removal processes of 137Cs and 131I are quite different, the radioactive contamination patterns over land and the human deposition exposure are computed to be similar. High human exposure risks occur around reactors in densely populated regions, notably in southern Asia where a core melt can subject 55 million people to radioactive contamination. The recent decision by Germany to phase out its nuclear reactors will reduce the national risk, though a large risk will still remain from the reactors in neighbouring countries.
"First generation" automated DNA sequencing technology.
Slatko, Barton E; Kieleczawa, Jan; Ju, Jingyue; Gardner, Andrew F; Hendrickson, Cynthia L; Ausubel, Frederick M
2011-10-01
Beginning in the 1980s, automation of DNA sequencing has greatly increased throughput, reduced costs, and enabled large projects to be completed more easily. The development of automation technology paralleled the development of other aspects of DNA sequencing: better enzymes and chemistry, separation and imaging technology, sequencing protocols, robotics, and computational advancements (including base-calling algorithms with quality scores, database developments, and sequence analysis programs). Despite the emergence of high-throughput sequencing platforms, automated Sanger sequencing technology remains useful for many applications. This unit provides background and a description of the "First-Generation" automated DNA sequencing technology. It also includes protocols for using the current Applied Biosystems (ABI) automated DNA sequencing machines. © 2011 by John Wiley & Sons, Inc.
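The quality scores mentioned above are conventionally Phred scores, Q = -10 · log10(P_error), which map a base-calling error probability to a logarithmic quality value; a minimal sketch:

```python
import math

def phred_quality(p_error: float) -> int:
    """Phred quality score: Q = -10 * log10(P_error)."""
    return round(-10.0 * math.log10(p_error))

# A Q20 base call corresponds to a 1-in-100 chance of a miscall,
# Q30 to 1 in 1000.
q20 = phred_quality(0.01)    # -> 20
q30 = phred_quality(0.001)   # -> 30
```

These scores, attached per base by the base-calling software, are what downstream assembly and analysis programs use to weight or trim reads.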
Computer Applications in the Design Process.
ERIC Educational Resources Information Center
Winchip, Susan
Computer Assisted Design (CAD) and Computer Assisted Manufacturing (CAM) are emerging technologies now being used in home economics and interior design applications. A microcomputer in a computer network system is capable of executing computer graphic functions such as three-dimensional modeling, as well as utilizing office automation packages to…
Vision 20/20: Automation and advanced computing in clinical radiation oncology.
Moore, Kevin L; Kagadis, George C; McNutt, Todd R; Moiseenko, Vitali; Mutic, Sasa
2014-01-01
This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.
ANL statement of site strategy for computing workstations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fenske, K.R.; Boxberger, L.M.; Amiot, L.W.
1991-11-01
This Statement of Site Strategy describes the procedure at Argonne National Laboratory for defining, acquiring, using, and evaluating scientific and office workstations and related equipment and software in accord with DOE Order 1360.1A (5-30-85), and Laboratory policy. It is Laboratory policy to promote the installation and use of computing workstations to improve productivity and communications for both programmatic and support personnel, to ensure that computing workstations acquisitions meet the expressed need in a cost-effective manner, and to ensure that acquisitions of computing workstations are in accord with Laboratory and DOE policies. The overall computing site strategy at ANL is to develop a hierarchy of integrated computing system resources to address the current and future computing needs of the laboratory. The major system components of this hierarchical strategy are: Supercomputers, Parallel computers, Centralized general purpose computers, Distributed multipurpose minicomputers, and Computing workstations and office automation support systems. Computing workstations include personal computers, scientific and engineering workstations, computer terminals, microcomputers, word processing and office automation electronic workstations, and associated software and peripheral devices costing less than $25,000 per item.
Methodology for Prototyping Increased Levels of Automation for Spacecraft Rendezvous Functions
NASA Technical Reports Server (NTRS)
Hart, Jeremy J.; Valasek, John
2007-01-01
The Crew Exploration Vehicle necessitates higher levels of automation than previous NASA vehicles, due to program requirements for automation, including Automated Rendezvous and Docking. Studies of spacecraft development often point to the locus of decision-making authority between humans and computers (i.e. automation) as a prime driver for cost, safety, and mission success. Therefore, a critical component in the Crew Exploration Vehicle development is the determination of the correct level of automation. To identify the appropriate levels of automation and autonomy to design into a human space flight vehicle, NASA has created the Function-specific Level of Autonomy and Automation Tool. This paper develops a methodology for prototyping increased levels of automation for spacecraft rendezvous functions. This methodology is used to evaluate, via prototyping, the accuracy of the levels of automation specified by the Function-specific Level of Autonomy and Automation Tool. Spacecraft rendezvous planning tasks are selected and then prototyped in Matlab using fuzzy logic techniques and existing Space Shuttle rendezvous trajectory algorithms.
Automation of Periodic Reports
DOT National Transportation Integrated Search
1975-06-01
The manual is a user's guide to the automation of the 'Summary of National Transportation Statistics.' The System is stored on the in-house PDP-10 computer to provide ready access and retrieval of the data. The information stored in the system includ...
Human Factors Considerations in System Design
NASA Technical Reports Server (NTRS)
Mitchell, C. M. (Editor); Vanbalen, P. M. (Editor); Moe, K. L. (Editor)
1983-01-01
Human factors considerations in system design were examined, including human factors in automated command and control, the efficiency of the human-computer interface, and system effectiveness. The following topics are discussed: human factors aspects of control room design; design of interactive systems; human-computer dialogue, interaction tasks and techniques; guidelines on ergonomic aspects of control rooms and highly automated environments; system engineering for control by humans; conceptual models of information processing; information display and interaction in real-time environments.
A Computational Approach for Automated Posturing of a Human Finite Element Model
2016-07-01
A Computational Approach for Automated Posturing of a Human Finite Element Model (Memorandum Report, July 2016), by Justin McKee and Adam Sokolow. Posture affects protection by influencing the path by which loading is transferred into the body and is a major source of variability. Keywords: posture, human body, finite element, leg, spine. Approved for public release.
Automated High-Temperature Hall-Effect Apparatus
NASA Technical Reports Server (NTRS)
Parker, James B.; Zoltan, Leslie D.
1992-01-01
Automated apparatus takes Hall-effect measurements of specimens of thermoelectric materials at temperatures from ambient to 1,200 K using computer control to obtain better resolution of data and more data points about three times as fast as before. Four-probe electrical-resistance measurements taken in 12 electrical and 2 magnetic orientations to characterize specimens at each temperature. Computer acquires data, and controls apparatus via three feedback loops: one for temperature, one for magnetic field, and one for electrical-potential data.
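The per-point arithmetic behind a Hall-effect characterization is simple; the voltage, current, field, and thickness values below are illustrative assumptions, not data from the apparatus described above:

```python
# Hall-coefficient arithmetic for a single measurement point.
E_CHARGE = 1.602176634e-19   # elementary charge (C)

v_hall = 2.5e-3      # measured Hall voltage (V), assumed
current = 10e-3      # drive current (A), assumed
b_field = 0.5        # magnetic flux density (T), assumed
thickness = 1e-3     # specimen thickness (m), assumed

# Hall coefficient R_H = V_H * t / (I * B); for a single carrier type the
# carrier density follows as n = 1 / (|R_H| * e).
r_hall = v_hall * thickness / (current * b_field)
n_carriers = 1.0 / (abs(r_hall) * E_CHARGE)
```

Repeating this computation across the 12 electrical and 2 magnetic orientations, and averaging, is what lets an automated apparatus cancel thermal and misalignment offsets while collecting data much faster than manual operation.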
NASA Astrophysics Data System (ADS)
Gatti, Vijay; Hill, Jason; Mitra, Sunanda; Nutter, Brian
2014-03-01
Despite the availability in resource-rich regions of advanced scanning and 3-D imaging technologies in current ophthalmology practice, worldwide screening tests for early detection and progression of glaucoma still consist of a variety of simple tools, including fundus image-based parameters such as CDR (cup-to-disc diameter ratio) and CAR (cup-to-disc area ratio), especially in resource-poor regions. Reliable automated computation of the relevant parameters from fundus image sequences requires robust non-rigid registration and segmentation techniques. Recent research demonstrated that proper non-rigid registration of multi-view monocular fundus image sequences can result in acceptable segmentation of cup boundaries for automated computation of CAR and CDR. This work introduces a composite diffeomorphic demons registration algorithm for segmentation of cup boundaries from a sequence of monocular images and compares the resulting CAR and CDR values with those computed manually by experts and from 3-D visualization of stereo pairs. Our preliminary results show that automated computation of CDR and CAR from composite diffeomorphic segmentation of monocular image sequences yields values comparable with those from the other two techniques and thus may provide global healthcare with a cost-effective yet accurate tool for management of glaucoma in its early stage.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson-Moore, J.L.; Collins, D.B.; Hornbaker, A.L.
This two-part report provides an essentially complete listing of radioactive occurrences in Colorado, with a comprehensive bibliography and bibliographic cross-indexes. Part 1 lists approximately 3000 known radioactive occurrences with their locations and brief accounts of the geology, mineralogy, radioactivity, host rock, production data, and source of data for each. The occurrences are classified by host rock and plotted on U.S. Geological Survey 1° x 2° topographic quadrangle maps with a special 1:100,000-scale base map for the Uravan mineral belt. Part 2 contains the bibliography of approximately 2500 citations on radioactive mineral occurrences in the state, with cross-indexes by county, host rock, and the special categories of "Front Range," "Colorado Plateau," and "thorium." The term "occurrence" as used in this report is defined as any site where the concentration of uranium or thorium is at least 0.01% or where the range of radioactivity is greater than twice the background radioactivity. All citations and occurrence data are stored on computer diskettes for easy retrieval, correction, and updating.
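The report's occurrence criterion (uranium or thorium concentration of at least 0.01%, or radioactivity greater than twice background) translates directly into a screening predicate. This is a hypothetical sketch for illustration, not software from the report:

```python
def is_radioactive_occurrence(u_th_concentration_pct: float,
                              radioactivity: float,
                              background: float) -> bool:
    """Classify a site per the report's stated criteria: concentration of
    uranium or thorium at least 0.01%, or radioactivity greater than twice
    the local background. Function name and signature are illustrative."""
    return u_th_concentration_pct >= 0.01 or radioactivity > 2.0 * background
```

Either condition alone suffices, so a low-concentration site with anomalous radioactivity still qualifies.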
The interaction of representation and reasoning.
Bundy, Alan
2013-09-08
Automated reasoning is an enabling technology for many applications of informatics. These applications include verifying that a computer program meets its specification; enabling a robot to form a plan to achieve a task; and answering questions by combining information from diverse sources, e.g., on the Internet. How is automated reasoning possible? Firstly, knowledge of a domain must be stored in a computer, usually in the form of logical formulae. This knowledge might, for instance, have been entered manually, retrieved from the Internet, or perceived in the environment via sensors, such as cameras. Secondly, rules of inference are applied to old knowledge to derive new knowledge. Automated reasoning techniques have been adapted from logic, a branch of mathematics that was originally designed to formalize the reasoning of humans, especially mathematicians. My special interest is in the way that representation and reasoning interact. Successful reasoning depends both on appropriate representation of knowledge and on successful methods of reasoning. Failures of reasoning can suggest changes of representation. This process of representational change can also be automated. We will illustrate the automation of representational change by drawing on recent work in my research group.
Integrating the Allen Brain Institute Cell Types Database into Automated Neuroscience Workflow.
Stockton, David B; Santamaria, Fidel
2017-10-01
We developed software tools to download, extract features, and organize the Cell Types Database from the Allen Brain Institute (ABI) in order to integrate its whole cell patch clamp characterization data into the automated modeling/data analysis cycle. To expand the potential user base we employed both Python and MATLAB. The basic set of tools downloads selected raw data and extracts cell, sweep, and spike features, using ABI's feature extraction code. To facilitate data manipulation we added a tool to build a local specialized database of raw data plus extracted features. Finally, to maximize automation, we extended our NeuroManager workflow automation suite to include these tools plus a separate investigation database. The extended suite allows the user to integrate ABI experimental and modeling data into an automated workflow deployed on heterogeneous computer infrastructures, from local servers, to high performance computing environments, to the cloud. Since our approach is focused on workflow procedures our tools can be modified to interact with the increasing number of neuroscience databases being developed to cover all scales and properties of the nervous system.
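The "local specialized database of raw data plus extracted features" can be sketched with Python's built-in sqlite3. The schema, feature names, and cell id below are illustrative assumptions, not ABI's actual layout or the authors' tools:

```python
import sqlite3

# Minimal sketch of a local database of per-cell extracted features.
# Table layout, feature names, and the cell id are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE cell_features (
    cell_id INTEGER PRIMARY KEY,
    mean_spike_rate_hz REAL,
    resting_potential_mv REAL)""")

def store_features(cell_id, features):
    """Insert or update one cell's extracted features."""
    conn.execute("INSERT OR REPLACE INTO cell_features VALUES (?, ?, ?)",
                 (cell_id, features["mean_spike_rate_hz"],
                  features["resting_potential_mv"]))
    conn.commit()

store_features(101, {"mean_spike_rate_hz": 12.4,
                     "resting_potential_mv": -71.2})
row = conn.execute(
    "SELECT * FROM cell_features WHERE cell_id = 101").fetchone()
```

A single-file (or in-memory) database like this is easy to regenerate from downloaded raw data, which suits an automated workflow deployed across heterogeneous compute environments.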
The use of interactive computer vision and robot hand controllers for enhancing manufacturing safety
NASA Technical Reports Server (NTRS)
Marzwell, Neville I.; Jacobus, Charles J.; Peurach, Thomas M.; Mitchell, Brian T.
1994-01-01
Currently available robotic systems provide limited support for CAD-based model-driven visualization, sensing algorithm development and integration, and automated graphical planning systems. This paper describes ongoing work which provides the functionality necessary to apply advanced robotics to automated manufacturing and assembly operations. An interface has been built which incorporates 6-DOF tactile manipulation, displays for three-dimensional graphical models, and automated tracking functions which depend on automated machine vision. A set of tools for single and multiple focal plane sensor image processing and understanding has been demonstrated which utilizes object recognition models. The resulting tool will enable sensing and planning from computationally simple graphical objects. A synergistic interplay between human and operator vision is created from programmable feedback received from the controller. This approach can be used as the basis for implementing enhanced safety in automated robotics manufacturing, assembly, repair and inspection tasks in both ground and space applications. Thus, an interactive capability has been developed to match the modeled environment to the real task environment for safe and predictable task execution.
Refurbishment and Automation of Thermal Vacuum Facilities at NASA/GSFC
NASA Technical Reports Server (NTRS)
Dunn, Jamie; Gomez, Carlos; Donohue, John; Johnson, Chris; Palmer, John; Sushon, Janet
1999-01-01
The thermal vacuum facilities located at the Goddard Space Flight Center (GSFC) have supported both manned and unmanned space flight since the 1960s. Of the eleven facilities, currently ten of the systems are scheduled for refurbishment or replacement as part of a five-year implementation. Expected return on investment includes the reduction in test schedules, improvements in safety of facility operations, and reduction in the personnel support required for a test. Additionally, GSFC will become a global resource renowned for expertise in thermal engineering, mechanical engineering, and for the automation of thermal vacuum facilities and tests. Automation of the thermal vacuum facilities includes the utilization of Programmable Logic Controllers (PLCs), the use of Supervisory Control and Data Acquisition (SCADA) systems, and the development of a centralized Test Data Management System. These components allow the computer control and automation of mechanical components such as valves and pumps. The project of refurbishment and automation began in 1996 and has resulted in complete computer control of one facility (Facility 281), and the integration of electronically controlled devices and PLCs in multiple others.
McKenzie, Kirsten; Walker, Sue; Tong, Shilu
It remains unclear whether the change from a manual to an automated coding system (ACS) for deaths has significantly affected the consistency of Australian mortality data. The underlying causes of 34,000 deaths registered in 1997 in Australia were dual coded in ICD-9, manually and by an automated computer coding program. The diseases most affected by the change from manual coding to the ACS were senile/presenile dementia and pneumonia. The most common code assigned by the ACS to deaths given a manually assigned underlying cause of senile dementia was unspecified psychoses (37.2%). Only 12.5% of codes assigned by the ACS as senile dementia were coded the same by manual coders. This study indicates some important differences in mortality rates when comparing mortality data coded manually with data coded by an automated computer coding program. These differences may be related both to different interpretation of ICD coding rules between manual and automated coding and to different co-morbidities or co-existing conditions among demographic groups.
Applications of Automation Methods for Nonlinear Fracture Test Analysis
NASA Technical Reports Server (NTRS)
Allen, Phillip A.; Wells, Douglas N.
2013-01-01
Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; 3. lessening the need for expertise in solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; 4. and providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10, Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; 2. ASTM F2815, Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; 3. ASTM E2807, Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods used to ensure the validity of equations included in a test standard. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.
1974-07-01
automated manufacturing processes and a rough technoeconomic evaluation of those concepts. Our evaluation is largely based on estimates; therefore, the ... must be subjected to thorough analysis and experimental verification before they can be considered definitive. They are being published at this time ... hardware and sensor technology, manufacturing engineering, automation, and economic analysis. Members of this team inspected over thirty manufacturing ...
1980-07-25
matrix (DTM) and digital planimetric data, combined and integrated into so-called "data bases." I'll say more about this later. AUTOMATION OF ... projection with mechanical inversors to maintain the Scheimpflug condition. Some automation has been achieved, with computer control to determine rectifier ... matrix (DTM) form that is not necessarily collected from the same photography as that from which the orthophoto is being produced. Because they are
Nakanishi, Rine; Sankaran, Sethuraman; Grady, Leo; Malpeso, Jenifer; Yousfi, Razik; Osawa, Kazuhiro; Ceponiene, Indre; Nazarat, Negin; Rahmani, Sina; Kissel, Kendall; Jayawardena, Eranthi; Dailing, Christopher; Zarins, Christopher; Koo, Bon-Kwon; Min, James K; Taylor, Charles A; Budoff, Matthew J
2018-03-23
Our goal was to evaluate the efficacy of a fully automated method for assessing the image quality (IQ) of coronary computed tomography angiography (CCTA). The machine learning method was trained using 75 CCTA studies by mapping features (noise, contrast, misregistration scores, and un-interpretability index) to an IQ score based on manual ground truth data. The automated method was validated on a set of 50 CCTA studies and subsequently tested on a new set of 172 CCTA studies against visual IQ scores on a 5-point Likert scale. The area under the curve in the validation set was 0.96. In the 172 CCTA studies, our method yielded a Cohen's kappa statistic for the agreement between automated and visual IQ assessment of 0.67 (p < 0.01). In the groups graded good to excellent (n = 163), fair (n = 6), and poor (n = 3) by visual IQ scoring, 155, 5, and 2 of the patients received an automated IQ score > 50%, respectively. Fully automated assessment of the IQ of CCTA data sets by machine learning was reproducible and provided similar results compared with visual analysis within the limits of inter-operator variability. • The proposed method enables automated and reproducible image quality assessment. • Machine learning and visual assessments yielded comparable estimates of image quality. • Automated assessment potentially allows for more standardised image quality. • Image quality assessment enables standardization of clinical trial results across different datasets.
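The reported agreement statistic, Cohen's kappa, is computed from paired automated and visual scores as observed agreement corrected for chance agreement. A minimal pure-Python sketch (the label sequences in the test are invented, not study data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa for two equal-length label sequences.

    kappa = (p_observed - p_expected) / (1 - p_expected), where p_expected
    is the chance agreement implied by each rater's marginal label counts.
    """
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[label] * cb[label] for label in set(ca) | set(cb)) / (n * n)
    return (observed - expected) / (1 - expected)
```

Kappa of 1.0 means perfect agreement; values near 0 mean agreement no better than chance, which is why kappa rather than raw percent agreement is reported for ordinal IQ grades.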
Incorporating radioactive decay into charging and coagulation of multicomponent radioactive aerosols
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Yong-ha; Yiacoumi, Sotira; Nenes, Athanasios
Compositional changes caused by the decay of radionuclides in radioactive aerosols can influence their charging state, coagulation frequency, and size distribution throughout their atmospheric lifetime. The importance of such effects is unknown, as they have not been considered in microphysical and global radioactivity transport studies to date. Here, we explore the effects of compositional changes on the charging efficiency and coagulation rates of aerosols using a set of kinetic equations that couple all relevant processes (decay, charging and coagulation) and their evolution over time. Compared to a coupled aggregation-tracer model for the prediction of the radioactive composition of particulates undergoing coagulation, our kinetic approach can provide similar results using much less central processing unit time. Together with other considerations, our approach is computationally efficient enough to allow implementation in 3D atmospheric transport models. The decay of radionuclides and the production of decay products within radioactive aerosols may significantly affect the aerosol charging rates, and either hinder or promote the coagulation of multicomponent radioactive aerosols. Our results suggest that radiological phenomena occurring within radioactive aerosols, as well as subsequent effects on aerosol microphysics, should be considered in regional and global models to more accurately predict radioactivity transport in the atmosphere in case of a nuclear plant accident.
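The coupling of decay and coagulation kinetics can be illustrated in its simplest possible form: a constant coagulation kernel acting on the total number concentration, plus first-order decay of the aerosol-borne activity. This is a hypothetical toy sketch, not the paper's multicomponent kinetic model, and all parameter values are arbitrary:

```python
def simulate(n0, activity0, K, lam, t_end, steps):
    """Forward-Euler integration of the toy coupled system:
       dN/dt = -0.5 * K * N**2   (constant-kernel coagulation)
       dA/dt = -lam * A          (radioactive decay of total activity)
    Returns (N, A) at t_end."""
    dt = t_end / steps
    n, a = n0, activity0
    for _ in range(steps):
        n += dt * (-0.5 * K * n * n)
        a += dt * (-lam * a)
    return n, a

# Constant-kernel coagulation admits the analytic solution
# N(t) = N0 / (1 + 0.5 * K * N0 * t), a convenient consistency check.
n_num, a_num = simulate(n0=1e4, activity0=1.0, K=1e-6, lam=0.1,
                        t_end=100.0, steps=100000)
n_exact = 1e4 / (1 + 0.5 * 1e-6 * 1e4 * 100.0)
```

In the paper's full model the two processes interact (composition alters charging and hence the coagulation kernel), which is what makes the coupled kinetic formulation necessary; this sketch only shows the two rate equations marching together.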
Incorporating radioactive decay into charging and coagulation of multicomponent radioactive aerosols
Kim, Yong-ha; Yiacoumi, Sotira; Nenes, Athanasios; ...
2017-09-29
Jackson, Darryl D.; Hollen, Robert M.
1983-01-01
A new automatable cleaning apparatus, embodying a method for thoroughly and quickly cleaning a gauze electrode used in chemical analyses, is described. The method generates very little waste solution, which is very important in analyzing radioactive materials, especially in aqueous solutions. The cleaning apparatus can be used in a larger, fully automated controlled-potential coulometric apparatus. About 99.98% of a 5-mg plutonium sample was removed in less than 3 minutes, using only about 60 mL of rinse solution and two main rinse steps.
Automated QA/QC Check for Beta-Gamma Coincidence Detector
2007-09-01
of the ARSA, 222Rn gas can be introduced into the gas cell along with the radioactive xenon isotopes. While this radon decays via alpha decay and ... [Figure 2: γ-singles spectrum from a 222Rn spike; the peaks are primarily from the radon daughter 214Pb.] ... National Laboratory (PNNL), can collect and detect several radioxenon isotopes. The ARSA is very sensitive to 133Xe, 131mXe, 133mXe, and 135Xe due to the
ERIC Educational Resources Information Center
STONE, PHILIP J.
Automated language processing (content analysis) is engaged in new ventures in computer dialog as a result of new techniques in categorizing responses. A computer "need-achievement" scoring system has been developed. A set of computer programs, labeled "The General Inquirer," will score computer inputs with responses fed from…
Automated technologies needed to prevent radioactive materials from reentering the atmosphere
NASA Astrophysics Data System (ADS)
Buden, David; Angelo, Joseph A., Jr.
Project SIREN (Search, Intercept, Retrieve, Expulsion Nuclear) has been created to identify and evaluate the technologies and operational strategies needed to rendezvous with and capture aerospace radioactive materials (e.g., a distressed or spent space reactor core) before such materials can reenter the terrestrial atmosphere, and then to safely move these captured materials to an acceptable space destination for proper disposal. A major component of the current Project SIREN effort is the development of an interactive technology model (including a computerized data base) that explores, in building-block fashion, the interaction of the technologies and procedures needed to successfully accomplish a SIREN mission. This SIREN model will include appropriate national and international technology elements, both contemporary and projected into the next century. To permit maximum flexibility and use, the SIREN technology data base is being programmed for use on 386-class PCs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Andrew; Lawrence, Earl
The Response Surface Modeling (RSM) Tool Suite is a collection of three codes used to generate an empirical interpolation function for a collection of drag coefficient calculations computed with Test Particle Monte Carlo (TPMC) simulations. The first code, "Automated RSM", automates the generation of a drag coefficient RSM for a particular object to a single command. "Automated RSM" first creates a Latin Hypercube Sample (LHS) of 1,000 ensemble members to explore the global parameter space. For each ensemble member, a TPMC simulation is performed and the object drag coefficient is computed. In the next step of the "Automated RSM" code, a Gaussian process is used to fit the TPMC simulations. In the final step, Markov Chain Monte Carlo (MCMC) is used to evaluate the non-analytic probability distribution function from the Gaussian process. The second code, "RSM Area", creates a look-up table for the projected area of the object based on input limits on the minimum and maximum allowed pitch and yaw angles and pitch and yaw angle intervals. The projected area from the look-up table is used to compute the ballistic coefficient of the object based on its pitch and yaw angle. An accurate ballistic coefficient is crucial in accurately computing the drag on an object. The third code, "RSM Cd", uses the RSM generated by the "Automated RSM" code and the projected area look-up table generated by the "RSM Area" code to accurately compute the drag coefficient and ballistic coefficient of the object. The user can modify the object velocity, object surface temperature, the translational temperature of the gas, the species concentrations of the gas, and the pitch and yaw angles of the object. Together, these codes allow for the accurate derivation of an object's drag coefficient and ballistic coefficient under any conditions with only knowledge of the object's geometry and mass.
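The suite's first two steps (Latin Hypercube Sampling of the parameter space, then a Gaussian-process fit to the expensive simulation outputs) can be sketched on a toy one-dimensional problem. The `toy_cd` stand-in for a TPMC run, the kernel length scale, and the jitter value are all illustrative assumptions, not the tool suite's actual settings:

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n, d):
    """n stratified samples in [0,1]^d, one per row: each dimension is cut
    into n equal strata with one random point per stratum, then shuffled."""
    samples = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    for j in range(d):
        rng.shuffle(samples[:, j])
    return samples

def rbf(a, b, length=0.1):
    """Squared-exponential kernel between two 1-D point sets."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-0.5 * d2 / length**2)

# Hypothetical stand-in for the expensive TPMC drag-coefficient calculation.
def toy_cd(x):
    return 2.2 + 0.3 * np.sin(6 * x)

x_train = latin_hypercube(20, 1).ravel()
y_train = toy_cd(x_train)

# Gaussian-process regression with a small jitter for numerical stability.
K = rbf(x_train, x_train) + 1e-8 * np.eye(len(x_train))
alpha = np.linalg.solve(K, y_train)

def gp_predict(x_new):
    """Posterior mean of the GP at new points (noise-free observations)."""
    return rbf(np.atleast_1d(np.asarray(x_new, dtype=float)), x_train) @ alpha
```

Once fitted, `gp_predict` replaces the expensive simulation inside downstream codes, which is the essence of a response surface model.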
Lean coding machine. Facilities target productivity and job satisfaction with coding automation.
Rollins, Genna
2010-07-01
Facilities are turning to coding automation to help manage the volume of electronic documentation, streamlining workflow, boosting productivity, and increasing job satisfaction. As EHR adoption increases, computer-assisted coding may become a necessity, not an option.
Automated drafting system uses computer techniques
NASA Technical Reports Server (NTRS)
Millenson, D. H.
1966-01-01
Automated drafting system produces schematic and block diagrams from the design engineer's freehand sketches. The system codes conventional drafting symbols and their coordinate locations on standard-size drawings for entry on tapes that are used to drive a high-speed photocomposition machine.
Automated tetraploid genotype calling by hierarchical clustering
USDA-ARS?s Scientific Manuscript database
SNP arrays are transforming breeding and genetics research for autotetraploids. To fully utilize these arrays, however, the relationship between signal intensity and allele dosage must be inferred independently for each marker. We developed an improved computational method to automate this process, ...
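The dosage-calling idea (cluster the per-marker signal intensities, then label clusters by rank) can be sketched with SciPy's hierarchical clustering on synthetic data. The intensity model, noise level, and cluster count below are assumptions for illustration, not the manuscript's actual method:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# Synthetic normalized signal intensities for one SNP marker across samples;
# an autotetraploid has up to five dosage groups (0..4 alternate-allele copies).
rng = np.random.default_rng(1)
centers = np.array([0.0, 0.25, 0.5, 0.75, 1.0])   # idealized dosage ratios
intensity = np.concatenate([c + 0.01 * rng.standard_normal(8) for c in centers])

# Average-linkage hierarchical clustering on the 1-D intensities,
# cut into five flat clusters (one per candidate dosage).
Z = linkage(intensity.reshape(-1, 1), method="average")
labels = fcluster(Z, t=5, criterion="maxclust")

# Order cluster ids by mean intensity to map them onto dosage calls 0..4.
unique = np.unique(labels)
order = np.argsort([intensity[labels == k].mean() for k in unique])
dosage = {unique[k]: d for d, k in enumerate(order)}
calls = np.array([dosage[l] for l in labels])
```

Because the intensity-to-dosage relationship differs marker by marker, this clustering step has to be repeated independently for every marker on the array, which is why automating it matters.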
AUTOMATION OF EXPERIMENTS WITH A HAND-HELD PROGRAMMABLE CALCULATOR
Technological developments have dramatically reduced the cost of data collection, experimental control and computation. Products are now available which allow automation of experiments both in the laboratory and in the field at substantially lower cost and with less technical exp...
The 3D Euler solutions using automated Cartesian grid generation
NASA Technical Reports Server (NTRS)
Melton, John E.; Enomoto, Francis Y.; Berger, Marsha J.
1993-01-01
Viewgraphs on 3-dimensional Euler solutions using automated Cartesian grid generation are presented. Topics covered include: computational fluid dynamics (CFD) and the design cycle; Cartesian grid strategy; structured body fit; grid generation; prolate spheroid; and ONERA M6 wing.
Iwanishi, Katsuhiro; Watabe, Hiroshi; Hayashi, Takuya; Miyake, Yoshinori; Minato, Kotaro; Iida, Hidehiro
2009-06-01
Cerebral blood flow (CBF), cerebral metabolic rate of oxygen (CMRO(2)), oxygen extraction fraction (OEF), and cerebral blood volume (CBV) are quantitatively measured with PET with (15)O gases. Kudomi et al. developed a dual tracer autoradiographic (DARG) protocol that enables the duration of a PET study to be shortened by sequentially administrating (15)O(2) and C(15)O(2) gases. In this protocol, before the sequential PET scan with (15)O(2) and C(15)O(2) gases ((15)O(2)-C(15)O(2) PET scan), a PET scan with C(15)O should be preceded to obtain CBV image. C(15)O has a high affinity for red blood cells and a very slow washout rate, and residual radioactivity from C(15)O might exist during a (15)O(2)-C(15)O(2) PET scan. As the current DARG method assumes no residual C(15)O radioactivity before scanning, we performed computer simulations to evaluate the influence of the residual C(15)O radioactivity on the accuracy of measured CBF and OEF values with DARG method and also proposed a subtraction technique to minimize the error due to the residual C(15)O radioactivity. In the simulation, normal and ischemic conditions were considered. The (15)O(2) and C(15)O(2) PET count curves with the residual C(15)O PET counts were generated by the arterial input function with the residual C(15)O radioactivity. The amounts of residual C(15)O radioactivity were varied by changing the interval between the C(15)O PET scan and (15)O(2)-C(15)O(2) PET scan, and the absolute inhaled radioactivity of the C(15)O gas. Using the simulated input functions and the PET counts, the CBF and OEF were computed by the DARG method. Furthermore, we evaluated a subtraction method that subtracts the influence of the C(15)O gas in the input function and PET counts. Our simulations revealed that the CBF and OEF values were underestimated by the residual C(15)O radioactivity. The magnitude of this underestimation depended on the amount of C(15)O radioactivity and the physiological conditions. 
This underestimation was corrected by the subtraction method. This study showed the influence of C(15)O radioactivity in DARG protocol, and the magnitude of the influence was affected by several factors, such as the radioactivity of C(15)O, and the physiological condition.
NASA Astrophysics Data System (ADS)
van Leunen, J. A. J.; Dreessen, J.
1984-05-01
The result of a measurement of the modulation transfer function is only useful as long as it is accompanied by a complete description of all relevant measuring conditions involved. For this reason it is necessary to file a full description of the relevant measuring conditions together with the results. In earlier times some of our results were rendered useless because some of the relevant measuring conditions were accidentally not written down and were forgotten. This was mainly due to the lack of consensus about which measuring conditions had to be filed together with the result of a measurement. One way to secure uniform and complete archiving of measuring conditions and results is to automate the data handling. An attendant advantage of automating the data handling is that it does away with the time-consuming correction of rough measuring results. The automation of the data handling was accomplished with rather cheap desktop computers, which were nevertheless powerful enough to allow us to automate the measurement as well. After automating the data handling we started with automatic collection of rough measurement data. Step by step we extended the automation by letting the desktop computer control more and more of the measuring set-up. At present the desktop computer controls all the electrical and most of the mechanical measuring conditions. Further, it controls and reads the MTF measuring instrument. Focussing and orientation optimization can be fully automatic, semi-automatic or completely manual. MTF measuring results can be collected automatically, but they can also be typed in by hand. Due to the automation we are able to implement proper archival of measuring results together with all necessary measuring conditions. The improved measuring efficiency made it possible to increase the number of routine measurements done in the same time period by an order of magnitude. To our surprise the measuring accuracy also improved by a factor of two.
This was due to the much better reproducibility of the automatic optimization, which resulted in better reproducibility of the measurement result. Another advantage of the automation is that the programs that control the data handling and the automatic measurement are "user friendly". They guide the operator through the measuring procedure using information from earlier measurements of equivalent test specimens. This makes it possible to let routine measurements be done by much less skilled assistants. It also removes much of the tedious routine labour normally involved in MTF measurements. It can be concluded that automation of MTF measurements as described in the foregoing enhances the usefulness of MTF results as well as reducing the cost of MTF measurements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oetting, W.S.; Lee, H.K.; Flanders, D.J.
The use of short tandem repeat polymorphisms (STRPs) as marker loci for linkage analysis is becoming increasingly important due to their large numbers in the human genome and their high degree of polymorphism. Fluorescence-based detection of the STRP pattern with an automated DNA sequencer has improved the efficiency of this technique by eliminating the need for radioactivity and producing a digitized autoradiogram-like image that can be used for computer analysis. In an effort to simplify the procedure and to reduce the cost of fluorescence STRP analysis, we have developed a technique known as multiplexing STRPs with tailed primers (MSTP), using primers that have a 19-bp extension, identical to the sequence of an M13 sequencing primer, on the 5′ end of the forward primer, in conjunction with multiplexing several primer pairs in a single polymerase chain reaction (PCR) amplification. The banding pattern is detected with the addition of the M13 primer-dye conjugate as the sole primer conjugated to the fluorescent dye, eliminating the need for direct conjugation of the infrared fluorescent dye to the STRP primers. The use of MSTP for linkage analysis greatly reduces the number of PCR reactions. Up to five primer pairs can be multiplexed together in the same reaction. At present, a set of 148 STRP markers spaced at an average genetic distance of 28 cM throughout the autosomal genome can be analyzed in 37 sets of multiplexed amplification reactions. We have automated the analysis of these patterns for linkage using software that both detects the STRP banding pattern and determines band sizes. This information can then be exported in a user-defined format from a database manager for linkage analysis. 15 refs., 2 figs., 4 tabs.
Computers in the General Physics Laboratory.
ERIC Educational Resources Information Center
Preston, Daryl W.; Good, R. H.
1996-01-01
Provides ideas and outcomes for nine computer laboratory experiments using a commercial eight-bit analog to digital (ADC) interface. Experiments cover statistics; rotation; harmonic motion; voltage, current, and resistance; ADC conversions; temperature measurement; single slit diffraction; and radioactive decay. Includes necessary schematics. (MVL)
DOT National Transportation Integrated Search
1996-01-01
The RISKIND computer program was developed for the analysis of radiological consequences and health risks to individuals and the collective population from exposures associated with the transportation of spent nuclear fuel (SNF) or other radioactive ...
Concepts and algorithms for terminal-area traffic management
NASA Technical Reports Server (NTRS)
Erzberger, H.; Chapel, J. D.
1984-01-01
The nation's air-traffic-control system is the subject of an extensive modernization program, including the planned introduction of advanced automation techniques. This paper gives an overview of a concept for automating terminal-area traffic management. Four-dimensional (4D) guidance techniques, which play an essential role in the automated system, are reviewed. One technique, intended for on-board computer implementation, is based on application of optimal control theory. The second technique is a simplified approach to 4D guidance intended for ground computer implementation. It generates advisory messages to help the controller maintain scheduled landing times of aircraft not equipped with on-board 4D guidance systems. An operational system for the second technique, recently evaluated in a simulation, is also described.
Automated Illustration of Patient Instructions
Bui, Duy; Nakamura, Carlos; Bray, Bruce E.; Zeng-Treitler, Qing
2012-01-01
A picture can be a powerful communication tool. However, creating pictures to illustrate patient instructions can be a costly and time-consuming task. Building on our prior research in this area, we developed a computer application that automatically converts text to pictures using natural language processing and computer graphics techniques. After iterative testing, the automated illustration system was evaluated using 49 previously unseen cardiology discharge instructions. The completeness of the system-generated illustrations was assessed by three raters using a three-level scale. The average inter-rater agreement for text correctly represented in the pictograph was about 66 percent. Since illustration in this context is intended to enhance rather than replace text, these results support the feasibility of conducting automated illustration. PMID:23304392
Computer-Aided Software Engineering - An approach to real-time software development
NASA Technical Reports Server (NTRS)
Walker, Carrie K.; Turkovich, John J.
1989-01-01
A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.
An automated approach to the design of decision tree classifiers
NASA Technical Reports Server (NTRS)
Argentiero, P.; Chin, R.; Beaudet, P.
1982-01-01
An automated technique is presented for designing effective decision tree classifiers predicated only on a priori class statistics. The procedure relies on linear feature extractions and Bayes table look-up decision rules. Associated error matrices are computed and utilized to provide an optimal design of the decision tree at each so-called 'node'. A by-product of this procedure is a simple algorithm for computing the global probability of correct classification assuming the statistical independence of the decision rules. Attention is given to a more precise definition of decision tree classification, the mathematical details on the technique for automated decision tree design, and an example of a simple application of the procedure using class statistics acquired from an actual Landsat scene.
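The by-product algorithm mentioned in this abstract — the global probability of correct classification under statistically independent decision rules — reduces to multiplying per-node correct-decision probabilities along each class's path through the tree and weighting by class priors. A minimal sketch; the class names, priors, and per-node accuracies below are hypothetical, not values from the paper:

```python
def global_p_correct(priors, paths):
    """Global probability of correct classification for a decision tree.

    priors: {class: prior probability}
    paths:  {class: [probability of a correct decision at each node
                     on that class's path, assumed independent]}
    """
    total = 0.0
    for cls, prior in priors.items():
        p_path = 1.0
        for p_node in paths[cls]:
            p_path *= p_node  # independence: probabilities multiply
        total += prior * p_path
    return total

# Hypothetical three-class remote-sensing example.
priors = {"water": 0.5, "forest": 0.3, "urban": 0.2}
paths = {"water": [0.98], "forest": [0.95, 0.9], "urban": [0.95, 0.88]}
print(round(global_p_correct(priors, paths), 4))  # → 0.9137
```

The per-node probabilities would come from the error matrices computed at each node during the automated design.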
Computer aided fixture design - A case based approach
NASA Astrophysics Data System (ADS)
Tanji, Shekhar; Raiker, Saiesh; Mathew, Arun Tom
2017-11-01
Automated fixture design plays an important role in process planning and in the integration of CAD and CAM. An automated fixture setup design system is developed in which, once fixturing surfaces and points are described, modular fixture components are automatically selected to generate fixture units and are placed into position so that assembly conditions are satisfied. In the past, various knowledge-based systems have been developed to implement CAFD in practice. In this paper, to obtain an acceptable automated machining fixture design, a case-based reasoning method with a purpose-built retrieval system is proposed. The Visual Basic (VB) programming language is used in integration with the SolidWorks API (Application Programming Interface) module for a better retrieval procedure, reducing computational time. These properties are incorporated in a numerical simulation to determine the best fit for practical use.
Continuous stacking computational approach based automated microscope slide scanner
NASA Astrophysics Data System (ADS)
Murali, Swetha; Adhikari, Jayesh Vasudeva; Jagannadh, Veerendra Kalyan; Gorthi, Sai Siva
2018-02-01
Cost-effective and automated acquisition of whole slide images is a bottleneck for wide-scale deployment of digital pathology. In this article, a computation augmented approach for the development of an automated microscope slide scanner is presented. The realization of a prototype device built using inexpensive off-the-shelf optical components and motors is detailed. The applicability of the developed prototype to clinical diagnostic testing is demonstrated by generating good quality digital images of malaria-infected blood smears. Further, the acquired slide images have been processed to identify and count the number of malaria-infected red blood cells and thereby perform quantitative parasitemia level estimation. The presented prototype would enable cost-effective deployment of slide-based cyto-diagnostic testing in endemic areas.
Computers and Technological Forecasting
ERIC Educational Resources Information Center
Martino, Joseph P.
1971-01-01
Forecasting is becoming increasingly automated, thanks in large measure to the computer. It is now possible for a forecaster to submit his data to a computation center and call for the appropriate program. (No knowledge of statistics is required.) (Author)
Computer Administering of the Psychological Investigations: Set-Relational Representation
NASA Astrophysics Data System (ADS)
Yordzhev, Krasimir
Computer administering of a psychological investigation is the computer representation of the entire procedure of psychological assessment: test construction, test implementation, results evaluation, storage and maintenance of the developed database, and its statistical processing, analysis and interpretation. A mathematical description of psychological assessment with the aid of personality tests is discussed in this article. Set theory and relational algebra are used in this description. A relational model of the data needed to design a computer system for automation of certain psychological assessments is given. Some finite sets, and relations on them, which are necessary for creating a personality psychological test, are described. The described model could be used to develop real software for computer administering of any psychological test, with full automation of the whole process: test construction, test implementation, result evaluation, storage of the developed database, statistical processing, analysis and interpretation. A software project for computer administering of personality psychological tests is suggested.
Industrial applications of automated X-ray inspection
NASA Astrophysics Data System (ADS)
Shashishekhar, N.
2015-03-01
Many industries require that 100% of manufactured parts be X-ray inspected. Factors such as high production rates, focus on inspection quality, operator fatigue and inspection cost reduction translate to an increasing need for automating the inspection process. Automated X-ray inspection involves the use of image processing algorithms and computer software for analysis and interpretation of X-ray images. This paper presents industrial applications and illustrative case studies of automated X-ray inspection in areas such as automotive castings, fuel plates, air-bag inflators and tires. It is usually necessary to employ application-specific automated inspection strategies and techniques, since each application has unique characteristics and interpretation requirements.
NASA Technical Reports Server (NTRS)
Clancey, William J.
2003-01-01
A human-centered approach to computer systems design involves reframing analysis in terms of people interacting with each other, not only human-machine interaction. The primary concern is not how people can interact with computers, but how shall we design computers to help people work together? An analysis of astronaut interactions with CapCom on Earth during one traverse of Apollo 17 shows what kind of information was conveyed and what might be automated today. A variety of agent and robotic technologies are proposed that deal with recurrent problems in communication and coordination during the analyzed traverse.
ERIC Educational Resources Information Center
Popovich, Donna
This descriptive study surveys the staff of all 18 founding member libraries of OhioLINK to see whether they prefer the new system or the old one, and why. The purpose of the study is to determine whether resistance to change, computer anxiety, and technostress can be found in libraries converting their automated systems over to the OhioLINK…
Software For Computer-Security Audits
NASA Technical Reports Server (NTRS)
Arndt, Kate; Lonsford, Emily
1994-01-01
Information relevant to potential breaches of security gathered efficiently. Automated Auditing Tools for VAX/VMS program includes following automated software tools performing noted tasks: Privileged ID Identification, program identifies users and their privileges to circumvent existing computer security measures; Critical File Protection, critical files not properly protected identified; Inactive ID Identification, identifications of users no longer in use found; Password Lifetime Review, maximum lifetimes of passwords of all identifications determined; and Password Length Review, minimum allowed length of passwords of all identifications determined. Written in DEC VAX DCL language.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The ARES (Automated Residential Energy Standard) User's Guide is designed to help the user successfully operate the ARES computer program. This guide assumes that the user is familiar with basic PC skills such as using a keyboard and loading a disk drive. The ARES computer program was designed to assist building code officials in creating a residential energy standard based on local climate and costs.
Automated CPX support system preliminary design phase
NASA Technical Reports Server (NTRS)
Bordeaux, T. A.; Carson, E. T.; Hepburn, C. D.; Shinnick, F. M.
1984-01-01
The development of the Distributed Command and Control System (DCCS) is discussed. The development of an automated C2 system stimulated the development of an automated command post exercise (CPX) support system to provide a more realistic stimulus to DCCS than could be achieved with the existing manual system. An automated CPX system to support corps-level exercises was designed. The effort comprised four tasks: (1) collecting and documenting user requirements; (2) developing a preliminary system design; (3) defining a program plan; and (4) evaluating the suitability of the TRASANA FOURCE computer model.
CFD Process Pre- and Post-processing Automation in Support of Space Propulsion
NASA Technical Reports Server (NTRS)
Dorney, Suzanne M.
2003-01-01
The use of Computational Fluid Dynamics or CFD has become standard practice in the design and analysis of the major components used for space propulsion. In an attempt to standardize and improve the CFD process a series of automated tools have been developed. Through the use of these automated tools the application of CFD to the design cycle has been improved and streamlined. This paper presents a series of applications in which deficiencies were identified in the CFD process and corrected through the development of automated tools.
Computer-Assisted Automated Scoring of Polysomnograms Using the Somnolyzer System
Punjabi, Naresh M.; Shifa, Naima; Dorffner, Georg; Patil, Susheel; Pien, Grace; Aurora, Rashmi N.
2015-01-01
Study Objectives: Manual scoring of polysomnograms is a time-consuming and tedious process. To expedite the scoring of polysomnograms, several computerized algorithms for automated scoring have been developed. The overarching goal of this study was to determine the validity of the Somnolyzer system, an automated system for scoring polysomnograms. Design: The analysis sample comprised 97 sleep studies. Each polysomnogram was manually scored by certified technologists from four sleep laboratories and concurrently subjected to automated scoring by the Somnolyzer system. Agreement between manual and automated scoring was examined. Sleep staging and scoring of disordered-breathing events were conducted using the 2007 American Academy of Sleep Medicine criteria. Setting: Clinical sleep laboratories. Measurements and Results: A high degree of agreement was noted between manual and automated scoring of the apnea-hypopnea index (AHI). The average correlation between the manually scored AHI values across the four clinical sites was 0.92 (95% confidence interval: 0.90–0.93). Similarly, the average correlation between the manual and Somnolyzer-scored AHI values was 0.93 (95% confidence interval: 0.91–0.96). Thus, the interscorer correlation between the manually scored results was no different from that derived from manual and automated scoring. Substantial concordance in the arousal index, total sleep time, and sleep efficiency between manual and automated scoring was also observed. In contrast, differences were noted between the manually and automatically scored percentages of sleep stages N1, N2, and N3. Conclusion: Automated analysis of polysomnograms using the Somnolyzer system provides results that are comparable to manual scoring for commonly used metrics in sleep medicine.
Although differences exist between manual versus automated scoring for specific sleep stages, the level of agreement between manual and automated scoring is not significantly different than that between any two human scorers. In light of the burden associated with manual scoring, automated scoring platforms provide a viable complement of tools in the diagnostic armamentarium of sleep medicine. Citation: Punjabi NM, Shifa N, Dorffner G, Patil S, Pien G, Aurora RN. Computer-assisted automated scoring of polysomnograms using the Somnolyzer system. SLEEP 2015;38(10):1555–1566. PMID:25902809
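The agreement statistic reported in this record is a plain Pearson correlation between two scorers' AHI values, which can be sketched in a few lines. The AHI numbers below are made up for illustration, not the study's data:

```python
import numpy as np

# Hypothetical AHI values for the same six studies, scored by a
# human technologist and by an automated system.
manual = np.array([5.2, 14.8, 33.1, 8.7, 41.0, 19.5])
auto = np.array([4.9, 15.6, 31.7, 9.9, 42.8, 18.1])

# Pearson correlation: the off-diagonal entry of the 2x2 correlation matrix.
r = np.corrcoef(manual, auto)[0, 1]
print(round(float(r), 3))
```

With paired scores from many studies, the same computation yields the interscorer correlations (0.92–0.93) cited in the abstract.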
Model-Based Design of Air Traffic Controller-Automation Interaction
NASA Technical Reports Server (NTRS)
Romahn, Stephan; Callantine, Todd J.; Palmer, Everett A.; Null, Cynthia H. (Technical Monitor)
1998-01-01
A model of controller and automation activities was used to design the controller-automation interactions necessary to implement a new terminal area air traffic management concept. The model was then used to design a controller interface that provides the requisite information and functionality. Using data from a preliminary study, the Crew Activity Tracking System (CATS) was used to help validate the model as a computational tool for describing controller performance.
Precision Relative Positioning for Automated Aerial Refueling from a Stereo Imaging System
2015-03-01
Thesis by Kyle P. Werner, 2Lt, USAF (AFIT-ENG-MS-15-M-048), presented to the Faculty, Department of Electrical and Computer Engineering, Graduate School of… Approved for public release; distribution unlimited.
West Europe Report, Science and Technology.
1986-04-15
SEIAF: ELSAG/IBM's New Creation in Factory Automation (Mauro Flego interview; AUTOMAZIONE INTEGRATA, Milan, in Italian, Apr 85, pp 110-112; noted in BLICK DURCH DIE WIRTSCHAFT, 21 Feb 86, p 38). [Question] What are the objectives of SEIAF? [Answer] SEIAF, or better, the joint venture ELSAG/IBM, concerns itself with electronic and computer systems for factory automation.
An Automated Weather Research and Forecasting (WRF)-Based Nowcasting System: Software Description
2013-10-01
A Web service/Web interface software package has been engineered to address the need for an automated means to run the Weather Research and Forecasting (WRF) model. By Stephen F. Kirby, Brian P. Reen, and Robert E. Dumais Jr., Computational and Information Sciences…
Faster Aerodynamic Simulation With Cart3D
NASA Technical Reports Server (NTRS)
2003-01-01
A NASA-developed aerodynamic simulation tool is ensuring the safety of future space operations while providing designers and engineers with an automated, highly accurate computer simulation suite. Cart3D, co-winner of NASA's 2002 Software of the Year award, is the result of over 10 years of research and software development conducted by Michael Aftosmis and Dr. John Melton of Ames Research Center and Professor Marsha Berger of the Courant Institute at New York University. Cart3D offers a revolutionary approach to computational fluid dynamics (CFD), the computer simulation of how fluids and gases flow around an object of a particular design. By fusing technological advancements in diverse fields such as mineralogy, computer graphics, computational geometry, and fluid dynamics, the software provides a new industrial geometry processing and fluid analysis capability with unsurpassed automation and efficiency.
Vision 20/20: Automation and advanced computing in clinical radiation oncology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, Kevin L., E-mail: kevinmoore@ucsd.edu; Moiseenko, Vitali; Kagadis, George C.
2014-01-15
This Vision 20/20 paper considers what computational advances are likely to be implemented in clinical radiation oncology in the coming years and how the adoption of these changes might alter the practice of radiotherapy. Four main areas of likely advancement are explored: cloud computing, aggregate data analyses, parallel computation, and automation. As these developments promise both new opportunities and new risks to clinicians and patients alike, the potential benefits are weighed against the hazards associated with each advance, with special considerations regarding patient safety under new computational platforms and methodologies. While the concerns of patient safety are legitimate, the authors contend that progress toward next-generation clinical informatics systems will bring about extremely valuable developments in quality improvement initiatives, clinical efficiency, outcomes analyses, data sharing, and adaptive radiotherapy.
77 FR 67381 - Government-Owned Inventions; Availability for Licensing
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-09
"Computational and Experimental RNA Nanoparticle Design," in Automation in Genomics and Proteomics: An Engineering… Development Stage: Prototype. Pre-clinical in vitro data available. Inventors: Robert J. Crouch and Yutaka…
Program for improved electrical harness documentation and fabrication
NASA Technical Reports Server (NTRS)
1971-01-01
Computer program provides automated print-out of harness interconnection table and automated cross-check of reciprocal pin/connector assignments, and improves accuracy and reliability of final documented data. Programs and corresponding library tapes are successfully and continuously employed on Nimbus spacecraft programs.
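The reciprocal cross-check described above amounts to verifying that every (connector, pin) link in the interconnection table is listed consistently in both directions. A minimal sketch with hypothetical connector names (the original Nimbus program's data formats are not described in the record):

```python
def cross_check(table):
    """Return links whose reciprocal entry is missing or inconsistent.

    table maps (connector, pin) -> (connector, pin); a consistent
    harness table contains each link in both directions.
    """
    errors = []
    for src, dst in table.items():
        if table.get(dst) != src:
            errors.append((src, dst))
    return errors

# A two-entry table describing one wire between J1 pin 1 and J2 pin 3.
table = {("J1", 1): ("J2", 3), ("J2", 3): ("J1", 1)}
print(cross_check(table))  # prints [] — every link is reciprocal
```

Dropping either entry from the table makes `cross_check` report the orphaned link, which is exactly the documentation error the automated cross-check guards against.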
Computer Science and Technology Publications. NBS Publications List 84.
ERIC Educational Resources Information Center
National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.
This bibliography lists publications of the Institute for Computer Sciences and Technology of the National Bureau of Standards. Publications are listed by subject in the areas of computer security, computer networking, and automation technology. Sections list publications of: (1) current Federal Information Processing Standards; (2) computer…
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2012 CFR
2012-01-01
... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2014 CFR
2014-01-01
... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2013 CFR
2013-01-01
... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2011 CFR
2011-01-01
... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...
10 CFR 727.2 - What are the definitions of the terms used in this part?
Code of Federal Regulations, 2010 CFR
2010-01-01
... information. Computer means desktop computers, portable computers, computer networks (including the DOE network and local area networks at or controlled by DOE organizations), network devices, automated.... DOE means the Department of Energy, including the National Nuclear Security Administration. DOE...
Advanced computer architecture for large-scale real-time applications.
DOT National Transportation Integrated Search
1973-04-01
Air traffic control automation is identified as a crucial problem which provides a complex, real-time computer application environment. A novel computer architecture in the form of a pipeline associative processor is conceived to achieve greater perf...
Computer grading of examinations
NASA Technical Reports Server (NTRS)
Frigerio, N. A.
1969-01-01
A method, using IBM cards and computer processing, automates examination grading and recording and permits use of computational problems. The student generates his own answers, and the instructor has much greater freedom in writing questions than is possible with multiple choice examinations.
NASA Astrophysics Data System (ADS)
Sasaki, Syota; Yamada, Tadashi; Yamada, Tomohito J.
2014-05-01
We aim to propose a kinematic-based methodology, similar to runoff analysis, for readily understandable radiological protection. A merit of this methodology is that it produces sufficiently accurate effective doses by basic analysis. The great earthquake struck the north-east area of Japan on March 11, 2011. The electrical facilities needed to control the Fukushima Daiichi nuclear power plant were completely destroyed by the tsunamis that followed. From the damaged reactor containment vessels, a quantity of radioactive isotopes leaked and was dispersed in the vicinity of the plant. Radiological internal exposure caused by ingestion of food containing radioactive isotopes has become an issue of great interest to the public, and has caused excessive anxiety because of a deficiency of fundamental knowledge concerning radioactivity. Concentrations of radioactivity in the human body and internal exposure have been studied extensively. Previous radiologic studies, for example those by the International Commission on Radiological Protection (ICRP), employ large-scale computational simulations that include the actual mechanisms of metabolism in the human body. While computational simulation is the standard method for calculating exposure doses among radiology specialists, these methods, although exact, are too difficult for non-specialists to grasp as a whole owing to their sophistication. In this study, the human body is treated as a vessel. The number of radioactive atoms in the human body can be described by an equation of continuity, which is the only governing equation. The half-life, the period of time required for the amount of a substance to decrease by half, is the only parameter needed to calculate the number of radioactive isotopes in the human body. The half-life depends only on the kind of nuclide; there are no arbitrary parameters. It is known that the number of radioactive isotopes decreases exponentially by radioactive decay (physical outflow).
It is also known that radioactive isotopes decrease exponentially by excretion (biological outflow). The total outflow is the sum of the physical and biological outflows. As a result, the number of radioactive atoms in the human body also decreases exponentially. The half-life can be determined from the outflow flux by definition. The intensity of radioactivity is linear with respect to the number of radioactive atoms, so the two are analytically equivalent. The total internal exposure can be calculated as the time integral of the intensity of radioactivity. The energy absorbed into the human body per radioactive decay and the effective dose are calculated with the aid of Fermi's theory of beta decay and special relativity. The effective doses calculated by the present method largely agree with those of a study by the ICRP. The present method shows that the standard limit for radioactive cesium in general foods enforced in Japan, 100 Bq/kg, is overly conservative. If we eat food containing cesium-137 at 100 Bq/kg at a rate of 1 kg/d for 50 years, we receive an effective dose less than that from natural exposure. Similarly, it is shown that no significant health damage can be found, medically or statistically, from ingestion of rice harvested from a paddy field with current (January 2014) radioactive cesium deposition.
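Because the physical and biological outflows are both exponential, their decay constants add, so the vessel model's effective half-life obeys 1/T_eff = 1/T_phys + 1/T_bio, and the time-integrated activity of a single intake is A0/λ_eff. A minimal sketch of that arithmetic, using illustrative Cs-137 values (≈30.1-year physical half-life; the ≈110-day adult biological half-life is an assumed round figure, not taken from the abstract):

```python
import math

def effective_half_life(t_phys_days, t_bio_days):
    # Decay constants add, so reciprocals of half-lives add:
    # 1/T_eff = 1/T_phys + 1/T_bio
    return 1.0 / (1.0 / t_phys_days + 1.0 / t_bio_days)

def total_decays(a0_bq, t_eff_days):
    # A(t) = A0 * exp(-lambda_eff * t); integrating over all time
    # gives A0 / lambda_eff (in Bq·days), converted here to decays.
    lam = math.log(2) / t_eff_days          # per day
    return a0_bq / lam * 86400.0            # 86400 s per day

t_eff = effective_half_life(30.1 * 365.25, 110.0)
print(f"effective half-life: {t_eff:.1f} days")   # ~108.9 days
print(f"total decays from a 100 Bq burden: {total_decays(100.0, t_eff):.3e}")
```

Note how the short biological half-life dominates: for Cs-137 the effective half-life is barely below the biological one, which is why chronic-intake doses stay modest.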
Networked Microcomputers--The Next Generation in College Computing.
ERIC Educational Resources Information Center
Harris, Albert L.
The evolution of computer hardware for college computing has mirrored the industry's growth. When computers were introduced into the educational environment, they had limited capacity and served one user at a time. Then came large mainframes with many terminals sharing the resource. Next, the use of computers in office automation emerged. As…
ERIC Educational Resources Information Center
Micro-Ideas, Glenview, IL.
The 47 papers in these proceedings describe computer technology and its many applications to the educational process. Topics discussed include computer literacy, networking, word processing, automated instructional management, computer conferencing, career information services, computer-aided drawing/design, and robotics. Programming languages…
Automated aortic calcification detection in low-dose chest CT images
NASA Astrophysics Data System (ADS)
Xie, Yiting; Htwe, Yu Maw; Padgett, Jennifer; Henschke, Claudia; Yankelevitz, David; Reeves, Anthony P.
2014-03-01
The extent of aortic calcification has been shown to be a risk indicator for vascular events, including cardiac events. We have developed a fully automated computer algorithm to segment and measure aortic calcification in low-dose, non-contrast, non-ECG-gated chest CT scans. The algorithm first segments the aorta using a pre-computed Anatomy Label Map (ALM). Then, based on the segmented aorta, aortic calcification is detected and measured in terms of the Agatston score, mass score, and volume score. The automated scores are compared with reference scores obtained from manual markings. For aorta segmentation, the aorta is modeled as a series of discrete overlapping cylinders and the aortic centerline is determined using a cylinder-tracking algorithm. The aortic surface location is then detected using the centerline and a triangular mesh model. The segmented aorta is used as a mask for the detection of aortic calcification. For calcification detection, the image is first filtered, then an elevated threshold of 160 Hounsfield units (HU) is applied within the aorta mask region to reduce the effect of noise in low-dose scans, and finally non-aortic calcification voxels (bony structures, calcification in other organs) are eliminated. The remaining candidates are considered true aortic calcification. The computer algorithm was evaluated on 45 low-dose non-contrast CT scans. Using linear regression, the automated Agatston score is 98.42% correlated with the reference Agatston score, and the automated mass and volume scores are respectively 98.46% and 98.28% correlated with the reference mass and volume scores.
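The thresholding step described above (an elevated 160 HU cutoff applied inside the aorta mask) can be sketched as follows; the function name and toy data are illustrative, not taken from the authors' implementation.

```python
def aortic_calcification_mask(hu_values, aorta_mask, threshold=160):
    """Candidate calcification voxels: inside the segmented aorta
    and at or above the elevated 160 HU threshold used to suppress
    noise in low-dose scans."""
    return [h >= threshold and m for h, m in zip(hu_values, aorta_mask)]

# Toy 1-D strip of attenuation values (HU) and an aorta mask.
hu = [-50, 200, 400, 90, 300]
inside_aorta = [True, True, False, True, True]
print(aortic_calcification_mask(hu, inside_aorta))
# [False, True, False, False, True]
```

The voxel at 400 HU is rejected despite its high attenuation because it lies outside the aorta mask, mirroring the paper's elimination of bony structures and calcification in other organs.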
Going Around On Circles: Mathematics and Computer Art. Part 2.
ERIC Educational Resources Information Center
Gordon, Sheldon P.; Gordon, Florence S.
1984-01-01
Discusses properties of epicycloids. (The easiest way to picture them is to think of a piece of radioactive bubble gum attached to a wheel which is rolling around the outside of a larger wheel.) Includes a computer program (TRS-80 color computer) that will graph any epicycloid with integer values for the radii. (JN)
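The curve described here has the standard epicycloid parametrization x(t) = (R + r)cos t - r cos(((R + r)/r) t), y(t) = (R + r)sin t - r sin(((R + r)/r) t). A sketch in Python (rather than the article's TRS-80 color computer BASIC) that generates points for integer radii:

```python
import math

def epicycloid_points(R, r, n=360):
    """Points traced by a point on a circle of radius r rolling
    around the outside of a fixed circle of radius R."""
    pts = []
    for i in range(n + 1):
        t = 2 * math.pi * i / n
        x = (R + r) * math.cos(t) - r * math.cos((R + r) / r * t)
        y = (R + r) * math.sin(t) - r * math.sin((R + r) / r * t)
        pts.append((x, y))
    return pts

# R == r gives a cardioid; the curve starts at (R, 0).
pts = epicycloid_points(1, 1)
print(len(pts))  # 361 points, closing the curve
```

Integer radii guarantee the curve closes after a whole number of revolutions, which is why the article restricts its program to integer values.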
Pilots of the future - Human or computer?
NASA Technical Reports Server (NTRS)
Chambers, A. B.; Nagel, D. C.
1985-01-01
In connection with the occurrence of aircraft accidents and the evolution of the air-travel system, questions arise regarding the computer's potential for making fundamental contributions to improving the safety and reliability of air travel. An important result of an analysis of the causes of aircraft accidents is the conclusion that humans - 'pilots and other personnel' - are implicated in well over half of the accidents which occur. Over 70 percent of the incident reports contain evidence of human error. In addition, almost 75 percent show evidence of an 'information-transfer' problem. Thus, the question arises whether improvements in air safety could be achieved by removing humans from control situations. In an attempt to answer this question, it is important to take into account also certain advantages which humans have in comparison to computers. Attention is given to human error and the effects of technology, the motivation to automate, aircraft automation at the crossroads, the evolution of cockpit automation, and pilot factors.
A Novel Automated Method for Analyzing Cylindrical Computed Tomography Data
NASA Technical Reports Server (NTRS)
Roth, D. J.; Burke, E. R.; Rauser, R. W.; Martin, R. E.
2011-01-01
A novel software method is presented that is applicable for analyzing cylindrical and partially cylindrical objects inspected using computed tomography. This method involves unwrapping and re-slicing data so that the CT data from the cylindrical object can be viewed as a series of 2-D sheets in the vertical direction, in addition to the volume rendering and normal plane views provided by traditional CT software. The method is based on interior and exterior surface edge detection and, under proper conditions, is fully automated, requiring no input from the user except the correct voxel dimension from the CT scan. The software is available from NASA in 32- and 64-bit versions that can be applied to gigabyte-sized data sets, processing data either in random access memory or primarily on the computer hard drive. Please inquire with the presenting author if further interested. This software differentiates itself from other possible re-slicing solutions through its complete automation and its advanced processing and analysis capabilities.
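The unwrap-and-re-slice idea amounts to resampling the volume along concentric circles so that each cylindrical shell becomes a flat sheet. The toy function below illustrates the principle for a single radius in a single 2-D slice; it is a simplified sketch with nearest-neighbour sampling, not NASA's software.

```python
import math

def unwrap_slice(slice_2d, center, radius, n_theta=360):
    """Sample a 2-D CT slice along a circle of the given radius,
    producing one row of the 'unwrapped' cylindrical sheet.
    Nearest-neighbour sampling keeps the sketch minimal; real
    tools would interpolate."""
    cx, cy = center
    row = []
    for i in range(n_theta):
        t = 2 * math.pi * i / n_theta
        x = int(round(cx + radius * math.cos(t)))
        y = int(round(cy + radius * math.sin(t)))
        row.append(slice_2d[y][x])
    return row

# Toy 5x5 slice: four bright voxels on a ring of radius 2.
img = [[0] * 5 for _ in range(5)]
for r, c in [(2, 4), (4, 2), (2, 0), (0, 2)]:
    img[r][c] = 1
print(unwrap_slice(img, (2, 2), 2, n_theta=4))  # [1, 1, 1, 1]
```

Stacking such rows over all slice heights yields the vertical 2-D sheet views the abstract describes, in which defects on a cylindrical surface appear in a single flat image.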
Satellite freeze forecast system: Executive summary
NASA Technical Reports Server (NTRS)
Martsolf, J. D. (Principal Investigator)
1983-01-01
A satellite-based temperature monitoring and prediction system, consisting of a computer-controlled acquisition, processing, and display system and the ten automated weather stations called by that computer, was developed and transferred to the National Weather Service. This Satellite Freeze Forecast System (SFFS) acquires satellite data from either of two sources and surface data from 10 sites, displays the observed data in the form of color-coded thermal maps and in tables of automated weather station temperatures, computes predicted thermal maps on request and displays such maps either automatically or manually, archives the data acquired, and makes comparisons with historical data. Except for the last function, SFFS handles these tasks in a highly automated fashion if the user so directs. The predicted thermal maps are the result of two models: a physical energy budget of the soil-atmosphere interface, and a statistical relationship between the sites at which the physical model predicts temperatures and each of the pixels of the satellite thermal map.
Computational needs survey of NASA automation and robotics missions. Volume 1: Survey and results
NASA Technical Reports Server (NTRS)
Davis, Gloria J.
1991-01-01
NASA's operational use of advanced processor technology in space systems lags behind its commercial development by more than eight years. One of the factors contributing to this is that mission computing requirements are frequently unknown, unstated, misrepresented, or simply not available in a timely manner. NASA must provide clear common requirements to make better use of available technology, to cut development lead time on deployable architectures, and to increase the utilization of new technology. A preliminary set of advanced mission computational processing requirements of automation and robotics (A&R) systems is provided for use by NASA, industry, and academic communities. These results were obtained in an assessment of the computational needs of current projects throughout NASA. A high percentage of responses indicated a general need for enhanced computational capabilities beyond the currently available 80386 and 68020 processor technology. Because of the need for faster processors and more memory, 90 percent of the polled automation projects have reduced or will reduce the scope of their implementation capabilities. The requirements are presented with respect to their targeted environment, identifying the applications required, the system performance levels necessary to support them, and the degree to which they are met within typical programmatic constraints. Volume one includes the survey and results. Volume two contains the appendixes.
Computational needs survey of NASA automation and robotics missions. Volume 2: Appendixes
NASA Technical Reports Server (NTRS)
Davis, Gloria J.
1991-01-01
NASA's operational use of advanced processor technology in space systems lags behind its commercial development by more than eight years. One of the factors contributing to this is the fact that mission computing requirements are frequently unknown, unstated, misrepresented, or simply not available in a timely manner. NASA must provide clear common requirements to make better use of available technology, to cut development lead time on deployable architectures, and to increase the utilization of new technology. Here, NASA, industry, and academic communities are provided with a preliminary set of advanced mission computational processing requirements of automation and robotics (A&R) systems. The results were obtained in an assessment of the computational needs of current projects throughout NASA. A high percentage of responses indicated a general need for enhanced computational capabilities beyond the currently available 80386 and 68020 processor technology. Because of the need for faster processors and more memory, 90 percent of the polled automation projects have reduced or will reduce the scope of their implemented capabilities. The requirements are presented with respect to their targeted environment, identifying the applications required, the system performance levels necessary to support them, and the degree to which they are met within typical programmatic constraints. Here, the appendixes are provided.
Laboratory systems integration: robotics and automation.
Felder, R A
1991-01-01
Robotic technology is going to have a profound impact on the clinical laboratory of the future. Faced with increased pressure to reduce health care spending yet increase services to patients, many laboratories are looking for alternatives to the inflexible or "fixed" automation found in many clinical analyzers. Robots are being examined by many clinical pathologists as an attractive technology which can adapt to the constant changes in laboratory testing. Already, laboratory designs are being altered to accommodate robotics and automated specimen processors. However, the use of robotics and computer intelligence in the clinical laboratory is still in its infancy. Successful examples of robotic automation exist in several laboratories. Investigators have used robots to automate endocrine testing, high-performance liquid chromatography, and specimen transportation. Large commercial laboratories are investigating the use of specimen processors which combine fixed automation and robotics. Robotics have also reduced the exposure of medical technologists to specimens infected with viral pathogens. The successful examples of clinical robotics applications were a result of the cooperation of clinical chemists, engineers, and medical technologists. At the University of Virginia we have designed and implemented a robotic critical care laboratory. Initial clinical experience suggests that robotic performance is reliable; however, staff acceptance and utilization require continuing education. We are also developing a robotic cyclosporine assay, which promises to greatly reduce the labor costs of this analysis. The future will bring lab-wide automation that fully integrates computer artificial intelligence and robotics. Specimens will be transported by mobile robots. Specimen processing, aliquotting, and scheduling will be automated. (ABSTRACT TRUNCATED AT 250 WORDS)
The application of automated operations at the Institutional Processing Center
NASA Technical Reports Server (NTRS)
Barr, Thomas H.
1993-01-01
The JPL Institutional and Mission Computing Division, Communications, Computing and Network Services Section, with its mission contractor, OAO Corporation, has for some time been applying automation to the operation of JPL's Information Processing Center (IPC). Automation does not come in one easy-to-use package; automation for a data processing center is made up of many different software and hardware products supported by trained personnel. The IPC automation effort formally began with console automation, and has since spiraled out to include production scheduling, data entry, report distribution, online reporting, failure reporting and resolution, documentation, library storage, and operator and user education, while requiring the interaction of multi-vendor and locally developed software. To begin the process, automation goals are determined. Then a team including operations personnel is formed to research and evaluate available options. By acquiring knowledge of current products and those in development, taking an active role in industry organizations, and learning from other data centers' experiences, a forecast can be developed as to what direction technology is moving. With IPC management's approval, an implementation plan is developed and resources are identified to test or implement new systems. As an example, IPC's new automated data entry system was researched by Data Entry, Production Control, and Advance Planning personnel. A proposal was then submitted to management for review. A determination to implement the new system was made, and the elements/personnel involved with the initial planning performed the implementation. The final steps of the implementation were educating data entry personnel in the areas affected and making the procedural changes necessary to the successful operation of the new system.
Career Education via Data Processing
ERIC Educational Resources Information Center
Wagner, Gerald E.
1975-01-01
A data processing instructional program should provide students with career awareness, exploration, and orientation. This can be accomplished by establishing three objectives: (1) familiarization with automation terminology; (2) understanding the influence of the cultural and social impact of computers and automation; and (3) the kinds of job…
Office Automation in Student Affairs.
ERIC Educational Resources Information Center
Johnson, Sharon L.; Hamrick, Florence A.
1987-01-01
Offers recommendations to assist in introducing or expanding computer assistance in student affairs. Describes need for automation and considers areas of choosing hardware and software, funding and competitive bidding, installation and training, and system management. Cites greater efficiency in handling tasks and data and increased levels of…
Development of an automated pre-sampling plan for construction projects : final report.
DOT National Transportation Integrated Search
1983-03-01
The development of an automated pre-sampling plan was undertaken to free the district construction personnel from the cumbersome and time-consuming task of preparing such plans manually. A computer program was written and linked to a data file which ...
Automated Induction Of Rule-Based Neural Networks
NASA Technical Reports Server (NTRS)
Smyth, Padhraic J.; Goodman, Rodney M.
1994-01-01
Prototype expert systems, implemented in software and functionally equivalent to neural networks, are set up automatically and placed into operation within minutes, following an information-theoretic approach to automated acquisition of knowledge from large example databases. The approach is based largely on use of the ITRULE computer program.
USSR Report, Cybernetics, Computers and Automation Technology.
1987-03-02
Studies in the Area of EPR of Non-Ordered Solids: Spectral Recording, Processing and Analysis System (A.N. Bals, L.M. Kuzmina; AVTOMETRIYA, No 2, Feb..., Riga) [Abstract] An automated system has been developed for electron paramagnetic resonance studies, oriented toward achievement of...
One of My Favorite Assignments: Automated Teller Machine Simulation.
ERIC Educational Resources Information Center
Oberman, Paul S.
2001-01-01
Describes an assignment for an introductory computer science class that requires the student to write a software program that simulates an automated teller machine. Highlights include an algorithm for the assignment; sample file contents; language features used; assignment variations; and discussion points. (LRW)
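A minimal core of such an assignment might look like the sketch below; the class design, amounts, and error handling are invented for illustration and are not taken from the article.

```python
class ATM:
    """Tiny automated-teller simulation: deposit, withdrawal with
    an insufficient-funds check, and balance inquiry."""

    def __init__(self, balance=0):
        self.balance = balance

    def deposit(self, amount):
        self.balance += amount
        return self.balance

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        return self.balance

atm = ATM(100)
atm.deposit(50)
print(atm.withdraw(30))  # 120
```

A classroom variation, as the abstract suggests, might add PIN validation, a transaction log read from a sample file, or per-day withdrawal limits.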
An anatomy of industrial robots and their controls
NASA Astrophysics Data System (ADS)
Luh, J. Y. S.
1983-02-01
The modernization of manufacturing facilities by means of automation represents an approach for increasing productivity in industry. The three existing types of automation are related to continuous process controls, the use of transfer conveyor methods, and the employment of programmable automation for the low-volume batch production of discrete parts. Industrial robots, which are defined as computer-controlled mechanical manipulators, belong to the area of programmable automation. Typically, the robots perform tasks of arc welding, paint spraying, or foundry operations. One may assign a robot to perform a variety of job assignments simply by changing the appropriate computer program. The present investigation is concerned with an evaluation of the potential of the robot on the basis of its basic structure and controls. It is found that robots function well in limited areas of industry. If the range of tasks which robots can perform is to be expanded, it is necessary to provide multiple-task sensors, or special tooling, or even automatic tooling.
Archuleta, Christy-Ann M.; Gonzales, Sophia L.; Maltby, David R.
2012-01-01
The U.S. Geological Survey (USGS), in cooperation with the Texas Commission on Environmental Quality, developed computer scripts and applications to automate the delineation of watershed boundaries and compute watershed characteristics for more than 3,000 surface-water-quality monitoring stations in Texas that were active during 2010. Microsoft Visual Basic applications were developed using ArcGIS ArcObjects to format the source input data required to delineate watershed boundaries. Several automated scripts and tools were developed or used to calculate watershed characteristics using Python, Microsoft Visual Basic, and the RivEX tool. Automated methods were augmented by the use of manual methods, including those done using ArcMap software. Watershed boundaries delineated for the monitoring stations are limited to the extent of the Subbasin boundaries in the USGS Watershed Boundary Dataset, which may not include the total watershed boundary from the monitoring station to the headwaters.
Computation of Flow Through Water-Control Structures Using Program DAMFLO.2
Sanders, Curtis L.; Feaster, Toby D.
2004-01-01
As part of its mission to collect, analyze, and store streamflow data, the U.S. Geological Survey computes flow through several dam structures throughout the country. Flows are computed using hydraulic equations that describe flow through sluice and Tainter gates, crest gates, lock gates, spillways, locks, pumps, and siphons, which are calibrated using flow measurements. The program DAMFLO.2 was written to compute, tabulate, and plot flow through dam structures using data that describe the physical properties of dams and various hydraulic parameters and ratings that use time-varying data, such as lake elevations or gate openings. The program uses electronic computer files of time-varying data, such as lake elevation or gate openings, retrieved from the U.S. Geological Survey Automated Data Processing System. Computed time-varying flow data from DAMFLO.2 are output in flat files, which can be entered into the Automated Data Processing System database. All computations are made in units of feet and seconds. DAMFLO.2 uses the procedures and language developed by the SAS Institute Inc.
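The hydraulic equations DAMFLO.2 evaluates belong to the standard orifice/weir family. As a generic illustration (not DAMFLO.2 code), free flow through a sluice gate can be computed from the orifice equation Q = Cd * A * sqrt(2 g H), in the feet-and-seconds units the program uses; the discharge coefficient here is an assumed typical value, whereas the program calibrates its ratings against flow measurements.

```python
import math

G = 32.2  # gravitational acceleration, ft/s^2 (feet-and-seconds units)

def sluice_gate_flow(gate_width_ft, gate_opening_ft, head_ft, cd=0.6):
    """Free-flow discharge through a sluice gate, orifice form:
    Q = Cd * A * sqrt(2 g H).  Cd = 0.6 is an assumed typical
    discharge coefficient; real gates are calibrated from flow
    measurements, as the abstract notes."""
    area = gate_width_ft * gate_opening_ft
    return cd * area * math.sqrt(2 * G * head_ft)

# 10 ft wide gate, opened 2 ft, under 9 ft of head.
print(round(sluice_gate_flow(10.0, 2.0, 9.0), 1))  # discharge in ft^3/s
```

In a program like the one described, such an equation would be evaluated at each time step against the time-varying lake elevations and gate openings retrieved from the data system.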
Medical Information Processing by Computer.
ERIC Educational Resources Information Center
Kleinmuntz, Benjamin
The use of the computer for medical information processing was introduced about a decade ago. Considerable inroads have now been made toward its applications to problems in medicine. Present uses of the computer, both as a computational and noncomputational device include the following: automated search of patients' files; on-line clinical data…
ERIC Educational Resources Information Center
Clyde, Anne
1999-01-01
Discussion of the Year 2000 (Y2K) problem, the computer-code problem that affects computer programs or computer chips, focuses on the impact on teacher-librarians. Topics include automated library systems, access to online information services, library computers and software, and other electronic equipment such as photocopiers and fax machines.…
Automating Disk Forensic Processing with SleuthKit, XML and Python
2009-05-01
Automating Disk Forensic Processing with SleuthKit, XML and Python. Simson L. Garfinkel. Abstract: We have developed a program called fiwalk which ... files themselves. We show how it is relatively simple to create automated disk forensic applications using a Python module we have written that reads ... software that the portable device may contain. Keywords: Computer Forensics; XML; Sleuth Kit; Python
1982-01-27
[Garbled extract of a report on satellite image analysis. Recoverable contents: Earth location, colocation, and normalization; image analysis (interactive capabilities, examples); automated cloud analysis. Imagery was processed with the Man-computer Interactive Data Access System (McIDAS) before image analysis and algorithm development were done; Earth location is an automated procedure.]
Cost considerations in automating the library.
Bolef, D
1987-01-01
The purchase price of a computer and its software is but a part of the cost of any automated system. There are many additional costs, including one-time costs of terminals, printers, multiplexors, microcomputers, consultants, workstations and retrospective conversion, and ongoing costs of maintenance and maintenance contracts for the equipment and software, telecommunications, and supplies. This paper examines those costs in an effort to produce a more realistic picture of an automated system. PMID:3594021
NASA Tech Briefs, June 1996. Volume 20, No. 6
NASA Technical Reports Server (NTRS)
1996-01-01
Topics: New Computer Hardware; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences;Books and Reports.
Automation of electromagnetic compatibility (EMC) test facilities
NASA Technical Reports Server (NTRS)
Harrison, C. A.
1986-01-01
Efforts to automate electromagnetic compatibility (EMC) test facilities at Marshall Space Flight Center are discussed. The present facility is used to accomplish a battery of nine standard tests (with limited variations) designed to certify EMC of Shuttle payload equipment. Prior to this project, some EMC tests were partially automated, but others were performed manually. Software was developed to integrate all testing by means of a desktop computer-controller. Near-real-time data reduction and onboard graphics capabilities permit immediate assessment of test results. Provisions for disk storage of test data permit computer production of the test engineer's certification report. Software flexibility permits variation in the test procedure, the ability to examine more closely those frequency bands which indicate compatibility problems, and the capability to incorporate additional test procedures.
A Computational Framework for Automation of Point Defect Calculations
NASA Astrophysics Data System (ADS)
Goyal, Anuj; Gorai, Prashun; Peng, Haowei; Lany, Stephan; Stevanovic, Vladan; National Renewable Energy Laboratory, Golden, Colorado 80401 Collaboration
A complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory has been developed. The framework provides an effective and efficient method for defect structure generation and the creation of simple yet customizable workflows to analyze defect calculations. The package provides the capability to compute widely accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band-filling correction for shallow defects. Using Si, ZnO, and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology. We believe that a robust automated tool like this will enable the materials-by-design community to assess the impact of point defects on materials performance.
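The corrections listed above enter the standard defect formation energy expression Ef = E(defect) - E(host) + sum_i n_i mu_i + q (E_VBM + E_F) + E_corr, where removed atoms contribute their chemical potential with a plus sign, added atoms with a minus sign, and E_corr collects the finite-size terms. The sketch below evaluates that bookkeeping with made-up numbers; it is a generic illustration, not part of the described framework.

```python
def formation_energy(e_defect, e_host, removed, added, mu,
                     charge, e_vbm, e_fermi, e_corr=0.0):
    """Standard point-defect formation energy:
    Ef = E_defect - E_host
         + sum(n * mu) over removed species
         - sum(n * mu) over added species
         + q * (E_VBM + E_F) + E_corr,
    with E_corr holding finite-size corrections such as
    potential alignment and the image-charge term."""
    e = e_defect - e_host
    for species, n in removed.items():
        e += n * mu[species]
    for species, n in added.items():
        e -= n * mu[species]
    return e + charge * (e_vbm + e_fermi) + e_corr

# Neutral vacancy with invented energies (eV): one atom removed
# to a reservoir at chemical potential -5.0 eV.
print(formation_energy(-99.0, -100.0, {"Si": 1}, {}, {"Si": -5.0},
                       charge=0, e_vbm=0.0, e_fermi=0.0))  # -4.0
```

For charged defects, the q * (E_VBM + E_F) term makes the formation energy a linear function of the Fermi level, which is what the potential-alignment and image-charge corrections adjust.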
NASA Astrophysics Data System (ADS)
Srivastava, Vishal; Dalal, Devjyoti; Kumar, Anuj; Prakash, Surya; Dalal, Krishna
2018-06-01
Moisture content is an important feature of fruits and vegetables. Since about 80% of an apple's content is water, decreasing the moisture content degrades the quality of apples (Golden Delicious). The computational and texture features of the apples were extracted from optical coherence tomography (OCT) images. A support vector machine with a Gaussian kernel model was used to perform automated classification. For evaluating the quality of wax-coated apples during storage in vivo, our proposed method opens up the possibility of fully automated quantitative analysis based on the morphological features of apples. Our results demonstrate that analysis of the computational and texture features of OCT images may be a good non-destructive method for assessing the quality of apples.
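The Gaussian (RBF) kernel underlying the classifier mentioned above has a simple closed form, k(x, y) = exp(-gamma * ||x - y||^2). The snippet shows the kernel by itself; the feature vectors and gamma value are illustrative, not the study's.

```python
import math

def gaussian_kernel(x, y, gamma=0.5):
    """RBF kernel k(x, y) = exp(-gamma * ||x - y||^2): the
    similarity measure an SVM with a Gaussian kernel applies
    between pairs of feature vectors."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

# Identical texture-feature vectors give similarity 1.0;
# similarity decays toward 0 as the vectors move apart.
print(gaussian_kernel([1.0, 2.0], [1.0, 2.0]))          # 1.0
print(round(gaussian_kernel([1.0, 2.0], [2.0, 3.0]), 3))  # 0.368
```

The SVM's decision function is a weighted sum of such kernel values between a new OCT feature vector and the support vectors learned during training.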
STAMPS: Software Tool for Automated MRI Post-processing on a supercomputer.
Bigler, Don C; Aksu, Yaman; Miller, David J; Yang, Qing X
2009-08-01
This paper describes a Software Tool for Automated MRI Post-processing (STAMP) of multiple types of brain MRIs on a workstation and for parallel processing on a supercomputer (STAMPS). This software tool enables the automation of nonlinear registration for a large image set and for multiple MR image types. The tool uses standard brain MRI post-processing tools (such as SPM, FSL, and HAMMER) for multiple MR image types in a pipeline fashion. It also contains novel MRI post-processing features. The STAMP image outputs can be used to perform brain analysis using Statistical Parametric Mapping (SPM) or single-/multi-image modality brain analysis using Support Vector Machines (SVMs). Since STAMPS is PBS-based, the supercomputer may be a multi-node computer cluster or one of the latest multi-core computers.
Providing security for automated process control systems at hydropower engineering facilities
NASA Astrophysics Data System (ADS)
Vasiliev, Y. S.; Zegzhda, P. D.; Zegzhda, D. P.
2016-12-01
This article suggests the concept of a cyberphysical system to manage the computer security of automated process control systems at hydropower engineering facilities. According to the authors, this system consists of a set of information processing tools and computer-controlled physical devices. Examples of cyber attacks on power engineering facilities are provided, and a strategy for improving the cybersecurity of hydropower engineering systems is suggested. The architecture of the multilevel protection of the automated process control system (APCS) of power engineering facilities is given, including security systems, control systems, access control, encryption, a secure virtual private network, and subsystems for monitoring and analysis of security events. The distinctive aspect of the approach is its consideration of the interrelations and cyber threats that arise when SCADA is integrated with the unified enterprise information system.
A test matrix sequencer for research test facility automation
NASA Technical Reports Server (NTRS)
Mccartney, Timothy P.; Emery, Edward F.
1990-01-01
The hardware and software configuration of a Test Matrix Sequencer, a general purpose test matrix profiler that was developed for research test facility automation at the NASA Lewis Research Center, is described. The system provides set points to controllers and contact closures to data systems during the course of a test. The Test Matrix Sequencer consists of a microprocessor controlled system which is operated from a personal computer. The software program, which is the main element of the overall system is interactive and menu driven with pop-up windows and help screens. Analog and digital input/output channels can be controlled from a personal computer using the software program. The Test Matrix Sequencer provides more efficient use of aeronautics test facilities by automating repetitive tasks that were once done manually.
Nguyen, Su; Zhang, Mengjie; Tan, Kay Chen
2017-09-01
Automated design of dispatching rules for production systems has been an interesting research topic over the last several years. Machine learning, especially genetic programming (GP), has been a powerful approach to this design problem. However, intensive computational requirements, accuracy, and interpretability are still its limitations. This paper aims to develop a new surrogate-assisted GP that improves the quality of the evolved rules without significant computational cost. The experiments have verified the effectiveness and efficiency of the proposed algorithms as compared to those in the literature. Furthermore, new simplification and visualisation approaches have also been developed to improve the interpretability of the evolved rules. These approaches have shown great potential and proved to be a critical part of the automated design system.
Adaptive function allocation reduces performance costs of static automation
NASA Technical Reports Server (NTRS)
Parasuraman, Raja; Mouloua, Mustapha; Molloy, Robert; Hilburn, Brian
1993-01-01
Adaptive automation offers the option of flexible function allocation between the pilot and on-board computer systems. One of the important claims for the superiority of adaptive over static automation is that such systems do not suffer from some of the drawbacks associated with conventional function allocation. Several experiments designed to test this claim are reported in this article. The efficacy of adaptive function allocation was examined using a laboratory flight-simulation task involving multiple functions of tracking, fuel-management, and systems monitoring. The results show that monitoring inefficiency represents one of the performance costs of static automation. Adaptive function allocation can reduce the performance cost associated with long-term static automation.
The interaction of representation and reasoning
Bundy, Alan
2013-01-01
Automated reasoning is an enabling technology for many applications of informatics. These applications include verifying that a computer program meets its specification; enabling a robot to form a plan to achieve a task and answering questions by combining information from diverse sources, e.g. on the Internet, etc. How is automated reasoning possible? Firstly, knowledge of a domain must be stored in a computer, usually in the form of logical formulae. This knowledge might, for instance, have been entered manually, retrieved from the Internet or perceived in the environment via sensors, such as cameras. Secondly, rules of inference are applied to old knowledge to derive new knowledge. Automated reasoning techniques have been adapted from logic, a branch of mathematics that was originally designed to formalize the reasoning of humans, especially mathematicians. My special interest is in the way that representation and reasoning interact. Successful reasoning is dependent on appropriate representation of both knowledge and successful methods of reasoning. Failures of reasoning can suggest changes of representation. This process of representational change can also be automated. We will illustrate the automation of representational change by drawing on recent work in my research group. PMID:24062623
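The second step described here, applying rules of inference to old knowledge to derive new knowledge, can be illustrated with a naive forward-chaining loop over propositional rules; the facts and rules below are invented for the example.

```python
def forward_chain(facts, rules):
    """Naive forward chaining: repeatedly apply rules of the form
    (set-of-premises, conclusion) to the known facts, adding each
    conclusion whose premises are satisfied, until no new fact
    can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and premises <= facts:
                facts.add(conclusion)
                changed = True
    return facts

# Two toy rules: bird -> has_wings, has_wings -> can_fly.
rules = [({"bird"}, "has_wings"), ({"has_wings"}, "can_fly")]
print(sorted(forward_chain({"bird"}, rules)))
# ['bird', 'can_fly', 'has_wings']
```

Real automated reasoners work over full first-order or higher-order logic with unification and search control, but the derive-until-fixpoint shape of the loop is the same.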
Fraser, John K; Hicok, Kevin C; Shanahan, Rob; Zhu, Min; Miller, Scott; Arm, Douglas M
2014-01-01
Objective: To develop a closed, automated system that standardizes the processing of human adipose tissue to obtain and concentrate regenerative cells suitable for clinical treatment of thermal and radioactive burn wounds. Approach: A medical device was designed to automate processing of adipose tissue to obtain a clinical-grade cell output of stromal vascular cells that may be used immediately as a therapy for a number of conditions, including nonhealing wounds resulting from radiation damage. Results: The Celution® System reliably and reproducibly generated adipose-derived regenerative cells (ADRCs) from tissue collected manually and from three commercial power-assisted liposuction devices. The entire process of introducing tissue into the system, tissue washing and proteolytic digestion, isolation and concentration of the nonadipocyte nucleated cell fraction, and return to the patient as a wound therapeutic can be achieved in approximately 1.5 h. An alternative approach that applies ultrasound energy in place of enzymatic digestion demonstrated extremely poor cell-extraction efficiency. Innovation: The Celution System is the first medical device validated and approved by multiple international regulatory authorities to generate autologous stromal vascular cells from adipose tissue that can be used in a real-time bedside manner. Conclusion: Initial preclinical and clinical studies using ADRCs obtained with the automated tissue-processing Celution device described herein validate a safe and effective manner to obtain a promising novel cell-based treatment for wound healing.
An automated digital imaging system for environmental monitoring applications
Bogle, Rian; Velasco, Miguel; Vogel, John
2013-01-01
Recent improvements in the affordability and availability of high-resolution digital cameras, data loggers, embedded computers, and radio/cellular modems have advanced the development of sophisticated automated systems for remote imaging. Researchers have successfully placed and operated automated digital cameras in remote locations and in extremes of temperature and humidity, ranging from the islands of the South Pacific to the Mojave Desert and the Grand Canyon. With the integration of environmental sensors, these automated systems are able to respond to local conditions and modify their imaging regimes as needed. In this report we describe in detail the design of one type of automated imaging system developed by our group. It is easily replicated, low-cost, highly robust, and is a stand-alone automated camera designed to be placed in remote locations, without wireless connectivity.
Development of a plan for automating integrated circuit processing
NASA Technical Reports Server (NTRS)
1971-01-01
The operations analysis and equipment evaluations pertinent to the design of an automated production facility capable of manufacturing beam-lead CMOS integrated circuits are reported. The overall plan shows approximate cost of major equipment, production rate and performance capability, flexibility, and special maintenance requirements. Direct computer control is compared with supervisory-mode operations. The plan is limited to wafer processing operations from the starting wafer to the finished beam-lead die after separation etching. The work already accomplished in implementing various automation schemes, and the type of equipment which can be found for instant automation are described. The plan is general, so that small shops or large production units can perhaps benefit. Examples of major types of automated processing machines are shown to illustrate the general concepts of automated wafer processing.
Automating quantum experiment control
NASA Astrophysics Data System (ADS)
Stevens, Kelly E.; Amini, Jason M.; Doret, S. Charles; Mohler, Greg; Volin, Curtis; Harter, Alexa W.
2017-03-01
The field of quantum information processing is rapidly advancing. As the control of quantum systems approaches the level needed for useful computation, the physical hardware underlying the quantum systems is becoming increasingly complex. It is already becoming impractical to manually code control for the larger hardware implementations. In this chapter, we will employ an approach to the problem of system control that parallels compiler design for a classical computer. We will start with a candidate quantum computing technology, the surface electrode ion trap, and build a system instruction language which can be generated from a simple machine-independent programming language via compilation. We incorporate compile time generation of ion routing that separates the algorithm description from the physical geometry of the hardware. Extending this approach to automatic routing at run time allows for automated initialization of qubit number and placement and additionally allows for automated recovery after catastrophic events such as qubit loss. To show that these systems can handle real hardware, we present a simple demonstration system that routes two ions around a multi-zone ion trap and handles ion loss and ion placement. While we will mainly use examples from transport-based ion trap quantum computing, many of the issues and solutions are applicable to other architectures.
Farahani, Navid; Liu, Zheng; Jutt, Dylan; Fine, Jeffrey L
2017-10-01
Context: Pathologists' computer-assisted diagnosis (pCAD) is a proposed framework for alleviating pathologists' workload challenges through automation of their routine sign-out work. Currently, hypothetical pCAD is based on a triad of advanced image analysis, deep integration with heterogeneous information systems, and a concrete understanding of traditional pathology workflow. Prototyping is an established method for designing complex new computer systems such as pCAD. Objective: To describe, in detail, a prototype of pCAD for the sign-out of a breast cancer specimen. Design: Deidentified glass slides and data from breast cancer specimens were used. Slides were digitized into whole-slide images with an Aperio ScanScope XT, and screen captures were created by using vendor-provided software. The advanced workflow prototype was constructed by using PowerPoint software. Results: We modeled an interactive, computer-assisted workflow: pCAD previews whole-slide images in the context of integrated, disparate data and predefined diagnostic tasks and subtasks. Relevant regions of interest (ROIs) would be automatically identified and triaged by the computer. A pathologist's sign-out work would consist of an interactive review of important ROIs, driven by required diagnostic tasks. The interactive session would generate a pathology report automatically. Conclusions: Using animations and real ROIs, the pCAD prototype demonstrates the hypothetical sign-out in a stepwise fashion, illustrating various interactions and explaining how steps can be automated. The file is publicly available and should be widely compatible. This mock-up is intended to spur discussion and to help usher in the next era of digitization for pathologists by providing desperately needed and long-awaited automation.
History of a Building Automation System.
ERIC Educational Resources Information Center
Martin, Anthony A.
1984-01-01
Having successfully used computer control in the solar-heated and cooled Terraset School, the Fairfax County, VA, Public Schools are now computerizing all their facilities. This article discusses the configuration and use of a countywide control system, reasons for the project's success, and problems of facility automation. (MCG)
Automated Instructional Management Systems (AIMS) Version III, Users Manual.
ERIC Educational Resources Information Center
New York Inst. of Tech., Old Westbury.
This document sets forth the procedures necessary to utilize and understand the operating characteristics of the Automated Instructional Management System - Version III, a computer-based system for management of educational processes. Directions for initialization, including internal and user files; system and operational input requirements;…
Ringling School of Art and Design Builds a CASTLE.
ERIC Educational Resources Information Center
Morse, Yvonne; Davis, Wendy
1984-01-01
Describes the development and installation of the Computer Automated Software for the Total Library Environment System (CASTLE), which uses a microcomputer to automate operations of small academic library in six main areas: circulation, online catalog, inventory and file maintenance, audiovisual equipment, accounting, and information and…
Air Force Tech Order Management System (AFTOMS). Automation Plan-Final Report. Version 1.0
DOT National Transportation Integrated Search
1988-02-01
Computer aided Acquisition and Logistics Support (CALS) is a Department of Defense (DoD) program designed to improve weapon systems support through digital automation. In June 1985, the joint industry/DoD Task Force on CALS issued a five volume repor...
An Overview of Automated Scoring of Essays
ERIC Educational Resources Information Center
Dikli, Semire
2006-01-01
Automated Essay Scoring (AES) is defined as the computer technology that evaluates and scores the written prose (Shermis & Barrera, 2002; Shermis & Burstein, 2003; Shermis, Raymat, & Barrera, 2003). AES systems are mainly used to overcome time, cost, reliability, and generalizability issues in writing assessment (Bereiter, 2003; Burstein,…
Providing Access to Library Automation Systems for Students with Disabilities.
ERIC Educational Resources Information Center
California Community Colleges, Sacramento. High-Tech Center for the Disabled.
This document provides information on the integration of assistive computer technologies and library automation systems at California Community Colleges in order to ensure access for students with disabilities. Topics covered include planning, upgrading, purchasing, implementing and using these technologies with library systems. As information…
Automated lettuce nutrient solution management using an array of ion-selective electrodes
USDA-ARS?s Scientific Manuscript database
Automated sensing and control of macronutrients in hydroponic solutions would allow more efficient management of nutrients for crop growth in closed systems. This paper describes the development and evaluation of a computer-controlled nutrient management system with an array of ion-selective electro...
Powsiri Klinkhachorn; J. Moody; Philip A. Araman
1995-01-01
For the past few decades, researchers have devoted time and effort to apply automation and modern computer technologies towards improving the productivity of traditional industries. To be competitive, one must streamline operations and minimize production costs, while maintaining an acceptable margin of profit. This paper describes the effort of one such endeavor...
Some Automated Cartography Developments at the Defense Mapping Agency.
1981-01-01
on a pantographic router creating a laminate step model which was moulded in plaster for carving into a terrain model. This section will trace DMA's...offering economical automation. Precision flatbed Concord plotters were brought into DMA with sufficiently programmable control computers to perform these
An Automated Approach to Instructional Design Guidance.
ERIC Educational Resources Information Center
Spector, J. Michael; And Others
This paper describes the Guided Approach to Instructional Design Advising (GAIDA), an automated instructional design tool that incorporates techniques of artificial intelligence. GAIDA was developed by the U.S. Air Force Armstrong Laboratory to facilitate the planning and production of interactive courseware and computer-based training materials.…
NASA Tech Briefs, March 1996. Volume 20, No. 3
NASA Technical Reports Server (NTRS)
1996-01-01
Topics: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Books and Reports.
NASA Tech Briefs, September 1999. Volume 23, No. 9
NASA Technical Reports Server (NTRS)
1999-01-01
Topics discussed include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences;
Automated error correction in IBM quantum computer and explicit generalization
NASA Astrophysics Data System (ADS)
Ghosh, Debjit; Agarwal, Pratik; Pandey, Pratyush; Behera, Bikash K.; Panigrahi, Prasanta K.
2018-06-01
Construction of a fault-tolerant quantum computer remains a challenging problem due to unavoidable noise and fragile quantum states. However, this goal can be achieved by introducing quantum error-correcting codes. Here, we experimentally realize an automated error correction code and demonstrate the nondestructive discrimination of GHZ states in the IBM 5-qubit quantum computer. After performing quantum state tomography, we obtain the experimental results with high fidelity. Finally, we generalize the investigated code to the maximally entangled n-qudit case, which can both detect and automatically correct any arbitrary phase-change error, any phase-flip error, any bit-flip error, or a combination of all these error types.
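As a purely classical illustration of the principle behind bit-flip correction (not the authors' quantum circuit), a three-bit repetition code detects and corrects any single bit-flip by majority vote:

```python
# Classical analogue of bit-flip error correction: one logical bit is
# encoded into three physical bits, and a single flipped bit is
# corrected by majority vote. Illustrative only -- the paper realizes
# the quantum version on IBM's 5-qubit machine.

def encode(bit):
    """Encode one logical bit as a three-bit repetition codeword."""
    return [bit, bit, bit]

def flip(codeword, i):
    """Introduce a single bit-flip error at position i."""
    corrupted = list(codeword)
    corrupted[i] ^= 1
    return corrupted

def decode(codeword):
    """Recover the logical bit by majority vote over the three bits."""
    return 1 if sum(codeword) >= 2 else 0

# Any single bit-flip on either logical value is corrected.
for original in (0, 1):
    for i in range(3):
        assert decode(flip(encode(original), i)) == original
print("all single bit-flip errors corrected")
```

Two simultaneous flips defeat the majority vote, which is why larger codes are needed for stronger error models.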
Integration of analytical instruments with computer scripting.
Carvalho, Matheus C
2013-08-01
Automation of laboratory routines aided by computer software enables high productivity and is the norm nowadays. However, the integration of different instruments made by different suppliers is still difficult, because to accomplish it, the user must have knowledge of electronics and/or low-level programming. An alternative approach is to control different instruments without an electronic connection between them, relying only on their software interface on a computer. This can be achieved through scripting, which is the emulation of user operations (mouse clicks and keyboard inputs) on the computer. The main advantages of this approach are its simplicity, which enables people with minimal knowledge of computer programming to employ it, and its universality, which enables the integration of instruments made by different suppliers, meaning that the user is totally free to choose the devices to be integrated. Therefore, scripting can be a useful, accessible, and economic solution for laboratory automation.
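The scripting approach described above can be sketched as a tiny interpreter of emulated user actions. The action names and the RecordingBackend below are invented for illustration; in a real deployment the backend would issue actual mouse and keyboard events, for example via a GUI-automation library such as PyAutoGUI (an assumption, since the paper does not prescribe a specific tool):

```python
# Sketch of instrument integration via scripting: a script of
# user-level actions (clicks, keystrokes, waits) is replayed against
# whatever backend drives the instrument software's GUI. The
# RecordingBackend is a test stand-in; swapping in a backend built on
# a GUI-automation library (hypothetical choice) would drive real
# instrument software without any electronic connection between devices.

class RecordingBackend:
    def __init__(self):
        self.log = []
    def click(self, x, y):
        self.log.append(("click", x, y))
    def type_text(self, text):
        self.log.append(("type", text))
    def wait(self, seconds):
        self.log.append(("wait", seconds))

def run_script(script, backend):
    """script: list of (action, *args) tuples emulating user operations."""
    for action, *args in script:
        getattr(backend, action)(*args)   # dispatch to the backend
    return backend.log

# Example: start a measurement in one program, then name an export
# file in another program's dialog (coordinates are invented).
script = [
    ("click", 120, 40),            # press "Start" in instrument A's window
    ("wait", 2.0),                 # allow the measurement to finish
    ("type_text", "results.csv"),  # filename in instrument B's export dialog
]
print(run_script(script, RecordingBackend()))
```

Because the script only emulates what a human operator would do, the same pattern works across instruments from different suppliers, which is the universality the paper emphasizes.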
NASA Technical Reports Server (NTRS)
Kim, Jonnathan H.
1995-01-01
Humans can perform many complicated tasks without explicit rules. This inherent and advantageous capability becomes a hurdle when a task is to be automated. Modern computers and numerical calculations require explicit rules and discrete numerical values. In order to bridge the gap between human knowledge and automating tools, a knowledge model is proposed. Knowledge modeling techniques are discussed and utilized to automate a labor and time intensive task of detecting anomalous bearing wear patterns in the Space Shuttle Main Engine (SSME) High Pressure Oxygen Turbopump (HPOTP).
Creating Science Simulations through Computational Thinking Patterns
ERIC Educational Resources Information Center
Basawapatna, Ashok Ram
2012-01-01
Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction.…
Radioactivity observed in the sodium iodide gamma-ray spectrometer returned on the Apollo 17 mission
NASA Technical Reports Server (NTRS)
Dyer, C. S.; Trombka, J. I.; Schmadebeck, R. L.; Eller, E.; Bielefeld, M. J.; Okelley, G. D.; Eldridge, J. S.; Northcutt, K. J.; Metzger, A. E.; Reedy, R. C.
1975-01-01
In order to obtain information on the radioactive background induced in the Apollo 15 and 16 gamma-ray spectrometers (7 cm x 7 cm NaI) by particle irradiation during spaceflight, an identical detector was flown and returned to Earth on the Apollo 17 mission. The induced radioactivity was monitored both internally and externally from one and a half hours after splashdown. When used in conjunction with a computation scheme for estimating induced activation from calculated trapped-proton and cosmic-ray fluences, these results show an important contribution resulting from both thermal and energetic neutrons produced in the heavy spacecraft by cosmic-ray interactions.
Computer vision for microscopy diagnosis of malaria.
Tek, F Boray; Dempster, Andrew G; Kale, Izzet
2009-07-13
This paper reviews computer vision and image analysis studies aiming at automated diagnosis or screening of malaria infection in microscope images of thin blood film smears. Existing works interpret the diagnosis problem differently or propose partial solutions to the problem. A critique of these works is furnished. In addition, a general pattern recognition framework to perform diagnosis, which includes image acquisition, pre-processing, segmentation, and pattern classification components, is described. The open problems are addressed and a perspective of the future work for realization of automated microscopy diagnosis of malaria is provided.
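The four-component framework named above (acquisition, pre-processing, segmentation, classification) can be sketched as composable stages; every stage below is a deliberately trivial invented placeholder, not any method the review surveys:

```python
# Skeleton of the reviewed pattern-recognition pipeline. Each stage is
# a toy placeholder: the "image" is a 1-D intensity list, segmentation
# is simple thresholding, and classification is by object size.

def acquire():
    # toy "image": background ~10, two bright stained objects
    return [10, 11, 9, 200, 210, 10, 9, 180, 11]

def preprocess(img):
    return [min(p, 255) for p in img]          # clip intensities

def segment(img, threshold=100):
    """Group consecutive above-threshold pixels into objects."""
    objects, current = [], []
    for p in img:
        if p > threshold:
            current.append(p)
        elif current:
            objects.append(current)
            current = []
    if current:
        objects.append(current)
    return objects

def classify(obj):
    # invented rule: larger objects are candidate parasites
    return "parasite-candidate" if len(obj) >= 2 else "artifact"

objects = segment(preprocess(acquire()))
print([classify(o) for o in objects])
```

Real systems replace each placeholder with substantial machinery (stain normalization, cell segmentation, trained classifiers), but the staged structure is the same.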
NASA Tech Briefs, June 1997. Volume 21, No. 6
NASA Technical Reports Server (NTRS)
1997-01-01
Topics include: Computer Hardware and Peripherals; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Books and Reports.
NASA Tech Briefs, November 1999. Volume 23, No. 11
NASA Technical Reports Server (NTRS)
1999-01-01
Topics covered include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Materials; Computer Programs; Mechanics; Machinery/Automation; Physical Sciences; Mathematics and Information Sciences; Books and Reports.
Jacob, Joseph; Bartholmai, Brian J; Rajagopalan, Srinivasan; Brun, Anne Laure; Egashira, Ryoko; Karwoski, Ronald; Kokosi, Maria; Wells, Athol U; Hansell, David M
2016-11-23
To evaluate computer-based computed tomography (CT) analysis (CALIPER) against visual CT scoring and pulmonary function tests (PFTs) when predicting mortality in patients with connective tissue disease-related interstitial lung disease (CTD-ILD). To identify outcome differences between distinct CTD-ILD groups derived following automated stratification of CALIPER variables. A total of 203 consecutive patients with assorted CTD-ILDs had CT parenchymal patterns evaluated by CALIPER and visual CT scoring: honeycombing, reticular pattern, ground glass opacities, pulmonary vessel volume, emphysema, and traction bronchiectasis. CT scores were evaluated against pulmonary function tests: forced vital capacity, diffusing capacity for carbon monoxide, carbon monoxide transfer coefficient, and composite physiologic index for mortality analysis. Automated stratification of CALIPER-CT variables was evaluated in place of and alongside forced vital capacity and diffusing capacity for carbon monoxide in the ILD gender, age, physiology (ILD-GAP) model using receiver operating characteristic curve analysis. Cox regression analyses identified four independent predictors of mortality: patient age (P < 0.0001), smoking history (P = 0.0003), carbon monoxide transfer coefficient (P = 0.003), and pulmonary vessel volume (P < 0.0001). Automated stratification of CALIPER variables identified three morphologically distinct groups which were stronger predictors of mortality than all CT and functional indices. The stratified-CT model substituted automated stratified groups for functional indices in the ILD-GAP model and maintained model strength (area under curve (AUC) = 0.74, P < 0.0001), ILD-GAP (AUC = 0.72, P < 0.0001). Combining automated stratified groups with the ILD-GAP model (stratified CT-GAP model) strengthened predictions of 1- and 2-year mortality: ILD-GAP (AUC = 0.87 and 0.86, respectively); stratified CT-GAP (AUC = 0.89 and 0.88, respectively).
CALIPER-derived pulmonary vessel volume is an independent predictor of mortality across all CTD-ILD patients. Furthermore, automated stratification of CALIPER CT variables represents a novel method of prognostication at least as robust as PFTs in CTD-ILD patients.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rahman, Nur Aira Abd, E-mail: nur-aira@nuclearmalaysia.gov.my; Yussup, Nolida; Ibrahim, Maslina Bt. Mohd
2015-04-29
A DAQ (data acquisition) software package called RPTv2.0 has been developed for the Radioactive Particle Tracking System at the Malaysian Nuclear Agency. RPTv2.0 features a scanning control GUI, data acquisition from a 12-channel counter via an RS-232 interface, and a multichannel analyzer (MCA). The software is fully developed on the National Instruments LabVIEW 8.6 platform. A Ludlum Model 4612 counter is used to count the signals from the scintillation detectors, while a host computer is used to send control parameters, acquire and display data, and compute results. Each detector channel has independent high-voltage control, a threshold or sensitivity value, and window settings. The counter is configured with a host board and twelve slave boards; the host board collects the counts from each slave board and communicates with the computer via the RS-232 data interface.
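Host-side handling of such a counter reply can be sketched in Python. The "CNT,..." ASCII frame format below is hypothetical (the record does not document the Ludlum 4612 protocol), and parsing is shown in place of live serial reads so the sketch is self-contained; a real system would read lines over RS-232, for example with the pySerial library:

```python
# Sketch of host-side handling of a 12-channel counter reply received
# over RS-232. The "CNT,ch1,...,ch12" frame is a hypothetical format
# for illustration; a real implementation would obtain each line from
# the serial port (e.g. via pySerial) instead of a literal string.

def parse_counts(frame, channels=12):
    """Parse one counter reply line into a list of per-detector counts."""
    fields = frame.strip().split(",")
    if fields[0] != "CNT" or len(fields) != channels + 1:
        raise ValueError("malformed counter frame: %r" % frame)
    return [int(f) for f in fields[1:]]

reply = "CNT,104,99,87,120,95,110,101,93,88,97,105,100\r\n"
counts = parse_counts(reply)
print(sum(counts))   # total events across all 12 detectors
```

Validating the frame before converting fields catches truncated or corrupted serial reads early, which matters on noisy instrument links.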
Impact of pharmacy automation on patient waiting time: an application of computer simulation.
Tan, Woan Shin; Chua, Siang Li; Yong, Keng Woh; Wu, Tuck Seng
2009-06-01
This paper aims to illustrate the use of computer simulation in evaluating the impact of a prototype automated dispensing system on waiting time in an outpatient pharmacy and its potential as a routine tool in pharmacy management. A discrete event simulation model was developed to investigate the impact of a prototype automated dispensing system on operational efficiency and service standards in an outpatient pharmacy. The simulation results suggest that automating the prescription-filling function using a prototype that picks and packs at 20 seconds per item will not assist the pharmacy in achieving the waiting time target of 30 minutes for all patients. Regardless of the state of automation, to meet the waiting time target, 2 additional pharmacists are needed to overcome the process bottleneck at the point of medication dispense. However, if automated dispensing is the preferred option, the speed of the system needs to be twice as fast as the current configuration to facilitate the reduction of the 95th percentile patient waiting time to below 30 minutes. The faster processing speed will concomitantly allow the pharmacy to reduce the number of pharmacy technicians from 11 to 8. Simulation was found to be a useful and low cost method that allows an otherwise expensive and resource intensive evaluation of new work processes and technology to be completed within a short time.
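A discrete event simulation of the kind used in the study can be sketched with standard-library Python; the queue discipline is FIFO with multiple servers, and the arrival and service parameters below are invented, not the pharmacy's data:

```python
# Minimal discrete event sketch of a multi-server pharmacy counter:
# patients arrive with exponential interarrival times, each is served
# by the next free pharmacist, and per-patient waiting times are
# recorded. Rates are illustrative, not the study's parameters.
import heapq
import random

def simulate(n_patients, n_staff, mean_interarrival, mean_service, seed=1):
    rng = random.Random(seed)
    # generate arrival times
    t = 0.0
    arrivals = []
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)
    # heap of times at which each pharmacist is next free
    free_at = [0.0] * n_staff
    heapq.heapify(free_at)
    waits = []
    for arrival in arrivals:
        earliest = heapq.heappop(free_at)      # next pharmacist to free up
        start = max(arrival, earliest)         # wait if all are busy
        waits.append(start - arrival)
        heapq.heappush(free_at, start + rng.expovariate(1.0 / mean_service))
    return waits

waits = simulate(n_patients=500, n_staff=2, mean_interarrival=3.0, mean_service=5.0)
print(round(sum(waits) / len(waits), 2), "min average wait")
```

Rerunning with different staffing levels or service speeds is how a model like this exposes the bottleneck and percentile-waiting-time effects the paper reports.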
NASA Technical Reports Server (NTRS)
Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)
1998-01-01
Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization, 3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction, and 6) machine specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. Parallelizing tools and compiler evaluation. 2. Code cleanup and serial optimization using automated scripts. 3. Development of a code generator for performance prediction. 4. Automated partitioning. 5. Automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.
Toward Automated Computer-Based Visualization and Assessment of Team-Based Performance
ERIC Educational Resources Information Center
Ifenthaler, Dirk
2014-01-01
A considerable amount of research has been undertaken to provide insights into the valid assessment of team performance. However, in many settings, manual and therefore labor-intensive assessment instruments for team performance have limitations. Therefore, automated assessment instruments enable more flexible and detailed insights into the…
ERIC Educational Resources Information Center
Kennedy, Joyce Lain
1994-01-01
Discusses significant new developments in the electronic search process: (1) New Government Automation; (2) New Federal Initiatives; (3) New Telecommunications Services; (4) Campus Data Bases; (5) Off-Campus Data Bases; (6) Faxed or E-Mailed Resumes; (7) Automation of 3rd-Party Recruiters; (8) New Cyberservices; (9) Interview-Prep Software; (10)…
Hardware Realization of an Ethernet Packet Analyzer Search Engine
2000-06-30
specific for the home automation industry. This analyzer will be at the gateway of a network and analyze Ethernet packets as they go by. It will keep... home automation and not the computer network. This system is a stand-alone real-time network analyzer capable of decoding Ethernet protocols. The
The Automated Circulation Marketplace: Active and Heating Up.
ERIC Educational Resources Information Center
Matthews, Joseph R.
1982-01-01
Predicts that the growing market for automated circulation systems will expand even faster in the near future, given the availability of a wide variety of systems and computer types, which enables libraries of all sizes to obtain a system to fit their needs. Currently there are 301 systems installed. (RAA)
AUTOMATED GIS WATERSHED ANALYSIS TOOLS FOR RUSLE/SEDMOD SOIL EROSION AND SEDIMENTATION MODELING
A comprehensive procedure for computing soil erosion and sediment delivery metrics has been developed using a suite of automated Arc Macro Language (AML) scripts and a pair of processing-intensive ANSI C++ executable programs operating on an ESRI ArcGIS 8.x Workstation platform...
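The RUSLE soil-loss computation that such scripts automate is a per-cell multiplication of five factors; a minimal sketch follows, with factor values invented for illustration:

```python
# RUSLE: average annual soil loss A = R * K * LS * C * P, computed per
# watershed cell. Factor values below are illustrative placeholders.

def rusle_soil_loss(R, K, LS, C, P):
    """R: rainfall erosivity, K: soil erodibility, LS: combined slope
    length/steepness factor, C: cover management, P: support practice."""
    return R * K * LS * C * P

# one hypothetical cell: moderate rainfall, erodible soil, gentle slope
A = rusle_soil_loss(R=125.0, K=0.3, LS=1.2, C=0.2, P=1.0)
print(round(A, 2), "tons/acre/year")
```

In a GIS workflow each factor is itself a raster derived from terrain, soil, and land-cover layers, and the multiplication is applied cell by cell across the watershed.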
Small but Pristine--Lessons for Small Library Automation.
ERIC Educational Resources Information Center
Clement, Russell; Robertson, Dane
1990-01-01
Compares the more positive library automation experiences of a small public library with those of a large research library. Topics addressed include collection size; computer size and the need for outside control of a data processing center; staff size; selection process for hardware and software; and accountability. (LRW)
Library Automation: A "First Course" Teaching Syllabus.
ERIC Educational Resources Information Center
Dyson, Sam A.
This syllabus for a basic course in library automation is designed for advanced library students and practicing librarians. It is intended not to make librarians and students qualified programmers, but to give them enough background information for intelligent discussion of library problems with computer personnel. It may also stimulate the…
Automated Tutoring in Interactive Environments: A Task-Centered Approach.
ERIC Educational Resources Information Center
Wolz, Ursula; And Others
1989-01-01
Discusses tutoring and consulting functions in interactive computer environments. Tutoring strategies are considered, the expert model and the user model are described, and GENIE (Generated Informative Explanations)--an answer generating system for the Berkeley Unix Mail system--is explained as an example of an automated consulting system. (33…
Industrial Arts Curriculum Guide for Automated Machining in Metals Technology.
ERIC Educational Resources Information Center
1985
This curriculum guide is designed to be used for creating programs in automated machining education in Connecticut. The first sections of the guide are introductory, explaining the importance of computer-numerically controlled machines, describing the industrial arts scope and sequence for kindergarten through adult levels, describing the…
Automation Training Tools of the Future.
ERIC Educational Resources Information Center
Rehg, James
1986-01-01
Manufacturing isn't what it used to be, and the United States must ensure its position in the world trade market by educating factory workers in new automated systems. A computer manufacturing engineer outlines the training requirements of a modern workforce and details robotic training devices suitable for classroom use. (JN)
An Automated Circulation System for a Small Technical Library.
ERIC Educational Resources Information Center
Culnan, Mary J.
The traditional manually-controlled circulation records of the Burroughs Corporation Library in Goleta, California, presented problems of inaccuracies, time-consuming searches, and lack of use statistics. An automated system with the capacity to do file maintenance and statistical record-keeping was implemented on a Burroughs B1700 computer.…
Automation for Primary Processing of Hardwoods
Daniel L. Schmoldt
1992-01-01
Hardwood sawmills critically need to incorporate automation and computer technology into their operations. Social constraints, forest biology constraints, forest product market changes, and financial necessity are forcing primary processors to boost their productivity and efficiency to higher levels. The locations, extent, and types of defects found in logs and on...
Automated Bilingual Circulation System Using PC Local Area Networks.
ERIC Educational Resources Information Center
Iskanderani, A. I.; Anwar, M. A.
1992-01-01
Describes a personal computer and LAN-based automated circulation system capable of handling both Arabic and Latin characters that was developed for use at King Abdulaziz University (Jeddah, Saudi Arabia). Outlines system requirements, system structure, hardware needs, and individual functional modules of the system. Numerous examples and flow…
The Automation-by-Expertise-by-Training Interaction.
Strauch, Barry
2017-03-01
I introduce the automation-by-expertise-by-training interaction in automated systems and discuss its influence on operator performance. Transportation accidents that, across a 30-year interval, demonstrated identical automation-related operator errors suggest a need to reexamine traditional views of automation. I review accident investigation reports, regulator studies, and literature on human-computer interaction, expertise, and training and discuss how failing to attend to the interaction of automation, expertise level, and training has enabled operators to commit identical automation-related errors. Automated systems continue to provide capabilities exceeding operators' need for effective system operation and provide interfaces that can hinder, rather than enhance, operator automation-related situation awareness. Because of limitations in time and resources, training programs do not provide operators the expertise needed to effectively operate these automated systems, requiring them to obtain the expertise ad hoc during system operations. As a result, many do not acquire necessary automation-related system expertise. Integrating automation with expected operator expertise levels, and within training programs that provide operators the necessary automation expertise, can reduce opportunities for automation-related operator errors. Research to address the automation-by-expertise-by-training interaction is needed. However, such research must meet challenges inherent to examining realistic sociotechnical system automation features with representative samples of operators, perhaps by using observational and ethnographic research. Research in this domain should improve the integration of design and training and, it is hoped, enhance operator performance.
Safe transport of radioactive materials in Egypt
NASA Astrophysics Data System (ADS)
El-Shinawy, Rifaat M. K.
1994-07-01
In Egypt the national regulations for the safe transport of radioactive materials (RAM) are based on the International Atomic Energy Agency (IAEA) regulations. In addition, regulations for the safe transport of these materials through the Suez Canal (SC) were laid down by the Egyptian Atomic Energy Authority (EAEA) and the Suez Canal Authority (SCA). They are continuously updated to reflect increased knowledge and accumulated experience. The technical and protective measures taken during transport of RAM through the SC are described. An assessment of the impact of transporting radioactive materials through the Suez Canal using the INTERTRAN computer code was carried out in cooperation with the IAEA. The transported activities and empty containers, the number of vessels carrying RAM through the canal from 1963 to 1991, and their nationalities are also discussed. A review of the present situation of the radioactive waste storage facilities at the Atomic Energy site at Inshas is given, along with the regulations for the safe transportation and disposal of radioactive wastes.
Jackson, D.D.; Hollen, R.M.
1981-02-27
A method of very thoroughly and quickly cleaning a gauze electrode used in chemical analyses is given, as well as an automatic cleaning apparatus which makes use of the method. The method generates very little waste solution, which is very important in analyzing radioactive materials, especially in aqueous solutions. The cleaning apparatus can be used in a larger, fully automated controlled-potential coulometric apparatus. About 99.98% of a 5 mg plutonium sample was removed in less than 3 minutes, using only about 60 ml of rinse solution and two main rinse steps.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Madrid, Gregory J.; Whitener, Dustin Heath; Folz, Wesley
2017-05-27
The Turbo FRMAC (TF) software program is the software implementation of the science and methodologies utilized in the Federal Radiological Monitoring and Assessment Center (FRMAC). The software automates the calculations described in Volume 1 of "The Federal Manual for Assessing Environmental Data during a Radiological Emergency" (2015 version). In the event of an intentional or accidental release of radioactive material, the software is used to guide and govern the response of the Federal, State, Local, and Tribal governments. The manual, upon which the software is based, is unclassified and freely available on the Internet.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fulton, John; Gallagher, Linda; Gonzales, Alejandro
The Turbo FRMAC (TF) software program is the software implementation of the science and methodologies utilized in the Federal Radiological Monitoring and Assessment Center (FRMAC). The software automates the calculations described in Volume 1 of "The Federal Manual for Assessing Environmental Data during a Radiological Emergency" (2015 version). In the event of an intentional or accidental release of radioactive material, the software is used to guide and govern the response of the Federal, State, Local, and Tribal governments. The manual, upon which the software is based, is unclassified and freely available on the Internet.
Turbo FRMAC 2016 Version 7.1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fulton, John; Gallagher, Linda K.; Madrid, Gregory J.
2016-08-01
The Turbo FRMAC (TF) software program is the software implementation of the science and methodologies utilized in the Federal Radiological Monitoring and Assessment Center (FRMAC). The software automates the calculations described in Volume 1 of "The Federal Manual for Assessing Environmental Data during a Radiological Emergency" (2015 version). In the event of an intentional or accidental release of radioactive material, the software is used to guide and govern the response of the Federal, State, Local, and Tribal governments. The manual, upon which the software is based, is unclassified and freely available on the Internet.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Madrid, Gregory J.; Whitener, Dustin Heath; Folz, Wesley
2017-02-27
The Turbo FRMAC (TF) software program is the software implementation of the science and methodologies utilized in the Federal Radiological Monitoring and Assessment Center (FRMAC). The software automates the calculations described in Volume 1 of "The Federal Manual for Assessing Environmental Data during a Radiological Emergency" (2015 version). In the event of an intentional or accidental release of radioactive material, the software is used to guide and govern the response of the Federal, State, Local, and Tribal governments. The manual, upon which the software is based, is unclassified and freely available on the Internet.
The Application of Computers to Library Technical Processing
ERIC Educational Resources Information Center
Veaner, Allen B.
1970-01-01
Describes computer applications to acquisitions and technical processing and reports in detail on Stanford's development work in automated technical processing. The author is Assistant Director for Bibliographic Operations, Stanford University Libraries. (JB)
NASA Tech Briefs, January 2000. Volume 24, No. 1
NASA Technical Reports Server (NTRS)
2000-01-01
Topics include: Data Acquisition; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Bio-Medical; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Information Sciences; Books and reports.
University of Arizona: College and University Systems Environment.
ERIC Educational Resources Information Center
CAUSE/EFFECT, 1985
1985-01-01
The University of Arizona has begun to reorganize campus computing. Six working groups were formed to address six areas of computing: academic computing, library automation, administrative data processing and information systems, writing and graphics, video and audio services, and outreach and public service. (MLW)
Computational efficiency for the surface renewal method
NASA Astrophysics Data System (ADS)
Kelley, Jason; Higgins, Chad
2018-04-01
Measuring surface fluxes using the surface renewal (SR) method requires programmatic algorithms for tabulation, algebraic calculation, and data quality control. A number of methods have been published describing automated calibration of SR parameters. Because the SR method uses high-frequency (10 Hz and above) measurements, some steps in the flux calculation are computationally expensive, especially when automating SR to perform many iterations of these calculations. Several new algorithms were written that perform the required calculations more efficiently and rapidly, and these were tested for sensitivity to the length of the flux-averaging period, ability to measure over a large range of lag timescales, and overall computational efficiency. The algorithms use signal-processing techniques and algebraic simplifications, demonstrating how simple modifications can dramatically improve computational efficiency. These results complement efforts by other authors to standardize a robust and accurate computational SR method. Increased computation speed grants flexibility in implementing the SR method, opening new avenues for SR to be used in research, applied monitoring, and novel field deployments.
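Only the abstract is available here; as a concrete illustration of the kind of lagged calculation the SR method automates, the sketch below computes a second-order structure function over several lag timescales in plain Python (function name and interface are assumptions for illustration, not the authors' code):

```python
def structure_function(series, lags):
    """Second-order structure function S(r): the mean squared difference
    between samples separated by lag r. Lagged moments like this underlie
    surface-renewal flux estimates from high-frequency scalar data."""
    out = {}
    for r in lags:
        # Difference each sample against the one r steps earlier.
        diffs = [(series[t] - series[t - r]) ** 2 for t in range(r, len(series))]
        out[r] = sum(diffs) / len(diffs)
    return out
```

For a 10 Hz record, a lag of r samples corresponds to r/10 s; replacing the inner loop with vectorized array shifts is exactly the kind of simple modification the abstract credits with dramatic speedups.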
The future of scientific workflows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deelman, Ewa; Peterka, Tom; Altintas, Ilkay
Today’s computational, experimental, and observational sciences rely on computations that involve many related tasks. The success of a scientific mission often hinges on the computer automation of these workflows. In April 2015, the US Department of Energy (DOE) invited a diverse group of domain and computer scientists from national laboratories supported by the Office of Science and the National Nuclear Security Administration, from industry, and from academia to review the workflow requirements of DOE's science and national security missions, to assess the current state of the art in science workflows, to understand the impact of emerging extreme-scale computing systems on those workflows, and to develop requirements for automated workflow management in future and existing environments. This article is a summary of the opinions of over 50 leading researchers attending this workshop. We highlight use cases, computing systems, and workflow needs, and conclude by summarizing the remaining challenges this community sees that inhibit large-scale scientific workflows from becoming a mainstream tool for extreme-scale science.
Soviet Cybernetics Review. Volume 2, Number 5
prize; Aeroflot’s sirena system turned on; Computer system controls 2500 construction sites; Automation of aircraft languages; Diagnosis by teletype; ALGEM-1 and ALGEM-2 languages; Nuclear institute’s computer facilities.
NASA Tech Briefs, December 1993. Volume 17, No. 12
NASA Technical Reports Server (NTRS)
1993-01-01
Topics covered include: High-Performance Computing; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.
NASA Tech Briefs, March 1994. Volume 18, No. 3
NASA Technical Reports Server (NTRS)
1994-01-01
Topics include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports
Drill user's manual. [drilling machine automation]
NASA Technical Reports Server (NTRS)
Pitts, E. A.
1976-01-01
Instructions are given for using the DRILL computer program, which converts data contained in an Interactive Graphics Design System (IGDS) design file into a paper tape for driving a numerically controlled drilling machine.
NASA Tech Briefs, March 2000. Volume 24, No. 3
NASA Technical Reports Server (NTRS)
2000-01-01
Topics include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.
NASA Tech Briefs, March 1997. Volume 21, No. 3
NASA Technical Reports Server (NTRS)
1997-01-01
Topics: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports.
Automated CFD Database Generation for a 2nd Generation Glide-Back-Booster
NASA Technical Reports Server (NTRS)
Chaderjian, Neal M.; Rogers, Stuart E.; Aftosmis, Michael J.; Pandya, Shishir A.; Ahmad, Jasim U.; Tejmil, Edward
2003-01-01
A new software tool, AeroDB, is used to compute thousands of Euler and Navier-Stokes solutions for a 2nd generation glide-back booster in one week. The solution process exploits a common job-submission grid environment using 13 computers located at 4 different geographical sites. Process automation and web-based access to the database greatly reduce the user workload, removing much of the tedium and the tendency for user input errors. The database consists of forces, moments, and solution files obtained by varying the Mach number, angle of attack, and sideslip angle. The forces and moments compare well with experimental data. Stability derivatives are also computed using a monotone cubic spline procedure. Flow visualization and three-dimensional surface plots are used to interpret and characterize the nature of computed flow fields.
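The abstract notes that stability derivatives are computed from the tabulated database using a monotone cubic spline. A simpler central-difference stand-in (illustrative only; names and data are assumed, not from AeroDB) shows the idea of differentiating a tabulated force or moment sweep:

```python
def stability_derivative(alphas, coeffs):
    """Estimate dC/dalpha at interior tabulated angles of attack using
    central differences over an aerodynamic database sweep. A monotone
    cubic spline, as in the abstract, yields smoother derivatives, but
    the underlying differencing idea is the same."""
    derivs = {}
    for i in range(1, len(alphas) - 1):
        # Slope over the two neighbouring tabulated points.
        derivs[alphas[i]] = (coeffs[i + 1] - coeffs[i - 1]) / (alphas[i + 1] - alphas[i - 1])
    return derivs
```

A monotone scheme (e.g. PCHIP) is preferred over plain cubic splines for such databases because it avoids spurious oscillations between tabulated points.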
Automated segmentation of pulmonary structures in thoracic computed tomography scans: a review
NASA Astrophysics Data System (ADS)
van Rikxoort, Eva M.; van Ginneken, Bram
2013-09-01
Computed tomography (CT) is the modality of choice for imaging the lungs in vivo. Sub-millimeter isotropic images of the lungs can be obtained within seconds, allowing the detection of small lesions and detailed analysis of disease processes. The high resolution of thoracic CT and the high prevalence of lung diseases require a high degree of automation in the analysis pipeline. The automated segmentation of pulmonary structures in thoracic CT has been an important research topic for over a decade now. This systematic review provides an overview of current literature. We discuss segmentation methods for the lungs, the pulmonary vasculature, the airways, including airway tree construction and airway wall segmentation, the fissures, the lobes and the pulmonary segments. For each topic, the current state of the art is summarized, and topics for future research are identified.
Remote voice training: A case study on space shuttle applications, appendix C
NASA Technical Reports Server (NTRS)
Mollakarimi, Cindy; Hamid, Tamin
1990-01-01
The Tile Automation System includes applications of automation and robotics technology to all aspects of the Shuttle tile processing and inspection system. An integrated set of rapid prototyping testbeds was developed which include speech recognition and synthesis, laser imaging systems, distributed Ada programming environments, distributed relational data base architectures, distributed computer network architectures, multi-media workbenches, and human factors considerations. Remote voice training in the Tile Automation System is discussed. The user is prompted over a headset by synthesized speech for the training sequences. The voice recognition units and the voice output units are remote from the user and are connected by Ethernet to the main computer system. A supervisory channel is used to monitor the training sequences. Discussions include the training approaches as well as the human factors problems and solutions for this system utilizing remote training techniques.
Promoting autonomy in a smart home environment with a smarter interface.
Brennan, C P; McCullagh, P J; Galway, L; Lightbody, G
2015-01-01
In the not-too-distant future, the median population age will tend towards 65, an age at which the likelihood of dependency increases. Most older people want to remain autonomous and self-sufficient for as long as possible. As environments become smarter, home automation solutions can be provided to support this aspiration. The technology discussed within this paper focuses on providing a home automation system that can be controlled by most users regardless of mobility restrictions, and hence it may be applicable to older people. It comprises a hybrid Brain-Computer Interface (BCI), a home automation user interface, and actuators. In the first instance, our system is controlled with conventional computer input, which is then replaced with eye tracking and finally a BCI and eye tracking collaboration. The systems have been assessed in terms of information throughput; benefits and limitations are evaluated.
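The abstract reports assessment "in terms of information throughput"; BCI throughput is commonly quantified with the Wolpaw information transfer rate, sketched below (this is the standard formula from the BCI literature, not the authors' code, and the parameter values are illustrative):

```python
import math

def wolpaw_itr(n_choices, accuracy, selections_per_min):
    """Wolpaw information transfer rate in bits/min for an interface that
    selects one of n_choices targets with the given accuracy."""
    bits = math.log2(n_choices)
    p = accuracy
    if 0.0 < p < 1.0:
        # Penalty for errors, assuming mistakes are spread evenly over
        # the remaining n_choices - 1 targets.
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n_choices - 1))
    return bits * selections_per_min
```

A four-target interface at 100% accuracy delivers 2 bits per selection; accuracy at chance level drives the rate to zero, which is why hybrid input (eye tracking plus BCI) can raise effective throughput.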
[Health technology assessment report: Computer-assisted Pap test for cervical cancer screening].
Della Palma, Paolo; Moresco, Luca; Giorgi Rossi, Paolo
2012-01-01
HEALTH PROBLEM: Cervical cancer is a disease which is highly preventable by means of Pap test screening for precancerous lesions, which can be easily treated. Furthermore, in the near future, control of the disease will be enhanced by vaccination, which prevents infection by the human papillomavirus types that cause the vast majority of cervical cancers. The effectiveness of screening in drastically reducing cervical cancer incidence has been clearly demonstrated. The epidemiology of cervical cancer in industrialised countries is now determined mostly by the Pap test coverage of the female population and by the ability of health systems to assure appropriate follow-up after an abnormal Pap test. Today there are two fully automated systems for the computer-assisted Pap test: the BD FocalPoint and the Hologic Imager. Recently, the Hologic Integrated Imager, a semi-automated system, was launched. The two fully automated systems are composed of a central scanner, where the machine examines the cytologic slide, and of one or more review stations, where the cytologists analyze the slides previously scanned centrally. The software used by the two systems identifies the fields of interest so that the cytologists can look only at those points, automatically pointed out by the review station. Furthermore, the FocalPoint system classifies the slides according to their level of risk of containing signs of relevant lesions. Those in the upper classes (about one fifth of the slides) are labelled as "further review", while those at the lowest level of risk, i.e. slides with such a low level of risk that they can be considered negative with no human review, are labelled as "no further review". The aim of the computer-assisted Pap test is to reduce the time of slide examination and to increase productivity. Furthermore, the number of errors due to lack of attention may decrease.
Both systems can be applied to liquid-based cytology, while only the BD FocalPoint can be used on conventional smears. Cytology screening has some critical points: there is a shortage of cytologists/cytotechnicians; quality strongly depends on the experience and ability of the cytologist; there is a subjective component in the cytological diagnosis; and in highly screened populations, the prevalence of lesions is very low and the activity of cytologists is very monotonous. On the other hand, a progressive shift to molecular screening using the HPV-DNA test as the primary screening test is very likely in the near future; cytology will then be used as a triage test, dramatically reducing the number of slides to process and increasing the prevalence of lesions in those Pap tests. In this Report we assume that the diagnostic accuracy of the computer-assisted Pap test is equal to the accuracy of the manual Pap test and, consequently, that screening using the computer-assisted Pap test has the same efficacy in reducing cervical cancer incidence and mortality. Under this assumption, the effectiveness/benefit/utility is the same for the two screening modes, i.e. the economic analysis will be a cost minimization study. Furthermore, the screening process is identical for the two modalities in all phases except slide interpretation. The cost minimization analysis will therefore be limited to the only phase differing between the two modes, i.e. the study will be a differential cost analysis between a labour-intensive strategy (the traditional Pap test) and a technology-intensive strategy (the computer-assisted Pap test). Briefly, the objectives of this HTA Report are: to determine the break-even point of computer-assisted Pap test systems, i.e. 
the volume of slides processed per year at which putting in place a computer-assisted Pap test system becomes economically convenient; to quantify the cost per Pap test in different scenarios according to screening centre activity volume, productivity of the cytologist, and type of cytology (conventional smear or liquid-based; fully automated or semi-automated computer-assisted); to analyse the computer-assisted Pap test in the Italian context, through a survey of the centres using the technology, collecting data useful for the sensitivity analysis of the economic evaluation; to evaluate the acceptability of the technology in the screening services; to evaluate the organizational and financial impact of the computer-assisted Pap test in different scenarios; and to illustrate the ideal organization for implementing the computer-assisted Pap test in terms of volume of activity, productivity, and human and technological resources. To produce this Report, the following process was adopted: application to the Ministry of Health for the grant "Analysis of the impact of professional involvement in evidence generation for the HTA process", within which the sub-project "Cost effectiveness evaluation of the computer-assisted Pap test in the Italian screening programmes" was financed; constitution of the Working Group, which included the project coordinator, the principal investigator, and the health economist; identification of the centres using the computer-assisted Pap test which had published scientific reports on the subject; and identification of the Consulting Committee (stakeholders), which included screening programme managers, pathologists, economists, health policy-makers, citizen organizations, and manufacturers. Once the evaluation was concluded, a plenary meeting of the Working Group and Consulting Committee was held. The Working Group drafted the final version of this Report, which took into account the comments received.
The fully automated computer-assisted Pap test has an important financial and organizational impact on screening programmes. The assessment of this health technology reached the following conclusions: according to the survey results, after some initial distrust, cytologists accepted the use of the machine and appreciated the reduction in interpretation time and the reliability in identifying the fields of interest; from an economic point of view, the automated computer-assisted Pap test can be convenient with conventional smears only if the screening centre has a volume of more than 49,000 slides/year and cytologist productivity increases about threefold. It must be highlighted that adopting the automated Pap test is not by itself sufficient to reach such an increase in productivity; the laboratory must be organised or re-organised to optimise the use of the review stations and of person time. In the case of liquid-based cytology, the adoption of the automated computer-assisted Pap test can only increase costs. In fact, liquid-based cytology increases the cost of consumable materials but reduces interpretation time, even in manual screening; consequently, the reduction in human costs is smaller in the case of computer-assisted screening. Liquid-based cytology has other implications and advantages not linked to the use of the computer-assisted Pap test that should be taken into account and are beyond the scope of this Report. Given that the computer-assisted Pap test reduces human costs, it may be more advantageous where the cost of cytologists is higher. Given the relatively small volume of activity of screening centres in Italy, the computer-assisted Pap test may be reasonable for a network using only one central scanner and several remote review stations. The use of the automated computer-assisted Pap test only for quality control in a single centre is not economically sustainable.
In this case as well, several centres, for example at the regional level, may form a consortium to reach a volume of slides sufficient to achieve the break-even point. Regarding the use of a machine rather than human intelligence to interpret the slides, some ethical issues were initially raised, but both the scientific community and healthcare professionals have accepted this technology. The identification of fields of interest by the machine is highly reproducible, reducing subjectivity in the diagnostic process. The Hologic system always includes a check by the human eye, while the FocalPoint system identifies about one fifth of the slides as No Further Review. Several studies, some of them conducted in Italy, confirmed the reliability of this classification. There is still some resistance to accepting the practice of No Further Review; a check of previous slides and clinical data can be useful to make the cytologist and the clinician more confident. The computer-assisted automated Pap test should be introduced only if there is a need to increase the volume of slides screened to cover the screening target population and sufficient human resources are not available. Switching a programme using conventional slides to automatic scanning can lead to a reduction in costs only if the volume exceeds 49,000 slides per year and cytologist productivity is optimised to more than 20,000 slides per year. At a productivity of 15,000 slides per year or fewer, the automated computer-assisted Pap test cannot be convenient. Switching from manual screening with conventional slides to automatic scanning with liquid-based cytology cannot generate any economic saving, but the system could increase output with a given number of staff. The transition from manual to computer-assisted automated screening of liquid-based cytology will not generate savings, and the increase in productivity will be lower than that of the switch from manual/conventional to automated/conventional.
The use of biologists or pathologists as cytologists is more costly than the use of cytoscreeners. Given that the automated computer-assisted Pap test reduces human resource costs, its adoption in a model using only biologists and pathologists for screening is more economically advantageous. (ABSTRACT TRUNCATED)
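The Report's break-even reasoning is a standard cost-minimization calculation: the automated system's fixed annual cost must be recovered through a lower variable cost per slide. A minimal sketch follows; all monetary figures in the usage note are invented for illustration, since the abstract does not give the Report's actual cost inputs:

```python
def break_even_volume(system_annual_cost, manual_cost_per_slide,
                      automated_cost_per_slide):
    """Annual slide volume at which the automated system's fixed cost is
    exactly offset by its per-slide saving over manual screening."""
    saving_per_slide = manual_cost_per_slide - automated_cost_per_slide
    if saving_per_slide <= 0:
        # Matches the Report's liquid-based finding: if automation does not
        # lower the per-slide cost, no volume makes it convenient.
        raise ValueError("automation never breaks even")
    return system_annual_cost / saving_per_slide
```

For example, with an assumed fixed cost of 98,000 per year and an assumed saving of 2 per slide, the break-even volume is 49,000 slides/year, the order of magnitude the Report cites for conventional smears.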
NASA Astrophysics Data System (ADS)
Litjens, G.; Ehteshami Bejnordi, B.; Timofeeva, N.; Swadi, G.; Kovacs, I.; Hulsbergen-van de Kaa, C.; van der Laak, J.
2015-03-01
Automated detection of prostate cancer in digitized H and E whole-slide images is an important first step for computer-driven grading. Most automated grading algorithms work on preselected image patches as they are too computationally expensive to calculate on the multi-gigapixel whole-slide images. An automated multi-resolution cancer detection system could reduce the computational workload for subsequent grading and quantification in two ways: by excluding areas of definitely normal tissue within a single specimen or by excluding entire specimens which do not contain any cancer. In this work we present a multi-resolution cancer detection algorithm geared towards the latter. The algorithm methodology is as follows: at a coarse resolution the system uses superpixels, color histograms and local binary patterns in combination with a random forest classifier to assess the likelihood of cancer. The five most suspicious superpixels are identified, and at a higher resolution more computationally expensive graph and gland features are added to refine classification for these superpixels. Our methods were evaluated on a data set of 204 digitized whole-slide H and E stained images of MR-guided biopsy specimens from 163 patients. A pathologist exhaustively annotated the specimens for areas containing cancer. The performance of our system was evaluated using ten-fold cross-validation, stratified according to patient. Image-based receiver operating characteristic (ROC) analysis was subsequently performed, where a specimen containing cancer was considered positive and specimens without cancer negative. We obtained an area under the ROC curve of 0.96 and a specificity of 0.4 at a sensitivity of 1.0.
Global risk of radioactive fallout after major nuclear reactor accidents
NASA Astrophysics Data System (ADS)
Lelieveld, J.; Kunkel, D.; Lawrence, M. G.
2012-05-01
Major reactor accidents of nuclear power plants are rare, yet the consequences are catastrophic. But what is meant by "rare"? And what can be learned from the Chernobyl and Fukushima incidents? Here we assess the cumulative, global risk of exposure to radioactivity due to atmospheric dispersion of gases and particles following severe nuclear accidents (the most severe ones on the International Nuclear Event Scale, INES 7), using particulate 137Cs and gaseous 131I as proxies for the fallout. Our results indicate that previously the occurrence of INES 7 major accidents and the risks of radioactive contamination have been underestimated. Using a global model of the atmosphere we compute that on average, in the event of a major reactor accident of any nuclear power plant worldwide, more than 90% of emitted 137Cs would be transported beyond 50 km and about 50% beyond 1000 km distance before being deposited. This corroborates that such accidents have large-scale and trans-boundary impacts. Although the emission strengths and atmospheric removal processes of 137Cs and 131I are quite different, the radioactive contamination patterns over land and the human exposure due to deposition are computed to be similar. High human exposure risks occur around reactors in densely populated regions, notably in West Europe and South Asia, where a major reactor accident can subject around 30 million people to radioactive contamination. The recent decision by Germany to phase out its nuclear reactors will reduce the national risk, though a large risk will still remain from the reactors in neighbouring countries.
Exploring the Issues: Humans and Computers.
ERIC Educational Resources Information Center
Walsh, Huber M.
This presentation addresses three basic social issues generated by the computer revolution. The first section, "Money Matters," focuses on the economic effects of computer technology. These include the replacement of workers by fully automated machines, the threat to professionals posed by expanded access to specialized information, and the…
Designing automation for human use: empirical studies and quantitative models.
Parasuraman, R
2000-07-01
An emerging knowledge base of human performance research can provide guidelines for designing automation that can be used effectively by human operators of complex systems. Which functions should be automated and to what extent in a given system? A model for types and levels of automation that provides a framework and an objective basis for making such choices is described. The human performance consequences of particular types and levels of automation constitute primary evaluative criteria for automation design when using the model. Four human performance areas are considered--mental workload, situation awareness, complacency and skill degradation. Secondary evaluative criteria include such factors as automation reliability, the risks of decision/action consequences and the ease of systems integration. In addition to this qualitative approach, quantitative models can inform design. Several computational and formal models of human interaction with automation that have been proposed by various researchers are reviewed. An important future research need is the integration of qualitative and quantitative approaches. Application of these models provides an objective basis for designing automation for effective human use.
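As a concrete sketch of the kind of "types and levels" scale the model formalizes, the fragment below encodes a few levels from the Sheridan-Verplank-style 1-10 automation scale often used in this literature (descriptions abbreviated; this is an illustration, not Parasuraman's published table):

```python
# Representative levels from the 1-10 automation scale
# (1 = fully manual, 10 = fully autonomous).
LEVELS = {
    1: "computer offers no assistance; the human does everything",
    5: "computer executes a suggested action if the human approves",
    7: "computer acts automatically, then necessarily informs the human",
    10: "computer decides and acts autonomously, ignoring the human",
}

def nearest_level(level):
    """Return the description of the closest documented level at or
    below the requested one (illustrative lookup helper)."""
    key = max(k for k in LEVELS if k <= level)
    return LEVELS[key]
```

In the model, a level on this scale is chosen separately for each stage of processing (information acquisition, analysis, decision selection, action), and the human performance consequences of each choice are then evaluated.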
Apparatus and method for automated monitoring of airborne bacterial spores
NASA Technical Reports Server (NTRS)
Ponce, Adrian (Inventor)
2009-01-01
An apparatus and method for automated monitoring of airborne bacterial spores. The apparatus is provided with an air sampler, a surface for capturing airborne spores, a thermal lysis unit to release DPA from bacterial spores, a source of lanthanide ions, and a spectrometer for excitation and detection of the characteristic fluorescence of the aromatic molecules in bacterial spores complexed with lanthanide ions. In accordance with the method, computer-programmed steps allow for automation of the apparatus for the monitoring of airborne bacterial spores.
Building Flexible User Interfaces for Solving PDEs
NASA Astrophysics Data System (ADS)
Logg, Anders; Wells, Garth N.
2010-09-01
FEniCS is a collection of software tools for the automated solution of differential equations by finite element methods. In this note, we describe how FEniCS can be used to solve a simple nonlinear model problem with varying levels of automation. At one extreme, FEniCS provides tools for the fully automated and adaptive solution of nonlinear partial differential equations. At the other extreme, FEniCS provides a range of tools that allow the computational scientist to experiment with novel solution algorithms.
Using Analogue Computers in Schools
ERIC Educational Resources Information Center
Hinson, D. J.
1974-01-01
Discusses procedures for using operational amplifiers to conduct student projects and teach physical phenomena such as oscillation, radioactive decay, terminal velocity, projectile motion, and ball bouncing. (CC)
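An analogue computer models radioactive decay by wiring an op-amp integrator for dN/dt = -λN. The digital counterpart, useful for checking a student circuit against expected values, is a short Euler integration (a classroom-style sketch, not from the article):

```python
def simulate_decay(n0, decay_const, dt, steps):
    """Euler integration of dN/dt = -lambda * N, the first-order ODE an
    op-amp integrator solves continuously on an analogue computer."""
    n = n0
    history = [n]
    for _ in range(steps):
        n = n - decay_const * n * dt  # one Euler step
        history.append(n)
    return history
```

Each step multiplies N by (1 - λ·dt), so for small dt the sequence approximates the exponential N(t) = N0·exp(-λt) that the analogue circuit traces directly.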
Role of post-mapping computed tomography in virtual-assisted lung mapping.
Sato, Masaaki; Nagayama, Kazuhiro; Kuwano, Hideki; Nitadori, Jun-Ichi; Anraku, Masaki; Nakajima, Jun
2017-02-01
Background: Virtual-assisted lung mapping is a novel bronchoscopic preoperative lung marking technique in which virtual bronchoscopy is used to predict the locations of multiple dye markings. Post-mapping computed tomography is performed to confirm the locations of the actual markings. This study aimed to examine the accuracy of marking locations predicted by virtual bronchoscopy and elucidate the role of post-mapping computed tomography. Methods: Automated and manual virtual bronchoscopy were used to predict marking locations. After bronchoscopic dye marking under local anesthesia, computed tomography was performed to confirm the actual marking locations before surgery. Discrepancies between the marking locations predicted by the different methods and the actual markings were examined on computed tomography images. Forty-three markings in 11 patients were analyzed. Results: The average difference between the predicted and actual marking locations was 30 mm. There was no significant difference between the latest version of the automated virtual bronchoscopy system (30.7 ± 17.2 mm) and manual virtual bronchoscopy (29.8 ± 19.1 mm). The difference was significantly greater in the upper vs. lower lobes (37.1 ± 20.1 vs. 23.0 ± 6.8 mm for automated virtual bronchoscopy; p < 0.01). Despite this discrepancy, all targeted lesions were successfully resected using 3-dimensional image guidance based on post-mapping computed tomography reflecting the actual marking locations. Conclusions: Markings predicted by virtual bronchoscopy were displaced from the actual markings by an average of 3 cm. However, surgery was accurately performed using post-mapping computed tomography guidance, demonstrating the indispensable role of post-mapping computed tomography in virtual-assisted lung mapping.
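The headline figure (a mean predicted-to-actual discrepancy of about 30 mm) is a mean Euclidean distance over paired 3D coordinates. A minimal sketch of that tabulation follows; the function name and the coordinates in the test are invented for illustration:

```python
import math

def mean_marking_discrepancy(predicted, actual):
    """Mean 3D Euclidean distance (mm) between predicted marking
    coordinates and those confirmed on post-mapping CT."""
    dists = [math.dist(p, a) for p, a in zip(predicted, actual)]
    return sum(dists) / len(dists)
```

Per-lobe subsets of the same pairs would reproduce the upper-vs-lower-lobe comparison reported in the Results.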
Williams, W E
1987-01-01
The maturing of technologies in computer capabilities, particularly direct digital signals, has provided an exciting variety of new communication and facility control opportunities. These include telecommunications, energy management systems, security systems, office automation systems, local area networks, and video conferencing. New applications are developing continuously. The so-called "intelligent" or "smart" building concept evolves from the development of this advanced technology in building environments. Automation has had a dramatic effect on facility planning. For decades, communications were limited to the telephone, the typewritten message, and copy machines. The office itself and its functions had been essentially unchanged for decades. Office automation systems began to surface during the energy crisis and, although their newer technology was timely, they were, for the most part, designed separately from other new building systems. For example, most mainframe computer systems were originally stand-alone, as were word processing installations. In the last five years, the advances in distributive systems, networking, and personal computer capabilities have provided opportunities to make such dramatic improvements in productivity that the Selectric typewriter has gone from being the most advanced piece of office equipment to nearly total obsolescence.
Communications among elements of a space construction ensemble
NASA Technical Reports Server (NTRS)
Davis, Randal L.; Grasso, Christopher A.
1989-01-01
Space construction projects will require careful coordination between managers, designers, manufacturers, operators, astronauts, and robots with large volumes of information of varying resolution, timeliness, and accuracy flowing between the distributed participants over computer communications networks. Within the CSC Operations Branch, we are researching the requirements and options for such communications. Based on our work to date, we feel that communications standards being developed by the International Standards Organization, the CCITT, and other groups can be applied to space construction. We are currently studying in depth how such standards can be used to communicate with robots and automated construction equipment used in a space project. Specifically, we are looking at how the Manufacturing Automation Protocol (MAP) and the Manufacturing Message Specification (MMS), which tie together computers and machines in automated factories, might be applied to space construction projects. Together with our CSC industrial partner Computer Technology Associates, we are developing a MAP/MMS companion standard for space construction and we will produce software to allow the MAP/MMS protocol to be used in our CSC operations testbed.
An automated method to find reaction mechanisms and solve the kinetics in organometallic catalysis.
Varela, J A; Vázquez, S A; Martínez-Núñez, E
2017-05-01
A novel computational method is proposed in this work for use in discovering reaction mechanisms and solving the kinetics of transition metal-catalyzed reactions. The method does not rely on either chemical intuition or assumed a priori mechanisms, and it works in a fully automated fashion. Its core is a procedure, recently developed by one of the authors, that combines accelerated direct dynamics with an efficient geometry-based post-processing algorithm to find transition states (Martinez-Nunez, E., J. Comput. Chem. 2015, 36, 222-234). In the present work, several auxiliary tools have been added to deal with the specific features of transition metal catalytic reactions. As a test case, we chose the cobalt-catalyzed hydroformylation of ethylene because of its well-established mechanism, and the fact that it has already been used in previous automated computational studies. Besides the generally accepted mechanism of Heck and Breslow, several side reactions, such as hydrogenation of the alkene, emerged from our calculations. Additionally, the calculated rate law for the hydroformylation reaction agrees reasonably well with those obtained in previous experimental and theoretical studies.
Wang, Nancy X. R.; Olson, Jared D.; Ojemann, Jeffrey G.; Rao, Rajesh P. N.; Brunton, Bingni W.
2016-01-01
Fully automated decoding of human activities and intentions from direct neural recordings is a tantalizing challenge in brain-computer interfacing. Implementing Brain Computer Interfaces (BCIs) outside carefully controlled experiments in laboratory settings requires adaptive and scalable strategies with minimal supervision. Here we describe an unsupervised approach to decoding neural states from naturalistic human brain recordings. We analyzed continuous, long-term electrocorticography (ECoG) data recorded over many days from the brain of subjects in a hospital room, with simultaneous audio and video recordings. We discovered coherent clusters in high-dimensional ECoG recordings using hierarchical clustering and automatically annotated them using speech and movement labels extracted from audio and video. To our knowledge, this represents the first time techniques from computer vision and speech processing have been used for natural ECoG decoding. Interpretable behaviors were decoded from ECoG data, including moving, speaking and resting; the results were assessed by comparison with manual annotation. Discovered clusters were projected back onto the brain revealing features consistent with known functional areas, opening the door to automated functional brain mapping in natural settings. PMID:27148018
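The unsupervised grouping step described above can be illustrated with a toy sketch: a minimal single-linkage agglomerative clustering in pure Python. This is not the authors' pipeline; the `single_linkage` helper and the 2-D "feature vectors" are illustrative stand-ins for the high-dimensional ECoG features the study clustered.

```python
# Hedged sketch: single-linkage agglomerative clustering on toy data.
# Illustrates the idea of discovering coherent clusters without labels;
# data and function names are hypothetical, not from the paper.

def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def single_linkage(points, n_clusters):
    """Repeatedly merge the two closest clusters until n_clusters remain."""
    clusters = [[i] for i in range(len(points))]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # single linkage: distance between closest members
                d = min(euclidean(points[a], points[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters[j]
        del clusters[j]
    return clusters

# Two well-separated groups of 2-D "feature vectors"
pts = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9)]
groups = single_linkage(pts, 2)
```

In practice a library routine (e.g. SciPy's hierarchical clustering) would replace this quadratic toy loop, but the merge-closest-pair logic is the same.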
Savant Genome Browser 2: visualization and analysis for population-scale genomics.
Fiume, Marc; Smith, Eric J M; Brook, Andrew; Strbenac, Dario; Turner, Brian; Mezlini, Aziz M; Robinson, Mark D; Wodak, Shoshana J; Brudno, Michael
2012-07-01
High-throughput sequencing (HTS) technologies are providing an unprecedented capacity for data generation, and there is a corresponding need for efficient data exploration and analysis capabilities. Although most existing tools for HTS data analysis are developed for either automated (e.g. genotyping) or visualization (e.g. genome browsing) purposes, such tools are most powerful when combined. For example, integration of visualization and computation allows users to iteratively refine their analyses by updating computational parameters within the visual framework in real-time. Here we introduce the second version of the Savant Genome Browser, a standalone program for visual and computational analysis of HTS data. Savant substantially improves upon its predecessor and existing tools by introducing innovative visualization modes and navigation interfaces for several genomic datatypes, and synergizing visual and automated analyses in a way that is powerful yet easy even for non-expert users. We also present a number of plugins that were developed by the Savant Community, which demonstrate the power of integrating visual and automated analyses using Savant. The Savant Genome Browser is freely available (open source) at www.savantbrowser.com.
An Intelligent Systems Approach to Automated Object Recognition: A Preliminary Study
Maddox, Brian G.; Swadley, Casey L.
2002-01-01
Attempts at fully automated object recognition systems have met with varying levels of success over the years. However, none of the systems have achieved high enough accuracy rates to be run unattended. One of the reasons for this may be that they are designed from the computer's point of view and rely mainly on image-processing methods. A better solution to this problem may be to make use of modern advances in computational intelligence and distributed processing to try to mimic how the human brain is thought to recognize objects. As humans combine cognitive processes with detection techniques, such a system would combine traditional image-processing techniques with computer-based intelligence to determine the identity of various objects in a scene.
A computer program for automated flutter solution and matched point determination
NASA Technical Reports Server (NTRS)
Bhatia, K. G.
1973-01-01
The use of a digital computer program (MATCH) for automated determination of the flutter velocity and the matched-point flutter density is described. The program is based on the use of the modified Laguerre iteration formula to converge to a flutter crossing or a matched-point density. A general description of the computer program is included and the purpose of all subroutines used is stated. The input required by the program and various input options are detailed, and the output description is presented. The program can solve flutter equations formulated with up to 12 vibration modes and obtain flutter solutions for up to 10 air densities. The program usage is illustrated by a sample run, and the FORTRAN program listing is included.
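As a hedged illustration of the kind of iteration the MATCH program relies on, the following sketches the classic (unmodified) Laguerre formula applied to a simple polynomial root. The report's modified formula and its flutter equations are not reproduced here; the polynomial and starting point below are purely illustrative.

```python
# Hedged sketch: classic Laguerre iteration for one polynomial root.
# The flutter program uses a *modified* Laguerre formula on its own
# equations; this toy version only shows the basic iteration scheme.
import cmath

def poly_eval(coeffs, x):
    """Evaluate p, p', p'' at x; coeffs in descending powers (Horner)."""
    p = pd = pdd = 0j
    for c in coeffs:
        pdd = pdd * x + 2 * pd
        pd = pd * x + p
        p = p * x + c
    return p, pd, pdd

def laguerre_root(coeffs, x0, tol=1e-12, max_iter=50):
    n = len(coeffs) - 1          # polynomial degree
    x = complex(x0)
    for _ in range(max_iter):
        p, pd, pdd = poly_eval(coeffs, x)
        if abs(p) < tol:
            return x
        g = pd / p
        h = g * g - pdd / p
        root = cmath.sqrt((n - 1) * (n * h - g * g))
        # pick the sign giving the larger denominator (stability)
        denom = g + root if abs(g + root) > abs(g - root) else g - root
        x -= n / denom
    return x

# root of x^2 - 2 near 1.5
r = laguerre_root([1, 0, -2], 1.5)
```

Laguerre's method converges cubically in general and is exact for quadratics, which is why it is attractive for repeatedly chasing flutter crossings across many densities.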
Automated attendance accounting system
NASA Technical Reports Server (NTRS)
Chapman, C. P. (Inventor)
1973-01-01
An automated accounting system useful for applying data to a computer from any or all of a multiplicity of data terminals is disclosed. The system essentially includes a preselected number of data terminals which are each adapted to convert data words of decimal form to another form, i.e., binary, usable with the computer. Each data terminal may take the form of a keyboard unit having a number of depressable buttons or switches corresponding to selected data digits and/or function digits. A bank of data buffers, one of which is associated with each data terminal, is provided as a temporary storage. Data from the terminals is applied to the data buffers on a digit by digit basis for transfer via a multiplexer to the computer.
Péharpré, D; Cliquet, F; Sagné, E; Renders, C; Costy, F; Aubert, M
1999-07-01
The rapid fluorescent focus inhibition test (RFFIT) and the fluorescent antibody virus neutralization test (FAVNT) are both diagnostic tests for determining levels of rabies neutralizing antibodies. An automated method for determining fluorescence has been implemented to reduce the work time required for fluorescent visual microscopic observations. The automated method offers several advantages over conventional visual observation, such as the ability to rapidly test many samples. The antibody titers obtained with automated techniques were similar to those obtained with both the RFFIT (n = 165, r = 0.93, P < 0.001) and the FAVNT (n = 52, r = 0.99, P < 0.001).
NASA Technical Reports Server (NTRS)
Stuart, Jeffrey; Howell, Kathleen; Wilson, Roby
2013-01-01
The Sun-Jupiter Trojan asteroids are celestial bodies of great scientific interest as well as potential sources of water and other minerals for long-term human exploration of the solar system. Previous investigations under this project have addressed the automated design of tours within the asteroid swarm. This investigation expands the current automation scheme by incorporating options for a complete trajectory design approach to the Trojan asteroids. Computational aspects of the design procedure are automated such that end-to-end trajectories are generated with a minimum of human interaction after key elements and constraints associated with a proposed mission concept are specified.
Development of an automated film-reading system for ballistic ranges
NASA Technical Reports Server (NTRS)
Yates, Leslie A.
1992-01-01
Software for an automated film-reading system that uses personal computers and digitized shadowgraphs is described. The software identifies pixels associated with fiducial-line and model images, and least-squares procedures are used to calculate the positions and orientations of the images. Automated position and orientation readings for sphere and cone models are compared to those obtained using a manual film reader. When facility calibration errors are removed from these readings, the accuracy of the automated readings is better than the pixel resolution, and it is equal to, or better than, that of the manual readings. The effects of film-reading and facility-calibration errors on calculated aerodynamic coefficients are discussed.
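The least-squares step here is the classic straight-line fit to pixel coordinates. A minimal sketch, assuming synthetic noise-free pixels rather than real shadowgraph data:

```python
# Hedged sketch: ordinary least-squares line fit to pixel coordinates,
# of the kind used to locate a fiducial-line image. Data are synthetic.

def fit_line(pixels):
    """Return (slope, intercept) minimizing vertical squared error."""
    n = len(pixels)
    sx = sum(x for x, _ in pixels)
    sy = sum(y for _, y in pixels)
    sxx = sum(x * x for x, _ in pixels)
    sxy = sum(x * y for x, y in pixels)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# pixels lying exactly on y = 2x + 1
m, b = fit_line([(0, 1), (1, 3), (2, 5), (3, 7)])
```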
Introduction to the Use of Computers in Libraries: A Textbook for the Non-Technical Student.
ERIC Educational Resources Information Center
Ogg, Harold C.
This book outlines computing and information science from the perspective of what librarians and educators need to do with computer technology and how it can help them perform their jobs more efficiently. It provides practical explanations and library applications for non-technical users of desktop computers and other library automation tools.…
THE COMPUTER CONCEPT OF SELF-INSTRUCTIONAL DEVICES.
ERIC Educational Resources Information Center
SILBERMAN, HARRY F.
The computer system concept will be developed in two ways: first, a description will be made of the small computer-based teaching machine which is being used as a research tool; second, a description will be made of the large computer laboratory for automated school systems which are being developed. The first machine consists of three elements…
NASA Tech Briefs, July 1994. Volume 18, No. 7
NASA Technical Reports Server (NTRS)
1994-01-01
Topics covered include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Life Sciences; Books and Reports
ERIC Educational Resources Information Center
Goodgion, Laurel; And Others
1986-01-01
Eight articles in special supplement to "Library Journal" and "School Library Journal" cover a computer program called "Byte into Books"; microcomputers and the small library; creating databases with students; online searching with a microcomputer; quality automation software; Meckler Publishing Company's…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-12
NASA Tech Briefs, November 2000. Volume 24, No. 11
NASA Technical Reports Server (NTRS)
2000-01-01
Topics covered include: Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Test and Measurement; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Data Acquisition.
NASA Tech Briefs, April 1996. Volume 20, No. 4
NASA Technical Reports Server (NTRS)
1996-01-01
Topics covered include: Advanced Composites and Plastics; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information; Books and Reports.
Teaching 3D computer animation to illustrators: the instructor as translator and technical director.
Koning, Wobbe F
2012-01-01
An art instructor discusses the difficulties he's encountered teaching computer graphics skills to undergraduate art students. To help the students, he introduced an automated-rigging script for character animation.
Automated Tetrahedral Mesh Generation for CFD Analysis of Aircraft in Conceptual Design
NASA Technical Reports Server (NTRS)
Ordaz, Irian; Li, Wu; Campbell, Richard L.
2014-01-01
The paper introduces an automation process of generating a tetrahedral mesh for computational fluid dynamics (CFD) analysis of aircraft configurations in early conceptual design. The method was developed for CFD-based sonic boom analysis of supersonic configurations, but can be applied to aerodynamic analysis of aircraft configurations in any flight regime.
Cost-Effectiveness Analysis of the Automation of a Circulation System.
ERIC Educational Resources Information Center
Mosley, Isobel
A general methodology for cost effectiveness analysis was developed and applied to the Colorado State University library loan desk. The cost effectiveness of the existing semi-automated circulation system was compared with that of a fully manual one, based on the existing manual subsystem. Faculty users' time and computer operating costs were…
Automated Semantic Indices Related to Cognitive Function and Rate of Cognitive Decline
ERIC Educational Resources Information Center
Pakhomov, Serguei V. S.; Hemmy, Laura S.; Lim, Kelvin O.
2012-01-01
The objective of our study is to introduce a fully automated, computational linguistic technique to quantify semantic relations between words generated on a standard semantic verbal fluency test and to determine its cognitive and clinical correlates. Cognitive differences between patients with Alzheimer's disease and mild cognitive impairment are…
Microsoft's Vista: Guarantees People with Special Needs Access to Computers
ERIC Educational Resources Information Center
Williams, John M.
2006-01-01
In this article, the author discusses the accessibility features of Microsoft's Windows Vista. One of the most innovative aspects of Windows Vista is a new accessibility and automated testing model called Microsoft UI Automation, which reduces development costs not only for accessible and assistive technology (AT) developers, but also for…
Mechanisation and Automation of Information Library Procedures in the USSR.
ERIC Educational Resources Information Center
Batenko, A. I.
Scientific and technical libraries represent a fundamental link in a complex information storage and retrieval system. The handling of a large volume of scientific and technical data and provision of information library services requires the utilization of computing facilities and automation equipment, and was started in the Soviet Union on a…
Information Systems; Modern Health Care and Medical Information.
ERIC Educational Resources Information Center
Brandejs, J. F., And Others
1975-01-01
To effectively handle changes in health policy and health information, new designs and applications of automation are explored. Increased use of computer-based information systems in health care could serve as a means of control over the costs of developing more comprehensive health service, with applications increasing not only the automation of…
Validation of Automated Scoring for a Formative Assessment That Employs Scientific Argumentation
ERIC Educational Resources Information Center
Mao, Liyang; Liu, Ou Lydia; Roohr, Katrina; Belur, Vinetha; Mulholland, Matthew; Lee, Hee-Sun; Pallant, Amy
2018-01-01
Scientific argumentation is one of the core practices for teachers to implement in science classrooms. We developed a computer-based formative assessment to support students' construction and revision of scientific arguments. The assessment is built upon automated scoring of students' arguments and provides feedback to students and teachers.…
Automated Formative Assessment as a Tool to Scaffold Student Documentary Writing
ERIC Educational Resources Information Center
Ferster, Bill; Hammond, Thomas C.; Alexander, R. Curby; Lyman, Hunt
2012-01-01
The hurried pace of the modern classroom does not permit formative feedback on writing assignments at the frequency or quality recommended by the research literature. One solution for increasing individual feedback to students is to incorporate some form of computer-generated assessment. This study explores the use of automated assessment of…
Optomechatronic System For Automated Intra Cytoplasmic Sperm Injection
NASA Astrophysics Data System (ADS)
Shulev, Assen; Tiankov, Tihomir; Ignatova, Detelina; Kostadinov, Kostadin; Roussev, Ilia; Trifonov, Dimitar; Penchev, Valentin
2015-12-01
This paper presents a complex optomechatronic system for In-Vitro Fertilization (IVF), offering almost complete automation of the Intra Cytoplasmic Sperm Injection (ICSI) procedure. The compound parts and sub-systems, as well as some of the computer vision algorithms, are described below. System capabilities for ICSI have been demonstrated on infertile oocyte cells.
Readerbench: Automated Evaluation of Collaboration Based on Cohesion and Dialogism
ERIC Educational Resources Information Center
Dascalu, Mihai; Trausan-Matu, Stefan; McNamara, Danielle S.; Dessus, Philippe
2015-01-01
As Computer-Supported Collaborative Learning (CSCL) gains a broader usage, the need for automated tools capable of supporting tutors in the time-consuming process of analyzing conversations becomes more pressing. Moreover, collaboration, which presumes the intertwining of ideas or points of view among participants, is a central element of dialogue…
2004-03-01
On all levels of the military command hierarchy there is a strong demand for support through the automated processing of reconnaissance reports. … preconditions for the improvement of computer support and then illustrates the automated processing of report information using a military ambush situation …
Computer-Automated Approach for Scoring Short Essays in an Introductory Statistics Course
ERIC Educational Resources Information Center
Zimmerman, Whitney Alicia; Kang, Hyun Bin; Kim, Kyung; Gao, Mengzhao; Johnson, Glenn; Clariana, Roy; Zhang, Fan
2018-01-01
Over two semesters short essay prompts were developed for use with the Graphical Interface for Knowledge Structure (GIKS), an automated essay scoring system. Participants were students in an undergraduate-level online introductory statistics course. The GIKS compares students' writing samples with an expert's to produce keyword occurrence and…
NASA Astrophysics Data System (ADS)
Hoang, Bui Huy; Oda, Masahiro; Jiang, Zhengang; Kitasaka, Takayuki; Misawa, Kazunari; Fujiwara, Michitaka; Mori, Kensaku
2011-03-01
This paper presents an automated anatomical labeling method for arteries extracted from contrast-enhanced 3D CT images, based on multi-class AdaBoost. In abdominal surgery, understanding of the vasculature related to a target organ such as the colon is very important; therefore, the anatomical structure of the blood vessels needs to be understood by computers in a system supporting abdominal surgery. Several studies have addressed automated anatomical labeling, but none has addressed automated labeling of the arteries related to the colon. The proposed method obtains a tree structure of arteries from the artery region and calculates feature values for each branch: thickness, curvature, direction, and running vectors. Candidate arterial names are then computed by classifiers trained to output artery names. Finally, a global optimization process is applied to the candidate arterial names to determine the final names. The target arteries of this paper are nine lower abdominal arteries (AO, LCIA, RCIA, LEIA, REIA, SMA, IMA, LIIA, RIIA). We applied the proposed method to 14 cases of contrast-enhanced 3D abdominal CT images and evaluated the results by a leave-one-out scheme. The average precision and recall rates of the proposed method were 87.9% and 93.3%, respectively. The results of this method are applicable to anatomical name display in surgical simulation and computer-aided surgery.
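The mapping from per-branch features to candidate names can be illustrated with a much simpler stand-in classifier. The paper uses multi-class AdaBoost; the sketch below substitutes a nearest-centroid rule, and the feature values and labels are invented for illustration, not taken from the study's data.

```python
# Hedged sketch: nearest-centroid classification of branch features
# (thickness, curvature) to artery names. A toy stand-in for the
# paper's multi-class AdaBoost; all numbers here are illustrative.

def train_centroids(samples):
    """samples: list of (feature_vector, label). Returns label -> centroid."""
    sums, counts = {}, {}
    for vec, label in samples:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: [s / counts[lab] for s in acc] for lab, acc in sums.items()}

def classify(centroids, vec):
    """Assign the label whose centroid is closest in feature space."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(c, vec))
    return min(centroids, key=lambda lab: dist(centroids[lab]))

# toy (thickness_mm, curvature) features for two artery classes
train = [((8.0, 0.1), "AO"), ((7.5, 0.2), "AO"),
         ((2.0, 0.8), "IMA"), ((2.5, 0.7), "IMA")]
cents = train_centroids(train)
label = classify(cents, (2.2, 0.75))
```

The paper's global optimization step, which reconciles candidate names over the whole tree, is beyond this sketch.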
Evaluation of oesophageal transit velocity using the improved Demons technique.
De Souza, Michele N; Xavier, Fernando E B; Secaf, Marie; Troncon, Luiz E A; de Oliveira, Ricardo B; Moraes, Eder R
2016-01-01
This paper presents a novel method to compute oesophageal transit velocity in a direct and automated manner by the registration of scintigraphy images. A total of 36 images from nine healthy volunteers were processed. Four dynamic image series per volunteer were acquired after a minimum 8 h fast. Each acquisition was made following the ingestion of 5 ml saline labelled with about 26 MBq (700 µCi) technetium-99m phytate in a single swallow. Between the acquisitions, another two swallows of 5 ml saline were performed to clear the oesophagus. The composite acquired files were made of 240 frames of anterior and posterior views; each frame is the accumulated count for 250 ms. At the end of the acquisitions, the images were corrected for radioactive decay, the geometric mean was computed between the anterior and posterior views, and the registration of a set of subsequent images was performed. Using the improved Demons technique, we obtained from the deformation field the regional resultant velocity, which is directly related to the oesophageal transit velocity. The mean regional resultant velocities decrease progressively from the proximal to the distal oesophageal portions and, at the proximal portion, are virtually identical to the typical velocity of the primary peristaltic pump. Comparison between this parameter and 'time-activity' curves reveals consistency in the velocities obtained using both methods for the proximal portion. Application of the improved Demons technique, as an easy and automated method to evaluate velocities of oesophageal bolus transit, is feasible and seems to yield consistent data, particularly for the proximal oesophagus.
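Once a registration algorithm yields a displacement field between frames, converting it to a regional speed is straightforward. A minimal sketch, assuming an illustrative field, pixel size, and the 250 ms frame interval mentioned above (the Demons registration itself is not reproduced):

```python
# Hedged sketch: regional mean speed from a toy deformation field, the
# quantity the Demons registration yields per oesophageal region.
# The field values and pixel size are illustrative.

def mean_speed(field, mm_per_pixel, dt_s):
    """field: list of (dx, dy) pixel displacements between two frames.
    Returns the mean displacement magnitude converted to mm/s."""
    speeds = [((dx ** 2 + dy ** 2) ** 0.5) * mm_per_pixel / dt_s
              for dx, dy in field]
    return sum(speeds) / len(speeds)

# displacements between frames 0.25 s apart, with 1 mm pixels
v = mean_speed([(0, 3), (0, 4), (0, 5)], 1.0, 0.25)
```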
Cardiac imaging: working towards fully-automated machine analysis & interpretation.
Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido
2017-03-01
Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.
The H-Metaphor as a Guideline for Vehicle Automation and Interaction
NASA Technical Reports Server (NTRS)
Flemisch, Frank O.; Adams, Catherine A.; Conway, Sheila R.; Goodrich, Ken H.; Palmer, Michael T.; Schutte, Paul C.
2003-01-01
Good design is not free of form. It does not necessarily happen through a mere sampling of technologies packaged together, through pure analysis, or just by following procedures. Good design begins with inspiration and a vision, a mental image of the end product, which can sometimes be described with a design metaphor. A successful example from the 20th century is the desktop metaphor, which took a real desktop as an orientation for the manipulation of electronic documents on a computer. Initially defined by Xerox, then refined by Apple and others, it could be found on almost every computer by the turn of the 20th century. This paper sketches a specific metaphor for the emerging field of highly automated vehicles, their interactions with human users and with other vehicles. In the introduction, general questions on vehicle automation are raised and related to the physical control of conventional vehicles and to the automation of some late 20th century vehicles. After some words on design metaphors, the H-Metaphor is introduced. More details of the metaphor's source are described and their application to human-machine interaction, automation and management of intelligent vehicles sketched. Finally, risks and opportunities to apply the metaphor to technical applications are discussed.
Seymour, A. C.; Dale, J.; Hammill, M.; Halpin, P. N.; Johnston, D. W.
2017-01-01
Estimating animal populations is critical for wildlife management. Aerial surveys are used for generating population estimates, but can be hampered by cost, logistical complexity, and human risk. Additionally, human counts of organisms in aerial imagery can be tedious and subjective. Automated approaches show promise, but can be constrained by long setup times and difficulty discriminating animals in aggregations. We combine unmanned aircraft systems (UAS), thermal imagery and computer vision to improve traditional wildlife survey methods. During spring 2015, we flew fixed-wing UAS equipped with thermal sensors, imaging two grey seal (Halichoerus grypus) breeding colonies in eastern Canada. Human analysts counted and classified individual seals in imagery manually. Concurrently, an automated classification and detection algorithm discriminated seals based upon temperature, size, and shape of thermal signatures. Automated counts were within 95–98% of human estimates; at Saddle Island, the model estimated 894 seals compared to analyst counts of 913, and at Hay Island estimated 2188 seals compared to analysts’ 2311. The algorithm improves upon shortcomings of computer vision by effectively recognizing seals in aggregations while keeping model setup time minimal. Our study illustrates how UAS, thermal imagery, and automated detection can be combined to efficiently collect population data critical to wildlife management. PMID:28338047
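The detection idea (warm blobs against a cold background) can be illustrated by threshold-and-label counting on a toy grid. The study's actual algorithm also uses size and shape of the thermal signatures; the grid and threshold below are invented for illustration.

```python
# Hedged sketch: count 4-connected components of cells hotter than a
# threshold in a toy "thermal" grid. A simplified stand-in for the
# study's detection step; real pipelines also filter by size and shape.

def count_blobs(grid, threshold):
    """Count 4-connected components of cells with value > threshold."""
    rows, cols = len(grid), len(grid[0])
    seen = set()
    blobs = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] > threshold and (r, c) not in seen:
                blobs += 1
                stack = [(r, c)]           # iterative flood fill
                while stack:
                    y, x = stack.pop()
                    if (y, x) in seen or not (0 <= y < rows and 0 <= x < cols):
                        continue
                    if grid[y][x] <= threshold:
                        continue
                    seen.add((y, x))
                    stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return blobs

# 0 = cold background, 30+ = warm animal
frame = [[0, 0, 0, 0, 0],
         [0, 34, 35, 0, 0],
         [0, 33, 0, 0, 31],
         [0, 0, 0, 0, 32]]
n_seals = count_blobs(frame, 20)
```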
NASA Astrophysics Data System (ADS)
Hopp, T.; Zapf, M.; Ruiter, N. V.
2014-03-01
An essential processing step for comparison of Ultrasound Computer Tomography images to other modalities, as well as for the use in further image processing, is to segment the breast from the background. In this work we present a (semi-) automated 3D segmentation method which is based on the detection of the breast boundary in coronal slice images and a subsequent surface fitting. The method was evaluated using a software phantom and in-vivo data. The fully automatically processed phantom results showed that a segmentation of approx. 10% of the slices of a dataset is sufficient to recover the overall breast shape. Application to 16 in-vivo datasets was performed successfully using semi-automated processing, i.e. using a graphical user interface for manual corrections of the automated breast boundary detection. The processing time for the segmentation of an in-vivo dataset could be significantly reduced by a factor of four compared to a fully manual segmentation. Comparison to manually segmented images identified a smoother surface for the semi-automated segmentation with an average of 11% of differing voxels and an average surface deviation of 2mm. Limitations of the edge detection may be overcome by future updates of the KIT USCT system, allowing a fully-automated usage of our segmentation approach.
NASA Astrophysics Data System (ADS)
Seymour, A. C.; Dale, J.; Hammill, M.; Halpin, P. N.; Johnston, D. W.
2017-03-01
Estimating animal populations is critical for wildlife management. Aerial surveys are used for generating population estimates, but can be hampered by cost, logistical complexity, and human risk. Additionally, human counts of organisms in aerial imagery can be tedious and subjective. Automated approaches show promise, but can be constrained by long setup times and difficulty discriminating animals in aggregations. We combine unmanned aircraft systems (UAS), thermal imagery and computer vision to improve traditional wildlife survey methods. During spring 2015, we flew fixed-wing UAS equipped with thermal sensors, imaging two grey seal (Halichoerus grypus) breeding colonies in eastern Canada. Human analysts counted and classified individual seals in imagery manually. Concurrently, an automated classification and detection algorithm discriminated seals based upon temperature, size, and shape of thermal signatures. Automated counts were within 95-98% of human estimates; at Saddle Island, the model estimated 894 seals compared to analyst counts of 913, and at Hay Island estimated 2188 seals compared to analysts’ 2311. The algorithm improves upon shortcomings of computer vision by effectively recognizing seals in aggregations while keeping model setup time minimal. Our study illustrates how UAS, thermal imagery, and automated detection can be combined to efficiently collect population data critical to wildlife management.
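The automated detection step described above, discriminating seals by temperature, size, and shape of thermal signatures, can be sketched roughly as follows. This is a hypothetical illustration, not the study's algorithm: all thresholds are invented, shape filtering is reduced to a pixel-area range, and aggregations are split by dividing a blob's area by the median single-animal area.

```python
import numpy as np
from scipy import ndimage

def count_seals(thermal, temp_thresh=2.0, min_area=4, max_area=400):
    """Threshold a thermal frame (degrees above background), label
    connected warm blobs, and count those with a plausible single-seal
    area; oversized blobs are split by the median single-seal area as
    a crude stand-in for handling animals in aggregations."""
    mask = thermal > temp_thresh
    labels, n = ndimage.label(mask)
    areas = ndimage.sum(mask, labels, index=range(1, n + 1))
    singles = [a for a in areas if min_area <= a <= max_area]
    typical = np.median(singles) if singles else max_area
    count = 0
    for a in areas:
        if a < min_area:
            continue                       # noise speckle
        count += max(1, int(round(a / typical))) if a > max_area else 1
    return count

# Toy frame: cold background, three separated warm 3x3 "seals"
frame = np.zeros((32, 32))
for (r, c) in [(5, 5), (5, 20), (20, 10)]:
    frame[r:r+3, c:c+3] = 5.0
print(count_seals(frame))   # -> 3
```

On a frame where two such blobs merge, lowering `max_area` triggers the area-splitting branch, which is the behavior the abstract highlights for seals in aggregations.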
Microfluidic-Based sample chips for radioactive solutions
Tripp, J. L.; Law, J. D.; Smith, T. E.; ...
2015-01-01
Historical nuclear fuel cycle process sampling techniques required sample volumes ranging in the tens of milliliters. The radiation levels experienced by analytical personnel and equipment, in addition to the waste volumes generated from analysis of these samples, have been significant. These sample volumes also impacted accountability inventories of required analytes during process operations. To mitigate radiation dose and other issues associated with the historically larger sample volumes, a microcapillary sample chip was chosen for further investigation. The ability to obtain microliter volume samples coupled with a remote automated means of sample loading, tracking, and transporting to the analytical instrument would greatly improve analytical efficiency while reducing both personnel exposure and radioactive waste volumes. Sample chip testing was completed to determine the accuracy, repeatability, and issues associated with the use of microfluidic sample chips used to supply µL sample volumes of lanthanide analytes dissolved in nitric acid for introduction to an analytical instrument for elemental analysis.
Stager, Ron; Chambers, Douglas; Wiatzka, Gerd; Dupre, Monica; Callough, Micah; Benson, John; Santiago, Erwin; van Veen, Walter
2017-04-01
The Port Hope Area Initiative is a project mandated and funded by the Government of Canada to remediate properties with legacy low-level radioactive waste contamination in the Town of Port Hope, Ontario. The management and use of large amounts of data from surveys of some 4800 properties is a significant task critical to the success of the project. A large amount of information is generated through the surveys, including scheduling individual field visits to the properties, capture of field data, laboratory sample tracking, QA/QC, property report generation, and project management reporting. Web-mapping tools were used to track and display temporal progress of various tasks and facilitated consideration of spatial associations of contamination levels. The IM system facilitated the management and integrity of the large amounts of information collected, evaluation of spatial associations, automated report reproduction, and consistent application and traceable execution for this project.
Nuclear Resonance Fluorescence for Materials Assay
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quiter, Brian; Ludewigt, Bernhard; Mozin, Vladimir
This paper discusses the use of nuclear resonance fluorescence (NRF) techniques for the isotopic and quantitative assaying of radioactive material. Potential applications include age-dating of an unknown radioactive source, pre- and post-detonation nuclear forensics, and safeguards for nuclear fuel cycles. Examples of age-dating a strong radioactive source and assaying a spent fuel pin are discussed. The modeling work has been performed with the Monte Carlo radiation transport computer code MCNPX, and the capability to simulate NRF has been added to the code. Discussed are the limitations in MCNPX's photon transport physics for accurately describing photon scattering processes that are important contributions to the background and impact the applicability of the NRF assay technique.
Implementing Computer Integrated Manufacturing Technician Program.
ERIC Educational Resources Information Center
Gibbons, Roger
A computer-integrated manufacturing (CIM) technician program was developed to provide training and technical assistance to meet the needs of business and industry in the face of the demands of high technology. The Computer and Automated Systems Association (CASA) of the Society of Manufacturing Engineers provided the incentive and guidelines…
Systems Librarian and Automation Review.
ERIC Educational Resources Information Center
Schuyler, Michael
1992-01-01
Discusses software sharing on computer networks and the need for proper bandwidth; and describes the technology behind FidoNet, a computer network made up of electronic bulletin boards. Network features highlighted include front-end mailers, Zone Mail Hour, Nodelist, NetMail, EchoMail, computer conferences, tosser and scanner programs, and host…
Introduction to Minicomputers in Federal Libraries.
ERIC Educational Resources Information Center
Young, Micki Jo; And Others
This book for library administrators and Federal library staff covers the application of minicomputers in Federal libraries and offers a review of minicomputer technology. A brief overview of automation explains computer technology, hardware, and software. The role of computers in libraries is examined in terms of the history of computers and…
Effects of automation of information-processing functions on teamwork.
Wright, Melanie C; Kaber, David B
2005-01-01
We investigated the effects of automation as applied to different stages of information processing on team performance in a complex decision-making task. Forty teams of 2 individuals performed a simulated Theater Defense Task. Four automation conditions were simulated with computer assistance applied to realistic combinations of information acquisition, information analysis, and decision selection functions across two levels of task difficulty. Multiple measures of team effectiveness and team coordination were used. Results indicated different forms of automation have different effects on teamwork. Compared with a baseline condition, an increase in automation of information acquisition led to an increase in the ratio of information transferred to information requested; an increase in automation of information analysis resulted in higher team coordination ratings; and automation of decision selection led to better team effectiveness under low levels of task difficulty but at the cost of higher workload. The results support the use of early and intermediate forms of automation related to acquisition and analysis of information in the design of team tasks. Decision-making automation may provide benefits in more limited contexts. Applications of this research include the design and evaluation of automation in team environments.
ERIC Educational Resources Information Center
1984
This 63-paper collection represents a variety of interests and areas of expertise related to technology and its impact on the educational process at all levels. Topics include automated instructional management, computer literacy, software evaluation, beginning a computer program, finding software, networking, programming, and the computer and…
A computational framework for automation of point defect calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goyal, Anuj; Gorai, Prashun; Peng, Haowei
We have developed a complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory. Furthermore, the framework provides an effective and efficient method for defect structure generation, and creation of simple yet customizable workflows to analyze defect calculations. This package provides the capability to compute widely-accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band filling correction to shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology.
Computer system for scanning tunneling microscope automation
NASA Astrophysics Data System (ADS)
Aguilar, M.; García, A.; Pascual, P. J.; Presa, J.; Santisteban, A.
1987-03-01
A computerized system for the automation of a scanning tunneling microscope is presented. It is based on an IBM personal computer (PC) either an XT or an AT, which performs the control, data acquisition and storage operations, displays the STM "images" in real time, and provides image processing tools for the restoration and analysis of data. It supports different data acquisition and control cards and image display cards. The software has been designed in a modular way to allow the replacement of these cards and other equipment improvements as well as the inclusion of user routines for data analysis.
A computational framework for automation of point defect calculations
Goyal, Anuj; Gorai, Prashun; Peng, Haowei; ...
2017-01-13
We have developed a complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory. Furthermore, the framework provides an effective and efficient method for defect structure generation, and creation of simple yet customizable workflows to analyze defect calculations. This package provides the capability to compute widely-accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band filling correction to shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology.
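Of the three finite-size correction schemes named in the abstract, the image-charge correction is the simplest to illustrate. The sketch below implements only the leading monopole (Makov-Payne style) term; the actual package's scheme is more elaborate, and the Madelung constant, dielectric constant, and cell size used here are illustrative placeholders, not values from the framework.

```python
# Hedged sketch of a leading-order image-charge correction for a charged
# defect in a periodic supercell:  E = q^2 * alpha_M / (2 * eps * L).
def image_charge_correction_eV(q, madelung_alpha, eps, L_angstrom):
    """Monopole image-charge term in eV, with the cell edge L in
    Angstrom.  e^2/(4*pi*eps0) = 14.3996 eV*Angstrom."""
    E2 = 14.3996                      # e^2 / (4 pi eps0), eV * Angstrom
    return q**2 * madelung_alpha * E2 / (2.0 * eps * L_angstrom)

# Example: a +2 defect, simple-cubic Madelung constant ~2.837,
# dielectric constant 10, 10 Angstrom supercell edge (all illustrative).
print(round(image_charge_correction_eV(2, 2.8373, 10.0, 10.0), 3))  # -> 0.817
```

The q^2/L scaling is why charged-defect formation energies converge slowly with supercell size, which motivates applying such corrections rather than brute-force cell enlargement.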
NASA Technical Reports Server (NTRS)
Milner, E. J.; Krosel, S. M.
1977-01-01
Techniques are presented for determining the elements of the A, B, C, and D state variable matrices for systems simulated on an EAI Pacer 100 hybrid computer. An automated procedure systematically generates disturbance data necessary to linearize the simulation model and stores these data on a floppy disk. A separate digital program verifies this data, calculates the elements of the system matrices, and prints these matrices appropriately labeled. The partial derivatives forming the elements of the state variable matrices are approximated by finite difference calculations.
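The finite-difference step described above, approximating the partial derivatives that form the A, B, C, and D matrices, can be sketched as follows. This is a generic central-difference linearization of xdot = f(x, u), y = g(x, u) about an operating point, not the EAI Pacer 100 procedure itself; the toy system is invented for illustration.

```python
import numpy as np

def linearize(f, g, x0, u0, eps=1e-6):
    """Finite-difference linearization of  xdot = f(x, u),  y = g(x, u)
    about (x0, u0), yielding the state-variable matrices A, B, C, D."""
    x0, u0 = np.asarray(x0, float), np.asarray(u0, float)
    def jac(func, z0, other, wrt_x):
        cols = []
        for i in range(len(z0)):
            dz = np.zeros_like(z0); dz[i] = eps
            hi = func(z0 + dz, other) if wrt_x else func(other, z0 + dz)
            lo = func(z0 - dz, other) if wrt_x else func(other, z0 - dz)
            cols.append((np.asarray(hi) - np.asarray(lo)) / (2 * eps))
        return np.column_stack(cols)
    A = jac(f, x0, u0, True);  B = jac(f, u0, x0, False)
    C = jac(g, x0, u0, True);  D = jac(g, u0, x0, False)
    return A, B, C, D

# Toy system: xdot = [-x0 + 2u, -3*x1], y = x0 + u  (linear, so exact)
f = lambda x, u: np.array([-x[0] + 2 * u[0], -3 * x[1]])
g = lambda x, u: np.array([x[0] + u[0]])
A, B, C, D = linearize(f, g, [0, 0], [0])
print(A)   # ~ [[-1, 0], [0, -3]]
```

Perturbing one state or input at a time and differencing the responses is exactly the "disturbance data" role the abstract assigns to the hybrid-computer side of the procedure.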
Using Automated Scores of Student Essays to Support Teacher Guidance in Classroom Inquiry
NASA Astrophysics Data System (ADS)
Gerard, Libby F.; Linn, Marcia C.
2016-02-01
Computer scoring of student written essays about an inquiry topic can be used to diagnose student progress both to alert teachers to struggling students and to generate automated guidance. We identify promising ways for teachers to add value to automated guidance to improve student learning. Three teachers from two schools and their 386 students participated. We draw on evidence from student progress, observations of how teachers interact with students, and reactions of teachers. The findings suggest that alerts for teachers prompted rich teacher-student conversations about energy in photosynthesis. In one school, the combination of the automated guidance plus teacher guidance was more effective for student science learning than two rounds of personalized, automated guidance. In the other school, both approaches resulted in equal learning gains. These findings suggest optimal combinations of automated guidance and teacher guidance to support students to revise explanations during inquiry and build integrated understanding of science.
Automation of the longwall mining system
NASA Technical Reports Server (NTRS)
Zimmerman, W.; Aster, R. W.; Harris, J.; High, J.
1982-01-01
Cost effective, safe, and technologically sound applications of automation technology to underground coal mining were identified. The longwall analysis commenced with a general search for government and industry experience of mining automation technology. A brief industry survey was conducted to identify longwall operational, safety, and design problems. The prime automation candidates resulting from the industry experience and survey were: (1) the shearer operation, (2) shield and conveyor pan line advance, (3) a management information system to allow improved mine logistics support, and (4) component fault isolation and diagnostics to reduce untimely maintenance delays. A system network analysis indicated that a 40% improvement in productivity was feasible if system delays associated with all of the above four areas were removed. A technology assessment and conceptual system design of each of the four automation candidate areas showed that state of the art digital computer, servomechanism, and actuator technologies could be applied to automate the longwall system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T.; Jessee, Matthew Anderson
The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including 3 deterministic and 3 Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 represents one of the most comprehensive revisions in the history of SCALE, providing several new capabilities and significant improvements in many existing features.
Computer program calculates gamma ray source strengths of materials exposed to neutron fluxes
NASA Technical Reports Server (NTRS)
Heiser, P. C.; Ricks, L. O.
1968-01-01
Computer program contains an input library of nuclear data for 44 elements and their isotopes to determine the induced radioactivity for gamma emitters. Minimum input requires the irradiation history of the element, a four-energy-group neutron flux, specification of an alloy composition by elements, and selection of the output.
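The activation bookkeeping such a program performs can be sketched with the standard activation equation, summed over the four flux groups the abstract mentions: A = N * sum(sigma_g * phi_g) * (1 - exp(-lambda * t_irr)) * exp(-lambda * t_cool). The cross sections and fluxes below are invented placeholders, not values from the program's 44-element nuclear data library.

```python
import math

def induced_activity(n_atoms, sigma_cm2, flux_groups, half_life_s,
                     t_irr_s, t_cool_s):
    """Induced activity (decays/s) of one activation product:
    production rate from a multigroup flux, saturation during
    irradiation, then decay during cooling."""
    lam = math.log(2) / half_life_s
    rate = n_atoms * sum(s * f for s, f in zip(sigma_cm2, flux_groups))
    return rate * (1 - math.exp(-lam * t_irr_s)) * math.exp(-lam * t_cool_s)

# Illustrative sample: four-group flux, 15 h half-life (Na-24-like),
# one hour irradiation, counted immediately.
A = induced_activity(
    n_atoms=1e20,
    sigma_cm2=[1e-24, 0.5e-24, 0.1e-24, 0.01e-24],  # barns -> cm^2, per group
    flux_groups=[1e10, 5e9, 1e9, 1e8],              # n/cm^2/s, per group
    half_life_s=15 * 3600, t_irr_s=3600, t_cool_s=0)
print(A)   # decays/s
```

Repeating this per isotope and per element of an alloy, then folding in gamma yields, gives the gamma-ray source strengths the program reports.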
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-25
... a computing environment that has adequate computer security controls in place to prevent... NRC intends to issue a security Order to this Licensee in the near future. The Order will require compliance with specific Additional Security Measures to enhance the security for certain radioactive...
Simulating the Gradually Deteriorating Performance of an RTG
NASA Technical Reports Server (NTRS)
Wood, Eric G.; Ewell, Richard C.; Patel, Jagdish; Hanks, David R.; Lozano, Juan A.; Snyder, G. Jeffrey; Noon, Larry
2008-01-01
Degra (now in version 3) is a computer program that simulates the performance of a radioisotope thermoelectric generator (RTG) over its lifetime. Degra is provided with a graphical user interface that is used to edit input parameters that describe the initial state of the RTG and the time-varying loads and environment to which it will be exposed. Performance is computed by modeling the flows of heat from the radioactive source and through the thermocouples, also allowing for losses, to determine the temperature drop across the thermocouples. This temperature drop is used to determine the open-circuit voltage, electrical resistance, and thermal conductance of the thermocouples. Output power can then be computed by relating the open-circuit voltage and the electrical resistance of the thermocouples to a specified time-varying load voltage. Degra accounts for the gradual deterioration of performance attributable primarily to decay of the radioactive source and secondarily to gradual deterioration of the thermoelectric material. To provide guidance to an RTG designer, given a minimum of input, Degra computes the dimensions, masses, and thermal conductances of important internal structures as well as the overall external dimensions and total mass.
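The degradation bookkeeping described above can be sketched as follows. This is a minimal stand-in, not Degra's internal model: heat input decays with the radioisotope, open-circuit voltage is taken proportional to the (decaying) heat with a separate thermoelectric aging factor, and output power follows from the load line P = V_load * (V_oc - V_load) / R. All coefficients are illustrative.

```python
import math

def rtg_power_W(t_years, q0_W=2000.0, half_life_y=87.7,
                v_per_W=0.015, r_ohm=2.0, v_load=15.0,
                te_degradation_per_y=0.005):
    """Illustrative RTG output power at a fixed load voltage,
    combining source decay (dominant) with slow thermoelectric aging."""
    q = q0_W * math.exp(-math.log(2) / half_life_y * t_years)   # source decay
    v_oc = v_per_W * q * (1 - te_degradation_per_y) ** t_years  # TE aging
    return v_load * (v_oc - v_load) / r_ohm

print(round(rtg_power_W(0), 1), round(rtg_power_W(14), 1))  # BOM vs. ~end of mission
```

The two exponential terms reproduce the abstract's point that performance loss is attributable primarily to decay of the radioactive source and secondarily to deterioration of the thermoelectric material.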
Influencing Trust for Human-Automation Collaborative Scheduling of Multiple Unmanned Vehicles.
Clare, Andrew S; Cummings, Mary L; Repenning, Nelson P
2015-11-01
We examined the impact of priming on operator trust and system performance when supervising a decentralized network of heterogeneous unmanned vehicles (UVs). Advances in autonomy have enabled a future vision of single-operator control of multiple heterogeneous UVs. Real-time scheduling for multiple UVs in uncertain environments requires the computational ability of optimization algorithms combined with the judgment and adaptability of human supervisors. Because of system and environmental uncertainty, appropriate operator trust will be instrumental to maintain high system performance and prevent cognitive overload. Three groups of operators experienced different levels of trust priming prior to conducting simulated missions in an existing, multiple-UV simulation environment. Participants who play computer and video games frequently were found to have a higher propensity to overtrust automation. By priming gamers to lower their initial trust to a more appropriate level, system performance was improved by 10% as compared to gamers who were primed to have higher trust in the automation. Priming was successful at adjusting the operator's initial and dynamic trust in the automated scheduling algorithm, which had a substantial impact on system performance. These results have important implications for personnel selection and training for futuristic multi-UV systems under human supervision. Although gamers may bring valuable skills, they may also be potentially prone to automation bias. Priming during training and regular priming throughout missions may be one potential method for overcoming this propensity to overtrust automation. © 2015, Human Factors and Ergonomics Society.
Ren, Guo-Ping; Yan, Jia-Qing; Yu, Zhi-Xin; Wang, Dan; Li, Xiao-Nan; Mei, Shan-Shan; Dai, Jin-Dong; Li, Xiao-Li; Li, Yun-Lin; Wang, Xiao-Fei; Yang, Xiao-Feng
2018-02-01
High frequency oscillations (HFOs) are considered as biomarker for epileptogenicity. Reliable automation of HFOs detection is necessary for rapid and objective analysis, and is determined by accurate computation of the baseline. Although most existing automated detectors measure baseline accurately in channels with rare HFOs, they lose accuracy in channels with frequent HFOs. Here, we proposed a novel algorithm using the maximum distributed peak points method to improve baseline determination accuracy in channels with wide HFOs activity ranges and calculate a dynamic baseline. Interictal ripples (80-200 Hz), fast ripples (FRs, 200-500 Hz) and baselines in intracerebral EEGs from seven patients with intractable epilepsy were identified by experienced reviewers and by our computer-automated program, and the results were compared. We also compared the performance of our detector to four well-known detectors integrated in RIPPLELAB. The sensitivity and specificity of our detector were, respectively, 71% and 75% for ripples and 66% and 84% for FRs. Spearman's rank correlation coefficient comparing automated and manual detection was [Formula: see text] for ripples and [Formula: see text] for FRs ([Formula: see text]). In comparison to other detectors, our detector had a relatively higher sensitivity and specificity. In conclusion, our automated detector is able to accurately calculate a dynamic iEEG baseline in different HFO activity channels using the maximum distributed peak points method, resulting in higher sensitivity and specificity than other available HFO detectors.
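The baseline-then-threshold structure of such detectors can be sketched as follows. This is a hedged stand-in: the paper's maximum distributed peak points method is not reproduced; instead the baseline is simply the median rectified amplitude, and events are runs where a smoothed envelope exceeds a multiple of it. Sampling rate, thresholds, and the synthetic "ripple" burst are invented.

```python
import numpy as np

def detect_events(x, fs, k=5.0, win_ms=10):
    """Detect high-amplitude oscillatory events as runs where a
    moving-average envelope exceeds k times a baseline estimate."""
    w = max(1, int(fs * win_ms / 1000))
    env = np.convolve(np.abs(x), np.ones(w) / w, mode='same')
    baseline = np.median(env)                 # stand-in baseline estimate
    above = np.concatenate(([False], env > k * baseline, [False]))
    edges = np.flatnonzero(np.diff(above.astype(int)))
    return list(zip(edges[::2], edges[1::2]))  # (start, end) sample pairs

fs = 2000
t = np.arange(0, 1, 1 / fs)
x = 0.05 * np.random.default_rng(0).standard_normal(t.size)
x[800:900] += np.sin(2 * np.pi * 120 * t[800:900])  # 120 Hz "ripple" burst
events = detect_events(x, fs)
print(len(events))
```

The paper's contribution sits exactly in the `baseline` line: a fixed median misbehaves when HFOs are frequent enough to inflate it, which is why a dynamic, HFO-aware baseline improves sensitivity and specificity.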
Using computers for planning and evaluating nursing in the health care services.
Emuziene, Vilma
2009-01-01
This paper describes how nurses' attitudes toward, use of, and motivation for computer use are significantly influenced by their area of nursing/health care service. Today most nurses traditionally document patient information in a medical record using pen and paper. Most nursing administrators not currently involved with computer applications in their settings are interested in exploring whether technology could help them with the day-to-day and long-range tasks of planning and evaluating nursing services. The results of this investigation showed that the responding nurses, as specialists and in nursing informatics, perform their work well: they had a "positive" attitude towards computers and "good" or "average" computer skills. The nurses' overall computer attitude was influenced by age, sex, and professional qualification. Younger nurses acquire informatics skills while in nursing school and are more accepting of computer advancements. Knowledge about computers differs significantly between nurses without any computer training and those who have training and use a computer once a week or every day. In the health care services, computers and automated data systems are often used for statistical information (visit and patient information) and billing information. In nursing, automated data systems are often used for statistical information, billing information, vaccination information, patient assessment, and patient classification.
Office Computers: Ergonomic Considerations.
ERIC Educational Resources Information Center
Ganus, Susannah
1984-01-01
Each new report of the office automation market indicates technology is overrunning the office. The impacts of this technology are described and some ways to manage and physically "soften" the change to a computer-based office environment are suggested. (Author/MLW)
ESTIMATION OF PHYSIOCHEMICAL PROPERTIES OF ORGANIC COMPOUNDS BY SPARC
The computer program SPARC (SPARC Performs Automated Reasoning in Chemistry) has been under development for several years to estimate physical properties and chemical reactivity parameters of organic compounds strictly from molecular structure. SPARC uses computational algorithms...
NASA Tech Briefs, August 2000. Volume 24, No. 8
NASA Technical Reports Server (NTRS)
2000-01-01
Topics include: Simulation/Virtual Reality; Test and Measurement; Computer-Aided Design and Engineering; Electronic Components and Circuits; Electronic Systems; Physical Sciences; Materials; Computer Programs; Mechanics; Machinery/Automation; Manufacturing/Fabrication; Mathematics and Information Sciences; Medical Design.
IMPROVED BIOMASS UTILIZATION THROUGH REMOTE FLOW SENSING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Washington University- St. Louis: Muthanna Al-Dahhan
The growth of the livestock industry provides a valuable source of affordable, sustainable, and renewable bioenergy, while also requiring the safe disposal of the large quantities of animal wastes (manure) generated at dairy, swine, and poultry farms. If these biomass resources are mishandled and underutilized, major environmental problems will be created, such as surface and ground water contamination, odors, dust, ammonia leaching, and methane emission. Anaerobic digestion of animal wastes, in which microorganisms break down organic materials in the absence of oxygen, is one of the most promising waste treatment technologies. This process produces biogas typically containing approximately 65% methane and approximately 35% carbon dioxide. The production of biogas through anaerobic digestion from animal wastes, landfills, and municipal waste water treatment plants represents a large source of renewable and sustainable bio-fuel. Such bio-fuel can be combusted directly, used in internal combustion engines, converted into methanol, or partially oxidized to produce synthesis gas (a mixture of hydrogen and carbon monoxide) that can be converted to clean liquid fuels and chemicals via Fischer-Tropsch synthesis. Different design and mixing configurations of anaerobic digesters for treating cow manure have been utilized commercially and/or tested on a laboratory scale. These digesters include mechanically mixed, gas recirculation mixed, and slurry recirculation mixed designs, as well as covered lagoon digesters. Mixing is an important parameter for successful performance of anaerobic digesters. It enhances substrate contact with the microbial community; improves pH, temperature and substrate/microorganism uniformity; prevents stratification and scum accumulation; facilitates the removal of biogas from the digester; reduces or eliminates the formation of inactive zones (dead zones); prevents settling of biomass and inert solids; and aids in particle size reduction.
Unfortunately, information and findings in the literature on the effect of mixing on anaerobic digestion are contradictory. One reason is the lack of measurement techniques for opaque systems such as digesters. Better understanding of the mixing and hydrodynamics of digesters will result in appropriate design, configuration selection, scale-up, and performance, which will ultimately enable avoiding digester failures. Accordingly, this project sought to advance the fundamental knowledge and understanding of the design, scale up, operation, and performance of cow manure anaerobic digesters with high solids loading. The project systematically studied parameters affecting cow manure anaerobic digestion performance, in different configurations and sizes by implementing computer automated radioactive particle tracking (CARPT), computed tomography (CT), and computational fluid dynamics (CFD), and by developing novel multiple-particle CARPT (MP-CARPT) and dual source CT (DSCT) techniques. The accomplishments of the project were achieved in a collaborative effort among Washington University, the Oak Ridge National Laboratory, and the Iowa Energy Center teams. The following investigations and achievements were accomplished: Systematic studies of anaerobic digesters performance and kinetics using various configurations, modes of mixing, and scales (laboratory, pilot plant, and commercial sizes) were conducted and are discussed in Chapter 2. It was found that mixing significantly affected the performance of the pilot plant scale digester (approximately 97 liters). The detailed mixing and hydrodynamics were investigated using computer automated radioactive particle tracking (CARPT) techniques, and are discussed in Chapter 3. A novel multiple particle tracking technique (MP-CARPT) that can track simultaneously up to 8 particles was developed, tested, validated, and implemented.
Phase distribution was investigated using gamma ray computed tomography (CT) techniques, which are discussed in Chapter 4. A novel dual source CT (DSCT) technique was developed to measure the phase distribution of dynamic three phase systems such as digesters with high solids loading and other types of gas-liquid-solid fluidization systems. Evaluation and validation of the computational fluid dynamics (CFD) models and closures were conducted to model and simulate the hydrodynamics and mixing intensity of the anaerobic digesters (Chapter 5). It is strongly recommended that additional studies be conducted, both on hydrodynamics and performance, in large scale digesters. The studies should use advanced non-invasive measurement techniques, including the developed novel measurement techniques, to further understand their design, scale-up, performance, and operation to avoid any digester failure. The final goal is a system ready to be used by farmers on site for bioenergy production and for animal/farm waste treatment.
Generative Representations for Computer-Automated Evolutionary Design
NASA Technical Reports Server (NTRS)
Hornby, Gregory S.
2006-01-01
With the increasing computational power of computers, software design systems are progressing from being tools for architects and designers to express their ideas to tools capable of creating designs under human guidance. One of the main limitations for these computer-automated design systems is the representation with which they encode designs. If the representation cannot encode a certain design, then the design system cannot produce it. To be able to produce new types of designs, and not just optimize pre-defined parameterizations, evolutionary design systems must use generative representations. Generative representations are assembly procedures, or algorithms, for constructing a design thereby allowing for truly novel design solutions to be encoded. In addition, by enabling modularity, regularity and hierarchy, the level of sophistication that can be evolved is increased. We demonstrate the advantages of generative representations on two different design domains: the evolution of spacecraft antennas and the evolution of 3D objects.
Evolutionary Computation for the Identification of Emergent Behavior in Autonomous Systems
NASA Technical Reports Server (NTRS)
Terrile, Richard J.; Guillaume, Alexandre
2009-01-01
Over the past several years the Center for Evolutionary Computation and Automated Design at the Jet Propulsion Laboratory has developed a technique based on Evolutionary Computational Methods (ECM) that allows for the automated optimization of complex computationally modeled systems. An important application of this technique is for the identification of emergent behaviors in autonomous systems. Mobility platforms such as rovers or airborne vehicles are now being designed with autonomous mission controllers that can find trajectories over a solution space that is larger than can reasonably be tested. It is critical to identify control behaviors that are not predicted and can have surprising results (both good and bad). These emergent behaviors need to be identified, characterized and either incorporated into or isolated from the acceptable range of control characteristics. We use cluster analysis of automatically retrieved solutions to identify isolated populations of solutions with divergent behaviors.
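The final step described above, cluster analysis of automatically retrieved solutions to isolate divergent behaviors, can be sketched as follows. This is a hypothetical illustration, not the Center's pipeline: behavior vectors are invented two-dimensional features, and a tiny k-means with deterministic initialization stands in for whatever clustering method was actually used.

```python
import numpy as np

def kmeans(X, k=2, iters=20):
    """Minimal k-means: deterministic initialization from evenly
    spaced samples, then alternate assignment and centroid update."""
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].astype(float)
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        lbl = d.argmin(1)
        centers = np.array([X[lbl == j].mean(0) if (lbl == j).any()
                            else centers[j] for j in range(k)])
    return lbl

# 95 "nominal" solution behaviors near one operating point, 5 divergent
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (95, 2)), rng.normal(5, 0.3, (5, 2))])
lbl = kmeans(X, 2)
sizes = np.bincount(lbl, minlength=2)
print(sizes.min())   # -> 5: the small cluster holds the divergent behaviors
```

Flagging small, isolated clusters of solutions for human review is the mechanism by which surprising control behaviors, good or bad, get characterized rather than silently shipped or discarded.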