Sample records for automated computerized methodology

  1. Web-based automation of green building rating index and life cycle cost analysis

    NASA Astrophysics Data System (ADS)

    Shahzaib Khan, Jam; Zakaria, Rozana; Aminuddin, Eeydzah; IzieAdiana Abidin, Nur; Sahamir, Shaza Rina; Ahmad, Rosli; Nafis Abas, Darul

    2018-04-01

    The sudden decline in financial markets and the economic meltdown have slowed adoption and lowered investor interest in green-certified buildings because of their higher initial costs. It is therefore essential to draw investors' attention toward further development of green buildings through automated tools for construction projects. However, a historical dearth of automation in green building rating tools marks an essential gap, motivating the development of an automated analog computerized programming tool. This paper presents proposed research aimed at developing an integrated, web-based, automated analog computerized program that applies a green building rating assessment tool, green technology, and life cycle cost analysis. It also aims to identify the variables of MyCrest and LCC to be integrated and developed into a framework and then transformed into automated analog computerized programming. A mixed qualitative and quantitative survey methodology outlines the plan for carrying the MyCrest-LCC integration to an automated level. In this study, the preliminary literature review builds a better understanding of the integration of Green Building Rating Tools (GBRT) with LCC. The outcome of this research paves the way for future researchers to integrate other efficient tools and parameters that contribute toward green buildings and future agendas.
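
    The record above hinges on coupling a green building rating tool with life cycle cost (LCC) analysis. As a point of reference only, the sketch below shows the discounted-cost arithmetic such a web-based tool would automate; the function name, cost figures, and discount rate are hypothetical and are not taken from the MyCrest work.

    ```python
    # Minimal sketch of a life cycle cost (LCC) calculation of the kind such a
    # web-based tool would automate. All names and figures are hypothetical.

    def life_cycle_cost(initial_cost, annual_costs, discount_rate, salvage_value=0.0):
        """Net present value of all costs over the study period.

        annual_costs: list of yearly operation/maintenance/energy costs (index 0 = year 1).
        The salvage value is credited at the end of the period.
        """
        years = len(annual_costs)
        npv = initial_cost
        for t, cost in enumerate(annual_costs, start=1):
            npv += cost / (1.0 + discount_rate) ** t
        npv -= salvage_value / (1.0 + discount_rate) ** years
        return npv

    # Hypothetical comparison of a conventional and a green-certified design.
    conventional = life_cycle_cost(1_000_000, [80_000] * 30, 0.05)
    green = life_cycle_cost(1_150_000, [55_000] * 30, 0.05)
    print(f"conventional LCC: {conventional:,.0f}")
    print(f"green LCC:        {green:,.0f}")
    ```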

  2. 45 CFR 310.40 - What requirements apply for accessing systems and records for monitoring Computerized Tribal IV-D...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... records for monitoring Computerized Tribal IV-D Systems and Office Automation? 310.40 Section 310.40... COMPUTERIZED TRIBAL IV-D SYSTEMS AND OFFICE AUTOMATION Accountability and Monitoring Procedures for... monitoring Computerized Tribal IV-D Systems and Office Automation? In accordance with Part 95 of this title...

  3. 45 CFR 310.5 - What options are available for Computerized Tribal IV-D Systems and office automation?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... to conduct automated data processing and recordkeeping activities through Office Automation... IV-D Systems and office automation? 310.5 Section 310.5 Public Welfare Regulations Relating to Public... AUTOMATION Requirements for Computerized Tribal IV-D Systems and Office Automation § 310.5 What options are...

  4. 45 CFR 310.20 - What are the conditions for funding the installation, operation, maintenance and enhancement of...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... installation, operation, maintenance and enhancement of Computerized Tribal IV-D Systems and Office Automation... HEALTH AND HUMAN SERVICES COMPUTERIZED TRIBAL IV-D SYSTEMS AND OFFICE AUTOMATION Funding for Computerized Tribal IV-D Systems and Office Automation § 310.20 What are the conditions for funding the installation...

  5. Automated control of hierarchical systems using value-driven methods

    NASA Technical Reports Server (NTRS)

    Pugh, George E.; Burke, Thomas E.

    1990-01-01

    An introduction is given to the value-driven methodology, which has been successfully applied to solve a variety of difficult decision, control, and optimization problems. Many real-world decision processes (e.g., those encountered in scheduling, allocation, and command and control) involve a hierarchy of complex planning considerations. For such problems it is virtually impossible to define a fixed set of rules that will operate satisfactorily over the full range of probable contingencies. Decision Science Applications' value-driven methodology offers a systematic way of automating the intuitive, common-sense approach used by human planners. The inherent responsiveness of value-driven systems to user-controlled priorities makes them particularly suitable for semi-automated applications in which the user must remain in command of the system's operation. Three examples of the practical application of the approach in the automation of hierarchical decision processes are discussed: the TAC Brawler air-to-air combat simulation is a four-level computerized hierarchy; the autonomous underwater vehicle mission planning system is a three-level control system; and the Space Station Freedom electrical power control and scheduling system is designed as a two-level hierarchy. The methodology is compared with rule-based systems and with other more widely known optimization techniques.
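
    To make the value-driven idea concrete, the following heavily simplified sketch scores candidate actions at each level of a hierarchy with a weighted value function whose weights stand in for user-controlled priorities. It is an illustrative assumption, not Decision Science Applications' actual methodology, and all names and numbers are invented.

    ```python
    # Heavily simplified sketch of a value-driven hierarchy: each level scores its
    # candidate actions with a value function whose weights encode user-controlled
    # priorities, and the highest-value action is chosen. Everything below is an
    # illustrative assumption, not the DSA methodology itself.
    def choose(candidates, priorities):
        """candidates: list of (name, {criterion: score}); priorities: {criterion: weight}."""
        def value(scores):
            return sum(priorities.get(k, 0.0) * v for k, v in scores.items())
        return max(candidates, key=lambda c: value(c[1]))

    # Hypothetical two-level hierarchy: pick a mission, then pick a route for it.
    missions = [("survey_sector_A", {"coverage": 0.9, "risk": -0.3}),
                ("survey_sector_B", {"coverage": 0.6, "risk": -0.1})]
    mission = choose(missions, {"coverage": 1.0, "risk": 2.0})   # user weights risk avoidance highly

    routes = [("direct", {"time": -1.0, "risk": -0.5}),
              ("coastal", {"time": -1.4, "risk": -0.2})]
    route = choose(routes, {"time": 0.5, "risk": 2.0})
    print(mission[0], "via", route[0])
    ```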

  6. 45 CFR 310.15 - What are the safeguards and processes that comprehensive Tribal IV-D agencies must have in place...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... IV-D Systems and Office Automation? 310.15 Section 310.15 Public Welfare Regulations Relating to... AND OFFICE AUTOMATION Requirements for Computerized Tribal IV-D Systems and Office Automation § 310.15... ensure the security and privacy of Computerized Tribal IV-D Systems and Office Automation? (a...

  7. 45 CFR 310.35 - Under what circumstances would emergency FFP be available for Computerized Tribal IV-D Systems?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 2 2010-10-01 2010-10-01 false Under what circumstances would emergency FFP be... AND OFFICE AUTOMATION Funding for Computerized Tribal IV-D Systems and Office Automation § 310.35 Under what circumstances would emergency FFP be available for Computerized Tribal IV-D Systems? (a...

  8. 45 CFR 310.30 - Under what circumstances would FFP be suspended or disallowed in the costs of Computerized Tribal...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 2 2010-10-01 2010-10-01 false Under what circumstances would FFP be suspended or... SYSTEMS AND OFFICE AUTOMATION Funding for Computerized Tribal IV-D Systems and Office Automation § 310.30 Under what circumstances would FFP be suspended or disallowed in the costs of Computerized Tribal IV-D...

  9. A Reproducible Computerized Method for Quantitation of Capillary Density using Nailfold Capillaroscopy.

    PubMed

    Cheng, Cynthia; Lee, Chadd W; Daskalakis, Constantine

    2015-10-27

    Capillaroscopy is a non-invasive, efficient, relatively inexpensive and easy-to-learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient's microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer-based algorithm for quantitating capillaroscopy data.(1) This paper describes a new, simpler, reliable, standardized capillary counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, promoting user fatigue and subjectivity of the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique.

  10. A Reproducible Computerized Method for Quantitation of Capillary Density using Nailfold Capillaroscopy

    PubMed Central

    Daskalakis, Constantine

    2015-01-01

    Capillaroscopy is a non-invasive, efficient, relatively inexpensive and easy-to-learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient’s microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer-based algorithm for quantitating capillaroscopy data.1 This paper describes a new, simpler, reliable, standardized capillary counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, promoting user fatigue and subjectivity of the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique. PMID:26554744
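
    As an illustration of what a computerized capillary-density count involves, the sketch below thresholds a grayscale nailfold image, labels connected dark regions, and reports capillaries per millimeter. The threshold, minimum region size, and mm-per-pixel scale are hypothetical, and the published algorithm's exact steps may differ.

    ```python
    # Minimal sketch of computerized capillary-density quantitation in the spirit
    # of the records above: segment dark capillary loops and report a count per
    # millimeter of nailfold. Thresholds, sizes, and the mm-per-pixel scale are
    # hypothetical choices for illustration only.
    import numpy as np
    from scipy import ndimage

    def capillary_density(gray_image, mm_per_pixel, threshold=0.45, min_area=20):
        """gray_image: 2-D float array in [0, 1]; returns capillaries per mm."""
        mask = gray_image < threshold                        # capillaries appear dark
        mask = ndimage.binary_opening(mask, iterations=1)    # remove speckle
        labels, n_objects = ndimage.label(mask)
        sizes = ndimage.sum(mask, labels, range(1, n_objects + 1))
        n_capillaries = int(np.sum(sizes >= min_area))
        field_width_mm = gray_image.shape[1] * mm_per_pixel
        return n_capillaries / field_width_mm

    # Example on synthetic data (a real image would be loaded from file).
    rng = np.random.default_rng(0)
    img = rng.uniform(0.5, 1.0, size=(240, 320))
    img[100:140, 40:44] = 0.2    # fake capillary loop
    img[100:140, 90:94] = 0.2
    print(f"{capillary_density(img, mm_per_pixel=0.005):.1f} capillaries/mm")
    ```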

  11. Resources for Improving Computerized Learning Environments.

    ERIC Educational Resources Information Center

    Yeaman, Andrew R. J.

    1989-01-01

    Presents an annotated review of human factors literature that discusses computerized environments. Topics discussed include the application of office automation practices to educational environments; video display terminal (VDT) workstations; health and safety hazards; planning educational facilities; ergonomics in computerized offices; and…

  12. Manual versus Automated Narrative Analysis of Agrammatic Production Patterns: The Northwestern Narrative Language Analysis and Computerized Language Analysis

    ERIC Educational Resources Information Center

    Hsu, Chien-Ju; Thompson, Cynthia K.

    2018-01-01

    Purpose: The purpose of this study is to compare the outcomes of the manually coded Northwestern Narrative Language Analysis (NNLA) system, which was developed for characterizing agrammatic production patterns, and the automated Computerized Language Analysis (CLAN) system, which has recently been adopted to analyze speech samples of individuals…

  13. 45 CFR 310.25 - What conditions apply to acquisitions of Computerized Tribal IV-D Systems?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... FAMILIES, DEPARTMENT OF HEALTH AND HUMAN SERVICES COMPUTERIZED TRIBAL IV-D SYSTEMS AND OFFICE AUTOMATION... Acquisition Threshold; (c) Software and ownership rights. (1) All procurement and contract instruments must... Computerized Tribal IV-D System software or enhancements thereof and all associated documentation designed...

  14. Records Management Handbook; Source Data Automation Equipment Guide.

    ERIC Educational Resources Information Center

    National Archives and Records Service (GSA), Washington, DC. Office of Records Management.

    A detailed guide to selecting appropriate source data automation equipment is presented. Source data automation equipment is used to prepare data for electronic data processing or computerized recordkeeping. The guide contains specifications, performance data, costs, and pictures of the major types of machines used in source data automation.…

  15. 45 CFR 310.0 - What does this part cover?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... COMPUTERIZED TRIBAL IV-D SYSTEMS AND OFFICE AUTOMATION General Provisions § 310.0 What does this part cover... and Office Automation including: (a) The automated systems options for comprehensive Tribal IV-D... and Office Automation in § 310.15 of this part; (d) The conditions for funding the installation...

  16. Evaluation of the Salt Lake City Computerized Rider Information System

    DOT National Transportation Integrated Search

    1985-11-01

    The Utah Transit Authority (UTA) Computerized Rider Information System (CRIS) project involved the installation of an automated telephone service to quickly provide bus stop-specific schedule and service information to residents throughout the Author...

  17. 45 CFR 310.15 - What are the safeguards and processes that comprehensive Tribal IV-D agencies must have in place...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... comprehensive Tribal IV-D agencies must have in place to ensure the security and privacy of Computerized Tribal... ensure the security and privacy of Computerized Tribal IV-D Systems and Office Automation? (a..., accuracy, completeness, access to, and use of data in the Computerized Tribal IV-D System and Office...

  18. Testing primates with joystick-based automated apparatus - Lessons from the Language Research Center's Computerized Test System

    NASA Technical Reports Server (NTRS)

    Washburn, David A.; Rumbaugh, Duane M.

    1992-01-01

    Nonhuman primates provide useful models for studying a variety of medical, biological, and behavioral topics. Four years of joystick-based automated testing of monkeys using the Language Research Center's Computerized Test System (LRC-CTS) are examined to derive hints and principles for comparable testing with other species - including humans. The results of multiple parametric studies are reviewed, and reliability data are presented to reveal the surprises and pitfalls associated with video-task testing of performance.

  19. The Human Response to Library Automation.

    ERIC Educational Resources Information Center

    Kirkland, Janice, Ed.

    1989-01-01

    Eleven articles discuss the response of library users and personnel to automation. Topics covered include computerized reference services, online public access catalogs, paraprofessional staff perceptions of technology, organizational and managerial impacts of automation, workshops on new technology, gender differences in motivation to manage and…

  20. A methodology for automated CPA extraction using liver biopsy image analysis and machine learning techniques.

    PubMed

    Tsipouras, Markos G; Giannakeas, Nikolaos; Tzallas, Alexandros T; Tsianou, Zoe E; Manousou, Pinelopi; Hall, Andrew; Tsoulos, Ioannis; Tsianos, Epameinondas

    2017-03-01

    Collagen proportional area (CPA) extraction in liver biopsy images provides the degree of fibrosis expansion in liver tissue, which is the most characteristic histological alteration in hepatitis C virus (HCV). Assessment of the fibrotic tissue is currently based on semiquantitative staging scores such as Ishak and Metavir. Since its introduction as a fibrotic tissue assessment technique, CPA calculation based on image analysis techniques has proven to be more accurate than semiquantitative scores. However, CPA has yet to reach everyday clinical practice, since the lack of standardized and robust methods for computerized image analysis for CPA assessment has proven to be a major limitation. The current work introduces a three-stage fully automated methodology for CPA extraction based on machine learning techniques. Specifically, clustering algorithms have been employed for background-tissue separation, as well as for fibrosis detection in liver tissue regions, in the first and the third stage of the methodology, respectively. Due to the existence of several types of tissue regions in the image (such as blood clots, muscle tissue, structural collagen, etc.), classification algorithms have been employed to identify liver tissue regions and exclude all other non-liver tissue regions from CPA computation. For the evaluation of the methodology, 79 liver biopsy images have been employed, obtaining 1.31% mean absolute CPA error, with 0.923 concordance correlation coefficient. The proposed methodology is designed to (i) avoid manual threshold-based and region selection processes, widely used in similar approaches presented in the literature, and (ii) minimize CPA calculation time. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
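
    The following sketch illustrates the clustering flavor of CPA computation described above: pixels are clustered into background versus tissue and then into collagen versus other tissue, and CPA is the collagen share of the tissue area. The color assumptions and the two-cluster simplification are hypothetical; the published method uses additional classification stages to exclude non-liver regions.

    ```python
    # Minimal sketch of clustering-based CPA computation: cluster pixels into
    # background vs. tissue, then into collagen vs. other tissue, and report the
    # collagen proportional area. Cluster/color assumptions are hypothetical.
    import numpy as np
    from sklearn.cluster import KMeans

    def collagen_proportional_area(rgb_image):
        """rgb_image: H x W x 3 float array in [0, 1]; returns CPA as a percentage."""
        pixels = rgb_image.reshape(-1, 3)

        # Stage 1: background/tissue separation (background assumed to be brighter).
        km1 = KMeans(n_clusters=2, n_init=10, random_state=0).fit(pixels)
        tissue_label = int(np.argmin(km1.cluster_centers_.sum(axis=1)))
        tissue = pixels[km1.labels_ == tissue_label]

        # Stage 2 (simplified): within tissue, separate collagen-stained pixels,
        # assumed here to be the redder of the two clusters.
        km2 = KMeans(n_clusters=2, n_init=10, random_state=0).fit(tissue)
        redness = km2.cluster_centers_[:, 0] - km2.cluster_centers_[:, 1]
        collagen_label = int(np.argmax(redness))
        collagen_pixels = np.sum(km2.labels_ == collagen_label)

        return 100.0 * collagen_pixels / len(tissue)

    # Usage: cpa = collagen_proportional_area(img) for an H x W x 3 array in [0, 1].
    ```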

  1. Comparison of Automated Scoring Methods for a Computerized Performance Assessment of Clinical Judgment

    ERIC Educational Resources Information Center

    Harik, Polina; Baldwin, Peter; Clauser, Brian

    2013-01-01

    Growing reliance on complex constructed response items has generated considerable interest in automated scoring solutions. Many of these solutions are described in the literature; however, relatively few studies have been published that "compare" automated scoring strategies. Here, comparisons are made among five strategies for…

  2. The Computerization of the National Library in Paris.

    ERIC Educational Resources Information Center

    Lerin, Christian; Bernard, Annick

    1986-01-01

    Describes the organization and automation plan of the Bibliotheque Nationale (Paris, France) that was begun in 1981. Highlights include the method of moving toward computerization; technical choices; the choosing procedure (pre-qualification, bench-mark test); short term and pilot operations; and preparation for the implementation of the…

  3. The Computerized Adaptive Testing System Development Project.

    ERIC Educational Resources Information Center

    McBride, James R.; Sympson, J. B.

    The Computerized Adaptive Testing (CAT) project is a joint Armed Services coordinated effort to develop and evaluate a system for automated, adaptive administration of the Armed Services Vocational Aptitude Battery (ASVAB). The CAT is a system for administering personnel tests that differs from conventional test administration in two major…

  4. Language Research Center's Computerized Test System (LRC-CTS) - Video-formatted tasks for comparative primate research

    NASA Technical Reports Server (NTRS)

    Rumbaugh, Duane M.; Washburn, David A.; Savage-Rumbaugh, E. S.; Hopkins, William D.; Richardson, W. K.

    1991-01-01

    Automation of a computerized test system for comparative primate research is shown to improve the results of learning in standard paradigms. A mediational paradigm is used to determine the degree to which criterion in the learning-set testing reflects stimulus-response associative or mediational learning. Rhesus monkeys are shown to exhibit positive transfer as the criterion levels are shifted upwards, and the effectiveness of the computerized testing system is confirmed.

  5. Microcomputer Network for Computerized Adaptive Testing (CAT). [Final Report, FY81-83].

    ERIC Educational Resources Information Center

    Quan, Baldwin; And Others

    Computerized adaptive testing (CAT) offers the opportunity to replace paper-and-pencil aptitude tests such as the Armed Services Vocational Aptitude Battery with shorter, more accurate, and more secure computer-administered tests. Its potential advantages need to be verified by experimental administration of automated tests to military recruit…

  6. Automated Computerized Analysis of Speech in Psychiatric Disorders

    PubMed Central

    Cohen, Alex S.; Elvevåg, Brita

    2014-01-01

    Purpose of Review: Disturbances in communication are a hallmark of severe mental illnesses. Recent technological advances have paved the way for objectifying communication using automated computerized linguistic and acoustic analysis. We review recent studies applying various computer-based assessments to the natural language produced by adult patients with severe mental illness. Recent Findings: Automated computerized methods afford tools with which it is possible to objectively evaluate patients in a reliable, valid and efficient manner that complements human ratings. Crucially, these measures correlate with important clinical measures. The clinical relevance of these novel metrics has been demonstrated by showing their relationship to functional outcome measures, their in vivo link to classic ‘language’ regions in the brain, and, in the case of linguistic analysis, their relationship to candidate genes for severe mental illness. Summary: Computer-based assessments of natural language afford a framework with which to measure communication disturbances in adults with SMI. Emerging evidence suggests that they can be reliable and valid, and overcome many practical limitations of more traditional assessment methods. The advancement of these technologies offers unprecedented potential for measuring and understanding some of the most crippling symptoms of some of the most debilitating illnesses known to humankind. PMID:24613984
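
    As a small illustration of the kind of automated linguistic metrics such computer-based assessments derive from transcripts, the sketch below computes a type-token ratio and mean utterance length. The reviewed studies use far richer linguistic and acoustic feature sets; this example is illustrative only.

    ```python
    # Minimal illustration of automated linguistic metrics computed from a
    # transcript: type-token ratio and mean length of utterance in words. This is
    # only a sketch, not any of the tools covered in the review above.
    import re

    def linguistic_metrics(transcript):
        """transcript: list of utterance strings from one speaker."""
        all_words = []
        lengths = []
        for utterance in transcript:
            words = re.findall(r"[a-zA-Z']+", utterance.lower())
            all_words.extend(words)
            lengths.append(len(words))
        ttr = len(set(all_words)) / len(all_words) if all_words else 0.0
        mlu = sum(lengths) / len(lengths) if lengths else 0.0
        return {"type_token_ratio": ttr, "mean_length_of_utterance": mlu}

    print(linguistic_metrics(["I went to the store", "the store was closed so I left"]))
    ```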

  7. Computerized Manufacturing Automation. Employment, Education, and the Workplace. Summary.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. Office of Technology Assessment.

    The application of programmable automation (PA) offers new opportunities to enhance and streamline manufacturing processes. Five PA technologies are examined in this report: computer-aided design, robots, numerically controlled machine tools, flexible manufacturing systems, and computer-integrated manufacturing. Each technology is in a relatively…

  8. The Impact of Computerization on Library Support Staff: A Study of Support Staff in Academic Libraries in Wisconsin.

    ERIC Educational Resources Information Center

    Palmini, Cathleen C.

    1994-01-01

    Describes a survey of Wisconsin academic library support staff that explored the effects of computerization of libraries on work and job satisfaction. Highlights include length of employment; time spent at computer terminals; training; computer background; computers as timesavers; influence of automation on effectiveness; and job frustrations.…

  9. History of a Building Automation System.

    ERIC Educational Resources Information Center

    Martin, Anthony A.

    1984-01-01

    Having successfully used computer control in the solar-heated and cooled Terraset School, the Fairfax County, VA, Public Schools are now computerizing all their facilities. This article discusses the configuration and use of a countywide control system, reasons for the project's success, and problems of facility automation. (MCG)

  10. 21 CFR 211.68 - Automatic, mechanical, and electronic equipment.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... satisfactorily, may be used in the manufacture, processing, packing, and holding of a drug product. If such... performed in connection with laboratory analysis, are eliminated by computerization or other automated... erasures, or loss shall be maintained. (c) Such automated equipment used for performance of operations...

  11. 21 CFR 211.68 - Automatic, mechanical, and electronic equipment.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... satisfactorily, may be used in the manufacture, processing, packing, and holding of a drug product. If such... performed in connection with laboratory analysis, are eliminated by computerization or other automated... erasures, or loss shall be maintained. (c) Such automated equipment used for performance of operations...

  12. Computerized Manufacturing Automation: Employment, Education, and the Workplace.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. Office of Technology Assessment.

    This report describes the technologies of programmable automation (PA) in manufacturing, their uses, and future capabilities. Following the summary and introduction, the prospects for PA are examined from several perspectives. Chapter 3 defines PA technologies, describes their developmental trends, and evaluates the potential for the integration…

  13. Information technology and medication safety: what is the benefit?

    PubMed Central

    Kaushal, R; Bates, D

    2002-01-01

    

 Medication errors occur frequently and have significant clinical and financial consequences. Several types of information technologies can be used to decrease rates of medication errors. Computerized physician order entry with decision support significantly reduces serious inpatient medication error rates in adults. Other available information technologies that may prove effective for inpatients include computerized medication administration records, robots, automated pharmacy systems, bar coding, "smart" intravenous devices, and computerized discharge prescriptions and instructions. In outpatients, computerization of prescribing and patient oriented approaches such as personalized web pages and delivery of web based information may be important. Public and private mandates for information technology interventions are growing, but further development, application, evaluation, and dissemination are required. PMID:12486992

  14. Computers in Manufacturing.

    ERIC Educational Resources Information Center

    Hudson, C. A.

    1982-01-01

    Advances in factory computerization (computer-aided design and computer-aided manufacturing) are reviewed, including discussions of robotics, human factors engineering, and the sociological impact of automation. (JN)

  15. Impact of Computerized Student Information System.

    ERIC Educational Resources Information Center

    San Diego Community Coll. District, CA. Research Office.

    A two-part study was conducted by the San Diego Community College District to assess the post-automation impact of the Student Information System (SIS) on the cost of providing student services. The study first determined the service areas most affected by the SIS and then assessed the savings potential of automation by: (1) interviewing personnel…

  16. Automated Simultaneous Assembly of Multistage Testlets for a High-Stakes Licensing Examination

    ERIC Educational Resources Information Center

    Breithaupt, Krista; Hare, Donovan R.

    2007-01-01

    Many challenges exist for high-stakes testing programs offering continuous computerized administration. The automated assembly of test questions to exactly meet content and other requirements, provide uniformity, and control item exposure can be modeled and solved by mixed-integer programming (MIP) methods. A case study of the computerized…
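
    The sketch below gives a toy mixed-integer-programming formulation of automated test assembly of the kind the record refers to: maximize item information subject to a fixed test length and per-content-area quotas, solved here with the PuLP library. Item data and constraints are invented; an operational model would also handle exposure control, item overlap, and testlet structure.

    ```python
    # Toy MIP formulation of automated test assembly. Item data and quotas are
    # made up; this is an illustrative sketch, not the case study's model.
    from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpBinary, value

    items = {   # item_id: (content_area, information at the target ability level)
        "i1": ("algebra", 0.42), "i2": ("algebra", 0.35), "i3": ("geometry", 0.50),
        "i4": ("geometry", 0.28), "i5": ("stats", 0.44), "i6": ("stats", 0.31),
    }
    test_length = 4
    quota = {"algebra": 1, "geometry": 1, "stats": 1}   # minimum items per content area

    prob = LpProblem("test_assembly", LpMaximize)
    x = {i: LpVariable(f"x_{i}", cat=LpBinary) for i in items}

    prob += lpSum(items[i][1] * x[i] for i in items)        # maximize total information
    prob += lpSum(x.values()) == test_length                # fixed test length
    for area, k in quota.items():
        prob += lpSum(x[i] for i in items if items[i][0] == area) >= k

    prob.solve()
    print("selected:", [i for i in items if value(x[i]) == 1])
    ```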

  17. Automated Inspection And Precise Grinding Of Gears

    NASA Technical Reports Server (NTRS)

    Frint, Harold; Glasow, Warren

    1995-01-01

    Method of precise grinding of spiral bevel gears involves automated inspection of gear-tooth surfaces followed by adjustments of machine-tool settings to minimize differences between actual and nominal surfaces. Similar to method described in "Computerized Inspection of Gear-Tooth Surfaces" (LEW-15736). Yields gears of higher quality, with significant reduction in manufacturing and inspection time.

  18. MDCT for Computerized Volumetry of Pneumothoraces in Pediatric Patients

    PubMed Central

    Cai, Wenli; Lee, Edward Y.; Vij, Abhinav; Mahmood, Soran A.; Yoshida, Hiroyuki

    2010-01-01

    OBJECTIVE Our purpose in this study was to develop an automated computer-aided volumetry (CAV) scheme for quantifying pneumothorax in MDCT images for pediatric patients and to investigate the imaging parameters that may affect its accuracy. MATERIALS AND METHODS Fifty-eight consecutive pediatric patients (mean age 12±6 years) with pneumothorax who underwent MDCT for evaluation were collected retrospectively for this study. All cases were imaged by a 16- or 64-MDCT scanner with weight-based kilovoltage, low-dose tube current, 1.0 ~ 1.5 pitch, 0.6 ~ 5.0 mm slice thickness, and a B70f (sharp) or B31f (soft) reconstruction kernel. Sixty-three pneumothoraces ≥1 cc were visually identified in the left (n = 30) or/and right (n = 33) lungs. Each identified pneumothorax was contoured manually on an Amira workstation V4.1.1 (Mercury Computer Systems, Chelmsford, Massachusetts) by two radiologists in consensus. The computerized volumes of the pneumothoraces were determined by application of our CAV scheme. The accuracy of our automated CAV scheme was evaluated by comparison between computerized volumetry and manual volumetry, for the total volume of pneumothoraces in the left and right lungs. RESULTS The mean difference between the computerized volumetry and the manual volumetry for all 63 pneumothoraces ≥1 cc was 8.2%. For pneumothoraces ≥10 cc, ≥50 cc, and ≥200 cc, the mean differences were 7.7% (n=57), 7.3% (n=33), and 6.4% (n=13), respectively. The correlation coefficient was 0.99 between the computerized volume and the manual volume of pneumothoraces. Bland-Altman analysis showed that computerized volumetry has a mean difference of −5.1% compared to manual volumetry. For all pneumothoraces ≥10 cc, the mean differences for slice thickness ≤1.25 mm, =1.5 mm, and =5.0 mm were 6.1% (n=28), 3.5% (n=10), and 12.2% (n=19), respectively. For the two reconstruction kernels, B70f and B31f, the mean differences were 6.3% (n=42, B70f) and 11.7% (n=15, B31f), respectively. CONCLUSION Our automated CAV scheme provides an accurate measurement of pneumothorax volume in MDCT images of pediatric patients. For accurate volumetric quantification of pneumothorax in children in MDCT images by use of the automated CAV scheme, we recommended reconstruction parameters based on a slice thickness ≤1.5 mm and the reconstruction kernel B70f. PMID:21216160

  19. MDCT for computerized volumetry of pneumothoraces in pediatric patients.

    PubMed

    Cai, Wenli; Lee, Edward Y; Vij, Abhinav; Mahmood, Soran A; Yoshida, Hiroyuki

    2011-03-01

    Our purpose in this study was to develop an automated computer-aided volumetry (CAV) scheme for quantifying pneumothorax in multidetector computed tomography (MDCT) images for pediatric patients and to investigate the imaging parameters that may affect its accuracy. Fifty-eight consecutive pediatric patients (mean age 12 ± 6 years) with pneumothorax who underwent MDCT for evaluation were collected retrospectively for this study. All cases were imaged by a 16- or 64-MDCT scanner with weight-based kilovoltage, low-dose tube current, 1.0-1.5 pitch, 0.6-5.0 mm slice thickness, and a B70f (sharp) or B31f (soft) reconstruction kernel. Sixty-three pneumothoraces ≥1 mL were visually identified in the left (n = 30) and right (n = 33) lungs. Each identified pneumothorax was contoured manually on an Amira workstation V4.1.1 (Mercury Computer Systems, Chelmsford, MA) by two radiologists in consensus. The computerized volumes of the pneumothoraces were determined by application of our CAV scheme. The accuracy of our automated CAV scheme was evaluated by comparison between computerized volumetry and manual volumetry, for the total volume of pneumothoraces in the left and right lungs. The mean difference between the computerized volumetry and the manual volumetry for all 63 pneumothoraces ≥1 mL was 8.2%. For pneumothoraces ≥10 mL, ≥50 mL, and ≥200 mL, the mean differences were 7.7% (n = 57), 7.3% (n = 33), and 6.4% (n = 13), respectively. The correlation coefficient was 0.99 between the computerized volume and the manual volume of pneumothoraces. Bland-Altman analysis showed that computerized volumetry has a mean difference of -5.1% compared to manual volumetry. For all pneumothoraces ≥10 mL, the mean differences for slice thickness ≤1.25 mm, = 1.5 mm, and = 5.0 mm were 6.1% (n = 28), 3.5% (n = 10), and 12.2% (n = 19), respectively. For the two reconstruction kernels, B70f and B31f, the mean differences were 6.3% (n = 42, B70f) and 11.7% (n = 15, B31f), respectively. Our automated CAV scheme provides an accurate measurement of pneumothorax volume in MDCT images of pediatric patients. For accurate volumetric quantification of pneumothorax in children in MDCT images by use of the automated CAV scheme, we recommended reconstruction parameters based on a slice thickness ≤1.5 mm and the reconstruction kernel B70f. Copyright © 2011 AUR. Published by Elsevier Inc. All rights reserved.
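
    The volumetry arithmetic behind a comparison like the one above is straightforward once a segmentation mask exists: multiply the voxel count by the voxel volume and compare against the manual reading. The sketch below assumes the (hard) segmentation step has already produced a binary mask; the voxel spacing and the manual volume are hypothetical.

    ```python
    # Minimal sketch of the volumetry arithmetic: convert a segmented binary mask
    # to millilitres using the voxel dimensions, then compare computerized and
    # manual volumes as a percent difference. The mask and numbers are synthetic.
    import numpy as np

    def mask_volume_ml(mask, spacing_mm):
        """mask: 3-D boolean array; spacing_mm: (dz, dy, dx) voxel size in mm."""
        voxel_ml = np.prod(spacing_mm) / 1000.0     # 1 mL = 1000 mm^3
        return float(mask.sum()) * voxel_ml

    def percent_difference(computerized_ml, manual_ml):
        return 100.0 * (computerized_ml - manual_ml) / manual_ml

    # Hypothetical example: a mask reconstructed at 1.5 mm slices, 0.7 mm pixels.
    mask = np.zeros((40, 512, 512), dtype=bool)
    mask[10:20, 200:260, 200:260] = True
    auto_ml = mask_volume_ml(mask, spacing_mm=(1.5, 0.7, 0.7))
    print(f"computerized volume: {auto_ml:.1f} mL")
    print(f"difference vs. a 26 mL manual reading: {percent_difference(auto_ml, 26.0):+.1f}%")
    ```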

  20. Exploring the possibility of modeling a genetic counseling guideline using agile methodology.

    PubMed

    Choi, Jeeyae

    2013-01-01

    Increased demand of genetic counseling services heightened the necessity of a computerized genetic counseling decision support system. In order to develop an effective and efficient computerized system, modeling of genetic counseling guideline is an essential step. Throughout this pilot study, Agile methodology with United Modeling Language (UML) was utilized to model a guideline. 13 tasks and 14 associated elements were extracted. Successfully constructed conceptual class and activity diagrams revealed that Agile methodology with UML was a suitable tool to modeling a genetic counseling guideline.

  1. Computers, Automation, and the Employment of Persons Who Are Blind or Visually Impaired.

    ERIC Educational Resources Information Center

    Mather, J.

    1994-01-01

    This article discusses the impact of technology on the formation of skills and the career advancement of persons who are blind or visually impaired. It concludes that dependence on technology (computerization and automation) and the mechanistic aspects of jobs may trap blind and visually impaired workers in occupations with narrow career paths…

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katipamula, Srinivas; Gowri, Krishnan; Hernandez, George

    This paper describes one such reference process that can be deployed to provide continuous, automated, condition-based maintenance management for buildings that have a building information model (BIM), a building automation system (BAS), and a computerized maintenance management system (CMMS). The process can be deployed using VOLTTRON™, an open source transactional network platform designed for distributed sensing and controls that supports both energy efficiency and grid services.

  3. An Evaluation of the Utility and Cost of Computerized Library Catalogs. Final Report.

    ERIC Educational Resources Information Center

    Dolby, J.L.; And Others

    This study analyzes the basic cost factors in the automation of library catalogs, with a separate examination of the influence of typography on the cost of printed catalogs and the use of efficient automatic error detection procedures in processing bibliographic records. The utility of automated catalogs is also studied, based on data from a…

  4. The Impact of Misspelled Words on Automated Computer Scoring: A Case Study of Scientific Explanations

    ERIC Educational Resources Information Center

    Ha, Minsu; Nehm, Ross H.

    2016-01-01

    Automated computerized scoring systems (ACSSs) are being increasingly used to analyze text in many educational settings. Nevertheless, the impact of misspelled words (MSW) on scoring accuracy remains to be investigated in many domains, particularly jargon-rich disciplines such as the life sciences. Empirical studies confirm that MSW are a…

  5. Automated Communication Tools and Computer-Based Medication Reconciliation to Decrease Hospital Discharge Medication Errors.

    PubMed

    Smith, Kenneth J; Handler, Steven M; Kapoor, Wishwa N; Martich, G Daniel; Reddy, Vivek K; Clark, Sunday

    2016-07-01

    This study sought to determine the effects of automated primary care physician (PCP) communication and patient safety tools, including computerized discharge medication reconciliation, on discharge medication errors and posthospitalization patient outcomes, using a pre-post quasi-experimental study design, in hospitalized medical patients with ≥2 comorbidities and ≥5 chronic medications, at a single center. The primary outcome was discharge medication errors, compared before and after rollout of these tools. Secondary outcomes were 30-day rehospitalization, emergency department visit, and PCP follow-up visit rates. This study found that discharge medication errors were lower post intervention (odds ratio = 0.57; 95% confidence interval = 0.44-0.74; P < .001). Clinically important errors, with the potential for serious or life-threatening harm, and 30-day patient outcomes were not significantly different between study periods. Thus, automated health system-based communication and patient safety tools, including computerized discharge medication reconciliation, decreased hospital discharge medication errors in medically complex patients. © The Author(s) 2015.
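
    The headline result above is an odds ratio with a 95% confidence interval. For readers unfamiliar with the computation, the sketch below derives an odds ratio and a Wald interval from a 2 x 2 table; the counts are hypothetical and are not the study's data.

    ```python
    # Minimal sketch of the odds-ratio arithmetic behind a pre-post comparison
    # (OR with a 95% Wald confidence interval). The 2 x 2 counts are hypothetical.
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """a/b = errors/no-errors post-intervention, c/d = errors/no-errors pre."""
        or_ = (a * d) / (b * c)
        se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
        lo = math.exp(math.log(or_) - z * se_log_or)
        hi = math.exp(math.log(or_) + z * se_log_or)
        return or_, lo, hi

    or_, lo, hi = odds_ratio_ci(120, 880, 190, 810)
    print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
    ```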

  6. Evaluation of a Computerized Clinical Information System (Micromedex).

    PubMed Central

    Lundsgaarde, H. P.; Moreshead, G. E.

    1991-01-01

    This paper summarizes data collected as part of a project designed to identify and assess the technical and organizational problems associated with the implementation and evaluation of a Computerized Clinical Information System (CCIS), Micromedex, in three U.S. Department of Veterans Affairs Medical Centers (VAMCs). The study began in 1987 as a national effort to implement decision support technologies in the Veterans Administration Decentralized Hospital Computer Program (DHCP). The specific objectives of this project were to (1) examine one particular decision support technology, (2) identify the technical and organizational barriers to the implementation of a CCIS in the VA host environment, (3) assess the possible benefits of this system to VA clinicians in terms of therapeutic decision making, and (4) develop new methods for identifying the clinical utility of a computer program designed to provide clinicians with a new information tool. The project was conducted intermittently over a three-year period at three VA medical centers chosen as implementation and evaluation test sites for Micromedex. Findings from the Kansas City Medical Center in Missouri are presented to illustrate some of the technical problems associated with the implementation of a commercial database program in the DHCP host environment, the organizational factors influencing clinical use of the system, and the methods used to evaluate its use. Data from 4581 provider encounters with the CCIS are summarized. Usage statistics are presented to illustrate the methodological possibilities for assessing the "benefits and burdens" of a computerized information system by using an automated collection of user demographics and program audit trails that allow evaluators to monitor user interactions with different segments of the database. PMID:1807583

  7. Evaluation of a Computerized Clinical Information System (Micromedex).

    PubMed

    Lundsgaarde, H P; Moreshead, G E

    1991-01-01

    This paper summarizes data collected as part of a project designed to identify and assess the technical and organizational problems associated with the implementation and evaluation of a Computerized Clinical Information System (CCIS), Micromedex, in three U.S. Department of Veterans Affairs Medical Centers (VAMCs). The study began in 1987 as a national effort to implement decision support technologies in the Veterans Administration Decentralized Hospital Computer Program (DHCP). The specific objectives of this project were to (1) examine one particular decision support technology, (2) identify the technical and organizational barriers to the implementation of a CCIS in the VA host environment, (3) assess the possible benefits of this system to VA clinicians in terms of therapeutic decision making, and (4) develop new methods for identifying the clinical utility of a computer program designed to provide clinicians with a new information tool. The project was conducted intermittently over a three-year period at three VA medical centers chosen as implementation and evaluation test sites for Micromedex. Findings from the Kansas City Medical Center in Missouri are presented to illustrate some of the technical problems associated with the implementation of a commercial database program in the DHCP host environment, the organizational factors influencing clinical use of the system, and the methods used to evaluate its use. Data from 4581 provider encounters with the CCIS are summarized. Usage statistics are presented to illustrate the methodological possibilities for assessing the "benefits and burdens" of a computerized information system by using an automated collection of user demographics and program audit trails that allow evaluators to monitor user interactions with different segments of the database.

  8. 75 FR 8508 - Computerized Tribal IV-D Systems and Office Automation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-25

    ...This rule enables Tribes and Tribal organizations currently operating comprehensive Tribal Child Support Enforcement programs under Title IV-D of the Social Security Act (the Act) to apply for and receive direct Federal funding for the costs of automated data processing. This rule addresses the Secretary's commitment to provide instructions and guidance to Tribes and Tribal organizations on requirements for applying for, and upon approval, securing Federal Financial Participation (FFP) in the costs of installing, operating, maintaining, and enhancing automated data processing systems.

  9. Innovative production technology in aircraft construction: CIAM Forming 'made by MBB' - A highly productive example

    NASA Astrophysics Data System (ADS)

    A novel production technology in aircraft construction was developed for manufacturing parts of shapes and dimensions that involve only small quantities for one machine. The process, called computerized integrated and automated manufacturing (CIAM), makes it possible to make ready-to-install sheet-metal parts for all types of aircraft. All of the system's job sequences, which include milling the flat sheet-metal parts in stacks, deburring, heat treatment, and forming under the high-pressure rubber-pad press, are automated. The CIAM production center, called CIAM Forming, fulfills the prerequisites for the cost-effective production of sheet-metal parts made of aluminum alloys, titanium, or steel. The CIAM procedure results in negligible material loss through computerizing both component-contour nesting of the sheet-metal parts and contour milling.

  10. [SmartCare: automatizing clinical guidelines].

    PubMed

    Mersmann, Stefan

    2009-10-01

    In critical care environments, important medical and economic challenges are presented by the enhancement of therapeutic quality and the reduction of therapeutic costs. For this purpose, several clinical studies have demonstrated a positive impact of the adoption of so-called clinical guidelines. Clinical guidelines represent well documented best practices in healthcare and are fundamental aspects of evidence-based medicine. However, at the bedside, such clinical guidelines remain difficult to use by clinical staff. The knowledge-based technology SmartCare allows incorporation of arbitrary computerized clinical guidelines into various medical target systems. SmartCare constitutes a clinical guideline engine because it executes one or more clinical guidelines on a specific medical device. SmartCare was initially applied for the automated control of a mechanical ventilator to assist the process of weaning from a medical device. The methodology allows further applications to be implemented effectively with other medical devices and/or with other appropriate guidelines. In this paper, we report on the methodology and the resulting versatility of such a system, as well as the clinical evaluation of SmartCare/PS and its perspectives.

  11. Automated segmentation of foveal avascular zone in fundus fluorescein angiography.

    PubMed

    Zheng, Yalin; Gandhi, Jagdeep Singh; Stangos, Alexandros N; Campa, Claudio; Broadbent, Deborah M; Harding, Simon P

    2010-07-01

    PURPOSE. To describe and evaluate the performance of a computerized automated segmentation technique for use in quantification of the foveal avascular zone (FAZ). METHODS. A computerized technique for automated segmentation of the FAZ using images from fundus fluorescein angiography (FFA) was applied to 26 transit-phase images obtained from patients with various grades of diabetic retinopathy. The area containing the FAZ zone was first extracted from the original image and smoothed by a Gaussian kernel (sigma = 1.5). An initializing contour was manually placed inside the FAZ of the smoothed image and iteratively moved by the segmentation program toward the FAZ boundary. Five tests with different initializing curves were run on each of 26 images to assess reproducibility. The accuracy of the program was also validated by comparing results obtained by the program with the FAZ boundaries manually delineated by medical retina specialists. Interobserver performance was then evaluated by comparing delineations from two of the experts. RESULTS. One-way analysis of variance indicated that the disparities between different tests were not statistically significant, signifying excellent reproducibility for the computer program. There was a statistically significant linear correlation between the results obtained by automation and manual delineations by experts. CONCLUSIONS. This automated segmentation program can produce highly reproducible results that are comparable to those made by clinical experts. It has the potential to assist in the detection and management of foveal ischemia and to be integrated into automated grading systems.
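
    As a rough analogue of the pipeline described above, the sketch below smooths a fundus image with a Gaussian kernel (sigma = 1.5) and evolves a manually placed circular contour with scikit-image's generic active_contour, then measures the enclosed area with the shoelace formula. The contour parameters are hypothetical, and the library routine is a stand-in for the authors' own segmentation program.

    ```python
    # Minimal sketch of FAZ segmentation: Gaussian smoothing, then an active
    # contour initialized inside the FAZ and evolved toward its boundary.
    # Parameter values are hypothetical; this is not the paper's algorithm.
    import numpy as np
    from skimage.filters import gaussian
    from skimage.segmentation import active_contour

    def segment_faz(gray_image, center_rc, init_radius_px=20, n_points=200):
        smoothed = gaussian(gray_image, sigma=1.5)

        # Circular initializing contour placed manually inside the FAZ.
        theta = np.linspace(0, 2 * np.pi, n_points)
        init = np.column_stack([center_rc[0] + init_radius_px * np.sin(theta),
                                center_rc[1] + init_radius_px * np.cos(theta)])

        return active_contour(smoothed, init, alpha=0.015, beta=10.0, gamma=0.001)

    def polygon_area_px(snake):
        """Shoelace formula on the closed contour; area in pixels^2."""
        r, c = snake[:, 0], snake[:, 1]
        return 0.5 * abs(np.dot(r, np.roll(c, 1)) - np.dot(c, np.roll(r, 1)))

    # Usage: snake = segment_faz(img, center_rc=(256, 256)); area = polygon_area_px(snake)
    ```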

  12. An automated system for terrain database construction

    NASA Technical Reports Server (NTRS)

    Johnson, L. F.; Fretz, R. K.; Logan, T. L.; Bryant, N. A.

    1987-01-01

    An automated Terrain Database Preparation System (TDPS) for the construction and editing of terrain databases used in computerized wargaming simulation exercises has been developed. The TDPS system operates under the TAE executive, and it integrates VICAR/IBIS image processing and Geographic Information System software with CAD/CAM data capture and editing capabilities. The terrain database includes such features as roads, rivers, vegetation, and terrain roughness.

  13. Design and validation of an automated hydrostatic weighing system.

    PubMed

    McClenaghan, B A; Rocchio, L

    1986-08-01

    The purpose of this study was to design and evaluate the validity of an automated technique to assess body density using a computerized hydrostatic weighing system. An existing hydrostatic tank was modified and interfaced with a microcomputer equipped with an analog-to-digital converter. Software was designed to input variables, control the collection of data, calculate selected measurements, and provide a summary of the results of each session. Validity of the data obtained utilizing the automated hydrostatic weighing system was estimated by: evaluating the reliability of the transducer/computer interface to measure objects of known underwater weight; comparing the data against a criterion measure; and determining inter-session subject reliability. Values obtained from the automated system were found to be highly correlated with known underwater weights (r = 0.99, SEE = 0.0060 kg). Data concurrently obtained utilizing the automated system and a manual chart recorder were also found to be highly correlated (r = 0.99, SEE = 0.0606 kg). Inter-session subject reliability was determined utilizing data collected on subjects (N = 16) tested on two occasions approximately 24 h apart. Correlations revealed high relationships between measures of underwater weight (r = 0.99, SEE = 0.1399 kg) and body density (r = 0.98, SEE = 0.00244 g·cm-3). Results indicate that a computerized hydrostatic weighing system is a valid and reliable method for determining underwater weight.
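
    Once the underwater weight is measured, the computation such a system automates is the standard hydrostatic-weighing formula: body volume from the weight difference corrected for residual lung and gastrointestinal gas volume, body density as mass over volume, and percent fat via the Siri equation. The sketch below uses hypothetical readings and a nominal water density.

    ```python
    # Minimal sketch of the standard hydrostatic-weighing computation. The
    # residual volume, water density, and example readings are hypothetical.
    def body_density(dry_weight_kg, underwater_weight_kg, water_density=0.9957,
                     residual_volume_l=1.2, gi_gas_l=0.1):
        body_volume_l = ((dry_weight_kg - underwater_weight_kg) / water_density
                         - (residual_volume_l + gi_gas_l))
        return dry_weight_kg / body_volume_l      # kg/L is numerically g/cm^3

    def siri_percent_fat(density):
        return 495.0 / density - 450.0

    db = body_density(dry_weight_kg=75.0, underwater_weight_kg=3.4)
    print(f"body density = {db:.4f} g/cm^3, %fat = {siri_percent_fat(db):.1f}")
    ```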

  14. The impact of automation on organizational changes in a community hospital clinical microbiology laboratory.

    PubMed

    Camporese, Alessandro

    2004-06-01

    The diagnosis of infectious diseases and the role of the microbiology laboratory are currently undergoing a process of change. The need for overall efficiency in providing results is now given the same importance as accuracy. This means that laboratories must be able to produce quality results in less time with the capacity to interpret the results clinically. To improve the clinical impact of microbiology results, the new challenge facing the microbiologist has become one of process management instead of pure analysis. A proper project management process designed to improve workflow, reduce analytical time, and provide the same high quality results without losing valuable time treating the patient, has become essential. Our objective was to study the impact of introducing automation and computerization into the microbiology laboratory, and the reorganization of the laboratory workflow, i.e. scheduling personnel to work shifts covering both the entire day and the entire week. In our laboratory, the introduction of automation and computerization, as well as the reorganization of personnel, thus the workflow itself, has resulted in an improvement in response time and greater efficiency in diagnostic procedures.

  15. Computerized Cognitive Rehabilitation of Attention and Executive Function in Acquired Brain Injury: A Systematic Review.

    PubMed

    Bogdanova, Yelena; Yee, Megan K; Ho, Vivian T; Cicerone, Keith D

    Comprehensive review of the use of computerized treatment as a rehabilitation tool for attention and executive function in adults (aged 18 years or older) who suffered an acquired brain injury. Systematic review of empirical research. Two reviewers independently assessed articles using the methodological quality criteria of Cicerone et al. Data extracted included sample size, diagnosis, intervention information, treatment schedule, assessment methods, and outcome measures. A literature review (PubMed, EMBASE, Ovid, Cochrane, PsychINFO, CINAHL) generated a total of 4931 publications. Twenty-eight studies using computerized cognitive interventions targeting attention and executive functions were included in this review. In 23 studies, significant improvements in attention and executive function subsequent to training were reported; in the remaining 5, promising trends were observed. Preliminary evidence suggests improvements in cognitive function following computerized rehabilitation for acquired brain injury populations including traumatic brain injury and stroke. Further studies are needed to address methodological issues (eg, small sample size, inadequate control groups) and to inform development of guidelines and standardized protocols.

  16. Moving beyond the pros and cons of automating cognitive testing in pathological aging and dementia: the case for equal opportunity.

    PubMed

    Wesnes, Keith A

    2014-01-01

    The lack of progress over the last decade in developing treatments for Alzheimer's disease has called into question the quality of the cognitive assessments used while also shifting the emphasis from treatment to prophylaxis by studying the disorder at earlier stages, even prior to the development of cognitive symptoms. This has led various groups to seek cognitive tests which are more sensitive than those currently used and which can be meaningfully administered to individuals with mild or even no cognitive impairment. Although computerized tests have long been used in this field, they have made little inroads compared with non-automated tests. This review attempts to put in perspective the relative utilities of automated and non-automated tests of cognitive function in therapeutic trials of pathological aging and the dementias. Also by a review of the automation of cognitive tests over the last 150 years, it is hoped that the notion that such procedures are novel compared with pencil-and-paper testing will be dispelled. Furthermore, data will be presented to illustrate that older individuals and patients with dementia are neither stressed nor disadvantaged when tested with appropriately developed computerized methods. An important aspect of automated testing is that it can assess all aspects of task performance, including the speed of cognitive processes, and data are presented on the advantages this can confer in clinical trials. The ultimate objectives of the review are to encourage decision making in the field to move away from the automated/non-automated dichotomy and to develop criteria pertinent to each trial against which all available procedures are evaluated. If we are to make serious progress in this area, we must use the best tools available, and the evidence suggests that automated testing has earned the right to be judged against the same criteria as non-automated tests.

  17. [Automated processing of data from the 1985 population and housing census].

    PubMed

    Cholakov, S

    1987-01-01

    The author describes the method of automated data processing used in the 1985 census of Bulgaria. He notes that the computerization of the census involves decentralization and the use of regional computing centers as well as data processing at the Central Statistical Office's National Information Computer Center. Special attention is given to problems concerning the projection and programming of census data. (SUMMARY IN ENG AND RUS)

  18. Design and evaluation of a service oriented architecture for paperless ICU tarification.

    PubMed

    Steurbaut, Kristof; Colpaert, Kirsten; Van Hoecke, Sofie; Steurbaut, Sabrina; Danneels, Chris; Decruyenaere, Johan; De Turck, Filip

    2012-06-01

    The computerization of Intensive Care Units provides an overwhelming amount of electronic data for both medical and financial analysis. However, the current tarification, which is the process to tick and count patients' procedures, is still a repetitive, time-consuming process on paper. Nurses and secretaries keep track manually of the patients' medical procedures. This paper describes the design methodology and implementation of automated tarification services. In this study we investigate if the tarification can be modeled in service oriented architecture as a composition of interacting services. Services are responsible for data collection, automatic assignment of records to physicians and application of rules. Performance is evaluated in terms of execution time, cost evaluation and return on investment based on tracking of real procedures. The services provide high flexibility in terms of maintenance, integration and rules support. It is shown that services offer a more accurate, less time-consuming and cost-effective tarification.

  19. Measuring facial expression of emotion.

    PubMed

    Wolf, Karsten

    2015-12-01

    Research into emotions has increased in recent decades, especially on the subject of recognition of emotions. However, studies of the facial expressions of emotion were compromised by technical problems with visible video analysis and electromyography in experimental settings. These have only recently been overcome. There have been new developments in the field of automated computerized facial recognition, allowing real-time identification of facial expression in social environments. This review addresses three approaches to measuring facial expression of emotion and describes their specific contributions to understanding emotion in the healthy population and in persons with mental illness. Despite recent progress, studies on human emotions have been hindered by the lack of consensus on an emotion theory suited to examining the dynamic aspects of emotion and its expression. Studying expression of emotion in patients with mental health conditions for diagnostic and therapeutic purposes will profit from theoretical and methodological progress.

  20. Eliminating the use of ticket takers.

    DOT National Transportation Integrated Search

    1998-03-01

    The Oregon Department of Transportation tested an automated means of collecting data from paving trucks as an alternative to the traditional method of "ticket taking". A computerized communication system was designed and built by Quality Design Syste...

  1. Comparison of Centralized-Manual, Centralized-Computerized, and Decentralized-Computerized Order and Management Information Models for the Turkish Air Force Logistics System.

    DTIC Science & Technology

    1986-09-01

    differentiation between the systems. This study will investigate an appropriate Order Processing and Management Information System (OP&MIS) to link base-level...methodology: 1. Reviewed the current order processing and information model of the TUAF Logistics System. (centralized-manual model) 2. Described the...RDS program's order processing and information system. (centralized-computerized model) 3. Described the order processing and information system of

  2. National Voice Response System (VRS) Implementation Plan Alternatives Study

    DOT National Transportation Integrated Search

    1979-07-01

    This study examines the alternatives available to implement a national Voice Response System (VRS) for automated preflight weather briefings and flight plan filing. Four major hardware configurations are discussed. A computerized analysis model was d...

  3. In Search of Effective Methodology for Organizational Learning: A Japanese Experience

    ERIC Educational Resources Information Center

    Tsuchiya, Shigehisa

    2011-01-01

    The author's personal journey regarding simulation and gaming started about 25 years ago when he happened to realize how powerful computerized simulation could be for organizational change. The metaphors created by computerized simulation enabled him to transform a stagnant university into a high-performance organization. Through extensive…

  4. USSR and Eastern Europe Scientific Abstracts, Cybernetics, Computers, and Automation Technology, Number 26

    DTIC Science & Technology

    1977-01-26

    Sisteme Matematicheskogo Obespecheniya YeS EVM [Applied Programs in the Software System for the Unified System of Computers], by A. Ye. Fateyev, A. I...computerized systems are most effective in large production complexes, in which the level of utilization of computers can be as high as 500,000...performance of these tasks could be furthered by the complex introduction of electronic computers in automated control systems. The creation of ASU

  5. Automated RTOP Management System

    NASA Technical Reports Server (NTRS)

    Hayes, P.

    1984-01-01

    The structure of NASA's Office of Aeronautics and Space Technology electronic information system network from 1983 to 1985 is illustrated. The RTOP automated system takes advantage of existing hardware, software, and expertise, and provides: (1) computerized cover sheet and resources forms; (2) electronic signature and transmission; (3) a data-based information system; (4) graphics; (5) intercenter communications; (6) management information; and (7) text editing. The system is coordinated with Headquarters efforts in codes R, E, and T.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This paper is actually a composite of two papers dealing with automation and computerized control of underground mining equipment. It primarily discusses drills, haulage equipment, and tunneling machines, and compares the performance and cost benefits of conventional equipment with those of the new automated methods. The companies involved are iron ore mining companies in Scandinavia. The papers also discuss equipment using air power, water power, hydraulic power, and computer power, and compare the different drill rigs for performance and cost.

  7. Computerized Management Information and Reporting Systems for Sponsored Projects.

    ERIC Educational Resources Information Center

    Rodman, John A.; Peters, Carl M.

    1980-01-01

    The effective management of the university depends on the research office providing usable, accurate, timely, and accessible information regarding sponsored programs. The utilization of automated systems to store, access, and manage information is seen as essential. (MLW)

  8. Reaching out to clinicians: implementation of a computerized alert system.

    PubMed

    Degnan, Dan; Merryfield, Dave; Hultgren, Steve

    2004-01-01

    Several published articles have identified that providing automated, computer-generated clinical alerts about potentially critical clinical situations should result in better quality of care. In 1999, the pharmacy department at a community hospital network implemented and refined a commercially available, computerized clinical alert system. This case report discusses the implementation process, gives examples of how the system is used, and describes results following implementation. The use of the clinical alert system in this hospital network resulted in improved patient safety as well as in greater efficiency and decreased costs.
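
    As an illustration of the kind of computer-generated alert such a system produces, the sketch below checks a patient's current medications against recent laboratory values. The rules and data model here are hypothetical examples, not the hospital network's actual alert logic.

```python
# Hypothetical clinical alert rules (illustrative only): flag potentially
# critical drug/lab combinations for pharmacist or physician follow-up.
def check_alerts(patient):
    """Return alert messages for one patient's current meds and latest labs."""
    alerts = []
    meds = set(patient["medications"])
    labs = patient["labs"]
    if "digoxin" in meds and labs.get("potassium", 99) < 3.5:
        alerts.append("Hypokalemia on digoxin: increased toxicity risk")
    if "warfarin" in meds and labs.get("inr", 0) > 4.0:
        alerts.append("INR > 4.0 on warfarin: review dose")
    return alerts

if __name__ == "__main__":
    patient = {"medications": ["digoxin", "lisinopril"],
               "labs": {"potassium": 3.1, "inr": 1.1}}
    for msg in check_alerts(patient):
        print("ALERT:", msg)
```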

  9. Evaluating the Validity of Computerized Content Analysis Programs for Identification of Emotional Expression in Cancer Narratives

    ERIC Educational Resources Information Center

    Bantum, Erin O'Carroll; Owen, Jason E.

    2009-01-01

    Psychological interventions provide linguistic data that are particularly useful for testing mechanisms of action and improving intervention methodologies. For this study, emotional expression in an Internet-based intervention for women with breast cancer (n = 63) was analyzed via rater coding and 2 computerized coding methods (Linguistic Inquiry…

  10. Automated image alignment and segmentation to follow progression of geographic atrophy in age-related macular degeneration.

    PubMed

    Ramsey, David J; Sunness, Janet S; Malviya, Poorva; Applegate, Carol; Hager, Gregory D; Handa, James T

    2014-07-01

    To develop a computer-based image segmentation method for standardizing the quantification of geographic atrophy (GA). The authors present an automated image segmentation method based on the fuzzy c-means clustering algorithm for the detection of GA lesions. The method is evaluated by comparing computerized segmentation against outlines of GA drawn by an expert grader for a longitudinal series of fundus autofluorescence images with paired 30° color fundus photographs for 10 patients. The automated segmentation method showed excellent agreement with an expert grader for fundus autofluorescence images, achieving a performance level of 94 ± 5% sensitivity and 98 ± 2% specificity on a per-pixel basis for the detection of GA area, but performed less well on color fundus photographs with a sensitivity of 47 ± 26% and specificity of 98 ± 2%. The segmentation algorithm identified 75 ± 16% of the GA border correctly in fundus autofluorescence images compared with just 42 ± 25% for color fundus photographs. The results of this study demonstrate a promising computerized segmentation method that may enhance the reproducibility of GA measurement and provide an objective strategy to assist an expert in the grading of images.
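
    The following is a minimal NumPy sketch of fuzzy c-means clustering applied to pixel intensities, in the spirit of the segmentation approach the abstract describes; the implementation details, parameters, and synthetic "image" are illustrative assumptions, not the authors' code.

```python
# Minimal fuzzy c-means clustering on pixel intensities (illustrative sketch).
# Pixels are softly assigned to clusters by grey level; the brighter cluster's
# memberships can then be thresholded into a lesion mask.
import numpy as np

def fuzzy_c_means(values, n_clusters=2, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Cluster 1-D feature values (e.g. pixel intensities) with fuzzy c-means."""
    rng = np.random.default_rng(seed)
    x = values.reshape(-1, 1).astype(float)            # (n_pixels, 1)
    u = rng.random((x.shape[0], n_clusters))           # random membership matrix
    u /= u.sum(axis=1, keepdims=True)
    centers = np.zeros((n_clusters, 1))
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)[:, None]  # membership-weighted means
        dist = np.abs(x - centers.T) + 1e-12            # (n_pixels, n_clusters)
        new_u = 1.0 / (dist ** (2 / (m - 1)))           # standard FCM membership update
        new_u /= new_u.sum(axis=1, keepdims=True)
        if np.max(np.abs(new_u - u)) < tol:
            u = new_u
            break
        u = new_u
    return centers.ravel(), u

if __name__ == "__main__":
    # Synthetic "image": dark background plus a brighter atrophic region.
    img = np.concatenate([np.random.normal(40, 5, 900), np.random.normal(120, 10, 100)])
    centers, memberships = fuzzy_c_means(img, n_clusters=2)
    lesion_cluster = int(np.argmax(centers))
    mask = memberships[:, lesion_cluster] > 0.5         # per-pixel lesion decision
    print("cluster centers:", centers, "lesion pixels:", mask.sum())
```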

  11. Automation of orbit determination functions for National Aeronautics and Space Administration (NASA)-supported satellite missions

    NASA Technical Reports Server (NTRS)

    Mardirossian, H.; Beri, A. C.; Doll, C. E.

    1990-01-01

    The Flight Dynamics Facility (FDF) at Goddard Space Flight Center (GSFC) provides spacecraft trajectory determination for a wide variety of National Aeronautics and Space Administration (NASA)-supported satellite missions, using the Tracking Data Relay Satellite System (TDRSS) and Ground Spaceflight and Tracking Data Network (GSTDN). To take advantage of computerized decision making processes that can be used in spacecraft navigation, the Orbit Determination Automation System (ODAS) was designed, developed, and implemented as a prototype system to automate orbit determination (OD) and orbit quality assurance (QA) functions performed by orbit operations. Based on a machine-resident generic schedule and predetermined mission-dependent QA criteria, ODAS autonomously activates an interface with the existing trajectory determination system using a batch least-squares differential correction algorithm to perform the basic OD functions. The computational parameters determined during the OD are processed to make computerized decisions regarding QA, and a controlled recovery process is activated when the criteria are not satisfied. The complete cycle is autonomous and continuous. ODAS was extensively tested for performance under conditions resembling actual operational conditions and found to be effective and reliable for extended autonomous OD. Details of the system structure and function are discussed, and test results are presented.
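
    The core of the batch least-squares differential correction step mentioned above can be illustrated with a toy Gauss-Newton estimator; the observation model and numbers below are invented for the example and are not the ODAS/FDF software.

```python
# Toy batch least-squares differential correction: iteratively correct an
# estimated state so that computed observations match tracking measurements
# in a least-squares sense (illustrative only).
import numpy as np

def computed_obs(state, times, station=(0.0, 50.0)):
    """Range from a fixed station to a target in 1-D constant-velocity motion."""
    x0, v = state
    xs, ys = station
    x = x0 + v * times
    return np.sqrt((x - xs) ** 2 + ys ** 2)

def batch_differential_correction(obs, times, state0, n_iter=10):
    """Gauss-Newton batch estimator: state += lstsq(H, obs - computed)."""
    state = np.asarray(state0, dtype=float)
    for _ in range(n_iter):
        residuals = obs - computed_obs(state, times)
        # Numerical Jacobian of the computed observations w.r.t. the state.
        H = np.zeros((len(times), len(state)))
        eps = 1e-6
        for j in range(len(state)):
            dstate = state.copy()
            dstate[j] += eps
            H[:, j] = (computed_obs(dstate, times) - computed_obs(state, times)) / eps
        correction, *_ = np.linalg.lstsq(H, residuals, rcond=None)
        state += correction
        if np.linalg.norm(correction) < 1e-9:
            break
    return state, obs - computed_obs(state, times)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    t = np.linspace(0.0, 10.0, 40)
    true_state = np.array([100.0, -7.5])                 # initial position, velocity
    measurements = computed_obs(true_state, t) + rng.normal(0, 0.1, t.size)
    est, res = batch_differential_correction(measurements, t, state0=[80.0, -5.0])
    print("estimated state:", est, "RMS residual:", np.sqrt(np.mean(res ** 2)))
```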

  12. Automation of orbit determination functions for National Aeronautics and Space Administration (NASA)-supported satellite missions

    NASA Technical Reports Server (NTRS)

    Mardirossian, H.; Heuerman, K.; Beri, A.; Samii, M. V.; Doll, C. E.

    1989-01-01

    The Flight Dynamics Facility (FDF) at Goddard Space Flight Center (GSFC) provides spacecraft trajectory determination for a wide variety of National Aeronautics and Space Administration (NASA)-supported satellite missions, using the Tracking Data Relay Satellite System (TDRSS) and Ground Spaceflight and Tracking Data Network (GSTDN). To take advantage of computerized decision making processes that can be used in spacecraft navigation, the Orbit Determination Automation System (ODAS) was designed, developed, and implemented as a prototype system to automate orbit determination (OD) and orbit quality assurance (QA) functions performed by orbit operations. Based on a machine-resident generic schedule and predetermined mission-dependent QA criteria, ODAS autonomously activates an interface with the existing trajectory determination system using a batch least-squares differential correction algorithm to perform the basic OD functions. The computational parameters determined during the OD are processed to make computerized decisions regarding QA, and a controlled recovery process isactivated when the criteria are not satisfied. The complete cycle is autonomous and continuous. ODAS was extensively tested for performance under conditions resembling actual operational conditions and found to be effective and reliable for extended autonomous OD. Details of the system structure and function are discussed, and test results are presented.

  13. Automation of scour analysis at Louisiana bridge sites : final report.

    DOT National Transportation Integrated Search

    1988-12-01

    The computerized system for the organization, analysis, and display of field collected scour data is described. This system will enhance the current manual procedure of accomplishing these tasks. The system accepts input from the user, and based on u...

  14. A candidate automated test battery for neuropsychological screening of airmen : design and preliminary validation.

    DOT National Transportation Integrated Search

    1992-02-01

    A panel of the American Medical Association convened by the Federal Aviation Administration recommended that a computerized test of cognitive function be developed that would detect significant cognitive impairments that might otherwise go unrecogniz...

  15. Automation in an Addiction Treatment Research Clinic: Computerized Contingency Management, Ecological Momentary Assessment, and a Protocol Workflow System

    PubMed Central

    Vahabzadeh, Massoud; Lin, Jia-Ling; Mezghanni, Mustapha; Epstein, David H.; Preston, Kenzie L.

    2009-01-01

    Issues: A challenge in treatment research is the necessity of adhering to protocol and regulatory strictures while maintaining flexibility to meet patients’ treatment needs and accommodate variations among protocols. Another challenge is the acquisition of large amounts of data in an occasionally hectic environment, along with provision of seamless methods for exporting, mining, and querying the data. Approach: We have automated several major functions of our outpatient treatment research clinic for studies in drug abuse and dependence. Here we describe three such specialized applications: the Automated Contingency Management (ACM) system for delivery of behavioral interventions, the Transactional Electronic Diary (TED) system for management of behavioral assessments, and the Protocol Workflow System (PWS) for computerized workflow automation and guidance of each participant’s daily clinic activities. These modules are integrated into our larger information system to enable data sharing in real time among authorized staff. Key Findings: ACM and TED have each permitted us to conduct research that was not previously possible. In addition, the time to data analysis at the end of each study is substantially shorter. With the implementation of the PWS, we have been able to manage a research clinic with an 80-patient capacity having an annual average of 18,000 patient-visits and 7,300 urine collections with a research staff of five. Finally, automated data management has considerably enhanced our ability to monitor and summarize participant-safety data for research oversight. Implications and conclusion: When developed in consultation with end users, automation in treatment-research clinics can enable more efficient operations, better communication among staff, and expansions in research methods. PMID:19320669

  16. Automated Selection Of Pictures In Sequences

    NASA Technical Reports Server (NTRS)

    Rorvig, Mark E.; Shelton, Robert O.

    1995-01-01

    Method of automated selection of film or video motion-picture frames for storage or examination developed. Beneficial in situations in which quantity of visual information available exceeds amount stored or examined by humans in reasonable amount of time, and/or necessary to reduce large number of motion-picture frames to few conveying significantly different information in manner intermediate between movie and comic book or storyboard. For example, computerized vision system monitoring industrial process programmed to sound alarm when changes in scene exceed normal limits.

  17. The CADSS design automation system. [computerized design language for small digital systems

    NASA Technical Reports Server (NTRS)

    Franke, E. A.

    1973-01-01

    This research was designed to implement and extend a previously defined design automation system for the design of small digital structures. A description is included of the higher level language developed to describe systems as a sequence of register transfer operations. The system simulator, which is used to determine if the original description is correct, is also discussed. The design automation system produces tables describing the state transitions of the system and the operation of all registers. In addition, all Boolean equations specifying system operation are minimized and converted to NAND gate structures. Suggestions for further extensions to the system are also given.

  18. Automating the Analytical Laboratories Section, Lewis Research Center, National Aeronautics and Space Administration: A feasibility study

    NASA Technical Reports Server (NTRS)

    Boyle, W. G.; Barton, G. W.

    1979-01-01

    The feasibility of computerized automation of the Analytical Laboratories Section at NASA's Lewis Research Center was considered. Since that laboratory's duties are not routine, the automation goals were set with that in mind. Four instruments were selected as the most likely automation candidates: an atomic absorption spectrophotometer, an emission spectrometer, an X-ray fluorescence spectrometer, and an X-ray diffraction unit. Two options for computer automation were described: a time-shared central computer and a system with microcomputers for each instrument connected to a central computer. A third option, presented for future planning, expands the microcomputer version. Costs and benefits for each option were considered. It was concluded that the microcomputer version best fits the goals and duties of the laboratory and that such an automated system is needed to meet the laboratory's future requirements.

  19. Economics of infection control surveillance technology: cost-effective or just cost?

    PubMed

    Furuno, Jon P; Schweizer, Marin L; McGregor, Jessina C; Perencevich, Eli N

    2008-04-01

    Previous studies have suggested that informatics tools, such as automated alert and decision support systems, may increase the efficiency and quality of infection control surveillance. However, little is known about the cost-effectiveness of these tools. We focus on 2 types of economic analyses that have utility in assessing infection control interventions (cost-effectiveness analysis and business-case analysis) and review the available literature on the economics of computerized infection control surveillance systems. Previous studies on the effectiveness of computerized infection control surveillance have been limited to assessments of whether these tools increase the sensitivity and specificity of surveillance over traditional methods. Furthermore, we identified only 2 studies that assessed the costs associated with computerized infection control surveillance. Thus, it remains unknown whether computerized infection control surveillance systems are cost-effective and whether use of these systems improves patient outcomes. The existing data are insufficient to allow for a summary conclusion on the cost-effectiveness of infection control surveillance technology. All future studies of computerized infection control surveillance systems should aim to collect outcomes and economic data to inform decision making and assist hospitals with completing business-case analyses.
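
    As a worked illustration of the two analysis types named above, the sketch below computes an incremental cost-effectiveness ratio and a simple business-case payback period; all dollar figures and detection counts are hypothetical.

```python
# Hypothetical worked example of the two economic analyses discussed above:
# an incremental cost-effectiveness ratio (ICER) for computerized versus manual
# surveillance, and a simple business-case payback calculation.
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost per additional infection detected."""
    return (cost_new - cost_old) / (effect_new - effect_old)

def payback_years(upfront_cost, annual_savings):
    """Business-case view: years until cumulative savings cover the investment."""
    return upfront_cost / annual_savings

if __name__ == "__main__":
    # Assumed: the automated system costs more per year but detects more infections.
    print("ICER:", icer(cost_new=120_000, cost_old=80_000, effect_new=220, effect_old=160),
          "dollars per additional infection detected")
    print("Payback:", payback_years(upfront_cost=150_000, annual_savings=60_000), "years")
```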

  20. USU Contracts and Grants System--Innovative Inquiry.

    ERIC Educational Resources Information Center

    Henderson, Harold C.; Eagar, Virginia L.

    1981-01-01

    In January of 1979, the Contracts and Grants Office at Utah State University implemented a computerized system to keep track of pending research proposals, as well as active grants. The automation process used is described from its conception, design, and implementation to future enhancements. (Author/MLW)

  1. SCOPE in Cataloguing.

    ERIC Educational Resources Information Center

    Tom, Ellen; Reed, Sue

    This report describes the Systematic Computerized Processing in Cataloguing system (SCOPE), an automated system for the catalog department of a university library. The system produces spine labels, pocket labels, book cards for the circulation system, catalog cards including shelf list, main entry, subject and added entry cards, statistics, an…

  2. Automation, Resource Sharing, and the Small Academic Library.

    ERIC Educational Resources Information Center

    Miller, Arthur H., Jr.

    1983-01-01

    Discussion of Illinois experiences in library cooperation and computerization (OCLC, Library Computer System, LIBRAS) describes use of library materials, benefits and drawbacks of online networking, experiences at Lake Forest College (Illinois), and six tasks recommended for small academic libraries as preparation for major changes toward…

  3. Computer-Controlled HVAC -- at Low Cost

    ERIC Educational Resources Information Center

    American School and University, 1974

    1974-01-01

    By tying into a computerized building-automation network, Schaumburg High School, Illinois, slashed its energy consumption by one-third. The remotely connected computer controls the mechanical system for the high school as well as other buildings in the community, with the cost being shared by all. (Author)

  4. Cost-effectiveness analysis of computerized ECG interpretation system in an ambulatory health care organization.

    PubMed

    Carel, R S

    1982-04-01

    The cost-effectiveness of a computerized ECG interpretation system in an ambulatory health care organization has been evaluated in comparison with a conventional (manual) system. The automated system was shown to be more cost-effective at a minimum load of 2,500 patients/month. At larger monthly loads an even greater cost-effectiveness was found, the average cost/ECG being about $2. In the manual system the cost/unit is practically independent of patient load. This is primarily due to the fact that 87% of the cost/ECG is attributable to wages and fees of highly trained personnel. In the automated system, on the other hand, the cost/ECG is heavily dependent on examinee load. This is due to the relatively large impact of equipment depreciation on fixed (and total) cost. Utilization of a computer-assisted system leads to marked reduction in cardiologists' interpretation time, substantially shorter turnaround time (of unconfirmed reports), and potential provision of simultaneous service at several remotely located "heart stations."
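
    The load dependence described above can be illustrated with a simple fixed-versus-variable cost model; the dollar figures below are assumptions chosen only to echo the abstract's roughly $2/ECG and 2,500 ECGs/month findings, not the study's actual data.

```python
# Illustrative cost model (assumed dollar figures, not taken from the study):
# the manual unit cost is dominated by per-ECG professional labor and barely
# varies with load, while the automated unit cost is dominated by fixed
# equipment depreciation spread over the monthly load.
def manual_cost_per_ecg(load, labor_per_ecg=2.60, fixed_monthly=1000.0):
    """Manual reading: most of the unit cost is wages and fees."""
    return labor_per_ecg + fixed_monthly / load

def automated_cost_per_ecg(load, fixed_monthly=4500.0, variable_per_ecg=0.20):
    """Computerized reading: depreciation and maintenance are spread over the load."""
    return variable_per_ecg + fixed_monthly / load

if __name__ == "__main__":
    for load in (500, 1000, 2500, 5000, 10000):
        m = manual_cost_per_ecg(load)
        a = automated_cost_per_ecg(load)
        verdict = "automated cheaper" if a < m else "manual cheaper"
        print(f"{load:>6} ECGs/month  manual ${m:5.2f}  automated ${a:5.2f}  {verdict}")
```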

  5. Methodology for Prototyping Increased Levels of Automation for Spacecraft Rendezvous Functions

    NASA Technical Reports Server (NTRS)

    Hart, Jeremy J.; Valasek, John

    2007-01-01

    The Crew Exploration Vehicle necessitates higher levels of automation than previous NASA vehicles, due to program requirements for automation, including Automated Rendezvous and Docking. Studies of spacecraft development often point to the locus of decision-making authority between humans and computers (i.e., automation) as a prime driver for cost, safety, and mission success. A critical component of Crew Exploration Vehicle development is therefore the determination of the correct level of automation. To identify the appropriate levels of automation and autonomy to design into a human space flight vehicle, NASA has created the Function-specific Level of Autonomy and Automation Tool. This paper develops a methodology for prototyping increased levels of automation for spacecraft rendezvous functions and uses it to evaluate, via prototyping, the accuracy of the levels of automation specified by the Function-specific Level of Autonomy and Automation Tool. Spacecraft rendezvous planning tasks are selected and then prototyped in Matlab using fuzzy logic techniques and existing Space Shuttle rendezvous trajectory algorithms.
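
    As a rough illustration of the fuzzy-logic prototyping idea (the paper's prototypes were built in Matlab with Shuttle rendezvous algorithms), the sketch below maps assumed task conditions to a recommended automation level; the membership functions, rules, and 1-5 scale are invented for the example.

```python
# Illustrative fuzzy-logic mapping from task conditions to an automation level
# (1 = fully manual, 5 = fully automated). All functions and rules are assumed.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def recommend_automation_level(time_pressure, task_complexity):
    """Fuzzy rules: high time pressure or high complexity push toward automation."""
    low_tp, high_tp = tri(time_pressure, -0.5, 0.0, 0.6), tri(time_pressure, 0.4, 1.0, 1.5)
    low_cx, high_cx = tri(task_complexity, -0.5, 0.0, 0.6), tri(task_complexity, 0.4, 1.0, 1.5)
    rules = [
        (min(low_tp, low_cx), 1.0),    # relaxed, simple task: keep the crew in control
        (min(low_tp, high_cx), 3.0),   # complex but unhurried: shared control
        (min(high_tp, low_cx), 4.0),   # time-critical, simple: mostly automated
        (min(high_tp, high_cx), 5.0),  # time-critical and complex: fully automated
    ]
    total = sum(w for w, _ in rules)
    return sum(w * level for w, level in rules) / total if total else 3.0

if __name__ == "__main__":
    print(recommend_automation_level(time_pressure=0.2, task_complexity=0.3))  # -> manual
    print(recommend_automation_level(time_pressure=0.9, task_complexity=0.8))  # -> automated
```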

  6. Teleoperated robotic sorting system

    DOEpatents

    Roos, Charles E.; Sommer, Jr., Edward J.; Parrish, Robert H.; Russell, James R.

    2008-06-24

    A method and apparatus are disclosed for classifying materials utilizing a computerized touch sensitive screen or other computerized pointing device for operator identification and electronic marking of spatial coordinates of materials to be extracted. An operator positioned at a computerized touch sensitive screen views electronic images of the mixture of materials to be sorted as they are conveyed past a sensor array which transmits sequences of images of the mixture either directly or through a computer to the touch sensitive display screen. The operator manually "touches" objects displayed on the screen to be extracted from the mixture thereby registering the spatial coordinates of the objects within the computer. The computer then tracks the registered objects as they are conveyed and directs automated devices including mechanical means such as air jets, robotic arms, or other mechanical diverters to extract the registered objects.
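
    A minimal sketch of the tracking-and-diversion step described in the patent abstract follows; the belt geometry, speeds, and function names are assumptions made for illustration, not the patented implementation.

```python
# Illustrative tracking step: an operator's touch registers an object's belt
# coordinates, and the system schedules the correct diverter (e.g. air jet) to
# fire once the belt has carried the object to the ejection line. Assumed values.
from dataclasses import dataclass

BELT_SPEED = 0.8           # metres per second (assumed)
EJECT_LINE_Y = 3.0         # distance from the camera view to the diverter bank, metres
JET_WIDTH = 0.25           # lateral coverage of one air jet, metres

@dataclass
class RegisteredObject:
    x: float               # lateral position on the belt at registration (m)
    y: float               # downstream position at registration (m)
    t_registered: float    # time of the operator's touch (s)

def schedule_ejection(obj: RegisteredObject):
    """Return (fire_time, jet_index) for a registered object."""
    travel = (EJECT_LINE_Y - obj.y) / BELT_SPEED        # time to reach the diverters
    jet_index = int(obj.x // JET_WIDTH)                 # which air jet covers obj.x
    return obj.t_registered + travel, jet_index

if __name__ == "__main__":
    touched = RegisteredObject(x=0.62, y=0.4, t_registered=12.0)
    fire_at, jet = schedule_ejection(touched)
    print(f"fire jet {jet} at t = {fire_at:.2f} s")
```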

  7. [Complex automatic data processing in multi-profile hospitals].

    PubMed

    Dovzhenko, Iu M; Panov, G D

    1990-01-01

    The computerization of data processing in multidisciplinary hospitals is a key factor in raising the quality of medical care provided to the population, intensifying the work of personnel, and improving the diagnostic and treatment process and the use of resources. Even limited experience with comprehensive computerization at the Botkin Hospital indicates that the automated system improves the quality of data processing, supports a high standard of patient examination, speeds the training of young specialists, and creates conditions for the continuing education of physicians through analysis of their own activity. In large hospitals, the most promising form of computerization is an integrated solution of administrative and clinical-diagnostic tasks built on a hospital-wide terminal network and a hospital-wide data bank.

  8. Teleoperated robotic sorting system

    DOEpatents

    Roos, Charles E.; Sommer, Edward J.; Parrish, Robert H.; Russell, James R.

    2000-01-01

    A method and apparatus are disclosed for classifying materials utilizing a computerized touch sensitive screen or other computerized pointing device for operator identification and electronic marking of spatial coordinates of materials to be extracted. An operator positioned at a computerized touch sensitive screen views electronic images of the mixture of materials to be sorted as they are conveyed past a sensor array which transmits sequences of images of the mixture either directly or through a computer to the touch sensitive display screen. The operator manually "touches" objects displayed on the screen to be extracted from the mixture thereby registering the spatial coordinates of the objects within the computer. The computer then tracks the registered objects as they are conveyed and directs automated devices including mechanical means such as air jets, robotic arms, or other mechanical diverters to extract the registered objects.

  9. A computerized clinical decision support system as a means of implementing depression guidelines.

    PubMed

    Trivedi, Madhukar H; Kern, Janet K; Grannemann, Bruce D; Altshuler, Kenneth Z; Sunderajan, Prabha

    2004-08-01

    The authors describe the history and current use of computerized systems for implementing treatment guidelines in general medicine as well as the development, testing, and early use of a computerized decision support system for depression treatment among "real-world" clinical settings in Texas. In 1999 health care experts from Europe and the United States met to confront the well-documented challenges of implementing treatment guidelines and to identify strategies for improvement. They suggested the integration of guidelines into computer systems that is incorporated into clinical workflow. Several studies have demonstrated improvements in physicians' adherence to guidelines when such guidelines are provided in a computerized format. Although computerized decision support systems are being used in many areas of medicine and have demonstrated improved patient outcomes, their use in psychiatric illness is limited. The authors designed and developed a computerized decision support system for the treatment of major depressive disorder by using evidence-based guidelines, transferring the knowledge gained from the Texas Medication Algorithm Project (TMAP). This computerized decision support system (CompTMAP) provides support in diagnosis, treatment, follow-up, and preventive care and can be incorporated into the clinical setting. CompTMAP has gone through extensive testing to ensure accuracy and reliability. Physician surveys have indicated a positive response to CompTMAP, although the sample was insufficient for statistical testing. CompTMAP is part of a new era of comprehensive computerized decision support systems that take advantage of advances in automation and provide more complete clinical support to physicians in clinical practice.

  10. Methodology for vocational psychodiagnostics of senior schoolchildren using information technologies

    NASA Astrophysics Data System (ADS)

    Bogdanovskaya, I. M.; Kosheleva, A. N.; Kiselev, P. B.; Davydova, Yu. A.

    2017-01-01

    The article identifies the role and main problems of vocational psychodiagnostics under modern socio-cultural conditions and analyzes the potential of information technologies in the vocational psychodiagnostics of senior schoolchildren. It describes the theoretical and methodological grounds, content, and diagnostic potential of the computerized method. The method includes three blocks of sub-tests identifying intellectual potential, personal qualities, professional interests and values, and career orientations, as well as sub-tests analyzing the specific life experience of senior schoolchildren. The diagnostic results allow an integrated psychodiagnostic conclusion with recommendations to be developed. The article also presents software architecture options for the method.

  11. Implementing ICAO Language Proficiency Requirements in the Versant Aviation English Test

    ERIC Educational Resources Information Center

    Van Moere, Alistair; Suzuki, Masanori; Downey, Ryan; Cheng, Jian

    2009-01-01

    This paper discusses the development of an assessment to satisfy the International Civil Aviation Organization (ICAO) Language Proficiency Requirements. The Versant Aviation English Test utilizes speech recognition technology and a computerized testing platform, such that test administration and scoring are fully automated. Developed in…

  12. 76 FR 4703 - Proposed Information Collection Activity; Comment Request Proposed Projects:

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-26

    ... Information Collection Activity; Comment Request Proposed Projects: Title: Computerized Support Enforcement Systems. OMB No. 0980-0271. Description: The information being collected is mandated by Section 454(16) of...) approved under section 452(d) of the title, of a statewide automated data processing and information...

  13. Libraries Can Learn from Banks.

    ERIC Educational Resources Information Center

    Lawrence, Gail H.

    1983-01-01

    The experiences of banks introducing computerized services to the public are described to provide some idea of what libraries can expect when they introduce online systems. Volume of use of Automated Teller Machines, types of users, introduction of machines, and user acceptance are highlighted. Thirty-two references are cited. (EJS)

  14. Automated Simultaneous Assembly for Multistage Testing

    ERIC Educational Resources Information Center

    Breithaupt, Krista; Ariel, Adelaide; Veldkamp, Bernard P.

    2005-01-01

    This article offers some solutions used in the assembly of the computerized Uniform Certified Public Accountancy (CPA) licensing examination as practical alternatives for operational programs producing large numbers of forms. The Uniform CPA examination was offered as an adaptive multistage test (MST) beginning in April of 2004. Examples of…

  15. Wind energy Computerized Maintenance Management System (CMMS) : data collection recommendations for reliability analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peters, Valerie A.; Ogilvie, Alistair B.

    2012-01-01

    This report addresses the general data requirements for reliability analysis of fielded wind turbines and other wind plant equipment. Written by Sandia National Laboratories, it is intended to help develop a basic understanding of the data needed for reliability analysis from a Computerized Maintenance Management System (CMMS) and other data systems. The report provides a rationale for why this data should be collected, a list of the data needed to support reliability and availability analysis, and specific recommendations for a CMMS to support automated analysis. Though written for reliability analysis of wind turbines, much of the information is applicable to a wider variety of equipment and analysis and reporting needs. The 'Motivation' section of this report provides a rationale for collecting and analyzing field data for reliability analysis. The benefits of this type of effort can include increased energy delivered, decreased operating costs, enhanced preventive maintenance schedules, solutions to issues with the largest payback, and identification of early failure indicators.

  16. LSU: The Library Space Utilization Methodology.

    ERIC Educational Resources Information Center

    Hall, Richard B.

    A computerized research technique for measuring the space utilization of public library facilities provides a behavioral activity and occupancy analysis for library planning purposes. The library space utilization (LSU) methodology demonstrates that significant information about the functional requirements of a library can be measured and…

  17. How a Fully Automated eHealth Program Simulates Three Therapeutic Processes: A Case Study.

    PubMed

    Holter, Marianne T S; Johansen, Ayna; Brendryen, Håvar

    2016-06-28

    eHealth programs may be better understood by breaking down the components of one particular program and discussing its potential for interactivity and tailoring in regard to concepts from face-to-face counseling. In the search for the efficacious elements within eHealth programs, it is important to understand how a program using lapse management may simultaneously support working alliance, internalization of motivation, and behavior maintenance. These processes have been applied to fully automated eHealth programs individually. However, given their significance in face-to-face counseling, it may be important to simulate the processes simultaneously in interactive, tailored programs. We propose a theoretical model for how fully automated behavior change eHealth programs may be more effective by simulating a therapist's support of a working alliance, internalization of motivation, and managing lapses. We show how the model is derived from theory and its application to Endre, a fully automated smoking cessation program that engages the user in several "counseling sessions" about quitting. A descriptive case study based on tools from the intervention mapping protocol shows how each therapeutic process is simulated. The program supports the user's working alliance through alliance factors, the nonembodied relational agent Endre and computerized motivational interviewing. Computerized motivational interviewing also supports internalized motivation to quit, whereas a lapse management component responds to lapses. The description operationalizes working alliance, internalization of motivation, and managing lapses, in terms of eHealth support of smoking cessation. A program may simulate working alliance, internalization of motivation, and lapse management through interactivity and individual tailoring, potentially making fully automated eHealth behavior change programs more effective.

  18. How a Fully Automated eHealth Program Simulates Three Therapeutic Processes: A Case Study

    PubMed Central

    Johansen, Ayna; Brendryen, Håvar

    2016-01-01

    Background: eHealth programs may be better understood by breaking down the components of one particular program and discussing its potential for interactivity and tailoring in regard to concepts from face-to-face counseling. In the search for the efficacious elements within eHealth programs, it is important to understand how a program using lapse management may simultaneously support working alliance, internalization of motivation, and behavior maintenance. These processes have been applied to fully automated eHealth programs individually. However, given their significance in face-to-face counseling, it may be important to simulate the processes simultaneously in interactive, tailored programs. Objective: We propose a theoretical model for how fully automated behavior change eHealth programs may be more effective by simulating a therapist’s support of a working alliance, internalization of motivation, and managing lapses. Methods: We show how the model is derived from theory and its application to Endre, a fully automated smoking cessation program that engages the user in several “counseling sessions” about quitting. A descriptive case study based on tools from the intervention mapping protocol shows how each therapeutic process is simulated. Results: The program supports the user’s working alliance through alliance factors, the nonembodied relational agent Endre and computerized motivational interviewing. Computerized motivational interviewing also supports internalized motivation to quit, whereas a lapse management component responds to lapses. The description operationalizes working alliance, internalization of motivation, and managing lapses, in terms of eHealth support of smoking cessation. Conclusions: A program may simulate working alliance, internalization of motivation, and lapse management through interactivity and individual tailoring, potentially making fully automated eHealth behavior change programs more effective. PMID:27354373

  19. Understanding and enhancing user acceptance of computer technology

    NASA Technical Reports Server (NTRS)

    Rouse, William B.; Morris, Nancy M.

    1986-01-01

    Technology-driven efforts to implement computer technology often encounter problems due to lack of acceptance, or begrudging acceptance, by the personnel involved. It is argued that individuals' acceptance of automation, in terms of either computerization or computer aiding, is heavily influenced by their perceptions of the impact of the automation on their discretion in performing their jobs. It is suggested that desired levels of discretion reflect needs to feel in control and achieve self-satisfaction in task performance, as well as perceptions of inadequacies of computer technology. Discussion of these factors leads to a structured set of considerations for performing front-end analysis, deciding what to automate, and implementing the resulting changes.

  20. Computerization of Library and Information Services in Mainland China.

    ERIC Educational Resources Information Center

    Lin, Sharon Chien

    1994-01-01

    Describes two phases of the automation of library and information services in mainland China. From 1974-86, much effort was concentrated on developing computer systems, databases, online retrieval, and networking. From 1986 to the present, practical progress became possible largely because of CD-ROM technology; and large scale networking for…

  1. Internet-Based Cervical Cytology Screening System

    DTIC Science & Technology

    2007-04-01

    Recent technological advances in specimen preparation and computerized primary screening make automated approaches to cervical cancer screening possible. In addition, advances in information technology have facilitated the Internet transmission and archival...processes in the clinical laboratory. (Award Number: W81XWH-04-C-0083)

  2. 21 CFR 111.20 - What design and construction requirements apply to your physical plant?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... surfaces, with microorganisms, chemicals, filth, or other extraneous material. Your physical plant must have, and you must use, separate or defined areas of adequate size or other control systems, such as computerized inventory controls or automated systems of separation, to prevent contamination and mixups of...

  3. 21 CFR 111.20 - What design and construction requirements apply to your physical plant?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... surfaces, with microorganisms, chemicals, filth, or other extraneous material. Your physical plant must have, and you must use, separate or defined areas of adequate size or other control systems, such as computerized inventory controls or automated systems of separation, to prevent contamination and mixups of...

  4. Automated Psychological Testing: Method of Administration, Need for Approval, and Measures of Anxiety.

    ERIC Educational Resources Information Center

    Davis, Caroline; Cowles, Michael

    1989-01-01

    Computerized and paper-and-pencil versions of four standard personality inventories administered to 147 undergraduates were compared for: (1) test-retest reliability; (2) scores; (3) trait anxiety; (4) interaction between method and social desirability; and (5) preferences concerning method of testing. Doubts concerning the efficacy of…

  5. Converting the H. W. Wilson Company Indexes to an Automated System: A Functional Analysis.

    ERIC Educational Resources Information Center

    Regazzi, John J.

    1984-01-01

    Description of the computerized information system that supports the editorial and manufacturing processes involved in creation of Wilson's subject indexes and catalogs includes the major subsystems--online data entry, batch input processing, validation and release, file generation and database management, online and offline retrieval, publication…

  6. 32 CFR Appendix A to Part 806b - Definitions

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... system of records to make the data accurate, relevant, timely, or complete. Computer matching: A computerized comparison of two or more automated systems of records or a system of records with non-Federal... operate and safeguard it. Local system managers operate record systems or are responsible for part of a...

  7. 32 CFR Appendix A to Part 806b - Definitions

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... system of records to make the data accurate, relevant, timely, or complete. Computer matching: A computerized comparison of two or more automated systems of records or a system of records with non-Federal... operate and safeguard it. Local system managers operate record systems or are responsible for part of a...

  8. 32 CFR Appendix A to Part 806b - Definitions

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... system of records to make the data accurate, relevant, timely, or complete. Computer matching: A computerized comparison of two or more automated systems of records or a system of records with non-Federal... operate and safeguard it. Local system managers operate record systems or are responsible for part of a...

  9. An automated library financial management system

    NASA Technical Reports Server (NTRS)

    Dueker, S.; Gustafson, L.

    1977-01-01

    A computerized library acquisition system developed for control of informational materials acquired at NASA Ames Research Center is described. The system monitors the acquisition of both library and individual researchers' orders and supplies detailed financial, statistical, and bibliographical information. Applicability for other libraries and the future availability of the program is discussed.

  10. Socio-Economic Impact Assessment of the Computerized Customer Information System (CCIS) at the Southern California Rapid Transit District (SCRTD)

    DOT National Transportation Integrated Search

    1983-06-01

    This document is a product of an ongoing program to assess the impacts of automated transit information system (ATIS) technology on the transit industry's efforts to improve the productivity and quality of telephone information/marketing services to ...

  11. Technology for the product and process data base

    NASA Technical Reports Server (NTRS)

    Barnes, R. D.

    1984-01-01

    The computerized product and process data base is increasingly recognized to be the cornerstone component of an overall system aimed at the integrated automation of the industrial processes of a given company or enterprise. The technology needed to support these more effective computer integrated design and manufacturing methods, especially the concept of 3-D computer-sensible product definitions rather than engineering drawings, is not fully available and rationalized. Progress is being made, however, in bridging this technology gap with concentration on the modeling of sophisticated information and data structures, high-performance interactive user interfaces and comprehensive tools for managing the resulting computerized product definition and process data base.

  12. Design techniques for developing a computerized instrumentation test plan. [for wind tunnel test data acquisition system

    NASA Technical Reports Server (NTRS)

    Burnett, S. Kay; Forsyth, Theodore J.; Maynard, Everett E.

    1987-01-01

    The development of a computerized instrumentation test plan (ITP) for the NASA/Ames Research Center National Full Scale Aerodynamics Complex (NFAC) is discussed. The objective of the ITP program was to aid the instrumentation engineer in documenting the configuration and calibration of data acquisition systems for a given test at any of four low speed wind tunnel facilities (Outdoor Aerodynamic Research Facility, 7 x 10, 40 x 80, and 80 x 120) at the NFAC. It is noted that automation of the ITP has decreased errors, engineering hours, and setup time while adding a higher level of consistency and traceability.

  13. [The movement computerized analysis as instrumental support for occupational doctors in evaluation of upper limb pathologies in engineering workers].

    PubMed

    D'Orso, M I; Centemeri, R; Oggionni, P; Latocca, R; Crippa, M; Vercellino, R; Riva, M; Cesana, G

    2011-01-01

    Computerized movement analysis of the upper limb is a valid support for defining residual functional capability and specific fitness for work in complex cases. This evaluation methodology can correctly and objectively define the three-dimensional ranges of motion of each patient's upper limb, which can be particularly useful for workers returning to work after a work-related or non-work-related injury, or for disabled workers beginning a new work activity. We report research carried out using computerized motion analysis of the upper limbs in 20 engineering workers.

  14. Children of the Four Winds: The Migrant Student Record Transfer System.

    ERIC Educational Resources Information Center

    Dyer, Maxwell

    A discussion of the computerized Migrant Student Record Transfer System (MSRTS) is presented. The author first describes it as a functional automated system, headquartered in the Arkansas Department of Education, which serves the record transferral needs of seasonal farm migrant children as they move throughout the contiguous 48 states.…

  15. WordBytes: Exploring an Intermediate Constraint Format for Rapid Classification of Student Answers on Constructed Response Assessments

    ERIC Educational Resources Information Center

    Kim, Kerry J.; Meir, Eli; Pope, Denise S.; Wendel, Daniel

    2017-01-01

    Computerized classification of student answers offers the possibility of instant feedback and improved learning. Open response (OR) questions provide greater insight into student thinking and understanding than more constrained multiple choice (MC) questions, but development of automated classifiers is more difficult, often requiring training a…

  16. 76 FR 21383 - Proposed Collection; Comment Request; Food Reporting Comparison Study (FORCS) and Food and Eating...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-15

    ... be submitted to the Office of Management and Budget (OMB) for review and approval. Proposed... Institute (NCI) Observational Feeding Studies.'' The objective of the two studies is to compare the performance of the newly developed computerized Automated Self-Administered 24-Hour Recall (ASA24) approach...

  17. Faculty Attitudes and Habits Concerning Library Instruction: How Much Has Changed Since 1982?

    ERIC Educational Resources Information Center

    Thomas, Joy

    1994-01-01

    To determine whether either automation or changing demographics may have influenced professorial habits or attitudes in relation to library instruction, a 1982 survey of faculty with additional questions on computerized searching was replicated in 1990 on the same large campus. The survey results are summarized and discussed; the questionnaire is…

  18. A Top-Down Approach to Designing the Computerized Adaptive Multistage Test

    ERIC Educational Resources Information Center

    Luo, Xiao; Kim, Doyoung

    2018-01-01

    The top-down approach to designing a multistage test is relatively understudied in the literature and underused in research and practice. This study introduced a route-based top-down design approach that directly sets design parameters at the test level and utilizes the advanced automated test assembly algorithm seeking global optimality. The…

  19. Lewis Online Travel System: Preparer's/Traveler's Manual, Release 1.0

    NASA Technical Reports Server (NTRS)

    Seese, Michael

    1992-01-01

    The Lewis Online Travel System (LOTS) is a menu-driven interactive application that automates nearly all of the functions associated with government travel. The purpose of this manual is to provide LOTS users with concise instructions for using the computerized application. As such, it will not go into the details of travel regulations.

  20. Office Automation Pilot: A Paperless Approach at College of DuPage.

    ERIC Educational Resources Information Center

    Carlson, Bart

    The pilot project described in this report was undertaken by the College of DuPage (CD) to increase the clerical efficiency of seven administrative offices through the installation of a computerized word processing and data transmission system. The first section of the report provides background information detailing: the history of computer…

  1. Serials Automation for San Jose State University Library.

    ERIC Educational Resources Information Center

    Liu, Susana J.

    This study (1) examines the university's serials system and identifies its problems; (2) analyzes the current manual operations in the serials department, with emphasis on the serials check-in system; and (3) determines whether or not computerization of some or all of the serials subsystems would improve the department's internal effectiveness and…

  2. White Collar Displacement: Job Erosion in the Service Sector.

    ERIC Educational Resources Information Center

    Golden, Lonnie; Danann, Sharon

    The National Commission for Employment Policy estimates that 19 million workers--17 percent of the work force--are in jobs directly threatened by office automation, and the consequences of the displacement of clerical workers due to increasing office computerization are as serious as those from manufacturing job loss. Between 1983 and 1988, almost…

  3. Documentation Centre of the Association of African Universities.

    ERIC Educational Resources Information Center

    Chateh, Peter

    This report presents the results of a study of the Documentation Centre of the Association of African Universities (AAU) undertaken to work out proposals for the rational organization of the Centre, and to explore the possibility of computerizing the Centre and linking it with other centers which provide automated documentation services. The…

  4. Automation Applications in an Advanced Air Traffic Management System : Volume 3. Methodology for Man-Machine Task Allocation

    DOT National Transportation Integrated Search

    1974-08-01

    Volume 3 describes the methodology for man-machine task allocation. It contains a description of man and machine performance capabilities and an explanation of the methodology employed to allocate tasks to human or automated resources. It also presen...

  5. Designing, Evaluating, and Deploying Automated Scoring Systems with Validity in Mind: Methodological Design Decisions

    ERIC Educational Resources Information Center

    Rupp, André A.

    2018-01-01

    This article discusses critical methodological design decisions for collecting, interpreting, and synthesizing empirical evidence during the design, deployment, and operational quality-control phases for automated scoring systems. The discussion is inspired by work on operational large-scale systems for automated essay scoring but many of the…

  6. Automated Test-Form Generation

    ERIC Educational Resources Information Center

    van der Linden, Wim J.; Diao, Qi

    2011-01-01

    In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…
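
    A simplified greedy stand-in for the mixed-integer programming formulation referred to above is sketched below: it selects items to maximize information subject to a length limit and per-content-area quotas. The item bank, fields, and quotas are invented for the demonstration.

```python
# Simplified greedy test assembly (a stand-in for the mixed-integer programming
# approach the article discusses): pick the most informative items while
# respecting test length and content-area quotas. All data are invented.
def assemble_form(bank, test_length, content_quota):
    """bank: list of dicts with 'id', 'info', 'content'; quotas cap each content area."""
    selected, used = [], {area: 0 for area in content_quota}
    for item in sorted(bank, key=lambda it: it["info"], reverse=True):
        if len(selected) == test_length:
            break
        area = item["content"]
        if area in content_quota and used[area] < content_quota[area]:
            selected.append(item["id"])
            used[area] += 1
    return selected, used

if __name__ == "__main__":
    bank = [
        {"id": "A1", "info": 0.92, "content": "algebra"},
        {"id": "A2", "info": 0.81, "content": "algebra"},
        {"id": "A3", "info": 0.77, "content": "algebra"},
        {"id": "G1", "info": 0.88, "content": "geometry"},
        {"id": "G2", "info": 0.54, "content": "geometry"},
    ]
    form, coverage = assemble_form(bank, test_length=4,
                                   content_quota={"algebra": 2, "geometry": 2})
    print(form, coverage)
```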

  7. Online security and cyberbystander relations in mobilizing sex abuse intervention.

    PubMed

    Palasinski, Marek

    2012-10-01

    Two studies examined men's interventions in a virtual reality situation involving child grooming. In Study 1, 92 men observed an online encounter between an apparent minor and a sex offender. The results suggest that the bystander effect was stronger under computerized rather than user-assisted surveillance, and when the fellow cyberbystander was unknown rather than known. In Study 2, in which 100 men observed the same encounter, the effect also emerged under computerized surveillance as long as the number of unknown cyberbystanders was increased. Thus, vesting more responsibility for security in the average netizen, rather than only in automated abuse-detection technology, is cautiously suggested; the relevance of this lies in increasing minors' health and safety.

  8. The laboratory of the 1990s—Planning for total automation

    PubMed Central

    Brunner, Linda A.

    1992-01-01

    The analytical laboratory of the 1990s must be able to meet and accommodate the rapid evolution of modern-day technology. One such area is laboratory automation. Total automation may be seen as the coupling of computerized sample tracking, electronic documentation and data reduction with automated sample handling, preparation and analysis, resulting in a complete analytical procedure with minimal human involvement. Requirements may vary from one laboratory or facility to another, so the automation has to be flexible enough to cover a wide range of applications, and yet fit into specific niches depending on individual needs. Total automation must be planned for, well in advance, if the endeavour is to be a success. Space, laboratory layout, proper equipment, and the availability and access to necessary utilities must be taken into account. Adequate training and experience of the personnel working with the technology must also be ensured. In addition, responsibilities of installation, programming maintenance and operation have to be addressed. Proper time management and the efficient implementation and use of total automation are also crucial to successful operations. This paper provides insights into laboratory organization and requirements, as well as discussing the management issues that must be faced when automating laboratory procedures. PMID:18924925

  9. Application of a minicomputer-based system in measuring intraocular fluid dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bronzino, J.D.; D'Amato, D.P.; O'Rourke, J.

    A complete, computerized system has been developed to automate and display radionuclide clearance studies in an ophthalmology clinical laboratory. The system is based on a PDP-8E computer with a 16-k core memory and includes a dual-drive Decassette system and an interactive display terminal. The software controls the acquisition of data from an NIM scaler, times the procedures, and analyzes and simultaneously displays logarithmically converted data on a fully annotated graph. Animal studies and clinical experiments are presented to illustrate the nature of these displays and the results obtained using this automated eye physiometer.

  10. Computer system for scanning tunneling microscope automation

    NASA Astrophysics Data System (ADS)

    Aguilar, M.; García, A.; Pascual, P. J.; Presa, J.; Santisteban, A.

    1987-03-01

    A computerized system for the automation of a scanning tunneling microscope is presented. It is based on an IBM personal computer (PC), either an XT or an AT, which performs the control, data acquisition, and storage operations, displays the STM "images" in real time, and provides image processing tools for the restoration and analysis of data. It supports different data acquisition and control cards and image display cards. The software has been designed in a modular way to allow the replacement of these cards and other equipment improvements, as well as the inclusion of user routines for data analysis.

  11. The State and Trends of Barcode, RFID, Biometric and Pharmacy Automation Technologies in US Hospitals.

    PubMed

    Uy, Raymonde Charles Y; Kury, Fabricio P; Fontelo, Paul A

    2015-01-01

    The standard of safe medication practice requires strict observance of the five rights of medication administration: the right patient, drug, time, dose, and route. Despite adherence to these guidelines, medication errors remain a public health concern that has generated health policies and hospital processes that leverage automation and computerization to reduce errors. Bar code, RFID, biometric, and pharmacy automation technologies have been shown in the literature to decrease the incidence of medication errors by minimizing the human factors involved in the process. Despite evidence suggesting the effectiveness of these technologies, adoption rates and trends vary across hospital systems. The objective of this study is to examine the state and adoption trends of automatic identification and data capture (AIDC) methods and pharmacy automation technologies in U.S. hospitals. A retrospective descriptive analysis of survey data from the HIMSS Analytics® Database was performed, demonstrating encouraging growth in the adoption of these patient safety solutions.
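
    The "five rights" check that bar-code systems enforce can be sketched as follows; the order structure, codes, and tolerance window are hypothetical, not any vendor's implementation.

```python
# Illustrative bar-code medication administration check (hypothetical data model):
# the scanned patient and drug are verified against the active order to enforce
# the five rights before administration is allowed.
from datetime import datetime, timedelta

order = {
    "patient_id": "MRN-1001",
    "ndc": "0071-0155-23",            # drug code expected on the package barcode
    "dose_mg": 20,
    "route": "oral",
    "due": datetime(2024, 5, 1, 9, 0),
}

def five_rights_check(scanned_patient, scanned_ndc, dose_mg, route, now, window_minutes=60):
    """Return a list of violations; an empty list means administration may proceed."""
    problems = []
    if scanned_patient != order["patient_id"]:
        problems.append("wrong patient")
    if scanned_ndc != order["ndc"]:
        problems.append("wrong drug")
    if dose_mg != order["dose_mg"]:
        problems.append("wrong dose")
    if route != order["route"]:
        problems.append("wrong route")
    if abs(now - order["due"]) > timedelta(minutes=window_minutes):
        problems.append("wrong time")
    return problems

if __name__ == "__main__":
    print(five_rights_check("MRN-1001", "0071-0155-23", 20, "oral",
                            datetime(2024, 5, 1, 9, 30)))   # [] -> safe to administer
    print(five_rights_check("MRN-1002", "0071-0155-23", 40, "oral",
                            datetime(2024, 5, 1, 11, 0)))   # wrong patient, dose, time
```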

  12. A Recommendation Algorithm for Automating Corollary Order Generation

    PubMed Central

    Klann, Jeffrey; Schadow, Gunther; McCoy, JM

    2009-01-01

    Manual development and maintenance of decision support content is time-consuming and expensive. We explore recommendation algorithms, e-commerce data-mining tools that use collective order history to suggest purchases, to assist with this. In particular, previous work shows corollary order suggestions are amenable to automated data-mining techniques. Here, an item-based collaborative filtering algorithm augmented with association rule interestingness measures mined suggestions from 866,445 orders made in an inpatient hospital in 2007, generating 584 potential corollary orders. Our expert physician panel evaluated the top 92 and agreed 75.3% were clinically meaningful. Also, at least one felt 47.9% would be directly relevant in guideline development. This automated generation of a rough-cut of corollary orders confirms prior indications about automated tools in building decision support content. It is an important step toward computerized augmentation to decision support development, which could increase development efficiency and content quality while automatically capturing local standards. PMID:20351875
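
    A minimal sketch of the general idea described above, using a few hypothetical orders: item co-occurrence counts drive an item-based suggestion list, filtered by confidence and lift as simple stand-ins for the association-rule interestingness measures. This is illustrative only; it is not the authors' algorithm, thresholds, or data.

```python
# Minimal sketch (not the authors' implementation): mine candidate corollary
# orders from co-occurrence counts, scored with confidence and lift.
from collections import Counter
from itertools import combinations

# Hypothetical order sets, one list per patient encounter.
orders = [
    ["warfarin", "INR"], ["warfarin", "INR", "CBC"],
    ["gentamicin", "gentamicin level"], ["warfarin", "CBC"],
]

n = len(orders)
item_count = Counter(i for o in orders for i in set(o))
pair_count = Counter(p for o in orders for p in combinations(sorted(set(o)), 2))

def confidence(a, b):
    """Estimated P(b also ordered | a ordered)."""
    return pair_count[tuple(sorted((a, b)))] / item_count[a]

def lift(a, b):
    """How much more often a and b co-occur than expected by chance."""
    return confidence(a, b) / (item_count[b] / n)

def suggest(index_item, min_conf=0.5, min_lift=1.2):
    """Rank candidate corollary orders for an index order."""
    cands = [(other, confidence(index_item, other), lift(index_item, other))
             for other in item_count if other != index_item]
    return sorted([c for c in cands if c[1] >= min_conf and c[2] >= min_lift],
                  key=lambda c: -c[1])

print(suggest("warfarin"))   # e.g. INR and CBC as candidate corollary orders
```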

  13. A recommendation algorithm for automating corollary order generation.

    PubMed

    Klann, Jeffrey; Schadow, Gunther; McCoy, J M

    2009-11-14

    Manual development and maintenance of decision support content is time-consuming and expensive. We explore recommendation algorithms, e-commerce data-mining tools that use collective order history to suggest purchases, to assist with this. In particular, previous work shows corollary order suggestions are amenable to automated data-mining techniques. Here, an item-based collaborative filtering algorithm augmented with association rule interestingness measures mined suggestions from 866,445 orders made in an inpatient hospital in 2007, generating 584 potential corollary orders. Our expert physician panel evaluated the top 92 and agreed 75.3% were clinically meaningful. Also, at least one felt 47.9% would be directly relevant in guideline development. This automated generation of a rough-cut of corollary orders confirms prior indications about automated tools in building decision support content. It is an important step toward computerized augmentation to decision support development, which could increase development efficiency and content quality while automatically capturing local standards.

  14. Preliminary evaluation of a micro-based repeated measures testing system

    NASA Technical Reports Server (NTRS)

    Kennedy, Robert S.; Wilkes, Robert L.; Lane, Norman E.

    1985-01-01

    A need exists for an automated performance test system to study the effects of various treatments which are of interest to the aerospace medical community, i.e., the effects of drugs and environmental stress. The ethics and pragmatics of such assessment demand that repeated measures in small groups of subjects be the customary research paradigm. Test stability, reliability-efficiency and factor structure take on extreme significance; in a program of study by the U.S. Navy, 80 percent of 150 tests failed to meet minimum metric requirements. The best of these tests are being programmed on a portable microprocessor and administered along with tests in their original formats in order to examine their metric properties in the computerized mode. Twenty subjects have been tested over four replications on a 6.0-minute computerized battery (six tests), which was compared with five paper-and-pencil marker tests. All tests achieved stability within the four test sessions, reliability-efficiencies were high (r greater than .707 for three minutes of testing), and the computerized tests were largely comparable to the paper-and-pencil versions from which they were derived. This computerized performance test system is portable, inexpensive and rugged.

  15. Integrating Test-Form Formatting into Automated Test Assembly

    ERIC Educational Resources Information Center

    Diao, Qi; van der Linden, Wim J.

    2013-01-01

    Automated test assembly uses the methodology of mixed integer programming to select an optimal set of items from an item bank. Automated test-form generation uses the same methodology to optimally order the items and format the test form. From an optimization point of view, production of fully formatted test forms directly from the item pool using…

  16. Computerized Design and Generation of Gear Drives With a Localized Bearing Contact and a Low Level of Transmission Errors

    NASA Technical Reports Server (NTRS)

    Litvin, F.; Chen, J.; Seol, I.; Kim, D.; Lu, J.; Zhao, X.; Handschuh, R.

    1996-01-01

    A general approach developed for the computerized simulation of loaded gear drives is presented. In this paper the methodology used to localize the bearing contact, provide a parabolic function of transmission errors, and simulate meshing and contact of unloaded gear drives is developed. The approach developed is applied to spur and helical gears, spiral bevel gears, face-gear drives, and worm-gear drives with cylindrical worms.

  17. 45 CFR 307.11 - Functional requirements for computerized support enforcement systems in operation by October 1...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... (2) The capability to perform the following tasks with the frequency and in the manner required under... business days after receipt of notice of income, and the income source subject to withholding from a court... orders through an automated information network in meeting paragraph (e)(2)(ii) of this section provided...

  18. Proposal for a Spatial Organization Model in Soil Science (The Example of the European Communities Soil Map).

    ERIC Educational Resources Information Center

    King, D.; And Others

    1994-01-01

    Discusses the computational problems of automating paper-based spatial information. A new relational structure for soil science information based on the main conceptual concepts used during conventional cartographic work is proposed. This model is a computerized framework for coherent description of the geographical variability of soils, combined…

  19. Computerization of the Botswana National Library Service. Restricted Technical Report.

    ERIC Educational Resources Information Center

    Underwood, Peter C.

    This report discusses the scope for and feasibility of introducing automated systems into the Botswana National Library Service (BNLS). The study was undertaken at the request of BNLS and was conducted by an outside consultant who interviewed staff, read internal documents and reports, and studied patterns of work. Topics of the report include:…

  20. 45 CFR 307.11 - Functional requirements for computerized support enforcement systems in operation by October 1...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...; (ii) Social security numbers; (iii) Dates of birth; (iv) Case identification numbers; (v) Other... collected amounts; (v) The birth date and, beginning no later than October 1, 1999, the name and social... orders through an automated information network in meeting paragraph (e)(2)(ii) of this section provided...

  1. Computerization of the Archivo General de Indias: Strategies and Results, Part I.

    ERIC Educational Resources Information Center

    Gonzalez, Pedro

    1998-01-01

    Discusses the digital reformatting project of the Archivo General de Indias (AGI) in Seville, Spain; the goal was to design, develop, and implement an automated data system capable of integrated management of common functions of a historical archive. Describes the general system architecture, and the design and attributes of the information and…

  2. Critical factors influencing physicians' intention to use computerized clinical practice guidelines: an integrative model of activity theory and the technology acceptance model.

    PubMed

    Hsiao, Ju-Ling; Chen, Rai-Fu

    2016-01-16

    With the widespread use of information communication technologies, computerized clinical practice guidelines have been developed and are considered effective decision-support tools for assisting clinical activities. However, the development of computerized clinical practice guidelines in Taiwan is still at an early stage and the acceptance level among their major users (physicians) is not satisfactory. This study aims to investigate critical factors influencing physicians' intention to use computerized clinical practice guidelines through an integrative model of activity theory and the technology acceptance model. The survey methodology was employed to collect data from physicians of the investigated hospitals that have implemented computerized clinical practice guidelines. A total of 505 questionnaires were sent out, with 238 completed copies returned, indicating a valid response rate of 47.1%. The collected data were then analyzed using structural equation modeling. The results showed that attitudes toward using computerized clinical practice guidelines (γ = 0.451, p < 0.001), organizational support (γ = 0.285, p < 0.001), perceived usefulness of computerized clinical practice guidelines (γ = 0.219, p < 0.05), and social influence (γ = 0.213, p < 0.05) were critical factors influencing physicians' intention to use computerized clinical practice guidelines, and these factors explain 68.6% of the variance in intention to use. This study confirmed that the subject (human), environment (organization), and tool (technology) factors identified in activity theory should be carefully considered when introducing computerized clinical practice guidelines. Managers should pay close attention to the identified factors and provide adequate resources and incentives to help promote the use of computerized clinical practice guidelines. Through the appropriate use of computerized clinical practice guidelines, clinical benefits, particularly improved quality of care and smoother clinical processes, will be realized.

  3. Automated classification of brain tumor type in whole-slide digital pathology images using local representative tiles.

    PubMed

    Barker, Jocelyn; Hoogi, Assaf; Depeursinge, Adrien; Rubin, Daniel L

    2016-05-01

    Computerized analysis of digital pathology images offers the potential of improving clinical care (e.g. automated diagnosis) and catalyzing research (e.g. discovering disease subtypes). There are two key challenges thwarting computerized analysis of digital pathology images: first, whole slide pathology images are massive, making computerized analysis inefficient, and second, diverse tissue regions in whole slide images that are not directly relevant to the disease may mislead computerized diagnosis algorithms. We propose a method to overcome both of these challenges that utilizes a coarse-to-fine analysis of the localized characteristics in pathology images. An initial surveying stage analyzes the diversity of coarse regions in the whole slide image. This includes extraction of spatially localized features of shape, color and texture from tiled regions covering the slide. Dimensionality reduction of the features assesses the image diversity in the tiled regions, and clustering creates representative groups. A second stage provides a detailed analysis of a single representative tile from each group. An Elastic Net classifier produces a diagnostic decision value for each representative tile. A weighted voting scheme aggregates the decision values from these tiles to obtain a diagnosis at the whole slide level. We evaluated our method by automatically classifying 302 brain cancer cases into two possible diagnoses (glioblastoma multiforme (N = 182) versus lower grade glioma (N = 120)) with an accuracy of 93.1% (p < 0.001). We also evaluated our method in the dataset provided for the 2014 MICCAI Pathology Classification Challenge, in which our method, trained and tested using 5-fold cross validation, produced a classification accuracy of 100% (p < 0.001). Our method showed high stability and robustness to parameter variation, with accuracy varying between 95.5% and 100% when evaluated for a wide range of parameters. Our approach may be useful to automatically differentiate between the two cancer subtypes. Copyright © 2015 Elsevier B.V. All rights reserved.
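
    The sketch below illustrates the coarse-to-fine aggregation described above: cluster tile-level features, classify one representative tile per group, and combine the decision values with a weighted vote. The synthetic features, the cluster count, and the elastic-net-penalized logistic classifier standing in for the Elastic Net model are illustrative assumptions, not the paper's implementation or data.

```python
# Minimal sketch (not the authors' code): representative-tile classification
# with weighted voting at the whole-slide level, on synthetic features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def slide_diagnosis(tile_features, clf, n_groups=5):
    """tile_features: (n_tiles, n_features) array for one whole-slide image."""
    km = KMeans(n_clusters=n_groups, n_init=10, random_state=0).fit(tile_features)
    vote = 0.0
    for g in range(n_groups):
        members = np.where(km.labels_ == g)[0]
        # representative tile = member closest to the cluster centroid
        rep = members[np.argmin(np.linalg.norm(
            tile_features[members] - km.cluster_centers_[g], axis=1))]
        decision = clf.decision_function(tile_features[rep:rep + 1])[0]
        vote += (len(members) / len(tile_features)) * decision   # weighted vote
    return "class A" if vote > 0 else "class B"

# Train a stand-in elastic-net-penalized classifier on synthetic tile features.
X = rng.normal(size=(400, 16))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
clf = LogisticRegression(penalty="elasticnet", solver="saga",
                         l1_ratio=0.5, max_iter=5000).fit(X, y)

new_slide_tiles = rng.normal(size=(120, 16)) + 0.8   # hypothetical new slide
print(slide_diagnosis(new_slide_tiles, clf))
```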

  4. Computerized intrapartum electronic fetal monitoring: analysis of the decision to deliver for fetal distress.

    PubMed

    Georgieva, Antoniya; Payne, Stephen J; Moulden, Mary; Redman, Christopher W G

    2011-01-01

    We applied computerized methods to assess the Electronic Fetal Monitoring (EFM) in labor. We analyzed retrospectively the last hour of EFM for 1,370 babies, delivered by emergency Cesarean sections before the onset of pushing (data collected at the John Radcliffe Hospital, Oxford, UK). There were two cohorts according to the reason for intervention: (a) fetal distress, n1 = 524 and (b) failure to progress and/or malpresentation, n2 = 846. The cohorts were compared in terms of classical EFM features (baseline, decelerations, variability and accelerations), computed by a dedicated Oxford system for automated analysis, OxSys. In addition, OxSys was employed to simulate current clinical guidelines for the classification of fetal monitoring, i.e. providing in real time a three-tier grading system of the EFM (normal, indeterminate, or abnormal). The computerized features and the simulated guidelines corresponded well to the clinical management and to the actual labor outcome (measured by umbilical arterial pH).
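
    As a purely illustrative sketch of the kind of three-tier grading rule that OxSys simulates, the snippet below maps summary EFM features to normal, indeterminate, or abnormal. The thresholds are placeholders, not the clinical guideline criteria or the OxSys rules.

```python
# Illustrative sketch only: a simplified three-tier grading of EFM features
# (baseline, variability, decelerations, accelerations). Thresholds are
# hypothetical placeholders, not the actual clinical criteria.
def grade_trace(baseline_bpm, variability_bpm, late_decelerations, accelerations):
    abnormal = (baseline_bpm < 100 or baseline_bpm > 180
                or variability_bpm < 3 or late_decelerations >= 3)
    indeterminate = (100 <= baseline_bpm < 110 or 160 < baseline_bpm <= 180
                     or 3 <= variability_bpm < 5 or late_decelerations in (1, 2))
    if abnormal:
        return "abnormal"
    if indeterminate or accelerations == 0:
        return "indeterminate"
    return "normal"

print(grade_trace(baseline_bpm=150, variability_bpm=8,
                  late_decelerations=0, accelerations=3))  # -> normal
```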

  5. Cooperative forestry inventory project for Nevada

    NASA Technical Reports Server (NTRS)

    Thornhill, R.

    1981-01-01

    A forest inventory project employing computerized classification of LANDSAT data to inventory vegetation types in western Nevada is described. The methodology and applicability of the resulting survey are summarized.

  6. Advanced Technologies and Methodology for Automated Ultrasonic Testing Systems Quantification

    DOT National Transportation Integrated Search

    2011-04-29

    For automated ultrasonic testing (AUT) detection and sizing accuracy, this program developed a methodology for quantification of AUT systems, advancing and quantifying AUT systems' image-capture capabilities, quantifying the performance of multiple AUT...

  7. CLECOS_P: clinical evolution control system on Parkinsonian patients undergoing neural transplantation.

    PubMed

    Morales, F; Molina, H; Cruz, N; Valladares, P; Muñoz, J; Ortega, I; Torres, O; Leon, M

    1995-01-01

    The CLECOS_P system was conceived for registering and automating the processing of clinical evaluations performed on patients with Parkinson's disease who undergo functional neurosurgery and/or neural transplant. CLECOS_P represents the first time a computerized system is able to offer--with high precision and considerable time-savings--an integral analysis of the evolutive behavior of the universe of integrated variables in the core assessment program for intracerebral transplantations (CAPIT). CAPIT is used internationally for the evaluation and follow-up of patients with this pathology who have undergone neural transplant. We used the so-called MEDSAC methodology for the preparation of this system. The methodology, which was used for the design of an intelligent system aimed at medical decision-making, was based on the quantitative analysis of clinical evolution. At present, 20 patients are controlled by this system: 11 bilaterally transplanted and 9 unilaterally (evaluated at intervals from 3 months before operation up to 1, 2, 3, 6, 9, 12, 18, and 24 months after operation). The application of CLECOS_P to these patients permitted the evaluation of 400 clinical variables, yielding a better evolutive characterization of the patients and thus more favorable results with personalized therapeutic methods aimed at raising their quality of life. CLECOS_P is used in a multi-user environment on a local area network running Novell Netware version 3.11.

  8. Detection and classification of interstitial lung diseases and emphysema using a joint morphological-fuzzy approach

    NASA Astrophysics Data System (ADS)

    Chang Chien, Kuang-Che; Fetita, Catalin; Brillet, Pierre-Yves; Prêteux, Françoise; Chang, Ruey-Feng

    2009-02-01

    Multi-detector computed tomography (MDCT) has high accuracy and specificity in volumetrically capturing serial images of the lung. It increases the capability of computerized classification of lung tissue in medical research. This paper proposes a three-dimensional (3D) automated approach based on mathematical morphology and fuzzy logic for quantifying and classifying interstitial lung diseases (ILDs) and emphysema. The proposed methodology is composed of several stages: (1) an image multi-resolution decomposition scheme based on a 3D morphological filter is used to detect and analyze the different density patterns of the lung texture. Then, (2) for each pattern in the multi-resolution decomposition, six features are computed, for which fuzzy membership functions define a probability of association with a pathology class. Finally, (3) for each pathology class, the probabilities are combined according to the weight assigned to each membership function, and two threshold values are used to decide the final class of the pattern. The proposed approach was tested on 10 MDCT cases and the classification accuracy was: emphysema: 95%, fibrosis/honeycombing: 84% and ground glass: 97%.
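
    A minimal sketch of stages (2) and (3) as described above: fuzzy membership values for each feature are combined with per-class weights and compared against two thresholds. The triangular membership functions, feature names, weights, and thresholds are hypothetical placeholders, not the values used in the paper.

```python
# Minimal sketch, assuming triangular fuzzy membership functions and
# hand-picked weights/thresholds; all numbers are illustrative.
def tri_membership(x, a, b, c):
    """Triangular fuzzy membership peaking at b on support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify_pattern(features, classes):
    """features: dict name -> value; classes: name -> (memberships, weights, thresholds)."""
    best = ("unclassified", 0.0)
    for cls, (mfs, weights, (t_low, t_high)) in classes.items():
        score = sum(w * tri_membership(features[f], *mfs[f])
                    for f, w in weights.items()) / sum(weights.values())
        if score >= t_high:                      # confident assignment
            return cls, score
        if score >= t_low and score > best[1]:   # tentative assignment
            best = (cls, score)
    return best

# Hypothetical features: mean attenuation (HU) and a texture energy measure.
classes = {
    "emphysema": ({"hu": (-1000, -950, -880), "energy": (0.0, 0.2, 0.5)},
                  {"hu": 0.7, "energy": 0.3}, (0.4, 0.7)),
    "ground_glass": ({"hu": (-800, -650, -500), "energy": (0.2, 0.5, 0.8)},
                     {"hu": 0.6, "energy": 0.4}, (0.4, 0.7)),
}
print(classify_pattern({"hu": -940, "energy": 0.25}, classes))
```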

  9. A methodology to determine the level of automation to improve the production process and reduce the ergonomics index

    NASA Astrophysics Data System (ADS)

    Chan-Amaya, Alejandro; Anaya-Pérez, María Elena; Benítez-Baltazar, Víctor Hugo

    2017-08-01

    Companies are constantly looking for improvements in productivity to increase their competitiveness. The use of automation technologies has been proven effective in achieving this. Some companies are not familiar with the process of acquiring automation technologies and therefore abstain from investing, missing the opportunity to take advantage of them. The present document proposes a methodology to determine the appropriate level of automation for the production process, thereby avoiding unnecessary automation and improving production while taking the ergonomics factor into consideration.

  10. Computers and Management Structure: Some Empirical Findings Re-examined

    ERIC Educational Resources Information Center

    Robey, Daniel

    1977-01-01

    Studies that relate computerization to either centralization or decentralization of organizational decision making are reviewed. Four issues are addressed that relate to conceptual or methodological problems. (Author/MLF)

  11. Computerized engineering logic for procurement and dedication processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tulay, M.P.

    1996-12-31

    This paper summarizes the work performed for designing the system and especially for calculating on-line expected performance and gives some significant results. In an attempt to better meet the needs of operations and maintenance organizations, many nuclear utility procurement engineering groups have simplified their procedures, developed on-line tools for performing the specification of replacement items, and developed relational databases containing part-level information necessary to automate the procurement process. Although these improvements have helped to reduce the engineering necessary to properly specify and accept/dedicate items for nuclear safety-related applications, a number of utilities have recognized that additional long-term savings can be realized by integrating a computerized logic to assist technical procurement engineering personnel.

  12. Dopamine Beta Hydroxylase Genotype Identifies Individuals Less Susceptible to Bias in Computer-Assisted Decision Making

    PubMed Central

    Parasuraman, Raja; de Visser, Ewart; Lin, Ming-Kuan; Greenwood, Pamela M.

    2012-01-01

    Computerized aiding systems can assist human decision makers in complex tasks but can impair performance when they provide incorrect advice that humans erroneously follow, a phenomenon known as “automation bias.” The extent to which people exhibit automation bias varies significantly and may reflect inter-individual variation in the capacity of working memory and the efficiency of executive function, both of which are highly heritable and under dopaminergic and noradrenergic control in prefrontal cortex. The dopamine beta hydroxylase (DBH) gene is thought to regulate the differential availability of dopamine and norepinephrine in prefrontal cortex. We therefore examined decision-making performance under imperfect computer aiding in 100 participants performing a simulated command and control task. Based on two single nucleotide polymorphisms (SNPs) of the DBH gene, −1041 C/T (rs1611115) and 444 G/A (rs1108580), participants were divided into groups of low and high DBH enzyme activity, where low enzyme activity is associated with greater dopamine relative to norepinephrine levels in cortex. Compared to those in the high DBH enzyme activity group, individuals in the low DBH enzyme activity group were more accurate and speedier in their decisions when incorrect advice was given and verified automation recommendations more frequently. These results indicate that a gene that regulates relative prefrontal cortex dopamine availability, DBH, can identify those individuals who are less susceptible to bias in using computerized decision-aiding systems. PMID:22761865

  13. Establishing the Reliability and Validity of a Computerized Assessment of Children's Working Memory for Use in Group Settings

    ERIC Educational Resources Information Center

    St Clair-Thompson, Helen

    2014-01-01

    The aim of the present study was to investigate the reliability and validity of a brief standardized assessment of children's working memory; "Lucid Recall." Although there are many established assessments of working memory, "Lucid Recall" is fully automated and can therefore be administered in a group setting. It is therefore…

  14. Research on the Impact of a Computerized Circulation System on the Performance of a Large College Library. Final Report.

    ERIC Educational Resources Information Center

    Frohmberg, Katherine A.; Moffett, William A.

    In order to study the effects of introducing an automated circulation system at Oberlin College, Ohio, data were collected from September 1978 until June 1982 on book availability, usage of library facilities, attitudes of library users toward the library, and the efficiency of circulation activities. Data collection methods included circulation…

  15. Specifications for a Computerized Library Circulation Management Data and On-Line Catalog System.

    ERIC Educational Resources Information Center

    Schwarz, Philip J.

    This manual is intended primarily for libraries that wish to purchase a turnkey automated circulation system and online catalog, but lack the staff, time, and expertise to develop a set of specifications, or the money to hire consultants. Specifications are provided to assist in the selection from several options: (1) development of an in-house…

  16. Design of a solar array simulator for the NASA EOS testbed

    NASA Technical Reports Server (NTRS)

    Butler, Steve J.; Sable, Dan M.; Lee, Fred C.; Cho, Bo H.

    1992-01-01

    The present spacecraft solar array simulator addresses both dc and ac characteristics as well as changes in illumination and temperature and performance degradation over the course of array service life. The computerized control system used allows simulation of a complete orbit cycle, in addition to automated diagnostics. The simulator is currently interfaced with the NASA EOS testbed.

  17. Evaluation of Automated Natural Language Processing in the Further Development of Science Information Retrieval. String Program Reports No. 10.

    ERIC Educational Resources Information Center

    Sager, Naomi

    This investigation matches the emerging techniques in computerized natural language processing against emerging needs for such techniques in the information field to evaluate and extend such techniques for future applications and to establish a basis and direction for further research toward these goals. An overview describes developments in the…

  18. Proceedings of the international conference on cybernetics and society

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1985-01-01

    This book presents the papers given at a conference on artificial intelligence, expert systems and knowledge bases. Topics considered at the conference included automating expert system development, modeling expert systems, causal maps, data covariances, robot vision, image processing, multiprocessors, parallel processing, VLSI structures, man-machine systems, human factors engineering, cognitive decision analysis, natural language, computerized control systems, and cybernetics.

  19. Measuring and Enhancing Organizational Productivity: An Annotated Bibliography. Interim Report, April 2, 1980 through June 30, 1980.

    ERIC Educational Resources Information Center

    Tuttle, Thomas C.; And Others

    This report resulted from visits to over 50 organizations in the Air Force, Army, Navy, and civilian sector, and from automated and manual searches of journals and computerized databases. It is a comprehensive annotated bibliography of the literature on productivity measurement and enhancement. The report is organized into four sections:…

  20. Analysis of methods. [information systems evolution environment

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.

    1991-01-01

    Information is one of an organization's most important assets. For this reason, the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment, a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose of the analysis of the modeling methods included in this document is to examine these methods with the goal of including them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity Relationship, Data Flow Diagrams, and Structure Charts, for inclusion in an integrated development support environment.

  1. Cognitive consequences of clumsy automation on high workload, high consequence human performance

    NASA Technical Reports Server (NTRS)

    Cook, Richard I.; Woods, David D.; Mccolligan, Elizabeth; Howie, Michael B.

    1991-01-01

    The growth of computational power has fueled attempts to automate more of the human role in complex problem solving domains, especially those where system faults have high consequences and where periods of high workload may saturate the performance capacity of human operators. Examples of these domains include flightdecks, space stations, air traffic control, nuclear power operation, ground satellite control rooms, and surgical operating rooms. Automation efforts may have unanticipated effects on human performance, particularly if they increase the workload at peak workload times or change the practitioners' strategies for coping with workload. Smooth and effective changes in automation require a detailed understanding of the cognitive tasks confronting the user; this has been called user-centered automation. The introduction of a new computerized technology in a group of hospital operating rooms used for heart surgery was observed. The study revealed how automation, especially 'clumsy automation', affects practitioner work patterns and suggests that clumsy automation constrains users in specific and significant ways. Users tailor both the new system and their tasks in order to accommodate the needs of process and production. The study of this tailoring may prove a powerful tool for exposing previously hidden patterns of user data processing, integration, and decision making which may, in turn, be useful in the design of more effective human-machine systems.

  2. A cognitive task analysis of information management strategies in a computerized provider order entry environment.

    PubMed

    Weir, Charlene R; Nebeker, Jonathan J R; Hicken, Bret L; Campo, Rebecca; Drews, Frank; Lebar, Beth

    2007-01-01

    Computerized Provider Order Entry (CPOE) with electronic documentation, and computerized decision support dramatically changes the information environment of the practicing clinician. Prior work patterns based on paper, verbal exchange, and manual methods are replaced with automated, computerized, and potentially less flexible systems. The objective of this study is to explore the information management strategies that clinicians use in the process of adapting to a CPOE system using cognitive task analysis techniques. Observation and semi-structured interviews were conducted with 88 primary-care clinicians at 10 Veterans Administration Medical Centers. Interviews were taped, transcribed, and extensively analyzed to identify key information management goals, strategies, and tasks. Tasks were aggregated into groups, common components across tasks were clarified, and underlying goals and strategies identified. Nearly half of the identified tasks were not fully supported by the available technology. Six core components of tasks were identified. Four meta-cognitive information management goals emerged: 1) Relevance Screening; 2) Ensuring Accuracy; 3) Minimizing memory load; and 4) Negotiating Responsibility. Strategies used to support these goals are presented. Users develop a wide array of information management strategies that allow them to successfully adapt to new technology. Supporting the ability of users to develop adaptive strategies to support meta-cognitive goals is a key component of a successful system.

  3. Virginia Transit Performance Evaluation Package (VATPEP).

    DOT National Transportation Integrated Search

    1987-01-01

    The Virginia Transit Performance Evaluation Package (VATPEP), a computer software package, is documented. This is the computerized version of the methodology used by the Virginia Department of Transportation to evaluate the performance of public tran...

  4. Driving out errors through tight integration between software and automation.

    PubMed

    Reifsteck, Mark; Swanson, Thomas; Dallas, Mary

    2006-01-01

    A clear case has been made for using clinical IT to improve medication safety, particularly bar-code point-of-care medication administration and computerized practitioner order entry (CPOE) with clinical decision support. The equally important role of automation has been overlooked. When the two are tightly integrated, with pharmacy information serving as a hub, the distinctions between software and automation become blurred. A true end-to-end medication management system drives out errors from the dockside to the bedside. Presbyterian Healthcare Services in Albuquerque has been building such a system since 1999, beginning by automating pharmacy operations to support bar-coded medication administration. Encouraged by those results, it then began layering on software to further support clinician workflow and improve communication, culminating with the deployment of CPOE and clinical decision support. This combination, plus a hard-wired culture of safety, has resulted in a dramatically lower mortality and harm rate that could not have been achieved with a partial solution.

  5. The State and Trends of Barcode, RFID, Biometric and Pharmacy Automation Technologies in US Hospitals

    PubMed Central

    Uy, Raymonde Charles Y.; Kury, Fabricio P.; Fontelo, Paul A.

    2015-01-01

    The standard of safe medication practice requires strict observance of the five rights of medication administration: the right patient, drug, time, dose, and route. Despite adherence to these guidelines, medication errors remain a public health concern that has generated health policies and hospital processes that leverage automation and computerization to reduce these errors. Bar code, RFID, biometrics and pharmacy automation technologies have been demonstrated in the literature to decrease the incidence of medication errors by minimizing human factors involved in the process. Despite evidence suggesting the effectiveness of these technologies, adoption rates and trends vary across hospital systems. The objective of this study is to examine the state and adoption trends of automatic identification and data capture (AIDC) methods and pharmacy automation technologies in U.S. hospitals. A retrospective descriptive analysis of survey data from the HIMSS Analytics® Database was done, demonstrating an optimistic growth in the adoption of these patient safety solutions. PMID:26958264

  6. Development of automated high throughput single molecular microfluidic detection platform for signal transduction analysis

    NASA Astrophysics Data System (ADS)

    Huang, Po-Jung; Baghbani Kordmahale, Sina; Chou, Chao-Kai; Yamaguchi, Hirohito; Hung, Mien-Chie; Kameoka, Jun

    2016-03-01

    Signal transductions including multiple protein post-translational modifications (PTM), protein-protein interactions (PPI), and protein-nucleic acid interaction (PNI) play critical roles in cell proliferation and differentiation that are directly related to cancer biology. Traditional methods, like mass spectrometry, immunoprecipitation, fluorescence resonance energy transfer, and fluorescence correlation spectroscopy, require a large amount of sample and long processing time. The "microchannel for multiple-parameter analysis of proteins in single-complex" (mMAPS) approach we propose can reduce the process time and sample volume because the system is composed of microfluidic channels, fluorescence microscopy, and computerized data analysis. In this paper, we present an automated mMAPS including an integrated microfluidic device, automated stage and electrical relay for high-throughput clinical screening. Based on this result, we estimated that this automated detection system will be able to screen approximately 150 patient samples in a 24-hour period, providing a practical application to analyze tissue samples in a clinical setting.

  7. Sled Control and Safety System

    NASA Technical Reports Server (NTRS)

    Forrest, L. J.

    1982-01-01

    Computerized system for controlling motion of linear-track accelerator applied to other automated equipment, such as numerically-controlled machine tools and robot manipulators on assembly lines. System controls motions of sled with sine-wave signal created digitally by microprocessor. Dynamic parameters of sled motion are monitored so sled may be stopped safely if malfunction occurs. Sled is capable of sinusoidal accelerations up to 0.5 g with 125-kg load.
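
    A brief sketch of the control idea described above (a digitally generated sine-wave command with a safety limit on peak acceleration). The amplitude, frequency, sample period, and the 0.5 g limit check are written for illustration; the actual microprocessor implementation is not reproduced here.

```python
# Illustrative sketch: generate a digital sine-wave position command and refuse
# profiles whose peak acceleration would exceed a 0.5 g safety limit.
import math

G = 9.81  # m/s^2

def sine_profile(amplitude_m, freq_hz, dt=0.001, duration_s=2.0, max_accel=0.5 * G):
    omega = 2 * math.pi * freq_hz
    peak_accel = amplitude_m * omega ** 2          # peak of a*w^2*sin(w*t)
    if peak_accel > max_accel:
        raise ValueError(f"peak acceleration {peak_accel:.2f} m/s^2 exceeds limit")
    n = int(duration_s / dt)
    return [amplitude_m * math.sin(omega * i * dt) for i in range(n)]

profile = sine_profile(amplitude_m=0.05, freq_hz=1.0)
print(len(profile), round(max(profile), 3))
```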

  8. IFC BIM-Based Methodology for Semi-Automated Building Energy Performance Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bazjanac, Vladimir

    2008-07-01

    Building energy performance (BEP) simulation is still rarely used in building design, commissioning and operations. The process is too costly and too labor intensive, and it takes too long to deliver results. Its quantitative results are not reproducible due to arbitrary decisions and assumptions made in simulation model definition, and can be trusted only under special circumstances. A methodology to semi-automate BEP simulation preparation and execution makes this process much more effective. It incorporates principles of information science and aims to eliminate inappropriate human intervention that results in subjective and arbitrary decisions. This is achieved by automating every part of the BEP modeling and simulation process that can be automated, by relying on data from original sources, and by making any necessary data transformation rule-based and automated. This paper describes the new methodology and its relationship to IFC-based BIM and software interoperability. It identifies five steps that are critical to its implementation, and shows what part of the methodology can be applied today. The paper concludes with a discussion of application to simulation with EnergyPlus, and describes data transformation rules embedded in the new Geometry Simplification Tool (GST).

  9. An evaluation of NASA's program in human factors research: Aircrew-vehicle system interaction

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Research in human factors in the aircraft cockpit and a proposed program augmentation were reviewed. The dramatic growth of microprocessor technology makes it entirely feasible to automate increasingly more functions in the aircraft cockpit; the promise of improved vehicle performance, efficiency, and safety through automation makes highly automated flight inevitable. An organized data base and a validated methodology for predicting the effects of automation on human performance, and thus on safety, are lacking; without them, increased automation may introduce new risks. Efforts should be concentrated on developing methods and techniques for analyzing man-machine interactions, including human workload and prediction of performance.

  10. A Multisite, Randomized Controlled Clinical Trial of Computerized Cognitive Remediation Therapy for Schizophrenia.

    PubMed

    Gomar, Jesús J; Valls, Elia; Radua, Joaquim; Mareca, Celia; Tristany, Josep; del Olmo, Francisco; Rebolleda-Gil, Carlos; Jañez-Álvarez, María; de Álvaro, Francisco J; Ovejero, María R; Llorente, Ana; Teixidó, Cristina; Donaire, Ana M; García-Laredo, Eduardo; Lazcanoiturburu, Andrea; Granell, Luis; Mozo, Cristina de Pablo; Pérez-Hernández, Mónica; Moreno-Alcázar, Ana; Pomarol-Clotet, Edith; McKenna, Peter J

    2015-11-01

    The effectiveness of cognitive remediation therapy (CRT) for the neuropsychological deficits seen in schizophrenia is supported by meta-analysis. However, a recent methodologically rigorous trial had negative findings. In this study, 130 chronic schizophrenic patients were randomly assigned to computerized CRT, an active computerized control condition (CC) or treatment as usual (TAU). Primary outcome measures were 2 ecologically valid batteries of executive function and memory, rated under blind conditions; other executive and memory tests and a measure of overall cognitive function were also employed. Carer ratings of executive and memory failures in daily life were obtained before and after treatment. Computerized CRT was found to produce improvement on the training tasks, but this did not transfer to gains on the primary outcome measures and most other neuropsychological tests in comparison to either CC or TAU conditions. Nor did the intervention result in benefits on carer ratings of daily life cognitive failures. According to this study, computerized CRT is not effective in schizophrenia. The use of both active and passive CCs suggests that nature of the control group is not an important factor influencing results. © The Author 2015. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center.

  11. A HUMAN AUTOMATION INTERACTION CONCEPT FOR A SMALL MODULAR REACTOR CONTROL ROOM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Le Blanc, Katya; Spielman, Zach; Hill, Rachael

    Many advanced nuclear power plant (NPP) designs incorporate higher degrees of automation than the existing fleet of NPPs. Automation is being introduced or proposed in NPPs through a wide variety of systems and technologies, such as advanced displays, computer-based procedures, advanced alarm systems, and computerized operator support systems. Additionally, many new reactor concepts, both full scale and small modular reactors, are proposing increased automation and reduced staffing as part of their concept of operations. However, research consistently finds that there is a fundamental tradeoff between system performance with increased automation and reduced human performance. There is a need to address the question of how to achieve high performance and efficiency of high levels of automation without degrading human performance. One example of a new NPP concept that will utilize greater degrees of automation is the SMR concept from NuScale Power. The NuScale Power design requires 12 modular units to be operated in one single control room, which leads to a need for higher degrees of automation in the control room. Idaho National Laboratory (INL) researchers and NuScale Power human factors and operations staff are working on a collaborative project to address the human performance challenges of increased automation and to determine the principles that lead to optimal performance in highly automated systems. This paper describes this concept in detail and presents an experimental test of it. The benefits and challenges of the approach will be discussed.

  12. Human factor engineering based design and modernization of control rooms with new I and C systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larraz, J.; Rejas, L.; Ortega, F.

    2012-07-01

    Instrumentation and Control (I and C) systems of the latest nuclear power plants are based on the use of digital technology, distributed control systems and the integration of information in data networks (Distributed Control and Instrumentation Systems). This has a repercussion on Control Rooms (CRs), where the operations and monitoring interfaces correspond to these systems. These technologies are also used in modernizing I and C systems in currently operative nuclear power plants. The new interfaces provide additional capabilities for operation and supervision, as well as a high degree of flexibility, versatility and reliability. An example of this is the implementation of solutions such as compact stations, high level supervision screens, overview displays, computerized procedures, new operational support systems or intelligent alarms processing systems in the modernized Man-Machine Interface (MMI). These changes in the MMI are accompanied by newly added Software (SW) controls and new solutions in automation. Tecnatom has been leading various projects in this area for several years, both in Asian countries and in the United States, using in all cases international standards from which Tecnatom's own methodologies have been developed and optimized. The experience acquired in applying this methodology to the design of new control rooms is to a large extent also applicable to the modernization of current control rooms. An adequate design of the interface between the operator and the systems will facilitate safe operation, contribute to the prompt identification of problems and help in the distribution of tasks and communications between the different members of the operating shift. Based on Tecnatom's experience in the field, this article presents the methodological approach used as well as the most relevant aspects of this kind of project. (authors)

  13. Current and emerging business models in the health care information technology industry: a view from wall street.

    PubMed

    Frank, Seth

    2003-01-01

    When we think about health care IT, we don't just think about clinical automation with the movement to computerized physician order entry (CPOE), but also about the need to upgrade legacy financial and administrative systems to interact with clinical systems. Technology acceptance by physicians remains low, and computer use by physicians for data entry and analysis remains minimal. We expect this trend to change, and expect increased automation to represent gradual change. The HCIT space is dynamic, with many opportunities, but also many challenges. The unique nature of the end-market buyers, existing business models, and the technology itself makes this a challenging but dynamic area for equity investment.

  14. Biometric correspondence between reface computerized facial approximations and CT-derived ground truth skin surface models objectively examined using an automated facial recognition system.

    PubMed

    Parks, Connie L; Monson, Keith L

    2018-05-01

    This study employed an automated facial recognition system as a means of objectively evaluating biometric correspondence between a ReFace facial approximation and the computed tomography (CT) derived ground truth skin surface of the same individual. High rates of biometric correspondence were observed, irrespective of rank class (Rk) or demographic cohort examined. Overall, 48% of the test subjects' ReFace approximation probes (n=96) were matched to their corresponding ground truth skin surface image at R1, a rank indicating a high degree of biometric correspondence and a potential positive identification. Identification rates improved with each successively broader rank class (R10 = 85%, R25 = 96%, and R50 = 99%), with 100% identification by R57. A sharp increase (39% mean increase) in identification rates was observed between R1 and R10 across most rank classes and demographic cohorts. In contrast, significantly lower (p < 0.01) increases in identification rates were observed between R10 and R25 (8% mean increase) and between R25 and R50 (3% mean increase). No significant (p > 0.05) performance differences were observed across demographic cohorts or CT scan protocols. Performance measures observed in this research suggest that ReFace approximations are biometrically similar to the actual faces of the approximated individuals and, therefore, may have potential operational utility in contexts in which computerized approximations are utilized as probes in automated facial recognition systems. Copyright © 2018. Published by Elsevier B.V.
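
    For concreteness, the sketch below computes rank-k identification rates (a cumulative match characteristic) from a probe-versus-gallery similarity matrix. The similarity scores are synthetic; the face matcher and the study's data are not reproduced.

```python
# Minimal sketch (illustrative, not the study's pipeline): rank-k identification
# rates from a probe-vs-gallery similarity matrix.
import numpy as np

rng = np.random.default_rng(1)
n = 96                                          # probes; mate of probe i is gallery item i
similarity = rng.normal(size=(n, n))
similarity[np.arange(n), np.arange(n)] += 1.5   # mates tend to score higher (synthetic)

# rank of the true mate for each probe (1 = best match)
order = np.argsort(-similarity, axis=1)
ranks = np.argmax(order == np.arange(n)[:, None], axis=1) + 1

for k in (1, 10, 25, 50):
    print(f"rank-{k} identification rate: {np.mean(ranks <= k):.0%}")
```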

  15. Evaluation of roadway sites for queue management.

    DOT National Transportation Integrated Search

    1991-01-01

    This study addresses the problem of queueing on highway facilities, wherein a large number of computerized methods for the analysis of different queueing situations are available. A three-tier classification system of the methodologies was used with ...

  16. User's Guide for a Computerized Track Maintenance Simulation Cost Methodology

    DOT National Transportation Integrated Search

    1982-02-01

    This User's Guide describes the simulation cost modeling technique developed for costing of maintenance operations of track and its component structures. The procedure discussed provides for separate maintenance cost entries to be associated with def...

  17. An algorithm for analytical solution of basic problems featuring elastostatic bodies with cavities and surface flaws

    NASA Astrophysics Data System (ADS)

    Penkov, V. B.; Levina, L. V.; Novikova, O. S.; Shulmin, A. S.

    2018-03-01

    Herein we propose a methodology for structuring a full parametric analytical solution to problems featuring elastostatic media based on state-of-the-art computing facilities that support computerized algebra. The methodology includes: direct and reverse application of P-Theorem; methods of accounting for physical properties of media; accounting for variable geometrical parameters of bodies, parameters of boundary states, independent parameters of volume forces, and remote stress factors. An efficient tool to address the task is the sustainable method of boundary states originally designed for the purposes of computerized algebra and based on the isomorphism of Hilbertian spaces of internal states and boundary states of bodies. We performed full parametric solutions of basic problems featuring a ball with a nonconcentric spherical cavity, a ball with a near-surface flaw, and an unlimited medium with two spherical cavities.

  18. Using an integrated automated system to optimize retention and increase frequency of blood donations.

    PubMed

    Whitney, J Garrett; Hall, Robert F

    2010-07-01

    This study examines the impact of an integrated, automated phone system to reinforce retention and increase frequency of donations among blood donors. Cultivated by incorporating data results over the past 7 years, the system uses computerized phone messaging to contact blood donors with individualized, multilevel notifications. Donors are contacted at planned intervals to acknowledge and recognize their donations, informed where their blood was sent, asked to participate in a survey, and reminded when they are eligible to donate again. The report statistically evaluates the impact of the various components of the system on donor retention and blood donations and quantifies the fiscal advantages to blood centers. By using information and support systems provided by the automated services and then incorporating the phlebotomists and recruiters to reinforce donor retention, both retention and donations will increase. © 2010 American Association of Blood Banks.

  19. Thinking Together: Modeling Clinical Decision-Support as a Sociotechnical System

    PubMed Central

    Hussain, Mustafa I.; Reynolds, Tera L.; Mousavi, Fatemeh E.; Chen, Yunan; Zheng, Kai

    2017-01-01

    Computerized clinical decision-support systems are members of larger sociotechnical systems, composed of human and automated actors, who send, receive, and manipulate artifacts. Sociotechnical consideration is rare in the literature. This makes it difficult to comparatively evaluate the success of CDS implementations, and it may also indicate that sociotechnical context receives inadequate consideration in practice. To facilitate sociotechnical consideration, we developed the Thinking Together model, a flexible diagrammatical means of representing CDS systems as sociotechnical systems. To develop this model, we examined the literature with the lens of Distributed Cognition (DCog) theory. We then present two case studies of vastly different CDSSs, one almost fully automated and the other with minimal automation, to illustrate the flexibility of the Thinking Together model. We show that this model, informed by DCog and the CDS literature, is capable of supporting both research, by enabling comparative evaluation, and practice, by facilitating explicit sociotechnical planning and communication. PMID:29854164

  20. Computerized method for detection of vertebral fractures on lateral chest radiographs based on morphometric data

    NASA Astrophysics Data System (ADS)

    Kasai, Satoshi; Li, Feng; Shiraishi, Junji; Li, Qiang; Straus, Christopher; Vokes, Tamara; MacMahon, Heber; Doi, Kunio

    2007-03-01

    Vertebral fractures are the most common osteoporosis-related fractures. It is important to detect vertebral fractures, because they are associated with increased risk of subsequent fractures, and because pharmacologic therapy can reduce the risk of subsequent fractures. Although vertebral fractures are often not clinically recognized, they can be visualized on lateral chest radiographs taken for other purposes. However, only 15-60% of vertebral fractures found on lateral chest radiographs are mentioned in radiology reports. The purpose of this study was to develop a computerized method for detection of vertebral fractures on lateral chest radiographs in order to assist radiologists' image interpretation. Our computerized method is based on the automated identification of upper and lower vertebral edges. In order to develop the scheme, radiologists provided morphometric data for each identifiable vertebra, which consisted of six points for each vertebra, for 25 normals and 20 cases with severe fractures. Anatomical information was obtained from morphometric data of normal cases in terms of vertebral heights, heights of vertebral disk spaces, and vertebral centerline. Computerized detection of vertebral fractures was based on the reduction in the heights of fractured vertebrae compared to adjacent vertebrae and normal reference data. Vertebral heights from morphometric data on normal cases were used as reference. On 138 chest radiographs (20 with fractures) the sensitivity of our method for detection of fracture cases was 95% (19/20) with 0.93 (110/118) false-positives per image. In conclusion, the computerized method would be useful for detection of potentially overlooked vertebral fractures on lateral chest radiographs.
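
    A minimal sketch of the height-based detection idea, assuming six morphometric points per vertebra (anterior, middle, and posterior points on the upper and lower edges) and a fixed fractional height-reduction threshold. The point layout, the 20% threshold, and the reference heights are illustrative assumptions, not the paper's parameters.

```python
# Minimal sketch (not the authors' scheme): flag a vertebra as fractured when
# its mean height is reduced relative to its neighbours and a reference height.
import numpy as np

def vertebral_height(points):
    """points: (6, 2) array ordered as upper(ant, mid, post), lower(ant, mid, post)."""
    upper, lower = np.asarray(points[:3]), np.asarray(points[3:])
    return np.mean(np.linalg.norm(upper - lower, axis=1))

def flag_fractures(vertebra_points, reference_heights, reduction=0.20):
    heights = np.array([vertebral_height(p) for p in vertebra_points])
    flags = []
    for i, h in enumerate(heights):
        neighbours = [heights[j] for j in (i - 1, i + 1) if 0 <= j < len(heights)]
        expected = np.mean(neighbours + [reference_heights[i]])
        flags.append(h < (1.0 - reduction) * expected)
    return flags

# Hypothetical morphometry for three vertebrae; the middle one is compressed.
v = lambda h: [(0, h), (10, h), (20, h), (0, 0), (10, 0), (20, 0)]
vertebrae = [v(22.0), v(14.0), v(23.0)]
print(flag_fractures(vertebrae, reference_heights=[22.0, 22.5, 23.0]))
# -> [False, True, False]
```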

  1. Detecting errors and anomalies in computerized materials control and accountability databases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiteson, R.; Hench, K.; Yarbro, T.

    The Automated MC and A Database Assessment project is aimed at improving anomaly and error detection in materials control and accountability (MC and A) databases and increasing confidence in the data that they contain. Anomalous data resulting in poor categorization of nuclear material inventories greatly reduces the value of the database information to users. Therefore it is essential that MC and A data be assessed periodically for anomalies or errors. Anomaly detection can identify errors in databases and thus provide assurance of the integrity of data. An expert system has been developed at Los Alamos National Laboratory that examines these large databases for anomalous or erroneous data. For several years, MC and A subject matter experts at Los Alamos have been using this automated system to examine the large amounts of accountability data that the Los Alamos Plutonium Facility generates. These data are collected and managed by the Material Accountability and Safeguards System, a near-real-time computerized nuclear material accountability and safeguards system. This year they have expanded the user base, customizing the anomaly detector for the varying requirements of different groups of users. This paper describes the progress in customizing the expert systems to the needs of the users of the data and reports on their results.

  2. A computerized method for automated identification of erect posteroanterior and supine anteroposterior chest radiographs

    NASA Astrophysics Data System (ADS)

    Kao, E.-Fong; Lin, Wei-Chen; Hsu, Jui-Sheng; Chou, Ming-Chung; Jaw, Twei-Shiun; Liu, Gin-Chung

    2011-12-01

    A computerized scheme was developed for automated identification of erect posteroanterior (PA) and supine anteroposterior (AP) chest radiographs. The method was based on three features, the tilt angle of the scapula superior border, the tilt angle of the clavicle and the extent of radiolucence in lung fields, to identify the view of a chest radiograph. The three indices A_scapula, A_clavicle and C_lung were determined from a chest image for the three features. Linear discriminant analysis was used to classify PA and AP chest images based on the three indices. The performance of the method was evaluated by receiver operating characteristic analysis. The proposed method was evaluated using a database of 600 PA and 600 AP chest radiographs. The discriminant performances (A_z) of A_scapula, A_clavicle and C_lung were 0.878 ± 0.010, 0.683 ± 0.015 and 0.962 ± 0.006, respectively. The combination of the three indices obtained an A_z value of 0.979 ± 0.004. The results indicate that the combination of the three indices could yield high discriminant performance. The proposed method could provide radiologists with information about the view of chest radiographs for interpretation or could be used as a preprocessing step for analyzing chest images.
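
    To make the classification step concrete, here is a sketch of linear discriminant analysis over three view indices, with the combined discriminant score evaluated by ROC analysis. The synthetic feature distributions and the resulting resubstitution AUC are illustrative only; they are not the study's data or its reported A_z values.

```python
# Minimal sketch (not the authors' code): LDA on three view indices
# (scapula tilt, clavicle tilt, lung-field radiolucence) with ROC evaluation.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n = 600
# columns: A_scapula, A_clavicle, C_lung (synthetic distributions per view)
pa = np.column_stack([rng.normal(35, 8, n), rng.normal(5, 6, n), rng.normal(0.3, 0.1, n)])
ap = np.column_stack([rng.normal(20, 8, n), rng.normal(10, 6, n), rng.normal(0.6, 0.1, n)])
X = np.vstack([pa, ap])
y = np.array([0] * n + [1] * n)        # 0 = erect PA, 1 = supine AP

lda = LinearDiscriminantAnalysis().fit(X, y)
scores = lda.decision_function(X)
print(f"combined-index AUC (resubstitution): {roc_auc_score(y, scores):.3f}")
```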

  3. Computerized screening devices and performance assessment: development of a policy towards automation. International Academy of Cytology Task Force summary. Diagnostic Cytology Towards the 21st Century: An International Expert Conference and Tutorial.

    PubMed

    Bartels, P H; Bibbo, M; Hutchinson, M L; Gahm, T; Grohs, H K; Gwi-Mak, E; Kaufman, E A; Kaufman, R H; Knight, B K; Koss, L G; Magruder, L E; Mango, L J; McCallum, S M; Melamed, M R; Peebles, A; Richart, R M; Robinowitz, M; Rosenthal, D L; Sauer, T; Schenck, U; Tanaka, N; Topalidis, T; Verhest, A P; Wertlake, P T; Wilbur, D C

    1998-01-01

    The extension of automation to the diagnostic assessment of clinical materials raises issues of professional responsibility, on the part of both the medical professional and designer of the device. The International Academy of Cytology (IAC) and other professional cytology societies should develop a policy towards automation in the diagnostic assessment of clinical cytologic materials. The following summarizes the discussion of the initial position statement at the International Expert Conference on Diagnostic Cytology Towards the 21st Century, Hawaii, June 1997. 1. The professional in charge of a clinical cytopathology laboratory continues to bear the ultimate medical responsibility for diagnostic decisions made at the facility, whether automated devices are involved or not. 2. The introduction of automated procedures into clinical cytology should under no circumstances lead to a lowering of standards of performance. A prime objective of any guidelines should be to ensure that an automated procedure, in principle, does not expose any patient to new risks, nor should it increase already-existing, inherent risks. 3. Automated devices should provide capabilities for the medical professional to conduct periodic tests of the appropriate performance of the device. 4. Supervisory personnel should continue visual quality control screening of a certain percentage of slides dismissed at primary screening as within normal limits (WNL), even when automated procedures are employed in the laboratory. 5. Specifications for the design of primary screening devices for the detection of cervical cancer issued by the IAC in 1984 were reaffirmed. 6. The setting of numeric performance criteria is the proper charge of regulatory agencies, which also have the power of enforcement. 7. Human expert verification of results represents the "gold standard" at this time. Performance characteristics of computerized cytology devices should be determined by adherence to defined and well-considered protocols. Manufacturers should not claim a new standard of care; this is the responsibility of the medical community and professional groups. 8. Cytology professionals should support the development of procedures that bring about an improvement in diagnostic decision making. Advances in technology should be adopted if they can help solve problems in clinical cytology. The introduction of automated procedures into diagnostic decision making should take place strictly under the supervision and with the active participation and critical evaluation by the professional cytology community. Guidelines should be developed for the communication of technical information about the performance of automated screening devices by the IAC to governmental agencies and national societies. Also, guidelines are necessary for the official communication of IAC concerns to industry, medicolegal entities and the media. Procedures and guidelines for the evaluation of studies pertaining to the performance of automated devices, performance metrics and definitions for evaluation criteria should be established.

  4. Formal verification of human-automation interaction

    NASA Technical Reports Server (NTRS)

    Degani, Asaf; Heymann, Michael

    2002-01-01

    This paper discusses a formal and rigorous approach to the analysis of operator interaction with machines. It addresses the acute problem of detecting design errors in human-machine interaction and focuses on verifying the correctness of the interaction in complex and automated control systems. The paper describes a systematic methodology for evaluating whether the interface provides the necessary information about the machine to enable the operator to perform a specified task successfully and unambiguously. It also addresses the adequacy of information provided to the user via training material (e.g., user manual) about the machine's behavior. The essentials of the methodology, which can be automated and applied to the verification of large systems, are illustrated by several examples and through a case study of pilot interaction with an autopilot aboard a modern commercial aircraft. The expected application of this methodology is an augmentation and enhancement, by formal verification, of human-automation interfaces.

  5. Comparison of warfarin therapy clinical outcomes following implementation of an automated mobile phone-based critical laboratory value text alert system.

    PubMed

    Lin, Shu-Wen; Kang, Wen-Yi; Lin, Dong-Tsamn; Lee, James; Wu, Fe-Lin; Chen, Chuen-Liang; Tseng, Yufeng J

    2014-01-01

    Computerized alert and reminder systems have been widely accepted and applied to various patient care settings, with increasing numbers of clinical laboratories communicating critical laboratory test values to professionals via either manual notification or automated alerting systems/computerized reminders. Warfarin, an oral anticoagulant, exhibits a narrow therapeutic range between treatment response and adverse events. It requires close monitoring of prothrombin time (PT)/international normalized ratio (INR) to ensure patient safety. This study aimed to evaluate clinical outcomes of patients on warfarin therapy following implementation of a Personal Handy-phone System-based (PHS) alert system capable of generating and delivering text messages to communicate critical PT/INR laboratory results to practitioners' mobile phones in a large tertiary teaching hospital. A retrospective analysis was performed comparing patient clinical outcomes and physician prescribing behavior following conversion from a manual laboratory result alert system to an automated system. Clinical outcomes and practitioner responses to both alert systems were compared. Complications to warfarin therapy, warfarin utilization, and PT/INR results were evaluated for both systems, as well as clinician time to read alert messages, time to warfarin therapy modification, and monitoring frequency. No significant differences were detected in major hemorrhage and thromboembolism, warfarin prescribing patterns, PT/INR results, warfarin therapy modification, or monitoring frequency following implementation of the PHS text alert system. In both study periods, approximately 80% of critical results led to warfarin discontinuation or dose reduction. Senior physicians' follow-up response time to critical results was significantly decreased in the PHS alert study period (46.3% responded within 1 day) compared to the manual notification study period (24.7%; P = 0.015). No difference in follow-up response time was detected for junior physicians. Implementation of an automated PHS-based text alert system did not adversely impact clinical or safety outcomes of patients on warfarin therapy. Approximately 80% immediate recognition of text alerts was achieved. The potential benefits of an automated PHS alert for senior physicians were demonstrated.
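
    The alerting logic itself is simple to illustrate. The toy sketch below flags a critical PT/INR result and formats a short text message for the responsible clinician; the critical threshold of 5.0 and the message format are assumptions, not the hospital's actual rules.

    ```python
    # Toy sketch of critical-value text alerting of the kind described above.
    # The critical INR threshold and message format are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class LabResult:
        patient_id: str
        inr: float
        physician_phone: str

    CRITICAL_INR = 5.0  # assumed critical threshold

    def build_alert(result: LabResult):
        """Return (phone, message) when the INR is critical, otherwise None."""
        if result.inr >= CRITICAL_INR:
            return (result.physician_phone,
                    f"CRITICAL INR {result.inr:.1f} for patient {result.patient_id}; "
                    f"please review warfarin dose.")
        return None

    alert = build_alert(LabResult("P-1029", 6.2, "+886-9xx-xxx-xxx"))
    if alert:
        phone, message = alert
        print(f"send to {phone}: {message}")
    ```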

  6. A Cognitive Task Analysis of Information Management Strategies in a Computerized Provider Order Entry Environment

    PubMed Central

    Weir, Charlene R.; Nebeker, Jonathan J.R.; Hicken, Bret L.; Campo, Rebecca; Drews, Frank; LeBar, Beth

    2007-01-01

    Objective Computerized Provider Order Entry (CPOE) with electronic documentation, and computerized decision support dramatically changes the information environment of the practicing clinician. Prior work patterns based on paper, verbal exchange, and manual methods are replaced with automated, computerized, and potentially less flexible systems. The objective of this study is to explore the information management strategies that clinicians use in the process of adapting to a CPOE system using cognitive task analysis techniques. Design Observation and semi-structured interviews were conducted with 88 primary-care clinicians at 10 Veterans Administration Medical Centers. Measurements Interviews were taped, transcribed, and extensively analyzed to identify key information management goals, strategies, and tasks. Tasks were aggregated into groups, common components across tasks were clarified, and underlying goals and strategies identified. Results Nearly half of the identified tasks were not fully supported by the available technology. Six core components of tasks were identified. Four meta-cognitive information management goals emerged: 1) Relevance Screening; 2) Ensuring Accuracy; 3) Minimizing memory load; and 4) Negotiating Responsibility. Strategies used to support these goals are presented. Conclusion Users develop a wide array of information management strategies that allow them to successfully adapt to new technology. Supporting the ability of users to develop adaptive strategies to support meta-cognitive goals is a key component of a successful system. PMID:17068345

  7. Automation and robotics for the Space Station - An ATAC perspective

    NASA Technical Reports Server (NTRS)

    Nunamaker, Robert R.

    1989-01-01

    The study of automation and robotics for the Space Station by the Advanced Technology Advisory Committee is surveyed. The formation of the committee and the methodology for the Space Station automation study are discussed. The committee's recommendations for automation and robotics research and development are listed.

  8. A case study on the impacts of computerized provider order entry (CPOE) system on hospital clinical workflow.

    PubMed

    Mominah, Maher; Yunus, Faisel; Househ, Mowafa S

    2013-01-01

    Computerized provider order entry (CPOE) is a health informatics system that helps health care providers create and manage orders for medications and other health care services. Through the automation of the ordering process, CPOE has improved the overall efficiency of hospital processes and workflow. In Saudi Arabia, CPOE has been used for years, with only a few studies evaluating the impacts of CPOE on clinical workflow. In this paper, we discuss the experience of a local hospital with the use of CPOE and its impacts on clinical workflow. Results show that there are many issues related to the implementation and use of CPOE within Saudi Arabia that must be addressed, including design, training, medication errors, alert fatigue, and system dependence. Recommendations for improving CPOE use within Saudi Arabia are also discussed.

  9. Automated tumor analysis for molecular profiling in lung cancer

    PubMed Central

    Boyd, Clinton; James, Jacqueline A.; Loughrey, Maurice B.; Hougton, Joseph P.; Boyle, David P.; Kelly, Paul; Maxwell, Perry; McCleary, David; Diamond, James; McArt, Darragh G.; Tunstall, Jonathon; Bankhead, Peter; Salto-Tellez, Manuel

    2015-01-01

    The discovery and clinical application of molecular biomarkers in solid tumors increasingly relies on nucleic acid extraction from FFPE tissue sections and subsequent molecular profiling. This in turn requires the pathological review of haematoxylin & eosin (H&E) stained slides, to ensure sample quality, tumor DNA sufficiency by visually estimating the percentage tumor nuclei and tumor annotation for manual macrodissection. In this study on NSCLC, we demonstrate considerable variation in tumor nuclei percentage between pathologists, potentially undermining the precision of NSCLC molecular evaluation and emphasising the need for quantitative tumor evaluation. We subsequently describe the development and validation of a system called TissueMark for automated tumor annotation and percentage tumor nuclei measurement in NSCLC using computerized image analysis. Evaluation of 245 NSCLC slides showed precise automated tumor annotation of cases using TissueMark, strong concordance with manually drawn boundaries and identical EGFR mutational status, following manual macrodissection from the image analysis generated tumor boundaries. Automated analysis of cell counts for % tumor measurements by TissueMark showed reduced variability and significant correlation (p < 0.001) with benchmark tumor cell counts. This study demonstrates a robust image analysis technology that can facilitate the automated quantitative analysis of tissue samples for molecular profiling in discovery and diagnostics. PMID:26317646

  10. Automated segmentation and reconstruction of patient-specific cardiac anatomy and pathology from in vivo MRI*

    NASA Astrophysics Data System (ADS)

    Ringenberg, Jordan; Deo, Makarand; Devabhaktuni, Vijay; Filgueiras-Rama, David; Pizarro, Gonzalo; Ibañez, Borja; Berenfeld, Omer; Boyers, Pamela; Gold, Jeffrey

    2012-12-01

    This paper presents an automated method to segment left ventricle (LV) tissues from functional and delayed-enhancement (DE) cardiac magnetic resonance imaging (MRI) scans using a sequential multi-step approach. First, a region of interest (ROI) is computed to create a subvolume around the LV using morphological operations and image arithmetic. From the subvolume, the myocardial contours are automatically delineated using difference of Gaussians (DoG) filters and GSV snakes. These contours are used as a mask to identify pathological tissues, such as fibrosis or scar, within the DE-MRI. The presented automated technique is able to accurately delineate the myocardium and identify the pathological tissue in patient sets. The results were validated by two expert cardiologists, and in one set the automated results are quantitatively and qualitatively compared with expert manual delineation. Furthermore, the method is patient-specific, performed on an entire patient MRI series. Thus, in addition to providing a quick analysis of individual MRI scans, the fully automated segmentation method is used for effectively tagging regions in order to reconstruct computerized patient-specific 3D cardiac models. These models can then be used in electrophysiological studies and surgical strategy planning.
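
    As a small illustration of one ingredient named above, the sketch below applies a difference-of-Gaussians (DoG) filter to a synthetic short-axis slice using scipy; the sigma values and the toy image are assumptions, and this is not the authors' pipeline.

    ```python
    # Minimal sketch of a difference-of-Gaussians (DoG) edge-enhancement step of the kind
    # mentioned above; sigma values and the synthetic image are assumptions.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def difference_of_gaussians(image, sigma_low=1.0, sigma_high=3.0):
        """Band-pass the image by subtracting a coarse Gaussian blur from a fine one."""
        return gaussian_filter(image, sigma_low) - gaussian_filter(image, sigma_high)

    # Synthetic "short-axis slice": a bright ring standing in for myocardium.
    yy, xx = np.mgrid[0:128, 0:128]
    radius = np.hypot(yy - 64, xx - 64)
    image = np.where((radius > 20) & (radius < 30), 1.0, 0.0) + 0.05 * np.random.rand(128, 128)

    dog = difference_of_gaussians(image)
    print("DoG response range:", float(dog.min()), float(dog.max()))
    ```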

  11. Detection of drugs and explosives using neutron computerized tomography and artificial intelligence techniques.

    PubMed

    Ferreira, F J O; Crispim, V R; Silva, A X

    2010-06-01

    In this study the development of a methodology to detect illicit drugs and plastic explosives is described with the objective of being applied in the realm of public security. For this end, non-destructive assay with neutrons was used and the technique applied was the real time neutron radiography together with computerized tomography. The system is endowed with automatic responses based upon the application of an artificial intelligence technique. In previous tests using real samples, the system proved capable of identifying 97% of the inspected materials. Copyright 2010 Elsevier Ltd. All rights reserved.

  12. The role of flow injection analysis within the framework of an automated laboratory

    PubMed Central

    Stockwell, Peter B.

    1990-01-01

    Flow Injection Analysis (FIA) was invented at roughly the same time by two quite dissimilar research groups [1,2]. FIA was patented by both groups in 1974; a year also marked by the publication of the first book on automatic chemical analysis [3]. This book was a major undertaking for its authors and it is hoped that it has added to the knowledge of those analysts attempting to automate their work or to increase the level of computerization/automation and thus reduce staffing commitments. This review discusses the role of FIA in laboratory automation, the advantages and disadvantages of the FIA approach, and the part it could play in future developments. It is important to stress at the outset that the FIA approach is all too often closely paralleled with conventional continuous flow analysis (CFA). This is a mistake for many reasons, not least because of the considerable success of the CFA approach in contrast to the present lack of penetration in the commercial market-place of FIA instrumentation. PMID:18925262

  13. A Case Study of Reverse Engineering Integrated in an Automated Design Process

    NASA Astrophysics Data System (ADS)

    Pescaru, R.; Kyratsis, P.; Oancea, G.

    2016-11-01

    This paper presents a design methodology which automates the generation of curves extracted from the point clouds that have been obtained by digitizing the physical objects. The methodology is demonstrated on a product belonging to the consumables industry, namely a footwear-type product that has a complex shape with many curves. The final result is the automated generation of wrapping curves, surfaces and solids according to the characteristics of the customer's foot, and to the preferences for the chosen model, which leads to the development of customized products.

  14. Manufacturing Execution Systems: Examples of Performance Indicator and Operational Robustness Tools.

    PubMed

    Gendre, Yannick; Waridel, Gérard; Guyon, Myrtille; Demuth, Jean-François; Guelpa, Hervé; Humbert, Thierry

    Manufacturing Execution Systems (MES) are computerized systems used to measure production performance in terms of productivity, yield, and quality. The first part describes performance indicators such as overall equipment effectiveness (OEE), process robustness tools, and statistical process control. The second part details some tools that help operators maintain process robustness and control by preventing deviations from target control charts. MES was developed by Syngenta together with CIMO for automation.
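
    Two of the tools named above are easy to illustrate numerically. The sketch below computes an overall equipment effectiveness (OEE) value and Shewhart-style control limits for a target control chart; all numbers are invented, and the ±3-sigma convention is a common default rather than Syngenta's specification.

    ```python
    # Illustrative sketch of an OEE calculation and Shewhart-style control limits.
    # All numbers are made up; the 3-sigma rule is a common convention, not a company spec.
    import statistics

    def oee(availability, performance, quality):
        """OEE is the product of availability, performance, and quality rates (0-1)."""
        return availability * performance * quality

    def control_limits(samples, n_sigma=3.0):
        """Return (lower, centre, upper) control limits from historical in-control samples."""
        centre = statistics.mean(samples)
        sigma = statistics.stdev(samples)
        return centre - n_sigma * sigma, centre, centre + n_sigma * sigma

    print("OEE:", round(oee(0.92, 0.88, 0.97), 3))
    batch_yields = [94.1, 95.0, 94.6, 93.8, 94.9, 95.2, 94.4]
    lcl, centre, ucl = control_limits(batch_yields)
    print(f"control chart: LCL={lcl:.2f}, centre={centre:.2f}, UCL={ucl:.2f}")
    ```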

  15. Adaptive Computerized Training System (ACTS): A Knowledge Base System for Electronic Troubleshooting

    DTIC Science & Technology

    1983-12-01

    Design 2.3.1 Field System 2.3.2 Research System 2.4 Information Flow and Management 2.4.1 Student Performance Recording 2.4.2 Student Operational... could more easily relate to. In addition, many automated management tools were created to assist instructors with courseware authoring, student... ACTS was installed and demonstrated at Ft. Gordon. The training managers, instructors, and SMEs who participated in those demonstrations provided...

  16. Automated design and optimization of flexible booster autopilots via linear programming. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Hauser, F. D.; Szollosi, G. D.; Lakin, W. S.

    1972-01-01

    COEBRA, the Computerized Optimization of Elastic Booster Autopilots, is an autopilot design program. The bulk of the design criteria is presented in the form of minimum allowed gain/phase stability margins. COEBRA has two optimization phases: (1) a phase to maximize stability margins; and (2) a phase to optimize structural bending moment load relief capability in the presence of minimum requirements on gain/phase stability margins.

  17. How to Construct an Automated Warehouse Based on Colored Timed Petri Nets

    NASA Astrophysics Data System (ADS)

    Cheng, Fei; He, Shanjun

    The automated warehouse considered here consists of a number of rack locations with three cranes, a narrow aisle shuttle, and several buffer stations with rollers. Based on an analysis of the behaviors of the active resources in the system, a modular and computerized model is presented via a colored timed Petri net approach, in which places are multicolored to simplify the model and characterize the control flow of the resources, and token colors are defined as the routes of storage/retrieval operations. In addition, an approach for realizing the model in Visual C++ is briefly given. These features allow us to build an emulation system that simulates a discrete control application for online monitoring, dynamic dispatching control and off-line revision of scheduler policies.
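
    A greatly simplified sketch of the modelling idea is shown below: tokens carry a colour (the storage/retrieval route) and transitions carry a firing delay. The structure is illustrative only and far smaller than the warehouse model described above.

    ```python
    # Toy colored timed Petri net fragment: tokens carry a colour (route) and a transition
    # carries a firing delay. Illustrative assumption only, not the paper's model.
    from dataclasses import dataclass, field

    @dataclass
    class Token:
        colour: str            # e.g. the route of a storage/retrieval operation

    @dataclass
    class Place:
        name: str
        tokens: list = field(default_factory=list)

    @dataclass
    class Transition:
        name: str
        delay: float           # firing time in seconds
        source: Place
        target: Place
        accepts: set = field(default_factory=set)   # colours this transition may move

        def fire(self, clock):
            for token in list(self.source.tokens):
                if token.colour in self.accepts:
                    self.source.tokens.remove(token)
                    self.target.tokens.append(token)
                    return clock + self.delay, token
            return clock, None

    buffer_in = Place("buffer_in", [Token("store:rack-A17")])
    rack = Place("rack")
    crane1 = Transition("crane_1_store", delay=12.5, source=buffer_in, target=rack,
                        accepts={"store:rack-A17"})
    clock, moved = crane1.fire(0.0)
    print(f"t={clock}s moved={moved}")
    ```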

  18. reCAPTCHA: human-based character recognition via Web security measures.

    PubMed

    von Ahn, Luis; Maurer, Benjamin; McMillen, Colin; Abraham, David; Blum, Manuel

    2008-09-12

    CAPTCHAs (Completely Automated Public Turing test to tell Computers and Humans Apart) are widespread security measures on the World Wide Web that prevent automated programs from abusing online services. They do so by asking humans to perform a task that computers cannot yet perform, such as deciphering distorted characters. Our research explored whether such human effort can be channeled into a useful purpose: helping to digitize old printed material by asking users to decipher scanned words from books that computerized optical character recognition failed to recognize. We showed that this method can transcribe text with a word accuracy exceeding 99%, matching the guarantee of professional human transcribers. Our apparatus is deployed in more than 40,000 Web sites and has transcribed over 440 million words.

  19. An automated calibration laboratory - Requirements and design approach

    NASA Technical Reports Server (NTRS)

    O'Neil-Rood, Nora; Glover, Richard D.

    1990-01-01

    NASA's Dryden Flight Research Facility (Ames-Dryden), operates a diverse fleet of research aircraft which are heavily instrumented to provide both real time data for in-flight monitoring and recorded data for postflight analysis. Ames-Dryden's existing automated calibration (AUTOCAL) laboratory is a computerized facility which tests aircraft sensors to certify accuracy for anticipated harsh flight environments. Recently, a major AUTOCAL lab upgrade was initiated; the goal of this modernization is to enhance productivity and improve configuration management for both software and test data. The new system will have multiple testing stations employing distributed processing linked by a local area network to a centralized database. The baseline requirements for the new AUTOCAL lab and the design approach being taken for its mechanization are described.

  20. E-waste Management and Refurbishment Prediction (EMARP) Model for Refurbishment Industries.

    PubMed

    Resmi, N G; Fasila, K A

    2017-10-01

    This paper proposes a novel algorithm for establishing a standard methodology to manage and refurbish e-waste called E-waste Management And Refurbishment Prediction (EMARP), which can be adapted by refurbishing industries in order to improve their performance. Waste management, particularly e-waste management, is a serious issue nowadays. Computerization has entered waste management in different ways. Much of the computerization has happened in planning the waste collection, recycling and disposal process and also in managing documents and reports related to waste management. This paper proposes a computerized model to make predictions for e-waste refurbishment. All possibilities for reusing the common components among the collected e-waste samples are predicted, thus minimizing the wastage. Simulation of the model has been carried out to analyse the accuracy of the predictions made by the system. The model can be scaled to accommodate the real-world scenario. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Computerized quantitative evaluation of mammographic accreditation phantom images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Yongbum; Tsai, Du-Yih; Shinohara, Norimitsu

    2010-12-15

    Purpose: The objective was to develop and investigate an automated scoring scheme of the American College of Radiology (ACR) mammographic accreditation phantom (RMI 156, Middleton, WI) images. Methods: The developed method consisted of background subtraction, determination of region of interest, classification of fiber and mass objects by Mahalanobis distance, detection of specks by template matching, and rule-based scoring. Fifty-one phantom images were collected from 51 facilities for this study (one facility provided one image). A medical physicist and two radiologic technologists also scored the images. The human and computerized scores were compared. Results: In terms of meeting the ACR's criteria, the accuracies of the developed method for computerized evaluation of fiber, mass, and speck were 90%, 80%, and 98%, respectively. Contingency table analysis revealed significant association between observer and computer scores for microcalcifications (p<5%) but not for masses and fibers. Conclusions: The developed method may achieve a stable assessment of the visibility of test objects in mammographic accreditation phantom images and of whether a phantom image meets the ACR's criteria in the evaluation test, although there is room left for improvement in the approach for fiber and mass objects.
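
    The Mahalanobis-distance classification step can be illustrated briefly. The sketch below assigns a candidate object to the nearer of two classes ("fiber" vs. "mass") in a two-dimensional feature space; the features and training values are synthetic assumptions, not the study's data.

    ```python
    # Sketch of Mahalanobis-distance classification of phantom objects; the features
    # (length, width) and training values are synthetic assumptions.
    import numpy as np

    def mahalanobis(x, mean, cov_inv):
        d = x - mean
        return float(np.sqrt(d @ cov_inv @ d))

    rng = np.random.default_rng(1)
    fibers = rng.normal([8.0, 1.5], [1.0, 0.3], size=(50, 2))   # elongated objects
    masses = rng.normal([4.0, 3.5], [0.8, 0.6], size=(50, 2))   # compact objects

    stats = {}
    for label, data in (("fiber", fibers), ("mass", masses)):
        stats[label] = (data.mean(axis=0), np.linalg.inv(np.cov(data, rowvar=False)))

    candidate = np.array([7.2, 1.8])
    label = min(stats, key=lambda k: mahalanobis(candidate, *stats[k]))
    print("candidate classified as:", label)
    ```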

  2. Computerized image analysis for quantitative neuronal phenotyping in zebrafish.

    PubMed

    Liu, Tianming; Lu, Jianfeng; Wang, Ye; Campbell, William A; Huang, Ling; Zhu, Jinmin; Xia, Weiming; Wong, Stephen T C

    2006-06-15

    An integrated microscope image analysis pipeline is developed for automatic analysis and quantification of phenotypes in zebrafish with altered expression of Alzheimer's disease (AD)-linked genes. We hypothesize that a slight impairment of neuronal integrity in a large number of zebrafish carrying the mutant genotype can be detected through the computerized image analysis method. Key functionalities of our zebrafish image processing pipeline include quantification of neuron loss in zebrafish embryos due to knockdown of AD-linked genes, automatic detection of defective somites, and quantitative measurement of gene expression levels in zebrafish with altered expression of AD-linked genes or treatment with a chemical compound. These quantitative measurements enable the archival of analyzed results and relevant meta-data. The structured database is organized for statistical analysis and data modeling to better understand neuronal integrity and phenotypic changes of zebrafish under different perturbations. Our results show that the computerized analysis is comparable to manual counting with equivalent accuracy and improved efficacy and consistency. Development of such an automated data analysis pipeline represents a significant step forward to achieve accurate and reproducible quantification of neuronal phenotypes in large scale or high-throughput zebrafish imaging studies.

  3. HOW GOOD ARE MY DATA? INFORMATION QUALITY ASSESSMENT METHODOLOGY

    EPA Science Inventory


    Quality assurance techniques used in software development and hardware maintenance/reliability help to ensure that data in a computerized information management system are maintained well. However, information workers may not know the quality of the data resident in their inf...

  4. Computerized Tests of Team Performance and Crew Coordination Suitable for Military/Aviation Settings.

    PubMed

    Lawson, Ben D; Britt, Thomas W; Kelley, Amanda M; Athy, Jeremy R; Legan, Shauna M

    2017-08-01

    The coordination of team effort on shared tasks is an area of inquiry. A number of tests of team performance in challenging environments have been developed without comparison or standardization. This article provides a systematic review of the most accessible and usable low-to-medium fidelity computerized tests of team performance and determines which are most applicable to military- and aviation-relevant research, such as studies of group command, control, communication, and crew coordination. A search was conducted to identify computerized measures of team performance. In addition to extensive literature searches (DTIC, PsycINFO, PubMed), the authors reached out to team performance researchers at conferences and through electronic communication. Fifty-seven potential tests were identified according to 6 specific selection criteria (e.g., the requirement for automated collection of team performance and coordination processes, the use of military-relevant scenarios). The following seven tests (listed alphabetically) were considered most suitable for military needs: Agent Enabled Decision Group Environment (AEDGE), C3Conflict, the C3 (Command, Control, & Communications) Interactive Task for Identifying Emerging Situations (NeoCITIES), Distributed Dynamic Decision Making (DDD), Duo Wondrous Original Method Basic Awareness/Airmanship Test (DuoWOMBAT), the Leader Development Simulator (LDS), and the Planning Task for Teams (PLATT). Strengths and weaknesses of these tests are described and recommendations offered to help researchers identify the test most suitable for their particular needs. Adoption of a few standard computerized test batteries to study team performance would facilitate the evaluation of interventions intended to enhance group performance in multiple challenging military and aerospace operational environments. Lawson BD, Britt TW, Kelley AM, Athy JR, Legan SM. Computerized tests of team performance and crew coordination suitable for military/aviation settings. Aerosp Med Hum Perform. 2017; 88(8):722-729.

  5. Automated fault-management in a simulated spaceflight micro-world

    NASA Technical Reports Server (NTRS)

    Lorenz, Bernd; Di Nocera, Francesco; Rottger, Stefan; Parasuraman, Raja

    2002-01-01

    BACKGROUND: As human spaceflight missions extend in duration and distance from Earth, a self-sufficient crew will bear far greater onboard responsibility and authority for mission success. This will increase the need for automated fault management (FM). Human factors issues in the use of such systems include maintenance of cognitive skill, situational awareness (SA), trust in automation, and workload. This study examined the human performance consequences of operator use of intelligent FM support in interaction with an autonomous, space-related, atmospheric control system. METHODS: An expert system representing a model-based reasoning agent supported operators at a low level of automation (LOA) by a computerized fault finding guide, at a medium LOA by an automated diagnosis and recovery advisory, and at a high LOA by automated diagnosis and recovery implementation, subject to operator approval or veto. Ten percent of the experimental trials involved complete failure of FM support. RESULTS: Benefits of automation were reflected in more accurate diagnoses, shorter fault identification time, and reduced subjective operator workload. Unexpectedly, fault identification times deteriorated more at the medium than at the high LOA during automation failure. Analyses of information sampling behavior showed that offloading operators from recovery implementation during reliable automation enabled operators at high LOA to engage in fault assessment activities. CONCLUSIONS: The potential threat to SA imposed by high-level automation, in which decision advisories are automatically generated, need not inevitably be counteracted by choosing a lower LOA. Instead, freeing operator cognitive resources by automatic implementation of recovery plans at a higher LOA can promote better fault comprehension, so long as the automation interface is designed to support efficient information sampling.

  6. Rapid classification of landsat TM imagery for phase 1 stratification using the automated NDVI threshold supervised classification (ANTSC) methodology

    Treesearch

    William H. Cooke; Dennis M. Jacobs

    2002-01-01

    FIA annual inventories require rapid updating of pixel-based Phase 1 estimates. Scientists at the Southern Research Station are developing an automated methodology that uses a Normalized Difference Vegetation Index (NDVI) for identifying and eliminating problem FIA plots from the analysis. Problem plots are those that have questionable land use/land cover information....
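
    For reference, the index at the heart of the methodology is straightforward to compute. The sketch below derives NDVI from near-infrared and red reflectance and applies a threshold mask; the band values and the 0.4 threshold are illustrative assumptions, not the Station's operational values.

    ```python
    # Minimal NDVI computation and threshold mask; band values and the 0.4 threshold
    # are illustrative assumptions.
    import numpy as np

    def ndvi(nir, red):
        """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
        nir = np.asarray(nir, dtype=float)
        red = np.asarray(red, dtype=float)
        return (nir - red) / (nir + red + 1e-9)   # small epsilon avoids division by zero

    nir_band = np.array([[0.52, 0.60], [0.10, 0.45]])
    red_band = np.array([[0.20, 0.18], [0.09, 0.30]])
    index = ndvi(nir_band, red_band)
    vegetated = index > 0.4   # pixels kept as likely vegetation
    print(index.round(2))
    print(vegetated)
    ```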

  7. Computerized detection of leukocytes in microscopic leukorrhea images.

    PubMed

    Zhang, Jing; Zhong, Ya; Wang, Xiangzhou; Ni, Guangming; Du, Xiaohui; Liu, Juanxiu; Liu, Lin; Liu, Yong

    2017-09-01

    Detection of leukocytes is critical for the routine leukorrhea exam, which is widely used in gynecological examinations. An elevated vaginal leukocyte count in women with bacterial vaginosis is a strong predictor of vaginal or cervical infections. In the routine leukorrhea exam, the counting of leukocytes is primarily performed by manual techniques. However, the viewing and counting of leukocytes from multiple high-power viewing fields on a glass slide under a microscope leads to subjectivity, low efficiency, and low accuracy. To date, computerized detection has been studied for many biological cells in stool, blood, and breast cancer specimens; however, the detection of leukocytes in microscopic leukorrhea images has not been studied. Thus, there is an increasing need for computerized detection of leukocytes. There are two key processes in the computerized detection of leukocytes in digital image processing. One is segmentation; the other is intelligent classification. In this paper, we propose a combined ensemble to detect leukocytes in the microscopic leukorrhea image. After image segmentation and selecting likely leukocyte subimages, we obtain the leukocyte candidates. Then, for intelligent classification, we adopt two methods: feature extraction with classification by a support vector machine (SVM), and application of a modified convolutional neural network (CNN) to the larger subimages. If the two methods classify a candidate in the same category, the process is finished. If not, the outputs of the methods are provided to a classifier to further classify the candidate. After acquiring leukocyte candidates, we attempted three methods to perform classification. The first approach using features and SVM achieved 88% sensitivity, 97% specificity, and 92.5% accuracy. The second method using CNN achieved 95% sensitivity, 84% specificity, and 89.5% accuracy. Then, in the combination approach, we achieved 92% sensitivity, 95% specificity, and 93.5% accuracy. Finally, the images with marked and counted leukocytes were obtained. A novel computerized detection system was developed for automated detection of leukocytes in microscopic images. Different methods resulted in comparable overall qualities by enabling computerized detection of leukocytes. The proposed approach further improved the performance. This preliminary study proves the feasibility of computerized detection of leukocytes in clinical use. © 2017 American Association of Physicists in Medicine.
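
    The two-stage decision rule can be sketched compactly: accept the label when the SVM and CNN agree, otherwise pass the candidate to a second-stage classifier. The callables below are stand-ins for the trained models, and the feature names are assumptions.

    ```python
    # Toy sketch of the agreement-then-arbitration rule described above. The three
    # classifiers are stand-in callables, not the paper's trained models.
    def combined_decision(candidate, svm_predict, cnn_predict, arbiter_predict):
        """Each *_predict callable maps a candidate sub-image to 'leukocyte' or 'other'."""
        svm_label = svm_predict(candidate)
        cnn_label = cnn_predict(candidate)
        if svm_label == cnn_label:
            return svm_label                                         # agreement: finished
        return arbiter_predict(candidate, svm_label, cnn_label)      # disagreement: arbitrate

    # Stand-in models operating on assumed features (area, roundness).
    svm = lambda c: "leukocyte" if c["area"] > 40 else "other"
    cnn = lambda c: "leukocyte" if c["roundness"] > 0.8 else "other"
    arbiter = lambda c, a, b: "leukocyte" if c["area"] > 40 and c["roundness"] > 0.6 else "other"

    print(combined_decision({"area": 55, "roundness": 0.7}, svm, cnn, arbiter))
    ```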

  8. Library Automation in Sub Saharan Africa: Case Study of the University of Botswana

    ERIC Educational Resources Information Center

    Mutula, Stephen Mudogo

    2012-01-01

    Purpose: This article aims to present experiences and the lessons learned from the University of Botswana (UB) library automation project. The implications of the project for similar libraries planning automation in sub Saharan Africa and beyond are adduced. Design/methodology/approach: The article is a case study of library automation at the…

  9. Automation of Acquisition Records and Routine in the University Library, Newcastle upon Tyne

    ERIC Educational Resources Information Center

    Line, Maurice B.

    2006-01-01

    Purpose: Reports on the trial of an automated order routine for the University Library in Newcastle which began in April 1966. Design/methodology/approach: Presents the author's experiences of the manual order processing system, and the impetus for trialling an automated system. The stages of the automated system are described in detail. Findings:…

  10. Toxicity assessment of ionic liquids with Vibrio fischeri: an alternative fully automated methodology.

    PubMed

    Costa, Susana P F; Pinto, Paula C A G; Lapa, Rui A S; Saraiva, M Lúcia M F S

    2015-03-02

    A fully automated Vibrio fischeri methodology based on sequential injection analysis (SIA) has been developed. The methodology was based on the aspiration of 75 μL of bacteria and 50 μL of inhibitor followed by measurement of the luminescence of bacteria. The assays were conducted for contact times of 5, 15, and 30 min, by means of three mixing chambers that ensured adequate mixing conditions. The optimized methodology provided a precise control of the reaction conditions which is an asset for the analysis of a large number of samples. The developed methodology was applied to the evaluation of the impact of a set of ionic liquids (ILs) on V. fischeri and the results were compared with those provided by a conventional assay kit (Biotox(®)). The collected data evidenced the influence of different cation head groups and anion moieties on the toxicity of ILs. Generally, aromatic cations and fluorine-containing anions displayed higher impact on V. fischeri, evidenced by lower EC50. The proposed methodology was validated through statistical analysis which demonstrated a strong positive correlation (P>0.98) between assays. It is expected that the automated methodology can be tested for more classes of compounds and used as alternative to microplate based V. fischeri assay kits. Copyright © 2014 Elsevier B.V. All rights reserved.
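
    Although the paper reports EC50 values from the assay, the curve-fitting step can be illustrated generically. The sketch below fits a simple Hill model to invented inhibition data with scipy; the concentrations, inhibition values, and model form are assumptions, not the authors' protocol.

    ```python
    # Sketch of EC50 estimation from luminescence inhibition data; all values are invented
    # and the Hill model is a generic assumption, not the authors' data-reduction method.
    import numpy as np
    from scipy.optimize import curve_fit

    def hill(conc, ec50, slope):
        """Percent inhibition as a function of concentration (simple Hill model)."""
        return 100.0 / (1.0 + (ec50 / conc) ** slope)

    conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])            # mg/L (illustrative)
    inhibition = np.array([4.0, 12.0, 33.0, 61.0, 85.0, 95.0])   # % light loss vs control

    (ec50, slope), _ = curve_fit(hill, conc, inhibition, p0=(2.0, 1.0), bounds=(0, np.inf))
    print(f"estimated EC50 ≈ {ec50:.2f} mg/L (Hill slope {slope:.2f})")
    ```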

  11. The development of the Medical Literature Analysis and Retrieval System (MEDLARS)*

    PubMed Central

    Dee, Cheryl Rae

    2007-01-01

    Objective: The research provides a chronology of the US National Library of Medicine's (NLM's) contribution to access to the world's biomedical literature through its computerization of biomedical indexes, particularly the Medical Literature Analysis and Retrieval System (MEDLARS). Method: Using material gathered from NLM's archives and from personal interviews with people associated with developing MEDLARS and its associated systems, the author discusses key events in the history of MEDLARS. Discussion: From the development of the early mechanized bibliographic retrieval systems of the 1940s to the beginnings of online, interactive computerized bibliographic search systems of the early 1970s chronicled here, NLM's contributions to automation and bibliographic retrieval have been extensive. Conclusion: As NLM's technological experience and expertise grew, innovative bibliographic storage and retrieval systems emerged. NLM's accomplishments regarding MEDLARS were cutting edge, placing the library at the forefront of incorporating mechanization and technologies into medical information systems. PMID:17971889

  12. Effect of gender on computerized electrocardiogram measurements in college athletes.

    PubMed

    Mandic, Sandra; Fonda, Holly; Dewey, Frederick; Le, Vy-van; Stein, Ricardo; Wheeler, Matt; Ashley, Euan A; Myers, Jonathan; Froelicher, Victor F

    2010-06-01

    Broad criteria for classifying an electrocardiogram (ECG) as abnormal and requiring additional testing prior to participating in competitive athletics have been recommended for the preparticipation examination (PPE) of athletes. Because these criteria have not considered gender differences, we examined the effect of gender on the computerized ECG measurements obtained on Stanford student athletes. Currently available computer programs require a basis for "normal" in athletes of both genders to provide reliable interpretation. During the 2007 PPE, computerized ECGs were recorded and analyzed on 658 athletes (54% male; mean age, 19 +/- 1 years) representing 22 sports. Electrocardiogram measurements included intervals and durations in all 12 leads to calculate 12-lead voltage sums, QRS amplitude and QRS area, spatial vector length (SVL), and the sum of the R wave in V5 and S wave in V2 (RSsum). By computer analysis, male athletes had significantly greater QRS duration, PR interval, Q-wave duration, J-point amplitude, and T-wave amplitude, and shorter QTc interval compared with female athletes (all P < 0.05). All ECG indicators of left ventricular electrical activity were significantly greater in males. Although gender was consistently associated with indices of atrial and ventricular electrical activity in multivariable analysis, ECG measurements correlated poorly with body dimensions. Significant gender differences exist in ECG measurements of college athletes that are not explained by differences in body size. Our tables of "normal" computerized gender-specific measurements can facilitate the development of automated ECG interpretation for screening young athletes.

  13. Computerized cytometry and wavelet analysis of follicular lesions for detecting malignancy: A pilot study in thyroid cytology.

    PubMed

    Gilshtein, Hayim; Mekel, Michal; Malkin, Leonid; Ben-Izhak, Ofer; Sabo, Edmond

    2017-01-01

    The cytologic diagnosis of indeterminate lesions of the thyroid involves much uncertainty, and the final diagnosis often requires operative resection. Computerized cytomorphometry and wavelet analysis were examined to evaluate their ability to better discriminate between benign and malignant lesions based on cytology slides. Cytologic reports from patients who underwent thyroid operation in a single, tertiary referral center were retrieved. Patients with Bethesda III and IV lesions were divided according to their final histopathology. Cytomorphometry and wavelet analysis were performed on the digitized images of the cytology slides. Cytology slides of 40 patients were analyzed. Seven patients had a histologic diagnosis of follicular malignancy, 13 had follicular adenomas, and 20 had a benign goiter. Computerized cytomorphometry with a combination of descriptors of nuclear size, shape, and texture was able to predict quantitatively adenoma versus malignancy within the indeterminate group with 95% accuracy. An automated wavelet analysis with a neural network algorithm reached an accuracy of 96% in identifying correctly malignant vs. benign lesions based on cytology. Computerized analysis of cytology slides seems to be more accurate in defining indeterminate thyroid lesions compared with conventional cytologic analysis, which is based on visual characteristics on cytology as well as the expertise of the cytologist. This pilot study needs to be validated with a greater number of samples. Pending successful validation, we believe that such methods carry promise for better patient treatment. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Automated method for determining Instron Residual Seal Force of glass vial/rubber closure systems. Part II. 13 mm vials.

    PubMed

    Ludwig, J D; Davis, C W

    1995-01-01

    Instron Residual Seal Force (IRSF) of 13 mm glass vial/rubber closure systems was determined using an Instron 4501 Materials Testing System and computerized data analysis. A series of three cap anvils varying in shape and dimensions were machined to optimize cap anvil performance. Cap anvils with spherical top surfaces and narrow internal dimensions produced uniform stress-deformation curves from which precise IRSF values were derived.

  15. Chemistry/Hematology Reporting Via the File Manager

    PubMed Central

    Tatarczuk, J. R.; Ginsburg, R. E.; Wu, A.; Schauble, M.

    1981-01-01

    A computerized reporting system was implemented to replace a simple manual cumulative laboratory chemistry report. Modification and expansion of the system was carried out with user participation, and the system now forms the nucleus for a complete automated laboratory system. It is linked to a master patient file which when fully developed will provide a suitable basis for a complete patient clinical information system. ANSI standard MUMPS was utilized and modules were developed and implemented in a serial fashion.

  16. Looking ahead in systems engineering

    NASA Technical Reports Server (NTRS)

    Feigenbaum, Donald S.

    1966-01-01

    Five areas that are discussed in this paper are: (1) the technological characteristics of systems engineering; (2) the analytical techniques that are giving modern systems work its capability and power; (3) the management, economics, and effectiveness dimensions that now frame the modern systems field; (4) systems engineering's future impact upon automation, computerization and managerial decision-making in industry - and upon aerospace and weapons systems in government and the military; and (5) modern systems engineering's partnership with modern quality control and reliability.

  17. Two-phase computerized planning of cryosurgery using bubble-packing and force-field analogy.

    PubMed

    Tanaka, Daigo; Shimada, Kenji; Rabin, Yoed

    2006-02-01

    Cryosurgery is the destruction of undesired tissues by freezing, as in prostate cryosurgery, for example. Minimally invasive cryosurgery is currently performed by means of an array of cryoprobes, each in the shape of a long hypodermic needle. The optimal arrangement of the cryoprobes, which is known to have a dramatic effect on the quality of the cryoprocedure, remains an art held by the cryosurgeon, based on the cryosurgeon's experience and "rules of thumb." An automated computerized technique for cryosurgery planning is the subject matter of the current paper, in an effort to improve the quality of cryosurgery. A two-phase optimization method is proposed for this purpose, based on two previous and independent developments by this research team. Phase I is based on a bubble-packing method, previously used as an efficient method for finite element meshing. Phase II is based on a force-field analogy method, which has proven to be robust at the expense of a typically long runtime. As a proof-of-concept, results are demonstrated on a two-dimensional case of a prostate cross section. The major contribution of this study is to affirm that in many instances cryosurgery planning can be performed without extremely expensive simulations of bioheat transfer, achieved in Phase I. This new method of planning has proven to reduce planning runtime from hours to minutes, making automated planning practical in a clinical time frame.
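
    The force-field analogy can be conveyed with a toy two-dimensional illustration: each probe is nudged toward target points that are not yet covered by its assumed iceball radius. The sketch below is conceptual only and is not the validated planning algorithm described above.

    ```python
    # Conceptual 2D illustration of a force-field style probe placement: uncovered target
    # points pull their nearest probe. Radii, step size, and geometry are assumptions.
    import numpy as np

    rng = np.random.default_rng(3)
    target = rng.uniform(-1.0, 1.0, size=(400, 2))            # points sampling the target cross-section
    target = target[np.hypot(target[:, 0], target[:, 1]) < 1.0]
    probes = rng.uniform(-0.2, 0.2, size=(4, 2))              # initial cryoprobe positions
    radius = 0.45                                             # assumed effective freezing radius

    for _ in range(200):
        # Each target point pulls only its nearest probe; covered points exert no force.
        d = np.linalg.norm(target[:, None, :] - probes[None, :, :], axis=2)   # (n_points, n_probes)
        nearest = d.argmin(axis=1)
        forces = np.zeros_like(probes)
        for p in range(len(probes)):
            uncovered = target[(nearest == p) & (d[:, p] > radius)]
            if len(uncovered):
                forces[p] = (uncovered - probes[p]).mean(axis=0)
        probes += 0.05 * forces

    coverage = (np.linalg.norm(target[:, None, :] - probes[None, :, :], axis=2).min(axis=1) < radius).mean()
    print("final probe positions:\n", probes.round(2))
    print(f"fraction of target within an iceball radius: {coverage:.0%}")
    ```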

  18. Image editing with Adobe Photoshop 6.0.

    PubMed

    Caruso, Ronald D; Postel, Gregory C

    2002-01-01

    The authors introduce Photoshop 6.0 for radiologists and demonstrate basic techniques of editing gray-scale cross-sectional images intended for publication and for incorporation into computerized presentations. For basic editing of gray-scale cross-sectional images, the Tools palette and the History/Actions palette pair should be displayed. The History palette may be used to undo a step or series of steps. The Actions palette is a menu of user-defined macros that save time by automating an action or series of actions. Converting an image to 8-bit gray scale is the first editing function. Cropping is the next action. Both decrease file size. Use of the smallest file size necessary for the purpose at hand is recommended. Final file size for gray-scale cross-sectional neuroradiologic images (8-bit, single-layer TIFF [tagged image file format] at 300 pixels per inch) intended for publication varies from about 700 Kbytes to 3 Mbytes. Final file size for incorporation into computerized presentations is about 10-100 Kbytes (8-bit, single-layer, gray-scale, high-quality JPEG [Joint Photographic Experts Group]), depending on source and intended use. Editing and annotating images before they are inserted into presentation software is highly recommended, both for convenience and flexibility. Radiologists should find that image editing can be carried out very rapidly once the basic steps are learned and automated. Copyright RSNA, 2002

  19. Semi-automated segmentation of a glioblastoma multiforme on brain MR images for radiotherapy planning.

    PubMed

    Hori, Daisuke; Katsuragawa, Shigehiko; Murakami, Ryuuji; Hirai, Toshinori

    2010-04-20

    We propose a computerized method for semi-automated segmentation of the gross tumor volume (GTV) of a glioblastoma multiforme (GBM) on brain MR images for radiotherapy planning (RTP). Three-dimensional (3D) MR images of 28 cases with a GBM were used in this study. First, a sphere volume of interest (VOI) including the GBM was selected by clicking a part of the GBM region in the 3D image. Then, the sphere VOI was transformed to a two-dimensional (2D) image by use of a spiral-scanning technique. We employed active contour models (ACM) to delineate an optimal outline of the GBM in the transformed 2D image. After inverse transform of the optimal outline to the 3D space, a morphological filter was applied to smooth the shape of the 3D segmented region. For evaluation of our computerized method, we compared the computer output with manually segmented regions, which were obtained by a therapeutic radiologist using a manual tracking method. In evaluating our segmentation method, we employed the Jaccard similarity coefficient (JSC) and the true segmentation coefficient (TSC) in volumes between the computer output and the manually segmented region. The mean and standard deviation of JSC and TSC were 74.2+/-9.8% and 84.1+/-7.1%, respectively. Our segmentation method provided a relatively accurate outline for GBM and would be useful for radiotherapy planning.
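
    The Jaccard similarity coefficient used above has a standard definition, JSC = |A ∩ B| / |A ∪ B|, which the short sketch below computes for two boolean voxel masks (the TSC definition is not reproduced here). The tiny example masks are synthetic.

    ```python
    # Jaccard similarity coefficient for two boolean volumes; example masks are synthetic.
    import numpy as np

    def jaccard(mask_a, mask_b):
        """JSC = |A intersect B| / |A union B| for two boolean volumes."""
        a = np.asarray(mask_a, dtype=bool)
        b = np.asarray(mask_b, dtype=bool)
        intersection = np.logical_and(a, b).sum()
        union = np.logical_or(a, b).sum()
        return intersection / union if union else 1.0

    auto = np.zeros((4, 4, 4), dtype=bool)
    auto[1:3, 1:3, 1:3] = True          # stand-in for the computer-segmented GTV
    manual = np.zeros((4, 4, 4), dtype=bool)
    manual[1:4, 1:3, 1:3] = True        # stand-in for the manually segmented GTV
    print(f"JSC = {jaccard(auto, manual):.2f}")
    ```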

  20. Addressing the insider threat

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hochberg, J.G.; Jackson, K.A.; McClary, J.F.

    1993-05-01

    Computers have come to play a major role in the processing of information vital to our national security. As we grow more dependent on computers, we also become more vulnerable to their misuse. Misuse may be accidental, or may occur deliberately for purposes of personal gain, espionage, terrorism, or revenge. While it is difficult to obtain exact statistics on computer misuse, clearly it is growing. It is also clear that insiders -- authorized system users -- are responsible for most of this increase. Unfortunately, their insider status gives them a greater potential for harm. This paper takes an asset-based approach to the insider threat. We begin by characterizing the insider and the threat posed by variously motivated insiders. Next, we characterize the asset of concern: computerized information of strategic or economic value. We discuss four general ways in which computerized information is vulnerable to adversary action by the insider: disclosure, violation of integrity, denial of service, and unauthorized use of resources. We then look at three general remedies for these vulnerabilities. The first is formality of operations, such as training, personnel screening, and configuration management. The second is the institution of automated safeguards, such as single-use passwords, encryption, and biometric devices. The third is the development of automated systems that collect and analyze system and user data to look for signs of misuse.

  2. Rapid Classification of Landsat TM Imagery for Phase 1 Stratification Using the Automated NDVI Threshold Supervised Classification (ANTSC) Methodology

    Treesearch

    William H. Cooke; Dennis M. Jacobs

    2005-01-01

    FIA annual inventories require rapid updating of pixel-based Phase 1 estimates. Scientists at the Southern Research Station are developing an automated methodology that uses a Normalized Difference Vegetation Index (NDVI) for identifying and eliminating problem FIA plots from the analysis. Problem plots are those that have questionable land use/land cover information....

  3. Redefining the Practice of Peer Review Through Intelligent Automation-Part 3: Automated Report Analysis and Data Reconciliation.

    PubMed

    Reiner, Bruce I

    2018-02-01

    One method for addressing existing peer review limitations is the assignment of peer review cases on a completely blinded basis, in which the peer reviewer would create an independent report which can then be cross-referenced with the primary reader report of record. By leveraging existing computerized data mining techniques, one could in theory automate and objectify the process of report data extraction, classification, and analysis, while reducing time and resource requirements intrinsic to manual peer review report analysis. Once inter-report analysis has been performed, resulting inter-report discrepancies can be presented to the radiologist of record for review, along with the option to directly communicate with the peer reviewer through an electronic data reconciliation tool aimed at collaboratively resolving inter-report discrepancies and improving report accuracy. All associated report and reconciled data could in turn be recorded in a referenceable peer review database, which provides opportunity for context and user-specific education and decision support.

  4. Best-Quality Vessel Identification Using Vessel Quality Measure in Multiple-Phase Coronary CT Angiography.

    PubMed

    Hadjiiski, Lubomir; Liu, Jordan; Chan, Heang-Ping; Zhou, Chuan; Wei, Jun; Chughtai, Aamer; Kuriakose, Jean; Agarwal, Prachi; Kazerooni, Ella

    2016-01-01

    The detection of stenotic plaques strongly depends on the quality of the coronary arterial tree imaged with coronary CT angiography (cCTA). However, it is time consuming for the radiologist to select the best-quality vessels from the multiple-phase cCTA for interpretation in clinical practice. We are developing an automated method for selection of the best-quality vessels from coronary arterial trees in multiple-phase cCTA to facilitate radiologist's reading or computerized analysis. Our automated method consists of vessel segmentation, vessel registration, corresponding vessel branch matching, vessel quality measure (VQM) estimation, and automatic selection of best branches based on VQM. For every branch, the VQM was calculated as the average radial gradient. An observer preference study was conducted to visually compare the quality of the selected vessels. 167 corresponding branch pairs were evaluated by two radiologists. The agreement between the first radiologist and the automated selection was 76% with kappa of 0.49. The agreement between the second radiologist and the automated selection was also 76% with kappa of 0.45. The agreement between the two radiologists was 81% with kappa of 0.57. The observer preference study demonstrated the feasibility of the proposed automated method for the selection of the best-quality vessels from multiple cCTA phases.
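
    The vessel quality measure can be illustrated with a toy computation of an average radial gradient: the mean intensity difference sampled across the vessel wall along directions radial to the centerline. The geometry, sampling scheme, and image below are assumptions, not the authors' implementation.

    ```python
    # Toy average radial gradient along a vessel centerline; the synthetic image and the
    # finite-difference sampling scheme are illustrative assumptions.
    import numpy as np

    def average_radial_gradient(image, centerline, radius, n_dirs=16):
        """Mean finite-difference gradient across the vessel wall at each centerline point."""
        gradients = []
        angles = np.linspace(0.0, 2.0 * np.pi, n_dirs, endpoint=False)
        for (y, x) in centerline:
            for a in angles:
                inner = image[int(round(y + (radius - 1) * np.sin(a))),
                              int(round(x + (radius - 1) * np.cos(a)))]
                outer = image[int(round(y + (radius + 1) * np.sin(a))),
                              int(round(x + (radius + 1) * np.cos(a)))]
                gradients.append(inner - outer)   # bright lumen vs darker background
        return float(np.mean(gradients))

    # Synthetic slice: a bright disk ("vessel lumen") on a dark background.
    yy, xx = np.mgrid[0:64, 0:64]
    image = (np.hypot(yy - 32, xx - 32) < 5).astype(float)
    print("VQM estimate:", round(average_radial_gradient(image, [(32, 32)], radius=5), 3))
    ```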

  5. Integrated Psychosocial and Opioid-Antagonist Treatment for Alcohol Dependence: A Systematic Review of Controlled Evaluations

    ERIC Educational Resources Information Center

    Vaughn, Michael G.; Howard, Matthew O.

    2004-01-01

    Methodological characteristics and outcomes of 14 controlled clinical investigations of integrated psychosocial and opioid-antagonist alcohol dependence treatment were evaluated. The 14 studies were identified through computerized bibliographic and manual literature searches. Clients receiving integrated psychosocial and opioid-antagonist…

  6. Crewmember Performance before, during, and after Spaceflight

    ERIC Educational Resources Information Center

    Kelly, Thomas H.; Hienz, Robert D.; Zarcone, Troy J.; Wurster, Richard M.; Brady, Joseph V.

    2005-01-01

    The development of technologies for monitoring the welfare of crewmembers is a critical requirement for extended spaceflight. Behavior analytic methodologies provide a framework for studying the performance of individuals and groups, and brief computerized tests have been used successfully to examine the impairing effects of sleep, drug, and…

  7. Pilot factors guidelines for the operational inspection of navigation systems

    NASA Technical Reports Server (NTRS)

    Sadler, J. F.; Boucek, G. P.

    1988-01-01

    A computerized human-engineered inspection technique is developed for use by FAA inspectors in evaluating the pilot factors aspects of aircraft navigation systems. The short title for this project is Nav Handbook. A menu-driven checklist, computer program and data base (Human Factors Design Criteria) were developed and merged to form a self-contained, portable, human factors inspection checklist tool for use in a laboratory or field setting. The automated checklist is tailored for general aviation navigation systems and can be expanded for use with other aircraft systems, transports or military aircraft. The Nav Handbook inspection concept was demonstrated using a lap-top computer and an Omega/VLF CDU. The program generates standardized inspection reports. Automated checklists for LORAN/C and RNAV were also developed. A Nav Handbook User's Guide is included.

  8. Use of Flowchart for Automation of Clinical Protocols in mHealth.

    PubMed

    Dias, Karine Nóra; Welfer, Daniel; Cordeiro d'Ornellas, Marcos; Pereira Haygert, Carlos Jesus; Dotto, Gustavo Nogara

    2017-01-01

    For healthcare professionals to use mobile applications, someone with software development expertise must first provide them. In healthcare institutions, health professionals use clinical protocols to govern care, and sometimes these documents are computerized as mobile applications to assist them. This work proposes the use of flowcharts as a way of describing clinical protocols for the automatic generation of mobile applications to assist health professionals. The purpose of this research is to enable health professionals to develop applications from the description of their own clinical protocols. As a result, we developed a web system that automates clinical protocols for the Android platform, and we validated it with two clinical protocols used in a Brazilian hospital. Preliminary results of the developed architecture demonstrate the feasibility of this study.

  9. An automated calibration laboratory for flight research instrumentation: Requirements and a proposed design approach

    NASA Technical Reports Server (NTRS)

    Oneill-Rood, Nora; Glover, Richard D.

    1990-01-01

    NASA's Dryden Flight Research Facility (Ames-Dryden) operates a diverse fleet of research aircraft that are heavily instrumented to provide both real-time data for in-flight monitoring and recorded data for postflight analysis. Ames-Dryden's existing automated calibration (AUTOCAL) laboratory is a computerized facility which tests aircraft sensors to certify accuracy for anticipated harsh flight environments. Recently, a major AUTOCAL lab upgrade was initiated; the goal of this modernization is to enhance productivity and improve configuration management for both software and test data. The new system will have multiple testing stations employing distributed processing linked by a local area network to a centralized database. The baseline requirements for the new AUTOCAL lab and the design approach being taken for its mechanization are described.

  10. An automated methodology development. [software design for combat simulation

    NASA Technical Reports Server (NTRS)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating non-used subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  11. Computer-Assisted Automated Scoring of Polysomnograms Using the Somnolyzer System.

    PubMed

    Punjabi, Naresh M; Shifa, Naima; Dorffner, Georg; Patil, Susheel; Pien, Grace; Aurora, Rashmi N

    2015-10-01

    Manual scoring of polysomnograms is a time-consuming and tedious process. To expedite the scoring of polysomnograms, several computerized algorithms for automated scoring have been developed. The overarching goal of this study was to determine the validity of the Somnolyzer system, an automated system for scoring polysomnograms. The analysis sample comprised 97 sleep studies. Each polysomnogram was manually scored by certified technologists from four sleep laboratories and concurrently subjected to automated scoring by the Somnolyzer system. Agreement between manual and automated scoring was examined. Sleep staging and scoring of disordered breathing events was conducted using the 2007 American Academy of Sleep Medicine criteria. The setting was clinical sleep laboratories. A high degree of agreement was noted between manual and automated scoring of the apnea-hypopnea index (AHI). The average correlation between the manually scored AHI across the four clinical sites was 0.92 (95% confidence interval: 0.90-0.93). Similarly, the average correlation between the manual and Somnolyzer-scored AHI values was 0.93 (95% confidence interval: 0.91-0.96). Thus, interscorer correlation between the manually scored results was no different than that derived from manual and automated scoring. Substantial concordance in the arousal index, total sleep time, and sleep efficiency between manual and automated scoring was also observed. In contrast, differences were noted between manually and automatically scored percentages of sleep stages N1, N2, and N3. Automated analysis of polysomnograms using the Somnolyzer system provides results that are comparable to manual scoring for commonly used metrics in sleep medicine. Although differences exist between manual and automated scoring for specific sleep stages, the level of agreement between manual and automated scoring is not significantly different than that between any two human scorers. In light of the burden associated with manual scoring, automated scoring platforms provide a viable complement of tools in the diagnostic armamentarium of sleep medicine. © 2015 Associated Professional Sleep Societies, LLC.
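    For readers unfamiliar with the agreement statistic used above, the sketch below shows how a Pearson correlation with a Fisher-z confidence interval between manually and automatically scored AHI values might be computed; the AHI numbers are invented, not the study's data.

```python
# Sketch of the kind of agreement statistic reported above: Pearson correlation
# between manually and automatically scored AHI values, with a Fisher z-transform
# confidence interval. The AHI arrays are invented examples.
import numpy as np
from scipy import stats

manual_ahi = np.array([5.2, 14.8, 31.0, 8.9, 47.3, 12.1, 2.4, 22.6])
auto_ahi   = np.array([4.9, 15.5, 29.4, 9.7, 45.1, 13.0, 3.1, 21.8])

r, _ = stats.pearsonr(manual_ahi, auto_ahi)
z = np.arctanh(r)                         # Fisher z-transform
se = 1.0 / np.sqrt(len(manual_ahi) - 3)   # standard error of z
lo, hi = np.tanh([z - 1.96 * se, z + 1.96 * se])
print(f"r = {r:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```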

  12. Human Resource Development, Social Capital, Emotional Intelligence: Any Link to Productivity?

    ERIC Educational Resources Information Center

    Brooks, Kit; Nafukho, Fredrick Muyia

    2006-01-01

    Purpose: This article aims to offer a theoretical framework that attempts to show the integration among human resource development (HRD), social capital (SC), emotional intelligence (EI) and organizational productivity. Design/methodology/approach: The literature search included the following: a computerized search of accessible and available…

  13. An Empirical Analysis of Negotiation Teaching Methodologies Using a Negotiation Support System

    ERIC Educational Resources Information Center

    Jones, Beth H.; Jones, Gary H.; Banerjee, Debasish

    2005-01-01

    This article describes an experiment that compared different methods of teaching undergraduates the fundamentals of negotiation analysis. Using student subjects, we compared three conditions: reading, lecture-only, and lecture accompanied by student use of a computerized negotiation support system (NSS). The authors examined two facets of…

  14. Simplified bridge load rating methodology using the national bridge inventory file : user manual

    DOT National Transportation Integrated Search

    1988-08-01

    The purpose of this research was to develop a computerized system to determine the adequacy of a bridge or group of bridges to carry specified overload vehicles. The system utilizes two levels of analysis. The Level 1 analysis is the basic rating sys...

  15. Simplified bridge load rating methodology using the national bridge inventory file : program listing

    DOT National Transportation Integrated Search

    1987-08-01

    The purpose of this research was to develop a computerized system to determine the adequacy of a bridge or group of bridges to carry specified overload vehicles. The system utilizes two levels of analysis. The Level 1 analysis is the basic rating sys...

  16. Policy Information System Computer Program.

    ERIC Educational Resources Information Center

    Hamlin, Roger E.; And Others

    The concepts and methodologies outlined in "A Policy Information System for Vocational Education" are presented in a simple computer format in this booklet. It also contains a sample output representing 5-year projections of various planning needs for vocational education. Computerized figures in the eight areas corresponding to those in the…

  17. A prototype computerized synthesis methodology for generic space access vehicle (SAV) conceptual design

    NASA Astrophysics Data System (ADS)

    Huang, Xiao

    2006-04-01

    Today's and especially tomorrow's competitive launch vehicle design environment requires the development of a dedicated generic Space Access Vehicle (SAV) design methodology. A total of 115 industrial, research, and academic aircraft, helicopter, missile, and launch vehicle design synthesis methodologies have been evaluated. As the survey indicates, each synthesis methodology tends to focus on a specific flight vehicle configuration, thus precluding the key capability to systematically compare flight vehicle design alternatives. The aim of the research investigation is to provide decision-making bodies and the practicing engineer with a design process and tool box for robust modeling and simulation of flight vehicles where the ultimate performance characteristics may hinge on numerical subtleties. This will enable the designer of a SAV for the first time to consistently compare different classes of SAV configurations on an impartial basis. This dissertation presents the development steps required towards a generic (configuration independent) hands-on flight vehicle conceptual design synthesis methodology. This process is developed such that it can be applied to any flight vehicle class if desired. In the present context, the methodology has been put into operation for the conceptual design of a tourist Space Access Vehicle. The case study illustrates elements of the design methodology and algorithm for the class of Horizontal Takeoff and Horizontal Landing (HTHL) SAVs. The HTHL SAV design application clearly outlines how the conceptual design process can be centrally organized, executed and documented with focus on design transparency, physical understanding and the capability to reproduce results. This approach offers the project lead and creative design team a management process and tool which iteratively refines the individual design logic chosen, leading to mature design methods and algorithms. As illustrated, the HTHL SAV hands-on design methodology offers growth potential in that the same methodology can be continually updated and extended to other SAV configuration concepts, such as the Vertical Takeoff and Vertical Landing (VTVL) SAV class. Having developed, validated and calibrated the methodology for HTHL designs in the 'hands-on' mode, the report provides an outlook on how the methodology will be integrated into the prototype computerized design synthesis software AVDS-PrADOSAV in a follow-on step.

  18. Implementation of a pharmacy automation system (robotics) to ensure medication safety at Norwalk hospital.

    PubMed

    Bepko, Robert J; Moore, John R; Coleman, John R

    2009-01-01

    This article reports an intervention to improve the quality and safety of hospital patient care by introducing the use of pharmacy robotics into the medication distribution process. Medication safety is vitally important. The integration of pharmacy robotics with computerized practitioner order entry and bedside medication bar coding produces a significant reduction in medication errors. The creation of a safe medication-from initial ordering to bedside administration-provides enormous benefits to patients, to health care providers, and to the organization as well.

  19. Naval War College Review. Volume 67, Number 3, Summer 2014

    DTIC Science & Technology

    2014-01-01

    Naval Career, has recently been published by the Press as Newport Paper 41. We wish Barney fair winds and following seas. Our next two offerings in...now done.” In Goschen’s opinion, “the best way of taking the wind out of the sails of the Big Navy Party in Germany is to state frankly that if...and computerization of command and control; expanded use of shipborne helicopters; automation of gunnery and sensor systems; and even the advent of

  20. Rhesus monkey (Macaca mulatta) complex learning skills reassessed

    NASA Technical Reports Server (NTRS)

    Washburn, David A.; Rumbaugh, Duane M.

    1991-01-01

    An automated computerized testing facility is employed to study basic learning and transfer in rhesus monkeys including discrimination learning set and mediational learning. The data show higher performance levels than those predicted from other tests that involved compromised learning with analogous conditions. Advanced transfer-index ratios and positive transfer of learning are identified, and indications of mediational learning strategies are noted. It is suggested that these data are evidence of the effectiveness of the present experimental apparatus for enhancing learning in nonhuman primates.

  1. The aerospace energy systems laboratory: Hardware and software implementation

    NASA Technical Reports Server (NTRS)

    Glover, Richard D.; Oneil-Rood, Nora

    1989-01-01

    For many years NASA Ames Research Center, Dryden Flight Research Facility has employed automation in the servicing of flight critical aircraft batteries. Recently a major upgrade to Dryden's computerized Battery Systems Laboratory was initiated to incorporate distributed processing and a centralized database. The new facility, called the Aerospace Energy Systems Laboratory (AESL), is being mechanized with iAPX86 and iAPX286 hardware running iRMX86. The hardware configuration and software structure for the AESL are described.

  2. Automated Discovery and Modeling of Sequential Patterns Preceding Events of Interest

    NASA Technical Reports Server (NTRS)

    Rohloff, Kurt

    2010-01-01

    The integration of emerging data manipulation technologies has enabled a paradigm shift in practitioners' abilities to understand and anticipate events of interest in complex systems. Example events of interest include outbreaks of socio-political violence in nation-states. Rather than relying on human-centric modeling efforts that are limited by the availability of SMEs, automated data processing technologies have enabled the development of innovative automated complex system modeling and predictive analysis technologies. We introduce one such emerging modeling technology - the sequential pattern methodology. We have applied the sequential pattern methodology to automatically identify patterns of observed behavior that precede outbreaks of socio-political violence such as riots, rebellions and coups in nation-states. The sequential pattern methodology is a groundbreaking approach to automated complex system model discovery because it generates easily interpretable patterns based on direct observations of sampled factor data, providing a deeper understanding of societal behaviors while remaining tolerant of observation noise and missing data. The discovered patterns are simple to interpret and mimic a human analyst's identification of observed trends in temporal data. Discovered patterns also provide an automated forecasting ability: we discuss an example of using discovered patterns coupled with a rich data environment to forecast various types of socio-political violence in nation-states.
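    A minimal sketch of the core precedence test behind such pattern-based forecasting is given below, assuming a simple ordered-subsequence match within a time window; the event labels, window, and pattern are invented, and the real methodology learns its patterns from data rather than taking them as given.

```python
# Minimal sketch of a sequential-pattern precedence check: does an ordered
# pattern of factor observations appear (tolerating gaps) in the window
# preceding an event of interest? Labels and times are invented examples.
def pattern_precedes_event(events, pattern, event_time, window):
    """Return True if `pattern` occurs in order within `window` before `event_time`."""
    recent = [label for t, label in events if event_time - window <= t < event_time]
    it = iter(recent)
    return all(step in it for step in pattern)  # ordered subsequence test

stream = [(1, "protest"), (3, "curfew"), (5, "strike"), (8, "arrests")]
print(pattern_precedes_event(stream, ["protest", "strike"], event_time=9, window=10))
```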

  3. Improving automation standards via semantic modelling: Application to ISA88.

    PubMed

    Dombayci, Canan; Farreres, Javier; Rodríguez, Horacio; Espuña, Antonio; Graells, Moisès

    2017-03-01

    Standardization is essential for automation. Extensibility, scalability, and reusability are important features for automation software that rely on the efficient modelling of the addressed systems. The work presented here is part of the ongoing development of a methodology for semi-automatic ontology construction from technical documents. The main aim of this work is to systematically check the consistency of technical documents and support the improvement of technical document consistency. The formalization of conceptual models and the subsequent writing of technical standards are simultaneously analyzed, and guidelines are proposed for application to future technical standards. Three paradigms are discussed for the development of domain ontologies from technical documents, starting from the current state of the art, continuing with the intermediate method presented and used in this paper, and ending with the suggested paradigm for the future. The ISA88 Standard is taken as a representative case study. Linguistic techniques from the semi-automatic ontology construction methodology are applied to the ISA88 Standard, and different modelling and standardization aspects that are worth sharing with the automation community are addressed. This study discusses different paradigms for developing and sharing conceptual models for the subsequent development of automation software, along with presenting the systematic consistency checking method. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  4. Computerized tomography with total variation and with shearlets

    NASA Astrophysics Data System (ADS)

    Garduño, Edgar; Herman, Gabor T.

    2017-04-01

    To reduce the x-ray dose in computerized tomography (CT), many constrained optimization approaches have been proposed aiming at minimizing a regularizing function that measures a lack of consistency with some prior knowledge about the object that is being imaged, subject to a (predetermined) level of consistency with the detected attenuation of x-rays. One commonly investigated regularizing function is total variation (TV), while other publications advocate the use of some type of multiscale geometric transform in the definition of the regularizing function; a particular recent choice for this is the shearlet transform. Proponents of the shearlet transform in the regularizing function claim that the reconstructions so obtained are better than those produced using TV for texture preservation (but may be worse for noise reduction). In this paper we report results related to this claim. In our reported experiments using simulated CT data collection of the head, reconstructions whose shearlet transform has a small ℓ1-norm are not more efficacious than reconstructions that have a small TV value. Our experiments for making such comparisons use the recently-developed superiorization methodology for both regularizing functions. Superiorization is an automated procedure for turning an iterative algorithm for producing images that satisfy a primary criterion (such as consistency with the observed measurements) into its superiorized version that will produce results that, according to the primary criterion, are as good as those produced by the original algorithm, but in addition are superior to them according to a secondary (regularizing) criterion. The method presented for superiorization involving the ℓ1-norm of the shearlet transform is novel and is quite general: It can be used for any regularizing function that is defined as the ℓ1-norm of a transform specified by the application of a matrix. Because in the previous literature the split Bregman algorithm is used for similar purposes, a section is included comparing the results of the superiorization algorithm with the split Bregman algorithm.
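    For orientation, the two regularizing choices compared above can be written in a common constrained form; the notation below is a generic sketch (A, b, ε, and S are assumed symbols for the projection matrix, measured data, allowed inconsistency, and shearlet transform, not taken verbatim from the paper).

```latex
% Generic constrained form of the reconstructions discussed above; the choice of
% regularizer phi distinguishes the TV and shearlet variants.
\[
  \min_{x}\ \phi(x)
  \quad\text{subject to}\quad \lVert Ax - b\rVert_2 \le \epsilon,
  \qquad
  \phi(x) \in \bigl\{\, \mathrm{TV}(x),\ \lVert Sx\rVert_1 \,\bigr\}.
\]
```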

  5. An automated machine vision system for the histological grading of cervical intraepithelial neoplasia (CIN).

    PubMed

    Keenan, S J; Diamond, J; McCluggage, W G; Bharucha, H; Thompson, D; Bartels, P H; Hamilton, P W

    2000-11-01

    The histological grading of cervical intraepithelial neoplasia (CIN) remains subjective, resulting in inter- and intra-observer variation and poor reproducibility in the grading of cervical lesions. This study has attempted to develop an objective grading system using automated machine vision. The architectural features of cervical squamous epithelium are quantitatively analysed using a combination of computerized digital image processing and Delaunay triangulation analysis. A total of 230 images digitally captured from cases previously classified by a gynaecological pathologist included normal cervical squamous epithelium (n=30), koilocytosis (n=46), CIN 1 (n=52), CIN 2 (n=56), and CIN 3 (n=46). Intra- and inter-observer variation had kappa values of 0.502 and 0.415, respectively. A machine vision system was developed in the KS400 macro programming language to segment and mark the centres of all nuclei within the epithelium. By object-oriented analysis of image components, the positional information of nuclei was used to construct a Delaunay triangulation mesh. Each mesh was analysed to compute triangle dimensions including the mean triangle area, the mean triangle edge length, and the number of triangles per unit area, giving an individual quantitative profile of measurements for each case. Discriminant analysis of the geometric data revealed the significant discriminatory variables from which a classification score was derived. The scoring system distinguished between normal and CIN 3 in 98.7% of cases and between koilocytosis and CIN 1 in 76.5% of cases, but only 62.3% of the CIN cases were classified into the correct group, with the CIN 2 group showing the highest rate of misclassification. Graphical plots of triangulation data demonstrated the continuum of morphological change from normal squamous epithelium to the highest grade of CIN, with overlapping of the groups originally defined by the pathologists. This study shows that automated location of nuclei in cervical biopsies using computerized image analysis is possible. Analysis of positional information enables quantitative evaluation of architectural features in CIN using Delaunay triangulation meshes, which is effective in the objective classification of CIN. This demonstrates the future potential of automated machine vision systems in diagnostic histopathology. Copyright 2000 John Wiley & Sons, Ltd.
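    The triangulation-derived features described above can be illustrated with a short sketch using a generic Delaunay routine; the nucleus coordinates and region area below are random stand-ins, not the study's data or its KS400 implementation.

```python
# Sketch of the architectural features described above: build a Delaunay mesh
# from nuclear centre coordinates and compute mean triangle area, mean edge
# length, and triangle density. Coordinates are random stand-ins for the
# nucleus positions produced by the segmentation step.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)
nuclei = rng.uniform(0, 100, size=(60, 2))   # (x, y) nucleus centres, in pixels

tri = Delaunay(nuclei)
pts = nuclei[tri.simplices]                  # (n_triangles, 3, 2)

# Triangle areas from the 2D determinant of two edge vectors.
v1 = pts[:, 1] - pts[:, 0]
v2 = pts[:, 2] - pts[:, 0]
areas = 0.5 * np.abs(v1[:, 0] * v2[:, 1] - v1[:, 1] * v2[:, 0])

# Unique edge lengths of the mesh.
edges = np.vstack([tri.simplices[:, [0, 1]],
                   tri.simplices[:, [1, 2]],
                   tri.simplices[:, [0, 2]]])
edges = np.unique(np.sort(edges, axis=1), axis=0)
lengths = np.linalg.norm(nuclei[edges[:, 0]] - nuclei[edges[:, 1]], axis=1)

epithelium_area = 100 * 100                  # placeholder for the measured region
profile = {"mean_triangle_area": areas.mean(),
           "mean_edge_length": lengths.mean(),
           "triangles_per_unit_area": len(tri.simplices) / epithelium_area}
print(profile)
```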

  6. Computerized planning of prostate cryosurgery using variable cryoprobe insertion depth.

    PubMed

    Rossi, Michael R; Tanaka, Daigo; Shimada, Kenji; Rabin, Yoed

    2010-02-01

    The current study presents a computerized planning scheme for prostate cryosurgery using a variable insertion depth strategy. This study is a part of an ongoing effort to develop computerized tools for cryosurgery. Based on typical clinical practices, previous automated planning schemes have required that all cryoprobes be aligned at a single insertion depth. The current study investigates the benefit of removing this constraint, in comparison with results based on uniform insertion depth planning as well as the so-called "pullback procedure". Planning is based on the so-called "bubble-packing method", and its quality is evaluated with bioheat transfer simulations. This study is based on five 3D prostate models, reconstructed from ultrasound imaging, and cryoprobe active length in the range of 15-35 mm. The variable insertion depth technique is found to consistently provide superior results when compared to the other placement methods. Furthermore, it is shown that both the optimal active length and the optimal number of cryoprobes vary among prostate models, based on the size and shape of the target region. Due to its low computational cost, the new scheme can be used to determine the optimal cryoprobe layout for a given prostate model in real time. Copyright 2008 Elsevier Inc. All rights reserved.

  7. Performance modeling of automated manufacturing systems

    NASA Astrophysics Data System (ADS)

    Viswanadham, N.; Narahari, Y.

    A unified and systematic treatment is presented of modeling methodologies and analysis techniques for performance evaluation of automated manufacturing systems. The book is the first treatment of the mathematical modeling of manufacturing systems. Automated manufacturing systems are surveyed and three principal analytical modeling paradigms are discussed: Markov chains, queues and queueing networks, and Petri nets.
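    As a small illustration of the queueing paradigm surveyed above, the sketch below evaluates the standard steady-state M/M/1 formulas for a single automated workstation; the arrival and service rates are invented numbers, not an example from the book.

```python
# Illustrative application of one modeling paradigm mentioned above: steady-state
# M/M/1 queue formulas for a single automated workstation. Rates are invented.
def mm1_metrics(arrival_rate, service_rate):
    rho = arrival_rate / service_rate      # utilization, must be < 1 for stability
    L = rho / (1 - rho)                    # mean number of jobs in the system
    W = 1 / (service_rate - arrival_rate)  # mean time a job spends in the system
    return {"utilization": rho, "jobs_in_system": L, "time_in_system": W}

print(mm1_metrics(arrival_rate=8.0, service_rate=10.0))  # jobs per hour
```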

  8. Space Age Spuds

    NASA Technical Reports Server (NTRS)

    2000-01-01

    American Ag-Tech International, Ltd. developed a system called Quantum Tubers through the Wisconsin Center for Space Automation and Robotics (a NASA-sponsored Commercial Space Center). Using computerization and technologies originally intended for growing plants in space, the company developed a growth chamber that accelerates plant growth and is free of plant pathogens. The chamber is used to grow minitubers, which serve as nuclear seed stock for potatoes. Using lighting technology, temperature and humidity controls, and automation technology, minitubers can be generated in one closed facility without much manual handling. This means they can be grown year-round in extreme environments. The system eliminates the need for multiple generations of seed and eliminates exposure to pathogens, disease and pests. The Quantum Tubers system can produce 10-20 million tubers throughout the year, about equal to the world's supply of this generation seed stock.

  9. Automated Electrostatics Environmental Chamber

    NASA Technical Reports Server (NTRS)

    Calle, Carlos; Lewis, Dean C.; Buchanan, Randy K.; Buchanan, Aubri

    2005-01-01

    The Mars Electrostatics Chamber (MEC) is an environmental chamber designed primarily to create atmospheric conditions like those at the surface of Mars to support experiments on electrostatic effects in the Martian environment. The chamber is equipped with a vacuum system, a cryogenic cooling system, an atmospheric-gas replenishing and analysis system, and a computerized control system that can be programmed by the user and that provides both automation and options for manual control. The control system can be set to maintain steady Mars-like conditions or to impose temperature and pressure variations of a Mars diurnal cycle at any given season and latitude. In addition, the MEC can be used in other areas of research because it can create steady or varying atmospheric conditions anywhere within the wide temperature, pressure, and composition ranges between the extremes of Mars-like and Earth-like conditions.

  10. Computerized controlled-substance surveillance: application involving automated storage and distribution cabinets.

    PubMed

    Wellman, G S; Hammond, R L; Talmage, R

    2001-10-01

    A secondary data-reporting system used to scan the archives of a hospital's automated storage and distribution cabinets (ASDCs) for indications of controlled-substance diversion is described. ASDCs, which allow access to multiple doses of the same medication at one time, use drug count verification to ensure complete audits and disposition tracking. Because an ASDC may interpret inappropriate removal of a medication as a normal transaction, users of ASDCs should have a comprehensive plan for detecting and investigating controlled-substance diversion. Monitoring for and detecting diversion can be difficult and time-consuming, given the limited report-generating features of many ASDCs. Managers at an 800-bed hospital used report-writing software to address these problems. This application interfaces with the hospital's computer system and generates customized reports. The monthly activity recapitulation report lists each user of the ASDCs and gives a summary of all the controlled-substance transactions for those users for the time period specified. The monthly summary report provides the backbone of the surveillance system and identifies situations that require further audit and review. This report provides a summary of each user's activity for a specific medication for the time period specified. The detailed summary report allows for efficient review of specific transactions before there is a decision to conduct a chart review. This report identifies all ASDC controlled-substance transactions associated with a user. A computerized report-generating system identifies instances of inappropriate removal of controlled substances from a hospital's ASDCs.

  11. Two-phase Computerized Planning of Cryosurgery Using Bubble-packing and Force-field Analogy

    PubMed Central

    Tanaka, Daigo; Shimada, Kenji; Rabin, Yoed

    2007-01-01

    Background: Cryosurgery is the destruction of undesired tissues by freezing, as in prostate cryosurgery, for example. Minimally-invasive cryosurgery is currently performed by means of an array of cryoprobes, each in the shape of a long hypodermic needle. The optimal arrangement of the cryoprobes, which is known to have a dramatic effect on the quality of the cryoprocedure, remains an art held by the cryosurgeon, based on the cryosurgeon's experience and “rules of thumb.” An automated computerized technique for cryosurgery planning is the subject matter of the current report, in an effort to improve the quality of cryosurgery. Method of Approach: A two-phase optimization method is proposed for this purpose, based on two previous and independent developments by this research team. Phase I is based on a bubble-packing method, previously used as an efficient method for finite elements meshing. Phase II is based on a force-field analogy method, which has proven to be robust at the expense of a typically long runtime. Results: As a proof-of-concept, results are demonstrated on a 2D case of a prostate cross-section. The major contribution of this study is to affirm that in many instances cryosurgery planning can be performed without extremely expensive simulations of bioheat transfer, achieved in Phase I. Conclusions: This new method of planning has proven to reduce planning runtime from hours to minutes, making automated planning practical in a clinical time frame. PMID:16532617

  12. Development and validation of an automated ventilator-associated event electronic surveillance system: A report of a successful implementation.

    PubMed

    Hebert, Courtney; Flaherty, Jennifer; Smyer, Justin; Ding, Jing; Mangino, Julie E

    2018-03-01

    Surveillance is an important tool for infection control; however, this task can often be time-consuming and take away from infection prevention activities. With the increasing availability of comprehensive electronic health records, there is an opportunity to automate these surveillance activities. The objective of this article is to describe the implementation of an electronic algorithm for ventilator-associated events (VAEs) at a large academic medical center. Methods: This article reports on a 6-month manual validation of a dashboard for VAEs. We developed a computerized algorithm for automatically detecting VAEs and compared the output of this algorithm to the traditional, manual method of VAE surveillance. Manual surveillance by the infection preventionists identified 13 possible and 11 probable ventilator-associated pneumonias (VAPs), and the VAE dashboard identified 16 possible and 13 probable VAPs. The dashboard had 100% sensitivity and 100% accuracy when compared with manual surveillance for possible and probable VAP. We report on the successfully implemented VAE dashboard. Workflow of the infection preventionists was simplified after implementation of the dashboard with subjective time-savings reported. Implementing a computerized dashboard for VAE surveillance at a medical center with a comprehensive electronic health record is feasible; however, this required significant initial and ongoing work on the part of data analysts and infection preventionists. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
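    The validation arithmetic behind the reported sensitivity and accuracy reduces to simple confusion-matrix ratios; the sketch below uses invented counts for illustration only and does not reproduce the study's data.

```python
# Sketch of the validation arithmetic for an electronic surveillance algorithm
# against a manual reference standard. The counts are invented examples.
def sensitivity_and_accuracy(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, accuracy

# tp/fn: events found/missed by the dashboard; tn/fp: event-free periods.
sens, acc = sensitivity_and_accuracy(tp=24, fp=0, fn=0, tn=150)
print(f"sensitivity={sens:.0%}, accuracy={acc:.0%}")
```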

  13. Computerized Lung Sound Analysis as diagnostic aid for the detection of abnormal lung sounds: a systematic review and meta-analysis

    PubMed Central

    Gurung, Arati; Scrafford, Carolyn G; Tielsch, James M; Levine, Orin S; Checkley, William

    2011-01-01

    Rationale The standardized use of a stethoscope for chest auscultation in clinical research is limited by its inherent inter-listener variability. Electronic auscultation and automated classification of recorded lung sounds may help prevent some of these shortcomings. Objective We sought to perform a systematic review and meta-analysis of studies implementing computerized lung sounds analysis (CLSA) to aid in the detection of abnormal lung sounds for specific respiratory disorders. Methods We searched for articles on CLSA in MEDLINE, EMBASE, Cochrane Library and ISI Web of Knowledge through July 31, 2010. Following qualitative review, we conducted a meta-analysis to estimate the sensitivity and specificity of CLSA for the detection of abnormal lung sounds. Measurements and Main Results Of 208 articles identified, we selected eight studies for review. Most studies employed either electret microphones or piezoelectric sensors for auscultation, and Fourier Transform and Neural Network algorithms for analysis and automated classification of lung sounds. Overall sensitivity for the detection of wheezes or crackles using CLSA was 80% (95% CI 72–86%) and specificity was 85% (95% CI 78–91%). Conclusions While quality data on CLSA are relatively limited, analysis of existing information suggests that CLSA can provide a relatively high specificity for detecting abnormal lung sounds such as crackles and wheezes. Further research and product development could promote the value of CLSA in research studies or its diagnostic utility in clinical settings. PMID:21676606

  14. Computerized lung sound analysis as diagnostic aid for the detection of abnormal lung sounds: a systematic review and meta-analysis.

    PubMed

    Gurung, Arati; Scrafford, Carolyn G; Tielsch, James M; Levine, Orin S; Checkley, William

    2011-09-01

    The standardized use of a stethoscope for chest auscultation in clinical research is limited by its inherent inter-listener variability. Electronic auscultation and automated classification of recorded lung sounds may help prevent some of these shortcomings. We sought to perform a systematic review and meta-analysis of studies implementing computerized lung sound analysis (CLSA) to aid in the detection of abnormal lung sounds for specific respiratory disorders. We searched for articles on CLSA in MEDLINE, EMBASE, Cochrane Library and ISI Web of Knowledge through July 31, 2010. Following qualitative review, we conducted a meta-analysis to estimate the sensitivity and specificity of CLSA for the detection of abnormal lung sounds. Of 208 articles identified, we selected eight studies for review. Most studies employed either electret microphones or piezoelectric sensors for auscultation, and Fourier Transform and Neural Network algorithms for analysis and automated classification of lung sounds. Overall sensitivity for the detection of wheezes or crackles using CLSA was 80% (95% CI 72-86%) and specificity was 85% (95% CI 78-91%). While quality data on CLSA are relatively limited, analysis of existing information suggests that CLSA can provide a relatively high specificity for detecting abnormal lung sounds such as crackles and wheezes. Further research and product development could promote the value of CLSA in research studies or its diagnostic utility in clinical settings. Copyright © 2011 Elsevier Ltd. All rights reserved.

  15. Electronic Design Automation: Integrating the Design and Manufacturing Functions

    NASA Technical Reports Server (NTRS)

    Bachnak, Rafic; Salkowski, Charles

    1997-01-01

    As the complexity of electronic systems grows, the traditional design practice, a sequential process, is replaced by concurrent design methodologies. A major advantage of concurrent design is that the feedback from software and manufacturing engineers can be easily incorporated into the design. The implementation of concurrent engineering methodologies is greatly facilitated by employing the latest Electronic Design Automation (EDA) tools. These tools offer integrated simulation of the electrical, mechanical, and manufacturing functions and support virtual prototyping, rapid prototyping, and hardware-software co-design. This report presents recommendations for enhancing the electronic design and manufacturing capabilities and procedures at JSC based on a concurrent design methodology that employs EDA tools.

  16. Nociones de la programacion de lenguas extranjeras: ensayo metodologico (Notions on the Programming of Foreign Languages: Methodological Experiment)

    ERIC Educational Resources Information Center

    Feldman, David

    1975-01-01

    Presents a computerized program for foreign language learning giving drills for all the major language skills. The drills are followed by an extensive bibliography of documents in some way dealing with computer based instruction, particularly foreign language instruction. (Text is in Spanish.) (TL)

  17. A Monte Carlo Approach for Adaptive Testing with Content Constraints

    ERIC Educational Resources Information Center

    Belov, Dmitry I.; Armstrong, Ronald D.; Weissman, Alexander

    2008-01-01

    This article presents a new algorithm for computerized adaptive testing (CAT) when content constraints are present. The algorithm is based on shadow CAT methodology to meet content constraints but applies Monte Carlo methods and provides the following advantages over shadow CAT: (a) lower maximum item exposure rates, (b) higher utilization of the…

  18. STUDENT-TEACHER POPULATION GROWTH MODEL--DYNAMOD II.

    ERIC Educational Resources Information Center

    ZABROWSKI, EDWARD K.; AND OTHERS

    DYNAMOD II is a computerized Markovian-type flow model developed to provide estimates of the educational population of students and teachers over selected intervals of time. The population is cross-classified into 108 groups by sex, race, age, and educational category. This note describes the methodology used in DYNAMOD II, compares DYNAMOD II…

  19. Integrative Education: Teaching Psychology with the Use of Literature and Informational Technology

    ERIC Educational Resources Information Center

    Toom, Anna

    2013-01-01

    In this work, a new method of teaching psychology based on the union of scientific, artistic, and information-technological knowledge is presented. The author teaches Cognitive Development in Early Childhood analyzing Anton Chekhov's short story "Grisha" and uses both traditional and computerized instructional methodology. In the authors' two…

  20. An Overview of Integrated Logistic Support in Medical Material Programs.

    DTIC Science & Technology

    1980-12-01

    [Table-of-contents fragments: Medical Integrated Logistic Support; Problem Definition and Objective; General Approach and Methodology; General Conclusions; Recommendations; Conclusion.] Technological advancement has caused major changes in medicine and dentistry in the last several decades. Intensive care units, computerized axial…

  1. Effects of Computer System and Vowel Loading on Measures of Nasalance

    ERIC Educational Resources Information Center

    Awan, Shaheen N.; Omlor, Kristin; Watts, Christopher R.

    2011-01-01

    Purpose: The purpose of this study was to determine similarities and differences in nasalance scores observed with different computerized nasalance systems in the context of vowel-loaded sentences. Methodology: Subjects were 46 Caucasian adults with no perceived hyper- or hyponasality. Nasalance scores were obtained using the Nasometer 6200 (Kay…

  2. Methodological, technical, and ethical issues of a computerized data system.

    PubMed

    Rice, C A; Godkin, M A; Catlin, R J

    1980-06-01

    This report examines some methodological, technical, and ethical issues which need to be addressed in designing and implementing a valid and reliable computerized clinical data base. The report focuses on the data collection system used by four residency based family health centers, affiliated with the University of Massachusetts Medical Center. It is suggested that data reliability and validity can be maximized by: (1) standardizing encounter forms at affiliated health centers to eliminate recording biases and ensure data comparability; (2) using forms with a diagnosis checklist to reduce coding errors and increase the number of diagnoses recorded per encounter; (3) developing uniform diagnostic criteria; (4) identifying sources of error, including discrepancies of clinical data as recorded in medical records, encounter forms, and the computer; and (5) improving provider cooperation in recording data by distributing data summaries which reinforce the data's applicability to service provision. Potential applications of the data for research purposes are restricted by personnel and computer costs, confidentiality considerations, programming related issues, and, most importantly, health center priorities, largely focused on patient care, not research.

  3. A Validity-Based Approach to Quality Control and Assurance of Automated Scoring

    ERIC Educational Resources Information Center

    Bejar, Isaac I.

    2011-01-01

    Automated scoring of constructed responses is already operational in several testing programmes. However, as the methodology matures and the demand for the utilisation of constructed responses increases, the volume of automated scoring is likely to increase at a fast pace. Quality assurance and control of the scoring process will likely be more…

  4. Failure mode and effect analysis oriented to risk-reduction interventions in intraoperative electron radiation therapy: the specific impact of patient transportation, automation, and treatment planning availability.

    PubMed

    López-Tarjuelo, Juan; Bouché-Babiloni, Ana; Santos-Serra, Agustín; Morillo-Macías, Virginia; Calvo, Felipe A; Kubyshin, Yuri; Ferrer-Albiach, Carlos

    2014-11-01

    Industrial companies use failure mode and effect analysis (FMEA) to improve quality. Our objective was to describe an FMEA and subsequent interventions for an automated intraoperative electron radiotherapy (IOERT) procedure with computed tomography simulation, pre-planning, and a fixed conventional linear accelerator. A process map, an FMEA, and a fault tree analysis are reported. The equipment considered was the radiance treatment planning system (TPS), the Elekta Precise linac, and TN-502RDM-H metal-oxide-semiconductor-field-effect transistor in vivo dosimeters. Computerized order-entry and treatment-automation were also analyzed. Fifty-seven potential modes and effects were identified and classified into 'treatment cancellation' and 'delivering an unintended dose'. They were graded from 'inconvenience' or 'suboptimal treatment' to 'total cancellation' or 'potentially wrong' or 'very wrong administered dose', although these latter effects were never experienced. Risk priority numbers (RPNs) ranged from 3 to 324 and totaled 4804. After interventions such as double checking, interlocking, automation, and structural changes the final total RPN was reduced to 1320. FMEA is crucial for prioritizing risk-reduction interventions. In a semi-surgical procedure like IOERT double checking has the potential to reduce risk and improve quality. Interlocks and automation should also be implemented to increase the safety of the procedure. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
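    The risk priority numbers discussed above follow the usual FMEA product of severity, occurrence, and detectability scores; the sketch below illustrates that bookkeeping with invented failure modes and scores, since the paper's own scoring scales are not reproduced here.

```python
# Generic FMEA bookkeeping sketch: each failure mode gets a risk priority number
# RPN = severity x occurrence x detectability. The modes and scores are invented.
failure_modes = [
    {"mode": "wrong applicator docked",  "S": 9, "O": 2, "D": 6},
    {"mode": "planned dose not transferred", "S": 8, "O": 3, "D": 3},
    {"mode": "treatment cancelled",      "S": 3, "O": 4, "D": 1},
]

for fm in failure_modes:
    fm["RPN"] = fm["S"] * fm["O"] * fm["D"]

total_rpn = sum(fm["RPN"] for fm in failure_modes)
worst_first = sorted(failure_modes, key=lambda fm: fm["RPN"], reverse=True)
print(total_rpn, [f'{fm["mode"]}: {fm["RPN"]}' for fm in worst_first])
```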

  5. Computer-Assisted Automated Scoring of Polysomnograms Using the Somnolyzer System

    PubMed Central

    Punjabi, Naresh M.; Shifa, Naima; Dorffner, Georg; Patil, Susheel; Pien, Grace; Aurora, Rashmi N.

    2015-01-01

    Study Objectives: Manual scoring of polysomnograms is a time-consuming and tedious process. To expedite the scoring of polysomnograms, several computerized algorithms for automated scoring have been developed. The overarching goal of this study was to determine the validity of the Somnolyzer system, an automated system for scoring polysomnograms. Design: The analysis sample comprised 97 sleep studies. Each polysomnogram was manually scored by certified technologists from four sleep laboratories and concurrently subjected to automated scoring by the Somnolyzer system. Agreement between manual and automated scoring was examined. Sleep staging and scoring of disordered breathing events was conducted using the 2007 American Academy of Sleep Medicine criteria. Setting: Clinical sleep laboratories. Measurements and Results: A high degree of agreement was noted between manual and automated scoring of the apnea-hypopnea index (AHI). The average correlation between the manually scored AHI across the four clinical sites was 0.92 (95% confidence interval: 0.90–0.93). Similarly, the average correlation between the manual and Somnolyzer-scored AHI values was 0.93 (95% confidence interval: 0.91–0.96). Thus, interscorer correlation between the manually scored results was no different than that derived from manual and automated scoring. Substantial concordance in the arousal index, total sleep time, and sleep efficiency between manual and automated scoring was also observed. In contrast, differences were noted between manually and automatically scored percentages of sleep stages N1, N2, and N3. Conclusion: Automated analysis of polysomnograms using the Somnolyzer system provides results that are comparable to manual scoring for commonly used metrics in sleep medicine. Although differences exist between manual and automated scoring for specific sleep stages, the level of agreement between manual and automated scoring is not significantly different than that between any two human scorers. In light of the burden associated with manual scoring, automated scoring platforms provide a viable complement of tools in the diagnostic armamentarium of sleep medicine. Citation: Punjabi NM, Shifa N, Dorffner G, Patil S, Pien G, Aurora RN. Computer-assisted automated scoring of polysomnograms using the Somnolyzer system. SLEEP 2015;38(10):1555–1566. PMID:25902809

  6. Biomimicry in Product Design through Materials Selection and Computer Aided Engineering

    NASA Astrophysics Data System (ADS)

    Alexandridis, G.; Tzetzis, D.; Kyratsis, P.

    2016-11-01

    The aim of this study is to demonstrate a 7-step methodology that describes the way nature can act as a source of inspiration for the design and development of a product. Furthermore, it suggests specialized computerized tools and methods for optimizing the product with regard to its environmental impact, i.e., material selection and production methods. For validation purposes, a garden chaise lounge that imitates the form of a scorpion was developed as a case study to present the current methodology.

  7. Extending the Instructional Systems Development Methodology.

    ERIC Educational Resources Information Center

    O'Neill, Colin E.

    1993-01-01

    Describes ways that components of Information Engineering (IE) methodology can be used by training system developers to extend Instructional Systems Development (ISD) methodology. Aspects of IE that are useful in ISD are described, including requirements determination, group facilitation, integrated automated tool support, and prototyping.…

  8. Human/Automation Trade Methodology for the Moon, Mars and Beyond

    NASA Technical Reports Server (NTRS)

    Korsmeyer, David J.

    2009-01-01

    It is possible to create a consistent trade methodology that can characterize operations model alternatives for crewed exploration missions. For example, a trade-space that is organized around the objective of maximizing Crew Exploration Vehicle (CEV) independence would take as input a classification of the category of analysis to be conducted or decision to be made, and a commitment to a specific point in a mission profile at which the analysis or decision is to be made. For example, does the decision have to do with crew activity planning, or life support? Is the mission phase trans-Earth injection, cruise, or lunar descent? Different kinds of decision analysis of the trade-space between human and automated decisions will occur at different points in a mission's profile. The necessary objectives at a given point in time during a mission will call for different kinds of response with respect to where and how computers and automation are expected to help provide an accurate, safe, and timely response. In this paper, a consistent methodology for assessing the trades between human and automated decisions on-board will be presented and various examples discussed.

  9. [Some approaches to the countermeasure system for a mars exploration mission].

    PubMed

    Kozlovskaia, I B; Egorov, A D; Son'kin, V D

    2010-01-01

    The article discusses the physiological and methodological principles of organizing the training process and computerizing it during a Mars flight under conditions of autonomous crew activity, providing for interaction with onboard medical equipment, self-monitoring of crew health, performance of preventive measures and diagnostic studies, and, when necessary, treatment. In very long autonomous flights, the ability of ground specialists to monitor crew members' condition becomes substantially limited, which points to the need to computerize the monitoring of crew health, including the conduct of preventive measures. The situation is further complicated by the impossibility of receiving and transmitting the necessary information in real time and of returning the crew to Earth in an emergency. Under these conditions, the tasks of physical countermeasures should be handled by an onboard automated expert system that manages the training of each crew member with the aim of optimizing their psychophysical condition.

  10. Evaluation of a computerized aid for creating human behavioral representations of human-computer interaction.

    PubMed

    Williams, Kent E; Voigt, Jeffrey R

    2004-01-01

    The research reported herein presents the results of an empirical evaluation that focused on the accuracy and reliability of cognitive models created using a computerized tool: the cognitive analysis tool for human-computer interaction (CAT-HCI). A sample of participants, expert in interacting with a newly developed tactical display for the U.S. Army's Bradley Fighting Vehicle, individually modeled their knowledge of 4 specific tasks employing the CAT-HCI tool. Measures of the accuracy and consistency of task models created by these task domain experts using the tool were compared with task models created by a double expert. The findings indicated a high degree of consistency and accuracy between the different "single experts" in the task domain in terms of the resultant models generated using the tool. Actual or potential applications of this research include assessing human-computer interaction complexity, determining the productivity of human-computer interfaces, and analyzing an interface design to determine whether methods can be automated.

  11. E-Prescribing: History, Issues, and Potentials

    PubMed Central

    Salmon, J. Warren; Jiang, Ruixuan

    2012-01-01

    Electronic-Prescribing, Computerized Prescribing, or E-RX has increased dramatically of late in the American health care system, a long overdue alternative to the written form for the almost five billion drug treatments annually. This paper examines the history and selected issues in the rise of E-RX by a review of salient literature, interviews, and field observations in Pharmacy. Pharmacies were early adopters of computerization for a variety of reasons. The profession in its new corporate forms of chain drug stores and pharmacy benefits firms has sought efficiencies, profit enhancements, and clinical improvements through managed care strategies that rely upon data automation. E-RX seems to be a leading factor in overall physician acceptance of Electronic Medical Records (EMRs), although the Centers for Medicare and Medicaid (CMS) incentives seem to be the propelling force in acceptance. We conclude that greater research should be conducted by public health professionals to focus on resolving issues of pharmaceutical use, safety, and cost escalation, which persist and remain dire following health reform. PMID:23569654

  12. j5 DNA assembly design automation.

    PubMed

    Hillson, Nathan J

    2014-01-01

    Modern standardized methodologies, described in detail in the previous chapters of this book, have enabled the software-automated design of optimized DNA construction protocols. This chapter describes how to design (combinatorial) scar-less DNA assembly protocols using the web-based software j5. j5 assists biomedical and biotechnological researchers in constructing DNA by automating the design of optimized protocols for flanking homology sequence as well as type IIS endonuclease-mediated DNA assembly methodologies. Unlike any other software tool available today, j5 designs scar-less combinatorial DNA assembly protocols, performs a cost-benefit analysis to identify which portions of an assembly process would be less expensive to outsource to a DNA synthesis service provider, and designs hierarchical DNA assembly strategies to mitigate anticipated poor assembly junction sequence performance. Software tools integrated with j5 add significant value to the j5 design process through graphical user-interface enhancement and downstream liquid-handling robotic laboratory automation.

  13. Facial recognition techniques applied to the automated registration of patients in the emergency treatment of head injuries.

    PubMed

    Gooroochurn, M; Kerr, D; Bouazza-Marouf, K; Ovinis, M

    2011-02-01

    This paper describes the development of a registration framework for image-guided solutions to the automation of certain routine neurosurgical procedures. The registration process aligns the pose of the patient in the preoperative space to that of the intraoperative space. Computerized tomography images are used in the preoperative (planning) stage, whilst white light (TV camera) images are used to capture the intraoperative pose. Craniofacial landmarks, rather than artificial markers, are used as the registration basis for the alignment. To create further synergy between the user and the image-guided system, automated methods for extraction of these landmarks have been developed. The results obtained from the application of a polynomial neural network classifier based on Gabor features for the detection and localization of the selected craniofacial landmarks, namely the ear tragus and eye corners in the white light modality are presented. The robustness of the classifier to variations in intensity and noise is analysed. The results show that such a classifier gives good performance for the extraction of craniofacial landmarks.
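    A hedged sketch of the Gabor feature extraction step described above is shown below using generic OpenCV calls; the kernel parameters, number of orientations, and the random image patch are assumptions for illustration, not the classifier or settings used in the paper.

```python
# Sketch of Gabor feature extraction: filter an image patch with a bank of Gabor
# kernels at several orientations and collect summary responses as a feature
# vector for a landmark classifier. Parameters are illustrative only.
import cv2
import numpy as np

def gabor_features(patch, orientations=4):
    features = []
    for k in range(orientations):
        theta = k * np.pi / orientations
        # ksize=(21, 21), sigma=4.0, lambd=10.0, gamma=0.5, psi=0
        kernel = cv2.getGaborKernel((21, 21), 4.0, theta, 10.0, 0.5, 0)
        response = cv2.filter2D(patch, cv2.CV_32F, kernel)
        features.extend([response.mean(), response.std()])
    return np.array(features)

patch = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # stand-in for a tragus patch
print(gabor_features(patch))
```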

  14. Spray automated balancing of rotors: Methods and materials

    NASA Technical Reports Server (NTRS)

    Smalley, Anthony J.; Baldwin, Richard M.; Schick, Wilbur R.

    1988-01-01

    The work described consists of two parts. In the first part, a survey is performed to assess the state of the art in rotor balancing technology as it applies to Army gas turbine engines and associated power transmission hardware. The second part evaluates thermal spray processes for balancing weight addition in an automated balancing procedure. The industry survey reveals that: (1) computerized balancing equipment is valuable to reduce errors, improve balance quality, and provide documentation; (2) slow-speed balancing is used exclusively, with no forseeable need for production high-speed balancing; (3) automated procedures are desired; and (4) thermal spray balancing is viewed with cautious optimism whereas laser balancing is viewed with concern for flight propulsion hardware. The FARE method (Fuel/Air Repetitive Explosion) was selected for experimental evaluation of bond strength and fatigue strength. Material combinations tested were tungsten carbide on stainless steel (17-4), Inconel 718 on Inconel 718, and Triballoy 800 on Inconel 718. Bond strengths were entirely adequate for use in balancing. Material combinations have been identified for use in hot and cold sections of an engine, with fatigue strengths equivalent to those for hand-ground materials.

  15. Recent advances in automated protein design and its future challenges.

    PubMed

    Setiawan, Dani; Brender, Jeffrey; Zhang, Yang

    2018-04-25

    Protein function is determined by protein structure which is in turn determined by the corresponding protein sequence. If the rules that cause a protein to adopt a particular structure are understood, it should be possible to refine or even redefine the function of a protein by working backwards from the desired structure to the sequence. Automated protein design attempts to calculate the effects of mutations computationally with the goal of more radical or complex transformations than are accessible by experimental techniques. Areas covered: The authors give a brief overview of the recent methodological advances in computer-aided protein design, showing how methodological choices affect final design and how automated protein design can be used to address problems considered beyond traditional protein engineering, including the creation of novel protein scaffolds for drug development. Also, the authors address specifically the future challenges in the development of automated protein design. Expert opinion: Automated protein design holds potential as a protein engineering technique, particularly in cases where screening by combinatorial mutagenesis is problematic. Considering solubility and immunogenicity issues, automated protein design is initially more likely to make an impact as a research tool for exploring basic biology in drug discovery than in the design of protein biologics.

  16. The Computer Bulletin Board.

    ERIC Educational Resources Information Center

    Collins, Michael J.; Vitz, Ed

    1988-01-01

    Examines two computer interfaced lab experiments: 1) discusses the automation of a Perkin Elmer 337 infrared spectrophotometer noting the mechanical and electronic changes needed; 2) uses the Gouy method and Lotus Measure software to automate magnetic susceptibility determinations. Methodology is described. (MVL)

  17. Ontology-Driven Information Integration

    NASA Technical Reports Server (NTRS)

    Tissot, Florence; Menzel, Chris

    2005-01-01

    Ontology-driven information integration (ODII) is a method of computerized, automated sharing of information among specialists who have expertise in different domains and who are members of subdivisions of a large, complex enterprise (e.g., an engineering project, a government agency, or a business). In ODII, one uses rigorous mathematical techniques to develop computational models of engineering and/or business information and processes. These models are then used to develop software tools that support the reliable processing and exchange of information among the subdivisions of this enterprise or between this enterprise and other enterprises.

  18. Automated Recognition of Vegetation and Water Bodies on the Territory of Megacities in Satellite Images of Visible and IR Bands

    NASA Astrophysics Data System (ADS)

    Mozgovoy, Dmitry K.; Hnatushenko, Volodymyr V.; Vasyliev, Volodymyr V.

    2018-04-01

    Vegetation and water bodies are fundamental elements of urban ecosystems, and water mapping is critical for urban and landscape planning and management. A methodology of automated recognition of vegetation and water bodies on the territory of megacities in satellite images of sub-meter spatial resolution of the visible and IR bands is proposed. By processing multispectral images from the satellite SuperView-1A, vector layers of recognized plant and water objects were obtained. Analysis of the results of image processing showed a sufficiently high accuracy of the delineation of the boundaries of recognized objects and a good separation of classes. The developed methodology provides a significant increase in the efficiency and reliability of updating maps of large cities while reducing financial costs. Due to the high degree of automation, the proposed methodology can be implemented in the form of a geo-information web service functioning in the interests of a wide range of public services and commercial institutions.
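
    The abstract does not specify the recognition algorithm, so the sketch below shows one common approach to vegetation and water masking from multispectral bands (NDVI/NDWI thresholding); the index choice and thresholds are assumptions for illustration only.

```python
# Sketch of index-based vegetation/water masking from multispectral bands.
import numpy as np

def classify_pixels(nir, red, green, ndvi_thresh=0.3, ndwi_thresh=0.2):
    """Return boolean vegetation and water masks from NIR, red, and green bands."""
    eps = 1e-6                                  # avoid division by zero
    ndvi = (nir - red) / (nir + red + eps)      # high for vegetation
    ndwi = (green - nir) / (green + nir + eps)  # high for open water
    vegetation = ndvi > ndvi_thresh
    water = ndwi > ndwi_thresh
    return vegetation, water

# Example with synthetic reflectance data standing in for satellite bands
rng = np.random.default_rng(1)
nir, red, green = (rng.random((100, 100)) for _ in range(3))
veg, wat = classify_pixels(nir, red, green)
print("vegetation pixels:", int(veg.sum()), "water pixels:", int(wat.sum()))
```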

  19. Space Station man-machine automation trade-off analysis

    NASA Technical Reports Server (NTRS)

    Zimmerman, W. F.; Bard, J.; Feinberg, A.

    1985-01-01

    The man-machine automation tradeoff methodology presented here is one of four research tasks comprising the autonomous spacecraft system technology (ASST) project. ASST was established to identify and study system-level design problems for autonomous spacecraft. Using the Space Station as an example of a spacecraft system requiring a certain level of autonomous control, a system-level, man-machine automation tradeoff methodology is presented that: (1) optimizes man-machine mixes for different ground and on-orbit crew functions subject to cost, safety, weight, power, and reliability constraints, and (2) plots the best incorporation plan for new, emerging technologies by weighing cost, relative availability, reliability, safety, importance to out-year missions, and ease of retrofit. Although the methodology takes a fairly straightforward approach to valuing human productivity, it remains sensitive to the important subtleties associated with designing a well-integrated man-machine system. These subtleties include considerations such as crew preferences to retain certain spacecraft control functions, or valuing human integration/decision capabilities over equivalent hardware/software where appropriate.

  20. Programming methodology for a general purpose automation controller

    NASA Technical Reports Server (NTRS)

    Sturzenbecker, M. C.; Korein, J. U.; Taylor, R. H.

    1987-01-01

    The General Purpose Automation Controller is a multi-processor architecture for automation programming. A methodology has been developed whose aim is to simplify the task of programming distributed real-time systems for users in research or manufacturing. Programs are built by configuring function blocks (low-level computations) into processes using data flow principles. These processes are activated through the verb mechanism. Verbs are divided into two classes: those which support devices, such as robot joint servos, and those which perform actions on devices, such as motion control. This programming methodology was developed in order to achieve the following goals: (1) specifications for real-time programs which are to a high degree independent of hardware considerations such as processor, bus, and interconnect technology; (2) a component approach to software, so that software required to support new devices and technologies can be integrated by reconfiguring existing building blocks; (3) resistance to error and ease of debugging; and (4) a powerful command language interface.
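
    A conceptual sketch of the function-block/data-flow idea described above, with a verb-like activation of a toy joint-servo process; the class names and wiring scheme are illustrative and are not the controller's actual programming interface.

```python
# Conceptual sketch: function blocks wired into a process using data-flow principles.
class FunctionBlock:
    """A low-level computation with named inputs and one output."""
    def __init__(self, name, fn):
        self.name, self.fn = name, fn

    def run(self, **inputs):
        return self.fn(**inputs)

class Process:
    """A data-flow configuration of function blocks, activated like a 'verb'."""
    def __init__(self, blocks):
        self.blocks = blocks   # ordered list of (block, input wiring)

    def activate(self, data):
        for block, wiring in self.blocks:
            kwargs = {arg: data[key] for arg, key in wiring.items()}
            data[block.name] = block.run(**kwargs)
        return data

# Toy joint-servo process: position error -> proportional command
error_block = FunctionBlock("error", lambda setpoint, actual: setpoint - actual)
command_block = FunctionBlock("command", lambda error: 0.8 * error)

servo = Process([(error_block, {"setpoint": "setpoint", "actual": "actual"}),
                 (command_block, {"error": "error"})])
print(servo.activate({"setpoint": 1.2, "actual": 1.0})["command"])
```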

  1. Intelligent Automation Approach for Improving Pilot Situational Awareness

    NASA Technical Reports Server (NTRS)

    Spirkovska, Lilly

    2004-01-01

    Automation in the aviation domain has been increasing for the past two decades. Pilot reaction to automation varies from highly favorable to highly critical depending on both the pilot's background and how effectively the automation is implemented. We describe a user-centered approach for automation that considers the pilot's tasks and his needs related to accomplishing those tasks. Further, we augment rather than replace how the pilot currently fulfills his goals, relying on redundant displays that offer the pilot an opportunity to build trust in the automation. Our prototype system automates the interpretation of hydraulic system faults of the UH-60 helicopter. We describe the problem with the current system and our methodology for resolving it.

  2. The Development of a Computer Model for Projecting Statewide College Enrollments: A Preliminary Study.

    ERIC Educational Resources Information Center

    Rensselaer Research Corp., Troy, NY.

    The purpose of this study was to develop the schema and methodology for the construction of a computerized mathematical model designed to project college and university enrollments in New York State and to meet the future increased demands of higher education planners. This preliminary report describes the main structure of the proposed computer…

  3. How to Use the DX SYSTEM of Diagnostic Testing. Methodology Project.

    ERIC Educational Resources Information Center

    McArthur, David; Cabello, Beverly

    The DX SYSTEM of Diagnostic Testing is an easy-to-use computerized system for developing and administering diagnostic tests. A diagnostic test measures a student's mastery of a specific domain (skill or content area). It examines the necessary subskills hierarchically from the most to the least complex. The DX SYSTEM features tailored testing with…

  4. Processes and Factors Underlying Adolescent Males' Attitudes and Decision-Making in Relation to an Unplanned Pregnancy

    ERIC Educational Resources Information Center

    Condon, John T.; Corkindale, Carolyn J.; Russell, Alan; Quinlivan, Julie A.

    2006-01-01

    This research examined adolescent males' decision-making when confronted with a hypothetical unplanned pregnancy in a sexual partner. An innovative methodology, involving a computerized simulation game was utilized with 386 Australian males (mean age of 15 years). Data were gathered from responses made during the simulation, and questionnaires…

  5. Development and validation of a survey instrument for assessing prescribers' perception of computerized drug-drug interaction alerts.

    PubMed

    Zheng, Kai; Fear, Kathleen; Chaffee, Bruce W; Zimmerman, Christopher R; Karls, Edward M; Gatwood, Justin D; Stevenson, James G; Pearlman, Mark D

    2011-12-01

    To develop a theoretically informed and empirically validated survey instrument for assessing prescribers' perception of computerized drug-drug interaction (DDI) alerts. The survey is grounded in the unified theory of acceptance and use of technology and an adapted accident causation model. Development of the instrument was also informed by a review of the extant literature on prescribers' attitude toward computerized medication safety alerts and common prescriber-provided reasons for overriding. To refine and validate the survey, we conducted a two-stage empirical validation study consisting of a pretest with a panel of domain experts followed by a field test among all eligible prescribers at our institution. The resulting survey instrument contains 28 questionnaire items assessing six theoretical dimensions: performance expectancy, effort expectancy, social influence, facilitating conditions, perceived fatigue, and perceived use behavior. Satisfactory results were obtained from the field validation; however, a few potential issues were also identified. We analyzed these issues accordingly and the results led to the final survey instrument as well as usage recommendations. High override rates of computerized medication safety alerts have been a prevalent problem. They are usually caused by, or manifested in, issues of poor end user acceptance. However, standardized research tools for assessing and understanding end users' perception are currently lacking, which inhibits knowledge accumulation and consequently forgoes improvement opportunities. The survey instrument presented in this paper may help fill this methodological gap. We developed and empirically validated a survey instrument that may be useful for future research on DDI alerts and other types of computerized medication safety alerts more generally.

  6. COMPUTERIZED EXPERT SYSTEM FOR EVALUATION OF AUTOMATED VISUAL FIELDS FROM THE ISCHEMIC OPTIC NEUROPATHY DECOMPRESSION TRIAL: METHODS, BASELINE FIELDS, AND SIX-MONTH LONGITUDINAL FOLLOW-UP

    PubMed Central

    Feldon, Steven E

    2004-01-01

    Purpose To validate a computerized expert system evaluating visual fields in a prospective clinical trial, the Ischemic Optic Neuropathy Decompression Trial (IONDT). To identify the pattern and within-pattern severity of field defects for study eyes at baseline and 6-month follow-up. Design Humphrey visual field (HVF) change was used as the outcome measure for a prospective, randomized, multi-center trial to test the null hypothesis that optic nerve sheath decompression was ineffective in treating nonarteritic anterior ischemic optic neuropathy and to ascertain the natural history of the disease. Methods An expert panel established criteria for the type and severity of visual field defects. Using these criteria, a rule-based computerized expert system interpreted HVF from baseline and 6-month visits for patients randomized to surgery or careful follow-up and for patients who were not randomized. Results A computerized expert system was devised and validated. The system was then used to analyze HVFs. The pattern of defects found at baseline for patients randomized to surgery did not differ from that of patients randomized to careful follow-up. The most common pattern of defect was a superior and inferior arcuate with central scotoma for randomized eyes (19.2%) and a superior and inferior arcuate for nonrandomized eyes (30.6%). Field patterns at 6 months and baseline were not different. For randomized study eyes, the superior altitudinal defects improved (P = .03), as did the inferior altitudinal defects (P = .01). For nonrandomized study eyes, only the inferior altitudinal defects improved (P = .02). No treatment effect was noted. Conclusions A novel rule-based expert system successfully interpreted visual field defects at baseline of eyes enrolled in the IONDT. PMID:15747764
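
    A toy rule-based sketch in the spirit of the expert system above, classifying a field from regional mean sensitivities; the regions, cutoff, and rules are illustrative and are not the IONDT panel's criteria.

```python
# Toy rule-based visual field classifier (illustrative thresholds and rules only).
def classify_field(mean_dB):
    """mean_dB: dict of mean sensitivity (dB) per region of a Humphrey field."""
    DEFECT = 10.0          # assumed sensitivity cutoff marking a regional defect
    sup = mean_dB["superior_arcuate"] < DEFECT
    inf = mean_dB["inferior_arcuate"] < DEFECT
    cen = mean_dB["central"] < DEFECT
    if sup and inf and cen:
        return "superior and inferior arcuate with central scotoma"
    if sup and inf:
        return "superior and inferior arcuate"
    if sup or inf:
        return "arcuate/altitudinal defect"
    return "no defect by these rules"

print(classify_field({"superior_arcuate": 6.0,
                      "inferior_arcuate": 8.5,
                      "central": 22.0}))
```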

  7. Automated Blazar Light Curves Using Machine Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Spencer James

    2017-07-27

    This presentation describes a problem and methodology pertaining to automated blazar light curves. Namely, optical variability patterns for blazars require the construction of light curves and in order to generate the light curves, data must be filtered before processing to ensure quality.

  8. Lesion Border Detection in Dermoscopy Images

    PubMed Central

    Celebi, M. Emre; Schaefer, Gerald; Iyatomi, Hitoshi; Stoecker, William V.

    2009-01-01

    Background Dermoscopy is one of the major imaging modalities used in the diagnosis of melanoma and other pigmented skin lesions. Due to the difficulty and subjectivity of human interpretation, computerized analysis of dermoscopy images has become an important research area. One of the most important steps in dermoscopy image analysis is the automated detection of lesion borders. Methods In this article, we present a systematic overview of the recent border detection methods in the literature paying particular attention to computational issues and evaluation aspects. Conclusion Common problems with the existing approaches include the acquisition, size, and diagnostic distribution of the test image set, the evaluation of the results, and the inadequate description of the employed methods. Border determination by dermatologists appears to depend upon higher-level knowledge, therefore it is likely that the incorporation of domain knowledge in automated methods will enable them to perform better, especially in sets of images with a variety of diagnoses. PMID:19121917

  9. Computer-Interpreted Electrocardiograms: Benefits and Limitations.

    PubMed

    Schläpfer, Jürg; Wellens, Hein J

    2017-08-29

    Computerized interpretation of the electrocardiogram (CIE) was introduced to improve the correct interpretation of the electrocardiogram (ECG), facilitating health care decision making and reducing costs. Worldwide, millions of ECGs are recorded annually, with the majority automatically analyzed, followed by an immediate interpretation. Limitations in the diagnostic accuracy of CIE were soon recognized and still persist, despite ongoing improvement in ECG algorithms. Unfortunately, inexperienced physicians ordering the ECG may fail to recognize interpretation mistakes and accept the automated diagnosis without criticism. Clinical mismanagement may result, with the risk of exposing patients to useless investigations or potentially dangerous treatment. Consequently, CIE over-reading and confirmation by an experienced ECG reader are essential and are repeatedly recommended in published reports. Implementation of new ECG knowledge is also important. The current status of automated ECG interpretation is reviewed, with suggestions for improvement. Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  10. Automated method and system for the alignment and correlation of images from two different modalities

    DOEpatents

    Giger, Maryellen L.; Chen, Chin-Tu; Armato, Samuel; Doi, Kunio

    1999-10-26

    A method and system for the computerized registration of radionuclide images with radiographic images, including generating image data from radiographic and radionuclide images of the thorax. Techniques include contouring the lung regions in each type of chest image, scaling and registration of the contours based on location of lung apices, and superimposition after appropriate shifting of the images. Specific applications are given for the automated registration of radionuclide lung scans with chest radiographs. The method in the example given yields a system that spatially registers and correlates digitized chest radiographs with V/Q scans in order to correlate V/Q functional information with the greater structural detail of chest radiographs. Final output could be the computer-determined contours from each type of image superimposed on any of the original images, or superimposition of the radionuclide image data, which contains high activity, onto the radiographic chest image.
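
    A minimal sketch of the scale-and-shift registration step, assuming the left and right lung apex locations have already been found in both modalities; the coordinates and calibration values are hypothetical.

```python
# Sketch: similarity transform (scale + translation) estimated from lung apices.
import numpy as np

def apex_registration(apices_moving, apices_fixed):
    """Return (scale, shift) mapping moving-image points onto the fixed image."""
    m = np.asarray(apices_moving, dtype=float)   # shape (2, 2): left and right apex
    f = np.asarray(apices_fixed, dtype=float)
    scale = np.linalg.norm(f[1] - f[0]) / np.linalg.norm(m[1] - m[0])
    shift = f.mean(axis=0) - scale * m.mean(axis=0)
    return scale, shift

def apply_transform(points, scale, shift):
    return scale * np.asarray(points, dtype=float) + shift

# Hypothetical apex coordinates (row, col) in a V/Q scan and a chest radiograph
scale, shift = apex_registration([(40, 30), (40, 90)], [(120, 200), (120, 440)])
contour = [(60, 50), (80, 70)]                  # lung contour points in the V/Q scan
print(apply_transform(contour, scale, shift))   # same points in radiograph coordinates
```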

  11. Information in medical decision making: how consistent is our management?

    PubMed

    Lorence, Daniel P; Spink, Amanda; Jameson, Robert

    2002-01-01

    The use of outcomes data in clinical environments requires a correspondingly greater variety of information used in decision making, the measurement of quality, and clinical performance. As information becomes integral in the decision-making process, trustworthy decision support data are required. Using data from a national census of certified health information managers, variation in automated data quality management practices was examined. Relatively low overall adoption of automated data management exists in health care organizations, with significant geographic and practice setting variation. Nonuniform regional adoption of computerized data management exists, despite national mandates that promote and in some cases require uniform adoption. Overall, a significant number of respondents (42.7%) indicated that they had not adopted policies and procedures to direct the timeliness of data capture, with 57.3% having adopted such practices. The inconsistency of patient data policy suggests that provider organizations do not use uniform information management methods, despite growing federal mandates to do so.

  12. A computerized procedure for teaching the relationship between graphic symbols and their referents.

    PubMed

    Isaacson, Mick; Lloyd, Lyle L

    2013-01-01

    Many individuals with little or no functional speech communicate through graphic symbols. Communication is enhanced when the relationship between symbols and their referents is learned to such a degree that retrieval is effortless, resulting in fluent communication. Developing fluency is a time-consuming endeavor for special educators and speech-language pathologists (SLPs). It would be beneficial for these professionals to have an automated procedure based on the most efficacious method for teaching the relationship between symbols and referents. Hence, this study investigated whether a procedure based on the generation effect would promote learning the association between symbols and their referents. Results show that referent generation produces the best long-term retention of this relationship. These findings provide evidence that software based on referent generation would provide special educators and SLPs with an efficacious automated procedure, requiring minimal direct supervision, to facilitate symbol/referent learning and the development of communicative fluency.

  13. An interactive modular design for computerized photometry in spectrochemical analysis

    NASA Technical Reports Server (NTRS)

    Bair, V. L.

    1980-01-01

    A general functional description of totally automatic photometry of emission spectra is not available for an operating environment in which the sample compositions and analysis procedures are low-volume and non-routine. The advantages of using an interactive approach to computer control in such an operating environment are demonstrated. This approach includes modular subroutines selected at multiple-option, menu-style decision points. This style of programming is used to trace elemental determinations, including the automated reading of spectrographic plates produced by a 3.4 m Ebert mount spectrograph using a dc-arc in an argon atmosphere. The simplified control logic and modular subroutine approach facilitates innovative research and program development, yet is easily adapted to routine tasks. Operator confidence and control are increased by the built-in options including degree of automation, amount of intermediate data printed out, amount of user prompting, and multidirectional decision points.

  14. Coupling computer-interpretable guidelines with a drug-database through a web-based system – The PRESGUID project

    PubMed Central

    Dufour, Jean-Charles; Fieschi, Dominique; Fieschi, Marius

    2004-01-01

    Background Clinical Practice Guidelines (CPGs) available today are not extensively used due to lack of proper integration into clinical settings, knowledge-related information resources, and lack of decision support at the point of care in a particular clinical context. Objective The PRESGUID project (PREScription and GUIDelines) aims to improve the assistance provided by guidelines. The project proposes an online service enabling physicians to consult computerized CPGs linked to drug databases for easier integration into the healthcare process. Methods Computable CPGs are structured as decision trees and coded in XML format. Recommendations related to drug classes are tagged with ATC codes. We use a mapping module to enhance computerized guidelines coupling with a drug database, which contains detailed information about each usable specific medication. In this way, therapeutic recommendations are backed up with current and up-to-date information from the database. Results Two authoritative CPGs, originally diffused as static textual documents, have been implemented to validate the computerization process and to illustrate the usefulness of the resulting automated CPGs and their coupling with a drug database. We discuss the advantages of this approach for practitioners and the implications for both guideline developers and drug database providers. Other CPGs will be implemented and evaluated in real conditions by clinicians working in different health institutions. PMID:15053828
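
    A minimal sketch of the coupling idea: walking an XML-coded decision tree whose recommendations carry ATC codes and resolving each code against a drug table; the XML schema and lookup data here are illustrative assumptions, not the PRESGUID format.

```python
# Sketch: traverse an XML decision tree and map an ATC-tagged recommendation
# to specific medications in a (toy) drug database.
import xml.etree.ElementTree as ET

GUIDELINE = """
<node question="LDL above target?">
  <yes><recommendation atc="C10AA">statin therapy</recommendation></yes>
  <no><recommendation atc="">lifestyle advice only</recommendation></no>
</node>
"""

DRUG_DB = {"C10AA": ["simvastatin", "atorvastatin", "rosuvastatin"]}  # toy database

def walk(node, answers):
    rec = node.find("recommendation")
    if rec is not None:
        drugs = DRUG_DB.get(rec.get("atc", ""), [])
        return rec.text, drugs
    branch = "yes" if answers[node.get("question")] else "no"
    return walk(node.find(branch), answers)

root = ET.fromstring(GUIDELINE)
print(walk(root, {"LDL above target?": True}))
```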

  15. The Computerized Laboratory Notebook concept for genetic toxicology experimentation and testing.

    PubMed

    Strauss, G H; Stanford, W L; Berkowitz, S J

    1989-03-01

    We describe a microcomputer system utilizing the Computerized Laboratory Notebook (CLN) concept developed in our laboratory for the purpose of automating the Battery of Leukocyte Tests (BLT). The BLT was designed to evaluate blood specimens for toxic, immunotoxic, and genotoxic effects after in vivo exposure to putative mutagens. A system was developed with the advantages of low cost, limited spatial requirements, ease of use for personnel inexperienced with computers, and applicability to specific testing yet flexibility for experimentation. This system eliminates cumbersome record keeping and repetitive analysis inherent in genetic toxicology bioassays. Statistical analysis of the vast quantity of data produced by the BLT would not be feasible without a central database. Our central database is maintained by an integrated package which we have adapted to develop the CLN. The clonal assay of lymphocyte mutagenesis (CALM) section of the CLN is demonstrated. PC-Slaves expand the microcomputer to multiple workstations so that our computerized notebook can be used next to a hood while other work is done in an office and instrument room simultaneously. Communication with peripheral instruments is an indispensable part of many laboratory operations, and we present a representative program, written to acquire and analyze CALM data, for communicating with both a liquid scintillation counter and an ELISA plate reader. In conclusion we discuss how our computer system could easily be adapted to the needs of other laboratories.

  16. HPLC-Assisted Automated Oligosaccharide Synthesis: Implementation of the Autosampler as a Mode of the Reagent Delivery.

    PubMed

    Pistorio, Salvatore G; Nigudkar, Swati S; Stine, Keith J; Demchenko, Alexei V

    2016-10-07

    The development of a useful methodology for simple, scalable, and transformative automation of oligosaccharide synthesis that easily interfaces with existing methods is reported. The automated synthesis can now be performed using accessible equipment where the reactants and reagents are delivered by the pump or the autosampler and the reactions can be monitored by the UV detector. The HPLC-based platform for automation is easy to setup and adapt to different systems and targets.

  17. Workload-Based Automated Interface Mode Selection

    DTIC Science & Technology

    2012-03-22

    Only table-of-contents fragments of this record are available, indicating sections on the agent state and reward functions, accelerated learning strategies, and the experimental and system engineering methodology.

  18. Clinical Chemistry Laboratory Automation in the 21st Century - Amat Victoria curam (Victory loves careful preparation)

    PubMed Central

    Armbruster, David A; Overcash, David R; Reyes, Jaime

    2014-01-01

    The era of automation arrived with the introduction of the AutoAnalyzer using continuous flow analysis and the Robot Chemist that automated the traditional manual analytical steps. Successive generations of stand-alone analysers increased analytical speed, offered the ability to test high volumes of patient specimens, and provided large assay menus. A dichotomy developed, with a group of analysers devoted to performing routine clinical chemistry tests and another group dedicated to performing immunoassays using a variety of methodologies. Development of integrated systems greatly improved the analytical phase of clinical laboratory testing and further automation was developed for pre-analytical procedures, such as sample identification, sorting, and centrifugation, and post-analytical procedures, such as specimen storage and archiving. All phases of testing were ultimately combined in total laboratory automation (TLA) through which all modules involved are physically linked by some kind of track system, moving samples through the process from beginning-to-end. A newer and very powerful, analytical methodology is liquid chromatography-mass spectrometry/mass spectrometry (LC-MS/MS). LC-MS/MS has been automated but a future automation challenge will be to incorporate LC-MS/MS into TLA configurations. Another important facet of automation is informatics, including middleware, which interfaces the analyser software to a laboratory information systems (LIS) and/or hospital information systems (HIS). This software includes control of the overall operation of a TLA configuration and combines analytical results with patient demographic information to provide additional clinically useful information. This review describes automation relevant to clinical chemistry, but it must be recognised that automation applies to other specialties in the laboratory, e.g. haematology, urinalysis, microbiology. It is a given that automation will continue to evolve in the clinical laboratory, limited only by the imagination and ingenuity of laboratory scientists. PMID:25336760

  19. Clinical Chemistry Laboratory Automation in the 21st Century - Amat Victoria curam (Victory loves careful preparation).

    PubMed

    Armbruster, David A; Overcash, David R; Reyes, Jaime

    2014-08-01

    The era of automation arrived with the introduction of the AutoAnalyzer using continuous flow analysis and the Robot Chemist that automated the traditional manual analytical steps. Successive generations of stand-alone analysers increased analytical speed, offered the ability to test high volumes of patient specimens, and provided large assay menus. A dichotomy developed, with a group of analysers devoted to performing routine clinical chemistry tests and another group dedicated to performing immunoassays using a variety of methodologies. Development of integrated systems greatly improved the analytical phase of clinical laboratory testing and further automation was developed for pre-analytical procedures, such as sample identification, sorting, and centrifugation, and post-analytical procedures, such as specimen storage and archiving. All phases of testing were ultimately combined in total laboratory automation (TLA) through which all modules involved are physically linked by some kind of track system, moving samples through the process from beginning-to-end. A newer and very powerful, analytical methodology is liquid chromatography-mass spectrometry/mass spectrometry (LC-MS/MS). LC-MS/MS has been automated but a future automation challenge will be to incorporate LC-MS/MS into TLA configurations. Another important facet of automation is informatics, including middleware, which interfaces the analyser software to a laboratory information systems (LIS) and/or hospital information systems (HIS). This software includes control of the overall operation of a TLA configuration and combines analytical results with patient demographic information to provide additional clinically useful information. This review describes automation relevant to clinical chemistry, but it must be recognised that automation applies to other specialties in the laboratory, e.g. haematology, urinalysis, microbiology. It is a given that automation will continue to evolve in the clinical laboratory, limited only by the imagination and ingenuity of laboratory scientists.

  20. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving

    PubMed Central

    Elfring, Jos; Appeldoorn, Rein; van den Dries, Sjoerd; Kwakkernaat, Maurice

    2016-01-01

    The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle’s surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture. PMID:27727171
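
    One simple fusion pattern such a world model might use is inverse-variance weighting of redundant object estimates from different sensors; the sketch below illustrates it for a radar and a camera position estimate with assumed noise figures, and is not the paper's architecture.

```python
# Sketch: inverse-variance weighted fusion of redundant sensor estimates.
import numpy as np

def fuse(estimates, variances):
    """Fuse per-sensor estimates of the same quantity, weighting by 1/variance."""
    estimates = np.asarray(estimates, dtype=float)
    weights = 1.0 / np.asarray(variances, dtype=float)
    fused = (weights[:, None] * estimates).sum(axis=0) / weights.sum()
    fused_variance = 1.0 / weights.sum()
    return fused, fused_variance

# Radar and camera observe the same vehicle (longitudinal m, lateral m);
# the variances are assumed, per-sensor noise figures.
radar = [25.0, 1.8]
camera = [24.2, 1.5]
fused, var = fuse([radar, camera], variances=[0.9, 0.4])
print("fused position:", fused.round(2), " fused variance:", round(var, 3))
```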

  1. Design Methodology for Automated Construction Machines

    DTIC Science & Technology

    1987-12-11

    A design methodology for automated construction machines is described, along with the design of a pair of machines which automate framework installation. Preliminary analysis and testing indicate that these…

  2. A Methodology for Developing Army Acquisition Strategies for an Uncertain Future

    DTIC Science & Technology

    2007-01-01

    Only fragments of this record are available: an acronym list (ABP, Assumption-Based Planning; ACEIT, Automated Cost Estimating Integrated Tool; ACR, Armored Cavalry Regiment; ACTD, …) and a passage noting that, for example, the Automated Cost Estimating Integrated Tool (ACEIT) is employed to simplify life cycle cost estimates; other tools are…

  3. Studies to determine the effectiveness of automated flagger assistance devices and school crossing devices.

    DOT National Transportation Integrated Search

    2012-01-01

    This report describes the methodology and results of analyses performed to determine motorist understanding, as well as the operational and safety effectiveness, of automated flagger assistance devices (AFADs) relative to the use of flaggers at lan...

  4. Relevance of the electronic computer to hospital medical records*

    PubMed Central

    Mitchell, J. H.

    1969-01-01

    During the past 30 years an “information explosion” has completely changed patterns of illness. Unit files of individual patients have become so large that they are increasingly difficult both to store physically and to assimilate mentally. We have reached a communications barrier which poses a major threat to the efficient practice of clinical medicine. At the same time a new kind of machine, the electronic digital computer, which was invented only 26 years ago, has already come to dominate large areas of military, scientific, commercial, and industrial activity. Its supremacy rests on its ability to perform any data procedure automatically and incredibly quickly. Computers are being employed in clinical medicine in hospitals for various purposes. They can act as arithmetic calculators, they can process and analyse output from recording devices, and they can make possible the automation of various machine systems. However, in the field of case records their role is much less well defined, for here the organization of data as a preliminary to computer input is the real stumbling-block. Data banks of retrospectively selected clinical information have been in operation in some centres for a number of years. Attempts are now being made to design computerized “total information systems” to replace conventional paper records, and the possibility of automated diagnosis is being seriously discussed. In my view, however, the medical profession is in danger of being dazzled by optimistic claims about the usefulness of computers in case record processing. The solution to the present problems of record storage and handling is very simple, and does not involve computerization. PMID:4898564

  5. Computational assessment of mammography accreditation phantom images and correlation with human observer analysis

    NASA Astrophysics Data System (ADS)

    Barufaldi, Bruno; Lau, Kristen C.; Schiabel, Homero; Maidment, D. A.

    2015-03-01

    Routine performance of basic test procedures and dose measurements are essential for assuring high quality of mammograms. International guidelines recommend that breast care providers ascertain that mammography systems produce a constant high quality image, using as low a radiation dose as is reasonably achievable. The main purpose of this research is to develop a framework to monitor radiation dose and image quality in a mixed breast screening and diagnostic imaging environment using an automated tracking system. This study presents a module of this framework, consisting of a computerized system to measure the image quality of the American College of Radiology mammography accreditation phantom. The methods developed combine correlation approaches, matched filters, and data mining techniques. These methods have been used to analyze radiological images of the accreditation phantom. The classification of structures of interest is based upon reports produced by four trained readers. As previously reported, human observers demonstrate great variation in their analysis due to the subjectivity of human visual inspection. The software tool was trained with three sets of 60 phantom images in order to generate decision trees using the software WEKA (Waikato Environment for Knowledge Analysis). When tested with 240 images during the classification step, the tool correctly classified 88%, 99%, and 98%, of fibers, speck groups and masses, respectively. The variation between the computer classification and human reading was comparable to the variation between human readers. This computerized system not only automates the quality control procedure in mammography, but also decreases the subjectivity in the expert evaluation of the phantom images.
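
    A sketch of the classification step, assuming per-object features (for example, matched-filter correlation scores) have already been extracted from the phantom images; scikit-learn's decision tree stands in for the WEKA trees used in the study, and the synthetic features are placeholders.

```python
# Sketch: decision-tree classification of phantom structures as visible / not visible.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 300
features = rng.random((n, 3))                 # hypothetical: correlation, contrast, size
labels = (features[:, 0] + 0.3 * features[:, 1] > 0.7).astype(int)  # toy ground truth

X_train, X_test, y_train, y_test = train_test_split(features, labels, random_state=0)
tree = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)
print("held-out accuracy:", round(tree.score(X_test, y_test), 3))
```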

  6. Power processing methodology. [computerized design of spacecraft electric power systems

    NASA Technical Reports Server (NTRS)

    Fegley, K. A.; Hansen, I. G.; Hayden, J. H.

    1974-01-01

    Discussion of the interim results of a program to investigate the feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems. The object of the total program is to develop a flexible engineering tool which will allow the power processor designer to effectively and rapidly assess and analyze the tradeoffs available by providing, in one comprehensive program, a mathematical model, an analysis of expected performance, simulation, and a comparative evaluation with alternative designs. This requires an understanding of electrical power source characteristics and the effects of load control, protection, and total system interaction.

  7. Prediction of ball and roller bearing thermal and kinematic performance by computer analysis

    NASA Technical Reports Server (NTRS)

    Pirvics, J.; Kleckner, R. J.

    1983-01-01

    Characteristics of good computerized analysis software are suggested. These general remarks and an overview of representative software precede a more detailed discussion of load support system analysis program structure. Particular attention is directed at a recent cylindrical roller bearing analysis as an example of the available design tools. Selected software modules are then examined to reveal the detail inherent in contemporary analysis. This leads to a brief section on current design computation which seeks to suggest when and why computerized analysis is warranted. An example concludes the argument offered for such design methodology. Finally, remarks are made concerning needs for model development to address effects which are now considered to be secondary but are anticipated to emerge to primary status in the near future.

  8. Analysis of Trinity Power Metrics for Automated Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michalenko, Ashley Christine

    This is a presentation from Los Alamos National Laboratory (LANL) about the analysis of Trinity power metrics for automated monitoring. The following topics are covered: current monitoring efforts, motivation for the analysis, tools used, the methodology, work performed during the summer, and planned future work.

  9. Computerized Coordinated Service Center: A Comparison of Service Methodologies and Costs in the Urban and Rural Area.

    ERIC Educational Resources Information Center

    Waldman, Risa J.; And Others

    Ten parallel human service agencies (five urban and five rural) were compared to identify variations in the service delivery system and to compare the costs of service provision. The agencies responded to approximately 36 questions covering eight major areas and were compared and contrasted, urban versus rural, according to the type of agency. All…

  10. SAGA: A project to automate the management of software production systems

    NASA Technical Reports Server (NTRS)

    Campbell, Roy H.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.

    1987-01-01

    The Software Automation, Generation and Administration (SAGA) project is investigating the design and construction of practical software engineering environments for developing and maintaining aerospace systems and applications software. The research includes the practical organization of the software lifecycle, configuration management, software requirements specifications, executable specifications, design methodologies, programming, verification, validation and testing, version control, maintenance, the reuse of software, software libraries, documentation, and automated management.

  11. A novel algorithm to detect glaucoma risk using texton and local configuration pattern features extracted from fundus images.

    PubMed

    Acharya, U Rajendra; Bhat, Shreya; Koh, Joel E W; Bhandary, Sulatha V; Adeli, Hojjat

    2017-09-01

    Glaucoma is an optic neuropathy defined by characteristic damage to the optic nerve and accompanying visual field deficits. Early diagnosis and treatment are critical to prevent irreversible vision loss and ultimate blindness. Current techniques for computer-aided analysis of the optic nerve and retinal nerve fiber layer (RNFL) are expensive and require keen interpretation by trained specialists. Hence, an automated system is highly desirable for a cost-effective and accurate screening for the diagnosis of glaucoma. This paper presents a new methodology and a computerized diagnostic system. Adaptive histogram equalization is used to convert color images to grayscale images followed by convolution of these images with Leung-Malik (LM), Schmid (S), and maximum response (MR4 and MR8) filter banks. The basic microstructures in typical images are called textons. The convolution process produces textons. Local configuration pattern (LCP) features are extracted from these textons. The significant features are selected using a sequential floating forward search (SFFS) method and ranked using the statistical t-test. Finally, various classifiers are used for classification of images into normal and glaucomatous classes. A high classification accuracy of 95.8% is achieved using six features obtained from the LM filter bank and the k-nearest neighbor (kNN) classifier. A glaucoma risk index (GRI) is also formulated to obtain a reliable and effective system. Copyright © 2017 Elsevier Ltd. All rights reserved.
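
    A simplified sketch of the pipeline above: adaptive histogram equalization, a small Gabor bank standing in for the LM/S/MR filter banks, summary statistics standing in for texton/LCP features, and a kNN classifier; the data and parameters are illustrative placeholders.

```python
# Simplified sketch of the fundus-image classification pipeline described above.
import numpy as np
from skimage.color import rgb2gray
from skimage.exposure import equalize_adapthist
from skimage.filters import gabor
from sklearn.neighbors import KNeighborsClassifier

def fundus_features(rgb_image):
    gray = equalize_adapthist(rgb2gray(rgb_image))   # adaptive histogram equalization
    feats = []
    for freq in (0.1, 0.3):                          # small filter bank (stand-in)
        for theta in (0.0, np.pi / 2):
            real, _ = gabor(gray, frequency=freq, theta=theta)
            feats.extend([real.mean(), real.std()])  # summary-statistic features
    return np.array(feats)

# Hypothetical small dataset: 1 = glaucomatous, 0 = normal
rng = np.random.default_rng(3)
images = rng.random((20, 64, 64, 3))
labels = rng.integers(0, 2, size=20)

X = np.stack([fundus_features(im) for im in images])
knn = KNeighborsClassifier(n_neighbors=3).fit(X, labels)
print("training accuracy:", knn.score(X, labels))
```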

  12. Cerebellar contribution to locomotor behavior: A neurodevelopmental perspective.

    PubMed

    Sathyanesan, Aaron; Gallo, Vittorio

    2018-04-30

    The developmental trajectory of the formation of cerebellar circuitry has significant implications for locomotor plasticity and adaptive learning at later stages. While there is a wealth of knowledge on the development of locomotor behavior in human infants, children, and adolescents, pre-clinical animal models have fallen behind on the study of the emergence of behavioral motifs in locomotor function across postnatal development. Since cerebellar development is protracted, it is subject to higher risk of genetic or environmental disruption, potentially leading to abnormal behavioral development. This highlights the need for more sophisticated and specific functional analyses of adaptive cerebellar behavior within the context of whole-body locomotion across the entire span of postnatal development. Here we review evidence on cerebellar contribution to adaptive locomotor behavior, highlighting methodologies employed to quantify and categorize behavior at different developmental stages, with the ultimate goal of following the course of early behavioral alterations in neurodevelopmental disorders. Since experimental paradigms used to study cerebellar behavior are lacking in both specificity and applicability to locomotor contexts, we highlight the use of the Erasmus Ladder - an advanced, computerized, fully automated system to quantify adaptive cerebellar learning in conjunction with locomotor function. Finally, we emphasize the need to develop objective, quantitative, behavioral tasks which can track changes in developmental trajectories rather than endpoint measurement at the adult stage of behavior. Copyright © 2018 Elsevier Inc. All rights reserved.

  13. Implementation Proposal of Computer-Based Office Automation for Republic of Korea Army Intelligence Corps (ROKAIC).

    DTIC Science & Technology

    1987-03-01

    Only fragments of this record are available: "…contends his soft systems methodology is such an approach. [Ref. 2: pp. 105-107]"; an overview of this methodology, which is meant for addressing fuzzy, ill…; and a note that the following could form the basis of office systems development: Checkland's (1981) soft systems methodology, Pava's (1983) sociotechnical design, and Mumford and…

  14. Soft robot design methodology for `push-button' manufacturing

    NASA Astrophysics Data System (ADS)

    Paik, Jamie

    2018-06-01

    `Push-button' or fully automated manufacturing would enable the production of robots with zero intervention from human hands. Realizing this utopia requires a fundamental shift from a sequential (design-materials-manufacturing) to a concurrent design methodology.

  15. AAC Best Practice Using Automated Language Activity Monitoring.

    ERIC Educational Resources Information Center

    Hill, Katya; Romich, Barry

    This brief paper describes automated language activity monitoring (LAM), an augmentative and alternative communication (AAC) methodology for the collection, editing, and analysis of language data in structured or natural situations with people who have severe communication disorders. The LAM function records each language event (letters, words,…

  16. A methodology for automatic intensity-modulated radiation treatment planning for lung cancer

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaodong; Li, Xiaoqiang; Quan, Enzhuo M.; Pan, Xiaoning; Li, Yupeng

    2011-07-01

    In intensity-modulated radiotherapy (IMRT), the quality of the treatment plan, which is highly dependent upon the treatment planner's level of experience, greatly affects the potential benefits of the radiotherapy (RT). Furthermore, the planning process is complicated and requires a great deal of iteration, and is often the most time-consuming aspect of the RT process. In this paper, we describe a methodology to automate the IMRT planning process in lung cancer cases, the goal being to improve the quality and consistency of treatment planning. This methodology (1) automatically sets beam angles based on a beam angle automation algorithm, (2) judiciously designs the planning structures, which were shown to be effective for all the lung cancer cases we studied, and (3) automatically adjusts the objectives of the objective function based on a parameter automation algorithm. We compared treatment plans created in this system (mdaccAutoPlan) based on the overall methodology with plans from a clinical trial of IMRT for lung cancer run at our institution. The 'autoplans' were consistently better, or no worse, than the plans produced by experienced medical dosimetrists in terms of tumor coverage and normal tissue sparing. We conclude that the mdaccAutoPlan system can potentially improve the quality and consistency of treatment planning for lung cancer.

  17. Computerized Clinical Decision Support: Contributions from 2015

    PubMed Central

    Bouaud, J.

    2016-01-01

    Summary Objective To summarize recent research and select the best papers published in 2015 in the field of computerized clinical decision support for the Decision Support section of the IMIA yearbook. Method A literature review was performed by searching two bibliographic databases for papers related to clinical decision support systems (CDSSs) and computerized provider order entry (CPOE) systems. The aim was to identify a list of candidate best papers from the retrieved papers that were then peer-reviewed by external reviewers. A consensus meeting between the two section editors and the IMIA editorial team was finally conducted to conclude in the best paper selection. Results Among the 974 retrieved papers, the entire review process resulted in the selection of four best papers. One paper reports on a CDSS routinely applied in pediatrics for more than 10 years, relying on adaptations of the Arden Syntax. Another paper assessed the acceptability and feasibility of an important CPOE evaluation tool in hospitals outside the US where it was developed. The third paper is a systematic, qualitative review, concerning usability flaws of medication-related alerting functions, providing an important evidence-based, methodological contribution in the domain of CDSS design and development in general. Lastly, the fourth paper describes a study quantifying the effect of a complex, continuous-care, guideline-based CDSS on the correctness and completeness of clinicians’ decisions. Conclusions While there are notable examples of routinely used decision support systems, this 2015 review on CDSSs and CPOE systems still shows that, despite methodological contributions, theoretical frameworks, and prototype developments, these technologies are not yet widely spread (at least with their full functionalities) in routine clinical practice. Further research, testing, evaluation, and training are still needed for these tools to be adopted in clinical practice and, ultimately, illustrate the benefits that they promise. PMID:27830247

  18. [Automated analyser of organ cultured corneal endothelial mosaic].

    PubMed

    Gain, P; Thuret, G; Chiquet, C; Gavet, Y; Turc, P H; Théillère, C; Acquart, S; Le Petit, J C; Maugery, J; Campos, L

    2002-05-01

    Until now, organ-cultured corneal endothelial mosaic has been assessed in France by cell counting using a calibrated graticule, or by drawing cells on a computerized image. The former method is unsatisfactory because it is characterized by a lack of objective evaluation of the cell surface and hexagonality and it requires an experienced technician. The latter method is time-consuming and requires careful attention. We aimed to make an efficient, fast and easy to use, automated digital analyzer of video images of the corneal endothelium. The hardware included a PC Pentium III ((R)) 800 MHz-Ram 256, a Data Translation 3155 acquisition card, a Sony SC 75 CE CCD camera, and a 22-inch screen. Special functions for automated cell boundary determination consisted of Plug-in programs included in the ImageTool software. Calibration was performed using a calibrated micrometer. Cell densities of 40 organ-cultured corneas measured by both manual and automated counting were compared using parametric tests (Student's t test for paired variables and the Pearson correlation coefficient). All steps were considered more ergonomic i.e., endothelial image capture, image selection, thresholding of multiple areas of interest, automated cell count, automated detection of errors in cell boundary drawing, presentation of the results in an HTML file including the number of counted cells, cell density, coefficient of variation of cell area, cell surface histogram and cell hexagonality. The device was efficient because the global process lasted on average 7 minutes and did not require an experienced technician. The correlation between cell densities obtained with both methods was high (r=+0.84, p<0.001). The results showed an under-estimation using manual counting (2191+/-322 vs. 2273+/-457 cell/mm(2), p=0.046), compared with the automated method. Our automated endothelial cell analyzer is efficient and gives reliable results quickly and easily. A multicentric validation would allow us to standardize cell counts among cornea banks in our country.
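
    A sketch of the automated counting step, assuming a binarized endothelial mask is already available; the pixel calibration and the toy grid-like mask are illustrative, not the analyzer's actual segmentation.

```python
# Sketch: cell density and coefficient of variation of cell area from a binary mask.
import numpy as np
from skimage.measure import label, regionprops

def endothelial_stats(cell_mask, mm2_per_pixel):
    """cell_mask: boolean image, True inside cells. Returns density and CV of area."""
    labelled = label(cell_mask)
    areas_px = np.array([r.area for r in regionprops(labelled)])
    areas_mm2 = areas_px * mm2_per_pixel
    field_area_mm2 = cell_mask.size * mm2_per_pixel
    density = len(areas_px) / field_area_mm2            # cells per mm^2
    cv_area = areas_mm2.std() / areas_mm2.mean()        # coefficient of variation
    return density, cv_area

# Toy mask: a grid of square "cells" on a 200x200 field (calibration is assumed)
mask = np.zeros((200, 200), dtype=bool)
mask[10:190, 10:190] = True
mask[::20, :] = False        # grid lines separate the cells
mask[:, ::20] = False
print(endothelial_stats(mask, mm2_per_pixel=1.0e-6))
```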

  19. Automating radiologist workflow, part 3: education and training.

    PubMed

    Reiner, Bruce

    2008-12-01

    The current model for radiologist education consists largely of mentorship during residency, followed by peer-to-peer training thereafter. The traditional focus of this radiologist education has historically been restricted to anatomy, pathology, and imaging modality. This "human" mentoring model becomes a limiting factor in the current practice environment because of rapid and dramatic changes in imaging and information technologies, along with the increased time demands placed on practicing radiologists. One novel way to address these burgeoning education and training challenges is to leverage technology, with the creation of user-specific and context-specific automated workflow templates. These automated templates would provide a low-stress, time-efficient, and easy-to-use equivalent of "computerized" mentoring. A radiologist could identify the workflow template of interest on the basis of the specific computer application, pathology, anatomy, or modality of interest. While the corresponding workflow template is activated, the radiologist "student" could effectively start and stop at areas of interest and use the functionality of an electronic wizard to identify additional educational resource of interest. An additional training feature of the technology is the ability to review "proven" cases for the purposes of establishing competence and credentialing.

  20. Nonlinear optical microscopy: use of second harmonic generation and two-photon microscopy for automated quantitative liver fibrosis studies.

    PubMed

    Sun, Wanxin; Chang, Shi; Tai, Dean C S; Tan, Nancy; Xiao, Guangfa; Tang, Huihuan; Yu, Hanry

    2008-01-01

    Liver fibrosis is associated with an abnormal increase in an extracellular matrix in chronic liver diseases. Quantitative characterization of fibrillar collagen in intact tissue is essential for both fibrosis studies and clinical applications. Commonly used methods, histological staining followed by either semiquantitative or computerized image analysis, have limited sensitivity, accuracy, and operator-dependent variations. The fibrillar collagen in sinusoids of normal livers could be observed through second-harmonic generation (SHG) microscopy. The two-photon excited fluorescence (TPEF) images, recorded simultaneously with SHG, clearly revealed the hepatocyte morphology. We have systematically optimized the parameters for the quantitative SHG/TPEF imaging of liver tissue and developed fully automated image analysis algorithms to extract the information of collagen changes and cell necrosis. Subtle changes in the distribution and amount of collagen and cell morphology are quantitatively characterized in SHG/TPEF images. By comparing to traditional staining, such as Masson's trichrome and Sirius red, SHG/TPEF is a sensitive quantitative tool for automated collagen characterization in liver tissue. Our system allows for enhanced detection and quantification of sinusoidal collagen fibers in fibrosis research and clinical diagnostics.
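
    A minimal sketch of the quantification idea: collagen area fraction from the SHG channel via Otsu thresholding; this is a simplification of the authors' fully automated algorithms, and the synthetic image is a placeholder.

```python
# Sketch: collagen area fraction from an SHG channel by global Otsu thresholding.
import numpy as np
from skimage.filters import threshold_otsu

def collagen_area_fraction(shg_channel):
    """Fraction of pixels classified as fibrillar collagen in an SHG image."""
    mask = shg_channel > threshold_otsu(shg_channel)
    return mask.mean()

# Synthetic example: dim background with a few bright, fiber-like stripes
rng = np.random.default_rng(7)
img = rng.normal(10, 2, size=(256, 256))
img[::16, :] += 50                     # stand-in for collagen fibers
print(f"collagen area fraction: {collagen_area_fraction(img):.3f}")
```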

  1. An inventory of state natural resources information systems. [including Puerto Rico and the U.S. Virgin Islands

    NASA Technical Reports Server (NTRS)

    Martinko, E. A. (Principal Investigator); Caron, L. M.; Stewart, D. S.

    1984-01-01

    Data bases and information systems developed and maintained by state agencies to support planning and management of environmental and natural resources were inventoried for all 50 states, Puerto Rico, and the U.S. Virgin Islands. The information obtained is assembled into a computerized data base catalog which is thoroughly cross-referenced. Retrieval is possible by code, state, data base name, data base acronym, agency, computer, GIS capability, language, specialized software, data category name, geographic reference, data sources, and level of reliability. The 324 automated data bases identified are described.

  2. Toward an integrated computerized patient record.

    PubMed

    Dole, T R; Luberti, A A

    2000-04-01

    Developing a comprehensive electronic medical record system to serve ambulatory care providers in a large health care enterprise requires significant time and resources. One approach to achieving this system is to devise a series of short-term, workable solutions until a complete system is designed and implemented. The initial solution introduced a basic (mini) medical record system that provided an automated problem/summary sheet and decentralization of ambulatory-based medical records. The next step was to partner with an information system vendor committed to continued development of the long-term system capable of supporting the health care organization well into the future.

  3. Review of Current Data Exchange Practices: Providing Descriptive Data to Assist with Building Operations Decisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Livingood, W.; Stein, J.; Considine, T.

    Retailers who participate in the U.S. Department of Energy Commercial Building Energy Alliances (CBEA) identified the need to enhance communication standards. The means are available to collect massive amounts of building operational data, but CBEA members have difficulty transforming the data into usable information and energy-saving actions. Implementing algorithms for automated fault detection and diagnostics and linking building operational data to computerized maintenance management systems are important steps in the right direction, but they have limited scalability for large building portfolios because the algorithms must be configured for each building.

  4. Computerized in vitro test for chemical toxicity based on tetrahymena swimming patterns

    NASA Technical Reports Server (NTRS)

    Noever, David A.; Matsos, Helen C.; Cronise, Raymond J.; Looger, Loren L.; Relwani, Rachna A.; Johnson, Jacqueline U.

    1994-01-01

    An apparatus and method for rapidly determining chemical toxicity were evaluated. The toxicity monitor includes an automated scoring of how motile biological cells (Tetrahymena pyriformis) slow down or otherwise change their swimming patterns in a hostile chemical environment. The device, called the Motility Assay Apparatus (MAA), is tested for 30-second determination of chemical toxicity in 20 aqueous samples containing trace organics and salts. With equal or better detection limits, the results compare favorably with in vivo animal tests of eye irritancy and agree, for all chemicals, with previous manual evaluations of single-cell motility.

  5. Computerized In Vitro Test for Chemical Toxicity Based on Tetrahymena Swimming Patterns

    NASA Technical Reports Server (NTRS)

    Noever, David A.; Matsos, Helen C.; Cronise, Raymond J.; Looger, Loren L.; Relwani, Rachna A.; Johnson, Jacqueline U.

    1994-01-01

    An apparatus and a method for rapidly determining chemical toxicity have been evaluated as an alternative to the rabbit eye irritancy test (Draize). The toxicity monitor includes an automated scoring of how motile biological cells (Tetrahymena pyriformis) slow down or otherwise change their swimming patterns in a hostile chemical environment. The method, called the motility assay (MA), is tested for 30 s to determine chemical toxicity in 20 aqueous samples containing trace organics and salts. With equal or better detection limits, the results compare favorably with in vivo animal tests of eye irritancy.

  6. Microcomputer keeps watch at Emerald Mine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1987-04-01

    This paper reviews the computerized mine monitoring system set up at the Emerald Mine, SW Pennsylvania, USA. This coal mine has pioneered the automation of many production and safety features, and this article covers its work in fire detection and conveyor belt monitoring. A central computer control room can safely watch over the whole underground mining operation using one 25-inch colour monitor. These new data-acquisition systems will lead the way, in the future, to safer, more efficient coal mining. The system provides multi-point monitoring of carbon monoxide, heat anomalies, and toxic gases, and supervises the procedures of conveyor belt operation from start-up to closedown.

  7. Improving designer productivity

    NASA Technical Reports Server (NTRS)

    Hill, Gary C.

    1992-01-01

    Designer and design team productivity improves with skill, experience, and the tools available. The design process involves numerous trials and errors, analyses, refinements, and addition of details. Computerized tools have greatly speeded the analysis, and now new theories and methods, emerging under the label Artificial Intelligence (AI), are being used to automate skill and experience. These tools improve designer productivity by capturing experience, emulating recognized skillful designers, and making the essence of complex programs easier to grasp. This paper outlines the aircraft design process in today's technology and business climate, presenting some of the challenges ahead and some of the promising AI methods for meeting those challenges.

  8. Hardware synthesis from DDL description. [simulating a digital system for computerized design of large scale integrated circuits

    NASA Technical Reports Server (NTRS)

    Shiva, S. G.; Shah, A. M.

    1980-01-01

    The details of digital systems can be conveniently input into the design automation system by means of hardware description language (HDL). The computer aided design and test (CADAT) system at NASA MSFC is used for the LSI design. The digital design language (DDL) was selected as HDL for the CADAT System. DDL translator output can be used for the hardware implementation of the digital design. Problems of selecting the standard cells from the CADAT standard cell library to realize the logic implied by the DDL description of the system are addressed.

  9. Physician/Computer Interaction

    PubMed Central

    Dlugacz, Yosef D.; Siegel, Carole; Fischer, Susan

    1981-01-01

    Despite the fact that the physician's involvement with computer operations has dramatically increased with automation in the health care industry, few studies have focused on the physician's experiences with and reactions to computers. This paper reports on these dimensions for physicians and their medical supervisors who have begun to use a computerized drug review system. Their attitudes and opinions towards this system, and more generally towards the use of computers in medicine, are assessed. Clinicians' attitudes towards computers are related to their clinical role and feelings about the working milieu. This report presents preliminary data from the study in terms of the frequency distribution of responses.

  10. Automated Scoring in Context: Rapid Assessment for Placed Students

    ERIC Educational Resources Information Center

    Klobucar, Andrew; Elliot, Norbert; Deess, Perry; Rudniy, Oleksandr; Joshi, Kamal

    2013-01-01

    This study investigated the use of automated essay scoring (AES) to identify at-risk students enrolled in a first-year university writing course. An application of AES, the "Criterion"[R] Online Writing Evaluation Service was evaluated through a methodology focusing on construct modelling, response processes, disaggregation, extrapolation,…

  11. Automating Formative and Summative Feedback for Individualised Assignments

    ERIC Educational Resources Information Center

    Hamilton, Ian Robert

    2009-01-01

    Purpose: The purpose of this paper is to report on the rationale behind the use of a unique paper-based individualised accounting assignment, which automated the provision to students of immediate formative and timely summative feedback. Design/methodology/approach: As students worked towards completing their assignment, the package provided…

  12. Model of Emotional Expressions in Movements

    ERIC Educational Resources Information Center

    Rozaliev, Vladimir L.; Orlova, Yulia A.

    2013-01-01

    This paper presents a new approach to automated identification of human emotions based on analysis of body movements, a recognition of gestures and poses. Methodology, models and automated system for emotion identification are considered. To characterize the person emotions in the model, body movements are described with linguistic variables and a…

  13. Automated assessment of bilateral breast volume asymmetry as a breast cancer biomarker during mammographic screening

    NASA Astrophysics Data System (ADS)

    Williams, Alex C.; Hitt, Austin; Voisin, Sophie; Tourassi, Georgia

    2013-03-01

    The biological concept of bilateral symmetry as a marker of developmental stability and good health is well established. Although most individuals deviate slightly from perfect symmetry, humans are essentially considered bilaterally symmetrical. Consequently, increased fluctuating asymmetry of paired structures could be an indicator of disease. There are several published studies linking bilateral breast size asymmetry with increased breast cancer risk. These studies were based on radiologists' manual measurements of breast size from mammographic images. We aim to develop a computerized technique to assess fluctuating breast volume asymmetry in screening mammograms and investigate whether it correlates with the presence of breast cancer. Using a large database of screening mammograms with known ground truth, we applied automated breast region segmentation and automated breast size measurements in CC and MLO views using three well-established methods. All three methods confirmed that patients with breast cancer indeed have statistically significantly higher fluctuating asymmetry of their breast volumes. However, a statistically significant difference between patients with cancer and benign lesions was observed only for the MLO views. The study suggests that automated assessment of global bilateral asymmetry could serve as a breast cancer risk biomarker for women undergoing mammographic screening. Such a biomarker could be used to alert radiologists or computer-assisted detection (CAD) systems to exercise increased vigilance if higher than normal cancer risk is suspected.
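
    As a concrete illustration of the asymmetry measure, the short Python sketch below computes a common fluctuating-asymmetry index, |L - R| / mean(L, R), from per-breast volume estimates. The volumes and the index form are assumptions for the example; the paper's three volume-measurement methods are not reproduced.

      import numpy as np

      def fluctuating_asymmetry(left_volumes, right_volumes):
          """Unsigned relative breast-volume asymmetry per subject."""
          left = np.asarray(left_volumes, dtype=float)
          right = np.asarray(right_volumes, dtype=float)
          return np.abs(left - right) / ((left + right) / 2.0)

      # Hypothetical per-subject volumes (cm^3) from segmented MLO views
      fa = fluctuating_asymmetry([610.0, 545.0, 700.0], [598.0, 560.0, 640.0])
      print(np.round(fa, 3))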

  14. [The laboratory of tomorrow. Particular reference to hematology].

    PubMed

    Cazal, P

    1985-01-01

    A serious prediction can only be an extrapolation of recent developments. To be exact, the development has to continue in the same direction, which is only a probability. Probable development of hematological technology: Progress in methods. Development of new labelling methods: radio-elements, antibodies. Monoclonal antibodies. Progress in equipment: Cell counters and their adaptation to routine hemograms is a certainty. From analyzers: a promise that will perhaps become reality. Coagulometers: progress still to be made. Hemagglutination detectors and their application to grouping: good achievements, but the market is too limited. Computerization and automation: What form will the computerization take? What will the computer do? Who will the computer control? What should the automatic analyzers be? Two current levels. Relationships between the automatic analyzers and the computer: rapidity, fidelity and, above all, reliability. Memory: large capacity and easy access. Disadvantages: conservatism and technical dependency. How can they be avoided? Development of the environment: Laboratory input: outside supplies, electricity, reagents, consumables. Samples and their identification. Output: distribution of results and communication problems. Centralization or decentralization? What will tomorrow's laboratory be? Three hypotheses: optimistic, pessimistic, and balanced.

  15. Automated detection of pulmonary nodules in CT images with support vector machines

    NASA Astrophysics Data System (ADS)

    Liu, Lu; Liu, Wanyu; Sun, Xiaoming

    2008-10-01

    Many methods have been proposed to help radiologists avoid missing small pulmonary nodules. Recently, support vector machines (SVMs) have received increasing attention for pattern recognition. In this paper, we present a computerized system aimed at pulmonary nodule detection; it identifies the lung field, extracts a set of candidate regions with a high sensitivity ratio and then classifies candidates by the use of SVMs. The Computer Aided Diagnosis (CAD) system presented in this paper supports the diagnosis of pulmonary nodules from Computed Tomography (CT) images as inflammation, tuberculoma, granuloma, sclerosing hemangioma, and malignant tumor. Five texture feature sets were extracted for each lesion, while a genetic algorithm based feature selection method was applied to identify the most robust features. The selected feature set was fed into an ensemble of SVM classifiers. The achieved classification performance was 100%, 92.75% and 90.23% in the training, validation and testing set, respectively. It is concluded that computerized analysis of medical images in combination with artificial intelligence can be used in clinical practice and may contribute to more efficient diagnosis.
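
    The pipeline described above (texture features, feature selection, SVM classification) can be sketched with scikit-learn as below. The random feature matrix, the univariate filter standing in for the paper's genetic-algorithm selection, and the RBF kernel settings are all illustrative assumptions, not the published configuration.

      import numpy as np
      from sklearn.feature_selection import SelectKBest, f_classif
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      # Stand-in data: one texture feature vector per candidate region,
      # with a binary label (1 = nodule, 0 = non-nodule).
      rng = np.random.default_rng(1)
      X = rng.normal(size=(200, 40))
      y = rng.integers(0, 2, size=200)

      clf = make_pipeline(
          StandardScaler(),
          SelectKBest(f_classif, k=10),   # simple filter in place of the GA step
          SVC(kernel="rbf", C=1.0, gamma="scale"),
      )
      clf.fit(X, y)
      print("training accuracy:", clf.score(X, y))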

  16. Cost-effective and business-beneficial computer validation for bioanalytical laboratories.

    PubMed

    McDowall, Rd

    2011-07-01

    Computerized system validation is often viewed as a burden and a waste of time to meet regulatory requirements. This article presents a different approach by looking at validation in a bioanalytical laboratory from the perspective of the business benefits that computer validation can bring. Ask yourself the question: have you ever bought a computerized system that did not meet your initial expectations? This article will look at understanding the process to be automated, the paper to be eliminated and the records to be signed to meet the requirements of the GLP or GCP and Part 11 regulations. This paper will only consider commercial nonconfigurable and configurable software such as plate readers and LC-MS/MS data systems rather than LIMS or custom applications. Two streamlined life cycle models are presented. The first one consists of a single document for validation of nonconfigurable software. The second is for configurable software and is a five-stage model that avoids the need to write functional and design specifications. Both models are aimed at managing the risk each type of software poses whilst reducing the amount of documented evidence required for validation.

  17. A Method for Evaluating the Safety Impacts of Air Traffic Automation

    NASA Technical Reports Server (NTRS)

    Kostiuk, Peter; Shapiro, Gerald; Hanson, Dave; Kolitz, Stephan; Leong, Frank; Rosch, Gene; Bonesteel, Charles

    1998-01-01

    This report describes a methodology for analyzing the safety and operational impacts of emerging air traffic technologies. The approach integrates traditional reliability models of the system infrastructure with models that analyze the environment within which the system operates, and models of how the system responds to different scenarios. Products of the analysis include safety measures such as predicted incident rates, predicted accident statistics, and false alarm rates; and operational availability data. The report demonstrates the methodology with an analysis of the operation of the Center-TRACON Automation System at Dallas-Fort Worth International Airport.

  18. Automated Algorithm for J-Tpeak and Tpeak-Tend Assessment of Drug-Induced Proarrhythmia Risk

    DOE PAGES

    Johannesen, Lars; Vicente, Jose; Hosseini, Meisam; ...

    2016-12-30

    Prolongation of the heart rate corrected QT (QTc) interval is a sensitive marker of torsade de pointes risk; however it is not specific as QTc prolonging drugs that block inward currents are often not associated with torsade. Recent work demonstrated that separate analysis of the heart rate corrected J-T peakc (J-T peakc) and T peak-T end intervals can identify QTc prolonging drugs with inward current block and is being proposed as a part of a new cardiac safety paradigm for new drugs (the “CiPA” initiative). In this work, we describe an automated measurement methodology for assessment of the J-T peakc and T peak-T end intervals using the vector magnitude lead. The automated measurement methodology was developed using data from one clinical trial and was evaluated using independent data from a second clinical trial. Comparison between the automated and the prior semi-automated measurements shows that the automated algorithm reproduces the semi-automated measurements with a mean difference of single-deltas <1 ms and no difference in intra-time point variability (p for all > 0.39). In addition, the time-profile of the baseline and placebo-adjusted changes are within 1 ms for 63% of the time-points (86% within 2 ms). Importantly, the automated results lead to the same conclusions about the electrophysiological mechanisms of the studied drugs. We have developed an automated algorithm for assessment of J-T peakc and T peak-T end intervals that can be applied in clinical drug trials. Under the CiPA initiative this ECG assessment would determine if there are unexpected ion channel effects in humans compared to preclinical studies. In conclusion, the algorithm is being released as open-source software.
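
    The core geometric step, building the vector magnitude lead and locating the T-wave peak after the J point, can be pictured with the NumPy sketch below. This is a simplified illustration, not the released open-source algorithm: filtering, beat averaging, rate correction, and automatic J/T-end delineation are omitted, and the fiducial indices are assumed to come from an upstream delineator.

      import numpy as np

      def vector_magnitude(x, y, z):
          """Vector magnitude lead from orthogonal X, Y, Z ECG leads."""
          x, y, z = (np.asarray(v, dtype=float) for v in (x, y, z))
          return np.sqrt(x**2 + y**2 + z**2)

      def j_tpeak_ms(vm, j_index, t_end_index, fs):
          """J-Tpeak in ms, with T-peak taken as the VM maximum in (J, T-end)."""
          t_peak_index = j_index + int(np.argmax(vm[j_index:t_end_index]))
          return 1000.0 * (t_peak_index - j_index) / fs

      # Toy beat at 1000 Hz with a crude Gaussian T wave
      fs = 1000
      t = np.arange(0, 0.6, 1.0 / fs)
      vm = vector_magnitude(0.05 * np.ones_like(t), 0.05 * np.ones_like(t),
                            np.exp(-((t - 0.30) ** 2) / 0.002))
      print("J-Tpeak (ms):", j_tpeak_ms(vm, j_index=50, t_end_index=550, fs=fs))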

  19. Automated Algorithm for J-Tpeak and Tpeak-Tend Assessment of Drug-Induced Proarrhythmia Risk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johannesen, Lars; Vicente, Jose; Hosseini, Meisam

    Prolongation of the heart rate corrected QT (QTc) interval is a sensitive marker of torsade de pointes risk; however it is not specific as QTc prolonging drugs that block inward currents are often not associated with torsade. Recent work demonstrated that separate analysis of the heart rate corrected J-T peakc (J-T peakc) and T peak-T end intervals can identify QTc prolonging drugs with inward current block and is being proposed as a part of a new cardiac safety paradigm for new drugs (the “CiPA” initiative). In this work, we describe an automated measurement methodology for assessment of the J-T peakc and T peak-T end intervals using the vector magnitude lead. The automated measurement methodology was developed using data from one clinical trial and was evaluated using independent data from a second clinical trial. Comparison between the automated and the prior semi-automated measurements shows that the automated algorithm reproduces the semi-automated measurements with a mean difference of single-deltas <1 ms and no difference in intra-time point variability (p for all > 0.39). In addition, the time-profile of the baseline and placebo-adjusted changes are within 1 ms for 63% of the time-points (86% within 2 ms). Importantly, the automated results lead to the same conclusions about the electrophysiological mechanisms of the studied drugs. We have developed an automated algorithm for assessment of J-T peakc and T peak-T end intervals that can be applied in clinical drug trials. Under the CiPA initiative this ECG assessment would determine if there are unexpected ion channel effects in humans compared to preclinical studies. In conclusion, the algorithm is being released as open-source software.

  20. Combining deep learning and level set for the automated segmentation of the left ventricle of the heart from cardiac cine magnetic resonance.

    PubMed

    Ngo, Tuan Anh; Lu, Zhi; Carneiro, Gustavo

    2017-01-01

    We introduce a new methodology that combines deep learning and level set for the automated segmentation of the left ventricle of the heart from cardiac cine magnetic resonance (MR) data. This combination is relevant for segmentation problems, where the visual object of interest presents large shape and appearance variations, but the annotated training set is small, which is the case for various medical image analysis applications, including the one considered in this paper. In particular, level set methods are based on shape and appearance terms that use small training sets, but present limitations for modelling the visual object variations. Deep learning methods can model such variations using relatively small amounts of annotated training, but they often need to be regularised to produce good generalisation. Therefore, the combination of these methods brings together the advantages of both approaches, producing a methodology that needs small training sets and produces accurate segmentation results. We test our methodology on the MICCAI 2009 left ventricle segmentation challenge database (containing 15 sequences for training, 15 for validation and 15 for testing), where our approach achieves the most accurate results in the semi-automated problem and state-of-the-art results for the fully automated challenge. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.
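
    One way to picture the combination is to let a (hypothetical) network output seed a level-set refinement, as in the rough Python sketch below. It assumes scikit-image's morphological Chan-Vese implementation and uses synthetic stand-ins for the MR slice and the CNN probability map; it is not the authors' formulation.

      import numpy as np
      from skimage.segmentation import morphological_chan_vese

      def refine_with_level_set(mr_slice, cnn_probability, n_iter=50):
          """Seed a level set from a probability map and let it evolve."""
          init = cnn_probability > 0.5          # region proposed by the deep model
          return morphological_chan_vese(mr_slice, n_iter,
                                         init_level_set=init, smoothing=2)

      # Synthetic short-axis slice: a bright disk (the "LV") plus noise
      rng = np.random.default_rng(2)
      yy, xx = np.mgrid[0:128, 0:128]
      disk = ((yy - 64) ** 2 + (xx - 64) ** 2) < 20 ** 2
      image = disk.astype(float) + 0.2 * rng.normal(size=(128, 128))
      prob = np.clip(disk + 0.1 * rng.normal(size=(128, 128)), 0.0, 1.0)
      mask = refine_with_level_set(image, prob)
      print("segmented pixels:", int(mask.sum()))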

  1. [Methodological problems in the use of information technologies in physical education].

    PubMed

    Martirosov, E G; Zaĭtseva, G A

    2000-01-01

    The paper considers methodological problems in the use of computer technologies in physical education by applying diagnostic and consulting systems, educational and educational-and-training process automation systems, and control and self-control programmes for athletes and others.

  2. Roadway safety analysis methodology for Utah : final report.

    DOT National Transportation Integrated Search

    2016-12-01

    This research focuses on the creation of a three-part Roadway Safety Analysis methodology that applies and automates the cumulative work of recently-completed roadway safety research. The first part is to prepare the roadway and crash data for analys...

  3. A Combined Hazard Index Fire Test Methodology for Aircraft Cabin Materials. Volume II.

    DTIC Science & Technology

    1982-04-01

    Technical Center. The report was divided into two parts: Part I described the improved technology investigated to upgrade existing methods for testing...proper implementation of the computerized data acquisition and reduction programs will improve materials hazards measurement precision. Thus, other...the hold chamber before and after injection of a sample, will improve precision and repeatability of measurement. The listed data acquisition and

  4. Computerized system for assessing heart rate variability.

    PubMed

    Frigy, A; Incze, A; Brânzaniuc, E; Cotoi, S

    1996-01-01

    The principal theoretical, methodological and clinical aspects of heart rate variability (HRV) analysis are reviewed. This method has been developed over the last 10 years as a useful noninvasive method of measuring the activity of the autonomic nervous system. The main components and the functioning of the computerized rhythm-analyzer system developed by our team are presented. The system is able to perform short-term (maximum 20 minutes) time domain HRV analysis and statistical analysis of the ventricular rate in any rhythm, particularly in atrial fibrillation. The performance of our system is demonstrated using the graphics (RR histograms, delta RR histograms, RR scattergrams) and the statistical parameters resulting from the processing of three ECG recordings. These recordings are obtained from a normal subject, from a patient with advanced heart failure, and from a patient with atrial fibrillation.
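
    The time-domain statistics such a system reports can be illustrated with a few lines of NumPy. This is a generic short-term HRV sketch (SDNN, RMSSD, pNN50) on simulated RR intervals; the cited system's graphics and atrial-fibrillation handling are not reproduced.

      import numpy as np

      def time_domain_hrv(rr_ms):
          """Basic time-domain HRV indices from RR intervals in milliseconds."""
          rr = np.asarray(rr_ms, dtype=float)
          diff = np.diff(rr)
          return {
              "mean_rr_ms": rr.mean(),
              "sdnn_ms": rr.std(ddof=1),
              "rmssd_ms": np.sqrt(np.mean(diff ** 2)),
              "pnn50_pct": 100.0 * np.mean(np.abs(diff) > 50.0),
          }

      # Simulated sinus rhythm around 75 bpm (mean RR of 800 ms)
      rng = np.random.default_rng(3)
      rr = 800.0 + 30.0 * rng.standard_normal(300)
      print({k: round(v, 1) for k, v in time_domain_hrv(rr).items()})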

  5. Charting the language of leadership: a methodological investigation of President Bush and the crisis of 9/11.

    PubMed

    Bligh, Michelle C; Kohles, Jeffrey C; Meindl, James R

    2004-06-01

    In many ways, leadership is a phenomenon that is ideally suited for new and inventive research methods. For researchers who seek to reliably study and systematically compare linguistically based elements of the leadership relationship, computerized content analysis has the potential to supplement, extend, and qualify existing leadership theory and practice. Through an examination of President Bush's rhetoric and the media coverage before and after the crisis of 9/11, the authors explore how elements of the President's speeches changed in response to the post-crisis environment. Using this example, the authors illustrate the process of computerized content analysis and many of its strengths and limitations, with the hope of facilitating future leadership research that uses this approach to explore important contextual and symbolic elements of the leadership relationship. (c) 2004 APA

  6. Advanced Airframe Structural Materials: A Primer and Cost Estimating Methodology

    DTIC Science & Technology

    1991-01-01

    laying machines for larger, mildly contoured parts such as wing and stabilizer skins. For such parts, automated tape laying machines can operate many...heat guns (90-130°F). However, thermoplastics require as much as 650°F for forming. Automated tape laying machines for these materials use warm...cycles to properly seat the plies onto the tool. This time-consuming process can sometimes be eliminated or reduced by the use of automated tape laying procedures

  7. Total synthesis of TMG-chitotriomycin based on an automated electrochemical assembly of a disaccharide building block.

    PubMed

    Isoda, Yuta; Sasaki, Norihiko; Kitamura, Kei; Takahashi, Shuji; Manmode, Sujit; Takeda-Okuda, Naoko; Tamura, Jun-Ichi; Nokami, Toshiki; Itoh, Toshiyuki

    2017-01-01

    The total synthesis of TMG-chitotriomycin using an automated electrochemical synthesizer for the assembly of carbohydrate building blocks is demonstrated. We have successfully prepared a precursor of TMG-chitotriomycin, which is a structurally-pure tetrasaccharide with typical protecting groups, through the methodology of automated electrochemical solution-phase synthesis developed by us. The synthesis of structurally well-defined TMG-chitotriomycin has been accomplished in 10 steps from a disaccharide building block.

  8. Operations management system advanced automation: Fault detection isolation and recovery prototyping

    NASA Technical Reports Server (NTRS)

    Hanson, Matt

    1990-01-01

    The purpose of this project is to address the global fault detection, isolation and recovery (FDIR) requirements for Operations Management System (OMS) automation within the Space Station Freedom program. This shall be accomplished by developing a selected FDIR prototype for the Space Station Freedom distributed processing systems. The prototype shall be based on advanced automation methodologies in addition to traditional software methods to meet the requirements for automation. A secondary objective is to expand the scope of the prototyping to encompass multiple aspects of station-wide fault management (SWFM) as discussed in OMS requirements documentation.

  9. Demonstration of a Safety Analysis on a Complex System

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey

    1997-01-01

    For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards is done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: There exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.

  10. Interdisciplinary development of manual and automated product usability assessments for older adults with dementia: lessons learned.

    PubMed

    Boger, Jennifer; Taati, Babak; Mihailidis, Alex

    2016-10-01

    The changes in cognitive abilities that accompany dementia can make it difficult to use everyday products that are required to complete activities of daily living. Products that are inherently more usable for people with dementia could facilitate independent activity completion, thus reducing the need for caregiver assistance. The objectives of this research were to: (1) gain an understanding of how water tap design impacted tap usability and (2) create an automated computerized tool that could assess tap usability. 27 older adults, who ranged from cognitively intact to advanced dementia, completed 1309 trials on five tap designs. Data were manually analyzed to investigate tap usability as well as used to develop an automated usability analysis tool. Researchers collaborated to modify existing techniques and to create novel ones to accomplish both goals. This paper presents lessons learned through the course of this research, which could be applicable in the development of other usability studies, automated vision-based assessments and the development of assistive technologies for cognitively impaired older adults. Collaborative interdisciplinary teamwork, which included older adult participants with dementia, was key to enabling innovative advances that achieved the project's research goals. Implications for rehabilitation: Products that are implicitly familiar and usable by older adults could foster independent activity completion, potentially reducing reliance on a caregiver. The computer-based automated tool can significantly reduce the time and effort required to perform product usability analysis, making this type of analysis more feasible. Interdisciplinary collaboration can result in a more holistic understanding of assistive technology research challenges and enable innovative solutions.

  11. Automated identification of sleep states from EEG signals by means of ensemble empirical mode decomposition and random under sampling boosting.

    PubMed

    Hassan, Ahnaf Rashik; Bhuiyan, Mohammed Imamul Hassan

    2017-03-01

    Automatic sleep staging is essential for alleviating the burden of the physicians of analyzing a large volume of data by visual inspection. It is also a precondition for making an automated sleep monitoring system feasible. Further, computerized sleep scoring will expedite large-scale data analysis in sleep research. Nevertheless, most of the existing works on sleep staging are either multichannel or multiple physiological signal based which are uncomfortable for the user and hinder the feasibility of an in-home sleep monitoring device. So, a successful and reliable computer-assisted sleep staging scheme is yet to emerge. In this work, we propose a single channel EEG based algorithm for computerized sleep scoring. In the proposed algorithm, we decompose EEG signal segments using Ensemble Empirical Mode Decomposition (EEMD) and extract various statistical moment based features. The effectiveness of EEMD and statistical features are investigated. Statistical analysis is performed for feature selection. A newly proposed classification technique, namely - Random under sampling boosting (RUSBoost) is introduced for sleep stage classification. This is the first implementation of EEMD in conjunction with RUSBoost to the best of the authors' knowledge. The proposed feature extraction scheme's performance is investigated for various choices of classification models. The algorithmic performance of our scheme is evaluated against contemporary works in the literature. The performance of the proposed method is comparable or better than that of the state-of-the-art ones. The proposed algorithm gives 88.07%, 83.49%, 92.66%, 94.23%, and 98.15% for 6-state to 2-state classification of sleep stages on Sleep-EDF database. Our experimental outcomes reveal that RUSBoost outperforms other classification models for the feature extraction framework presented in this work. Besides, the algorithm proposed in this work demonstrates high detection accuracy for the sleep states S1 and REM. Statistical moment based features in the EEMD domain distinguish the sleep states successfully and efficaciously. The automated sleep scoring scheme propounded herein can eradicate the onus of the clinicians, contribute to the device implementation of a sleep monitoring system, and benefit sleep research. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
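
    A compressed sketch of the feature/classifier combination is given below. It assumes the PyEMD package for EEMD and imbalanced-learn for RUSBoost, uses random signals in place of scored EEG epochs, and keeps only a handful of moment features, so it illustrates the shape of the pipeline rather than the paper's configuration.

      import numpy as np
      from scipy.stats import kurtosis, skew
      from PyEMD import EEMD                              # assumes the PyEMD package
      from imblearn.ensemble import RUSBoostClassifier    # assumes imbalanced-learn

      def eemd_moment_features(epoch, max_imfs=4, trials=10):
          """Mean, variance, skewness and kurtosis of the first few IMFs."""
          imfs = EEMD(trials=trials).eemd(np.asarray(epoch, dtype=float))[:max_imfs]
          feats = []
          for imf in imfs:
              feats += [imf.mean(), imf.var(), skew(imf), kurtosis(imf)]
          return np.array(feats)

      # Tiny synthetic example: 10 "epochs" with binary labels (e.g. wake vs. sleep)
      rng = np.random.default_rng(4)
      epochs = rng.standard_normal((10, 1500))
      labels = np.array([0, 1] * 5)
      X = np.vstack([eemd_moment_features(e) for e in epochs])
      clf = RUSBoostClassifier(random_state=0).fit(X, labels)
      print("training accuracy:", clf.score(X, labels))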

  12. Cost-Effectiveness Analysis of the Automation of a Circulation System.

    ERIC Educational Resources Information Center

    Mosley, Isobel

    A general methodology for cost effectiveness analysis was developed and applied to the Colorado State University library loan desk. The cost effectiveness of the existing semi-automated circulation system was compared with that of a fully manual one, based on the existing manual subsystem. Faculty users' time and computer operating costs were…

  13. Automated Serials Control at the Indian Institutes of Technology: An Overview

    ERIC Educational Resources Information Center

    Ghosh, Tapas Kumar; Panda, K. C.

    2011-01-01

    Purpose: The purpose of this paper is to highlight the functional attributes of the automated serials control systems of the libraries in seven Indian Institutes of Technology (IITs) and provide a comparative analysis. Design/methodology/approach: Features of the serials control modules of the library management systems (LMSs) in use in the…

  14. Toward Automated Inventory Modeling in Life Cycle Assessment: The Utility of Semantic Data Modeling to Predict Real-World Chemical Production

    EPA Science Inventory

    A set of coupled semantic data models, i.e., ontologies, are presented to advance a methodology towards automated inventory modeling of chemical manufacturing in life cycle assessment. The cradle-to-gate life cycle inventory for chemical manufacturing is a detailed collection of ...

  15. A Binary Programming Approach to Automated Test Assembly for Cognitive Diagnosis Models

    ERIC Educational Resources Information Center

    Finkelman, Matthew D.; Kim, Wonsuk; Roussos, Louis; Verschoor, Angela

    2010-01-01

    Automated test assembly (ATA) has been an area of prolific psychometric research. Although ATA methodology is well developed for unidimensional models, its application alongside cognitive diagnosis models (CDMs) is a burgeoning topic. Two suggested procedures for combining ATA and CDMs are to maximize the cognitive diagnostic index and to use a…

  16. Technical Processing Librarians in the 1980's: Current Trends and Future Forecasts.

    ERIC Educational Resources Information Center

    Kennedy, Gail

    1980-01-01

    This review of recent and anticipated advances in library automation technology and methodology includes a review of the effects of OCLC, MARC formatting, AACR2, and increasing costs, as well as predictions of the impact on library technical processing of networking, expansion of automation, minicomputers, specialized reference services, and…

  17. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  18. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  19. Performance evaluation of contrast-detail in full field digital mammography systems using ideal (Hotelling) observer vs. conventional automated analysis of CDMAM images for quality control of contrast-detail characteristics.

    PubMed

    Delakis, Ioannis; Wise, Robert; Morris, Lauren; Kulama, Eugenia

    2015-11-01

    The purpose of this work was to evaluate the contrast-detail performance of full field digital mammography (FFDM) systems using ideal (Hotelling) observer Signal-to-Noise Ratio (SNR) methodology and to ascertain whether it can be considered an alternative to the conventional, automated analysis of CDMAM phantom images. Five FFDM units currently used in the national breast screening programme were evaluated, which differed with respect to age, detector, Automatic Exposure Control (AEC) and target/filter combination. Contrast-detail performance was analysed using CDMAM and ideal observer SNR methodology. The ideal observer SNR was calculated for input signal originating from gold discs of varying thicknesses and diameters, and then used to estimate the threshold gold thickness for each diameter as per CDMAM analysis. The variability of both methods and the dependence of CDMAM analysis on phantom manufacturing discrepancies were also investigated. Results from both CDMAM and ideal observer methodologies were informative differentiators of FFDM systems' contrast-detail performance, displaying comparable patterns with respect to the FFDM systems' type and age. CDMAM results suggested higher threshold gold thickness values compared with the ideal observer methodology, especially for small-diameter details, which can be attributed to the behaviour of the CDMAM phantom used in this study. In addition, ideal observer methodology results showed lower variability than CDMAM results. The ideal observer SNR methodology can provide a useful metric of FFDM systems' contrast-detail characteristics and could be considered a surrogate for conventional, automated analysis of CDMAM images. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
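
    For readers unfamiliar with the metric, the ideal (Hotelling) observer SNR for a known signal in correlated noise is SNR^2 = ds^T K^-1 ds, with ds the mean signal-present minus signal-absent image and K the class-averaged covariance. The NumPy sketch below computes this on toy regions of interest; the disc size, noise model and ROI counts are assumptions, not the study's simulation.

      import numpy as np

      def hotelling_snr(signal_present, signal_absent):
          """Hotelling observer SNR from stacks of signal-present/absent ROIs."""
          sp = np.asarray(signal_present, dtype=float).reshape(len(signal_present), -1)
          sa = np.asarray(signal_absent, dtype=float).reshape(len(signal_absent), -1)
          ds = sp.mean(axis=0) - sa.mean(axis=0)
          K = 0.5 * (np.cov(sp, rowvar=False) + np.cov(sa, rowvar=False))
          return float(np.sqrt(ds @ np.linalg.solve(K, ds)))

      # Toy 8x8 ROIs: white noise, with a faint central "gold disc" added
      rng = np.random.default_rng(5)
      absent = rng.normal(100.0, 5.0, size=(400, 8, 8))
      present = rng.normal(100.0, 5.0, size=(400, 8, 8))
      present[:, 3:5, 3:5] += 2.0
      print("Hotelling SNR:", round(hotelling_snr(present, absent), 2))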

  20. An automated behavioral measure of mind wandering during computerized reading.

    PubMed

    Faber, Myrthe; Bixler, Robert; D'Mello, Sidney K

    2018-02-01

    Mind wandering is a ubiquitous phenomenon in which attention shifts from task-related to task-unrelated thoughts. The last decade has witnessed an explosion of interest in mind wandering, but research has been stymied by a lack of objective measures, leading to a near-exclusive reliance on self-reports. We addressed this issue by developing an eye-gaze-based, machine-learned model of mind wandering during computerized reading. Data were collected in a study in which 132 participants reported self-caught mind wandering while reading excerpts from a book on a computer screen. A remote Tobii TX300 or T60 eyetracker recorded their gaze during reading. The data were used to train supervised classification models to discriminate between mind wandering and normal reading in a manner that would generalize to new participants. We found that at the point of maximal agreement between the model-based and self-reported mind-wandering means (smallest difference between the group-level means: M_model = .310, M_self = .319), the participant-level mind-wandering proportional distributions were similar and were significantly correlated (r = .400). The model-based estimates were internally consistent (r = .751) and predicted text comprehension more strongly than did self-reported mind wandering (r_model = -.374, r_self = -.208). Our results also indicate that a robust strategy of probabilistically predicting mind wandering in cases with poor or missing gaze data led to improved performance on all metrics, as compared to simply discarding these data. Our findings demonstrate that an automated objective measure might be available for laboratory studies of mind wandering during reading, providing an appealing alternative or complement to self-reports.
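
    The "generalize to new participants" requirement above is usually enforced with participant-grouped cross-validation, as in this hedged scikit-learn sketch. The gaze features, the logistic-regression classifier and the random data are placeholders; the study's actual feature set and models are not reproduced.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import GroupKFold, cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      # Stand-in data: one row per reading window (fixation durations, saccade
      # lengths, ...), a binary mind-wandering label, and a participant id.
      rng = np.random.default_rng(6)
      X = rng.normal(size=(600, 12))
      y = rng.integers(0, 2, size=600)
      participants = rng.integers(0, 30, size=600)

      # GroupKFold keeps each participant's data out of the folds used to train
      # the model that is evaluated on them.
      model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
      scores = cross_val_score(model, X, y, groups=participants,
                               cv=GroupKFold(n_splits=5))
      print("participant-independent accuracy: %.3f +/- %.3f"
            % (scores.mean(), scores.std()))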

  1. Computerized Automated Reminder Diabetes System (CARDS): E-Mail and SMS Cell Phone Text Messaging Reminders to Support Diabetes Management

    PubMed Central

    Hanauer, David A.; Wentzell, Katherine; Laffel, Nikki

    2009-01-01

    Background: Cell phone text messaging, via the Short Messaging Service (SMS), offers the promise of a highly portable, well-accepted, and inexpensive modality for engaging youth and young adults in the management of their diabetes. This pilot and feasibility study compared two-way SMS cell phone messaging with e-mail reminders that were directed at encouraging blood glucose (BG) monitoring. Methods: Forty insulin-treated adolescents and young adults with diabetes were randomized to receive electronic reminders to check their BG levels via cell phone text messaging or e-mail reminders for a 3-month pilot study. Electronic messages were automatically generated, and participant replies with BG results were processed by the locally developed Computerized Automated Reminder Diabetes System (CARDS). Participants set their schedule for reminders on the secure CARDS website where they could also enter and review BG data. Results: Of the 40 participants, 22 were randomized to receive cell phone text message reminders and 18 to receive e-mail reminders; 18 in the cell phone group and 11 in the e-mail group used the system. Compared to the e-mail group, users in the cell phone group received more reminders (180.4 vs. 106.6 per user) and responded with BG results significantly more often (30.0 vs. 6.9 per user, P=0.04). During the first month cell phone users submitted twice as many BGs as e-mail users (27.2 vs. 13.8 per user); by month 3, usage waned. Conclusions: Cell phone text messaging to promote BG monitoring is a viable and acceptable option in adolescents and young adults with diabetes. However, maintaining interest levels for prolonged intervals remains a challenge. PMID:19848576

  2. Computerized Automated Reminder Diabetes System (CARDS): e-mail and SMS cell phone text messaging reminders to support diabetes management.

    PubMed

    Hanauer, David A; Wentzell, Katherine; Laffel, Nikki; Laffel, Lori M

    2009-02-01

    Cell phone text messaging, via the Short Messaging Service (SMS), offers the promise of a highly portable, well-accepted, and inexpensive modality for engaging youth and young adults in the management of their diabetes. This pilot and feasibility study compared two-way SMS cell phone messaging with e-mail reminders that were directed at encouraging blood glucose (BG) monitoring. Forty insulin-treated adolescents and young adults with diabetes were randomized to receive electronic reminders to check their BG levels via cell phone text messaging or e-mail reminders for a 3-month pilot study. Electronic messages were automatically generated, and participant replies with BG results were processed by the locally developed Computerized Automated Reminder Diabetes System (CARDS). Participants set their schedule for reminders on the secure CARDS website where they could also enter and review BG data. Of the 40 participants, 22 were randomized to receive cell phone text message reminders and 18 to receive e-mail reminders; 18 in the cell phone group and 11 in the e-mail group used the system. Compared to the e-mail group, users in the cell phone group received more reminders (180.4 vs. 106.6 per user) and responded with BG results significantly more often (30.0 vs. 6.9 per user, P = 0.04). During the first month cell phone users submitted twice as many BGs as e-mail users (27.2 vs. 13.8 per user); by month 3, usage waned. Cell phone text messaging to promote BG monitoring is a viable and acceptable option in adolescents and young adults with diabetes. However, maintaining interest levels for prolonged intervals remains a challenge.

  3. Trends in P Value, Confidence Interval, and Power Analysis Reporting in Health Professions Education Research Reports: A Systematic Appraisal.

    PubMed

    Abbott, Eduardo F; Serrano, Valentina P; Rethlefsen, Melissa L; Pandian, T K; Naik, Nimesh D; West, Colin P; Pankratz, V Shane; Cook, David A

    2018-02-01

    To characterize reporting of P values, confidence intervals (CIs), and statistical power in health professions education research (HPER) through manual and computerized analysis of published research reports. The authors searched PubMed, Embase, and CINAHL in May 2016, for comparative research studies. For manual analysis of abstracts and main texts, they randomly sampled 250 HPER reports published in 1985, 1995, 2005, and 2015, and 100 biomedical research reports published in 1985 and 2015. Automated computerized analysis of abstracts included all HPER reports published 1970-2015. In the 2015 HPER sample, P values were reported in 69/100 abstracts and 94 main texts. CIs were reported in 6 abstracts and 22 main texts. Most P values (≥77%) were ≤.05. Across all years, 60/164 two-group HPER studies had ≥80% power to detect a between-group difference of 0.5 standard deviations. From 1985 to 2015, the proportion of HPER abstracts reporting a CI did not change significantly (odds ratio [OR] 2.87; 95% CI 1.04, 7.88) whereas that of main texts reporting a CI increased (OR 1.96; 95% CI 1.39, 2.78). Comparison with biomedical studies revealed similar reporting of P values, but more frequent use of CIs in biomedicine. Automated analysis of 56,440 HPER abstracts found 14,867 (26.3%) reporting a P value, 3,024 (5.4%) reporting a CI, and increased reporting of P values and CIs from 1970 to 2015. P values are ubiquitous in HPER, CIs are rarely reported, and most studies are underpowered. Most reported P values would be considered statistically significant.
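
    The automated abstract screen can be approximated with simple regular expressions, as in the Python sketch below. The patterns are illustrative assumptions that cover common reporting styles ("P = .03", "95% CI"); the study's actual text-mining rules are not reproduced.

      import re

      P_VALUE = re.compile(r"\bp\s*[<=>]\s*\.?\d+(?:\.\d+)?", re.IGNORECASE)
      CONF_INT = re.compile(r"\b95%\s*(?:confidence interval|CI)\b", re.IGNORECASE)

      def screen_abstract(text):
          """Flag whether an abstract reports a P value and/or a 95% CI."""
          return {
              "reports_p_value": bool(P_VALUE.search(text)),
              "reports_ci": bool(CONF_INT.search(text)),
          }

      example = ("Scores improved in the intervention group (P = .03; "
                 "95% CI 0.4, 2.1) compared with controls.")
      print(screen_abstract(example))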

  4. Data engineering systems: Computerized modeling and data bank capabilities for engineering analysis

    NASA Technical Reports Server (NTRS)

    Kopp, H.; Trettau, R.; Zolotar, B.

    1984-01-01

    The Data Engineering System (DES) is a computer-based system that organizes technical data and provides automated mechanisms for storage, retrieval, and engineering analysis. The DES combines the benefits of a structured data base system with automated links to large-scale analysis codes. While the DES provides the user with many of the capabilities of a computer-aided design (CAD) system, the systems are actually quite different in several respects. A typical CAD system emphasizes interactive graphics capabilities and organizes data in a manner that optimizes these graphics. On the other hand, the DES is a computer-aided engineering system intended for the engineer who must operationally understand an existing or planned design or who desires to carry out additional technical analysis based on a particular design. The DES emphasizes data retrieval in a form that not only provides the engineer access to search and display the data but also links the data automatically with the computer analysis codes.

  5. A Quantitative Measure of Handgrip Myotonia in Non-dystrophic Myotonia

    PubMed Central

    Statland, Jeffrey M; Bundy, Brian N; Wang, Yunxia; Trivedi, Jaya R; Rayan, Dipa Raja; Herbelin, Laura; Donlan, Merideth; McLin, Rhonda; Eichinger, Katy J; Findlater, Karen; Dewar, Liz; Pandya, Shree; Martens, William B; Venance, Shannon L; Matthews, Emma; Amato, Anthony A; Hanna, Michael G; Griggs, Robert C; Barohn, Richard J

    2012-01-01

    Introduction: Non-dystrophic Myotonia (NDM) is characterized by myotonia without muscle wasting. A standardized quantitative myotonia assessment (QMA) is important for clinical trials. Methods: Myotonia was assessed in 91 individuals enrolled in a natural history study using a commercially available computerized handgrip myometer and automated software. Average peak force and 90% to 5% relaxation times were compared to historical normal controls studied with identical methods. Results: 30 subjects had chloride channel mutations, 31 sodium channel mutations, 6 DM2, and 24 no identified mutation. Chloride channel mutations were associated with prolonged 1st handgrip relaxation times, and warm up on subsequent handgrips. Sodium channel mutations were associated with prolonged 1st handgrip relaxation times and paradoxical myotonia or warm-up, depending on underlying mutations. DM2 subjects had normal relaxation times but decreased peak force. Sample size estimates are provided for clinical trial planning. Conclusion: QMA is an automated, non-invasive technique for evaluating myotonia in NDM. PMID:22987687
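
    The 90%-to-5% relaxation time named above can be computed from a force trace in a few lines, as in this simplified NumPy sketch. The sampling rate, the synthetic force profile and the sample-index definition of the thresholds are assumptions; the commercial myometer's filtering and trial handling are omitted.

      import numpy as np

      def relaxation_time_90_to_5(force, fs):
          """Seconds for grip force to fall from 90% to 5% of its peak."""
          f = np.asarray(force, dtype=float)
          peak_idx = int(np.argmax(f))
          decay = f[peak_idx:]
          below_90 = int(np.argmax(decay <= 0.90 * f[peak_idx]))
          below_05 = int(np.argmax(decay <= 0.05 * f[peak_idx]))
          return (below_05 - below_90) / fs

      # Toy trace: 1 s ramp, 1 s hold, then exponential relaxation (tau = 0.4 s)
      fs = 500
      t = np.arange(0.0, 4.0, 1.0 / fs)
      force = np.where(t < 1, t, np.where(t < 2, 1.0, np.exp(-(t - 2.0) / 0.4)))
      print("90%-5% relaxation time (s):",
            round(relaxation_time_90_to_5(force, fs), 3))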

  6. Current concepts in clinical research: web-based, automated, arthroscopic surgery prospective database registry.

    PubMed

    Lubowitz, James H; Smith, Patrick A

    2012-03-01

    In 2011, postsurgical patient outcome data may be compiled in a research registry, allowing comparative-effectiveness research and cost-effectiveness analysis by use of Health Insurance Portability and Accountability Act-compliant, institutional review board-approved, Food and Drug Administration-approved, remote, Web-based data collection systems. Computerized automation minimizes cost and minimizes surgeon time demand. A research registry can be a powerful tool to observe and understand variations in treatment and outcomes, to examine factors that influence prognosis and quality of life, to describe care patterns, to assess effectiveness, to monitor safety, and to change provider practice through feedback of data. Registry of validated, prospective outcome data is required for arthroscopic and related researchers and the public to advocate with governments and health payers. The goal is to develop evidence-based data to determine the best methods for treating patients. Copyright © 2012 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  7. Semiautomated Method for Microbiological Vitamin Assays

    PubMed Central

    Berg, T. M.; Behagel, H. A.

    1972-01-01

    A semiautomated method for microbiological vitamin assays is described, which includes separate automated systems for the preparation of the cultures and for the measurement of turbidity. In the dilution and dosage unit based on the continuous-flow principle, vitamin samples were diluted to two different dose levels at a rate of 40 per hr, mixed with the inoculated test broth, and dispensed into culture tubes. After incubation, racks with culture tubes were placed on the sampler of an automatic turbidimeter. This unit, based on the discrete-sample system, measured the turbidity and printed the extinction values at a rate of 300 per hr. Calculations were computerized and the results, including statistical data, are presented in an easily readable form. The automated method is in routine use for the assays of thiamine, riboflavine, pyridoxine, cyanocobalamin, calcium pantothenate, nicotinic acid, pantothenol, and folic acid. Identical vitamin solutions assayed on different days gave variation coefficients for the various vitamin assays of less than 10%. PMID:4553802

  8. Segmentation methodology for automated classification and differentiation of soft tissues in multiband images of high-resolution ultrasonic transmission tomography.

    PubMed

    Jeong, Jeong-Won; Shin, Dae C; Do, Synho; Marmarelis, Vasilis Z

    2006-08-01

    This paper presents a novel segmentation methodology for automated classification and differentiation of soft tissues using multiband data obtained with the newly developed system of high-resolution ultrasonic transmission tomography (HUTT) for imaging biological organs. This methodology extends and combines two existing approaches: the L-level set active contour (AC) segmentation approach and the agglomerative hierarchical kappa-means approach for unsupervised clustering (UC). To prevent the trapping of the current iterative minimization AC algorithm in a local minimum, we introduce a multiresolution approach that applies the level set functions at successively increasing resolutions of the image data. The resulting AC clusters are subsequently rearranged by the UC algorithm that seeks the optimal set of clusters yielding the minimum within-cluster distances in the feature space. The presented results from Monte Carlo simulations and experimental animal-tissue data demonstrate that the proposed methodology outperforms other existing methods without depending on heuristic parameters and provides a reliable means for soft tissue differentiation in HUTT images.
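
    The unsupervised-clustering stage can be pictured with scikit-learn's agglomerative clustering, as in the sketch below. The per-region multiband feature vectors are synthetic stand-ins, and a fixed cluster count replaces the paper's optimality criterion; the level-set stage is not reproduced here.

      import numpy as np
      from sklearn.cluster import AgglomerativeClustering

      # Stand-in features: one row per active-contour region, six "bands" each,
      # drawn from two hypothetical tissue types.
      rng = np.random.default_rng(7)
      tissue_a = rng.normal(0.0, 0.3, size=(40, 6))
      tissue_b = rng.normal(2.0, 0.3, size=(40, 6))
      features = np.vstack([tissue_a, tissue_b])

      labels = AgglomerativeClustering(n_clusters=2, linkage="ward").fit_predict(features)
      print("cluster sizes:", np.bincount(labels))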

  9. Evaluating space station applications of automation and robotics technologies from a human productivity point of view

    NASA Technical Reports Server (NTRS)

    Bard, J. F.

    1986-01-01

    The role that automation, robotics, and artificial intelligence will play in Space Station operations is now beginning to take shape. Although there is only limited data on the precise nature of the payoffs that these technologies are likely to afford, there is a general consensus that, at a minimum, the following benefits will be realized: increased responsiveness to innovation, lower operating costs, and reduction of exposure to hazards. Nevertheless, the question arises as to how much automation can be justified within the technical and economic constraints of the program. The purpose of this paper is to present a methodology which can be used to evaluate and rank different approaches to automating the functions and tasks planned for the Space Station. Special attention is given to the impact of advanced automation on human productivity. The methodology employed is based on the Analytic Hierarchy Process. This permits the introduction of individual judgements to resolve the conflict that normally arises when incomparable criteria underlie the selection process. Because of the large number of factors involved in the model, the overall problem is decomposed into four subproblems individually focusing on human productivity, economics, design, and operations, respectively. The results from each are then combined to yield the final rankings. To demonstrate the methodology, an example is developed based on the selection of an on-orbit assembly system. Five alternatives for performing this task are identified, ranging from an astronaut working in space, to a dexterous manipulator with sensory feedback. Computational results are presented along with their implications. A final parametric analysis shows that the outcome is locally insensitive to all but complete reversals in preference.
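
    The Analytic Hierarchy Process step referred to above reduces, for each criterion, to extracting the principal eigenvector of a pairwise comparison matrix and checking its consistency. The NumPy sketch below shows that textbook computation on a hypothetical three-option comparison; it does not reproduce the paper's four-subproblem hierarchy.

      import numpy as np

      # Saaty's random-index values used in the consistency ratio (n = 1..9)
      RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
                      6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

      def ahp_priorities(pairwise):
          """Priority weights and consistency ratio for one comparison matrix."""
          A = np.asarray(pairwise, dtype=float)
          eigvals, eigvecs = np.linalg.eig(A)
          k = int(np.argmax(eigvals.real))
          w = np.abs(eigvecs[:, k].real)
          w /= w.sum()
          n = A.shape[0]
          ci = (eigvals[k].real - n) / (n - 1)
          cr = ci / RANDOM_INDEX[n] if RANDOM_INDEX[n] else 0.0
          return w, cr

      # Hypothetical comparison of three assembly options on one criterion
      pairwise = [[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]]
      weights, cr = ahp_priorities(pairwise)
      print("weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))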

  10. Total synthesis of TMG-chitotriomycin based on an automated electrochemical assembly of a disaccharide building block

    PubMed Central

    Isoda, Yuta; Sasaki, Norihiko; Kitamura, Kei; Takahashi, Shuji; Manmode, Sujit; Takeda-Okuda, Naoko; Tamura, Jun-ichi

    2017-01-01

    The total synthesis of TMG-chitotriomycin using an automated electrochemical synthesizer for the assembly of carbohydrate building blocks is demonstrated. We have successfully prepared a precursor of TMG-chitotriomycin, which is a structurally-pure tetrasaccharide with typical protecting groups, through the methodology of automated electrochemical solution-phase synthesis developed by us. The synthesis of structurally well-defined TMG-chitotriomycin has been accomplished in 10 steps from a disaccharide building block. PMID:28684973

  11. Developing Mobile BIM/2D Barcode-Based Automated Facility Management System

    PubMed Central

    Chen, Yen-Pei

    2014-01-01

    Facility management (FM) has become an important topic in research on the operation and maintenance phase. Managing the work of FM effectively is extremely difficult owing to the variety of environments. One of the difficulties is the performance of two-dimensional (2D) graphics when depicting facilities. Building information modeling (BIM) uses precise geometry and relevant data to support the facilities depicted in three-dimensional (3D) object-oriented computer-aided design (CAD). This paper proposes a new and practical methodology with application to FM that uses an integrated 2D barcode and the BIM approach. Using 2D barcode and BIM technologies, this study proposes a mobile automated BIM-based facility management (BIMFM) system for FM staff in the operation and maintenance phase. The mobile automated BIMFM system is then applied in a selected case study of a commercial building project in Taiwan to verify the proposed methodology and demonstrate its effectiveness in FM practice. The combined results demonstrate that a BIMFM-like system can be an effective mobile automated FM tool. The advantage of the mobile automated BIMFM system lies not only in improving FM work efficiency for the FM staff but also in facilitating FM updates and transfers in the BIM environment. PMID:25250373

  12. Developing mobile BIM/2D barcode-based automated facility management system.

    PubMed

    Lin, Yu-Cheng; Su, Yu-Chih; Chen, Yen-Pei

    2014-01-01

    Facility management (FM) has become an important topic in research on the operation and maintenance phase. Managing the work of FM effectively is extremely difficult owing to the variety of environments. One of the difficulties is the performance of two-dimensional (2D) graphics when depicting facilities. Building information modeling (BIM) uses precise geometry and relevant data to support the facilities depicted in three-dimensional (3D) object-oriented computer-aided design (CAD). This paper proposes a new and practical methodology with application to FM that uses an integrated 2D barcode and the BIM approach. Using 2D barcode and BIM technologies, this study proposes a mobile automated BIM-based facility management (BIMFM) system for FM staff in the operation and maintenance phase. The mobile automated BIMFM system is then applied in a selected case study of a commercial building project in Taiwan to verify the proposed methodology and demonstrate its effectiveness in FM practice. The combined results demonstrate that a BIMFM-like system can be an effective mobile automated FM tool. The advantage of the mobile automated BIMFM system lies not only in improving FM work efficiency for the FM staff but also in facilitating FM updates and transfers in the BIM environment.

  13. International Federation of Library Associations Annual Conference. Papers of the Management and Technology Division: Information Technology Section (47th, Leipzig, East Germany, August 17-22, 1981).

    ERIC Educational Resources Information Center

    Bradler, Reinhard; And Others

    These seven papers on library management and networks focus on: (1) computerized access to archival and library materials, describing the methodological problems associated with a pilot project in the German Democratic Republic, as well as the efficiency of data bank systems; (2) present and future development of libraries and information centers…

  14. a Semi-Automated Point Cloud Processing Methodology for 3d Cultural Heritage Documentation

    NASA Astrophysics Data System (ADS)

    Kıvılcım, C. Ö.; Duran, Z.

    2016-06-01

    The preliminary phase in any architectural heritage project is to obtain metric measurements and documentation of the building and its individual elements. However, conventional measurement techniques require tremendous resources and lengthy project completion times for architectural surveys and 3D model production. Over the past two decades, the widespread use of laser scanning and digital photogrammetry has significantly altered the heritage documentation process. Furthermore, advances in these technologies have enabled robust data collection and reduced user workload for generating various levels of products, from single buildings to expansive cityscapes. More recently, the use of procedural modelling methods and BIM-related applications for historic building documentation has become an active area of research; however, fully automated systems for cultural heritage documentation remain an open problem. In this paper, we present a semi-automated methodology for 3D façade modelling of cultural heritage assets based on parametric and procedural modelling techniques and using airborne and terrestrial laser scanning data. We present the contribution of our methodology, which we implemented in an open source software environment, using the example project of a 16th century early classical era Ottoman structure, Sinan the Architect's Şehzade Mosque in Istanbul, Turkey.
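
    As a toy illustration of the kind of geometric fitting that underlies semi-automated façade modelling from laser-scanning data, the sketch below fits a dominant plane to a synthetic point cloud with a minimal RANSAC loop. It is not the authors' pipeline and uses no real scan data; the threshold and iteration count are assumptions.

    ```python
    # Minimal RANSAC plane fit for a (synthetic) facade point cloud.
    import numpy as np

    rng = np.random.default_rng(1)
    # Synthetic cloud: a vertical plane x = 2 with noise, plus random clutter
    plane_pts = np.column_stack([
        2.0 + rng.normal(scale=0.01, size=500),
        rng.uniform(0, 10, 500),
        rng.uniform(0, 6, 500),
    ])
    clutter = rng.uniform(0, 10, size=(150, 3))
    cloud = np.vstack([plane_pts, clutter])

    def ransac_plane(points, n_iter=200, threshold=0.05):
        """Return ((normal, d), inlier_count) for the best plane normal.x + d = 0."""
        best_inliers, best_model = 0, None
        for _ in range(n_iter):
            sample = points[rng.choice(len(points), 3, replace=False)]
            normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
            norm = np.linalg.norm(normal)
            if norm < 1e-9:
                continue  # degenerate sample, skip
            normal /= norm
            d = -normal @ sample[0]
            dist = np.abs(points @ normal + d)
            inliers = int((dist < threshold).sum())
            if inliers > best_inliers:
                best_inliers, best_model = inliers, (normal, d)
        return best_model, best_inliers

    model, inliers = ransac_plane(cloud)
    print("plane normal:", np.round(model[0], 3), "inliers:", inliers)
    ```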

  15. A comprehensive methodology for intelligent systems life-cycle cost modelling

    NASA Technical Reports Server (NTRS)

    Korsmeyer, David J.; Lum, Henry, Jr.

    1993-01-01

    As NASA moves into the last part of the twentieth century, the desire to do 'business as usual' has been replaced with the mantra 'faster, cheaper, better'. Recently, new work has been done to show how the implementation of advanced technologies, such as intelligent systems, will impact the cost of a system design or the operational cost of a spacecraft mission. The impact of the degree of autonomous or intelligent systems and of human participation on a given program is manifested most significantly during the operational phases, while the decisions of who performs what tasks and how much automation is incorporated into the system are all made during the design and development phases. Employing intelligent systems and automation is not an either/or question, but one of degree. The question is what level of automation and autonomy will provide the optimal trade-off between performance and cost. Conventional costing methodologies, however, are unable to show the significance of technologies like these in terms of traceable cost benefits and reductions in the various phases of the spacecraft's life cycle. The proposed comprehensive life-cycle methodology can address intelligent system technologies as well as others that impact human-machine operational modes.

  16. The role of information technology usage in physician practice satisfaction.

    PubMed

    Menachemi, Nir; Powers, Thomas L; Brooks, Robert G

    2009-01-01

    Despite the growing use of information technology (IT) in medical practices, little is known about the relationship between IT and physician satisfaction. The objective of this study was to examine the relationship between physician IT adoption (of various applications) and overall practice satisfaction, as well as satisfaction with the level of computerization at the practice. Data from a Florida survey examining physicians' use of IT and satisfaction were analyzed. Odds ratios (ORs), adjusted for physician demographics and practice characteristics, were computed utilizing logistic regressions to study the independent relationship of electronic health record (EHR) usage, PDA usage, use of e-mail with patients, and the use of disease management software with satisfaction. In addition, we examined the relationship between satisfaction with IT and overall satisfaction with the current medical practice. In multivariate analysis, EHR users were 5 times more likely to be satisfied with the level of computerization in their practice (OR = 4.93, 95% CI = 3.68-6.61) and 1.8 times more likely to be satisfied with their overall medical practice (OR = 1.77, 95% CI = 1.35-2.32). PDA use was also associated with an increase in satisfaction with the level of computerization (OR = 1.23, 95% CI = 1.02-1.47) and with the overall medical practice (OR = 1.30, 95% CI = 1.07-1.57). E-mail use with patients was negatively related to satisfaction with the level of computerization in the practice (OR = 0.69, 95% CI = 0.54-0.90). Last, physicians who were satisfied with IT were 4 times more likely to be satisfied with the current state of their medical practice (OR = 3.97, 95% CI = 3.29-4.81). Physician users of IT applications, especially EHRs, are generally satisfied with these technologies. Potential adopters and/or policy makers interested in influencing IT adoption should consider the positive impact that computer automation can have on medical practice.
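
    The adjusted odds ratios reported above come from logistic regression; the sketch below shows the generic computation on made-up data, with the odds ratio obtained by exponentiating a coefficient. The variables, effect sizes, and use of statsmodels are illustrative assumptions, not the Florida survey data or its analysis code.

    ```python
    # Sketch: odds ratio for "EHR use -> satisfaction" from a logistic regression.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 1000
    ehr_use = rng.integers(0, 2, n)              # hypothetical binary predictor
    practice_size = rng.integers(1, 20, n)       # hypothetical covariate
    logit_p = -0.5 + 1.0 * ehr_use + 0.02 * practice_size
    satisfied = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    X = sm.add_constant(np.column_stack([ehr_use, practice_size]))
    result = sm.Logit(satisfied, X).fit(disp=0)

    odds_ratios = np.exp(result.params)          # exponentiated coefficients
    conf_int = np.exp(result.conf_int())         # 95% CI on the odds-ratio scale
    print("OR (EHR use):", round(odds_ratios[1], 2),
          "95% CI:", np.round(conf_int[1], 2))
    ```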

  17. Assessment of readiness for clinical decision support to aid laboratory monitoring of immunosuppressive care at U.S. liver transplant centers.

    PubMed

    Jacobs, J; Weir, C; Evans, R S; Staes, C

    2014-01-01

    Following liver transplantation, patients require lifelong immunosuppressive care and monitoring. Computerized clinical decision support (CDS) has been shown to improve post-transplant immunosuppressive care processes and outcomes. The readiness of transplant information systems to implement computerized CDS to support post-transplant care is unknown. Our objectives were to: a) describe the current clinical information system functionality and the manual and automated processes for laboratory monitoring of immunosuppressive care; b) describe the use of guidelines that may be used to produce computable logic and the use of computerized alerts to support guideline adherence; and c) explore barriers to implementation of CDS in U.S. liver transplant centers. We developed a web-based survey using cognitive interviewing techniques. We surveyed 119 U.S. transplant programs that performed at least five liver transplantations per year during 2010-2012. Responses were summarized using descriptive analyses; barriers were identified using qualitative methods. Respondents from 80 programs (67% response rate) completed the survey. While 98% of programs reported having an electronic health record (EHR), all programs used paper-based manual processes to receive or track immunosuppressive laboratory results. Most programs (85%) reported that 30% or more of their patients used external laboratories for routine testing. Few programs (19%) received most external laboratory results as discrete data via electronic interfaces while most (80%) manually entered laboratory results into the EHR; less than half (42%) could integrate internal and external laboratory results. Nearly all programs had guidelines regarding pre-specified target ranges (92%) or testing schedules (97%) for managing immunosuppressive care. Few programs used computerized alerting to notify transplant coordinators of out-of-range (27%) or overdue laboratory results (20%). Use of EHRs is common, yet all liver transplant programs were largely dependent on manual paper-based processes to monitor immunosuppression for post-liver transplant patients. Similar immunosuppression guidelines provide opportunities for sharing CDS once integrated laboratory data are available.

  18. A methodology for Manufacturing Execution Systems (MES) implementation

    NASA Astrophysics Data System (ADS)

    Govindaraju, Rajesri; Putra, Krisna

    2016-02-01

    Manufacturing execution systems (MES) are information systems (IS) applications that bridge the gap between IS at the top level, namely enterprise resource planning (ERP), and IS at the lower levels, namely the automation systems. MES provides a medium for optimizing the manufacturing process as a whole on a real-time basis. By using MES in combination with ERP and other automation systems, a manufacturing company is expected to achieve high competitiveness. In implementing MES, functional integration, that is, making all the components of the manufacturing system work well together, is the most difficult challenge. For this, there is an industry standard that specifies the sub-systems of a manufacturing execution system and defines the boundaries between ERP systems, MES, and other automation systems; the standard is known as ISA-95. Although the advantages of using MES have been reported in several studies, little research has been done on how to implement MES effectively. The purpose of this study is to develop a methodology describing how an MES implementation project should be managed, utilising the support of the ISA-95 reference model in the system development process. A proposed methodology was developed based on a general IS development methodology and was then revisited based on an understanding of the specific characteristics of MES implementation projects, as found in an implementation case at an Indonesian steel manufacturing company. The case study highlighted the importance of applying an effective requirements elicitation method during the initial system assessment process, managing system interfaces and labor division in the design process, and performing a pilot deployment before putting the whole system into operation.

  19. Computerized detection of vertebral compression fractures on lateral chest radiographs: Preliminary results with a tool for early detection of osteoporosis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kasai, Satoshi; Li Feng; Shiraishi, Junji

    Vertebral fracture (or vertebral deformity) is a very common outcome of osteoporosis, which is one of the major public health concerns in the world. Early detection of vertebral fractures is important because timely pharmacologic intervention can reduce the risk of subsequent additional fractures. Chest radiographs are used routinely for detection of lung and heart diseases, and vertebral fractures can be visible on lateral chest radiographs. However, investigators noted that about 50% of vertebral fractures visible on lateral chest radiographs were underdiagnosed or under-reported, even when the fractures were severe. Therefore, our goal was to develop a computerized method for detection of vertebral fractures on lateral chest radiographs in order to assist radiologists' image interpretation and thus allow the early diagnosis of osteoporosis. The cases used in this study were 20 patients with severe vertebral fractures and 118 patients without fractures, as confirmed by the consensus of two radiologists. Radiologists identified the locations of fractured vertebrae, and they provided morphometric data on the vertebral shape for evaluation of the accuracy of detecting vertebral end plates by computer. In our computerized method, a curved search area, which included a number of vertebral end plates, was first extracted automatically, and was straightened so that vertebral end plates became oriented horizontally. Edge candidates were enhanced by use of a horizontal line-enhancement filter in the straightened image, and a multiple thresholding technique, followed by feature analysis, was used for identification of the vertebral end plates. The height of each vertebra was determined from locations of identified vertebral end plates, and fractured vertebrae were detected by comparison of the measured vertebral height with the expected height. The sensitivity of our computerized method for detection of fracture cases was 95% (19/20), with 1.03 (139/135) false-positive fractures per image. The accuracy of identifying vertebral end plates, marked by radiologists in a morphometric study, was 76.6% (400/522) and 70.9% (420/592) for cases used for training and those for testing, respectively. We prepared 32 additional fracture cases for a validation test, and we examined the detection accuracy of our computerized method. The sensitivity for these cases was 75% (24/32) at 1.03 (33/32) false-positive fractures per image. Our preliminary results show that the automated computerized scheme for detecting vertebral fractures on lateral chest radiographs has the potential to assist radiologists in detecting vertebral fractures.
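
    The horizontal line-enhancement step described above can be approximated with a simple convolution kernel that responds to horizontal intensity transitions such as vertebral end plates. The kernel shape, threshold, and synthetic image below are illustrative assumptions, not the published filter.

    ```python
    # Sketch: enhance horizontal line structures in a (synthetic) straightened spine image.
    import numpy as np
    from scipy.ndimage import convolve

    # Synthetic 64x64 image with two bright horizontal lines standing in for end plates
    image = np.zeros((64, 64))
    image[20, :] = 1.0
    image[44, :] = 1.0
    image += np.random.default_rng(3).normal(scale=0.05, size=image.shape)

    # Kernel: averages along rows, differentiates across rows, so it responds strongly
    # to horizontal intensity transitions and weakly elsewhere.
    kernel = np.array([
        [-1, -1, -1, -1, -1],
        [ 0,  0,  0,  0,  0],
        [ 1,  1,  1,  1,  1],
    ], dtype=float) / 5.0

    response = np.abs(convolve(image, kernel, mode="nearest"))

    # Candidate end-plate rows: rows whose mean response exceeds a simple threshold
    row_score = response.mean(axis=1)
    candidates = np.where(row_score > 0.5 * row_score.max())[0]
    print("candidate rows:", candidates)
    ```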

  20. Universal Verification Methodology Based Register Test Automation Flow.

    PubMed

    Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu

    2016-05-01

    In today's SoC design, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task. Therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard mechanism, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate the models into a test-bench environment because this requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe a register specification in the IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated into an IP-XACT description, from which register models can be easily generated using commercial tools. On the other hand, we also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use the register model without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time in verifying the functionality of registers.
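
    As an illustration of the spreadsheet-to-IP-XACT translation step, the sketch below turns a few CSV rows into a simplified register description in XML. The element names are a pared-down stand-in chosen for readability; a production flow would target the full IEEE 1685 (IP-XACT) schema and namespaces, which are not reproduced here.

    ```python
    # Sketch: translate a spreadsheet-style register spec into simplified IP-XACT-like XML.
    import csv
    import io
    import xml.etree.ElementTree as ET

    # Hypothetical spreadsheet export: name, address offset, size (bits), access
    spec_csv = io.StringIO(
        "name,offset,size,access\n"
        "CTRL,0x00,32,read-write\n"
        "STATUS,0x04,32,read-only\n"
    )

    root = ET.Element("registerFile")          # simplified element names, not the full IP-XACT schema
    for row in csv.DictReader(spec_csv):
        reg = ET.SubElement(root, "register")
        ET.SubElement(reg, "name").text = row["name"]
        ET.SubElement(reg, "addressOffset").text = row["offset"]
        ET.SubElement(reg, "size").text = row["size"]
        ET.SubElement(reg, "access").text = row["access"]

    print(ET.tostring(root, encoding="unicode"))
    ```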

  1. A comparison of automated crater detection methods

    NASA Astrophysics Data System (ADS)

    Bandeira, L.; Barreira, C.; Pina, P.; Saraiva, J.

    2008-09-01

    This work presents early results of a comparison between some common methodologies for automated crater detection. The three procedures considered were applied to images of the surface of Mars, thus illustrating some pros and cons of their use. We aim to establish the clear advantages of using this type of method in the study of planetary surfaces.

  2. Computerized cognitive testing in patients with type I Gaucher disease: effects of enzyme replacement and substrate reduction.

    PubMed

    Elstein, Deborah; Guedalia, Judith; Doniger, Glen M; Simon, Ely S; Antebi, Vered; Arnon, Yael; Zimran, Ari

    2005-02-01

    Because of concern for drug-induced cognitive dysfunction during clinical trials using substrate reduction therapy (miglustat) in type 1 Gaucher disease and because it has been suggested that some patients with type 1 Gaucher disease may develop neurocognitive impairment as part of the natural history, two different batteries of neuropsychological tests were devised to examine these issues. Using these tests, cognitive function was assessed in patients treated with miglustat, in patients receiving enzyme replacement (standard care for symptomatic patients), and in untreated (milder) patients. For this study, 55/60 patients exposed to miglustat in Israel participated in psychologist-administered testing; 36/55 participated in computerized testing. Of these, 31 enzyme-treated patients and 22 untreated patients participated in the psychologist-administered testing, and 15 enzyme-treated patients and 18 untreated patients participated in computerized testing. The psychologist-administered battery consisted of 18 standard neuropsychological subtests specific to executive and visuospatial functioning. The computerized battery (Mindstreams, NeuroTrax Corp., New York, NY) consisted of 10 subtests tapping multiple cognitive domains. Between-group analyses for each modality compared cognitive performance. In the psychologist-administered testing, patients exposed to miglustat performed significantly less well than the other groups in 5/18 subtests. On the computerized tests, all patients performed comparably to normal controls. Scores in patients exposed to miglustat were higher than in untreated patients, particularly in visuospatial function, whereas enzyme-treated patients performed less well. However, with the exception of visuospatial function, these results were not statistically significant. It is unclear why different testing methods yielded discordant results. Any dysfunction suggested by the current study is apparently subtle and of doubtful clinical relevance given that cognitive status did not interfere with patients' daily intellectual function. The computerized battery has methodological advantages (e.g., language options, objectivity, brevity, and ease of use) that make it well-suited for longitudinal studies, for long-term surveillance of substrate reduction therapy as well as for comparisons with other lysosomal storage disorders and other chronic diseases. These preliminary findings should allay fears of cognitive dysfunction due to short-term miglustat therapy.

  3. Automated Historical and Real-Time Cyclone Discovery With Multimodal Remote Satellite Measurements

    NASA Astrophysics Data System (ADS)

    Ho, S.; Talukder, A.; Liu, T.; Tang, W.; Bingham, A.

    2008-12-01

    Existing cyclone detection and tracking solutions involve extensive manual analysis of modeled-data and field campaign data by teams of experts. We have developed a novel automated global cyclone detection and tracking system by assimilating and sharing information from multiple remote satellites. This unprecedented solution of combining multiple remote satellite measurements in an autonomous manner allows leveraging the strengths of each individual satellite. Use of multiple satellite data sources also results in significantly improved temporal tracking accuracy for cyclones. Our solution involves an automated feature extraction and machine learning technique based on an ensemble classifier and Kalman filter for cyclone detection and tracking from multiple heterogeneous satellite data sources. Our feature-based methodology that focuses on automated cyclone discovery is fundamentally different from, and actually complements, the well-known Dvorak technique for cyclone intensity estimation (which often relies on manual detection of cyclonic regions) from field and remote data. Our solution currently employs the QuikSCAT wind measurement and the merged level 3 TRMM precipitation data for automated cyclone discovery. Assimilation of other types of remote measurements is ongoing and planned in the near future. Experimental results of our automated solution on historical cyclone datasets demonstrate the superior performance of our automated approach compared to previous work. Performance of our detection solution compares favorably against the list of cyclones occurring in the North Atlantic Ocean for the 2005 calendar year reported by the National Hurricane Center (NHC) in our initial analysis. We have also demonstrated the robustness of our cyclone tracking methodology in other regions of the world by using multiple heterogeneous satellite data for detection and tracking of three arbitrary historical cyclones. Our cyclone detection and tracking methodology can be applied (i) to historical data to support Earth scientists in climate modeling and the study of cyclone-climate interactions, and to obtain a better understanding of the causes and effects of cyclones (e.g., cyclogenesis), and (ii) to automatic cyclone discovery in near real-time using streaming satellite data to support and improve the planning of global cyclone field campaigns. Additional satellite data from GOES and other orbiting satellites can be easily assimilated and integrated into our automated cyclone detection and tracking module to improve the temporal tracking accuracy of cyclones down to ½ hr and reduce the incidence of false alarms.
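
    The Kalman-filter tracking element mentioned above can be sketched with a constant-velocity model that fuses noisy position fixes, for example cyclone centers detected independently in scatterometer and precipitation data. The state layout, noise matrices, and measurements below are illustrative assumptions, not the authors' configuration.

    ```python
    # Sketch: constant-velocity Kalman filter for smoothing a cyclone track.
    import numpy as np

    dt = 1.0                                   # hours between satellite fixes (assumed)
    F = np.array([[1, 0, dt, 0],               # state: [lat, lon, vlat, vlon]
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)  # only position is observed
    Q = 1e-3 * np.eye(4)                       # process noise (assumed)
    R = 0.05 * np.eye(2)                       # measurement noise (assumed)

    x = np.array([15.0, -45.0, 0.1, -0.2])     # initial state guess
    P = np.eye(4)

    measurements = [np.array([15.1, -45.2]), np.array([15.2, -45.5]),
                    np.array([15.4, -45.7])]   # hypothetical detected centers

    for z in measurements:
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update with the new detection
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(4) - K @ H) @ P

    print("smoothed position:", np.round(x[:2], 3))
    ```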

  4. National environmental specimen bank survey. [Location of 657 collections of environmental specimens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Hook, R.I.; Huber, E.E.

    1976-01-01

    This report presents the data base developed in the National Environmental Specimen Bank (NESB) Survey. The methodology utilized in developing the mailing lists and in developing and maintaining the data base records also is included. The NESB Survey Data Base is computerized in the Oak Ridge Computerized Hierarchical Information System, Oak Ridge National Laboratory, Oak Ridge, Tennessee 37830. The NESB Survey mailing list consisted of 4500 names and addresses. The 657 environmental specimen collections that were located and documented in the NESB Survey Data Base include the following categories: animal, atmospheric, geological, microbiological, plant, and water. However, the majority of the collections identified are biological in nature. Three indices of the NESB Survey Data Base are included in this report: (1) respondents' names and addresses categorized by organizational affiliation; (2) an alphabetical listing of respondents; and (3) geographical sampling locations for materials in collections.

  5. A mapping of information security in health Information Systems in Latin America and Brazil.

    PubMed

    Pereira, Samáris Ramiro; Fernandes, João Carlos Lopes; Labrada, Luis; Bandiera-Paiva, Paulo

    2013-01-01

    In health care, information systems such as patient records and hospital administration systems offer advantages such as cost, availability, and integration. However, for these benefits to be fully realized, it is necessary to guarantee the security of the information maintained and provided by the systems. A lack of security can lead to serious consequences such as lawsuits and the induction of medical errors. The management of information security is complex and draws on various fields of knowledge. It is often left in the background because it is not the ultimate goal of a computer system, causing huge financial losses to corporations. Using systematic review methodologies, this paper presents a mapping of the literature in order to identify the most relevant aspects addressed by health information security researchers with respect to the development of computerized systems. From the results, the authors conclude with some important aspects to which managers of computerized health systems should remain alert.

  6. Microbial Load Monitor

    NASA Technical Reports Server (NTRS)

    Gibson, S. F.; Royer, E. R.

    1979-01-01

    The Microbial Load Monitor (MLM) is an automated and computerized system for detection and identification of microorganisms. Additionally, the system is designed to enumerate and provide antimicrobic susceptibility profiles for medically significant bacteria. The system is designed to accomplish these tasks in a time of 13 hours or less versus the traditional time of 24 hours for negatives and 72 hours or more for positives usually required for standard microbiological analysis. The MLM concept differs from other methods of microbial detection in that the system is designed to accept raw untreated clinical samples and to selectively identify each group or species that may be present in a polymicrobic sample.

  7. Payload crew training scheduler (PACTS) user's manual

    NASA Technical Reports Server (NTRS)

    Shipman, D. L.

    1980-01-01

    The operation of the payload specialist training scheduler (PACTS) is discussed in this user's manual which is used to schedule payload specialists for mission training on the Spacelab experiments. The PACTS program is a fully automated interactive, computerized scheduling program equipped with tutorial displays. The tutorial displays are sufficiently detailed for use by a program analyst having no computer experience. The PACTS program is designed to operate on the UNIVAC 1108 computer system, and has the capability to load output into a PDP 11/45 Interactive Graphics Display System for printing schedules. The program has the capacity to handle up to three overlapping Spacelab missions.

  8. An automated atmospheric sampling system operating on 747 airliners

    NASA Technical Reports Server (NTRS)

    Perkins, P. J.; Gustafsson, U. R. C.

    1976-01-01

    An air sampling system that automatically measures the temporal and spatial distribution of particulate and gaseous constituents of the atmosphere is collecting data on commercial air routes covering the world. Measurements are made in the upper troposphere and lower stratosphere (6 to 12 km) of constituents related to aircraft engine emissions and other pollutants. Aircraft operated by different airlines sample air at latitudes from the Arctic to Australia. This unique system includes specialized instrumentation, a special air inlet probe for sampling outside air, a computerized automatic control, and a data acquisition system. Air constituent and related flight data are tape recorded in flight for later computer processing on the ground.

  9. Improving designer productivity. [artificial intelligence

    NASA Technical Reports Server (NTRS)

    Hill, Gary C.

    1992-01-01

    Designer and design team productivity improves with skill, experience, and the tools available. The design process involves numerous trials and errors, analyses, refinements, and addition of details. Computerized tools have greatly speeded the analysis, and now new theories and methods, emerging under the label Artificial Intelligence (AI), are being used to automate skill and experience. These tools improve designer productivity by capturing experience, emulating recognized skillful designers, and making the essence of complex programs easier to grasp. This paper outlines the aircraft design process in today's technology and business climate, presenting some of the challenges ahead and some of the promising AI methods for meeting these challenges.

  10. Automated analysis of free speech predicts psychosis onset in high-risk youths

    PubMed Central

    Bedi, Gillinder; Carrillo, Facundo; Cecchi, Guillermo A; Slezak, Diego Fernández; Sigman, Mariano; Mota, Natália B; Ribeiro, Sidarta; Javitt, Daniel C; Copelli, Mauro; Corcoran, Cheryl M

    2015-01-01

    Background/Objectives: Psychiatry lacks the objective clinical tests routinely used in other specializations. Novel computerized methods to characterize complex behaviors such as speech could be used to identify and predict psychiatric illness in individuals. AIMS: In this proof-of-principle study, our aim was to test automated speech analyses combined with Machine Learning to predict later psychosis onset in youths at clinical high-risk (CHR) for psychosis. Methods: Thirty-four CHR youths (11 females) had baseline interviews and were assessed quarterly for up to 2.5 years; five transitioned to psychosis. Using automated analysis, transcripts of interviews were evaluated for semantic and syntactic features predicting later psychosis onset. Speech features were fed into a convex hull classification algorithm with leave-one-subject-out cross-validation to assess their predictive value for psychosis outcome. The canonical correlation between the speech features and prodromal symptom ratings was computed. Results: Derived speech features included a Latent Semantic Analysis measure of semantic coherence and two syntactic markers of speech complexity: maximum phrase length and use of determiners (e.g., which). These speech features predicted later psychosis development with 100% accuracy, outperforming classification from clinical interviews. Speech features were significantly correlated with prodromal symptoms. Conclusions: Findings support the utility of automated speech analysis to measure subtle, clinically relevant mental state changes in emergent psychosis. Recent developments in computer science, including natural language processing, could provide the foundation for future development of objective clinical tests for psychiatry. PMID:27336038
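
    The Latent Semantic Analysis coherence feature described above can be approximated by embedding consecutive sentences with TF-IDF plus truncated SVD and taking the cosine similarity between neighbours; low average or minimum similarity indicates low semantic coherence. The toy transcript below is invented, and this is only a rough stand-in for the study's feature extraction.

    ```python
    # Sketch: first-order semantic coherence of a transcript via LSA.
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    sentences = [                    # hypothetical interview transcript, sentence-split
        "I went to the store yesterday to buy some food.",
        "The store was crowded and the lines were long.",
        "Afterwards I cooked dinner with my roommate.",
        "Colors speak louder when the radio is asleep.",
    ]

    tfidf = TfidfVectorizer().fit_transform(sentences)
    n_components = min(2, tfidf.shape[1] - 1)
    lsa = TruncatedSVD(n_components=n_components, random_state=0).fit_transform(tfidf)

    # Cosine similarity between each sentence and the next one
    pairwise = [
        float(cosine_similarity(lsa[i:i + 1], lsa[i + 1:i + 2])[0, 0])
        for i in range(len(sentences) - 1)
    ]
    print("first-order coherence:", round(float(np.mean(pairwise)), 3),
          "minimum:", round(min(pairwise), 3))
    ```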

  11. Developing and evaluating an automated appendicitis risk stratification algorithm for pediatric patients in the emergency department.

    PubMed

    Deleger, Louise; Brodzinski, Holly; Zhai, Haijun; Li, Qi; Lingren, Todd; Kirkendall, Eric S; Alessandrini, Evaline; Solti, Imre

    2013-12-01

    To evaluate a proposed natural language processing (NLP) and machine-learning based automated method to risk stratify abdominal pain patients by analyzing the content of the electronic health record (EHR). We analyzed the EHRs of a random sample of 2100 pediatric emergency department (ED) patients with abdominal pain, including all with a final diagnosis of appendicitis. We developed an automated system to extract relevant elements from ED physician notes and lab values and to automatically assign a risk category for acute appendicitis (high, equivocal, or low), based on the Pediatric Appendicitis Score. We evaluated the performance of the system against a manually created gold standard (chart reviews by ED physicians) for recall, specificity, and precision. The system achieved an average F-measure of 0.867 (0.869 recall and 0.863 precision) for risk classification, which was comparable to physician experts. Recall/precision were 0.897/0.952 in the low-risk category, 0.855/0.886 in the high-risk category, and 0.854/0.766 in the equivocal-risk category. The information that the system required as input to achieve high F-measure was available within the first 4 h of the ED visit. Automated appendicitis risk categorization based on EHR content, including information from clinical notes, shows comparable performance to physician chart reviewers as measured by their inter-annotator agreement and represents a promising new approach for computerized decision support to promote application of evidence-based medicine at the point of care.
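
    For reference, the F-measure quoted above is the harmonic mean of precision and recall. A minimal worked check is shown below; applying it to the quoted averages gives about 0.866, close to the reported 0.867, which need not coincide exactly because the paper averages per-category F-measures.

    ```python
    # The F-measure is the harmonic mean of precision and recall.
    def f_measure(precision: float, recall: float) -> float:
        return 2 * precision * recall / (precision + recall)

    # Quoted averages from the abstract: precision 0.863, recall 0.869
    print(round(f_measure(0.863, 0.869), 3))
    ```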

  12. A methodology for producing reliable software, volume 1

    NASA Technical Reports Server (NTRS)

    Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.

    1976-01-01

    An investigation into the areas having an impact on producing reliable software including automated verification tools, software modeling, testing techniques, structured programming, and management techniques is presented. This final report contains the results of this investigation, analysis of each technique, and the definition of a methodology for producing reliable software.

  13. Use of automated monitoring to assess behavioral toxicology in fish: Linking behavior and physiology

    USGS Publications Warehouse

    Brewer, S.K.; DeLonay, A.J.; Beauvais, S.L.; Little, E.E.; Jones, S.B.

    1999-01-01

    We measured locomotory behaviors (distance traveled, speed, tortuosity of path, and rate of change in direction) with computer-assisted analysis in 30 day posthatch rainbow trout (Oncorhynchus mykiss) exposed to pesticides. We also examined cholinesterase inhibition as a potential endpoint linking physiology and behavior. Sublethal exposure to chemicals often causes changes in swimming behavior, reflecting alterations in sensory and motor systems. Swimming behavior also integrates functions of the nervous system. Rarely are the connections between physiology and behavior made. Although behavior is often suggested as a sensitive, early indicator of toxicity, behavioral toxicology has not been used to its full potential because conventional methods of behavioral assessment have relied on manual techniques, which are often time-consuming and difficult to quantify. This has severely limited the application and utility of behavioral procedures. Swimming behavior is particularly amenable to computerized assessment and automated monitoring. Locomotory responses are sensitive to toxicants and can be easily measured. We briefly discuss the use of behavior in toxicology and automated techniques used in behavioral toxicology. We also describe the system we used to determine locomotory behaviors of fish, and present data demonstrating the system's effectiveness in measuring alterations in response to chemical challenges. Lastly, we correlate behavioral and physiological endpoints.

  14. Trends in biomedical informatics: automated topic analysis of JAMIA articles

    PubMed Central

    Wang, Shuang; Jiang, Chao; Jiang, Xiaoqian; Kim, Hyeon-Eui; Sun, Jimeng; Ohno-Machado, Lucila

    2015-01-01

    Biomedical Informatics is a growing interdisciplinary field in which research topics and citation trends have been evolving rapidly in recent years. To analyze these data in a fast, reproducible manner, automation of certain processes is needed. JAMIA is a “generalist” journal for biomedical informatics. Its articles reflect the wide range of topics in informatics. In this study, we retrieved Medical Subject Headings (MeSH) terms and citations of JAMIA articles published between 2009 and 2014. We use tensors (i.e., multidimensional arrays) to represent the interaction among topics, time and citations, and applied tensor decomposition to automate the analysis. The trends represented by tensors were then carefully interpreted and the results were compared with previous findings based on manual topic analysis. A list of most cited JAMIA articles, their topics, and publication trends over recent years is presented. The analyses confirmed previous studies and showed that, from 2012 to 2014, the number of articles related to MeSH terms Methods, Organization & Administration, and Algorithms increased significantly both in number of publications and citations. Citation trends varied widely by topic, with Natural Language Processing having a large number of citations in particular years, and Medical Record Systems, Computerized remaining a very popular topic in all years. PMID:26555018
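
    The tensor-based trend analysis described above can be sketched with a small topic x year x citation-bin count tensor and a CP (PARAFAC) decomposition. The counts below are random stand-in data, and TensorLy is used only as one possible implementation; it is not necessarily the tool used in the study.

    ```python
    # Sketch: CP decomposition of a (topic x year x citation-bin) count tensor.
    import numpy as np
    import tensorly as tl
    from tensorly.decomposition import parafac

    rng = np.random.default_rng(4)
    # Hypothetical counts: 20 MeSH topics, 6 publication years, 5 citation bins
    counts = rng.poisson(lam=3.0, size=(20, 6, 5)).astype(float)
    tensor = tl.tensor(counts)

    # Rank-3 CP decomposition: each component couples a topic profile,
    # a temporal trend, and a citation profile.
    weights, factors = parafac(tensor, rank=3, n_iter_max=200)
    topic_factors, year_factors, citation_factors = factors

    # The year factor of each component describes how that topic group trends over time.
    print("year trends per component:\n", np.round(year_factors, 2))
    ```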

  15. Automated identification of abnormal metaphase chromosome cells for the detection of chronic myeloid leukemia using microscopic images

    NASA Astrophysics Data System (ADS)

    Wang, Xingwei; Zheng, Bin; Li, Shibo; Mulvihill, John J.; Chen, Xiaodong; Liu, Hong

    2010-07-01

    Karyotyping is an important process to classify chromosomes into standard classes, and the results are routinely used by clinicians to diagnose cancers and genetic diseases. However, visual karyotyping using microscopic images is time-consuming and tedious, which reduces diagnostic efficiency and accuracy. Although many efforts have been made to develop computerized schemes for automated karyotyping, none can operate without substantial human intervention. Instead of developing a method to classify all chromosome classes, we develop an automatic scheme to detect abnormal metaphase cells by identifying a specific class of chromosomes (class 22) and prescreen for suspicious chronic myeloid leukemia (CML). The scheme includes three steps: (1) iteratively segment randomly distributed individual chromosomes, (2) process segmented chromosomes and compute image features to identify the candidates, and (3) apply an adaptive matching template to identify chromosomes of class 22. An image data set of 451 metaphase cells extracted from bone marrow specimens of 30 positive and 30 negative cases for CML was selected to test the scheme's performance. The overall case-based classification accuracy is 93.3% (100% sensitivity and 86.7% specificity). The results demonstrate the feasibility of applying an automated scheme to detect or prescreen suspicious cancer cases.
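
    The template-matching step in the scheme above can be illustrated with normalized cross-correlation on synthetic images using OpenCV; the real scheme uses an adaptive template derived from segmented chromosomes, which is not reproduced here, and the image and template below are placeholders.

    ```python
    # Sketch: locate a chromosome-like template in a metaphase image via template matching.
    import numpy as np
    import cv2

    rng = np.random.default_rng(5)
    image = rng.normal(120, 10, size=(256, 256)).astype(np.uint8)

    # Synthetic dark band with brighter edges standing in for a class-22 chromosome
    template = np.full((40, 12), 60, dtype=np.uint8)
    template[:, :2] = 90
    template[:, -2:] = 90
    image[100:140, 80:92] = template            # paste it into the image

    # Normalized cross-correlation; the peak marks the best match location
    scores = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(scores)
    print("best match at", max_loc, "score", round(max_val, 3))
    ```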

  16. Automated Microbiological Detection/Identification System

    PubMed Central

    Aldridge, C.; Jones, P. W.; Gibson, S.; Lanham, J.; Meyer, M.; Vannest, R.; Charles, R.

    1977-01-01

    An automated, computerized system, the AutoMicrobic System, has been developed for the detection, enumeration, and identification of bacteria and yeasts in clinical specimens. The biological basis for the system resides in lyophilized, highly selective and specific media enclosed in wells of a disposable plastic cuvette; introduction of a suitable specimen rehydrates and inoculates the media in the wells. An automated optical system monitors, and the computer interprets, changes in the media, with enumeration and identification results automatically obtained in 13 h. Sixteen different selective media were developed and tested with a variety of seeded (simulated) and clinical specimens. The AutoMicrobic System has been extensively tested with urine specimens, using a urine test kit (Identi-Pak) that contains selective media for Escherichia coli, Proteus species, Pseudomonas aeruginosa, Klebsiella-Enterobacter species, Serratia species, Citrobacter freundii, group D enterococci, Staphylococcus aureus, and yeasts (Candida species and Torulopsis glabrata). The system has been tested with 3,370 seeded urine specimens and 1,486 clinical urines. Agreement with simultaneous conventional (manual) cultures, at levels of 70,000 colony-forming units per ml (or more), was 92% or better for seeded specimens; clinical specimens yielded results of 93% or better for all organisms except P. aeruginosa, where agreement was 86%. System expansion in progress includes antibiotic susceptibility testing and compatibility with most types of clinical specimens. PMID:334798

  17. Comparing the performance of expert user heuristics and an integer linear program in aircraft carrier deck operations.

    PubMed

    Ryan, Jason C; Banerjee, Ashis Gopal; Cummings, Mary L; Roy, Nicholas

    2014-06-01

    Planning operations across a number of domains can be considered as resource allocation problems with timing constraints. An unexplored instance of such a problem domain is the aircraft carrier flight deck, where, in current operations, replanning is done without the aid of any computerized decision support. Rather, veteran operators employ a set of experience-based heuristics to quickly generate new operating schedules. These expert user heuristics are neither codified nor evaluated by the United States Navy; they have grown solely from the convergent experiences of supervisory staff. As unmanned aerial vehicles (UAVs) are introduced in the aircraft carrier domain, these heuristics may require alterations due to differing capabilities. The inclusion of UAVs also allows for new opportunities for on-line planning and control, providing an alternative to the current heuristic-based replanning methodology. To investigate these issues formally, we have developed a decision support system for flight deck operations that utilizes a conventional integer linear program-based planning algorithm. In this system, a human operator sets both the goals and constraints for the algorithm, which then returns a proposed schedule for operator approval. As a part of validating this system, the performance of this collaborative human-automation planner was compared with that of the expert user heuristics over a set of test scenarios. The resulting analysis shows that human heuristics often outperform the plans produced by an optimization algorithm, but are also often more conservative.
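
    The integer linear programming side of this comparison can be sketched as a tiny assignment problem in which aircraft are assigned to launch slots to minimize total completion time, subject to readiness constraints. The data and objective are invented placeholders far simpler than the deck operations model in the paper, and PuLP is used only as a convenient solver interface.

    ```python
    # Sketch: assign aircraft to launch slots with an integer linear program.
    import pulp

    aircraft = ["A1", "A2", "A3"]
    slots = [1, 2, 3]                      # slot index doubles as completion time (assumed)
    ready = {"A1": 1, "A2": 2, "A3": 1}    # earliest slot each aircraft can use (assumed)

    prob = pulp.LpProblem("deck_launch_schedule", pulp.LpMinimize)
    x = pulp.LpVariable.dicts("assign", [(a, s) for a in aircraft for s in slots], cat="Binary")

    # Objective: minimize the sum of assigned slot times
    prob += pulp.lpSum(s * x[(a, s)] for a in aircraft for s in slots)

    for a in aircraft:
        prob += pulp.lpSum(x[(a, s)] for s in slots) == 1               # each aircraft launches once
        prob += pulp.lpSum(s * x[(a, s)] for s in slots) >= ready[a]    # not before it is ready
    for s in slots:
        prob += pulp.lpSum(x[(a, s)] for a in aircraft) <= 1            # one aircraft per slot

    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    schedule = {a: s for a in aircraft for s in slots if x[(a, s)].value() == 1}
    print(schedule, "total time:", pulp.value(prob.objective))
    ```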

  18. Ergonomic Redesign of an Industrial Control Panel.

    PubMed

    Raeisi, S; Osqueizadeh, R; Maghsoudipour, M; Jafarpisheh, A S

    2016-07-01

    The operator's role in industrial control centers takes place in time, which is one of the most important determinants of whether an expected action will be successful. In certain situations, due to the complex nature of the work, the existing interfaces and already prepared procedures do not meet the dynamic requirements of the operator's cognitive demands, making the control tasks unnecessarily difficult. This study was conducted to identify ergonomic issues with a specific industrial control panel and to redesign its layout and elements to enhance its usability. Task and link analysis methodologies were implemented. All essential functions and supporting operations were identified at the required trivial levels. Next, the weight of any possible link between the elements of the panel was computed as a composite index of frequency and importance. Finally, all components were rearranged within a new layout, and a computerized mockup was generated. A total of 8 primary tasks were identified, including 4 system failure handling tasks, switching between manual and automated modes, and 3 types of routine vigilance and control tasks. These tasks were broken down into 28 functions and 145 supporting operations, accordingly. Higher link values were observed between the hand rest position and 2 elements. Also, 6 other components showed robust linkages. In conclusion, computer modeling can reduce the likelihood of accidents and near misses in industrial control rooms by considering the operators' misperception or mental burden and correcting poor design of the panels and inappropriate task allocation.

  19. Automated microaneurysm detection method based on double ring filter in retinal fundus images

    NASA Astrophysics Data System (ADS)

    Mizutani, Atsushi; Muramatsu, Chisako; Hatanaka, Yuji; Suemori, Shinsuke; Hara, Takeshi; Fujita, Hiroshi

    2009-02-01

    The presence of microaneurysms in the eye is one of the early signs of diabetic retinopathy, which is one of the leading causes of vision loss. We have been investigating a computerized method for the detection of microaneurysms on retinal fundus images, which were obtained from the Retinopathy Online Challenge (ROC) database. The ROC provides 50 training cases, in which "gold standard" locations of microaneurysms are provided, and 50 test cases without the gold standard locations. In this study, the computerized scheme was developed by using the training cases. Although the results for the test cases are also included, this paper mainly discusses the results for the training cases because the "gold standard" for the test cases is not known. After image preprocessing, candidate regions for microaneurysms were detected using a double-ring filter. Any potential false positives located in the regions corresponding to blood vessels were removed by automatic extraction of blood vessels from the images. Twelve image features were determined, and the candidate lesions were classified into microaneurysms or false positives using the rule-based method and an artificial neural network. The true positive fraction of the proposed method was 0.45 at 27 false positives per image. Forty-two percent of microaneurysms in the 50 training cases were considered invisible by the consensus of two co-investigators. When the method was evaluated for visible microaneurysms, the sensitivity for detecting microaneurysms was 65% at 27 false positives per image. Our computerized detection scheme could be improved for helping ophthalmologists in the early diagnosis of diabetic retinopathy.
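
    A double-ring style filter of the kind used for candidate detection above compares the average intensity inside a small inner region with that of a surrounding ring; microaneurysms appear as small dark spots, so the difference is large at candidate locations. The sketch below approximates the rings with box averages via scipy and uses invented sizes and a synthetic image, not the published filter or the ROC data.

    ```python
    # Sketch: double-ring style filter response on a synthetic fundus-like image.
    import numpy as np
    from scipy.ndimage import uniform_filter

    rng = np.random.default_rng(6)
    image = rng.normal(0.6, 0.02, size=(128, 128))
    image[60:63, 60:63] -= 0.2          # small dark spot standing in for a microaneurysm

    inner_size, outer_size = 3, 11      # assumed window sizes (pixels)
    inner_mean = uniform_filter(image, size=inner_size)
    outer_box = uniform_filter(image, size=outer_size)

    # Mean over the ring only: remove the inner box contribution from the outer box
    n_inner, n_outer = inner_size ** 2, outer_size ** 2
    ring_mean = (outer_box * n_outer - inner_mean * n_inner) / (n_outer - n_inner)

    response = ring_mean - inner_mean   # large where a dark spot sits inside a brighter ring
    peak = np.unravel_index(np.argmax(response), response.shape)
    print("strongest candidate at", peak, "response", round(float(response[peak]), 3))
    ```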

  20. An autonomous satellite architecture integrating deliberative reasoning and behavioural intelligence

    NASA Technical Reports Server (NTRS)

    Lindley, Craig A.

    1993-01-01

    This paper describes a method for the design of autonomous spacecraft, based upon behavioral approaches to intelligent robotics. First, a number of previous spacecraft automation projects are reviewed. A methodology for the design of autonomous spacecraft is then presented, drawing upon both the European Space Agency technological center (ESTEC) automation and robotics methodology and the subsumption architecture for autonomous robots. A layered competency model for autonomous orbital spacecraft is proposed. A simple example of low level competencies and their interaction is presented in order to illustrate the methodology. Finally, the general principles adopted for the control hardware design of the AUSTRALIS-1 spacecraft are described. This system will provide an orbital experimental platform for spacecraft autonomy studies, supporting the exploration of different logical control models, different computational metaphors within the behavioral control framework, and different mappings from the logical control model to its physical implementation.

  1. GT-CATS: Tracking Operator Activities in Complex Systems

    NASA Technical Reports Server (NTRS)

    Callantine, Todd J.; Mitchell, Christine M.; Palmer, Everett A.

    1999-01-01

    Human operators of complex dynamic systems can experience difficulties supervising advanced control automation. One remedy is to develop intelligent aiding systems that can provide operators with context-sensitive advice and reminders. The research reported herein proposes, implements, and evaluates a methodology for activity tracking, a form of intent inferencing that can supply the knowledge required for an intelligent aid by constructing and maintaining a representation of operator activities in real time. The methodology was implemented in the Georgia Tech Crew Activity Tracking System (GT-CATS), which predicts and interprets the actions performed by Boeing 757/767 pilots navigating using autopilot flight modes. This report first describes research on intent inferencing and complex modes of automation. It then provides a detailed description of the GT-CATS methodology, knowledge structures, and processing scheme. The results of an experimental evaluation using airline pilots are given. The results show that GT-CATS was effective in predicting and interpreting pilot actions in real time.

  2. Central Heating Plant site characterization report, Marine Corps Combat Development Command, Quantico, Virginia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1990-08-01

    This report presents the methodology and results of a characterization of the operation and maintenance (O&M) environment at the US Marine Corps (USMC) Quantico, Virginia, Central Heating Plant (CHP). This characterization is part of a program intended to provide the O&M staff with a computerized artificial intelligence (AI) decision support system that will assist the plant staff in more efficient operation of their plant. 3 refs., 12 figs.

  3. Automatic Feature Selection and Improved Classification in SICADA Counterfeit Electronics Detection

    DTIC Science & Technology

    2017-03-20

    The SICADA methodology was developed to detect such counterfeit microelectronics by collecting power side channel data and applying machine learning to identify counterfeits. This methodology has been extended to include a two-step automated feature selection process and now uses a one-class SVM classifier. We describe this methodology and show results for empirical data collected from several types of Microchip dsPIC33F microcontrollers.
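
    A one-class SVM of the kind referred to above learns a boundary around side-channel measurements from known-authentic parts and flags outliers as potential counterfeits. The sketch below uses scikit-learn on synthetic feature vectors; the features, data, and parameter choices are placeholders, not the SICADA pipeline.

    ```python
    # Sketch: one-class SVM for flagging anomalous power side-channel signatures.
    import numpy as np
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(7)
    # Hypothetical features extracted from power traces of authentic parts
    authentic = rng.normal(loc=0.0, scale=1.0, size=(200, 8))
    # Test set: mostly authentic-like parts plus a few shifted "counterfeit" signatures
    suspects = np.vstack([
        rng.normal(0.0, 1.0, size=(10, 8)),
        rng.normal(4.0, 1.0, size=(3, 8)),
    ])

    clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(authentic)
    labels = clf.predict(suspects)      # +1 = consistent with authentic parts, -1 = outlier
    print("flagged as potential counterfeits:", int((labels == -1).sum()), "of", len(suspects))
    ```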

  4. Fully automated chest wall line segmentation in breast MRI by using context information

    NASA Astrophysics Data System (ADS)

    Wu, Shandong; Weinstein, Susan P.; Conant, Emily F.; Localio, A. Russell; Schnall, Mitchell D.; Kontos, Despina

    2012-03-01

    Breast MRI has emerged as an effective modality for the clinical management of breast cancer. Evidence suggests that computer-aided applications can further improve the diagnostic accuracy of breast MRI. A critical and challenging first step for automated breast MRI analysis, is to separate the breast as an organ from the chest wall. Manual segmentation or user-assisted interactive tools are inefficient, tedious, and error-prone, which is prohibitively impractical for processing large amounts of data from clinical trials. To address this challenge, we developed a fully automated and robust computerized segmentation method that intensively utilizes context information of breast MR imaging and the breast tissue's morphological characteristics to accurately delineate the breast and chest wall boundary. A critical component is the joint application of anisotropic diffusion and bilateral image filtering to enhance the edge that corresponds to the chest wall line (CWL) and to reduce the effect of adjacent non-CWL tissues. A CWL voting algorithm is proposed based on CWL candidates yielded from multiple sequential MRI slices, in which a CWL representative is generated and used through a dynamic time warping (DTW) algorithm to filter out inferior candidates, leaving the optimal one. Our method is validated by a representative dataset of 20 3D unilateral breast MRI scans that span the full range of the American College of Radiology (ACR) Breast Imaging Reporting and Data System (BI-RADS) fibroglandular density categorization. A promising performance (average overlay percentage of 89.33%) is observed when the automated segmentation is compared to manually segmented ground truth obtained by an experienced breast imaging radiologist. The automated method runs time-efficiently at ~3 minutes for each breast MR image set (28 slices).
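
    The dynamic time warping step used above to compare chest wall line candidates against the representative can be illustrated with a compact DTW distance in NumPy. The two curves below are synthetic stand-ins for CWL candidates rather than anything derived from the study's MRI data.

    ```python
    # Sketch: dynamic time warping distance between two chest-wall-line candidates.
    import numpy as np

    def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
        """Classic O(len(a)*len(b)) DTW with absolute-difference local cost."""
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return float(D[n, m])

    t = np.linspace(0, np.pi, 50)
    representative = np.sin(t)                      # synthetic CWL representative
    candidate = np.sin(t * 0.9) + 0.02 * np.random.default_rng(8).normal(size=50)

    print("DTW distance to representative:", round(dtw_distance(representative, candidate), 3))
    ```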

  5. Designing and evaluating an automated system for real-time medication administration error detection in a neonatal intensive care unit.

    PubMed

    Ni, Yizhao; Lingren, Todd; Hall, Eric S; Leonard, Matthew; Melton, Kristin; Kirkendall, Eric S

    2018-05-01

    Timely identification of medication administration errors (MAEs) promises great benefits for mitigating medication errors and associated harm. Despite previous efforts utilizing computerized methods to monitor medication errors, sustaining effective and accurate detection of MAEs remains challenging. In this study, we developed a real-time MAE detection system and evaluated its performance prior to system integration into institutional workflows. Our prospective observational study included automated MAE detection of 10 high-risk medications and fluids for patients admitted to the neonatal intensive care unit at Cincinnati Children's Hospital Medical Center during a 4-month period. The automated system extracted real-time medication use information from the institutional electronic health records and identified MAEs using logic-based rules and natural language processing techniques. The MAE summary was delivered via a real-time messaging platform to promote reduction of patient exposure to potential harm. System performance was validated using a physician-generated gold standard of MAE events, and results were compared with those of current practice (incident reporting and trigger tools). Physicians identified 116 MAEs from 10 104 medication administrations during the study period. Compared to current practice, the sensitivity with automated MAE detection was improved significantly from 4.3% to 85.3% (P = .009), with a positive predictive value of 78.0%. Furthermore, the system showed potential to reduce patient exposure to harm, from 256 min to 35 min (P < .001). The automated system demonstrated improved capacity for identifying MAEs while guarding against alert fatigue. It also showed promise for reducing patient exposure to potential harm following MAE events.

  6. Definition and Demonstration of a Methodology for Validating Aircraft Trajectory Predictors

    NASA Technical Reports Server (NTRS)

    Vivona, Robert A.; Paglione, Mike M.; Cate, Karen T.; Enea, Gabriele

    2010-01-01

    This paper presents a new methodology for validating an aircraft trajectory predictor, inspired by the lessons learned from a number of field trials, flight tests and simulation experiments for the development of trajectory-predictor-based automation. The methodology introduces new techniques and a new multi-staged approach to reduce the effort in identifying and resolving validation failures, avoiding the potentially large costs associated with failures during a single-stage, pass/fail approach. As a case study, the validation effort performed by the Federal Aviation Administration for its En Route Automation Modernization (ERAM) system is analyzed to illustrate the real-world applicability of this methodology. During this validation effort, ERAM initially failed to achieve six of its eight requirements associated with trajectory prediction and conflict probe. The ERAM validation issues have since been addressed, but to illustrate how the methodology could have benefited the FAA effort, additional techniques are presented that could have been used to resolve some of these issues. Using data from the ERAM validation effort, it is demonstrated that these new techniques could have identified trajectory prediction error sources that contributed to several of the unmet ERAM requirements.

  7. Reading Guided by Automated Graphical Representations: How Model-Based Text Visualizations Facilitate Learning in Reading Comprehension Tasks

    ERIC Educational Resources Information Center

    Pirnay-Dummer, Pablo; Ifenthaler, Dirk

    2011-01-01

    Our study integrates automated natural language-oriented assessment and analysis methodologies into feasible reading comprehension tasks. With the newly developed T-MITOCAR toolset, prose text can be automatically converted into an association net which has similarities to a concept map. The "text to graph" feature of the software is based on…

  8. Methodology Investigation of AI(Artificial Intelligence) Test Officer Support Tool. Volume 1

    DTIC Science & Technology

    1989-03-01

    Subject terms: Artificial Intelligence, Expert Systems, Automated Aids to Testing. This report covers the application of Artificial Intelligence techniques to the problem of creating automated tools to ...

  9. The Automated System of the Rhythm Analysis of the Educational Process in a Higher Educational Institution on the Basis of Aprioristic Data

    ERIC Educational Resources Information Center

    Pelin, Nicolae; Mironov, Vladimir

    2008-01-01

    This article considers the development of functioning algorithms for a system for the automated analysis of the educational-process rhythm in a higher educational institution. Using the apparatus of experiment planning for conducting scientific research, methodologies adapted by the authors in their dissertation work at the…

  10. Automated Corrosion Detection Program

    DTIC Science & Technology

    2001-10-01

    More detailed explanations of the methodology development can be found in Hidden Corrosion Detection Technology Assessment, a paper presented at...Detection Program, a paper presented at the Fourth Joint DoD/FAA/NASA Conference on Aging Aircraft, 2000. AS&M PULSE. The PULSE system, developed...selection can be found in The Evaluation of Hidden Corrosion Detection Technologies on the Automated Corrosion Detection Program, a paper presented

  11. Computerized engineering logic for nuclear procurement and dedication processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tulay, M.P.

    1996-12-31

    In an attempt to better meet the needs of operations and maintenance organizations, many nuclear utility procurement engineering groups have simplified their procedures, developed on-line tools for performing the specification of replacement items, and developed relational databases containing part-level information necessary to automate the procurement process. Although these improvements have helped to reduce the engineering necessary to properly specify and accept/dedicate items for nuclear safety-related applications, a number of utilities have recognized that additional long-term savings can be realized by integrating a computerized logic to assist technical procurement engineering personnel. The most commonly used logic follows the generic processes contained in Electric Power Research Institute (EPRI) published guidelines. The processes are typically customized to some extent to accommodate each utility's organizational structure, operating procedures, and strategic goals. This paper will discuss a typical logic that integrates the technical evaluation, acceptance, and receipt inspection and testing processes. The logic this paper will describe has been successfully integrated at a growing number of nuclear utilities and has produced numerous positive results. The application of the logic ensures that utility-wide standards or procedures, common among multi-site utilities, are followed.

  12. Application of Human-Autonomy Teaming (HAT) Patterns to Reduce Crew Operations (RCO)

    NASA Technical Reports Server (NTRS)

    Shively, R. Jay; Brandt, Summer L.; Lachter, Joel; Matessa, Mike; Sadler, Garrett; Battiste, Henri

    2011-01-01

    Unmanned aerial systems, advanced cockpits, and air traffic management are all seeing dramatic increases in automation. However, while automation may take on some tasks previously performed by humans, humans will still be required to remain in the system for the foreseeable future. The collaboration between humans and these increasingly autonomous systems will begin to resemble cooperation between teammates, rather than simple task allocation. It is critical to understand this human-autonomy teaming (HAT) to optimize these systems in the future. One methodology to understand HAT is by identifying recurring patterns of HAT that have similar characteristics and solutions. This paper applies a methodology for identifying HAT patterns to an advanced cockpit project.

  13. Automated software development workstation

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Engineering software development was automated using an expert system (rule-based) approach. The use of this technology offers benefits not available from current software development and maintenance methodologies. A workstation was built with a library or program data base with methods for browsing the designs stored; a system for graphical specification of designs including a capability for hierarchical refinement and definition in a graphical design system; and an automated code generation capability in FORTRAN. The workstation was then used in a demonstration with examples from an attitude control subsystem design for the space station. Documentation and recommendations are presented.

  14. A methodology for model-based development and automated verification of software for aerospace systems

    NASA Astrophysics Data System (ADS)

    Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.

    Today's software for aerospace systems is typically very complex. This is due to the increasing number of features as well as the high demand for safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking are becoming more popular. The latter examine the whole state space and, consequently, result in full test coverage. Nevertheless, despite the obvious advantages, this technique is still rarely used for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive workflow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use the SCADE Suite (Esterel Technologies), an integrated design environment that covers all the requirements of our methodology. The SCADE Suite is well established in the avionics and defense, rail transportation, energy, and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus. In particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.
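
    As a toy illustration of what exhaustively examining the state space means (and not of the SCADE/ECSS tool chain itself), the sketch below performs an explicit-state reachability check of a safety invariant on a small hand-written mode machine; every mode, event and bound in it is hypothetical.

    ```python
    from collections import deque

    INITIAL = ("OFF", 0)                        # (mode, accumulated fault count)

    def successors(state):
        """Successor relation of a toy fault-management mode machine."""
        mode, errors = state
        if mode == "OFF":
            yield ("STANDBY", errors)
        elif mode == "STANDBY":
            yield ("ACTIVE", errors)
            yield ("OFF", errors)
        elif mode == "ACTIVE":
            yield ("STANDBY", errors)
            if errors < 2:
                yield ("ACTIVE", errors + 1)    # another fault while active
            else:
                yield ("SAFE_MODE", errors)     # fault response kicks in
        elif mode == "SAFE_MODE":
            yield ("SAFE_MODE", errors)

    def invariant(state):                       # safety property to verify
        mode, errors = state
        return not (mode == "ACTIVE" and errors >= 3)

    # Breadth-first exploration of every reachable state
    seen, frontier = {INITIAL}, deque([INITIAL])
    while frontier:
        s = frontier.popleft()
        assert invariant(s), f"counterexample: {s}"
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    print(f"invariant holds on all {len(seen)} reachable states")
    ```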

  15. Architecture Views Illustrating the Service Automation Aspect of SOA

    NASA Astrophysics Data System (ADS)

    Gu, Qing; Cuadrado, Félix; Lago, Patricia; Dueñas, Juan C.

    Earlier in this book, Chapter 8 provided a detailed analysis of service engineering, including a review of service engineering techniques and methodologies. This chapter is closely related to Chapter 8, as it shows how such approaches can be used to develop a service, with particular emphasis on the identification of three views (the automation decision view, the degree of service automation view, and the service automation related data view) that structure and ease the elicitation and documentation of stakeholders' concerns. This is carried out through two large case studies used to learn the industrial needs in illustrating service deployment and configuration automation. This set of views adds to more traditional notations like UML the visual power of drawing users' attention to the addressed concerns and assisting them in their work. This is especially crucial in service-oriented architecting, where service automation is in high demand.

  16. Evaluating Management Strategies for Automated Test Systems/Equipment (ATS/E): An F-15 Case Study

    DTIC Science & Technology

    2005-03-01

    ethnography, grounded theory, case study, phenomenological research, and narrative research (also known as bibliography from... Creswell, 2003:183). Example inquiry strategies identified by Creswell are: narrative, phenomenology, ethnography, case study, and grounded theory... other managed systems. Methodology: The researcher chose a qualitative research methodology and

  17. NMRbot: Python scripts enable high-throughput data collection on current Bruker BioSpin NMR spectrometers.

    PubMed

    Clos, Lawrence J; Jofre, M Fransisca; Ellinger, James J; Westler, William M; Markley, John L

    2013-06-01

    To facilitate the high-throughput acquisition of nuclear magnetic resonance (NMR) experimental data on large sets of samples, we have developed a simple and straightforward automated methodology that capitalizes on recent advances in Bruker BioSpin NMR spectrometer hardware and software. Given the daunting challenge for non-NMR experts to collect quality spectra, our goal was to increase user accessibility, provide customized functionality, and improve the consistency and reliability of resultant data. This methodology, NMRbot, is encoded in a set of scripts written in the Python programming language accessible within the Bruker BioSpin TopSpin ™ software. NMRbot improves automated data acquisition and offers novel tools for use in optimizing experimental parameters on the fly. This automated procedure has been successfully implemented for investigations in metabolomics, small-molecule library profiling, and protein-ligand titrations on four Bruker BioSpin NMR spectrometers at the National Magnetic Resonance Facility at Madison. The investigators reported benefits from ease of setup, improved spectral quality, convenient customizations, and overall time savings.
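
    A minimal sketch of the kind of queue-driven acquisition loop such scripting enables. The helper functions below (load_sample, optimize_parameters, run_experiment) are hypothetical stand-ins, not the actual TopSpin or NMRbot API; the published scripts document the real commands.

    ```python
    # Illustrative unattended acquisition; every function here is a placeholder.
    SAMPLES = ["metabolite_extract_01", "metabolite_extract_02", "ligand_titration_A"]
    EXPERIMENTS = ["1D_1H", "2D_HSQC"]

    def load_sample(name):
        print(f"loading {name} via the sample changer")            # hypothetical

    def optimize_parameters(experiment):
        # e.g. tune/match, shimming, pulse calibration, receiver gain
        return {"ns": 16, "d1": 1.5}                                # hypothetical

    def run_experiment(sample, experiment, params):
        print(f"acquiring {experiment} on {sample} with {params}")  # hypothetical

    for sample in SAMPLES:          # high-throughput collection over a sample queue
        load_sample(sample)
        for experiment in EXPERIMENTS:
            params = optimize_parameters(experiment)
            run_experiment(sample, experiment, params)
    ```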

  18. Automated Control of the Organic and Inorganic Composition of Aloe vera Extracts Using (1)H NMR Spectroscopy.

    PubMed

    Monakhova, Yulia B; Randel, Gabriele; Diehl, Bernd W K

    2016-09-01

    Recent classification of Aloe vera whole-leaf extract by the International Agency for Research on Cancer as a possible carcinogen to humans, as well as the continuous adulteration of A. vera's authentic material, has generated renewed interest in controlling A. vera. The existing NMR spectroscopic method for the analysis of A. vera, which is based on a routine developed at Spectral Service, was extended. Apart from aloverose, glucose, malic acid, lactic acid, citric acid, whole-leaf material (WLM), acetic acid, fumaric acid, sodium benzoate, and potassium sorbate, the quantification of Mg2+, Ca2+, and fructose is possible with the addition of a Cs-EDTA solution to the sample. The proposed methodology was automated, which includes phasing, baseline correction, deconvolution (based on the Lorentzian function), integration, quantification, and reporting. The NMR method was applied to 41 A. vera preparations in the form of liquid A. vera juice and solid A. vera powder. The advantages of the new NMR methodology over the previous method were discussed. Correlation between the new and standard NMR methodologies was significant for aloverose, glucose, malic acid, lactic acid, citric acid, and WLM (P < 0.0001, R² = 0.99). NMR was found to be suitable for the automated simultaneous quantitative determination of 13 parameters in A. vera.
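
    To illustrate the deconvolution-and-integration step, here is a minimal Python sketch that fits a single Lorentzian line to a synthetic peak and reads off its area; the peak position, width and the internal-standard step are assumptions, not the paper's protocol.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def lorentzian(x, area, x0, gamma):
        """Area-normalized Lorentzian: 'area' equals the integral of the peak."""
        return area * (gamma / np.pi) / ((x - x0) ** 2 + gamma ** 2)

    # Synthetic resonance on a ppm axis (placeholder for a phased,
    # baseline-corrected spectral region)
    x = np.linspace(4.0, 4.4, 400)
    rng = np.random.default_rng(0)
    y = lorentzian(x, area=2.5, x0=4.2, gamma=0.01) + rng.normal(0, 0.5, x.size)

    popt, _ = curve_fit(lorentzian, x, y, p0=[1.0, 4.2, 0.02])
    peak_area = popt[0]
    # Quantification would then scale this area against an internal standard of
    # known concentration, corrected for the number of contributing protons.
    print(f"fitted peak area: {peak_area:.2f}")
    ```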

  19. Self-Contained Automated Methodology for Optimal Flow Control

    NASA Technical Reports Server (NTRS)

    Joslin, Ronald D.; Gunzburger, Max D.; Nicolaides, Roy A.; Erlebacher, Gordon; Hussaini, M. Yousuff

    1997-01-01

    This paper describes a self-contained, automated methodology for active flow control which couples the time-dependent Navier-Stokes system with an adjoint Navier-Stokes system and optimality conditions from which optimal states, i.e., unsteady flow fields and controls (e.g., actuators), may be determined. The problem of boundary layer instability suppression through wave cancellation is used as the initial validation case to test the methodology. Here, the objective of control is to match the stress vector along a portion of the boundary to a given vector; instability suppression is achieved by choosing the given vector to be that of a steady base flow. Control is effected through the injection or suction of fluid through a single orifice on the boundary. The results demonstrate that instability suppression can be achieved without any a priori knowledge of the disturbance, which is significant because other control techniques have required some knowledge of the flow unsteadiness such as frequencies, instability type, etc. The present methodology has been extended to three dimensions and may potentially be applied to separation control, re-laminarization, and turbulence control applications using one to many sensors and actuators.
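
    For readers unfamiliar with the adjoint approach, the following generic formulation sketches the structure of such an optimality system; the notation and the exact cost functional are illustrative assumptions, not the paper's formulation.

    ```latex
    % Generic adjoint-based optimal boundary control of Navier-Stokes (sketch)
    \begin{align*}
      \min_{g}\; J(\mathbf{u},g) &= \frac{1}{2}\int_0^T\!\!\int_{\Gamma_s}
          \left|\boldsymbol{\tau}(\mathbf{u},p)-\boldsymbol{\tau}_{\mathrm{target}}\right|^2
          \,d\Gamma\,dt
          \;+\;\frac{\alpha}{2}\int_0^T\!\!\int_{\Gamma_c} |g|^2 \,d\Gamma\,dt,\\
      \text{subject to}\quad
      \partial_t\mathbf{u}+(\mathbf{u}\cdot\nabla)\mathbf{u}
          &= -\nabla p+\nu\nabla^2\mathbf{u},\qquad
          \nabla\cdot\mathbf{u}=0,\qquad
          \mathbf{u}\big|_{\Gamma_c}=g\,\mathbf{n}.
    \end{align*}
    % Setting the first variation of the Lagrangian to zero yields an adjoint
    % Navier-Stokes system integrated backward in time and an optimality
    % condition on the control boundary (e.g. \alpha g = -\boldsymbol{\xi}\cdot\mathbf{n}),
    % which together close the forward/adjoint iteration that determines g.
    ```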

  20. Integrated microreactor for enzymatic reaction automation: An easy step toward the quality control of monoclonal antibodies.

    PubMed

    Ladner, Yoann; Mas, Silvia; Coussot, Gaelle; Bartley, Killian; Montels, Jérôme; Morel, Jacques; Perrin, Catherine

    2017-12-15

    The main purpose of the present work is to provide a fully integrated miniaturized electrophoretic methodology in order to facilitate the quality control of monoclonal antibodies (mAbs). This methodology called D-PES, which stands for Diffusion-mediated Proteolysis combined with an Electrophoretic Separation, permits to perform subsequently mAb tryptic digestion and electrophoresis separation of proteolysis products in an automated manner. Tryptic digestion conditions were optimized regarding the influence of enzyme concentration and incubation time in order to achieve similar enzymatic digestion efficiency to that obtained with the classical methodology (off-line). Then, the optimization of electrophoretic separation conditions concerning the nature of background electrolyte (BGE), ionic strength and pH was realized. Successful and repeatable electrophoretic profiles of three mAbs digests (Trastuzumab, Infliximab and Tocilizumab), comparable to the off-line digestion profiles, were obtained demonstrating the feasibility and robustness of the proposed methodology. In summary, the use of the proposed and optimized in-line approach opens a new, fast and easy way for the quality control of mAbs. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. A unified approach to VLSI layout automation and algorithm mapping on processor arrays

    NASA Technical Reports Server (NTRS)

    Venkateswaran, N.; Pattabiraman, S.; Srinivasan, Vinoo N.

    1993-01-01

    Development of software tools for designing supercomputing systems is highly complex and cost-ineffective. To tackle this, a special-purpose PAcube silicon compiler that integrates different design levels, from cells to processor arrays, has been proposed. As a part of this effort, we present in this paper a novel methodology which unifies the problems of Layout Automation and Algorithm Mapping.

  2. Study of space shuttle orbiter system management computer function. Volume 2: Automated performance verification concepts

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The findings are presented of investigations on concepts and techniques in automated performance verification. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.

  3. Artificial intelligence issues related to automated computing operations

    NASA Technical Reports Server (NTRS)

    Hornfeck, William A.

    1989-01-01

    Large data processing installations represent target systems for effective applications of artificial intelligence (AI) constructs. The system organization of a large data processing facility at the NASA Marshall Space Flight Center is presented. The methodology and the issues which are related to AI application to automated operations within a large-scale computing facility are described. Problems to be addressed and initial goals are outlined.

  4. Nonlinear Computerized Methodology. A. Angle of Arrival Estimation. B. Data Modeling and Identification

    DTIC Science & Technology

    1991-06-10

    essentially in the Wigner-Ville distribution (WVD). A preliminary analysis indicates that the simple operation of autoconvolution can enhance spectral... many troublesome cases as a supplement to MUSIC (and its adaptations) and as a simple alternative to (or representation of) the Wigner-Ville... The WVD is a time-frequency distribution which provides an unbiased spectrum estimate by W(t, ω) = ∫ H(u) X(t + u/2) X*(t − u/2) e^(−iωu) du, where the
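
    Since the excerpt cuts off mid-definition, a small numerical sketch may help: the discrete Wigner-Ville distribution can be computed by forming the instantaneous autocorrelation at each time index and Fourier-transforming over the lag. The implementation below is a generic illustration, not the report's algorithm.

    ```python
    import numpy as np

    def wigner_ville(x):
        """Discrete Wigner-Ville distribution of a 1-D (preferably analytic) signal.
        For each time index n, build r[m] = x[n+m] * conj(x[n-m]) over admissible
        lags and FFT over the lag variable; the result is real-valued."""
        x = np.asarray(x, dtype=complex)
        N = len(x)
        W = np.zeros((N, N))
        for n in range(N):
            max_lag = min(n, N - 1 - n)
            r = np.zeros(N, dtype=complex)
            for m in range(-max_lag, max_lag + 1):
                r[m % N] = x[n + m] * np.conj(x[n - m])
            W[n] = np.fft.fft(r).real
        return W

    # Usage: a linear chirp produces energy concentrated along a rising ridge
    t = np.arange(256) / 256.0
    chirp = np.exp(1j * np.pi * 60.0 * t ** 2)
    tfr = wigner_ville(chirp)
    ```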

  5. A study of commuter airplane design optimization

    NASA Technical Reports Server (NTRS)

    Roskam, J.; Wyatt, R. D.; Griswold, D. A.; Hammer, J. L.

    1977-01-01

    Problems of commuter airplane configuration design were studied to effect a minimization of direct operating costs. Factors considered were the minimization of fuselage drag, methods of wing design, and the estimated drag of an airplane submerged in a propeller slipstream; all design criteria were studied under a set of fixed performance, mission, and stability constraints. Configuration design data were assembled for application by a computerized design methodology program similar to the NASA-Ames General Aviation Synthesis Program.

  6. Empirical Demonstration of Isoperformance Methodology Preparatory of an Interactive Expert Computerized Decision Aid

    DTIC Science & Technology

    1988-11-01

    after modest amounts of practice. Moreover, the advantages of display aiding (e.g., Smith & Kennedy, 1976) or artificial intelligence may be largely... team performance in tanks is largely a function of the intelligence of the tank commander (Wallace, 1932). In summary, individual differences such as... versus small CRT screen), and training on a videogame task which simulated a remotely piloted vehicle. This study was successful as an isoperformance

  7. Commercial Activities Baseline Study

    DTIC Science & Technology

    1991-03-01

    object oriented programming technology) that automated processing. This report documents that methodology, reviews the candidates and criteria for source data, and provides examples of the output reports.

  8. Developing and applying modern methods of leakage monitoring and state estimation of fuel at the Novovoronezh nuclear power plant

    NASA Astrophysics Data System (ADS)

    Povarov, V. P.; Tereshchenko, A. B.; Kravchenko, Yu. N.; Pozychanyuk, I. V.; Gorobtsov, L. I.; Golubev, E. I.; Bykov, V. I.; Likhanskii, V. V.; Evdokimov, I. A.; Zborovskii, V. G.; Sorokin, A. A.; Kanyukova, V. D.; Aliev, T. N.

    2014-02-01

    The results of developing and implementing the modernized fuel leakage monitoring methods at the shut-down and running reactor of the Novovoronezh nuclear power plant (NPP) are presented. An automated computerized expert system integrated with an in-core monitoring system (ICMS) and installed at the Novovoronezh NPP unit no. 5 is described. If leaky fuel elements appear in the core, the system allows one to perform on-line assessment of the parameters of leaky fuel assemblies (FAs). The computer expert system units designed for optimizing the operating regimes and enhancing the fuel usage efficiency at the Novovoronezh NPP unit no. 5 are now being developed.

  9. The Nurse Watch: Design and Evaluation of a Smart Watch Application with Vital Sign Monitoring and Checklist Reminders

    PubMed Central

    Bang, Magnus; Solnevik, Katarina; Eriksson, Henrik

    2015-01-01

    Computerized wearable devices such as smart watches will become valuable nursing tools. This paper describes a smart-watch system developed in close collaboration with a team of nurses working in a Swedish ICU. The smart-watch system provides real-time vital-sign monitoring, threshold alarms, and to-do reminders. Additionally, a Kanban board, visualized on a multitouch screen provides an overview of completed and upcoming tasks. We describe an approach to implement automated checklist systems with smart watches and discuss aspects of importance when implementing such memory and attention support. The paper is finalized with an in-development formative evaluation of the system. PMID:26958162

  10. Intelligent Computerized Training System

    NASA Technical Reports Server (NTRS)

    Wang, Lui; Baffes, Paul; Loftin, R. Bowen; Hua, Grace C.

    1991-01-01

    Intelligent computer-aided training system gives trainees same experience gained from best on-the-job training. Automated system designed to emulate behavior of experienced teacher devoting full time and attention to training novice. Proposes challenging training scenarios, monitors and evaluates trainee's actions, makes meaningful comments in response to errors, responds to requests for information, gives hints when appropriate, and remembers strengths and weaknesses so it designs suitable exercises. Used to train flight-dynamics officers in deploying satellites from Space Shuttle. Adapted to training for variety of tasks and situations, simply by modifying one or at most two of its five modules. Helps to ensure continuous supply of trained specialists despite scarcity of experienced and skilled human trainers.

  11. The Nurse Watch: Design and Evaluation of a Smart Watch Application with Vital Sign Monitoring and Checklist Reminders.

    PubMed

    Bang, Magnus; Solnevik, Katarina; Eriksson, Henrik

    Computerized wearable devices such as smart watches will become valuable nursing tools. This paper describes a smart-watch system developed in close collaboration with a team of nurses working in a Swedish ICU. The smart-watch system provides real-time vital-sign monitoring, threshold alarms, and to-do reminders. Additionally, a Kanban board, visualized on a multitouch screen provides an overview of completed and upcoming tasks. We describe an approach to implement automated checklist systems with smart watches and discuss aspects of importance when implementing such memory and attention support. The paper is finalized with an in-development formative evaluation of the system.

  12. CPOE: a clear purpose plus top-notch technical support equals high physician adoption.

    PubMed

    Birk, Susan

    2010-01-01

    As with any fundamental change, the transition to computerized physician order entry (CPOE) is not a risk-free endeavor. Major questions hover around this facet of the arduous and controversial paper-to-electronic conversion currently preoccupying the healthcare industry: Could physician over-reliance on electronic prompts actually lead to an increase in some types of medical errors? Could automated workstations ultimately hinder safety and the delivery of quality care by diminishing face-to-face communication and nuanced discussions? In an ironic twist, could electronic solutions insidiously leach creativity, intuition and judgment from good medicine by keeping physicians tied to tools that consume their time but do not offer effective clinical decision support?

  13. Application of Human-Autonomy Teaming (HAT) Patterns to Reduce Crew Operations (RCO)

    NASA Technical Reports Server (NTRS)

    Shively, R. Jay; Brandt, Summer L.; Lachter, Joel; Matessa, Mike; Sadler, Garrett; Battiste, Henri

    2016-01-01

    Unmanned aerial systems, robotics, advanced cockpits, and air traffic management are all examples of domains that are seeing dramatic increases in automation. While automation may take on some tasks previously performed by humans, humans will still be required, for the foreseeable future, to remain in the system. The collaboration between humans and these increasingly autonomous systems will begin to resemble cooperation between teammates, rather than simple task allocation. It is critical to understand this human-autonomy teaming (HAT) to optimize these systems in the future. One methodology to understand HAT is by identifying recurring patterns of HAT that have similar characteristics and solutions. This paper applies a methodology for identifying HAT patterns to an advanced cockpit project.

  14. How to combine probabilistic and fuzzy uncertainties in fuzzy control

    NASA Technical Reports Server (NTRS)

    Nguyen, Hung T.; Kreinovich, Vladik YA.; Lea, Robert

    1991-01-01

    Fuzzy control is a methodology that translates natural-language rules, formulated by expert controllers, into the actual control strategy that can be implemented in an automated controller. In many cases, in addition to the experts' rules, additional statistical information about the system is known. It is explained how to use this additional information in fuzzy control methodology.

  15. An automated computerized auscultation and diagnostic system for pulmonary diseases.

    PubMed

    Abbas, Ali; Fahim, Atef

    2010-12-01

    Respiratory sounds are of significance as they provide valuable information on the health of the respiratory system. Sounds emanating from the respiratory system are uneven, and vary significantly from one individual to another and for the same individual over time. In and of themselves they are not direct proof of an ailment, but rather an inference that one exists. Auscultation diagnosis is an art/skill that is acquired and honed by practice; hence it is common to seek confirmation using invasive and potentially harmful imaging diagnosis techniques like X-rays. This research focuses on developing an automated auscultation diagnostic system that overcomes the limitations inherent in traditional auscultation techniques. The system uses a front-end sound-signal filtering module based on adaptive neural network (NN) noise cancellation to eliminate spurious sound signals such as those from the heart, the intestine, and ambient noise. To date, the core diagnosis module is capable of distinguishing lung sounds from non-lung sounds and normal lung sounds from abnormal ones, and of identifying wheezes versus crackles as indicators of different ailments.
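
    As a rough illustration of adaptive noise cancellation (using a classical LMS filter as a stand-in for the paper's neural-network canceller), the sketch below learns to predict the interference from a reference channel and keeps the prediction error as the cleaned lung-sound estimate; all signals and parameters are synthetic assumptions.

    ```python
    import numpy as np

    def lms_noise_canceller(primary, reference, n_taps=32, mu=0.01):
        """LMS adaptive noise cancellation: 'primary' = lung sounds + interference,
        'reference' = interference-only channel correlated with that interference."""
        w = np.zeros(n_taps)
        cleaned = np.zeros(len(primary))
        for n in range(n_taps, len(primary)):
            x = reference[n - n_taps:n][::-1]   # most recent reference samples first
            y = w @ x                           # current interference estimate
            e = primary[n] - y                  # error = cleaned lung-sound sample
            w += 2 * mu * e * x                 # LMS weight update
            cleaned[n] = e
        return cleaned

    # Synthetic check: a low-level lung-sound stand-in buried in sinusoidal noise
    n = np.arange(8000)
    lung = 0.3 * np.sin(2 * np.pi * 0.01 * n)
    interference = np.sin(2 * np.pi * 0.002 * n)
    cleaned = lms_noise_canceller(lung + interference, np.roll(interference, 3))
    ```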

  16. [Evaluation of an automated pH-monitor and its logic of calculation].

    PubMed

    Ducrotté, P; Hubin, M; Xin, H; Roussignol, C; Denis, P

    1990-01-01

    The aim of this study was to compare the results of 3-hour postprandial esophageal pH recordings obtained simultaneously from a standard Beckmann pH recorder and a commercially available fully automated pH recording device, "pH 60" in 30 subjects. Both apparatuses were connected to the same pH probe and to a unique chart recorder to obtain simultaneous pH graphic tracings. The percentage of time between each pH level below pH 5, the percentage of time with pH less than 4 and Kaye's score were determined hourly and for the overall recording time. The pH graphic traces in both apparatuses were strictly identical demonstrating the accuracy of the analog-to-digital converter and the memory module to record pH changes. Moreover, we found a significant correlation (p less than 0.01) and a good overall agreement for all compared parameters between manual and computerized analysis. This study documents that the commercially available ambulatory esophageal pH instrument studied produces accurate data for the diagnosis of gastroesophageal reflux.

  17. The role of computerized diagnostic proposals in the interpretation of the 12-lead electrocardiogram by cardiology and non-cardiology fellows.

    PubMed

    Novotny, Tomas; Bond, Raymond; Andrsova, Irena; Koc, Lumir; Sisakova, Martina; Finlay, Dewar; Guldenring, Daniel; Spinar, Jindrich; Malik, Marek

    2017-05-01

    Most contemporary 12-lead electrocardiogram (ECG) devices offer computerized diagnostic proposals. The reliability of these automated diagnoses is limited. It has been suggested that incorrect computer advice can influence physician decision-making. This study analyzed the role of diagnostic proposals in the decision process by a group of fellows of cardiology and other internal medicine subspecialties. A set of 100 clinical 12-lead ECG tracings was selected covering both normal cases and common abnormalities. A team of 15 junior Cardiology Fellows and 15 Non-Cardiology Fellows interpreted the ECGs in 3 phases: without any diagnostic proposal, with a single diagnostic proposal (half of them intentionally incorrect), and with four diagnostic proposals (only one of them being correct) for each ECG. Self-rated confidence of each interpretation was collected. Availability of diagnostic proposals significantly increased the diagnostic accuracy (p<0.001). Nevertheless, in case of a single proposal (either correct or incorrect) the increase of accuracy was present in interpretations with correct diagnostic proposals, while the accuracy was substantially reduced with incorrect proposals. Confidence levels poorly correlated with interpretation scores (rho ≈ 0.2, p<0.001). Logistic regression showed that an interpreter is most likely to be correct when the ECG offers a correct diagnostic proposal (OR=10.87) or multiple proposals (OR=4.43). Diagnostic proposals affect the diagnostic accuracy of ECG interpretations. The accuracy is significantly influenced especially when a single diagnostic proposal (either correct or incorrect) is provided. The study suggests that the presentation of multiple computerized diagnoses is likely to improve the diagnostic accuracy of interpreters. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. FFDM image quality assessment using computerized image texture analysis

    NASA Astrophysics Data System (ADS)

    Berger, Rachelle; Carton, Ann-Katherine; Maidment, Andrew D. A.; Kontos, Despina

    2010-04-01

    Quantitative measures of image quality (IQ) are routinely obtained during the evaluation of imaging systems. These measures, however, do not necessarily correlate with the IQ of the actual clinical images, which can also be affected by factors such as patient positioning. No quantitative method currently exists to evaluate clinical IQ. Therefore, we investigated the potential of using computerized image texture analysis to quantitatively assess IQ. Our hypothesis is that image texture features can be used to assess IQ as a measure of the image signal-to-noise ratio (SNR). To test feasibility, the "Rachel" anthropomorphic breast phantom (Model 169, Gammex RMI) was imaged with a Senographe 2000D FFDM system (GE Healthcare) using 220 unique exposure settings (target/filter, kV, and mAs combinations). The mAs were varied from 10%-300% of that required for an average glandular dose (AGD) of 1.8 mGy. A 2.5 cm² retroareolar region of interest (ROI) was segmented from each image. The SNR was computed from the ROIs segmented from images linear with dose (i.e., raw images) after flat-field and off-set correction. Image texture features of skewness, coarseness, contrast, energy, homogeneity, and fractal dimension were computed from the Premium View™ postprocessed image ROIs. Multiple linear regression demonstrated a strong association between the computed image texture features and SNR (R² = 0.92, p ≤ 0.001). When including kV, target and filter as additional predictor variables, a stronger association with SNR was observed (R² = 0.95, p ≤ 0.001). The strong associations indicate that computerized image texture analysis can be used to measure image SNR and potentially aid in automating IQ assessment as a component of the clinical workflow. Further work is underway to validate our findings in larger clinical datasets.
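
    A minimal sketch of the two quantities the study relates, computed on synthetic placeholder ROIs (the actual feature set, regression model and acquisition settings are those described above, not reproduced here):

    ```python
    import numpy as np
    from scipy.stats import skew

    rng = np.random.default_rng(1)
    roi_raw = rng.normal(500.0, 25.0, size=(128, 128))      # flat-field-corrected raw ROI
    roi_processed = rng.normal(0.5, 0.1, size=(128, 128))   # postprocessed ROI

    snr = roi_raw.mean() / roi_raw.std(ddof=1)          # ROI signal-to-noise ratio
    texture_skewness = skew(roi_processed, axis=None)   # one candidate texture feature

    # The study regressed several such texture features (plus kV, target, filter)
    # against SNR, reporting R^2 up to 0.95.
    print(f"SNR = {snr:.1f}, skewness = {texture_skewness:.3f}")
    ```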

  19. Design of Distributed Cyber-Physical Systems for Connected and Automated Vehicles with Implementing Methodologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feng, Yixiong; Hu, Bingtao; Hao, He

    With the development of communication and control technology, intelligent transportation systems have received increasing attention from both industry and academia. Intelligent transportation systems are supported by the Internet of Things, Cyber-Physical Systems, Artificial Intelligence, Cloud Computing and many other technologies, which supply fundamental information for connected and automated vehicles. Although plenty of studies have provided different formulations for intelligent transportation systems, many of them depend on a Master Control Center. However, a centralized control mode requires a huge amount of data transmission and a high level of hardware configuration, and may cause communication delays and privacy leaks. Some distributed architectures have been proposed to overcome the above problems, but systematized technologies to collect and exchange information, process large amounts of data, model the dynamics of vehicles, and safely control the connected and automated vehicles are not explored in detail. In this paper, we propose a novel distributed cyber-physical system for connected and automated vehicles in which every vehicle is modeled as a double integrator using edge computing to analyze information collected from its nearest neighbors. The vehicles are supposed to travel along a desired trajectory and to maintain a rigid formation geometry. Related methodologies for the proposed system are illustrated and experiments are conducted showing that the performance of the connected and automated vehicles matches very well with analytic predictions. Some design guidelines and open questions are provided for future study.

  20. Design of Distributed Cyber-Physical Systems for Connected and Automated Vehicles with Implementing Methodologies

    DOE PAGES

    Feng, Yixiong; Hu, Bingtao; Hao, He; ...

    2018-02-14

    With the development of communication and control technology, intelligent transportation systems have received increasing attention from both industry and academia. Intelligent transportation systems are supported by the Internet of Things, Cyber-Physical Systems, Artificial Intelligence, Cloud Computing and many other technologies, which supply fundamental information for connected and automated vehicles. Although plenty of studies have provided different formulations for intelligent transportation systems, many of them depend on a Master Control Center. However, a centralized control mode requires a huge amount of data transmission and a high level of hardware configuration, and may cause communication delays and privacy leaks. Some distributed architectures have been proposed to overcome the above problems, but systematized technologies to collect and exchange information, process large amounts of data, model the dynamics of vehicles, and safely control the connected and automated vehicles are not explored in detail. In this paper, we propose a novel distributed cyber-physical system for connected and automated vehicles in which every vehicle is modeled as a double integrator using edge computing to analyze information collected from its nearest neighbors. The vehicles are supposed to travel along a desired trajectory and to maintain a rigid formation geometry. Related methodologies for the proposed system are illustrated and experiments are conducted showing that the performance of the connected and automated vehicles matches very well with analytic predictions. Some design guidelines and open questions are provided for future study.
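
    The two records above describe each vehicle as a double integrator driven by information from its nearest neighbors. The sketch below simulates a much simpler predecessor-following controller of that flavor; the gains, spacing and update law are assumptions for illustration, not the paper's control design.

    ```python
    import numpy as np

    N, dt, steps = 5, 0.05, 2000
    spacing, v_des = 10.0, 20.0        # desired gap [m] and platoon speed [m/s]
    kp, kd = 1.0, 2.0                  # assumed feedback gains

    x = np.arange(N)[::-1] * 12.0      # positions, lead vehicle first
    v = np.full(N, 18.0)               # initial velocities

    for _ in range(steps):
        a = np.zeros(N)
        a[0] = kd * (v_des - v[0])                       # leader tracks desired speed
        for i in range(1, N):                            # followers use neighbor data only
            gap_error = (x[i - 1] - x[i]) - spacing
            a[i] = kp * gap_error + kd * (v[i - 1] - v[i])
        v += a * dt                                      # double-integrator dynamics
        x += v * dt

    print(np.diff(x[::-1]))   # inter-vehicle gaps settle near the desired 10 m
    ```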

  1. Relative Panoramic Camera Position Estimation for Image-Based Virtual Reality Networks in Indoor Environments

    NASA Astrophysics Data System (ADS)

    Nakagawa, M.; Akano, K.; Kobayashi, T.; Sekiguchi, Y.

    2017-09-01

    Image-based virtual reality (VR) is a virtual space generated with panoramic images projected onto a primitive model. In imagebased VR, realistic VR scenes can be generated with lower rendering cost, and network data can be described as relationships among VR scenes. The camera network data are generated manually or by an automated procedure using camera position and rotation data. When panoramic images are acquired in indoor environments, network data should be generated without Global Navigation Satellite Systems (GNSS) positioning data. Thus, we focused on image-based VR generation using a panoramic camera in indoor environments. We propose a methodology to automate network data generation using panoramic images for an image-based VR space. We verified and evaluated our methodology through five experiments in indoor environments, including a corridor, elevator hall, room, and stairs. We confirmed that our methodology can automatically reconstruct network data using panoramic images for image-based VR in indoor environments without GNSS position data.

  2. A self-contained, automated methodology for optimal flow control validated for transition delay

    NASA Technical Reports Server (NTRS)

    Joslin, Ronald D.; Gunzburger, Max D.; Nicolaides, R. A.; Erlebacher, Gordon; Hussaini, M. Yousuff

    1995-01-01

    This paper describes a self-contained, automated methodology for flow control along with a validation of the methodology for the problem of boundary layer instability suppression. The objective of control is to match the stress vector along a portion of the boundary to a given vector; instability suppression is achieved by choosing the given vector to be that of a steady base flow, e.g., Blasius boundary layer. Control is effected through the injection or suction of fluid through a single orifice on the boundary. The present approach couples the time-dependent Navier-Stokes system with an adjoint Navier-Stokes system and optimality conditions from which optimal states, i.e., unsteady flow fields, and control, e.g., actuators, may be determined. The results demonstrate that instability suppression can be achieved without any a priori knowledge of the disturbance, which is significant because other control techniques have required some knowledge of the flow unsteadiness such as frequencies, instability type, etc.

  3. Trends in biomedical informatics: automated topic analysis of JAMIA articles.

    PubMed

    Han, Dong; Wang, Shuang; Jiang, Chao; Jiang, Xiaoqian; Kim, Hyeon-Eui; Sun, Jimeng; Ohno-Machado, Lucila

    2015-11-01

    Biomedical Informatics is a growing interdisciplinary field in which research topics and citation trends have been evolving rapidly in recent years. To analyze these data in a fast, reproducible manner, automation of certain processes is needed. JAMIA is a "generalist" journal for biomedical informatics. Its articles reflect the wide range of topics in informatics. In this study, we retrieved Medical Subject Headings (MeSH) terms and citations of JAMIA articles published between 2009 and 2014. We used tensors (i.e., multidimensional arrays) to represent the interaction among topics, time, and citations, and applied tensor decomposition to automate the analysis. The trends represented by tensors were then carefully interpreted and the results were compared with previous findings based on manual topic analysis. A list of most cited JAMIA articles, their topics, and publication trends over recent years is presented. The analyses confirmed previous studies and showed that, from 2012 to 2014, the number of articles related to MeSH terms Methods, Organization & Administration, and Algorithms increased significantly both in number of publications and citations. Citation trends varied widely by topic, with Natural Language Processing having a large number of citations in particular years, and Medical Record Systems, Computerized remaining a very popular topic in all years. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
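
    For readers who want to try the same kind of analysis, the sketch below builds a small synthetic topic x year x citation-lag count tensor and applies a CP (PARAFAC) decomposition, assuming the open-source tensorly library; the dimensions and data are placeholders, not the JAMIA dataset.

    ```python
    import numpy as np
    import tensorly as tl
    from tensorly.decomposition import parafac

    # Placeholder counts: 40 MeSH topics x 6 publication years x 5 citation lags
    rng = np.random.default_rng(0)
    counts = tl.tensor(rng.poisson(3.0, size=(40, 6, 5)).astype(float))

    # Rank-4 CP decomposition: each component couples a topic loading vector,
    # a publication-year profile, and a citation-lag profile.
    weights, factors = parafac(counts, rank=4, n_iter_max=200)
    topic_loadings, year_profiles, lag_profiles = factors
    print(topic_loadings.shape, year_profiles.shape, lag_profiles.shape)
    ```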

  4. Computerized Liver Volumetry on MRI by Using 3D Geodesic Active Contour Segmentation

    PubMed Central

    Huynh, Hieu Trung; Karademir, Ibrahim; Oto, Aytekin; Suzuki, Kenji

    2014-01-01

    OBJECTIVE Our purpose was to develop an accurate automated 3D liver segmentation scheme for measuring liver volumes on MRI. SUBJECTS AND METHODS Our scheme for MRI liver volumetry consisted of three main stages. First, the preprocessing stage was applied to T1-weighted MRI of the liver in the portal venous phase to reduce noise and produce the boundary-enhanced image. This boundary-enhanced image was used as a speed function for a 3D fast-marching algorithm to generate an initial surface that roughly approximated the shape of the liver. A 3D geodesic-active-contour segmentation algorithm refined the initial surface to precisely determine the liver boundaries. The liver volumes determined by our scheme were compared with those manually traced by a radiologist, used as the reference standard. RESULTS The two volumetric methods reached excellent agreement (intraclass correlation coefficient, 0.98) without statistical significance (p = 0.42). The average (± SD) accuracy was 99.4% ± 0.14%, and the average Dice overlap coefficient was 93.6% ± 1.7%. The mean processing time for our automated scheme was 1.03 ± 0.13 minutes, whereas that for manual volumetry was 24.0 ± 4.4 minutes (p < 0.001). CONCLUSION The MRI liver volumetry based on our automated scheme agreed excellently with reference-standard volumetry, and it required substantially less completion time. PMID:24370139

  5. Computerized liver volumetry on MRI by using 3D geodesic active contour segmentation.

    PubMed

    Huynh, Hieu Trung; Karademir, Ibrahim; Oto, Aytekin; Suzuki, Kenji

    2014-01-01

    Our purpose was to develop an accurate automated 3D liver segmentation scheme for measuring liver volumes on MRI. Our scheme for MRI liver volumetry consisted of three main stages. First, the preprocessing stage was applied to T1-weighted MRI of the liver in the portal venous phase to reduce noise and produce the boundary-enhanced image. This boundary-enhanced image was used as a speed function for a 3D fast-marching algorithm to generate an initial surface that roughly approximated the shape of the liver. A 3D geodesic-active-contour segmentation algorithm refined the initial surface to precisely determine the liver boundaries. The liver volumes determined by our scheme were compared with those manually traced by a radiologist, used as the reference standard. The two volumetric methods reached excellent agreement (intraclass correlation coefficient, 0.98) without statistical significance (p = 0.42). The average (± SD) accuracy was 99.4% ± 0.14%, and the average Dice overlap coefficient was 93.6% ± 1.7%. The mean processing time for our automated scheme was 1.03 ± 0.13 minutes, whereas that for manual volumetry was 24.0 ± 4.4 minutes (p < 0.001). The MRI liver volumetry based on our automated scheme agreed excellently with reference-standard volumetry, and it required substantially less completion time.

  6. Follow-up Methodology: A Comprehensive Study and Evaluation of Academic, Technical and Vocational Del Mar College Graduates from September 1, 1973, Through August 31, 1975, Including Ways, Means, Instruments, Relationships, and Methods of Follow-up. TEX-SIS FOLLOW-UP SC4.

    ERIC Educational Resources Information Center

    Fite, Ronald S.

    This report details the research activities conducted by Del Mar College, as a subcontractor of Project FOLLOW-UP, in the design, development, and implementation of a graduate follow-up system. The activities included questionnaire design, development of manual and computerized record-keeping systems, student-graduate identification, and…

  7. Embedded control system for computerized franking machine

    NASA Astrophysics Data System (ADS)

    Shi, W. M.; Zhang, L. B.; Xu, F.; Zhan, H. W.

    2007-12-01

    This paper presents a novel control system for a franking machine. A methodology for operating the franking machine through functional controls consisting of connection, configuration, and the franking electromechanical drive is studied. A set of enabling technologies to synthesize postage management software architectures for microprocessor-based embedded systems is proposed. The cryptographic algorithm that calculates mail items is analyzed to enhance the postal indicia accountability and security. The study indicated that the franking machine offers reliability, performance and flexibility in printing mail items.

  8. Knowledge Representation and Communication: Imparting Current State Information Flow to CPR Stakeholders

    PubMed Central

    de la Cruz, Norberto B.; Spiece, Leslie J.

    2000-01-01

    Understanding and communicating the who, what, where, when, why, and how of the clinics and services for which the computerized patient record (CPR) will be built is an integral part of the implementation process. Formal methodologies have been developed to diagram information flow -- flow charts, state-transition diagrams (STDs), and data flow diagrams (DFDs). For documentation of the processes at our ambulatory CPR pilot site, flowcharting was selected as the preferred method based upon its versatility and understandability.

  9. Can utilizing a computerized provider order entry (CPOE) system prevent hospital medical errors and adverse drug events?

    PubMed

    Charles, Krista; Cannon, Margaret; Hall, Robert; Coustasse, Alberto

    2014-01-01

    Computerized provider order entry (CPOE) systems allow physicians to prescribe patient services electronically. In hospitals, CPOE essentially eliminates the need for handwritten paper orders and achieves cost savings through increased efficiency. The purpose of this research study was to examine the benefits of and barriers to CPOE adoption in hospitals to determine the effects on medical errors and adverse drug events (ADEs) and examine cost and savings associated with the implementation of this newly mandated technology. This study followed a methodology using the basic principles of a systematic review and referenced 50 sources. CPOE systems in hospitals were found to be capable of reducing medical errors and ADEs, especially when CPOE systems are bundled with clinical decision support systems designed to alert physicians and other healthcare providers of pending lab or medical errors. However, CPOE systems face major barriers associated with adoption in a hospital system, mainly high implementation costs and physicians' resistance to change.

  10. Crewmember Performance Before, During, And After Spaceflight

    PubMed Central

    Kelly, Thomas H; Hienz, Robert D; Zarcone, Troy J; Wurster, Richard M; Brady, Joseph V

    2005-01-01

    The development of technologies for monitoring the welfare of crewmembers is a critical requirement for extended spaceflight. Behavior analytic methodologies provide a framework for studying the performance of individuals and groups, and brief computerized tests have been used successfully to examine the impairing effects of sleep, drug, and nutrition manipulations on human behavior. The purpose of the present study was to evaluate the feasibility and sensitivity of repeated performance testing during spaceflight. Four National Aeronautics and Space Administration crewmembers were trained to complete computerized questionnaires and performance tasks at repeated regular intervals before and after a 10-day shuttle mission and at times that interfered minimally with other mission activities during spaceflight. Two types of performance, Digit-Symbol Substitution trial completion rates and response times during the most complex Number Recognition trials, were altered slightly during spaceflight. All other dimensions of the performance tasks remained essentially unchanged over the course of the study. Verbal ratings of Fatigue increased slightly during spaceflight and decreased during the postflight test sessions. Arousal ratings increased during spaceflight and decreased postflight. No other consistent changes in rating-scale measures were observed over the course of the study. Crewmembers completed all mission requirements in an efficient manner with no indication of clinically significant behavioral impairment during the 10-day spaceflight. These results support the feasibility and utility of computerized task performances and questionnaire rating scales for repeated measurement of behavior during spaceflight. PMID:16262187

  11. Agreement between Computerized and Human Assessment of Performance on the Ruff Figural Fluency Test

    PubMed Central

    Elderson, Martin F.; Pham, Sander; van Eersel, Marlise E. A.; Wolffenbuttel, Bruce H. R.; Kok, Johan; Gansevoort, Ron T.; Tucha, Oliver; van der Klauw, Melanie M.; Slaets, Joris P. J.

    2016-01-01

    The Ruff Figural Fluency Test (RFFT) is a sensitive test for nonverbal fluency suitable for all age groups. However, assessment of performance on the RFFT is time-consuming and may be affected by interrater differences. Therefore, we developed computer software specifically designed to analyze performance on the RFFT by automated pattern recognition. The aim of this study was to compare assessment by the new software with conventional assessment by human raters. The software was developed using data from the Lifelines Cohort Study and validated in an independent cohort of the Prevention of Renal and Vascular End Stage Disease (PREVEND) study. The total study population included 1,761 persons: 54% men; mean age (SD), 58 (10) years. All RFFT protocols were assessed by the new software and two independent human raters (criterion standard). The mean number of unique designs (SD) was 81 (29) and the median number of perseverative errors (interquartile range) was 9 (4 to 16). The intraclass correlation coefficient (ICC) between the computerized and human assessment was 0.994 (95%CI, 0.988 to 0.996; p<0.001) and 0.991 (95%CI, 0.990 to 0.991; p<0.001) for the number of unique designs and perseverative errors, respectively. The mean difference (SD) between the computerized and human assessment was -1.42 (2.78) and +0.02 (1.94) points for the number of unique designs and perseverative errors, respectively. This was comparable to the agreement between two independent human assessments: ICC, 0.995 (0.994 to 0.995; p<0.001) and 0.985 (0.982 to 0.988; p<0.001), and mean difference (SD), -0.44 (2.98) and +0.56 (2.36) points for the number of unique designs and perseverative errors, respectively. We conclude that the agreement between the computerized and human assessment was very high and comparable to the agreement between two independent human assessments. Therefore, the software is an accurate tool for the assessment of performance on the RFFT. PMID:27661083
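
    The agreement statistic used above can be computed directly from a subjects-by-raters score matrix. Below is a minimal sketch of the two-way random, single-measure ICC(2,1) of Shrout and Fleiss, applied to hypothetical example scores (not the study's data):

    ```python
    import numpy as np

    def icc_2_1(scores):
        """ICC(2,1), two-way random effects, absolute agreement, single rater,
        for an (n_subjects x k_raters) matrix (Shrout & Fleiss)."""
        y = np.asarray(scores, dtype=float)
        n, k = y.shape
        grand = y.mean()
        ms_r = k * ((y.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # between subjects
        ms_c = n * ((y.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # between raters
        ss_e = ((y - grand) ** 2).sum() - (n - 1) * ms_r - (k - 1) * ms_c
        ms_e = ss_e / ((n - 1) * (k - 1))                            # residual
        return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

    # Hypothetical unique-design counts: software (col 0) vs. human rater (col 1)
    scores = [[82, 83], [47, 48], [110, 112], [65, 64], [90, 92], [73, 73]]
    print(round(icc_2_1(scores), 3))   # close to 1 for near-identical ratings
    ```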

  12. Lithography-based automation in the design of program defect masks

    NASA Astrophysics Data System (ADS)

    Vakanas, George P.; Munir, Saghir; Tejnil, Edita; Bald, Daniel J.; Nagpal, Rajesh

    2004-05-01

    In this work, we are reporting on a lithography-based methodology and automation in the design of Program Defect Masks (PDMs). Leading-edge technology masks have ever-shrinking primary features and more pronounced model-based secondary features such as optical proximity corrections (OPC), sub-resolution assist features (SRAFs) and phase-shifted mask (PSM) structures. In order to define defect disposition specifications for critical layers of a technology node, experience alone in deciding worst-case scenarios for the placement of program defects is necessary but may not be sufficient. MEEF calculations initiated from layout pattern data and their integration in a PDM layout flow provide a natural approach for improvements, relevance and accuracy in the placement of programmed defects. This methodology provides closed-loop feedback between layout and hard defect disposition specifications, thereby minimizing engineering test restarts, improving quality and reducing the cost of high-end masks. Apart from SEMI and industry standards, best-known methods (BKMs) in integrated lithographically-based layout methodologies and automation specific to PDMs are scarce. The contribution of this paper lies in the implementation of Design-For-Test (DFT) principles to a synergistic interaction of CAD Layout and an Aerial Image Simulator to drive layout improvements, highlight layout-to-fracture interactions and output accurate program defect placement coordinates to be used by tools in the mask shop.

  13. Automation or De-automation

    NASA Astrophysics Data System (ADS)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality due to reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  14. Studies of planning behavior of aircraft pilots in normal, abnormal and emergency situations

    NASA Technical Reports Server (NTRS)

    Johannsen, G.; Rouse, W. B.; Hillmann, K.

    1981-01-01

    A methodology for the study of planning is presented and the results of applying the methodology within two experimental investigations of planning behavior of aircraft pilots in normal, abnormal, and emergency situations are discussed. Beyond showing that the methodology yields consistent results, these experiments also lead to concepts in terms of a dichotomy between event driven and time driven planning, subtle effects of automation on planning, and the relationship of planning to workload and flight performance.

  15. Symposium on Automation, Robotics and Advanced Computing for the National Space Program (2nd) Held in Arlington, Virginia on 9-11 March 1987

    DTIC Science & Technology

    1988-02-28

    enormous investment in software. This is an extremely important objective. We need better methodologies, tools and theories... scanning electron microscopy (SEM) and optical microscopy. Current activities include the study of SEM images... [13] Hanson, A., et al., "A Methodology for the Development..." ... through a phased knowledge engineering methodology consisting of: prototype knowledge base development... NASA Ames Research Center (ARC) and NASA Johnson Space Center (JSC)

  16. Agility through Automated Negotiation for C2 Services

    DTIC Science & Technology

    2014-06-01

    using this e-contract negotiation methodology in a C2 context in Brazil. We have modeled the operations of the Rio de Janeiro Command Center that will be in place for the World Cup (2014)... through e-contracts. The scenario chosen to demonstrate this methodology is a security incident in Rio de Janeiro, host city of the next World Cup (2014

  17. The effect of JPEG compression on automated detection of microaneurysms in retinal images

    NASA Astrophysics Data System (ADS)

    Cree, M. J.; Jelinek, H. F.

    2008-02-01

    As JPEG compression at source is ubiquitous in retinal imaging, and the block artefacts it introduces are known to be of similar size to microaneurysms (an important indicator of diabetic retinopathy), it is prudent to evaluate the effect of JPEG compression on automated detection of retinal pathology. Retinal images were acquired at high quality and then compressed to various lower qualities. An automated microaneurysm detector was run on the retinal images at each JPEG quality level, and the ability to predict the presence of diabetic retinopathy from the detected microaneurysms was evaluated with receiver operating characteristic (ROC) methodology. A negative effect of JPEG compression on automated detection was observed even at levels of compression sometimes used in retinal eye-screening programmes, which may have important clinical implications for deciding on acceptable levels of compression for a fully automated eye-screening programme.
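
    A minimal sketch of the kind of evaluation loop described above: re-encode each retinal image at a chosen JPEG quality, score it with an automated microaneurysm detector, and summarise retinopathy prediction with ROC analysis. The helper detect_microaneurysm_count and the image/label lists are placeholders for illustration, not the detector or data used in the study.

      from io import BytesIO
      from PIL import Image
      from sklearn.metrics import roc_auc_score

      def recompress(img, quality):
          """Re-encode an image at the given JPEG quality and reload it."""
          buf = BytesIO()
          img.convert("RGB").save(buf, format="JPEG", quality=quality)
          buf.seek(0)
          return Image.open(buf)

      def detect_microaneurysm_count(img):
          """Placeholder for an automated microaneurysm detector."""
          raise NotImplementedError

      def auc_at_quality(images, labels, quality):
          # Score each retina by its detected microaneurysm count, then measure
          # how well that score separates retinopathy from non-retinopathy cases.
          scores = [detect_microaneurysm_count(recompress(im, quality)) for im in images]
          return roc_auc_score(labels, scores)

      # for q in (100, 90, 75, 50):
      #     print(q, auc_at_quality(images, labels, q))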

  18. Study of the impact of automation on productivity in bus-maintenance facilities. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sumanth, D.J.; Weiss, H.J.; Adya, B.

    1988-12-01

    Whether or not the various types of automation and new technologies introduced in a bus-transit system really have an impact on productivity is the question addressed in the study. The report describes a new procedure of productivity measurement and evaluation for a county-transit system and provides an objective perspective on the impact of automation on productivity in bus maintenance facilities. The research objectives were: to study the impact of automation on total productivity in transit maintenance facilities; to develop and apply a methodology for measuring the total productivity of a Floridian transit maintenance facility (the Bradenton-Manatee County bus maintenance facility, which has been introducing automation since 1983); and to develop a practical step-by-step implementation scheme for the total productivity-based productivity measurement system that any bus manager can use. All 3 objectives were successfully accomplished.
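
    As a hedged illustration of the total-productivity idea referred to above, the sketch below divides total output value by the sum of all input costs for a period and compares two periods. The input categories and figures are invented for the example and are not the study's data.

      def total_productivity(output_value, inputs):
          """Total productivity = total output value / sum of all input costs."""
          return output_value / sum(inputs.values())

      # Hypothetical before/after-automation figures for one accounting period.
      before = total_productivity(1_200_000, {"labor": 500_000, "materials": 200_000,
                                              "capital": 250_000, "energy": 50_000})
      after = total_productivity(1_250_000, {"labor": 450_000, "materials": 200_000,
                                             "capital": 300_000, "energy": 50_000})
      print(before, after, (after - before) / before)   # fractional change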

  19. The problem of resonance in technology usage

    NASA Technical Reports Server (NTRS)

    Sayani, H. H.; Svoboda, C. P.

    1981-01-01

    Various information system tools and techniques are analyzed. A case study is presented which draws together the issues raised in three distinct cases. This case study shows a typical progression from the selection of an analysis methodology, to the adoption of an automated tool for specification and documentation, and finally to the difficulty of fitting these into an existing life-cycle development methodology.

  20. Human-Automation Interaction Design for Adaptive Cruise Control Systems of Ground Vehicles.

    PubMed

    Eom, Hwisoo; Lee, Sang Hun

    2015-06-12

    A majority of recently developed advanced vehicles have been equipped with various automated driver assistance systems, such as adaptive cruise control (ACC) and lane keeping assistance systems. ACC systems have several operational modes, and drivers can be unaware of the mode in which they are operating. Because mode confusion is a significant human error factor that contributes to traffic accidents, it is necessary to develop user interfaces for ACC systems that can reduce mode confusion. To meet this requirement, this paper presents a new human-automation interaction design methodology in which the compatibility of the machine and interface models is determined using the proposed criteria, and if the models are incompatible, one or both of the models is/are modified to make them compatible. To investigate the effectiveness of our methodology, we designed two new interfaces by separately modifying the machine model and the interface model and then performed driver-in-the-loop experiments. The results showed that modifying the machine model provides a more compact, acceptable, effective, and safe interface than modifying the interface model.
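
    One way to picture the compatibility check described above is to ask whether, for every machine-model transition, the displayed (abstracted) mode changes exactly as the interface model predicts. The sketch below does this for a toy ACC example; the mode names, events, and transition tables are invented for illustration and are not the models from the paper.

      # machine model: (mode, event) -> next mode
      MACHINE = {
          ("ACC_SPEED", "lead_vehicle_detected"): "ACC_FOLLOW",
          ("ACC_FOLLOW", "lead_vehicle_lost"): "ACC_SPEED",
          ("ACC_FOLLOW", "brake_pressed"): "OFF",
          ("ACC_SPEED", "brake_pressed"): "OFF",
      }
      # abstraction: machine mode -> what the driver interface displays
      DISPLAY = {"ACC_SPEED": "ACC ON", "ACC_FOLLOW": "ACC ON", "OFF": "ACC OFF"}
      # interface model: (displayed mode, event) -> expected displayed mode
      INTERFACE = {
          ("ACC ON", "lead_vehicle_detected"): "ACC ON",
          ("ACC ON", "lead_vehicle_lost"): "ACC ON",
          ("ACC ON", "brake_pressed"): "ACC OFF",
      }

      def incompatibilities():
          """Machine transitions whose displayed effect the interface model mispredicts."""
          errors = []
          for (mode, event), nxt in MACHINE.items():
              predicted = INTERFACE.get((DISPLAY[mode], event))
              if predicted != DISPLAY[nxt]:
                  errors.append((mode, event, DISPLAY[nxt], predicted))
          return errors

      print(incompatibilities())   # empty list => no mode-confusion mismatch found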

  1. Human-Automation Interaction Design for Adaptive Cruise Control Systems of Ground Vehicles

    PubMed Central

    Eom, Hwisoo; Lee, Sang Hun

    2015-01-01

    A majority of recently developed advanced vehicles have been equipped with various automated driver assistance systems, such as adaptive cruise control (ACC) and lane keeping assistance systems. ACC systems have several operational modes, and drivers can be unaware of the mode in which they are operating. Because mode confusion is a significant human error factor that contributes to traffic accidents, it is necessary to develop user interfaces for ACC systems that can reduce mode confusion. To meet this requirement, this paper presents a new human-automation interaction design methodology in which the compatibility of the machine and interface models is determined using the proposed criteria, and if the models are incompatible, one or both of the models is/are modified to make them compatible. To investigate the effectiveness of our methodology, we designed two new interfaces by separately modifying the machine model and the interface model and then performed driver-in-the-loop experiments. The results showed that modifying the machine model provides a more compact, acceptable, effective, and safe interface than modifying the interface model. PMID:26076406

  2. Capturing patient information at nursing shift changes: methodological evaluation of speech recognition and information extraction

    PubMed Central

    Suominen, Hanna; Johnson, Maree; Zhou, Liyuan; Sanchez, Paula; Sirel, Raul; Basilakis, Jim; Hanlen, Leif; Estival, Dominique; Dawson, Linda; Kelly, Barbara

    2015-01-01

    Objective We study the use of speech recognition and information extraction to generate drafts of Australian nursing-handover documents. Methods Speech recognition correctness and clinicians' preferences were evaluated using 15 recorder–microphone combinations, six documents, three speakers, Dragon Medical 11, and five survey/interview participants. Information extraction correctness evaluation used 260 documents, six-class classification for each word, two annotators, and the CRF++ conditional random field toolkit. Results A noise-cancelling lapel-microphone with a digital voice recorder gave the best correctness (79%). This microphone was also the most preferred option by all but one participant. Although the participants liked the small size of this recorder, their preference was for tablets that can also be used for document proofing and sign-off, among other tasks. Accented speech was harder to recognize than native language and a male speaker was detected better than a female speaker. Information extraction was excellent in filtering out irrelevant text (85% F1) and identifying text relevant to two classes (87% and 70% F1). Similarly to the annotators' disagreements, there was confusion between the remaining three classes, which explains the modest 62% macro-averaged F1. Discussion We present evidence for the feasibility of speech recognition and information extraction to support clinicians in entering text and to unlock its content for computerized decision-making and surveillance in healthcare. Conclusions The benefits of this automation include storing all information; making the drafts available and accessible almost instantly to everyone with authorized access; and avoiding information loss, delays, and misinterpretations inherent to using a ward clerk or transcription services. PMID:25336589

  3. Computerized methodology for micro-CT and histological data inflation using an IVUS based translation map.

    PubMed

    Athanasiou, Lambros S; Rigas, George A; Sakellarios, Antonis I; Exarchos, Themis P; Siogkas, Panagiotis K; Naka, Katerina K; Panetta, Daniele; Pelosi, Gualtiero; Vozzi, Federico; Michalis, Lampros K; Parodi, Oberdan; Fotiadis, Dimitrios I

    2015-10-01

    A framework for the inflation of micro-CT and histology data using intravascular ultrasound (IVUS) images is presented. The proposed methodology consists of three steps. In the first step the micro-CT/histological images are manually co-registered with IVUS by experts using fiducial points as landmarks. In the second step the lumen is automatically segmented in both the micro-CT/histological images and the IVUS images. Finally, in the third step the micro-CT/histological images are inflated by applying a transformation method to each image. The transformation method is based on the difference between the IVUS and micro-CT/histological contours. In order to validate the proposed image inflation methodology, plaque areas in the inflated micro-CT and histological images are compared with the ones in the IVUS images. The proposed methodology for inflating micro-CT/histological images increases the sensitivity of plaque area matching between the inflated and the IVUS images (7% and 22% in histological and micro-CT images, respectively). Copyright © 2015 Elsevier Ltd. All rights reserved.
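
    A simplified sketch of a contour-difference-based "inflation" step in the spirit of the third stage above: each pixel of the micro-CT/histological image is radially rescaled about the lumen centroid so that the segmented histology lumen contour maps onto the IVUS lumen contour. The radial-scaling formulation is a simplification for illustration, not the paper's exact transformation.

      import numpy as np
      from scipy.ndimage import map_coordinates

      def radius_profile(contour_xy, center, angles):
          """Interpolate contour radius as a function of polar angle about `center`."""
          d = contour_xy - center
          theta = np.arctan2(d[:, 1], d[:, 0])
          r = np.hypot(d[:, 0], d[:, 1])
          return np.interp(angles, theta, r, period=2 * np.pi)

      def inflate_to_ivus(image, histo_contour, ivus_contour, center):
          """Warp `image` so its lumen contour matches the IVUS lumen contour."""
          h, w = image.shape
          yy, xx = np.mgrid[0:h, 0:w].astype(float)
          dx, dy = xx - center[0], yy - center[1]
          theta = np.arctan2(dy, dx)
          r_h = radius_profile(histo_contour, center, theta.ravel()).reshape(h, w)
          r_i = radius_profile(ivus_contour, center, theta.ravel()).reshape(h, w)
          scale = np.where(r_i > 0, r_h / r_i, 1.0)   # inverse map: output -> input
          src_x = center[0] + dx * scale
          src_y = center[1] + dy * scale
          return map_coordinates(image, [src_y, src_x], order=1, mode="nearest")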

  4. A Systematic Approach to Optimize Organizations Operating in Uncertain Environments: Design Methodology and Applications

    DTIC Science & Technology

    2002-09-01

    Fragmentary excerpt: a sub-goal can lead to achieving different goals (e.g., automation of on-line order processing may lead to both reducing the storage cost and reducing [...]). Example actions listed include: [...] equipment; introduce new technology; find cheaper supplier; sign a contract; introduce cheaper materials; set up and automate on-line order processing; integrate order processing with inventory and shipping; set up company's website; freight consolidation; just-in-time versus pre-planned balance.

  5. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform

    DTIC Science & Technology

    2018-01-01

    Fragmentary excerpt (ARL-TR-8270, US Army Research Laboratory, January 2018): these statistical techniques on the collected data are under the area of descriptive statistics, which is a methodology to condense the data in quantitative form.

  6. Improving Grid Resilience through Informed Decision-making (IGRID)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burnham, Laurie; Stamber, Kevin L.; Jeffers, Robert Fredric

    The transformation of the distribution grid from a centralized to decentralized architecture, with bi-directional power and data flows, is made possible by a surge in network intelligence and grid automation. While changes are largely beneficial, the interface between grid operator and automated technologies is not well understood, nor are the benefits and risks of automation. Quantifying and understanding the latter is an important facet of grid resilience that needs to be fully investigated. The work described in this document represents the first empirical study aimed at identifying and mitigating the vulnerabilities posed by automation for a grid that for the foreseeable future will remain a human-in-the-loop critical infrastructure. Our scenario-based methodology enabled us to conduct a series of experimental studies to identify causal relationships between grid-operator performance and automated technologies and to collect measurements of human performance as a function of automation. Our findings, though preliminary, suggest there are predictive patterns in the interplay between human operators and automation, patterns that can inform the rollout of distribution automation and the hiring and training of operators, and contribute in multiple and significant ways to the field of grid resilience.

  7. Advanced automation for in-space vehicle processing

    NASA Technical Reports Server (NTRS)

    Sklar, Michael; Wegerif, D.

    1990-01-01

    The primary objective of this 3-year planned study is to assure that the fully evolved Space Station Freedom (SSF) can support automated processing of exploratory mission vehicles. Current study assessments show that the extravehicular activity (EVA) and, to some extent, intravehicular activity (IVA) manpower requirements for the required processing tasks far exceed the available manpower. Furthermore, many processing tasks are either hazardous operations or they exceed EVA capability. Thus, automation is essential for SSF transportation node functionality. Here, advanced automation represents the replacement of human-performed tasks beyond the planned baseline automated tasks. Both physical tasks such as manipulation, assembly and actuation, and cognitive tasks such as visual inspection, monitoring and diagnosis, and task planning are considered. During this first year of activity both the Phobos/Gateway Mars Expedition and Lunar Evolution missions proposed by the Office of Exploration have been evaluated. A methodology for choosing optimal tasks to be automated has been developed. Processing tasks for both missions have been ranked on the basis of automation potential. The underlying concept in evaluating and describing processing tasks has been the use of a common set of 'Primitive' task descriptions. Primitive or standard tasks have been developed both for manual (crew) processing and for automated machine processing.

  8. The development of a Flight Test Engineer's Workstation for the Automated Flight Test Management System

    NASA Technical Reports Server (NTRS)

    Tartt, David M.; Hewett, Marle D.; Duke, Eugene L.; Cooper, James A.; Brumbaugh, Randal W.

    1989-01-01

    The Automated Flight Test Management System (ATMS) is being developed as part of the NASA Aircraft Automation Program. This program focuses on the application of interdisciplinary state-of-the-art technology in artificial intelligence, control theory, and systems methodology to problems of operating and flight testing high-performance aircraft. The development of a Flight Test Engineer's Workstation (FTEWS) is presented, with a detailed description of the system, technical details, and future planned developments. The goal of the FTEWS is to provide flight test engineers and project officers with an automated computer environment for planning, scheduling, and performing flight test programs. The FTEWS system is an outgrowth of the development of ATMS and is an implementation of a component of ATMS on SUN workstations.

  9. NASA Out-of-Autoclave Process Technology Development

    NASA Technical Reports Server (NTRS)

    Johnston, Norman, J.; Clinton, R. G., Jr.; McMahon, William M.

    2000-01-01

    Polymer matrix composites (PMCs) will play a significant role in the construction of large reusable launch vehicles (RLVs), mankind's future major access to low Earth orbit and the International Space Station. PMCs are lightweight and offer attractive economies of scale and automated fabrication methodology. Fabrication of large RLV structures will require non-autoclave methods that have yet to be matured, including (1) thermoplastic forming: heated-head robotic tape placement, sheet extrusion, pultrusion, molding and forming; (2) electron beam curing: bulk and ply-by-ply automated placement; and (3) RTM and VARTM. Research sponsored by NASA in industrial and NASA laboratories on automated placement techniques involving the first two categories will be presented.

  10. [An educational software development proposal for nursing in neonatal cardiopulmonary resuscitation].

    PubMed

    Rodrigues, Rita de Cassia Vieira; Peres, Heloisa Helena Ciqueto

    2013-02-01

    The objective of this study was to develop an educational software program for nursing continuing education. It was an applied methodological research study that used the learning management system methodology created by Galvis Panqueva in association with contextualized instructional design for the software design. As a result of this study, we created a computerized educational product (CEP) called ENFNET, and we describe all the necessary steps taken during its development. The creation of a CEP demands a great deal of study, dedication and investment, as well as specialized technical personnel to construct it. At the end of the study, the software was positively evaluated and shown to be a useful strategy to help users in their education, skills development and professional training.

  11. Evaluation of ionic liquids as alternative solvents for aldolase activity: Use of a new automated SIA methodology.

    PubMed

    Cunha, Edite; Pinto, Paula C A G; Saraiva, M Lúcia M F S

    2015-08-15

    An automated methodology is proposed for the evaluation of a set of ionic liquids (ILs) as alternative reaction media for aldolase-based synthetic processes. To this end, the effect of traditionally used organic solvents and ILs on the activity of aldolase was studied by means of a novel automated methodology. The implemented methodology is based on the concept of sequential injection analysis (SIA) and relies on the aldolase-based cleavage of d-fructose-1,6-diphosphate (DFDP) to produce dihydroxyacetone phosphate (DHAP) and d-glyceraldehyde-3-phosphate (G3P). In the presence of FeCl3, 3-methyl-2-benzothiazolinone hydrazone (MBTH) forms, by combination with G3P, a blue cation that can be measured at 670 nm. The influence of several parameters such as substrate and enzyme concentration, temperature, delay time, and MBTH and FeCl3 concentration was studied and the optimum reaction conditions were subsequently selected. The developed methodology showed good precision, with a relative standard deviation (rsd) that does not exceed 7%, and led to low reagent consumption as well as low effluent production. Using this strategy, the activity of the enzyme was studied in strictly aqueous media and in the presence of dimethylformamide, methanol, bmpyr [Cl], hmim [Cl], bmim [BF4], emim [BF4], emim [Ac], bmim [Cl], emim [TfMs], emim [Ms] and Chol [Ac] up to 50%. The results show that the utilization of ILs as reaction media for aldolase-based organic synthesis might present potential advantages over the tested conventional organic solvents. The least toxic IL found in this study was Chol [Ac], which caused a reduction in enzyme activity of only 2.7% when used at a concentration of 50%. Generally, it can be concluded that ILs based on choline or short alkyl imidazolium moieties associated with biocompatible anions are the most promising ILs regarding the future inclusion of these solvents in synthetic protocols catalyzed by aldolase. Copyright © 2015 Elsevier B.V. All rights reserved.
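
    To make the reported quantities concrete, the sketch below expresses aldolase activity in a co-solvent as a percentage of the aqueous control and computes the relative standard deviation (rsd) of replicate absorbance readings at 670 nm. The replicate values are hypothetical and are not the study's measurements.

      import statistics

      def relative_activity(replicates_sample, replicates_control):
          """Return (residual activity %, rsd %) for one co-solvent condition."""
          mean_s = statistics.mean(replicates_sample)
          mean_c = statistics.mean(replicates_control)
          rsd = 100 * statistics.stdev(replicates_sample) / mean_s
          return 100 * mean_s / mean_c, rsd

      aqueous = [0.412, 0.405, 0.418]          # hypothetical absorbance values at 670 nm
      with_chol_ac = [0.401, 0.396, 0.408]     # hypothetical values with 50% Chol [Ac]
      activity_pct, rsd_pct = relative_activity(with_chol_ac, aqueous)
      print(f"residual activity {activity_pct:.1f}%, rsd {rsd_pct:.1f}%")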

  12. Computerized symptom and quality-of-life assessment for patients with cancer part I: development and pilot testing.

    PubMed

    Berry, Donna L; Trigg, Lisa J; Lober, William B; Karras, Bryant T; Galligan, Mary L; Austin-Seymour, Mary; Martin, Stephanie

    2004-09-01

    To develop and test an innovative computerized symptom and quality-of-life (QOL) assessment for patients with cancer who are evaluated for and treated with radiation therapy. Descriptive, longitudinal prototype development and cross-sectional clinical data. Department of radiation oncology in an urban, academic medical center. 101 outpatients who were evaluated for radiation therapy, able to communicate in English (or through one of many interpreters available at the University of Washington), and competent to understand the study information and give informed consent. Six clinicians caring for the patients in the sample were enrolled. Iterative prototype development was conducted using a standing focus group of clinicians. The software was developed based on survey markup language and implemented in a wireless, Web-based format. Patient participants completed the computerized assessment prior to consultation with the radiation physician. Graphical output pages with flagged areas of symptom distress or troublesome QOL issues were made available to consulting physicians and nurses. Pain intensity, symptoms, QOL, and demographics. Computerized versions of a 0 to 10 Pain Intensity Numerical Scale (PINS), Symptom Distress Scale, and Short Form-8. Focus group recommendations included clinician priorities of brevity, flexibility, and simplicity for both input interface and output and that the assessment output contain color graphic display. Patient participants included 45 women and 56 men with a mean age of 52.7 years (SD = 13.8). Fewer than half of the participants (40%) reported using a computer on a regular basis (weekly or daily). Completion time averaged 7.8 minutes (SD = 3.7). Moderate to high levels of distress were reported more often for fatigue, pain, and emotional issues than for other symptoms or concerns. Computerized assessment of cancer symptoms and QOL is technically possible and feasible in an ambulatory cancer clinic. A wireless, Web-based system facilitates access to results and data entry and retrieval. The symptom and QOL profiles of these patients new to radiation therapy were comparable to other samples of outpatients with cancer. The ability to capture an easily interpreted illustration of a patient's symptom and QOL experience in less than 10 minutes is a potentially useful adjunct to traditional face-to-face interviewing. Ultimately, electronic patient-generated data could produce automated red flags directed to the most appropriate clinicians (e.g., nurse, pain specialist, social worker, nutritionist) for further evaluation. Such system enhancement could greatly facilitate oncology nurses' coordination role in caring for complex patients with cancer.
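
    A minimal sketch of the flagging step described above: scan patient-reported items and flag any at or above a distress threshold, suggesting a recipient for follow-up. The item names, threshold, and routing table are assumptions for illustration, not the clinic's actual rules.

      FLAG_THRESHOLD = 4          # e.g., moderate-or-worse distress on a 0-10 scale
      ROUTING = {"pain": "pain specialist", "emotional distress": "social worker"}

      def flag_responses(responses):
          """Return (item, suggested recipient) pairs for items above threshold."""
          flags = []
          for item, score in responses.items():
              if score >= FLAG_THRESHOLD:
                  flags.append((item, ROUTING.get(item, "nurse")))
          return flags

      print(flag_responses({"pain": 6, "fatigue": 3, "emotional distress": 5}))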

  13. Managing mapping data using commercial data base management software.

    USGS Publications Warehouse

    Elassal, A.A.

    1985-01-01

    Electronic computers are involved in almost every aspect of the map making process. This involvement has become so thorough that it is practically impossible to find a recently developed process or device in the mapping field which does not employ digital processing in some form or another. This trend, which has been evolving over two decades, is accelerated by the significant improvements in capability, reliability, and cost-effectiveness of electronic devices. Computerized mapping processes and devices share a common need for machine-readable data. Integrating groups of these components into automated mapping systems requires careful planning for data flow amongst them. Exploring the utility of commercial data base management software to assist in this task is the subject of this paper.

  14. Smart manufacturing of complex shaped pipe components

    NASA Astrophysics Data System (ADS)

    Salchak, Y. A.; Kotelnikov, A. A.; Sednev, D. A.; Borikov, V. N.

    2018-03-01

    The manufacturing industry is constantly improving, and the most relevant current trend is widespread automation and optimization of the production process. This paper presents a novel approach to smart manufacturing of steel pipe valves. The system includes two main parts: a mechanical treatment unit and a quality assurance unit. Mechanical treatment is performed by a milling machine with computerized numerical control, whilst the quality assurance unit contains three testing modules for different tasks: X-ray testing, optical scanning, and ultrasound testing. Together these modules provide reliable results containing information about any failures in the technological process and any deviations in the geometrical parameters of the valves. The system also allows defects to be detected on the surface or in the inner structure of the component.

  15. Computerizing an integrated clinical and financial record system in a CMHC: a pilot project.

    PubMed

    Newkham, J; Bawcom, L

    1981-01-01

    The authors describe the three-year experience of a mid-sized community mental health center in designing and installing an automated Staff/Management Information System (S/MIS). The purpose of the project, piloted at the Heart of Texas Region Mental Health Mental Retardation Center (HOTRMHMR) in Waco, Texas, was to examine the feasibility of a comprehensive data system operating at a local level which would create an effective audit trail for services and reimbursement and serve as a viable mechanism for the transmission of center data to a state system via computer tapes. Included in the discussion are agency philosophy, costs, management attitudes, the design and implementation process, and special features which evolved from the fully integrated system.

  16. Thermoelectric property measurements with computer controlled systems

    NASA Technical Reports Server (NTRS)

    Chmielewski, A. B.; Wood, C.

    1984-01-01

    A joint JPL-NASA program to develop an automated system to measure the thermoelectric properties of newly developed materials is described. Consideration is given to the difficulties created by signal drift in measurements of Hall voltage and the Large Delta T Seebeck coefficient. The benefits of a computerized system were examined with respect to error reduction and time savings for human operators. It is shown that the time required to measure Hall voltage can be reduced by a factor of 10 when a computer is used to fit a curve to the ratio of the measured signal and its standard deviation. The accuracy of measurements of the Large Delta T Seebeck coefficient and thermal diffusivity was also enhanced by the use of computers.

  17. DORMAN computer program (study 2.5). Volume 1: Executive summary. [development of data bank for computerized information storage of NASA programs

    NASA Technical Reports Server (NTRS)

    Stricker, L. T.

    1973-01-01

    The DORCA Applications study has been directed at development of a data bank management computer program identified as DORMAN. Because of the size of the DORCA data files and the manipulations required on that data to support analyses with the DORCA program, automated data techniques to replace time-consuming manual input generation are required. The Dynamic Operations Requirements and Cost Analysis (DORCA) program was developed for use by NASA in planning future space programs. Both programs are designed for implementation on the UNIVAC 1108 computing system. The purpose of this Executive Summary Report is to define for the NASA management the basic functions of the DORMAN program and its capabilities.

  18. Pigment network-based skin cancer detection.

    PubMed

    Alfed, Naser; Khelifi, Fouad; Bouridane, Ahmed; Seker, Huseyin

    2015-08-01

    Diagnosing skin cancer in its early stages is a challenging task for dermatologists; because the chance of a patient's survival is higher when the disease is caught early, the process of analyzing skin images and making decisions should be time efficient. Therefore, diagnosing the disease using automated and computerized systems has become essential. This paper proposes an efficient system for skin cancer detection in dermoscopic images. It is shown that the statistical characteristics of the pigment network, extracted from the dermoscopic image, can be used as efficient discriminating features for cancer detection. The proposed system has been assessed on a dataset of 200 dermoscopic images from the 'Hospital Pedro Hispano' [1], and the results of cross-validation show high detection accuracy.
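
    The evaluation pattern described above can be sketched as feature extraction followed by cross-validated classification, as below. The helper extract_pigment_features is a placeholder for the pigment-network statistics, and the classifier choice and fold count are illustrative assumptions rather than the authors' exact setup.

      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import SVC

      def extract_pigment_features(image):
          """Placeholder: return e.g. mean/variance/density of the pigment network."""
          raise NotImplementedError

      def evaluate(images, labels, folds=10):
          # Build a feature matrix from the pigment-network statistics of each lesion,
          # then estimate detection accuracy with k-fold cross-validation.
          X = np.vstack([extract_pigment_features(im) for im in images])
          y = np.asarray(labels)
          scores = cross_val_score(SVC(kernel="rbf"), X, y, cv=folds)
          return scores.mean(), scores.std()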

  19. Costing for the Future: Exploring Cost Estimation With Unmanned Autonomous Systems

    DTIC Science & Technology

    2016-04-30

    Fragmentary excerpts: the report aims to account for how cost estimating for autonomy is different from current methodologies and to suggest ways it can be addressed through integration [...]; the Development stage involves refining the system requirements, creating a solution description, and building a system, followed by the Operational Test stage [...]; and one parameter describes the extent to which efficient fabrication methodologies and processes are used, and the automation of labor-intensive operations.

  20. Applications of pathology-assisted image analysis of immunohistochemistry-based biomarkers in oncology.

    PubMed

    Shinde, V; Burke, K E; Chakravarty, A; Fleming, M; McDonald, A A; Berger, A; Ecsedy, J; Blakemore, S J; Tirrell, S M; Bowman, D

    2014-01-01

    Immunohistochemistry-based biomarkers are commonly used to understand target inhibition in key cancer pathways in preclinical models and clinical studies. Automated slide-scanning and advanced high-throughput image analysis software technologies have evolved into a routine methodology for quantitative analysis of immunohistochemistry-based biomarkers. Alongside the traditional pathology H-score based on physical slides, the pathology world is welcoming digital pathology and advanced quantitative image analysis, which have enabled tissue- and cellular-level analysis. An automated workflow was implemented that includes automated staining, slide-scanning, and image analysis methodologies to explore biomarkers involved in 2 cancer targets: Aurora A and NEDD8-activating enzyme (NAE). The 2 workflows highlight the evolution of our immunohistochemistry laboratory and the different needs and requirements of each biological assay. Skin biopsies obtained from MLN8237 (Aurora A inhibitor) phase 1 clinical trials were evaluated for mitotic and apoptotic index, while mitotic index and defects in chromosome alignment and spindles were assessed in tumor biopsies to demonstrate Aurora A inhibition. Additionally, in both preclinical xenograft models and an acute myeloid leukemia phase 1 trial of the NAE inhibitor MLN4924, development of a novel image algorithm enabled measurement of downstream pathway modulation upon NAE inhibition. In the highlighted studies, developing a biomarker strategy based on automated image analysis solutions enabled project teams to confirm target and pathway inhibition and understand downstream outcomes of target inhibition with increased throughput and quantitative accuracy. These case studies demonstrate a strategy that combines a pathologist's expertise with automated image analysis to support oncology drug discovery and development programs.

  1. [The application of new technologies to hospital pharmacy in Spain].

    PubMed

    Bermejo Vicedo, T; Pérez Menéndez Conde, C; Alvarez, Ana; Codina, Carlos; Delgado, Olga; Herranz, Ana; Hidalgo Correas, Francisco; Martín, Isabel; Martínez, Julio; Luis Poveda, José; Queralt Gorgas, María; Sanjurjo Sáez, María

    2007-01-01

    To describe the degree of introduction of new technologies in the medication use process in pharmacy services in Spain. A descriptive study via a survey into the degree of introduction of computer systems for: management, computerized physician order entry (CPOE), automated unit dose drug dispensing, preparation of parenteral nutrition solutions, recording drug administration, pharmaceutical care and foreseen improvements. The survey was sent by electronic mail to the heads of the pharmacy services of 207 hospitals throughout Spain. Response rate: 82 hospitals (38.6%). Twenty-nine hospitals (36.7%) have a modular management system, 24 (30.4%) an integrated one and 34 (44.9%) a modular-integrated one. CPOE is utilised in 17 (22.4%). According to the size of the hospital, between 17.9 and 26.7% of unit dose dispensing is done online with management software; between 5.1 and 33.3% of unit dose dispensing is automated. Automation of unit dose dispensing centred in the pharmacy service varies between 10 and 33.3%. Between 13.2 and 35.7% of automated in-ward dispensing systems are utilised. Administration records are kept manually on a computerised sheet at 23 (31.5%) of the hospitals; at 4 (5.4%) on CPOE, 7 (9.5%) online on the integral management programme and 4 (5.4%) in specific nursing software. Sixty-three per cent foresee the implementation of improvements in the short to medium term. The introduction of new technologies is being developed in Spain with the aim of improving the safety and management of drugs, and there is a trend towards increasing their deployment in the near future. It is hoped that their promotion could help to bring about process reengineering within pharmacy services in order to increase the time available to devote to pharmaceutical care.

  2. Automated sizing of large structures by mixed optimization methods

    NASA Technical Reports Server (NTRS)

    Sobieszczanski, J.; Loendorf, D.

    1973-01-01

    A procedure for automating the sizing of wing-fuselage airframes was developed and implemented in the form of an operational program. The program combines fully stressed design to determine an overall material distribution with mass-strength and mathematical programming methods to design structural details, accounting for realistic design constraints. The practicality and efficiency of the procedure are demonstrated for transport aircraft configurations. The methodology is sufficiently general to be applicable to other large and complex structures.

  3. A Performance Measurement and Implementation Methodology in a Department of Defense CIM (Computer Integrated Manufacturing) Environment

    DTIC Science & Technology

    1988-01-24

    Fragmentary excerpts: the new facility is currently being called the Engine Blade/Vane Facility (EB/VF), and there are three primary goals in automating this process [...]; the literature search led primarily into the areas of CIM justification, automation strategies, performance measurement, and integration issues [...]; and the [...] of living has been steadily eroding, and one dangerous trend that has developed in keenly competitive world markets, says Rohan [33], has been for U.S. [...]

  4. 19 CFR 111.23 - Retention of records.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... bonded warehouse, records relating to the withdrawal must be retained for 5 years from the date of... consolidated location, the methodology of record maintenance, a description of any automated data processing to...

  5. 19 CFR 111.23 - Retention of records.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... bonded warehouse, records relating to the withdrawal must be retained for 5 years from the date of... consolidated location, the methodology of record maintenance, a description of any automated data processing to...

  6. 19 CFR 111.23 - Retention of records.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... bonded warehouse, records relating to the withdrawal must be retained for 5 years from the date of... consolidated location, the methodology of record maintenance, a description of any automated data processing to...

  7. Information security system quality assessment through the intelligent tools

    NASA Astrophysics Data System (ADS)

    Trapeznikov, E. V.

    2018-04-01

    The development of technology has shown the necessity of comprehensive analysis of the information security of automated systems. Analysis of the subject area indicates the relevance of the study. The research objective is to develop a methodology for assessing information security system quality based on intelligent tools. The basis of the methodology is a model that assesses the information security of the information system through a neural network. The paper presents the security assessment model and its algorithm. The results of the practical implementation of the methodology are presented in the form of a software flow diagram. The practical significance of the model being developed is noted in the conclusions.
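
    As a hedged illustration of the neural-network assessment idea, the sketch below trains a small regressor to map a vector of audit features to a security quality score. The feature encoding, training data, and network size are invented for the example and are not the author's model.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      # Hypothetical training set: each row encodes audit findings for one system
      # (e.g., patch level, access control, logging coverage); the target is an
      # expert-assigned security quality score in [0, 1].
      X_train = np.array([[0.9, 0.8, 0.7], [0.2, 0.3, 0.1], [0.6, 0.5, 0.9]])
      y_train = np.array([0.85, 0.15, 0.70])

      model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
      model.fit(X_train, y_train)
      print(model.predict([[0.7, 0.6, 0.8]]))   # predicted security quality score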

  8. Cancer Detection Using Neural Computing Methodology

    NASA Technical Reports Server (NTRS)

    Toomarian, Nikzad; Kohen, Hamid S.; Bearman, Gregory H.; Seligson, David B.

    2001-01-01

    This paper describes a novel learning methodology used to analyze bio-materials. The premise of this research is to help pathologists quickly identify anomalous cells in a cost-efficient manner. Skilled pathologists must methodically, efficiently, and carefully analyze histopathologic materials manually for the presence, amount, and degree of malignancy and/or other disease states. The prolonged attention required to accomplish this task induces fatigue that may result in a higher rate of diagnostic errors. In addition, automated image analysis systems to date lack a sufficiently intelligent means of identifying even the most general regions of interest in tissue-based studies, and this shortfall greatly limits their utility. An intelligent data understanding system that could quickly and accurately identify diseased tissues and/or choose regions of interest would be expected to increase the accuracy of diagnosis and usher in truly automated tissue-based image analysis.

  9. Application of Human-Autonomy Teaming (HAT) Patterns to Reduced Crew Operations (RCO)

    NASA Technical Reports Server (NTRS)

    Shively, R. Jay; Brandt, Summer L.; Lachter, Joel; Matessa, Mike; Sadler, Garrett; Battiste, Henri

    2016-01-01

    As part of the Air Force - NASA Bi-Annual Research Council Meeting, slides will be presented on recent Reduced Crew Operations (RCO) work. Unmanned aerial systems, robotics, advanced cockpits, and air traffic management are all examples of domains that are seeing dramatic increases in automation. While automation may take on some tasks previously performed by humans, humans will still be required, for the foreseeable future, to remain in the system. The collaboration between humans and these increasingly autonomous systems will begin to resemble cooperation between teammates, rather than simple task allocation. It is critical to understand this human-autonomy teaming (HAT) to optimize these systems in the future. One way to understand HAT is to identify recurring patterns of HAT that have similar characteristics and solutions. A methodology for applying HAT patterns to an advanced cockpit project is discussed.

  10. Analytical research and development for the Whitney Programs. Automation and instrumentation. Computer automation of the Cary Model 17I spectrophotometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haugen, G.R.; Bystroff, R.I.; Downey, R.M.

    1975-09-01

    In the area of automation and instrumentation, progress in the following studies is reported: computer automation of the Cary model 17I spectrophotometer; a new concept for monitoring the concentration of water in gases; on-line gas analysis for a gas circulation experiment; and a count-rate-discriminator technique for measuring grain-boundary composition. In the area of analytical methodology and measurements, progress is reported in the following studies: separation of molecular species by radiation pressure; study of the vaporization of U(thd)₄ (thd = 2,2,6,6-tetramethylheptane-3,5-dione); study of the vaporization of U(C₈H₈)₂; determination of ethylenic unsaturation in polyimide resins; and semimicrodetermination of hydroxyl and amino groups with pyromellitic dianhydride (PMDA).

  11. Analysis of technical university information system

    NASA Astrophysics Data System (ADS)

    Savelyev, N. A.; Boyarkin, M. A.

    2018-05-01

    The paper covers the set of existing automated control systems, and their interaction, at a higher education institution, the state budgetary educational institution of higher professional education "Industrial University of Tyumen". The structural interaction of the existing systems and their functions has been analyzed, which has become the basis for identifying a number of system-level and local (module-specific) drawbacks in the automation of university activities. The authors suggest a new structure for the automated control system, consisting of three major subsystems: management support; training and methodology support; and distance and supplementary education support. Functionality for each subsystem has been defined in accordance with the educational institution's automation requirements. The suggested structure of the ACS will address the challenges facing the university during reorganization and optimization of the management of the institution's activities as a whole.

  12. Detecting the optic disc boundary in digital fundus images using morphological, edge detection, and feature extraction techniques.

    PubMed

    Aquino, Arturo; Gegundez-Arias, Manuel Emilio; Marin, Diego

    2010-11-01

    Optic disc (OD) detection is an important step in developing systems for automated diagnosis of various serious ophthalmic pathologies. This paper presents a new template-based methodology for segmenting the OD from digital retinal images. This methodology uses morphological and edge detection techniques followed by the Circular Hough Transform to obtain a circular OD boundary approximation. It requires a pixel located within the OD as initial information. For this purpose, a location methodology based on a voting-type algorithm is also proposed. The algorithms were evaluated on the 1200 images of the publicly available MESSIDOR database. The location procedure succeeded in 99% of cases, taking an average computational time of 1.67 s with a standard deviation of 0.14 s. On the other hand, the segmentation algorithm yielded an average common-area overlap of 86% between automated segmentations and true OD regions. The average computational time was 5.69 s with a standard deviation of 0.54 s. Moreover, a discussion of the advantages and disadvantages of the models most generally used for OD segmentation is also presented in this paper.
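
    A rough OpenCV sketch of the same family of pipeline (morphological filtering, edge-based circular Hough transform, seeded selection) is given below; all parameter values are placeholders rather than the settings tuned on MESSIDOR.

      import cv2

      def approximate_od_circle(fundus_bgr, seed_xy):
          """Return (x, y, radius) of a circular OD boundary approximation, or None."""
          gray = cv2.cvtColor(fundus_bgr, cv2.COLOR_BGR2GRAY)
          # Morphological closing suppresses vessels that would break the OD boundary.
          kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
          closed = cv2.morphologyEx(gray, cv2.MORPH_CLOSE, kernel)
          # HOUGH_GRADIENT applies Canny edge detection internally (param1 is the
          # upper threshold) before voting for circle centres and radii.
          circles = cv2.HoughCircles(closed, cv2.HOUGH_GRADIENT, dp=2, minDist=200,
                                     param1=120, param2=40, minRadius=30, maxRadius=120)
          if circles is None:
              return None
          cx, cy = seed_xy
          # Keep the candidate circle whose centre is closest to the OD seed pixel.
          best = min(circles[0], key=lambda c: (c[0] - cx) ** 2 + (c[1] - cy) ** 2)
          return tuple(best)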

  13. Automated measurement of office, home and ambulatory blood pressure in atrial fibrillation.

    PubMed

    Kollias, Anastasios; Stergiou, George S

    2014-01-01

    1. Hypertension and atrial fibrillation (AF) often coexist and are strong risk factors for stroke. Current guidelines for blood pressure (BP) measurement in AF recommend repeated measurements using the auscultatory method, whereas the accuracy of the automated devices is regarded as questionable. This review presents the current evidence on the feasibility and accuracy of automated BP measurement in the presence of AF and the potential for automated detection of undiagnosed AF during such measurements. 2. Studies evaluating the use of automated BP monitors in AF are limited and have significant heterogeneity in methodology and protocols. Overall, the oscillometric method is feasible for static (office or home) and ambulatory use and appears to be more accurate for systolic than diastolic BP measurement. 3. Given that systolic hypertension is particularly common and important in the elderly, the automated BP measurement method may be acceptable for self-home and ambulatory monitoring, but not for professional office or clinic measurement. 4. An embedded algorithm for the detection of asymptomatic AF during routine automated BP measurement with high diagnostic accuracy has been developed and appears to be a useful screening tool for elderly hypertensives. © 2013 Wiley Publishing Asia Pty Ltd.

  14. Automation in high-content flow cytometry screening.

    PubMed

    Naumann, U; Wand, M P

    2009-09-01

    High-content flow cytometric screening (FC-HCS) is a 21st Century technology that combines robotic fluid handling, flow cytometric instrumentation, and bioinformatics software, so that relatively large numbers of flow cytometric samples can be processed and analysed in a short period of time. We revisit a recent application of FC-HCS to the problem of cellular signature definition for acute graft-versus-host-disease. Our focus is on automation of the data processing steps using recent advances in statistical methodology. We demonstrate that effective results, on par with those obtained via manual processing, can be achieved using our automatic techniques. Such automation of FC-HCS has the potential to drastically improve diagnosis and biomarker identification.
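
    One concrete example of the kind of statistical automation alluded to above is replacing a manual 2-D gate with an automatically fitted Gaussian mixture, as sketched below on synthetic data. This is a generic illustration, not the authors' pipeline.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(0)
      # Synthetic 2-D events standing in for two cell populations (e.g., FSC/SSC).
      events = np.vstack([rng.normal([2, 2], 0.4, size=(500, 2)),
                          rng.normal([6, 5], 0.6, size=(500, 2))])

      gmm = GaussianMixture(n_components=2, random_state=0).fit(events)
      labels = gmm.predict(events)                    # automated "gates"
      proportions = np.bincount(labels) / len(labels) # population fractions
      print(proportions)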

  15. Automated personnel-assets-consumables-drug tracking in ambulance services for more effective and efficient medical emergency interventions.

    PubMed

    Utku, Semih; Özcanhan, Mehmet Hilal; Unluturk, Mehmet Suleyman

    2016-04-01

    Patient delivery time is no longer considered the only critical factor in ambulance services. Presently, five clinical performance indicators are used to assess patient satisfaction. Unfortunately, emergency ambulance services in rapidly growing metropolitan areas do not meet current satisfaction expectations because of human errors in the management of the objects onboard the ambulances. However, human involvement in the information management of emergency interventions can be reduced by electronic tracking of the personnel, assets, consumables and drugs (PACD) carried in the ambulances. Electronic tracking needs the support of automation software, which should be integrated into the overall hospital information system. Our work presents a complete solution based on a centralized database supported by radio frequency identification (RFID) and Bluetooth low energy (BLE) identification and tracking technologies. Each object in an ambulance is identified and tracked by the best-suited technology. The automated identification and tracking reduce manual paper documentation and free the personnel to focus better on medical activities. The presence and amounts of the PACD are automatically monitored, with warnings about depletion, absence or maintenance dates. The computerized two-way hospital-ambulance communication link provides information sharing and instantaneous feedback for better and faster diagnosis decisions. A fully implemented system is presented, with detailed hardware and software descriptions. The benefits and the clinical outcomes of the proposed system, which lead to improved personnel efficiency and more effective interventions, are discussed. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  16. Automation Improves Schedule Quality and Increases Scheduling Efficiency for Residents.

    PubMed

    Perelstein, Elizabeth; Rose, Ariella; Hong, Young-Chae; Cohn, Amy; Long, Micah T

    2016-02-01

    Medical resident scheduling is difficult due to multiple rules, competing educational goals, and ever-evolving graduate medical education requirements. Despite this, schedules are typically created manually, consuming hours of work, producing schedules of varying quality, and yielding negative consequences for resident morale and learning. To determine whether computerized decision support can improve the construction of residency schedules, saving time and improving schedule quality. The Optimized Residency Scheduling Assistant was designed by a team from the University of Michigan Department of Industrial and Operations Engineering. It was implemented in the C.S. Mott Children's Hospital Pediatric Emergency Department in the 2012-2013 academic year. The 4 metrics of schedule quality that were compared between the 2010-2011 and 2012-2013 academic years were the incidence of challenging shift transitions, the incidence of shifts following continuity clinics, the total shift inequity, and the night shift inequity. All scheduling rules were successfully incorporated. Average schedule creation time fell from 22-28 hours to 4-6 hours per month, and 3 of 4 metrics of schedule quality significantly improved. For the implementation year, the incidence of challenging shift transitions decreased from 83 to 14 (P < .01); the incidence of postclinic shifts decreased from 72 to 32 (P < .01); and the SD of night shifts dropped by 55.6% (P < .01). This automated shift scheduling system improves the current manual scheduling process, reducing time spent and improving schedule quality. Embracing such automated tools can benefit residency programs with shift-based scheduling needs.
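
    Two of the schedule-quality metrics named above can be computed mechanically once a schedule exists, as in the sketch below. The shift encoding and the definition of a "challenging transition" (here, a night shift immediately followed by a day shift) are simplifying assumptions, not the program's actual definitions.

      import statistics
      from collections import Counter

      def night_shift_inequity(assignments, residents):
          """SD of night-shift counts across all residents; lower = more equitable."""
          nights = Counter(r for r, shift in assignments if shift == "night")
          counts = [nights.get(r, 0) for r in residents]
          return statistics.pstdev(counts) if counts else 0.0

      def challenging_transitions(schedule_by_resident):
          """Count night->day back-to-back pairs in each resident's ordered shifts."""
          count = 0
          for shifts in schedule_by_resident.values():
              count += sum(1 for a, b in zip(shifts, shifts[1:])
                           if a == "night" and b == "day")
          return count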

  17. Modeling Increased Complexity and the Reliance on Automation: FLightdeck Automation Problems (FLAP) Model

    NASA Technical Reports Server (NTRS)

    Ancel, Ersin; Shih, Ann T.

    2014-01-01

    This paper highlights the development of a model that is focused on the safety issue of increasing complexity and reliance on automation systems in transport category aircraft. Recent statistics show an increase in mishaps related to manual handling and automation errors due to pilot complacency and over-reliance on automation, loss of situational awareness, automation system failures and/or pilot deficiencies. Consequently, the aircraft can enter a state outside the flight envelope and/or air traffic safety margins which potentially can lead to loss-of-control (LOC), controlled-flight-into-terrain (CFIT), or runway excursion/confusion accidents, etc. The goal of this modeling effort is to provide NASA's Aviation Safety Program (AvSP) with a platform capable of assessing the impacts of AvSP technologies and products towards reducing the relative risk of automation related accidents and incidents. In order to do so, a generic framework, capable of mapping both latent and active causal factors leading to automation errors, is developed. Next, the framework is converted into a Bayesian Belief Network model and populated with data gathered from Subject Matter Experts (SMEs). With the insertion of technologies and products, the model provides individual and collective risk reduction acquired by technologies and methodologies developed within AvSP.
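
    A toy example of the kind of calculation a Bayesian Belief Network performs is sketched below: the marginal probability of an automation-related error given two parent causal factors, before and after a hypothetical mitigation. The factor names and probabilities are invented for illustration, not values elicited from the SMEs.

      P_COMPLACENCY = 0.20
      P_SYSTEM_FAILURE = 0.05
      # Conditional probability table: P(error | complacency, system_failure)
      CPT = {(True, True): 0.60, (True, False): 0.25,
             (False, True): 0.30, (False, False): 0.02}

      def p_error(p_c=P_COMPLACENCY, p_f=P_SYSTEM_FAILURE):
          """Marginalize over both parent factors to get P(automation error)."""
          total = 0.0
          for c in (True, False):
              for f in (True, False):
                  weight = (p_c if c else 1 - p_c) * (p_f if f else 1 - p_f)
                  total += weight * CPT[(c, f)]
          return total

      baseline = p_error()
      mitigated = p_error(p_c=0.10)          # e.g., a technology halving complacency
      print(baseline, mitigated, 1 - mitigated / baseline)   # relative risk reduction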

  18. Influence of Cultural, Organizational, and Automation Capability on Human Automation Trust: A Case Study of Auto-GCAS Experimental Test Pilots

    NASA Technical Reports Server (NTRS)

    Koltai, Kolina; Ho, Nhut; Masequesmay, Gina; Niedober, David; Skoog, Mark; Cacanindin, Artemio; Johnson, Walter; Lyons, Joseph

    2014-01-01

    This paper discusses a case study that examined the influence of cultural, organizational and automation capability upon human trust in, and reliance on, automation. In particular, this paper focuses on the design and application of an extended case study methodology, and on the foundational lessons revealed by it. Experimental test pilots involved in the research and development of the US Air Force's newly developed Automatic Ground Collision Avoidance System served as the context for this examination. An eclectic, multi-pronged approach was designed to conduct this case study, and proved effective in addressing the challenges associated with the case's politically sensitive and military environment. Key results indicate that the system design was in alignment with pilot culture and organizational mission, indicating the potential for appropriate trust development in operational pilots. These include the low-vulnerability/high-risk nature of the pilot profession, automation transparency and suspicion, system reputation, and the setup of and communications among organizations involved in the system development.

  19. Agile based "Semi-"Automated Data ingest process : ORNL DAAC example

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S. K.; Beaty, T.; Cook, R. B.; Devarakonda, R.; Hook, L.; Wei, Y.; Wright, D.

    2015-12-01

    The ORNL DAAC archives and publishes data and information relevant to biogeochemical, ecological, and environmental processes. The data archived at the ORNL DAAC must be well formatted, self-descriptive, and documented, as well as referenced in a peer-reviewed publication. The ORNL DAAC ingest team curates diverse data sets from multiple data providers simultaneously. To streamline the ingest process, the data set submission process at the ORNL DAAC has recently been updated to use an agile process, and a semi-automated workflow system has been developed to provide a consistent data provider experience and to create a uniform data product. The goals of the semi-automated agile ingest process are to: (1) provide the ability to track a data set from acceptance to publication; (2) automate steps that can be automated to improve efficiencies and reduce redundancy; (3) update legacy ingest infrastructure; and (4) provide a centralized system to manage the various aspects of ingest. This talk will cover the agile methodology, workflow, and tools developed through this system.
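
    A minimal sketch of goal (1), tracking a data set through a fixed sequence of ingest stages, is shown below. The stage names only approximate an "acceptance to publication" pipeline and are not the ORNL DAAC's actual workflow system.

      from dataclasses import dataclass, field

      STAGES = ["accepted", "format_check", "metadata_review", "doi_assigned", "published"]

      @dataclass
      class Dataset:
          name: str
          stage_index: int = 0
          history: list = field(default_factory=list)

          def advance(self):
              """Move the data set to the next ingest stage, recording the step."""
              if self.stage_index < len(STAGES) - 1:
                  self.stage_index += 1
                  self.history.append(STAGES[self.stage_index])

          @property
          def stage(self):
              return STAGES[self.stage_index]

      ds = Dataset("example_dataset")   # hypothetical data set name
      ds.advance(); ds.advance()
      print(ds.name, ds.stage)          # -> example_dataset metadata_review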

  20. An Extended Case Study Methodology for Investigating Influence of Cultural, Organizational, and Automation Factors on Human-Automation Trust

    NASA Technical Reports Server (NTRS)

    Koltai, Kolina Sun; Ho, Nhut; Masequesmay, Gina; Niedober, David; Skoog, Mark; Johnson, Walter; Cacanindin, Artemio

    2014-01-01

    This paper discusses a case study that examined the influence of cultural, organizational and automation capability upon human trust in, and reliance on, automation. In particular, this paper focuses on the design and application of an extended case study methodology, and on the foundational lessons revealed by it. Experimental test pilots involved in the research and development of the US Air Force's newly developed Automatic Ground Collision Avoidance System served as the context for this examination. An eclectic, multi-pronged approach was designed to conduct this case study, and proved effective in addressing the challenges associated with the case's politically sensitive and military environment. Key results indicate that the system design was in alignment with pilot culture and organizational mission, indicating the potential for appropriate trust development in operational pilots. These include the low-vulnerability/high-risk nature of the pilot profession, automation transparency and suspicion, system reputation, and the setup of and communications among organizations involved in the system development.
