Sample records for "facilitate automated processing"

  1. 49 CFR 1104.2 - Document specifications.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... to facilitate automated processing in document sheet feeders, original documents of more than one... textual submissions. Use of color in filings is limited to images such as graphs, maps and photographs. To facilitate automated processing of color pages, color pages may not be inserted among pages containing text...

  2. 49 CFR 1104.2 - Document specifications.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... to facilitate automated processing in document sheet feeders, original documents of more than one... textual submissions. Use of color in filings is limited to images such as graphs, maps and photographs. To facilitate automated processing of color pages, color pages may not be inserted among pages containing text...

  3. 49 CFR 1104.2 - Document specifications.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... to facilitate automated processing in document sheet feeders, original documents of more than one... textual submissions. Use of color in filings is limited to images such as graphs, maps and photographs. To facilitate automated processing of color pages, color pages may not be inserted among pages containing text...

  4. 49 CFR 1104.2 - Document specifications.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... to facilitate automated processing in document sheet feeders, original documents of more than one... textual submissions. Use of color in filings is limited to images such as graphs, maps and photographs. To facilitate automated processing of color pages, color pages may not be inserted among pages containing text...

  5. 49 CFR 1104.2 - Document specifications.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... to facilitate automated processing in document sheet feeders, original documents of more than one... textual submissions. Use of color in filings is limited to images such as graphs, maps and photographs. To facilitate automated processing of color pages, color pages may not be inserted among pages containing text...

  6. Ionospheric Modeling: Development, Verification and Validation

    DTIC Science & Technology

    2005-09-01

    ... facilitate the automated processing of a large network of GPS receiver data. ... CALIBRATION AND VALIDATION OF IONOSPHERIC SENSORS: We have been ... NOFS Workshop, Estes Park, CO, January 2005. W. Rideout, A. Coster, P. Doherty, MIT Haystack, Automated Processing of GPS Data to Produce Worldwide TEC ...

  7. The Change to Administrative Computing in Schools.

    ERIC Educational Resources Information Center

    Brown, Daniel J.

    1984-01-01

    Describes a study of the process of school office automation which focuses on personnel reactions to administrative computing, what users view as the advantages and disadvantages of the automation, perceived barriers and facilitators of the change to automation, school personnel's views of long-term effects, and implications for school computer policy.…

  8. Parallel workflow tools to facilitate human brain MRI post-processing

    PubMed Central

    Cui, Zaixu; Zhao, Chenxi; Gong, Gaolang

    2015-01-01

    Multi-modal magnetic resonance imaging (MRI) techniques are widely applied in human brain studies. To obtain specific brain measures of interest from MRI datasets, a number of complex image post-processing steps are typically required. Parallel workflow tools have recently been developed, concatenating individual processing steps and enabling fully automated processing of raw MRI data to obtain the final results. These workflow tools are also designed to make optimal use of available computational resources and to support the parallel processing of different subjects or of independent processing steps for a single subject. Automated, parallel MRI post-processing tools can greatly facilitate relevant brain investigations and are being increasingly applied. In this review, we briefly summarize these parallel workflow tools and discuss relevant issues. PMID:26029043
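
    A minimal sketch of the subject-level parallelism such workflow tools provide, using only Python's standard library; the step functions are hypothetical placeholders, not any particular tool's API.

    ```python
    # Sketch: run a per-subject MRI post-processing pipeline over subjects in parallel.
    from multiprocessing import Pool

    def skull_strip(subject_id):
        # placeholder for a real brain-extraction step
        return f"{subject_id}: skull-stripped"

    def segment(intermediate):
        # placeholder for tissue segmentation applied to the previous step's output
        return intermediate + " -> segmented"

    def process_subject(subject_id):
        """Steps run serially within a subject; subjects run in parallel."""
        return segment(skull_strip(subject_id))

    if __name__ == "__main__":
        subjects = ["sub-01", "sub-02", "sub-03", "sub-04"]
        with Pool(processes=4) as pool:
            results = pool.map(process_subject, subjects)
        for line in results:
            print(line)
    ```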

  9. Automated Circulation Systems in Libraries Serving the Blind and Physically Handicapped: A Reference Guide for Planning.

    ERIC Educational Resources Information Center

    Wanger, Judith; And Others

    Designed to facilitate communications in future automation projects between library and data processing personnel, especially those projects involving the use of automated systems in the service of disabled patrons, this guide identifies and describes a master set of major circulation system requirements and design considerations, and illustrates…

  10. Space station automation study: Automation requirements derived from space manufacturing concepts, volume 2

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Automation requirements were developed for two manufacturing concepts: (1) Gallium Arsenide Electroepitaxial Crystal Production and Wafer Manufacturing Facility, and (2) Gallium Arsenide VLSI Microelectronics Chip Processing Facility. A functional overview of the ultimate design concept incorporating the two manufacturing facilities on the space station is provided. The concepts were selected to facilitate an in-depth analysis of manufacturing automation requirements in the form of process mechanization, teleoperation and robotics, sensors, and artificial intelligence. While the cost-effectiveness of these facilities was not analyzed, both appear entirely feasible for the year 2000 timeframe.

  11. Automated imaging system for single molecules

    DOEpatents

    Schwartz, David Charles; Runnheim, Rodney; Forrest, Daniel

    2012-09-18

    There is provided a high throughput automated single molecule image collection and processing system that requires minimal initial user input. The unique features embodied in the present disclosure allow automated collection and initial processing of optical images of single molecules and their assemblies. Correct focus may be automatically maintained while images are collected. Uneven illumination in fluorescence microscopy is accounted for, and an overall robust imaging operation is provided yielding individual images prepared for further processing in external systems. Embodiments described herein are useful in studies of any macromolecules such as DNA, RNA, peptides and proteins. The automated image collection and processing system and method of same may be implemented and deployed over a computer network, and may be ergonomically optimized to facilitate user interaction.

  12. Precision Departure Release Capability (PDRC) Overview and Results: NASA to FAA Research Transition

    NASA Technical Reports Server (NTRS)

    Engelland, Shawn; Davis, Tom

    2013-01-01

    NASA researchers developed the Precision Departure Release Capability (PDRC) concept to improve the tactical departure scheduling process. The PDRC system comprises: 1) a surface automation system that computes ready-time predictions and departure runway assignments, 2) an en route scheduling automation tool that uses this information to estimate ascent trajectories to the merge point and computes release times, and 3) an interface that provides two-way communication between the two systems. To minimize technology transfer issues and facilitate its adoption by TMCs and Frontline Managers (FLMs), NASA developed the PDRC prototype using the Surface Decision Support System (SDSS) as the Tower surface automation tool, a research version of the FAA TMA (RTMA) as the en route automation tool, and a digital interface between the two DSTs to facilitate coordination.

  13. Automated Simulation For Analysis And Design

    NASA Technical Reports Server (NTRS)

    Cantwell, E.; Shenk, Tim; Robinson, Peter; Upadhye, R.

    1992-01-01

    Design Assistant Workstation (DAWN) software being developed to facilitate simulation of qualitative and quantitative aspects of behavior of life-support system in spacecraft, chemical-processing plant, heating and cooling system of large building, or any of variety of systems including interacting process streams and processes. Used to analyze alternative design scenarios or specific designs of such systems. Expert system will automate part of design analysis: reason independently by simulating design scenarios and return to designer with overall evaluations and recommendations.

  14. Combined process automation for large-scale EEG analysis.

    PubMed

    Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E

    2012-01-01

    Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
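
    The linked steps described above can be illustrated with a short sketch (not the authors' algorithm): band filtering, crude threshold-based spike detection, and power spectral density chained over epochs using NumPy and SciPy. The sampling rate, band limits, and threshold are assumed values.

    ```python
    # Minimal sketch of a linked EEG processing chain (illustrative only).
    import numpy as np
    from scipy.signal import butter, filtfilt, welch

    FS = 1000  # sampling rate in Hz (assumed)

    def bandpass(x, lo, hi, fs=FS, order=4):
        # user-defined band frequency waveform (step 2 in the abstract)
        b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        return filtfilt(b, a, x)

    def detect_spikes(x, k=4.0):
        # crude amplitude-threshold spike detection: k standard deviations
        thr = k * np.std(x)
        return np.where(np.abs(x) > thr)[0]

    def power_spectrum(x, fs=FS):
        # power spectral density analysis (step 5 in the abstract)
        return welch(x, fs=fs, nperseg=fs)

    # chain the steps over pre- and post-stimulation epochs
    epochs = {"pre": np.random.randn(10 * FS), "post": np.random.randn(10 * FS)}
    for name, sig in epochs.items():
        theta = bandpass(sig, 4, 8)
        spikes = detect_spikes(theta)
        f, pxx = power_spectrum(sig)
        print(name, "spikes:", len(spikes), "peak freq (Hz):", f[np.argmax(pxx)])
    ```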

  15. CERES AuTomAted job Loading SYSTem (CATALYST): An automated workflow manager for satellite data production

    NASA Astrophysics Data System (ADS)

    Gleason, J. L.; Hillyer, T. N.; Wilkins, J.

    2012-12-01

    The CERES Science Team integrates data from 5 CERES instruments onboard the Terra, Aqua and NPP missions. The processing chain fuses CERES observations with data from 19 other unique sources. The addition of CERES Flight Model 5 (FM5) onboard NPP, coupled with ground processing system upgrades, further emphasizes the need for an automated job-submission utility to manage multiple processing streams concurrently. The operator-driven, legacy-processing approach relied on manually staging data from magnetic tape to limited spinning disk attached to a shared memory architecture system. The migration of CERES production code to a distributed, cluster computing environment with approximately one petabyte of spinning disk containing all precursor input data products facilitates the development of a CERES-specific, automated workflow manager. In the cluster environment, I/O is the primary system resource in contention across jobs. Therefore, system load can be maximized with a throttling workload manager. This poster discusses a Java and Perl implementation of an automated job management tool tailored for CERES processing.
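
    The throttling idea is easy to illustrate. The real CATALYST manager is implemented in Java and Perl; the following is only a Python sketch of capping the number of concurrently running, I/O-bound jobs.

    ```python
    # Illustrative throttled job submission: at most MAX_CONCURRENT_JOBS run at once.
    from concurrent.futures import ThreadPoolExecutor, as_completed
    import random
    import time

    MAX_CONCURRENT_JOBS = 3   # throttle to limit I/O contention on shared disk

    def run_job(job_id):
        time.sleep(random.uniform(0.1, 0.3))   # stand-in for an I/O-bound job
        return f"job {job_id} done"

    jobs = range(10)
    with ThreadPoolExecutor(max_workers=MAX_CONCURRENT_JOBS) as pool:
        futures = [pool.submit(run_job, j) for j in jobs]
        for fut in as_completed(futures):
            print(fut.result())
    ```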

  16. Robot graphic simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.

    1991-01-01

    The objective of this research was twofold. First, the basic capabilities of ROBOSIM (graphical simulation system) were improved and extended by taking advantage of advanced graphic workstation technology and artificial intelligence programming techniques. Second, the scope of the graphic simulation testbed was extended to include general problems of Space Station automation. Hardware support for 3-D graphics and high processing performance make high resolution solid modeling, collision detection, and simulation of structural dynamics computationally feasible. The Space Station is a complex system with many interacting subsystems. Design and testing of automation concepts demand modeling of the affected processes, their interactions, and that of the proposed control systems. The automation testbed was designed to facilitate studies in Space Station automation concepts.

  17. An Automated Method for High-Definition Transcranial Direct Current Stimulation Modeling*

    PubMed Central

    Huang, Yu; Su, Yuzhuo; Rorden, Christopher; Dmochowski, Jacek; Datta, Abhishek; Parra, Lucas C.

    2014-01-01

    Targeted transcranial stimulation with electric currents requires accurate models of the current flow from scalp electrodes to the human brain. Idiosyncratic anatomy of individual brains and heads leads to significant variability in such current flows across subjects, thus, necessitating accurate individualized head models. Here we report on an automated processing chain that computes current distributions in the head starting from a structural magnetic resonance image (MRI). The main purpose of automating this process is to reduce the substantial effort currently required for manual segmentation, electrode placement, and solving of finite element models. In doing so, several weeks of manual labor were reduced to no more than 4 hours of computation time and minimal user interaction, while current-flow results for the automated method deviated by less than 27.9% from the manual method. Key facilitating factors are the addition of three tissue types (skull, scalp and air) to a state-of-the-art automated segmentation process, morphological processing to correct small but important segmentation errors, and automated placement of small electrodes based on easily reproducible standard electrode configurations. We anticipate that such an automated processing will become an indispensable tool to individualize transcranial direct current stimulation (tDCS) therapy. PMID:23367144

  18. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    PubMed

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  19. Space station automation study. Automation requirements derived from space manufacturing concepts. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The two manufacturing concepts developed represent innovative, technologically advanced manufacturing schemes. The concepts were selected to facilitate an in depth analysis of manufacturing automation requirements in the form of process mechanization, teleoperation and robotics, and artificial intelligence. While the cost effectiveness of these facilities has not been analyzed as part of this study, both appear entirely feasible for the year 2000 timeframe. The growing demand for high quality gallium arsenide microelectronics may warrant the ventures.

  20. Automated negotiation in environmental resource management: Review and assessment.

    PubMed

    Eshragh, Faezeh; Pooyandeh, Majeed; Marceau, Danielle J

    2015-10-01

    Negotiation is an integral part of our daily life and plays an important role in resolving conflicts and facilitating human interactions. Automated negotiation, which aims at capturing the human negotiation process using artificial intelligence and machine learning techniques, is well-established in e-commerce, but its application in environmental resource management remains limited. This is due to the inherent uncertainties and complexity of environmental issues, along with the diversity of stakeholders' perspectives when dealing with these issues. The objective of this paper is to describe the main components of automated negotiation, review and compare machine learning techniques in automated negotiation, and provide a guideline for the selection of suitable methods in the particular context of stakeholders' negotiation over environmental resource issues. We advocate that automated negotiation can facilitate the involvement of stakeholders in the exploration of a plurality of solutions in order to reach a mutually satisfying agreement and contribute to informed decisions in environmental management along with the need for further studies to consolidate the potential of this modeling approach. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Blastocyst microinjection automation.

    PubMed

    Mattos, Leonardo S; Grant, Edward; Thresher, Randy; Kluckman, Kimberly

    2009-09-01

    Blastocyst microinjections are routinely involved in the process of creating genetically modified mice for biomedical research, but their efficiency is highly dependent on the skills of the operators. As a consequence, much time and resources are required for training microinjection personnel. This situation has been aggravated by the rapid growth of genetic research, which has increased the demand for mutant animals. Therefore, increased productivity and efficiency in this area are highly desired. Here, we pursue these goals through the automation of a previously developed teleoperated blastocyst microinjection system. This included the design of a new system setup to facilitate automation, the definition of rules for automatic microinjections, the implementation of video processing algorithms to extract feedback information from microscope images, and the creation of control algorithms for process automation. Experimentation conducted with this new system and operator assistance during the cell delivery phase demonstrated a 75% microinjection success rate. In addition, implantation of the successfully injected blastocysts resulted in a 53% birth rate and a 20% yield of chimeras. These results proved that the developed system was capable of automatic blastocyst penetration and retraction, demonstrating the success of major steps toward full process automation.

  2. Cardiac imaging: working towards fully-automated machine analysis & interpretation

    PubMed Central

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-01-01

    Introduction Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation. PMID:28277804

  3. A generic template for automated bioanalytical ligand-binding assays using modular robotic scripts in support of discovery biotherapeutic programs.

    PubMed

    Duo, Jia; Dong, Huijin; DeSilva, Binodh; Zhang, Yan J

    2013-07-01

    Sample dilution and reagent pipetting are time-consuming steps in ligand-binding assays (LBAs). Traditional automation-assisted LBAs use assay-specific scripts that require labor-intensive script writing and user training. Five major script modules were developed on Tecan Freedom EVO liquid handling software to facilitate the automated sample preparation and LBA procedure: sample dilution, sample minimum required dilution, standard/QC minimum required dilution, standard/QC/sample addition, and reagent addition. The modular design of automation scripts allowed users to assemble an automated assay with minimal script modification. The application of the template was demonstrated in three LBAs to support discovery biotherapeutic programs. The results demonstrated that the modular scripts provided flexibility in adapting to various LBA formats and significant time savings in script writing and scientist training. Data generated by the automated process were comparable to those generated by the manual process, while the bioanalytical productivity was significantly improved using the modular robotic scripts.
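
    A hedged sketch of the modular idea, with hypothetical step functions standing in for the Tecan script modules: an assay is assembled from an ordered selection of reusable modules rather than written as one monolithic script.

    ```python
    # Sketch: an assay is an ordered list of reusable step modules (names invented).
    def sample_dilution(ctx):        ctx["log"].append("diluted samples")
    def sample_mrd(ctx):             ctx["log"].append("applied sample MRD")
    def std_qc_mrd(ctx):             ctx["log"].append("applied standard/QC MRD")
    def std_qc_sample_addition(ctx): ctx["log"].append("added standards/QCs/samples")
    def reagent_addition(ctx):       ctx["log"].append("added detection reagent")

    def run_assay(modules):
        """Assemble and execute an automated assay from an ordered list of modules."""
        ctx = {"log": []}
        for step in modules:
            step(ctx)
        return ctx["log"]

    # A specific LBA format is just a particular selection and ordering of modules.
    print(run_assay([sample_dilution, sample_mrd, std_qc_mrd,
                     std_qc_sample_addition, reagent_addition]))
    ```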

  4. Using PATIMDB to Create Bacterial Transposon Insertion Mutant Libraries

    PubMed Central

    Urbach, Jonathan M.; Wei, Tao; Liberati, Nicole; Grenfell-Lee, Daniel; Villanueva, Jacinto; Wu, Gang; Ausubel, Frederick M.

    2015-01-01

    PATIMDB is a software package for facilitating the generation of transposon mutant insertion libraries. The software has two main functions: process tracking and automated sequence analysis. The process tracking function specifically includes recording the status and fates of multiwell plates and samples in various stages of library construction. Automated sequence analysis refers specifically to the pipeline of sequence analysis starting with ABI files from a sequencing facility and ending with insertion location identifications. The protocols in this unit describe installation and use of PATIMDB software. PMID:19343706

  5. Automated response matching for organic scintillation detector arrays

    NASA Astrophysics Data System (ADS)

    Aspinall, M. D.; Joyce, M. J.; Cave, F. D.; Plenteda, R.; Tomanin, A.

    2017-07-01

    This paper identifies a digitizer technology with unique features that facilitates feedback control for the realization of a software-based technique for automatically calibrating detector responses. Three such auto-calibration techniques have been developed and are described along with an explanation of the main configuration settings and potential pitfalls. Automating this process increases repeatability, simplifies user operation, and enables remote and periodic system calibration in applications where consistency across detectors' responses is critical.

  6. ClinicalTrials.gov as a data source for semi-automated point-of-care trial eligibility screening.

    PubMed

    Pfiffner, Pascal B; Oh, JiWon; Miller, Timothy A; Mandl, Kenneth D

    2014-01-01

    Implementing semi-automated processes to efficiently match patients to clinical trials at the point of care requires both detailed patient data and authoritative information about open studies. To evaluate the utility of the ClinicalTrials.gov registry as a data source for semi-automated trial eligibility screening. Eligibility criteria and metadata for 437 trials open for recruitment in four different clinical domains were identified in ClinicalTrials.gov. Trials were evaluated for up-to-date recruitment status and eligibility criteria were evaluated for obstacles to automated interpretation. Finally, phone or email outreach to coordinators at a subset of the trials was made to assess the accuracy of contact details and recruitment status. 24% (104 of 437) of trials declaring an open recruitment status list a study completion date in the past, indicating out-of-date records. Substantial barriers to automated eligibility interpretation in free-form text are present in 81% to 94% of all trials. We were unable to contact coordinators at 31% (45 of 146) of the trials in the subset, either by phone or by email. Only 53% (74 of 146) would confirm that they were still recruiting patients. Because ClinicalTrials.gov has entries on most US and many international trials, the registry could be repurposed as a comprehensive trial matching data source. Semi-automated point-of-care recruitment would be facilitated by matching the registry's eligibility criteria against clinical data from electronic health records. But the current entries fall short. Ultimately, improved techniques in natural language processing will facilitate semi-automated complex matching. As immediate next steps, we recommend augmenting ClinicalTrials.gov data entry forms to capture key eligibility criteria in a simple, structured format.

  7. Effective application of multiple locus variable number of tandem repeats analysis to tracing Staphylococcus aureus in food-processing environment.

    PubMed

    Rešková, Z; Koreňová, J; Kuchta, T

    2014-04-01

    A total of 256 isolates of Staphylococcus aureus were isolated from 98 samples (34 swabs and 64 food samples) obtained from small or medium meat- and cheese-processing plants in Slovakia. The strains were genotypically characterized by multiple locus variable number of tandem repeats analysis (MLVA), involving multiplex polymerase chain reaction (PCR) with subsequent separation of the amplified DNA fragments by an automated flow-through gel electrophoresis. With the panel of isolates, MLVA produced 31 profile types, which was a sufficient discrimination to facilitate the description of spatial and temporal aspects of contamination. Further data on MLVA discrimination were obtained by typing a subpanel of strains by multiple locus sequence typing (MLST). MLVA coupled to automated electrophoresis proved to be an effective, comparatively fast and inexpensive method for tracing S. aureus contamination of food-processing factories. Subspecies genotyping of microbial contaminants in food-processing factories may facilitate identification of spatial and temporal aspects of the contamination. This may help to properly manage the process hygiene. With S. aureus, multiple locus variable number of tandem repeats analysis (MLVA) proved to be an effective method for the purpose, being sufficiently discriminative, yet comparatively fast and inexpensive. The application of automated flow-through gel electrophoresis to separation of DNA fragments produced by multiplex PCR helped to improve the accuracy and speed of the method. © 2013 The Society for Applied Microbiology.

  8. Automation in the Teaching of Descriptive Geometry and CAD. High-Level CAD Templates Using Script Languages

    NASA Astrophysics Data System (ADS)

    Moreno, R.; Bazán, A. M.

    2017-10-01

    The main purpose of this work is to study improvements to the learning of technical drawing and descriptive geometry by applying automated processes, assisted by high-level CAD templates (HLCts), to exercises that are usually solved manually with traditional techniques. Given that an exercise can be solved step by step with traditional procedures, as detailed in technical drawing and descriptive geometry manuals, CAD applications allow us to do the same and later generalize it by incorporating references. Traditional teaching techniques have become outdated and have been relegated in current curricula; however, they can still be applied in certain automation processes. The use of geometric references (variables in script languages) and their incorporation into HLCts allows the automation of drawing processes. Instead of repeatedly creating similar exercises or modifying data in the same exercises, users can employ HLCts to generate future variations of these exercises. This paper introduces the automation process for generating exercises based on CAD script files, aided by parametric geometry calculation tools. The proposed method allows new exercises to be designed without user intervention. The integration of CAD, mathematics, and descriptive geometry facilitates their joint learning. Automation in the generation of exercises not only saves time but also increases the quality of the statements and reduces the possibility of human error.
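
    As a toy illustration of the template idea (not the authors' HLCts), a single parametric script can generate many variants of the same descriptive-geometry exercise together with their known solutions.

    ```python
    # Sketch: parametric generation of a "true length of a segment" exercise.
    import random

    def generate_exercise(seed):
        """Produce the data for one exercise variant plus its reference solution."""
        rng = random.Random(seed)
        p1 = tuple(rng.randint(0, 50) for _ in range(3))
        p2 = tuple(rng.randint(0, 50) for _ in range(3))
        true_length = sum((a - b) ** 2 for a, b in zip(p1, p2)) ** 0.5
        return {"P1": p1, "P2": p2, "solution_true_length": round(true_length, 2)}

    # Each seed yields a new statement with a known solution, with no manual editing.
    for seed in range(3):
        print(generate_exercise(seed))
    ```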

  9. Successfully Automating Library Consortia: Procedures To Facilitate Governance, Management and Cooperation. DataResearch Automation Guide Series, Number Three.

    ERIC Educational Resources Information Center

    Data Research Associates, Inc., St. Louis, MO.

    Sharing a local automated library system will generally reduce the costs of automation for each participating library and will facilitate the sharing of resources. To set up a consortium, libraries must first identify and agree on governance issues and methods for dealing with these issues. Issues range from ownership, management, and location of…

  10. Reliability of fully automated versus visually controlled pre- and post-processing of resting-state EEG.

    PubMed

    Hatz, F; Hardmeier, M; Bousleiman, H; Rüegg, S; Schindler, C; Fuhr, P

    2015-02-01

    To compare the reliability of a newly developed Matlab® toolbox for the fully automated, pre- and post-processing of resting state EEG (automated analysis, AA) with the reliability of analysis involving visually controlled pre- and post-processing (VA). 34 healthy volunteers (age: median 38.2 (20-49), 82% female) underwent three consecutive 256-channel resting-state EEGs at one-year intervals. Results of frequency analysis of AA and VA were compared with Pearson correlation coefficients, and reliability over time was assessed with intraclass correlation coefficients (ICC). The mean correlation coefficient between AA and VA was 0.94±0.07; the mean ICC was 0.83±0.05 for AA and 0.84±0.07 for VA. AA and VA yield very similar results for spectral EEG analysis and are equally reliable. AA is less time-consuming, completely standardized, and independent of raters and their training. Automated processing of EEG facilitates workflow in quantitative EEG analysis. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
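
    The AA-versus-VA agreement check can be sketched with synthetic data (illustrative only; the study used real spectral outputs): a per-subject Pearson correlation between the two pipelines' feature vectors.

    ```python
    # Sketch: per-subject agreement between automated (AA) and visually controlled (VA)
    # spectral outputs, using synthetic data and NumPy only.
    import numpy as np

    rng = np.random.default_rng(0)
    n_subjects, n_features = 34, 50                 # e.g. relative power per band/region
    va = rng.random((n_subjects, n_features))
    aa = va + 0.05 * rng.standard_normal((n_subjects, n_features))  # AA ~ VA + noise

    per_subject_r = [np.corrcoef(aa[i], va[i])[0, 1] for i in range(n_subjects)]
    print(f"mean AA-vs-VA correlation: {np.mean(per_subject_r):.2f} "
          f"± {np.std(per_subject_r):.2f}")
    ```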

  11. Automated astatination of biomolecules – a stepping stone towards multicenter clinical trials

    PubMed Central

    Aneheim, Emma; Albertsson, Per; Bäck, Tom; Jensen, Holger; Palm, Stig; Lindegren, Sture

    2015-01-01

    To facilitate multicentre clinical studies on targeted alpha therapy, it is necessary to develop an automated, on-site procedure for conjugating rare, short-lived, alpha-emitting radionuclides to biomolecules. Astatine-211 is one of the few alpha-emitting nuclides with appropriate chemical and physical properties for use in targeted therapies for cancer. Due to the very short range of the emitted α-particles, this therapy is particularly suited to treating occult, disseminated cancers. Astatine is not intrinsically tumour-specific; therefore, it requires an appropriate tumour-specific targeting vector, which can guide the radiation to the cancer cells. Consequently, an appropriate method is required for coupling the nuclide to the vector. To increase the availability of astatine-211 radiopharmaceuticals for targeted alpha therapy, their production should be automated. Here, we present a method that combines dry distillation of astatine-211 and a synthesis module for producing radiopharmaceuticals into a process platform. This platform will standardize production of astatinated radiopharmaceuticals, and hence, it will facilitate large clinical studies focused on this promising, but chemically challenging, alpha-emitting radionuclide. In this work, we describe the process platform, and we demonstrate the production of both astatine-211, for preclinical use, and astatine-211 labelled antibodies. PMID:26169786

  12. Stunning systems for poultry

    USDA-ARS?s Scientific Manuscript database

    Poultry are stunned immediately prior to slaughter to facilitate automated processing, to minimize the subsequent death struggle and thereby minimize carcass damage and downgrades, and to render the bird unconscious and incapable of perceiving pain. A stunning method for slaughter should be consider...

  13. An overview of the Progenika ID CORE XT: an automated genotyping platform based on a fluidic microarray system.

    PubMed

    Goldman, Mindy; Núria, Núria; Castilho, Lilian M

    2015-01-01

    Automated testing platforms facilitate the introduction of red cell genotyping of patients and blood donors. Fluidic microarray systems, such as Luminex XMAP (Austin, TX), are used in many clinical applications, including HLA and HPA typing. The Progenika ID CORE XT (Progenika Biopharma-Grifols, Bizkaia, Spain) uses this platform to analyze 29 polymorphisms determining 37 antigens in 10 blood group systems. Once DNA has been extracted, processing time is approximately 4 hours. The system is highly automated and includes integrated analysis software that produces a file and a report with genotype and predicted phenotype results.

  14. Optimizing the balance between task automation and human manual control in simulated submarine track management.

    PubMed

    Chen, Stephanie I; Visser, Troy A W; Huf, Samuel; Loft, Shayne

    2017-09-01

    Automation can improve operator performance and reduce workload, but can also degrade operator situation awareness (SA) and the ability to regain manual control. In 3 experiments, we examined the extent to which automation could be designed to benefit performance while ensuring that individuals maintained SA and could regain manual control. Participants completed a simulated submarine track management task under varying task load. The automation was designed to facilitate information acquisition and analysis, but did not make task decisions. Relative to a condition with no automation, the continuous use of automation improved performance and reduced subjective workload, but degraded SA. Automation that was engaged and disengaged by participants as required (adaptable automation) moderately improved performance and reduced workload relative to no automation, but degraded SA. Automation engaged and disengaged based on task load (adaptive automation) provided no benefit to performance or workload, and degraded SA relative to no automation. Automation never led to significant return-to-manual deficits. However, all types of automation led to degraded performance on a nonautomated task that shared information processing requirements with automated tasks. Given these outcomes, further research is urgently required to establish how to design automation to maximize performance while keeping operators cognitively engaged. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  15. Application of remotely sensed multispectral data to automated analysis of marshland vegetation. Inference to the location of breeding habitats of the salt marsh mosquito (Aedes sollicitans)

    NASA Technical Reports Server (NTRS)

    Cibula, W. G.

    1976-01-01

    The techniques used for the automated classification of marshland vegetation and for the color-coded display of remotely acquired data to facilitate the control of mosquito breeding are presented. A multispectral scanner system and its mode of operation are described, and the computer processing techniques are discussed. The procedures for the selection of calibration sites are explained. Three methods for displaying color-coded classification data are presented.

  16. Automation and hypermedia technology applications

    NASA Technical Reports Server (NTRS)

    Jupin, Joseph H.; Ng, Edward W.; James, Mark L.

    1993-01-01

    This paper represents a progress report on HyLite (Hypermedia Library technology): a research and development activity to produce a versatile system as part of NASA's technology thrusts in automation, information sciences, and communications. HyLite can be used as a system or tool to facilitate the creation and maintenance of large distributed electronic libraries. The contents of such a library may be software components, hardware parts or designs, scientific data sets or databases, configuration management information, etc. Proliferation of computer use has made the diversity and quantity of information too large for any single user to sort, process, and utilize effectively. In response to this information deluge, we have created HyLite to enable the user to process relevant information into a more efficient organization for presentation, retrieval, and readability. To accomplish this end, we have incorporated various AI techniques into the HyLite hypermedia engine to facilitate parameters and properties of the system. The proposed techniques include intelligent searching tools for the libraries, intelligent retrievals, and navigational assistance based on user histories. HyLite itself is based on an earlier project, the Encyclopedia of Software Components (ESC) which used hypermedia to facilitate and encourage software reuse.

  17. Novel SPECT Technologies and Approaches in Cardiac Imaging

    PubMed Central

    Slomka, Piotr; Hung, Guang-Uei; Germano, Guido; Berman, Daniel S.

    2017-01-01

    Recent novel approaches in myocardial perfusion single photon emission CT (SPECT) have been facilitated by new dedicated high-efficiency hardware with solid-state detectors and optimized collimators. New protocols include very low-dose (1 mSv) stress-only, two-position imaging to mitigate attenuation artifacts, and simultaneous dual-isotope imaging. Attenuation correction can be performed by specialized low-dose systems or by previously obtained CT coronary calcium scans. Hybrid protocols using CT angiography have been proposed. Image quality improvements have been demonstrated by novel reconstructions and motion correction. Fast SPECT acquisition facilitates dynamic flow and early function measurements. Image processing algorithms have become automated with virtually unsupervised extraction of quantitative imaging variables. This automation facilitates integration with clinical variables derived by machine learning to predict patient outcome or diagnosis. In this review, we describe new imaging protocols made possible by the new hardware developments. We also discuss several novel software approaches for the quantification and interpretation of myocardial perfusion SPECT scans. PMID:29034066

  18. Impact of pharmacy automation on patient waiting time: an application of computer simulation.

    PubMed

    Tan, Woan Shin; Chua, Siang Li; Yong, Keng Woh; Wu, Tuck Seng

    2009-06-01

    This paper aims to illustrate the use of computer simulation in evaluating the impact of a prototype automated dispensing system on waiting time in an outpatient pharmacy and its potential as a routine tool in pharmacy management. A discrete event simulation model was developed to investigate the impact of a prototype automated dispensing system on operational efficiency and service standards in an outpatient pharmacy. The simulation results suggest that automating the prescription-filling function using a prototype that picks and packs at 20 seconds per item will not assist the pharmacy in achieving the waiting time target of 30 minutes for all patients. Regardless of the state of automation, to meet the waiting time target, 2 additional pharmacists are needed to overcome the process bottleneck at the point of medication dispensing. However, if automated dispensing is the preferred option, the system needs to be twice as fast as the current configuration to facilitate the reduction of the 95th percentile patient waiting time to below 30 minutes. The faster processing speed will concomitantly allow the pharmacy to reduce the number of pharmacy technicians from 11 to 8. Simulation was found to be a useful and low-cost method that allows an otherwise expensive and resource-intensive evaluation of new work processes and technology to be completed within a short time.
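
    A toy discrete-event model in the spirit of the study (not its model) can be built with the simpy library; the arrival rate, service time, and staffing level below are invented values.

    ```python
    # Toy discrete-event simulation of an outpatient pharmacy queue.
    import random
    import simpy

    SERVICE_MIN = 4      # mean minutes to fill and dispense one prescription (assumed)
    ARRIVAL_MEAN = 2     # mean minutes between patient arrivals (assumed)
    waits = []

    def patient(env, pharmacy):
        arrive = env.now
        with pharmacy.request() as req:
            yield req                                   # wait for a free pharmacist
            waits.append(env.now - arrive)
            yield env.timeout(random.expovariate(1.0 / SERVICE_MIN))

    def arrivals(env, pharmacy):
        while True:
            yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN))
            env.process(patient(env, pharmacy))

    env = simpy.Environment()
    pharmacy = simpy.Resource(env, capacity=3)          # number of dispensing pharmacists
    env.process(arrivals(env, pharmacy))
    env.run(until=8 * 60)                               # one 8-hour clinic day
    waits.sort()
    print("95th percentile wait (min):", round(waits[int(0.95 * len(waits))], 1))
    ```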

  19. KAMO: towards automated data processing for microcrystals.

    PubMed

    Yamashita, Keitaro; Hirata, Kunio; Yamamoto, Masaki

    2018-05-01

    In protein microcrystallography, radiation damage often hampers complete and high-resolution data collection from a single crystal, even under cryogenic conditions. One promising solution is to collect small wedges of data (5-10°) separately from multiple crystals. The data from these crystals can then be merged into a complete reflection-intensity set. However, data processing of multiple small-wedge data sets is challenging. Here, a new open-source data-processing pipeline, KAMO, which utilizes existing programs, including the XDS and CCP4 packages, has been developed to automate whole data-processing tasks in the case of multiple small-wedge data sets. Firstly, KAMO processes individual data sets and collates those indexed with equivalent unit-cell parameters. The space group is then chosen and any indexing ambiguity is resolved. Finally, clustering is performed, followed by merging with outlier rejections, and a report is subsequently created. Using synthetic and several real-world data sets collected from hundreds of crystals, it was demonstrated that merged structure-factor amplitudes can be obtained in a largely automated manner using KAMO, which greatly facilitated the structure analyses of challenging targets that only produced microcrystals.
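
    One step, the collation of wedges indexed with equivalent unit-cell parameters, can be sketched as follows; the data structures and tolerance are hypothetical, and the real KAMO pipeline drives XDS and CCP4 and also handles indexing ambiguity, clustering, and outlier rejection.

    ```python
    # Sketch: group small-wedge datasets whose unit-cell parameters agree within a tolerance.
    def cells_equivalent(c1, c2, tol=0.02):
        """Relative agreement of unit-cell parameters (a, b, c, alpha, beta, gamma)."""
        return all(abs(x - y) / max(x, y) <= tol for x, y in zip(c1, c2))

    def collate(datasets):
        groups = []
        for name, cell in datasets:
            for group in groups:
                if cells_equivalent(cell, group["cell"]):
                    group["members"].append(name)
                    break
            else:
                groups.append({"cell": cell, "members": [name]})
        return groups

    wedges = [("xtal001", (78.1, 78.2, 37.0, 90, 90, 90)),
              ("xtal002", (77.9, 78.0, 36.9, 90, 90, 90)),
              ("xtal003", (61.3, 61.3, 96.8, 90, 90, 120))]
    for g in collate(wedges):
        print(g["cell"], "->", g["members"])
    ```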

  20. Inselect: Automating the Digitization of Natural History Collections

    PubMed Central

    Hudson, Lawrence N.; Blagoderov, Vladimir; Heaton, Alice; Holtzhausen, Pieter; Livermore, Laurence; Price, Benjamin W.; van der Walt, Stéfan; Smith, Vincent S.

    2015-01-01

    The world’s natural history collections constitute an enormous evidence base for scientific research on the natural world. To facilitate these studies and improve access to collections, many organisations are embarking on major programmes of digitization. This requires automated approaches to mass-digitization that support rapid imaging of specimens and associated data capture, in order to process the tens of millions of specimens common to most natural history collections. In this paper we present Inselect—a modular, easy-to-use, cross-platform suite of open-source software tools that supports the semi-automated processing of specimen images generated by natural history digitization programmes. The software is made up of a Windows, Mac OS X, and Linux desktop application, together with command-line tools that are designed for unattended operation on batches of images. Blending image visualisation algorithms that automatically recognise specimens together with workflows to support post-processing tasks such as barcode reading, label transcription and metadata capture, Inselect fills a critical gap to increase the rate of specimen digitization. PMID:26599208

  1. Inselect: Automating the Digitization of Natural History Collections.

    PubMed

    Hudson, Lawrence N; Blagoderov, Vladimir; Heaton, Alice; Holtzhausen, Pieter; Livermore, Laurence; Price, Benjamin W; van der Walt, Stéfan; Smith, Vincent S

    2015-01-01

    The world's natural history collections constitute an enormous evidence base for scientific research on the natural world. To facilitate these studies and improve access to collections, many organisations are embarking on major programmes of digitization. This requires automated approaches to mass-digitization that support rapid imaging of specimens and associated data capture, in order to process the tens of millions of specimens common to most natural history collections. In this paper we present Inselect, a modular, easy-to-use, cross-platform suite of open-source software tools that supports the semi-automated processing of specimen images generated by natural history digitization programmes. The software is made up of a Windows, Mac OS X, and Linux desktop application, together with command-line tools that are designed for unattended operation on batches of images. Blending image visualisation algorithms that automatically recognise specimens together with workflows to support post-processing tasks such as barcode reading, label transcription and metadata capture, Inselect fills a critical gap to increase the rate of specimen digitization.

  2. Internet-Based Cervical Cytology Screening System

    DTIC Science & Technology

    2007-04-01

    ... approaches to cervical cancer screening possible. In addition, advances in information technology have facilitated the Internet transmission and archival ... processes in the clinical laboratory. Recent technological advances in specimen preparation and computerized primary screening make automated ... Award Number: W81XWH-04-C-0083.

  3. Software component quality evaluation

    NASA Technical Reports Server (NTRS)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  4. Towards an integrated optofluidic system for highly sensitive detection of antibiotics in seawater incorporating bimodal waveguide photonic biosensors and complex, active microfluidics

    NASA Astrophysics Data System (ADS)

    Szydzik, C.; Gavela, A. F.; Roccisano, J.; Herranz de Andrés, S.; Mitchell, A.; Lechuga, L. M.

    2016-12-01

    We present recent results on the realisation and demonstration of an integrated optofluidic lab-on-a-chip measurement system. The system consists of an integrated on-chip automated microfluidic fluid handling subsystem, coupled with bimodal nano-interferometer waveguide technology, and is applied in the context of detection of antibiotics in seawater. The bimodal waveguide (BMWG) is a highly sensitive label-free biosensor. Integration of complex microfluidic systems with bimodal waveguide technology enables on-chip sample handling and fluid processing capabilities and allows for significant automation of experimental processes. The on-chip fluid-handling subsystem is realised through the integration of pneumatically actuated elastomer pumps and valves, enabling high temporal resolution sample and reagent delivery and facilitating multiplexed detection processes.

  5. The enhanced Software Life Cycle Support Environment (ProSLCSE): Automation for enterprise and process modeling

    NASA Technical Reports Server (NTRS)

    Milligan, James R.; Dutton, James E.

    1993-01-01

    In this paper, we have introduced a comprehensive method for enterprise modeling that addresses the three important aspects of how an organization goes about its business. FirstEP includes infrastructure modeling, information modeling, and process modeling notations that are intended to be easy to learn and use. The notations stress the use of straightforward visual languages that are intuitive, syntactically simple, and semantically rich. ProSLCSE will be developed with automated tools and services to facilitate enterprise modeling and process enactment. In the spirit of FirstEP, ProSLCSE tools will also be seductively easy to use. Achieving fully managed, optimized software development and support processes will be long and arduous for most software organizations, and many serious problems will have to be solved along the way. ProSLCSE will provide the ability to document, communicate, and modify existing processes, which is the necessary first step.

  6. Non-Contact Conductivity Measurement for Automated Sample Processing Systems

    NASA Technical Reports Server (NTRS)

    Beegle, Luther W.; Kirby, James P.

    2012-01-01

    A new method has been developed for monitoring and control of automated sample processing and preparation especially focusing on desalting of samples before analytical analysis (described in more detail in Automated Desalting Apparatus (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid phase sample preparation media, allows monitoring of the process, and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of sample processing protocols, and greatly minimizes use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement will define three key conditions for the sample preparation process. First, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified, by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement for monitoring sample preparation will not only facilitate automation of the sample preparation and processing, but will also act as a way to optimize the operational time and use of consumables.
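
    A conceptual sketch of using the conductivity reading as the trigger that advances the desalting sequence; the thresholds and simulated readings below are assumptions, not values from the brief.

    ```python
    # Sketch: a tiny state machine advanced by inlet/outlet conductivity readings.
    LOW, HIGH = 5.0, 500.0      # assumed conductivity thresholds (uS/cm)

    def next_state(state, conductivity):
        """Advance the desalting sequence when the expected condition is reached."""
        if state == "flushing" and conductivity < LOW:
            return "acidify"        # system neutralized: start the acid wash
        if state == "acidify" and conductivity > HIGH:
            return "basify"         # acidic condition reached: start the base wash
        if state == "basify" and conductivity > HIGH:
            return "done"           # basic condition reached: sequence complete
        return state                # otherwise keep waiting

    state = "flushing"
    for reading in [300.0, 40.0, 3.2, 120.0, 650.0, 700.0]:   # simulated probe data
        state = next_state(state, reading)
        print(f"conductivity={reading:6.1f}  ->  state={state}")
    ```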

  7. Why poultry should be stunned at slaughter and the welfare advantages/challenges of electrical and gas stunning

    USDA-ARS?s Scientific Manuscript database

    Poultry are stunned immediately prior to slaughter to facilitate automated processing, to suppress the subsequent death struggle and thereby minimize carcass damage and downgrades, and to render the bird unconscious and incapable of perceiving pain. A stunning method should be considered ethical if ...

  8. Combining Archetypes, Ontologies and Formalization Enables Automated Computation of Quality Indicators.

    PubMed

    Legaz-García, María Del Carmen; Dentler, Kathrin; Fernández-Breis, Jesualdo Tomás; Cornet, Ronald

    2017-01-01

    ArchMS is a framework that represents clinical information and knowledge using ontologies in OWL, which facilitates semantic interoperability and thereby the exploitation and secondary use of clinical data. However, it does not yet support the automated assessment of quality of care. CLIF is a stepwise method to formalize quality indicators. The method has been implemented in the CLIF tool which supports its users in generating computable queries based on a patient data model which can be based on archetypes. To enable the automated computation of quality indicators using ontologies and archetypes, we tested whether ArchMS and the CLIF tool can be integrated. We successfully automated the process of generating SPARQL queries from quality indicators that have been formalized with CLIF and integrated them into ArchMS. Hence, ontologies and archetypes can be combined for the execution of formalized quality indicators.

  9. The value of the Semantic Web in the laboratory.

    PubMed

    Frey, Jeremy G

    2009-06-01

    The Semantic Web is beginning to have an impact on the wider chemical and physical sciences, beyond bioinformatics, which adopted it earlier. While useful in large-scale, data-driven science with automated processing, these technologies can also help integrate the work of smaller-scale laboratories producing diverse data. The semantics aid the discovery and reliable re-use of data, provide improved provenance, and facilitate automated processing through increased resilience to changes in presentation and reduced ambiguity. The Semantic Web, its tools and collections are not yet competitive with well-established solutions to current problems. It is in the reduced cost of instituting solutions to new problems that the versatility of Semantic Web-enabled data and resources will make their mark once the more general-purpose tools are more available.

  10. Automated smear counting and data processing using a notebook computer in a biomedical research facility.

    PubMed

    Ogata, Y; Nishizawa, K

    1995-10-01

    An automated smear counting and data processing system for a life science laboratory was developed to facilitate routine surveys and eliminate human errors by using a notebook computer. This system was composed of a personal computer, a liquid scintillation counter and a well-type NaI(Tl) scintillation counter. The radioactivity of smear samples was automatically measured by these counters. The personal computer received raw signals from the counters through an interface of RS-232C. The software for the computer evaluated the surface density of each radioisotope and printed out that value along with other items as a report. The software was programmed in Pascal language. This system was successfully applied to routine surveys for contamination in our facility.
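
    The data-processing side of such a system can be sketched as follows; the serial protocol, counting efficiency, smear area, and wipe factor are placeholders, and pyserial is assumed for the RS-232C link.

    ```python
    # Sketch: read net counts from a counter over a serial line and report surface density.
    import serial  # pyserial (assumed for the RS-232C interface)

    EFFICIENCY  = 0.30    # counting efficiency (assumed)
    SMEAR_AREA  = 100.0   # cm^2 wiped per smear (assumed)
    WIPE_FACTOR = 0.10    # fraction of removable activity picked up (assumed)

    def surface_density(net_cpm):
        """Convert net counts per minute to Bq/cm^2 under the assumptions above."""
        bq = net_cpm / 60.0 / EFFICIENCY
        return bq / (SMEAR_AREA * WIPE_FACTOR)

    with serial.Serial("/dev/ttyUSB0", 9600, timeout=5) as port:
        line = port.readline().decode("ascii").strip()   # e.g. "SAMPLE01,235.0" (assumed format)
        sample_id, cpm = line.split(",")
        print(sample_id, f"{surface_density(float(cpm)):.3f} Bq/cm^2")
    ```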

  11. Automated Dissolution for Enteric-Coated Aspirin Tablets: A Case Study for Method Transfer to a RoboDis II.

    PubMed

    Ibrahim, Sarah A; Martini, Luigi

    2014-08-01

    Dissolution method transfer is a complicated yet common process in the pharmaceutical industry. With increased pharmaceutical product manufacturing and dissolution acceptance requirements, dissolution testing has become one of the most labor-intensive quality control testing methods. There is an increased trend for automation in dissolution testing, particularly for large pharmaceutical companies to reduce variability and increase personnel efficiency. There is no official guideline for dissolution testing method transfer from a manual, semi-automated, to automated dissolution tester. In this study, a manual multipoint dissolution testing procedure for an enteric-coated aspirin tablet was transferred effectively and reproducibly to a fully automated dissolution testing device, RoboDis II. Enteric-coated aspirin samples were used as a model formulation to assess the feasibility and accuracy of media pH change during continuous automated dissolution testing. Several RoboDis II parameters were evaluated to ensure the integrity and equivalency of dissolution method transfer from a manual dissolution tester. This current study provides a systematic outline for the transfer of the manual dissolution testing protocol to an automated dissolution tester. This study further supports that automated dissolution testers compliant with regulatory requirements and similar to manual dissolution testers facilitate method transfer. © 2014 Society for Laboratory Automation and Screening.

  12. Improving clinical laboratory efficiency: a time-motion evaluation of the Abbott m2000 RealTime and Roche COBAS AmpliPrep/COBAS TaqMan PCR systems for the simultaneous quantitation of HIV-1 RNA and HCV RNA.

    PubMed

    Amendola, Alessandra; Coen, Sabrina; Belladonna, Stefano; Pulvirenti, F Renato; Clemens, John M; Capobianchi, M Rosaria

    2011-08-01

    Diagnostic laboratories need automation that facilitates efficient processing and workflow management to meet today's challenges for expanding services and reducing cost, yet maintaining the highest levels of quality. Processing efficiency of two commercially available automated systems for quantifying HIV-1 and HCV RNA, Abbott m2000 system and Roche COBAS Ampliprep/COBAS TaqMan 96 (docked) systems (CAP/CTM), was evaluated in a mid/high throughput workflow laboratory using a representative daily workload of 24 HCV and 72 HIV samples. Three test scenarios were evaluated: A) one run with four batches on the CAP/CTM system, B) two runs on the Abbott m2000 and C) one run using the Abbott m2000 maxCycle feature (maxCycle) for co-processing these assays. Cycle times for processing, throughput and hands-on time were evaluated. Overall processing cycle time was 10.3, 9.1 and 7.6 h for Scenarios A), B) and C), respectively. Total hands-on time for each scenario was, in order, 100.0 (A), 90.3 (B) and 61.4 min (C). The interface of an automated analyzer to the laboratory workflow, notably system set up for samples and reagents and clean up functions, are as important as the automation capability of the analyzer for the overall impact to processing efficiency and operator hands-on time.

  13. Drone Mission Definition and Implementation for Automated Infrastructure Inspection Using Airborne Sensors

    PubMed Central

    Besada, Juan A.; Bergesio, Luca; Campaña, Iván; Vaquero-Melchor, Diego; Bernardos, Ana M.; Casar, José R.

    2018-01-01

    This paper describes a Mission Definition System and the automated flight process it enables to implement measurement plans for discrete infrastructure inspections using aerial platforms, and specifically multi-rotor drones. The mission definition aims at improving planning efficiency with respect to state-of-the-art waypoint-based techniques, using high-level mission definition primitives and linking them with realistic flight models to simulate the inspection in advance. It also provides flight scripts and measurement plans which can be executed by commercial drones. Its user interfaces facilitate mission definition, pre-flight 3D synthetic mission visualisation and flight evaluation. Results are delivered for a set of representative infrastructure inspection flights, showing the accuracy of the flight prediction tools in actual operations using automated flight control. PMID:29641506

  14. Drone Mission Definition and Implementation for Automated Infrastructure Inspection Using Airborne Sensors.

    PubMed

    Besada, Juan A; Bergesio, Luca; Campaña, Iván; Vaquero-Melchor, Diego; López-Araquistain, Jaime; Bernardos, Ana M; Casar, José R

    2018-04-11

    This paper describes a Mission Definition System and the automated flight process it enables to implement measurement plans for discrete infrastructure inspections using aerial platforms, and specifically multi-rotor drones. The mission definition aims at improving planning efficiency with respect to state-of-the-art waypoint-based techniques, using high-level mission definition primitives and linking them with realistic flight models to simulate the inspection in advance. It also provides flight scripts and measurement plans which can be executed by commercial drones. Its user interfaces facilitate mission definition, pre-flight 3D synthetic mission visualisation and flight evaluation. Results are delivered for a set of representative infrastructure inspection flights, showing the accuracy of the flight prediction tools in actual operations using automated flight control.
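
    The Mission Definition System itself is not reproduced in this record. As a hedged illustration of the general idea of expanding a high-level inspection primitive into a waypoint list that a commercial drone could execute, the sketch below generates an orbit around an inspection target; the function names, field names, and geometry are hypothetical and not taken from the publication.

```python
"""Hypothetical expansion of a high-level inspection primitive ("orbit a
structure") into a waypoint list; names and parameters are illustrative."""
import math
from dataclasses import dataclass

EARTH_RADIUS_M = 6_371_000.0

@dataclass
class Waypoint:
    lat: float
    lon: float
    alt_m: float

def orbit_primitive(center_lat, center_lon, radius_m, alt_m, n_points=12):
    """Return waypoints evenly spaced on a circle around the inspection target."""
    waypoints = []
    for k in range(n_points):
        bearing = 2.0 * math.pi * k / n_points
        # Small-displacement approximation: metres converted to degrees.
        dlat = (radius_m * math.cos(bearing)) / EARTH_RADIUS_M
        dlon = (radius_m * math.sin(bearing)) / (
            EARTH_RADIUS_M * math.cos(math.radians(center_lat)))
        waypoints.append(Waypoint(center_lat + math.degrees(dlat),
                                  center_lon + math.degrees(dlon),
                                  alt_m))
    return waypoints

if __name__ == "__main__":
    for wp in orbit_primitive(40.4168, -3.7038, radius_m=30.0, alt_m=25.0):
        print(f"{wp.lat:.6f}, {wp.lon:.6f}, {wp.alt_m:.1f} m")
```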

  15. The Gemini Recipe System: a dynamic workflow for automated data reduction

    NASA Astrophysics Data System (ADS)

    Labrie, Kathleen; Allen, Craig; Hirst, Paul; Holt, Jennifer; Allen, River; Dement, Kaniela

    2010-07-01

    Gemini's next generation data reduction software suite aims to offer greater automation of the data reduction process without compromising the flexibility required by science programs using advanced or unusual observing strategies. The Recipe System is central to our new data reduction software. Developed in Python, it facilitates near-real time processing for data quality assessment, and both on- and off-line science quality processing. The Recipe System can be run as a standalone application or as the data processing core of an automatic pipeline. The data reduction process is defined in a Recipe written in a science (as opposed to computer) oriented language, and consists of a sequence of data reduction steps, called Primitives, which are written in Python and can be launched from the PyRAF user interface by users wishing to use them interactively for more hands-on optimization of the data reduction process. The fact that the same processing Primitives can be run within both the pipeline context and interactively in a PyRAF session is an important strength of the Recipe System. The Recipe System offers dynamic flow control allowing for decisions regarding processing and calibration to be made automatically, based on the pixel and the metadata properties of the dataset at the stage in processing where the decision is being made, and the context in which the processing is being carried out. Processing history and provenance recording are provided by the AstroData middleware, which also offers header abstraction and data type recognition to facilitate the development of instrument-agnostic processing routines.
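
    The Recipe System code is not included in this abstract. The minimal sketch below only illustrates the general pattern it describes, a recipe expressed as an ordered list of named primitives dispatched against a shared processing context; the class names, primitive names, and operations are invented for illustration and are not Gemini code.

```python
"""Toy recipe/primitive dispatcher illustrating the pattern described above;
steps and names are invented, not taken from the Gemini Recipe System."""

class ReductionContext:
    """Carries the dataset and accumulated provenance between primitives."""
    def __init__(self, data):
        self.data = data
        self.history = []

class Primitives:
    """Each method is one named data reduction step ("primitive")."""
    def subtract_bias(self, ctx):
        ctx.data = [x - 1.0 for x in ctx.data]      # placeholder operation
        ctx.history.append("subtract_bias")

    def flat_correct(self, ctx):
        ctx.data = [x / 2.0 for x in ctx.data]      # placeholder operation
        ctx.history.append("flat_correct")

    def assess_quality(self, ctx):
        ctx.history.append("assess_quality")

def run_recipe(recipe, primitives, ctx):
    """Dispatch each named step in order; unknown steps fail loudly."""
    for step in recipe:
        getattr(primitives, step)(ctx)
    return ctx

if __name__ == "__main__":
    recipe = ["subtract_bias", "flat_correct", "assess_quality"]
    ctx = run_recipe(recipe, Primitives(), ReductionContext([10.0, 12.0, 14.0]))
    print(ctx.data, ctx.history)
```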

  16. Fast and Efficient Fragment-Based Lead Generation by Fully Automated Processing and Analysis of Ligand-Observed NMR Binding Data.

    PubMed

    Peng, Chen; Frommlet, Alexandra; Perez, Manuel; Cobas, Carlos; Blechschmidt, Anke; Dominguez, Santiago; Lingel, Andreas

    2016-04-14

    NMR binding assays are routinely applied in hit finding and validation during early stages of drug discovery, particularly for fragment-based lead generation. To this end, compound libraries are screened by ligand-observed NMR experiments such as STD, T1ρ, and CPMG to identify molecules interacting with a target. The analysis of a high number of complex spectra is performed largely manually and therefore represents a limiting step in hit generation campaigns. Here we report a novel integrated computational procedure that processes and analyzes ligand-observed proton and fluorine NMR binding data in a fully automated fashion. A performance evaluation comparing automated and manual analysis results on (19)F- and (1)H-detected data sets shows that the program delivers robust, high-confidence hit lists in a fraction of the time needed for manual analysis and greatly facilitates visual inspection of the associated NMR spectra. These features enable considerably higher throughput, the assessment of larger libraries, and shorter turn-around times.

  17. Automated Ontology Generation Using Spatial Reasoning

    NASA Astrophysics Data System (ADS)

    Coalter, Alton; Leopold, Jennifer L.

    Recently there has been much interest in using ontologies to facilitate knowledge representation, integration, and reasoning. Correspondingly, the extent of the information embodied by an ontology is increasing beyond the conventional is_a and part_of relationships. To address these requirements, a vast amount of digitally available information may need to be considered when building ontologies, prompting a desire for software tools to automate at least part of the process. The main efforts in this direction have involved textual information retrieval and extraction methods. For some domains, the basic relationships could be extended further through the analysis of 2D and/or 3D images. For this type of media, image processing algorithms are more appropriate than textual analysis methods. Herein we present an algorithm that, given a collection of 3D image files, utilizes Qualitative Spatial Reasoning (QSR) to automate the creation of an ontology for the objects represented by the images, relating the objects in terms of is_a and part_of relationships and also through unambiguous Region Connection Calculus (RCC) relations.
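
    As a hedged sketch of how qualitative spatial relations can be mapped onto ontology relationships, the snippet below infers part_of links from an RCC-style "proper part" test on voxelised 3D objects. The containment test is a deliberate simplification for illustration; the abstract does not describe the authors' actual algorithm, and none of the names below come from it.

```python
"""Simplified illustration: derive part_of links from voxelised 3D objects
using an RCC-style proper-part test (not the paper's algorithm)."""

def proper_part(a: set, b: set) -> bool:
    """RCC 'proper part': every voxel of a lies in b, and a != b."""
    return a < b

def build_part_of(objects: dict) -> list:
    """Return (part, whole) pairs for every proper-part relation found."""
    links = []
    for name_a, vox_a in objects.items():
        for name_b, vox_b in objects.items():
            if name_a != name_b and proper_part(vox_a, vox_b):
                links.append((name_a, name_b))
    return links

if __name__ == "__main__":
    # Toy voxel sets standing in for segmented 3D image regions.
    objects = {
        "skeleton": {(x, y, z) for x in range(10) for y in range(10) for z in range(10)},
        "skull": {(x, y, z) for x in range(3) for y in range(3) for z in range(3)},
    }
    print(build_part_of(objects))   # [('skull', 'skeleton')]
```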

  18. Phase II Report: Design Study for Automated Document Location and Control System.

    ERIC Educational Resources Information Center

    Booz, Allen Applied Research, Inc., Bethesda, MD.

    The scope of Phase II is the design of a system for document control within the National Agricultural Library (NAL) that will facilitate the processing of the documents selected, ordered, or received; that will avoid backlogs; and that will provide rapid document location reports. The results are set forth as follows: Chapter I, Introduction,…

  19. Stunning poultry prior to slaughter and the welfare advantages/challenges of electrical and controlled atmosphere stunning pp. 90-98.

    USDA-ARS?s Scientific Manuscript database

    Poultry are stunned immediately prior to slaughter to render them unconscious and incapable of perceiving pain, to facilitate automated processing (up to 180 birds/min), and to minimize the occurrence of the death struggle and thereby minimize carcass damage and down grades. A stunning method for s...

  20. Experience in Education Environment Virtualization within the Automated Information System "Platonus" (Kazakhstan)

    ERIC Educational Resources Information Center

    Abeldina, Zhaidary; Moldumarova, Zhibek; Abeldina, Rauza; Makysh, Gulmira; Moldumarova, Zhuldyz Ilibaevna

    2016-01-01

    This work reports on the use of virtual tools as means of learning process activation. A good result can be achieved by combining the classical learning with modern computer technology. By creating a virtual learning environment and using multimedia learning tools one can obtain a significant result while facilitating the development of students'…

  1. An automated testing tool for traffic signal controller functionalities.

    DOT National Transportation Integrated Search

    2010-03-01

    The purpose of this project was to develop an automated tool that facilitates testing of traffic controller functionality using controller interface device (CID) technology. Benefits of such automated testers to traffic engineers include reduced test...

  2. Iterative User Interface Design for Automated Sequential Organ Failure Assessment Score Calculator in Sepsis Detection

    PubMed Central

    Herasevich, Vitaly

    2017-01-01

    Background: The new sepsis definition has increased the need for frequent sequential organ failure assessment (SOFA) score recalculation and the clerical burden of information retrieval makes this score ideal for automated calculation. Objective: The aim of this study was to (1) estimate the clerical workload of manual SOFA score calculation through a time-motion analysis and (2) describe a user-centered design process for an electronic medical record (EMR) integrated, automated SOFA score calculator with subsequent usability evaluation study. Methods: First, we performed a time-motion analysis by recording time-to-task-completion for the manual calculation of 35 baseline and 35 current SOFA scores by 14 internal medicine residents over a 2-month period. Next, we used an agile development process to create a user interface for a previously developed automated SOFA score calculator. The final user interface usability was evaluated by clinician end users with the Computer Systems Usability Questionnaire. Results: The overall mean (standard deviation, SD) time-to-complete manual SOFA score calculation time was 61.6 s (33). Among the 24% (12/50) usability survey respondents, our user-centered user interface design process resulted in >75% favorability of survey items in the domains of system usability, information quality, and interface quality. Conclusions: Early stakeholder engagement in our agile design process resulted in a user interface for an automated SOFA score calculator that reduced clinician workload and met clinicians’ needs at the point of care. Emerging interoperable platforms may facilitate dissemination of similarly useful clinical score calculators and decision support algorithms as “apps.” A user-centered design process and usability evaluation should be considered during creation of these tools. PMID:28526675

  3. Iterative User Interface Design for Automated Sequential Organ Failure Assessment Score Calculator in Sepsis Detection.

    PubMed

    Aakre, Christopher Ansel; Kitson, Jaben E; Li, Man; Herasevich, Vitaly

    2017-05-18

    The new sepsis definition has increased the need for frequent sequential organ failure assessment (SOFA) score recalculation and the clerical burden of information retrieval makes this score ideal for automated calculation. The aim of this study was to (1) estimate the clerical workload of manual SOFA score calculation through a time-motion analysis and (2) describe a user-centered design process for an electronic medical record (EMR) integrated, automated SOFA score calculator with subsequent usability evaluation study. First, we performed a time-motion analysis by recording time-to-task-completion for the manual calculation of 35 baseline and 35 current SOFA scores by 14 internal medicine residents over a 2-month period. Next, we used an agile development process to create a user interface for a previously developed automated SOFA score calculator. The final user interface usability was evaluated by clinician end users with the Computer Systems Usability Questionnaire. The overall mean (standard deviation, SD) time-to-complete manual SOFA score calculation time was 61.6 s (33). Among the 24% (12/50) usability survey respondents, our user-centered user interface design process resulted in >75% favorability of survey items in the domains of system usability, information quality, and interface quality. Early stakeholder engagement in our agile design process resulted in a user interface for an automated SOFA score calculator that reduced clinician workload and met clinicians' needs at the point of care. Emerging interoperable platforms may facilitate dissemination of similarly useful clinical score calculators and decision support algorithms as "apps." A user-centered design process and usability evaluation should be considered during creation of these tools. ©Christopher Ansel Aakre, Jaben E Kitson, Man Li, Vitaly Herasevich. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 18.05.2017.
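
    The calculator code itself is not published in this record. As a minimal sketch of the kind of rule-based subscore lookup an EMR-integrated SOFA calculator performs, the functions below score two of the six organ systems from laboratory values. The thresholds follow the commonly cited SOFA definition, but they are included only for illustration and should be verified against the primary reference before any clinical use.

```python
"""Illustrative rule-based lookup for two SOFA organ-system subscores.
Thresholds reflect the commonly cited SOFA definition but are shown for
illustration only; verify against the primary reference before clinical use."""

def coagulation_subscore(platelets_k_per_ul: float) -> int:
    if platelets_k_per_ul < 20:
        return 4
    if platelets_k_per_ul < 50:
        return 3
    if platelets_k_per_ul < 100:
        return 2
    if platelets_k_per_ul < 150:
        return 1
    return 0

def renal_subscore(creatinine_mg_dl: float) -> int:
    if creatinine_mg_dl >= 5.0:
        return 4
    if creatinine_mg_dl >= 3.5:
        return 3
    if creatinine_mg_dl >= 2.0:
        return 2
    if creatinine_mg_dl >= 1.2:
        return 1
    return 0

if __name__ == "__main__":
    # Example: platelets 85 x10^3/uL, creatinine 2.4 mg/dL -> 2 + 2 = 4
    print(coagulation_subscore(85) + renal_subscore(2.4))
```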

  4. Surge of Bering Glacier and Bagley Ice Field: Parameterization of surge characteristics based on automated analysis of crevasse image data and laser altimeter data

    NASA Astrophysics Data System (ADS)

    Stachura, M.; Herzfeld, U. C.; McDonald, B.; Weltman, A.; Hale, G.; Trantow, T.

    2012-12-01

    The dynamical processes that occur during the surge of a large, complex glacier system are far from being understood. The aim of this paper is to derive a parameterization of surge characteristics that captures the principal processes and can serve as the basis for a dynamic surge model. Innovative mathematical methods are introduced that facilitate derivation of such a parameterization from remote-sensing observations. Methods include automated geostatistical characterization and connectionist-geostatistical classification of dynamic provinces and deformation states, using the vehicle of crevasse patterns. These methods are applied to analyze satellite and airborne image and laser altimeter data collected during the current surge of Bering Glacier and Bagley Ice Field, Alaska.

  5. An automated dose tracking system for adaptive radiation therapy.

    PubMed

    Liu, Chang; Kim, Jinkoo; Kumarasiri, Akila; Mayyas, Essa; Brown, Stephen L; Wen, Ning; Siddiqui, Farzan; Chetty, Indrin J

    2018-02-01

    The implementation of adaptive radiation therapy (ART) into routine clinical practice is technically challenging and requires significant resources to perform and validate each process step. The objective of this report is to identify the key components of ART, to illustrate how a specific automated procedure improves efficiency, and to facilitate the routine clinical application of ART. Patient image data were exported from a clinical database and converted to an intermediate format for point-wise dose tracking and accumulation. The process was automated using in-house developed software containing three modularized components: an ART engine, user interactive tools, and integration tools. The ART engine conducts computing tasks using the following modules: data importing, image pre-processing, dose mapping, dose accumulation, and reporting. In addition, custom graphical user interfaces (GUIs) were developed to allow user interaction with select processes such as deformable image registration (DIR). A commercial scripting application programming interface was used to incorporate automated dose calculation for application in routine treatment planning. Each module was considered an independent program, written in C++ or C#, running in a distributed Windows environment, scheduled and monitored by integration tools. The automated tracking system was retrospectively evaluated for 20 patients with prostate cancer and 96 patients with head and neck cancer, under institutional review board (IRB) approval. In addition, the system was evaluated prospectively using 4 patients with head and neck cancer. Altogether, 780 prostate dose fractions and 2586 head and neck cancer dose fractions were processed, including DIR and dose mapping. On average, daily cumulative dose was computed in 3 h and the manual work was limited to 13 min per case with approximately 10% of cases requiring an additional 10 min for image registration refinement. An efficient and convenient dose tracking system for ART in the clinical setting is presented. The software and automated processes were rigorously evaluated and validated using patient image datasets. Automation of the various procedures has improved efficiency significantly, allowing for the routine clinical application of ART for improving radiation therapy effectiveness. Copyright © 2017 Elsevier B.V. All rights reserved.
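
    The in-house software is not described beyond this summary. The sketch below illustrates only the core bookkeeping step of point-wise dose accumulation, in which each daily fraction dose is resampled onto the reference grid through a deformation field before being summed; the use of scipy.ndimage.map_coordinates, the array shapes, and the identity deformation in the example are assumptions for illustration, not the authors' implementation.

```python
"""Sketch of point-wise dose accumulation on a reference grid (illustrative
only; the deformation field and resampling choice are assumptions)."""
import numpy as np
from scipy.ndimage import map_coordinates

def accumulate_fraction(cumulative, fraction_dose, deformation):
    """Add one daily fraction to the cumulative dose.

    deformation: array (3, nz, ny, nx) giving, for every reference voxel, the
    voxel coordinates in the daily dose grid that map onto it (from DIR).
    """
    mapped = map_coordinates(fraction_dose, deformation, order=1, mode="nearest")
    return cumulative + mapped

if __name__ == "__main__":
    shape = (8, 8, 8)
    cumulative = np.zeros(shape)
    fraction = np.random.rand(*shape) * 2.0          # Gy, toy values
    identity = np.indices(shape).astype(float)       # no deformation
    cumulative = accumulate_fraction(cumulative, fraction, identity)
    print(cumulative.mean())
```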

  6. Design, Development, and Commissioning of a Substation Automation Laboratory to Enhance Learning

    ERIC Educational Resources Information Center

    Thomas, M. S.; Kothari, D. P.; Prakash, A.

    2011-01-01

    Automation of power systems is gaining momentum across the world, and there is a need to expose graduate and undergraduate students to the latest developments in hardware, software, and related protocols for power automation. This paper presents the design, development, and commissioning of an automation lab to facilitate the understanding of…

  7. Advanced imaging techniques for the study of plant growth and development.

    PubMed

    Sozzani, Rosangela; Busch, Wolfgang; Spalding, Edgar P; Benfey, Philip N

    2014-05-01

    A variety of imaging methodologies are being used to collect data for quantitative studies of plant growth and development from living plants. Multi-level data, from macroscopic to molecular, and from weeks to seconds, can be acquired. Furthermore, advances in parallelized and automated image acquisition enable the throughput to capture images from large populations of plants under specific growth conditions. Image-processing capabilities allow for 3D or 4D reconstruction of image data and automated quantification of biological features. These advances facilitate the integration of imaging data with genome-wide molecular data to enable systems-level modeling. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Electronic Design Automation: Integrating the Design and Manufacturing Functions

    NASA Technical Reports Server (NTRS)

    Bachnak, Rafic; Salkowski, Charles

    1997-01-01

    As the complexity of electronic systems grows, the traditional design practice, a sequential process, is replaced by concurrent design methodologies. A major advantage of concurrent design is that the feedback from software and manufacturing engineers can be easily incorporated into the design. The implementation of concurrent engineering methodologies is greatly facilitated by employing the latest Electronic Design Automation (EDA) tools. These tools offer integrated simulation of the electrical, mechanical, and manufacturing functions and support virtual prototyping, rapid prototyping, and hardware-software co-design. This report presents recommendations for enhancing the electronic design and manufacturing capabilities and procedures at JSC based on a concurrent design methodology that employs EDA tools.

  9. Automation of a N-S S and C Database Generation for the Harrier in Ground Effect

    NASA Technical Reports Server (NTRS)

    Murman, Scott M.; Chaderjian, Neal M.; Pandya, Shishir; Kwak, Dochan (Technical Monitor)

    2001-01-01

    A method of automating the generation of a time-dependent, Navier-Stokes static stability and control database for the Harrier aircraft in ground effect is outlined. Reusable, lightweight components are described which allow different facets of the computational fluid dynamic simulation process to utilize a consistent interface to a remote database. These components also allow changes and customizations to be incorporated easily into the solution process to enhance performance, without relying upon third-party support. An analysis of the multi-level parallel solver OVERFLOW-MLP is presented, and the results indicate that it is feasible to utilize large numbers of processors (≈100) even with a grid system containing a relatively small number of cells (≈10^6). A more detailed discussion of the simulation process, as well as refined data for the scaling of the OVERFLOW-MLP flow solver, will be included in the full paper.

  10. Integrated microfluidic systems for cell lysis, mixing/pumping and DNA amplification

    NASA Astrophysics Data System (ADS)

    Lee, Chia-Yen; Lee, Gwo-Bin; Lin, Jr-Lung; Huang, Fu-Chun; Liao, Chia-Sheng

    2005-06-01

    The present paper reports a fully automated microfluidic system for the DNA amplification process by integrating an electroosmotic pump, an active micromixer and an on-chip temperature control system. In this DNA amplification process, the cell lysis is initially performed in a micro cell lysis reactor. Extracted DNA samples, primers and reagents are then driven electroosmotically into a mixing region where they are mixed by the active micromixer. The homogeneous mixture is then thermally cycled in a micro-PCR (polymerase chain reaction) chamber to perform DNA amplification. Experimental results show that the proposed device can successfully automate the sample pretreatment operation for DNA amplification, thereby delivering significant time and effort savings. The new microfluidic system, which facilitates cell lysis, sample driving/mixing and DNA amplification, could provide a significant contribution to ongoing efforts to miniaturize bio-analysis systems by utilizing a simple fabrication process and cheap materials.

  11. SAMI Automated Plug Plate Configuration

    NASA Astrophysics Data System (ADS)

    Lorente, N. P. F.; Farrell, T.; Goodwin, M.

    2013-10-01

    The Sydney-AAO Multi-object Integral field spectrograph (SAMI) is a prototype wide-field system at the Anglo-Australian Telescope (AAT) which uses a plug-plate to mount its 13×61-core imaging fibre bundles (hexabundles) in the optical path at the telescope's prime focus. In this paper we describe the process of determining the positions of the plug-plate holes, where plates contain three or more stacked observation configurations. The process, which until now has involved several separate steps and required significant manual configuration and checking, is now being automated to increase efficiency and reduce error. This is carried out by means of a thin Java controller layer which drives the configuration cycle. This layer controls the user interface and the C++ algorithm layer where the plate configuration and optimisation are carried out. Additionally, through the Aladin display package, it provides visualisation and facilitates user verification of the resulting plates.
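
    The configuration software (the Java controller over the C++ optimisation layer) is not included in this record. Purely as an illustration of the underlying geometry of turning target sky positions into plate hole positions, the sketch below applies a gnomonic (tangent-plane) projection followed by a plate scale; the plate scale value and the function name are assumptions for illustration, not SAMI parameters.

```python
"""Gnomonic (tangent-plane) projection from sky coordinates to plate positions.
The plate scale here is an arbitrary illustrative value, not a SAMI parameter."""
import math

PLATE_SCALE_MM_PER_RAD = 100_000.0   # assumed, for illustration only

def sky_to_plate(ra_deg, dec_deg, ra0_deg, dec0_deg):
    """Return (x_mm, y_mm) of a target relative to the field centre."""
    ra, dec = math.radians(ra_deg), math.radians(dec_deg)
    ra0, dec0 = math.radians(ra0_deg), math.radians(dec0_deg)
    cos_c = (math.sin(dec0) * math.sin(dec)
             + math.cos(dec0) * math.cos(dec) * math.cos(ra - ra0))
    xi = math.cos(dec) * math.sin(ra - ra0) / cos_c
    eta = (math.cos(dec0) * math.sin(dec)
           - math.sin(dec0) * math.cos(dec) * math.cos(ra - ra0)) / cos_c
    return xi * PLATE_SCALE_MM_PER_RAD, eta * PLATE_SCALE_MM_PER_RAD

if __name__ == "__main__":
    print(sky_to_plate(150.05, 2.05, 150.0, 2.0))   # target near the field centre
```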

  12. Moving toward the automation of the systematic review process: a summary of discussions at the second meeting of International Collaboration for the Automation of Systematic Reviews (ICASR).

    PubMed

    O'Connor, Annette M; Tsafnat, Guy; Gilbert, Stephen B; Thayer, Kristina A; Wolfe, Mary S

    2018-01-09

    The second meeting of the International Collaboration for Automation of Systematic Reviews (ICASR) was held 3-4 October 2016 in Philadelphia, Pennsylvania, USA. ICASR is an interdisciplinary group whose aim is to maximize the use of technology for conducting rapid, accurate, and efficient systematic reviews of scientific evidence. Having automated tools for systematic review should enable more transparent and timely review, maximizing the potential for identifying and translating research findings to practical application. The meeting brought together multiple stakeholder groups including users of summarized research, methodologists who explore production processes and systematic review quality, and technologists such as software developers, statisticians, and vendors. This diversity of participants was intended to ensure effective communication with numerous stakeholders about progress toward automation of systematic reviews and stimulate discussion about potential solutions to identified challenges. The meeting highlighted challenges, both simple and complex, and raised awareness among participants about ongoing efforts by various stakeholders. An outcome of this forum was to identify several short-term projects that participants felt would advance the automation of tasks in the systematic review workflow including (1) fostering better understanding about available tools, (2) developing validated datasets for testing new tools, (3) determining a standard method to facilitate interoperability of tools such as through an application programming interface or API, and (4) establishing criteria to evaluate the quality of tools' output. ICASR 2016 provided a beneficial forum to foster focused discussion about tool development and resources and reconfirm ICASR members' commitment toward systematic reviews' automation.

  13. Utilization of Intelligent Software Agent Features for Improving E-Learning Efforts: A Comprehensive Investigation

    ERIC Educational Resources Information Center

    Farzaneh, Mandana; Vanani, Iman Raeesi; Sohrabi, Babak

    2012-01-01

    E-learning is one of the most important learning approaches within which intelligent software agents can be efficiently used so as to automate and facilitate the process of learning. The aim of this paper is to illustrate a comprehensive categorization of intelligent software agent features, which is valuable for being deployed in the virtual…

  14. A time-series method for automated measurement of changes in mitotic and interphase duration from time-lapse movies.

    PubMed

    Sigoillot, Frederic D; Huckins, Jeremy F; Li, Fuhai; Zhou, Xiaobo; Wong, Stephen T C; King, Randall W

    2011-01-01

    Automated time-lapse microscopy can visualize proliferation of large numbers of individual cells, enabling accurate measurement of the frequency of cell division and the duration of interphase and mitosis. However, extraction of quantitative information by manual inspection of time-lapse movies is too time-consuming to be useful for analysis of large experiments. Here we present an automated time-series approach that can measure changes in the duration of mitosis and interphase in individual cells expressing fluorescent histone 2B. The approach requires analysis of only 2 features, nuclear area and average intensity. Compared to supervised learning approaches, this method reduces processing time and does not require generation of training data sets. We demonstrate that this method is as sensitive as manual analysis in identifying small changes in interphase or mitotic duration induced by drug or siRNA treatment. This approach should facilitate automated analysis of high-throughput time-lapse data sets to identify small molecules or gene products that influence timing of cell division.
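
    The published classifier itself is not reproduced here. The sketch below shows only the essence of the described two-feature idea, tracking nuclear area and average H2B intensity over time and flagging frames in which the nucleus is compact and bright as mitotic; the thresholds and class names are chosen arbitrarily for illustration and are not the authors' values.

```python
"""Toy two-feature mitosis detector over a per-cell time series
(illustrative thresholds; not the published classifier)."""
from dataclasses import dataclass
from typing import List

@dataclass
class Frame:
    nuclear_area: float       # pixels
    mean_intensity: float     # arbitrary units

def mitotic_frames(track: List[Frame],
                   area_max: float = 200.0,
                   intensity_min: float = 1500.0) -> List[bool]:
    """Call a frame mitotic when chromatin looks condensed: small area, high intensity."""
    return [f.nuclear_area < area_max and f.mean_intensity > intensity_min
            for f in track]

def mitosis_duration_minutes(track: List[Frame], frame_interval_min: float) -> float:
    """Total time spent in mitosis for this track."""
    return sum(mitotic_frames(track)) * frame_interval_min

if __name__ == "__main__":
    track = [Frame(400, 900), Frame(180, 1800), Frame(170, 1900), Frame(420, 950)]
    print(mitosis_duration_minutes(track, frame_interval_min=10))   # 20 minutes
```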

  15. Processing and review interface for strong motion data (PRISM) software, version 1.0.0—Methodology and automated processing

    USGS Publications Warehouse

    Jones, Jeanne; Kalkan, Erol; Stephens, Christopher

    2017-02-23

    A continually increasing number of high-quality digital strong-motion records from stations of the National Strong-Motion Project (NSMP) of the U.S. Geological Survey (USGS), as well as data from regional seismic networks within the United States, call for automated processing of strong-motion records with human review limited to selected significant or flagged records. The NSMP has developed the Processing and Review Interface for Strong Motion data (PRISM) software to meet this need. In combination with the Advanced National Seismic System Quake Monitoring System (AQMS), PRISM automates the processing of strong-motion records. When used without AQMS, PRISM provides batch-processing capabilities. The PRISM version 1.0.0 is platform independent (coded in Java), open source, and does not depend on any closed-source or proprietary software. The software consists of two major components: a record processing engine and a review tool that has a graphical user interface (GUI) to manually review, edit, and process records. To facilitate use by non-NSMP earthquake engineers and scientists, PRISM (both its processing engine and review tool) is easy to install and run as a stand-alone system on common operating systems such as Linux, OS X, and Windows. PRISM was designed to be flexible and extensible in order to accommodate new processing techniques. This report provides a thorough description and examples of the record processing features supported by PRISM. All the computing features of PRISM have been thoroughly tested.

  16. Automated Data Abstraction of Cardiopulmonary Resuscitation Process Measures for Complete Episodes of Cardiac Arrest Resuscitation.

    PubMed

    Lin, Steve; Turgulov, Anuar; Taher, Ahmed; Buick, Jason E; Byers, Adam; Drennan, Ian R; Hu, Samantha; J Morrison, Laurie

    2016-10-01

    Cardiopulmonary resuscitation (CPR) process measures research and quality assurance have traditionally been limited to the first 5 minutes of resuscitation due to significant costs in time, resources, and personnel from manual data abstraction. CPR performance may change over time during prolonged resuscitations, which represents a significant knowledge gap. Moreover, the CPR process measure output of currently available commercial software is difficult to analyze. The objective was to develop and validate a software program to help automate the abstraction and transfer of CPR process measures data from electronic defibrillators for complete episodes of cardiac arrest resuscitation. We developed a software program to facilitate and help automate CPR data abstraction and transfer from electronic defibrillators for entire resuscitation episodes. Using an intermediary Extensible Markup Language export file, the automated software transfers CPR process measures data (electrocardiogram [ECG] number, CPR start time, number of ventilations, number of chest compressions, compression rate per minute, compression depth per minute, compression fraction, and end-tidal CO2 per minute). We performed an internal validation of the software program on 50 randomly selected cardiac arrest cases with resuscitation durations between 15 and 60 minutes. CPR process measures were manually abstracted and transferred independently by two trained data abstractors and by the automated software program, followed by manual interpretation of raw ECG tracings, treatment interventions, and patient events. Error rates and the time needed for data abstraction, transfer, and interpretation were measured for both manual and automated methods, compared to an additional independent reviewer. A total of 9,826 data points were each abstracted by the two abstractors and by the software program. Manual data abstraction resulted in a total of six errors (0.06%) compared to zero errors by the software program. The mean ± SD time measured per case for manual data abstraction was 20.3 ± 2.7 minutes compared to 5.3 ± 1.4 minutes using the software program (p = 0.003). We developed and validated an automated software program that efficiently abstracts and transfers CPR process measures data from electronic defibrillators for complete cardiac arrest episodes. This software will enable future cardiac arrest studies and quality assurance programs to evaluate the impact of CPR process measures during prolonged resuscitations. © 2016 by the Society for Academic Emergency Medicine.
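
    The defibrillator vendors' export schemas are proprietary and not shown in this record. The snippet below only illustrates the general approach of parsing an intermediary XML export into per-minute CPR process measures; the element and attribute names are invented for illustration and do not reflect any vendor's actual format.

```python
"""Parse an (invented) intermediary XML export of CPR process measures.
The element and attribute names are illustrative, not a vendor schema."""
import xml.etree.ElementTree as ET

SAMPLE = """
<episode ecg="12345" cpr_start="14:02:10">
  <minute index="1" compressions="104" rate="104" depth_mm="52" fraction="0.86" etco2="28"/>
  <minute index="2" compressions="98"  rate="98"  depth_mm="49" fraction="0.81" etco2="30"/>
</episode>
"""

def parse_episode(xml_text: str) -> list:
    root = ET.fromstring(xml_text)
    rows = []
    for minute in root.findall("minute"):
        rows.append({
            "minute": int(minute.get("index")),
            "compressions": int(minute.get("compressions")),
            "compression_rate": float(minute.get("rate")),
            "compression_depth_mm": float(minute.get("depth_mm")),
            "compression_fraction": float(minute.get("fraction")),
            "etco2_mmHg": float(minute.get("etco2")),
        })
    return rows

if __name__ == "__main__":
    for row in parse_episode(SAMPLE):
        print(row)
```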

  17. The Gemini Recipe System: A Dynamic Workflow for Automated Data Reduction

    NASA Astrophysics Data System (ADS)

    Labrie, K.; Hirst, P.; Allen, C.

    2011-07-01

    Gemini's next generation data reduction software suite aims to offer greater automation of the data reduction process without compromising the flexibility required by science programs using advanced or unusual observing strategies. The Recipe System is central to our new data reduction software. Developed in Python, it facilitates near-real time processing for data quality assessment, and both on- and off-line science quality processing. The Recipe System can be run as a standalone application or as the data processing core of an automatic pipeline. Building on concepts that originated in ORAC-DR, a data reduction process is defined in a Recipe written in a science (as opposed to computer) oriented language, and consists of a sequence of data reduction steps called Primitives. The Primitives are written in Python and can be launched from the PyRAF user interface by users wishing for more hands-on optimization of the data reduction process. The fact that the same processing Primitives can be run within both the pipeline context and interactively in a PyRAF session is an important strength of the Recipe System. The Recipe System offers dynamic flow control allowing for decisions regarding processing and calibration to be made automatically, based on the pixel and the metadata properties of the dataset at the stage in processing where the decision is being made, and the context in which the processing is being carried out. Processing history and provenance recording are provided by the AstroData middleware, which also offers header abstraction and data type recognition to facilitate the development of instrument-agnostic processing routines. All observatory or instrument specific definitions are isolated from the core of the AstroData system and distributed in external configuration packages that define a lexicon including classifications, uniform metadata elements, and transformations.

  18. Leveraging annotation-based modeling with Jump.

    PubMed

    Bergmayr, Alexander; Grossniklaus, Michael; Wimmer, Manuel; Kappel, Gerti

    2018-01-01

    The capability of UML profiles to serve as an annotation mechanism has been recognized in both research and industry. Today's modeling tools offer profiles specific to platforms, such as Java, as they facilitate model-based engineering approaches. However, considering the large number of possible annotations in Java, manually developing the corresponding profiles would only be achievable by huge development and maintenance efforts. Thus, leveraging annotation-based modeling requires an automated approach capable of generating platform-specific profiles from Java libraries. To address this challenge, we present the fully automated transformation chain realized by Jump, thereby continuing existing mapping efforts between Java and UML with an emphasis on annotations and profiles. The evaluation of Jump shows that it scales for large Java libraries and generates profiles of equal or even improved quality compared to profiles currently used in practice. Furthermore, we demonstrate the practical value of Jump by contributing profiles that facilitate reverse engineering and forward engineering processes for the Java platform by applying it to a modernization scenario.

  19. CDISC SHARE, a Global, Cloud-based Resource of Machine-Readable CDISC Standards for Clinical and Translational Research

    PubMed Central

    Hume, Samuel; Chow, Anthony; Evans, Julie; Malfait, Frederik; Chason, Julie; Wold, J. Darcy; Kubick, Wayne; Becnel, Lauren B.

    2018-01-01

    The Clinical Data Interchange Standards Consortium (CDISC) is a global non-profit standards development organization that creates consensus-based standards for clinical and translational research. Several of these standards are now required by regulators for electronic submissions of regulated clinical trials’ data and by government funding agencies. These standards are free and open, available for download on the CDISC Website as PDFs. While these documents are human readable, they are not amenable to ready use by electronic systems. CDISC launched the CDISC Shared Health And Research Electronic library (SHARE) to provide the standards metadata in machine-readable formats to facilitate the automated management and implementation of the standards. This paper describes how CDISC SHARE’s standards can facilitate collecting, aggregating and analyzing standardized data from early design to end analysis, and its role as a central resource providing information systems with metadata that drives process automation including study setup and data pipelining. PMID:29888049

  20. Software for rapid prototyping in the pharmaceutical and biotechnology industries.

    PubMed

    Kappler, Michael A

    2008-05-01

    The automation of drug discovery methods continues to develop, especially techniques that process information, represent workflow and facilitate decision-making. The magnitude of data and the plethora of questions in pharmaceutical and biotechnology research give rise to the need for rapid prototyping software. This review describes the advantages and disadvantages of three solutions: Competitive Workflow, Taverna and Pipeline Pilot. Each of these systems processes large amounts of data, integrates diverse systems and assists novice programmers and human experts in critical decision-making steps.

  1. Platform for Automated Real-Time High Performance Analytics on Medical Image Data.

    PubMed

    Allen, William J; Gabr, Refaat E; Tefera, Getaneh B; Pednekar, Amol S; Vaughn, Matthew W; Narayana, Ponnada A

    2018-03-01

    Biomedical data are quickly growing in volume and in variety, providing clinicians an opportunity for better clinical decision support. Here, we demonstrate a robust platform that uses software automation and high performance computing (HPC) resources to achieve real-time analytics of clinical data, specifically magnetic resonance imaging (MRI) data. We used the Agave application programming interface to facilitate communication, data transfer, and job control between an MRI scanner and an off-site HPC resource. In this use case, Agave executed the graphical pipeline tool GRAphical Pipeline Environment (GRAPE) to perform automated, real-time, quantitative analysis of MRI scans. Same-session image processing will open the door for adaptive scanning and real-time quality control, potentially accelerating the discovery of pathologies and minimizing patient callbacks. We envision this platform can be adapted to other medical instruments, HPC resources, and analytics tools.

  2. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data.

    PubMed

    Gabard-Durnam, Laurel J; Mendez Leal, Adriana S; Wilkinson, Carol L; Levin, April R

    2018-01-01

    Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and recording lengths. HAPPE software is freely available under the terms of the GNU General Public License at https://github.com/lcnhappe/happe.
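
    HAPPE itself is distributed at the repository cited above; the Python sketch below is not HAPPE. It is only a minimal, self-contained illustration of the kind of steps such a pipeline chains together (band-pass filtering, crude amplitude-threshold artifact rejection, and average re-referencing) on a channels-by-samples array, with all thresholds and window lengths chosen arbitrarily.

```python
"""Minimal EEG preprocessing illustration (not HAPPE): band-pass filter,
amplitude-threshold artifact rejection, and average re-reference."""
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(data, fs, low=1.0, high=40.0, order=4):
    """Zero-phase band-pass filter each channel (data: channels x samples)."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, data, axis=1)

def reject_segments(data, fs, window_s=1.0, threshold_uv=150.0):
    """Drop fixed-length windows in which any channel exceeds an amplitude threshold."""
    win = int(window_s * fs)
    n_win = data.shape[1] // win
    kept = [data[:, i * win:(i + 1) * win]
            for i in range(n_win)
            if np.max(np.abs(data[:, i * win:(i + 1) * win])) < threshold_uv]
    return np.concatenate(kept, axis=1) if kept else data[:, :0]

def average_reference(data):
    """Subtract the mean across channels at every sample."""
    return data - data.mean(axis=0, keepdims=True)

if __name__ == "__main__":
    fs = 250
    raw = np.random.randn(8, fs * 10) * 20.0      # 8 channels, 10 s, ~20 uV noise
    clean = average_reference(reject_segments(bandpass(raw, fs), fs))
    print(clean.shape)
```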

  3. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data

    PubMed Central

    Gabard-Durnam, Laurel J.; Mendez Leal, Adriana S.; Wilkinson, Carol L.; Levin, April R.

    2018-01-01

    Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and recording lengths. HAPPE software is freely available under the terms of the GNU General Public License at https://github.com/lcnhappe/happe. PMID:29535597

  4. Daily Planet Redesign: eZ Publish Web Content Management Implementation

    NASA Technical Reports Server (NTRS)

    Dutra, Jayne E.

    2006-01-01

    This viewgraph presentation reviews the redesign of the Daily Planet newsletter as a content management implementation project. The Daily Planet is an internal news site that acts as a communication vehicle for a large volume of content. The objectives for the site redesign were: (1) clean visual design, (2) facilitation of publication processes, (3) a more efficient maintenance mode, (4) automated publishing to the internal portal, (5) better navigation through improved site IA, (6) archiving and retrieval functionality, and (7) a return to fundamental business goals. Content management (CM) is a process, not a software package.

  5. Programs for generating data tables for the annual water-resources data report of the U.S. Geological Survey

    USGS Publications Warehouse

    Mason, R.R.; Hill, C.L.

    1988-01-01

    The U.S. Geological Survey has developed software that interfaces with the Automated Data Processing System to facilitate and expedite preparation of the annual water-resources data report. This software incorporates a feature that prepares daily values tables and appends them to previously edited files containing station manuscripts. Other features collate the merged files with miscellaneous sections of the report. The report is then printed as page-size, camera-ready copy. All system components reside on a minicomputer; this provides easy access and use by remote field offices. Automation of the annual report preparation process results in significant savings of labor and cost. Use of the system for producing the 1986 annual report in the North Carolina District realized a labor savings of over two man-months. A fully implemented system would produce a greater savings and speed release of the report to users.

  6. Terminology model discovery using natural language processing and visualization techniques.

    PubMed

    Zhou, Li; Tao, Ying; Cimino, James J; Chen, Elizabeth S; Liu, Hongfang; Lussier, Yves A; Hripcsak, George; Friedman, Carol

    2006-12-01

    Medical terminologies are important for unambiguous encoding and exchange of clinical information. The traditional manual method of developing terminology models is time-consuming and limited in the number of phrases that a human developer can examine. In this paper, we present an automated method for developing medical terminology models based on natural language processing (NLP) and information visualization techniques. Surgical pathology reports were selected as the testing corpus for developing a pathology procedure terminology model. The use of a general NLP processor for the medical domain, MedLEE, provides an automated method for acquiring semantic structures from a free text corpus and sheds light on a new high-throughput method of medical terminology model development. The use of an information visualization technique supports the summarization and visualization of the large quantity of semantic structures generated from medical documents. We believe that a general method based on NLP and information visualization will facilitate the modeling of medical terminologies.

  7. ALCHEMIST (Anesthesia Log, Charge Entry, Medical Information, and Statistics)

    PubMed Central

    Covey, M. Carl

    1979-01-01

    This paper presents an automated system for the handling of charges and information processing within the Anesthesiology department of the University of Arkansas for the Medical Sciences (UAMS). The purpose of the system is to take the place of cumbersome, manual billing procedures and in the process of automated charge generation, to compile a data base of patient data for later use. ALCHEMIST has demonstrated its value by increasing both the speed and the accuracy of generation of patient charges as well as facilitating the compilation of valuable, informative reports containing statistical summaries of all aspects of the UAMS operating wing case load. ALCHEMIST allows for the entry of fifty different sets of information (multiple items in some sets) for a total of 107 separate data elements from the original anesthetic record. All this data is entered as part of the charge entry procedure.

  8. Flight deck benefits of integrated data link communication

    NASA Technical Reports Server (NTRS)

    Waller, Marvin C.

    1992-01-01

    A fixed-base, piloted simulation study was conducted to determine the operational benefits that result when air traffic control (ATC) instructions are transmitted to the deck of a transport aircraft over a digital data link. The ATC instructions include altitude, airspeed, heading, radio frequency, and route assignment data. The interface between the flight deck and the data link was integrated with other subsystems of the airplane to facilitate data management. Data from the ATC instructions were distributed to the flight guidance and control system, the navigation system, and an automatically tuned communication radio. The co-pilot initiated the automation-assisted data distribution process. Digital communications and automated data distribution were compared with conventional voice radio communication and manual input of data into other subsystems of the simulated aircraft. Less time was required in the combined communication and data management process when data link ATC communication was integrated with the other subsystems. The test subjects, commercial airline pilots, provided favorable evaluations of both the digital communication and data management processes.

  9. A new automated spectral feature extraction method and its application in spectral classification and defective spectra recovery

    NASA Astrophysics Data System (ADS)

    Wang, Ke; Guo, Ping; Luo, A.-Li

    2017-03-01

    Spectral feature extraction is a crucial procedure in automated spectral analysis. This procedure starts from the spectral data and produces informative and non-redundant features, facilitating the subsequent automated processing and analysis with machine-learning and data-mining techniques. In this paper, we present a new automated feature extraction method for astronomical spectra, with application in spectral classification and defective spectra recovery. The basic idea of our approach is to train a deep neural network to extract features of spectra with different levels of abstraction in different layers. The deep neural network is trained with a fast layer-wise learning algorithm in an analytical way without any iterative optimization procedure. We evaluate the performance of the proposed scheme on real-world spectral data. The results demonstrate that our method is superior in overall performance, and its computational cost is significantly lower than that of other methods. The proposed method can be regarded as a valid new general-purpose feature extraction method for various tasks in spectral data analysis.
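
    The paper's analytical layer-wise training algorithm is not specified in this record. As a hedged illustration of the general idea of training one layer at a time in closed form rather than by iterative optimisation, the sketch below stacks random-projection hidden layers with least-squares (pseudo-inverse) readouts, in the spirit of extreme-learning-machine-style autoencoders; this is an assumption-laden stand-in, not the authors' method.

```python
"""Illustration of closed-form, layer-wise feature learning for spectra
(random hidden projection + pseudo-inverse readout); not the paper's algorithm."""
import numpy as np

rng = np.random.default_rng(0)

def train_layer(X, n_hidden):
    """Learn one feature layer analytically.

    X: (n_samples, n_features) spectra. Returns (W, b, beta, H) where
    H = tanh(X W + b) are the layer's features and beta reconstructs X from H
    by least squares, with no iterative optimisation.
    """
    W = rng.standard_normal((X.shape[1], n_hidden)) / np.sqrt(X.shape[1])
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)
    beta = np.linalg.pinv(H) @ X          # closed-form reconstruction weights
    return W, b, beta, H

def stack_layers(X, sizes):
    """Greedily stack layers; each layer's features feed the next."""
    features = X
    for n_hidden in sizes:
        _, _, _, features = train_layer(features, n_hidden)
    return features

if __name__ == "__main__":
    spectra = rng.standard_normal((200, 1000))    # toy stand-in for 1000-pixel spectra
    deep_features = stack_layers(spectra, sizes=[256, 64, 16])
    print(deep_features.shape)                    # (200, 16)
```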

  10. AmeriFlux Data Processing: Integrating automated and manual data management across software technologies and an international network to generate timely data products

    NASA Astrophysics Data System (ADS)

    Christianson, D. S.; Beekwilder, N.; Chan, S.; Cheah, Y. W.; Chu, H.; Dengel, S.; O'Brien, F.; Pastorello, G.; Sandesh, M.; Torn, M. S.; Agarwal, D.

    2017-12-01

    AmeriFlux is a network of scientists who independently collect eddy covariance and related environmental observations at over 250 locations across the Americas. As part of the AmeriFlux Management Project, the AmeriFlux Data Team manages standardization, collection, quality assurance / quality control (QA/QC), and distribution of data submitted by network members. To generate data products that are timely, QA/QC'd, and repeatable, and have traceable provenance, we developed a semi-automated data processing pipeline. The new pipeline consists of semi-automated format and data QA/QC checks. Results are communicated via on-line reports as well as an issue-tracking system. Data processing time has been reduced from 2-3 days to a few hours of manual review time, resulting in faster data availability from the time of data submission. The pipeline is scalable to the network level and has the following key features. (1) On-line results of the format QA/QC checks are available immediately for data provider review. This enables data providers to correct and resubmit data quickly. (2) The format QA/QC assessment includes an automated attempt to fix minor format errors. Data submissions that are formatted in the new AmeriFlux FP-In standard can be queued for the data QA/QC assessment, often with minimal delay. (3) Automated data QA/QC checks identify and communicate potentially erroneous data via online, graphical quick views that highlight observations with unexpected values, incorrect units, time drifts, invalid multivariate correlations, and/or radiation shadows. (4) Progress through the pipeline is integrated with an issue-tracking system that facilitates communications between data providers and the data processing team in an organized and searchable fashion. Through development of these and other features of the pipeline, we present solutions to challenges that include optimizing automated with manual processing, bridging legacy data management infrastructure with various software tools, and working across interdisciplinary and international science cultures. Additionally, we discuss results from community member feedback that helped refine QA/QC communications for efficient data submission and revision.
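
    The AmeriFlux pipeline code is not included in this record. The sketch below conveys only the flavour of an automated format QA/QC check, verifying that a half-hourly CSV submission has the expected timestamp columns and regular 30-minute spacing; the column names (TIMESTAMP_START, TIMESTAMP_END) follow my reading of the FP convention and, together with the file path, should be treated as assumptions.

```python
"""Illustrative format QA/QC check for a half-hourly flux-data CSV submission.
Column names and the example file path are assumptions, not AmeriFlux code."""
import csv
from datetime import datetime, timedelta

def check_format(path: str) -> list:
    """Return a list of human-readable format problems (empty list = passed)."""
    problems = []
    with open(path, newline="") as fh:
        reader = csv.DictReader(fh)
        required = {"TIMESTAMP_START", "TIMESTAMP_END"}
        missing = required - set(reader.fieldnames or [])
        if missing:
            return [f"missing required columns: {sorted(missing)}"]
        previous = None
        for i, row in enumerate(reader, start=2):   # header is line 1
            try:
                start = datetime.strptime(row["TIMESTAMP_START"], "%Y%m%d%H%M")
            except ValueError:
                problems.append(f"line {i}: unparseable TIMESTAMP_START")
                continue
            if previous is not None and start - previous != timedelta(minutes=30):
                problems.append(f"line {i}: gap or irregular 30-min spacing")
            previous = start
    return problems

if __name__ == "__main__":
    # "site_submission.csv" is a placeholder path for a local test file.
    for issue in check_format("site_submission.csv"):
        print(issue)
```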

  11. Microscale bioprocess optimisation.

    PubMed

    Micheletti, Martina; Lye, Gary J

    2006-12-01

    Microscale processing techniques offer the potential to speed up the delivery of new drugs to the market, reducing development costs and increasing patient benefit. These techniques have application across both the chemical and biopharmaceutical sectors. The approach involves the study of individual bioprocess operations at the microlitre scale using either microwell or microfluidic formats. In both cases the aim is to generate quantitative bioprocess information early on, so as to inform bioprocess design and speed translation to the manufacturing scale. Automation can enhance experimental throughput and will facilitate the parallel evaluation of competing biocatalyst and process options.

  12. Periodic, On-Demand, and User-Specified Information Reconciliation

    NASA Technical Reports Server (NTRS)

    Kolano, Paul

    2007-01-01

    Automated sequence generation (autogen) signifies both a process and software used to automatically generate sequences of commands to operate various spacecraft. Autogen requires fewer workers than are needed for older manual sequence-generation processes and reduces sequence-generation times from weeks to minutes. The autogen software comprises the autogen script plus the Activity Plan Generator (APGEN) program. APGEN can be used for planning missions and command sequences. APGEN includes a graphical user interface that facilitates scheduling of activities on a time line and affords a capability to automatically expand, decompose, and schedule activities.

  13. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform.

    PubMed

    Moutsatsos, Ioannis K; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J; Jenkins, Jeremy L; Holway, Nicholas; Tallarico, John; Parker, Christian N

    2017-03-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an "off-the-shelf," open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community.

  14. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform

    PubMed Central

    Moutsatsos, Ioannis K.; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J.; Jenkins, Jeremy L.; Holway, Nicholas; Tallarico, John; Parker, Christian N.

    2016-01-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an “off-the-shelf,” open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community. PMID:27899692

  15. Implementation of Systematic Review Tools in IRIS

    EPA Pesticide Factsheets

    Currently, the number of chemicals present in the environment exceeds the ability of public health scientists to efficiently screen the available data in order to produce well-informed human health risk assessments in a timely manner. For this reason, the US EPA’s Integrated Risk Information System (IRIS) program has started implementing new software tools into the hazard characterization workflow. These automated tools aid in multiple phases of the systematic review process, including scoping and problem formulation, literature search, and identification and screening of available published studies. The increased availability of these tools lays the foundation for automating or semi-automating multiple phases of the systematic review process. Some of these software tools include modules to facilitate a structured approach to study quality evaluation of human and animal data, although approaches are generally lacking for assessing complex mechanistic information, in particular “omics”-based evidence; tools are starting to become available to evaluate these types of studies. We will highlight how new software programs, online tools, and approaches for assessing study quality can be better integrated to allow for a more efficient and transparent workflow of the risk assessment process, as well as identify tool gaps that would benefit future risk assessments. Disclaimer: The views expressed here are those of the authors and do not necessarily represent the view

  16. 77 FR 11175 - Self-Regulatory Organizations; The Depository Trust Company; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-24

    ... support the Request. In order to facilitate this automation, DTC will create a function that will provide... automation, DTC will be able to reduce the notification time frame on full call MMIs so that effective April... automation input mechanism. Additionally, at the request of the Options Clearing Corporation (``OCC''), DTC...

  17. Image-Based Single Cell Profiling: High-Throughput Processing of Mother Machine Experiments

    PubMed Central

    Sachs, Christian Carsten; Grünberger, Alexander; Helfrich, Stefan; Probst, Christopher; Wiechert, Wolfgang; Kohlheyer, Dietrich; Nöh, Katharina

    2016-01-01

    Background Microfluidic lab-on-chip technology combined with live-cell imaging has enabled the observation of single cells in their spatio-temporal context. The mother machine (MM) cultivation system is particularly attractive for the long-term investigation of rod-shaped bacteria since it facilitates continuous cultivation and observation of individual cells over many generations in a highly parallelized manner. To date, the lack of fully automated image analysis software limits the practical applicability of the MM as a phenotypic screening tool. Results We present an image analysis pipeline for the automated processing of MM time lapse image stacks. The pipeline supports all analysis steps, i.e., image registration, orientation correction, channel/cell detection, cell tracking, and result visualization. Tailored algorithms account for the specialized MM layout to enable a robust automated analysis. Image data generated in a two-day growth study (≈ 90 GB) is analyzed in ≈ 30 min with negligible differences in growth rate between automated and manual evaluation. The proposed methods are implemented in the software molyso (MOther machine AnaLYsis SOftware), which provides a new profiling tool for the unbiased analysis of hitherto inaccessible large-scale MM image stacks. Conclusion Presented is the software molyso, ready-to-use open-source software (BSD-licensed) for the unsupervised analysis of MM time-lapse image stacks. molyso source code and user manual are available at https://github.com/modsim/molyso. PMID:27661996

  18. Novel Automated Blood Separations Validate Whole Cell Biomarkers

    PubMed Central

    Burger, Douglas E.; Wang, Limei; Ban, Liqin; Okubo, Yoshiaki; Kühtreiber, Willem M.; Leichliter, Ashley K.; Faustman, Denise L.

    2011-01-01

    Background Progress in clinical trials in infectious disease, autoimmunity, and cancer is stymied by a dearth of successful whole cell biomarkers for peripheral blood lymphocytes (PBLs). Successful biomarkers could help to track drug effects at early time points in clinical trials to prevent costly trial failures late in development. One major obstacle is the inaccuracy of Ficoll density centrifugation, the decades-old method of separating PBLs from the abundant red blood cells (RBCs) of fresh blood samples. Methods and Findings To replace the Ficoll method, we developed and studied a novel blood-based magnetic separation method. The magnetic method strikingly surpassed Ficoll in viability, purity and yield of PBLs. To reduce labor, we developed an automated platform and compared two magnet configurations for cell separations. These more accurate and labor-saving magnet configurations allowed the lymphocytes to be tested in bioassays for rare antigen-specific T cells. The automated method succeeded at identifying 79% of patients with the rare PBLs of interest as compared with Ficoll's uniform failure. We validated improved upfront blood processing and show accurate detection of rare antigen-specific lymphocytes. Conclusions Improving, automating and standardizing lymphocyte detections from whole blood may facilitate development of new cell-based biomarkers for human diseases. Improved upfront blood processes may lead to broad improvements in monitoring early trial outcome measurements in human clinical trials. PMID:21799852

  19. A data mining system for providing analytical information on brain tumors to public health decision makers.

    PubMed

    Santos, R S; Malheiros, S M F; Cavalheiro, S; de Oliveira, J M Parente

    2013-03-01

    Cancer is the leading cause of death in economically developed countries and the second leading cause of death in developing countries. Malignant brain neoplasms are among the most devastating and incurable forms of cancer, and their treatment may be excessively complex and costly. Public health decision makers require significant amounts of analytical information to manage public treatment programs for these patients. Data mining, a technology that is used to produce analytically useful information, has been employed successfully with medical data. However, the large-scale adoption of this technique has been limited thus far because it is difficult to use, especially for non-expert users. One way to facilitate data mining by non-expert users is to automate the process. Our aim is to present an automated data mining system that allows public health decision makers to access analytical information regarding brain tumors. The emphasis in this study is the use of ontology in an automated data mining process. The non-experts who tried the system obtained useful information about the treatment of brain tumors. These results suggest that future work should be conducted in this area. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  20. Towards automated segmentation of cells and cell nuclei in nonlinear optical microscopy.

    PubMed

    Medyukhina, Anna; Meyer, Tobias; Schmitt, Michael; Romeike, Bernd F M; Dietzek, Benjamin; Popp, Jürgen

    2012-11-01

    Nonlinear optical (NLO) imaging techniques based, e.g., on coherent anti-Stokes Raman scattering (CARS) or two photon excited fluorescence (TPEF) show great potential for biomedical imaging. In order to facilitate the diagnostic process based on NLO imaging, there is a need for an automated calculation of quantitative values such as cell density, nucleus-to-cytoplasm ratio, and average nuclear size. Extraction of these parameters is helpful for the histological assessment in general and specifically, e.g., for the determination of tumor grades. This requires accurate image segmentation and detection of the locations and boundaries of cells and nuclei. Here we present an image processing approach for the detection of nuclei and cells in co-registered TPEF and CARS images. The algorithm developed utilizes the gray-scale information for the detection of the nuclei locations and the gradient information for the delineation of the nuclear and cellular boundaries. The approach reported is capable of automated segmentation of cells and nuclei in multimodal TPEF-CARS images of human brain tumor samples. The results are important for the development of NLO microscopy into a clinically relevant diagnostic tool. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
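
    The two-step strategy summarized above (gray-scale intensity for nucleus seeds, gradient information for boundary delineation) can be mimicked with generic tools. The following Python/scikit-image snippet is an illustrative analogue only, not the authors' algorithm; the placeholder images, the Otsu threshold, and the minimum object size are assumptions.

    ```python
    import numpy as np
    from skimage import filters, measure, morphology, segmentation  # scikit-image >= 0.19

    def segment_nuclei_and_cells(tpef, cars):
        """Illustrative two-step segmentation: intensity-based nucleus seeds,
        then gradient-based boundary delineation via watershed."""
        # 1) Nucleus seeds from the gray-scale TPEF channel (Otsu threshold).
        nuclei_mask = tpef > filters.threshold_otsu(tpef)
        nuclei_mask = morphology.remove_small_objects(nuclei_mask, min_size=50)
        markers = measure.label(nuclei_mask)

        # 2) Cellular boundaries from the gradient of the CARS channel.
        gradient = filters.sobel(cars)
        cells = segmentation.watershed(gradient, markers=markers)
        return markers, cells

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        tpef = rng.random((256, 256))   # placeholders; use co-registered TPEF/CARS images
        cars = rng.random((256, 256))
        nuclei, cells = segment_nuclei_and_cells(tpef, cars)
        print(nuclei.max(), "nucleus seeds found")
    ```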

  1. Semiautomated Sample Preparation for Protein Stability and Formulation Screening via Buffer Exchange.

    PubMed

    Ying, William; Levons, Jaquan K; Carney, Andrea; Gandhi, Rajesh; Vydra, Vicky; Rubin, A Erik

    2016-06-01

    A novel semiautomated buffer exchange process workflow was developed to enable efficient early protein formulation screening. An antibody fragment protein, BMSdab, was used to demonstrate the workflow. The process afforded 60% to 80% cycle time and scientist time savings and significant material efficiencies. These efficiencies ultimately facilitated execution of this stability work earlier in the drug development process, allowing this tool to inform the developability of potential candidates for development from a formulation perspective. To overcome the key technical challenges, the protein solution was buffer-exchanged by centrifuge filtration into formulations for stability screening in a 96-well plate with an ultrafiltration membrane, leveraging automated liquid handling and acoustic volume measurements to allow several cycles of exchanges. The formulations were transferred into a vacuum manifold and sterile filtered into a rack holding 96 glass vials. The vials were sealed with a capmat of individual caps and placed in stability stations. Stability of the samples prepared by this process and by the standard process was demonstrated to be comparable. This process enabled screening a number of formulations of a protein at an early pharmaceutical development stage with a short sample preparation time. © 2015 Society for Laboratory Automation and Screening.

  2. Effect of Cord Blood Processing on Transplant Outcomes after Single Myeloablative Umbilical Cord Blood Transplantation

    PubMed Central

    Ballen, Karen K.; Logan, Brent R.; Laughlin, Mary J.; He, Wensheng; Ambruso, Daniel R.; Armitage, Susan E.; Beddard, Rachel L.; Bhatla, Deepika; Hwang, William Y.K.; Kiss, Joseph E.; Koegler, Gesine; Kurtzberg, Joanne; Nagler, Arnon; Oh, David; Petz, Lawrence D.; Price, Thomas H.; Quinones, Ralph R.; Ratanatharathorn, Voravit; Rizzo, J. Douglas; Sazama, Kathleen; Scaradavou, Andromachi; Schuster, Michael W.; Sender, Leonard S.; Shpall, Elizabeth J.; Spellman, Stephen R.; Sutton, Millicent; Weitekamp, Lee Ann; Wingard, John R.; Eapen, Mary

    2015-01-01

    Variations in cord blood manufacturing and administration are common, and the optimal practice is not known. We compared processing and banking practices at 16 public cord blood banks (CBB) in the United States, and assessed transplant outcomes on 530 single umbilical cord blood (UCB) myeloablative transplantations for hematologic malignancies, facilitated by these banks. UCB banking practices were separated into three mutually exclusive groups based on whether processing was automated or manual and whether units were plasma- and red blood cell-reduced (buffy coat production method) or plasma-reduced. Compared to the automated processing system for units, the day-28 neutrophil recovery was significantly lower after transplantation of units that were manually processed and plasma reduced (red cell replete) (odds ratio [OR] 0.19, p=0.001) or plasma and red cell reduced (OR 0.54, p=0.05). Day-100 survival did not differ by CBB. However, day-100 survival was better with units that were thawed with the dextran-albumin wash method compared to the “no wash” or “dilution only” techniques (OR 1.82, p=0.04). In conclusion, CBB processing has no significant effect on early (day 100) survival despite differences in kinetics of neutrophil recovery. PMID:25543094

  3. Neurodegenerative changes in Alzheimer's disease: a comparative study of manual, semi-automated, and fully automated assessment using MRI

    NASA Astrophysics Data System (ADS)

    Fritzsche, Klaus H.; Giesel, Frederik L.; Heimann, Tobias; Thomann, Philipp A.; Hahn, Horst K.; Pantel, Johannes; Schröder, Johannes; Essig, Marco; Meinzer, Hans-Peter

    2008-03-01

    Objective quantification of disease-specific neurodegenerative changes can facilitate diagnosis and therapeutic monitoring in several neuropsychiatric disorders. Reproducibility and easy-to-perform assessment are essential to ensure applicability in clinical environments. The aim of this comparative study is the evaluation of a fully automated approach that assesses atrophic changes in Alzheimer's disease (AD) and Mild Cognitive Impairment (MCI). 21 healthy volunteers (mean age 66.2), 21 patients with MCI (66.6), and 10 patients with AD (65.1) were enrolled. Subjects underwent extensive neuropsychological testing and MRI was conducted on a 1.5 Tesla clinical scanner. Atrophic changes were measured automatically by a series of image processing steps including state-of-the-art brain mapping techniques. Results were compared with two reference approaches: a manual segmentation of the hippocampal formation and a semi-automated estimation of temporal horn volume, which is based upon interactive selection of two to six landmarks in the ventricular system. All approaches separated controls and AD patients significantly (10^-5 < p < 10^-4) and showed a slight but not significant increase of neurodegeneration for subjects with MCI compared to volunteers. The automated approach correlated significantly with the manual (r = -0.65, p < 10^-6) and semi-automated (r = -0.83, p < 10^-13) measurements. It demonstrated high accuracy while maximizing observer independence and time savings, and thus its usefulness for clinical routine.

  4. Transforming BIM to BEM: Generation of Building Geometry for the NASA Ames Sustainability Base BIM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Donnell, James T.; Maile, Tobias; Rose, Cody

    Typical processes of whole Building Energy simulation Model (BEM) generation are subjective, labor intensive, time intensive and error prone. Essentially, these typical processes reproduce already existing data, i.e. building models already created by the architect. Accordingly, Lawrence Berkeley National Laboratory (LBNL) developed a semi-automated process that enables reproducible conversions of Building Information Model (BIM) representations of building geometry into a format required by building energy modeling (BEM) tools. This is a generic process that may be applied to all building energy modeling tools but to date has only been used for EnergyPlus. This report describes and demonstrates each stage in the semi-automated process for building geometry using the recently constructed NASA Ames Sustainability Base throughout. This example uses ArchiCAD (Graphisoft, 2012) as the originating CAD tool and EnergyPlus as the concluding whole building energy simulation tool. It is important to note that the process is also applicable for professionals that use other CAD tools such as Revit (“Revit Architecture,” 2012) and DProfiler (Beck Technology, 2012) and can be extended to provide geometry definitions for BEM tools other than EnergyPlus. Geometry Simplification Tool (GST) was used during the NASA Ames project and was the enabling software that facilitated semi-automated data transformations. GST has now been superseded by Space Boundary Tool (SBT-1) and will be referred to as SBT-1 throughout this report. The benefits of this semi-automated process are fourfold: 1) reduce the amount of time and cost required to develop a whole building energy simulation model, 2) enable rapid generation of design alternatives, 3) improve the accuracy of BEMs and 4) result in significantly better performing buildings with significantly lower energy consumption than those created using the traditional design process, especially if the simulation model was used as a predictive benchmark during operation. Developing BIM-based criteria to support the semi-automated process should result in significant, reliable improvements and time savings in the development of BEMs. In order to define successful BIMs, CAD export of IFC-based BIMs for BEM must adhere to a standard Model View Definition (MVD) for simulation as provided by the concept design BIM MVD (buildingSMART, 2011). In order to ensure wide-scale adoption, companies would also need to develop their own material libraries to support automated activities and undertake a pilot project to improve understanding of modeling conventions and design tool features and limitations.

  5. SYRIAC: The systematic review information automated collection system a data warehouse for facilitating automated biomedical text classification.

    PubMed

    Yang, Jianji J; Cohen, Aaron M; Cohen, Aaron; McDonagh, Marian S

    2008-11-06

    Automatic document classification can be valuable in increasing the efficiency in updating systematic reviews (SR). In order for the machine learning process to work well, it is critical to create and maintain high-quality training datasets consisting of expert SR inclusion/exclusion decisions. This task can be laborious, especially when the number of topics is large and source data format is inconsistent. To approach this problem, we build an automated system to streamline the required steps, from initial notification of update in source annotation files to loading the data warehouse, along with a web interface to monitor the status of each topic. In our current collection of 26 SR topics, we were able to standardize almost all of the relevance judgments and recovered PMIDs for over 80% of all articles. Of those PMIDs, over 99% were correct in a manual random sample study. Our system performs an essential function in creating training and evaluation data sets for SR text mining research.
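
    SYRIAC itself is a data warehouse rather than a classifier, but the training sets it assembles feed models of roughly the following shape. The Python/scikit-learn sketch below, with invented toy abstracts and labels, is only an illustration of such an inclusion/exclusion classifier, not the system described here.

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Toy stand-in for warehouse output: abstracts plus expert include/exclude labels.
    abstracts = [
        "randomized trial of drug A versus placebo in adults",
        "case report of a rare adverse event",
        "narrative review of treatment options for condition X",
        "animal study of compound B pharmacokinetics",
    ]
    labels = [1, 0, 0, 0]  # 1 = include in the systematic review, 0 = exclude

    classifier = make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2)),     # unigrams and bigrams
        LogisticRegression(max_iter=1000),
    )
    classifier.fit(abstracts, labels)
    print(classifier.predict(["double-blind randomized controlled trial of drug A"]))
    ```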

  6. SYRIAC: The SYstematic Review Information Automated Collection System A Data Warehouse for Facilitating Automated Biomedical Text Classification

    PubMed Central

    Yang, Jianji J.; Cohen, Aaron M.; McDonagh, Marian S.

    2008-01-01

    Automatic document classification can be valuable in increasing the efficiency in updating systematic reviews (SR). In order for the machine learning process to work well, it is critical to create and maintain high-quality training datasets consisting of expert SR inclusion/exclusion decisions. This task can be laborious, especially when the number of topics is large and source data format is inconsistent. To approach this problem, we build an automated system to streamline the required steps, from initial notification of update in source annotation files to loading the data warehouse, along with a web interface to monitor the status of each topic. In our current collection of 26 SR topics, we were able to standardize almost all of the relevance judgments and recovered PMIDs for over 80% of all articles. Of those PMIDs, over 99% were correct in a manual random sample study. Our system performs an essential function in creating training and evaluation datasets for SR text mining research. PMID:18999194

  7. ST Spot Detector: a web-based application for automatic spot and tissue detection for spatial Transcriptomics image datasets.

    PubMed

    Wong, Kim; Navarro, José Fernández; Bergenstråhle, Ludvig; Ståhl, Patrik L; Lundeberg, Joakim

    2018-06-01

    Spatial Transcriptomics (ST) is a method that combines high-resolution tissue imaging with high-throughput transcriptome sequencing data. These data must be aligned with the images for correct visualization, a process that involves several manual steps. Here we present ST Spot Detector, a web tool that automates and facilitates this alignment through a user-friendly interface. jose.fernandez.navarro@scilifelab.se. Supplementary data are available at Bioinformatics online.

  8. Manned Orbital Transfer Vehicle (MOTV). Volume 5: Turnaround analysis

    NASA Technical Reports Server (NTRS)

    Boyland, R. E.; Sherman, S. W.; Morfin, H. W.

    1979-01-01

    The development of a low-cost reliable turnaround process to employ the MOTV in enhancing the utilization of the geosynchronous space region is analyzed. It is indicated that a routine effective turnaround/maintenance plan must make maximum use of flight data for maintenance planning, a high degree of test automation, and MOTV maintainability features in order to minimize tests, facilitate repair, and reduce manpower requirements. An effective turnaround plan provides a payback of reduced risks.

  9. Microfluidic large-scale integration: the evolution of design rules for biological automation.

    PubMed

    Melin, Jessica; Quake, Stephen R

    2007-01-01

    Microfluidic large-scale integration (mLSI) refers to the development of microfluidic chips with thousands of integrated micromechanical valves and control components. This technology is utilized in many areas of biology and chemistry and is a candidate to replace today's conventional automation paradigm, which consists of fluid-handling robots. We review the basic development of mLSI and then discuss design principles of mLSI to assess the capabilities and limitations of the current state of the art and to facilitate the application of mLSI to areas of biology. Many design and practical issues, including economies of scale, parallelization strategies, multiplexing, and multistep biochemical processing, are discussed. Several microfluidic components used as building blocks to create effective, complex, and highly integrated microfluidic networks are also highlighted.

  10. Laboratory Automation and Middleware.

    PubMed

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although not able to implement total laboratory automation, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Automated Fall Detection With Quality Improvement “Rewind” to Reduce Falls in Hospital Rooms

    PubMed Central

    Rantz, Marilyn J.; Banerjee, Tanvi S.; Cattoor, Erin; Scott, Susan D.; Skubic, Marjorie; Popescu, Mihail

    2014-01-01

    The purpose of this study was to test the implementation of a fall detection and “rewind” privacy-protecting technique using the Microsoft® Kinect™ to not only detect but prevent falls from occurring in hospitalized patients. Kinect sensors were placed in six hospital rooms in a step-down unit and data were continuously logged. Prior to implementation with patients, three researchers performed a total of 18 falls (walking and then falling down or falling from the bed) and 17 non-fall events (crouching down, stooping down to tie shoe laces, and lying on the floor). All falls and non-falls were correctly identified using automated algorithms to process Kinect sensor data. During the first 8 months of data collection, processing methods were perfected to manage data and provide a “rewind” method to view events that led to falls for post-fall quality improvement process analyses. Preliminary data from this feasibility study show that using the Microsoft Kinect sensors provides detection of falls, fall risks, and facilitates quality improvement after falls in real hospital environments unobtrusively, while taking into account patient privacy. PMID:24296567
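
    The detection algorithms themselves are not detailed in this abstract. Purely as a hedged illustration of the general idea, the toy Python sketch below flags a rapid drop in a tracked head-height series of the kind that depth sensing can provide; the frame rate, drop threshold, and window length are arbitrary placeholders, not the study's parameters.

    ```python
    import numpy as np

    def detect_falls(head_height_m, fps=30, drop_threshold_m=0.9, window_s=1.0):
        """Flag frames where tracked head height drops sharply within a short window.

        A toy illustration only: real depth-sensor-based detectors use richer 3-D
        features and clinically validated thresholds.
        """
        window = int(window_s * fps)
        heights = np.asarray(head_height_m, dtype=float)
        falls = []
        for i in range(window, len(heights)):
            # Large drop over the window and the head now close to the floor.
            if heights[i - window] - heights[i] > drop_threshold_m and heights[i] < 0.5:
                falls.append(i)
        return falls

    if __name__ == "__main__":
        t = np.linspace(0, 4, 120)                 # 4 s at 30 fps
        height = np.where(t < 2.0, 1.6, 0.3)       # standing, then on the floor
        print("fall detected at frame:", detect_falls(height)[:1])
    ```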

  12. Robotics-assisted mass spectrometry assay platform enabled by open-source electronics.

    PubMed

    Chiu, Shih-Hao; Urban, Pawel L

    2015-02-15

    Mass spectrometry (MS) is an important analytical technique with numerous applications in clinical analysis, biochemistry, environmental analysis, geology and physics. Its success builds on the ability of MS to determine molecular weights of analytes, and elucidate their structures. However, sample handling prior to MS requires a lot of attention and labor. In this work we aimed to automate sample processing for MS so that analyses could be conducted without close supervision by experienced analysts. The goal of this study was to develop a robotics and information technology-oriented platform that could control the whole analysis process including sample delivery, reaction-based assay, data acquisition, and interaction with the analyst. The proposed platform incorporates a robotic arm for handling sample vials delivered to the laboratory, and several auxiliary devices which facilitate and secure the analysis process. They include: multi-relay board, infrared sensors, photo-interrupters, gyroscopes, force sensors, fingerprint scanner, barcode scanner, touch screen panel, and internet interface. The control of all the building blocks is achieved through implementation of open-source electronics (Arduino), and enabled by custom-written programs in C language. The advantages of the proposed system include: low cost, simplicity, small size, as well as facile automation of sample delivery and processing without the intervention of the analyst. It is envisaged that this simple robotic system may be the forerunner of automated laboratories dedicated to mass spectrometric analysis of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Computational Testing for Automated Preprocessing 2: Practical Demonstration of a System for Scientific Data-Processing Workflow Management for High-Volume EEG

    PubMed Central

    Cowley, Benjamin U.; Korpela, Jussi

    2018-01-01

    Existing tools for the preprocessing of EEG data provide a large choice of methods to suitably prepare and analyse a given dataset. Yet it remains a challenge for the average user to integrate methods for batch processing of the increasingly large datasets of modern research, and compare methods to choose an optimal approach across the many possible parameter configurations. Additionally, many tools still require a high degree of manual decision making for, e.g., the classification of artifacts in channels, epochs or segments. This introduces extra subjectivity, is slow, and is not reproducible. Batching and well-designed automation can help to regularize EEG preprocessing, and thus reduce human effort, subjectivity, and consequent error. The Computational Testing for Automated Preprocessing (CTAP) toolbox facilitates: (i) batch processing that is easy for experts and novices alike; (ii) testing and comparison of preprocessing methods. Here we demonstrate the application of CTAP to high-resolution EEG data in three modes of use. First, a linear processing pipeline with mostly default parameters illustrates ease-of-use for naive users. Second, a branching pipeline illustrates CTAP's support for comparison of competing methods. Third, a pipeline with built-in parameter-sweeping illustrates CTAP's capability to support data-driven method parameterization. CTAP extends the existing functions and data structure from the well-known EEGLAB toolbox, based on Matlab, and produces extensive quality control outputs. CTAP is available under MIT open-source licence from https://github.com/bwrc/ctap. PMID:29692705
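
    CTAP itself is built on Matlab and EEGLAB. Purely to illustrate what a linear batch-preprocessing pipeline of this kind looks like, the sketch below uses MNE-Python instead; the file layout, filter band, 50 Hz notch, and average reference are assumptions rather than CTAP defaults.

    ```python
    from pathlib import Path
    import mne

    RAW_DIR = Path("data/raw")        # hypothetical layout: one *_raw.fif file per subject
    OUT_DIR = Path("data/preproc")

    def preprocess(raw_path, out_dir):
        """Linear pipeline: load, band-pass, notch, re-reference, save."""
        raw = mne.io.read_raw_fif(raw_path, preload=True)
        raw.filter(l_freq=1.0, h_freq=40.0)        # band-pass (assumed band)
        raw.notch_filter(freqs=50.0)               # mains interference (assumed 50 Hz)
        raw.set_eeg_reference("average")           # average reference
        out_dir.mkdir(parents=True, exist_ok=True)
        out_name = raw_path.name.replace("_raw.fif", "_preproc_raw.fif")
        raw.save(out_dir / out_name, overwrite=True)

    if __name__ == "__main__":
        for path in sorted(RAW_DIR.glob("*_raw.fif")):   # batch over all subjects
            preprocess(path, OUT_DIR)
    ```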

  14. Computational Testing for Automated Preprocessing 2: Practical Demonstration of a System for Scientific Data-Processing Workflow Management for High-Volume EEG.

    PubMed

    Cowley, Benjamin U; Korpela, Jussi

    2018-01-01

    Existing tools for the preprocessing of EEG data provide a large choice of methods to suitably prepare and analyse a given dataset. Yet it remains a challenge for the average user to integrate methods for batch processing of the increasingly large datasets of modern research, and compare methods to choose an optimal approach across the many possible parameter configurations. Additionally, many tools still require a high degree of manual decision making for, e.g., the classification of artifacts in channels, epochs or segments. This introduces extra subjectivity, is slow, and is not reproducible. Batching and well-designed automation can help to regularize EEG preprocessing, and thus reduce human effort, subjectivity, and consequent error. The Computational Testing for Automated Preprocessing (CTAP) toolbox facilitates: (i) batch processing that is easy for experts and novices alike; (ii) testing and comparison of preprocessing methods. Here we demonstrate the application of CTAP to high-resolution EEG data in three modes of use. First, a linear processing pipeline with mostly default parameters illustrates ease-of-use for naive users. Second, a branching pipeline illustrates CTAP's support for comparison of competing methods. Third, a pipeline with built-in parameter-sweeping illustrates CTAP's capability to support data-driven method parameterization. CTAP extends the existing functions and data structure from the well-known EEGLAB toolbox, based on Matlab, and produces extensive quality control outputs. CTAP is available under MIT open-source licence from https://github.com/bwrc/ctap.

  15. Process development of human multipotent stromal cell microcarrier culture using an automated high-throughput microbioreactor.

    PubMed

    Rafiq, Qasim A; Hanga, Mariana P; Heathman, Thomas R J; Coopman, Karen; Nienow, Alvin W; Williams, David J; Hewitt, Christopher J

    2017-10-01

    Microbioreactors play a critical role in process development as they reduce reagent requirements and can facilitate high-throughput screening of process parameters and culture conditions. Here, we have demonstrated and explained in detail, for the first time, the amenability of the automated ambr15 cell culture microbioreactor system for the development of scalable adherent human mesenchymal multipotent stromal/stem cell (hMSC) microcarrier culture processes. This was achieved by first improving suspension and mixing of the microcarriers and then improving cell attachment thereby reducing the initial growth lag phase. The latter was achieved by using only 50% of the final working volume of medium for the first 24 h and using an intermittent agitation strategy. These changes resulted in >150% increase in viable cell density after 24 h compared to the original process (no agitation for 24 h and 100% working volume). Using the same methodology as in the ambr15, similar improvements were obtained with larger scale spinner flask studies. Finally, this improved bioprocess methodology based on a serum-based medium was applied to a serum-free process in the ambr15, resulting in >250% increase in yield compared to the serum-based process. At both scales, the agitation used during culture was the minimum required for microcarrier suspension, NJS. The use of the ambr15, with its improved control compared to the spinner flask, reduced the coefficient of variation on viable cell density in the serum-containing medium from 7.65% to 4.08%, and the switch to serum-free further reduced these to 1.06-0.54%, respectively. The combination of both serum-free and automated processing improved the reproducibility more than 10-fold compared to the serum-based, manual spinner flask process. The findings of this study demonstrate that the ambr15 microbioreactor is an effective tool for bioprocess development of hMSC microcarrier cultures and that a combination of serum-free medium, control, and automation improves both process yield and consistency. Biotechnol. Bioeng. 2017;114: 2253-2266. © 2017 Wiley Periodicals, Inc.

  16. Process development of human multipotent stromal cell microcarrier culture using an automated high‐throughput microbioreactor

    PubMed Central

    Hanga, Mariana P.; Heathman, Thomas R. J.; Coopman, Karen; Nienow, Alvin W.; Williams, David J.; Hewitt, Christopher J.

    2017-01-01

    ABSTRACT Microbioreactors play a critical role in process development as they reduce reagent requirements and can facilitate high‐throughput screening of process parameters and culture conditions. Here, we have demonstrated and explained in detail, for the first time, the amenability of the automated ambr15 cell culture microbioreactor system for the development of scalable adherent human mesenchymal multipotent stromal/stem cell (hMSC) microcarrier culture processes. This was achieved by first improving suspension and mixing of the microcarriers and then improving cell attachment thereby reducing the initial growth lag phase. The latter was achieved by using only 50% of the final working volume of medium for the first 24 h and using an intermittent agitation strategy. These changes resulted in >150% increase in viable cell density after 24 h compared to the original process (no agitation for 24 h and 100% working volume). Using the same methodology as in the ambr15, similar improvements were obtained with larger scale spinner flask studies. Finally, this improved bioprocess methodology based on a serum‐based medium was applied to a serum‐free process in the ambr15, resulting in >250% increase in yield compared to the serum‐based process. At both scales, the agitation used during culture was the minimum required for microcarrier suspension, NJS. The use of the ambr15, with its improved control compared to the spinner flask, reduced the coefficient of variation on viable cell density in the serum containing medium from 7.65% to 4.08%, and the switch to serum free further reduced these to 1.06–0.54%, respectively. The combination of both serum‐free and automated processing improved the reproducibility more than 10‐fold compared to the serum‐based, manual spinner flask process. The findings of this study demonstrate that the ambr15 microbioreactor is an effective tool for bioprocess development of hMSC microcarrier cultures and that a combination of serum‐free medium, control, and automation improves both process yield and consistency. Biotechnol. Bioeng. 2017;114: 2253–2266. © 2017 The Authors. Biotechnology and Bioengineering Published by Wiley Periodicals, Inc. PMID:28627713

  17. Development of a menu of performance tests self-administered on a portable microcomputer

    NASA Technical Reports Server (NTRS)

    Wilkes, Robert L.; Kuntz, Lois-Ann; Kennedy, Robert S.

    1987-01-01

    Eighteen cognitive, motor, and information processing performance subtests were screened for self-administration over 10 trials by 16 subjects. When altered presentation forms of the same test were collectively considered, the battery composition was reduced to 10 distinctly different measures. A fully automated microbased testing system was employed in presenting the battery of subtests. Successful self-administration of the battery provided for the field testing of the automated system and facilitated convenient data collection. Total test administration time was 47.2 minutes for each session. Results indicated that nine of the tests stabilized, but for a short battery of tests only five are recommended for use in repeated-measures research. The five recommended tests include: the Tapping series, Number Comparison, Short-term Memory, Grammatical Reasoning, and 4-Choice Reaction Time. These tests can be expected to reveal three factors: (1) cognition, (2) processing quickness, and (3) motor. All the tests stabilized in 24 minutes, or approximately two 12-minute sessions.

  18. Automatic OPC repair flow: optimized implementation of the repair recipe

    NASA Astrophysics Data System (ADS)

    Bahnas, Mohamed; Al-Imam, Mohamed; Word, James

    2007-10-01

    Virtual manufacturing, enabled by rapid, accurate, full-chip simulation, is a main pillar in achieving successful mask tape-out in cutting-edge low-k1 lithography. It facilitates detecting printing failures before a costly and time-consuming mask tape-out and wafer print occur. The OPC verification step plays a critical role in the early production phases of a new process development, since various layout patterns will be suspected to fail or cause performance degradation, and in turn need to be accurately flagged and fed back to the OPC engineer for further learning and enhancement of the OPC recipe. At the advanced phases of process development, there is a much lower probability of detecting failures, but the OPC verification step still acts as the last line of defense for the whole implemented RET work. In a recent publication, the optimum approach for responding to these detected failures was addressed, and a solution was proposed to repair these defects with an automated methodology that is fully integrated with and compatible with the main RET/OPC flow. In this paper the authors will present further work and optimizations of this repair flow. An automated methodology for analyzing the root causes of the defects and classifying them to cover all possible causes will be discussed. This automated analysis approach will include all the learning experience of the previously highlighted causes and include any new discoveries. Next, according to the automated pre-classification of the defects, the appropriate OPC repair approach (i.e., OPC knob) can be selected for each classified defect location, instead of applying all approaches to all locations. This will help cut down the runtime of the OPC repair processing and reduce the number of iterations needed to reach zero defects. An output report on the existing causes of defects and how the tool handled them will be generated. The report will help further learning and facilitate the enhancement of the main OPC recipe. Accordingly, the main OPC recipe can be made more robust, converging faster and probably in fewer iterations. This knowledge feedback loop is one of the fruitful benefits of the Automatic OPC Repair flow.

  19. Automation of ALK gene rearrangement testing with fluorescence in situ hybridization (FISH): a feasibility study.

    PubMed

    Zwaenepoel, Karen; Merkle, Dennis; Cabillic, Florian; Berg, Erica; Belaud-Rotureau, Marc-Antoine; Grazioli, Vittorio; Herelle, Olga; Hummel, Michael; Le Calve, Michele; Lenze, Dido; Mende, Stefanie; Pauwels, Patrick; Quilichini, Benoit; Repetti, Elena

    2015-02-01

    In the past several years we have observed a significant increase in our understanding of molecular mechanisms that drive lung cancer. Specifically in the non-small cell lung cancer sub-types, ALK gene rearrangements represent a sub-group of tumors that are targetable by the tyrosine kinase inhibitor Crizotinib, resulting in significant reductions in tumor burden. Phase II and III clinical trials were performed using an ALK break-apart FISH probe kit, making FISH the gold standard for identifying ALK rearrangements in patients. FISH is often considered a labor and cost intensive molecular technique, and in this study we aimed to demonstrate feasibility for automation of ALK FISH testing, to improve laboratory workflow and ease of testing. This involved automation of the pre-treatment steps of the ALK assay using various protocols on the VP 2000 instrument, and facilitating automated scanning of the fluorescent FISH specimens for simplified enumeration on various backend scanning and analysis systems. The results indicated that ALK FISH can be automated. Significantly, both the Ikoniscope and BioView system of automated FISH scanning and analysis systems provided a robust analysis algorithm to define ALK rearrangements. In addition, the BioView system facilitated consultation of difficult cases via the internet. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Automated data mining of a proprietary database system for physician quality improvement.

    PubMed

    Johnstone, Peter A S; Crenshaw, Tim; Cassels, Diane G; Fox, Timothy H

    2008-04-01

    Physician practice quality improvement is a subject of intense national debate. This report describes using a software data acquisition program to mine an existing, commonly used proprietary radiation oncology database to assess physician performance. Between 2003 and 2004, a manual analysis was performed of electronic portal image (EPI) review records. Custom software was recently developed to mine the record-and-verify database and the review process of EPI at our institution. In late 2006, a report was developed that allowed for immediate review of physician completeness and speed of EPI review for any prescribed period. The software extracted >46,000 EPIs between 2003 and 2007, providing EPI review status and time to review by each physician. Between 2003 and 2007, the department EPI review improved from 77% to 97% (range, 85.4-100%), with a decrease in the mean time to review from 4.2 days to 2.4 days. The initial intervention in 2003 to 2004 was moderately successful in changing the EPI review patterns; it was not repeated because of the time required to perform it. However, the implementation in 2006 of the automated review tool yielded a profound change in practice. Using the software, the automated chart review required approximately 1.5 h for mining and extracting the data for the 4-year period. This study quantified the EPI review process as it evolved during a 4-year period at our institution and found that automation of data retrieval and review simplified and facilitated physician quality improvement.
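
    The report logic is not spelled out here; the pandas sketch below illustrates the kind of per-physician summary such a tool could compute (completeness of EPI review and mean days to review). The column names and example rows are invented for demonstration.

    ```python
    import pandas as pd

    # Hypothetical export from the record-and-verify database: one row per portal image.
    epi = pd.DataFrame({
        "physician": ["A", "A", "B", "B", "B"],
        "acquired":  pd.to_datetime(["2007-01-02", "2007-01-03", "2007-01-02",
                                     "2007-01-04", "2007-01-05"]),
        "reviewed":  pd.to_datetime(["2007-01-04", None, "2007-01-03",
                                     "2007-01-09", "2007-01-06"]),
    })

    # Days from acquisition to review; unreviewed images stay as NaN.
    epi["days_to_review"] = (epi["reviewed"] - epi["acquired"]).dt.days

    summary = epi.groupby("physician").agg(
        images=("acquired", "size"),
        pct_reviewed=("reviewed", lambda s: 100 * s.notna().mean()),
        mean_days=("days_to_review", "mean"),
    )
    print(summary)
    ```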

  1. Skin Bioprinting: Impending Reality or Fantasy?

    PubMed

    Ng, Wei Long; Wang, Shuai; Yeong, Wai Yee; Naing, May Win

    2016-09-01

    Bioprinting provides a fully automated and advanced platform that facilitates the simultaneous and highly specific deposition of multiple types of skin cells and biomaterials, a process that is lacking in conventional skin tissue-engineering approaches. Here, we provide a realistic, current overview of skin bioprinting, distinguishing facts from myths. We present an in-depth analysis of both current skin bioprinting works and the cellular and matrix components of native human skin. We also highlight current limitations and achievements, followed by design considerations and a future outlook for skin bioprinting. The potential of bioprinting with converging opportunities in biology, material, and computational design will eventually facilitate the fabrication of improved tissue-engineered (TE) skin constructs, making bioprinting skin an impending reality. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Modeling Driving Performance Using In-Vehicle Speech Data From a Naturalistic Driving Study.

    PubMed

    Kuo, Jonny; Charlton, Judith L; Koppel, Sjaan; Rudin-Brown, Christina M; Cross, Suzanne

    2016-09-01

    We aimed to (a) describe the development and application of an automated approach for processing in-vehicle speech data from a naturalistic driving study (NDS), (b) examine the influence of child passenger presence on driving performance, and (c) model this relationship using in-vehicle speech data. Parent drivers frequently engage in child-related secondary behaviors, but the impact on driving performance is unknown. Applying automated speech-processing techniques to NDS audio data would facilitate the analysis of in-vehicle driver-child interactions and their influence on driving performance. Speech activity detection and speaker diarization algorithms were applied to audio data from a Melbourne-based NDS involving 42 families. Multilevel models were developed to evaluate the effect of speech activity and the presence of child passengers on driving performance. Speech activity was significantly associated with velocity and steering angle variability. Child passenger presence alone was not associated with changes in driving performance. However, speech activity in the presence of two child passengers was associated with the most variability in driving performance. The effects of in-vehicle speech on driving performance in the presence of child passengers appear to be heterogeneous, and multiple factors may need to be considered in evaluating their impact. This goal can potentially be achieved within large-scale NDS through the automated processing of observational data, including speech. Speech-processing algorithms enable new perspectives on driving performance to be gained from existing NDS data, and variables that were once labor-intensive to process can be readily utilized in future research. © 2016, Human Factors and Ergonomics Society.
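
    As an illustration of the multilevel modeling step only (not the study's dataset or exact specification), the sketch below fits a mixed-effects model with statsmodels, relating a steering-variability measure to detected speech activity and child-passenger count with a random intercept per driver; all variable names and data are simulated placeholders.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated stand-in data: one row per driving epoch.
    rng = np.random.default_rng(1)
    n = 200
    df = pd.DataFrame({
        "driver":        rng.integers(0, 20, size=n).astype(str),
        "speech_active": rng.integers(0, 2, size=n),
        "n_children":    rng.integers(0, 3, size=n),
    })
    df["steering_sd"] = (0.5 + 0.2 * df["speech_active"]
                         + 0.1 * df["speech_active"] * (df["n_children"] == 2)
                         + rng.normal(0, 0.1, size=n))

    # Multilevel (mixed-effects) model: fixed effects for speech activity and
    # passenger count, random intercept per driver.
    model = smf.mixedlm("steering_sd ~ speech_active * C(n_children)",
                        data=df, groups=df["driver"])
    print(model.fit().summary())
    ```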

  3. Modular modeling system for building distributed hydrologic models with a user-friendly software package

    NASA Astrophysics Data System (ADS)

    Wi, S.; Ray, P. A.; Brown, C.

    2015-12-01

    A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort, and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.

  4. DHMI: dynamic holographic microscopy interface

    NASA Astrophysics Data System (ADS)

    He, Xuefei; Zheng, Yujie; Lee, Woei Ming

    2016-12-01

    Digital holographic microscopy (DHM) is a powerful in-vitro biological imaging tool. In this paper, we report a fully automated off-axis digital holographic microscopy system complete with a graphical user interface in the Matlab environment. The interface primarily includes Fourier domain processing, phase reconstruction, aberration compensation and autofocusing. A variety of imaging operations, such as region-of-interest selection, a de-noising mode (filtering and averaging), low-frame-rate imaging for immediate reconstruction, and a high-frame-rate imaging routine (27 fps), are implemented to facilitate ease of use.
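
    A minimal NumPy sketch of the Fourier-domain step common to off-axis DHM reconstruction is given below: isolate one sideband around the (assumed known) carrier peak, re-centre it, and take the phase of the inverse transform. Aberration compensation and autofocusing, which the reported interface also provides, are omitted; the carrier coordinates and mask radius are placeholders.

    ```python
    import numpy as np

    def reconstruct_phase(hologram, carrier_row, carrier_col, radius=40):
        """Recover wrapped phase from an off-axis hologram via sideband filtering."""
        spectrum = np.fft.fftshift(np.fft.fft2(hologram))

        # Circular mask around the (assumed known) +1-order carrier peak.
        rows, cols = np.indices(spectrum.shape)
        mask = (rows - carrier_row) ** 2 + (cols - carrier_col) ** 2 <= radius ** 2
        sideband = np.where(mask, spectrum, 0)

        # Shift the sideband to the spectrum centre to remove the carrier fringes.
        centre = (spectrum.shape[0] // 2, spectrum.shape[1] // 2)
        sideband = np.roll(sideband,
                           (centre[0] - carrier_row, centre[1] - carrier_col),
                           axis=(0, 1))

        field = np.fft.ifft2(np.fft.ifftshift(sideband))
        return np.angle(field)      # wrapped phase; unwrap/compensate downstream

    if __name__ == "__main__":
        holo = np.random.rand(512, 512)      # placeholder for a recorded hologram
        phase = reconstruct_phase(holo, carrier_row=180, carrier_col=330)
        print(phase.shape, float(phase.min()), float(phase.max()))
    ```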

  5. Computer Aided Software Engineering (CASE) Environment Issues.

    DTIC Science & Technology

    1987-06-01

    tasks tend to be error prone and slow when done by humans. These are excellent candidates for automation using a computer. (MacLennan, 1981, p. 512) ... CASE resources; * human resources, consisting of the people who use and facilitate utilization (in the case of manual resources) of the environment ... engineering process in a given environment ... the nature of manual and human resources. CASE resources should provide the software engineering team ...

  6. An Automated Approach to Instructional Design Guidance.

    ERIC Educational Resources Information Center

    Spector, J. Michael; And Others

    This paper describes the Guided Approach to Instructional Design Advising (GAIDA), an automated instructional design tool that incorporates techniques of artificial intelligence. GAIDA was developed by the U.S. Air Force Armstrong Laboratory to facilitate the planning and production of interactive courseware and computer-based training materials.…

  7. PRISM Software: Processing and Review Interface for Strong‐Motion Data

    USGS Publications Warehouse

    Jones, Jeanne M.; Kalkan, Erol; Stephens, Christopher D.; Ng, Peter

    2017-01-01

    A continually increasing number of high‐quality digital strong‐motion records from stations of the National Strong Motion Project (NSMP) of the U.S. Geological Survey, as well as data from regional seismic networks within the United States, calls for automated processing of strong‐motion records with human review limited to selected significant or flagged records. The NSMP has developed the Processing and Review Interface for Strong Motion data (PRISM) software to meet this need. In combination with the Advanced National Seismic System Quake Monitoring System (AQMS), PRISM automates the processing of strong‐motion records. When used without AQMS, PRISM provides batch‐processing capabilities. The PRISM software is platform independent (coded in Java), open source, and does not depend on any closed‐source or proprietary software. The software consists of two major components: a record processing engine composed of modules for each processing step, and a review tool, which is a graphical user interface for manual review, edit, and processing. To facilitate use by non‐NSMP earthquake engineers and scientists, PRISM (both its processing engine and review tool) is easy to install and run as a stand‐alone system on common operating systems such as Linux, OS X, and Windows. PRISM was designed to be flexible and extensible to accommodate implementation of new processing techniques. All the computing features have been thoroughly tested.
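
    PRISM itself is open-source Java with its own processing modules; as a generic, hedged illustration of batch strong-motion processing in Python, the loop below applies a basic demean/taper/band-pass sequence with ObsPy. The folder names, file format, and filter corners are assumptions, not PRISM's configuration.

    ```python
    from pathlib import Path
    from obspy import read

    RAW_DIR = Path("records/raw")        # hypothetical folder of miniSEED acceleration records
    OUT_DIR = Path("records/processed")
    OUT_DIR.mkdir(parents=True, exist_ok=True)

    for path in sorted(RAW_DIR.glob("*.mseed")):
        st = read(str(path))
        st.detrend("demean")                               # remove offset
        st.taper(max_percentage=0.05, type="cosine")       # taper ends before filtering
        st.filter("bandpass", freqmin=0.1, freqmax=25.0,   # assumed strong-motion band
                  corners=4, zerophase=True)
        st.write(str(OUT_DIR / path.name), format="MSEED")
    ```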

  8. A Cognitive Systems Engineering Approach to Developing HMI Requirements for New Technologies

    NASA Technical Reports Server (NTRS)

    Fern, Lisa Carolynn

    2016-01-01

    This document examines the challenges inherent in designing and regulating to support human-automation interaction for new technologies that will be deployed into complex systems. A key question for new technologies is how work will be accomplished by the human and machine agents. This question has traditionally been framed as how functions should be allocated between humans and machines. Such framing misses the coordination and synchronization that is needed for the different human and machine roles in the system to accomplish their goals. Coordination and synchronization demands are driven by the underlying human-automation architecture of the new technology, which is typically not specified explicitly by the designers. The human-machine interface (HMI), which is intended to facilitate human-machine interaction and cooperation, however, typically is defined explicitly and therefore serves as a proxy for human-automation cooperation requirements with respect to technical standards for technologies. Unfortunately, mismatches between the HMI and the coordination and synchronization demands of the underlying human-automation architecture can lead to system breakdowns. A methodology is needed that both designers and regulators can utilize to evaluate the expected performance of a new technology given potential human-automation architectures. Three experiments were conducted to inform the minimum HMI requirements for a detect and avoid system for unmanned aircraft systems (UAS). The results of the experiments provided empirical input to specific minimum operational performance standards that UAS manufacturers will have to meet in order to operate UAS in the National Airspace System (NAS). These studies represent a success story for how to objectively and systematically evaluate prototype technologies as part of the process for developing regulatory requirements. They also provide an opportunity to reflect on the lessons learned from a recent research effort in order to improve the methodology for defining technology requirements for regulators in the future. The biggest shortcoming of the presented research program was the absence of the explicit definition, generation and analysis of potential human-automation architectures. Failure to execute this step in the research process resulted in less efficient evaluation of the candidate prototype technologies in addition to the complete absence of different approaches to human-automation cooperation. For example, all of the prototype technologies that were evaluated in the research program assumed a human-automation architecture that relied on serial processing from the automation to the human. While this type of human-automation architecture is typical across many different technologies and in many different domains, it ignores different architectures where humans and automation work in parallel. Defining potential human-automation architectures a priori also allows regulators to develop scenarios that will stress the performance boundaries of the technology during the evaluation phase. The importance of adding this step of generating and evaluating candidate human-automation architectures prior to formal empirical evaluation is discussed.

  9. A macro-directive mechanism that facilitates automatic updating and processing of the contents of Electronic Healthcare Records: an extension to the CEN architecture.

    PubMed

    Deftereos, S; Lambrinoudakis, C; Gritzalis, S; Georgonikou, D; Andriopoulos, P; Aessopos, A

    2003-03-01

    Facilitating data entry, eliminating redundant effort and providing decision support are some of the factors upon which the successful uptake of Electronic Healthcare Record (EHCR) technology is dependent. The European Standardization Committee (CEN), on the other hand, has proposed a standard EHCR architecture, which allows patient record contents to be highly diverse, customized to individual user needs; this makes their processing a challenging task and poses a demand for specially designed mechanisms. We describe the requirements for a macro-directive mechanism, pertaining to CEN-compatible EHCR software that can automate updating and processing of patient records, thus enhancing the functionality of the software. We have implemented the above-mentioned mechanism in an EHCR application that has been customized for use in the care process of patients suffering from beta-Thalassemia. The application is being used during the last two years in the Thalassemia units of four Greek hospitals, as part of their every day practice. We report on the experience we have acquired so far.

  10. A Fully Automated High-Throughput Zebrafish Behavioral Ototoxicity Assay.

    PubMed

    Todd, Douglas W; Philip, Rohit C; Niihori, Maki; Ringle, Ryan A; Coyle, Kelsey R; Zehri, Sobia F; Zabala, Leanne; Mudery, Jordan A; Francis, Ross H; Rodriguez, Jeffrey J; Jacob, Abraham

    2017-08-01

    Zebrafish animal models lend themselves to behavioral assays that can facilitate rapid screening of ototoxic, otoprotective, and otoregenerative drugs. Structurally similar to human inner ear hair cells, the mechanosensory hair cells on their lateral line allow the zebrafish to sense water flow and orient head-to-current in a behavior called rheotaxis. This rheotaxis behavior deteriorates in a dose-dependent manner with increased exposure to the ototoxin cisplatin, thereby establishing itself as an excellent biomarker for anatomic damage to lateral line hair cells. Building on work by our group and others, we have built a new, fully automated high-throughput behavioral assay system that uses automated image analysis techniques to quantify rheotaxis behavior. This novel system consists of a custom-designed swimming apparatus and an imaging system built around network-controlled Raspberry Pi microcomputers capturing infrared video. Automated analysis techniques detect individual zebrafish, compute their orientation, and quantify the rheotaxis behavior of a zebrafish test population, producing a powerful, high-throughput behavioral assay. Using our fully automated biological assay to test a standardized ototoxic dose of cisplatin against varying doses of compounds that protect or regenerate hair cells may facilitate rapid translation of candidate drugs into preclinical mammalian models of hearing loss.
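
    The published assay's code is not shown here; purely as an illustration of the orientation-quantification step it describes, the hedged Python sketch below estimates each detected fish's body-axis angle from image moments of a thresholded infrared frame and scores how many fish are aligned with the flow axis. The function names, minimum blob area, and 30-degree tolerance are assumptions, and the body axis alone cannot distinguish head from tail, which the real system would need additional cues (such as motion direction) to resolve.

```python
# Illustrative sketch only (not the published assay code): estimate the body-axis
# orientation of each fish blob in a thresholded infrared frame from image
# moments, then score how many fish are aligned with the flow axis.
import cv2
import numpy as np

def fish_axis_angles(binary_frame, min_area=50.0):
    """Return the major-axis angle (degrees) of each sufficiently large blob.

    binary_frame: single-channel uint8 image with fish as non-zero pixels.
    """
    contours, _ = cv2.findContours(binary_frame, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    angles = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] < min_area:
            continue  # skip small noise blobs
        # Major-axis orientation from second-order central moments (mod 180 deg)
        theta = 0.5 * np.arctan2(2.0 * m["mu11"], m["mu20"] - m["mu02"])
        angles.append(np.degrees(theta))
    return angles

def alignment_fraction(angles, flow_axis_deg=0.0, tolerance_deg=30.0):
    """Fraction of fish whose body axis lies within tolerance of the flow axis.

    The axis alone cannot distinguish head from tail; confirming head-to-current
    rheotaxis would require extra cues such as motion direction.
    """
    if not angles:
        return 0.0
    diffs = [abs((a - flow_axis_deg + 90.0) % 180.0 - 90.0) for a in angles]
    return sum(d <= tolerance_deg for d in diffs) / len(diffs)
```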

  11. Product-oriented Software Certification Process for Software Synthesis

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.

  12. Extension Master Gardener Intranet: Automating Administration, Motivating Volunteers, Increasing Efficiency, and Facilitating Impact Reporting

    ERIC Educational Resources Information Center

    Bradley, Lucy K.; Cook, Jonneen; Cook, Chris

    2011-01-01

    North Carolina State University has incorporated many aspects of volunteer program administration and reporting into an on-line solution that integrates impact reporting into daily program management. The Extension Master Gardener Intranet automates many of the administrative tasks associated with volunteer management, increasing efficiency, and…

  13. ADDING GLOBAL SOILS DATA TO THE AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT TOOL (AGWA)

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment Tool (AGWA) is a GIS-based hydrologic modeling tool that is available as an extension for ArcView 3.x from the USDA-ARS Southwest Watershed Research Center (www.tucson.ars.ag.gov/agwa). AGWA is designed to facilitate the assessment of...

  14. Turbidity-controlled sampling for suspended sediment load estimation

    Treesearch

    Jack Lewis

    2003-01-01

    Abstract - Automated data collection is essential to effectively measure suspended sediment loads in storm events, particularly in small basins. Continuous turbidity measurements can be used, along with discharge, in an automated system that makes real-time sampling decisions to facilitate sediment load estimation. The Turbidity Threshold Sampling method distributes...

  15. Project-focused activity and knowledge tracker: a unified data analysis, collaboration, and workflow tool for medicinal chemistry project teams.

    PubMed

    Brodney, Marian D; Brosius, Arthur D; Gregory, Tracy; Heck, Steven D; Klug-McLeod, Jacquelyn L; Poss, Christopher S

    2009-12-01

    Advances in the field of drug discovery have brought an explosion in the quantity of data available to medicinal chemists and other project team members. New strategies and systems are needed to help these scientists to efficiently gather, organize, analyze, annotate, and share data about potential new drug molecules of interest to their project teams. Herein we describe a suite of integrated services and end-user applications that facilitate these activities throughout the medicinal chemistry design cycle. The Automated Data Presentation (ADP) and Virtual Compound Profiler (VCP) processes automate the gathering, organization, and storage of real and virtual molecules, respectively, and associated data. The Project-Focused Activity and Knowledge Tracker (PFAKT) provides a unified data analysis and collaboration environment, enhancing decision-making, improving team communication, and increasing efficiency.

  16. Dotette: Programmable, high-precision, plug-and-play droplet pipetting.

    PubMed

    Fan, Jinzhen; Men, Yongfan; Hao Tseng, Kuo; Ding, Yi; Ding, Yunfeng; Villarreal, Fernando; Tan, Cheemeng; Li, Baoqing; Pan, Tingrui

    2018-05-01

    Manual micropipettes are the most heavily used liquid handling devices in biological and chemical laboratories; however, they suffer from low precision for volumes under 1 μl and inevitable human errors. For a manual device, the human errors introduced pose potential risks of failed experiments, inaccurate results, and financial costs. Meanwhile, low precision under 1 μl can cause severe quantification errors and high heterogeneity of outcomes, becoming a bottleneck of reaction miniaturization for quantitative research in biochemical labs. Here, we report Dotette, a programmable, plug-and-play microfluidic pipetting device based on nanoliter liquid printing. With automated control, protocols designed on computers can be directly downloaded into Dotette, enabling programmable operation processes. Utilizing continuous nanoliter droplet dispensing, the precision of the volume control has been successfully improved from the traditional 20%-50% to less than 5% in the range of 100 nl to 1000 nl. Such a highly automated, plug-and-play add-on to existing pipetting devices not only improves precise quantification in low-volume liquid handling and reduces chemical consumption but also facilitates and automates a variety of biochemical and biological operations.

  17. Yaxx: Yet another X-ray extractor

    NASA Astrophysics Data System (ADS)

    Aldcroft, Tom

    2013-06-01

    Yaxx is a Perl script that facilitates batch data processing using Perl open source software and commonly available software such as CIAO/Sherpa, S-lang, SAS, and FTOOLS. For Chandra and XMM analysis it includes automated spectral extraction, fitting, and report generation. Yaxx can be run without climbing an extensive learning curve; even so, yaxx is highly configurable and can be customized to support complex analysis. yaxx uses template files and takes full advantage of the unique Sherpa / S-lang environment to make much of the processing user configurable. Although originally developed with an emphasis on X-ray data analysis, yaxx evolved to be a general-purpose pipeline scripting package.

  18. Automation of the CFD Process on Distributed Computing Systems

    NASA Technical Reports Server (NTRS)

    Tejnil, Ed; Gee, Ken; Rizk, Yehia M.

    2000-01-01

    A script system was developed to automate and streamline portions of the CFD process. The system was designed to facilitate the use of CFD flow solvers on supercomputer and workstation platforms within a parametric design event. Integrating solver pre- and postprocessing phases, the fully automated ADTT script system marshalled the required input data, submitted the jobs to available computational resources, and processed the resulting output data. A number of codes were incorporated into the script system, which itself was part of a larger integrated design environment software package. The IDE and scripts were used in a design event involving a wind tunnel test. This experience highlighted the need for efficient data and resource management in all parts of the CFD process. To facilitate the use of CFD methods to perform parametric design studies, the script system was developed using UNIX shell and Perl languages. The goal of the work was to minimize the user interaction required to generate the data necessary to fill a parametric design space. The scripts wrote out the required input files for the user-specified flow solver, transferred all necessary input files to the computational resource, submitted and tracked the jobs using the resource queuing structure, and retrieved and post-processed the resulting dataset. For computational resources that did not run queueing software, the script system established its own simple first-in-first-out queueing structure to manage the workload. A variety of flow solvers were incorporated in the script system, including INS2D, PMARC, TIGER and GASP. Adapting the script system to a new flow solver was made easier through the use of object-oriented programming methods. The script system was incorporated into an ADTT integrated design environment and evaluated as part of a wind tunnel experiment. The system successfully generated the data required to fill the desired parametric design space. This stressed the computational resources required to compute and store the information. The scripts were continually modified to improve the utilization of the computational resources and reduce the likelihood of data loss due to failures. An ad-hoc file server was created to manage the large amount of data being generated as part of the design event. Files were stored and retrieved as needed to create new jobs and analyze the results. Additional information is contained in the original.
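
    The ADTT scripts themselves were written in UNIX shell and Perl and are not reproduced here; the sketch below is only a hypothetical Python illustration of the simple first-in-first-out queueing idea described above for hosts without batch-queueing software. The solver command and case directory names are placeholders, not the actual flow-solver invocations.

```python
# Hypothetical sketch of a first-in, first-out queue that runs one solver job
# at a time on a host with no batch queueing software. Commands and directory
# names are placeholders; the case directories and run script must already exist.
import subprocess
from collections import deque

class FifoJobQueue:
    def __init__(self):
        self._jobs = deque()

    def submit(self, workdir, command):
        """Register a job; jobs run strictly in submission order."""
        self._jobs.append((workdir, command))

    def run_all(self, logname="solver.log"):
        """Drain the queue, logging each job's combined output in its workdir."""
        while self._jobs:
            workdir, command = self._jobs.popleft()
            with open(f"{workdir}/{logname}", "w") as log:
                subprocess.run(command, cwd=workdir,
                               stdout=log, stderr=subprocess.STDOUT, check=False)

# Example: two cases of a parametric sweep, run back to back.
queue = FifoJobQueue()
queue.submit("case_alpha_02", ["./run_solver.sh"])
queue.submit("case_alpha_04", ["./run_solver.sh"])
queue.run_all()
```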

  19. SEM AutoAnalysis: enhancing photomask and NIL defect disposition and review

    NASA Astrophysics Data System (ADS)

    Schulz, Kristian; Egodage, Kokila; Tabbone, Gilles; Ehrlich, Christian; Garetto, Anthony

    2017-06-01

    For defect disposition and repair verification regarding printability, AIMS™ is the state-of-the-art measurement tool in industry. With its unique capability of capturing aerial images of photomasks, it is the one method that comes closest to emulating the printing behaviour of a scanner. However, for nanoimprint lithography (NIL) templates, aerial images cannot be applied to evaluate the success of a repair process. Hence, for NIL defect dispositioning, scanning electron microscopy (SEM) imaging is the method of choice. In addition, it has been a standard imaging method for further root cause analysis of defects and defect review on optical photomasks, which enables 2D or even 3D mask profiling at high resolutions. In recent years a trend observed in mask shops has been the automation of processes that traditionally were driven by operators. This, of course, has brought many advantages, one of which is freeing cost-intensive labour from repetitive and tedious work. Furthermore, it reduces variability in processes due to different operator skill and experience levels, which ultimately contributes to eliminating the human factor. Taking these factors into consideration, one of the software-based solutions available under the FAVOR® brand to support customer needs is the aerial image evaluation software, AIMS™ AutoAnalysis (AAA). It provides fully automated analysis of AIMS™ images and runs in parallel to measurements. This is enabled by its direct connection and communication with the AIMS™ tools. As one of many positive outcomes, generating automated result reports is facilitated, standardizing the mask manufacturing workflow. Today, AAA has been successfully introduced into production at multiple customers and is supporting the workflow as described above. These trends indeed have triggered the demand for similar automation with respect to SEM measurements, leading to the development of SEM AutoAnalysis (SAA). It aims towards a fully automated SEM image evaluation process utilizing a completely different algorithm due to the different nature of SEM images and aerial images. Both AAA and SAA are building blocks towards an image evaluation suite in the mask shop industry.

  20. Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique.

    PubMed

    Intarapanich, Apichart; Kaewkamnerd, Saowaluck; Shaw, Philip J; Ukosakit, Kittipat; Tragoonrung, Somvong; Tongsima, Sissades

    2015-01-01

    DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing, including lane distortion, band deformity, a high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. This work presents an automated genotyping tool from DNA gel electrophoresis images, called GELect, which was written in Java and made available through the ImageJ framework. With a novel automated image processing workflow, the tool can accurately segment lanes from a gel matrix and intelligently extract distorted and even doublet bands that are difficult to identify by existing image processing tools. Consequently, genotyping from DNA gel electrophoresis can be performed automatically, allowing users to efficiently conduct large-scale DNA fingerprinting via DNA gel electrophoresis. The software is freely available from http://www.biotec.or.th/gi/tools/gelect.
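
    GELect itself is a Java/ImageJ tool and its code is not reproduced here; the following hedged Python sketch illustrates only the first workflow step, locating lane centres by projecting image intensity onto the horizontal axis and finding peaks. The smoothing width, prominence, and minimum lane spacing are illustrative assumptions.

```python
# Not GELect itself: a minimal sketch of lane segmentation via a column-wise
# intensity projection and peak detection.
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.signal import find_peaks

def lane_centres(gel_image, smooth_sigma=5, min_lane_spacing=20):
    """Return approximate x-coordinates of lane centres in a grayscale gel image.

    Assumes bright bands on a dark background; invert the image if needed.
    """
    profile = gel_image.astype(float).sum(axis=0)       # column-wise intensity
    profile = gaussian_filter1d(profile, smooth_sigma)   # suppress band-level noise
    peaks, _ = find_peaks(profile,
                          distance=min_lane_spacing,
                          prominence=0.05 * profile.max())
    return peaks

# Example with a synthetic two-lane image
img = np.zeros((100, 200))
img[:, 45:55] = 1.0    # lane 1
img[:, 140:150] = 1.0  # lane 2
print(lane_centres(img))
```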

  1. Improved detection of soma location and morphology in fluorescence microscopy images of neurons.

    PubMed

    Kayasandik, Cihan Bilge; Labate, Demetrio

    2016-12-01

    Automated detection and segmentation of somas in fluorescent images of neurons is a major goal in quantitative studies of neuronal networks, including applications of high-content-screenings where it is required to quantify multiple morphological properties of neurons. Despite recent advances in image processing targeted to neurobiological applications, existing algorithms of soma detection are often unreliable, especially when processing fluorescence image stacks of neuronal cultures. In this paper, we introduce an innovative algorithm for the detection and extraction of somas in fluorescent images of networks of cultured neurons where somas and other structures exist in the same fluorescent channel. Our method relies on a new geometrical descriptor called Directional Ratio and a collection of multiscale orientable filters to quantify the level of local isotropy in an image. To optimize the application of this approach, we introduce a new construction of multiscale anisotropic filters that is implemented by separable convolution. Extensive numerical experiments using 2D and 3D confocal images show that our automated algorithm reliably detects somas, accurately segments them, and separates contiguous ones. We include a detailed comparison with state-of-the-art existing methods to demonstrate that our algorithm is extremely competitive in terms of accuracy, reliability and computational efficiency. Our algorithm will facilitate the development of automated platforms for high content neuron image processing. A Matlab code is released open-source and freely available to the scientific community. Copyright © 2016 Elsevier B.V. All rights reserved.
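
    The Directional Ratio descriptor and the authors' filter bank are not reproduced here; under the stated assumptions, the sketch below only illustrates the separable-convolution idea the paper exploits: a 2D kernel that factors into an outer product of two 1D kernels can be applied as two inexpensive 1D passes, shown here with an elongated, axis-aligned Gaussian standing in for one anisotropic filter orientation.

```python
# Sketch of the separable-convolution idea only, not the authors' filter bank.
import numpy as np
from scipy.ndimage import convolve1d

def gaussian_1d(sigma, radius):
    """Normalized 1D Gaussian kernel of the given half-width."""
    x = np.arange(-radius, radius + 1, dtype=float)
    g = np.exp(-0.5 * (x / sigma) ** 2)
    return g / g.sum()

def separable_anisotropic_smooth(image, sigma_long=6.0, sigma_short=1.5):
    """Smooth strongly along rows and weakly along columns via two 1D passes."""
    k_long = gaussian_1d(sigma_long, radius=int(3 * sigma_long))
    k_short = gaussian_1d(sigma_short, radius=int(3 * sigma_short))
    out = convolve1d(image.astype(float), k_long, axis=1)   # along rows
    out = convolve1d(out, k_short, axis=0)                  # along columns
    return out

# A crude isotropy cue could compare the responses of such filters at several
# orientations (e.g. by rotating the image); the published Directional Ratio
# descriptor is more sophisticated than this illustration.
```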

  2. Standardization of skin cleansing in vivo: part I. Development of an Automated Cleansing Device (ACiD).

    PubMed

    Sonsmann, F K; Strunk, M; Gediga, K; John, C; Schliemann, S; Seyfarth, F; Elsner, P; Diepgen, T L; Kutz, G; John, S M

    2014-05-01

    To date, there are no legally binding requirements concerning product testing in cosmetics. This leads to various manufacturer-specific test methods and absent transparent information on skin cleansing products. A standardized in vivo test procedure for assessment of cleansing efficacy and corresponding barrier impairment by the cleaning process is needed, especially in the occupational context where repeated hand washing procedures may be performed at short intervals. For the standardization of the cleansing procedure, an Automated Cleansing Device (ACiD) was designed and evaluated. Different smooth washing surfaces of the equipment for ACiD (incl. goat hair, felt, felt covered with nitrile caps) were evaluated regarding their skin compatibility. ACiD allows an automated, fully standardized skin washing procedure. Felt covered with nitrile as washing surface of the rotating washing units leads to a homogenous cleansing result and does not cause detectable skin irritation, neither clinically nor as assessed by skin bioengineering methods (transepidermal water loss, chromametry). Automated Cleansing Device may be useful for standardized evaluation of the cleansing effectiveness and parallel assessment of the corresponding irritancy potential of industrial skin cleansers. This will allow objectifying efficacy and safety of industrial skin cleansers, thus enabling market transparency and facilitating rational choice of products. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  3. Espina: A Tool for the Automated Segmentation and Counting of Synapses in Large Stacks of Electron Microscopy Images

    PubMed Central

    Morales, Juan; Alonso-Nanclares, Lidia; Rodríguez, José-Rodrigo; DeFelipe, Javier; Rodríguez, Ángel; Merchán-Pérez, Ángel

    2011-01-01

    The synapses in the cerebral cortex can be classified into two main types, Gray's type I and type II, which correspond to asymmetric (mostly glutamatergic excitatory) and symmetric (inhibitory GABAergic) synapses, respectively. Hence, the quantification and identification of their different types and the proportions in which they are found are extraordinarily important in terms of brain function. The ideal approach to calculate the number of synapses per unit volume is to analyze 3D samples reconstructed from serial sections. However, obtaining serial sections by transmission electron microscopy is an extremely time consuming and technically demanding task. Using focused ion beam/scanning electron microscopy, we recently showed that virtually all synapses can be accurately identified as asymmetric or symmetric synapses when they are visualized, reconstructed, and quantified from large 3D tissue samples obtained in an automated manner. Nevertheless, the analysis, segmentation, and quantification of synapses is still a labor intensive procedure. Thus, novel solutions are currently necessary to deal with the large volume of data that is being generated by automated 3D electron microscopy. Accordingly, we have developed ESPINA, a software tool that performs the automated segmentation and counting of synapses in a reconstructed 3D volume of the cerebral cortex, and that greatly facilitates and accelerates these processes. PMID:21633491

  4. Bioreactor design for successive culture of anchorage-dependent cells operated in an automated manner.

    PubMed

    Kino-Oka, Masahiro; Ogawa, Natsuki; Umegaki, Ryota; Taya, Masahito

    2005-01-01

    A novel bioreactor system was designed to perform a series of batchwise cultures of anchorage-dependent cells by means of automated operations of medium change and passage for cell transfer. The experimental data on contamination frequency ensured the biological cleanliness of the bioreactor system, which facilitated the operations in a closed environment, as compared with a flask culture system with manual handling. In addition, tools for growth prediction (based on growth kinetics) and real-time growth monitoring by measurement of medium components (based on small-volume analyzing machinery) were installed into the bioreactor system to schedule the operations of medium change and passage and to confirm that the culture proceeds as scheduled, respectively. The successive culture of anchorage-dependent cells was conducted with the bioreactor running in an automated way. The automated bioreactor gave a successful culture performance in good accordance with the schedule preset from the information in the latest subculture, realizing 79-fold cell expansion over 169 h. In addition, the correlation factor between the experimental data and the scheduled values throughout the bioreactor run was 0.998. It was concluded that the proposed bioreactor, with the integration of the prediction and monitoring tools, could offer a feasible system for the manufacturing process of cultured tissue products.
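
    As a hedged illustration of how growth-kinetics-based scheduling can work (not the authors' actual model), the sketch below assumes simple exponential growth and back-calculates a specific growth rate from the reported 79-fold expansion in 169 h, then predicts the time needed to reach an arbitrary target expansion.

```python
# Illustrative sketch, not the authors' kinetic model: with exponential growth
# N(t) = N0 * exp(mu * t), a rate estimated from one subculture can be used to
# schedule the next medium change or passage. The rate below is back-calculated
# from the reported 79-fold expansion in 169 h purely as an example.
import math

def growth_rate_from_fold(fold_expansion, hours):
    """Specific growth rate mu (1/h) implied by a fold expansion over a time span."""
    return math.log(fold_expansion) / hours

def hours_to_reach(target_fold, mu):
    """Predicted time (h) to reach a target fold expansion at growth rate mu."""
    return math.log(target_fold) / mu

mu = growth_rate_from_fold(79, 169)   # roughly 0.026 per hour
print(f"mu = {mu:.4f} /h")
print(f"predicted time to 4-fold expansion: {hours_to_reach(4, mu):.1f} h")
```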

  5. Discrimination between smiling faces: Human observers vs. automated face analysis.

    PubMed

    Del Líbano, Mario; Calvo, Manuel G; Fernández-Martín, Andrés; Recio, Guillermo

    2018-05-11

    This study investigated (a) how prototypical happy faces (with happy eyes and a smile) can be discriminated from blended expressions with a smile but non-happy eyes, depending on type and intensity of the eye expression; and (b) how smile discrimination differs for human perceivers versus automated face analysis, depending on affective valence and morphological facial features. Human observers categorized faces as happy or non-happy, or rated their valence. Automated analysis (FACET software) computed seven expressions (including joy/happiness) and 20 facial action units (AUs). Physical properties (low-level image statistics and visual saliency) of the face stimuli were controlled. Results revealed, first, that some blended expressions (especially, with angry eyes) had lower discrimination thresholds (i.e., they were identified as "non-happy" at lower non-happy eye intensities) than others (especially, with neutral eyes). Second, discrimination sensitivity was better for human perceivers than for automated FACET analysis. As an additional finding, affective valence predicted human discrimination performance, whereas morphological AUs predicted FACET discrimination. FACET can be a valid tool for categorizing prototypical expressions, but is currently more limited than human observers for discrimination of blended expressions. Configural processing facilitates detection of in/congruence(s) across regions, and thus detection of non-genuine smiling faces (due to non-happy eyes). Copyright © 2018 Elsevier B.V. All rights reserved.

  6. Enhancing Ear and Hearing Health Access for Children With Technology and Connectivity.

    PubMed

    Swanepoel, De Wet

    2017-10-12

    Technology and connectivity advances are demonstrating increasing potential to improve access to service delivery for persons with hearing loss. This article demonstrates use cases from community-based hearing screening and automated diagnosis of ear disease. This brief report reviews recent evidence for school- and home-based hearing testing in underserved communities using smartphone technologies paired with calibrated headphones. Another area of potential impact facilitated by technology and connectivity is the use of feature extraction algorithms to facilitate automated diagnosis of the most common ear conditions from video-otoscopic images. Smartphone hearing screening using calibrated headphones demonstrated equivalent sensitivity and specificity for school-based hearing screening. Automating test sequences with a forced-choice response paradigm allowed persons with minimal training to offer screening in underserved communities. The automated image analysis and diagnosis system for ear disease demonstrated an overall accuracy of 80.6%, which matches or exceeds accuracy rates previously reported for general practitioners and pediatricians. The emergence of these tools that capitalize on technology and connectivity advances enables affordable and accessible models of service delivery for community-based ear and hearing care.

  7. Automating data analysis for two-dimensional gas chromatography/time-of-flight mass spectrometry non-targeted analysis of comparative samples.

    PubMed

    Titaley, Ivan A; Ogba, O Maduka; Chibwe, Leah; Hoh, Eunha; Cheong, Paul H-Y; Simonich, Staci L Massey

    2018-03-16

    Non-targeted analysis of environmental samples, using comprehensive two-dimensional gas chromatography coupled with time-of-flight mass spectrometry (GC × GC/ToF-MS), poses significant data analysis challenges due to the large number of possible analytes. Non-targeted data analysis of complex mixtures is prone to human bias and is laborious, particularly for comparative environmental samples such as contaminated soil pre- and post-bioremediation. To address this research bottleneck, we developed OCTpy, a Python™ script that acts as a data reduction filter to automate GC × GC/ToF-MS data analysis from LECO® ChromaTOF® software and facilitates selection of analytes of interest based on peak area comparison between comparative samples. We used data from polycyclic aromatic hydrocarbon (PAH) contaminated soil, pre- and post-bioremediation, to assess the effectiveness of OCTpy in facilitating the selection of analytes that have formed or degraded following treatment. Using datasets from the soil extracts pre- and post-bioremediation, OCTpy selected, on average, 18% of the initial suggested analytes generated by the LECO® ChromaTOF® software Statistical Compare feature. Based on this list, 63-100% of the candidate analytes identified by a highly trained individual were also selected by OCTpy. This process was accomplished in several minutes per sample, whereas manual data analysis took several hours per sample. OCTpy automates the analysis of complex mixtures of comparative samples, reduces the potential for human error during heavy data handling and decreases data analysis time at least tenfold. Copyright © 2018 Elsevier B.V. All rights reserved.
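
    OCTpy's source is available from the authors and is not reproduced here; the hedged sketch below only illustrates the core filtering idea described above, flagging analytes whose peak areas change by more than a chosen fold-change between comparative samples. The dictionary layout, the 5-fold threshold, and the analyte names are assumptions for illustration.

```python
# Hedged sketch of the core filtering idea (not OCTpy's actual code): flag
# analytes whose peak areas change by more than a chosen fold-change between
# comparative samples (e.g. pre- vs post-bioremediation).
def changed_analytes(pre_areas, post_areas, fold_change=5.0, floor=1.0):
    """Return analytes that appear to have formed or degraded after treatment.

    pre_areas / post_areas: dicts mapping analyte identifier -> peak area.
    Missing or zero areas are raised to `floor` to avoid division by zero.
    """
    formed, degraded = [], []
    for analyte in set(pre_areas) | set(post_areas):
        pre = max(pre_areas.get(analyte, 0.0), floor)
        post = max(post_areas.get(analyte, 0.0), floor)
        ratio = post / pre
        if ratio >= fold_change:
            formed.append(analyte)
        elif ratio <= 1.0 / fold_change:
            degraded.append(analyte)
    return formed, degraded

# Hypothetical peak areas for two analytes before and after treatment.
pre = {"PAH_178": 1.2e6, "oxy_PAH_208": 2.0e3}
post = {"PAH_178": 9.0e4, "oxy_PAH_208": 6.5e4}
print(changed_analytes(pre, post))  # oxy-PAH flagged as formed, PAH as degraded
```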

  8. Thinking Together: Modeling Clinical Decision-Support as a Sociotechnical System

    PubMed Central

    Hussain, Mustafa I.; Reynolds, Tera L.; Mousavi, Fatemeh E.; Chen, Yunan; Zheng, Kai

    2017-01-01

    Computerized clinical decision-support systems are members of larger sociotechnical systems, composed of human and automated actors, who send, receive, and manipulate artifacts. Sociotechnical consideration is rare in the literature. This makes it difficult to comparatively evaluate the success of CDS implementations, and it may also indicate that sociotechnical context receives inadequate consideration in practice. To facilitate sociotechnical consideration, we developed the Thinking Together model, a flexible diagrammatical means of representing CDS systems as sociotechnical systems. To develop this model, we examined the literature with the lens of Distributed Cognition (DCog) theory. We then present two case studies of vastly different CDSSs, one almost fully automated and the other with minimal automation, to illustrate the flexibility of the Thinking Together model. We show that this model, informed by DCog and the CDS literature, is capable of supporting both research, by enabling comparative evaluation, and practice, by facilitating explicit sociotechnical planning and communication. PMID:29854164

  9. End-to-end workflow for finite element analysis of tumor treating fields in glioblastomas

    NASA Astrophysics Data System (ADS)

    Timmons, Joshua J.; Lok, Edwin; San, Pyay; Bui, Kevin; Wong, Eric T.

    2017-11-01

    Tumor Treating Fields (TTFields) therapy is an approved modality of treatment for glioblastoma. Patient anatomy-based finite element analysis (FEA) has the potential to reveal not only how these fields affect tumor control but also how to improve efficacy. While automated tools for segmentation speed up the generation of FEA models, multi-step manual corrections are required, including removal of disconnected voxels, incorporation of unsegmented structures and the addition of 36 electrodes plus gel layers matching the TTFields transducers. Existing approaches are also not scalable for the high-throughput analysis of large patient volumes. A semi-automated workflow was developed to prepare FEA models for TTFields mapping in the human brain. Magnetic resonance imaging (MRI) pre-processing, segmentation, electrode and gel placement, and post-processing were all automated. The material properties of each tissue were applied to their corresponding mask in silico using COMSOL Multiphysics (COMSOL, Burlington, MA, USA). The fidelity of the segmentations with and without post-processing was compared against the full semi-automated segmentation workflow approach using Dice coefficient analysis. The average relative differences for the electric fields generated by COMSOL were calculated, in addition to observed differences in electric field-volume histograms. Furthermore, the mesh file formats in MPHTXT and NASTRAN were also compared using the differences in the electric field-volume histogram. The Dice coefficient was lower for auto-segmentation without post-processing than for auto-segmentation with post-processing, indicating convergence on a manually corrected model. A marginal but non-zero relative difference was identified between the electric field maps from models with and without manual correction, and a clear advantage of using the NASTRAN mesh file format was found. The software and workflow outlined in this article may be used to accelerate the investigation of TTFields in glioblastoma patients by facilitating the creation of FEA models derived from patient MRI datasets.
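
    As a minimal illustration of the Dice coefficient analysis mentioned above (not the authors' workflow code), the sketch below compares two binary segmentation masks; the array shapes and offsets are arbitrary examples.

```python
# Minimal sketch of the Dice similarity coefficient used to compare two
# segmentations (e.g. automated masks with and without post-processing against
# a manually corrected model).
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity between two boolean masks: 2|A and B| / (|A| + |B|)."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom

auto = np.zeros((64, 64), dtype=bool);   auto[10:40, 10:40] = True
manual = np.zeros((64, 64), dtype=bool); manual[12:42, 12:42] = True
print(round(dice_coefficient(auto, manual), 3))  # ~0.871 for this toy overlap
```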

  10. Assessment for Operator Confidence in Automated Space Situational Awareness and Satellite Control Systems

    NASA Astrophysics Data System (ADS)

    Gorman, J.; Voshell, M.; Sliva, A.

    2016-09-01

    The United States is highly dependent on space resources to support military, government, commercial, and research activities. Satellites operate at great distances, observation capacity is limited, and operator actions and observations can be significantly delayed. Safe operations require support systems that provide situational understanding, enhance decision making, and facilitate collaboration between human operators and system automation both in-the-loop and on-the-loop. Joint cognitive systems engineering (JCSE) provides a rich set of methods for analyzing and informing the design of complex systems that include both human decision-makers and autonomous elements as coordinating teammates. While JCSE-based systems can enhance a system analyst's understanding of both existing and new system processes, JCSE activities typically occur outside of traditional systems engineering (SE) methods, providing sparse guidance about how systems should be implemented. In contrast, the Joint Directors of Laboratories (JDL) information fusion model and extensions, such as the Dual Node Network (DNN) technical architecture, provide the means to divide and conquer such engineering and implementation complexity, but are loosely coupled to specialized organizational contexts and needs. We previously described how Dual Node Decision Wheels (DNDW) extend the DNN to integrate JCSE analysis and design with the practicalities of system engineering and implementation using the DNN. Insights from Rasmussen's JCSE Decision Ladders align system implementation with organizational structures and processes. In the current work, we present a novel approach to assessing system performance based on patterns occurring in operational decisions that are documented by JCSE processes as traces in a decision ladder. In this way, system assessment is closely tied not just to system design, but to the design of the joint cognitive system that includes human operators, decision-makers, information systems, and automated processes. Such operationally relevant and integrated testing provides a sound foundation for the operator trust in system automation that is required to safely operate satellite systems.

  11. Automated identification of patients with pulmonary nodules in an integrated health system using administrative health plan data, radiology reports, and natural language processing.

    PubMed

    Danforth, Kim N; Early, Megan I; Ngan, Sharon; Kosco, Anne E; Zheng, Chengyi; Gould, Michael K

    2012-08-01

    Lung nodules are commonly encountered in clinical practice, yet little is known about their management in community settings. An automated method for identifying patients with lung nodules would greatly facilitate research in this area. Using members of a large, community-based health plan from 2006 to 2010, we developed a method to identify patients with lung nodules, by combining five diagnostic codes, four procedural codes, and a natural language processing algorithm that performed free text searches of radiology transcripts. An experienced pulmonologist reviewed a random sample of 116 radiology transcripts, providing a reference standard for the natural language processing algorithm. With the use of an automated method, we identified 7112 unique members as having one or more incident lung nodules. The mean age of the patients was 65 years (standard deviation 14 years). There were slightly more women (54%) than men, and Hispanics and non-whites comprised 45% of the lung nodule cohort. Thirty-six percent were never smokers whereas 11% were current smokers. Fourteen percent of the patients were subsequently diagnosed with lung cancer. The sensitivity and specificity of the natural language processing algorithm for identifying the presence of lung nodules were 96% and 86%, respectively, compared with clinician review. Among the true positive transcripts in the validation sample, only 35% were solitary and unaccompanied by one or more associated findings, and 56% measured 8 to 30 mm in diameter. A combination of diagnostic codes, procedural codes, and a natural language processing algorithm for free text searching of radiology reports can accurately and efficiently identify patients with incident lung nodules, many of whom are subsequently diagnosed with lung cancer.
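
    The study's algorithm and code are not shown here; the hedged sketch below pairs a deliberately crude keyword rule, standing in for the free-text search component, with the sensitivity and specificity calculation used to validate such a flag against clinician review. The keyword pattern and example transcripts are invented for illustration.

```python
# Hedged sketch (not the study's algorithm): a toy keyword rule that flags a
# radiology transcript as mentioning a pulmonary nodule, plus the sensitivity/
# specificity calculation against a clinician reference standard.
import re

NODULE_PATTERN = re.compile(r"\b(nodule|nodular density|pulmonary mass)\b", re.I)

def flags_nodule(transcript: str) -> bool:
    """Crude stand-in for the NLP algorithm: keyword match on free text."""
    return bool(NODULE_PATTERN.search(transcript))

def sensitivity_specificity(predictions, reference):
    """Compare boolean predictions with a clinician reference standard."""
    tp = sum(p and r for p, r in zip(predictions, reference))
    tn = sum((not p) and (not r) for p, r in zip(predictions, reference))
    fp = sum(p and (not r) for p, r in zip(predictions, reference))
    fn = sum((not p) and r for p, r in zip(predictions, reference))
    return tp / (tp + fn), tn / (tn + fp)

transcripts = [
    "A 6 mm nodule is seen in the right upper lobe.",
    "Lungs are clear without focal consolidation.",
]
reference = [True, False]          # clinician review of the same transcripts
preds = [flags_nodule(t) for t in transcripts]
print(sensitivity_specificity(preds, reference))
```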

  12. Automating Quality Measures for Heart Failure Using Natural Language Processing: A Descriptive Study in the Department of Veterans Affairs

    PubMed Central

    Kim, Youngjun; Gobbel, Glenn Temple; Matheny, Michael E; Redd, Andrew; Bray, Bruce E; Heidenreich, Paul; Bolton, Dan; Heavirland, Julia; Kelly, Natalie; Reeves, Ruth; Kalsy, Megha; Goldstein, Mary Kane; Meystre, Stephane M

    2018-01-01

    Background: We developed an accurate, stakeholder-informed, automated, natural language processing (NLP) system to measure the quality of heart failure (HF) inpatient care, and explored the potential for adoption of this system within an integrated health care system. Objective: To accurately automate a United States Department of Veterans Affairs (VA) quality measure for inpatients with HF. Methods: We automated the HF quality measure Congestive Heart Failure Inpatient Measure 19 (CHI19) that identifies whether a given patient has left ventricular ejection fraction (LVEF) <40%, and if so, whether an angiotensin-converting enzyme inhibitor or angiotensin-receptor blocker was prescribed at discharge if there were no contraindications. We used documents from 1083 unique inpatients from eight VA medical centers to develop a reference standard (RS) to train (n=314) and test (n=769) the Congestive Heart Failure Information Extraction Framework (CHIEF). We also conducted semi-structured interviews (n=15) for stakeholder feedback on implementation of the CHIEF. Results: The CHIEF classified each hospitalization in the test set with a sensitivity (SN) of 98.9% and positive predictive value of 98.7%, compared with an RS and SN of 98.5% for available External Peer Review Program assessments. Of the 1083 patients available for the NLP system, the CHIEF evaluated and classified 100% of cases. Stakeholders identified potential implementation facilitators and clinical uses of the CHIEF. Conclusions: The CHIEF provided complete data for all patients in the cohort and could potentially improve the efficiency, timeliness, and utility of HF quality measurements. PMID:29335238

  13. Validation of an automated colony counting system for group A Streptococcus.

    PubMed

    Frost, H R; Tsoi, S K; Baker, C A; Laho, D; Sanderson-Smith, M L; Steer, A C; Smeesters, P R

    2016-02-08

    The practice of counting bacterial colony forming units on agar plates has long been used as a method to estimate the concentration of live bacteria in culture. However, due to the laborious and potentially error prone nature of this measurement technique, an alternative method is desirable. Recent technologic advancements have facilitated the development of automated colony counting systems, which reduce errors introduced during the manual counting process and recording of information. An additional benefit is the significant reduction in time taken to analyse colony counting data. Whilst automated counting procedures have been validated for a number of microorganisms, the process has not been successful for all bacteria due to the requirement for a relatively high contrast between bacterial colonies and growth medium. The purpose of this study was to validate an automated counting system for use with group A Streptococcus (GAS). Twenty-one different GAS strains, representative of major emm-types, were selected for assessment. In order to introduce the required contrast for automated counting, 2,3,5-triphenyl-2H-tetrazolium chloride (TTC) dye was added to Todd-Hewitt broth with yeast extract (THY) agar. Growth on THY agar with TTC was compared with growth on blood agar and THY agar to ensure the dye was not detrimental to bacterial growth. Automated colony counts using a ProtoCOL 3 instrument were compared with manual counting to confirm accuracy over the stages of the growth cycle (latent, mid-log and stationary phases) and in a number of different assays. The average percentage differences between plating and counting methods were analysed using the Bland-Altman method. A percentage difference of ±10 % was determined as the cut-off for a critical difference between plating and counting methods. All strains measured had an average difference of less than 10 % when plated on THY agar with TTC. This consistency was also observed over all phases of the growth cycle and when plated in blood following bactericidal assays. Agreement between these methods suggest the use of an automated colony counting technique for GAS will significantly reduce time spent counting bacteria to enable a more efficient and accurate measurement of bacteria concentration in culture.
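
    As an illustration of the agreement criterion described above (not the original analysis scripts), the sketch below computes the average percentage difference between manual and automated counts for one strain and checks it against the ±10 % cut-off; the example counts are invented.

```python
# Illustrative sketch of the agreement check: average percentage difference
# between manual and automated colony counts, judged against the +/-10% criterion.
import numpy as np

def mean_percentage_difference(manual_counts, automated_counts):
    """Mean of per-plate percentage differences, relative to the pair means."""
    manual = np.asarray(manual_counts, dtype=float)
    auto = np.asarray(automated_counts, dtype=float)
    pair_means = (manual + auto) / 2.0
    pct_diff = 100.0 * (auto - manual) / pair_means
    return pct_diff.mean()

manual = [152, 98, 210, 64]       # hypothetical manual counts for one strain
automated = [149, 101, 205, 66]   # corresponding automated counts
diff = mean_percentage_difference(manual, automated)
verdict = "within" if abs(diff) <= 10 else "outside"
print(f"mean % difference = {diff:+.1f}% ({verdict} +/-10%)")
```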

  14. [Standardization and modeling of surgical processes].

    PubMed

    Strauss, G; Schmitz, P

    2016-12-01

    Due to the technological developments around the operating room, surgery in the twenty-first century is undergoing a paradigm shift. Which technologies have already been integrated into the surgical routine? How can a favorable cost-benefit balance be achieved by the implementation of new software-based assistance systems? This article presents the state-of-the-art technology as exemplified by a semi-automated operation system for otorhinolaryngology surgery. The main focus is on systems for the implementation of digital handbooks and navigational functions in situ. On the basis of continuous development in digital imaging, decisions may be facilitated by individual patient models, thus allowing procedures to be optimized. The ongoing digitization and linking of all relevant information enable a high level of standardization in terms of operating procedures. This may be used by assistance systems as a basis for complete documentation and high process reliability. Automation of processes in the operating room results in an increase in quality, precision and standardization, so that the effectiveness and efficiency of treatment can be improved; however, care must be taken that detrimental consequences, such as loss of skills and placing too much faith in technology, are avoided through adapted training concepts.

  15. Cockpit Automation Technology CSERIAC-CAT

    DTIC Science & Technology

    1991-06-01

    AD-A273 124; AL-TR-1991-0078. Cockpit Automation Technology (CSERIAC-CAT), July 1989 - December 1990: Final Report. Trudy S. Abrams, Cindy D. Martin. ... Boeing-developed CAT software tools, and for facilitating their use by the cockpit design community. A brief description of the overall task is given

  16. Accessibility, usability, and usefulness of a Web-based clinical decision support tool to enhance provider-patient communication around Self-management TO Prevent (STOP) Stroke.

    PubMed

    Anderson, Jane A; Godwin, Kyler M; Saleem, Jason J; Russell, Scott; Robinson, Joshua J; Kimmel, Barbara

    2014-12-01

    This article reports redesign strategies identified to create a Web-based user-interface for the Self-management TO Prevent (STOP) Stroke Tool. Members of a Stroke Quality Improvement Network (N = 12) viewed a visualization video of a proposed prototype and provided feedback on implementation barriers/facilitators. Stroke-care providers (N = 10) tested the Web-based prototype in think-aloud sessions of simulated clinic visits. Participants' dialogues were coded into themes. Access to comprehensive information and the automated features/systematized processes were the primary accessibility and usability facilitator themes. The need for training, time to complete the tool, and computer-centric care were identified as possible usability barriers. Patient accountability, reminders for best practice, goal-focused care, and communication/counseling themes indicate that the STOP Stroke Tool supports the paradigm of patient-centered care. The STOP Stroke Tool was found to prompt clinicians on secondary stroke-prevention clinical-practice guidelines, facilitate comprehensive documentation of evidence-based care, and support clinicians in providing patient-centered care through the shared decision-making process that occurred while using the action-planning/goal-setting feature of the tool. © The Author(s) 2013.

  17. Automated structural classification of lipids by machine learning.

    PubMed

    Taylor, Ryan; Miller, Ryan H; Miller, Ryan D; Porter, Michael; Dalgleish, James; Prince, John T

    2015-03-01

    Modern lipidomics is largely dependent upon structural ontologies because of the great diversity exhibited in the lipidome, but no automated lipid classification exists to facilitate this partitioning. The size of the putative lipidome far exceeds the number currently classified, despite a decade of work. Automated classification would benefit ongoing classification efforts by decreasing the time needed and increasing the accuracy of classification while providing classifications for mass spectral identification algorithms. We introduce a tool that automates classification into the LIPID MAPS ontology of known lipids with >95% accuracy and novel lipids with 63% accuracy. The classification is based upon simple chemical characteristics and modern machine learning algorithms. The decision trees produced are intelligible and can be used to clarify implicit assumptions about the current LIPID MAPS classification scheme. These characteristics and decision trees are made available to facilitate alternative implementations. We also discovered many hundreds of lipids that are currently misclassified in the LIPID MAPS database, strongly underscoring the need for automated classification. Source code and chemical characteristic lists as SMARTS search strings are available under an open-source license at https://www.github.com/princelab/lipid_classifier. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
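
    The published chemical characteristics are distributed as SMARTS search strings at the repository linked above; the hedged sketch below merely shows how such SMARTS features can be computed with RDKit to build an input vector for a decision tree. The two patterns and the example molecule are generic illustrations, not entries from the published feature list.

```python
# Illustration only: computing SMARTS-based chemical characteristics with RDKit.
from rdkit import Chem

FEATURES = {
    "ester": Chem.MolFromSmarts("[CX3](=O)[OX2][#6]"),
    "carboxylic_acid": Chem.MolFromSmarts("[CX3](=O)[OX2H1]"),
}

def characteristic_vector(smiles):
    """Map a SMILES string to {feature_name: count of substructure matches}."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        raise ValueError(f"could not parse SMILES: {smiles}")
    return {name: len(mol.GetSubstructMatches(patt))
            for name, patt in FEATURES.items()}

# Tributyrin, a simple triacylglycerol-like example: three esters, no free acid.
print(characteristic_vector("CCCC(=O)OCC(OC(=O)CCC)COC(=O)CCC"))
```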

  18. Challenges and opportunities in the manufacture and expansion of cells for therapy.

    PubMed

    Maartens, Joachim H; De-Juan-Pardo, Elena; Wunner, Felix M; Simula, Antonio; Voelcker, Nicolas H; Barry, Simon C; Hutmacher, Dietmar W

    2017-10-01

    Laboratory-based ex vivo cell culture methods are largely manual in their manufacturing processes. This makes it extremely difficult to meet regulatory requirements for process validation, quality control and reproducibility. Cell culture concepts with a translational focus need to embrace a more automated approach where cell yields are able to meet the quantitative production demands, the correct cell lineage and phenotype is readily confirmed and reagent usage has been optimized. Areas covered: This article discusses the obstacles inherent in classical laboratory-based methods, their concomitant impact on cost-of-goods and that a technology step change is required to facilitate translation from bed-to-bedside. Expert opinion: While traditional bioreactors have demonstrated limited success where adherent cells are used in combination with microcarriers, further process optimization will be required to find solutions for commercial-scale therapies. New cell culture technologies based on 3D-printed cell culture lattices with favourable surface to volume ratios have the potential to change the paradigm in industry. An integrated Quality-by-Design /System engineering approach will be essential to facilitate the scaled-up translation from proof-of-principle to clinical validation.

  19. Modeling and managing risk early in software development

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Thomas, William M.; Hetmanski, Christopher J.

    1993-01-01

    In order to improve the quality of the software development process, we need to be able to build empirical multivariate models based on data collectable early in the software process. These models need to be both useful for prediction and easy to interpret, so that remedial actions may be taken in order to control and optimize the development process. We present an automated modeling technique which can be used as an alternative to regression techniques. We show how it can be used to facilitate the identification and aid the interpretation of the significant trends which characterize 'high risk' components in several Ada systems. Finally, we evaluate the effectiveness of our technique based on a comparison with logistic regression based models.

  20. Finding lesion correspondences in different views of automated 3D breast ultrasound

    NASA Astrophysics Data System (ADS)

    Tan, Tao; Platel, Bram; Hicks, Michael; Mann, Ritse M.; Karssemeijer, Nico

    2013-02-01

    Screening with automated 3D breast ultrasound (ABUS) is gaining popularity. However, the acquisition of multiple views required to cover an entire breast makes radiologic reading time-consuming. Linking lesions across views can facilitate the reading process. In this paper, we propose a method to automatically predict the position of a lesion in the target ABUS views, given the location of the lesion in a source ABUS view. We combine features describing the lesion location with respect to the nipple, the transducer and the chestwall, with features describing lesion properties such as intensity, spiculation, blobness, contrast and lesion likelihood. By using a grid search strategy, the location of the lesion was predicted in the target view. Our method achieved an error of 15.64 mm ± 16.13 mm. The error is small enough to help locate the lesion with minor additional interaction.

  1. Automated solid-phase extraction workstations combined with quantitative bioanalytical LC/MS.

    PubMed

    Huang, N H; Kagel, J R; Rossi, D T

    1999-03-01

    An automated solid-phase extraction workstation was used to develop, characterize and validate an LC/MS/MS method for quantifying a novel lipid-regulating drug in dog plasma. Method development was facilitated by workstation functions that allowed wash solvents of varying organic composition to be mixed and tested automatically. Precision estimates for this approach were within 9.8% relative standard deviation (RSD) across the calibration range. Accuracy for replicate determinations of quality controls was between -7.2 and +6.2% relative error (RE) over 5-1,000 ng/ml. Recoveries were evaluated for a wide variety of wash solvents, elution solvents and sorbents. Optimized recoveries were generally > 95%. A sample throughput benchmark for the method was approximately 8 min per sample. Because of parallel sample processing, 100 samples were extracted in less than 120 min. The approach has proven useful for use with LC/MS/MS, using a multiple reaction monitoring (MRM) approach.

  2. A Binary Segmentation Approach for Boxing Ribosome Particles in Cryo EM Micrographs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adiga, Umesh P.S.; Malladi, Ravi; Baxter, William

    Three-dimensional reconstruction of ribosome particles from electron micrographs requires selection of many single-particle images. Roughly 100,000 particles are required to achieve approximately 10 angstrom resolution. Manual selection of particles, by visual observation of the micrographs on a computer screen, is recognized as a bottleneck in automated single particle reconstruction. This paper describes an efficient approach for automated boxing of ribosome particles in micrographs. Use of a fast, anisotropic non-linear reaction-diffusion method to pre-process micrographs and rank-leveling to enhance the contrast between particles and the background, followed by binary and morphological segmentation, constitute the core of this technique. Modifying the shape of the particles to facilitate segmentation of individual particles within clusters and boxing the isolated particles is successfully attempted. Tests on a limited number of micrographs have shown that over 80 percent success is achieved in automatic particle picking.
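
    The original implementation is not reproduced here; the hedged scikit-image sketch below follows the general recipe described above (local mean leveling, thresholding, then binary and morphological clean-up) while omitting the reaction-diffusion denoising step. All parameter values are illustrative, not those of the published pipeline.

```python
# Hedged sketch of background leveling plus binary/morphological segmentation
# for candidate particle boxing; parameters are illustrative only.
import numpy as np
from skimage.util import img_as_ubyte
from skimage.filters import threshold_otsu
from skimage.filters.rank import mean as local_mean
from skimage.morphology import disk, remove_small_objects
from skimage.measure import label

def box_candidate_particles(micrograph, level_radius=25, min_size=200):
    """Return a labelled mask of candidate particle regions in a 2D micrograph."""
    img = img_as_ubyte((micrograph - micrograph.min()) /
                       (np.ptp(micrograph) + 1e-9))
    background = local_mean(img, disk(level_radius))     # local-mean leveling
    leveled = img.astype(int) - background.astype(int)   # flatten slow background
    binary = leveled > threshold_otsu(leveled)           # global threshold
    binary = remove_small_objects(binary, min_size=min_size)
    return label(binary)                                  # one label per candidate

# Each labelled region can then be boxed (e.g. via its bounding box) for
# single-particle extraction.
```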

  3. Automated Delineation of Lung Tumors from CT Images Using a Single Click Ensemble Segmentation Approach

    PubMed Central

    Gu, Yuhua; Kumar, Virendra; Hall, Lawrence O; Goldgof, Dmitry B; Li, Ching-Yen; Korn, René; Bendtsen, Claus; Velazquez, Emmanuel Rios; Dekker, Andre; Aerts, Hugo; Lambin, Philippe; Li, Xiuli; Tian, Jie; Gatenby, Robert A; Gillies, Robert J

    2012-01-01

    A single click ensemble segmentation (SCES) approach based on an existing “Click&Grow” algorithm is presented. The SCES approach requires only one operator selected seed point as compared with multiple operator inputs, which are typically needed. This facilitates processing large numbers of cases. Evaluation on a set of 129 CT lung tumor images using a similarity index (SI) was done. The average SI is above 93% using 20 different start seeds, showing stability. The average SI for 2 different readers was 79.53%. We then compared the SCES algorithm with the two readers, the level set algorithm and the skeleton graph cut algorithm obtaining an average SI of 78.29%, 77.72%, 63.77% and 63.76% respectively. We can conclude that the newly developed automatic lung lesion segmentation algorithm is stable, accurate and automated. PMID:23459617

  4. Using Advanced Tabu Search Approaches to Perform Enhanced Air Mobility Command Operational Airlift Analyses - Phases II and III

    DTIC Science & Technology

    2006-10-31

    ... Ross (USN), Javier Barreiro and Jason Porter; AMC: Mr. David L. Merrill, Maj David Van Veldhuizen, PhD; MITRE Inc. (USTRANSCOM): Mr. Stuart Draper, Mr. Mark ... A graphical user interface (GUI) was developed, at the request of Lt Col Van Veldhuizen (AMC), to facilitate the use of McKinzie's TPFDD automated editor/error corrector that was part of ... and Van Veldhuizen (2006). This research addressed both the channel and contingency instances of air fleet loading at an APOE. In this process, Capt

  5. Convective polymer assembly for the deposition of nanostructures and polymer thin films on immobilized particles.

    PubMed

    Richardson, Joseph J; Björnmalm, Mattias; Gunawan, Sylvia T; Guo, Junling; Liang, Kang; Tardy, Blaise; Sekiguchi, Shota; Noi, Ka Fung; Cui, Jiwei; Ejima, Hirotaka; Caruso, Frank

    2014-11-21

    We report the preparation of polymer particles via convective polymer assembly (CPA). Convection is used to move polymer solutions and cargo through an agarose gel that contains immobilized template particles. This method both coats and washes the particles in a process that is amenable to automation, and does not depend on passive diffusion or electrical currents, thus facilitating incorporation of fragile and nanoscale objects, such as liposomes and gold nanoparticles, into the thin polymer films. Template dissolution leads to the formation of stable polymer particles and capsules.

  6. A new user-friendly visual environment for breast MRI data analysis.

    PubMed

    Antonios, Danelakis; Dimitrios, Verganelakis A; Theoharis, Theoharis

    2013-06-01

    In this paper a novel, user-friendly visual environment for Breast MRI Data Analysis is presented (BreDAn). Given planar MRI images before and after IV contrast medium injection, BreDAn generates kinematic graphs, color maps of signal increase and decrease, and finally detects high-risk breast areas. The advantage of BreDAn, which has been validated and tested successfully, is the automation of the radiodiagnostic process in an accurate and reliable manner. It can potentially reduce radiologists' workload. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
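
    BreDAn's code is not shown here; as a hedged illustration of one building block it describes, the sketch below computes a voxel-wise relative enhancement map from pre- and post-contrast images and masks strongly enhancing areas. The 80 % threshold is an assumption for illustration.

```python
# Not BreDAn itself: a minimal relative-enhancement map from pre- and
# post-contrast images, with a simple mask of strongly enhancing areas.
import numpy as np

def relative_enhancement(pre, post, eps=1e-6):
    """Signal change relative to the pre-contrast image, voxel by voxel."""
    pre = np.asarray(pre, dtype=float)
    post = np.asarray(post, dtype=float)
    return (post - pre) / (pre + eps)

def high_enhancement_mask(pre, post, threshold=0.8):
    """Boolean map of areas whose signal increased by more than `threshold`."""
    return relative_enhancement(pre, post) > threshold

pre = np.array([[100.0, 120.0], [90.0, 110.0]])
post = np.array([[105.0, 260.0], [92.0, 115.0]])
print(np.round(relative_enhancement(pre, post), 2))
print(high_enhancement_mask(pre, post))
```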

  7. Transforming microbial genotyping: a robotic pipeline for genotyping bacterial strains.

    PubMed

    O'Farrell, Brian; Haase, Jana K; Velayudhan, Vimalkumar; Murphy, Ronan A; Achtman, Mark

    2012-01-01

    Microbial genotyping increasingly deals with large numbers of samples, and data are commonly evaluated by unstructured approaches, such as spread-sheets. The efficiency, reliability and throughput of genotyping would benefit from the automation of manual manipulations within the context of sophisticated data storage. We developed a medium-throughput genotyping pipeline for MultiLocus Sequence Typing (MLST) of bacterial pathogens. This pipeline was implemented through a combination of four automated liquid handling systems, a Laboratory Information Management System (LIMS) consisting of a variety of dedicated commercial operating systems and programs, including a Sample Management System, plus numerous Python scripts. All tubes and microwell racks were bar-coded and their locations and status were recorded in the LIMS. We also created a hierarchical set of items that could be used to represent bacterial species, their products and experiments. The LIMS allowed reliable, semi-automated, traceable bacterial genotyping from initial single colony isolation and sub-cultivation through DNA extraction and normalization to PCRs, sequencing and MLST sequence trace evaluation. We also describe robotic sequencing to facilitate cherrypicking of sequence dropouts. This pipeline is user-friendly, with a throughput of 96 strains within 10 working days at a total cost of < €25 per strain. Since developing this pipeline, >200,000 items were processed by two to three people. Our sophisticated automated pipeline can be implemented by a small microbiology group without extensive external support, and provides a general framework for semi-automated bacterial genotyping of large numbers of samples at low cost.

  8. Transforming Microbial Genotyping: A Robotic Pipeline for Genotyping Bacterial Strains

    PubMed Central

    Velayudhan, Vimalkumar; Murphy, Ronan A.; Achtman, Mark

    2012-01-01

    Microbial genotyping increasingly deals with large numbers of samples, and data are commonly evaluated by unstructured approaches, such as spreadsheets. The efficiency, reliability and throughput of genotyping would benefit from the automation of manual manipulations within the context of sophisticated data storage. We developed a medium-throughput genotyping pipeline for MultiLocus Sequence Typing (MLST) of bacterial pathogens. This pipeline was implemented through a combination of four automated liquid handling systems, a Laboratory Information Management System (LIMS) consisting of a variety of dedicated commercial operating systems and programs, including a Sample Management System, plus numerous Python scripts. All tubes and microwell racks were bar-coded and their locations and status were recorded in the LIMS. We also created a hierarchical set of items that could be used to represent bacterial species, their products and experiments. The LIMS allowed reliable, semi-automated, traceable bacterial genotyping from initial single colony isolation and sub-cultivation through DNA extraction and normalization to PCRs, sequencing and MLST sequence trace evaluation. We also describe robotic sequencing to facilitate cherry-picking of sequence dropouts. This pipeline is user-friendly, with a throughput of 96 strains within 10 working days at a total cost of < €25 per strain. Since developing this pipeline, >200,000 items have been processed by two to three people. Our sophisticated automated pipeline can be implemented by a small microbiology group without extensive external support, and provides a general framework for semi-automated bacterial genotyping of large numbers of samples at low cost. PMID:23144721

  9. Mars Science Laboratory Flight Software Boot Robustness Testing Project Report

    NASA Technical Reports Server (NTRS)

    Roth, Brian

    2011-01-01

    On the surface of Mars, the Mars Science Laboratory will boot up its flight computers every morning, having charged the batteries through the night. This boot process is complicated, critical, and affected by numerous hardware states that can be difficult to test. The hardware test beds do not facilitate long-duration, back-to-back unattended automated testing, and although the software simulation has provided the necessary functionality and fidelity for this boot testing, it has not supported the full flexibility necessary for this task. Therefore, to perform this testing, a framework has been built around the software simulation that supports running automated tests while loading a variety of starting configurations for software and hardware states. This implementation has been tested against the nominal cases to validate the methodology, and support for configuring off-nominal cases is ongoing. The implication of this testing is that the introduction of input configurations that have so far proved difficult to test may reveal boot scenarios worth higher-fidelity investigation, and in other cases increase confidence in the robustness of the flight software boot process.

  10. Development of an optical microscopy system for automated bubble cloud analysis.

    PubMed

    Wesley, Daniel J; Brittle, Stuart A; Toolan, Daniel T W

    2016-08-01

    Recently, the number of uses of bubbles has begun to increase dramatically, with medicine, biofuel production, and wastewater treatment just some of the industries taking advantage of bubble properties, such as high mass transfer. As a result, more and more focus is being placed on the understanding and control of bubble formation processes, and numerous techniques are currently used to facilitate this understanding. Acoustic bubble sizing (ABS) and laser scattering techniques are able to provide information regarding bubble size and size distribution with minimal data processing, a major advantage over current optical-based direct imaging approaches. This paper demonstrates how direct bubble-imaging methods can be improved upon to yield high levels of automation and thus data comparable to ABS and laser scattering. We also discuss the added benefits of direct imaging approaches and how they can provide considerable additional information beyond that which ABS and laser scattering can supply. This work could easily be exploited by both industrial-scale operations and small-scale laboratory studies, as this straightforward and cost-effective approach is highly transferable and intuitive to use.
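
    As an illustration of the kind of automated direct-imaging analysis described above (a generic sketch, not the authors' system), a circular Hough transform can recover bubble radii from a single frame; the synthetic image and parameter values below are assumptions for demonstration only.

```python
# Generic sketch, not the authors' software: estimate bubble radii in a
# micrograph with edge detection plus a circular Hough transform.
import numpy as np
from skimage import feature, transform

def bubble_radii(gray, r_min=5, r_max=40, n_peaks=10):
    edges = feature.canny(gray, sigma=2.0)           # edge map of bubble outlines
    radii = np.arange(r_min, r_max, 2)
    hough = transform.hough_circle(edges, radii)     # accumulator per candidate radius
    _, cx, cy, found_r = transform.hough_circle_peaks(hough, radii,
                                                      total_num_peaks=n_peaks)
    return found_r                                   # radii in pixels

if __name__ == "__main__":
    # synthetic frame with two dark bubbles standing in for a real micrograph
    yy, xx = np.mgrid[0:200, 0:200]
    frame = np.ones((200, 200))
    frame[(yy - 60) ** 2 + (xx - 60) ** 2 < 20 ** 2] = 0.2
    frame[(yy - 140) ** 2 + (xx - 130) ** 2 < 30 ** 2] = 0.2
    print(bubble_radii(frame, n_peaks=2))            # expect radii near 20 and 30 px
```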

  11. Spitzer observatory operations: increasing efficiency in mission operations

    NASA Astrophysics Data System (ADS)

    Scott, Charles P.; Kahr, Bolinda E.; Sarrel, Marc A.

    2006-06-01

    This paper explores the how's and why's of the Spitzer Mission Operations System's (MOS) success, efficiency, and affordability in comparison to other observatory-class missions. MOS exploits today's flight, ground, and operations capabilities, embraces automation, and balances both risk and cost. With operational efficiency as the primary goal, MOS maintains a strong control process by translating lessons learned into efficiency improvements, thereby enabling the MOS processes, teams, and procedures to rapidly evolve from concept (through thorough validation) into in-flight implementation. Operational teaming, planning, and execution are designed to enable re-use. Mission changes, unforeseen events, and continuous improvement have often forced us to learn to fly anew. Collaborative spacecraft operations and remote science and instrument teams have become well integrated and have worked together to improve and optimize each human, machine, and software-system element. Adaptation to tighter spacecraft margins has facilitated continuous operational improvements via automated and autonomous software coupled with improved human analysis. Based upon what we now know and what we need to improve, adapt, or fix, the projected mission lifetime continues to grow - as does the opportunity for numerous scientific discoveries.

  12. Enabling Healthcare IT Governance: Human Task Management Service for Administering Emergency Department's Resources for Efficient Patient Flow.

    PubMed

    Rodriguez, Salvador; Aziz, Ayesha; Chatwin, Chris

    2014-01-01

    The use of Health Information Technology (HIT) to improve healthcare service delivery is constantly increasing due to research advances in medical science and information systems. Having a fully automated process solution for a Healthcare Organization (HCO) requires a combination of organizational strategies along with a selection of technologies that facilitate the goal of improving clinical outcomes. HCOs require dynamic management of care capabilities to realize the full potential of HIT. Business Process Management (BPM) is being increasingly adopted to streamline the healthcare service delivery and management processes. Emergency Departments (EDs), which require multidisciplinary resources and services to deliver effective clinical outcomes, provide a case in point. Managed care involves the coordination of a range of services in an ED. Although fully automated processes in emergency care provide a cutting-edge example of service delivery, there are many situations that require human interactions with the computerized systems, e.g., medication approvals, care transfers, and acute patient care. This requires a coordination mechanism for all the resources, computer and human, to work side by side to provide the best care. To ensure evidence-based medical practice in the ED, we have designed a Human Task Management service to model the process of coordination of ED resources based on the UK's NICE clinical guideline for managing the care of acutely ill patients. This functionality is implemented using Java Business Process Management (jBPM).

  13. clubber: removing the bioinformatics bottleneck in big data analyses.

    PubMed

    Miller, Maximilian; Zhu, Chengsheng; Bromberg, Yana

    2017-06-13

    With the advent of modern day high-throughput technologies, the bottleneck in biological discovery has shifted from the cost of doing experiments to that of analyzing results. clubber is our automated cluster-load balancing system developed for optimizing these "big data" analyses. Its plug-and-play framework encourages re-use of existing solutions for bioinformatics problems. clubber's goals are to reduce computation times and to facilitate use of cluster computing. The first goal is achieved by automating the balance of parallel submissions across available high performance computing (HPC) resources. Notably, the latter can be added on demand, including cloud-based resources, and/or featuring heterogeneous environments. The second goal of making HPCs user-friendly is facilitated by an interactive web interface and a RESTful API, allowing for job monitoring and result retrieval. We used clubber to speed up our pipeline for annotating molecular functionality of metagenomes. Here, we analyzed the Deepwater Horizon oil-spill study data to quantitatively show that the beach sands have not yet entirely recovered. Further, our analysis of the CAMI-challenge data revealed that microbiome taxonomic shifts do not necessarily correlate with functional shifts. These examples (21 metagenomes processed in 172 min) clearly illustrate the importance of clubber in the everyday computational biology environment.

  14. clubber: removing the bioinformatics bottleneck in big data analyses

    PubMed Central

    Miller, Maximilian; Zhu, Chengsheng; Bromberg, Yana

    2018-01-01

    With the advent of modern day high-throughput technologies, the bottleneck in biological discovery has shifted from the cost of doing experiments to that of analyzing results. clubber is our automated cluster-load balancing system developed for optimizing these “big data” analyses. Its plug-and-play framework encourages re-use of existing solutions for bioinformatics problems. clubber’s goals are to reduce computation times and to facilitate use of cluster computing. The first goal is achieved by automating the balance of parallel submissions across available high performance computing (HPC) resources. Notably, the latter can be added on demand, including cloud-based resources, and/or featuring heterogeneous environments. The second goal of making HPCs user-friendly is facilitated by an interactive web interface and a RESTful API, allowing for job monitoring and result retrieval. We used clubber to speed up our pipeline for annotating molecular functionality of metagenomes. Here, we analyzed the Deepwater Horizon oil-spill study data to quantitatively show that the beach sands have not yet entirely recovered. Further, our analysis of the CAMI-challenge data revealed that microbiome taxonomic shifts do not necessarily correlate with functional shifts. These examples (21 metagenomes processed in 172 min) clearly illustrate the importance of clubber in the everyday computational biology environment. PMID:28609295

  15. Modeling of prepregs during automated draping sequences

    NASA Astrophysics Data System (ADS)

    Krogh, Christian; Glud, Jens A.; Jakobsen, Johnny

    2017-10-01

    The behavior of woven prepreg fabric during automated draping sequences is investigated. A drape tool under development with an arrangement of grippers facilitates the placement of a woven prepreg fabric in a mold. It is essential that the draped configuration is free from wrinkles and other defects. The present study aims to set up a virtual draping framework capable of modeling the draping process from the initial flat fabric to the final double-curved shape, thereby assisting the development of an automated drape tool. The virtual draping framework consists of a kinematic mapping algorithm used to generate target points on the mold, which are used as input to a draping sequence planner. The draping sequence planner prescribes the displacement history for each gripper in the drape tool, and these displacements are then applied to each gripper in a transient model of the draping sequence. The model is based on a transient finite element analysis, with the material's constitutive behavior currently approximated as linear elastic orthotropic. In-plane tensile and bias-extension tests as well as bending tests are conducted and used as input for the model. The virtual draping framework shows good potential for obtaining a better understanding of the drape process and for guiding the development of the drape tool. However, results obtained from using the framework on a simple test case indicate that the generation of draping sequences is non-trivial.

  16. Technology assessment of automation trends in the modular home industry

    Treesearch

    Phil Mitchell; Robert Russell Hurst

    2009-01-01

    This report provides an assessment of technology used in manufacturing modular homes in the United States, and that used in the German prefabricated wooden home industry. It is the first step toward identifying the research needs in automation and manufacturing methods that will facilitate mass customization in the home manufacturing industry. Within the United States...

  17. Open-Source Tools for Enhancing Full-Text Searching of OPACs: Use of Koha, Greenstone and Fedora

    ERIC Educational Resources Information Center

    Anuradha, K. T.; Sivakaminathan, R.; Kumar, P. Arun

    2011-01-01

    Purpose: There are many library automation packages available as open-source software, comprising two modules: a staff-client module and an online public access catalogue (OPAC). Although the OPACs of these library automation packages provide advanced features for searching and retrieving bibliographic records, none of them facilitate full-text…

  18. Linking the Congenital Heart Surgery Databases of the Society of Thoracic Surgeons and the Congenital Heart Surgeons’ Society: Part 1—Rationale and Methodology

    PubMed Central

    Jacobs, Jeffrey P.; Pasquali, Sara K.; Austin, Erle; Gaynor, J. William; Backer, Carl; Hirsch-Romano, Jennifer C.; Williams, William G.; Caldarone, Christopher A.; McCrindle, Brian W.; Graham, Karen E.; Dokholyan, Rachel S.; Shook, Gregory J.; Poteat, Jennifer; Baxi, Maulik V.; Karamlou, Tara; Blackstone, Eugene H.; Mavroudis, Constantine; Mayer, John E.; Jonas, Richard A.; Jacobs, Marshall L.

    2014-01-01

    Purpose The Society of Thoracic Surgeons Congenital Heart Surgery Database (STS-CHSD) is the largest Registry in the world of patients who have undergone congenital and pediatric cardiac surgical operations. The Congenital Heart Surgeons’ Society Database (CHSS-D) is an Academic Database designed for specialized detailed analyses of specific congenital cardiac malformations and related treatment strategies. The goal of this project was to create a link between the STS-CHSD and the CHSS-D in order to facilitate studies not possible using either individual database alone and to help identify patients who are potentially eligible for enrollment in CHSS studies. Methods Centers were classified on the basis of participation in the STS-CHSD, the CHSS-D, or both. Five matrices, based on CHSS inclusionary criteria and STS-CHSD codes, were created to facilitate the automated identification of patients in the STS-CHSD who meet eligibility criteria for the five active CHSS studies. The matrices were evaluated with a manual adjudication process and were iteratively refined. The sensitivity and specificity of the original matrices and the refined matrices were assessed. Results In January 2012, a total of 100 centers participated in the STS-CHSD and 74 centers participated in the CHSS. A total of 70 centers participate in both and 40 of these 70 agreed to participate in this linkage project. The manual adjudication process and the refinement of the matrices resulted in an increase in the sensitivity of the matrices from 93% to 100% and an increase in the specificity of the matrices from 94% to 98%. Conclusion Matrices were created to facilitate the automated identification of patients potentially eligible for the five active CHSS studies using the STS-CHSD. These matrices have a sensitivity of 100% and a specificity of 98%. In addition to facilitating identification of patients potentially eligible for enrollment in CHSS studies, these matrices will allow (1) estimation of the denominator of patients potentially eligible for CHSS studies and (2) comparison of eligible and enrolled patients to potentially eligible and not enrolled patients to assess the generalizability of CHSS studies. PMID:24668974
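
    For reference, the sensitivity and specificity figures quoted above follow the standard screening-test definitions, with the matrices acting as the screening test and manual adjudication as the reference standard:

```latex
\[
\text{sensitivity} = \frac{TP}{TP + FN}, \qquad
\text{specificity} = \frac{TN}{TN + FP}
\]
% TP/FN: eligible patients the matrices do / do not flag;
% TN/FP: ineligible patients the matrices correctly exclude / incorrectly flag.
```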

  19. Process development for automated solar cell and module production. Task 4: Automated array assembly

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A process sequence which can be used in conjunction with automated equipment for the mass production of solar cell modules for terrestrial use was developed. The process sequence was then critically analyzed from a technical and economic standpoint to determine the technological readiness of certain process steps for implementation. The steps receiving analysis were: back contact metallization, automated cell array layup/interconnect, and module edge sealing. For automated layup/interconnect, both hard automation and programmable automation (using an industrial robot) were studied. The programmable automation system was then selected for actual hardware development.

  20. Microfluidic-Based Robotic Sampling System for Radioactive Solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jack D. Law; Julia L. Tripp; Tara E. Smith

    A novel microfluidic-based robotic sampling system has been developed for sampling and analysis of liquid solutions in nuclear processes. This system couples the use of a microfluidic sample chip with a robotic system designed to allow remote, automated sampling of process solutions in-cell and facilitates direct coupling of the microfluidic sample chip with analytical instrumentation. This system provides the capability for near real-time analysis, reduces analytical waste, and minimizes the potential for personnel exposure associated with traditional sampling methods. A prototype sampling system was designed, built and tested. System testing demonstrated operability of the microfluidic-based sample system and identified system modifications to optimize performance.

  1. An Automated HIV-1 Env-Pseudotyped Virus Production for Global HIV Vaccine Trials

    PubMed Central

    Fuss, Martina; Mazzotta, Angela S.; Sarzotti-Kelsoe, Marcella; Ozaki, Daniel A.; Montefiori, David C.; von Briesen, Hagen; Zimmermann, Heiko; Meyerhans, Andreas

    2012-01-01

    Background Infections with HIV still represent a major human health problem worldwide and a vaccine is the only long-term option to fight efficiently against this virus. Standardized assessments of HIV-specific immune responses in vaccine trials are essential for prioritizing vaccine candidates in preclinical and clinical stages of development. With respect to neutralizing antibodies, assays with HIV-1 Env-pseudotyped viruses are a high priority. To cover the increasing demands of HIV pseudoviruses, a complete cell culture and transfection automation system has been developed. Methodology/Principal Findings The automation system for HIV pseudovirus production comprises a modified Tecan-based Cellerity system. It covers an area of 5×3 meters and includes a robot platform, a cell counting machine, a CO2 incubator for cell cultivation and a media refrigerator. The processes for cell handling, transfection and pseudovirus production have been implemented according to manual standard operating procedures and are controlled and scheduled autonomously by the system. The system is housed in a biosafety level II cabinet that guarantees protection of personnel, environment and the product. HIV pseudovirus stocks at a scale of 140 ml to 1000 ml have been produced on the automated system. Parallel manual production of HIV pseudoviruses and comparisons (bridging assays) confirmed that the automatically produced pseudoviruses were of quality equivalent to those produced manually. In addition, the automated method was fully validated according to Good Clinical Laboratory Practice (GCLP) guidelines, including the validation parameters accuracy, precision, robustness and specificity. Conclusions An automated HIV pseudovirus production system has been successfully established. It allows the high-quality production of HIV pseudoviruses under GCLP conditions. In its present form, the installed module enables the production of 1000 ml of virus-containing cell culture supernatant per week. Thus, this novel automation facilitates standardized large-scale productions of HIV pseudoviruses for ongoing and upcoming HIV vaccine trials. PMID:23300558

  2. Ontology-Based Exchange and Immediate Application of Business Calculation Definitions for Online Analytical Processing

    NASA Astrophysics Data System (ADS)

    Kehlenbeck, Matthias; Breitner, Michael H.

    Business users define calculated facts based on the dimensions and facts contained in a data warehouse. These business calculation definitions contain the knowledge regarding quantitative relations that is necessary for deep analyses and for the production of meaningful reports. The definitions are largely independent of implementation and organization, yet no automated procedures exist to facilitate their exchange across organization and implementation boundaries, so each organization currently has to map its own business calculations to analysis and reporting tools separately. This paper presents an innovative approach based on standard Semantic Web technologies. This approach facilitates the exchange of business calculation definitions and allows for their automatic linking to specific data warehouses through semantic reasoning. A novel standard proxy server which enables the immediate application of exchanged definitions is introduced. Benefits of the approach are shown in a comprehensive case study.

  3. Reading Guided by Automated Graphical Representations: How Model-Based Text Visualizations Facilitate Learning in Reading Comprehension Tasks

    ERIC Educational Resources Information Center

    Pirnay-Dummer, Pablo; Ifenthaler, Dirk

    2011-01-01

    Our study integrates automated natural language-oriented assessment and analysis methodologies into feasible reading comprehension tasks. With the newly developed T-MITOCAR toolset, prose text can be automatically converted into an association net which has similarities to a concept map. The "text to graph" feature of the software is based on…

  4. Development of automated electromagnetic compatibility test facilities at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Harrison, Cecil A.

    1986-01-01

    The efforts to automate the electromagnetic compatibility (EMC) test facilities at Marshall Space Flight Center were examined. A battery of nine standard tests is to be integrated by means of a desktop computer-controller in order to provide near real-time data assessment, store the data acquired during testing on flexible disk, and provide computer production of the certification report.

  5. Predictors of Interpersonal Trust in Virtual Distributed Teams

    DTIC Science & Technology

    2008-09-01

    understand systems that are very complex in nature. Such understanding is essential to facilitate building or maintaining operators’ mental models of the...a significant impact on overall system performance. Specifically, the level of automation that combined human generation of options with computer...and/or computer servers had a significant impact on automated system performance. Additionally, Parasuraman, Sheridan, & Wickens (2000) proposed

  6. Automated MeSH indexing of the World-Wide Web.

    PubMed Central

    Fowler, J.; Kouramajian, V.; Maram, S.; Devadhar, V.

    1995-01-01

    To facilitate networked discovery and information retrieval in the biomedical domain, we have designed a system for automatic assignment of Medical Subject Headings to documents retrieved from the World-Wide Web. Our prototype implementations show significant promise. We describe our methods and discuss the further development of a completely automated indexing tool called the "Web-MeSH Medibot." PMID:8563421

  7. Automated synthesis, insertion and detection of polyps for CT colonography

    NASA Astrophysics Data System (ADS)

    Sezille, Nicolas; Sadleir, Robert J. T.; Whelan, Paul F.

    2003-03-01

    CT Colonography (CTC) is a new non-invasive colon imaging technique which has the potential to replace conventional colonoscopy for colorectal cancer screening. A novel system which facilitates automated detection of colorectal polyps at CTC is introduced. As exhaustive testing of such a system using real patient data is not feasible, more complete testing is achieved through synthesis of artificial polyps and insertion into real datasets. The polyp insertion is semi-automatic: candidate points are manually selected using a custom GUI, and suitable points are then determined automatically from an analysis of the local neighborhood surrounding each candidate point. Local density and orientation information are used to generate polyps based on an elliptical model. Anomalies are identified from the modified dataset by analyzing the axial images. Detected anomalies are classified as potential polyps or natural features using 3D morphological techniques. The final results are flagged for review. The system was evaluated using 15 scenarios. The sensitivity of the system was found to be 65%, with 34% false-positive detections. Automated diagnosis at CTC is possible and thorough testing is facilitated by augmenting real patient data with computer-generated polyps. Ultimately, automated diagnosis will enhance standard CTC and increase performance.

  8. Automation is key to managing a population's health.

    PubMed

    Matthews, Michael B; Hodach, Richard

    2012-04-01

    Online tools for automating population health management can help healthcare organizations meet their patients' needs both during and between encounters with the healthcare system. These tools can facilitate: The use of registries to track patients' health status and care gaps. Outbound messaging to notify patients when they need care. Care team management of more patients at different levels of risk. Automation of workflows related to case management and transitions of care. Online educational and mobile health interventions to engage patients in their care. Analytics programs to identify opportunities for improvement.

  9. Paradox effects of binge drinking on response inhibition processes depending on mental workload.

    PubMed

    Stock, Ann-Kathrin; Riegler, Lea; Chmielewski, Witold X; Beste, Christian

    2016-06-01

    Binge drinking is an increasing problem in Western societies, but we are still only beginning to unravel the effects of binge drinking on a cognitive level. While common sense suggests that all cognitive functions are compromised during high-dose ethanol intoxication, several studies suggest that the effects might instead be rather specific. Moreover, some results suggest that the degrees of automaticity and complexity of cognitive operations during response control modulate effects of binge drinking. However, this has not been tested in detail. In the current study, we therefore parametrically modulate cognitive/"mental" workload during response inhibition and examine the effects of high-dose ethanol intoxication (~1.1 ‰) in n = 18 male participants. The results suggest that detrimental effects of high-dose ethanol intoxication strongly depend on the complexity of processes involved in response inhibition. The results revealed strong effects (η² = .495) and are in line with findings showing that even high doses of ethanol have very specific effects on a cognitive level. Contrary to common sense, more complex cognitive operations seem to be less affected by high-dose ethanol intoxication. Complementing this, high-dose ethanol intoxication is increasingly detrimental for action control as stronger automated response tendencies take over and need to be controlled. Binge-like ethanol intoxication may take a heavier toll on cognitive control processes than on automated responses/response tendencies. Therefore, ethanol effects are more pronounced in supposedly "easier" control conditions because those facilitate the formation of automated response tendencies.
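
    The effect size quoted above is an eta-squared value; the abstract does not state which variant was computed, but the two common forms from an ANOVA decomposition are:

```latex
\[
\eta^2 = \frac{SS_{\text{effect}}}{SS_{\text{total}}}, \qquad
\eta^2_p = \frac{SS_{\text{effect}}}{SS_{\text{effect}} + SS_{\text{error}}}
\]
```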

  10. A Cognitive Systems Engineering Approach to Developing Human Machine Interface Requirements for New Technologies

    NASA Astrophysics Data System (ADS)

    Fern, Lisa Carolynn

    This dissertation examines the challenges inherent in designing and regulating to support human-automation interaction for new technologies that will be deployed into complex systems. A key question for new technologies with increasingly capable automation is how work will be accomplished by human and machine agents. This question has traditionally been framed as how functions should be allocated between humans and machines. Such framing misses the coordination and synchronization that are needed for the different human and machine roles in the system to accomplish their goals. Coordination and synchronization demands are driven by the underlying human-automation architecture of the new technology, which is typically not specified explicitly by designers. The human machine interface (HMI), which is intended to facilitate human-machine interaction and cooperation, is typically defined explicitly and therefore serves as a proxy for human-automation cooperation requirements with respect to technical standards for technologies. Unfortunately, mismatches between the HMI and the coordination and synchronization demands of the underlying human-automation architecture can lead to system breakdowns. A methodology is needed that both designers and regulators can utilize to evaluate the predicted performance of a new technology given potential human-automation architectures. Three experiments were conducted to inform the minimum HMI requirements for a detect and avoid (DAA) system for unmanned aircraft systems (UAS). The results of the experiments provided empirical input to specific minimum operational performance standards that UAS manufacturers will have to meet in order to operate UAS in the National Airspace System (NAS). These studies represent a success story for how to objectively and systematically evaluate prototype technologies as part of the process for developing regulatory requirements. They also provide an opportunity to reflect on the lessons learned in order to improve the methodology for defining technology requirements for regulators in the future. The biggest shortcoming of the presented research program was the absence of the explicit definition, generation and analysis of potential human-automation architectures. Failure to execute this step in the research process resulted in less efficient evaluation of the candidate prototype technologies in addition to a lack of exploration of different approaches to human-automation cooperation. Defining potential human-automation architectures a priori also allows regulators to develop scenarios that will stress the performance boundaries of the technology during the evaluation phase. The importance of adding this step of generating and evaluating candidate human-automation architectures prior to formal empirical evaluation is discussed. This document concludes with a look at both the importance of, and the challenges facing, the inclusion of examining human-automation coordination issues as part of the safety assurance activities of new technologies.

  11. Selecting automation for the clinical chemistry laboratory.

    PubMed

    Melanson, Stacy E F; Lindeman, Neal I; Jarolim, Petr

    2007-07-01

    Laboratory automation proposes to improve the quality and efficiency of laboratory operations, and may provide a solution to the quality demands and staff shortages faced by today's clinical laboratories. Several vendors offer automation systems in the United States, with both subtle and obvious differences. Arriving at a decision to automate, and the ensuing evaluation of available products, can be time-consuming and challenging. Although considerable discussion concerning the decision to automate has been published, relatively little attention has been paid to the process of evaluating and selecting automation systems. To outline a process for evaluating and selecting automation systems as a reference for laboratories contemplating laboratory automation. Our Clinical Chemistry Laboratory staff recently evaluated all major laboratory automation systems in the United States, with their respective chemistry and immunochemistry analyzers. Our experience is described and organized according to the selection process, the important considerations in clinical chemistry automation, decisions and implementation, and we give conclusions pertaining to this experience. Including the formation of a committee, workflow analysis, submitting a request for proposal, site visits, and making a final decision, the process of selecting chemistry automation took approximately 14 months. We outline important considerations in automation design, preanalytical processing, analyzer selection, postanalytical storage, and data management. Selecting clinical chemistry laboratory automation is a complex, time-consuming process. Laboratories considering laboratory automation may benefit from the concise overview and narrative and tabular suggestions provided.

  12. DG-AMMOS: a new tool to generate 3d conformation of small molecules using distance geometry and automated molecular mechanics optimization for in silico screening.

    PubMed

    Lagorce, David; Pencheva, Tania; Villoutreix, Bruno O; Miteva, Maria A

    2009-11-13

    Discovery of new bioactive molecules that could enter drug discovery programs or that could serve as chemical probes is a very complex and costly endeavor. Structure-based and ligand-based in silico screening approaches are nowadays extensively used to complement experimental screening approaches in order to increase the effectiveness of the process and to facilitate the screening of thousands or millions of small molecules against a biomolecular target. Both in silico screening methods require as input a suitable chemical compound collection, and most often the 3D structures of the small molecules have to be generated since compounds are usually delivered in 1D SMILES, CANSMILES or in 2D SDF formats. Here, we describe the new open source program DG-AMMOS, which allows the generation of the 3D conformation of small molecules using Distance Geometry and their energy minimization via Automated Molecular Mechanics Optimization. The program is validated on the Astex dataset, the ChemBridge Diversity database and on a number of small molecules with known crystal structures extracted from the Cambridge Structural Database. A comparison with the free program Balloon and the well-known commercial program Omega, both of which generate 3D structures of small molecules, is carried out. The results show that the new free program DG-AMMOS is a very efficient 3D structure generator engine. DG-AMMOS provides fast, automated and reliable access to the generation of 3D conformations of small molecules and facilitates the preparation of a compound collection prior to high-throughput virtual screening computations. The validation of DG-AMMOS on several different datasets proves that generated structures are generally of equal quality or sometimes better than structures obtained by other tested methods.
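
    DG-AMMOS itself is distributed as a standalone program and its code is not reproduced here; purely as an illustration of the same two-step idea (distance-geometry embedding followed by molecular-mechanics minimization), the open-source RDKit toolkit can be scripted as below. This is an analogous sketch, not the DG-AMMOS workflow.

```python
# Illustrative sketch of distance-geometry 3D generation plus force-field
# minimization, using RDKit rather than DG-AMMOS itself.
from rdkit import Chem
from rdkit.Chem import AllChem

def smiles_to_3d(smiles):
    mol = Chem.MolFromSmiles(smiles)
    mol = Chem.AddHs(mol)                      # explicit hydrogens before embedding
    AllChem.EmbedMolecule(mol, randomSeed=42)  # distance-geometry conformer generation
    AllChem.MMFFOptimizeMolecule(mol)          # molecular-mechanics energy minimization
    return Chem.MolToMolBlock(mol)             # 3D coordinates in MDL mol format

if __name__ == "__main__":
    print(smiles_to_3d("CC(=O)Oc1ccccc1C(=O)O"))  # aspirin as a small test input
```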

  13. Automatable algorithms to identify nonmedical opioid use using electronic data: a systematic review.

    PubMed

    Canan, Chelsea; Polinski, Jennifer M; Alexander, G Caleb; Kowal, Mary K; Brennan, Troyen A; Shrank, William H

    2017-11-01

    Improved methods to identify nonmedical opioid use can help direct health care resources to individuals who need them. Automated algorithms that use large databases of electronic health care claims or records for surveillance are a potential means to achieve this goal. In this systematic review, we reviewed the utility, attempts at validation, and application of such algorithms to detect nonmedical opioid use. We searched PubMed and Embase for articles describing automatable algorithms that used electronic health care claims or records to identify patients or prescribers with likely nonmedical opioid use. We assessed algorithm development, validation, and performance characteristics and the settings where they were applied. Study variability precluded a meta-analysis. Of 15 included algorithms, 10 targeted patients, 2 targeted providers, 2 targeted both, and 1 identified medications with high abuse potential. Most patient-focused algorithms (67%) used prescription drug claims and/or medical claims, with diagnosis codes of substance abuse and/or dependence as the reference standard. Eleven algorithms were developed via regression modeling. Four used natural language processing, data mining, audit analysis, or factor analysis. Automated algorithms can facilitate population-level surveillance. However, there is no true gold standard for determining nonmedical opioid use. Users must recognize the implications of identifying false positives and, conversely, false negatives. Few algorithms have been applied in real-world settings. Automated algorithms may facilitate identification of patients and/or providers most likely to need more intensive screening and/or intervention for nonmedical opioid use. Additional implementation research in real-world settings would clarify their utility. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
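
    Most of the reviewed patient-level algorithms were regression models over claims-derived features. The sketch below is purely hypothetical: the feature names, training labels, and values are invented for illustration and do not correspond to any validated algorithm from the review.

```python
# Hypothetical sketch of a claims-based logistic-regression flag in the spirit
# of the regression models the review describes; NOT a validated algorithm.
import numpy as np
from sklearn.linear_model import LogisticRegression

# invented features: prescriber_count, pharmacy_count, overlapping_opioid_days
X_train = np.array([[1, 1, 0], [2, 1, 5], [5, 4, 60], [7, 6, 120], [1, 2, 0], [6, 5, 90]])
y_train = np.array([0, 0, 1, 1, 0, 1])          # 1 = reference-standard nonmedical use

model = LogisticRegression().fit(X_train, y_train)
new_patients = np.array([[4, 3, 45], [1, 1, 7]])
print(model.predict_proba(new_patients)[:, 1])  # screening scores, not diagnoses
```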

  14. Dynamic Communication Resource Negotiations

    NASA Technical Reports Server (NTRS)

    Chow, Edward; Vatan, Farrokh; Paloulian, George; Frisbie, Steve; Srostlik, Zuzana; Kalomiris, Vasilios; Apgar, Daniel

    2012-01-01

    Today's advanced network management systems can automate many aspects of the tactical networking operations within a military domain. However, automation of joint and coalition tactical networking across multiple domains remains challenging. Due to potentially conflicting goals and priorities, human agreement is often required before implementation into the network operations. This is further complicated by incompatible network management systems and security policies, rendering it difficult to implement automatic network management and thus requiring manual human intervention in the communication protocols used at various network routers and endpoints. This process of manual human intervention is tedious, error-prone, and slow. In order to facilitate a better solution, we are pursuing a technology which makes network management automated, reliable, and fast. Automating the negotiation of the common network communication parameters between different parties is the subject of this paper. We present the technology that enables inter-force dynamic communication resource negotiations to enable ad-hoc inter-operation in the field between force domains, without pre-planning. It will also enable a dynamic response to changing conditions within the area of operations. Our solution enables the rapid blending of intra-domain policies so that the forces involved are able to inter-operate effectively without overwhelming each other's networks with inappropriate or unwarranted traffic. It will evaluate the policy rules and configuration data for each of the domains and then generate a compatible inter-domain policy and configuration that will update the gateway systems between the two domains.

  15. Model annotation for synthetic biology: automating model to nucleotide sequence conversion

    PubMed Central

    Misirli, Goksel; Hallinan, Jennifer S.; Yu, Tommy; Lawson, James R.; Wimalaratne, Sarala M.; Cooling, Michael T.; Wipat, Anil

    2011-01-01

    Motivation: The need for the automated computational design of genetic circuits is becoming increasingly apparent with the advent of ever more complex and ambitious synthetic biology projects. Currently, most circuits are designed through the assembly of models of individual parts such as promoters, ribosome binding sites and coding sequences. These low level models are combined to produce a dynamic model of a larger device that exhibits a desired behaviour. The larger model then acts as a blueprint for physical implementation at the DNA level. However, the conversion of models of complex genetic circuits into DNA sequences is a non-trivial undertaking due to the complexity of mapping the model parts to their physical manifestation. Automating this process is further hampered by the lack of computationally tractable information in most models. Results: We describe a method for automatically generating DNA sequences from dynamic models implemented in CellML and Systems Biology Markup Language (SBML). We also identify the metadata needed to annotate models to facilitate automated conversion, and propose and demonstrate a method for the markup of these models using RDF. Our algorithm has been implemented in a software tool called MoSeC. Availability: The software is available from the authors' web site http://research.ncl.ac.uk/synthetic_biology/downloads.html. Contact: anil.wipat@ncl.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:21296753

  16. DeepPicker: A deep learning approach for fully automated particle picking in cryo-EM.

    PubMed

    Wang, Feng; Gong, Huichao; Liu, Gaochao; Li, Meijing; Yan, Chuangye; Xia, Tian; Li, Xueming; Zeng, Jianyang

    2016-09-01

    Particle picking is a time-consuming step in single-particle analysis and often requires significant interventions from users, which has become a bottleneck for future automated electron cryo-microscopy (cryo-EM). Here we report a deep learning framework, called DeepPicker, to address this problem and fill the current gaps toward a fully automated cryo-EM pipeline. DeepPicker employs a novel cross-molecule training strategy to capture common features of particles from previously-analyzed micrographs, and thus does not require any human intervention during particle picking. Tests on the recently-published cryo-EM data of three complexes have demonstrated that our deep learning based scheme can successfully accomplish the human-level particle picking process and identify a sufficient number of particles that are comparable to those picked manually by human experts. These results indicate that DeepPicker can provide a practically useful tool to significantly reduce the time and manual effort spent in single-particle analysis and thus greatly facilitate high-resolution cryo-EM structure determination. DeepPicker is released as an open-source program, which can be downloaded from https://github.com/nejyeah/DeepPicker-python. Copyright © 2016 Elsevier Inc. All rights reserved.
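
    The released DeepPicker implementation is available at the URL above and is not reproduced here; as a minimal, generic sketch of the underlying operation (scoring square micrograph patches as particle versus background with a small convolutional network), one might write something like the following. The layer sizes and patch size are arbitrary assumptions.

```python
# Minimal generic sketch (not the DeepPicker code): a small CNN that scores
# square micrograph patches as particle vs. background, the core step a fully
# automated picker applies in a sliding window over each micrograph.
import torch
import torch.nn as nn

class PatchClassifier(nn.Module):
    def __init__(self, patch=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(16 * (patch // 4) ** 2, 2)   # particle / background logits

    def forward(self, x):                      # x: (batch, 1, patch, patch)
        z = self.features(x)
        return self.head(z.flatten(1))

if __name__ == "__main__":
    model = PatchClassifier()
    patches = torch.randn(4, 1, 64, 64)        # stand-in for extracted patches
    print(model(patches).softmax(dim=1))       # per-patch particle probabilities
```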

  17. Automated segmentation of oral mucosa from wide-field OCT images (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Goldan, Ryan N.; Lee, Anthony M. D.; Cahill, Lucas; Liu, Kelly; MacAulay, Calum; Poh, Catherine F.; Lane, Pierre

    2016-03-01

    Optical Coherence Tomography (OCT) can discriminate morphological tissue features important for oral cancer detection, such as the presence or absence of the basement membrane and the epithelial thickness. We previously reported an OCT system employing a rotary-pullback catheter capable of in vivo, rapid, wide-field (up to 90 × 2.5 mm²) imaging in the oral cavity. Due to the size and complexity of these OCT data sets, rapid automated image processing software that immediately displays important tissue features is required to facilitate prompt bedside clinical decisions. We present an automated segmentation algorithm capable of detecting the epithelial surface and basement membrane in 3D OCT images of the oral cavity. The algorithm was trained using volumetric OCT data acquired in vivo from a variety of tissue types and histology-confirmed pathologies spanning normal through cancer (8 sites, 21 patients). The algorithm was validated using a second dataset of similar size and tissue diversity. We demonstrate application of the algorithm to an entire OCT volume to map epithelial thickness, and detection of the basement membrane, over the tissue surface. These maps may be clinically useful for delineating pre-surgical tumor margins, or for biopsy site guidance.
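
    The authors' segmentation algorithm is not reproduced here; as a simplified, generic sketch of one ingredient of such pipelines (detecting the tissue surface as the first supra-threshold sample along each A-line of a B-scan), consider the following. The threshold rule and array sizes are assumptions.

```python
# Simplified generic sketch (not the authors' algorithm): per-A-line surface
# detection in an OCT B-scan via a noise-based intensity threshold.
import numpy as np

def detect_surface(bscan, k=4.0):
    # bscan: 2D array, depth x lateral position, linear intensity
    thresh = bscan[:10].mean() + k * bscan[:10].std()   # threshold from the top (air) rows
    above = bscan > thresh
    surface = above.argmax(axis=0)                      # first supra-threshold pixel per column
    surface[~above.any(axis=0)] = -1                    # mark columns with no detection
    return surface

if __name__ == "__main__":
    demo = np.random.rand(512, 1000) * 0.1
    demo[200:300, :] += 1.0                             # synthetic bright tissue band
    print(detect_surface(demo)[:5])                     # expect values near row 200
```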

  18. SuperSegger: robust image segmentation, analysis and lineage tracking of bacterial cells.

    PubMed

    Stylianidou, Stella; Brennan, Connor; Nissen, Silas B; Kuwada, Nathan J; Wiggins, Paul A

    2016-11-01

    Many quantitative cell biology questions require fast yet reliable automated image segmentation to identify and link cells from frame-to-frame, and characterize the cell morphology and fluorescence. We present SuperSegger, an automated MATLAB-based image processing package well-suited to quantitative analysis of high-throughput live-cell fluorescence microscopy of bacterial cells. SuperSegger incorporates machine-learning algorithms to optimize cellular boundaries and automated error resolution to reliably link cells from frame-to-frame. Unlike existing packages, it can reliably segment microcolonies with many cells, facilitating the analysis of cell-cycle dynamics in bacteria as well as cell-contact mediated phenomena. This package has a range of built-in capabilities for characterizing bacterial cells, including the identification of cell division events, mother, daughter and neighbouring cells, and computing statistics on cellular fluorescence, the location and intensity of fluorescent foci. SuperSegger provides a variety of postprocessing data visualization tools for single cell and population level analysis, such as histograms, kymographs, frame mosaics, movies and consensus images. Finally, we demonstrate the power of the package by analyzing lag phase growth with single cell resolution. © 2016 John Wiley & Sons Ltd.
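
    SuperSegger itself is a MATLAB package; the Python sketch below only illustrates the elementary segmentation step that such tools automate (threshold, label connected components, report per-cell statistics) and makes no attempt to reproduce its machine-learning boundary refinement or lineage tracking.

```python
# Illustrative sketch of basic cell segmentation and per-cell statistics,
# not the SuperSegger package itself.
import numpy as np
from skimage import filters, measure, morphology

def segment_cells(frame, min_area=30):
    mask = frame > filters.threshold_otsu(frame)            # global Otsu threshold
    mask = morphology.remove_small_objects(mask, min_area)  # drop small debris
    labels = measure.label(mask)                             # one integer label per cell
    return measure.regionprops_table(labels, intensity_image=frame,
                                     properties=("label", "area", "mean_intensity"))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame = 0.1 * rng.random((256, 256))                     # dim background noise
    frame[100:120, 50:70] += 1.0                             # one synthetic bright "cell"
    print(segment_cells(frame)["area"])                      # expect a single area of ~400 px
```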

  19. Automated identification of Monogeneans using digital image processing and K-nearest neighbour approaches.

    PubMed

    Yousef Kalafi, Elham; Tan, Wooi Boon; Town, Christopher; Dhillon, Sarinder Kaur

    2016-12-22

    Monogeneans are flatworms (Platyhelminthes) that are primarily found on the gills and skin of fishes. Monogenean parasites have attachment appendages at their haptoral regions that help them to move about the body surface and feed on skin and gill debris. Haptoral attachment organs consist of sclerotized hard parts such as hooks, anchors and marginal hooks. Monogenean species are differentiated based on their haptoral bars, anchors and marginal hooks, the morphological characters of their reproductive parts (male and female copulatory organs), and soft anatomical parts. The complex structure of these diagnostic organs, and their overlap in microscopic digital images, are impediments to developing a fully automated identification system for monogeneans (LNCS 7666:256-263, 2012; ISDA, 457-462, 2011; J Zoolog Syst Evol Res 52(2):95-99, 2013). In this study, images of hard parts of the haptoral organs, such as bars and anchors, are used to develop a fully automated technique for monogenean species identification by implementing image processing techniques and machine learning methods. Images of four monogenean species, namely Sinodiplectanotrema malayanus, Trianchoratus pahangensis, Metahaliotrema mizellei and Metahaliotrema sp. (undescribed), were used to develop the automated identification technique. K-nearest neighbour (KNN) was applied to classify the monogenean specimens based on the extracted features. 50% of the dataset was used for training and the other 50% was used for testing in the system evaluation. Our approach demonstrated an overall classification accuracy of 90%. Leave One Out (LOO) cross-validation was also used to validate our system, giving an accuracy of 91.25%. The methods presented in this study facilitate fast and accurate fully automated classification of monogeneans at the species level. In future studies, more classes will be included in the model, the time to capture the monogenean images will be reduced, and improvements in the extraction and selection of features will be implemented.
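
    A minimal sketch of the classification stage described above, with random placeholder feature vectors standing in for the measurements extracted from the haptoral hard parts:

```python
# Sketch of KNN classification with leave-one-out validation; the features and
# labels below are random placeholders, not the study's data.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(1)
X = rng.random((40, 10))            # 40 specimens x 10 shape features (placeholder)
y = np.repeat(np.arange(4), 10)     # four species, 10 specimens each

knn = KNeighborsClassifier(n_neighbors=3)
acc = cross_val_score(knn, X, y, cv=LeaveOneOut())   # one held-out specimen per fold
print("LOO accuracy:", acc.mean())
```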

  20. Automated identification of molecular effects of drugs (AIMED)

    PubMed Central

    Fathiamini, Safa; Johnson, Amber M; Zeng, Jia; Araya, Alejandro; Holla, Vijaykumar; Bailey, Ann M; Litzenburger, Beate C; Sanchez, Nora S; Khotskaya, Yekaterina; Xu, Hua; Meric-Bernstam, Funda; Bernstam, Elmer V

    2016-01-01

    Introduction Genomic profiling information is frequently available to oncologists, enabling targeted cancer therapy. Because clinically relevant information is rapidly emerging in the literature and elsewhere, there is a need for informatics technologies to support targeted therapies. To this end, we have developed a system for Automated Identification of Molecular Effects of Drugs, to help biomedical scientists curate this literature to facilitate decision support. Objectives To create an automated system to identify assertions in the literature concerning drugs targeting genes with therapeutic implications and characterize the challenges inherent in automating this process in rapidly evolving domains. Methods We used subject-predicate-object triples (semantic predications) and co-occurrence relations generated by applying the SemRep Natural Language Processing system to MEDLINE abstracts and ClinicalTrials.gov descriptions. We applied customized semantic queries to find drugs targeting genes of interest. The results were manually reviewed by a team of experts. Results Compared to a manually curated set of relationships, recall, precision, and F2 were 0.39, 0.21, and 0.33, respectively, which represents a 3- to 4-fold improvement over a publicly available set of predications (SemMedDB) alone. Upon review of ostensibly false positive results, 26% were considered relevant additions to the reference set, and an additional 61% were considered to be relevant for review. Adding co-occurrence data improved results for drugs in early development, but not their better-established counterparts. Conclusions Precision medicine poses unique challenges for biomedical informatics systems that help domain experts find answers to their research questions. Further research is required to improve the performance of such systems, particularly for drugs in development. PMID:27107438
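
    The F2 value quoted above is the recall-weighted F-measure; with the reported precision P = 0.21 and recall R = 0.39 it works out as:

```latex
\[
F_\beta = \frac{(1+\beta^2)\,P\,R}{\beta^2\,P + R}, \qquad
F_2 = \frac{5 \times 0.21 \times 0.39}{4 \times 0.21 + 0.39} \approx 0.33
\]
```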

  1. Development of an automated data acquisition and processing pipeline using multiple telescopes for observing transient phenomena

    NASA Astrophysics Data System (ADS)

    Savant, Vaibhav; Smith, Niall

    2016-07-01

    We report on the current status of the development of a pilot automated data acquisition and reduction pipeline built around the operation of two nodes of remotely operated robotic telescopes located in California, USA and Cork, Ireland. The observatories are primarily used as a testbed for automation and instrumentation and as a tool to facilitate STEM (Science Technology Engineering Mathematics) promotion. The Ireland node is situated at Blackrock Castle Observatory (operated by Cork Institute of Technology) and consists of two optical telescopes - 6" and 16" OTAs housed in two separate domes - while the node in California is its 6" replica. Together they form a pilot Telescope ARrAy known as TARA. QuickPhot is an automated data reduction pipeline designed primarily to shed more light on the microvariability of blazars; it employs precision optical photometry on data from the TARA telescopes, which constantly monitor predefined targets whenever observing conditions are favourable. After carrying out aperture photometry, if any variability above a given threshold is observed, the reporting telescope will communicate the source concerned and the other nodes will follow up with multi-band observations, taking advantage of the fact that they are located in strategically separated time zones. Ultimately we wish to investigate the applicability of Shock-in-Jet and Geometric models, which try to explain the processes at work in AGNs that result in the formation of jets, by looking for temporal and spectral variability in TARA multi-band observations. We are also experimenting with a Two-channel Optical PHotometric Imaging CAMera (TOΦCAM) that we have developed and optimised for simultaneous two-band photometry on our 16" OTA.
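
    Not the QuickPhot code, but a minimal sketch of the differential-photometry variability check described above; the flux values and the 0.05 mag threshold are arbitrary placeholders.

```python
# Sketch of a differential-photometry variability flag: compare the target's
# magnitude relative to a comparison star across frames against a threshold.
import numpy as np

def variability_flag(target_flux, comp_flux, threshold_mag=0.05):
    dmag = -2.5 * np.log10(target_flux / comp_flux)     # differential magnitude per frame
    return np.abs(dmag - dmag.mean()) > threshold_mag   # boolean flag per frame

if __name__ == "__main__":
    t = np.array([1000.0, 1002.0, 998.0, 1150.0])       # synthetic target counts
    c = np.array([5000.0, 5001.0, 4999.0, 5000.0])      # synthetic comparison-star counts
    print(variability_flag(t, c))                       # only the last frame is flagged
```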

  2. Using Internet-Based Automated Software to Process GPS Data at Michigan Tech University

    NASA Astrophysics Data System (ADS)

    Crook, A.; Diehl, J. F.

    2003-12-01

    The Michigan Tech University GPS monument was made operational in October of 2002. The monument, which consists of a concrete pillar extending approximately 10 feet below the surface and protruding 5 feet above ground, is located at the Houghton County Memorial Airport (47.171803° N, 88.498361° W). The primary purpose of the monument is to measure the velocity of the North American Plate at this location. A Trimble 4000ssi geodetic receiver with a Trimble Zephyr antenna is used to collect GPS data. The data are sent to a PC where they are processed using Auto-GIPSY, an internet-based GPS processing utility, which makes it possible to process GPS data via email without detailed knowledge of how the software works. Two Perl scripts were written to facilitate automation and to simplify processing of the GPS data even further. Twelve months of GPS data were processed using Auto-GIPSY, which produced velocities of -24 +/- 5 mm/yr and -4 +/- 6 mm/yr for the X and Y components, respectively, with an azimuth of 261° with respect to the ITRF2000. This calculated result compares well with the NNR-NUVEL1A velocities of -17 mm/yr and -1 mm/yr for the X and Y components, respectively, with an azimuth of 267°. The results from an alternative online processing service, the Scripps Coordinate Update Tool (SCOUT), which uses GAMIT, will also be presented as a comparative method.
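
    A minimal sketch of the velocity estimate described above: fit a straight line to a daily position time series for one horizontal component and convert the slope to mm/yr. The synthetic series stands in for Auto-GIPSY output and is not the authors' data.

```python
# Sketch: estimate a plate-motion velocity component as the slope of a
# least-squares line through daily positions (mm), reported in mm/yr.
import numpy as np

def velocity_mm_per_yr(days, position_mm):
    slope, _ = np.polyfit(days, position_mm, 1)   # mm per day
    return slope * 365.25                         # convert to mm per year

if __name__ == "__main__":
    days = np.arange(365)
    rng = np.random.default_rng(0)
    east_mm = -24.0 / 365.25 * days + rng.normal(0, 2, days.size)  # ~-24 mm/yr plus noise
    print(round(velocity_mm_per_yr(days, east_mm), 1))
```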

  3. Text Mining in Biomedical Domain with Emphasis on Document Clustering.

    PubMed

    Renganathan, Vinaitheerthan

    2017-07-01

    With the exponential increase in the number of articles published every year in the biomedical domain, there is a need to build automated systems to extract unknown information from the articles published. Text mining techniques enable the extraction of unknown knowledge from unstructured documents. This paper reviews text mining processes in detail and the software tools available to carry out text mining. It also reviews the roles and applications of text mining in the biomedical domain. Text mining processes, such as search and retrieval of documents, pre-processing of documents, natural language processing, methods for text clustering, and methods for text classification are described in detail. Text mining techniques can facilitate the mining of vast amounts of knowledge on a given topic from published biomedical research articles and draw meaningful conclusions that are not possible otherwise.
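
    A short sketch of the document-clustering step the review emphasizes, using TF-IDF vectorization followed by k-means; the toy abstracts are placeholders.

```python
# Sketch of document clustering: TF-IDF features plus k-means.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

abstracts = [
    "automated segmentation of MRI images of the brain",
    "deep learning for particle picking in cryo-EM micrographs",
    "text mining of biomedical literature for drug-gene relations",
    "natural language processing of clinical notes",
]
X = TfidfVectorizer(stop_words="english").fit_transform(abstracts)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)   # documents grouped into two topical clusters
```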

  4. Passive Seismic Monitoring for Rockfall at Yucca Mountain: Concept Tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, J; Twilley, K; Murvosh, H

    2003-03-03

    To proof-test a system intended to remotely monitor rockfall inside a potential radioactive waste repository at Yucca Mountain, a set of seismic sub-arrays will be deployed and tested on the surface of the mountain. The goal is to identify and locate rockfall events remotely using automated data collecting and processing techniques. We install seismometers on the ground surface, generate seismic energy to simulate rockfall in underground space beneath the array, and interpret the surface response to discriminate and locate the event. Data will be analyzed using matched-field processing, a generalized beam-forming method for localizing discrete signals. Software is being developed to facilitate the processing. To date, a three-component sub-array has been installed and successfully tested.
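
    Matched-field processing proper matches recorded wavefields against model-predicted replica fields; the sketch below only illustrates the simpler, related delay-and-sum idea of grid-searching candidate source positions under an assumed uniform velocity. It is not the project's software, and all values are synthetic.

```python
# Illustrative delay-and-sum localization sketch: grid-search candidate source
# positions, align traces by predicted travel-time delays, pick the position
# whose aligned stack carries the most energy. Assumes a uniform velocity model.
import numpy as np

def locate(traces, fs, sensors, grid, velocity=2000.0):
    best, best_power = None, -np.inf
    for p in grid:                                            # candidate source (x, y)
        delays = np.linalg.norm(sensors - p, axis=1) / velocity
        shifts = np.round((delays - delays.min()) * fs).astype(int)
        n = traces.shape[1] - shifts.max()
        stack = sum(tr[s:s + n] for tr, s in zip(traces, shifts))
        power = np.sum(stack ** 2)
        if power > best_power:
            best, best_power = p, power
    return best

if __name__ == "__main__":
    fs = 1000.0
    sensors = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
    grid = [np.array([x, y]) for x in range(0, 101, 10) for y in range(0, 101, 10)]
    true = np.array([60.0, 30.0])
    delays = np.linalg.norm(sensors - true, axis=1) / 2000.0
    t = np.arange(0, 0.5, 1 / fs)
    traces = np.array([np.exp(-((t - 0.1 - d) * 200) ** 2) for d in delays])  # synthetic pulses
    print(locate(traces, fs, sensors, grid))                  # expect a point near (60, 30)
```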

  5. Automation and control of off-planet oxygen production processes

    NASA Technical Reports Server (NTRS)

    Marner, W. J.; Suitor, J. W.; Schooley, L. S.; Cellier, F. E.

    1990-01-01

    This paper addresses several aspects of the automation and control of off-planet production processes. First, a general approach to process automation and control is discussed from the viewpoint of translating human process control procedures into automated procedures. Second, the control issues for the automation and control of off-planet oxygen processes are discussed. Sensors, instruments, and components are defined and discussed in the context of off-planet applications, and the need for 'smart' components is clearly established.

  6. Automation of Vapor-Diffusion Growth of Protein Crystals

    NASA Technical Reports Server (NTRS)

    Hamrick, David T.; Bray, Terry L.

    2005-01-01

    Some improvements have been made in a system of laboratory equipment developed previously for studying the crystallization of proteins from solution by use of dynamically controlled flows of dry gas. The improvements involve mainly (1) automation of dispensing of liquids for starting experiments, (2) automatic control of drying of protein solutions during the experiments, and (3) provision for automated acquisition of video images for monitoring experiments in progress and for post-experiment analysis. The automation of dispensing of liquids was effected by adding an automated liquid-handling robot that can aspirate source solutions and dispense them in either a hanging-drop or a sitting-drop configuration, whichever is specified, in each of 48 experiment chambers. A video camera of approximately the size and shape of a lipstick dispenser was added to a mobile stage that is part of the robot, in order to enable automated acquisition of images in each experiment chamber. The experiment chambers were redesigned to enable the use of sitting drops, enable backlighting of each specimen, and facilitate automation.

  7. The administrative window into the integrated DBMS

    NASA Technical Reports Server (NTRS)

    Brock, G. H.

    1984-01-01

    A good office automation system manned by a team of facilitators seeking opportunities to serve end users could go a long way toward defining a DBMS that serves management. The problems of DBMS organization, alternative approaches to solving some of the major problems, problems that may have no solution, and how office automation fits into the development of the manager's management information system are discussed.

  8. Expertise Development With Different Types of Automation: A Function of Different Cognitive Abilities.

    PubMed

    Jipp, Meike

    2016-02-01

    I explored whether different cognitive abilities (information-processing ability, working-memory capacity) are needed for expertise development when different types of automation (information vs. decision automation) are employed. It is well documented that expertise development and the employment of automation lead to improved performance. Here, it is argued that a learner's ability to reason about an activity may be hindered by the employment of information automation. Additional feedback needs to be processed, thus increasing the load on working memory and decelerating expertise development. By contrast, the employment of decision automation may stimulate reasoning, increase the initial load on information-processing ability, and accelerate expertise development. Authors of past research have not investigated the interrelations between automation assistance, individual differences, and expertise development. Sixty-one naive learners controlled simulated air traffic with two types of automation: information automation and decision automation. Their performance was captured across 16 trials. Well-established tests were used to assess information-processing ability and working-memory capacity. As expected, learners' performance benefited from expertise development and decision automation. Furthermore, individual differences moderated the effect of the type of automation on expertise development: The employment of only information automation increased the load on working memory during later expertise development. The employment of decision automation initially increased the need to process information. These findings highlight the importance of considering individual differences and expertise development when investigating human-automation interaction. The results are relevant for selecting automation configurations for expertise development. © 2015, Human Factors and Ergonomics Society.

  9. Classifying magnetic resonance image modalities with convolutional neural networks

    NASA Astrophysics Data System (ADS)

    Remedios, Samuel; Pham, Dzung L.; Butman, John A.; Roy, Snehashis

    2018-02-01

    Magnetic Resonance (MR) imaging allows the acquisition of images with different contrast properties depending on the acquisition protocol and the magnetic properties of tissues. Many MR brain image processing techniques, such as tissue segmentation, require multiple MR contrasts as inputs, and each contrast is treated differently. Thus it is advantageous to automate the identification of image contrasts for various purposes, such as facilitating image processing pipelines, and managing and maintaining large databases via content-based image retrieval (CBIR). Most automated CBIR techniques focus on a two-step process: extracting features from data and classifying the image based on these features. We present a novel 3D deep convolutional neural network (CNN)-based method for MR image contrast classification. The proposed CNN automatically identifies the MR contrast of an input brain image volume. Specifically, we explored three classification problems: (1) identify T1-weighted (T1-w), T2-weighted (T2-w), and fluid-attenuated inversion recovery (FLAIR) contrasts, (2) identify pre- vs. post-contrast T1, (3) identify pre- vs. post-contrast FLAIR. A total of 3418 image volumes acquired from multiple sites and multiple scanners were used. To evaluate each task, the proposed model was trained on 2137 images and tested on the remaining 1281 images. Results showed that image volumes were correctly classified with 97.57% accuracy.
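
    The sketch below shows the general shape of a 3D CNN contrast classifier in the spirit of the method described; the layer sizes, input dimensions, and training details are illustrative only and do not reproduce the authors' architecture.

```python
# Minimal 3D CNN sketch for volume-level contrast classification (PyTorch).
import torch
import torch.nn as nn

class ContrastClassifier3D(nn.Module):
    def __init__(self, n_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.AdaptiveAvgPool3d(1),          # global pooling: one value per channel
        )
        self.classifier = nn.Linear(16, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)                  # (N, 16, 1, 1, 1)
        return self.classifier(x.flatten(1))  # (N, n_classes) logits

# A single dummy volume (batch, channel, depth, height, width):
model = ContrastClassifier3D(n_classes=3)
logits = model(torch.randn(1, 1, 32, 64, 64))
print(logits.shape)   # torch.Size([1, 3]); argmax gives e.g. T1-w / T2-w / FLAIR
```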

  10. HepSim: A repository with predictions for high-energy physics experiments

    DOE PAGES

    Chekanov, S. V.

    2015-02-03

    A file repository for calculations of cross sections and kinematic distributions using Monte Carlo generators for high-energy collisions is discussed. The repository is used to facilitate effective preservation and archiving of data from theoretical calculations and for comparisons with experimental data. The HepSim data library is publicly accessible and includes a number of Monte Carlo event samples with Standard Model predictions for current and future experiments. The HepSim project includes a software package to automate the process of downloading and viewing online Monte Carlo event samples. Data streaming over a network for end-user analysis is discussed.

  11. Automated measurement of cell motility and proliferation

    PubMed Central

    Bahnson, Alfred; Athanassiou, Charalambos; Koebler, Douglas; Qian, Lei; Shun, Tongying; Shields, Donna; Yu, Hui; Wang, Hong; Goff, Julie; Cheng, Tao; Houck, Raymond; Cowsert, Lex

    2005-01-01

    Background Time-lapse microscopic imaging provides a powerful approach for following changes in cell phenotype over time. Visible responses of whole cells can yield insight into functional changes that underlie physiological processes in health and disease. For example, features of cell motility accompany molecular changes that are central to the immune response, to carcinogenesis and metastasis, to wound healing and tissue regeneration, and to the myriad developmental processes that generate an organism. Previously reported image processing methods for motility analysis required custom viewing devices and manual interactions that may introduce bias, that slow throughput, and that constrain the scope of experiments in terms of the number of treatment variables, time period of observation, replication and statistical options. Here we describe a fully automated system in which images are acquired 24/7 from 384 well plates and are automatically processed to yield high-content motility and morphological data. Results We have applied this technology to study the effects of different extracellular matrix compounds on human osteoblast-like cell lines to explore functional changes that may underlie processes involved in bone formation and maintenance. We show dose-response and kinetic data for induction of increased motility by laminin and collagen type I without significant effects on growth rate. Differential motility response was evident within 4 hours of plating cells; long-term responses differed depending upon cell type and surface coating. Average velocities were increased approximately 0.1 um/min by ten-fold increases in laminin coating concentration in some cases. Comparison with manual tracking demonstrated the accuracy of the automated method and highlighted the comparative imprecision of human tracking for analysis of cell motility data. Quality statistics are reported that associate with stage noise, interference by non-cell objects, and uncertainty in the outlining and positioning of cells by automated image analysis. Exponential growth, as monitored by total cell area, did not linearly correlate with absolute cell number, but proved valuable for selection of reliable tracking data and for disclosing between-experiment variations in cell growth. Conclusion These results demonstrate the applicability of a system that uses fully automated image acquisition and analysis to study cell motility and growth. Cellular motility response is determined in an unbiased and comparatively high throughput manner. Abundant ancillary data provide opportunities for uniform filtering according to criteria that select for biological relevance and for providing insight into features of system performance. Data quality measures have been developed that can serve as a basis for the design and quality control of experiments that are facilitated by automation and the 384 well plate format. This system is applicable to large-scale studies such as drug screening and research into effects of complex combinations of factors and matrices on cell phenotype. PMID:15831094
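
    One of the downstream computations such a system automates is the mean velocity of a tracked cell. The toy sketch below shows the calculation from centroid coordinates; the positions, frame interval, and pixel size are made up for illustration.

```python
# Mean cell velocity from tracked centroid positions (toy numbers).
import numpy as np

positions_px = np.array([[10.0, 12.0], [11.5, 13.0], [13.0, 14.5], [15.0, 15.0]])
minutes_per_frame = 15.0      # hypothetical imaging interval
microns_per_pixel = 0.65      # hypothetical pixel size

# Path length between consecutive frames, converted to micrometres.
steps_um = np.linalg.norm(np.diff(positions_px, axis=0), axis=1) * microns_per_pixel
mean_velocity = steps_um.sum() / (minutes_per_frame * len(steps_um))
print(f"mean velocity: {mean_velocity:.3f} um/min")
```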

  12. Additive Manufacturing of Thermoplastic Matrix Composites Using Ultrasonics

    NASA Astrophysics Data System (ADS)

    Olson, Meghan

    Advanced composite materials have great potential for facilitating energy-efficient product design and manufacture if improvements are made to current composite manufacturing processes. This thesis focuses on the development of a novel manufacturing process for thermoplastic composite structures entitled Laser-Ultrasonic Additive Manufacturing ('LUAM'), which is intended to combine the benefits of laser processing technology, developed by Automated Dynamics Inc., with ultrasonic bonding technology that is used commercially for unreinforced polymers. Used together, these technologies have the potential to significantly reduce the energy consumption and void content of thermoplastic composites made using Automated Fiber Placement (AFP). To develop LUAM in a methodical manner with minimal risk, a staged approach was devised whereby coupon-level mechanical testing and prototyping utilizing existing equipment was accomplished. Four key tasks have been identified for this effort: Benchmarking, Ultrasonic Compaction, Laser-Assisted Ultrasonic Compaction, and Demonstration and Characterization of LUAM. This thesis specifically addresses Tasks 1 and 2, i.e., Benchmarking and Ultrasonic Compaction, respectively. Task 1, fabricating test specimens using two traditional processes (autoclave and thermal press) and testing structural performance and dimensional accuracy, provides the results of a benchmarking study against which the performance of all future phases will be gauged. Task 2, fabricating test specimens using a non-traditional process (ultrasonic compaction) and evaluating them in a similar fashion, explores the role of ultrasonic processing parameters using three different thermoplastic composite materials. Further development of LUAM, although beyond the scope of this thesis, will combine laser and ultrasonic technology and eventually demonstrate a working system.

  13. An ion channel library for drug discovery and safety screening on automated platforms.

    PubMed

    Wible, Barbara A; Kuryshev, Yuri A; Smith, Stephen S; Liu, Zhiqi; Brown, Arthur M

    2008-12-01

    Ion channels represent the third largest class of targets in drug discovery after G-protein coupled receptors and kinases. In spite of this ranking, ion channels continue to be under-exploited as drug targets compared with the other two groups for several reasons. First, with 400 ion channel genes and an even greater number of functional channels due to mixing and matching of individual subunits, a systematic collection of ion channel-expressing cell lines for drug discovery and safety screening has not been available. Second, the lack of high-throughput functional assays for ion channels has limited their use as drug targets. Now that automated electrophysiology has come of age and provided the technology to assay ion channels at medium to high throughput, we have addressed the need for a library of ion channel cell lines by constructing the Ion Channel Panel (ChanTest Corp., Cleveland, OH). From 400 ion channel genes, a collection of 82 of the most relevant human ion channels for drug discovery, safety, and human disease has been assembled. Each channel has been stably overexpressed in human embryonic kidney 293 or Chinese hamster ovary cells. Cell lines have been selected and validated on automated electrophysiology systems to facilitate cost-effective screening for safe and selective compounds at earlier stages in the drug development process. The screening and validation processes as well as the relative advantages of different screening platforms are discussed.

  14. The Automation of Reserve Processing.

    ERIC Educational Resources Information Center

    Self, James

    1985-01-01

    Describes an automated reserve processing system developed locally at Clemons Library, University of Virginia. Discussion covers developments in the reserve operation at Clemons Library, automation of the processing and circulation functions of reserve collections, and changes in reserve operation performance and staffing needs due to automation.…

  15. Octopus-toolkit: a workflow to automate mining of public epigenomic and transcriptomic next-generation sequencing data

    PubMed Central

    Kim, Taemook; Seo, Hogyu David; Hennighausen, Lothar; Lee, Daeyoup

    2018-01-01

    Octopus-toolkit is a stand-alone application for retrieving and processing large sets of next-generation sequencing (NGS) data with a single step. Octopus-toolkit is an automated set-up-and-analysis pipeline utilizing the Aspera, SRA Toolkit, FastQC, Trimmomatic, HISAT2, STAR, Samtools, and HOMER applications. All the applications are installed on the user's computer when the program starts. Upon installation, it can automatically retrieve original files of various epigenomic and transcriptomic data sets, including ChIP-seq, ATAC-seq, DNase-seq, MeDIP-seq, MNase-seq and RNA-seq, from the Gene Expression Omnibus (GEO) data repository. The downloaded files can then be sequentially processed to generate BAM and BigWig files, which are used for advanced analyses and visualization. Currently, it can process NGS data from popular model organisms such as human (Homo sapiens), mouse (Mus musculus), dog (Canis lupus familiaris), plant (Arabidopsis thaliana), zebrafish (Danio rerio), fruit fly (Drosophila melanogaster), worm (Caenorhabditis elegans), and budding yeast (Saccharomyces cerevisiae). With the processed files from Octopus-toolkit, the meta-analysis of various data sets, motif searches for DNA-binding proteins, and the identification of differentially expressed genes and/or protein-binding sites can be easily conducted with a few commands by users. Overall, Octopus-toolkit facilitates the systematic and integrative analysis of available epigenomic and transcriptomic NGS big data. PMID:29420797
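
    The general pattern such a pipeline automates is the chaining of external tools with failure checking between steps. The sketch below shows that orchestration pattern in Python; the commands, options, and file names are illustrative and are not the actual invocations used by Octopus-toolkit.

```python
# Generic orchestration sketch: download reads, align, sort, and index,
# stopping if any step fails. Commands and arguments are illustrative.
import subprocess
from pathlib import Path

def run(cmd: list[str]) -> None:
    """Run one external step and stop the pipeline if it fails."""
    print(">>", " ".join(cmd))
    subprocess.run(cmd, check=True)

def process_run(accession: str, index_prefix: str, outdir: Path) -> Path:
    outdir.mkdir(parents=True, exist_ok=True)
    fastq = outdir / f"{accession}.fastq"
    sam = outdir / f"{accession}.sam"
    bam = outdir / f"{accession}.sorted.bam"

    run(["fasterq-dump", accession, "-O", str(outdir)])                     # download reads
    run(["hisat2", "-x", index_prefix, "-U", str(fastq), "-S", str(sam)])   # align
    run(["samtools", "sort", "-o", str(bam), str(sam)])                     # coordinate-sort
    run(["samtools", "index", str(bam)])                                    # index for browsers
    return bam

if __name__ == "__main__":
    process_run("SRR000001", "genome_index", Path("results"))
```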

  16. Processing and Analysis of Multichannel Extracellular Neuronal Signals: State-of-the-Art and Challenges

    PubMed Central

    Mahmud, Mufti; Vassanelli, Stefano

    2016-01-01

    In recent years, multichannel neuronal signal acquisition systems have allowed scientists to focus on research questions which were otherwise impossible. They act as a powerful means to study brain (dys)functions in in-vivo and in-vitro animal models. Typically, each session of electrophysiological experiments with multichannel data acquisition systems generates a large amount of raw data. For example, a 128-channel signal acquisition system with 16-bit A/D conversion and a 20 kHz sampling rate will generate approximately 17 GB of data per hour (uncompressed). This poses an important and challenging problem of inferring conclusions from the large amounts of acquired data. Thus, automated signal processing and analysis tools are becoming a key component in neuroscience research, facilitating extraction of relevant information from neuronal recordings in a reasonable time. The purpose of this review is to introduce the reader to the current state-of-the-art of open-source packages for (semi)automated processing and analysis of multichannel extracellular neuronal signals (i.e., neuronal spikes, local field potentials, electroencephalogram, etc.), and the existing Neuroinformatics infrastructure for tool and data sharing. The review is concluded by pinpointing some major challenges that are being faced, which include the development of novel benchmarking techniques, cloud-based distributed processing and analysis tools, as well as defining novel means to share and standardize data. PMID:27313507
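
    The quoted ~17 GB/hour figure follows directly from the stated recording parameters (16 bits = 2 bytes per sample, uncompressed), as the short calculation below shows; the "17 GB" corresponds to binary gibibytes.

```python
# Data-rate arithmetic for the 128-channel example above.
channels = 128
bytes_per_sample = 2          # 16-bit A/D conversion
sampling_rate_hz = 20_000
seconds_per_hour = 3_600

bytes_per_hour = channels * bytes_per_sample * sampling_rate_hz * seconds_per_hour
print(bytes_per_hour / 1e9, "GB/h")       # ~18.4 GB (decimal)
print(bytes_per_hour / 2**30, "GiB/h")    # ~17.2 GiB, i.e. "approximately 17 GB"
```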

  17. Collaborative data model and data base development for paleoenvironmental and archaeological domain using Semantic MediaWiki

    NASA Astrophysics Data System (ADS)

    Willmes, C.

    2017-12-01

    Within the Collaborative Research Centre 806 (CRC 806), an interdisciplinary research project that needs to manage data, information and knowledge from heterogeneous domains such as archeology, the cultural sciences, and the geosciences, a collaborative internal knowledge base system was developed. The system is based on the open-source MediaWiki software, best known as the software behind Wikipedia, which provides a web-based collaborative knowledge and information management platform. The software is additionally enhanced with the Semantic MediaWiki (SMW) extension, which allows structured data to be stored and managed within the wiki platform and provides complex query and API interfaces to the structured data stored in the SMW database. An additional open-source tool called mobo improves the data model development process and supports automated data imports, from small spreadsheets to large relational databases. Mobo is a command-line tool that helps build and deploy SMW structures in an agile, schema-driven development way; the data model formalizations, which are expressed in JSON-Schema format, can be managed and collaboratively developed using version control systems such as git. The combination of a well-equipped collaborative web platform provided by MediaWiki, the ability to store and query structured data provided by SMW, and the automated data import and data model development enabled by mobo results in a powerful yet flexible collaborative knowledge base system. Furthermore, SMW supports Semantic Web technology: the structured data can be exported as RDF, so a triple store with a SPARQL endpoint can be set up on top of the database. The JSON-Schema-based data models can be extended to JSON-LD to facilitate and profit from the possibilities of Linked Data technology.
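
    Semantic MediaWiki exposes its structured data through an "ask" API module; the hedged sketch below queries such an endpoint from Python. The wiki URL, category, property names, and query are hypothetical placeholders for the CRC 806 knowledge base, not its actual schema.

```python
# Hedged sketch: querying structured data from a Semantic MediaWiki
# installation via its "ask" API module (JSON output).
import requests

API_URL = "https://wiki.example.org/api.php"   # hypothetical SMW endpoint

def ask(query: str) -> dict:
    """Run an SMW #ask query and return the parsed JSON result."""
    params = {"action": "ask", "query": query, "format": "json"}
    response = requests.get(API_URL, params=params, timeout=30)
    response.raise_for_status()
    return response.json()

# Example: pages of a hypothetical "Site" category with two property printouts.
result = ask("[[Category:Site]]|?Has location|?Has dating method|limit=10")
for page, data in result.get("query", {}).get("results", {}).items():
    print(page, data.get("printouts", {}))
```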

  18. Considerations and benefits of implementing an online database tool for business continuity.

    PubMed

    Mackinnon, Susanne; Pinette, Jennifer

    2016-01-01

    In today's challenging climate of ongoing fiscal restraints, limited resources and complex organisational structures, there is an acute need to investigate opportunities to facilitate enhanced delivery of business continuity programmes while maintaining or increasing acceptable levels of service delivery. In 2013, Health Emergency Management British Columbia (HEMBC), responsible for emergency management and business continuity activities across British Columbia's health sector, transitioned its business continuity programme from a manual to an automated process with the development of a customised online database, known as the Health Emergency Management Assessment Tool (HEMAT). Key benefits to date include a more efficient business continuity input process, immediate situational awareness for use in emergency response and/or advanced planning and streamlined analyses for generation of reports.

  19. Exploring the Use of a Test Automation Framework

    NASA Technical Reports Server (NTRS)

    Cervantes, Alex

    2009-01-01

    It is known that software testers, more often than not, lack the time needed to fully test the delivered software product within the time period allotted to them. When problems in the implementation phase of a development project occur, it normally causes the software delivery date to slide. As a result, testers either need to work longer hours, or supplementary resources need to be added to the test team in order to meet aggressive test deadlines. One solution to this problem is to provide testers with a test automation framework to facilitate the development of automated test solutions.

  20. Development of design principles for automated systems in transport control.

    PubMed

    Balfe, Nora; Wilson, John R; Sharples, Sarah; Clarke, Theresa

    2012-01-01

    This article reports the results of a qualitative study investigating attitudes towards and opinions of an advanced automation system currently used in UK rail signalling. In-depth interviews were held with 10 users, key issues associated with automation were identified and the automation's impact on the signalling task investigated. The interview data highlighted the importance of the signallers' understanding of the automation and their (in)ability to predict its outputs. The interviews also covered the methods used by signallers to interact with and control the automation, and the perceived effects on their workload. The results indicate that despite a generally low level of understanding and ability to predict the actions of the automation system, signallers have developed largely successful coping mechanisms that enable them to use the technology effectively. These findings, along with parallel work identifying desirable attributes of automation from the literature in the area, were used to develop 12 principles of automation which can be used to help design new systems which better facilitate cooperative working. The work reported in this article was completed with the active involvement of operational rail staff who regularly use automated systems in rail signalling. The outcomes are currently being used to inform decisions on the extent and type of automation and user interfaces in future generations of rail control systems.

  1. An Updated Process for Automated Deepspace Conjunction Assessment

    NASA Technical Reports Server (NTRS)

    Tarzi, Zahi B.; Berry, David S.; Roncoli, Ralph B.

    2015-01-01

    There is currently a high level of interest in the areas of conjunction assessment and collision avoidance from organizations conducting space operations. Current conjunction assessment activity is mainly focused on spacecraft and debris in the Earth orbital environment [1]. However, collisions are possible in other orbital environments as well [2]. This paper will focus on the current operations of and recent updates to the Multimission Automated Deep Space Conjunction Assessment Process (MADCAP) used at the Jet Propulsion Laboratory for NASA to perform conjunction assessment at Mars and the Moon. Various space agencies have satellites in orbit at Mars and the Moon, with additional future missions planned. The consequences of a collision would be catastrophic. Intuitive notions predict a low probability of collision in these sparsely populated environments, but they may be inaccurate due to several factors. Orbits of scientific interest often tend to have similar characteristics, as do the orbits of spacecraft that provide a communications relay for surface missions. The MADCAP process is controlled by an automated scheduler which initializes analysis based on a set timetable or the appearance of new ephemeris files, either locally or on the Deep Space Network (DSN) Portal. The process then generates and communicates reports which are used to facilitate collision avoidance decisions. The paper also describes the operational experience and utilization of the automated tool during periods of high activity and interest, such as the close approaches of NASA's Lunar Atmosphere & Dust Environment Explorer (LADEE) and Lunar Reconnaissance Orbiter (LRO) during the LADEE mission. In addition, special consideration was required for the treatment of missions with rapidly varying orbits and less reliable long-term downtrack estimates; in particular, this was necessitated by perturbations to MAVEN's orbit induced by the Martian atmosphere. The application of special techniques to non-operational spacecraft with large uncertainties is also studied. Areas for future work are also described. Although the applications discussed in this paper are in the Martian and Lunar environments, the techniques are not unique to these bodies and could be applied to other orbital environments.
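
    The trigger logic described above, a scheduler that starts an analysis either on a timetable or when new ephemeris files appear, can be sketched schematically as below. The directory, file extension, and analysis call are hypothetical; this is not the MADCAP implementation.

```python
# Schematic polling scheduler: watch for newly delivered ephemeris files
# and kick off a conjunction-assessment run for each one.
import time
from pathlib import Path

WATCH_DIR = Path("/data/ephemerides")      # hypothetical delivery directory
POLL_SECONDS = 600

def run_assessment(ephemeris: Path) -> None:
    print(f"running conjunction assessment for {ephemeris.name}")
    # ... generate close-approach report, notify mission teams, etc. (placeholder)

def watch() -> None:
    seen: set[str] = set()
    while True:
        for f in sorted(WATCH_DIR.glob("*.bsp")):   # hypothetical file extension
            if f.name not in seen:
                seen.add(f.name)
                run_assessment(f)
        time.sleep(POLL_SECONDS)

if __name__ == "__main__":
    watch()
```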

  2. Electronic Data Interchange in Procurement

    DTIC Science & Technology

    1990-04-01

    contract management and order processing systems. This conversion of automated information to paper and back to automated form is not only slow and...automated purchasing computer and the contractor’s order processing computer through telephone lines, as illustrated in Figure 1-1. Computer-to-computer...into the contractor’s order processing or contract management system. This approach - converting automated information to paper and back to automated

  3. An Automated High-Throughput System to Fractionate Plant Natural Products for Drug Discovery

    PubMed Central

    Tu, Ying; Jeffries, Cynthia; Ruan, Hong; Nelson, Cynthia; Smithson, David; Shelat, Anang A.; Brown, Kristin M.; Li, Xing-Cong; Hester, John P.; Smillie, Troy; Khan, Ikhlas A.; Walker, Larry; Guy, Kip; Yan, Bing

    2010-01-01

    The development of an automated, high-throughput fractionation procedure to prepare and analyze natural product libraries for drug discovery screening is described. Natural products obtained from plant materials worldwide were extracted and first prefractionated on polyamide solid-phase extraction cartridges to remove polyphenols, followed by high-throughput automated fractionation, drying, weighing, and reformatting for screening and storage. The analysis of fractions with UPLC coupled with MS, PDA and ELSD detectors provides information that facilitates characterization of compounds in active fractions. Screening of a portion of fractions yielded multiple assay-specific hits in several high-throughput cellular screening assays. This procedure modernizes the traditional natural product fractionation paradigm by seamlessly integrating automation, informatics, and multimodal analytical interrogation capabilities. PMID:20232897

  4. Computational solution verification and validation applied to a thermal model of a ruggedized instrumentation package

    DOE PAGES

    Scott, Sarah Nicole; Templeton, Jeremy Alan; Hough, Patricia Diane; ...

    2014-01-01

    This study details a methodology for quantification of errors and uncertainties of a finite element heat transfer model applied to a Ruggedized Instrumentation Package (RIP). The proposed verification and validation (V&V) process includes solution verification to examine errors associated with the code's solution techniques, and model validation to assess the model's predictive capability for quantities of interest. The model was subjected to mesh resolution and numerical parameter sensitivity studies to determine reasonable parameter values and to understand how they change the overall model response and performance criteria. To facilitate quantification of the uncertainty associated with the mesh, automatic meshing and mesh refining/coarsening algorithms were created and implemented on the complex geometry of the RIP. Automated software to vary model inputs was also developed to determine the solution's sensitivity to numerical and physical parameters. The model was compared with an experiment to demonstrate its accuracy and determine the importance of both modelled and unmodelled physics in quantifying the results' uncertainty. An emphasis is placed on automating the V&V process to enable uncertainty quantification within tight development schedules.

  5. 3-D Imaging Systems for Agricultural Applications—A Review

    PubMed Central

    Vázquez-Arellano, Manuel; Griepentrog, Hans W.; Reiser, David; Paraforos, Dimitris S.

    2016-01-01

    Increasing resource efficiency through the automation of agriculture requires more information about the production process, as well as process and machinery status. Sensors are necessary for monitoring the status and condition of production by recognizing the surrounding structures such as objects, field structures, natural or artificial markers, and obstacles. Currently, three dimensional (3-D) sensors are economically affordable and technologically advanced to a great extent, so a breakthrough is already possible if enough research projects are commercialized. The aim of this review paper is to investigate the state-of-the-art of 3-D vision systems in agriculture, and the role and value that only 3-D data can have to provide information about environmental structures based on the recent progress in optical 3-D sensors. The structure of this research consists of an overview of the different optical 3-D vision techniques, based on the basic principles. Afterwards, their applications in agriculture are reviewed. The main focus lies on vehicle navigation and on crop and animal husbandry. The depth dimension brought by 3-D sensors provides key information that greatly facilitates the implementation of automation and robotics in agriculture. PMID:27136560

  6. A population MRI brain template and analysis tools for the macaque.

    PubMed

    Seidlitz, Jakob; Sponheim, Caleb; Glen, Daniel; Ye, Frank Q; Saleem, Kadharbatcha S; Leopold, David A; Ungerleider, Leslie; Messinger, Adam

    2018-04-15

    The use of standard anatomical templates is common in human neuroimaging, as it facilitates data analysis and comparison across subjects and studies. For non-human primates, previous in vivo templates have lacked sufficient contrast to reliably validate known anatomical brain regions and have not provided tools for automated single-subject processing. Here we present the "National Institute of Mental Health Macaque Template", or NMT for short. The NMT is a high-resolution in vivo MRI template of the average macaque brain generated from 31 subjects, as well as a neuroimaging tool for improved data analysis and visualization. From the NMT volume, we generated maps of tissue segmentation and cortical thickness. Surface reconstructions and transformations to previously published digital brain atlases are also provided. We further provide an analysis pipeline using the NMT that automates and standardizes the time-consuming processes of brain extraction, tissue segmentation, and morphometric feature estimation for anatomical scans of individual subjects. The NMT and associated tools thus provide a common platform for precise single-subject data analysis and for characterizations of neuroimaging results across subjects and studies. Copyright © 2017 Elsevier. All rights reserved.

  7. Deductive Glue Code Synthesis for Embedded Software Systems Based on Code Patterns

    NASA Technical Reports Server (NTRS)

    Liu, Jian; Fu, Jicheng; Zhang, Yansheng; Bastani, Farokh; Yen, I-Ling; Tai, Ann; Chau, Savio N.

    2006-01-01

    Automated code synthesis is a constructive process that can be used to generate programs from specifications. It can, thus, greatly reduce the software development cost and time. The use of a formal code synthesis approach for software generation further increases the dependability of the system. Though code synthesis has many potential benefits, the synthesis techniques are still limited. Meanwhile, components are widely used in embedded system development. Applying code synthesis to the component-based software development (CBSD) process can greatly enhance the capability of code synthesis while reducing the component composition efforts. In this paper, we discuss the issues and techniques for applying deductive code synthesis techniques to CBSD. For deductive synthesis in CBSD, a rule base is the key for inferring appropriate component composition. We use code patterns to guide the development of rules. Code patterns have been proposed to capture the typical usages of components. Several general composition operations have been identified to facilitate systematic composition. We present the technique for rule development and automated generation of new patterns from existing code patterns. A case study of using this method in building a real-time control system is also presented.

  8. A Software Engineering Paradigm for Quick-turnaround Earth Science Data Projects

    NASA Astrophysics Data System (ADS)

    Moore, K.

    2016-12-01

    As is generally the case with applied sciences professional and educational programs, the participants of such programs can come from a variety of technical backgrounds. In the NASA DEVELOP National Program, the participants come from an interdisciplinary set of backgrounds, with varying levels of experience with computer programming. DEVELOP makes use of geographically explicit data sets, and it is necessary to use geographic information systems and geospatial image processing environments. As data sets cover longer time spans and include more complex sets of parameters, automation is becoming an increasingly prevalent feature. Though platforms such as ArcGIS, ERDAS Imagine, and ENVI facilitate the batch-processing of geospatial imagery, these environments are naturally constraining to the user in that they limit him or her to the tools that are available. Users must then turn to "homemade" scripting in more traditional programming languages such as Python, JavaScript, or R, to automate workflows. However, in the context of quick-turnaround projects like those in DEVELOP, the programming learning curve may be prohibitively steep. In this work, we consider how to best design a software development paradigm that addresses two major constraints: an arbitrarily experienced programmer and quick-turnaround project timelines.

  9. Potential of Laboratory Execution Systems (LESs) to Simplify the Application of Business Process Management Systems (BPMSs) in Laboratory Automation.

    PubMed

    Neubert, Sebastian; Göde, Bernd; Gu, Xiangyu; Stoll, Norbert; Thurow, Kerstin

    2017-04-01

    Modern business process management (BPM) is increasingly interesting for laboratory automation. End-to-end workflow automation and improved top-level systems integration for information technology (IT) and automation systems are especially prominent objectives. With the ISO Standard Business Process Model and Notation (BPMN) 2.X, a system-independent and interdisciplinarily accepted graphical process control notation is provided that allows process analysis while also being executable. The transfer of BPM solutions to structured laboratory automation places novel demands, for example, concerning real-time-critical process and systems integration. The article discusses the potential of laboratory execution systems (LESs) for an easier implementation of the business process management system (BPMS) in hierarchical laboratory automation. In particular, complex application scenarios, including long process chains based on, for example, several distributed automation islands and mobile laboratory robots for material transport, are difficult to handle in BPMSs. The presented approach deals with the displacement of workflow control tasks into life-science-specialized LESs, the reduction of the numerous different interfaces between BPMSs and subsystems, and the simplification of complex process models. Thus, the integration effort for complex laboratory workflows can be significantly reduced for strictly structured automation solutions. An example application, consisting of a mixture of manual and automated subprocesses, is demonstrated using the presented BPMS-LES approach.

  10. Advanced automation for in-space vehicle processing

    NASA Technical Reports Server (NTRS)

    Sklar, Michael; Wegerif, D.

    1990-01-01

    The primary objective of this 3-year planned study is to assure that the fully evolved Space Station Freedom (SSF) can support automated processing of exploratory mission vehicles. Current study assessments show that the extravehicular activity (EVA) and, to some extent, intravehicular activity (IVA) manpower required for processing tasks far exceeds the available manpower. Furthermore, many processing tasks are either hazardous operations or they exceed EVA capability. Thus, automation is essential for SSF transportation node functionality. Here, advanced automation represents the replacement of human-performed tasks beyond the planned baseline automated tasks. Both physical tasks such as manipulation, assembly and actuation, and cognitive tasks such as visual inspection, monitoring and diagnosis, and task planning are considered. During this first year of activity, both the Phobos/Gateway Mars Expedition and Lunar Evolution missions proposed by the Office of Exploration have been evaluated. A methodology for choosing optimal tasks to be automated has been developed. Processing tasks for both missions have been ranked on the basis of automation potential. The underlying concept in evaluating and describing processing tasks has been the use of a common set of 'Primitive' task descriptions. Primitive or standard tasks have been developed both for manual or crew processing and automated machine processing.

  11. qDIET: toward an automated, self-sustaining knowledge base to facilitate linking point-of-sale grocery items to nutritional content

    PubMed Central

    Chidambaram, Valliammai; Brewster, Philip J.; Jordan, Kristine C.; Hurdle, John F.

    2013-01-01

    The United States, indeed the world, struggles with a serious obesity epidemic. The costs of this epidemic in terms of healthcare dollar expenditures and human morbidity/mortality are staggering. Surprisingly, clinicians are ill-equipped in general to advise patients on effective, longitudinal weight loss strategies. We argue that one factor hindering clinicians and patients in effective shared decision-making about weight loss is the absence of a metric that can be reasoned about and monitored over time, as clinicians do routinely with, say, serum lipid levels or HgA1C. We propose that a dietary quality measure championed by the USDA and NCI, the HEI-2005/2010, is an ideal metric for this purpose. We describe a new tool, the quality Dietary Information Extraction Tool (qDIET), which is a step toward an automated, self-sustaining process that can link retail grocery purchase data to the appropriate USDA databases to permit the calculation of the HEI-2005/2010. PMID:24551333

  12. qDIET: toward an automated, self-sustaining knowledge base to facilitate linking point-of-sale grocery items to nutritional content.

    PubMed

    Chidambaram, Valliammai; Brewster, Philip J; Jordan, Kristine C; Hurdle, John F

    2013-01-01

    The United States, indeed the world, struggles with a serious obesity epidemic. The costs of this epidemic in terms of healthcare dollar expenditures and human morbidity/mortality are staggering. Surprisingly, clinicians are ill-equipped in general to advise patients on effective, longitudinal weight loss strategies. We argue that one factor hindering clinicians and patients in effective shared decision-making about weight loss is the absence of a metric that can be reasoned about and monitored over time, as clinicians do routinely with, say, serum lipid levels or HgA1C. We propose that a dietary quality measure championed by the USDA and NCI, the HEI-2005/2010, is an ideal metric for this purpose. We describe a new tool, the quality Dietary Information Extraction Tool (qDIET), which is a step toward an automated, self-sustaining process that can link retail grocery purchase data to the appropriate USDA databases to permit the calculation of the HEI-2005/2010.
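
    The core linkage step such a tool automates, matching point-of-sale items to a nutrient reference and summarizing purchases, is sketched below in hypothetical form. The item codes, nutrient values, and the simple summary are invented and do not reproduce the HEI-2005/2010 scoring algorithm or the USDA databases.

```python
# Hypothetical sketch: link purchase records to a nutrient table and
# summarize purchases per food group (not the actual HEI calculation).
from collections import defaultdict

# (item_code, quantity) pairs from a hypothetical grocery transaction log.
purchases = [("001", 2), ("002", 1), ("003", 3)]

# Hypothetical nutrient lookup keyed by item code (per purchased unit).
nutrients = {
    "001": {"group": "vegetables", "kcal": 30},
    "002": {"group": "sweets", "kcal": 450},
    "003": {"group": "whole_grains", "kcal": 120},
}

totals: dict[str, float] = defaultdict(float)
energy = 0.0
for code, qty in purchases:
    info = nutrients[code]
    totals[info["group"]] += qty
    energy += qty * info["kcal"]

print("units purchased per food group:", dict(totals))
print("total energy (kcal):", energy)
```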

  13. Text Mining in Biomedical Domain with Emphasis on Document Clustering

    PubMed Central

    2017-01-01

    Objectives With the exponential increase in the number of articles published every year in the biomedical domain, there is a need to build automated systems to extract unknown information from the articles published. Text mining techniques enable the extraction of unknown knowledge from unstructured documents. Methods This paper reviews text mining processes in detail and the software tools available to carry out text mining. It also reviews the roles and applications of text mining in the biomedical domain. Results Text mining processes, such as search and retrieval of documents, pre-processing of documents, natural language processing, methods for text clustering, and methods for text classification are described in detail. Conclusions Text mining techniques can facilitate the mining of vast amounts of knowledge on a given topic from published biomedical research articles and draw meaningful conclusions that are not possible otherwise. PMID:28875048

  14. Compositional Verification of a Communication Protocol for a Remotely Operated Vehicle

    NASA Technical Reports Server (NTRS)

    Goodloe, Alwyn E.; Munoz, Cesar A.

    2009-01-01

    This paper presents the specification and verification in the Prototype Verification System (PVS) of a protocol intended to facilitate communication in an experimental remotely operated vehicle used by NASA researchers. The protocol is defined as a stack-layered composition of simpler protocols. It can be seen as the vertical composition of protocol layers, where each layer performs input and output message processing, and the horizontal composition of different processes concurrently inhabiting the same layer, where each process satisfies a distinct requirement. It is formally proven that the protocol components satisfy certain delivery guarantees. Compositional techniques are used to prove these guarantees also hold in the composed system. Although the protocol itself is not novel, the methodology employed in its verification extends existing techniques by automating the tedious and usually cumbersome part of the proof, thereby making the iterative design process of protocols feasible.

  15. Quantitative phase-digital holographic microscopy: a new imaging modality to identify original cellular biomarkers of diseases

    NASA Astrophysics Data System (ADS)

    Marquet, P.; Rothenfusser, K.; Rappaz, B.; Depeursinge, C.; Jourdain, P.; Magistretti, P. J.

    2016-03-01

    Quantitative phase microscopy (QPM) has recently emerged as a powerful label-free technique in the field of living cell imaging, allowing cell structure and dynamics to be measured non-invasively with nanometric axial sensitivity. Since the phase retardation of a light wave transmitted through the observed cells, namely the quantitative phase signal (QPS), is sensitive to both cellular thickness and the intracellular refractive index related to the cellular content, its accurate analysis allows various cell parameters to be derived and specific cell processes to be monitored, which is very likely to identify new cell biomarkers. Specifically, quantitative phase-digital holographic microscopy (QP-DHM), thanks to its numerical flexibility facilitating parallelization and automation, represents an appealing imaging modality both to identify original cellular biomarkers of diseases and to explore the underlying pathophysiological processes.

  16. Automated Comparative Auditing of NCIT Genomic Roles Using NCBI

    PubMed Central

    Cohen, Barry; Oren, Marc; Min, Hua; Perl, Yehoshua; Halper, Michael

    2008-01-01

    Biomedical research has identified many human genes and generated various knowledge about them. The National Cancer Institute Thesaurus (NCIT) represents such knowledge as concepts and roles (relationships). Due to the rapid advances in this field, it is to be expected that the NCIT’s Gene hierarchy will contain role errors. A comparative methodology to audit the Gene hierarchy with the use of the National Center for Biotechnology Information’s (NCBI’s) Entrez Gene database is presented. The two knowledge sources are accessed via a pair of Web crawlers to ensure up-to-date data. Our algorithms then compare the knowledge gathered from each, identify discrepancies that represent probable errors, and suggest corrective actions. The primary focus is on two kinds of gene roles: (1) the chromosomal locations of genes, and (2) the biological processes in which genes play a role. Regarding chromosomal locations, the discrepancies revealed are striking and systematic, suggesting a structurally common origin. In regard to the biological processes, difficulties arise because genes frequently play roles in multiple processes, and processes may have many designations (such as synonymous terms). Our algorithms make use of the roles defined in the NCIT Biological Process hierarchy to uncover many probable gene-role errors in the NCIT. These results show that automated comparative auditing is a promising technique that can identify a large number of probable errors and corrections for them in a terminological genomic knowledge repository, thus facilitating its overall maintenance. PMID:18486558
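
    The comparison step at the heart of such an audit can be sketched very simply: for each gene present in both sources, compare the recorded chromosomal location and flag disagreements. The gene symbols and locations below are invented for illustration and are not taken from either terminology.

```python
# Schematic comparative audit of chromosomal locations in two sources.
ncit_locations = {"TP53": "17p13.1", "BRCA2": "13q12.3", "EGFR": "7p11.2"}
ncbi_locations = {"TP53": "17p13.1", "BRCA2": "13q13.1", "EGFR": "7p11.2"}

# Keep only genes present in both sources whose locations disagree.
discrepancies = [
    (gene, ncit_loc, ncbi_locations[gene])
    for gene, ncit_loc in ncit_locations.items()
    if gene in ncbi_locations and ncit_loc != ncbi_locations[gene]
]

for gene, ncit_loc, ncbi_loc in discrepancies:
    print(f"{gene}: NCIT says {ncit_loc}, NCBI Entrez Gene says {ncbi_loc}")
```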

  17. Automated Production of Movies on a Cluster of Computers

    NASA Technical Reports Server (NTRS)

    Nail, Jasper; Le, Duong; Nail, William L.; Nail, William

    2008-01-01

    A method of accelerating and facilitating production of video and film motion-picture products, and software and generic designs of computer hardware to implement the method, are undergoing development. The method provides for automation of most of the tedious and repetitive tasks involved in editing and otherwise processing raw digitized imagery into final motion-picture products. The method was conceived to satisfy requirements, in industrial and scientific testing, for rapid processing of multiple streams of simultaneously captured raw video imagery into documentation in the form of edited video imagery and video derived data products for technical review and analysis. In the production of such video technical documentation, unlike in production of motion-picture products for entertainment, (1) it is often necessary to produce multiple video derived data products, (2) there are usually no second chances to repeat acquisition of raw imagery, (3) it is often desired to produce final products within minutes rather than hours, days, or months, and (4) consistency and quality, rather than aesthetics, are the primary criteria for judging the products. In the present method, the workflow has both serial and parallel aspects: processing can begin before all the raw imagery has been acquired, each video stream can be subjected to different stages of processing simultaneously on different computers that may be grouped into one or more cluster(s), and the final product may consist of multiple video streams. Results of processing on different computers are shared, so that workers can collaborate effectively.

  18. Flight deck automation: Promises and realities

    NASA Technical Reports Server (NTRS)

    Norman, Susan D. (Editor); Orlady, Harry W. (Editor)

    1989-01-01

    Issues of flight deck automation are multifaceted and complex. The rapid introduction of advanced computer-based technology onto the flight deck of transport category aircraft has had considerable impact both on aircraft operations and on the flight crew. As part of NASA's responsibility to facilitate an active exchange of ideas and information among members of the aviation community, a NASA/FAA/Industry workshop devoted to flight deck automation was organized by the Aerospace Human Factors Research Division of NASA Ames Research Center. Participants were invited from industry and from government organizations responsible for design, certification, operation, and accident investigation of transport category, automated aircraft. The goal of the workshop was to clarify the implications of automation, both positive and negative. Workshop panels and working groups identified issues regarding the design, training, and procedural aspects of flight deck automation, as well as the crew's ability to interact and perform effectively with the new technology. The proceedings include the invited papers and the panel and working group reports, as well as the summary and conclusions of the conference.

  19. Adaptive Algorithms for Automated Processing of Document Images

    DTIC Science & Technology

    2011-01-01

    Title of dissertation: Adaptive Algorithms for Automated Processing of Document Images. Mudit Agrawal, Doctor of Philosophy, 2011. Dissertation submitted to the Faculty of the Graduate School of the University...

  20. Automated processing of endoscopic surgical instruments.

    PubMed

    Roth, K; Sieber, J P; Schrimm, H; Heeg, P; Buess, G

    1994-10-01

    This paper deals with the requirements for automated processing of endoscopic surgical instruments. After a brief analysis of the current problems, solutions are discussed. Test procedures have been developed to validate the automated processing, so that the cleaning results are guaranteed and reproducible. A device for testing and cleaning, called TC-MIC, was also designed together with Netzsch Newamatic and PCI to automate processing and reduce manual work.

  1. Proof-of-concept automation of propellant processing

    NASA Technical Reports Server (NTRS)

    Ramohalli, Kumar; Schallhorn, P. A.

    1989-01-01

    For space-based propellant production, automation of the process is needed. Currently, all phases of terrestrial production have some form of human interaction. A mixer was acquired to help perform the tasks of automation. A heating system to be used with the mixer was designed, built, and installed. Tests performed on the heating system verify the design criteria. An IBM PS/2 personal computer was acquired for the future automation work. It is hoped that some of the mixing process itself will be automated. This is a concept demonstration task, proving that propellant production can be automated reliably.

  2. Automated Sample Preparation for Radiogenic and Non-Traditional Metal Isotopes: Removing an Analytical Barrier for High Sample Throughput

    NASA Astrophysics Data System (ADS)

    Field, M. Paul; Romaniello, Stephen; Gordon, Gwyneth W.; Anbar, Ariel D.; Herrmann, Achim; Martinez-Boti, Miguel A.; Anagnostou, Eleni; Foster, Gavin L.

    2014-05-01

    MC-ICP-MS has dramatically improved the analytical throughput for high-precision radiogenic and non-traditional isotope ratio measurements, compared to TIMS. The generation of large data sets, however, remains hampered by the tedious manual drip chromatography required for sample purification. A new, automated chromatography system reduces the laboratory bottleneck and expands the utility of high-precision isotope analyses in applications where large data sets are required: geochemistry, forensic anthropology, nuclear forensics, medical research and food authentication. We have developed protocols to automate ion exchange purification for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U) using the new prepFAST-MC™ (ESI, Omaha, Nebraska). The system is not only inert (all-fluoropolymer flow paths), but is also very flexible and can easily accommodate different resins, samples, and reagent types. When programmed, precise and accurate user-defined volumes and flow rates are implemented to automatically load samples, wash the column, condition the column and elute fractions. Unattended, the automated, low-pressure ion exchange chromatography system can process up to 60 samples overnight. Excellent reproducibility, reliability, and recovery, with low blanks and carry-over for samples in a variety of matrices, have been demonstrated, giving accurate and precise isotopic ratios within analytical error for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U). This illustrates the potential of the new prepFAST-MC™ as a powerful tool in radiogenic and non-traditional isotope research.

  3. Automation, parallelism, and robotics for proteomics.

    PubMed

    Alterovitz, Gil; Liu, Jonathan; Chow, Jijun; Ramoni, Marco F

    2006-07-01

    The speed of the human genome project (Lander, E. S., Linton, L. M., Birren, B., Nusbaum, C. et al., Nature 2001, 409, 860-921) was made possible, in part, by developments in automation of sequencing technologies. Before these technologies, sequencing was a laborious, expensive, and personnel-intensive task. Similarly, automation and robotics are changing the field of proteomics today. Proteomics is defined as the effort to understand and characterize proteins in the categories of structure, function and interaction (Englbrecht, C. C., Facius, A., Comb. Chem. High Throughput Screen. 2005, 8, 705-715). As such, this field nicely lends itself to automation technologies since these methods often require large economies of scale in order to achieve cost and time-saving benefits. This article describes some of the technologies and methods being applied in proteomics in order to facilitate automation within the field as well as in linking proteomics-based information with other related research areas.

  4. Data processing pipeline for Herschel HIFI

    NASA Astrophysics Data System (ADS)

    Shipman, R. F.; Beaulieu, S. F.; Teyssier, D.; Morris, P.; Rengel, M.; McCoey, C.; Edwards, K.; Kester, D.; Lorenzani, A.; Coeur-Joly, O.; Melchior, M.; Xie, J.; Sanchez, E.; Zaal, P.; Avruch, I.; Borys, C.; Braine, J.; Comito, C.; Delforge, B.; Herpin, F.; Hoac, A.; Kwon, W.; Lord, S. D.; Marston, A.; Mueller, M.; Olberg, M.; Ossenkopf, V.; Puga, E.; Akyilmaz-Yabaci, M.

    2017-12-01

    Context. The HIFI instrument on the Herschel Space Observatory performed over 9100 astronomical observations, almost 900 of which were calibration observations in the course of the nearly four-year Herschel mission. The data from each observation had to be converted from raw telemetry into calibrated products and were included in the Herschel Science Archive. Aims: The HIFI pipeline was designed to provide robust conversion from raw telemetry into calibrated data throughout all phases of the HIFI mission. Pre-launch laboratory testing was supported, as were routine mission operations. Methods: A modular software design allowed components to be easily added, removed, amended and/or extended as the understanding of the HIFI data developed during and after mission operations. Results: The HIFI pipeline processed data from all HIFI observing modes within the Herschel automated processing environment as well as within an interactive environment. The same software can be used by the general astronomical community to reprocess any standard HIFI observation. The pipeline also recorded the consistency of processing results and provided automated quality reports. Many pipeline modules had been in use since the HIFI pre-launch instrument-level testing. Conclusions: Processing in steps facilitated data analysis to discover and address instrument artefacts and uncertainties. The availability of the same pipeline components from pre-launch throughout the mission made for well-understood, tested, and stable processing. A smooth transition from one phase to the next significantly enhanced processing reliability and robustness. Herschel was an ESA space observatory with science instruments provided by European-led Principal Investigator consortia and with important participation from NASA.
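
    The modular design idea, a pipeline assembled from interchangeable steps that each consume and return the data product so steps can be added, removed, or reordered as understanding evolves, is sketched abstractly below. The step names and data fields are generic placeholders, not actual HIFI pipeline modules.

```python
# Abstract sketch of a modular, reorderable processing pipeline.
from typing import Callable, List

Step = Callable[[dict], dict]

def flag_saturation(obs: dict) -> dict:
    obs.setdefault("flags", []).append("saturation-checked")
    return obs

def subtract_baseline(obs: dict) -> dict:
    obs["level"] = obs.get("level", 0.0) - obs.get("baseline", 0.0)
    return obs

def calibrate_intensity(obs: dict) -> dict:
    obs["level"] = obs["level"] * obs.get("gain", 1.0)
    return obs

def run_pipeline(obs: dict, steps: List[Step]) -> dict:
    for step in steps:
        obs = step(obs)          # each module consumes and returns the product
    return obs

raw = {"level": 10.0, "baseline": 2.0, "gain": 1.5}
print(run_pipeline(raw, [flag_saturation, subtract_baseline, calibrate_intensity]))
```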

  5. The NIF DISCO Framework: Facilitating Automated Integration of Neuroscience Content on the Web

    PubMed Central

    Marenco, Luis; Wang, Rixin; Shepherd, Gordon M.; Miller, Perry L.

    2013-01-01

    This paper describes the capabilities of DISCO, an extensible approach that supports integrative Web-based information dissemination. DISCO is a component of the Neuroscience Information Framework (NIF), an NIH Neuroscience Blueprint initiative that facilitates integrated access to diverse neuroscience resources via the Internet. DISCO facilitates the automated maintenance of several distinct capabilities using a collection of files 1) that are maintained locally by the developers of participating neuroscience resources and 2) that are “harvested” on a regular basis by a central DISCO server. This approach allows central NIF capabilities to be updated as each resource’s content changes over time. DISCO currently supports the following capabilities: 1) resource descriptions, 2) “LinkOut” to a resource’s data items from NCBI Entrez resources such as PubMed, 3) Web-based interoperation with a resource, 4) sharing a resource’s lexicon and ontology, 5) sharing a resource’s database schema, and 6) participation by the resource in neuroscience-related RSS news dissemination. The developers of a resource are free to choose which DISCO capabilities their resource will participate in. Although DISCO is used by NIF to facilitate neuroscience data integration, its capabilities have general applicability to other areas of research. PMID:20387131

  6. The NIF DISCO Framework: facilitating automated integration of neuroscience content on the web.

    PubMed

    Marenco, Luis; Wang, Rixin; Shepherd, Gordon M; Miller, Perry L

    2010-06-01

    This paper describes the capabilities of DISCO, an extensible approach that supports integrative Web-based information dissemination. DISCO is a component of the Neuroscience Information Framework (NIF), an NIH Neuroscience Blueprint initiative that facilitates integrated access to diverse neuroscience resources via the Internet. DISCO facilitates the automated maintenance of several distinct capabilities using a collection of files 1) that are maintained locally by the developers of participating neuroscience resources and 2) that are "harvested" on a regular basis by a central DISCO server. This approach allows central NIF capabilities to be updated as each resource's content changes over time. DISCO currently supports the following capabilities: 1) resource descriptions, 2) "LinkOut" to a resource's data items from NCBI Entrez resources such as PubMed, 3) Web-based interoperation with a resource, 4) sharing a resource's lexicon and ontology, 5) sharing a resource's database schema, and 6) participation by the resource in neuroscience-related RSS news dissemination. The developers of a resource are free to choose which DISCO capabilities their resource will participate in. Although DISCO is used by NIF to facilitate neuroscience data integration, its capabilities have general applicability to other areas of research.

  7. Review of Armor Battalion and Below Automated Command and Control (C2) Soldier Performance Requirements

    DTIC Science & Technology

    1991-11-01

    baseline conditions, and a minimum of face-to-face communications to stress the automated communication capabilities. The terminals in the TOC will provide...orders. Yet, there is something beneficial, comforting, and reassuring about face-to-face meetings among such personnel. Meetings facilitate...The amount of information received by the company commander through CVCC will be less comprehensive than that received in a face-to-face meeting with

  8. Implementing Cardiopulmonary Resuscitation Training Programs in High Schools: Iowa's Experience.

    PubMed

    Hoyme, Derek B; Atkins, Dianne L

    2017-02-01

    To understand perceived barriers to providing cardiopulmonary resuscitation (CPR) education, implementation processes, and practices in high schools. Iowa has required CPR as a graduation requirement since 2011 as an unfunded mandate. A cross-sectional study was performed through multiple choice surveys sent to Iowa high schools to collect data about school demographics, details of CPR programs, cost, logistics, and barriers to implementation, as well as automated external defibrillator training and availability. Eighty-four schools responded (26%), with the most frequently reported school size of 100-500 students and faculty size of 25-50. When the law took effect, 51% of schools had training programs already in place; at the time of the study, 96% had successfully implemented CPR training. Perceived barriers to implementation were staffing, time commitment, equipment availability, and cost. The average estimated startup cost was <$1000 US, and the yearly maintenance cost was <$500 with funds typically allocated from existing school resources. The facilitator was a school official or volunteer for 81% of schools. Average estimated training time commitment per student was <2 hours. Automated external defibrillators are available in 98% of schools, and 61% include automated external defibrillator training in their curriculum. Despite perceived barriers, school CPR training programs can be implemented with reasonable resource and time allocations. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Implementing Cardiopulmonary Resuscitation Training Programs in High Schools: Iowa's Experience

    PubMed Central

    Hoyme, Derek B.; Atkins, Dianne L.

    2017-01-01

    Objective To understand perceived barriers to providing cardiopulmonary resuscitation (CPR) education, implementation processes, and practices in high schools. Study design Iowa has required CPR as a graduation requirement since 2011 as an unfunded mandate. A cross-sectional study was performed through multiple choice surveys sent to Iowa high schools to collect data about school demographics, details of CPR programs, cost, logistics, and barriers to implementation, as well as automated external defibrillator training and availability. Results Eighty-four schools responded (26%), with the most frequently reported school size of 100-500 students and faculty size of 25-50. When the law took effect, 51% of schools had training programs already in place; at the time of the study, 96% had successfully implemented CPR training. Perceived barriers to implementation were staffing, time commitment, equipment availability, and cost. The average estimated startup cost was <$1000 US, and the yearly maintenance cost was <$500 with funds typically allocated from existing school resources. The facilitator was a school official or volunteer for 81% of schools. Average estimated training time commitment per student was <2 hours. Automated external defibrillators are available in 98% of schools, and 61% include automated external defibrillator training in their curriculum. Conclusions Despite perceived barriers, school CPR training programs can be implemented with reasonable resource and time allocations. PMID:27852456

  10. A 1-night operant learning task without food-restriction differentiates among mouse strains in an automated home-cage environment.

    PubMed

    Remmelink, Esther; Loos, Maarten; Koopmans, Bastijn; Aarts, Emmeke; van der Sluis, Sophie; Smit, August B; Verhage, Matthijs

    2015-04-15

    Individuals are able to change their behavior based on its consequences, a process involving instrumental learning. Studying instrumental learning in mice can provide new insights in this elementary aspect of cognition. Conventional appetitive operant learning tasks that facilitate the study of this form of learning in mice, as well as more complex operant paradigms, require labor-intensive handling and food deprivation to motivate the animals. Here, we describe a 1-night operant learning protocol that exploits the advantages of automated home-cage testing and circumvents the interfering effects of food restriction. The task builds on behavior that is part of the spontaneous exploratory repertoire during the days before the task. We compared the behavior of C57BL/6J, BALB/cJ and DBA/2J mice and found various differences in behavior during this task, but no differences in learning curves. BALB/cJ mice showed the largest instrumental learning response, providing a superior dynamic range and statistical power to study instrumental learning by using this protocol. Insights gained with this home-cage-based learning protocol without food restriction will be valuable for the development of other, more complex, cognitive tasks in automated home-cages. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Conflict-Aware Scheduling Algorithm

    NASA Technical Reports Server (NTRS)

    Wang, Yeou-Fang; Borden, Chester

    2006-01-01

    A conflict-aware scheduling algorithm is being developed to help automate the allocation of NASA's Deep Space Network (DSN) antennas and equipment that are used to communicate with interplanetary scientific spacecraft. The current approach for scheduling DSN ground resources seeks to provide an equitable distribution of tracking services among the multiple scientific missions and is very labor intensive. Due to the large (and increasing) number of mission requests for DSN services, combined with technical and geometric constraints, the DSN is highly oversubscribed. To help automate the process, and reduce the DSN and spaceflight project labor effort required for initiating, maintaining, and negotiating schedules, a new scheduling algorithm is being developed. The scheduling algorithm generates a "conflict-aware" schedule, where all requests are scheduled based on a dynamic priority scheme. The conflict-aware scheduling algorithm allocates all requests for DSN tracking services while identifying and maintaining the conflicts to facilitate collaboration and negotiation between spaceflight missions. This contrasts with traditional "conflict-free" scheduling algorithms that assign tracks that are not in conflict and mark the remainder as unscheduled. In the case where full schedule automation is desired (based on mission/event priorities, fairness, allocation rules, geometric constraints, and ground system capabilities/constraints), a conflict-free schedule can easily be created from the conflict-aware schedule by removing lower priority items that are in conflict.
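
    The contrast between conflict-aware and conflict-free scheduling can be illustrated with a rough sketch: assign every request to its requested antenna window, record which requests overlap, and then derive a conflict-free subset by dropping the lower-priority member of each conflict. The data structures, mission/antenna names and priority scheme below are simplified stand-ins, not the DSN algorithm itself.

```python
# Toy contrast between conflict-aware and conflict-free scheduling.
# Requests, priorities and the overlap test are simplified illustrations.
from dataclasses import dataclass, field

@dataclass
class Request:
    mission: str
    antenna: str
    start: float          # hours
    end: float
    priority: int         # lower number = higher priority
    conflicts: list = field(default_factory=list)

def overlaps(a, b):
    return a.antenna == b.antenna and a.start < b.end and b.start < a.end

def conflict_aware(requests):
    """Schedule everything; annotate each request with the missions it conflicts with."""
    for i, a in enumerate(requests):
        for b in requests[i + 1:]:
            if overlaps(a, b):
                a.conflicts.append(b.mission)
                b.conflicts.append(a.mission)
    return requests

def conflict_free(requests):
    """Keep only the highest-priority request among each set of overlapping requests."""
    kept = []
    for r in sorted(requests, key=lambda r: r.priority):
        if all(not overlaps(r, k) for k in kept):
            kept.append(r)
    return kept

reqs = conflict_aware([
    Request("VGR2", "DSS-43", 0, 4, priority=1),
    Request("MRO",  "DSS-43", 3, 6, priority=2),
    Request("MSL",  "DSS-14", 1, 5, priority=3),
])
print([(r.mission, r.conflicts) for r in reqs])      # conflict-aware: everything scheduled
print([r.mission for r in conflict_free(reqs)])      # conflict-free: lower-priority overlap dropped
```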

  12. Comparison of two methods for measuring γ-H2AX nuclear fluorescence as a marker of DNA damage in cultured human cells: applications for microbeam radiation therapy

    NASA Astrophysics Data System (ADS)

    Anderson, D.; Andrais, B.; Mirzayans, R.; Siegbahn, E. A.; Fallone, B. G.; Warkentin, B.

    2013-06-01

    Microbeam radiation therapy (MRT) delivers single fractions of very high doses of synchrotron x-rays using arrays of microbeams. In animal experiments, MRT has achieved higher tumour control and less normal tissue toxicity compared to single-fraction broad beam irradiations of much lower dose. The mechanism behind the normal tissue sparing of MRT has yet to be fully explained. An accurate method for evaluating DNA damage, such as the γ-H2AX immunofluorescence assay, will be important for understanding the role of cellular communication in the radiobiological response of normal and cancerous cell types to MRT. We compare two methods of quantifying γ-H2AX nuclear fluorescence for uniformly irradiated cell cultures: manual counting of γ-H2AX foci by eye, and an automated, MATLAB-based fluorescence intensity measurement. We also demonstrate the automated analysis of cell cultures irradiated with an array of microbeams. In addition to offering a relatively high dynamic range of γ-H2AX signal versus irradiation dose ( > 10 Gy), our automated method provides speed, robustness, and objectivity when examining a series of images. Our in-house analysis facilitates the automated extraction of the spatial distribution of the γ-H2AX intensity with respect to the microbeam array — for example, the intensities in the peak (high dose area) and valley (area between two microbeams) regions. The automated analysis is particularly beneficial when processing a large number of samples, as is needed to systematically study the relationship between the numerous dosimetric and geometric parameters involved with MRT (e.g., microbeam width, microbeam spacing, microbeam array dimensions, peak dose, valley dose, and geometric arrangement of multiple arrays) and the resulting DNA damage.
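
    The automated fluorescence-intensity approach can be illustrated with a short, generic sketch: segment nuclei from a (here synthetic) image, label them, and report the mean γ-H2AX intensity per nucleus. This is an illustration using SciPy, not the authors' MATLAB code, and the image, threshold and region sizes are arbitrary placeholders.

```python
# Generic per-nucleus mean-intensity measurement (synthetic image, placeholder threshold).
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
image = rng.poisson(5, size=(256, 256)).astype(float)    # background signal
image[40:80, 40:80] += 60.0                               # fake "nucleus" 1
image[150:200, 120:170] += 30.0                           # fake "nucleus" 2

nuclei_mask = image > 25.0                  # placeholder segmentation threshold
labels, n_nuclei = ndimage.label(nuclei_mask)

mean_intensity = ndimage.mean(image, labels=labels, index=range(1, n_nuclei + 1))
for i, m in enumerate(mean_intensity, start=1):
    print(f"nucleus {i}: mean gamma-H2AX intensity = {m:.1f}")
```

    Binning such per-nucleus intensities by position relative to the microbeam array is what would yield the peak and valley profiles described in the abstract.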

  13. Automated Chromium Plating Line for Gun Barrels

    DTIC Science & Technology

    1979-09-01

    consistent pretreatments and bath dwell times. Some of the advantages of automated processing include increased productivity (average of 20%) due to...when automated processing procedures are used. The current method of applying chromium electrodeposits to gun tubes is a manual, batch operation...currently practiced with rotary swaged gun tubes would substantially reduce the difficulties in automated processing. RECOMMENDATIONS

  14. Automation in School Library Media Centers.

    ERIC Educational Resources Information Center

    Driver, Russell W.; Driver, Mary Anne

    1982-01-01

    Surveys the historical development of automated technical processing in schools and notes the impact of this automation in a number of cases. Speculations about the future involvement of school libraries in automated processing and networking are included. Thirty references are listed. (BBM)

  15. Advancing haemostasis automation--successful implementation of robotic centrifugation and sample processing in a tertiary service hospital.

    PubMed

    Sédille-Mostafaie, Nazanin; Engler, Hanna; Lutz, Susanne; Korte, Wolfgang

    2013-06-01

    Laboratories today face increasing pressure to automate operations due to increasing workloads and the need to reduce expenditure. Few studies to date have focussed on the laboratory automation of preanalytical coagulation specimen processing. In the present study, we examined whether a clinical chemistry automation protocol meets the preanalytical requirements for the analyses of coagulation. During the implementation of laboratory automation, we began to operate a pre- and postanalytical automation system. The preanalytical unit processes blood specimens for chemistry, immunology and coagulation by automated specimen processing. As the production of platelet-poor plasma is highly dependent on optimal centrifugation, we examined specimen handling under different centrifugation conditions in order to produce optimal platelet deficient plasma specimens. To this end, manually processed models centrifuged at 1500 g for 5 and 20 min were compared to an automated centrifugation model at 3000 g for 7 min. For analytical assays that are performed frequently enough to be targets for full automation, Passing-Bablok regression analysis showed close agreement between different centrifugation methods, with a correlation coefficient between 0.98 and 0.99 and a bias between -5% and +6%. For seldom performed assays that do not mandate full automation, the Passing-Bablok regression analysis showed acceptable to poor agreement between different centrifugation methods. A full automation solution is suitable and can be recommended for frequent haemostasis testing.

  16. Barriers and facilitators to public access defibrillation in out-of-hospital cardiac arrest: a systematic review.

    PubMed

    Smith, Christopher M; Lim Choi Keung, Sarah N; Khan, Mohammed O; Arvanitis, Theodoros N; Fothergill, Rachael; Hartley-Sharpe, Christopher; Wilson, Mark H; Perkins, Gavin D

    2017-10-01

    Public access defibrillation initiatives make automated external defibrillators available to the public. This facilitates earlier defibrillation of out-of-hospital cardiac arrest victims and could save many lives. It is currently only used for a minority of cases. The aim of this systematic review was to identify barriers and facilitators to public access defibrillation. A comprehensive literature review was undertaken defining formal search terms for a systematic review of the literature in March 2017. Studies were included if they considered reasons affecting the likelihood of public access defibrillation and presented original data. An electronic search strategy was devised searching MEDLINE and EMBASE, supplemented by bibliography and related-article searches. Given the low-quality and observational nature of the majority of articles, a narrative review was performed. Sixty-four articles were identified in the initial literature search. An additional four unique articles were identified from the electronic search strategies. The following themes were identified related to public access defibrillation: knowledge and awareness; willingness to use; acquisition and maintenance; availability and accessibility; training issues; registration and regulation; medicolegal issues; emergency medical services dispatch-assisted use of automated external defibrillators; automated external defibrillator-locator systems; demographic factors; other behavioural factors. In conclusion, several barriers and facilitators to public access defibrillation deployment were identified. However, the evidence is of very low quality and there is not enough information to inform changes in practice. This is an area in urgent need of further high-quality research if public access defibrillation is to be increased and more lives saved. PROSPERO registration number CRD42016035543. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2017. For permissions, please email: journals.permissions@oup.com.

  17. Demonstration of the feasibility of automated silicon solar cell fabrication

    NASA Technical Reports Server (NTRS)

    Taylor, W. E.; Schwartz, F. M.

    1975-01-01

    A study effort was undertaken to determine the process steps and design requirements of an automated silicon solar cell production facility. The key process steps were identified, and a laboratory model was conceptually designed to demonstrate the feasibility of automating the silicon solar cell fabrication process. A detailed laboratory model was designed to demonstrate those functions most critical to the feasibility of automating solar cell fabrication. The study and conceptual design have established the technical feasibility of automating the solar cell manufacturing process to produce low cost solar cells with improved performance. Estimates predict an automated process throughput of 21,973 kilograms of silicon a year on a three-shift, 49-week basis, producing 4,747,000 hexagonal cells (38 mm/side), a total of 3,373 kilowatts, at an estimated manufacturing cost of $0.866 per cell or $1.22 per watt.
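
    The quoted unit costs are mutually consistent: 3,373 kW spread over 4,747,000 cells is roughly 0.71 W per cell, so $0.866 per cell works out to about $1.22 per watt. A minimal arithmetic check (values taken from the abstract):

```python
# Consistency check of the quoted cost figures.
cells_per_year = 4_747_000
total_kw = 3_373
cost_per_cell = 0.866            # US dollars

watts_per_cell = total_kw * 1000 / cells_per_year
cost_per_watt = cost_per_cell / watts_per_cell
print(f"{watts_per_cell:.2f} W/cell, ${cost_per_watt:.2f}/W")   # ~0.71 W/cell, ~$1.22/W
```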

  18. Automated Space Processing Payloads Study. Volume 1: Executive Summary

    NASA Technical Reports Server (NTRS)

    1975-01-01

    An investigation is described which examined the extent to which the experiment hardware and operational requirements can be met by automatic control and material handling devices; payload and system concepts are defined which make extensive use of automation technology. Topics covered include experiment requirements and hardware data, capabilities and characteristics of industrial automation equipment and controls, payload grouping, automated payload conceptual design, space processing payload preliminary design, automated space processing payloads for early shuttle missions, and cost and scheduling.

  19. Automation of cellular therapy product manufacturing: results of a split validation comparing CD34 selection of peripheral blood stem cell apheresis product with a semi-manual vs. an automatic procedure.

    PubMed

    Hümmer, Christiane; Poppe, Carolin; Bunos, Milica; Stock, Belinda; Wingenfeld, Eva; Huppert, Volker; Stuth, Juliane; Reck, Kristina; Essl, Mike; Seifried, Erhard; Bonig, Halvard

    2016-03-16

    Automation of cell therapy manufacturing promises higher productivity of cell factories, more economical use of highly-trained (and costly) manufacturing staff, facilitation of processes requiring manufacturing steps at inconvenient hours, improved consistency of processing steps and other benefits. One of the most broadly disseminated engineered cell therapy products is immunomagnetically selected CD34+ hematopoietic "stem" cells (HSCs). As the clinical GMP-compliant automat CliniMACS Prodigy is being programmed to perform ever more complex sequential manufacturing steps, we developed a CD34+ selection module for comparison with the standard semi-automatic CD34 "normal scale" selection process on CliniMACS Plus, applicable for 600 × 10^6 target cells out of 60 × 10^9 total cells. Three split-validation processings with healthy donor G-CSF-mobilized apheresis products were performed; feasibility, time consumption and product quality were assessed. All processes proceeded uneventfully. Prodigy runs took about 1 h longer than CliniMACS Plus runs, albeit with markedly less hands-on operator time and therefore also suitable for less experienced operators. Recovery of target cells was the same for both technologies. Although impurities, specifically T- and B-cells, were 5 ± 1.6-fold and 4 ± 0.4-fold higher in the Prodigy products (p = ns and p = 0.013 for T and B cell depletion, respectively), T cell contents per kg of a virtual recipient receiving 4 × 10^6 CD34+ cells/kg was below 10 × 10^3/kg even in the worst Prodigy product and thus more than fivefold below the specification of CD34+ selected mismatched-donor stem cell products. The products' theoretical clinical usability is thus confirmed. This split validation exercise of a relatively short and simple process exemplifies the potential of automatic cell manufacturing. Automation will further gain in attractiveness when applied to more complex processes, requiring frequent interventions or handling at unfavourable working hours, such as re-targeting of T-cells.

  20. AMPHION: Specification-based programming for scientific subroutine libraries

    NASA Technical Reports Server (NTRS)

    Lowry, Michael; Philpot, Andrew; Pressburger, Thomas; Underwood, Ian; Waldinger, Richard; Stickel, Mark

    1994-01-01

    AMPHION is a knowledge-based software engineering (KBSE) system that guides a user in developing a diagram representing a formal problem specification. It then automatically implements a solution to this specification as a program consisting of calls to subroutines from a library. The diagram provides an intuitive domain oriented notation for creating a specification that also facilitates reuse and modification. AMPHION's architecture is domain independent. AMPHION is specialized to an application domain by developing a declarative domain theory. Creating a domain theory is an iterative process that currently requires the joint expertise of domain experts and experts in automated formal methods for software development.

  1. Developing Data Citations from Digital Object Identifier Metadata

    NASA Technical Reports Server (NTRS)

    James, Nathan; Wanchoo, Lalit

    2015-01-01

    NASA's Earth Science Data and Information System (ESDIS) Project has been processing information for the registration of Digital Object Identifiers (DOIs) for the last five years, and an automated system has been in operation for the last two years. The ESDIS DOI registration system has registered over 2000 DOIs, with over 1000 DOIs held in reserve until all required information has been collected. By working towards the goal of assigning DOIs to the 8000+ data collections under its management, ESDIS has taken the first step towards facilitating the use of data citations with those products. Jeanne Behnke, ESDIS Deputy Project Manager, has reviewed and approved the poster.

  2. Monitoring gypsy moth defoliation by applying change detection techniques to Landsat imagery

    NASA Technical Reports Server (NTRS)

    Williams, D. L.; Stauffer, M. L.

    1978-01-01

    The overall objective of a research effort at NASA's Goddard Space Flight Center is to develop and evaluate digital image processing techniques that will facilitate the assessment of the intensity and spatial distribution of forest insect damage in Northeastern U.S. forests using remotely sensed data from Landsats 1, 2 and C. Automated change detection techniques are presently being investigated as a method of isolating the areas of change in the forest canopy resulting from pest outbreaks. In order to follow the change detection approach, Landsat scene correction and overlay capabilities are utilized to provide multispectral/multitemporal image files of 'defoliation' and 'nondefoliation' forest stand conditions.
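
    A bare-bones version of the change-detection idea, flagging pixels whose vegetation signal drops sharply between two co-registered dates, might look like the sketch below. The arrays, band combination and threshold are hypothetical placeholders; the actual investigation used multispectral/multitemporal Landsat image files.

```python
# Toy two-date change detection: flag pixels with a large drop in a vegetation index.
import numpy as np

def ndvi(red, nir):
    return (nir - red) / (nir + red + 1e-9)

rng = np.random.default_rng(1)
red_t1 = rng.uniform(0.05, 0.15, (64, 64))
nir_t1 = rng.uniform(0.40, 0.60, (64, 64))
red_t2, nir_t2 = red_t1.copy(), nir_t1.copy()
nir_t2[20:40, 20:40] = 0.18          # simulated defoliation patch at date 2

change = ndvi(red_t1, nir_t1) - ndvi(red_t2, nir_t2)
defoliated = change > 0.3             # arbitrary threshold for "significant canopy loss"
print("flagged pixels:", int(defoliated.sum()))
```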

  3. CF Metadata Conventions: Founding Principles, Governance, and Future Directions

    NASA Astrophysics Data System (ADS)

    Taylor, K. E.

    2016-12-01

    The CF Metadata Conventions define attributes that promote sharing of climate and forecasting data and facilitate automated processing by computers. The development, maintenance, and evolution of the conventions have mainly been provided by voluntary community contributions. Nevertheless, an organizational framework has been established, which relies on established rules and web-based discussion to ensure smooth (but relatively efficient) evolution of the standard to accommodate new types of data. The CF standard has been essential to the success of high-profile internationally-coordinated modeling activities (e.g., the Coupled Model Intercomparison Project). A summary of CF's founding principles and the prospects for its future evolution will be discussed.
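
    As an example of the kind of metadata CF prescribes, the sketch below writes a netCDF variable carrying standard_name and units attributes plus a global Conventions attribute, using the netCDF4 Python library. The file name, variable choice and data values are placeholders, not part of the CF standard itself.

```python
# Minimal CF-style metadata on a netCDF file (placeholder file name and values).
import numpy as np
from netCDF4 import Dataset

with Dataset("example_cf.nc", "w") as ds:
    ds.Conventions = "CF-1.8"
    ds.title = "Illustrative CF-compliant surface air temperature"

    ds.createDimension("time", None)
    t = ds.createVariable("time", "f8", ("time",))
    t.units = "days since 2000-01-01 00:00:00"
    t.calendar = "standard"
    t[:] = np.arange(3)

    tas = ds.createVariable("tas", "f4", ("time",))
    tas.standard_name = "air_temperature"   # controlled CF standard name
    tas.units = "K"
    tas[:] = [288.1, 288.4, 287.9]
```

    It is precisely these controlled names and units that allow downstream tools to process such files automatically, without human interpretation of ad hoc metadata.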

  4. Drone inflight mixing of biochemical samples.

    PubMed

    Katariya, Mayur; Chung, Dwayne Chung Kim; Minife, Tristan; Gupta, Harshit; Zahidi, Alifa Afiah Ahmad; Liew, Oi Wah; Ng, Tuck Wah

    2018-03-15

    Autonomous systems for sample transport to the laboratory for analysis can be improved in terms of timeliness, cost and error mitigation in the pre-analytical testing phase. Drones have been reported for outdoor sample transport but incorporating devices on them to attain homogenous mixing of reagents during flight to enhance sample processing timeliness is limited by payload issues. It is shown here that flipping maneuvers conducted with quadcopters are able to facilitate complete and gentle mixing. This capability incorporated during automated sample transport serves to address an important factor contributing to pre-analytical variability which ultimately impacts on test result reliability. Copyright © 2018 Elsevier Inc. All rights reserved.

  5. Resource allocation planning with international components

    NASA Technical Reports Server (NTRS)

    Burke, Gene; Durham, Ralph; Leppla, Frank; Porter, David

    1993-01-01

    Dumas, Briggs, Reid and Smith (1989) describe the need for identifying mutually acceptable methodologies for developing standard agreements for the exchange of tracking time or facility use among international components. One possible starting point is the current process used at the Jet Propulsion Laboratory (JPL) in planning the use of tracking resources. While there is a significant promise of better resource utilization by international cooperative agreements, there is a serious challenge to provide convenient user participation given the separate project and network locations. Coordination among users and facility providers will require a more decentralized communication process and a wider variety of automated planning tools to help users find potential exchanges. This paper provides a framework in which international cooperation in the utilization of ground based space communication systems can be facilitated.

  6. Understanding human management of automation errors

    PubMed Central

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  7. Understanding human management of automation errors.

    PubMed

    McBride, Sara E; Rogers, Wendy A; Fisk, Arthur D

    2014-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance.

  8. The Holistic Targeting (HOT) Methodology as the Means to Improve Information Operations (IO) Target Development and Prioritization

    DTIC Science & Technology

    2008-09-01

    the use of compendium software to facilitate targeting problem understanding and the network analysis tool, Palantir, as an efficient and tailored semi-automated means to...OBJECTIVES USING COMPENDIUM SOFTWARE...HOT TARGET PRIORITIZATION AND DEVELOPMENT USING PALANTIR SOFTWARE

  9. A new highly automated sputter equipment for in situ investigation of deposition processes with synchrotron radiation.

    PubMed

    Döhrmann, Ralph; Botta, Stephan; Buffet, Adeline; Santoro, Gonzalo; Schlage, Kai; Schwartzkopf, Matthias; Bommel, Sebastian; Risch, Johannes F H; Mannweiler, Roman; Brunner, Simon; Metwalli, Ezzeldin; Müller-Buschbaum, Peter; Roth, Stephan V

    2013-04-01

    HASE (Highly Automated Sputter Equipment) is a new mobile setup developed to investigate deposition processes with synchrotron radiation. HASE is based on an ultra-high vacuum sputter deposition chamber equipped with an in-vacuum sample pick-and-place robot. This enables a fast and reliable sample change without breaking the vacuum conditions and helps to save valuable measurement time, which is required for experiments at synchrotron sources like PETRA III at DESY. An advantageous arrangement of several sputter guns, mounted on a rotative flange, gives the possibility to sputter under different deposition angles or to sputter different materials on the same substrate. The chamber is also equipped with a modular sample stage, which allows for the integration of different sample environments, such as a sample heating and cooling device. The design of HASE is unique in the flexibility. The combination of several different sputtering methods like standard deposition, glancing angle deposition, and high pressure sputter deposition combined with heating and cooling possibilities of the sample, the large exit windows, and the degree of automation facilitate many different grazing incidence X-ray scattering experiments, such as grazing incidence small and wide angle X-ray scattering, in one setup. In this paper we describe in detail the design and the performance of the new equipment and present the installation of the HASE apparatus at the Micro and Nano focus X-ray Scattering beamline (MiNaXS) at PETRA III. Furthermore, we describe the measurement options and present some selected results. The HASE setup has been successfully commissioned and is now available for users.

  10. A new highly automated sputter equipment for in situ investigation of deposition processes with synchrotron radiation

    NASA Astrophysics Data System (ADS)

    Döhrmann, Ralph; Botta, Stephan; Buffet, Adeline; Santoro, Gonzalo; Schlage, Kai; Schwartzkopf, Matthias; Bommel, Sebastian; Risch, Johannes F. H.; Mannweiler, Roman; Brunner, Simon; Metwalli, Ezzeldin; Müller-Buschbaum, Peter; Roth, Stephan V.

    2013-04-01

    HASE (Highly Automated Sputter Equipment) is a new mobile setup developed to investigate deposition processes with synchrotron radiation. HASE is based on an ultra-high vacuum sputter deposition chamber equipped with an in-vacuum sample pick-and-place robot. This enables a fast and reliable sample change without breaking the vacuum conditions and helps to save valuable measurement time, which is required for experiments at synchrotron sources like PETRA III at DESY. An advantageous arrangement of several sputter guns, mounted on a rotative flange, gives the possibility to sputter under different deposition angles or to sputter different materials on the same substrate. The chamber is also equipped with a modular sample stage, which allows for the integration of different sample environments, such as a sample heating and cooling device. The design of HASE is unique in the flexibility. The combination of several different sputtering methods like standard deposition, glancing angle deposition, and high pressure sputter deposition combined with heating and cooling possibilities of the sample, the large exit windows, and the degree of automation facilitate many different grazing incidence X-ray scattering experiments, such as grazing incidence small and wide angle X-ray scattering, in one setup. In this paper we describe in detail the design and the performance of the new equipment and present the installation of the HASE apparatus at the Micro and Nano focus X-ray Scattering beamline (MiNaXS) at PETRA III. Furthermore, we describe the measurement options and present some selected results. The HASE setup has been successfully commissioned and is now available for users.

  11. Planning bioinformatics workflows using an expert system.

    PubMed

    Chen, Xiaoling; Chang, Jeffrey T

    2017-04-15

    Bioinformatic analyses are becoming formidably more complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and thus are difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY) that includes a knowledge base where the capabilities of bioinformatics software are explicitly and formally encoded. BETSY is a backwards-chaining rule-based expert system comprised of a data model that can capture the richness of biological data, and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. https://github.com/jefftc/changlab. jeffrey.t.chang@uth.tmc.edu. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
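
    The backwards-chaining idea, working from a desired output back through rules whose outputs satisfy it, can be sketched in a few lines. The rule set and data-type names below are an invented miniature, not BETSY's knowledge base.

```python
# Tiny backwards-chaining planner: work back from a goal data type to raw inputs.
# The rules are an invented miniature, not BETSY's actual knowledge base.
RULES = {
    # produced type: (tool, required input types)
    "aligned_reads":     ("aligner",    ["fastq", "reference_genome"]),
    "expression_matrix": ("quantifier", ["aligned_reads", "gene_annotation"]),
    "diff_expression":   ("de_test",    ["expression_matrix", "sample_groups"]),
}

def plan(goal, available, steps=None):
    """Return an ordered list of (tool, produces) steps that yields `goal`."""
    steps = [] if steps is None else steps
    if goal in available:
        return steps
    if goal not in RULES:
        raise ValueError(f"no rule produces {goal!r}")
    tool, inputs = RULES[goal]
    for inp in inputs:
        plan(inp, available, steps)   # recurse until only available inputs remain
        available.add(inp)
    steps.append((tool, goal))
    return steps

workflow = plan("diff_expression",
                {"fastq", "reference_genome", "gene_annotation", "sample_groups"})
print(workflow)   # [('aligner', 'aligned_reads'), ('quantifier', ...), ('de_test', ...)]
```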

  12. Planning bioinformatics workflows using an expert system

    PubMed Central

    Chen, Xiaoling; Chang, Jeffrey T.

    2017-01-01

    Abstract Motivation: Bioinformatic analyses are becoming formidably more complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and thus are difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. Results: To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY) that includes a knowledge base where the capabilities of bioinformatics software is explicitly and formally encoded. BETSY is a backwards-chaining rule-based expert system comprised of a data model that can capture the richness of biological data, and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. Availability and Implementation: https://github.com/jefftc/changlab Contact: jeffrey.t.chang@uth.tmc.edu PMID:28052928

  13. Automating approximate Bayesian computation by local linear regression.

    PubMed

    Thornton, Kevin R

    2009-07-07

    In several biological contexts, parameter inference often relies on computationally-intensive techniques. "Approximate Bayesian Computation", or ABC, methods based on summary statistics have become increasingly popular. A particular flavor of ABC based on using a linear regression to approximate the posterior distribution of the parameters, conditional on the summary statistics, is computationally appealing, yet no standalone tool exists to automate the procedure. Here, I describe a program to implement the method. The software package ABCreg implements the local linear-regression approach to ABC. The advantages are: 1. The code is standalone, and fully-documented. 2. The program will automatically process multiple data sets, and create unique output files for each (which may be processed immediately in R), facilitating the testing of inference procedures on simulated data, or the analysis of multiple data sets. 3. The program implements two different transformation methods for the regression step. 4. Analysis options are controlled on the command line by the user, and the program is designed to output warnings for cases where the regression fails. 5. The program does not depend on any particular simulation machinery (coalescent, forward-time, etc.), and therefore is a general tool for processing the results from any simulation. 6. The code is open-source, and modular. Examples of applying the software to empirical data from Drosophila melanogaster, and testing the procedure on simulated data, are shown. In practice, ABCreg simplifies implementing ABC based on local-linear regression.
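
    The core of the local linear-regression adjustment can be sketched as follows: accept simulated parameters whose summaries fall close to the observed summary, regress the parameters on the summary differences, and subtract the fitted trend. Kernel weighting and the transformation options that ABCreg supports are omitted here for brevity; this is a generic, unweighted illustration on a toy model, not the ABCreg code.

```python
# Generic rejection-ABC with a local linear-regression adjustment (unweighted sketch).
import numpy as np

rng = np.random.default_rng(42)

def simulate_summary(theta, n=100):
    """Toy model: the summary statistic is the sample mean of N(theta, 1) draws."""
    return rng.normal(theta, 1.0, size=n).mean()

observed_s = 2.0
prior_draws = rng.uniform(-5, 5, size=20_000)
summaries = np.array([simulate_summary(t) for t in prior_draws])

# Rejection step: keep the draws whose summaries are closest to the observation.
dist = np.abs(summaries - observed_s)
keep = dist <= np.quantile(dist, 0.02)
theta_acc, s_acc = prior_draws[keep], summaries[keep]

# Regression adjustment: theta ~ a + b * (s - s_obs); shift accepted draws to s_obs.
X = np.column_stack([np.ones_like(s_acc), s_acc - observed_s])
coef, *_ = np.linalg.lstsq(X, theta_acc, rcond=None)
theta_adj = theta_acc - coef[1] * (s_acc - observed_s)

print("posterior mean (rejection):", theta_acc.mean())
print("posterior mean (adjusted): ", theta_adj.mean())
```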

  14. MilxXplore: a web-based system to explore large imaging datasets.

    PubMed

    Bourgeat, P; Dore, V; Villemagne, V L; Rowe, C C; Salvado, O; Fripp, J

    2013-01-01

    As large-scale medical imaging studies are becoming more common, there is an increasing reliance on automated software to extract quantitative information from these images. As the size of the cohorts keeps increasing with large studies, there is also a need for tools that allow results from automated image processing and analysis to be presented in a way that enables fast and efficient quality checking, tagging and reporting on cases in which automatic processing failed or was problematic. MilxXplore is an open source visualization platform, which provides an interface to navigate and explore imaging data in a web browser, giving the end user the opportunity to perform quality control and reporting in a user friendly, collaborative and efficient way. Compared to existing software solutions that often provide an overview of the results at the subject's level, MilxXplore pools the results of individual subjects and time points together, allowing easy and efficient navigation and browsing through the different acquisitions of a subject over time, and comparing the results against the rest of the population. MilxXplore is fast, flexible and allows remote quality checks of processed imaging data, facilitating data sharing and collaboration across multiple locations, and can be easily integrated into a cloud computing pipeline. With the growing trend of open data and open science, such a tool will become increasingly important to share and publish results of imaging analysis.

  15. A universal method for automated gene mapping

    PubMed Central

    Zipperlen, Peder; Nairz, Knud; Rimann, Ivo; Basler, Konrad; Hafen, Ernst; Hengartner, Michael; Hajnal, Alex

    2005-01-01

    Small insertions or deletions (InDels) constitute a ubiquitous class of sequence polymorphisms found in eukaryotic genomes. Here, we present an automated high-throughput genotyping method that relies on the detection of fragment-length polymorphisms (FLPs) caused by InDels. The protocol utilizes standard sequencers and genotyping software. We have established genome-wide FLP maps for both Caenorhabditis elegans and Drosophila melanogaster that facilitate genetic mapping with a minimum of manual input and at comparatively low cost. PMID:15693948

  16. PANDA: a pipeline toolbox for analyzing brain diffusion images.

    PubMed

    Cui, Zaixu; Zhong, Suyu; Xu, Pengfei; He, Yong; Gong, Gaolang

    2013-01-01

    Diffusion magnetic resonance imaging (dMRI) is widely used in both scientific research and clinical practice in in-vivo studies of the human brain. While a number of post-processing packages have been developed, fully automated processing of dMRI datasets remains challenging. Here, we developed a MATLAB toolbox named "Pipeline for Analyzing braiN Diffusion imAges" (PANDA) for fully automated processing of brain diffusion images. The processing modules of a few established packages, including FMRIB Software Library (FSL), Pipeline System for Octave and Matlab (PSOM), Diffusion Toolkit and MRIcron, were employed in PANDA. Using any number of raw dMRI datasets from different subjects, in either DICOM or NIfTI format, PANDA can automatically perform a series of steps to process DICOM/NIfTI to diffusion metrics [e.g., fractional anisotropy (FA) and mean diffusivity (MD)] that are ready for statistical analysis at the voxel-level, the atlas-level and the Tract-Based Spatial Statistics (TBSS)-level and can finish the construction of anatomical brain networks for all subjects. In particular, PANDA can process different subjects in parallel, using multiple cores either in a single computer or in a distributed computing environment, thus greatly reducing the time cost when dealing with a large number of datasets. In addition, PANDA has a friendly graphical user interface (GUI), allowing the user to be interactive and to adjust the input/output settings, as well as the processing parameters. As an open-source package, PANDA is freely available at http://www.nitrc.org/projects/panda/. This novel toolbox is expected to substantially simplify the image processing of dMRI datasets and facilitate human structural connectome studies.
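
    The subject-level parallelism described above can be illustrated generically with Python's multiprocessing: each subject directory is handed to a worker that would run the per-subject pipeline steps. The process_subject body and the directory paths below are placeholders, not PANDA's actual modules.

```python
# Generic subject-level parallelism (placeholder pipeline, not PANDA's modules).
from multiprocessing import Pool
from pathlib import Path

def process_subject(subject_dir: str) -> str:
    # Placeholder for the per-subject steps (convert, preprocess, compute FA/MD, ...).
    # A real pipeline would invoke the individual processing tools here.
    return f"{Path(subject_dir).name}: done"

if __name__ == "__main__":
    subjects = [f"/data/study/sub-{i:02d}" for i in range(1, 9)]  # hypothetical paths
    with Pool(processes=4) as pool:                                # 4 subjects in parallel
        for result in pool.map(process_subject, subjects):
            print(result)
```

    Because the subjects are independent, the wall-clock time scales roughly with the number of subjects divided by the number of workers, which is the benefit the toolbox exploits.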

  17. PANDA: a pipeline toolbox for analyzing brain diffusion images

    PubMed Central

    Cui, Zaixu; Zhong, Suyu; Xu, Pengfei; He, Yong; Gong, Gaolang

    2013-01-01

    Diffusion magnetic resonance imaging (dMRI) is widely used in both scientific research and clinical practice in in-vivo studies of the human brain. While a number of post-processing packages have been developed, fully automated processing of dMRI datasets remains challenging. Here, we developed a MATLAB toolbox named “Pipeline for Analyzing braiN Diffusion imAges” (PANDA) for fully automated processing of brain diffusion images. The processing modules of a few established packages, including FMRIB Software Library (FSL), Pipeline System for Octave and Matlab (PSOM), Diffusion Toolkit and MRIcron, were employed in PANDA. Using any number of raw dMRI datasets from different subjects, in either DICOM or NIfTI format, PANDA can automatically perform a series of steps to process DICOM/NIfTI to diffusion metrics [e.g., fractional anisotropy (FA) and mean diffusivity (MD)] that are ready for statistical analysis at the voxel-level, the atlas-level and the Tract-Based Spatial Statistics (TBSS)-level and can finish the construction of anatomical brain networks for all subjects. In particular, PANDA can process different subjects in parallel, using multiple cores either in a single computer or in a distributed computing environment, thus greatly reducing the time cost when dealing with a large number of datasets. In addition, PANDA has a friendly graphical user interface (GUI), allowing the user to be interactive and to adjust the input/output settings, as well as the processing parameters. As an open-source package, PANDA is freely available at http://www.nitrc.org/projects/panda/. This novel toolbox is expected to substantially simplify the image processing of dMRI datasets and facilitate human structural connectome studies. PMID:23439846

  18. Protocol: high throughput silica-based purification of RNA from Arabidopsis seedlings in a 96-well format

    PubMed Central

    2011-01-01

    The increasing popularity of systems-based approaches to plant research has resulted in a demand for high throughput (HTP) methods to be developed. RNA extraction from multiple samples in an experiment is a significant bottleneck in performing systems-level genomic studies. Therefore we have established a high throughput method of RNA extraction from Arabidopsis thaliana to facilitate gene expression studies in this widely used plant model. We present optimised manual and automated protocols for the extraction of total RNA from 9-day-old Arabidopsis seedlings in a 96 well plate format using silica membrane-based methodology. Consistent and reproducible yields of high quality RNA are isolated averaging 8.9 μg total RNA per sample (~20 mg plant tissue). The purified RNA is suitable for subsequent qPCR analysis of the expression of over 500 genes in triplicate from each sample. Using the automated procedure, 192 samples (2 × 96 well plates) can easily be fully processed (samples homogenised, RNA purified and quantified) in less than half a day. Additionally we demonstrate that plant samples can be stored in RNAlater at -20°C (but not 4°C) for 10 months prior to extraction with no significant effect on RNA yield or quality. Additionally, disrupted samples can be stored in the lysis buffer at -20°C for at least 6 months prior to completion of the extraction procedure providing a flexible sampling and storage scheme to facilitate complex time series experiments. PMID:22136293

  19. Protocol: high throughput silica-based purification of RNA from Arabidopsis seedlings in a 96-well format.

    PubMed

    Salvo-Chirnside, Eliane; Kane, Steven; Kerr, Lorraine E

    2011-12-02

    The increasing popularity of systems-based approaches to plant research has resulted in a demand for high throughput (HTP) methods to be developed. RNA extraction from multiple samples in an experiment is a significant bottleneck in performing systems-level genomic studies. Therefore we have established a high throughput method of RNA extraction from Arabidopsis thaliana to facilitate gene expression studies in this widely used plant model. We present optimised manual and automated protocols for the extraction of total RNA from 9-day-old Arabidopsis seedlings in a 96 well plate format using silica membrane-based methodology. Consistent and reproducible yields of high quality RNA are isolated averaging 8.9 μg total RNA per sample (~20 mg plant tissue). The purified RNA is suitable for subsequent qPCR analysis of the expression of over 500 genes in triplicate from each sample. Using the automated procedure, 192 samples (2 × 96 well plates) can easily be fully processed (samples homogenised, RNA purified and quantified) in less than half a day. Additionally we demonstrate that plant samples can be stored in RNAlater at -20°C (but not 4°C) for 10 months prior to extraction with no significant effect on RNA yield or quality. Additionally, disrupted samples can be stored in the lysis buffer at -20°C for at least 6 months prior to completion of the extraction procedure providing a flexible sampling and storage scheme to facilitate complex time series experiments.

  20. An Experimental Characterization System for Deep Ultra-Violet (UV) Photoresists

    NASA Astrophysics Data System (ADS)

    Drako, Dean M.; Partlo, William N.; Oldham, William G.; Neureuther, Andrew R.

    1989-08-01

    A versatile system designed specifically for experimental automated photoresist characterization has been constructed utilizing an excimer laser source for exposure at 248 nm. The system was assembled, as much as possible, from commercially available components in order to facilitate its replication. The software and hardware are completely documented in a University of California-Berkeley Engineering Research Lab Memo. An IBM PC-AT compatible computer controls an excimer laser, operates a Fourier Transform Infrared (FTIR) Spectrometer, measures and records the energy of each laser pulse (incident, reflected, and transmitted), opens and closes shutters, and operates two linear stages for sample movement. All operations (except FTIR data reduction) are managed by a control program written in the "C" language. The system is capable of measuring total exposure dose, performing bleaching measurements, creating and recording exposure pulse sequences, and generating exposure patterns suitable for multiple channel monitoring of the development. The total exposure energy, energy per pulse, and pulse rate are selectable over a wide range. The system contains an in-situ Fourier Transform Infrared Spectrometer for qualitative and quantitative analysis of the photoresist baking and exposure processes (baking is not done in-situ). FTIR may be performed in transmission or reflection. The FTIR data will form the basis of comprehensive multi-state resist models. The system's versatility facilitates the development of new automated and repeatable experiments. Simple controlling software, utilizing the provided interface sub-routines, can be written to control new experiments and collect data.

  1. Rapid automated classification of anesthetic depth levels using GPU based parallelization of neural networks.

    PubMed

    Peker, Musa; Şen, Baha; Gürüler, Hüseyin

    2015-02-01

    The effect of anesthesia on the patient is referred to as the depth of anesthesia. Rapid classification of the appropriate depth level of anesthesia is a matter of great importance in surgical operations. Similarly, accelerating classification algorithms is important for the rapid solution of problems in the field of biomedical signal processing. However, numerous time-consuming mathematical operations are required during the training and testing stages of classification algorithms, especially in neural networks. In this study, to accelerate the process, a parallel programming and computing platform (NVIDIA CUDA), which facilitates dramatic increases in computing performance by harnessing the power of the graphics processing unit (GPU), was utilized. The system was employed to detect the anesthetic depth level on a related electroencephalogram (EEG) data set. This dataset is rather complex and large. Moreover, distinguishing more anesthetic levels with a rapid response is critical in anesthesia. The proposed parallelization method yielded highly accurate classification results in less time.
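
    The general pattern of moving a neural-network classifier onto the GPU can be sketched with PyTorch. This is not the authors' CUDA implementation; the network, features and labels below are random placeholders standing in for EEG-derived data, and the code falls back to the CPU when no GPU is present.

```python
# Generic GPU-accelerated classifier training loop (PyTorch sketch, random stand-in data).
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

x = torch.randn(512, 64, device=device)          # 512 segments x 64 EEG-derived features
y = torch.randint(0, 5, (512,), device=device)   # 5 hypothetical depth-of-anesthesia levels

model = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 5)).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(50):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

accuracy = (model(x).argmax(dim=1) == y).float().mean().item()
print(f"device={device}, training accuracy={accuracy:.2f}")
```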

  2. Packet telemetry and packet telecommand - The new generation of spacecraft data handling techniques

    NASA Technical Reports Server (NTRS)

    Hooke, A. J.

    1983-01-01

    Because of the rising costs and reduced reliability associated with customized spacecraft and ground network hardware and software, the standardized Packet Telemetry and Packet Telecommand concepts are emerging as viable alternatives. Within each concept, autonomous packets of data are created within ground and space application processes through the use of formatting techniques and are switched end-to-end through the space data network to their destination application processes through the use of standard transfer protocols. Because the intermediate data networks can be designed to be completely mission-independent, this approach may facilitate a high degree of automation and interoperability. The goals of the Consultative Committee for Space Data Systems are the adoption of the Packet Telemetry concept as an international guideline for future space telemetry formatting, and the advancement of the NASA-ESA Working Group's Packet Telecommand concept to a level of maturity parallel to that of Packet Telemetry. Both the Packet Telemetry and Packet Telecommand concepts are reviewed.
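
    To make the packet idea concrete, the sketch below packs an application-process identifier, a sequence count and a data field into a small self-describing packet, loosely modeled on the CCSDS space-packet primary header that grew out of this standardization work. The exact field layout and values here are illustrative rather than normative.

```python
# Illustrative source packet: 6-byte primary header (CCSDS-style layout) + data field.
import struct

def build_packet(apid: int, seq_count: int, data: bytes) -> bytes:
    version, pkt_type, sec_hdr_flag = 0, 0, 0          # telemetry packet, no secondary header
    seq_flags = 0b11                                   # unsegmented user data
    word1 = (version << 13) | (pkt_type << 12) | (sec_hdr_flag << 11) | (apid & 0x7FF)
    word2 = (seq_flags << 14) | (seq_count & 0x3FFF)
    word3 = len(data) - 1                              # packet data length field
    return struct.pack(">HHH", word1, word2, word3) + data

packet = build_packet(apid=0x1AB, seq_count=42, data=b"\x01\x02\x03\x04")
print(packet.hex())   # the header is all a mission-independent network needs to route it
```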

  3. Integrated Structural Analysis and Test Program

    NASA Technical Reports Server (NTRS)

    Kaufman, Daniel

    2005-01-01

    An integrated structural-analysis and structure-testing computer program is being developed in order to: Automate repetitive processes in testing and analysis; Accelerate pre-test analysis; Accelerate reporting of tests; Facilitate planning of tests; Improve execution of tests; Create a vibration, acoustics, and shock test database; and Integrate analysis and test data. The software package includes modules pertaining to sinusoidal and random vibration, shock and time replication, acoustics, base-driven modal survey, and mass properties and static/dynamic balance. The program is commanded by use of ActiveX controls. There is minimal need to generate command lines. Analysis or test files are selected by opening a Windows Explorer display. After selecting the desired input file, the program goes to a so-called analysis data process or test data process, depending on the type of input data. The status of the process is given by a Windows status bar, and when processing is complete, the data are reported in graphical, tabular, and matrix form.

  4. Advanced I&C for Fault-Tolerant Supervisory Control of Small Modular Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, Daniel G.

    In this research, we have developed a supervisory control approach to enable automated control of SMRs. By design the supervisory control system has a hierarchical, interconnected, adaptive control architecture. A considerable advantage of this architecture is that it allows subsystems to communicate at different/finer granularity, facilitates monitoring of processes at the modular and plant levels, and enables supervisory control. We have investigated the deployment of automation, monitoring, and data collection technologies to enable operation of multiple SMRs. Each unit's controller collects and transfers information from local loops and optimizes that unit's parameters. Information is passed from each SMR unit controller to the supervisory controller, which supervises the actions of the SMR units and manages plant processes. The information processed at the supervisory level will provide operators the information needed for reactor, unit, and plant operation. In conjunction with the supervisory effort, we have investigated techniques for fault-tolerant networks, over which information is transmitted between local loops and the supervisory controller to maintain a safe level of operational normalcy in the presence of anomalies. The fault-tolerance of the supervisory control architecture, the network that supports it, and the impact of fault-tolerance on multi-unit SMR plant control has been a second focus of this research. To this end, we have investigated the deployment of advanced automation, monitoring, and data collection and communications technologies to enable operation of multiple SMRs. We have created a fault-tolerant multi-unit SMR supervisory controller that collects and transfers information from local loops, supervises their actions, and adaptively optimizes the controller parameters. The goal of this research has been to develop the methodologies and procedures for fault-tolerant supervisory control of small modular reactors. To achieve this goal, we have identified the following objectives, which form an ordered approach to the research: I) development of a supervisory digital I&C system; II) fault-tolerance of the supervisory control architecture; III) automated decision making and online monitoring.

  5. Automation of Space Processing Applications Shuttle payloads

    NASA Technical Reports Server (NTRS)

    Crosmer, W. E.; Neau, O. T.; Poe, J.

    1975-01-01

    The Space Processing Applications Program is examining the effect of weightlessness on key industrial materials processes, such as crystal growth, fine-grain casting of metals, and production of unique and ultra-pure glasses. Because of safety and in order to obtain optimum performance, some of these processes lend themselves to automation. Automation can increase the number of potential Space Shuttle flight opportunities and increase the overall productivity of the program. Five automated facility design concepts and overall payload combinations incorporating these facilities are presented.

  6. Transportable educational programs for scientific and technical professionals: More effective utilization of automated scientific and technical data base systems

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D.

    1987-01-01

    This grant final report executive summary documents a major, long-term program addressing innovative educational issues associated with the development, administration, evaluation, and widespread distribution of transportable educational programs that help scientists and engineers increase their knowledge of, and facilitate their utilization of, automated scientific and technical information storage and retrieval systems. This educational program is of very broad scope, being targeted at Colleges of Engineering and Colleges of Physical Sciences at a large number of colleges and universities throughout the United States. The educational program is designed to incorporate extensive hands-on, interactive usage of the NASA RECON system and is supported by a number of microcomputer-based software systems to facilitate the delivery and usage of the educational course materials developed as part of the program.

  7. Development of a plan for automating integrated circuit processing

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The operations analysis and equipment evaluations pertinent to the design of an automated production facility capable of manufacturing beam-lead CMOS integrated circuits are reported. The overall plan shows approximate cost of major equipment, production rate and performance capability, flexibility, and special maintenance requirements. Direct computer control is compared with supervisory-mode operations. The plan is limited to wafer processing operations from the starting wafer to the finished beam-lead die after separation etching. The work already accomplished in implementing various automation schemes, and the types of equipment readily available for automation, are described. The plan is general, so that both small shops and large production units can benefit. Examples of major types of automated processing machines are shown to illustrate the general concepts of automated wafer processing.

  8. The Hyperspectral Imager for the Coastal Ocean (HICO): Sensor and Data Processing Overview

    DTIC Science & Technology

    2010-01-20

    ...backscattering coefficients, and others. Several of these software modules will be developed within the Automated Processing System (APS). NRL developed APS, which processes satellite data into ocean color data products; it is a collection of methods used for ocean color processing that provides the tools for the automated processing of satellite imagery [1]. These tools are in the process of ...

  9. 10 CFR 1017.28 - Processing on Automated Information Systems (AIS).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... UNCLASSIFIED CONTROLLED NUCLEAR INFORMATION, Physical Protection Requirements, § 1017.28 Processing on Automated Information Systems (AIS). UCNI may be processed or produced on any AIS that complies with the guidance in OMB...

  10. Automating Flood Hazard Mapping Methods for Near Real-time Storm Surge Inundation and Vulnerability Assessment

    NASA Astrophysics Data System (ADS)

    Weigel, A. M.; Griffin, R.; Gallagher, D.

    2015-12-01

    Storm surge has enough destructive power to damage buildings and infrastructure, erode beaches, and threaten human life across large geographic areas, hence posing the greatest threat of all the hurricane hazards. The United States Gulf of Mexico has proven vulnerable to hurricanes, as it has been hit by some of the most destructive hurricanes on record. With projected rises in sea level and increases in hurricane activity, there is a need to better understand the associated risks for disaster mitigation, preparedness, and response. GIS has become a critical tool in enhancing disaster planning, risk assessment, and emergency response by communicating spatial information through a multi-layer approach. However, there is a need for a near real-time method of identifying areas with a high risk of being impacted by storm surge. Research was conducted alongside Baron, a private industry weather enterprise, to facilitate automated modeling and visualization of storm surge inundation and vulnerability on a near real-time basis. This research successfully automated current flood hazard mapping techniques using a GIS framework written in a Python programming environment, and displayed resulting data through an Application Program Interface (API). Data used for this methodology included high resolution topography, NOAA Probabilistic Surge model outputs parsed from Rich Site Summary (RSS) feeds, and the NOAA census-tract-level Social Vulnerability Index (SoVI). The development process required extensive data processing and management to provide high resolution visualizations of potential flooding and population vulnerability in a timely manner. The accuracy of the developed methodology was assessed using Hurricane Isaac as a case study, which, through a USGS and NOAA partnership, contained ample data for statistical analysis. This research successfully created a fully automated, near real-time method for mapping high resolution storm surge inundation and vulnerability for the Gulf of Mexico, and improved the accuracy and resolution of the Probabilistic Storm Surge model.
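
    A minimal sketch of the core overlay step described above, assuming the probabilistic surge water level and the ground elevation are already co-registered grids referenced to the same vertical datum; the array names and depth threshold are illustrative, not the authors' code.

        import numpy as np

        def inundation_depth(surge_level_m, elevation_m, min_depth_m=0.1):
            """Flood depth (m) where the surge water level exceeds ground elevation."""
            depth = np.asarray(surge_level_m, float) - np.asarray(elevation_m, float)
            depth[depth < min_depth_m] = 0.0   # drop negative and negligible depths
            return depth

        # Toy 3x3 grids: a uniform 2 m surge over varying terrain.
        surge = np.full((3, 3), 2.0)
        terrain = np.array([[0.5, 1.0, 3.0],
                            [0.0, 2.5, 1.9],
                            [4.0, 0.2, 1.0]])
        print(inundation_depth(surge, terrain))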

  11. Highly multiplexed targeted proteomics using precise control of peptide retention time.

    PubMed

    Gallien, Sebastien; Peterman, Scott; Kiyonami, Reiko; Souady, Jamal; Duriez, Elodie; Schoen, Alan; Domon, Bruno

    2012-04-01

    Large-scale proteomics applications using SRM analysis on triple quadrupole mass spectrometers present new challenges for LC-MS/MS experimental design. Despite the automation of building large-scale LC-SRM methods, the increased number of targeted peptides can compromise the balance between sensitivity and selectivity. To accommodate large target numbers, time-scheduled SRM transition acquisition is performed. Previously published results demonstrated that incorporation of a well-characterized set of synthetic peptides enables chromatographic characterization of the elution profile for most endogenous peptides. We have extended this application of peptide trainer kits not only to build SRM methods but to facilitate real-time elution profile characterization that enables automated adjustment of the scheduled detection windows. Incorporation of dynamic retention time adjustment better facilitates targeted assays lasting several days without the need for constant supervision. This paper provides an overview of how the dynamic retention correction approach identifies and corrects for commonly observed LC variations. This adjustment dramatically improves robustness in targeted discovery experiments as well as routine quantification experiments. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
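
    A hypothetical sketch of the dynamic retention-time correction idea: fit a linear map from predicted to observed retention times of the reference (trainer) peptides, then recenter the scheduled acquisition windows of the targets. The function and variable names are assumptions for illustration, not the vendor implementation.

        import numpy as np

        def correct_schedule(ref_predicted, ref_observed, target_predicted, window_min=2.0):
            """Return (start, end) acquisition windows recentered on corrected RTs."""
            slope, intercept = np.polyfit(ref_predicted, ref_observed, 1)
            centers = slope * np.asarray(target_predicted, float) + intercept
            return [(c - window_min / 2, c + window_min / 2) for c in centers]

        # Reference peptides eluted about 0.5 min later than predicted in this run.
        refs_pred = [10.0, 20.0, 30.0, 40.0]
        refs_obs = [10.5, 20.6, 30.4, 40.5]
        print(correct_schedule(refs_pred, refs_obs, target_predicted=[15.0, 35.0]))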

  12. Self-organizing ontology of biochemically relevant small molecules

    PubMed Central

    2012-01-01

    Background The advent of high-throughput experimentation in biochemistry has led to the generation of vast amounts of chemical data, necessitating the development of novel analysis, characterization, and cataloguing techniques and tools. Recently, a movement to publicly release such data has advanced biochemical structure-activity relationship research, while providing new challenges, the biggest being the curation, annotation, and classification of this information to facilitate useful biochemical pattern analysis. Unfortunately, the human resources currently employed by the organizations supporting these efforts (e.g. ChEBI) are expanding linearly, while new useful scientific information is being released in a seemingly exponential fashion. Compounding this, currently existing chemical classification and annotation systems are not amenable to automated classification, formal and transparent chemical class definition axiomatization, facile class redefinition, or novel class integration, thus further limiting chemical ontology growth by necessitating human involvement in curation. Clearly, there is a need for the automation of this process, especially for novel chemical entities of biological interest. Results To address this, we present a formal framework based on Semantic Web technologies for the automatic design of chemical ontology which can be used for automated classification of novel entities. We demonstrate the automatic self-assembly of a structure-based chemical ontology based on 60 MeSH and 40 ChEBI chemical classes. This ontology is then used to classify 200 compounds with an accuracy of 92.7%. We extend these structure-based classes with molecular feature information and demonstrate the utility of our framework for classification of functionally relevant chemicals. Finally, we discuss an iterative approach that we envision for future biochemical ontology development. Conclusions We conclude that the proposed methodology can ease the burden of chemical data annotators and dramatically increase their productivity. We anticipate that the use of formal logic in our proposed framework will make chemical classification criteria more transparent to humans and machines alike and will thus facilitate predictive and integrative bioactivity model development. PMID:22221313

  13. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform

    DTIC Science & Technology

    2018-01-01

    ARL-TR-8270, JAN 2018, US Army Research Laboratory: An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform, by Kwok F Tom. Reporting period: 1 October 2016–30 September 2017.

  14. Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation

    DTIC Science & Technology

    2018-01-01

    ARL-TR-8284, JAN 2018, US Army Research Laboratory: Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation.

  15. HomeBank: An Online Repository of Daylong Child-Centered Audio Recordings

    PubMed Central

    VanDam, Mark; Warlaumont, Anne S.; Bergelson, Elika; Cristia, Alejandrina; Soderstrom, Melanie; De Palma, Paul; MacWhinney, Brian

    2017-01-01

    HomeBank is introduced here. It is a public, permanent, extensible, online database of daylong audio recorded in naturalistic environments. HomeBank serves two primary purposes. First, it is a repository for raw audio and associated files: one database requires special permissions, and another redacted database allows unrestricted public access. Associated files include metadata such as participant demographics and clinical diagnostics, automated annotations, and human-generated transcriptions and annotations. Many recordings use the child-perspective LENA recorders (LENA Research Foundation, Boulder, Colorado, United States), but various recordings and metadata can be accommodated. The HomeBank database can have both vetted and unvetted recordings, with different levels of accessibility. Additionally, HomeBank is an open repository for processing and analysis tools for HomeBank or similar data sets. HomeBank is flexible for users and contributors, making primary data available to researchers, especially those in child development, linguistics, and audio engineering. HomeBank facilitates researchers’ access to large-scale data and tools, linking the acoustic, auditory, and linguistic characteristics of children’s environments with a variety of variables including socioeconomic status, family characteristics, language trajectories, and disorders. Automated processing applied to daylong home audio recordings is now becoming widely used in early intervention initiatives, helping parents to provide richer speech input to at-risk children. PMID:27111272

  16. GlycoExtractor: a web-based interface for high throughput processing of HPLC-glycan data.

    PubMed

    Artemenko, Natalia V; Campbell, Matthew P; Rudd, Pauline M

    2010-04-05

    Recently, an automated high-throughput HPLC platform has been developed that can be used to fully sequence and quantify low concentrations of N-linked sugars released from glycoproteins, supported by an experimental database (GlycoBase) and analytical tools (autoGU). However, commercial packages that support the operation of HPLC instruments and data storage lack platforms for the extraction of large volumes of data. The lack of resources and agreed formats in glycomics is now a major limiting factor that restricts the development of bioinformatic tools and automated workflows for high-throughput HPLC data analysis. GlycoExtractor is a web-based tool that interfaces with a commercial HPLC database/software solution to facilitate the extraction of large volumes of processed glycan profile data (peak number, peak areas, and glucose unit values). The tool allows the user to export a series of sample sets to a set of file formats (XML, JSON, and CSV) rather than a collection of disconnected files. This approach not only reduces the amount of manual refinement required to export data into a suitable format for data analysis but also opens the field to new approaches for high-throughput data interpretation and storage, including biomarker discovery and validation and monitoring of online bioprocessing conditions for next generation biotherapeutics.
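
    An illustrative sketch (not the GlycoExtractor code) of the kind of export it automates: writing processed glycan peaks, here peak number, peak area, and glucose unit (GU) value, to JSON and CSV in one pass rather than as disconnected per-sample files.

        import csv, json

        peaks = [
            {"sample": "S1", "peak": 1, "area": 1520.3, "gu": 5.92},
            {"sample": "S1", "peak": 2, "area": 980.1, "gu": 6.78},
            {"sample": "S2", "peak": 1, "area": 1433.7, "gu": 5.90},
        ]

        with open("peaks.json", "w") as fh:              # machine-readable export
            json.dump(peaks, fh, indent=2)

        with open("peaks.csv", "w", newline="") as fh:   # spreadsheet-friendly export
            writer = csv.DictWriter(fh, fieldnames=["sample", "peak", "area", "gu"])
            writer.writeheader()
            writer.writerows(peaks)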

  17. EVALUATING HYDROLOGICAL RESPONSE TO ...

    EPA Pesticide Factsheets

    Studies of future management and policy options based on different assumptions provide a mechanism to examine possible outcomes and especially their likely benefits or consequences. Planning and assessment in land and water resource management are evolving toward complex, spatially explicit regional assessments. These problems have to be addressed with distributed models that can compute runoff and erosion at different spatial and temporal scales. The extensive data requirements and the difficult task of building input parameter files, however, have long been an obstacle to the timely and cost-effective use of such complex models by resource managers. The U.S. EPA Landscape Ecology Branch, in collaboration with the USDA-ARS Southwest Watershed Research Center, has developed a geographic information system (GIS) tool to facilitate this process. A GIS provides the framework within which spatially distributed data are collected and used to prepare model input files and evaluate model results. The Automated Geospatial Watershed Assessment (AGWA) tool uses widely available standardized spatial datasets that can be obtained via the internet at no cost to the user. The data are used to develop input parameter files for KINEROS2 and SWAT, two watershed runoff and erosion simulation models that operate at different spatial and temporal scales. AGWA automates the process of transforming digital data into simulation model results and provides a visualization tool.

  18. easyDAS: Automatic creation of DAS servers

    PubMed Central

    2011-01-01

    Background The Distributed Annotation System (DAS) has proven to be a successful way to publish and share biological data. Although there are more than 750 active registered servers from around 50 organizations, setting up a DAS server involves a fair amount of work, making it difficult for many research groups to share their biological annotations. Given the clear advantage that the generalized sharing of relevant biological data offers the research community, it would be desirable to facilitate the sharing process. Results Here we present easyDAS, a web-based system enabling anyone to publish biological annotations with just a few clicks. The system, available at http://www.ebi.ac.uk/panda-srv/easydas, is capable of reading different standard data file formats, processing the data, and creating a new publicly available DAS source in a completely automated way. The created sources are hosted on the EBI systems and can take advantage of its high storage capacity and network connection, freeing the data provider from any network management work. easyDAS is an open source project under the GNU LGPL license. Conclusions easyDAS is an automated DAS source creation system which can help many researchers in sharing their biological data, potentially increasing the amount of relevant biological data available to the scientific community. PMID:21244646

  19. The optics inside an automated single molecule array analyzer

    NASA Astrophysics Data System (ADS)

    McGuigan, William; Fournier, David R.; Watson, Gary W.; Walling, Les; Gigante, Bill; Duffy, David C.; Rissin, David M.; Kan, Cheuk W.; Meyer, Raymond E.; Piech, Tomasz; Fishburn, Matthew W.

    2014-02-01

    Quanterix and Stratec Biomedical have developed an instrument that enables the automated measurement of multiple proteins at concentrations ~1000 times lower than existing immunoassays. The instrument is based on Quanterix's proprietary Single Molecule Array technology (Simoa™) that facilitates the detection and quantification of biomarkers previously difficult to measure, thus opening up new applications in life science research and in-vitro diagnostics. Simoa is based on trapping individual beads in arrays of femtoliter-sized wells that, when imaged with sufficient resolution, allows for counting of single molecules associated with each bead. When used to capture and detect proteins, this approach is known as digital ELISA (enzyme-linked immunosorbent assay). The platform developed is a merger of many science and engineering disciplines. This paper concentrates on the optical technologies that have enabled the development of a fully automated single molecule analyzer. At the core of the system is a custom, wide field-of-view fluorescence microscope that images arrays of microwells containing single molecules bound to magnetic beads. A consumable disc containing 24 microstructure arrays was developed previously in collaboration with Sony DADC. The system cadence requirements, array dimensions, and requirement to detect single molecules presented significant optical challenges. Specifically, the wide field-of-view needed to image the entire array resulted in the need for a custom objective lens. Additionally, cost considerations for the system required a custom solution that leveraged the image processing capabilities. This paper will discuss the design considerations and resultant optical architecture that has enabled the development of an automated digital ELISA platform.
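
    A short sketch of the counting statistic that underlies digital ELISA in general: if label binding follows Poisson statistics, the average number of enzyme labels per bead (AEB) can be estimated from the fraction of "on" wells as AEB = -ln(1 - f_on); converting AEB to concentration requires a calibration curve, which is omitted here. This illustrates the principle only and is not Quanterix code.

        import math

        def average_enzymes_per_bead(n_on_wells, n_beads):
            """Poisson estimate of labels per bead from the fraction of active wells."""
            f_on = n_on_wells / n_beads
            if not 0.0 <= f_on < 1.0:
                raise ValueError("fraction of active wells must be in [0, 1)")
            return -math.log(1.0 - f_on)

        # Example: 4,000 fluorescent wells out of 100,000 bead-containing wells.
        print(round(average_enzymes_per_bead(4000, 100000), 4))   # ~0.0408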

  20. Automated Data Aggregation for Time-Series Analysis: Study Case on Anaesthesia Data Warehouse.

    PubMed

    Lamer, Antoine; Jeanne, Mathieu; Ficheur, Grégoire; Marcilly, Romaric

    2016-01-01

    Data stored in operational databases are not directly reusable. Aggregation modules are necessary to facilitate secondary use: they decrease the volume of data while increasing the amount of available information. In this paper, we present four automated aggregation engines integrated into an anaesthesia data warehouse. Four instances of clinical questions illustrate the use of these engines for various improvements of quality of care: duration of procedure, drug administration, and assessment of hypotension and its related treatment.
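
    A hedged sketch of one such aggregation engine: turning a raw time series of mean arterial pressure (MAP) samples into hypotension episodes with start, end, and duration. The threshold and data layout are illustrative assumptions, not the warehouse's actual schema.

        from datetime import datetime, timedelta

        def hypotension_episodes(samples, threshold=65.0):
            """samples: list of (timestamp, map_mmHg) sorted by time."""
            episodes, start, last = [], None, None
            for ts, value in samples:
                if value < threshold:
                    if start is None:
                        start = ts
                    last = ts
                elif start is not None:
                    episodes.append((start, last, last - start))
                    start = None
            if start is not None:
                episodes.append((start, last, last - start))
            return episodes

        t0 = datetime(2016, 1, 1, 8, 0)
        series = [(t0 + timedelta(minutes=i), v)
                  for i, v in enumerate([70, 62, 60, 63, 72, 75, 58, 71])]
        for start, end, duration in hypotension_episodes(series):
            print(start.time(), end.time(), duration)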

  1. An Intelligent Automation Platform for Rapid Bioprocess Design.

    PubMed

    Wu, Tianyi; Zhou, Yuhong

    2014-08-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user's inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. © 2013 Society for Laboratory Automation and Screening.

  2. Automated Student Aid Processing: The Challenge and Opportunity.

    ERIC Educational Resources Information Center

    St. John, Edward P.

    1985-01-01

    To utilize automated technology for student aid processing, it is necessary to work with multi-institutional offices (student aid, admissions, registration, and business) and to develop automated interfaces with external processing systems at state and federal agencies and perhaps at need-analysis organizations and lenders. (MLW)

  3. A Decision Support System for Managing a Diverse Portfolio of Technology Resources

    NASA Technical Reports Server (NTRS)

    Smith, J.

    2000-01-01

    This paper describes an automated decision support system designed to facilitate the management of a continuously changing portfolio of technologies as new technologies are deployed and older technologies are decommissioned.

  4. A methodology to determine the level of automation to improve the production process and reduce the ergonomics index

    NASA Astrophysics Data System (ADS)

    Chan-Amaya, Alejandro; Anaya-Pérez, María Elena; Benítez-Baltazar, Víctor Hugo

    2017-08-01

    Companies are constantly looking for improvements in productivity to increase their competitiveness. The use of automation technologies is a tool that has proven effective in achieving this. There are companies that are not familiar with the process of acquiring automation technologies; therefore, they abstain from investing and thereby miss the opportunity to take advantage of it. The present document proposes a methodology to determine the level of automation appropriate for the production process and thus minimize automation while improving production, taking the ergonomics factor into consideration.

  5. CFD Process Pre- and Post-processing Automation in Support of Space Propulsion

    NASA Technical Reports Server (NTRS)

    Dorney, Suzanne M.

    2003-01-01

    The use of Computational Fluid Dynamics (CFD) has become standard practice in the design and analysis of the major components used for space propulsion. In an attempt to standardize and improve the CFD process, a series of automated tools has been developed. Through the use of these automated tools, the application of CFD to the design cycle has been improved and streamlined. This paper presents a series of applications in which deficiencies were identified in the CFD process and corrected through the development of automated tools.

  6. Comparability of automated human induced pluripotent stem cell culture: a pilot study.

    PubMed

    Archibald, Peter R T; Chandra, Amit; Thomas, Dave; Chose, Olivier; Massouridès, Emmanuelle; Laâbi, Yacine; Williams, David J

    2016-12-01

    Consistent and robust manufacturing is essential for the translation of cell therapies, and the utilisation of automation throughout the manufacturing process may allow for improvements in quality control, scalability, reproducibility and economics of the process. The aim of this study was to measure and establish the comparability between alternative process steps for the culture of hiPSCs. Consequently, the effects of manual centrifugation and automated non-centrifugation process steps, performed using TAP Biosystems' CompacT SelecT automated cell culture platform, upon the culture of a human induced pluripotent stem cell (hiPSC) line (VAX001024c07) were compared. This study has demonstrated that comparable morphologies and cell diameters were observed in hiPSCs cultured using either manual or automated process steps. However, non-centrifugation hiPSC populations exhibited greater cell yields, greater aggregate rates, increased pluripotency marker expression, and decreased differentiation marker expression compared to centrifugation hiPSCs. A trend for decreased variability in cell yield was also observed after the utilisation of the automated process step. This study also highlights the detrimental effect of the cryopreservation and thawing processes upon the growth and characteristics of hiPSC cultures, and demonstrates that automated hiPSC manufacturing protocols can be successfully transferred between independent laboratories.

  7. The automated system for technological process of spacecraft's waveguide paths soldering

    NASA Astrophysics Data System (ADS)

    Tynchenko, V. S.; Murygin, A. V.; Emilova, O. A.; Bocharov, A. N.; Laptenok, V. D.

    2016-11-01

    The paper addresses the problem of automated process control of the induction-heating soldering of spacecraft waveguide paths. The peculiarities of the induction soldering process are analyzed, and the need to automate the information-control system is identified. The developed automated system controls the product heating process by varying the power supplied to the inductor on the basis of information about the soldering-zone temperature, stabilizing the temperature in a narrow range above the melting point of the solder but below the melting point of the waveguide. Automating the soldering process in this way improves the quality of the waveguides and eliminates burn-throughs. The article shows a block diagram of the software system, which consists of five modules, and describes its main algorithm. The operation of the automated waveguide-path soldering system is also described, explaining the basic functions and limitations of the system. The developed software allows configuration of the measurement equipment, setting and changing of soldering process parameters, and viewing of the temperature graphs recorded by the system. Results of experimental studies are presented that demonstrate high-quality control of the soldering process and the system's applicability to such automation tasks.
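
    A much-simplified sketch of the control idea described above: compute inductor power from the measured soldering-zone temperature so that the temperature settles near a setpoint chosen above the solder melting point but below the waveguide melting point. The first-order thermal model and all numbers are illustrative assumptions, not the published system.

        def control_power(temp_c, setpoint_c, gain=50.0, max_power_w=2000.0):
            """Proportional controller with output clamped to [0, max_power_w]."""
            power = gain * (setpoint_c - temp_c)
            return min(max(power, 0.0), max_power_w)

        def simulate(setpoint_c=700.0, steps=600, dt_s=0.1):
            temp = 20.0                                   # start at ambient, degC
            for _ in range(steps):
                power = control_power(temp, setpoint_c)
                # Toy plant: heating proportional to power, losses proportional to
                # the difference from ambient temperature.
                temp += dt_s * (0.02 * power - 0.05 * (temp - 20.0))
            return temp

        print(round(simulate(), 1))   # settles a little below the setpoint (proportional offset)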

  8. Applying formal methods and object-oriented analysis to existing flight software

    NASA Technical Reports Server (NTRS)

    Cheng, Betty H. C.; Auernheimer, Brent

    1993-01-01

    Correctness is paramount for safety-critical software control systems. Critical software failures in medical radiation treatment, communications, and defense are familiar to the public. The significant quantity of software malfunctions regularly reported to the software engineering community, the laws concerning liability, and a recent NRC Aeronautics and Space Engineering Board report additionally motivate the use of error-reducing and defect-detection software development techniques. The benefits of formal methods in requirements-driven software development ('forward engineering') are well documented. One advantage of rigorously engineering software is that formal notations are precise, verifiable, and facilitate automated processing. This paper describes the application of formal methods to reverse engineering, where formal specifications are developed for a portion of the shuttle on-orbit digital autopilot (DAP). Three objectives of the project were to: demonstrate the use of formal methods on a shuttle application, facilitate the incorporation and validation of new requirements for the system, and verify the safety-critical properties to be exhibited by the software.

  9. One-pot preparation of mRNA/cDNA display by a novel and versatile puromycin-linker DNA.

    PubMed

    Mochizuki, Yuki; Biyani, Manish; Tsuji-Ueno, Sachika; Suzuki, Miho; Nishigaki, Koichi; Husimi, Yuzuru; Nemoto, Naoto

    2011-09-12

    A rapid, easy, and robust preparation method for mRNA/cDNA display using a newly designed puromycin-linker DNA is presented. The new linker is structurally simple, easy to synthesize, and cost-effective for use in "in vitro peptide and protein selection". Introduction of an RNase T1 nuclease site into the new linker facilitates easy recovery of the mRNA/cDNA-displayed protein, both by improving the efficiency of ligating the linker to mRNAs and by enabling efficient release of the displayed protein from the solid phase (magnetic beads). As a demonstration of the application, affinity selections were successfully performed. Furthermore, we introduce a "one-pot" preparation protocol that makes mRNA display easy to perform. Unlike conventional approaches that require a tedious, multistep downstream process including purification, this protocol will make the mRNA/cDNA display methods more practical and convenient and will also facilitate the development of next-generation, high-throughput mRNA/cDNA display systems amenable to automation.

  10. [Algorithm for the automated processing of rheosignals].

    PubMed

    Odinets, G S

    1988-01-01

    An algorithm for rheosignal recognition for a microprocessor device with a representation apparatus and with automated and manual cursor control was examined. The algorithm makes it possible to automate rheosignal registration and processing while taking their variability into account.

  11. Quantifying Spiral Ganglion Neurite and Schwann Behavior on Micropatterned Polymer Substrates.

    PubMed

    Cheng, Elise L; Leigh, Braden; Guymon, C Allan; Hansen, Marlan R

    2016-01-01

    The first successful in vitro experiments on the cochlea were conducted in 1928 by Honor Fell (Fell, Arch Exp Zellforsch 7(1):69-81, 1928). Since then, techniques for culture of this tissue have been refined, and dissociated primary culture of the spiral ganglion has become a widely accepted in vitro model for studying nerve damage and regeneration in the cochlea. Additionally, patterned substrates have been developed that facilitate and direct neural outgrowth. A number of automated and semi-automated methods for quantifying this neurite outgrowth have been utilized in recent years (Zhang et al., J Neurosci Methods 160(1):149-162, 2007; Tapias et al., Neurobiol Dis 54:158-168, 2013). Here, we describe a method to study the effect of topographical cues on spiral ganglion neurite and Schwann cell alignment. We discuss our microfabrication process, characterization of pattern features, and cell culture techniques for both spiral ganglion neurons and spiral ganglion Schwann cells. In addition, we describe protocols for reducing fibroblast count, immunocytochemistry, and methods for quantifying neurite and Schwann cell alignment.

  12. ARM Data File Standards Version: 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kehoe, Kenneth; Beus, Sherman; Cialella, Alice

    2014-04-01

    The Atmospheric Radiation Measurement (ARM) Climate Research Facility performs routine in situ and remote-sensing observations to provide a detailed and accurate description of the Earth's atmosphere in diverse climate regimes. The result is a diverse collection of data sets containing observational and derived data, currently accumulating at a rate of 30 TB of data and 150,000 different files per month (http://www.archive.arm.gov/stats/storage2.html). Continuing the current processing while scaling it to even larger sizes is extremely important to the ARM Facility and requires consistent metadata and data standards. The standards described in this document will enable development of automated analysis and discovery tools for the ever-growing volumes of data. They will also enable consistent analysis of the multiyear data, allow for development of automated monitoring and data health status tools, and facilitate development of future capabilities for delivering data on demand that can be tailored explicitly to user needs. This analysis ability will only be possible if the data follow a minimum set of standards. This document proposes a hierarchy that includes required and recommended standards.

  13. Ensemble LUT classification for degraded document enhancement

    NASA Astrophysics Data System (ADS)

    Obafemi-Ajayi, Tayo; Agam, Gady; Frieder, Ophir

    2008-01-01

    The fast evolution of scanning and computing technologies has led to the creation of large collections of scanned paper documents. Examples of such collections include historical collections, legal depositories, medical archives, and business archives. Moreover, in many situations, such as legal litigation and security investigations, scanned collections are being used to facilitate systematic exploration of the data. It is almost always the case that scanned documents suffer from some form of degradation. Large degradations make documents hard to read and substantially deteriorate the performance of automated document processing systems. Enhancement of degraded document images is normally performed assuming global degradation models. When the degradation is large, global degradation models do not perform well. In contrast, we propose to estimate local degradation models and use them in enhancing degraded document images. Using a semi-automated enhancement system, we have labeled a subset of the Frieder diaries collection. This labeled subset was then used to train an ensemble classifier. The component classifiers are based on lookup tables (LUTs) in conjunction with an approximate nearest neighbor algorithm. The resulting algorithm is highly efficient. Experimental evaluation results are provided using the Frieder diaries collection.
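
    A simplified, self-contained sketch of a lookup-table (LUT) classifier of the general kind described above: each pixel of a binarized, degraded image is relabeled from its 3x3 neighbourhood pattern (512 possibilities) using the majority clean label observed for that pattern in training data. This illustrates the LUT idea only; it is not the paper's ensemble or its approximate-nearest-neighbour component.

        import numpy as np

        def neighbourhood_codes(img):
            """3x3-pattern code (0..511) for every interior pixel of a 0/1 image."""
            h, w = img.shape
            codes = np.zeros((h - 2, w - 2), dtype=np.int32)
            for dy in range(3):
                for dx in range(3):
                    codes = codes * 2 + img[dy:dy + h - 2, dx:dx + w - 2]
            return codes

        def train_lut(noisy, clean):
            codes = neighbourhood_codes(noisy).ravel()
            labels = clean[1:-1, 1:-1].ravel().astype(float)
            ones = np.bincount(codes, weights=labels, minlength=512)
            totals = np.bincount(codes, minlength=512)
            return (ones * 2 > totals).astype(np.uint8)   # majority label per pattern

        def apply_lut(lut, noisy):
            return lut[neighbourhood_codes(noisy)]

        rng = np.random.default_rng(0)
        clean = np.zeros((64, 64), dtype=np.uint8)
        clean[:, 32:] = 1                                 # simple two-region "page"
        flip = lambda img: img ^ (rng.random(img.shape) < 0.05).astype(np.uint8)
        lut = train_lut(flip(clean), clean)               # learn from one noisy copy
        restored = apply_lut(lut, flip(clean))            # enhance another noisy copy
        print("pixel error rate:", float((restored != clean[1:-1, 1:-1]).mean()))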

  14. Studying fish near ocean energy devices using underwater video

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matzner, Shari; Hull, Ryan E.; Harker-Klimes, Genevra EL

    The effects of energy devices on fish populations are not well understood, and studying the interactions of fish with tidal and instream turbines is challenging. To address this problem, we have evaluated algorithms to automatically detect fish in underwater video and propose a semi-automated method for ocean and river energy device ecological monitoring. The key contributions of this work are the demonstration of a background subtraction algorithm (ViBE) that detected 87% of human-identified fish events and is suitable for use in a real-time system to reduce data volume, and the demonstration of a statistical model to classify detections as fish or not fish that achieved a correct classification rate of 85% overall and 92% for detections larger than 5 pixels. Specific recommendations for underwater video acquisition to better facilitate automated processing are given. The recommendations will help energy developers put effective monitoring systems in place, and could lead to a standard approach that simplifies the monitoring effort and advances the scientific understanding of the ecological impacts of ocean and river energy devices.
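
    A much-simplified background-subtraction sketch (a running-average model, not the ViBE algorithm evaluated in the study): pixels that deviate from a slowly updated background estimate are marked as foreground, and frames whose foreground fraction exceeds a trigger threshold are flagged so that only those need to be stored or reviewed. All parameters are illustrative.

        import numpy as np

        def detect_events(frames, alpha=0.05, diff_thresh=25.0, area_thresh=0.005):
            """Return indices of frames with enough moving pixels to be of interest."""
            background = frames[0].astype(float)
            events = []
            for i, frame in enumerate(frames[1:], start=1):
                foreground = np.abs(frame - background) > diff_thresh
                if foreground.mean() > area_thresh:
                    events.append(i)
                background = (1 - alpha) * background + alpha * frame
            return events

        # Synthetic clip: sensor noise plus a bright "fish" blob in frames 40-45.
        rng = np.random.default_rng(1)
        frames = rng.normal(100.0, 3.0, size=(60, 120, 160))
        for t in range(40, 46):
            frames[t, 50:60, 70:90] += 80.0
        print("frames flagged:", detect_events(frames))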

  15. Particle tracking in drug and gene delivery research: State-of-the-art applications and methods.

    PubMed

    Schuster, Benjamin S; Ensign, Laura M; Allan, Daniel B; Suk, Jung Soo; Hanes, Justin

    2015-08-30

    Particle tracking is a powerful microscopy technique to quantify the motion of individual particles at high spatial and temporal resolution in complex fluids and biological specimens. Particle tracking's applications and impact in drug and gene delivery research have greatly increased during the last decade. Thanks to advances in hardware and software, this technique is now more accessible than ever, and can be reliably automated to enable rapid processing of large data sets, thereby further enhancing the role that particle tracking will play in drug and gene delivery studies in the future. We begin this review by discussing particle tracking-based advances in characterizing extracellular and cellular barriers to therapeutic nanoparticles and in characterizing nanoparticle size and stability. To facilitate wider adoption of the technique, we then present a user-friendly review of state-of-the-art automated particle tracking algorithms and methods of analysis. We conclude by reviewing technological developments for next-generation particle tracking methods, and we survey future research directions in drug and gene delivery where particle tracking may be useful. Copyright © 2015 Elsevier B.V. All rights reserved.
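
    A short sketch of the most common quantity extracted from such trajectories, the time-averaged mean squared displacement (MSD), from which particle mobility in complex fluids is typically characterized. The input format (an N x 2 array of x, y positions at a fixed frame interval) is an assumption for illustration.

        import numpy as np

        def mean_squared_displacement(track_xy, max_lag=None):
            """Time-averaged MSD of one trajectory for lags 1..max_lag (in frames)."""
            track_xy = np.asarray(track_xy, dtype=float)
            max_lag = max_lag or len(track_xy) // 4
            lags = np.arange(1, max_lag + 1)
            msd = np.array([np.mean(np.sum((track_xy[lag:] - track_xy[:-lag]) ** 2, axis=1))
                            for lag in lags])
            return lags, msd

        # Toy trajectory: 2-D random walk, step standard deviation 0.1 um per frame.
        rng = np.random.default_rng(7)
        track = np.cumsum(rng.normal(0.0, 0.1, size=(400, 2)), axis=0)
        lags, msd = mean_squared_displacement(track)
        print("MSD at lags 1, 10, 50:", msd[0], msd[9], msd[49])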

  16. Automated Operant Conditioning in the Mouse Home Cage.

    PubMed

    Francis, Nikolas A; Kanold, Patrick O

    2017-01-01

    Recent advances in neuroimaging and genetics have made mice an advantageous animal model for studying the neurophysiology of sensation, cognition, and locomotion. A key benefit of mice is that they provide a large population of test subjects for behavioral screening. Reflex-based assays of hearing in mice, such as the widely used acoustic startle response, are less accurate than operant conditioning in measuring auditory processing. To date, however, there are few cost-effective options for scalable operant conditioning systems. Here, we describe a new system for automated operant conditioning, the Psibox. It is assembled from low cost parts, designed to fit within typical commercial wire-top cages, and allows large numbers of mice to train independently in their home cages on positive reinforcement tasks. We found that groups of mice trained together learned to accurately detect sounds within 2 weeks of training. In addition, individual mice isolated from groups also showed good task performance. The Psibox facilitates high-throughput testing of sensory, motor, and cognitive skills in mice, and provides a readily available animal population for studies ranging from experience-dependent neural plasticity to rodent models of mental disorders.

  17. Flexible End2End Workflow Automation of Hit-Discovery Research.

    PubMed

    Holzmüller-Laue, Silke; Göde, Bernd; Thurow, Kerstin

    2014-08-01

    The article considers a new approach to more complex laboratory automation at the workflow layer. The authors propose the automation of end2end workflows. The combination of all relevant subprocesses (whether automated or manually performed, independently, and in whichever organizational unit) results in end2end processes that include all result dependencies. The end2end approach focuses not only on the classical experiments in synthesis or screening, but also on auxiliary processes such as the production and storage of chemicals, cell culturing, and maintenance, as well as preparatory activities and analyses of experiments. Furthermore, the connection of control flow and data flow in the same process model reduces the effort of data transfer between the involved systems, including the necessary data transformations. This end2end laboratory automation can be realized effectively with the modern methods of business process management (BPM). This approach is based on a new standardization of the process-modeling notation, Business Process Model and Notation 2.0. In drug discovery, several scientific disciplines act together with manifold modern methods, technologies, and a wide range of automated instruments for the discovery and design of target-based drugs. The article discusses the novel BPM-based automation concept with an implemented example of a high-throughput screening of previously synthesized compound libraries. © 2014 Society for Laboratory Automation and Screening.

  18. Extraction of phenotypic traits from taxonomic descriptions for the tree of life using natural language processing.

    PubMed

    Endara, Lorena; Cui, Hong; Burleigh, J Gordon

    2018-03-01

    Phenotypic data sets are necessary to elucidate the genealogy of life, but assembling phenotypic data for taxa across the tree of life can be technically challenging and prohibitively time consuming. We describe a semi-automated protocol to facilitate and expedite the assembly of phenotypic character matrices of plants from formal taxonomic descriptions. This pipeline uses new natural language processing (NLP) techniques and a glossary of over 9000 botanical terms. Our protocol includes the Explorer of Taxon Concepts (ETC), an online application that assembles taxon-by-character matrices from taxonomic descriptions, and MatrixConverter, a Java application that enables users to evaluate and discretize the characters extracted by ETC. We demonstrate this protocol using descriptions from Araucariaceae. The NLP pipeline unlocks the phenotypic data found in taxonomic descriptions and makes them usable for evolutionary analyses.

  19. Enabling Smart Workflows over Heterogeneous ID-Sensing Technologies

    PubMed Central

    Giner, Pau; Cetina, Carlos; Lacuesta, Raquel; Palacios, Guillermo

    2012-01-01

    Sensing technologies in mobile devices play a key role in reducing the gap between the physical and the digital world. The use of automatic identification capabilities can improve user participation in business processes where physical elements are involved (Smart Workflows). However, identifying all objects in the user surroundings does not automatically translate into meaningful services to the user. This work introduces Parkour, an architecture that allows the development of services that match the goals of each of the participants in a smart workflow. Parkour is based on a pluggable architecture that can be extended to provide support for new tasks and technologies. In order to facilitate the development of these plug-ins, tools that automate the development process are also provided. Several Parkour-based systems have been developed in order to validate the applicability of the proposal. PMID:23202193

  20. Use of the World Wide Web for multisite data collection.

    PubMed

    Subramanian, A K; McAfee, A T; Getzinger, J P

    1997-08-01

    As access to the Internet becomes increasingly available, research applications in medicine will increase. This paper describes the use of the Internet, and, more specifically, the World Wide Web (WWW), as a channel of communication between EDs throughout the world and investigators who are interested in facilitating the collection of data from multiple sites. Data entered into user-friendly electronic surveys can be transmitted over the Internet to a database located at the site of the study, rendering geographic separation less of a barrier to the conduct of multisite studies. The electronic format of the data can enable real-time statistical processing while data are stored using existing database technologies. In theory, automated processing of variables within such a database enables early identification of data trends. Methods of ensuring validity, security, and compliance are discussed.

  1. Development of a Magnetic Microbead Affinity Selection Screen (MagMASS) Using Mass Spectrometry for Ligands to the Retinoid X Receptor-α

    NASA Astrophysics Data System (ADS)

    Rush, Michael D.; Walker, Elisabeth M.; Prehna, Gerd; Burton, Tristesse; van Breemen, Richard B.

    2017-03-01

    To overcome limiting factors in mass spectrometry-based screening methods, such as the lack of automation, while still facilitating the screening of complex mixtures such as botanical extracts, magnetic microbead affinity selection screening (MagMASS) was developed. The screening process involves immobilization of a target protein on a magnetic microbead using a variety of possible chemistries, incubation with mixtures of molecules containing possible ligands, a washing step that removes non-bound compounds while a magnetic field retains the beads in the microtiter well, and an organic solvent release step followed by LC-MS analysis. Using retinoid X receptor-α (RXRα) as an example, which is a nuclear receptor and a target for anti-inflammation therapy as well as cancer treatment and prevention, a MagMASS assay was developed and compared with an existing screening assay, pulsed ultrafiltration (PUF)-MS. Optimization of MagMASS involved evaluation of multiple protein constructs and several magnetic bead immobilization chemistries. The full-length RXRα construct immobilized with amylose beads provided optimum results. Additional enhancements of MagMASS were the application of 96-well plates to enable automation, use of UHPLC instead of HPLC for faster MS analyses, and application of metabolomics software for faster, automated data analysis. Performance of MagMASS was demonstrated using mixtures of synthetic compounds and known ligands spiked into botanical extracts.

  2. Advanced manufacturing rules check (MRC) for fully automated assessment of complex reticle designs

    NASA Astrophysics Data System (ADS)

    Gladhill, R.; Aguilar, D.; Buck, P. D.; Dawkins, D.; Nolke, S.; Riddick, J.; Straub, J. A.

    2005-11-01

    Advanced electronic design automation (EDA) tools, with their simulation, modeling, design rule checking, and optical proximity correction capabilities, have facilitated the improvement of first pass wafer yields. While the data produced by these tools may have been processed for optimal wafer manufacturing, it is possible for the same data to be far from ideal for photomask manufacturing, particularly at lithography and inspection stages, resulting in production delays and increased costs. The same EDA tools used to produce the data can be used to detect potential problems for photomask manufacturing in the data. A production implementation of automated photomask manufacturing rule checking (MRC) is presented and discussed for various photomask lithography and inspection lines. This paper will focus on identifying data which may cause production delays at the mask inspection stage. It will be shown how photomask MRC can be used to discover data related problems prior to inspection, separating jobs which are likely to have problems at inspection from those which are not. Photomask MRC can also be used to identify geometries requiring adjustment of inspection parameters for optimal inspection, and to assist with any special handling or change of routing requirements. With this foreknowledge, steps can be taken to avoid production delays that increase manufacturing costs. Finally, the data flow implemented for MRC can be used as a platform for other photomask data preparation tasks.

  3. Automated database-guided expert-supervised orientation for immunophenotypic diagnosis and classification of acute leukemia

    PubMed Central

    Lhermitte, L; Mejstrikova, E; van der Sluijs-Gelling, A J; Grigore, G E; Sedek, L; Bras, A E; Gaipa, G; Sobral da Costa, E; Novakova, M; Sonneveld, E; Buracchi, C; de Sá Bacelar, T; te Marvelde, J G; Trinquand, A; Asnafi, V; Szczepanski, T; Matarraz, S; Lopez, A; Vidriales, B; Bulsa, J; Hrusak, O; Kalina, T; Lecrevisse, Q; Martin Ayuso, M; Brüggemann, M; Verde, J; Fernandez, P; Burgos, L; Paiva, B; Pedreira, C E; van Dongen, J J M; Orfao, A; van der Velden, V H J

    2018-01-01

    Precise classification of acute leukemia (AL) is crucial for adequate treatment. EuroFlow has previously designed an AL orientation tube (ALOT) to guide towards the relevant classification panel (T-cell acute lymphoblastic leukemia (T-ALL), B-cell precursor (BCP)-ALL and/or acute myeloid leukemia (AML)) and final diagnosis. We have now built a reference database with 656 typical AL samples (145 T-ALL, 377 BCP-ALL, 134 AML), processed and analyzed via standardized protocols. Using principal component analysis (PCA)-based plots and automated classification algorithms for direct comparison of single cells from individual patients against the database, another 783 cases were subsequently evaluated. Depending on the database-guided results, patients were categorized as (i) typical T, B or myeloid without, or (ii) with, a transitional component to another lineage; (iii) atypical; or (iv) mixed-lineage. Using this automated algorithm, the right panel was selected in 781/783 cases (99.7%), and data comparable to the final WHO diagnosis were already provided in >93% of cases (85% T-ALL, 97% BCP-ALL, 95% AML and 87% mixed-phenotype AL patients), even without data on the full-characterization panels. Our results show that database-guided analysis facilitates standardized interpretation of ALOT results and allows accurate selection of the relevant classification panels, hence providing a solid basis for designing future WHO AL classifications. PMID:29089646
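
    A toy analogue (not the EuroFlow implementation) of database-guided classification: events from a new case are projected into a PCA space learned from reference cases of known lineage, and the case is assigned to the nearest class centroid. The simulated marker data and the scikit-learn dependency are assumptions for illustration.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(3)
        lineages = {"T-ALL": [4.0, 0.0], "BCP-ALL": [0.0, 4.0], "AML": [4.0, 4.0]}

        # Reference database: simulated 8-marker expression profiles per lineage.
        ref_X, ref_y = [], []
        for name, shift in lineages.items():
            X = rng.normal(0.0, 1.0, size=(300, 8))
            X[:, :2] += shift
            ref_X.append(X)
            ref_y += [name] * 300
        ref_X, ref_y = np.vstack(ref_X), np.array(ref_y)

        pca = PCA(n_components=2).fit(ref_X)
        ref_scores = pca.transform(ref_X)
        centroids = {name: ref_scores[ref_y == name].mean(axis=0) for name in lineages}

        def classify_case(case_events):
            """Assign a new case to the lineage with the nearest PCA-space centroid."""
            score = pca.transform(case_events).mean(axis=0)
            return min(centroids, key=lambda k: np.linalg.norm(score - centroids[k]))

        new_case = rng.normal(0.0, 1.0, size=(500, 8))
        new_case[:, :2] += [0.0, 4.0]        # profile resembling BCP-ALL
        print(classify_case(new_case))       # expected: BCP-ALL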

  4. SeqMule: automated pipeline for analysis of human exome/genome sequencing data.

    PubMed

    Guo, Yunfei; Ding, Xiaolei; Shen, Yufeng; Lyon, Gholson J; Wang, Kai

    2015-09-18

    Next-generation sequencing (NGS) technology has greatly helped us identify disease-contributory variants for Mendelian diseases. However, users are often faced with issues such as software compatibility, complicated configuration, and no access to a high-performance computing facility, and discrepancies exist among aligners and variant callers. We developed a computational pipeline, SeqMule, to perform automated variant calling from NGS data on human genomes and exomes. SeqMule integrates computational-cluster-free parallelization capability built on top of the variant callers, and facilitates normalization/intersection of variant calls to generate a consensus set with high confidence. SeqMule integrates 5 alignment tools and 5 variant calling algorithms and accepts various combinations, all by one-line command, therefore allowing highly flexible yet fully automated variant calling. In a modern machine (2 Intel Xeon X5650 CPUs, 48 GB memory), when fast turn-around is needed, SeqMule generates annotated VCF files in a day from a 30X whole-genome sequencing data set; when more accurate calling is needed, SeqMule generates a consensus call set that improves over single callers, as measured by both Mendelian error rate and consistency. SeqMule supports Sun Grid Engine for parallel processing, offers a turn-key solution for deployment on Amazon Web Services, and allows quality checks, Mendelian error checks, consistency evaluation, and HTML-based reports. SeqMule is available at http://seqmule.openbioinformatics.org.
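
    A minimal sketch of the consensus idea described above: keep variants reported by at least a given number of callers, keying each call by (chromosome, position, reference, alternate). Parsing and normalization of real VCF files are omitted; the caller names are illustrative.

        from collections import Counter

        def consensus_calls(callsets, min_callers=2):
            """Variants reported by at least `min_callers` of the given call sets."""
            counts = Counter(v for calls in callsets for v in set(calls))
            return sorted(v for v, n in counts.items() if n >= min_callers)

        caller_a = [("chr1", 12345, "A", "G"), ("chr2", 500, "C", "T")]
        caller_b = [("chr1", 12345, "A", "G"), ("chr3", 42, "G", "A")]
        caller_c = [("chr1", 12345, "A", "G"), ("chr2", 500, "C", "T")]

        print(consensus_calls([caller_a, caller_b, caller_c], min_callers=2))
        # -> [('chr1', 12345, 'A', 'G'), ('chr2', 500, 'C', 'T')]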

  5. Hexicon 2: Automated Processing of Hydrogen-Deuterium Exchange Mass Spectrometry Data with Improved Deuteration Distribution Estimation

    NASA Astrophysics Data System (ADS)

    Lindner, Robert; Lou, Xinghua; Reinstein, Jochen; Shoeman, Robert L.; Hamprecht, Fred A.; Winkler, Andreas

    2014-06-01

    Hydrogen-deuterium exchange (HDX) experiments analyzed by mass spectrometry (MS) provide information about the dynamics and the solvent accessibility of protein backbone amide hydrogen atoms. Continuous improvement of MS instrumentation has contributed to the increasing popularity of this method; however, comprehensive automated data analysis is only beginning to mature. We present Hexicon 2, an automated pipeline for data analysis and visualization based on the previously published program Hexicon (Lou et al. 2010). Hexicon 2 employs the sensitive NITPICK peak detection algorithm of its predecessor in a divide-and-conquer strategy and adds new features, such as chromatogram alignment and improved peptide sequence assignment. The unique feature of deuteration distribution estimation was retained in Hexicon 2 and improved using an iterative deconvolution algorithm that is robust even to noisy data. In addition, Hexicon 2 provides a data browser that facilitates quality control and provides convenient access to common data visualization tasks. Analysis of a benchmark dataset demonstrates superior performance of Hexicon 2 compared with its predecessor in terms of deuteration centroid recovery and deuteration distribution estimation. Hexicon 2 greatly reduces data analysis time compared with manual analysis, whereas the increased number of peptides provides redundant coverage of the entire protein sequence. Hexicon 2 is a standalone application available free of charge under http://hx2.mpimf-heidelberg.mpg.de.
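
    A brief sketch of the centroid-level quantity such HDX-MS pipelines report: the intensity-weighted centroid of a peptide's isotope envelope, with deuterium uptake taken as the centroid shift relative to the undeuterated control. The peak lists are invented for illustration; Hexicon 2 itself goes further and estimates full deuteration distributions.

        import numpy as np

        def centroid(mz, intensity):
            """Intensity-weighted centroid m/z of an isotope envelope."""
            mz, intensity = np.asarray(mz, float), np.asarray(intensity, float)
            return float(np.sum(mz * intensity) / np.sum(intensity))

        # Isotope envelopes of a doubly charged peptide (0.5 m/z isotope spacing).
        undeuterated = ([500.25, 500.75, 501.25, 501.75], [100.0, 60.0, 25.0, 8.0])
        labeled = ([500.25, 500.75, 501.25, 501.75, 502.25],
                   [20.0, 55.0, 90.0, 70.0, 30.0])

        charge = 2
        uptake_da = (centroid(*labeled) - centroid(*undeuterated)) * charge
        print(f"deuterium uptake: {uptake_da:.2f} Da")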

  6. Generic HPLC platform for automated enzyme reaction monitoring: Advancing the assay toolbox for transaminases and other PLP-dependent enzymes.

    PubMed

    Börner, Tim; Grey, Carl; Adlercreutz, Patrick

    2016-08-01

    Methods for rapid and direct quantification of enzyme kinetics independent of the substrate stand in high demand for both fundamental research and bioprocess development. This study addresses the need for a generic method by developing an automated, standardizable HPLC platform monitoring reaction progress in near real-time. The method was applied to amine transaminase (ATA) catalyzed reactions, intensifying process development for chiral amine synthesis. Autosampler-assisted pipetting facilitates integrated mixing and sampling under controlled temperature. Crude enzyme formulations in high and low substrate concentrations can be employed. Sequential, small (1 µL) sample injections and immediate detection after separation permit fast reaction monitoring with excellent sensitivity, accuracy and reproducibility. Due to its modular design, different chromatographic techniques, e.g. reverse phase and size exclusion chromatography (SEC), can be employed. A novel assay for pyridoxal 5'-phosphate-dependent enzymes is presented using SEC for direct monitoring of enzyme-bound and free reaction intermediates. Time-resolved changes of the different cofactor states, e.g. pyridoxal 5'-phosphate, pyridoxamine 5'-phosphate and the internal aldimine, were traced in both half reactions. The combination of the automated HPLC platform with SEC offers a method for substrate-independent screening, which supplies a missing piece in the assay and screening toolbox for ATAs and other PLP-dependent enzymes. Copyright © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Automated blood-sample handling in the clinical laboratory.

    PubMed

    Godolphin, W; Bodtker, K; Uyeno, D; Goh, L O

    1990-09-01

    The only significant advances in blood-taking in 25 years have been the disposable needle and evacuated blood-drawing tube. With the exception of a few isolated barcode experiments, most sample-tracking is performed through handwritten or computer-printed labels. Attempts to reduce the hazards of centrifugation have resulted in air-tight lids or chambers, the use of which is time-consuming and cumbersome. Most commonly used clinical analyzers require serum or plasma, distributed into specialized containers, unique to that analyzer. Aliquots for different tests are prepared by handpouring or pipetting. Moderate to large clinical laboratories perform so many different tests that even multi-analyzers performing multiple analyses on a single sample may account for only a portion of all tests ordered for a patient. Thus several aliquots of each specimen are usually required. We have developed a proprietary serial centrifuge and blood-collection tube suitable for incorporation into an automated or robotic sample-handling system. The system we propose is (a) safe--avoids or prevents biological danger to the many "handlers" of blood; (b) small--minimizes the amount of sample taken and space required to adapt to the needs of satellite and mobile testing, and direct interfacing with analyzers; (c) serial--permits each sample to be treated according to its own "merits," optimizes throughput, and facilitates flexible automation; and (d) smart--ensures quality results through monitoring and intelligent control of patient identification, sample characteristics, and separation process.

  8. Fully Automated Centrifugal Microfluidic Device for Ultrasensitive Protein Detection from Whole Blood.

    PubMed

    Park, Yang-Seok; Sunkara, Vijaya; Kim, Yubin; Lee, Won Seok; Han, Ja-Ryoung; Cho, Yoon-Kyoung

    2016-04-16

    Enzyme-linked immunosorbent assay (ELISA) is a promising method to detect small amounts of protein in biological samples. Devices that provide a platform for reduced sample volume and assay time, as well as full automation, are required for potential use in point-of-care diagnostics. Recently, we have demonstrated ultrasensitive detection of serum proteins, C-reactive protein (CRP) and cardiac troponin I (cTnI), utilizing a lab-on-a-disc composed of TiO2 nanofibrous (NF) mats. It showed a large dynamic range with femtomolar (fM) detection sensitivity, from a small volume of whole blood in 30 min. The device consists of several components for blood separation, metering, mixing, and washing that are automated for improved sensitivity from low sample volumes. Here, in the video demonstration, we show the experimental protocols and know-how for the fabrication of NFs as well as the disc, their integration and the operation in the following order: processes for preparing the TiO2 NF mat; transfer-printing of the TiO2 NF mat onto the disc; surface modification for immune-reactions, disc assembly and operation; on-disc detection and representative results for immunoassay. Use of this device enables multiplexed analysis with minimal consumption of samples and reagents. Given the advantages, the device should find use in a wide variety of applications, and prove beneficial in facilitating the analysis of low-abundance proteins.

  9. Hexicon 2: automated processing of hydrogen-deuterium exchange mass spectrometry data with improved deuteration distribution estimation.

    PubMed

    Lindner, Robert; Lou, Xinghua; Reinstein, Jochen; Shoeman, Robert L; Hamprecht, Fred A; Winkler, Andreas

    2014-06-01

    Hydrogen-deuterium exchange (HDX) experiments analyzed by mass spectrometry (MS) provide information about the dynamics and the solvent accessibility of protein backbone amide hydrogen atoms. Continuous improvement of MS instrumentation has contributed to the increasing popularity of this method; however, comprehensive automated data analysis is only beginning to mature. We present Hexicon 2, an automated pipeline for data analysis and visualization based on the previously published program Hexicon (Lou et al. 2010). Hexicon 2 employs the sensitive NITPICK peak detection algorithm of its predecessor in a divide-and-conquer strategy and adds new features, such as chromatogram alignment and improved peptide sequence assignment. The unique feature of deuteration distribution estimation was retained in Hexicon 2 and improved using an iterative deconvolution algorithm that is robust even to noisy data. In addition, Hexicon 2 provides a data browser that facilitates quality control and provides convenient access to common data visualization tasks. Analysis of a benchmark dataset demonstrates superior performance of Hexicon 2 compared with its predecessor in terms of deuteration centroid recovery and deuteration distribution estimation. Hexicon 2 greatly reduces data analysis time compared with manual analysis, whereas the increased number of peptides provides redundant coverage of the entire protein sequence. Hexicon 2 is a standalone application available free of charge under http://hx2.mpimf-heidelberg.mpg.de.
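
    To make the idea of deuteration distribution estimation concrete, the sketch below shows one generic way an iterative, non-negativity-preserving deconvolution of an isotope envelope can be written. It is emphatically not the Hexicon 2 algorithm; the function name, the Richardson-Lucy-style multiplicative update, and the fixed iteration count are illustrative assumptions.

        import numpy as np

        def estimate_deuteration_distribution(observed, natural, n_deut, iterations=500):
            """Multiplicative deconvolution of an observed isotope envelope into a
            non-negative deuteration distribution, given the natural (undeuterated)
            isotope pattern. Requires len(observed) == len(natural) + n_deut."""
            p = np.asarray(natural, dtype=float)
            y = np.asarray(observed, dtype=float)
            d = np.full(n_deut + 1, 1.0 / (n_deut + 1))   # uniform starting guess
            for _ in range(iterations):
                model = np.convolve(p, d)                 # predicted envelope
                ratio = y / np.maximum(model, 1e-12)
                d *= np.correlate(ratio, p, mode="valid") / p.sum()
                d /= d.sum()                              # keep it a distribution
            return d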

  10. CROSS: A GDSS for the Evaluation and Prioritization of Engineering Support Requests and Advanced Technology Projects at NASA

    NASA Technical Reports Server (NTRS)

    Tavana, Madjid; Lee, Seunghee

    1996-01-01

    Objective evaluation and prioritization of engineering support requests (ESRs) is a difficult task at the Kennedy Space Center (KSC) Shuttle Project Engineering Office. The difficulty arises from the complexities inherent in the evaluation process and the lack of structured information. The purpose of this project is to implement the consensus ranking organizational support system (CROSS), a multiple criteria decision support system (DSS) developed at KSC that captures the decision maker's beliefs through a series of sequential, rational, and analytical processes. CROSS utilizes the analytic hierarchy process (AHP), subjective probabilities, the entropy concept, and the maximize agreement heuristic (MAH) to enhance the decision maker's intuition in evaluating ESRs. Some of the preliminary goals of the project are to: (1) revisit the structure of the ground systems working team (GSWT) steering committee, (2) develop a template for ESR originators to provide more complete and consistent information to the GSWT steering committee members to eliminate the need for a facilitator, (3) develop an objective and structured process for the initial screening of ESRs, (4) provide extensive training to the stakeholders and the GSWT steering committee to eliminate the need for a facilitator, (5) automate the process as much as possible, (6) create an environment to compile project success factor data on ESRs and move towards a disciplined system that could be used to address supportability threshold issues at KSC, and (7) investigate the possibility of an organization-wide implementation of CROSS.
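
    As an illustration of the AHP component mentioned above, the short Python sketch below derives priority weights from a pairwise-comparison matrix via its principal eigenvector and reports Saaty's consistency index. It is a generic textbook calculation, not the CROSS implementation; the example judgments are hypothetical.

        import numpy as np

        def ahp_priorities(pairwise):
            """Principal-eigenvector priority weights for an AHP pairwise-comparison matrix."""
            A = np.asarray(pairwise, dtype=float)
            eigvals, eigvecs = np.linalg.eig(A)
            k = np.argmax(eigvals.real)
            w = np.abs(eigvecs[:, k].real)
            w /= w.sum()                               # normalised priority vector
            n = A.shape[0]
            ci = (eigvals[k].real - n) / (n - 1)       # Saaty consistency index
            return w, ci

        # Example: three ESRs compared on a single criterion (hypothetical judgments).
        A = [[1, 3, 5],
             [1 / 3, 1, 2],
             [1 / 5, 1 / 2, 1]]
        weights, ci = ahp_priorities(A)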

  11. Optimisation study of a vehicle bumper subsystem with fuzzy parameters

    NASA Astrophysics Data System (ADS)

    Farkas, L.; Moens, D.; Donders, S.; Vandepitte, D.

    2012-10-01

    This paper deals with the design and optimisation for crashworthiness of a vehicle bumper subsystem, which is a key scenario for vehicle component design. The automotive manufacturers and suppliers have to find optimal design solutions for such subsystems that comply with the conflicting requirements of the regulatory bodies regarding functional performance (safety and repairability) and regarding the environmental impact (mass). For the bumper design challenge, an integrated methodology for multi-attribute design engineering of mechanical structures is set up. The integrated process captures the various tasks that are usually performed manually, this way facilitating the automated design iterations for optimisation. Subsequently, an optimisation process is applied that takes the effect of parametric uncertainties into account, such that the system level of failure possibility is acceptable. This optimisation process is referred to as possibility-based design optimisation and integrates the fuzzy FE analysis applied for the uncertainty treatment in crash simulations. This process is the counterpart of the reliability-based design optimisation used in a probabilistic context with statistically defined parameters (variabilities).
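
    To illustrate how fuzzy parameters can be propagated through a response function, the sketch below evaluates the response over alpha-cut intervals of triangular fuzzy numbers using a crude grid search. It is not the fuzzy finite-element analysis used by the authors; the intrusion model, parameter values, and grid-search strategy are purely illustrative assumptions.

        import numpy as np
        from itertools import product

        def alpha_cut(tri, alpha):
            """Interval of a triangular fuzzy number (a, m, b) at membership level alpha."""
            a, m, b = tri
            return a + alpha * (m - a), b - alpha * (b - m)

        def response_range(fuzzy_params, response, alpha, n_grid=5):
            """Bounds of a response over the alpha-cut box (simple grid search; a real
            fuzzy analysis would use an optimisation or transformation method)."""
            axes = [np.linspace(*alpha_cut(p, alpha), n_grid) for p in fuzzy_params]
            vals = [response(x) for x in product(*axes)]
            return min(vals), max(vals)

        # Hypothetical bumper intrusion model: grows with impact severity, shrinks with thickness.
        intrusion = lambda x: 120.0 * x[0] / (x[1] ** 1.5)   # x = (severity factor, wall thickness)
        fuzzy = [(0.9, 1.0, 1.1), (2.4, 2.5, 2.6)]           # triangular fuzzy parameters
        lo, hi = response_range(fuzzy, intrusion, alpha=0.5)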

  12. National Forum on the Future of Automated Materials Processing in US Industry: The Role of Sensors. Report of a workshop (1st) held at Santa Barbara, California on December 16-17, 1985

    NASA Astrophysics Data System (ADS)

    Yolken, H. T.; Mehrabian, R.

    1985-12-01

    These are the proceedings of the workshop A National Forum on the Future of Automated Materials Processing in U.S. Industry - The Role of Sensors. This is the first of two workshops to be sponsored by the Industrial Research Institute and the White House Office of Science and Technology Policy, Committee on Materials Working Group on Automation of Materials Processing. The second workshop will address the other two key components required for automated materials processing: process models and artificial intelligence coupled with computer integration of the system. The objective of these workshops is to identify and assess important issues affecting the competitive position of U.S. industry related to its ability to automate production processes for basic and advanced materials, and to develop approaches for improved capability through cooperative R&D and associated efforts.

  13. Single Pilot Workload Management During Cruise in Entry Level Jets

    NASA Technical Reports Server (NTRS)

    Burian, Barbara K.; Pruchnicki, Shawn; Christopher, Bonny; Silverman, Evan; Hackworth, Carla; Rogers, Jason; Williams, Kevin; Drechsler, Gena; Runnels, Barry; Mead, Andy

    2013-01-01

    Advanced technologies and automation are important facilitators of single pilot operations, but they also contribute to the workload management challenges faced by the pilot. We examined task completion, workload management, and automation use in an entry level jet (ELJ) flown by single pilots. Thirteen certificated Cessna Citation Mustang (C510-S) pilots flew an instrument flight rules (IFR) experimental flight in a Cessna Citation Mustang simulator. At one point participants had to descend to meet a crossing restriction prior to a waypoint and prepare for an instrument approach into an un-towered field while facilitating communication from a lost pilot who was flying too low for ATC to hear. Four participants experienced some sort of difficulty with regard to meeting the crossing restriction and almost half (n=6) had problems associated with the instrument approach. Additional errors were also observed including eight participants landing at the airport with an incorrect altimeter setting.

  14. Nonanalytic Laboratory Automation: A Quarter Century of Progress.

    PubMed

    Hawker, Charles D

    2017-06-01

    Clinical laboratory automation has blossomed since the 1989 AACC meeting, at which Dr. Masahide Sasaki first showed a western audience what his laboratory had implemented. Many diagnostics and other vendors are now offering a variety of automated options for laboratories of all sizes. Replacing manual processing and handling procedures with automation was embraced by the laboratory community because of the obvious benefits of labor savings and improvement in turnaround time and quality. Automation was also embraced by the diagnostics vendors, who saw automation as a means of incorporating the analyzers purchased by their customers into larger systems in which the benefits of automation were integrated with the analyzers. This report reviews the options that are available to laboratory customers. These options include so-called task-targeted automation: modules that range from single-function devices that automate single tasks (e.g., decapping or aliquoting) to multifunction workstations that incorporate several of the functions of a laboratory sample processing department. The options also include total laboratory automation systems that use conveyors to link sample processing functions to analyzers and often include postanalytical features such as refrigerated storage and sample retrieval. Most importantly, this report reviews a recommended process for evaluating the need for new automation and for identifying the specific requirements of a laboratory and developing solutions that can meet those requirements. The report also discusses some of the practical considerations facing a laboratory in a new implementation and reviews the concept of machine vision to replace human inspections. © 2017 American Association for Clinical Chemistry.

  15. Fully automated processing of fMRI data in SPM: from MRI scanner to PACS.

    PubMed

    Maldjian, Joseph A; Baer, Aaron H; Kraft, Robert A; Laurienti, Paul J; Burdette, Jonathan H

    2009-01-01

    Here we describe the Wake Forest University Pipeline, a fully automated method for the processing of fMRI data using SPM. The method includes fully automated data transfer and archiving from the point of acquisition, real-time batch script generation, distributed grid processing, interface to SPM in MATLAB, error recovery and data provenance, DICOM conversion and PACS insertion. It has been used for automated processing of fMRI experiments, as well as for the clinical implementation of fMRI and spin-tag perfusion imaging. The pipeline requires no manual intervention, and can be extended to any studies requiring offline processing.

  16. Hanford Site Composite Analysis Technical Approach Description: Automated Quality Assurance Process Design.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dockter, Randy E.

    2017-07-31

    The U.S. Department of Energy (DOE) in DOE O 435.1 Chg. 1, Radioactive Waste Management, requires the preparation and maintenance of a composite analysis (CA). The primary purpose of the CA is to provide a reasonable expectation that the primary public dose limit is not likely to be exceeded by multiple source terms that may significantly interact with plumes originating at a low-level waste disposal facility. The CA is used to facilitate planning and land use decisions that help assure disposal facility authorization will not result in long-term compliance problems; or, to determine management alternatives, corrective actions, or assessment needs if potential problems are identified.

  17. Communicating Navigation Data Inside the Cassini-Huygens Project: Visualizations and Tools

    NASA Technical Reports Server (NTRS)

    Wagner, Sean V.; Gist, Emily M.; Goodson, Troy D.; Hahn, Yungsun; Stumpf, Paul W.; Williams, Powtawche N.

    2008-01-01

    The Cassini-Huygens Saturn tour poses an interesting navigation challenge. From July 2004 through June 2008, the Cassini orbiter performed 112 of 161 planned maneuvers. This demanding schedule, where maneuvers are often separated by just a few days, motivated the development of maneuver design/analysis automation software tools. Besides generating maneuver designs and presentations, these tools are the mechanism for producing other types of navigation information, which is used to facilitate operational decisions on such issues as maneuver cancellation and alternate maneuver strategies. This paper will discuss the navigation data that are communicated inside the Cassini-Huygens Project, as well as the maneuver software tools behind the processing of the data.

  18. Auditing of SNOMED CT's Hierarchical Structure using the National Drug File - Reference Terminology.

    PubMed

    Zakharchenko, Aleksandr; Geller, James

    2015-01-01

    With the ongoing development in the field of Medical Informatics, the availability of cross-references and the consistency of coverage between terminologies become critical requirements for clinical decision support. In this paper, we examine the possibility of developing a framework that highlights and exposes hierarchical incompatibilities between different medical terminologies in order to facilitate the process of achieving a sufficient level of consistency between terminologies. For the purpose of this research, we are working with the Systematized Nomenclature of Medicine--Clinical Terms (SNOMED CT) and the National Drug File--Reference Terminology (NDF-RT)--a clinical terminology focused on drugs. For discovery of inconsistencies we built an automated tool.
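
    A minimal sketch of the kind of structural check such an auditing framework performs is shown below: parent-child (is-a) pairs from one terminology are tested for ancestor-descendant consistency in the other, given a cross-terminology concept mapping. This is a generic illustration, not the authors' automated tool; the edge-list representation and function names are assumptions.

        import networkx as nx

        def hierarchy_conflicts(edges_a, edges_b, mapping):
            """Report (child, parent) is-a pairs from terminology A whose mapped concepts
            are not in an ancestor-descendant relationship in terminology B."""
            B = nx.DiGraph(edges_b)                  # each edge (child, parent) is an is-a link
            conflicts = []
            for child, parent in edges_a:
                cb, pb = mapping.get(child), mapping.get(parent)
                if cb is None or pb is None or cb not in B or pb not in B:
                    continue                         # no cross-reference available
                if not nx.has_path(B, cb, pb):       # pb is not reachable as an ancestor of cb
                    conflicts.append((child, parent))
            return conflicts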

  19. More steps towards process automation for optical fabrication

    NASA Astrophysics Data System (ADS)

    Walker, David; Yu, Guoyu; Beaucamp, Anthony; Bibby, Matt; Li, Hongyu; McCluskey, Lee; Petrovic, Sanja; Reynolds, Christina

    2017-06-01

    In the context of Industrie 4.0, we have previously described the roles of robots in optical processing, and their complementarity with classical CNC machines, providing both processing and automation functions. After having demonstrated robotic moving of parts between a CNC polisher and metrology station, and auto-fringe-acquisition, we have moved on to automate the wash-down operation. This is part of a wider strategy we describe in this paper, leading towards automating the decision-making operations required before and throughout an optical manufacturing cycle.

  20. The Science of Home Automation

    NASA Astrophysics Data System (ADS)

    Thomas, Brian Louis

    Smart home technologies and the concept of home automation have become more popular in recent years. This popularity has been accompanied by social acceptance of passive sensors installed throughout the home. The subsequent increase in smart homes facilitates the creation of home automation strategies. We believe that home automation strategies can be generated intelligently by utilizing smart home sensors and activity learning. In this dissertation, we hypothesize that home automation can benefit from activity awareness. To test this, we develop our activity-aware smart automation system, CARL (CASAS Activity-aware Resource Learning). CARL learns the associations between activities and device usage from historical data and utilizes the activity-aware capabilities to control the devices. To help validate CARL we deploy and test three different versions of the automation system in a real-world smart environment. To provide a foundation of activity learning, we integrate existing activity recognition and activity forecasting into CARL home automation. We also explore two alternatives to using human-labeled data to train the activity learning models. The first unsupervised method is Activity Detection, and the second is a modified DBSCAN algorithm that utilizes Dynamic Time Warping (DTW) as a distance metric. We compare the performance of activity learning with human-defined labels and with automatically-discovered activity categories. To provide evidence in support of our hypothesis, we evaluate CARL automation in a smart home testbed. Our results indicate that home automation can be boosted through activity awareness. We also find that the resulting automation has a high degree of usability and comfort for the smart home resident.
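
    The modified DBSCAN-with-DTW idea mentioned above can be sketched generically as follows: a dynamic-time-warping distance between numeric sensor-event sequences is precomputed and handed to scikit-learn's DBSCAN. This is not the CARL implementation; the sequence encoding and the eps and min_samples values are illustrative assumptions.

        import numpy as np
        from sklearn.cluster import DBSCAN

        def dtw(a, b):
            """Classic O(len(a)*len(b)) dynamic-time-warping distance between 1-D sequences."""
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[n, m]

        def cluster_days(sequences, eps=5.0, min_samples=3):
            """Cluster daily sensor-event sequences with DBSCAN on a precomputed DTW matrix."""
            n = len(sequences)
            dist = np.zeros((n, n))
            for i in range(n):
                for j in range(i + 1, n):
                    dist[i, j] = dist[j, i] = dtw(sequences[i], sequences[j])
            return DBSCAN(eps=eps, min_samples=min_samples, metric="precomputed").fit_predict(dist)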

  1. Automated Subsystem Control for Life Support System (ASCLSS)

    NASA Technical Reports Server (NTRS)

    Block, Roger F.

    1987-01-01

    The Automated Subsystem Control for Life Support Systems (ASCLSS) program has successfully developed and demonstrated a generic approach to the automation and control of space station subsystems. The automation system features a hierarchical and distributed real-time control architecture which places maximum control authority at the lowest, or process control, level, which enhances system autonomy. The ASCLSS demonstration system pioneered many automation and control concepts currently being considered in the space station data management system (DMS). Heavy emphasis is placed on controls hardware and software commonality implemented in accepted standards. The approach successfully demonstrates the placement of real-time process authority and accountability with the subsystem or process developer. The ASCLSS system completely automates a space station subsystem (the air revitalization group of the ASCLSS), which moves the crew/operator into a role of supervisory control authority. The ASCLSS program developed over 50 lessons learned which will aid future space station developers in the area of automation and controls.

  2. Improvements to the Processing and Characterization of Needled Composite Laminates

    DTIC Science & Technology

    2014-01-01

    the automated processing equipment are shown and discussed. The modifications allow better spatial control at the penetration sites and the ability to...semi-automated processing equipment, commercial off-the-shelf (COTS) needles and COTS aramid mat designed for other applications. Needled material

  3. Knowledge Representation Artifacts for Use in Sensemaking Support Systems

    DTIC Science & Technology

    2015-03-12

    and manual processing must be replaced by automated processing wherever it makes sense and is possible. Clearly, given the data and cognitive...knowledge-centric view to situation analysis and decision-making as previously discussed, has led to the development of several automated processing components...for use in sensemaking support systems [6-11]. In turn, automated processing has required the development of appropriate knowledge

  4. Command and Control Common Semantic Core Required to Enable Net-centric Operations

    DTIC Science & Technology

    2008-05-20

    automated processing capability. A former US Marine Corps component C4 director during Operation Iraqi Freedom identified the problems of 1) uncertainty...interoperability improvements to warfighter community processes, thanks to ubiquitous automated processing, are likely high and somewhat easier to quantify. A...synchronized with the actions of other partners / warfare communities. This requires high-quality information, rapid sharing and automated processing – which

  5. The Automated Office.

    ERIC Educational Resources Information Center

    Naclerio, Nick

    1979-01-01

    Clerical personnel may be able to climb career ladders as a result of office automation and expanded job opportunities in the word processing area. Suggests opportunities in an automated office system and lists books and periodicals on word processing for counselors and teachers. (MF)

  6. lumpR 2.0.0: an R package facilitating landscape discretisation for hillslope-based hydrological models

    NASA Astrophysics Data System (ADS)

    Pilz, Tobias; Francke, Till; Bronstert, Axel

    2017-08-01

    The characteristics of a landscape are essential factors for hydrological processes. Therefore, an adequate representation of the landscape of a catchment in hydrological models is vital. However, many such models exist, differing amongst other things in spatial concept and discretisation. The latter constitutes an essential pre-processing step, for which many different algorithms along with numerous software implementations exist. In that context, existing solutions are often model specific, commercial, or depend on commercial back-end software, and allow only limited workflow automation or none at all. Consequently, a new package for the scientific software and scripting environment R, called lumpR, was developed. lumpR employs an algorithm for hillslope-based landscape discretisation directed at large-scale application via a hierarchical multi-scale approach. The package addresses existing limitations as it is free and open source, easily extendible to other hydrological models, and the workflow can be fully automated. Moreover, it is user-friendly as the direct coupling to a GIS allows for immediate visual inspection and manual adjustment. Sufficient control is furthermore retained via parameter specification and the option to include expert knowledge. Conversely, completely automatic operation also allows for extensive analysis of aspects related to landscape discretisation. In a case study, the application of the package is presented. A sensitivity analysis of the most important discretisation parameters demonstrates its efficient workflow automation. Considering multiple streamflow metrics, the employed model proved reasonably robust to the discretisation parameters. However, parameters determining the sizes of subbasins and hillslopes proved to be more important than the others, including the number of representative hillslopes, the number of attributes employed for the lumping algorithm, and the number of sub-discretisations of the representative hillslopes.

  7. An Intelligent Automation Platform for Rapid Bioprocess Design

    PubMed Central

    Wu, Tianyi

    2014-01-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user’s inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. PMID:24088579

  8. Data for automated, high-throughput microscopy analysis of intracellular bacterial colonies using spot detection.

    PubMed

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N

    2017-10-01

    Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion and intracellular proliferation. An automated, high-throughput microscopy-method was established to quantify the number and size of intracellular bacterial colonies in infected host cells (Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy, Ernstsen et al., 2017 [1]). The infected cells were imaged with a 10× objective and number of intracellular bacterial colonies, their size distribution and the number of cell nuclei were automatically quantified using a spot detection-tool. The spot detection-output was exported to Excel, where data analysis was performed. In this article, micrographs and spot detection data are made available to facilitate implementation of the method.
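
    As a rough open-source stand-in for the spot-detection tool referenced above, the sketch below counts bright spots in a single fluorescence channel with scikit-image's Laplacian-of-Gaussian blob detector. It is not the method used to produce the published dataset; the sigma range and threshold are illustrative assumptions that would need tuning to the 10x micrographs described.

        import numpy as np
        from skimage import io, feature

        def count_colonies(image_path, min_sigma=2, max_sigma=8, threshold=0.05):
            """Detect bright spots (candidate intracellular colonies) in a single
            fluorescence channel and report their count and approximate radii."""
            img = io.imread(image_path, as_gray=True).astype(float)
            img = (img - img.min()) / max(np.ptp(img), 1e-12)    # normalise to [0, 1]
            blobs = feature.blob_log(img, min_sigma=min_sigma,
                                     max_sigma=max_sigma, threshold=threshold)
            radii = blobs[:, 2] * np.sqrt(2)                     # sigma -> approximate radius
            return len(blobs), radii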

  9. Aquatic models, genomics and chemical risk management.

    PubMed

    Cheng, Keith C; Hinton, David E; Mattingly, Carolyn J; Planchart, Antonio

    2012-01-01

    The 5th Aquatic Animal Models for Human Disease meeting follows four previous meetings (Nairn et al., 2001; Schmale, 2004; Schmale et al., 2007; Hinton et al., 2009) in which advances in aquatic animal models for human disease research were reported, and community discussion of future direction was pursued. At this meeting, discussion at a workshop entitled Bioinformatics and Computational Biology with Web-based Resources (20 September 2010) led to an important conclusion: Aquatic model research using feral and experimental fish, in combination with web-based access to annotated anatomical atlases and toxicological databases, yields data that advance our understanding of human gene function, and can be used to facilitate environmental management and drug development. We propose here that the effects of genes and environment are best appreciated within an anatomical context - the specifically affected cells and organs in the whole animal. We envision the use of automated, whole-animal imaging at cellular resolution and computational morphometry facilitated by high-performance computing and automated entry into toxicological databases, as anchors for genetic and toxicological data, and as connectors between human and model system data. These principles should be applied to both laboratory and feral fish populations, which have been virtually irreplaceable sentinels for environmental contamination that results in human morbidity and mortality. We conclude that automation, database generation, and web-based accessibility, facilitated by genomic/transcriptomic data and high-performance and cloud computing, will potentiate the unique and potentially key roles that aquatic models play in advancing systems biology, drug development, and environmental risk management. Copyright © 2011 Elsevier Inc. All rights reserved.

  10. Development Status: Automation Advanced Development Space Station Freedom Electric Power System

    NASA Technical Reports Server (NTRS)

    Dolce, James L.; Kish, James A.; Mellor, Pamela A.

    1990-01-01

    Electric power system automation for Space Station Freedom is intended to operate in a loop. Data from the power system is used for diagnosis and security analysis to generate Operations Management System (OMS) requests, which are sent to an arbiter, which sends a plan to a command generator connected to the electric power system. This viewgraph presentation profiles automation software for diagnosis, scheduling, and constraint interfaces, and simulation to support automation development. The automation development process is diagrammed, and the process of creating Ada and ART versions of the automation software is described.

  11. Containerless automated processing of intermetallic compounds and composites

    NASA Technical Reports Server (NTRS)

    Johnson, D. R.; Joslin, S. M.; Reviere, R. D.; Oliver, B. F.; Noebe, R. D.

    1993-01-01

    An automated containerless processing system has been developed to directionally solidify high temperature materials, intermetallic compounds, and intermetallic/metallic composites. The system incorporates a wide range of ultra-high purity chemical processing conditions. The utilization of image processing for automated control negates the need for temperature measurements for process control. The list of recent systems that have been processed includes Cr, Mo, Mn, Nb, Ni, Ti, V, and Zr containing aluminides. Possible uses of the system, process control approaches, and properties and structures of recently processed intermetallics are reviewed.

  12. Information Fusion for Feature Extraction and the Development of Geospatial Information

    DTIC Science & Technology

    2004-07-01

    of automated processing. 2. Requirements for Geospatial Information Accurate, timely geospatial information is critical for many military...this evaluation illustrates some of the difficulties in comparing manual and automated processing results (figure 5). The automated delineation of

  13. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  14. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  15. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  16. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  17. 45 CFR 309.145 - What costs are allowable for Tribal IV-D programs carried out under § 309.65(a) of this part?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    .... (h) Automated data processing computer systems, including: (1) Planning efforts in the identification, evaluation, and selection of an automated data processing computer system solution meeting the program... existing automated data processing computer system to support Tribal IV-D program operations, and...

  18. MilxXplore: a web-based system to explore large imaging datasets

    PubMed Central

    Bourgeat, P; Dore, V; Villemagne, V L; Rowe, C C; Salvado, O; Fripp, J

    2013-01-01

    Objective As large-scale medical imaging studies are becoming more common, there is an increasing reliance on automated software to extract quantitative information from these images. As the size of the cohorts keeps increasing with large studies, there is also a need for tools that allow results from automated image processing and analysis to be presented in a way that enables fast and efficient quality checking, tagging and reporting on cases in which automatic processing failed or was problematic. Materials and methods MilxXplore is an open source visualization platform, which provides an interface to navigate and explore imaging data in a web browser, giving the end user the opportunity to perform quality control and reporting in a user friendly, collaborative and efficient way. Discussion Compared to existing software solutions that often provide an overview of the results at the subject's level, MilxXplore pools the results of individual subjects and time points together, allowing easy and efficient navigation and browsing through the different acquisitions of a subject over time, and comparing the results against the rest of the population. Conclusions MilxXplore is fast, flexible and allows remote quality checks of processed imaging data, facilitating data sharing and collaboration across multiple locations, and can be easily integrated into a cloud computing pipeline. With the growing trend of open data and open science, such a tool will become increasingly important to share and publish results of imaging analysis. PMID:23775173

  19. DaMold: A data-mining platform for variant annotation and visualization in molecular diagnostics research.

    PubMed

    Pandey, Ram Vinay; Pabinger, Stephan; Kriegner, Albert; Weinhäusel, Andreas

    2017-07-01

    Next-generation sequencing (NGS) has become a powerful and efficient tool for routine mutation screening in clinical research. As each NGS test yields hundreds of variants, the current challenge is to meaningfully interpret the data and select potential candidates. Analyzing each variant while manually investigating several relevant databases to collect specific information is a cumbersome and time-consuming process, and it requires expertise and familiarity with these databases. Thus, a tool that can seamlessly annotate variants with clinically relevant databases under one common interface would be of great help for variant annotation, cross-referencing, and visualization. This tool would allow variants to be processed in an automated and high-throughput manner and facilitate the investigation of variants in several genome browsers. Several analysis tools are available for raw sequencing-read processing and variant identification, but an automated variant filtering, annotation, cross-referencing, and visualization tool is still lacking. To fulfill these requirements, we developed DaMold, a Web-based, user-friendly tool that can filter and annotate variants and can access and compile information from 37 resources. It is easy to use, provides flexible input options, and accepts variants from NGS and Sanger sequencing as well as hotspots in VCF and BED formats. DaMold is available as an online application at http://damold.platomics.com/index.html, and as a Docker container and virtual machine at https://sourceforge.net/projects/damold/. © 2017 Wiley Periodicals, Inc.
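
    The core join that any such annotation tool performs, matching each variant against locally cached resources, can be sketched as below. This is a generic illustration, not DaMold's implementation; the tab-separated column names (chrom, pos, ref, alt) and the file layout are assumptions.

        import csv

        def load_annotations(tsv_path):
            """Load a local annotation lookup keyed on (chrom, pos, ref, alt); columns hypothetical."""
            table = {}
            with open(tsv_path) as fh:
                for row in csv.DictReader(fh, delimiter="\t"):
                    table[(row["chrom"], row["pos"], row["ref"], row["alt"])] = row
            return table

        def annotate_vcf(vcf_path, annotations):
            """Yield (variant, annotation-or-None) for each record in a minimal VCF file."""
            with open(vcf_path) as fh:
                for line in fh:
                    if line.startswith("#"):
                        continue
                    chrom, pos, _id, ref, alt = line.rstrip("\n").split("\t")[:5]
                    yield (chrom, pos, ref, alt), annotations.get((chrom, pos, ref, alt))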

  20. Automated workflows for modelling chemical fate, kinetics and toxicity.

    PubMed

    Sala Benito, J V; Paini, Alicia; Richarz, Andrea-Nicole; Meinl, Thorsten; Berthold, Michael R; Cronin, Mark T D; Worth, Andrew P

    2017-12-01

    Automation is universal in today's society, from operating equipment such as machinery, in factory processes, to self-parking automobile systems. While these examples show the efficiency and effectiveness of automated mechanical processes, automated procedures that support the chemical risk assessment process are still in their infancy. Future human safety assessments will rely increasingly on the use of automated models, such as physiologically based kinetic (PBK) and dynamic models and the virtual cell based assay (VCBA). These biologically-based models will be coupled with chemistry-based prediction models that also automate the generation of key input parameters such as physicochemical properties. The development of automated software tools is an important step in harmonising and expediting the chemical safety assessment process. In this study, we illustrate how the KNIME Analytics Platform can be used to provide a user-friendly graphical interface for these biokinetic models, such as PBK models and VCBA, which simulate the fate of chemicals in vivo within the body and in vitro test systems respectively. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Adaptive automation of human-machine system information-processing functions.

    PubMed

    Kaber, David B; Wright, Melanie C; Prinzel, Lawrence J; Clamann, Michael P

    2005-01-01

    The goal of this research was to describe the ability of human operators to interact with adaptive automation (AA) applied to various stages of complex systems information processing, defined in a model of human-automation interaction. Forty participants operated a simulation of an air traffic control task. Automated assistance was adaptively applied to information acquisition, information analysis, decision making, and action implementation aspects of the task based on operator workload states, which were measured using a secondary task. The differential effects of the forms of automation were determined and compared with a manual control condition. Results of two 20-min trials of AA or manual control revealed a significant effect of the type of automation on performance, particularly during manual control periods as part of the adaptive conditions. Humans appear to better adapt to AA applied to sensory and psychomotor information-processing functions (action implementation) than to AA applied to cognitive functions (information analysis and decision making), and AA is superior to completely manual control. Potential applications of this research include the design of automation to support air traffic controller information processing.
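
    A workload-triggered allocation rule of the general kind used to drive adaptive automation can be sketched as follows; the thresholds, the hysteresis, and the normalised workload scale are illustrative assumptions and do not reproduce the triggering logic of this study.

        def allocate_automation(workload, automated, high=0.7, low=0.4):
            """Decide whether one information-processing stage (e.g. information analysis)
            should be handled by automation, based on a normalised workload estimate from
            a secondary task; hysteresis between `low` and `high` avoids rapid switching."""
            if workload > high:
                return True           # invoke automated assistance
            if workload < low:
                return False          # return the stage to manual control
            return automated          # otherwise keep the current allocation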

  2. Automated method for the rapid and precise estimation of adherent cell culture characteristics from phase contrast microscopy images.

    PubMed

    Jaccard, Nicolas; Griffin, Lewis D; Keser, Ana; Macown, Rhys J; Super, Alexandre; Veraitch, Farlan S; Szita, Nicolas

    2014-03-01

    The quantitative determination of key adherent cell culture characteristics such as confluency, morphology, and cell density is necessary for the evaluation of experimental outcomes and to provide a suitable basis for the establishment of robust cell culture protocols. Automated processing of images acquired using phase contrast microscopy (PCM), an imaging modality widely used for the visual inspection of adherent cell cultures, could enable the non-invasive determination of these characteristics. We present an image-processing approach that accurately detects cellular objects in PCM images through a combination of local contrast thresholding and post hoc correction of halo artifacts. The method was thoroughly validated using a variety of cell lines, microscope models and imaging conditions, demonstrating consistently high segmentation performance in all cases and very short processing times (<1 s per 1,208 × 960 pixels image). Based on the high segmentation performance, it was possible to precisely determine culture confluency, cell density, and the morphology of cellular objects, demonstrating the wide applicability of our algorithm for typical microscopy image processing pipelines. Furthermore, PCM image segmentation was used to facilitate the interpretation and analysis of fluorescence microscopy data, enabling the determination of temporal and spatial expression patterns of a fluorescent reporter. We created a software toolbox (PHANTAST) that bundles all the algorithms and provides an easy to use graphical user interface. Source-code for MATLAB and ImageJ is freely available under a permissive open-source license. © 2013 The Authors. Biotechnology and Bioengineering Published by Wiley Periodicals, Inc.
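
    The local-contrast idea behind such PCM segmentation can be sketched with a simple moving-window standard deviation, as below. This is only a rough illustration, not the PHANTAST algorithm (in particular, the halo-artifact correction is omitted), and the window radius and threshold factor are assumptions.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def local_contrast_mask(img, radius=7, k=0.03):
            """Flag pixels whose local standard deviation exceeds a threshold; in PCM,
            cell regions are textured while background is flat, so high local contrast
            is a reasonable (rough) proxy for 'covered by cells'."""
            img = img.astype(float)
            size = 2 * radius + 1
            mean = uniform_filter(img, size)
            mean_sq = uniform_filter(img * img, size)
            std = np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))
            return std > k * (img.max() - img.min())

        def confluency(mask):
            """Fraction of the field of view covered by detected cellular objects."""
            return float(mask.mean())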

  3. Automated Method for the Rapid and Precise Estimation of Adherent Cell Culture Characteristics from Phase Contrast Microscopy Images

    PubMed Central

    Jaccard, Nicolas; Griffin, Lewis D; Keser, Ana; Macown, Rhys J; Super, Alexandre; Veraitch, Farlan S; Szita, Nicolas

    2014-01-01

    The quantitative determination of key adherent cell culture characteristics such as confluency, morphology, and cell density is necessary for the evaluation of experimental outcomes and to provide a suitable basis for the establishment of robust cell culture protocols. Automated processing of images acquired using phase contrast microscopy (PCM), an imaging modality widely used for the visual inspection of adherent cell cultures, could enable the non-invasive determination of these characteristics. We present an image-processing approach that accurately detects cellular objects in PCM images through a combination of local contrast thresholding and post hoc correction of halo artifacts. The method was thoroughly validated using a variety of cell lines, microscope models and imaging conditions, demonstrating consistently high segmentation performance in all cases and very short processing times (<1 s per 1,208 × 960 pixels image). Based on the high segmentation performance, it was possible to precisely determine culture confluency, cell density, and the morphology of cellular objects, demonstrating the wide applicability of our algorithm for typical microscopy image processing pipelines. Furthermore, PCM image segmentation was used to facilitate the interpretation and analysis of fluorescence microscopy data, enabling the determination of temporal and spatial expression patterns of a fluorescent reporter. We created a software toolbox (PHANTAST) that bundles all the algorithms and provides an easy to use graphical user interface. Source-code for MATLAB and ImageJ is freely available under a permissive open-source license. Biotechnol. Bioeng. 2014;111: 504–517. © 2013 Wiley Periodicals, Inc. PMID:24037521

  4. In-depth investigation of spin-on doped solar cells with thermally grown oxide passivation

    NASA Astrophysics Data System (ADS)

    Ahmad, Samir Mahmmod; Cheow, Siu Leong; Ludin, Norasikin A.; Sopian, K.; Zaidi, Saleem H.

    Solar cell industrial manufacturing, based largely on proven semiconductor processing technologies supported by significant advancements in automation, has reached a plateau in terms of cost and efficiency. However, solar cell manufacturing cost (dollars/watt) is still substantially higher than that of fossil fuels. The route to lowering cost may not lie with continuing automation and economies of scale. Alternate fabrication processes with lower cost and environmental sustainability, coupled with self-reliance, simplicity, and affordability, may lead to price compatibility with carbon-based fuels. In this paper, a custom-designed formulation of phosphoric acid has been investigated, for n-type doping in p-type substrates, as a function of concentration and drive-in temperature. For post-diffusion surface passivation and anti-reflection, thermally-grown oxide films of 50-150-nm thickness were grown. These fabrication methods facilitate process simplicity, reduced costs, and environmental sustainability by elimination of poisonous chemicals and toxic gases (POCl3, SiH4, NH3). A simultaneous fire-through contact formation process, based on screen-printed front-surface Ag and back-surface contacts fired through the thermally grown oxide films, was optimized as a function of the peak temperature in a conveyor belt furnace. The highest-efficiency solar cells fabricated exhibited an efficiency of ∼13%. Analysis of results based on internal quantum efficiency and minority carrier measurements reveals three contributing factors: high front surface recombination, low minority carrier lifetime, and higher reflection. Solar cell simulations based on PC1D showed that, with improved passivation, lower reflection, and higher lifetimes, efficiency can be enhanced to match commercially-produced PECVD SiN-coated solar cells.

  5. PRISM, Processing and Review Interface for Strong Motion Data Software

    NASA Astrophysics Data System (ADS)

    Kalkan, E.; Jones, J. M.; Stephens, C. D.; Ng, P.

    2016-12-01

    A continually increasing number of high-quality digital strong-motion records from stations of the National Strong Motion Project (NSMP) of the U.S. Geological Survey (USGS), as well as data from regional seismic networks within the U.S., calls for automated processing of strong-motion records with human review limited to selected significant or flagged records. The NSMP has developed the Processing and Review Interface for Strong Motion data (PRISM) software to meet this need. PRISM automates the processing of strong-motion records by providing batch-processing capabilities. The PRISM software is platform-independent (coded in Java), open-source, and does not depend on any closed-source or proprietary software. The software consists of two major components: a record processing engine composed of modules for each processing step, and a graphical user interface (GUI) for manual review and processing. To facilitate the use by non-NSMP earthquake engineers and scientists, PRISM (both its processing engine and GUI components) is easy to install and run as a stand-alone system on common operating systems such as Linux, OS X and Windows. PRISM was designed to be flexible and extensible in order to accommodate implementation of new processing techniques. Input to PRISM currently is limited to data files in the Consortium of Organizations for Strong-Motion Observation Systems (COSMOS) V0 format, so that all retrieved acceleration time series need to be converted to this format. Output products include COSMOS V1, V2 and V3 files as: (i) raw acceleration time series in physical units with mean removed (V1), (ii) baseline-corrected and filtered acceleration, velocity, and displacement time series (V2), and (iii) response spectra, Fourier amplitude spectra and common earthquake-engineering intensity measures (V3). A thorough description of the record processing features supported by PRISM is presented with examples and validation results. All computing features have been thoroughly tested.
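
    A stripped-down version of the V1-to-V2 processing chain described above (mean removal, band-pass filtering, and integration to velocity and displacement) is sketched below with SciPy. It omits baseline correction and the COSMOS file handling, and the filter order and corner frequencies are illustrative assumptions rather than PRISM's defaults.

        import numpy as np
        from scipy import signal, integrate

        def process_acceleration(acc, dt, flo=0.1, fhi=25.0):
            """Demean, band-pass filter, and integrate a raw acceleration record to
            velocity and displacement (a simplified V1-to-V2-style chain)."""
            v1 = np.asarray(acc, dtype=float) - np.mean(acc)       # mean-removed acceleration
            sos = signal.butter(4, [flo, fhi], btype="bandpass", fs=1.0 / dt, output="sos")
            v2_acc = signal.sosfiltfilt(sos, v1)                   # zero-phase band-pass filter
            vel = integrate.cumulative_trapezoid(v2_acc, dx=dt, initial=0.0)
            disp = integrate.cumulative_trapezoid(vel, dx=dt, initial=0.0)
            return v1, v2_acc, vel, disp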

  6. Effectiveness of a web-based automated cell distribution system.

    PubMed

    Niland, Joyce C; Stiller, Tracey; Cravens, James; Sowinski, Janice; Kaddis, John; Qian, Dajun

    2010-01-01

    In recent years, industries have turned to the field of operations research to help improve the efficiency of production and distribution processes. Largely absent is the application of this methodology to biological materials, such as the complex and costly procedure of human pancreas procurement and islet isolation. Pancreatic islets are used for basic science research and in a promising form of cell replacement therapy for a subset of patients afflicted with severe type 1 diabetes mellitus. Having an accurate and reliable system for cell distribution is therefore crucial. The Islet Cell Resource Center Consortium was formed in 2001 as the first and largest cooperative group of islet production and distribution facilities in the world. We previously reported on the development of a Matching Algorithm for Islet Distribution (MAID), an automated web-based tool used to optimize the distribution of human pancreatic islets by matching investigator requests to islet characteristics. This article presents an assessment of that algorithm and compares it to the manual distribution process used prior to MAID. A comparison was done using an investigator's ratio of the number of islets received divided by the number requested pre- and post-MAID. Although the supply of islets increased between the pre- versus post-MAID period, the median received-to-requested ratio remained around 60% due to an increase in demand post-MAID. A significantly smaller variation in the received-to-requested ratio was achieved in the post- versus pre-MAID period. In particular, the undesirable outcome of providing users with more islets than requested, ranging up to four times their request, was greatly reduced through the algorithm. In conclusion, this analysis demonstrates, for the first time, the effectiveness of using an automated web-based cell distribution system to facilitate efficient and consistent delivery of human pancreatic islets by enhancing the islet matching process.
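
    The evaluation metric used above, the received-to-requested ratio and its spread, is simple to compute; a minimal sketch follows. The function name and the choice of interquartile range as the spread measure are assumptions, not the statistics reported in the study.

        import numpy as np

        def ratio_stats(received, requested):
            """Received-to-requested ratios across shipments and summary measures of
            their central tendency, spread, and over-supply frequency."""
            r = np.asarray(received, dtype=float) / np.asarray(requested, dtype=float)
            return {"median": float(np.median(r)),
                    "iqr": float(np.percentile(r, 75) - np.percentile(r, 25)),
                    "over_supplied": int((r > 1.0).sum())}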

  7. Automation and Intensity Modulated Radiation Therapy for Individualized High-Quality Tangent Breast Treatment Plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Purdie, Thomas G., E-mail: Tom.Purdie@rmp.uhn.on.ca; Department of Radiation Oncology, University of Toronto, Toronto, Ontario; Techna Institute, University Health Network, Toronto, Ontario

    Purpose: To demonstrate the large-scale clinical implementation and performance of an automated treatment planning methodology for tangential breast intensity modulated radiation therapy (IMRT). Methods and Materials: Automated planning was used to prospectively plan tangential breast IMRT treatment for 1661 patients between June 2009 and November 2012. The automated planning method emulates the manual steps performed by the user during treatment planning, including anatomical segmentation, beam placement, optimization, dose calculation, and plan documentation. The user specifies clinical requirements of the plan to be generated through a user interface embedded in the planning system. The automated method uses heuristic algorithms to define and simplify the technical aspects of the treatment planning process. Results: Automated planning was used in 1661 of 1708 patients receiving tangential breast IMRT during the time interval studied. Therefore, automated planning was applicable in greater than 97% of cases. The time for treatment planning using the automated process is routinely 5 to 6 minutes on standard commercially available planning hardware. We have shown a consistent reduction in plan rejections from plan reviews through the standard quality control process or weekly quality review multidisciplinary breast rounds as we have automated the planning process for tangential breast IMRT. Clinical plan acceptance increased from 97.3% using our previous semiautomated inverse method to 98.9% using the fully automated method. Conclusions: Automation has become the routine standard method for treatment planning of tangential breast IMRT at our institution and is clinically feasible on a large scale. The method has wide clinical applicability and can add tremendous efficiency, standardization, and quality to the current treatment planning process. The use of automated methods can allow centers to more rapidly adopt IMRT and enhance access to the documented improvements in care for breast cancer patients, using technologies that are widely available and already in clinical use.

  8. Automation and intensity modulated radiation therapy for individualized high-quality tangent breast treatment plans.

    PubMed

    Purdie, Thomas G; Dinniwell, Robert E; Fyles, Anthony; Sharpe, Michael B

    2014-11-01

    To demonstrate the large-scale clinical implementation and performance of an automated treatment planning methodology for tangential breast intensity modulated radiation therapy (IMRT). Automated planning was used to prospectively plan tangential breast IMRT treatment for 1661 patients between June 2009 and November 2012. The automated planning method emulates the manual steps performed by the user during treatment planning, including anatomical segmentation, beam placement, optimization, dose calculation, and plan documentation. The user specifies clinical requirements of the plan to be generated through a user interface embedded in the planning system. The automated method uses heuristic algorithms to define and simplify the technical aspects of the treatment planning process. Automated planning was used in 1661 of 1708 patients receiving tangential breast IMRT during the time interval studied. Therefore, automated planning was applicable in greater than 97% of cases. The time for treatment planning using the automated process is routinely 5 to 6 minutes on standard commercially available planning hardware. We have shown a consistent reduction in plan rejections from plan reviews through the standard quality control process or weekly quality review multidisciplinary breast rounds as we have automated the planning process for tangential breast IMRT. Clinical plan acceptance increased from 97.3% using our previous semiautomated inverse method to 98.9% using the fully automated method. Automation has become the routine standard method for treatment planning of tangential breast IMRT at our institution and is clinically feasible on a large scale. The method has wide clinical applicability and can add tremendous efficiency, standardization, and quality to the current treatment planning process. The use of automated methods can allow centers to more rapidly adopt IMRT and enhance access to the documented improvements in care for breast cancer patients, using technologies that are widely available and already in clinical use. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. Trust in automation: designing for appropriate reliance.

    PubMed

    Lee, John D; See, Katrina A

    2004-01-01

    Automation is often problematic because people fail to rely upon it appropriately. Because people respond to technology socially, trust influences reliance on automation. In particular, trust guides reliance when complexity and unanticipated situations make a complete understanding of the automation impractical. This review considers trust from the organizational, sociological, interpersonal, psychological, and neurological perspectives. It considers how the context, automation characteristics, and cognitive processes affect the appropriateness of trust. The context in which the automation is used influences automation performance and provides a goal-oriented perspective to assess automation characteristics along a dimension of attributional abstraction. These characteristics can influence trust through analytic, analogical, and affective processes. The challenges of extrapolating the concept of trust in people to trust in automation are discussed. A conceptual model integrates research regarding trust in automation and describes the dynamics of trust, the role of context, and the influence of display characteristics. Actual or potential applications of this research include improved designs of systems that require people to manage imperfect automation.

  10. Automated manufacturing process for DEAP stack-actuators

    NASA Astrophysics Data System (ADS)

    Tepel, Dominik; Hoffstadt, Thorben; Maas, Jürgen

    2014-03-01

    Dielectric elastomers (DE) are thin polymer films belonging to the class of electroactive polymers (EAP), which are coated with compliant and conductive electrodes on each side. Under the influence of an electric field, dielectric elastomers undergo large deformations. In this contribution, a manufacturing process for automatically fabricated stack-actuators based on dielectric electroactive polymers (DEAP) is presented. First the specific design of the considered stack-actuator is explained, and afterwards the development, construction and realization of an automated manufacturing process is presented in detail. By applying this automated process, stack-actuators with reproducible and homogeneous properties can be manufactured. Finally, the first DEAP actuator modules fabricated by this process are validated experimentally.

  11. An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques

    DTIC Science & Technology

    2018-01-09

    ARL-TR-8272 ● JAN 2018 ● US Army Research Laboratory. Technical report presenting an automated energy detection algorithm based on morphological and statistical processing techniques.

  12. MAAMD: a workflow to standardize meta-analyses and comparison of affymetrix microarray data

    PubMed Central

    2014-01-01

    Background Mandatory deposit of raw microarray data files for public access, prior to study publication, provides significant opportunities to conduct new bioinformatics analyses within and across multiple datasets. Analysis of raw microarray data files (e.g. Affymetrix CEL files) can be time consuming, complex, and requires fundamental computational and bioinformatics skills. The development of analytical workflows to automate these tasks simplifies the processing of, improves the efficiency of, and serves to standardize multiple and sequential analyses. Once installed, workflows facilitate the tedious steps required to run rapid intra- and inter-dataset comparisons. Results We developed a workflow to facilitate and standardize Meta-Analysis of Affymetrix Microarray Data analysis (MAAMD) in Kepler. Two freely available stand-alone software tools, R and AltAnalyze, were embedded in MAAMD. The inputs of MAAMD are user-editable csv files, which contain sample information and parameters describing the locations of input files and required tools. MAAMD was tested by analyzing four different GEO datasets from mice and Drosophila. MAAMD automates data downloading, data organization, data quality control assessment, differential gene expression analysis, clustering analysis, pathway visualization, gene-set enrichment analysis, and cross-species orthologous-gene comparisons. MAAMD was utilized to identify gene orthologues responding to hypoxia or hyperoxia in both mice and Drosophila. The entire set of analyses for the four datasets (34 total microarrays) finished in approximately one hour. Conclusions MAAMD saves time, minimizes the required computer skills, and offers a standardized procedure for users to analyze microarray datasets and make new intra- and inter-dataset comparisons. PMID:24621103
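
    As an illustrative aside, the sketch below parses the kind of user-editable sample sheet the abstract describes and groups samples by dataset accession before any downstream steps; the column names and accessions are made up for the example and are not MAAMD's actual input format.

      # group_samples.py -- minimal sketch, assuming a CSV with columns
      # "dataset", "sample_id" and "condition" (hypothetical layout).
      import csv
      import io
      from collections import defaultdict

      SAMPLE_SHEET = io.StringIO(
          "dataset,sample_id,condition\n"
          "GSE00001,GSM101,hypoxia\n"
          "GSE00001,GSM102,control\n"
          "GSE00002,GSM201,hyperoxia\n"
      )

      def group_samples(handle):
          """Group sample IDs and conditions by dataset accession."""
          groups = defaultdict(list)
          for row in csv.DictReader(handle):
              groups[row["dataset"]].append((row["sample_id"], row["condition"]))
          return dict(groups)

      for dataset, samples in group_samples(SAMPLE_SHEET).items():
          print(dataset, "->", samples)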

  13. Human mixed lymphocyte cultures. Evaluation of microculture technique utilizing the multiple automated sample harvester (MASH)

    PubMed Central

    Thurman, G. B.; Strong, D. M.; Ahmed, A.; Green, S. S.; Sell, K. W.; Hartzman, R. J.; Bach, F. H.

    1973-01-01

    Use of lymphocyte cultures for in vitro studies such as pretransplant histocompatibility testing has established the need for standardization of this technique. A microculture technique has been developed that has facilitated the culturing of lymphocytes and increased the number of cultures that can feasibly be run, while lowering the variation between replicate samples. Cultures were prepared for determination of tritiated thymidine incorporation using a Multiple Automated Sample Harvester (MASH). Using this system, the parameters that influence the in vitro responsiveness of human lymphocytes to allogeneic lymphocytes have been investigated. PMID:4271568

  14. Windows Terminal Servers Orchestration

    NASA Astrophysics Data System (ADS)

    Bukowiec, Sebastian; Gaspar, Ricardo; Smith, Tim

    2017-10-01

    Windows Terminal Servers provide application gateways for various parts of the CERN accelerator complex, used by hundreds of CERN users every day. The combination of new tools such as Puppet, HAProxy and the Microsoft System Center suite enables automation of provisioning workflows to provide a terminal server infrastructure that can scale up and down in an automated manner. The orchestration not only reduces the time and effort necessary to deploy new instances, but also facilitates operations such as patching, analysis and recreation of compromised nodes, as well as catering for workload peaks.

  15. Oxygen-controlled automated neural differentiation of mouse embryonic stem cells.

    PubMed

    Mondragon-Teran, Paul; Tostoes, Rui; Mason, Chris; Lye, Gary J; Veraitch, Farlan S

    2013-03-01

    Automation and oxygen tension control are two tools that provide significant improvements to the reproducibility and efficiency of stem cell production processes. The aim of this study was to establish a novel automation platform capable of controlling oxygen tension during both the cell-culture and liquid-handling steps of neural differentiation processes. We built a bespoke automation platform, which enclosed a liquid-handling platform in a sterile, oxygen-controlled environment. An airtight connection was used to transfer cell culture plates to and from an automated oxygen-controlled incubator. Our results demonstrate that our system yielded comparable cell numbers, viabilities, metabolism profiles and differentiation efficiencies when compared with traditional manual processes. Interestingly, eliminating exposure to ambient conditions during the liquid-handling stage resulted in significant improvements in the yield of MAP2-positive neural cells, indicating that this level of control can improve differentiation processes. This article describes, for the first time, an automation platform capable of maintaining oxygen tension control during both the cell-culture and liquid-handling stages of a 2D embryonic stem cell differentiation process.

  16. ISSLS PRIZE IN BIOENGINEERING SCIENCE 2017: Automation of reading of radiological features from magnetic resonance images (MRIs) of the lumbar spine without human intervention is comparable with an expert radiologist.

    PubMed

    Jamaludin, Amir; Lootus, Meelis; Kadir, Timor; Zisserman, Andrew; Urban, Jill; Battié, Michele C; Fairbank, Jeremy; McCall, Iain

    2017-05-01

    Investigation of the automation of radiological features from magnetic resonance images (MRIs) of the lumbar spine. To automate the process of grading lumbar intervertebral discs and vertebral bodies from MRIs. MR imaging is the most common imaging technique used in investigating low back pain (LBP). Various features of degradation, based on MRIs, are commonly recorded and graded, e.g., Modic change and Pfirrmann grading of intervertebral discs. Consistent scoring and grading is important for developing robust clinical systems and research. Automation facilitates this consistency and reduces the time of radiological analysis considerably and hence the expense. 12,018 intervertebral discs, from 2,009 patients, were graded by a radiologist and were then used to train: (1) a system to detect and label vertebrae and discs in a given scan, and (2) a convolutional neural network (CNN) model that predicts several radiological gradings. The performance of the model, in terms of class average accuracy, was compared with the intra-observer class average accuracy of the radiologist. The detection system achieved 95.6% accuracy in terms of disc detection and labeling. The model is able to produce predictions of multiple pathological gradings that consistently matched those of the radiologist. The model identifies 'Evidence Hotspots', the voxels that most contribute to the degradation scores. Automation of radiological grading is now on par with human performance. The system can be beneficial in aiding clinical diagnoses in terms of objectivity of gradings and the speed of analysis. It can also draw the attention of a radiologist to regions of degradation. This objectivity and speed are an important stepping stone in the investigation of the relationship between MRIs and clinical diagnoses of back pain in large cohorts. Level 3.

  17. Augmenting team cognition in human-automation teams performing in complex operational environments.

    PubMed

    Cuevas, Haydee M; Fiore, Stephen M; Caldwell, Barrett S; Strater, Laura

    2007-05-01

    There is a growing reliance on automation (e.g., intelligent agents, semi-autonomous robotic systems) to effectively execute increasingly cognitively complex tasks. Successful team performance for such tasks has become even more dependent on team cognition, addressing both human-human and human-automation teams. Team cognition can be viewed as the binding mechanism that produces coordinated behavior within experienced teams, emerging from the interplay between each team member's individual cognition and team process behaviors (e.g., coordination, communication). In order to better understand team cognition in human-automation teams, team performance models need to address issues surrounding the effect of human-agent and human-robot interaction on critical team processes such as coordination and communication. Toward this end, we present a preliminary theoretical framework illustrating how the design and implementation of automation technology may influence team cognition and team coordination in complex operational environments. Integrating constructs from organizational and cognitive science, our proposed framework outlines how information exchange and updating between humans and automation technology may affect lower-level (e.g., working memory) and higher-level (e.g., sense making) cognitive processes as well as teams' higher-order "metacognitive" processes (e.g., performance monitoring). Issues surrounding human-automation interaction are discussed and implications are presented within the context of designing automation technology to improve task performance in human-automation teams.

  18. Film/Adhesive Processing Module for Fiber-Placement Processing of Composites

    NASA Technical Reports Server (NTRS)

    Hulcher, A. Bruce

    2007-01-01

    An automated apparatus has been designed and constructed that enables the automated lay-up of composite structures incorporating films, foils, and adhesives during the automated fiber-placement process. This apparatus, denoted a film module, could be used to deposit materials in film or thin sheet form either simultaneously when laying down the fiber composite article or in an independent step.

  19. Agile based "Semi-"Automated Data ingest process : ORNL DAAC example

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S. K.; Beaty, T.; Cook, R. B.; Devarakonda, R.; Hook, L.; Wei, Y.; Wright, D.

    2015-12-01

    The ORNL DAAC archives and publishes data and information relevant to biogeochemical, ecological, and environmental processes. The data archived at the ORNL DAAC must be well formatted, self-descriptive, and documented, as well as referenced in a peer-reviewed publication. The ORNL DAAC ingest team curates diverse data sets from multiple data providers simultaneously. To streamline the ingest process, the data set submission process at the ORNL DAAC has recently been updated to use an agile process, and a semi-automated workflow system has been developed to provide a consistent data provider experience and to create a uniform data product. The goals of the semi-automated agile ingest process are to: (1) provide the ability to track a data set from acceptance to publication; (2) automate steps that can be automated, to improve efficiency and reduce redundancy; (3) update the legacy ingest infrastructure; and (4) provide a centralized system to manage the various aspects of ingest. This talk will cover the agile methodology, workflow, and tools developed through this system.
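
    As a minimal sketch of goal (1), tracking a data set from acceptance to publication, the snippet below models ingest as an ordered set of states; the state names are assumptions for illustration, not the ORNL DAAC's actual workflow stages.

      # ingest_tracker.py -- minimal sketch of tracking a data set through
      # ingest states; the state names are hypothetical.
      from dataclasses import dataclass, field

      STATES = ["accepted", "formatted", "documented", "reviewed", "published"]

      @dataclass
      class DatasetRecord:
          name: str
          state: str = "accepted"
          history: list = field(default_factory=list)

          def advance(self):
              """Move the data set to the next ingest state, recording the transition."""
              idx = STATES.index(self.state)
              if idx + 1 < len(STATES):
                  self.history.append(self.state)
                  self.state = STATES[idx + 1]
              return self.state

      record = DatasetRecord("example_biogeochemistry_dataset")
      while record.state != "published":
          print(record.name, "->", record.advance())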

  20. Automation Bias: Decision Making and Performance in High-Tech Cockpits

    NASA Technical Reports Server (NTRS)

    Mosier, Kathleen L.; Skitka, Linda J.; Heers, Susan; Burdick, Mark; Rosekind, Mark R. (Technical Monitor)

    1997-01-01

    Automated aids and decision support tools are rapidly becoming indispensable tools in high-technology cockpits, and are assuming increasing control of "cognitive" flight tasks, such as calculating fuel-efficient routes, navigating, or detecting and diagnosing system malfunctions and abnormalities. This study was designed to investigate "automation bias," a recently documented factor in the use of automated aids and decision support systems. The term refers to omission and commission errors resulting from the use of automated cues as a heuristic replacement for vigilant information seeking and processing. Glass-cockpit pilots flew flight scenarios involving automation "events," or opportunities for automation-related omission and commission errors. Pilots who perceived themselves as "accountable" for their performance and strategies of interaction with the automation were more likely to double-check automated functioning against other cues, and less likely to commit errors. Pilots were also likely to erroneously "remember" the presence of expected cues when describing their decision-making processes.

  1. Integrating the Allen Brain Institute Cell Types Database into Automated Neuroscience Workflow.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2017-10-01

    We developed software tools to download, extract features, and organize the Cell Types Database from the Allen Brain Institute (ABI) in order to integrate its whole cell patch clamp characterization data into the automated modeling/data analysis cycle. To expand the potential user base we employed both Python and MATLAB. The basic set of tools downloads selected raw data and extracts cell, sweep, and spike features, using ABI's feature extraction code. To facilitate data manipulation we added a tool to build a local specialized database of raw data plus extracted features. Finally, to maximize automation, we extended our NeuroManager workflow automation suite to include these tools plus a separate investigation database. The extended suite allows the user to integrate ABI experimental and modeling data into an automated workflow deployed on heterogeneous computer infrastructures, from local servers, to high performance computing environments, to the cloud. Since our approach is focused on workflow procedures our tools can be modified to interact with the increasing number of neuroscience databases being developed to cover all scales and properties of the nervous system.
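
    The sketch below illustrates the "local specialized database of raw data plus extracted features" idea with a small SQLite table; extract_cell_features() is a hypothetical stand-in for the ABI feature-extraction step, and the stored fields are invented for the example.

      # build_local_db.py -- minimal sketch of a local features database;
      # extract_cell_features() is a placeholder, not the ABI code itself.
      import sqlite3

      def extract_cell_features(specimen_id):
          # Placeholder: in the real workflow this would run the ABI
          # feature-extraction code on downloaded sweeps for this specimen.
          return {"specimen_id": specimen_id, "spike_count": 42, "rest_mv": -70.1}

      def build_db(specimen_ids, db_path="cell_types.sqlite"):
          con = sqlite3.connect(db_path)
          con.execute("CREATE TABLE IF NOT EXISTS features "
                      "(specimen_id INTEGER PRIMARY KEY, spike_count INTEGER, rest_mv REAL)")
          for sid in specimen_ids:
              f = extract_cell_features(sid)
              con.execute("INSERT OR REPLACE INTO features VALUES (?, ?, ?)",
                          (f["specimen_id"], f["spike_count"], f["rest_mv"]))
          con.commit()
          return db_path

      print(build_db([101, 102, 103]))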

  2. A Comparison of Mindray BC-6800, Sysmex XN-2000, and Beckman Coulter LH750 Automated Hematology Analyzers: A Pediatric Study.

    PubMed

    Ciepiela, Olga; Kotuła, Iwona; Kierat, Szymon; Sieczkowska, Sandra; Podsiadłowska, Anna; Jenczelewska, Anna; Księżarczyk, Karolina; Demkow, Urszula

    2016-11-01

    Modern automated laboratory hematology analyzers allow the measurement of over 30 different hematological parameters useful in the diagnostic and clinical interpretation of patient symptoms. They use different methods to measure the same parameters. Thus, a comparison of complete blood counts produced by the Mindray BC-6800, Sysmex XN-2000 and Beckman Coulter LH750 was performed. A comparison of results obtained by automated analysis of 807 anticoagulated blood samples from children and 125 manual microscopic differentiations was performed. This comparative study included white blood cell count, red blood cell count, and erythrocyte indices, as well as platelet count. The present study showed a poor level of agreement in white blood cell enumeration and differentiation among the three automated hematology analyzers under comparison. A very good agreement was found when comparing manual blood smear differentiation with automated granulocyte, monocyte, and lymphocyte differentiation. Red blood cell evaluation showed better agreement between the studied analyzers than white blood cell evaluation. To conclude, the studied instruments did not ensure satisfactory interchangeability and do not permit the substitution of one analyzer for another. © 2016 Wiley Periodicals, Inc.

  3. Automated Terrestrial EMI Emitter Detection, Classification, and Localization

    NASA Astrophysics Data System (ADS)

    Stottler, R.; Ong, J.; Gioia, C.; Bowman, C.; Bhopale, A.

    Clear operating spectrum at ground station antenna locations is critically important for communicating with, commanding, controlling, and maintaining the health of satellites. Electro Magnetic Interference (EMI) can interfere with these communications, so it is extremely important to track down and eliminate sources of EMI. The Terrestrial RFI-locating Automation with CasE based Reasoning (TRACER) system is being implemented to automate terrestrial EMI emitter localization and identification to improve space situational awareness, reduce manpower requirements, dramatically shorten EMI response time, enable the system to evolve without programmer involvement, and support adversarial scenarios such as jamming. The operational version of TRACER is being implemented and applied with real data (power versus frequency over time) for both satellite communication antennas and sweeping Direction Finding (DF) antennas located near them. This paper presents the design and initial implementation of TRACER’s investigation data management, automation, and data visualization capabilities. TRACER monitors DF antenna signals and detects and classifies EMI using neural network technology, trained on past cases of both normal communications and EMI events. When EMI events are detected, an Investigation Object is created automatically. The user interface facilitates the management of multiple investigations simultaneously. Using a variant of the Friis transmission equation, emissions data is used to estimate and plot the emitter’s locations over time for comparison with current flights. The data is also displayed on a set of five linked graphs to aid in the perception of patterns spanning power, time, frequency, and bearing. Based on details of the signal (its classification, direction, and strength, etc.), TRACER retrieves one or more cases of EMI investigation methodologies which are represented as graphical behavior transition networks (BTNs). These BTNs can be edited easily, and they naturally represent the flow-chart-like process often followed by experts in time pressured situations.
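
    As a small worked example of the Friis-based range estimate mentioned above, the function below inverts the free-space Friis transmission equation to turn a received power into a distance; the transmit power, gains, and frequency in the usage line are illustrative assumptions, not values from TRACER.

      # friis_range.py -- minimal sketch of free-space range estimation from
      # received power using the Friis transmission equation.
      import math

      C = 299_792_458.0  # speed of light, m/s

      def estimate_range_m(pr_dbm, pt_dbm, gt_db, gr_db, freq_hz):
          """Invert the Friis equation (free space, ideal antennas) for distance."""
          wavelength = C / freq_hz
          path_loss_db = pt_dbm + gt_db + gr_db - pr_dbm
          return (wavelength / (4.0 * math.pi)) * 10.0 ** (path_loss_db / 20.0)

      # Example: a -90 dBm signal at an assumed 8.1 GHz from an assumed 0 dBm,
      # 0 dBi emitter seen by a 30 dBi ground antenna.
      print(round(estimate_range_m(-90.0, 0.0, 0.0, 30.0, 8.1e9)), "m")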

  4. Toward standardized quantitative image quality (IQ) assessment in computed tomography (CT): A comprehensive framework for automated and comparative IQ analysis based on ICRU Report 87.

    PubMed

    Pahn, Gregor; Skornitzke, Stephan; Schlemmer, Hans-Peter; Kauczor, Hans-Ulrich; Stiller, Wolfram

    2016-01-01

    Based on the guidelines from "Report 87: Radiation Dose and Image-quality Assessment in Computed Tomography" of the International Commission on Radiation Units and Measurements (ICRU), a software framework for automated quantitative image quality analysis was developed and its usability for a variety of scientific questions demonstrated. The extendable framework currently implements the calculation of the recommended Fourier image quality (IQ) metrics modulation transfer function (MTF) and noise-power spectrum (NPS), and additional IQ quantities such as noise magnitude, CT number accuracy, uniformity across the field-of-view, contrast-to-noise ratio (CNR) and signal-to-noise ratio (SNR) of simulated lesions for a commercially available cone-beam phantom. Sample image data were acquired with different scan and reconstruction settings on CT systems from different manufacturers. Spatial resolution is analyzed in terms of edge-spread function, line-spread-function, and MTF. 3D NPS is calculated according to ICRU Report 87, and condensed to 2D and radially averaged 1D representations. Noise magnitude, CT numbers, and uniformity of these quantities are assessed on large samples of ROIs. Low-contrast resolution (CNR, SNR) is quantitatively evaluated as a function of lesion contrast and diameter. Simultaneous automated processing of several image datasets allows for straightforward comparative assessment. The presented framework enables systematic, reproducible, automated and time-efficient quantitative IQ analysis. Consistent application of the ICRU guidelines facilitates standardization of quantitative assessment not only for routine quality assurance, but for a number of research questions, e.g. the comparison of different scanner models or acquisition protocols, and the evaluation of new technology or reconstruction methods. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
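
    A minimal sketch of one of the recommended Fourier metrics, the noise-power spectrum (NPS), computed from a stack of uniform-region ROIs and condensed to a radially averaged 1D curve; this follows the general ICRU Report 87 definition but is not the authors' framework, and the synthetic ROIs stand in for real CT noise data.

      # nps_sketch.py -- 2D NPS from uniform-region ROIs, radially averaged.
      import numpy as np

      def nps_2d(rois, pixel_mm):
          """rois: array (n, N, N) of HU values from uniform regions."""
          n, N, _ = rois.shape
          detrended = rois - rois.mean(axis=(1, 2), keepdims=True)
          dft2 = np.abs(np.fft.fftshift(np.fft.fft2(detrended), axes=(1, 2))) ** 2
          return (pixel_mm ** 2 / (N * N)) * dft2.mean(axis=0)   # HU^2 mm^2

      def radial_average(nps, pixel_mm, n_bins=64):
          N = nps.shape[0]
          f = np.fft.fftshift(np.fft.fftfreq(N, d=pixel_mm))      # cycles/mm
          fx, fy = np.meshgrid(f, f)
          radius = np.hypot(fx, fy).ravel()
          bins = np.linspace(0, radius.max(), n_bins + 1)
          idx = np.digitize(radius, bins) - 1
          prof = np.array([nps.ravel()[idx == i].mean() if np.any(idx == i) else np.nan
                           for i in range(n_bins)])
          centers = 0.5 * (bins[:-1] + bins[1:])
          return centers, prof

      rois = np.random.normal(0.0, 10.0, size=(32, 64, 64))  # synthetic noise ROIs
      freq, nps1d = radial_average(nps_2d(rois, pixel_mm=0.5), pixel_mm=0.5)
      print(freq[:5], nps1d[:5])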

  5. 45 CFR 310.5 - What options are available for Computerized Tribal IV-D Systems and office automation?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... to conduct automated data processing and recordkeeping activities through Office Automation... IV-D Systems and office automation? 310.5 Section 310.5 Public Welfare Regulations Relating to Public... AUTOMATION Requirements for Computerized Tribal IV-D Systems and Office Automation § 310.5 What options are...

  6. An Automation Survival Guide for Media Centers.

    ERIC Educational Resources Information Center

    Whaley, Roger E.

    1989-01-01

    Reviews factors that should affect the decision to automate a school media center and offers suggestions for the automation process. Topics discussed include getting the library collection ready for automation, deciding what automated functions are needed, evaluating software vendors, selecting software, and budgeting. (CLB)

  7. Optimising UAV topographic surveys processed with structure-from-motion: Ground control quality, quantity and bundle adjustment

    NASA Astrophysics Data System (ADS)

    James, Mike R.; Robson, Stuart; d'Oleire-Oltmanns, Sebastian; Niethammer, Uwe

    2016-04-01

    Structure-from-motion (SfM) algorithms are greatly facilitating the production of detailed topographic models based on images collected by unmanned aerial vehicles (UAVs). However, SfM-based software does not generally provide the rigorous photogrammetric analysis required to fully understand survey quality. Consequently, error related to problems in control point data or the distribution of control points can remain undiscovered. Even if these errors are not large in magnitude, they can be systematic, and thus have strong implications for the use of products such as digital elevation models (DEMs) and orthophotos. Here, we develop a Monte Carlo approach to (1) improve the accuracy of products when SfM-based processing is used and (2) reduce the associated field effort by identifying suitable lower density deployments of ground control points. The method highlights over-parameterisation during camera self-calibration and provides enhanced insight into control point performance when rigorous error metrics are not available. Processing was implemented using commonly-used SfM-based software (Agisoft PhotoScan), which we augment with semi-automated and automated GCP image measurement. We apply the Monte Carlo method to two contrasting case studies: an erosion gully survey (Taurodont, Morocco) carried out with a fixed-wing UAV, and an active landslide survey (Super-Sauze, France), acquired using a manually controlled quadcopter. The results highlight the differences in the control requirements for the two sites, and we explore the implications for future surveys. We illustrate DEM sensitivity to critical processing parameters and show how the use of appropriate parameter values increases DEM repeatability and reduces the spatial variability of error due to processing artefacts.
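
    A minimal sketch of the Monte Carlo idea under stated assumptions: repeatedly process the survey with random subsets of ground control points and summarize check-point error; run_sfm_with_gcps() is a hypothetical placeholder for the PhotoScan processing step, not a real API call.

      # gcp_monte_carlo.py -- Monte Carlo over random GCP subsets.
      import random
      import statistics

      def run_sfm_with_gcps(control_ids):
          # Placeholder returning a fake check-point RMSE (m); in practice this
          # would drive the SfM software and compare against withheld GCPs.
          return 0.05 + 0.4 / max(len(control_ids), 1) + random.gauss(0, 0.005)

      def monte_carlo(all_gcps, n_control, n_trials=100):
          rmses = []
          for _ in range(n_trials):
              control = random.sample(all_gcps, n_control)
              rmses.append(run_sfm_with_gcps(control))
          return statistics.mean(rmses), statistics.stdev(rmses)

      gcps = list(range(20))
      for k in (4, 8, 12, 16):
          mean_rmse, sd = monte_carlo(gcps, k)
          print(f"{k:2d} control points: RMSE {mean_rmse:.3f} +/- {sd:.3f} m")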

  8. Flexible Automation System for Determination of Elemental Composition of Incrustations in Clogged Biliary Endoprostheses Using ICP-MS.

    PubMed

    Fleischer, Heidi; Ramani, Kinjal; Blitti, Koffi; Roddelkopf, Thomas; Warkentin, Mareike; Behrend, Detlef; Thurow, Kerstin

    2018-02-01

    Automation systems are well established in industries and life science laboratories, especially in bioscreening and high-throughput applications. An increasing demand for automation solutions can be seen in the field of analytical measurement in chemical synthesis, quality control, and the medical and pharmaceutical fields, as well as in research and development. In this study, an automation solution was developed and optimized for the investigation of new biliary endoprostheses (stents), which should reduce clogging after implantation in the human body. The material inside the stents (incrustations) has to be examined regularly and under identical conditions. The elemental composition is one criterion to be monitored in stent development. The manual procedure was transferred to an automated process including sample preparation, elemental analysis using inductively coupled plasma mass spectrometry (ICP-MS), and data evaluation. Due to safety issues, microwave-assisted acid digestion was executed outside of the automation system. The performance of the automated process was determined and validated. The measurement results and the processing times were compared for both the manual and the automated procedure. Finally, real samples of stent incrustations and pig bile were analyzed using the automation system.

  9. Tests of Spectral Cloud Classification Using DMSP Fine Mode Satellite Data.

    DTIC Science & Technology

    1980-06-02

    processing techniques of potential value. Fourier spectral analysis was identified as the most promising technique to upgrade automated processing of...these measurements on the Earth’s surface is 0.3 n mi. 3. Pickett, R.M., and Blackman, E.S. (1976) Automated Processing of Satellite Imagery Data at Air...and Pickett, R.M. (1977) Automated Processing of Satellite Imagery Data at the Air Force Global Weather Central: Demonstrations of Spectral Analysis

  10. Space station automation study: Automation requirements derived from space manufacturing concepts. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The electroepitaxial process and the Very Large Scale Integration (VLSI) circuits (chips) facilities were chosen because each requires a very high degree of automation, and therefore involved extensive use of teleoperators, robotics, process mechanization, and artificial intelligence. Both cover a raw materials process and a sophisticated multi-step process and are therefore highly representative of the kinds of difficult operation, maintenance, and repair challenges which can be expected for any type of space manufacturing facility. Generic areas were identified which will require significant further study. The initial design will be based on terrestrial state-of-the-art hard automation. One hundred candidate missions were evaluated on the basis of automation potential and availability of meaningful knowledge. The design requirements and unconstrained design concepts developed for the two missions are presented.

  11. Faded-example as a Tool to Acquire and Automate Mathematics Knowledge

    NASA Astrophysics Data System (ADS)

    Retnowati, E.

    2017-04-01

    Knowledge acquisition and automation are accomplished by the students themselves. The teacher plays the role of facilitator by creating mathematics tasks that assist students in building knowledge efficiently and effectively. The cognitive load caused by the learning material presented by teachers should be considered a critical factor. While intrinsic cognitive load is related to the degree of complexity of the learning material, extraneous cognitive load is directly caused by how the material is presented. Strategies for presenting learning material in computational domains like mathematics include the worked example (a fully guided task) and problem solving (a discovery task with no guidance). According to the empirical evidence, learning based on problem solving may cause high extraneous cognitive load for students who have limited prior knowledge, whereas learning based on worked examples may cause high extraneous cognitive load for students who have already mastered the knowledge base. An alternative is the faded example, consisting of a partly completed task. Learning from faded examples can help students who have already acquired some knowledge of the to-be-learned material but still need more practice to automate that knowledge further. This instructional strategy provides a smooth transition from a fully guided task to independent problem solving. Designs of faded examples for learning trigonometry are discussed.

  12. Software to Facilitate Remote Sensing Data Access for Disease Early Warning Systems

    PubMed Central

    Liu, Yi; Hu, Jiameng; Snell-Feikema, Isaiah; VanBemmel, Michael S.; Lamsal, Aashis; Wimberly, Michael C.

    2015-01-01

    Satellite remote sensing produces an abundance of environmental data that can be used in the study of human health. To support the development of early warning systems for mosquito-borne diseases, we developed an open-source, client-based software application to enable the Epidemiological Applications of Spatial Technologies (EASTWeb). Two major design decisions were full automation of the discovery, retrieval and processing of remote sensing data from multiple sources, and making the system easily modifiable in response to changes in data availability and user needs. Key innovations that helped to achieve these goals were the implementation of a software framework for data downloading and the design of a scheduler that tracks the complex dependencies among multiple data processing tasks and makes the system resilient to external errors. EASTWeb has been successfully applied to support forecasting of West Nile virus outbreaks in the United States and malaria epidemics in the Ethiopian highlands. PMID:26644779
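
    The dependency-tracking, error-resilient scheduler is the part most easily illustrated in code; the sketch below runs tasks only once their dependencies are complete and retries failures a bounded number of times. The task names and retry policy are assumptions, not EASTWeb's implementation.

      # task_scheduler.py -- minimal dependency-aware scheduler with retries.
      def run_pipeline(tasks, max_retries=3):
          """tasks: dict name -> (dependencies, callable). Runs tasks whose
          dependencies are satisfied, retrying failures up to max_retries."""
          done = set()
          attempts = {name: 0 for name in tasks}
          while len(done) < len(tasks):
              runnable = [n for n, (deps, _) in tasks.items()
                          if n not in done and set(deps) <= done]
              if not runnable:
                  raise RuntimeError("circular or unsatisfiable dependencies")
              for name in runnable:
                  try:
                      tasks[name][1]()
                      done.add(name)
                  except Exception as err:   # external errors should not kill the run
                      attempts[name] += 1
                      if attempts[name] >= max_retries:
                          raise RuntimeError(f"{name} failed {max_retries} times") from err
          return done

      pipeline = {
          "download": ((), lambda: print("downloading tiles")),
          "reproject": (("download",), lambda: print("reprojecting")),
          "summarize": (("reproject",), lambda: print("zonal summaries")),
      }
      run_pipeline(pipeline)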

  13. Evidence and diagnostic reporting in the IHE context.

    PubMed

    Loef, Cor; Truyen, Roel

    2005-05-01

    Capturing clinical observations and findings during the diagnostic imaging process is increasingly becoming a critical step in diagnostic reporting. Standards developers, notably HL7 and DICOM, are making significant progress toward standards that enable exchanging clinical observations and findings among the various information systems of the healthcare enterprise. DICOM, like the HL7 Clinical Document Architecture (CDA), uses templates and constrained, coded vocabulary (SNOMED, LOINC, etc.). Such a representation facilitates automated software recognition of findings and observations, intrapatient comparison, correlation to norms, and outcomes research. The scope of DICOM Structured Reporting (SR) includes many findings that products routinely create in digital form (measurements, computed estimates, etc.). In the Integrating the Healthcare Enterprise (IHE) framework, two Integration Profiles are defined for clinical data capture and diagnostic reporting: Evidence Document, and Simple Image and Numeric Report. This report describes these two DICOM SR-based integration profiles in the diagnostic reporting process.

  14. Object-oriented software design in semiautomatic building extraction

    NASA Astrophysics Data System (ADS)

    Guelch, Eberhard; Mueller, Hardo

    1997-08-01

    Developing a system for semiautomatic building acquisition is a complex process that requires constant integration and updating of software modules and user interfaces. To facilitate these processes we apply an object-oriented design not only for the data but also for the software involved. We use the unified modeling language (UML) to describe the object-oriented modeling of the system at different levels of detail. We can distinguish between use cases from the user's point of view, which represent a sequence of actions yielding an observable result, and use cases for the programmers, who can use the system as a class library to integrate the acquisition modules into their own software. The structure of the system is based on the model-view-controller (MVC) design pattern. An example from the integration of automated texture extraction for the visualization of results demonstrates the feasibility of this approach.
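
    A minimal sketch of the model-view-controller split referred to above, using a toy "building list"; the class and method names are illustrative and do not reproduce the authors' design.

      # mvc_sketch.py -- toy MVC split: model holds data, views observe it,
      # the controller translates user actions into model updates.
      class BuildingModel:
          def __init__(self):
              self.buildings = []
              self.observers = []

          def add_building(self, footprint):
              self.buildings.append(footprint)
              for view in self.observers:
                  view.refresh(self)

      class TextView:
          def refresh(self, model):
              print(f"{len(model.buildings)} building(s) acquired")

      class Controller:
          def __init__(self, model):
              self.model = model

          def digitize(self, footprint):
              # In a real system this would run the semiautomatic extraction step.
              self.model.add_building(footprint)

      model = BuildingModel()
      model.observers.append(TextView())
      Controller(model).digitize([(0, 0), (10, 0), (10, 8), (0, 8)])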

  15. Application of preprocessing filtering on Decision Tree C4.5 and rough set theory

    NASA Astrophysics Data System (ADS)

    Chan, Joseph C. C.; Lin, Tsau Y.

    2001-03-01

    This paper compares two artificial intelligence methods, the Decision Tree C4.5 and Rough Set Theory, on stock market data. The Decision Tree C4.5 is reviewed alongside Rough Set Theory. An enhanced windowed application is developed to facilitate pre-processing filtering by introducing feature (attribute) transformations, which allow users to input formulas and create new attributes. The application also produces three varieties of data set: with delaying, averaging, and summation. The results demonstrate the improvement that pre-processing with feature (attribute) transformations brings to Decision Tree C4.5. Moreover, the comparison between Decision Tree C4.5 and Rough Set Theory is based on clarity, automation, accuracy, dimensionality, raw data, and speed, and is supported by the rule sets generated by both algorithms on three different sets of data.
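
    The three data-set variants mentioned above (delaying, averaging, summation) map naturally onto lag and rolling-window transformations; the sketch below applies them to a synthetic price series and fits a decision tree. Note that scikit-learn's tree is CART rather than C4.5 and is used here only as a stand-in.

      # preprocess_tree.py -- lag/rolling feature transformations plus a tree fit.
      import numpy as np
      import pandas as pd
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(0)
      prices = pd.Series(100 + rng.normal(0, 1, 300).cumsum(), name="close")

      frame = pd.DataFrame({
          "close": prices,
          "delayed_3": prices.shift(3),            # delaying
          "avg_5": prices.rolling(5).mean(),       # averaging
          "sum_5": prices.rolling(5).sum(),        # summation
      }).dropna()
      frame["up_next_day"] = (frame["close"].shift(-1) > frame["close"]).astype(int)
      frame = frame.iloc[:-1]                      # last row has no "next day"

      features = frame[["delayed_3", "avg_5", "sum_5"]]
      tree = DecisionTreeClassifier(max_depth=3).fit(features, frame["up_next_day"])
      print("training accuracy:", tree.score(features, frame["up_next_day"]))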

  16. Digital microfluidics: A promising technique for biochemical applications

    NASA Astrophysics Data System (ADS)

    Wang, He; Chen, Liguo; Sun, Lining

    2017-12-01

    Digital microfluidics (DMF) is a versatile microfluidics technology that has significant application potential in the areas of automation and miniaturization. In DMF, discrete droplets containing samples and reagents are controlled to implement a series of operations via electrowetting-on-dielectric. This process works by applying electrical potentials to an array of electrodes coated with a hydrophobic dielectric layer. Unlike microchannels, DMF facilitates precise control over multiple reaction processes without using complex pump, microvalve, and tubing networks. DMF also presents other distinct features, such as portability, less sample consumption, shorter chemical reaction time, flexibility, and easier combination with other technology types. Due to its unique advantages, DMF has been applied to a broad range of fields (e.g., chemistry, biology, medicine, and environment). This study reviews the basic principles of droplet actuation, configuration design, and fabrication of the DMF device, as well as discusses the latest progress in DMF from the biochemistry perspective.

  17. STR-validator: an open source platform for validation and process control.

    PubMed

    Hansson, Oskar; Gill, Peter; Egeland, Thore

    2014-11-01

    This paper addresses two problems faced when short tandem repeat (STR) systems are validated for forensic purposes: (1) validation is extremely time consuming and expensive, and (2) there is strong consensus about what to validate but not how. The first problem is solved by powerful data processing functions to automate calculations. Utilising an easy-to-use graphical user interface, strvalidator (hereafter referred to as STR-validator) can greatly increase the speed of validation. The second problem is exemplified by a series of analyses, and subsequent comparison with published material, highlighting the need for a common validation platform. If adopted by the forensic community STR-validator has the potential to standardise the analysis of validation data. This would not only facilitate information exchange but also increase the pace at which laboratories are able to switch to new technology. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  18. Optimized anion exchange column isolation of zirconium-89 ( 89Zr) from yttrium cyclotron target: Method development and implementation on an automated fluidic platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O’Hara, Matthew J.; Murray, Nathaniel J.; Carter, Jennifer C.

    Zirconium-89 (89Zr), produced by the (p, n) reaction from naturally monoisotopic yttrium (natY), is a promising positron emitting isotope for immunoPET imaging. Its long half-life of 78.4 h is sufficient for evaluating slow physiological processes. A prototype automated fluidic system, coupled to on-line and in-line detectors, has been constructed to facilitate development of new 89Zr purification methodologies. The highly reproducible reagent delivery platform and near-real time monitoring of column effluents allows for efficient method optimization. The separation of Zr from dissolved Y metal targets was evaluated using several anion exchange resins. Each resin was evaluated against its ability to quantitatively capture Zr from a load solution high in dissolved Y. The most appropriate anion exchange resin for this application was identified, and the separation method was optimized. The method is capable of a high Y decontamination factor (>10^5) and has been shown to remove Fe, an abundant contaminant in Y foils, from the 89Zr elution fraction. Finally, the method was evaluated using cyclotron bombarded Y foil targets; the method was shown to achieve >95% recovery of the 89Zr present in the foils. The anion exchange column method described here is intended to be the first 89Zr isolation stage in a dual-column purification process.

  19. Optimized anion exchange column isolation of zirconium-89 ( 89Zr) from yttrium cyclotron target: Method development and implementation on an automated fluidic platform

    DOE PAGES

    O’Hara, Matthew J.; Murray, Nathaniel J.; Carter, Jennifer C.; ...

    2018-02-24

    Zirconium-89 (89Zr), produced by the (p, n) reaction from naturally monoisotopic yttrium (natY), is a promising positron emitting isotope for immunoPET imaging. Its long half-life of 78.4 h is sufficient for evaluating slow physiological processes. A prototype automated fluidic system, coupled to on-line and in-line detectors, has been constructed to facilitate development of new 89Zr purification methodologies. The highly reproducible reagent delivery platform and near-real time monitoring of column effluents allows for efficient method optimization. The separation of Zr from dissolved Y metal targets was evaluated using several anion exchange resins. Each resin was evaluated against its ability to quantitatively capture Zr from a load solution high in dissolved Y. The most appropriate anion exchange resin for this application was identified, and the separation method was optimized. The method is capable of a high Y decontamination factor (>10^5) and has been shown to remove Fe, an abundant contaminant in Y foils, from the 89Zr elution fraction. Finally, the method was evaluated using cyclotron bombarded Y foil targets; the method was shown to achieve >95% recovery of the 89Zr present in the foils. The anion exchange column method described here is intended to be the first 89Zr isolation stage in a dual-column purification process.

  20. Towards automated processing of clinical Finnish: sublanguage analysis and a rule-based parser.

    PubMed

    Laippala, Veronika; Ginter, Filip; Pyysalo, Sampo; Salakoski, Tapio

    2009-12-01

    In this paper, we present steps taken towards more efficient automated processing of clinical Finnish, focusing on daily nursing notes in a Finnish Intensive Care Unit (ICU). First, we analyze ICU Finnish as a sublanguage, identifying its specific features facilitating, for example, the development of a specialized syntactic analyser. The identified features include frequent omission of finite verbs, limitations in allowed syntactic structures, and domain-specific vocabulary. Second, we develop a formal grammar and a parser for ICU Finnish, thus providing better tools for the development of further applications in the clinical domain. The grammar is implemented in the LKB system in a typed feature structure formalism. The lexicon is automatically generated based on the output of the FinTWOL morphological analyzer adapted to the clinical domain. As an additional experiment, we study the effect of using Finnish constraint grammar to reduce the size of the lexicon. The parser construction thus makes efficient use of existing resources for Finnish. The grammar currently covers 76.6% of ICU Finnish sentences, producing highly accurate best-parse analyses with an F-score of 91.1%. We find that building a parser for the highly specialized domain sublanguage is not only feasible, but also surprisingly efficient, given an existing morphological analyzer with broad vocabulary coverage. The resulting parser enables a deeper analysis of the text than was previously possible.

  1. Improved compliance by BPM-driven workflow automation.

    PubMed

    Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin

    2014-12-01

    Using methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of model-driven development is improved transparency of processes and the alignment of process diagrams and technical code. First experiences of using the business process model and notation (BPMN) show that easy-to-read graphical process models can achieve and provide standardization of laboratory workflows. The model-based development allows processes to be changed quickly and adapted easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs for quality assurance. BPMN 2.0, as an automation language able to control every kind of activity or subprocess, is directed at complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (i.e., screening procedures or analytical processes). This means that, with the BPM standard, a communication method for sharing process knowledge among laboratories is also available. © 2014 Society for Laboratory Automation and Screening.

  2. PyMS: a Python toolkit for processing of gas chromatography-mass spectrometry (GC-MS) data. Application and comparative study of selected tools.

    PubMed

    O'Callaghan, Sean; De Souza, David P; Isaac, Andrew; Wang, Qiao; Hodkinson, Luke; Olshansky, Moshe; Erwin, Tim; Appelbe, Bill; Tull, Dedreia L; Roessner, Ute; Bacic, Antony; McConville, Malcolm J; Likić, Vladimir A

    2012-05-30

    Gas chromatography-mass spectrometry (GC-MS) is a technique frequently used in targeted and non-targeted measurements of metabolites. Most existing software tools for processing of raw instrument GC-MS data tightly integrate data processing methods with a graphical user interface, facilitating interactive data processing. While interactive processing remains critically important in GC-MS applications, high-throughput studies increasingly dictate the need for command line tools, suitable for scripting of high-throughput, customized processing pipelines. PyMS comprises a library of functions for processing of instrument GC-MS data developed in Python. PyMS currently provides a complete set of GC-MS processing functions, including reading of standard data formats (ANDI-MS/NetCDF and JCAMP-DX), noise smoothing, baseline correction, peak detection, peak deconvolution, peak integration, and peak alignment by dynamic programming. A novel common ion single quantitation algorithm allows automated, accurate quantitation of GC-MS electron impact (EI) fragmentation spectra when a large number of experiments are being analyzed. PyMS implements parallel processing for by-row and by-column data processing tasks based on Message Passing Interface (MPI), allowing processing to scale on multiple CPUs in distributed computing environments. A set of specifically designed experiments was performed in-house and used to comparatively evaluate the performance of PyMS and three widely used software packages for GC-MS data processing (AMDIS, AnalyzerPro, and XCMS). PyMS is a novel software package for the processing of raw GC-MS data, particularly suitable for scripting of customized processing pipelines and for data processing in batch mode. PyMS provides limited graphical capabilities and can be used both for routine data processing and interactive/exploratory data analysis. In real-life GC-MS data processing scenarios PyMS performs as well as or better than leading software packages. We demonstrate data processing scenarios simple to implement in PyMS, yet difficult to achieve with many conventional GC-MS data processing packages. Automated sample processing and quantitation with PyMS can provide substantial time savings compared to more traditional interactive software systems that tightly integrate data processing with the graphical user interface.
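
    To illustrate the processing stages listed above (noise smoothing, baseline correction, peak detection) without reproducing the PyMS API itself, the sketch below applies SciPy equivalents to a synthetic total ion chromatogram; the peak positions, thresholds, and baseline estimate are illustrative assumptions.

      # tic_peaks.py -- smoothing, crude baseline correction and peak detection
      # on a synthetic total ion chromatogram (not the PyMS API).
      import numpy as np
      from scipy.signal import savgol_filter, find_peaks

      rt = np.linspace(0, 20, 2000)                         # retention time, min
      tic = (50 + 2 * rt                                     # drifting baseline
             + 500 * np.exp(-((rt - 5) / 0.08) ** 2)         # peak 1
             + 300 * np.exp(-((rt - 12.5) / 0.1) ** 2)       # peak 2
             + np.random.default_rng(1).normal(0, 5, rt.size))

      smoothed = savgol_filter(tic, window_length=21, polyorder=3)   # noise smoothing
      baseline = np.percentile(smoothed, 10)                          # crude baseline
      corrected = smoothed - baseline
      peaks, props = find_peaks(corrected, height=50, prominence=50)
      print("peaks at", np.round(rt[peaks], 2), "min")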

  3. Problems of Automation and Management Principles Information Flow in Manufacturing

    NASA Astrophysics Data System (ADS)

    Grigoryuk, E. N.; Bulkin, V. V.

    2017-07-01

    Automated control systems for technological processes are complex systems, characterized by elements serving a common overall purpose, the systemic nature of the implemented algorithms for the exchange and processing of information, and a large number of functional subsystems. The article gives examples of automatic control systems and automated process control systems, drawing parallels between them by identifying their strengths and weaknesses, and proposes a non-standard process control system.

  4. An Automated, High-Throughput System for GISAXS and GIWAXS Measurements of Thin Films

    NASA Astrophysics Data System (ADS)

    Schaible, Eric; Jimenez, Jessica; Church, Matthew; Lim, Eunhee; Stewart, Polite; Hexemer, Alexander

    Grazing incidence small-angle X-ray scattering (GISAXS) and grazing incidence wide-angle X-ray scattering (GIWAXS) are important techniques for characterizing thin films. In order to meet rapidly increasing demand, the SAXS/WAXS beamline at the Advanced Light Source (beamline 7.3.3) has implemented a fully automated, high-throughput system to conduct SAXS, GISAXS and GIWAXS measurements. An automated robot arm transfers samples from a holding tray to a measurement stage. Intelligent software aligns each sample in turn, and measures each according to user-defined specifications. Users mail in trays of samples on individually barcoded pucks, and can download and view their data remotely. Data will be pipelined to the NERSC supercomputing facility, and will be available to users via a web portal that facilitates highly parallelized analysis.

  5. Advanced Earth Observation System Instrumentation Study (AEOSIS)

    NASA Technical Reports Server (NTRS)

    Var, R. E.

    1976-01-01

    The feasibility, practicality, and cost are investigated for establishing a national system or grid of artificial landmarks suitable for automated (near real time) recognition in the multispectral scanner imagery data from an earth observation satellite (EOS). The intended use of such landmarks, for orbit determination and improved mapping accuracy, is reviewed. The desirability of using xenon searchlight landmarks for this purpose is explored theoretically and by means of experimental results obtained with LANDSAT 1 and LANDSAT 2. These results are used, in conjunction with the demonstrated efficiency of an automated detection scheme, to determine the size and cost of a xenon searchlight that would be suitable for an EOS Searchlight Landmark Station (SLS), and to facilitate the development of a conceptual design for an automated and environmentally protected EOS SLS.

  6. Automated assembly of oligosaccharides containing multiple cis-glycosidic linkages

    NASA Astrophysics Data System (ADS)

    Hahm, Heung Sik; Hurevich, Mattan; Seeberger, Peter H.

    2016-09-01

    Automated glycan assembly (AGA) has advanced from a concept to a commercial technology that rapidly provides access to diverse oligosaccharide chains as long as 30-mers. To date, AGA was mainly employed to incorporate trans-glycosidic linkages, where C2 participating protecting groups ensure stereoselective couplings. Stereocontrol during the installation of cis-glycosidic linkages cannot rely on C2-participation and anomeric mixtures are typically formed. Here, we demonstrate that oligosaccharides containing multiple cis-glycosidic linkages can be prepared efficiently by AGA using monosaccharide building blocks equipped with remote participating protecting groups. The concept is illustrated by the automated syntheses of biologically relevant oligosaccharides bearing various cis-galactosidic and cis-glucosidic linkages. This work provides further proof that AGA facilitates the synthesis of complex oligosaccharides with multiple cis-linkages and other biologically important oligosaccharides.

  7. Public Library Automation Report: 1984.

    ERIC Educational Resources Information Center

    Gotanda, Masae

    Data processing was introduced to public libraries in Hawaii in 1973 with a feasibility study which outlined the candidate areas for automation. Since then, the Office of Library Services has automated the order procedures for one of the largest book processing centers for public libraries in the country; created one of the first COM…

  8. Development of an automated fuzing station for the future armored resupply vehicle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chesser, J.B.; Jansen, J.F.; Lloyd, P.D.

    1995-03-01

    The US Army is developing the Advanced Field Artillery System (AFAS), a next-generation armored howitzer. The Future Armored Resupply Vehicle (FARV) will be its companion ammunition resupply vehicle. The FARV will automate the supply of ammunition and fuel to the AFAS, which will increase capabilities over the current system. One of the functions being considered for automation is ammunition processing. Oak Ridge National Laboratory is developing equipment to demonstrate automated ammunition processing. One of the key operations to be automated is fuzing. The projectiles are initially unfuzed, and a fuze must be inserted and threaded into the projectile as part of the processing. A constraint on the design solution is that the ammunition cannot be modified to simplify automation. The problem was analyzed to determine the alignment requirements. Using the results of the analysis, ORNL designed, built, and tested a test stand to verify the selected design solution.

  9. PLACE: an open-source python package for laboratory automation, control, and experimentation.

    PubMed

    Johnson, Jami L; Tom Wörden, Henrik; van Wijk, Kasper

    2015-02-01

    In modern laboratories, software can drive the full experimental process from data acquisition to storage, processing, and analysis. The automation of laboratory data acquisition is an important consideration for every laboratory. When implementing a laboratory automation scheme, important parameters include its reliability, time to implement, adaptability, and compatibility with software used at other stages of experimentation. In this article, we present an open-source, flexible, and extensible Python package for Laboratory Automation, Control, and Experimentation (PLACE). The package uses modular organization and clear design principles; therefore, it can be easily customized or expanded to meet the needs of diverse laboratories. We discuss the organization of PLACE, data-handling considerations, and then present an example using PLACE for laser-ultrasound experiments. Finally, we demonstrate the seamless transition to post-processing and analysis with Python through the development of an analysis module for data produced by PLACE automation. © 2014 Society for Laboratory Automation and Screening.
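
    A minimal sketch of the modular organization described above: each instrument module exposes the same small interface so a generic scan loop can drive any combination of them. The class and method names are illustrative assumptions, not the actual PLACE API.

      # place_style_module.py -- toy modular instrument interface and scan loop.
      from abc import ABC, abstractmethod

      class Instrument(ABC):
          @abstractmethod
          def config(self, metadata): ...
          @abstractmethod
          def update(self, step): ...
          def cleanup(self): pass

      class MockStage(Instrument):
          def config(self, metadata):
              metadata["stage_units"] = "mm"
          def update(self, step):
              print(f"stage moved to {step * 0.5:.1f} mm")

      class MockDigitizer(Instrument):
          def config(self, metadata):
              metadata["sample_rate_hz"] = 10_000_000
          def update(self, step):
              return [0.0] * 4  # pretend waveform

      def run_scan(instruments, n_steps):
          metadata = {}
          for inst in instruments:
              inst.config(metadata)
          for step in range(n_steps):
              for inst in instruments:
                  inst.update(step)
          for inst in instruments:
              inst.cleanup()
          return metadata

      print(run_scan([MockStage(), MockDigitizer()], n_steps=3))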

  10. Automated strip-mine and reclamation mapping from ERTS

    NASA Technical Reports Server (NTRS)

    Rogers, R. H. (Principal Investigator); Reed, L. E.; Pettyjohn, W. A.

    1974-01-01

    The author has identified the following significant results. Computer processing techniques were applied to ERTS-1 computer-compatible tape (CCT) data acquired in August 1972 on the Ohio Power Company's coal mining operation in Muskingum County, Ohio. Processing results succeeded in automatically classifying, with an accuracy greater than 90%: (1) stripped earth and major sources of erosion; (2) partially reclaimed areas and minor sources of erosion; (3) water with sedimentation; (4) water without sedimentation; and (5) vegetation. Computer-generated tables listing the area in acres and square kilometers were produced for each target category. Processing results also included geometrically corrected map overlays, one for each target category, drawn on a transparent material by a pen under computer control. Each target category is assigned a distinctive color on the overlay to facilitate interpretation. The overlays, drawn at a scale of 1:250,000 when placed over an AMS map of the same area, immediately provided map locations for each target. These mapping products were generated at a tenth of the cost of conventional mapping techniques.

  11. Extending the Instructional Systems Development Methodology.

    ERIC Educational Resources Information Center

    O'Neill, Colin E.

    1993-01-01

    Describes ways that components of Information Engineering (IE) methodology can be used by training system developers to extend Instructional Systems Development (ISD) methodology. Aspects of IE that are useful in ISD are described, including requirements determination, group facilitation, integrated automated tool support, and prototyping.…

  12. Biomek 3000: the workhorse in an automated accredited forensic genetic laboratory.

    PubMed

    Stangegaard, Michael; Meijer, Per-Johan; Børsting, Claus; Hansen, Anders J; Morling, Niels

    2012-10-01

    We have implemented and validated automated protocols for a wide range of processes such as sample preparation, PCR setup, and capillary electrophoresis setup using small, simple, and inexpensive automated liquid handlers. The flexibility and ease of programming enable the Biomek 3000 to be used in many parts of the laboratory process in a modern forensic genetics laboratory with low to medium sample throughput. In conclusion, we demonstrated that sample processing for accredited forensic genetic DNA typing can be implemented on small automated liquid handlers, leading to the reduction of manual work as well as increased quality and throughput.

  13. Automated Processing of 2-D Gel Electrophoretograms of Genomic DNA for Hunting Pathogenic DNA Molecular Changes.

    PubMed

    Takahashi; Nakazawa; Watanabe; Konagaya

    1999-01-01

    We have developed automated processing algorithms for 2-dimensional (2-D) electrophoretograms of genomic DNA based on the RLGS (Restriction Landmark Genomic Scanning) method, which scans restriction enzyme recognition sites as landmarks and maps them onto a 2-D electrophoresis gel. Our processing algorithms enable automated spot recognition in RLGS electrophoretograms and the automated comparison of a huge number of such images. In the final stage of the automated processing, a master spot pattern, onto which all the spots in the RLGS images are mapped at once, can be obtained. Spot pattern variations that appear specific to pathogenic DNA molecular changes can be detected easily by simply looking over the master spot pattern. When we applied our algorithms to the analysis of 33 RLGS images derived from human colon tissues, we successfully detected several colon-tumor-specific spot pattern changes.
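
    A minimal sketch of automated spot recognition on a synthetic 2-D gel image, using smoothing, thresholding, and connected-component labelling; it illustrates the general idea only and is not the authors' algorithm.

      # spot_detect.py -- detect bright spots in a synthetic 2-D gel image.
      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(2)
      gel = rng.normal(0.0, 0.05, (256, 256))
      for y, x in [(60, 80), (120, 200), (180, 40)]:          # three synthetic spots
          yy, xx = np.mgrid[0:256, 0:256]
          gel += np.exp(-(((yy - y) ** 2 + (xx - x) ** 2) / (2 * 4.0 ** 2)))

      smoothed = ndimage.gaussian_filter(gel, sigma=2)         # denoise
      mask = smoothed > smoothed.mean() + 3 * smoothed.std()   # threshold
      labels, n_spots = ndimage.label(mask)                    # connected components
      centers = ndimage.center_of_mass(smoothed, labels, range(1, n_spots + 1))
      print(n_spots, "spots at", [(round(y), round(x)) for y, x in centers])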

  14. Complacency and bias in human use of automation: an attentional integration.

    PubMed

    Parasuraman, Raja; Manzey, Dietrich H

    2010-06-01

    Our aim was to review empirical studies of complacency and bias in human interaction with automated and decision support systems and provide an integrated theoretical model for their explanation. Automation-related complacency and automation bias have typically been considered separately and independently. Studies on complacency and automation bias were analyzed with respect to the cognitive processes involved. Automation complacency occurs under conditions of multiple-task load, when manual tasks compete with the automated task for the operator's attention. Automation complacency is found in both naive and expert participants and cannot be overcome with simple practice. Automation bias results in making both omission and commission errors when decision aids are imperfect. Automation bias occurs in both naive and expert participants, cannot be prevented by training or instructions, and can affect decision making in individuals as well as in teams. While automation bias has been conceived of as a special case of decision bias, our analysis suggests that it also depends on attentional processes similar to those involved in automation-related complacency. Complacency and automation bias represent different manifestations of overlapping automation-induced phenomena, with attention playing a central role. An integrated model of complacency and automation bias shows that they result from the dynamic interaction of personal, situational, and automation-related characteristics. The integrated model and attentional synthesis provides a heuristic framework for further research on complacency and automation bias and design options for mitigating such effects in automated and decision support systems.

  15. Volta phase plate data collection facilitates image processing and cryo-EM structure determination.

    PubMed

    von Loeffelholz, Ottilie; Papai, Gabor; Danev, Radostin; Myasnikov, Alexander G; Natchiar, S Kundhavai; Hazemann, Isabelle; Ménétret, Jean-François; Klaholz, Bruno P

    2018-06-01

    A current bottleneck in structure determination of macromolecular complexes by cryo electron microscopy (cryo-EM) is the large amount of data needed to obtain high-resolution 3D reconstructions, including through sorting into different conformations and compositions with advanced image processing. Additionally, it may be difficult to visualize small ligands that bind at sub-stoichiometric levels. Volta phase plates (VPP) introduce a phase shift in the contrast transfer and drastically increase the contrast of the recorded low-dose cryo-EM images while preserving high frequency information. Here we present a comparative study to address the behavior of different data sets during image processing and quantify important parameters during structure refinement. The automated data collection was done from the same human ribosome sample either as a conventional defocus range dataset or with a Volta phase plate close to focus (cfVPP) or with a small defocus (dfVPP). The analysis of image processing parameters shows that dfVPP data behave more robustly during cryo-EM structure refinement because particle alignments, Euler angle assignments and 2D & 3D classifications behave more stably and converge faster. In particular, fewer particle images are required to reach the same resolution in the 3D reconstructions. Finally, we find that defocus range data collection is also applicable to VPP. This study shows that data processing and cryo-EM map interpretation, including atomic model refinement, are facilitated significantly by performing VPP cryo-EM, which will have an important impact on structural biology. Copyright © 2018 Elsevier Inc. All rights reserved.

  16. CASPER Version 2.0

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Rabideau, Gregg; Tran, Daniel; Knight, Russell; Chouinard, Caroline; Estlin, Tara; Gaines, Daniel; Clement, Bradley; Barrett, Anthony

    2007-01-01

    CASPER is designed to perform automated planning of interdependent activities within a system subject to requirements, constraints, and limitations on resources. In contradistinction to the traditional concept of batch planning followed by execution, CASPER implements a concept of continuous planning and replanning in response to unanticipated changes (including failures), integrated with execution. Improvements over other, similar software that have been incorporated into CASPER version 2.0 include an enhanced executable interface to facilitate integration with a wide range of execution software systems and supporting software libraries; features to support execution while reasoning about urgency, importance, and impending deadlines; features that enable accommodation to a wide range of computing environments that include various central processing units and random-access-memory capacities; and improved generic time-server and time-control features.
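    As an illustration of how continuous planning differs from batch planning followed by execution, the toy loop below interleaves execution with replanning against the observed state on every cycle. It is written for this summary only and is not CASPER's implementation; the planner and executor stubs are placeholders.

```python
# Illustrative continuous-planning loop (not CASPER itself): execute one
# activity, then replan against the observed state rather than the state the
# original batch plan predicted.
import time

def plan(state, goals):
    """Toy planner: one activity per unmet goal (placeholder for a real planner)."""
    return [("achieve", g) for g in goals if g not in state]

def execute_step(activity, state):
    """Toy executor: a real system may fail or produce unexpected state here."""
    state.add(activity[1])
    return state

def continuous_planner(initial_state, goals, cycle_seconds=0.1):
    state = set(initial_state)
    current_plan = plan(state, goals)
    while current_plan:
        activity = current_plan.pop(0)
        state = execute_step(activity, state)
        current_plan = plan(state, goals)   # replan every cycle
        time.sleep(cycle_seconds)
    return state

print(continuous_planner([], goals=["warm_up", "point", "image"]))
```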

  17. CellML metadata standards, associated tools and repositories

    PubMed Central

    Beard, Daniel A.; Britten, Randall; Cooling, Mike T.; Garny, Alan; Halstead, Matt D.B.; Hunter, Peter J.; Lawson, James; Lloyd, Catherine M.; Marsh, Justin; Miller, Andrew; Nickerson, David P.; Nielsen, Poul M.F.; Nomura, Taishin; Subramanium, Shankar; Wimalaratne, Sarala M.; Yu, Tommy

    2009-01-01

    The development of standards for encoding mathematical models is an important component of model building and model sharing among scientists interested in understanding multi-scale physiological processes. CellML provides such a standard, particularly for models based on biophysical mechanisms, and a substantial number of models are now available in the CellML Model Repository. However, there is an urgent need to extend the current CellML metadata standard to provide biological and biophysical annotation of the models in order to facilitate model sharing, automated model reduction and connection to biological databases. This paper gives a broad overview of a number of new developments on CellML metadata and provides links to further methodological details available from the CellML website. PMID:19380315

  18. A Fully Automated Microfluidic Femtosecond Laser Axotomy Platform for Nerve Regeneration Studies in C. elegans

    PubMed Central

    Gokce, Sertan Kutal; Guo, Samuel X.; Ghorashian, Navid; Everett, W. Neil; Jarrell, Travis; Kottek, Aubri; Bovik, Alan C.; Ben-Yakar, Adela

    2014-01-01

    Femtosecond laser nanosurgery has been widely accepted as an axonal injury model, enabling nerve regeneration studies in the small model organism, Caenorhabditis elegans. To overcome the time limitations of manual worm handling techniques, automation and new immobilization technologies must be adopted to improve throughput in these studies. While new microfluidic immobilization techniques have been developed that promise to reduce the time required for axotomies, there is a need for automated procedures to minimize the required amount of human intervention and accelerate the axotomy processes crucial for high-throughput. Here, we report a fully automated microfluidic platform for performing laser axotomies of fluorescently tagged neurons in living Caenorhabditis elegans. The presented automation process reduces the time required to perform axotomies within individual worms to ∼17 s/worm, at least one order of magnitude faster than manual approaches. The full automation is achieved with a unique chip design and an operation sequence that is fully computer controlled and synchronized with efficient and accurate image processing algorithms. The microfluidic device includes a T-shaped architecture and three-dimensional microfluidic interconnects to serially transport, position, and immobilize worms. The image processing algorithms can identify and precisely position axons targeted for ablation. There were no statistically significant differences observed in reconnection probabilities between axotomies carried out with the automated system and those performed manually with anesthetics. The overall success rate of automated axotomies was 67.4±3.2% of the cases (236/350) at an average processing rate of 17.0±2.4 s. This fully automated platform establishes a promising methodology for prospective genome-wide screening of nerve regeneration in C. elegans in a truly high-throughput manner. PMID:25470130

  19. Data Processing and First Products from the Hyperspectral Imager for the Coastal Ocean (HICO) on the International Space Station

    DTIC Science & Technology

    2010-04-01

    NRL Stennis Space Center (NRL-SSC) for further processing using the NRL SSC Automated Processing System (APS). APS was developed for processing...have not previously developed automated processing for 73 hyperspectral ocean color data. The hyperspectral processing branch includes several

  20. Evolution paths for advanced automation

    NASA Technical Reports Server (NTRS)

    Healey, Kathleen J.

    1990-01-01

    As Space Station Freedom (SSF) evolves, increased automation and autonomy will be required to meet Space Station Freedom Program (SSFP) objectives. As a precursor to the use of advanced automation within the SSFP, especially if it is to be used on SSF (e.g., to automate the operation of the flight systems), the underlying technologies will need to be elevated to a high level of readiness to ensure safe and effective operations. Ground facilities supporting the development of these flight systems -- from research and development laboratories through formal hardware and software development environments -- will be responsible for achieving these levels of technology readiness. These facilities will need to evolve to support the general evolution of the SSFP. This evolution will include support for increasing the use of advanced automation. The SSF Advanced Development Program has funded a study to define evolution paths for advanced automation within the SSFP's ground-based facilities which will enable, promote, and accelerate the appropriate use of advanced automation on-board SSF. The current capability of the test beds and facilities, such as the Software Support Environment, with regard to advanced automation, has been assessed and their desired evolutionary capabilities have been defined. Plans and guidelines for achieving this necessary capability have been constructed. The approach taken has combined in-depth interviews of test-bed personnel at all SSF Work Package centers with awareness of relevant state-of-the-art technology and technology insertion methodologies. Key recommendations from the study include advocating a NASA-wide task force for advanced automation, and the creation of software prototype transition environments to facilitate the incorporation of advanced automation in the SSFP.

  1. Report of the workshop on Aviation Safety/Automation Program

    NASA Technical Reports Server (NTRS)

    Morello, Samuel A. (Editor)

    1990-01-01

    As part of NASA's responsibility to encourage and facilitate active exchange of information and ideas among members of the aviation community, an Aviation Safety/Automation workshop was organized and sponsored by the Flight Management Division of NASA Langley Research Center. The one-day workshop was held on October 10, 1989, at the Sheraton Beach Inn and Conference Center in Virginia Beach, Virginia. Participants were invited from industry, government, and universities to discuss critical questions and issues concerning the rapid introduction and utilization of advanced computer-based technology into the flight deck and air traffic controller workstation environments. The workshop was attended by approximately 30 discipline experts, automation and human factors researchers, and research and development managers. The goal of the workshop was to address major issues identified by the NASA Aviation Safety/Automation Program. Here, the results of the workshop are documented. The ideas, thoughts, and concepts were developed by the workshop participants. The findings, however, have been synthesized into a final report primarily by the NASA researchers.

  2. Automated detection system of single nucleotide polymorphisms using two kinds of functional magnetic nanoparticles

    NASA Astrophysics Data System (ADS)

    Liu, Hongna; Li, Song; Wang, Zhifei; Li, Zhiyang; Deng, Yan; Wang, Hua; Shi, Zhiyang; He, Nongyue

    2008-11-01

    Single nucleotide polymorphisms (SNPs) comprise the most abundant source of genetic variation in the human genome. Large-scale, genome-wide identification of codominant SNPs, especially those associated with complex diseases, has therefore created the need for a completely high-throughput and automated SNP genotyping method. Herein, we present an automated SNP detection system based on two kinds of functional magnetic nanoparticles (MNPs) and dual-color hybridization. The amido-modified MNPs (NH2-MNPs), prepared with APTES, were used to extract DNA directly from whole blood by electrostatic interaction, and PCR was then successfully performed. Furthermore, biotinylated PCR products were captured on the streptavidin-coated MNPs (SA-MNPs) and interrogated by hybridization with a pair of dual-color probes to determine the SNP, and the genotype of each sample could be identified simultaneously by scanning the microarray printed with the denatured fluorescent probes. This system provides a rapid, sensitive and highly versatile automated procedure that will greatly facilitate the analysis of different known SNPs in the human genome.

  3. Designing of smart home automation system based on Raspberry Pi

    NASA Astrophysics Data System (ADS)

    Saini, Ravi Prakash; Singh, Bhanu Pratap; Sharma, Mahesh Kumar; Wattanawisuth, Nattapol; Leeprechanon, Nopbhorn

    2016-03-01

    Locally networked or remotely controlled home automation systems have become a popular paradigm because of their numerous advantages and are well suited to academic research. This paper proposes a method for implementing a Raspberry Pi based home automation system with an Android phone access interface. The power consumption profile across the connected load is measured accurately through programming. Users can access the graph of total power consumption with respect to time from anywhere using their Dropbox account. An Android application has been developed to channel the monitoring and controlling of home appliances remotely. This application facilitates control of the Raspberry Pi's operating pins: pressing the corresponding key turns any desired appliance "on" or "off". Systems can range from simple room lighting control to smart microcontroller-based hybrid systems incorporating several additional features. Smart home automation systems are being adopted to achieve flexibility, scalability, security in the sense of data protection through a cloud-based data storage protocol, reliability, energy efficiency, etc.
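    For readers unfamiliar with how such a system drives appliances, the hedged sketch below toggles a relay attached to a Raspberry Pi GPIO pin using the RPi.GPIO library. The pin number and relay wiring are assumptions, and the snippet is not taken from the paper.

```python
# Minimal sketch of the kind of GPIO switching such a system performs
# (assumes a relay wired to BCM pin 17; not the authors' actual code).
# Requires a Raspberry Pi with the RPi.GPIO library installed.
import RPi.GPIO as GPIO

APPLIANCE_PIN = 17  # hypothetical BCM pin driving a relay

GPIO.setmode(GPIO.BCM)
GPIO.setup(APPLIANCE_PIN, GPIO.OUT, initial=GPIO.LOW)

def set_appliance(on: bool):
    """Turn the connected appliance on or off via the relay."""
    GPIO.output(APPLIANCE_PIN, GPIO.HIGH if on else GPIO.LOW)

try:
    set_appliance(True)    # e.g. triggered by a command from the Android app
    set_appliance(False)
finally:
    GPIO.cleanup()
```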

  4. Designing of smart home automation system based on Raspberry Pi

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saini, Ravi Prakash; Singh, Bhanu Pratap; Sharma, Mahesh Kumar

    Locally networked or remotely controlled home automation systems have become a popular paradigm because of their numerous advantages and are well suited to academic research. This paper proposes a method for implementing a Raspberry Pi based home automation system with an Android phone access interface. The power consumption profile across the connected load is measured accurately through programming. Users can access the graph of total power consumption with respect to time from anywhere using their Dropbox account. An Android application has been developed to channel the monitoring and controlling of home appliances remotely. This application facilitates control of the Raspberry Pi's operating pins: pressing the corresponding key turns any desired appliance "on" or "off". Systems can range from simple room lighting control to smart microcontroller-based hybrid systems incorporating several additional features. Smart home automation systems are being adopted to achieve flexibility, scalability, security in the sense of data protection through a cloud-based data storage protocol, reliability, energy efficiency, etc.

  5. Automated, Parametric Geometry Modeling and Grid Generation for Turbomachinery Applications

    NASA Technical Reports Server (NTRS)

    Harrand, Vincent J.; Uchitel, Vadim G.; Whitmire, John B.

    2000-01-01

    The objective of this Phase I project is to develop a highly automated software system for rapid geometry modeling and grid generation for turbomachinery applications. The proposed system features a graphical user interface for interactive control, a direct interface to commercial CAD/PDM systems, support for IGES geometry output, and a scripting capability for obtaining a high level of automation and end-user customization of the tool. The developed system is fully parametric and highly automated, and, therefore, significantly reduces the turnaround time for 3D geometry modeling, grid generation and model setup. This facilitates design environments in which a large number of cases need to be generated, such as for parametric analysis and design optimization of turbomachinery equipment. In Phase I we have successfully demonstrated the feasibility of the approach. The system has been tested on a wide variety of turbomachinery geometries, including several impellers and a multi-stage rotor-stator combination. In Phase II, we plan to integrate the developed system with turbomachinery design software and with commercial CAD/PDM software.

  6. Application of the BioMek 2000 Laboratory Automation Workstation and the DNA IQ System to the extraction of forensic casework samples.

    PubMed

    Greenspoon, Susan A; Ban, Jeffrey D; Sykes, Karen; Ballard, Elizabeth J; Edler, Shelley S; Baisden, Melissa; Covington, Brian L

    2004-01-01

    Robotic systems are commonly utilized for the extraction of database samples. However, the application of robotic extraction to forensic casework samples is a more daunting task. Such a system must be versatile enough to accommodate a wide range of samples that may contain greatly varying amounts of DNA, but it must also pose no more risk of contamination than the manual DNA extraction methods. This study demonstrates that the BioMek 2000 Laboratory Automation Workstation, used in combination with the DNA IQ System, is versatile enough to accommodate the wide range of samples typically encountered by a crime laboratory. The use of a silica coated paramagnetic resin, as with the DNA IQ System, facilitates the adaptation of an open well, hands off, robotic system to the extraction of casework samples since no filtration or centrifugation steps are needed. Moreover, the DNA remains tightly coupled to the silica coated paramagnetic resin for the entire process until the elution step. A short pre-extraction incubation step is necessary prior to loading samples onto the robot and it is at this step that most modifications are made to accommodate the different sample types and substrates commonly encountered with forensic evidentiary samples. Sexual assault (mixed stain) samples, cigarette butts, blood stains, buccal swabs, and various tissue samples were successfully extracted with the BioMek 2000 Laboratory Automation Workstation and the DNA IQ System, with no evidence of contamination throughout the extensive validation studies reported here.

  7. Multi-Dimensional Signal Processing Research Program

    DTIC Science & Technology

    1981-09-30

    applications to real-time image processing and analysis. A specific long-range application is the automated processing of aerial reconnaissance imagery...Non-supervised image segmentation is a potentially im- portant operation in the automated processing of aerial reconnaissance pho- tographs since it

  8. 76 FR 20425 - Self-Regulatory Organizations; National Securities Clearing Corporation; Order Approving Proposed...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-12

    ... Relating to Establishing an Automated Service for the Processing of Transfers, Replacements, and Exchanges... (the ``Act'').\\1\\ The proposed rule change allows NSCC to add a new automated service to process... offer a new automated service for the transfer, replacement, or exchange (collectively referred to as a...

  9. AUTOMATED LITERATURE PROCESSING HANDLING AND ANALYSIS SYSTEM--FIRST GENERATION.

    ERIC Educational Resources Information Center

    Redstone Scientific Information Center, Redstone Arsenal, AL.

    THE REPORT PRESENTS A SUMMARY OF THE DEVELOPMENT AND THE CHARACTERISTICS OF THE FIRST GENERATION OF THE AUTOMATED LITERATURE PROCESSING, HANDLING AND ANALYSIS (ALPHA-1) SYSTEM. DESCRIPTIONS OF THE COMPUTER TECHNOLOGY OF ALPHA-1 AND THE USE OF THIS AUTOMATED LIBRARY TECHNIQUE ARE PRESENTED. EACH OF THE SUBSYSTEMS AND MODULES NOW IN OPERATION ARE…

  10. Evolution of a Benthic Imaging System From a Towed Camera to an Automated Habitat Characterization System

    DTIC Science & Technology

    2008-09-01

    automated processing of images for color correction, segmentation of foreground targets from sediment and classification of targets to taxonomic category...element in the development of HabCam as a tool for habitat characterization is the automated processing of images for color correction, segmentation of

  11. Classification Trees for Quality Control Processes in Automated Constructed Response Scoring.

    ERIC Educational Resources Information Center

    Williamson, David M.; Hone, Anne S.; Miller, Susan; Bejar, Isaac I.

    As the automated scoring of constructed responses reaches operational status, the issue of monitoring the scoring process becomes a primary concern, particularly when the goal is to have automated scoring operate completely unassisted by humans. Using a vignette from the Architectural Registration Examination and data for 326 cases with both human…

  12. SU-G-206-01: A Fully Automated CT Tool to Facilitate Phantom Image QA for Quantitative Imaging in Clinical Trials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wahi-Anwar, M; Lo, P; Kim, H

    Purpose: The use of Quantitative Imaging (QI) methods in Clinical Trials requires both verification of adherence to a specified protocol and an assessment of scanner performance under that protocol, which are currently accomplished manually. This work introduces automated phantom identification and image QA measure extraction towards a fully-automated CT phantom QA system to perform these functions and facilitate the use of Quantitative Imaging methods in clinical trials. Methods: This study used a retrospective cohort of CT phantom scans from existing clinical trial protocols - totaling 84 phantoms, across 3 phantom types using various scanners and protocols. The QA system identifies the input phantom scan through an ensemble of threshold-based classifiers. Each classifier - corresponding to a phantom type - contains a template slice, which is compared to the input scan on a slice-by-slice basis, resulting in slice-wise similarity metric values for each slice compared. Pre-trained thresholds (established from a training set of phantom images matching the template type) are used to filter the similarity distribution, and the slice with the most optimal local mean similarity, with local neighboring slices meeting the threshold requirement, is chosen as the classifier's matched slice (if it exists). The classifier with the matched slice possessing the most optimal local mean similarity is then chosen as the ensemble's best matching slice. If the best matching slice exists, the image QA algorithm and ROIs corresponding to the matching classifier extract the image QA measures. Results: Automated phantom identification performed with 84.5% accuracy and 88.8% sensitivity on 84 phantoms. Automated image quality measurements (following standard protocol) on identified water phantoms (n=35) matched user QA decisions with 100% accuracy. Conclusion: We provide a fully automated CT phantom QA system consistent with manual QA performance. Further work will include a parallel component to automatically verify image acquisition parameters and automated checking of adherence to specifications.
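    The classifier described above is essentially template matching with pre-trained similarity thresholds. The sketch below is a simplified, hypothetical version of that idea (it scores individual slices by normalised cross-correlation rather than local neighbourhood means, and the templates and thresholds are toy values); it is not the authors' implementation.

```python
# Hedged sketch of threshold-based phantom identification by slice-wise
# similarity to per-phantom template slices.
import numpy as np

def ncc(a, b):
    """Normalised cross-correlation between two slices."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

def best_matching_slice(volume, template, threshold):
    """Return (index, score) of the best slice whose similarity passes the threshold."""
    scores = np.array([ncc(s, template) for s in volume])
    if scores.max() < threshold:
        return None
    return int(scores.argmax()), float(scores.max())

def identify_phantom(volume, classifiers):
    """classifiers: {name: (template_slice, threshold)} -> best matching phantom name."""
    matches = {}
    for name, (template, threshold) in classifiers.items():
        hit = best_matching_slice(volume, template, threshold)
        if hit is not None:
            matches[name] = hit[1]
    return max(matches, key=matches.get) if matches else None

# Toy usage with synthetic data
rng = np.random.default_rng(1)
scan = rng.normal(size=(40, 64, 64))
templates = {"water": (scan[20] + 0.1 * rng.normal(size=(64, 64)), 0.5),
             "acr":   (rng.normal(size=(64, 64)), 0.5)}
print(identify_phantom(scan, templates))
```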

  13. Operator modeling in commercial aviation: Cognitive models, intelligent displays, and pilot's assistants

    NASA Technical Reports Server (NTRS)

    Govindaraj, T.; Mitchell, C. M.

    1994-01-01

    One of the goals of the National Aviation Safety/Automation program is to address the issue of human-centered automation in the cockpit. Human-centered automation is automation that, in the cockpit, enhances or assists the crew rather than replacing them. The Georgia Tech research program focused on this general theme, with emphasis on designing a computer-based pilot's assistant, intelligent (i.e., context-sensitive) displays, and an intelligent tutoring system for understanding and operating the autoflight system. In particular, the aids and displays were designed to enhance the crew's situational awareness of the current state of the automated flight systems and to assist the crew in coordinating the autoflight system resources. The activities of this grant included: (1) an OFMspert to understand pilot navigation activities in a 727-class aircraft; (2) an extension of OFMspert to understand mode control in a glass cockpit, the Georgia Tech Crew Activity Tracking System (GT-CATS); (3) the design of a training system to teach pilots about the vertical navigation portion of the flight management system (the VNAV Tutor); and (4) a proof-of-concept display, using existing display technology, to facilitate mode awareness, particularly in situations in which there is a potential for controlled flight into terrain (CFIT).

  14. The Use of AMET and Automated Scripts for Model Evaluation

    EPA Science Inventory

    The Atmospheric Model Evaluation Tool (AMET) is a suite of software designed to facilitate the analysis and evaluation of meteorological and air quality models. AMET matches the model output for particular locations to the corresponding observed values from one or more networks ...

  15. BCube: Building a Geoscience Brokering Framework

    NASA Astrophysics Data System (ADS)

    Jodha Khalsa, Siri; Nativi, Stefano; Duerr, Ruth; Pearlman, Jay

    2014-05-01

    BCube is addressing the need for effective and efficient multi-disciplinary collaboration and interoperability through the advancement of brokering technologies. As a prototype "building block" for NSF's EarthCube cyberinfrastructure initiative, BCube is demonstrating how a broker can serve as an intermediary between information systems that implement well-defined interfaces, thereby providing a bridge between communities that employ different specifications. Building on the GEOSS Discover and Access Broker (DAB), BCube will develop new modules and services including:
    • Expanded semantic brokering capabilities
    • Business Model support for workflows
    • Automated metadata generation
    • Automated linking to services discovered via web crawling
    • Credential passing for seamless access to data
    • Ranking of search results from brokered catalogs
    Because facilitating cross-discipline research involves cultural as well as technical challenges, BCube is also addressing the sociological and educational components of infrastructure development. We are working, initially, with four geoscience disciplines: hydrology, oceans, polar and weather, with an emphasis on connecting existing domain infrastructure elements to facilitate cross-domain communications.

  16. Characterization of Temporal Semantic Shifts of Peer-to-Peer Communication in a Health-Related Online Community: Implications for Data-driven Health Promotion.

    PubMed

    Sridharan, Vishnupriya; Cohen, Trevor; Cobb, Nathan; Myneni, Sahiti

    2016-01-01

    With online social platforms gaining popularity as venues of behavior change, it is important to understand the ways in which these platforms facilitate peer interactions. In this paper, we characterize temporal trends in user communication through mapping of theoretically-linked semantic content. We used qualitative coding and automated text analysis to assign theoretical techniques to peer interactions in an online community for smoking cessation, subsequently facilitating temporal visualization of the observed techniques. Results indicate manifestation of several behavior change techniques such as 'feedback and monitoring' and 'rewards'. Automated methods yielded reasonable results (F-measure=0.77). Temporal trends among relapsers revealed reduction in communication after a relapse event. This social withdrawal may be attributed to failure guilt after the relapse. Results indicate significant change in thematic categories such as 'social support', 'natural consequences', and 'comparison of outcomes' pre- and post-relapse. Implications for development of behavioral support technologies that promote long-term abstinence are discussed.

  17. Automated personnel data base system specifications, Task V. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartley, H.J.; Bocast, A.K.; Deppner, F.O.

    1978-11-01

    The full title of this study is 'Development of Qualification Requirements, Training Programs, Career Plans, and Methodologies for Effective Management and Training of Inspection and Enforcement Personnel.' Task V required the development of an automated personnel data base system for NRC/IE. This system is identified as the NRC/IE Personnel, Assignment, Qualifications, and Training System (PAQTS). This Task V report provides the documentation for PAQTS including the Functional Requirements Document (FRD), the Data Requirements Document (DRD), the Hardware and Software Capabilities Assessment, and the Detailed Implementation Schedule. Specific recommendations to facilitate implementation of PAQTS are also included.

  18. Design and implementation of Ada programs to facilitate automated testing

    NASA Technical Reports Server (NTRS)

    Dean, Jack; Fox, Barry; Oropeza, Michael

    1991-01-01

    An automated method utilized to test the software components of COMPASS, an interactive computer aided scheduling system, is presented. Each package of this system introduces a private type, and works to construct instances of that type, along with read and write routines for that type. Generic procedures that can generate test drivers for these functions are given and show how the test drivers can read from a test data file the functions to call, the arguments for those functions, what the anticipated result should be, and whether an exception should be raised for the function given the arguments.

  19. Toward automated denoising of single molecular Förster resonance energy transfer data

    NASA Astrophysics Data System (ADS)

    Lee, Hao-Chih; Lin, Bo-Lin; Chang, Wei-Hau; Tu, I.-Ping

    2012-01-01

    A wide-field two-channel fluorescence microscope is a powerful tool as it allows for the study of conformation dynamics of hundreds to thousands of immobilized single molecules by Förster resonance energy transfer (FRET) signals. To date, the data reduction from a movie to a final set containing meaningful single-molecule FRET (smFRET) traces involves human inspection and intervention at several critical steps, greatly hampering the efficiency at the post-imaging stage. To facilitate the data reduction from smFRET movies to smFRET traces and to address the noise-limited issues, we developed a statistical denoising system toward fully automated processing. This data reduction system has embedded several novel approaches. First, as to background subtraction, high-order singular value decomposition (HOSVD) method is employed to extract spatial and temporal features. Second, to register and map the two color channels, the spots representing bleeding through the donor channel to the acceptor channel are used. Finally, correlation analysis and likelihood ratio statistic for the change point detection (CPD) are developed to study the two channels simultaneously, resolve FRET states, and report the dwelling time of each state. The performance of our method has been checked using both simulation and real data.
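    The change-point detection step can be illustrated with a simple Gaussian mean-shift likelihood-ratio statistic applied to a single FRET-efficiency trace. The sketch below is a simplified stand-in for the statistic described in the abstract, not the authors' code; the trace and noise level are synthetic.

```python
# Illustrative change-point detection for a single FRET-efficiency trace,
# using a Gaussian mean-shift likelihood-ratio statistic.
import numpy as np

def change_point_statistic(trace):
    """Return (best split index, log-likelihood ratio) for a single mean shift."""
    n = len(trace)
    total_var = trace.var() + 1e-12
    best_k, best_stat = None, -np.inf
    for k in range(2, n - 2):
        left, right = trace[:k], trace[k:]
        pooled_var = (left.var() * len(left) + right.var() * len(right)) / n + 1e-12
        stat = 0.5 * n * np.log(total_var / pooled_var)   # LLR for split vs. no split
        if stat > best_stat:
            best_k, best_stat = k, stat
    return best_k, best_stat

# Toy trace: low-FRET state followed by high-FRET state, plus noise
rng = np.random.default_rng(2)
trace = np.concatenate([0.3 + 0.05 * rng.normal(size=120),
                        0.7 + 0.05 * rng.normal(size=80)])
k, llr = change_point_statistic(trace)
print(f"estimated change point at frame {k} (LLR = {llr:.1f})")
```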

  20. Managing protected health information in distributed research network environments: automated review to facilitate collaboration.

    PubMed

    Bredfeldt, Christine E; Butani, Amy; Padmanabhan, Sandhyasree; Hitz, Paul; Pardee, Roy

    2013-03-22

    Multi-site health sciences research is becoming more common, as it enables investigation of rare outcomes and diseases and new healthcare innovations. Multi-site research usually involves the transfer of large amounts of research data between collaborators, which increases the potential for accidental disclosures of protected health information (PHI). Standard protocols for preventing release of PHI are extremely vulnerable to human error, particularly when the shared data sets are large. To address this problem, we developed an automated program (SAS macro) to identify possible PHI in research data before it is transferred between research sites. The macro reviews all data in a designated directory to identify suspicious variable names and data patterns. The macro looks for variables that may contain personal identifiers such as medical record numbers and social security numbers. In addition, the macro identifies dates and numbers that may identify people who belong to small groups, who may be identifiable even in the absence of traditional identifiers. Evaluation of the macro on 100 sample research data sets indicated a recall of 0.98 and precision of 0.81. When implemented consistently, the macro has the potential to streamline the PHI review process and significantly reduce accidental PHI disclosures.
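    The macro itself is written in SAS; the sketch below illustrates the same screening idea in Python, flagging suspicious column names and SSN-like value patterns in a CSV file. The regular expressions and file layout are assumptions chosen for illustration, not the macro's actual rules.

```python
# Illustration of the general PHI-screening idea: flag suspicious column
# names and value patterns such as SSN-like strings before data transfer.
import re
import csv

SUSPECT_NAME = re.compile(r"(ssn|social|mrn|med(ical)?_?rec|dob|birth)", re.I)
SSN_PATTERN = re.compile(r"^\d{3}-?\d{2}-?\d{4}$")

def scan_csv_for_phi(path, sample_rows=1000):
    findings = []
    with open(path, newline="") as fh:
        reader = csv.DictReader(fh)
        for col in reader.fieldnames or []:
            if SUSPECT_NAME.search(col):
                findings.append(("suspicious column name", col))
        for i, row in enumerate(reader):
            if i >= sample_rows:
                break
            for col, value in row.items():
                if value and SSN_PATTERN.match(value.strip()):
                    findings.append(("SSN-like value", f"{col} (row {i + 1})"))
    return findings

# Example: scan_csv_for_phi("shared_dataset.csv") returns a list of
# (issue, location) pairs for manual review before the data are transferred.
```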

  1. Methods for Automated Identification of Informative Behaviors in Natural Bioptic Driving

    PubMed Central

    Luo, Gang; Peli, Eli

    2012-01-01

    Visually impaired people may legally drive if wearing bioptic telescopes in some developed countries. To address the controversial safety issue of the practice, we have developed a low cost in-car recording system that can be installed in study participants’ own vehicles to record their daily driving activities. We also developed a set of automated identification techniques of informative behaviors to facilitate efficient manual review of important segments submerged in the vast amount of uncontrolled data. Here we present the methods and quantitative results of the detection performance for six types of driving maneuvers and behaviors that are important for bioptic driving: bioptic telescope use, turns, curves, intersections, weaving, and rapid stops. The testing data were collected from one normally sighted and two visually impaired subjects across multiple days. The detection rates ranged from 82% up to 100%, and the false discovery rates ranged from 0% to 13%. In addition, two human observers were able to interpret about 80% of targets viewed through the telescope. These results indicate that with appropriate data processing the low-cost system is able to provide reliable data for natural bioptic driving studies. PMID:22514200

  2. Multiobjective Multifactorial Optimization in Evolutionary Multitasking.

    PubMed

    Gupta, Abhishek; Ong, Yew-Soon; Feng, Liang; Tan, Kay Chen

    2016-05-03

    In recent decades, the field of multiobjective optimization has attracted considerable interest among evolutionary computation researchers. One of the main features that makes evolutionary methods particularly appealing for multiobjective problems is the implicit parallelism offered by a population, which enables simultaneous convergence toward the entire Pareto front. While a plethora of related algorithms have been proposed till date, a common attribute among them is that they focus on efficiently solving only a single optimization problem at a time. Despite the known power of implicit parallelism, seldom has an attempt been made to multitask, i.e., to solve multiple optimization problems simultaneously. It is contended that the notion of evolutionary multitasking leads to the possibility of automated transfer of information across different optimization exercises that may share underlying similarities, thereby facilitating improved convergence characteristics. In particular, the potential for automated transfer is deemed invaluable from the standpoint of engineering design exercises where manual knowledge adaptation and reuse are routine. Accordingly, in this paper, we present a realization of the evolutionary multitasking paradigm within the domain of multiobjective optimization. The efficacy of the associated evolutionary algorithm is demonstrated on some benchmark test functions as well as on a real-world manufacturing process design problem from the composites industry.

  3. PLIP: fully automated protein-ligand interaction profiler.

    PubMed

    Salentin, Sebastian; Schreiber, Sven; Haupt, V Joachim; Adasme, Melissa F; Schroeder, Michael

    2015-07-01

    The characterization of interactions in protein-ligand complexes is essential for research in structural bioinformatics, drug discovery and biology. However, comprehensive tools are not freely available to the research community. Here, we present the protein-ligand interaction profiler (PLIP), a novel web service for fully automated detection and visualization of relevant non-covalent protein-ligand contacts in 3D structures, freely available at projects.biotec.tu-dresden.de/plip-web. The input is either a Protein Data Bank structure, a protein or ligand name, or a custom protein-ligand complex (e.g. from docking). In contrast to other tools, the rule-based PLIP algorithm does not require any structure preparation. It returns a list of detected interactions on single atom level, covering seven interaction types (hydrogen bonds, hydrophobic contacts, pi-stacking, pi-cation interactions, salt bridges, water bridges and halogen bonds). PLIP stands out by offering publication-ready images, PyMOL session files to generate custom images and parsable result files to facilitate successive data processing. The full python source code is available for download on the website. PLIP's command-line mode allows for high-throughput interaction profiling. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  4. A Reproducible Computerized Method for Quantitation of Capillary Density using Nailfold Capillaroscopy.

    PubMed

    Cheng, Cynthia; Lee, Chadd W; Daskalakis, Constantine

    2015-10-27

    Capillaroscopy is a non-invasive, efficient, relatively inexpensive and easy-to-learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient's microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer-based algorithm for quantitating capillaroscopy data.(1) This paper describes a new, simpler, reliable, standardized capillary counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, promoting user fatigue and subjectivity of the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique.
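    The published counting algorithm is not reproduced here, but the general idea of automated capillary counting can be sketched as thresholding an inverted grayscale image and counting sufficiently large connected components; the size threshold below is an assumption.

```python
# Rough sketch of automated capillary counting on a nailfold image:
# threshold the (inverted) image and count connected components above a
# minimum size. Illustrative only; not the published algorithm.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def count_capillaries(gray_image, min_area=30):
    inverted = gray_image.max() - gray_image          # capillaries appear dark
    binary = inverted > threshold_otsu(inverted)
    labelled = label(binary)
    return sum(1 for r in regionprops(labelled) if r.area >= min_area)

rng = np.random.default_rng(3)
toy_image = rng.random((200, 300))                    # stand-in for a real capture
print("capillary count:", count_capillaries(toy_image))
```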

  5. A Reproducible Computerized Method for Quantitation of Capillary Density using Nailfold Capillaroscopy

    PubMed Central

    Daskalakis, Constantine

    2015-01-01

    Capillaroscopy is a non-invasive, efficient, relatively inexpensive and easy-to-learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient's microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer-based algorithm for quantitating capillaroscopy data.1 This paper describes a new, simpler, reliable, standardized capillary counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, promoting user fatigue and subjectivity of the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique. PMID:26554744

  6. Abstracts of AF Materials Laboratory Reports

    DTIC Science & Technology

    1975-09-01

    [Garbled index extract; recoverable report titles include: Improved Automated Tape Laying Machine (AFML-TR-73-307, M. Poullos, W. J. Murray, et al.); Automation of Coating Processes for Gas Turbine Blades and Vanes; A Study of the Stress-Strain Behavior of Graphite.]

  7. DOE Program on Seismic Characterization for Regions of Interest to CTBT Monitoring,

    DTIC Science & Technology

    1995-08-14

    processing of the monitoring network data). While developing and testing the corrections and other parameters needed by the automated processing systems...the secondary network. Parameters tabulated in the knowledge base must be appropriate for routine automated processing of network data, and must also...operation of the PNDC, as well as to results of investigations of "special events" (i.e., those events that fail to locate or discriminate during automated

  8. Automated Sequence Processor: Something Old, Something New

    NASA Technical Reports Server (NTRS)

    Streiffert, Barbara; Schrock, Mitchell; Fisher, Forest; Himes, Terry

    2012-01-01

    High productivity is required for operations teams to meet schedules, and risk must be minimized, so scripting is used to automate processes; these scripts perform essential operations functions. The Automated Sequence Processor (ASP) was a grass-roots task built to automate the command uplink process, and a system engineering task for ASP revitalization was organized. ASP is a set of approximately 200 scripts written in Perl, C Shell, AWK and other scripting languages. ASP processes, checks, and packages non-interactive commands automatically. Non-interactive commands are guaranteed to be safe and have been checked by hardware or software simulators. ASP checks that commands are non-interactive, processes the commands through a command simulator, and then packages them if there are no errors. ASP must be active 24 hours/day, 7 days/week.
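    The sketch below illustrates, in Python rather than the Perl/shell scripts the abstract describes, the basic check-then-package flow: reject interactive commands, run the rest through a (stub) simulator, and package only if no errors are reported. The command names and the simulator are hypothetical.

```python
# Simplified sketch of a check-then-package command uplink flow
# (illustrative; the real ASP is ~200 Perl/shell/AWK scripts).
INTERACTIVE_COMMANDS = {"ARM", "FIRE", "RESET"}   # hypothetical examples

def is_non_interactive(command: str) -> bool:
    return command.split()[0].upper() not in INTERACTIVE_COMMANDS

def simulate(commands):
    """Stub command simulator: return a list of error strings (empty if clean)."""
    return [f"empty command at position {i}" for i, c in enumerate(commands) if not c.strip()]

def process_uplink(commands):
    if not all(is_non_interactive(c) for c in commands):
        raise ValueError("interactive command found; manual review required")
    errors = simulate(commands)
    if errors:
        raise ValueError("simulation errors: " + "; ".join(errors))
    return {"package": list(commands), "status": "ready for uplink"}

print(process_uplink(["SET HEATER ON", "DOWNLINK TLM 42"]))
```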

  9. Managing laboratory automation

    PubMed Central

    Saboe, Thomas J.

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A BIG picture, or continuum view, is presented and some of the reasons for success or failure of the various examples cited are explored. Finally, some comments on future automation needs are discussed. PMID:18925018

  10. Managing laboratory automation.

    PubMed

    Saboe, T J

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A BIG picture, or continuum view, is presented and some of the reasons for success or failure of the various examples cited are explored. Finally, some comments on future automation needs are discussed.

  11. A new image representation for compact and secure communication

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prasad, Lakshman; Skourikhine, A. N.

    In many areas of nuclear materials management there is a need for communication, archival, and retrieval of annotated image data between heterogeneous platforms and devices to effectively implement safety, security, and safeguards of nuclear materials. Current image formats such as JPEG are not ideally suited in such scenarios as they are not scalable to different viewing formats, and do not provide a high-level representation of images that facilitate automatic object/change detection or annotation. The new Scalable Vector Graphics (SVG) open standard for representing graphical information, recommended by the World Wide Web Consortium (W3C), is designed to address issues of image scalability, portability, and annotation. However, until now there has been no viable technology to efficiently field images of high visual quality under this standard. Recently, LANL has developed a vectorized image representation that is compatible with the SVG standard and preserves visual quality. This is based on a new geometric framework for characterizing complex features in real-world imagery that incorporates perceptual principles of processing visual information known from cognitive psychology and vision science, to obtain a polygonal image representation of high fidelity. This representation can take advantage of all textual compression and encryption routines unavailable to other image formats. Moreover, this vectorized image representation can be exploited to facilitate automated object recognition that can reduce time required for data review. The objects/features of interest in these vectorized images can be annotated via animated graphics to facilitate quick and easy display and comprehension of processed image content.
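    A toy illustration of emitting an SVG-compatible polygonal representation from an image region is given below: it traces the contours of a binary mask and writes them as SVG polygons. This is only a sketch of the general idea, not LANL's perceptually motivated vectorization.

```python
# Minimal illustration: convert a binary image region to an SVG polygon string.
import numpy as np
from skimage.measure import find_contours

def mask_to_svg(mask, width, height):
    polygons = []
    for contour in find_contours(mask.astype(float), 0.5):
        # contour rows are (row, col); subsample every 5th point for brevity
        points = " ".join(f"{x:.1f},{y:.1f}" for y, x in contour[::5])
        polygons.append(f'<polygon points="{points}" fill="none" stroke="black"/>')
    body = "\n  ".join(polygons)
    return (f'<svg xmlns="http://www.w3.org/2000/svg" '
            f'width="{width}" height="{height}">\n  {body}\n</svg>')

mask = np.zeros((100, 100), dtype=bool)
mask[30:70, 40:80] = True                      # toy rectangular "object"
print(mask_to_svg(mask, 100, 100)[:200], "...")
```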

  12. Application of the informational reference system OZhUR to the automated processing of data from satellites of the Kosmos series

    NASA Technical Reports Server (NTRS)

    Pokras, V. M.; Yevdokimov, V. P.; Maslov, V. D.

    1978-01-01

    The structure and potential of the information reference system OZhUR, designed for the automated data processing systems of scientific space vehicles (SV), are considered. The system OZhUR ensures control of the extraction phase of processing with respect to a concrete SV and the exchange of data between phases. The practical application of the system OZhUR is exemplified in the construction of a data processing system for satellites of the Cosmos series. As a result of automating the operations of exchange and control, the volume of manual preparation of data is significantly reduced, and there is no longer any need for individual logs that record the status of data processing. The system OZhUR is included in the automated data processing system Nauka, which is realized in the PL-1 language on a binary one-address system (BOS OS) electronic computer.

  13. Laboratory automation: total and subtotal.

    PubMed

    Hawker, Charles D

    2007-12-01

    Worldwide, perhaps 2000 or more clinical laboratories have implemented some form of laboratory automation, either a modular automation system, such as for front-end processing, or a total laboratory automation system. This article provides descriptions and examples of these various types of automation. It also presents an outline of how a clinical laboratory that is contemplating automation should approach its decision and the steps it should follow to ensure a successful implementation. Finally, the role of standards in automation is reviewed.

  14. Optimized anion exchange column isolation of zirconium-89 ( 89 Zr) from yttrium cyclotron target: Method development and implementation on an automated fluidic platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O’Hara, Matthew J.; Murray, Nathaniel J.; Carter, Jennifer C.

    Zirconium-89 (89Zr), produced by the (p,n) reaction from naturally monoisotopic yttrium (natY), is a promising positron emitting isotope for immunoPET imaging. Its long half-life of 78.4 h is sufficient for evaluating slow physiological processes. A prototype automated fluidic system, coupled to on-line and in-line detectors, has been constructed to facilitate development of new 89Zr purification methodologies. The highly reproducible reagent delivery platform and near-real time monitoring of column effluents allows for efficient method optimization. The separation of Zr from dissolved Y metal targets was evaluated using several anion exchange resins. Each resin was evaluated against its ability to quantitatively capture Zr from a load solution that is high in dissolved Y. The most appropriate anion exchange resin for this application was identified, and the separation method was optimized. The method is capable of a high Y decontamination factor (>105) and has been shown to separate Fe, an abundant contaminant in Y foils, from the 89Zr elution fraction. Finally, the performance of the method was evaluated using cyclotron bombarded Y foil targets. The separation method was shown to achieve >95% recovery of the 89Zr present in the foils. The 89Zr eluent, however, was in a chemical matrix not immediately conducive to labeling onto proteins. The main intent of this study was to develop a tandem column 89Zr purification process, wherein the anion exchange column method described here is the first separation in a dual-column purification process.

  15. Scaling Retro-Commissioning to Small Commercial Buildings: A Turnkey Automated Hardware-Software Solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Guanjing; Granderson, J.; Brambley, Michael R.

    2015-07-01

    In the United States, small commercial buildings represent 51% of total floor space of all commercial buildings and consume nearly 3 quadrillion Btu (3.2 quintillion joule) of site energy annually, presenting an enormous opportunity for energy savings. Retro-commissioning (RCx), the process through which professional energy service providers identify and correct operational problems, has proven to be a cost-effective means to achieve median energy savings of 16%. However, retro-commissioning is not typically conducted at scale throughout the commercial stock. Very few small commercial buildings are retro-commissioned because utility expenses are relatively modest, margins are tighter, and capital for improvements is limited. In addition, small buildings do not have in-house staff with the expertise to identify improvement opportunities. In response, a turnkey hardware-software solution was developed to enable cost-effective, monitoring-based RCx of small commercial buildings. This highly tailored solution enables non-commissioning providers to identify energy and comfort problems, as well as associated cost impacts and remedies. It also facilitates scale by offering energy service providers the means to streamline their existing processes and reduce costs by more than half. The turnkey RCx sensor suitcase consists of two primary components: a suitcase of sensors for short-term building data collection that guides users through the process of deploying and retrieving their data and a software application that automates analysis of sensor data, identifies problems and generates recommendations. This paper presents the design and testing of prototype models, including descriptions of the hardware design, analysis algorithms, performance testing, and plans for dissemination.

  16. Virtual reality case-specific rehearsal in temporal bone surgery: a preliminary evaluation.

    PubMed

    Arora, Asit; Swords, Chloe; Khemani, Sam; Awad, Zaid; Darzi, Ara; Singh, Arvind; Tolley, Neil

    2014-01-01

    1. To investigate the feasibility of performing case-specific surgical rehearsal using a virtual reality temporal bone simulator. 2. To identify potential clinical applications in temporal bone surgery. Prospective assessment study. St Mary's Hospital, Imperial College NHS Trust, London UK. Sixteen participants consisting of a trainer and trainee group. Twenty-four cadaver temporal bones were CT-scanned and uploaded onto the Voxelman simulator. Sixteen participants performed a 90-min temporal bone dissection on the generic simulation model followed by 3 dissection tasks on the case simulation and cadaver models. Case rehearsal was assessed for feasibility. Clinical applications and usefulness were evaluated using a 5-point Likert-type scale. The upload process required a semi-automated system. Average time for upload was 20 min. Suboptimal reconstruction occurred in 21% of cases arising when the mastoid process and ossicular chain were not captured (n = 2) or when artefact was generated (n = 3). Case rehearsal rated highly (Likert score >4) for confidence (75%), facilitating planning (75%) and training (94%). Potential clinical applications for case rehearsal include ossicular chain surgery, cochlear implantation and congenital anomalies. Case rehearsal of cholesteatoma surgery is not possible on the current platform due to suboptimal soft tissue representation. The process of uploading CT data onto a virtual reality temporal bone simulator to perform surgical rehearsal is feasible using a semi-automated system. Further clinical evaluation is warranted to assess the benefit of performing patient-specific surgical rehearsal in selected procedures. Copyright © 2013 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.

  17. Towards automatic patient selection for chemotherapy in colorectal cancer trials

    NASA Astrophysics Data System (ADS)

    Wright, Alexander; Magee, Derek; Quirke, Philip; Treanor, Darren E.

    2014-03-01

    A key factor in the prognosis of colorectal cancer, and its response to chemoradiotherapy, is the ratio of cancer cells to surrounding tissue (the so-called tumour:stroma ratio). Currently, the tumour:stroma ratio is calculated manually by examining H&E-stained slides and counting the proportion of area of each. Virtual slides facilitate this analysis by allowing pathologists to annotate areas of tumour on a given digital slide image, and in-house developed stereometry tools mark random, systematic points on the slide, known as spots. These spots are examined and classified by the pathologist. Typical analyses require a pathologist to score at least 300 spots per tumour. This is a time-consuming (10-60 minutes per case) and laborious task for the pathologist, and automating this process is highly desirable. Using an existing dataset of expert-classified spots from one colorectal cancer clinical trial, an automated tumour:stroma detection algorithm has been trained and validated. Each spot is extracted as an image patch, and then processed for feature extraction, identifying colour, texture, stain intensity and object characteristics. These features are used as training data for a random forest classification algorithm, and validated against unseen image patches. This process was repeated for multiple patch sizes. Over 82,000 such patches have been used, and results show an accuracy of 79%, depending on image patch size. A second study examining contextual requirements for pathologist scoring was conducted and indicates that further analysis of structures within each image patch is required in order to improve algorithm accuracy.
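    The patch-classification step can be sketched with scikit-learn: compute a small colour/texture feature vector per patch and train a random forest. The features and toy data below are simplified placeholders rather than the study's actual feature set.

```python
# Sketch of the patch-classification step: simple per-patch colour/texture
# features plus a random forest. Features are illustrative placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def patch_features(patch):
    """patch: HxWx3 float array -> feature vector (channel means, stds, gradient energy)."""
    means = patch.mean(axis=(0, 1))
    stds = patch.std(axis=(0, 1))
    gray = patch.mean(axis=2)
    dy, dx = np.gradient(gray)
    grad_energy = np.mean(np.abs(dy) + np.abs(dx))
    return np.concatenate([means, stds, [grad_energy]])

# Toy data: 200 random patches with random tumour/stroma labels
rng = np.random.default_rng(4)
patches = rng.random((200, 64, 64, 3))
labels = rng.integers(0, 2, size=200)          # 0 = stroma, 1 = tumour
X = np.array([patch_features(p) for p in patches])

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```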

  18. KNIME4NGS: a comprehensive toolbox for next generation sequencing analysis.

    PubMed

    Hastreiter, Maximilian; Jeske, Tim; Hoser, Jonathan; Kluge, Michael; Ahomaa, Kaarin; Friedl, Marie-Sophie; Kopetzky, Sebastian J; Quell, Jan-Dominik; Mewes, H Werner; Küffner, Robert

    2017-05-15

    Analysis of Next Generation Sequencing (NGS) data requires the processing of large datasets by chaining various tools with complex input and output formats. In order to automate data analysis, we propose to standardize NGS tasks into modular workflows. This simplifies reliable handling and processing of NGS data, and corresponding solutions become substantially more reproducible and easier to maintain. Here, we present a documented, linux-based, toolbox of 42 processing modules that are combined to construct workflows facilitating a variety of tasks such as DNAseq and RNAseq analysis. We also describe important technical extensions. The high throughput executor (HTE) helps to increase the reliability and to reduce manual interventions when processing complex datasets. We also provide a dedicated binary manager that assists users in obtaining the modules' executables and keeping them up to date. As basis for this actively developed toolbox we use the workflow management software KNIME. See http://ibisngs.github.io/knime4ngs for nodes and user manual (GPLv3 license). robert.kueffner@helmholtz-muenchen.de. Supplementary data are available at Bioinformatics online.

  19. Optimal processing for gel electrophoresis images: Applying Monte Carlo Tree Search in GelApp.

    PubMed

    Nguyen, Phi-Vu; Ghezal, Ali; Hsueh, Ya-Chih; Boudier, Thomas; Gan, Samuel Ken-En; Lee, Hwee Kuan

    2016-08-01

    In biomedical research, gel band size estimation in electrophoresis analysis is a routine process. To facilitate and automate this process, numerous software tools have been released, notably the GelApp mobile app. However, the band detection accuracy is limited due to a band detection algorithm that cannot adapt to the variations in input images. To address this, we used the Monte Carlo Tree Search with Upper Confidence Bound (MCTS-UCB) method to efficiently search for optimal image processing pipelines for the band detection task, thereby improving the segmentation algorithm. Incorporating this into GelApp, we report a significant enhancement of gel band detection accuracy by 55.9 ± 2.0% for protein polyacrylamide gels, and 35.9 ± 2.5% for DNA SYBR green agarose gels. This implementation is a proof-of-concept in demonstrating MCTS-UCB as a strategy to optimize general image segmentation. The improved version of GelApp, GelApp 2.0, is freely available on both the Google Play Store (for the Android platform) and the Apple App Store (for the iOS platform). © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
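
    The record above describes Monte Carlo Tree Search with an upper confidence bound for choosing image-processing pipelines. The sketch below shows only the UCB1 selection rule applied to a flat set of candidate pipelines, i.e. a multi-armed-bandit simplification rather than the full tree search; the pipeline names and the noisy scoring function are invented for illustration.

        import math
        import random

        # Hypothetical candidate pre-processing pipelines; in the cited work these
        # would be sequences of image-processing operations applied before band detection.
        PIPELINES = ["median+otsu", "gaussian+adaptive", "clahe+watershed", "tophat+otsu"]

        def evaluate(pipeline):
            # Placeholder: return a noisy "segmentation accuracy" for the pipeline.
            base = {"median+otsu": 0.60, "gaussian+adaptive": 0.70,
                    "clahe+watershed": 0.80, "tophat+otsu": 0.65}[pipeline]
            return min(1.0, max(0.0, random.gauss(base, 0.05)))

        def ucb1_search(budget=200, c=1.4):
            counts = {p: 0 for p in PIPELINES}
            totals = {p: 0.0 for p in PIPELINES}
            for t in range(1, budget + 1):
                if t <= len(PIPELINES):
                    # Try each candidate once before applying the UCB rule.
                    choice = PIPELINES[t - 1]
                else:
                    # Pick the candidate maximising mean reward + exploration bonus.
                    choice = max(PIPELINES, key=lambda p: totals[p] / counts[p]
                                 + c * math.sqrt(math.log(t) / counts[p]))
                reward = evaluate(choice)
                counts[choice] += 1
                totals[choice] += reward
            return max(PIPELINES, key=lambda p: totals[p] / counts[p])

        print("best pipeline found:", ucb1_search())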

  20. Linking Goal-Oriented Requirements and Model-Driven Development

    NASA Astrophysics Data System (ADS)

    Pastor, Oscar; Giachetti, Giovanni

    In the context of Goal-Oriented Requirement Engineering (GORE), there are interesting modeling approaches for the analysis of complex scenarios, aimed at obtaining and representing the requirements relevant to the development of software products. However, the way to use these GORE models in an automated Model-Driven Development (MDD) process is not clear, and, in general terms, the translation of these models into the final software products is still manually performed. Therefore, in this chapter, we show an approach to automatically link GORE models and MDD processes, which has been elaborated by considering the experience obtained from linking the i* framework with an industrially applied MDD approach. The linking approach proposed is formulated by means of a generic process that is based on current modeling standards and technologies in order to facilitate its application for different MDD and GORE approaches. Special attention is paid to how this process generates appropriate model transformation mechanisms to automatically obtain MDD conceptual models from GORE models, and how it can be used to specify validation mechanisms to assure correct model transformations.

  1. Automated solar cell assembly team process research

    NASA Astrophysics Data System (ADS)

    Nowlan, M. J.; Hogan, S. J.; Darkazalli, G.; Breen, W. F.; Murach, J. M.; Sutherland, S. F.; Patterson, J. S.

    1994-06-01

    This report describes work done under the Photovoltaic Manufacturing Technology (PVMaT) project, Phase 3A, which addresses problems that are generic to the photovoltaic (PV) industry. Spire's objective during Phase 3A was to use its light soldering technology and experience to design and fabricate solar cell tabbing and interconnecting equipment to develop new, high-yield, high-throughput, fully automated processes for tabbing and interconnecting thin cells. Areas that were addressed include processing rates, process control, yield, throughput, material utilization efficiency, and increased use of automation. Spire teamed with Solec International, a PV module manufacturer, and the University of Massachusetts at Lowell's Center for Productivity Enhancement (CPE), automation specialists, who are lower-tier subcontractors. A number of other PV manufacturers, including Siemens Solar, Mobil Solar, Solar Web, and Texas Instruments, agreed to evaluate the processes developed under this program.

  2. Industrial applications of automated X-ray inspection

    NASA Astrophysics Data System (ADS)

    Shashishekhar, N.

    2015-03-01

    Many industries require that 100% of manufactured parts be X-ray inspected. Factors such as high production rates, focus on inspection quality, operator fatigue and inspection cost reduction translate to an increasing need for automating the inspection process. Automated X-ray inspection involves the use of image processing algorithms and computer software for analysis and interpretation of X-ray images. This paper presents industrial applications and illustrative case studies of automated X-ray inspection in areas such as automotive castings, fuel plates, air-bag inflators and tires. It is usually necessary to employ application-specific automated inspection strategies and techniques, since each application has unique characteristics and interpretation requirements.

  3. An automated Genomes-to-Natural Products platform (GNP) for the discovery of modular natural products.

    PubMed

    Johnston, Chad W; Skinnider, Michael A; Wyatt, Morgan A; Li, Xiang; Ranieri, Michael R M; Yang, Lian; Zechel, David L; Ma, Bin; Magarvey, Nathan A

    2015-09-28

    Bacterial natural products are a diverse and valuable group of small molecules, and genome sequencing indicates that the vast majority remain undiscovered. The prediction of natural product structures from biosynthetic assembly lines can facilitate their discovery, but highly automated, accurate, and integrated systems are required to mine the broad spectrum of sequenced bacterial genomes. Here we present a genome-guided natural products discovery tool to automatically predict, combinatorialize and identify polyketides and nonribosomal peptides from biosynthetic assembly lines using LC-MS/MS data of crude extracts in a high-throughput manner. We detail the directed identification and isolation of six genetically predicted polyketides and nonribosomal peptides using our Genome-to-Natural Products platform. This highly automated, user-friendly programme provides a means of realizing the potential of genetically encoded natural products.

  4. An automated field phenotyping pipeline for application in grapevine research.

    PubMed

    Kicherer, Anna; Herzog, Katja; Pflanz, Michael; Wieland, Markus; Rüger, Philipp; Kecke, Steffen; Kuhlmann, Heiner; Töpfer, Reinhard

    2015-02-26

    Due to its perennial nature and size, the acquisition of phenotypic data in grapevine research is almost exclusively restricted to the field and done by visual estimation. This kind of evaluation procedure is limited by time, cost and the subjectivity of records. As a consequence, objectivity, automation and more precision of phenotypic data evaluation are needed to increase the number of samples, manage grapevine repositories, enable genetic research of new phenotypic traits and, therefore, increase the efficiency in plant research. In the present study, an automated field phenotyping pipeline was set up and applied in a plot of genetic resources. The application of the PHENObot allows image acquisition from at least 250 individual grapevines per hour directly in the field without user interaction. Data management is handled by a database (IMAGEdata). The automatic image analysis tool BIVcolor (Berries in Vineyards-color) permitted the collection of precise phenotypic data of two important fruit traits, berry size and color, within a large set of plants. The application of the PHENObot represents an automated tool for high-throughput sampling of image data in the field. The automated analysis of these images facilitates the generation of objective and precise phenotypic data on a larger scale.
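
    As a rough illustration of the kind of per-berry measurement described for BIVcolor, the sketch below thresholds a colour image, labels connected components, and reports each blob's pixel area and mean colour. The threshold rule, the synthetic image, and the function name are assumptions made for illustration, not the published algorithm.

        import numpy as np
        from scipy import ndimage

        def measure_berries(rgb, blue_threshold=100):
            # Very rough stand-in for berry segmentation: treat blue-dominant pixels
            # as berry candidates, label connected components, and report per-berry
            # size (pixel area) and mean colour.
            mask = (rgb[..., 2] > blue_threshold) & (rgb[..., 0] < blue_threshold)
            labels, n = ndimage.label(mask)
            results = []
            for i in range(1, n + 1):
                region = labels == i
                area_px = int(region.sum())
                mean_rgb = rgb[region].mean(axis=0)
                results.append({"area_px": area_px, "mean_rgb": mean_rgb.round(1).tolist()})
            return results

        # Hypothetical image: a grey background with one synthetic bluish "berry" patch.
        img = np.full((120, 120, 3), 128, dtype=np.uint8)
        img[40:70, 50:80] = (30, 40, 180)
        for berry in measure_berries(img):
            print(berry)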

  5. Automated tumor analysis for molecular profiling in lung cancer

    PubMed Central

    Boyd, Clinton; James, Jacqueline A.; Loughrey, Maurice B.; Hougton, Joseph P.; Boyle, David P.; Kelly, Paul; Maxwell, Perry; McCleary, David; Diamond, James; McArt, Darragh G.; Tunstall, Jonathon; Bankhead, Peter; Salto-Tellez, Manuel

    2015-01-01

    The discovery and clinical application of molecular biomarkers in solid tumors increasingly rely on nucleic acid extraction from FFPE tissue sections and subsequent molecular profiling. This in turn requires the pathological review of haematoxylin & eosin (H&E) stained slides to ensure sample quality and tumor DNA sufficiency, by visually estimating the percentage of tumor nuclei, and to annotate tumor regions for manual macrodissection. In this study on NSCLC, we demonstrate considerable variation in tumor nuclei percentage between pathologists, potentially undermining the precision of NSCLC molecular evaluation and emphasising the need for quantitative tumor evaluation. We subsequently describe the development and validation of a system called TissueMark for automated tumor annotation and percentage tumor nuclei measurement in NSCLC using computerized image analysis. Evaluation of 245 NSCLC slides showed precise automated tumor annotation of cases using TissueMark, strong concordance with manually drawn boundaries, and identical EGFR mutational status following manual macrodissection from the image-analysis-generated tumor boundaries. Automated analysis of cell counts for % tumor measurements by TissueMark showed reduced variability and significant correlation (p < 0.001) with benchmark tumor cell counts. This study demonstrates a robust image analysis technology that can facilitate the automated quantitative analysis of tissue samples for molecular profiling in discovery and diagnostics. PMID:26317646

  6. Astronomical algorithms for automated analysis of tissue protein expression in breast cancer

    PubMed Central

    Ali, H R; Irwin, M; Morris, L; Dawson, S-J; Blows, F M; Provenzano, E; Mahler-Araujo, B; Pharoah, P D; Walton, N A; Brenton, J D; Caldas, C

    2013-01-01

    Background: High-throughput evaluation of tissue biomarkers in oncology has been greatly accelerated by the widespread use of tissue microarrays (TMAs) and immunohistochemistry. Although TMAs have the potential to facilitate protein expression profiling on a scale to rival experiments of tumour transcriptomes, the bottleneck and imprecision of manually scoring TMAs have impeded progress. Methods: We report image analysis algorithms adapted from astronomy for the precise automated analysis of IHC in all subcellular compartments. The power of this technique is demonstrated using over 2000 breast tumours and comparing quantitative automated scores against manual assessment by pathologists. Results: All continuous automated scores showed good correlation with their corresponding ordinal manual scores. For oestrogen receptor (ER), the correlation was 0.82, P<0.0001, for BCL2 0.72, P<0.0001 and for HER2 0.62, P<0.0001. Automated scores showed excellent concordance with manual scores for the unsupervised assignment of cases to 'positive' or 'negative' categories with agreement rates of up to 96%. Conclusion: The adaptation of astronomical algorithms, coupled with their application to large annotated study cohorts, constitutes a powerful tool for the realisation of the enormous potential of digital pathology. PMID:23329232

  7. RNA–protein binding kinetics in an automated microfluidic reactor

    PubMed Central

    Ridgeway, William K.; Seitaridou, Effrosyni; Phillips, Rob; Williamson, James R.

    2009-01-01

    Microfluidic chips can automate biochemical assays on the nanoliter scale, which is of considerable utility for RNA–protein binding reactions that would otherwise require large quantities of proteins. Unfortunately, complex reactions involving multiple reactants cannot be prepared in current microfluidic mixer designs, nor is investigation of long-time scale reactions possible. Here, a microfluidic 'Riboreactor' has been designed and constructed to facilitate the study of kinetics of RNA–protein complex formation over long time scales. With computer automation, the reactor can prepare binding reactions from any combination of eight reagents, and is optimized to monitor long reaction times. By integrating a two-photon microscope into the microfluidic platform, 5-nl reactions can be observed for longer than 1000 s with single-molecule sensitivity and negligible photobleaching. Using the Riboreactor, RNA–protein binding reactions with a fragment of the bacterial 30S ribosome were prepared in a fully automated fashion and binding rates were consistent with rates obtained from conventional assays. The microfluidic chip successfully combines automation, low sample consumption, ultra-sensitive fluorescence detection and a high degree of reproducibility. The chip should be able to probe complex reaction networks describing the assembly of large multicomponent RNPs such as the ribosome. PMID:19759214

  8. An Automated Field Phenotyping Pipeline for Application in Grapevine Research

    PubMed Central

    Kicherer, Anna; Herzog, Katja; Pflanz, Michael; Wieland, Markus; Rüger, Philipp; Kecke, Steffen; Kuhlmann, Heiner; Töpfer, Reinhard

    2015-01-01

    Due to its perennial nature and size, the acquisition of phenotypic data in grapevine research is almost exclusively restricted to the field and done by visual estimation. This kind of evaluation procedure is limited by time, cost and the subjectivity of records. As a consequence, objectivity, automation and more precision of phenotypic data evaluation are needed to increase the number of samples, manage grapevine repositories, enable genetic research of new phenotypic traits and, therefore, increase the efficiency in plant research. In the present study, an automated field phenotyping pipeline was set up and applied in a plot of genetic resources. The application of the PHENObot allows image acquisition from at least 250 individual grapevines per hour directly in the field without user interaction. Data management is handled by a database (IMAGEdata). The automatic image analysis tool BIVcolor (Berries in Vineyards-color) permitted the collection of precise phenotypic data of two important fruit traits, berry size and color, within a large set of plants. The application of the PHENObot represents an automated tool for high-throughput sampling of image data in the field. The automated analysis of these images facilitates the generation of objective and precise phenotypic data on a larger scale. PMID:25730485

  9. Space biology initiative program definition review. Trade study 1: Automation costs versus crew utilization

    NASA Technical Reports Server (NTRS)

    Jackson, L. Neal; Crenshaw, John, Sr.; Hambright, R. N.; Nedungadi, A.; Mcfayden, G. M.; Tsuchida, M. S.

    1989-01-01

    A significant emphasis upon automation within the Space Biology Initiative hardware appears justified in order to conserve crew labor and crew training effort. Two generic forms of automation were identified: automation of data and information handling and decision making, and the automation of material handling, transfer, and processing. The use of automatic data acquisition, expert systems, robots, and machine vision will increase the volume of experiments and quality of results. The automation described may also influence efforts to miniaturize and modularize the large array of SBI hardware identified to date. The cost and benefit model developed appears to be a useful guideline for SBI equipment specifiers and designers. Additional refinements would enhance the validity of the model. Two NASA automation pilot programs, 'The Principal Investigator in a Box' and 'Rack Mounted Robots' were investigated and found to be quite appropriate for adaptation to the SBI program. There are other in-house NASA efforts that provide technology that may be appropriate for the SBI program. Important data is believed to exist in advanced medical labs throughout the U.S., Japan, and Europe. The information and data processing in medical analysis equipment is highly automated and future trends reveal continued progress in this area. However, automation of material handling and processing has progressed in a limited manner because the medical labs are not affected by the power and space constraints that Space Station medical equipment is faced with. Therefore, NASA's major emphasis in automation will require a lead effort in the automation of material handling to achieve optimal crew utilization.

  10. Student Emotions in Conversation-Based Assessments

    ERIC Educational Resources Information Center

    Lehman, Blair A.; Zapata-Rivera, Diego

    2018-01-01

    Students can experience a variety of emotions while completing assessments. Some emotions can get in the way of students performing their best (e.g., anxiety, frustration), whereas other emotions can facilitate student performance (e.g., engagement). Many new, non-traditional assessments, such as automated conversation-based assessments (CBA), are…

  11. Educators' Perceptions of Automated Feedback Systems

    ERIC Educational Resources Information Center

    Debuse, Justin C. W.; Lawley, Meredith; Shibl, Rania

    2008-01-01

    Assessment of student learning is a core function of educators. Ideally students should be provided with timely, constructive feedback to facilitate learning. However, provision of high quality feedback becomes more complex as class sizes increase, modes of study expand and academic workloads increase. ICT solutions are being developed to…

  12. Improving Access to and Understanding of Regulations through Taxonomies

    ERIC Educational Resources Information Center

    Cheng, Chin Pang; Lau, Gloria T.; Law, Kincho H.; Pan, Jiayi; Jones, Albert

    2009-01-01

    Industrial taxonomies have the potential to automate information retrieval, facilitate interoperability and, most importantly, improve decision making - decisions that must comply with existing government regulations and codes of practice. However, it is difficult to find those regulations and codes most relevant to a particular decision, even…

  13. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    PubMed

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing, and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  14. FRAME (Force Review Automation Environment): MATLAB-based AFM data processor.

    PubMed

    Partola, Kostyantyn R; Lykotrafitis, George

    2016-05-03

    Data processing of force-displacement curves generated by atomic force microscopes (AFMs) for elastic moduli and unbinding event measurements is very time-consuming and susceptible to user error or bias. There is an evident need for consistent, dependable, and easy-to-use AFM data processing software. We have developed an open-source software application, the Force Review Automation Environment (FRAME), which provides users with an intuitive graphical user interface, automated data processing, and tools for expediting manual processing. We did not observe a significant difference between manually processed and automatically processed results from the same data sets. Copyright © 2016 Elsevier Ltd. All rights reserved.
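
    FRAME itself is MATLAB-based and its internals are not described in the record above. Purely as a generic illustration of the elastic-modulus step in AFM force-curve processing, the Python sketch below fits a Hertzian spherical-contact model to a synthetic force-indentation curve; the tip radius, Poisson ratio, and data are assumed values, not FRAME's implementation.

        import numpy as np

        def fit_hertz_modulus(indentation_m, force_N, tip_radius_m=2e-6, poisson=0.5):
            # Hertz model for a spherical tip: F = (4/3) * E/(1 - v^2) * sqrt(R) * d^(3/2).
            # Linear least squares of F against d^(3/2), through the origin,
            # gives the slope and hence the Young's modulus E.
            x = indentation_m ** 1.5
            slope = np.dot(x, force_N) / np.dot(x, x)
            return slope * 3.0 * (1.0 - poisson ** 2) / (4.0 * np.sqrt(tip_radius_m))

        # Synthetic force curve for a 10 kPa sample, with piconewton-level noise.
        rng = np.random.default_rng(1)
        depth = np.linspace(0, 500e-9, 100)            # indentation depth (m)
        E_true = 10e3                                   # Pa
        force = (4/3) * E_true / (1 - 0.5**2) * np.sqrt(2e-6) * depth**1.5
        force += rng.normal(0, 1e-12, force.size)
        print(f"fitted modulus: {fit_hertz_modulus(depth, force):.0f} Pa")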

  15. Accelerated design of bioconversion processes using automated microscale processing techniques.

    PubMed

    Lye, Gary J; Ayazi-Shamlou, Parviz; Baganz, Frank; Dalby, Paul A; Woodley, John M

    2003-01-01

    Microscale processing techniques are rapidly emerging as a means to increase the speed of bioprocess design and reduce material requirements. Automation of these techniques can reduce labour intensity and enable a wider range of process variables to be examined. This article examines recent research on various individual microscale unit operations including microbial fermentation, bioconversion and product recovery techniques. It also explores the potential of automated whole process sequences operated in microwell formats. The power of the whole process approach is illustrated by reference to a particular bioconversion, namely the Baeyer-Villiger oxidation of bicyclo[3.2.0]hept-2-en-6-one for the production of optically pure lactones.

  16. Automating Acquisitions: The Planning Process.

    ERIC Educational Resources Information Center

    Bryant, Bonita

    1984-01-01

    Account of process followed at large academic library in preparing for automation of acquisition and fund accounting functions highlights planning criteria, local goals, planning process elements (selecting participants, assigning tasks, devising timetable, providing foundations, evaluating systems, determining costs, formulating recommendations).…

  17. EOS Terra: EOS DAM Automation Constellation MOWG

    NASA Technical Reports Server (NTRS)

    Mantziaras, Dimitrios C.

    2017-01-01

    Brief summary of the decision factors considered and process improvement steps taken to evolve the ESMO debris avoidance maneuver (DAM) process into a more automated process. Presentation is in response to an action item/question received at a prior MOWG meeting.

  18. A system-level approach to automation research

    NASA Technical Reports Server (NTRS)

    Harrison, F. W.; Orlando, N. E.

    1984-01-01

    Automation is the application of self-regulating mechanical and electronic devices to processes that can be accomplished with the human organs of perception, decision, and actuation. The successful application of automation to a system process should reduce man/system interaction and the perceived complexity of the system, or should increase affordability, productivity, quality control, and safety. The expense, time constraints, and risk factors associated with extravehicular activities have led the Automation Technology Branch (ATB), as part of the NASA Automation Research and Technology Program, to investigate the use of robots and teleoperators as automation aids in the context of space operations. The ATB program addresses three major areas: (1) basic research in autonomous operations, (2) human factors research on man-machine interfaces with remote systems, and (3) the integration and analysis of automated systems. This paper reviews the current ATB research in the area of robotics and teleoperators.

  19. First Annual Workshop on Space Operations Automation and Robotics (SOAR 87)

    NASA Technical Reports Server (NTRS)

    Griffin, Sandy (Editor)

    1987-01-01

    Several topics relative to automation and robotics technology are discussed. Automation of checkout, ground support, and logistics; automated software development; man-machine interfaces; neural networks; systems engineering and distributed/parallel processing architectures; and artificial intelligence/expert systems are among the topics covered.

  20. Application of automation and information systems to forensic genetic specimen processing.

    PubMed

    Leclair, Benoît; Scholl, Tom

    2005-03-01

    During the last 10 years, the introduction of PCR-based DNA typing technologies in forensic applications has been highly successful. This technology has become pervasive throughout forensic laboratories and it continues to grow in prevalence. For many criminal cases, it provides the most probative evidence. Criminal genotype data banking and victim identification initiatives that follow mass-fatality incidents have benefited the most from the introduction of automation for sample processing and data analysis. Attributes of offender specimens including large numbers, high quality and identical collection and processing are ideal for the application of laboratory automation. The magnitude of kinship analysis required by mass-fatality incidents necessitates the application of computing solutions to automate the task. More recently, the development activities of many forensic laboratories are focused on leveraging experience from these two applications to casework sample processing. The trend toward increased prevalence of forensic genetic analysis will continue to drive additional innovations in high-throughput laboratory automation and information systems.

  1. High-throughput physical mapping of chromosomes using automated in situ hybridization.

    PubMed

    George, Phillip; Sharakhova, Maria V; Sharakhov, Igor V

    2012-06-28

    Projects to obtain whole-genome sequences for 10,000 vertebrate species and for 5,000 insect and related arthropod species are expected to take place over the next 5 years. For example, the sequencing of the genomes for 15 malaria mosquito species is currently being done using an Illumina platform. This Anopheles species cluster includes both vectors and non-vectors of malaria. When the genome assemblies become available, researchers will have the unique opportunity to perform comparative analysis for inferring evolutionary changes relevant to vector ability. However, it has proven difficult to use next-generation sequencing reads to generate high-quality de novo genome assemblies. Moreover, the existing genome assemblies for Anopheles gambiae, although obtained using the Sanger method, are gapped or fragmented. Success of comparative genomic analyses will be limited if researchers deal with numerous sequencing contigs, rather than with chromosome-based genome assemblies. Fragmented, unmapped sequences create problems for genomic analyses because: (i) unidentified gaps cause incorrect or incomplete annotation of genomic sequences; (ii) unmapped sequences lead to confusion between paralogous genes and genes from different haplotypes; and (iii) the lack of chromosome assignment and orientation of the sequencing contigs does not allow for reconstructing rearrangement phylogeny and studying chromosome evolution. Developing high-resolution physical maps for species with newly sequenced genomes is a timely and cost-effective investment that will facilitate genome annotation, evolutionary analysis, and re-sequencing of individual genomes from natural populations. Here, we present innovative approaches to chromosome preparation, fluorescent in situ hybridization (FISH), and imaging that facilitate rapid development of physical maps. Using An. gambiae as an example, we demonstrate that the development of physical chromosome maps can potentially improve genome assemblies and, thus, the quality of genomic analyses. First, we use a high-pressure method to prepare polytene chromosome spreads. This method, originally developed for Drosophila, allows the user to visualize more details on chromosomes than the regular squashing technique. Second, a fully automated, front-end system for FISH is used for high-throughput physical genome mapping. The automated slide staining system runs multiple assays simultaneously and dramatically reduces hands-on time. Third, an automatic fluorescent imaging system, which includes a motorized slide stage, automatically scans and photographs labeled chromosomes after FISH. This system is especially useful for identifying and visualizing multiple chromosomal plates on the same slide. In addition, the scanning process captures a more uniform FISH result. Overall, the automated high-throughput physical mapping protocol is more efficient than a standard manual protocol.

  2. Automation bias: decision making and performance in high-tech cockpits.

    PubMed

    Mosier, K L; Skitka, L J; Heers, S; Burdick, M

    1997-01-01

    Automated aids and decision support tools are rapidly becoming indispensable tools in high-technology cockpits and are assuming increasing control of "cognitive" flight tasks, such as calculating fuel-efficient routes, navigating, or detecting and diagnosing system malfunctions and abnormalities. This study was designed to investigate automation bias, a recently documented factor in the use of automated aids and decision support systems. The term refers to omission and commission errors resulting from the use of automated cues as a heuristic replacement for vigilant information seeking and processing. Glass-cockpit pilots flew flight scenarios involving automation events or opportunities for automation-related omission and commission errors. Although experimentally manipulated accountability demands did not significantly impact performance, post hoc analyses revealed that those pilots who reported an internalized perception of "accountability" for their performance and strategies of interaction with the automation were significantly more likely to double-check automated functioning against other cues and less likely to commit errors than those who did not share this perception. Pilots were also likely to erroneously "remember" the presence of expected cues when describing their decision-making processes.

  3. Examining single- and multiple-process theories of trust in automation.

    PubMed

    Rice, Stephen

    2009-07-01

    The author examined the effects of human responses to automation alerts and nonalerts. Previous research has shown that automation false alarms and misses have differential effects on human trust (i.e., automation false alarms tend to affect operator compliance, whereas automation misses tend to affect operator reliance). Participants performed a simulated combat task, whereby they examined aerial photographs for the presence of enemy targets. A diagnostic aid provided a recommendation during each trial. The author manipulated the reliability and response bias of the aid to provide appropriate data for state-trace analyses. The analyses provided strong evidence that only a multiple-process theory of operator trust can explain the effects of automation errors on human dependence behaviors. The author discusses the theoretical and practical implications of this finding.

  4. Robotics for Nuclear Material Handling at LANL:Capabilities and Needs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harden, Troy A; Lloyd, Jane A; Turner, Cameron J

    Nuclear material processing operations present numerous challenges for effective automation. Confined spaces, hazardous materials and processes, particulate contamination, radiation sources, and corrosive chemical operations are but a few of the significant hazards. However, automated systems represent a significant safety advance when deployed in place of manual tasks performed by human workers. The replacement of manual operations with automated systems has been desirable for nearly 40 years, yet only recently are automated systems becoming increasingly common for nuclear materials handling applications. This paper reviews several automation systems which are deployed or about to be deployed at Los Alamos National Laboratory for nuclear material handling operations. Highlighted are the current social and technological challenges faced in deploying automated systems into hazardous material handling environments and the opportunities for future innovations.

  5. Automation of Cassini Support Imaging Uplink Command Development

    NASA Technical Reports Server (NTRS)

    Ly-Hollins, Lisa; Breneman, Herbert H.; Brooks, Robert

    2010-01-01

    "Support imaging" is imagery requested by other Cassini science teams to aid in the interpretation of their data. The generation of the spacecraft command sequences for these images is performed by the Cassini Instrument Operations Team. The process initially established for doing this was very labor-intensive, tedious and prone to human error. Team management recognized this process as one that could easily benefit from automation. Team members were tasked to document the existing manual process, develop a plan and strategy to automate the process, implement the plan and strategy, test and validate the new automated process, and deliver the new software tools and documentation to Flight Operations for use during the Cassini extended mission. In addition to the goals of higher efficiency and lower risk in the processing of support imaging requests, an effort was made to maximize adaptability of the process to accommodate uplink procedure changes and the potential addition of new capabilities outside the scope of the initial effort.

  6. 9 CFR 381.307 - Record review and maintenance.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... be identified by production date, container code, processing vessel number or other designation and... review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and... applicable requirements of § 381.306. (c) Container closure records. Written records of all container closure...

  7. 9 CFR 381.307 - Record review and maintenance.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... be identified by production date, container code, processing vessel number or other designation and... review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and... applicable requirements of § 381.306. (c) Container closure records. Written records of all container closure...

  8. 9 CFR 381.307 - Record review and maintenance.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... be identified by production date, container code, processing vessel number or other designation and... review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and... applicable requirements of § 381.306. (c) Container closure records. Written records of all container closure...

  9. 9 CFR 381.307 - Record review and maintenance.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... be identified by production date, container code, processing vessel number or other designation and... review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and... applicable requirements of § 381.306. (c) Container closure records. Written records of all container closure...

  10. 9 CFR 381.307 - Record review and maintenance.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... be identified by production date, container code, processing vessel number or other designation and... review. (b) Automated process monitoring and recordkeeping. Automated process monitoring and... applicable requirements of § 381.306. (c) Container closure records. Written records of all container closure...

  11. Comparison of automated processing of flocked swabs with manual processing of fiber swabs for detection of nasal carriage of Staphylococcus aureus.

    PubMed

    Jones, Gillian; Matthews, Roger; Cunningham, Richard; Jenks, Peter

    2011-07-01

    The sensitivity of automated culture of Staphylococcus aureus from flocked swabs versus that of manual culture of fiber swabs was prospectively compared using nasal swabs from 867 patients. Automated culture from flocked swabs significantly increased the detection rate, by 13.1% for direct culture and 10.2% for enrichment culture.
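
    The comparison above is paired (both swab types were collected from the same patients), for which McNemar's test on the discordant pairs is a standard analysis. The sketch below computes the continuity-corrected statistic; the discordant counts are hypothetical, since the record reports only the difference in detection rates.

        from scipy.stats import chi2

        def mcnemar(b, c):
            # b: positive by flocked/automated culture only;
            # c: positive by fiber/manual culture only.
            # Continuity-corrected McNemar chi-square for paired detection results.
            stat = (abs(b - c) - 1) ** 2 / (b + c)
            return stat, chi2.sf(stat, df=1)

        # Hypothetical discordant-pair counts, for illustration only.
        stat, p = mcnemar(b=40, c=12)
        print(f"McNemar chi2 = {stat:.2f}, p = {p:.4f}")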

  12. Team performance in networked supervisory control of unmanned air vehicles: effects of automation, working memory, and communication content.

    PubMed

    McKendrick, Ryan; Shaw, Tyler; de Visser, Ewart; Saqer, Haneen; Kidwell, Brian; Parasuraman, Raja

    2014-05-01

    Assess team performance within a networked supervisory control setting while manipulating automated decision aids and monitoring team communication and working memory ability. Networked systems such as multi-unmanned air vehicle (UAV) supervision have complex properties that make prediction of human-system performance difficult. Automated decision aids can provide valuable information to operators, individual abilities can limit or facilitate team performance, and team communication patterns can alter how effectively individuals work together. We hypothesized that reliable automation, higher working memory capacity, and increased communication rates of task-relevant information would offset performance decrements attributed to high task load. Two-person teams performed a simulated air defense task with two levels of task load and three levels of automated aid reliability. Teams communicated and received decision aid messages via chat window text messages. Task Load × Automation effects were significant across all performance measures. Reliable automation limited the decline in team performance with increasing task load. Average team spatial working memory was a stronger predictor than other measures of team working memory. Frequency of team rapport and enemy location communications positively related to team performance, and word count was negatively related to team performance. Reliable decision aiding mitigated team performance decline under increased task load during multi-UAV supervisory control. Team spatial working memory, communication of spatial information, and team rapport predicted team success. An automated decision aid can improve team performance under high task load. Assessment of spatial working memory and the communication of task-relevant information can help in operator and team selection in supervisory control systems.

  13. Perspectives on bioanalytical mass spectrometry and automation in drug discovery.

    PubMed

    Janiszewski, John S; Liston, Theodore E; Cole, Mark J

    2008-11-01

    The use of high speed synthesis technologies has resulted in a steady increase in the number of new chemical entities active in the drug discovery research stream. Large organizations can have thousands of chemical entities in various stages of testing and evaluation across numerous projects on a weekly basis. Qualitative and quantitative measurements made using LC/MS are integrated throughout this process from early stage lead generation through candidate nomination. Nearly all analytical processes and procedures in modern research organizations are automated to some degree. This includes both hardware and software automation. In this review we discuss bioanalytical mass spectrometry and automation as components of the analytical chemistry infrastructure in pharma. Analytical chemists are presented as members of distinct groups with similar skillsets that build automated systems, manage test compounds, assays and reagents, and deliver data to project teams. The ADME-screening process in drug discovery is used as a model to highlight the relationships between analytical tasks in drug discovery. Emerging software and process automation tools are described that can potentially address gaps and link analytical chemistry related tasks. The role of analytical chemists and groups in modern 'industrialized' drug discovery is also discussed.

  14. Exponential error reduction in pretransfusion testing with automation.

    PubMed

    South, Susan F; Casina, Tony S; Li, Lily

    2012-08-01

    Protecting the safety of blood transfusion is the top priority of transfusion service laboratories. Pretransfusion testing is a critical element of the entire transfusion process to enhance vein-to-vein safety. Human error associated with manual pretransfusion testing is a cause of transfusion-related mortality and morbidity and most human errors can be eliminated by automated systems. However, the uptake of automation in transfusion services has been slow and many transfusion service laboratories around the world still use manual blood group and antibody screen (G&S) methods. The goal of this study was to compare error potentials of commonly used manual (e.g., tiles and tubes) versus automated (e.g., ID-GelStation and AutoVue Innova) G&S methods. Routine G&S processes in seven transfusion service laboratories (four with manual and three with automated G&S methods) were analyzed using failure modes and effects analysis to evaluate the corresponding error potentials of each method. Manual methods contained a higher number of process steps ranging from 22 to 39, while automated G&S methods only contained six to eight steps. Corresponding to the number of the process steps that required human interactions, the risk priority number (RPN) of the manual methods ranged from 5304 to 10,976. In contrast, the RPN of the automated methods was between 129 and 436 and also demonstrated a 90% to 98% reduction of the defect opportunities in routine G&S testing. This study provided quantitative evidence on how automation could transform pretransfusion testing processes by dramatically reducing error potentials and thus would improve the safety of blood transfusion. © 2012 American Association of Blood Banks.
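
    The risk priority numbers quoted above come from a failure modes and effects analysis, in which each process step's RPN is the product of its severity, occurrence, and detection scores and the method totals are then compared. The sketch below shows that arithmetic on invented step names and scores; it is not the study's FMEA data.

        from dataclasses import dataclass

        @dataclass
        class ProcessStep:
            name: str
            severity: int     # 1 (negligible) .. 10 (catastrophic)
            occurrence: int   # 1 (rare)       .. 10 (frequent)
            detection: int    # 1 (certain to detect) .. 10 (undetectable)

            @property
            def rpn(self) -> int:
                # Classic FMEA risk priority number for a single failure mode.
                return self.severity * self.occurrence * self.detection

        # Hypothetical manual tube-method steps vs. an automated analyser run.
        manual = [ProcessStep("label tubes", 7, 4, 5),
                  ProcessStep("pipette plasma", 8, 3, 6),
                  ProcessStep("read agglutination", 9, 3, 7),
                  ProcessStep("transcribe result", 9, 2, 6)]
        automated = [ProcessStep("load sample on analyser", 7, 2, 2),
                     ProcessStep("review flagged results", 8, 1, 2)]

        for label, steps in [("manual", manual), ("automated", automated)]:
            print(label, "total RPN:", sum(s.rpn for s in steps))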

  15. Effects of automation of information-processing functions on teamwork.

    PubMed

    Wright, Melanie C; Kaber, David B

    2005-01-01

    We investigated the effects of automation as applied to different stages of information processing on team performance in a complex decision-making task. Forty teams of 2 individuals performed a simulated Theater Defense Task. Four automation conditions were simulated with computer assistance applied to realistic combinations of information acquisition, information analysis, and decision selection functions across two levels of task difficulty. Multiple measures of team effectiveness and team coordination were used. Results indicated different forms of automation have different effects on teamwork. Compared with a baseline condition, an increase in automation of information acquisition led to an increase in the ratio of information transferred to information requested; an increase in automation of information analysis resulted in higher team coordination ratings; and automation of decision selection led to better team effectiveness under low levels of task difficulty but at the cost of higher workload. The results support the use of early and intermediate forms of automation related to acquisition and analysis of information in the design of team tasks. Decision-making automation may provide benefits in more limited contexts. Applications of this research include the design and evaluation of automation in team environments.

  16. An Integrated Modeling Framework Forecasting Ecosystem Services: Application to the Albemarle Pamlico Basins, NC and VA (USA)

    EPA Science Inventory

    We demonstrate an Integrated Modeling Framework that predicts the state of freshwater ecosystem services within the Albemarle-Pamlico Basins. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standa...

  17. An Integrated Modeling Framework Forecasting Ecosystem Services--Application to the Albemarle Pamlico Basins, NC and VA (USA) and Beyond

    EPA Science Inventory

    We demonstrate an Integrated Modeling Framework that predicts the state of freshwater ecosystem services within the Albemarle-Pamlico Basins. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standa...

  18. SPECIATION OF ARSENIC IN BIOLOGICAL MATRICES BY AUTOMATED HG-AAS WITH MULTIPLE MICROFLAME QUARTZ TUBE ATOMIZER (MULTIATOMIZER)

    EPA Science Inventory

    Analyses of arsenic (As) species in body fluids and tissues of individuals chronically exposed to inorganic arsenic (iAs) provide essential information about the exposure level and pattern of iAs metabolism. This information facilitates the risk assessment of disorders associated...

  19. 77 FR 12336 - Postal Service Pricing Proposal

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-29

    ... the recipients of the mailpieces to a mobile-optimized Web site that facilitates the purchase of a product or service, or to a personalized mobile Web site that is tailored to the recipient. Id. at 1, 4... letters, flats, and cards (presort and automation), which include a qualifying mobile barcode or similar...

  20. Case study of read-across predictions using a Generalized Read-Across (GenRA) Approach (10th World Congress)

    EPA Science Inventory

    We developed the Generalized Read-Across (GenRA) approach to facilitate automated, algorithmic read across predictions. GenRA uses in vitro bioactivity data in conjunction with chemical information to predict up to 574 different apical outcomes from repeat-dose toxicity studies. ...
