Sample records for source modular automated

  1. Modular workcells: modern methods for laboratory automation.

    PubMed

    Felder, R A

    1998-12-01

    Laboratory automation is beginning to become an indispensable survival tool for laboratories facing difficult market competition. However, estimates suggest that only 8% of laboratories will be able to afford total laboratory automation systems. Therefore, automation vendors have developed alternative hardware configurations called 'modular automation' to fit the smaller laboratory. Modular automation consists of consolidated analyzers, integrated analyzers, modular workcells, and pre- and post-analytical automation. These terms will be defined in this paper. Using a modular automation model, the automated core laboratory will become a site where laboratory data is evaluated by trained professionals to provide diagnostic information to practising physicians. Modern software information management and process control tools will complement modular hardware. Proper standardization that will allow vendor-independent modular configurations will assure the success of this revolutionary new technology.

  2. MetAMOS: a modular and open source metagenomic assembly and analysis pipeline

    PubMed Central

    2013-01-01

    We describe MetAMOS, an open source and modular metagenomic assembly and analysis pipeline. MetAMOS represents an important step towards fully automated metagenomic analysis, starting with next-generation sequencing reads and producing genomic scaffolds, open reading frames and taxonomic or functional annotations. MetAMOS can aid in reducing assembly errors, commonly encountered when assembling metagenomic samples, and improves taxonomic assignment accuracy while also reducing computational cost. MetAMOS can be downloaded from: https://github.com/treangen/MetAMOS. PMID:23320958

  3. An open-source java platform for automated reaction mapping.

    PubMed

    Crabtree, John D; Mehta, Dinesh P; Kouri, Tina M

    2010-09-27

    This article presents software applications that have been built upon a modular, open-source, reaction mapping library that can be used in both cheminformatics and bioinformatics research. We first describe the theoretical underpinnings and modular architecture of the core software library. We then describe two applications that have been built upon that core. The first is a generic reaction viewer and mapper, and the second classifies reactions according to rules that can be modified by end users with little or no programming skills.

  4. ASTROPOP: ASTROnomical Polarimetry and Photometry pipeline

    NASA Astrophysics Data System (ADS)

    Campagnolo, Julio C. N.

    2018-05-01

    AstroPoP reduces almost any CCD photometry and image polarimetry data. For photometry reduction, the code performs source finding, aperture and PSF photometry, astrometry calibration using different automated and non-automated methods, and automated source identification and magnitude calibration based on online and local catalogs. For polarimetry, the code resolves linear and circular Stokes parameters produced by image beam splitter or polarizer polarimeters. In addition to the modular functions, ready-to-use pipelines based on configuration files and header keys are also provided with the code. AstroPoP was initially developed to reduce data from the IAGPOL polarimeter installed at Observatório Pico dos Dias (Brazil).

  5. PLACE: an open-source python package for laboratory automation, control, and experimentation.

    PubMed

    Johnson, Jami L; Tom Wörden, Henrik; van Wijk, Kasper

    2015-02-01

    In modern laboratories, software can drive the full experimental process from data acquisition to storage, processing, and analysis. The automation of laboratory data acquisition is an important consideration for every laboratory. When implementing a laboratory automation scheme, important parameters include its reliability, time to implement, adaptability, and compatibility with software used at other stages of experimentation. In this article, we present an open-source, flexible, and extensible Python package for Laboratory Automation, Control, and Experimentation (PLACE). The package uses modular organization and clear design principles; therefore, it can be easily customized or expanded to meet the needs of diverse laboratories. We discuss the organization of PLACE, data-handling considerations, and then present an example using PLACE for laser-ultrasound experiments. Finally, we demonstrate the seamless transition to post-processing and analysis with Python through the development of an analysis module for data produced by PLACE automation.
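
    The modular organization described above lends itself to a small illustration. The following Python sketch shows one way a package of this kind can structure instrument modules around a common interface; all class and method names here are hypothetical stand-ins, not PLACE's actual API.

        # Hedged sketch: a minimal modular-instrument interface in the spirit
        # of the abstract; names are illustrative, not PLACE's real classes.
        class Instrument:
            """Interface every hardware module implements."""
            def config(self, settings): ...
            def update(self, step): ...      # called once per scan point
            def cleanup(self): ...

        class Oscilloscope(Instrument):
            def config(self, settings):
                self.samples = settings.get("samples", 1024)
            def update(self, step):
                return [0.0] * self.samples  # stand-in for an acquired trace
            def cleanup(self):
                pass

        def run_scan(instruments, settings, steps):
            """Drive every module through the config/update/cleanup cycle."""
            for inst in instruments:
                inst.config(settings)
            data = [{type(i).__name__: i.update(s) for i in instruments}
                    for s in range(steps)]
            for inst in instruments:
                inst.cleanup()
            return data

        traces = run_scan([Oscilloscope()], {"samples": 2048}, steps=10)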

  6. Nexus: A modular workflow management system for quantum simulation codes

    NASA Astrophysics Data System (ADS)

    Krogel, Jaron T.

    2016-01-01

    The management of simulation workflows represents a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.
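
    The "transparent, text-based interface, resembling the input file of a typical simulation code" suggests a declarative composition style. The sketch below illustrates that idea in Python with invented helper names; it is not Nexus's actual interface.

        # Hedged sketch of input-file-like workflow composition; the
        # generate_simulation/run_project names are hypothetical.
        def generate_simulation(code, system, depends_on=None):
            return {"code": code, "system": system,
                    "depends_on": depends_on or []}

        relax = generate_simulation(code="quantum_espresso", system="graphene")
        qmc = generate_simulation(code="qmcpack", system="graphene",
                                  depends_on=[relax])  # uses the DFT orbitals

        def run_project(*sims):
            # A real manager would generate inputs, submit jobs and poll
            # queues; this stub only resolves the dependency order.
            ordered = []
            def visit(sim):
                for dep in sim["depends_on"]:
                    visit(dep)
                if sim not in ordered:
                    ordered.append(sim)
            for sim in sims:
                visit(sim)
            return [s["code"] for s in ordered]

        print(run_project(qmc))  # ['quantum_espresso', 'qmcpack']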

  7. A generic template for automated bioanalytical ligand-binding assays using modular robotic scripts in support of discovery biotherapeutic programs.

    PubMed

    Duo, Jia; Dong, Huijin; DeSilva, Binodh; Zhang, Yan J

    2013-07-01

    Sample dilution and reagent pipetting are time-consuming steps in ligand-binding assays (LBAs). Traditional automation-assisted LBAs use assay-specific scripts that require labor-intensive script writing and user training. Five major script modules were developed on Tecan Freedom EVO liquid handling software to facilitate the automated sample preparation and LBA procedure: sample dilution, sample minimum required dilution, standard/QC minimum required dilution, standard/QC/sample addition, and reagent addition. The modular design of automation scripts allowed the users to assemble an automated assay with minimal script modification. The application of the template was demonstrated in three LBAs to support discovery biotherapeutic programs. The results demonstrated that the modular scripts provided the flexibility in adapting to various LBA formats and the significant time saving in script writing and scientist training. Data generated by the automated process were comparable to those by manual process while the bioanalytical productivity was significantly improved using the modular robotic scripts.

  8. Nexus: a modular workflow management system for quantum simulation codes

    DOE PAGES

    Krogel, Jaron T.

    2015-08-24

    The management of simulation workflows is a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.

  9. Application of the Modular Automated Reconfigurable Assembly System (MARAS) concept to adaptable vision gauging and parts feeding

    NASA Technical Reports Server (NTRS)

    By, Andre Bernard; Caron, Ken; Rothenberg, Michael; Sales, Vic

    1994-01-01

    This paper presents the first phase results of a collaborative effort between university researchers and a flexible assembly systems integrator to implement a comprehensive modular approach to flexible assembly automation. This approach, named MARAS (Modular Automated Reconfigurable Assembly System), has been structured to support multiple levels of modularity in terms of both physical components and system control functions. The initial focus of the MARAS development has been on parts gauging and feeding operations for cylinder lock assembly. This phase is nearing completion and has resulted in the development of a highly configurable system for vision gauging functions on a wide range of small components (2 mm to 100 mm in size). The reconfigurable concepts implemented in this adaptive Vision Gauging Module (VGM) are now being extended to applicable aspects of the singulating, selecting, and orienting functions required for the flexible feeding of similar mechanical components and assemblies.

  10. The role of 3-D interactive visualization in blind surveys of H I in galaxies

    NASA Astrophysics Data System (ADS)

    Punzo, D.; van der Hulst, J. M.; Roerdink, J. B. T. M.; Oosterloo, T. A.; Ramatsoku, M.; Verheijen, M. A. W.

    2015-09-01

    Upcoming H I surveys will deliver large datasets, and automated processing using the full 3-D information (two positional dimensions and one spectral dimension) to find and characterize H I objects is imperative. In this context, visualization is an essential tool for enabling qualitative and quantitative human control on an automated source finding and analysis pipeline. We discuss how Visual Analytics, the combination of automated data processing and human reasoning, creativity and intuition, supported by interactive visualization, enables flexible and fast interaction with the 3-D data, helping the astronomer to deal with the analysis of complex sources. 3-D visualization, coupled to modeling, provides additional capabilities helping the discovery and analysis of subtle structures in the 3-D domain. The requirements for a fully interactive visualization tool are: coupled 1-D/2-D/3-D visualization, quantitative and comparative capabilities, combined with supervised semi-automated analysis. Moreover, the source code must have the following characteristics for enabling collaborative work: open, modular, well documented, and well maintained. We review four state-of-the-art 3-D visualization packages, assessing their capabilities and feasibility for use in the case of 3-D astronomical data.

  11. Technology assessment of automation trends in the modular home industry

    Treesearch

    Phil Mitchell; Robert Russell Hurst

    2009-01-01

    This report provides an assessment of technology used in manufacturing modular homes in the United States, and that used in the German prefabricated wooden home industry. It is the first step toward identifying the research needs in automation and manufacturing methods that will facilitate mass customization in the home manufacturing industry. Within the United States...

  12. Automated detection of extended sources in radio maps: progress from the SCORPIO survey

    NASA Astrophysics Data System (ADS)

    Riggi, S.; Ingallinera, A.; Leto, P.; Cavallaro, F.; Bufano, F.; Schillirò, F.; Trigilio, C.; Umana, G.; Buemi, C. S.; Norris, R. P.

    2016-08-01

    Automated source extraction and parametrization represents a crucial challenge for the next-generation radio interferometer surveys, such as those performed with the Square Kilometre Array (SKA) and its precursors. In this paper, we present a new algorithm, called CAESAR (Compact And Extended Source Automated Recognition), to detect and parametrize extended sources in radio interferometric maps. It is based on a pre-filtering stage, allowing image denoising, compact source suppression and enhancement of diffuse emission, followed by an adaptive superpixel clustering stage for final source segmentation. A parametrization stage provides source flux information and a wide range of morphology estimators for post-processing analysis. We developed CAESAR in a modular software library, also including different methods for local background estimation and image filtering, along with alternative algorithms for both compact and diffuse source extraction. The method was applied to real radio continuum data collected at the Australian Telescope Compact Array (ATCA) within the SCORPIO project, a pathfinder of the Evolutionary Map of the Universe (EMU) survey at the Australian Square Kilometre Array Pathfinder (ASKAP). The source reconstruction capabilities were studied over different test fields in the presence of compact sources, imaging artefacts and diffuse emission from the Galactic plane and compared with existing algorithms. When compared to a human-driven analysis, the designed algorithm was found capable of detecting known target sources and regions of diffuse emission, outperforming alternative approaches over the considered fields.
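
    The staged design described above (denoising, compact-source suppression, then superpixel clustering) can be illustrated with standard Python imaging tools. This is a generic sketch of the pipeline shape with invented parameter values, not CAESAR's implementation.

        # Hedged sketch: denoise -> suppress compact sources -> superpixel
        # segmentation, run on a synthetic stand-in for a radio map.
        import numpy as np
        from scipy.ndimage import median_filter
        from skimage.segmentation import slic

        rng = np.random.default_rng(0)
        image = rng.normal(0.0, 1.0, (256, 256))    # synthetic noise map

        smooth = median_filter(image, size=5)       # denoising stage
        residual = image - smooth                   # compact sources stand out
        diffuse = np.where(residual > 3.0 * residual.std(), smooth, image)

        # Adaptive superpixel clustering for the final segmentation stage
        # (channel_axis=None marks the input as single-band).
        labels = slic(diffuse, n_segments=400, compactness=0.1,
                      channel_axis=None, start_label=1)
        print(labels.max(), "superpixels")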

  13. A modular approach for automated sample preparation and chemical analysis

    NASA Technical Reports Server (NTRS)

    Clark, Michael L.; Turner, Terry D.; Klingler, Kerry M.; Pacetti, Randolph

    1994-01-01

    Changes in international relations, especially within the past several years, have dramatically affected the programmatic thrusts of the U.S. Department of Energy (DOE). The DOE now is addressing the environmental cleanup required as a result of 50 years of nuclear arms research and production. One major obstacle in the remediation of these areas is the chemical determination of potentially contaminated material using currently acceptable practices. Process bottlenecks and exposure to hazardous conditions pose problems for the DOE. One proposed solution is the application of modular automated chemistry using Standard Laboratory Modules (SLM) to perform Standard Analysis Methods (SAM). The Contaminant Analysis Automation (CAA) Program has developed standards and prototype equipment that will accelerate the development of modular chemistry technology and is transferring this technology to private industry.

  14. A HUMAN AUTOMATION INTERACTION CONCEPT FOR A SMALL MODULAR REACTOR CONTROL ROOM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Le Blanc, Katya; Spielman, Zach; Hill, Rachael

    Many advanced nuclear power plant (NPP) designs incorporate higher degrees of automation than the existing fleet of NPPs. Automation is being introduced or proposed in NPPs through a wide variety of systems and technologies, such as advanced displays, computer-based procedures, advanced alarm systems, and computerized operator support systems. Additionally, many new reactor concepts, both full scale and small modular reactors, are proposing increased automation and reduced staffing as part of their concept of operations. However, research consistently finds that there is a fundamental tradeoff between system performance with increased automation and reduced human performance. There is a need to address the question of how to achieve high performance and efficiency of high levels of automation without degrading human performance. One example of a new NPP concept that will utilize greater degrees of automation is the SMR concept from NuScale Power. The NuScale Power design requires 12 modular units to be operated in one single control room, which leads to a need for higher degrees of automation in the control room. Idaho National Laboratory (INL) researchers and NuScale Power human factors and operations staff are working on a collaborative project to address the human performance challenges of increased automation and to determine the principles that lead to optimal performance in highly automated systems. This paper will describe this concept in detail and will describe an experimental test of the concept. The benefits and challenges of the approach will be discussed.

  15. Effects of Levels of Automation for Advanced Small Modular Reactors: Impacts on Performance, Workload, and Situation Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johanna Oxstrand; Katya Le Blanc

    The Human-Automation Collaboration (HAC) research effort is part of the Department of Energy (DOE)-sponsored Advanced Small Modular Reactor (AdvSMR) program conducted at Idaho National Laboratory (INL). The DOE AdvSMR program focuses on plant design and management, reduction of capital costs as well as plant operations and maintenance (O&M) costs, and the cost benefits of factory production.

  16. A modular, open-source, slide-scanning microscope for diagnostic applications in resource-constrained settings

    PubMed Central

    Lu, Qiang; Liu, Guanghui; Xiao, Chuanli; Hu, Chuanzhen; Zhang, Shiwu; Xu, Ronald X.; Chu, Kaiqin; Xu, Qianming

    2018-01-01

    In this paper we report the development of a cost-effective, modular, open source, and fully automated slide-scanning microscope, composed entirely of easily available off-the-shelf parts, and capable of bright field and fluorescence modes. The automated X-Y stage is composed of two low-cost micrometer stages coupled to stepper motors operated in open-loop mode. The microscope is composed of a low-cost CMOS sensor and low-cost board lenses placed in a 4f configuration. The system has approximately 1 micron resolution, limited by the f/# of available board lenses. The microscope is compact, measuring just 25×25×30 cm, and has an absolute positioning accuracy of ±1 μm in the X and Y directions. A Z-stage enables autofocusing and imaging over large fields of view even on non-planar samples, and custom software enables automatic determination of sample boundaries and image mosaicking. We demonstrate the utility of our device through imaging of fluorescent- and transmission-dye stained blood and fecal smears containing human and animal parasites, as well as several prepared tissue samples. These results demonstrate image quality comparable to high-end commercial microscopes at a cost of less than US$400 for a bright-field system, with an extra US$100 needed for the fluorescence module. PMID:29543835
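
    Automated slide scanning of this kind reduces to a stage-scan loop with per-tile focusing. The Python sketch below shows that loop with hypothetical Stage and Camera stand-ins; it is not the authors' control software.

        # Hedged sketch: serpentine X-Y scan with per-tile autofocus; the
        # Stage/Camera classes are illustrative stand-ins for real drivers.
        class Stage:
            def move_to(self, x_um, y_um): pass
            def set_focus(self, z_um): pass

        class Camera:
            def grab(self):
                return "frame"  # placeholder for image data

        def scan(stage, camera, nx, ny, pitch_um, autofocus):
            tiles = {}
            for j in range(ny):
                # Serpentine order minimizes stage travel between tiles.
                cols = range(nx) if j % 2 == 0 else range(nx - 1, -1, -1)
                for i in cols:
                    stage.move_to(i * pitch_um, j * pitch_um)
                    stage.set_focus(autofocus(i, j))
                    tiles[(i, j)] = camera.grab()  # mosaicked afterwards
            return tiles

        tiles = scan(Stage(), Camera(), nx=4, ny=3, pitch_um=500,
                     autofocus=lambda i, j: 0.0)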

  17. Experimental research control software system

    NASA Astrophysics Data System (ADS)

    Cohn, I. A.; Kovalenko, A. G.; Vystavkin, A. N.

    2014-05-01

    A software system intended for the automation of small-scale research has been developed. The software allows one to control equipment, and to acquire and process data, by means of simple scripts. The main purpose of the development is to make experiment automation easier, significantly reducing the effort needed to automate an experimental setup. In particular, minimal programming skills are required, and the scripts are easy for supervisors to review. Interactions between scripts and equipment are managed automatically, which allows multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, control is performed through an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library handles interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework is provided for fast implementation of new software and hardware interfaces. While the software is in continuous development, with new features being implemented, it is already used in our laboratory for the automation of helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU General Public License.
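
    An imperative experiment script of the kind described might look like the following Python rendering; the paper does not specify its scripting language, and the device classes here are invented for illustration.

        # Hedged sketch: an imperative control/acquisition script over a
        # modular interface library; Cryostat/Voltmeter are stand-ins.
        import time

        class Cryostat:
            def set_temperature(self, kelvin):
                self.target = kelvin
            def temperature(self):
                return getattr(self, "target", 300.0)

        class Voltmeter:
            def read(self):
                return 0.001  # placeholder reading, volts

        def sweep(cryostat, meter, setpoints):
            results = []
            for t in setpoints:
                cryostat.set_temperature(t)
                time.sleep(0.01)  # a real script would wait for stability
                results.append((cryostat.temperature(), meter.read()))
            return results

        print(sweep(Cryostat(), Voltmeter(), [4.2, 1.5, 0.3]))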

  18. Device for modular input high-speed multi-channel digitizing of electrical data

    DOEpatents

    VanDeusen, Alan L.; Crist, Charles E.

    1995-09-26

    A multi-channel high-speed digitizer module converts a plurality of analog signals to digital signals (digitizing) and stores the signals in a memory device. The analog input channels are digitized simultaneously at high speed with a relatively large number of on-board memory data points per channel. The module provides an automated calibration based upon a single voltage reference source. Low signal noise at such a high density and sample rate is accomplished by ensuring the A/D converters are clocked at the same point in the noise cycle each time so that synchronous noise sampling occurs. This sampling process, in conjunction with an automated calibration, yields signal noise levels well below the noise level present on the analog reference voltages.
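
    The automated calibration against a single voltage reference can be illustrated numerically: with a ground reading and a reference reading, each channel's gain and offset follow from two points. The values below are invented for demonstration.

        # Hedged sketch: two-point ADC channel calibration from ground and
        # one reference source; numbers are illustrative only.
        V_REF = 2.500  # volts, the single reference source

        def calibrate(ground_code, ref_code):
            gain = V_REF / (ref_code - ground_code)  # volts per ADC count
            offset = -ground_code * gain
            return gain, offset

        def to_volts(code, gain, offset):
            return code * gain + offset

        gain, offset = calibrate(ground_code=12, ref_code=3289)
        print(round(to_volts(2048, gain, offset), 4))  # ~1.5532 V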

  19. New Office Technology: A Study on Curriculum Design.

    ERIC Educational Resources Information Center

    Mulder, Martin

    1989-01-01

    A study collected information about office automation trends, office personnel job profiles, and existing curricula. A curriculum conference was held to design and validate a modular curriculum for office automation. (SK)

  20. Laboratory automation: total and subtotal.

    PubMed

    Hawker, Charles D

    2007-12-01

    Worldwide, perhaps 2000 or more clinical laboratories have implemented some form of laboratory automation, either a modular automation system, such as for front-end processing, or a total laboratory automation system. This article provides descriptions and examples of these various types of automation. It also presents an outline of how a clinical laboratory that is contemplating automation should approach its decision and the steps it should follow to ensure a successful implementation. Finally, the role of standards in automation is reviewed.

  1. Device for modular input high-speed multi-channel digitizing of electrical data

    DOEpatents

    VanDeusen, A.L.; Crist, C.E.

    1995-09-26

    A multi-channel high-speed digitizer module converts a plurality of analog signals to digital signals (digitizing) and stores the signals in a memory device. The analog input channels are digitized simultaneously at high speed with a relatively large number of on-board memory data points per channel. The module provides an automated calibration based upon a single voltage reference source. Low signal noise at such a high density and sample rate is accomplished by ensuring the A/D converters are clocked at the same point in the noise cycle each time so that synchronous noise sampling occurs. This sampling process, in conjunction with an automated calibration, yields signal noise levels well below the noise level present on the analog reference voltages. 1 fig.

  2. Semi-automated Modular Program Constructor for physiological modeling: Building cell and organ models.

    PubMed

    Jardine, Bartholomew; Raymond, Gary M; Bassingthwaighte, James B

    2015-01-01

    The Modular Program Constructor (MPC) is an open-source, Java-based modeling utility built upon JSim's Mathematical Modeling Language (MML) (http://www.physiome.org/jsim/) that uses directives embedded in model code to construct larger, more complicated models quickly and with less error than manually combining models. A major obstacle in writing complex models of physiological processes is the large amount of time it takes to model the myriad processes taking place simultaneously in cells, tissues, and organs. MPC replaces this task with code-generating algorithms that take model code from several different existing models and produce model code for a new JSim model. This is particularly useful during multi-scale model development, where many variants are to be configured and tested against data. MPC encodes and preserves information about how a model is built from its simpler model modules, allowing the researcher to quickly substitute or update modules for hypothesis testing. MPC is implemented in Java and requires JSim to use its output. MPC source code and documentation are available at http://www.physiome.org/software/MPC/.
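
    The directive-driven assembly MPC performs can be pictured as textual splicing of module code into a template. The Python sketch below is purely illustrative; MPC itself operates on JSim MML model code, and the directive syntax shown here is invented.

        # Hedged sketch: expand %INSERT-style directives embedded in model
        # code; directive syntax and module text are invented for this toy.
        MODULES = {
            "transport": "flux = PS * (C_plasma - C_cell);",
            "reaction":  "dC_cell/dt = flux - k * C_cell;",
        }

        def construct(template):
            out = []
            for line in template.splitlines():
                if line.strip().startswith("//%INSERT"):
                    name = line.split()[-1]
                    out.append(MODULES[name])  # splice the named module in
                else:
                    out.append(line)
            return "\n".join(out)

        template = "// combined cell model\n//%INSERT transport\n//%INSERT reaction"
        print(construct(template))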

  3. Automated Engineering Design (AED); An approach to automated documentation

    NASA Technical Reports Server (NTRS)

    Mcclure, C. W.

    1970-01-01

    The automated engineering design (AED) is reviewed, consisting of a high level systems programming language, a series of modular precoded subroutines, and a set of powerful software machine tools that effectively automate the production and design of new languages. AED is used primarily for development of problem and user-oriented languages. Software production phases are diagramed, and factors which inhibit effective documentation are evaluated.

  4. [Modularization by the open standard. (II)].

    PubMed

    Muto, M; Takaha, Y; Chiba, N

    2000-10-01

    In recent years, accompanying the remarkable development and spread of Laboratory Automation Systems (LAS), the NCCLS has been proposing five international standards for laboratory automation. We have based our laboratory on these "NCCLS standards of laboratory automation" and, taking them as a starting point, now propose an open standard called "Open LA21" to establish a more detailed standard replacing the NCCLS laboratory automation standards.

  5. MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory.

    PubMed

    Horowitz, Gary L; Zaman, Zahur; Blanckaert, Norbert J C; Chan, Daniel W; Dubois, Jeffrey A; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W; Nilsen, Olaug L; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang

    2005-01-01

    MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality.

  6. MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory

    PubMed Central

    Zaman, Zahur; Blanckaert, Norbert J. C.; Chan, Daniel W.; Dubois, Jeffrey A.; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W.; Nilsen, Olaug L.; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L.; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang

    2005-01-01

    MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus current existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement from the current laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality. PMID:18924721

  7. Experimental Demonstration of Technologies for Autonomous On-Orbit Robotic Assembly

    NASA Technical Reports Server (NTRS)

    LeMaster, Edward A.; Schaechter, David B.; Carrington, Connie K.

    2006-01-01

    The Modular Reconfigurable High Energy (MRHE) program aimed to develop technologies for the automated assembly and deployment of large-scale space structures and aggregate spacecraft. Part of the project involved creation of a terrestrial robotic testbed for validation and demonstration of these technologies and for the support of future development activities. This testbed was completed in 2005, and was thereafter used to demonstrate automated rendezvous, docking, and self-assembly tasks between a group of three modular robotic spacecraft emulators. This paper discusses the rationale for the MRHE project, describes the testbed capabilities, and presents the MRHE assembly demonstration sequence.

  8. Inselect: Automating the Digitization of Natural History Collections

    PubMed Central

    Hudson, Lawrence N.; Blagoderov, Vladimir; Heaton, Alice; Holtzhausen, Pieter; Livermore, Laurence; Price, Benjamin W.; van der Walt, Stéfan; Smith, Vincent S.

    2015-01-01

    The world’s natural history collections constitute an enormous evidence base for scientific research on the natural world. To facilitate these studies and improve access to collections, many organisations are embarking on major programmes of digitization. This requires automated approaches to mass-digitization that support rapid imaging of specimens and associated data capture, in order to process the tens of millions of specimens common to most natural history collections. In this paper we present Inselect—a modular, easy-to-use, cross-platform suite of open-source software tools that supports the semi-automated processing of specimen images generated by natural history digitization programmes. The software is made up of a Windows, Mac OS X, and Linux desktop application, together with command-line tools that are designed for unattended operation on batches of images. Blending image visualisation algorithms that automatically recognise specimens together with workflows to support post-processing tasks such as barcode reading, label transcription and metadata capture, Inselect fills a critical gap to increase the rate of specimen digitization. PMID:26599208

  9. Inselect: Automating the Digitization of Natural History Collections.

    PubMed

    Hudson, Lawrence N; Blagoderov, Vladimir; Heaton, Alice; Holtzhausen, Pieter; Livermore, Laurence; Price, Benjamin W; van der Walt, Stéfan; Smith, Vincent S

    2015-01-01

    The world's natural history collections constitute an enormous evidence base for scientific research on the natural world. To facilitate these studies and improve access to collections, many organisations are embarking on major programmes of digitization. This requires automated approaches to mass-digitization that support rapid imaging of specimens and associated data capture, in order to process the tens of millions of specimens common to most natural history collections. In this paper we present Inselect, a modular, easy-to-use, cross-platform suite of open-source software tools that supports the semi-automated processing of specimen images generated by natural history digitization programmes. The software is made up of a Windows, Mac OS X, and Linux desktop application, together with command-line tools that are designed for unattended operation on batches of images. Blending image visualisation algorithms that automatically recognise specimens together with workflows to support post-processing tasks such as barcode reading, label transcription and metadata capture, Inselect fills a critical gap to increase the rate of specimen digitization.

  10. A Modular Robotic System with Applications to Space Exploration

    NASA Technical Reports Server (NTRS)

    Hancher, Matthew D.; Hornby, Gregory S.

    2006-01-01

    Modular robotic systems offer potential advantages as versatile, fault-tolerant, cost-effective platforms for space exploration, but a sufficiently mature system is not yet available. We describe the possible applications of such a system, and present prototype hardware intended as a step in the right direction. We also present elements of an automated design and optimization framework aimed at making modular robots easier to design and use, and discuss the results of applying the system to a gait optimization problem. Finally, we discuss the potential near-term applications of modular robotics to terrestrial robotics research.

  11. [Modularization by the open standard. (I)].

    PubMed

    Hirano, H

    2000-10-01

    We are proceeding with a project called the "Open LA21 Project" as clinical laboratory automation moves toward the 21st century. With a modular system that achieves integration, downsizing and a reasonable price, and that represents the future course of clinical testing automation, we aim to establish common standards among manufacturers as the only way to create a user-friendly market environment in which proper competition exists among the manufacturers. The common standards being prepared by the participating companies as "Open module system standards" will be made public; they are intended to guarantee the connectivity and compatibility of products that conform to the standards. In this project, we intend to realize at an early stage a modular system that integrates fields such as chemistry, hematology, coagulation/fibrinolysis, immunology and urinalysis, and to contribute positively to restructuring and upgrading the "raison d'etre" of 21st-century clinical testing.

  12. An interactive modular design for computerized photometry in spectrochemical analysis

    NASA Technical Reports Server (NTRS)

    Bair, V. L.

    1980-01-01

    A general functional description of totally automatic photometry of emission spectra is not available for an operating environment in which the sample compositions and analysis procedures are low-volume and non-routine. The advantages of using an interactive approach to computer control in such an operating environment are demonstrated. This approach includes modular subroutines selected at multiple-option, menu-style decision points. This style of programming is used to trace elemental determinations, including the automated reading of spectrographic plates produced by a 3.4 m Ebert mount spectrograph using a dc-arc in an argon atmosphere. The simplified control logic and modular subroutine approach facilitates innovative research and program development, yet is easily adapted to routine tasks. Operator confidence and control are increased by the built-in options including degree of automation, amount of intermediate data printed out, amount of user prompting, and multidirectional decision points.

  13. Framework for Human-Automation Collaboration: Conclusions from Four Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oxstrand, Johanna; Le Blanc, Katya L.; O'Hara, John

    The Human Automation Collaboration (HAC) research project is investigating how advanced technologies that are planned for Advanced Small Modular Reactors (AdvSMR) will affect the performance and the reliability of the plant from a human factors and human performance perspective. The HAC research effort investigates the consequences of allocating functions between the operators and automated systems. More specifically, the research team is addressing how to best design the collaboration between the operators and the automated systems in a manner that has the greatest positive impact on overall plant performance and reliability. Oxstrand et al. (2013 - March) describes the efforts conducted by the researchers to identify the research needs for HAC. The research team reviewed the literature on HAC, developed a model of HAC, and identified gaps in the existing knowledge of human-automation collaboration. As described in Oxstrand et al. (2013 - June), the team then prioritized the research topics identified based on the specific needs in the context of AdvSMR. The prioritization was based on two sources of input: 1) the preliminary functions and tasks, and 2) the model of HAC. As a result, three analytical studies were planned and conducted: 1) Models of Teamwork, 2) Standardized HAC Performance Measurement Battery, and 3) Initiators and Triggering Conditions for Adaptive Automation. Additionally, one field study was also conducted at Idaho Falls Power.

  14. Modular microfluidic system for biological sample preparation

    DOEpatents

    Rose, Klint A.; Mariella, Jr., Raymond P.; Bailey, Christopher G.; Ness, Kevin Dean

    2015-09-29

    A reconfigurable modular microfluidic system for preparation of a biological sample including a series of reconfigurable modules for automated sample preparation adapted to selectively include a) a microfluidic acoustic focusing filter module, b) a dielectrophoresis bacteria filter module, c) a dielectrophoresis virus filter module, d) an isotachophoresis nucleic acid filter module, e) a lysis module, and f) an isotachophoresis-based nucleic acid filter.

  15. Measuring, Enabling and Comparing Modularity, Regularity and Hierarchy in Evolutionary Design

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2005-01-01

    For computer-automated design systems to scale to complex designs they must be able to produce designs that exhibit the characteristics of modularity, regularity and hierarchy - characteristics that are found both in man-made and natural designs. Here we claim that these characteristics are enabled by implementing the attributes of combination, control-flow and abstraction in the representation. To support this claim we use an evolutionary algorithm to evolve solutions to different sizes of a table design problem using five different representations, each with different combinations of modularity, regularity and hierarchy enabled, and show that the best performance happens when all three of these attributes are enabled. We also define metrics for modularity, regularity and hierarchy in design encodings and demonstrate that high fitness values are achieved with high values of modularity, regularity and hierarchy, and that there is a positive correlation between increases in fitness and increases in modularity, regularity and hierarchy.
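
    As a concrete (and deliberately simplified) illustration of scoring reuse in a design encoding, the sketch below rates a design tree by how often identical subtrees repeat; the paper's actual metric definitions are not reproduced here.

        # Hedged sketch: a toy regularity score based on repeated subtrees
        # in a tuple-encoded design; illustrative, not the paper's metrics.
        def subtrees(tree):
            yield tree
            for child in tree[1:]:
                yield from subtrees(child)

        def regularity(tree):
            counts = {}
            for s in subtrees(tree):
                key = repr(s)
                counts[key] = counts.get(key, 0) + 1
            total = sum(counts.values())
            return 1.0 - len(counts) / total  # 0 = no reuse, toward 1 = high

        leg = ("beam", ("joint",), ("joint",))
        table = ("table", leg, leg, leg, leg)  # four identical legs reused
        print(round(regularity(table), 2))     # 0.77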

  16. Anima: Modular Workflow System for Comprehensive Image Data Analysis

    PubMed Central

    Rantanen, Ville; Valori, Miko; Hautaniemi, Sampsa

    2014-01-01

    Modern microscopes produce vast amounts of image data, and computational methods are needed to analyze and interpret these data. Furthermore, a single image analysis project may require tens or hundreds of analysis steps, starting from data import and pre-processing, through segmentation and statistical analysis, and ending with visualization and reporting. To manage such large-scale image data analysis projects, we present here a modular workflow system called Anima. Anima is designed for comprehensive and efficient image data analysis development, and it contains several features that are crucial in high-throughput image data analysis: programming language independence, batch processing, easily customized data processing, interoperability with other software via application programming interfaces, and advanced multivariate statistical analysis. The utility of Anima is shown with two case studies, focusing on testing different algorithms developed in different imaging platforms and on automated prediction of alive/dead C. elegans worms by integrating several analysis environments. Anima is fully open source and available with documentation at www.anduril.org/anima. PMID:25126541

  17. Automation Hooks Architecture Trade Study for Flexible Test Orchestration

    NASA Technical Reports Server (NTRS)

    Lansdowne, Chatwin A.; Maclean, John R.; Graffagnino, Frank J.; McCartney, Patrick A.

    2010-01-01

    We describe the conclusions of a technology and communities survey, supported by concurrent and follow-on proof-of-concept prototyping, undertaken to evaluate the feasibility of defining a durable, versatile, reliable, visible software interface to support strategic modularization of test software development. The objective is that test sets and support software with diverse origins, ages, and abilities can be reliably integrated into test configurations that assemble, tear down, and reassemble with scalable complexity in order to conduct both parametric tests and monitored trial runs. The resulting approach is based on the integration of three recognized technologies that are currently gaining acceptance within the test industry and that, when combined, provide a simple, open and scalable test orchestration architecture addressing the objectives of the Automation Hooks task. The technologies are automated discovery using multicast DNS Zero Configuration Networking (zeroconf), commanding and data retrieval using resource-oriented RESTful web services, and XML data transfer formats based on Automatic Test Markup Language (ATML). This open-source, standards-based approach provides direct integration with existing commercial off-the-shelf (COTS) analysis software tools.
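
    The command-and-retrieval half of this architecture is plain RESTful HTTP. The Python sketch below illustrates the pattern with the widely used requests library; the host, port and resource layout are invented, and the zeroconf discovery step that would normally supply the host is elided.

        # Hedged sketch: resource-oriented test commanding/retrieval; the
        # URL scheme is hypothetical, not from the Automation Hooks task.
        import requests

        BASE = "http://testset.local:8080"  # host would come from zeroconf

        def start_run(config):
            # Command the test set by POSTing a new run resource.
            r = requests.post(f"{BASE}/runs", json=config, timeout=5)
            r.raise_for_status()
            return r.json()["id"]

        def fetch_results(run_id):
            # Retrieve results; a real system might return ATML-format XML.
            r = requests.get(f"{BASE}/runs/{run_id}/results", timeout=5)
            r.raise_for_status()
            return r.content

        run_id = start_run({"test": "ber_sweep", "points": 32})
        data = fetch_results(run_id)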

  18. In-House Automation of a Small Library Using a Mainframe Computer.

    ERIC Educational Resources Information Center

    Waranius, Frances B.; Tellier, Stephen H.

    1986-01-01

    An automated library routine management system was developed in-house to create a system unique to the Library and Information Center, Lunar and Planetary Institute, Houston, Texas. A modular approach was used to allow continuity in operations and services as the system was implemented. Acronyms and computer accounts and file names are appended.…

  19. Computer control of a robotic satellite servicer

    NASA Technical Reports Server (NTRS)

    Fernandez, K. R.

    1980-01-01

    The advantages that will accrue from the in-orbit servicing of satellites are listed. It is noted that a concept in satellite servicing which holds promise as a compromise between the high flexibility and adaptability of manned vehicles and the lower cost of an unmanned vehicle involves an unmanned servicer carrying a remotely supervised robotic manipulator arm. Because of deficiencies in sensor technology, robot servicing would require that satellites be designed according to a modular concept. A description is given of the servicer simulation hardware, the computer and interface hardware, and the software. It is noted that several areas require further development; these include automated docking, modularization of satellite design, reliable connector and latching mechanisms, development of manipulators for space environments, and development of automated diagnostic techniques.

  20. Standards for space automation and robotics

    NASA Technical Reports Server (NTRS)

    Kader, Jac B.; Loftin, R. B.

    1992-01-01

    The AIAA's Committee on Standards for Space Automation and Robotics (COS/SAR) is charged with the identification of key functions and critical technologies applicable to multiple missions that reflect fundamental consideration of environmental factors. COS/SAR's standards/practices/guidelines implementation methods will be based on reliability, performance, and operations, as well as economic viability and life-cycle costs, simplicity, and modularity.

  1. Rodent motor and neuropsychological behaviour measured in home cages using the integrated modular platform SmartCage™

    PubMed Central

    Khroyan, Taline V; Zhang, Jingxi; Yang, Liya; Zou, Bende; Xie, James; Pascual, Conrado; Malik, Adam; Xie, Julian; Zaveri, Nurulain T; Vazquez, Jacqueline; Polgar, Willma; Toll, Lawrence; Fang, Jidong; Xie, Xinmin

    2017-01-01

    To facilitate investigation of diverse rodent behaviours in rodents' home cages, we have developed an integrated modular platform, the SmartCage™ system (AfaSci, Inc., Burlingame, CA, USA), which enables automated neurobehavioural phenotypic analysis and in vivo drug screening in a relatively higher-throughput and more objective manner. The individual platform consists of an infrared array, a vibration floor sensor and a variety of modular devices. One computer can simultaneously operate up to 16 platforms via USB cables. The SmartCage™ detects drug-induced increases and decreases in activity levels, as well as changes in movement patterns. Wake and sleep states of mice can be detected using the vibration floor sensor. The arousal state classification achieved up to 98% accuracy compared with results obtained by electroencephalography and electromyography. More complex behaviours, including motor coordination, anxiety-related behaviours and social approach behaviour, can be assessed using appropriate modular devices, and the results obtained are comparable with results obtained using conventional methods. In conclusion, the SmartCage™ system provides an automated and accurate tool to quantify various rodent behaviours in a 'stress-free' environment. This system, combined with the validated testing protocols, offers a powerful tool kit for transgenic phenotyping and in vivo drug screening. PMID:22540540

  2. Automated analysis of information processing, kinetic independence and modular architecture in biochemical networks using MIDIA.

    PubMed

    Bowsher, Clive G

    2011-02-15

    Understanding the encoding and propagation of information by biochemical reaction networks and the relationship of such information processing properties to modular network structure is of fundamental importance in the study of cell signalling and regulation. However, a rigorous, automated approach for general biochemical networks has not been available, and high-throughput analysis has therefore been out of reach. Modularization Identification by Dynamic Independence Algorithms (MIDIA) is a user-friendly, extensible R package that performs automated analysis of how information is processed by biochemical networks. An important component is the algorithm's ability to identify exact network decompositions based on both the mass action kinetics and informational properties of the network. These modularizations are visualized using a tree structure from which important dynamic conditional independence properties can be directly read. Only partial stoichiometric information needs to be used as input to MIDIA, and neither simulations nor knowledge of rate parameters are required. When applied to a signalling network, for example, the method identifies the routes and species involved in the sequential propagation of information between its multiple inputs and outputs. These routes correspond to the relevant paths in the tree structure and may be further visualized using the Input-Output Path Matrix tool. MIDIA remains computationally feasible for the largest network reconstructions currently available and is straightforward to use with models written in Systems Biology Markup Language (SBML). The package is distributed under the GNU General Public License and is available, together with a link to browsable Supplementary Material, at http://code.google.com/p/midia. Further information is at www.maths.bris.ac.uk/~macgb/Software.html.

  3. Recommended Systems for the Incremental Automation of the Morgue of "The Daily Texan."

    ERIC Educational Resources Information Center

    Voges, Mickie; And Others

    A modular program is recommended for automation of the clippings file of "The Daily Texan" (student newspaper of the University of Texas at Austin). The proposed system will lead ultimately to on-line storage of the index, on-line storage of local, staff-written news stories from the previous twenty-four months, micrographic storage for backup and…

  4. An optimized routing algorithm for the automated assembly of standard multimode ribbon fibers in a full-mesh optical backplane

    NASA Astrophysics Data System (ADS)

    Basile, Vito; Guadagno, Gianluca; Ferrario, Maddalena; Fassi, Irene

    2018-03-01

    In this paper, a parametric, modular and scalable algorithm enabling the fully automated assembly of a backplane fiber-optic interconnection circuit is presented. This approach guarantees the optimization of the optical fiber routing inside the backplane with respect to specific criteria (i.e. bending power losses), addressing both transmission performance and overall cost. Graph theory has been exploited to simplify the complexity of the NxN full-mesh backplane interconnection topology, first into N independent sub-circuits and then, recursively, into a limited number of loops that are easier to generate. Afterwards, the proposed algorithm selects a set of geometrical and architectural parameters whose optimization allows the optimal fiber-optic routing to be identified for each sub-circuit of the backplane. The topological and numerical information provided by the algorithm is then exploited to control a robot which performs the automated assembly of the backplane sub-circuits. The proposed routing algorithm can be extended to any array architecture and number of connections thanks to its modularity and scalability. Finally, the algorithm has been exploited for the automated assembly of an 8x8 optical backplane realized with standard multimode (MM) 12-fiber ribbons.
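
    The first decomposition step described above, splitting the NxN full mesh into N independent sub-circuits, is easy to make concrete. The plain-Python sketch below groups each link of the mesh under one endpoint; the subsequent loop-generation and bend-loss optimization stages of the algorithm are not reproduced.

        # Hedged sketch: decompose a full-mesh interconnect into per-node
        # sub-circuits; illustrative only, not the paper's full algorithm.
        def full_mesh(n):
            # All undirected links (i, j) with i < j of an n-node mesh.
            return [(i, j) for i in range(n) for j in range(i + 1, n)]

        def subcircuits(n):
            # Group links by their lower-numbered endpoint, yielding
            # independent, smaller routing problems.
            groups = {i: [] for i in range(n)}
            for i, j in full_mesh(n):
                groups[i].append((i, j))
            return groups

        for node, links in subcircuits(4).items():
            print(node, links)
        # 0 [(0, 1), (0, 2), (0, 3)]
        # 1 [(1, 2), (1, 3)]  ... and so on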

  5. Modular Algorithm Testbed Suite (MATS): A Software Framework for Automatic Target Recognition

    DTIC Science & Technology

    2017-01-01

    Sponsor: Office of Naval Research, Mine Warfare & Ocean Engineering Programs. ...naval mine countermeasures (MCM) operations by automating a large portion of the data analysis. Successful long-term implementation of ATR requires a... Keywords: Modular Algorithm Testbed Suite; MATS; Mine Countermeasures Operations.

  6. MODULAR MANIPULATOR FOR ROBOTICS APPLICATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joseph W. Geisinger, Ph.D.

    ARM Automation, Inc. is developing a framework of modular actuators that can address the DOE's wide range of robotics needs. The objective of this effort is to demonstrate the effectiveness of this technology by constructing a manipulator from these actuators within a glovebox for Automated Plutonium Processing (APP). At the end of the project, the system of actuators was used to construct several different manipulator configurations, which accommodate common glovebox tasks such as repackaging. The modular nature and quick-connects of this system simplify installation into "hot" boxes and any potential modifications or repair therein. This work focused on the development of self-contained robotic actuator modules, including the embedded electronic controls, for the purpose of building a manipulator system. Both of the actuators developed under this project contain the control electronics, sensors, motor, gear train, wiring, system communications and mechanical interfaces of a complete robotics servo device. Test actuators and accompanying DISC™s underwent validation testing at The University of Texas at Austin and ARM Automation, Inc. following final design and fabrication. The system also included custom links, an umbilical cord, an open-architecture PC-based system controller, and operational software that permitted integration into a completely functional robotic manipulator system. The open architecture on which this system is based avoids proprietary interfaces and communication protocols, which only serve to limit the capabilities and flexibility of automation equipment. The system was integrated and tested in the contractor's facility for intended performance and operations. The manipulator was tested using the full-scale equipment and process mock-ups. The project produced a practical and operational system, including a quantitative evaluation of its performance and cost.

  7. Hierarchy Software Development Framework (h-dp-fwk) project

    NASA Astrophysics Data System (ADS)

    Zaytsev, A.

    2010-04-01

    The Hierarchy Software Development Framework provides a lightweight tool for building portable modular applications for performing automated data analysis tasks in batch mode. Design and development activities on the project began in March 2005, and from the very beginning the framework targeted the case of building experimental data processing applications for the CMD-3 experiment, which is being commissioned at the Budker Institute of Nuclear Physics (BINP, Novosibirsk, Russia). Its design addresses the generic case of a modular data processing application operating within a well-defined distributed computing environment. The main features of the framework are modularity, built-in message and data exchange mechanisms, XInclude and XML schema enabled XML configuration management tools, dedicated log management tools, internal debugging tools, support for both dynamic and static module chains, internal DSO version and consistency checking, and a well defined API for developing specialized frameworks. It is supported on Scientific Linux 4 and 5 and is planned to be ported to other platforms as well. The project is provided with a comprehensive set of technical documentation and users' guides. The licensing scheme for the source code, binaries and documentation means that the product is free for non-commercial use. Although development is not over and many features are yet to be implemented, the project is considered ready for public use and for creating applications in various fields, including the development of event reconstruction software for small and moderate scale HEP experiments.

  8. DataForge: Modular platform for data storage and analysis

    NASA Astrophysics Data System (ADS)

    Nozik, Alexander

    2018-04-01

    DataForge is a framework for automated data acquisition, storage and analysis that draws on modern applied-programming practice. The aim of DataForge is to automate standard tasks such as parallel data processing, logging, output sorting and distributed computing. The framework also makes extensive use of declarative programming principles via a metadata concept, which allows a certain degree of meta-programming and improves the reproducibility of results.
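
    The metadata-driven style the abstract describes can be pictured as configuration that selects and parameterizes processing steps. The Python sketch below is an illustration of that idea only; it is not DataForge's actual interface.

        # Hedged sketch: declarative metadata drives the processing chain;
        # action names and structure are invented for this illustration.
        META = {
            "input": [1.0, 2.0, 3.0, 4.0],
            "tasks": [
                {"action": "scale", "factor": 2.0},
                {"action": "mean"},
            ],
        }

        ACTIONS = {
            "scale": lambda data, m: [x * m["factor"] for x in data],
            "mean": lambda data, m: sum(data) / len(data),
        }

        def run(meta):
            data = meta["input"]
            for task in meta["tasks"]:
                data = ACTIONS[task["action"]](data, task)
            return data

        print(run(META))  # 5.0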

  9. Rodent motor and neuropsychological behaviour measured in home cages using the integrated modular platform SmartCage™.

    PubMed

    Khroyan, Taline V; Zhang, Jingxi; Yang, Liya; Zou, Bende; Xie, James; Pascual, Conrado; Malik, Adam; Xie, Julian; Zaveri, Nurulain T; Vazquez, Jacqueline; Polgar, Willma; Toll, Lawrence; Fang, Jidong; Xie, Xinmin

    2012-07-01

    1. To facilitate investigation of diverse behaviours in rodents' home cages, we have developed an integrated modular platform, the SmartCage™ system (AfaSci, Inc., Burlingame, CA, USA), which enables automated neurobehavioural phenotypic analysis and in vivo drug screening in a relatively higher-throughput and more objective manner. 2. The individual platform consists of an infrared array, a vibration floor sensor and a variety of modular devices. One computer can simultaneously operate up to 16 platforms via USB cables. 3. The SmartCage™ detects drug-induced increases and decreases in activity levels, as well as changes in movement patterns. Wake and sleep states of mice can be detected using the vibration floor sensor. The arousal state classification achieved up to 98% accuracy compared with results obtained by electroencephalography and electromyography. More complex behaviours, including motor coordination, anxiety-related behaviours and social approach behaviour, can be assessed using appropriate modular devices, and the results obtained are comparable with those obtained using conventional methods. 4. In conclusion, the SmartCage™ system provides an automated and accurate tool to quantify various rodent behaviours in a 'stress-free' environment. This system, combined with the validated testing protocols, offers a powerful tool kit for transgenic phenotyping and in vivo drug screening. © 2012 The Authors. Clinical and Experimental Pharmacology and Physiology © 2012 Blackwell Publishing Asia Pty Ltd.

  10. Rapid automation of a cell-based assay using a modular approach: case study of a flow-based Varicella Zoster Virus infectivity assay.

    PubMed

    Joelsson, Daniel; Gates, Irina V; Pacchione, Diana; Wang, Christopher J; Bennett, Philip S; Zhang, Yuhua; McMackin, Jennifer; Frey, Tina; Brodbeck, Kristin C; Baxter, Heather; Barmat, Scott L; Benetti, Luca; Bodmer, Jean-Luc

    2010-06-01

    Vaccine manufacturing requires constant analytical monitoring to ensure reliable quality and a consistent safety profile of the final product. Concentration and bioactivity of active components of the vaccine are key attributes routinely evaluated throughout the manufacturing cycle and for product release and dosage. In the case of live attenuated virus vaccines, bioactivity is traditionally measured in vitro by infection of susceptible cells with the vaccine followed by quantification of virus replication, cytopathology or expression of viral markers. These assays are typically multi-day procedures that require trained technicians and constant attention. Considering the need for high volumes of testing, automation and streamlining of these assays is highly desirable. In this study, the automation and streamlining of a complex infectivity assay for Varicella Zoster Virus (VZV)-containing test articles is presented. The automation procedure was completed using existing liquid handling infrastructure in a modular fashion, limiting custom-designed elements to a minimum to facilitate transposition. In addition, cellular senescence data provided an optimal population doubling range for long-term, reliable assay operation at high throughput. The results presented in this study demonstrate a successful automation paradigm resulting in an eightfold increase in throughput while maintaining assay performance characteristics comparable to the original assay. Copyright 2010 Elsevier B.V. All rights reserved.

  11. Can the Roche hemolysis index be used for automated determination of cell-free hemoglobin? A comparison to photometric assays.

    PubMed

    Petrova, Darinka Todorova; Cocisiu, Gabriela Ariadna; Eberle, Christoph; Rhode, Karl-Heinz; Brandhorst, Gunnar; Walson, Philip D; Oellerich, Michael

    2013-09-01

    The aim of this study was to develop a novel method for automated quantification of cell-free hemoglobin (fHb) based on the hemolysis index (HI; Roche Diagnostics). The novel fHb method based on the HI was correlated with fHb measured using the triple-wavelength methods of both Harboe [fHb, g/L = (0.915 * HI + 2.634)/100] and Fairbanks et al. [fHb, g/L = (0.917 * HI + 2.131)/100]. fHb concentrations were estimated from the HI using the Roche Modular automated platform in self-made and commercially available quality controls, as well as samples from a proficiency testing scheme (INSTAND). The fHb results obtained using the Roche automated HI method were then compared to results obtained using the traditional spectrophotometric assays for one hundred plasma samples with varying degrees of hemolysis, lipemia and/or bilirubinemia. The novel method using automated HI quantification on the Roche Modular clinical chemistry platform correlated well with results using the classical methods in the 100 patient samples (Harboe: r = 0.9284; Fairbanks et al.: r = 0.9689) and recovery was good for self-made controls. However, commercially available quality controls showed poor recovery due to an unidentified matrix problem. The novel method produced reliable determination of fHb in samples without interferences. However, poor recovery using commercially available fHb quality control samples currently greatly limits its usefulness. © 2013.
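    The two regression formulas quoted in this abstract make the HI-to-fHb conversion easy to reproduce. The following minimal Python sketch applies both published conversions; only the formulas come from the record, while the function names are ours.

```python
# Estimating cell-free hemoglobin (fHb, g/L) from the Roche hemolysis index
# (HI), using the two conversion formulas quoted in the abstract.

def fhb_harboe(hi: float) -> float:
    """fHb (g/L) from HI, Harboe-calibrated regression."""
    return (0.915 * hi + 2.634) / 100.0

def fhb_fairbanks(hi: float) -> float:
    """fHb (g/L) from HI, Fairbanks et al.-calibrated regression."""
    return (0.917 * hi + 2.131) / 100.0

for hi in (10, 50, 100):
    print(f"HI={hi:>3}: Harboe {fhb_harboe(hi):.3f} g/L, "
          f"Fairbanks {fhb_fairbanks(hi):.3f} g/L")
```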

  12. Modular multiapertures for light sensors

    NASA Technical Reports Server (NTRS)

    Rizzo, A. A.

    1977-01-01

    Process involves electroplating multiaperture masks as a unit, eliminating alignment and assembly difficulties previously encountered. Technique may be applied to masks in automated and surveillance light systems when a precise, wide-angle field of view is needed.

  13. Programmable Automated Welding System (PAWS)

    NASA Technical Reports Server (NTRS)

    Kline, Martin D.

    1994-01-01

    An ambitious project to develop an advanced, automated welding system is being funded as part of the Navy Joining Center with Babcock & Wilcox as the prime integrator. This program, the Programmable Automated Welding System (PAWS), involves the integration of both planning and real-time control activities. Planning functions include the development of a graphical decision support system within a standard, portable environment. Real-time control functions include the development of a modular, intelligent, real-time control system and the integration of a number of welding process sensors. This paper presents each of these components of the PAWS and discusses how they can be utilized to automate the welding operation.

  14. Modular modeling system for building distributed hydrologic models with a user-friendly software package

    NASA Astrophysics Data System (ADS)

    Wi, S.; Ray, P. A.; Brown, C.

    2015-12-01

    A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of the hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.
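    The stage-by-stage assembly described here can be illustrated with a toy model. Below is a minimal Python sketch in which each stage of the hydrological cycle (snowmelt, evapotranspiration, soil moisture accounting) is an interchangeable function composed into one simulation; all function names, parameters and numbers are hypothetical and do not come from the package.

```python
# Sketch of a modular hydrologic model assembled from interchangeable stage
# functions, one per stage of the hydrological cycle. All values are toy values.

def pet_hargreaves(temp_c):                      # one of several possible PET options
    return max(0.0, 0.0023 * (temp_c + 17.8))

def snowmelt_degree_day(temp_c, swe, ddf=2.0):   # degree-day melt from snowpack
    melt = min(swe, max(0.0, ddf * temp_c))
    return melt, swe - melt

def soil_bucket(storage, water_in, pet, capacity=60.0):
    storage = min(capacity, storage + water_in)  # fill the bucket, spill above capacity
    aet = min(storage, pet)                      # actual evapotranspiration
    runoff = max(0.0, storage - aet - 0.9 * capacity)
    return storage - aet - runoff, runoff

def simulate(forcing, pet_fn, melt_fn, soil_fn):
    storage, swe, flows = 50.0, 20.0, []
    for temp, precip in forcing:                 # daily (temperature, precipitation)
        melt, swe = melt_fn(temp, swe)
        storage, q = soil_fn(storage, precip + melt, pet_fn(temp))
        flows.append(round(q, 2))
    return flows

print(simulate([(5.0, 10.0), (8.0, 0.0), (2.0, 25.0)],
               pet_hargreaves, snowmelt_degree_day, soil_bucket))
```

Swapping in a different PET or snowmelt function changes the model without touching the driver, which is the essence of the modular assembly the record describes.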

  15. Agile Acceptance Test–Driven Development of Clinical Decision Support Advisories: Feasibility of Using Open Source Software

    PubMed Central

    Baldwin, Krystal L; Kannan, Vaishnavi; Flahaven, Emily L; Parks, Cassandra J; Ott, Jason M; Willett, Duwayne L

    2018-01-01

    Background Moving to electronic health records (EHRs) confers substantial benefits but risks unintended consequences. Modern EHRs consist of complex software code with extensive local configurability options, which can introduce defects. Defects in clinical decision support (CDS) tools are surprisingly common. Feasible approaches to prevent and detect defects in EHR configuration, including CDS tools, are needed. In complex software systems, use of test–driven development and automated regression testing promotes reliability. Test–driven development encourages modular, testable design and expanding regression test coverage. Automated regression test suites improve software quality, providing a “safety net” for future software modifications. Each automated acceptance test serves multiple purposes, as requirements (prior to build), acceptance testing (on completion of build), regression testing (once live), and “living” design documentation. Rapid-cycle development or “agile” methods are being successfully applied to CDS development. The agile practice of automated test–driven development is not widely adopted, perhaps because most EHR software code is vendor-developed. However, key CDS advisory configuration design decisions and rules stored in the EHR may prove amenable to automated testing as “executable requirements.” Objective We aimed to establish feasibility of acceptance test–driven development of clinical decision support advisories in a commonly used EHR, using an open source automated acceptance testing framework (FitNesse). Methods Acceptance tests were initially constructed as spreadsheet tables to facilitate clinical review. Each table specified one aspect of the CDS advisory’s expected behavior. Table contents were then imported into a test suite in FitNesse, which queried the EHR database to automate testing. Tests and corresponding CDS configuration were migrated together from the development environment to production, with tests becoming part of the production regression test suite. Results We used test–driven development to construct a new CDS tool advising Emergency Department nurses to perform a swallowing assessment prior to administering oral medication to a patient with suspected stroke. Test tables specified desired behavior for (1) applicable clinical settings, (2) triggering action, (3) rule logic, (4) user interface, and (5) system actions in response to user input. Automated test suite results for the “executable requirements” are shown prior to building the CDS alert, during build, and after successful build. Conclusions Automated acceptance test–driven development and continuous regression testing of CDS configuration in a commercial EHR proves feasible with open source software. Automated test–driven development offers one potential contribution to achieving high-reliability EHR configuration. Vetting acceptance tests with clinicians elicits their input on crucial configuration details early during initial CDS design and iteratively during rapid-cycle optimization. PMID:29653922
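    The "executable requirements" idea in this record lends itself to a compact illustration. Below is a minimal Python sketch of a table-driven acceptance test in the spirit described above; the rule function, field names and rows are hypothetical stand-ins, since the study's actual tests lived in FitNesse tables that queried the EHR database.

```python
# Sketch of table-driven acceptance tests: each row states one expected
# behavior of the CDS advisory. A hypothetical rule function stands in for
# the configured alert logic.

def swallow_screen_alert(dept: str, order_route: str, stroke_suspected: bool) -> bool:
    """Hypothetical stand-in for the configured CDS rule."""
    return dept == "ED" and order_route == "oral" and stroke_suspected

# Each row: inputs plus the expected outcome (an "executable requirement").
TEST_TABLE = [
    {"dept": "ED",  "order_route": "oral", "stroke_suspected": True,  "expect": True},
    {"dept": "ED",  "order_route": "IV",   "stroke_suspected": True,  "expect": False},
    {"dept": "ICU", "order_route": "oral", "stroke_suspected": True,  "expect": False},
    {"dept": "ED",  "order_route": "oral", "stroke_suspected": False, "expect": False},
]

for row in TEST_TABLE:
    got = swallow_screen_alert(row["dept"], row["order_route"], row["stroke_suspected"])
    print("PASS" if got == row["expect"] else "FAIL", row)
```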

  16. Agile Acceptance Test-Driven Development of Clinical Decision Support Advisories: Feasibility of Using Open Source Software.

    PubMed

    Basit, Mujeeb A; Baldwin, Krystal L; Kannan, Vaishnavi; Flahaven, Emily L; Parks, Cassandra J; Ott, Jason M; Willett, Duwayne L

    2018-04-13

    Moving to electronic health records (EHRs) confers substantial benefits but risks unintended consequences. Modern EHRs consist of complex software code with extensive local configurability options, which can introduce defects. Defects in clinical decision support (CDS) tools are surprisingly common. Feasible approaches to prevent and detect defects in EHR configuration, including CDS tools, are needed. In complex software systems, use of test-driven development and automated regression testing promotes reliability. Test-driven development encourages modular, testable design and expanding regression test coverage. Automated regression test suites improve software quality, providing a "safety net" for future software modifications. Each automated acceptance test serves multiple purposes, as requirements (prior to build), acceptance testing (on completion of build), regression testing (once live), and "living" design documentation. Rapid-cycle development or "agile" methods are being successfully applied to CDS development. The agile practice of automated test-driven development is not widely adopted, perhaps because most EHR software code is vendor-developed. However, key CDS advisory configuration design decisions and rules stored in the EHR may prove amenable to automated testing as "executable requirements." We aimed to establish feasibility of acceptance test-driven development of clinical decision support advisories in a commonly used EHR, using an open source automated acceptance testing framework (FitNesse). Acceptance tests were initially constructed as spreadsheet tables to facilitate clinical review. Each table specified one aspect of the CDS advisory's expected behavior. Table contents were then imported into a test suite in FitNesse, which queried the EHR database to automate testing. Tests and corresponding CDS configuration were migrated together from the development environment to production, with tests becoming part of the production regression test suite. We used test-driven development to construct a new CDS tool advising Emergency Department nurses to perform a swallowing assessment prior to administering oral medication to a patient with suspected stroke. Test tables specified desired behavior for (1) applicable clinical settings, (2) triggering action, (3) rule logic, (4) user interface, and (5) system actions in response to user input. Automated test suite results for the "executable requirements" are shown prior to building the CDS alert, during build, and after successful build. Automated acceptance test-driven development and continuous regression testing of CDS configuration in a commercial EHR proves feasible with open source software. Automated test-driven development offers one potential contribution to achieving high-reliability EHR configuration. Vetting acceptance tests with clinicians elicits their input on crucial configuration details early during initial CDS design and iteratively during rapid-cycle optimization. ©Mujeeb A Basit, Krystal L Baldwin, Vaishnavi Kannan, Emily L Flahaven, Cassandra J Parks, Jason M Ott, Duwayne L Willett. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 13.04.2018.

  17. Laboratory automation of high-quality and efficient ligand-binding assays for biotherapeutic drug development.

    PubMed

    Wang, Jin; Patel, Vimal; Burns, Daniel; Laycock, John; Pandya, Kinnari; Tsoi, Jennifer; DeSilva, Binodh; Ma, Mark; Lee, Jean

    2013-07-01

    Regulated bioanalytical laboratories that run ligand-binding assays in support of biotherapeutics development face ever-increasing demand to support more projects with increased efficiency. Laboratory automation is a tool that has the potential to improve both quality and efficiency in a bioanalytical laboratory. The success of laboratory automation requires thoughtful evaluation of program needs and fit-for-purpose strategies, followed by pragmatic implementation plans and continuous user support. In this article, we present the development of fit-for-purpose automation in both total walk-away and flexible modular modes. We share our sustained experience of vendor collaboration and teamwork to educate, promote and track the use of automation. The implementation of laboratory automation improves assay performance, data quality, process efficiency and method transfer to CROs in a regulated bioanalytical laboratory environment.

  18. Automation of testing modules of controller ELSY-TMK

    NASA Astrophysics Data System (ADS)

    Dolotov, A. E.; Dolotova, R. G.; Petuhov, D. V.; Potapova, A. P.

    2017-01-01

    Modern automation of various processes makes it possible to maintain high quality standards for released products and to raise labour efficiency. This paper presents data on the automation of the test process for the ELSY-TMK controller [1]. The ELSY-TMK programmable logic controller is an effective modular platform for building automation systems for small and medium-scale industrial production. The controller's modern, functional communication standard and open environment provide a powerful tool for a wide spectrum of industrial automation applications. The algorithm allows controller modules to be tested, by operating the switching system and external devices, faster and to a higher standard of quality than a human could achieve without such means.

  19. Modular thought in circuit analysis

    NASA Astrophysics Data System (ADS)

    Wang, Feng

    2018-04-01

    Modular thinking provides a method for simplifying complex problems as a whole, and the study of circuits presents a similar situation: the complex connections between components make circuit problems appear difficult to solve, yet those connections in fact follow rules. This article describes the application of modular thinking to circuit analysis. It first introduces the definition of a two-terminal network and the concept of equivalent conversion of two-terminal networks, then summarizes modular approaches to common source-resistance hybrid networks and processing methods for networks containing controlled sources, and finally lists common modules and analyses typical examples.

  20. INITIATORS AND TRIGGERING CONDITIONS FOR ADAPTIVE AUTOMATION IN ADVANCED SMALL MODULAR REACTORS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katya L Le Blanc; Johanna h Oxstrand

    It is anticipated that Advanced Small Modular Reactors (AdvSMRs) will employ high degrees of automation. High levels of automation can enhance system performance, but often at the cost of reduced human performance. Automation can lead to human out-of-the-loop issues, unbalanced workload, complacency, and other problems if it is not designed properly. Researchers have proposed adaptive automation (defined as dynamic or flexible allocation of functions) as a way to get the benefits of higher levels of automation without the human performance costs. Adaptive automation has the potential to balance operator workload and enhance operator situation awareness by allocating functions to the operators in a way that is sensitive to overall workload and capabilities at the time of operation. However, there are still a number of questions regarding how to design adaptive automation effectively to achieve that potential. One of those questions concerns how to initiate (or trigger) a shift in automation in order to provide maximal sensitivity to operator needs without introducing undesirable consequences (such as unpredictable mode changes). Several triggering mechanisms for shifts in adaptive automation have been proposed, including: operator-initiated, critical events, performance-based, physiological measurement, model-based, and hybrid methods. As part of a larger project to develop design guidance for human-automation collaboration in AdvSMRs, researchers at Idaho National Laboratory have investigated the effectiveness and applicability of each of these triggering mechanisms in the context of AdvSMR. Researchers reviewed the empirical literature on adaptive automation and assessed each triggering mechanism based on the human-system performance consequences of employing it. Researchers also assessed the practicality and feasibility of using each mechanism in the context of an AdvSMR control room. Results indicate that there are tradeoffs associated with each mechanism, but that some are more applicable to the AdvSMR domain. The two mechanisms that consistently improve performance in laboratory studies are operator-initiated adaptive automation based on hierarchical task delegation and electroencephalogram (EEG)-based measures of engagement. Current EEG methods are intrusive and require intensive analysis; EEG-based triggering is therefore not recommended for AdvSMR control rooms at this time. Researchers also discuss limitations in the existing empirical literature and make recommendations for further research.
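    To make one of the listed triggering mechanisms concrete, the following Python sketch implements a performance-based trigger with a hysteresis band, which guards against the rapid, unpredictable mode changes the record cautions against. The thresholds, window size and level scale are illustrative assumptions, not values from the study.

```python
# Sketch of a performance-based trigger for adaptive automation: the level of
# automation rises when a rolling operator-performance score degrades and
# falls when it recovers. The hysteresis band avoids rapid mode oscillation.
from collections import deque

class PerformanceTrigger:
    def __init__(self, low=0.60, high=0.80, window=10):
        self.low, self.high = low, high      # hysteresis band on mean performance
        self.scores = deque(maxlen=window)   # rolling performance window
        self.automation_level = 1            # 1 = mostly manual, 3 = mostly automatic

    def update(self, score: float) -> int:
        self.scores.append(score)
        avg = sum(self.scores) / len(self.scores)
        if avg < self.low and self.automation_level < 3:
            self.automation_level += 1       # operator struggling: automate more
        elif avg > self.high and self.automation_level > 1:
            self.automation_level -= 1       # operator coping: hand tasks back
        return self.automation_level

trigger = PerformanceTrigger()
for s in [0.9, 0.7, 0.5, 0.4, 0.4, 0.85, 0.9, 0.95]:
    print(f"score={s:.2f} -> automation level {trigger.update(s)}")
```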

  1. ALC: automated reduction of rule-based models

    PubMed Central

    Koschorreck, Markus; Gilles, Ernst Dieter

    2008-01-01

    Background Combinatorial complexity is a challenging problem for the modeling of cellular signal transduction, since the association of a few proteins can give rise to an enormous number of feasible protein complexes. The layer-based approach is an approximate but accurate method for the mathematical modeling of signaling systems with inherent combinatorial complexity. The number of variables in the simulation equations is highly reduced and the resulting dynamic models show a pronounced modularity. Layer-based modeling allows for the modeling of systems not accessible previously. Results ALC (Automated Layer Construction) is a computer program that greatly simplifies the building of reduced modular models according to the layer-based approach. The model is defined using a simple but powerful rule-based syntax that supports the concepts of modularity and macrostates. ALC performs consistency checks on the model definition and provides the model output in different formats (C MEX, MATLAB, Mathematica and SBML) as ready-to-run simulation files. ALC also provides additional documentation files that simplify the publication or presentation of the models. The tool can be used offline or via a form on the ALC website. Conclusion ALC allows for a simple rule-based generation of layer-based reduced models. The model files are given in different formats as ready-to-run simulation files. PMID:18973705

  2. The development of a post-test diagnostic system for rocket engines

    NASA Technical Reports Server (NTRS)

    Zakrajsek, June F.

    1991-01-01

    An effort was undertaken by NASA to develop an automated post-test, post-flight diagnostic system for rocket engines. The automated system is designed to be generic and to automate the rocket engine data review process. A modular, distributed architecture with a generic software core was chosen to meet the design requirements. The diagnostic system is initially being applied to the Space Shuttle Main Engine data review process. The system modules currently under development are the session/message manager, and portions of the applications section, the component analysis section, and the intelligent knowledge server. An overview is presented of a rocket engine data review process, the design requirements and guidelines, the architecture and modules, and the projected benefits of the automated diagnostic system.

  3. Medical Data Architecture Capabilities and Design

    NASA Technical Reports Server (NTRS)

    Middour, C.; Krihak, M.; Lindsey, A.; Marker, N.; Wolfe, S.; Winther, S.; Ronzano, K.; Bolles, D.; Toscano, W.; Shaw, T.

    2017-01-01

    Mission constraints will challenge the delivery of medical care on a long-term, deep space exploration mission. This type of mission will be restricted in the availability of medical knowledge, skills, procedures and resources to prevent, diagnose, and treat in-flight medical events. Challenges to providing medical care are anticipated, including resource and resupply constraints, delayed communications and no ability for medical evacuation. The Medical Data Architecture (MDA) project will enable medical care capability in this constrained environment. The first version of the system, called Test Bed 1, includes capabilities for automated data collection, data storage and data retrieval to provide information to the Crew Medical Officer (CMO). Test Bed 1 seeks to establish a data architecture foundation and develop a scalable data management system through modular design and standardized interfaces. In addition, it will demonstrate to stakeholders the potential for an improved, automated flow of data to and from the medical system over the current methods employed on the International Space Station (ISS). It integrates a set of external devices, software and processes, and a Subjective, Objective, Assessment, and Plan (SOAP) note commonly used by clinicians. Medical data like electrocardiogram plots, heart rate, skin temperature, respiration rate, medications taken, and more are collected from devices and stored in the Electronic Medical Records (EMR) system, and reported to crew and clinician. Devices integrated include the Astroskin biosensor vest and IMED CARDIAX electrocardiogram (ECG) device with INEED MD ECG Glove, and the NASA-developed Medical Dose Tracker application. The system is designed to be operated as a standalone system, and can be deployed in a variety of environments, from a laptop to a data center. The system is primarily composed of open-source software tools, and is designed to be modular, so new capabilities can be added. The software components and integration methods will be discussed.

  4. Design and performance of an automated radionuclide separator: its application on the determination of ⁹⁹Tc in groundwater.

    PubMed

    Chung, Kun Ho; Choi, Sang Do; Choi, Geun Sik; Kang, Mun Ja

    2013-11-01

    A modular automated radionuclide separator for (99)Tc (MARS Tc-99) has been developed for the rapid and reproducible separation of technetium in groundwater samples. The control software of MARS Tc-99 was developed in the LabView programming language. An automated radiochemical method for separating (99)Tc was developed and validated by the purification of (99m)Tc tracer solution eluted from a commercial (99)Mo/(99m)Tc generator. The chemical recovery and analytical time for this radiochemical method were found to be 96 ± 2% and 81 min, respectively. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Tackling the x-ray cargo inspection challenge using machine learning

    NASA Astrophysics Data System (ADS)

    Jaccard, Nicolas; Rogers, Thomas W.; Morton, Edward J.; Griffin, Lewis D.

    2016-05-01

    The current infrastructure for non-intrusive inspection of cargo containers cannot accommodate exploding commerce volumes and increasingly stringent regulations. There is a pressing need to develop methods to automate parts of the inspection workflow, enabling expert operators to focus on a manageable number of high-risk images. To tackle this challenge, we developed a modular framework for automated X-ray cargo image inspection. Employing state-of-the-art machine learning approaches, including deep learning, we demonstrate high performance for empty container verification and specific threat detection. This work constitutes a significant step towards the partial automation of X-ray cargo image inspection.

  6. Automated fiber pigtailing machine

    DOEpatents

    Strand, Oliver T.; Lowry, Mark E.

    1999-01-01

    The Automated Fiber Pigtailing Machine (AFPM) aligns and attaches optical fibers to optoelectonic (OE) devices such as laser diodes, photodiodes, and waveguide devices without operator intervention. The so-called pigtailing process is completed with sub-micron accuracies in less than 3 minutes. The AFPM operates unattended for one hour, is modular in design and is compatible with a mass production manufacturing environment. This machine can be used to build components which are used in military aircraft navigation systems, computer systems, communications systems and in the construction of diagnostics and experimental systems.

  7. Multichannel microscale system for high throughput preparative separation with comprehensive collection and analysis

    DOEpatents

    Karger, Barry L.; Kotler, Lev; Foret, Frantisek; Minarik, Marek; Kleparnik, Karel

    2003-12-09

    A modular multiple lane or capillary electrophoresis (chromatography) system that permits automated parallel separation and comprehensive collection of all fractions from samples in all lanes or columns, with the option of further on-line automated sample fraction analysis, is disclosed. Preferably, fractions are collected in a multi-well fraction collection unit, or plate (40). The multi-well collection plate (40) is preferably made of a solvent permeable gel, most preferably a hydrophilic, polymeric gel such as agarose or cross-linked polyacrylamide.

  8. ACQ4: an open-source software platform for data acquisition and analysis in neurophysiology research.

    PubMed

    Campagnola, Luke; Kratz, Megan B; Manis, Paul B

    2014-01-01

    The complexity of modern neurophysiology experiments requires specialized software to coordinate multiple acquisition devices and analyze the collected data. We have developed ACQ4, an open-source software platform for performing data acquisition and analysis in experimental neurophysiology. This software integrates the tasks of acquiring, managing, and analyzing experimental data. ACQ4 has been used primarily for standard patch-clamp electrophysiology, laser scanning photostimulation, multiphoton microscopy, intrinsic imaging, and calcium imaging. The system is highly modular, which facilitates the addition of new devices and functionality. The modules included with ACQ4 provide for rapid construction of acquisition protocols, live video display, and customizable analysis tools. Position-aware data collection allows automated construction of image mosaics and registration of images with 3-dimensional anatomical atlases. ACQ4 uses free and open-source tools including Python, NumPy/SciPy for numerical computation, PyQt for the user interface, and PyQtGraph for scientific graphics. Supported hardware includes cameras, patch clamp amplifiers, scanning mirrors, lasers, shutters, Pockels cells, motorized stages, and more. ACQ4 is available for download at http://www.acq4.org.

  9. Reader Response to Front Pages with Modular Format and Color [and] Newspaper Errors: Source Perception, Reporter Response and Some Causes. American Newspaper Publishers Association (ANPA) News Research Report No. 35.

    ERIC Educational Resources Information Center

    Click, J. W.; And Others

    Two studies were conducted, the first to determine reader response to newspaper front pages with modular format and color, and the second to examine source perception and reporter response to errors in news stories. Results of the first study revealed that respondents in three cities preferred modular front pages to other modern format pages and…

  10. Modular implementation of a digital hardware design automation system

    NASA Astrophysics Data System (ADS)

    Masud, M.

    An automation system based on AHPL (A Hardware Programming Language) was developed. The project may be divided into three distinct phases: (1) upgrading of AHPL to make it more universally applicable; (2) implementation of a compiler for the language; and (3) illustration of how the compiler may be used to support several phases of design activities. Several new features were added to AHPL. These include: application-dependent parameters, multiple clocks, asynchronous results, functional registers and primitive functions. The new language, called Universal AHPL, has been defined rigorously. The compiler design is modular. The parsing is done by an automatic parser generated from the SLR(1) BNF grammar of the language. The compiler produces two data bases from the AHPL description of a circuit. The first is a tabular representation of the circuit, and the second is a detailed interconnection linked list. The two data bases provide a means to interface the compiler to application-dependent CAD systems.

  11. Research gaps and technology needs in development of PHM for passive AdvSMR components

    NASA Astrophysics Data System (ADS)

    Meyer, Ryan M.; Ramuhalli, Pradeep; Coble, Jamie B.; Hirt, Evelyn H.; Mitchell, Mark R.; Wootan, David W.; Berglin, Eric J.; Bond, Leonard J.; Henagar, Chuck H., Jr.

    2014-02-01

    Advanced small modular reactors (AdvSMRs), which are based on modularization of advanced reactor concepts, may provide a longer-term alternative to traditional light-water reactors and near-term small modular reactors (SMRs), which are based on integral pressurized water reactor (iPWR) concepts. SMRs are challenged economically because of losses in economy of scale; thus, there is increased motivation to reduce the controllable operations and maintenance costs through automation technologies including prognostics health management (PHM) systems. In this regard, PHM systems have the potential to play a vital role in supporting the deployment of AdvSMRs and face several unique challenges with respect to implementation for passive AdvSMR components. This paper presents a summary of a research gaps and technical needs assessment performed for implementation of PHM for passive AdvSMR components.

  12. MRO Sequence Checking Tool

    NASA Technical Reports Server (NTRS)

    Fisher, Forest; Gladden, Roy; Khanampornpan, Teerapat

    2008-01-01

    The MRO Sequence Checking Tool program, mro_check, automates significant portions of the MRO (Mars Reconnaissance Orbiter) sequence checking procedure. Though MRO has checks similar to those of ODY's (Mars Odyssey) Mega Check tool, the checks needed for MRO are unique to the MRO spacecraft. The MRO sequence checking tool automates the majority of the sequence validation procedure and checklists that are used to validate the sequences generated by the MRO MPST (mission planning and sequencing team). The tool performs more than 50 different checks on the sequence. The automation ranges from summarizing data about the sequence needed for visual verification of the sequence, to performing automated checks on the sequence and providing a report for each step. To allow for the addition of new checks as needed, this tool is built in a modular fashion.
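    The modular check architecture described here is a common pattern: each check is a small, independent function registered with a driver that runs all of them and reports per-check results, so new checks can be added without touching the driver. The Python sketch below illustrates the pattern; the check names and sequence fields are hypothetical, not mro_check's actual checks.

```python
# Sketch of a modular check registry: checks register themselves, the driver
# runs every registered check against the sequence and reports each result.

CHECKS = []

def check(name):
    """Decorator that registers a check function under a readable name."""
    def register(fn):
        CHECKS.append((name, fn))
        return fn
    return register

@check("commands in time order")
def time_order(seq):
    times = [cmd["t"] for cmd in seq]
    return times == sorted(times)

@check("no duplicate command IDs")
def unique_ids(seq):
    ids = [cmd["id"] for cmd in seq]
    return len(ids) == len(set(ids))

def run_checks(seq):
    for name, fn in CHECKS:
        print(f"{'PASS' if fn(seq) else 'FAIL'}: {name}")

# Toy sequence: times are ordered, but one command ID repeats.
run_checks([{"id": 1, "t": 0.0}, {"id": 2, "t": 4.0}, {"id": 2, "t": 5.0}])
```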

  13. EDTA analysis on the Roche MODULAR analyser.

    PubMed

    Davidson, D F

    2007-05-01

    Patient specimens can be subject to subtle interference from cross contamination by liquid-based, potassium-containing EDTA anticoagulant, leading to misinterpretation of results. A rapid method for EDTA analysis to detect such contamination is described. An in-house EDTA assay on the Roche MODULAR analyser was assessed for accuracy and precision by comparison with an adjusted calcium difference measurement (atomic absorption and o-cresolphthalein complexone colorimetry). EDTA method versus adjusted calcium difference showed: slope = 1.038 (95% confidence interval [CI] 0.949-1.131); intercept = 0.073 (95% CI 0.018-0.132) mmol/L; r = 0.914; n = 94. However, inter-assay precision of the calcium difference method was estimated to be poorer (coefficient of variation 24.8% versus 3.4% for the automated colorimetric method at an EDTA concentration of 0.25 mmol/L). Unequivocal contamination was observed at an EDTA concentration of > or =0.2 mmol/L. The automated method showed positive interference from haemolysis and negative interference from oxalate. The method was unaffected by lipaemia (triglycerides <20 mmol/L), icterus (bilirubin <500 micromol/L), glucose (<100 mmol/L), iron (<100 micromol/L), and citrate, phosphate or fluoride (all <2.5 mmol/L). The automated colorimetric assay described is an accurate, precise and rapid (3 min) means of detecting EDTA contamination of unhaemolysed biochemistry specimens.

  14. A modular computational framework for automated peak extraction from ion mobility spectra

    PubMed Central

    2014-01-01

    Background An ion mobility (IM) spectrometer coupled with a multi-capillary column (MCC) measures volatile organic compounds (VOCs) in the air or in exhaled breath. This technique is utilized in several biotechnological and medical applications. Each peak in an MCC/IM measurement represents a certain compound, which may be known or unknown. For clustering and classification of measurements, the raw data matrix must be reduced to a set of peaks. Each peak is described by its coordinates (retention time in the MCC and reduced inverse ion mobility) and shape (signal intensity, further shape parameters). This fundamental step is referred to as peak extraction. It is the basis for identifying discriminating peaks, and hence putative biomarkers, between two classes of measurements, such as a healthy control group and a group of patients with a confirmed disease. Current state-of-the-art peak extraction methods require human interaction, such as hand-picking approximate peak locations, assisted by a visualization of the data matrix. In a high-throughput context, however, it is preferable to have robust methods for fully automated peak extraction. Results We introduce PEAX, a modular framework for automated peak extraction. The framework consists of several steps in a pipeline architecture. Each step performs a specific sub-task and can be instantiated by different methods implemented as modules. We provide open-source software for the framework and several modules for each step. Additionally, an interface that allows easy extension by a new module is provided. Combining the modules in all reasonable ways leads to a large number of peak extraction methods. We evaluate all combinations using intrinsic error measures and by comparing the resulting peak sets with an expert-picked one. Conclusions Our software PEAX is able to automatically extract peaks from MCC/IM measurements within a few seconds. The automatically obtained results keep up with the results provided by current state-of-the-art peak extraction methods. This opens a high-throughput context for the MCC/IM application field. Our software is available at http://www.rahmannlab.de/research/ims. PMID:24450533
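    The pipeline architecture this record describes, in which each step can be instantiated by different modules and all module combinations can be evaluated, is easy to sketch. The Python below illustrates the pattern with toy smoothing and peak-picking modules; the module names and logic are our own illustrative assumptions, not PEAX's actual modules.

```python
# Sketch of a pipeline where each step has interchangeable module
# implementations; combining the options yields a family of peak-extraction
# methods that can all be evaluated, as the record describes.
from itertools import product

SMOOTHERS = {
    "none":   lambda xs: xs,
    "moving": lambda xs: [sum(xs[max(0, i - 1):i + 2]) / len(xs[max(0, i - 1):i + 2])
                          for i in range(len(xs))],
}

PICKERS = {
    "threshold": lambda xs: [i for i, x in enumerate(xs) if x > 5.0],
    "localmax":  lambda xs: [i for i in range(1, len(xs) - 1)
                             if xs[i - 1] < xs[i] > xs[i + 1]],
}

signal = [0.0, 1.0, 6.5, 9.0, 6.0, 1.0, 0.5, 7.0, 2.0]

for s_name, p_name in product(SMOOTHERS, PICKERS):  # every pipeline variant
    peaks = PICKERS[p_name](SMOOTHERS[s_name](signal))
    print(f"smoother={s_name:<6} picker={p_name:<9} peaks at {peaks}")
```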

  15. A modular computational framework for automated peak extraction from ion mobility spectra.

    PubMed

    D'Addario, Marianna; Kopczynski, Dominik; Baumbach, Jörg Ingo; Rahmann, Sven

    2014-01-22

    An ion mobility (IM) spectrometer coupled with a multi-capillary column (MCC) measures volatile organic compounds (VOCs) in the air or in exhaled breath. This technique is utilized in several biotechnological and medical applications. Each peak in an MCC/IM measurement represents a certain compound, which may be known or unknown. For clustering and classification of measurements, the raw data matrix must be reduced to a set of peaks. Each peak is described by its coordinates (retention time in the MCC and reduced inverse ion mobility) and shape (signal intensity, further shape parameters). This fundamental step is referred to as peak extraction. It is the basis for identifying discriminating peaks, and hence putative biomarkers, between two classes of measurements, such as a healthy control group and a group of patients with a confirmed disease. Current state-of-the-art peak extraction methods require human interaction, such as hand-picking approximate peak locations, assisted by a visualization of the data matrix. In a high-throughput context, however, it is preferable to have robust methods for fully automated peak extraction. We introduce PEAX, a modular framework for automated peak extraction. The framework consists of several steps in a pipeline architecture. Each step performs a specific sub-task and can be instantiated by different methods implemented as modules. We provide open-source software for the framework and several modules for each step. Additionally, an interface that allows easy extension by a new module is provided. Combining the modules in all reasonable ways leads to a large number of peak extraction methods. We evaluate all combinations using intrinsic error measures and by comparing the resulting peak sets with an expert-picked one. Our software PEAX is able to automatically extract peaks from MCC/IM measurements within a few seconds. The automatically obtained results keep up with the results provided by current state-of-the-art peak extraction methods. This opens a high-throughput context for the MCC/IM application field. Our software is available at http://www.rahmannlab.de/research/ims.

  16. Automated Grading of Gliomas using Deep Learning in Digital Pathology Images: A modular approach with ensemble of convolutional neural networks.

    PubMed

    Ertosun, Mehmet Günhan; Rubin, Daniel L

    2015-01-01

    Brain glioma is the most common primary malignant brain tumor in adults, with distinct pathologic subtypes: Lower Grade Glioma (LGG) Grade II, Lower Grade Glioma (LGG) Grade III, and Glioblastoma Multiforme (GBM) Grade IV. Survival and treatment options are highly dependent on the glioma grade. We propose a deep learning-based, modular classification pipeline for automated grading of gliomas using digital pathology images. Whole-tissue digitized images of pathology slides obtained from The Cancer Genome Atlas (TCGA) were used to train our deep learning modules. Our modular pipeline provides diagnostic-quality statistics, such as precision, sensitivity and specificity, for the individual deep learning modules, and (1) facilitates training given the limited data in this domain, (2) enables exploration of different deep learning structures for each module, (3) leads to less complex modules that are simpler to analyze, and (4) provides flexibility, permitting use of single modules within the framework or of other modeling or machine learning applications, such as probabilistic graphical models or support vector machines. Our modular approach helps us meet the requirement of minimum accuracy levels demanded by the context of different decision points within a multi-class classification scheme. Convolutional neural networks were trained for each module and sub-task with classification accuracies above 90% on the validation data set, achieving 96% accuracy for the task of GBM vs LGG classification and 71% for further identifying the grade of LGG as Grade II or Grade III on an independent data set of new patients from the multi-institutional repository.
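    The modular decision scheme described here, a first module separating GBM from LGG and a second module refining LGG into Grade II vs III, can be sketched compactly. In the Python below, placeholder callables stand in for the trained convolutional networks; only the two-stage routing reflects the record.

```python
# Sketch of the two-stage modular grading decision: module 1 separates GBM
# from LGG; module 2 runs only on LGG calls to assign Grade II vs III.
# The classifier functions are random placeholders for the trained CNNs.
import random

def cnn_gbm_vs_lgg(image) -> str:        # stand-in for trained CNN module 1
    return random.choice(["GBM", "LGG"])

def cnn_lgg_grade(image) -> str:         # stand-in for trained CNN module 2
    return random.choice(["Grade II", "Grade III"])

def grade_glioma(image) -> str:
    first = cnn_gbm_vs_lgg(image)
    if first == "GBM":
        return "GBM (Grade IV)"          # decision ends at module 1
    return f"LGG, {cnn_lgg_grade(image)}"  # module 2 refines the LGG call

random.seed(0)
for slide in ["slide_a", "slide_b", "slide_c"]:
    print(slide, "->", grade_glioma(slide))
```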

  17. Automated Grading of Gliomas using Deep Learning in Digital Pathology Images: A modular approach with ensemble of convolutional neural networks

    PubMed Central

    Ertosun, Mehmet Günhan; Rubin, Daniel L.

    2015-01-01

    Brain glioma is the most common primary malignant brain tumor in adults, with distinct pathologic subtypes: Lower Grade Glioma (LGG) Grade II, Lower Grade Glioma (LGG) Grade III, and Glioblastoma Multiforme (GBM) Grade IV. Survival and treatment options are highly dependent on the glioma grade. We propose a deep learning-based, modular classification pipeline for automated grading of gliomas using digital pathology images. Whole-tissue digitized images of pathology slides obtained from The Cancer Genome Atlas (TCGA) were used to train our deep learning modules. Our modular pipeline provides diagnostic-quality statistics, such as precision, sensitivity and specificity, for the individual deep learning modules, and (1) facilitates training given the limited data in this domain, (2) enables exploration of different deep learning structures for each module, (3) leads to less complex modules that are simpler to analyze, and (4) provides flexibility, permitting use of single modules within the framework or of other modeling or machine learning applications, such as probabilistic graphical models or support vector machines. Our modular approach helps us meet the requirement of minimum accuracy levels demanded by the context of different decision points within a multi-class classification scheme. Convolutional neural networks were trained for each module and sub-task with classification accuracies above 90% on the validation data set, achieving 96% accuracy for the task of GBM vs LGG classification and 71% for further identifying the grade of LGG as Grade II or Grade III on an independent data set of new patients from the multi-institutional repository. PMID:26958289

  18. GASICA: generic automated stress induction and control application design of an application for controlling the stress state.

    PubMed

    van der Vijgh, Benny; Beun, Robbert J; van Rood, Maarten; Werkhoven, Peter

    2014-01-01

    In a multitude of research and therapy paradigms it is relevant to know, and desirable to control, the stress state of a patient or participant. Examples include research paradigms in which the stress state is the dependent or independent variable, or therapy paradigms where this state indicates the boundaries of the therapy. To our knowledge, no application currently exists that focuses specifically on the automated control of the stress state while at the same time being generic enough to be used for various therapy and research purposes. Therefore, we introduce GASICA, an application aimed at the automated control of the stress state in a multitude of therapy and research paradigms. The application consists of three components: a digital stressor game, a set of measurement devices, and a feedback model. These three components form a closed loop (called a biocybernetic loop by Pope et al. (1995) and Fairclough (2009)) that continuously presents an acute psychological stressor, measures several physiological responses to this stressor, and adjusts the stressor intensity based on these measurements by means of the feedback model, thereby aiming to control the stress state. In this manner GASICA presents multidimensional and ecologically valid stressors, whilst remaining continuously in control of the form and intensity of the presented stressors, aiming at the automated control of the stress state. Furthermore, the application is designed as a modular open-source application that easily implements different therapy and research tasks using a high-level programming interface and configuration file, and allows for the addition of (existing) measurement equipment, making it usable for various paradigms.
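    The biocybernetic loop described above, measure a physiological response, compare it to a target band, adjust the stressor, is simple to sketch. The Python below is a toy illustration under our own assumptions: heart rate as the proxy, a linear participant model, and arbitrary gains; none of these values come from GASICA.

```python
# Sketch of a biocybernetic loop: each cycle measures a physiological proxy
# (heart rate) and nudges the stressor intensity so the response stays inside
# a target band. Participant model, band and gain are toy assumptions.

def run_loop(target=(95.0, 105.0), cycles=8, gain=0.05):
    intensity, hr = 0.5, 80.0                        # stressor level in [0, 1]
    for cycle in range(cycles):
        hr += 40.0 * intensity - 0.3 * (hr - 70.0)   # toy participant response
        low, high = target
        if hr < low:
            intensity = min(1.0, intensity + gain)   # under-aroused: stress more
        elif hr > high:
            intensity = max(0.0, intensity - gain)   # over-aroused: back off
        print(f"cycle {cycle}: intensity={intensity:.2f} hr={hr:.1f}")

run_loop()
```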

  19. GASICA: generic automated stress induction and control application design of an application for controlling the stress state

    PubMed Central

    van der Vijgh, Benny; Beun, Robbert J.; van Rood, Maarten; Werkhoven, Peter

    2014-01-01

    In a multitude of research and therapy paradigms it is relevant to know, and desirable to control, the stress state of a patient or participant. Examples include research paradigms in which the stress state is the dependent or independent variable, or therapy paradigms where this state indicates the boundaries of the therapy. To our knowledge, no application currently exists that focuses specifically on the automated control of the stress state while at the same time being generic enough to be used for various therapy and research purposes. Therefore, we introduce GASICA, an application aimed at the automated control of the stress state in a multitude of therapy and research paradigms. The application consists of three components: a digital stressor game, a set of measurement devices, and a feedback model. These three components form a closed loop (called a biocybernetic loop by Pope et al. (1995) and Fairclough (2009)) that continuously presents an acute psychological stressor, measures several physiological responses to this stressor, and adjusts the stressor intensity based on these measurements by means of the feedback model, thereby aiming to control the stress state. In this manner GASICA presents multidimensional and ecologically valid stressors, whilst remaining continuously in control of the form and intensity of the presented stressors, aiming at the automated control of the stress state. Furthermore, the application is designed as a modular open-source application that easily implements different therapy and research tasks using a high-level programming interface and configuration file, and allows for the addition of (existing) measurement equipment, making it usable for various paradigms. PMID:25538554

  20. Modular high power diode lasers with flexible 3D multiplexing arrangement optimized for automated manufacturing

    NASA Astrophysics Data System (ADS)

    Könning, Tobias; Bayer, Andreas; Plappert, Nora; Faßbender, Wilhelm; Dürsch, Sascha; Küster, Matthias; Hubrich, Ralf; Wolf, Paul; Köhler, Bernd; Biesenbach, Jens

    2018-02-01

    A novel 3-dimensional arrangement of mirrors is used to re-arrange beams from 1-D and 2-D high power diode laser arrays. The approach allows for a variety of stacking geometries, depending on individual requirements. While basic building blocks, including collimating optics, always remain the same, most adaptations can be realized by simple rearrangement of a few optical components. Due to fully automated alignment processes, the required changes can be realized in software by changing coordinates, rather than requiring customized mechanical components. This approach minimizes development costs due to its flexibility, while reducing overall product cost by using similar building blocks for a variety of products and utilizing a high grade of automation. The modules can be operated with industrial grade water, lowering overall system and maintenance cost. Stackable macro coolers are used as the smallest building block of the system. Each cooler can hold up to five diode laser bars. Micro-optical components, collimating the beam, are mounted directly to the cooler. All optical assembly steps are fully automated. Initially, the beams from all laser bars propagate in the same direction. Key to the concept is an arrangement of deflectors, which re-arrange the beams into a 2-D array of the desired shape and high fill factor. Standard multiplexing techniques like polarization or wavelength multiplexing have been implemented as well. A variety of fiber coupled modules ranging from a few hundred watts of optical output power to multiple kilowatts of power, as well as customized laser spot geometries like uniform line sources, have been realized.

  1. A Modular System of Interfacing Microcomputers.

    ERIC Educational Resources Information Center

    Martin, Peter

    1983-01-01

    Describes a system of interfacing allowing a range of signal conditioning and control modules to be connected to microcomputers, enabling execution of such experiments as: examining rate of cooling; control by light-activated switch; pH measurements; control frequency of signal generators; and making automated measurements of frequency response of…

  2. Method and apparatus for automated, modular, biomass power generation

    DOEpatents

    Diebold, James P; Lilley, Arthur; Browne, III, Kingsbury; Walt, Robb Ray; Duncan, Dustin; Walker, Michael; Steele, John; Fields, Michael; Smith, Trevor

    2013-11-05

    Method and apparatus for generating a low tar, renewable fuel gas from biomass and using it in other energy conversion devices, many of which were designed for use with gaseous and liquid fossil fuels. An automated, downdraft gasifier incorporates extensive air injection into the char bed to maintain the conditions that promote the destruction of residual tars. The resulting fuel gas and entrained char and ash are cooled in a special heat exchanger, and then continuously cleaned in a filter prior to usage in standalone as well as networked power systems.

  3. Method and apparatus for automated, modular, biomass power generation

    DOEpatents

    Diebold, James P [Lakewood, CO; Lilley, Arthur [Finleyville, PA; Browne, Kingsbury III [Golden, CO; Walt, Robb Ray [Aurora, CO; Duncan, Dustin [Littleton, CO; Walker, Michael [Longmont, CO; Steele, John [Aurora, CO; Fields, Michael [Arvada, CO; Smith, Trevor [Lakewood, CO

    2011-03-22

    Method and apparatus for generating a low tar, renewable fuel gas from biomass and using it in other energy conversion devices, many of which were designed for use with gaseous and liquid fossil fuels. An automated, downdraft gasifier incorporates extensive air injection into the char bed to maintain the conditions that promote the destruction of residual tars. The resulting fuel gas and entrained char and ash are cooled in a special heat exchanger, and then continuously cleaned in a filter prior to usage in standalone as well as networked power systems.

  4. Automated fiber pigtailing machine

    DOEpatents

    Strand, O.T.; Lowry, M.E.

    1999-01-05

    The Automated Fiber Pigtailing Machine (AFPM) aligns and attaches optical fibers to optoelectronic (OE) devices such as laser diodes, photodiodes, and waveguide devices without operator intervention. The so-called pigtailing process is completed with sub-micron accuracies in less than 3 minutes. The AFPM operates unattended for one hour, is modular in design and is compatible with a mass production manufacturing environment. This machine can be used to build components which are used in military aircraft navigation systems, computer systems, communications systems and in the construction of diagnostics and experimental systems. 26 figs.

  5. Advanced I&C for Fault-Tolerant Supervisory Control of Small Modular Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, Daniel G.

    In this research, we have developed a supervisory control approach to enable automated control of SMRs. By design the supervisory control system has a hierarchical, interconnected, adaptive control architecture. A considerable advantage of this architecture is that it allows subsystems to communicate at different/finer granularity, facilitates monitoring of processes at the modular and plant levels, and enables supervisory control. We have investigated the deployment of automation, monitoring, and data collection technologies to enable operation of multiple SMRs. Each unit's controller collects and transfers information from local loops and optimizes that unit's parameters. Information is passed from each SMR unit controller to the supervisory controller, which supervises the actions of the SMR units and manages plant processes. The information processed at the supervisory level will provide operators with the information needed for reactor, unit, and plant operation. In conjunction with the supervisory effort, we have investigated techniques for fault-tolerant networks, over which information is transmitted between local loops and the supervisory controller to maintain a safe level of operational normalcy in the presence of anomalies. The fault-tolerance of the supervisory control architecture, the network that supports it, and the impact of fault-tolerance on multi-unit SMR plant control has been a second focus of this research. To this end, we have investigated the deployment of advanced automation, monitoring, and data collection and communications technologies to enable operation of multiple SMRs. We have created a fault-tolerant multi-unit SMR supervisory controller that collects and transfers information from local loops, supervises their actions, and adaptively optimizes the controller parameters. The goal of this research has been to develop the methodologies and procedures for fault-tolerant supervisory control of small modular reactors. To achieve this goal, we have identified the following objectives, which form an ordered approach to the research: (I) development of a supervisory digital I&C system; (II) fault-tolerance of the supervisory control architecture; (III) automated decision making and online monitoring.
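    The hierarchical information flow described here, unit controllers reporting state upward and a supervisor issuing setpoints downward with a fail-safe fallback when a unit's data goes stale, can be sketched in a few lines. The Python below is an illustrative toy under our own assumptions; the numbers, field names and fallback policy are hypothetical, not from the project.

```python
# Sketch of a fault-tolerant supervisory layer: units report state with a
# timestamp; the supervisor splits plant demand across units whose reports
# are fresh, while a stale report (e.g. a network fault) makes that unit
# fall back to a safe local setpoint. All values are hypothetical.
import time

SAFE_POWER = 0.6        # fallback setpoint if the supervisor loses a unit
STALE_AFTER = 2.0       # seconds before a report is considered stale

def supervise(reports, demand, now):
    """Return a per-unit power setpoint (fraction of full power)."""
    fresh = {u: r for u, r in reports.items() if now - r["t"] < STALE_AFTER}
    setpoints = {u: SAFE_POWER for u in reports}   # default: fail safe
    if fresh:
        share = demand / len(fresh)
        for u in fresh:
            setpoints[u] = min(1.0, share)         # distribute demand over fresh units
    return setpoints

now = time.time()
reports = {
    "unit1": {"power": 0.8, "t": now - 0.5},   # fresh
    "unit2": {"power": 0.7, "t": now - 5.0},   # stale -> fallback
    "unit3": {"power": 0.9, "t": now - 1.0},   # fresh
}
print(supervise(reports, demand=1.6, now=now))
```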

  6. ThunderSTORM: a comprehensive ImageJ plug-in for PALM and STORM data analysis and super-resolution imaging

    PubMed Central

    Ovesný, Martin; Křížek, Pavel; Borkovec, Josef; Švindrych, Zdeněk; Hagen, Guy M.

    2014-01-01

    Summary: ThunderSTORM is an open-source, interactive and modular plug-in for ImageJ designed for automated processing, analysis and visualization of data acquired by single-molecule localization microscopy methods such as photo-activated localization microscopy and stochastic optical reconstruction microscopy. ThunderSTORM offers an extensive collection of processing and post-processing methods so that users can easily adapt the process of analysis to their data. ThunderSTORM also offers a set of tools for creation of simulated data and quantitative performance evaluation of localization algorithms using Monte Carlo simulations. Availability and implementation: ThunderSTORM and the online documentation are both freely accessible at https://code.google.com/p/thunder-storm/ Contact: guy.hagen@lf1.cuni.cz Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24771516

  7. Power subsystem automation study

    NASA Technical Reports Server (NTRS)

    Imamura, M. S.; Moser, R. L.; Veatch, M.

    1983-01-01

    Generic power-system elements and their potential faults are identified. Automation functions and their resulting benefits are defined, and automation functions are partitioned between the power subsystem, the central spacecraft computer, and ground flight-support personnel. All automation activities were categorized as data handling, monitoring, routine control, fault handling, planning and operations, or anomaly handling. Incorporation of all these classes of tasks, except for anomaly handling, in power subsystem hardware and software was concluded to be mandatory to meet the design and operational requirements of the space station. The key drivers are long mission lifetime, modular growth, high-performance flexibility, the need to accommodate different electrical user-load equipment, on-orbit assembly/maintenance/servicing, and a potentially large number of power subsystem components. A significant effort in algorithm development and validation is essential to meeting the 1987 technology readiness date for the space station.

  8. Inventory management and reagent supply for automated chemistry.

    PubMed

    Kuzniar, E

    1999-08-01

    Developments in automated chemistry have kept pace with developments in HTS such that hundreds of thousands of new compounds can be rapidly synthesized in the belief that the greater the number and diversity of compounds that can be screened, the more successful HTS will be. The increasing use of automation for Multiple Parallel Synthesis (MPS) and the move to automated combinatorial library production is placing an overwhelming burden on the management of reagents. Although automation has improved the efficiency of the processes involved in compound synthesis, the bottleneck has shifted to ordering, collating and preparing reagents for automated chemistry resulting in loss of time, materials and momentum. Major efficiencies have already been made in the area of compound management for high throughput screening. Most of these efficiencies have been achieved with sophisticated library management systems using advanced engineering and data handling for the storage, tracking and retrieval of millions of compounds. The Automation Partnership has already provided many of the top pharmaceutical companies with modular automated storage, preparation and retrieval systems to manage compound libraries for high throughput screening. This article describes how these systems may be implemented to solve the specific problems of inventory management and reagent supply for automated chemistry.

  9. Modular Open-Source Software for Item Factor Analysis

    ERIC Educational Resources Information Center

    Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven M.

    2015-01-01

    This article introduces an item factor analysis (IFA) module for "OpenMx," a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation…

  10. Computer aided fixture design - A case based approach

    NASA Astrophysics Data System (ADS)

    Tanji, Shekhar; Raiker, Saiesh; Mathew, Arun Tom

    2017-11-01

    Automated fixture design plays an important role in process planning and in the integration of CAD and CAM. An automated fixture setup design system is developed in which, once fixturing surfaces and points are described, modular fixture components are automatically selected to generate fixture units and are placed into position so that assembly conditions are satisfied. In the past, various knowledge-based systems have been developed to implement CAFD in practice. In this paper, to obtain an acceptable automated machining fixture design, a case-based reasoning method with a purpose-built retrieval system is proposed. The Visual Basic (VB) programming language is integrated with the SolidWorks API (Application Programming Interface) module to improve the retrieval procedure and reduce computational time. These properties are incorporated in a numerical simulation to determine the best fit for practical use.
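
    A case-based reasoning retrieval step of this general kind can be sketched as a weighted similarity search over stored fixture cases. The feature names, weights, and the tiny case library below are invented for illustration; the paper's system works on SolidWorks geometry rather than toy dictionaries.

      # Score stored fixture cases by weighted feature similarity and
      # reuse the closest match (a hedged sketch, not the paper's code).
      CASE_LIBRARY = [
          {"name": "plate-3point", "faces": 6,  "holes": 4, "material": "steel"},
          {"name": "bracket-vise", "faces": 10, "holes": 2, "material": "aluminium"},
      ]

      WEIGHTS = {"faces": 0.5, "holes": 0.3, "material": 0.2}

      def similarity(query, case):
          s = 0.0
          for key, w in WEIGHTS.items():
              if key == "material":
                  s += w * (query[key] == case[key])   # exact-match feature
              else:
                  # normalized closeness of numeric features
                  s += w * (1 - abs(query[key] - case[key]) / max(query[key], case[key]))
          return s

      def retrieve(query):
          return max(CASE_LIBRARY, key=lambda c: similarity(query, c))

      print(retrieve({"faces": 8, "holes": 3, "material": "steel"})["name"])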

  11. Engineering the smart factory

    NASA Astrophysics Data System (ADS)

    Harrison, Robert; Vera, Daniel; Ahmad, Bilal

    2016-10-01

    The fourth industrial revolution promises to create what has been called the smart factory. The vision is that within such modular structured smart factories, cyber-physical systems monitor physical processes, create a virtual copy of the physical world and make decentralised decisions. This paper provides a view of this initiative from an automation systems perspective. In this context it considers how future automation systems might be effectively configured and supported through their lifecycles and how integration, application modelling, visualisation and reuse of such systems might be best achieved. The paper briefly describes limitations in current engineering methods, and new emerging approaches including the cyber physical systems (CPS) engineering tools being developed by the automation systems group (ASG) at Warwick Manufacturing Group, University of Warwick, UK.

  12. ACQ4: an open-source software platform for data acquisition and analysis in neurophysiology research

    PubMed Central

    Campagnola, Luke; Kratz, Megan B.; Manis, Paul B.

    2014-01-01

    The complexity of modern neurophysiology experiments requires specialized software to coordinate multiple acquisition devices and analyze the collected data. We have developed ACQ4, an open-source software platform for performing data acquisition and analysis in experimental neurophysiology. This software integrates the tasks of acquiring, managing, and analyzing experimental data. ACQ4 has been used primarily for standard patch-clamp electrophysiology, laser scanning photostimulation, multiphoton microscopy, intrinsic imaging, and calcium imaging. The system is highly modular, which facilitates the addition of new devices and functionality. The modules included with ACQ4 provide for rapid construction of acquisition protocols, live video display, and customizable analysis tools. Position-aware data collection allows automated construction of image mosaics and registration of images with 3-dimensional anatomical atlases. ACQ4 uses free and open-source tools including Python, NumPy/SciPy for numerical computation, PyQt for the user interface, and PyQtGraph for scientific graphics. Supported hardware includes cameras, patch clamp amplifiers, scanning mirrors, lasers, shutters, Pockels cells, motorized stages, and more. ACQ4 is available for download at http://www.acq4.org. PMID:24523692

  13. Computer system for scanning tunneling microscope automation

    NASA Astrophysics Data System (ADS)

    Aguilar, M.; García, A.; Pascual, P. J.; Presa, J.; Santisteban, A.

    1987-03-01

    A computerized system for the automation of a scanning tunneling microscope is presented. It is based on an IBM personal computer (PC), either an XT or an AT, which performs the control, data acquisition and storage operations, displays the STM "images" in real time, and provides image processing tools for the restoration and analysis of data. It supports different data acquisition and control cards and image display cards. The software has been designed in a modular way to allow the replacement of these cards and other equipment improvements, as well as the inclusion of user routines for data analysis.

  14. Modular Heat Exchanger With Integral Heat Pipe

    NASA Technical Reports Server (NTRS)

    Schreiber, Jeffrey G.

    1992-01-01

    Modular heat exchanger with integral heat pipe transports heat from a source to a Stirling engine. An alternative to heat exchangers that depend on the integrity of thousands of brazed joints, it contains only 40 brazed tubes.

  15. Boron-selective reactions as powerful tools for modular synthesis of diverse complex molecules.

    PubMed

    Xu, Liang; Zhang, Shuai; Li, Pengfei

    2015-12-21

    In the context of modular and rapid construction of molecular diversity and complexity for applications in organic synthesis, biomedical and materials sciences, a generally useful strategy has emerged based on boron-selective chemical transformations. In the last decade, these types of reactions have evolved from proof-of-concept to some advanced applications in the efficient preparation of complex natural products and even automated precise manufacturing on the molecular level. These advances have shown the great potential of boron-selective reactions in simplifying synthetic design and experimental operations, and should inspire new developments in related chemical and technological areas. This tutorial review will highlight the original contributions and representative advances in this emerging field.

  16. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data

    PubMed Central

    Jaitly, Navdeep; Mayampurath, Anoop; Littlefield, Kyle; Adkins, Joshua N; Anderson, Gordon A; Smith, Richard D

    2009-01-01

    Background: Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high-performance chart controls. Results: With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to identify and quantify peptides in the biological sample. Conclusion: Decon2LS is an efficient software package for discovering and visualizing features in proteomics studies that require automated interpretation of mass spectra. Besides being easy to use, fast, and reliable, Decon2LS is also open-source, which allows developers in the proteomics and bioinformatics communities to reuse and refine the algorithms to meet individual needs. Decon2LS source code, installer, and tutorials may be downloaded free of charge at http://ncrr.pnl.gov/software/. PMID:19292916
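
    The "modular, reusable processing classes" design can be sketched as a pipeline in which every stage exposes the same process() interface, so stages can be recombined into derivative applications. The stage names follow the abstract, but the internal logic and thresholds below are placeholder assumptions, not Decon2LS code.

      class PeakFinder:
          def process(self, spectrum):
              # keep local signals above a fixed intensity threshold (placeholder)
              return [(mz, i) for mz, i in spectrum if i > 100.0]

      class Deisotoper:
          def process(self, peaks):
              # collapse peaks spaced ~1 Da apart into one monoisotopic feature
              features, last_mz = [], None
              for mz, i in sorted(peaks):
                  if last_mz is None or mz - last_mz > 1.1:
                      features.append((mz, i))
                  last_mz = mz
              return features

      def run_pipeline(spectrum, stages):
          data = spectrum
          for stage in stages:          # each stage is reusable in isolation
              data = stage.process(data)
          return data

      spectrum = [(500.0, 250.0), (501.0, 120.0), (730.0, 90.0), (760.0, 400.0)]
      print(run_pipeline(spectrum, [PeakFinder(), Deisotoper()]))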

  17. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data.

    PubMed

    Jaitly, Navdeep; Mayampurath, Anoop; Littlefield, Kyle; Adkins, Joshua N; Anderson, Gordon A; Smith, Richard D

    2009-03-17

    Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high-performance chart controls. With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to identify and quantify peptides in the biological sample. Decon2LS is an efficient software package for discovering and visualizing features in proteomics studies that require automated interpretation of mass spectra. Besides being easy to use, fast, and reliable, Decon2LS is also open-source, which allows developers in the proteomics and bioinformatics communities to reuse and refine the algorithms to meet individual needs. Decon2LS source code, installer, and tutorials may be downloaded free of charge at http://ncrr.pnl.gov/software/.

  18. Open-source image reconstruction of super-resolution structured illumination microscopy data in ImageJ

    PubMed Central

    Müller, Marcel; Mönkemöller, Viola; Hennig, Simon; Hübner, Wolfgang; Huser, Thomas

    2016-01-01

    Super-resolved structured illumination microscopy (SR-SIM) is an important tool for fluorescence microscopy. SR-SIM microscopes perform multiple image acquisitions with varying illumination patterns and reconstruct them into a super-resolved image. In its most frequent, linear implementation, SR-SIM doubles the spatial resolution. The reconstruction is performed numerically on the acquired wide-field image data and thus relies on a software implementation of specific SR-SIM image reconstruction algorithms. We present fairSIM, an easy-to-use plugin that provides SR-SIM reconstructions for a wide range of SR-SIM platforms directly within ImageJ. For research groups developing their own implementations of super-resolution structured illumination microscopy, fairSIM takes away the hurdle of generating yet another implementation of the reconstruction algorithm. For users of commercial microscopes, it offers an additional, in-depth analysis option for their data independent of specific operating systems. As a modular, open-source solution, fairSIM can easily be adapted, automated and extended as the field of SR-SIM progresses. PMID:26996201

  19. Modular Filter and Source-Management Upgrade of RADAC

    NASA Technical Reports Server (NTRS)

    Lanzi, R. James; Smith, Donna C.

    2007-01-01

    In an upgrade of the Range Data Acquisition Computer (RADAC) software, a modular software object library was developed to implement required functionality for filtering of flight-vehicle-tracking data and management of tracking-data sources. (The RADAC software is used to process flight-vehicle metric data for realtime display in the Wallops Flight Facility Range Control Center and Mobile Control Center.)

  20. Medical Data Architecture Project Capabilities and Design

    NASA Technical Reports Server (NTRS)

    Middour, C.; Krihak, M.; Lindsey, A.; Marker, N.; Wolfe, S.; Winther, S.; Ronzano, K.; Bolles, D.; Toscano, W.; Shaw, T.

    2017-01-01

    Mission constraints will challenge the delivery of medical care on a long-term, deep space exploration mission. This type of mission will be restricted in the availability of medical knowledge, skills, procedures and resources to prevent, diagnose, and treat in-flight medical events. Challenges to providing medical care are anticipated, including resource and resupply constraints, delayed communications and no ability for medical evacuation. The Medical Data Architecture (MDA) project will enable medical care capability in this constrained environment. The first version of the system, called "Test Bed 1," includes capabilities for automated data collection, data storage and data retrieval to provide information to the Crew Medical Officer (CMO). Test Bed 1 seeks to establish a data architecture foundation and develop a scalable data management system through modular design and standardized interfaces. In addition, it will demonstrate to stakeholders the potential for an improved, automated, flow of data to and from the medical system over the current methods employed on the International Space Station (ISS). It integrates a set of external devices, software and processes, and a Subjective, Objective, Assessment, and Plan (SOAP) note commonly used by clinicians. Medical data like electrocardiogram plots, heart rate, skin temperature, respiration rate, medications taken, and more are collected from devices and stored in the Electronic Medical Records (EMR) system, and reported to crew and clinician. Devices integrated include the Astroskin biosensor vest and IMED CARDIAX electrocardiogram (ECG) device with INEED MD ECG Glove, and the NASA-developed Medical Dose Tracker application. The system is designed to be operated as a standalone system, and can be deployed in a variety of environments, from a laptop to a data center. The system is primarily composed of open-source software tools, and is designed to be modular, so new capabilities can be added. The software components and integration methods will be discussed.

  1. Automated multiplex genome-scale engineering in yeast

    PubMed Central

    Si, Tong; Chao, Ran; Min, Yuhao; Wu, Yuying; Ren, Wen; Zhao, Huimin

    2017-01-01

    Genome-scale engineering is indispensable in understanding and engineering microorganisms, but the current tools are mainly limited to bacterial systems. Here we report an automated platform for multiplex genome-scale engineering in Saccharomyces cerevisiae, an important eukaryotic model and widely used microbial cell factory. Standardized genetic parts encoding overexpression and knockdown mutations of >90% yeast genes are created in a single step from a full-length cDNA library. With the aid of CRISPR-Cas, these genetic parts are iteratively integrated into the repetitive genomic sequences in a modular manner using robotic automation. This system allows functional mapping and multiplex optimization on a genome scale for diverse phenotypes including cellulase expression, isobutanol production, glycerol utilization and acetic acid tolerance, and may greatly accelerate future genome-scale engineering endeavours in yeast. PMID:28469255

  2. [The application of new technologies to hospital pharmacy in Spain].

    PubMed

    Bermejo Vicedo, T; Pérez Menéndez Conde, C; Alvarez, Ana; Codina, Carlos; Delgado, Olga; Herranz, Ana; Hidalgo Correas, Francisco; Martín, Isabel; Martínez, Julio; Luis Poveda, José; Queralt Gorgas, María; Sanjurjo Sáez, María

    2007-01-01

    To describe the degree of introduction of new technologies into the medication-use process in pharmacy services in Spain. A descriptive study was conducted via a survey on the degree of introduction of computer systems for management, computerized physician order entry (CPOE), automated unit dose drug dispensing, preparation of parenteral nutrition solutions, recording of drug administration, pharmaceutical care, and foreseen improvements. The survey was sent by electronic mail to the heads of the pharmacy services of 207 hospitals throughout Spain. Response rate: 82 hospitals (38.6%). 29 hospitals (36.7%) have a modular management system, 24 (30.4%) an integrated one, and 34 (44.9%) a modular-integrated one. CPOE is utilised in 17 (22.4%). According to the size of the hospital, between 17.9 and 26.7% of unit dose dispensing is done online with management software; between 5.1 and 33.3% of unit dose dispensing is automated. Automation of unit dose dispensing centred in the pharmacy service varies between 10 and 33.3%. Between 13.2 and 35.7% of hospitals utilise automated in-ward dispensing systems. Administration records are kept manually on a computerised sheet at 23 (31.5%) of the hospitals, at 4 (5.4%) via CPOE, at 7 (9.5%) online in the integral management programme, and at 4 (5.4%) in specific nursing software. Sixty-three per cent foresee the implementation of improvements in the short to medium term. New technologies are being introduced in Spain with the aim of improving the safety and management of drugs, and there is a trend towards increasing their deployment in the near future. It is hoped that promoting them will help bring about process re-engineering within pharmacy services in order to increase the time available for pharmaceutical care.

  3. Lutetium-177 DOTATATE Production with an Automated Radiopharmaceutical Synthesis System.

    PubMed

    Aslani, Alireza; Snowdon, Graeme M; Bailey, Dale L; Schembri, Geoffrey P; Bailey, Elizabeth A; Pavlakis, Nick; Roach, Paul J

    2015-01-01

    Peptide Receptor Radionuclide Therapy (PRRT) with yttrium-90 ((90)Y)- and lutetium-177 ((177)Lu)-labelled SST analogues is now a therapy option for patients who have failed to respond to conventional medical therapy. In-house production with automated PRRT synthesis systems has clear advantages over manual methods, resulting in increasing use in hospital-based radiopharmacies. We report on our one-year experience with an automated radiopharmaceutical synthesis system. All syntheses were carried out using the Eckert & Ziegler Eurotope's Modular-Lab Pharm Tracer® automated synthesis system. All materials and methods were used as instructed by the manufacturer of the system (Eckert & Ziegler Eurotope, Berlin, Germany). Sterile, GMP-certified, no-carrier-added (NCA) (177)Lu was used with GMP-certified peptide. An audit trail was also produced and saved by the system. The quality of the final product was assessed after each synthesis by ITLC-SG and HPLC methods. A total of 17 [(177)Lu]-DOTATATE syntheses were performed between August 2013 and December 2014. The amount of radioactive [(177)Lu]-DOTATATE produced by each synthesis varied between 10 and 40 GBq, depending on the number of patients being treated on a given day. Thirteen individuals received a total of 37 individual treatment administrations in this period. There were no issues or failures with the system or the synthesis cassettes. The average radiochemical purity as determined by ITLC was above 99% (99.8 ± 0.05%), and the average radiochemical purity as determined by HPLC was above 97% (97.3 ± 1.5%) for this period. The automated synthesis of [(177)Lu]-DOTATATE using the Eckert & Ziegler Eurotope's Modular-Lab Pharm Tracer® system is a robust, convenient and high-yield approach to the radiolabelling of DOTATATE peptide, benefiting from the use of NCA (177)Lu and almost negligible radiation exposure of the operators.

  4. System for Automated Geoscientific Analyses (SAGA) v. 2.1.4

    NASA Astrophysics Data System (ADS)

    Conrad, O.; Bechtel, B.; Bock, M.; Dietrich, H.; Fischer, E.; Gerlitz, L.; Wehberg, J.; Wichmann, V.; Böhner, J.

    2015-02-01

    The System for Automated Geoscientific Analyses (SAGA) is an open-source Geographic Information System (GIS), mainly licensed under the GNU General Public License. Since its first release in 2004, SAGA has rapidly developed from a specialized tool for digital terrain analysis to a comprehensive and globally established GIS platform for scientific analysis and modeling. SAGA is coded in C++ in an object oriented design and runs under several operating systems including Windows and Linux. Key functional features of the modular organized software architecture comprise an application programming interface for the development and implementation of new geoscientific methods, an easily approachable graphical user interface with many visualization options, a command line interpreter, and interfaces to scripting and low level programming languages like R and Python. The current version 2.1.4 offers more than 700 tools, which are implemented in dynamically loadable libraries or shared objects and represent the broad scopes of SAGA in numerous fields of geoscientific endeavor and beyond. In this paper, we inform about the system's architecture, functionality, and its current state of development and implementation. Further, we highlight the wide spectrum of scientific applications of SAGA in a review of published studies with special emphasis on the core application areas digital terrain analysis, geomorphology, soil science, climatology and meteorology, as well as remote sensing.

  5. System for Automated Geoscientific Analyses (SAGA) v. 2.1.4

    NASA Astrophysics Data System (ADS)

    Conrad, O.; Bechtel, B.; Bock, M.; Dietrich, H.; Fischer, E.; Gerlitz, L.; Wehberg, J.; Wichmann, V.; Böhner, J.

    2015-07-01

    The System for Automated Geoscientific Analyses (SAGA) is an open source geographic information system (GIS), mainly licensed under the GNU General Public License. Since its first release in 2004, SAGA has rapidly developed from a specialized tool for digital terrain analysis to a comprehensive and globally established GIS platform for scientific analysis and modeling. SAGA is coded in C++ in an object oriented design and runs under several operating systems including Windows and Linux. Key functional features of the modular software architecture comprise an application programming interface for the development and implementation of new geoscientific methods, a user friendly graphical user interface with many visualization options, a command line interpreter, and interfaces to interpreted languages like R and Python. The current version 2.1.4 offers more than 600 tools, which are implemented in dynamically loadable libraries or shared objects and represent the broad scopes of SAGA in numerous fields of geoscientific endeavor and beyond. In this paper, we inform about the system's architecture, functionality, and its current state of development and implementation. Furthermore, we highlight the wide spectrum of scientific applications of SAGA in a review of published studies, with special emphasis on the core application areas digital terrain analysis, geomorphology, soil science, climatology and meteorology, as well as remote sensing.

  6. Teleoperated Modular Robots for Lunar Operations

    NASA Technical Reports Server (NTRS)

    Globus, Al; Hornby, Greg; Larchev, Greg; Hancher, Matt; Cannon, Howard; Lohn, Jason

    2004-01-01

    Solar system exploration is currently carried out by special purpose robots exquisitely designed for the anticipated tasks. However, all contingencies for in situ resource utilization (ISRU), human habitat preparation, and exploration will be difficult to anticipate. Furthermore, developing the necessary special purpose mechanisms for deployment and other capabilities is difficult and error prone. For example, the Galileo high gain antenna never opened, severely restricting the quantity of data returned by the spacecraft. Also, deployment hardware is used only once. To address these problems, we are developing teleoperated modular robots for lunar missions, including operations in transit from Earth. Teleoperation of lunar systems from Earth involves a three-second speed-of-light delay, but experiment suggests that interactive operations are feasible. Modular robots typically consist of many identical modules that pass power and data between them and can be reconfigured for different tasks, providing great flexibility, inherent redundancy and graceful degradation as modules fail. Our design features a number of different hub, link, and joint modules to simplify the individual modules, lower structure cost, and provide specialized capabilities. Modular robots are well suited for space applications because of their extreme flexibility, inherent redundancy, high-density packing, and opportunities for mass production. Simple structural modules can be manufactured from lunar regolith in situ using molds or directed solar sintering. Software to direct and control modular robots is difficult to develop. We have used genetic algorithms to evolve both the morphology and control system for walking modular robots. We are currently using evolvable system technology to evolve controllers for modular robots in the ISS glove box. Development of lunar modular robots will require software and physical simulators, including regolith simulation, to enable design and test of robot software and hardware, particularly automation software. Ready access to these simulators could provide opportunities for contest-driven development a la RoboCup (http://www.robocup.org/). Licensing of module designs could provide opportunities in the toy market and for spin-off applications.

  7. A Modular and Extensible Architecture Integrating Sensors, Dynamic Displays of Anatomy and Physiology, and Automated Instruction for Innovations in Clinical Education

    ERIC Educational Resources Information Center

    Nelson, Douglas Allen, Jr.

    2017-01-01

    Adoption of simulation in healthcare education has increased tremendously over the past two decades. However, the resources necessary to perform simulation are immense. Simulators are large capital investments and require specialized training for both instructors and simulation support staff to develop curriculum using the simulator and to use the…

  8. A modular suite of hardware enabling spaceflight cell culture research

    NASA Technical Reports Server (NTRS)

    Hoehn, Alexander; Klaus, David M.; Stodieck, Louis S.

    2004-01-01

    BioServe Space Technologies, a NASA Research Partnership Center (RPC), has developed and operated various middeck payloads launched on 23 shuttle missions since 1991 in support of commercial space biotechnology projects. Modular cell culture systems are contained within the Commercial Generic Bioprocessing Apparatus (CGBA) suite of flight-qualified hardware, compatible with Space Shuttle, SPACEHAB, Spacelab and International Space Station (ISS) EXPRESS Rack interfaces. As part of the CGBA family, the Isothermal Containment Module (ICM) incubator provides thermal control, data acquisition and experiment manipulation capabilities, including accelerometer launch detection for automated activation and thermal profiling for culture incubation and sample preservation. The ICM can accommodate up to 8 individually controlled temperature zones. Command and telemetry capabilities allow real-time downlink of data and video permitting remote payload operation and ground control synchronization. Individual cell culture experiments can be accommodated in a variety of devices ranging from 'microgravity test tubes' or standard 100 mm Petri dishes, to complex, fed-batch bioreactors with automated culture feeding, waste removal and multiple sample draws. Up to 3 levels of containment can be achieved for chemical fixative addition, and passive gas exchange can be provided through hydrophobic membranes. Many additional options exist for designing customized hardware depending on specific science requirements.

  9. Fully chip-embedded automation of a multi-step lab-on-a-chip process using a modularized timer circuit.

    PubMed

    Kang, Junsu; Lee, Donghyeon; Heo, Young Jin; Chung, Wan Kyun

    2017-11-07

    For highly integrated microfluidic systems, an actuation system is necessary to control the flow; however, the bulk of actuation devices, including pumps and valves, has impeded the broad application of integrated microfluidic systems. Here, we suggest a microfluidic process control method based on built-in microfluidic circuits. The circuit is composed of a fluidic timer circuit and a pneumatic logic circuit. The fluidic timer circuit is a serial connection of modularized timer units, which sequentially pass high pressure to the pneumatic logic circuit. The pneumatic logic circuit is a NOR-gate array designed to control the liquid-handling process. By using the timer circuit as a built-in signal generator, multi-step processes can be performed entirely inside the microchip without any external controller. The timer circuit uses only two valves per unit, and the number of process steps can be extended without limitation by adding timer units. As a demonstration, an automation chip has been designed for a six-step droplet treatment, which entails 1) loading, 2) separation, 3) reagent injection, 4) incubation, 5) clearing and 6) unloading. Each process was successfully performed for a pre-defined step time without any external control device.
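
    In software terms, the serial timer chain behaves like the following Python sketch: each timer unit waits out its delay and then passes the trigger on, advancing the droplet process one step. The step names come from the abstract; the delay values and the print stand-in for the pneumatic logic are assumptions.

      import time

      STEPS = ["loading", "separation", "reagent injection",
               "incubation", "clearing", "unloading"]

      def timer_chain(steps, delays):
          for step, delay in zip(steps, delays):
              time.sleep(delay)            # timer unit charging (stand-in for fluidics)
              print(f"trigger -> {step}")  # high pressure passed to the logic circuit

      timer_chain(STEPS, delays=[0.1] * 6)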

  10. An Adaptive Web-Based Support to e-Education in Robotics and Automation

    NASA Astrophysics Data System (ADS)

    di Giamberardino, Paolo; Temperini, Marco

    The paper presents the hardware and software architecture of a remote laboratory, with robotics and automation applications, devised to support e-teaching and e-learning activities at the undergraduate level in computer engineering. The hardware is composed of modular structures based on the Lego Mindstorms components: they are reasonably sophisticated in terms of functions, quite easy to use, and sufficiently affordable in terms of cost. Moreover, since the robots are intrinsically modular with respect to the number and distribution of sensors and actuators, they can be reconfigured easily and quickly. A web application makes the laboratory and its robots available via the internet. The software framework allows the teacher to define, for the course under her/his responsibility, a learning path made of different and differently complex exercises, graduated in terms of the "difficulty" they pose and the "competence" that the solver is supposed to have shown. The learning path of exercises is adapted to the individual learner's progressively growing competence: at any moment, only a subset of the exercises is available, depending on how close their levels of competence and difficulty are to those of the exercises already solved by the learner.

  11. An automated Genomes-to-Natural Products platform (GNP) for the discovery of modular natural products.

    PubMed

    Johnston, Chad W; Skinnider, Michael A; Wyatt, Morgan A; Li, Xiang; Ranieri, Michael R M; Yang, Lian; Zechel, David L; Ma, Bin; Magarvey, Nathan A

    2015-09-28

    Bacterial natural products are a diverse and valuable group of small molecules, and genome sequencing indicates that the vast majority remain undiscovered. The prediction of natural product structures from biosynthetic assembly lines can facilitate their discovery, but highly automated, accurate, and integrated systems are required to mine the broad spectrum of sequenced bacterial genomes. Here we present a genome-guided natural products discovery tool to automatically predict, combinatorialize and identify polyketides and nonribosomal peptides from biosynthetic assembly lines using LC-MS/MS data of crude extracts in a high-throughput manner. We detail the directed identification and isolation of six genetically predicted polyketides and nonribosomal peptides using our Genome-to-Natural Products platform. This highly automated, user-friendly programme provides a means of realizing the potential of genetically encoded natural products.

  12. Modular approach to achieving the next-generation X-ray light source

    NASA Astrophysics Data System (ADS)

    Biedron, S. G.; Milton, S. V.; Freund, H. P.

    2001-12-01

    A modular approach to the next-generation light source is described. The "modules" include photocathode, radio-frequency, electron guns and their associated drive-laser systems, linear accelerators, bunch-compression systems, seed laser systems, planar undulators, two-undulator harmonic generation schemes, high-gain harmonic generation systems, nonlinear higher harmonics, and wavelength shifting. These modules will be helpful in distributing the next-generation light source to many more laboratories than the current single-pass, high-gain free-electron laser designs permit, due to both monetary and/or physical space constraints.

  13. Flexible software architecture for user-interface and machine control in laboratory automation.

    PubMed

    Arutunian, E B; Meldrum, D R; Friedman, N A; Moody, S E

    1998-10-01

    We describe a modular, layered software architecture for automated laboratory instruments. The design consists of a sophisticated user interface, a machine controller and multiple individual hardware subsystems, each interacting through a client-server architecture built entirely on top of open Internet standards. In our implementation, the user-interface components are built as Java applets that are downloaded from a server integrated into the machine controller. The user-interface client can thereby provide laboratory personnel with a familiar environment for experiment design through a standard World Wide Web browser. Data management and security are seamlessly integrated at the machine-controller layer using QNX, a real-time operating system. This layer also controls hardware subsystems through a second client-server interface. This architecture has proven flexible and relatively easy to implement and allows users to operate laboratory automation instruments remotely through an Internet connection. The software architecture was implemented and demonstrated on the Acapella, an automated fluid-sample-processing system that is under development at the University of Washington.
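
    The layered client-server pattern described here can be sketched as follows: the user interface talks only to the machine controller, which is in turn the sole client of each hardware subsystem and owns data management. The classes and the pump example are illustrative stand-ins for the Java/QNX implementation, not its API.

      class PumpSubsystem:
          """Bottom layer: one hardware subsystem behind a server interface."""
          def handle(self, request):
              return f"pump: executed {request}"

      class MachineController:
          """Middle layer: dispatches requests and centralizes data management."""
          def __init__(self):
              self.subsystems = {"pump": PumpSubsystem()}
              self.log = []

          def submit(self, subsystem, request):
              result = self.subsystems[subsystem].handle(request)
              self.log.append(result)     # all data flows through this layer
              return result

      class UserInterface:
          """Top layer: what a browser-based client would present to the user."""
          def __init__(self, controller):
              self.controller = controller

          def run_experiment(self):
              return self.controller.submit("pump", "dispense 10 uL")

      print(UserInterface(MachineController()).run_experiment())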

  14. A new highly automated sputter equipment for in situ investigation of deposition processes with synchrotron radiation.

    PubMed

    Döhrmann, Ralph; Botta, Stephan; Buffet, Adeline; Santoro, Gonzalo; Schlage, Kai; Schwartzkopf, Matthias; Bommel, Sebastian; Risch, Johannes F H; Mannweiler, Roman; Brunner, Simon; Metwalli, Ezzeldin; Müller-Buschbaum, Peter; Roth, Stephan V

    2013-04-01

    HASE (Highly Automated Sputter Equipment) is a new mobile setup developed to investigate deposition processes with synchrotron radiation. HASE is based on an ultra-high vacuum sputter deposition chamber equipped with an in-vacuum sample pick-and-place robot. This enables a fast and reliable sample change without breaking the vacuum conditions and helps to save valuable measurement time, which is required for experiments at synchrotron sources like PETRA III at DESY. An advantageous arrangement of several sputter guns, mounted on a rotative flange, makes it possible to sputter under different deposition angles or to sputter different materials onto the same substrate. The chamber is also equipped with a modular sample stage, which allows for the integration of different sample environments, such as a sample heating and cooling device. The design of HASE is unique in its flexibility. The combination of several sputtering methods, such as standard deposition, glancing angle deposition, and high pressure sputter deposition, together with heating and cooling possibilities for the sample, the large exit windows, and the degree of automation, facilitates many different grazing incidence X-ray scattering experiments, such as grazing incidence small and wide angle X-ray scattering, in one setup. In this paper we describe in detail the design and the performance of the new equipment and present the installation of the HASE apparatus at the Micro and Nano focus X-ray Scattering beamline (MiNaXS) at PETRA III. Furthermore, we describe the measurement options and present some selected results. The HASE setup has been successfully commissioned and is now available for users.

  15. A new highly automated sputter equipment for in situ investigation of deposition processes with synchrotron radiation

    NASA Astrophysics Data System (ADS)

    Döhrmann, Ralph; Botta, Stephan; Buffet, Adeline; Santoro, Gonzalo; Schlage, Kai; Schwartzkopf, Matthias; Bommel, Sebastian; Risch, Johannes F. H.; Mannweiler, Roman; Brunner, Simon; Metwalli, Ezzeldin; Müller-Buschbaum, Peter; Roth, Stephan V.

    2013-04-01

    HASE (Highly Automated Sputter Equipment) is a new mobile setup developed to investigate deposition processes with synchrotron radiation. HASE is based on an ultra-high vacuum sputter deposition chamber equipped with an in-vacuum sample pick-and-place robot. This enables a fast and reliable sample change without breaking the vacuum conditions and helps to save valuable measurement time, which is required for experiments at synchrotron sources like PETRA III at DESY. An advantageous arrangement of several sputter guns, mounted on a rotative flange, makes it possible to sputter under different deposition angles or to sputter different materials onto the same substrate. The chamber is also equipped with a modular sample stage, which allows for the integration of different sample environments, such as a sample heating and cooling device. The design of HASE is unique in its flexibility. The combination of several sputtering methods, such as standard deposition, glancing angle deposition, and high pressure sputter deposition, together with heating and cooling possibilities for the sample, the large exit windows, and the degree of automation, facilitates many different grazing incidence X-ray scattering experiments, such as grazing incidence small and wide angle X-ray scattering, in one setup. In this paper we describe in detail the design and the performance of the new equipment and present the installation of the HASE apparatus at the Micro and Nano focus X-ray Scattering beamline (MiNaXS) at PETRA III. Furthermore, we describe the measurement options and present some selected results. The HASE setup has been successfully commissioned and is now available for users.

  16. Records Management Handbook; Source Data Automation Equipment Guide.

    ERIC Educational Resources Information Center

    National Archives and Records Service (GSA), Washington, DC. Office of Records Management.

    A detailed guide to selecting appropriate source data automation equipment is presented. Source data automation equipment is used to prepare data for electronic data processing or computerized recordkeeping. The guide contains specifications, performance data, costs, and pictures of the major types of machines used in source data automation.…

  17. Akuna: An Open Source User Environment for Managing Subsurface Simulation Workflows

    NASA Astrophysics Data System (ADS)

    Freedman, V. L.; Agarwal, D.; Bensema, K.; Finsterle, S.; Gable, C. W.; Keating, E. H.; Krishnan, H.; Lansing, C.; Moeglein, W.; Pau, G. S. H.; Porter, E.; Scheibe, T. D.

    2014-12-01

    The U.S. Department of Energy (DOE) is investing in development of a numerical modeling toolset called ASCEM (Advanced Simulation Capability for Environmental Management) to support modeling analyses at legacy waste sites. ASCEM is an open source and modular computing framework that incorporates new advances and tools for predicting contaminant fate and transport in natural and engineered systems. The ASCEM toolset includes both a Platform with Integrated Toolsets (called Akuna) and a High-Performance Computing multi-process simulator (called Amanzi). The focus of this presentation is on Akuna, an open-source user environment that manages subsurface simulation workflows and associated data and metadata. In this presentation, key elements of Akuna are demonstrated, which includes toolsets for model setup, database management, sensitivity analysis, parameter estimation, uncertainty quantification, and visualization of both model setup and simulation results. A key component of the workflow is in the automated job launching and monitoring capabilities, which allow a user to submit and monitor simulation runs on high-performance, parallel computers. Visualization of large outputs can also be performed without moving data back to local resources. These capabilities make high-performance computing accessible to the users who might not be familiar with batch queue systems and usage protocols on different supercomputers and clusters.
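
    The automated job launching and monitoring capability can be sketched as a submit-then-poll loop. The subprocess call below stands in for a batch submission (for example, to an HPC queue) and is an assumption for illustration, not the Akuna interface.

      import subprocess, sys, time

      def submit(script):
          # stand-in for a batch submission such as "sbatch job.sh" on a cluster
          return subprocess.Popen([sys.executable, "-c", script])

      def monitor(proc, poll_seconds=0.5):
          while proc.poll() is None:       # None means the job is still running
              print("job running...")
              time.sleep(poll_seconds)
          return proc.returncode

      job = submit("print('simulation finished')")
      print("exit code:", monitor(job))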

  18. Modular magazine for suitable handling of microparts in industry

    NASA Astrophysics Data System (ADS)

    Grimme, Ralf; Schmutz, Wolfgang; Schlenker, Dirk; Schuenemann, Matthias; Stock, Achim; Schaefer, Wolfgang

    1998-01-01

    Microassembly and microadjustment techniques are key technologies in the industrial production of hybrid microelectromechanical systems. One focal point in current microproduction research and engineering is the design and development of high-precision microassembly and microadjustment equipment capable of operating within the framework of flexible automated industrial production. Alongside these developments, microassembly tools suitable for industrial use also need to be equipped with interfaces for the supply and delivery of microcomponents. The microassembly process necessitates the supply of microparts in a geometrically defined manner. In order to reduce processing steps and production costs, there is a demand for magazines capable of providing free accessibility to the fixed microcomponents. Commonly used at present are feeding techniques that originate from the field of semiconductor production. However, none of these techniques fully meets the requirements of industrial microassembly technology. A novel modular magazine set, developed and tested in a joint project, is presented here. The magazines are able to hold microcomponents during cleaning, inspection and assembly without any additional handling steps. The modularity of their design allows for maximum technical flexibility. The modular magazine conforms to currently practiced SEMI standards. The design and concept of the magazine enable industrial manufacturers to carry out cost-efficient and flexible precision assembly of microelectromechanical systems.

  19. EMMA: An Extensible Mammalian Modular Assembly Toolkit for the Rapid Design and Production of Diverse Expression Vectors.

    PubMed

    Martella, Andrea; Matjusaitis, Mantas; Auxillos, Jamie; Pollard, Steven M; Cai, Yizhi

    2017-07-21

    Mammalian plasmid expression vectors are critical reagents underpinning many facets of research across biology, biomedical research, and the biotechnology industry. Traditional cloning methods often require laborious manual design and assembly of plasmids using tailored sequential cloning steps. This process can be protracted, complicated, expensive, and error-prone. New tools and strategies that facilitate the efficient design and production of bespoke vectors would help relieve a current bottleneck for researchers. To address this, we have developed an extensible mammalian modular assembly kit (EMMA). This enables rapid and efficient modular assembly of mammalian expression vectors in a one-tube, one-step Golden Gate cloning reaction, using a standardized library of compatible genetic parts. The high modularity, flexibility, and extensibility of EMMA provide a simple method for the production of functionally diverse mammalian expression vectors. We demonstrate the value of this toolkit by constructing and validating a range of representative vectors, such as transient and stable expression vectors (transposon-based vectors), targeting vectors, inducible systems, polycistronic expression cassettes, fusion proteins, and fluorescent reporters. The method also supports simple assembly of combinatorial libraries and hierarchical assembly for the production of larger multigenetic cargos. In summary, EMMA is compatible with automated production, and novel genetic parts can be easily incorporated, providing new opportunities for mammalian synthetic biology.
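
    Combinatorial library assembly of the kind EMMA supports can be sketched as enumerating one part per position from a standardized library. The positions and part names below are invented for illustration and are not EMMA's part set.

      from itertools import product

      LIBRARY = {
          "promoter":   ["CMV", "EF1a"],
          "orf":        ["GFP", "mCherry"],
          "terminator": ["SV40"],
      }

      def combinatorial_vectors(library):
          # yield every vector built by choosing one part per position
          positions = list(library)
          for combo in product(*(library[p] for p in positions)):
              yield dict(zip(positions, combo))

      for vector in combinatorial_vectors(LIBRARY):
          print(vector)   # 2 x 2 x 1 = 4 candidate expression vectors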

  20. Fast estimation of space-robots inertia parameters: A modular mathematical formulation

    NASA Astrophysics Data System (ADS)

    Nabavi Chashmi, Seyed Yaser; Malaek, Seyed Mohammad-Bagher

    2016-10-01

    This work proposes a new technique that considerably improves the time and precision needed to identify the "Inertia Parameters" (IPs) of a typical Autonomous Space-Robot (ASR). Operations might include capturing an unknown Target Space-Object (TSO), "active space-debris removal", or "automated in-orbit assemblies". In these operations, generating precise successive commands is essential to the success of the mission. We show how a generalized, repeatable estimation process can play an effective role in managing the operation. With the help of the well-known force-based approach, a new "modular formulation" has been developed to simultaneously identify the IPs of an ASR while it captures a TSO. The idea is to reorganize the equations with the associated IPs into a "Modular Set" of matrices instead of a single matrix representing the overall system dynamics. The devised modular matrix set then facilitates the estimation process: it provides a conjugate linear model in the mass and inertia terms. The new formulation is therefore well suited to "simultaneous estimation processes" using recursive algorithms like RLS. Further enhancements would be needed for cases where the effect of the center-of-mass location becomes important. Extensive case studies reveal that estimation time is drastically reduced, which in turn paves the way to better results.
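
    As a sketch of the recursive estimation the abstract points to, the following Python/NumPy example runs a generic recursive least-squares (RLS) loop on a linear-in-parameters model. The regressors and the two "inertia" parameters are synthetic stand-ins, not the paper's modular matrix formulation.

      import numpy as np

      def rls_update(theta, P, phi, y, lam=1.0):
          """One RLS step: theta = parameter estimate, P = covariance,
          phi = regressor vector, y = measured output sample."""
          col = phi.reshape(-1, 1)
          K = P @ col / (lam + (col.T @ P @ col).item())   # gain vector
          theta = theta + K.flatten() * (y - phi @ theta)  # innovation update
          P = (P - K @ col.T @ P) / lam
          return theta, P

      rng = np.random.default_rng(0)
      true_params = np.array([12.0, 0.8])      # e.g. a mass and an inertia term
      theta, P = np.zeros(2), np.eye(2) * 1e3
      for _ in range(200):
          phi = rng.normal(size=2)             # regressor built from measured motion
          y = phi @ true_params + rng.normal(scale=0.01)
          theta, P = rls_update(theta, P, phi, y)
      print(theta)                             # converges near [12.0, 0.8]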

  1. The Effects of Race Conditions When Implementing Single-Source Redundant Clock Trees in Triple Modular Redundant Synchronous Architectures

    NASA Technical Reports Server (NTRS)

    Berg, Melanie D.; Kim, Hak S.; Phan, Anthony M.; Seidleck, Christina M.; Label, Kenneth A.; Pellish, Jonathan A.; Campola, Michael J.

    2016-01-01

    We present the challenges that arise when using redundant clock domains due to their time-skew. Radiation data show that a singular clock domain provides an improved triple modular redundant (TMR) scheme over redundant clocks.

  2. Modular and Spatially Explicit: A Novel Approach to System Dynamics

    EPA Science Inventory

    The Open Modeling Environment (OME) is an open-source System Dynamics (SD) simulation engine which has been created as a joint project between Oregon State University and the US Environmental Protection Agency. It is designed around a modular implementation, and provides a standa...

  3. Development and integration of a LabVIEW-based modular architecture for automated execution of electrochemical catalyst testing.

    PubMed

    Topalov, Angel A; Katsounaros, Ioannis; Meier, Josef C; Klemm, Sebastian O; Mayrhofer, Karl J J

    2011-11-01

    This paper describes a system for performing electrochemical catalyst testing where all hardware components are controlled simultaneously using a single LabVIEW-based software application. The software that we developed can be operated in both manual mode for exploratory investigations and automatic mode for routine measurements, by using predefined execution procedures. The latter enables the execution of high-throughput or combinatorial investigations, which decrease substantially the time and cost for catalyst testing. The software was constructed using a modular architecture which simplifies the modification or extension of the system, depending on future needs. The system was tested by performing stability tests of commercial fuel cell electrocatalysts, and the advantages of the developed system are discussed. © 2011 American Institute of Physics

  4. Automated synthesis of arabinoxylan-oligosaccharides enables characterization of antibodies that recognize plant cell wall glycans.

    PubMed

    Schmidt, Deborah; Schuhmacher, Frank; Geissner, Andreas; Seeberger, Peter H; Pfrengle, Fabian

    2015-04-07

    Monoclonal antibodies that recognize plant cell wall glycans are used for high-resolution imaging, providing important information about the structure and function of cell wall polysaccharides. To characterize the binding epitopes of these powerful molecular probes a library of eleven plant arabinoxylan oligosaccharides was produced by automated solid-phase synthesis. Modular assembly of oligoarabinoxylans from few building blocks was enabled by adding (2-naphthyl)methyl (Nap) to the toolbox of orthogonal protecting groups for solid-phase synthesis. Conjugation-ready oligosaccharides were obtained and the binding specificities of xylan-directed antibodies were determined on microarrays. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Commissioning and field tests of a van-mounted system for the detection of radioactive sources and Special Nuclear Material

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cester, D.; Lunardon, M.; Stevanato, L.

    2015-07-01

    The MODES SNM project aimed to carry out technical research in order to develop a prototype of a mobile, modular detection system for radioactive sources and Special Nuclear Materials (SNM). Its main goal was to deliver a tested prototype of a modular mobile system capable of passively detecting weak or shielded radioactive sources with accuracy higher than that of currently available systems. By the end of the project, all of the objectives had been successfully achieved. Results from the laboratory commissioning and the field tests are presented. (authors)

  6. Obstacle avoidance system with sonar sensing and fuzzy logic

    NASA Astrophysics Data System (ADS)

    Chiang, Wen-chuan; Kelkar, Nikhal; Hall, Ernest L.

    1997-09-01

    Automated guided vehicles (AGVs) have many potential applications in manufacturing, medicine, space and defense. The purpose of this paper is to describe exploratory research on the design of an obstacle avoidance system using sonar sensors for a modular autonomous mobile robot controller. The advantages of a modular system are related to portability and the fact that any vehicle can become autonomous with minimal modifications. A mobile robot test-bed has been constructed using a golf cart base. The obstacle avoidance system is based on a micro-controller interfaced with multiple ultrasonic transducers. This micro-controller independently handles all timing and distance calculations and sends distance measurements back to the computer via the serial line. This design yields a portable, independent system. Testing of these systems has been performed in the lab as well as on an outdoor test track, with positive results showing that at five mph the vehicle can follow a line and at the same time avoid obstacles. This design, in its modularity, creates a portable autonomous obstacle avoidance controller applicable to any mobile vehicle with only minor adaptations.
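
    A fuzzy obstacle avoidance rule base of this general kind can be sketched in a few lines: triangular membership functions convert left/right sonar distances into degrees of "near", and a weighted average of two rules yields a steering command. The membership thresholds and the rules are illustrative assumptions, not the controller reported here.

      def near(d, full=0.5, zero=2.0):
          """Degree to which distance d (meters) is 'near': 1 below full, 0 above zero."""
          if d <= full:
              return 1.0
          if d >= zero:
              return 0.0
          return (zero - d) / (zero - full)

      def steer(left_d, right_d):
          nl, nr = near(left_d), near(right_d)
          # rules: obstacle near on left -> steer right (+1), near on right -> steer left (-1)
          if nl + nr == 0.0:
              return 0.0                        # both far: go straight
          return (nl * 1.0 + nr * -1.0) / (nl + nr)   # weighted-average defuzzification

      print(steer(0.6, 1.8))   # obstacle close on the left -> positive (steer right)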

  7. Advanced E-O test capability for Army Next-Generation Automated Test System (NGATS)

    NASA Astrophysics Data System (ADS)

    Errea, S.; Grigor, J.; King, D. F.; Matis, G.; McHugh, S.; McKechnie, J.; Nehring, B.

    2015-05-01

    The Future E-O (FEO) program was established to develop a flexible, modular, automated test capability as part of the Next Generation Automatic Test System (NGATS) program to support the test and diagnostic needs of currently fielded U.S. Army electro-optical (E-O) devices, while being expandable to address the requirements of future Navy, Marine Corps and Air Force E-O systems. Santa Barbara Infrared (SBIR) has designed, fabricated, and delivered three (3) FEO prototypes for engineering and logistics evaluation prior to anticipated full-scale production beginning in 2016. In addition to presenting a detailed overview of the FEO system hardware design, features and testing capabilities, the integration of SBIR's EO-IR sensor and laser test software package, IRWindows 4™, into FEO to automate test execution, data collection and analysis, and the archiving and reporting of results is also described.

  8. Topography-Assisted Electromagnetic Platform for Blood-to-PCR in a Droplet

    PubMed Central

    Chiou, Chi-Han; Shin, Dong Jin; Zhang, Yi; Wang, Tza-Huei

    2013-01-01

    This paper presents an electromagnetically actuated platform for automated sample preparation and detection of nucleic acids. The proposed platform integrates nucleic acid extraction using silica-coated magnetic particles with real-time polymerase chain reaction (PCR) on a single cartridge. Extraction of genomic material was automated by manipulating magnetic particles in droplets using a series of planar coil electromagnets assisted by topographical features, enabling efficient fluidic processing over a variety of buffers and reagents. The functionality of the platform was demonstrated by performing nucleic acid extraction from whole blood, followed by real-time PCR detection of the KRAS oncogene. Automated sample processing from whole blood to a PCR-ready droplet was performed in 15 minutes. We took a modular approach, decoupling magnetic manipulation and optical detection from the device itself, enabling a low-complexity cartridge that operates in tandem with simple external instruments. PMID:23835223

  9. FoodChain-Lab: A Trace-Back and Trace-Forward Tool Developed and Applied during Food-Borne Disease Outbreak Investigations in Germany and Europe.

    PubMed

    Weiser, Armin A; Thöns, Christian; Filter, Matthias; Falenski, Alexander; Appel, Bernd; Käsbohrer, Annemarie

    2016-01-01

    FoodChain-Lab is modular open-source software for trace-back and trace-forward analysis in food-borne disease outbreak investigations. Development of FoodChain-Lab has been driven by a need for appropriate software in several food-related outbreaks in Germany since 2011. The software allows integrated data management, data linkage, enrichment and visualization as well as interactive supply chain analyses. Identification of possible outbreak sources or vehicles is facilitated by calculation of tracing scores for food-handling stations (companies or persons) and food products under investigation. The software also supports consideration of station-specific cross-contamination, analysis of geographical relationships, and topological clustering of the tracing network structure. FoodChain-Lab has been applied successfully in previous outbreak investigations, for example during the 2011 EHEC outbreak and the 2013/14 European hepatitis A outbreak. The software is most useful in complex, multi-area outbreak investigations where epidemiological evidence may be insufficient to discriminate between multiple implicated food products. The automated analysis and visualization components would be of greater value if trading information on food ingredients and compound products were more easily available.

  10. FoodChain-Lab: A Trace-Back and Trace-Forward Tool Developed and Applied during Food-Borne Disease Outbreak Investigations in Germany and Europe

    PubMed Central

    Filter, Matthias; Falenski, Alexander; Appel, Bernd; Käsbohrer, Annemarie

    2016-01-01

    FoodChain-Lab is modular open-source software for trace-back and trace-forward analysis in food-borne disease outbreak investigations. Development of FoodChain-Lab has been driven by a need for appropriate software in several food-related outbreaks in Germany since 2011. The software allows integrated data management, data linkage, enrichment and visualization as well as interactive supply chain analyses. Identification of possible outbreak sources or vehicles is facilitated by calculation of tracing scores for food-handling stations (companies or persons) and food products under investigation. The software also supports consideration of station-specific cross-contamination, analysis of geographical relationships, and topological clustering of the tracing network structure. FoodChain-Lab has been applied successfully in previous outbreak investigations, for example during the 2011 EHEC outbreak and the 2013/14 European hepatitis A outbreak. The software is most useful in complex, multi-area outbreak investigations where epidemiological evidence may be insufficient to discriminate between multiple implicated food products. The automated analysis and visualization components would be of greater value if trading information on food ingredients and compound products were more easily available. PMID:26985673

  11. Space biology initiative program definition review. Trade study 1: Automation costs versus crew utilization

    NASA Technical Reports Server (NTRS)

    Jackson, L. Neal; Crenshaw, John, Sr.; Hambright, R. N.; Nedungadi, A.; Mcfayden, G. M.; Tsuchida, M. S.

    1989-01-01

    A significant emphasis upon automation within the Space Biology Initiative hardware appears justified in order to conserve crew labor and crew training effort. Two generic forms of automation were identified: automation of data and information handling and decision making, and automation of material handling, transfer, and processing. The use of automatic data acquisition, expert systems, robots, and machine vision will increase the volume of experiments and the quality of results. The automation described may also influence efforts to miniaturize and modularize the large array of SBI hardware identified to date. The cost and benefit model developed appears to be a useful guideline for SBI equipment specifiers and designers; additional refinements would enhance the validity of the model. Two NASA automation pilot programs, 'The Principal Investigator in a Box' and 'Rack Mounted Robots', were investigated and found to be quite appropriate for adaptation to the SBI program. Other in-house NASA efforts provide technology that may also be appropriate for the SBI program. Important data is believed to exist in advanced medical labs throughout the U.S., Japan, and Europe. The information and data processing in medical analysis equipment is highly automated, and future trends reveal continued progress in this area. However, automation of material handling and processing has progressed in a limited manner because medical labs are not subject to the power and space constraints that Space Station medical equipment faces. Therefore, NASA's major emphasis in automation will require a lead effort in the automation of material handling to achieve optimal crew utilization.

  12. Virtual Planning, Control, and Machining for a Modular-Based Automated Factory Operation in an Augmented Reality Environment

    PubMed Central

    Pai, Yun Suen; Yap, Hwa Jen; Md Dawal, Siti Zawiah; Ramesh, S.; Phoon, Sin Ye

    2016-01-01

    This study presents a modular-based implementation of augmented reality to provide an immersive experience in learning or teaching the planning phase, control system, and machining parameters of a fully automated work cell. The architecture of the system consists of three code modules that can operate independently or in combination to create a complete system able to guide engineers from the layout planning phase to the prototyping of the final product. The layout planning module determines the best possible arrangement in a layout for the placement of various machines, in this case a conveyor belt for transportation, a robot arm for pick-and-place operations, and a computer numerical control (CNC) milling machine to generate the final prototype. The robotic arm module simulates the pick-and-place operation offline from the conveyor belt to the CNC machine utilising collision detection and inverse kinematics. Finally, the CNC module performs virtual machining based on the Uniform Space Decomposition method and axis-aligned bounding box collision detection. The case study revealed that, given the situation, a semi-circle shaped arrangement is desirable, whereas the pick-and-place system and the final generated G-code produced maximum deviations of 3.83 mm and 5.8 mm, respectively. PMID:27271840

  13. A modular assembling platform for manufacturing of microsystems by optical tweezers

    NASA Astrophysics Data System (ADS)

    Ksouri, Sarah Isabelle; Aumann, Andreas; Ghadiri, Reza; Prüfer, Michael; Baer, Sebastian; Ostendorf, Andreas

    2013-09-01

    Due to the increased complexity of microsystems in terms of materials and geometries, new assembling techniques are required. Assembling techniques from the semiconductor industry are often very specific and cannot fulfill all specifications of more complex microsystems. Therefore, holographic optical tweezers are applied to manipulate structures in the micrometer range with the highest flexibility and precision. As is well known, non-spherical assemblies can be trapped and controlled by laser light and assembled with an additional light modulator, where the incident laser beam is rearranged into flexible light patterns in order to generate multiple spots. The complementary building blocks are generated by a two-photon polymerization (2PP) process. The possibilities of manufacturing arbitrary microstructures and the potential of optical tweezers lead to the idea of combining manufacturing techniques with manipulation processes into "microrobotic" processes. This work presents the manipulation of complex microstructures with optical tools as well as a storage solution for 2PP assemblies. A sample holder has been developed for the manual feeding of 2PP building blocks. Furthermore, a modular assembling platform has been constructed for an 'all-in-one' 2PP manufacturing process with a dedicated storage system. The long-term objective is to automate the feeding and storage of several different 2PP micro-assemblies to realize an automated assembly process.

  14. Virtual Planning, Control, and Machining for a Modular-Based Automated Factory Operation in an Augmented Reality Environment.

    PubMed

    Pai, Yun Suen; Yap, Hwa Jen; Md Dawal, Siti Zawiah; Ramesh, S; Phoon, Sin Ye

    2016-06-07

    This study presents a modular-based implementation of augmented reality to provide an immersive experience in learning or teaching the planning phase, control system, and machining parameters of a fully automated work cell. The architecture of the system consists of three code modules that can operate independently or in combination to create a complete system able to guide engineers from the layout planning phase to the prototyping of the final product. The layout planning module determines the best possible arrangement in a layout for the placement of various machines, in this case a conveyor belt for transportation, a robot arm for pick-and-place operations, and a computer numerical control (CNC) milling machine to generate the final prototype. The robotic arm module simulates the pick-and-place operation offline from the conveyor belt to the CNC machine utilising collision detection and inverse kinematics. Finally, the CNC module performs virtual machining based on the Uniform Space Decomposition method and axis-aligned bounding box collision detection. The case study revealed that, given the situation, a semi-circle shaped arrangement is desirable, whereas the pick-and-place system and the final generated G-code produced maximum deviations of 3.83 mm and 5.8 mm, respectively.

  15. Virtual Planning, Control, and Machining for a Modular-Based Automated Factory Operation in an Augmented Reality Environment

    NASA Astrophysics Data System (ADS)

    Pai, Yun Suen; Yap, Hwa Jen; Md Dawal, Siti Zawiah; Ramesh, S.; Phoon, Sin Ye

    2016-06-01

    This study presents a modular-based implementation of augmented reality to provide an immersive experience in learning or teaching the planning phase, control system, and machining parameters of a fully automated work cell. The architecture of the system consists of three code modules that can operate independently or in combination to create a complete system able to guide engineers from the layout planning phase to the prototyping of the final product. The layout planning module determines the best possible arrangement in a layout for the placement of various machines, in this case a conveyor belt for transportation, a robot arm for pick-and-place operations, and a computer numerical control (CNC) milling machine to generate the final prototype. The robotic arm module simulates the pick-and-place operation offline from the conveyor belt to the CNC machine utilising collision detection and inverse kinematics. Finally, the CNC module performs virtual machining based on the Uniform Space Decomposition method and axis-aligned bounding box collision detection. The case study revealed that, given the situation, a semi-circle shaped arrangement is desirable, whereas the pick-and-place system and the final generated G-code produced maximum deviations of 3.83 mm and 5.8 mm, respectively.

  16. 3D-Lab: a collaborative web-based platform for molecular modeling.

    PubMed

    Grebner, Christoph; Norrby, Magnus; Enström, Jonatan; Nilsson, Ingemar; Hogner, Anders; Henriksson, Jonas; Westin, Johan; Faramarzi, Farzad; Werner, Philip; Boström, Jonas

    2016-09-01

    The use of 3D information has shown impact in numerous applications in drug design. However, it is often under-utilized and traditionally limited to specialists. We want to change that, and present an approach making 3D information and molecular modeling accessible and easy-to-use 'for the people'. A user-friendly and collaborative web-based platform (3D-Lab) for 3D modeling, including a blazingly fast virtual screening capability, was developed. 3D-Lab provides an interface to automatic molecular modeling, like conformer generation, ligand alignments, molecular dockings and simple quantum chemistry protocols. 3D-Lab is designed to be modular, and to facilitate sharing of 3D-information to promote interactions between drug designers. Recent enhancements to our open-source virtual reality tool Molecular Rift are described. The integrated drug-design platform allows drug designers to instantaneously access 3D information and readily apply advanced and automated 3D molecular modeling tasks, with the aim to improve decision-making in drug design projects.

  17. Using a GIS to link digital spatial data and the precipitation-runoff modeling system, Gunnison River Basin, Colorado

    USGS Publications Warehouse

    Battaglin, William A.; Kuhn, Gerhard; Parker, Randolph S.

    1993-01-01

    The U.S. Geological Survey Precipitation-Runoff Modeling System, a modular, distributed-parameter, watershed-modeling system, is being applied to 20 smaller watersheds within the Gunnison River basin. The model is used to derive a daily water balance for subareas in a watershed, ultimately producing simulated streamflows that can be input into routing and accounting models used to assess downstream water availability under current conditions, and to assess the sensitivity of water resources in the basin to alterations in climate. A geographic information system (GIS) is used to automate a method for extracting physically based hydrologic response unit (HRU) distributed parameter values from digital data sources, and for the placement of those estimates into GIS spatial datalayers. The HRU parameters extracted are: area, mean elevation, average land-surface slope, predominant aspect, predominant land-cover type, predominant soil type, average total soil water-holding capacity, and average water-holding capacity of the root zone.

  18. Automating approximate Bayesian computation by local linear regression.

    PubMed

    Thornton, Kevin R

    2009-07-07

    In several biological contexts, parameter inference often relies on computationally intensive techniques. "Approximate Bayesian Computation", or ABC, methods based on summary statistics have become increasingly popular. A particular flavor of ABC, based on using a linear regression to approximate the posterior distribution of the parameters conditional on the summary statistics, is computationally appealing, yet no standalone tool exists to automate the procedure. Here, I describe a program to implement the method. The software package ABCreg implements the local linear-regression approach to ABC. The advantages are: 1. The code is standalone and fully documented. 2. The program will automatically process multiple data sets and create unique output files for each (which may be processed immediately in R), facilitating the testing of inference procedures on simulated data or the analysis of multiple data sets. 3. The program implements two different transformation methods for the regression step. 4. Analysis options are controlled on the command line by the user, and the program is designed to output warnings for cases where the regression fails. 5. The program does not depend on any particular simulation machinery (coalescent, forward-time, etc.), and is therefore a general tool for processing the results of any simulation. 6. The code is open-source and modular. Examples of applying the software to empirical data from Drosophila melanogaster, and of testing the procedure on simulated data, are shown. In practice, ABCreg simplifies implementing ABC based on local linear regression.
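
    The regression-adjustment step that ABCreg automates is easy to sketch. The Python fragment below is a minimal illustration (toy simulator, flat prior, arbitrary 1% acceptance rate), not ABCreg itself: it keeps the simulations whose summaries fall nearest the observed summary, then applies the Beaumont-style local linear adjustment theta* = theta - b(s - s_obs) with Epanechnikov weights.

      import numpy as np

      rng = np.random.default_rng(0)

      def simulate(theta, n=100):
          """Toy simulator: data are Normal(theta, 1); the summary is the sample mean."""
          return rng.normal(theta, 1.0, n).mean()

      s_obs = 0.3                                  # observed summary statistic
      theta = rng.uniform(-5, 5, 50_000)           # draws from a flat prior
      s_sim = np.array([simulate(t) for t in theta])

      # Keep the draws whose summaries fall closest to the observation.
      dist = np.abs(s_sim - s_obs)
      eps = np.quantile(dist, 0.01)                # tolerance = 1% acceptance rate
      keep = dist <= eps

      # Weighted local linear regression of theta on (s - s_obs) with Epanechnikov
      # weights, followed by the adjustment theta* = theta - b * (s - s_obs).
      x = s_sim[keep] - s_obs
      w = 1.0 - (dist[keep] / eps) ** 2
      X = np.column_stack([np.ones_like(x), x])
      W = np.diag(w)
      a, b = np.linalg.solve(X.T @ W @ X, X.T @ W @ theta[keep])
      theta_adj = theta[keep] - b * x

      print(theta_adj.mean(), theta_adj.std())     # approximate posterior moments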

  19. Automated extraction for the analysis of 11-nor-delta9-tetrahydrocannabinol-9-carboxylic acid (THCCOOH) in urine using a six-head probe Hamilton Microlab 2200 system and gas chromatography-mass spectrometry.

    PubMed

    Whitter, P D; Cary, P L; Leaton, J I; Johnson, J E

    1999-01-01

    An automated extraction scheme for the analysis of 11-nor-delta9-tetrahydrocannabinol-9-carboxylic acid using the Hamilton Microlab 2200, which was modified for gravity-flow solid-phase extraction, has been evaluated. The Hamilton was fitted with a six-head probe, a modular valve positioner, and a peristaltic pump. The automated method significantly increased sample throughput, improved assay consistency, and reduced the time spent performing the extraction. Extraction recovery for the automated method was > 90%. The limit of detection, limit of quantitation, and upper limit of linearity were equivalent to those of the manual method: 1.5, 3.0, and 300 ng/mL, respectively. Precision at the 15-ng/mL cut-off was as follows: mean = 14.4 ng/mL, standard deviation = 0.5 ng/mL, coefficient of variation = 3.5%. Comparison of 38 patient samples, extracted by both the manual and automated methods, demonstrated the following correlation statistics: r = .991, slope = 1.029, and y-intercept = -2.895. Carryover was < 0.3% at 1000 ng/mL. Aliquoting/extraction time for the automated method (48 urine samples) was 50 min, whereas the manual procedure required approximately 2.5 h. The automated aliquoting/extraction method on the Hamilton Microlab 2200 and its use in forensic applications are reviewed.

  20. Python-Assisted MODFLOW Application and Code Development

    NASA Astrophysics Data System (ADS)

    Langevin, C.

    2013-12-01

    The U.S. Geological Survey (USGS) has a long history of developing and maintaining free, open-source software for hydrological investigations. The MODFLOW program is one of the most popular hydrologic simulation programs released by the USGS, and it is considered to be the most widely used groundwater flow simulation code. MODFLOW was written using a modular design and a procedural FORTRAN style, which resulted in code that could be understood, modified, and enhanced by many hydrologists. The code is fast, and because it uses standard FORTRAN it can be run on most operating systems. Most MODFLOW users rely on proprietary graphical user interfaces for constructing models and viewing model results. Some recent efforts, however, have focused on construction of MODFLOW models using open-source Python scripts. Customizable Python packages, such as FloPy (https://code.google.com/p/flopy), can be used to generate input files, read simulation results, and visualize results in two and three dimensions. Automating this sequence of steps leads to models that can be reproduced directly from original data and rediscretized in space and time. Python is also being used in the development and testing of new MODFLOW functionality. New packages and numerical formulations can be quickly prototyped and tested first with Python programs before implementation in MODFLOW. This is made possible by the flexible object-oriented design capabilities available in Python, the ability to call FORTRAN code from Python, and the ease with which linear systems of equations can be solved using SciPy, for example. Once new features are added to MODFLOW, Python can then be used to automate comprehensive regression testing and ensure reliability and accuracy of new versions prior to release.
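
    As a concrete illustration of the scripted workflow described above, the sketch below builds, runs, and post-processes a small steady-state model through FloPy's classic MODFLOW-2005 interface; the grid dimensions, hydraulic values, and file names are illustrative, and an mf2005 executable is assumed to be on the path.

      import numpy as np
      import flopy

      # One-layer steady-state model: fixed heads on the left and right edges
      # drive flow across the grid.
      m = flopy.modflow.Modflow("demo", exe_name="mf2005")
      flopy.modflow.ModflowDis(m, nlay=1, nrow=10, ncol=10,
                               delr=100.0, delc=100.0, top=10.0, botm=0.0)
      ibound = np.ones((1, 10, 10), dtype=int)
      ibound[:, :, 0] = ibound[:, :, -1] = -1      # constant-head boundary columns
      strt = np.full((1, 10, 10), 10.0)
      strt[:, :, -1] = 5.0                         # head drop from left to right
      flopy.modflow.ModflowBas(m, ibound=ibound, strt=strt)
      flopy.modflow.ModflowLpf(m, hk=10.0)         # hydraulic conductivity
      flopy.modflow.ModflowPcg(m)                  # solver
      flopy.modflow.ModflowOc(m)                   # output control (saves heads)

      m.write_input()                              # generate the MODFLOW input files
      success, _ = m.run_model(silent=True)        # requires an mf2005 binary

      # Read the simulated heads back for checking or visualization.
      heads = flopy.utils.HeadFile("demo.hds").get_data()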

  1. Development and comparison of Bayesian modularization method in uncertainty assessment of hydrological models

    NASA Astrophysics Data System (ADS)

    Li, L.; Xu, C.-Y.; Engeland, K.

    2012-04-01

    With respect to model calibration, parameter estimation, and analysis of uncertainty sources, different approaches have been used in hydrological models. The Bayesian method is one of the most widely used for uncertainty assessment of hydrological models, incorporating different sources of information into a single analysis through Bayes' theorem. However, such applications have not treated well the uncertainty in the extreme flows of hydrological model simulations. This study proposes a Bayesian modularization approach for uncertainty assessment of conceptual hydrological models that considers the extreme flows. It includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization approach and traditional Bayesian models using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions are used in combination with the traditional Bayesian approach: the AR(1) plus Normal, time-period-independent model (Model 1); the AR(1) plus Normal, time-period-dependent model (Model 2); and the AR(1) plus multi-normal model (Model 3). The results reveal that (1) the simulations derived from the Bayesian modularization approach are more accurate, with the highest Nash-Sutcliffe efficiency value, and (2) the Bayesian modularization approach performs best in uncertainty estimates of entire flows and in terms of application and computational efficiency. The study thus introduces a new approach for reducing the effect of extreme flows on the discharge uncertainty assessment of hydrological models via Bayesian inference. Keywords: extreme flow, uncertainty assessment, Bayesian modularization, hydrological model, WASMOD
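
    The calibration machinery can be sketched compactly. The fragment below substitutes a stand-in linear response model for WASMOD and fixes the AR(1) error parameters, so it is an illustration of the approach rather than the study's setup: a random-walk Metropolis-Hastings chain under an AR(1)-plus-Normal likelihood in the spirit of Model 1.

      import numpy as np

      rng = np.random.default_rng(1)

      def simulate_q(theta, forcing):
          """Stand-in 'hydrological model': linear response to the forcing."""
          return theta[0] * forcing + theta[1]

      def log_lik(theta, forcing, q_obs, rho=0.5, sigma=1.0):
          """AR(1) + Normal likelihood: e_t = r_t - rho * r_{t-1} ~ N(0, sigma^2)."""
          r = q_obs - simulate_q(theta, forcing)
          e = r[1:] - rho * r[:-1]
          return (-0.5 * np.sum(e ** 2) / sigma ** 2
                  - 0.5 * len(e) * np.log(2.0 * np.pi * sigma ** 2))

      # Synthetic daily data for the demonstration.
      forcing = rng.gamma(2.0, 2.0, 365)
      q_obs = 0.8 * forcing + 1.5 + rng.normal(0.0, 1.0, 365)

      theta = np.array([0.5, 0.5])                 # initial parameter values
      ll = log_lik(theta, forcing, q_obs)
      chain = []
      for _ in range(20_000):
          prop = theta + rng.normal(0.0, 0.05, 2)  # random-walk proposal
          ll_prop = log_lik(prop, forcing, q_obs)
          if np.log(rng.uniform()) < ll_prop - ll: # flat prior: likelihood ratio
              theta, ll = prop, ll_prop
          chain.append(theta)

      print(np.mean(chain[10_000:], axis=0))       # posterior means after burn-in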

  2. Implementation of a scalable, web-based, automated clinical decision support risk-prediction tool for chronic kidney disease using C-CDA and application programming interfaces.

    PubMed

    Samal, Lipika; D'Amore, John D; Bates, David W; Wright, Adam

    2017-11-01

    Clinical decision support tools for risk prediction are readily available but typically require workflow interruptions and manual data entry, and so are rarely used. New data interoperability standards for electronic health records (EHRs) make other options available. As a clinical case study, we sought to build a scalable, web-based system that would automate the calculation of kidney failure risk and display clinical decision support to users in primary care practices. We developed a single-page application, web server, database, and application programming interface to calculate and display kidney failure risk. Data were extracted from the EHR using the Consolidated Clinical Document Architecture (C-CDA) interoperability standard for Continuity of Care Documents (CCDs). EHR users were presented with a noninterruptive alert on the patient's summary screen and a hyperlink to details and recommendations provided through a web application. Clinic schedules and CCDs were retrieved using existing application programming interfaces to the EHR, and we provided a clinical decision support hyperlink to the EHR as a service. We debugged a series of terminology and technical issues. The application was validated with data from 255 patients and subsequently deployed to 10 primary care clinics where, over the course of 1 year, 569,533 CCD documents were processed. We validated the use of interoperable documents and open-source components to develop a low-cost tool for automated clinical decision support. Since C-CDA-based data extraction extends to any certified EHR, this demonstrates a successful modular approach to clinical decision support. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
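
    The data flow lends itself to a short sketch. The fragment below is schematic only: the embedded document, the LOINC lookup, and especially the placeholder risk function are illustrative assumptions, not the validated kidney failure risk equation or the C-CDA handling used in the study.

      import xml.etree.ElementTree as ET

      # A simplified stand-in for a C-CDA lab observation (real CCDs are far richer).
      CCD = """<observation>
        <code code="33914-3" displayName="eGFR"/><value value="45"/>
      </observation>"""

      def extract_egfr(ccd_xml):
          """Pull the eGFR result out of the (simplified) document."""
          root = ET.fromstring(ccd_xml)
          if root.find("code").get("code") == "33914-3":   # LOINC code for eGFR
              return float(root.find("value").get("value"))
          return None

      def kidney_failure_risk(egfr):
          """Placeholder model -- NOT the published Kidney Failure Risk Equation."""
          return max(0.0, min(1.0, (60.0 - egfr) / 60.0))

      egfr = extract_egfr(CCD)
      if egfr is not None and kidney_failure_risk(egfr) > 0.1:
          # Payload for a noninterruptive alert on the patient summary screen,
          # linking to details and recommendations in the web application.
          print({"alert": "elevated kidney failure risk",
                 "detail_url": "https://example.org/ckd-details"})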

  3. How to Construct an Automated Warehouse Based on Colored Timed Petri Nets

    NASA Astrophysics Data System (ADS)

    Cheng, Fei; He, Shanjun

    The automated warehouse considered here consists of a number of rack locations with three cranes, a narrow-aisle shuttle, and several roller buffer stations. Based on an analysis of the behaviors of the active resources in the system, a modular and computerized model is presented via a colored timed Petri net approach, in which places are multicolored to simplify the model and characterize the control flow of the resources, and token colors are defined as the routes of storage/retrieval operations. In addition, an approach for realizing the model in Visual C++ is briefly given. This allows us to build an emulation system that simulates a discrete control application for online monitoring, dynamic dispatching control, and off-line revision of scheduling policies.
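
    The execution semantics of such a model can be sketched in a few lines: tokens carry colors, transitions consume and produce tokens, and timed transitions schedule their completions on an event queue. The toy below fires a single storage operation; the places, colors, and service time are illustrative, not the paper's warehouse model.

      import heapq

      # Colored tokens: (type, identifier). Places hold lists of tokens.
      places = {"buffer": [("pallet", "A")], "crane_idle": [("crane", 1)], "rack": []}

      def store_enabled():
          """The 'store' transition needs a pallet token and an idle crane token."""
          return bool(places["buffer"] and places["crane_idle"])

      def fire_store(now):
          """Consume the input tokens and schedule the timed completion event."""
          pallet = places["buffer"].pop(0)
          crane = places["crane_idle"].pop(0)
          return (now + 30.0, pallet, crane)       # assumed 30 s storage operation

      events, clock = [], 0.0
      if store_enabled():
          heapq.heappush(events, fire_store(clock))

      while events:
          clock, pallet, crane = heapq.heappop(events)
          places["rack"].append(pallet)            # token arrives at its destination
          places["crane_idle"].append(crane)       # the crane becomes available again
          print(f"t={clock}: stored {pallet}; rack now holds {places['rack']}")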

  4. Algorithms for Automated DNA Assembly

    DTIC Science & Technology

    2010-01-01

    ...correct theoretical construction scheme is developed manually, it is likely to be suboptimal by any number of cost metrics. Modular, robust and... ...to an exhaustive search on a small synthetic dataset, and our results show that our algorithms can quickly find an optimal solution. Comparison with...

  5. JPRS Report Science & Technology Europe Twenty-Fourth Isata International Symposium on Automotive Technology and Automation.

    DTIC Science & Technology

    1991-09-05

    Contents include "Learning from Learning: Principles for Supporting Drivers" (J. A. Groeger, MRC Applied Psychology Unit, UK); "Argos: A Driver Behaviour Analysis System"; "Modular Sensor System for Guiding Handling Machines" (J. Geit and J. Heinrich, TZN Forschungs, FRG); and further papers on public transport management research, implementation strategies, and validation through pilots.

  6. Engineering Design Handbook: Timing Systems and Components

    DTIC Science & Technology

    1975-12-01

    Contents include sections on modular components, integrated circuits, matching techniques, and DC and AC systems, with figures illustrating modular design, characteristics of the source and of the load, and source-load matching. From the introduction: "There is a continuous demand for increased precision and accuracy in frequency control. Today fast time pulses are used in..."

  7. Minifactory: a precision assembly system adaptable to the product life cycle

    NASA Astrophysics Data System (ADS)

    Muir, Patrick F.; Rizzi, Alfred A.; Gowdy, Jay W.

    1997-12-01

    Automated product assembly systems are traditionally designed with the intent that they will be operated with few significant changes for as long as the product is being manufactured. This approach to factory design and programming has many undesirable qualities, which have motivated the development of more 'flexible' systems. In an effort to improve agility, different types of flexibility have been integrated into factory designs. Specifically, automated assembly systems have been endowed with the ability to assemble differing products by means of computer-controlled robots, and to accommodate variations in part locations and dimensions by means of sensing. The product life cycle (PLC) is a standard four-stage model of the performance of a product from the time it is first introduced in the marketplace until the time it is discontinued. Manufacturers can improve their return on investment by adapting the production process to the PLC. We are developing two concepts to enable manufacturers to more readily achieve this goal: the agile assembly architecture (AAA), an abstract framework for distributed modular automation; and minifactory, our physical instantiation of this architecture for the assembly of precision electro-mechanical devices. By examining the requirements that each PLC stage places upon the production system, we identify characteristics of factory design and programming that are appropriate for that stage. As the product transitions from one stage to the next, the factory design and programming should also transition from one embodiment to the next in order to achieve the best return on investment. Modularity of the factory components, highly flexible product transport mechanisms, and a high level of distributed intelligence are key characteristics of minifactory that enable this adaptation.

  8. A modular, prospective, semi-automated drug safety monitoring system for use in a distributed data environment.

    PubMed

    Gagne, Joshua J; Wang, Shirley V; Rassen, Jeremy A; Schneeweiss, Sebastian

    2014-06-01

    The aim of this study was to develop and test a semi-automated process for conducting routine active safety monitoring for new drugs in a network of electronic healthcare databases. We built a modular program that semi-automatically performs cohort identification, confounding adjustment, diagnostic checks, aggregation and effect estimation across multiple databases, and application of a sequential alerting algorithm. During beta-testing, we applied the system to five databases to evaluate nine examples emulating prospective monitoring with retrospective data (five pairs for which we expected signals, two negative controls, and two examples for which it was uncertain whether a signal would be expected): cerivastatin versus atorvastatin and rhabdomyolysis; paroxetine versus tricyclic antidepressants and gastrointestinal bleed; lisinopril versus angiotensin receptor blockers and angioedema; ciprofloxacin versus macrolide antibiotics and Achilles tendon rupture; rofecoxib versus non-selective non-steroidal anti-inflammatory drugs (ns-NSAIDs) and myocardial infarction; telithromycin versus azithromycin and hepatotoxicity; rosuvastatin versus atorvastatin and diabetes and rhabdomyolysis; and celecoxib versus ns-NSAIDs and myocardial infarction. We describe the program, the necessary inputs, and the assumed data environment. In beta-testing, the system generated four alerts, all among positive control examples (i.e., lisinopril and angioedema; rofecoxib and myocardial infarction; ciprofloxacin and tendon rupture; and cerivastatin and rhabdomyolysis). Sequential effect estimates for each example were consistent in direction and magnitude with existing literature. Beta-testing across nine drug-outcome examples demonstrated the feasibility of the proposed semi-automated prospective monitoring approach. In retrospective assessments, the system identified an increased risk of myocardial infarction with rofecoxib and an increased risk of rhabdomyolysis with cerivastatin years before these drugs were withdrawn from the market. Copyright © 2014 John Wiley & Sons, Ltd.
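
    The exact alerting algorithm is not reproduced in the abstract, but a generic sequential scheme of this kind can be sketched: at each monitoring look, cumulative observed events among the exposed are compared with the cumulative expected count via a Poisson maxSPRT-style log-likelihood ratio, and an alert is raised when the statistic crosses a critical value. The threshold and event counts below are illustrative.

      import math

      CRITICAL = 2.85   # illustrative signaling threshold

      def llr(observed, expected):
          """Poisson log-likelihood ratio; evidence only when observed > expected."""
          if observed <= expected or expected <= 0.0:
              return 0.0
          return expected - observed + observed * math.log(observed / expected)

      obs_cum = exp_cum = 0.0
      for look, (obs, exp) in enumerate([(2, 1.9), (6, 3.1), (12, 3.0)], start=1):
          obs_cum += obs                # cumulative observed events (exposed group)
          exp_cum += exp                # cumulative expected events under H0
          stat = llr(obs_cum, exp_cum)
          print(f"look {look}: LLR={stat:.2f}", "ALERT" if stat > CRITICAL else "")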

  9. OptoDyCE: Automated system for high-throughput all-optical dynamic cardiac electrophysiology

    NASA Astrophysics Data System (ADS)

    Klimas, Aleksandra; Yu, Jinzhu; Ambrosi, Christina M.; Williams, John C.; Bien, Harold; Entcheva, Emilia

    2016-02-01

    In the last two decades, <30% of drug withdrawals from the market were due to cardiac toxicity, where unintended interactions with ion channels disrupt the heart's normal electrical function. Consequently, all new drugs must undergo preclinical testing for cardiac liability, adding to an already expensive and lengthy process. Recognition that proarrhythmic effects often result from drug action on multiple ion channels demonstrates a need for integrative and comprehensive measurements. Additionally, patient-specific therapies relying on emerging technologies employing stem-cell-derived cardiomyocytes (e.g., induced pluripotent stem-cell-derived cardiomyocytes, iPSC-CMs) require better screening methods to become practical. However, a high-throughput, cost-effective approach to cellular cardiac electrophysiology has not been feasible. Optical techniques for manipulation and recording provide a contactless means of dynamic, high-throughput testing of cells and tissues. Here, we consider the requirements for all-optical electrophysiology for drug testing, and we implement and validate OptoDyCE, a fully automated system for all-optical cardiac electrophysiology. We demonstrate the high-throughput capabilities using multicellular samples in 96-well format by combining optogenetic actuation with simultaneous fast high-resolution optical sensing of voltage or intracellular calcium. The system can also be implemented using iPSC-CMs and other cell types by delivery of optogenetic drivers, or through the modular use of dedicated light-sensitive somatic cells in conjunction with non-modified cells. OptoDyCE provides a truly modular and dynamic screening system, capable of fully automated acquisition of the high-content information integral to improved discovery and development of new drugs and biologics, as well as a means of better understanding electrical disturbances in the heart.

  10. Modular optical detector system

    DOEpatents

    Horn, Brent A. [Livermore, CA]; Renzi, Ronald F. [Tracy, CA]

    2006-02-14

    A modular optical detector system. The detector system is designed to detect the presence of molecules or molecular species by inducing fluorescence with exciting radiation and detecting the emitted fluorescence. Because the system is capable of accurately detecting and measuring picomolar concentrations, it is ideally suited for use with microchemical analysis systems in general and capillary chromatographic systems in particular. By employing a modular design, the detector system provides the ability to replace various elements without extensive realignment or recalibration of the components, as well as minimal user interaction with the system. In addition, the modular concept provides for the use and addition of a wide variety of components, including optical elements (lenses and filters), light sources, and detection means, to fit particular needs.

  11. OpenDrift - an open source framework for ocean trajectory modeling

    NASA Astrophysics Data System (ADS)

    Dagestad, Knut-Frode; Breivik, Øyvind; Ådlandsvik, Bjørn

    2016-04-01

    We will present a new, open-source tool for modeling the trajectories and fate of particles or substances (Lagrangian elements) drifting in the ocean, or even in the atmosphere. The software is named OpenDrift and has been developed at the Norwegian Meteorological Institute in cooperation with the Institute of Marine Research. OpenDrift is a generic framework written in Python and is openly available at https://github.com/knutfrode/opendrift/. The framework is modular with respect to three aspects: (1) obtaining input data, (2) the transport/morphological processes, and (3) exporting of results to file. Modularity is achieved through well-defined interfaces between components and the use of a consistent vocabulary (CF conventions) for naming variables. Modular input implies that it is not necessary to preprocess input data (e.g. currents, wind, and waves from Eulerian models) into a particular file format. Instead, "reader modules" can be written or reused to obtain data directly from any original source, including files or web-based protocols (e.g. OPeNDAP/Thredds). Modularity of processes implies that a model developer may focus on the geophysical processes relevant for the application of interest, without needing to consider technical tasks such as reading, reprojecting, and colocating input data, rotating and scaling vectors, and handling model output. We will show a few example applications of using OpenDrift for predicting drifters, oil spills, and search-and-rescue objects.
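
    A short usage sketch following OpenDrift's documented Python interface is shown below; the forcing file name, seeding location, and run length are illustrative.

      from datetime import timedelta
      from opendrift.models.oceandrift import OceanDrift
      from opendrift.readers import reader_netCDF_CF_generic

      o = OceanDrift(loglevel=20)

      # A reader module pulls forcing fields directly from a CF-compliant source
      # (a local NetCDF file or an OPeNDAP/Thredds URL) -- no preprocessing needed.
      reader = reader_netCDF_CF_generic.Reader("norkyst800_currents.nc")
      o.add_reader(reader)

      # Seed 100 elements within 500 m of a point and integrate for 48 hours.
      o.seed_elements(lon=4.5, lat=60.0, number=100, radius=500,
                      time=reader.start_time)
      o.run(duration=timedelta(hours=48), outfile="drift.nc")
      o.plot()   # quick-look map of the simulated trajectories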

  12. A modular approach to detection and identification of defects in rough lumber

    Treesearch

    Sang Mook Lee; A. Lynn Abbott; Daniel L. Schmoldt

    2001-01-01

    This paper describes a prototype scanning system that can automatically identify several important defects on rough hardwood lumber. The scanning system utilizes 3 laser sources and an embedded-processor camera to capture and analyze profile and gray-scale images. The modular approach combines the detection of wane (the curved sides of a board, possibly containing...

  13. A Modular Soft Robotic Wrist for Underwater Manipulation.

    PubMed

    Kurumaya, Shunichi; Phillips, Brennan T; Becker, Kaitlyn P; Rosen, Michelle H; Gruber, David F; Galloway, Kevin C; Suzumori, Koichi; Wood, Robert J

    2018-04-19

    This article presents the development of modular soft robotic wrist joint mechanisms for delicate and precise manipulation in the harsh deep-sea environment. The wrist consists of a rotary module and bending module, which can be combined with other actuators as part of a complete manipulator system. These mechanisms are part of a suite of soft robotic actuators being developed for deep-sea manipulation via submersibles and remotely operated vehicles, and are designed to be powered hydraulically with seawater. The wrist joint mechanisms can also be activated with pneumatic pressure for terrestrial-based applications, such as automated assembly and robotic locomotion. Here we report the development and characterization of a suite of rotary and bending modules by varying fiber number and silicone hardness. Performance of the complete soft robotic wrist is demonstrated in normal atmospheric conditions using both pneumatic and hydraulic pressures for actuation and under high ambient hydrostatic pressures equivalent to those found at least 2300 m deep in the ocean. This rugged modular wrist holds the potential to be utilized at full ocean depths (>10,000 m) and is a step forward in the development of jointed underwater soft robotic arms.

  14. Experimental Verification and Integration of a Next Generation Smart Power Management System

    NASA Astrophysics Data System (ADS)

    Clemmer, Tavis B.

    With the increase in energy demand by the residential community in this country and the diminishing fossil fuel resources used for electric energy production, there is a need for a system that efficiently manages power within a residence. The Smart Green Power Node (SGPN) is a next-generation energy management system that automates on-site energy production, storage, consumption, and grid usage to yield the most savings for both the utility and the consumer. The system automatically manages on-site distributed generation sources, such as a photovoltaic (PV) input and battery storage, to curtail grid energy usage when the price is high. The SGPN high-level control features an advanced modular algorithm that incorporates weather data for projected PV generation, battery health monitoring algorithms, user preferences for load prioritization within the home in case of an outage, Time of Use (ToU) grid power pricing, and the status of on-site resources to intelligently schedule and manage power flow between the grid, loads, and on-site resources. The SGPN has a scalable, modular architecture that can be customized for user-specific applications. This drove the topology of the SGPN, which connects on-site resources at a low-voltage DC microbus; a two-stage bi-directional inverter/rectifier then couples the AC load and residential grid connection to on-site generation. The SGPN has been designed and built and is undergoing testing. Hardware test results are consistent with the design goals and indicate that the SGPN is a viable system; recommended changes and future work are presented.

  15. X-ray optics for the LAMAR facility, an overview. [Large Area Modular Array of Reflectors

    NASA Technical Reports Server (NTRS)

    Gorenstein, P.

    1979-01-01

    The paper surveys the Large Area Modular Array of Reflectors (LAMAR), a concept based on meeting two major requirements in X-ray astronomy: large collecting area and moderately good or better angular resolution, for avoiding source confusion and imaging source fields. It is shown that LAMAR provides the same sensitivity and signal-to-noise ratio in imaging as a single large telescope of the same area and angular resolution, but is a great deal less costly to develop, construct, and integrate into a space mission. Attention is also given to LAMAR's modular nature, which will allow an evolutionary development from a modest-size array on Spacelab to a Shuttle-launched free flyer. Finally, consideration is given to manufacturing methods that show promise of making LAMAR meet the criteria of good angular resolution, relatively low cost, and capability for fast volume production.

  16. CERES: A Set of Automated Routines for Echelle Spectra

    NASA Astrophysics Data System (ADS)

    Brahm, Rafael; Jordán, Andrés; Espinoza, Néstor

    2017-03-01

    We present the Collection of Elemental Routines for Echelle Spectra (CERES). These routines were developed for the construction of automated pipelines for the reduction, extraction, and analysis of spectra acquired with different instruments, allowing homogeneous and standardized results to be obtained. This modular code includes tools for handling the different steps of the processing: CCD image reduction; identification and tracing of the echelle orders; optimal and rectangular extraction; computation of the wavelength solution; estimation of radial velocities; and rough and fast estimation of the atmospheric parameters. Currently, CERES has been used to develop automated pipelines for 13 different spectrographs, namely CORALIE, FEROS, HARPS, ESPaDOnS, FIES, PUCHEROS, FIDEOS, CAFE, DuPont/Echelle, Magellan/Mike, Keck/HIRES, Magellan/PFS, and APO/ARCES, but the routines can easily be used to handle data from other spectrographs. We show the high radial-velocity precision that CERES achieves for some of these instruments, and we briefly summarize results already obtained with the CERES pipelines.
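
    To give a feel for the radial-velocity step of such pipelines, the generic illustration below (not CERES code) cross-correlates an observed spectrum against a template over a grid of trial Doppler shifts and reports the velocity of the cross-correlation peak.

      import numpy as np

      C_KMS = 299_792.458   # speed of light in km/s

      def ccf_rv(wave, flux, t_wave, t_flux, v_grid):
          """Return the trial velocity that maximizes the cross-correlation."""
          ccf = []
          for v in v_grid:
              shifted = t_wave * (1.0 + v / C_KMS)          # Doppler-shift template
              resampled = np.interp(wave, shifted, t_flux)  # onto the observed grid
              ccf.append(np.sum(flux * resampled))
          return v_grid[int(np.argmax(ccf))]

      # Synthetic test: a Gaussian absorption line redshifted by +12 km/s.
      t_wave = np.linspace(5000.0, 5010.0, 2000)
      t_flux = 1.0 - 0.5 * np.exp(-0.5 * ((t_wave - 5005.0) / 0.05) ** 2)
      obs_wave = t_wave
      obs_flux = np.interp(obs_wave, t_wave * (1.0 + 12.0 / C_KMS), t_flux)

      v_grid = np.arange(-50.0, 50.0, 0.5)
      print(ccf_rv(obs_wave, obs_flux, t_wave, t_flux, v_grid))   # ~ +12 km/s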

  17. An adaptable product for material processing and life science missions

    NASA Technical Reports Server (NTRS)

    Wassick, Gregory; Dobbs, Michael

    1995-01-01

    The Experiment Control System II (ECS-II) is designed to make available to the microgravity research community the same tools and mode of automated experimentation that their ground-based counterparts have enjoyed for the last two decades. This design goal was accomplished by combining commercial automation tools familiar to the experimenter community with system control components that interface with the on-orbit platform in a distributed architecture. The architecture insulates the experimenter's automation tools from the components necessary for managing a payload. By using commercial software and hardware components whenever possible, development costs were greatly reduced compared to traditional space development projects. Using commercial off-the-shelf (COTS) components also improved usability by providing familiar user interfaces and a wealth of readily available documentation, and it reduced the need for training on system-specific details. The modularity of the distributed architecture makes it readily adaptable to different on-orbit experiments requiring robotics-based automation.

  18. Automation of 3D cell culture using chemically defined hydrogels.

    PubMed

    Rimann, Markus; Angres, Brigitte; Patocchi-Tenzer, Isabel; Braum, Susanne; Graf-Hausner, Ursula

    2014-04-01

    Drug development relies on high-throughput screening involving cell-based assays. Most of these assays are still based on cells grown in monolayers rather than in three-dimensional (3D) formats, although cells behave more in vivo-like in 3D. To exemplify the adoption of 3D techniques in drug development, this project investigated the automation of a hydrogel-based 3D cell culture system using a liquid-handling robot. The hydrogel technology used offers high flexibility of gel design due to a modular composition of a polymer network and bioactive components. The cell-inert degradation of the gel at the end of the culture period guaranteed the harmless isolation of live cells for further downstream processing. Human colon carcinoma HCT-116 cells were encapsulated and grown in these dextran-based hydrogels, thereby forming 3D multicellular spheroids. Viability and DNA content of the cells were shown to be similar in automated and manually produced hydrogels. Furthermore, cell treatment with a toxic Taxol concentration (100 nM) had the same effect on HCT-116 cell viability in manual and automated hydrogel preparations. Finally, a fully automated dose-response curve with the reference compound Taxol showed the potential of this hydrogel-based 3D cell culture system in advanced drug development.

  19. SCALE: A modular code system for performing Standardized Computer Analyses for Licensing Evaluation. Volume 1, Part 2: Control modules S1--H1; Revision 5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    SCALE--a modular code system for Standardized Computer Analyses for Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.3 of the system.

  20. An Approach to Automated Fusion System Design and Adaptation

    PubMed Central

    Fritze, Alexander; Mönks, Uwe; Holst, Christoph-Alexander; Lohweg, Volker

    2017-01-01

    Industrial applications are in transition towards modular and flexible architectures that are capable of self-configuration and self-optimisation, due to the demand for mass customisation and the increasing complexity of industrial systems. The conversion to modular systems brings challenges in all disciplines. Consequently, diverse tasks such as information processing, extensive networking, and system monitoring using sensor and information fusion systems need to be reconsidered. The focus of this contribution is on distributed sensor and information fusion systems for system monitoring, which must reflect the increasing flexibility of fusion systems. This contribution thus proposes an approach for the automatic design and update of sensor and information fusion systems, relying on a network of self-descriptive intelligent sensor nodes. The article covers fusion system configuration and adaptation as well as communication aspects. Manual interaction with the flexibly changing system is reduced to a minimum. PMID:28300762

  1. An Approach to Automated Fusion System Design and Adaptation.

    PubMed

    Fritze, Alexander; Mönks, Uwe; Holst, Christoph-Alexander; Lohweg, Volker

    2017-03-16

    Industrial applications are in transition towards modular and flexible architectures that are capable of self-configuration and self-optimisation, due to the demand for mass customisation and the increasing complexity of industrial systems. The conversion to modular systems brings challenges in all disciplines. Consequently, diverse tasks such as information processing, extensive networking, and system monitoring using sensor and information fusion systems need to be reconsidered. The focus of this contribution is on distributed sensor and information fusion systems for system monitoring, which must reflect the increasing flexibility of fusion systems. This contribution thus proposes an approach for the automatic design and update of sensor and information fusion systems, relying on a network of self-descriptive intelligent sensor nodes. The article covers fusion system configuration and adaptation as well as communication aspects. Manual interaction with the flexibly changing system is reduced to a minimum.

  2. Manufacturing Technology for Apparel Automation. Phases 1 and 2.

    DTIC Science & Technology

    1987-07-15

    Contents include a section on modularized work unit groups. The monthly interim reports are summarized in this semiannual report. Activity to date has included work performed by Ms. Carol Carrere and Dr. T. G. Clapp... Provide, in accordance with paragraph 3.1 of the Statement of Work (SOW), North Carolina State University's Technical Proposal, Manufacturing...

  3. The Dilemma of Department of Defense Business System Modernization Efforts: Why Intended Outcomes Have Not Been Fully Met and What Needs to Change

    DTIC Science & Technology

    2016-06-01

    between contract writing systems and the associated accounting and logistics systems. Employing this modular plug and play approach simplifies system...Automated Contract Preparation System (ACPS), Integrated Technical Item Management (ITIMP), and EProcurement are the contract writing systems that...research was on the DOD contract writing systems (CWS). This JAP seeks to report on the progress of DOD business system modernization efforts and

  4. The Joint Modular Intermodal Container, is this the Future of Naval Logistics?

    DTIC Science & Technology

    2005-06-01

    ...pallet size. Contrast this with the commercial shipping industry, which for the last 40 years has been moving non-bulk goods in hyper-efficient containers... ...a Heavy UNREP station rather than a current STREAM station. Figure 4: Heavy UNREP enables new loads to be passed between ships... ...man-hours are being spent on inefficient and relatively inaccurate paper-based accounting methods. The industry standard for automated accounting...

  5. SAGES: A Suite of Freely-Available Software Tools for Electronic Disease Surveillance in Resource-Limited Settings

    DTIC Science & Technology

    2011-05-10

    The SAGES tools may be used in concert with existing surveillance applications, or en masse for an end-to-end biosurveillance capability. This flexibility... ...health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular...

  6. Design of an Advanced Modular Automated Evaluation System for Experimental High Power SGTOS

    DTIC Science & Technology

    2013-06-01

    Authors: Shelby Lacouture, Kevin Lawson, Stephen Bayne, Michael Giesselmann, Heather O'Brien, Aderinto Ogunniyi, and Charles J. Scozzie; Center for Pulsed Power and Power Electronics, Department of Electrical & Computer Engineering, Texas Tech University.

  7. Towards a Multifunctional Electrochemical Sensing and Niosome Generation Lab-on-Chip Platform Based on a Plug-and-Play Concept.

    PubMed

    Kara, Adnane; Rouillard, Camille; Mathault, Jessy; Boisvert, Martin; Tessier, Frédéric; Landari, Hamza; Melki, Imene; Laprise-Pelletier, Myriam; Boisselier, Elodie; Fortin, Marc-André; Boilard, Eric; Greener, Jesse; Miled, Amine

    2016-05-28

    In this paper, we present a new modular lab on a chip design for multimodal neurotransmitter (NT) sensing and niosome generation based on a plug-and-play concept. This architecture is a first step toward an automated platform for an automated modulation of neurotransmitter concentration to understand and/or treat neurodegenerative diseases. A modular approach has been adopted in order to handle measurement or drug delivery or both measurement and drug delivery simultaneously. The system is composed of three fully independent modules: three-channel peristaltic micropumping system, a three-channel potentiostat and a multi-unit microfluidic system composed of pseudo-Y and cross-shape channels containing a miniature electrode array. The system was wirelessly controlled by a computer interface. The system is compact, with all the microfluidic and sensing components packaged in a 5 cm × 4 cm × 4 cm box. Applied to serotonin, a linear calibration curve down to 0.125 mM, with a limit of detection of 31 μM, was collected at unfunctionalized electrodes. Added sensitivity and selectivity was achieved by incorporating functionalized electrodes for dopamine sensing. Electrode functionalization was achieved with gold nanoparticles and using DNA and o-phenylene diamine polymer. The as-configured platform is demonstrated as a central component toward an "intelligent" drug delivery system based on a feedback loop to monitor drug delivery.
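
    Figures of merit like the calibration slope and the limit of detection reported above can be estimated from calibration data as sketched below; the concentrations and responses are made-up numbers, and the ICH-style convention LOD = 3.3 * sigma / slope is an assumption, since the paper's exact procedure is not given here.

      import numpy as np

      conc = np.array([0.125, 0.25, 0.5, 1.0, 2.0])      # mM serotonin standards
      signal = np.array([0.41, 0.83, 1.58, 3.22, 6.37])  # sensor response (uA)

      slope, intercept = np.polyfit(conc, signal, 1)     # linear calibration fit
      resid = signal - (slope * conc + intercept)
      sigma = resid.std(ddof=2)                          # residual standard error

      lod_mM = 3.3 * sigma / slope                       # ICH-style detection limit
      print(f"slope = {slope:.3f} uA/mM, LOD = {lod_mM * 1000:.0f} uM")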

  8. Towards a Multifunctional Electrochemical Sensing and Niosome Generation Lab-on-Chip Platform Based on a Plug-and-Play Concept

    PubMed Central

    Kara, Adnane; Rouillard, Camille; Mathault, Jessy; Boisvert, Martin; Tessier, Frédéric; Landari, Hamza; Melki, Imene; Laprise-Pelletier, Myriam; Boisselier, Elodie; Fortin, Marc-André; Boilard, Eric; Greener, Jesse; Miled, Amine

    2016-01-01

    In this paper, we present a new modular lab on a chip design for multimodal neurotransmitter (NT) sensing and niosome generation based on a plug-and-play concept. This architecture is a first step toward an automated platform for an automated modulation of neurotransmitter concentration to understand and/or treat neurodegenerative diseases. A modular approach has been adopted in order to handle measurement or drug delivery or both measurement and drug delivery simultaneously. The system is composed of three fully independent modules: three-channel peristaltic micropumping system, a three-channel potentiostat and a multi-unit microfluidic system composed of pseudo-Y and cross-shape channels containing a miniature electrode array. The system was wirelessly controlled by a computer interface. The system is compact, with all the microfluidic and sensing components packaged in a 5 cm × 4 cm × 4 cm box. Applied to serotonin, a linear calibration curve down to 0.125 mM, with a limit of detection of 31 μM was collected at unfunctionalized electrodes. Added sensitivity and selectivity was achieved by incorporating functionalized electrodes for dopamine sensing. Electrode functionalization was achieved with gold nanoparticles and using DNA and o-phenylene diamine polymer. The as-configured platform is demonstrated as a central component toward an “intelligent” drug delivery system based on a feedback loop to monitor drug delivery. PMID:27240377

  9. Human Impacts and Climate Change Influence Nestedness and Modularity in Food-Web and Mutualistic Networks.

    PubMed

    Takemoto, Kazuhiro; Kajihara, Kosuke

    2016-01-01

    Theoretical studies have indicated that nestedness and modularity, non-random structural patterns of ecological networks, influence the stability of ecosystems against perturbations; as such, climate change and human activity, as well as other sources of environmental perturbations, affect the nestedness and modularity of ecological networks. However, the effects of climate change and human activities on ecological networks are poorly understood. Here, we used a spatial analysis approach to examine the effects of climate change and human activities on the structural patterns of food webs and mutualistic networks, and found that ecological network structure is globally affected by climate change and human impacts, in addition to current climate. In pollination networks, for instance, nestedness increased and modularity decreased in response to increased human impacts. Modularity in seed-dispersal networks decreased with temperature change (i.e., warming), whereas food web nestedness increased and modularity declined in response to global warming. Although our findings are preliminary owing to data-analysis limitations, they enhance our understanding of the effects of environmental change on ecological communities.

  10. Investigating the application of AOP methodology in development of Financial Accounting Software using Eclipse-AJDT Environment

    NASA Astrophysics Data System (ADS)

    Sharma, Amita; Sarangdevot, S. S.

    2010-11-01

    Aspect-Oriented Programming (AOP) methodology has been investigated in development of real world business application software—Financial Accounting Software. Eclipse-AJDT environment has been used as open source enhanced IDE support for programming in AOP language—Aspect J. Crosscutting concerns have been identified and modularized as aspects. This reduces the complexity of the design considerably due to elimination of code scattering and tangling. Improvement in modularity, quality and performance is achieved. The study concludes that AOP methodology in Eclipse-AJDT environment offers powerful support for modular design and implementation of real world quality business software.

  11. An open-source data storage and visualization back end for experimental data.

    PubMed

    Nielsen, Kenneth; Andersen, Thomas; Jensen, Robert; Nielsen, Jane H; Chorkendorff, Ib

    2014-04-01

    In this article, a flexible free and open-source software system for data logging and presentation will be described. The system is highly modular and adaptable and can be used in any laboratory in which continuous and/or ad hoc measurements require centralized storage. A presentation component for the data back end has furthermore been written that enables live visualization of data on any device capable of displaying Web pages. The system consists of three parts: data-logging clients, a data server, and a data presentation Web site. The logging of data from independent clients leads to high resilience to equipment failure, whereas the central storage of data dramatically eases backup and data exchange. The visualization front end allows direct monitoring of acquired data to see live progress of long-duration experiments. This enables the user to alter experimental conditions based on these data and to interfere with the experiment if needed. The data stored consist both of specific measurements and of continuously logged system parameters. The latter is crucial to a variety of automation and surveillance features, and three cases of such features are described: monitoring system health, getting status of long-duration experiments, and implementation of instant alarms in the event of failure.
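
    A minimal, hypothetical logging client in the spirit of the architecture described above; the SQLite file, table layout, and parameter codename are invented stand-ins for the paper's actual data server and schema.

```python
# Sketch of a data-logging client writing to central storage (assumed schema).
import sqlite3
import time

db = sqlite3.connect("lab_data.sqlite")
db.execute("""CREATE TABLE IF NOT EXISTS measurements
              (unix_time REAL, codename TEXT, value REAL)""")

def log_measurement(codename, value):
    """Insert one continuously logged system parameter into central storage."""
    db.execute("INSERT INTO measurements VALUES (?, ?, ?)",
               (time.time(), codename, value))
    db.commit()

for _ in range(5):              # a real client would run forever as a daemon
    pressure = 1.2e-6           # placeholder for an instrument read-out
    log_measurement("chamber_pressure", pressure)
    time.sleep(1)               # continuous logging cadence
```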

  12. AEGIS: a robust and scalable real-time public health surveillance system.

    PubMed

    Reis, Ben Y; Kirby, Chaim; Hadden, Lucy E; Olson, Karen; McMurry, Andrew J; Daniel, James B; Mandl, Kenneth D

    2007-01-01

    In this report, we describe the Automated Epidemiological Geotemporal Integrated Surveillance system (AEGIS), developed for real-time population health monitoring in the state of Massachusetts. AEGIS provides public health personnel with automated near-real-time situational awareness of utilization patterns at participating healthcare institutions, supporting surveillance of bioterrorism and naturally occurring outbreaks. As real-time public health surveillance systems become integrated into regional and national surveillance initiatives, the challenges of scalability, robustness, and data security become increasingly prominent. A modular and fault tolerant design helps AEGIS achieve scalability and robustness, while a distributed storage model with local autonomy helps to minimize risk of unauthorized disclosure. The report includes a description of the evolution of the design over time in response to the challenges of a regional and national integration environment.

  13. Automated microbeam observation environment for biological analysis—Custom portable environmental control applied to a vertical microbeam system

    PubMed Central

    England, Matthew J.; Bigelow, Alan W.; Merchant, Michael J.; Velliou, Eirini; Welch, David; Brenner, David J.; Kirkby, Karen J.

    2018-01-01

    Vertical Microbeams (VMB) are used to irradiate individual cells with low MeV energy ions. The irradiation of cells using VMBs requires cells to be removed from an incubator; this can cause physiological changes to cells because of the lower CO2 concentration, temperature and relative humidity outside of the incubator. Consequently, for experiments where cells require irradiation and observation for extended time periods, it is important to provide a controlled environment. The highly customised nature of the microscopes used on VMB systems means that there are no commercially available environmentally controlled microscope systems for VMB systems. The Automated Microbeam Observation Environment for Biological Analysis (AMOEBA) is a highly flexible modular environmental control system used to create incubator conditions on the end of a VMB. The AMOEBA takes advantage of the recent “maker” movement to create an open source control system that can be easily configured by the user to fit their control needs even beyond VMB applications. When applied to the task of controlling cell medium temperature, CO2 concentration and relative humidity on VMBs it creates a stable environment that allows cells to multiply on the end of a VMB over a period of 36 h, providing a low-cost (costing less than $2700 to build), customisable alternative to commercial time-lapse microscopy systems. AMOEBA adds the potential of VMBs to explore the long-term effects of radiation on single cells opening up new research areas for VMBs. PMID:29515291

  14. Automated microbeam observation environment for biological analysis-Custom portable environmental control applied to a vertical microbeam system.

    PubMed

    England, Matthew J; Bigelow, Alan W; Merchant, Michael J; Velliou, Eirini; Welch, David; Brenner, David J; Kirkby, Karen J

    2017-02-01

    Vertical Microbeams (VMB) are used to irradiate individual cells with low MeV energy ions. The irradiation of cells using VMBs requires cells to be removed from an incubator; this can cause physiological changes to cells because of the lower CO 2 concentration, temperature and relative humidity outside of the incubator. Consequently, for experiments where cells require irradiation and observation for extended time periods, it is important to provide a controlled environment. The highly customised nature of the microscopes used on VMB systems means that there are no commercially available environmentally controlled microscope systems for VMB systems. The Automated Microbeam Observation Environment for Biological Analysis (AMOEBA) is a highly flexible modular environmental control system used to create incubator conditions on the end of a VMB. The AMOEBA takes advantage of the recent "maker" movement to create an open source control system that can be easily configured by the user to fit their control needs even beyond VMB applications. When applied to the task of controlling cell medium temperature, CO 2 concentration and relative humidity on VMBs it creates a stable environment that allows cells to multiply on the end of a VMB over a period of 36 h, providing a low-cost (costing less than $2700 to build), customisable alternative to commercial time-lapse microscopy systems. AMOEBA adds the potential of VMBs to explore the long-term effects of radiation on single cells opening up new research areas for VMBs.

  15. A Novel Multilevel DC-AC Converter from Green Energy Power Generators Using Step-Square Waving and PWM Technique

    NASA Astrophysics Data System (ADS)

    Fajingbesi, F. E.; Midi, N. S.; Khan, S.

    2017-06-01

    Green energy sources, or renewable energy systems, generally utilize a modular approach in their design. These power sources generally produce DC or, in some cases, AC. Because the natural origin of this energy (wind and solar) fluctuates strongly, the energy is stored as DC. DC power, however, is difficult to transfer over long distances, so DC-to-AC converters and storage systems are very important in green energy system design. In this work we have designed a novel multilevel DC-to-AC converter that takes into account the modular design of green energy systems. A power conversion efficiency of 99% with reduced total harmonic distortion (THD) was recorded for our simulated system design.
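
    For context, THD compares the energy in the harmonics of the output waveform with the energy at the fundamental. The sketch below is my own illustration, not the paper's simulation: it estimates THD for a crude five-level step-square approximation of a sine wave.

```python
# Estimate THD = sqrt(sum of harmonic amplitudes^2) / fundamental amplitude
# for a synthesized multilevel "step-square" waveform (illustrative only).
import numpy as np

fs, f0, cycles = 50_000, 50, 10          # sample rate, fundamental, cycles
t = np.arange(0, cycles / f0, 1 / fs)

levels = np.round(2 * np.sin(2 * np.pi * f0 * t)) / 2   # 5-level staircase
spectrum = np.abs(np.fft.rfft(levels))
freqs = np.fft.rfftfreq(len(levels), 1 / fs)

fund = spectrum[np.argmin(np.abs(freqs - f0))]
harmonics = [spectrum[np.argmin(np.abs(freqs - k * f0))] for k in range(2, 40)]
thd = np.sqrt(sum(h**2 for h in harmonics)) / fund
print(f"THD ~ {100 * thd:.1f}%")
```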

  16. Development of a Deployable Nonmetallic Boom for Reconfigurable Systems of Small Modular Spacecraft

    NASA Technical Reports Server (NTRS)

    Rehnmark, Fredrik

    2007-01-01

    Launch vehicle payload capacity and the launch environment represent two of the most operationally limiting constraints on space system mass, volume, and configuration. Large-scale space science and power platforms as well as transit vehicles have been proposed that greatly exceed single-launch capabilities. Reconfigurable systems launched as multiple small modular spacecraft with the ability to rendezvous, approach, mate, and conduct coordinated operations have the potential to make these designs feasible. A key characteristic of these proposed systems is their ability to assemble into desired geometric (spatial) configurations. While flexible and sparse formations may be realized by groups of spacecraft flying in close proximity, flyers physically connected by active structural elements could continuously exchange power, fluids, and heat (via fluids). Configurations of small modular spacecraft temporarily linked together could be sustained as long as needed with minimal propellant use and reconfigured as often as needed over extended missions with changing requirements. For example, these vehicles could operate in extremely compact configurations during boost phases of a mission and then redeploy to generate power or communicate while coasting and upon reaching orbit. In 2005, NASA funded Phase 1 of a program called Modular Reconfigurable High-Energy Technology Demonstrator Assembly Testbed (MRHE) to investigate reconfigurable systems of small spacecraft. The MRHE team was led by NASA's Marshall Space Flight Center and included Lockheed Martin's Advanced Technology Center (ATC) in Palo Alto and its subcontractor, ATK. One of the goals of Phase 1 was to develop an MRHE concept demonstration in a relevant 1-g environment to highlight a number of requisite technologies. In Phase 1 of the MRHE program, Lockheed Martin devised and conducted an automated space system assembly demonstration featuring multipurpose free-floating robots representing spacecraft in the newly built Controls and Automation Laboratory (CAL) at the ATC. The CAL lab features a 12' x 24' granite air-bearing table and an overhead simulated starfield. Among the technologies needed for the concept demo were mating interfaces allowing the spacecraft to dock and deployable structures allowing for adjustable separation between spacecraft after a rigid connection had been established. The decision to use a nonmetallic deployable boom for this purpose was driven by the MRHE concept demo requirements reproduced in Table 1.

  17. Rapid prototyping prosthetic hand acting by a low-cost shape-memory-alloy actuator.

    PubMed

    Soriano-Heras, Enrique; Blaya-Haro, Fernando; Molino, Carlos; de Agustín Del Burgo, José María

    2018-06-01

    The purpose of this article is to develop a new concept of a modular and operative prosthetic hand based on rapid prototyping and a novel shape-memory-alloy (SMA) actuator, thus minimizing the manufacturing costs. An underactuated mechanism was needed for the design of the prosthesis to use only one input source. Taking into account the state of the art, an underactuated-mechanism prosthetic hand was chosen so as to implement the modifications required for including the external SMA actuator. A modular design of a new prosthesis was developed which incorporated a novel SMA actuator for the index finger movement. The primary objective of the prosthesis is achieved, obtaining a modular and functional low-cost prosthesis based on additive manufacturing and actuated by a novel SMA actuator. The external SMA actuator provides a modular system, allowing it to be implemented in different systems. This paper combines rapid prototyping and a novel SMA actuator to develop a new concept of modular and operative low-cost prosthetic hand.

  18. Development of a beam builder for automatic fabrication of large composite space structures

    NASA Technical Reports Server (NTRS)

    Bodle, J. G.

    1979-01-01

    The composite material beam builder, which will produce triangular beams from pre-consolidated graphite/glass/thermoplastic composite material through automated mechanical processes, is presented. Side member storage, feed and positioning, ultrasonic welding, and beam cutoff are performed as automated processes. Each process lends itself to modular subsystem development. Initial development is concentrated on the key processes for roll forming and ultrasonic welding of composite thermoplastic materials. The construction and testing of an experimental roll-forming machine and ultrasonic welding process control techniques are described.

  19. ASPEN Version 3.0

    NASA Technical Reports Server (NTRS)

    Rabideau, Gregg; Chien, Steve; Knight, Russell; Schaffer, Steven; Tran, Daniel; Cichy, Benjamin; Sherwood, Robert

    2006-01-01

    The Automated Scheduling and Planning Environment (ASPEN) computer program has been updated to version 3.0. ASPEN is a modular, reconfigurable, application software framework for solving batch problems that involve reasoning about time, activities, states, and resources. Applications of ASPEN can include planning spacecraft missions, scheduling of personnel, and managing supply chains, inventories, and production lines. ASPEN 3.0 can be customized for a wide range of applications and for a variety of computing environments that include various central processing units and random access memories.

  20. Multiple-Objective Stepwise Calibration Using Luca

    USGS Publications Warehouse

    Hay, Lauren E.; Umemoto, Makiko

    2007-01-01

    This report documents Luca (Let us calibrate), a multiple-objective, stepwise, automated procedure for hydrologic model calibration and the associated graphical user interface (GUI). Luca is a wizard-style user-friendly GUI that provides an easy systematic way of building and executing a calibration procedure. The calibration procedure uses the Shuffled Complex Evolution global search algorithm to calibrate any model compiled with the U.S. Geological Survey's Modular Modeling System. This process assures that intermediate and final states of the model are simulated consistently with measured values.
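
    The stepwise idea, calibrating one parameter group against one objective per step while holding earlier results fixed, can be sketched as follows. Plain random search stands in here for the Shuffled Complex Evolution algorithm Luca actually uses, and the model and objective values are invented.

```python
# Toy stepwise, multiple-objective calibration loop (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
obs_snow, obs_flow = 3.2, 14.0        # invented "measured" objectives

def model(p):
    # Stand-in model: two outputs, each dominated by one parameter group.
    return p["melt"] * 1.6, p["melt"] * 0.5 + p["routing"] * 10.0

params = {"melt": 1.0, "routing": 1.0}
steps = [("melt", lambda out: abs(out[0] - obs_snow)),      # step 1: snow
         ("routing", lambda out: abs(out[1] - obs_flow))]   # step 2: flow

for name, objective in steps:         # later steps keep earlier results fixed
    best = objective(model(params))
    for _ in range(2000):
        trial = dict(params, **{name: rng.uniform(0, 5)})
        err = objective(model(trial))
        if err < best:
            params, best = trial, err
    print(f"calibrated {name} = {params[name]:.3f} (error {best:.4f})")
```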

  1. Chemical markup, XML, and the World Wide Web. 5. Applications of chemical metadata in RSS aggregators.

    PubMed

    Murray-Rust, Peter; Rzepa, Henry S; Williamson, Mark J; Willighagen, Egon L

    2004-01-01

    Examples of the use of the RSS 1.0 (RDF Site Summary) specification together with CML (Chemical Markup Language) to create a metadata-based alerting service termed CMLRSS for molecular content are presented. CMLRSS can be viewed either using generic software or with modular open-source chemical viewers and editors enhanced with CMLRSS modules. We discuss the more automated use of CMLRSS as a component of a World Wide Molecular Matrix of semantically rich chemical information.

  2. Automation of Flight Software Regression Testing

    NASA Technical Reports Server (NTRS)

    Tashakkor, Scott B.

    2016-01-01

    NASA is developing the Space Launch System (SLS) to be a heavy lift launch vehicle supporting human and scientific exploration beyond Earth orbit. SLS will have a common core stage, an upper stage, and different permutations of boosters and fairings to perform various crewed or cargo missions. Marshall Space Flight Center (MSFC) is writing the Flight Software (FSW) that will operate the SLS launch vehicle. The FSW is developed in an incremental manner based on "Agile" software techniques. As the FSW is incrementally developed, the functionality of the code needs to be tested continually to ensure that the integrity of the software is maintained. Manually testing an ever-growing set of requirements and features is not efficient, so testing needs to be automated to remain comprehensive. To support test automation, a framework for a regression test harness has been developed and used on SLS FSW. The test harness provides a modular design approach that can compile or read in the required information specified by the developer of the test. The modularity provides independence between groups of tests and the ability to add and remove tests without disturbing others. This gives the SLS FSW team a time-saving feature that is essential to meeting SLS Program technical and programmatic requirements. During development of SLS FSW, this technique has proved to be a useful tool to ensure all requirements have been tested and that desired functionality is maintained as changes occur. It also provides a mechanism for developers to check the functionality of the code that they have developed. With this system, automation of regression testing is accomplished through a scheduling tool and/or commit hooks. Key advantages of this test harness capability include execution support for multiple independent test cases, the ability for developers to specify precisely what they are testing and how, the ability to add automation, and the ability of the harness and cases to be executed continually. This test concept is an approach that can be adapted to support other projects.
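
    A minimal sketch of such a modular harness is shown below; the registry, decorator, and requirement names are invented for illustration and are not the SLS FSW test code.

```python
# Toy modular regression-test harness: tests register independently against
# the requirement they exercise; one failure never blocks the others.
import traceback

REGISTRY = []

def test_case(requirement):
    """Register a test against the requirement it exercises."""
    def wrap(fn):
        REGISTRY.append((requirement, fn))
        return fn
    return wrap

@test_case("REQ-GNC-042: attitude command clamped to limits")
def test_attitude_clamp():
    clamp = lambda x, lo, hi: max(lo, min(hi, x))
    assert clamp(9.9, -1.0, 1.0) == 1.0

def run_all():
    """Run every registered case independently and report pass/fail."""
    for req, fn in REGISTRY:
        try:
            fn()
            print(f"PASS  {req}")
        except Exception:
            print(f"FAIL  {req}\n{traceback.format_exc()}")

if __name__ == "__main__":
    run_all()   # could equally be invoked by a scheduler or a commit hook
```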

  3. Navy Expeditionary Technology Transition Program (NETTP)

    DTIC Science & Technology

    2012-03-02

    ...water vapor from feed air using a zeolite membrane; temperature/humidity levels can be met in warm, humid climates without reheating; allows higher... UNCLASSIFIED, Distribution Unlimited. Modular Thermal Hub: small, efficient absorption cooling; energy source: combustion, low-grade waste heat, or solar thermal energy; reversible operation enables space cooling, space heating, and water heating; modular cooling and heating unit; monolithic packaging offers...

  4. Characteristics of detectors for prevention of nuclear radiation terrorism

    NASA Astrophysics Data System (ADS)

    Kolesnikov, S. V.; Ryabeva, E. V.; Samosadny, V. T.

    2017-01-01

    One type of detector used for the prevention of nuclear terrorism is described; it determines the direction to a radioactive source and the geometrical structure of the radiation field. This type is a modular detector with anisotropic sensitivity. A modular detecting device works through the simultaneous operation of several detecting modules with anisotropic sensitivity to gamma radiation.

  5. A modular optically powered floating high voltage generator.

    PubMed

    Antonini, P; Borsato, E; Carugno, G; Pegoraro, M; Zotto, P

    2013-02-01

    The feasibility of fully floating high voltage (HV) generation was demonstrated by producing a prototype of a modular HV system. The primary power source is provided by a high-efficiency semiconductor power cell illuminated by a laser system, ensuring the floating nature of each module. The HV is then generated by dc-dc conversion and a HV multiplier. The possibility of series connection among modules was verified.

  6. Space Ultrareliable Modular Computer (SUMC) instruction simulator

    NASA Technical Reports Server (NTRS)

    Curran, R. T.

    1972-01-01

    The design principles, description, functional operation, and recommended expansion and enhancements are presented for the Space Ultrareliable Modular Computer interpretive simulator. Included as appendices are the user's manual, program module descriptions, target instruction descriptions, simulator source program listing, and a sample program printout. In discussing the design and operation of the simulator, the key problems involving host computer independence and target computer architectural scope are brought into focus.

  7. Progress toward Modular UAS for Geoscience Applications

    NASA Astrophysics Data System (ADS)

    Dahlgren, R. P.; Clark, M. A.; Comstock, R. J.; Fladeland, M.; Gascot, H., III; Haig, T. H.; Lam, S. J.; Mazhari, A. A.; Palomares, R. R.; Pinsker, E. A.; Prathipati, R. T.; Sagaga, J.; Thurling, J. S.; Travers, S. V.

    2017-12-01

    Small Unmanned Aerial Systems (UAS) have become accepted tools for geoscience, ecology, agriculture, disaster response, land management, and industry. A variety of consumer UAS options exist as science and engineering payload platforms, but their incompatibilities with one another contribute to high operational costs compared with those of piloted aircraft. This research explores the concept of modular UAS, demonstrating airframes that can be reconfigured in the field for experimental optimization, to enable multi-mission support, facilitate rapid repair, or respond to changing field conditions. Modular UAS is revolutionary in allowing aircraft to be optimized around the payload, reversing the conventional wisdom of designing the payload to accommodate an unmodifiable aircraft. UAS that are reconfigurable like Legos™ are ideal for airborne science service providers, system integrators, instrument designers and end users to fulfill a wide range of geoscience experiments. Modular UAS facilitate the adoption of open-source software and rapid prototyping technology where design reuse is important in the context of a highly regulated industry like aerospace. The industry is now at a stage where consolidation, acquisition, and attrition will reduce the number of small manufacturers, reducing innovation and the motivation to lower costs. Modularity leads to interface specifications, which can evolve into de facto or formal standards which contain minimum (but sufficient) details such that multiple vendors can then design to those standards and demonstrate interoperability. At that stage, vendor coopetition leads to robust interface standards, interoperability standards and multi-source agreements which in turn drive costs down significantly.

  8. Laboratory automation: trajectory, technology, and tactics.

    PubMed

    Markin, R S; Whalen, S A

    2000-05-01

    Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations is small and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a modular approach, from a hardware-driven system to process control, from a one-of-a-kind novelty toward a standardized product, and from an in vitro diagnostics novelty to a marketing tool. Multiple vendors are present in the marketplace, many of whom are in vitro diagnostics manufacturers providing an automation solution coupled with their instruments, whereas others are focused automation companies. Automation technology continues to advance, acceptance continues to climb, and payback and cost justification methods are developing.

  9. An Open Source Low-Cost Automatic System for Image-Based 3D Digitization

    NASA Astrophysics Data System (ADS)

    Menna, F.; Nocerino, E.; Morabito, D.; Farella, E. M.; Perini, M.; Remondino, F.

    2017-11-01

    3D digitization of heritage artefacts, reverse engineering of industrial components, and rapid prototyping-driven design are key topics today. Indeed, millions of archaeological finds all over the world need to be surveyed in 3D, either to allow convenient investigations by researchers, because they are inaccessible to visitors and scientists, or, unfortunately, because they are seriously endangered by wars and terrorist attacks. On the other hand, in the case of industrial and design components there is often the need for deformation analyses or physical replicas starting from reality-based 3D digitisations. The paper is aligned with these needs and presents the realization of the ORION (arduinO Raspberry pI rOtating table for image based 3D recostructioN) prototype system, with its hardware and software components, providing critical insights about its modular design. ORION is an image-based 3D reconstruction system based on automated photogrammetric acquisitions and processing. The system is being developed under a collaborative educational project between FBK Trento, the University of Trento, and internship programs with high schools in the Trentino province (Italy).
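
    A hypothetical acquisition loop for an Arduino/Raspberry Pi turntable of this kind might look like the following; the "STEP" serial command, port name, and gphoto2 capture call are assumptions for illustration, not the actual ORION firmware protocol.

```python
# Sketch: rotate the turntable one station, let vibrations settle, take a
# photo, repeat for a full revolution (hardware protocol is assumed).
import subprocess
import time

import serial  # pyserial

STEPS = 36                                  # 10 degrees per station
port = serial.Serial("/dev/ttyACM0", 9600, timeout=2)

def rotate_one_step():
    port.write(b"STEP\n")                   # assumed firmware command
    port.readline()                         # wait for the firmware's reply

for i in range(STEPS):
    rotate_one_step()
    time.sleep(1.0)                         # settle before exposing
    # gphoto2 is one common way to trigger a tethered camera from a Pi
    subprocess.run(["gphoto2", "--capture-image-and-download",
                    "--filename", f"orion_{i:03d}.jpg"], check=True)
```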

  10. A modular approach to addressing model design, scale, and parameter estimation issues in distributed hydrological modelling

    USGS Publications Warehouse

    Leavesley, G.H.; Markstrom, S.L.; Restrepo, Pedro J.; Viger, R.J.

    2002-01-01

    A modular approach to model design and construction provides a flexible framework in which to focus the multidisciplinary research and operational efforts needed to facilitate the development, selection, and application of the most robust distributed modelling methods. A variety of modular approaches have been developed, but with little consideration for compatibility among systems and concepts. Several systems are proprietary, limiting any user interaction. The US Geological Survey modular modelling system (MMS) is a modular modelling framework that uses an open source software approach to enable all members of the scientific community to address collaboratively the many complex issues associated with the design, development, and application of distributed hydrological and environmental models. Implementation of a common modular concept is not a trivial task. However, it brings the resources of a larger community to bear on the problems of distributed modelling, provides a framework in which to compare alternative modelling approaches objectively, and provides a means of sharing the latest modelling advances. The concepts and components of the MMS are described and an example application of the MMS, in a decision-support system context, is presented to demonstrate current system capabilities. Copyright © 2002 John Wiley and Sons, Ltd.

  11. Note: Hollow cathode lamp with integral, high optical efficiency isolation valve: a modular vacuum ultraviolet source.

    PubMed

    Roberts, F Sloan; Anderson, Scott L

    2013-12-01

    The design and operating conditions of a hollow cathode discharge lamp for the generation of vacuum ultraviolet radiation, suitable for ultrahigh vacuum (UHV) application, are described in detail. The design is easily constructed, and modular, allowing it to be adapted to different experimental requirements. A thin isolation valve is built into one of the differential pumping stages, isolating the discharge section from the UHV section, both for vacuum safety and to allow lamp maintenance without venting the UHV chamber. The lamp has been used both for ultraviolet photoelectron spectroscopy of surfaces and as a "soft" photoionization source for gas-phase mass spectrometry.

  12. NASA Tech Briefs, August 2003

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Topics covered include: Stable, Thermally Conductive Fillers for Bolted Joints; Connecting to Thermocouples with Fewer Lead Wires; Zipper Connectors for Flexible Electronic Circuits; Safety Interlock for Angularly Misdirected Power Tool; Modular, Parallel Pulse-Shaping Filter Architectures; High-Fidelity Piezoelectric Audio Device; Photovoltaic Power Station with Ultracapacitors for Storage; Time Analyzer for Time Synchronization and Monitor of the Deep Space Network; Program for Computing Albedo; Integrated Software for Analyzing Designs of Launch Vehicles; Abstract-Reasoning Software for Coordinating Multiple Agents; Software Searches for Better Spacecraft-Navigation Models; Software for Partly Automated Recognition of Targets; Antistatic Polycarbonate/Copper Oxide Composite; Better VPS Fabrication of Crucibles and Furnace Cartridges; Burn-Resistant, Strong Metal-Matrix Composites; Self-Deployable Spring-Strip Booms; Explosion Welding for Hermetic Containerization; Improved Process for Fabricating Carbon Nanotube Probes; Automated Serial Sectioning for 3D Reconstruction; and Parallel Subconvolution Filtering Architectures.

  13. Stratway: A Modular Approach to Strategic Conflict Resolution

    NASA Technical Reports Server (NTRS)

    Hagen, George E.; Butler, Ricky W.; Maddalon, Jeffrey M.

    2011-01-01

    In this paper we introduce Stratway, a modular approach to finding long-term strategic resolutions to conflicts between aircraft. The modular approach provides both advantages and disadvantages. Our primary concern is to investigate the implications on the verification of safety-critical properties of a strategic resolution algorithm. By partitioning the problem into verifiable modules, much stronger verification claims can be established. Since strategic resolution involves searching for solutions over an enormous state space, Stratway, like most similar algorithms, searches these spaces by applying heuristics, which present especially difficult verification challenges. An advantage of a modular approach is that it makes a clear distinction between the resolution function and the trajectory generation function. This allows the resolution computation to be independent of any particular vehicle. The Stratway algorithm was developed in both Java and C++ and is available under an open-source license. Additionally, there is a visualization application that is helpful for analyzing and quickly creating conflict scenarios.

  14. Biomechanical, Physiological, and Agility Performance of Soldiers Carrying Loads: A Comparison of the Modular Lightweight Load Carrying Equipment and a Lightning Packs, LLC, Prototype

    DTIC Science & Technology

    2016-12-27

    Approved for public release; distribution is unlimited. U.S. Army Natick Soldier Research, Development and Engineering Center. Subject terms: MOLLE (Modular Lightweight Load Carrying Equipment), human factors engineering.

  15. Synthetic Aperture Imaging Polarimeter: Postprint

    DTIC Science & Technology

    2010-02-01

    ...mechanical design of the SAIP prototype revolves around the concept of a modular array. The modular aspect allows for the array to be built in... imagery of source. The top row images are of the actual fringe pattern incident on the SAIP prototype array. These pictures were taken through the... processed images associated with each of the inputs. The results demonstrated that the SAIP prototype array works in conjunction with the algorithm.

  16. Modular hardware synthesis using an HDL. [Hardware Description Language

    NASA Technical Reports Server (NTRS)

    Covington, J. A.; Shiva, S. G.

    1981-01-01

    Although hardware description languages (HDL) are becoming more and more necessary to automated design systems, their application is complicated due to the difficulty of translating the HDL description into an implementable format, the nonfamiliarity of hardware designers with high-level language programming, nonuniform design methodologies, and the time and costs involved in transferring HDL design software. Digital design language (DDL) suffers from all of the above problems and, in addition, can only be synthesized on a complete system and not on its subparts, making it unsuitable for synthesis using standard modules or prefabricated chips such as those required in LSI or VLSI circuits. The present paper presents a method by which the DDL translator can be made to generate modular equations that will allow the system to be synthesized as an interconnection of lower-level modules. The method involves the introduction of a new language construct called a Module which provides for the separate translation of all equations bounded by it.

  17. The Rotary Zone Thermal Cycler: A Low-Power System Enabling Automated Rapid PCR

    PubMed Central

    Bartsch, Michael S.; Renzi, Ronald F.; Van de Vreugde, James L.; Kim, Hanyoup; Knight, Daniel L.; Sinha, Anupama; Branda, Steven S.; Patel, Kamlesh D.

    2015-01-01

    Advances in molecular biology, microfluidics, and laboratory automation continue to expand the accessibility and applicability of these methods beyond the confines of conventional, centralized laboratory facilities and into point of use roles in clinical, military, forensic, and field-deployed applications. As a result, there is a growing need to adapt the unit operations of molecular biology (e.g., aliquoting, centrifuging, mixing, and thermal cycling) to compact, portable, low-power, and automation-ready formats. Here we present one such adaptation, the rotary zone thermal cycler (RZTC), a novel wheel-based device capable of cycling up to four different fixed-temperature blocks into contact with a stationary 4-microliter capillary-bound sample to realize 1-3 second transitions with steady state heater power of less than 10 W. We demonstrate the utility of the RZTC for DNA amplification as part of a highly integrated rotary zone PCR (rzPCR) system that uses low-volume valves and syringe-based fluid handling to automate sample loading and unloading, thermal cycling, and between-run cleaning functionalities in a compact, modular form factor. In addition to characterizing the performance of the RZTC and the efficacy of different online cleaning protocols, we present preliminary results for rapid single-plex PCR, multiplex short tandem repeat (STR) amplification, and second strand cDNA synthesis. PMID:25826708
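
    The cycling concept can be sketched as a simple schedule that rotates fixed-temperature blocks into contact with the stationary sample; the zone names, temperatures, and dwell times below are illustrative, not the authors' protocol.

```python
# Illustrative rotary-zone thermal-cycling schedule (not the authors' code).
import time

ZONES = {"denature": 95.0, "anneal": 58.0, "extend": 72.0, "clean": 25.0}

def rotate_to(zone):
    # Placeholder for commanding the wheel; the paper reports 1-3 s transitions.
    print(f"rotating {zone} block ({ZONES[zone]} C) into contact")
    time.sleep(2)

def cycle(n_cycles=30, dwell=(5, 10, 15)):
    """Run a basic 3-step PCR protocol on the rotary zone cycler."""
    for i in range(n_cycles):
        for zone, hold in zip(("denature", "anneal", "extend"), dwell):
            rotate_to(zone)
            time.sleep(hold)    # hold the block against the capillary
        print(f"cycle {i + 1}/{n_cycles} complete")
    rotate_to("clean")          # between-run cleaning position

if __name__ == "__main__":
    cycle(n_cycles=2, dwell=(1, 1, 1))   # shortened demo run
```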

  18. Improving Flight Software Module Validation Efforts : a Modular, Extendable Testbed Software Framework

    NASA Technical Reports Server (NTRS)

    Lange, R. Connor

    2012-01-01

    Ever since Explorer-1, the United States' first Earth satellite, was developed and launched in 1958, JPL has developed many more spacecraft, including landers and orbiters. While these spacecraft vary greatly in their missions, capabilities, and destinations, they all have something in common: all of their components had to be comprehensively tested. While thorough testing is important to mitigate risk, it is also a very expensive and time-consuming process. Thankfully, since virtually all of the software testing procedures for SMAP are computer controlled, these procedures can be automated. Most people testing SMAP flight software (FSW) would only need to write tests that exercise specific requirements and then check the filtered results to verify everything occurred as planned. This gives developers the ability to automatically launch tests on the testbed, distill the resulting logs into only the important information, generate validation documentation, and then deliver the documentation to management. With many of the steps in FSW testing automated, developers can use their limited time more effectively, validate SMAP FSW modules more quickly, and test them more rigorously. As a result of the various benefits of automating much of the testing process, management is considering this automated tool's use in future FSW validation efforts.

  19. Note: Hollow cathode lamp with integral, high optical efficiency isolation valve: A modular vacuum ultraviolet source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sloan Roberts, F.; Anderson, Scott L.

    2013-12-15

    The design and operating conditions of a hollow cathode discharge lamp for the generation of vacuum ultraviolet radiation, suitable for ultrahigh vacuum (UHV) application, are described in detail. The design is easily constructed, and modular, allowing it to be adapted to different experimental requirements. A thin isolation valve is built into one of the differential pumping stages, isolating the discharge section from the UHV section, both for vacuum safety and to allow lamp maintenance without venting the UHV chamber. The lamp has been used both for ultraviolet photoelectron spectroscopy of surfaces and as a “soft” photoionization source for gas-phase mass spectrometry.

  20. Comparing clinical automated, medical record, and hybrid data sources for diabetes quality measures.

    PubMed

    Kerr, Eve A; Smith, Dylan M; Hogan, Mary M; Krein, Sarah L; Pogach, Leonard; Hofer, Timothy P; Hayward, Rodney A

    2002-10-01

    Little is known about the relative reliability of medical record and clinical automated data, sources commonly used to assess diabetes quality of care. The agreement between diabetes quality measures constructed from clinical automated versus medical record data sources was compared, and the performance of hybrid measures derived from a combination of the two data sources was examined. Medical records were abstracted for 1,032 patients with diabetes who received care from 21 facilities in 4 Veterans Integrated Service Networks. Automated data were obtained from a central Veterans Health Administration diabetes registry containing information on laboratory tests and medication use. Success rates were higher for process measures derived from medical record data than from automated data, but no substantial differences among data sources were found for the intermediate outcome measures. Agreement for measures derived from the medical record compared with automated data was moderate for process measures but high for intermediate outcome measures. Hybrid measures yielded success rates similar to those of medical record-based measures but would have required about 50% fewer chart reviews. Agreement between medical record and automated data was generally high. Yet even in an integrated health care system with sophisticated information technology, automated data tended to underestimate the success rate in technical process measures for diabetes care and yielded different quartile performance rankings for facilities. Applying hybrid methodology yielded results consistent with the medical record but required less data to come from medical record reviews.

  1. Forward Modeling of Large-scale Structure: An Open-source Approach with Halotools

    NASA Astrophysics Data System (ADS)

    Hearin, Andrew P.; Campbell, Duncan; Tollerud, Erik; Behroozi, Peter; Diemer, Benedikt; Goldbaum, Nathan J.; Jennings, Elise; Leauthaud, Alexie; Mao, Yao-Yuan; More, Surhud; Parejko, John; Sinha, Manodeep; Sipöcz, Brigitta; Zentner, Andrew

    2017-11-01

    We present the first stable release of Halotools (v0.2), a community-driven Python package designed to build and test models of the galaxy-halo connection. Halotools provides a modular platform for creating mock universes of galaxies starting from a catalog of dark matter halos obtained from a cosmological simulation. The package supports many of the common forms used to describe galaxy-halo models: the halo occupation distribution, the conditional luminosity function, abundance matching, and alternatives to these models that include effects such as environmental quenching or variable galaxy assembly bias. Satellite galaxies can be modeled to live in subhalos or to follow custom number density profiles within their halos, including spatial and/or velocity bias with respect to the dark matter profile. The package has an optimized toolkit to make mock observations on a synthetic galaxy population—including galaxy clustering, galaxy-galaxy lensing, galaxy group identification, RSD multipoles, void statistics, pairwise velocities and others—allowing direct comparison to observations. Halotools is object-oriented, enabling complex models to be built from a set of simple, interchangeable components, including those of your own creation. Halotools has an automated testing suite and is exhaustively documented on http://halotools.readthedocs.io, which includes quickstart guides, source code notes and a large collection of tutorials. The documentation is effectively an online textbook on how to build and study empirical models of galaxy formation with Python.
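
    A short usage sketch following Halotools' documented prebuilt-model workflow; FakeSim is the package's built-in toy catalog for testing. The exact API may differ between versions, so check the documentation linked above.

```python
# Populate a (toy) halo catalog with an HOD model and measure clustering.
import numpy as np
from halotools.empirical_models import PrebuiltHodModelFactory
from halotools.mock_observables import tpcf
from halotools.sim_manager import FakeSim

model = PrebuiltHodModelFactory("zheng07", threshold=-20)  # standard HOD model
halocat = FakeSim()                 # small fake simulation for testing
model.populate_mock(halocat)        # paint galaxies onto the halo catalog

gals = model.mock.galaxy_table
sample = np.vstack([gals["x"], gals["y"], gals["z"]]).T

rbins = np.logspace(-1, 1.2, 15)    # Mpc/h
xi = tpcf(sample, rbins, period=halocat.Lbox)  # two-point correlation function
print(xi)
```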

  2. A Demonstration Test of the Modular Automated Weather System (MAWS)

    DTIC Science & Technology

    1980-03-24

    ...Through the Atmosphere, University of Toronto Press, Chaps. 6-7. ...Allard's law, E_T = I exp(-sigma V_r) / V_r^2, where E_T is the illuminance threshold, I is light intensity, V_r is visual range, and sigma is the extinction coefficient. S is light intensity per unit distance and is expressed as cd mi^-1. Douglas and Booker found that a value of 0.084 cd mi^-1 for S corresponded to a light threshold of 0.055. However, if the calculated RVR was less than 1200 m or it was night, Allard's Law was applied using a light intensity of 10,000 cd.
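
    A small worked example of solving Allard's law for visual range by bisection; the threshold and extinction values below are assumed for illustration, not taken from the report.

```python
# Solve E_T = I * exp(-sigma * V_r) / V_r**2 for V_r (meters) by bisection.
import math

def allard_range(I, E_T, sigma, lo=1.0, hi=20_000.0):
    """Find the range at which a light of intensity I drops to threshold E_T."""
    f = lambda r: I * math.exp(-sigma * r) / r**2 - E_T
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            lo = mid            # still brighter than threshold: look farther
        else:
            hi = mid
    return 0.5 * (lo + hi)

# 10,000 cd runway light, assumed night threshold 1e-5 lx, extinction 3e-3 / m
print(f"RVR ~ {allard_range(10_000, 1e-5, 3e-3):.0f} m")
```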

  3. Generative Representations for the Automated Design of Modular Physical Robots

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.; Lipson, Hod; Pollack, Jordan B.

    2003-01-01

    We will begin with a brief background of evolutionary robotics and related work, and demonstrate the scaling problem with our own prior results. Next we propose the use of an evolved generative representation as opposed to a non-generative representation. We describe this representation in detail as well as the evolutionary process that uses it. We then compare progress of evolved robots with and without the use of the grammar, and quantify the obtained advantage. Working two-dimensional and three-dimensional physical robots produced by the system are shown.

  4. IEEE/AIAA/NASA Digital Avionics Systems Conference, 9th, Virginia Beach, VA, Oct. 15-18, 1990, Proceedings

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The present conference on digital avionics discusses vehicle-management systems, spacecraft avionics, special vehicle avionics, communication/navigation/identification systems, software qualification and quality assurance, launch-vehicle avionics, Ada applications, sensor and signal processing, general aviation avionics, automated software development, design-for-testability techniques, and avionics-software engineering. Also discussed are optical technology and systems, modular avionics, fault-tolerant avionics, commercial avionics, space systems, data buses, crew-station technology, embedded processors and operating systems, AI and expert systems, data links, and pilot/vehicle interfaces.

  5. Continuous system modeling

    NASA Technical Reports Server (NTRS)

    Cellier, Francois E.

    1991-01-01

    A comprehensive and systematic introduction is presented for the concepts associated with 'modeling', involving the transition from a physical system down to an abstract description of that system in the form of a set of differential and/or difference equations, and basing its treatment of modeling on the mathematics of dynamical systems. Attention is given to the principles of passive electrical circuit modeling, planar mechanical systems modeling, hierarchical modular modeling of continuous systems, and bond-graph modeling. Also discussed are modeling in equilibrium thermodynamics, population dynamics, and system dynamics, inductive reasoning, artificial neural networks, and automated model synthesis.

  6. Data Integration and Mining for Synthetic Biology Design.

    PubMed

    Mısırlı, Göksel; Hallinan, Jennifer; Pocock, Matthew; Lord, Phillip; McLaughlin, James Alastair; Sauro, Herbert; Wipat, Anil

    2016-10-21

    One aim of synthetic biologists is to create novel and predictable biological systems from simpler modular parts. This approach is currently hampered by a lack of well-defined and characterized parts and devices. There is, however, a wealth of existing biological information, spread across the literature and numerous biological databases, which can be used to identify and characterize biological parts and their design constraints. Because this information is stored in many different formats, new computational approaches are required to make it available in an integrated form that is more amenable to data mining. A tried and tested approach to this problem is to map disparate data sources into a single data set, with common syntax and semantics, to produce a data warehouse or knowledge base. Ontologies have been used extensively in the life sciences, providing this common syntax and semantics as a model for a given biological domain, in a fashion that is amenable to computational analysis and reasoning. Here, we present an ontology for applications in synthetic biology design, SyBiOnt, which facilitates the modeling of information about biological parts and their relationships. SyBiOnt was used to create the SyBiOntKB knowledge base, incorporating and building upon existing life sciences ontologies and standards. The reasoning capabilities of ontologies were then applied to automate the mining of biological parts from this knowledge base. We propose that this approach will be useful to speed up synthetic biology design and ultimately help facilitate the automation of the biological engineering life cycle.
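
    As a toy illustration of representing parts and relationships as triples for later mining: the namespace, class names, and predicate below are hypothetical, not the actual SyBiOnt vocabulary.

```python
# Model a biological part and a regulatory relationship as RDF triples,
# then query the graph (invented terms, illustrative only).
from rdflib import Graph, Literal, Namespace, RDF, RDFS

SBO = Namespace("http://example.org/sybiont#")   # placeholder namespace
g = Graph()
g.bind("sbo", SBO)

g.add((SBO.pLac, RDF.type, SBO.Promoter))
g.add((SBO.pLac, RDFS.label, Literal("lac promoter")))
g.add((SBO.LacI, RDF.type, SBO.TranscriptionFactor))
g.add((SBO.LacI, SBO.represses, SBO.pLac))

# "Mine" the graph: find everything that represses something
for tf, _, target in g.triples((None, SBO.represses, None)):
    print(f"{tf} represses {target}")

print(g.serialize(format="turtle"))
```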

  7. An Automated End-to-End Multi-Agent QoS-Based Architecture for Selection of Geospatial Web Services

    NASA Astrophysics Data System (ADS)

    Shah, M.; Verma, Y.; Nandakumar, R.

    2012-07-01

    Over the past decade, Service-Oriented Architecture (SOA) and Web services have gained wide popularity and acceptance from researchers and industries all over the world. SOA makes it easy to build business applications with common services, and it provides benefits such as reduced integration expense, better asset reuse, higher business agility, and reduced business risk. Building a framework for acquiring useful geospatial information for potential users is a crucial problem faced by the GIS domain, and geospatial Web services address it. With the help of web service technology, geospatial web services can provide useful geospatial information to potential users in a better way than a traditional geographic information system (GIS). A geospatial Web service is a modular application designed to enable the discovery, access, and chaining of geospatial information and services across the web; such services are often both computation- and data-intensive, involving diverse sources of data and complex processing functions. With the proliferation of web services published over the internet, multiple web services may provide similar functionality, but with different non-functional properties. Thus, Quality of Service (QoS) offers a metric to differentiate the services and their service providers. In a quality-driven selection of web services, it is important to consider non-functional properties of the web service so as to satisfy the constraints or requirements of the end users. The main intent of this paper is to build an automated end-to-end multi-agent based solution to provide the best-fit web service to the service requester based on QoS.
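
    One simple way to rank candidate services on non-functional properties is a normalized weighted sum. The sketch below is my own toy scoring scheme, not the paper's multi-agent architecture; the services, metrics, and weights are invented.

```python
# Rank candidate web services by a weighted sum of normalized QoS metrics.
candidates = {
    "wms_a": {"latency_ms": 120, "availability": 0.999, "cost": 0.02},
    "wms_b": {"latency_ms": 60,  "availability": 0.990, "cost": 0.05},
    "wms_c": {"latency_ms": 300, "availability": 0.999, "cost": 0.01},
}
weights = {"latency_ms": 0.5, "availability": 0.3, "cost": 0.2}
lower_is_better = {"latency_ms", "cost"}

def normalized(metric, value):
    """Scale each metric to [0, 1], flipping metrics where lower is better."""
    vals = [c[metric] for c in candidates.values()]
    lo, hi = min(vals), max(vals)
    score = (value - lo) / (hi - lo) if hi > lo else 1.0
    return 1.0 - score if metric in lower_is_better else score

def qos_score(svc):
    return sum(w * normalized(m, candidates[svc][m]) for m, w in weights.items())

best = max(candidates, key=qos_score)
print(best, {s: round(qos_score(s), 3) for s in candidates})
```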

  8. CellSegm - a MATLAB toolbox for high-throughput 3D cell segmentation

    PubMed Central

    2013-01-01

    The application of fluorescence microscopy in cell biology often generates a huge amount of imaging data. Automated whole cell segmentation of such data enables the detection and analysis of individual cells, where a manual delineation is often time consuming, or practically not feasible. Furthermore, compared to manual analysis, automation normally has a higher degree of reproducibility. CellSegm, the software presented in this work, is a Matlab based command line software toolbox providing an automated whole cell segmentation of images showing surface stained cells, acquired by fluorescence microscopy. It has options for both fully automated and semi-automated cell segmentation. Major algorithmic steps are: (i) smoothing, (ii) Hessian-based ridge enhancement, (iii) marker-controlled watershed segmentation, and (iv) feature-based classification of cell candidates. Using a wide selection of image recordings and code snippets, we demonstrate that CellSegm has the ability to detect various types of surface stained cells in 3D. After detection and outlining of individual cells, the cell candidates can be subject to software based analysis, specified and programmed by the end-user, or they can be analyzed by other software tools. A segmentation of tissue samples with appropriate characteristics is also shown to be resolvable in CellSegm. The command-line interface of CellSegm facilitates scripting of the separate tools, all implemented in Matlab, offering a high degree of flexibility and tailored workflows for the end-user. The modularity and scripting capabilities of CellSegm enable automated workflows and quantitative analysis of microscopic data, suited for high-throughput image based screening. PMID:23938087

  9. CellSegm - a MATLAB toolbox for high-throughput 3D cell segmentation.

    PubMed

    Hodneland, Erlend; Kögel, Tanja; Frei, Dominik Michael; Gerdes, Hans-Hermann; Lundervold, Arvid

    2013-08-09

    The application of fluorescence microscopy in cell biology often generates a huge amount of imaging data. Automated whole cell segmentation of such data enables the detection and analysis of individual cells, where a manual delineation is often time consuming, or practically not feasible. Furthermore, compared to manual analysis, automation normally has a higher degree of reproducibility. CellSegm, the software presented in this work, is a Matlab based command line software toolbox providing an automated whole cell segmentation of images showing surface stained cells, acquired by fluorescence microscopy. It has options for both fully automated and semi-automated cell segmentation. Major algorithmic steps are: (i) smoothing, (ii) Hessian-based ridge enhancement, (iii) marker-controlled watershed segmentation, and (iv) feature-based classification of cell candidates. Using a wide selection of image recordings and code snippets, we demonstrate that CellSegm has the ability to detect various types of surface stained cells in 3D. After detection and outlining of individual cells, the cell candidates can be subject to software based analysis, specified and programmed by the end-user, or they can be analyzed by other software tools. A segmentation of tissue samples with appropriate characteristics is also shown to be resolvable in CellSegm. The command-line interface of CellSegm facilitates scripting of the separate tools, all implemented in Matlab, offering a high degree of flexibility and tailored workflows for the end-user. The modularity and scripting capabilities of CellSegm enable automated workflows and quantitative analysis of microscopic data, suited for high-throughput image based screening.
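
    The four algorithmic steps can be mirrored in Python with scikit-image; CellSegm itself is a Matlab toolbox, so this is an analogous sketch rather than its API, and the sample image and thresholds are placeholders.

```python
# Analogue of the CellSegm pipeline: smoothing, Hessian ridge enhancement,
# marker-controlled watershed, feature-based classification (illustrative).
import numpy as np
from skimage import data, filters, measure, segmentation
from skimage.feature import hessian_matrix, hessian_matrix_eigvals

img = data.coins().astype(float)        # stand-in for a surface-stained image

smooth = filters.gaussian(img, sigma=2)                          # (i)
ridges = hessian_matrix_eigvals(hessian_matrix(smooth, sigma=2))[0]  # (ii)

markers = measure.label(smooth > np.percentile(smooth, 90))      # bright cores
labels = segmentation.watershed(                                 # (iii)
    ridges, markers, mask=smooth > filters.threshold_otsu(smooth))

for region in measure.regionprops(labels):                       # (iv)
    if 100 < region.area < 5000:        # crude size-based feature gate
        print(f"candidate cell at {region.centroid}, area={region.area}")
```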

  10. A geothermal AMTEC system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schuller, M.J.; LeMire, R.A.; Horner-Richardson, K.

    1995-12-31

    The Phillips Laboratory Power and Thermal Management Division (PL/VTP), with the support of ORION International Technologies, is investigating new methods of advanced thermal-to-electric power conversion for space and terrestrial applications. The alkali metal thermal-to-electric converter (AMTEC), manufactured primarily by Advanced Modular Power Systems (AMPS) of Ann Arbor, MI, has reached a level of technological maturity which would allow its use with a constant, unattended thermal source, such as a geothermal field. Approximately 95,000 square miles in the western United States have hot dry rock with thermal gradients of 60 °C/km and higher; several places in the United States and the world have thermal gradients of 500 °C/km. Such heat sources represent an excellent thermal source for a system of modular power units using AMTEC devices to convert the heat to electricity. AMTEC cells using sodium as a working fluid require heat input at temperatures between 500 and 1,000 °C to generate power. The present state of the art is capable of 15% efficiency with 800 °C heat input and has demonstrated 18% efficiency for single cells. This paper discusses the basics of AMTEC operation, current drilling technology as a cost driver, design of modular AMTEC power units, heat rejection technologies, materials considerations, and estimates of power production from a geothermal AMTEC concept.

  11. Managing complexity in simulations of land surface and near-surface processes

    DOE PAGES

    Coon, Ethan T.; Moulton, J. David; Painter, Scott L.

    2016-01-12

    Increasing computing power and the growing role of simulation in Earth systems science have led to an increase in the number and complexity of processes in modern simulators. We present a multiphysics framework that specifies interfaces for coupled processes and automates weak and strong coupling strategies to manage this complexity. Process management is enabled by viewing the system of equations as a tree, where individual equations are associated with leaf nodes and coupling strategies with internal nodes. A dynamically generated dependency graph connects a variable to its dependencies, streamlining and automating model evaluation, easing model development, and ensuring models are modular and flexible. Additionally, the dependency graph is used to ensure that data requirements are consistent between all processes in a given simulation. Here we discuss the design and implementation of these concepts within the Arcos framework, and demonstrate their use for verification testing and hypothesis evaluation in numerical experiments.
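
    The dependency-graph idea can be sketched in a few lines: each variable declares its dependencies and an evaluator, and requesting a variable recursively evaluates only what it needs. This toy implementation and the variable names are my own, not the Arcos framework.

```python
# Toy dependency-graph evaluation: resolve a variable by recursively
# evaluating its dependencies, caching each result exactly once.
evaluators = {}   # name -> (dependencies, function of resolved dependencies)

def register(name, deps, fn):
    evaluators[name] = (deps, fn)

def evaluate(name, cache=None):
    cache = {} if cache is None else cache
    if name not in cache:
        deps, fn = evaluators[name]
        cache[name] = fn(*(evaluate(d, cache) for d in deps))
    return cache[name]

# Toy land-surface-like chain: temperature -> saturation -> conductivity
register("temperature", [], lambda: 275.0)                        # K
register("saturation", ["temperature"],
         lambda T: min(1.0, 0.4 + 0.002 * (T - 273.15)))
register("conductivity", ["saturation"], lambda s: 1e-5 * s**3)   # toy law

print(evaluate("conductivity"))
```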

  12. Generative Representations for Computer-Automated Evolutionary Design

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2006-01-01

    With the increasing computational power of computers, software design systems are progressing from being tools for architects and designers to express their ideas to tools capable of creating designs under human guidance. One of the main limitations for these computer-automated design systems is the representation with which they encode designs. If the representation cannot encode a certain design, then the design system cannot produce it. To be able to produce new types of designs, and not just optimize pre-defined parameterizations, evolutionary design systems must use generative representations. Generative representations are assembly procedures, or algorithms, for constructing a design thereby allowing for truly novel design solutions to be encoded. In addition, by enabling modularity, regularity and hierarchy, the level of sophistication that can be evolved is increased. We demonstrate the advantages of generative representations on two different design domains: the evolution of spacecraft antennas and the evolution of 3D objects.

  13. [Computers in biomedical research: I. Analysis of bioelectrical signals].

    PubMed

    Vivaldi, E A; Maldonado, P

    2001-08-01

    A personal computer equipped with an analog-to-digital conversion card is able to input, store and display signals of biomedical interest. These signals can additionally be submitted to ad-hoc software for analysis and diagnosis. Data acquisition is based on sampling a signal at a given rate and amplitude resolution. The automation of signal processing involves syntactic aspects (data transduction, conditioning and reduction) and semantic aspects (feature extraction to describe and characterize the signal, and diagnostic classification). The analytical approach that is at the basis of computer programming allows for the successful resolution of apparently complex tasks. Two basic principles involved are the definition of simple fundamental functions that are then iterated, and the modular subdivision of tasks. These two principles are illustrated, respectively, by presenting the algorithm that detects relevant elements for the analysis of a polysomnogram, and the task flow in systems that automate electrocardiographic reports.
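
    The two principles, simple iterated functions and modular subdivision of tasks, can be illustrated with a small threshold-crossing detector over a sampled signal; the signal, rates, and threshold below are invented.

```python
# Condition a sampled signal, then extract threshold-crossing events,
# the kind of element detection a polysomnogram analyzer iterates over.
import numpy as np

fs = 250                                    # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
signal = np.sin(2 * np.pi * 1.5 * t) + 0.3 * rng.standard_normal(t.size)

def condition(x, k=12):
    """Moving-average smoothing (data conditioning/reduction)."""
    return np.convolve(x, np.ones(k) / k, mode="same")

def detect_events(x, thresh=0.8):
    """Sample indices where x crosses thresh upward (feature extraction)."""
    above = x > thresh
    return np.flatnonzero(~above[:-1] & above[1:]) + 1

events = detect_events(condition(signal))
print(f"{events.size} events at t = {np.round(events / fs, 2)} s")
```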

  14. SBROME: a scalable optimization and module matching framework for automated biosystems design.

    PubMed

    Huynh, Linh; Tsoukalas, Athanasios; Köppe, Matthias; Tagkopoulos, Ilias

    2013-05-17

    The development of a scalable framework for biodesign automation is a formidable challenge given the expected increase in part availability and the ever-growing complexity of synthetic circuits. To allow for (a) the use of previously constructed and characterized circuits or modules and (b) the implementation of designs that can scale up to hundreds of nodes, we propose here a divide-and-conquer Synthetic Biology Reusable Optimization Methodology (SBROME). An abstract user-defined circuit is first transformed and matched against a module database that incorporates circuits that have previously been experimentally characterized. Then the resulting circuit is decomposed into subcircuits that are populated with the set of parts that best approximate the desired function. Finally, all subcircuits are characterized and deposited back into the module database for future reuse. We successfully applied SBROME toward two alternative designs of a modular 3-input multiplexer that utilize pre-existing logic gates and characterized biological parts.
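
    The divide-and-conquer matching step can be sketched as a simple recursion: reuse a characterized module when one matches the desired function, otherwise decompose and match the parts. SBROME's actual graph matching and optimization are far richer; the module database and decomposition table below are hypothetical.

```python
# Illustrative sketch of divide-and-conquer module matching; not SBROME
# code. The database entries and decomposition rules are hypothetical.

module_db = {"AND": "charlab_AND_v2", "NOT": "charlab_NOT_v1"}
decompose = {"NAND": ["AND", "NOT"]}

def match(function):
    """Return a list of characterized modules implementing `function`."""
    if function in module_db:          # reuse a characterized module
        return [module_db[function]]
    parts = decompose.get(function)
    if parts is None:
        raise KeyError(f"no module or decomposition for {function}")
    out = []
    for p in parts:                    # recurse on the subcircuits
        out.extend(match(p))
    return out

print(match("NAND"))  # ['charlab_AND_v2', 'charlab_NOT_v1']
```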

  15. An overview of suite for automated global electronic biosurveillance (SAGES)

    NASA Astrophysics Data System (ADS)

    Lewis, Sheri L.; Feighner, Brian H.; Loschen, Wayne A.; Wojcik, Richard A.; Skora, Joseph F.; Coberly, Jacqueline S.; Blazes, David L.

    2012-06-01

    Public health surveillance is undergoing a revolution driven by advances in the field of information technology. Many countries have experienced vast improvements in the collection, ingestion, analysis, visualization, and dissemination of public health data. Resource-limited countries have lagged behind due to challenges in information technology infrastructure, public health resources, and the costs of proprietary software. The Suite for Automated Global Electronic bioSurveillance (SAGES) is a collection of modular, flexible, freely-available software tools for electronic disease surveillance in resource-limited settings. One or more SAGES tools may be used in concert with existing surveillance applications or the SAGES tools may be used en masse for an end-to-end biosurveillance capability. This flexibility allows for the development of an inexpensive, customized, and sustainable disease surveillance system. The ability to rapidly assess anomalous disease activity may lead to more efficient use of limited resources and better compliance with World Health Organization International Health Regulations.

  16. Using Generative Representations to Evolve Robots. Chapter 1

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2004-01-01

    Recent research has demonstrated the ability of evolutionary algorithms to automatically design both the physical structure and software controller of real physical robots. One of the challenges for these automated design systems is to improve their ability to scale to the high complexities found in real-world problems. Here we claim that for automated design systems to scale in complexity they must use a representation which allows for the hierarchical creation and reuse of modules, which we call a generative representation. Not only is the ability to reuse modules necessary for functional scalability, but it is also valuable for improving efficiency in testing and construction. We then describe an evolutionary design system with a generative representation capable of hierarchical modularity and demonstrate it for the design of locomoting robots in simulation. Finally, results from our experiments show that evolution with our generative representation produces better robots than those evolved with a non-generative representation.

  17. Advances in X-Ray Simulator Technology

    DTIC Science & Technology

    1995-07-01

    d’Etudes de Gramat; I. Vitkovitsky, Logicon RDA. INTRODUCTION: DNA's future x-ray simulators are based upon inductive energy storage, a technology which...switch. SYRINX, a proposed design to be built by the Centre d’Etudes de Gramat (CEG) in France, would employ a modular approach, possibly with a...called SYRINX, would be built at the Centre d’Etudes de Gramat (CEG). It would employ a modular, long-conduction-time current source to drive a PRS

  18. Automated Source-Code-Based Testing of Object-Oriented Software

    NASA Astrophysics Data System (ADS)

    Gerlich, Ralf; Gerlich, Rainer; Dietrich, Carsten

    2014-08-01

    With the advent of languages such as C++ and Java in mission- and safety-critical space on-board software, new challenges for testing and specifically automated testing arise. In this paper we discuss some of these challenges, consequences and solutions based on an experiment in automated source-code-based testing for C++.

  19. Photonomics: automation approaches yield economic aikido for photonics device manufacture

    NASA Astrophysics Data System (ADS)

    Jordan, Scott

    2002-09-01

    In the glory days of photonics, with exponentiating demand for photonics devices came exponentiating competition, with new ventures commencing deliveries seemingly weekly. Suddenly the industry was faced with a commodity marketplace well before a commodity cost structure was in place. Economic issues like cost, scalability, and yield (call it all "Photonomics") now drive the industry. Automation and throughput-optimization are obvious answers, but until now, suitable modular tools had not been introduced. Available solutions were barely compatible with typical transverse alignment tolerances and could not automate angular alignments of collimated devices and arrays. And settling physics served as the insoluble bottleneck to throughput and resolution advancement in packaging, characterization and fabrication processes. The industry has addressed these needs in several ways, ranging from special configurations of catalog motion devices to integrated microrobots based on a novel mini-hexapod configuration. This intriguing approach allows tip/tilt alignments to be automated about any point in space, such as a beam waist, a focal point, the cleaved face of a fiber, or the optical axis of a waveguide, ideal for MEMS packaging automation and array alignment. Meanwhile, patented new low-cost settling-enhancement technology has been applied in applications ranging from air-bearing long-travel stages to subnanometer-resolution piezo positioners to advance resolution and process cycle-times in sensitive applications such as optical coupling characterization and fiber Bragg grating generation. Background, examples and metrics are discussed, providing an up-to-date industry overview of available solutions.

  20. Next Generation Sequence Assembly with AMOS

    PubMed Central

    Treangen, Todd J; Sommer, Dan D; Angly, Florent E; Koren, Sergey; Pop, Mihai

    2011-01-01

    A Modular Open-Source Assembler (AMOS) was designed to offer a modular approach to genome assembly. AMOS includes a wide range of tools for assembly, including lightweight de novo assemblers Minimus and Minimo, and Bambus 2, a robust scaffolder able to handle metagenomic and polymorphic data. This protocol describes how to configure and use AMOS for the assembly of Next Generation sequence data. Additionally, we provide three tutorial examples that include bacterial, viral, and metagenomic datasets with specific tips for improving assembly quality. PMID:21400694

  1. Reconfigurable Software for Mission Operations

    NASA Technical Reports Server (NTRS)

    Trimble, Jay

    2014-01-01

    We developed software that provides flexibility to mission organizations through modularity and composability. Modularity enables removal and addition of functionality through the installation of plug-ins. Composability enables users to assemble software from pre-built reusable objects, thus reducing or eliminating the walls associated with traditional application architectures and enabling unique combinations of functionality. We have used composable objects to reduce display build time, create workflows, and build scenarios to test concepts for lunar roving operations. The software is open source, and may be downloaded from https://github.com/nasa/mct.

  2. 40 CFR 461.2 - General definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... STANDARDS (CONTINUED) BATTERY MANUFACTURING POINT SOURCE CATEGORY General Provisions § 461.2 General... this part: (a) “Battery” means a modular electric power source where part or all of the fuel is... and a battery. (b) “Battery manufacturing operations” means all of the specific processes used to...

  3. 40 CFR 461.2 - General definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... STANDARDS (CONTINUED) BATTERY MANUFACTURING POINT SOURCE CATEGORY General Provisions § 461.2 General... this part: (a) “Battery” means a modular electric power source where part or all of the fuel is... and a battery. (b) “Battery manufacturing operations” means all of the specific processes used to...

  4. 40 CFR 461.2 - General definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... STANDARDS (CONTINUED) BATTERY MANUFACTURING POINT SOURCE CATEGORY General Provisions § 461.2 General... this part: (a) “Battery” means a modular electric power source where part or all of the fuel is... and a battery. (b) “Battery manufacturing operations” means all of the specific processes used to...

  5. @Note: a workbench for biomedical text mining.

    PubMed

    Lourenço, Anália; Carreira, Rafael; Carneiro, Sónia; Maia, Paulo; Glez-Peña, Daniel; Fdez-Riverola, Florentino; Ferreira, Eugénio C; Rocha, Isabel; Rocha, Miguel

    2009-08-01

    Biomedical Text Mining (BioTM) is providing valuable approaches to the automated curation of scientific literature. However, most efforts have addressed the benchmarking of new algorithms rather than user operational needs. Bridging the gap between BioTM researchers and biologists' needs is crucial to solve real-world problems and promote further research. We present @Note, a platform for BioTM that aims at the effective translation of the advances between three distinct classes of users: biologists, text miners and software developers. Its main functional contributions are the ability to process abstracts and full-texts; an information retrieval module enabling PubMed search and journal crawling; a pre-processing module with PDF-to-text conversion, tokenisation and stopword removal; a semantic annotation schema; a lexicon-based annotator; a user-friendly annotation view that allows users to correct annotations; and a Text Mining Module supporting dataset preparation and algorithm evaluation. @Note improves interoperability, modularity and flexibility when integrating in-home and open-source third-party components. Its component-based architecture allows the rapid development of new applications, emphasizing the principles of transparency and simplicity of use. Although development is still ongoing, it has already allowed the development of applications that are currently being used.

  6. Flexible data registration and automation in semiconductor production

    NASA Astrophysics Data System (ADS)

    Dudde, Ralf; Staudt-Fischbach, Peter; Kraemer, Benedict

    1997-08-01

    The need for cost reduction and flexibility in semiconductor production will result in a wider application of computer based automation systems. With the setup of a new and advanced CMOS semiconductor line in the Fraunhofer Institute for Silicon Technology [ISIT, Itzehoe (D)] a new line information system (LIS) was introduced based on an advanced model for the underlying data structure. This data model was implemented into an ORACLE-RDBMS. A cellworks-based system (JOSIS) was used for the integration of the production equipment, communication and automated database bookings and information retrievals. During the ramp-up of the production line this new system is used for fab control. The data model and the cellworks-based system integration are explained. This system enables an on-line overview of the work in progress in the fab, lot order history and equipment status and history. Based on these figures, improved production and cost monitoring and optimization are possible. First examples of the information gained by this system are presented. The modular set-up of the LIS system will allow easy data exchange with additional software tools like schedulers, different fab control systems like PROMIS, and accounting systems like SAP. Modifications necessary for the integration of PROMIS are described.

  7. Automated home cage observations as a tool to measure the effects of wheel running on cage floor locomotion.

    PubMed

    de Visser, Leonie; van den Bos, Ruud; Spruijt, Berry M

    2005-05-28

    This paper introduces automated observations in a modular home cage system as a tool to measure the effects of wheel running on the time distribution and daily organization of cage floor locomotor activity in female C57BL/6 mice. Mice (n = 16) were placed in the home cage system for 6 consecutive days. Fifty percent of the subjects had free access to a running wheel that was integrated in the home cage. Overall activity levels in terms of duration of movement were increased by wheel running, while time spent inside a sheltering box was decreased. Wheel running affected the hourly pattern of movement during the animals' active period of the day. Mice without a running wheel, in contrast to mice with a running wheel, showed a clear differentiation between novelty-induced and baseline levels of locomotion as reflected by a decrease after the first day of introduction to the home cage. The results are discussed in the light of the use of running wheels as a tool to measure general activity and as an object for environmental enrichment. Furthermore, the possibilities of using automated home cage observations for e.g. behavioural phenotyping are discussed.

  8. 40 CFR 461.2 - General definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... STANDARDS BATTERY MANUFACTURING POINT SOURCE CATEGORY General Provisions § 461.2 General definitions. In...) “Battery” means a modular electric power source where part or all of the fuel is contained within the unit... heat cycle engine. In this regulation there is no differentiation between a single cell and a battery...

  9. 40 CFR 461.2 - General definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... STANDARDS BATTERY MANUFACTURING POINT SOURCE CATEGORY General Provisions § 461.2 General definitions. In...) “Battery” means a modular electric power source where part or all of the fuel is contained within the unit... heat cycle engine. In this regulation there is no differentiation between a single cell and a battery...

  10. The rotary zone thermal cycler: A low-power system enabling automated rapid PCR

    DOE PAGES

    Bartsch, Michael S.; Edwards, Harrison S.; Gas Transmission Systems, Walnut Creek, CA; ...

    2015-03-31

    In this study, advances in molecular biology, microfluidics, and laboratory automation continue to expand the accessibility and applicability of these methods beyond the confines of conventional, centralized laboratory facilities and into point of use roles in clinical, military, forensic, portable, and field-deployed applications. As a result, there is a growing need to adapt the unit operations of molecular biology such as aliquoting, centrifuging, mixing, and thermal cycling to compact, portable, low-power, and automation-ready formats. Here we present one such adaptation, the rotary zone thermal cycler (RZTC), a novel wheel-based device capable of cycling up to four different fixed-temperature blocks into contact with a stationary 4-microliter capillary-bound sample to realize 1-3 second transitions with steady state heater power of less than 10 W. We further demonstrate the utility of the RZTC for DNA amplification as part of a highly integrated rotary zone PCR (rzPCR) system using low-volume valves and syringe-based fluid handling to automate sample loading and unloading, thermal cycling, and between run cleaning functionalities in a compact, modular form factor. In addition to characterizing the performance of the RZTC and the efficacy of different online cleaning protocols, preliminary results are presented for rapid single-plex PCR, multiplex short tandem repeat (STR) amplification, and second strand cDNA synthesis.

  11. The rotary zone thermal cycler: A low-power system enabling automated rapid PCR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartsch, Michael S.; Edwards, Harrison S.; Gas Transmission Systems, Walnut Creek, CA

    In this study, advances in molecular biology, microfluidics, and laboratory automation continue to expand the accessibility and applicability of these methods beyond the confines of conventional, centralized laboratory facilities and into point of use roles in clinical, military, forensic, portable, and field-deployed applications. As a result, there is a growing need to adapt the unit operations of molecular biology such as aliquoting, centrifuging, mixing, and thermal cycling to compact, portable, low-power, and automation-ready formats. Here we present one such adaptation, the rotary zone thermal cycler (RZTC), a novel wheel-based device capable of cycling up to four different fixed-temperature blocks into contact with a stationary 4-microliter capillary-bound sample to realize 1-3 second transitions with steady state heater power of less than 10 W. We further demonstrate the utility of the RZTC for DNA amplification as part of a highly integrated rotary zone PCR (rzPCR) system using low-volume valves and syringe-based fluid handling to automate sample loading and unloading, thermal cycling, and between run cleaning functionalities in a compact, modular form factor. In addition to characterizing the performance of the RZTC and the efficacy of different online cleaning protocols, preliminary results are presented for rapid single-plex PCR, multiplex short tandem repeat (STR) amplification, and second strand cDNA synthesis.

  12. The MyoRobot: A novel automated biomechatronics system to assess voltage/Ca2+ biosensors and active/passive biomechanics in muscle and biomaterials.

    PubMed

    Haug, M; Reischl, B; Prölß, G; Pollmann, C; Buckert, T; Keidel, C; Schürmann, S; Hock, M; Rupitsch, S; Heckel, M; Pöschel, T; Scheibel, T; Haynl, C; Kiriaev, L; Head, S I; Friedrich, O

    2018-04-15

    We engineered an automated biomechatronics system, MyoRobot, for robust, objective and versatile assessment of muscle or polymer materials (bio-)mechanics. It covers multiple levels of muscle biosensor assessment, e.g. membrane voltage or contractile apparatus Ca2+ ion responses (force resolution 1 µN, 0-10 mN for the given sensor; [Ca2+] range ~100 nM-25 µM). It replaces previously tedious manual protocols to obtain exhaustive information on active/passive biomechanical properties across various morphological tissue levels. Deciphering mechanisms of muscle weakness requires sophisticated force protocols, dissecting contributions from altered Ca2+ homeostasis, electro-chemical, chemico-mechanical biosensors or visco-elastic components. From whole organ to single fibre levels, experimental demands and hardware requirements increase, limiting biomechanics research potential, as reflected by the few commercial biomechatronics systems that can address resolution, experimental versatility and, most importantly, automation of force recordings. Our MyoRobot combines optical force transducer technology with high precision 3D actuation (e.g. voice coil, 1 µm encoder resolution; stepper motors, 4 µm feed motion), and customized control software, enabling modular experimentation packages and automated data pre-analysis. In small bundles and single muscle fibres, we demonstrate automated recordings of (i) caffeine-induced force, (ii) electrical field stimulation (EFS)-induced force, (iii) pCa-force, (iv) slack tests and (v) passive length-tension curves. The system easily reproduces results from manual systems (two-fold larger stiffness in slow compared to fast muscle) and provides novel insights into unloaded shortening velocities (declining with increasing slack lengths). The MyoRobot enables automated complex biomechanics assessment in muscle research. Applications also extend to material sciences, exemplarily shown here for spider silk and collagen biopolymers. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Modular radiochemistry synthesis system

    DOEpatents

    Satyamurthy, Nagichettiar; Barrio, Jorge R.; Amarasekera, Bernard; Van Dam, Michael R.; Olma, Sebastian; Williams, Dirk; Eddings, Mark; Shen, Clifton Kwang-Fu

    2016-11-01

    A modular chemical production system includes multiple modules for performing a chemical reaction, particularly of radiochemical compounds, from a remote location. One embodiment comprises a reaction vessel including a moveable heat source with the position thereof relative to the reaction vessel being controllable from a remote position. Alternatively the heat source may be fixed in location and the reaction vial is moveable into and out of the heat source. The reaction vessel has one or more sealing plugs, the positioning of which in relationship to the reaction vessel is controllable from a remote position. Also the one or more reaction vessel sealing plugs can include one or more conduits there through for delivery of reactants, gases at atmospheric or an elevated pressure, inert gases, drawing a vacuum and removal of reaction end products to and from the reaction vial, the reaction vial with sealing plug in position being operable at elevated pressures. The modular chemical production system is assembled from modules which can each include operating condition sensors and controllers configured for monitoring and controlling the individual modules and the assembled system from a remote position. Other modules include, but are not limited to a Reagent Storage and Delivery Module, a Cartridge Purification Module, a Microwave Reaction Module, an External QC/Analysis/Purification Interface Module, an Aliquotting Module, an F-18 Drying Module, a Concentration Module, a Radiation Counting Module, and a Capillary Reactor Module.

  14. Path to Market for Compact Modular Fusion Power Cores

    NASA Astrophysics Data System (ADS)

    Woodruff, Simon; Baerny, Jennifer K.; Mattor, Nathan; Stoulil, Don; Miller, Ronald; Marston, Theodore

    2012-08-01

    The benefits of an energy source whose reactants are plentiful and whose products are benign are hard to measure, but at no time in history has this energy source been more needed. Nuclear fusion continues to promise to be this energy source. However, the path to market for fusion systems is still regularly a matter for long-term (20+ year) plans. This white paper is intended to stimulate discussion of faster commercialization paths, distilling guidance from investors, utilities, and the wider energy research community (including from ARPA-E). There is great interest in a small modular fusion system that can be developed quickly and inexpensively. A simple model shows how compact modular fusion can produce a low-cost development path by optimizing traditional systems that burn deuterium and tritium: operating at high magnetic field strength, and omitting some components so that the core becomes more compact and easier to maintain. The dominant hurdles to the development of low-cost, practical fusion systems are discussed, primarily in terms of the constraints placed on the cost of development stages in the private sector. The main finding presented here is that the bridge from the DOE Office of Science to the energy market can come at the Proof of Principle development stage, provided the concept is sufficiently compact and inexpensive that its development allows for a normal technology commercialization path.

  15. Modular radiochemistry synthesis system

    DOEpatents

    Satyamurthy, Nagichettiar; Barrio, Jorge R.; Amarasekera, Bernard; Van Dam, R. Michael; Olma, Sebastian; Williams, Dirk; Eddings, Mark; Shen, Clifton Kwang-Fu

    2015-12-15

    A modular chemical production system includes multiple modules for performing a chemical reaction, particularly of radiochemical compounds, from a remote location. One embodiment comprises a reaction vessel including a moveable heat source with the position thereof relative to the reaction vessel being controllable from a remote position. Alternatively the heat source may be fixed in location and the reaction vial is moveable into and out of the heat source. The reaction vessel has one or more sealing plugs, the positioning of which in relationship to the reaction vessel is controllable from a remote position. Also the one or more reaction vessel sealing plugs can include one or more conduits there through for delivery of reactants, gases at atmospheric or an elevated pressure, inert gases, drawing a vacuum and removal of reaction end products to and from the reaction vial, the reaction vial with sealing plug in position being operable at elevated pressures. The modular chemical production system is assembled from modules which can each include operating condition sensors and controllers configured for monitoring and controlling the individual modules and the assembled system from a remote position. Other modules include, but are not limited to a Reagent Storage and Delivery Module, a Cartridge Purification Module, a Microwave Reaction Module, an External QC/Analysis/Purification Interface Module, an Aliquotting Module, an F-18 Drying Module, a Concentration Module, a Radiation Counting Module, and a Capillary Reactor Module.

  16. Modular radiochemistry synthesis system

    DOEpatents

    Satyamurthy, Nagichettiar; Barrio, Jorge R; Amarasekera, Bernard; Van Dam, R. Michael; Olma, Sebastian; Williams, Dirk; Eddings, Mark A; Shen, Clifton Kwang-Fu

    2015-02-10

    A modular chemical production system includes multiple modules for performing a chemical reaction, particularly of radiochemical compounds, from a remote location. One embodiment comprises a reaction vessel including a moveable heat source with the position thereof relative to the reaction vessel being controllable from a remote position. Alternatively the heat source may be fixed in location and the reaction vial is moveable into and out of the heat source. The reaction vessel has one or more sealing plugs, the positioning of which in relationship to the reaction vessel is controllable from a remote position. Also the one or more reaction vessel sealing plugs can include one or more conduits there through for delivery of reactants, gases at atmospheric or an elevated pressure, inert gases, drawing a vacuum and removal of reaction end products to and from the reaction vial, the reaction vial with sealing plug in position being operable at elevated pressures. The modular chemical production system is assembled from modules which can each include operating condition sensors and controllers configured for monitoring and controlling the individual modules and the assembled system from a remote position. Other modules include, but are not limited to a Reagent Storage and Delivery Module, a Cartridge Purification Module, a Microwave Reaction Module, an External QC/Analysis/Purification Interface Module, an Aliquotting Module, an F-18 Drying Module, a Concentration Module, a Radiation Counting Module, and a Capillary Reactor Module.

  17. Development and comparison in uncertainty assessment based Bayesian modularization method in hydrological modeling

    NASA Astrophysics Data System (ADS)

    Li, Lu; Xu, Chong-Yu; Engeland, Kolbjørn

    2013-04-01

    With respect to model calibration, parameter estimation and analysis of uncertainty sources, various regression and probabilistic approaches are used in hydrological modeling. A family of Bayesian methods, which incorporates different sources of information into a single analysis through Bayes' theorem, is widely used for uncertainty assessment. However, none of these approaches can adequately treat the impact of high flows in hydrological modeling. This study proposes a Bayesian modularization uncertainty assessment approach in which the highest streamflow observations are treated as suspect information that should not influence the inference of the main bulk of the model parameters. This study includes a comprehensive comparison and evaluation of uncertainty assessments by our new Bayesian modularization method and standard Bayesian methods using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions were used in combination with the standard Bayesian method: the AR(1) plus Normal model independent of time (Model 1), the AR(1) plus Normal model dependent on time (Model 2) and the AR(1) plus Multi-normal model (Model 3). The results reveal that the Bayesian modularization method provides the most accurate streamflow estimates as measured by the Nash-Sutcliffe efficiency, and the best uncertainty estimates for low, medium and entire flows compared to the standard Bayesian methods. The study thus provides a new approach for reducing the impact of high flows on the discharge uncertainty assessment of hydrological models via Bayesian methods.
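
    For orientation, the MH algorithm referenced above can be sketched generically as a random-walk sampler; the Gaussian target and proposal scale below are illustrative stand-ins, not the WASMOD likelihood models compared in the study.

```python
import numpy as np

# Generic random-walk Metropolis-Hastings sketch. The standard-normal
# target and the 0.5 proposal scale are illustrative assumptions only.

def log_post(theta):
    return -0.5 * theta ** 2  # stand-in log-posterior: N(0, 1)

rng = np.random.default_rng(0)
theta, chain = 0.0, []
for _ in range(5000):
    prop = theta + rng.normal(scale=0.5)  # symmetric random-walk proposal
    # Accept with probability min(1, posterior ratio); else keep current.
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)

print(np.mean(chain), np.std(chain))  # should approach 0 and 1
```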

  18. A cost and utility analysis of NIM/CAMAC standards and equipment for shuttle payload data acquisition and control systems. Volume 2: Tasks 1 and 2

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A representative set of payloads for both science and applications disciplines was selected to ensure a realistic and statistically significant estimate of equipment utilization. The selected payloads were analyzed to determine the applicability of Nuclear Instrumentation Modular (NIM)/Computer Automated Measurement and Control (CAMAC) equipment in satisfying their data acquisition and control requirements. The results of these analyses were combined with comparable results from related studies to arrive at an overall assessment of the applicability and commonality of NIM/CAMAC equipment usage across the spectrum of payloads.

  19. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

    AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and the reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.

  20. Automated feature extraction and classification from image sources

    USGS Publications Warehouse

    ,

    1995-01-01

    The U.S. Department of the Interior, U.S. Geological Survey (USGS), and Unisys Corporation have completed a cooperative research and development agreement (CRADA) to explore automated feature extraction and classification from image sources. The CRADA helped the USGS define the spectral and spatial resolution characteristics of airborne and satellite imaging sensors necessary to meet base cartographic and land use and land cover feature classification requirements, and helped develop future automated geographic and cartographic data production capabilities. The USGS is seeking a new commercial partner to continue automated feature extraction and classification research and development.

  1. A review on automated sorting of source-separated municipal solid waste for recycling.

    PubMed

    Gundupalli, Sathish Paulraj; Hait, Subrata; Thakur, Atul

    2017-02-01

    Sorting of useful materials from source-separated municipal solid waste (MSW) is a crucial prerequisite for recycling, which forms an integral part of MSW management. Researchers have been exploring automated sorting techniques to improve the overall efficiency of the recycling process. This paper reviews recent advances in physical processes, sensors, and actuators used, as well as control and autonomy related issues, in the area of automated sorting and recycling of source-separated MSW. We believe that this paper will provide a comprehensive overview of the state of the art and will help future system designers in the area. In this paper, we also present research challenges in the field of automated waste sorting and recycling. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Personal Electronic Devices and the ISR Data Explosion: The Impact of Cyber Cameras on the Intelligence Community

    DTIC Science & Technology

    2015-06-01

    Texas Tech Security Group, "Automated Open Source Intelligence (OSINT) Using APIs," RaiderSec, Sunday 30 December 2012, http://raidersec.blogspot.com/2012/12/automated-open-source

  3. SCALE: A modular code system for performing standardized computer analyses for licensing evaluation. Functional modules F1--F8 -- Volume 2, Part 1, Revision 4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greene, N.M.; Petrie, L.M.; Westfall, R.M.

    SCALE--a modular code system for Standardized Computer Analyses Licensing Evaluation--has been developed by Oak Ridge National Laboratory at the request of the US Nuclear Regulatory Commission. The SCALE system utilizes well-established computer codes and methods within standard analysis sequences that (1) allow an input format designed for the occasional user and/or novice, (2) automate the data processing and coupling between modules, and (3) provide accurate and reliable results. System development has been directed at problem-dependent cross-section processing and analysis of criticality safety, shielding, heat transfer, and depletion/decay problems. Since the initial release of SCALE in 1980, the code system has been heavily used for evaluation of nuclear fuel facility and package designs. This revision documents Version 4.2 of the system. The manual is divided into three volumes: Volume 1--for the control module documentation; Volume 2--for functional module documentation; and Volume 3--for documentation of the data libraries and subroutine libraries.

  4. Modular integration of electronics and microfluidic systems using flexible printed circuit boards.

    PubMed

    Wu, Amy; Wang, Lisen; Jensen, Erik; Mathies, Richard; Boser, Bernhard

    2010-02-21

    Microfluidic systems offer an attractive alternative to conventional wet chemical methods with benefits including reduced sample and reagent volumes, shorter reaction times, high-throughput, automation, and low cost. However, most present microfluidic systems rely on external means to analyze reaction products. This substantially adds to the size, complexity, and cost of the overall system. Electronic detection based on sub-millimetre size integrated circuits (ICs) has been demonstrated for a wide range of targets including nucleic and amino acids, but deployment of this technology to date has been limited due to the lack of a flexible process to integrate these chips within microfluidic devices. This paper presents a modular and inexpensive process to integrate ICs with microfluidic systems based on standard printed circuit board (PCB) technology to assemble the independently designed microfluidic and electronic components. The integrated system can accommodate multiple chips of different sizes bonded to glass or PDMS microfluidic systems. Since IC chips and flex PCB manufacturing and assembly are industry standards with low cost, the integrated system is economical for both laboratory and point-of-care settings.

  5. Open source modular ptosis crutch for the treatment of myasthenia gravis.

    PubMed

    Saidi, Trust; Sivarasu, Sudesh; Douglas, Tania S

    2018-02-01

    Pharmacologic treatment of Myasthenia Gravis presents challenges due to poor tolerability in some patients. Conventional ptosis crutches have limitations such as interference with blinking which causes ocular surface drying, and frequent irritation of the eyes. To address this problem, a modular and adjustable ptosis crutch for elevating the upper eyelid in Myasthenia Gravis patients has been proposed as a non-surgical and low-cost solution. Areas covered: This paper reviews the literature on the challenges in the treatment of Myasthenia Gravis globally and focuses on a modular and adjustable ptosis crutch that has been developed by the Medical Device Laboratory at the University of Cape Town. Expert commentary: The new medical device has potential as a simple, effective and unobtrusive solution to elevate the drooping upper eyelid(s) above the visual axis without the need for medication and surgery. Access to the technology is provided through an open source platform which makes it available globally. Open access provides opportunities for further open innovation to address the current limitations of the device, ultimately for the benefit not only of people suffering from Myasthenia Gravis but also of those with ptosis from other aetiologies.

  6. CIS-lunar space infrastructure lunar technologies: Executive summary

    NASA Technical Reports Server (NTRS)

    Faller, W.; Hoehn, A.; Johnson, S.; Moos, P.; Wiltberger, N.

    1989-01-01

    Technologies necessary for the creation of a cis-Lunar infrastructure, namely: (1) automation and robotics; (2) life support systems; (3) fluid management; (4) propulsion; and (5) rotating technologies, are explored. The technological focal point is on the development of automated and robotic systems for the implementation of a Lunar Oasis produced by Automation and Robotics (LOAR). Under direction from the NASA Office of Exploration, automation and robotics were extensively utilized as an initiating stage in the return to the Moon. A pair of autonomous rovers, modular in design and built from interchangeable and specialized components, is proposed. Utilizing a buddy system, these rovers will be able to support each other and to enhance their individual capabilities. One rover primarily explores and maps while the second rover tests the feasibility of various materials-processing techniques. The automated missions emphasize availability and potential uses of Lunar resources, and the deployment and operations of the LOAR program. An experimental bio-volume is put into place as the precursor to a Lunar environmentally controlled life support system. The bio-volume will determine the reproduction, growth and production characteristics of various life forms housed on the Lunar surface. Physicochemical regenerative technologies and stored resources will be used to buffer biological disturbances of the bio-volume environment. The in situ Lunar resources will be both tested and used within this bio-volume. Second phase development on the Lunar surface calls for manned operations. Repairs and re-configuration of the initial framework will ensue. An autonomously-initiated manned Lunar oasis can become an essential component of the United States space program.

  7. TrackMate: An open and extensible platform for single-particle tracking.

    PubMed

    Tinevez, Jean-Yves; Perry, Nick; Schindelin, Johannes; Hoopes, Genevieve M; Reynolds, Gregory D; Laplantine, Emmanuel; Bednarek, Sebastian Y; Shorte, Spencer L; Eliceiri, Kevin W

    2017-02-15

    We present TrackMate, an open source Fiji plugin for the automated, semi-automated, and manual tracking of single particles. It offers a versatile and modular solution that works out of the box for end users, through a simple and intuitive user interface. It is also easily scriptable and adaptable, operating equally well on 1D over time, 2D over time, 3D over time, or other single and multi-channel image variants. TrackMate provides several visualization and analysis tools that aid in assessing the relevance of results. The utility of TrackMate is further enhanced through its ability to be readily customized to meet specific tracking problems. TrackMate is an extensible platform where developers can easily write their own detection, particle linking, visualization or analysis algorithms within the TrackMate environment. This evolving framework provides researchers with the opportunity to quickly develop and optimize new algorithms based on existing TrackMate modules without the need to write de novo user interfaces, including visualization, analysis and exporting tools. The current capabilities of TrackMate are presented in the context of three different biological problems. First, we perform Caenorhabditis elegans lineage analysis to assess how light-induced damage during imaging impairs its early development. Our TrackMate-based lineage analysis indicates the lack of a cell-specific light-sensitive mechanism. Second, we investigate the recruitment of NEMO (NF-κB essential modulator) clusters in fibroblasts after stimulation by the cytokine IL-1 and show that photodamage can generate artifacts in the shape of TrackMate-characterized movements that confuse motility analysis. Finally, we validate the use of TrackMate for quantitative lifetime analysis of clathrin-mediated endocytosis in plant cells. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.

  8. Bicentric evaluation of six anti-toxoplasma immunoglobulin G (IgG) automated immunoassays and comparison to the Toxo II IgG Western blot.

    PubMed

    Maudry, Arnaud; Chene, Gautier; Chatelain, Rémi; Patural, Hugues; Bellete, Bahrie; Tisseur, Bernard; Hafid, Jamal; Raberin, Hélène; Beretta, Sophie; Sung, Roger Tran Manh; Belot, Georges; Flori, Pierre

    2009-09-01

    A comparative study of the Toxoplasma IgG(I) and IgG(II) Access (Access I and II, respectively; Beckman Coulter Inc.), AxSYM Toxo IgG (AxSYM; Abbott Diagnostics), Vidas Toxo IgG (Vidas; bioMerieux, Marcy l'Etoile, France), Immulite Toxo IgG (Immulite; Siemens Healthcare Diagnostics Inc.), and Modular Toxo IgG (Modular; Roche Diagnostics, Basel, Switzerland) tests was done with 406 consecutive serum samples. The Toxo II IgG Western blot (LDBio, Lyon, France) was used as a reference technique in the case of intertechnique discordance. Of the 406 serum samples tested, the results for 35 were discordant by the different techniques. Using the 175 serum samples with positive results, we evaluated the standardization of the titrations obtained (in IU/ml); the medians (second quartiles) obtained were 9.1 IU/ml for the AxSYM test, 21 IU/ml for the Access I test, 25.7 IU/ml for the Access II test, 32 IU/ml for the Vidas test, 34.6 IU/ml for the Immulite test, and 248 IU/ml for the Modular test. For all the immunoassays tested, the following relative sensitivity and specificity values were found: 89.7 to 100% for the Access II test, 89.7 to 99.6% for the Immulite test, 90.2 to 99.6% for the AxSYM test, 91.4 to 99.6% for the Vidas test, 94.8 to 99.6% for the Access I test, and 98.3 to 98.7% for the Modular test. Among the 406 serum samples, we did not find any false-positive values by two different tests for the same serum sample. Except for the Modular test, which prioritized sensitivity, it appears that the positive cutoff values suggested by the pharmaceutical companies are very high (either for economical or for safety reasons). This led to imperfect sensitivity, a large number of unnecessary serological follow-ups of pregnant women, and difficulty in determining the serological status of immunosuppressed individuals.

  9. Network Disruption in the Preclinical Stages of Alzheimer's Disease: From Subjective Cognitive Decline to Mild Cognitive Impairment.

    PubMed

    López-Sanz, David; Garcés, Pilar; Álvarez, Blanca; Delgado-Losada, María Luisa; López-Higes, Ramón; Maestú, Fernando

    2017-12-01

    Subjective Cognitive Decline (SCD) is a largely unknown state thought to represent a preclinical stage of Alzheimer's Disease (AD) previous to mild cognitive impairment (MCI). However, the course of network disruption in these stages is scarcely characterized. We employed resting state magnetoencephalography in the source space to calculate network smallworldness, clustering, modularity and transitivity. Nodal measures (clustering and node degree) as well as modular partitions were compared between groups. The MCI group exhibited decreased smallworldness, clustering and transitivity and increased modularity in theta and beta bands. SCD showed similar but smaller changes in clustering and transitivity, while exhibiting alterations in the alpha band in opposite direction to those showed by MCI for modularity and transitivity. At the node level, MCI disrupted both clustering and nodal degree while SCD showed minor changes in the latter. Additionally, we observed an increase in modular partition variability in both SCD and MCI in theta and beta bands. SCD elders exhibit a significant network disruption, showing intermediate values between HC and MCI groups in multiple parameters. These results highlight the relevance of cognitive concerns in the clinical setting and suggest that network disorganization in AD could start in the preclinical stages before the onset of cognitive symptoms.

  10. Next generation sequence assembly with AMOS.

    PubMed

    Treangen, Todd J; Sommer, Dan D; Angly, Florent E; Koren, Sergey; Pop, Mihai

    2011-03-01

    A Modular Open-Source Assembler (AMOS) was designed to offer a modular approach to genome assembly. AMOS includes a wide range of tools for assembly, including the lightweight de novo assemblers Minimus and Minimo, and Bambus 2, a robust scaffolder able to handle metagenomic and polymorphic data. This protocol describes how to configure and use AMOS for the assembly of Next Generation sequence data. Additionally, we provide three tutorial examples that include bacterial, viral, and metagenomic datasets with specific tips for improving assembly quality. © 2011 by John Wiley & Sons, Inc.

  11. Forward Modeling of Large-scale Structure: An Open-source Approach with Halotools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hearin, Andrew P.; Campbell, Duncan; Tollerud, Erik

    We present the first stable release of Halotools (v0.2), a community-driven Python package designed to build and test models of the galaxy-halo connection. Halotools provides a modular platform for creating mock universes of galaxies starting from a catalog of dark matter halos obtained from a cosmological simulation. The package supports many of the common forms used to describe galaxy-halo models: the halo occupation distribution, the conditional luminosity function, abundance matching, and alternatives to these models that include effects such as environmental quenching or variable galaxy assembly bias. Satellite galaxies can be modeled to live in subhalos or to follow custom number density profiles within their halos, including spatial and/or velocity bias with respect to the dark matter profile. The package has an optimized toolkit to make mock observations on a synthetic galaxy population—including galaxy clustering, galaxy–galaxy lensing, galaxy group identification, RSD multipoles, void statistics, pairwise velocities and others—allowing direct comparison to observations. Halotools is object-oriented, enabling complex models to be built from a set of simple, interchangeable components, including those of your own creation. Halotools has an automated testing suite and is exhaustively documented on http://halotools.readthedocs.io, which includes quickstart guides, source code notes and a large collection of tutorials. The documentation is effectively an online textbook on how to build and study empirical models of galaxy formation with Python.
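
    A minimal usage sketch following the package's documented quickstart pattern is given below; the prebuilt model name and cached simulation are assumptions that depend on which halo catalogs are available locally.

```python
# Minimal Halotools sketch following its documented quickstart pattern.
# The model name ('zheng07') and simulation ('bolshoi') are assumptions
# that depend on which catalogs are cached on the local machine.
from halotools.empirical_models import PrebuiltHodModelFactory
from halotools.sim_manager import CachedHaloCatalog

model = PrebuiltHodModelFactory('zheng07')                  # HOD-style model
halocat = CachedHaloCatalog(simname='bolshoi', redshift=0)  # cached halos
model.populate_mock(halocat)                                # build the mock
print(model.mock.galaxy_table[:5])                          # synthetic galaxies
```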

  12. Modular Object-Oriented Dynamic Learning Environment: What Open Source Has to Offer

    ERIC Educational Resources Information Center

    Antonenko, Pavlo; Toy, Serkan; Niederhauser, Dale

    2004-01-01

    Open source online learning environments have emerged and developed over the past 10 years. In this paper we will analyze the underlying philosophy and features of MOODLE based on the theoretical framework developed by Hannafin and Land (2000). Psychological, pedagogical, technological, cultural, and pragmatic foundations comprise the framework…

  13. Generative Representations for Automated Design of Robots

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.; Lipson, Hod; Pollack, Jordan B.

    2007-01-01

    A method of automated design of complex, modular robots involves an evolutionary process in which generative representations of designs are used. The term generative representations as used here signifies, loosely, representations that consist of or include algorithms, computer programs, and the like, wherein encoded designs can reuse elements of their encoding and thereby evolve toward greater complexity. Automated design of robots through synthetic evolutionary processes has already been demonstrated, but it is not clear whether genetically inspired search algorithms can yield designs that are sufficiently complex for practical engineering. The ultimate success of such algorithms as tools for automation of design depends on the scaling properties of representations of designs. A nongenerative representation (one in which each element of the encoded design is used at most once in translating to the design) scales linearly with the number of elements. Search algorithms that use nongenerative representations quickly become intractable (search times vary approximately exponentially with numbers of design elements), and thus are not amenable to scaling to complex designs. Generative representations are compact representations and were devised as means to circumvent the above-mentioned fundamental restriction on scalability. In the present method, a robot is defined by a compact programmatic form (its generative representation) and the evolutionary variation takes place on this form. The evolutionary process is an iterative one, wherein each cycle consists of the following steps: 1. Generative representations are generated in an evolutionary subprocess. 2. Each generative representation is a program that, when compiled, produces an assembly procedure. 3. In a computational simulation, a constructor executes an assembly procedure to generate a robot. 4. A physical-simulation program tests the performance of a simulated constructed robot, evaluating the performance according to a fitness criterion to yield a figure of merit that is fed back into the evolutionary subprocess of the next iteration. In comparison with prior approaches to automated evolutionary design of robots, the use of generative representations offers two advantages: First, a generative representation enables the reuse of components in regular and hierarchical ways and thereby serves as a systematic means of creating more complex modules out of simpler ones. Second, the evolved generative representation may capture intrinsic properties of the design problem, so that variations in the representations move through the design space more effectively than do equivalent variations in a nongenerative representation. This method has been demonstrated by using it to design some robots that move, variously, by walking, rolling, or sliding. Some of the robots were built (see figure). Although these robots are very simple, in comparison with robots designed by humans, their structures are more regular, modular, hierarchical, and complex than are those of evolved designs of comparable functionality synthesized by use of nongenerative representations.
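
    The four-step cycle above maps naturally onto a simple loop. In the sketch below, the encoding, mutation operator and fitness function are toy stand-ins chosen for brevity; they are not the authors' generative representation or physical simulator.

```python
import random

# Schematic sketch of the four-step evolutionary cycle described above.
# All encodings and functions here are illustrative toys.
random.seed(0)

def mutate(genome):
    """Step 1: vary the generative representation (here, a rule string)."""
    i = random.randrange(len(genome))
    return genome[:i] + random.choice("AB[]") + genome[i + 1:]

def compile_to_assembly(genome):
    """Step 2: the representation compiles to an assembly procedure."""
    return ["attach-" + ch for ch in genome if ch in "AB"]

def construct(assembly):
    """Step 3: a constructor executes the assembly procedure."""
    return {"parts": assembly}

def fitness(robot):
    """Step 4: evaluate the constructed robot (toy: prefer more parts)."""
    return len(robot["parts"])

pop = ["AB[A]"] * 8
for generation in range(20):
    candidates = pop + [mutate(g) for g in pop]
    candidates.sort(key=lambda g: fitness(construct(compile_to_assembly(g))),
                    reverse=True)
    pop = candidates[:8]  # survivors seed the next cycle

best = pop[0]
print(best, fitness(construct(compile_to_assembly(best))))
```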

  14. Fuzzy logic control of an AGV

    NASA Astrophysics Data System (ADS)

    Kelkar, Nikhal; Samu, Tayib; Hall, Ernest L.

    1997-09-01

    Automated guided vehicles (AGVs) have many potential applications in manufacturing, medicine, space and defense. The purpose of this paper is to describe exploratory research on the design of a modular autonomous mobile robot controller. The controller incorporates a fuzzy logic approach for steering and speed control, a neuro-fuzzy approach for ultrasound sensing (not discussed in this paper) and an overall expert system. The advantages of a modular system are related to portability and transportability, i.e. any vehicle can become autonomous with minimal modifications. A mobile robot test-bed has been constructed using a golf cart base. This cart has full speed control with guidance provided by a vision system and obstacle avoidance using ultrasonic sensors. The speed and steering fuzzy logic controller is supervised by a 486 computer through a multi-axis motion controller. The obstacle avoidance system is based on a micro-controller interfaced with six ultrasonic transducers. This micro-controller independently handles all timing and distance calculations and sends a steering angle correction back to the computer via the serial line. This design yields a portable independent system in which high speed computer communication is not necessary. Vision guidance is accomplished with a CCD camera with a zoom lens. The data is collected by a vision tracking device that transmits the X, Y coordinates of the lane marker to the control computer. Simulation and testing of these systems yielded promising results. This design, in its modularity, creates a portable autonomous fuzzy logic controller applicable to any mobile vehicle with only minor adaptations.

  15. Development of a mobile robot for the 1995 AUVS competition

    NASA Astrophysics Data System (ADS)

    Matthews, Bradley O.; Ruthemeyer, Michael A.; Perdue, David; Hall, Ernest L.

    1995-12-01

    Automated guided vehicles (AGVs) have many potential applications in manufacturing, medicine, space and defense. The purpose of this paper is to describe exploratory research on the design of a modular autonomous mobile robot controller. The advantages of a modular system are related to portability and the fact that any vehicle can become autonomous with minimal modifications. A mobile robot test-bed has been constructed using a golf cart base. This cart has full speed control with guidance provided by a vision system and obstacle avoidance using ultrasonic sensor systems. The speed and steering control are supervised by a 486 computer through a 3-axis motion controller. The obstacle avoidance system is based on a micro-controller interfaced with six ultrasonic transducers. This micro-controller independently handles all timing and distance calculations and sends a steering angle correction back to the computer via the serial line. This design yields a portable independent system, where high-speed computer communication is not necessary. Vision guidance is accomplished with a CCD camera with a zoom lens. The data is collected through a commercial tracking device, which communicates the X, Y coordinates of the lane marker to the computer. Testing of these systems yielded positive results by showing that at five mph the vehicle can follow a line and at the same time avoid obstacles. This design, in its modularity, creates a portable autonomous controller applicable to any mobile vehicle with only minor adaptations.

  16. Conceptual design and thermal analysis of a modular cryostat for one single coil of a 10 MW offshore superconducting wind turbine

    NASA Astrophysics Data System (ADS)

    Sun, Jiuce; Sanz, Santiago; Neumann, Holger

    2015-12-01

    Superconducting generators show the potential to reduce the head mass of large offshore wind turbines. A 10 MW offshore superconducting wind turbine has been investigated in the SUPRAPOWER project. The superconducting coils, based on MgB2 tapes, are designed to work at a cryogenic temperature of 20 K. In this paper, a novel modular rotating cryostat is presented for one single coil of the superconducting wind turbine. The modular concept and cryogen-free cooling method were chosen to fulfil the requirements of handling, maintenance, and reliability for long-term offshore operation. Two-stage Gifford-McMahon cryocoolers were used to provide the cooling source. Supporting rods made of titanium alloy were selected as the support structures of the cryostat with the aim of reducing the heat load. The thermal performance of the modular cryostat was carefully investigated. The heat load applied to the cryocooler second stage was 2.17 W at 20 K per coil. The corresponding temperature difference along the superconducting coil was only around 1 K.

  17. FPGA-based firmware model for extended measurement systems with data quality monitoring

    NASA Astrophysics Data System (ADS)

    Wojenski, A.; Pozniak, K. T.; Mazon, D.; Chernyshova, M.

    2017-08-01

    Modern physics experiments require the construction of advanced, modular measurement systems for data processing and registration. Components are often designed in one of the common mechanical and electrical standards, e.g. VME or uTCA. The paper focuses on measurement systems that use FPGAs as data processing blocks, especially for plasma diagnostics using GEM detectors with data quality monitoring. The article proposes a standardized model of HDL FPGA firmware implementation for use in a wide range of measurement systems. Particular effort was made to implement data quality monitoring flexibly, along with dynamic selection of source data. The paper discusses a standard measurement system model, followed by a detailed model of FPGA firmware for modular measurement systems. Both functional blocks and data buses are considered. In the summary, the necessary blocks and signal lines are described. Firmware implemented following the presented rules should yield a modular design in which different parts can easily be changed. The key benefit is a universal, modular HDL design that can be applied in different measurement systems with simple adjustments.

  18. pySPACE—a signal processing and classification environment in Python

    PubMed Central

    Krell, Mario M.; Straube, Sirko; Seeland, Anett; Wöhrle, Hendrik; Teiwes, Johannes; Metzen, Jan H.; Kirchner, Elsa A.; Kirchner, Frank

    2013-01-01

    In neuroscience large amounts of data are recorded to provide insights into cerebral information processing and function. The successful extraction of the relevant signals becomes more and more challenging due to increasing complexities in acquisition techniques and questions addressed. Here, automated signal processing and machine learning tools can help to process the data, e.g., to separate signal and noise. With the presented software pySPACE (http://pyspace.github.io/pyspace), signal processing algorithms can be compared and applied automatically to time series data, either with the aim of finding a suitable preprocessing, or of training supervised algorithms to classify the data. pySPACE was originally built to process multi-sensor windowed time series data, like event-related potentials from the electroencephalogram (EEG). The software provides automated data handling, distributed processing, modular build-up of signal processing chains and tools for visualization and performance evaluation. Included in the software are various algorithms like temporal and spatial filters, feature generation and selection, classification algorithms, and evaluation schemes. Further, interfaces to other signal processing tools are provided and, since pySPACE is a modular framework, it can be extended with new algorithms according to individual needs. In the presented work, the structural hierarchies are described. It is illustrated how users and developers can interface the software and execute offline and online modes. Configuration of pySPACE is realized with the YAML format, so that programming skills are not mandatory for usage. The concept of pySPACE is to have one comprehensive tool that can be used to perform complete signal processing and classification tasks. It further allows users to define their own algorithms, or to integrate and use existing libraries. PMID:24399965

  19. pySPACE-a signal processing and classification environment in Python.

    PubMed

    Krell, Mario M; Straube, Sirko; Seeland, Anett; Wöhrle, Hendrik; Teiwes, Johannes; Metzen, Jan H; Kirchner, Elsa A; Kirchner, Frank

    2013-01-01

    In neuroscience large amounts of data are recorded to provide insights into cerebral information processing and function. The successful extraction of the relevant signals becomes more and more challenging due to increasing complexities in acquisition techniques and questions addressed. Here, automated signal processing and machine learning tools can help to process the data, e.g., to separate signal and noise. With the presented software pySPACE (http://pyspace.github.io/pyspace), signal processing algorithms can be compared and applied automatically to time series data, either with the aim of finding a suitable preprocessing, or of training supervised algorithms to classify the data. pySPACE was originally built to process multi-sensor windowed time series data, like event-related potentials from the electroencephalogram (EEG). The software provides automated data handling, distributed processing, modular build-up of signal processing chains and tools for visualization and performance evaluation. Included in the software are various algorithms like temporal and spatial filters, feature generation and selection, classification algorithms, and evaluation schemes. Further, interfaces to other signal processing tools are provided and, since pySPACE is a modular framework, it can be extended with new algorithms according to individual needs. In the presented work, the structural hierarchies are described. It is illustrated how users and developers can interface the software and execute offline and online modes. Configuration of pySPACE is realized with the YAML format, so that programming skills are not mandatory for usage. The concept of pySPACE is to have one comprehensive tool that can be used to perform complete signal processing and classification tasks. It further allows users to define their own algorithms, or to integrate and use existing libraries.

  20. Automation of a Wave-Optics Simulation and Image Post-Processing Package on Riptide

    NASA Astrophysics Data System (ADS)

    Werth, M.; Lucas, J.; Thompson, D.; Abercrombie, M.; Holmes, R.; Roggemann, M.

    Detailed wave-optics simulations and image post-processing algorithms are computationally expensive and benefit from the massively parallel hardware available at supercomputing facilities. We created an automated system that interfaces with the Maui High Performance Computing Center (MHPCC) Distributed MATLAB® Portal to submit massively parallel wave-optics simulations to the IBM iDataPlex (Riptide) supercomputer. This system subsequently post-processes the output images with an improved version of physically constrained iterative deconvolution (PCID) and analyzes the results using a series of modular algorithms written in Python. With this architecture, a single person can simulate thousands of unique scenarios and produce analyzed, archived, and briefing-compatible output products with very little effort. This research was developed with funding from the Defense Advanced Research Projects Agency (DARPA). The views, opinions, and/or findings expressed are those of the author(s) and should not be interpreted as representing the official views or policies of the Department of Defense or the U.S. Government.

  1. Advanced interdisciplinary undergraduate program: light engineering

    NASA Astrophysics Data System (ADS)

    Bakholdin, Alexey; Bougrov, Vladislav; Voznesenskaya, Anna; Ezhova, Kseniia

    2016-09-01

    The undergraduate educational program "Light Engineering", at an advanced level of study, is focused on the development of scientific learning outcomes and the training of professionals whose activities lie in the interdisciplinary fields of optical engineering and technical physics. The program gives practical experience in the transmission, reception, storage, processing and displaying of information using opto-electronic devices, automation of optical systems design, computer image modeling, and automated quality control and characterization of optical devices. The program is implemented in accordance with the educational standards of ITMO University. The specific features of the program are practice- and problem-based learning, implemented by engaging students in research and projects, and internships at enterprises and in leading Russian and international research and educational centers. The modular structure of the program and a significant proportion of elective disciplines support individual learning for each student. Learning outcomes of the program's graduates include theoretical knowledge and skills in the natural sciences and core professional disciplines, deep knowledge of modern computer technologies, research expertise, and design skills for optical and optoelectronic systems and devices.

  2. Evolving cell models for systems and synthetic biology.

    PubMed

    Cao, Hongqing; Romero-Campero, Francisco J; Heeb, Stephan; Cámara, Miguel; Krasnogor, Natalio

    2010-03-01

    This paper proposes a new methodology for the automated design of cell models for systems and synthetic biology. Our modelling framework is based on P systems, a discrete, stochastic and modular formal modelling language. The automated design of biological models comprising the optimization of the model structure and its stochastic kinetic constants is performed using an evolutionary algorithm. The evolutionary algorithm evolves model structures by combining different modules taken from a predefined module library and then it fine-tunes the associated stochastic kinetic constants. We investigate four alternative objective functions for the fitness calculation within the evolutionary algorithm: (1) equally weighted sum method, (2) normalization method, (3) randomly weighted sum method, and (4) equally weighted product method. The effectiveness of the methodology is tested on four case studies of increasing complexity including negative and positive autoregulation as well as two gene networks implementing a pulse generator and a bandwidth detector. We provide a systematic analysis of the evolutionary algorithm's results as well as of the resulting evolved cell models.
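
    A minimal sketch of the four fitness-aggregation schemes named above, applied to a vector of per-objective errors, may help fix ideas. The function names and the handling of normalization ranges are our assumptions; the paper's exact formulations may differ.

        # Four ways to collapse multiple objective errors into one fitness value.
        import random

        def equally_weighted_sum(errors):
            return sum(errors) / len(errors)

        def normalized_sum(errors, ranges):
            # Scale each objective by its observed range before summing.
            return sum(e / r for e, r in zip(errors, ranges)) / len(errors)

        def randomly_weighted_sum(errors):
            # Fresh random weights (summing to 1) at every evaluation.
            w = [random.random() for _ in errors]
            s = sum(w)
            return sum(e * wi / s for e, wi in zip(errors, w))

        def equally_weighted_product(errors):
            prod = 1.0
            for e in errors:
                prod *= e
            return prod

        errors = [0.2, 1.5, 0.7]  # hypothetical per-objective errors
        print(equally_weighted_sum(errors))
        print(normalized_sum(errors, ranges=[1.0, 2.0, 1.0]))
        print(randomly_weighted_sum(errors))
        print(equally_weighted_product(errors))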

  3. Demonstration of automated proximity and docking technologies

    NASA Astrophysics Data System (ADS)

    Anderson, Robert L.; Tsugawa, Roy K.; Bryan, Thomas C.

    An autodock was demonstrated using straightforward techniques and real sensor hardware. A simulation testbed was established and validated. The sensor design was refined with improved optical performance and image-processing noise mitigation techniques, and the sensor is ready for production from off-the-shelf components. The autonomous spacecraft architecture is defined; the design covers sensors, docking hardware, propulsion, and avionics. The Guidance, Navigation and Control architecture and requirements are developed. Modular structures suitable for automated control are used. The spacecraft system manager functions, including configuration, resource, and redundancy management, are defined. The requirements for an autonomous spacecraft executive are defined; high-level decision-making, mission planning, and mission contingency recovery are part of this. The next step is to do flight demonstrations. After the presentation, the following question was asked: how do you define validation? There are two components to the definition of validation: software simulation with formal and rigorous validation, and hardware and facility performance validated with respect to software already validated against an analytical profile.

  4. Automated culture system experiments hardware: developing test results and design solutions.

    PubMed

    Freddi, M; Covini, M; Tenconi, C; Ricci, C; Caprioli, M; Cotronei, V

    2002-07-01

    The experiment proposed by Prof. Ricci (University of Milan) is funded by ASI, with Laben as industrial Prime Contractor. ACS-EH (Automated Culture System-Experiment Hardware) will support the multigenerational experiment on weightlessness with rotifers and nematodes within four Experiment Containers (ECs) located inside the European Modular Cultivation System (EMCS) facility. Phase B is currently in progress and a concept design solution has been defined. The most challenging aspects of the design of such hardware are, from a biological point of view, the provision of an environment that permits the animals' survival and keeps desiccated generations separated, and, from a technical point of view, the miniaturisation of the hardware itself due to the reduced volume provided by the EC (160 mm x 60 mm x 60 mm). The miniaturisation will allow a better use of the available EMCS Facility resources (e.g. volume, power, etc.) and the fulfilment of the experiment requirements. ACS-EH will be ready to fly in 2005 on board the ISS.

  5. A versatile and modular quasi optics-based 200 GHz dual dynamic nuclear polarization and electron paramagnetic resonance instrument

    PubMed Central

    Siaw, Ting Ann; Leavesley, Alisa; Lund, Alicia; Kaminker, Ilia; Han, Songi

    2016-01-01

    Solid-state dynamic nuclear polarization (DNP) at higher magnetic fields (>3 T) and cryogenic temperatures (~2–90 K) has gained enormous interest and seen major technological advances as an NMR signal enhancing technique. Still, the current state of the art DNP operation is not at a state at which sample and freezing conditions can be rationally chosen and the DNP performance predicted a priori, but relies on purely empirical approaches. An important step towards rational optimization of DNP conditions is to have access to DNP instrumental capabilities to diagnose DNP performance and elucidate DNP mechanisms. The desired diagnoses include the measurement of the “DNP power curve”, i.e. the microwave (MW) power dependence of DNP enhancement, the “DNP spectrum”, i.e. the MW frequency dependence of DNP enhancement, the electron paramagnetic resonance (EPR) spectrum and the saturation and spectral diffusion properties of the EPR spectrum upon prolonged MW irradiation typical of continuous wave (CW) DNP, as well as various electron and nuclear spin relaxation parameters. Even basic measurements of these DNP parameters require versatile instrumentation at high magnetic fields not commercially available to date. In this article, we describe the detailed design of such a DNP instrument, powered by a solid-state MW source that is tunable between 193 – 201 GHz and outputs up to 140 mW of MW power. The quality and pathway of the transmitted and reflected MWs is controlled by a quasi-optics (QO) bridge and a corrugated waveguide, where the latter couples the MW from an open-space QO bridge to the sample located inside the superconducting magnet and vice versa. Crucially, the versatility of the solid-state MW source enables the automated acquisition of frequency swept DNP spectra, DNP power curves, the diagnosis of MW power and transmission, and frequency swept continuous wave (CW) and pulsed EPR experiments. The flexibility of the DNP instrument centered around the QO MW bridge will provide an efficient means to collect DNP data that is crucial for understanding the relationship between experimental and sample conditions, and the DNP performance. The modularity of this instrumental platform is suitable for future upgrades and extensions to include new experimental capabilities to meet contemporary DNP needs, including the simultaneous operation of two or more MW sources, time domain DNP, electron double resonance measurements, pulsed EPR operation, or simply the implementation of higher power MW amplifiers. PMID:26920839

  6. A versatile and modular quasi optics-based 200 GHz dual dynamic nuclear polarization and electron paramagnetic resonance instrument

    NASA Astrophysics Data System (ADS)

    Siaw, Ting Ann; Leavesley, Alisa; Lund, Alicia; Kaminker, Ilia; Han, Songi

    2016-03-01

    Solid-state dynamic nuclear polarization (DNP) at higher magnetic fields (>3 T) and cryogenic temperatures (∼2-90 K) has gained enormous interest and seen major technological advances as an NMR signal enhancing technique. Still, the current state of the art DNP operation is not at a state at which sample and freezing conditions can be rationally chosen and the DNP performance predicted a priori, but relies on purely empirical approaches. An important step towards rational optimization of DNP conditions is to have access to DNP instrumental capabilities to diagnose DNP performance and elucidate DNP mechanisms. The desired diagnoses include the measurement of the "DNP power curve", i.e. the microwave (MW) power dependence of DNP enhancement, the "DNP spectrum", i.e. the MW frequency dependence of DNP enhancement, the electron paramagnetic resonance (EPR) spectrum, and the saturation and spectral diffusion properties of the EPR spectrum upon prolonged MW irradiation typical of continuous wave (CW) DNP, as well as various electron and nuclear spin relaxation parameters. Even basic measurements of these DNP parameters require versatile instrumentation at high magnetic fields not commercially available to date. In this article, we describe the detailed design of such a DNP instrument, powered by a solid-state MW source that is tunable between 193 and 201 GHz and outputs up to 140 mW of MW power. The quality and pathway of the transmitted and reflected MWs is controlled by a quasi-optics (QO) bridge and a corrugated waveguide, where the latter couples the MW from an open-space QO bridge to the sample located inside the superconducting magnet and vice versa. Crucially, the versatility of the solid-state MW source enables the automated acquisition of frequency swept DNP spectra, DNP power curves, the diagnosis of MW power and transmission, and frequency swept continuous wave (CW) and pulsed EPR experiments. The flexibility of the DNP instrument centered around the QO MW bridge will provide an efficient means to collect DNP data that is crucial for understanding the relationship between experimental and sample conditions, and the DNP performance. The modularity of this instrumental platform is suitable for future upgrades and extensions to include new experimental capabilities to meet contemporary DNP needs, including the simultaneous operation of two or more MW sources, time domain DNP, electron double resonance measurements, pulsed EPR operation, or simply the implementation of higher power MW amplifiers.

  7. A versatile and modular quasi optics-based 200 GHz dual dynamic nuclear polarization and electron paramagnetic resonance instrument.

    PubMed

    Siaw, Ting Ann; Leavesley, Alisa; Lund, Alicia; Kaminker, Ilia; Han, Songi

    2016-03-01

    Solid-state dynamic nuclear polarization (DNP) at higher magnetic fields (>3 T) and cryogenic temperatures (∼2-90 K) has gained enormous interest and seen major technological advances as an NMR signal enhancing technique. Still, the current state of the art DNP operation is not at a state at which sample and freezing conditions can be rationally chosen and the DNP performance predicted a priori, but relies on purely empirical approaches. An important step towards rational optimization of DNP conditions is to have access to DNP instrumental capabilities to diagnose DNP performance and elucidate DNP mechanisms. The desired diagnoses include the measurement of the "DNP power curve", i.e. the microwave (MW) power dependence of DNP enhancement, the "DNP spectrum", i.e. the MW frequency dependence of DNP enhancement, the electron paramagnetic resonance (EPR) spectrum, and the saturation and spectral diffusion properties of the EPR spectrum upon prolonged MW irradiation typical of continuous wave (CW) DNP, as well as various electron and nuclear spin relaxation parameters. Even basic measurements of these DNP parameters require versatile instrumentation at high magnetic fields not commercially available to date. In this article, we describe the detailed design of such a DNP instrument, powered by a solid-state MW source that is tunable between 193 and 201 GHz and outputs up to 140 mW of MW power. The quality and pathway of the transmitted and reflected MWs is controlled by a quasi-optics (QO) bridge and a corrugated waveguide, where the latter couples the MW from an open-space QO bridge to the sample located inside the superconducting magnet and vice versa. Crucially, the versatility of the solid-state MW source enables the automated acquisition of frequency swept DNP spectra, DNP power curves, the diagnosis of MW power and transmission, and frequency swept continuous wave (CW) and pulsed EPR experiments. The flexibility of the DNP instrument centered around the QO MW bridge will provide an efficient means to collect DNP data that is crucial for understanding the relationship between experimental and sample conditions, and the DNP performance. The modularity of this instrumental platform is suitable for future upgrades and extensions to include new experimental capabilities to meet contemporary DNP needs, including the simultaneous operation of two or more MW sources, time domain DNP, electron double resonance measurements, pulsed EPR operation, or simply the implementation of higher power MW amplifiers. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Automated System Marketplace 1987: Maturity and Competition.

    ERIC Educational Resources Information Center

    Walton, Robert A.; Bridge, Frank R.

    1988-01-01

    This annual review of the library automation marketplace presents profiles of 15 major library automation firms and looks at emerging trends. Seventeen charts and tables provide data on market shares, number and size of installations, hardware availability, operating systems, and interfaces. A directory of 49 automation sources is included. (MES)

  9. Clarity: An Open Source Manager for Laboratory Automation

    PubMed Central

    Delaney, Nigel F.; Echenique, José Rojas; Marx, Christopher J.

    2013-01-01

    Software to manage automated laboratories interfaces with hardware instruments, gives users a way to specify experimental protocols, and schedules activities to avoid hardware conflicts. In addition to these basics, modern laboratories need software that can run multiple different protocols in parallel and that can be easily extended to interface with a constantly growing diversity of techniques and instruments. We present Clarity: a laboratory automation manager that is hardware agnostic, portable, extensible and open source. Clarity provides critical features including remote monitoring, robust error reporting by phone or email, and full state recovery in the event of a system crash. We discuss the basic organization of Clarity; demonstrate an example of its implementation for the automated analysis of bacterial growth; and describe how the program can be extended to manage new hardware. Clarity is mature; well documented; actively developed; written in C# for the Common Language Infrastructure; and is free and open source software. These advantages set Clarity apart from currently available laboratory automation programs. PMID:23032169
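
    The core scheduling constraint such a manager enforces (one protocol at a time per instrument, with independent steps free to run in parallel) can be sketched in a few lines. This is a toy illustration of the problem, not Clarity's actual C# API.

        # Toy instrument scheduler: locks serialize access per instrument.
        import threading
        import time

        INSTRUMENT_LOCKS = {"reader": threading.Lock(), "robot": threading.Lock()}

        def run_step(protocol, step, instrument):
            # Only one protocol may hold an instrument at a time.
            with INSTRUMENT_LOCKS[instrument]:
                print(f"{protocol}: {step} on {instrument}")
                time.sleep(0.01)  # stand-in for the real operation

        threads = [
            threading.Thread(target=run_step,
                             args=("growth-assay", "read OD600", "reader")),
            threading.Thread(target=run_step,
                             args=("plate-prep", "move plate", "robot")),
        ]
        for t in threads:
            t.start()
        for t in threads:
            t.join()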

  10. Challenges in atmospheric monitoring of areal emission sources - an Open-path Fourier transform infrared (OP-FTIR) spectroscopic experience report

    NASA Astrophysics Data System (ADS)

    Schuetze, C.; Sauer, U.; Dietrich, P.

    2015-12-01

    Reliable detection and assessment of near-surface CO2 emissions from natural or anthropogenic sources require the application of various monitoring tools at different spatial scales. Optical remote sensing tools for atmospheric monitoring, in particular, have the potential to measure CO2 emissions integrally over larger scales (>10,000 m2). Within the framework of the MONACO project ("Monitoring approach for geological CO2 storage sites using a hierarchical observation concept"), an integrative hierarchical monitoring concept was developed and validated at different field sites, with the aim of establishing a modular observation strategy that includes investigations in the shallow subsurface, at ground surface level and in the lower atmospheric boundary layer. The main aims of the atmospheric monitoring using optical remote sensing were to observe gas dispersion into the near-surface atmosphere, to determine maximum concentration values and to identify the main challenges associated with monitoring extended emission sources with the proposed methodological setup under typical environmental conditions. The presentation will give an overview of several case studies using the integrative approach of Open-Path Fourier Transform Infrared (OP-FTIR) spectroscopy in combination with in situ measurements. As a main result, the method was validated as a possible approach for continuous monitoring of the atmospheric composition, in terms of integral determination of GHG concentrations, and for identifying target areas that need to be investigated in more detail. In particular, data interpretation should closely consider the micrometeorological conditions. Technical aspects concerning robust equipment, experimental setup and fast data-processing algorithms have to be taken into account for the enhanced automation of atmospheric monitoring.
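
    For readers unfamiliar with open-path instruments, the sketch below shows the generic Beer-Lambert step that converts a path-integrated absorbance into a path-averaged mixing ratio. The cross-section, path length and absorbance values are illustrative assumptions, not parameters from the MONACO campaigns.

        # Path-averaged concentration from open-path absorbance, A = sigma*N*L.
        # All input values are assumptions for illustration.

        def path_averaged_ppm(absorbance, sigma_cm2, path_m, n_air_cm3=2.5e19):
            """Convert absorbance to a path-averaged mixing ratio in ppm."""
            path_cm = path_m * 100.0
            n_gas = absorbance / (sigma_cm2 * path_cm)  # molecules per cm^3
            return n_gas / n_air_cm3 * 1e6

        print(path_averaged_ppm(absorbance=0.05, sigma_cm2=5e-22, path_m=200.0))
        # -> 200 ppm for these assumed values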

  11. Demonstration of a Small Modular BioPower System Using Poultry Litter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John P. Reardon; Art Lilley; Jim Wimberly

    2002-05-22

    The purpose of this project was to assess poultry grower residue, or litter (manure plus absorbent biomass), as a fuel source for Community Power Corporation's small modular biopower system (SMB). A second objective was to assess the poultry industry to identify potential "on-site" applications of the SMB system using poultry litter residue as a fuel source, and to adapt CPC's existing SMB to generate electricity and heat from the poultry litter biomass fuel. Bench-scale testing and pilot testing were used to gain design information for the SMB retrofit. Defining the system design approach for the Phase II application of the SMB was the goal of the Phase I testing. Cost estimates for an on-site poultry litter SMB were prepared. Finally, a market estimate was prepared for implementation of the on-farm SMB using poultry litter.

  12. Modular neuron-based body estimation: maintaining consistency over different limbs, modalities, and frames of reference

    PubMed Central

    Ehrenfeld, Stephan; Herbort, Oliver; Butz, Martin V.

    2013-01-01

    This paper addresses the question of how the brain maintains a probabilistic body state estimate over time from a modeling perspective. The neural Modular Modality Frame (nMMF) model simulates such a body state estimation process by continuously integrating redundant, multimodal body state information sources. The body state estimate itself is distributed over separate, but bidirectionally interacting modules. nMMF compares the incoming sensory and present body state information across the interacting modules and fuses the information sources accordingly. At the same time, nMMF enforces body state estimation consistency across the modules. nMMF is able to detect conflicting sensory information and to consequently decrease the influence of implausible sensor sources on the fly. In contrast to the previously published Modular Modality Frame (MMF) model, nMMF offers a biologically plausible neural implementation based on distributed, probabilistic population codes. Besides its neural plausibility, the neural encoding has the advantage of enabling (a) additional probabilistic information flow across the separate body state estimation modules and (b) the representation of arbitrary probability distributions of a body state. The results show that the neural estimates can detect and decrease the impact of false sensory information, can propagate conflicting information across modules, and can improve overall estimation accuracy due to additional module interactions. Even bodily illusions, such as the rubber hand illusion, can be simulated with nMMF. We conclude with an outlook on the potential of modeling human data and of invoking goal-directed behavioral control. PMID:24191151
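
    The fusion idea can be conveyed with a deliberately simplified scalar stand-in: precision-weighted combination of two Gaussian estimates of the same body state. nMMF itself operates on distributed population codes, so this sketch only illustrates why conflicting, low-precision sensor sources lose influence.

        # Product of two Gaussian estimates: precision-weighted fusion.

        def fuse(mu_a, var_a, mu_b, var_b):
            """Return mean and variance of the fused (product) estimate."""
            w_a, w_b = 1.0 / var_a, 1.0 / var_b  # precisions
            var = 1.0 / (w_a + w_b)
            mu = var * (w_a * mu_a + w_b * mu_b)
            return mu, var

        # A reliable proprioceptive estimate vs. a conflicting, noisy visual one:
        mu, var = fuse(mu_a=0.10, var_a=0.01, mu_b=0.50, var_b=0.25)
        print(mu, var)  # fused mean stays near the reliable estimate (~0.115)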

  13. SCHeMA open and modular in situ sensing solution

    NASA Astrophysics Data System (ADS)

    Tercier-Waeber, Marie Louise; Novellino, Antonio

    2017-04-01

    Marine environments are highly vulnerable and influenced by a wide diversity of anthropogenic and natural substances and organisms that may have adverse effects on the ecosystem equilibrium, on living resources and, ultimately, on human health. Identification of relevant types of hazards at the appropriate temporal and spatial scale is crucial to detect their sources and origin, to understand the processes governing their magnitude and distribution, and ultimately to evaluate and manage their risks and consequences, preventing economic losses. This can be addressed only by the development of innovative, compact, rugged, automated sensor networks allowing long-term monitoring. The development of such tools is a challenging task, as it requires many analytical and technical innovations. The FP7-OCEAN 2013 SCHeMA project aims to contribute to meeting this challenge by providing an open and modular sensing solution for autonomous in situ high-resolution mapping of a range of anthropogenic and natural chemical compounds (trace metals, nutrients, anthropogenic organic compounds, toxic algae species and toxins, and species relevant to the carbon cycle). To achieve this, SCHeMA activities focus on the development of: 1) an array of miniature sensor probes taking advantage of various innovative solutions, namely (polymer-based) gel-integrated sensors, solid-state ion-selective membrane sensors coupled to an on-line desalination module, mid-infrared optical sensors, optochemical multichannel devices, and EnOcean technology; 2) dedicated hardware, firmware and software components allowing their plug-and-play integration and localization, as well as wireless bidirectional communication via advanced OGC-SWE wired/wireless interfaces; 3) a web-based front-end system compatible with EU standard requirements and principles (INSPIRE, GEO/GEOSS) and configured to ensure easy interoperability with national, regional and local marine observation systems. This lecture will present examples of the innovative approaches and devices successfully developed and currently being explored. The potential of the individual SCHeMA probes and of the integrated system to provide new types of high-resolution environmental data will be illustrated by examples of field application in selected coastal areas. www.schema-ocean.eu

  14. Quantitative Image Feature Engine (QIFE): an Open-Source, Modular Engine for 3D Quantitative Feature Extraction from Volumetric Medical Images.

    PubMed

    Echegaray, Sebastian; Bakr, Shaimaa; Rubin, Daniel L; Napel, Sandy

    2017-10-06

    The aim of this study was to develop an open-source, modular, locally run or server-based system for 3D radiomics feature computation that can be used on any computer system and included in existing workflows for understanding associations and building predictive models between image features and clinical data, such as survival. The QIFE exploits various levels of parallelization for use on multiprocessor systems. It consists of a managing framework and four stages: input, pre-processing, feature computation, and output. Each stage contains one or more swappable components, allowing run-time customization. We benchmarked the engine using various levels of parallelization on a cohort of CT scans presenting 108 lung tumors. Two versions of the QIFE have been released: (1) the open-source MATLAB code posted to GitHub, and (2) a compiled version loaded in a Docker container, posted to DockerHub, which can be easily deployed on any computer. The QIFE processed 108 objects (tumors) in 2:12 (h:mm) using one core, and in 1:04 (h:mm) using four cores with object-level parallelization. We developed the Quantitative Image Feature Engine (QIFE), an open-source feature-extraction framework that focuses on modularity, standards, parallelism, provenance, and integration. Researchers can easily integrate it with their existing segmentation and imaging workflows by creating input and output components that implement their existing interfaces. Computational efficiency can be improved by parallelizing execution at the cost of memory usage. Different parallelization levels provide different trade-offs, and the optimal setting will depend on the size and composition of the dataset to be processed.
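
    The object-level parallelization described above has the familiar shape of a worker pool over tumors; a minimal sketch follows. The extract_features stub and its output are placeholders, not QIFE's actual MATLAB interfaces.

        # Object-level parallelism: one worker per tumor, as in the benchmark.
        from multiprocessing import Pool

        def extract_features(tumor_id):
            # Stand-in for the input -> pre-processing -> feature stages.
            return tumor_id, {"volume_mm3": 100.0 + tumor_id}

        if __name__ == "__main__":
            tumors = range(108)              # the benchmark cohort size
            with Pool(processes=4) as pool:  # four cores roughly halved runtime
                results = dict(pool.map(extract_features, tumors))
            print(len(results), "objects processed")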

  15. High-accuracy microassembly by intelligent vision systems and smart sensor integration

    NASA Astrophysics Data System (ADS)

    Schilp, Johannes; Harfensteller, Mark; Jacob, Dirk; Schilp, Michael

    2003-10-01

    Innovative production processes and strategies, from batch production to high-volume scale, play a decisive role in producing microsystems economically. Assembly processes in particular are crucial operations in the production of microsystems. For large batch sizes, many microsystems can be produced economically with conventional assembly techniques using specialized, highly automated assembly systems. At the laboratory stage, microsystems are mostly assembled by hand. Between these extremes there is a wide field of small and medium-sized batch production for which common automated solutions are rarely profitable. For assembly processes at these batch sizes, a flexible automated assembly system has been developed at the iwb. It is based on a modular design. Actuators such as grippers, dispensers or other process tools can easily be attached thanks to a special tool-changing system, so new joining techniques can easily be implemented. A force sensor and a vision system are integrated into the tool head. The automated assembly processes are based on different optical sensors and smart actuators such as high-accuracy robots or linear motors. A fiber-optic sensor is integrated into the dispensing module to measure, contactlessly, the clearance between the dispense needle and the substrate. Robot vision systems using optical pattern recognition are also implemented as modules. In combination with relative positioning strategies, an assembly accuracy of less than 3 μm can be realized. A laser system is used for manufacturing processes such as soldering.

  16. Mining the modular structure of protein interaction networks.

    PubMed

    Berenstein, Ariel José; Piñero, Janet; Furlong, Laura Inés; Chernomoretz, Ariel

    2015-01-01

    Cluster-based descriptions of biological networks have received much attention in recent years, fostered by accumulated evidence of the existence of meaningful correlations between topological network clusters and biological functional modules. Several well-performing clustering algorithms exist to infer topological network partitions. However, owing to their respective technical idiosyncrasies, they might produce dissimilar modular decompositions of a given network. In this contribution, we aimed to analyze how alternative modular descriptions could condition the outcome of follow-up network biology analysis. We considered a human protein interaction network and two paradigmatic cluster recognition algorithms, namely the Clauset-Newman-Moore and the infomap procedures. We analyzed to what extent both methodologies yielded different results in terms of granularity and biological congruency. In addition, taking into account Guimera's cartographic role characterization of network nodes, we explored how the adoption of a given clustering methodology impinged on the ability to highlight relevant network meso-scale connectivity patterns. As a case study we considered a set of aging-related proteins and showed that only the high-resolution modular description provided by infomap could unveil statistically significant associations between them and inter/intra-modular cartographic features. Besides reporting novel biological insights that could be gained from the discovered associations, our contribution warns of possible technical concerns that might affect the tools used to mine for interaction patterns in network biology studies. In particular, our results suggested that partitions that are sub-optimal from the strict point of view of their modularity levels might still be worth analyzing when meso-scale features are to be explored in connection with external sources of biological knowledge.
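
    One of the two algorithms compared, Clauset-Newman-Moore, ships with NetworkX, so the kind of granularity measurement discussed above can be sketched directly (infomap needs the external infomap package and is omitted here). The karate-club graph stands in for the human protein interaction network.

        # Module count and modularity Q under the Clauset-Newman-Moore method.
        import networkx as nx
        from networkx.algorithms import community

        G = nx.karate_club_graph()  # stand-in for the PPI network
        cnm = community.greedy_modularity_communities(G)
        print("CNM modules:", len(cnm))
        print("modularity Q:", community.modularity(G, cnm))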

  17. QPatch: the past, present and future of automated patch clamp.

    PubMed

    Mathes, Chris

    2006-04-01

    The QPatch 16 significantly increases throughput for gigaseal patch clamp experiments, making direct measurements in ion channel drug discovery and safety testing feasible. Released to the market in the autumn of 2004 by Sophion Bioscience, the QPatch originated from work done at NeuroSearch (Denmark) in the early days of automated patch clamp. Today, the QPatch provides many unique features. For example, only the QPatch includes an automated cell preparation station, making several hours of unattended operation possible. The 16-channel electrode array, called the QPlate, includes glass-coated microfluidic channels for less compound absorption and, hence, more accurate IC50 values. The microfluidic pathways also allow for very small amounts of compound to be used for each experiment (approximately 5 μl per addition). Only the QPatch has four independent pipetting heads for more efficient liquid handling (especially for ligand-gated ion channel experiments). Patch clamp recordings with the QPatch match the high quality of conventional patch clamp and in some cases the results are even better. For example, only the QPatch includes 100% series resistance compensation for the elimination of false positives due to voltage errors. Finally, the modular QPatch 16 was designed with more channels in mind. The upgrade pathway to 48 channels (the QPatch HT) will be discussed.

  18. Automated cassette-based production of high specific activity [203/212Pb]peptide-based theranostic radiopharmaceuticals for image-guided radionuclide therapy for cancer.

    PubMed

    Li, Mengshi; Zhang, Xiuli; Quinn, Thomas P; Lee, Dongyoul; Liu, Dijie; Kunkel, Falk; Zimmerman, Brian E; McAlister, Daniel; Olewein, Keith; Menda, Yusuf; Mirzadeh, Saed; Copping, Roy; Johnson, Frances L; Schultz, Michael K

    2017-09-01

    A method for preparation of Pb-212 and Pb-203 labeled chelator-modified peptide-based radiopharmaceuticals for cancer imaging and radionuclide therapy has been developed and adapted for automated clinical production. Pre-concentration and isolation of radioactive Pb2+ from interfering metals in dilute hydrochloric acid was optimized using a commercially-available Pb-specific chromatography resin packed in disposable plastic columns. The pre-concentrated radioactive Pb2+ is eluted in NaOAc buffer directly to the reaction vessel containing chelator-modified peptides. Radiolabeling was found to proceed efficiently at 85 °C (45 min; pH 5.5). The specific activity of radiolabeled conjugates was optimized by separation of radiolabeled conjugates from unlabeled peptide via HPLC. Preservation of bioactivity was confirmed by in vivo biodistribution of Pb-203 and Pb-212 labeled peptides in melanoma-tumor-bearing mice. The approach has been found to be robustly adaptable to automation and a cassette-based fluid-handling system (Modular Lab Pharm Tracer) has been customized for clinical radiopharmaceutical production. Our findings demonstrate that the Pb-203/Pb-212 combination is a promising elementally-matched radionuclide pair for image-guided radionuclide therapy for melanoma, neuroendocrine tumors, and potentially other cancers. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Modular chemiresistive sensor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alam, Maksudul M.; Sampathkumaran, Uma

    The present invention relates to a modular chemiresistive sensor; in particular, a modular chemiresistive sensor for hypergolic fuel and oxidizer leak detection, carbon dioxide monitoring and detection of disease biomarkers. The sensor preferably has two gold or platinum electrodes mounted on a silicon substrate, where the electrodes are connected to a power source and are separated by a gap of 0.5 to 4.0 μm. A polymer nanowire or carbon nanotube spans the gap between the electrodes and connects the electrodes electrically. The electrodes are further connected to a circuit board having a processor and data storage, where the processor can measure current and voltage values between the electrodes and compare them with current and voltage values stored in the data storage and assigned to particular concentrations of a pre-determined substance, such as those listed above or a variety of other substances.
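
    The comparison step described above can be sketched as a nearest-neighbour lookup against stored calibration points. The calibration values and interface below are invented for illustration; real firmware would likely interpolate between points and compensate for temperature.

        # Match a measured (V, I) pair to the closest stored calibration entry.
        CALIBRATION = [
            # (voltage V, current uA, concentration ppm) - invented values
            (1.0, 12.0, 0.0),
            (1.0, 15.5, 5.0),
            (1.0, 21.0, 20.0),
        ]

        def concentration(v_meas, i_meas):
            # Nearest stored point wins.
            key = lambda row: (row[0] - v_meas) ** 2 + (row[1] - i_meas) ** 2
            return min(CALIBRATION, key=key)[2]

        print(concentration(1.0, 16.1), "ppm")  # -> 5.0 ppm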

  20. Modular Open System Architecture for Reducing Contamination Risk in the Space and Missile Defense Supply Chain

    NASA Technical Reports Server (NTRS)

    Seasly, Elaine

    2015-01-01

    To combat contamination of physical assets and provide reliable data to decision makers in the space and missile defense community, a modular open system architecture for creation of contamination models and standards is proposed. Predictive tools for quantifying the effects of contamination can be calibrated from NASA data of long-term orbiting assets. This data can then be extrapolated to missile defense predictive models. By utilizing a modular open system architecture, sensitive data can be de-coupled and protected while benefitting from open source data of calibrated models. This system architecture will include modules that will allow the designer to trade the effects of baseline performance against the lifecycle degradation due to contamination while modeling the lifecycle costs of alternative designs. In this way, each member of the supply chain becomes an informed and active participant in managing contamination risk early in the system lifecycle.

  1. Space Station crew workload - Station operations and customer accommodations

    NASA Technical Reports Server (NTRS)

    Shinkle, G. L.

    1985-01-01

    The features of the Space Station which permit crew members to utilize work time for payload operations are discussed. The user orientation, modular design, nonstressful flight regime, in space construction, on board control, automation and robotics, and maintenance and servicing of the Space Station are examined. The proposed crew size, skills, and functions as station operator and mission specialists are described. Mission objectives and crew functions, which include performing material processing, life science and astronomy experiments, satellite and payload equipment servicing, systems monitoring and control, maintenance and repair, Orbital Maneuvering Vehicle and Mobile Remote Manipulator System operations, on board planning, housekeeping, and health maintenance and recreation, are studied.

  2. Automatic design of IMA systems

    NASA Astrophysics Data System (ADS)

    Salomon, U.; Reichel, R.

    In recent years, the integrated modular avionics (IMA) design philosophy has become widely established at aircraft manufacturers, giving rise to a series of new design challenges, most notably the allocation of avionics functions to the various IMA components and the placement of this equipment in the aircraft. This paper presents a modelling approach for avionics that allows automation of some steps of the design process by applying an optimisation algorithm that searches for system configurations that fulfil the safety requirements and have low costs. The algorithm was implemented as a sophisticated software prototype; we therefore also present detailed results of its application to actual avionics systems.

  3. Discovering perturbation of modular structure in HIV progression by integrating multiple data sources through non-negative matrix factorization.

    PubMed

    Ray, Sumanta; Maulik, Ujjwal

    2016-12-20

    Detecting perturbation in modular structure during HIV-1 disease progression is an important step towards understanding the stage-specific infection pattern of the HIV-1 virus in human cells. In this article, we propose a novel methodology that integrates multiple sources of biological information to identify such disruption in human gene modules during different stages of HIV-1 infection. We integrate three different types of biological information: gene expression information, protein-protein interaction information and gene ontology information, into single gene meta-modules, through non-negative matrix factorization (NMF). As the identified meta-modules inherit this information, detecting their perturbation reflects the changes in expression pattern, in PPI structure and in functional similarity of genes during infection progression. NMF-based clustering is utilized here to integrate the modules of different data sources into strong meta-modules. Perturbation in the meta-modular structure is identified by investigating topological and intramodular properties and by ranking the meta-modules using a rank aggregation algorithm. We have also analyzed the preservation structure of significant GO terms in which the human proteins of the meta-modules participate. Moreover, we have performed an analysis to show the change in the coregulation pattern of identified transcription factors (TFs) over the HIV progression stages.
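
    A toy version of the factorization step may clarify the mechanics: factorize a non-negative gene-by-feature matrix X into W and H and assign each gene to the latent factor with the largest loading. The real study integrates expression, PPI and gene ontology information; random data is used here for shape only.

        # NMF-based module assignment sketch (scikit-learn).
        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(0)
        X = rng.random((60, 20))  # 60 genes x 20 features, non-negative

        model = NMF(n_components=4, init="nndsvda", random_state=0, max_iter=500)
        W = model.fit_transform(X)    # gene loadings on 4 meta-modules
        labels = W.argmax(axis=1)     # hard meta-module assignment
        print(np.bincount(labels))    # genes per meta-module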

  4. Towards the automated reduction and calibration of SCUBA data from the James Clerk Maxwell Telescope

    NASA Astrophysics Data System (ADS)

    Jenness, T.; Stevens, J. A.; Archibald, E. N.; Economou, F.; Jessop, N. E.; Robson, E. I.

    2002-10-01

    The Submillimetre Common User Bolometer Array (SCUBA) instrument has been operating on the James Clerk Maxwell Telescope (JCMT) since 1997. The data archive is now sufficiently large that it can be used to investigate instrumental properties and the variability of astronomical sources. This paper describes the automated calibration and reduction scheme used to process the archive data, with particular emphasis on `jiggle-map' observations of compact sources. We demonstrate the validity of our automated approach at both 850 and 450 μm, and apply it to several of the JCMT secondary flux calibrators. We determine light curves for the variable sources IRC +10216 and OH 231.8. This automation is made possible by using the ORAC-DR data reduction pipeline, a flexible and extensible data reduction pipeline that is used on the United Kingdom Infrared Telescope (UKIRT) and the JCMT.

  5. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  6. Open-Source 3-D Platform for Low-Cost Scientific Instrument Ecosystem.

    PubMed

    Zhang, C; Wijnen, B; Pearce, J M

    2016-08-01

    The combination of open-source software and hardware provides technically feasible methods to create low-cost, highly customized scientific research equipment. Open-source 3-D printers have proven useful for fabricating scientific tools. Here the capabilities of an open-source 3-D printer are expanded to become a highly flexible scientific platform. An automated low-cost 3-D motion control platform is presented that has the capacity to perform scientific applications, including (1) 3-D printing of scientific hardware; (2) laboratory auto-stirring, measuring, and probing; (3) automated fluid handling; and (4) shaking and mixing. The open-source 3-D platform not only facilitates routine research while radically reducing the cost, but also inspires the creation of a diverse array of custom instruments that can be shared and replicated digitally throughout the world to drive down the cost of research and education further. © 2016 Society for Laboratory Automation and Screening.

  7. Designing Decentralized Water and Electricity Supply System for Small Recreational Facilities in the South of Russia

    NASA Astrophysics Data System (ADS)

    Kasharin, D. V.

    2017-11-01

    The article tackles the design of seasonal water and power supply systems for small recreational facilities in the south of Russia based on intelligent decision support systems. The paper proposes modular prefabricated shell water and power supply works (MPSW&PW), along with energy-efficient standalone water-treatment plants, as the principal facilities compliant with the environmental and infrastructural requirements applied to specially protected areas, ensuring the least possible damage to the environment thanks to a maximum possible use of local construction materials characterized by large safety margins in highly seismic environments. Designing such water and power supply systems requires an intelligent GIS-based system for the selection of water intake sites, which facilitates the automation of data processing using a priori scanning methods with a variable step and random directions. The paper addresses these issues and develops parameterized optimization algorithms for MPSW&PW shell facilities. It also provides the substantiation of intelligent water-treatment plant design based on energy-recovery reverse osmosis and nanofiltration plants that enhance energy efficiency, serving as the optimum solution for the decentralized water supply of small recreational facilities from renewable energy sources.

  8. GéoSAS: A modular and interoperable Open Source Spatial Data Infrastructure for research

    NASA Astrophysics Data System (ADS)

    Bera, R.; Squividant, H.; Le Henaff, G.; Pichelin, P.; Ruiz, L.; Launay, J.; Vanhouteghem, J.; Aurousseau, P.; Cudennec, C.

    2015-05-01

    To date, the most common way to deal with geographical information and processes is still to consume local resources, i.e. locally stored data processed on a local desktop or server. The maturity and growing use of OGC standards to exchange data on the World Wide Web, reinforced in Europe by the INSPIRE Directive, is bound to change the way people (among them research scientists, especially in the environmental sciences) use and manage spatial data. A clever use of OGC standards can help scientists better store, share and use data, in particular for modelling. We propose a framework for online processing that makes intensive use of OGC standards. We illustrate it with GéoSAS, the Spatial Data Infrastructure (SDI) set up for researchers' needs in our department, which is based on the existing open-source, modular and interoperable spatial data architecture geOrchestra.
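
    The kind of standards-based access such an SDI exposes can be illustrated with a plain WFS 2.0 GetFeature request built from standard OGC parameters. The endpoint and layer name below are placeholders, not GéoSAS's actual services.

        # Fetch features from a (hypothetical) OGC WFS endpoint as GeoJSON.
        import requests

        params = {
            "service": "WFS",
            "version": "2.0.0",
            "request": "GetFeature",
            "typeNames": "workspace:catchments",  # hypothetical layer
            "outputFormat": "application/json",
            "count": 10,
        }
        r = requests.get("https://example.org/geoserver/wfs",
                         params=params, timeout=30)
        features = r.json()["features"]
        print(len(features), "features retrieved")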

  9. The Dichotomy in Degree Correlation of Biological Networks

    PubMed Central

    Hao, Dapeng; Li, Chuanxing

    2011-01-01

    Most complex networks from different areas such as biology, sociology or technology show a correlation in node degree, where the probability of a link between two nodes depends on their connectivity. It is widely believed that complex networks are either disassortative (links between hubs are systematically suppressed) or assortative (links between hubs are enhanced). In this paper, we analyze a variety of biological networks and find that they generally show a dichotomous degree correlation. We find that many properties of biological networks can be explained by this dichotomy in degree correlation, including the neighborhood connectivity, the sickle-shaped clustering coefficient distribution and the modularity structure. This dichotomy distinguishes biological networks from real disassortative networks or assortative networks such as the Internet and social networks. We suggest that the modular structure of networks accounts for the dichotomy in degree correlation and vice versa, shedding light on the source of modularity in biological networks. We further show that a robust and well connected network necessitates the dichotomy of degree correlation, suggestive of an evolutionary motivation for its existence. Finally, we suggest that a dichotomous degree correlation favors a centrally connected modular network, by which the integrity of the network and the specificity of modules might be reconciled. PMID:22164269
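
    The degree-correlation property under discussion is directly measurable: NetworkX's degree assortativity coefficient is negative for disassortative graphs and positive for assortative ones. The snippet below uses a scale-free graph as a stand-in for a biological network.

        # Degree assortativity of a Barabasi-Albert graph (near zero or
        # slightly negative for finite sizes).
        import networkx as nx

        G = nx.barabasi_albert_graph(1000, 3, seed=42)
        print(nx.degree_assortativity_coefficient(G))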

  10. Pressure Measurement Systems

    NASA Astrophysics Data System (ADS)

    1990-01-01

    System 8400 is an advanced system for measurement of gas and liquid pressure, along with a variety of other parameters, including voltage, frequency and digital inputs. System 8400 offers exceptionally high-speed data acquisition through parallel processing, and its modular design allows expansion from a relatively inexpensive entry-level system by the addition of modular Input Units that can be installed or removed in minutes. Douglas Juanarena was on the team of engineers that developed a new technology known as ESP (electronically scanned pressure). The Langley ESP measurement system was based on miniature integrated-circuit pressure-sensing transducers that communicated pressure information to a minicomputer. In 1977, Juanarena formed PSI to exploit the NASA technology. In 1978 he left Langley, obtained a NASA license for the technology, and introduced the first commercial product, the 780B pressure measurement system. PSI later developed a pressure scanner for automation of industrial processes. Now in its second design generation, the DPT-6400 is capable of making 2,000 measurements a second and can be expanded to 64 channels by the addition of slave units. The new System 8400 represents PSI's bid to further exploit the $600 million U.S. industrial pressure measurement market. It is geared to provide a turnkey solution to physical measurement.

  11. Development of an optical character recognition pipeline for handwritten form fields from an electronic health record.

    PubMed

    Rasmussen, Luke V; Peissig, Peggy L; McCarty, Catherine A; Starren, Justin

    2012-06-01

    Although the penetration of electronic health records is increasing rapidly, much of the historical medical record is only available in handwritten notes and forms, which require labor-intensive, human chart abstraction for some clinical research. The few previous studies on automated extraction of data from these handwritten notes have focused on monolithic, custom-developed recognition systems or third-party systems that require proprietary forms. We present an optical character recognition processing pipeline, which leverages the capabilities of existing third-party optical character recognition engines, and provides the flexibility offered by a modular custom-developed system. The system was configured and run on a selected set of form fields extracted from a corpus of handwritten ophthalmology forms. The processing pipeline allowed multiple configurations to be run, with the optimal configuration consisting of the Nuance and LEADTOOLS engines running in parallel with a positive predictive value of 94.6% and a sensitivity of 13.5%. While limitations exist, preliminary experience from this project yielded insights on the generalizability and applicability of integrating multiple, inexpensive general-purpose third-party optical character recognition engines in a modular pipeline.
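
    The winning configuration has a simple shape: run both engines in parallel and accept a field only when they agree, which is how consensus voting trades sensitivity for positive predictive value. The engine callables below are stubs; Nuance and LEADTOOLS are commercial SDKs that would be wrapped behind this kind of interface.

        # Two-engine consensus OCR sketch; engines are stubs, not real SDK calls.
        from concurrent.futures import ThreadPoolExecutor

        def engine_a(image_bytes):  # stub for a Nuance wrapper
            return "42"

        def engine_b(image_bytes):  # stub for a LEADTOOLS wrapper
            return "42"

        def recognize(image_bytes):
            with ThreadPoolExecutor(max_workers=2) as pool:
                fa = pool.submit(engine_a, image_bytes)
                fb = pool.submit(engine_b, image_bytes)
                ra, rb = fa.result(), fb.result()
            # Accept only on agreement; otherwise flag for human review.
            return ra if ra == rb else None

        print(recognize(b"fake-image-bytes"))  # -> '42'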

  12. Development of an optical character recognition pipeline for handwritten form fields from an electronic health record

    PubMed Central

    Peissig, Peggy L; McCarty, Catherine A; Starren, Justin

    2011-01-01

    Background: Although the penetration of electronic health records is increasing rapidly, much of the historical medical record is only available in handwritten notes and forms, which require labor-intensive, human chart abstraction for some clinical research. The few previous studies on automated extraction of data from these handwritten notes have focused on monolithic, custom-developed recognition systems or third-party systems that require proprietary forms. Methods: We present an optical character recognition processing pipeline, which leverages the capabilities of existing third-party optical character recognition engines, and provides the flexibility offered by a modular custom-developed system. The system was configured and run on a selected set of form fields extracted from a corpus of handwritten ophthalmology forms. Observations: The processing pipeline allowed multiple configurations to be run, with the optimal configuration consisting of the Nuance and LEADTOOLS engines running in parallel with a positive predictive value of 94.6% and a sensitivity of 13.5%. Discussion: While limitations exist, preliminary experience from this project yielded insights on the generalizability and applicability of integrating multiple, inexpensive general-purpose third-party optical character recognition engines in a modular pipeline. PMID:21890871

  13. Establishment of a fully automated microtiter plate-based system for suspension cell culture and its application for enhanced process optimization.

    PubMed

    Markert, Sven; Joeris, Klaus

    2017-01-01

    We developed an automated microtiter plate (MTP)-based system for suspension cell culture to meet the increased demands for miniaturized high throughput applications in biopharmaceutical process development. The generic system is based on off-the-shelf commercial laboratory automation equipment and is able to utilize MTPs of different configurations (6-24 wells per plate) in orbital shaken mode. The shaking conditions were optimized by Computational Fluid Dynamics simulations. The fully automated system handles plate transport, seeding and feeding of cells, daily sampling, and preparation of analytical assays. The integration of all required analytical instrumentation into the system enables a hands-off operation which prevents bottlenecks in sample processing. The modular set-up makes the system flexible and adaptable for a continuous extension of analytical parameters and add-on components. The system proved suitable as a screening tool for process development by verifying the comparability of results for the MTP-based system and bioreactors regarding profiles of viable cell density, lactate, and product concentration of CHO cell lines. These studies confirmed that 6 well MTPs as well as 24 deepwell MTPs were predictive for a scale up to a 1000 L stirred tank reactor (scale factor 1:200,000). Applying the established cell culture system for automated media blend screening in late stage development, a 22% increase in product yield was achieved in comparison to the reference process. The predicted product increase was subsequently confirmed in 2 L bioreactors. Thus, we demonstrated the feasibility of the automated MTP-based cell culture system for enhanced screening and optimization applications in process development and identified further application areas such as process robustness. The system offers a great potential to accelerate time-to-market for new biopharmaceuticals. Biotechnol. Bioeng. 2017;114: 113-121. © 2016 Wiley Periodicals, Inc.

  14. Behavior-based vs. system-based training and displays for automated vertical guidance

    DOT National Transportation Integrated Search

    1997-04-01

    Aircraft automation, particularly the automation surrounding vertical navigation has been cited as an area of training difficulty and a source of confusion during operation. A number of incidents and accidents have been attributed to a lack of crew u...

  15. Configuration of electro-optic fire source detection system

    NASA Astrophysics Data System (ADS)

    Fabian, Ram Z.; Steiner, Zeev; Hofman, Nir

    2007-04-01

    The recent fighting in various parts of the world has highlighted the need for accurate fire source detection on the one hand and fast "sensor to shooter" cycle capabilities on the other. Both needs can be met by the SPOTLITE system, which dramatically enhances the capability to rapidly engage a hostile fire source with a minimum of casualties to friendly forces and innocent bystanders. The modular system design makes it possible to meet each customer's specific requirements and provides excellent future growth and upgrade potential. The design and build of a fire source detection system are governed by sets of requirements issued by the operators, which can be translated into the following design criteria: I) long-range, fast, and accurate fire source detection; II) detection and classification of different threats; III) threat investigation; IV) distribution of fire source data (location, direction, video image, voice); V) man-portability. To meet these design criteria, an optimized concept was developed and exercised for the SPOTLITE system, with three major modular components: I) the Electro-Optical Unit, including the FLIR camera, CCD camera, laser range finder, and marker; II) the Electronic Unit, including the system computer and electronics; and III) the Controller Station Unit, including the system's HMI. This article discusses the definition and optimization of the system's components and also shows how the SPOTLITE designers introduced effective solutions for other system parameters.

  16. Modular Low-Heater-Power Cathode/Electron Gun Assembly for Microwave and Millimeter Wave Traveling Wave Tubes

    NASA Technical Reports Server (NTRS)

    Wintucky, Edwin G.

    2000-01-01

    A low-cost, low-mass, electrically efficient, modular cathode/electron gun assembly has been developed by FDE Inc. of Beaverton, Oregon, under a Small Business Innovation Research (SBIR) contract with the NASA Glenn Research Center at Lewis Field. This new assembly offers significant improvements in the design and manufacture of microwave and millimeter wave traveling-wave tubes (TWT's) used for radar and communications. It incorporates a novel, low-heater-power, reduced size and mass, high-performance barium dispenser type thermionic cathode and provides for easy integration of the cathode into a large variety of conventional TWT circuits. Among the applications are TWT's for Earth-orbiting communication satellites and for deep space communications, where future missions will require smaller spacecraft, higher data transfer rates (higher frequencies and radiofrequency output power), and greater electrical efficiency. A particularly important TWT application is in the microwave power module (a hybrid microwave/millimeter wave amplifier consisting of a low-noise solid-state driver, a small TWT, and an electronic power conditioner integrated into a single compact package), where electrical efficiency and thermal loading are critical factors and lower cost is needed for successful commercialization. The design and fabrication are based on practices used in producing cathode ray tubes (CRT's), one of the most competitive and efficient manufacturing operations in the world today. The approach used in the design and manufacture of thermionic cathodes and electron guns for CRT's has been optimized for fully automated production, standardization of parts, and minimization of costs. It is applicable to the production of similar components for microwave tubes, with the additional benefits of low mass and significantly lower cathode heater power (less than half that of dispenser cathodes presently used in TWT's). The modular cathode/electron gun assembly consists of four subassemblies: the cathode, the focus electrode, the header (including the electrical feedthroughs), and the gun envelope (including the anode). The modular construction offers a number of significant advantages, including flexibility of design, interchangeability of parts, and a drop-in final assembly procedure for quick and accurate alignment. The gun can accommodate cathodes ranging from 0.050 to 0.250 in. in diameter and is applicable to TWT's over a broad range of sizes and operating parameters, requiring the substitution of only a few parts: the cathode, focus electrode, and anode. The die-pressed cathode pellets can be made with either flat or concave (Pierce gun design) emitting surfaces. The gun can be either gridded (pulse operation) or ungridded (continuous operation). Important factors contributing to low cost are the greater use of CRT materials and parts, the standardization of processes (welding and mechanical capture), and tooling amenable to automated production. Examples are the use of simple shapes, drawn or stamped metal parts, and parts joined by welding or mechanical capture. Feasibility was successfully demonstrated in the retrofit and testing of a commercial Ka-band (22-GHz) TWT. The modular cathode/electron gun assembly was computer modeled to replicate the performance of the original electron gun and fabricated largely from existing CRT parts. Significant test results included demonstration of low heater power (1.5 W at a 1010 °C brightness temperature for a 0.085-in.-diameter cathode), mechanical ruggedness (100-g shock and vibration tests in accordance with military specifications (MIL specs)), and a very fast warmup. The results of these tests indicate that the low-cost CRT manufacturing approach can be used without sacrificing performance and reliability.

  17. Preliminary Framework for Human-Automation Collaboration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oxstrand, Johanna Helene; Le Blanc, Katya Lee; Spielman, Zachary Alexander

    The Department of Energy’s Advanced Reactor Technologies Program sponsors research, development and deployment activities through its Next Generation Nuclear Plant, Advanced Reactor Concepts, and Advanced Small Modular Reactor (aSMR) Programs to promote safety, technical, economic, and environmental advancements of innovative Generation IV nuclear energy technologies. The Human Automation Collaboration (HAC) Research Project is located under the aSMR Program, which identifies developing advanced instrumentation and controls and human-machine interfaces as one of four key research areas. It is expected that the new nuclear power plant designs will employ technology significantly more advanced than the analog systems in the existing reactor fleet, as well as utilizing automation to a greater extent. Moving towards more advanced technology and more automation does not necessarily imply more efficient and safer operation of the plant. Instead, a number of concerns about how these technologies will affect human performance and the overall safety of the plant need to be addressed. More specifically, it is important to investigate how the operator and the automation work as a team to ensure effective and safe plant operation, also known as human-automation collaboration (HAC). The focus of the HAC research is to understand how various characteristics of automation (such as its reliability, processes, and modes) affect an operator’s use and awareness of plant conditions. In other words, the research team investigates how to best design the collaboration between the operators and the automated systems in a manner that has the greatest positive impact on overall plant performance and reliability. This report addresses the Department of Energy milestone M4AT-15IN2302054, Complete Preliminary Framework for Human-Automation Collaboration, by discussing the two-phased development of a preliminary HAC framework. The framework developed in the first phase was used as the basis for selecting topics to be investigated in more detail. The results and insights gained from the in-depth studies conducted during the second phase were used to revise the framework. This report describes the basis for the framework developed in phase 1, the changes made to the framework in phase 2, and the basis for the changes. Additional research needs are identified and presented in the last section of the report.

  18. Note: Tormenta: An open source Python-powered control software for camera based optical microscopy.

    PubMed

    Barabas, Federico M; Masullo, Luciano A; Stefani, Fernando D

    2016-12-01

    Until recently, PC control and synchronization of scientific instruments were only possible through closed-source, expensive frameworks like National Instruments' LabVIEW. Nowadays, efficient cost-free alternatives are available in the context of a continuously growing community of open-source software developers. Here, we report on Tormenta, a modular open-source software for the control of camera-based optical microscopes. Tormenta is built on Python, works on multiple operating systems, and includes some key features for fluorescence nanoscopy based on single molecule localization.

  19. Note: Tormenta: An open source Python-powered control software for camera based optical microscopy

    NASA Astrophysics Data System (ADS)

    Barabas, Federico M.; Masullo, Luciano A.; Stefani, Fernando D.

    2016-12-01

    Until recently, PC control and synchronization of scientific instruments were only possible through closed-source, expensive frameworks like National Instruments' LabVIEW. Nowadays, efficient cost-free alternatives are available in the context of a continuously growing community of open-source software developers. Here, we report on Tormenta, a modular open-source software for the control of camera-based optical microscopes. Tormenta is built on Python, works on multiple operating systems, and includes some key features for fluorescence nanoscopy based on single molecule localization.
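
    As an illustration of the modular control pattern such packages use (a sketch only; these class and method names are not Tormenta's actual API), hardware-facing modules can hide each camera behind a minimal common interface so the acquisition logic stays device-agnostic:

      # Illustrative only: not Tormenta's real API.
      import random
      from abc import ABC, abstractmethod

      class Camera(ABC):
          """Minimal interface each camera module implements."""
          @abstractmethod
          def set_exposure(self, seconds: float) -> None: ...
          @abstractmethod
          def grab_frame(self) -> list[list[float]]: ...

      class SimulatedCamera(Camera):
          def set_exposure(self, seconds: float) -> None:
              self.exposure = seconds
          def grab_frame(self) -> list[list[float]]:
              return [[random.random() for _ in range(16)] for _ in range(16)]

      def acquire(camera: Camera, n_frames: int, exposure: float):
          """Device-agnostic acquisition loop used by the control layer."""
          camera.set_exposure(exposure)
          return [camera.grab_frame() for _ in range(n_frames)]

      frames = acquire(SimulatedCamera(), n_frames=10, exposure=0.05)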

  20. Multisensor-integrated organs-on-chips platform for automated and continual in situ monitoring of organoid behaviors

    PubMed Central

    Zhang, Yu Shrike; Aleman, Julio; Shin, Su Ryon; Kim, Duckjin; Mousavi Shaegh, Seyed Ali; Massa, Solange; Riahi, Reza; Chae, Sukyoung; Hu, Ning; Avci, Huseyin; Zhang, Weijia; Silvestri, Antonia; Sanati Nezhad, Amir; Manbohi, Ahmad; De Ferrari, Fabio; Polini, Alessandro; Calzone, Giovanni; Shaikh, Noor; Alerasool, Parissa; Budina, Erica; Kang, Jian; Bhise, Nupura; Pourmand, Adel; Skardal, Aleksander; Shupe, Thomas; Bishop, Colin E.; Dokmeci, Mehmet Remzi; Atala, Anthony; Khademhosseini, Ali

    2017-01-01

    Organ-on-a-chip systems are miniaturized microfluidic 3D human tissue and organ models designed to recapitulate the important biological and physiological parameters of their in vivo counterparts. They have recently emerged as a viable platform for personalized medicine and drug screening. These in vitro models, featuring biomimetic compositions, architectures, and functions, are expected to replace the conventional planar, static cell cultures and bridge the gap between the currently used preclinical animal models and the human body. Multiple organoid models may be further connected together through the microfluidics in a similar manner in which they are arranged in vivo, providing the capability to analyze multiorgan interactions. Although a wide variety of human organ-on-a-chip models have been created, there are limited efforts on the integration of multisensor systems. However, in situ continual measuring is critical in precise assessment of the microenvironment parameters and the dynamic responses of the organs to pharmaceutical compounds over extended periods of time. In addition, automated and noninvasive capability is strongly desired for long-term monitoring. Here, we report a fully integrated modular physical, biochemical, and optical sensing platform through a fluidics-routing breadboard, which operates organ-on-a-chip units in a continual, dynamic, and automated manner. We believe that this platform technology has paved a potential avenue to promote the performance of current organ-on-a-chip models in drug screening by integrating a multitude of real-time sensors to achieve automated in situ monitoring of biophysical and biochemical parameters. PMID:28265064

  1. Nomenclature in laboratory robotics and automation (IUPAC Recommendation 1994)

    PubMed Central

    (Skip) Kingston, H. M.; Kingston, M. L.

    1994-01-01

    These recommended terms have been prepared to help provide a uniform approach to terminology and notation in laboratory automation and robotics. Since the terminology used in laboratory automation and robotics has been derived from diverse backgrounds, it is often vague, imprecise, and in some cases, in conflict with classical automation and robotic nomenclature. These definitions have been assembled from standards, monographs, dictionaries, journal articles, and documents of international organizations emphasizing laboratory and industrial automation and robotics. When appropriate, definitions have been taken directly from the original source and identified with that source. However, in some cases no acceptable definition could be found and a new definition was prepared to define the object, term, or action. Attention has been given to defining specific robot types, coordinate systems, parameters, attributes, communication protocols and associated workstations and hardware. Diagrams are included to illustrate specific concepts that can best be understood by visualization. PMID:18924684

  2. Open-Source Tools for Enhancing Full-Text Searching of OPACs: Use of Koha, Greenstone and Fedora

    ERIC Educational Resources Information Center

    Anuradha, K. T.; Sivakaminathan, R.; Kumar, P. Arun

    2011-01-01

    Purpose: There are many library automation packages available as open-source software, comprising two modules: staff-client module and online public access catalogue (OPAC). Although the OPAC of these library automation packages provides advanced features of searching and retrieval of bibliographic records, none of them facilitate full-text…

  3. Navigation and Electro-Optic Sensor Integration Technology for Fusion of Imagery and Digital Mapping Products

    DTIC Science & Technology

    1999-08-01

    Electro-Optic Sensor Integration Technology (NEOSIT) software application. The design is highly modular and based on COTS tools to facilitate integration with sensors, navigation and digital data sources already installed on different host

  4. An automated dose tracking system for adaptive radiation therapy.

    PubMed

    Liu, Chang; Kim, Jinkoo; Kumarasiri, Akila; Mayyas, Essa; Brown, Stephen L; Wen, Ning; Siddiqui, Farzan; Chetty, Indrin J

    2018-02-01

    The implementation of adaptive radiation therapy (ART) into routine clinical practice is technically challenging and requires significant resources to perform and validate each process step. The objective of this report is to identify the key components of ART, to illustrate how a specific automated procedure improves efficiency, and to facilitate the routine clinical application of ART. Patient image data were exported from a clinical database and converted to an intermediate format for point-wise dose tracking and accumulation. The process was automated using in-house developed software containing three modularized components: an ART engine, user interactive tools, and integration tools. The ART engine conducts computing tasks using the following modules: data importing, image pre-processing, dose mapping, dose accumulation, and reporting. In addition, custom graphical user interfaces (GUIs) were developed to allow user interaction with select processes such as deformable image registration (DIR). A commercial scripting application programming interface was used to incorporate automated dose calculation for application in routine treatment planning. Each module was considered an independent program, written in C++ or C#, running in a distributed Windows environment, scheduled and monitored by integration tools. The automated tracking system was retrospectively evaluated for 20 patients with prostate cancer and 96 patients with head and neck cancer, under institutional review board (IRB) approval. In addition, the system was evaluated prospectively using 4 patients with head and neck cancer. Altogether, 780 prostate dose fractions and 2586 head and neck cancer dose fractions were processed, including DIR and dose mapping. On average, the daily cumulative dose was computed in 3 h, and the manual work was limited to 13 min per case, with approximately 10% of cases requiring an additional 10 min for image registration refinement. An efficient and convenient dose tracking system for ART in the clinical setting is presented. The software and automated processes were rigorously evaluated and validated using patient image datasets. Automation of the various procedures has improved efficiency significantly, allowing for the routine clinical application of ART for improving radiation therapy effectiveness. Copyright © 2017 Elsevier B.V. All rights reserved.
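
    The report's modular architecture, with independent module programs scheduled and monitored by integration tools, maps onto a simple run-and-monitor orchestration loop. The sketch below assumes hypothetical executable names; it illustrates only the pattern, not the authors' actual software:

      # Module names are placeholders for the in-house C++/C# programs.
      import subprocess

      MODULES = [
          ["import_data", "--patient", "case001"],
          ["preprocess", "--patient", "case001"],
          ["dose_mapping", "--patient", "case001"],      # includes DIR
          ["dose_accumulation", "--patient", "case001"],
          ["report", "--patient", "case001"],
      ]

      def run_pipeline(modules: list[list[str]]) -> bool:
          """Run each stage in order; stop and flag the case on failure."""
          for cmd in modules:
              result = subprocess.run(cmd, capture_output=True, text=True)
              if result.returncode != 0:
                  print(f"{cmd[0]} failed: flag for manual review (e.g. DIR refinement)")
                  return False
          return True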

  5. Shape optimization of three-dimensional stamped and solid automotive components

    NASA Technical Reports Server (NTRS)

    Botkin, M. E.; Yang, R.-J.; Bennett, J. A.

    1987-01-01

    The shape optimization of realistic, 3-D automotive components is discussed. The integration of the major parts of the total process (modeling, mesh generation, finite element and sensitivity analysis, and optimization) is stressed. Stamped components and solid components are treated separately. For stamped parts a highly automated capability was developed. The problem description is based upon a parameterized boundary design element concept for the definition of the geometry. Automatic triangulation and adaptive mesh refinement are used to provide an automated analysis capability which requires only boundary data and takes into account sensitivity of the solution accuracy to boundary shape. For solid components a general extension of the 2-D boundary design element concept has not been achieved. In this case, the parameterized surface shape is provided using a generic modeling concept based upon isoparametric mapping patches which also serves as the mesh generator. Emphasis is placed upon the coupling of optimization with a commercially available finite element program. To do this it is necessary to modularize the program architecture and obtain shape design sensitivities using the material derivative approach so that only boundary solution data is needed.

  6. A Modular Hierarchical Approach to 3D Electron Microscopy Image Segmentation

    PubMed Central

    Liu, Ting; Jones, Cory; Seyedhosseini, Mojtaba; Tasdizen, Tolga

    2014-01-01

    The study of neural circuit reconstruction, i.e., connectomics, is a challenging problem in neuroscience. Automated and semi-automated electron microscopy (EM) image analysis can be tremendously helpful for connectomics research. In this paper, we propose a fully automatic approach for intra-section segmentation and inter-section reconstruction of neurons using EM images. A hierarchical merge tree structure is built to represent multiple region hypotheses and supervised classification techniques are used to evaluate their potentials, based on which we resolve the merge tree with consistency constraints to acquire final intra-section segmentation. Then, we use a supervised learning based linking procedure for the inter-section neuron reconstruction. Also, we develop a semi-automatic method that utilizes the intermediate outputs of our automatic algorithm and achieves intra-segmentation with minimal user intervention. The experimental results show that our automatic method can achieve close-to-human intra-segmentation accuracy and state-of-the-art inter-section reconstruction accuracy. We also show that our semi-automatic method can further improve the intra-segmentation accuracy. PMID:24491638

  7. Tunable, Quantitative Fenton-RAFT Polymerization via Metered Reagent Addition.

    PubMed

    Nothling, Mitchell D; McKenzie, Thomas G; Reyhani, Amin; Qiao, Greg G

    2018-05-10

    A continuous supply of radical species is a key requirement for activating chain growth and accessing quantitative monomer conversions in reversible addition-fragmentation chain transfer (RAFT) polymerization. In Fenton-RAFT, activation is provided by hydroxyl radicals, whose indiscriminate reactivity and short-lived nature poses a challenge to accessing extended polymerization times and quantitative monomer conversions. Here, an alternative Fenton-RAFT procedure is presented, whereby radical generation can be finely controlled via metered dosing of a component of the Fenton redox reaction (H2O2) using an external pumping system. By limiting the instantaneous flux of radicals and ensuring sustained radical generation over tunable time periods, metered reagent addition reduces unwanted radical "wasting" reactions and provides access to consistent quantitative monomer conversions with high chain-end fidelity. Fine tuning of radical concentration during polymerization is achieved simply via adjustment of reagent dose rate, offering significant potential for automation. This modular strategy holds promise for extending traditional RAFT initiation toward more tightly regulated radical concentration profiles and affords excellent prospects for the automation of Fenton-RAFT polymerization. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
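
    The metered-addition idea reduces to delivering the oxidant in small, evenly spaced aliquots rather than as a single bolus. A minimal sketch, assuming a stand-in SyringePump class for whatever interface the external pumping system exposes:

      # SyringePump is a placeholder for the actual pump interface.
      import time

      class SyringePump:
          def infuse(self, volume_ul: float) -> None:
              print(f"dispensing {volume_ul:.2f} uL of H2O2")

      def meter_dose(pump: SyringePump, total_ul: float,
                     duration_s: float, interval_s: float = 60.0) -> None:
          """Deliver total_ul evenly over duration_s to cap the radical flux."""
          steps = max(1, int(duration_s / interval_s))
          aliquot = total_ul / steps
          for _ in range(steps):
              pump.infuse(aliquot)     # small instantaneous dose...
              time.sleep(interval_s)   # ...sustained over the polymerization window

      # e.g. meter_dose(SyringePump(), total_ul=500.0, duration_s=3600.0)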

  8. A modular microfluidic platform for the synthesis of biopolymeric nanoparticles entrapping organic actives

    NASA Astrophysics Data System (ADS)

    Chronopoulou, Laura; Sparago, Carolina; Palocci, Cleofe

    2014-11-01

    Using a novel and versatile capillary microfluidic flow-focusing device, we fabricated monodisperse drug-loaded nanoparticles from biodegradable polymers. A model amphiphilic drug (dexamethasone) was incorporated within the biodegradable matrix of the particles. The influence of flow rate ratio, polymer concentration, and microreactor-focusing channel dimensions on nanoparticles' size and drug loading has been investigated. The microfluidic approach resulted in the production of colloidal polymeric nanoparticles with a narrow size distribution (diameters ranging between 35 and 350 nm) and useful morphological characteristics. This technique allows the fast, low-cost, easy, and automated synthesis of polymeric nanoparticles; therefore, it may become a useful approach in the progression from laboratory-scale to pilot-line-scale processes.

  9. Software Design for Interactive Graphic Radiation Treatment Simulation Systems*

    PubMed Central

    Kalet, Ira J.; Sweeney, Christine; Jacky, Jonathan

    1990-01-01

    We examine issues in the design of interactive computer graphic simulation programs for radiation treatment planning (RTP), as well as expert system programs that automate parts of the RTP process, in light of ten years of experience at designing, building and using such programs. An experiment in object-oriented design using standard Pascal shows that while some advantage is gained from the design, it is still difficult to achieve modularity and to integrate expert system components. A new design based on the Common LISP Object System (CLOS) is described. This series of designs for RTP software shows that this application benefits in specific ways from object-oriented design methods and appropriate languages and tools.

  10. Software engineering aspects of real-time programming concepts

    NASA Astrophysics Data System (ADS)

    Schoitsch, Erwin

    1986-08-01

    Real-time programming is a discipline of great importance not only in process control, but also in fields like communication, office automation, interactive databases, interactive graphics and operating systems development. General concepts of concurrent programming and constructs for process-synchronization are discussed in detail. Tasking and synchronization concepts, methods of process communication, interrupt and timeout handling in systems based on semaphores, signals, conditional critical regions or on real-time languages like Concurrent PASCAL, MODULA, CHILL and ADA are explained and compared with each other. The second part deals with structuring and modularization of technical processes to build reliable and maintainable real time systems. Software-quality and software engineering aspects are considered throughout the paper.
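
    As a concrete instance of the synchronization constructs the paper surveys, here is a compact bounded-buffer producer/consumer in Python, with counting semaphores guarding slot availability and a mutex protecting the critical region:

      import collections
      import threading

      MAX_SLOTS = 5
      buffer = collections.deque()
      empty = threading.Semaphore(MAX_SLOTS)  # counts free slots
      full = threading.Semaphore(0)           # counts filled slots
      mutex = threading.Lock()                # critical region around the buffer

      def producer():
          for item in range(10):
              empty.acquire()                 # wait for a free slot
              with mutex:
                  buffer.append(item)
              full.release()                  # signal a filled slot

      def consumer():
          for _ in range(10):
              full.acquire()                  # wait for a filled slot
              with mutex:
                  item = buffer.popleft()
              empty.release()                 # signal a free slot
              print("consumed", item)

      threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
      for t in threads:
          t.start()
      for t in threads:
          t.join()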

  11. The Flight Telerobotic Servicer (FTS) - A focus for automation and robotics on the Space Station

    NASA Technical Reports Server (NTRS)

    Hinkal, Sanford W.; Andary, James F.; Watzin, James G.; Provost, David E.

    1987-01-01

    The concept, fundamental design principles, and capabilities of the FTS, a multipurpose telerobotic system for use on the Space Station and Space Shuttle, are discussed. The FTS is intended to assist the crew in the performance of extravehicular tasks; the telerobot will also be used on the Orbital Maneuvering Vehicle to service free-flyer spacecraft. The FTS will be capable of both teleoperation and autonomous operation; eventually it may also utilize ground control. By careful selection of the functional architecture and a modular approach to the hardware and software design, the FTS can accept developments in artificial intelligence and newer, more advanced sensors, such as machine vision and collision avoidance.

  12. Methods and tools for profiling and control of distributed systems

    NASA Astrophysics Data System (ADS)

    Sukharev, R.; Lukyanchikov, O.; Nikulchev, E.; Biryukov, D.; Ryadchikov, I.

    2018-02-01

    This article is devoted to the profiling and control of distributed systems. Distributed systems have a complex architecture: applications are distributed among various computing nodes, and many network operations are performed. Therefore, it is important today to develop methods and tools for profiling distributed systems. The article analyzes and standardizes methods for profiling distributed systems that focus on simulation to conduct experiments and build a graph model of the system. The theory of queueing networks is used for simulation modeling of distributed systems that receive and process user requests. To automate this profiling method, a software application with a modular structure, similar to a SCADA system, was developed.
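
    To make the queueing-network approach concrete, the sketch below simulates a single M/M/1 node (Poisson arrivals, exponential service), the elementary building block from which network models of request processing are composed. For an arrival rate of 0.8 and a service rate of 1.0, the simulated mean sojourn time should approach the theoretical value 1/(mu - lambda) = 5.

      import random

      def mm1_mean_sojourn(arrival_rate: float, service_rate: float,
                           n_jobs: int = 100_000) -> float:
          """Discrete-event simulation of an M/M/1 queue's mean sojourn time."""
          t_arrival = server_free_at = total = 0.0
          for _ in range(n_jobs):
              t_arrival += random.expovariate(arrival_rate)   # Poisson arrivals
              start = max(t_arrival, server_free_at)          # wait if server busy
              server_free_at = start + random.expovariate(service_rate)
              total += server_free_at - t_arrival             # wait + service time
          return total / n_jobs

      print(mm1_mean_sojourn(0.8, 1.0))  # expect roughly 5.0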

  13. Comparative analysis of chemical similarity methods for modular natural products with a hypothetical structure enumeration algorithm.

    PubMed

    Skinnider, Michael A; Dejong, Chris A; Franczak, Brian C; McNicholas, Paul D; Magarvey, Nathan A

    2017-08-16

    Natural products represent a prominent source of pharmaceutically and industrially important agents. Calculating the chemical similarity of two molecules is a central task in cheminformatics, with applications at multiple stages of the drug discovery pipeline. Quantifying the similarity of natural products is a particularly important problem, as the biological activities of these molecules have been extensively optimized by natural selection. The large and structurally complex scaffolds of natural products distinguish their physical and chemical properties from those of synthetic compounds. However, no analysis of the performance of existing methods for molecular similarity calculation specific to natural products has been reported to date. Here, we present LEMONS, an algorithm for the enumeration of hypothetical modular natural product structures. We leverage this algorithm to conduct a comparative analysis of molecular similarity methods within the unique chemical space occupied by modular natural products using controlled synthetic data, and comprehensively investigate the impact of diverse biosynthetic parameters on similarity search. We additionally investigate a recently described algorithm for natural product retrobiosynthesis and alignment, and find that when rule-based retrobiosynthesis can be applied, this approach outperforms conventional two-dimensional fingerprints, suggesting it may represent a valuable approach for the targeted exploration of natural product chemical space and microbial genome mining. Our open-source algorithm is an extensible method of enumerating hypothetical natural product structures with diverse potential applications in bioinformatics.
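
    For readers unfamiliar with the fingerprint baseline used in the comparison, the Tanimoto coefficient over the set bits of two fingerprints is the standard two-dimensional similarity measure. A minimal illustration with toy bit sets (a real pipeline would derive the fingerprints from structures with a cheminformatics toolkit):

      def tanimoto(fp_a: set[int], fp_b: set[int]) -> float:
          """Tanimoto = |A intersect B| / |A union B| over fingerprint bits."""
          if not fp_a and not fp_b:
              return 1.0
          return len(fp_a & fp_b) / len(fp_a | fp_b)

      # Toy example: two hypothetical modular natural products sharing most bits.
      print(tanimoto({1, 4, 9, 23, 42}, {1, 4, 9, 23, 77}))  # 4/6, about 0.67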

  14. A modular Human Exposure Model (HEM) framework to ...

    EPA Pesticide Factsheets

    Life Cycle Impact Analysis (LCIA) has proven to be a valuable tool for systematically comparing processes and products, and has been proposed for use in Chemical Alternatives Analysis (CAA). The exposure assessment portion of the human health impact scores of LCIA has historically focused on far-field sources (environmentally mediated exposures), while research has shown that use-related exposures (near-field exposures) typically dominate population exposure. Characterizing the human health impacts of chemicals in consumer products over the life cycle of these products requires an evaluation of both near-field and far-field sources. Assessing the impacts of the near-field exposures requires bridging the scientific and technical gaps that currently prevent the harmonious use of the best available methods and tools from the fields of LCIA and human health exposure and risk assessment. The U.S. EPA’s Chemical Safety and Sustainability LC-HEM project is developing the Human Exposure Model (HEM) to assess near-field exposures to chemicals that occur to various populations over the life cycle of a commercial product. The HEM will be a publicly available, web-based, modular system which will allow for the evaluation of chemical/product impacts in a LCIA framework to support CAA. We present here an overview of the framework for the modular HEM system. The framework includes a data flow diagram of in-progress and future planned modules, the definition of each mod

  15. Advances in Automation Prompt Concern over Increased U.S. Unemployment.

    ERIC Educational Resources Information Center

    General Accounting Office, Washington, DC.

    The General Accounting Office recently studied automation, especially the advent of microelectronics, and its impact on unemployment. The study included identifying available information sources and obtaining opinions on the impact of automation on employment, federal efforts to predict its impact, the dissemination of information about the job…

  16. The Use of Office Automation by Managers: A Survey.

    ERIC Educational Resources Information Center

    Fleischer, Mitchell; Morell, Jonathan A.

    1988-01-01

    Describes a survey that examined office automation use by managers and the impact on managerial roles. The factors discussed include the impact on decision making, changes in work activities, sources of support, training, and different uses between managerial ranks. Recommendations are offered for improving use of office automation. (13…

  17. Automated microfluidic platform for systematic studies of colloidal perovskite nanocrystals: towards continuous nano-manufacturing.

    PubMed

    Epps, Robert W; Felton, Kobi C; Coley, Connor W; Abolhasani, Milad

    2017-11-21

    Colloidal organic/inorganic metal-halide perovskite nanocrystals have recently emerged as a potential low-cost replacement for the semiconductor materials in commercial photovoltaics and light emitting diodes. However, unlike III-V and IV-VI semiconductor nanocrystals, studies of colloidal perovskite nanocrystals have yet to develop a fundamental and comprehensive understanding of nucleation and growth kinetics. Here, we introduce a modular and automated microfluidic platform for the systematic studies of room-temperature synthesized cesium-lead halide perovskite nanocrystals. With abundant data collection across the entirety of four orders of magnitude reaction time span, we comprehensively characterize nanocrystal growth within a modular microfluidic reactor. The developed high-throughput screening platform features a custom-designed three-port flow cell with translational capability for in situ spectral characterization of the in-flow synthesized perovskite nanocrystals along a tubular microreactor with an adjustable length, ranging from 3 cm to 196 cm. The translational flow cell allows for sampling of twenty unique residence times at a single equilibrated flow rate. The developed technique requires an average total liquid consumption of 20 μL per spectra and as little as 2 μL at the time of sampling. It may continuously sample up to 30 000 unique spectra per day in both single and multi-phase flow formats. Using the developed plug-and-play microfluidic platform, we study the growth of cesium lead trihalide perovskite nanocrystals through in situ monitoring of their absorption and emission band-gaps at residence times ranging from 100 ms to 17 min. The automated microfluidic platform enables a systematic study of the effect of mixing enhancement on the quality of the synthesized nanocrystals through a direct comparison between single- and multi-phase flow systems at similar reaction time scales. The improved mixing characteristics of the multi-phase flow format results in high-quality perovskite nanocrystals with kinetically tunable emission wavelength, ranging as much as 25 nm at equivalent residence times. Further application of this unique platform would allow rapid parameter optimization in the colloidal synthesis of a wide range of nanomaterials (e.g., metal or semiconductor), that is directly transferable to continuous manufacturing in a numbered-up platform with a similar characteristic length scale.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, Yong-Hoon, E-mail: chaotics@snu.ac.kr; Park, Sangrok; Kim, Byong Sup

    Since the first nuclear power was engaged in the Korean electricity grid in 1978, intensive research and development has focused on the localization and standardization of large pressurized water reactors (PWRs), aiming to provide the Korean peninsula and beyond with an economical and safe power source. With increased priority placed on safety since the Chernobyl accident, Korean nuclear power R and D activity has diversified into advanced PWRs, small modular PWRs, and Generation IV reactors. After the outbreak of the Fukushima accident, the inherently safe small modular reactor (SMR) has received growing interest in Korea and Europe. In this paper, we describe the recent status of evolving SMR designs and their advantages and challenges. In particular, the conceptual design of the lead-bismuth cooled SMR in Korea, URANUS, with 40∼70 MWe, is examined in detail. This paper also covers a framework of the program and a strategy for the successful deployment of small modular reactors, what the goals entail, and the approach to collaboration with other entities.

  19. Modular 3D-Printed Soil Gas Probes

    NASA Astrophysics Data System (ADS)

    Good, S. P.; Selker, J. S.; Al-Qqaili, F.; Lopez, M.; Kahel, L.

    2016-12-01

    Extraction of soil gas is required for a variety of applications in earth sciences and environmental engineering. However, commercially available probes can be costly and are typically limited to a single depth. Here, we present the open-source design and lab testing of a soil gas probe with modular capabilities that allow for the vertical stacking of gas extraction points at different depths in the soil column. The probe modules consist of a 3D-printed spacer unit and a hydrophobic gas-permeable membrane made of high-density polyethylene with pore sizes of 20-40 microns. Each of the modular spacer units contains both a gas extraction line and a gas input line for the dilution of soil gases if needed. These 2-inch-diameter probes can be installed in the field quickly with a hand auger and returned to at any frequency to extract soil gas from desired soil depths. The probes are tested through extraction of soil pore water vapors with distinct stable isotope ratios.

  20. Generic HPLC platform for automated enzyme reaction monitoring: Advancing the assay toolbox for transaminases and other PLP-dependent enzymes.

    PubMed

    Börner, Tim; Grey, Carl; Adlercreutz, Patrick

    2016-08-01

    Methods for rapid and direct quantification of enzyme kinetics independent of the substrate are in high demand for both fundamental research and bioprocess development. This study addresses the need for a generic method by developing an automated, standardizable HPLC platform monitoring reaction progress in near real-time. The method was applied to amine transaminase (ATA) catalyzed reactions, intensifying process development for chiral amine synthesis. Autosampler-assisted pipetting facilitates integrated mixing and sampling under controlled temperature. Crude enzyme formulations in high and low substrate concentrations can be employed. Sequential, small (1 µL) sample injections and immediate detection after separation permit fast reaction monitoring with excellent sensitivity, accuracy, and reproducibility. Due to its modular design, different chromatographic techniques, e.g. reverse phase and size exclusion chromatography (SEC), can be employed. A novel assay for pyridoxal 5'-phosphate-dependent enzymes is presented using SEC for direct monitoring of enzyme-bound and free reaction intermediates. Time-resolved changes of the different cofactor states, e.g. pyridoxal 5'-phosphate, pyridoxamine 5'-phosphate, and the internal aldimine, were traced in both half reactions. The combination of the automated HPLC platform with SEC offers a method for substrate-independent screening, which supplies a missing piece in the assay and screening toolbox for ATAs and other PLP-dependent enzymes. Copyright © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Automated Tracking of Motion and Body Weight for Objective Monitoring of Rats in Colony Housing

    PubMed Central

    Brenneis, Christian; Westhof, Andreas; Holschbach, Jeannine; Michaelis, Martin; Guehring, Hans; Kleinschmidt-Doerr, Kerstin

    2017-01-01

    Living together in large social communities within an enriched environment stimulates self-motivated activity in rats. We developed a modular housing system in which a single unit can accommodate as many as 48 rats and contains multiple functional areas. This rat colony cage further allowed us to remotely measure body weight and to continuously measure movement, including jumping and stair walking between areas. Compared with pair-housed, age-, strain-, and weight-matched rats in conventional cages, the colony-housed rats exhibited higher body mass indices, had more exploratory behavior, and were more cooperative during handling. Continuous activity tracking revealed that the amount of spontaneous locomotion, such as jumping between levels and running through the staircase, fell after surgery, blood sampling, injections, and behavioral tests to a similar extent regardless of the specific intervention. Data from the automated system allowed us to identify individual rats with significant differences (>2 SD) from other cohoused rats; these rats showed potential health problems, as verified using conventional health scoring. Thus, our rat colony cage permits social interaction and provides a variety of functional areas, thereby perhaps improving animal wellbeing. Furthermore, automated online tracking enabled continuous quantification of spontaneous motion, potentially providing objective measures of animal behavior in various disease models and reducing the need for experimental manipulation. Finally, health monitoring of individual rats was facilitated in an objective manner. PMID:28905711
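
    The individual-health flagging rule (a deviation of more than 2 SD from cohoused animals) is straightforward to express. A small sketch, with an assumed data layout of daily activity counts per rat:

      # The data layout (rat id -> daily activity count) is an assumption.
      import statistics

      def flag_outliers(daily_counts: dict[str, float],
                        threshold_sd: float = 2.0) -> list[str]:
          """Return rats deviating from the cage mean by > threshold_sd SDs."""
          values = list(daily_counts.values())
          mean = statistics.mean(values)
          sd = statistics.stdev(values)
          return [rat for rat, count in daily_counts.items()
                  if abs(count - mean) > threshold_sd * sd]

      counts = {"rat01": 410, "rat02": 395, "rat03": 388, "rat04": 402,
                "rat05": 415, "rat06": 398, "rat07": 50}
      print(flag_outliers(counts))  # ['rat07']: verify with conventional health scoring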

  2. Electrically evoked compound action potentials artefact rejection by independent component analysis: procedure automation.

    PubMed

    Akhoun, Idrick; McKay, Colette; El-Deredy, Wael

    2015-01-15

    Independent component analysis (ICA) successfully separated electrically evoked compound action potentials (ECAPs) from the stimulation artefact and noise (ECAP-ICA, Akhoun et al., 2013). This paper shows how to automate the ECAP-ICA artefact cancellation process. Raw ECAPs without artefact rejection were consecutively recorded for each stimulation condition from at least 8 intra-cochlear electrodes. Firstly, amplifier-saturated recordings were discarded, and the data from different stimulus conditions (different current levels) were concatenated temporally. The key aspect of the automation procedure was the sequential deductive source categorisation after ICA was applied with a restriction to 4 sources. The stereotypical aspect of the 4 sources enables their automatic classification as two artefact components, a noise component, and the sought ECAP, based on theoretical and empirical considerations. The automatic procedure was tested using 8 cochlear implant (CI) users and one to four stimulus electrodes. The artefact and noise sources were successively identified and discarded, leaving the ECAP as the remaining source. The automated ECAP-ICA procedure successfully extracted the correct ECAPs compared to the standard clinical forward-masking paradigm in 22 out of 26 cases. ECAP-ICA does not require extracting the ECAP from a combination of distinct buffers, as is the case with regular methods. It is an alternative that does not have the possible bias of traditional artefact rejections such as alternate-polarity or forward-masking paradigms. The ECAP-ICA procedure bears clinical relevance, for example as the artefact rejection sub-module of automated ECAP-threshold detection techniques, which are common features of CI clinical fitting software. Copyright © 2014. Published by Elsevier B.V.
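
    The automated flow (temporal concatenation, ICA restricted to four sources, deductive categorisation) can be sketched as follows. Scikit-learn's FastICA stands in for the authors' ICA implementation, and the classification heuristics here are placeholders for the theoretical and empirical criteria described in the paper:

      import numpy as np
      from sklearn.decomposition import FastICA

      def extract_ecap(recordings: list[np.ndarray]) -> np.ndarray:
          """recordings: one (n_samples, n_electrodes) array per stimulus condition."""
          data = np.concatenate(recordings, axis=0)       # temporal concatenation
          ica = FastICA(n_components=4, random_state=0)   # restriction to 4 sources
          sources = ica.fit_transform(data)               # shape (n_samples, 4)

          # Placeholder heuristics for the deductive categorisation step.
          def looks_like_artefact(s):
              return np.max(np.abs(s)) > 10 * np.median(np.abs(s))
          def looks_like_noise(s):
              return np.std(np.diff(s)) > np.std(s)

          candidates = [i for i in range(4)
                        if not looks_like_artefact(sources[:, i])
                        and not looks_like_noise(sources[:, i])]
          return sources[:, candidates[0]]  # sketch assumes one source survives: the ECAP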

  3. Ontology Alignment Repair through Modularization and Confidence-Based Heuristics

    PubMed Central

    Santos, Emanuel; Faria, Daniel; Pesquita, Catia; Couto, Francisco M.

    2015-01-01

    Ontology Matching aims at identifying a set of semantic correspondences, called an alignment, between related ontologies. In recent years, there has been a growing interest in efficient and effective matching methods for large ontologies. However, alignments produced for large ontologies are often logically incoherent. It was only recently that the use of repair techniques to improve the coherence of ontology alignments began to be explored. This paper presents a novel modularization technique for ontology alignment repair which extracts fragments of the input ontologies that only contain the necessary classes and relations to resolve all detectable incoherences. The paper presents also an alignment repair algorithm that uses a global repair strategy to minimize both the degree of incoherence and the number of mappings removed from the alignment, while overcoming the scalability problem by employing the proposed modularization technique. Our evaluation shows that our modularization technique produces significantly small fragments of the ontologies and that our repair algorithm produces more complete alignments than other current alignment repair systems, while obtaining an equivalent degree of incoherence. Additionally, we also present a variant of our repair algorithm that makes use of the confidence values of the mappings to improve alignment repair. Our repair algorithm was implemented as part of AgreementMakerLight, a free and open-source ontology matching system. PMID:26710335

  4. Ontology Alignment Repair through Modularization and Confidence-Based Heuristics.

    PubMed

    Santos, Emanuel; Faria, Daniel; Pesquita, Catia; Couto, Francisco M

    2015-01-01

    Ontology Matching aims at identifying a set of semantic correspondences, called an alignment, between related ontologies. In recent years, there has been a growing interest in efficient and effective matching methods for large ontologies. However, alignments produced for large ontologies are often logically incoherent. It was only recently that the use of repair techniques to improve the coherence of ontology alignments began to be explored. This paper presents a novel modularization technique for ontology alignment repair which extracts fragments of the input ontologies that only contain the necessary classes and relations to resolve all detectable incoherences. The paper presents also an alignment repair algorithm that uses a global repair strategy to minimize both the degree of incoherence and the number of mappings removed from the alignment, while overcoming the scalability problem by employing the proposed modularization technique. Our evaluation shows that our modularization technique produces significantly small fragments of the ontologies and that our repair algorithm produces more complete alignments than other current alignment repair systems, while obtaining an equivalent degree of incoherence. Additionally, we also present a variant of our repair algorithm that makes use of the confidence values of the mappings to improve alignment repair. Our repair algorithm was implemented as part of AgreementMakerLight, a free and open-source ontology matching system.
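
    A schematic of the confidence-aware variant of the repair strategy, assuming a caller-supplied find_conflict function (in AgreementMakerLight the conflicts come from reasoning over the extracted ontology modules): while detectable incoherence remains, the lowest-confidence mapping involved in a conflict is removed.

      # find_conflict is assumed: it returns a set of conflicting mappings
      # (or None) from reasoning over the extracted ontology fragments.
      def repair(alignment: dict[tuple[str, str], float], find_conflict):
          """alignment maps (source_class, target_class) to a confidence value."""
          alignment = dict(alignment)                 # work on a copy
          conflict = find_conflict(alignment)
          while conflict:
              worst = min(conflict, key=lambda m: alignment[m])
              del alignment[worst]                    # drop lowest confidence first
              conflict = find_conflict(alignment)
          return alignment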

  5. Automated source classification of new transient sources

    NASA Astrophysics Data System (ADS)

    Oertel, M.; Kreikenbohm, A.; Wilms, J.; DeLuca, A.

    2017-10-01

    The EXTraS project harvests the hitherto unexplored temporal domain information buried in the serendipitous data collected by the European Photon Imaging Camera (EPIC) onboard the ESA XMM-Newton mission since its launch. This includes a search for fast transients, missed by standard image analysis, and a search and characterization of variability in hundreds of thousands of sources. We present an automated classification scheme for new transient sources in the EXTraS project. The method is as follows: source classification features of a training sample are used to train machine learning algorithms (performed in R; randomForest (Breiman, 2001) in supervised mode) which are then tested on a sample of known source classes and used for classification.
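
    The abstract specifies supervised random forests in R; the same train-then-classify scheme in Python, for illustration, with made-up feature values standing in for the EXTraS source-classification features:

      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      # X: per-source features (e.g. hardness ratio, variability index, flux);
      # y: known classes of the training sample. Values here are illustrative.
      X = [[0.2, 1.3, 0.8], [0.9, 0.1, 2.4], [0.3, 1.1, 0.7], [0.8, 0.2, 2.1]]
      y = ["star", "agn", "star", "agn"]

      clf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
      print(cross_val_score(clf, X, y, cv=2))   # test on the known-class sample
      print(clf.predict([[0.25, 1.2, 0.75]]))   # classify a new transient source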

  6. SOFIA: a flexible source finder for 3D spectral line data

    NASA Astrophysics Data System (ADS)

    Serra, Paolo; Westmeier, Tobias; Giese, Nadine; Jurek, Russell; Flöer, Lars; Popping, Attila; Winkel, Benjamin; van der Hulst, Thijs; Meyer, Martin; Koribalski, Bärbel S.; Staveley-Smith, Lister; Courtois, Hélène

    2015-04-01

    We introduce SOFIA, a flexible software application for the detection and parametrization of sources in 3D spectral line data sets. SOFIA combines for the first time in a single piece of software a set of new source-finding and parametrization algorithms developed on the way to future H I surveys with ASKAP (WALLABY, DINGO) and APERTIF. It is designed to enable the general use of these new algorithms by the community on a broad range of data sets. The key advantages of SOFIA are the ability to: search for line emission on multiple scales to detect 3D sources in a complete and reliable way, taking into account noise level variations and the presence of artefacts in a data cube; estimate the reliability of individual detections; look for signal in arbitrarily large data cubes using a catalogue of 3D coordinates as a prior; provide a wide range of source parameters and output products which facilitate further analysis by the user. We highlight the modularity of SOFIA, which makes it a flexible package allowing users to select and apply only the algorithms useful for their data and science questions. This modularity makes it also possible to easily expand SOFIA in order to include additional methods as they become available. The full SOFIA distribution, including a dedicated graphical user interface, is publicly available for download.

  7. Unclassified Information Sharing and Coordination in Security, Stabilization, Transition and Reconstruction Efforts

    DTIC Science & Technology

    2008-03-01

    is implemented using the Drupal (2007) content management system (CMS) and many of the baseline information sharing and collaboration tools have...been contributed through the Drupal open source community. Drupal is a very modular open source software written in PHP hypertext processor...needed to suit the particular problem domain. While other frameworks have the potential to provide similar advantages ("Ruby," 2007), Drupal was

  8. BioBlend: automating pipeline analyses within Galaxy and CloudMan.

    PubMed

    Sloggett, Clare; Goonasekera, Nuwan; Afgan, Enis

    2013-07-01

    We present BioBlend, a unified API in a high-level language (python) that wraps the functionality of Galaxy and CloudMan APIs. BioBlend makes it easy for bioinformaticians to automate end-to-end large data analysis, from scratch, in a way that is highly accessible to collaborators, by allowing them to both provide the required infrastructure and automate complex analyses over large datasets within the familiar Galaxy environment. http://bioblend.readthedocs.org/. Automated installation of BioBlend is available via PyPI (e.g. pip install bioblend). Alternatively, the source code is available from the GitHub repository (https://github.com/afgane/bioblend) under the MIT open source license. The library has been tested and is working on Linux, Macintosh and Windows-based systems.
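
    A minimal BioBlend usage sketch (the server URL and API key are placeholders): connect to a Galaxy instance, create a history, stage an input dataset, and list the workflows available for automated execution.

      from bioblend.galaxy import GalaxyInstance

      # Placeholders: point at your own Galaxy server and API key.
      gi = GalaxyInstance(url="https://usegalaxy.org", key="YOUR_API_KEY")

      history = gi.histories.create_history(name="automated-analysis")
      gi.tools.upload_file("reads.fastq", history["id"])   # stage the input data

      for wf in gi.workflows.get_workflows():              # candidate pipelines
          print(wf["id"], wf["name"])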

  9. Automated Modular Magnetic Resonance Imaging Clinical Decision Support System (MIROR): An Application in Pediatric Cancer Diagnosis

    PubMed Central

    Zarinabad, Niloufar; Meeus, Emma M; Manias, Karen; Foster, Katharine

    2018-01-01

    Background Advances in magnetic resonance imaging and the introduction of clinical decision support systems have underlined the need for an analysis tool to extract and analyze relevant information from magnetic resonance imaging data to aid decision making, prevent errors, and enhance health care. Objective The aim of this study was to design and develop a modular medical image region of interest analysis tool and repository (MIROR) for automatic processing, classification, evaluation, and representation of advanced magnetic resonance imaging data. Methods The clinical decision support system was developed and evaluated for diffusion-weighted imaging of body tumors in children (cohort of 48 children, with 37 malignant and 11 benign tumors). MeVisLab software and Python were used for the development of MIROR. Regions of interest were drawn around benign and malignant body tumors on different diffusion parametric maps, and the extracted information was used to discriminate malignant tumors from benign tumors. Results Using MIROR, the various histogram parameters derived for each tumor case, when compared with the information in the repository, provided additional information for tumor characterization and facilitated the discrimination between benign and malignant tumors. Clinical decision support system cross-validation showed high sensitivity and specificity in discriminating between these tumor groups using histogram parameters. Conclusions MIROR, as a diagnostic tool and repository, allowed the interpretation and analysis of magnetic resonance imaging images to be more accessible and comprehensive for clinicians. It aims to increase clinicians’ skillset by introducing newer techniques and up-to-date findings to their repertoire and make information from previous cases available to aid decision making. The modular-based format of the tool allows integration of analyses that are not readily available clinically and streamlines future developments. PMID:29720361

  10. Automated Selection of Hotspots (ASH): enhanced automated segmentation and adaptive step finding for Ki67 hotspot detection in adrenal cortical cancer.

    PubMed

    Lu, Hao; Papathomas, Thomas G; van Zessen, David; Palli, Ivo; de Krijger, Ronald R; van der Spek, Peter J; Dinjens, Winand N M; Stubbs, Andrew P

    2014-11-25

    In the prognosis and therapeutics of adrenal cortical carcinoma (ACC), the selection of the most active areas in proliferative rate (hotspots) within a slide and objective quantification of the immunohistochemical Ki67 Labelling Index (LI) are of critical importance. In addition to intratumoral heterogeneity in proliferative rate, i.e. levels of Ki67 expression within a given ACC, lack of uniformity and reproducibility in the method of quantification of Ki67 LI may confound an accurate assessment of Ki67 LI. We have implemented an open source toolset, Automated Selection of Hotspots (ASH), for automated hotspot detection and quantification of Ki67 LI. ASH utilizes the NanoZoomer Digital Pathology Image (NDPI) splitter to convert the NDPI-format digital slide scanned from the Hamamatsu instrument into a conventional tiff or jpeg image for the automated segmentation and adaptive step finding hotspot detection algorithm. Quantitative hotspot ranking is provided by functionality from the open source application ImmunoRatio as part of the ASH protocol. The output is a ranked set of hotspots with concomitant quantitative values based on whole slide ranking. We have implemented open source automated detection and quantitative ranking of hotspots to support histopathologists in selecting the 'hottest' hotspot areas in adrenocortical carcinoma. To provide the wider community with easy access to ASH, we implemented a Galaxy virtual machine (VM) of ASH, which is available from http://bioinformatics.erasmusmc.nl/wiki/Automated_Selection_of_Hotspots . The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/13000_2014_216.

  11. A cost-effective intelligent robotic system with dual-arm dexterous coordination and real-time vision

    NASA Technical Reports Server (NTRS)

    Marzwell, Neville I.; Chen, Alexander Y. K.

    1991-01-01

    Dexterous coordination of manipulators based on the use of redundant degrees of freedom, multiple sensors, and built-in robot intelligence represents a critical breakthrough in the development of advanced manufacturing technology. A cost-effective approach for achieving this new generation of robotics has been made possible by the unprecedented growth of the latest microcomputer and network systems. The resulting flexible automation offers the opportunity to improve product quality, increase the reliability of the manufacturing process, and augment the production procedures for optimizing the utilization of the robotic system. Moreover, the Advanced Robotic System (ARS) is modular in design and can be upgraded by closely following technological advancements as they occur in various fields. This approach to manufacturing automation enhances the financial justification and ensures the long-term profitability and most efficient implementation of robotic technology. The new system also addresses a broad spectrum of manufacturing demand and has the potential to address both complex jobs as well as highly labor-intensive tasks. The ARS prototype employs the decomposed optimization technique in spatial planning. This technique is implemented within the framework of the sensor-actuator network to establish the general-purpose geometric reasoning system. The development computer system is a multiple-microcomputer network system, which provides the architecture for executing the modular network computing algorithms. The knowledge-based approach used in both the robot vision subsystem and the manipulation control subsystems results in a real-time, image-processing, vision-based capability. The vision-based task environment analysis capability and the responsive motion capability are under the command of the local intelligence centers. An array of ultrasonic, proximity, and optoelectronic sensors is used for path planning. The ARS currently has 18 degrees of freedom, made up of two articulated arms, one movable robot head, two charge-coupled device (CCD) cameras for producing stereoscopic views, an articulated cylindrical-type lower body, and an optional mobile base. A functional prototype is demonstrated.

  12. ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wieselquist, William A.; Thompson, Adam B.; Bowman, Stephen M.

    2016-04-01

    Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentration files produced by ORIGAMI, with concentrations available at all time points in each assembly's life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.
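    The scripting burden described above, turning per-assembly records into thousands of input files, reduces to a simple template-expansion loop. The Python sketch below shows the generic pattern with an invented record layout and template; it is not the ORIGAMI Automator code, and the field names and syntax are not the real ORIGAMI input format.

        # Sketch: one input file per assembly from tabular records.
        # Record fields and template syntax are illustrative only.
        import pathlib

        assemblies = [
            {"id": "A001", "enrichment_pct": 4.2, "heavy_metal_kg": 450.0},
            {"id": "A002", "enrichment_pct": 3.8, "heavy_metal_kg": 447.5},
        ]

        template = ("assembly {id}\n"
                    "  enrichment  {enrichment_pct}\n"
                    "  heavy_metal {heavy_metal_kg}\n")

        outdir = pathlib.Path("origami_inputs")
        outdir.mkdir(exist_ok=True)
        for rec in assemblies:
            (outdir / f"{rec['id']}.inp").write_text(template.format(**rec))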

  13. Minicourses in Astrophysics, Modular Approach, Vol. II.

    ERIC Educational Resources Information Center

    Illinois Univ., Chicago.

    This is the second of a two-volume minicourse in astrophysics. It contains chapters on the following topics: stellar nuclear energy sources and nucleosynthesis; stellar evolution; stellar structure and its determination; and pulsars. Each chapter gives much technical discussion, mathematical treatment, diagrams, and examples. References are…

  14. Localizer: fast, accurate, open-source, and modular software package for superresolution microscopy

    PubMed Central

    Duwé, Sam; Neely, Robert K.; Zhang, Jin

    2012-01-01

    We present Localizer, a freely available and open source software package that implements the computational data processing inherent to several types of superresolution fluorescence imaging, such as localization (PALM/STORM/GSDIM) and fluctuation imaging (SOFI/pcSOFI). Localizer delivers high accuracy and performance and comes with a fully featured and easy-to-use graphical user interface but is also designed to be integrated in higher-level analysis environments. Due to its modular design, Localizer can be readily extended with new algorithms as they become available, while maintaining the same interface and performance. We provide front-ends for running Localizer from Igor Pro, Matlab, or as a stand-alone program. We show that Localizer performs favorably when compared with two existing superresolution packages, and to our knowledge is the only freely available implementation of SOFI/pcSOFI microscopy. By dramatically improving the analysis performance and ensuring the easy addition of current and future enhancements, Localizer strongly improves the usability of superresolution imaging in a variety of biomedical studies. PMID:23208219

  15. The Ozone Widget Framework: towards modularity of C2 human interfaces

    NASA Astrophysics Data System (ADS)

    Hellar, David Benjamin; Vega, Laurian C.

    2012-05-01

    The Ozone Widget Framework (OWF) is a common webtop environment for distribution across the enterprise. A key mission driver for OWF is to enable rapid capability delivery by lowering time-to-market with lightweight components. OWF has been released as Government Open Source Software and has been deployed in a variety of C2 net-centric contexts, ranging from real-time analytics and cyber situational awareness to strategic and operational planning. This paper discusses the current and future evolution of OWF, including the availability of the OZONE Marketplace (OMP), user-activity-driven metrics, and architecture enhancements for accessibility. Together, these developments move OWF towards the rapid delivery of modular human interfaces supporting modern and future command and control contexts.

  16. Space Construction Automated Fabrication Experiment Definition Study (SCAFEDS). Volume 3: Requirements

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The performance, design, and verification requirements for the space construction automated fabrication experiment (SCAFE) are defined and the source of each imposed or derived requirement is identified.

  17. Automated Report Generation for Research Data Repositories: From i2b2 to PDF.

    PubMed

    Thiemann, Volker S; Xu, Tingyan; Röhrig, Rainer; Majeed, Raphael W

    2017-01-01

    We developed an automated toolchain to generate reports from i2b2 data. It is based on free and open source software and runs on a Java application server. It is successfully used in an emergency department (ED) registry project. The solution is highly configurable and portable to other projects based on i2b2 or compatible factual data sources.

  18. Internet-based modular program BADI for adjustment disorder: protocol of a randomized controlled trial.

    PubMed

    Skruibis, Paulius; Eimontas, Jonas; Dovydaitiene, Migle; Mazulyte, Egle; Zelviene, Paulina; Kazlauskas, Evaldas

    2016-07-26

    Adjustment disorder is one of the most common mental health diagnoses, yet it receives relatively little attention from researchers trying to establish the best interventions to treat it. Given the high prevalence of stressful life events that may lead to adjustment disorder, and the limited resources of mental health service providers, online interventions could be a very practical way of helping people who have these disorders or are at risk of developing them. The proposed study protocol describes a randomized controlled trial of an internet-based modular intervention for adjustment disorder as defined in a proposal for the ICD-11. The study is a two-armed randomized controlled trial (RCT) examining the effectiveness of a web-based intervention, BADI (Brief Adjustment Disorder Intervention), for adjustment disorder symptoms. BADI has four modules: relaxation, time management, mindfulness, and strengthening relationships. It is based on stress and coping research and integrates evidence-based treatment approaches such as cognitive behavioural therapy (CBT), mindfulness, and body-mind practices, as well as exercises for enhancing social support. The primary outcomes of the study are symptoms of adjustment disorder and well-being. Engagement with the program and motivation for change are secondary outcomes. After completing the baseline assessment, all participants are randomly assigned to one of two groups: one in which participants instantly gain access to the BADI intervention, and one in which participants are given access to the BADI program after waiting one month. Participants of BADI can choose exercises of the program flexibly; there is no particular order in which the exercises must be completed. The study will provide new insights into the efficacy of modular internet-based interventions for adjustment disorders, as well as information about the role of motivation and expectancies in engagement with modular internet-based interventions. If this RCT supports the effectiveness of the fully automated version of BADI, it could be used very broadly and could become a cost-effective and accessible intervention for adjustment disorder. The study was retrospectively registered with the Australian and New Zealand Clinical Trials Registry under registration number ACTRN12616000883415 . Registered 5 July, 2016.

  19. A Design of a Modular GPHS-Stirling Power System for a Lunar Habitation Module

    NASA Technical Reports Server (NTRS)

    Schmitz, Paul C.; Penswick, L. Barry; Shaltens, Richard K.

    2005-01-01

    Lunar habitation modules need electricity and potentially heat to operate. Because of the low amounts of radiation emitted by General Purpose Heat Source (GPHS) modules, power plants incorporating these as heat sources could be placed in close proximity to habitation modules. A design concept is discussed for a high-efficiency power plant based on a GPHS assembly integrated with a Stirling convertor. This system could provide both electrical power and heat, if required, for a lunar habitation module. The conceptual GPHS/Stirling system is modular in nature and made up of a basic 5.5 kWe Stirling convertor/GPHS module assembly, convertor controller/PMAD electronics, waste heat radiators, and associated thermal insulation. For the specific lunar application under investigation, eight modules are employed to deliver 40 kWe to the habitation module. This design looks at three levels of Stirling convertor technology and addresses the issues of integrating the Stirling convertors with the GPHS heat source assembly using proven technology whenever possible. In addition, issues related to the high-temperature heat transport system, power management, convertor control, vibration isolation, and potential system packaging configurations to ensure safe operation during all phases of deployment are discussed.

  20. Long-read sequencing data analysis for yeasts.

    PubMed

    Yue, Jia-Xing; Liti, Gianni

    2018-06-01

    Long-read sequencing technologies have become increasingly popular due to their strengths in resolving complex genomic regions. As a leading model organism with small genome size and great biotechnological importance, the budding yeast Saccharomyces cerevisiae has many isolates currently being sequenced with long reads. However, analyzing long-read sequencing data to produce high-quality genome assembly and annotation remains challenging. Here, we present a modular computational framework named long-read sequencing data analysis for yeasts (LRSDAY), the first one-stop solution that streamlines this process. Starting from the raw sequencing reads, LRSDAY can produce chromosome-level genome assembly and comprehensive genome annotation in a highly automated manner with minimal manual intervention, which is not possible using any alternative tool available to date. The annotated genomic features include centromeres, protein-coding genes, tRNAs, transposable elements (TEs), and telomere-associated elements. Although tailored for S. cerevisiae, we designed LRSDAY to be highly modular and customizable, making it adaptable to virtually any eukaryotic organism. When applied to an S. cerevisiae strain, LRSDAY takes ∼41 h to generate a complete and well-annotated genome from ∼100× Pacific Biosciences (PacBio) reads when running the basic workflow with four threads. Basic experience working within the Linux command-line environment is recommended for carrying out the analysis using LRSDAY.

  1. Waterway wide area tactical coverage and homing (WaterWATCH) program overview

    NASA Astrophysics Data System (ADS)

    Driggers, Gerald; Cleveland, Tammy; Araujo, Lisa; Spohr, Robert; Umansky, Mark

    2008-04-01

    The congressionally and Army-sponsored WaterWATCH™ program has developed and demonstrated a fully integrated shallow-water port and facility monitoring system. It provides fully automated monitoring of the domains above and below the surface of the water using primarily off-the-shelf sensors and software. The system is modular, open-architecture, and IP-based, and elements can be mixed and matched to adapt to specific applications. The sensors integrated into the WaterWATCH™ system include cameras, radar, passive and active sonar, and various motion detectors. The sensors were chosen based on extensive requirements analyses and tradeoffs. Descriptions of the system and individual sensors are provided, along with data from modular and system-level testing. Camera test results address the capabilities and limitations of "smart" image analysis software under stressing environmental conditions such as bugs, darkness, rain, and snow. Radar issues addressed include achieving range and resolution requirements. The passive sonar capability to provide near-100% true positives with zero false positives is demonstrated. Testing results are also presented to show that inexpensive active sonar can be effective against divers with or without SCUBA gear and that false alarms due to fish can be minimized. A simple operator interface has also been demonstrated.

  2. Developing integrated workflows for the digitisation of herbarium specimens using a modular and scalable approach.

    PubMed

    Haston, Elspeth; Cubey, Robert; Pullan, Martin; Atkins, Hannah; Harris, David J

    2012-01-01

    Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors have influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable to enable a single overall workflow to be used for all digitisation projects. This integrated workflow is comprised of three principal elements: a specimen workflow, a data workflow and an image workflow. The specimen workflow is strongly linked to curatorial processes which will impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. It is shown that each category of data has its own properties which influence the timing of data capture within the workflow. Development of software has been carried out for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images has necessitated the inclusion of automated systems within the image workflow.

  3. Integrated Cryogenic Electronics Testbed (ICE-T) for Evaluation of Superconductor and Cryo-Semiconductor Integrated Circuits

    NASA Astrophysics Data System (ADS)

    Dotsenko, V. V.; Sahu, A.; Chonigman, B.; Tang, J.; Lehmann, A. E.; Gupta, V.; Talalevskii, A.; Ruotolo, S.; Sarwana, S.; Webber, R. J.; Gupta, D.

    2017-02-01

    Research and development of cryogenic application-specific integrated circuits (ASICs), such as high-frequency (tens of GHz) semiconductor and superconductor mixed-signal circuits and large-scale (>10,000 Josephson junctions) superconductor digital circuits, have long been hindered by the absence of specialized cryogenic test apparatus. During their iterative development phase, most ASICs require many additional input-output lines for applying independent bias controls, injecting test signals, and monitoring the outputs of different sub-circuits. We are developing a full suite of modular test apparatus based on cryocoolers that do not consume liquid helium and that support extensive electrical interfaces to standard and custom test equipment. Our design separates the cryogenics from the electrical connections, allowing even inexperienced users to conduct testing by simply mounting their ASIC on a removable electrical insert. Thermal connections between the cold stages and the inserts are made with robust thermal links. ICE-T accommodates two independent electrical inserts at the same time. We have designed various inserts, such as universal inserts with 40 or 80 coaxial cables and inserts with customized wiring and temperature-controlled stages. ICE-T features fast thermal cycling for rapid testing, enables detailed testing over long periods (days to months, if necessary), and even supports automated testing of digital ICs with modular additions.

  4. The new production theory for health care through clinical reengineering: a study of clinical guidelines--Part I.

    PubMed

    Sharp, J R

    1994-12-01

    Drucker writes that the emerging theory of manufacturing includes four principles and practices: statistical quality control, manufacturing accounting, modular organization, and systems approach. SQC is a rigorous, scientific method of identifying variation in the quality and productivity of a given production process, with an emphasis on improvement. The new manufacturing economics intends to integrate the production strategy with the business strategy in order to account for the biggest portions of costs that the old methods did not assess: time and automation. Production operations that are both standardized and flexible will allow the organization to keep up with changes in design, technology, and the market. The return on innovation in this environment is predicated on a modular arrangement of flexible steps in the process. Finally, the systems approach sees the entire process as being integrated in converting goods or services into economic satisfaction. There is now a major restructuring of the U.S. health care industry, and the incorporation of these four theories into health care reform would appear to be essential. This two-part article will address two problems: Will Drucker's theories relate to health care (Part I)? Will the "new manufacturing" in health care (practice guidelines) demonstrate cost, quality, and access changes that reform demands (Part II)?

  5. Improving Information Exchange and Coordination amongst Homeland Security Organizations (Briefing Charts)

    DTIC Science & Technology

    2005-06-01

    Topics include the need for a user-defined dashboard, automated monitoring of web data sources, and task-driven data aggregation and display, working toward automated processing of task, resource, and intelligence updates.

  6. Automated Acquisition Systems: Keynote Address.

    ERIC Educational Resources Information Center

    Boss, Richard D.

    1980-01-01

    The 1980s offer libraries numerous automated acquisitions alternatives, including turnkey systems from circulation system vendors and the acquisition subsystems of the bibliographic utilities. Integration of systems from several sources poses the principal problem. (Author/RAA)

  7. Improvement of the banana "Musa acuminata" reference sequence using NGS data and semi-automated bioinformatics methods.

    PubMed

    Martin, Guillaume; Baurens, Franc-Christophe; Droc, Gaëtan; Rouard, Mathieu; Cenci, Alberto; Kilian, Andrzej; Hastie, Alex; Doležel, Jaroslav; Aury, Jean-Marc; Alberti, Adriana; Carreel, Françoise; D'Hont, Angélique

    2016-03-16

    Recent advances in genomics indicate the functional significance of a majority of genome sequences and their long-range interactions. As a detailed examination of genome organization and function requires a very high quality genome sequence, the objective of this study was to improve the reference genome assembly of banana (Musa acuminata). We have developed a modular bioinformatics pipeline to improve genome sequence assemblies, which can handle various types of data. The pipeline comprises several semi-automated tools. However, unlike classical automated tools that are based on global parameters, the semi-automated tools offer an expert mode in which a user can decide on suggested improvements through local compromises. The pipeline was used to improve the draft genome sequence of Musa acuminata. Genotyping by sequencing (GBS) of a segregating population and paired-end sequencing were used to detect and correct scaffold misassemblies. Long-insert-size paired-end reads identified scaffold junctions and fusions missed by automated assembly methods. GBS markers were used to anchor scaffolds to pseudo-molecules with a new bioinformatics approach that avoids the tedious step of marker ordering during genetic map construction. Furthermore, a genome map was constructed and used to assemble scaffolds into super-scaffolds. Finally, a consensus gene annotation was projected onto the new assembly from two pre-existing annotations. This approach reduced the total Musa scaffold number from 7513 to 1532 (i.e. by 80%), with an N50 that increased from 1.3 Mb (65 scaffolds) to 3.0 Mb (26 scaffolds). 89.5% of the assembly was anchored to the 11 Musa chromosomes, compared to the previous 70%. Unknown sites (N) were reduced from 17.3% to 10.0%. The release of the Musa acuminata reference genome version 2 provides a platform for detailed analysis of banana genome variation, function and evolution. The bioinformatics tools developed in this work can be used to improve genome sequence assemblies in other species.
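    For reference, the N50 statistic quoted above can be computed from a list of scaffold lengths as in this minimal Python sketch (the example lengths are made up):

        # N50: the largest length L such that scaffolds of length >= L
        # together cover at least half of the total assembly size.
        def n50(lengths):
            total = sum(lengths)
            running = 0
            for length in sorted(lengths, reverse=True):
                running += length
                if 2 * running >= total:
                    return length
            return 0

        print(n50([3_000_000, 1_300_000, 800_000, 400_000]))  # -> 3000000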

  8. Cislunar space infrastructure: Lunar technologies

    NASA Technical Reports Server (NTRS)

    Faller, W.; Hoehn, A.; Johnson, S.; Moos, P.; Wiltberger, N.

    1989-01-01

    Continuing its emphasis on the creation of a cislunar infrastructure as an appropriate and cost-effective method of space exploration and development, the University of Colorado explores the technologies necessary for the creation of such an infrastructure, namely (1) automation and robotics; (2) life support systems; (3) fluid management; (4) propulsion; and (5) rotating technologies. The technological focal point is the development of automated and robotic systems for the implementation of a Lunar Oasis produced by automation and robotics (LOARS). Under direction from the NASA Office of Exploration, automation and robotics are utilized extensively as the initiating stage of the return to the Moon. A pair of autonomous rovers, modular in design and built from interchangeable and specialized components, is proposed. Utilizing a 'buddy system', these rovers will be able to support each other and enhance their individual capabilities. One rover primarily explores and maps, while the second rover tests the feasibility of various materials-processing techniques. The automated missions emphasize the availability and potential uses of lunar resources and the deployment and operations of the LOAR program. An experimental bio-volume is put into place as the precursor to a lunar environmentally controlled life support system. The bio-volume will determine the reproduction, growth, and production characteristics of various life forms housed on the lunar surface. Physiochemical regenerative technologies and stored resources will be used to buffer biological disturbances of the bio-volume environment. The in situ lunar resources will be both tested and used within this bio-volume. Second-phase development on the lunar surface calls for manned operations. Repairs and reconfiguration of the initial framework will ensue. An autonomously initiated, manned Lunar Oasis can become an essential component of the United States space program. The Lunar Oasis will provide support to science, technology, and commerce, and will enable more cost-effective space exploration to the planets and beyond.

  9. Integrated workflows for spiking neuronal network simulations

    PubMed Central

    Antolík, Ján; Davison, Andrew P.

    2013-01-01

    The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages. PMID:24368902

  10. Integrated workflows for spiking neuronal network simulations.

    PubMed

    Antolík, Ján; Davison, Andrew P

    2013-01-01

    The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages.
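    The declarative, hierarchically organized configuration style described above can be illustrated generically: a base parameter hierarchy is overlaid with experiment-specific overrides. The Python sketch below shows that pattern with invented parameter names; it is not Mozaik code and does not reproduce Mozaik's actual configuration format.

        # Sketch: recursive overlay of an experiment config onto a base config.
        # Parameter names are invented; Mozaik's real format differs.
        def merge(base, override):
            out = dict(base)
            for key, value in override.items():
                if isinstance(value, dict) and isinstance(out.get(key), dict):
                    out[key] = merge(out[key], value)   # descend into sub-trees
                else:
                    out[key] = value                    # leaf override wins
            return out

        base = {"sheets": {"V1": {"neurons": 4000, "model": "IF_cond_exp"}},
                "recording": {"spikes": True}}
        experiment = {"sheets": {"V1": {"neurons": 8000}}}
        print(merge(base, experiment))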

  11. EARLINET Single Calculus Chain - technical - Part 2: Calculation of optical products

    NASA Astrophysics Data System (ADS)

    Mattis, Ina; D'Amico, Giuseppe; Baars, Holger; Amodeo, Aldo; Madonna, Fabio; Iarlori, Marco

    2016-07-01

    In this paper we present the automated software tool ELDA (EARLINET Lidar Data Analyzer) for the retrieval of profiles of optical particle properties from lidar signals. This tool is one of the calculus modules of the EARLINET Single Calculus Chain (SCC), which allows for the analysis of data from many different EARLINET lidar systems in an automated, unsupervised way. ELDA delivers profiles of particle extinction coefficients from Raman signals as well as profiles of particle backscatter coefficients from combinations of Raman and elastic signals or from elastic signals only. Those analyses start from pre-processed signals which have already been corrected for background, range dependency, and hardware-specific effects. An expert group reviewed all algorithms and solutions for critical calculus subsystems used within EARLINET with respect to their applicability for automated retrievals; those methods have been implemented in ELDA. Since the software was designed in a modular way, it is possible to add new or alternative methods in the future. Most of the implemented algorithms are well known and well documented, but some methods have been developed especially for ELDA, e.g., automated vertical smoothing and temporal averaging, the handling of effective vertical resolution in the case of lidar ratio retrievals, and the merging of near-range and far-range products. The accuracy of the retrieved profiles was tested following the procedure of the EARLINET-ASOS algorithm inter-comparison exercise, which is based on the analysis of synthetic signals. Mean deviations, mean relative deviations, and normalized root-mean-square deviations were calculated for all possible products and three height layers. In all cases, the deviations were clearly below the maximum allowed values according to the EARLINET quality requirements.
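    The three quality metrics named above can be written compactly. The NumPy sketch below uses generic textbook definitions (mean deviation, mean relative deviation, and root-mean-square deviation normalized by the reference mean), which may differ in detail from the EARLINET-ASOS conventions, and the profile values are invented.

        # Sketch: deviation metrics between a retrieved and a reference profile.
        # Definitions are generic, not the official EARLINET-ASOS formulas.
        import numpy as np

        def deviations(retrieved, reference):
            diff = retrieved - reference
            mean_dev = diff.mean()
            mean_rel_dev = (diff / reference).mean()
            nrmsd = np.sqrt((diff ** 2).mean()) / reference.mean()
            return mean_dev, mean_rel_dev, nrmsd

        reference = np.array([1.00, 0.80, 0.60, 0.40])
        retrieved = np.array([1.05, 0.78, 0.61, 0.42])
        print(deviations(retrieved, reference))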

  12. Modul.LES: a multi-compartment, multi-organism aquatic life support system as experimental platform for research in ∆g

    NASA Astrophysics Data System (ADS)

    Hilbig, Reinhard; Anken, Ralf; Grimm, Dennis

    In view of space exploration and long-term satellite missions, a new generation of multi-modular, multi-organism bioregenerative life support systems with different experimental units (Modul.LES) is planned, and subunits are under construction. Modul.LES will be managed via telemetry and remote control and is therefore a fully automated experimental platform for different kinds of investigations. After several forerunner projects such as AquaCells (2005), C.E.B.A.S. (1998, 2003) and Aquahab (OHB-System AG), the Oreochromis Mossambicus Euglena Gracilis Aquatic Habitat (OmegaHab) was successfully flown in 2007 during the FOTON-M3 mission. It was a three-chamber controlled life support system (CLSS), comprising a bioreactor with the green alga Euglena gracilis, a fish chamber with larval cichlid fish Oreochromis mossambicus, and a filter chamber with biodegrading bacteria. The sensory supervision of housekeeping management was registered and controlled by telemetry. Additionally, all scientific data and videos of the organisms aboard were stored and sequentially transmitted to relay stations. Based on the effective performance of OmegaHab, this system was chosen for a reflight on Bion-M1 in 2012. As Bion-M1 is a long-term mission (approx. 4 weeks), this CLSS (OmegaHab-XP) had to be redesigned and refurbished with enhanced performance. The number of chambers has been increased from three to four: an algae bioreactor, a fish tank for adult and larval fish (with an inserted hatchery), a nutrition chamber with higher plants and crustaceans, and a filter chamber. OmegaHab-XP is a fully automated system with an extended satellite downlink for video monitoring and housekeeping data acquisition, but no uplink for remote control. OmegaHab-XP provides numerous physical and chemical parameters which will be monitored regarding the state of the biological processes and thus enable automated control aboard. Besides the two basic parameters, oxygen content and temperature, products of the nitrogen cycle (concentrations of ammonium, nitrite and nitrate) as well as conductivity will be measured. For this long-term mission, an external food supply as used with OmegaHab is not sufficient; therefore, a nutrition compartment has been added in OmegaHab-XP. OmegaHab-XP is a multi-trophic system, designed as a basic concept and test-bed for the future multi-modular platform Modul.LES, and comprises four different trophic levels. The algae experimental container is used as a CO2/O2 exchanger and serves as the oxygen source for all heterotrophic organisms. The fish compartment is divided into two areas, namely a hatchery (larval cichlid fish Oreochromis mossambicus) and a fish tank (subadult cichlids). Once the yolk sac is resorbed (stage 19), the juvenile fish are able to leave the hatchery via escapements into the fish compartment. In order to enable the development of fish from larval yolk-sac stages to subadult fish, a nutrition compartment is enclosed: in this compartment the crustacean Hyalella azteca will reproduce and build up a stable population by feeding on the rigid hornwort (Ceratophyllum demersum). Younger crustaceans can cross the barrier to the fish tank and serve as nutrition for fully developed subadult fish. Waste products of all organisms will be assimilated by the water snail Biomphalaria glabrata. The scientific concept of Modul.LES is to establish a multidisciplinary framework of scientists and areas of scientific research (biophysics, molecular-organismic biology, biochemistry, etc.) to analyze the impacts of ∆g on plants and animals.

  13. Bio::Phylo - phyloinformatic analysis using Perl.

    PubMed

    Vos, Rutger A; Caravas, Jason; Hartmann, Klaas; Jensen, Mark A; Miller, Chase

    2011-02-27

    Phyloinformatic analyses involve large amounts of data and metadata of complex structure. Collecting, processing, analyzing, visualizing and summarizing these data and metadata should be done in steps that can be automated and reproduced. This requires flexible, modular toolkits that can represent, manipulate and persist phylogenetic data and metadata as objects with programmable interfaces. This paper presents Bio::Phylo, a Perl5 toolkit for phyloinformatic analysis. It implements classes and methods that are compatible with the well-known BioPerl toolkit, but is independent from it (making it easy to install) and features a richer API and a data model that is better able to manage the complex relationships between different fundamental data and metadata objects in phylogenetics. It supports commonly used file formats for phylogenetic data including the novel NeXML standard, which allows rich annotations of phylogenetic data to be stored and shared. Bio::Phylo can interact with BioPerl, thereby giving access to the file formats that BioPerl supports. Many methods for data simulation, transformation and manipulation, the analysis of tree shape, and tree visualization are provided. Bio::Phylo is composed of 59 richly documented Perl5 modules. It has been deployed successfully on a variety of computer architectures (including various Linux distributions, Mac OS X versions, Windows, Cygwin and UNIX-like systems). It is available as open source (GPL) software from http://search.cpan.org/dist/Bio-Phylo.
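    Bio::Phylo itself is a Perl toolkit; for readers who work in Python, the same kind of object-oriented tree handling can be sketched with Biopython's separate Bio.Phylo module (a distinct project that shares only the name). A minimal example, assuming Biopython is installed:

        # Sketch: parsing and inspecting a Newick tree with Biopython's
        # Bio.Phylo module (not the Perl Bio::Phylo toolkit described above).
        from io import StringIO
        from Bio import Phylo

        tree = Phylo.read(StringIO("((A:1,B:1):1,C:2);"), "newick")
        print(tree.count_terminals())   # -> 3
        Phylo.draw_ascii(tree)          # plain-text rendering of the tree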

  14. Spatial Modeling for Resources Framework (SMRF): A modular framework for developing spatial forcing data in mountainous terrain

    NASA Astrophysics Data System (ADS)

    Havens, S.; Marks, D. G.; Kormos, P.; Hedrick, A. R.; Johnson, M.; Robertson, M.; Sandusky, M.

    2017-12-01

    In the Western US, operational water supply managers rely on statistical techniques to forecast the volume of water left to enter the reservoirs. As the climate changes and demand increases for stored water used for irrigation, flood control, power generation, and ecosystem services, water managers have begun to move from statistical techniques towards physically based models. To assist with the transition, a new open source framework was developed, the Spatial Modeling for Resources Framework (SMRF), to automate and simplify the most common forcing-data distribution methods. SMRF is computationally efficient and can be implemented for both research and operational applications. Currently, SMRF is able to generate all of the forcing data required to run physically based snow or hydrologic models at 50-100 m resolution over regions of 500-10,000 km2, and has been successfully applied in real-time and historical applications for the Boise River Basin in Idaho, USA, the Tuolumne and San Joaquin River Basins in California, USA, and the Reynolds Creek Experimental Watershed in Idaho, USA. These applications use meteorological station measurements and numerical weather prediction model outputs as input data. SMRF has significantly streamlined the modeling workflow, decreased model set-up time from weeks to days, and made near real-time application of physics-based snow and hydrologic models possible.
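    A common building block of such forcing-data distribution is inverse-distance weighting of station measurements onto grid points. The NumPy sketch below shows the generic technique; it is not SMRF code, and the station coordinates and temperatures are invented.

        # Sketch: inverse-distance-weighted (IDW) interpolation of station
        # air temperatures to an arbitrary point. Data are made up.
        import numpy as np

        stations = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]])  # x, y in km
        temps = np.array([1.5, 3.0, -0.5])                          # deg C

        def idw(point, xy, values, power=2.0):
            dist = np.linalg.norm(xy - point, axis=1)
            if np.any(dist == 0.0):              # point falls on a station
                return float(values[np.argmin(dist)])
            weights = dist ** -power
            return float(np.sum(weights * values) / np.sum(weights))

        print(idw(np.array([4.0, 3.0]), stations, temps))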

  15. An open-source, FireWire camera-based, Labview-controlled image acquisition system for automated, dynamic pupillometry and blink detection.

    PubMed

    de Souza, John Kennedy Schettino; Pinto, Marcos Antonio da Silva; Vieira, Pedro Gabrielle; Baron, Jerome; Tierra-Criollo, Carlos Julio

    2013-12-01

    The dynamic, accurate measurement of pupil size is extremely valuable for studying a large number of neuronal functions and dysfunctions. Despite tremendous and well-documented progress in image processing techniques for estimating pupil parameters, comparatively little work has been reported on the practical hardware issues involved in designing image acquisition systems for pupil analysis. Here, we describe and validate the basic features of such a system, which is based on a relatively compact, off-the-shelf, low-cost FireWire digital camera. We successfully implemented two configurable modes of video recording: a continuous mode and an event-triggered mode. The interoperability of the whole system is guaranteed by a set of modular software components hosted on a personal computer and written in Labview. An offline suite of image processing algorithms for automatically estimating pupillary and eyelid parameters was assessed using data obtained from human subjects. Our benchmark results show that such measurements can be made in a temporally precise way at a sampling frequency of up to 120 Hz and with an estimated maximum spatial resolution of 0.03 mm. Our software is made available free of charge to the scientific community, allowing end users to either use the software as is or modify it to suit their own needs. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
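    The core image-processing step in many dark-pupil pipelines, thresholding the pupil and fitting an ellipse to its contour, can be sketched in a few lines with OpenCV. This is a generic illustration, not the authors' Labview implementation; the input file eye.png, the threshold of 40, and the OpenCV 4 return signature of findContours are all assumptions.

        # Sketch: dark-pupil segmentation and ellipse fit with OpenCV 4.
        # File name and threshold value are assumptions for the example.
        import cv2

        gray = cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE)
        _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            pupil = max(contours, key=cv2.contourArea)  # largest dark blob
            if len(pupil) >= 5:                         # fitEllipse needs >= 5 points
                (cx, cy), axes, angle = cv2.fitEllipse(pupil)
                print(f"center=({cx:.1f}, {cy:.1f}), diameter~{max(axes):.1f} px")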

  16. Bio::Phylo - phyloinformatic analysis using Perl

    PubMed Central

    2011-01-01

    Background: Phyloinformatic analyses involve large amounts of data and metadata of complex structure. Collecting, processing, analyzing, visualizing and summarizing these data and metadata should be done in steps that can be automated and reproduced. This requires flexible, modular toolkits that can represent, manipulate and persist phylogenetic data and metadata as objects with programmable interfaces. Results: This paper presents Bio::Phylo, a Perl5 toolkit for phyloinformatic analysis. It implements classes and methods that are compatible with the well-known BioPerl toolkit, but is independent from it (making it easy to install) and features a richer API and a data model that is better able to manage the complex relationships between different fundamental data and metadata objects in phylogenetics. It supports commonly used file formats for phylogenetic data including the novel NeXML standard, which allows rich annotations of phylogenetic data to be stored and shared. Bio::Phylo can interact with BioPerl, thereby giving access to the file formats that BioPerl supports. Many methods for data simulation, transformation and manipulation, the analysis of tree shape, and tree visualization are provided. Conclusions: Bio::Phylo is composed of 59 richly documented Perl5 modules. It has been deployed successfully on a variety of computer architectures (including various Linux distributions, Mac OS X versions, Windows, Cygwin and UNIX-like systems). It is available as open source (GPL) software from http://search.cpan.org/dist/Bio-Phylo PMID:21352572

  17. A pipeline for comprehensive and automated processing of electron diffraction data in IPLT.

    PubMed

    Schenk, Andreas D; Philippsen, Ansgar; Engel, Andreas; Walz, Thomas

    2013-05-01

    Electron crystallography of two-dimensional crystals allows the structural study of membrane proteins in their native environment, the lipid bilayer. Determining the structure of a membrane protein at near-atomic resolution by electron crystallography remains, however, a very labor-intensive and time-consuming task. To simplify and accelerate the data processing aspect of electron crystallography, we implemented a pipeline for the processing of electron diffraction data using the Image Processing Library and Toolbox (IPLT), which provides a modular, flexible, integrated, and extendable cross-platform, open-source framework for image processing. The diffraction data processing pipeline is organized as several independent modules implemented in Python. The modules can be accessed either from a graphical user interface or through a command line interface, thus meeting the needs of both novice and expert users. The low-level image processing algorithms are implemented in C++ to achieve optimal processing performance, and their interface is exported to Python using a wrapper. For enhanced performance, the Python processing modules are complemented with a central data managing facility that provides a caching infrastructure. The validity of our data processing algorithms was verified by processing a set of aquaporin-0 diffraction patterns with the IPLT pipeline and comparing the resulting merged data set with that obtained by processing the same diffraction patterns with the classical set of MRC programs. Copyright © 2013 Elsevier Inc. All rights reserved.

  18. A pipeline for comprehensive and automated processing of electron diffraction data in IPLT

    PubMed Central

    Schenk, Andreas D.; Philippsen, Ansgar; Engel, Andreas; Walz, Thomas

    2013-01-01

    Electron crystallography of two-dimensional crystals allows the structural study of membrane proteins in their native environment, the lipid bilayer. Determining the structure of a membrane protein at near-atomic resolution by electron crystallography remains, however, a very labor-intensive and time-consuming task. To simplify and accelerate the data processing aspect of electron crystallography, we implemented a pipeline for the processing of electron diffraction data using the Image Processing Library & Toolbox (IPLT), which provides a modular, flexible, integrated, and extendable cross-platform, open-source framework for image processing. The diffraction data processing pipeline is organized as several independent modules implemented in Python. The modules can be accessed either from a graphical user interface or through a command line interface, thus meeting the needs of both novice and expert users. The low-level image processing algorithms are implemented in C++ to achieve optimal processing performance, and their interface is exported to Python using a wrapper. For enhanced performance, the Python processing modules are complemented with a central data managing facility that provides a caching infrastructure. The validity of our data processing algorithms was verified by processing a set of aquaporin-0 diffraction patterns with the IPLT pipeline and comparing the resulting merged data set with that obtained by processing the same diffraction patterns with the classical set of MRC programs. PMID:23500887
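    The "central data managing facility" with its caching infrastructure can be illustrated generically in Python: an expensive per-pattern processing step is memoized so repeated accesses are served from memory. The sketch below uses functools.lru_cache and invented names; it is not IPLT code.

        # Sketch: memoizing an expensive load/process step, standing in for a
        # central caching data manager. Names are illustrative, not IPLT's.
        import functools
        import time

        @functools.lru_cache(maxsize=128)
        def load_diffraction_pattern(pattern_id):
            time.sleep(0.1)                 # stands in for slow I/O + processing
            return (pattern_id, "processed")

        load_diffraction_pattern("aqp0_001")  # computed, then cached
        load_diffraction_pattern("aqp0_001")  # served instantly from the cache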

  19. Propel: Tools and Methods for Practical Source Code Model Checking

    NASA Technical Reports Server (NTRS)

    Mansouri-Samani, Massoud; Mehlitz, Peter; Markosian, Lawrence; OMalley, Owen; Martin, Dale; Moore, Lantz; Penix, John; Visser, Willem

    2003-01-01

    The work reported here is an overview and snapshot of a project to develop practical model checking tools for in-the-loop verification of NASA's mission-critical, multithreaded programs in Java and C++. Our strategy is to develop and evaluate both a design concept that enables the application of model checking technology to C++ and Java, and a model checking toolset for C++ and Java. The design concept and the associated model checking toolset are called Propel. Propel builds upon the Java PathFinder (JPF) tool, an explicit-state model checker for Java applications developed by the Automated Software Engineering group at NASA Ames Research Center. The design concept that we are developing is Design for Verification (D4V). This is an adaptation of existing best design practices that has the desired side effect of enhancing verifiability by improving modularity and decreasing accidental complexity. D4V, we believe, enhances the applicability of a variety of V&V approaches; we are developing the concept in the context of model checking. The model checking toolset, Propel, is based on extending JPF to handle C++. Our principal tasks in developing the toolset are to build a translator from C++ to Java, productize JPF, and evaluate the toolset in the context of D4V. Through all these tasks we are testing Propel capabilities on customer applications.

  20. adLIMS: a customized open source software that allows bridging clinical and basic molecular research studies.

    PubMed

    Calabria, Andrea; Spinozzi, Giulio; Benedicenti, Fabrizio; Tenderini, Erika; Montini, Eugenio

    2015-01-01

    Many biological laboratories that deal with genomic samples face the problem of sample tracking, both for pure laboratory management and for efficiency. Our laboratory exploits PCR techniques and next generation sequencing (NGS) methods to perform high-throughput integration site monitoring in different clinical trials and scientific projects. Because of the huge number of samples that we process every year, which result in hundreds of millions of sequencing reads, we need to standardize data management and tracking systems, building a scalable and flexible structure with web-based interfaces, usually called a Laboratory Information Management System (LIMS). We started by collecting end-users' requirements, comprising the desired functionalities of the system and its graphical user interfaces (GUIs), and then evaluated available tools that could address those requirements, spanning from pure LIMS to content management systems (CMS) up to enterprise information systems. Our analysis identified ADempiere ERP, an open source enterprise resource planning system written in Java J2EE, as the best software, which also natively implements some highly desirable technological features, such as high usability and a modularity that grants use-case flexibility and software scalability for custom solutions. We extended and customized ADempiere ERP to fulfil LIMS requirements and developed adLIMS. It has been validated by our end-users, who verified its functionalities and GUIs through test cases for PCR samples and pre-sequencing data, and it is currently in use in our laboratories. adLIMS implements authorization and authentication policies, allowing management of multiple users and definition of roles that enable specific permissions, operations and data views for each user. For example, adLIMS allows sample sheets to be created from stored data using the available export operations. This simplicity and process standardization can avoid manual errors and information backtracking, features that are not granted when tracking records in files or spreadsheets. adLIMS aims to combine sample tracking and data reporting features with highly accessible and usable GUIs, thus saving time on repetitive laboratory tasks and reducing errors with respect to manual data collection methods. Moreover, adLIMS implements automated data entry, exploiting sample data multiplexing and parallel/transactional processing. adLIMS is natively extensible to cope with laboratory automation through platform-dependent API interfaces, and could be extended to genomic facilities thanks to its ERP functionalities.
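    Generically, the sample-sheet export mentioned above is a projection of stored records into a flat file. The Python sketch below writes a CSV sample sheet from in-memory records; the field names are invented, and this is not adLIMS code (adLIMS itself is a Java/ADempiere extension).

        # Sketch: exporting a sample sheet from stored records to CSV.
        # Field names are hypothetical.
        import csv

        samples = [
            {"sample_id": "S001", "barcode": "ACGTAC", "project": "trial_A"},
            {"sample_id": "S002", "barcode": "TGCATG", "project": "trial_A"},
        ]

        with open("sample_sheet.csv", "w", newline="") as handle:
            writer = csv.DictWriter(handle,
                                    fieldnames=["sample_id", "barcode", "project"])
            writer.writeheader()
            writer.writerows(samples)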

  1. Open Source High Content Analysis Utilizing Automated Fluorescence Lifetime Imaging Microscopy.

    PubMed

    Görlitz, Frederik; Kelly, Douglas J; Warren, Sean C; Alibhai, Dominic; West, Lucien; Kumar, Sunil; Alexandrov, Yuriy; Munro, Ian; Garcia, Edwin; McGinty, James; Talbot, Clifford; Serwa, Remigiusz A; Thinon, Emmanuelle; da Paola, Vincenzo; Murray, Edward J; Stuhmeier, Frank; Neil, Mark A A; Tate, Edward W; Dunsby, Christopher; French, Paul M W

    2017-01-18

    We present an open source high content analysis instrument utilizing automated fluorescence lifetime imaging (FLIM) for assaying protein interactions using Förster resonance energy transfer (FRET) based readouts of fixed or live cells in multiwell plates. This provides a means to screen for cell signaling processes read out using intramolecular FRET biosensors or intermolecular FRET of protein interactions such as oligomerization or heterodimerization, which can be used to identify binding partners. We describe here the functionality of this automated multiwell plate FLIM instrumentation and present exemplar data from our studies of HIV Gag protein oligomerization and a time course of a FRET biosensor in live cells. A detailed description of the practical implementation is then provided with reference to a list of hardware components and a description of the open source data acquisition software written in µManager. The application of FLIMfit, an open source MATLAB-based client for the OMERO platform, to analyze arrays of multiwell plate FLIM data is also presented. The protocols for imaging fixed and live cells are outlined and a demonstration of an automated multiwell plate FLIM experiment using cells expressing fluorescent protein-based FRET constructs is presented. This is complemented by a walk-through of the data analysis for this specific FLIM FRET data set.
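    For orientation, FLIM-FRET readouts commonly rest on the standard relation E = 1 - tau_DA/tau_D between the donor lifetime in the presence (tau_DA) and absence (tau_D) of an acceptor. The short Python sketch below applies that textbook formula to invented lifetimes; it is not taken from the paper or from FLIMfit.

        # Sketch: FRET efficiency from donor fluorescence lifetimes,
        # E = 1 - tau_DA / tau_D. The lifetime values are made up.
        def fret_efficiency(tau_da_ns, tau_d_ns):
            return 1.0 - tau_da_ns / tau_d_ns

        print(f"E = {fret_efficiency(1.8, 2.4):.2f}")   # -> E = 0.25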

  2. Open-Source Automated Mapping Four-Point Probe.

    PubMed

    Chandra, Handy; Allen, Spencer W; Oberloier, Shane W; Bihari, Nupur; Gwamuri, Jephias; Pearce, Joshua M

    2017-01-26

    Scientists have begun using self-replicating rapid prototyper (RepRap) 3-D printers to manufacture open source digital designs of scientific equipment. This approach is refined here to develop a novel instrument capable of performing automated large-area four-point probe measurements. The designs for conversion of a RepRap 3-D printer to a 2-D open source four-point probe (OS4PP) measurement device are detailed for the mechanical and electrical systems. Free and open source software and firmware are developed to operate the tool. The OS4PP was validated against a wide range of discrete resistors and indium tin oxide (ITO) samples of different thicknesses, both pre- and post-annealing. The OS4PP was then compared to two commercial proprietary systems. Results for discrete resistors from 10 Ω to 1 MΩ show errors of less than 1% for the OS4PP. The 3-D mapping of sheet resistance of ITO samples successfully demonstrated the automated capability to measure non-uniformities in large-area samples. The results indicate that all measured values are within the same order of magnitude when compared to the two proprietary measurement systems. In conclusion, the OS4PP system, which costs less than 70% as much as manual proprietary systems, is electrically comparable while offering automated 100-micron positional accuracy for measuring sheet resistance over larger areas.
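    For a thin sample that is laterally large compared with the probe spacing, the textbook four-point probe relation is R_s = (pi / ln 2) * V / I, approximately 4.532 * V / I. The sketch below evaluates it with invented measurements; it is independent of the OS4PP firmware.

        # Sketch: sheet resistance from a four-point probe measurement,
        # R_s = (pi / ln 2) * V / I for a thin, laterally infinite sheet.
        import math

        def sheet_resistance(volts, amps):
            return (math.pi / math.log(2)) * volts / amps

        # Made-up reading: 12 mV across the inner probes at 1 mA drive.
        print(f"{sheet_resistance(0.012, 0.001):.1f} ohm/sq")  # -> 54.4 ohm/sq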

  3. Open Source High Content Analysis Utilizing Automated Fluorescence Lifetime Imaging Microscopy

    PubMed Central

    Warren, Sean C.; Alibhai, Dominic; West, Lucien; Kumar, Sunil; Alexandrov, Yuriy; Munro, Ian; Garcia, Edwin; McGinty, James; Talbot, Clifford; Serwa, Remigiusz A.; Thinon, Emmanuelle; da Paola, Vincenzo; Murray, Edward J.; Stuhmeier, Frank; Neil, Mark A. A.; Tate, Edward W.; Dunsby, Christopher; French, Paul M. W.

    2017-01-01

    We present an open source high content analysis instrument utilizing automated fluorescence lifetime imaging (FLIM) for assaying protein interactions using Förster resonance energy transfer (FRET) based readouts of fixed or live cells in multiwell plates. This provides a means to screen for cell signaling processes read out using intramolecular FRET biosensors or intermolecular FRET of protein interactions such as oligomerization or heterodimerization, which can be used to identify binding partners. We describe here the functionality of this automated multiwell plate FLIM instrumentation and present exemplar data from our studies of HIV Gag protein oligomerization and a time course of a FRET biosensor in live cells. A detailed description of the practical implementation is then provided with reference to a list of hardware components and a description of the open source data acquisition software written in µManager. The application of FLIMfit, an open source MATLAB-based client for the OMERO platform, to analyze arrays of multiwell plate FLIM data is also presented. The protocols for imaging fixed and live cells are outlined and a demonstration of an automated multiwell plate FLIM experiment using cells expressing fluorescent protein-based FRET constructs is presented. This is complemented by a walk-through of the data analysis for this specific FLIM FRET data set. PMID:28190060

  4. Simulating reservoir leakage in ground-water models

    USGS Publications Warehouse

    Fenske, J.P.; Leake, S.A.; Prudic, David E.

    1997-01-01

    Leakage to ground water resulting from the expansion and contraction of reservoirs cannot be easily simulated by most ground-water flow models. An algorithm, entitled the Reservoir Package, was developed for the United States Geological Survey (USGS) three-dimensional finite-difference modular ground-water flow model MODFLOW. The Reservoir Package automates the process of specifying head-dependent boundary cells, eliminating the need to divide a simulation into many stress periods while improving accuracy in simulating changes in ground-water levels resulting from transient reservoir stage. Leakage between the reservoir and the underlying aquifer is simulated for each model cell corresponding to the inundated area by multiplying the head difference between the reservoir and the aquifer by the hydraulic conductance of the reservoir-bed sediments.
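    The cell-by-cell leakage calculation described above reduces to Q = C * (h_reservoir - h_aquifer), where C is the conductance of the reservoir-bed sediments. The sketch below simply restates that formula in Python with invented values; it is not MODFLOW code.

        # Sketch: head-dependent reservoir leakage for one model cell,
        # Q = C * (h_reservoir - h_aquifer). Values are hypothetical.
        def leakage(conductance_m2_per_d, h_reservoir_m, h_aquifer_m):
            return conductance_m2_per_d * (h_reservoir_m - h_aquifer_m)

        # Positive Q means water leaks from the reservoir into the aquifer.
        print(leakage(250.0, 102.5, 100.0))   # -> 625.0 m^3/d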

  5. System for Better Spacing of Airplanes En Route

    NASA Technical Reports Server (NTRS)

    Green, Steven; Erzberger, Heinz

    2004-01-01

    An improved method of computing the spacing of airplanes en route, and software to implement the method, have been invented. The purpose of the invention is to help air-traffic controllers minimize the deviations from pilots' preferred trajectories that are needed to make airplanes comply with miles-in-trail spacing requirements. The software is meant to be a modular component of the Center TRACON Automation System (CTAS) (TRACON signifies "terminal radar approach control"). The invention reduces controllers' workloads and reduces fuel consumption by reducing the number of corrective clearances needed to achieve conformance with specified flow rates, without causing conflicts, while providing for more efficient distribution of the spacing workload upstream and across air-traffic-control sectors.

  6. Fighting Testing ACAT/FRRP: Automatic Collision Avoidance Technology/Fighter Risk Reduction Project

    NASA Technical Reports Server (NTRS)

    Skoog, Mark A.

    2009-01-01

    This slide presentation reviews the flight testing work of the Automatic Collision Avoidance Technology/Fighter Risk Reduction Project (ACAT/FRRP). The goal of this project is to develop a common modular architecture for all aircraft and to enable the transition of technology from research to production as soon as possible, so as to begin to reduce the rate of mishaps. The automated Ground Collision Avoidance System (GCAS) is designed to prevent collision with the ground using avionics that project the future trajectory over digital terrain and request an evasion maneuver at the last instant; the flight controls are capable of automatically performing a recovery. The collision avoidance approach is described in the presentation, along with a description of the flight tests.
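    The last-instant trigger logic can be stated generically: propagate the current state forward, compare the predicted altitude against terrain plus a safety buffer, and command a recovery only when the predicted time to violation falls at or below the time a recovery maneuver needs. The NumPy sketch below shows that logic with made-up numbers and a deliberately simplified straight-line propagation; it is not the ACAT/FRRP implementation.

        # Sketch: generic last-instant ground-collision trigger. All numbers
        # are invented and the propagation is deliberately simplistic.
        import numpy as np

        def needs_recovery(alt_m, descent_mps, terrain_m, buffer_m=100.0,
                           recovery_s=5.0, horizon_s=30.0, dt=0.5):
            t = np.arange(0.0, horizon_s, dt)
            predicted = alt_m - descent_mps * t       # straight-line descent
            violation = predicted <= terrain_m + buffer_m
            if not violation.any():
                return False                          # no predicted conflict
            return float(t[violation][0]) <= recovery_s

        print(needs_recovery(alt_m=600.0, descent_mps=40.0, terrain_m=300.0))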

  7. eddy4R 0.2.0: a DevOps model for community-extensible processing and analysis of eddy-covariance data based on R, Git, Docker, and HDF5

    NASA Astrophysics Data System (ADS)

    Metzger, Stefan; Durden, David; Sturtevant, Cove; Luo, Hongyan; Pingintha-Durden, Natchaya; Sachs, Torsten; Serafimovich, Andrei; Hartmann, Jörg; Li, Jiahong; Xu, Ke; Desai, Ankur R.

    2017-08-01

    Large differences in instrumentation, site setup, data format, and operating system stymie the adoption of a universal computational environment for processing and analyzing eddy-covariance (EC) data. This results in limited software applicability and extensibility in addition to often substantial inconsistencies in flux estimates. Addressing these concerns, this paper presents the systematic development of portable, reproducible, and extensible EC software achieved by adopting a development and systems operation (DevOps) approach. This software development model is used for the creation of the eddy4R family of EC code packages in the open-source R language for statistical computing. These packages are community developed, iterated via the Git distributed version control system, and wrapped into a portable and reproducible Docker filesystem that is independent of the underlying host operating system. The HDF5 hierarchical data format then provides a streamlined mechanism for highly compressed and fully self-documented data ingest and output. The usefulness of the DevOps approach was evaluated for three test applications. First, the resultant EC processing software was used to analyze standard flux tower data from the first EC instruments installed at a National Ecological Observatory Network (NEON) field site. Second, through an aircraft test application, we demonstrate the modular extensibility of eddy4R to analyze EC data from other platforms. Third, an intercomparison with commercial-grade software showed excellent agreement (R2 = 1.0 for CO2 flux). In conjunction with this study, a Docker image containing the first two eddy4R packages and an executable example workflow, as well as the first NEON EC data products, are released publicly. We conclude by describing the work remaining to arrive at the automated generation of science-grade EC fluxes and benefits to the science community at large. This software development model is applicable beyond EC and more generally builds the capacity to deploy complex algorithms developed by scientists in an efficient and scalable manner. In addition, modularity permits meeting project milestones while retaining extensibility with time.
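    As a flavor of the self-documented HDF5 ingest and output described above, a minimal h5py sketch; the file name, group path, and attribute names here are hypothetical, not NEON's actual schema:

```python
# Read a (hypothetical) self-documented HDF5 flux file: data and its units
# travel together, so downstream code need not guess conventions.
import h5py

with h5py.File("eddy_fluxes.h5", "r") as f:
    dset = f["/site01/flux/co2"]                  # assumed group path
    co2 = dset[:]                                 # CO2 flux time series
    units = dset.attrs.get("units", "unknown")    # self-documenting metadata
    print(f"{co2.size} records, units: {units}")
```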

  8. Light Microscopy Module Imaging Tested and Demonstrated

    NASA Technical Reports Server (NTRS)

    Gati, Frank

    2004-01-01

    The Fluids Integrated Rack (FIR), a facility-class payload, and the Light Microscopy Module (LMM), a subrack payload, are integrated research facilities that will fly in the U.S. Laboratory module, Destiny, aboard the International Space Station. Both facilities are being engineered, designed, and developed at the NASA Glenn Research Center by Northrop Grumman Information Technology. The FIR is a modular, multiuser scientific research facility that is one of two racks that make up the Fluids and Combustion Facility (the other being the Combustion Integrated Rack). The FIR has a large volume dedicated for experimental hardware; easily reconfigurable diagnostics, power, and data systems that allow for unique experiment configurations; and customizable software. The FIR will also provide imagers, light sources, power management and control, command and data handling for facility and experiment hardware, and data processing and storage. The first payload in the FIR will be the LMM. The LMM integrated with the FIR is a remotely controllable, automated, on-orbit microscope subrack facility, with key diagnostic capabilities for meeting science requirements--including video microscopy to observe microscopic phenomena and dynamic interactions, interferometry to make thin-film measurements with nanometer resolution, laser tweezers to manipulate micrometer-sized particles, confocal microscopy to provide enhanced three-dimensional visualization of structures, and spectrophotometry to measure the photonic properties of materials. Vibration disturbances were identified early in the LMM development phase as a high risk for contaminating the science microgravity environment. An integrated FIR-LMM test was conducted in Glenn's Acoustics Test Laboratory to assess mechanical sources of vibration and their impact to microscopic imaging. The primary purpose of the test was to characterize the LMM response at the sample location, the x-y stage within the microscope, to vibration emissions from the FIR and LMM support structures.

  9. ClinicalTrials.gov as a data source for semi-automated point-of-care trial eligibility screening.

    PubMed

    Pfiffner, Pascal B; Oh, JiWon; Miller, Timothy A; Mandl, Kenneth D

    2014-01-01

    Implementing semi-automated processes to efficiently match patients to clinical trials at the point of care requires both detailed patient data and authoritative information about open studies. To evaluate the utility of the ClinicalTrials.gov registry as a data source for semi-automated trial eligibility screening, eligibility criteria and metadata for 437 trials open for recruitment in four different clinical domains were identified in ClinicalTrials.gov. Trials were evaluated for up-to-date recruitment status and eligibility criteria were evaluated for obstacles to automated interpretation. Finally, phone or email outreach to coordinators at a subset of the trials was made to assess the accuracy of contact details and recruitment status. 24% (104 of 437) of trials declaring an open recruitment status list a study completion date in the past, indicating out-of-date records. Substantial barriers to automated eligibility interpretation in free-form text are present in 81% to 94% of all trials. We were unable to contact coordinators at 31% (45 of 146) of the trials in the subset, either by phone or by email. Only 53% (74 of 146) would confirm that they were still recruiting patients. Because ClinicalTrials.gov has entries on most US and many international trials, the registry could be repurposed as a comprehensive trial matching data source. Semi-automated point-of-care recruitment would be facilitated by matching the registry's eligibility criteria against clinical data from electronic health records. But the current entries fall short. Ultimately, improved techniques in natural language processing will facilitate semi-automated complex matching. As immediate next steps, we recommend augmenting ClinicalTrials.gov data entry forms to capture key eligibility criteria in a simple, structured format.
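    The out-of-date-record finding above amounts to a simple consistency check that could be automated over the registry; a sketch under assumed field names (not the actual ClinicalTrials.gov schema):

```python
# Flag registry records that declare open recruitment yet list a completion
# date in the past -- the inconsistency reported for 24% of trials above.
from datetime import date

def is_stale(record, today=None):
    today = today or date.today()
    return (record["recruitment_status"] == "Recruiting"   # assumed field names
            and record["completion_date"] < today)

trial = {"recruitment_status": "Recruiting",
         "completion_date": date(2012, 6, 30)}
print(is_stale(trial))   # True -> flag for manual follow-up
```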

  10. Normalization and standardization of electronic health records for high-throughput phenotyping: the SHARPn consortium

    PubMed Central

    Pathak, Jyotishman; Bailey, Kent R; Beebe, Calvin E; Bethard, Steven; Carrell, David S; Chen, Pei J; Dligach, Dmitriy; Endle, Cory M; Hart, Lacey A; Haug, Peter J; Huff, Stanley M; Kaggal, Vinod C; Li, Dingcheng; Liu, Hongfang; Marchant, Kyle; Masanz, James; Miller, Timothy; Oniki, Thomas A; Palmer, Martha; Peterson, Kevin J; Rea, Susan; Savova, Guergana K; Stancl, Craig R; Sohn, Sunghwan; Solbrig, Harold R; Suesse, Dale B; Tao, Cui; Taylor, David P; Westberg, Les; Wu, Stephen; Zhuo, Ning; Chute, Christopher G

    2013-01-01

    Research objective To develop scalable informatics infrastructure for normalization of both structured and unstructured electronic health record (EHR) data into a unified, concept-based model for high-throughput phenotype extraction. Materials and methods Software tools and applications were developed to extract information from EHRs. Representative and convenience samples of both structured and unstructured data from two EHR systems—Mayo Clinic and Intermountain Healthcare—were used for development and validation. Extracted information was standardized and normalized to meaningful use (MU) conformant terminology and value set standards using Clinical Element Models (CEMs). These resources were used to demonstrate semi-automatic execution of MU clinical-quality measures modeled using the Quality Data Model (QDM) and an open-source rules engine. Results Using CEMs and open-source natural language processing and terminology services engines—namely, Apache clinical Text Analysis and Knowledge Extraction System (cTAKES) and Common Terminology Services (CTS2)—we developed a data-normalization platform that ensures data security, end-to-end connectivity, and reliable data flow within and across institutions. We demonstrated the applicability of this platform by executing a QDM-based MU quality measure that determines the percentage of patients between 18 and 75 years with diabetes whose most recent low-density lipoprotein cholesterol test result during the measurement year was <100 mg/dL on a randomly selected cohort of 273 Mayo Clinic patients. The platform identified 21 and 18 patients for the denominator and numerator of the quality measure, respectively. Validation results indicate that all identified patients meet the QDM-based criteria. Conclusions End-to-end automated systems for extracting clinical information from diverse EHR systems require extensive use of standardized vocabularies and terminologies, as well as robust information models for storing, discovering, and processing that information. This study demonstrates the application of modular and open-source resources for enabling secondary use of EHR data through normalization into standards-based, comparable, and consistent format for high-throughput phenotyping to identify patient cohorts. PMID:24190931

  11. The Effects of Race Conditions when Implementing Single-Source Redundant Clock Trees in Triple Modular Redundant Synchronous Architectures

    NASA Technical Reports Server (NTRS)

    Berg, Melanie D.; Label, Kenneth A.; Pellish, Jonathan

    2016-01-01

    We present the challenges that arise when using redundant clock domains due to their clock-skew. Heavy-ion radiation data show that a singular clock domain (DTMR) provides an improved TMR methodology for SRAM-based FPGAs over redundant clocks.

  12. Power supply

    DOEpatents

    Yakymyshyn, Christopher Paul; Hamilton, Pamela Jane; Brubaker, Michael Allen

    2007-12-04

    A modular, low weight impedance dropping power supply with battery backup is disclosed that can be connected to a high voltage AC source and provide electrical power at a lower voltage. The design can be scaled over a wide range of input voltages and over a wide range of output voltages and delivered power.

  13. High and Low Visualization Skills and Pedagogical Decision of Preservice Secondary Mathematics Teachers

    ERIC Educational Resources Information Center

    Unal, Hasan

    2011-01-01

    The purpose of this study was to investigate the preservice secondary mathematics teachers' development of pedagogical understanding in the teaching of modular arithmetic problems. Data sources included written assignments, interview transcripts, and field notes. Using case study and action research approaches, cases of three preservice teachers…

  14. EnviroDIY ModularSensors: A Library to give Environmental Sensors a Common Interface of Functions for use with Arduino-Compatible Dataloggers

    NASA Astrophysics Data System (ADS)

    Aufdenkampe, A. K.; Damiano, S. G.; Hicks, S.; Horsburgh, J. S.

    2017-12-01

    EnviroDIY is a community for do-it-yourself environmental science and monitoring (https://envirodiy.org), largely focused on sharing ideas for developing Arduino-compatible open-source sensor stations, similar to the EnviroDIY Mayfly datalogger (http://envirodiy.org/mayfly/). Here we present the ModularSensors Arduino code library (https://github.com/EnviroDIY/ModularSensors), designed to give all sensors and variables a common interface of functions and returns and to make it very easy to iterate through and log data from many sensors and variables. This library was written primarily for the EnviroDIY Mayfly, but we have begun to test it on other Arduino-based boards. We will show the large number of developed sensor interfaces, and examples of using this library code to stream near-real-time data to the new EnviroDIY Water Quality Data Portal (http://data.envirodiy.org/), a data and software system based on the Observations Data Model v2 (http://www.odm2.org).
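    The library's core idea, a single interface shared by every sensor so a logger can iterate over heterogeneous hardware uniformly, can be sketched by analogy in Python (the real ModularSensors code is Arduino C++; all class and method names below are illustrative):

```python
# A common sensor interface: the logging loop never needs to know which
# concrete sensor it is talking to.
from abc import ABC, abstractmethod

class Sensor(ABC):
    @abstractmethod
    def setup(self): ...                 # one-time initialization
    @abstractmethod
    def update(self): ...                # trigger a new measurement
    @abstractmethod
    def values(self) -> dict: ...        # variable name -> latest reading

class FakeThermistor(Sensor):
    def setup(self):  self._t = None
    def update(self): self._t = 21.3     # stand-in for a real ADC read
    def values(self): return {"tempC": self._t}

for s in [FakeThermistor()]:             # many sensors, one loop
    s.setup(); s.update()
    print(s.values())
```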

  15. SemantEco: a semantically powered modular architecture for integrating distributed environmental and ecological data

    USGS Publications Warehouse

    Patton, Evan W.; Seyed, Patrice; Wang, Ping; Fu, Linyun; Dein, F. Joshua; Bristol, R. Sky; McGuinness, Deborah L.

    2014-01-01

    We aim to inform the development of decision support tools for resource managers who need to examine large complex ecosystems and make recommendations in the face of many tradeoffs and conflicting drivers. We take a semantic technology approach, leveraging background ontologies and the growing body of linked open data. In previous work, we designed and implemented a semantically enabled environmental monitoring framework called SemantEco and used it to build a water quality portal named SemantAqua. Our previous system included foundational ontologies to support environmental regulation violations and relevant human health effects. In this work, we discuss SemantEco’s new architecture that supports modular extensions and makes it easier to support additional domains. Our enhanced framework includes foundational ontologies to support modeling of wildlife observation and wildlife health impacts, thereby enabling deeper and broader support for more holistically examining the effects of environmental pollution on ecosystems. We conclude with a discussion of how, through the application of semantic technologies, modular designs will make it easier for resource managers to bring in new sources of data to support more complex use cases.

  16. Craniux: A LabVIEW-Based Modular Software Framework for Brain-Machine Interface Research

    DTIC Science & Technology

    2011-01-01

    open-source BMI software solutions are currently available, we feel that the Craniux software package fills a specific need in the realm of BMI…data, such as cortical source imaging using EEG or MEG recordings. It is with these characteristics in mind that we feel the Craniux software package…

  17. Friendly Extensible Transfer Tool Beta Version

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, William P.; Gutierrez, Kenneth M.; McRee, Susan R.

    2016-04-15

    Often data transfer software is designed to meet specific requirements or apply to specific environments. Frequently, this requires source code integration for added functionality. An extensible data transfer framework is needed to more easily incorporate new capabilities in modular fashion. Using the FrETT framework, functionality may be incorporated (in many cases without need of source code) to handle new platform capabilities: I/O methods (e.g., platform-specific data access), network transport methods, and data processing (e.g., data compression).

  18. A beam optics study of a modular multi-source X-ray tube for novel computed tomography applications

    NASA Astrophysics Data System (ADS)

    Walker, Brandon J.; Radtke, Jeff; Chen, Guang-Hong; Eliceiri, Kevin W.; Mackie, Thomas R.

    2017-10-01

    A modular implementation of a scanning multi-source X-ray tube is designed for the increasing number of multi-source imaging applications in computed tomography (CT). An electron beam array coupled with an oscillating magnetic deflector is proposed as a means for producing an X-ray focal spot at any position along a line. The preliminary multi-source model includes three thermionic electron guns that are deflected in tandem by a slowly varying magnetic field and pulsed according to a scanning sequence that is dependent on the intended imaging application. Particle tracking simulations with particle dynamics analysis software demonstrate that three 100 keV electron beams are laterally swept a combined distance of 15 cm over a stationary target with an oscillating magnetic field of 102 G perpendicular to the beam axis. Beam modulation is accomplished using 25 μs pulse widths to a grid electrode with a reverse gate bias of -500 V and an extraction voltage of +1000 V. Projected focal spot diameters are approximately 1 mm for 138 mA electron beams and the stationary target stays within thermal limits for the 14 kW module. This concept could be used as a research platform for investigating high-speed stationary CT scanners, for lowering dose with virtual fan beam formation, for reducing scatter radiation in cone-beam CT, or for other industrial applications.
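    As a back-of-envelope check on the quoted figures, the gyroradius of a 100 keV electron in a 102 G field follows from r = p/(qB); this is our own sanity check, not the authors' particle-tracking simulation:

```python
# Relativistic gyroradius of a 100 keV electron in a 102 G field.
import math

E0 = 511.0e3                              # electron rest energy, eV
T = 100.0e3                               # kinetic energy, eV
pc = math.sqrt((T + E0)**2 - E0**2)       # momentum * c in eV (~335 keV)

q = 1.602e-19                             # elementary charge, C
c = 2.998e8                               # speed of light, m/s
B = 102e-4                                # 102 G in tesla
p = pc * q / c                            # momentum in kg*m/s
r = p / (q * B)                           # gyroradius, m

print(f"gyroradius ~ {100*r:.0f} cm")     # ~11 cm: cm-scale lateral sweeps
                                          # over a short field region are plausible
```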

  19. Features of Modularly Assembled Compounds That Impart Bioactivity Against an RNA Target

    PubMed Central

    Rzuczek, Suzanne G.; Gao, Yu; Tang, Zhen-Zhi; Thornton, Charles A.; Kodadek, Thomas; Disney, Matthew D.

    2013-01-01

    Transcriptomes provide a myriad of potential RNAs that could be the targets of therapeutics or chemical genetic probes of function. Cell permeable small molecules, however, generally do not exploit these targets, owing to the difficulty in the design of high affinity, specific small molecules targeting RNA. As part of a general program to study RNA function using small molecules, we designed bioactive, modularly assembled small molecules that target the non-coding expanded RNA repeat that causes myotonic dystrophy type 1 (DM1), r(CUG)exp. Herein, we present a rigorous study to elucidate features in modularly assembled compounds that afford bioactivity. Different modular assembly scaffolds were investigated including polyamines, α-peptides, β-peptides, and peptide tertiary amides (PTAs). Based on activity as assessed by improvement of DM1-associated defects, stability against proteases, cellular permeability, and toxicity, we discovered that constrained backbones, namely PTAs, are optimal. Notably, we determined that r(CUG)exp is the target of the optimal PTA in cellular models and that the optimal PTA improves DM1-associated defects in a mouse model. Biophysical analyses were employed to investigate potential sources of bioactivity. These investigations show that modularly assembled compounds have increased residence times on their targets and faster on rates than the RNA-binding modules from which they were derived and faster on rates than the protein that binds r(CUG)exp, the inactivation of which gives rise to DM1-associated defects. These studies provide information about features of small molecules that are programmable for targeting RNA, allowing for the facile optimization of therapeutics or chemical probes against other cellular RNA targets. PMID:24032410

  20. Features of modularly assembled compounds that impart bioactivity against an RNA target.

    PubMed

    Rzuczek, Suzanne G; Gao, Yu; Tang, Zhen-Zhi; Thornton, Charles A; Kodadek, Thomas; Disney, Matthew D

    2013-10-18

    Transcriptomes provide a myriad of potential RNAs that could be the targets of therapeutics or chemical genetic probes of function. Cell-permeable small molecules, however, generally do not exploit these targets, owing to the difficulty in the design of high affinity, specific small molecules targeting RNA. As part of a general program to study RNA function using small molecules, we designed bioactive, modularly assembled small molecules that target the noncoding expanded RNA repeat that causes myotonic dystrophy type 1 (DM1), r(CUG)(exp). Herein, we present a rigorous study to elucidate features in modularly assembled compounds that afford bioactivity. Different modular assembly scaffolds were investigated, including polyamines, α-peptides, β-peptides, and peptide tertiary amides (PTAs). On the basis of activity as assessed by improvement of DM1-associated defects, stability against proteases, cellular permeability, and toxicity, we discovered that constrained backbones, namely, PTAs, are optimal. Notably, we determined that r(CUG)(exp) is the target of the optimal PTA in cellular models and that the optimal PTA improves DM1-associated defects in a mouse model. Biophysical analyses were employed to investigate potential sources of bioactivity. These investigations show that modularly assembled compounds have increased residence times on their targets and faster on rates than the RNA-binding modules from which they were derived. Moreover, they have faster on rates than the protein that binds r(CUG)(exp), the inactivation of which gives rise to DM1-associated defects. These studies provide information about features of small molecules that are programmable for targeting RNA, allowing for the facile optimization of therapeutics or chemical probes against other cellular RNA targets.

  1. Reference Tools for Data Processing, Office Automation, and Data Communications: An Introductory Guide.

    ERIC Educational Resources Information Center

    Cupoli, Patricia Dymkar

    1981-01-01

    Provides an introduction to various reference sources which are useful in dealing with the areas of data processing, office automation, and communications technologies. A bibliography with vendor listings is included. (FM)

  2. Using CamiTK for rapid prototyping of interactive computer assisted medical intervention applications.

    PubMed

    Promayon, Emmanuel; Fouard, Céline; Bailet, Mathieu; Deram, Aurélien; Fiard, Gaëlle; Hungr, Nikolai; Luboz, Vincent; Payan, Yohan; Sarrazin, Johan; Saubat, Nicolas; Selmi, Sonia Yuki; Voros, Sandrine; Cinquin, Philippe; Troccaz, Jocelyne

    2013-01-01

    Computer Assisted Medical Intervention (CAMI hereafter) is a complex multi-disciplinary field. CAMI research requires the collaboration of experts in several fields as diverse as medicine, computer science, mathematics, instrumentation, signal processing, mechanics, modeling, automatic control, optics, etc. CamiTK is a modular framework that helps researchers and clinicians collaborate in order to prototype CAMI applications by regrouping the knowledge and expertise from each discipline. It is an open-source, cross-platform, generic and modular tool written in C++ which can handle medical images, surgical navigation, biomedical simulations and robot control. This paper presents the Computer Assisted Medical Intervention ToolKit (CamiTK) and how it is used in various applications in our research team.

  3. Automating PACS quality control with the Vanderbilt image processing enterprise resource

    NASA Astrophysics Data System (ADS)

    Esparza, Michael L.; Welch, E. Brian; Landman, Bennett A.

    2012-02-01

    Precise image acquisition is an integral part of modern patient care and medical imaging research. Periodic quality control using standardized protocols and phantoms ensures that scanners are operating according to specifications, yet such procedures do not ensure that individual datasets are free from corruption, for example due to patient motion, transient interference, or physiological variability. If unacceptable artifacts are noticed during scanning, a technologist can repeat a procedure. Yet, substantial delays may be incurred if a problematic scan is not noticed until a radiologist reads the scans or an automated algorithm fails. Given scores of slices in typical three-dimensional scans and a wide variety of potential use cases, a technologist cannot practically be expected to inspect all images. In large-scale research, automated pipeline systems have had great success in achieving high throughput. However, clinical and institutional workflows are largely based on DICOM and PACS technologies; these systems are not readily compatible with research systems due to security and privacy restrictions. Hence, quantitative quality control has been relegated to individual investigators and too often neglected. Herein, we propose a scalable system, the Vanderbilt Image Processing Enterprise Resource (VIPER), to integrate modular quality control and image analysis routines with a standard PACS configuration. This server unifies image processing routines across an institutional level and provides a simple interface so that investigators can collaborate to deploy new analysis technologies. VIPER integrates with high-performance computing environments and has successfully analyzed all standard scans from our institutional research center over the course of the last 18 months.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fong, Erika J.; Huang, Chao; Hamilton, Julie

    Here, a major advantage of microfluidic devices is the ability to manipulate small sample volumes, thus reducing reagent waste and preserving precious sample. However, to achieve robust sample manipulation it is necessary to address device integration with the macroscale environment. To realize repeatable, sensitive particle separation with microfluidic devices, this protocol presents a complete automated and integrated microfluidic platform that enables precise processing of 0.15–1.5 ml samples using microfluidic devices. Important aspects of this system include modular device layout and robust fixtures resulting in reliable and flexible world-to-chip connections, and fully-automated fluid handling which accomplishes closed-loop sample collection, system cleaning and priming steps to ensure repeatable operation. Different microfluidic devices can be used interchangeably with this architecture. Here we incorporate an acoustofluidic device, detail its characterization, performance optimization, and demonstrate its use for size-separation of biological samples. By using real-time feedback during separation experiments, sample collection is optimized to conserve and concentrate sample. Although requiring the integration of multiple pieces of equipment, advantages of this architecture include the ability to process unknown samples with no additional system optimization, ease of device replacement, and precise, robust sample processing.

  5. Diabetes: Models, Signals, and Control

    PubMed Central

    Cobelli, Claudio; Man, Chiara Dalla; Sparacino, Giovanni; Magni, Lalo; De Nicolao, Giuseppe; Kovatchev, Boris P.

    2010-01-01

    The control of diabetes is an interdisciplinary endeavor, which includes a significant biomedical engineering component, with traditions of success beginning in the early 1960s. It began with modeling of the insulin-glucose system, and progressed to large-scale in silico experiments, and automated closed-loop control (artificial pancreas). Here, we follow these engineering efforts through the last almost 50 years. We begin with the now classic minimal modeling approach and discuss a number of subsequent models, which have recently resulted in the first in silico simulation model accepted as substitute to animal trials in the quest for optimal diabetes control. We then review metabolic monitoring, with a particular emphasis on the new continuous glucose sensors, on the analyses of their time-series signals, and on the opportunities that they present for automation of diabetes control. Finally, we review control strategies that have been successfully employed in vivo or in silico, presenting a promise for the development of a future artificial pancreas and, in particular, discuss a modular architecture for building closed-loop control systems, including insulin delivery and patient safety supervision layers. We conclude with a brief discussion of the unique interactions between human physiology, behavioral events, engineering modeling and control relevant to diabetes. PMID:20936056
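    The layered architecture mentioned at the end, a control module whose proposed dose passes through an independent patient-safety supervision module, can be sketched as follows; the controller, thresholds, and numbers are purely illustrative and not a clinical algorithm:

```python
# Two decoupled layers: control proposes, safety supervision disposes.
def control_layer(glucose_mgdl, target=120.0, gain=0.01):
    """Naive proportional controller (illustrative only)."""
    return max(0.0, gain * (glucose_mgdl - target))   # insulin units per step

def safety_layer(dose, glucose_mgdl, max_dose=2.0, hypo_cutoff=80.0):
    """Independent supervision: suspend near hypoglycemia, cap the dose."""
    if glucose_mgdl < hypo_cutoff:
        return 0.0
    return min(dose, max_dose)

for g in (60.0, 180.0, 400.0):            # readings from a glucose sensor
    print(g, safety_layer(control_layer(g), g))
```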

  6. SNOMED CT module-driven clinical archetype management.

    PubMed

    Allones, J L; Taboada, M; Martinez, D; Lozano, R; Sobrido, M J

    2013-06-01

    To explore semantic search to improve management and user navigation in clinical archetype repositories. In order to support semantic searches across archetypes, an automated method based on SNOMED CT modularization is implemented to transform clinical archetypes into SNOMED CT extracts. Concurrently, query terms are converted into SNOMED CT concepts using the search engine Lucene. Retrieval is then carried out by matching query concepts with the corresponding SNOMED CT segments. A test collection of 16 clinical archetypes, including over 250 terms, and a subset of 55 clinical terms from two medical dictionaries, MediLexicon and MedlinePlus, were used to test our method. The keyword-based service supported by the OpenEHR repository offered us a benchmark to evaluate the enhancement of performance. In total, our approach reached 97.4% precision and 69.1% recall, providing a substantial improvement in recall (more than 70%) compared to the benchmark. Exploiting medical domain knowledge from ontologies such as SNOMED CT may overcome some limitations of keyword-based systems and thus improve the search experience of repository users. An automated approach based on ontology segmentation is an efficient and feasible way of supporting modeling, management and user navigation in clinical archetype repositories. Copyright © 2013 Elsevier Inc. All rights reserved.

  7. Active Site Mapping of Xylan-Deconstructing Enzymes with Arabinoxylan Oligosaccharides Produced by Automated Glycan Assembly.

    PubMed

    Senf, Deborah; Ruprecht, Colin; de Kruijff, Goswinus H M; Simonetti, Sebastian O; Schuhmacher, Frank; Seeberger, Peter H; Pfrengle, Fabian

    2017-03-02

    Xylan-degrading enzymes are crucial for the deconstruction of hemicellulosic biomass, making the hydrolysis products available for various industrial applications such as the production of biofuel. To determine the substrate specificities of these enzymes, we prepared a collection of complex xylan oligosaccharides by automated glycan assembly. Seven differentially protected building blocks provided the basis for the modular assembly of 2-substituted, 3-substituted, and 2-/3-substituted arabino- and glucuronoxylan oligosaccharides. Elongation of the xylan backbone relied on iterative additions of C4-fluorenylmethoxylcarbonyl (Fmoc) protected xylose building blocks to a linker-functionalized resin. Arabinofuranose and glucuronic acid residues have been selectively attached to the backbone using fully orthogonal 2-(methyl)naphthyl (Nap) and 2-(azidomethyl)benzoyl (Azmb) protecting groups at the C2 and C3 hydroxyls of the xylose building blocks. The arabinoxylan oligosaccharides are excellent tools to map the active site of glycosyl hydrolases involved in xylan deconstruction. The substrate specificities of several xylanases and arabinofuranosidases were determined by analyzing the digestion products after incubation of the oligosaccharides with glycosyl hydrolases. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Space station automation of common module power management and distribution

    NASA Technical Reports Server (NTRS)

    Miller, W.; Jones, E.; Ashworth, B.; Riedesel, J.; Myers, C.; Freeman, K.; Steele, D.; Palmer, R.; Walsh, R.; Gohring, J.

    1989-01-01

    The purpose is to automate a breadboard-level Power Management and Distribution (PMAD) system which possesses many functional characteristics of a specified Space Station power system. The automation system was built upon a 20 kHz AC source with redundancy of the power buses. There are two power distribution control units which furnish power to six load centers, which in turn enable load circuits based upon a system-generated schedule. The progress in building this specified autonomous system is described. Automation of Space Station Module PMAD was accomplished by segmenting the complete task into the following four independent tasks: (1) develop a detailed approach for PMAD automation; (2) define the software and hardware elements of automation; (3) develop the automation system for the PMAD breadboard; and (4) select an appropriate host processing environment.

  9. An Automated Energy Detection Algorithm Based on Kurtosis-Histogram Excision

    DTIC Science & Technology

    2018-01-01

    ARL-TR-8269, US Army Research Laboratory, January 2018.

  10. SU-G-201-03: Automation of High Dose Rate Brachytherapy Quality Assurance: Development of a Radioluminescent Detection System for Simultaneous Detection of Activity, Timing, and Positioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jenkins, C; Xing, L; Fahimian, B

    Purpose: Accuracy of positioning, timing and activity is of critical importance for High Dose Rate (HDR) brachytherapy delivery. Respective measurements via film autoradiography, stop-watches and well chambers can be cumbersome, crude or lack dynamic source evaluation capabilities. To address such limitations, a single-device radioluminescent detection system enabling automated real-time quantification of activity, position and timing accuracy is presented and experimentally evaluated. Methods: A radioluminescent sheet was fabricated by mixing Gd₂O₂S:Tb with PDMS and incorporated into a 3D printed device where it was fixated below a CMOS digital camera. An Ir-192 HDR source (VS2000, VariSource iX) with an effective active length of 5 mm was introduced using a 17-gauge stainless steel needle below the sheet. Pixel intensity values for determining activity were taken from an ROI centered on the source location. A calibration curve relating intensity values to activity was generated and used to evaluate automated activity determination with data gathered over 6 weeks. Positioning measurements were performed by integrating images for an entire delivery and fitting peaks to the resulting profile. Timing measurements were performed by evaluating source location and timestamps from individual images. Results: Average predicted activity error over 6 weeks was 0.35 ± 0.5%. The distance between four dwell positions was determined by the automated system to be 1.99 ± 0.02 cm. The result from autoradiography was 2.00 ± 0.03 cm. The system achieved a time resolution of 10 ms and determined the dwell time to be 1.01 ± 0.02 s. Conclusion: The system was able to successfully perform automated detection of activity, positioning and timing concurrently under a single setup. Relative to radiochromic and radiographic film-based autoradiography, which can only provide a static evaluation of positioning, optical detection of temporary radiation-induced luminescence enables dynamic detection of position, enabling automated quantification of timing with millisecond accuracy.

  11. Part-task training in the context of automation: current and future directions.

    PubMed

    Gutzwiller, Robert S; Clegg, Benjamin A; Blitch, John G

    2013-01-01

    Automation often elicits a divide-and-conquer outlook. By definition, automation has been suggested to assume control over a part or whole task that was previously performed by a human (Parasuraman & Riley, 1997). When such notions of automation are taken as grounds for training, they readily invoke a part-task training (PTT) approach. This article outlines broad functions of automation as a source of PTT and reviews the PTT literature, focusing on the potential benefits and costs related to using automation as a mechanism for PTT. The article reviews some past work in this area and suggests a path to move beyond the type of work captured by the "automation as PTT" framework. An illustrative experiment shows how automation in training and PTT are actually separable issues. PTT with automation has some utility but ultimately remains an unsatisfactory framework for the future broad potential of automation during training, and we suggest that a new conceptualization is needed.

  12. Theory for the Emergence of Modularity in Complex Systems

    NASA Astrophysics Data System (ADS)

    Deem, Michael; Park, Jeong-Man

    2013-03-01

    Biological systems are modular, and this modularity evolves over time and in different environments. A number of observations have been made of increased modularity in biological systems under increased environmental pressure. We here develop a theory for the dynamics of modularity in these systems. We find a principle of least action for the evolved modularity at long times. In addition, we find a fluctuation dissipation relation for the rate of change of modularity at short times. We discuss a number of biological and social systems that can be understood with this framework. The modularity of the protein-protein interaction network increases when yeast are exposed to heat shock, and the modularity of the protein-protein networks in both yeast and E. coli appears to have increased over evolutionary time. Food webs in low-energy, stressful environments are more modular than those in plentiful environments, arid ecologies are more modular during droughts, and foraging of sea otters is more modular when food is limiting. The modularity of social networks changes over time: stock brokers' instant-messaging networks are more modular under stressful market conditions, criminal networks are more modular under increased police pressure, and world trade network modularity has decreased.
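    The network modularity surveyed above is commonly quantified with Newman's Q, the fraction of edges falling within modules minus the fraction expected at random; a short networkx illustration (our own, not the paper's analytical theory):

```python
# Detect communities and score the partition with Newman's modularity Q.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities, modularity

G = nx.karate_club_graph()                       # stand-in interaction network
communities = greedy_modularity_communities(G)   # detected modules
Q = modularity(G, communities)
print(f"{len(communities)} modules, Q = {Q:.3f}")  # higher Q = more modular
```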

  13. MVO Automation Platform: Addressing Unmet Needs in Clinical Laboratories with Microcontrollers, 3D Printing, and Open-Source Hardware/Software.

    PubMed

    Iglehart, Brian

    2018-05-01

    Laboratory automation improves test reproducibility, which is vital to patient care in clinical laboratories. Many small and specialty laboratories are excluded from the benefits of automation due to low sample number, cost, space, and/or lack of automation expertise. The Minimum Viable Option (MVO) automation platform was developed to address these hurdles and fulfill an unmet need. Consumer 3D printing enabled rapid iterative prototyping to allow for a variety of instrumentation and assay setups and procedures. Three MVO versions have been produced. MVOv1.1 successfully performed part of a clinical assay, and results were comparable to those of commercial automation. Raspberry Pi 3 Model B (RPI3) single-board computers with Sense Hardware Attached on Top (HAT) and Raspberry Pi Camera Module V2 hardware were remotely accessed and evaluated for their suitability to qualify the latest MVOv1.2 platform. Sense HAT temperature, barometric pressure, and relative humidity sensors were stable in climate-controlled environments and are useful in identifying appropriate laboratory spaces for automation placement. The RPI3 with camera plus a digital dial indicator logged axis-travel experiments. The RPI3 with camera and Sense HAT as a light source showed promise when used for photometric dispensing tests. Individual well standard curves were necessary for well-to-well light and path-length compensations.
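    The climate measurements described above map directly onto the standard Sense HAT Python API; a minimal logging sketch (the sampling interval and file name are our choices):

```python
# Log Sense HAT climate readings once a minute to a CSV file.
import time
from sense_hat import SenseHat   # requires a Raspberry Pi with a Sense HAT

sense = SenseHat()
with open("lab_climate.csv", "a") as log:
    for _ in range(10):
        t = sense.get_temperature()   # degrees Celsius
        p = sense.get_pressure()      # millibars
        h = sense.get_humidity()      # percent relative humidity
        log.write(f"{time.time():.0f},{t:.1f},{p:.1f},{h:.1f}\n")
        time.sleep(60)
```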

  14. Model and Interoperability using Meta Data Annotations

    NASA Astrophysics Data System (ADS)

    David, O.

    2011-12-01

    Software frameworks and architectures are in need of meta data to efficiently support model integration. Modelers have to know the context of a model, often stepping into modeling semantics and auxiliary information usually not provided in a concise structure and universal format consumable by a range of (modeling) tools. XML often seems the obvious solution for capturing meta data, but its wide adoption to facilitate model interoperability is limited by XML schema fragmentation, complexity, and verbosity outside of a data-automation process. Ontologies seem to overcome those shortcomings; however, the practical significance of their use remains to be demonstrated. OMS version 3 took a different approach to meta data representation. The fundamental building block of a modular model in OMS is a software component representing a single physical process, calibration method, or data access approach. Here, programming language features known as Annotations or Attributes were adopted. Within other (non-modeling) frameworks it has been observed that annotations lead to cleaner and leaner application code. Framework-supported model integration, traditionally accomplished using Application Programming Interface (API) calls, is now achieved using descriptive code annotations. Fully annotated components for various hydrological and Ag-system models now provide information directly for (i) model assembly and building, (ii) data flow analysis for implicit multi-threading or visualization, (iii) automated and comprehensive model documentation of component dependencies and physical data properties, (iv) automated model and component testing, calibration, and optimization, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Such a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework but a strong reference to its originating code. Since models and modeling components are not directly bound to the framework by the use of specific APIs and/or data types, they can more easily be reused both within the framework and outside of it. While providing all those capabilities, a significant reduction in the size of the model source code was achieved. To assess the benefit of annotations for a modeler, studies were conducted to compare an annotation-based framework approach with other modeling frameworks and libraries, and a framework-invasiveness study was conducted to evaluate the effects of framework design on model code quality. A typical hydrological model was implemented across several modeling frameworks and several software metrics were collected. The metrics selected were measures of non-invasive design methods for modeling frameworks from a software engineering perspective. It appears that the use of annotations positively impacts several software quality measures. Experience to date has demonstrated the multi-purpose value of using annotations. Annotations are also a feasible and practical method to enable interoperability among models and modeling frameworks.
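    The annotation idea can be mimicked in Python with a decorator that attaches metadata to a component class, which a framework could then inspect instead of requiring API calls; OMS3 itself uses Java annotations, so everything below is an illustrative analogy:

```python
# Declarative component metadata: the framework reads __meta__ rather than
# forcing the component to call framework APIs.
def describe(**meta):
    def wrap(cls):
        cls.__meta__ = meta
        return cls
    return wrap

@describe(process="infiltration", inputs={"rain": "mm/h"}, outputs={"q": "mm/h"})
class Infiltration:
    def execute(self, rain):
        return max(0.0, rain - 5.0)      # toy process logic

# Assembly, documentation, or testing tools can now introspect the component:
print(Infiltration.__meta__["process"])
```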

  15. An online open-source tool for automated quantification of liver and myocardial iron concentrations by T2* magnetic resonance imaging.

    PubMed

    Git, K-A; Fioravante, L A B; Fernandes, J L

    2015-09-01

    To assess whether an online open-source tool would provide accurate calculations of T2* values for iron concentrations in the liver and heart compared with a standard reference software. An online open-source tool, written in pure HTML5/Javascript, was tested in 50 patients (age 26.0 ± 18.9 years, 46% males) who underwent T2* MRI of the liver and heart for iron overload assessment as part of their routine workup. Automated truncation correction was the default with optional manual adjustment provided if needed. The results were compared against a standard reference measurement using commercial software with manual truncation (CVI42® v. 5.1; Circle Cardiovascular Imaging; Calgary, AB). The mean liver T2* value calculated with the automated tool was 4.3 ms [95% confidence interval (CI) 3.1 to 5.5 ms] vs 4.26 ms using the reference software (95% CI 3.1 to 5.4 ms), without any significant differences (p = 0.71). In the liver, the mean difference was 0.036 ms (95% CI -0.1609 to 0.2329 ms) with a regression correlation coefficient of 0.97. For the heart, the automated T2* value was 26.0 ms (95% CI 22.9 to 29.0 ms) vs 25.3 ms (95% CI 22.3 to 28.3 ms), p = 0.28. The mean difference was 0.72 ms (95% CI 0.08191 to 1.3621 ms) with a correlation coefficient of 0.96. The automated online tool provides similar T2* values for liver and myocardial iron concentrations as compared with a standard reference software. The online program provides an open-source tool for the calculation of T2* values, incorporating an automated correction algorithm in a simple and easy-to-use interface.
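    The signal model behind T2* mapping is the mono-exponential decay S(TE) = S0·exp(−TE/T2*); a fitting sketch with synthetic echo data (the online tool additionally automates truncation of noise-floor echoes, which is omitted here):

```python
# Fit S(TE) = S0 * exp(-TE / T2*) to multi-echo magnitude data.
import numpy as np
from scipy.optimize import curve_fit

def model(te, s0, t2star):
    return s0 * np.exp(-te / t2star)

te  = np.array([1.0, 2.5, 4.0, 5.5, 7.0, 8.5])      # echo times, ms
sig = np.array([980, 680, 475, 330, 230, 160.0])    # synthetic ROI means

(s0, t2star), _ = curve_fit(model, te, sig, p0=(1000.0, 5.0))
print(f"T2* = {t2star:.2f} ms")   # ~4 ms here; short liver T2* implies iron overload
```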

  16. OLED area illumination source

    DOEpatents

    Foust, Donald Franklin [Scotia, NY; Duggal, Anil Raj [Niskayuna, NY; Shiang, Joseph John [Niskayuna, NY; Nealon, William Francis [Gloversville, NY; Bortscheller, Jacob Charles [Clifton Park, NY

    2008-03-25

    The present invention relates to an area illumination light source comprising a plurality of individual OLED panels. The individual OLED panels are configured in a physically modular fashion. Each OLED panel comprises a plurality of OLED devices. Each OLED panel comprises a first electrode and a second electrode such that the power being supplied to each individual OLED panel may be varied independently. A power supply unit capable of delivering varying levels of voltage simultaneously to the first and second electrodes of each of the individual OLED panels is also provided. The area illumination light source also comprises a mount within which the OLED panels are arrayed.

  17. Development of a versatile multiaperture negative ion source.

    PubMed

    Cavenago, M; Kulevoy, T; Petrenko, S; Serianni, G; Antoni, V; Bigi, M; Fellin, F; Recchia, M; Veltri, P

    2012-02-01

    A 60 kV ion source (9 beamlets of 15 mA each of H−) and plasma generators are being developed at Consorzio RFX and INFN-LNL, for their versatility in experimental campaigns and for training. Unlike most experimental sources, the design aimed at continuous operation. The magnetic configuration can achieve a minimum ∣B∣ trap, smoothly merged with the extraction filter. Modular design allows for quick substitution and upgrading of parts such as the extraction and postacceleration grids or the electrodes in contact with plasma. Experiments with a radio frequency plasma generator and a Faraday cage inside the plasma are also described.

  18. Open-Source Assisted Laboratory Automation through Graphical User Interfaces and 3D Printers: Application to Equipment Hyphenation for Higher-Order Data Generation.

    PubMed

    Siano, Gabriel G; Montemurro, Milagros; Alcaráz, Mirta R; Goicoechea, Héctor C

    2017-10-17

    Higher-order data generation implies some automation challenges, which are mainly related to the hidden programming languages and electronic details of the equipment. When techniques and/or equipment hyphenation are the key to obtaining higher-order data, the required simultaneous control of them demands funds for new hardware, software, and licenses, in addition to very skilled operators. In this work, we present Design of Inputs-Outputs with Sikuli (DIOS), a free and open-source code program that provides a general framework for the design of automated experimental procedures without prior knowledge of programming or electronics. Basically, instruments and devices are considered as nodes in a network, and every node is associated both with physical and virtual inputs and outputs. Virtual components, such as graphical user interfaces (GUIs) of equipment, are handled by means of image recognition tools provided by the Sikuli scripting language, while handling of their physical counterparts is achieved using an adapted open-source three-dimensional (3D) printer. Two previously reported experiments of our research group, related to fluorescence matrices derived from kinetics and high-performance liquid chromatography, were adapted to be carried out in a more automated fashion. Satisfactory results, in terms of analytical performance, were obtained. Similarly, advantages derived from open-source tools assistance could be appreciated, mainly in terms of less operator intervention and cost savings.
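    A taste of the image-recognition GUI scripting that DIOS builds on Sikuli, sketched here with the pyautogui library instead; the screenshot file and field contents are hypothetical:

```python
# Drive an instrument's GUI the way an operator would: find a button by its
# screenshot, click it, and type into the focused field.
import pyautogui

# Depending on the pyautogui version, a failed match returns None or raises.
button = pyautogui.locateCenterOnScreen("start_button.png")
if button is None:
    raise RuntimeError("acquisition software window not visible")
pyautogui.click(button)            # press the (hypothetical) Start button
pyautogui.typewrite("run_001")     # fill in a run-name field
pyautogui.press("enter")
```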

  19. Open-Source Automated Mapping Four-Point Probe

    PubMed Central

    Chandra, Handy; Allen, Spencer W.; Oberloier, Shane W.; Bihari, Nupur; Gwamuri, Jephias; Pearce, Joshua M.

    2017-01-01

    Scientists have begun using self-replicating rapid prototyper (RepRap) 3-D printers to manufacture open source digital designs of scientific equipment. This approach is refined here to develop a novel instrument capable of performing automated large-area four-point probe measurements. The designs for conversion of a RepRap 3-D printer to a 2-D open source four-point probe (OS4PP) measurement device are detailed for the mechanical and electrical systems. Free and open source software and firmware are developed to operate the tool. The OS4PP was validated against a wide range of discrete resistors and indium tin oxide (ITO) samples of different thicknesses both pre- and post-annealing. The OS4PP was then compared to two commercial proprietary systems. Results for resistors from 10 Ω to 1 MΩ show errors of less than 1% for the OS4PP. The 3-D mapping of sheet resistance of ITO samples successfully demonstrated the automated capability to measure non-uniformities in large-area samples. The results indicate that all measured values are within the same order of magnitude when compared to two proprietary measurement systems. In conclusion, the OS4PP system, which costs less than 70% of manual proprietary systems, is comparable electrically while offering automated 100 micron positional accuracy for measuring sheet resistance over larger areas. PMID:28772471
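    For a film much thinner and wider than the probe spacing, the sheet resistance measured by a collinear four-point probe follows the standard relation Rs = (π/ln 2)·(V/I) ≈ 4.532·V/I; a worked sketch (the example numbers are ours):

```python
# Thin-sheet four-point probe relation: R_sheet = (pi / ln 2) * V / I.
import math

def sheet_resistance(v_volts, i_amps):
    return (math.pi / math.log(2)) * v_volts / i_amps   # ohms per square

# Example: force 1.0 mA, measure 22.1 mV -> ~100 ohm/sq (typical ITO film)
print(f"{sheet_resistance(0.0221, 1.0e-3):.1f} ohm/sq")
```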

  20. Modular Aquatic Simulation System 1D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2017-04-19

    MASS1 simulates open channel hydrodynamics and transport in branched channel networks, using cross-section averaged forms of the continuity, momentum, and convection-diffusion equations. Thermal energy transport (temperature), including meteorological influences, is supported. The thermodynamics of total dissolved gas (TDG) can be directly simulated. MASS1 has been developed over the last 20 years. It is currently being used on DOE projects that require MASS1 to be open source. Hence, the authors would like to distribute MASS1 in source form.

  1. Hire Education: Mastery, Modularization, and the Workforce Revolution

    ERIC Educational Resources Information Center

    Weise, Michelle R.; Christensen, Clayton M.

    2014-01-01

    The economic urgency around higher education is undeniable: the price of tuition has soared; student loan debt now exceeds $1 trillion and is greater than credit card debt; the dollars available from government sources for colleges are expected to shrink in the years to come; and the costs for traditional institutions to stay competitive continue…

  2. Technology survey of computer software as applicable to the MIUS project

    NASA Technical Reports Server (NTRS)

    Fulbright, B. E.

    1975-01-01

    Existing computer software, available from either governmental or private sources, applicable to modular integrated utility system program simulation is surveyed. Several programs and subprograms are described to provide a consolidated reference, and a bibliography is included. The report covers the two broad areas of design simulation and system simulation.

  3. A Modular Approach to a Library of Semi-Synthetic Fucosylated Chondroitin Sulfate Polysaccharides with Different Sulfation and Fucosylation Patterns.

    PubMed

    Laezza, Antonio; Iadonisi, Alfonso; Pirozzi, Anna V A; Diana, Paola; De Rosa, Mario; Schiraldi, Chiara; Parrilli, Michelangelo; Bedini, Emiliano

    2016-12-12

    Fucosylated chondroitin sulfate (fCS)-a glycosaminoglycan (GAG) found in sea cucumbers-has recently attracted much attention owing to its biological properties. In particular, a low molecular mass fCS polysaccharide has very recently been suggested as a strong candidate for the development of an antithrombotic drug that would be safer and more effective than heparin. To avoid the use of animal-sourced drugs, here we present the chemical transformation of a microbial sourced unsulfated chondroitin polysaccharide into a small library of fucosylated (and sulfated) derivatives thereof. To this aim, a modular approach based on the different combination of only five reactions was employed, with an almost unprecedented polysaccharide branching by O-glycosylation as the key step. The library was differentiated for sulfation patterns and/or positions of the fucose branches, as confirmed by detailed 2D NMR spectroscopic analysis. These semi-synthetic polysaccharides will allow a wider and more accurate structure-activity relationship study with respect to those reported in literature to date. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Initial Experience of the Application of Automated Tube Potential Selection Technique in High-pitch Dual-source CT Angiography of Whole Aorta Using Third-generation Dual-source CT Scanner.

    PubMed

    Kong, Lingyan; Liang, Jixiang; Xue, Huadan; Wang, Yining; Wang, Yun; Jin, Zhengyu; Zhang, Daming; Chen, Jin

    2017-02-20

    Objective To evaluate the application of the automated tube potential selection technique in high-pitch dual-source CT aortic angiography on a third-generation dual-source CT scanner. Methods Whole-aorta angiography was indicated in 59 patients, who were divided into 2 groups using a simple random method: in group 1, 31 patients underwent the examination with automated tube potential selection using a vascular setting with a preferred image quality of 288 mA/100 kV; in group 2, 28 patients underwent the examination with a tube voltage of 100 kV and automated tube current modulation using a reference tube current of 288 mA. Both groups were scanned on a third-generation dual-source CT device operated in dual-source high-pitch ECG-gating mode with a pitch of 3.0, collimation of 2×192×0.6 mm, and a rotation time of 0.25 s. An iterative reconstruction algorithm was used. For group 1, the volume and flow of contrast medium and chasing saline were adapted to the tube voltage. For group 2, a contrast material bolus of 45 ml with a flow of 4.5 ml/s followed by a 50 ml saline chaser at 5 ml/s was used. The CTA scan was automatically started using a bolus-tracking technique at the level of the origin of the aorta after a trigger threshold of 100 HU was reached. The start delay was set to 6 s in both groups. Effective dose (ED), signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR), and subjective diagnostic quality of both groups were evaluated. Results The mean ED was 21.3% lower (t=-3.099, P=0.000) in group 1 [(2.48±0.80) mSv] than in group 2 [(3.15±0.86) mSv]. The two groups showed no significant difference in attenuation, SD, SNR, or CNR at any of the evaluated aortic levels (ascending aorta, aortic arch, diaphragmatic aorta, or iliac bifurcation) (all P>0.05). There was no significant difference in subjective diagnostic quality scores between the two groups [(1.41±0.50) vs. (1.39±0.50); W=828.5, P=0.837]. Conclusion Compared with automated tube current modulation, the automated tube potential selection technique in aortic CT angiography on a third-generation dual-source CT can dramatically reduce radiation dose without affecting image quality.

  5. Certification of tactics and strategies in aviation

    NASA Technical Reports Server (NTRS)

    Koelman, Hartmut

    1994-01-01

    The paper suggests that the 'tactics and strategies' notion is a highly suitable paradigm to describe the cognitive involvement of human operators in advanced aviation systems (far more suitable than classical functional analysis), and that the workload and situational awareness of operators are intimately associated with the planning and execution of their tactics and strategies. If system designers have muddled views about the collective tactics and strategies to be used during operation, they will produce sub-optimum designs. If operators use unproven and/or inappropriate tactics and strategies, the system may fail. The author wants to make a point that, beyond certification of people or system designs, there may be a need to go into more detail and examine (certify?) the set of tactics and strategies (i.e., the Operational Concept) which makes the people and systems perform as expected. The collective tactics and strategies determine the information flows and situational awareness which exists in organizations and composite human-machine systems. The available infrastructure and equipment (automation) enable these information flows and situational awareness, but are at the same time the constraining factor. Frequently, the tactics and strategies are driven by technology, whereas we would rather like to see a system designed to support an optimized Operational Concept, i.e., to support a sufficiently coherent, cooperative and modular set of anticipation and planning mechanisms. Again, in line with the view of MacLeod and Taylor (1993), this technology driven situation may be caused by the system designer's and operator job designer's over-emphasis on functional analysis (a mechanistic engineering concept), at the expense of a subject which does not seem to be well understood today: the role of the (human cognitive and/or automated) tactics and strategies which are embedded in composite human-machine systems. Research would be needed to arrive at a generally accepted 'planning theory' which can elevate the analysis, description and design of tactics and strategies from today's cottage industry methods to an engineering discipline.

  6. Automated Power-Distribution System

    NASA Technical Reports Server (NTRS)

    Thomason, Cindy; Anderson, Paul M.; Martin, James A.

    1990-01-01

    Automated power-distribution system monitors and controls electrical power to modules in network. Handles both 208-V, 20-kHz single-phase alternating current and 120- to 150-V direct current. Power distributed to load modules from power-distribution control units (PDCU's) via subsystem distributors. Ring busses carry power to PDCU's from power source. Needs minimal attention. Detects faults and also protects against them. Potential applications include autonomous land vehicles and automated industrial process systems.

  7. Automated Sneak Circuit Analysis Technique

    DTIC Science & Technology

    1990-06-01

    [OCR-damaged DTIC record. Recoverable information: 'Automated Sneak Circuit Analysis Technique', RADC, June 1990, Systems Reliability & Engineering Division, Rome Air Development Center (RADC 94-14062). The surviving fragment notes that the terminals of all in-circuit voltage sources (e.g., batteries) must be labeled using the OrCAD/SDT module port facility.]

  8. A Graphics Processing Unit Accelerated Motion Correction Algorithm and Modular System for Real-time fMRI

    PubMed Central

    Scheinost, Dustin; Hampson, Michelle; Qiu, Maolin; Bhawnani, Jitendra; Constable, R. Todd; Papademetris, Xenophon

    2013-01-01

    Real-time functional magnetic resonance imaging (rt-fMRI) has recently gained interest as a possible means to facilitate the learning of certain behaviors. However, rt-fMRI is limited by processing speed and available software, and continued development is needed for rt-fMRI to progress further and become feasible for clinical use. In this work, we present an open-source rt-fMRI system for biofeedback powered by a novel Graphics Processing Unit (GPU) accelerated motion correction strategy as part of the BioImage Suite project (www.bioimagesuite.org). Our system contributes to the development of rt-fMRI by presenting a motion correction algorithm that provides an estimate of motion with essentially no processing delay as well as a modular rt-fMRI system design. Using empirical data from rt-fMRI scans, we assessed the quality of motion correction in this new system. The present algorithm performed comparably to standard (non real-time) offline methods and outperformed other real-time methods based on zero order interpolation of motion parameters. The modular approach to the rt-fMRI system allows the system to be flexible to the experiment and feedback design, a valuable feature for many applications. We illustrate the flexibility of the system by describing several of our ongoing studies. Our hope is that continuing development of open-source rt-fMRI algorithms and software will make this new technology more accessible and adaptable, and will thereby accelerate its application in the clinical and cognitive neurosciences. PMID:23319241

  9. A graphics processing unit accelerated motion correction algorithm and modular system for real-time fMRI.

    PubMed

    Scheinost, Dustin; Hampson, Michelle; Qiu, Maolin; Bhawnani, Jitendra; Constable, R Todd; Papademetris, Xenophon

    2013-07-01

    Real-time functional magnetic resonance imaging (rt-fMRI) has recently gained interest as a possible means to facilitate the learning of certain behaviors. However, rt-fMRI is limited by processing speed and available software, and continued development is needed for rt-fMRI to progress further and become feasible for clinical use. In this work, we present an open-source rt-fMRI system for biofeedback powered by a novel Graphics Processing Unit (GPU) accelerated motion correction strategy as part of the BioImage Suite project ( www.bioimagesuite.org ). Our system contributes to the development of rt-fMRI by presenting a motion correction algorithm that provides an estimate of motion with essentially no processing delay as well as a modular rt-fMRI system design. Using empirical data from rt-fMRI scans, we assessed the quality of motion correction in this new system. The present algorithm performed comparably to standard (non real-time) offline methods and outperformed other real-time methods based on zero order interpolation of motion parameters. The modular approach to the rt-fMRI system allows the system to be flexible to the experiment and feedback design, a valuable feature for many applications. We illustrate the flexibility of the system by describing several of our ongoing studies. Our hope is that continuing development of open-source rt-fMRI algorithms and software will make this new technology more accessible and adaptable, and will thereby accelerate its application in the clinical and cognitive neurosciences.

  10. AutoLens: Automated Modeling of a Strong Lens's Light, Mass and Source

    NASA Astrophysics Data System (ADS)

    Nightingale, J. W.; Dye, S.; Massey, Richard J.

    2018-05-01

    This work presents AutoLens, the first entirely automated modeling suite for the analysis of galaxy-scale strong gravitational lenses. AutoLens simultaneously models the lens galaxy's light and mass whilst reconstructing the extended source galaxy on an adaptive pixel-grid. The method's approach to source-plane discretization is amorphous, adapting its clustering and regularization to the intrinsic properties of the lensed source. The lens's light is fitted using a superposition of Sersic functions, allowing AutoLens to cleanly deblend its light from the source. Single-component mass models representing the lens's total mass density profile are demonstrated, which in conjunction with light modeling can detect central images using a centrally cored profile. Decomposed mass modeling is also shown, which can fully decouple a lens's light and dark matter and determine whether the two components are geometrically aligned. The complexity of the light and mass models is automatically chosen via Bayesian model comparison. These steps form AutoLens's automated analysis pipeline, such that all results in this work are generated without any user intervention. This is rigorously tested on a large suite of simulated images, assessing its performance on a broad range of lens profiles, source morphologies and lensing geometries. The method's performance is excellent, with accurate light, mass and source profiles inferred for data sets representative of both existing Hubble imaging and future Euclid wide-field observations.
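
    The Sersic profile named above as AutoLens's light-model component has a standard closed form. A minimal sketch follows (generic, not AutoLens's actual API); the b_n approximation is the standard Ciotti & Bertin fit, valid for roughly 0.5 < n < 10:

    ```python
    import numpy as np

    def sersic_profile(r, i_e, r_e, n):
        """Sersic surface brightness I(R) = I_e * exp(-b_n * ((R/R_e)**(1/n) - 1)).

        i_e: intensity at the effective radius r_e; n: Sersic index
        (n=1 exponential disk, n=4 de Vaucouleurs bulge).
        """
        b_n = 1.9992 * n - 0.3271  # standard approximation for 0.5 < n < 10
        return i_e * np.exp(-b_n * ((r / r_e) ** (1.0 / n) - 1.0))

    # A two-component superposition, as used for deblending lens light:
    r = np.linspace(0.1, 10.0, 5)
    total = sersic_profile(r, 1.0, 1.0, 4.0) + sersic_profile(r, 0.3, 3.0, 1.0)
    print(total)
    ```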

  11. Concept and set-up of an IR-gas sensor construction kit

    NASA Astrophysics Data System (ADS)

    Sieber, Ingo; Perner, Gernot; Gengenbach, Ulrich

    2015-10-01

    The paper presents an approach to a cost-efficient, modularly built non-dispersive infrared (NDIR) gas sensor based on a construction kit. The modularity of the approach offers several advantages. First of all, it allows the performance of the gas sensor to be adapted to individual specifications by choosing suitable modular components. The sensitivity of the sensor, for example, can be altered by selecting a source which emits a favorable wavelength spectrum with respect to the absorption spectrum of the gas to be measured, or by tuning the measuring distance (the ray path inside the medium to be measured). Furthermore, the developed approach is very well suited for use in teaching. Together with students, a construction kit based on an optical free-space system was developed and partly implemented, to be further used as a teaching and training aid for bachelor and master students at our institute. The components of the construction kit are interchangeable and freely fixable on a base plate. The components are classified into five groups: sources, reflectors, detectors, gas feed, and analysis cell. The choice of source and detector and the positions of the components are fundamental to experimenting with and testing different configurations and beam paths. The reflectors are implemented by an aluminum-coated adhesive foil mounted onto a support structure fabricated by additive manufacturing. This approach allows derivation of the reflecting surface geometry from the optical design tool and generation of the 3D-printing files by applying related design rules. The rapid fabrication process and the adjustment of the modules on the base plate allow rapid, almost LEGO®-like, experimental assessment of design ideas. The subject of this paper is the modeling, design, and optimization of the reflective optical components, as well as of the optical subsystem. The realization of a sample set-up used as a teaching aid and the optical measurement of the beam path in comparison to the simulation results are shown as well.
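
    The sensitivity gain from a longer ray path follows directly from the Beer-Lambert law, which is the physical basis of any NDIR sensor. A minimal sketch (all numerical values are illustrative, not measurements from the kit):

    ```python
    import numpy as np

    def transmitted_fraction(mole_fraction, path_length_m, absorption_coeff):
        """Beer-Lambert attenuation: I/I0 = exp(-alpha * c * L)."""
        return np.exp(-absorption_coeff * mole_fraction * path_length_m)

    # A longer ray path inside the analysis cell deepens the absorption dip,
    # i.e. raises sensitivity (alpha = 500 per metre per mole fraction is
    # purely illustrative):
    for L in (0.02, 0.05, 0.10):  # metres
        depth = 1.0 - transmitted_fraction(400e-6, L, 500.0)
        print(f"path {L*100:4.0f} cm: absorption depth {depth:.3f}")
    ```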

  12. Improving linear transport infrastructure efficiency by automated learning and optimised predictive maintenance techniques (INFRALERT)

    NASA Astrophysics Data System (ADS)

    Jiménez-Redondo, Noemi; Calle-Cordón, Alvaro; Kandler, Ute; Simroth, Axel; Morales, Francisco J.; Reyes, Antonio; Odelius, Johan; Thaduri, Aditya; Morgado, Joao; Duarte, Emmanuele

    2017-09-01

    The ongoing H2020 project INFRALERT aims to increase rail and road infrastructure capacity in the current framework of increased transportation demand by developing and deploying solutions to optimise the planning of maintenance interventions. It includes two real pilots for road and railway infrastructure. INFRALERT develops an ICT platform (the expert-based Infrastructure Management System, eIMS) which follows a modular approach including several expert-based toolkits. This paper presents the methodologies and preliminary results of the toolkits for i) nowcasting and forecasting of asset condition, ii) alert generation, iii) RAMS & LCC analysis and iv) decision support. The results of these toolkits for a meshed road network in Portugal under the jurisdiction of Infraestruturas de Portugal (IP) are presented, showing the capabilities of the approaches.

  13. Space Biology Initiative. Trade Studies, volume 2

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The six studies which are the subjects of this report are entitled: Design Modularity and Commonality; Modification of Existing Hardware (COTS) vs. New Hardware Build Cost Analysis; Automation Cost vs. Crew Utilization; Hardware Miniaturization versus Cost; Space Station Freedom/Spacelab Modules Compatibility vs. Cost; and Prototype Utilization in the Development of Space Hardware. The product of these six studies was intended to provide a knowledge base and methodology that enables equipment produced for the Space Biology Initiative program to meet specific design and functional requirements in the most efficient and cost-effective form consistent with overall mission integration parameters. Each study promulgates rules of thumb, formulas, and matrices that serve as a handbook for the use and guidance of designers and engineers in the design, development, and procurement of Space Biology Initiative (SBI) hardware and software.

  14. CD-based image archival and management on a hybrid radiology intranet.

    PubMed

    Cox, R D; Henri, C J; Bret, P M

    1997-08-01

    This article describes the design and implementation of a low-cost image archival and management solution on a radiology network consisting of UNIX, IBM personal computer-compatible (IBM, Purchase, NY) and Macintosh (Apple Computer, Cupertino, CA) workstations. The picture archiving and communications system (PACS) is modular, scalable, and conforms to the Digital Imaging and Communications in Medicine (DICOM) 3.0 standard for image transfer, storage and retrieval. Image data are made available on soft-copy reporting workstations by a work-flow management scheme and on desktop computers through a World Wide Web (WWW) interface. Data archival is based on recordable compact disc (CD) technology and is automated. The project has allowed the radiology department to eliminate the use of film in magnetic resonance (MR) imaging, computed tomography (CT) and ultrasonography.

  15. A laboratory for life sciences research in space

    NASA Technical Reports Server (NTRS)

    Williams, B. A.; Klein, H. P.

    1982-01-01

    Biological studies hardware for Spacelab flights are described. The research animal holding facility has modular construction and is installed on a single ESA rack. A biotelemetry system will provide body temperature and EKG/heart rate data from a radio transmitter surgically implanted in the animals' stomachs. A plant growth unit (PGU) will be used to study micro-g plant lignin growth. The PGU is automated and can carry as many as 96 plants. A general purpose work station (GPWS) biohazard cabinet will be flown on Spacelab 4 to control liquid and chemical vapors released during experimentation. Spacelab 4 will be the premier flight of actual animal studies comprising measurements of hematology, muscle biochemistry, blood circulation, fluids and electrolytes, vestibular adaptation, etc., using rats and squirrel monkeys as subjects.

  16. A system for intelligent teleoperation research

    NASA Technical Reports Server (NTRS)

    Orlando, N. E.

    1983-01-01

    The Automation Technology Branch of NASA Langley Research Center is developing a research capability in the field of artificial intelligence, particularly as applicable to teleoperator/robotics development for remote space operations. As a testbed for experimentation in these areas, a system concept has been developed and is being implemented. This system, termed DAISIE (Distributed Artificially Intelligent System for Interacting with the Environment), interfaces the key processes of perception, reasoning, and manipulation by linking hardware sensors and manipulators to a modular artificial intelligence (AI) software system in a hierarchical control structure. Verification experiments have been performed: one experiment used a blocksworld database and planner embedded in the DAISIE system to intelligently manipulate a simple physical environment; the other implemented a joint-space collision avoidance algorithm. Continued system development is planned.

  17. Space Biology Initiative. Trade Studies, volume 1

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The six studies which are addressed are entitled: Design Modularity and Commonality; Modification of Existing Hardware (COTS) vs. New Hardware Build Cost Analysis; Automation Cost vs. Crew Utilization; Hardware Miniaturization versus Cost; Space Station Freedom/Spacelab Modules Compatibility vs. Cost; and Prototype Utilization in the Development of Space Hardware. The product of these six studies was intended to provide a knowledge base and methodology that enables equipment produced for the Space Biology Initiative program to meet specific design and functional requirements in the most efficient and cost-effective form consistent with overall mission integration parameters. Each study promulgates rules of thumb, formulas, and matrices that serve as a handbook for the use and guidance of designers and engineers in the design, development, and procurement of Space Biology Initiative (SBI) hardware and software.

  18. Testimony of Fred R. Mynatt before the Energy Research and Development Subcommittee of the Committee on Science, Space, and Technology, US House of Representatives. [Advanced fuel technology, gas-cooled reactor technology, and liquid metal-cooled reactor technology programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mynatt, F.R.

    1987-03-18

    This report provides a description of the statements submitted for the record to the Committee on Science, Space, and Technology of the United States House of Representatives. These statements describe three principal areas of activity of the Advanced Reactor Technology Program of the Department of Energy (DOE): advanced fuel cycle technology, modular high-temperature gas-cooled reactor technology, and liquid metal-cooled reactor technology. The statements also describe Oak Ridge National Laboratory's efforts in automated reactor control systems, robotics, materials and structural design, shielding, and international cooperation. (FI)

  19. An adaptive, object oriented strategy for base calling in DNA sequence analysis.

    PubMed Central

    Giddings, M C; Brumley, R L; Haker, M; Smith, L M

    1993-01-01

    An algorithm has been developed for the determination of nucleotide sequence from data produced in fluorescence-based automated DNA sequencing instruments employing the four-color strategy. This algorithm takes advantage of object oriented programming techniques for modularity and extensibility. The algorithm is adaptive in that data sets from a wide variety of instruments and sequencing conditions can be used with good results. Confidence values are provided on the base calls as an estimate of accuracy. The algorithm iteratively employs confidence determinations from several different modules, each of which examines a different feature of the data for accurate peak identification. Modules within this system can be added or removed for increased performance or for application to a different task. In comparisons with commercial software, the algorithm performed well. PMID:8233787
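
    The abstract describes combining per-module confidence determinations into a single base-call confidence. A minimal sketch of that idea (the module names, weights, and weighted-average rule are hypothetical illustrations, not the published algorithm):

    ```python
    def combined_confidence(module_scores, weights):
        """Weighted average of per-module confidence values in [0, 1]."""
        total = sum(weights.values())
        return sum(weights[m] * s for m, s in module_scores.items()) / total

    # Each module examines a different feature of the trace data:
    scores = {"peak_shape": 0.92, "spacing": 0.85, "color_ratio": 0.78}
    weights = {"peak_shape": 2.0, "spacing": 1.0, "color_ratio": 1.0}
    print(round(combined_confidence(scores, weights), 3))  # confidence for this call
    ```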

  20. Community structure in networks

    NASA Astrophysics Data System (ADS)

    Newman, Mark

    2004-03-01

    Many networked systems, including physical, biological, social, and technological networks, appear to contain ``communities'' -- groups of nodes within which connections are dense, but between which they are sparser. The ability to find such communities in an automated fashion could be of considerable use. Communities in a web graph for instance might correspond to sets of web sites dealing with related topics, while communities in a biochemical network or an electronic circuit might correspond to functional units of some kind. We present a number of new methods for community discovery, including methods based on ``betweenness'' measures and methods based on modularity optimization. We also give examples of applications of these methods to both computer-generated and real-world network data, and show how our techniques can be used to shed light on the sometimes dauntingly complex structure of networked systems.
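
    For reference, the modularity measure underlying the optimization methods mentioned here is Q = (1/2m) Σ_ij [A_ij − k_i k_j / 2m] δ(c_i, c_j), where A is the adjacency matrix, k_i the degree of node i, m the edge count, and δ compares community labels. A minimal sketch of a direct evaluation (the example graph is mine):

    ```python
    import numpy as np

    def modularity(adj, communities):
        """Newman-Girvan modularity Q for an undirected, unweighted graph."""
        a = np.asarray(adj, dtype=float)
        k = a.sum(axis=1)            # node degrees
        two_m = a.sum()              # 2m: each edge counted twice
        labels = np.asarray(communities)
        same = labels[:, None] == labels[None, :]
        return ((a - np.outer(k, k) / two_m) * same).sum() / two_m

    # Two triangles joined by a single edge: a clearly modular graph.
    adj = np.zeros((6, 6), int)
    for i, j in [(0,1), (0,2), (1,2), (3,4), (3,5), (4,5), (2,3)]:
        adj[i, j] = adj[j, i] = 1
    print(round(modularity(adj, [0, 0, 0, 1, 1, 1]), 3))  # 0.357
    ```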

  1. A cost and utility analysis of NIM/CAMAC standards and equipment for shuttle payload data acquisition and control systems. Volume 3: Tasks 3 and 4

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The modifications for the Nuclear Instrumentation Modular (NIM) and Computer Automated Measurement Control (CAMAC) equipment, designed for ground-based laboratory use, that would be required to permit its use in the Spacelab environments were determined. The cost of these modifications was estimated and the most cost-effective approach to implementing them was identified. A shared-equipment implementation, in which the various Spacelab users draw their required complement of standard NIM and CAMAC equipment for a given flight from a common equipment pool, was considered. The alternative approach studied was a dedicated-equipment implementation in which each of the users is responsible for procuring either their own NIM/CAMAC equipment or its custom-built equivalent.

  2. Reconfigurable, Cognitive Software-Defined Radio

    NASA Technical Reports Server (NTRS)

    Bhat, Arvind

    2015-01-01

    Software-defined radio (SDR) technology allows radios to be reconfigured to perform different communication functions without using multiple radios to accomplish each task. Intelligent Automation, Inc., has developed SDR platforms that switch adaptively between different operation modes. The innovation works by modifying both transmit waveforms and receiver signal processing tasks. In Phase I of the project, the company developed SDR cognitive capabilities, including adaptive modulation and coding (AMC), automatic modulation recognition (AMR), and spectrum sensing. In Phase II, these capabilities were integrated into SDR platforms. The reconfigurable transceiver design employs high-speed field-programmable gate arrays, enabling multimode operation and scalable architecture. Designs are based on commercial off-the-shelf (COTS) components and are modular in nature, making it easier to upgrade individual components rather than redesigning the entire SDR platform as technology advances.
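
    Adaptive modulation and coding (AMC), one of the cognitive capabilities listed above, amounts to selecting the densest constellation the measured channel supports. A minimal sketch (the SNR thresholds and mode table are illustrative, not Intelligent Automation's values):

    ```python
    # (minimum SNR in dB, modulation, code rate) -- illustrative thresholds
    AMC_TABLE = [
        (22.0, "64-QAM", 0.75),
        (15.0, "16-QAM", 0.50),
        (8.0,  "QPSK",   0.50),
        (0.0,  "BPSK",   0.50),
    ]

    def select_mode(snr_db):
        """Return the most spectrally efficient mode for the current SNR."""
        for threshold, modulation, rate in AMC_TABLE:
            if snr_db >= threshold:
                return modulation, rate
        return "BPSK", 0.50  # fall back to the most robust mode

    print(select_mode(17.3))  # ('16-QAM', 0.5)
    ```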

  3. Using Voice Coils to Actuate Modular Soft Robots: Wormbot, an Example.

    PubMed

    Nemitz, Markus P; Mihaylov, Pavel; Barraclough, Thomas W; Ross, Dylan; Stokes, Adam A

    2016-12-01

    In this study, we present a modular worm-like robot, which utilizes voice coils as a new paradigm in soft robot actuation. Drive electronics are incorporated into the actuators, providing a significant improvement in self-sufficiency when compared with existing soft robot actuation modes such as pneumatics or hydraulics. The body plan of this robot is inspired by the phylum Annelida and consists of three-dimensional printed voice coil actuators, which are connected by flexible silicone membranes. Each electromagnetic actuator engages with its neighbor to compress or extend the membrane of each segment, and the sequence in which they are actuated results in an earthworm-inspired peristaltic motion. We find that a minimum of three segments is required for locomotion, but due to our modular design, robots of any length can be quickly and easily assembled. In addition to actuation, voice coils provide audio input and output capabilities. We demonstrate transmission of data between segments by high-frequency carrier waves and, using a similar mechanism, we note that the passing of power between coupled coils in neighboring modules-or from an external power source-is also possible. Voice coils are a convenient multifunctional alternative to existing soft robot actuators. Their self-contained nature and ability to communicate with each other are ideal for modular robotics, and the additional functionality of sound input/output and power transfer will become increasingly useful as soft robots begin the transition from early proof-of-concept systems toward fully functional and highly integrated robotic systems.
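
    The peristaltic gait described above can be generated by driving each segment with a phase-shifted copy of a common waveform. A minimal sketch (waveform shape, period, and segment count are illustrative, not the paper's drive parameters):

    ```python
    import math

    def segment_drive(t, segment_index, n_segments=3, period_s=1.0):
        """Normalized coil drive for one segment at time t.

        A fixed phase offset per segment makes the compression wave
        travel along the body, as in earthworm peristalsis.
        """
        phase = 2.0 * math.pi * segment_index / n_segments
        return math.sin(2.0 * math.pi * t / period_s - phase)

    for t in (0.0, 0.25, 0.5):
        print([round(segment_drive(t, i), 2) for i in range(3)])
    ```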

  4. Modular electron transfer circuits for synthetic biology

    PubMed Central

    Agapakis, Christina M

    2010-01-01

    Electron transfer is central to a wide range of essential metabolic pathways, from photosynthesis to fermentation. The evolutionary diversity and conservation of proteins that transfer electrons makes these pathways a valuable platform for engineered metabolic circuits in synthetic biology. Rational engineering of electron transfer pathways containing hydrogenases has the potential to lead to industrial scale production of hydrogen as an alternative source of clean fuel and experimental assays for understanding the complex interactions of multiple electron transfer proteins in vivo. We designed and implemented a synthetic hydrogen metabolism circuit in Escherichia coli that creates an electron transfer pathway both orthogonal to and integrated within existing metabolism. The design of such modular electron transfer circuits allows for facile characterization of in vivo system parameters with applications toward further engineering for alternative energy production. PMID:21468209

  5. Development of a reactor with carbon catalysts for modular-scale, low-cost electrochemical generation of H2O2

    DOE PAGES

    Chen, Zhihua; Chen, Shucheng; Siahrostami, Samira; ...

    2017-03-01

    The development of small-scale, decentralized reactors for H2O2 production that can couple to renewable energy sources would be of great benefit, particularly for water purification in the developing world. Herein, we describe our efforts to develop electrochemical reactors for H2O2 generation with high Faradaic efficiencies of >90%, requiring cell voltages of only ~1.6 V. The reactor employs a carbon-based catalyst that demonstrates excellent performance for H2O2 production under alkaline conditions, as demonstrated by fundamental studies involving rotating ring-disk electrode methods. Finally, the low-cost, membrane-free reactor design represents a step towards continuous, modular-scale, decentralized production of H2O2.
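
    Given the reported Faradaic efficiency, the production rate follows from Faraday's law for the two-electron reduction of O2 to H2O2 (O2 + 2H+ + 2e- -> H2O2). A minimal sketch (the 1 A operating current is an assumed example, not a figure from the paper):

    ```python
    F = 96485.0      # C/mol, Faraday constant
    M_H2O2 = 34.01   # g/mol, molar mass of hydrogen peroxide

    def h2o2_rate_g_per_h(current_a, faradaic_efficiency):
        """Mass of H2O2 produced per hour at a given cell current."""
        mol_per_s = current_a * faradaic_efficiency / (2.0 * F)  # 2 e- per molecule
        return mol_per_s * M_H2O2 * 3600.0

    print(round(h2o2_rate_g_per_h(1.0, 0.90), 3))  # ~0.571 g/h at 1 A, 90% FE
    ```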

  6. Contamination concerns in the modular containerless processing facility

    NASA Technical Reports Server (NTRS)

    Seshan, P. K.; Trinh, E. H.

    1989-01-01

    This paper describes the problems of the control and management of contamination in the Modular Containerless Processing Facility (MCPF), currently being developed at JPL for the Space Station, and in the MCPF's precursor version, called the Drop Physics Module (DPM), which will be carried aboard one or more Space Shuttle missions. Attention is given to the identification of contamination sources, their mode of transport to the sample positioned within the chamber, and the protection of the sample, as well as to the mathematical simulation of contaminant transport. It is emphasized that, in order to choose and implement the most appropriate contamination control strategy for each investigator, a number of simplified mathematical simulations will have to be developed, and ground-based contamination experiments will have to be carried out with identical materials.

  7. Identification of Modules in Protein-Protein Interaction Networks

    NASA Astrophysics Data System (ADS)

    Erten, Sinan; Koyutürk, Mehmet

    In biological systems, most processes are carried out through orchestration of multiple interacting molecules. These interactions are often abstracted using network models. A key feature of cellular networks is their modularity, which contributes significantly to the robustness, as well as adaptability of biological systems. Therefore, modularization of cellular networks is likely to be useful in obtaining insights into the working principles of cellular systems, as well as building tractable models of cellular organization and dynamics. A common, high-throughput source of data on molecular interactions is in the form of physical interactions between proteins, which are organized into protein-protein interaction (PPI) networks. This chapter provides an overview on identification and analysis of functional modules in PPI networks, which has been an active area of research in the last decade.

  8. GAMBIT: the global and modular beyond-the-standard-model inference tool

    NASA Astrophysics Data System (ADS)

    Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Buckley, Andy; Chrząszcz, Marcin; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Dickinson, Hugh; Edsjö, Joakim; Farmer, Ben; Gonzalo, Tomás E.; Jackson, Paul; Krislock, Abram; Kvellestad, Anders; Lundberg, Johan; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Raklev, Are; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Savage, Christopher; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; White, Martin; Wild, Sebastian

    2017-11-01

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily-extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org.

  9. Modular design of H- synchrotrons for radiation therapy

    NASA Astrophysics Data System (ADS)

    Martin, R. L.

    1989-04-01

    A modular synchrotron for accelerating H- ions and a proton beam delivery system are being developed for radiation therapy with protons under SBIR grants from the National Cancer Institute. The advantage proposed for accelerating H- ions and utilizing charge exchange as a slow extraction mechanism lies in enhanced control of the extracted beam current, important for beam delivery with raster scanning for 3D dose contouring of a tumor site. Under these grants, prototype magnets and vacuum systems are being constructed, appropriate H- sources are being developed, and beam experiments will be carried out to demonstrate some of the key issues of this concept. The status of this program is described along with a discussion of a relatively inexpensive beam delivery system and a proposed program for its development.

  10. ms_lims, a simple yet powerful open source laboratory information management system for MS-driven proteomics.

    PubMed

    Helsens, Kenny; Colaert, Niklaas; Barsnes, Harald; Muth, Thilo; Flikka, Kristian; Staes, An; Timmerman, Evy; Wortelkamp, Steffi; Sickmann, Albert; Vandekerckhove, Joël; Gevaert, Kris; Martens, Lennart

    2010-03-01

    MS-based proteomics produces large amounts of mass spectra that require processing, identification and possibly quantification before interpretation can be undertaken. High-throughput studies require automation of these various steps, and management of the data in association with the results obtained. We here present ms_lims (http://genesis.UGent.be/ms_lims), a freely available, open-source system based on a central database to automate data management and processing in MS-driven proteomics analyses.

  11. Automated Magnitude Measures, Earthquake Source Modeling, VFM Discriminant Testing and Summary of Current Research.

    DTIC Science & Technology

    1979-02-01

    [OCR-damaged DTIC record. Recoverable information: 'Automated Magnitude Measures, Earthquake Source Modeling, VFM Discriminant Testing and Summary of Current Research', T. C. Bache, S. M. Day, J. M. (et al.), Systems, Science and Software, report SSS-R-79-3933.]

  12. Survey of Modular Military Vehicles: Benefits and Burdens

    DTIC Science & Technology

    2016-01-01

    Modularity in military vehicle design is generally considered a positive attribute that promotes adaptability, resilience, and cost savings. The benefits and burdens of modularity are considered by … Engineering Center; vehicles were considered based on horizontal modularity, vertical modularity, and distributed modularity, with examples given for each. (Jean M. Dasch and David J. Gorsich)

  13. Rapid ISS Power Availability Simulator

    NASA Technical Reports Server (NTRS)

    Downing, Nicholas

    2011-01-01

    The ISS (International Space Station) Power Resource Officers (PROs) needed a tool to automate the calculation of thousands of ISS power availability simulations used to generate power constraint matrices. Each matrix contains 864 cells, and each cell represents a single power simulation that must be run. The tools available to the flight controllers were very operator-intensive and not conducive to rapidly running the thousands of simulations necessary to generate the power constraint data. SOLAR is a Java-based tool that leverages commercial off-the-shelf software (Satellite Toolkit) and an existing in-house ISS EPS model (SPEED) to rapidly perform thousands of power availability simulations. SOLAR has a very modular architecture and consists of a series of plug-ins that are loosely coupled. The modular architecture of the software allows for the easy replacement of the ISS power system model simulator, re-use of the Satellite Toolkit integration code, and separation of the user interface from the core logic. Satellite Toolkit (STK) is used to generate ISS eclipse and insolation times, solar beta angle, position of the solar arrays over time, and the amount of shadowing on the solar arrays, which is then provided to SPEED to calculate power generation forecasts. The power planning turn-around time is reduced from three months to two weeks (an 83-percent decrease) using SOLAR, and the amount of PRO power planning support effort is reduced by an estimated 30 percent.
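
    A loosely coupled plug-in registry of the kind described is straightforward to sketch. The class and role names below are hypothetical illustrations, not SOLAR's actual API:

    ```python
    class PluginRegistry:
        """Maps named roles to interchangeable plug-in implementations."""

        def __init__(self):
            self._plugins = {}

        def register(self, role, plugin):
            """Associate a plug-in with a role, e.g. 'eps_model'."""
            self._plugins[role] = plugin

        def get(self, role):
            return self._plugins[role]

    # The power-system model can be swapped without touching the UI or
    # the orbit-geometry integration code:
    registry = PluginRegistry()
    registry.register("eps_model", lambda geometry: {"power_kw": 84.0})
    forecast = registry.get("eps_model")({"beta_deg": 30.0})
    print(forecast)
    ```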

  14. Developing integrated workflows for the digitisation of herbarium specimens using a modular and scalable approach

    PubMed Central

    Haston, Elspeth; Cubey, Robert; Pullan, Martin; Atkins, Hannah; Harris, David J

    2012-01-01

    Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors have influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable to enable a single overall workflow to be used for all digitisation projects. This integrated workflow is comprised of three principal elements: a specimen workflow, a data workflow and an image workflow. The specimen workflow is strongly linked to curatorial processes which will impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. It is shown that each category of data has its own properties which influence the timing of data capture within the workflow. Development of software has been carried out for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images has necessitated the inclusion of automated systems within the image workflow. PMID:22859881

  15. Manufacturing concepts and development trends in the industrial production of microelectromechanical systems

    NASA Astrophysics Data System (ADS)

    Schuenemann, Matthias; Grimme, Ralf; Kaufmann, Thomas; Schwaab, Gerhard; Baeder, Uwe; Schaefer, Wolfgang; Dorner, Johann

    1998-01-01

    During the past few years, remarkable efforts have been made toward the realization of microscale sensors, actuators and microelectromechanical systems. Due to advances in solid-state and micromachining technologies, significant advances in designing, fabricating and testing microminiaturized devices have been achieved at the laboratory level. However, the technical and economical realization of microelectromechanical systems is considerably impeded by the lack of satisfactory device technology for their industrial production. A production concept for the industrial production of hybrid microelectromechanical systems was developed and investigated. The concept is based on the resources and requirements of medium-sized enterprises and is characterized by its flexibility. Microsystem fabrication is separated into microfabrication steps performed in-house and technological steps performed by external technology providers. The modularity of the concept allows for a gradual increase in the degree of automation and the in-house production depth, depending on market capacity and financial resources. To demonstrate the feasibility of this approach, the design and realization of a microfabrication process center, which includes tasks like transport and handling, processing, cleaning, testing and storing, are discussed. Special attention is given to the supply and feeding of microparts, to the necessary magazines, trays and transport systems, to the implementation of homogeneous mechanical, environmental and information interfaces, to the employment of advanced control, scheduling, and lot-tracking concepts, and to the application of highly modular and cost-efficient clean production concepts.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Na; Wu, Yu-Ping; Min, Hao

    A radio-frequency (RF) source designed for cold atom experiments is presented. The source uses the AD9858, a direct digital synthesizer, to generate the sine wave directly, up to 400 MHz, with sub-Hz resolution. An amplitude control circuit consisting of a wideband variable-gain amplifier and a high-speed digital-to-analog converter is integrated into the source, capable of 70 dB off isolation and 4 ns on-off keying. A field-programmable gate array is used to implement a versatile frequency and amplitude co-sweep logic. Owing to the modular design, the RF sources have been used in many cold atom experiments to generate various complicated RF sequences, enriching the operation schemes of cold atoms in ways that cannot be achieved with standard RF source instruments.
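
    The sub-Hz resolution quoted above follows from the DDS architecture: the output frequency is set by a 32-bit frequency tuning word via f_out = FTW * f_sysclk / 2^32. A minimal sketch (the 1 GHz system clock and the 80 MHz target are assumptions for illustration, not settings from the paper):

    ```python
    F_SYSCLK = 1.0e9  # Hz; assumed DDS system clock

    def tuning_word(f_out_hz):
        """32-bit frequency tuning word for the requested output frequency."""
        return round(f_out_hz * 2**32 / F_SYSCLK)

    ftw = tuning_word(80.0e6)                # e.g. an 80 MHz drive tone
    print(hex(ftw), F_SYSCLK * ftw / 2**32)  # word and realized frequency
    print(F_SYSCLK / 2**32)                  # resolution: ~0.23 Hz per LSB
    ```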

  17. ISHN Ion Source Control System. First Steps Toward an EPICS Based ESS-Bilbao Accelerator Control System

    NASA Astrophysics Data System (ADS)

    Eguiraun, M.; Jugo, J.; Arredondo, I.; del Campo, M.; Feuchtwanger, J.; Etxebarria, V.; Bermejo, F. J.

    2013-04-01

    ISHN (Ion Source Hydrogen Negative) consists of a Penning type ion source in operation at ESS-Bilbao facilities. From the control point of view, this source is representative of the first steps and decisions taken towards the general control architecture of the whole accelerator to be built. The ISHN main control system is based on a PXI architecture, under a real-time controller which is programmed using LabVIEW. This system, with additional elements, is connected to the general control system. The whole system is based on EPICS for the control network, and the modularization of the communication layers of the accelerator plays an important role in the proposed control architecture.

  18. Robotics for Nuclear Material Handling at LANL:Capabilities and Needs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harden, Troy A; Lloyd, Jane A; Turner, Cameron J

    Nuclear material processing operations present numerous challenges for effective automation. Confined spaces, hazardous materials and processes, particulate contamination, radiation sources, and corrosive chemical operations are but a few of the significant hazards. However, automated systems represent a significant safety advance when deployed in place of manual tasks performed by human workers. The replacement of manual operations with automated systems has been desirable for nearly 40 years, yet only recently are automated systems becoming increasingly common for nuclear materials handling applications. This paper reviews several automation systems which are deployed or about to be deployed at Los Alamos National Laboratory for nuclear material handling operations. Highlighted are the current social and technological challenges faced in deploying automated systems into hazardous material handling environments and the opportunities for future innovations.

  19. Collecting and Animating Online Satellite Images.

    ERIC Educational Resources Information Center

    Irons, Ralph

    1995-01-01

    Describes how to generate automated classroom resources from the Internet. Topics covered include viewing animated satellite weather images using file transfer protocol (FTP); sources of images on the Internet; shareware available for viewing images; software for automating image retrieval; procedures for animating satellite images; and storing…

  20. Automated Student Model Improvement

    ERIC Educational Resources Information Center

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

  1. Building a drug ontology based on RxNorm and other sources

    PubMed Central

    2013-01-01

    Background We built the Drug Ontology (DrOn) because we required correct and consistent drug information in a format for use in semantic web applications, and no existing resource met this requirement or could be altered to meet it. One of the obstacles we faced when creating DrOn was the difficulty in reusing drug information from existing sources. The primary external source we have used at this stage in DrOn’s development is RxNorm, a standard drug terminology curated by the National Library of Medicine (NLM). To build DrOn, we (1) mined data from historical releases of RxNorm and (2) mapped many RxNorm entities to Chemical Entities of Biological Interest (ChEBI) classes, pulling relevant information from ChEBI while doing so. Results We built DrOn in a modular fashion to facilitate simpler extension and development of the ontology and to allow reasoning and construction to scale. Classes derived from each source are serialized in separate modules. For example, the classes in DrOn that are programmatically derived from RxNorm are stored in a separate module and subsumed by classes in a manually-curated, realist, upper-level module of DrOn with terms such as 'clinical drug role’, 'tablet’, 'capsule’, etc. Conclusions DrOn is a modular, extensible ontology of drug products, their ingredients, and their biological activity that avoids many of the fundamental flaws found in other, similar artifacts and meets the requirements of our comparative-effectiveness research use-case. PMID:24345026

  2. Building a drug ontology based on RxNorm and other sources.

    PubMed

    Hanna, Josh; Joseph, Eric; Brochhausen, Mathias; Hogan, William R

    2013-12-18

    We built the Drug Ontology (DrOn) because we required correct and consistent drug information in a format for use in semantic web applications, and no existing resource met this requirement or could be altered to meet it. One of the obstacles we faced when creating DrOn was the difficulty in reusing drug information from existing sources. The primary external source we have used at this stage in DrOn's development is RxNorm, a standard drug terminology curated by the National Library of Medicine (NLM). To build DrOn, we (1) mined data from historical releases of RxNorm and (2) mapped many RxNorm entities to Chemical Entities of Biological Interest (ChEBI) classes, pulling relevant information from ChEBI while doing so. We built DrOn in a modular fashion to facilitate simpler extension and development of the ontology and to allow reasoning and construction to scale. Classes derived from each source are serialized in separate modules. For example, the classes in DrOn that are programmatically derived from RxNorm are stored in a separate module and subsumed by classes in a manually-curated, realist, upper-level module of DrOn with terms such as 'clinical drug role', 'tablet', 'capsule', etc. DrOn is a modular, extensible ontology of drug products, their ingredients, and their biological activity that avoids many of the fundamental flaws found in other, similar artifacts and meets the requirements of our comparative-effectiveness research use-case.

  3. Fast and accurate resonance assignment of small-to-large proteins by combining automated and manual approaches.

    PubMed

    Niklasson, Markus; Ahlner, Alexandra; Andresen, Cecilia; Marsh, Joseph A; Lundström, Patrik

    2015-01-01

    The process of resonance assignment is fundamental to most NMR studies of protein structure and dynamics. Unfortunately, the manual assignment of residues is tedious and time-consuming, and can represent a significant bottleneck for further characterization. Furthermore, while automated approaches have been developed, they are often limited in their accuracy, particularly for larger proteins. Here, we address this by introducing the software COMPASS, which, by combining automated resonance assignment with manual intervention, is able to achieve accuracy approaching that from manual assignments at greatly accelerated speeds. Moreover, by including the option to compensate for isotope shift effects in deuterated proteins, COMPASS is far more accurate for larger proteins than existing automated methods. COMPASS is an open-source project licensed under GNU General Public License and is available for download from http://www.liu.se/forskning/foass/tidigare-foass/patrik-lundstrom/software?l=en. Source code and binaries for Linux, Mac OS X and Microsoft Windows are available.

  4. Fast and Accurate Resonance Assignment of Small-to-Large Proteins by Combining Automated and Manual Approaches

    PubMed Central

    Niklasson, Markus; Ahlner, Alexandra; Andresen, Cecilia; Marsh, Joseph A.; Lundström, Patrik

    2015-01-01

    The process of resonance assignment is fundamental to most NMR studies of protein structure and dynamics. Unfortunately, the manual assignment of residues is tedious and time-consuming, and can represent a significant bottleneck for further characterization. Furthermore, while automated approaches have been developed, they are often limited in their accuracy, particularly for larger proteins. Here, we address this by introducing the software COMPASS, which, by combining automated resonance assignment with manual intervention, is able to achieve accuracy approaching that from manual assignments at greatly accelerated speeds. Moreover, by including the option to compensate for isotope shift effects in deuterated proteins, COMPASS is far more accurate for larger proteins than existing automated methods. COMPASS is an open-source project licensed under GNU General Public License and is available for download from http://www.liu.se/forskning/foass/tidigare-foass/patrik-lundstrom/software?l=en. Source code and binaries for Linux, Mac OS X and Microsoft Windows are available. PMID:25569628

  5. Use of Annotations for Component and Framework Interoperability

    NASA Astrophysics Data System (ADS)

    David, O.; Lloyd, W.; Carlson, J.; Leavesley, G. H.; Geter, F.

    2009-12-01

    The popular programming languages Java and C# provide annotations, a form of meta-data construct. Software frameworks for web integration, web services, database access, and unit testing now take advantage of annotations to reduce the complexity of APIs and the quantity of integration code between the application and framework infrastructure. Adopting annotation features in frameworks has been observed to lead to cleaner and leaner application code. The USDA Object Modeling System (OMS) version 3.0 fully embraces the annotation approach and additionally defines a meta-data standard for components and models. In version 3.0, framework/model integration previously accomplished using API calls is now achieved using descriptive annotations. This enables the framework to provide additional functionality non-invasively, such as implicit multithreading and auto-documenting capabilities, while achieving a significant reduction in the size of the model source code. Using a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework. Since models and modeling components are not directly bound to the framework by the use of specific APIs and/or data types, they can more easily be reused both within the framework and outside of it. To study the effectiveness of an annotation-based framework approach relative to other modeling frameworks, a framework-invasiveness study was conducted to evaluate the effects of framework design on model code quality. A monthly water balance model was implemented across several modeling frameworks and several software metrics were collected. The metrics selected were measures of non-invasive design methods for modeling frameworks from a software engineering perspective. It appears that the use of annotations positively impacts several software quality measures. As a next step, the PRMS model was implemented in OMS 3.0 and is currently being implemented for water supply forecasting in the western United States at the USDA NRCS National Water and Climate Center. PRMS is a component-based modular precipitation-runoff model developed to evaluate the impacts of various combinations of precipitation, climate, and land use on streamflow and general basin hydrology. The new OMS 3.0 PRMS model source code is more concise and flexible as a result of the new framework's annotation-based approach. The fully annotated components now provide information directly for (i) model assembly and building, (ii) dataflow analysis for implicit multithreading, (iii) automated and comprehensive model documentation of component dependencies and physical data properties, (iv) automated model and component testing, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Experience to date has demonstrated the multi-purpose value of using annotations. Annotations are also a feasible and practical method to enable interoperability among models and modeling frameworks. As a prototype example, model code annotations were used to generate binding and mediation code to allow the use of OMS 3.0 model components within the OpenMI context.
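
    The annotation-driven style described here can be illustrated compactly. OMS itself is a Java framework; the sketch below mimics the idea with Python decorators, and the @In/@Execute names are illustrative rather than a faithful copy of the OMS API:

    ```python
    def In(func):          # marks a member as a framework-supplied input
        func._oms_role = "in"
        return func

    def Execute(func):     # marks the entry point the framework invokes
        func._oms_role = "execute"
        return func

    class MonthlyWaterBalance:
        @In
        def precipitation_mm(self):
            ...

        @Execute
        def run(self):
            # Model logic goes here; the framework discovers this method
            # through its metadata rather than via an inherited API.
            pass

    # The framework can locate annotated members by reflection alone,
    # so the model class carries no framework dependency:
    roles = {name: getattr(m, "_oms_role")
             for name, m in vars(MonthlyWaterBalance).items()
             if hasattr(m, "_oms_role")}
    print(roles)  # {'precipitation_mm': 'in', 'run': 'execute'}
    ```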

  6. The Developing Child Workbook 1995/1996.

    ERIC Educational Resources Information Center

    Olenick, Rhoda; And Others

    An integral part of The Developing Child video modules from the same producer, this workbook provides a very useful, clearly formatted modular presentation, 30 modules in all, of information on all areas of child development. The workbook can be used with the videos, without them as a stand-alone tutorial or review source, or as the outline for a…

  7. Trusting ICT in Today’s Global Supply Chain - Understanding and Implementing Government and Industry Best Practices

    DTIC Science & Technology

    2010-05-17

    [OCR-damaged DTIC record consisting of presentation fragments. Recoverable information: the briefing discusses trust in the global ICT supply chain (e.g., an American company with a factory in Malaysia, Smart Modular), notes that technology is a focal point of attacks, and cites the 2009 Verizon Data Breach Investigations Report on the sources of data breaches.]

  8. Automated Modular Magnetic Resonance Imaging Clinical Decision Support System (MIROR): An Application in Pediatric Cancer Diagnosis.

    PubMed

    Zarinabad, Niloufar; Meeus, Emma M; Manias, Karen; Foster, Katharine; Peet, Andrew

    2018-05-02

    Advances in magnetic resonance imaging and the introduction of clinical decision support systems have underlined the need for an analysis tool to extract and analyze relevant information from magnetic resonance imaging data to aid decision making, prevent errors, and enhance health care. The aim of this study was to design and develop a modular medical image region-of-interest analysis tool and repository (MIROR) for automatic processing, classification, evaluation, and representation of advanced magnetic resonance imaging data. The clinical decision support system was developed and evaluated for diffusion-weighted imaging of body tumors in children (a cohort of 48 children, with 37 malignant and 11 benign tumors). Mevislab software and Python have been used for the development of MIROR. Regions of interest were drawn around benign and malignant body tumors on different diffusion parametric maps, and the extracted information was used to discriminate the malignant tumors from benign tumors. Using MIROR, the various histogram parameters derived for each tumor case, when compared with the information in the repository, provided additional information for tumor characterization and facilitated the discrimination between benign and malignant tumors. Clinical decision support system cross-validation showed high sensitivity and specificity in discriminating between these tumor groups using histogram parameters. MIROR, as a diagnostic tool and repository, allowed the interpretation and analysis of magnetic resonance imaging images to be more accessible and comprehensive for clinicians. It aims to increase clinicians' skillset by introducing newer techniques and up-to-date findings to their repertoire and to make information from previous cases available to aid decision making. The modular format of the tool allows integration of analyses that are not readily available clinically and streamlines future developments. ©Niloufar Zarinabad, Emma M Meeus, Karen Manias, Katharine Foster, Andrew Peet. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 02.05.2018.
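
    The histogram parameters mentioned above are ordinary summary statistics over the ROI voxel values. A minimal sketch (the chosen feature set is illustrative, not MIROR's exact list):

    ```python
    import numpy as np
    from scipy import stats

    def roi_histogram_features(adc_values):
        """Summary statistics for the voxel values inside one tumor ROI."""
        v = np.asarray(adc_values, dtype=float)
        return {
            "mean": v.mean(),
            "median": np.median(v),
            "p10": np.percentile(v, 10),   # low percentiles of diffusion maps
            "p90": np.percentile(v, 90),   # often help separate tumor groups
            "skewness": stats.skew(v),
            "kurtosis": stats.kurtosis(v),
        }

    print(roi_histogram_features(np.random.default_rng(0).normal(1.1, 0.2, 500)))
    ```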

  9. Mission Control Technologies: A New Way of Designing and Evolving Mission Systems

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Walton, Joan; Saddler, Harry

    2006-01-01

    Current mission operations systems are built as a collection of monolithic software applications. Each application serves the needs of a specific user base associated with a discipline or functional role. Built to accomplish specific tasks, each application embodies specialized functional knowledge and has its own data storage, data models, programmatic interfaces, user interfaces, and customized business logic. In effect, each application creates its own walled-off environment. While individual applications are sometimes reused across multiple missions, it is expensive and time consuming to maintain these systems, and both costly and risky to upgrade them in the light of new requirements or modify them for new purposes. It is even more expensive to achieve new integrated activities across a set of monolithic applications. These problems impact the lifecycle cost (especially design, development, testing, training, maintenance, and integration) of each new mission operations system. They also inhibit system innovation and evolution. This in turn hinders NASA's ability to adopt new operations paradigms, including increasingly automated space systems, such as autonomous rovers, autonomous onboard crew systems, and integrated control of human and robotic missions. Hence, in order to achieve NASA's vision affordably and reliably, we need to consider and mature new ways to build mission control systems that overcome the problems inherent in systems of monolithic applications. The keys to the solution are modularity and interoperability. Modularity will increase extensibility (evolution), reusability, and maintainability. Interoperability will enable composition of larger systems out of smaller parts, and enable the construction of new integrated activities that tie together, at a deep level, the capabilities of many of the components. Modularity and interoperability together contribute to flexibility. The Mission Control Technologies (MCT) Project, a collaboration of multiple NASA Centers, led by NASA Ames Research Center, is building a framework to enable software to be assembled from flexible collections of components and services.

  10. The relative efficiency of modular and non-modular networks of different size

    PubMed Central

    Tosh, Colin R.; McNally, Luke

    2015-01-01

    Most biological networks are modular but previous work with small model networks has indicated that modularity does not necessarily lead to increased functional efficiency. Most biological networks are large, however, and here we examine the relative functional efficiency of modular and non-modular neural networks at a range of sizes. We conduct a detailed analysis of efficiency in networks of two size classes: ‘small’ and ‘large’, and a less detailed analysis across a range of network sizes. The former analysis reveals that while the modular network is less efficient than one of the two non-modular networks considered when networks are small, it is usually equally or more efficient than both non-modular networks when networks are large. The latter analysis shows that in networks of small to intermediate size, modular networks are much more efficient than non-modular networks of the same (low) connective density. If connective density must be kept low, to reduce energy needs for example, this could promote modularity. We have shown how relative functionality/performance scales with network size, but the precise nature of the evolutionary relationship between network size and the prevalence of modularity will depend on the costs of connectivity. PMID:25631996

  11. ASASSN1: Bright Comet Discovered by the All Sky Automated Survey for SuperNovae

    NASA Astrophysics Data System (ADS)

    Prieto, J. L.; Shappee, B. J.; Brimacombe, J.; Stanek, K. Z.; Chen, Ping; Dong, Subo; Holoien, T. W.-S.; Kochanek, C. S.; Brown, J. S.; Shields, J. V.; Thompson, T. A.

    2017-07-01

    During the ongoing All Sky Automated Survey for SuperNovae (ASAS-SN, Shappee et al. 2014), using data from the quadruple 14-cm "Cassius" telescope on Cerro Tololo, Chile, we discovered a new moving transient source, now confirmed as a comet.

  12. Automated classification of RNA 3D motifs and the RNA 3D Motif Atlas

    PubMed Central

    Petrov, Anton I.; Zirbel, Craig L.; Leontis, Neocles B.

    2013-01-01

    The analysis of atomic-resolution RNA three-dimensional (3D) structures reveals that many internal and hairpin loops are modular, recurrent, and structured by conserved non-Watson–Crick base pairs. Structurally similar loops define RNA 3D motifs that are conserved in homologous RNA molecules, but can also occur at nonhomologous sites in diverse RNAs, and which often vary in sequence. To further our understanding of RNA motif structure and sequence variability and to provide a useful resource for structure modeling and prediction, we present a new method for automated classification of internal and hairpin loop RNA 3D motifs and a new online database called the RNA 3D Motif Atlas. To classify the motif instances, a representative set of internal and hairpin loops is automatically extracted from a nonredundant list of RNA-containing PDB files. Their structures are compared geometrically, all-against-all, using the FR3D program suite. The loops are clustered into motif groups, taking into account geometric similarity and structural annotations and making allowance for a variable number of bulged bases. The automated procedure that we have implemented identifies all hairpin and internal loop motifs previously described in the literature. All motif instances and motif groups are assigned unique and stable identifiers and are made available in the RNA 3D Motif Atlas (http://rna.bgsu.edu/motifs), which is automatically updated every four weeks. The RNA 3D Motif Atlas provides an interactive user interface for exploring motif diversity and tools for programmatic data access. PMID:23970545
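
    The extract-compare-cluster pipeline described above can be miniaturized as follows. Generic feature vectors and a Euclidean metric stand in for FR3D's geometric discrepancy, and SciPy's hierarchical clustering stands in for the motif-grouping stage; the data and cutoff are mock values, not the Atlas's actual parameters.

    ```python
    # Sketch of an all-against-all comparison followed by clustering into
    # motif groups; mock feature vectors replace real loop geometries.
    import numpy as np
    from scipy.spatial.distance import pdist
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(0)
    loops = rng.normal(size=(12, 6))   # 12 loop instances, 6 geometric features each

    dists = pdist(loops, metric="euclidean")              # all-against-all distances
    tree = linkage(dists, method="average")               # agglomerative clustering
    groups = fcluster(tree, t=2.5, criterion="distance")  # motif-group labels

    print(groups)   # one group label per loop instance
    ```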

  13. DEVELOPMENT OF OPERATIONAL CONCEPTS FOR ADVANCED SMRs: THE ROLE OF COGNITIVE SYSTEMS ENGINEERING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacques Hugo; David Gertman

    Advanced small modular reactors (AdvSMRs) will use advanced digital instrumentation and control systems, and make greater use of automation. These advances not only pose technical and operational challenges, but will inevitably have an effect on the operating and maintenance (O&M) cost of new plants. However, there is much uncertainty about the impact of AdvSMR designs on operational and human factors considerations, such as workload, situation awareness, human reliability, staffing levels, and the appropriate allocation of functions between the crew and various automated plant systems. Existing human factors and systems engineering design standards and methodologies are not current in terms of human interaction requirements for dynamic automated systems and are no longer suitable for the analysis of evolving operational concepts. New models and guidance for operational concepts for complex socio-technical systems need to adopt a state-of-the-art approach, such as Cognitive Systems Engineering (CSE), that gives due consideration to the role of personnel. The approach we report on helps to identify and evaluate human challenges related to non-traditional concepts of operations. A framework defining operational strategies was developed, based on the operational analysis of Argonne National Laboratory’s Experimental Breeder Reactor-II (EBR-II), a small (20 MWe) sodium-cooled reactor that was successfully operated for thirty years. Insights from the systematic application of the methodology and its utility are reviewed, and arguments for the formal adoption of CSE as a value-added part of the Systems Engineering process are presented.

  14. Construction of an automated fiber pigtailing machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strand, O.T.

    1996-01-01

    At present, the high cost of optoelectronic (OE) devices is caused in part by the labor-intensive processes involved with packaging. Automating the packaging processes should result in a significant cost reduction. One of the most labor-intensive steps is aligning and attaching the fiber to the OE device, the so-called pigtailing process. Therefore, the goal of this 2-year ARPA-funded project is to design and build three low-cost machines to perform sub-micron alignments and attachments of single-mode fibers to different OE devices. These Automated Fiber Pigtailing Machines (AFPMs) are intended to be compatible with a manufacturing environment and have a modular design for standardization of parts and machine vision for maximum flexibility. This work is a collaboration among Uniphase Telecommunications Products (formerly United Technologies Photonics, UTP), Ortel, Newport/Klinger, the Massachusetts Institute of Technology Manufacturing Institute (MIT), and Lawrence Livermore National Laboratory (LLNL). UTP and Ortel are the industrial partners for whom two of the AFPMs are being built. MIT and LLNL make up the design and assembly team of the project, while Newport/Klinger is a potential manufacturer of the AFPM and provides guidance to ensure that the design of the AFPM is marketable and compatible with a manufacturing environment. The AFPM for UTP will pigtail LiNbO₃ waveguide devices and the AFPM for Ortel will pigtail photodiodes. Both of these machines will contain proprietary information, so the third AFPM, to reside at LLNL, will pigtail a non-proprietary waveguide device for demonstrations to US industry.

  15. Work Domain Analysis of a Predecessor Sodium-cooled Reactor as Baseline for AdvSMR Operational Concepts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald Farris; David Gertman; Jacques Hugo

    This report presents the results of the Work Domain Analysis for the Experimental Breeder Reactor (EBR-II). This is part of the research phase designed to incorporate Cognitive Work Analysis in the development of a framework for the formalization of an Operational Concept (OpsCon) for Advanced Small Modular Reactors (AdvSMRs). For a new AdvSMR design, information obtained through Cognitive Work Analysis, combined with human performance criteria, can and should be used during the operational phase of a plant to assess the crew performance aspects associated with identified AdvSMR operational concepts. The main objective of this phase was to develop an analytical and descriptive framework that will help systems and human factors engineers to understand the design and operational requirements of the emerging generation of small, advanced, multi-modular reactors. Using EBR-II as a predecessor to emerging sodium-cooled reactor designs required the application of a method suited to the structured and systematic analysis of the plant, to assist in identifying key features of the work associated with it and to clarify the operational and other constraints. The analysis included the identification and description of operating scenarios considered characteristic of this type of nuclear power plant. This is an invaluable aspect of Operational Concept development, since it typically reveals aspects of future plant configurations that will have an impact on operations. These include, for example, the effect of core design, different coolants, reactor-to-power conversion unit ratios, modular plant layout, modular versus central control rooms, plant siting, and many more. Multi-modular plants in particular are expected to have a significant impact on overall OpsCon in general, and human performance in particular. To support unconventional modes of operation, the modern control room of a multi-module plant would typically require advanced HSIs providing sophisticated operational information visualization, coupled with adaptive automation schemes and operator support systems to reduce complexity. These all have to be mapped at some point to human performance requirements. The EBR-II results will be used as a baseline that will be extrapolated in the extended Cognitive Work Analysis phase to the analysis of a selected advanced sodium-cooled SMR design, as a way to establish non-conventional operational concepts. The Work Domain Analysis results achieved during this phase have not only established an organizing and analytical framework for describing existing sociotechnical systems, but have also indicated that the method is particularly suited to the analysis of prospective and immature designs. The results of the EBR-II Work Domain Analysis indicate that the methodology is scientifically sound and generalizable to any operating environment.

  16. PconsFold: improved contact predictions improve protein models.

    PubMed

    Michel, Mirco; Hayat, Sikander; Skwark, Marcin J; Sander, Chris; Marks, Debora S; Elofsson, Arne

    2014-09-01

    Recently it has been shown that the quality of protein contact prediction from evolutionary information can be improved significantly if direct and indirect information is separated. Given sufficiently large protein families, the contact predictions contain enough information to predict the structure of many protein families. However, contact prediction methods have improved since those first studies. Here, we ask how much the final models are improved if improved contact predictions are used. In a small benchmark of 15 proteins, we show that the TM-scores of top-ranked models are improved by on average 33% using PconsFold compared with the original version of EVfold. In a larger benchmark, we find that the quality is improved by 15-30% when using PconsC in comparison with earlier contact prediction methods. Further, using Rosetta instead of CNS does not significantly improve global model accuracy, but the chemistry of models generated with Rosetta is improved. PconsFold is a fully automated pipeline for ab initio protein structure prediction based on evolutionary information. PconsFold is based on PconsC contact prediction and uses the Rosetta folding protocol. Due to its modularity, the contact prediction tool can be easily exchanged. The source code of PconsFold is available on GitHub at https://www.github.com/ElofssonLab/pcons-fold under the MIT license. PconsC is available from http://c.pcons.net/. Supplementary data are available at Bioinformatics online.

  17. EFTofPNG: a package for high precision computation with the effective field theory of post-Newtonian gravity

    NASA Astrophysics Data System (ADS)

    Levi, Michele; Steinhoff, Jan

    2017-12-01

    We present a novel public package ‘EFTofPNG’ for high precision computation in the effective field theory of post-Newtonian (PN) gravity, including spins. We created this package in view of the timely need to publicly share automated computation tools, which integrate the various types of physics manifested in the expected increasing influx of gravitational wave (GW) data. Hence, we created a free and open source package, which is self-contained, modular, all-inclusive, and accessible to the classical gravity community. The ‘EFTofPNG’ Mathematica package also uses the power of the ‘xTensor’ package, suited for complicated tensor computation, and our coding strategically approaches the generic generation of Feynman contractions, which is universal to all perturbation theories in physics, by efficiently treating n-point functions as tensors of rank n. The package currently contains four independent units, which serve as subsidiaries to the main one. Its final unit serves as a pipeline chain for obtaining the final GW templates, and provides the full computation of derivatives and physical observables of interest. The upcoming ‘EFTofPNG’ package version 1.0 should cover the point mass sector, and all the spin sectors, up to the fourth PN order and the two-loop level. We expect and strongly encourage public development of the package to improve its efficiency, and to extend it to further PN sectors and observables useful for waveform modelling.

  18. NASA Tech Briefs, July 2010

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Topics covered include: Wirelessly Interrogated Wear or Temperature Sensors; Processing Nanostructured Sensors Using Microfabrication Techniques; Optical Pointing Sensor; Radio-Frequency Tank Eigenmode Sensor for Propellant Quantity Gauging; High-Temperature Optical Sensor; Integral Battery Power Limiting Circuit for Intrinsically Safe Applications; Configurable Multi-Purpose Processor; Squeezing Alters Frequency Tuning of WGM Optical Resonator; Automated Computer Access Request System; Range Safety for an Autonomous Flight Safety System; Fast and Easy Searching of Files in Unisys 2200 Computers; Parachute Drag Model; Evolutionary Scheduler for the Deep Space Network; Modular Habitats Comprising Rigid and Inflatable Modules; More About N2O-Based Propulsion and Breathable-Gas Systems; Ultrasonic/Sonic Rotary-Hammer Drills; Miniature Piezoelectric Shaker for Distribution of Unconsolidated Samples to Instrument Cells; Lunar Soil Particle Separator; Advanced Aerobots for Scientific Exploration; Miniature Bioreactor System for Long-Term Cell Culture; Electrochemical Detection of Multiple Bioprocess Analytes; Fabrication and Modification of Nanoporous Silicon Particles; High-Altitude Hydration System; Photon Counting Using Edge-Detection Algorithm; Holographic Vortex Coronagraph; Optical Structural Health Monitoring Device; Fuel-Cell Power Source Based on Onboard Rocket Propellants; Polar Lunar Regions: Exploiting Natural and Augmented Thermal Environments; Simultaneous Spectral Temporal Adaptive Raman Spectrometer - SSTARS; Improved Speed and Functionality of a 580-GHz Imaging Radar; Bolometric Device Based on Fluxoid Quantization; Algorithms for Learning Preferences for Sets of Objects; Model for Simulating a Spiral Software-Development Process; Algorithm That Synthesizes Other Algorithms for Hashing; Algorithms for High-Speed Noninvasive Eye-Tracking System; and Adapting ASPEN for Orbital Express.

  19. Real-time Electrophysiology: Using Closed-loop Protocols to Probe Neuronal Dynamics and Beyond

    PubMed Central

    Linaro, Daniele; Couto, João; Giugliano, Michele

    2015-01-01

    Experimental neuroscience is witnessing an increased interest in the development and application of novel and often complex, closed-loop protocols, where the stimulus applied depends in real-time on the response of the system. Recent applications range from the implementation of virtual reality systems for studying motor responses both in mice [1] and in zebrafish [2], to control of seizures following cortical stroke using optogenetics [3]. A key advantage of closed-loop techniques resides in the capability of probing higher dimensional properties that are not directly accessible or that depend on multiple variables, such as neuronal excitability [4] and reliability, while at the same time maximizing the experimental throughput. In this contribution and in the context of cellular electrophysiology, we describe how to apply a variety of closed-loop protocols to the study of the response properties of pyramidal cortical neurons, recorded intracellularly with the patch clamp technique in acute brain slices from the somatosensory cortex of juvenile rats. As no commercially available or open source software provides all the features required for efficiently performing the experiments described here, a new software toolbox called LCG [5] was developed, whose modular structure maximizes reuse of computer code and facilitates the implementation of novel experimental paradigms. Stimulation waveforms are specified using a compact meta-description and full experimental protocols are described in text-based configuration files. Additionally, LCG has a command-line interface that is suited for repetition of trials and automation of experimental protocols. PMID:26132434

  20. Baking a mass-spectrometry data PIE with McMC and simulated annealing: predicting protein post-translational modifications from integrated top-down and bottom-up data.

    PubMed

    Jefferys, Stuart R; Giddings, Morgan C

    2011-03-15

    Post-translational modifications are vital to the function of proteins, but are hard to study, especially since several modified isoforms of a protein may be present simultaneously. Mass spectrometers are a great tool for investigating modified proteins, but the data they provide is often incomplete, ambiguous and difficult to interpret. Combining data from multiple experimental techniques, especially bottom-up and top-down mass spectrometry, provides complementary information. When integrated with background knowledge, this allows a human expert to interpret what modifications are present and where on a protein they are located. However, the process is arduous, and for high-throughput applications it needs to be automated. This article explores a data integration methodology based on Markov chain Monte Carlo and simulated annealing. Our software, the Protein Inference Engine (the PIE), applies these algorithms using a modular approach, allowing multiple types of data to be considered simultaneously and new data types to be added as needed. Even for complicated data representing multiple modifications and several isoforms, the PIE generates accurate modification predictions, including location. When applied to experimental data collected on the L7/L12 ribosomal protein, the PIE was able to make predictions consistent with manual interpretation for several different L7/L12 isoforms, using a combination of bottom-up data with experimentally identified intact masses. Software, demo projects and source can be downloaded from http://pie.giddingslab.org/
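
    To make the sampling idea concrete, here is a toy Metropolis-style simulated-annealing search over binary modification placements. The scoring model, masses and target shift are invented for illustration; the real PIE combines Markov chain Monte Carlo with a much richer score over bottom-up and top-down evidence.

    ```python
    # Toy simulated annealing over binary modification placements; the
    # scoring model and masses are invented for demonstration purposes.
    import math
    import random

    SITES = 10                  # candidate residues on the protein
    MOD_MASS = 14.016           # mass of one modification, Da (mock)
    TARGET_SHIFT = 42.047       # observed intact-mass shift, Da (mock)

    def score(state):
        """Negative absolute error between predicted and observed shift."""
        return -abs(sum(state) * MOD_MASS - TARGET_SHIFT)

    random.seed(1)
    state = [0] * SITES
    temp = 5.0
    for _ in range(2000):
        candidate = state[:]
        candidate[random.randrange(SITES)] ^= 1       # toggle one site
        delta = score(candidate) - score(state)
        if delta > 0 or random.random() < math.exp(delta / temp):
            state = candidate                         # Metropolis acceptance
        temp = max(0.01, temp * 0.999)                # cooling schedule

    print(state, round(score(state), 3))              # best placement found
    ```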

  1. Polar2Grid 2.0: Reprojecting Satellite Data Made Easy

    NASA Astrophysics Data System (ADS)

    Hoese, D.; Strabala, K.

    2015-12-01

    Polar-orbiting multi-band meteorological sensors such as those on the Suomi National Polar-orbiting Partnership (SNPP) satellite pose substantial challenges for taking imagery the last mile to forecast offices, scientific analysis environments, and the general public. To do this quickly and easily, the Cooperative Institute for Meteorological Satellite Studies (CIMSS) at the University of Wisconsin has created an open-source, modular application system, Polar2Grid. This bundled solution automates tools for converting various satellite products like those from VIIRS and MODIS into a variety of output formats, including GeoTIFFs, AWIPS compatible NetCDF files, and NinJo forecasting workstation compatible TIFF images. In addition to traditional visible and infrared imagery, Polar2Grid includes three perceptual enhancements for the VIIRS Day-Night Band (DNB), as well as providing the capability to create sharpened true color, sharpened false color, and user-defined RGB images. Polar2Grid performs conversions and projections in seconds on large swaths of data. Polar2Grid is currently providing VIIRS imagery over the Continental United States, as well as Alaska and Hawaii, from various Direct-Broadcast antennas to operational forecasters at the NOAA National Weather Service (NWS) offices in their AWIPS terminals, within minutes of an overpass of the Suomi NPP satellite. Three years after Polar2Grid development started, the Polar2Grid team is now releasing version 2.0 of the software; supporting more sensors, generating more products, and providing all of its features in an easy to use command line interface.

  2. Adaptive multi-resolution Modularity for detecting communities in networks

    NASA Astrophysics Data System (ADS)

    Chen, Shi; Wang, Zhi-Zhong; Bao, Mei-Hua; Tang, Liang; Zhou, Ji; Xiang, Ju; Li, Jian-Ming; Yi, Chen-He

    2018-02-01

    Community structure is a common topological property of complex networks, which has attracted much attention from various fields. Optimizing quality functions for community structures is a popular strategy for community detection, Modularity optimization being the classic example. Here, we introduce a general definition of Modularity, from which several classical (multi-resolution) Modularity functions can be derived, and then propose a kind of adaptive (multi-resolution) Modularity that can combine the advantages of different Modularity functions. By applying the Modularity to various synthetic and real-world networks, we study the behaviors of the methods, showing the validity and advantages of multi-resolution Modularity in community detection. The adaptive Modularity, as a kind of multi-resolution method, can naturally overcome the first-type limit of Modularity and detect communities at different scales; it can hasten the splitting of communities and delay their breakup in heterogeneous networks; and thus it is expected to generate stable community structures in networks more effectively and to have stronger tolerance against the second-type limit of Modularity.
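
    The paper's general definition is not reproduced in the abstract; as one concrete member of the multi-resolution family it refers to, a widely used form scales the null model by a resolution parameter (shown for illustration, not necessarily the authors' exact formulation):

    ```latex
    % gamma = 1 recovers classical Modularity; larger gamma favors smaller
    % communities, smaller gamma favors larger ones.
    Q(\gamma) = \frac{1}{2m} \sum_{i,j}
        \left[ A_{ij} - \gamma \frac{k_i k_j}{2m} \right] \delta(c_i, c_j)
    ```

    Here A is the adjacency matrix, k_i the degree of node i, m the total number of edges, and δ(c_i, c_j) selects node pairs assigned to the same community.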

  3. Product modular design incorporating preventive maintenance issues

    NASA Astrophysics Data System (ADS)

    Gao, Yicong; Feng, Yixiong; Tan, Jianrong

    2016-03-01

    Traditional modular design methods lead to product maintenance problems, because the module form of a system is created according to either the function requirements or the manufacturing considerations. To solve these problems, a new modular design method is proposed that considers not only the traditional function-related attributes, but also the maintenance-related ones. First, modularity parameters and modularity scenarios for product modularity are defined. Then the reliability and economic assessment models of product modularity strategies are formulated with the introduction of the effective working age of modules. A mathematical model is then used to evaluate the differences among the modules of the product so that the optimal module structure can be established. After that, a multi-objective optimization problem based on metrics for the preventive maintenance interval difference degree and preventive maintenance economics is formulated for modular optimization. A multi-objective GA is utilized to rapidly approximate the Pareto set of optimal modularity strategy trade-offs between preventive maintenance cost and preventive maintenance interval difference degree. Finally, a coordinate CNC boring machine is adopted to depict the process of product modularity. In addition, two factorial design experiments based on the modularity parameters are constructed and analyzed. These experiments investigate the impacts of these parameters on the optimal modularity strategies and the structure of the modules. The research proposes a new modular design method, which may help to improve the maintainability of products in modular design.
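
    The trade-off search at the heart of the method can be pictured with a minimal Pareto filter, of the kind a multi-objective GA applies during selection. The objective pairs below (preventive maintenance cost, interval-difference degree) are mock values, and both objectives are assumed to be minimized:

    ```python
    # Pareto filter over (maintenance cost, interval-difference degree);
    # both objectives are minimized. Values are mock strategies.
    def dominates(a, b):
        """a dominates b: no worse in every objective, better in at least one."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_front(candidates):
        return [c for c in candidates
                if not any(dominates(other, c) for other in candidates if other is not c)]

    strategies = [(120.0, 0.40), (95.0, 0.55), (150.0, 0.20), (95.0, 0.60), (110.0, 0.35)]
    print(pareto_front(strategies))   # non-dominated trade-offs survive
    ```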

  4. Scientific and technical services in the development of planetary quarantine measures for automated spacecraft

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Primary goals of the Planetary Quarantine Program are defined and used to provide a basis for planning and resource allocation toward the development of planetary quarantine measures for the following automated spacecraft: Viking 1975, Pioneer F and G, and Mariner Venus-Mercury 1973.

  5. Performance Analysis of GAME: A Generic Automated Marking Environment

    ERIC Educational Resources Information Center

    Blumenstein, Michael; Green, Steve; Fogelman, Shoshana; Nguyen, Ann; Muthukkumarasamy, Vallipuram

    2008-01-01

    This paper describes the Generic Automated Marking Environment (GAME) and provides a detailed analysis of its performance in assessing student programming projects and exercises. GAME has been designed to automatically assess programming assignments written in a variety of languages based on the "structure" of the source code and the correctness…

  6. Data for Environmental Modeling (D4EM): Background and Applications of Data Automation

    EPA Science Inventory

    The Data for Environmental Modeling (D4EM) project demonstrates the development of a comprehensive set of open source software tools that overcome obstacles to accessing data needed by automating the process of populating model input data sets with environmental data available fr...

  7. Optimized and Automated design of Plasma Diagnostics for Additive Manufacture

    NASA Astrophysics Data System (ADS)

    Stuber, James; Quinley, Morgan; Melnik, Paul; Sieck, Paul; Smith, Trevor; Chun, Katherine; Woodruff, Simon

    2016-10-01

    Despite having mature designs, diagnostics are usually custom designed for each experiment. Most of the design can now be automated to reduce costs (engineering labor and capital cost). We present results from scripted physics modeling and parametric engineering design for common optical and mechanical components found in many plasma diagnostics, and outline the process for automated design optimization, which employs scripts to communicate data from online forms through proprietary and open-source CAD and FE codes to produce a design that can be sent directly to a printer. As a demonstration of design automation, an optical beam dump, baffle and optical components are designed via an automated process and printed. Supported by DOE SBIR Grant DE-SC0011858.

  8. Crew emergency return vehicle - Electrical power system design study

    NASA Technical Reports Server (NTRS)

    Darcy, E. C.; Barrera, T. P.

    1989-01-01

    A crew emergency return vehicle (CERV) is proposed to perform the lifeboat function for the manned Space Station Freedom. This escape module will be permanently docked to Freedom and, on demand, will be capable of safely returning the crew to earth. The unique requirements that the CERV imposes on its power source are presented, power source options are examined, and a baseline system is selected. It consists of an active Li-BCX DD-cell modular battery system and was chosen for the maturity of its man-rated design and its low development costs.

  9. Small modular reactor modeling using modelica for nuclear-renewable hybrid energy systems applications

    DOE PAGES

    Mikkelson, Daniel; Chang, Chih -Wei; Cetiner, Sacit M.; ...

    2015-10-01

    Here, the U.S. Department of Energy (DOE) supports research and development (R&D) that could lead to more efficient utilization of clean energy generation sources, including renewable and nuclear options, to meet grid demand and industrial thermal energy needs [1]. One hybridization approach being investigated by the DOE Offices of Nuclear Energy (NE) and the DOE Energy Efficiency and Renewable Energy (EERE) is tighter coupling of nuclear and renewable energy sources to better manage overall energy use for the combined electricity, industrial manufacturing, and transportation sectors.

  10. Functional Brain Network Modularity Captures Inter- and Intra-Individual Variation in Working Memory Capacity

    PubMed Central

    Stevens, Alexander A.; Tappon, Sarah C.; Garg, Arun; Fair, Damien A.

    2012-01-01

    Background Cognitive abilities, such as working memory, differ among people; however, individuals also vary in their own day-to-day cognitive performance. One potential source of cognitive variability may be fluctuations in the functional organization of neural systems. The degree to which the organization of these functional networks is optimized may relate to the effective cognitive functioning of the individual. Here we specifically examine how changes in the organization of large-scale networks measured via resting state functional connectivity MRI and graph theory track changes in working memory capacity. Methodology/Principal Findings Twenty-two participants performed a test of working memory capacity and then underwent resting-state fMRI. Seventeen subjects repeated the protocol three weeks later. We applied graph theoretic techniques to measure network organization on 34 brain regions of interest (ROI). Network modularity, which measures the level of integration and segregation across sub-networks, and small-worldness, which measures global network connection efficiency, both predicted individual differences in memory capacity; however, only modularity predicted intra-individual variation across the two sessions. Partial correlations controlling for the component of working memory that was stable across sessions revealed that modularity was almost entirely associated with the variability of working memory at each session. Analyses of specific sub-networks and individual circuits were unable to consistently account for working memory capacity variability. Conclusions/Significance The results suggest that the intrinsic functional organization of an a priori defined cognitive control network measured at rest provides substantial information about actual cognitive performance. The association of network modularity to the variability in an individual's working memory capacity suggests that the organization of this network into high connectivity within modules and sparse connections between modules may reflect effective signaling across brain regions, perhaps through the modulation of signal or the suppression of the propagation of noise. PMID:22276205
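
    Both graph metrics are available in standard libraries. The sketch below computes them with NetworkX on a random 34-node surrogate network; the study's ROI definitions, connectivity estimation, and preprocessing are not reproduced here.

    ```python
    # Modularity and small-worldness on a surrogate 34-node network.
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities, modularity

    G = nx.connected_watts_strogatz_graph(34, 4, 0.2, seed=42)   # 34 "ROIs"

    parts = greedy_modularity_communities(G)        # sub-network decomposition
    Q = modularity(G, parts)                        # integration vs. segregation
    sigma = nx.sigma(G, niter=5, nrand=5, seed=42)  # small-worldness (slow on big graphs)

    print(f"modularity Q = {Q:.3f}, small-worldness sigma = {sigma:.3f}")
    ```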

  11. Development of modularity in the neural activity of childrenʼs brains

    NASA Astrophysics Data System (ADS)

    Chen, Man; Deem, Michael W.

    2015-02-01

    We study how modularity of the human brain changes as children develop into adults. Theory suggests that modularity can enhance the response function of a networked system subject to changing external stimuli. Thus, greater cognitive performance might be achieved for more modular neural activity, and modularity might likely increase as children develop. The value of modularity calculated from functional magnetic resonance imaging (fMRI) data is observed to increase during childhood development and peak in young adulthood. Head motion is deconvolved from the fMRI data, and it is shown that the dependence of modularity on age is independent of the magnitude of head motion. A model is presented to illustrate how modularity can provide greater cognitive performance at short times, i.e. task switching. A fitness function is extracted from the model. Quasispecies theory is used to predict how the average modularity evolves with age, illustrating the increase of modularity during development from children to adults that arises from selection for rapid cognitive function in young adults. Experiments exploring the effect of modularity on cognitive performance are suggested. Modularity may be a potential biomarker for injury, rehabilitation, or disease.
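
    For reference, quasispecies dynamics of the kind invoked above are conventionally written as follows (the textbook Eigen form; the paper's exact fitness function and mutation kernel come from its model and are not reproduced here):

    ```latex
    % x_i: population fraction with modularity value M_i; f_j: fitness;
    % q_{ij}: mutation kernel; the mean-fitness term keeps sum_i x_i = 1.
    % The age-dependent average modularity is <M>(t) = sum_i M_i x_i(t).
    \frac{dx_i}{dt} = \sum_j f_j \, q_{ij} \, x_j - \bar{f} \, x_i,
    \qquad \bar{f} = \sum_j f_j \, x_j
    ```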

  12. Corrosion on the acetabular liner taper from retrieved modular metal-on-metal total hip replacements.

    PubMed

    Gascoyne, Trevor C; Dyrkacz, Richard M; Turgeon, Thomas R; Burnell, Colin D; Wyss, Urs P; Brandt, Jan-M

    2014-10-01

    Eight retrieved metal-on-metal total hip replacements displayed corrosion damage along the cobalt-chromium alloy liner taper junction with the Ti alloy acetabular shell. Scanning electron microscopy indicated the primary mechanism of corrosion to be grain boundary and associated crevice corrosion, which was likely accelerated through mechanical micromotion and galvanic corrosion resulting from dissimilar alloys. Coordinate measurements revealed up to 4.3 mm³ of the cobalt-chromium alloy taper surface was removed due to corrosion, which is comparable to previous reports of corrosion damage on head-neck tapers. The acetabular liner-shell taper appears to be an additional source of metal corrosion products in modular total hip replacements. Patients with these prostheses should be closely monitored for signs of adverse reaction towards corrosion by-products.

  13. Modular Chemical Descriptor Language (MCDL): Stereochemical modules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gakh, Andrei A; Burnett, Michael N; Trepalin, Sergei V.

    2011-01-01

    In our previous papers we introduced the Modular Chemical Descriptor Language (MCDL) for providing a linear representation of chemical information. A subsequent development was the MCDL Java Chemical Structure Editor, which is capable of drawing chemical structures from linear representations and generating MCDL descriptors from structures. In this paper we present MCDL modules and accompanying software that incorporate a unique representation of molecular stereochemistry, based on Cahn-Ingold-Prelog and Fischer ideas, in constructing stereoisomer descriptors. The paper also contains additional discussion of the canonical representation of stereochemical isomers, and brief algorithm descriptions of the open source LINDES, Java applet, and Open Babel MCDL processing module software packages. Testing of the upgraded MCDL Java Chemical Structure Editor on compounds taken from several large and diverse chemical databases demonstrated satisfactory performance for storage and processing of stereochemical information in MCDL format.
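
    To make the stereo-descriptor idea concrete, the sketch below assigns Cahn-Ingold-Prelog R/S labels with RDKit, used here purely as a stand-in toolkit; MCDL's own implementations are the LINDES, Java applet, and Open Babel modules named above.

    ```python
    # CIP stereo-descriptor assignment with RDKit (illustrative stand-in).
    from rdkit import Chem

    mol = Chem.MolFromSmiles("C[C@H](N)C(=O)O")       # L-alanine
    Chem.AssignStereochemistry(mol, cleanIt=True, force=True)

    for atom in mol.GetAtoms():
        if atom.HasProp("_CIPCode"):                  # set only on stereocenters
            print(atom.GetIdx(), atom.GetSymbol(), atom.GetProp("_CIPCode"))
    # -> 1 C S  (the alpha carbon of L-alanine is S)
    ```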

  14. Scalable Light Module for Low-Cost, High-Efficiency Light- Emitting Diode Luminaires

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tarsa, Eric

    2015-08-31

    During this two-year program Cree developed a scalable, modular optical architecture for low-cost, high-efficacy light emitting diode (LED) luminaires. Stated simply, the goal of this architecture was to efficiently and cost-effectively convey light from LEDs (point sources) to broad luminaire surfaces (area sources). By simultaneously developing warm-white LED components and low-cost, scalable optical elements, a high system optical efficiency resulted. To meet program goals, Cree evaluated novel approaches to improve LED component efficacy at high color quality while not sacrificing LED optical efficiency relative to conventional packages. Meanwhile, efficiently coupling light from LEDs into modular optical elements, followed by optimally distributing and extracting this light, were challenges that were addressed via novel optical design coupled with frequent experimental evaluations. Minimizing luminaire bill of materials and assembly costs were two guiding principles for all design work, in the effort to achieve luminaires with significantly lower normalized cost ($/klm) than existing LED fixtures. Chief project accomplishments included the achievement of >150 lm/W warm-white LEDs having primary optics compatible with low-cost modular optical elements. In addition, a prototype Light Module optical efficiency of over 90% was measured, demonstrating the potential of this scalable architecture for ultra-high-efficacy LED luminaires. Since the project ended, Cree has continued to evaluate optical element fabrication and assembly methods in an effort to rapidly transfer this scalable, cost-effective technology to Cree production development groups. The Light Module concept is likely to make a strong contribution to the development of new cost-effective, high-efficacy luminaires, thereby accelerating widespread adoption of energy-saving SSL in the U.S.

  15. Automated Processing of Imaging Data through Multi-tiered Classification of Biological Structures Illustrated Using Caenorhabditis elegans.

    PubMed

    Zhan, Mei; Crane, Matthew M; Entchev, Eugeni V; Caballero, Antonio; Fernandes de Abreu, Diana Andrea; Ch'ng, QueeLim; Lu, Hang

    2015-04-01

    Quantitative imaging has become a vital technique in biological discovery and clinical diagnostics; a plethora of tools have recently been developed to enable new and accelerated forms of biological investigation. Increasingly, the capacity for high-throughput experimentation provided by new imaging modalities, contrast techniques, microscopy tools, microfluidics and computer controlled systems shifts the experimental bottleneck from the level of physical manipulation and raw data collection to automated recognition and data processing. Yet, despite their broad importance, image analysis solutions to address these needs have been narrowly tailored. Here, we present a generalizable formulation for autonomous identification of specific biological structures that is applicable to many problems. The process flow architecture we present here utilizes standard image processing techniques and the multi-tiered application of classification models such as support vector machines (SVMs). These low-level functions are readily available in a large array of image processing software packages and programming languages. Our framework is thus both easy to implement at the modular level and provides a specific high-level architecture to guide the solution of more complicated image-processing problems. We demonstrate the utility of the classification routine by developing two specific classifiers as a toolset for automation and cell identification in the model organism Caenorhabditis elegans. To serve a common need for automated high-resolution imaging and behavior applications in the C. elegans research community, we contribute a ready-to-use classifier for the identification of the head of the animal under bright field imaging. Furthermore, we extend our framework to address the pervasive problem of cell-specific identification under fluorescent imaging, which is critical for biological investigation in multicellular organisms or tissues. Using these examples as a guide, we envision the broad utility of the framework for diverse problems across different length scales and imaging methods.
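
    A minimal two-tier version of that architecture, using scikit-learn: tier one rejects off-target regions, tier two labels the survivors. Features and labels are mocked with random data; real use would substitute image-derived feature vectors at each tier.

    ```python
    # Two-tier classification: tier 1 separates objects from background,
    # tier 2 assigns a class within the accepted objects. Mock data only.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 16))               # candidate-region features
    is_object = (X[:, 0] > 0).astype(int)        # tier-1 labels (mock)
    cell_type = (X[:, 1] > 0).astype(int)        # tier-2 labels (mock)

    tier1 = SVC(kernel="rbf").fit(X, is_object)
    mask = tier1.predict(X) == 1                 # regions accepted by tier 1
    tier2 = SVC(kernel="rbf").fit(X[mask], cell_type[mask])

    labels = np.full(len(X), -1)                 # -1 = rejected at tier 1
    labels[mask] = tier2.predict(X[mask])
    print(np.bincount(labels + 1))               # counts: rejected / class 0 / class 1
    ```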

  16. Photochemical numerics for global-scale modeling: Fidelity and GCM testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elliott, S.; Jim Kao, Chih-Yue; Zhao, X.

    1995-03-01

    Atmospheric photochemistry lies at the heart of global-scale pollution problems, but it is a nonlinear system embedded in nonlinear transport and so must be modeled in three dimensions. Total earth grids are massive and kinetics require dozens of interacting tracers, taxing supercomputers to their limits in global calculations. A matrix-free and noniterative family scheme is described that permits chemical step sizes an order of magnitude or more larger than time constants for molecular groupings, in the 1-h range used for transport. Families are partitioned through linearized implicit integrations that produce stabilizing species concentrations for a mass-conserving forward solver. The kinetics are also parallelized by moving geographic loops innermost, and changes in the continuity equations are automated through list reading. The combination of speed, parallelization and automation renders the programs naturally modular. Accuracy lies within 1% for all species in week-long fidelity tests. A 50-species, 150-reaction stratospheric module tested in a spectral GCM benchmarks at 10 min CPU time per day and agrees with lower-dimensionality simulations. Tropospheric nonmethane hydrocarbon chemistry will soon be added, and inherently three-dimensional phenomena will be investigated both decoupled from dynamics and in a complete chemical GCM. 225 refs., 11 figs., 2 tabs.
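
    The core numerical device, a linearized implicit step that remains stable at transport-sized time steps, can be sketched for a toy two-species system. The mechanism and rate constants below are invented; the real scheme partitions full chemical families and runs over a global grid.

    ```python
    # One linearized, non-iterative implicit step for a stiff A <-> B system.
    import numpy as np

    k1, k2 = 50.0, 1.0                # fast and slow rate constants (mock)

    def rates(y):                     # d[A]/dt and d[B]/dt
        return np.array([-k1 * y[0] + k2 * y[1],
                          k1 * y[0] - k2 * y[1]])

    def jacobian(_y):                 # analytic Jacobian (constant here)
        return np.array([[-k1,  k2],
                         [ k1, -k2]])

    def linearized_implicit_step(y, h):
        """Solve (I - h*J) dy = h*f(y); stable even when h >> 1/k1."""
        lhs = np.eye(len(y)) - h * jacobian(y)
        return y + np.linalg.solve(lhs, h * rates(y))

    y = np.array([1.0, 0.0])
    for _ in range(10):
        y = linearized_implicit_step(y, h=3600.0)   # 1-h chemical step
    print(y, y.sum())                 # nears equilibrium; total mass conserved
    ```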

  17. Control of intelligent robots in space

    NASA Technical Reports Server (NTRS)

    Freund, E.; Buehler, CH.

    1989-01-01

    In view of space activities like the International Space Station, the Man-Tended Free-Flyer (MTFF) and free flying platforms, the development of intelligent robotic systems is gaining increasing importance. The range of applications that have to be performed by robotic systems in space includes, e.g., the execution of experiments in space laboratories, the service and maintenance of satellites and flying platforms, the support of automatic production processes, and the assembly of large network structures. Some of these tasks will require the development of bi-armed or of multiple robotic systems including functional redundancy. For the development of robotic systems able to perform this variety of tasks, a hierarchically structured modular concept of automation is required. This concept is characterized by high flexibility as well as by automatic specialization to the particular sequence of tasks that have to be performed. On the other hand, it has to be designed such that the human operator can influence or guide the system on different levels of control, supervision, and decision. This leads to requirements for the hardware and software concept which permit a range of applications of the robotic systems from telemanipulation to autonomous operation. The realization of this goal requires strong efforts in the development of new methods, software and hardware concepts, and their integration into an automation concept.

  18. A new web-based system to improve the monitoring of snow avalanche hazard in France

    NASA Astrophysics Data System (ADS)

    Bourova, Ekaterina; Maldonado, Eric; Leroy, Jean-Baptiste; Alouani, Rachid; Eckert, Nicolas; Bonnefoy-Demongeot, Mylene; Deschatres, Michael

    2016-05-01

    Snow avalanche data in the French Alps and Pyrenees have been recorded for more than 100 years in several databases. The increasing amount of observed data required a more integrative and automated service. Here we report the comprehensive web-based Snow Avalanche Information System newly developed to this end for three important data sets: an avalanche chronicle (Enquête Permanente sur les Avalanches, EPA), an avalanche map (Carte de Localisation des Phénomènes d'Avalanche, CLPA) and a compilation of hazard and vulnerability data recorded on selected paths endangering human settlements (Sites Habités Sensibles aux Avalanches, SSA). These data sets are now integrated into a common database, enabling full interoperability between all different types of snow avalanche records: digitized geographic data, avalanche descriptive parameters, eyewitness reports, photographs, hazard and risk levels, etc. The new information system is implemented through modular components using Java-based web technologies with Spring and Hibernate frameworks. It automates the manual data entry and improves the process of information collection and sharing, enhancing user experience and data quality, and offering new outlooks to explore and exploit the huge amount of snow avalanche data available for fundamental research and more applied risk assessment.

  19. Task-based image quality assessment in radiation therapy: initial characterization and demonstration with CT simulation images

    NASA Astrophysics Data System (ADS)

    Dolly, Steven R.; Anastasio, Mark A.; Yu, Lifeng; Li, Hua

    2017-03-01

    In current radiation therapy practice, image quality is still assessed subjectively or by utilizing physically-based metrics. Recently, a methodology for objective task-based image quality (IQ) assessment in radiation therapy was proposed by Barrett et al. [1] In this work, we present a comprehensive implementation and evaluation of this new IQ assessment methodology. A modular simulation framework was designed to perform an automated, computer-simulated end-to-end radiation therapy treatment. The fully simulated framework utilizes new learning-based stochastic object models (SOMs) to obtain known organ boundaries, generates a set of images directly from the numerical phantoms created with the SOM, and automates the image segmentation and treatment planning steps of a radiation therapy workflow. By use of this computational framework, therapeutic operating characteristic (TOC) curves can be computed and the area under the TOC curve (AUTOC) can be employed as a figure-of-merit to guide optimization of different components of the treatment planning process. The developed computational framework is employed to optimize X-ray CT pre-treatment imaging. We demonstrate that use of the radiation-therapy-based IQ measures leads to different imaging parameters than those obtained by use of physically-based measures.
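
    The figure-of-merit itself is a simple integral. The sketch below evaluates AUTOC with the trapezoid rule on a mock TOC curve; in the framework above, the curve is derived from simulated end-to-end treatments rather than an assumed functional form.

    ```python
    # Area under a (mock) therapeutic operating characteristic curve.
    import numpy as np

    p_complication = np.linspace(0.0, 1.0, 21)        # normal-tissue complication prob.
    p_tumor_control = p_complication ** 0.3           # mock TOC curve

    autoc = np.trapz(p_tumor_control, p_complication) # AUTOC figure-of-merit
    print(f"AUTOC = {autoc:.3f}")                     # 1.0 would be ideal
    ```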

  20. Advanced Nuclear Technology. Using Technology for Small Modular Reactor Staff Optimization, Improved Effectiveness, and Cost Containment, 3002007071

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loflin, Leonard

    Through this grant, the U.S. Department of Energy (DOE) will review several functional areas within a nuclear power plant, including fire protection, operations and operations support, refueling, training, procurement, maintenance, site engineering, and others. Several functional areas need to be examined, since there appears to be no single staffing area or approach that alone has the potential for significant staff optimization at new nuclear power plants. Several of the functional areas will require a review of technology options such as automation, remote monitoring, fleet-wide monitoring, new and specialized instrumentation, human factors engineering, risk-informed analysis and PRAs, component and system condition monitoring and reporting, just-in-time training, electronic and automated procedures, electronic tools for configuration management and license and design basis information, etc., that may be applied to support optimization. Additionally, the project will require a review of key regulatory issues that affect staffing and could be optimized with additional technology input. Opportunities to further optimize staffing levels and staffing functions by selection of design attributes of physical systems and structures also need to be identified. A goal of this project is to develop a prioritized assessment of the functional areas, and the R&D actions needed for those functional areas, to provide the best optimization.

  1. Dynamics of modularity of neural activity in the brain during development

    NASA Astrophysics Data System (ADS)

    Deem, Michael; Chen, Man

    2014-03-01

    Theory suggests that more modular systems can have better response functions at short times. This suggests that greater cognitive performance may be achieved for more modular neural activity, and that modularity of neural activity is therefore likely to increase with development in children. We study the relationship between age and modularity of brain neural activity in developing children. The value of modularity calculated from fMRI data is observed to increase during childhood development and peak in young adulthood. We interpret these results as evidence of selection for plasticity in the cognitive function of the human brain. We present a model to illustrate how modularity can provide greater cognitive performance at short times and enhance fast, low-level, automatic cognitive processes. Conversely, high-level, effortful, conscious cognitive processes may not benefit from modularity. We use quasispecies theory to predict how the average modularity evolves with age, given a fitness function extracted from the model. We suggest further experiments exploring the effect of modularity on cognitive performance and suggest that modularity may be a potential biomarker for injury, rehabilitation, or disease.

  2. MOSAIC--A Modular Approach to Data Management in Epidemiological Studies.

    PubMed

    Bialke, M; Bahls, T; Havemann, C; Piegsa, J; Weitmann, K; Wegner, T; Hoffmann, W

    2015-01-01

    In the context of an increasing number of multi-centric studies providing data from different sites and sources, the necessity for central data management (CDM) becomes undeniable. This is exacerbated by a multiplicity of featured data types, formats and interfaces. In relation to methodological medical research, the definition of central data management needs to be broadened beyond the simple storage and archiving of research data. This paper highlights typical requirements of CDM for cohort studies and registries and illustrates how orientation for CDM can be provided by addressing selected data management challenges. Therefore, in the first part of this paper, a short review summarises technical, organisational and legal challenges for CDM in cohort studies and registries. A deduced set of typical requirements of CDM in epidemiological research follows. In the second part, the MOSAIC project is introduced (a modular systematic approach to implement CDM). The modular nature of MOSAIC contributes to managing both technical and organisational challenges efficiently by providing practical tools. A short presentation of a first set of tools, aimed at selected CDM requirements in cohort studies and registries, comprises a template for comprehensive documentation of data protection measures and an interactive reference portal for gaining insights and sharing experiences, supplemented by modular software tools for the generation and management of generic pseudonyms, for participant management, and for sophisticated consent management. Altogether, work within MOSAIC addresses existing challenges in epidemiological research in the context of CDM and facilitates the standardized collection of data with pre-programmed modules and provided document templates. The necessary effort for in-house programming is reduced, which accelerates the start of data collection.
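
    As one concrete example of what a generic pseudonym module must provide (deterministic, keyed, and practically non-reversible), here is a minimal HMAC-based sketch; it illustrates the requirement, not MOSAIC's actual algorithm.

    ```python
    # Deterministic, keyed pseudonym generation (generic HMAC construction).
    import hashlib
    import hmac

    SECRET_KEY = b"replace-with-managed-secret"   # assumption: real key management exists

    def pseudonym(participant_id: str, length: int = 12) -> str:
        digest = hmac.new(SECRET_KEY, participant_id.encode(), hashlib.sha256)
        return digest.hexdigest()[:length].upper()

    print(pseudonym("participant-0042"))   # same input always maps to same pseudonym
    ```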

  3. Development of a Modular Research Platform to Create Medical Observational Studies for Mobile Devices.

    PubMed

    Zens, Martin; Grotejohann, Birgit; Tassoni, Adrian; Duttenhoefer, Fabian; Südkamp, Norbert P; Niemeyer, Philipp

    2017-05-23

    Observational studies have proven to be a valuable resource in medical research, especially when performed on a large scale. Recently, mobile device-based observational studies have been discovered by an increasing number of researchers as a promising new source of information. However, the development and deployment of app-based studies is not trivial and requires profound programming skills. The aim of this project was to develop a modular online research platform that allows researchers to create medical studies for mobile devices without extensive programming skills. The proposed modular research platform consists of three major components. A Web-based platform forms the researchers' main workplace. This platform communicates via a shared database with a platform-independent mobile app. Furthermore, a separate Web-based login platform for physicians and other health care professionals is outlined and completes the concept. A prototype of the research platform has been developed and is currently in beta testing. Simple questionnaire studies can be created within minutes and published for testing purposes. Screenshots of an example study are provided, and the general working principle is displayed. In this project, we have created a basis for a novel research platform. The necessity and implications of a modular approach were displayed and an outline for future development given. International researchers are invited and encouraged to participate in this ongoing project.

  4. Apparatus and method for automated monitoring of airborne bacterial spores

    NASA Technical Reports Server (NTRS)

    Ponce, Adrian (Inventor)

    2009-01-01

    An apparatus and method for automated monitoring of airborne bacterial spores. The apparatus is provided with an air sampler, a surface for capturing airborne spores, a thermal lysis unit to release DPA from bacterial spores, a source of lanthanide ions, and a spectrometer for excitation and detection of the characteristic fluorescence of the aromatic molecules in bacterial spores complexed with lanthanide ions. In accordance with the method, computer-programmed steps allow for automation of the apparatus for the monitoring of airborne bacterial spores.

  5. Aircraft noise prediction program validation

    NASA Technical Reports Server (NTRS)

    Shivashankara, B. N.

    1980-01-01

    A modular computer program (ANOPP) for predicting aircraft flyover and sideline noise was developed. A high quality flyover noise data base for aircraft that are representative of the U.S. commercial fleet was assembled. The accuracy of ANOPP with respect to the data base was determined. The data for source and propagation effects were analyzed and suggestions for improvements to the prediction methodology are given.

  6. Near-Infrared Neuroimaging with NinPy

    PubMed Central

    Strangman, Gary E.; Zhang, Quan; Zeffiro, Thomas

    2009-01-01

    There has been substantial recent growth in the use of non-invasive optical brain imaging in studies of human brain function in health and disease. Near-infrared neuroimaging (NIN) is one of the most promising of these techniques and, although NIN hardware continues to evolve at a rapid pace, software tools supporting optical data acquisition, image processing, statistical modeling, and visualization remain less refined. Python, a modular and computationally efficient development language, can support functional neuroimaging studies of diverse design and implementation. In particular, Python's easily readable syntax and modular architecture allow swift prototyping followed by efficient transition to stable production systems. As an introduction to our ongoing efforts to develop Python software tools for structural and functional neuroimaging, we discuss: (i) the role of non-invasive diffuse optical imaging in measuring brain function, (ii) the key computational requirements to support NIN experiments, (iii) our collection of software tools to support NIN, called NinPy, and (iv) future extensions of these tools that will allow integration of optical with other structural and functional neuroimaging data sources. Source code for the software discussed here will be made available at www.nmr.mgh.harvard.edu/Neural_SystemsGroup/software.html. PMID:19543449

  7. A summary of the results from the DOE advanced gas reactor (AGR) fuel development and qualification program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petti, David Andrew

    2017-04-01

    Modular high temperature gas-cooled reactor (HTGR) designs were developed to provide natural safety, which prevents core damage under all licensing basis events. The principle that guides their design concepts is to passively maintain core temperatures below fission product release thresholds under all accident scenarios. The required level of fuel performance and fission product retention reduces the radioactive source term by many orders of magnitude relative to source terms for other reactor types and allows a graded approach to emergency planning and the potential elimination of the need for evacuation and sheltering beyond a small exclusion area. Achieving this level, however, is predicated on exceptionally high coated-particle fuel fabrication quality and excellent performance under normal operation and accident conditions. The design goal of modular HTGRs is to meet the Environmental Protection Agency (EPA) Protective Action Guides (PAGs) for offsite dose at the Exclusion Area Boundary (EAB). To achieve this, the reactor design concepts require a level of fuel integrity that is far better than that achieved for all prior U.S.-manufactured tristructural isotropic (TRISO) coated particle fuel.

  8. Low energy stage study. Volume 1: Executive summary. [propulsion system configurations for orbital launching of space shuttle payloads

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Cost effective approaches for placing automated payloads into circular and elliptical orbits using energy requirements significantly lower than that provided by the smallest, currently planned shuttle upper stage, SSUS-D, were investigated. Launch costs were derived using both NASA existing/planned launch approaches as well as new propulsion concepts meeting low-energy regime requirements. Candidate new propulsion approaches considered were solid (tandem, cluster, and controlled), solid/liquid combinations, and all-liquid stages. Results show that the most economical way to deliver the 129 low-energy payloads is basically a new modular, short liquid bipropellant stage system for the large majority of the payloads. For the remainder of the payloads, the Shuttle with integral OMS would be used, with the Scout serving a few specialized payloads until the Shuttle becomes operational.

  9. Aspect-Oriented Model-Driven Software Product Line Engineering

    NASA Astrophysics Data System (ADS)

    Groher, Iris; Voelter, Markus

    Software product line engineering aims to reduce development time, effort, cost, and complexity by taking advantage of the commonality within a portfolio of similar products. The effectiveness of a software product line approach directly depends on how well feature variability within the portfolio is implemented and managed throughout the development lifecycle, from early analysis through maintenance and evolution. This article presents an approach that facilitates variability implementation, management, and tracing by integrating model-driven and aspect-oriented software development. Features are separated into distinct models and composed using aspect-oriented composition techniques at the model level. Model transformations support the transition from problem-space to solution-space models. Aspect-oriented techniques enable the explicit expression and modularization of variability at the model, template, and code levels. The presented concepts are illustrated with a case study of a home automation system.
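
    To make the idea of aspect-oriented feature composition at the model level concrete, here is a deliberately simplified Python sketch in which feature "aspects" weave additional elements into a base home-automation model. The model structure and feature names are hypothetical and do not reflect the authors' actual tooling.

        # Base model of the home automation product line (hypothetical).
        base_model = {
            "devices": ["light", "door"],
            "rules": [("door.open", "light.on")],
        }

        def security_feature(model):
            """Feature 'aspect': weaves a camera and a new rule into the model."""
            woven = dict(model)
            woven["devices"] = model["devices"] + ["camera"]
            woven["rules"] = model["rules"] + [("door.open", "camera.record")]
            return woven

        def compose(model, features):
            """Derive one concrete product model by applying selected features."""
            for weave in features:
                model = weave(model)
            return model

        product = compose(base_model, [security_feature])
        print(product["rules"])  # base rule plus the woven-in security rule

    Selecting a different feature list yields a different product from the same base model, which is the essence of managed variability.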

  10. ELSA: An integrated, semi-automated nebular abundance package

    NASA Astrophysics Data System (ADS)

    Johnson, Matthew D.; Levitt, Jesse S.; Henry, Richard B. C.; Kwitter, Karen B.

    We present ELSA, a new modular software package, written in C, to analyze and manage spectroscopic data from emission-line objects. In addition to calculating plasma diagnostics and abundances from nebular emission lines, the software provides a number of convenient features including the ability to ingest logs produced by IRAF's splot task, to semi-automatically merge spectra in different wavelength ranges, and to automatically generate various data tables in machine-readable or LaTeX format. ELSA features a highly sophisticated interstellar reddening correction scheme that takes into account temperature and density effects as well as He II contamination of the hydrogen Balmer lines. Abundance calculations are performed using a 5-level atom approximation with recent atomic data, based on R. Henry's ABUN program. Downloads and detailed documentation for all aspects of ELSA are available at the following URL:
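
    For readers unfamiliar with nebular reddening corrections, the Python sketch below shows the basic Balmer-decrement form of the calculation. ELSA's actual scheme additionally accounts for the temperature and density effects and He II contamination noted above, which are omitted here, and the numeric constants are common textbook values rather than ELSA's.

        import math

        # Simplified Balmer-decrement reddening correction (illustrative only).
        INTRINSIC_HA_HB = 2.86   # Case B H-alpha/H-beta ratio near 10^4 K (assumed)
        F_HALPHA = -0.33         # reddening function f(lambda) at H-alpha, f(H-beta) = 0

        def extinction_c(observed_ha_hb: float) -> float:
            """Logarithmic extinction c(H-beta) from the observed Balmer decrement."""
            return math.log10(INTRINSIC_HA_HB / observed_ha_hb) / F_HALPHA

        def deredden(flux_ratio: float, f_lambda: float, c: float) -> float:
            """Correct a line ratio F(lambda)/F(H-beta) for interstellar reddening."""
            return flux_ratio * 10.0 ** (c * f_lambda)

        c = extinction_c(3.4)              # e.g. an observed H-alpha/H-beta of 3.4
        print(f"c(H-beta) = {c:.3f}")
        print(deredden(3.4, F_HALPHA, c))  # recovers ~2.86 by construction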

  11. EMASS (tm): An expandable solution for NASA space data storage needs

    NASA Technical Reports Server (NTRS)

    Peterson, Anthony L.; Cardwell, P. Larry

    1992-01-01

    The data acquisition, distribution, processing, and archiving requirements of NASA and other U.S. Government data centers present significant data management challenges that must be met in the 1990's. The Earth Observing System (EOS) project alone is expected to generate daily data volumes greater than 2 Terabytes (2 x 10(exp 12) Bytes). As the scientific community makes use of this data, their work product will result in larger, increasingly complex data sets to be further exploited and managed. The challenge for data storage systems is to satisfy the initial data management requirements with cost effective solutions that provide for planned growth. This paper describes the expandable architecture of the E-Systems Modular Automated Storage System (EMASS (TM)), a mass storage system which is designed to support NASA's data capture, storage, distribution, and management requirements into the 21st century.

  12. EMASS (trademark): An expandable solution for NASA space data storage needs

    NASA Technical Reports Server (NTRS)

    Peterson, Anthony L.; Cardwell, P. Larry

    1991-01-01

    The data acquisition, distribution, processing, and archiving requirements of NASA and other U.S. Government data centers present significant data management challenges that must be met in the 1990's. The Earth Observing System (EOS) project alone is expected to generate daily data volumes greater than 2 Terabytes (2 x 10(exp 12) Bytes). As the scientific community makes use of this data, their work will result in larger, increasingly complex data sets to be further exploited and managed. The challenge for data storage systems is to satisfy the initial data management requirements with cost effective solutions that provide for planned growth. The expandable architecture of the E-Systems Modular Automated Storage System (EMASS(TM)), a mass storage system which is designed to support NASA's data capture, storage, distribution, and management requirements into the 21st century is described.

  13. Mission Level Autonomy for USSV

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terry; Stirb, Robert C.; Brizzolara, Robert

    2011-01-01

    This work provided an on-water demonstration of a wide range of mission-proven, advanced technologies at TRL 5+ that offer a totally integrated, modular approach to effectively addressing the majority of the key needs for full mission-level autonomous, cross-platform control of USVs. A wide-baseline stereo system mounted on the ONR USSV was shown to be an effective sensing modality for tracking dynamic contacts as a first step toward automated retrieval operations. The CASPER onboard planner/replanner successfully demonstrated real-time, on-water, resource-based analysis for mission-level goal achievement and on-the-fly opportunistic replanning. Full mixed-mode autonomy was demonstrated on-water with a seamless transition between operator override and return to the current mission plan. Autonomous cooperative operations for fixed-asset protection and High Value Unit escort using two USVs (AMN1 & 14m RHIB) were demonstrated during Trident Warrior 2010 in June 2010.

  14. Open-source software for collision detection in external beam radiation therapy

    NASA Astrophysics Data System (ADS)

    Suriyakumar, Vinith M.; Xu, Renee; Pinter, Csaba; Fichtinger, Gabor

    2017-03-01

    PURPOSE: Collision detection for external beam radiation therapy (RT) is important for eliminating the need for dry runs that aim to ensure patient safety. Commercial treatment planning systems (TPS) offer this feature, but they are expensive and proprietary. Cobalt-60 RT machines are a viable option for RT practice in low-budget scenarios. However, clinics in such settings are hesitant to invest in these machines due to a lack of affordable treatment planning software. We propose the creation of an open-source room's eye view visualization module with automated collision detection as part of the development of an open-source TPS. METHODS: An openly accessible linac 3D geometry model is sliced into the different components of the treatment machine. The model's movements are based on the International Electrotechnical Commission standard. Automated collision detection is implemented between the treatment machine's components. RESULTS: The room's eye view module was built in C++ as part of SlicerRT, an RT research toolkit built on 3D Slicer. The module was tested using head and neck and prostate RT plans. These tests verified that the module accurately modeled the movements of the treatment machine and radiation beam. Automated collision detection was verified using tests in which geometric parameters of the machine's components were changed, demonstrating accurate collision detection. CONCLUSION: Room's eye view visualization and automated collision detection are essential in a Cobalt-60 treatment planning system. Development of these features will advance the creation of an open-source TPS that may help increase the feasibility of adopting Cobalt-60 RT.
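
    A full TPS checks collisions between detailed 3D surface models, but the underlying idea can be sketched with coarse bounding spheres. The following Python toy illustrates automated sweeping of machine configurations; it is not SlicerRT code, and all dimensions are assumed values.

        import math

        # Each component is reduced to a bounding sphere whose center follows
        # the machine geometry; real systems use full 3D surface models.
        GANTRY_RADIUS_CM = 40.0     # assumed effective radius of the gantry head
        COUCH_RADIUS_CM = 50.0      # assumed effective radius of couch + patient
        ISO_TO_GANTRY_CM = 100.0    # source-axis distance (typical linac value)

        def gantry_center(gantry_angle_deg: float):
            """Gantry head center rotating about the isocenter in the x-z plane."""
            a = math.radians(gantry_angle_deg)
            return (ISO_TO_GANTRY_CM * math.sin(a), 0.0, ISO_TO_GANTRY_CM * math.cos(a))

        def collides(gantry_angle_deg: float, couch_center) -> bool:
            """True if the two bounding spheres overlap in this configuration."""
            dist = math.dist(gantry_center(gantry_angle_deg), couch_center)
            return dist < GANTRY_RADIUS_CM + COUCH_RADIUS_CM

        # Sweep gantry angles, flagging unsafe ones for a couch 20 cm below isocenter.
        unsafe = [a for a in range(0, 360, 5) if collides(a, (0.0, 0.0, -20.0))]
        print("potential collisions at gantry angles:", unsafe)

    Sweeping the plan's geometric parameters in this way is what allows collisions to be caught at planning time instead of during a dry run.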

  15. Modular Courses in British Higher Education: A Critical Assessment

    ERIC Educational Resources Information Center

    Church, Clive

    1975-01-01

    The trend towards modular course structures is examined. British conceptions of modularization are compared with American interpretations of modular instruction; the former are shown to be concerned almost exclusively with content, while the latter attempt more radical changes in students' learning behavior. Rationales for British modular schemes are…

  16. Automated structural classification of lipids by machine learning.

    PubMed

    Taylor, Ryan; Miller, Ryan H; Miller, Ryan D; Porter, Michael; Dalgleish, James; Prince, John T

    2015-03-01

    Modern lipidomics is largely dependent upon structural ontologies because of the great diversity exhibited in the lipidome, but no automated lipid classification exists to facilitate this partitioning. The size of the putative lipidome far exceeds the number currently classified, despite a decade of work. Automated classification would benefit ongoing classification efforts by decreasing the time needed and increasing the accuracy of classification while providing classifications for mass spectral identification algorithms. We introduce a tool that automates classification into the LIPID MAPS ontology of known lipids with >95% accuracy and novel lipids with 63% accuracy. The classification is based upon simple chemical characteristics and modern machine learning algorithms. The decision trees produced are intelligible and can be used to clarify implicit assumptions about the current LIPID MAPS classification scheme. These characteristics and decision trees are made available to facilitate alternative implementations. We also discovered many hundreds of lipids that are currently misclassified in the LIPID MAPS database, strongly underscoring the need for automated classification. Source code and chemical characteristic lists as SMARTS search strings are available under an open-source license at https://www.github.com/princelab/lipid_classifier.
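
    The general recipe of SMARTS-defined chemical characteristics feeding a decision tree can be sketched in Python with RDKit and scikit-learn as follows. The patterns, class labels, and toy training data are illustrative stand-ins, not the authors' published feature set.

        from rdkit import Chem
        from sklearn.tree import DecisionTreeClassifier

        # Illustrative substructure features; the real tool ships its own
        # SMARTS lists keyed to the LIPID MAPS ontology.
        FEATURES = {
            "ester": Chem.MolFromSmarts("C(=O)OC"),
            "phosphate": Chem.MolFromSmarts("P(=O)(O)O"),
            "amide": Chem.MolFromSmarts("C(=O)N"),
        }

        def featurize(smiles: str):
            """Binary vector: does the molecule contain each substructure?"""
            mol = Chem.MolFromSmiles(smiles)
            return [int(mol.HasSubstructMatch(p)) for p in FEATURES.values()]

        # Tiny toy training set (SMILES, class label).
        train = [
            ("CCCCCCCCCCCCCCCC(=O)OC", "glycerolipid-like"),
            ("CCCCCCCCCCCCCCCC(=O)NCCO", "sphingolipid-like"),
        ]
        X = [featurize(s) for s, _ in train]
        y = [label for _, label in train]
        clf = DecisionTreeClassifier().fit(X, y)
        print(clf.predict([featurize("CCCCCCCC(=O)OCC")]))

    Because the learned model is a decision tree over named substructure tests, its branches remain readable, which is what makes the implicit assumptions of a classification scheme inspectable.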

  17. Automation of Vapor-Diffusion Growth of Protein Crystals

    NASA Technical Reports Server (NTRS)

    Hamrick, David T.; Bray, Terry L.

    2005-01-01

    Some improvements have been made in a system of laboratory equipment developed previously for studying the crystallization of proteins from solution by use of dynamically controlled flows of dry gas. The improvements involve mainly (1) automation of dispensing of liquids for starting experiments, (2) automatic control of drying of protein solutions during the experiments, and (3) provision for automated acquisition of video images for monitoring experiments in progress and for post-experiment analysis. The automation of dispensing of liquids was effected by adding an automated liquid-handling robot that can aspirate source solutions and dispense them in either a hanging-drop or a sitting-drop configuration, whichever is specified, in each of 48 experiment chambers. A video camera of approximately the size and shape of a lipstick dispenser was added to a mobile stage that is part of the robot, in order to enable automated acquisition of images in each experiment chamber. The experiment chambers were redesigned to enable the use of sitting drops, enable backlighting of each specimen, and facilitate automation.
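
    The dynamically controlled dry-gas flow mentioned above amounts to a feedback loop on chamber humidity. The Python sketch below shows one plausible proportional-control formulation; the hardware interface (read_humidity, set_gas_flow) and the chamber response model are invented for illustration.

        # Hypothetical proportional control of drying: more dry gas when the
        # relative humidity over the drop is above the target.
        SETPOINT_RH = 75.0   # target relative humidity in percent (assumed)
        GAIN = 0.5           # proportional gain, flow units per %RH of error

        def control_step(read_humidity, set_gas_flow):
            """One control step: drive dry-gas flow from the humidity error."""
            error = read_humidity() - SETPOINT_RH
            flow = max(0.0, GAIN * error)   # flow cannot be negative
            set_gas_flow(flow)
            return flow

        # Crude simulated chamber: humidity drops in proportion to gas flow.
        rh = [95.0]
        for _ in range(10):
            flow = control_step(lambda: rh[-1], lambda f: None)
            rh.append(rh[-1] - 0.8 * flow)
        print([round(v, 1) for v in rh])   # converges toward the setpoint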

  18. TECHNICAL NOTE: Effect of bait delivery interval in an automated head-chamber system on respiration gas estimates when cattle are grazing rangeland

    USDA-ARS?s Scientific Manuscript database

    Agricultural methane emissions account for approximately 43% of all anthropogenic methane emissions, and the majority of agricultural CH4 emissions are attributed to enteric fermentation within ruminant livestock; therefore, interest is heightened in quantifying and mitigating this source. An automate...

  19. Comparative growth of trichoderma strains in different nutritional sources, using bioscreen c automated system

    PubMed Central

    Rossi-Rodrigues, Bianca Caroline; Brochetto-Braga, Márcia Regina; Tauk-Tornisielo, Sâmia Maria; Carmona, Eleonora Cano; Arruda, Valeska Marques; Chaud Netto, José

    2009-01-01

    Trichoderma is one of the fungal genera that produce important metabolites for industry. The growth of these organisms is a consequence of the nutritional sources used as well as of the physical conditions employed to cultivate them. In this work, the automated Bioscreen C system was used to evaluate the influence of different nutritional sources on the growth of Trichoderma strains (T. hamatum, T. harzianum, T. viride, and T. longibrachiatum) isolated from the soil in the Juréia-Itatins Ecological Station (JIES), São Paulo State, Brazil. The cultures were grown in liquid culture media containing different carbon (2%; w/v) and nitrogen (1%; w/v) sources at 28 °C and pH 6.5, agitated at 150 rpm for 72 h. The results showed, as expected, that glucose is superior to sucrose as a growth-stimulating carbon source in the Trichoderma strains studied, while yeast extract and tryptone were good growth-stimulating nitrogen sources in the cultivation of T. hamatum and T. harzianum. PMID:24031380
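
    Growth curves from a Bioscreen C-style reader are commonly summarized by fitting a growth model. The Python sketch below fits a logistic curve to simulated optical-density data to extract a growth rate; it is illustrative only and uses invented data, not the study's measurements.

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(t, K, r, t0):
            """Logistic growth: carrying capacity K, rate r, midpoint t0."""
            return K / (1.0 + np.exp(-r * (t - t0)))

        t = np.linspace(0, 72, 37)   # hours, one reading every 2 h (simulated)
        od_glucose = logistic(t, 1.2, 0.15, 24) + np.random.normal(0, 0.02, t.size)

        (K, r, t0), _ = curve_fit(logistic, t, od_glucose, p0=[1.0, 0.1, 20.0])
        print(f"glucose: K={K:.2f}, growth rate r={r:.3f}/h")

    Fitting each carbon or nitrogen source's curve this way reduces every 72 h trace to a few comparable parameters.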

  20. BATSE imaging survey of the Galactic plane

    NASA Technical Reports Server (NTRS)

    Grindlay, J. E.; Barret, D.; Bloser, P. F.; Zhang, S. N.; Robinson, C.; Harmon, B. A.

    1997-01-01

    The Burst and Transient Source Experiment (BATSE) onboard the Compton Gamma Ray Observatory (CGRO) provides all-sky monitoring capability, occultation analysis, and occultation imaging, which enables new and fainter sources to be searched for in relatively crowded fields. The occultation imaging technique is used in combination with an automated BATSE image scanner, allowing analysis of large data sets of occultation images for detections of candidate sources and for the construction of source catalogs and databases. This automated image scanner system is being tested on archival data in order to optimize the search and detection thresholds. The image search system, its calibration results, and preliminary survey results on archival data are reported. The aim of the survey is to identify a complete sample of black hole candidates in the Galaxy and constrain the number of black hole systems and neutron star systems.
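
    One core step of such an automated image scanner is flagging local maxima above a significance threshold. The Python sketch below shows a schematic version of that step on a simulated image; it is not the BATSE pipeline itself, and the threshold and image are invented for illustration.

        import numpy as np
        from scipy.ndimage import maximum_filter

        def find_candidates(image, threshold_sigma=5.0, size=3):
            """Return pixel coordinates of local peaks above threshold_sigma."""
            sigma = image.std()
            peaks = (image == maximum_filter(image, size=size))
            return np.argwhere(peaks & (image > threshold_sigma * sigma))

        img = np.random.normal(0, 1, (64, 64))   # simulated noise background
        img[40, 21] += 12.0                      # inject one bright "source"
        print(find_candidates(img))              # recovers the injected pixel

    Tuning threshold_sigma against archival data is the kind of detection-threshold optimization the abstract describes.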
