Sample records for automatic modular abstractions

  1. Automatization of hardware configuration for plasma diagnostic system

    NASA Astrophysics Data System (ADS)

    Wojenski, A.; Pozniak, K. T.; Kasprowicz, G.; Kolasinski, P.; Krawczyk, R. D.; Zabolotny, W.; Linczuk, P.; Chernyshova, M.; Czarski, T.; Malinowski, K.

    2016-09-01

    Soft X-ray plasma measurement systems are mostly multi-channel, high-performance systems. In the case of a modular construction it is necessary to perform sophisticated system discovery in parallel with automatic system configuration. In the paper, the structure of the modular system designed for tokamak plasma soft X-ray measurements is described. The concept of system discovery and further automatic configuration is also presented. The FCS application (FMC/FPGA Configuration Software) is used to run the sophisticated system setup with automatic verification of proper configuration. In order to provide flexibility for further system configurations (e.g. user setup), a common communication interface is also described. The approach presented here is related to the automatic system firmware building presented in previous papers. Modular construction and multichannel measurements are key requirements for SXR diagnostics with the use of GEM detectors.

  2. Contour interpolation: A case study in Modularity of Mind.

    PubMed

    Keane, Brian P

    2018-05-01

    In his monograph Modularity of Mind (1983), philosopher Jerry Fodor argued that mental architecture can be partly decomposed into computational organs termed modules, which were characterized as having nine co-occurring features such as automaticity, domain specificity, and informational encapsulation. Do modules exist? Debates thus far have been framed very generally with few, if any, detailed case studies. The topic is important because it has direct implications on current debates in cognitive science and because it potentially provides a viable framework from which to further understand and make hypotheses about the mind's structure and function. Here, the case is made for the modularity of contour interpolation, which is a perceptual process that represents non-visible edges on the basis of how surrounding visible edges are spatiotemporally configured. There is substantial evidence that interpolation is domain specific, mandatory, fast, and developmentally well-sequenced; that it produces representationally impoverished outputs; that it relies upon a relatively fixed neural architecture that can be selectively impaired; that it is encapsulated from belief and expectation; and that its inner workings cannot be fathomed through conscious introspection. Upon differentiating contour interpolation from a higher-order contour representational ability ("contour abstraction") and upon accommodating seemingly inconsistent experimental results, it is argued that interpolation is modular to the extent that the initiating conditions for interpolation are strong. As interpolated contours become more salient, the modularity features emerge. The empirical data, taken as a whole, show that at least certain parts of the mind are modularly organized. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Faunus: An object oriented framework for molecular simulation

    PubMed Central

    Lund, Mikael; Trulsson, Martin; Persson, Björn

    2008-01-01

    Background We present a C++ class library for Monte Carlo simulation of molecular systems, including proteins in solution. The design is generic and highly modular, enabling multiple developers to easily implement additional features. The statistical mechanical methods are documented by extensive use of code comments that – subsequently – are collected to automatically build a web-based manual. Results We show how an object oriented design can be used to create an intuitively appealing coding framework for molecular simulation. This is exemplified in a minimalistic C++ program that can calculate protein protonation states. We further discuss performance issues related to high level coding abstraction. Conclusion C++ and the Standard Template Library (STL) provide a high-performance platform for generic molecular modeling. Automatic generation of code documentation from inline comments has proven particularly useful in that no separate manual needs to be maintained. PMID:18241331

  4. Design of a modular digital computer system, DRL 4. [for meeting future requirements of spaceborne computers

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The design is reported of an advanced modular computer system designated the Automatically Reconfigurable Modular Multiprocessor System, which anticipates requirements for higher computing capacity and reliability for future spaceborne computers. Subjects discussed include: an overview of the architecture, mission analysis, synchronous and nonsynchronous scheduling control, reliability, and data transmission.

  5. A modular (almost) automatic set-up for elastic multi-tenants cloud (micro)infrastructures

    NASA Astrophysics Data System (ADS)

    Amoroso, A.; Astorino, F.; Bagnasco, S.; Balashov, N. A.; Bianchi, F.; Destefanis, M.; Lusso, S.; Maggiora, M.; Pellegrino, J.; Yan, L.; Yan, T.; Zhang, X.; Zhao, X.

    2017-10-01

    An auto-installing tool on a USB drive allows for quick and easy automatic deployment of OpenNebula-based cloud infrastructures remotely managed by a central VMDIRAC instance. A single team, in the main site of an HEP collaboration or elsewhere, can manage and run a relatively large network of federated (micro-)cloud infrastructures, making a highly dynamic and elastic use of computing resources. Exploiting such an approach can lead to modular systems of cloud-bursting infrastructures addressing complex real-life scenarios.

  6. Stored program concept for analog computers

    NASA Technical Reports Server (NTRS)

    Hannauer, G., III; Patmore, J. R.

    1971-01-01

    Optimization of three-stage matrices, modularization, and black-box design techniques provides for automatically interconnecting computing component inputs and outputs in a general-purpose analog computer. The design also produces a relatively inexpensive and less complex automatic patching system.

  7. Design of a modular digital computer system

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A Central Control Element (CCE) module which controls the Automatically Reconfigurable Modular System (ARMS) and allows both redundant processing and multi-computing in the same computer with real time mode switching, is discussed. The same hardware is used for either reliability enhancement, speed enhancement, or for a combination of both.

  8. New ARCH: Future Generation Internet Architecture

    DTIC Science & Technology

    2004-08-01

    ...a vocabulary to talk about a system. This provides a framework (a "reference model")... layered model... Modularity and abstraction are central tenets of Computer Science thinking. Modularity breaks a system into parts, normally to permit... this complexity is hidden. Abstraction suggests a structure for the system. A popular and simple structure is a layered model: lower layer...

  9. Dynamics of modularity of neural activity in the brain during development

    NASA Astrophysics Data System (ADS)

    Deem, Michael; Chen, Man

    2014-03-01

    Theory suggests that more modular systems can have better response functions at short times. This theory suggests that greater cognitive performance may be achieved for more modular neural activity, and that modularity of neural activity may, therefore, likely increase with development in children. We study the relationship between age and modularity of brain neural activity in developing children. The value of modularity calculated from fMRI data is observed to increase during childhood development and peak in young adulthood. We interpret these results as evidence of selection for plasticity in the cognitive function of the human brain. We present a model to illustrate how modularity can provide greater cognitive performance at short times and enhance fast, low-level, automatic cognitive processes. Conversely, high-level, effortful, conscious cognitive processes may not benefit from modularity. We use quasispecies theory to predict how the average modularity evolves with age, given a fitness function extracted from the model. We suggest further experiments exploring the effect of modularity on cognitive performance and suggest that modularity may be a potential biomarker for injury, rehabilitation, or disease.
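
    The modularity values referred to above are the standard network-theoretic quantity. As a minimal, self-contained illustration (not the authors' analysis pipeline; the network and partition below are toy assumptions), Newman's modularity Q can be computed directly from an adjacency matrix and a community assignment:

    ```python
    # A minimal sketch (not the authors' code): Newman's modularity Q for a
    # hypothetical partition of a thresholded correlation network.
    import numpy as np

    def modularity(A, labels):
        """Newman's Q for an undirected adjacency matrix A and per-node labels."""
        A = np.asarray(A, dtype=float)
        k = A.sum(axis=1)                       # node degrees
        two_m = k.sum()                         # 2 * number of edges
        same = np.equal.outer(labels, labels)   # delta(c_i, c_j)
        return ((A - np.outer(k, k) / two_m) * same).sum() / two_m

    # Toy example: two triangles joined by a single edge.
    A = np.zeros((6, 6))
    for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
        A[i, j] = A[j, i] = 1
    print(modularity(A, np.array([0, 0, 0, 1, 1, 1])))  # ~0.357, clearly modular
    ```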

  10. A modular approach to detection and identification of defects in rough lumber

    Treesearch

    Sang Mook Lee; A. Lynn Abbott; Daniel L. Schmoldt

    2001-01-01

    This paper describes a prototype scanning system that can automatically identify several important defects on rough hardwood lumber. The scanning system utilizes 3 laser sources and an embedded-processor camera to capture and analyze profile and gray-scale images. The modular approach combines the detection of wane (the curved sides of a board, possibly containing...

  11. Automatic Modeling and Simulation of Modular Robots

    NASA Astrophysics Data System (ADS)

    Jiang, C.; Wei, H.; Zhang, Y.

    2018-03-01

    The ability to reconfigure makes modular robots adaptable, low-cost, self-healing and fault-tolerant, and allows them to be applied to a variety of mission situations. In this manuscript, a robot platform relying on a module library was designed, based on screw theory and module theory. A configuration design method for the modular robot was then proposed, and different configurations of the modular robot system were built, including industrial mechanical arms, a mobile platform, a six-legged robot and a 3D exoskeleton manipulator. Finally, simulation and verification of one of these systems was carried out, using screw kinematics analyses and polynomial planning. The results of the experiments demonstrate the feasibility and superiority of this modular system.

  12. Programming with models: modularity and abstraction provide powerful capabilities for systems biology

    PubMed Central

    Mallavarapu, Aneil; Thomson, Matthew; Ullian, Benjamin; Gunawardena, Jeremy

    2008-01-01

    Mathematical models are increasingly used to understand how phenotypes emerge from systems of molecular interactions. However, their current construction as monolithic sets of equations presents a fundamental barrier to progress. Overcoming this requires modularity, enabling sub-systems to be specified independently and combined incrementally, and abstraction, enabling generic properties of biological processes to be specified independently of specific instances. These, in turn, require models to be represented as programs rather than as datatypes. Programmable modularity and abstraction enables libraries of modules to be created, which can be instantiated and reused repeatedly in different contexts with different components. We have developed a computational infrastructure that accomplishes this. We show here why such capabilities are needed, what is required to implement them and what can be accomplished with them that could not be done previously. PMID:18647734
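
    As a rough sketch of the "models as programs" idea (purely illustrative; the function names below are invented and this is not the authors' infrastructure), reaction modules can be written as composable functions that each contribute rate terms, so sub-models are specified independently and combined incrementally:

    ```python
    # Conceptual sketch only (hypothetical names): reusable reaction "modules"
    # as programs that each add their rate terms to a shared dx/dt.
    from collections import defaultdict

    def mass_action(reactants, products, k):
        """Return a module: a function adding one mass-action reaction's terms."""
        def contribute(rates, state):
            flux = k
            for s in reactants:
                flux *= state[s]
            for s in reactants:
                rates[s] -= flux
            for s in products:
                rates[s] += flux
        return contribute

    def compose(*modules):
        """Combine independently specified modules into one dx/dt function."""
        def dxdt(state):
            rates = defaultdict(float)
            for m in modules:
                m(rates, state)
            return dict(rates)
        return dxdt

    # Instantiate the same generic module twice with different components.
    model = compose(mass_action(["E", "S"], ["ES"], k=1.0),
                    mass_action(["ES"], ["E", "P"], k=0.5))
    print(model({"E": 1.0, "S": 2.0, "ES": 0.0, "P": 0.0}))  # net rates per species
    ```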

  13. Programming with models: modularity and abstraction provide powerful capabilities for systems biology.

    PubMed

    Mallavarapu, Aneil; Thomson, Matthew; Ullian, Benjamin; Gunawardena, Jeremy

    2009-03-06

    Mathematical models are increasingly used to understand how phenotypes emerge from systems of molecular interactions. However, their current construction as monolithic sets of equations presents a fundamental barrier to progress. Overcoming this requires modularity, enabling sub-systems to be specified independently and combined incrementally, and abstraction, enabling generic properties of biological processes to be specified independently of specific instances. These, in turn, require models to be represented as programs rather than as datatypes. Programmable modularity and abstraction enables libraries of modules to be created, which can be instantiated and reused repeatedly in different contexts with different components. We have developed a computational infrastructure that accomplishes this. We show here why such capabilities are needed, what is required to implement them and what can be accomplished with them that could not be done previously.

  14. Feasibility of Nuclear Power on U.S. Military Installations. 2nd Revision

    DTIC Science & Technology

    2011-03-01

    Keywords: Small Modular Reactor, Military Installation Energy, Energy Assurance. ... Preliminary note: Development and commercial deployment of small modular reactors...

  15. Molecular solid-state inverter-converter system

    NASA Technical Reports Server (NTRS)

    Birchenough, A. G.

    1973-01-01

    A modular approach for aerospace electrical systems has been developed, using lightweight high efficiency pulse width modulation techniques. With the modular approach, a required system is obtained by paralleling modules. The modular system includes the inverters and converters, a paralleling system, and an automatic control and fault-sensing protection system with a visual annunciator. The output is 150 V dc, or a low distortion three phase sine wave at 120 V, 400 Hz. Input power is unregulated 56 V dc. Each module is rated 2.5 kW or 3.6 kVA at 0.7 power factor.

  16. Unmanned Aerial Vehicle Non Line of Sight Chemical Detection Final Report

    DTIC Science & Technology

    2016-12-01

    ...aircraft system that is used to perform point detection of chemical warfare agents and collection of vapor, liquid, and solid samples. A modular payload... Keywords: Standoff; Quadcopter; Unmanned aircraft system; Modular payload. ...Manufacturing Division, modular payloads are being developed to perform point detection and CBRNE sampling. The available UAS is a quadcopter capable of...

  17. Automatic Text Structuring and Summarization.

    ERIC Educational Resources Information Center

    Salton, Gerard; And Others

    1997-01-01

    Discussion of the use of information retrieval techniques for automatic generation of semantic hypertext links focuses on automatic text summarization. Topics include World Wide Web links, text segmentation, and evaluation of text summarization by comparing automatically generated abstracts with manually prepared abstracts. (Author/LRW)

  18. Automatic Assembly of Combined Checking Fixture for Auto-Body Components Based on Fixture Elements Libraries

    NASA Astrophysics Data System (ADS)

    Jiang, Jingtao; Sui, Rendong; Shi, Yan; Li, Furong; Hu, Caiqi

    In this paper, 3-D models of combined fixture elements are designed, classified by their functions, and stored in element libraries: a supporting elements library, a jointing elements library, a basic elements library, a localization elements library, a clamping elements library, an adjusting elements library, etc. Automatic assembly of a 3-D combined checking fixture for an auto-body part is then presented based on modularization theory. In the virtual auto-body assembly space, a locating-constraint mapping technique and an assembly rule-based reasoning technique are used to calculate the positions of the modular elements according to the localization and clamp points of the auto-body part. The auto-body part model is transformed from its own coordinate system into the virtual assembly space by a homogeneous transformation matrix. Automatic assembly of the different functional fixture elements and the auto-body part is implemented with API functions based on secondary development of UG. It is proven in practice that the method in this paper is feasible and highly efficient.
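
    The coordinate mapping step can be illustrated with a short sketch (hypothetical numbers; not the paper's UG secondary-development code): a 4x4 homogeneous transformation matrix carries part-frame localization points into the virtual assembly space:

    ```python
    # Minimal illustration: mapping part-frame points into the assembly space
    # with a homogeneous transform T = [[R, t], [0, 1]].
    import numpy as np

    def homogeneous(R, t):
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = t
        return T

    def transform_points(T, pts):
        pts = np.asarray(pts, dtype=float)
        homog = np.hstack([pts, np.ones((len(pts), 1))])  # to homogeneous coords
        return (T @ homog.T).T[:, :3]

    # Example: rotate a part 90 degrees about z and translate it into place.
    Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
    T = homogeneous(Rz, t=[100.0, 50.0, 0.0])
    print(transform_points(T, [[10.0, 0.0, 5.0]]))  # -> [[100., 60., 5.]]
    ```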

  19. Modular Algorithm Testbed Suite (MATS): A Software Framework for Automatic Target Recognition

    DTIC Science & Technology

    2017-01-01

    ...naval mine countermeasures (MCM) operations by automating a large portion of the data analysis. Successful long-term implementation of ATR requires a... Keywords: Modular Algorithm Testbed Suite; MATS; Mine Countermeasures Operations.

  20. A truly Lego®-like modular microfluidics platform

    NASA Astrophysics Data System (ADS)

    Vittayarukskul, Kevin; Lee, Abraham Phillip

    2017-03-01

    Ideally, a modular microfluidics platform should be simple to assemble and support 3D configurations for increased versatility. The modular building blocks should also be mass producible like electrical components. These are fundamental features of world-renowned Legos® and why Legos® inspire many existing modular microfluidics platforms. In this paper, a truly Lego®-like microfluidics platform is introduced, and its basic feasibility is demonstrated. Here, PDMS building blocks resembling 2  ×  2 Lego® bricks are cast from 3D-printed master molds. The blocks are pegged and stacked on a traditional Lego® plate to create simple, 3D microfluidic networks, such as a single basket weave. Characteristics of the platform, including reversible sealing and automatic alignment of channels, are also analyzed and discussed in detail.

  1. Automatic Classification of Medical Text: The Influence of Publication Form

    PubMed Central

    Cole, William G.; Michael, Patricia A.; Stewart, James G.; Blois, Marsden S.

    1988-01-01

    Previous research has shown that within the domain of medical journal abstracts the statistical distribution of words is neither random nor uniform, but is highly characteristic. Many words are used mainly or solely by one medical specialty or when writing about one particular level of description. Due to this regularity of usage, automatic classification within journal abstracts has proved quite successful. The present research asks two further questions. It investigates whether this statistical regularity and automatic classification success can also be achieved in medical textbook chapters. It then goes on to see whether the statistical distribution found in textbooks is sufficiently similar to that found in abstracts to permit accurate classification of abstracts based solely on previous knowledge of textbooks. 14 textbook chapters and 45 MEDLINE abstracts were submitted to an automatic classification program that had been trained only on chapters drawn from a standard textbook series. Statistical analysis of the properties of abstracts vs. chapters revealed important differences in word use. Automatic classification performance was good for chapters, but poor for abstracts.
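
    A toy sketch of this kind of word-distribution classifier is shown below (hypothetical vocabulary and training texts, with add-one smoothing; not the original program): a document is assigned to the specialty whose training word profile gives it the highest score:

    ```python
    # Toy sketch (hypothetical data): classify a text by comparing it against
    # per-specialty word-frequency profiles with a simple smoothed log score.
    import math
    from collections import Counter

    training = {
        "cardiology":  "infarction ventricular ischemia artery stenosis",
        "dermatology": "lesion melanoma rash epidermis biopsy",
    }
    vocab = set(" ".join(training.values()).split())
    profiles = {c: Counter(text.split()) for c, text in training.items()}

    def classify(text):
        words = [w for w in text.lower().split() if w in vocab]
        scores = {}
        for c, counts in profiles.items():
            total = sum(counts.values())
            scores[c] = sum(math.log((counts[w] + 1) / (total + len(vocab)))
                            for w in words)
        return max(scores, key=scores.get)

    print(classify("Biopsy of the lesion confirmed melanoma"))  # -> dermatology
    ```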

  2. Design of a modular digital computer system, CDRL no. D001, final design plan

    NASA Technical Reports Server (NTRS)

    Easton, R. A.

    1975-01-01

    The engineering breadboard implementation for the CDRL no. D001 modular digital computer system developed during design of the logic system was documented. This effort followed the architecture study completed and documented previously, and was intended to verify the concepts of a fault tolerant, automatically reconfigurable, modular version of the computer system conceived during the architecture study. The system has a microprogrammed 32 bit word length, general register architecture and an instruction set consisting of a subset of the IBM System 360 instruction set plus additional fault tolerance firmware. The following areas were covered: breadboard packaging, central control element, central processing element, memory, input/output processor, and maintenance/status panel and electronics.

  3. ANNUAL REPORT-AUTOMATIC INDEXING AND ABSTRACTING.

    ERIC Educational Resources Information Center

    Lockheed Missiles and Space Co., Palo Alto, CA. Electronic Sciences Lab.

    The investigation is concerned with the development of automatic indexing, abstracting, and extracting systems. Basic investigations in English morphology, phonetics, and syntax are pursued as necessary means to this end. In the first section the theory and design of the "Sentence Dictionary" experiment in automatic extraction is outlined. Some of…

  4. 28-Bit serial word simulator/monitor

    NASA Technical Reports Server (NTRS)

    Durbin, J. W.

    1979-01-01

    Modular interface unit transfers data at high speeds along four channels. Device expedites variable-word-length communication between computers. Operation eases exchange of bit information by automatically reformatting coded input data and status information to match requirements of output.

  5. Automatic Identification System modular receiver for academic purposes

    NASA Astrophysics Data System (ADS)

    Cabrera, F.; Molina, N.; Tichavska, M.; Araña, V.

    2016-07-01

    The Automatic Identification System (AIS) standard is encompassed within the Global Maritime Distress and Safety System (GMDSS), in force since 1999. The GMDSS is a set of procedures, equipment, and communication protocols designed with the aim of increasing the safety of sea crossings, facilitating navigation, and aiding the rescue of vessels in danger. The use of this system is not only increasingly attractive for security purposes, but it also potentially creates intelligence products through the added-value information that this network can transmit from ships in real time (identification, position, course, speed, dimensions, flag, among others). Within the marine electronics market, commercial receivers implement this standard and allow users to access vessel-broadcasted information when within the range of coverage. In addition to satellite services, users may request actionable information from private or public AIS terrestrial networks, where real-time feeds or historical data can be accessed from their nodes. This paper describes the configuration of an AIS receiver based on a modular design. The modular design facilitates the evaluation of specific modules, a better understanding of the standard, and the possibility of changing hardware modules to improve the performance of the prototype. Thus, the aim of this paper is to describe the system's specifications and its main hardware components, and to present educational didactics on the setup and use of a modular, terrestrial AIS receiver for academic purposes in undergraduate studies such as electrical engineering, telecommunications, and maritime studies.

  6. Measuring, Enabling and Comparing Modularity, Regularity and Hierarchy in Evolutionary Design

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2005-01-01

    For computer-automated design systems to scale to complex designs they must be able to produce designs that exhibit the characteristics of modularity, regularity and hierarchy - characteristics that are found both in man-made and natural designs. Here we claim that these characteristics are enabled by implementing the attributes of combination, control-flow and abstraction in the representation. To support this claim we use an evolutionary algorithm to evolve solutions to different sizes of a table design problem using five different representations, each with different combinations of modularity, regularity and hierarchy enabled, and show that the best performance happens when all three of these attributes are enabled. We also define metrics for modularity, regularity and hierarchy in design encodings and demonstrate that high fitness values are achieved with high values of modularity, regularity and hierarchy and that there is a positive correlation between increases in fitness and increases in modularity, regularity and hierarchy.

  7. Automatic Processing of Metallurgical Abstracts for the Purpose of Information Retrieval. Final Report.

    ERIC Educational Resources Information Center

    Melton, Jessica S.

    Objectives of this project were to develop and test a method for automatically processing the text of abstracts for a document retrieval system. The test corpus consisted of 768 abstracts from the metallurgical section of Chemical Abstracts (CA). The system, based on a subject indexing rationale, had two components: (1) a stored dictionary of words…

  8. Program document for Energy Systems Optimization Program 2 (ESOP2). Volume 1: Engineering manual

    NASA Technical Reports Server (NTRS)

    Hamil, R. G.; Ferden, S. L.

    1977-01-01

    The Energy Systems Optimization Program, which is used to provide analyses of Modular Integrated Utility Systems (MIUS), is discussed. Modifications to the input format to allow modular inputs in specified blocks of data are described. An optimization feature which enables the program to search automatically for the minimum value of one parameter while varying the value of other parameters is reported. New program option flags for prime mover analyses and solar energy for space heating and domestic hot water are also covered.

  9. JDFTx: Software for joint density-functional theory

    DOE PAGES

    Sundararaman, Ravishankar; Letchworth-Weaver, Kendra; Schwarz, Kathleen A.; ...

    2017-11-14

    Density-functional theory (DFT) has revolutionized computational prediction of atomic-scale properties from first principles in physics, chemistry and materials science. Continuing development of new methods is necessary for accurate predictions of new classes of materials and properties, and for connecting to nano- and mesoscale properties using coarse-grained theories. JDFTx is a fully-featured open-source electronic DFT software designed specifically to facilitate rapid development of new theories, models and algorithms. Using an algebraic formulation as an abstraction layer, compact C++11 code automatically performs well on diverse hardware including GPUs (Graphics Processing Units). This code hosts the development of joint density-functional theory (JDFT) that combines electronic DFT with classical DFT and continuum models of liquids for first-principles calculations of solvated and electrochemical systems. In addition, the modular nature of the code makes it easy to extend and interface with, facilitating the development of multi-scale toolkits that connect to ab initio calculations, e.g. photo-excited carrier dynamics combining electron and phonon calculations with electromagnetic simulations.

  10. A software defined RTU multi-protocol automatic adaptation data transmission method

    NASA Astrophysics Data System (ADS)

    Jin, Huiying; Xu, Xingwu; Wang, Zhanfeng; Ma, Weijun; Li, Sheng; Su, Yong; Pan, Yunpeng

    2018-02-01

    The remote terminal unit (RTU) is the core device of monitoring systems in hydrology and water resources. Different devices often have different communication protocols in the application layer, which makes information analysis and communication networking difficult. Therefore, we introduce the idea of software-defined hardware, abstract the common features of the mainstream RTU application-layer communication protocols, and propose a unified common protocol model. The various application-layer communication protocol algorithms are then modularized according to the model. The executable codes of these algorithms are labeled by virtual functions and stored in the flash chips of the embedded CPU to form the protocol stack. According to the configuration commands used to initialize the RTU communication system, it is possible to achieve dynamic assembly and loading of the various RTU application-layer communication protocols and to complete efficient transport of sensor data from the RTU to the central station, while the data acquisition protocols of the sensors and the various external communication terminals remain unchanged.
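
    A conceptual sketch of such a software-defined protocol stack is given below (hypothetical protocol names and frame fields; not the actual RTU firmware): application-layer handlers are modularized behind a common model and selected dynamically from a configuration command:

    ```python
    # Conceptual sketch with hypothetical names: protocol modules register
    # against a common frame model and are assembled from a config command.
    PROTOCOLS = {}

    def protocol(name):
        """Register a protocol module under a name, like a stored stack entry."""
        def register(cls):
            PROTOCOLS[name] = cls
            return cls
        return register

    class CommonFrame:
        """Uniform in-memory representation shared by all protocol modules."""
        def __init__(self, station_id, sensor, value):
            self.station_id, self.sensor, self.value = station_id, sensor, value

    @protocol("csv-v1")
    class CsvProtocol:
        def encode(self, f):
            return f"{f.station_id},{f.sensor},{f.value}".encode()

    @protocol("kv-v2")
    class KeyValueProtocol:
        def encode(self, f):
            return f"id={f.station_id};s={f.sensor};v={f.value}".encode()

    def load_protocol(config_command):
        """Dynamically assemble the handler named in the configuration command."""
        return PROTOCOLS[config_command]()

    frame = CommonFrame("RTU-07", "water_level", 3.42)
    print(load_protocol("kv-v2").encode(frame))  # b'id=RTU-07;s=water_level;v=3.42'
    ```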

  11. JDFTx: Software for joint density-functional theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sundararaman, Ravishankar; Letchworth-Weaver, Kendra; Schwarz, Kathleen A.

    Density-functional theory (DFT) has revolutionized computational prediction of atomic-scale properties from first principles in physics, chemistry and materials science. Continuing development of new methods is necessary for accurate predictions of new classes of materials and properties, and for connecting to nano- and mesoscale properties using coarse-grained theories. JDFTx is a fully-featured open-source electronic DFT software designed specifically to facilitate rapid development of new theories, models and algorithms. Using an algebraic formulation as an abstraction layer, compact C++11 code automatically performs well on diverse hardware including GPUs (Graphics Processing Units). This code hosts the development of joint density-functional theory (JDFT) that combines electronic DFT with classical DFT and continuum models of liquids for first-principles calculations of solvated and electrochemical systems. In addition, the modular nature of the code makes it easy to extend and interface with, facilitating the development of multi-scale toolkits that connect to ab initio calculations, e.g. photo-excited carrier dynamics combining electron and phonon calculations with electromagnetic simulations.

  12. Progressive Learning of Topic Modeling Parameters: A Visual Analytics Framework.

    PubMed

    El-Assady, Mennatallah; Sevastjanova, Rita; Sperrle, Fabian; Keim, Daniel; Collins, Christopher

    2018-01-01

    Topic modeling algorithms are widely used to analyze the thematic composition of text corpora but remain difficult to interpret and adjust. Addressing these limitations, we present a modular visual analytics framework, tackling the understandability and adaptability of topic models through a user-driven reinforcement learning process which does not require a deep understanding of the underlying topic modeling algorithms. Given a document corpus, our approach initializes two algorithm configurations based on a parameter space analysis that enhances document separability. We abstract the model complexity in an interactive visual workspace for exploring the automatic matching results of two models, investigating topic summaries, analyzing parameter distributions, and reviewing documents. The main contribution of our work is an iterative decision-making technique in which users provide a document-based relevance feedback that allows the framework to converge to a user-endorsed topic distribution. We also report feedback from a two-stage study which shows that our technique results in topic model quality improvements on two independent measures.

  13. Automatic image database generation from CAD for 3D object recognition

    NASA Astrophysics Data System (ADS)

    Sardana, Harish K.; Daemi, Mohammad F.; Ibrahim, Mohammad K.

    1993-06-01

    The development and evaluation of multiple-view 3-D object recognition systems is based on a large set of model images. Due to the various advantages of using CAD, it is becoming more and more practical to use existing CAD data in computer vision systems. Current PC-level CAD systems are capable of providing physical image modelling and rendering involving positional variations in cameras, light sources, etc. We have formulated a modular scheme for automatic generation of various aspects (views) of the objects in a model-based 3-D object recognition system. These views are generated at desired orientations on the unit Gaussian sphere. With a suitable network file sharing system (NFS), the images can be stored directly in a database located on a file server. This paper presents the image modelling solutions using CAD in relation to the multiple-view approach. Our modular scheme for data conversion and automatic image database storage for such a system is discussed. We have used this approach in 3-D polyhedron recognition. An overview of the results, advantages and limitations of using CAD data, and conclusions from using such a scheme are also presented.
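
    One common way to place such views at roughly uniform orientations on the unit Gaussian sphere is a Fibonacci lattice; the short sketch below is illustrative only and is not the authors' CAD pipeline:

    ```python
    # Illustrative sketch: roughly uniform camera viewpoints on the unit sphere,
    # from which each model aspect (view) could be rendered and stored.
    import numpy as np

    def fibonacci_sphere(n):
        """Return n approximately evenly spaced unit vectors (viewing directions)."""
        i = np.arange(n)
        golden = (1 + 5 ** 0.5) / 2
        z = 1 - 2 * (i + 0.5) / n              # latitude spread
        theta = 2 * np.pi * i / golden         # longitude by golden angle
        r = np.sqrt(1 - z ** 2)
        return np.stack([r * np.cos(theta), r * np.sin(theta), z], axis=1)

    views = fibonacci_sphere(64)
    print(views.shape, np.allclose(np.linalg.norm(views, axis=1), 1.0))  # (64, 3) True
    ```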

  14. A simplified computational memory model from information processing.

    PubMed

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-11-23

    This paper is intended to propose a computational model for memory from the view of information processing. The model, called simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network by abstracting memory function and simulating memory information processing. At first meta-memory is defined to express the neuron or brain cortices based on the biology and graph theories, and we develop an intra-modular network with the modeling algorithm by mapping the node and edge, and then the bi-modular network is delineated with intra-modular and inter-modular. At last a polynomial retrieval algorithm is introduced. In this paper we simulate the memory phenomena and functions of memorization and strengthening by information processing algorithms. The theoretical analysis and the simulation results show that the model is in accordance with the memory phenomena from information processing view.

  15. Design strategies to address the effect of hydrophobic epitope on stability and in vitro assembly of modular virus‐like particle

    PubMed Central

    Tekewe, Alemu; Connors, Natalie K.; Middelberg, Anton P. J.

    2016-01-01

    Virus‐like particles (VLPs) and capsomere subunits have shown promising potential as safe and effective vaccine candidates. They can serve as platforms for the display of foreign epitopes on their surfaces in a modular architecture. Depending on the physicochemical properties of the antigenic modules, modularization may affect the expression, solubility and stability of capsomeres, and VLP assembly. In this study, three module designs of a rotavirus hydrophobic peptide (RV10) were synthesized using synthetic biology. Among the three synthetic modules, modularization of the murine polyomavirus VP1 with a single copy of RV10 flanked by long linkers and charged residues resulted in the expression of stable modular capsomeres. Further employing the approach of module titration of RV10 modules on each capsomere, via Escherichia coli co‐expression of unmodified VP1 and modular VP1‐RV10, successfully translated purified modular capsomeres into modular VLPs when assembled in vitro. Our results demonstrate that tailoring the physicochemical properties of modules to enhance modular capsomere stability is achievable through synthetic biology designs. Combined with a module titration strategy to avoid steric hindrance to intercapsomere interactions, this allows bioprocessing of bacterially produced, in vitro assembled modular VLPs. PMID:27222486

  16. Teaching of Writing: Abstracts of Doctoral Dissertations Published in "Dissertation Abstracts International," January through June 1979 (Vol. 39 Nos. 7 through 12).

    ERIC Educational Resources Information Center

    ERIC Clearinghouse on Reading and Communication Skills, Urbana, IL.

    This collection of abstracts is part of a continuing series providing information on recent doctoral dissertations. The 19 titles deal with the following topics: the dynamics of creative expression, modular scheduling and student success in freshman composition, growth in writing ability through immersion in a university discipline, massed and…

  17. Master/Programmable-Slave Computer

    NASA Technical Reports Server (NTRS)

    Smaistrla, David; Hall, William A.

    1990-01-01

    Unique modular computer features compactness, low power, mass storage of data, multiprocessing, and choice of various input/output modes. Master processor communicates with user via usual keyboard and video display terminal. Coordinates operations of as many as 24 slave processors, each dedicated to different experiment. Each slave circuit card includes slave microprocessor and assortment of input/output circuits for communication with external equipment, with master processor, and with other slave processors. Adaptable to industrial process control with selectable degrees of automatic control, automatic and/or manual monitoring, and manual intervention.

  18. Command, Control, Communications, Computers and Intelligence Electronic Warfare (C4IEW) Project Book, Fiscal Year 1994. (Non-FOUO Version)

    DTIC Science & Technology

    1994-04-01

    ...TSW-7A, Air Traffic Control Central (ATCC); AN/TTC-41(V), Central Office, Telephone, Automatic; Missile Countermeasure Device (MCD)... a Handheld Terminal Unit (HTU), Portable Computer Unit (PCU), Transportable Computer Unit (TCU), and compatible NOI peripheral devices. All but the... AN/TTC-39 is a mobile, automatic, modular electronic circuit switch under processor control with integral...

  19. Command, Control, Communications, Computers, Intelligence Electronic Warfare (C4IEW) and Sensors. Project Book. Fiscal Year 1996

    DTIC Science & Technology

    1996-01-01

    ...Aerial Scout Sensors Integration (ASSI); Bistatic Radar for Weapons Location (BRWL) ATD; Close-In Man-Portable Mine Detector (CIMMD)... DESCRIPTION: The AN/TTC-39A Circuit Switch is a 744-line mobile, automatic... SYNOPSIS: AN/TTC-39 is a mobile, automatic, modular electronic circuit switch under processor control with integral COMSEC and multiplex equipment. AN/TTC...

  20. Modular Expression Language for Ordinary Differential Equation Editing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blake, Robert C.

    MELODEE is a system for describing systems of initial-value-problem ordinary differential equations, and a compiler for the language that produces optimized code to integrate the differential equations. Features include rational polynomial approximation for expensive functions and automatic differentiation for symbolic Jacobians.
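
    The symbolic-Jacobian idea can be illustrated independently of MELODEE's own syntax (which is not reproduced here); the sketch below uses SymPy on an assumed two-state system to show the kind of Jacobian such a compiler can emit:

    ```python
    # Not MELODEE itself: a small sketch of a symbolic Jacobian for an assumed
    # two-state ODE right-hand side, produced with SymPy.
    import sympy as sp

    v, w, a, b = sp.symbols("v w a b")
    rhs = sp.Matrix([v - v**3 / 3 - w,      # dv/dt
                     a * (v + b - w)])      # dw/dt (FitzHugh-Nagumo-like form)
    jac = rhs.jacobian([v, w])
    print(jac)   # Matrix([[1 - v**2, -1], [a, -a]])
    ```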

  1. Automatic Abstraction in Planning

    NASA Technical Reports Server (NTRS)

    Christensen, J.

    1991-01-01

    Traditionally, abstraction in planning has been accomplished by either state abstraction or operator abstraction, neither of which has been fully automatic. We present a new method, predicate relaxation, for automatically performing state abstraction. PABLO, a nonlinear hierarchical planner, implements predicate relaxation. Theoretical, as well as empirical results are presented which demonstrate the potential advantages of using predicate relaxation in planning. We also present a new definition of hierarchical operators that allows us to guarantee a limited form of completeness. This new definition is shown to be, in some ways, more flexible than previous definitions of hierarchical operators. Finally, a Classical Truth Criterion is presented that is proven to be sound and complete for a planning formalism that is general enough to include most classical planning formalisms that are based on the STRIPS assumption.

  2. (abstract) An Ada Language Modular Telerobot Task Execution System

    NASA Technical Reports Server (NTRS)

    Backes, Paul; Long, Mark; Steele, Robert

    1993-01-01

    A telerobotic task execution system is described which has been developed for space flight applications. The Modular Telerobot Task Execution System (MOTES) provides the remote site task execution capability in a local-remote telerobotic system. The system provides supervised autonomous control, shared control, and teleoperation for a redundant manipulator. The system is capable of nominal task execution as well as monitoring and reflex motion.

  3. General software design for multisensor data fusion

    NASA Astrophysics Data System (ADS)

    Zhang, Junliang; Zhao, Yuming

    1999-03-01

    In this paper a general method of software design for multisensor data fusion is discussed in detail, which adopts object-oriented technology under the UNIX operating system. The software for multisensor data fusion is divided into six functional modules: data collection, database management, GIS, target display and alarming, data simulation, etc. Furthermore, the primary function, the components and some realization methods of each module are given, and the interfaces among these functional modules are discussed. Data exchange among the functional modules is performed by interprocess communication (IPC), including message queues, semaphores and shared memory. Thus, each functional module executes independently, which reduces the dependence among modules and helps software programming and testing. The software for multisensor data fusion is designed as a hierarchical structure using the inheritance character of classes. Each functional module is abstracted and encapsulated through a class structure, which avoids software redundancy and enhances readability.
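
    A conceptual analogue of the message-queue decoupling described above is sketched below (Python multiprocessing rather than raw UNIX System V IPC; module names are invented): two functional modules exchange data only through a queue, so each runs independently of the other's internals:

    ```python
    # Conceptual analogue only: two modules decoupled by a message queue.
    from multiprocessing import Process, Queue

    def collection_module(q):
        """Stands in for the data-collection module: publishes sensor reports."""
        for report in [{"sensor": 1, "range_m": 120.0}, {"sensor": 2, "range_m": 118.5}]:
            q.put(report)
        q.put(None)  # end-of-stream marker

    def fusion_module(q):
        """Stands in for a consumer module: fuses whatever arrives on the queue."""
        ranges = []
        while (msg := q.get()) is not None:
            ranges.append(msg["range_m"])
        print("fused range estimate:", sum(ranges) / len(ranges))

    if __name__ == "__main__":
        q = Queue()
        producer = Process(target=collection_module, args=(q,))
        consumer = Process(target=fusion_module, args=(q,))
        producer.start(); consumer.start()
        producer.join(); consumer.join()
    ```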

  4. A simplified computational memory model from information processing

    PubMed Central

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-01-01

    This paper is intended to propose a computational model for memory from the view of information processing. The model, called simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network by abstracting memory function and simulating memory information processing. At first meta-memory is defined to express the neuron or brain cortices based on the biology and graph theories, and we develop an intra-modular network with the modeling algorithm by mapping the node and edge, and then the bi-modular network is delineated with intra-modular and inter-modular. At last a polynomial retrieval algorithm is introduced. In this paper we simulate the memory phenomena and functions of memorization and strengthening by information processing algorithms. The theoretical analysis and the simulation results show that the model is in accordance with the memory phenomena from information processing view. PMID:27876847

  5. ATLAS (Automatic Tool for Local Assembly Structures) - A Comprehensive Infrastructure for Assembly, Annotation, and Genomic Binning of Metagenomic and Metatranscriptomic Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Richard A.; Brown, Joseph M.; Colby, Sean M.

    ATLAS (Automatic Tool for Local Assembly Structures) is a comprehensive multiomics data analysis pipeline that is massively parallel and scalable. ATLAS contains a modular analysis pipeline for assembly, annotation, quantification and genome binning of metagenomics and metatranscriptomics data, and a framework for reference metaproteomic database construction. ATLAS transforms raw sequence data into functional and taxonomic data at the microbial population level and provides genome-centric resolution through genome binning. ATLAS provides robust taxonomy based on majority voting of protein-coding open reading frames rolled up at the contig level using modified lowest common ancestor (LCA) analysis. ATLAS is user-friendly, easy to install through bioconda, maintained as open source on GitHub, and implemented in Snakemake for modular, customizable workflows.
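
    A toy illustration of the majority-voting roll-up (not the ATLAS implementation; the lineages below are made up) is sketched here: per-ORF taxonomy calls are combined into a contig-level label, stopping at the last rank that still wins a clear majority:

    ```python
    # Toy illustration: contig-level taxonomy by majority voting over ORF calls.
    from collections import Counter

    def contig_taxonomy(orf_lineages, threshold=0.5):
        """Each lineage is a tuple from general to specific, e.g. (domain, ..., genus)."""
        depth = max(len(l) for l in orf_lineages)
        label = ()
        for rank in range(depth):
            votes = Counter(l[rank] for l in orf_lineages if len(l) > rank)
            taxon, count = votes.most_common(1)[0]
            if count / len(orf_lineages) <= threshold:
                break                      # no majority at this rank: stop here
            label += (taxon,)
            orf_lineages = [l for l in orf_lineages if len(l) > rank and l[rank] == taxon]
        return label

    orfs = [("Bacteria", "Proteobacteria", "Escherichia"),
            ("Bacteria", "Proteobacteria", "Salmonella"),
            ("Bacteria", "Firmicutes")]
    print(contig_taxonomy(orfs))  # ('Bacteria', 'Proteobacteria') - no genus majority
    ```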

  6. Extending Automatic Parallelization to Optimize High-Level Abstractions for Multicore

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, C; Quinlan, D J; Willcock, J J

    2008-12-12

    Automatic introduction of OpenMP for sequential applications has attracted significant attention recently because of the proliferation of multicore processors and the simplicity of using OpenMP to express parallelism for shared-memory systems. However, most previous research has only focused on C and Fortran applications operating on primitive data types. C++ applications using high-level abstractions, such as STL containers and complex user-defined types, are largely ignored due to the lack of research compilers that are readily able to recognize high-level object-oriented abstractions and leverage their associated semantics. In this paper, we automatically parallelize C++ applications using ROSE, a multiple-language source-to-source compiler infrastructure which preserves the high-level abstractions and gives us access to their semantics. Several representative parallelization candidate kernels are used to explore semantic-aware parallelization strategies for high-level abstractions, combined with extended compiler analyses. Those kernels include an array-based computation loop, a loop with task-level parallelism, and a domain-specific tree traversal. Our work extends the applicability of automatic parallelization to modern applications using high-level abstractions and exposes more opportunities to take advantage of multicore processors.

  7. Single Event Effects mitigation with TMRG tool

    NASA Astrophysics Data System (ADS)

    Kulis, S.

    2017-01-01

    Single Event Effects (SEE) are a major concern for integrated circuits exposed to radiation. Several techniques have been proposed to protect circuits against radiation-induced upsets; among them, the Triple Modular Redundancy (TMR) technique is one of the most popular. The purpose of the Triple Modular Redundancy Generator (TMRG) tool is to automate the process of triplicating digital circuits, freeing the designer from introducing the TMR code manually at the implementation stage. It helps to ensure that triplicated logic is maintained through the design process. Finally, the tool streamlines the process of introducing SEEs into gate-level simulations for final verification.
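
    The voting logic at the heart of TMR can be illustrated behaviourally (TMRG itself operates on HDL source, which is not reproduced here); the sketch below shows a bitwise 2-of-3 majority vote masking a single upset register copy:

    ```python
    # Behavioural illustration only: the majority voter used by triple modular
    # redundancy, masking an upset in one of the three register copies.
    def majority(a, b, c):
        """Bitwise 2-of-3 vote over three register copies."""
        return (a & b) | (a & c) | (b & c)

    golden = 0b1011_0110
    upset = golden ^ 0b0000_0100           # one copy hit by a single-event upset
    assert majority(golden, golden, upset) == golden
    print(bin(majority(golden, upset, golden)))  # 0b10110110 - the flip is masked
    ```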

  8. A Software Architecture for Adaptive Modular Sensing Systems

    PubMed Central

    Lyle, Andrew C.; Naish, Michael D.

    2010-01-01

    By combining a number of simple transducer modules, an arbitrarily complex sensing system may be produced to accommodate a wide range of applications. This work outlines a novel software architecture and knowledge representation scheme that has been developed to support this type of flexible and reconfigurable modular sensing system. Template algorithms are used to embed intelligence within each module. As modules are added or removed, the composite sensor is able to automatically determine its overall geometry and assume an appropriate collective identity. A virtual machine-based middleware layer runs on top of a real-time operating system with a pre-emptive kernel, enabling platform-independent template algorithms to be written once and run on any module, irrespective of its underlying hardware architecture. Applications that may benefit from easily reconfigurable modular sensing systems include flexible inspection, mobile robotics, surveillance, and space exploration. PMID:22163614

  9. A software architecture for adaptive modular sensing systems.

    PubMed

    Lyle, Andrew C; Naish, Michael D

    2010-01-01

    By combining a number of simple transducer modules, an arbitrarily complex sensing system may be produced to accommodate a wide range of applications. This work outlines a novel software architecture and knowledge representation scheme that has been developed to support this type of flexible and reconfigurable modular sensing system. Template algorithms are used to embed intelligence within each module. As modules are added or removed, the composite sensor is able to automatically determine its overall geometry and assume an appropriate collective identity. A virtual machine-based middleware layer runs on top of a real-time operating system with a pre-emptive kernel, enabling platform-independent template algorithms to be written once and run on any module, irrespective of its underlying hardware architecture. Applications that may benefit from easily reconfigurable modular sensing systems include flexible inspection, mobile robotics, surveillance, and space exploration.

  10. The pandemonium system of reflective agents.

    PubMed

    Smieja, F

    1996-01-01

    The Pandemonium system of reflective MINOS agents solves problems by automatic dynamic modularization of the input space. The agents contain feedforward neural networks which adapt using the backpropagation algorithm. We demonstrate the performance of Pandemonium on various categories of problems. These include learning continuous functions with discontinuities, separating two spirals, learning the parity function, and optical character recognition. It is shown how strongly the advantages gained from using a modularization technique depend on the nature of the problem. The superiority of the Pandemonium method over a single net on the first two test categories is contrasted with its limited advantages for the second two categories. In the first case the system converges quicker with modularization and is seen to lead to simpler solutions. For the second case the problem is not significantly simplified through flat decomposition of the input space, although convergence is still quicker.

  11. Minimal-resource computer program for automatic generation of ocean wave ray or crest diagrams in shoaling waters

    NASA Technical Reports Server (NTRS)

    Poole, L. R.; Lecroy, S. R.; Morris, W. D.

    1977-01-01

    A computer program for studying linear ocean wave refraction is described. The program features random-access modular bathymetry data storage. Three bottom topography approximation techniques are available in the program which provide varying degrees of bathymetry data smoothing. Refraction diagrams are generated automatically and can be displayed graphically in three forms: Ray patterns with specified uniform deepwater ray density, ray patterns with controlled nearshore ray density, or crest patterns constructed by using a cubic polynomial to approximate crest segments between adjacent rays.

  12. Experimental Applications of Automatic Test Markup Language (ATML)

    NASA Technical Reports Server (NTRS)

    Lansdowne, Chatwin A.; McCartney, Patrick; Gorringe, Chris

    2012-01-01

    The authors describe challenging use-cases for Automatic Test Markup Language (ATML), and evaluate solutions. The first case uses ATML Test Results to deliver active features to support test procedure development and test flow, and bridging mixed software development environments. The second case examines adding attributes to Systems Modelling Language (SysML) to create a linkage for deriving information from a model to fill in an ATML document set. Both cases are outside the original concept of operations for ATML but are typical when integrating large heterogeneous systems with modular contributions from multiple disciplines.

  13. Solaris: Orbital station: Automatic laboratory for outer space rendezvous and operations

    NASA Technical Reports Server (NTRS)

    Runavot, J. J.

    1981-01-01

    The preliminary design for a modular orbital space station (unmanned) is outlined. The three main components are a support module, an experiment module, and an orbital transport vehicle. The major types of missions (assembly, materials processing, and Earth observation) that could be performed are discussed.

  14. Self-Assembly of a Modular Polypeptide Based on Blocks of Silk-Mimetic and Elastin-Mimetic Sequences

    DTIC Science & Technology

    2002-04-01

    Self-Assembly of a Modular Polypeptide Based on Blocks of Silk-Mimetic and Elastin-Mimetic Sequences. Chrystelle S. Cazalis and Vincent P. Conticello, Department of Chemistry, Emory University, Atlanta, GA 30322. ABSTRACT: Spider dragline silk fiber displays...

  15. Properties of Artifact Representations for Evolutionary Design

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2004-01-01

    To achieve evolutionary design systems that scale to the levels achieved by man-made artifacts we can look to their characteristics of modularity, hierarchy and regularity to guide us. For this we focus on design representations, since they strongly determine the ability of evolutionary design systems to evolve artifacts with these characteristics. We identify three properties of design representations - combination, control-flow and abstraction - and discuss how they relate to hierarchy, modularity and regularity.

  16. Standardized strapdown inertial component modularity study, volume 2

    NASA Technical Reports Server (NTRS)

    Feldman, J.

    1974-01-01

    To obtain cost effective strapdown navigation, guidance and stabilization systems to meet anticipated future needs a standardized modularized strapdown system concept is proposed. Three performance classes, high, medium and low, are suggested to meet the range of applications. Candidate inertial instruments are selected and analyzed for interface compatibility. Electronic packaging and processing, materials and thermal considerations applying to the three classes are discussed and recommendations advanced. Opportunities for automatic fault detection and redundancy are presented. The smallest gyro and accelerometer modules are projected as requiring a volume of 26 cubic inches and 23.6 cubic inches, respectively. Corresponding power dissipation is projected as 5 watts, and 2.6 watts respectively.

  17. Modular Toolkit for Data Processing (MDP): A Python Data Processing Framework.

    PubMed

    Zito, Tiziano; Wilbert, Niko; Wiskott, Laurenz; Berkes, Pietro

    2008-01-01

    Modular toolkit for Data Processing (MDP) is a data processing framework written in Python. From the user's perspective, MDP is a collection of supervised and unsupervised learning algorithms and other data processing units that can be combined into data processing sequences and more complex feed-forward network architectures. Computations are performed efficiently in terms of speed and memory requirements. From the scientific developer's perspective, MDP is a modular framework, which can easily be expanded. The implementation of new algorithms is easy and intuitive. The new implemented units are then automatically integrated with the rest of the library. MDP has been written in the context of theoretical research in neuroscience, but it has been designed to be helpful in any context where trainable data processing algorithms are used. Its simplicity on the user's side, the variety of readily available algorithms, and the reusability of the implemented units make it also a useful educational tool.
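
    A minimal usage sketch along the lines described is given below; it is hedged in that exact node names and call signatures may differ between MDP versions:

    ```python
    # Minimal sketch (API details may vary by MDP version): chaining processing
    # units into a trainable feed-forward flow, as the abstract describes.
    import numpy as np
    import mdp  # Modular toolkit for Data Processing

    x = np.random.random((1000, 20))                       # 1000 observations, 20 dims
    flow = mdp.Flow([mdp.nodes.PCANode(output_dim=5),      # reduce to 5 components
                     mdp.nodes.PolynomialExpansionNode(2)])  # then expand quadratically
    flow.train(x)                                          # nodes trained in sequence
    y = flow(x)                                            # execute the whole flow
    print(y.shape)
    ```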

  18. An interactive modular design for computerized photometry in spectrochemical analysis

    NASA Technical Reports Server (NTRS)

    Bair, V. L.

    1980-01-01

    A general functional description of totally automatic photometry of emission spectra is not available for an operating environment in which the sample compositions and analysis procedures are low-volume and non-routine. The advantages of using an interactive approach to computer control in such an operating environment are demonstrated. This approach includes modular subroutines selected at multiple-option, menu-style decision points. This style of programming is used to trace elemental determinations, including the automated reading of spectrographic plates produced by a 3.4 m Ebert mount spectrograph using a dc-arc in an argon atmosphere. The simplified control logic and modular subroutine approach facilitates innovative research and program development, yet is easily adapted to routine tasks. Operator confidence and control are increased by the built-in options including degree of automation, amount of intermediate data printed out, amount of user prompting, and multidirectional decision points.

  19. Flight Testing ACAT/FRRP: Automatic Collision Avoidance Technology/Fighter Risk Reduction Project

    NASA Technical Reports Server (NTRS)

    Skoog, Mark A.

    2009-01-01

    This slide presentation reviews the flight testing work of the Automatic Collision Avoidance Technology/Fighter Risk Reduction Project (ACAT/FRRP). The goal of this project is to develop a common modular architecture for all aircraft and to enable the transition of technology from research to production as soon as possible, in order to begin reducing the rate of mishaps. The Automatic Ground Collision Avoidance System (GCAS) is designed to prevent collision with the ground: avionics project the future trajectory over digital terrain and request an evasion maneuver at the last instant, and the flight controls are capable of automatically performing the recovery. The collision avoidance approach and a description of the flight test are included in the presentation.

  20. What does the modularity of morals have to do with ethics? Four moral sprouts plus or minus a few.

    PubMed

    Flanagan, Owen; Williams, Robert Anthony

    2010-07-01

    Flanagan (1991) was the first contemporary philosopher to suggest that a modularity of morals hypothesis (MMH) was worth consideration by cognitive science. There is now a serious empirically informed proposal that moral competence is best explained in terms of moral modules-evolutionarily ancient, fast-acting, automatic reactions to particular sociomoral experiences (Haidt & Joseph, 2007). MMH fleshes out an idea nascent in Aristotle, Mencius, and Darwin. We discuss the evidence for MMH, specifically an ancient version, "Mencian Moral Modularity," which claims four innate modules, and "Social Intuitionist Modularity," which claims five innate modules. We compare these two moral modularity models, discuss whether the postulated modules are best conceived as perceptual/Fodorian or emotional/Darwinian, and consider whether assuming MMH true has any normative ethical consequences whatsoever. The discussion of MMH reconnects cognitive science with normative ethics in a way that involves the reassertion of the "is-ought" problem. We explain in a new way what this problem is and why it would not yield. The reason does not involve the logic of "ought," but rather the plasticity of human nature and the realistic options to "grow" and "do" human nature in multifarious legitimate ways. Copyright © 2010 Cognitive Science Society, Inc.

  1. The Role of Automatic Indexing in Access Control: A Modular View

    ERIC Educational Resources Information Center

    Hartson, H. Rex

    1974-01-01

    A model is presented which relates the access control and indexing functions. The model is based on concept protection, which allows a practically unbounded number of levels (subsets) of protection without requiring a fixed hierarchy among the levels. This protection is offered independently for each of the user operations allowed. (Author)

  2. An object-oriented software approach for a distributed human tracking motion system

    NASA Astrophysics Data System (ADS)

    Micucci, Daniela L.

    2003-06-01

    Tracking is a composite job involving the co-operation of autonomous activities which exploit a complex information model and rely on a distributed architecture. Both information and activities must be classified and related in several dimensions: abstraction levels (what is modelled and how information is processed); topology (where the modelled entities are); time (when entities exist); strategy (why something happens); responsibilities (who is in charge of processing the information). A proper Object-Oriented analysis and design approach leads to a modular architecture where information about conceptual entities is modelled at each abstraction level via classes and intra-level associations, whereas inter-level associations between classes model the abstraction process. Both information and computation are partitioned according to level-specific topological models. They are also placed in a temporal framework modelled by suitable abstractions. Domain-specific strategies control the execution of the computations. Computational components perform both intra-level processing and inter-level information conversion. The paper overviews the phases of the analysis and design process, presents major concepts at each abstraction level, and shows how the resulting design turns into a modular, flexible and adaptive architecture. Finally, the paper sketches how the conceptual architecture can be deployed into a concrete distributed architecture by relying on an experimental framework.

  3. Assume-Guarantee Abstraction Refinement Meets Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Bogomolov, Sergiy; Frehse, Goran; Greitschus, Marius; Grosu, Radu; Pasareanu, Corina S.; Podelski, Andreas; Strump, Thomas

    2014-01-01

    Compositional verification techniques in the assume-guarantee style have been successfully applied to transition systems to efficiently reduce the search space by leveraging the compositional nature of the systems under consideration. We adapt these techniques to the domain of hybrid systems with affine dynamics. To build assumptions we introduce an abstraction based on location merging. We integrate the assume-guarantee style analysis with automatic abstraction refinement. We have implemented our approach in the symbolic hybrid model checker SpaceEx. The evaluation shows its practical potential. To the best of our knowledge, this is the first work combining assume-guarantee reasoning with automatic abstraction refinement in the context of hybrid automata.
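
    A hypothetical sketch of the location-merging idea mentioned above: locations of a hybrid automaton are grouped into blocks and transitions are lifted to the merged locations. The data structures and names are illustrative only, not the SpaceEx implementation.

        # Hypothetical sketch of location merging: group concrete locations into
        # blocks and lift transitions to the abstract (merged) automaton.
        def merge_locations(transitions, partition):
            """
            transitions: set of (source_location, target_location) pairs
            partition:   dict mapping each location to its merged block name
            returns:     transitions of the abstract (merged) automaton
            """
            abstract = set()
            for src, dst in transitions:
                a, b = partition[src], partition[dst]
                if a != b:                      # self-loops inside a block disappear
                    abstract.add((a, b))
            return abstract

        concrete = {("l1", "l2"), ("l2", "l3"), ("l3", "l4"), ("l4", "l1")}
        blocks = {"l1": "A", "l2": "A", "l3": "B", "l4": "B"}
        print(merge_locations(concrete, blocks))   # {('A', 'B'), ('B', 'A')}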

  4. Automatic safety belt systems owner usage and attitudes in GM Chevettes and VW Rabbits

    DOT National Transportation Integrated Search

    1980-05-01

    Author's abstract: The study was designed to: (1) evaluate the effectiveness of automatic restraint systems in increasing belt usage, and (2) determine owner attitudes toward the system. Information gathered from owners of vehicles with automatic sys...

  5. SA-SOM algorithm for detecting communities in complex networks

    NASA Astrophysics Data System (ADS)

    Chen, Luogeng; Wang, Yanran; Huang, Xiaoming; Hu, Mengyu; Hu, Fang

    2017-10-01

    Currently, community detection is a hot topic. Based on the self-organizing map (SOM) algorithm, this paper introduces the idea of self-adaptation (SA), by which the number of communities can be identified automatically, and proposes a novel algorithm, SA-SOM, for detecting communities in complex networks. Several representative real-world networks and a set of computer-generated networks produced by the LFR benchmark are utilized to verify the accuracy and the efficiency of this algorithm. The experimental findings demonstrate that this algorithm can identify the communities automatically, accurately and efficiently. Furthermore, this algorithm can also acquire higher values of modularity, NMI and density than the SOM algorithm does.
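
    For reference, a self-contained sketch of the standard Newman-Girvan modularity score that the abstract reports; the toy graph and community labels below are invented for illustration.

        # Self-contained sketch of the Newman-Girvan modularity Q:
        #   Q = sum_c [ e_c / m  -  (d_c / 2m)^2 ]
        # where e_c = edges inside community c, d_c = total degree of c, m = |edges|.
        def modularity(edges, community):
            """edges: list of undirected (u, v) pairs; community: dict node -> label."""
            m = len(edges)
            degree = {}
            for u, v in edges:
                degree[u] = degree.get(u, 0) + 1
                degree[v] = degree.get(v, 0) + 1
            q = sum(1.0 for u, v in edges if community[u] == community[v]) / m
            # subtract the expected in-community fraction of a random graph
            for label in set(community.values()):
                d = sum(deg for node, deg in degree.items() if community[node] == label)
                q -= (d / (2.0 * m)) ** 2
            return q

        edges = [("a", "b"), ("b", "c"), ("a", "c"), ("c", "d"),
                 ("d", "e"), ("e", "f"), ("d", "f")]
        labels = {"a": 1, "b": 1, "c": 1, "d": 2, "e": 2, "f": 2}
        print(round(modularity(edges, labels), 3))   # 0.357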

  6. Electronic control circuits: A compilation

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A compilation of technical R and D information on circuits and modular subassemblies is presented as a part of a technology utilization program. Fundamental design principles and applications are given. Electronic control circuits discussed include: anti-noise circuit; ground protection device for bioinstrumentation; temperature compensation for operational amplifiers; hybrid gatling capacitor; automatic signal range control; integrated clock-switching control; and precision voltage tolerance detector.

  7. Using CamiTK for rapid prototyping of interactive computer assisted medical intervention applications.

    PubMed

    Promayon, Emmanuel; Fouard, Céline; Bailet, Mathieu; Deram, Aurélien; Fiard, Gaëlle; Hungr, Nikolai; Luboz, Vincent; Payan, Yohan; Sarrazin, Johan; Saubat, Nicolas; Selmi, Sonia Yuki; Voros, Sandrine; Cinquin, Philippe; Troccaz, Jocelyne

    2013-01-01

    Computer Assisted Medical Intervention (CAMI hereafter) is a complex multi-disciplinary field. CAMI research requires the collaboration of experts in several fields as diverse as medicine, computer science, mathematics, instrumentation, signal processing, mechanics, modeling, automatics, optics, etc. CamiTK is a modular framework that helps researchers and clinicians collaborate in order to prototype CAMI applications by regrouping the knowledge and expertise from each discipline. It is an open-source, cross-platform, generic and modular tool written in C++ which can handle medical images, surgical navigation, biomedical simulations and robot control. This paper presents the Computer Assisted Medical Intervention ToolKit (CamiTK) and how it is used in various applications in our research team.

  8. Modular Hamiltonians on the null plane and the Markov property of the vacuum state

    NASA Astrophysics Data System (ADS)

    Casini, Horacio; Testé, Eduardo; Torroba, Gonzalo

    2017-09-01

    We compute the modular Hamiltonians of regions having the future horizon lying on a null plane. For a CFT this is equivalent to regions with a boundary of arbitrary shape lying on the null cone. These Hamiltonians have a local expression on the horizon formed by integrals of the stress tensor. We prove this result in two different ways, and show that the modular Hamiltonians of these regions form an infinite dimensional Lie algebra. The corresponding group of unitary transformations moves the fields on the null surface locally along the null generators with arbitrary null line dependent velocities, but acts non-locally outside the null plane. We regain this result in greater generality using more abstract tools from algebraic quantum field theory. Finally, we show that modular Hamiltonians on the null surface satisfy a Markov property that leads to the saturation of the strong subadditivity inequality for the entropies and to the strong super-additivity of the relative entropy.
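
    For orientation, the local expression referred to above takes, schematically, the following form for a region bounded by a cut x+ = gamma(x_perp) of the null plane; conventions and normalizations vary between papers, so this should be read as a sketch of the reported structure rather than an exact quotation:

        K = 2\pi \int d^{d-2}x_\perp \int_{\gamma(x_\perp)}^{\infty} dx^{+}\,
            \bigl(x^{+} - \gamma(x_\perp)\bigr)\, T_{++}(x^{+}, x_\perp)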

  9. A Modular Hierarchical Approach to 3D Electron Microscopy Image Segmentation

    PubMed Central

    Liu, Ting; Jones, Cory; Seyedhosseini, Mojtaba; Tasdizen, Tolga

    2014-01-01

    The study of neural circuit reconstruction, i.e., connectomics, is a challenging problem in neuroscience. Automated and semi-automated electron microscopy (EM) image analysis can be tremendously helpful for connectomics research. In this paper, we propose a fully automatic approach for intra-section segmentation and inter-section reconstruction of neurons using EM images. A hierarchical merge tree structure is built to represent multiple region hypotheses and supervised classification techniques are used to evaluate their potentials, based on which we resolve the merge tree with consistency constraints to acquire final intra-section segmentation. Then, we use a supervised learning based linking procedure for the inter-section neuron reconstruction. Also, we develop a semi-automatic method that utilizes the intermediate outputs of our automatic algorithm and achieves intra-section segmentation with minimal user intervention. The experimental results show that our automatic method can achieve close-to-human intra-section segmentation accuracy and state-of-the-art inter-section reconstruction accuracy. We also show that our semi-automatic method can further improve the intra-section segmentation accuracy. PMID:24491638

  10. A fully automatic three-step liver segmentation method on LDA-based probability maps for multiple contrast MR images.

    PubMed

    Gloger, Oliver; Kühn, Jens; Stanski, Adam; Völzke, Henry; Puls, Ralf

    2010-07-01

    Automatic 3D liver segmentation in magnetic resonance (MR) data sets has proven to be a very challenging task in the domain of medical image analysis. There exist numerous approaches for automatic 3D liver segmentation on computed tomography data sets that have influenced the segmentation of MR images. In contrast to previous approaches to liver segmentation in MR data sets, we use all available MR channel information of different weightings and formulate liver tissue and position probabilities in a probabilistic framework. We apply multiclass linear discriminant analysis as a fast and efficient dimensionality reduction technique and generate probability maps then used for segmentation. We develop a fully automatic three-step 3D segmentation approach based upon a modified region growing approach and a further threshold technique. Finally, we incorporate characteristic prior knowledge to improve the segmentation results. This novel 3D segmentation approach is modularized and can be applied to normal and fat-accumulated liver tissue properties. Copyright 2010 Elsevier Inc. All rights reserved.
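
    A sketch of the general idea (not the authors' implementation): multiclass LDA over multi-channel MR intensities yields a per-voxel tissue probability map that a region-growing step could then threshold. The training data, channel count, and class labels are synthetic placeholders.

        # Sketch of LDA-based probability maps from multi-channel voxel intensities.
        # Synthetic data and labels; not the paper's pipeline or parameters.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        # hypothetical training voxels: 3 MR weightings per voxel,
        # labels 0=background, 1=liver, 2=other tissue
        rng = np.random.default_rng(0)
        X_train = np.vstack([rng.normal(loc, 0.5, size=(200, 3)) for loc in (0.0, 2.0, 4.0)])
        y_train = np.repeat([0, 1, 2], 200)

        lda = LinearDiscriminantAnalysis()
        lda.fit(X_train, y_train)

        # "image": 16x16 voxels with the same 3 channels, flattened for prediction
        image = rng.normal(2.0, 1.0, size=(16, 16, 3))
        probs = lda.predict_proba(image.reshape(-1, 3))[:, 1]   # P(liver) per voxel
        liver_probability_map = probs.reshape(16, 16)
        print(liver_probability_map.max())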

  11. Business Case Analysis Of Small Modular Reactors (SMR) For DOD Assured Power

    DTIC Science & Technology

    2017-12-01

    ... attack (EMP) represents one of the most devastating forms of attack on the power grid. Because the North American grid is made up of three major ...

  12. Discovering Multimodal Behavior in Ms. Pac-Man through Evolution of Modular Neural Networks.

    PubMed

    Schrum, Jacob; Miikkulainen, Risto

    2016-03-12

    Ms. Pac-Man is a challenging video game in which multiple modes of behavior are required: Ms. Pac-Man must escape ghosts when they are threats and catch them when they are edible, in addition to eating all pills in each level. Past approaches to learning behavior in Ms. Pac-Man have treated the game as a single task to be learned using monolithic policy representations. In contrast, this paper uses a framework called Modular Multi-objective NEAT (MM-NEAT) to evolve modular neural networks. Each module defines a separate behavior. The modules are used at different times according to a policy that can be human-designed (i.e. Multitask) or discovered automatically by evolution. The appropriate number of modules can be fixed or discovered using a genetic operator called Module Mutation. Several versions of Module Mutation are evaluated in this paper. Both fixed modular networks and Module Mutation networks outperform monolithic networks and Multitask networks. Interestingly, the best networks dedicate modules to critical behaviors (such as escaping when surrounded after luring ghosts near a power pill) that do not follow the customary division of the game into chasing edible and escaping threat ghosts. The results demonstrate that MM-NEAT can discover interesting and effective behavior for agents in challenging games.
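
    A toy sketch of the module-arbitration mechanism used in modular networks of this kind, where each module carries policy outputs plus a preference output and the module with the highest preference controls the agent on each step; this illustrates the idea only and is not the MM-NEAT implementation, and all sizes and weights are invented.

        # Toy module arbitration: each module outputs action values plus one
        # "preference" value; the highest-preference module picks the action.
        import numpy as np

        rng = np.random.default_rng(1)
        n_inputs, n_actions, n_modules = 8, 4, 2

        # one weight matrix per module: rows = actions + 1 preference neuron
        modules = [rng.normal(size=(n_actions + 1, n_inputs)) for _ in range(n_modules)]

        def act(observation):
            outputs = [w @ observation for w in modules]
            preferences = [out[-1] for out in outputs]     # last row = preference neuron
            chosen = int(np.argmax(preferences))           # module arbitration
            action = int(np.argmax(outputs[chosen][:-1]))  # action from winning module
            return chosen, action

        obs = rng.normal(size=n_inputs)
        print(act(obs))    # e.g. (1, 3): module 1 wins and selects action 3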

  13. Discovering Multimodal Behavior in Ms. Pac-Man through Evolution of Modular Neural Networks

    PubMed Central

    Schrum, Jacob; Miikkulainen, Risto

    2015-01-01

    Ms. Pac-Man is a challenging video game in which multiple modes of behavior are required: Ms. Pac-Man must escape ghosts when they are threats and catch them when they are edible, in addition to eating all pills in each level. Past approaches to learning behavior in Ms. Pac-Man have treated the game as a single task to be learned using monolithic policy representations. In contrast, this paper uses a framework called Modular Multi-objective NEAT (MM-NEAT) to evolve modular neural networks. Each module defines a separate behavior. The modules are used at different times according to a policy that can be human-designed (i.e. Multitask) or discovered automatically by evolution. The appropriate number of modules can be fixed or discovered using a genetic operator called Module Mutation. Several versions of Module Mutation are evaluated in this paper. Both fixed modular networks and Module Mutation networks outperform monolithic networks and Multitask networks. Interestingly, the best networks dedicate modules to critical behaviors (such as escaping when surrounded after luring ghosts near a power pill) that do not follow the customary division of the game into chasing edible and escaping threat ghosts. The results demonstrate that MM-NEAT can discover interesting and effective behavior for agents in challenging games. PMID:27030803

  14. Study of thermal management for space platform applications: Unmanned modular thermal management and radiator technologies

    NASA Technical Reports Server (NTRS)

    Oren, J. A.

    1981-01-01

    Candidate techniques for thermal management of unmanned modules docked to a large 250 kW platform were evaluated. Both automatically deployed and space constructed radiator systems were studied to identify characteristics and potential problems. Radiator coating requirements and current state-of-the-art were identified. An assessment of the technology needs was made and advancements were recommended.

  15. Engineering Supply Management System: The Next Generation

    DTIC Science & Technology

    1991-09-01

    ... partial receipts, automatic inventory update, discrepant material, order processing requirements, transaction reversal capability ... The system's modules that support the DEH's needs are Sales Order Processing, Register Sales, Purchase Order Processing, Inventory ... a modular system developed by PIC Business Systems, Incorporated. This system possesses Order Processing, Inventory Management, Purchase Orders, and ...

  16. Automatic load sharing in inverter modules

    NASA Technical Reports Server (NTRS)

    Nagano, S.

    1979-01-01

    Active feedback loads transistors equally with little power loss. The circuit is suitable for balancing modular inverters in spacecraft, computer power supplies, solar-electric power generators, and electric vehicles. The current-balancing circuit senses the difference between the collector current of each power transistor and the average value of the load currents for all power transistors. The principle is effective not only in fixed duty-cycle inverters but also in converters operating at variable duty cycles.
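
    A simplified numeric illustration of the balancing principle described above, in which each module's drive is corrected toward the average of all module currents; the current values and gain are hypothetical and this is not a model of the actual circuit.

        # Numeric illustration of current balancing: each module senses
        # (its current - average) and corrects its drive accordingly.
        def balance(currents, gain=0.5, steps=20):
            currents = list(currents)
            for _ in range(steps):
                avg = sum(currents) / len(currents)
                currents = [i - gain * (i - avg) for i in currents]
            return currents

        print(balance([4.0, 5.5, 6.5]))   # converges toward equal sharing near 5.33 A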

  17. Modelling Metamorphism by Abstract Interpretation

    NASA Astrophysics Data System (ADS)

    Dalla Preda, Mila; Giacobazzi, Roberto; Debray, Saumya; Coogan, Kevin; Townsend, Gregg M.

    Metamorphic malware apply semantics-preserving transformations to their own code in order to foil detection systems based on signature matching. In this paper we consider the problem of automatically extracting metamorphic signatures from such malware. We introduce a semantics for self-modifying code, called phase semantics, and prove its correctness by showing that it is an abstract interpretation of the standard trace semantics. Phase semantics precisely models the metamorphic code behavior by providing a set of traces of programs which correspond to the possible evolutions of the metamorphic code during execution. We show that metamorphic signatures can be automatically extracted by abstract interpretation of the phase semantics, and that regular metamorphism can be modelled as a finite-state automata abstraction of the phase semantics.

  18. UFO - The Universal FEYNRULES Output

    NASA Astrophysics Data System (ADS)

    Degrande, Céline; Duhr, Claude; Fuks, Benjamin; Grellscheid, David; Mattelaer, Olivier; Reiter, Thomas

    2012-06-01

    We present a new model format for automated matrix-element generators, the so-called Universal FEYNRULES Output (UFO). The format is universal in the sense that it features compatibility with more than one generator and is designed to be flexible, modular and agnostic of any assumption such as the number of particles or the color and Lorentz structures appearing in the interaction vertices. Unlike other model formats where text files need to be parsed, the information on the model is encoded into a PYTHON module that can easily be linked to other computer codes. We then describe an interface for the MATHEMATICA package FEYNRULES that allows for an automatic output of models in the UFO format.
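
    A schematic illustration of the underlying idea: the model content is shipped as an importable Python module of objects rather than as text to be parsed. The class and field names below are invented for illustration and are not the actual UFO object_library definitions.

        # Schematic of "model as importable Python objects".  Illustrative classes
        # only; not the UFO object_library API.
        from dataclasses import dataclass
        from typing import List

        @dataclass
        class Particle:
            pdg_code: int
            name: str
            spin: int          # 2S + 1
            color: int         # colour representation dimension
            mass: str          # name of the external mass parameter
            width: str

        @dataclass
        class Vertex:
            particles: List[Particle]
            coupling: str      # name of the coupling constant

        # "model" content, importable by any generator that understands the convention
        top = Particle(pdg_code=6, name="t", spin=2, color=3, mass="MT", width="WT")
        gluon = Particle(pdg_code=21, name="g", spin=3, color=8, mass="ZERO", width="ZERO")
        ttg = Vertex(particles=[top, top, gluon], coupling="GC_11")

        all_particles = [top, gluon]
        all_vertices = [ttg]
        print([p.name for p in all_particles], ttg.coupling)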

  19. B-737 Linear Autoland Simulink Model

    NASA Technical Reports Server (NTRS)

    Belcastro, Celeste (Technical Monitor); Hogge, Edward F.

    2004-01-01

    The Linear Autoland Simulink model was created to be a modular test environment for testing of control system components in commercial aircraft. The input variables, physical laws, and reference frames used are summarized. The state space theory underlying the model is surveyed and the location of the control actuators described. The equations used to realize the Dryden gust model to simulate winds and gusts are derived. A description of the pseudo-random number generation method used in the wind gust model is included. The longitudinal autopilot, lateral autopilot, automatic throttle autopilot, engine model and automatic trim devices are considered as subsystems. The experience in converting the Airlabs FORTRAN aircraft control system simulation to a graphical simulation tool (Matlab/Simulink) is described.
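
    A simplified sketch of how a Dryden-type gust can be realized, assuming a first-order shaping filter driven by pseudo-random noise; this approximates only the longitudinal component, and the turbulence intensity, scale length, and airspeed below are hypothetical values, not those of the report.

        # First-order approximation of a Dryden-type longitudinal gust: white
        # pseudo-random noise shaped by a filter with time constant L/V.
        import numpy as np

        def dryden_longitudinal(sigma=1.5, L=200.0, V=70.0, dt=0.02, n=1000, seed=42):
            rng = np.random.default_rng(seed)      # pseudo-random number source
            tau = L / V                            # filter time constant (s)
            a = np.exp(-dt / tau)
            b = sigma * np.sqrt(1.0 - a * a)       # keeps output variance ~ sigma^2
            u = np.zeros(n)
            w = rng.standard_normal(n)
            for k in range(1, n):
                u[k] = a * u[k - 1] + b * w[k]
            return u

        gust = dryden_longitudinal()
        print(gust.std())                          # roughly sigma after transients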

  20. Development and integration of a LabVIEW-based modular architecture for automated execution of electrochemical catalyst testing.

    PubMed

    Topalov, Angel A; Katsounaros, Ioannis; Meier, Josef C; Klemm, Sebastian O; Mayrhofer, Karl J J

    2011-11-01

    This paper describes a system for performing electrochemical catalyst testing where all hardware components are controlled simultaneously using a single LabVIEW-based software application. The software that we developed can be operated in both manual mode for exploratory investigations and automatic mode for routine measurements, by using predefined execution procedures. The latter enables the execution of high-throughput or combinatorial investigations, which decrease substantially the time and cost for catalyst testing. The software was constructed using a modular architecture which simplifies the modification or extension of the system, depending on future needs. The system was tested by performing stability tests of commercial fuel cell electrocatalysts, and the advantages of the developed system are discussed. © 2011 American Institute of Physics

  1. GAMBIT: the global and modular beyond-the-standard-model inference tool

    NASA Astrophysics Data System (ADS)

    Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Buckley, Andy; Chrząszcz, Marcin; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Dickinson, Hugh; Edsjö, Joakim; Farmer, Ben; Gonzalo, Tomás E.; Jackson, Paul; Krislock, Abram; Kvellestad, Anders; Lundberg, Johan; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Raklev, Are; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Savage, Christopher; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; White, Martin; Wild, Sebastian

    2017-11-01

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily-extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org.

  2. Film grain synthesis and its application to re-graining

    NASA Astrophysics Data System (ADS)

    Schallauer, Peter; Mörzinger, Roland

    2006-01-01

    Digital film restoration and special effects compositing require more and more automatic procedures for movie re-graining. Missing or inhomogeneous grain decreases perceived quality. For the purpose of grain synthesis, an existing texture synthesis algorithm has been evaluated and optimized. We show that this algorithm can produce synthetic grain which is perceptually similar to a given grain template, which has high spatial and temporal variation and which can be applied to multi-spectral images. Furthermore, a re-graining application framework is proposed, which synthesises artificial grain based on an input grain template and composites it with the original image content. Due to its modular approach, this framework supports manual as well as automatic re-graining applications. Two example applications are presented, one for re-graining an entire movie and one for fully automatic re-graining of image regions produced by restoration algorithms. The low computational cost of the proposed algorithms allows application in industrial-grade software.

  3. Multilevel and Hybrid Architecture for Device Abstraction and Context Information Management in Smart Home Environments

    NASA Astrophysics Data System (ADS)

    Peláez, Víctor; González, Roberto; San Martín, Luis Ángel; Campos, Antonio; Lobato, Vanesa

    Hardware device management, and context information acquisition and abstraction are key factors to develop the ambient intelligent paradigm in smart homes. This work presents an architecture that addresses these two problems and provides a usable framework to develop applications easily. In contrast to other proposals, this work addresses performance issues specifically. Results show that the execution performance of the developed prototype is suitable for deployment in a real environment. In addition, the modular design of the system allows the user to develop applications using different techniques and different levels of abstraction.

  4. A Modular Approach To Developing A Large Deployable Reflector

    NASA Astrophysics Data System (ADS)

    Pittman, R.; Leidich, C.; Mascy, F.; Swenson, B.

    1984-01-01

    NASA is currently studying the feasibility of developing a Large Deployable Reflector (LDR) astronomical facility to perform astrophysical studies of the infrared and submillimeter portion of the spectrum in the mid 1990's. The LDR concept was recommended by the Astronomy Survey Committee of the National Academy of Sciences as one of two space based projects to be started this decade. The current baseline calls for a 20 m (65.6 ft) aperture telescope diffraction limited at 30 μm and automatically deployed from a single Shuttle launch. The volume, performance, and single launch constraints place great demands on the technology and place LDR beyond the state-of-the-art in certain areas such as lightweight reflector segments. The advent of the Shuttle is opening up many new options and capabilities for producing large space systems. Until now, LDR has always been conceived as an integrated system, deployed autonomously in a single launch. This paper will look at a combination of automatic deployment and on-orbit assembly that may reduce the technological complexity and cost of the LDR system. Many technological tools are now in use or under study that will greatly enhance our capabilities to do assembly in space. Two Shuttle volume budget scenarios will be examined to assess the potential of these tools to reduce the LDR system complexity. Further study will be required to reach the full optimal combination of deployment and assembly, since in most cases the capabilities of these new tools have not been demonstrated. In order to take maximum advantage of these concepts, the design of LDR must be flexible and allow one subsystem to be modified without adversely affecting the entire system. One method of achieving this flexibility is to use a modular design approach in which the major subsystems are physically separated during launch and assembled on orbit. A modular design approach facilitates this flexibility but requires that the subsystems be interfaced in a simple, straightforward, and controlled manner. NASA is currently defining a technology development plan for LDR which will identify the technology advances that are required. The modular approach offers the flexibility to easily incorporate these new advances into the design.

  5. Tailoring Software Inspections for Aspect-Oriented Programming

    ERIC Educational Resources Information Center

    Watkins, Charlette Ward

    2009-01-01

    Aspect-Oriented Software Development (AOSD) is a new approach that addresses limitations inherent in conventional programming, especially the principle of separation of concerns by emphasizing the encapsulation and modularization of crosscutting concerns through a new abstraction, the "aspect." Aspect-oriented programming is an emerging AOSD…

  6. Abstracts Produced Using Computer Assistance.

    ERIC Educational Resources Information Center

    Craven, Timothy C.

    2000-01-01

    Describes an experiment that evaluated features of TEXNET abstracting software, compared the use of keywords and phrases that were automatically extracted, tested hypotheses about relations between abstractors' backgrounds and their reactions to abstracting assistance software, and obtained ideas for further features to be developed in TEXNET.…

  7. Selforganization of modular activity of grid cells

    PubMed Central

    Urdapilleta, Eugenio; Si, Bailu

    2017-01-01

    A unique topographical representation of space is found in the concerted activity of grid cells in the rodent medial entorhinal cortex. Many among the principal cells in this region exhibit a hexagonal firing pattern, in which each cell expresses its own set of place fields (spatial phases) at the vertices of a triangular grid, the spacing and orientation of which are typically shared with neighboring cells. Grid spacing, in particular, has been found to increase along the dorso‐ventral axis of the entorhinal cortex but in discrete steps, that is, with a modular structure. In this study, we show that such a modular activity may result from the self‐organization of interacting units, which individually would not show discrete but rather continuously varying grid spacing. Within our “adaptation” network model, the effect of a continuously varying time constant, which determines grid spacing in the isolated cell model, is modulated by recurrent collateral connections, which tend to produce a few subnetworks, akin to magnetic domains, each with its own grid spacing. In agreement with experimental evidence, the modular structure is tightly defined by grid spacing, but also involves grid orientation and distortion, due to interactions across modules. Thus, our study sheds light onto a possible mechanism, other than simply assuming separate networks a priori, underlying the formation of modular grid representations. PMID:28768062

  8. Towards cortex sized artificial neural systems.

    PubMed

    Johansson, Christopher; Lansner, Anders

    2007-01-01

    We propose, implement, and discuss an abstract model of the mammalian neocortex. This model is instantiated with a sparse recurrently connected neural network that has spiking leaky integrator units and continuous Hebbian learning. First we study the structure, modularization, and size of neocortex, and then we describe a generic computational model of the cortical circuitry. A characterizing feature of the model is that it is based on the modularization of neocortex into hypercolumns and minicolumns. Both a floating- and fixed-point arithmetic implementation of the model are presented along with simulation results. We conclude that an implementation on a cluster computer is not communication but computation bounded. A mouse and rat cortex sized version of our model executes in 44% and 23% of real time, respectively. Further, an instance of the model with 1.6 x 10^6 units and 2 x 10^11 connections performed noise reduction and pattern completion. These implementations represent the current frontier of large-scale abstract neural network simulations in terms of network size and running speed.
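
    A toy sketch of the two ingredients named above, a leaky integrator unit with a spike threshold and a Hebbian weight update; all constants are invented and this is not the cluster-scale implementation described in the paper.

        # Toy leaky-integrator unit with Hebbian learning on its input weights.
        import numpy as np

        rng = np.random.default_rng(0)
        n_pre, tau, dt, threshold, eta = 20, 20.0, 1.0, 1.0, 0.01

        w = rng.uniform(0.0, 0.1, size=n_pre)    # incoming synaptic weights
        v = 0.0                                  # membrane potential of the unit

        for t in range(200):
            x = (rng.random(n_pre) < 0.1).astype(float)   # presynaptic spikes
            v += dt / tau * (-v + w @ x)                  # leaky integration
            y = 1.0 if v >= threshold else 0.0            # spike
            if y:
                v = 0.0                                   # reset after spiking
            w += eta * y * x                              # Hebbian: strengthen co-active synapses

        print(round(v, 3), round(w.mean(), 4))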

  9. Automatic processing of spoken dialogue in the home hemodialysis domain.

    PubMed

    Lacson, Ronilda; Barzilay, Regina

    2005-01-01

    Spoken medical dialogue is a valuable source of information, and it forms a foundation for diagnosis, prevention and therapeutic management. However, understanding even a perfect transcript of spoken dialogue is challenging for humans because of the lack of structure and the verbosity of dialogues. This work presents a first step towards automatic analysis of spoken medical dialogue. The backbone of our approach is an abstraction of a dialogue into a sequence of semantic categories. This abstraction uncovers structure in informal, verbose conversation between a caregiver and a patient, thereby facilitating automatic processing of dialogue content. Our method induces this structure based on a range of linguistic and contextual features that are integrated in a supervised machine-learning framework. Our model has a classification accuracy of 73%, compared to 33% achieved by a majority baseline (p<0.01). This work demonstrates the feasibility of automatically processing spoken medical dialogue.
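
    A small sketch of the general setup described above: map each utterance to a semantic category with a supervised classifier over simple lexical features. The utterances, labels, and feature choice are invented for illustration and are not the study's data, features, or exact model.

        # Sketch: classify utterances into semantic categories with a supervised
        # text classifier.  Toy data and labels only.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        train_utterances = [
            "my blood pressure was one forty over ninety",
            "I felt dizzy after the exchange",
            "should I change the dialysate concentration",
            "the machine alarm went off twice last night",
        ]
        train_labels = ["clinical_data", "symptom", "question", "equipment"]

        clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
        clf.fit(train_utterances, train_labels)

        print(clf.predict(["the alarm keeps beeping during treatment"]))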

  10. Programming languages for circuit design.

    PubMed

    Pedersen, Michael; Yordanov, Boyan

    2015-01-01

    This chapter provides an overview of a programming language for Genetic Engineering of Cells (GEC). A GEC program specifies a genetic circuit at a high level of abstraction through constraints on otherwise unspecified DNA parts. The GEC compiler then selects parts which satisfy the constraints from a given parts database. GEC further provides more conventional programming language constructs for abstraction, e.g., through modularity. The GEC language and compiler is available through a Web tool which also provides functionality, e.g., for simulation of designed circuits.

  11. Terrain following of arbitrary surfaces using a high intensity LED proximity sensor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, J.E.

    Many robotic operations, e.g., mapping, scanning, feature following, etc., require accurate surface following of arbitrary targets. This paper presents a versatile surface following and mapping system designed to promote hardware, software and application independence, modular development, and upward expandability. These goals are met by: a full, a priori specification of the hardware and software interfaces; a modular system architecture; and a hierarchical surface-data analysis method, permitting application-specific tuning at each conceptual level of topological abstraction. This surface following system was fully designed independently of any specific robotic host, then successfully integrated with and demonstrated on a completely a priori unknown, real-time robotic system. 7 refs.

  12. A Modular Flow Design for the meta‐Selective C−H Arylation of Anilines

    PubMed Central

    Gemoets, Hannes P. L.; Laudadio, Gabriele; Verstraete, Kirsten; Hessel, Volker

    2017-01-01

    Described herein is an effective and practical modular flow design for the meta‐selective C−H arylation of anilines. The design consists of four continuous‐flow modules (i.e., diaryliodonium salt synthesis, meta‐selective C−H arylation, inline copper extraction, and aniline deprotection) which can be operated either individually or consecutively to provide direct access to meta‐arylated anilines. With a total residence time of 1 hour, the desired product could be obtained in high yield and excellent purity without the need for column chromatography, and the residual copper content meets the standards for parenterally administered pharmaceutical substances. PMID:28543979

  13. Visual mismatch negativity indicates automatic, task-independent detection of artistic image composition in abstract artworks.

    PubMed

    Menzel, Claudia; Kovács, Gyula; Amado, Catarina; Hayn-Leichsenring, Gregor U; Redies, Christoph

    2018-05-06

    In complex abstract art, image composition (i.e., the artist's deliberate arrangement of pictorial elements) is an important aesthetic feature. We investigated whether the human brain detects image composition in abstract artworks automatically (i.e., independently of the experimental task). To this aim, we studied whether a group of 20 original artworks elicited a visual mismatch negativity when contrasted with a group of 20 images that were composed of the same pictorial elements as the originals, but in shuffled arrangements, which destroy artistic composition. We used a passive oddball paradigm with parallel electroencephalogram recordings to investigate the detection of image type-specific properties. We observed significant deviant-standard differences for the shuffled and original images, respectively. Furthermore, for both types of images, differences in amplitudes correlated with the behavioral ratings of the images. In conclusion, we show that the human brain can detect composition-related image properties in visual artworks in an automatic fashion. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Modularizing Spatial Ontologies for Assisted Living Systems

    NASA Astrophysics Data System (ADS)

    Hois, Joana

    Assisted living systems are intended to support daily-life activities in user homes by automating and monitoring the behavior of the environment while interacting with the user in a non-intrusive way. The knowledge base of such systems therefore has to define thematically different aspects of the environment, mostly related to space, such as basic spatial floor plan information, pieces of technical equipment in the environment and their functions and spatial ranges, activities users can perform, entities that occur in the environment, etc. In this paper, we present thematically different ontologies, each of which describes environmental aspects from a particular perspective. The resulting modular structure allows the selection of application-specific ontologies as necessary. This hides information and reduces complexity in terms of the represented spatial knowledge and reasoning practicability. We motivate and present the different spatial ontologies applied to an ambient assisted living application.

  15. Information Leakage Analysis by Abstract Interpretation

    NASA Astrophysics Data System (ADS)

    Zanioli, Matteo; Cortesi, Agostino

    Protecting the confidentiality of information stored in a computer system or transmitted over a public network is a relevant problem in computer security. The approach of information flow analysis involves performing a static analysis of the program with the aim of proving that there will not be leaks of sensitive information. In this paper we propose a new domain that combines variable dependency analysis, based on propositional formulas, and variables' value analysis, based on polyhedra. The resulting analysis is strictly more accurate than the state-of-the-art abstract-interpretation-based analyses for information leakage detection. Its modular construction allows one to deal with the trade-off between efficiency and accuracy by tuning the granularity of the abstraction and the complexity of the abstract operators.

  16. Modular Neural Networks for Speech Recognition.

    DTIC Science & Technology

    1996-08-01

    ... automatic speech recognition, understanding and translation since the early 1950s. Although researchers have demonstrated impressive results with ... nodes. It serves only as a data source for the following hidden layer(s). Finally, the network's output is computed by neurons in the output layer. The ... following update rule for weights in the hidden layer: w_ji^(n+1) = w_ji^(n) - eta * dE/dw_ji. It is easy to generalize the backpropagation ...

  17. A Strategy for Efficiently Verifying Requirements Specifications Using Composition and Invariants

    DTIC Science & Technology

    2003-09-05

  18. A Thesaurus for Use in a Computer-Aided Abstracting Tool Kit.

    ERIC Educational Resources Information Center

    Craven, Timothy C.

    1993-01-01

    Discusses the use of thesauri in automatic indexing and describes the development of a prototype computerized abstractor's assistant. Topics addressed include TEXNET, a text network management system; the use of TEXNET for abstracting; the structure and use of a thesaurus for abstracting in TEXNET; and weighted terms. (Contains 26 references.)…

  19. Quantification of complex modular architecture in plants.

    PubMed

    Reeb, Catherine; Kaandorp, Jaap; Jansson, Fredrik; Puillandre, Nicolas; Dubuisson, Jean-Yves; Cornette, Raphaël; Jabbour, Florian; Coudert, Yoan; Patiño, Jairo; Flot, Jean-François; Vanderpoorten, Alain

    2018-04-01

    Morphometrics, the assignment of quantities to biological shapes, is a powerful tool to address taxonomic, evolutionary, functional and developmental questions. We propose a novel method for shape quantification of complex modular architecture in thalloid plants, whose extremely reduced morphologies, combined with the lack of a formal framework for thallus description, have long rendered taxonomic and evolutionary studies extremely challenging. Using graph theory, thalli are described as hierarchical series of nodes and edges, allowing for accurate, homologous and repeatable measurements of widths, lengths and angles. The computer program MorphoSnake was developed to extract the skeleton and contours of a thallus and automatically acquire, at each level of organization, width, length, angle and sinuosity measurements. Through the quantification of leaf architecture in Hymenophyllum ferns (Polypodiopsida) and a fully worked example of integrative taxonomy in the taxonomically challenging thalloid liverwort genus Riccardia, we show that MorphoSnake is applicable to all ramified plants. This new possibility of acquiring large numbers of quantitative traits in plants with complex modular architectures opens new perspectives of applications, from the development of rapid species identification tools to evolutionary analyses of adaptive plasticity. © 2018 The Authors. New Phytologist © 2018 New Phytologist Trust.
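
    A sketch of the first step of such a pipeline, extracting a skeleton from a binary plant mask with scikit-image; the T-shaped mask is synthetic, and the node/edge graph construction performed by MorphoSnake is only hinted at here by a crude branch-point count.

        # Sketch: skeletonize a synthetic binary "thallus" mask and count branch
        # pixels as a stand-in for building the node/edge graph.
        import numpy as np
        from skimage.morphology import skeletonize

        mask = np.zeros((60, 60), dtype=bool)
        mask[10:50, 28:32] = True          # main axis of a toy "thallus"
        mask[28:32, 10:50] = True          # one lateral branch -> a T-shaped module

        skeleton = skeletonize(mask)

        # crude branch-point detection: skeleton pixels with 3+ skeleton neighbours
        neighbours = sum(np.roll(np.roll(skeleton, dy, 0), dx, 1)
                         for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0))
        branch_points = skeleton & (neighbours >= 3)
        print(int(skeleton.sum()), int(branch_points.sum()))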

  20. Different micromanipulation applications based on common modular control architecture

    NASA Astrophysics Data System (ADS)

    Sipola, Risto; Vallius, Tero; Pudas, Marko; Röning, Juha

    2010-01-01

    This paper validates a previously introduced scalable modular control architecture and shows how it can be used to implement research equipment. The validation is conducted by presenting different kinds of micromanipulation applications that use the architecture. Conditions of the micro-world are very different from those of the macro-world. Adhesive forces are significant compared to gravitational forces when micro-scale objects are manipulated. Manipulation is mainly conducted by automatic control relying on haptic feedback provided by force sensors. The validated architecture is a hierarchical layered hybrid architecture, including a reactive layer and a planner layer. The implementation of the architecture is modular, and the architecture has a lot in common with open architectures. Further, the architecture is extensible, scalable, portable and it enables reuse of modules. These are the qualities that we validate in this paper. To demonstrate the claimed features, we present different applications that require special control in micrometer, millimeter and centimeter scales. These applications include a device that measures cell adhesion, a device that examines properties of thin films, a device that measures adhesion of micro fibers and a device that examines properties of submerged gel produced by bacteria. Finally, we analyze how the architecture is used in these applications.

  1. Consistent model driven architecture

    NASA Astrophysics Data System (ADS)

    Niepostyn, Stanisław J.

    2015-09-01

    The goal of the MDA is to produce software systems from abstract models in a way that restricts human interaction to a minimum. These abstract models are based on the UML language; however, the semantics of UML models is defined in a natural language. Consequently, verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. The verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.

  2. Development of spray guns for the application of rigid foam insulation

    NASA Technical Reports Server (NTRS)

    Allen, Peter B.

    1993-01-01

    The paper describes the activities initiated to improve the existing spray gun system used for spraying insulating foam on the External Tank of the Space Shuttle, due to the quality variations of the applied foam noted in the past. Consideration is given to the two tasks of the project: (1) investigations of possible improvements, as an interim measure, to the spray gun currently used to apply the large acreage spray-on-foam insulation and the evaluation of other commercial equipment; and (2) the design and fabrication of a new automatic spray gun. The design and operation of the currently used Binks 43 PA spray gun are described together with several new breadboard spray guns designed and fabricated and the testing procedures developed. These new guns include the Modular Automatic Foam spray gun, the Ball Valve spray gun, and the Tapered Plug Valve (TPV) gun. As a result of tests, the TPV spray gun is recommended to replace the currently used automatic spray gun.

  3. Modular robotic assembly of small devices.

    PubMed

    Frauenfelder, M

    2000-01-01

    The use of robots for the automatic assembly of devices of up to 100 x 100 x 100 mm is relatively uncommon today. Insufficient return on investment and the long lead times that are required have been limiting factors. Innovations in vision technology have led to the development of robotic assembly systems that employ flexible part-feeding. The benefits of these systems are described, which suggest that better ratios of price to productivity and deployment times are now achievable.

  4. Terrain following of arbitrary surfaces using a high intensity LED proximity sensor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, J.E.

    1992-01-01

    Many robotic operations, e.g., mapping, scanning, feature following, etc., require accurate surface following of arbitrary targets. This paper presents a versatile surface following and mapping system designed to promote hardware, software and application independence, modular development, and upward expandability. These goals are met by: a full, a priori specification of the hardware and software interfaces; a modular system architecture; and a hierarchical surface-data analysis method, permitting application-specific tuning at each conceptual level of topological abstraction. This surface following system was fully designed independently of any specific robotic host, then successfully integrated with and demonstrated on a completely a priori unknown, real-time robotic system. 7 refs.

  5. An Approach to Automated Fusion System Design and Adaptation

    PubMed Central

    Fritze, Alexander; Mönks, Uwe; Holst, Christoph-Alexander; Lohweg, Volker

    2017-01-01

    Industrial applications are in transition towards modular and flexible architectures that are capable of self-configuration and -optimisation. This is due to the demand of mass customisation and the increasing complexity of industrial systems. The conversion to modular systems is related to challenges in all disciplines. Consequently, diverse tasks such as information processing, extensive networking, or system monitoring using sensor and information fusion systems need to be reconsidered. The focus of this contribution is on distributed sensor and information fusion systems for system monitoring, which must reflect the increasing flexibility of fusion systems. This contribution thus proposes an approach, which relies on a network of self-descriptive intelligent sensor nodes, for the automatic design and update of sensor and information fusion systems. This article encompasses the fusion system configuration and adaptation as well as communication aspects. Manual interaction with the flexibly changing system is reduced to a minimum. PMID:28300762

  6. An Approach to Automated Fusion System Design and Adaptation.

    PubMed

    Fritze, Alexander; Mönks, Uwe; Holst, Christoph-Alexander; Lohweg, Volker

    2017-03-16

    Industrial applications are in transition towards modular and flexible architectures that are capable of self-configuration and -optimisation. This is due to the demand of mass customisation and the increasing complexity of industrial systems. The conversion to modular systems is related to challenges in all disciplines. Consequently, diverse tasks such as information processing, extensive networking, or system monitoring using sensor and information fusion systems need to be reconsidered. The focus of this contribution is on distributed sensor and information fusion systems for system monitoring, which must reflect the increasing flexibility of fusion systems. This contribution thus proposes an approach, which relies on a network of self-descriptive intelligent sensor nodes, for the automatic design and update of sensor and information fusion systems. This article encompasses the fusion system configuration and adaptation as well as communication aspects. Manual interaction with the flexibly changing system is reduced to a minimum.

  7. FUSE: a profit maximization approach for functional summarization of biological networks.

    PubMed

    Seah, Boon-Siew; Bhowmick, Sourav S; Dewey, C Forbes; Yu, Hanry

    2012-03-21

    The availability of large-scale curated protein interaction datasets has given rise to the opportunity to investigate higher level organization and modularity within the protein interaction network (PPI) using graph theoretic analysis. Despite the recent progress, systems-level analysis of PPIs remains a daunting task as it is challenging to make sense out of the deluge of high-dimensional interaction data. Specifically, techniques that automatically abstract and summarize PPIs at multiple resolutions to provide high level views of their functional landscape are still lacking. We present a novel data-driven and generic algorithm called FUSE (Functional Summary Generator) that generates functional maps of a PPI at different levels of organization, from broad process-process level interactions to in-depth complex-complex level interactions, through a profit maximization approach that exploits the Minimum Description Length (MDL) principle to maximize information gain of the summary graph while satisfying the level-of-detail constraint. We evaluate the performance of FUSE on several real-world PPIs. We also compare FUSE to state-of-the-art graph clustering methods with GO term enrichment by constructing the biological process landscape of the PPIs. Using the AD network as our case study, we further demonstrate the ability of FUSE to quickly summarize the network and identify many different processes and complexes that regulate it. Finally, we study the higher-order connectivity of the human PPI. By simultaneously evaluating interaction and annotation data, FUSE abstracts higher-order interaction maps by reducing the details of the underlying PPI to form a functional summary graph of interconnected functional clusters. Our results demonstrate its effectiveness and superiority over state-of-the-art graph clustering methods with GO term enrichment.

  8. An image engineering system for the inspection of transparent construction materials

    NASA Astrophysics Data System (ADS)

    Hinz, S.; Stephani, M.; Schiemann, L.; Zeller, K.

    This article presents a modular photogrammetric recording and image analysis system for inspecting the material characteristics of transparent foils, in particular Ethylene-TetraFluoroEthylene copolymer (ETFE) foils. The foils are put under increasing air pressure and are observed by a stereo camera system. Determining the time-variable 3D shape of transparent material poses a number of challenges, especially the automatic point transfer between stereo images and, in the temporal domain, from one image pair to the next. We developed an automatic approach that accommodates these particular circumstances and allows reconstruction of the 3D shape for each epoch as well as determination of 3D translation vectors between epochs by feature tracking. Examples including numerical results and accuracy measures prove the applicability of the system.

  9. Automatic Publication of a MIS Product to GeoNetwork: Case of the AIS Indexer

    DTIC Science & Technology

    2012-11-01

    An Automatic Identification System (AIS) reception indexer Java application was developed in the summer of 2011, based on the work of Lapinski and ... The following instructions are for installing and configuring the software packages Java 1.6 and MySQL 5.5, which are ...

  10. Modular theory of inverse systems

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The relationship between multivariable zeros and inverse systems was explored. A definition of the zero module is given in such a way that it is basis independent. The existence of essential right and left inverses was established. The way in which the abstract zero module captures previous definitions of multivariable zeros is explained and examples are presented.

  11. Toward Routine Automatic Pathway Discovery from On-line Scientific Text Abstracts.

    PubMed

    Ng; Wong

    1999-01-01

    We are entering a new era of research where the latest scientific discoveries are often first reported online and are readily accessible by scientists worldwide. This rapid electronic dissemination of research breakthroughs has greatly accelerated the current pace in genomics and proteomics research. The race to the discovery of a gene or a drug has now become increasingly dependent on how quickly a scientist can scan through the voluminous amount of information available online to construct the relevant picture (such as protein-protein interaction pathways) as it takes shape amongst the rapidly expanding pool of globally accessible biological data (e.g. GENBANK) and scientific literature (e.g. MEDLINE). We describe a prototype system for automatic pathway discovery from on-line text abstracts, combining technologies that (1) retrieve research abstracts from online sources, (2) extract relevant information from the free texts, and (3) present the extracted information graphically and intuitively. Our work demonstrates that this framework allows us to routinely scan online scientific literature for automatic discovery of knowledge, giving modern scientists the necessary competitive edge in managing the information explosion in this electronic age.

  12. FunBlocks. A modular framework for AmI system development.

    PubMed

    Baquero, Rafael; Rodríguez, José; Mendoza, Sonia; Decouchant, Dominique; Papis, Alfredo Piero Mateos

    2012-01-01

    The last decade has seen explosive growth in the technologies required to implement Ambient Intelligence (AmI) systems. Technologies such as facial and speech recognition, home networks, household cleaning robots, to name a few, have become commonplace. However, due to the multidisciplinary nature of AmI systems and the distinct requirements of different user groups, integrating these developments into full-scale systems is not an easy task. In this paper we propose FunBlocks, a minimalist modular framework for the development of AmI systems based on the function module abstraction used in the IEC 61499 standard for distributed control systems. FunBlocks provides a framework for the development of AmI systems through the integration of modules loosely joined by means of an event-driven middleware and a module and sensor/actuator catalog. The modular design of the FunBlocks framework allows the development of AmI systems which can be customized to a wide variety of usage scenarios.
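
    A minimal sketch of the event-driven coupling idea described above, in which modules register interest in named events and react when other modules publish them; the class and event names are hypothetical and do not reflect the FunBlocks API.

        # Minimal event bus: modules are loosely joined by publish/subscribe.
        from collections import defaultdict

        class EventBus:
            def __init__(self):
                self._subscribers = defaultdict(list)

            def subscribe(self, event, handler):
                self._subscribers[event].append(handler)

            def publish(self, event, **data):
                for handler in self._subscribers[event]:
                    handler(**data)

        bus = EventBus()

        # a "sensor" module publishes; "actuator" and "logging" modules react
        bus.subscribe("motion_detected", lambda room: print(f"turning on lights in {room}"))
        bus.subscribe("motion_detected", lambda room: print(f"logging activity in {room}"))
        bus.publish("motion_detected", room="kitchen")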

  13. Modular 3D-Printed Soil Gas Probes

    NASA Astrophysics Data System (ADS)

    Good, S. P.; Selker, J. S.; Al-Qqaili, F.; Lopez, M.; Kahel, L.

    2016-12-01

    Extraction of soil gas is required for a variety of applications in earth sciences and environmental engineering. However, commercially available probes can be costly and are typically limited to a single depth. Here, we present the open-source design and lab testing of a soil gas probe with modular capabilities that allow for the vertical stacking of gas extraction points at different depths in the soil column. The probe modules consist of a 3D-printed spacer unit and a hydrophobic gas-permeable membrane made of high-density polyethylene with pore sizes of 20-40 microns. Each of the modular spacer units contains both a gas extraction line and a gas input line for the dilution of soil gases if needed. These 2-inch diameter probes can be installed in the field quickly with a hand auger and returned to at any frequency to extract soil gas from desired soil depths. The probes are tested through extraction of soil pore water vapors with distinct stable isotope ratios.

  14. FunBlocks. A Modular Framework for AmI System Development

    PubMed Central

    Baquero, Rafael; Rodríguez, José; Mendoza, Sonia; Decouchant, Dominique; Papis, Alfredo Piero Mateos

    2012-01-01

    The last decade has seen explosive growth in the technologies required to implement Ambient Intelligence (AmI) systems. Technologies such as facial and speech recognition, home networks, household cleaning robots, to name a few, have become commonplace. However, due to the multidisciplinary nature of AmI systems and the distinct requirements of different user groups, integrating these developments into full-scale systems is not an easy task. In this paper we propose FunBlocks, a minimalist modular framework for the development of AmI systems based on the function module abstraction used in the IEC 61499 standard for distributed control systems. FunBlocks provides a framework for the development of AmI systems through the integration of modules loosely joined by means of an event-driven middleware and a module and sensor/actuator catalog. The modular design of the FunBlocks framework allows the development of AmI systems which can be customized to a wide variety of usage scenarios. PMID:23112599

  15. A comparison of the comfort and convenience of automatic safety belt systems among selected 1988-1989 model year automobiles

    DOT National Transportation Integrated Search

    1989-06-01

    Author's abstract: A nonrandom sample of 120 disproportionately short, tall, and overweight drivers compared the comfort and convenience of the automatic safety belt systems used in seventeen automobiles. Nine vehicles had motorized shoulder belts wi...

  16. Resource Aware Intelligent Network Services (RAINS) Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehman, Tom; Yang, Xi

    The Resource Aware Intelligent Network Services (RAINS) project conducted research and developed technologies in the area of cyber infrastructure resource modeling and computation. The goal of this work was to provide a foundation to enable intelligent, software defined services which spanned the network AND the resources which connect to the network. A Multi-Resource Service Plane (MRSP) was defined, which allows resource owners/managers to locate and place themselves from a topology and service availability perspective within the dynamic networked cyberinfrastructure ecosystem. The MRSP enables the presentation of integrated topology views and computation results which can include resources across the spectrum of compute, storage, and networks. The MRSP developed by the RAINS project includes the following key components: i) Multi-Resource Service (MRS) Ontology/Multi-Resource Markup Language (MRML), ii) Resource Computation Engine (RCE), iii) Modular Driver Framework (to allow integration of a variety of external resources). The MRS/MRML is a general and extensible modeling framework that allows resource owners to model, or describe, a wide variety of resource types. All resources are described using three categories of elements: Resources, Services, and Relationships between the elements. This modeling framework defines a common method for the transformation of cyber infrastructure resources into data in the form of MRML models. In order to realize this infrastructure datification, the RAINS project developed a model based computation system, i.e. “RAINS Computation Engine (RCE)”. The RCE has the ability to ingest, process, integrate, and compute based on automatically generated MRML models. The RCE interacts with the resources through system drivers which are specific to the type of external network or resource controller. The RAINS project developed a modular and pluggable driver system which facilitates a variety of resource controllers to automatically generate, maintain, and distribute MRML based resource descriptions. Once all of the resource topologies are absorbed by the RCE, a connected graph of the full distributed system topology is constructed, which forms the basis for computation and workflow processing. The RCE includes a Modular Computation Element (MCE) framework which allows for tailoring of the computation process to the specific set of resources under control, and the services desired. The input and output of an MCE are both model data based on MRS/MRML ontology and schema. Some of the RAINS project accomplishments include: development of a general and extensible multi-resource modeling framework; design of a Resource Computation Engine (RCE) system with the following key capabilities: absorbing a variety of multi-resource model types and building integrated models, a novel architecture which uses model based communications across the full stack, flexible provision of abstract or intent based user facing interfaces, and workflow processing based on model descriptions; release of the RCE as open source software; deployment of the RCE in the University of Maryland/Mid-Atlantic Crossroad ScienceDMZ in prototype mode with a plan under way to transition to production; deployment at the Argonne National Laboratory DTN Facility in prototype mode; and selection of the RCE by the DOE SENSE (SDN for End-to-end Networked Science at the Exascale) project as the basis for their orchestration service.
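    As a rough illustration of the Resources/Services/Relationships modeling pattern and the RCE's topology-merging step described above, the following Python sketch builds a connected graph from a few resource descriptions. The class fields, resource names, and the use of networkx are illustrative assumptions, not the actual MRS/MRML schema or RCE implementation.

      from dataclasses import dataclass, field
      import networkx as nx

      # Illustrative stand-in for the Resources / Services / Relationships pattern;
      # field names are placeholders, not the actual MRML schema.
      @dataclass
      class Resource:
          name: str
          kind: str                       # e.g. "compute", "storage", "network"
          services: list = field(default_factory=list)

      def build_topology(resources, relationships):
          """Merge per-owner resource descriptions into one connected topology graph,
          roughly the step a computation engine performs before workflow computation."""
          g = nx.Graph()
          for r in resources:
              g.add_node(r.name, kind=r.kind, services=r.services)
          for a, b, rel in relationships:
              g.add_edge(a, b, relation=rel)
          return g

      topology = build_topology(
          [Resource("dtn1", "compute", ["data-transfer"]),
           Resource("store1", "storage"),
           Resource("net-A", "network")],
          [("dtn1", "net-A", "connectedTo"), ("store1", "net-A", "connectedTo")],
      )
      print(nx.is_connected(topology), topology.number_of_edges())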

  17. Automatic evidence retrieval for systematic reviews.

    PubMed

    Choong, Miew Keen; Galgani, Filippo; Dunn, Adam G; Tsafnat, Guy

    2014-10-01

    Snowballing involves recursively pursuing relevant references cited in the retrieved literature and adding them to the search results. Snowballing is an alternative approach to discover additional evidence that was not retrieved through conventional search. Snowballing's effectiveness makes it best practice in systematic reviews despite being time-consuming and tedious. Our goal was to evaluate an automatic method for citation snowballing's capacity to identify and retrieve the full text and/or abstracts of cited articles. Using 20 review articles that contained 949 citations to journal or conference articles, we manually searched Microsoft Academic Search (MAS) and identified 78.0% (740/949) of the cited articles that were present in the database. We compared the performance of the automatic citation snowballing method against the results of this manual search, measuring precision, recall, and F1 score. The automatic method was able to correctly identify 633 (as proportion of included citations: recall=66.7%, F1 score=79.3%; as proportion of citations in MAS: recall=85.5%, F1 score=91.2%) of citations with high precision (97.7%), and retrieved the full text or abstract for 490 (recall=82.9%, precision=92.1%, F1 score=87.3%) of the 633 correctly retrieved citations. The proposed method for automatic citation snowballing is accurate and is capable of obtaining the full texts or abstracts for a substantial proportion of the scholarly citations in review articles. By automating the process of citation snowballing, it may be possible to reduce the time and effort of common evidence surveillance tasks such as keeping trial registries up to date and conducting systematic reviews.
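    The reported precision, recall, and F1 values follow directly from the citation counts given in the abstract; here is a small Python check, in which the false-positive count is back-calculated from the stated 97.7% precision (an assumption on our part):

      # Reproduce the evaluation metrics reported above for automatic citation
      # snowballing, using the counts from the abstract: 949 citations in total,
      # 740 of them present in MAS, 633 correctly identified.
      def precision_recall_f1(tp, fp, fn):
          precision = tp / (tp + fp)
          recall = tp / (tp + fn)
          f1 = 2 * precision * recall / (precision + recall)
          return precision, recall, f1

      tp = 633
      fp = round(tp / 0.977) - tp          # about 15 false positives implied by 97.7% precision
      p, r_included, f1_included = precision_recall_f1(tp, fp, fn=949 - tp)
      _, r_in_mas, f1_in_mas = precision_recall_f1(tp, fp, fn=740 - tp)

      print(f"precision={p:.1%}  recall(included)={r_included:.1%}  F1={f1_included:.1%}")
      print(f"recall(in MAS)={r_in_mas:.1%}  F1={f1_in_mas:.1%}")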

  18. Experiments in Multi-Lingual Information Retrieval.

    ERIC Educational Resources Information Center

    Salton, Gerard

    A comparison was made of the performance in an automatic information retrieval environment of user queries and document abstracts available in natural language form in both English and French. The results obtained indicate that the automatic indexing and retrieval techniques actually used appear equally effective in handling the query and document…

  19. Development of a beam builder for automatic fabrication of large composite space structures

    NASA Technical Reports Server (NTRS)

    Bodle, J. G.

    1979-01-01

    The composite material beam builder, which will produce triangular beams from pre-consolidated graphite/glass/thermoplastic composite material through automated mechanical processes, is presented. Side member storage, feed and positioning, ultrasonic welding, and beam cutoff are performed as discrete processes. Each process lends itself to modular subsystem development. Initial development is concentrated on the key processes for roll forming and ultrasonic welding composite thermoplastic materials. The construction and test of an experimental roll forming machine and ultrasonic welding process control techniques are described.

  20. Launch Processing System. [for Space Shuttle

    NASA Technical Reports Server (NTRS)

    Byrne, F.; Doolittle, G. V.; Hockenberger, R. W.

    1976-01-01

    This paper presents a functional description of the Launch Processing System, which provides automatic ground checkout and control of the Space Shuttle launch site and airborne systems, with emphasis placed on the Checkout, Control, and Monitor Subsystem. Hardware and software modular design concepts for the distributed computer system are reviewed relative to performing system tests, launch operations control, and status monitoring during ground operations. The communication network design, which uses a Common Data Buffer interface to all computers to allow computer-to-computer communication, is discussed in detail.

  1. HAL/S - The programming language for Shuttle

    NASA Technical Reports Server (NTRS)

    Martin, F. H.

    1974-01-01

    HAL/S is a higher order language and system, now operational, adopted by NASA for programming Space Shuttle on-board software. Program reliability is enhanced through language clarity and readability, modularity through program structure, and protection of code and data. Salient features of HAL/S include output orientation, automatic checking (with strictly enforced compiler rules), the availability of linear algebra, real-time control, a statement-level simulator, and compiler transferability (for applying HAL/S to additional object and host computers). The compiler is described briefly.

  2. Construct Abstraction for Automatic Information Abstraction from Digital Images

    DTIC Science & Technology

    2006-05-30

    objects and features and the names of objects of objects and features. For example, in Figure 15 the parts of the fish could be named the 'mouth'… fish-1, fish-2, fish-3, tennis shoe, tennis racquet… of abstraction and generality. For example, an algorithm might usefully find a polygon (blob) in an image and calculate numbers such as the…

  3. An object-oriented forest landscape model and its representation of tree species

    Treesearch

    Hong S. He; David J. Mladenoff; Joel Boeder

    1999-01-01

    LANDIS is a forest landscape model that simulates the interaction of large landscape processes and forest successional dynamics at tree species level. We discuss how object-oriented design (OOD) approaches such as modularity, abstraction and encapsulation are integrated into the design of LANDIS. We show that using OOD approaches, model decisions (olden as model...

  4. Delivering Advanced Methods in Mathematical Programming to Students of All Disciplines Using Abstraction, Modularity and Open-Ended Assignments

    ERIC Educational Resources Information Center

    Ezra, Elishai; Nahmias, Yaakov

    2015-01-01

    The advent of integrated multidisciplinary research has given rise to some of the most important breakthroughs of our time, but has also set significant challenges to the current educational paradigm. Current academic education often limits cross-discipline discussion, depends on close-ended problems, and restricts utilization of interdisciplinary…

  5. A study of an arbiter function in the structures of a shared bus

    NASA Astrophysics Data System (ADS)

    Seck, J.-P.

    The results of a comparative study of synchronous and asynchronous arbiters for managing user access to a shared bus are presented. The best available method is determined to be modular arbiter structures attached only to the decision module. Linear and circular arbitration strategies are examined for suitability for automatic decision-making. A multiple-strategies arbiter scheme is devised, involving the superposition of various strategies of one sequential machine into another. It is then possible to modify the strategy on-line if the current strategy is ineffective. The utilization of a multiple structure of cascading arbiter devices is noted to be effective if response time is not a critical matter. Finally, attention is given to automatic circuit testing and fault detection. An example is furnished in terms of a management system for a shared memory in a multimicroprocessor structure.

  6. ELSA: An integrated, semi-automated nebular abundance package

    NASA Astrophysics Data System (ADS)

    Johnson, Matthew D.; Levitt, Jesse S.; Henry, Richard B. C.; Kwitter, Karen B.

    We present ELSA, a new modular software package, written in C, to analyze and manage spectroscopic data from emission-line objects. In addition to calculating plasma diagnostics and abundances from nebular emission lines, the software provides a number of convenient features including the ability to ingest logs produced by IRAF's splot task, to semi-automatically merge spectra in different wavelength ranges, and to automatically generate various data tables in machine-readable or LaTeX format. ELSA features a highly sophisticated interstellar reddening correction scheme that takes into account temperature and density effects as well as He II contamination of the hydrogen Balmer lines. Abundance calculations are performed using a 5-level atom approximation with recent atomic data, based on R. Henry's ABUN program. Downloading and detailed documentation for all aspects of ELSA are available at the following URL:

  7. Alleviating Search Uncertainty through Concept Associations: Automatic Indexing, Co-Occurrence Analysis, and Parallel Computing.

    ERIC Educational Resources Information Center

    Chen, Hsinchun; Martinez, Joanne; Kirchhoff, Amy; Ng, Tobun D.; Schatz, Bruce R.

    1998-01-01

    Grounded on object filtering, automatic indexing, and co-occurrence analysis, an experiment was performed using a parallel supercomputer to analyze over 400,000 abstracts in an INSPEC computer engineering collection. A user evaluation revealed that system-generated thesauri were better than the human-generated INSPEC subject thesaurus in concept…

  8. Modular rate laws for enzymatic reactions: thermodynamics, elasticities and implementation.

    PubMed

    Liebermeister, Wolfram; Uhlendorf, Jannis; Klipp, Edda

    2010-06-15

    Standard rate laws are a key requisite for systematically turning metabolic networks into kinetic models. They should provide simple, general and biochemically plausible formulae for reaction velocities and reaction elasticities. At the same time, they need to respect thermodynamic relations between the kinetic constants and the metabolic fluxes and concentrations. We present a family of reversible rate laws for reactions with arbitrary stoichiometries and various types of regulation, including mass-action, Michaelis-Menten and uni-uni reversible Hill kinetics as special cases. With a thermodynamically safe parameterization of these rate laws, parameter sets obtained by model fitting, sampling or optimization are guaranteed to lead to consistent chemical equilibrium states. A reformulation using saturation values yields simple formulae for rates and elasticities, which can be easily adjusted to the given stationary flux distributions. Furthermore, this formulation highlights the role of chemical potential differences as thermodynamic driving forces. We compare the modular rate laws to the thermodynamic-kinetic modelling formalism and discuss a simplified rate law in which the reaction rate directly depends on the reaction affinity. For automatic handling of modular rate laws, we propose a standard syntax and semantic annotations for the Systems Biology Markup Language. An online tool for inserting the rate laws into SBML models is freely available at www.semanticsbml.org. Supplementary data are available at Bioinformatics online.
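    As a sketch of the kind of reversible rate law this family covers, the following Python function implements a common modular form in which a uni-uni reaction with unit stoichiometry reduces to reversible Michaelis-Menten kinetics; the parameter names and example values are illustrative, and this is not the authors' SBML tooling:

      from math import prod

      def common_modular_rate(u, kcat_plus, kcat_minus, s, p, Ks, Kp, ms, mp):
          """Reversible 'common modular' rate law (illustrative parameter names).

          u                     enzyme level
          kcat_plus/kcat_minus  forward/backward turnover rates
          s, p                  substrate and product concentrations (lists)
          Ks, Kp                corresponding reactant constants (lists)
          ms, mp                stoichiometric coefficients used as exponents (lists)
          """
          num = (kcat_plus  * prod((si / ki) ** m for si, ki, m in zip(s, Ks, ms))
               - kcat_minus * prod((pj / kj) ** m for pj, kj, m in zip(p, Kp, mp)))
          den = (prod((1 + si / ki) ** m for si, ki, m in zip(s, Ks, ms))
               + prod((1 + pj / kj) ** m for pj, kj, m in zip(p, Kp, mp)) - 1)
          return u * num / den

      # Uni-uni reaction with unit stoichiometry reduces to reversible Michaelis-Menten:
      v = common_modular_rate(u=1.0, kcat_plus=10.0, kcat_minus=1.0,
                              s=[2.0], p=[0.5], Ks=[1.0], Kp=[1.0], ms=[1], mp=[1])
      print(v)   # (10*2 - 1*0.5) / (1 + 2 + 0.5) = 5.57...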

  9. AUTOMATIC DETECTION OF VEGETATION CHANGES IN THE SOUTHWESTERN UNITED STATES USING REMOTELY SENSED IMAGES. (R825152)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  10. Automatic Evidence Retrieval for Systematic Reviews

    PubMed Central

    Choong, Miew Keen; Galgani, Filippo; Dunn, Adam G

    2014-01-01

    Background Snowballing involves recursively pursuing relevant references cited in the retrieved literature and adding them to the search results. Snowballing is an alternative approach to discover additional evidence that was not retrieved through conventional search. Snowballing’s effectiveness makes it best practice in systematic reviews despite being time-consuming and tedious. Objective Our goal was to evaluate an automatic method for citation snowballing’s capacity to identify and retrieve the full text and/or abstracts of cited articles. Methods Using 20 review articles that contained 949 citations to journal or conference articles, we manually searched Microsoft Academic Search (MAS) and identified 78.0% (740/949) of the cited articles that were present in the database. We compared the performance of the automatic citation snowballing method against the results of this manual search, measuring precision, recall, and F1 score. Results The automatic method was able to correctly identify 633 (as proportion of included citations: recall=66.7%, F1 score=79.3%; as proportion of citations in MAS: recall=85.5%, F1 score=91.2%) of citations with high precision (97.7%), and retrieved the full text or abstract for 490 (recall=82.9%, precision=92.1%, F1 score=87.3%) of the 633 correctly retrieved citations. Conclusions The proposed method for automatic citation snowballing is accurate and is capable of obtaining the full texts or abstracts for a substantial proportion of the scholarly citations in review articles. By automating the process of citation snowballing, it may be possible to reduce the time and effort of common evidence surveillance tasks such as keeping trial registries up to date and conducting systematic reviews. PMID:25274020

  11. An extensible six-step methodology to automatically generate fuzzy DSSs for diagnostic applications.

    PubMed

    d'Acierno, Antonio; Esposito, Massimo; De Pietro, Giuseppe

    2013-01-01

    The diagnosis of many diseases can be often formulated as a decision problem; uncertainty affects these problems so that many computerized Diagnostic Decision Support Systems (in the following, DDSSs) have been developed to aid the physician in interpreting clinical data and thus to improve the quality of the whole process. Fuzzy logic, a well established attempt at the formalization and mechanization of human capabilities in reasoning and deciding with noisy information, can be profitably used. Recently, we informally proposed a general methodology to automatically build DDSSs on the top of fuzzy knowledge extracted from data. We carefully refine and formalize our methodology that includes six stages, where the first three stages work with crisp rules, whereas the last three ones are employed on fuzzy models. Its strength relies on its generality and modularity since it supports the integration of alternative techniques in each of its stages. The methodology is designed and implemented in the form of a modular and portable software architecture according to a component-based approach. The architecture is deeply described and a summary inspection of the main components in terms of UML diagrams is outlined as well. A first implementation of the architecture has been then realized in Java following the object-oriented paradigm and used to instantiate a DDSS example aimed at accurately diagnosing breast masses as a proof of concept. The results prove the feasibility of the whole methodology implemented in terms of the architecture proposed.

  12. An extensible six-step methodology to automatically generate fuzzy DSSs for diagnostic applications

    PubMed Central

    2013-01-01

    Background The diagnosis of many diseases can be often formulated as a decision problem; uncertainty affects these problems so that many computerized Diagnostic Decision Support Systems (in the following, DDSSs) have been developed to aid the physician in interpreting clinical data and thus to improve the quality of the whole process. Fuzzy logic, a well established attempt at the formalization and mechanization of human capabilities in reasoning and deciding with noisy information, can be profitably used. Recently, we informally proposed a general methodology to automatically build DDSSs on the top of fuzzy knowledge extracted from data. Methods We carefully refine and formalize our methodology that includes six stages, where the first three stages work with crisp rules, whereas the last three ones are employed on fuzzy models. Its strength relies on its generality and modularity since it supports the integration of alternative techniques in each of its stages. Results The methodology is designed and implemented in the form of a modular and portable software architecture according to a component-based approach. The architecture is deeply described and a summary inspection of the main components in terms of UML diagrams is outlined as well. A first implementation of the architecture has been then realized in Java following the object-oriented paradigm and used to instantiate a DDSS example aimed at accurately diagnosing breast masses as a proof of concept. Conclusions The results prove the feasibility of the whole methodology implemented in terms of the architecture proposed. PMID:23368970

  13. Service Without Servers

    DTIC Science & Technology

    1993-08-01

    Abstract: We propose a new style of operating system architecture appropriate for microkernel-based operating systems: services are implemented as a… retaining all the modularity advantages of microkernel technology. Since services reside in libraries, an application is free to use the library that… Keywords: Operating Systems, Microkernel, Network communication, File organization. 1. Introduction: In the…

  14. Wireless Sensor Networks--A Hands-On Modular Experiments Platform for Enhanced Pedagogical Learning

    ERIC Educational Resources Information Center

    Taslidere, E.; Cohen, F. S.; Reisman, F. K.

    2011-01-01

    This paper presents the use of wireless sensor networks (WSNs) in educational research as a platform for enhanced pedagogical learning. The aim here with the use of a WSN platform was to go beyond the implementation stage to the real-life application stage, i.e., linking the implementation to real-life applications, where abstract theory and…

  15. Proceedings of the Workshop on software tools for distributed intelligent control systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herget, C.J.

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation, identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation, formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools, and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  16. The OPEnSampler: A Low-Cost, Low-Weight, Customizable and Modular Open Source 24-Unit Automatic Water Sampler

    NASA Astrophysics Data System (ADS)

    Nelke, M.; Selker, J. S.; Udell, C.

    2017-12-01

    Reliable automatic water samplers allow repetitive sampling of various water sources over long periods of time without requiring a researcher on site, reducing human error as well as the monetary and time costs of traveling to the field, particularly when the scale of the sample period is hours or days. The high fixed cost of buying a commercial sampler with little customizability can be a barrier to research requiring repetitive samples, such as the analysis of septic water pre- and post-treatment. DIY automatic samplers proposed in the past sacrifice maximum volume, customizability, or scope of applications, among other features, in exchange for a lower net cost. The purpose of this project was to develop a low-cost, highly customizable, robust water sampler that is capable of sampling many sources of water for various analytes. A lightweight aluminum-extrusion frame was designed and assembled, chosen for its mounting system, strength, and low cost. Water is drawn from two peristaltic pumps through silicone tubing and directed into 24 foil-lined 250mL bags using solenoid valves. A programmable Arduino Uno microcontroller connected to a circuit board communicates with a battery operated real-time clock, initiating sampling stages. Period and volume settings are programmable in-field by the user via serial commands. The OPEnSampler is an open design, allowing the user to decide what components to use and the modular theme of the frame allows fast mounting of new manufactured or 3D printed components. The 24-bag system weighs less than 10kg and the material cost is under $450. Up to 6L of sample water can be drawn at a rate of 100mL/minute in either direction. Faster flowrates are achieved by using more powerful peristaltic pumps. Future design changes could allow a greater maximum volume by filling the unused space with more containers and adding GSM communications to send real time status information.
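    The sampling logic described (a fixed fill volume drawn into each of 24 bags at a user-set period) can be summarized as a simple schedule; in the hedged Python sketch below, the pump rate is taken from the abstract but the hardware-access functions, valve handling, and timing are hypothetical placeholders (the real device drives solenoid valves and peristaltic pumps from an Arduino Uno):

      import time

      NUM_BAGS = 24
      PUMP_RATE_ML_PER_MIN = 100.0                 # draw rate reported in the abstract

      # Hypothetical hardware-access stubs; the real firmware toggles solenoid valves
      # and peristaltic pumps and waits on a battery-backed real-time clock alarm.
      def open_valve(bag):  print(f"open valve {bag}")
      def close_valve(bag): print(f"close valve {bag}")
      def run_pump(seconds): print(f"run pump for {seconds:.0f} s")
      def wait_for_next_alarm(period_s): time.sleep(0)

      def run_sampler(period_s=3600, volume_ml=250.0):
          """Fill one bag every period_s seconds with volume_ml of sample."""
          pump_seconds = volume_ml / PUMP_RATE_ML_PER_MIN * 60.0
          for bag in range(NUM_BAGS):
              open_valve(bag)
              run_pump(pump_seconds)
              close_valve(bag)
              wait_for_next_alarm(period_s)

      run_sampler()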

  17. AMBIENT AIR POLLUTION AND ARRHYTHMIC EVENTS IN PATIENTS WITH AUTOMATIC IMPLANTABLE CARDIOVERTER DEFIBRILLATORS, ATLANTA, 1993-2000 (ARIES/SOPHIA). (R829213)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  18. Automatic identification of abstract online groups

    DOEpatents

    Engel, David W; Gregory, Michelle L; Bell, Eric B; Cowell, Andrew J; Piatt, Andrew W

    2014-04-15

    Online abstract groups, in which members aren't explicitly connected, can be automatically identified by computer-implemented methods. The methods involve harvesting records from social media and extracting content-based and structure-based features from each record. Each record includes a social-media posting and is associated with one or more entities. Each feature is stored on a data storage device and includes a computer-readable representation of an attribute of one or more records. The methods further involve grouping records into record groups according to the features of each record. Further still the methods involve calculating an n-dimensional surface representing each record group and defining an outlier as a record having feature-based distances measured from every n-dimensional surface that exceed a threshold value. Each of the n-dimensional surfaces is described by a footprint that characterizes the respective record group as an online abstract group.
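    A hedged sketch of the grouping-and-outlier logic described in the claims: cluster the per-record feature vectors, characterize each group by a simple footprint (here a centroid plus a mean radius, standing in for the n-dimensional surface), and flag records whose distance from every footprint exceeds a threshold. The function names, use of k-means, and threshold rule are illustrative assumptions, not the patented method.

      import numpy as np
      from sklearn.cluster import KMeans

      def group_and_flag_outliers(features, n_groups=3, threshold=2.0):
          """features: (n_records, n_features) array of content/structure features."""
          km = KMeans(n_clusters=n_groups, n_init=10).fit(features)
          centroids = km.cluster_centers_
          # Footprint of each group: centroid plus mean member distance (a crude
          # stand-in for the n-dimensional surface described in the patent).
          radii = np.array([
              np.linalg.norm(features[km.labels_ == g] - centroids[g], axis=1).mean()
              for g in range(n_groups)
          ])
          # Outlier: distance to every group footprint exceeds threshold * radius.
          dists = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=2)
          outliers = np.all(dists > threshold * radii, axis=1)
          return km.labels_, outliers

      labels, outliers = group_and_flag_outliers(np.random.rand(200, 5))
      print(labels[:10], int(outliers.sum()))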

  19. Distinguishing Man from Molecules: The Distinctiveness of Medical Concepts at Different Levels of Description

    PubMed Central

    Cole, William G.; Michael, Patricia; Blois, Marsden S.

    1987-01-01

    A computer program was created to use information about the statistical distribution of words in journal abstracts to make probabilistic judgments about the level of description (e.g. molecular, cell, organ) of medical text. Statistical analysis of 7,409 journal abstracts taken from three medical journals representing distinct levels of description revealed that many medical words seem to be highly specific to one or another level of description. For example, the word adrenoreceptors occurred only in the American Journal of Physiology, never in the Journal of Biological Chemistry or in the Journal of the American Medical Association. Such highly specific words occurred so frequently that the automatic classification program was able to classify correctly 45 out of 45 test abstracts, with 100% confidence. These findings are interpreted in terms of both a theory of the structure of medical knowledge and the pragmatics of automatic classification.
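    The classification strategy described (scoring a text by how specific its words are to each journal, i.e. each level of description) is close in spirit to a naive Bayes text classifier; here is a minimal Python sketch under that assumption, with toy training data standing in for the 7,409 abstracts:

      from collections import Counter
      import math

      # Toy word-frequency profiles per level of description; the training texts and
      # levels are illustrative placeholders, not the original corpus.
      train = {
          "molecular": ["kinase atp binding phosphorylation", "receptor ligand affinity"],
          "organ":     ["cardiac output blood pressure", "renal perfusion adrenoreceptors"],
      }

      counts = {lvl: Counter(w for doc in docs for w in doc.split())
                for lvl, docs in train.items()}
      vocab = {w for c in counts.values() for w in c}

      def classify(abstract):
          scores = {}
          for lvl, c in counts.items():
              total = sum(c.values())
              # Laplace-smoothed log-likelihood of the abstract under each level.
              scores[lvl] = sum(math.log((c[w] + 1) / (total + len(vocab)))
                                for w in abstract.split())
          return max(scores, key=scores.get)

      print(classify("adrenoreceptors and blood pressure response"))   # -> "organ"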

  20. Bifrost: a Modular Python/C++ Framework for Development of High-Throughput Data Analysis Pipelines

    NASA Astrophysics Data System (ADS)

    Cranmer, Miles; Barsdell, Benjamin R.; Price, Danny C.; Garsden, Hugh; Taylor, Gregory B.; Dowell, Jayce; Schinzel, Frank; Costa, Timothy; Greenhill, Lincoln J.

    2017-01-01

    Large radio interferometers have data rates that render long-term storage of raw correlator data infeasible, thus motivating development of real-time processing software. For high-throughput applications, processing pipelines are challenging to design and implement. Motivated by science efforts with the Long Wavelength Array, we have developed Bifrost, a novel Python/C++ framework that eases the development of high-throughput data analysis software by packaging algorithms as black box processes in a directed graph. This strategy to modularize code allows astronomers to create parallelism without code adjustment. Bifrost uses CPU/GPU ’circular memory’ data buffers that enable ready introduction of arbitrary functions into the processing path for ’streams’ of data, and allow pipelines to automatically reconfigure in response to astrophysical transient detection or input of new observing settings. We have deployed and tested Bifrost at the latest Long Wavelength Array station, in Sevilleta National Wildlife Refuge, NM, where it handles throughput exceeding 10 Gbps per CPU core.
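    The packaging of algorithms as black-box blocks in a directed graph can be illustrated with a toy stream pipeline; the Python sketch below is not the Bifrost API, and bounded queues stand in for its CPU/GPU circular memory buffers:

      import threading, queue

      # Toy stream pipeline: each block runs in its own thread and passes data
      # downstream through a bounded queue (a stand-in for circular memory buffers).
      def source(out_q):
          for i in range(5):
              out_q.put(list(range(i, i + 4)))      # pretend these are data frames
          out_q.put(None)                           # end-of-stream marker

      def scale(in_q, out_q, factor=2):
          while (frame := in_q.get()) is not None:
              out_q.put([x * factor for x in frame])
          out_q.put(None)

      def sink(in_q):
          while (frame := in_q.get()) is not None:
              print("processed frame:", frame)

      q1, q2 = queue.Queue(maxsize=4), queue.Queue(maxsize=4)
      threads = [threading.Thread(target=source, args=(q1,)),
                 threading.Thread(target=scale, args=(q1, q2)),
                 threading.Thread(target=sink, args=(q2,))]
      for t in threads: t.start()
      for t in threads: t.join()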

  1. SMITHERS: An object-oriented modular mapping methodology for MCNP-based neutronic–thermal hydraulic multiphysics

    DOE PAGES

    Richard, Joshua; Galloway, Jack; Fensin, Michael; ...

    2015-04-04

    A novel object-oriented modular mapping methodology for externally coupled neutronics–thermal hydraulics multiphysics simulations was developed. The Simulator using MCNP with Integrated Thermal-Hydraulics for Exploratory Reactor Studies (SMITHERS) code performs on-the-fly mapping of material-wise power distribution tallies implemented by MCNP-based neutron transport/depletion solvers for use in estimating coolant temperature and density distributions with a separate thermal-hydraulic solver. The key development of SMITHERS is that it reconstructs the hierarchical geometry structure of the material-wise power generation tallies from the depletion solver automatically, with only a modicum of additional information required from the user. In addition, it performs the basis mapping from the combinatorial geometry of the depletion solver to the required geometry of the thermal-hydraulic solver in a generalizable manner, such that it can transparently accommodate varying levels of thermal-hydraulic solver geometric fidelity, from the nodal geometry of multi-channel analysis solvers to the pin-cell level of discretization for sub-channel analysis solvers.

  2. Run Environment and Data Management for Earth System Models

    NASA Astrophysics Data System (ADS)

    Widmann, H.; Lautenschlager, M.; Fast, I.; Legutke, S.

    2009-04-01

    The Integrating Model and Data Infrastructure (IMDI) developed and maintained by the Model and Data Group (M&D) comprises the Standard Compile Environment (SCE) and the Standard Run Environment (SRE). The IMDI software has a modular design, which allows a suite of model components to be combined and coupled, and the resulting tasks to be executed independently on various platforms. Furthermore, the modular structure enables extension to new model combinations and new platforms. The SRE presented here enables the configuration and performance of earth system model experiments from model integration up to storage and visualization of data. We focus on recently implemented tasks such as synchronous database filling, graphical monitoring, and automatic generation of metadata in XML forms during run time. We also address the capability to run experiments in heterogeneous IT environments with different computing systems for model integration, data processing and storage. These features are demonstrated for model configurations and on platforms used in current or upcoming projects, e.g. MILLENNIUM or IPCC AR5.

  3. Gapless edges of 2d topological orders and enriched monoidal categories

    NASA Astrophysics Data System (ADS)

    Kong, Liang; Zheng, Hao

    2018-02-01

    In this work, we give a mathematical description of a chiral gapless edge of a 2d topological order (without symmetry). We show that the observables on the 1+1D world sheet of such an edge consist of a family of topological edge excitations, boundary CFT's and walls between boundary CFT's. These observables can be described by a chiral algebra and an enriched monoidal category. This mathematical description automatically includes that of gapped edges as special cases. Therefore, it gives a unified framework to study both gapped and gapless edges. Moreover, the boundary-bulk duality also holds for gapless edges. More precisely, the unitary modular tensor category that describes the 2d bulk phase is exactly the Drinfeld center of the enriched monoidal category that describes the gapless/gapped edge. We propose a classification of all gapped and chiral gapless edges of a given bulk phase. In the end, we explain how modular-invariant bulk rational conformal field theories naturally emerge on certain gapless walls between two trivial phases.

  4. TOPEX electrical power system

    NASA Technical Reports Server (NTRS)

    Chetty, P. R. K.; Roufberg, Lew; Costogue, Ernest

    1991-01-01

    The TOPEX mission requirements which impact the power requirements and analyses are presented. A description of the electrical power system (EPS), including energy management and battery charging methods that were conceived and developed to meet the identified satellite requirements, is included. Analysis of the TOPEX EPS confirms that all of its electrical performance and reliability requirements have been met. The TOPEX EPS employs the flight-proven modular power system (MPS) which is part of the Multimission Modular Spacecraft and provides high reliability, abbreviated development effort and schedule, and low cost. An energy balance equation, unique to TOPEX, has been derived to confirm that the batteries will be completely recharged following each eclipse, under worst-case conditions. TOPEX uses three NASA Standard 50AH Ni-Cd batteries, each with 22 cells in series. The MPS contains battery charge control and protection based on measurements of battery currents, voltages, temperatures, and computed depth-of-discharge. In case of impending battery depletion, the MPS automatically implements load shedding.

  5. Development of Integrated Modular Avionics Application Based on Simulink and XtratuM

    NASA Astrophysics Data System (ADS)

    Fons-Albert, Borja; Usach-Molina, Hector; Vila-Carbo, Joan; Crespo-Lorente, Alfons

    2013-08-01

    This paper presents an integral approach for designing avionics applications that meets the requirements for software development and execution in this application domain. Software design follows the Model-Based Design process and is performed in Simulink. This approach allows easy and quick testbench development and helps satisfy DO-178B requirements through the use of proper tools. The software execution platform is based on XtratuM, a minimal bare-metal hypervisor designed in our research group. XtratuM provides support for IMA-SP (Integrated Modular Avionics for Space) architectures. This approach allows the code generated from a Simulink model to be executed on top of Lithos as a XtratuM partition. Lithos is an ARINC-653-compliant RTOS for XtratuM. The paper concentrates on how to smoothly port Simulink designs to XtratuM, solving problems such as application partitioning, automatic code generation, real-time tasking, interfacing, and others. This process is illustrated with an autopilot design test using a flight simulator.

  6. Modular implementation of a digital hardware design automation system

    NASA Astrophysics Data System (ADS)

    Masud, M.

    An automation system based on AHPL (A Hardware Programming Language) was developed. The project may be divided into three distinct phases: (1) upgrading of AHPL to make it more universally applicable; (2) implementation of a compiler for the language; and (3) illustration of how the compiler may be used to support several phases of design activities. Several new features were added to AHPL. These include: application-dependent parameters, multiple clocks, asynchronous results, functional registers and primitive functions. The new language, called Universal AHPL, has been defined rigorously. The compiler design is modular. The parsing is done by an automatic parser generated from the SLR(1) BNF grammar of the language. The compiler produces two data bases from the AHPL description of a circuit. The first one is a tabular representation of the circuit, and the second one is a detailed interconnection linked list. The two data bases provide a means to interface the compiler to application-dependent CAD systems.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marchisio, Mario Andrea, E-mail: marchisio@hit.edu.cn

    Published in 2008, Parts & Pools represents one of the first attempts to conceptualize the modular design of bacterial synthetic gene circuits with Standard Biological Parts (DNA segments) and Pools of molecules referred to as common signal carriers (e.g., RNA polymerases and ribosomes). The original framework for modeling bacterial components and designing prokaryotic circuits evolved over the last years and led, first, to the development of an algorithm for the automatic design of Boolean gene circuits. This is a remarkable achievement since gene digital circuits have a broad range of applications that goes from biosensors for health and environment care to computational devices. More recently, Parts & Pools was enabled to give a proper formal description of eukaryotic biological circuit components. This was possible by employing a rule-based modeling approach, a technique that permits a faithful calculation of all the species and reactions involved in complex systems such as eukaryotic cells and compartments. In this way, Parts & Pools is currently suitable for the visual and modular design of synthetic gene circuits in yeast and mammalian cells too.

  8. Precipitation-runoff modeling system; user's manual

    USGS Publications Warehouse

    Leavesley, G.H.; Lichty, R.W.; Troutman, B.M.; Saindon, L.G.

    1983-01-01

    The concepts, structure, theoretical development, and data requirements of the precipitation-runoff modeling system (PRMS) are described. The precipitation-runoff modeling system is a modular-design, deterministic, distributed-parameter modeling system developed to evaluate the impacts of various combinations of precipitation, climate, and land use on streamflow, sediment yields, and general basin hydrology. Basin response to normal and extreme rainfall and snowmelt can be simulated to evaluate changes in water balance relationships, flow regimes, flood peaks and volumes, soil-water relationships, sediment yields, and groundwater recharge. Parameter-optimization and sensitivity analysis capabilities are provided to fit selected model parameters and evaluate their individual and joint effects on model output. The modular design provides a flexible framework for continued model system enhancement and hydrologic modeling research and development. (Author's abstract)

  9. Identification of Modules in Protein-Protein Interaction Networks

    NASA Astrophysics Data System (ADS)

    Erten, Sinan; Koyutürk, Mehmet

    In biological systems, most processes are carried out through orchestration of multiple interacting molecules. These interactions are often abstracted using network models. A key feature of cellular networks is their modularity, which contributes significantly to the robustness, as well as adaptability of biological systems. Therefore, modularization of cellular networks is likely to be useful in obtaining insights into the working principles of cellular systems, as well as building tractable models of cellular organization and dynamics. A common, high-throughput source of data on molecular interactions is in the form of physical interactions between proteins, which are organized into protein-protein interaction (PPI) networks. This chapter provides an overview on identification and analysis of functional modules in PPI networks, which has been an active area of research in the last decade.

  10. Argo: an integrative, interactive, text mining-based workbench supporting curation

    PubMed Central

    Rak, Rafal; Rowley, Andrew; Black, William; Ananiadou, Sophia

    2012-01-01

    Curation of biomedical literature is often supported by the automatic analysis of textual content that generally involves a sequence of individual processing components. Text mining (TM) has been used to enhance the process of manual biocuration, but has been focused on specific databases and tasks rather than an environment integrating TM tools into the curation pipeline, catering for a variety of tasks, types of information and applications. Processing components usually come from different sources and often lack interoperability. The well established Unstructured Information Management Architecture is a framework that addresses interoperability by defining common data structures and interfaces. However, most of the efforts are targeted towards software developers and are not suitable for curators, or are otherwise inconvenient to use on a higher level of abstraction. To overcome these issues we introduce Argo, an interoperable, integrative, interactive and collaborative system for text analysis with a convenient graphic user interface to ease the development of processing workflows and boost productivity in labour-intensive manual curation. Robust, scalable text analytics follow a modular approach, adopting component modules for distinct levels of text analysis. The user interface is available entirely through a web browser that saves the user from going through often complicated and platform-dependent installation procedures. Argo comes with a predefined set of processing components commonly used in text analysis, while giving the users the ability to deposit their own components. The system accommodates various areas and levels of user expertise, from TM and computational linguistics to ontology-based curation. One of the key functionalities of Argo is its ability to seamlessly incorporate user-interactive components, such as manual annotation editors, into otherwise completely automatic pipelines. As a use case, we demonstrate the functionality of an in-built manual annotation editor that is well suited for in-text corpus annotation tasks. Database URL: http://www.nactem.ac.uk/Argo PMID:22434844

  11. Large space erectable structures - building block structures study

    NASA Technical Reports Server (NTRS)

    Armstrong, W. H.; Skoumal, D. E.; Straayer, J. W.

    1977-01-01

    A modular planar truss structure and a long slender boom concept identified as building block approaches to construction of large spacecraft configurations are described. The concepts are compatible in weight and volume goals with the Space Transportation System, use standard structural units, and represent high on-orbit productivity in terms of structural area or beam length. Results of structural trade studies involving static and dynamic analyses of a single module and rigid body deployment analyses to assess kinetics and kinematics of automatic deployment of the building block modules are presented.

  12. Standardized mappings--a framework to combine different semantic mappers into a standardized web-API.

    PubMed

    Neuhaus, Philipp; Doods, Justin; Dugas, Martin

    2015-01-01

    Automatic coding of medical terms is an important, but highly complicated and laborious task. To compare and evaluate different strategies a framework with a standardized web-interface was created. Two UMLS mapping strategies are compared to demonstrate the interface. The framework is a Java Spring application running on a Tomcat application server. It accepts different parameters and returns results in JSON format. To demonstrate the framework, a list of medical data items was mapped by two different methods: similarity search in a large table of terminology codes versus search in a manually curated repository. These mappings were reviewed by a specialist. The evaluation shows that the framework is flexible (due to standardized interfaces like HTTP and JSON), performant and reliable. Accuracy of automatically assigned codes is limited (up to 40%). Combining different semantic mappers into a standardized Web-API is feasible. This framework can be easily enhanced due to its modular design.
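    A client of such a standardized mapping web-API might look like the following hedged sketch; the endpoint URL, parameter names, and response format are hypothetical, chosen only to illustrate the HTTP-plus-JSON interface style described above:

      import json
      import urllib.parse
      import urllib.request

      # Hypothetical client call: send a medical term plus the desired mapper strategy,
      # receive candidate terminology codes as JSON. URL and parameters are illustrative.
      def map_term(term, mapper="similarity", base_url="http://localhost:8080/map"):
          query = urllib.parse.urlencode({"term": term, "mapper": mapper})
          with urllib.request.urlopen(f"{base_url}?{query}") as resp:
              return json.loads(resp.read().decode("utf-8"))

      # Example (assumes such a server is running locally):
      # print(map_term("myocardial infarction"))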

  13. IRIS family of IRCCD thermal imagers integrating long-life cryogenic coolers, sophisticated algorithms for image enhancement, and hot points detection

    NASA Astrophysics Data System (ADS)

    Dupuy, Pascal; Harter, Jean

    1995-09-01

    IRIS is a modular infrared thermal imager developed by SAGEM since 1988, based on a 288 by 4 IRCCD detector. The first section of the presentation gives a description of the different modules of the IRIS thermal imager and their evolution in recent years. The second section covers the description of the major evolution, namely the integrated detector cooler assembly (IDCA), using a SOFRADIR 288 by 4 detector and a SAGEM microcooler, now integrated in the IRIS thermal imagers. The third section gives the description of two functions integrated in the IRIS thermal imager: (1) image enhancement, using a digital convolution filter, and (2) automatic hot points detection and tracking, offering assistance to surveillance and automatic detection. The last section presents several programs for navy, air force, and land applications for which IRIS has already been selected and achieved.

  14. Automatic Building Abstraction from Aerial Photogrammetry

    NASA Astrophysics Data System (ADS)

    Ley, A.; Hänsch, R.; Hellwich, O.

    2017-09-01

    Multi-view stereo has been shown to be a viable tool for the creation of realistic 3D city models. Nevertheless, it still poses significant challenges, since it results in dense but noisy and incomplete point clouds when applied to aerial images. 3D city modelling usually requires a different representation of the 3D scene than these point clouds. This paper applies a fully automatic pipeline to generate a simplified mesh from a given dense point cloud. The mesh provides a certain level of abstraction as it only consists of relatively large planar and textured surfaces. Thus, it is possible to remove noise, outliers, and clutter, while maintaining a high level of accuracy.
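    Extracting the relatively large planar surfaces mentioned above from a dense point cloud is commonly done with iterative plane fitting; the following RANSAC-style Python sketch shows the general technique only and is not the authors' pipeline:

      import numpy as np

      def ransac_plane(points, n_iters=200, dist_thresh=0.05, rng=np.random.default_rng(0)):
          """Fit a single dominant plane to an (N, 3) point cloud with RANSAC."""
          best_inliers = np.zeros(len(points), dtype=bool)
          for _ in range(n_iters):
              sample = points[rng.choice(len(points), 3, replace=False)]
              normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
              norm = np.linalg.norm(normal)
              if norm < 1e-9:
                  continue                      # degenerate (collinear) sample
              normal /= norm
              d = -normal @ sample[0]
              dist = np.abs(points @ normal + d)
              inliers = dist < dist_thresh
              if inliers.sum() > best_inliers.sum():
                  best_inliers = inliers
          return best_inliers

      # Toy cloud: a noisy z=0 plane plus scattered clutter points.
      rng = np.random.default_rng(1)
      plane = np.c_[rng.uniform(-1, 1, (500, 2)), rng.normal(0, 0.01, 500)]
      clutter = rng.uniform(-1, 1, (100, 3))
      cloud = np.vstack([plane, clutter])
      print("plane inliers found:", ransac_plane(cloud).sum())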

  15. Multi-terminology indexing for the assignment of MeSH descriptors to medical abstracts in French.

    PubMed

    Pereira, Suzanne; Sakji, Saoussen; Névéol, Aurélie; Kergourlay, Ivan; Kerdelhué, Gaétan; Serrot, Elisabeth; Joubert, Michel; Darmoni, Stéfan J

    2009-11-14

    To facilitate information retrieval in the biomedical domain, a system for the automatic assignment of Medical Subject Headings to documents curated by an online quality-controlled health gateway was implemented. The French Multi-Terminology Indexer (F-MTI) implements a multiterminology approach using nine main medical terminologies in French and the mappings between them. This paper presents recent efforts to assess the added value of (a) integrating four new terminologies (Orphanet, ATC, drug names, MeSH supplementary concepts) into F-MTI's knowledge sources and (b) performing the automatic indexing on the titles and abstracts (vs. titles only) of the online health resources. F-MTI was evaluated on a CISMeF corpus comprising 18,161 manually indexed resources. The performance of F-MTI including nine health terminologies on CISMeF resources with title only was 27.9% precision and 19.7% recall, while the performance on CISMeF resources with title and abstract was 14.9% precision (-13.0%) and 25.9% recall (+6.2%). In a few weeks, CISMeF will launch the indexing of resources based on title and abstract, using nine terminologies.

  16. Automatic real-time control of suspended sediment based upon high frequency in situ measurements of nephelometric turbidity

    Treesearch

    Jack Lewis; Rand Eads

    1998-01-01

    Abstract - For estimating suspended sediment concentration (SSC) in rivers, turbidity is potentially a much better predictor than water discharge. Since about 1990, it has been feasible to automatically collect high frequency turbidity data at remote sites using battery-powered turbidity probes that are properly mounted in the river or stream. With sensors calibrated...

  17. Speech Processing and Recognition (SPaRe)

    DTIC Science & Technology

    2011-01-01

    …results in the areas of automatic speech recognition (ASR), speech processing, machine translation (MT), natural language processing (NLP), and… Natural Language Processing (NLP), Information Retrieval (IR)… Figure 9, the IOC was only expected to provide document submission and search; automatic speech recognition (ASR) for English, Spanish, Arabic, and…

  18. IMp: The customizable LEGO® Pinned Insect Manipulator

    PubMed Central

    Dupont, Steen; Price, Benjamin; Blagoderov, Vladimir

    2015-01-01

    Abstract We present a pinned insect manipulator (IMp) constructed of LEGO® building bricks with two axes of movement and two axes of rotation. In addition we present three variants of the IMp to emphasise the modular design, which facilitates resizing to meet the full range of pinned insect specimens, is fully customizable, collapsible, affordable and does not require specialist tools or knowledge to assemble. PMID:25685035

  19. Identification of an Adaptable Computer Program Design for Analyzing a Modular Organizational Assessment Instrument.

    DTIC Science & Technology

    1981-09-01

    …Survey-guided development, Organizational effectiveness, Computer program, Organizational diagnosis, Management… Army. Doctoral dissertation, Purdue University, December 1977. (DTIC AD-A059-542) Bowers, D. G. Organizational diagnosis: A review and a proposed method… G. E. Comparative issues and methods in organizational diagnosis. Ann Arbor, MI: Institute for Social Research, University of Michigan, November 1977.

  20. Stability Analysis of Distributed Engine Control Systems Under Communication Packet Drop (Postprint)

    DTIC Science & Technology

    2008-07-01

    …Currently, Full Authority Digital Engine Control (FADEC) based on a centralized architecture framework is being widely used for gas turbine engine control. However, current FADEC is not able to meet the… system (DEC). FADEC based on Distributed Control Systems (DCS) offers modularity, improved control systems prognostics and fault tolerance along with…

  1. Status, Vision, and Challenges of an Intelligent Distributed Engine Control Architecture (Postprint)

    DTIC Science & Technology

    2007-09-18

    …turbine engine control, engine health management, FADEC, Universal FADEC, Distributed Controls, UF, UF Platform, common FADEC, Generic FADEC, Modular FADEC, Adaptive Control… Eventually the Full Authority Digital Electronic Control (FADEC) became the norm. Presently, this control system architecture accounts for 15 to 20% of…

  2. Software Voting in Asynchronous NMR (N-Modular Redundancy) Computer Structures.

    DTIC Science & Technology

    1983-05-06

    …added reliability is exchanged for increased system cost and decreased throughput. Some applications require extremely reliable systems, so the only… not the other way around. Although no systems provide abstract voting yet, as more applications are written for NMR systems, the programmers are going… throughput goes down, the overhead goes up. Mathematically: Overhead = Nonredundant Throughput − Actual Throughput (1). In this section, the actual throughput…

  3. Automatic decomposition of kinetic models of signaling networks minimizing the retroactivity among modules.

    PubMed

    Saez-Rodriguez, Julio; Gayer, Stefan; Ginkel, Martin; Gilles, Ernst Dieter

    2008-08-15

    The modularity of biochemical networks in general, and signaling networks in particular, has been extensively studied over the past few years. It has been proposed to be a useful property to analyze signaling networks: by decomposing the network into subsystems, more manageable units are obtained that are easier to analyze. While many powerful algorithms are available to identify modules in protein interaction networks, less attention has been paid to signaling networks defined as chemical systems. Such a decomposition would be very useful as most quantitative models are defined using the latter, more detailed formalism. Here, we introduce a novel method to decompose biochemical networks into modules so that the bidirectional (retroactive) couplings among the modules are minimized. Our approach adapts a method to detect community structures, and applies it to the so-called retroactivity matrix that characterizes the couplings of the network. Only the structure of the network, e.g. in SBML format, is required. Furthermore, the modularized models can be loaded into ProMoT, a modeling tool which supports modular modeling. This allows visualization of the models, exploiting their modularity and easy generation of models of one or several modules for further analysis. The method is applied to several relevant cases, including an entangled model of the EGF-induced MAPK cascade and a comprehensive model of EGF signaling, demonstrating its ability to uncover meaningful modules. Our approach can thus help to analyze large networks, especially when little a priori knowledge on the structure of the network is available. The decomposition algorithms implemented in MATLAB (Mathworks, Inc.) are freely available upon request. ProMoT is freely available at http://www.mpi-magdeburg.mpg.de/projects/promot. Supplementary data are available at Bioinformatics online.
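    The decomposition step (community detection applied to the retroactivity matrix) can be sketched with a generic modularity-based community algorithm; the matrix below is illustrative, and this Python sketch is not the authors' MATLAB implementation:

      import numpy as np
      import networkx as nx
      from networkx.algorithms.community import greedy_modularity_communities

      # Illustrative symmetric retroactivity matrix: entry (i, j) quantifies the
      # bidirectional coupling between network components i and j.
      R = np.array([[0, 3, 2, 0, 0],
                    [3, 0, 4, 0, 1],
                    [2, 4, 0, 0, 0],
                    [0, 0, 0, 0, 5],
                    [0, 1, 0, 5, 0]], dtype=float)

      G = nx.from_numpy_array(R)                 # weighted graph from the coupling matrix
      modules = greedy_modularity_communities(G, weight="weight")
      print([sorted(m) for m in modules])        # e.g. [[0, 1, 2], [3, 4]]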

  4. Modular programming for tuberculosis control, the "AuTuMN" platform.

    PubMed

    Trauer, James McCracken; Ragonnet, Romain; Doan, Tan Nhut; McBryde, Emma Sue

    2017-08-07

    Tuberculosis (TB) is now the world's leading infectious killer and major programmatic advances will be needed if we are to meet the ambitious new End TB Targets. Although mathematical models are powerful tools for TB control, such models must be flexible enough to capture the complexity and heterogeneity of the global TB epidemic. This includes simulating a disease that affects age groups and other risk groups differently, has varying levels of infectiousness depending upon the organ involved and varying outcomes from treatment depending on the drug resistance pattern of the infecting strain. We adopted sound basic principles of software engineering to develop a modular software platform for simulation of TB control interventions ("AuTuMN"). These included object-oriented programming, logical linkage between modules and consistency of code syntax and variable naming. The underlying transmission dynamic model incorporates optional stratification by age, risk group, strain and organ involvement, while our approach to simulating time-variant programmatic parameters better captures the historical progression of the epidemic. An economic model is overlaid upon this epidemiological model which facilitates comparison between new and existing technologies. A "Model runner" module allows for predictions of future disease burden trajectories under alternative scenario situations, as well as uncertainty, automatic calibration, cost-effectiveness and optimisation. The model has now been used to guide TB control strategies across a range of settings and countries, with our modular approach enabling repeated application of the tool without the need for extensive modification for each application. The modular construction of the platform minimises errors, enhances readability and collaboration between multiple programmers and enables rapid adaptation to answer questions in a broad range of contexts without the need for extensive re-programming. Such features are particularly important in simulating an epidemic as complex and diverse as TB.

  5. Reduced modeling of signal transduction – a modular approach

    PubMed Central

    Koschorreck, Markus; Conzelmann, Holger; Ebert, Sybille; Ederer, Michael; Gilles, Ernst Dieter

    2007-01-01

    Background Combinatorial complexity is a challenging problem in detailed and mechanistic mathematical modeling of signal transduction. This subject has been discussed intensively and a lot of progress has been made within the last few years. A software tool (BioNetGen) was developed which allows an automatic rule-based set-up of mechanistic model equations. In many cases these models can be reduced by an exact domain-oriented lumping technique. However, the resulting models can still consist of a very large number of differential equations. Results We introduce a new reduction technique, which allows building modularized and highly reduced models. Compared to existing approaches further reduction of signal transduction networks is possible. The method also provides a new modularization criterion, which allows the model to be dissected into smaller modules, called layers, that can be modeled independently. Hallmarks of the approach are conservation relations within each layer and connection of layers by signal flows instead of mass flows. The reduced model can be formulated directly without previous generation of detailed model equations. It can be understood and interpreted intuitively, as model variables are macroscopic quantities that are converted by rates following simple kinetics. The proposed technique is applicable without using complex mathematical tools and even without detailed knowledge of the mathematical background. However, we provide a detailed mathematical analysis to show performance and limitations of the method. For physiologically relevant parameter domains the transient as well as the stationary errors caused by the reduction are negligible. Conclusion The new layer based reduced modeling method allows building modularized and strongly reduced models of signal transduction networks. Reduced model equations can be directly formulated and are intuitively interpretable. Additionally, the method provides very good approximations especially for macroscopic variables. It can be combined with existing reduction methods without any difficulties. PMID:17854494

  6. A Modular Low-Complexity ECG Delineation Algorithm for Real-Time Embedded Systems.

    PubMed

    Bote, Jose Manuel; Recas, Joaquin; Rincon, Francisco; Atienza, David; Hermida, Roman

    2018-03-01

    This work presents a new modular and low-complexity algorithm for the delineation of the different ECG waves (QRS, P and T peaks, onsets, and ends). Involving a reduced number of operations per second and having a small memory footprint, this algorithm is intended to perform real-time delineation on resource-constrained embedded systems. The modular design allows the algorithm to automatically adjust the delineation quality in runtime to a wide range of modes and sampling rates, from an ultralow-power mode when no arrhythmia is detected, in which the ECG is sampled at low frequency, to a complete high-accuracy delineation mode, in which the ECG is sampled at high frequency and all the ECG fiducial points are detected, in the case of arrhythmia. The delineation algorithm has been adjusted using the QT database, providing very high sensitivity and positive predictivity, and validated with the MIT database. The errors in the delineation of all the fiducial points are below the tolerances given by the Common Standards for Electrocardiography Committee in the high-accuracy mode, except for the P wave onset, for which the algorithm is above the agreed tolerances by only a fraction of the sample duration. The computational load for the ultralow-power 8-MHz TI MSP430 series microcontroller ranges from 0.2% to 8.5% according to the mode used.
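
    The mode-switching idea lends itself to a compact illustration. The sketch below pairs a deliberately simple squared-derivative QRS peak detector with a function that selects the sampling rate according to whether arrhythmia is suspected; the thresholds, sampling rates and refractory period are illustrative assumptions, not the published algorithm.

        # Low-complexity QRS peak detector plus a runtime mode switch; all numeric
        # values are illustrative assumptions.
        def detect_qrs(samples, fs_hz, threshold_ratio=0.6):
            """Return sample indices of QRS-like peaks using a squared-derivative test."""
            diff2 = [(samples[i + 1] - samples[i - 1]) ** 2 for i in range(1, len(samples) - 1)]
            if not diff2:
                return []
            threshold = threshold_ratio * max(diff2)
            refractory = int(0.2 * fs_hz)  # ignore peaks closer together than 200 ms
            peaks, last = [], -refractory
            for i, v in enumerate(diff2, start=1):
                if v >= threshold and i - last >= refractory:
                    peaks.append(i)
                    last = i
            return peaks

        def choose_sampling_rate(arrhythmia_suspected):
            # Ultralow-power mode samples slowly and locates only the QRS complex;
            # high-accuracy mode samples fast and would also delineate P and T waves,
            # onsets and ends.
            return 500 if arrhythmia_suspected else 125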

  7. Direct social perception and dual process theories of mindreading.

    PubMed

    Herschbach, Mitchell

    2015-11-01

    The direct social perception (DSP) thesis claims that we can directly perceive some mental states of other people. The direct perception of mental states has been formulated phenomenologically and psychologically, and typically restricted to the mental state types of intentions and emotions. I will compare DSP to another account of mindreading: dual process accounts that posit a fast, automatic "Type 1" form of mindreading and a slow, effortful "Type 2" form. I will here analyze whether dual process accounts' Type 1 mindreading serves as a rival to DSP or whether some Type 1 mindreading can be perceptual. I will focus on Apperly and Butterfill's dual process account of mindreading epistemic states such as perception, knowledge, and belief. This account posits a minimal form of Type 1 mindreading of belief-like states called registrations. I will argue that general dual process theories fit well with a modular view of perception that is considered a kind of Type 1 process. I will show that this modular view of perception challenges and has significant advantages over DSP's phenomenological and psychological theses. Finally, I will argue that if such a modular view of perception is accepted, there is significant reason for thinking Type 1 mindreading of belief-like states is perceptual in nature. This would mean extending the scope of DSP to at least one type of epistemic state. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Facilitating the Information Exchange Using a Modular Electronic Discharge Summary.

    PubMed

    Denecke, Kerstin; Dittli, Pascal A; Kanagarasa, Niveadha; Nüssli, Stephan

    2018-01-01

    Discharge summaries are a standard communication tool delivering important clinical information from inpatient to ambulatory care. Ensuring high quality, correctness and completeness makes the generation process time consuming, and it also requires contributions from multiple persons. This is problematic because the primary care provider needs the information from the discharge summary to continue the intended treatment. To address this challenge, we developed a concept for exchanging a modular electronic discharge summary. Through a literature review and interviews with multiple stakeholders, we analysed existing processes and derived requirements for an improved communication of the discharge summary. In this paper, we suggest a concept of a modular electronic discharge summary that is exchanged through the electronic patient dossier in CDA CH level 2 documents. By 2020, all Swiss hospitals are obliged to connect to the electronic patient dossier. Our concept allows the primary care side to access already completed modules of the discharge summary before the entire report is finalised. The data is automatically merged with the local patient record on the physician side and prepared for integration into the practice information system. Our concept offers the opportunity not only to improve the information exchange between hospital and primary care, but also to provide a potential use case and demonstrate a benefit of the electronic patient dossier for primary care providers, who are so far not obliged to connect to the patient dossier in Switzerland.

  9. Exploiting the systematic review protocol for classification of medical abstracts.

    PubMed

    Frunza, Oana; Inkpen, Diana; Matwin, Stan; Klement, William; O'Blenis, Peter

    2011-01-01

    To determine whether the automatic classification of documents can be useful in systematic reviews on medical topics, and specifically if the performance of the automatic classification can be enhanced by using the particular protocol of questions employed by the human reviewers to create multiple classifiers. The test collection is the data used in a large-scale systematic review on the topic of the dissemination strategy of health care services for elderly people. From a group of 47,274 abstracts marked by human reviewers to be included in or excluded from further screening, we randomly selected 20,000 as a training set, with the remaining 27,274 becoming a separate test set. As a machine learning algorithm we used complement naïve Bayes. We tested both a global classification method, where a single classifier is trained on instances of abstracts and their classification (i.e., included or excluded), and a novel per-question classification method that trains a separate classifier for each question, exploiting the specific protocol (questions) of the systematic review. For the per-question method we tested four ways of combining the results of the classifiers trained for the individual questions. As evaluation measures, we calculated precision and recall for several settings of the two methods. It is most important not to exclude any relevant documents (i.e., to attain high recall for the class of interest) but also desirable to exclude most of the non-relevant documents (i.e., to attain high precision on the class of interest) in order to reduce human workload. For the global method, the highest recall was 67.8% and the highest precision was 37.9%. For the per-question method, the highest recall was 99.2%, and the highest precision was 63%. The human-machine workflow proposed in this paper achieved a recall value of 99.6%, and a precision value of 17.8%. The per-question method that combines classifiers following the specific protocol of the review leads to better results than the global method in terms of recall. Because neither method is efficient enough to classify abstracts reliably by itself, the technology should be applied in a semi-automatic way, with a human expert still involved. When the workflow includes one human expert and the trained automatic classifier, recall improves to an acceptable level, showing that automatic classification techniques can reduce the human workload in the process of building a systematic review. Copyright © 2010 Elsevier B.V. All rights reserved.
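
    The per-question idea can be sketched with off-the-shelf components. The snippet below trains one complement naive Bayes classifier per protocol question and keeps an abstract if any question-level classifier votes to include it; the data layout and this recall-oriented combination rule are assumptions for illustration, not the exact protocol of the review.

        # One classifier per protocol question, combined with an "include if any
        # question says include" rule; layout and rule are illustrative assumptions.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.naive_bayes import ComplementNB
        from sklearn.pipeline import make_pipeline

        def train_per_question(abstracts, labels_per_question):
            """labels_per_question maps a question id to a list of 0/1 labels per abstract."""
            models = {}
            for question, labels in labels_per_question.items():
                models[question] = make_pipeline(TfidfVectorizer(), ComplementNB()).fit(abstracts, labels)
            return models

        def screen(models, abstract):
            # Favouring recall: keep the abstract if any per-question model includes it.
            return any(model.predict([abstract])[0] == 1 for model in models.values())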

  10. FOAM: the modular adaptive optics framework

    NASA Astrophysics Data System (ADS)

    van Werkhoven, T. I. M.; Homs, L.; Sliepen, G.; Rodenhuis, M.; Keller, C. U.

    2012-07-01

    Control software for adaptive optics systems is mostly custom built and very specific in nature. We have developed FOAM, a modular adaptive optics framework for controlling and simulating adaptive optics systems in various environments. Portability is provided both for different control hardware and adaptive optics setups. To achieve this, FOAM is written in C++ and runs on standard CPUs. Furthermore we use standard Unix libraries and compilation procedures and implemented a hardware abstraction layer in FOAM. We have successfully implemented FOAM on the adaptive optics system of ExPo - a high-contrast imaging polarimeter developed at our institute - in the lab and will test it on-sky late June 2012. We also plan to implement FOAM on adaptive optics systems for microscopy and solar adaptive optics. FOAM is available under the GNU GPL license and is free to be used by anyone.
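
    FOAM itself is written in C++, but the hardware abstraction layer mentioned above can be illustrated in a few lines of Python: the control loop only ever talks to abstract sensor and mirror interfaces, so the same loop can drive simulated or real back ends. The class and method names here are illustrative assumptions, not FOAM's API.

        # Hardware-abstraction-layer idea in miniature; names are illustrative only.
        from abc import ABC, abstractmethod

        class WavefrontSensor(ABC):
            @abstractmethod
            def measure_slopes(self):
                """Return the current wavefront slope measurements."""

        class DeformableMirror(ABC):
            @abstractmethod
            def set_actuators(self, commands):
                """Apply a vector of actuator commands."""

        class SimulatedSensor(WavefrontSensor):
            def measure_slopes(self):
                return [0.02, -0.01]  # trivial simulated measurement

        class SimulatedMirror(DeformableMirror):
            def set_actuators(self, commands):
                self.last_commands = list(commands)

        def closed_loop_step(sensor, mirror, gain=0.5):
            # The loop sees only the abstract interfaces, so swapping hardware
            # back ends does not change the control code.
            slopes = sensor.measure_slopes()
            mirror.set_actuators([-gain * s for s in slopes])

        closed_loop_step(SimulatedSensor(), SimulatedMirror())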

  11. Quality Control of True Height Profiles Obtained Automatically from Digital Ionograms.

    DTIC Science & Technology

    1982-05-01

    Keywords: Ionosphere, Digisonde, Electron Density Profile, Ionogram Autoscaling, ARTIST. ... analysis technique currently used with the ionogram traces scaled automatically by the ARTIST software [Reinisch and Huang, 1983; Reinisch et al., 1984], and the generalized polynomial analysis technique POLAN [Titheridge, 1985], using the same ARTIST-identified ionogram traces. 2. To determine how ...

  12. Remotely piloted vehicles. Citations from the International Aerospace abstracts data base

    NASA Technical Reports Server (NTRS)

    Mauk, S. C.

    1980-01-01

    These citations from the international literature cover various aspects of remotely piloted vehicles. Included are articles concerning aircraft design, flight tests, aircraft control, cost effectiveness, automatic flight control, automatic pilots, and data links. Civil aviation applications are included, although military uses of remotely piloted vehicles are stressed. This updated bibliography contains 224 citations, 43 of which are new additions to the previous edition.

  13. Unsupervised method for automatic construction of a disease dictionary from a large free text collection.

    PubMed

    Xu, Rong; Supekar, Kaustubh; Morgan, Alex; Das, Amar; Garber, Alan

    2008-11-06

    Concept specific lexicons (e.g. diseases, drugs, anatomy) are a critical source of background knowledge for many medical language-processing systems. However, the rapid pace of biomedical research and the lack of constraints on usage ensure that such dictionaries are incomplete. Focusing on disease terminology, we have developed an automated, unsupervised, iterative pattern learning approach for constructing a comprehensive medical dictionary of disease terms from randomized clinical trial (RCT) abstracts, and we compared different ranking methods for automatically extracting contextual patterns and concept terms. When used to identify disease concepts from 100 randomly chosen, manually annotated clinical abstracts, our disease dictionary shows significant performance improvement (F1 increased by 35-88%) over available, manually created disease terminologies.

  14. Unsupervised Method for Automatic Construction of a Disease Dictionary from a Large Free Text Collection

    PubMed Central

    Xu, Rong; Supekar, Kaustubh; Morgan, Alex; Das, Amar; Garber, Alan

    2008-01-01

    Concept specific lexicons (e.g. diseases, drugs, anatomy) are a critical source of background knowledge for many medical language-processing systems. However, the rapid pace of biomedical research and the lack of constraints on usage ensure that such dictionaries are incomplete. Focusing on disease terminology, we have developed an automated, unsupervised, iterative pattern learning approach for constructing a comprehensive medical dictionary of disease terms from randomized clinical trial (RCT) abstracts, and we compared different ranking methods for automatically extracting contextual patterns and concept terms. When used to identify disease concepts from 100 randomly chosen, manually annotated clinical abstracts, our disease dictionary shows significant performance improvement (F1 increased by 35–88%) over available, manually created disease terminologies. PMID:18999169
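
    The iterative pattern-learning loop described in the two records above can be caricatured in a few lines: seed disease terms yield contextual patterns, and the highest-scoring patterns are then used to harvest new candidate terms. The one-word-context pattern shape and the frequency-based scoring are toy assumptions, not the method evaluated in the paper.

        # Toy bootstrapping loop: seed terms -> contextual patterns -> new terms.
        import re
        from collections import Counter

        def learn_dictionary(abstracts, seed_terms, iterations=3, top_patterns=5):
            terms = {t.lower() for t in seed_terms}
            for _ in range(iterations):
                # 1. Use the word immediately preceding a known term as a context pattern.
                patterns = Counter()
                for text in abstracts:
                    for term in terms:
                        for m in re.finditer(r"\b(\w+)\s+" + re.escape(term), text.lower()):
                            patterns[m.group(1)] += 1
                best = [p for p, _ in patterns.most_common(top_patterns)]
                # 2. Use the best patterns to harvest new candidate terms.
                for text in abstracts:
                    for p in best:
                        for m in re.finditer(r"\b" + re.escape(p) + r"\s+(\w+)", text.lower()):
                            terms.add(m.group(1))
            return terms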

  15. An autonomous sensor module based on a legacy CCTV camera

    NASA Astrophysics Data System (ADS)

    Kent, P. J.; Faulkner, D. A. A.; Marshall, G. F.

    2016-10-01

    A UK MoD funded programme into autonomous sensor arrays (SAPIENT) has been developing new, highly capable sensor modules together with a scalable modular architecture for control and communication. As part of this system there is a desire to also utilise existing legacy sensors. The paper reports upon the development of a SAPIENT-compliant sensor module using a legacy Closed-Circuit Television (CCTV) pan-tilt-zoom (PTZ) camera. The PTZ camera sensor provides three modes of operation. In the first mode, the camera is automatically slewed to acquire imagery of a specified scene area, e.g. to provide "eyes-on" confirmation for a human operator or for forensic purposes. In the second mode, the camera is directed to monitor an area of interest, with zoom level automatically optimized for human detection at the appropriate range. Open source algorithms (using OpenCV) are used to automatically detect pedestrians; their real world positions are estimated and communicated back to the SAPIENT central fusion system. In the third mode of operation a "follow" mode is implemented where the camera maintains the detected person within the camera field-of-view without requiring an end-user to directly control the camera with a joystick.
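
    The second mode can be sketched with OpenCV's stock HOG pedestrian detector (the abstract states only that open-source OpenCV algorithms were used); the detector choice, the pinhole range estimate and the assumed person height are illustrative, not the actual module implementation.

        # Pedestrian detection with OpenCV's built-in HOG person detector, plus a
        # crude range estimate; parameters are illustrative assumptions.
        import cv2

        hog = cv2.HOGDescriptor()
        hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

        def detect_people(frame):
            """Return bounding boxes (x, y, w, h) of detected pedestrians."""
            boxes, _weights = hog.detectMultiScale(frame, winStride=(8, 8))
            return list(boxes)

        def estimate_range_m(box, focal_length_px, person_height_m=1.7):
            # Pinhole-camera estimate from apparent height; assumes an upright person.
            _x, _y, _w, h = box
            return focal_length_px * person_height_m / float(h)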

  16. A parallel and modular deformable cell Car-Parrinello code

    NASA Astrophysics Data System (ADS)

    Cavazzoni, Carlo; Chiarotti, Guido L.

    1999-12-01

    We have developed a modular parallel code implementing the Car-Parrinello [Phys. Rev. Lett. 55 (1985) 2471] algorithm including the variable cell dynamics [Europhys. Lett. 36 (1994) 345; J. Phys. Chem. Solids 56 (1995) 510]. Our code is written in Fortran 90, and makes use of some new programming concepts like encapsulation, data abstraction and data hiding. The code has a multi-layer hierarchical structure with tree like dependences among modules. The modules include not only the variables but also the methods acting on them, in an object oriented fashion. The modular structure allows easier code maintenance, develop and debugging procedures, and is suitable for a developer team. The layer structure permits high portability. The code displays an almost linear speed-up in a wide range of number of processors independently of the architecture. Super-linear speed up is obtained with a "smart" Fast Fourier Transform (FFT) that uses the available memory on the single node (increasing for a fixed problem with the number of processing elements) as temporary buffer to store wave function transforms. This code has been used to simulate water and ammonia at giant planet conditions for systems as large as 64 molecules for ˜50 ps.

  17. Using VCL as an Aspect-Oriented Approach to Requirements Modelling

    NASA Astrophysics Data System (ADS)

    Amálio, Nuno; Kelsen, Pierre; Ma, Qin; Glodt, Christian

    Software systems are becoming larger and more complex. By tackling the modularisation of crosscutting concerns, aspect orientation draws attention to modularity as a means to address the problems of scalability, complexity and evolution in software systems development. Aspect-oriented modelling (AOM) applies aspect-orientation to the construction of models. Most existing AOM approaches are designed without a formal semantics, and use multi-view partial descriptions of behaviour. This paper presents an AOM approach based on the Visual Contract Language (VCL): a visual language for abstract and precise modelling, designed with a formal semantics, and comprising a novel approach to visual behavioural modelling based on design by contract where behavioural descriptions are total. By applying VCL to a large case study of a car-crash crisis management system, the paper demonstrates how modularity of VCL's constructs, at different levels of granularity, help to tackle complexity. In particular, it shows how VCL's package construct and its associated composition mechanisms are key in supporting separation of concerns, coarse-grained problem decomposition and aspect-orientation. The case study's modelling solution has a clear and well-defined modular structure; the backbone of this structure is a collection of packages encapsulating local solutions to concerns.

  18. A study of the use of abstract types for the representation of engineering units in integration and test applications

    NASA Technical Reports Server (NTRS)

    Johnson, Charles S.

    1986-01-01

    Physical quantities using various units of measurement can be well represented in Ada by the use of abstract types. Computation involving these quantities (electric potential, mass, volume) can also automatically invoke the computation and checking of some of the implicitly associable attributes of measurements. Quantities can be held internally in SI units, transparently to the user, with automatic conversion. Through dimensional analysis, the type of the derived quantity resulting from a computation is known, thereby allowing dynamic checks of the equations used. The impact of the possible implementation of these techniques in integration and test applications is discussed. The overhead of computing and transporting measurement attributes is weighed against the advantages gained by their use. The construction of a run time interpreter using physical quantities in equations can be aided by the dynamic equation checks provided by dimensional analysis. The effects of high levels of abstraction on the generation and maintenance of software used in integration and test applications are also discussed.
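
    The paper's abstract types are an Ada construct, but the underlying idea (quantities stored internally in SI units and carrying a dimension vector that is checked and propagated by arithmetic) is easy to sketch in Python; the four-element dimension tuple and the class below are illustrative assumptions.

        # Quantities carry SI magnitudes plus exponents of (metre, kilogram, second,
        # ampere); mismatched additions are rejected and products/quotients derive
        # their dimensions automatically.
        class Quantity:
            def __init__(self, value, dims):
                self.value = value          # magnitude in SI base units
                self.dims = tuple(dims)     # (m, kg, s, A) exponents

            def __add__(self, other):
                if self.dims != other.dims:
                    raise TypeError(f"dimension mismatch: {self.dims} vs {other.dims}")
                return Quantity(self.value + other.value, self.dims)

            def __mul__(self, other):
                return Quantity(self.value * other.value,
                                tuple(a + b for a, b in zip(self.dims, other.dims)))

            def __truediv__(self, other):
                return Quantity(self.value / other.value,
                                tuple(a - b for a, b in zip(self.dims, other.dims)))

        distance = Quantity(3.0, (1, 0, 0, 0))   # 3 m
        duration = Quantity(2.0, (0, 0, 1, 0))   # 2 s
        speed = distance / duration              # 1.5 with dims (1, 0, -1, 0), i.e. m/s
        # distance + duration would raise TypeError, flagging the unit error.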

  19. Automatic Keyword Extraction from Individual Documents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rose, Stuart J.; Engel, David W.; Cramer, Nicholas O.

    2010-05-03

    This paper introduces a novel and domain-independent method for automatically extracting keywords, as sequences of one or more words, from individual documents. We describe the method’s configuration parameters and algorithm, and present an evaluation on a benchmark corpus of technical abstracts. We also present a method for generating lists of stop words for specific corpora and domains, and evaluate its ability to improve keyword extraction on the benchmark corpus. Finally, we apply our method of automatic keyword extraction to a corpus of news articles and define metrics for characterizing the exclusivity, essentiality, and generality of extracted keywords within a corpus.
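
    The scoring scheme of this method (RAKE) is simple enough to sketch directly: candidate phrases are word runs between stop words and punctuation, each word is scored by degree/frequency over those phrases, and a phrase scores the sum of its word scores. The tiny stop-word list below is illustrative only.

        # Simplified RAKE-style keyword extraction; the stop-word list is a toy.
        import re
        from collections import defaultdict

        STOP_WORDS = {"a", "an", "and", "the", "of", "for", "in", "on", "to", "is", "are", "with", "from", "by"}

        def extract_keywords(text, top_n=5):
            phrases = []
            for fragment in re.split(r"[.,;:!?()\n]", text.lower()):
                current = []
                for w in re.findall(r"[a-z']+", fragment):
                    if w in STOP_WORDS:
                        if current:
                            phrases.append(current)
                        current = []
                    else:
                        current.append(w)
                if current:
                    phrases.append(current)
            # Word score = degree / frequency; phrase score = sum of its word scores.
            freq, degree = defaultdict(int), defaultdict(int)
            for phrase in phrases:
                for w in phrase:
                    freq[w] += 1
                    degree[w] += len(phrase)
            scored = {" ".join(p): sum(degree[w] / freq[w] for w in p) for p in phrases}
            return sorted(scored.items(), key=lambda kv: -kv[1])[:top_n]

        print(extract_keywords("Automatic keyword extraction selects keyword phrases from individual documents."))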

  20. Optimum spaceborne computer system design by simulation

    NASA Technical Reports Server (NTRS)

    Williams, T.; Kerner, H.; Weatherbee, J. E.; Taylor, D. S.; Hodges, B.

    1973-01-01

    A deterministic simulator is described which models the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. Its use as a tool to study and determine the minimum computer system configuration necessary to satisfy the on-board computational requirements of a typical mission is presented. The paper describes how the computer system configuration is determined in order to satisfy the data processing demand of the various shuttle booster subsystems. The configuration which is developed as a result of studies with the simulator is optimal with respect to the efficient use of computer system resources.

  1. Control and protection system for paralleled modular static inverter-converter systems

    NASA Technical Reports Server (NTRS)

    Birchenough, A. G.; Gourash, F.

    1973-01-01

    A control and protection system was developed for use with a paralleled 2.5-kWe-per-module static inverter-converter system. The control and protection system senses internal and external fault parameters such as voltage, frequency, current, and paralleling current unbalance. A logic system controls contactors to isolate defective power conditioners or loads. The system sequences contactor operation to automatically control parallel operation, startup, and fault isolation. Transient overload protection and fault checking sequences are included. The operation and performance of a control and protection system, with detailed circuit descriptions, are presented.

  2. Combatting Inherent Vulnerabilities of CFAR Algorithms and a New Robust CFAR Design

    DTIC Science & Technology

    1993-09-01

    ... elements of any automatic radar system. Unfortunately, CFAR systems are inherently vulnerable to degradation caused by large clutter edges, multiple targets, and electronic countermeasures (ECM) environments. This thesis presents eight popular and studied ...
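
    For context, a standard baseline detector in this family is cell-averaging CFAR (CA-CFAR), which estimates the local noise level from training cells around the cell under test and applies a multiplicative threshold; clutter edges or additional targets inside the training window bias that estimate, which is the vulnerability noted above. The sketch below is the textbook CA-CFAR with illustrative window sizes and scale factor, not the robust detector proposed in the thesis.

        # Textbook cell-averaging CFAR; window sizes and scale factor are illustrative.
        def ca_cfar(power, num_train=8, num_guard=2, scale=4.0):
            """Return indices of range cells whose power exceeds scale * local noise."""
            detections = []
            half = num_train // 2 + num_guard
            for i in range(half, len(power) - half):
                leading = power[i - half:i - num_guard]           # training cells before the cell under test
                trailing = power[i + num_guard + 1:i + half + 1]  # training cells after it
                noise = (sum(leading) + sum(trailing)) / (len(leading) + len(trailing))
                if power[i] > scale * noise:
                    detections.append(i)
            return detections

        cells = [1.0] * 20
        cells[10] = 30.0             # a strong target return
        print(ca_cfar(cells))        # -> [10]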

  3. Multi-terminology indexing for the assignment of MeSH descriptors to medical abstracts in French

    PubMed Central

    Pereira, Suzanne; Sakji, Saoussen; Névéol, Aurélie; Kergourlay, Ivan; Kerdelhué, Gaétan; Serrot, Elisabeth; Joubert, Michel; Darmoni, Stéfan J.

    2009-01-01

    Background: To facilitate information retrieval in the biomedical domain, a system for the automatic assignment of Medical Subject Headings to documents curated by an online quality-controlled health gateway was implemented. The French Multi-Terminology Indexer (F-MTI) implements a multiterminology approach using nine main medical terminologies in French and the mappings between them. Objective: This paper presents recent efforts to assess the added value of (a) integrating four new terminologies (Orphanet, ATC, drug names, MeSH supplementary concepts) into F-MTI's knowledge sources and (b) performing the automatic indexing on the titles and abstracts (vs. title only) of the online health resources. Methods: F-MTI was evaluated on a CISMeF corpus comprising 18,161 manually indexed resources. Results: The performance of F-MTI including nine health terminologies on CISMeF resources with Title only was 27.9% precision and 19.7% recall, while the performance on CISMeF resources with Title and Abstract was 14.9% precision (−13.0%) and 25.9% recall (+6.2%). Conclusion: In a few weeks, CISMeF will launch the indexing of resources based on title and abstract, using nine terminologies. PMID:20351910

  4. Auspice: Automatic Service Planning in Cloud/Grid Environments

    NASA Astrophysics Data System (ADS)

    Chiu, David; Agrawal, Gagan

    Recent scientific advances have fostered a mounting number of services and data sets available for utilization. These resources, though scattered across disparate locations, are often loosely coupled both semantically and operationally. This loosely coupled relationship implies the possibility of linking together operations and data sets to answer queries. This task, generally known as automatic service composition, therefore abstracts the process of complex scientific workflow planning from the user. We have been exploring a metadata-driven approach toward automatic service workflow composition, among other enabling mechanisms, in our system, Auspice: Automatic Service Planning in Cloud/Grid Environments. In this paper, we present a complete overview of our system's unique features and outlooks for future deployment as the Cloud computing paradigm becomes increasingly eminent in enabling scientific computing.

  5. Orbital spacecraft consumables resupply

    NASA Technical Reports Server (NTRS)

    Dominick, Sam M.; Eberhardt, Ralph N.; Tracey, Thomas R.

    1988-01-01

    The capability to replenish spacecraft, satellites, and laboratories on-orbit with consumable fluids provides significant increases in their cost and operational effectiveness. Tanker systems to perform on-orbit fluid resupply must be flexible enough to operate from the Space Transportation System (STS), Space Station, or the Orbital Maneuvering Vehicle (OMV), and to accommodate launch from both the Shuttle and Expendable Launch Vehicles (ELV's). Resupply systems for storable monopropellant hydrazine and bipropellants, and water have been developed. These studies have concluded that designing tankers capable of launch on both the Shuttle and ELV's was feasible and desirable. Design modifications and interfaces for an ELV launch of the tanker systems were identified. Additionally, it was determined that modularization of the tanker subsystems was necessary to provide the most versatile tanker and most efficient approach for use at the Space Station. The need to develop an automatic umbilical mating mechanism, capable of performing both docking and coupler mating functions was identified. Preliminary requirements for such a mechanism were defined. The study resulted in a modular tanker capable of resupplying monopropellants, bipropellants, and water with a single design.

  6. Toward a Practical Type Theory for Recursive Modules

    DTIC Science & Technology

    2001-03-01

    Module systems for languages with complex type systems, such as Standard ML, often lack the ... power of a module system lies in the flexibility of its facility for expressing dependencies between modular components. Some languages (such as Java ...

  7. Interface Specifications for the A-7E Shared Services Module.

    DTIC Science & Technology

    1982-09-08

    To illustrate the principles, the onboard software for the Navy’s A-7E aircraft will be redesigned and rewritten. The Shared Services module provides...purpose of the Shared Services module is to allow the remainder of the software to remain unchanged when the requirements-based rules for these values and...services change. This report describes the modular structure of the Shared Services module, and contains the abstract interface specifications for all

  8. Model Checking Abstract PLEXIL Programs with SMART

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.

    2007-01-01

    We describe a method to automatically generate discrete-state models of abstract Plan Execution Interchange Language (PLEXIL) programs that can be analyzed using model checking tools. Starting from a high-level description of a PLEXIL program or a family of programs with common characteristics, the generator lays the framework that models the principles of program execution. The concrete parts of the program are not automatically generated, but require the modeler to introduce them by hand. As a case study, we generate models to verify properties of the PLEXIL macro constructs that are introduced as shorthand notation. After an exhaustive analysis, we conclude that the macro definitions obey the intended semantics and behave as expected, but contingent on a few specific requirements on the timing semantics of micro-steps in the concrete executive implementation.

  9. Automatic Review of Abstract State Machines by Meta Property Verification

    NASA Technical Reports Server (NTRS)

    Arcaini, Paolo; Gargantini, Angelo; Riccobene, Elvinia

    2010-01-01

    A model review is a validation technique aimed at determining if a model is of sufficient quality and allows defects to be identified early in the system development, reducing the cost of fixing them. In this paper we propose a technique to perform automatic review of Abstract State Machine (ASM) formal specifications. We first detect a family of typical vulnerabilities and defects a developer can introduce during the modeling activity using the ASMs and we express such faults as the violation of meta-properties that guarantee certain quality attributes of the specification. These meta-properties are then mapped to temporal logic formulas and model checked for their violation. As a proof of concept, we also report the result of applying this ASM review process to several specifications.

  10. iBodies: Modular Synthetic Antibody Mimetics Based on Hydrophilic Polymers Decorated with Functional Moieties

    PubMed Central

    Šácha, Pavel; Knedlík, Tomáš; Schimer, Jiří; Tykvart, Jan; Parolek, Jan; Navrátil, Václav; Dvořáková, Petra; Sedlák, František; Ulbrich, Karel; Strohalm, Jiří; Majer, Pavel

    2016-01-01

    Abstract Antibodies are indispensable tools for biomedicine and anticancer therapy. Nevertheless, their use is compromised by high production costs, limited stability, and difficulty of chemical modification. The design and preparation of synthetic polymer conjugates capable of replacing antibodies in biomedical applications such as ELISA, flow cytometry, immunocytochemistry, and immunoprecipitation is reported. The conjugates, named “iBodies”, consist of an HPMA copolymer decorated with low‐molecular‐weight compounds that function as targeting ligands, affinity anchors, and imaging probes. We prepared specific conjugates targeting several proteins with known ligands and used these iBodies for enzyme inhibition, protein isolation, immobilization, quantification, and live‐cell imaging. Our data indicate that this highly modular and versatile polymer system can be used to produce inexpensive and stable antibody substitutes directed toward virtually any protein of interest with a known ligand. PMID:26749427

  11. ANAlyte: A modular image analysis tool for ANA testing with indirect immunofluorescence.

    PubMed

    Di Cataldo, Santa; Tonti, Simone; Bottino, Andrea; Ficarra, Elisa

    2016-05-01

    The automated analysis of indirect immunofluorescence images for Anti-Nuclear Autoantibody (ANA) testing is a fairly recent field that is receiving ever-growing interest from the research community. ANA testing leverages the categorization of intensity level and fluorescent pattern of IIF images of HEp-2 cells to perform a differential diagnosis of important autoimmune diseases. Nevertheless, it suffers from a tremendous lack of repeatability due to subjectivity in the visual interpretation of the images. The automatization of the analysis is seen as the only valid solution to this problem. Several works in the literature address individual steps of the workflow; nonetheless, integrating such steps and assessing their effectiveness as a whole is still an open challenge. We present a modular tool, ANAlyte, able to characterize an IIF image in terms of fluorescent intensity level and fluorescent pattern without any user interaction. For this purpose, ANAlyte integrates the following: (i) an Intensity Classifier module, which categorizes the intensity level of the input slide based on multi-scale contrast assessment; (ii) a Cell Segmenter module, which splits the input slide into individual HEp-2 cells; (iii) a Pattern Classifier module, which determines the fluorescent pattern of the slide based on the pattern of the individual cells. To demonstrate the accuracy and robustness of our tool, we experimentally validated ANAlyte on two different public benchmarks of IIF HEp-2 images with a rigorous leave-one-out cross-validation strategy. We obtained overall accuracies of fluorescent intensity and pattern classification of around 85% and above 90%, respectively. We assessed all results by comparisons with some of the most representative state of the art works. Unlike most of the other works in the recent literature, ANAlyte aims at the automatization of all the major steps of ANA image analysis. Results on public benchmarks demonstrate that the tool can characterize HEp-2 slides in terms of intensity and fluorescent pattern with accuracy better than or comparable to the state of the art techniques, even when such techniques are run on manually segmented cells. Hence, ANAlyte can be proposed as a valid solution to the problem of ANA testing automatization. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
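
    The three-module structure translates naturally into a small pipeline skeleton. The module interfaces, the placeholder decisions and the majority vote used to lift per-cell patterns to a slide-level pattern are assumptions made for illustration, not ANAlyte's implementation.

        # Structural sketch of the three-module ANA analysis pipeline.
        class IntensityClassifier:
            def classify(self, slide_image):
                return "positive"        # placeholder for multi-scale contrast assessment

        class CellSegmenter:
            def segment(self, slide_image):
                return [slide_image]     # placeholder: one crop per detected HEp-2 cell

        class PatternClassifier:
            def classify(self, cell_image):
                return "homogeneous"     # placeholder per-cell fluorescent pattern

        def analyse_slide(slide_image):
            intensity = IntensityClassifier().classify(slide_image)
            cells = CellSegmenter().segment(slide_image)
            votes = [PatternClassifier().classify(c) for c in cells]
            # Slide-level pattern taken as the most common per-cell pattern.
            return {"intensity": intensity, "pattern": max(set(votes), key=votes.count)}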

  12. The CHEMDNER corpus of chemicals and drugs and its annotation principles.

    PubMed

    Krallinger, Martin; Rabal, Obdulia; Leitner, Florian; Vazquez, Miguel; Salgado, David; Lu, Zhiyong; Leaman, Robert; Lu, Yanan; Ji, Donghong; Lowe, Daniel M; Sayle, Roger A; Batista-Navarro, Riza Theresa; Rak, Rafal; Huber, Torsten; Rocktäschel, Tim; Matos, Sérgio; Campos, David; Tang, Buzhou; Xu, Hua; Munkhdalai, Tsendsuren; Ryu, Keun Ho; Ramanan, S V; Nathan, Senthil; Žitnik, Slavko; Bajec, Marko; Weber, Lutz; Irmer, Matthias; Akhondi, Saber A; Kors, Jan A; Xu, Shuo; An, Xin; Sikdar, Utpal Kumar; Ekbal, Asif; Yoshioka, Masaharu; Dieb, Thaer M; Choi, Miji; Verspoor, Karin; Khabsa, Madian; Giles, C Lee; Liu, Hongfang; Ravikumar, Komandur Elayavilli; Lamurias, Andre; Couto, Francisco M; Dai, Hong-Jie; Tsai, Richard Tzong-Han; Ata, Caglar; Can, Tolga; Usié, Anabel; Alves, Rui; Segura-Bedmar, Isabel; Martínez, Paloma; Oyarzabal, Julen; Valencia, Alfonso

    2015-01-01

    The automatic extraction of chemical information from text requires the recognition of chemical entity mentions as one of its key steps. When developing supervised named entity recognition (NER) systems, the availability of a large, manually annotated text corpus is desirable. Furthermore, large corpora permit the robust evaluation and comparison of different approaches that detect chemicals in documents. We present the CHEMDNER corpus, a collection of 10,000 PubMed abstracts that contain a total of 84,355 chemical entity mentions labeled manually by expert chemistry literature curators, following annotation guidelines specifically defined for this task. The abstracts of the CHEMDNER corpus were selected to be representative for all major chemical disciplines. Each of the chemical entity mentions was manually labeled according to its structure-associated chemical entity mention (SACEM) class: abbreviation, family, formula, identifier, multiple, systematic and trivial. The difficulty and consistency of tagging chemicals in text was measured using an agreement study between annotators, obtaining a percentage agreement of 91. For a subset of the CHEMDNER corpus (the test set of 3,000 abstracts) we provide not only the Gold Standard manual annotations, but also mentions automatically detected by the 26 teams that participated in the BioCreative IV CHEMDNER chemical mention recognition task. In addition, we release the CHEMDNER silver standard corpus of automatically extracted mentions from 17,000 randomly selected PubMed abstracts. A version of the CHEMDNER corpus in the BioC format has been generated as well. We propose a standard for required minimum information about entity annotations for the construction of domain specific corpora on chemical and drug entities. The CHEMDNER corpus and annotation guidelines are available at: http://www.biocreative.org/resources/biocreative-iv/chemdner-corpus/.

  13. The CHEMDNER corpus of chemicals and drugs and its annotation principles

    PubMed Central

    2015-01-01

    The automatic extraction of chemical information from text requires the recognition of chemical entity mentions as one of its key steps. When developing supervised named entity recognition (NER) systems, the availability of a large, manually annotated text corpus is desirable. Furthermore, large corpora permit the robust evaluation and comparison of different approaches that detect chemicals in documents. We present the CHEMDNER corpus, a collection of 10,000 PubMed abstracts that contain a total of 84,355 chemical entity mentions labeled manually by expert chemistry literature curators, following annotation guidelines specifically defined for this task. The abstracts of the CHEMDNER corpus were selected to be representative for all major chemical disciplines. Each of the chemical entity mentions was manually labeled according to its structure-associated chemical entity mention (SACEM) class: abbreviation, family, formula, identifier, multiple, systematic and trivial. The difficulty and consistency of tagging chemicals in text was measured using an agreement study between annotators, obtaining a percentage agreement of 91. For a subset of the CHEMDNER corpus (the test set of 3,000 abstracts) we provide not only the Gold Standard manual annotations, but also mentions automatically detected by the 26 teams that participated in the BioCreative IV CHEMDNER chemical mention recognition task. In addition, we release the CHEMDNER silver standard corpus of automatically extracted mentions from 17,000 randomly selected PubMed abstracts. A version of the CHEMDNER corpus in the BioC format has been generated as well. We propose a standard for required minimum information about entity annotations for the construction of domain specific corpora on chemical and drug entities. The CHEMDNER corpus and annotation guidelines are available at: http://www.biocreative.org/resources/biocreative-iv/chemdner-corpus/ PMID:25810773

  14. Automatic Design of a Maglev Controller in State Space

    DTIC Science & Technology

    1991-12-01

    Feng Zhao and Richard Thornton. We describe the automatic synthesis of a global nonlinear controller for ... the global switching points of the controller is presented. The synthesized control system can stabilize the maglev vehicle with large initial displace...

  15. Software for marine ecological environment comprehensive monitoring system based on MCGS

    NASA Astrophysics Data System (ADS)

    Wang, X. H.; Ma, R.; Cao, X.; Cao, L.; Chu, D. Z.; Zhang, L.; Zhang, T. P.

    2017-08-01

    Automatic integrated monitoring software for the marine ecological environment, based on the MCGS configuration software, was designed and developed to realize real-time automatic monitoring of many marine ecological parameters. A DTU data transmission terminal performs network communication and transmits the data to the user data center in a timely manner. The software adopts a modular design and offers a stable and flexible data structure, strong portability and scalability, a clear interface, simple user operation and convenient maintenance. A continuous six-month site comparison test showed that the relative error of the parameters monitored by the system, such as temperature, salinity, turbidity, pH and dissolved oxygen, was within 5% of the standard method, and the relative error of the nutrient parameters was within 15%. Meanwhile, the system required little maintenance, had a low failure rate, and provided stable and efficient continuous monitoring. Field application shows that the software is stable and the data communication is reliable, and it has good application prospects in the field of comprehensive marine ecological environment monitoring.

  16. The ALICE-HMPID Detector Control System: Its evolution towards an expert and adaptive system

    NASA Astrophysics Data System (ADS)

    De Cataldo, G.; Franco, A.; Pastore, C.; Sgura, I.; Volpe, G.

    2011-05-01

    The High Momentum Particle IDentification (HMPID) detector is a proximity focusing Ring Imaging Cherenkov (RICH) for charged hadron identification. The HMPID is based on liquid C6F14 as the radiator medium and on a 10 m² CsI coated, pad segmented photocathode of MWPCs for UV Cherenkov photon detection. To ensure full remote control, the HMPID is equipped with a detector control system (DCS) responding to industrial standards for robustness and reliability. It has been implemented using PVSS as Slow Control And Data Acquisition (SCADA) environment, Programmable Logic Controller as control devices and Finite State Machines for modular and automatic command execution. In the perspective of reducing human presence at the experiment site, this paper focuses on DCS evolution towards an expert and adaptive control system, providing, respectively, automatic error recovery and stable detector performance. HAL9000, the first prototype of the HMPID expert system, is then presented. Finally an analysis of the possible application of the adaptive features is provided.

  17. Reaction Mechanism Generator: Automatic construction of chemical kinetic mechanisms

    NASA Astrophysics Data System (ADS)

    Gao, Connie W.; Allen, Joshua W.; Green, William H.; West, Richard H.

    2016-06-01

    Reaction Mechanism Generator (RMG) constructs kinetic models composed of elementary chemical reaction steps using a general understanding of how molecules react. Species thermochemistry is estimated through Benson group additivity and reaction rate coefficients are estimated using a database of known rate rules and reaction templates. At its core, RMG relies on two fundamental data structures: graphs and trees. Graphs are used to represent chemical structures, and trees are used to represent thermodynamic and kinetic data. Models are generated using a rate-based algorithm which excludes species from the model based on reaction fluxes. RMG can generate reaction mechanisms for species involving carbon, hydrogen, oxygen, sulfur, and nitrogen. It also has capabilities for estimating transport and solvation properties, and it automatically computes pressure-dependent rate coefficients and identifies chemically-activated reaction paths. RMG is an object-oriented program written in Python, which provides a stable, robust programming architecture for developing an extensible and modular code base with a large suite of unit tests. Computationally intensive functions are cythonized for speed improvements.
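
    The rate-based enlargement idea can be shown in a couple of lines: edge species whose net formation flux exceeds a tolerance times a characteristic core flux are promoted into the model. The flux values, tolerance and function name below are placeholders, not RMG output or RMG's API.

        # Rate-based model enlargement: promote edge species with large formation flux.
        def species_to_promote(edge_fluxes, characteristic_flux, tolerance=0.1):
            """edge_fluxes maps candidate species names to net formation rates (arbitrary units)."""
            threshold = tolerance * characteristic_flux
            return [s for s, flux in edge_fluxes.items() if abs(flux) > threshold]

        edge_fluxes = {"CH3OO": 2.0e-6, "HO2": 8.0e-4, "C2H5": 3.0e-5}
        print(species_to_promote(edge_fluxes, characteristic_flux=1.0e-3))  # -> ['HO2']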

  18. MxCuBE: a synchrotron beamline control environment customized for macromolecular crystallography experiments

    PubMed Central

    Gabadinho, José; Beteva, Antonia; Guijarro, Matias; Rey-Bakaikoa, Vicente; Spruce, Darren; Bowler, Matthew W.; Brockhauser, Sandor; Flot, David; Gordon, Elspeth J.; Hall, David R.; Lavault, Bernard; McCarthy, Andrew A.; McCarthy, Joanne; Mitchell, Edward; Monaco, Stéphanie; Mueller-Dieckmann, Christoph; Nurizzo, Didier; Ravelli, Raimond B. G.; Thibault, Xavier; Walsh, Martin A.; Leonard, Gordon A.; McSweeney, Sean M.

    2010-01-01

    The design and features of a beamline control software system for macromolecular crystallography (MX) experiments developed at the European Synchrotron Radiation Facility (ESRF) are described. This system, MxCuBE, allows users to easily and simply interact with beamline hardware components and provides automated routines for common tasks in the operation of a synchrotron beamline dedicated to experiments in MX. Additional functionality is provided through intuitive interfaces that enable the assessment of the diffraction characteristics of samples, experiment planning, automatic data collection and the on-line collection and analysis of X-ray emission spectra. The software can be run in a tandem client-server mode that allows for remote control and relevant experimental parameters and results are automatically logged in a relational database, ISPyB. MxCuBE is modular, flexible and extensible and is currently deployed on eight macromolecular crystallography beamlines at the ESRF. Additionally, the software is installed at MAX-lab beamline I911-3 and at BESSY beamline BL14.1. PMID:20724792

  19. Automatic analysis of medical dialogue in the home hemodialysis domain: structure induction and summarization.

    PubMed

    Lacson, Ronilda C; Barzilay, Regina; Long, William J

    2006-10-01

    Spoken medical dialogue is a valuable source of information for patients and caregivers. This work presents a first step towards automatic analysis and summarization of spoken medical dialogue. We first abstract a dialogue into a sequence of semantic categories using linguistic and contextual features integrated in a supervised machine-learning framework. Our model has a classification accuracy of 73%, compared to 33% achieved by a majority baseline (p<0.01). We then describe and implement a summarizer that utilizes this automatically induced structure. Our evaluation results indicate that automatically generated summaries exhibit high resemblance to summaries written by humans. In addition, task-based evaluation shows that physicians can reasonably answer questions related to patient care by looking at the automatically generated summaries alone, in contrast to the physicians' performance when they were given summaries from a naïve summarizer (p<0.05). This work demonstrates the feasibility of automatically structuring and summarizing spoken medical dialogue.

  20. Graph Theory-Based Brain Connectivity for Automatic Classification of Multiple Sclerosis Clinical Courses.

    PubMed

    Kocevar, Gabriel; Stamile, Claudio; Hannoun, Salem; Cotton, François; Vukusic, Sandra; Durand-Dubief, Françoise; Sappey-Marinier, Dominique

    2016-01-01

    Purpose: In this work, we introduce a method to classify Multiple Sclerosis (MS) patients into four clinical profiles using structural connectivity information. For the first time, we try to solve this question in a fully automated way using a computer-based method. The main goal is to show how the combination of graph-derived metrics with machine learning techniques constitutes a powerful tool for a better characterization and classification of MS clinical profiles. Materials and Methods: Sixty-four MS patients [12 Clinically Isolated Syndrome (CIS), 24 Relapsing Remitting (RR), 24 Secondary Progressive (SP), and 17 Primary Progressive (PP)] along with 26 healthy controls (HC) underwent MR examination. T1 and diffusion tensor imaging (DTI) were used to obtain structural connectivity matrices for each subject. Global graph metrics, such as density and modularity, were estimated and compared between subjects' groups. These metrics were further used to classify patients using a tuned Support Vector Machine (SVM) combined with a Radial Basis Function (RBF) kernel. Results: When comparing MS patients to HC subjects, a greater assortativity, transitivity, and characteristic path length as well as a lower global efficiency were found. Using all graph metrics, the best F-Measures (91.8, 91.8, 75.6, and 70.6%) were obtained for binary (HC-CIS, CIS-RR, RR-PP) and multi-class (CIS-RR-SP) classification tasks, respectively. When using only one graph metric, the best F-Measures (83.6, 88.9, and 70.7%) were achieved for modularity with previous binary classification tasks. Conclusion: Based on a simple DTI acquisition associated with structural brain connectivity analysis, this automatic method allowed an accurate classification of different MS patients' clinical profiles.
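
    The analysis pipeline (global graph metrics fed to an RBF-kernel SVM) can be sketched with networkx and scikit-learn. The particular metrics, the scaling step and the hyper-parameters are illustrative choices, not the tuned values from the study.

        # Global graph metrics from a connectivity matrix, then an RBF-kernel SVM.
        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities, modularity
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        def graph_features(connectivity_matrix):
            """connectivity_matrix: square NumPy array of connection weights."""
            G = nx.from_numpy_array(connectivity_matrix)
            communities = greedy_modularity_communities(G, weight="weight")
            return [
                nx.density(G),
                nx.transitivity(G),
                nx.degree_assortativity_coefficient(G),
                modularity(G, communities, weight="weight"),
            ]

        def train_classifier(connectivity_matrices, clinical_labels):
            X = [graph_features(m) for m in connectivity_matrices]
            model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
            return model.fit(X, clinical_labels)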

  1. The sixth generation robot in space

    NASA Technical Reports Server (NTRS)

    Butcher, A.; Das, A.; Reddy, Y. V.; Singh, H.

    1990-01-01

    The knowledge-based simulator developed in the artificial intelligence laboratory has become a working test bed for experimenting with intelligent reasoning architectures. Recently, small experiments have been carried out with this simulator with the aim of simulating robot behavior that avoids colliding paths. An automatic extension of such experiments to intelligently planning robots in space demands advanced reasoning architectures. One such architecture for general purpose problem solving is explored. The robot, seen as a knowledge base machine, goes via a predesigned abstraction mechanism for problem understanding and response generation. The three phases in one such abstraction scheme are: abstraction for representation, abstraction for evaluation, and abstraction for resolution. Such abstractions require multimodality. This multimodality requires the use of intensional variables to deal with beliefs in the system. Abstraction mechanisms help in synthesizing possible propagating lattices for such beliefs. The machine controller enters into a sixth generation paradigm.

  2. Using Voice Coils to Actuate Modular Soft Robots: Wormbot, an Example

    PubMed Central

    Nemitz, Markus P.; Mihaylov, Pavel; Barraclough, Thomas W.; Ross, Dylan

    2016-01-01

    Abstract In this study, we present a modular worm-like robot, which utilizes voice coils as a new paradigm in soft robot actuation. Drive electronics are incorporated into the actuators, providing a significant improvement in self-sufficiency when compared with existing soft robot actuation modes such as pneumatics or hydraulics. The body plan of this robot is inspired by the phylum Annelida and consists of three-dimensional printed voice coil actuators, which are connected by flexible silicone membranes. Each electromagnetic actuator engages with its neighbor to compress or extend the membrane of each segment, and the sequence in which they are actuated results in an earthworm-inspired peristaltic motion. We find that a minimum of three segments is required for locomotion, but due to our modular design, robots of any length can be quickly and easily assembled. In addition to actuation, voice coils provide audio input and output capabilities. We demonstrate transmission of data between segments by high-frequency carrier waves and, using a similar mechanism, we note that the passing of power between coupled coils in neighboring modules—or from an external power source—is also possible. Voice coils are a convenient multifunctional alternative to existing soft robot actuators. Their self-contained nature and ability to communicate with each other are ideal for modular robotics, and the additional functionality of sound input/output and power transfer will become increasingly useful as soft robots begin the transition from early proof-of-concept systems toward fully functional and highly integrated robotic systems. PMID:28078195

  3. (abstract) Generic Modeling of a Life Support System for Process Technology Comparisons

    NASA Technical Reports Server (NTRS)

    Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.

    1993-01-01

    This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support systems and process technology options for a Lunar Base and a Mars Exploration Mission.

  4. Large-scale diversification of skull shape in domestic dogs: disparity and modularity.

    PubMed

    Drake, Abby Grace; Klingenberg, Christian Peter

    2010-03-01

    Abstract: The variation among domestic dog breeds offers a unique opportunity to study large-scale diversification by microevolutionary mechanisms. We use geometric morphometrics to quantify the diversity of skull shape in 106 breeds of domestic dog, in three wild canid species, and across the order Carnivora. The amount of shape variation among domestic dogs far exceeds that in wild species, and it is comparable to the disparity throughout the Carnivora. The greatest shape distances between dog breeds clearly surpass the maximum divergence between species in the Carnivora. Moreover, domestic dogs occupy a range of novel shapes outside the domain of wild carnivorans. The disparity among companion dogs substantially exceeds that of other classes of breeds, suggesting that relaxed functional demands facilitated diversification. Much of the diversity of dog skull shapes stems from variation between short and elongate skulls and from modularity of the face versus that of the neurocranium. These patterns of integration and modularity apply to variation among individuals and breeds, but they also apply to fluctuating asymmetry, indicating they have a shared developmental basis. These patterns of variation are also found for the wolf and across the Carnivora, suggesting that they existed before the domestication of dogs and are not a result of selective breeding.

  5. Optimizing Automatic Deployment Using Non-functional Requirement Annotations

    NASA Astrophysics Data System (ADS)

    Kugele, Stefan; Haberl, Wolfgang; Tautschnig, Michael; Wechs, Martin

    Model-driven development has become common practice in design of safety-critical real-time systems. High-level modeling constructs help to reduce the overall system complexity apparent to developers. This abstraction caters for fewer implementation errors in the resulting systems. In order to retain correctness of the model down to the software executed on a concrete platform, human faults during implementation must be avoided. This calls for an automatic, unattended deployment process including allocation, scheduling, and platform configuration.

  6. Predicate Abstraction of ANSI-C Programs using SAT

    DTIC Science & Technology

    2003-09-23

  7. Automatic processing of political preferences in the human brain.

    PubMed

    Tusche, Anita; Kahnt, Thorsten; Wisniewski, David; Haynes, John-Dylan

    2013-05-15

    Individual political preferences as expressed, for instance, in votes or donations are fundamental to democratic societies. However, the relevance of deliberative processing for political preferences has been highly debated, putting automatic processes in the focus of attention. Based on this notion, the present study tested whether brain responses reflect participants' preferences for politicians and their associated political parties in the absence of explicit deliberation and attention. Participants were instructed to perform a demanding visual fixation task while their brain responses were measured using fMRI. Occasionally, task-irrelevant images of German politicians from two major competing parties were presented in the background while the distraction task was continued. Subsequent to scanning, participants' political preferences for these politicians and their affiliated parties were obtained. Brain responses in distinct brain areas predicted automatic political preferences at the different levels of abstraction: activation in the ventral striatum was positively correlated with preference ranks for unattended politicians, whereas participants' preferences for the affiliated political parties were reflected in activity in the insula and the cingulate cortex. Using an additional donation task, we showed that the automatic preference-related processing in the brain extended to real-world behavior that involved actual financial loss to participants. Together, these findings indicate that brain responses triggered by unattended and task-irrelevant political images reflect individual political preferences at different levels of abstraction. Copyright © 2013 Elsevier Inc. All rights reserved.

  8. Preparing Electronic Clinical Data for Quality Improvement and Comparative Effectiveness Research: The SCOAP CERTAIN Automation and Validation Project

    PubMed Central

    Devine, Emily Beth; Capurro, Daniel; van Eaton, Erik; Alfonso-Cristancho, Rafael; Devlin, Allison; Yanez, N. David; Yetisgen-Yildiz, Meliha; Flum, David R.; Tarczy-Hornoch, Peter

    2013-01-01

    Background: The field of clinical research informatics includes creation of clinical data repositories (CDRs) used to conduct quality improvement (QI) activities and comparative effectiveness research (CER). Ideally, CDR data are accurately and directly abstracted from disparate electronic health records (EHRs), across diverse health-systems. Objective: Investigators from Washington State’s Surgical Care Outcomes and Assessment Program (SCOAP) Comparative Effectiveness Research Translation Network (CERTAIN) are creating such a CDR. This manuscript describes the automation and validation methods used to create this digital infrastructure. Methods: SCOAP is a QI benchmarking initiative. Data are manually abstracted from EHRs and entered into a data management system. CERTAIN investigators are now deploying Caradigm’s Amalga™ tool to facilitate automated abstraction of data from multiple, disparate EHRs. Concordance is calculated to compare automatically abstracted data with manually abstracted data. Performance measures are calculated between Amalga and each parent EHR. Validation takes place in repeated loops, with improvements made over time. When automated abstraction reaches the current benchmark for abstraction accuracy - 95% - it will ‘go-live’ at each site. Progress to Date: A technical analysis was completed at 14 sites. Five sites are contributing; the remaining sites prioritized meeting Meaningful Use criteria. Participating sites are contributing 15–18 unique data feeds, totaling 13 surgical registry use cases. Common feeds are registration, laboratory, transcription/dictation, radiology, and medications. Approximately 50% of 1,320 designated data elements are being automatically abstracted—25% from structured data; 25% from text mining. Conclusion: In semi-automating data abstraction and conducting a rigorous validation, CERTAIN investigators will semi-automate data collection to conduct QI and CER, while advancing the Learning Healthcare System. PMID:25848565
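
    As a minimal illustration of the concordance check described above (the field names and the 95% go-live benchmark as applied here are only stand-ins for the project's actual data elements and procedure), per-element agreement between an automatically and a manually abstracted record could be computed as follows:

        # Hypothetical sketch: per-element concordance between automated and
        # manually abstracted records, checked against a go-live benchmark.
        def concordance(manual: dict, automated: dict) -> float:
            """Fraction of shared data elements whose automated value matches the manual value."""
            keys = manual.keys() & automated.keys()
            if not keys:
                return 0.0
            agree = sum(1 for k in keys if manual[k] == automated[k])
            return agree / len(keys)

        manual_record = {"age": 54, "procedure": "appendectomy", "asa_class": 2}
        auto_record = {"age": 54, "procedure": "appendectomy", "asa_class": 3}
        score = concordance(manual_record, auto_record)
        print(f"concordance = {score:.2%}, go-live threshold met: {score >= 0.95}")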

  9. Looking at 3,000,000 References Without Growing Grey Hair

    NASA Astrophysics Data System (ADS)

    Demleitner, M.; Accomazzi, A.; Eichhorn, G.; Grant, C. S.; Kurtz, M. J.; Murray, S. S.

    1999-12-01

    The article service of the Astrophysics Data System (ADS, http://adswww.harvard.edu) currently holds about 500,000 pages scanned from astronomical journals and conference proceedings. This data set not only facilitates easy and convenient access to the majority of the astronomical literature from anywhere on the Internet but also allows highly automated extraction of the information contained in the articles. As first steps towards processing and indexing the full texts of the articles, the ADS has been extracting abstracts and references from the bitmap images of the articles since May 1999. In this poster we describe the procedures and strategies to (a) automatically identify the regions within a paper containing the abstract or the references, (b) spot and correct errors in the database or in the identification of the regions, (c) resolve references obtained by optical character recognition (OCR), with its inherent uncertainties, to parsed references (i.e., bibcodes), and (d) incorporate the data collected in this way into the ADS abstract service. We also give an overview of the extent of additional bibliographical material from this source. We estimate that by January 2000, these procedures will have yielded about 14,000 abstracts and 1,000,000 citation pairs (out of a total of 3,000,000 references) not previously present in the ADS.

  10. Assembly of optical fibers for the connection of polymer-based waveguide

    NASA Astrophysics Data System (ADS)

    Ansel, Yannick; Grau, Daniel; Holzki, Markus; Kraus, Silvio; Neumann, Frank; Reinhard, Carsten; Schmitz, Felix

    2003-03-01

    This paper describes the realization of polymer-based optical structures and the assembly and packaging strategy used to connect optical fiber ribbons to the waveguides. To this end, a low-cost fabrication process using the SU-8™ thick photoresist is presented. The process consists of the deposition of two photo-structured resist layers filled with epoxy glue, realizing the waveguide core. For the assembly, a new modular vacuum gripper was realized and installed on an automatic pick-and-place assembly robot to mount the optical fibers precisely and efficiently in the optical structures. First results have shown acceptable optical propagation loss for the complete test structure.

  11. CAD-Based Aerodynamic Design of Complex Configurations using a Cartesian Method

    NASA Technical Reports Server (NTRS)

    Nemec, Marian; Aftosmis, Michael J.; Pulliam, Thomas H.

    2003-01-01

    A modular framework for aerodynamic optimization of complex geometries is developed. By working directly with a parametric CAD system, complex-geometry models are modified and tessellated in an automatic fashion. The use of a component-based Cartesian method significantly reduces the demands on the CAD system, and also provides for robust and efficient flowfield analysis. The optimization is controlled using either a genetic or quasi-Newton algorithm. Parallel efficiency of the framework is maintained even when subject to limited CAD resources by dynamically re-allocating the processors of the flow solver. Overall, the resulting framework can explore designs incorporating large shape modifications and changes in topology.

  12. Optimum spaceborne computer system design by simulation

    NASA Technical Reports Server (NTRS)

    Williams, T.; Weatherbee, J. E.; Taylor, D. S.

    1972-01-01

    A deterministic digital simulation model is described which models the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. Use of the model as a tool in configuring a minimum computer system for a typical mission is demonstrated. The configuration which is developed as a result of studies with the simulator is optimal with respect to the efficient use of computer system resources, i.e., the configuration derived is a minimal one. Other considerations such as increased reliability through the use of standby spares would be taken into account in the definition of a practical system for a given mission.

  13. A Solution Adaptive Technique Using Tetrahedral Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar Z.

    2000-01-01

    An adaptive unstructured grid refinement technique has been developed and successfully applied to several three-dimensional inviscid flow test cases. The method is based on a combination of surface mesh subdivision and local remeshing of the volume grid. Simple functions of flow quantities are employed to detect dominant features of the flowfield. The method is designed for modular coupling with various error/feature analyzers and flow solvers. Several steady-state, inviscid flow test cases are presented to demonstrate the applicability of the method for solving practical three-dimensional problems. In all cases, accurate solutions featuring complex, nonlinear flow phenomena such as shock waves and vortices have been generated automatically and efficiently.

  14. Automatic design of IMA systems

    NASA Astrophysics Data System (ADS)

    Salomon, U.; Reichel, R.

    In recent years, the integrated modular avionics (IMA) design philosophy has become widely established at aircraft manufacturers, giving rise to a series of new design challenges, most notably the allocation of avionics functions to the various IMA components and the placement of this equipment in the aircraft. This paper presents a modelling approach for avionics that allows automation of some steps of the design process by applying an optimisation algorithm which searches for system configurations that fulfil the safety requirements and have low cost. The algorithm has been implemented as a sophisticated software prototype; we therefore also present detailed results of its application to actual avionics systems.
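
    The allocation step described above can be pictured, in a heavily simplified form, as a constrained assignment search. The sketch below is illustrative only and is not the authors' algorithm; the module names, costs, and the separation constraint standing in for the safety requirements are all invented.

        # Brute-force search for a cheapest allocation of avionics functions to
        # IMA modules, subject to a simple redundancy-separation constraint.
        from itertools import product

        functions = ["flight_ctrl_A", "flight_ctrl_B", "fuel_mgmt"]
        modules = {"cpm1": 10, "cpm2": 12, "cpm3": 8}             # module -> cost if used
        separated = {("flight_ctrl_A", "flight_ctrl_B")}          # redundant pair on different modules

        def feasible(alloc):
            return all(alloc[a] != alloc[b] for a, b in separated)

        def cost(alloc):
            return sum(modules[m] for m in set(alloc.values()))   # pay once per module used

        candidates = (dict(zip(functions, assign))
                      for assign in product(modules, repeat=len(functions)))
        best = min((a for a in candidates if feasible(a)), key=cost)
        print(best, "cost =", cost(best))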

  15. Memory interface simulator: A computer design aid

    NASA Technical Reports Server (NTRS)

    Taylor, D. S.; Williams, T.; Weatherbee, J. E.

    1972-01-01

    Results are presented of a study conducted with a digital simulation model being used in the design of the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. The model simulates the activity involved as instructions are fetched from random access memory for execution in one of the system central processing units. A series of model runs measured instruction execution time under various assumptions pertaining to the CPUs and the interface between the CPUs and RAM. Design tradeoffs are presented in the following areas: bus widths, CPU microprogram read-only memory cycle time, multiple instruction fetch, and instruction mix.
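
    The kind of trade-off such a simulator explores can be illustrated with a toy, purely deterministic timing model (the numbers below are illustrative and are not ARMMS parameters): instruction fetch time as a function of bus width and memory cycle time.

        # Toy model: transfers needed per instruction fetch, times the memory cycle time.
        def fetch_time_ns(instr_bits=32, bus_bits=16, mem_cycle_ns=500, instrs_per_fetch=1):
            bits_to_move = instr_bits * instrs_per_fetch
            transfers = -(-bits_to_move // bus_bits)          # ceiling division
            return transfers * mem_cycle_ns / instrs_per_fetch

        for bus in (8, 16, 32, 64):
            print(f"bus width {bus:>2} bits: {fetch_time_ns(bus_bits=bus):7.1f} ns per instruction")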

  16. Interpreting Abstract Interpretations in Membership Equational Logic

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd; Rosu, Grigore

    2001-01-01

    We present a logical framework in which abstract interpretations can be naturally specified and then verified. Our approach is based on membership equational logic, which extends equational logics by membership axioms, asserting that a term has a certain sort. We represent an abstract interpretation as a membership equational logic specification, usually as an overloaded order-sorted signature with membership axioms. It turns out that, for any term, its least sort over this specification corresponds to its most concrete abstract value. Maude implements membership equational logic and provides mechanisms to calculate the least sort of a term efficiently. We first show how Maude can be used to get prototyping of abstract interpretations "for free." Building on the meta-logic facilities of Maude, we further develop a tool that automatically checks an abstract interpretation against a set of user-defined properties. This can be used to select an appropriate abstract interpretation, to characterize the specified loss of information during abstraction, and to compare different abstractions with each other.

  17. A method for automatically abstracting visual documents

    NASA Technical Reports Server (NTRS)

    Rorvig, Mark E.

    1994-01-01

    Visual documents--motion sequences on film, videotape, and digital recording--constitute a major source of information for the Space Agency, as well as all other government and private sector entities. This article describes a method for automatically selecting key frames from visual documents. These frames may in turn be used to represent the total image sequence of visual documents in visual libraries, hypermedia systems, and training. The algorithm reduces 51 minutes of video sequences to 134 frames, a reduction of information in the range of 700:1.
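
    The record does not spell out the selection criterion, so the following is only a hypothetical sketch of one simple key-frame policy (keep a frame whenever it differs sufficiently from the last key frame kept), not the article's actual method:

        import numpy as np

        def select_key_frames(frames, threshold=12.0):
            """Indices of frames differing from the previous key frame by more than `threshold` grey levels on average."""
            key_indices = [0]
            last = frames[0].astype(float)
            for i, frame in enumerate(frames[1:], start=1):
                if np.mean(np.abs(frame.astype(float) - last)) > threshold:
                    key_indices.append(i)
                    last = frame.astype(float)
            return key_indices

        # Synthetic 300-frame "video" made of three static scenes.
        scenes = [np.full((48, 64), v, dtype=np.uint8) for v in (10, 120, 240)]
        video = [frame for scene in scenes for frame in [scene] * 100]
        print(select_key_frames(video))        # -> [0, 100, 200]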

  18. Evaluation of Particle Counter Technology for Detection of Fuel Contamination Detection Utilizing Advanced Aviation Forward Area Refueling System

    DTIC Science & Technology

    2014-01-24

    [Record excerpt consists of report-form fragments: keyword terms (automatic particle counter, cleanliness, free water, diesel) and truncated passages on free-water contamination limits (up to 10 mg/L for product used as diesel for ground use) and on published calibration procedures for automatic particle counters; no complete abstract is available for this entry.]

  19. Threat assessment and sensor management in a modular architecture

    NASA Astrophysics Data System (ADS)

    Page, S. F.; Oldfield, J. P.; Islip, S.; Benfold, B.; Brandon, R.; Thomas, P. A.; Stubbins, D. J.

    2016-10-01

    Many existing asset/area protection systems, for example those deployed to protect critical national infrastructure, are comprised of multiple sensors such as EO/IR, radar, and Perimeter Intrusion Detection Systems (PIDS), loosely integrated with a central Command and Control (C2) system. Whilst some sensors provide automatic event detection and C2 systems commonly provide rudimentary multi-sensor rule based alerting, the performance of such systems is limited by the lack of deep integration and autonomy. As a result, these systems have a high degree of operator burden. To address these challenges, an architectural concept termed "SAPIENT" was conceived. SAPIENT is based on multiple Autonomous Sensor Modules (ASMs) connected to a High-Level Decision Making Module (HLDMM) that provides data fusion, situational awareness, alerting, and sensor management capability. The aim of the SAPIENT concept is to allow for the creation of a surveillance system, in a modular plug-and-play manner, that provides high levels of autonomy, threat detection performance, and reduced operator burden. This paper considers the challenges associated with developing an HLDMM aligned with the SAPIENT concept, through the discussion of the design of a realised HLDMM. Particular focus is drawn to how high levels of system level performance can be achieved whilst retaining modularity and flexibility. A number of key aspects of our HLDMM are presented, including an integrated threat assessment and sensor management framework, threat sequence matching, and ASM trust modelling. The results of real-world testing of the HLDMM, in conjunction with multiple Laser, Radar, and EO/IR sensors, in representative semi-urban environments, are discussed.

  20. Definition of Systematic, Approximately Separable, and Modular Internal Coordinates (SASMIC) for macromolecular simulation.

    PubMed

    Echenique, Pablo; Alonso, J L

    2006-07-30

    A set of rules is defined to systematically number the groups and the atoms of polypeptides in a modular manner. Supported by this numeration, a set of internal coordinates is defined. These coordinates (termed Systematic, Approximately Separable, and Modular Internal Coordinates--SASMIC) are straightforwardly written in Z-matrix form and may be directly implemented in typical Quantum Chemistry packages. A number of Perl scripts that automatically generate the Z-matrix files are provided as supplementary material. The main difference with most Z-matrix-like coordinates normally used in the literature is that normal dihedral angles ("principal dihedrals" in this work) are only used to fix the orientation of whole groups, and a different type of dihedrals, termed "phase dihedrals," are used to describe the covalent structure inside the groups. This physical approach makes it possible to approximately separate soft and hard movements of the molecule using only topological information and to directly implement constraints. As an application, we use the coordinates defined here and ab initio quantum mechanical calculations to assess the commonly assumed approximation of the free energy, obtained from "integrating out" the side chain degree of freedom chi, by the Potential Energy Surface (PES) in the protected dipeptide HCO-L-Ala-NH2. We also present a sub-box of the Hessian matrix in two different sets of coordinates to illustrate the approximate separation of soft and hard movements when the coordinates defined in this work are used. (PACS: 87.14.Ee, 87.15.-v, 87.15.Aa, 87.15.Cc) 2006 Wiley Periodicals, Inc.
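
    For readers unfamiliar with the Z-matrix form mentioned above, the snippet below only illustrates that format on a trivial molecule (water, with placeholder geometry values); it does not implement the SASMIC numeration rules or the supplementary Perl scripts.

        # Each row: symbol, bond reference, bond length, angle reference, angle,
        # dihedral reference, dihedral (None where not applicable).
        atoms = [
            ("O", None, None, None, None,  None, None),
            ("H", 1,    0.96, None, None,  None, None),
            ("H", 1,    0.96, 2,    104.5, None, None),
        ]

        def zmatrix(rows):
            lines = []
            for sym, b, r, a, theta, d, phi in rows:
                fields = [sym]
                if b is not None:
                    fields += [str(b), f"{r:.3f}"]
                if a is not None:
                    fields += [str(a), f"{theta:.2f}"]
                if d is not None:
                    fields += [str(d), f"{phi:.2f}"]
                lines.append("  ".join(fields))
            return "\n".join(lines)

        print(zmatrix(atoms))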

  1. The use of automatic programming techniques for fault tolerant computing systems

    NASA Technical Reports Server (NTRS)

    Wild, C.

    1985-01-01

    It is conjectured that the production of software for ultra-reliable computing systems such as required by Space Station, aircraft, nuclear power plants and the like will require a high degree of automation as well as fault tolerance. In this paper, the relationship between automatic programming techniques and fault tolerant computing systems is explored. Initial efforts in the automatic synthesis of code from assertions to be used for error detection, as well as the automatic generation of assertions and test cases from abstract data type specifications, are outlined. Speculation on the ability to generate truly diverse designs capable of recovery from errors by exploring alternate paths in the program synthesis tree is discussed. Some initial thoughts on the use of knowledge based systems for the global detection of abnormal behavior using expectations and the goal-directed reconfiguration of resources to meet critical mission objectives are given. One of the sources of information for these systems would be the knowledge captured during the automatic programming process.

  2. MAGMA: analysis of two-channel microarrays made easy.

    PubMed

    Rehrauer, Hubert; Zoller, Stefan; Schlapbach, Ralph

    2007-07-01

    The web application MAGMA provides a simple and intuitive interface to identify differentially expressed genes from two-channel microarray data. While the underlying algorithms are not superior to those of similar web applications, MAGMA is particularly user friendly and can be used without prior training. The user interface guides the novice user through the most typical microarray analysis workflow consisting of data upload, annotation, normalization and statistical analysis. It automatically generates R-scripts that document MAGMA's entire data processing steps, thereby allowing the user to regenerate all results in his local R installation. The implementation of MAGMA follows the model-view-controller design pattern that strictly separates the R-based statistical data processing, the web-representation and the application logic. This modular design makes the application flexible and easily extendible by experts in one of the fields: statistical microarray analysis, web design or software development. State-of-the-art Java Server Faces technology was used to generate the web interface and to perform user input processing. MAGMA's object-oriented modular framework makes it easily extendible and applicable to other fields and demonstrates that modern Java technology is also suitable for rather small and concise academic projects. MAGMA is freely available at www.magma-fgcz.uzh.ch.

  3. Toward the First Data Acquisition Standard in Synthetic Biology.

    PubMed

    Sainz de Murieta, Iñaki; Bultelle, Matthieu; Kitney, Richard I

    2016-08-19

    This paper describes the development of a new data acquisition standard for synthetic biology. This comprises the creation of a methodology that is designed to capture all the data, metadata, and protocol information associated with biopart characterization experiments. The new standard, called DICOM-SB, is based on the highly successful Digital Imaging and Communications in Medicine (DICOM) standard in medicine. A data model is described which has been specifically developed for synthetic biology. The model is a modular, extensible data model for the experimental process, which can optimize data storage for large amounts of data. DICOM-SB also includes services orientated toward the automatic exchange of data and information between modalities and repositories. DICOM-SB has been developed in the context of systematic design in synthetic biology, which is based on the engineering principles of modularity, standardization, and characterization. The systematic design approach utilizes the design, build, test, and learn design cycle paradigm. DICOM-SB has been designed to be compatible with and complementary to other standards in synthetic biology, including SBOL. In this regard, the software provides effective interoperability. The new standard has been tested by experiments and data exchange between Nanyang Technological University in Singapore and Imperial College London.

  4. Self mobile space manipulator project

    NASA Technical Reports Server (NTRS)

    Brown, H. Ben; Friedman, Mark; Xu, Yangsheng; Kanade, Takeo

    1992-01-01

    A relatively simple, modular, low mass, low cost robot is being developed for space EVA that is large enough to be independently mobile on a space station or platform exterior, yet versatile enough to accomplish many vital tasks. The robot comprises two long flexible links connected by a rotary joint, with 2-DOF 'wrist' joints and grippers at each end. It walks by gripping pre-positioned attachment points, such as trusswork nodes, and alternately shifting its base of support from one foot (gripper) to the other. The robot can perform useful tasks such as visual inspection, material transport, and light assembly by manipulating objects with one gripper, while stabilizing itself with the other. At SOAR '90, we reported development of 1/3 scale robot hardware, modular trusswork to serve as a locomotion substrate, and a gravity compensation system to allow laboratory tests of locomotion strategies on the horizontal face of the trusswork. In this paper, we report on project progress including the development of: (1) adaptive control for automatic adjustment to loads; (2) enhanced manipulation capabilities; (3) machine vision, including the use of neural nets, to guide autonomous locomotion; (4) locomotion between orthogonal trusswork faces; and (5) improved facilities for gravity compensation and telerobotic control.

  5. Modular techniques for dynamic fault-tree analysis

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    1992-01-01

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.
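
    The combinatorial half of that hybrid approach can be sketched very simply: a static fault tree with independent basic events is evaluated bottom-up, while dynamic gates would be handed to a Markov solver instead, in line with the modular approach described above. The gate structure and probabilities below are invented for illustration.

        from math import prod

        def failure_prob(node, basic):
            """Bottom-up probability of a static fault-tree node; leaves are basic-event names."""
            if isinstance(node, str):
                return basic[node]
            gate, children = node
            probs = [failure_prob(child, basic) for child in children]
            if gate == "AND":
                return prod(probs)
            if gate == "OR":
                return 1.0 - prod(1.0 - p for p in probs)
            raise ValueError(f"unsupported gate {gate!r}")

        basic_events = {"cpu_a": 1e-4, "cpu_b": 1e-4, "bus": 1e-5}
        top = ("OR", [("AND", ["cpu_a", "cpu_b"]), "bus"])   # fails if both CPUs fail, or the bus fails
        print(f"P(top event) = {failure_prob(top, basic_events):.3e}")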

  6. A Reconfigurable Omnidirectional Soft Robot Based on Caterpillar Locomotion.

    PubMed

    Zou, Jun; Lin, Yangqiao; Ji, Chen; Yang, Huayong

    2018-04-01

    A pneumatically powered, reconfigurable omnidirectional soft robot based on caterpillar locomotion is described. The robot is composed of nine modules arranged as a three by three matrix and the length of this matrix is 154 mm. The robot propagates a traveling wave inspired by caterpillar locomotion, and it has all three degrees of freedom on a plane (X, Y, and rotation). The speed of the robot is about 18.5 m/h (two body lengths per minute) and it can rotate at a speed of 1.63°/s. The modules have neodymium-iron-boron (NdFeB) magnets embedded and can be easily replaced or combined into other configurations. Two different configurations are presented to demonstrate the possibilities of the modular structure: (1) by removing some modules, the omnidirectional robot can be reassembled into a form that can crawl in a pipe and (2) two omnidirectional robots can crawl close to each other and be assembled automatically into a bigger omnidirectional robot. Omnidirectional motion is important for soft robots to explore unstructured environments. The modular structure gives the soft robot the ability to cope with the challenges of different environments and tasks.

  7. Localizer: fast, accurate, open-source, and modular software package for superresolution microscopy

    PubMed Central

    Duwé, Sam; Neely, Robert K.; Zhang, Jin

    2012-01-01

    Abstract. We present Localizer, a freely available and open source software package that implements the computational data processing inherent to several types of superresolution fluorescence imaging, such as localization (PALM/STORM/GSDIM) and fluctuation imaging (SOFI/pcSOFI). Localizer delivers high accuracy and performance and comes with a fully featured and easy-to-use graphical user interface but is also designed to be integrated in higher-level analysis environments. Due to its modular design, Localizer can be readily extended with new algorithms as they become available, while maintaining the same interface and performance. We provide front-ends for running Localizer from Igor Pro, Matlab, or as a stand-alone program. We show that Localizer performs favorably when compared with two existing superresolution packages, and to our knowledge is the only freely available implementation of SOFI/pcSOFI microscopy. By dramatically improving the analysis performance and ensuring the easy addition of current and future enhancements, Localizer strongly improves the usability of superresolution imaging in a variety of biomedical studies. PMID:23208219

  8. A Formal Model of Partitioning for Integrated Modular Avionics

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.

    1998-01-01

    The aviation industry is gradually moving toward the use of integrated modular avionics (IMA) for civilian transport aircraft. An important concern for IMA is ensuring that applications are safely partitioned so they cannot interfere with one another. We have investigated the problem of ensuring safe partitioning and logical non-interference among separate applications running on a shared Avionics Computer Resource (ACR). This research was performed in the context of ongoing standardization efforts, in particular, the work of RTCA committee SC-182, and the recently completed ARINC 653 application executive (APEX) interface standard. We have developed a formal model of partitioning suitable for evaluating the design of an ACR. The model draws from the mathematical modeling techniques developed by the computer security community. This report presents a formulation of partitioning requirements expressed first using conventional mathematical notation, then formalized using the language of SRI's Prototype Verification System (PVS). The approach is demonstrated on three candidate designs, each an abstraction of features found in real systems.

  9. BIRI: a new approach for automatically discovering and indexing available public bioinformatics resources from the literature.

    PubMed

    de la Calle, Guillermo; García-Remesal, Miguel; Chiesa, Stefano; de la Iglesia, Diana; Maojo, Victor

    2009-10-07

    The rapid evolution of Internet technologies and the collaborative approaches that dominate the field have stimulated the development of numerous bioinformatics resources. To address this new framework, several initiatives have tried to organize these services and resources. In this paper, we present the BioInformatics Resource Inventory (BIRI), a new approach for automatically discovering and indexing available public bioinformatics resources using information extracted from the scientific literature. The index generated can be automatically updated by adding additional manuscripts describing new resources. We have developed web services and applications to test and validate our approach. It has not been designed to replace current indexes but to extend their capabilities with richer functionalities. We developed a web service to provide a set of high-level query primitives to access the index. The web service can be used by third-party web services or web-based applications. To test the web service, we created a pilot web application to access a preliminary knowledge base of resources. We tested our tool using an initial set of 400 abstracts. Almost 90% of the resources described in the abstracts were correctly classified. More than 500 descriptions of functionalities were extracted. These experiments suggest the feasibility of our approach for automatically discovering and indexing current and future bioinformatics resources. Given the domain-independent characteristics of this tool, it is currently being applied by the authors in other areas, such as medical nanoinformatics. BIRI is available at http://edelman.dia.fi.upm.es/biri/.

  10. An evolutionary computation based algorithm for calculating solar differential rotation by automatic tracking of coronal bright points

    NASA Astrophysics Data System (ADS)

    Shahamatnia, Ehsan; Dorotovič, Ivan; Fonseca, Jose M.; Ribeiro, Rita A.

    2016-03-01

    Developing specialized software tools is essential to support studies of solar activity evolution. With new space missions such as Solar Dynamics Observatory (SDO), solar images are being produced in unprecedented volumes. To capitalize on that huge data availability, the scientific community needs a new generation of software tools for automatic and efficient data processing. In this paper a prototype of a modular framework for solar feature detection, characterization, and tracking is presented. To develop an efficient system capable of automatic solar feature tracking and measuring, a hybrid approach combining specialized image processing, evolutionary optimization, and soft computing algorithms is being followed. The specialized hybrid algorithm for tracking solar features allows automatic feature tracking while gathering characterization details about the tracked features. The hybrid algorithm takes advantages of the snake model, a specialized image processing algorithm widely used in applications such as boundary delineation, image segmentation, and object tracking. Further, it exploits the flexibility and efficiency of Particle Swarm Optimization (PSO), a stochastic population based optimization algorithm. PSO has been used successfully in a wide range of applications including combinatorial optimization, control, clustering, robotics, scheduling, and image processing and video analysis applications. The proposed tool, denoted PSO-Snake model, was already successfully tested in other works for tracking sunspots and coronal bright points. In this work, we discuss the application of the PSO-Snake algorithm for calculating the sidereal rotational angular velocity of the solar corona. To validate the results we compare them with published manual results performed by an expert.
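
    The plain PSO ingredient of that hybrid can be sketched in a few lines; the version below minimises a toy objective and is not the PSO-Snake tracker itself (all parameters are generic defaults):

        import random

        def pso(f, dim=2, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, bound=5.0):
            rnd = random.Random(0)
            pos = [[rnd.uniform(-bound, bound) for _ in range(dim)] for _ in range(n_particles)]
            vel = [[0.0] * dim for _ in range(n_particles)]
            pbest = [p[:] for p in pos]
            pbest_val = [f(p) for p in pos]
            gbest = min(pbest, key=f)[:]
            for _ in range(iters):
                for i in range(n_particles):
                    for d in range(dim):
                        vel[i][d] = (w * vel[i][d]
                                     + c1 * rnd.random() * (pbest[i][d] - pos[i][d])
                                     + c2 * rnd.random() * (gbest[d] - pos[i][d]))
                        pos[i][d] += vel[i][d]
                    val = f(pos[i])
                    if val < pbest_val[i]:
                        pbest[i], pbest_val[i] = pos[i][:], val
                        if val < f(gbest):
                            gbest = pos[i][:]
            return gbest, f(gbest)

        sphere = lambda p: sum(x * x for x in p)
        print(pso(sphere))          # converges towards the origin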

  11. A preliminary architecture for building communication software from traffic captures

    NASA Astrophysics Data System (ADS)

    Acosta, Jaime C.; Estrada, Pedro

    2017-05-01

    Security analysts are tasked with identifying and mitigating network service vulnerabilities. A common problem associated with in-depth testing of network protocols is the availability of software that communicates across disparate protocols. Many times, the software required to communicate with these services is not publicly available. Developing this software is a time-consuming undertaking that requires expertise and understanding of the protocol specification. The work described in this paper aims at developing a software package that is capable of automatically creating communication clients by using packet capture (pcap) and TShark dissectors. Currently, our focus is on simple protocols with fixed fields. The methodologies developed as part of this work will extend to other complex protocols such as the Gateway Load Balancing Protocol (GLBP), Port Aggregation Protocol (PAgP), and Open Shortest Path First (OSPF). Thus far, we have architected a modular pipeline for an automatic traffic-based software generator. We start the transformation of captured network traffic by employing TShark to convert packets into a Packet Details Markup Language (PDML) file. The PDML file contains a parsed, textual, representation of the packet data. Then, we extract field data, types, along with inter and intra-packet dependencies. This information is then utilized to construct an XML file that encompasses the protocol state machine and field vocabulary. Finally, this XML is converted into executable code. Using our methodology, and as a starting point, we have succeeded in automatically generating software that communicates with other hosts using an automatically generated Internet Control Message Protocol (ICMP) client program.
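
    The first stage of that pipeline (capture to PDML to extracted fields) can be sketched as follows; "capture.pcap" is a placeholder file, and the later stages (state-machine XML and code generation) are omitted:

        import subprocess
        import xml.etree.ElementTree as ET

        # Convert a capture to PDML with TShark, then pull out field names and values.
        pdml = subprocess.run(
            ["tshark", "-r", "capture.pcap", "-T", "pdml"],
            capture_output=True, text=True, check=True,
        ).stdout

        root = ET.fromstring(pdml)
        for pkt_no, packet in enumerate(root.iter("packet"), start=1):
            fields = {f.get("name"): f.get("show")
                      for f in packet.iter("field") if f.get("name")}
            print(pkt_no, fields.get("ip.src"), "->", fields.get("ip.dst"))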

  12. Automatic Summarization of MEDLINE Citations for Evidence–Based Medical Treatment: A Topic-Oriented Evaluation

    PubMed Central

    Fiszman, Marcelo; Demner-Fushman, Dina; Kilicoglu, Halil; Rindflesch, Thomas C.

    2009-01-01

    As the number of electronic biomedical textual resources increases, it becomes harder for physicians to find useful answers at the point of care. Information retrieval applications provide access to databases; however, little research has been done on using automatic summarization to help navigate the documents returned by these systems. After presenting a semantic abstraction automatic summarization system for MEDLINE citations, we concentrate on evaluating its ability to identify useful drug interventions for fifty-three diseases. The evaluation methodology uses existing sources of evidence-based medicine as surrogates for a physician-annotated reference standard. Mean average precision (MAP) and a clinical usefulness score developed for this study were computed as performance metrics. The automatic summarization system significantly outperformed the baseline in both metrics. The MAP gain was 0.17 (p < 0.01) and the increase in the overall score of clinical usefulness was 0.39 (p < 0.05). PMID:19022398
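
    As a reminder of how the MAP metric reported above is computed (the ranked drug lists and relevance sets below are invented, not the study's data):

        def average_precision(ranked, relevant):
            hits, score = 0, 0.0
            for rank, item in enumerate(ranked, start=1):
                if item in relevant:
                    hits += 1
                    score += hits / rank
            return score / len(relevant) if relevant else 0.0

        def mean_average_precision(runs):
            return sum(average_precision(r, rel) for r, rel in runs) / len(runs)

        runs = [
            (["metformin", "aspirin", "insulin"], {"metformin", "insulin"}),
            (["statin", "warfarin"], {"warfarin"}),
        ]
        print(f"MAP = {mean_average_precision(runs):.3f}")       # 0.667 for this toy example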

  13. Dynamic Information and Library Processing.

    ERIC Educational Resources Information Center

    Salton, Gerard

    This book provides an introduction to automated information services: collection, analysis, classification, storage, retrieval, transmission, and dissemination. An introductory chapter is followed by an overview of mechanized processes for acquisitions, cataloging, and circulation. Automatic indexing and abstracting methods are covered, followed…

  14. Reaction Mechanism Generator: Automatic construction of chemical kinetic mechanisms

    DOE PAGES

    Gao, Connie W.; Allen, Joshua W.; Green, William H.; ...

    2016-02-24

    Reaction Mechanism Generator (RMG) constructs kinetic models composed of elementary chemical reaction steps using a general understanding of how molecules react. Species thermochemistry is estimated through Benson group additivity and reaction rate coefficients are estimated using a database of known rate rules and reaction templates. At its core, RMG relies on two fundamental data structures: graphs and trees. Graphs are used to represent chemical structures, and trees are used to represent thermodynamic and kinetic data. Models are generated using a rate-based algorithm which excludes species from the model based on reaction fluxes. RMG can generate reaction mechanisms for species involving carbon, hydrogen, oxygen, sulfur, and nitrogen. It also has capabilities for estimating transport and solvation properties, and it automatically computes pressure-dependent rate coefficients and identifies chemically-activated reaction paths. RMG is an object-oriented program written in Python, which provides a stable, robust programming architecture for developing an extensible and modular code base with a large suite of unit tests. Computationally intensive functions are cythonized for speed improvements.
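
    The rate-based enlargement idea can be caricatured in a few lines. This is only a schematic (in RMG the characteristic rate comes from the core reaction system and the tolerance handling is more involved); the species names and fluxes below are invented.

        def enlarge(core, edge_fluxes, tol=0.1):
            """Promote edge species whose formation flux exceeds tol times a characteristic flux."""
            if not edge_fluxes:
                return set(core)
            char_flux = max(edge_fluxes.values())        # stand-in for the core characteristic rate
            promoted = {s for s, flux in edge_fluxes.items() if flux >= tol * char_flux}
            return set(core) | promoted

        core = {"CH4", "O2"}
        edge = {"CH3": 4.2e-3, "HO2": 3.9e-4, "CH3O2": 1.1e-5}
        print(sorted(enlarge(core, edge)))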

  15. Integrating personalized medical test contents with XML and XSL-FO.

    PubMed

    Toddenroth, Dennis; Dugas, Martin; Frankewitsch, Thomas

    2011-03-01

    In 2004 the adoption of a modular curriculum at the medical faculty in Muenster led to the introduction of centralized examinations based on multiple-choice questions (MCQs). We report on how organizational challenges of realizing faculty-wide personalized tests were addressed by implementation of a specialized software module to automatically generate test sheets from individual test registrations and MCQ contents. Key steps of the presented method for preparing personalized test sheets are (1) the compilation of relevant item contents and graphical media from a relational database with database queries, (2) the creation of Extensible Markup Language (XML) intermediates, and (3) the transformation into paginated documents. Using an open-source print formatter, the software module consistently produced high-quality test sheets, while the blending of vectorized textual contents and pixel graphics resulted in efficient output file sizes. Concomitantly, the module permitted individual randomization of item sequences to prevent illicit collusion. The automatic generation of personalized MCQ test sheets is feasible using freely available open source software libraries, and can be efficiently deployed on a faculty-wide scale.
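
    Step (2), the per-registration XML intermediate with a randomised item order, might look roughly like the sketch below (element names and items are invented; the XSL-FO transformation of step (3) is not shown):

        import random
        import xml.etree.ElementTree as ET

        items = [
            {"id": "q1", "stem": "Leading cause of X?", "choices": ["A", "B", "C", "D"]},
            {"id": "q2", "stem": "First-line therapy for Y?", "choices": ["A", "B", "C", "D"]},
        ]

        def personalized_sheet(student_id, items):
            order = items[:]
            random.Random(student_id).shuffle(order)       # reproducible per-student item order
            sheet = ET.Element("testsheet", student=student_id)
            for item in order:
                q = ET.SubElement(sheet, "question", id=item["id"])
                ET.SubElement(q, "stem").text = item["stem"]
                for choice in item["choices"]:
                    ET.SubElement(q, "choice").text = choice
            return ET.tostring(sheet, encoding="unicode")

        print(personalized_sheet("student-0815", items))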

  16. FALCON: A distributed scheduler for MIMD architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grimshaw, A.S.; Vivas, V.E. Jr.

    1991-01-01

    This paper describes FALCON (Fully Automatic Load COordinator for Networks), the scheduler for the Mentat parallel processing system. FALCON has a modular structure and is designed for systems that use a task scheduling mechanism. FALCON is distributed, stable, supports system heterogeneities, and employs a sender-initiated adaptive load sharing policy with static task assignment. FALCON is parameterizable and is implemented in Mentat, a working distributed system. We present the design and implementation of FALCON as well as a brief introduction to those features of the Mentat run-time system that influence FALCON. Performance measures under different scheduler configurations are also presented and analyzed with respect to the system parameters. 36 refs., 8 figs.
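
    A sender-initiated policy of the kind mentioned above can be sketched as: when the local queue grows past a threshold, probe a few random hosts and hand the task to the first lightly loaded one, otherwise keep it. The thresholds and probe logic below are illustrative and are not FALCON's actual policy.

        import random

        class Host:
            def __init__(self, name):
                self.name, self.queue = name, []
            def load(self):
                return len(self.queue)

        def submit(task, local, hosts, threshold=3, probes=2):
            """Static sender-initiated placement: once placed, a task is never migrated."""
            if local.load() < threshold:
                local.queue.append(task)
                return local.name
            remote = [h for h in hosts if h is not local]
            for host in random.sample(remote, k=min(probes, len(remote))):
                if host.load() < threshold:
                    host.queue.append(task)
                    return host.name
            local.queue.append(task)                       # no taker found: keep it local
            return local.name

        cluster = [Host(f"node{i}") for i in range(4)]
        for t in range(12):
            print(f"task{t}", "->", submit(f"task{t}", cluster[0], cluster))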

  17. Programmable Cadence Timer

    NASA Technical Reports Server (NTRS)

    Hall, William A.; Gilbert, John

    1990-01-01

    Electronic metronome paces users through wide range of exercise routines. Conceptual programmable cadence timer provides rhythmic aural and visual cues. Timer automatically changes cadence according to program entered by the user. It also functions as clock, stopwatch, or alarm. Modular pacer operated as single unit or as two units. With audiovisual module moved away from base module, user concentrates on exercise cues without distraction from information appearing on the liquid-crystal display. Variety of uses in rehabilitative medicine, experimental medicine, sports, and gymnastics. Used in intermittent positive-pressure breathing treatment, in which patient must rhythmically inhale and retain medication delivered under positive pressure; and in incentive spirometer treatment, in which patient must inhale maximally at regular intervals.

  18. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    NASA Technical Reports Server (NTRS)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and data transfer links between them. This creative modular architecture enables rapid generation of input stream for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation - the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detail design of product development process.

  19. Implementation of an optimum profile guidance system on STOLAND

    NASA Technical Reports Server (NTRS)

    Flanagan, P. F.

    1978-01-01

    The implementation on the STOLAND airborne digital computer of an optimum profile guidance system for the augmentor wing jet STOL research aircraft is described. Major tasks were to implement the guidance and control logic in airborne computer software and to integrate the module with the existing STOLAND navigation, display, and autopilot routines. The optimum profile guidance system comprises an algorithm for synthesizing minimum fuel trajectories for a wide range of starting positions in the terminal area and a control law for flying the aircraft automatically along the trajectory. The avionics software developed is described along with a FORTRAN program that was constructed to reflect the modular nature and algorithms implemented in the avionics software.

  20. Irrelevance in Problem Solving

    NASA Technical Reports Server (NTRS)

    Levy, Alon Y.

    1992-01-01

    The notion of irrelevance underlies many different works in AI, such as detecting redundant facts, creating abstraction hierarchies and reformulations, and modeling physical devices. However, in order to design problem solvers that exploit the notion of irrelevance, either by automatically detecting irrelevance or by being given knowledge about irrelevance, a formal treatment of the notion is required. In this paper we present a general framework for analyzing irrelevance. We discuss several properties of irrelevance and show how they vary in a space of definitions outlined by the framework. We show how irrelevance claims can be used to justify the creation of abstractions, thereby suggesting a new view of the work on abstraction.

  1. Automated Tumor Registry for Oncology. A VA-DHCP MUMPS application.

    PubMed

    Richie, S

    1992-01-01

    The VA Automated Tumor Registry for Oncology, Version 2, is a multifaceted, completely automated user-friendly cancer database. Easy to use modules include: Automatic Casefinding; Suspense Files; Abstracting and Printing; Follow-up; Annual Reports; Statistical Reports; Utility Functions.

  2. Automatic imitation in a strategic context: players of rock–paper–scissors imitate opponents' gestures†

    PubMed Central

    Cook, Richard; Bird, Geoffrey; Lünser, Gabriele; Huck, Steffen; Heyes, Cecilia

    2012-01-01

    A compelling body of evidence indicates that observing a task-irrelevant action makes the execution of that action more likely. However, it remains unclear whether this ‘automatic imitation’ effect is indeed automatic or whether the imitative action is voluntary. The present study tested the automaticity of automatic imitation by asking whether it occurs in a strategic context where it reduces payoffs. Participants were required to play rock–paper–scissors, with the aim of achieving as many wins as possible, while either one or both players were blindfolded. While the frequency of draws in the blind–blind condition was precisely that expected at chance, the frequency of draws in the blind–sighted condition was significantly elevated. Specifically, the execution of either a rock or scissors gesture by the blind player was predictive of an imitative response by the sighted player. That automatic imitation emerges in a context where imitation reduces payoffs accords with its ‘automatic’ description, and implies that these effects are more akin to involuntary than to voluntary actions. These data represent the first evidence of automatic imitation in a strategic context, and challenge the abstraction from physical aspects of social interaction typical in economic and game theory. PMID:21775334

  3. Solving the AI Planning Plus Scheduling Problem Using Model Checking via Automatic Translation from the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL)

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Munoz, Cesar A.; Siminiceanu, Radu I.

    2007-01-01

    This paper describes a translator from a new planning language named the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL) model checker. This translator has been developed in support of the Spacecraft Autonomy for Vehicles and Habitats (SAVH) project sponsored by the Exploration Technology Development Program, which is seeking to mature autonomy technology for the vehicles and operations centers of Project Constellation.

  4. BRENDA in 2013: integrated reactions, kinetic data, enzyme function data, improved disease classification: new options and contents in BRENDA.

    PubMed

    Schomburg, Ida; Chang, Antje; Placzek, Sandra; Söhngen, Carola; Rother, Michael; Lang, Maren; Munaretto, Cornelia; Ulas, Susanne; Stelzer, Michael; Grote, Andreas; Scheer, Maurice; Schomburg, Dietmar

    2013-01-01

    The BRENDA (BRaunschweig ENzyme DAtabase) enzyme portal (http://www.brenda-enzymes.org) is the main information system of functional biochemical and molecular enzyme data and provides access to seven interconnected databases. BRENDA contains 2.7 million manually annotated data on enzyme occurrence, function, kinetics and molecular properties. Each entry is connected to a reference and the source organism. Enzyme ligands are stored with their structures and can be accessed via their names, synonyms or via a structure search. FRENDA (Full Reference ENzyme DAta) and AMENDA (Automatic Mining of ENzyme DAta) are based on text mining methods and represent a complete survey of PubMed abstracts with information on enzymes in different organisms, tissues or organelles. The supplemental database DRENDA provides more than 910 000 new EC number-disease relations in more than 510 000 references from automatic search and a classification of enzyme-disease-related information. KENDA (Kinetic ENzyme DAta), a new amendment, extracts and displays kinetic values from PubMed abstracts. The integration of the EnzymeDetector offers an automatic comparison, evaluation and prediction of enzyme function annotations for prokaryotic genomes. The biochemical reaction database BKM-react contains non-redundant enzyme-catalysed and spontaneous reactions and was developed to facilitate and accelerate the construction of biochemical models.

  5. Automated Tumor Registry for Oncology. A VA-DHCP MUMPS application.

    PubMed Central

    Richie, S.

    1992-01-01

    The VA Automated Tumor Registry for Oncology, Version 2, is a multifaceted, completely automated user-friendly cancer database. Easy to use modules include: Automatic Casefinding; Suspense Files; Abstracting and Printing; Follow-up; Annual Reports; Statistical Reports; Utility Functions. PMID:1482866

  6. An Ada implementation of the network manager for the advanced information processing system

    NASA Technical Reports Server (NTRS)

    Nagle, Gail A.

    1986-01-01

    From an implementation standpoint, the Ada language provided many features which facilitated the data and procedure abstraction process. The language supported a design which was dynamically flexible (despite strong typing), modular, and self-documenting. Adequate training of programmers requires access to an efficient compiler which supports full Ada. When the performance issues for real time processing are finally addressed by more stringent requirements for tasking features and the development of efficient run-time environments for embedded systems, the full power of the language will be realized.

  7. Machine‐Assisted Organic Synthesis

    PubMed Central

    Fitzpatrick, Daniel E.; Myers, Rebecca M.; Battilocchio, Claudio; Ingham, Richard J.

    2015-01-01

    Abstract In this Review we describe how the advent of machines is impacting on organic synthesis programs, with particular emphasis on the practical issues associated with the design of chemical reactors. In the rapidly changing, multivariant environment of the research laboratory, equipment needs to be modular to accommodate high and low temperatures and pressures, enzymes, multiphase systems, slurries, gases, and organometallic compounds. Additional technologies have been developed to facilitate more specialized reaction techniques such as electrochemical and photochemical methods. All of these areas create both opportunities and challenges during adoption as enabling technologies. PMID:26193360

  8. Continuous system modeling

    NASA Technical Reports Server (NTRS)

    Cellier, Francois E.

    1991-01-01

    A comprehensive and systematic introduction is presented for the concepts associated with 'modeling', involving the transition from a physical system down to an abstract description of that system in the form of a set of differential and/or difference equations, and basing its treatment of modeling on the mathematics of dynamical systems. Attention is given to the principles of passive electrical circuit modeling, planar mechanical systems modeling, hierarchical modular modeling of continuous systems, and bond-graph modeling. Also discussed are modeling in equilibrium thermodynamics, population dynamics, and system dynamics, inductive reasoning, artificial neural networks, and automated model synthesis.

  9. Automated Modular Magnetic Resonance Imaging Clinical Decision Support System (MIROR): An Application in Pediatric Cancer Diagnosis

    PubMed Central

    Zarinabad, Niloufar; Meeus, Emma M; Manias, Karen; Foster, Katharine

    2018-01-01

    Background Advances in magnetic resonance imaging and the introduction of clinical decision support systems have underlined the need for an analysis tool to extract and analyze relevant information from magnetic resonance imaging data to aid decision making, prevent errors, and enhance health care. Objective The aim of this study was to design and develop a modular medical image region of interest analysis tool and repository (MIROR) for automatic processing, classification, evaluation, and representation of advanced magnetic resonance imaging data. Methods The clinical decision support system was developed and evaluated for diffusion-weighted imaging of body tumors in children (cohort of 48 children, with 37 malignant and 11 benign tumors). Mevislab software and Python have been used for the development of MIROR. Regions of interest were drawn around benign and malignant body tumors on different diffusion parametric maps, and extracted information was used to discriminate the malignant tumors from benign tumors. Results Using MIROR, the various histogram parameters derived for each tumor case, when compared with the information in the repository, provided additional information for tumor characterization and facilitated the discrimination between benign and malignant tumors. Clinical decision support system cross-validation showed high sensitivity and specificity in discriminating between these tumor groups using histogram parameters. Conclusions MIROR, as a diagnostic tool and repository, allowed the interpretation and analysis of magnetic resonance imaging images to be more accessible and comprehensive for clinicians. It aims to increase clinicians’ skillset by introducing newer techniques and up-to-date findings to their repertoire and make information from previous cases available to aid decision making. The modular-based format of the tool allows integration of analyses that are not readily available clinically and streamlines future developments. PMID:29720361
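
    The histogram parameters referred to above are straightforward to compute once a region of interest has been drawn; the snippet below does this on a synthetic diffusion map (values, ROI, and parameter list are invented for illustration):

        import numpy as np

        rng = np.random.default_rng(1)
        adc_map = rng.normal(1.1e-3, 2e-4, size=(64, 64))      # synthetic ADC-like map
        roi_mask = np.zeros((64, 64), dtype=bool)
        roi_mask[20:40, 25:45] = True                          # stand-in for a hand-drawn ROI

        roi = adc_map[roi_mask]
        params = {
            "mean": roi.mean(),
            "median": np.median(roi),
            "p10": np.percentile(roi, 10),
            "p90": np.percentile(roi, 90),
            "skewness": ((roi - roi.mean()) ** 3).mean() / roi.std() ** 3,
        }
        print({k: round(float(v), 6) for k, v in params.items()})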

  10. Software Toolbox Development for Rapid Earthquake Source Optimisation Combining InSAR Data and Seismic Waveforms

    NASA Astrophysics Data System (ADS)

    Isken, Marius P.; Sudhaus, Henriette; Heimann, Sebastian; Steinberg, Andreas; Bathke, Hannes M.

    2017-04-01

    We present a modular open-source software framework (pyrocko, kite, grond; http://pyrocko.org) for rapid InSAR data post-processing and modelling of tectonic and volcanic displacement fields derived from satellite data. Our aim is to ease and streamline the joint optimisation of earthquake observations from InSAR and GPS data together with seismological waveforms for an improved estimation of the ruptures' parameters. Through this approach we can provide finite models of earthquake ruptures and therefore contribute to a timely and better understanding of earthquake kinematics. The new kite module enables a fast processing of unwrapped InSAR scenes for source modelling: the spatial sub-sampling and data error/noise estimation for the interferogram is evaluated automatically and interactively. The rupture's near-field surface displacement data are then combined with seismic far-field waveforms and jointly modelled using the pyrocko.gf framework, which allows for fast forward modelling based on pre-calculated elastodynamic and elastostatic Green's functions. Lastly, the grond module supplies a bootstrap-based probabilistic (Monte Carlo) joint optimisation to estimate the parameters and uncertainties of a finite-source earthquake rupture model. We describe the developed and applied methods as an effort to establish a semi-automatic processing and modelling chain. The framework is applied to Sentinel-1 data from the 2016 Central Italy earthquake sequence, where we present the earthquake mechanism and rupture model from which we derive regions of increased coulomb stress. The open source software framework is developed at GFZ Potsdam and at the University of Kiel, Germany, it is written in Python and C programming languages. The toolbox architecture is modular and independent, and can be utilized flexibly for a variety of geophysical problems. This work is conducted within the BridGeS project (http://www.bridges.uni-kiel.de) funded by the German Research Foundation DFG through an Emmy-Noether grant.
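
    The spatial sub-sampling step mentioned above can be illustrated with a generic quadtree reduction of a displacement grid (this is only a toy stand-in; the kite module implements its own, more capable version):

        import numpy as np

        def quadtree(displ, y0=0, x0=0, var_max=1e-6, min_size=4):
            """Split a tile until its variance is small, then keep one mean value per leaf."""
            h, w = displ.shape
            if displ.var() <= var_max or h <= min_size or w <= min_size:
                return [(y0, x0, h, w, float(displ.mean()))]
            hh, hw = h // 2, w // 2
            leaves = []
            for ys, xs in ((slice(0, hh), slice(0, hw)), (slice(0, hh), slice(hw, w)),
                           (slice(hh, h), slice(0, hw)), (slice(hh, h), slice(hw, w))):
                leaves += quadtree(displ[ys, xs], y0 + ys.start, x0 + xs.start, var_max, min_size)
            return leaves

        field = np.zeros((64, 64))
        field[32:, 32:] = 0.05                                 # synthetic step in line-of-sight displacement
        samples = quadtree(field)
        print(len(samples), "quadtree samples instead of", field.size, "pixels")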

  11. Web Based Seismological Monitoring (wbsm)

    NASA Astrophysics Data System (ADS)

    Giudicepietro, F.; Meglio, V.; Romano, S. P.; de Cesare, W.; Ventre, G.; Martini, M.

    Over the last few decades, seismological monitoring systems have improved dramatically thanks to technological advancements and to the scientific progress of seismological studies. The most modern processing systems use network technologies to achieve high-quality performance in data transmission and remote control. Their architecture is designed to favor real-time signal analysis. This is usually realized by adopting a modular structure that allows any new calculation algorithm to be integrated easily, without affecting the other system functionalities. A further step in the evolution of seismic processing systems is the large-scale use of web-based applications. Web technologies can usefully support monitoring activities by allowing the results of signal processing to be published automatically and by favoring remote access to data, software systems, and instrumentation. An application of web technologies to seismological monitoring has been developed at the "Osservatorio Vesuviano" monitoring center (INGV) in collaboration with the "Dipartimento di Informatica e Sistemistica" of the University of Naples. A system named Web Based Seismological Monitoring (WBSM) has been developed. Its main objective is to automatically publish the results of seismic event processing and to allow seismic data to be displayed, analyzed, and downloaded via the Internet. WBSM uses XML technology for the representation of hypocentral and picking parameters and creates a seismic event database containing parametric data and waveforms. In order to provide tools for evaluating the quality and reliability of the published locations, WBSM also supplies all the quality parameters calculated by the locating program and allows interactive display of the waveforms and the related parameters. WBSM is a modular system in which the interface function to the data sources is performed by two specific modules, so that, to make it work in conjunction with a generic data source, it is sufficient to modify or substitute the interface modules. WBSM has been running at the "Osservatorio Vesuviano" monitoring center since the beginning of 2001 and can be visited at http://ov.ingv.it.
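
    The XML representation of hypocentral and picking parameters mentioned above might, in a heavily simplified form, look like the sketch below (element names, station codes, and values are invented for illustration):

        import xml.etree.ElementTree as ET

        event = ET.Element("event", id="ov-2001-0001")
        hypo = ET.SubElement(event, "hypocenter")
        for tag, value in (("origin_time", "2001-03-12T04:21:07Z"),
                           ("latitude", "40.821"), ("longitude", "14.426"),
                           ("depth_km", "2.3"), ("magnitude", "1.8"), ("rms", "0.12")):
            ET.SubElement(hypo, tag).text = value
        picks = ET.SubElement(event, "picks")
        ET.SubElement(picks, "pick", station="BKE", phase="P", time="2001-03-12T04:21:08.4Z")
        ET.SubElement(picks, "pick", station="OVO", phase="S", time="2001-03-12T04:21:09.9Z")
        print(ET.tostring(event, encoding="unicode"))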

  12. “Candidatus Paraporphyromonas polyenzymogenes” encodes multi-modular cellulases linked to the type IX secretion system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naas, A. E.; Solden, L. M.; Norbeck, A. D.

    Background: In nature, obligate herbivorous ruminants have a close symbiotic relationship with their gastrointestinal microbiome, which proficiently deconstructs plant biomass. Despite decades of research, lignocellulose degradation in the rumen has thus far been attributed to a limited number of culturable microorganisms. Here we combine meta-omics and enzymology to identify and describe a novel Bacteroidetes family (“Candidatus MH11”) composed entirely of uncultivated strains that are predominant in ruminants and only distantly related to previously characterized taxa. Results: The first metabolic reconstruction of Ca. MH11-affiliated genome bins, with a particular focus on the provisionally named “Candidatus Paraporphyromonas polyenzymogenes”, illustrated their capacity to degrade various lignocellulosic substrates via comprehensive inventories of singular and multi-modular carbohydrate active enzymes (CAZymes). Closer examination revealed an absence of archetypical polysaccharide utilization loci found in human gut microbiota. Instead, we identified many multi-modular CAZymes putatively secreted via the Bacteroidetes-specific type IX secretion system (T9SS). This included cellulases with two or more catalytic domains, which are modular arrangements that are unique to Bacteroidetes species studied to date. Core metabolic proteins from Ca. P. polyenzymogenes were detected in metaproteomic data and were enriched in rumen-incubated plant biomass, indicating that active saccharification and fermentation of complex carbohydrates could be assigned to members of this novel family. Biochemical analysis of selected Ca. P. polyenzymogenes CAZymes further iterated the cellulolytic activity of this hitherto uncultured bacterium towards linear polymers, such as amorphous and crystalline cellulose as well as mixed linkage β-glucans. Conclusion: We propose that Ca. P. polyenzymogenes genotypes and other Ca. MH11 members actively degrade plant biomass in the rumen of cows, sheep and most likely other ruminants, utilizing singular and multi-domain catalytic CAZymes secreted through the T9SS. The discovery of a prominent role of multi-modular cellulases in the Gram-negative Bacteroidetes, together with similar findings for Gram-positive cellulosomal bacteria (Ruminococcus flavefaciens) and anaerobic fungi (Orpinomyces sp.), suggests that complex enzymes are essential and have evolved within all major cellulolytic dominions inherent to the rumen.

  13. “Candidatus Paraporphyromonas polyenzymogenes” encodes multi-modular cellulases linked to the type IX secretion system

    DOE PAGES

    Naas, A. E.; Solden, L. M.; Norbeck, A. D.; ...

    2018-03-01

    Background: In nature, obligate herbivorous ruminants have a close symbiotic relationship with their gastrointestinal microbiome, which proficiently deconstructs plant biomass. Despite decades of research, lignocellulose degradation in the rumen has thus far been attributed to a limited number of culturable microorganisms. Here we combine meta-omics and enzymology to identify and describe a novel Bacteroidetes family (“Candidatus MH11”) composed entirely of uncultivated strains that are predominant in ruminants and only distantly related to previously characterized taxa. Results: The first metabolic reconstruction of Ca. MH11-affiliated genome bins, with a particular focus on the provisionally named “Candidatus Paraporphyromonas polyenzymogenes”, illustrated their capacity to degrade various lignocellulosic substrates via comprehensive inventories of singular and multi-modular carbohydrate active enzymes (CAZymes). Closer examination revealed an absence of archetypical polysaccharide utilization loci found in human gut microbiota. Instead, we identified many multi-modular CAZymes putatively secreted via the Bacteroidetes-specific type IX secretion system (T9SS). This included cellulases with two or more catalytic domains, which are modular arrangements that are unique to Bacteroidetes species studied to date. Core metabolic proteins from Ca. P. polyenzymogenes were detected in metaproteomic data and were enriched in rumen-incubated plant biomass, indicating that active saccharification and fermentation of complex carbohydrates could be assigned to members of this novel family. Biochemical analysis of selected Ca. P. polyenzymogenes CAZymes further iterated the cellulolytic activity of this hitherto uncultured bacterium towards linear polymers, such as amorphous and crystalline cellulose as well as mixed linkage β-glucans. Conclusion: We propose that Ca. P. polyenzymogenes genotypes and other Ca. MH11 members actively degrade plant biomass in the rumen of cows, sheep and most likely other ruminants, utilizing singular and multi-domain catalytic CAZymes secreted through the T9SS. The discovery of a prominent role of multi-modular cellulases in the Gram-negative Bacteroidetes, together with similar findings for Gram-positive cellulosomal bacteria (Ruminococcus flavefaciens) and anaerobic fungi (Orpinomyces sp.), suggests that complex enzymes are essential and have evolved within all major cellulolytic dominions inherent to the rumen.

  14. Presentation of Repeated Phrases in a Computer-Assisted Abstracting Tool Kit.

    ERIC Educational Resources Information Center

    Craven, Timothy C.

    2001-01-01

    Discusses automatic indexing methods and describes the development of a prototype computerized abstractor's assistant. Highlights include the text network management system, TEXNET; phrase selection that follows indexing; phrase display, including Boolean capabilities; results of preliminary testing; and availability of TEXNET software. (LRW)
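
    As a rough illustration of the repeated-phrase idea, the snippet below counts recurring word n-grams in a text; it is a toy sketch in Python, not the TEXNET implementation.

        from collections import Counter
        import re

        def repeated_phrases(text, n_words=2, min_count=2):
            """Return multi-word phrases occurring at least min_count times."""
            tokens = re.findall(r"[a-z]+", text.lower())
            grams = zip(*(tokens[i:] for i in range(n_words)))
            counts = Counter(" ".join(g) for g in grams)
            return [(p, c) for p, c in counts.most_common() if c >= min_count]

        sample = ("automatic indexing methods and automatic indexing tools "
                  "support the abstracting process; automatic indexing is fast")
        print(repeated_phrases(sample))   # [('automatic indexing', 3)]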

  15. EARLINET Single Calculus Chain - technical - Part 1: Pre-processing of raw lidar data

    NASA Astrophysics Data System (ADS)

    D'Amico, Giuseppe; Amodeo, Aldo; Mattis, Ina; Freudenthaler, Volker; Pappalardo, Gelsomina

    2016-02-01

    In this paper we describe an automatic tool for the pre-processing of aerosol lidar data called ELPP (EARLINET Lidar Pre-Processor). It is one of two calculus modules of the EARLINET Single Calculus Chain (SCC), the automatic tool for the analysis of EARLINET data. ELPP is an open source module that executes instrumental corrections and data handling of the raw lidar signals, making the lidar data ready to be processed by the optical retrieval algorithms. According to the specific lidar configuration, ELPP automatically performs dead-time correction, atmospheric and electronic background subtraction, gluing of lidar signals, and trigger-delay correction. Moreover, the signal-to-noise ratio of the pre-processed signals can be improved by means of configurable time integration of the raw signals and/or spatial smoothing. ELPP delivers the statistical uncertainties of the final products by means of error propagation or Monte Carlo simulations. During the development of ELPP, particular attention has been paid to making the tool flexible enough to handle all lidar configurations currently used within the EARLINET community. Moreover, it has been designed in a modular way to allow an easy extension to lidar configurations not yet implemented. The primary goal of ELPP is to enable the application of quality-assured procedures in the lidar data analysis starting from the raw lidar data. This provides the added value of full traceability of each delivered lidar product. Several tests have been performed to check the proper functioning of ELPP. The whole SCC has been tested with the same synthetic data sets, which were used for the EARLINET algorithm inter-comparison exercise. ELPP has been successfully employed for the automatic near-real-time pre-processing of the raw lidar data measured during several EARLINET inter-comparison campaigns as well as during intense field campaigns.
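
    Two of the pre-processing steps named above, background subtraction and Monte Carlo propagation of the resulting uncertainty, are illustrated in the toy sketch below. The synthetic profile, the far-range background estimate and the simple noise model are assumptions made for the example; this is not the ELPP code.

        import numpy as np

        rng = np.random.default_rng(1)

        # synthetic raw lidar profile: range-squared signal decay plus a constant background
        ranges = np.linspace(100.0, 15000.0, 500)              # m
        raw = 1e8 / ranges**2 + 5e-4 + rng.normal(0, 2e-5, ranges.size)

        # background estimated from the far range gates, where the signal is negligible
        bg_estimate = raw[-50:].mean()
        bg_sigma = raw[-50:].std(ddof=1) / np.sqrt(50)
        corrected = raw - bg_estimate

        # Monte Carlo propagation of the background uncertainty into the corrected profile
        n_mc = 200
        ensemble = raw[None, :] - rng.normal(bg_estimate, bg_sigma, (n_mc, 1))
        corrected_sigma = ensemble.std(axis=0)

        print("background:", bg_estimate, "+/-", bg_sigma)
        print("corrected signal at first gate:", corrected[0], "+/-", corrected_sigma[0])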

  16. Feasibility of Automatic Extraction of Electronic Health Data to Evaluate a Status Epilepticus Clinical Protocol.

    PubMed

    Hafeez, Baria; Paolicchi, Juliann; Pon, Steven; Howell, Joy D; Grinspan, Zachary M

    2016-05-01

    Status epilepticus is a common neurologic emergency in children. Pediatric medical centers often develop protocols to standardize care. Widespread adoption of electronic health records by hospitals affords the opportunity for clinicians to rapidly, and electronically evaluate protocol adherence. We reviewed the clinical data of a small sample of 7 children with status epilepticus, in order to (1) qualitatively determine the feasibility of automated data extraction and (2) demonstrate a timeline-style visualization of each patient's first 24 hours of care. Qualitatively, our observations indicate that most clinical data are well labeled in structured fields within the electronic health record, though some important information, particularly electroencephalography (EEG) data, may require manual abstraction. We conclude that a visualization that clarifies a patient's clinical course can be automatically created using the patient's electronic clinical data, supplemented with some manually abstracted data. Future work could use this timeline to evaluate adherence to status epilepticus clinical protocols. © The Author(s) 2015.

  17. MassCascade: Visual Programming for LC-MS Data Processing in Metabolomics.

    PubMed

    Beisken, Stephan; Earll, Mark; Portwood, David; Seymour, Mark; Steinbeck, Christoph

    2014-04-01

    Liquid chromatography coupled to mass spectrometry (LC-MS) is commonly applied to investigate the small molecule complement of organisms. Several software tools are typically joined in custom pipelines to semi-automatically process and analyse the resulting data. General workflow environments like the Konstanz Information Miner (KNIME) offer the potential of an all-in-one solution to process LC-MS data by allowing easy integration of different tools and scripts. We describe MassCascade and its workflow plug-in for processing LC-MS data. The Java library integrates frequently used algorithms in a modular fashion, thus enabling it to serve as back-end for graphical front-ends. The functions available in MassCascade have been encapsulated in a plug-in for the workflow environment KNIME, allowing combined use with e.g. statistical workflow nodes from other providers and making the tool intuitive to use without knowledge of programming. The design of the software guarantees a high level of modularity where processing functions can be quickly replaced or concatenated. MassCascade is an open-source library for LC-MS data processing in metabolomics. It embraces the concept of visual programming through its KNIME plug-in, simplifying the process of building complex workflows. The library was validated using open data.
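
    The claim that processing functions can be quickly replaced or concatenated can be pictured with a small composition helper; the step functions below are simplistic stand-ins invented for the example, not the MassCascade API.

        from functools import reduce

        # Illustrative stand-ins for LC-MS processing steps (names are assumptions).
        def centroid(spectra):      return [sorted(s) for s in spectra]
        def remove_noise(spectra):  return [[x for x in s if x > 0.1] for s in spectra]
        def align(spectra):         return spectra

        def pipeline(*steps):
            """Concatenate processing functions into a single callable."""
            return lambda data: reduce(lambda d, f: f(d), steps, data)

        workflow = pipeline(centroid, remove_noise, align)
        print(workflow([[0.05, 0.9, 0.3], [1.2, 0.02]]))   # [[0.3, 0.9], [1.2]]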

  18. The new generation of OpenGL support in ROOT

    NASA Astrophysics Data System (ADS)

    Tadel, M.

    2008-07-01

    OpenGL has been promoted to become the main 3D rendering engine of the ROOT framework. This required a major re-modularization of OpenGL support on all levels, from basic window-system specific interface to medium-level object-representation and top-level scene management. This new architecture allows seamless integration of external scene-graph libraries into the ROOT OpenGL viewer as well as inclusion of ROOT 3D scenes into external GUI and OpenGL-based 3D-rendering frameworks. Scene representation was removed from inside of the viewer, allowing scene-data to be shared among several viewers and providing for a natural implementation of multi-view canvas layouts. The object-graph traversal infrastructure allows free mixing of 3D and 2D-pad graphics and makes implementation of ROOT canvas in pure OpenGL possible. Scene-elements representing ROOT objects trigger automatic instantiation of user-provided rendering-objects based on the dictionary information and class-naming convention. Additionally, a finer, per-object control over scene-updates is available to the user, allowing overhead-free maintenance of dynamic 3D scenes and creation of complex real-time animations. User-input handling was modularized as well, making it easy to support application-specific scene navigation, selection handling and tool management.

  19. Space shuttle heat pipe thermal control systems

    NASA Technical Reports Server (NTRS)

    Alario, J.

    1973-01-01

    Heat pipe (HP) thermal control systems designed for possible space shuttle applications were built and tested under this program. They are: (1) a HP augmented cold rail, (2) a HP/phase change material (PCM) modular heat sink and (3) a HP radiating panel for compartment temperature control. The HP augmented cold rail is similar to a standard two-passage fluid cold rail except that it contains an integral, centrally located HP throughout its length. The central HP core helps to increase the local power density capability by spreading concentrated heat inputs over the entire rail. The HP/PCM modular heat sink system consists of a diode HP connected in series to a standard HP that has a PCM canister attached to its mid-section. It is designed to connect a heat source to a structural heat sink during normal operation, and to automatically decouple from it and sink to the PCM whenever structural temperatures are too high. The HP radiating panel is designed to conductively couple the panel feeder HPs directly to a fluid line that serves as a source of waste heat. It is a simple strap-on type of system that requires no internal or external line modifications to distribute the heat to a large radiating area.

  20. Approximation, abstraction and decomposition in search and optimization

    NASA Technical Reports Server (NTRS)

    Ellman, Thomas

    1992-01-01

    In this paper, I discuss four different areas of my research. One portion of my research has focused on automatic synthesis of search control heuristics for constraint satisfaction problems (CSPs). I have developed techniques for automatically synthesizing two types of heuristics for CSPs: Filtering functions are used to remove portions of a search space from consideration. Another portion of my research is focused on automatic synthesis of hierarchic algorithms for solving constraint satisfaction problems (CSPs). I have developed a technique for constructing hierarchic problem solvers based on numeric interval algebra. Another portion of my research is focused on automatic decomposition of design optimization problems. We are using the design of racing yacht hulls as a testbed domain for this research. Decomposition is especially important in the design of complex physical shapes such as yacht hulls. Another portion of my research is focused on intelligent model selection in design optimization. The model selection problem results from the difficulty of using exact models to analyze the performance of candidate designs.

  1. Ball with hair: modular functionalization of highly stable G-quadruplex DNA nano-scaffolds through N2-guanine modification

    PubMed Central

    Lech, Christopher Jacques

    2017-01-01

    Abstract Functionalized nanoparticles have seen valuable applications, particularly in the delivery of therapeutic and diagnostic agents in biological systems. However, the manufacturing of such nano-scale systems with the consistency required for biological application can be challenging, as variation in size and shape have large influences in nanoparticle behavior in vivo. We report on the development of a versatile nano-scaffold based on the modular functionalization of a DNA G-quadruplex. DNA sequences are functionalized in a modular fashion using well-established phosphoramidite chemical synthesis with nucleotides containing modification of the amino (N2) position of the guanine base. In physiological conditions, these sequences fold into well-defined G-quadruplex structures. The resulting DNA nano-scaffolds are thermally stable, consistent in size, and functionalized in a manner that allows for control over the density and relative orientation of functional chemistries on the nano-scaffold surface. Various chemistries including small modifications (N2-methyl-guanine), bulky aromatic modifications (N2-benzyl-guanine), and long chain-like modifications (N2-6-amino-hexyl-guanine) are tested and are found to be generally compatible with G-quadruplex formation. Furthermore, these modifications stabilize the G-quadruplex scaffold by 2.0–13.3 °C per modification in the melting temperature, with concurrent modifications producing extremely stable nano-scaffolds. We demonstrate the potential of this approach by functionalizing nano-scaffolds for use within the biotin–avidin conjugation approach. PMID:28499037

  2. Multivariate statistical analysis software technologies for astrophysical research involving large data bases

    NASA Technical Reports Server (NTRS)

    Djorgovski, George

    1993-01-01

    The existing and forthcoming data bases from NASA missions contain an abundance of information whose complexity cannot be efficiently tapped with simple statistical techniques. Powerful multivariate statistical methods already exist which can be used to harness much of the richness of these data. Automatic classification techniques have been developed to solve the problem of identifying known types of objects in multiparameter data sets, in addition to leading to the discovery of new physical phenomena and classes of objects. We propose an exploratory study and integration of promising techniques in the development of a general and modular classification/analysis system for very large data bases, which would enhance and optimize data management and the use of human research resources.

  3. Multivariate statistical analysis software technologies for astrophysical research involving large data bases

    NASA Technical Reports Server (NTRS)

    Djorgovski, Stanislav

    1992-01-01

    The existing and forthcoming data bases from NASA missions contain an abundance of information whose complexity cannot be efficiently tapped with simple statistical techniques. Powerful multivariate statistical methods already exist which can be used to harness much of the richness of these data. Automatic classification techniques have been developed to solve the problem of identifying known types of objects in multi parameter data sets, in addition to leading to the discovery of new physical phenomena and classes of objects. We propose an exploratory study and integration of promising techniques in the development of a general and modular classification/analysis system for very large data bases, which would enhance and optimize data management and the use of human research resources.

  4. Computer aided fixture design - A case based approach

    NASA Astrophysics Data System (ADS)

    Tanji, Shekhar; Raiker, Saiesh; Mathew, Arun Tom

    2017-11-01

    Automated fixture design plays an important role in process planning and in the integration of CAD and CAM. An automated fixture setup design system has been developed in which, once fixturing surfaces and points are described, modular fixture components are automatically selected to generate fixture units and placed into position so that the assembly conditions are satisfied. In the past, various knowledge-based systems have been developed to implement CAFD in practice. In this paper, to obtain an acceptable automated machining fixture design, a case-based reasoning method with a purpose-built retrieval system is proposed. The Visual Basic (VB) programming language is used in integration with the SolidWorks API (application programming interface) module to improve the retrieval procedure and reduce computational time. These properties are incorporated in numerical simulation to determine the best fit for practical use.

  5. A preview of a modular surface light scattering instrument with autotracking optics

    NASA Technical Reports Server (NTRS)

    Meyer, William V.; Tin, Padetha; Mann, J. Adin, Jr.; Cheung, H. Michael; Rogers, Richard B.; Lading, Lars

    1994-01-01

    NASA's Advanced Technology Development (ATD) program is sponsoring the development of a new generation of surface light scattering hardware. This instrument is designed to non-invasively measure the surface response function of liquids over a wide range of operating conditions while automatically compensating for a sloshing surface. The surface response function can be used to compute surface tension, properties of monolayers present, viscosity, surface tension gradient and surface temperature. The instrument uses optical and electronic building blocks developed for the laser light scattering program at NASA Lewis along with several unique surface light scattering components. The emphasis of this paper is the compensation for bulk surface motion (slosh). Some data processing background information is also included.

  6. Real time standoff gas detection and environmental monitoring with LWIR hyperspectral imager

    NASA Astrophysics Data System (ADS)

    Prel, Florent; Moreau, Louis; Lavoie, Hugo; Bouffard, François; Thériault, Jean-Marc; Vallieres, Christian; Roy, Claude; Dubé, Denis

    2012-10-01

    MR-i is a dual band Hyperspectral Imaging Spectro-radiometer. This field instrument generates spectral datacubes in the MWIR and LWIR. MR-i is modular and can be configured in different ways. One of its configurations is optimized for the standoff measurements of gases in differential mode. In this mode, the instrument is equipped with a dual-input telescope to perform optical background subtraction. The resulting signal is the differential between the spectral radiance entering each input port. With that method, the signal from the background is automatically removed from the signal of the target of interest. The spectral range of this configuration extends in the VLWIR (cut-off near 14 μm) to take full advantage of the LW atmospheric window.

  7. Fully automatic adjoints: a robust and efficient mechanism for generating adjoint ocean models

    NASA Astrophysics Data System (ADS)

    Ham, D. A.; Farrell, P. E.; Funke, S. W.; Rognes, M. E.

    2012-04-01

    The problem of generating and maintaining adjoint models is sufficiently difficult that typically only the most advanced and well-resourced community ocean models achieve it. There are two current technologies which each suffer from their own limitations. Algorithmic differentiation, also called automatic differentiation, is employed by models such as the MITGCM [2] and the Alfred Wegener Institute model FESOM [3]. This technique is very difficult to apply to existing code, and requires a major initial investment to prepare the code for automatic adjoint generation. AD tools may also have difficulty with code employing modern software constructs such as derived data types. An alternative is to formulate the adjoint differential equation and to discretise this separately. This approach, known as the continuous adjoint and employed in ROMS [4], has the disadvantage that two different model code bases must be maintained and manually kept synchronised as the model develops. The discretisation of the continuous adjoint is not automatically consistent with that of the forward model, producing an additional source of error. The alternative presented here is to formulate the flow model in the high level language UFL (Unified Form Language) and to automatically generate the model using the software of the FEniCS project. In this approach it is the high level code specification which is differentiated, a task very similar to the formulation of the continuous adjoint [5]. However since the forward and adjoint models are generated automatically, the difficulty of maintaining them vanishes and the software engineering process is therefore robust. The scheduling and execution of the adjoint model, including the application of an appropriate checkpointing strategy is managed by libadjoint [1]. In contrast to the conventional algorithmic differentiation description of a model as a series of primitive mathematical operations, libadjoint employs a new abstraction of the simulation process as a sequence of discrete equations which are assembled and solved. It is the coupling of the respective abstractions employed by libadjoint and the FEniCS project which produces the adjoint model automatically, without further intervention from the model developer. This presentation will demonstrate this new technology through linear and non-linear shallow water test cases. The exceptionally simple model syntax will be highlighted and the correctness of the resulting adjoint simulations will be demonstrated using rigorous convergence tests.
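
    The discrete-adjoint idea underlying such tools can be shown on a deliberately small linear example: one forward solve plus one adjoint solve with the transposed operator gives the gradient of a functional with respect to a model parameter. The operator, functional and parameter dependence below are invented for the example and have nothing to do with the FEniCS or libadjoint code bases.

        import numpy as np

        # Forward model A(m) u = f with A(m) = A0 + m*B; functional J(u) = c^T u.
        n = 5
        rng = np.random.default_rng(0)
        A0 = (np.diag(np.full(n, 2.0))
              + np.diag(np.full(n - 1, -1.0), 1)
              + np.diag(np.full(n - 1, -1.0), -1))
        B = np.eye(n)
        f = rng.normal(size=n)
        c = rng.normal(size=n)
        m = 0.3

        A = A0 + m * B
        u = np.linalg.solve(A, f)            # forward solve
        lam = np.linalg.solve(A.T, c)        # adjoint solve with the transposed operator
        dJ_dm = -lam @ (B @ u)               # dJ/dm = -lambda^T (dA/dm) u

        # finite-difference check of the adjoint gradient
        eps = 1e-6
        u_eps = np.linalg.solve(A0 + (m + eps) * B, f)
        print(dJ_dm, (c @ u_eps - c @ u) / eps)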

  8. Multiple Metaphors: Teaching Tense and Aspect to English-Speakers.

    ERIC Educational Resources Information Center

    Cody, Karen

    2000-01-01

    This paper proposes a synthesis of instructional methods from both traditional/explicit grammar and learner-centered/constructivist camps that also incorporates many types of metaphors (abstract, visual, and kinesthetic) in order to lead learners from declarative to proceduralized to automatized knowledge. This integrative, synthetic approach…

  9. Knowledge-based public health situation awareness

    NASA Astrophysics Data System (ADS)

    Mirhaji, Parsa; Zhang, Jiajie; Srinivasan, Arunkumar; Richesson, Rachel L.; Smith, Jack W.

    2004-09-01

    There have been numerous efforts to create comprehensive databases from multiple sources to monitor the dynamics of public health and, most specifically, to detect potential bioterrorism threats before widespread dissemination. But there is little evidence for the assertion that these systems are timely and dependable, or that they can reliably distinguish man-made from natural incidents. One must weigh the value of so-called 'syndromic surveillance systems' against the costs involved in the design, development, implementation and maintenance of such systems and the costs involved in the investigation of the inevitable false alarms [1]. In this article we introduce a new perspective on the problem domain, with a shift in paradigm from 'surveillance' toward 'awareness'. As we conceptualize a rather different approach to tackle the problem, we introduce a different methodology in the application of information science, computer science, cognitive science and human-computer interaction concepts to the design and development of so-called 'public health situation awareness systems'. We share some of our design and implementation concepts for the prototype system that is under development in the Center for Biosecurity and Public Health Informatics Research at the University of Texas Health Science Center at Houston. The system is based on a knowledgebase containing ontologies, with different layers of abstraction and from multiple domains, that provide the context for information integration, knowledge discovery, interactive data mining, information visualization, information sharing and communications. The modular design of the knowledgebase and its knowledge representation formalism enables incremental evolution of the system from a partial system to a comprehensive knowledgebase of 'public health situation awareness' as it acquires new knowledge through interactions with domain experts or automatic discovery of new knowledge.

  10. Microgripper construction kit

    NASA Astrophysics Data System (ADS)

    Gengenbach, Ulrich K.; Hofmann, Andreas; Engelhardt, Friedhelm; Scharnowell, Rudolf; Koehler, Bernd

    2001-10-01

    A large number of microgrippers have been developed in industry and academia. Although the importance of hybrid integration techniques, and hence the demand for assembly tools, grows continuously, a large part of these developments has not yet been used in industrial production. The first grippers developed for microassembly were basically vacuum grippers and downscaled tweezers. Due to increasingly complex assembly tasks, more and more functionality, such as sensing, or additional functions, such as adhesive dispensing, has been integrated into gripper systems in recent years. Most of these gripper systems are incompatible since there exists no standard interface to the assembly machine and no standard for the internal modules and interfaces. Thus these tools are not easily interchangeable between assembly machines and not easily adaptable to assembly tasks. In order to alleviate this situation, a construction kit for modular microgrippers is being developed. It is composed of modules with well-defined interfaces that can be combined to build task-specific grippers. An abstract model of a microgripper is proposed as a tool to structure the development of the construction kit. The modular concept is illustrated with prototypes.

  11. Static Aeroelastic Analysis with an Inviscid Cartesian Method

    NASA Technical Reports Server (NTRS)

    Rodriguez, David L.; Aftosmis, Michael J.; Nemec, Marian; Smith, Stephen C.

    2014-01-01

    An embedded-boundary Cartesian-mesh flow solver is coupled with a three-degree-of-freedom structural model to perform static aeroelastic analysis of complex aircraft geometries. The approach solves the complete system of aero-structural equations using a modular, loosely-coupled strategy which allows the lower-fidelity structural model to deform the high-fidelity CFD model. The approach uses an open-source, 3-D discrete-geometry engine to deform a triangulated surface geometry according to the shape predicted by the structural model under the computed aerodynamic loads. The deformation scheme is capable of modeling large deflections and is applicable to the design of modern, very-flexible transport wings. The interface is modular so that aerodynamic or structural analysis methods can be easily swapped or enhanced. This extended abstract includes a brief description of the architecture, along with some preliminary validation of underlying assumptions and early results on a generic 3D transport model. The final paper will present more concrete cases and validation of the approach. Preliminary results demonstrate convergence of the complete aero-structural system and investigate the accuracy of the approximations used in the formulation of the structural model.
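
    The loosely-coupled strategy described above amounts to a fixed-point iteration between the aerodynamic and structural models. The sketch below shows that iteration with trivial surrogate models; the load and stiffness relations are invented for the example and are unrelated to the Cart3D-based method.

        # Toy loosely-coupled aero-structural loop: the aerodynamic load depends on the
        # current deflection, and the structural surrogate maps load back to deflection.
        def aero_load(deflection):
            return 1000.0 * (1.0 + 0.1 * deflection)   # N, load grows with deflection

        def structural_deflection(load):
            return load / 5000.0                       # m, linear spring surrogate

        d = 0.0
        for it in range(50):
            d_new = structural_deflection(aero_load(d))
            if abs(d_new - d) < 1e-10:
                break
            d = d_new
        print(f"converged deflection {d:.6f} m after {it} iterations")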

  12. Ising model with conserved magnetization on the human connectome: Implications on the relation structure-function in wakefulness and anesthesia

    NASA Astrophysics Data System (ADS)

    Stramaglia, S.; Pellicoro, M.; Angelini, L.; Amico, E.; Aerts, H.; Cortés, J. M.; Laureys, S.; Marinazzo, D.

    2017-04-01

    Dynamical models implemented on the large scale architecture of the human brain may shed light on how a function arises from the underlying structure. This is the case notably for simple abstract models, such as the Ising model. We compare the spin correlations of the Ising model and the empirical functional brain correlations, both at the single link level and at the modular level, and show that their match increases at the modular level in anesthesia, in line with recent results and theories. Moreover, we show that at the peak of the specific heat (the critical state), the spin correlations are minimally shaped by the underlying structural network, explaining how the best match between the structure and function is obtained at the onset of criticality, as previously observed. These findings confirm that brain dynamics under anesthesia shows a departure from criticality and could open the way to novel perspectives when the conserved magnetization is interpreted in terms of a homeostatic principle imposed to neural activity.
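
    The conserved-magnetization dynamics referred to above can be realized with Kawasaki-type spin-exchange Monte Carlo on an arbitrary graph. The sketch below is a generic illustration on a small random graph, not the connectome implementation of the study; it assumes a symmetric adjacency matrix with zero diagonal.

        import numpy as np

        def kawasaki_ising(adj, beta, n_steps=20000, seed=0):
            """Ising dynamics with conserved magnetization on a graph.

            Spins are exchanged between randomly chosen pairs of nodes (Kawasaki
            moves), so the total magnetization never changes.
            """
            rng = np.random.default_rng(seed)
            n = adj.shape[0]
            spins = np.ones(n)
            spins[: n // 2] = -1.0            # fixed (near-)zero total magnetization
            rng.shuffle(spins)
            for _ in range(n_steps):
                i, j = rng.integers(0, n, size=2)
                if spins[i] == spins[j]:
                    continue
                h_i = adj[i] @ spins
                h_j = adj[j] @ spins
                # energy change of exchanging the two opposite spins,
                # with E = -sum_{k<l} adj_kl s_k s_l
                dE = 2.0 * spins[i] * (h_i - h_j) + 4.0 * adj[i, j]
                if dE <= 0 or rng.random() < np.exp(-beta * dE):
                    spins[i], spins[j] = spins[j], spins[i]
            return spins

        # toy usage on a small random symmetric graph
        rng = np.random.default_rng(1)
        n = 20
        adj = np.triu((rng.random((n, n)) < 0.2).astype(float), 1)
        adj = adj + adj.T
        spins = kawasaki_ising(adj, beta=0.8)
        print("total magnetization (conserved):", spins.sum())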

  13. Paradigms and strategies for scientific computing on distributed memory concurrent computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foster, I.T.; Walker, D.W.

    1994-06-01

    In this work we examine recent advances in parallel languages and abstractions that have the potential for improving the programmability and maintainability of large-scale, parallel, scientific applications running on high performance architectures and networks. This paper focuses on Fortran M, a set of extensions to Fortran 77 that supports the modular design of message-passing programs. We describe the Fortran M implementation of a particle-in-cell (PIC) plasma simulation application, and discuss issues in the optimization of the code. The use of two other methodologies for parallelizing the PIC application is considered. The first is based on the shared object abstraction as embodied in the Orca language. The second approach is the Split-C language. In Fortran M, Orca, and Split-C the ability of the programmer to control the granularity of communication is important in designing an efficient implementation.

  14. A General Architecture for Intelligent Tutoring of Diagnostic Classification Problem Solving

    PubMed Central

    Crowley, Rebecca S.; Medvedeva, Olga

    2003-01-01

    We report on a general architecture for creating knowledge-based medical training systems to teach diagnostic classification problem solving. The approach is informed by our previous work describing the development of expertise in classification problem solving in Pathology. The architecture envelops the traditional Intelligent Tutoring System design within the Unified Problem-solving Method description Language (UPML) architecture, supporting component modularity and reuse. Based on the domain ontology, domain task ontology and case data, the abstract problem-solving methods of the expert model create a dynamic solution graph. Student interaction with the solution graph is filtered through an instructional layer, which is created by a second set of abstract problem-solving methods and pedagogic ontologies, in response to the current state of the student model. We outline the advantages and limitations of this general approach, and describe its implementation in SlideTutor, a developing Intelligent Tutoring System in Dermatopathology. PMID:14728159

  15. Using AI and Semantic Web Technologies to attack Process Complexity in Open Systems

    NASA Astrophysics Data System (ADS)

    Thompson, Simon; Giles, Nick; Li, Yang; Gharib, Hamid; Nguyen, Thuc Duong

    Recently many vendors and groups have advocated using BPEL and WS-BPEL as a workflow language to encapsulate business logic. While encapsulating workflow and process logic in one place is a sensible architectural decision, the implementation of complex workflows suffers from the same problems that made managing and maintaining hierarchical procedural programs difficult. BPEL lacks constructs for logical modularity such as the requirements construct from the STL [12] or the ability to adapt constructs like pure abstract classes for the same purpose. We describe a system that uses semantic web and agent concepts to implement an abstraction layer for BPEL based on the notion of Goals and service typing. AI planning was used to enable process engineers to create and validate systems that used services and goals as first class concepts and compiled processes at run time for execution.

  16. Automated Verification of Specifications with Typestates and Access Permissions

    NASA Technical Reports Server (NTRS)

    Siminiceanu, Radu I.; Catano, Nestor

    2011-01-01

    We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state-machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, currently underway, provides model checking support for the Plural tool, which currently supports only program verification via data flow analysis (DFA).

  17. Sma3s: a three-step modular annotator for large sequence datasets.

    PubMed

    Muñoz-Mérida, Antonio; Viguera, Enrique; Claros, M Gonzalo; Trelles, Oswaldo; Pérez-Pulido, Antonio J

    2014-08-01

    Automatic sequence annotation is an essential component of modern 'omics' studies, which aim to extract information from large collections of sequence data. Most existing tools use sequence homology to establish evolutionary relationships and assign putative functions to sequences. However, it can be difficult to define a similarity threshold that achieves sufficient coverage without sacrificing annotation quality. Defining the correct configuration is critical and can be challenging for non-specialist users. Thus, the development of robust automatic annotation techniques that generate high-quality annotations without needing expert knowledge would be very valuable for the research community. We present Sma3s, a tool for automatically annotating very large collections of biological sequences from any kind of gene library or genome. Sma3s is composed of three modules that progressively annotate query sequences using either: (i) very similar homologues, (ii) orthologous sequences or (iii) terms enriched in groups of homologous sequences. We trained the system using several random sets of known sequences, demonstrating average sensitivity and specificity values of ~85%. In conclusion, Sma3s is a versatile tool for high-throughput annotation of a wide variety of sequence datasets that outperforms the accuracy of other well-established annotation algorithms, and it can enrich existing database annotations and uncover previously hidden features. Importantly, Sma3s has already been used in the functional annotation of two published transcriptomes. © The Author 2014. Published by Oxford University Press on behalf of Kazusa DNA Research Institute.

  18. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    NASA Astrophysics Data System (ADS)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.

  19. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data.

    PubMed

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-21

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.
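
    The combination of Fourier descriptors as shape features with a support vector machine, as used in the recognition step described above, can be sketched in a few lines. The toy contours and class labels below are fabricated for illustration; this is not the published framework, which operates on MR image data.

        import numpy as np
        from sklearn.svm import SVC

        def fourier_descriptors(contour_xy, n_coeff=10):
            """Scale-normalised Fourier magnitudes of a closed 2-D contour."""
            z = contour_xy[:, 0] + 1j * contour_xy[:, 1]
            coeffs = np.fft.fft(z - z.mean())
            mags = np.abs(coeffs[1:n_coeff + 1])
            return mags / (mags[0] + 1e-12)

        rng = np.random.default_rng(0)
        t = np.linspace(0, 2 * np.pi, 128, endpoint=False)

        def ellipse(a, b):
            return np.c_[a * np.cos(t), b * np.sin(t)]

        def blob(r, k):
            rad = r + 0.3 * np.cos(k * t)
            return np.c_[rad * np.cos(t), rad * np.sin(t)]

        # two toy shape classes standing in for "parenchyma-like" and "other"
        X = ([fourier_descriptors(ellipse(1 + rng.random(), 0.5)) for _ in range(20)]
             + [fourier_descriptors(blob(1.0, int(rng.integers(3, 7)))) for _ in range(20)])
        y = [0] * 20 + [1] * 20

        clf = SVC(kernel="rbf").fit(X, y)
        print(clf.predict([fourier_descriptors(ellipse(1.4, 0.6))]))   # expect [0]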

  20. An overview of very high level software design methods

    NASA Technical Reports Server (NTRS)

    Asdjodi, Maryam; Hooper, James W.

    1988-01-01

    Very High Level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine executable form. Very high level design methods range from general domain independent methods to approaches implementable for specific applications or domains. Applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming and other methods different approaches for higher level software design are being developed. Though one finds that a given approach does not always fall exactly in any specific class, this paper provides a classification for very high level design methods including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths and feasibility for future expansion toward automatic development of software systems.

  1. I-SCAD® standoff chemical agent detector overview

    NASA Astrophysics Data System (ADS)

    Popa, Mirela O.; Griffin, Matthew T.

    2012-06-01

    This paper presents a system-level description of the I-SCAD® Standoff Chemical Agent Detector, a passive Fourier Transform InfraRed (FTIR) based remote sensing system for detecting chemical vapor threats. The passive infrared detection system automatically searches the 7 to 14 micron region of the surrounding atmosphere for agent vapor clouds. It is capable of operating while on the move to accomplish reconnaissance, surveillance, and contamination avoidance missions. Additionally, the system is designed to meet the needs for application on air and sea as well as ground mobile and fixed site platforms. The lightweight, passive, and fully automatic detection system scans the surrounding atmosphere for chemical warfare agent vapors. It provides on-the-move, 360-deg coverage from a variety of tactical and reconnaissance platforms at distances up to 5 km. The core of the system is a rugged Michelson interferometer with a flexure spring bearing mechanism and bi-directional data acquisition capability. The modular system design facilitates interfacing to many platforms. A Reduced Field of View (RFOV) variant includes novel modifications to the scanner subcomponent assembly optical design that give extended performance in detection range and detection probability without sacrificing existing radiometric sensitivity performance. This paper delivers an overview of the system.

  2. Predictive assimilation framework to support contaminated site understanding and remediation

    NASA Astrophysics Data System (ADS)

    Versteeg, R. J.; Bianchi, M.; Hubbard, S. S.

    2014-12-01

    Subsurface system behavior at contaminated sites is driven and controlled by the interplay of physical, chemical, and biological processes occurring at multiple temporal and spatial scales. Effective remediation and monitoring planning requires an understanding of this complexity that is current, predictive (with some level of confidence) and actionable. We present and demonstrate a predictive assimilation framework (PAF). This framework automatically ingests, quality controls and stores near real-time environmental data and processes these data using different inversion and modeling codes to provide information on the current state and evolution of the subsurface system. PAF is implemented as a cloud-based software application which has five components: (1) data acquisition, (2) data management, (3) data assimilation and processing, (4) visualization and result delivery and (5) orchestration. Access to and interaction with PAF is done through a standard browser. PAF is designed to be modular so that it can ingest and process different data streams depending on the site. We will present an implementation of PAF which uses data from a highly instrumented site (the DOE Rifle Subsurface Biogeochemistry Field Observatory in Rifle, Colorado) for which PAF automatically ingests hydrological data and forward models groundwater flow in the saturated zone.

  3. Geophysical phenomena classification by artificial neural networks

    NASA Technical Reports Server (NTRS)

    Gough, M. P.; Bruckner, J. R.

    1995-01-01

    Space science information systems involve accessing vast data bases. There is a need for an automatic process by which properties of the whole data set can be assimilated and presented to the user. Where data are in the form of spectrograms, phenomena can be detected by pattern recognition techniques. Presented are the first results obtained by applying unsupervised Artificial Neural Networks (ANN's) to the classification of magnetospheric wave spectra. The networks used here were a simple unsupervised Hamming network run on a PC and a more sophisticated CALM network run on a Sparc workstation. The ANN's were compared in their geophysical data recognition performance. CALM networks offer such qualities as fast learning, superiority in generalizing, the ability to continuously adapt to changes in the pattern set, and the possibility to modularize the network to allow the inter-relation between phenomena and data sets. This work is the first step toward an information system interface being developed at Sussex, the Whole Information System Expert (WISE). Phenomena in the data are automatically identified and provided to the user in the form of a data occurrence morphology, the Whole Information System Data Occurrence Morphology (WISDOM), along with relationships to other parameters and phenomena.

  4. Report on Information Retrieval and Library Automation Studies.

    ERIC Educational Resources Information Center

    Alberta Univ., Edmonton. Dept. of Computing Science.

    Short abstracts of works in progress or completed in the Department of Computing Science at the University of Alberta are presented under five major headings. The five categories are: Storage and search techniques for document data bases, Automatic classification, Study of indexing and classification languages through computer manipulation of data…

  5. What is automatized during perceptual categorization?

    PubMed Central

    Roeder, Jessica L.; Ashby, F. Gregory

    2016-01-01

    An experiment is described that tested whether stimulus-response associations or an abstract rule are automatized during extensive practice at perceptual categorization. Twenty-seven participants each completed 12,300 trials of perceptual categorization, either on rule-based (RB) categories that could be learned explicitly or information-integration (II) categories that required procedural learning. Each participant practiced predominantly on a primary category structure, but every third session they switched to a secondary structure that used the same stimuli and responses. Half the stimuli retained their same response on the primary and secondary categories (the congruent stimuli) and half switched responses (the incongruent stimuli). Several results stood out. First, performance on the primary categories met the standard criteria of automaticity by the end of training. Second, for the primary categories in the RB condition, accuracy and response time (RT) were identical on congruent and incongruent stimuli. In contrast, for the primary II categories, accuracy was higher and RT was lower for congruent than for incongruent stimuli. These results are consistent with the hypothesis that rules are automatized in RB tasks, whereas stimulus-response associations are automatized in II tasks. A cognitive neuroscience theory is proposed that accounts for these results. PMID:27232521

  6. Plant phenomics: an overview of image acquisition technologies and image data analysis algorithms

    PubMed Central

    Perez-Sanz, Fernando; Navarro, Pedro J

    2017-01-01

    Abstract The study of phenomes or phenomics has been a central part of biology. The field of automatic phenotype acquisition technologies based on images has seen an important advance in the last years. As with other high-throughput technologies, it addresses a common set of problems, including data acquisition and analysis. In this review, we give an overview of the main systems developed to acquire images. We give an in-depth analysis of image processing with its major issues and the algorithms that are being used or emerging as useful to obtain data out of images in an automatic fashion. PMID:29048559

  7. Theory for the Emergence of Modularity in Complex Systems

    NASA Astrophysics Data System (ADS)

    Deem, Michael; Park, Jeong-Man

    2013-03-01

    Biological systems are modular, and this modularity evolves over time and in different environments. A number of observations have been made of increased modularity in biological systems under increased environmental pressure. We here develop a theory for the dynamics of modularity in these systems. We find a principle of least action for the evolved modularity at long times. In addition, we find a fluctuation dissipation relation for the rate of change of modularity at short times. We discuss a number of biological and social systems that can be understood with this framework. The modularity of the protein-protein interaction network increases when yeast are exposed to heat shock, and the modularity of the protein-protein networks in both yeast and E. coli appears to have increased over evolutionary time. Food webs in low-energy, stressful environments are more modular than those in plentiful environments, arid ecologies are more modular during droughts, and foraging of sea otters is more modular when food is limiting. The modularity of social networks changes over time: stock brokers instant messaging networks are more modular under stressful market conditions, criminal networks are more modular under increased police pressure, and world trade network modularity has decreased
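
    Modularity itself is a straightforward graph quantity, and the contrast the abstract draws between loosely and strongly modular networks can be reproduced on synthetic graphs. The sketch below uses NetworkX to compare a random graph with a block-structured one; the graph sizes and probabilities are arbitrary choices for the example.

        import networkx as nx
        from networkx.algorithms import community

        random_g = nx.gnp_random_graph(60, 0.15, seed=1)
        blocky_g = nx.planted_partition_graph(4, 15, p_in=0.5, p_out=0.02, seed=1)

        for name, g in [("random", random_g), ("block-structured", blocky_g)]:
            parts = community.greedy_modularity_communities(g)
            q = community.modularity(g, parts)
            print(f"{name}: Q = {q:.2f} over {len(parts)} communities")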

  8. Diagnostic support for glaucoma using retinal images: a hybrid image analysis and data mining approach.

    PubMed

    Yu, Jin; Abidi, Syed Sibte Raza; Artes, Paul; McIntyre, Andy; Heywood, Malcolm

    2005-01-01

    The availability of modern imaging techniques such as Confocal Scanning Laser Tomography (CSLT) for capturing high-quality optic nerve images offers the potential for developing automatic and objective methods for diagnosing glaucoma. We present a hybrid approach that features the analysis of CSLT images using moment methods to derive abstract image-defining features. The features are then used to train classifiers for automatically distinguishing CSLT images of normal and glaucoma patients. As a first step, in this paper, we present investigations in feature subset selection methods for reducing the relatively large input space produced by the moment methods. We use neural networks and support vector machines to determine a subset of moments that offer high classification accuracy. We demonstrate the efficacy of our methods to discriminate between healthy and glaucomatous optic disks based on shape information automatically derived from optic disk topography and reflectance images.

  9. NASA automatic subject analysis technique for extracting retrievable multi-terms (NASA TERM) system

    NASA Technical Reports Server (NTRS)

    Kirschbaum, J.; Williamson, R. E.

    1978-01-01

    Current methods for information processing and retrieval used at the NASA Scientific and Technical Information Facility are reviewed. A more cost effective computer aided indexing system is proposed which automatically generates print terms (phrases) from the natural text. Satisfactory print terms can be generated in a primarily automatic manner to produce a thesaurus (NASA TERMS) which extends all the mappings presently applied by indexers, specifies the worth of each posting term in the thesaurus, and indicates the areas of use of the thesaurus entry phrase. These print terms enable the computer to determine which of several terms in a hierarchy is desirable and to differentiate ambiguous terms. Steps in the NASA TERMS algorithm are discussed and the processing of surrogate entry phrases is demonstrated using four previously manually indexed STAR abstracts for comparison. The simulation shows phrase isolation, text phrase reduction, NASA terms selection, and RECON display.

  10. Automatic Invocation Linking for Collaborative Web-Based Corpora

    NASA Astrophysics Data System (ADS)

    Gardner, James; Krowne, Aaron; Xiong, Li

    Collaborative online encyclopedias or knowledge bases such as Wikipedia and PlanetMath are becoming increasingly popular because of their open access, comprehensive and interlinked content, rapid and continual updates, and community interactivity. To understand a particular concept in these knowledge bases, a reader needs to learn about related and underlying concepts. In this chapter, we introduce the problem of invocation linking for collaborative encyclopedia or knowledge bases, review the state of the art for invocation linking including the popular linking system of Wikipedia, discuss the problems and challenges of automatic linking, and present the NNexus approach, an abstraction and generalization of the automatic linking system used by PlanetMath.org. The chapter emphasizes both research problems and practical design issues through discussion of real world scenarios and hence is suitable for both researchers in web intelligence and practitioners looking to adopt the techniques. Below is a brief outline of the chapter.
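
    A minimal form of automatic invocation linking can be written as a longest-match scan over a concept index: every known label found in the article body is wrapped in a link, with longer labels taking precedence at overlapping positions. The concept index and URL scheme below are invented for illustration and are not the NNexus implementation.

        import re

        concept_index = {
            "abelian group": "/encyclopedia/AbelianGroup",
            "group": "/encyclopedia/Group",
            "homomorphism": "/encyclopedia/Homomorphism",
        }

        def link_concepts(text, index):
            spans = []   # (start, end, label) of accepted, non-overlapping matches
            for label in sorted(index, key=len, reverse=True):
                for m in re.finditer(r"\b" + re.escape(label) + r"\b", text, re.IGNORECASE):
                    if all(m.end() <= s or m.start() >= e for s, e, _ in spans):
                        spans.append((m.start(), m.end(), label))
            out, pos = [], 0
            for start, end, label in sorted(spans):
                out.append(text[pos:start])
                out.append(f'<a href="{index[label]}">{text[start:end]}</a>')
                pos = end
            out.append(text[pos:])
            return "".join(out)

        print(link_concepts("Every abelian group homomorphism is a group homomorphism.",
                            concept_index))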

  11. Mighty Metaphors: Behavioral and ERP Evidence that Power Shifts Attention on a Vertical Dimension

    ERIC Educational Resources Information Center

    Zanolie, Kiki; van Dantzig, Saskia; Boot, Inge; Wijnen, Jasper; Schubert, Thomas W.; Giessner, Steffen R.; Pecher, Diane

    2012-01-01

    Thinking about the abstract concept power may automatically activate the spatial up-down image schema ("powerful up"; "powerless down") and consequently direct spatial attention to the image schema-congruent location. Participants indicated whether a word represented a powerful or powerless person (e.g. "king" or "servant"). Following each…

  12. Data-Driven Hint Generation in Vast Solution Spaces: A Self-Improving Python Programming Tutor

    ERIC Educational Resources Information Center

    Rivers, Kelly; Koedinger, Kenneth R.

    2017-01-01

    To provide personalized help to students who are working on code-writing problems, we introduce a data-driven tutoring system, ITAP (Intelligent Teaching Assistant for Programming). ITAP uses state abstraction, path construction, and state reification to automatically generate personalized hints for students, even when given states that have not…

  13. Automatic Line Network Extraction from Aerial Imagery of Urban Areas through Knowledge-Based Image Analysis.

    DTIC Science & Technology

    1988-01-19

    approach for the analysis of aerial images. In this approach, image analysis is performed at three levels of abstraction, namely iconic or low-level image analysis, symbolic or medium-level image analysis, and semantic or high-level image analysis. Domain-dependent knowledge about prototypical urban...

  14. Tasty Non-Words and Neighbours: The Cognitive Roots of Lexical-Gustatory Synaesthesia

    ERIC Educational Resources Information Center

    Simner, Julia; Haywood, Sarah L.

    2009-01-01

    For lexical-gustatory synaesthetes, words trigger automatic, associated food sensations (e.g., for JB, the word "slope" tastes of over-ripe melon). Our study tests two claims about this unusual condition: that synaesthetic tastes are associated with abstract levels of word representation (concepts/lemmas), and that the first tastes to crystallise…

  15. Polisher (conflicting versions 2.0.8 on IM Form, 1.0 on abstract)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2008-09-18

    Polisher is a software package designed to facilitate the error correction of an assembled genome using Illumina read data. The software addresses substandard regions by automatically correcting consensus errors and/or suggesting primer walking reactions to improve the quality of the bases. This is done by performing the following: ...

  16. Design of a cardiac monitor in terms of parameters of QRS complex.

    PubMed

    Chen, Zhen-cheng; Ni, Li-li; Su, Ke-ping; Wang, Hong-yan; Jiang, Da-zong

    2002-08-01

    Objective. To design a portable cardiac monitor system that is based on an ordinary, readily available ECG machine and works on the basis of QRS parameters. Method. The 80196 single-chip microcomputer was used as the central microprocessor, and real-time electrocardiac signals were collected and analyzed by the system. Result. In addition to the functions of an ordinary monitor, the machine provides arrhythmia analysis, HRV analysis, alarm, freeze, and automatic paper-feed recording. The system is easy to carry and can be powered from AC or DC sources. Stability, low power, and low cost are emphasized in the hardware design, and a modularization method is applied in the software design. Conclusion. Its ease of use and low cost make the portable monitor system suitable for use under simple conditions.
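
    The abstract lists HRV analysis among the monitor's functions. Below is a minimal sketch of two standard time-domain HRV statistics (SDNN and RMSSD) computed from RR intervals; the function and example interval values are illustrative, not taken from the paper's implementation.

    ```python
    from math import sqrt
    from statistics import mean, pstdev

    def hrv_time_domain(rr_ms):
        """Time-domain HRV from a list of RR intervals in milliseconds.

        SDNN  - standard deviation of all RR intervals
        RMSSD - root mean square of successive RR differences
        """
        diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
        return {
            "mean_hr_bpm": 60000.0 / mean(rr_ms),
            "sdnn_ms": pstdev(rr_ms),
            "rmssd_ms": sqrt(mean(d * d for d in diffs)),
        }

    rr = [812, 845, 790, 860, 835, 820, 805]   # example beat-to-beat intervals
    print(hrv_time_domain(rr))
    ```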

  17. [The design of a cardiac monitoring and analysing system with low power consumption].

    PubMed

    Chen, Zhen-cheng; Ni, Li-li; Zhu, Yan-gao; Wang, Hong-yan; Ma, Yan

    2002-07-01

    The paper deals with a portable analyzing monitor system with a liquid crystal display (LCD) that is low in power consumption and suited to China's specific conditions. In addition to the overall scheme of the system, the paper introduces the design of the hardware and the software. The 80196 single-chip microcomputer is used as the central microprocessor to process real-time electrocardiac signal data. The system has the following functions: five types of arrhythmia analysis, alarm, freeze, and automatic paper-feed recording. The portable system can be operated from alternating-current (AC) or direct-current (DC) power. Its hardware circuit is simplified and its software structure is optimized. Multiple low-power-consumption techniques and an LCD unit are adopted in its modular design.

  18. TAMU: A New Space Mission Operations Paradigm

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Ruszkowski, James; Haensly, Jean; Pennington, Granvil A.; Hogle, Charles

    2011-01-01

    The Transferable, Adaptable, Modular and Upgradeable (TAMU) Flight Production Process (FPP) is a model-centric System of Systems (SoS) framework that cuts across multiple organizations and their associated facilities, which are, in the most general case, in geographically diverse locations, to develop the architecture and associated workflow processes for a broad range of mission operations. Further, TAMU FPP envisions the simulation, automatic execution, and re-planning of orchestrated workflow processes as they become operational. This paper provides the vision for the TAMU FPP paradigm. This includes a complete, coherent technique, process, and tool set that results in an infrastructure that can be used for full lifecycle design and decision making during any flight production process. A flight production process is the process of developing all products that are necessary for flight.

  19. LAURA Users Manual: 5.3-48528

    NASA Technical Reports Server (NTRS)

    Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil

    2010-01-01

    This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multi-physics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the FUN3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.

  20. LAURA Users Manual: 5.5-64987

    NASA Technical Reports Server (NTRS)

    Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, William L.

    2013-01-01

    This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multi-physics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the FUN3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.

  1. LAURA Users Manual: 5.4-54166

    NASA Technical Reports Server (NTRS)

    Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil

    2011-01-01

    This users manual provides in-depth information concerning installation and execution of Laura, version 5. Laura is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 Laura code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multi-physics coupling. As a result, Laura now shares gas-physics modules, MPI modules, and other low-level modules with the Fun3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.

  2. LAURA Users Manual: 5.2-43231

    NASA Technical Reports Server (NTRS)

    Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil

    2009-01-01

    This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multiphysics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the FUN3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.

  3. Laura Users Manual: 5.1-41601

    NASA Technical Reports Server (NTRS)

    Mazaheri, Alireza; Gnoffo, Peter A.; Johnston, Christopher O.; Kleb, Bil

    2009-01-01

    This users manual provides in-depth information concerning installation and execution of LAURA, version 5. LAURA is a structured, multi-block, computational aerothermodynamic simulation code. Version 5 represents a major refactoring of the original Fortran 77 LAURA code toward a modular structure afforded by Fortran 95. The refactoring improved usability and maintainability by eliminating the requirement for problem-dependent re-compilations, providing more intuitive distribution of functionality, and simplifying interfaces required for multiphysics coupling. As a result, LAURA now shares gas-physics modules, MPI modules, and other low-level modules with the FUN3D unstructured-grid code. In addition to internal refactoring, several new features and capabilities have been added, e.g., a GNU-standard installation process, parallel load balancing, automatic trajectory point sequencing, free-energy minimization, and coupled ablation and flowfield radiation.

  4. Integrating biofiltration with SVE: A case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lesley, M.P.; Rangan, C.R.

    1996-12-01

    A prototype integrated soil vacuum extraction/biofiltration system has been designed and installed at a gasoline contaminated LUST site in southern Delaware. The prototype system remediates contaminated moisture entrained in the air stream, employs automatic water level controls in the filters, and achieves maximum vapor extraction and VOC destruction efficiency with an optimum power input. In addition, the valving and piping layout allows the direction of air flow through the filters to be reversed at a given time interval, which minimizes biofouling, thereby increasing efficiency by minimizing the need for frequent cleaning. This integrated system achieves constant VOC destruction rates of 40 to 70% while maintaining optimal VOC removal rates from the subsurface. The modular design allows for easy mobilization, setup and demobilization at state-lead LUST sites throughout Delaware.

  5. Reconfigurable, Cognitive Software-Defined Radio

    NASA Technical Reports Server (NTRS)

    Bhat, Arvind

    2015-01-01

    Software-defined radio (SDR) technology allows radios to be reconfigured to perform different communication functions without using multiple radios to accomplish each task. Intelligent Automation, Inc., has developed SDR platforms that switch adaptively between different operation modes. The innovation works by modifying both transmit waveforms and receiver signal processing tasks. In Phase I of the project, the company developed SDR cognitive capabilities, including adaptive modulation and coding (AMC), automatic modulation recognition (AMR), and spectrum sensing. In Phase II, these capabilities were integrated into SDR platforms. The reconfigurable transceiver design employs high-speed field-programmable gate arrays, enabling multimode operation and scalable architecture. Designs are based on commercial off-the-shelf (COTS) components and are modular in nature, making it easier to upgrade individual components rather than redesigning the entire SDR platform as technology advances.

  6. A Study on the Development of a Robot-Assisted Automatic Laser Hair Removal System

    PubMed Central

    Lim, Hyoung-woo; Park, Sungwoo; Noh, Seungwoo; Lee, Dong-Hun; Yoon, Chiyul; Koh, Wooseok; Kim, Youdan; Chung, Jin Ho; Kim, Hee Chan

    2014-01-01

    Background and Objective: The robot-assisted automatic laser hair removal (LHR) system is developed to automatically detect any arbitrary shape of the desired LHR treatment area and to provide uniform laser irradiation to the designated skin area. Methods: For uniform delivery of laser energy, a unit of a commercial LHR device, a laser distance sensor, and a high-resolution webcam are attached at the six-axis industrial robot's end-effector, which can be easily controlled using a graphical user interface (GUI). During the treatment, the system provides real-time treatment progress as well as the total number of “pick and place” automatically. Results: During the test, it was demonstrated that the arbitrary shapes were detected, and that the laser was delivered uniformly. The localization error test and the area-per-spot test produced satisfactory outcome averages of 1.04 mm error and 38.22 mm²/spot, respectively. Conclusions: Results showed that the system successfully demonstrated accuracy and effectiveness. The proposed system is expected to become a promising device in LHR treatment. PMID:25343281

  7. A study on the development of a robot-assisted automatic laser hair removal system.

    PubMed

    Lim, Hyoung-Woo; Park, Sungwoo; Noh, Seungwoo; Lee, Dong-Hun; Yoon, Chiyul; Koh, Wooseok; Kim, Youdan; Chung, Jin Ho; Kim, Hee Chan; Kim, Sungwan

    2014-11-01

    Background and Objective: The robot-assisted automatic laser hair removal (LHR) system is developed to automatically detect any arbitrary shape of the desired LHR treatment area and to provide uniform laser irradiation to the designated skin area. For uniform delivery of laser energy, a unit of a commercial LHR device, a laser distance sensor, and a high-resolution webcam are attached at the six-axis industrial robot's end-effector, which can be easily controlled using a graphical user interface (GUI). During the treatment, the system provides real-time treatment progress as well as the total number of "pick and place" automatically. During the test, it was demonstrated that the arbitrary shapes were detected, and that the laser was delivered uniformly. The localization error test and the area-per-spot test produced satisfactory outcome averages of 1.04 mm error and 38.22 mm²/spot, respectively. Results showed that the system successfully demonstrated accuracy and effectiveness. The proposed system is expected to become a promising device in LHR treatment.

  8. Construction and analysis of a modular model of caspase activation in apoptosis

    PubMed Central

    Harrington, Heather A; Ho, Kenneth L; Ghosh, Samik; Tung, KC

    2008-01-01

    Background A key physiological mechanism employed by multicellular organisms is apoptosis, or programmed cell death. Apoptosis is triggered by the activation of caspases in response to both extracellular (extrinsic) and intracellular (intrinsic) signals. The extrinsic and intrinsic pathways are characterized by the formation of the death-inducing signaling complex (DISC) and the apoptosome, respectively; both the DISC and the apoptosome are oligomers with complex formation dynamics. Additionally, the extrinsic and intrinsic pathways are coupled through the mitochondrial apoptosis-induced channel via the Bcl-2 family of proteins. Results A model of caspase activation is constructed and analyzed. The apoptosis signaling network is simplified through modularization methodologies and equilibrium abstractions for three functional modules. The mathematical model is composed of a system of ordinary differential equations which is numerically solved. Multiple linear regression analysis investigates the role of each module, and reduced models are constructed to identify key contributions of the extrinsic and intrinsic pathways in triggering apoptosis for different cell lines. Conclusion Through linear regression techniques, we identified the feedbacks, dissociation of complexes, and negative regulators as the key components in apoptosis. The analysis and reduced models for our model formulation reveal that the chosen cell lines predominantly exhibit strong extrinsic caspase behavior, typical of type I cells. Furthermore, under the simplified model framework, the selected cell lines exhibit different modes by which caspase activation may occur. Finally, the proposed modularized model of apoptosis may generalize behavior for additional cells and tissues, specifically identifying and predicting components responsible for the transition from type I to type II cell behavior. PMID:19077196
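
    The model above is a modularized system of ordinary differential equations solved numerically. As a purely schematic illustration of that modeling style, the sketch below integrates a toy two-species caspase activation module with SciPy; the species, equations, and rate constants are invented for illustration and are not the paper's model.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    def caspase_module(t, y, k_act=0.1, k_fb=0.5, k_deg=0.05):
        """Toy module: initiator caspase C8 activates effector caspase C3,
        with positive feedback from C3 on C8 and first-order degradation."""
        c8, c3 = y
        dc8 = k_fb * c3 * (1.0 - c8) - k_deg * c8
        dc3 = k_act * c8 * (1.0 - c3) - k_deg * c3
        return [dc8, dc3]

    sol = solve_ivp(caspase_module, (0.0, 200.0), [0.05, 0.0],
                    t_eval=np.linspace(0.0, 200.0, 50))
    print("final [C8, C3]:", sol.y[:, -1])
    ```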

  9. Separability of Abstract-Category and Specific-Exemplar Visual Object Subsystems: Evidence from fMRI Pattern Analysis

    PubMed Central

    McMenamin, Brenton W.; Deason, Rebecca G.; Steele, Vaughn R.; Koutstaal, Wilma; Marsolek, Chad J.

    2014-01-01

    Previous research indicates that dissociable neural subsystems underlie abstract-category (AC) recognition and priming of objects (e.g., cat, piano) and specific-exemplar (SE) recognition and priming of objects (e.g., a calico cat, a different calico cat, a grand piano, etc.). However, the degree of separability between these subsystems is not known, despite the importance of this issue for assessing relevant theories. Visual object representations are widely distributed in visual cortex, thus a multivariate pattern analysis (MVPA) approach to analyzing functional magnetic resonance imaging (fMRI) data may be critical for assessing the separability of different kinds of visual object processing. Here we examined the neural representations of visual object categories and visual object exemplars using multi-voxel pattern analyses of brain activity elicited in visual object processing areas during a repetition-priming task. In the encoding phase, participants viewed visual objects and the printed names of other objects. In the subsequent test phase, participants identified objects that were either same-exemplar primed, different-exemplar primed, word-primed, or unprimed. In visual object processing areas, classifiers were trained to distinguish same-exemplar primed objects from word-primed objects. Then, the abilities of these classifiers to discriminate different-exemplar primed objects and word-primed objects (reflecting AC priming) and to discriminate same-exemplar primed objects and different-exemplar primed objects (reflecting SE priming) was assessed. Results indicated that (a) repetition priming in occipital-temporal regions is organized asymmetrically, such that AC priming is more prevalent in the left hemisphere and SE priming is more prevalent in the right hemisphere, and (b) AC and SE subsystems are weakly modular, not strongly modular or unified. PMID:25528436

  10. Separability of abstract-category and specific-exemplar visual object subsystems: evidence from fMRI pattern analysis.

    PubMed

    McMenamin, Brenton W; Deason, Rebecca G; Steele, Vaughn R; Koutstaal, Wilma; Marsolek, Chad J

    2015-02-01

    Previous research indicates that dissociable neural subsystems underlie abstract-category (AC) recognition and priming of objects (e.g., cat, piano) and specific-exemplar (SE) recognition and priming of objects (e.g., a calico cat, a different calico cat, a grand piano, etc.). However, the degree of separability between these subsystems is not known, despite the importance of this issue for assessing relevant theories. Visual object representations are widely distributed in visual cortex, thus a multivariate pattern analysis (MVPA) approach to analyzing functional magnetic resonance imaging (fMRI) data may be critical for assessing the separability of different kinds of visual object processing. Here we examined the neural representations of visual object categories and visual object exemplars using multi-voxel pattern analyses of brain activity elicited in visual object processing areas during a repetition-priming task. In the encoding phase, participants viewed visual objects and the printed names of other objects. In the subsequent test phase, participants identified objects that were either same-exemplar primed, different-exemplar primed, word-primed, or unprimed. In visual object processing areas, classifiers were trained to distinguish same-exemplar primed objects from word-primed objects. Then, the abilities of these classifiers to discriminate different-exemplar primed objects and word-primed objects (reflecting AC priming) and to discriminate same-exemplar primed objects and different-exemplar primed objects (reflecting SE priming) was assessed. Results indicated that (a) repetition priming in occipital-temporal regions is organized asymmetrically, such that AC priming is more prevalent in the left hemisphere and SE priming is more prevalent in the right hemisphere, and (b) AC and SE subsystems are weakly modular, not strongly modular or unified. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. Targeting multiple heterogeneous hardware platforms with OpenCL

    NASA Astrophysics Data System (ADS)

    Fox, Paul A.; Kozacik, Stephen T.; Humphrey, John R.; Paolini, Aaron; Kuller, Aryeh; Kelmelis, Eric J.

    2014-06-01

    The OpenCL API allows for the abstract expression of parallel, heterogeneous computing, but hardware implementations have substantial implementation differences. The abstractions provided by the OpenCL API are often insufficiently high-level to conceal differences in hardware architecture. Additionally, implementations often do not take advantage of potential performance gains from certain features due to hardware limitations and other factors. These factors make it challenging to produce code that is portable in practice, resulting in much OpenCL code being duplicated for each hardware platform being targeted. This duplication of effort offsets the principal advantage of OpenCL: portability. The use of certain coding practices can mitigate this problem, allowing a common code base to be adapted to perform well across a wide range of hardware platforms. To this end, we explore some general practices for producing performant code that are effective across platforms. Additionally, we explore some ways of modularizing code to enable optional optimizations that take advantage of hardware-specific characteristics. The minimum requirement for portability implies avoiding the use of OpenCL features that are optional, not widely implemented, poorly implemented, or missing in major implementations. Exposing multiple levels of parallelism allows hardware to take advantage of the types of parallelism it supports, from the task level down to explicit vector operations. Static optimizations and branch elimination in device code help the platform compiler to effectively optimize programs. Modularization of some code is important to allow operations to be chosen for performance on target hardware. Optional subroutines exploiting explicit memory locality allow for different memory hierarchies to be exploited for maximum performance. The C preprocessor and JIT compilation using the OpenCL runtime can be used to enable some of these techniques, as well as to factor in hardware-specific optimizations as necessary.
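
    The abstract recommends folding hardware-specific choices into a common code base via preprocessor defines and JIT compilation at runtime. The sketch below assembles one OpenCL kernel source and per-device build options in that spirit; the kernel text, option names, and device classes are illustrative assumptions, and the sketch does not invoke an OpenCL runtime (the assembled source would be passed to a binding's program-build call).

    ```python
    KERNEL_TEMPLATE = """
    __kernel void scale(__global const float *x, __global float *y, float a) {
    #if USE_LOCAL_MEM
        __local float tile[TILE_SIZE];        /* optional local-memory path */
        tile[get_local_id(0)] = x[get_global_id(0)];
        barrier(CLK_LOCAL_MEM_FENCE);
        y[get_global_id(0)] = a * tile[get_local_id(0)];
    #else
        y[get_global_id(0)] = a * x[get_global_id(0)];
    #endif
    }
    """

    def build_options(device_class):
        """Choose preprocessor defines per device class (values are examples)."""
        if device_class == "gpu":
            return ["-DUSE_LOCAL_MEM=1", "-DTILE_SIZE=256"]
        return ["-DUSE_LOCAL_MEM=0", "-DTILE_SIZE=1"]

    for dev in ("gpu", "cpu"):
        print(dev, build_options(dev))
    # The same KERNEL_TEMPLATE is then JIT-compiled at runtime with the chosen
    # options, so one code base serves both memory-hierarchy variants.
    ```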

  12. A modular, open-source, slide-scanning microscope for diagnostic applications in resource-constrained settings

    PubMed Central

    Lu, Qiang; Liu, Guanghui; Xiao, Chuanli; Hu, Chuanzhen; Zhang, Shiwu; Xu, Ronald X.; Chu, Kaiqin; Xu, Qianming

    2018-01-01

    In this paper we report the development of a cost-effective, modular, open source, and fully automated slide-scanning microscope, composed entirely of easily available off-the-shelf parts, and capable of bright field and fluorescence modes. The automated X-Y stage is composed of two low-cost micrometer stages coupled to stepper motors operated in open-loop mode. The microscope is composed of a low-cost CMOS sensor and low-cost board lenses placed in a 4f configuration. The system has approximately 1 micron resolution, limited by the f/# of available board lenses. The microscope is compact, measuring just 25×25×30 cm, and has an absolute positioning accuracy of ±1 μm in the X and Y directions. A Z-stage enables autofocusing and imaging over large fields of view even on non-planar samples, and custom software enables automatic determination of sample boundaries and image mosaicking. We demonstrate the utility of our device through imaging of fluorescent- and transmission-dye stained blood and fecal smears containing human and animal parasites, as well as several prepared tissue samples. These results demonstrate image quality comparable to high-end commercial microscopes at a cost of less than US$400 for a bright-field system, with an extra US$100 needed for the fluorescence module. PMID:29543835
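
    The Z-stage autofocusing mentioned above is commonly implemented as contrast-based focus search. The sketch below picks the z position that maximizes a gradient-variance sharpness metric over a captured stack; the metric choice, the `capture` stand-in, and the synthetic frames are assumptions for illustration, not the authors' firmware.

    ```python
    import numpy as np

    def focus_metric(image):
        """Simple sharpness score: variance of finite-difference gradients."""
        gy, gx = np.gradient(image.astype(float))
        return float(np.var(gx) + np.var(gy))

    def autofocus(capture, z_positions):
        """Return the z position whose captured frame maximizes the focus metric.

        `capture(z)` stands in for moving the Z stage and grabbing a frame.
        """
        scores = {z: focus_metric(capture(z)) for z in z_positions}
        return max(scores, key=scores.get), scores

    # Synthetic demo: frames have the strongest gradients near z = 0.
    rng = np.random.default_rng(0)
    def fake_capture(z):
        contrast = 1.0 / (1.0 + abs(z))
        return rng.normal(scale=contrast, size=(64, 64))

    best_z, scores = autofocus(fake_capture, [-10, -5, 0, 5, 10])
    print("best focus at z =", best_z)
    ```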

  13. Survey of Modular Military Vehicles: Benefits and Burdens

    DTIC Science & Technology

    2016-01-01

    Survey of Modular Military Vehicles: Benefits and Burdens. Jean M. Dasch and David J. Gorsich. Modularity in military vehicle design is generally considered a positive attribute that promotes adaptability, resilience, and cost savings. The benefits and burdens of modularity are considered by... Engineering Center; vehicles were considered based on horizontal modularity, vertical modularity, and distributed modularity. Examples were given for each...

  14. Optimization of automated large-scale production of [(18)F]fluoroethylcholine for PET prostate cancer imaging.

    PubMed

    Pascali, Giancarlo; D'Antonio, Luca; Bovone, Paola; Gerundini, Paolo; August, Thorsten

    2009-07-01

    PET tumor imaging is gaining importance in current clinical practice. FDG-PET is the most utilized approach but suffers from inflammation influences and is not utilizable in prostate cancer detection. Recently, (11)C-choline analogues have been employed successfully in this field of imaging, leading to a growing interest in the utilization of (18)F-labeled analogues: [(18)F]fluoroethylcholine (FEC) has been demonstrated to be promising, especially in prostate cancer imaging. In this work we report an automatic radiosynthesis of this tracer with high yields, short synthesis time and ease of performance, potentially utilizable in routine production sites. We used a Modular Lab system to automatically perform the two-step/one-pot synthesis. In the first step, we labeled ethyleneglycolditosylate, obtaining [(18)F]fluoroethyltosylate; in the second step, we performed the coupling of the latter intermediate with neat dimethylethanolamine. The final mixture was purified by means of solid phase extraction; in particular, the product was trapped on a cation-exchange resin and eluted with isotonic saline. The optimized procedure resulted in a non-decay-corrected yield of 36% and produced 30-45 GBq of product already in injectable form. The product was analyzed for quality control and found to be pure and sterile; in addition, residual solvents were under the required threshold. In this work, we present an automatic FEC radiosynthesis that has been optimized for routine production. These findings should foster interest in a wider utilization of this radiomolecule for imaging of prostate cancer with PET, a field for which no gold-standard tracer has yet been validated.

  15. Timing matters: the processing of pitch relations

    PubMed Central

    Weise, Annekathrin; Grimm, Sabine; Trujillo-Barreto, Nelson J.; Schröger, Erich

    2014-01-01

    The human central auditory system can automatically extract abstract regularities from a variant auditory input. To this end, temporarily separated events need to be related. This study tested whether the timing between events, falling either within or outside the temporal window of integration (~350 ms), impacts the extraction of abstract feature relations. We utilized tone pairs for which tones within but not across pairs revealed a constant pitch relation (e.g., pitch of second tone of a pair higher than pitch of first tone, while absolute pitch values varied across pairs). We measured the mismatch negativity (MMN; the brain’s error signal to auditory regularity violations) to second tones that rarely violated the pitch relation (e.g., pitch of second tone lower). A Short condition in which tone duration (90 ms) and stimulus onset asynchrony between the tones of a pair were short (110 ms) was compared to two conditions, where this onset asynchrony was long (510 ms). In the Long Gap condition, the tone durations were identical to Short (90 ms), but the silent interval was prolonged by 400 ms. In Long Tone, the duration of the first tone was prolonged by 400 ms, while the silent interval was comparable to Short (20 ms). Results show a frontocentral MMN of comparable amplitude in all conditions. Thus, abstract pitch relations can be extracted even when the within-pair timing exceeds the integration period. Source analyses indicate MMN generators in the supratemporal cortex. Interestingly, they were located more anterior in Long Gap than in Short and Long Tone. Moreover, frontal generator activity was found for Long Gap and Long Tone. Thus, the way in which the system automatically registers irregular abstract pitch relations depends on the timing of the events to be linked. Provided that the current MMN data mirror established abstract rule representations coding the regular pitch relation, neural processes building these templates vary with timing. PMID:24966823

  16. Automated Modular Magnetic Resonance Imaging Clinical Decision Support System (MIROR): An Application in Pediatric Cancer Diagnosis.

    PubMed

    Zarinabad, Niloufar; Meeus, Emma M; Manias, Karen; Foster, Katharine; Peet, Andrew

    2018-05-02

    Advances in magnetic resonance imaging and the introduction of clinical decision support systems have underlined the need for an analysis tool to extract and analyze relevant information from magnetic resonance imaging data to aid decision making, prevent errors, and enhance health care. The aim of this study was to design and develop a modular medical image region-of-interest analysis tool and repository (MIROR) for automatic processing, classification, evaluation, and representation of advanced magnetic resonance imaging data. The clinical decision support system was developed and evaluated for diffusion-weighted imaging of body tumors in children (cohort of 48 children, with 37 malignant and 11 benign tumors). MeVisLab software and Python have been used for the development of MIROR. Regions of interest were drawn around benign and malignant body tumors on different diffusion parametric maps, and extracted information was used to discriminate the malignant tumors from benign tumors. Using MIROR, the various histogram parameters derived for each tumor case, when compared with the information in the repository, provided additional information for tumor characterization and facilitated the discrimination between benign and malignant tumors. Clinical decision support system cross-validation showed high sensitivity and specificity in discriminating between these tumor groups using histogram parameters. MIROR, as a diagnostic tool and repository, allowed the interpretation and analysis of magnetic resonance imaging images to be more accessible and comprehensive for clinicians. It aims to increase clinicians' skillset by introducing newer techniques and up-to-date findings to their repertoire and make information from previous cases available to aid decision making. The modular-based format of the tool allows integration of analyses that are not readily available clinically and streamlines future developments. ©Niloufar Zarinabad, Emma M Meeus, Karen Manias, Katharine Foster, Andrew Peet. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 02.05.2018.
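
    The study derives histogram parameters from regions of interest on diffusion parametric maps to separate benign from malignant tumors. A minimal sketch of that style of feature extraction follows; the specific statistics, the synthetic "ADC map", and the circular ROI are illustrative assumptions, not necessarily the parameters MIROR computes.

    ```python
    import numpy as np
    from scipy import stats

    def roi_histogram_features(param_map, roi_mask):
        """Histogram statistics of a parametric map (e.g. ADC) inside an ROI mask."""
        values = param_map[roi_mask.astype(bool)]
        return {
            "mean": float(np.mean(values)),
            "median": float(np.median(values)),
            "p10": float(np.percentile(values, 10)),
            "p90": float(np.percentile(values, 90)),
            "skewness": float(stats.skew(values)),
            "kurtosis": float(stats.kurtosis(values)),
        }

    # Synthetic example: a 2D "ADC map" with a circular ROI.
    rng = np.random.default_rng(1)
    adc = rng.normal(1.1e-3, 2e-4, size=(128, 128))
    yy, xx = np.mgrid[:128, :128]
    roi = (yy - 64) ** 2 + (xx - 64) ** 2 < 20 ** 2
    print(roi_histogram_features(adc, roi))
    ```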

  17. Development of an optical character recognition pipeline for handwritten form fields from an electronic health record.

    PubMed

    Rasmussen, Luke V; Peissig, Peggy L; McCarty, Catherine A; Starren, Justin

    2012-06-01

    Although the penetration of electronic health records is increasing rapidly, much of the historical medical record is only available in handwritten notes and forms, which require labor-intensive, human chart abstraction for some clinical research. The few previous studies on automated extraction of data from these handwritten notes have focused on monolithic, custom-developed recognition systems or third-party systems that require proprietary forms. We present an optical character recognition processing pipeline, which leverages the capabilities of existing third-party optical character recognition engines, and provides the flexibility offered by a modular custom-developed system. The system was configured and run on a selected set of form fields extracted from a corpus of handwritten ophthalmology forms. The processing pipeline allowed multiple configurations to be run, with the optimal configuration consisting of the Nuance and LEADTOOLS engines running in parallel with a positive predictive value of 94.6% and a sensitivity of 13.5%. While limitations exist, preliminary experience from this project yielded insights on the generalizability and applicability of integrating multiple, inexpensive general-purpose third-party optical character recognition engines in a modular pipeline.
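
    The optimal configuration above runs two engines in parallel and, implicitly, accepts a field only when their outputs agree, which is how high positive predictive value can coexist with low sensitivity. A minimal sketch of that "parallel engines plus agreement" pattern follows, using standard-library concurrency; the engine functions are stand-ins, not wrappers for the Nuance or LEADTOOLS products.

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def run_engines(image, engines):
        """Run several OCR engines on one form field and accept the result
        only when all engines return the same normalized string."""
        with ThreadPoolExecutor(max_workers=len(engines)) as pool:
            results = list(pool.map(lambda eng: eng(image), engines))
        normalized = {r.strip().upper() for r in results}
        if len(normalized) == 1:
            return normalized.pop()   # consensus -> high positive predictive value
        return None                   # disagreement -> route to manual abstraction

    # Stand-ins for third-party engines (a real pipeline wraps commercial OCR).
    engine_a = lambda img: "20/40"
    engine_b = lambda img: "20/40 "
    print(run_engines(b"<field image bytes>", [engine_a, engine_b]))
    ```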

  18. Development of an optical character recognition pipeline for handwritten form fields from an electronic health record

    PubMed Central

    Peissig, Peggy L; McCarty, Catherine A; Starren, Justin

    2011-01-01

    Background Although the penetration of electronic health records is increasing rapidly, much of the historical medical record is only available in handwritten notes and forms, which require labor-intensive, human chart abstraction for some clinical research. The few previous studies on automated extraction of data from these handwritten notes have focused on monolithic, custom-developed recognition systems or third-party systems that require proprietary forms. Methods We present an optical character recognition processing pipeline, which leverages the capabilities of existing third-party optical character recognition engines, and provides the flexibility offered by a modular custom-developed system. The system was configured and run on a selected set of form fields extracted from a corpus of handwritten ophthalmology forms. Observations The processing pipeline allowed multiple configurations to be run, with the optimal configuration consisting of the Nuance and LEADTOOLS engines running in parallel with a positive predictive value of 94.6% and a sensitivity of 13.5%. Discussion While limitations exist, preliminary experience from this project yielded insights on the generalizability and applicability of integrating multiple, inexpensive general-purpose third-party optical character recognition engines in a modular pipeline. PMID:21890871

  19. A modular computational framework for automated peak extraction from ion mobility spectra

    PubMed Central

    2014-01-01

    Background An ion mobility (IM) spectrometer coupled with a multi-capillary column (MCC) measures volatile organic compounds (VOCs) in the air or in exhaled breath. This technique is utilized in several biotechnological and medical applications. Each peak in an MCC/IM measurement represents a certain compound, which may be known or unknown. For clustering and classification of measurements, the raw data matrix must be reduced to a set of peaks. Each peak is described by its coordinates (retention time in the MCC and reduced inverse ion mobility) and shape (signal intensity, further shape parameters). This fundamental step is referred to as peak extraction. It is the basis for identifying discriminating peaks, and hence putative biomarkers, between two classes of measurements, such as a healthy control group and a group of patients with a confirmed disease. Current state-of-the-art peak extraction methods require human interaction, such as hand-picking approximate peak locations, assisted by a visualization of the data matrix. In a high-throughput context, however, it is preferable to have robust methods for fully automated peak extraction. Results We introduce PEAX, a modular framework for automated peak extraction. The framework consists of several steps in a pipeline architecture. Each step performs a specific sub-task and can be instantiated by different methods implemented as modules. We provide open-source software for the framework and several modules for each step. Additionally, an interface that allows easy extension by a new module is provided. Combining the modules in all reasonable ways leads to a large number of peak extraction methods. We evaluate all combinations using intrinsic error measures and by comparing the resulting peak sets with an expert-picked one. Conclusions Our software PEAX is able to automatically extract peaks from MCC/IM measurements within a few seconds. The automatically obtained results keep up with the results provided by current state-of-the-art peak extraction methods. This opens a high-throughput context for the MCC/IM application field. Our software is available at http://www.rahmannlab.de/research/ims. PMID:24450533
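
    PEAX composes its pipeline from interchangeable modules, one per step. A minimal sketch of such a step registry and pipeline assembly follows; the step names, module names, and pass-through implementations are invented for illustration and are not PEAX's actual modules.

    ```python
    from typing import Callable, Dict, List

    # Registry: step name -> available modules for that step (toy implementations).
    STEPS: Dict[str, Dict[str, Callable]] = {
        "denoise":  {"median": lambda m: m, "savgol": lambda m: m},
        "baseline": {"lowpass": lambda m: m},
        "pick":     {"local_max": lambda m: [(10.2, 0.53)]},  # (retention time, 1/K0)
        "model":    {"gaussian_fit": lambda peaks: peaks},
    }

    def build_pipeline(config: Dict[str, str]) -> List[Callable]:
        """Instantiate one module per step from a configuration mapping."""
        return [STEPS[step][module] for step, module in config.items()]

    def run(pipeline: List[Callable], measurement):
        data = measurement
        for stage in pipeline:
            data = stage(data)
        return data

    config = {"denoise": "median", "baseline": "lowpass",
              "pick": "local_max", "model": "gaussian_fit"}
    print(run(build_pipeline(config), measurement=[[0.0] * 5] * 5))
    ```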

  20. A modular computational framework for automated peak extraction from ion mobility spectra.

    PubMed

    D'Addario, Marianna; Kopczynski, Dominik; Baumbach, Jörg Ingo; Rahmann, Sven

    2014-01-22

    An ion mobility (IM) spectrometer coupled with a multi-capillary column (MCC) measures volatile organic compounds (VOCs) in the air or in exhaled breath. This technique is utilized in several biotechnological and medical applications. Each peak in an MCC/IM measurement represents a certain compound, which may be known or unknown. For clustering and classification of measurements, the raw data matrix must be reduced to a set of peaks. Each peak is described by its coordinates (retention time in the MCC and reduced inverse ion mobility) and shape (signal intensity, further shape parameters). This fundamental step is referred to as peak extraction. It is the basis for identifying discriminating peaks, and hence putative biomarkers, between two classes of measurements, such as a healthy control group and a group of patients with a confirmed disease. Current state-of-the-art peak extraction methods require human interaction, such as hand-picking approximate peak locations, assisted by a visualization of the data matrix. In a high-throughput context, however, it is preferable to have robust methods for fully automated peak extraction. We introduce PEAX, a modular framework for automated peak extraction. The framework consists of several steps in a pipeline architecture. Each step performs a specific sub-task and can be instantiated by different methods implemented as modules. We provide open-source software for the framework and several modules for each step. Additionally, an interface that allows easy extension by a new module is provided. Combining the modules in all reasonable ways leads to a large number of peak extraction methods. We evaluate all combinations using intrinsic error measures and by comparing the resulting peak sets with an expert-picked one. Our software PEAX is able to automatically extract peaks from MCC/IM measurements within a few seconds. The automatically obtained results keep up with the results provided by current state-of-the-art peak extraction methods. This opens a high-throughput context for the MCC/IM application field. Our software is available at http://www.rahmannlab.de/research/ims.

  1. Technical Reliability Studies. EOS/ESD Technology Abstracts

    DTIC Science & Technology

    1981-01-01

    ... microwave devices, optoelectronics, and selected nonelectronic parts employed in military, space and commercial applications. In addition, a System...

  2. Yaounde French Speech Corpus

    DTIC Science & Technology

    2017-03-01

    the Center for Technology Enhanced Language Learning (CTELL), a research cell in the Department of Foreign Languages, United States Military Academy... models for automatic speech recognition (ASR), and to thereby investigate the utility of ASR in pedagogical technology. The corpus is a sample of... Keywords: lexical resources, language technology.

  3. BROWSER: An Automatic Indexing On-Line Text Retrieval System. Annual Progress Report.

    ERIC Educational Resources Information Center

    Williams, J. H., Jr.

    The development and testing of the Browsing On-line With Selective Retrieval (BROWSER) text retrieval system allowing a natural language query statement and providing on-line browsing capabilities through an IBM 2260 display terminal is described. The prototype system contains data bases of 25,000 German language patent abstracts, 9,000 English…

  4. Determination of GTA Welding Efficiencies

    DTIC Science & Technology

    1993-03-01

    A method is developed for estimating welding efficiencies for moving arc GTAW processes. Figures include: Miller Welding Equipment; GTAW Torch Setup for Automatic Welding.

  5. Communications and tracking expert systems study

    NASA Technical Reports Server (NTRS)

    Leibfried, T. F.; Feagin, Terry; Overland, David

    1987-01-01

    The original objectives of the study consisted of five broad areas of investigation: criteria and issues for explanation of communication and tracking system anomaly detection, isolation, and recovery; data storage simplification issues for fault detection expert systems; data selection procedures for decision tree pruning and optimization to enhance the abstraction of pertinent information for clear explanation; criteria for establishing levels of explanation suited to needs; and analysis of expert system interaction and modularization. Progress was made in all areas, but to a lesser extent in the criteria for establishing levels of explanation suited to needs. Among the types of expert systems studied were those related to anomaly or fault detection, isolation, and recovery.

  6. A high order approach to flight software development and testing

    NASA Technical Reports Server (NTRS)

    Steinbacher, J.

    1981-01-01

    The use of a software development facility is discussed as a means of producing a reliable and maintainable ECS software system, and as a means of providing efficient use of the ECS hardware test facility. Principles applied to software design are given, including modularity, abstraction, hiding, and uniformity. The general objectives of each phase of the software life cycle are also given, including testing, maintenance, code development, and requirement specifications. Software development facility tools are summarized, and tool deficiencies recognized in the code development and testing phases are considered. Due to limited lab resources, the functional simulation capabilities may be indispensable in the testing phase.

  7. The relative efficiency of modular and non-modular networks of different size

    PubMed Central

    Tosh, Colin R.; McNally, Luke

    2015-01-01

    Most biological networks are modular but previous work with small model networks has indicated that modularity does not necessarily lead to increased functional efficiency. Most biological networks are large, however, and here we examine the relative functional efficiency of modular and non-modular neural networks at a range of sizes. We conduct a detailed analysis of efficiency in networks of two size classes: ‘small’ and ‘large’, and a less detailed analysis across a range of network sizes. The former analysis reveals that while the modular network is less efficient than one of the two non-modular networks considered when networks are small, it is usually equally or more efficient than both non-modular networks when networks are large. The latter analysis shows that in networks of small to intermediate size, modular networks are much more efficient than non-modular networks of the same (low) connective density. If connective density must be kept low to reduce energy needs for example, this could promote modularity. We have shown how relative functionality/performance scales with network size, but the precise nature of the evolutionary relationship between network size and prevalence of modularity will depend on the costs of connectivity. PMID:25631996

  8. Arkheia: Data Management and Communication for Open Computational Neuroscience

    PubMed Central

    Antolík, Ján; Davison, Andrew P.

    2018-01-01

    Two trends have been unfolding in computational neuroscience during the last decade. First, a shift of focus to increasingly complex and heterogeneous neural network models, with a concomitant increase in the level of collaboration within the field (whether direct or in the form of building on top of existing tools and results). Second, a general trend in science toward more open communication, both internally, with other potential scientific collaborators, and externally, with the wider public. This multi-faceted development toward more integrative approaches and more intense communication within and outside of the field poses major new challenges for modelers, as currently there is a severe lack of tools to help with automatic communication and sharing of all aspects of a simulation workflow to the rest of the community. To address this important gap in the current computational modeling software infrastructure, here we introduce Arkheia. Arkheia is a web-based open science platform for computational models in systems neuroscience. It provides an automatic, interactive, graphical presentation of simulation results, experimental protocols, and interactive exploration of parameter searches, in a web browser-based application. Arkheia is focused on automatic presentation of these resources with minimal manual input from users. Arkheia is written in a modular fashion with a focus on future development of the platform. The platform is designed in an open manner, with a clearly defined and separated API for database access, so that any project can write its own backend translating its data into the Arkheia database format. Arkheia is not a centralized platform, but allows any user (or group of users) to set up their own repository, either for public access by the general population, or locally for internal use. Overall, Arkheia provides users with an automatic means to communicate information about not only their models but also individual simulation results and the entire experimental context in an approachable graphical manner, thus facilitating the user's ability to collaborate in the field and outreach to a wider audience. PMID:29556187

  9. EARLINET Single Calculus Chain - technical - Part 1: Pre-processing of raw lidar data

    NASA Astrophysics Data System (ADS)

    D'Amico, G.; Amodeo, A.; Mattis, I.; Freudenthaler, V.; Pappalardo, G.

    2015-10-01

    In this paper we describe an automatic tool for the pre-processing of lidar data called ELPP (EARLINET Lidar Pre-Processor). It is one of two calculus modules of the EARLINET Single Calculus Chain (SCC), the automatic tool for the analysis of EARLINET data. The ELPP is an open source module that executes instrumental corrections and data handling of the raw lidar signals, making the lidar data ready to be processed by the optical retrieval algorithms. According to the specific lidar configuration, the ELPP automatically performs dead-time correction, atmospheric and electronic background subtraction, gluing of lidar signals, and trigger-delay correction. Moreover, the signal-to-noise ratio of the pre-processed signals can be improved by means of configurable time integration of the raw signals and/or spatial smoothing. The ELPP delivers the statistical uncertainties of the final products by means of error propagation or Monte Carlo simulations. During the development of the ELPP module, particular attention has been paid to make the tool flexible enough to handle all lidar configurations currently used within the EARLINET community. Moreover, it has been designed in a modular way to allow an easy extension to lidar configurations not yet implemented. The primary goal of the ELPP module is to enable the application of quality-assured procedures in the lidar data analysis starting from the raw lidar data. This provides the added value of full traceability of each delivered lidar product. Several tests have been performed to check the proper functioning of the ELPP module. The whole SCC has been tested with the same synthetic data sets, which were used for the EARLINET algorithm inter-comparison exercise. The ELPP module has been successfully employed for the automatic near-real-time pre-processing of the raw lidar data measured during several EARLINET inter-comparison campaigns as well as during intense field campaigns.
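
    Two of the instrumental corrections named above have standard textbook forms. The sketch below shows the non-paralyzable dead-time correction, n_true = n / (1 - n·τ), and a background estimate taken from the far-range tail of the averaged signal; the bin counts, dead-time value, and synthetic profile are illustrative assumptions, not the ELPP code.

    ```python
    import numpy as np

    def deadtime_correction(count_rate_mhz, tau_ns):
        """Non-paralyzable dead-time correction: n_true = n / (1 - n * tau)."""
        n = np.asarray(count_rate_mhz) * 1e6            # counts per second
        return n / (1.0 - n * tau_ns * 1e-9) / 1e6      # back to MHz

    def subtract_background(profile, n_tail_bins=500):
        """Estimate atmospheric/electronic background from the far-range tail
        of the averaged signal and subtract it from the whole profile."""
        profile = np.asarray(profile, dtype=float)
        return profile - profile[-n_tail_bins:].mean()

    raw = np.concatenate([np.linspace(5.0, 0.2, 2000), np.full(500, 0.05)])
    corrected = subtract_background(deadtime_correction(raw, tau_ns=4.0))
    print(corrected[:3], corrected[-3:])
    ```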

  10. Arkheia: Data Management and Communication for Open Computational Neuroscience.

    PubMed

    Antolík, Ján; Davison, Andrew P

    2018-01-01

    Two trends have been unfolding in computational neuroscience during the last decade. First, a shift of focus to increasingly complex and heterogeneous neural network models, with a concomitant increase in the level of collaboration within the field (whether direct or in the form of building on top of existing tools and results). Second, a general trend in science toward more open communication, both internally, with other potential scientific collaborators, and externally, with the wider public. This multi-faceted development toward more integrative approaches and more intense communication within and outside of the field poses major new challenges for modelers, as currently there is a severe lack of tools to help with automatic communication and sharing of all aspects of a simulation workflow to the rest of the community. To address this important gap in the current computational modeling software infrastructure, here we introduce Arkheia. Arkheia is a web-based open science platform for computational models in systems neuroscience. It provides an automatic, interactive, graphical presentation of simulation results, experimental protocols, and interactive exploration of parameter searches, in a web browser-based application. Arkheia is focused on automatic presentation of these resources with minimal manual input from users. Arkheia is written in a modular fashion with a focus on future development of the platform. The platform is designed in an open manner, with a clearly defined and separated API for database access, so that any project can write its own backend translating its data into the Arkheia database format. Arkheia is not a centralized platform, but allows any user (or group of users) to set up their own repository, either for public access by the general population, or locally for internal use. Overall, Arkheia provides users with an automatic means to communicate information about not only their models but also individual simulation results and the entire experimental context in an approachable graphical manner, thus facilitating the user's ability to collaborate in the field and outreach to a wider audience.

  11. Paying attention to reading: the neurobiology of reading and dyslexia.

    PubMed

    Shaywitz, Sally E; Shaywitz, Bennett A

    2008-01-01

    Extraordinary progress in functional brain imaging, primarily advances in functional magnetic resonance imaging, now allows scientists to understand the neural systems serving reading and how these systems differ in dyslexic readers. Scientists now speak of the neural signature of dyslexia, a singular achievement that for the first time has made visible what was previously a hidden disability. Paralleling this achievement in understanding the neurobiology of dyslexia, progress in the identification and treatment of dyslexia now offers the hope of identifying children at risk for dyslexia at a very young age and providing evidence-based, effective interventions. Despite these advances, for many dyslexic readers, becoming a skilled, automatic reader remains elusive, in great part because, though children with dyslexia can be taught to decode words, teaching children to read fluently and automatically represents the next frontier in research on dyslexia. We suggest that to break through this "fluency" barrier, investigators will need to reexamine the more than 20-year-old central dogma in reading research: that the generation of the phonological code from print is modular, that is, automatic, not attention demanding, and not requiring any other cognitive process. Recent findings now present a competing view: other cognitive processes are involved in reading, particularly attentional mechanisms, and disruption of these attentional mechanisms plays a causal role in reading difficulties. Recognition of the role of attentional mechanisms in reading now offers potentially new strategies for interventions in dyslexia. In particular, the use of pharmacotherapeutic agents affecting attentional mechanisms not only may provide a window into the neurochemical mechanisms underlying dyslexia but also may offer a potential adjunct treatment for teaching dyslexic readers to read fluently and automatically. Preliminary studies suggest that agents traditionally used to treat disorders of attention, particularly attention-deficit/hyperactivity disorder, may prove to be an effective adjunct to improving reading in dyslexic students.

  12. Extracting rate changes in transcriptional regulation from MEDLINE abstracts.

    PubMed

    Liu, Wenting; Miao, Kui; Li, Guangxia; Chang, Kuiyu; Zheng, Jie; Rajapakse, Jagath C

    2014-01-01

    Time delays are important factors that are often neglected in gene regulatory network (GRN) inference models. Validating time delays from knowledge bases is a challenge since the vast majority of biological databases do not record temporal information of gene regulations. Biological knowledge and facts on gene regulations are typically extracted from bio-literature with specialized methods that depend on the regulation task. In this paper, we mine evidences for time delays related to the transcriptional regulation of yeast from the PubMed abstracts. Since the vast majority of abstracts lack quantitative time information, we can only collect qualitative evidences of time delays. Specifically, the speed-up or delay in transcriptional regulation rate can provide evidences for time delays (shorter or longer) in GRN. Thus, we focus on deriving events related to rate changes in transcriptional regulation. A corpus of yeast regulation related abstracts was manually labeled with such events. In order to capture these events automatically, we create an ontology of sub-processes that are likely to result in transcription rate changes by combining textual patterns and biological knowledge. We also propose effective feature extraction methods based on the created ontology to identify the direct evidences with specific details of these events. Our ontologies outperform existing state-of-the-art gene regulation ontologies in the automatic rule learning method applied to our corpus. The proposed deterministic ontology rule-based method can achieve comparable performance to the automatic rule learning method based on decision trees. This demonstrates the effectiveness of our ontology in identifying rate-changing events. We also tested the effectiveness of the proposed feature mining methods on detecting direct evidence of events. Experimental results show that the machine learning method on these features achieves an F1-score of 71.43%. The manually labeled corpus of events relating to rate changes in transcriptional regulation for yeast is available in https://sites.google.com/site/wentingntu/data. The created ontologies summarized both biological causes of rate changes in transcriptional regulation and corresponding positive and negative textual patterns from the corpus. They are demonstrated to be effective in identifying rate-changing events, which shows the benefits of combining textual patterns and biological knowledge on extracting complex biological events.
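
    The ontology described above pairs sub-processes with textual patterns that signal speed-ups or delays in transcription rate. A minimal sketch of that kind of cue-pattern matching over abstract sentences follows; the regex patterns, the crude gene-name heuristic, and the example sentence are illustrative assumptions, not the authors' ontology or corpus.

    ```python
    import re

    # Illustrative cue patterns: direction of the transcription-rate change.
    RATE_PATTERNS = {
        "speed_up": re.compile(r"\b(accelerat\w+|enhanc\w+|rapid(ly)?\s+induc\w+)\b", re.I),
        "delay":    re.compile(r"\b(delay\w*|slow\w*|retard\w+|attenuat\w+)\b", re.I),
    }
    GENE = re.compile(r"\b[A-Z]{3}\d+\b")      # crude yeast gene-name heuristic

    def extract_rate_events(sentence):
        """Return (direction, genes) pairs suggested by one abstract sentence."""
        events = []
        for direction, pattern in RATE_PATTERNS.items():
            if pattern.search(sentence):
                events.append((direction, GENE.findall(sentence)))
        return events

    s = "Overexpression of GAL4 rapidly induced transcription of GAL1, whereas MIG1 delayed it."
    print(extract_rate_events(s))
    ```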

  13. Control of intelligent robots in space

    NASA Technical Reports Server (NTRS)

    Freund, E.; Buehler, CH.

    1989-01-01

    In view of space activities such as the International Space Station, the Man-Tended Free-Flyer (MTFF) and free-flying platforms, the development of intelligent robotic systems is gaining increasing importance. The range of applications that have to be performed by robotic systems in space includes, for example, the execution of experiments in space laboratories, the servicing and maintenance of satellites and flying platforms, the support of automatic production processes, and the assembly of large network structures. Some of these tasks will require the development of bi-armed or multiple robotic systems with functional redundancy. For the development of robotic systems able to perform this variety of tasks, a hierarchically structured modular concept of automation is required. This concept is characterized by high flexibility as well as by automatic specialization to the particular sequence of tasks to be performed. On the other hand, it has to be designed such that the human operator can influence or guide the system on different levels of control, supervision, and decision. This leads to requirements for the hardware and software concept that permit a range of application of the robotic systems from telemanipulation to autonomous operation. The realization of this goal requires strong efforts in the development of new methods, software and hardware concepts, and their integration into an automation concept.

  14. A programming environment for distributed complex computing. An overview of the Framework for Interdisciplinary Design Optimization (FIDO) project. NASA Langley TOPS exhibit H120b

    NASA Technical Reports Server (NTRS)

    Townsend, James C.; Weston, Robert P.; Eidson, Thomas M.

    1993-01-01

    The Framework for Interdisciplinary Design Optimization (FIDO) is a general programming environment for automating the distribution of complex computing tasks over a networked system of heterogeneous computers. For example, instead of manually passing a complex design problem between its diverse specialty disciplines, the FIDO system provides for automatic interactions between the discipline tasks and facilitates their communications. The FIDO system networks all the computers involved into a distributed heterogeneous computing system, so they have access to centralized data and can work on their parts of the total computation simultaneously in parallel whenever possible. Thus, each computational task can be done by the most appropriate computer. Results can be viewed as they are produced and variables changed manually for steering the process. The software is modular in order to ease migration to new problems: different codes can be substituted for each of the current code modules with little or no effect on the others. The potential for commercial use of FIDO rests in the capability it provides for automatically coordinating diverse computations on a networked system of workstations and computers. For example, FIDO could provide the coordination required for the design of vehicles or electronics or for modeling complex systems.

  15. Hybrid Semantic Analysis for Mapping Adverse Drug Reaction Mentions in Tweets to Medical Terminology.

    PubMed

    Emadzadeh, Ehsan; Sarker, Abeed; Nikfarjam, Azadeh; Gonzalez, Graciela

    2017-01-01

    Social networks, such as Twitter, have become important sources for active monitoring of user-reported adverse drug reactions (ADRs). Automatic extraction of ADR information can be crucial for healthcare providers, drug manufacturers, and consumers. However, because of the non-standard nature of social media language, automatically extracted ADR mentions need to be mapped to standard forms before they can be used by operational pharmacovigilance systems. We propose a modular natural language processing pipeline for mapping (normalizing) colloquial mentions of ADRs to their corresponding standardized identifiers. We seek to accomplish this task and enable customization of the pipeline so that distinct unlabeled free text resources can be incorporated to use the system for other normalization tasks. Our approach, which we call Hybrid Semantic Analysis (HSA), sequentially employs rule-based and semantic matching algorithms for mapping user-generated mentions to concept IDs in the Unified Medical Language System vocabulary. The semantic matching component of HSA is adaptive in nature and uses a regression model to combine various measures of semantic relatedness and resources to optimize normalization performance on the selected data source. On a publicly available corpus, our normalization method achieves 0.502 recall and 0.823 precision (F-measure: 0.624). Our proposed method outperforms a baseline based on latent semantic analysis and another that uses MetaMap.
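
    A minimal sketch of the two-stage cascade is given below, assuming a toy lexicon and illustrative concept identifiers; difflib string similarity merely stands in for the HSA regression over multiple semantic-relatedness measures.

      import difflib

      # Toy lexicon of standard ADR phrases mapped to illustrative concept IDs.
      LEXICON = {
          "headache": "C0018681",
          "nausea": "C0027497",
          "somnolence": "C0013144",
      }

      def normalize_adr(mention):
          """Stage 1: exact rule-based lookup; Stage 2: similarity-based fallback."""
          key = mention.lower().strip()
          if key in LEXICON:                      # rule-based component
              return LEXICON[key]
          # Proxy for the adaptive semantic-matching component of HSA.
          best = max(LEXICON,
                     key=lambda term: difflib.SequenceMatcher(None, key, term).ratio())
          return LEXICON[best]

      print(normalize_adr("Nausea"))          # exact match
      print(normalize_adr("feeling sleepy"))  # falls back to the closest lexicon entry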

  16. PubFinder: a tool for improving retrieval rate of relevant PubMed abstracts.

    PubMed

    Goetz, Thomas; von der Lieth, Claus-Wilhelm

    2005-07-01

    Since it is becoming increasingly laborious to manually extract useful information embedded in the ever-growing volumes of literature, automated intelligent text analysis tools are becoming more and more essential to assist in this task. PubFinder (www.glycosciences.de/tools/PubFinder) is a publicly available web tool designed to improve the retrieval rate of scientific abstracts relevant for a specific scientific topic. Only the selection of a representative set of abstracts, central to the scientific topic, is required. No special knowledge concerning the query syntax is necessary. Based on the selected abstracts, a list of discriminating words is automatically calculated, which is subsequently used for scoring all defined PubMed abstracts for their probability of belonging to the defined scientific topic. This results in a hit-list of references in descending order of their likelihood score. The algorithms and procedures implemented in PubFinder facilitate the perpetual task for every scientist of staying up-to-date with current publications dealing with a specific subject in biomedicine.
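
    The scoring idea can be sketched as follows; the log-ratio weighting here is a generic choice and not necessarily the exact statistic used by PubFinder.

      import math
      from collections import Counter

      def discriminating_words(topic_abstracts, background_abstracts, k=20):
          """Rank words by how much more frequent they are in the topic set than in the background."""
          topic = Counter(w for a in topic_abstracts for w in a.lower().split())
          background = Counter(w for a in background_abstracts for w in a.lower().split())
          n_t, n_b = sum(topic.values()), sum(background.values())
          score = {w: math.log((topic[w] / n_t) / ((background[w] + 1) / (n_b + 1)))
                   for w in topic}
          return dict(sorted(score.items(), key=lambda kv: kv[1], reverse=True)[:k])

      def score_abstract(abstract, weights):
          """Likelihood-style score: sum of discriminating-word weights present in the abstract."""
          return sum(weights.get(w, 0.0) for w in abstract.lower().split())

    Abstracts would then be ranked by their score in descending order to produce the hit-list.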

  17. Combining MEDLINE and publisher data to create parallel corpora for the automatic translation of biomedical text

    PubMed Central

    2013-01-01

    Background Most of the institutional and research information in the biomedical domain is available in the form of English text. Even in countries where English is an official language, such as the United States, language can be a barrier for accessing biomedical information for non-native speakers. Recent progress in machine translation suggests that this technique could help make English texts accessible to speakers of other languages. However, the lack of adequate specialized corpora needed to train statistical models currently limits the quality of automatic translations in the biomedical domain. Results We show how a large-sized parallel corpus can automatically be obtained for the biomedical domain, using the MEDLINE database. The corpus generated in this work comprises article titles obtained from MEDLINE and abstract text automatically retrieved from journal websites, which substantially extends the corpora used in previous work. After assessing the quality of the corpus for two language pairs (English/French and English/Spanish) we use the Moses package to train a statistical machine translation model that outperforms previous models for automatic translation of biomedical text. Conclusions We have built translation data sets in the biomedical domain that can easily be extended to other languages available in MEDLINE. These sets can successfully be applied to train statistical machine translation models. While further progress should be made by incorporating out-of-domain corpora and domain-specific lexicons, we believe that this work improves the automatic translation of biomedical texts. PMID:23631733

  18. Adaptive multi-resolution Modularity for detecting communities in networks

    NASA Astrophysics Data System (ADS)

    Chen, Shi; Wang, Zhi-Zhong; Bao, Mei-Hua; Tang, Liang; Zhou, Ji; Xiang, Ju; Li, Jian-Ming; Yi, Chen-He

    2018-02-01

    Community structure is a common topological property of complex networks that has attracted much attention from various fields. Optimizing quality functions for community structure, such as Modularity, is a popular strategy for community detection. Here, we introduce a general definition of Modularity from which several classical (multi-resolution) Modularity functions can be derived, and then propose an adaptive (multi-resolution) Modularity that can combine the advantages of the different functions. By applying the Modularity to various synthetic and real-world networks, we study the behavior of the methods, showing the validity and advantages of multi-resolution Modularity in community detection. The adaptive Modularity, as a multi-resolution method, can naturally address the first-type limit of Modularity and detect communities at different scales; it can quicken the disconnection of communities and delay their breakup in heterogeneous networks; thus it is expected to generate stable community structures in networks more effectively and to have stronger tolerance against the second-type limit of Modularity.
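
    For orientation, one widely used multi-resolution form of Modularity (the Reichardt-Bornholdt parameterization) is written below; this is a standard formulation and is not necessarily the general definition introduced in the paper.

      Q(\gamma) = \sum_{c} \left[ \frac{e_c}{m} - \gamma \left( \frac{d_c}{2m} \right)^{2} \right]

    Here e_c is the number of edges inside community c, d_c is the total degree of its nodes, m is the total number of edges, and the resolution parameter \gamma tunes the scale of the detected communities (\gamma = 1 recovers the standard Modularity).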

  19. Product modular design incorporating preventive maintenance issues

    NASA Astrophysics Data System (ADS)

    Gao, Yicong; Feng, Yixiong; Tan, Jianrong

    2016-03-01

    Traditional modular design methods lead to product maintenance problems because the module form of a system is created according to either function requirements or manufacturing considerations. To solve these problems, a new modular design method is proposed that considers not only the traditional function-related attributes but also maintenance-related ones. First, modularity parameters and modularity scenarios for product modularity are defined. Then, reliability and economic assessment models of product modularity strategies are formulated with the introduction of the effective working age of modules. A mathematical model is established to evaluate the differences among the modules of the product so that the optimal modules can be determined. After that, a multi-objective optimization problem, based on metrics for the preventive maintenance interval difference degree and preventive maintenance economics, is formulated for modular optimization. A multi-objective GA is utilized to rapidly approximate the Pareto set of optimal modularity strategy trade-offs between preventive maintenance cost and preventive maintenance interval difference degree. Finally, a coordinate CNC boring machine is adopted to illustrate the process of product modularity. In addition, two factorial design experiments based on the modularity parameters are constructed and analyzed; these experiments investigate the impacts of the parameters on the optimal modularity strategies and on the structure of the modules. The research proposes a new modular design method, which may help to improve the maintainability of products in modular design.

  20. Subliminal speech priming.

    PubMed

    Kouider, Sid; Dupoux, Emmanuel

    2005-08-01

    We present a novel subliminal priming technique that operates in the auditory modality. Masking is achieved by hiding a spoken word within a stream of time-compressed speechlike sounds with similar spectral characteristics. Participants were unable to consciously identify the hidden words, yet reliable repetition priming was found. This effect was unaffected by a change in the speaker's voice and remained restricted to lexical processing. The results show that the speech modality, like the written modality, involves the automatic extraction of abstract word-form representations that do not include nonlinguistic details. In both cases, priming operates at the level of discrete and abstract lexical entries and is little influenced by overlap in form or semantics.

  1. Reasoning about Function Objects

    NASA Astrophysics Data System (ADS)

    Nordio, Martin; Calcagno, Cristiano; Meyer, Bertrand; Müller, Peter; Tschannen, Julian

    Modern object-oriented languages support higher-order implementations through function objects such as delegates in C#, agents in Eiffel, or closures in Scala. Function objects bring a new level of abstraction to the object-oriented programming model, and require a comparable extension to specification and verification techniques. We introduce a verification methodology that extends function objects with auxiliary side-effect free (pure) methods to model logical artifacts: preconditions, postconditions and modifies clauses. These pure methods can be used to specify client code abstractly, that is, independently from specific instantiations of the function objects. To demonstrate the feasibility of our approach, we have implemented an automatic prover, which verifies several non-trivial examples.
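
    The idea of bundling a function object with pure specification methods can be sketched in Python as follows; the class and check names are illustrative and do not reproduce the authors' specification language or prover.

      class FunctionObject:
          """A closure bundled with side-effect-free precondition and postcondition checks."""
          def __init__(self, body, precondition, postcondition):
              self.body = body
              self.precondition = precondition      # pure: may read arguments only
              self.postcondition = postcondition    # pure: may read arguments and result only

          def __call__(self, *args):
              assert self.precondition(*args), "precondition violated"
              result = self.body(*args)
              assert self.postcondition(result, *args), "postcondition violated"
              return result

      # A square-root agent specified abstractly: client code can be reasoned about
      # using only these two checks, independently of the concrete closure.
      sqrt_agent = FunctionObject(
          body=lambda x: x ** 0.5,
          precondition=lambda x: x >= 0,
          postcondition=lambda r, x: abs(r * r - x) < 1e-9,
      )
      print(sqrt_agent(2.0))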

  2. Formal Analysis of the Remote Agent Before and After Flight

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Lowry, Mike; Park, SeungJoon; Pecheur, Charles; Penix, John; Visser, Willem; White, Jon L.

    2000-01-01

    This paper describes two separate efforts that used the SPIN model checker to verify deep space autonomy flight software. The first effort occurred at the beginning of a spiral development process and found five concurrency errors early in the design cycle that the developers acknowledge would not have been found through testing. This effort required a substantial manual modeling effort involving both abstraction and translation from the prototype LISP code to the PROMELA language used by SPIN. This experience and others led to research to address the gap between formal method tools and the development cycle used by software developers. The Java PathFinder tool which directly translates from Java to PROMELA was developed as part of this research, as well as automatic abstraction tools. In 1999 the flight software flew on a space mission, and a deadlock occurred in a sibling subsystem to the one which was the focus of the first verification effort. A second quick-response "cleanroom" verification effort found the concurrency error in a short amount of time. The error was isomorphic to one of the concurrency errors found during the first verification effort. The paper demonstrates that formal methods tools can find concurrency errors that indeed lead to loss of spacecraft functions, even for the complex software required for autonomy. Second, it describes progress in automatic translation and abstraction that eventually will enable formal methods tools to be inserted directly into the aerospace software development cycle.

  3. Development of an automatic subsea blowout preventer stack control system using PLC based SCADA.

    PubMed

    Cai, Baoping; Liu, Yonghong; Liu, Zengkai; Wang, Fei; Tian, Xiaojie; Zhang, Yanzhen

    2012-01-01

    An extremely reliable remote control system for a subsea blowout preventer stack is developed based on an off-the-shelf triple modular redundancy system. To meet a high reliability requirement, various redundancy techniques such as controller redundancy, bus redundancy and network redundancy are used to design the system hardware architecture. The control logic, the human-machine interface graphical design and the redundant databases are developed using off-the-shelf software. A series of experiments was performed in the laboratory to test the subsea blowout preventer stack control system. The results showed that the tested subsea blowout preventer functions could be executed successfully. For faults of the programmable logic controllers, discrete input groups and analog input groups, the control system could give correct alarms in the human-machine interface. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
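
    A minimal sketch of the 2-out-of-3 voting that underlies triple modular redundancy is shown below; the channel values and the alarm behavior are illustrative, not the actual PLC logic of the system.

      def tmr_vote(a, b, c):
          """2-out-of-3 majority vote over three redundant controller outputs."""
          if a == b or a == c:
              return a
          if b == c:
              return b
          raise RuntimeError("all three channels disagree -- raise an alarm")

      # A single faulty channel (here the third) is out-voted by the two healthy ones.
      assert tmr_vote("close_ram", "close_ram", "open_ram") == "close_ram"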

  4. Modulus stabilization in a non-flat warped braneworld scenario

    NASA Astrophysics Data System (ADS)

    Banerjee, Indrani; SenGupta, Soumitra

    2017-05-01

    The stability of the modulus field in a warped braneworld scenario has been a subject of interest for a long time. Goldberger and Wise (GW) proposed a mechanism to achieve this by invoking a massive scalar field in the bulk space-time, neglecting the back-reaction. In this work, we examine the possibility of stabilizing the modulus without bringing about any external scalar field. We show that instead of flat 3-branes as considered in the Randall-Sundrum (RS) warped braneworld model, if one considers a more generalized version of warped geometry with a de Sitter 3-brane, then the brane vacuum energy automatically leads to a modulus potential with a metastable minimum. Our result further reveals that in this scenario the gauge hierarchy problem can also be resolved for an appropriate choice of the brane's cosmological constant.

  5. Research on SEU hardening of heterogeneous Dual-Core SoC

    NASA Astrophysics Data System (ADS)

    Huang, Kun; Hu, Keliu; Deng, Jun; Zhang, Tao

    2017-08-01

    There are various schemes for implementing Single-Event Upset (SEU) hardening, but some of them require substantial human, material and financial resources. This paper proposes a simple SEU-hardening scheme for a Heterogeneous Dual-core SoC (HD SoC) that combines three techniques. First, an automatic Triple Modular Redundancy (TMR) technique is adopted to harden the register files of the processor and the instruction-fetching module. Second, Hamming codes are used to harden the random access memory (RAM). Last, a software signature technique is applied to check the programs running on the CPU. The scheme does not need to consume additional resources and has little influence on the performance of the CPU. These technologies are mature, easy to implement and low in cost. According to the simulation results, the scheme can satisfy the basic demands of SEU hardening.
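
    The RAM-protection idea can be illustrated with a plain Hamming(7,4) single-error-correcting code; this is a textbook construction, not the specific encoder used in the paper.

      def hamming74_encode(d):
          """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword."""
          d1, d2, d3, d4 = d
          p1 = d1 ^ d2 ^ d4      # covers codeword positions 1, 3, 5, 7
          p2 = d1 ^ d3 ^ d4      # covers codeword positions 2, 3, 6, 7
          p3 = d2 ^ d3 ^ d4      # covers codeword positions 4, 5, 6, 7
          return [p1, p2, d1, p3, d2, d3, d4]

      def hamming74_decode(c):
          """Correct a single-bit error and return the 4 data bits."""
          c = list(c)
          s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
          s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
          s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
          err = s1 + 2 * s2 + 4 * s3        # 0 means no error, else the 1-based position
          if err:
              c[err - 1] ^= 1
          return [c[2], c[4], c[5], c[6]]

      word = [1, 0, 1, 1]
      code = hamming74_encode(word)
      code[5] ^= 1                          # inject a single-bit upset
      assert hamming74_decode(code) == word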

  6. Software Considerations for Subscale Flight Testing of Experimental Control Laws

    NASA Technical Reports Server (NTRS)

    Murch, Austin M.; Cox, David E.; Cunningham, Kevin

    2009-01-01

    The NASA AirSTAR system has been designed to address the challenges associated with safe and efficient subscale flight testing of research control laws in adverse flight conditions. In this paper, software elements of this system are described, with an emphasis on components which allow for rapid prototyping and deployment of aircraft control laws. Through model-based design and automatic coding a common code-base is used for desktop analysis, piloted simulation and real-time flight control. The flight control system provides the ability to rapidly integrate and test multiple research control laws and to emulate component or sensor failures. Integrated integrity monitoring systems provide aircraft structural load protection, isolate the system from control algorithm failures, and monitor the health of telemetry streams. Finally, issues associated with software configuration management and code modularity are briefly discussed.

  7. An automated Genomes-to-Natural Products platform (GNP) for the discovery of modular natural products.

    PubMed

    Johnston, Chad W; Skinnider, Michael A; Wyatt, Morgan A; Li, Xiang; Ranieri, Michael R M; Yang, Lian; Zechel, David L; Ma, Bin; Magarvey, Nathan A

    2015-09-28

    Bacterial natural products are a diverse and valuable group of small molecules, and genome sequencing indicates that the vast majority remain undiscovered. The prediction of natural product structures from biosynthetic assembly lines can facilitate their discovery, but highly automated, accurate, and integrated systems are required to mine the broad spectrum of sequenced bacterial genomes. Here we present a genome-guided natural products discovery tool to automatically predict, combinatorialize and identify polyketides and nonribosomal peptides from biosynthetic assembly lines using LC-MS/MS data of crude extracts in a high-throughput manner. We detail the directed identification and isolation of six genetically predicted polyketides and nonribosomal peptides using our Genome-to-Natural Products platform. This highly automated, user-friendly programme provides a means of realizing the potential of genetically encoded natural products.

  8. A context-adaptable approach to clinical guidelines.

    PubMed

    Terenziani, Paolo; Montani, Stefania; Bottrighi, Alessio; Torchio, Mauro; Molino, Gianpaolo; Correndo, Gianluca

    2004-01-01

    One of the most relevant obstacles to the use and dissemination of clinical guidelines is the gap between the generality of guidelines (as defined, e.g., by physicians' committees) and the peculiarities of the specific context of application. In particular, general guidelines do not take into account the fact that the tools needed for laboratory and instrumental investigations might be unavailable at a given hospital. Moreover, computer-based guideline managers must also be integrated with the Hospital Information System (HIS), and usually different DBMS are adopted by different hospitals. The GLARE (Guideline Acquisition, Representation and Execution) system addresses these issues by providing a facility for automatic resource-based adaptation of guidelines to the specific context of application, and by providing a modular architecture in which only limited and well-localised changes are needed to integrate the system with the HIS at hand.

  9. Development of modularity in the neural activity of childrenʼs brains

    NASA Astrophysics Data System (ADS)

    Chen, Man; Deem, Michael W.

    2015-02-01

    We study how modularity of the human brain changes as children develop into adults. Theory suggests that modularity can enhance the response function of a networked system subject to changing external stimuli. Thus, greater cognitive performance might be achieved for more modular neural activity, and modularity is likely to increase as children develop. The value of modularity calculated from functional magnetic resonance imaging (fMRI) data is observed to increase during childhood development and peak in young adulthood. Head motion is deconvolved from the fMRI data, and it is shown that the dependence of modularity on age is independent of the magnitude of head motion. A model is presented to illustrate how modularity can provide greater cognitive performance at short times, i.e. in task switching. A fitness function is extracted from the model. Quasispecies theory is used to predict how the average modularity evolves with age, illustrating the increase of modularity during development from children to adults that arises from selection for rapid cognitive function in young adults. Experiments exploring the effect of modularity on cognitive performance are suggested. Modularity may be a potential biomarker for injury, rehabilitation, or disease.

  10. Digital Morphing Wing: Active Wing Shaping Concept Using Composite Lattice-Based Cellular Structures

    PubMed Central

    Jenett, Benjamin; Calisch, Sam; Cellucci, Daniel; Cramer, Nick; Gershenfeld, Neil; Swei, Sean

    2017-01-01

    Abstract We describe an approach for the discrete and reversible assembly of tunable and actively deformable structures using modular building block parts for robotic applications. The primary technical challenge addressed by this work is the use of this method to design and fabricate low density, highly compliant robotic structures with spatially tuned stiffness. This approach offers a number of potential advantages over more conventional methods for constructing compliant robots. The discrete assembly reduces manufacturing complexity, as relatively simple parts can be batch-produced and joined to make complex structures. Global mechanical properties can be tuned based on sub-part ordering and geometry, because local stiffness and density can be independently set to a wide range of values and varied spatially. The structure's intrinsic modularity can significantly simplify analysis and simulation. Simple analytical models for the behavior of each building block type can be calibrated with empirical testing and synthesized into a highly accurate and computationally efficient model of the full compliant system. As a case study, we describe a modular and reversibly assembled wing that performs continuous span-wise twist deformation. It exhibits high performance aerodynamic characteristics, is lightweight and simple to fabricate and repair. The wing is constructed from discrete lattice elements, wherein the geometric and mechanical attributes of the building blocks determine the global mechanical properties of the wing. We describe the mechanical design and structural performance of the digital morphing wing, including their relationship to wind tunnel tests that suggest the ability to increase roll efficiency compared to a conventional rigid aileron system. We focus here on describing the approach to design, modeling, and construction as a generalizable approach for robotics that require very lightweight, tunable, and actively deformable structures. PMID:28289574

  11. Synthesis, structural characterization and selectively catalytic properties of metal-organic frameworks with nano-sized channels: A modular design strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qiu Lingguang; Gu Lina; Hu Gang

    2009-03-15

    A modular design method for designing and synthesizing microporous metal-organic frameworks (MOFs) with selective catalytic activity is described. MOFs with both nano-sized channels and potential catalytic activity can be obtained through self-assembly of a framework unit and a catalyst unit. By selecting hexaaquo metal complexes and the ligand BTC (BTC = 1,3,5-benzenetricarboxylate) as framework-building blocks and using the metal complex [M(phen)2(H2O)2]2+ (phen = 1,10-phenanthroline) as a catalyst unit, a series of supramolecular MOFs 1-7 with three-dimensional nano-sized channels, i.e. [M1(H2O)6]·[M2(phen)2(H2O)2]2·2(BTC)·xH2O (M1, M2 = Co(II), Ni(II), Cu(II), Zn(II), or Mn(II); x = 22-24), were synthesized through self-assembly, and their structures were characterized by IR, elemental analysis, and single-crystal X-ray diffraction. These supramolecular microporous MOFs showed significant size and shape selectivity in the catalyzed oxidation of phenols, which is attributed to catalytic reactions taking place in the channels of the framework. The design strategy, synthesis, and self-assembly mechanism for the construction of these porous MOFs are discussed. - Graphical abstract: A modular design strategy has been developed to synthesize microporous metal-organic frameworks with potential catalytic activity by self-assembly of the framework-building blocks and the catalyst unit.

  12. Impact of translation on named-entity recognition in radiology texts

    PubMed Central

    Pedro, Vasco

    2017-01-01

    Abstract Radiology reports describe the results of radiography procedures and have the potential of being a useful source of information which can bring benefits to health care systems around the world. One way to automatically extract information from the reports is by using Text Mining tools. The problem is that these tools are mostly developed for English and reports are usually written in the native language of the radiologist, which is not necessarily English. This creates an obstacle to the sharing of Radiology information between different communities. This work explores the solution of translating the reports to English before applying the Text Mining tools, probing the question of what translation approach should be used. We created MRRAD (Multilingual Radiology Research Articles Dataset), a parallel corpus of Portuguese research articles related to Radiology and a number of alternative translations (human, automatic and semi-automatic) to English. This is a novel corpus which can be used to move forward the research on this topic. Using MRRAD we studied which kind of automatic or semi-automatic translation approach is more effective on the Named-entity recognition task of finding RadLex terms in the English version of the articles. Considering the terms extracted from human translations as our gold standard, we calculated how similar to this standard were the terms extracted using other translations. We found that a completely automatic translation approach using Google leads to F-scores (between 0.861 and 0.868, depending on the extraction approach) similar to the ones obtained through a more expensive semi-automatic translation approach using Unbabel (between 0.862 and 0.870). To better understand the results we also performed a qualitative analysis of the type of errors found in the automatic and semi-automatic translations. Database URL: https://github.com/lasigeBioTM/MRRAD PMID:29220455

  13. Origin of Emotion Effects on ERP Correlates of Emotional Word Processing: The Emotion Duality Approach.

    PubMed

    Imbir, Kamil Konrad; Jarymowicz, Maria Teresa; Spustek, Tomasz; Kuś, Rafał; Żygierewicz, Jarosław

    2015-01-01

    We distinguish two evaluative systems which evoke automatic and reflective emotions. Automatic emotions are direct reactions to stimuli whereas reflective emotions are always based on verbalized (and often abstract) criteria of evaluation. We conducted an electroencephalography (EEG) study in which 25 women were required to read and respond to emotional words which engaged either the automatic or reflective system. Stimulus words were emotional (positive or negative) and neutral. We found an effect of valence on an early response with dipolar fronto-occipital topography; positive words evoked a higher amplitude response than negative words. We also found that topographically specific differences in the amplitude of the late positive complex were related to the system involved in processing. Emotional stimuli engaging the automatic system were associated with significantly higher amplitudes in the left-parietal region; the response to neutral words was similar regardless of the system engaged. A different pattern of effects was observed in the central region: neutral stimuli engaging the reflective system evoked a higher-amplitude response, whereas there was no system effect for emotional stimuli. These differences could not be reduced to effects of differences between the arousing properties and concreteness of the words used as stimuli.

  14. Origin of Emotion Effects on ERP Correlates of Emotional Word Processing: The Emotion Duality Approach

    PubMed Central

    Imbir, Kamil Konrad; Jarymowicz, Maria Teresa; Spustek, Tomasz; Kuś, Rafał; Żygierewicz, Jarosław

    2015-01-01

    We distinguish two evaluative systems which evoke automatic and reflective emotions. Automatic emotions are direct reactions to stimuli whereas reflective emotions are always based on verbalized (and often abstract) criteria of evaluation. We conducted an electroencephalography (EEG) study in which 25 women were required to read and respond to emotional words which engaged either the automatic or reflective system. Stimulus words were emotional (positive or negative) and neutral. We found an effect of valence on an early response with dipolar fronto-occipital topography; positive words evoked a higher amplitude response than negative words. We also found that topographically specific differences in the amplitude of the late positive complex were related to the system involved in processing. Emotional stimuli engaging the automatic system were associated with significantly higher amplitudes in the left-parietal region; the response to neutral words was similar regardless of the system engaged. A different pattern of effects was observed in the central region: neutral stimuli engaging the reflective system evoked a higher-amplitude response, whereas there was no system effect for emotional stimuli. These differences could not be reduced to effects of differences between the arousing properties and concreteness of the words used as stimuli. PMID:25955719

  15. Evaluation of a depth proportional intake device for automatic pumping samplers

    Treesearch

    Rand E. Eads; Robert B. Thomas

    1983-01-01

    Abstract - A depth proportional intake boom for portable pumping samplers was used to collect suspended sediment samples in two coastal streams for three winters. The boom pivots on the stream bed while a float on the downstream end allows debris to depress the boom and pass without becoming trapped. This equipment modifies point sampling by maintaining the intake...

  16. Maestro Workflow Conductor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di Natale, Francesco

    2017-06-01

    MaestroWF is a Python tool and software package for loading YAML study specifications that represent a simulation campaign. The package is capable of parameterizing a study, pulling dependencies automatically, formatting output directories, and managing the flow and execution of the campaign. MaestroWF also provides a set of abstracted objects that can be used to develop user-specific scripts for launching simulation campaigns.

  17. JPRS Report, Science & Technology, USSR: Materials Science

    DTIC Science & Technology

    1988-03-15

    another. It has developed and transferred to design institutes technological schemes of transportation systems for working deep levels at the Sarbay... manuscript received 1 Oct 84, in final version 19 May 86) pp 45-50 [Article by V.V. Shefel, engineer, Energomontazhproyekt Design-Technological... Institute, Moscow] [Abstract] Three variants of automatic argon-arc welding have been developed at the Energomontazh Design-Technological Institute

  18. Datasets of Odontocete Sounds Annotated for Developing Automatic Detection Methods

    DTIC Science & Technology

    2010-12-01

    Passive acoustic detection of Minke whales (Balaenoptera acutorostrata) off the West Coast of Kauai, HI. Book of abstracts, Fourth International... Workshop on Detection, Classification and Localization of Marine Mammals using Passive Acoustics, Pavia, Italy, Sept. 10-13, 2009, p. 57. Roch, M., Y... Mellinger, and D. Gillespie. 2010. Comparison of beaked whale detection algorithms. Applied Acoustics 71:1043-1049.

  19. International Congress on Glass XII (in several languages)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doremus, R H; LaCourse, W C; Mackenzie, J D

    1980-01-01

    A total of 158 papers are included under nine headings: structure and glass formation; optical properties; electrical and magnetic properties; mechanical properties and relaxation; mass transport; chemical durability and surfaces; nucleation, crystallization, and glass ceramics; processing; and automatic controls. Separate abstracts were prepared for eight papers; four of the remaining papers had been processed previously for the data base. (DLC)

  20. Patient-Specific Deep Architectural Model for ECG Classification

    PubMed Central

    Luo, Kan; Cuschieri, Alfred

    2017-01-01

    Heartbeat classification is a crucial step for arrhythmia diagnosis during electrocardiographic (ECG) analysis. The new scenario of wireless body sensor network- (WBSN-) enabled ECG monitoring puts forward a higher-level demand for this traditional ECG analysis task. Previously reported methods mainly addressed this requirement with the application of a shallow structured classifier and expert-designed features. In this study, the modified frequency slice wavelet transform (MFSWT) was first employed to produce the time-frequency image for the heartbeat signal. Then a deep learning (DL) method was applied for heartbeat classification. Here, we propose a novel model incorporating automatic feature abstraction and a deep neural network (DNN) classifier. Features were automatically abstracted by a stacked denoising auto-encoder (SDA) from the transformed time-frequency image. The DNN classifier was constructed from an encoder layer of the SDA and a softmax layer. In addition, a deterministic patient-specific heartbeat classifier was obtained by fine-tuning on heartbeat samples that included a small subset of individual samples. The performance of the proposed model was evaluated on the MIT-BIH arrhythmia database. Results showed that an overall accuracy of 97.5% was achieved using the proposed model, confirming that the proposed DNN model is a powerful tool for heartbeat pattern recognition. PMID:29065597
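
    A minimal sketch of the SDA-plus-softmax architecture is given below in PyTorch, assuming illustrative image and class dimensions; it is not the authors' network configuration.

      import torch
      import torch.nn as nn

      # Hypothetical dimensions: a flattened 64x64 time-frequency image and 5 heartbeat classes.
      IMG_DIM, HIDDEN, N_CLASSES = 64 * 64, 256, 5

      class DenoisingAutoEncoder(nn.Module):
          """One SDA layer: corrupt the input, then reconstruct it through a bottleneck."""
          def __init__(self, dim_in, dim_hidden, noise=0.1):
              super().__init__()
              self.noise = noise
              self.encoder = nn.Sequential(nn.Linear(dim_in, dim_hidden), nn.ReLU())
              self.decoder = nn.Linear(dim_hidden, dim_in)

          def forward(self, x):
              corrupted = x + self.noise * torch.randn_like(x)
              return self.decoder(self.encoder(corrupted))

      # After unsupervised pre-training, the encoder is reused and topped with a softmax layer;
      # patient-specific fine-tuning would further update these weights on a small set of
      # that patient's labeled heartbeats.
      dae = DenoisingAutoEncoder(IMG_DIM, HIDDEN)
      classifier = nn.Sequential(dae.encoder, nn.Linear(HIDDEN, N_CLASSES), nn.Softmax(dim=1))

      probs = classifier(torch.rand(8, IMG_DIM))   # 8 dummy time-frequency images
      print(probs.shape)                           # torch.Size([8, 5])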

  1. pyomo.dae: a modeling and automatic discretization framework for optimization with differential and algebraic equations

    DOE PAGES

    Nicholson, Bethany; Siirola, John D.; Watson, Jean-Paul; ...

    2017-12-20

    We describe pyomo.dae, an open source Python-based modeling framework that enables high-level abstract specification of optimization problems with differential and algebraic equations. The pyomo.dae framework is integrated with the Pyomo open source algebraic modeling language, and is available at http://www.pyomo.org. One key feature of pyomo.dae is that it does not restrict users to standard, predefined forms of differential equations, providing a high degree of modeling flexibility and the ability to express constraints that cannot be easily specified in other modeling frameworks. Other key features of pyomo.dae are the ability to specify optimization problems with high-order differential equations and partial differential equations, defined on restricted domain types, and the ability to automatically transform high-level abstract models into finite-dimensional algebraic problems that can be solved with off-the-shelf solvers. Moreover, pyomo.dae users can leverage existing capabilities of Pyomo to embed differential equation models within stochastic and integer programming models and mathematical programs with equilibrium constraint formulations. Collectively, these features enable the exploration of new modeling concepts, discretization schemes, and the benchmarking of state-of-the-art optimization solvers.
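
    A minimal example of the kind of model pyomo.dae accepts is sketched below, assuming Pyomo is installed; the ODE and the discretization settings are illustrative.

      from pyomo.environ import ConcreteModel, Var, Constraint, TransformationFactory
      from pyomo.dae import ContinuousSet, DerivativeVar

      # Simple first-order decay dx/dt = -2 x on t in [0, 1], with x(0) = 1.
      m = ConcreteModel()
      m.t = ContinuousSet(bounds=(0, 1))
      m.x = Var(m.t)
      m.dxdt = DerivativeVar(m.x, wrt=m.t)

      def _ode(m, t):
          return m.dxdt[t] == -2.0 * m.x[t]
      m.ode = Constraint(m.t, rule=_ode)
      m.x[m.t.first()].fix(1.0)

      # Automatic transformation into a finite-dimensional algebraic problem.
      TransformationFactory('dae.finite_difference').apply_to(m, nfe=20, scheme='BACKWARD')

    After the transformation, the model contains only algebraic variables and constraints and can be handed to a standard off-the-shelf solver.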

  2. Automatic blocking for complex three-dimensional configurations

    NASA Technical Reports Server (NTRS)

    Dannenhoffer, John F., III

    1995-01-01

    A new blocking technique for complex three-dimensional configurations is described. This new technique is based upon the concept of an abstraction, or squared-up representation, of the configuration and the associated grid. By allowing the user to describe blocking requirements in natural terms (such as 'wrap a grid around this leading edge' or 'make all grid lines emanating from this wall orthogonal to it'), users can quickly generate complex grids around complex configurations, while still maintaining a high level of control where desired. An added advantage of the abstraction concept is that once a blocking is defined for a class of configurations, it can be automatically applied to other configurations of the same class, making the new technique particularly well suited for the parametric variations which typically occur during design processes. Grids have been generated for a variety of real-world, two- and three-dimensional configurations. In all cases, the time required to generate the grid, given just an electronic form of the configuration, was at most a few days. Hence with this new technique, the generation of a block-structured grid is only slightly more expensive than the generation of an unstructured grid for the same configuration.

  3. pyomo.dae: a modeling and automatic discretization framework for optimization with differential and algebraic equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nicholson, Bethany; Siirola, John D.; Watson, Jean-Paul

    We describe pyomo.dae, an open source Python-based modeling framework that enables high-level abstract specification of optimization problems with differential and algebraic equations. The pyomo.dae framework is integrated with the Pyomo open source algebraic modeling language, and is available at http://www.pyomo.org. One key feature of pyomo.dae is that it does not restrict users to standard, predefined forms of differential equations, providing a high degree of modeling flexibility and the ability to express constraints that cannot be easily specified in other modeling frameworks. Other key features of pyomo.dae are the ability to specify optimization problems with high-order differential equations and partial differential equations, defined on restricted domain types, and the ability to automatically transform high-level abstract models into finite-dimensional algebraic problems that can be solved with off-the-shelf solvers. Moreover, pyomo.dae users can leverage existing capabilities of Pyomo to embed differential equation models within stochastic and integer programming models and mathematical programs with equilibrium constraint formulations. Collectively, these features enable the exploration of new modeling concepts, discretization schemes, and the benchmarking of state-of-the-art optimization solvers.

  4. From gaze cueing to perspective taking: Revisiting the claim that we automatically compute where or what other people are looking at

    PubMed Central

    Bukowski, Henryk; Hietanen, Jari K.; Samson, Dana

    2015-01-01

    ABSTRACT Two paradigms have shown that people automatically compute what or where another person is looking at. In the visual perspective-taking paradigm, participants judge how many objects they see; whereas, in the gaze cueing paradigm, participants identify a target. Unlike in the former task, in the latter task, the influence of what or where the other person is looking at is only observed when the other person is presented alone before the task-relevant objects. We show that this discrepancy across the two paradigms is not due to differences in visual settings (Experiment 1) or available time to extract the directional information (Experiment 2), but that it is caused by how attention is deployed in response to task instructions (Experiment 3). Thus, the mere presence of another person in the field of view is not sufficient to compute where/what that person is looking at, which qualifies the claimed automaticity of such computations. PMID:26924936

  5. Automatically classifying sentences in full-text biomedical articles into Introduction, Methods, Results and Discussion.

    PubMed

    Agarwal, Shashank; Yu, Hong

    2009-12-01

    Biomedical texts can be typically represented by four rhetorical categories: Introduction, Methods, Results and Discussion (IMRAD). Classifying sentences into these categories can benefit many other text-mining tasks. Although many studies have applied different approaches for automatically classifying sentences in MEDLINE abstracts into the IMRAD categories, few have explored the classification of sentences that appear in full-text biomedical articles. We first evaluated whether sentences in full-text biomedical articles could be reliably annotated into the IMRAD format and then explored different approaches for automatically classifying these sentences into the IMRAD categories. Our results show an overall annotation agreement of 82.14% with a Kappa score of 0.756. The best classification system is a multinomial naïve Bayes classifier trained on manually annotated data that achieved 91.95% accuracy and an average F-score of 91.55%, which is significantly higher than baseline systems. A web version of this system is available online at-http://wood.ims.uwm.edu/full_text_classifier/.
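
    The core of such a classifier can be sketched with scikit-learn as below; the four training sentences are fabricated stand-ins for the manually annotated corpus.

      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.naive_bayes import MultinomialNB
      from sklearn.pipeline import make_pipeline

      # Tiny illustrative training set; the real system is trained on annotated full-text sentences.
      sentences = [
          "Protein function prediction remains an open problem.",          # Introduction
          "Samples were incubated for 24 hours at 37 degrees.",            # Methods
          "The classifier achieved 91 percent accuracy on the test set.",  # Results
          "These findings suggest a role for chaperones in folding.",      # Discussion
      ]
      labels = ["Introduction", "Methods", "Results", "Discussion"]

      model = make_pipeline(CountVectorizer(ngram_range=(1, 2)), MultinomialNB())
      model.fit(sentences, labels)
      print(model.predict(["Cells were lysed and centrifuged at 10000 g."]))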

  6. Kinase Pathway Database: An Integrated Protein-Kinase and NLP-Based Protein-Interaction Resource

    PubMed Central

    Koike, Asako; Kobayashi, Yoshiyuki; Takagi, Toshihisa

    2003-01-01

    Protein kinases play a crucial role in the regulation of cellular functions. Various kinds of information about these molecules are important for understanding signaling pathways and organism characteristics. We have developed the Kinase Pathway Database, an integrated database involving major completely sequenced eukaryotes. It contains the classification of protein kinases and their functional conservation, ortholog tables among species, protein–protein, protein–gene, and protein–compound interaction data, domain information, and structural information. It also provides an automatic pathway graphic image interface. The protein, gene, and compound interactions are automatically extracted from abstracts for all genes and proteins by natural-language processing (NLP). The method of automatic extraction uses phrase patterns and the GENA protein, gene, and compound name dictionary, which was developed by our group. With this database, pathways are easily compared among species using data with more than 47,000 protein interactions and protein kinase ortholog tables. The database is available for querying and browsing at http://kinasedb.ontology.ims.u-tokyo.ac.jp/. PMID:12799355

  7. Experiments in automatic word class and word sense identification for information retrieval

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gauch, S.; Futrelle, R.P.

    Automatic identification of related words and automatic detection of word senses are two long-standing goals of researchers in natural language processing. Word class information and word sense identification may enhance the performance of information retrieval systems. Large online corpora and increased computational capabilities make new techniques based on corpus linguistics feasible. Corpus-based analysis is especially needed for corpora from specialized fields for which no electronic dictionaries or thesauri exist. The methods described here use a combination of mutual information and word context to establish word similarities. Then, unsupervised classification is done using clustering in the word space, identifying word classes without pretagging. We also describe an extension of the method to handle the difficult problems of disambiguation and of determining part-of-speech and semantic information for low-frequency words. The method is powerful enough to produce high-quality results on a small corpus of 200,000 words from abstracts in a field of molecular biology.
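
    A compact sketch of the mutual-information component is given below; the windowed co-occurrence counting and normalization are generic choices, not the exact statistics of the original experiments.

      import math
      from collections import Counter

      def cooccurrence_pmi(sentences, window=5):
          """Pointwise mutual information between words co-occurring within a window."""
          word_counts, pair_counts, total = Counter(), Counter(), 0
          for s in sentences:
              tokens = s.lower().split()
              total += len(tokens)
              word_counts.update(tokens)
              for i, w in enumerate(tokens):
                  for v in tokens[i + 1: i + 1 + window]:
                      pair_counts[tuple(sorted((w, v)))] += 1
          pmi = {}
          for (w, v), n_wv in pair_counts.items():
              p_wv = n_wv / total
              p_w, p_v = word_counts[w] / total, word_counts[v] / total
              pmi[(w, v)] = math.log(p_wv / (p_w * p_v))
          return pmi

    Word-context vectors built from such scores would then be clustered (for example with k-means) to obtain word classes without pretagging.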

  8. Automatic Generation of Cycle-Approximate TLMs with Timed RTOS Model Support

    NASA Astrophysics Data System (ADS)

    Hwang, Yonghyun; Schirner, Gunar; Abdi, Samar

    This paper presents a technique for automatically generating cycle-approximate transaction level models (TLMs) for multi-process applications mapped to embedded platforms. It incorporates three key features: (a) basic block level timing annotation, (b) RTOS model integration, and (c) RTOS overhead delay modeling. The inputs to TLM generation are application C processes and their mapping to processors in the platform. A processor data model, including pipelined datapath, memory hierarchy and branch delay model, is used to estimate basic block execution delays. The delays are annotated to the C code, which is then integrated with a generated SystemC RTOS model. Our abstract RTOS provides dynamic scheduling and inter-process communication (IPC) with processor- and RTOS-specific pre-characterized timing. Our experiments using an MP3 decoder and a JPEG encoder show that timed TLMs, with integrated RTOS models, can be automatically generated in less than a minute. Our generated TLMs simulated three times faster than real-time and showed less than 10% timing error compared to board measurements.

  9. A multilingual gold-standard corpus for biomedical concept recognition: the Mantra GSC

    PubMed Central

    Clematide, Simon; Akhondi, Saber A; van Mulligen, Erik M; Rebholz-Schuhmann, Dietrich

    2015-01-01

    Objective To create a multilingual gold-standard corpus for biomedical concept recognition. Materials and methods We selected text units from different parallel corpora (Medline abstract titles, drug labels, biomedical patent claims) in English, French, German, Spanish, and Dutch. Three annotators per language independently annotated the biomedical concepts, based on a subset of the Unified Medical Language System and covering a wide range of semantic groups. To reduce the annotation workload, automatically generated preannotations were provided. Individual annotations were automatically harmonized and then adjudicated, and cross-language consistency checks were carried out to arrive at the final annotations. Results The number of final annotations was 5530. Inter-annotator agreement scores indicate good agreement (median F-score 0.79), and are similar to those between individual annotators and the gold standard. The automatically generated harmonized annotation set for each language performed equally well as the best annotator for that language. Discussion The use of automatic preannotations, harmonized annotations, and parallel corpora helped to keep the manual annotation efforts manageable. The inter-annotator agreement scores provide a reference standard for gauging the performance of automatic annotation techniques. Conclusion To our knowledge, this is the first gold-standard corpus for biomedical concept recognition in languages other than English. Other distinguishing features are the wide variety of semantic groups that are being covered, and the diversity of text genres that were annotated. PMID:25948699

  10. Modular Courses in British Higher Education: A Critical Assessment

    ERIC Educational Resources Information Center

    Church, Clive

    1975-01-01

    The trend towards modular course structures is examined. British conceptions of modularization are compared with American interpretations of modular instruction, the former shown to be concerned almost exclusively with content, the latter attempting more radical changes in students' learning behavior. Rationales for British modular schemes are…

  11. Automatic quantification of morphological features for hepatic trabeculae analysis in stained liver specimens

    PubMed Central

    Ishikawa, Masahiro; Murakami, Yuri; Ahi, Sercan Taha; Yamaguchi, Masahiro; Kobayashi, Naoki; Kiyuna, Tomoharu; Yamashita, Yoshiko; Saito, Akira; Abe, Tokiya; Hashiguchi, Akinori; Sakamoto, Michiie

    2016-01-01

    Abstract. This paper proposes a digital image analysis method to support quantitative pathology by automatically segmenting the hepatocyte structure and quantifying its morphological features. To structurally analyze histopathological hepatic images, we isolate the trabeculae by extracting the sinusoids, fat droplets, and stromata. We then measure the morphological features of the extracted trabeculae, divide the image into cords, and calculate the feature values of the local cords. We propose a method of calculating the nuclear–cytoplasmic ratio, nuclear density, and number of layers using the local cords. Furthermore, we evaluate the effectiveness of the proposed method using surgical specimens. The proposed method was found to be an effective method for the quantification of the Edmondson grade. PMID:27335894

  12. Large-scale classification of traffic signs under real-world conditions

    NASA Astrophysics Data System (ADS)

    Hazelhoff, Lykele; Creusen, Ivo; van de Wouw, Dennis; de With, Peter H. N.

    2012-02-01

    Traffic sign inventories are important to governmental agencies as they facilitate evaluation of traffic sign locations and are beneficial for road and sign maintenance. These inventories can be created (semi-)automatically based on street-level panoramic images. In these images, object detection is employed to detect the signs in each image, followed by a classification stage to retrieve the specific sign type. Classification of traffic signs is a complicated matter, since sign types are very similar with only minor differences within the sign, a high number of different signs is involved and multiple distortions occur, including variations in capturing conditions, occlusions, viewpoints and sign deformations. Therefore, we propose a method for robust classification of traffic signs, based on the Bag of Words approach for generic object classification. We extend the approach with a flexible, modular codebook to model the specific features of each sign type independently, in order to emphasize the inter-sign differences instead of the parts common to all sign types. Additionally, this allows us to model and label the false detections that occur. Furthermore, analysis of the classification output identifies the unreliable results. This classification system has been extensively tested on three different sign classes, covering 60 different sign types in total. These three data sets contain the sign detection results on street-level panoramic images, extracted from a country-wide database. The introduction of the modular codebook shows a significant improvement for all three sets, where the system is able to classify about 98% of the reliable results correctly.
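
    The modular-codebook idea can be sketched as follows, with random vectors standing in for real local image descriptors; the codebook sizes and descriptor dimensionality are illustrative.

      import numpy as np
      from sklearn.cluster import KMeans

      def build_modular_codebook(descriptors_by_sign, words_per_sign=16):
          """One small codebook per sign type, rather than a single shared vocabulary."""
          return {sign: KMeans(n_clusters=words_per_sign, n_init=10).fit(d)
                  for sign, d in descriptors_by_sign.items()}

      def encode(descriptors, codebook):
          """Concatenate per-sign-type histograms of visual-word assignments."""
          parts = []
          for sign in sorted(codebook):
              words = codebook[sign].predict(descriptors)
              hist = np.bincount(words, minlength=codebook[sign].n_clusters)
              parts.append(hist / max(len(descriptors), 1))
          return np.concatenate(parts)

      # Toy usage with random 64-D "descriptors" standing in for real local features.
      rng = np.random.default_rng(0)
      codebook = build_modular_codebook({"stop": rng.normal(size=(200, 64)),
                                         "yield": rng.normal(size=(200, 64))})
      print(encode(rng.normal(size=(50, 64)), codebook).shape)   # (32,)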

  13. Development and Performance of the Modularized, High-performance Computing and Hybrid-architecture Capable GEOS-Chem Chemical Transport Model

    NASA Astrophysics Data System (ADS)

    Long, M. S.; Yantosca, R.; Nielsen, J.; Linford, J. C.; Keller, C. A.; Payer Sulprizio, M.; Jacob, D. J.

    2014-12-01

    The GEOS-Chem global chemical transport model (CTM), used by a large atmospheric chemistry research community, has been reengineered to serve as a platform for a range of computational atmospheric chemistry science foci and applications. Development included modularization for coupling to general circulation and Earth system models (ESMs) and the adoption of co-processor capable atmospheric chemistry solvers. This was done using an Earth System Modeling Framework (ESMF) interface that operates independently of GEOS-Chem scientific code to permit seamless transition from the GEOS-Chem stand-alone serial CTM to deployment as a coupled ESM module. In this manner, the continual stream of updates contributed by the CTM user community is automatically available for broader applications, which remain state-of-science and directly referenceable to the latest version of the standard GEOS-Chem CTM. These developments are now available as part of the standard version of the GEOS-Chem CTM. The system has been implemented as an atmospheric chemistry module within the NASA GEOS-5 ESM. The coupled GEOS-5/GEOS-Chem system was tested for weak and strong scalability and performance with a tropospheric oxidant-aerosol simulation. Results confirm that the GEOS-Chem chemical operator scales efficiently for any number of processes. Although inclusion of atmospheric chemistry in ESMs is computationally expensive, the excellent scalability of the chemical operator means that the relative cost goes down with increasing number of processes, making fine-scale resolution simulations possible.

  14. TAMU: Blueprint for A New Space Mission Operations System Paradigm

    NASA Technical Reports Server (NTRS)

    Ruszkowski, James T.; Meshkat, Leila; Haensly, Jean; Pennington, Al; Hogle, Charles

    2011-01-01

    The Transferable, Adaptable, Modular and Upgradeable (TAMU) Flight Production Process (FPP) is a System of Systems (SoS) framework that cuts across multiple organizations and their associated facilities, which are, in the most general case, in geographically dispersed locations, to develop the architecture and associated workflow processes of products for a broad range of flight projects. Further, TAMU FPP provides for the automatic execution and re-planning of the workflow processes as they become operational. This paper provides the blueprint for the TAMU FPP paradigm. This blueprint presents a complete, coherent technique, process and tool set that results in an infrastructure that can be used for full lifecycle design and decision making during the flight production process. Based on many years of experience with the Space Shuttle Program (SSP) and the International Space Station (ISS), a modern model-based Systems Engineering infrastructure has been under development to re-engineer the FPP, using the currently cancelled Constellation Program, which aimed at returning humans to the moon, as a starting point. This infrastructure uses a structured modeling and architecture development approach to optimize the system design, thereby reducing sustaining costs and increasing system efficiency, reliability, robustness and maintainability metrics. With the advent of the new vision for human space exploration, it is now necessary to further generalize this framework to take into consideration a broad range of missions and the participation of multiple organizations outside of the MOD; hence the Transferable, Adaptable, Modular and Upgradeable (TAMU) concept.

  15. Dissociation of modular total hip arthroplasty at the neck-stem interface without dislocation.

    PubMed

    Kouzelis, A; Georgiou, C S; Megas, P

    2012-12-01

    Modular femoral and acetabular components are now widely used, but only a few complications related to the modularity itself have been reported. We describe a case of dissociation of the modular total hip arthroplasty (THA) at the femoral neck-stem interface during walking. The possible causes of this dissociation are discussed. Successful treatment was provided with surgical revision and replacement of the modular neck components. Surgeons who use modular components in hip arthroplasties should be aware of possible early complications in which the modularity of the prostheses is the major factor of failure.

  16. Quasispecies theory for evolution of modularity.

    PubMed

    Park, Jeong-Man; Niestemski, Liang Ren; Deem, Michael W

    2015-01-01

    Biological systems are modular, and this modularity evolves over time and in different environments. A number of observations have been made of increased modularity in biological systems under increased environmental pressure. We here develop a quasispecies theory for the dynamics of modularity in populations of these systems. We show how the steady-state fitness in a randomly changing environment can be computed. We derive a fluctuation dissipation relation for the rate of change of modularity and use it to derive a relationship between rate of environmental changes and rate of growth of modularity. We also find a principle of least action for the evolved modularity at steady state. Finally, we compare our predictions to simulations of protein evolution and find them to be consistent.
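
    For context, the classical quasispecies dynamics underlying such theories (the abstract does not state its equations explicitly) is usually written as

        \dot{x}_i = \sum_j f_j \, q_{ji} \, x_j - \phi(t)\, x_i, \qquad \phi(t) = \sum_j f_j x_j,

    where x_i is the frequency of genotype i, f_j the fitness of genotype j, q_{ji} the probability that replication of j yields i, and \phi(t) the mean fitness that keeps the frequencies normalized. The results above concern the steady state of such dynamics when the fitness landscape switches randomly in time.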

  17. The Sargassum Early Advisory System (SEAS)

    NASA Astrophysics Data System (ADS)

    Armstrong, D.; Gallegos, S. C.

    2016-02-01

The Sargassum Early Advisory System (SEAS) web-app was designed to automatically detect Sargassum at sea, forecast movement of the seaweed, and alert users of potential landings. Inspired to help address the economic hardships caused by large landings of Sargassum, the web app automates and enhances the manual tasks conducted by the SEAS group of Texas A&M University at Galveston. The SEAS web app is a modular, mobile-friendly tool that automates the entire workflow from data acquisition to user management. The modules include: 1) an Imagery Retrieval Module to automatically download Landsat-8 Operational Land Imager (OLI) imagery from the United States Geological Survey (USGS); 2) a Processing Module for automatic detection of Sargassum in the OLI imagery and subsequent mapping of these patches onto the HYCOM grid, producing maps that show Sargassum clusters; 3) a Forecasting engine fed by HYbrid Coordinate Ocean Model (HYCOM) currents and winds from weather buoys; and 4) a mobile-phone-optimized geospatial user interface. The user can view the last known position of Sargassum clusters, as well as trajectory and location projections for the next 24, 72, and 168 hours. Users can also subscribe to alerts generated for particular areas. Currently, the SEAS web app produces advisories for Texas beaches. The forecasted Sargassum landing locations are validated by reports from Texas beach managers. However, the SEAS web app was designed to easily expand to other areas, and future plans call for extending the SEAS web app to Mexico and the Caribbean islands. The SEAS web app development is led by NASA, with participation by ASRC Federal/Computer Science Corporation and the Naval Research Laboratory, all at Stennis Space Center, and Texas A&M University at Galveston.
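
    A compact sketch of how such a modular workflow might be wired together; every function here is a hypothetical stub standing in for the modules named above, and none of it reproduces the actual SEAS implementation.

        # Illustrative pipeline only; all functions are hypothetical stubs.
        def retrieve_landsat8_oli(scene_id):
            """1) Imagery Retrieval Module: fetch a Landsat-8 OLI scene (stubbed)."""
            return {"scene": scene_id, "bands": []}

        def detect_sargassum(image):
            """2) Processing Module: flag candidate Sargassum patches (stubbed)."""
            return [{"lat": 27.5, "lon": -95.0}]

        def forecast(clusters, hours):
            """3) Forecasting engine: advect clusters with model currents (stubbed)."""
            return [{"lat": c["lat"], "lon": c["lon"], "lead_hours": hours} for c in clusters]

        def run_seas_cycle(scene_id):
            image = retrieve_landsat8_oli(scene_id)
            clusters = detect_sargassum(image)
            # Trajectory projections at the lead times mentioned in the abstract.
            return {h: forecast(clusters, h) for h in (24, 72, 168)}

        print(run_seas_cycle("LC08_L1TP_025040"))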

  18. Self-organized modularization in evolutionary algorithms.

    PubMed

    Dauscher, Peter; Uthmann, Thomas

    2005-01-01

    The principle of modularization has proven to be extremely successful in the field of technical applications and particularly for Software Engineering purposes. The question to be answered within the present article is whether mechanisms can also be identified within the framework of Evolutionary Computation that cause a modularization of solutions. We will concentrate on processes, where modularization results only from the typical evolutionary operators, i.e. selection and variation by recombination and mutation (and not, e.g., from special modularization operators). This is what we call Self-Organized Modularization. Based on a combination of two formalizations by Radcliffe and Altenberg, some quantitative measures of modularity are introduced. Particularly, we distinguish Built-in Modularity as an inherent property of a genotype and Effective Modularity, which depends on the rest of the population. These measures can easily be applied to a wide range of present Evolutionary Computation models. It will be shown, both theoretically and by simulation, that under certain conditions, Effective Modularity (as defined within this paper) can be a selection factor. This causes Self-Organized Modularization to take place. The experimental observations emphasize the importance of Effective Modularity in comparison with Built-in Modularity. Although the experimental results have been obtained using a minimalist toy model, they can lead to a number of consequences for existing models as well as for future approaches. Furthermore, the results suggest a complex self-amplification of highly modular equivalence classes in the case of respected relations. Since the well-known Holland schemata are just the equivalence classes of respected relations in most Simple Genetic Algorithms, this observation emphasizes the role of schemata as Building Blocks (in comparison with arbitrary subsets of the search space).

  19. [Modular enteral nutrition in pediatrics].

    PubMed

    Murillo Sanchís, S; Prenafeta Ferré, M T; Sempere Luque, M D

    1991-01-01

Modular enteral nutrition may be a substitute for parenteral nutrition in children with different pathologies. We studied 4 children with different pathologies, selected from a group of 40 admitted to the Maternal-Children's Hospital "Valle de Hebrón" in Barcelona, who received modular enteral nutrition. They were monitored on a daily basis by the Dietician Service. Modular enteral nutrition consists of modules of proteins, peptides, lipids, carbohydrates, and mineral salts-vitamins. 1.--Cranio-encephalic trauma with loss of consciousness: feeding with a combination of parenteral nutrition and modular enteral nutrition for 7 days. In view of the tolerance and good results of the modular enteral nutrition, the parenteral nutrition was suspended and modular enteral nutrition alone was used for a total of 43 days. 2.--55% burns, with 36 days of high-protein modular enteral nutrition together with normal feeding. A more rapid recovery was achieved, with an increase in total proteins and albumin. 3.--Persistent diarrhoea, with 31 days of modular enteral nutrition, 5 days on parenteral nutrition alone, and 8 days on combined parenteral and modular enteral nutrition. In view of the tolerance and good results of the modular enteral nutrition, the parenteral nutrition was suspended. 4.--Mucoviscidosis, with a total of 19 days on modular enteral nutrition, 12 of which were exclusively on modular enteral nutrition and 7 as a night supplement to normal feeding. We administered protein intakes of up to 20% of the total caloric intake and concentrations of up to 1.2 calories/ml of the final preparation, always with good tolerance. Modular enteral nutrition can and should be used as a substitute for parenteral nutrition in children with different pathologies, thus preventing the complications inherent in parenteral nutrition.

  20. Convergent evolution of modularity in metabolic networks through different community structures.

    PubMed

    Zhou, Wanding; Nakhleh, Luay

    2012-09-14

    It has been reported that the modularity of metabolic networks of bacteria is closely related to the variability of their living habitats. However, given the dependency of the modularity score on the community structure, it remains unknown whether organisms achieve certain modularity via similar or different community structures. In this work, we studied the relationship between similarities in modularity scores and similarities in community structures of the metabolic networks of 1021 species. Both similarities are then compared against the genetic distances. We revisited the association between modularity and variability of the microbial living environments and extended the analysis to other aspects of their life style such as temperature and oxygen requirements. We also tested both topological and biological intuition of the community structures identified and investigated the extent of their conservation with respect to the taxonomy. We find that similar modularities are realized by different community structures. We find that such convergent evolution of modularity is closely associated with the number of (distinct) enzymes in the organism's metabolome, a consequence of different life styles of the species. We find that the order of modularity is the same as the order of the number of the enzymes under the classification based on the temperature preference but not on the oxygen requirement. Besides, inspection of modularity-based communities reveals that these communities are graph-theoretically meaningful yet not reflective of specific biological functions. From an evolutionary perspective, we find that the community structures are conserved only at the level of kingdoms. Our results call for more investigation into the interplay between evolution and modularity: how evolution shapes modularity, and how modularity affects evolution (mainly in terms of fitness and evolvability). Further, our results call for exploring new measures of modularity and network communities that better correspond to functional categorizations.
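
    The abstract does not specify which modularity score is used; the measure most commonly meant in this literature is the Newman-Girvan modularity, which for a network with adjacency matrix A, node degrees k_i, m edges, and community assignment c_i reads

        Q = \frac{1}{2m} \sum_{ij} \left( A_{ij} - \frac{k_i k_j}{2m} \right) \delta(c_i, c_j).

    Because Q is always evaluated with respect to a particular partition into communities, two networks can attain the same Q through very different community structures, which is precisely the dependency the study investigates.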

  1. Semi-Automatic Indexing of Full Text Biomedical Articles

    PubMed Central

    Gay, Clifford W.; Kayaalp, Mehmet; Aronson, Alan R.

    2005-01-01

    The main application of U.S. National Library of Medicine’s Medical Text Indexer (MTI) is to provide indexing recommendations to the Library’s indexing staff. The current input to MTI consists of the titles and abstracts of articles to be indexed. This study reports on an extension of MTI to the full text of articles appearing in online medical journals that are indexed for Medline®. Using a collection of 17 journal issues containing 500 articles, we report on the effectiveness of the contribution of terms by the whole article and also by each section. We obtain the best results using a model consisting of the sections Results, Results and Discussion, and Conclusions together with the article’s title and abstract, the captions of tables and figures, and sections that have no titles. The resulting model provides indexing significantly better (7.4%) than what is currently achieved using only titles and abstracts. PMID:16779044

  2. Auditing SNOMED Relationships Using a Converse Abstraction Network

    PubMed Central

    Wei, Duo; Halper, Michael; Elhanan, Gai; Chen, Yan; Perl, Yehoshua; Geller, James; Spackman, Kent A.

    2009-01-01

    In SNOMED CT, a given kind of attribute relationship is defined between two hierarchies, a source and a target. Certain hierarchies (or subhierarchies) serve only as targets, with no outgoing relationships of their own. However, converse relationships—those pointing in a direction opposite to the defined relationships—while not explicitly represented in SNOMED’s inferred view, can be utilized in forming an alternative view of a source. In particular, they can help shed light on a source hierarchy’s overall relationship structure. Toward this end, an abstraction network, called the converse abstraction network (CAN), derived automatically from a given SNOMED hierarchy is presented. An auditing methodology based on the CAN is formulated. The methodology is applied to SNOMED’s Device subhierarchy and the related device relationships of the Procedure hierarchy. The results indicate that the CAN is useful in finding opportunities for refining and improving SNOMED. PMID:20351941

  3. Implicit Contractive Mappings in Modular Metric and Fuzzy Metric Spaces

    PubMed Central

    Hussain, N.; Salimi, P.

    2014-01-01

The notion of modular metric spaces, a natural generalization of classical modulars over linear spaces such as Lebesgue, Orlicz, Musielak-Orlicz, Lorentz, Orlicz-Lorentz, and Calderon-Lozanovskii spaces, was recently introduced. In this paper we investigate the existence of fixed points of generalized α-admissible modular contractive mappings in modular metric spaces. As applications, we derive some new fixed point theorems in partially ordered modular metric spaces, Suzuki-type fixed point theorems in modular metric spaces, and new fixed point theorems for integral contractions. In the last section, we develop an important relation between fuzzy metrics and modular metrics and deduce certain new fixed point results in triangular fuzzy metric spaces. Moreover, some examples are provided to illustrate the usability of the obtained results. PMID:25003157
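
    For reference, a modular metric in the sense of Chistyakov (presumably the notion used here) is a function w : (0, \infty) \times X \times X \to [0, \infty], written w_\lambda(x, y), such that for all x, y, z \in X:

        (i)   w_\lambda(x, y) = 0 for all \lambda > 0 if and only if x = y;
        (ii)  w_\lambda(x, y) = w_\lambda(y, x) for all \lambda > 0;
        (iii) w_{\lambda + \mu}(x, z) \le w_\lambda(x, y) + w_\mu(y, z) for all \lambda, \mu > 0.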

  4. Modular Power Standard for Space Explorations Missions

    NASA Technical Reports Server (NTRS)

    Oeftering, Richard C.; Gardner, Brent G.

    2016-01-01

    Future human space exploration will most likely be composed of assemblies of multiple modular spacecraft elements with interconnected electrical power systems. An electrical system composed of a standardized set modular building blocks provides significant development, integration, and operational cost advantages. The modular approach can also provide the flexibility to configure power systems to meet the mission needs. A primary goal of the Advanced Exploration Systems (AES) Modular Power System (AMPS) project is to establish a Modular Power Standard that is needed to realize these benefits. This paper is intended to give the space exploration community a "first look" at the evolving Modular Power Standard and invite their comments and technical contributions.

  5. PYTHON for Variable Star Astronomy (Abstract)

    NASA Astrophysics Data System (ADS)

    Craig, M.

    2018-06-01

    (Abstract only) Open source PYTHON packages that are useful for data reduction, photometry, and other tasks relevant to variable star astronomy have been developed over the last three to four years as part of the Astropy project. Using this software, it is relatively straightforward to reduce images, automatically detect sources, and match them to catalogs. Over the last year browser-based tools for performing some of those tasks have been developed that minimize or eliminate the need to write any of your own code. After providing an overview of the current state of the software, an application that calculates transformation coefficients on a frame-by-frame basis by matching stars in an image to the APASS catalog will be described.
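
    A minimal example of the kind of task described, using the Astropy-affiliated photutils package to detect sources in a reduced image and Astropy to match them against a reference catalogue. The FITS file name and the two-entry stand-in catalogue are placeholders, not part of the original abstract.

        import astropy.units as u
        from astropy.io import fits
        from astropy.stats import sigma_clipped_stats
        from astropy.wcs import WCS
        from astropy.coordinates import SkyCoord
        from photutils.detection import DAOStarFinder

        # Load a reduced (bias/dark/flat-corrected) image; file name is a placeholder.
        data, header = fits.getdata("reduced_image.fits", header=True)
        _, median, std = sigma_clipped_stats(data, sigma=3.0)

        # Automatically detect point sources at least 5 sigma above the background.
        finder = DAOStarFinder(fwhm=3.0, threshold=5.0 * std)
        sources = finder(data - median)

        # Convert pixel positions to sky coordinates with the image WCS, then match
        # them to a reference catalogue (here a tiny stand-in for, e.g., APASS).
        wcs = WCS(header)
        detected = wcs.pixel_to_world(sources["xcentroid"], sources["ycentroid"])
        catalog = SkyCoord(ra=[150.00, 150.10] * u.deg, dec=[2.00, 2.05] * u.deg)
        idx, sep2d, _ = detected.match_to_catalog_sky(catalog)
        print(f"{len(sources)} sources detected; mean match separation "
              f"{sep2d.arcsec.mean():.2f} arcsec")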

  6. Proving refinement transformations using extended denotational semantics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winter, V.L.; Boyle, J.M.

    1996-04-01

TAMPR is a fully automatic transformation system based on syntactic rewrites. Our approach in a correctness proof is to map the transformation into an axiomatized mathematical domain where formal (and automated) reasoning can be performed. This mapping is accomplished via an extended denotational semantic paradigm. In this approach, the abstract notion of a program state is distributed between an environment function and a store function. Such a distribution introduces properties that go beyond the abstract state that is being modeled. The reasoning framework needs to be aware of these properties in order to successfully complete a correctness proof. This paper discusses some of our experiences in proving the correctness of TAMPR transformations.
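
    As a standard illustration of the environment/store split described above (not taken from the paper itself), an extended denotational semantics typically assigns to an assignment statement the meaning

        \llbracket x := e \rrbracket(\rho, \sigma) = \sigma[\rho(x) \mapsto \llbracket e \rrbracket(\rho, \sigma)],

    where the environment \rho maps identifiers to locations and the store \sigma maps locations to values. The extra properties introduced by this split (for example, that distinct identifiers are bound to distinct locations) are exactly the facts the reasoning framework must be made aware of to complete a correctness proof.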

  7. SERC 2014-2018 Technical Plan

    DTIC Science & Technology

    2013-10-25

... assurance-case analysis that are not only more powerful in anomaly detection, but also leading to stronger possibilities for positive assurance and to ... base for responding to cyber-attacks; and (c) combine techniques developed for automatic control systems in a manner that will both enable defense ...

  8. Computer Recognition of Facial Profiles

    DTIC Science & Technology

    1974-08-01

... A system for the recognition of human faces from ... provide a fair test of the classification system. The work of Goldstein, Harmon, and Lesk [8] indicates, however, that for facial recognition, a ten class ...

  9. Variable Discretisation for Anomaly Detection using Bayesian Networks

    DTIC Science & Technology

    2017-01-01

... Bayesian network implementations usually require each variable to take on a finite number of mutually ... Anomaly detection is the process by which low probability events are automatically found against a ...

  10. Volume 2: Compendium of Abstracts

    DTIC Science & Technology

    2017-06-01

... simulation work using a standard running model for legged systems, the Spring Loaded Inverted Pendulum (SLIP) Model. In this model, the dynamics of a single ... bar SLIP model is analyzed using a basin-of-attraction analysis to determine the optimal configuration for running at different velocities and ... acquisition, and the automatic target acquisition were then compared to each other. After running trials with the current system, it will be ...

  11. Cryptanalysis of the Sodark Family of Cipher Algorithms

    DTIC Science & Technology

    2017-09-01

... software project for building three-bit LUT circuit representations of S-boxes is available as a GitHub repository [40]. It contains several improvements ... second- and third-generation automatic link establishment (ALE) systems for high frequency radios. Radios utilizing ALE technology are in use by a ...

  12. No Emergency Incident Recognizes Borders

    DTIC Science & Technology

    2011-03-01

The state of Arizona and the bordering towns of northern Mexico acknowledge the need for capability planning. They ... northern Mexico are taking a preventive approach and have created the Bi-National Arizona Emergency Response Task Force (BAERTF). The goal of the ... BAERTF is to deliver a timely, supportive response and automatic, mutual-aid capability to any jurisdiction in the state of Arizona or northern Mexico ...

  13. GPS-UTC Time Synchronization

    DTIC Science & Technology

    1989-11-01

Two automatic algorithms for synchronizing the GPS time standard to the UTC time standard are evaluated. Both algorithms control GPS-UTC ... is required to synchronize its broadcast time standard to within one microsecond of the time standard maintained by the US Naval Observatory ...

  14. Development of an Advanced, Automatic, Ultrasonic NDE Imaging System via Adaptive Learning Network Signal Processing Techniques

    DTIC Science & Technology

    1981-03-13

... in concert with a sophisticated detector has ... Whalen, M.F., L.J. O'Brien, and A.N. Mucciardi, "Application of Adaptive Learning Networks for the Characterization of Two ...

  15. Large Scale System Defense

    DTIC Science & Technology

    2008-10-01

... (AD); Aeolos, a distributed intrusion detection and event correlation infrastructure; STAND, a training-set sanitization technique applicable to ADs ... Summary of findings: (a) Automatic Patch Generation, (b) Better Patch Management, (c) Artificial Diversity, (d) Distributed Anomaly Detection ...

  16. Distributed Data Integration Infrastructure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Critchlow, T; Ludaescher, B; Vouk, M

The Internet is becoming the preferred method for disseminating scientific data from a variety of disciplines. This can result in information overload on the part of the scientists, who are unable to query all of the relevant sources, even if they knew where to find them, what they contained, how to interact with them, and how to interpret the results. A related issue is that keeping up with current trends in information technology often taxes the end-user's expertise and time. Thus, instead of benefiting from this information-rich environment, scientists become experts on a small number of sources and technologies, use them almost exclusively, and develop a resistance to innovations that can enhance their productivity. Enabling information-based scientific advances, in domains such as functional genomics, requires fully utilizing all available information and the latest technologies. In order to address this problem we are developing an end-user-centric, domain-sensitive, workflow-based infrastructure, shown in Figure 1, that will allow scientists to design complex scientific workflows that reflect the data manipulation required to perform their research without an undue burden. We are taking a three-tiered approach to designing this infrastructure, utilizing (1) abstract workflow definition, construction, and automatic deployment, (2) complex agent-based workflow execution, and (3) automatic wrapper generation. In order to construct a workflow, the scientist defines an abstract workflow (AWF) in terminology (semantics and context) that is familiar to him/her. This AWF includes all of the data transformations, selections, and analyses required by the scientist, but does not necessarily specify particular data sources. This abstract workflow is then compiled into an executable workflow (EWF, in our case XPDL) that is then evaluated and executed by the workflow engine. This EWF contains references to specific data sources and interfaces capable of performing the desired actions. In order to provide access to the largest number of resources possible, our lowest level utilizes automatic wrapper generation techniques to create information and data wrappers capable of interacting with the complex interfaces typical in scientific analysis. The remainder of this document outlines our work in these three areas, the impact our work has made, and our plans for the future.
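
    A toy sketch of the abstract-to-executable compilation step described above; all service names and URLs are hypothetical stand-ins, and the project's actual executable representation (XPDL) is not reproduced here.

        # Illustrative only: bind abstract workflow (AWF) steps to concrete services
        # to produce an executable workflow (EWF). Names and URLs are placeholders.
        SERVICE_REGISTRY = {
            "select_sequences": "http://example.org/sequence-wrapper",
            "run_alignment":    "http://example.org/alignment-wrapper",
        }

        def compile_awf(awf_steps):
            """Resolve each abstract step to a registered data source or wrapper."""
            ewf = []
            for step in awf_steps:
                endpoint = SERVICE_REGISTRY.get(step)
                if endpoint is None:
                    raise ValueError(f"no service registered for abstract step '{step}'")
                ewf.append({"step": step, "endpoint": endpoint})
            return ewf

        print(compile_awf(["select_sequences", "run_alignment"]))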

  17. Circularly Polarized Luminescence in Enantiopure Europium and Terbium Complexes with Modular, All-Oxygen Donor Ligands

    PubMed Central

    Seitz, Michael; Do, King; Ingram, Andrew J.; Moore, Evan G.; Muller, Gilles; Raymond, Kenneth N.

    2009-01-01

Abstract: Circularly polarized luminescence from terbium(III) complexed and excited by chiral antenna ligands gives strong emission. The modular synthesis of three new octadentate, enantiopure ligands is reported - one with the bidentate chelating unit 2-hydroxyisophthalamide (IAM) and two with 1-hydroxy-2-pyridinone (1,2-HOPO) units. A new design principle is introduced for the chiral, non-racemic hexamines which constitute the central backbones for the presented class of ligands. The terbium(III) complex of the IAM ligand, as well as the europium(III) complexes of the 1,2-HOPO ligands, are synthesized and characterized by various techniques (NMR, UV, CD, luminescence spectroscopy). All species exhibit excellent stability and moderate to high luminescence efficiency (quantum yields ΦEu = 0.05–0.08 and ΦTb = 0.30–0.57) in aqueous solution at physiological pH. Special focus is put on the properties of the complexes with regard to circularly polarized luminescence (CPL). The maximum luminescence dissymmetry factors (glum) in aqueous solution are high, with |glum|max = 0.08–0.40. Together with the very favorable general properties (good stability, high quantum yields, long lifetimes), the presented lanthanide complexes can be considered good candidates for analytical probes based on CPL in biologically relevant environments. PMID:19639983

  18. A versatile modular bioreactor platform for Tissue Engineering

    PubMed Central

    Schuerlein, Sebastian; Schwarz, Thomas; Krziminski, Steffan; Gätzner, Sabine; Hoppensack, Anke; Schwedhelm, Ivo; Schweinlin, Matthias; Walles, Heike

    2016-01-01

    Abstract Tissue Engineering (TE) bears potential to overcome the persistent shortage of donor organs in transplantation medicine. Additionally, TE products are applied as human test systems in pharmaceutical research to close the gap between animal testing and the administration of drugs to human subjects in clinical trials. However, generating a tissue requires complex culture conditions provided by bioreactors. Currently, the translation of TE technologies into clinical and industrial applications is limited due to a wide range of different tissue‐specific, non‐disposable bioreactor systems. To ensure a high level of standardization, a suitable cost‐effectiveness, and a safe graft production, a generic modular bioreactor platform was developed. Functional modules provide robust control of culture processes, e.g. medium transport, gas exchange, heating, or trapping of floating air bubbles. Characterization revealed improved performance of the modules in comparison to traditional cell culture equipment such as incubators, or peristaltic pumps. By combining the modules, a broad range of culture conditions can be achieved. The novel bioreactor platform allows using disposable components and facilitates tissue culture in closed fluidic systems. By sustaining native carotid arteries, engineering a blood vessel, and generating intestinal tissue models according to a previously published protocol the feasibility and performance of the bioreactor platform was demonstrated. PMID:27492568

  19. Embracing complexity: theory, cases and the future of bioethics.

    PubMed

    Wilson, James

    2014-01-01

    This paper reflects on the relationship between theory and practice in bioethics, by using various concepts drawn from debates on innovation in healthcare research--in particular debates around how best to connect up blue skies 'basic' research with practical innovations that can improve human lives. It argues that it is a mistake to assume that the most difficult and important questions in bioethics are the most abstract ones, and also a mistake to assume that getting clear about abstract cases will automatically be of much help in getting clear about more complex cases. It replaces this implicitly linear model with a more complex one that draws on the idea of translational research in healthcare. On the translational model, there is a continuum of cases from the most simple and abstract (thought experiments) to the most concrete and complex (real world cases). Insights need to travel in both directions along this continuum--from the more abstract to the more concrete and from the more concrete to the more abstract. The paper maps out some difficulties in moving from simpler to more complex cases, and in doing so makes recommendations about the future of bioethics.

  20. Configurable double-sided modular jet impingement assemblies for electronics cooling

    DOEpatents

    Zhou, Feng; Dede, Ercan Mehmet

    2018-05-22

    A modular jet impingement assembly includes an inlet tube fluidly coupled to a fluid inlet, an outlet tube fluidly coupled to a fluid outlet, and a modular manifold having a first distribution recess extending into a first side of the modular manifold, a second distribution recess extending into a second side of the modular manifold, a plurality of inlet connection tubes positioned at an inlet end of the modular manifold, and a plurality of outlet connection tubes positioned at an outlet end of the modular manifold. A first manifold insert is removably positioned within the first distribution recess, a second manifold insert is removably positioned within the second distribution recess, and a first and second heat transfer plate each removably coupled to the modular manifold. The first and second heat transfer plates each comprise an impingement surface.

  1. MATE standardization

    NASA Astrophysics Data System (ADS)

    Farmer, R. E.

    1982-11-01

    The MATE (Modular Automatic Test Equipment) program was developed to combat the proliferation of unique, expensive ATE within the Air Force. MATE incorporates a standard management approach and a standard architecture designed to implement a cradle-to-grave approach to the acquisition of ATE and to significantly reduce the life cycle cost of weapons systems support. These standards are detailed in the MATE Guides. The MATE Guides assist both the Air Force and Industry in implementing the MATE concept, and provide the necessary tools and guidance required for successful acquisition of ATE. The guides also provide the necessary specifications for industry to build MATE-qualifiable equipment. The MATE architecture provides standards for all key interfaces of an ATE system. The MATE approach to the acquisition and management of ATE has been jointly endorsed by the commanders of Air Force Systems Command and Air Force Logistics Command as the way of doing business in the future.

  2. Advanced E-O test capability for Army Next-Generation Automated Test System (NGATS)

    NASA Astrophysics Data System (ADS)

    Errea, S.; Grigor, J.; King, D. F.; Matis, G.; McHugh, S.; McKechnie, J.; Nehring, B.

    2015-05-01

The Future E-O (FEO) program was established to develop a flexible, modular, automated test capability as part of the Next Generation Automatic Test System (NGATS) program to support the test and diagnostic needs of currently fielded U.S. Army electro-optical (E-O) devices, as well as being expandable to address the requirements of future Navy, Marine Corps, and Air Force E-O systems. Santa Barbara Infrared (SBIR) has designed, fabricated, and delivered three (3) prototype FEO systems for engineering and logistics evaluation prior to anticipated full-scale production beginning in 2016. In addition to presenting a detailed overview of the FEO system hardware design, features, and testing capabilities, the integration of SBIR's EO-IR sensor and laser test software package, IRWindows 4™, into FEO to automate the test execution, data collection and analysis, archiving, and reporting of results is also described.

  3. A Note on Interfacing Object Warehouses and Mass Storage Systems for Data Mining Applications

    NASA Technical Reports Server (NTRS)

    Grossman, Robert L.; Northcutt, Dave

    1996-01-01

Data mining is the automatic discovery of patterns, associations, and anomalies in data sets. Data mining requires numerically and statistically intensive queries. Our assumption is that data mining requires a specialized data management infrastructure to support the aforementioned intensive queries, but because of the sizes of data involved, this infrastructure is layered over a hierarchical storage system. In this paper, we discuss the architecture of a system which is layered for modularity, but exploits specialized lightweight services to maintain efficiency. Rather than use a full-functioned database, for example, we use lightweight object services specialized for data mining. We propose using information repositories between layers so that components on either side of the layer can access information in the repositories to assist in making decisions about data layout, the caching and migration of data, the scheduling of queries, and related matters.

  4. Design and the parametric testing of the space station prototype integrated vapor compression distillation water recovery module

    NASA Technical Reports Server (NTRS)

    Reveley, W. F.; Nuccio, P. P.

    1975-01-01

    Potable water for the Space Station Prototype life support system is generated by the vapor compression technique of vacuum distillation. A description of a complete three-man modular vapor compression water renovation loop that was built and tested is presented; included are all of the pumps, tankage, chemical post-treatment, instrumentation, and controls necessary to make the loop representative of an automatic, self-monitoring, null gravity system. The design rationale is given and the evolved configuration is described. Presented next are the results of an extensive parametric test during which distilled water was generated from urine and urinal flush water with concentration of solids in the evaporating liquid increasing progressively to 60 percent. Water quality, quantity and production rate are shown together with measured energy consumption rate in terms of watt-hours per kilogram of distilled water produced.

  5. [Not Available].

    PubMed

    Pecevski, Dejan; Natschläger, Thomas; Schuch, Klaus

    2009-01-01

    The Parallel Circuit SIMulator (PCSIM) is a software package for simulation of neural circuits. It is primarily designed for distributed simulation of large scale networks of spiking point neurons. Although its computational core is written in C++, PCSIM's primary interface is implemented in the Python programming language, which is a powerful programming environment and allows the user to easily integrate the neural circuit simulator with data analysis and visualization tools to manage the full neural modeling life cycle. The main focus of this paper is to describe PCSIM's full integration into Python and the benefits thereof. In particular we will investigate how the automatically generated bidirectional interface and PCSIM's object-oriented modular framework enable the user to adopt a hybrid modeling approach: using and extending PCSIM's functionality either employing pure Python or C++ and thus combining the advantages of both worlds. Furthermore, we describe several supplementary PCSIM packages written in pure Python and tailored towards setting up and analyzing neural simulations.
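
    A schematic of the hybrid approach, using hypothetical class names rather than the actual PCSIM API: a base class whose role is played by C++ code exposed through the generated interface is extended in pure Python, while the rest of the network stays in the compiled core.

        # Hypothetical sketch of the hybrid Python/C++ pattern; not the PCSIM API.
        class SimObject:            # in PCSIM this role is played by a C++ class
            def advance(self, t):   # exposed to Python via the generated interface
                raise NotImplementedError

        class PoissonInput(SimObject):
            """User-defined network element written in pure Python."""
            def __init__(self, rate_hz):
                self.rate_hz = rate_hz
            def advance(self, t):
                # custom update rule; in a real run the compiled core calls back here
                return self.rate_hz * 1e-3

        net = [PoissonInput(rate_hz=20.0)]
        print([obj.advance(t=0.0) for obj in net])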

  6. Adding the Human Element to Ship Manoeuvring Simulations

    NASA Astrophysics Data System (ADS)

    Aarsæther, Karl Gunnar; Moan, Torgeir

Time-domain simulation of ship manoeuvring has been utilized in risk analysis to assess the effect of changes to the ship-lane, development in traffic volume, and the associated risk. The process of ship manoeuvring in a wider socio-technical context consists of the technical systems, operational procedures, the human operators, and support functions. Automated manoeuvring simulations without human operators in the simulation loop have often been preferred in simulation studies because of their short run times. Automatic control has typically represented the human element, with little effort devoted to explaining the relationship between the guidance and control algorithms and the human operators they replace. This paper describes the development and application of a model of the human element for autonomous time-domain manoeuvring simulations. The method operates in the time domain, is modular, and is found to be capable of reproducing observed manoeuvre patterns, but is limited to representing the intended behaviour.

  7. Large-field-of-view, modular, stabilized, adaptive-optics-based scanning laser ophthalmoscope.

    PubMed

    Burns, Stephen A; Tumbar, Remy; Elsner, Ann E; Ferguson, Daniel; Hammer, Daniel X

    2007-05-01

We describe the design and performance of an adaptive optics retinal imager that is optimized for use during dynamic correction for eye movements. The system incorporates a retinal tracker and stabilizer, a wide-field line scan scanning laser ophthalmoscope (SLO), and a high-resolution microelectromechanical-systems-based adaptive optics SLO. The detection system incorporates selection and positioning of confocal apertures, allowing measurement of images arising from different portions of the double pass retinal point-spread function (psf). System performance was excellent. The adaptive optics increased the brightness and contrast for small confocal apertures by more than 2x and decreased the brightness of images obtained with displaced apertures, confirming the ability of the adaptive optics system to improve the psf. The retinal image was stabilized to within 18 µm 90% of the time. Stabilization was sufficient for cross-correlation techniques to automatically align the images.

  8. An Adaptive Unstructured Grid Method by Grid Subdivision, Local Remeshing, and Grid Movement

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar Z.

    1999-01-01

    An unstructured grid adaptation technique has been developed and successfully applied to several three dimensional inviscid flow test cases. The approach is based on a combination of grid subdivision, local remeshing, and grid movement. For solution adaptive grids, the surface triangulation is locally refined by grid subdivision, and the tetrahedral grid in the field is partially remeshed at locations of dominant flow features. A grid redistribution strategy is employed for geometric adaptation of volume grids to moving or deforming surfaces. The method is automatic and fast and is designed for modular coupling with different solvers. Several steady state test cases with different inviscid flow features were tested for grid/solution adaptation. In all cases, the dominant flow features, such as shocks and vortices, were accurately and efficiently predicted with the present approach. A new and robust method of moving tetrahedral "viscous" grids is also presented and demonstrated on a three-dimensional example.

  9. Integration of passive driver-assistance systems with on-board vehicle systems

    NASA Astrophysics Data System (ADS)

    Savchenko, V. V.; Poddubko, S. N.

    2018-02-01

Implementation in OIAS of such functions as driver-state monitoring and high-precision calculation of the vehicle's current navigation coordinates, together with the modularity of the OIAS design and the possibility of extending functionality through integration with other onboard systems, gives the approach a promising development future. The development of intelligent transport systems and their components allows fundamentally new safety tasks for human-machine transport systems to be set and solved, and the automatic analysis of heterogeneous information flows provides a synergistic effect. Analysis of cross-modal information exchange in human-machine transport systems, from a uniform methodological point of view, will allow us, with an accuracy acceptable for solving applied problems, to form in real time an integrated assessment of the state of the basic components of the human-machine system and of the dynamics of the changing situation-centered environment, including the external environment, in their interrelations.

  10. Introducing PLIA: Planetary Laboratory for Image Analysis

    NASA Astrophysics Data System (ADS)

    Peralta, J.; Hueso, R.; Barrado, N.; Sánchez-Lavega, A.

    2005-08-01

    We present a graphical software tool developed under IDL software to navigate, process and analyze planetary images. The software has a complete Graphical User Interface and is cross-platform. It can also run under the IDL Virtual Machine without the need to own an IDL license. The set of tools included allow image navigation (orientation, centring and automatic limb determination), dynamical and photometric atmospheric measurements (winds and cloud albedos), cylindrical and polar projections, as well as image treatment under several procedures. Being written in IDL, it is modular and easy to modify and grow for adding new capabilities. We show several examples of the software capabilities with Galileo-Venus observations: Image navigation, photometrical corrections, wind profiles obtained by cloud tracking, cylindrical projections and cloud photometric measurements. Acknowledgements: This work has been funded by Spanish MCYT PNAYA2003-03216, fondos FEDER and Grupos UPV 15946/2004. R. Hueso acknowledges a post-doc fellowship from Gobierno Vasco.

  11. Event heap: a coordination infrastructure for dynamic heterogeneous application interactions in ubiquitous computing environments

    DOEpatents

    Johanson, Bradley E.; Fox, Armando; Winograd, Terry A.; Hanrahan, Patrick M.

    2010-04-20

    An efficient and adaptive middleware infrastructure called the Event Heap system dynamically coordinates application interactions and communications in a ubiquitous computing environment, e.g., an interactive workspace, having heterogeneous software applications running on various machines and devices across different platforms. Applications exchange events via the Event Heap. Each event is characterized by a set of unordered, named fields. Events are routed by matching certain attributes in the fields. The source and target versions of each field are automatically set when an event is posted or used as a template. The Event Heap system implements a unique combination of features, both intrinsic to tuplespaces and specific to the Event Heap, including content based addressing, support for routing patterns, standard routing fields, limited data persistence, query persistence/registration, transparent communication, self-description, flexible typing, logical/physical centralization, portable client API, at most once per source first-in-first-out ordering, and modular restartability.
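
    The core mechanism, posting events as sets of named fields and retrieving them by template matching, can be sketched as follows. This is a deliberately simplified, hypothetical illustration; the patented system adds routing fields, persistence, ordering guarantees, and the other features listed above.

        # Minimal tuplespace-style event matching, illustrating the idea only.
        class EventHeap:
            def __init__(self):
                self.events = []                  # events stored as dicts of named fields

            def post(self, **fields):
                self.events.append(dict(fields))

            def take(self, **template):
                """Return and remove the first event whose fields match the template."""
                for i, event in enumerate(self.events):
                    if all(event.get(k) == v for k, v in template.items()):
                        return self.events.pop(i)
                return None

        heap = EventHeap()
        heap.post(type="ButtonPressed", source="tablet-3", target="projector-1")
        print(heap.take(type="ButtonPressed", target="projector-1"))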

  12. Semantic Web Service Delivery in Healthcare Based on Functional and Non-Functional Properties.

    PubMed

    Schweitzer, Marco; Gorfer, Thilo; Hörbst, Alexander

    2017-01-01

In the past decades, much effort has been devoted to the trans-institutional exchange of healthcare data through electronic health records (EHR), in order to obtain a lifelong, shared, accessible health record of a patient. Besides basic information exchange, there is a growing need for Information and Communication Technology (ICT) to support the use of the collected health data in an individual, case-specific, workflow-based manner. This paper presents results on how workflows can be used to process data from electronic health records, following a semantic web service approach that enables automatic discovery, composition, and invocation of suitable web services. Based on this solution, the user (physician) can define his or her needs from a domain-specific perspective, whereas the ICT system fulfills those needs with modular web services. By also involving non-functional properties in the service selection, this approach is even more suitable for the dynamic medical domain.

  13. Using Generative Representations to Evolve Robots. Chapter 1

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2004-01-01

    Recent research has demonstrated the ability of evolutionary algorithms to automatically design both the physical structure and software controller of real physical robots. One of the challenges for these automated design systems is to improve their ability to scale to the high complexities found in real-world problems. Here we claim that for automated design systems to scale in complexity they must use a representation which allows for the hierarchical creation and reuse of modules, which we call a generative representation. Not only is the ability to reuse modules necessary for functional scalability, but it is also valuable for improving efficiency in testing and construction. We then describe an evolutionary design system with a generative representation capable of hierarchical modularity and demonstrate it for the design of locomoting robots in simulation. Finally, results from our experiments show that evolution with our generative representation produces better robots than those evolved with a non-generative representation.
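
    The distinguishing feature of a generative representation is that the genotype is a small program whose modules can be defined once and reused hierarchically, so that a compact genome expands into a large, regular phenotype. A toy rewrite-rule expansion illustrating that reuse; the module names and rules are hypothetical and are not taken from the chapter.

        # Toy generative representation: rewrite rules define reusable modules that
        # expand hierarchically into a long assembly (build) string.
        RULES = {
            "BODY": ["SEGMENT", "SEGMENT", "SEGMENT"],   # module reused three times
            "SEGMENT": ["joint", "LEG", "LEG"],          # module reusing another module
            "LEG": ["limb", "limb", "foot"],
        }

        def expand(symbol, depth=3):
            """Recursively expand non-terminal modules into primitive build commands."""
            if depth == 0 or symbol not in RULES:
                return [symbol]
            out = []
            for child in RULES[symbol]:
                out.extend(expand(child, depth - 1))
            return out

        print(expand("BODY"))   # a small genome yields a long, regular phenotype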

  14. Large Field of View, Modular, Stabilized, Adaptive-Optics-Based Scanning Laser Ophthalmoscope

    PubMed Central

    Burns, Stephen A.; Tumbar, Remy; Elsner, Ann E.; Ferguson, Daniel; Hammer, Daniel X.

    2007-01-01

We describe the design and performance of an adaptive optics retinal imager that is optimized for use during dynamic correction for eye movements. The system incorporates a retinal tracker and stabilizer, a wide-field line scan Scanning Laser Ophthalmoscope (SLO), and a high-resolution MEMS-based adaptive optics SLO. The detection system incorporates selection and positioning of confocal apertures, allowing measurement of images arising from different portions of the double pass retinal point-spread function (psf). System performance was excellent. The adaptive optics increased the brightness and contrast for small confocal apertures by more than 2x, and decreased the brightness of images obtained with displaced apertures, confirming the ability of the adaptive optics system to improve the point-spread function. The retinal image was stabilized to within 18 microns 90% of the time. Stabilization was sufficient for cross-correlation techniques to automatically align the images. PMID:17429477

  15. A wide bandwidth CCD buffer memory system

    NASA Technical Reports Server (NTRS)

    Siemens, K.; Wallace, R. W.; Robinson, C. R.

    1978-01-01

    A prototype system was implemented to demonstrate that CCD's can be applied advantageously to the problem of low power digital storage and particularly to the problem of interfacing widely varying data rates. CCD shift register memories (8K bit) were used to construct a feasibility model 128 K-bit buffer memory system. Serial data that can have rates between 150 kHz and 4.0 MHz can be stored in 4K-bit, randomly-accessible memory blocks. Peak power dissipation during a data transfer is less than 7 W, while idle power is approximately 5.4 W. The system features automatic data input synchronization with the recirculating CCD memory block start address. System expansion to accommodate parallel inputs or a greater number of memory blocks can be performed in a modular fashion. Since the control logic does not increase proportionally to increase in memory capacity, the power requirements per bit of storage can be reduced significantly in a larger system.

  16. Linking Somatic and Symbolic Representation in Semantic Memory: The Dynamic Multilevel Reactivation Framework

    PubMed Central

    Reilly, Jamie; Peelle, Jonathan E; Garcia, Amanda; Crutch, Sebastian J

    2016-01-01

    Biological plausibility is an essential constraint for any viable model of semantic memory. Yet, we have only the most rudimentary understanding of how the human brain conducts abstract symbolic transformations that underlie word and object meaning. Neuroscience has evolved a sophisticated arsenal of techniques for elucidating the architecture of conceptual representation. Nevertheless, theoretical convergence remains elusive. Here we describe several contrastive approaches to the organization of semantic knowledge, and in turn we offer our own perspective on two recurring questions in semantic memory research: 1) to what extent are conceptual representations mediated by sensorimotor knowledge (i.e., to what degree is semantic memory embodied)? 2) How might an embodied semantic system represent abstract concepts such as modularity, symbol, or proposition? To address these questions, we review the merits of sensorimotor (i.e., embodied) and amodal (i.e., disembodied) semantic theories and address the neurobiological constraints underlying each. We conclude that the shortcomings of both perspectives in their extreme forms necessitate a hybrid middle ground. We accordingly propose the Dynamic Multilevel Reactivation Framework, an integrative model premised upon flexible interplay between sensorimotor and amodal symbolic representations mediated by multiple cortical hubs. We discuss applications of the Dynamic Multilevel Reactivation Framework to abstract and concrete concept representation and describe how a multidimensional conceptual topography based on emotion, sensation, and magnitude can successfully frame a semantic space containing meanings for both abstract and concrete words. The consideration of ‘abstract conceptual features’ does not diminish the role of logical and/or executive processing in activating, manipulating and using information stored in conceptual representations. Rather, it proposes that the material on which these processes operate necessarily combine pure sensorimotor information and higher-order cognitive dimensions involved in symbolic representation. PMID:27294419

  17. Linking somatic and symbolic representation in semantic memory: the dynamic multilevel reactivation framework.

    PubMed

    Reilly, Jamie; Peelle, Jonathan E; Garcia, Amanda; Crutch, Sebastian J

    2016-08-01

    Biological plausibility is an essential constraint for any viable model of semantic memory. Yet, we have only the most rudimentary understanding of how the human brain conducts abstract symbolic transformations that underlie word and object meaning. Neuroscience has evolved a sophisticated arsenal of techniques for elucidating the architecture of conceptual representation. Nevertheless, theoretical convergence remains elusive. Here we describe several contrastive approaches to the organization of semantic knowledge, and in turn we offer our own perspective on two recurring questions in semantic memory research: (1) to what extent are conceptual representations mediated by sensorimotor knowledge (i.e., to what degree is semantic memory embodied)? (2) How might an embodied semantic system represent abstract concepts such as modularity, symbol, or proposition? To address these questions, we review the merits of sensorimotor (i.e., embodied) and amodal (i.e., disembodied) semantic theories and address the neurobiological constraints underlying each. We conclude that the shortcomings of both perspectives in their extreme forms necessitate a hybrid middle ground. We accordingly propose the Dynamic Multilevel Reactivation Framework-an integrative model predicated upon flexible interplay between sensorimotor and amodal symbolic representations mediated by multiple cortical hubs. We discuss applications of the dynamic multilevel reactivation framework to abstract and concrete concept representation and describe how a multidimensional conceptual topography based on emotion, sensation, and magnitude can successfully frame a semantic space containing meanings for both abstract and concrete words. The consideration of 'abstract conceptual features' does not diminish the role of logical and/or executive processing in activating, manipulating and using information stored in conceptual representations. Rather, it proposes that the materials upon which these processes operate necessarily combine pure sensorimotor information and higher-order cognitive dimensions involved in symbolic representation.

  18. Convergent evolution of modularity in metabolic networks through different community structures

    PubMed Central

    2012-01-01

Background It has been reported that the modularity of metabolic networks of bacteria is closely related to the variability of their living habitats. However, given the dependency of the modularity score on the community structure, it remains unknown whether organisms achieve certain modularity via similar or different community structures. Results In this work, we studied the relationship between similarities in modularity scores and similarities in community structures of the metabolic networks of 1021 species. Both similarities are then compared against the genetic distances. We revisited the association between modularity and variability of the microbial living environments and extended the analysis to other aspects of their life style such as temperature and oxygen requirements. We also tested both topological and biological intuition of the community structures identified and investigated the extent of their conservation with respect to the taxonomy. Conclusions We find that similar modularities are realized by different community structures. We find that such convergent evolution of modularity is closely associated with the number of (distinct) enzymes in the organism’s metabolome, a consequence of different life styles of the species. We find that the order of modularity is the same as the order of the number of the enzymes under the classification based on the temperature preference but not on the oxygen requirement. Besides, inspection of modularity-based communities reveals that these communities are graph-theoretically meaningful yet not reflective of specific biological functions. From an evolutionary perspective, we find that the community structures are conserved only at the level of kingdoms. Our results call for more investigation into the interplay between evolution and modularity: how evolution shapes modularity, and how modularity affects evolution (mainly in terms of fitness and evolvability). Further, our results call for exploring new measures of modularity and network communities that better correspond to functional categorizations. PMID:22974099

  19. Modular jet impingement assemblies with passive and active flow control for electronics cooling

    DOEpatents

    Zhou, Feng; Dede, Ercan Mehmet; Joshi, Shailesh

    2016-09-13

Power electronics modules having modular jet impingement assemblies utilized to cool heat generating devices are disclosed. The modular jet impingement assemblies include a modular manifold having a distribution recess, one or more angled inlet connection tubes positioned at an inlet end of the modular manifold that fluidly couple the inlet tube to the distribution recess, and one or more outlet connection tubes positioned at an outlet end of the modular manifold that fluidly couple the outlet tube to the distribution recess. The modular jet impingement assemblies include a manifold insert removably positioned within the distribution recess that includes one or more inlet branch channels, each including an impinging slot, and one or more outlet branch channels, each including a collecting slot. Further, a heat transfer plate is coupled to the modular manifold, the heat transfer plate comprising an impingement surface including an array of fins that extend toward the manifold insert.

  20. Modular cathode assemblies and methods of using the same for electrochemical reduction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiedmeyer, Stanley G.; Barnes, Laurel A.; Williamson, Mark A.

Modular cathode assemblies are useable in electrolytic reduction systems and include a basket through which fluid electrolyte may pass and exchange charge with a material to be reduced in the basket. The basket can be divided into upper and lower sections to provide entry for the material. Example embodiment cathode assemblies may have any shape to permit modular placement at any position in reduction systems. Modular cathode assemblies include a cathode plate in the basket, to which unique and opposite electrical power may be supplied. Example embodiment modular cathode assemblies may have standardized electrical connectors. Modular cathode assemblies may be supported by a top plate of an electrolytic reduction system. Electrolytic oxide reduction systems are operated by positioning modular cathode and anode assemblies at desired positions, placing a material in the basket, and charging the modular assemblies to reduce the metal oxide.

  1. Modular cathode assemblies and methods of using the same for electrochemical reduction

    DOEpatents

    Wiedmeyer, Stanley G; Barnes, Laurel A; Williamson, Mark A; Willit, James L

    2014-12-02

    Modular cathode assemblies are useable in electrolytic reduction systems and include a basket through which fluid electrolyte may pass and exchange charge with a material to be reduced in the basket. The basket can be divided into upper and lower sections to provide entry for the material. Example embodiment cathode assemblies may have any shape to permit modular placement at any position in reduction systems. Modular cathode assemblies include a cathode plate in the basket, to which unique and opposite electrical power may be supplied. Example embodiment modular cathode assemblies may have standardized electrical connectors. Modular cathode assemblies may be supported by a top plate of an electrolytic reduction system. Electrolytic oxide reduction systems are operated by positioning modular cathode and anode assemblies at desired positions, placing a material in the basket, and charging the modular assemblies to reduce the metal oxide.

  2. The Current Status of Modular Coordination. A Research Correlation Conference of Building Research Institute, Division of Engineering and Industrial Research (Fall 1959).

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC.

    Publication of conference presentations include--(1) a brief review of current modular standard development, (2) the statistical status of modular practice, (3) availability of modular products, and (4) educational programs on modular coordination. Included are--(1) explanatory diagrams, (2) text of an open panel discussion, and (3) a list of…

  3. A multilingual gold-standard corpus for biomedical concept recognition: the Mantra GSC.

    PubMed

    Kors, Jan A; Clematide, Simon; Akhondi, Saber A; van Mulligen, Erik M; Rebholz-Schuhmann, Dietrich

    2015-09-01

    To create a multilingual gold-standard corpus for biomedical concept recognition. We selected text units from different parallel corpora (Medline abstract titles, drug labels, biomedical patent claims) in English, French, German, Spanish, and Dutch. Three annotators per language independently annotated the biomedical concepts, based on a subset of the Unified Medical Language System and covering a wide range of semantic groups. To reduce the annotation workload, automatically generated preannotations were provided. Individual annotations were automatically harmonized and then adjudicated, and cross-language consistency checks were carried out to arrive at the final annotations. The number of final annotations was 5530. Inter-annotator agreement scores indicate good agreement (median F-score 0.79), and are similar to those between individual annotators and the gold standard. The automatically generated harmonized annotation set for each language performed as well as the best annotator for that language. The use of automatic preannotations, harmonized annotations, and parallel corpora helped to keep the manual annotation efforts manageable. The inter-annotator agreement scores provide a reference standard for gauging the performance of automatic annotation techniques. To our knowledge, this is the first gold-standard corpus for biomedical concept recognition in languages other than English. Other distinguishing features are the wide variety of semantic groups that are being covered, and the diversity of text genres that were annotated. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  4. Modular Design in Treaty Verification Equipment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macarthur, Duncan Whittemore; Benz, Jacob; Tolk, Keith

    2015-01-27

    It is widely believed that modular design is a good thing. However, there are often few explicit arguments, or even an agreed range of definitions, to back up this belief. In this paper, we examine the potential range of design modularity, the implications of various amounts of modularity, and the advantages and disadvantages of each level of modular construction. We conclude with a comparison of the advantages and disadvantages of each type, as well as discuss many caveats that should be observed to take advantage of the positive features of modularity and minimize the effects of the negative. The tradeoffs described in this paper will be evaluated during the conceptual design to determine what amount of modularity should be included.

  5. Analysis of Advanced Modular Power Systems (AMPS) for Deep Space Exploration

    NASA Technical Reports Server (NTRS)

    Oeftering, Richard; Soeder, James F.; Beach, Ray

    2014-01-01

    The Advanced Modular Power Systems (AMPS) project is developing a modular approach to spacecraft power systems for exploration beyond Earth orbit. AMPS is intended to reduce the cost of design, development, test, and integration, as well as the operational logistics cost of supporting exploration missions. AMPS seeks to establish modular power building blocks with standardized electrical, mechanical, thermal and data interfaces that can be applied across multiple exploration vehicles. The presentation discusses the results of a cost analysis that compares the cost of the modular approach against a traditional non-modular approach.

  6. In Silico Investigation of a Surgical Interface for Remote Control of Modular Miniature Robots in Minimally Invasive Surgery

    PubMed Central

    Zygomalas, Apollon; Giokas, Konstantinos; Koutsouris, Dimitrios

    2014-01-01

    Aim. Modular mini-robots can be used in novel minimally invasive surgery techniques like natural orifice transluminal endoscopic surgery (NOTES) and laparoendoscopic single site (LESS) surgery. The control of these miniature assistants is complicated. The aim of this study is the in silico investigation of a remote controlling interface for modular miniature robots which can be used in minimally invasive surgery. Methods. The conceptual controlling system was developed, programmed, and simulated using professional robotics simulation software. Three different modes of control were programmed. The remote controlling surgical interface was virtually designed as a high scale representation of the respective modular mini-robot, therefore a modular controlling system itself. Results. With the proposed modular controlling system the user could easily identify the conformation of the modular mini-robot and adequately modify it as needed. The arrangement of each module was always known. The in silico investigation gave useful information regarding the controlling mode, the adequate speed of rearrangements, and the number of modules needed for efficient working tasks. Conclusions. The proposed conceptual model may promote the research and development of more sophisticated modular controlling systems. Modular surgical interfaces may improve the handling and the dexterity of modular miniature robots during minimally invasive procedures. PMID:25295187

  7. In silico investigation of a surgical interface for remote control of modular miniature robots in minimally invasive surgery.

    PubMed

    Zygomalas, Apollon; Giokas, Konstantinos; Koutsouris, Dimitrios

    2014-01-01

    Aim. Modular mini-robots can be used in novel minimally invasive surgery techniques like natural orifice transluminal endoscopic surgery (NOTES) and laparoendoscopic single site (LESS) surgery. The control of these miniature assistants is complicated. The aim of this study is the in silico investigation of a remote controlling interface for modular miniature robots which can be used in minimally invasive surgery. Methods. The conceptual controlling system was developed, programmed, and simulated using professional robotics simulation software. Three different modes of control were programmed. The remote controlling surgical interface was virtually designed as a high scale representation of the respective modular mini-robot, therefore a modular controlling system itself. Results. With the proposed modular controlling system the user could easily identify the conformation of the modular mini-robot and adequately modify it as needed. The arrangement of each module was always known. The in silico investigation gave useful information regarding the controlling mode, the adequate speed of rearrangements, and the number of modules needed for efficient working tasks. Conclusions. The proposed conceptual model may promote the research and development of more sophisticated modular controlling systems. Modular surgical interfaces may improve the handling and the dexterity of modular miniature robots during minimally invasive procedures.

  8. Design strategies to address the effect of hydrophobic epitope on stability and in vitro assembly of modular virus-like particle.

    PubMed

    Tekewe, Alemu; Connors, Natalie K; Middelberg, Anton P J; Lua, Linda H L

    2016-08-01

    Virus-like particles (VLPs) and capsomere subunits have shown promising potential as safe and effective vaccine candidates. They can serve as platforms for the display of foreign epitopes on their surfaces in a modular architecture. Depending on the physicochemical properties of the antigenic modules, modularization may affect the expression, solubility and stability of capsomeres, and VLP assembly. In this study, three module designs of a rotavirus hydrophobic peptide (RV10) were synthesized using synthetic biology. Among the three synthetic modules, modularization of the murine polyomavirus VP1 with a single copy of RV10 flanked by long linkers and charged residues resulted in the expression of stable modular capsomeres. Further, titrating the number of RV10 modules per capsomere via Escherichia coli co-expression of unmodified VP1 and modular VP1-RV10 successfully translated purified modular capsomeres into modular VLPs when assembled in vitro. Our results demonstrate that tailoring the physicochemical properties of modules to enhance modular capsomere stability is achievable through synthetic biology designs. Combined with the module titration strategy to avoid steric hindrance to intercapsomere interactions, this allows bioprocessing of bacterially produced, in vitro assembled modular VLPs. © 2016 The Protein Society.

  9. On a High-Performance VLSI Solution to Database Problems.

    DTIC Science & Technology

    1981-08-01

    offer such attractive features as automatic verification and maintenance of semantic integrity, usage of views as abstraction and authorization...course, is the waste of too much potential resource. The global database may contain information for many different users and applications. In processing...working on, this may cause no damage at all, but some waste of space. Therefore one solution may be perhaps to do nothing to prevent its occurrence

  10. Literature Mining of Pathogenesis-Related Proteins in Human Pathogens for Database Annotation

    DTIC Science & Technology

    2009-10-01

    ...submission and for literature mining result display with automatically tagged abstracts. I. Literature data sets for machine learning algorithm training...mass spectrometry) proteomics data from Burkholderia strains. • Task 1 (M13-15): Preliminary analysis of the Burkholderia proteomic space

  11. Signal recognition and parameter estimation of BPSK-LFM combined modulation

    NASA Astrophysics Data System (ADS)

    Long, Chao; Zhang, Lin; Liu, Yu

    2015-07-01

    Intra-pulse analysis plays an important role in electronic warfare. Intra-pulse feature abstraction focuses on primary parameters such as instantaneous frequency, modulation, and symbol rate. In this paper, automatic modulation recognition and feature extraction for combined BPSK-LFM modulation signals, based on a decision-theoretic approach, are studied. The simulation results show good recognition performance and high estimation precision, and the system is easy to implement.
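
    One of the intra-pulse parameters named above, instantaneous frequency, is commonly estimated from the analytic signal. The sketch below applies scipy's Hilbert transform to a synthetic LFM pulse; the sample rate and chirp parameters are illustrative assumptions, and this is not the decision-theoretic recognizer studied in the paper.

      # Sketch: estimate the instantaneous frequency of a synthetic LFM chirp via
      # the analytic signal (Hilbert transform). Sample rate and sweep values are
      # arbitrary illustration values.
      import numpy as np
      from scipy.signal import hilbert, chirp

      fs = 1_000_000                              # sample rate [Hz], assumed
      t = np.arange(0, 1e-3, 1 / fs)              # 1 ms pulse
      x = chirp(t, f0=50e3, t1=1e-3, f1=200e3)    # LFM sweep, 50 kHz -> 200 kHz

      analytic = hilbert(x)
      phase = np.unwrap(np.angle(analytic))
      inst_freq = np.diff(phase) / (2 * np.pi) * fs   # instantaneous frequency [Hz]

      print(f"start ~{inst_freq[10] / 1e3:.1f} kHz, end ~{inst_freq[-10] / 1e3:.1f} kHz")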

  12. JANE, A new information retrieval system for the Radiation Shielding Information Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trubey, D.K.

    A new information storage and retrieval system has been developed for the Radiation Shielding Information Center (RSIC) at Oak Ridge National Laboratory to replace mainframe systems that have become obsolete. The database contains citations and abstracts of literature which were selected by RSIC analysts and indexed with terms from a controlled vocabulary. The database, begun in 1963, has been maintained continuously since that time. The new system, called JANE, incorporates automatic indexing techniques and on-line retrieval using the RSIC Data General Eclipse MV/4000 minicomputer. Automatic indexing and retrieval techniques based on fuzzy-set theory allow the presentation of results in order of Retrieval Status Value. The fuzzy-set membership function depends on term frequency in the titles and abstracts and on Term Discrimination Values which indicate the resolving power of the individual terms. These values are determined by the Cover Coefficient method. The use of a commercial database to store and retrieve the indexing information permits rapid retrieval of the stored documents. Comparisons of the new and presently-used systems for actual searches of the literature indicate that it is practical to replace the mainframe systems with a minicomputer system similar to the present version of JANE. 18 refs., 10 figs.
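
    A conceptual sketch of the kind of ranking described above (not JANE's actual implementation): each document receives a Retrieval Status Value that combines term frequency with a Term Discrimination Value through a fuzzy membership function. The membership formula and the toy data are assumptions.

      # Conceptual sketch only: rank documents by a Retrieval Status Value computed
      # as a fuzzy membership that weights each query term's frequency in a document
      # by a term discrimination value. The formula below is an illustrative choice.
      def retrieval_status_value(query_terms, doc_term_freqs, discrimination):
          """doc_term_freqs: term -> frequency in this document;
          discrimination: term -> discrimination value in [0, 1]."""
          rsv = 0.0
          for term in query_terms:
              tf = doc_term_freqs.get(term, 0)
              membership = 1.0 - 1.0 / (1.0 + tf)          # saturating fuzzy membership
              rsv += discrimination.get(term, 0.0) * membership
          return rsv

      docs = {
          "doc1": {"shielding": 3, "neutron": 1},
          "doc2": {"shielding": 1},
      }
      disc = {"shielding": 0.8, "neutron": 0.5}
      ranked = sorted(docs,
                      key=lambda d: retrieval_status_value(["shielding", "neutron"], docs[d], disc),
                      reverse=True)
      print(ranked)   # documents in decreasing order of Retrieval Status Value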

  13. The iFlow modelling framework v2.4: a modular idealized process-based model for flow and transport in estuaries

    NASA Astrophysics Data System (ADS)

    Dijkstra, Yoeri M.; Brouwer, Ronald L.; Schuttelaars, Henk M.; Schramkowski, George P.

    2017-07-01

    The iFlow modelling framework is a width-averaged model for the systematic analysis of the water motion and sediment transport processes in estuaries and tidal rivers. The distinctive solution method, a mathematical perturbation method, used in the model allows for identification of the effect of individual physical processes on the water motion and sediment transport and study of the sensitivity of these processes to model parameters. This distinction between processes provides a unique tool for interpreting and explaining hydrodynamic interactions and sediment trapping. iFlow also includes a large number of options to configure the model geometry and multiple choices of turbulence and salinity models. Additionally, the model contains auxiliary components, including one that facilitates easy and fast sensitivity studies. iFlow has a modular structure, which makes it easy to include, exclude or change individual model components, called modules. Depending on the required functionality for the application at hand, modules can be selected to construct anything from very simple quasi-linear models to rather complex models involving multiple non-linear interactions. This way, the model complexity can be adjusted to the application. Once the modules containing the required functionality are selected, the underlying model structure automatically ensures modules are called in the correct order. The model inserts iteration loops over groups of modules that are mutually dependent. iFlow also ensures a smooth coupling of modules using analytical and numerical solution methods. This way the model combines the speed and accuracy of analytical solutions with the versatility of numerical solution methods. In this paper we present the modular structure, solution method and two examples of the use of iFlow. In the examples we present two case studies, of the Yangtze and Scheldt rivers, demonstrating how iFlow facilitates the analysis of model results, the understanding of the underlying physics and the testing of parameter sensitivity. A comparison of the model results to measurements shows a good qualitative agreement. iFlow is written in Python and is available as open source code under the LGPL license.
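
    The automatic ordering of modules can be pictured as a topological sort over declared dependencies, as in the generic sketch below using Python's standard-library graphlib; the module names and dependencies are invented for illustration and do not reflect iFlow's actual registry or API.

      # Sketch of calling modules "in the correct order": each module declares which
      # other modules it depends on, and a topological sort over those dependencies
      # yields a valid execution order. Generic illustration, not iFlow's own code.
      from graphlib import TopologicalSorter

      # module name -> set of modules it depends on (assumed example dependencies)
      dependencies = {
          "hydrodynamics": set(),
          "turbulence": {"hydrodynamics"},
          "salinity": {"hydrodynamics"},
          "sediment_transport": {"hydrodynamics", "turbulence", "salinity"},
      }

      order = list(TopologicalSorter(dependencies).static_order())
      print(order)  # e.g. ['hydrodynamics', 'turbulence', 'salinity', 'sediment_transport']
      # Mutually dependent modules (cycles) would raise graphlib.CycleError here;
      # a framework like iFlow instead wraps such groups in an iteration loop until
      # the coupled solution converges.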

  14. A training approach to improve stepping automaticity while dual-tasking in Parkinson's disease

    PubMed Central

    Chomiak, Taylor; Watts, Alexander; Meyer, Nicole; Pereira, Fernando V.; Hu, Bin

    2017-01-01

    Abstract Background: Deficits in motor movement automaticity in Parkinson's disease (PD), especially during multitasking, are early and consistent hallmarks of cognitive function decline, which increases fall risk and reduces quality of life. This study aimed to test the feasibility and potential efficacy of a wearable sensor-enabled technological platform designed for an in-home music-contingent stepping-in-place (SIP) training program to improve step automaticity during dual-tasking (DT). Methods: This was a 4-week prospective intervention pilot study. The intervention uses a sensor system and algorithm that runs off the iPod Touch which calculates step height (SH) in real-time. These measurements were then used to trigger auditory (treatment group, music; control group, radio podcast) playback in real-time through wireless headphones upon maintenance of repeated large amplitude stepping. With small steps or shuffling, auditory playback stops, thus allowing participants to use anticipatory motor control to regain positive feedback. Eleven participants were recruited from an ongoing trial (Trial Number: ISRCTN06023392). Fear of falling (FES-I), general cognitive functioning (MoCA), self-reported freezing of gait (FOG-Q), and DT step automaticity were evaluated. Results: While we found no significant effect of training on FES-I, MoCA, or FOG-Q, we did observe a significant group (music vs podcast) by training interaction in DT step automaticity (P<0.01). Conclusion: Wearable device technology can be used to enable musically-contingent SIP training to increase motor automaticity for people living with PD. The training approach described here can be implemented at home to meet the growing demand for self-management of symptoms by patients. PMID:28151878

  15. Modular workcells: modern methods for laboratory automation.

    PubMed

    Felder, R A

    1998-12-01

    Laboratory automation is beginning to become an indispensable survival tool for laboratories facing difficult market competition. However, estimates suggest that only 8% of laboratories will be able to afford total laboratory automation systems. Therefore, automation vendors have developed alternative hardware configurations called 'modular automation', to fit the smaller laboratory. Modular automation consists of consolidated analyzers, integrated analyzers, modular workcells, and pre- and post-analytical automation. These terms will be defined in this paper. Using a modular automation model, the automated core laboratory will become a site where laboratory data is evaluated by trained professionals to provide diagnostic information to practising physicians. Modern software information management and process control tools will complement modular hardware. Proper standardization that will allow vendor-independent modular configurations will assure success of this revolutionary new technology.

  16. User-guided segmentation for volumetric retinal optical coherence tomography images

    PubMed Central

    Yin, Xin; Chao, Jennifer R.; Wang, Ruikang K.

    2014-01-01

    Abstract. Despite the existence of automatic segmentation techniques, trained graders still rely on manual segmentation to provide retinal layers and features from clinical optical coherence tomography (OCT) images for accurate measurements. To bridge the gap between this time-consuming need of manual segmentation and currently available automatic segmentation techniques, this paper proposes a user-guided segmentation method to perform the segmentation of retinal layers and features in OCT images. With this method, by interactively navigating three-dimensional (3-D) OCT images, the user first manually defines user-defined (or sketched) lines at regions where the retinal layers appear very irregular for which the automatic segmentation method often fails to provide satisfactory results. The algorithm is then guided by these sketched lines to trace the entire 3-D retinal layer and anatomical features by the use of novel layer and edge detectors that are based on robust likelihood estimation. The layer and edge boundaries are finally obtained to achieve segmentation. Segmentation of retinal layers in mouse and human OCT images demonstrates the reliability and efficiency of the proposed user-guided segmentation method. PMID:25147962

  17. Increased neural responses to empathy for pain might explain how acute stress increases prosociality

    PubMed Central

    Tomova, L.; Majdandžić, J.; Hummer, A.; Windischberger, C.; Heinrichs, M.

    2017-01-01

    Abstract Recent behavioral investigations suggest that acute stress can increase prosocial behavior. Here, we investigated whether increased empathy represents a potential mechanism for this finding. Using functional magnetic resonance imaging, we assessed the effects of acute stress on neural responses related to automatic and regulatory components of empathy for pain as well as subsequent prosocial behavior. Stress increased activation in brain areas associated with the automatic sharing of others’ pain, such as the anterior insula, the anterior midcingulate cortex, and the primary somatosensory cortex. In addition, we found increased prosocial behavior under stress. Furthermore, activation in the anterior midcingulate cortex mediated the effects of stress on prosocial behavior. However, stressed participants also displayed stronger and inappropriate other-related responses in situations which required them to take the perspective of another person, and to regulate their automatic affective responses. Thus, while acute stress may increase prosocial behavior by intensifying the sharing of others’ emotions, this comes at the cost of reduced cognitive appraisal abilities. Depending on the contextual constraints, stress may therefore affect empathy in ways that are either beneficial or detrimental. PMID:27798249

  18. Classification of C2C12 cells at differentiation by convolutional neural network of deep learning using phase contrast images.

    PubMed

    Niioka, Hirohiko; Asatani, Satoshi; Yoshimura, Aina; Ohigashi, Hironori; Tagawa, Seiichi; Miyake, Jun

    2018-01-01

    In the field of regenerative medicine, tremendous numbers of cells are necessary for tissue/organ regeneration. Automatic cell-culturing systems have already been developed; the next step is constructing a non-invasive method to monitor the condition of cells automatically. As an image analysis method, the convolutional neural network (CNN), one of the deep learning methods, is approaching human-level recognition. We constructed and applied a CNN algorithm for automatic recognition of cellular differentiation in the myogenic C2C12 cell line. Phase-contrast images of cultured C2C12 cells were prepared as the input dataset. In the differentiation process from myoblasts to myotubes, cellular morphology changes from a round shape to an elongated tubular shape due to fusion of the cells. The CNN abstracts the shape features of the cells and classifies them according to the number of days in culture after differentiation is induced. Changes in cellular shape depending on the number of days of culture (Day 0, Day 3, Day 6) are classified with 91.3% accuracy. Image analysis with CNNs has the potential to support automation in the regenerative medicine industry.
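
    A minimal sketch of a CNN for the three-class task described (Day 0 / Day 3 / Day 6), written with Keras; the input size, layer choices and hyperparameters are assumptions for illustration and are not the architecture reported by the authors.

      # Minimal sketch of a CNN for three-class classification of phase-contrast
      # image patches. All sizes and layer choices are illustrative assumptions.
      import tensorflow as tf

      model = tf.keras.Sequential([
          tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(128, 128, 1)),
          tf.keras.layers.MaxPooling2D(),
          tf.keras.layers.Conv2D(64, 3, activation="relu"),
          tf.keras.layers.MaxPooling2D(),
          tf.keras.layers.Flatten(),
          tf.keras.layers.Dense(64, activation="relu"),
          tf.keras.layers.Dense(3, activation="softmax"),   # Day 0 / Day 3 / Day 6
      ])
      model.compile(optimizer="adam",
                    loss="sparse_categorical_crossentropy",
                    metrics=["accuracy"])
      # model.fit(train_images, train_labels, epochs=10, validation_split=0.2)
      # (train_images / train_labels are hypothetical arrays of patches and class ids)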

  19. The Modular need for the Division Signal Battalion

    DTIC Science & Technology

    2017-06-09

    findings and analyzes them to expand on them. It is with these findings and subsequent analysis that the case studies shape the answer to the three...These case studies focus on the signal leadership development and how it occurred in the pre-modular force structure, during modularity, and the...the comparative case study research. The case studies focus on signal leader development in a pre-modular signal force, a modular signal force, and

  20. Modular Fixturing System

    NASA Technical Reports Server (NTRS)

    Littell, Justin Anderson (Inventor); Street, Jon P. (Inventor)

    2017-01-01

    The modular fixturing system of the present invention is modular, reusable and capable of significant customization, both in terms of system radius and system height, allowing it to be arranged and rearranged in numerous unique configurations. The system includes multiple modular stanchions having stanchion shafts and stanchion feet that removably attach to apertures in a table. Angle brackets attached to the modular stanchions support shelves. These shelves in turn provide support to work pieces during fabrication processes such as welding.

  1. A Modularized Counselor-Education Program.

    ERIC Educational Resources Information Center

    Miller, Thomas V.; Dimattia, Dominic J.

    1978-01-01

    Counselor-education programs may be enriched through the use of modularized learning experiences. This article notes several recent articles on competency-based counselor education, the concepts of simulation and modularization, and describes the process of developing a modularized master's program at the University of Bridgeport in Connecticut.…

  2. On the classification of weakly integral modular categories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruillard, Paul; Galindo, César; Ng, Siu-Hung

    In this paper we classify all modular categories of dimension 4m, where m is an odd square-free integer, and all rank 6 and rank 7 weakly integral modular categories. This completes the classification of weakly integral modular categories through rank 7. In particular, our results imply that all integral modular categories of rank at most 7 are pointed (that is, every simple object has dimension 1). All the non-integral (but weakly integral) modular categories of ranks 6 and 7 have dimension 4m, with m an odd square free integer, so their classification is an application of our main result. The classification of rank 7 integral modular categories is facilitated by an analysis of the two group actions on modular categories: the Galois group of the field generated by the entries of the S-matrix and the group of invertible isomorphism classes of objects. We derive some valuable arithmetic consequences from these actions.

  3. JTSA: an open source framework for time series abstractions.

    PubMed

    Sacchi, Lucia; Capozzi, Davide; Bellazzi, Riccardo; Larizza, Cristiana

    2015-10-01

    The evaluation of the clinical status of a patient is frequently based on the temporal evolution of some parameters, making the detection of temporal patterns a priority in data analysis. Temporal abstraction (TA) is a methodology widely used in medical reasoning for summarizing and abstracting longitudinal data. This paper describes JTSA (Java Time Series Abstractor), a framework including a library of algorithms for time series preprocessing and abstraction and an engine to execute a workflow for temporal data processing. The JTSA framework is grounded on a comprehensive ontology that models temporal data processing both from the data storage and the abstraction computation perspective. The JTSA framework is designed to allow users to build their own analysis workflows by combining different algorithms. Thanks to the modular structure of a workflow, simple to highly complex patterns can be detected. The JTSA framework has been developed in Java 1.7 and is distributed under GPL as a jar file. JTSA provides: a collection of algorithms to perform temporal abstraction and preprocessing of time series, a framework for defining and executing data analysis workflows based on these algorithms, and a GUI for workflow prototyping and testing. The whole JTSA project relies on a formal model of the data types and of the algorithms included in the library. This model is the basis for the design and implementation of the software application. Taking into account this formalized structure, the user can easily extend the JTSA framework by adding new algorithms. Results are shown in the context of the EU project MOSAIC, extracting relevant patterns from data related to the long-term monitoring of diabetic patients. JTSA's versatility is demonstrated by its possible uses, both as a standalone tool for data summarization and as a module to be embedded into other architectures to select specific phenotypes based on TAs in a large dataset. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
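
    To illustrate the core TA operation in general terms (not via JTSA's Java API), the sketch below collapses a numeric time series into qualitative state intervals; the threshold, labels and example series are placeholders.

      # Generic illustration of a "state" temporal abstraction: collapse a numeric
      # time series into intervals labeled by a qualitative state. This mirrors the
      # kind of operation a TA workflow performs; it is not JTSA's API.
      def state_abstraction(samples, threshold):
          """samples: list of (timestamp, value); returns (start, end, label) intervals."""
          intervals, start, label = [], None, None
          for t, v in samples:
              current = "HIGH" if v > threshold else "NORMAL"
              if label is None:
                  start, label = t, current
              elif current != label:
                  intervals.append((start, t, label))
                  start, label = t, current
          if label is not None:
              intervals.append((start, samples[-1][0], label))
          return intervals

      glucose = [(0, 95), (1, 110), (2, 190), (3, 210), (4, 120)]   # hypothetical readings
      print(state_abstraction(glucose, threshold=180))
      # [(0, 2, 'NORMAL'), (2, 4, 'HIGH'), (4, 4, 'NORMAL')]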

  4. Modular interdependency in complex dynamical systems.

    PubMed

    Watson, Richard A; Pollack, Jordan B

    2005-01-01

    Herbert A. Simon's characterization of modularity in dynamical systems describes subsystems as having dynamics that are approximately independent of those of other subsystems (in the short term). This fits with the general intuition that modules must, by definition, be approximately independent. In the evolution of complex systems, such modularity may enable subsystems to be modified and adapted independently of other subsystems, whereas in a nonmodular system, modifications to one part of the system may result in deleterious side effects elsewhere in the system. But this notion of modularity and its effect on evolvability is not well quantified and is rather simplistic. In particular, modularity need not imply that intermodule dependences are weak or unimportant. In dynamical systems this is acknowledged by Simon's suggestion that, in the long term, the dynamical behaviors of subsystems do interact with one another, albeit in an "aggregate" manner--but this kind of intermodule interaction is omitted in models of modularity for evolvability. In this brief discussion we seek to unify notions of modularity in dynamical systems with notions of how modularity affects evolvability. This leads to a quantifiable measure of modularity and a different understanding of its effect on evolvability.

  5. Experimental Verification and Integration of a Next Generation Smart Power Management System

    NASA Astrophysics Data System (ADS)

    Clemmer, Tavis B.

    With the increase in energy demand by the residential community in this country and the diminishing fossil fuel resources being used for electric energy production, there is a need for a system to efficiently manage power within a residence. The Smart Green Power Node (SGPN) is a next generation energy management system that automates on-site energy production, storage, consumption, and grid usage to yield the most savings for both the utility and the consumer. Such a system automatically manages on-site distributed generation sources such as a PhotoVoltaic (PV) input and battery storage to curtail grid energy usage when the price is high. The SGPN high level control features an advanced modular algorithm that incorporates weather data for projected PV generation, battery health monitoring algorithms, user preferences for load prioritization within the home in case of an outage, Time of Use (ToU) grid power pricing, and status of on-site resources to intelligently schedule and manage power flow between the grid, loads, and the on-site resources. The SGPN has a scalable, modular architecture such that it can be customized for user specific applications. This drove the SGPN topology, which connects on-site resources at a low voltage DC microbus; a two stage bi-directional inverter/rectifier then couples the AC load and residential grid connection to on-site generation. The SGPN has been designed, built, and is undergoing testing. Hardware test results obtained are consistent with the design goals and indicate that the SGPN is a viable system, with recommended changes and future work.
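
    A highly simplified, rule-based sketch of the kind of dispatch decision such an energy manager makes each control interval is given below; the thresholds, units and priority rules are assumptions for illustration, not the SGPN's actual scheduling algorithm.

      # Simplified dispatch rule: serve the load from PV first, discharge the battery
      # when grid power is expensive, otherwise buy from the grid. All thresholds and
      # units are illustrative assumptions.
      def dispatch(load_kw, pv_kw, battery_soc, grid_price, price_threshold=0.25,
                   soc_reserve=0.2, battery_max_kw=3.0):
          residual = max(load_kw - pv_kw, 0.0)            # load not covered by PV
          battery_kw = 0.0
          if grid_price > price_threshold and battery_soc > soc_reserve:
              battery_kw = min(residual, battery_max_kw)  # peak-shave with the battery
          grid_kw = residual - battery_kw
          return {"pv_kw": min(load_kw, pv_kw), "battery_kw": battery_kw, "grid_kw": grid_kw}

      print(dispatch(load_kw=4.0, pv_kw=1.5, battery_soc=0.8, grid_price=0.32))
      # {'pv_kw': 1.5, 'battery_kw': 2.5, 'grid_kw': 0.0}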

  6. A navigated mechatronic system with haptic features to assist in surgical interventions.

    PubMed

    Pieck, S; Gross, I; Knappe, P; Kuenzler, S; Kerschbaumer, F; Wahrburg, J

    2003-01-01

    In orthopaedic surgery, the development of new computer-based technologies such as navigation systems and robotics will facilitate more precise, reproducible results in surgical interventions. There are already commercial systems available for clinical use, though these still have some limitations and drawbacks. This paper presents an alternative approach to a universal modular surgical assistant system for supporting less or minimally invasive surgery. The position of a mechatronic arm, which is part of the system, is controlled by a navigation system so that small patient movements are automatically detected and compensated for in real time. Thus, the optimal tool position can be constantly maintained without the need for rigid bone or patient fixation. Furthermore, a force control mode of the mechatronic assistant system, based on a force-torque sensor, not only increases safety during surgical interventions but also facilitates hand-driven direct positioning of the arm. A prototype has been successfully tested in clinical applications at the Orthopadische Universitätsklinik Frankfurt. For the first time worldwide, implantation of the cup prosthesis in total hip replacement surgery has been carried out with the assistance of a mechatronic arm. According to measurements by the digitizing system, operating tool angle deviation remained below 0.5 degrees, relative to the preoperative planning. The presented approach to a new kind of surgical mechatronic assistance system supports the surgeon as needed by optimal positioning of the surgical instruments. Due to its modular design, it is applicable to a wide range of tasks in surgical interventions, e.g., endoscope guidance, bone preparation, etc.

  7. Individual differences and time-varying features of modular brain architecture.

    PubMed

    Liao, Xuhong; Cao, Miao; Xia, Mingrui; He, Yong

    2017-05-15

    Recent studies have suggested that human brain functional networks are topologically organized into functionally specialized but inter-connected modules to facilitate efficient information processing and highly flexible cognitive function. However, these studies have mainly focused on group-level network modularity analyses using "static" functional connectivity approaches. How these extraordinary modular brain structures vary across individuals and spontaneously reconfigure over time remain largely unknown. Here, we employed multiband resting-state functional MRI data (N=105) from the Human Connectome Project and a graph-based modularity analysis to systematically investigate individual variability and dynamic properties in modular brain networks. We showed that the modular structures of brain networks dramatically vary across individuals, with higher modular variability primarily in the association cortex (e.g., fronto-parietal and attention systems) and lower variability in the primary systems. Moreover, brain regions spontaneously changed their module affiliations on a temporal scale of seconds, which cannot be simply attributable to head motion and sampling error. Interestingly, the spatial pattern of intra-subject dynamic modular variability largely overlapped with that of inter-subject modular variability, both of which were highly reproducible across repeated scanning sessions. Finally, the regions with remarkable individual/temporal modular variability were closely associated with network connectors and the number of cognitive components, suggesting a potential contribution to information integration and flexible cognitive function. Collectively, our findings highlight individual modular variability and the notable dynamic characteristics in large-scale brain networks, which enhance our understanding of the neural substrates underlying individual differences in a variety of cognition and behaviors. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Sensory Intelligence for Extraction of an Abstract Auditory Rule: A Cross-Linguistic Study.

    PubMed

    Guo, Xiao-Tao; Wang, Xiao-Dong; Liang, Xiu-Yuan; Wang, Ming; Chen, Lin

    2018-02-21

    In a complex linguistic environment, while speech sounds can greatly vary, some shared features are often invariant. These invariant features constitute so-called abstract auditory rules. Our previous study has shown that with auditory sensory intelligence, the human brain can automatically extract the abstract auditory rules in the speech sound stream, presumably serving as the neural basis for speech comprehension. However, whether the sensory intelligence for extraction of abstract auditory rules in speech is inherent or experience-dependent remains unclear. To address this issue, we constructed a complex speech sound stream using auditory materials in Mandarin Chinese, in which syllables had a flat lexical tone but differed in other acoustic features to form an abstract auditory rule. This rule was occasionally and randomly violated by the syllables with the rising, dipping or falling tone. We found that both Chinese and foreign speakers detected the violations of the abstract auditory rule in the speech sound stream at a pre-attentive stage, as revealed by the whole-head recordings of mismatch negativity (MMN) in a passive paradigm. However, MMNs peaked earlier in Chinese speakers than in foreign speakers. Furthermore, Chinese speakers showed different MMN peak latencies for the three deviant types, which paralleled recognition points. These findings indicate that the sensory intelligence for extraction of abstract auditory rules in speech sounds is innate but shaped by language experience. Copyright © 2018 IBRO. Published by Elsevier Ltd. All rights reserved.

  9. Modular Apparatus and Method for Attaching Multiple Devices

    NASA Technical Reports Server (NTRS)

    Okojie, Robert S (Inventor)

    2015-01-01

    A modular apparatus for attaching sensors and electronics is disclosed. The modular apparatus includes a square recess including a plurality of cavities and a reference cavity such that a pressure sensor can be connected to the modular apparatus. The modular apparatus also includes at least one voltage input hole and at least one voltage output hole operably connected to each of the plurality of cavities such that voltage can be applied to the pressure sensor and received from the pressure sensor.

  10. Robotic hand with modular extensions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salisbury, Curt Michael; Quigley, Morgan

    A robotic device is described herein. The robotic device includes a frame that comprises a plurality of receiving regions that are configured to receive a respective plurality of modular robotic extensions. The modular robotic extensions are removably attachable to the frame at the respective receiving regions by way of respective mechanical fuses. Each mechanical fuse is configured to trip when a respective modular robotic extension experiences a predefined load condition, such that the respective modular robotic extension detaches from the frame when the load condition is met.

  11. Effective biomedical document classification for identifying publications relevant to the mouse Gene Expression Database (GXD).

    PubMed

    Jiang, Xiangying; Ringwald, Martin; Blake, Judith; Shatkay, Hagit

    2017-01-01

    The Gene Expression Database (GXD) is a comprehensive online database within the Mouse Genome Informatics resource, aiming to provide available information about endogenous gene expression during mouse development. The information stems primarily from many thousands of biomedical publications that database curators must go through and read. Given the very large number of biomedical papers published each year, automatic document classification plays an important role in biomedical research. Specifically, an effective and efficient document classifier is needed for supporting the GXD annotation workflow. We present here an effective yet relatively simple classification scheme, which uses readily available tools while employing feature selection, aiming to assist curators in identifying publications relevant to GXD. We examine the performance of our method over a large manually curated dataset, consisting of more than 25 000 PubMed abstracts, of which about half are curated as relevant to GXD while the other half as irrelevant to GXD. In addition to text from title-and-abstract, we also consider image captions, an important information source that we integrate into our method. We apply a captions-based classifier to a subset of about 3300 documents, for which the full text of the curated articles is available. The results demonstrate that our proposed approach is robust and effectively addresses the GXD document classification. Moreover, using information obtained from image captions clearly improves performance, compared to title and abstract alone, affirming the utility of image captions as a substantial evidence source for automatically determining the relevance of biomedical publications to a specific subject area. www.informatics.jax.org. © The Author(s) 2017. Published by Oxford University Press.
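
    A sketch of a relevance classifier in the spirit of the approach described, using scikit-learn with TF-IDF features over title-and-abstract text, chi-squared feature selection and a linear model; the tiny corpus, labels and parameter values are placeholders, not the authors' configuration.

      # Sketch of a simple relevance classifier: TF-IDF features over title-and-
      # abstract text, chi-squared feature selection, and a linear classifier.
      # Corpus, labels and parameters are placeholders for illustration.
      from sklearn.pipeline import Pipeline
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.feature_selection import SelectKBest, chi2
      from sklearn.linear_model import LogisticRegression

      texts = ["expression of Shh in the developing limb bud",      # relevant to GXD
               "crystal structure of a bacterial protease"]         # irrelevant
      labels = [1, 0]

      clf = Pipeline([
          ("tfidf", TfidfVectorizer(ngram_range=(1, 2), min_df=1)),
          ("select", SelectKBest(chi2, k=10)),      # keep the most informative features
          ("model", LogisticRegression(max_iter=1000)),
      ])
      clf.fit(texts, labels)
      print(clf.predict(["in situ hybridization shows Pax6 expression in the embryo"]))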

  12. Modular femoral neck fracture after primary total hip arthroplasty.

    PubMed

    Sotereanos, Nicholas G; Sauber, Timothy J; Tupis, Todd T

    2013-01-01

    The use of modular femoral stems in primary total hip arthroplasty has increased considerably in recent years. These modular components offer the surgeon the ability to independently alter version, offset, and length of the femoral component of a hip arthroplasty. This increases the surgeon's ability to accurately recreate the relevant anatomy but increases the possibilities of corrosion and fracture. Multiple case reports have highlighted fractures of these modular components. We present a case of a fracture of a modular design that has had no previously reported modular neck fractures. The patient was informed that data concerning the case would be submitted, and he consented. Copyright © 2013 Elsevier Inc. All rights reserved.

  13. Compilation of Abstracts of Theses Submitted by Candidates for Degrees: October 1988 to September 1989

    DTIC Science & Technology

    1989-09-30

    to accommodate peripherally non-uniform flow modelling free of experimental uncertainties. It was effects (blockage) in the throughflow code...combines that experimental control functions with a detail in this thesis, and the results of a computer menu-driven, diagnostic subsystem to ensure...equations and design a complete (DSL) for both linear and non-linear models and automatic control system for the three dimensional compared. Cross

  14. Field Evaluation of Particle Counter Technology for Aviation Fuel Contamination Detection - Fort Rucker

    DTIC Science & Technology

    2013-06-20

    Automatic Particle Counter, cleanliness, free water, Diesel ...Governmental transfer receipts and 1.0 mg/L on issue to aircraft, or up to 10 mg/L for product used as a diesel product for ground use (1). Free...industry. The International Organization for Standardization (ISO) has published several methods and test procedures for the calibration and use of

  15. Addressing the Heterogeneity of Subject Indexing in the ADS Databases

    NASA Astrophysics Data System (ADS)

    Dubin, David S.

    A drawback of the current document representation scheme in the ADS abstract service is its heterogeneous subject indexing. Several related but inconsistent indexing languages are represented in ADS. A method of reconciling some indexing inconsistencies is described. Using lexical similarity alone, one out of six ADS descriptors can be automatically mapped to some other descriptor. Analysis of postings data can direct administrators to those mergings it is most important to check for errors.
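
    The descriptor-mapping idea can be sketched with a plain lexical similarity measure, as below; the difflib ratio and the 0.85 threshold are illustrative assumptions, not the measure used in the ADS study.

      # Sketch: map descriptors between two indexing vocabularies by lexical
      # similarity alone. The similarity measure (difflib ratio) and the threshold
      # are illustrative assumptions; the vocabularies are invented examples.
      from difflib import SequenceMatcher

      def best_match(term, vocabulary, threshold=0.85):
          scored = [(SequenceMatcher(None, term.lower(), v.lower()).ratio(), v)
                    for v in vocabulary]
          score, candidate = max(scored)
          return candidate if score >= threshold else None

      vocab_a = ["GALAXIES: SPIRAL", "STELLAR EVOLUTION", "RADIATIVE TRANSFER"]
      vocab_b = ["SPIRAL GALAXIES", "STAR EVOLUTION", "RADIATION TRANSFER"]

      for term in vocab_a:
          print(term, "->", best_match(term, vocab_b))   # unmatched terms map to None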

  16. Automating document classification for the Immune Epitope Database

    PubMed Central

    Wang, Peng; Morgan, Alexander A; Zhang, Qing; Sette, Alessandro; Peters, Bjoern

    2007-01-01

    Background The Immune Epitope Database contains information on immune epitopes curated manually from the scientific literature. Like similar projects in other knowledge domains, significant effort is spent on identifying which articles are relevant for this purpose. Results We here report our experience in automating this process using Naïve Bayes classifiers trained on 20,910 abstracts classified by domain experts. Improvements on the basic classifier performance were made by a) utilizing information stored in PubMed beyond the abstract itself b) applying standard feature selection criteria and c) extracting domain specific feature patterns that e.g. identify peptide sequences. We have implemented the classifier into the curation process determining if abstracts are clearly relevant, clearly irrelevant, or if no certain classification can be made, in which case the abstracts are manually classified. Testing this classification scheme on an independent dataset, we achieve 95% sensitivity and specificity in the 51.1% of abstracts that were automatically classified. Conclusion By implementing text classification, we have sped up the reference selection process without sacrificing sensitivity or specificity of the human expert classification. This study provides both practical recommendations for users of text classification tools, as well as a large dataset which can serve as a benchmark for tool developers. PMID:17655769
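
    The three-way triage described above can be sketched with a Naïve Bayes classifier whose predicted probability routes an abstract to automatic acceptance, automatic rejection, or manual review; the toy corpus and thresholds below are placeholders, not the IEDB classifier's actual settings.

      # Sketch of three-way triage: a Naive Bayes text classifier whose probability
      # output routes abstracts to "relevant", "irrelevant", or "manual review".
      # Corpus and thresholds are placeholders for illustration.
      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.naive_bayes import MultinomialNB
      from sklearn.pipeline import Pipeline

      texts = ["epitope mapping of an HLA-restricted peptide from influenza",
               "survey of hospital administrative workflows"]
      labels = [1, 0]   # 1 = relevant to epitope curation, 0 = irrelevant

      clf = Pipeline([("vec", CountVectorizer()), ("nb", MultinomialNB())])
      clf.fit(texts, labels)

      def triage(abstract, hi=0.9, lo=0.1):
          p = clf.predict_proba([abstract])[0, 1]     # P(relevant)
          if p >= hi:
              return "relevant"
          if p <= lo:
              return "irrelevant"
          return "manual review"

      print(triage("T cell epitope prediction for a viral peptide"))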

  17. Modular properties of 6d (DELL) systems

    NASA Astrophysics Data System (ADS)

    Aminov, G.; Mironov, A.; Morozov, A.

    2017-11-01

    If super-Yang-Mills theory possesses exact conformal invariance, there is an additional modular invariance under the change of the complex bare charge. The low-energy Seiberg-Witten prepotential ℱ(a), however, is not explicitly invariant, because the flat moduli also change, a → a_D = ∂ℱ/∂a. As a result, the prepotential is not a modular form and depends also on the anomalous Eisenstein series E_2. This dependence is usually described by the universal MNW modular anomaly equation. We demonstrate that, in the 6d SU(N) theory with two independent modular parameters τ and τ̂, the modular anomaly equation changes, because the modular transform of τ is accompanied by an (N-dependent!) shift of τ̂ and vice versa. This is a new peculiarity of double-elliptic systems, which deserves further investigation.

  18. Towards a Formal Basis for Modular Safety Cases

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Pai, Ganesh

    2015-01-01

    Safety assurance using argument-based safety cases is an accepted best-practice in many safety-critical sectors. Goal Structuring Notation (GSN), which is widely used for presenting safety arguments graphically, provides a notion of modular arguments to support the goal of incremental certification. Despite the efforts at standardization, GSN remains an informal notation whereas the GSN standard contains appreciable ambiguity especially concerning modular extensions. This, in turn, presents challenges when developing tools and methods to intelligently manipulate modular GSN arguments. This paper develops the elements of a theory of modular safety cases, leveraging our previous work on formalizing GSN arguments. Using example argument structures we highlight some ambiguities arising through the existing guidance, present the intuition underlying the theory, clarify syntax, and address modular arguments, contracts, well-formedness and well-scopedness of modules. Based on this theory, we have a preliminary implementation of modular arguments in our toolset, AdvoCATE.

  19. Brain modularity controls the critical behavior of spontaneous activity.

    PubMed

    Russo, R; Herrmann, H J; de Arcangelis, L

    2014-03-13

    The human brain exhibits a complex structure made of scale-free highly connected modules loosely interconnected by weaker links to form a small-world network. These features appear in healthy patients whereas neurological diseases often modify this structure. An important open question concerns the role of brain modularity in sustaining the critical behaviour of spontaneous activity. Here we analyse the neuronal activity of a model, successful in reproducing on non-modular networks the scaling behaviour observed in experimental data, on a modular network implementing the main statistical features measured in human brain. We show that on a modular network, regardless the strength of the synaptic connections or the modular size and number, activity is never fully scale-free. Neuronal avalanches can invade different modules which results in an activity depression, hindering further avalanche propagation. Critical behaviour is solely recovered if inter-module connections are added, modifying the modular into a more random structure.

  20. Why Go Modular? A Review of Modular A-Level Mathematics.

    ERIC Educational Resources Information Center

    Taverner, Sally; Wright, Martin

    1997-01-01

    Attitudes, academic intentions, and attainment of students gaining a grade in A-level (Advanced level) mathematics were compared for those who followed a modular course and those assessed at the end of two years of study. Overall, the final grades of those assessed modularly were half a grade higher. (JOW)

  1. On Classification of Modular Categories by Rank: Table A.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruillard, Paul; Ng, Siu-Hung; Rowell, Eric C.

    2016-04-10

    The feasibility of a classification-by-rank program for modular categories follows from the Rank-Finiteness Theorem. We develop arithmetic, representation theoretic and algebraic methods for classifying modular categories by rank. As an application, we determine all possible fusion rules for all rank=5 modular categories and describe the corresponding monoidal equivalence classes.

  2. 46 CFR 181.450 - Independent modular smoke detecting units.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 7 2013-10-01 2013-10-01 false Independent modular smoke detecting units. 181.450... Independent modular smoke detecting units. (a) An independent modular smoke detecting unit must: (1) Meet UL 217 (incorporated by reference, see 46 CFR 175.600) and be listed as a “Single Station Smoke detector...

  3. 46 CFR 181.450 - Independent modular smoke detecting units.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 7 2012-10-01 2012-10-01 false Independent modular smoke detecting units. 181.450... Independent modular smoke detecting units. (a) An independent modular smoke detecting unit must: (1) Meet UL 217 (incorporated by reference, see 46 CFR 175.600) and be listed as a “Single Station Smoke detector...

  4. 46 CFR 181.450 - Independent modular smoke detecting units.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 7 2011-10-01 2011-10-01 false Independent modular smoke detecting units. 181.450... Independent modular smoke detecting units. (a) An independent modular smoke detecting unit must: (1) Meet UL 217 (incorporated by reference, see 46 CFR 175.600) and be listed as a “Single Station Smoke detector...

  5. 46 CFR 181.450 - Independent modular smoke detecting units.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 7 2010-10-01 2010-10-01 false Independent modular smoke detecting units. 181.450... Independent modular smoke detecting units. (a) An independent modular smoke detecting unit must: (1) Meet UL 217 (incorporated by reference, see 46 CFR 175.600) and be listed as a “Single Station Smoke detector...

  6. 46 CFR 181.450 - Independent modular smoke detecting units.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 7 2014-10-01 2014-10-01 false Independent modular smoke detecting units. 181.450... Independent modular smoke detecting units. (a) An independent modular smoke detecting unit must: (1) Meet UL 217 (incorporated by reference, see 46 CFR 175.600) and be listed as a “Single Station Smoke detector...

  7. BioSimplify: an open source sentence simplification engine to improve recall in automatic biomedical information extraction.

    PubMed

    Jonnalagadda, Siddhartha; Gonzalez, Graciela

    2010-11-13

    BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. This tool is optimized for processing biomedical scientific literature such as the abstracts indexed in PubMed. We tested our tool's impact on the task of protein-protein interaction (PPI) extraction: it improved the f-score of the PPI tool by around 7%, with an improvement in recall of around 20%. The BioSimplify tool and test corpus can be downloaded from https://biosimplify.sourceforge.net.
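
    As a toy illustration of the "shot-gun" idea only, the sketch below emits several simpler variants of a sentence by independently dropping optional constituents; BioSimplify itself operates on parsed constituent elements, so this regex-level example conveys the concept of variant generation rather than the tool's method.

      # Toy illustration of the "shot-gun" idea: generate several simpler variants of
      # a sentence by independently dropping optional constituents (parentheticals
      # and one comma-bounded aside). Not BioSimplify's parser-based method.
      import re
      from itertools import product

      def variants(sentence):
          drops = [
              lambda s: re.sub(r"\s*\([^)]*\)", "", s),   # drop parentheticals
              lambda s: re.sub(r",[^,]+,", ",", s),       # drop a comma-bounded aside
          ]
          out = set()
          for mask in product([False, True], repeat=len(drops)):
              s = sentence
              for apply_it, drop in zip(mask, drops):
                  if apply_it:
                      s = drop(s)
              out.add(re.sub(r"\s+", " ", s).strip())
          return out

      s = "The kinase, a member of the MAPK family, phosphorylates STAT3 (a transcription factor)."
      for v in variants(s):
          print(v)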

  8. Unraveling the disease consequences and mechanisms of modular structure in animal social networks

    PubMed Central

    Leu, Stephan T.; Cross, Paul C.; Hudson, Peter J.; Bansal, Shweta

    2017-01-01

    Disease risk is a potential cost of group living. Although modular organization is thought to reduce this cost in animal societies, empirical evidence toward this hypothesis has been conflicting. We analyzed empirical social networks from 43 animal species to motivate our study of the epidemiological consequences of modular structure in animal societies. From these empirical studies, we identified the features of interaction patterns associated with network modularity and developed a theoretical network model to investigate when and how subdivisions in social networks influence disease dynamics. Contrary to prior work, we found that disease risk is largely unaffected by modular structure, although social networks beyond a modular threshold experience smaller disease burden and longer disease duration. Our results illustrate that the lowering of disease burden in highly modular social networks is driven by two mechanisms of modular organization: network fragmentation and subgroup cohesion. Highly fragmented social networks with cohesive subgroups are able to structurally trap infections within a few subgroups and also cause a structural delay to the spread of disease outbreaks. Finally, we show that network models incorporating modular structure are necessary only when prior knowledge suggests that interactions within the population are highly subdivided. Otherwise, null networks based on basic knowledge about group size and local contact heterogeneity may be sufficient when data-limited estimates of epidemic consequences are necessary. Overall, our work does not support the hypothesis that modular structure universally mitigates the disease impact of group living. PMID:28373567

  9. Unraveling the disease consequences and mechanisms of modular structure in animal social networks

    USGS Publications Warehouse

    Sah, Pratha; Leu, Stephan T.; Cross, Paul C.; Hudson, Peter J.; Bansal, Shweta

    2017-01-01

    Disease risk is a potential cost of group living. Although modular organization is thought to reduce this cost in animal societies, empirical evidence toward this hypothesis has been conflicting. We analyzed empirical social networks from 43 animal species to motivate our study of the epidemiological consequences of modular structure in animal societies. From these empirical studies, we identified the features of interaction patterns associated with network modularity and developed a theoretical network model to investigate when and how subdivisions in social networks influence disease dynamics. Contrary to prior work, we found that disease risk is largely unaffected by modular structure, although social networks beyond a modular threshold experience smaller disease burden and longer disease duration. Our results illustrate that the lowering of disease burden in highly modular social networks is driven by two mechanisms of modular organization: network fragmentation and subgroup cohesion. Highly fragmented social networks with cohesive subgroups are able to structurally trap infections within a few subgroups and also cause a structural delay to the spread of disease outbreaks. Finally, we show that network models incorporating modular structure are necessary only when prior knowledge suggests that interactions within the population are highly subdivided. Otherwise, null networks based on basic knowledge about group size and local contact heterogeneity may be sufficient when data-limited estimates of epidemic consequences are necessary. Overall, our work does not support the hypothesis that modular structure universally mitigates the disease impact of group living.

  10. Unraveling the disease consequences and mechanisms of modular structure in animal social networks.

    PubMed

    Sah, Pratha; Leu, Stephan T; Cross, Paul C; Hudson, Peter J; Bansal, Shweta

    2017-04-18

    Disease risk is a potential cost of group living. Although modular organization is thought to reduce this cost in animal societies, empirical evidence toward this hypothesis has been conflicting. We analyzed empirical social networks from 43 animal species to motivate our study of the epidemiological consequences of modular structure in animal societies. From these empirical studies, we identified the features of interaction patterns associated with network modularity and developed a theoretical network model to investigate when and how subdivisions in social networks influence disease dynamics. Contrary to prior work, we found that disease risk is largely unaffected by modular structure, although social networks beyond a modular threshold experience smaller disease burden and longer disease duration. Our results illustrate that the lowering of disease burden in highly modular social networks is driven by two mechanisms of modular organization: network fragmentation and subgroup cohesion. Highly fragmented social networks with cohesive subgroups are able to structurally trap infections within a few subgroups and also cause a structural delay to the spread of disease outbreaks. Finally, we show that network models incorporating modular structure are necessary only when prior knowledge suggests that interactions within the population are highly subdivided. Otherwise, null networks based on basic knowledge about group size and local contact heterogeneity may be sufficient when data-limited estimates of epidemic consequences are necessary. Overall, our work does not support the hypothesis that modular structure universally mitigates the disease impact of group living.

  11. Modular architecture of protein structures and allosteric communications: potential implications for signaling proteins and regulatory linkages

    PubMed Central

    del Sol, Antonio; Araúzo-Bravo, Marcos J; Amoros, Dolors; Nussinov, Ruth

    2007-01-01

    Background: Allosteric communications are vital for cellular signaling. Here we explore a relationship between protein architectural organization and shortcuts in signaling pathways. Results: We show that protein domains consist of modules interconnected by residues that mediate signaling through the shortest pathways. These mediating residues tend to be located at the inter-modular boundaries, which are more rigid and display a larger number of long-range interactions than intra-modular regions. The inter-modular boundaries contain most of the residues centrally conserved in the protein fold, which may be crucial for information transfer between amino acids. Our approach to modular decomposition relies on a representation of protein structures as residue-interacting networks, and removal of the most central residue contacts, which are assumed to be crucial for allosteric communications. The modular decomposition of 100 multi-domain protein structures indicates that modules constitute the building blocks of domains. The analysis of 13 allosteric proteins revealed that modules characterize experimentally identified functional regions. Based on the study of an additional functionally annotated dataset of 115 proteins, we propose that high-modularity modules include functional sites and are the basic functional units. We provide examples (the Gαs subunit and P450 cytochromes) to illustrate that the modular architecture of active sites is linked to their functional specialization. Conclusion: Our method decomposes protein structures into modules, allowing the study of signal transmission between functional sites. A modular configuration might be advantageous: it allows signaling proteins to expand their regulatory linkages and may elicit a broader range of control mechanisms either via modular combinations or through modulation of inter-modular linkages. PMID:17531094
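
    A minimal sketch of the decomposition idea follows: residues become nodes, contacts become edges, and the most central contacts are removed until the network splits into modules. Edge betweenness stands in here for the paper's centrality measure, and the tiny contact graph is invented, so this is illustrative rather than a reproduction of the method.

    ```python
    import networkx as nx

    def decompose_into_modules(contact_graph, n_modules=2):
        """Remove the most central contacts until the network splits into modules."""
        G = contact_graph.copy()
        while nx.number_connected_components(G) < n_modules and G.number_of_edges() > 0:
            betweenness = nx.edge_betweenness_centrality(G)
            most_central = max(betweenness, key=betweenness.get)
            G.remove_edge(*most_central)          # drop the contact mediating most paths
        return list(nx.connected_components(G))

    # Toy "residue contact network": two dense clusters joined by a single contact.
    G = nx.Graph([(1, 2), (1, 3), (2, 3), (4, 5), (4, 6), (5, 6), (3, 4)])
    for module in decompose_into_modules(G):
        print(sorted(module))
    ```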

  12. MRMAide: a mixed resolution modeling aide

    NASA Astrophysics Data System (ADS)

    Treshansky, Allyn; McGraw, Robert M.

    2002-07-01

    The Mixed Resolution Modeling Aide (MRMAide) technology is an effort to semi-automate the implementation of Mixed Resolution Modeling (MRM). MRMAide suggests ways of resolving differences in fidelity and resolution across diverse modeling paradigms. The goal of MRMAide is to provide a technology that will allow developers to incorporate model components into scenarios other than those for which they were designed. Currently, MRM is implemented by hand. This is a tedious, error-prone, and non-portable process. MRMAide, in contrast, will automatically suggest to a developer where and how to connect different components and/or simulations. MRMAide has three phases of operation: pre-processing, data abstraction, and validation. During pre-processing the components to be linked together are evaluated in order to identify appropriate mapping points. During data abstraction those mapping points are linked via data abstraction algorithms. During validation developers receive feedback regarding their newly created models relative to existing baselined models. The current work presents an overview of the various problems encountered during MRM and the various technologies utilized by MRMAide to overcome those problems.

  13. Abstracting of suspected illegal land use in urban areas using case-based classification of remote sensing images

    NASA Astrophysics Data System (ADS)

    Chen, Fulong; Wang, Chao; Yang, Chengyun; Zhang, Hong; Wu, Fan; Lin, Wenjuan; Zhang, Bo

    2008-11-01

    This paper proposes a method that uses case-based classification of remote sensing images and applies it to abstract information on suspected illegal land use in urban areas. Because the cases used for classification are discrete, the proposed method copes with the oscillation of spectrum or backscatter within the same land-use category; it not only overcomes a deficiency of maximum likelihood classification (that the prior probability of land use cannot be obtained) but also inherits the advantages of knowledge-based classification systems, such as artificial intelligence and automatic operation, and therefore classifies more accurately. An object-oriented technique is then used for shadow removal in densely built city zones. Using multi-temporal SPOT 5 images with a resolution of 2.5×2.5 meters, the authors show that the method can abstract suspected illegal land-use information in urban areas through a post-classification comparison technique.

  14. Modular assembly for supporting, straining, and directing flow to a core in a nuclear reactor

    DOEpatents

    Pennell, William E.

    1977-01-01

    A reactor core support arrangement for supporting, straining, and providing fluid flow to the core and periphery of a nuclear reactor during normal operation. A plurality of removable inlet modular units are contained within permanent liners in the lower supporting plate of the reactor vessel lower internals. During normal operation (1) each inlet modular unit directs main coolant flow to a plurality of core assemblies, the latter being removably supported in receptacles in the upper portion of the modular unit and (2) each inlet modular unit may direct bypass flow to a low pressure annular region of the reactor vessel. Each inlet modular unit may include special fluid seals interposed between mating surfaces of the inlet modular units and the core assemblies and between the inlet modular units and the liners, to minimize leakage and achieve an hydraulic balance. Utilizing the hydraulic balance, the modular units are held in the liners and the assemblies are held in the modular unit receptacles by their own respective weight. Included as part of the permanent liners below the horizontal support plate are generally hexagonal axial debris barriers. The axial debris barriers collectively form a bottom boundary of a secondary high pressure plenum, the upper boundary of which is the bottom surface of the horizontal support plate. Peripheral liners include radial debris barriers which collectively form a barrier against debris entry radially. During normal operation primary coolant inlet openings in the liner, below the axial debris barriers, pass a large amount of coolant into the inlet modular units, and secondary coolant inlet openings in the portion of the liners within the secondary plenum pass a small amount of coolant into the inlet modular units. The secondary coolant inlet openings also provide alternative coolant inlet flow paths in the unlikely event of blockage of the primary inlet openings. The primary inlet openings have characteristics which limit the entry of debris and minimize the potential for debris entering the primary inlets blocking the secondary inlets from inside the modular unit.

  15. MeSH indexing based on automatically generated summaries.

    PubMed

    Jimeno-Yepes, Antonio J; Plaza, Laura; Mork, James G; Aronson, Alan R; Díaz, Alberto

    2013-06-26

    MEDLINE citations are manually indexed at the U.S. National Library of Medicine (NLM) using as reference the Medical Subject Headings (MeSH) controlled vocabulary. For this task, the human indexers read the full text of the article. Due to the growth of MEDLINE, the NLM Indexing Initiative explores indexing methodologies that can support the task of the indexers. Medical Text Indexer (MTI) is a tool developed by the NLM Indexing Initiative to provide MeSH indexing recommendations to indexers. Currently, the input to MTI is MEDLINE citations, title and abstract only. Previous work has shown that using full text as input to MTI increases recall, but decreases precision sharply. We propose using summaries generated automatically from the full text for the input to MTI to use in the task of suggesting MeSH headings to indexers. Summaries distill the most salient information from the full text, which might increase the coverage of automatic indexing approaches based on MEDLINE. We hypothesize that if the results were good enough, manual indexers could possibly use automatic summaries instead of the full texts, along with the recommendations of MTI, to speed up the process while maintaining high quality of indexing results. We have generated summaries of different lengths using two different summarizers, and evaluated the MTI indexing on the summaries using different algorithms: MTI, individual MTI components, and machine learning. The results are compared to those of full text articles and MEDLINE citations. Our results show that automatically generated summaries achieve similar recall but higher precision compared to full text articles. Compared to MEDLINE citations, summaries achieve higher recall but lower precision. Our results show that automatic summaries produce better indexing than full text articles. Summaries produce similar recall to full text but much better precision, which seems to indicate that automatic summaries can efficiently capture the most important contents within the original articles. The combination of MEDLINE citations and automatically generated summaries could improve the recommendations suggested by MTI. On the other hand, indexing performance might be dependent on the MeSH heading being indexed. Summarization techniques could thus be considered as a feature selection algorithm that might have to be tuned individually for each MeSH heading.
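
    The pipeline idea can be sketched as follows: build a short extractive summary of the full text and hand that, rather than the full text, to the indexing tool. The frequency-based summarizer below is an assumption for illustration, not one of the summarizers evaluated in the study, and run_mti() is a hypothetical placeholder for the indexer.

    ```python
    import re
    from collections import Counter

    def extractive_summary(full_text, n_sentences=2):
        """Score sentences by average word frequency and keep the top few, in order."""
        sentences = re.split(r"(?<=[.!?])\s+", full_text.strip())
        freq = Counter(re.findall(r"[a-z]+", full_text.lower()))
        def score(s):
            tokens = re.findall(r"[a-z]+", s.lower())
            return sum(freq[t] for t in tokens) / (len(tokens) or 1)
        top = set(sorted(sentences, key=score, reverse=True)[:n_sentences])
        return " ".join(s for s in sentences if s in top)

    full_text = ("Cardiac remodeling involves fibrosis of the ventricular wall. "
                 "We measured collagen deposition in 40 patients. "
                 "Fibrosis correlated with reduced ejection fraction.")
    summary = extractive_summary(full_text)
    print(summary)
    # recommendations = run_mti(summary)   # hypothetical call to the indexing tool
    ```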

  16. Developing integrated workflows for the digitisation of herbarium specimens using a modular and scalable approach

    PubMed Central

    Haston, Elspeth; Cubey, Robert; Pullan, Martin; Atkins, Hannah; Harris, David J

    2012-01-01

    Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors have influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable to enable a single overall workflow to be used for all digitisation projects. This integrated workflow comprises three principal elements: a specimen workflow, a data workflow and an image workflow. The specimen workflow is strongly linked to curatorial processes which will impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. It is shown that each category of data has its own properties which influence the timing of data capture within the workflow. Development of software has been carried out for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images have necessitated the inclusion of automated systems within the image workflow. PMID:22859881

  17. Patient Education Self-Management During Surgical Recovery: Combining Mobile (iPad) and a Content Management System

    PubMed Central

    Moradkhani, Anilga; Douglas, Kristin S. Vickers; Prinsen, Sharon K.; Fischer, Erin N.; Schroeder, Darrell R.

    2014-01-01

    Objective: The objective of this investigation was to assess whether a new electronic health (e-health) platform, combining mobile computing and a content management system, could effectively deliver modular and “just-in-time” education to older patients following cardiac surgery. Subjects and Methods: Patients were provided with iPad® (Apple®, Cupertino, CA) tablets that delivered educational modules as part of a daily “to do” list in a plan of care. The tablet communicated wirelessly to a dashboard where data were aggregated and displayed for providers. Results: A surgical population of 149 patients with a mean age of 68 years utilized 5,267 of 6,295 (84%) education modules delivered over a 5.3-day hospitalization. Increased age was not associated with decreased use. Conclusions: We demonstrate that age, hospitalization, and major surgery are not significant barriers to effective patient education if content is highly consumable and relevant to patients' daily care experience. We also show that mobile technology, even if unfamiliar to many older patients, makes this possible. The combination of mobile computing with a content management system allows for dynamic, modular, personalized, and “just-in-time” education in a highly consumable format. This approach presents a means by which patients may become informed participants in new healthcare models. PMID:24443928

  18. Automatic reconstruction of a bacterial regulatory network using Natural Language Processing

    PubMed Central

    Rodríguez-Penagos, Carlos; Salgado, Heladia; Martínez-Flores, Irma; Collado-Vides, Julio

    2007-01-01

    Background: Manual curation of biological databases, an expensive and labor-intensive process, is essential for high quality integrated data. In this paper we report the implementation of a state-of-the-art Natural Language Processing system that creates computer-readable networks of regulatory interactions directly from different collections of abstracts and full-text papers. Our major aim is to understand how automatic annotation using Text-Mining techniques can complement manual curation of biological databases. We implemented a rule-based system to generate networks from different sets of documents dealing with regulation in Escherichia coli K-12. Results: Performance evaluation is based on the most comprehensive transcriptional regulation database for any organism, the manually-curated RegulonDB, 45% of which we were able to recreate automatically. From our automated analysis we were also able to find some new interactions from papers not already curated, or that were missed in the manual filtering and review of the literature. We also put forward a novel Regulatory Interaction Markup Language better suited than SBML for simultaneously representing data of interest for biologists and text miners. Conclusion: Manual curation of the output of automatic processing of text is a good way to complement a more detailed review of the literature, either for validating the results of what has been already annotated, or for discovering facts and information that might have been overlooked at the triage or curation stages. PMID:17683642
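
    A toy version of rule-based interaction extraction is sketched below: a single pattern of the form "<regulator> activates/represses/regulates <target>" applied to a sentence. The real system uses a far richer grammar, entity dictionaries, and manual curation; the pattern and the example sentence here are illustrative assumptions.

    ```python
    import re

    # One illustrative pattern: "<regulator> activates|represses|regulates <target>".
    PATTERN = re.compile(r"\b([A-Za-z][\w-]*)\s+(activates|represses|regulates)\s+([A-Za-z][\w-]*)")

    def extract_interactions(sentence):
        """Return (regulator, verb, target) triples matched by the pattern."""
        return PATTERN.findall(sentence)

    text = "CRP activates araBAD, whereas LacI represses lacZYA under anaerobic conditions."
    print(extract_interactions(text))
    ```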

  19. Portable modular detection system

    DOEpatents

    Brennan, James S [Rodeo, CA; Singh, Anup [Danville, CA; Throckmorton, Daniel J [Tracy, CA; Stamps, James F [Livermore, CA

    2009-10-13

    Disclosed herein are portable and modular detection devices and systems for detecting electromagnetic radiation, such as fluorescence, from an analyte which comprises at least one optical element removably attached to at least one alignment rail. Also disclosed are modular detection devices and systems having an integrated lock-in amplifier and spatial filter and assay methods using the portable and modular detection devices.

  20. Modularity-like objective function in annotated networks

    NASA Astrophysics Data System (ADS)

    Xie, Jia-Rong; Wang, Bing-Hong

    2017-12-01

    We ascertain the modularity-like objective function whose optimization is equivalent to the maximum likelihood in annotated networks. We demonstrate that the modularity-like objective function is a linear combination of modularity and conditional entropy. In contrast with statistical inference methods, in our method, the influence of the metadata is adjustable; when its influence is strong enough, the metadata can be recovered. Conversely, when it is weak, the detection may correspond to another partition. Between the two, there is a transition. This paper provides a concept for expanding the scope of modularity methods.
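
    The sketch below illustrates one way such an objective can be assembled: Newman modularity of a partition combined with the conditional entropy of node metadata given that partition, with an adjustable weight controlling the metadata's influence. The sign convention, the weight, and the example partition are assumptions chosen for illustration, not the exact objective derived in the paper.

    ```python
    import math
    from collections import Counter
    import networkx as nx
    from networkx.algorithms.community import modularity

    def conditional_entropy(metadata, partition):
        """H(metadata | community) in nats, averaged over all nodes."""
        n = sum(len(c) for c in partition)
        H = 0.0
        for community in partition:
            labels = Counter(metadata[v] for v in community)
            p_c = len(community) / n
            for count in labels.values():
                p = count / len(community)
                H -= p_c * p * math.log(p)
        return H

    def objective(G, partition, metadata, weight=1.0):
        """Reward modularity, penalize metadata uncertainty given the partition."""
        return modularity(G, partition) - weight * conditional_entropy(metadata, partition)

    G = nx.karate_club_graph()
    metadata = {v: G.nodes[v]["club"] for v in G}        # node metadata ("club")
    partition = [set(range(0, 17)), set(range(17, 34))]  # an example two-way split
    print(objective(G, partition, metadata))
    ```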

  1. Modular organization and hospital performance.

    PubMed

    Kuntz, Ludwig; Vera, Antonio

    2007-02-01

    The concept of modularization represents a modern form of organization, which involves the vertical disaggregation of the firm and the use of market mechanisms within hierarchies. The objective of this paper is to examine whether the use of modular structures has a positive effect on hospital performance. The empirical section makes use of multiple regression analyses and leads to the main result that modularization does not have a positive effect on hospital performance. However, the analysis also reveals positive efficiency effects of two central ideas of modularization, namely process orientation and internal market mechanisms.

  2. Modular analysis of biological networks.

    PubMed

    Kaltenbach, Hans-Michael; Stelling, Jörg

    2012-01-01

    The analysis of complex biological networks has traditionally relied on decomposition into smaller, semi-autonomous units such as individual signaling pathways. With the increased scope of systems biology (models), rational approaches to modularization have become an important topic. With increasing acceptance of de facto modularity in biology, widely different definitions of what constitutes a module have sparked controversies. Here, we therefore review prominent classes of modular approaches based on formal network representations. Despite some promising research directions, several important theoretical challenges remain open on the way to formal, function-centered modular decompositions for dynamic biological networks.

  3. Full characterization of modular values for finite-dimensional systems

    NASA Astrophysics Data System (ADS)

    Ho, Le Bin; Imoto, Nobuyuki

    2016-06-01

    Kedem and Vaidman obtained a relationship between the spin-operator modular value and its weak value for specific coupling strengths [14]. Here we give a general expression for the modular value in the n-dimensional Hilbert space using the weak values up to the (n - 1)th order of an arbitrary observable for any coupling strength, assuming non-degenerate eigenvalues. For the two-dimensional case, this reduces to a linear relationship between the weak value and the modular value. We also relate the modular value of the sum of observables to the weak value of their product.
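
    For orientation, the standard definitions behind this relationship can be written as follows (notation assumed); in two dimensions the Cayley-Hamilton theorem lets the exponential of the observable be expressed linearly in the observable itself, which yields the linear relation mentioned above.

    ```latex
    % Weak value and modular value of an observable A for pre-selected |psi>,
    % post-selected <phi|, and coupling strength g (standard definitions, assumed):
    \[
      A_w = \frac{\langle\phi|A|\psi\rangle}{\langle\phi|\psi\rangle},
      \qquad
      A_m = \frac{\langle\phi|\,e^{i g A}\,|\psi\rangle}{\langle\phi|\psi\rangle}.
    \]
    % In two dimensions, Cayley-Hamilton gives e^{i g A} = \alpha(g) I + \beta(g) A, hence
    \[
      A_m = \alpha(g) + \beta(g)\, A_w,
      \qquad\text{e.g. for } A^2 = \mathbb{1}:\quad A_m = \cos g + i \sin g\, A_w .
    \]
    ```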

  4. On the role of sparseness in the evolution of modularity in gene regulatory networks

    PubMed Central

    2018-01-01

    Modularity is a widespread property in biological systems. It implies that interactions occur mainly within groups of system elements. A modular arrangement facilitates adjustment of one module without perturbing the rest of the system. Therefore, modularity of developmental mechanisms is a major factor for evolvability, the potential to produce beneficial variation from random genetic change. Understanding how modularity evolves in gene regulatory networks, that create the distinct gene activity patterns that characterize different parts of an organism, is key to developmental and evolutionary biology. One hypothesis for the evolution of modules suggests that interactions between some sets of genes become maladaptive when selection favours additional gene activity patterns. The removal of such interactions by selection would result in the formation of modules. A second hypothesis suggests that modularity evolves in response to sparseness, the scarcity of interactions within a system. Here I simulate the evolution of gene regulatory networks and analyse diverse experimentally sustained networks to study the relationship between sparseness and modularity. My results suggest that sparseness alone is neither sufficient nor necessary to explain modularity in gene regulatory networks. However, sparseness amplifies the effects of forms of selection that, like selection for additional gene activity patterns, already produce an increase in modularity. That evolution of new gene activity patterns is frequent across evolution also supports that it is a major factor in the evolution of modularity. That sparseness is widespread across gene regulatory networks indicates that it may have facilitated the evolution of modules in a wide variety of cases. PMID:29775459

  5. On the role of sparseness in the evolution of modularity in gene regulatory networks.

    PubMed

    Espinosa-Soto, Carlos

    2018-05-01

    Modularity is a widespread property in biological systems. It implies that interactions occur mainly within groups of system elements. A modular arrangement facilitates adjustment of one module without perturbing the rest of the system. Therefore, modularity of developmental mechanisms is a major factor for evolvability, the potential to produce beneficial variation from random genetic change. Understanding how modularity evolves in gene regulatory networks, that create the distinct gene activity patterns that characterize different parts of an organism, is key to developmental and evolutionary biology. One hypothesis for the evolution of modules suggests that interactions between some sets of genes become maladaptive when selection favours additional gene activity patterns. The removal of such interactions by selection would result in the formation of modules. A second hypothesis suggests that modularity evolves in response to sparseness, the scarcity of interactions within a system. Here I simulate the evolution of gene regulatory networks and analyse diverse experimentally sustained networks to study the relationship between sparseness and modularity. My results suggest that sparseness alone is neither sufficient nor necessary to explain modularity in gene regulatory networks. However, sparseness amplifies the effects of forms of selection that, like selection for additional gene activity patterns, already produce an increase in modularity. That evolution of new gene activity patterns is frequent across evolution also supports that it is a major factor in the evolution of modularity. That sparseness is widespread across gene regulatory networks indicates that it may have facilitated the evolution of modules in a wide variety of cases.
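
    As a loose illustration of how sparseness relates to measured modularity, the sketch below compares greedy modularity scores of dense versus sparse random interaction networks. It is not the evolutionary simulation described above; the network model, sizes, and wiring probabilities are assumptions.

    ```python
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities, modularity

    def random_interaction_network(n_genes=40, wiring_prob=0.05, seed=1):
        """Random undirected network standing in for a gene regulatory network."""
        G = nx.gnp_random_graph(n_genes, wiring_prob, seed=seed)
        G.remove_nodes_from(list(nx.isolates(G)))     # drop unconnected genes
        return G

    for p in (0.30, 0.05):                            # dense vs. sparse wiring (assumed)
        G = random_interaction_network(wiring_prob=p)
        if G.number_of_edges() == 0:
            continue
        communities = greedy_modularity_communities(G)
        print(f"wiring_prob={p:.2f}  modularity={modularity(G, communities):.3f}")
    ```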

  6. Anatomical Network Analysis Shows Decoupling of Modular Lability and Complexity in the Evolution of the Primate Skull

    PubMed Central

    Esteve-Altava, Borja; Boughner, Julia C.; Diogo, Rui; Villmoare, Brian A.; Rasskin-Gutman, Diego

    2015-01-01

    Modularity and complexity go hand in hand in the evolution of the skull of primates. Because analyses of these two parameters often use different approaches, we do not know yet how modularity evolves within, or as a consequence of, an also-evolving complex organization. Here we use a novel network theory-based approach (Anatomical Network Analysis) to assess how the organization of skull bones constrains the co-evolution of modularity and complexity among primates. We used the pattern of bone contacts modeled as networks to identify connectivity modules and quantify morphological complexity. We analyzed whether modularity and complexity evolved coordinately in the skull of primates. Specifically, we tested Herbert Simon’s general theory of near-decomposability, which states that modularity promotes the evolution of complexity. We found that the skulls of extant primates divide into one conserved cranial module and up to three labile facial modules, whose composition varies among primates. Despite changes in modularity, statistical analyses reject a positive feedback between modularity and complexity. Our results suggest a decoupling of complexity and modularity that translates to varying levels of constraint on the morphological evolvability of the primate skull. This study has methodological and conceptual implications for grasping the constraints that underlie the developmental and functional integration of the skull of humans and other primates. PMID:25992690

  7. Irrelevance Reasoning in Knowledge Based Systems

    NASA Technical Reports Server (NTRS)

    Levy, A. Y.

    1993-01-01

    This dissertation considers the problem of reasoning about irrelevance of knowledge in a principled and efficient manner. Specifically, it is concerned with two key problems: (1) developing algorithms for automatically deciding what parts of a knowledge base are irrelevant to a query and (2) the utility of relevance reasoning. The dissertation describes a novel tool, the query-tree, for reasoning about irrelevance. Based on the query-tree, we develop several algorithms for deciding what formulas are irrelevant to a query. Our general framework sheds new light on the problem of detecting independence of queries from updates. We present new results that significantly extend previous work in this area. The framework also provides a setting in which to investigate the connection between the notion of irrelevance and the creation of abstractions. We propose a new approach to research on reasoning with abstractions, in which we investigate the properties of an abstraction by considering the irrelevance claims on which it is based. We demonstrate the potential of the approach for the cases of abstraction of predicates and projection of predicate arguments. Finally, we describe an application of relevance reasoning to the domain of modeling physical devices.

  8. Retrofitting the AutoBayes Program Synthesis System with Concrete Syntax

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd; Visser, Eelco

    2004-01-01

    AutoBayes is a fully automatic, schema-based program synthesis system for statistical data analysis applications. Its core component is a schema library, i.e., a collection of generic code templates with associated applicability constraints, which are instantiated in a problem-specific way during synthesis. Currently, AutoBayes is implemented in Prolog; the schemas thus use abstract syntax (i.e., Prolog terms) to formulate the templates. However, the conceptual distance between this abstract representation and the concrete syntax of the generated programs makes the schemas hard to create and maintain. In this paper we describe how AutoBayes is retrofitted with concrete syntax. We show how it is integrated into Prolog and describe how the seamless interaction of concrete syntax fragments with AutoBayes's remaining legacy meta-programming kernel based on abstract syntax is achieved. We apply the approach to gradually migrate individual schemas without forcing a disruptive migration of the entire system to a different meta-programming system. First experiences show that a smooth migration can be achieved. Moreover, it can result in a considerable reduction of the code size and improved readability of the code. In particular, abstracting out fresh-variable generation and second-order term construction allows the formulation of larger continuous fragments.

  9. Splatterplots: overcoming overdraw in scatter plots.

    PubMed

    Mayorga, Adrian; Gleicher, Michael

    2013-09-01

    We introduce Splatterplots, a novel presentation of scattered data that enables visualizations that scale beyond standard scatter plots. Traditional scatter plots suffer from overdraw (overlapping glyphs) as the number of points per unit area increases. Overdraw obscures outliers, hides data distributions, and makes the relationship among subgroups of the data difficult to discern. To address these issues, Splatterplots abstract away information such that the density of data shown in any unit of screen space is bounded, while allowing continuous zoom to reveal abstracted details. Abstraction automatically groups dense data points into contours and samples remaining points. We combine techniques for abstraction with perceptually based color blending to reveal the relationship between data subgroups. The resulting visualizations represent the dense regions of each subgroup of the data set as smooth closed shapes and show representative outliers explicitly. We present techniques that leverage the GPU for Splatterplot computation and rendering, enabling interaction with massive data sets. We show how Splatterplots can be an effective alternative to traditional methods of displaying scatter data, communicating data trends, outliers, and data set relationships much like traditional scatter plots, but scaling to data sets of higher density and up to millions of points on the screen.

  10. Splatterplots: Overcoming Overdraw in Scatter Plots

    PubMed Central

    Mayorga, Adrian; Gleicher, Michael

    2014-01-01

    We introduce Splatterplots, a novel presentation of scattered data that enables visualizations that scale beyond standard scatter plots. Traditional scatter plots suffer from overdraw (overlapping glyphs) as the number of points per unit area increases. Overdraw obscures outliers, hides data distributions, and makes the relationship among subgroups of the data difficult to discern. To address these issues, Splatterplots abstract away information such that the density of data shown in any unit of screen space is bounded, while allowing continuous zoom to reveal abstracted details. Abstraction automatically groups dense data points into contours and samples remaining points. We combine techniques for abstraction with perceptually based color blending to reveal the relationship between data subgroups. The resulting visualizations represent the dense regions of each subgroup of the dataset as smooth closed shapes and show representative outliers explicitly. We present techniques that leverage the GPU for Splatterplot computation and rendering, enabling interaction with massive data sets. We show how Splatterplots can be an effective alternative to traditional methods of displaying scatter data, communicating data trends, outliers, and data set relationships much like traditional scatter plots, but scaling to data sets of higher density and up to millions of points on the screen. PMID:23846097

  11. Splatterplots: Overcoming Overdraw in Scatter Plots.

    PubMed

    Mayorga, Adrian; Gleicher, Michael

    2013-03-20

    We introduce Splatterplots, a novel presentation of scattered data that enables visualizations that scale beyond standard scatter plots. Traditional scatter plots suffer from overdraw (overlapping glyphs) as the number of points per unit area increases. Overdraw obscures outliers, hides data distributions, and makes the relationship among subgroups of the data difficult to discern. To address these issues, Splatterplots abstract away information such that the density of data shown in any unit of screen space is bounded, while allowing continuous zoom to reveal abstracted details. Abstraction automatically groups dense data points into contours and samples remaining points. We combine techniques for abstraction with perceptually based color blending to reveal the relationship between data subgroups. The resulting visualizations represent the dense regions of each subgroup of the dataset as smooth closed shapes and show representative outliers explicitly. We present techniques that leverage the GPU for Splatterplot computation and rendering, enabling interaction with massive data sets. We show how Splatterplots can be an effective alternative to traditional methods of displaying scatter data, communicating data trends, outliers, and data set relationships much like traditional scatter plots, but scaling to data sets of higher density and up to millions of points on the screen.
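
    A bare-bones sketch of the density-bounding idea follows: dense regions are shown as filled density contours and only the points falling below a density threshold are drawn individually. A coarse 2D histogram stands in for the paper's kernel density estimate, and there is no perceptual color blending or GPU rendering, so this is an analogy rather than an implementation of Splatterplots.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    pts = np.vstack([rng.normal(0, 1, (5000, 2)), rng.normal(4, 1, (5000, 2))])

    # Coarse per-bin density; bin count and threshold are assumed values.
    H, xedges, yedges = np.histogram2d(pts[:, 0], pts[:, 1], bins=60)
    xi = np.clip(np.searchsorted(xedges, pts[:, 0]) - 1, 0, H.shape[0] - 1)
    yi = np.clip(np.searchsorted(yedges, pts[:, 1]) - 1, 0, H.shape[1] - 1)
    density = H[xi, yi]

    threshold = 15                                    # density bound per bin
    outliers = pts[density < threshold]               # only these are drawn as points

    plt.contourf(H.T, levels=[threshold, max(H.max(), threshold) + 1],
                 extent=[xedges[0], xedges[-1], yedges[0], yedges[-1]], alpha=0.4)
    plt.scatter(outliers[:, 0], outliers[:, 1], s=4)
    plt.show()
    ```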

  12. A Simple and Practical Dictionary-based Approach for Identification of Proteins in Medline Abstracts

    PubMed Central

    Egorov, Sergei; Yuryev, Anton; Daraselia, Nikolai

    2004-01-01

    Objective: The aim of this study was to develop a practical and efficient protein identification system for biomedical corpora. Design: The developed system, called ProtScan, utilizes a carefully constructed dictionary of mammalian proteins in conjunction with a specialized tokenization algorithm to identify and tag protein name occurrences in biomedical texts and also takes advantage of Medline “Name-of-Substance” (NOS) annotation. The dictionaries for ProtScan were constructed in a semi-automatic way from various public-domain sequence databases followed by an intensive expert curation step. Measurements: The recall and precision of the system have been determined using 1,000 randomly selected and hand-tagged Medline abstracts. Results: The developed system is capable of identifying protein occurrences in Medline abstracts with a 98% precision and 88% recall. It was also found to be capable of processing approximately 300 abstracts per second. Without utilization of NOS annotation, precision and recall were found to be 98.5% and 84%, respectively. Conclusion: The developed system appears to be well suited for protein-based Medline indexing and can help to improve biomedical information retrieval. Further approaches to ProtScan's recall improvement also are discussed. PMID:14764613
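
    The core matching step can be sketched as a greedy longest-match lookup of token sequences against a protein-name dictionary. The three-entry dictionary and the example sentence below are hypothetical; the actual system adds a curated mammalian-protein dictionary, a specialized tokenization algorithm, and Name-of-Substance annotation.

    ```python
    import re

    # Hypothetical three-entry dictionary; the real one holds curated mammalian proteins.
    PROTEIN_DICT = {("p53",), ("tumor", "necrosis", "factor"), ("interleukin", "6")}
    MAX_LEN = max(len(name) for name in PROTEIN_DICT)

    def tag_proteins(text):
        """Greedy longest-match of dictionary entries against lowercased tokens."""
        tokens = re.findall(r"[a-z0-9]+", text.lower())
        hits, i = [], 0
        while i < len(tokens):
            for span in range(min(MAX_LEN, len(tokens) - i), 0, -1):  # longest first
                if tuple(tokens[i:i + span]) in PROTEIN_DICT:
                    hits.append(" ".join(tokens[i:i + span]))
                    i += span
                    break
            else:
                i += 1
        return hits

    print(tag_proteins("p53 suppresses tumor necrosis factor signalling via interleukin-6."))
    ```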

  13. Modular Mayhem? A Case Study of the Development of the A-Level Science Curriculum in England

    ERIC Educational Resources Information Center

    Hayward, Geoff; McNicholl, Jane

    2007-01-01

    This article investigates the costs and benefits of the increased use of modular or unitized qualification designs through a case study of the GCE A-level science curriculum in England. Following a brief review of the development of modular A-levels, the various proposed advantages of modularity--short-term goals and regular feedback, flexibility…

  14. Simulation of value stream mapping and discrete optimization of energy consumption in modular construction

    NASA Astrophysics Data System (ADS)

    Chowdhury, Md Mukul

    With the increased practice of modularization and prefabrication, the construction industry has gained the benefits of quality management, improved completion time, reduced site disruption and vehicular traffic, and improved overall safety and security. Whereas industrialized construction methods, such as modular and manufactured buildings, have evolved over decades, core techniques used in prefabrication plants vary only slightly from those employed in traditional site-built construction. With a focus on energy- and cost-efficient modular construction, this research presents the development of a simulation, measurement and optimization system for energy consumption in the manufacturing process of modular construction. The system is based on Lean Six Sigma principles and loosely coupled system operation to identify non-value-adding tasks and possible causes of low energy efficiency. The proposed system also includes visualization functions to demonstrate energy consumption in modular construction. The benefits of implementing this system include reduced energy consumption and production cost, lower energy cost in lean modular construction, and increased profit. In addition, the visualization functions provide detailed information about energy efficiency and operational flexibility in modular construction. A case study is presented to validate the reliability of the system.

  15. Framework for Defining and Assessing Benefits of a Modular Assembly Design Approach for Exploration Systems

    NASA Technical Reports Server (NTRS)

    Dorsey, John T.; Collins, Timothy J.; Moe, Rud V.; Doggett,. William R.

    2006-01-01

    A comprehensive modular assembly system model has been proposed that extends the art from modular hardware to include in-space assembly, servicing, and repair and its critical components of infrastructure, agents, and assembly operations. Benefits of modular assembly have been identified and a set of metrics defined that extends the art beyond the traditional measures of performance, with emphasis on criteria that allow life-cycle mission costs to be used as a figure of merit (and include all substantive terms that have an impact on the evaluation). The modular assembly approach was used as a basis for developing a Solar Electric Transfer Vehicle (SETV) concept, and three modular assembly scenarios were developed. The modular assembly approach also allows the SETV to be entered into service much earlier than competing conventional configurations and results in a great deal of versatility in accommodating different launch vehicle payload capabilities, allowing for modules to be pre-assembled before launch or assembled on orbit, without changing the space vehicle design.

  16. Modular Knowledge Representation and Reasoning in the Semantic Web

    NASA Astrophysics Data System (ADS)

    Serafini, Luciano; Homola, Martin

    Construction of modular ontologies by combining different modules is becoming a necessity in ontology engineering in order to cope with the increasing complexity of the ontologies and the domains they represent. The modular ontology approach takes inspiration from software engineering, where modularization is a widely acknowledged feature. Distributed reasoning is the other side of the coin of modular ontologies: given an ontology comprising a set of modules, it is desired to perform reasoning by combining multiple reasoning processes performed locally on each of the modules. In the last ten years, a number of approaches for combining logics have been developed in order to formalize modular ontologies. In this chapter, we survey and compare the main formalisms for modular ontologies and distributed reasoning in the Semantic Web. We select four formalisms built on the formal logical grounds of Description Logics: Distributed Description Logics, ℰ-connections, Package-based Description Logics and Integrated Distributed Description Logics. We concentrate on expressivity and distinctive modeling features of each framework. We also discuss the reasoning capabilities of each framework.

  17. Classification of functional interactions from multi-electrodes data using conditional modularity analysis

    NASA Astrophysics Data System (ADS)

    Makhtar, Siti Noormiza; Senik, Mohd Harizal

    2018-02-01

    The availability of massive amounts of neuronal signals is attracting widespread interest in functional connectivity analysis. In this study, functional interactions estimated by multivariate partial coherence analysis in the frequency domain represent the connectivity strength. Modularity is a network measure for the detection of community structure in network analysis. The discovery of community structure for the functional neuronal network was implemented on multi-electrode array (MEA) signals recorded from hippocampal regions in isoflurane-anaesthetized Lister-hooded rats. The analysis is expected to show modularity changes before and after local unilateral kainic acid (KA)-induced epileptiform activity. The result is presented as a color-coded graphic of the conditional modularity measure for 19 MEA nodes. This network is separated into four sub-regions to show the community detection within each sub-region. The results show that classification of neuronal signals into inter- and intra-modular nodes is feasible using conditional modularity analysis. Estimation of segregation properties using conditional modularity analysis may provide further information about functional connectivity from MEA data.

  18. A Formal Theory for Modular ERDF Ontologies

    NASA Astrophysics Data System (ADS)

    Analyti, Anastasia; Antoniou, Grigoris; Damásio, Carlos Viegas

    The success of the Semantic Web is impossible without any form of modularity, encapsulation, and access control. In an earlier paper, we extended RDF graphs with weak and strong negation, as well as derivation rules. The ERDF #n-stable model semantics of the extended RDF framework (ERDF) is defined, extending RDF(S) semantics. In this paper, we propose a framework for modular ERDF ontologies, called modular ERDF framework, which enables collaborative reasoning over a set of ERDF ontologies, while support for hidden knowledge is also provided. In particular, the modular ERDF stable model semantics of modular ERDF ontologies is defined, extending the ERDF #n-stable model semantics. Our proposed framework supports local semantics and different points of view, local closed-world and open-world assumptions, and scoped negation-as-failure. Several complexity results are provided.

  19. Future Concepts for Modular, Intelligent Aerospace Power Systems

    NASA Technical Reports Server (NTRS)

    Button, Robert M.; Soeder, James F.

    2004-01-01

    NASA's recent commitment to Human and Robotic Space Exploration underscores the need for more affordable and sustainable systems and missions. Increased use of modularity and on-board intelligent technologies will enable these lofty goals. To support this new paradigm, an advanced technology program to develop modular, intelligent power management and distribution (PMAD) system technologies is presented. The many benefits of developing and including modular functionality in electrical power components and systems are shown to include lower costs and lower mass for highly reliable systems. The details of several modular technologies being developed by NASA are presented, broken down into hierarchical levels. Modularity at the device level, including the use of power electronic building blocks, is shown to provide benefits in lowering the development time and costs of new power electronic components.

  20. Modular multiplication in GF(p) for public-key cryptography

    NASA Astrophysics Data System (ADS)

    Olszyna, Jakub

    Modular multiplication forms the basis of modular exponentiation, which is the core operation of the RSA cryptosystem. It is also present in many other cryptographic algorithms, including those based on ECC and HECC. Hence, an efficient implementation of PKC relies on an efficient implementation of modular multiplication. The paper presents a survey of the most common algorithms for modular multiplication along with hardware architectures especially suitable for cryptographic applications in energy-constrained environments. The motivation for studying low-power and area-efficient modular multiplication algorithms comes from enabling public-key security for ultra-low-power devices that operate under constrained environments such as wireless sensor networks. Serial architectures for GF(p) are analyzed and presented. Finally, the proposed architectures are verified and compared according to the amount of power dissipated throughout the operation.
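
    As a reference model for the kind of operation such serial architectures implement, the sketch below performs bit-serial (interleaved) modular multiplication, consuming the multiplier one bit at a time so intermediate values never grow far beyond the modulus, and builds square-and-multiply exponentiation on top of it. It is a software illustration under assumed parameters, not any specific architecture from the paper.

    ```python
    def modmul_interleaved(a, b, p):
        """(a * b) mod p by a left-to-right scan of b's bits (bit-serial reference model)."""
        result = 0
        for bit in bin(b)[2:]:                  # most significant bit first
            result = (result << 1) % p          # double
            if bit == "1":
                result = (result + a) % p       # conditionally add the multiplicand
        return result

    def modexp(base, exponent, p):
        """Square-and-multiply exponentiation built on the multiplier (core of RSA)."""
        acc = 1
        for bit in bin(exponent)[2:]:
            acc = modmul_interleaved(acc, acc, p)
            if bit == "1":
                acc = modmul_interleaved(acc, base, p)
        return acc

    p = 2**61 - 1                               # an example prime modulus (assumed)
    assert modmul_interleaved(123456789, 987654321, p) == (123456789 * 987654321) % p
    print(modexp(5, 1_000_003, p))
    ```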
