Sample records for modularity optimization multi-step

  1. Adaptive multi-resolution Modularity for detecting communities in networks

    NASA Astrophysics Data System (ADS)

    Chen, Shi; Wang, Zhi-Zhong; Bao, Mei-Hua; Tang, Liang; Zhou, Ji; Xiang, Ju; Li, Jian-Ming; Yi, Chen-He

    2018-02-01

    Community structure is a common topological property of complex networks that has attracted much attention from various fields. Optimizing quality functions for community structure, such as Modularity optimization, is a popular strategy for community detection. Here, we introduce a general definition of Modularity, from which several classical (multi-resolution) Modularity functions can be derived, and then propose an adaptive (multi-resolution) Modularity that combines the advantages of the different Modularity functions. By applying these Modularity functions to various synthetic and real-world networks, we study the behavior of the methods, showing the validity and advantages of the multi-resolution Modularity in community detection. The adaptive Modularity, as a multi-resolution method, naturally overcomes the first-type limit of Modularity and detects communities at different scales; it can quicken the disconnection of communities and delay their breakup in heterogeneous networks; it is thus expected to generate stable community structures more effectively and to have stronger tolerance against the second-type limit of Modularity.
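
    The abstract does not give the functional form being generalized; for background (not the authors' exact definition), a widely used multi-resolution variant of the Newman-Girvan Modularity is the Reichardt-Bornholdt form

        Q(\gamma) = \frac{1}{2m} \sum_{i,j} \left[ A_{ij} - \gamma \, \frac{k_i k_j}{2m} \right] \delta(c_i, c_j),

    where A_{ij} is the adjacency matrix, k_i the node degrees, m the number of edges, c_i the community labels, and \gamma the resolution parameter; \gamma = 1 recovers standard Modularity, while larger \gamma favors smaller communities and smaller \gamma favors larger ones.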

  2. Parameter optimization of a hydrologic model in a snow-dominated basin using a modular Python framework

    NASA Astrophysics Data System (ADS)

    Volk, J. M.; Turner, M. A.; Huntington, J. L.; Gardner, M.; Tyler, S.; Sheneman, L.

    2016-12-01

    Many distributed models that simulate watershed hydrologic processes require a collection of multi-dimensional parameters as input, some of which need to be calibrated before the model can be applied. The Precipitation Runoff Modeling System (PRMS) is a physically based and spatially distributed hydrologic model that contains a considerable number of parameters that often need to be calibrated. Modelers can also benefit from uncertainty analysis of these parameters. To meet these needs, we developed a modular framework in Python to conduct PRMS parameter optimization, uncertainty analysis, interactive visual inspection of parameters and outputs, and other common modeling tasks. Here we present results for multi-step calibration of sensitive parameters controlling solar radiation, potential evapotranspiration, and streamflow in a PRMS model that we applied to the snow-dominated Dry Creek watershed in Idaho. We also demonstrate how our modular approach enables the user to apply a variety of parameter optimization and uncertainty methods, or easily define their own, such as Monte Carlo random sampling and uniform sampling, or optimization methods such as the downhill simplex method and its more robust counterpart, shuffled complex evolution.
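
    A minimal sketch of the Monte Carlo random-sampling step mentioned above; the model wrapper, parameter bounds, and objective below are placeholders (the framework's actual interface is not shown in the record), with a toy function standing in for a PRMS run and a Nash-Sutcliffe efficiency score standing in for the calibration objective.

    import numpy as np

    rng = np.random.default_rng(0)
    observed = np.sin(np.linspace(0, 6, 100)) + 1.5        # stand-in for observed streamflow

    def run_model(params):
        """Hypothetical stand-in for a PRMS simulation returning streamflow."""
        amplitude, bias = params
        t = np.linspace(0, 6, 100)
        return amplitude * np.sin(t) + bias

    def nse(sim, obs):
        """Nash-Sutcliffe efficiency; 1.0 is a perfect fit."""
        return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

    bounds = np.array([[0.1, 2.0],     # bounds for the two calibrated parameters (invented)
                       [0.0, 3.0]])

    best_score, best_params = -np.inf, None
    for _ in range(2000):              # Monte Carlo random sampling within the bounds
        params = rng.uniform(bounds[:, 0], bounds[:, 1])
        score = nse(run_model(params), observed)
        if score > best_score:
            best_score, best_params = score, params

    print(best_params, best_score)     # a local-search step (e.g. Nelder-Mead) could refine this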

  3. Product modular design incorporating preventive maintenance issues

    NASA Astrophysics Data System (ADS)

    Gao, Yicong; Feng, Yixiong; Tan, Jianrong

    2016-03-01

    Traditional modular design methods lead to product maintenance problems, because the module form of a system is created according to either the function requirements or the manufacturing considerations. To solve these problems, a new modular design method is proposed that considers not only the traditional function-related attributes but also the maintenance-related ones. First, modularity parameters and modularity scenarios for product modularity are defined. Then the reliability and economic assessment models of product modularity strategies are formulated with the introduction of the effective working age of modules. A mathematical model is established to evaluate the difference among the modules of the product so that the optimal modules of the product can be determined. After that, a multi-objective optimization problem based on metrics for the preventive maintenance interval difference degree and preventive maintenance economics is formulated for modular optimization. A multi-objective genetic algorithm (GA) is utilized to rapidly approximate the Pareto set of optimal modularity strategy trade-offs between preventive maintenance cost and preventive maintenance interval difference degree. Finally, a coordinate CNC boring machine is adopted to illustrate the process of product modularity. In addition, two factorial design experiments based on the modularity parameters are constructed and analyzed. These experiments investigate the impacts of the parameters on the optimal modularity strategies and on the structure of the modules. The research proposes a new modular design method, which may help to improve the maintainability of products in modular design.

  4. Optimal multi-community network modularity for information diffusion

    NASA Astrophysics Data System (ADS)

    Wu, Jiaocan; Du, Ruping; Zheng, Yingying; Liu, Dong

    2016-02-01

    Recent studies demonstrate that community structure plays an important role in information spreading. In this paper, we investigate the impact of multi-community structure on information diffusion with the linear threshold model. We utilize an extended GN benchmark network that contains four communities and analyze the dynamic behavior of information spreading on it, and we identify the optimal multi-community network modularity for information diffusion under social reinforcement. Results show that, within the appropriate range, multi-community structure facilitates information diffusion instead of hindering it, which accords with the results derived from the two-community network.
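
    A minimal sketch of linear-threshold spreading on a multi-community benchmark; the extended GN network of the paper is approximated here (an assumption) by a planted-partition graph with four groups, and the uniform threshold and seed choice are likewise invented for illustration.

    import random
    import networkx as nx

    random.seed(1)
    # Four communities of 32 nodes; p_in / p_out control how modular the network is.
    G = nx.planted_partition_graph(4, 32, p_in=0.3, p_out=0.05, seed=1)

    theta = 0.25                                       # activation threshold (fraction of active neighbors)
    active = set(random.sample(list(G.nodes()), 5))    # initial adopters

    while True:                                        # synchronous linear-threshold updates
        newly = {v for v in G.nodes()
                 if v not in active and G.degree(v) > 0
                 and sum(n in active for n in G.neighbors(v)) / G.degree(v) >= theta}
        if not newly:
            break
        active |= newly

    print(f"final spread: {len(active)} of {G.number_of_nodes()} nodes")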

  5. Modularization of gradient-index optical design using wavefront matching enabled optimization.

    PubMed

    Nagar, Jogender; Brocker, Donovan E; Campbell, Sawyer D; Easum, John A; Werner, Douglas H

    2016-05-02

    This paper proposes a new design paradigm which allows for a modular approach to replacing a homogeneous optical lens system with a higher-performance GRadient-INdex (GRIN) lens system using a WaveFront Matching (WFM) method. In multi-lens GRIN systems, a full-system-optimization approach can be challenging due to the large number of design variables. The proposed WFM design paradigm enables optimization of each component independently by explicitly matching the WaveFront Error (WFE) of the original homogeneous component at the exit pupil, resulting in an efficient design procedure for complex multi-lens systems.

  6. A Modular Robotic System with Applications to Space Exploration

    NASA Technical Reports Server (NTRS)

    Hancher, Matthew D.; Hornby, Gregory S.

    2006-01-01

    Modular robotic systems offer potential advantages as versatile, fault-tolerant, cost-effective platforms for space exploration, but a sufficiently mature system is not yet available. We describe the possible applications of such a system, and present prototype hardware intended as a step in the right direction. We also present elements of an automated design and optimization framework aimed at making modular robots easier to design and use, and discuss the results of applying the system to a gait optimization problem. Finally, we discuss the potential near-term applications of modular robotics to terrestrial robotics research.

  7. A Hybrid Optimization Framework with POD-based Order Reduction and Design-Space Evolution Scheme

    NASA Astrophysics Data System (ADS)

    Ghoman, Satyajit S.

    The main objective of this research is to develop an innovative multi-fidelity multi-disciplinary design, analysis and optimization suite that integrates certain solution generation codes and newly developed innovative tools to improve the overall optimization process. The research performed herein is divided into two parts: (1) the development of an MDAO framework by integration of variable fidelity physics-based computational codes, and (2) enhancements to such a framework by incorporating innovative features extending its robustness. The first part of this dissertation describes the development of a conceptual Multi-Fidelity Multi-Strategy and Multi-Disciplinary Design Optimization Environment (M3DOE), in context of aircraft wing optimization. M3DOE provides the user a capability to optimize configurations with a choice of (i) the level of fidelity desired, (ii) the use of a single-step or multi-step optimization strategy, and (iii) combination of a series of structural and aerodynamic analyses. The modularity of M3DOE allows it to be a part of other inclusive optimization frameworks. The M3DOE is demonstrated within the context of shape and sizing optimization of the wing of a Generic Business Jet aircraft. Two different optimization objectives, viz. dry weight minimization, and cruise range maximization are studied by conducting one low-fidelity and two high-fidelity optimization runs to demonstrate the application scope of M3DOE. The second part of this dissertation describes the development of an innovative hybrid optimization framework that extends the robustness of M3DOE by employing a proper orthogonal decomposition-based design-space order reduction scheme combined with the evolutionary algorithm technique. The POD method of extracting dominant modes from an ensemble of candidate configurations is used for the design-space order reduction. The snapshot of candidate population is updated iteratively using evolutionary algorithm technique of fitness-driven retention. This strategy capitalizes on the advantages of evolutionary algorithm as well as POD-based reduced order modeling, while overcoming the shortcomings inherent with these techniques. When linked with M3DOE, this strategy offers a computationally efficient methodology for problems with high level of complexity and a challenging design-space. This newly developed framework is demonstrated for its robustness on a nonconventional supersonic tailless air vehicle wing shape optimization problem.

  8. Network community-detection enhancement by proper weighting

    NASA Astrophysics Data System (ADS)

    Khadivi, Alireza; Ajdari Rad, Ali; Hasler, Martin

    2011-04-01

    In this paper, we show how proper assignment of weights to the edges of a complex network can enhance the detection of communities and how it can circumvent the resolution limit and the extreme degeneracy problems associated with modularity. Our general weighting scheme takes advantage of graph-theoretic measures and introduces two heuristics for tuning its parameters. We use this weighting as a preprocessing step for the greedy modularity optimization algorithm of Newman to improve its performance. The results of experiments with our approach on computer-generated and real-world networks confirm that the proposed approach not only mitigates the problems of modularity but also improves the modularity optimization.
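
    A minimal sketch of the weight-then-optimize idea on a small benchmark; the paper's weighting scheme and its two tuning heuristics are not reproduced, so the Jaccard coefficient of each edge's endpoints (a graph-theoretic similarity) stands in as the preprocessing weight before Newman-style greedy modularity optimization, here via networkx.

    import networkx as nx
    from networkx.algorithms import community

    G = nx.karate_club_graph()                        # small benchmark network

    # Preprocessing step: assign a weight to every edge before community detection.
    for u, v, p in nx.jaccard_coefficient(G, ebunch=G.edges()):
        G[u][v]["weight"] = 1.0 + p                   # offset keeps every weight positive

    unweighted = community.greedy_modularity_communities(G)
    weighted = community.greedy_modularity_communities(G, weight="weight")

    print("communities (unweighted):", len(unweighted))
    print("communities (weighted):  ", len(weighted))
    print("modularity of weighted partition:", community.modularity(G, weighted))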

  9. Modular Training for Robot-Assisted Radical Prostatectomy: Where to Begin?

    PubMed

    Lovegrove, Catherine; Ahmed, Kamran; Novara, Giacomo; Guru, Khurshid; Mottrie, Alex; Challacombe, Ben; der Poel, Henk Van; Peabody, James; Dasgupta, Prokar

    Effective training is paramount for patient safety. Modular training entails advancing through surgical steps of increasing difficulty. This study aimed to construct a modular training pathway for use in robot-assisted radical prostatectomy (RARP). It aims to identify the sequence of procedural steps that are learnt before surgeons are able to perform a full procedure without intervention from a mentor. This is a multi-institutional, prospective, observational, longitudinal study. We used a validated training tool (RARP Score). Data regarding surgeons' stage of training and progress were collected for analysis. A modular training pathway was constructed with consensus on the level of difficulty and evaluation of individual steps. We identified and recorded the sequence of steps performed by fellows during their learning curves. We included 15 urology fellows from the UK, Europe, and Australia. The 15 surgeons were assessed by mentors in 425 RARP cases over 8 months (range: 7-79) across 15 international centers. There were substantial differences among the chronological order of the RARP steps, their difficulty level, and the order in which surgeons actually learned the steps. Steps were not attempted in chronological order. The greater the difficulty, the later the cohort first undertook the step (p = 0.021). The cohort undertook steps of difficulty level I at a median case number of 1. Steps of difficulty levels II, III, and IV showed more variation in the median case number of the first attempt. We recommend that, in the operating theater, steps be learned in order of increasing difficulty. A new modular training route has been designed. It incorporates the steps of RARP with the following order of priority: difficulty level > median case number of first attempt > most frequently undertaken in surgical training. An evidence-based modular training pathway has been developed that facilitates a safe introduction to RARP for novice surgeons. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  10. A Systems Approach towards an Intelligent and Self-Controlling Platform for Integrated Continuous Reaction Sequences

    PubMed Central

    Ingham, Richard J; Battilocchio, Claudio; Fitzpatrick, Daniel E; Sliwinski, Eric; Hawkins, Joel M; Ley, Steven V

    2015-01-01

    Performing reactions in flow can offer major advantages over batch methods. However, laboratory flow chemistry processes are currently often limited to single steps or short sequences due to the complexity involved with operating a multi-step process. Using new modular components for downstream processing, coupled with control technologies, more advanced multi-step flow sequences can be realized. These tools are applied to the synthesis of 2-aminoadamantane-2-carboxylic acid. A system comprising three chemistry steps and three workup steps was developed, having sufficient autonomy and self-regulation to be managed by a single operator. PMID:25377747

  11. A Multi-Year Plan for Research, Development, and Prototype Testing of Standard Modular Hydropower Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Brennan T.; Welch, Tim; Witt, Adam M.

    The Multi-Year Plan for Research, Development, and Prototype Testing of Standard Modular Hydropower Technology (MYRP) presents a strategy for specifying, designing, testing, and demonstrating the efficacy of standard modular hydropower (SMH) as an environmentally compatible and cost-optimized renewable electricity generation technology. The MYRP provides the context, background, and vision for testing the SMH hypothesis: if standardization, modularity, and preservation of stream functionality become essential and fully realized features of hydropower technology, project design, and regulatory processes, they will enable previously unrealized levels of new project development with increased acceptance, reduced costs, increased predictability of outcomes, and increased value to stakeholders. To achieve success in this effort, the MYRP outlines a framework of stakeholder-validated criteria, models, design tools, testing facilities, and assessment protocols that will facilitate the development of next-generation hydropower technologies.

  12. Fully chip-embedded automation of a multi-step lab-on-a-chip process using a modularized timer circuit.

    PubMed

    Kang, Junsu; Lee, Donghyeon; Heo, Young Jin; Chung, Wan Kyun

    2017-11-07

    For highly integrated microfluidic systems, an actuation system is necessary to control the flow; however, the bulkiness of actuation devices such as pumps and valves has impeded the broad application of integrated microfluidic systems. Here, we suggest a microfluidic process control method based on built-in microfluidic circuits. The circuit is composed of a fluidic timer circuit and a pneumatic logic circuit. The fluidic timer circuit is a serial connection of modularized timer units, which sequentially pass high pressure to the pneumatic logic circuit. The pneumatic logic circuit is a NOR gate array designed to control the liquid-handling process. By using the timer circuit as a built-in signal generator, multi-step processes can be carried out entirely inside the microchip without any external controller. The timer circuit uses only two valves per unit, and the number of process steps can be extended without limit by adding timer units. As a demonstration, an automation chip was designed for a six-step droplet treatment comprising 1) loading, 2) separation, 3) reagent injection, 4) incubation, 5) clearing, and 6) unloading. Each process was successfully performed for a pre-defined step time without any external control device.

  13. Modular synthesis of a dual metal-dual semiconductor nano-heterostructure

    DOE PAGES

    Amirav, Lilac; Oba, Fadekemi; Aloni, Shaul; ...

    2015-04-29

    Reported is the design and modular synthesis of a dual metal-dual semiconductor heterostructure with control over the dimensions and placement of its individual components. Analogous to molecular synthesis, colloidal synthesis is now evolving into a series of sequential synthetic procedures with separately optimized steps. Here we detail the challenges and parameters that must be considered when assembling such a multicomponent nanoparticle, and their solutions.

  14. Progress toward Modular UAS for Geoscience Applications

    NASA Astrophysics Data System (ADS)

    Dahlgren, R. P.; Clark, M. A.; Comstock, R. J.; Fladeland, M.; Gascot, H., III; Haig, T. H.; Lam, S. J.; Mazhari, A. A.; Palomares, R. R.; Pinsker, E. A.; Prathipati, R. T.; Sagaga, J.; Thurling, J. S.; Travers, S. V.

    2017-12-01

    Small Unmanned Aerial Systems (UAS) have become accepted tools for geoscience, ecology, agriculture, disaster response, land management, and industry. A variety of consumer UAS options exist as science and engineering payload platforms, but their incompatibilities with one another contribute to high operational costs compared with those of piloted aircraft. This research explores the concept of modular UAS, demonstrating airframes that can be reconfigured in the field for experimental optimization, to enable multi-mission support, facilitate rapid repair, or respond to changing field conditions. Modular UAS is revolutionary in allowing aircraft to be optimized around the payload, reversing the conventional wisdom of designing the payload to accommodate an unmodifiable aircraft. UAS that are reconfigurable like Legos™ are ideal for airborne science service providers, system integrators, instrument designers and end users to fulfill a wide range of geoscience experiments. Modular UAS facilitate the adoption of open-source software and rapid prototyping technology where design reuse is important in the context of a highly regulated industry like aerospace. The industry is now at a stage where consolidation, acquisition, and attrition will reduce the number of small manufacturers, with a reduction of innovation and motivation to reduce costs. Modularity leads to interface specifications, which can evolve into de facto or formal standards which contain minimum (but sufficient) details such that multiple vendors can then design to those standards and demonstrate interoperability. At that stage, vendor coopetition leads to robust interface standards, interoperability standards and multi-source agreements which in turn drive costs down significantly.

  15. An iterative network partition algorithm for accurate identification of dense network modules

    PubMed Central

    Sun, Siqi; Dong, Xinran; Fu, Yao; Tian, Weidong

    2012-01-01

    A key step in network analysis is to partition a complex network into dense modules. Currently, modularity is one of the most popular benefit functions used to partition networks into modules. However, recent studies have suggested that it has an inherent limitation in detecting dense network modules. In this study, we observed that despite this limitation, modularity has the advantage of preserving the primary network structure of the undetected modules. Thus, we have developed a simple iterative Network Partition (iNP) algorithm to partition a network. The iNP algorithm provides a general framework in which any modularity-based algorithm can be implemented in the network partition step. Here, we tested iNP with three modularity-based algorithms: multi-step greedy (MSG), spectral clustering and Qcut. Compared with the original three methods, iNP achieved a significant improvement in the quality of network partition in a benchmark study with simulated networks, identified more modules with significantly better enrichment of functionally related genes in both the yeast protein complex network and the breast cancer gene co-expression network, and discovered more cancer-specific modules in the cancer gene co-expression network. As such, iNP should have broad application as a general method to assist in the analysis of biological networks. PMID:22121225
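
    A simplified sketch of the iterative wrapper described above; the published iNP acceptance criteria and base methods (MSG, spectral clustering, Qcut) are not reproduced, so a networkx greedy modularity routine and a simple keep-the-densest-module rule stand in for them.

    import networkx as nx
    from networkx.algorithms import community

    def iterative_partition(G, min_size=3):
        """Repeatedly partition, peel off the densest module, and re-partition the rest."""
        G = G.copy()
        modules = []
        while G.number_of_edges() > 0:
            G.remove_nodes_from(list(nx.isolates(G)))          # drop disconnected leftovers
            parts = community.greedy_modularity_communities(G)  # any modularity-based method fits here
            best = max(parts, key=lambda c: nx.density(G.subgraph(c)))
            if len(best) < min_size:
                break
            modules.append(set(best))
            G.remove_nodes_from(best)
        if G.number_of_nodes():
            modules.append(set(G.nodes()))                      # remaining, unassigned nodes
        return modules

    print([len(m) for m in iterative_partition(nx.karate_club_graph())])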

  16. In silico evolution of the hunchback gene indicates redundancy in cis-regulatory organization and spatial gene expression

    PubMed Central

    Zagrijchuk, Elizaveta A.; Sabirov, Marat A.; Holloway, David M.; Spirov, Alexander V.

    2014-01-01

    Biological development depends on the coordinated expression of genes in time and space. Developmental genes have extensive cis-regulatory regions which control their expression. These regions are organized in a modular manner, with different modules controlling expression at different times and locations. Both how modularity evolved and what function it serves are open questions. We present a computational model for the cis-regulation of the hunchback (hb) gene in the fruit fly (Drosophila). We simulate evolution (using an evolutionary computation approach from computer science) to find the optimal cis-regulatory arrangements for fitting experimental hb expression patterns. We find that the cis-regulatory region tends to readily evolve modularity. These cis-regulatory modules (CRMs) do not tend to control single spatial domains, but show a multi-CRM/multi-domain correspondence. We find that the CRM-domain correspondence seen in Drosophila evolves with a high probability in our model, supporting the biological relevance of the approach. The partial redundancy resulting from multi-CRM control may confer some biological robustness against corruption of regulatory sequences. The technique developed on hb could readily be applied to other multi-CRM developmental genes. PMID:24712536

  17. A modular approach to intensity-modulated arc therapy optimization with noncoplanar trajectories

    NASA Astrophysics Data System (ADS)

    Papp, Dávid; Bortfeld, Thomas; Unkelbach, Jan

    2015-07-01

    Utilizing noncoplanar beam angles in volumetric modulated arc therapy (VMAT) has the potential to combine the benefits of arc therapy, such as short treatment times, with the benefits of noncoplanar intensity modulated radiotherapy (IMRT) plans, such as improved organ sparing. Recently, vendors introduced treatment machines that allow for simultaneous couch and gantry motion during beam delivery to make noncoplanar VMAT treatments possible. Our aim is to provide a reliable optimization method for noncoplanar isocentric arc therapy plan optimization. The proposed solution is modular in the sense that it can incorporate different existing beam angle selection and coplanar arc therapy optimization methods. Treatment planning is performed in three steps. First, a number of promising noncoplanar beam directions are selected using an iterative beam selection heuristic; these beams serve as anchor points of the arc therapy trajectory. In the second step, continuous gantry/couch angle trajectories are optimized using a simple combinatorial optimization model to define a beam trajectory that efficiently visits each of the anchor points. Treatment time is controlled by limiting the time the beam needs to trace the prescribed trajectory. In the third and final step, an optimal arc therapy plan is found along the prescribed beam trajectory. In principle any existing arc therapy optimization method could be incorporated into this step; for this work we use a sliding window VMAT algorithm. The approach is demonstrated using two particularly challenging cases. The first one is a lung SBRT patient whose planning goals could not be satisfied with fewer than nine noncoplanar IMRT fields when the patient was treated in the clinic. The second one is a brain tumor patient, where the target volume overlaps with the optic nerves and the chiasm and it is directly adjacent to the brainstem. Both cases illustrate that the large number of angles utilized by isocentric noncoplanar VMAT plans can help improve dose conformity, homogeneity, and organ sparing simultaneously using the same beam trajectory length and delivery time as a coplanar VMAT plan.

  18. MOCAT: A Metagenomics Assembly and Gene Prediction Toolkit

    PubMed Central

    Li, Junhua; Chen, Weineng; Chen, Hua; Mende, Daniel R.; Arumugam, Manimozhiyan; Pan, Qi; Liu, Binghang; Qin, Junjie; Wang, Jun; Bork, Peer

    2012-01-01

    MOCAT is a highly configurable, modular pipeline for fast, standardized processing of single or paired-end sequencing data generated by the Illumina platform. The pipeline uses state-of-the-art programs to quality control, map, and assemble reads from metagenomic samples sequenced at a depth of several billion base pairs, and predict protein-coding genes on assembled metagenomes. Mapping against reference databases allows for read extraction or removal, as well as abundance calculations. Relevant statistics for each processing step can be summarized into multi-sheet Excel documents and queryable SQL databases. MOCAT runs on UNIX machines and integrates seamlessly with the SGE and PBS queuing systems, commonly used to process large datasets. The open source code and modular architecture allow users to modify or exchange the programs that are utilized in the various processing steps. Individual processing steps and parameters were benchmarked and tested on artificial, real, and simulated metagenomes resulting in an improvement of selected quality metrics. MOCAT can be freely downloaded at http://www.bork.embl.de/mocat/. PMID:23082188

  19. MOCAT: a metagenomics assembly and gene prediction toolkit.

    PubMed

    Kultima, Jens Roat; Sunagawa, Shinichi; Li, Junhua; Chen, Weineng; Chen, Hua; Mende, Daniel R; Arumugam, Manimozhiyan; Pan, Qi; Liu, Binghang; Qin, Junjie; Wang, Jun; Bork, Peer

    2012-01-01

    MOCAT is a highly configurable, modular pipeline for fast, standardized processing of single or paired-end sequencing data generated by the Illumina platform. The pipeline uses state-of-the-art programs to quality control, map, and assemble reads from metagenomic samples sequenced at a depth of several billion base pairs, and predict protein-coding genes on assembled metagenomes. Mapping against reference databases allows for read extraction or removal, as well as abundance calculations. Relevant statistics for each processing step can be summarized into multi-sheet Excel documents and queryable SQL databases. MOCAT runs on UNIX machines and integrates seamlessly with the SGE and PBS queuing systems, commonly used to process large datasets. The open source code and modular architecture allow users to modify or exchange the programs that are utilized in the various processing steps. Individual processing steps and parameters were benchmarked and tested on artificial, real, and simulated metagenomes resulting in an improvement of selected quality metrics. MOCAT can be freely downloaded at http://www.bork.embl.de/mocat/.

  20. Characteristics and Concepts of Dynamic Hub Proteins in DNA Processing Machinery from Studies of RPA

    PubMed Central

    Sugitani, Norie; Chazin, Walter J.

    2015-01-01

    DNA replication, damage response and repair require the coordinated action of multi-domain proteins operating within dynamic multi-protein machines that act upon the DNA substrate. These modular proteins contain flexible linkers of various lengths, which enable changes in the spatial distribution of the globular domains (architecture) that harbor their essential biochemical functions. This mobile architecture is uniquely suited to follow the evolving substrate landscape present over the course of the specific process performed by the multi-protein machinery. A fundamental advance in understanding of protein machinery is the realization of the pervasive role of dynamics. Not only is the machine undergoing dynamic transformations, but the proteins themselves are flexible and constantly adapting to the progression through the steps of the overall process. Within this dynamic context the activity of the constituent proteins must be coordinated, a role typically played by hub proteins. A number of important characteristics of modular proteins and concepts about the operation of dynamic machinery have been discerned. These provide the underlying basis for the action of the machinery that reads DNA, and responds to and repairs DNA damage. Here, we introduce a number of key characteristics and concepts, including the modularity of the proteins, linkage of weak binding sites, direct competition between sites, and allostery, using the well recognized hub protein replication protein A (RPA). PMID:25542993

  1. Bi-Level Integrated System Synthesis (BLISS)

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Agte, Jeremy S.; Sandusky, Robert R., Jr.

    1998-01-01

    BLISS is a method for optimization of engineering systems by decomposition. It separates the system-level optimization, which has a relatively small number of design variables, from the potentially numerous subsystem optimizations that may each have a large number of local design variables. The subsystem optimizations are autonomous and may be conducted concurrently. Subsystem and system optimizations alternate, linked by sensitivity data, producing a design improvement in each iteration. Starting from a best-guess initial design, the method improves that design in iterative cycles, each cycle comprising two steps. In step one, the system-level variables are frozen and the improvement is achieved by separate, concurrent, and autonomous optimizations in the local variable subdomains. In step two, further improvement is sought in the space of the system-level variables. Optimum sensitivity data link the second step to the first. The method prototype was implemented using MATLAB and iSIGHT programming software and tested on a simplified, conceptual-level supersonic business jet design and a detailed design of an electronic device. Satisfactory convergence and favorable agreement with the benchmark results were observed. The modularity of the method is intended to fit the human organization and to map well onto concurrent-processing computing technology.
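
    A toy sketch of the two-step alternation, reduced to block-coordinate form: step one optimizes the local subsystem variables x1 and x2 with the system-level variable z frozen, and step two improves z with the locals held fixed. The quadratic subsystem functions are invented stand-ins for the coupled analyses, and the optimum-sensitivity data that link the two steps in the actual method are omitted.

    from scipy.optimize import minimize_scalar

    def subsystem1(x1, z):          # local "analysis" of subsystem 1 (made-up quadratic)
        return (x1 - z) ** 2 + 0.1 * x1 ** 2

    def subsystem2(x2, z):          # local "analysis" of subsystem 2 (made-up quadratic)
        return (x2 + 0.5 * z) ** 2 + 0.2 * (x2 - 1.0) ** 2

    def system_objective(z, x1, x2):
        return subsystem1(x1, z) + subsystem2(x2, z) + 0.05 * z ** 2

    z, x1, x2 = 2.0, 0.0, 0.0       # best-guess initial design
    for cycle in range(10):
        # Step 1: concurrent, autonomous local optimizations (z frozen).
        x1 = minimize_scalar(lambda v: subsystem1(v, z)).x
        x2 = minimize_scalar(lambda v: subsystem2(v, z)).x
        # Step 2: system-level optimization over z (locals frozen).
        z = minimize_scalar(lambda v: system_objective(v, x1, x2)).x

    print(f"z={z:.3f}, x1={x1:.3f}, x2={x2:.3f}, f={system_objective(z, x1, x2):.4f}")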

  2. Exploring complex dynamics in multi agent-based intelligent systems: Theoretical and experimental approaches using the Multi Agent-based Behavioral Economic Landscape (MABEL) model

    NASA Astrophysics Data System (ADS)

    Alexandridis, Konstantinos T.

    This dissertation adopts a holistic and detailed approach to modeling spatially explicit agent-based artificial intelligent systems, using the Multi Agent-based Behavioral Economic Landscape (MABEL) model. The research questions addressed stem from the need to understand and analyze the real-world patterns and dynamics of land use change from a coupled human-environmental systems perspective. The dissertation describes the systemic, mathematical, statistical, socio-economic and spatial dynamics of the MABEL modeling framework, and provides a wide array of cross-disciplinary modeling applications within the research, decision-making and policy domains. It establishes the symbolic properties of the MABEL model as a Markov decision process, analyzes the decision-theoretic utility and optimization attributes of agents towards comprising statistically and spatially optimal policies and actions, and explores the probabilistic character of the agents' decision-making and inference mechanisms via the use of Bayesian belief and decision networks. It develops and describes a Monte Carlo methodology for experimental replications of agents' decisions regarding complex spatial parcel acquisition and learning. It recognizes the gap in spatially explicit accuracy assessment techniques for complex spatial models, and proposes an ensemble of statistical tools designed to address this problem. Advanced information assessment techniques such as the Receiver Operating Characteristic curve, the impurity entropy and Gini functions, and Bayesian classification functions are proposed. The theoretical foundation for modular Bayesian inference in spatially explicit multi-agent artificial intelligent systems, and the ensembles of cognitive and scenario assessment modular tools built for the MABEL model, are provided. The dissertation emphasizes modularity and robustness as valuable qualitative modeling attributes, and examines the role of robust intelligent modeling as a tool for improving policy decisions related to land use change. Finally, the major contributions to the science are presented along with valuable directions for future research.

  3. Detector for positronium temperature measurements by two-photon angular correlation

    NASA Astrophysics Data System (ADS)

    Cecchini, G. G.; Jones, A. C. L.; Fuentes-Garcia, M.; Adams, D. J.; Austin, M.; Membreno, E.; Mills, A. P.

    2018-05-01

    We report on the design and characterization of a modular γ-ray detector assembly developed for accurate and efficient detection of coincident 511 keV back-to-back γ-rays following electron-positron annihilation. Each modular detector consists of 16 narrow lutetium yttrium oxyorthosilicate scintillators coupled to a multi-anode Hamamatsu H12700B photomultiplier tube. We discuss the operation and optimization of 511 keV γ-ray detection based on tests of various scintillators and detector arrangements, concluding with an estimate of the coincident 511 keV detection efficiency for the intended experiment and a preliminary test representing one-quarter of the completed array.

  4. Tracking Virus Particles in Fluorescence Microscopy Images Using Multi-Scale Detection and Multi-Frame Association.

    PubMed

    Jaiswal, Astha; Godinez, William J; Eils, Roland; Lehmann, Maik Jorg; Rohr, Karl

    2015-11-01

    Automatic fluorescent particle tracking is an essential task to study the dynamics of a large number of biological structures at a sub-cellular level. We have developed a probabilistic particle tracking approach based on multi-scale detection and two-step multi-frame association. The multi-scale detection scheme allows coping with particles in close proximity. For finding associations, we have developed a two-step multi-frame algorithm, which is based on a temporally semiglobal formulation as well as spatially local and global optimization. In the first step, reliable associations are determined for each particle individually in local neighborhoods. In the second step, the global spatial information over multiple frames is exploited jointly to determine optimal associations. The multi-scale detection scheme and the multi-frame association finding algorithm have been combined with a probabilistic tracking approach based on the Kalman filter. We have successfully applied our probabilistic tracking approach to synthetic as well as real microscopy image sequences of virus particles and quantified the performance. We found that the proposed approach outperforms previous approaches.
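
    A minimal constant-velocity Kalman filter of the kind such probabilistic trackers build on; the multi-scale detector and the two-step multi-frame association are not reproduced, and the noise covariances and detections below are invented. The state is [x, y, vx, vy]; only positions are observed.

    import numpy as np

    dt = 1.0
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], float)      # constant-velocity transition
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], float)       # only the (x, y) position is measured
    Q = 0.01 * np.eye(4)                      # process noise covariance (assumed)
    R = 0.25 * np.eye(2)                      # measurement noise covariance (assumed)

    x = np.array([0.0, 0.0, 1.0, 0.5])        # initial state estimate
    P = np.eye(4)                             # initial state covariance

    detections = [np.array([1.0, 0.4]), np.array([2.1, 1.1]), np.array([2.9, 1.6])]
    for z in detections:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update with the detection associated to this track
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(4) - K @ H) @ P
        print(np.round(x[:2], 2))             # filtered particle position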

  5. Long range personalized cancer treatment strategies incorporating evolutionary dynamics.

    PubMed

    Yeang, Chen-Hsiang; Beckman, Robert A

    2016-10-22

    Current cancer precision medicine strategies match therapies to static consensus molecular properties of an individual's cancer, thus determining the next therapeutic maneuver. These strategies typically maintain a constant treatment while the cancer is not worsening. However, cancers feature complicated sub-clonal structure and dynamic evolution. We have recently shown, in a comprehensive simulation of two non-cross resistant therapies across a broad parameter space representing realistic tumors, that substantial improvement in cure rates and median survival can be obtained utilizing dynamic precision medicine strategies. These dynamic strategies explicitly consider intratumoral heterogeneity and evolutionary dynamics, including predicted future drug resistance states, and reevaluate optimal therapy every 45 days. However, the optimization is performed in single 45 day steps ("single-step optimization"). Herein we evaluate analogous strategies that think multiple therapeutic maneuvers ahead, considering potential outcomes at 5 steps ahead ("multi-step optimization") or 40 steps ahead ("adaptive long term optimization (ALTO)") when recommending the optimal therapy in each 45 day block, in simulations involving both 2 and 3 non-cross resistant therapies. We also evaluate an ALTO approach for situations where simultaneous combination therapy is not feasible ("Adaptive long term optimization: serial monotherapy only (ALTO-SMO)"). Simulations utilize populations of 764,000 and 1,700,000 virtual patients for 2 and 3 drug cases, respectively. Each virtual patient represents a unique clinical presentation including sizes of major and minor tumor subclones, growth rates, evolution rates, and drug sensitivities. While multi-step optimization and ALTO provide no significant average survival benefit, cure rates are significantly increased by ALTO. Furthermore, in the subset of individual virtual patients demonstrating clinically significant difference in outcome between approaches, by far the majority show an advantage of multi-step or ALTO over single-step optimization. ALTO-SMO delivers cure rates superior or equal to those of single- or multi-step optimization, in 2 and 3 drug cases respectively. In selected virtual patients incurable by dynamic precision medicine using single-step optimization, analogous strategies that "think ahead" can deliver long-term survival and cure without any disadvantage for non-responders. When therapies require dose reduction in combination (due to toxicity), optimal strategies feature complex patterns involving rapidly interleaved pulses of combinations and high dose monotherapy. This article was reviewed by Wendy Cornell, Marek Kimmel, and Andrzej Swierniak. Wendy Cornell and Andrzej Swierniak are external reviewers (not members of the Biology Direct editorial board). Andrzej Swierniak was nominated by Marek Kimmel.
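
    A toy contrast between single-step and multi-step (lookahead) drug selection over 45-day blocks; the three-clone model, rates, and mutation term below are invented for illustration and are far simpler than the paper's evolutionary simulations, so this only shows the mechanics of evaluating candidate drug sequences.

    import itertools
    import math

    DT = 45.0                        # days per treatment block
    GROW, KILL = 0.02, 0.04          # net growth / kill rates per day (invented)
    MUT = 1e-7                       # fraction of surviving sensitive cells seeding each resistant clone per block

    # Clones: S (sensitive to both drugs), R1 (resists drug 1), R2 (resists drug 2).
    def step(state, drug):
        """Advance clone sizes through one 45-day block under the chosen drug."""
        S, R1, R2 = state
        rate = lambda resistant: GROW if resistant else -KILL
        S_new = S * math.exp(rate(False) * DT)
        R1_new = R1 * math.exp(rate(drug == 1) * DT) + MUT * S_new
        R2_new = R2 * math.exp(rate(drug == 2) * DT) + MUT * S_new
        return (S_new, R1_new, R2_new)

    def plan(state, horizon):
        """Return the first drug of the length-`horizon` sequence with the lowest final burden."""
        best_seq, best_total = None, math.inf
        for seq in itertools.product((1, 2), repeat=horizon):
            s = state
            for d in seq:
                s = step(s, d)
            if sum(s) < best_total:
                best_seq, best_total = seq, sum(s)
        return best_seq[0]

    state = (1e9, 1e4, 1e2)          # large sensitive clone, two small resistant clones
    for label, horizon in (("single-step", 1), ("5-step lookahead", 5)):
        s = state
        for _ in range(10):          # ten consecutive 45-day blocks
            s = step(s, plan(s, horizon))
        print(f"{label}: total burden after 10 blocks = {sum(s):.3e}")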

  6. Purification of complex samples: Implementation of a modular and reconfigurable droplet-based microfluidic platform with cascaded deterministic lateral displacement separation modules

    PubMed Central

    Pudda, Catherine; Boizot, François; Verplanck, Nicolas; Revol-Cavalier, Frédéric; Berthier, Jean; Thuaire, Aurélie

    2018-01-01

    Particle separation in microfluidic devices is a common problem in sample preparation for biology. Deterministic lateral displacement (DLD) is efficiently implemented as a size-based fractionation technique to separate two populations of particles around a specific size. However, real biological samples contain components of many different sizes, and a single DLD separation step is not sufficient to purify these complex samples. When connecting several DLD modules in series, pressure balancing at the DLD outlets of each step becomes critical to ensure optimal separation efficiency. A generic microfluidic platform is presented in this paper to optimize pressure balancing when DLD separation is connected either to another DLD module or to a different microfluidic function. This is made possible by generating droplets at T-junctions connected to the DLD outlets. The droplets act as pressure controllers, simultaneously encapsulating the DLD-sorted particles and balancing the output pressures. The optimized pressures to apply on the DLD modules and on the T-junctions are determined by a general model that ensures the equilibrium of the entire platform. The proposed separation platform is completely modular and reconfigurable, since the same predictive model applies to any cascaded DLD modules of the droplet-based cartridge. PMID:29768490

  7. Transformational System Concepts and Technologies for Our Future in Space

    NASA Technical Reports Server (NTRS)

    Howell, Joe T.; Mankins, John C.

    2004-01-01

    Continued constrained budgets and growing national and international interest in the commercialization and development of space require NASA to be constantly vigilant, to be creative, and to seize every opportunity for assuring the maximum return on space infrastructure investments. Accordingly, efforts are underway to forge new and innovative approaches that transform our space systems so that they ultimately achieve two, three, or five times as much with the same resources. This bold undertaking can be achieved only through extensive cooperative efforts throughout the aerospace community and truly effective planning to pursue advanced space system design concepts and high-risk/high-leverage research and technology. Definitive implementation strategies and roadmaps containing new methodologies and revolutionary approaches must be developed to economically accommodate the continued exploration and development of space. Transformation can be realized through modular design and stepping-stone development. This approach involves sustainable budget levels and multi-purpose systems development of supporting capabilities that lead to a diverse array of sustainable future space activities. Transformational design and development requires revolutionary advances achieved through modular designs and a planned, stepping-stone development process. A modular approach to space systems potentially offers many improvements over traditional one-of-a-kind space systems comprised of different subsystem elements with little standardization in interfaces or functionality. Modular systems must be more flexible, scalable, reconfigurable, and evolvable. Costs can be reduced through learning-curve effects and economies of scale, and by enabling servicing and repair that would not otherwise be feasible. This paper briefly discusses a promising approach to transforming space systems planning and evolution into a meaningful stepping-stone design, development, and implementation process. The success of this well-planned and orchestrated approach holds great promise for achieving innovation and revolutionary technology development in support of the future exploration and development of space.

  8. Texas two-step: a framework for optimal multi-input single-output deconvolution.

    PubMed

    Neelamani, Ramesh; Deffenbaugh, Max; Baraniuk, Richard G

    2007-11-01

    Multi-input single-output deconvolution (MISO-D) aims to extract a deblurred estimate of a target signal from several blurred and noisy observations. This paper develops a new two-step framework, Texas Two-Step, to solve MISO-D problems with known blurs. Texas Two-Step first reduces the MISO-D problem to a related single-input single-output deconvolution (SISO-D) problem by invoking the concept of sufficient statistics (SSs) and then solves the simpler SISO-D problem using an appropriate technique. The two-step framework enables new MISO-D techniques (both optimal and suboptimal) based on the rich suite of existing SISO-D techniques. In fact, the properties of SSs imply that a MISO-D algorithm is mean-squared-error optimal if and only if it can be rearranged to conform to the Texas Two-Step framework. Using this insight, we construct new wavelet- and curvelet-based MISO-D algorithms with asymptotically optimal performance. Simulated and real data experiments verify that the framework is indeed effective.
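
    The record does not spell out the reduction itself; under the common assumption of independent white Gaussian noise with equal variance on every channel (an assumption made here for illustration, not a statement of the paper's full generality), the sufficient-statistic step amounts to matched filtering each observation and summing the channels:

        y_i = h_i * x + n_i, \qquad i = 1, \dots, K,

        s = \sum_{i=1}^{K} \tilde{h}_i * y_i = g * x + \tilde{n}, \qquad g = \sum_{i=1}^{K} \tilde{h}_i * h_i, \qquad \tilde{h}_i(t) = h_i(-t).

    The second step then deconvolves the single effective blur g from s with any SISO-D technique; note that the summed noise \tilde{n} is no longer white, which the chosen SISO-D method must take into account.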

  9. Reversible logic gates based on enzyme-biocatalyzed reactions and realized in flow cells: a modular approach.

    PubMed

    Fratto, Brian E; Katz, Evgeny

    2015-05-18

    Reversible logic gates, such as the double Feynman gate, Toffoli gate and Peres gate, with 3-input/3-output channels are realized using reactions biocatalyzed with enzymes and performed in flow systems. The flow devices are constructed using a modular approach, where each flow cell is modified with one enzyme that biocatalyzes one chemical reaction. The multi-step processes mimicking the reversible logic gates are organized by combining the biocatalytic cells in different networks. This work emphasizes logical but not physical reversibility of the constructed systems. Their advantages and disadvantages are discussed and potential use in biosensing systems, rather than in computing devices, is suggested. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Multi-degree of freedom joystick for virtual reality simulation.

    PubMed

    Head, M J; Nelson, C A; Siu, K C

    2013-11-01

    A modular control interface and simulated virtual reality environment were designed and created in order to determine how the kinematic architecture of a control interface affects minimally invasive surgery training. A user is able to selectively determine the kinematic configuration of an input device (number, type and location of degrees of freedom) for a specific surgical simulation through the use of modular joints and constraint components. Furthermore, passive locking was designed and implemented through the use of inflated latex tubing around rotational joints in order to allow a user to step away from a simulation without unwanted tool motion. It is believed that these features will facilitate improved simulation of a variety of surgical procedures and, thus, improve surgical skills training.

  11. A Scalable and Robust Multi-Agent Approach to Distributed Optimization

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan

    2005-01-01

    Modularizing a large optimization problem so that the solutions to the subproblems provide a good overall solution is a challenging problem. In this paper we present a multi-agent approach to this problem based on aligning the agent objectives with the system objectives, obviating the need to impose external mechanisms to achieve collaboration among the agents. This approach naturally addresses scaling and robustness issues by ensuring that the agents do not rely on the reliable operation of other agents. We test this approach on the difficult distributed optimization problem of imperfect device subset selection [Challet and Johnson, 2002]. In this problem, there are n devices, each of which has a "distortion", and the task is to find the subset of those n devices that minimizes the average distortion. Our results show that in large systems (1000 agents) the proposed approach provides improvements of over an order of magnitude over both traditional optimization methods and traditional multi-agent methods. Furthermore, the results show that even in extreme cases of agent failures (i.e., half the agents fail midway through the simulation) the system remains coordinated and still outperforms a failure-free and centralized optimization algorithm.

  12. A wireless modular multi-modal multi-node patch platform for robust biosignal monitoring.

    PubMed

    Pantelopoulos, Alexandros; Saldivar, Enrique; Roham, Masoud

    2011-01-01

    In this paper a wireless, modular, multi-modal, multi-node patch platform is described. The platform comprises a low-cost, semi-disposable patch design aimed at unobtrusive ambulatory monitoring of multiple physiological parameters. Owing to its modular design it can be interfaced with various low-power RF communication and data storage technologies, while the data fusion of multi-modal and multi-node features facilitates measurement of several biosignals from multiple on-body locations for robust feature extraction. Preliminary results of the patch platform are presented which illustrate the capability to extract respiration rate from three different independent metrics, which combined together give a more robust estimate of the actual respiratory rate.

  13. A simulator for surgery training: optimal sensory stimuli in a bone pinning simulation

    NASA Astrophysics Data System (ADS)

    Daenzer, Stefan; Fritzsche, Klaus

    2008-03-01

    Currently available low-cost haptic devices allow inexpensive surgical training with no risk to patients. Major drawbacks of lower-cost devices include limited maximum feedback force and the inability to render the moments (torques) that occur. The aim of this work was the design and implementation of a surgical simulator that allows the evaluation of multi-sensory stimuli in order to overcome these drawbacks. The simulator was built following a modular architecture to allow flexible combinations and thorough evaluation of different multi-sensory feedback modules. A Kirschner-wire (K-wire) tibial fracture fixation procedure was defined and implemented as a first test scenario. A set of computational metrics was derived from the clinical requirements of the task to objectively assess the trainee's performance during simulation. Sensory feedback modules for haptic and visual feedback were developed, each in a basic and in an enhanced form. First tests have shown that specific visual concepts can overcome some of the drawbacks associated with low-cost haptic devices. The simulator, the metrics and the surgery scenario together represent an important step towards a better understanding of the perception of multi-sensory feedback in complex surgical training tasks. Field studies built on this architecture can open the way to risk-free and inexpensive surgical simulations that can keep up with traditional surgical training.

  14. Online Community Detection for Large Complex Networks

    PubMed Central

    Pan, Gang; Zhang, Wangsheng; Wu, Zhaohui; Li, Shijian

    2014-01-01

    Complex networks describe a wide range of systems in nature and society. To understand complex networks, it is crucial to investigate their community structure. In this paper, we develop an online community detection algorithm with linear time complexity for large complex networks. Our algorithm processes a network edge by edge in the order that the network is fed to the algorithm. If a new edge is added, it just updates the existing community structure in constant time, and does not need to re-compute the whole network. Therefore, it can efficiently process large networks in real time. Our algorithm optimizes expected modularity instead of modularity at each step to avoid poor performance. The experiments are carried out using 11 public data sets, and are measured by two criteria, modularity and NMI (Normalized Mutual Information). The results show that our algorithm's running time is less than that of the commonly used Louvain algorithm, while it gives competitive performance. PMID:25061683
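
    A naive sketch of edge-by-edge processing: each arriving edge either attaches a new node to a community or triggers a merge when that merge raises modularity. Unlike the algorithm above, modularity is recomputed from scratch here via networkx (no constant-time update and no expected-modularity objective), so this only illustrates the streaming structure.

    import networkx as nx
    from networkx.algorithms.community import modularity

    def online_communities(edge_stream):
        G = nx.Graph()
        comm_of = {}                                   # node -> community id
        members = {}                                   # community id -> set of member nodes
        next_id = 0
        for u, v in edge_stream:
            G.add_edge(u, v)
            for node in (u, v):
                if node not in comm_of:                # a brand-new node starts in its own community
                    comm_of[node] = next_id
                    members[next_id] = {node}
                    next_id += 1
            cu, cv = comm_of[u], comm_of[v]
            if cu != cv:
                current = modularity(G, list(members.values()))
                merged = [m for cid, m in members.items() if cid not in (cu, cv)]
                merged.append(members[cu] | members[cv])
                if modularity(G, merged) > current:    # merge the two communities only if it helps
                    for node in members[cv]:
                        comm_of[node] = cu
                    members[cu] |= members[cv]
                    del members[cv]
        return members

    comms = online_communities(nx.karate_club_graph().edges())
    print(f"{len(comms)} communities found")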

  15. Optimization of coupled multiphysics methodology for safety analysis of pebble bed modular reactor

    NASA Astrophysics Data System (ADS)

    Mkhabela, Peter Tshepo

    The research conducted within the framework of this PhD thesis is devoted to the high-fidelity multi-physics (neutronics/thermal-hydraulics coupling) analysis of the Pebble Bed Modular Reactor (PBMR), which is a High Temperature Reactor (HTR). The Next Generation Nuclear Plant (NGNP) will be an HTR design. The core design and safety analysis methods are considerably less developed and mature for HTR analysis than those currently used for Light Water Reactors (LWRs). Compared to LWRs, HTR transient analysis is more demanding since it requires proper treatment of both slower and much longer transients (on time scales of hours and days) and fast, short transients (on time scales of minutes and seconds). There is limited operational and experimental data available for HTRs for validation of coupled multi-physics methodologies. This PhD work developed and verified reliable high-fidelity coupled multi-physics models, subsequently implemented in robust, efficient, and accurate computational tools, to analyse the neutronics and thermal-hydraulic behaviour for design optimization and safety evaluation of the PBMR concept. The study contributed to greater accuracy of neutronics calculations by including the feedback from the thermal-hydraulics-driven temperature calculation and the various multi-physics effects that can influence it. The feedback due to the influence of leakage was taken into account by developing and implementing improved buckling feedback models. Modifications were made in the calculation procedure to ensure that the xenon depletion models were accurate for proper interpolation from cross-section tables. To achieve this, the NEM/THERMIX coupled code system was developed to create a system that is efficient and stable over the duration of transient calculations that last several tens of hours. Another achievement of the PhD thesis was the development and demonstration of a full-physics, three-dimensional safety analysis methodology for the PBMR to provide reference solutions. Different aspects of the coupled methodology were investigated and an efficient kinetics treatment for the PBMR was developed, which accounts for all feedback phenomena in an efficient manner. The OECD/NEA PBMR-400 coupled code benchmark was used as a test matrix for the proposed investigations. The integrated thermal-hydraulics and neutronics (multi-physics) methods were extended to enable modeling of a wider range of transients pertinent to the PBMR. First, the effect of the spatial mapping schemes (spatial coupling) was studied and quantified for different types of transients, which resulted in the implementation of an improved mapping methodology based on user-defined criteria. The second aspect that was studied and optimized is the temporal coupling and meshing schemes between the neutronics and thermal-hydraulics time-step selection algorithms. Coupled code convergence was achieved, supplemented by the application of methods to accelerate it. Finally, the modeling of all feedback phenomena in PBMRs was investigated and a novel treatment of cross-section dependencies was introduced to improve the representation of cross-section variations. An added benefit was that, in the process of studying and improving the coupled multi-physics methodology, more insight was gained into the physics and dynamics of the PBMR, which will also help to optimize the PBMR design and improve its safety. One unique contribution of the PhD research is the investigation of the importance of the correct representation of three-dimensional (3-D) effects in PBMR analysis. The performed studies demonstrated that explicit 3-D modeling of control rod movement is superior and removes the errors associated with the grey curtain (2-D homogenized) approximation.

  16. A Multi-Scale, Multi-Physics Optimization Framework for Additively Manufactured Structural Components

    NASA Astrophysics Data System (ADS)

    El-Wardany, Tahany; Lynch, Mathew; Gu, Wenjiong; Hsu, Arthur; Klecka, Michael; Nardi, Aaron; Viens, Daniel

    This paper proposes an optimization framework enabling the integration of multi-scale / multi-physics simulation codes to perform structural optimization design for additively manufactured components. Cold spray was selected as the additive manufacturing (AM) process and its constraints were identified and included in the optimization scheme. The developed framework first utilizes topology optimization to maximize stiffness for conceptual design. The subsequent step applies shape optimization to refine the design for stress-life fatigue. The component weight was reduced by 20% while stresses were reduced by 75% and the rigidity was improved by 37%. The framework and analysis codes were implemented using Altair software as well as an in-house loading code. The optimized design was subsequently produced by the cold spray process.

  17. A self optimizing synthetic organic reactor system using real-time in-line NMR spectroscopy.

    PubMed

    Sans, Victor; Porwol, Luzian; Dragone, Vincenza; Cronin, Leroy

    2015-02-01

    A configurable platform for synthetic chemistry incorporating an in-line benchtop NMR that is capable of monitoring and controlling organic reactions in real-time is presented. The platform is controlled via a modular LabView software control system for the hardware, NMR, data analysis and feedback optimization. Using this platform we report the real-time advanced structural characterization of reaction mixtures, including 19F, 13C, DEPT, 2D NMR spectroscopy (COSY, HSQC and 19F-COSY) for the first time. Finally, the potential of this technique is demonstrated through the optimization of a catalytic organic reaction in real-time, showing its applicability to self-optimizing systems using criteria such as stereoselectivity, multi-nuclear measurements or 2D correlations.
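
    The closed feedback loop described here can be sketched as an objective function that applies candidate reaction conditions, acquires and scores an in-line spectrum, and hands the score to a derivative-free optimizer that proposes the next conditions. In the sketch below the hardware and NMR functions are hypothetical stand-ins (the real platform is driven through LabView), and the synthetic response surface is invented only to make the loop runnable.

      import numpy as np
      from scipy.optimize import minimize

      def set_conditions(flow_rate_ml_min, temperature_C):
          """Hypothetical stand-in for commanding pumps and reactor temperature."""
          pass  # the real platform would talk to the hardware controller here

      def acquire_nmr_yield(flow_rate_ml_min, temperature_C):
          """Hypothetical stand-in for in-line NMR acquisition and peak integration.
          A smooth synthetic response surface with an optimum near (1.0 mL/min, 60 C)."""
          return np.exp(-((flow_rate_ml_min - 1.0) ** 2) / 0.5
                        - ((temperature_C - 60.0) ** 2) / 400.0)

      def objective(x):
          flow, temp = x
          set_conditions(flow, temp)             # apply the candidate conditions
          return -acquire_nmr_yield(flow, temp)  # optimizer minimizes, so negate the yield

      result = minimize(objective, x0=[0.5, 40.0], method="Nelder-Mead",
                        options={"xatol": 1e-2, "fatol": 1e-3})
      print("best conditions:", result.x, "estimated yield:", -result.fun)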

  18. TMSEEG: A MATLAB-Based Graphical User Interface for Processing Electrophysiological Signals during Transcranial Magnetic Stimulation.

    PubMed

    Atluri, Sravya; Frehlich, Matthew; Mei, Ye; Garcia Dominguez, Luis; Rogasch, Nigel C; Wong, Willy; Daskalakis, Zafiris J; Farzan, Faranak

    2016-01-01

    Concurrent recording of electroencephalography (EEG) during transcranial magnetic stimulation (TMS) is an emerging and powerful tool for studying brain health and function. Despite a growing interest in adaptation of TMS-EEG across neuroscience disciplines, its widespread utility is limited by signal processing challenges. These challenges arise due to the nature of TMS and the sensitivity of EEG to artifacts that often mask TMS-evoked potentials (TEPs). With an increase in the complexity of data processing methods and a growing interest in multi-site data integration, analysis of TMS-EEG data requires the development of a standardized method to recover TEPs from various sources of artifacts. This article introduces TMSEEG, an open-source MATLAB application comprised of multiple algorithms organized to facilitate a step-by-step procedure for TMS-EEG signal processing. Using a modular design and interactive graphical user interface (GUI), this toolbox aims to streamline TMS-EEG signal processing for both novice and experienced users. Specifically, TMSEEG provides: (i) targeted removal of TMS-induced and general EEG artifacts; (ii) a step-by-step modular workflow with flexibility to modify existing algorithms and add customized algorithms; (iii) a comprehensive display and quantification of artifacts; (iv) quality control check points with visual feedback of TEPs throughout the data processing workflow; and (v) capability to label and store a database of artifacts. In addition to these features, the software architecture of TMSEEG ensures minimal user effort in initial setup and configuration of parameters for each processing step. This is partly accomplished through a close integration with EEGLAB, a widely used open-source toolbox for EEG signal processing. In this article, we introduce TMSEEG, validate its features and demonstrate its application in extracting TEPs across several single- and multi-pulse TMS protocols. As the first open-source GUI-based pipeline for TMS-EEG signal processing, this toolbox intends to promote the widespread utility and standardization of an emerging technology in brain research.
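
    TMSEEG itself is a MATLAB GUI, but the step-by-step modular workflow it describes can be illustrated language-independently: an ordered list of named processing steps applied in sequence, with an optional quality-control hook after each step. The step functions below are hypothetical placeholders, not the toolbox's actual algorithms.

      import numpy as np

      def remove_tms_artifact(eeg):
          """Hypothetical step: blank out samples around each TMS pulse."""
          return eeg

      def filter_and_rereference(eeg):
          """Hypothetical step: band-pass filter and re-reference channels."""
          return eeg

      def reject_bad_trials(eeg):
          """Hypothetical step: drop trials exceeding an amplitude threshold."""
          return eeg

      PIPELINE = [
          ("tms_artifact_removal", remove_tms_artifact),
          ("filter_rereference", filter_and_rereference),
          ("trial_rejection", reject_bad_trials),
      ]

      def run_pipeline(eeg, steps=PIPELINE, qc_hook=None):
          """Apply each named step in order; call an optional QC hook after every step."""
          for name, step in steps:
              eeg = step(eeg)
              if qc_hook is not None:
                  qc_hook(name, eeg)   # e.g. plot TEPs here for visual inspection
          return eeg

      data = np.random.randn(64, 1000)  # channels x samples placeholder
      cleaned = run_pipeline(data, qc_hook=lambda name, x: print(name, x.shape))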

  19. TMSEEG: A MATLAB-Based Graphical User Interface for Processing Electrophysiological Signals during Transcranial Magnetic Stimulation

    PubMed Central

    Atluri, Sravya; Frehlich, Matthew; Mei, Ye; Garcia Dominguez, Luis; Rogasch, Nigel C.; Wong, Willy; Daskalakis, Zafiris J.; Farzan, Faranak

    2016-01-01

    Concurrent recording of electroencephalography (EEG) during transcranial magnetic stimulation (TMS) is an emerging and powerful tool for studying brain health and function. Despite a growing interest in adaptation of TMS-EEG across neuroscience disciplines, its widespread utility is limited by signal processing challenges. These challenges arise due to the nature of TMS and the sensitivity of EEG to artifacts that often mask TMS-evoked potentials (TEPs). With an increase in the complexity of data processing methods and a growing interest in multi-site data integration, analysis of TMS-EEG data requires the development of a standardized method to recover TEPs from various sources of artifacts. This article introduces TMSEEG, an open-source MATLAB application comprised of multiple algorithms organized to facilitate a step-by-step procedure for TMS-EEG signal processing. Using a modular design and interactive graphical user interface (GUI), this toolbox aims to streamline TMS-EEG signal processing for both novice and experienced users. Specifically, TMSEEG provides: (i) targeted removal of TMS-induced and general EEG artifacts; (ii) a step-by-step modular workflow with flexibility to modify existing algorithms and add customized algorithms; (iii) a comprehensive display and quantification of artifacts; (iv) quality control check points with visual feedback of TEPs throughout the data processing workflow; and (v) capability to label and store a database of artifacts. In addition to these features, the software architecture of TMSEEG ensures minimal user effort in initial setup and configuration of parameters for each processing step. This is partly accomplished through a close integration with EEGLAB, a widely used open-source toolbox for EEG signal processing. In this article, we introduce TMSEEG, validate its features and demonstrate its application in extracting TEPs across several single- and multi-pulse TMS protocols. As the first open-source GUI-based pipeline for TMS-EEG signal processing, this toolbox intends to promote the widespread utility and standardization of an emerging technology in brain research. PMID:27774054

  20. Shortened OR time and decreased patient risk through use of a modular surgical instrument with artificial intelligence.

    PubMed

    Miller, David J; Nelson, Carl A; Oleynikov, Dmitry

    2009-05-01

    With a limited number of access ports, minimally invasive surgery (MIS) often requires the complete removal of one tool and reinsertion of another. Modular or multifunctional tools can be used to avoid this step. In this study, soft computing techniques are used to optimally arrange a modular tool's functional tips, allowing surgeons to deliver treatment of improved quality in less time, decreasing overall cost. The investigators watched University Medical Center surgeons perform MIS procedures (e.g., cholecystectomy and Nissen fundoplication) and recorded the procedures to digital video. The video was then used to analyze the types of instruments used, the duration of each use, and the function of each instrument. These data were aggregated with fuzzy logic techniques using four membership functions to quantify the overall usefulness of each tool. This allowed subsequent optimization of the arrangement of functional tips within the modular tool to decrease overall time spent changing instruments during simulated surgical procedures based on the video recordings. Based on a prototype and a virtual model of a multifunction laparoscopic tool designed by the investigators that can interchange six different instrument tips through the tool's shaft, the range of tool change times is approximately 11-13 s. Using this figure, estimated time savings for the procedures analyzed ranged from 2.5 to over 32 min, and on average, total surgery time can be reduced by almost 17% by using the multifunction tool.
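
    The fuzzy aggregation idea, quantifying each instrument tip's usefulness from observed usage data and then arranging the most useful tips for fastest access, can be sketched as follows. The membership functions, weights and usage numbers are invented for illustration and are not the four membership functions used in the study.

      def tri(x, a, b, c):
          """Triangular fuzzy membership function peaking at b on support [a, c]."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      def usefulness(uses_per_case, mean_duration_s):
          """Aggregate two illustrative memberships into a usefulness score in [0, 1]."""
          often_used = tri(uses_per_case, 0.0, 15.0, 30.0)
          used_long = tri(mean_duration_s, 0.0, 120.0, 240.0)
          return 0.5 * often_used + 0.5 * used_long    # assumed equal weighting

      # toy usage data per instrument tip: (uses per case, mean seconds per use)
      tips = {"grasper": (18, 90), "scissors": (6, 40), "hook": (12, 150),
              "clip_applier": (3, 25), "dissector": (9, 70), "irrigator": (2, 30)}

      # arrange the highest-scoring tips in the most accessible cartridge positions
      ranking = sorted(tips, key=lambda t: usefulness(*tips[t]), reverse=True)
      print(ranking)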

  1. ISCE: A Modular, Reusable Library for Scalable SAR/InSAR Processing

    NASA Astrophysics Data System (ADS)

    Agram, P. S.; Lavalle, M.; Gurrola, E. M.; Sacco, G. F.; Rosen, P. A.

    2016-12-01

    Traditional community SAR/InSAR processing software tools have primarily focused on differential interferometry and Solid Earth applications. The InSAR Scientific Computing Environment (ISCE) was specifically designed to support the Earth Sciences user community as well as large-scale operational processing tasks, thanks to its two-layered (Python+C/Fortran) architecture and modular framework. ISCE is freely distributed as a source tarball, allowing advanced users to modify and extend it for their research purposes and to develop exploratory applications, while providing a relatively simple user interface for novice users to perform routine data analysis efficiently. The modular design of the ISCE library also enables easier development of applications to address the needs of the Ecosystems, Cryosphere and Disaster Response communities in addition to the traditional Solid Earth applications. In this talk, we would like to emphasize the broader purview of the ISCE library and some of its unique features that set it apart from other freely available community software like GMTSAR and DORIS, including: Support for multiple geometry regimes - Native Doppler (ALOS-1) as well as Zero Doppler (ESA missions) systems. Support for data acquired by airborne platforms - e.g., JPL's UAVSAR and AirMOSS, DLR's F-SAR. Radiometric Terrain Correction - Auxiliary output layers from the geometry modules include projection angles, incidence angles, shadow-layover masks. Dense pixel offsets - Parallelized amplitude cross correlation for cryosphere / ionospheric correction applications. Rubber sheeting - Pixel-by-pixel offset fields for resampling slave imagery for geometric co-registration / ionospheric corrections. Preliminary TanDEM-X processing support - Bistatic geometry modules. Extensibility to support other non-Solid Earth missions - Modules can be directly adopted for use with other SAR missions, e.g., SWOT. Preliminary support for multi-dimensional data products - multi-polarization, multi-frequency, multi-temporal, multi-baseline stacks via the PLANT and GIAnT toolboxes. Rapid prototyping - Geometry manipulation functionality at the Python level allows users to prototype and test processing modules at the interpreter level before optimal implementation in C/C++/Fortran.

  2. MR CAT scan: a modular approach for hybrid imaging.

    PubMed

    Hillenbrand, C; Hahn, D; Haase, A; Jakob, P M

    2000-07-01

    In this study, a modular concept for NMR hybrid imaging is presented. This concept essentially integrates different imaging modules in a sequential fashion and is therefore called CAT (combined acquisition technique). CAT is not a single specific measurement sequence, but rather a sequence design concept whereby distinct acquisition techniques with varying imaging parameters are employed in rapid succession in order to cover k-space. The power of the CAT approach is that it provides a high flexibility toward the acquisition optimization with respect to the available imaging time and the desired image quality. Important CAT sequence optimization steps include the appropriate choice of the k-space coverage ratio and the application of mixed bandwidth technology. Details of both the CAT methodology and possible CAT acquisition strategies, such as FLASH/EPI-, RARE/EPI- and FLASH/BURST-CAT are provided. Examples from imaging experiments in phantoms and healthy volunteers including mixed bandwidth acquisitions are provided to demonstrate the feasibility of the proposed CAT concept.

  3. Coupling between a multi-physics workflow engine and an optimization framework

    NASA Astrophysics Data System (ADS)

    Di Gallo, L.; Reux, C.; Imbeaux, F.; Artaud, J.-F.; Owsiak, M.; Saoutic, B.; Aiello, G.; Bernardi, P.; Ciraolo, G.; Bucalossi, J.; Duchateau, J.-L.; Fausser, C.; Galassi, D.; Hertout, P.; Jaboulay, J.-C.; Li-Puma, A.; Zani, L.

    2016-03-01

    A generic coupling method between a multi-physics workflow engine and an optimization framework is presented in this paper. The coupling architecture has been developed in order to preserve the integrity of the two frameworks. The objective is to provide the possibility to replace a framework, a workflow or an optimizer by another one without changing the whole coupling procedure or modifying the main content of each framework. The coupling is achieved by using a socket-based communication library for exchanging data between the two frameworks. Among the algorithms provided by optimization frameworks, Genetic Algorithms (GAs) have demonstrated their efficiency on single- and multiple-criteria optimization. In addition to their robustness, GAs can handle invalid data which may appear during the optimization; consequently, GAs work in the most general cases. A parallelized framework has been developed to reduce the time spent on optimizations and the evaluation of large samples. A test has shown good scaling efficiency of this parallelized framework. This coupling method has been applied to the case of SYCOMORE (SYstem COde for MOdeling tokamak REactor), a system code developed in the form of a modular workflow for designing magnetic fusion reactors. The coupling of SYCOMORE with the optimization platform URANIE enables design optimization with respect to various figures of merit and constraints.
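
    The socket-based exchange between the optimizer and the workflow engine can be sketched as a minimal request/reply protocol: the optimizer sends a candidate parameter set, the workflow engine evaluates it and returns the figures of merit. The JSON message format, port number and toy evaluation below are assumptions for illustration, not the SYCOMORE/URANIE protocol.

      import json
      import socket
      import threading
      import time

      HOST, PORT = "127.0.0.1", 50007   # assumed local endpoint for the sketch

      def evaluate_design(params):
          """Hypothetical stand-in for a full multi-physics workflow evaluation."""
          return {"cost": params["radius"] ** 2 + params["field"] ** 2}

      def workflow_server():
          """Workflow-engine side: receive one candidate, reply with its objectives."""
          with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
              srv.bind((HOST, PORT))
              srv.listen(1)
              conn, _ = srv.accept()
              with conn:
                  request = json.loads(conn.recv(4096).decode())
                  conn.sendall(json.dumps(evaluate_design(request)).encode())

      def optimizer_client(candidate):
          """Optimizer side: send a candidate and read back its objective values."""
          with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
              cli.connect((HOST, PORT))
              cli.sendall(json.dumps(candidate).encode())
              return json.loads(cli.recv(4096).decode())

      threading.Thread(target=workflow_server, daemon=True).start()
      time.sleep(0.5)   # give the server time to start listening
      print(optimizer_client({"radius": 3.0, "field": 5.0}))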

  4. Multi-kilowatt modularized spacecraft power processing system development

    NASA Technical Reports Server (NTRS)

    Andrews, R. E.; Hayden, J. H.; Hedges, R. T.; Rehmann, D. W.

    1975-01-01

    A review of existing information pertaining to spacecraft power processing systems and equipment was accomplished with a view towards applicability to the modularization of multi-kilowatt power processors. Power requirements for future spacecraft were determined from the NASA mission model-shuttle systems payload data study which provided the limits for modular power equipment capabilities. Three power processing systems were compared to evaluation criteria to select the system best suited for modularity. The shunt regulated direct energy transfer system was selected by this analysis for a conceptual design effort which produced equipment specifications, schematics, envelope drawings, and power module configurations.

  5. Engineering genetic circuit interactions within and between synthetic minimal cells

    NASA Astrophysics Data System (ADS)

    Adamala, Katarzyna P.; Martin-Alarcon, Daniel A.; Guthrie-Honea, Katriona R.; Boyden, Edward S.

    2017-05-01

    Genetic circuits and reaction cascades are of great importance for synthetic biology, biochemistry and bioengineering. An open question is how to maximize the modularity of their design to enable the integration of different reaction networks and to optimize their scalability and flexibility. One option is encapsulation within liposomes, which enables chemical reactions to proceed in well-isolated environments. Here we adapt liposome encapsulation to enable the modular, controlled compartmentalization of genetic circuits and cascades. We demonstrate that it is possible to engineer genetic circuit-containing synthetic minimal cells (synells) to contain multiple-part genetic cascades, and that these cascades can be controlled by external signals as well as inter-liposomal communication without crosstalk. We also show that liposomes that contain different cascades can be fused in a controlled way so that the products of incompatible reactions can be brought together. Synells thus enable a more modular creation of synthetic biology cascades, an essential step towards their ultimate programmability.

  6. Multi-objective optimization for an automated and simultaneous phase and baseline correction of NMR spectral data

    NASA Astrophysics Data System (ADS)

    Sawall, Mathias; von Harbou, Erik; Moog, Annekathrin; Behrens, Richard; Schröder, Henning; Simoneau, Joël; Steimers, Ellen; Neymeyr, Klaus

    2018-04-01

    Spectral data preprocessing is an integral and sometimes inevitable part of chemometric analyses. For Nuclear Magnetic Resonance (NMR) spectra a possible first preprocessing step is a phase correction which is applied to the Fourier transformed free induction decay (FID) signal. This preprocessing step can be followed by a separate baseline correction step. Especially if series of high-resolution spectra are considered, then automated and computationally fast preprocessing routines are desirable. A new method is suggested that applies the phase and the baseline corrections simultaneously in an automated form without manual input, which distinguishes this work from other approaches. The underlying multi-objective optimization or Pareto optimization provides improved results compared to consecutively applied correction steps. The optimization process uses an objective function which applies strong penalty constraints and weaker regularization conditions. The new method includes an approach for the detection of zero baseline regions. The baseline correction uses a modified Whittaker smoother. The functionality of the new method is demonstrated for experimental NMR spectra. The results are verified against gravimetric data. The method is compared to alternative preprocessing tools. Additionally, the simultaneous correction method is compared to a consecutive application of the two correction steps.
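
    The simultaneous correction can be sketched as a single objective over the two phase parameters: phase the complex spectrum, estimate a smooth baseline with a Whittaker smoother, and penalize negative intensities of the result. The synthetic two-peak spectrum, penalty form and smoother parameter below are placeholders chosen only to make the idea concrete; they are not the authors' objective function.

      import numpy as np
      from scipy import sparse
      from scipy.sparse.linalg import spsolve
      from scipy.optimize import minimize

      def whittaker(y, lam=1e7):
          """Whittaker smoother: minimize ||y - z||^2 + lam * ||D2 z||^2 for z."""
          n = len(y)
          D = sparse.diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(n - 2, n))
          return spsolve((sparse.eye(n) + lam * (D.T @ D)).tocsc(), y)

      def corrected(spectrum, phi0, phi1):
          """Apply zero- and first-order phase, then subtract a smooth baseline."""
          ramp = np.linspace(-0.5, 0.5, len(spectrum))
          real = np.real(spectrum * np.exp(1j * (phi0 + phi1 * ramp)))
          return real - whittaker(real)

      def objective(phi, spectrum):
          """Penalize negative intensities of the phased, baseline-corrected spectrum."""
          r = corrected(spectrum, *phi)
          return float(np.sum(r[r < 0.0] ** 2))

      # synthetic test spectrum: two complex Lorentzians with a known phase error
      x = np.linspace(-1.0, 1.0, 2048)
      peaks = 1.0 / (1.0 + 1j * (x - 0.3) / 0.01) + 1.0 / (1.0 + 1j * (x + 0.4) / 0.01)
      spectrum = peaks * np.exp(-1j * (0.7 + 0.5 * np.linspace(-0.5, 0.5, x.size)))

      res = minimize(objective, x0=[0.0, 0.0], args=(spectrum,), method="Powell")
      print("estimated (phi0, phi1):", res.x, "  true values: (0.7, 0.5)")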

  7. Classification of functional interactions from multi-electrodes data using conditional modularity analysis

    NASA Astrophysics Data System (ADS)

    Makhtar, Siti Noormiza; Senik, Mohd Harizal

    2018-02-01

    The availability of massive amounts of neuronal signals is attracting widespread interest in functional connectivity analysis. Functional interactions estimated by multivariate partial coherence analysis in the frequency domain represent the connectivity strength in this study. Modularity is a network measure for the detection of community structure in network analysis. The discovery of community structure for the functional neuronal network was implemented on multi-electrode array (MEA) signals recorded from hippocampal regions in isoflurane-anaesthetized Lister-hooded rats. The analysis is expected to show modularity changes before and after local unilateral kainic acid (KA)-induced epileptiform activity. The results are presented using a color-coded graphic of the conditional modularity measure for 19 MEA nodes. This network is separated into four sub-regions to show the community detection within each sub-region. The results show that classification of neuronal signals into inter- and intra-modular nodes is feasible using conditional modularity analysis. Estimation of segregation properties using conditional modularity analysis may provide further information about functional connectivity from MEA data.
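
    As a simplified illustration of modularity-based community detection on a coherence network (standard modularity rather than the conditional measure developed in the paper), the sketch below builds a weighted graph from a toy 19-node partial-coherence matrix, detects communities and reports the modularity score. The matrix values are random placeholders.

      import numpy as np
      import networkx as nx
      from networkx.algorithms import community

      rng = np.random.default_rng(0)

      # toy symmetric "partial coherence" matrix for 19 electrodes (placeholder values)
      n = 19
      coherence = rng.uniform(0.0, 0.3, size=(n, n))
      coherence[:10, :10] += 0.5              # assume stronger coupling within one sub-region
      coherence = (coherence + coherence.T) / 2.0
      np.fill_diagonal(coherence, 0.0)

      # build a weighted graph, detect communities, and score the partition
      G = nx.from_numpy_array(coherence)
      parts = community.greedy_modularity_communities(G, weight="weight")
      Q = community.modularity(G, parts, weight="weight")
      print("communities:", [sorted(c) for c in parts])
      print("modularity Q = %.3f" % Q)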

  8. Intelligent Reconfigurable System with Self-Damage Assessment and Control Stress Capabilities

    NASA Astrophysics Data System (ADS)

    Trivailo, P.; Plotnikova, L.; Kao, T. W.

    2002-01-01

    Modern space structures are constructed using a modular approach that facilitates their transportation and assembly in space. Modular architecture of space structures also enables reconfiguration of large structures such that they can adapt to possible changes in environment, and also allows use of the limited structural resources available in space for completion of a much larger variety of tasks. An increase in size and complexity demands development of materials with a "smart" or active structural modulus and also of effective control algorithms to control the motion of large flexible structures. This challenging task has generated a lot of interest amongst scientists and engineers during the last two decades; however, research into the development of control schemes which can adapt to structural configuration changes has received less attention. This is possibly due to the increased complexity caused by alterations in geometry, which inevitably lead to changes in the dynamic properties of the system. This paper presents results of the application of a decentralized control approach for active control of large flexible structures undergoing significant reconfigurations. The Control Component Synthesis methodology was used to build controlled components and to assemble them into a controlled flexible structure that meets required performance specifications. To illustrate the efficiency of the method, numerical simulations were conducted for 2D and 3D modular truss structures and a multi-link beam system. In each case the performance of the decentralized control system has been evaluated using pole location maps, step and impulse response simulations and frequency response analysis. The performance of the decentralized control system has been measured against the optimal centralised control system for various excitation scenarios. A special case where one of the local component controllers fails was also examined. For better interpretation of the efficiency of the designed controllers, results of the simulations are illustrated using a Virtual Reality computer environment, offering advanced visual effects.

  9. A proposal of optimal sampling design using a modularity strategy

    NASA Astrophysics Data System (ADS)

    Simone, A.; Giustolisi, O.; Laucelli, D. B.

    2016-08-01

    Real water distribution networks (WDNs) contain thousands of nodes, and the optimal placement of pressure and flow observations is a relevant issue for different management tasks. The planning of pressure observations in terms of spatial distribution and number is named sampling design, and it has traditionally been addressed with model calibration in mind. Nowadays, the design of system monitoring is a relevant issue for water utilities, e.g., in order to manage background leakages, to detect anomalies and bursts, to guarantee service quality, etc. In recent years, the optimal location of flow observations related to the design of optimal district metering areas (DMAs) and to leakage management purposes has been addressed considering optimal network segmentation and the modularity index using a multiobjective strategy. Optimal network segmentation is the basis to identify network modules by means of optimal conceptual cuts, which are the candidate locations of closed gates or flow meters creating the DMAs. Starting from the WDN-oriented modularity index, as a metric for WDN segmentation, this paper proposes a new way to perform the sampling design, i.e., the optimal location of pressure meters, using a newly developed sampling-oriented modularity index. The strategy optimizes the pressure monitoring system mainly based on network topology and weights assigned to pipes according to the specific technical tasks. A multiobjective optimization minimizes the cost of pressure meters while maximizing the sampling-oriented modularity index. The methodology is presented and discussed using the Apulian and Exnet networks.

  10. Mechanical System Analysis/Design Tool (MSAT) Quick Guide

    NASA Technical Reports Server (NTRS)

    Lee, HauHua; Kolb, Mark; Madelone, Jack

    1998-01-01

    MSAT is a unique multi-component multi-disciplinary tool that organizes design analysis tasks around object-oriented representations of configuration components, analysis programs and modules, and data transfer links between them. This creative modular architecture enables rapid generation of input stream for trade-off studies of various engine configurations. The data transfer links automatically transport output from one application as relevant input to the next application once the sequence is set up by the user. The computations are managed via constraint propagation - the constraints supplied by the user as part of any optimization module. The software can be used in the preliminary design stage as well as during the detail design of product development process.

  11. Investigation of the binding properties of a multi-modular GH45 cellulase using bioinspired model assemblies.

    PubMed

    Fong, Monica; Berrin, Jean-Guy; Paës, Gabriel

    2016-01-01

    Enzymes degrading plant biomass polymers are widely used in biotechnological applications. Their efficiency can be limited by non-specific interactions occurring with some chemical motifs. In particular, the lignin component is known to bind enzymes irreversibly. In order to determine interactions of enzymes with their substrates, experiments are usually performed on isolated simple polymers which are not representative of plant cell wall complexity. But when using natural plant substrates, the role of individual chemical and structural features affecting enzyme-binding properties is also difficult to decipher. We have designed and used lignified model assemblies of plant cell walls as templates to characterize binding properties of multi-modular cellulases. These three-dimensional assemblies are modulated in their composition using the three principal polymers found in secondary plant cell walls (cellulose, hemicellulose, and lignin). Binding properties of enzymes are obtained from the measurement of their mobility that depends on their interactions with the polymers and chemical motifs of the assemblies. The affinity of the multi-modular GH45 cellulase was characterized using a statistical analysis to determine the role played by each assembly polymer. Presence of hemicellulose had much less impact on affinity than cellulose and model lignin. Depending on the number of CBMs appended to the cellulase catalytic core, binding properties toward cellulose and lignin were highly contrasted. Model assemblies bring new insights into the molecular determinants that are responsible for interactions between enzymes and substrate without the need of complex analysis. Consequently, we believe that model bioinspired assemblies will provide relevant information for the design and optimization of enzyme cocktails in the context of biorefineries.

  12. Advanced Modular Power Approach to Affordable, Supportable Space Systems

    NASA Technical Reports Server (NTRS)

    Oeftering, Richard C.; Kimnach, Greg L.; Fincannon, James; Mckissock, Barbara I.; Loyselle, Patricia L.; Wong, Edmond

    2013-01-01

    Recent studies of missions to the Moon, Mars and Near Earth Asteroids (NEA) indicate that these missions often involve several distinct, separately launched vehicles that must ultimately be integrated together in-flight and operate as one unit. Therefore, it is important to see these vehicles as elements of a larger segmented spacecraft rather than separate spacecraft flying in formation. The evolution of large multi-vehicle exploration architecture creates the need (and opportunity) to establish a global power architecture that is common across all vehicles. The Advanced Exploration Systems (AES) Modular Power System (AMPS) project managed by NASA Glenn Research Center (GRC) is aimed at establishing the modular power system architecture that will enable power systems to be built from a common set of modular building blocks. The project is developing, demonstrating and evaluating key modular power technologies that are expected to minimize non-recurring development costs and reduce recurring integration costs, as well as mission operational and support costs. Further, modular power is expected to enhance mission flexibility, vehicle reliability, scalability and overall mission supportability. The AMPS project not only supports multi-vehicle architectures but should enable multi-mission capability as well. The AMPS technology development involves near-term demonstrations with developmental prototype vehicles and field demonstrations. These operational demonstrations not only serve as a means of evaluating modular technology but also provide feedback to developers to assure progress toward a truly flexible and operationally supportable modular power architecture.

  13. CXTFIT/Excel A modular adaptable code for parameter estimation, sensitivity analysis and uncertainty analysis for laboratory or field tracer experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, Guoping; Mayes, Melanie; Parker, Jack C

    2010-01-01

    We implemented the widely used CXTFIT code in Excel to provide flexibility and added sensitivity and uncertainty analysis functions to improve transport parameter estimation and to facilitate model discrimination for multi-tracer experiments on structured soils. Analytical solutions for one-dimensional equilibrium and nonequilibrium convection dispersion equations were coded as VBA functions so that they could be used as ordinary math functions in Excel for forward predictions. Macros with user-friendly interfaces were developed for optimization, sensitivity analysis, uncertainty analysis, error propagation, response surface calculation, and Monte Carlo analysis. As a result, any parameter with transformations (e.g., dimensionless, log-transformed, species-dependent reactions, etc.) could be estimated with uncertainty and sensitivity quantification for multiple tracer data at multiple locations and times. Prior information and observation errors could be incorporated into the weighted nonlinear least squares method with a penalty function. Users are able to change selected parameter values and view the results via embedded graphics, resulting in a flexible tool applicable to modeling transport processes and to teaching students about parameter estimation. The code was verified by comparing to a number of benchmarks with CXTFIT 2.0. It was applied to improve parameter estimation for four typical tracer experiment data sets in the literature using multi-model evaluation and comparison. Additional examples were included to illustrate the flexibilities and advantages of CXTFIT/Excel. The VBA macros were designed for general purpose and could be used for any parameter estimation/model calibration when the forward solution is implemented in Excel. A step-by-step tutorial, example Excel files and the code are provided as supplemental material.
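
    The kind of parameter estimation CXTFIT/Excel performs can be sketched outside Excel as a nonlinear least-squares fit of an analytical convection-dispersion solution to breakthrough data, with parameter uncertainties taken from the covariance matrix. The Ogata-Banks step-input solution, column length, parameter values and noise level below are illustrative assumptions, not the code's actual worked examples.

      import numpy as np
      from scipy.special import erfc, erfcx
      from scipy.optimize import curve_fit

      X = 10.0  # assumed observation distance along the column (cm)

      def cde_breakthrough(t, v, D):
          """1-D equilibrium CDE, continuous step input (Ogata-Banks form): C/C0 at x = X.
          The second term is written via erfcx for numerical stability."""
          a = (X - v * t) / (2.0 * np.sqrt(D * t))
          b = (X + v * t) / (2.0 * np.sqrt(D * t))
          return 0.5 * erfc(a) + 0.5 * np.exp(-a * a) * erfcx(b)

      # synthetic noisy breakthrough curve standing in for a tracer experiment
      rng = np.random.default_rng(0)
      t_obs = np.linspace(2.0, 30.0, 40)
      c_obs = cde_breakthrough(t_obs, v=1.0, D=0.5) + rng.normal(0.0, 0.01, t_obs.size)

      popt, pcov = curve_fit(cde_breakthrough, t_obs, c_obs, p0=[0.5, 1.0],
                             bounds=([1e-3, 1e-3], [10.0, 10.0]))
      perr = np.sqrt(np.diag(pcov))  # 1-sigma parameter uncertainties
      print("v = %.3f +/- %.3f cm/h, D = %.3f +/- %.3f cm2/h"
            % (popt[0], perr[0], popt[1], perr[1]))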

  14. Efficient Inversion of Multi-frequency and Multi-Source Electromagnetic Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gary D. Egbert

    2007-03-22

    The project covered by this report focused on the development of efficient but robust non-linear inversion algorithms for electromagnetic induction data, in particular for data collected with multiple receivers and multiple transmitters, a situation extremely common in geophysical EM subsurface imaging methods. A key observation is that for such multi-transmitter problems each step in commonly used linearized iterative limited-memory search schemes such as conjugate gradients (CG) requires solution of forward and adjoint EM problems for each of the N frequencies or sources, essentially generating data sensitivities for an N-dimensional data subspace. These multiple sensitivities allow a good approximation to the full Jacobian of the data mapping to be built up in many fewer search steps than would be required by application of textbook optimization methods, which take no account of the multiplicity of forward problems that must be solved for each search step. We have applied this idea to develop a hybrid inversion scheme that combines features of the iterative limited-memory type methods with a Newton-type approach using a partial calculation of the Jacobian. Initial tests on 2D problems show that the new approach produces results essentially identical to a Newton-type Occam minimum structure inversion, while running more rapidly than an iterative (fixed regularization parameter) CG-style inversion. Memory requirements, while greater than for something like CG, are modest enough that even in 3D the scheme should allow 3D inverse problems to be solved on a common desktop PC, at least for modest (~ 100 sites, 15-20 frequencies) data sets. A secondary focus of the research has been the development of a modular system for EM inversion, using an object-oriented approach. This system has proven useful for more rapid prototyping of inversion algorithms, in particular allowing initial development and testing to be conducted with two-dimensional example problems, before approaching more computationally cumbersome three-dimensional problems.

  15. Evaluation of Methods for Multidisciplinary Design Optimization (MDO). Part 2

    NASA Technical Reports Server (NTRS)

    Kodiyalam, Srinivas; Yuan, Charles; Sobieski, Jaroslaw (Technical Monitor)

    2000-01-01

    A new MDO method, BLISS, and two different variants of the method, BLISS/RS and BLISS/S, have been implemented using iSIGHT's scripting language and are evaluated in this report on multidisciplinary problems. All of these methods are based on decomposing a modular system optimization problem into several subtask optimizations, which may be executed concurrently, and a system-level optimization that coordinates the subtask optimizations. The BLISS method and its variants are well suited for exploiting the concurrent processing capabilities of a multiprocessor machine. Several steps, including the local sensitivity analysis, local optimization, and response surface construction and updates, are all ideally suited for concurrent processing. Needless to say, such algorithms that can effectively exploit the concurrent processing capabilities of compute servers will be a key requirement for solving large-scale industrial design problems, such as the automotive vehicle problem detailed in Section 3.4.

  16. A Grey Wolf Optimizer for Modular Granular Neural Networks for Human Recognition

    PubMed Central

    Sánchez, Daniela; Melin, Patricia

    2017-01-01

    A grey wolf optimizer for modular neural network (MNN) with a granular approach is proposed. The proposed method performs optimal granulation of data and design of modular neural network architectures to perform human recognition, and, to prove its effectiveness, benchmark databases of ear, iris, and face biometric measures are used to perform tests and comparisons against other works. The design of a modular granular neural network (MGNN) consists in finding optimal parameters of its architecture; these parameters are the number of subgranules, percentage of data for the training phase, learning algorithm, goal error, number of hidden layers, and their number of neurons. Nowadays, there is a great variety of approaches and new techniques within the evolutionary computing area; these approaches and techniques have emerged to help find optimal solutions to problems or models, and bioinspired algorithms are part of this area. In this work a grey wolf optimizer is proposed for the design of modular granular neural networks, and the results are compared against a genetic algorithm and a firefly algorithm in order to know which of these techniques provides better results when applied to human recognition. PMID:28894461

  17. A Grey Wolf Optimizer for Modular Granular Neural Networks for Human Recognition.

    PubMed

    Sánchez, Daniela; Melin, Patricia; Castillo, Oscar

    2017-01-01

    A grey wolf optimizer for modular neural network (MNN) with a granular approach is proposed. The proposed method performs optimal granulation of data and design of modular neural network architectures to perform human recognition, and, to prove its effectiveness, benchmark databases of ear, iris, and face biometric measures are used to perform tests and comparisons against other works. The design of a modular granular neural network (MGNN) consists in finding optimal parameters of its architecture; these parameters are the number of subgranules, percentage of data for the training phase, learning algorithm, goal error, number of hidden layers, and their number of neurons. Nowadays, there is a great variety of approaches and new techniques within the evolutionary computing area; these approaches and techniques have emerged to help find optimal solutions to problems or models, and bioinspired algorithms are part of this area. In this work a grey wolf optimizer is proposed for the design of modular granular neural networks, and the results are compared against a genetic algorithm and a firefly algorithm in order to know which of these techniques provides better results when applied to human recognition.
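
    The grey wolf optimizer itself follows a simple update rule: the three best wolves (alpha, beta, delta) guide every other wolf, with an encircling coefficient that decays over the iterations. The sketch below applies the standard GWO update to a toy sphere function; in the article the fitness would instead come from building and evaluating a candidate MGNN architecture, which is omitted here.

      import numpy as np

      def sphere(x):
          """Toy objective standing in for the MGNN architecture-fitness evaluation."""
          return float(np.sum(x ** 2))

      def grey_wolf_optimizer(obj, dim=5, n_wolves=20, iters=200, lb=-10.0, ub=10.0, seed=0):
          rng = np.random.default_rng(seed)
          wolves = rng.uniform(lb, ub, size=(n_wolves, dim))
          fitness = np.array([obj(w) for w in wolves])

          for t in range(iters):
              alpha, beta, delta = wolves[np.argsort(fitness)[:3]]   # three best wolves
              a = 2.0 - 2.0 * t / iters                              # decays linearly to 0
              for i in range(n_wolves):
                  new_pos = np.zeros(dim)
                  for leader in (alpha, beta, delta):
                      r1, r2 = rng.random(dim), rng.random(dim)
                      A, C = 2.0 * a * r1 - a, 2.0 * r2
                      new_pos += leader - A * np.abs(C * leader - wolves[i])
                  wolves[i] = np.clip(new_pos / 3.0, lb, ub)
                  fitness[i] = obj(wolves[i])

          best = int(np.argmin(fitness))
          return wolves[best], float(fitness[best])

      x_best, f_best = grey_wolf_optimizer(sphere)
      print("best solution:", np.round(x_best, 4), "fitness:", f_best)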

  18. Multi-objective multi-refinery optimization with environmental and catastrophic failure effects objectives

    NASA Astrophysics Data System (ADS)

    Khogeer, Ahmed Sirag

    2005-11-01

    Petroleum refining is a capital-intensive business. With stringent environmental regulations on the processing industry and declining refining margins, political instability, increased risk of war and terrorist attacks in which refineries and fuel transportation grids may be targeted, higher pressures are exerted on refiners to optimize performance and find the best combination of feed and processes to produce salable products that meet stricter product specifications, while at the same time meeting refinery supply commitments and of course making profit. This is done through multi-objective optimization. For corporate refining companies and at the national level, Intra-Refinery and Inter-Refinery optimization is the second step in optimizing the operation of the whole refining chain as a single system. Most refinery-wide optimization methods do not cover multiple objectives such as minimizing environmental impact, avoiding catastrophic failures, or enhancing product spec upgrade effects. This work starts by carrying out a refinery-wide, single-objective optimization, and then moves to multi-objective, single-refinery optimization. The last step is multi-objective, multi-refinery optimization, the objectives of which are the analysis of economic, environmental, product spec, strategic, and catastrophic failure effects. Simulation runs were carried out using both MATLAB and ASPEN PIMS utilizing nonlinear techniques to solve the optimization problem. The results addressed the need to debottleneck some refineries or transportation media in order to meet the demand for essential products under partial or total failure scenarios. They also addressed how importing some high-spec products can help recover some of the losses and what is needed in order to accomplish this. In addition, the results showed nonlinear relations among local and global objectives for some refineries. The results demonstrate that refineries can have a local multi-objective optimum that does not follow the same trends as either global or local single-objective optimums. Catastrophic failure effects on refinery operations and on local objectives are more significant than environmental objective effects, and changes in the capacity or the local objectives follow a discrete behavioral pattern, in contrast to environmental objective cases in which the effects are smoother. (Abstract shortened by UMI.)

  19. Further optimization of SeDDaRA blind image deconvolution algorithm and its DSP implementation

    NASA Astrophysics Data System (ADS)

    Wen, Bo; Zhang, Qiheng; Zhang, Jianlin

    2011-11-01

    An efficient algorithm for blind image deconvolution and its high-speed implementation are of great value in practice. A further optimization of SeDDaRA is developed, covering both the algorithm structure and the numerical calculation methods. The main optimizations are modularization of the structure for good implementation feasibility, reduction of the data computation and dependency of the 2D-FFT/IFFT, and acceleration of the power operation by a segmented look-up table. The Fast SeDDaRA is then proposed and specialized for low complexity. As the final implementation, a hardware image restoration system is built using multi-DSP parallel processing. Experimental results show that the processing time and memory demand of Fast SeDDaRA decrease by at least 50%, and the data throughput of the image restoration system is over 7.8 Msps. The optimization is proved efficient and feasible, and the Fast SeDDaRA is able to support real-time application.

  20. Facile "modular assembly" for fast construction of a highly oriented crystalline MOF nanofilm.

    PubMed

    Xu, Gang; Yamada, Teppei; Otsubo, Kazuya; Sakaida, Shun; Kitagawa, Hiroshi

    2012-10-10

    The preparation of crystalline, ordered thin films of metal-organic frameworks (MOFs) will be a critical process for MOF-based nanodevices in the future. MOF thin films with perfect orientation and excellent crystallinity were formed with novel nanosheet-structured components, Cu-TCPP [TCPP = 5,10,15,20-tetrakis(4-carboxyphenyl)porphyrin], by a new "modular assembly" strategy. The modular assembly process involves two steps: a "modularization" step is used to synthesize highly crystalline "modules" with a nanosized structure that can be conveniently assembled into a thin film in the following "assembly" step. With this method, MOF thin films can easily be set up on different substrates at very high speed with controllable thickness. This new approach also enabled us to prepare highly oriented crystalline thin films of MOFs that cannot be prepared in thin-film form by traditional techniques.

  1. Adaptable, modular, multi-purpose space vehicle backplane

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judd, Stephen; Dallmann, Nicholas; McCabe, Kevin

    An adaptable, modular, multi-purpose (AMM) space vehicle backplane may accommodate boards and components for various missions. The AMM backplane may provide a common hardware interface and common board-to-board communications. Components, connectors, test points, and sensors may be embedded directly into the backplane to provide additional functionality, diagnostics, and system access. Other space vehicle sections may plug directly into the backplane.

  2. Ultra-fast consensus of discrete-time multi-agent systems with multi-step predictive output feedback

    NASA Astrophysics Data System (ADS)

    Zhang, Wenle; Liu, Jianchang

    2016-04-01

    This article addresses the ultra-fast consensus problem of high-order discrete-time multi-agent systems based on a unified consensus framework. A novel multi-step predictive output mechanism is proposed under a directed communication topology containing a spanning tree. By predicting the outputs of a network several steps ahead and adding this information into the consensus protocol, it is shown that the asymptotic convergence factor is improved by a power of q + 1 compared to the routine consensus. The difficult problem of selecting the optimal control gain is solved well by introducing a variable called convergence step. In addition, the ultra-fast formation achievement is studied on the basis of this new consensus protocol. Finally, the ultra-fast consensus with respect to a reference model and robust consensus is discussed. Some simulations are performed to illustrate the effectiveness of the theoretical results.
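
    For context, the routine single-step consensus protocol that the predictive scheme accelerates can be simulated in a few lines: agents repeatedly average toward their in-neighbors through a row-stochastic matrix, and the asymptotic convergence factor is the second largest eigenvalue modulus of that matrix. The directed ring, step size and initial states below are arbitrary; the q-step predictive protocol itself is not reproduced here.

      import numpy as np

      # directed ring (each agent listens to its predecessor): contains a spanning tree
      n = 5
      A = np.zeros((n, n))
      for i in range(n):
          A[i, (i - 1) % n] = 1.0

      eps = 0.4                        # step size, smaller than 1 / (max in-degree)
      L = np.diag(A.sum(axis=1)) - A   # graph Laplacian
      P = np.eye(n) - eps * L          # Perron matrix of the routine protocol

      x = np.array([3.0, -1.0, 4.0, 0.5, 2.0])   # initial agent states
      for k in range(60):
          x = P @ x                    # routine one-step update x(k+1) = P x(k)
      print("states after 60 steps:", np.round(x, 4))

      # routine asymptotic convergence factor; the article's q-step predictive
      # protocol effectively raises this factor to the power q + 1
      factor = np.sort(np.abs(np.linalg.eigvals(P)))[-2]
      print("convergence factor (routine):", round(float(factor), 4))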

  3. From Structure to Function: A Comprehensive Compendium of Tools to Unveil Protein Domains and Understand Their Role in Cytokinesis.

    PubMed

    Rincon, Sergio A; Paoletti, Anne

    2016-01-01

    Unveiling the function of a novel protein is a challenging task that requires careful experimental design. Yeast cytokinesis is a conserved process that involves modular structural and regulatory proteins. For such proteins, an important step is to identify their domains and structural organization. Here we briefly discuss a collection of methods commonly used for sequence alignment and prediction of protein structure that represent powerful tools for the identification of homologous domains and the design of structure-function approaches to experimentally test the function of multi-domain proteins such as those implicated in yeast cytokinesis.

  4. An Investigation to Manufacturing Analytical Services Composition using the Analytical Target Cascading Method.

    PubMed

    Tien, Kai-Wen; Kulvatunyou, Boonserm; Jung, Kiwook; Prabhu, Vittaldas

    2017-01-01

    As cloud computing is increasingly adopted, the trend is to offer software functions as modular services and compose them into larger, more meaningful ones. The trend is attractive to analytical problems in the manufacturing system design and performance improvement domain because (1) finding a global optimum for the system is a complex problem and (2) sub-problems are typically compartmentalized by the organizational structure. However, solving sub-problems by independent services can result in a sub-optimal solution at the system level. This paper investigates the technique called Analytical Target Cascading (ATC) to coordinate the optimization of loosely coupled sub-problems, each of which may be modularly formulated by differing departments and solved by modular analytical services. The results demonstrate that ATC is a promising method in that it offers system-level optimal solutions that can scale up by exploiting distributed and modular executions while allowing easier management of the problem formulation.
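
    The coordination idea behind ATC can be sketched on a trivial shared-variable problem: two "departments" each minimize their own cost plus a quadratic penalty toward a system-level target, and the system level reconciles their responses while tightening the penalty. The cost functions, penalty update and convergence value below are illustrative assumptions, not the paper's formulation.

      from scipy.optimize import minimize_scalar

      # each department has its own local cost over a shared design variable z
      local_costs = [lambda z: (z - 3.0) ** 2,         # subsystem 1 prefers z = 3
                     lambda z: 2.0 * (z - 1.0) ** 2]   # subsystem 2 prefers z = 1

      def solve_subproblem(cost, target, w):
          """Modular analytical service: local cost plus quadratic penalty toward the target."""
          return minimize_scalar(lambda z: cost(z) + w * (z - target) ** 2).x

      target, w = 0.0, 1.0
      for outer in range(30):
          responses = [solve_subproblem(cost, target, w) for cost in local_costs]
          target = sum(responses) / len(responses)   # system level reconciles the responses
          w *= 1.3                                   # tighten the consistency penalty
      print("coordinated shared variable:", round(target, 4))   # approaches the joint optimum 5/3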

  5. Multi-objective design optimization of antenna structures using sequential domain patching with automated patch size determination

    NASA Astrophysics Data System (ADS)

    Koziel, Slawomir; Bekasiewicz, Adrian

    2018-02-01

    In this article, a simple yet efficient and reliable technique for fully automated multi-objective design optimization of antenna structures using sequential domain patching (SDP) is discussed. The optimization procedure according to SDP is a two-step process: (i) obtaining the initial set of Pareto-optimal designs representing the best possible trade-offs between considered conflicting objectives, and (ii) Pareto set refinement for yielding the optimal designs at the high-fidelity electromagnetic (EM) simulation model level. For the sake of computational efficiency, the first step is realized at the level of a low-fidelity (coarse-discretization) EM model by sequential construction and relocation of small design space segments (patches) in order to create a path connecting the extreme Pareto front designs obtained beforehand. The second stage involves response correction techniques and local response surface approximation models constructed by reusing EM simulation data acquired in the first step. A major contribution of this work is an automated procedure for determining the patch dimensions. It allows for appropriate selection of the number of patches for each geometry variable so as to ensure reliability of the optimization process while maintaining its low cost. The importance of this procedure is demonstrated by comparing it with uniform patch dimensions.

  6. Optimal Network Modularity for Information Diffusion

    NASA Astrophysics Data System (ADS)

    Nematzadeh, Azadeh; Ferrara, Emilio; Flammini, Alessandro; Ahn, Yong-Yeol

    2014-08-01

    We investigate the impact of community structure on information diffusion with the linear threshold model. Our results demonstrate that modular structure may have counterintuitive effects on information diffusion when social reinforcement is present. We show that strong communities can facilitate global diffusion by enhancing local, intracommunity spreading. Using both analytic approaches and numerical simulations, we demonstrate the existence of an optimal network modularity, where global diffusion requires the minimal number of early adopters.
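
    A minimal simulation of the linear threshold model on a two-community network, of the kind used to study how modularity affects diffusion, is sketched below. The block sizes, connection probabilities, threshold and seed count are arbitrary illustrative values and do not reproduce the paper's analysis.

      import numpy as np
      import networkx as nx

      def linear_threshold(G, seeds, theta=0.15):
          """Sweep until no change: a node adopts once the fraction of its
          adopted neighbors reaches the threshold theta."""
          active = set(seeds)
          changed = True
          while changed:
              changed = False
              for v in G:
                  if v in active:
                      continue
                  nbrs = list(G[v])
                  if nbrs and sum(u in active for u in nbrs) / len(nbrs) >= theta:
                      active.add(v)
                      changed = True
          return active

      # two communities; within- and between-block probabilities set the modularity
      G = nx.stochastic_block_model([100, 100], [[0.10, 0.02], [0.02, 0.10]], seed=1)
      rng = np.random.default_rng(1)
      seeds = [int(v) for v in rng.choice(100, size=20, replace=False)]  # early adopters in block 0
      active = linear_threshold(G, seeds)
      print("final adoption fraction:", len(active) / G.number_of_nodes())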

  7. Modular Subsea Monitoring Network (MSM) - Realizing Integrated Environmental Monitoring Solutions

    NASA Astrophysics Data System (ADS)

    Mosch, Thomas; Fietzek, Peer

    2016-04-01

    In a variety of scientific and industrial application areas, ranging from the supervision of hydrate fields and the detection and localization of fugitive emissions from subsea oil and gas production to fish farming, fixed-point observatories are useful and widely applied. They monitor the water column and/or are placed at the sea floor over long periods of time. They are essential oceanographic platforms for providing valuable long-term time-series data and multi-parameter measurements. Various mooring and observatory endeavors world-wide contribute valuable data needed for understanding our planet's ocean systems and biogeochemical processes. Continuously powered cabled observatories enable real-time data transmission from spots of interest close to the shore or to ocean infrastructures. Independent of the design of the observatories, they all rely on sensors, which demand regular maintenance. In most cases this implies cost-intensive maintenance on a regular time basis for the entire sensor-carrying fixed platform. For long-term monitoring it is mandatory to address this cost by enhancing hardware efficiency. On the basis of two examples of use from the area of hydrate monitoring (off Norway and Japan) we will present the concept of the Modular Subsea Monitoring Network (MSM). The modular, scalable and networking capabilities of the MSM allow for an easy adaptation to different monitoring tasks. Intelligent power management, the combination of chemical and acoustical sensors, adaptation of the payload according to the monitoring tasks, autonomous powering, a modular design for easy transportation, storage and mobilization, and Vessel-of-Opportunity-borne launching and recovery capability with a video-guided launcher system and a rope recovery system are key features addressed during the development of the MSM. Step by step, the MSM concept applied to the observatory hardware will also be extended towards the gathered data to maximize the efficiency of subsea monitoring in a variety of applications.

  8. Multi-objective optimization of process parameters of multi-step shaft formed with cross wedge rolling based on orthogonal test

    NASA Astrophysics Data System (ADS)

    Han, S. T.; Shu, X. D.; Shchukin, V.; Kozhevnikova, G.

    2018-06-01

    In order to achieve reasonable process parameters for forming multi-step shafts by cross wedge rolling, the rolling-forming process of a multi-step shaft was studied with the DEFORM-3D finite element software. An interactive orthogonal experiment was used to study the effect of eight parameters, namely the first section shrinkage rate φ1, the first forming angle α1, the first spreading angle β1, the first spreading length L1, the second section shrinkage rate φ2, the second forming angle α2, the second spreading angle β2 and the second spreading length L2, on the quality of the shaft end and the microstructure uniformity. By using the fuzzy mathematics comprehensive evaluation method and extreme difference analysis, the order of influence of the process parameters on the quality of the multi-step shaft is obtained: β2 > φ2 > L1 > α1 > β1 > φ1 > α2 > L2. The results of the study can provide guidance for obtaining multi-step shafts with high mechanical properties and achieving near-net forming without a stub bar in cross wedge rolling.

  9. Low Earth Orbital Mission Aboard the Space Test Experiments Platform (STEP-3)

    NASA Technical Reports Server (NTRS)

    Brinza, David E.

    1992-01-01

    A discussion of the Space Active Modular Materials Experiments (SAMMES) is presented in vugraph form. The discussion is divided into three sections: (1) a description of SAMMES; (2) a SAMMES/STEP-3 mission overview; and (3) SAMMES follow on efforts. The SAMMES/STEP-3 mission objectives are as follows: assess LEO space environmental effects on SDIO materials; quantify orbital and local environments; and demonstrate the modular experiment concept.

  10. A new multi-scale method to reveal hierarchical modular structures in biological networks.

    PubMed

    Jiao, Qing-Ju; Huang, Yan; Shen, Hong-Bin

    2016-11-15

    Biological networks are effective tools for studying molecular interactions. Modular structure, in which genes or proteins may tend to be associated with functional modules or protein complexes, is a remarkable feature of biological networks. Mining modular structure from biological networks enables us to focus on a set of potentially important nodes, which provides a reliable guide to future biological experiments. The first fundamental challenge in mining modular structure from biological networks is that the quality of the observed network data is usually low owing to noise and incompleteness in the obtained networks. The second problem that poses a challenge to existing approaches to the mining of modular structure is that the organization of both functional modules and protein complexes in networks is far more complicated than was ever thought. For instance, the sizes of different modules vary considerably from each other and they often form multi-scale hierarchical structures. To solve these problems, we propose a new multi-scale protocol for mining modular structure (named ISIMB) driven by a node similarity metric, which works in an iteratively converged space to reduce the effects of the low data quality of the observed network data. The multi-scale node similarity metric couples both the local and the global topology of the network with a resolution regulator. By varying this resolution regulator to give different weightings to the local and global terms in the metric, the ISIMB method is able to fit the shape of modules and to detect them on different scales. Experiments on protein-protein interaction and genetic interaction networks show that our method can not only mine functional modules and protein complexes successfully, but can also predict functional modules from specific to general and reveal the hierarchical organization of protein complexes.

  11. Optimal design of an alignment-free two-DOF rehabilitation robot for the shoulder complex.

    PubMed

    Galinski, Daniel; Sapin, Julien; Dehez, Bruno

    2013-06-01

    This paper presents the optimal design of an alignment-free exoskeleton for the rehabilitation of the shoulder complex. This robot structure is constituted of two actuated joints and is linked to the arm through passive degrees of freedom (DOFs) to drive the flexion-extension and abduction-adduction movements of the upper arm. The optimal design of this structure is performed through two steps. The first step is a multi-objective optimization process aiming to find the best parameters characterizing the robot and its position relative to the patient. The second step is a comparison process aiming to select the best solution from the optimization results on the basis of several criteria related to practical considerations. The optimal design process leads to a solution outperforming an existing solution on aspects as kinematics or ergonomics while being more simple.

  12. Modular injector integrated linear apparatus with motion profile optimization for spatial atomic layer deposition.

    PubMed

    Wang, Xiaolei; Li, Yun; Lin, Jilong; Shan, Bin; Chen, Rong

    2017-11-01

    A spatial atomic layer deposition apparatus integrated with a modular injector and a linear motor has been designed. It consists of four parts: a precursor delivery manifold, a modular injector, a reaction zone, and a driving unit. An injector with multi-layer structured channels is designed to help improve precursor distribution homogeneity. During the back and forth movement of the substrate at high speed, the inertial impact caused by jerk and sudden changes of acceleration will degrade the film deposition quality. Such residual vibration caused by inertial impact will aggravate the fluctuation of the gap distance between the injector and the substrate in the deposition process. Thus, an S-curve motion profile is implemented to reduce the large inertial impact, and the maximum position error could be reduced by 84%. The microstructure of the film under the S-curve motion profile shows smaller root-mean-square and scanning voltage amplitude under an atomic force microscope, which verifies the effectiveness of the S-curve motion profile in reducing the residual vibration and stabilizing the gap distance between the injector and the substrate. The film deposition rate could reach 100 nm/min while maintaining good uniformity without obvious periodic patterns on the surface.

  13. Modular injector integrated linear apparatus with motion profile optimization for spatial atomic layer deposition

    NASA Astrophysics Data System (ADS)

    Wang, Xiaolei; Li, Yun; Lin, Jilong; Shan, Bin; Chen, Rong

    2017-11-01

    A spatial atomic layer deposition apparatus integrated with a modular injector and a linear motor has been designed. It consists of four parts: a precursor delivery manifold, a modular injector, a reaction zone, and a driving unit. An injector with multi-layer structured channels is designed to help improve precursor distribution homogeneity. During the back and forth movement of the substrate at high speed, the inertial impact caused by jerk and sudden changes of acceleration will degrade the film deposition quality. Such residual vibration caused by inertial impact will aggravate the fluctuation of the gap distance between the injector and the substrate in the deposition process. Thus, an S-curve motion profile is implemented to reduce the large inertial impact, and the maximum position error could be reduced by 84%. The microstructure of the film under the S-curve motion profile shows smaller root-mean-square and scanning voltage amplitude under an atomic force microscope, which verifies the effectiveness of the S-curve motion profile in reducing the residual vibration and stabilizing the gap distance between the injector and the substrate. The film deposition rate could reach 100 nm/min while maintaining good uniformity without obvious periodic patterns on the surface.
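
    The role of the S-curve motion profile, limiting jerk so that acceleration changes smoothly instead of stepping abruptly, can be illustrated by integrating a piecewise-constant jerk command. The limits, move distance and time step below are illustrative values, not the parameters of the actual linear-motor apparatus, and the sketch assumes the move is long enough to reach both the acceleration and velocity limits.

      import numpy as np

      def s_curve_profile(distance, v_max=0.2, a_max=1.0, j_max=10.0, dt=1e-4):
          """Symmetric jerk-limited (S-curve) point-to-point profile, integrated numerically."""
          tj = a_max / j_max                    # jerk ramp duration
          ta = v_max / a_max - tj               # constant-acceleration duration
          t_acc = 2 * tj + ta                   # total acceleration-phase duration
          d_acc = 0.5 * v_max * t_acc           # distance covered while speeding up
          tv = (distance - 2 * d_acc) / v_max   # constant-velocity (cruise) duration
          assert ta >= 0 and tv >= 0, "move too short for these limits"

          segments = [(+j_max, tj), (0.0, ta), (-j_max, tj), (0.0, tv),
                      (-j_max, tj), (0.0, ta), (+j_max, tj)]
          jerk = np.concatenate([np.full(int(round(T / dt)), J) for J, T in segments])
          acc = np.cumsum(jerk) * dt            # acceleration is continuous (no inertial shock)
          vel = np.cumsum(acc) * dt
          pos = np.cumsum(vel) * dt
          return pos, vel, acc

      pos, vel, acc = s_curve_profile(distance=0.1)
      print("end position: %.4f m, peak velocity: %.3f m/s, peak acceleration: %.2f m/s^2"
            % (pos[-1], vel.max(), acc.max()))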

  14. Standardized Modular Power Interfaces for Future Space Explorations Missions

    NASA Technical Reports Server (NTRS)

    Oeftering, Richard

    2015-01-01

    Earlier studies show that future human exploration missions are composed of multi-vehicle assemblies with interconnected electric power systems. Some vehicles are often intended to serve as flexible multi-purpose or multi-mission platforms. This drives the need for power architectures that can be reconfigured to support this level of flexibility. Power system developmental costs can be reduced, program wide, by utilizing a common set of modular building blocks. Further, there are mission operational and logistics cost benefits of using a common set of modular spares. These benefits are the goals of the Advanced Exploration Systems (AES) Modular Power System (AMPS) project. A common set of modular blocks requires a substantial level of standardization in terms of the Electrical, Data System, and Mechanical interfaces. The AMPS project is developing a set of proposed interface standards that will provide useful guidance for modular hardware developers but not needlessly constrain technology options or limit future growth in capability. In 2015 the AMPS project focused on standardizing the interfaces between the elements of spacecraft power distribution and energy storage. The development of the modular power standard starts with establishing mission assumptions and ground rules to define the design application space. The standards are defined in terms of AMPS objectives including Commonality, Reliability-Availability, Flexibility-Configurability and Supportability-Reusability. The proposed standards are aimed at assembly and sub-assembly level building blocks. AMPS plans to adopt existing standards for spacecraft command and data, software, network interfaces, and electrical power interfaces where applicable. Other standards, including structural encapsulation, heat transfer, and fluid transfer, are governed by launch and spacecraft environments and bound by practical limitations of weight and volume. Developing these mechanical interface standards is more difficult but is an essential part of defining the physical building blocks of modular power. This presentation describes the AMPS project's progress towards standardized modular power interfaces.

  15. Modular, Reconfigurable, High-Energy Systems Stepping Stones

    NASA Technical Reports Server (NTRS)

    Howell, Joe T.; Carrington, Connie K.; Mankins, John C.

    2005-01-01

    Modular, Reconfigurable, High-Energy Systems are Stepping Stones to provide capabilities for energy-rich infrastructure strategically located in space to support a variety of exploration scenarios. Abundant renewable energy at lunar or L1 locations could support propellant production and storage in refueling scenarios that enable affordable exploration. Renewable energy platforms in geosynchronous Earth orbits can collect and transmit power to satellites, or to Earth-surface locations. Energy-rich space technologies also enable the use of electric-powered propulsion systems that could efficiently deliver cargo and exploration facilities to remote locations. A first step to an energy-rich space infrastructure is a 100-kWe class solar-powered platform in Earth orbit. The platform would utilize advanced technologies in solar power collection and generation, power management and distribution, thermal management, and electric propulsion. It would also provide a power-rich free-flying platform to demonstrate in space a portfolio of technology flight experiments. This paper presents a preliminary design concept for a 100-kWe solar-powered satellite with the capability to flight-demonstrate a variety of payload experiments and to utilize electric propulsion. State-of-the-art solar concentrators, highly efficient multi-junction solar cells, integrated thermal management on the arrays, and innovative deployable structure design and packaging make the 100-kW satellite feasible for launch on one existing launch vehicle. Higher voltage arrays and power management and distribution (PMAD) systems reduce or eliminate the need for massive power converters, and could enable direct-drive of high-voltage solar electric thrusters.

  16. Modular, Reconfigurable, and Rapid Response Space Systems: The Remote Sensing Advanced Technology Microsatellite

    NASA Technical Reports Server (NTRS)

    Esper, Jaime; Andary, Jim; Oberright, John; So, Maria; Wegner, Peter; Hauser, Joe

    2004-01-01

    Modular, Reconfigurable, and Rapid-response (MR2) space systems represent a paradigm shift in the way space assets of all sizes are designed, manufactured, integrated, tested, and flown. This paper will describe the MR2 paradigm in detail, and will include guidelines for its implementation. The Remote Sensing Advanced Technology microsatellite (RSAT) is a proposed flight system test-bed used for developing and implementing principles and best practices for MR2 spacecraft, and their supporting infrastructure. The initial goal of this test-bed application is to produce a lightweight (approx. 100 kg), production-minded, cost-effective, and scalable remote sensing micro-satellite capable of high performance and broad applicability. Such applications range from future distributed space systems, to sensor-webs, and rapid-response satellite systems. Architectures will be explored that strike a balance between modularity and integration while preserving the MR2 paradigm. Modularity versus integration has always been a point of contention when approaching a design: whereas one-of-a-kind missions may require close integration resulting in performance optimization, multiple and flexible application spacecraft benefit from modularity, resulting in maximum flexibility. The process of building spacecraft rapidly (< 7 days), requires a concerted and methodical look at system integration and test processes and pitfalls. Although the concept of modularity is not new and was first developed in the 1970s by NASA's Goddard Space Flight Center (Multi-Mission Modular Spacecraft), it was never modernized and was eventually abandoned. Such concepts as the Rapid Spacecraft Development Office (RSDO) became the preferred method for acquiring satellites. Notwithstanding, over the past 30 years technology has advanced considerably, and the time is ripe to reconsider modularity in its own right, as an enabler of R2, and as a key element of transformational systems. The MR2 architecture provides a competitive advantage over the old modular approach in its rapid response to market needs that are difficult to predict both from the perspectives of evolving technology, as well as mission and application requirements.

  17. A self optimizing synthetic organic reactor system using real-time in-line NMR spectroscopy

    PubMed Central

    Sans, Victor; Porwol, Luzian; Dragone, Vincenza

    2015-01-01

    A configurable platform for synthetic chemistry incorporating an in-line benchtop NMR that is capable of monitoring and controlling organic reactions in real-time is presented. The platform is controlled via a modular LabView software control system for the hardware, NMR, data analysis and feedback optimization. Using this platform we report the real-time advanced structural characterization of reaction mixtures, including 19F, 13C, DEPT, 2D NMR spectroscopy (COSY, HSQC and 19F-COSY) for the first time. Finally, the potential of this technique is demonstrated through the optimization of a catalytic organic reaction in real-time, showing its applicability to self-optimizing systems using criteria such as stereoselectivity, multi-nuclear measurements or 2D correlations. PMID:29560211

  18. Analysis of In-Space Assembly of Modular Systems

    NASA Technical Reports Server (NTRS)

    Moses, Robert W.; VanLaak, James; Johnson, Spencer L.; Chytka, Trina M.; Reeves, John D.; Todd, B. Keith; Moe, Rud V.; Stambolian, Damon B.

    2005-01-01

    Early system-level life cycle assessments facilitate cost effective optimization of system architectures to enable implementation of both modularity and in-space assembly, two key Exploration Systems Research & Technology (ESR&T) Strategic Challenges. Experiences with the International Space Station (ISS) demonstrate that the absence of this rigorous analysis can result in increased cost and operational risk. An effort is underway, called Analysis of In-Space Assembly of Modular Systems, to produce an innovative analytical methodology, including an evolved analysis toolset and proven processes in a collaborative engineering environment, to support the design and evaluation of proposed concepts. The unique aspect of this work is that it will produce the toolset, techniques and initial products to analyze and compare the detailed, life cycle costs and performance of different implementations of modularity for in-space assembly. A multi-Center team consisting of experienced personnel from the Langley Research Center, Johnson Space Center, Kennedy Space Center, and the Goddard Space Flight Center has been formed to bring their resources and experience to this development. At the end of this 30-month effort, the toolset will be ready to support the Exploration Program with an integrated assessment strategy that embodies all life-cycle aspects of the mission from design and manufacturing through operations to enable early and timely selection of an optimum solution among many competing alternatives. Already there are many different designs for crewed missions to the Moon that present competing views of modularity requiring some in-space assembly. The purpose of this paper is to highlight the approach for scoring competing designs.

  19. Multidisciplinary Analysis and Optimization Generation 1 and Next Steps

    NASA Technical Reports Server (NTRS)

    Naiman, Cynthia Gutierrez

    2008-01-01

    The Multidisciplinary Analysis & Optimization Working Group (MDAO WG) of the Systems Analysis Design & Optimization (SAD&O) discipline in the Fundamental Aeronautics Program's Subsonic Fixed Wing (SFW) project completed three major milestones during Fiscal Year (FY)08: "Requirements Definition" Milestone (1/31/08); "GEN 1 Integrated Multi-disciplinary Toolset" (Annual Performance Goal) (6/30/08); and "Define Architecture & Interfaces for Next Generation Open Source MDAO Framework" Milestone (9/30/08). Details of all three milestones are explained including documentation available, potential partner collaborations, and next steps in FY09.

  20. A modular method for the extraction of DNA and RNA, and the separation of DNA pools from diverse environmental sample types

    PubMed Central

    Lever, Mark A.; Torti, Andrea; Eickenbusch, Philip; Michaud, Alexander B.; Šantl-Temkiv, Tina; Jørgensen, Bo Barker

    2015-01-01

    A method for the extraction of nucleic acids from a wide range of environmental samples was developed. This method consists of several modules, which can be individually modified to maximize yields in extractions of DNA and RNA or separations of DNA pools. Modules were designed based on elaborate tests, in which permutations of all nucleic acid extraction steps were compared. The final modular protocol is suitable for extractions from igneous rock, air, water, and sediments. Sediments range from high-biomass, organic rich coastal samples to samples from the most oligotrophic region of the world's oceans and the deepest borehole ever studied by scientific ocean drilling. Extraction yields of DNA and RNA are higher than with widely used commercial kits, indicating an advantage to optimizing extraction procedures to match specific sample characteristics. The ability to separate soluble extracellular DNA pools without cell lysis from intracellular and particle-complexed DNA pools may enable new insights into the cycling and preservation of DNA in environmental samples in the future. A general protocol is outlined, along with recommendations for optimizing this general protocol for specific sample types and research goals. PMID:26042110

  1. Multi-objective evolutionary algorithms for fuzzy classification in survival prediction.

    PubMed

    Jiménez, Fernando; Sánchez, Gracia; Juárez, José M

    2014-03-01

    This paper presents a novel rule-based fuzzy classification methodology for survival/mortality prediction in severe burn patients. Due to the ethical aspects involved in this medical scenario, physicians tend not to accept a computer-based evaluation unless they understand why and how such a recommendation is given. Therefore, any fuzzy classifier model must be both accurate and interpretable. The proposed methodology is a three-step process: (1) multi-objective constrained optimization of a patient's data set, using Pareto-based elitist multi-objective evolutionary algorithms to maximize accuracy and minimize the complexity (number of rules) of classifiers, subject to interpretability constraints; this step produces a set of alternative (Pareto) classifiers; (2) linguistic labeling, which assigns a linguistic label to each fuzzy set of the classifiers; this step is essential to the interpretability of the classifiers; (3) decision making, whereby a classifier is chosen, if it is satisfactory, according to the preferences of the decision maker. If no classifier is satisfactory for the decision maker, the process starts again in step (1) with a different input parameter set. The performance of three multi-objective evolutionary algorithms, the niched pre-selection multi-objective algorithm, the elitist Pareto-based multi-objective evolutionary algorithm for diversity reinforcement (ENORA) and the non-dominated sorting genetic algorithm (NSGA-II), was tested using a patient data set from an intensive care burn unit and a data set from a standard machine learning repository. The results are compared using the hypervolume multi-objective metric. In addition, the results have been compared with other non-evolutionary techniques and validated with a multi-objective cross-validation technique. Our proposal improves the classification rate obtained by other non-evolutionary techniques (decision trees, artificial neural networks, Naive Bayes, and case-based reasoning), obtaining with ENORA a classification rate of 0.9298, specificity of 0.9385, and sensitivity of 0.9364, with 14.2 interpretable fuzzy rules on average. Our proposal improves the accuracy and interpretability of the classifiers compared with other non-evolutionary techniques. We also conclude that ENORA outperforms the niched pre-selection and NSGA-II algorithms. Moreover, given that our multi-objective evolutionary methodology is non-combinatorial and based on real-parameter optimization, the time cost is significantly reduced compared with other evolutionary approaches in the literature based on combinatorial optimization.
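
    Step (1) above hinges on keeping only non-dominated classifiers when accuracy is maximized and rule count minimized. The sketch below shows that selection step in isolation; the (accuracy, number-of-rules) scores are hypothetical and the full evolutionary loop (variation, elitism) is omitted.

    ```python
    from typing import List, Tuple

    def dominates(a: Tuple[float, int], b: Tuple[float, int]) -> bool:
        """True if classifier a dominates b: accuracy no worse, rules no more,
        and strictly better in at least one of the two objectives."""
        acc_a, rules_a = a
        acc_b, rules_b = b
        return (acc_a >= acc_b and rules_a <= rules_b) and (acc_a > acc_b or rules_a < rules_b)

    def pareto_front(candidates: List[Tuple[float, int]]) -> List[Tuple[float, int]]:
        """Return the non-dominated subset (maximize accuracy, minimize rule count)."""
        front = []
        for i, c in enumerate(candidates):
            if not any(dominates(o, c) for j, o in enumerate(candidates) if j != i):
                front.append(c)
        return front

    # Hypothetical (accuracy, number-of-rules) scores for candidate fuzzy classifiers
    candidates = [(0.93, 14), (0.91, 9), (0.93, 18), (0.88, 6), (0.90, 9), (0.86, 4)]
    print(pareto_front(candidates))   # -> [(0.93, 14), (0.91, 9), (0.88, 6), (0.86, 4)]
    ```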

  2. Design of a Modular Monolithic Implicit Solver for Multi-Physics Applications

    NASA Technical Reports Server (NTRS)

    Carton De Wiart, Corentin; Diosady, Laslo T.; Garai, Anirban; Burgess, Nicholas; Blonigan, Patrick; Ekelschot, Dirk; Murman, Scott M.

    2018-01-01

    The design of a modular multi-physics high-order space-time finite-element framework is presented together with its extension to allow monolithic coupling of different physics. One of the main objectives of the framework is to perform efficient high-fidelity simulations of capsule/parachute systems. This problem requires simulating multiple physics including, but not limited to, the compressible Navier-Stokes equations, the dynamics of a moving body with mesh deformations and adaptation, the linear shell equations, non-reflective boundary conditions and wall modeling. The solver is based on high-order space-time finite-element methods. Continuous, discontinuous and C1-discontinuous Galerkin methods are implemented, allowing one to discretize various physical models. Tangent and adjoint sensitivity analyses are also targeted in order to conduct gradient-based optimization, error estimation, mesh adaptation, and flow control, adding another layer of complexity to the framework. The decisions made to tackle these challenges are presented. The discussion focuses first on the "single-physics" solver and later on its extension to the monolithic coupling of different physics. The implementation of different physics modules, relevant to the capsule/parachute system, is also presented. Finally, examples of coupled computations are presented, paving the way to the simulation of the full capsule/parachute system.

  3. FACETS: multi-faceted functional decomposition of protein interaction networks.

    PubMed

    Seah, Boon-Siew; Bhowmick, Sourav S; Dewey, C Forbes

    2012-10-15

    The availability of large-scale curated protein interaction datasets has given rise to the opportunity to investigate higher level organization and modularity within the protein-protein interaction (PPI) network using graph theoretic analysis. Despite the recent progress, systems-level analysis of high-throughput PPIs remains a daunting task because of the amount of data they present. In this article, we propose a novel PPI network decomposition algorithm called FACETS in order to make sense of the deluge of interaction data using Gene Ontology (GO) annotations. FACETS finds not just a single functional decomposition of the PPI network, but a multi-faceted atlas of functional decompositions that portray alternative perspectives of the functional landscape of the underlying PPI network. Each facet in the atlas represents a distinct interpretation of how the network can be functionally decomposed and organized. Our algorithm maximizes the interpretative value of the atlas by optimizing inter-facet orthogonality and intra-facet cluster modularity. We tested our algorithm on the global networks from IntAct, and compared it with gold standard datasets from MIPS and KEGG, demonstrating the performance of FACETS. We also performed a case study that illustrates the utility of our approach. Supplementary data are available at Bioinformatics online. Our software is available freely for non-commercial purposes from: http://www.cais.ntu.edu.sg/~assourav/Facets/

  4. ICARUS 600 ton: A status report

    NASA Astrophysics Data System (ADS)

    Vignoli, C.; Arneodo, F.; Badertscher, A.; Barbieri, E.; Benetti, P.; di Tigliole, A. Borio; Brunetti, R.; Bueno, A.; Calligarich, E.; Campanelli, M.; Carli, F.; Carpanese, C.; Cavalli, D.; Cavanna, F.; Cennini, P.; Centro, S.; Cesana, A.; Chen, C.; Chen, Y.; Cinquini, C.; Cline, D.; De Mitri, I.; Dolfini, R.; Favaretto, D.; Ferrari, A.; Berzolari, A. Gigli; Goudsmit, P.; He, K.; Huang, X.; Li, Z.; Lu, F.; Ma, J.; Mannocchi, G.; Mauri, F.; Mazza, D.; Mazzone, L.; Montanari, C.; Nurzia, G. P.; Otwinowski, S.; Palamara, O.; Pascoli, D.; Pepato, A.; Periale, L.; Petrera, S.; Piano-Mortari, G.; Piazzoli, A.; Picchi, P.; Pietropaolo, F.; Rancati, T.; Rappoldi, A.; Raselli, G. L.; Rebuzzi, D.; Revol, J. P.; Rico, J.; Rossella, M.; Rossi, C.; Rubbia, A.; Rubbia, C.; Sala, P.; Scannicchio, D.; Sergiampietri, F.; Suzuki, S.; Terrani, M.; Ventura, S.; Verdecchia, M.; Wang, H.; Woo, J.; Xu, G.; Xu, Z.; Zhang, C.; Zhang, Q.; Zheng, S.

    2000-05-01

    The goal of the ICARUS Project is the installation of a multi-kiloton LAr TPC in the underground Gran Sasso Laboratory. The programme foresees the realization of the detector in a modular way. The first step is the construction of a 600 ton module which is now at an advanced phase. It will be mounted and tested in Pavia in one year and then it will be moved to Gran Sasso for the final operation. The major cryogenic and purification systems and the mechanical components of the detector have been constructed and tested in a 10 m³ prototype. The results of these tests are here summarized.

  5. Multi-reactor power system configurations for multimegawatt nuclear electric propulsion

    NASA Technical Reports Server (NTRS)

    George, Jeffrey A.

    1991-01-01

    A modular, multi-reactor power system and vehicle configuration for piloted nuclear electric propulsion (NEP) missions to Mars is presented. Such a design could provide enhanced system and mission reliability, allowing a comfortable safety margin for early manned flights, and would allow a range of piloted and cargo missions to be performed with a single power system design. Early use of common power modules for cargo missions would also provide progressive flight experience and validation of standardized systems for use in later piloted applications. System and mission analysis are presented to compare single and multi-reactor configurations for piloted Mars missions. A conceptual design for the Hydra modular multi-reactor NEP vehicle is presented.

  6. Combined Economic and Hydrologic Modeling to Support Collaborative Decision Making Processes

    NASA Astrophysics Data System (ADS)

    Sheer, D. P.

    2008-12-01

    For more than a decade, the core concept of the author's efforts in support of collaborative decision making has been a combination of hydrologic simulation and multi-objective optimization. The modeling has generally been used to support collaborative decision making processes. The OASIS model developed by HydroLogics Inc. solves a multi-objective optimization at each time step using a mixed integer linear program (MILP). The MILP can be configured to include any user-defined objective, including but not limited to economic objectives. For example, estimated marginal values of water for crops and M&I use were included in the objective function to drive trades in a model of the lower Rio Grande. The formulation of the MILP, constraints and objectives, in any time step is conditional: it changes based on the value of state variables and dynamic external forcing functions, such as rainfall, hydrology, market prices, arrival of migratory fish, water temperature, etc. It therefore acts as a dynamic short-term multi-objective economic optimization for each time step. MILP is capable of solving a general problem that includes a very realistic representation of the physical system characteristics in addition to the normal multi-objective optimization objectives and constraints included in economic models. In all of these models, the short-term objective function is a surrogate for achieving long-term multi-objective results. The long-term performance of any alternative (especially including operating strategies) is evaluated by simulation. An operating rule is the combination of conditions, parameters, constraints and objectives used to determine the formulation of the short-term optimization in each time step. Heuristic wrappers for the simulation program have been developed to improve the parameters of an operating rule, and research is underway on a wrapper that will allow us to employ a genetic algorithm to improve the form of the rule (conditions, constraints, and short-term objectives) as well. In these models, operating rules represent different models of human behavior, and the objective of the modeling is to find rules for human behavior that perform well in terms of long-term human objectives. The conceptual model used to represent human behavior incorporates economic multi-objective optimization for surrogate objectives, and rules that set those objectives based on current conditions, accounting for uncertainty at least implicitly. The author asserts that real-world operating rules follow this form and have evolved because they have been perceived as successful in the past. Thus, the modeling efforts focus on human behavior in much the same way that economic models focus on human behavior. This paper illustrates the above concepts with real-world examples.
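
    The core computational step described above is a small optimization solved anew at every time step, with the current storage, inflow and marginal values fixing that step's formulation. The sketch below is not the OASIS formulation; it is a toy linear program in that spirit (deliveries to two users with hypothetical marginal values, a mass-balance constraint and a minimum environmental flow), solved with scipy's linprog.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def allocate_one_step(storage, inflow, demands, values, min_env_flow):
        """Solve one time step of a toy water-allocation LP.

        Decision variables: deliveries to each user plus end-of-step storage.
        Objective: maximize sum of (marginal value x delivery), a stand-in for the
        short-term surrogate objective solved at each step.
        """
        n = len(demands)
        available = storage + inflow - min_env_flow
        c = np.concatenate([-np.asarray(values, float), [0.0]])   # maximize -> minimize negative
        A_eq = np.ones((1, n + 1))                                 # mass balance: deliveries + carry-over = available
        b_eq = [available]
        bounds = [(0, d) for d in demands] + [(0, None)]
        res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
        return res.x[:n], res.x[n]

    # Hypothetical step: 50 units in storage, 20 of inflow, environmental flow of 10
    deliveries, carryover = allocate_one_step(
        storage=50, inflow=20, demands=[30, 25], values=[120.0, 80.0], min_env_flow=10)
    print(deliveries, carryover)   # crop and M&I deliveries, remaining storage
    ```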

  7. Advanced I&C for Fault-Tolerant Supervisory Control of Small Modular Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, Daniel G.

    In this research, we have developed a supervisory control approach to enable automated control of SMRs. By design the supervisory control system has a hierarchical, interconnected, adaptive control architecture. A considerable advantage of this architecture is that it allows subsystems to communicate at different/finer granularity, facilitates monitoring of processes at the modular and plant levels, and enables supervisory control. We have investigated the deployment of automation, monitoring, and data collection technologies to enable operation of multiple SMRs. Each unit's controller collects and transfers information from local loops and optimizes that unit's parameters. Information is passed from each SMR unit controller to the supervisory controller, which supervises the actions of the SMR units and manages plant processes. The information processed at the supervisory level provides operators with the information needed for reactor, unit, and plant operation. In conjunction with the supervisory effort, we have investigated techniques for fault-tolerant networks, over which information is transmitted between local loops and the supervisory controller to maintain a safe level of operational normalcy in the presence of anomalies. The fault tolerance of the supervisory control architecture, the network that supports it, and the impact of fault tolerance on multi-unit SMR plant control have been a second focus of this research. To this end, we have investigated the deployment of advanced automation, monitoring, and data collection and communications technologies to enable operation of multiple SMRs. We have created a fault-tolerant multi-unit SMR supervisory controller that collects and transfers information from local loops, supervises their actions, and adaptively optimizes the controller parameters. The goal of this research has been to develop the methodologies and procedures for fault-tolerant supervisory control of small modular reactors. To achieve this goal, we have identified the following objectives, which form an ordered approach to the research: I) development of a supervisory digital I&C system; II) fault tolerance of the supervisory control architecture; III) automated decision making and online monitoring.

  8. Space Debris Removal Using Multi-Mission Modular Spacecraft

    NASA Astrophysics Data System (ADS)

    Savioli, L.; Francesconi, A.; Maggi, F.; Olivieri, L.; Lorenzini, E.; Pardini, C.

    2013-08-01

    The study and development of ADR missions in LEO have become a topic of interest for the space community, since future space flight activities could be threatened by collisional cascade events. This paper presents the analysis of an ADR mission scenario where modular remover kits are employed to de-orbit selected debris in SSO, while a distinct space tug performs the orbital transfers and rendezvous manoeuvres, and installs the remover kits on the client debris. Electro-dynamic tether and electric propulsion are considered as de-orbiting alternatives, while chemical propulsion is employed for the space tug. The total remover mass and de-orbiting time are identified as key parameters to compare the performances of the two de-orbiting options, while an optimization of the ΔV required to move between five selected objects is performed for a preliminary system-level design of the space tug. Final controlled re-entry is also considered and performed by means of a hybrid engine.

  9. Rosetta:MSF: a modular framework for multi-state computational protein design.

    PubMed

    Löffler, Patrick; Schmitz, Samuel; Hupfeld, Enrico; Sterner, Reinhard; Merkl, Rainer

    2017-06-01

    Computational protein design (CPD) is a powerful technique to engineer existing proteins or to design novel ones that display desired properties. Rosetta is a software suite including algorithms for computational modeling and analysis of protein structures and offers many elaborate protocols created to solve highly specific tasks of protein engineering. Most of Rosetta's protocols optimize sequences based on a single conformation (i. e. design state). However, challenging CPD objectives like multi-specificity design or the concurrent consideration of positive and negative design goals demand the simultaneous assessment of multiple states. This is why we have developed the multi-state framework MSF that facilitates the implementation of Rosetta's single-state protocols in a multi-state environment and made available two frequently used protocols. Utilizing MSF, we demonstrated for one of these protocols that multi-state design yields a 15% higher performance than single-state design on a ligand-binding benchmark consisting of structural conformations. With this protocol, we designed de novo nine retro-aldolases on a conformational ensemble deduced from a (βα)8-barrel protein. All variants displayed measurable catalytic activity, testifying to a high success rate for this concept of multi-state enzyme design.

  10. Rosetta:MSF: a modular framework for multi-state computational protein design

    PubMed Central

    Hupfeld, Enrico; Sterner, Reinhard

    2017-01-01

    Computational protein design (CPD) is a powerful technique to engineer existing proteins or to design novel ones that display desired properties. Rosetta is a software suite including algorithms for computational modeling and analysis of protein structures and offers many elaborate protocols created to solve highly specific tasks of protein engineering. Most of Rosetta’s protocols optimize sequences based on a single conformation (i. e. design state). However, challenging CPD objectives like multi-specificity design or the concurrent consideration of positive and negative design goals demand the simultaneous assessment of multiple states. This is why we have developed the multi-state framework MSF that facilitates the implementation of Rosetta’s single-state protocols in a multi-state environment and made available two frequently used protocols. Utilizing MSF, we demonstrated for one of these protocols that multi-state design yields a 15% higher performance than single-state design on a ligand-binding benchmark consisting of structural conformations. With this protocol, we designed de novo nine retro-aldolases on a conformational ensemble deduced from a (βα)8-barrel protein. All variants displayed measurable catalytic activity, testifying to a high success rate for this concept of multi-state enzyme design. PMID:28604768

  11. Development of a modularized two-step (M2S) chromosome integration technique for integration of multiple transcription units in Saccharomyces cerevisiae.

    PubMed

    Li, Siwei; Ding, Wentao; Zhang, Xueli; Jiang, Huifeng; Bi, Changhao

    2016-01-01

    Saccharomyces cerevisiae has already been used for the heterologous production of fuel chemicals and valuable natural products, and the establishment of complicated heterologous biosynthetic pathways in S. cerevisiae has become a research focus of synthetic biology and metabolic engineering. Thus, simple and efficient techniques for the genomic integration of large numbers of transcription units are urgently needed. An efficient DNA assembly and chromosomal integration method, designated the modularized two-step (M2S) technique, was created by combining homologous recombination (HR) in S. cerevisiae with the Golden Gate DNA assembly method. Two major assembly steps are performed consecutively to integrate multiple transcription units simultaneously. In Step 1, a modularized scaffold containing a head-to-head promoter module and a pair of terminators was assembled with two genes; thus, two transcription units were assembled into one scaffold in a single Golden Gate reaction. In Step 2, the two transcription units were mixed with modules of selective markers and integration sites and transformed into S. cerevisiae for assembly and integration. In both steps, universal primers were designed for the identification of correct clones. The establishment of a functional β-carotene biosynthetic pathway in S. cerevisiae within 5 days demonstrated the high efficiency of this method, and the integration of a 10-transcription-unit pathway illustrated its capacity. The modular design of transcription units and integration elements simplified the assembly and integration procedure and eliminated the frequent design and synthesis of DNA fragments required in previous methods. Also, by assembling most parts in vitro in Step 1, the number of DNA cassettes for homologous integration in Step 2 was significantly reduced. Thus, high assembly efficiency, high integration capacity, and a low error rate were achieved.

  12. Simulation of value stream mapping and discrete optimization of energy consumption in modular construction

    NASA Astrophysics Data System (ADS)

    Chowdhury, Md Mukul

    With the increased practice of modularization and prefabrication, the construction industry has gained the benefits of quality management, improved completion time, reduced site disruption and vehicular traffic, and improved overall safety and security. Whereas industrialized construction methods, such as modular and manufactured buildings, have evolved over decades, core techniques used in prefabrication plants vary only slightly from those employed in traditional site-built construction. With a focus on energy- and cost-efficient modular construction, this research presents the development of a simulation, measurement and optimization system for energy consumption in the manufacturing process of modular construction. The system is based on Lean Six Sigma principles and loosely coupled system operation to identify non-value-adding tasks and possible causes of low energy efficiency. The proposed system also includes visualization functions for demonstrating energy consumption in modular construction. The benefits of implementing this system include a reduction of energy consumption in production, a decrease of energy cost in the production of lean-modular construction, and increased profit. In addition, the visualization functions provide detailed information about energy efficiency and operational flexibility in modular construction. A case study is presented to validate the reliability of the system.

  13. Pareto Tracer: a predictor-corrector method for multi-objective optimization problems

    NASA Astrophysics Data System (ADS)

    Martín, Adanay; Schütze, Oliver

    2018-03-01

    This article proposes a novel predictor-corrector (PC) method for the numerical treatment of multi-objective optimization problems (MOPs). The algorithm, Pareto Tracer (PT), is capable of performing a continuation along the set of (local) solutions of a given MOP with k objectives, and can cope with equality and box constraints. Additionally, the first steps towards a method that manages general inequality constraints are also introduced. The properties of PT are first discussed theoretically and later numerically on several examples.

  14. Utility of coupling nonlinear optimization methods with numerical modeling software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murphy, M.J.

    1996-08-05

    Results of using GLO (Global Local Optimizer), a general-purpose nonlinear optimization software package for investigating multi-parameter problems in science and engineering, are discussed. The package consists of the modular optimization control system (GLO), a graphical user interface (GLO-GUI), a pre-processor (GLO-PUT), a post-processor (GLO-GET), and nonlinear optimization software modules, GLOBAL & LOCAL. GLO is designed for controlling, and easy coupling to, any scientific software application. GLO runs the optimization module and the scientific software application in an iterative loop. At each iteration, the optimization module defines new values for the set of parameters being optimized. GLO-PUT inserts the new parameter values into the input file of the scientific application. GLO runs the application with the new parameter values. GLO-GET determines the value of the objective function by extracting the results of the analysis and comparing them to the desired result. GLO continues to run the scientific application over and over until it finds the "best" set of parameters by minimizing (or maximizing) the objective function. An example problem showing the optimization of a material model is presented (Taylor cylinder impact test).
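
    The loop described above (write parameters, run the application, extract results, score the misfit) maps directly onto a generic optimizer callback. The sketch below reproduces that pattern with scipy.optimize.minimize; the input template, the ./impact_solver executable, the output-file format and the target value are all hypothetical stand-ins, not GLO itself.

    ```python
    import subprocess
    from pathlib import Path
    from string import Template
    from scipy.optimize import minimize

    TEMPLATE = Template("yield_strength = $ys\nhardening_exponent = $n\n")  # hypothetical input template
    TARGET_LENGTH = 27.3   # hypothetical measured final length of the Taylor cylinder (mm)

    def objective(params):
        """One optimizer iteration: write input, run the external solver, score the result."""
        ys, n = params
        Path("model.in").write_text(TEMPLATE.substitute(ys=ys, n=n))      # plays the GLO-PUT role
        subprocess.run(["./impact_solver", "model.in"], check=True)       # hypothetical application
        predicted = float(Path("model.out").read_text().split()[-1])      # plays the GLO-GET role
        return (predicted - TARGET_LENGTH) ** 2                           # misfit to minimize

    result = minimize(objective, x0=[300.0, 0.25], method="Nelder-Mead")
    print(result.x)
    ```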

  15. Direct adaptive performance optimization of subsonic transports: A periodic perturbation technique

    NASA Technical Reports Server (NTRS)

    Espana, Martin D.; Gilyard, Glenn

    1995-01-01

    Aircraft performance can be optimized at the flight condition by using available redundancy among actuators. Effective use of this potential allows improved performance beyond the limits imposed by design compromises. Optimization based on nominal models does not result in the best performance of the actual aircraft at the actual flight condition. An adaptive algorithm for optimizing performance parameters, such as speed or fuel flow, in flight and based exclusively on flight data is proposed. The algorithm is inherently insensitive to model inaccuracies and measurement noise and biases, and can optimize several decision variables at the same time. An adaptive constraint controller integrated into the algorithm regulates the optimization constraints, such as altitude or speed, without requiring any prior knowledge of the autopilot design. The algorithm has a modular structure which allows easy incorporation (or removal) of optimization constraints or decision variables into the optimization problem. An important part of the contribution is the development of analytical tools enabling convergence analysis of the algorithm and the establishment of simple design rules. The fuel-flow minimization and velocity maximization modes of the algorithm are demonstrated on the NASA Dryden B-720 nonlinear flight simulator for the single- and multi-effector optimization cases.
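
    The periodic perturbation idea can be illustrated with a textbook extremum-seeking loop: add a small sinusoidal dither to a decision variable, demodulate the measured performance over one dither period to estimate the local gradient, and take a gradient step. The sketch below applies this to a made-up static fuel-flow map with a single redundant effector; it is not the flight algorithm of the paper, and the performance map, dither amplitude and gain are all hypothetical.

    ```python
    import numpy as np

    def fuel_flow(delta):
        """Hypothetical static performance map: fuel flow vs. a redundant effector setting."""
        return 1.0 + 0.5 * (delta - 0.3) ** 2

    a, k, n_per_period = 0.05, 0.5, 64           # dither amplitude, adaptation gain, samples per period
    phase = np.linspace(0.0, 2 * np.pi, n_per_period, endpoint=False)
    theta = -0.5                                  # initial effector setting

    for _ in range(200):                          # one gradient step per dither period
        y = fuel_flow(theta + a * np.sin(phase))             # performance measured over one full period
        grad_est = 2.0 * np.mean(y * np.sin(phase)) / a      # synchronous demodulation ~ df/dtheta
        theta -= k * grad_est                                # descend the estimated gradient

    print(f"converged effector setting: {theta:.3f} (true optimum at 0.300)")
    ```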

  16. Shape Optimization and Modular Discretization for the Development of a Morphing Wingtip

    NASA Astrophysics Data System (ADS)

    Morley, Joshua

    Better knowledge in the areas of aerodynamics and optimization has allowed designers to develop efficient wingtip structures in recent years. However, the requirements faced by wingtip devices can be considerably different amongst an aircraft's flight regimes. Traditional static wingtip devices are then a compromise between conflicting requirements, resulting in less than optimal performance within each regime. Alternatively, a morphing wingtip can reconfigure itself, leading to improved performance over a range of dissimilar flight conditions. Developed within this thesis is a modular morphing wingtip concept that centers on the use of variable geometry truss mechanisms to permit morphing. A conceptual design framework is established to aid in the development of the concept. The framework uses a metaheuristic optimization procedure to determine optimal continuous wingtip configurations. The configurations are then discretized for the modular concept. The functionality of the framework is demonstrated through a design study on a hypothetical wing/winglet within the thesis.

  17. Modeling surgical tool selection patterns as a "traveling salesman problem" for optimizing a modular surgical tool system.

    PubMed

    Nelson, Carl A; Miller, David J; Oleynikov, Dmitry

    2008-01-01

    As modular systems come into the forefront of robotic telesurgery, streamlining the process of selecting surgical tools becomes an important consideration. This paper presents a method for optimal queuing of tools in modular surgical tool systems, based on patterns in tool-use sequences, in order to minimize time spent changing tools. The solution approach is to model the set of tools as a graph, with tool-change frequency expressed as edge weights in the graph, and to solve the Traveling Salesman Problem for the graph. In a set of simulations, this method has shown superior performance at optimizing tool arrangements for streamlining surgical procedures.
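
    The modeling step is concrete enough to sketch: treat each tool as a graph node, weight each edge by how often the two tools follow one another in recorded procedures, and look for a cyclic arrangement that keeps frequent transitions adjacent. The sketch below brute-forces that maximum-weight Hamiltonian cycle for a handful of tools under this simple reading; the tool names and frequencies are hypothetical, and a real system would use the paper's own cost model and a proper TSP solver for larger tool sets.

    ```python
    from itertools import permutations

    # Hypothetical tool-change frequencies observed in recorded procedures
    freq = {
        ("grasper", "scissors"): 14, ("grasper", "hook"): 9, ("grasper", "needle"): 4,
        ("scissors", "hook"): 6,     ("scissors", "needle"): 11, ("hook", "needle"): 3,
    }
    tools = ["grasper", "scissors", "hook", "needle"]

    def pair_freq(a, b):
        return freq.get((a, b), freq.get((b, a), 0))

    def cycle_score(order):
        """Total transition frequency covered by adjacent slots in a cyclic arrangement."""
        return sum(pair_freq(order[i], order[(i + 1) % len(order)]) for i in range(len(order)))

    # Brute-force the maximum-weight Hamiltonian cycle (fix the first tool to remove rotations)
    best = max(permutations(tools[1:]), key=lambda p: cycle_score((tools[0],) + p))
    best_order = (tools[0],) + best
    print(best_order, cycle_score(best_order))   # e.g. ('grasper', 'scissors', 'needle', 'hook') 37
    ```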

  18. Quantification of soil water retention parameters using multi-section TDR-waveform analysis

    NASA Astrophysics Data System (ADS)

    Baviskar, S. M.; Heimovaara, T. J.

    2017-06-01

    Soil water retention parameters are important for describing flow in variably saturated soils. Time-domain reflectometry (TDR) is one of the standard methods used for determining the water content of soil samples. In this study, we present an approach to estimate the water retention parameters of a sample that is initially saturated and then subjected to incremental decreases in boundary head, causing it to drain in a multi-step fashion. TDR waveforms are measured along the height of the sample at daily intervals, under assumed hydrostatic conditions at each step. The cumulative discharge outflow drained from the sample is also recorded. The saturated water content is obtained using volumetric analysis after the final step of the multi-step drainage. The equation obtained by coupling the unsaturated parametric function and the apparent dielectric permittivity is fitted to a TDR wave propagation forward model. The unsaturated parametric function is used to spatially interpolate the water contents along the TDR probe. The cumulative discharge outflow data are fitted with the cumulative discharge estimated using the unsaturated parametric function. The weight of water inside the sample estimated at the first and final boundary heads of the multi-step drainage is fitted with the corresponding weights calculated using the unsaturated parametric function. A Bayesian optimization scheme is used to obtain optimized water retention parameters for these different objective functions. This approach can be used for tall samples and is especially suitable for characterizing sands with a uniform particle size distribution at low capillary heads.
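
    The abstract does not name the unsaturated parametric function; a common choice for water retention is the van Genuchten model. The sketch below fits that model to a handful of made-up (head, water content) pairs of the kind the TDR sections would provide, using scipy's curve_fit rather than the Bayesian scheme of the paper.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def van_genuchten(h, theta_r, theta_s, alpha, n):
        """Water retention curve theta(h); h is capillary pressure head (positive, in cm)."""
        m = 1.0 - 1.0 / n
        return theta_r + (theta_s - theta_r) / (1.0 + (alpha * h) ** n) ** m

    # Hypothetical water contents recovered from TDR sections at the applied boundary heads
    heads = np.array([5.0, 15.0, 30.0, 60.0, 100.0, 200.0])     # cm
    theta = np.array([0.36, 0.34, 0.27, 0.17, 0.11, 0.08])      # volumetric water content

    p0 = [0.05, 0.37, 0.03, 2.0]                                # initial guess: theta_r, theta_s, alpha, n
    popt, pcov = curve_fit(van_genuchten, heads, theta, p0=p0,
                           bounds=([0.0, 0.2, 1e-4, 1.1], [0.2, 0.5, 1.0, 10.0]))
    print(dict(zip(["theta_r", "theta_s", "alpha", "n"], popt)))
    ```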

  19. Theoretically Founded Optimization of Auctioneer's Revenues in Expanding Auctions

    NASA Astrophysics Data System (ADS)

    Rabin, Jonathan; Shehory, Onn

    The expanding auction is a multi-unit auction which provides the auctioneer with control over the outcome of the auction by means of dynamically adding items for sale. Previous research on the expanding auction has provided a numeric method to calculate a strategy that optimizes the auctioneer's revenue. In this paper, we analyze various theoretical properties of the expanding auction, and compare it to VCG, a multi-unit auction protocol known in the art. We examine the effects of errors in the auctioneer's estimation of the buyers' maximal bidding values and prove a theoretical bound on the ratio between the revenue yielded by the Informed Decision Strategy (IDS) and the post-optimal strategy. We also analyze the relationship between the auction step and the optimal revenue and introduce a method of computing this optimizing step. We further compare the revenues yielded by the use of IDS with an expanding auction to those of the VCG mechanism and determine the conditions under which the former outperforms the latter. Our work provides new insight into the properties of the expanding auction. It further provides theoretically founded means for optimizing the revenue of auctioneer.

  20. Structural Integration of Sensors/Actuators by Laser Beam Melting for Tailored Smart Components

    NASA Astrophysics Data System (ADS)

    Töppel, Thomas; Lausch, Holger; Brand, Michael; Hensel, Eric; Arnold, Michael; Rotsch, Christian

    2018-03-01

    Laser beam melting (LBM), an additive laser powder bed fusion technology, enables the structural integration of temperature-sensitive sensors and actuators in complex monolithic metallic structures. The objective is to embed a functional component inside a metal part without losing its functionality through overheating. The first part of this paper addresses the development of a new process chain for the bonded embedding of temperature-sensitive sensor/actuator systems by LBM. These systems are modularly built and coated with a multi-material/multi-layer thermal protection system of ceramic and metallic compounds. The characteristic of low global heat input in LBM is utilized for the functional embedding. In the second part, the specific functional design and optimization of tailored smart components with embedded functionalities are addressed. Numerically and experimentally validated results are demonstrated on a smart femoral hip stem.

  1. Synthetic Biology for Cell-Free Biosynthesis: Fundamentals of Designing Novel In Vitro Multi-Enzyme Reaction Networks.

    PubMed

    Morgado, Gaspar; Gerngross, Daniel; Roberts, Tania M; Panke, Sven

    Cell-free biosynthesis in the form of in vitro multi-enzyme reaction networks or enzyme cascade reactions emerges as a promising tool to carry out complex catalysis in one-step, one-vessel settings. It combines the advantages of well-established in vitro biocatalysis with the power of multi-step in vivo pathways. Such cascades have been successfully applied to the synthesis of fine and bulk chemicals, monomers and complex polymers of chemical importance, and energy molecules from renewable resources as well as electricity. The scale of these initial attempts remains small, suggesting that more robust control of such systems and more efficient optimization are currently major bottlenecks. To this end, the very nature of enzyme cascade reactions as multi-membered systems requires novel approaches for implementation and optimization, some of which can be obtained from in vivo disciplines (such as pathway refactoring and DNA assembly), and some of which can be built on the unique, cell-free properties of cascade reactions (such as easy analytical access to all system intermediates to facilitate modeling).

  2. Application of an Evolution Strategy in Planetary Ephemeris Optimization

    NASA Astrophysics Data System (ADS)

    Mai, E.

    2016-12-01

    Classical planetary ephemeris construction comprises three major steps, which are performed iteratively: simultaneous numerical integration of the coupled equations of motion of a multi-body system (propagator step), reduction of thousands of observations (reduction step), and optimization of various selected model parameters (adjustment step). This traditional approach is challenged by ongoing refinements in force modeling, e.g. the inclusion of many more significant minor bodies, an ever-growing number of planetary observations, e.g. vast amounts of spacecraft tracking data, etc. To master the high computational burden and in order to circumvent the need for inversion of huge normal equation matrices, we propose an alternative ephemeris construction method. The main idea is to solve the overall optimization problem by a straightforward direct evaluation of the whole set of mathematical formulas involved, rather than to solve it as an inverse problem with all its tacit mathematical assumptions and numerical difficulties. We replace the usual gradient search by a stochastic search, namely an evolution strategy, which is also well suited to exploiting parallel computing capabilities. Furthermore, this new approach enables multi-criteria optimization and time-varying optima. This issue will become important in the future once ephemeris construction is just one part of even larger optimization problems, e.g. the combined and consistent determination of the physical state (orbit, size, shape, rotation, gravity,…) of celestial bodies (planets, satellites, asteroids, or comets), and if one seeks near real-time solutions. Here we outline the general idea and discuss first results. As an example, we present a simultaneous optimization of highly correlated asteroidal ring model parameters (total mass and heliocentric radius), based on simulations.
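
    As a concrete illustration of the proposed stochastic search, the sketch below runs a plain (mu, lambda) evolution strategy with self-adapted step sizes on a two-parameter problem shaped like the ring example (total mass and heliocentric radius). The misfit function, the "true" values and all strategy settings are hypothetical; the real pipeline would evaluate residuals against observations inside the propagator and reduction steps.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    TRUE = np.array([1.0e-10, 2.8])     # hypothetical "true" ring mass (solar masses) and radius (au)

    def misfit(params):
        """Stand-in for the observation residuals the real pipeline would evaluate."""
        scale = np.array([1e-10, 1.0])                       # puts both parameters on a comparable scale
        return float(np.sum(((params - TRUE) / scale) ** 2))

    def evolution_strategy(x0, sigma0, mu=5, lam=20, generations=60):
        """Simple (mu, lambda) ES with self-adapted, per-parameter step sizes."""
        parents = [(np.array(x0, float), np.array(sigma0, float)) for _ in range(mu)]
        tau = 1.0 / np.sqrt(2 * len(x0))
        for _ in range(generations):
            offspring = []
            for _ in range(lam):
                x, sigma = parents[rng.integers(mu)]
                sigma_new = sigma * np.exp(tau * rng.standard_normal(len(x)))   # mutate step sizes
                x_new = x + sigma_new * rng.standard_normal(len(x))             # mutate parameters
                offspring.append((misfit(x_new), x_new, sigma_new))
            offspring.sort(key=lambda o: o[0])
            parents = [(x, s) for _, x, s in offspring[:mu]]                    # comma selection
        return parents[0][0]

    best = evolution_strategy(x0=[5e-10, 2.0], sigma0=[2e-10, 0.5])
    print(best)
    ```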

  3. Small worlds in space: Synchronization, spatial and relational modularity

    NASA Astrophysics Data System (ADS)

    Brede, M.

    2010-06-01

    In this letter we investigate networks that have been optimized to realize a trade-off between enhanced synchronization and the cost of wire needed to connect the nodes in space. Analyzing the evolved arrangement of nodes in space and their corresponding network topology, a class of small-world networks characterized by spatial and network modularity is found. More precisely, for low cost of wire, optimal configurations are characterized by a division of nodes into two spatial groups with maximum distance from each other, whereas network modularity is low. For high cost of wire, the nodes organize into several distinct groups in space that correspond to network modules connected on a ring. In between, spatially and relationally modular small-world networks are found.
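
    The optimization objective implied above can be written down explicitly: score synchronizability by the Laplacian eigenratio (smaller is better) and add the total Euclidean wire length, with a weight trading the two off. The sketch below evaluates such a combined cost for a toy spatial ring; the convex-combination form and the weight are assumptions, since the paper's exact weighting is not reproduced here.

    ```python
    import numpy as np

    def tradeoff_cost(positions, adjacency, alpha):
        """Trade-off objective for a spatial network: synchronizability vs. wiring length.

        Synchronizability is scored by the Laplacian eigenratio lambda_N / lambda_2
        (smaller is better); wire cost is the total Euclidean edge length.
        """
        A = np.asarray(adjacency, float)
        L = np.diag(A.sum(axis=1)) - A
        eig = np.sort(np.linalg.eigvalsh(L))
        eigenratio = eig[-1] / eig[1]                     # assumes the network is connected
        diffs = positions[:, None, :] - positions[None, :, :]
        wire = 0.5 * np.sum(A * np.linalg.norm(diffs, axis=-1))
        return (1 - alpha) * eigenratio + alpha * wire

    # Toy example: 6 nodes placed on a line, connected as a ring
    pos = np.column_stack([np.arange(6, dtype=float), np.zeros(6)])
    ring = np.zeros((6, 6))
    for i in range(6):
        ring[i, (i + 1) % 6] = ring[(i + 1) % 6, i] = 1
    print(tradeoff_cost(pos, ring, alpha=0.1))   # 0.9 * eigenratio + 0.1 * wire length
    ```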

  4. Design, Manufacturing and Characterization of Functionally Graded Flextensional Piezoelectric Actuators

    NASA Astrophysics Data System (ADS)

    Amigo, R. C. R.; Vatanabe, S. L.; Silva, E. C. N.

    2013-03-01

    Previous works have shown several advantages of using Functionally Graded Materials (FGMs) for the performance of flextensional devices, such as the reduction of stress concentrations and gains in reliability. In this work, the FGM concept is explored in the design of graded devices by using the Topology Optimization Method (TOM), in order to determine optimal topologies and gradations of the coupled structures of piezoactuators. The graded pieces are manufactured by using the Spark Plasma Sintering (SPS) technique and are bonded to piezoelectric ceramics. The graded actuators are then tested by using a modular vibrometer system for measuring output displacements, in order to validate the numerical simulations. The technological path developed here represents the initial step toward the manufacturing of an integral piezoelectric device composed of piezoelectric and non-piezoelectric materials without bonding layers.

  5. Development and validation of a numerical model for cross-section optimization of a multi-part probe for soft tissue intervention.

    PubMed

    Frasson, L; Neubert, J; Reina, S; Oldfield, M; Davies, B L; Rodriguez Y Baena, F

    2010-01-01

    The popularity of minimally invasive surgical procedures is driving the development of novel, safer and more accurate surgical tools. In this context a multi-part probe for soft tissue surgery is being developed in the Mechatronics in Medicine Laboratory at Imperial College, London. This study reports an optimization procedure using finite element methods for the identification of an interlock geometry able to limit the separation of the segments composing the multi-part probe. An optimal geometry was obtained and the corresponding three-dimensional finite element model validated experimentally. Simulation results are shown to be consistent with the physical experiments. The outcome of this study is an important step in the provision of a novel miniature steerable probe for surgery.

  6. Efficient Multi-Stage Time Marching for Viscous Flows via Local Preconditioning

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Wood, William A.; vanLeer, Bram

    1999-01-01

    A new method has been developed to accelerate the convergence of explicit time-marching, laminar, Navier-Stokes codes through the combination of local preconditioning and multi-stage time marching optimization. Local preconditioning is a technique to modify the time-dependent equations so that all information moves or decays at nearly the same rate, thus relieving the stiffness for a system of equations. Multi-stage time marching can be optimized by modifying its coefficients to account for the presence of viscous terms, allowing larger time steps. We show it is possible to optimize the time marching scheme for a wide range of cell Reynolds numbers for the scalar advection-diffusion equation, and local preconditioning allows this optimization to be applied to the Navier-Stokes equations. Convergence acceleration of the new method is demonstrated through numerical experiments with circular advection and laminar boundary-layer flow over a flat plate.

  7. Evolutionary method for finding communities in bipartite networks.

    PubMed

    Zhan, Weihua; Zhang, Zhongzhi; Guan, Jihong; Zhou, Shuigeng

    2011-06-01

    An important step in unveiling the relation between network structure and dynamics defined on networks is to detect communities, and numerous methods have been developed separately to identify community structure in different classes of networks, such as unipartite networks, bipartite networks, and directed networks. Here, we show that the finding of communities in such networks can be unified in a general framework-detection of community structure in bipartite networks. Moreover, we propose an evolutionary method for efficiently identifying communities in bipartite networks. To this end, we show that both unipartite and directed networks can be represented as bipartite networks, and their modularity is completely consistent with that for bipartite networks, the detection of modular structure on which can be reformulated as modularity maximization. To optimize the bipartite modularity, we develop a modified adaptive genetic algorithm (MAGA), which is shown to be especially efficient for community structure detection. The high efficiency of the MAGA is based on the following three improvements we make. First, we introduce a different measure for the informativeness of a locus instead of the standard deviation, which can exactly determine which loci mutate. This measure is the bias between the distribution of a locus over the current population and the uniform distribution of the locus, i.e., the Kullback-Leibler divergence between them. Second, we develop a reassignment technique for differentiating the informative state a locus has attained from the random state in the initial phase. Third, we present a modified mutation rule which by incorporating related operations can guarantee the convergence of the MAGA to the global optimum and can speed up the convergence process. Experimental results show that the MAGA outperforms existing methods in terms of modularity for both bipartite and unipartite networks.
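
    The quantity being maximized can be made concrete. A standard definition of bipartite modularity (Barber's) compares each observed red-blue link with its expected weight under a degree-preserving null model and counts only node pairs assigned to the same community; the paper's exact formulation may differ in detail. The sketch below evaluates it for a toy network and two candidate partitions.

    ```python
    import numpy as np

    def bipartite_modularity(A, red_labels, blue_labels):
        """Barber-style bipartite modularity for a given co-assignment of the two node sets.

        A[i, j] = 1 if red node i is linked to blue node j; the label lists give the
        community of each red and each blue node.
        """
        A = np.asarray(A, dtype=float)
        m = A.sum()                         # number of edges
        k = A.sum(axis=1)                   # red-node degrees
        d = A.sum(axis=0)                   # blue-node degrees
        same = np.equal.outer(np.asarray(red_labels), np.asarray(blue_labels))
        return float(np.sum((A - np.outer(k, d) / m) * same) / m)

    # Toy bipartite network with two obvious communities
    A = np.array([[1, 1, 0, 0],
                  [1, 1, 0, 0],
                  [0, 0, 1, 1],
                  [0, 0, 1, 1]])
    print(bipartite_modularity(A, red_labels=[0, 0, 1, 1], blue_labels=[0, 0, 1, 1]))  # 0.5
    print(bipartite_modularity(A, red_labels=[0, 1, 0, 1], blue_labels=[0, 0, 1, 1]))  # lower (0.0)
    ```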

  8. A Modular Multilevel Converter with Power Mismatch Control for Grid-Connected Photovoltaic Systems

    DOE PAGES

    Duman, Turgay; Marti, Shilpa; Moonem, M. A.; ...

    2017-05-17

    A modular multilevel power converter configuration for grid-connected photovoltaic (PV) systems is proposed. The converter configuration replaces the conventional bulky line-frequency transformer with several high-frequency transformers, potentially reducing the balance-of-system cost of PV systems. The front-end converter for each port is a neutral-point diode-clamped (NPC) multi-level dc-dc dual-active bridge (ML-DAB) which allows maximum power point tracking (MPPT). The integrated high-frequency transformer provides the galvanic isolation between the PV and grid sides and also steps up the low dc voltage from the PV source. Following the ML-DAB stage in each port is an NPC inverter. The outputs of N NPC inverters are cascaded to attain the per-phase line-to-neutral voltage to connect directly to the distribution grid (i.e., 13.8 kV). The cascaded NPC (CNPC) inverters have the inherent advantage of using lower-rated devices, smaller filters and the low total harmonic distortion required for PV grid interconnection. The proposed converter system is modular, scalable, and serviceable with zero downtime, with a lower footprint and lower overall cost. A novel voltage balance control at each module, based on the power mismatch among the N ports, has been presented and verified in simulation. Analysis and simulation results are presented for the N-port converter. The converter performance has also been verified on a hardware prototype.

  9. A Modular Multilevel Converter with Power Mismatch Control for Grid-Connected Photovoltaic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duman, Turgay; Marti, Shilpa; Moonem, M. A.

    A modular multilevel power converter configuration for grid-connected photovoltaic (PV) systems is proposed. The converter configuration replaces the conventional bulky line-frequency transformer with several high-frequency transformers, potentially reducing the balance-of-system cost of PV systems. The front-end converter for each port is a neutral-point diode-clamped (NPC) multi-level dc-dc dual-active bridge (ML-DAB) which allows maximum power point tracking (MPPT). The integrated high-frequency transformer provides the galvanic isolation between the PV and grid sides and also steps up the low dc voltage from the PV source. Following the ML-DAB stage in each port is an NPC inverter. The outputs of N NPC inverters are cascaded to attain the per-phase line-to-neutral voltage to connect directly to the distribution grid (i.e., 13.8 kV). The cascaded NPC (CNPC) inverters have the inherent advantage of using lower-rated devices, smaller filters and the low total harmonic distortion required for PV grid interconnection. The proposed converter system is modular, scalable, and serviceable with zero downtime, with a lower footprint and lower overall cost. A novel voltage balance control at each module, based on the power mismatch among the N ports, has been presented and verified in simulation. Analysis and simulation results are presented for the N-port converter. The converter performance has also been verified on a hardware prototype.

  10. The design of a multi-harmonic step-tunable gyrotron

    NASA Astrophysics Data System (ADS)

    Qi, Xiang-Bo; Du, Chao-Hai; Zhu, Juan-Feng; Pan, Shi; Liu, Pu-Kun

    2017-03-01

    The theoretical study of a step-tunable gyrotron controlled by successive excitation of multi-harmonic modes is presented in this paper. An axis-encircling electron beam is employed to eliminate harmonic mode competition. Physical pictures are presented to elaborate the multi-harmonic interaction mechanism and to determine the operating parameters at which arbitrary harmonic tuning can be realized by sweeping the magnetic field, yielding controlled radiation in multiple frequency bands. An important principle is revealed: a weak coupling coefficient in a high-harmonic interaction can be compensated by a high Q-factor. To some extent, this complementarity between a high Q-factor and a weak coupling coefficient gives high-harmonic modes the potential to achieve high efficiency. Based on a previously optimized magnetic cusp gun, a multi-harmonic step-tunable gyrotron using the first- to fourth-harmonic modes is shown to be feasible. Multimode simulation shows that the multi-harmonic gyrotron can operate on the 34 GHz first-harmonic TE11 mode, 54 GHz second-harmonic TE21 mode, 74 GHz third-harmonic TE31 mode, and 94 GHz fourth-harmonic TE41 mode, corresponding to peak efficiencies of 28.6%, 35.7%, 17.1%, and 11.4%, respectively. The multi-harmonic step-tunable gyrotron provides new possibilities in millimeter-wave and terahertz source development, especially for advanced terahertz applications.

  11. Shape optimization of the modular press body

    NASA Astrophysics Data System (ADS)

    Pabiszczak, Stanisław

    2016-12-01

    The paper presents an optimization algorithm for the cross-sectional dimensions of a modular press body under the minimum-mass criterion. The wall thicknesses and the angle of their inclination relative to the base of the section are taken as the decision variables, while the overall dimensions are treated as constants. The optimal parameter values were calculated numerically with the Solver tool in Microsoft Excel. The optimization procedure reduced the body mass by 27% while maintaining the required rigidity of the body.
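
    The same kind of constrained minimum-mass sizing can be sketched with an open-source solver instead of Excel's Solver. The example below is a hypothetical stand-in, not the paper's press-body model: it minimizes the mass of a thin-walled box section over two wall thicknesses subject to a second-moment-of-area (rigidity) constraint, with all numbers assumed.

```python
import numpy as np
from scipy.optimize import minimize

# Fixed outer dimensions of the section (treated as constants, as in the paper)
B, H = 0.4, 0.6            # width and height [m]
I_MIN = 4.5e-4             # required second moment of area [m^4] (assumed rigidity target)
RHO, L = 7850.0, 2.0       # steel density [kg/m^3], member length [m]

def section_properties(t_w, t_f):
    """Area and second moment of area of a thin-walled box with web thickness
    t_w and flange thickness t_f (outer dimensions held fixed)."""
    b_i, h_i = B - 2.0 * t_w, H - 2.0 * t_f          # inner cavity dimensions
    area = B * H - b_i * h_i
    inertia = (B * H**3 - b_i * h_i**3) / 12.0
    return area, inertia

def mass(x):
    area, _ = section_properties(*x)
    return RHO * L * area

def stiffness_margin(x):
    _, inertia = section_properties(*x)
    return inertia - I_MIN                            # must stay >= 0

res = minimize(mass, x0=[0.03, 0.03],
               bounds=[(0.004, 0.08), (0.004, 0.08)],
               constraints=[{"type": "ineq", "fun": stiffness_margin}],
               method="SLSQP")
print(res.x, mass(res.x))
```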

  12. IMAGINE: Interstellar MAGnetic field INference Engine

    NASA Astrophysics Data System (ADS)

    Steininger, Theo

    2018-03-01

    IMAGINE (Interstellar MAGnetic field INference Engine) performs inference on generic parametric models of the Galaxy. The modular open source framework uses highly optimized tools and technology such as the MultiNest sampler (ascl:1109.006) and the information field theory framework NIFTy (ascl:1302.013) to create an instance of the Milky Way based on a set of parameters for physical observables, using Bayesian statistics to judge the mismatch between measured data and model prediction. The flexibility of the IMAGINE framework allows for simple refitting for newly available data sets and makes state-of-the-art Bayesian methods easily accessible particularly for random components of the Galactic magnetic field.
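
    The Bayesian mismatch idea can be written down generically. The sketch below is not the IMAGINE, NIFTy, or MultiNest API; it is just a schematic Gaussian log-likelihood that scores how well an observable predicted from a set of model parameters matches measured data, with a toy linear "Galaxy model" standing in for the real simulation.

```python
import numpy as np

def log_likelihood(params, observe, data, noise_sigma):
    """Gaussian log-likelihood: penalizes the mismatch between measured data
    and the observable predicted by a parametric model. `observe(params)` is
    a placeholder for the simulation of the observable."""
    residual = data - observe(params)
    return -0.5 * np.sum((residual / noise_sigma) ** 2)

# toy stand-in "model": an observable that depends linearly on two parameters
grid = np.linspace(0.0, 1.0, 50)
observe = lambda p: p[0] * grid + p[1]
truth = np.array([2.0, 0.3])
data = observe(truth) + 0.05 * np.random.default_rng(1).standard_normal(grid.size)

print(log_likelihood(truth, observe, data, noise_sigma=0.05))
print(log_likelihood(np.array([1.0, 0.0]), observe, data, noise_sigma=0.05))  # worse fit
```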

  13. Multi-criteria objective based climate change impact assessment for multi-purpose multi-reservoir systems

    NASA Astrophysics Data System (ADS)

    Müller, Ruben; Schütze, Niels

    2014-05-01

    Water resources systems with reservoirs are expected to be sensitive to climate change. Assessment studies that analyze the impact of climate change on the performance of reservoirs can be divided into two groups: (1) studies that simulate the operation under projected inflows with the current set of operational rules, where, due to non-adapted operational rules, the future performance of these reservoirs can be underestimated and the impact overestimated; and (2) studies that optimize the operational rules to best adapt the system to the projected conditions before the impact is assessed. The latter allows a more realistic estimate of future performance, and adaptation strategies based on new operation rules are available if required. Multi-purpose reservoirs serve various, often conflicting functions. If all functions cannot be served simultaneously at a maximum level, an effective compromise between the multiple objectives of the reservoir operation has to be found. Yet under climate change the historically preferred compromise may no longer be the most suitable compromise in the future. Therefore, a multi-objective climate change impact assessment approach for multi-purpose multi-reservoir systems is proposed in this study. Projected inflows are provided in a first step using a physically based rainfall-runoff model. In a second step, a time series model is applied to generate long-term inflow time series. Finally, the long-term inflow series are used as driving variables for a simulation-based multi-objective optimization of the reservoir system in order to derive optimal operation rules. As a result, the adapted Pareto-optimal set of diverse best-compromise solutions can be presented to the decision maker to assist in assessing climate change adaptation measures with respect to the future performance of the multi-purpose reservoir system. The approach is tested on a multi-purpose multi-reservoir system in a mountainous catchment in Germany. A climate change assessment is performed for climate change scenarios based on the SRES emission scenarios A1B, B1, and A2 using a set of statistically downscaled meteorological data. The future performance of the multi-purpose multi-reservoir system is quantified, and possible intensifications of trade-offs between management goals or reservoir uses are shown.
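
    Once the simulation-based optimization has produced performance values for many candidate operation rules, the best-compromise set is the non-dominated (Pareto) subset. The sketch below is a minimal illustration with hypothetical objective values (water-supply deficit and negated flood reliability, both to be minimized), not the study's optimizer.

```python
import numpy as np

def pareto_front(objectives):
    """Return a boolean mask of non-dominated rows.
    `objectives` is an (n_rules, n_objectives) array where every objective is
    to be minimized (e.g. supply deficit and negative reliability)."""
    obj = np.asarray(objectives, dtype=float)
    n = obj.shape[0]
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        # rule i is Pareto-optimal iff no other rule is at least as good in
        # every objective and strictly better in at least one
        dominates_i = np.all(obj <= obj[i], axis=1) & np.any(obj < obj[i], axis=1)
        if np.any(dominates_i):
            keep[i] = False
    return keep

# hypothetical simulated rule sets: [water-supply deficit, -flood reliability]
rules = np.array([[0.12, -0.95], [0.10, -0.90], [0.20, -0.97], [0.15, -0.80]])
print(np.where(pareto_front(rules))[0])   # indices of best-compromise candidates
```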

  14. Modularity-like objective function in annotated networks

    NASA Astrophysics Data System (ADS)

    Xie, Jia-Rong; Wang, Bing-Hong

    2017-12-01

    We ascertain the modularity-like objective function whose optimization is equivalent to the maximum likelihood in annotated networks. We demonstrate that the modularity-like objective function is a linear combination of modularity and conditional entropy. In contrast with statistical inference methods, in our method, the influence of the metadata is adjustable; when its influence is strong enough, the metadata can be recovered. Conversely, when it is weak, the detection may correspond to another partition. Between the two, there is a transition. This paper provides a concept for expanding the scope of modularity methods.
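
    A minimal sketch of such a combined objective is given below: the modularity of a candidate partition minus a weighted conditional entropy of the node metadata given that partition. The weighting, the sign convention, and the use of the karate-club annotation are illustrative assumptions, not the paper's exact formulation.

```python
import math
from collections import Counter

import networkx as nx
from networkx.algorithms.community import modularity

def conditional_entropy(metadata, partition_labels):
    """H(metadata | community), in nats, from two equal-length label lists."""
    n = len(metadata)
    joint = Counter(zip(partition_labels, metadata))
    comm = Counter(partition_labels)
    h = 0.0
    for (c, m), n_cm in joint.items():
        h -= (n_cm / n) * math.log(n_cm / comm[c])
    return h

def annotated_objective(G, communities, metadata, alpha=1.0):
    """Modularity-like objective: modularity minus a weighted conditional
    entropy of node metadata given the detected communities (alpha is an
    assumed knob for the metadata influence)."""
    label_of = {node: i for i, group in enumerate(communities) for node in group}
    part = [label_of[v] for v in G.nodes()]
    meta = [metadata[v] for v in G.nodes()]
    return modularity(G, communities) - alpha * conditional_entropy(meta, part)

G = nx.karate_club_graph()
meta = nx.get_node_attributes(G, "club")               # built-in node annotation
communities = [set(range(0, 17)), set(range(17, 34))]  # a crude two-way split
print(annotated_objective(G, communities, meta, alpha=0.5))
```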

  15. Deterministic methods for multi-control fuel loading optimization

    NASA Astrophysics Data System (ADS)

    Rahman, Fariz B. Abdul

    We have developed a multi-control fuel loading optimization code for pressurized water reactors based on deterministic methods. The objective is to flatten the fuel burnup profile, which maximizes overall energy production. The optimal control problem is formulated using the method of Lagrange multipliers and the direct adjoining approach for treatment of the inequality power-peaking constraint. The optimality conditions are derived for a multi-dimensional multi-group optimal control problem via the calculus of variations. Because the Hamiltonian is linear in the control, the optimal control problem is solved using the gradient method to minimize the Hamiltonian and a Newton-step formulation to obtain the optimal control. We are able to satisfy the power-peaking constraint during depletion with the control at beginning of cycle (BOC) by building the proper burnup path forward in time and utilizing the adjoint burnup to propagate the information back to the BOC. Our test results show that we are able to achieve our objective and satisfy the power-peaking constraint during depletion using either the fissile enrichment or the burnable poison as the control. Our fuel loading designs show an increase of 7.8 equivalent full power days (EFPDs) in cycle length compared with 517.4 EFPDs for the AP600 first cycle.

  16. Decomposition-Based Multiobjective Evolutionary Algorithm for Community Detection in Dynamic Social Networks

    PubMed Central

    Ma, Jingjing; Liu, Jie; Ma, Wenping; Gong, Maoguo; Jiao, Licheng

    2014-01-01

    Community structure is one of the most important properties in social networks. In dynamic networks, there are two conflicting criteria that need to be considered. One is the snapshot quality, which evaluates the quality of the community partitions at the current time step. The other is the temporal cost, which evaluates the difference between communities at different time steps. In this paper, we propose a decomposition-based multiobjective community detection algorithm to simultaneously optimize these two objectives to reveal community structure and its evolution in dynamic networks. It employs the framework of multiobjective evolutionary algorithm based on decomposition to simultaneously optimize the modularity and normalized mutual information, which quantitatively measure the quality of the community partitions and temporal cost, respectively. A local search strategy dealing with the problem-specific knowledge is incorporated to improve the effectiveness of the new algorithm. Experiments on computer-generated and real-world networks demonstrate that the proposed algorithm can not only find community structure and capture community evolution more accurately, but also be steadier than the two compared algorithms. PMID:24723806
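
    The two objectives can be evaluated per snapshot as sketched below: modularity of the current partition (snapshot quality) and normalized mutual information against the previous partition (temporal smoothness). Here networkx's Louvain routine and scikit-learn's NMI are stand-ins for the paper's decomposition-based evolutionary search, and the planted-partition graphs are synthetic.

```python
import networkx as nx
from networkx.algorithms.community import modularity, louvain_communities
from sklearn.metrics import normalized_mutual_info_score

def snapshot_objectives(G_t, communities_t, labels_prev):
    """Return (snapshot quality, temporal smoothness) for one time step:
    modularity of the current partition, and NMI between the current and the
    previous node labelling (higher NMI = lower temporal cost)."""
    label_of = {v: i for i, c in enumerate(communities_t) for v in c}
    labels_t = [label_of[v] for v in sorted(G_t.nodes())]
    nmi = normalized_mutual_info_score(labels_prev, labels_t)
    return modularity(G_t, communities_t), nmi

# toy two-snapshot example (the partitions would come from the evolutionary search)
G1 = nx.planted_partition_graph(2, 16, 0.8, 0.05, seed=1)
G2 = nx.planted_partition_graph(2, 16, 0.8, 0.10, seed=2)
part1 = louvain_communities(G1, seed=1)
labels1 = [next(i for i, c in enumerate(part1) if v in c) for v in sorted(G1.nodes())]
part2 = louvain_communities(G2, seed=1)
print(snapshot_objectives(G2, part2, labels1))
```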

  17. Decomposition-based multiobjective evolutionary algorithm for community detection in dynamic social networks.

    PubMed

    Ma, Jingjing; Liu, Jie; Ma, Wenping; Gong, Maoguo; Jiao, Licheng

    2014-01-01

    Community structure is one of the most important properties in social networks. In dynamic networks, there are two conflicting criteria that need to be considered. One is the snapshot quality, which evaluates the quality of the community partitions at the current time step. The other is the temporal cost, which evaluates the difference between communities at different time steps. In this paper, we propose a decomposition-based multiobjective community detection algorithm to simultaneously optimize these two objectives to reveal community structure and its evolution in dynamic networks. It employs the framework of multiobjective evolutionary algorithm based on decomposition to simultaneously optimize the modularity and normalized mutual information, which quantitatively measure the quality of the community partitions and temporal cost, respectively. A local search strategy dealing with the problem-specific knowledge is incorporated to improve the effectiveness of the new algorithm. Experiments on computer-generated and real-world networks demonstrate that the proposed algorithm can not only find community structure and capture community evolution more accurately, but also be steadier than the two compared algorithms.

  18. Patterning control strategies for minimum edge placement error in logic devices

    NASA Astrophysics Data System (ADS)

    Mulkens, Jan; Hanna, Michael; Slachter, Bram; Tel, Wim; Kubis, Michael; Maslow, Mark; Spence, Chris; Timoshkov, Vadim

    2017-03-01

    In this paper we discuss the edge placement error (EPE) for multi-patterning semiconductor manufacturing. In a multi-patterning scheme the creation of the final pattern is the result of a sequence of lithography and etching steps, and consequently the contour of the final pattern contains error sources from the different process steps. We describe the fidelity of the final pattern in terms of EPE, which is defined as the relative displacement of the edges of two features from their intended target position. We discuss our holistic patterning optimization approach to understand and minimize the EPE of the final pattern. As an experimental test vehicle we use the 7-nm logic device patterning process flow as developed by IMEC. This patterning process is based on Self-Aligned Quadruple Patterning (SAQP) using ArF lithography, combined with line cut exposures using EUV lithography. The computational metrology method to determine EPE is explained. It is shown that ArF-to-EUV overlay, CDU from the individual process steps, and local CD and placement of the individual pattern features are the important contributors. Based on the error budget, we developed an optimization strategy for each individual step and for the final pattern. Solutions include overlay and CD metrology based on angle-resolved scatterometry, scanner actuator control to enable high-order overlay corrections, and computational lithography optimization to minimize imaging-induced pattern placement errors of devices and metrology targets.
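
    A much-simplified edge-placement-error budget can be written as a root-sum-square combination of the contributors listed above. The combination rule and the numbers in the sketch below are illustrative assumptions, not the experimentally derived budget of the paper.

```python
import math

def epe_budget_rss(overlay_3s, global_cdu_3s, local_cd_3s, local_placement_3s):
    """A simplified edge-placement-error estimate (3-sigma, nm): overlay enters
    directly, CD terms enter at half value because a CD change moves each edge
    by half of it, and all contributors are combined in root-sum-square."""
    terms = [overlay_3s,
             0.5 * global_cdu_3s,
             0.5 * local_cd_3s,
             local_placement_3s]
    return math.sqrt(sum(t * t for t in terms))

# illustrative contributor values in nm (not measured data)
print(round(epe_budget_rss(overlay_3s=2.5, global_cdu_3s=1.5,
                           local_cd_3s=2.0, local_placement_3s=1.8), 2))
```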

  19. Design of a modular digital computer system

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A Central Control Element (CCE) module which controls the Automatically Reconfigurable Modular System (ARMS) and allows both redundant processing and multi-computing in the same computer with real time mode switching, is discussed. The same hardware is used for either reliability enhancement, speed enhancement, or for a combination of both.

  20. Seismic data enhancement and regularization using finite offset Common Diffraction Surface (CDS) stack

    NASA Astrophysics Data System (ADS)

    Garabito, German; Cruz, João Carlos Ribeiro; Oliva, Pedro Andrés Chira; Söllner, Walter

    2017-01-01

    The Common Reflection Surface stack is a robust method for simulating zero-offset and common-offset sections with high accuracy from multi-coverage seismic data. For simulating common-offset sections, the Common-Reflection-Surface stack method uses a hyperbolic traveltime approximation that depends on five kinematic parameters for each selected sample point of the common-offset section to be simulated. The main challenge of this method is to find a computationally efficient data-driven optimization strategy for accurately determining the five kinematic stacking parameters on which each sample of the stacked common-offset section depends. Several authors have applied multi-step strategies to obtain the optimal parameters by combining different pre-stack data configurations. Recently, other authors used one-step data-driven strategies based on a global optimization for estimating simultaneously the five parameters from multi-midpoint and multi-offset gathers. In order to increase the computational efficiency of the global optimization process, we use in this paper a reduced form of the Common-Reflection-Surface traveltime approximation that depends on only four parameters, the so-called Common Diffraction Surface traveltime approximation. By analyzing the convergence of both objective functions and the data enhancement effect after applying the two traveltime approximations to the Marmousi synthetic dataset and a real land dataset, we conclude that the Common-Diffraction-Surface approximation is more efficient within certain aperture limits and preserves at the same time a high image accuracy. The preserved image quality is also observed in a direct comparison after applying both approximations for simulating common-offset sections on noisy pre-stack data.

  1. Modular entanglement.

    PubMed

    Gualdi, Giulia; Giampaolo, Salvatore M; Illuminati, Fabrizio

    2011-02-04

    We introduce and discuss the concept of modular entanglement. This is the entanglement that is established between the end points of modular systems composed by sets of interacting moduli of arbitrarily fixed size. We show that end-to-end modular entanglement scales in the thermodynamic limit and rapidly saturates with the number of constituent moduli. We clarify the mechanisms underlying the onset of entanglement between distant and noninteracting quantum systems and its optimization for applications to quantum repeaters and entanglement distribution and sharing.

  2. Modular space station phase B extension preliminary system design. Volume 5: configuration analyses

    NASA Technical Reports Server (NTRS)

    Stefan, A. J.; Goble, G. J.

    1972-01-01

    The initial and growth modular space station configurations are described, and the evolutionary steps arriving at the final configuration are outlined. Supporting tradeoff studies and analyses such as stress, radiation dosage, and micrometeoroid and thermal protection are included.

  3. Multi-step optimization strategy for fuel-optimal orbital transfer of low-thrust spacecraft

    NASA Astrophysics Data System (ADS)

    Rasotto, M.; Armellin, R.; Di Lizia, P.

    2016-03-01

    An effective method for the design of fuel-optimal transfers in two- and three-body dynamics is presented. The optimal control problem is formulated using the calculus of variations and primer vector theory. This leads to a multi-point boundary value problem (MPBVP), characterized by complex inner constraints and a discontinuous thrust profile. The first issue is addressed by embedding the MPBVP in a parametric optimization problem, thus allowing a simplification of the set of transversality constraints. The second problem is solved by representing the discontinuous control function by a smooth function depending on a continuation parameter. The resulting trajectory optimization method can deal with different intermediate conditions, and no a priori knowledge of the control structure is required. Test cases in both the two- and three-body dynamics show the capability of the method in solving complex trajectory design problems.
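
    The continuation treatment of the discontinuous thrust profile can be illustrated with a one-line smoothing. In the sketch below, a hedged schematic rather than the paper's formulation, the bang-bang throttle implied by a switching function S is replaced by a sigmoid whose sharpness is set by a continuation parameter that is driven toward zero.

```python
import numpy as np

def smoothed_throttle(switching_function, eps):
    """Smooth approximation of a bang-bang control u in [0, 1].
    As eps -> 0 it approaches u = 1 where S < 0 and u = 0 where S > 0
    (the sign convention is an assumption for illustration)."""
    return 1.0 / (1.0 + np.exp(switching_function / eps))

S = np.linspace(-1.0, 1.0, 9)            # sampled switching-function values
for eps in (0.5, 0.1, 0.01):             # continuation: progressively sharper control
    print(eps, np.round(smoothed_throttle(S, eps), 3))
```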

  4. Simulation and Modeling Capability for Standard Modular Hydropower Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Kevin M.; Smith, Brennan T.; Witt, Adam M.

    Grounded in the stakeholder-validated framework established in Oak Ridge National Laboratory’s SMH Exemplary Design Envelope Specification, this report on Simulation and Modeling Capability for Standard Modular Hydropower (SMH) Technology provides insight into the concepts, use cases, needs, gaps, and challenges associated with modeling and simulating SMH technologies. The SMH concept envisions a network of generation, passage, and foundation modules that achieve environmentally compatible, cost-optimized hydropower using standardization and modularity. The development of standardized modeling approaches and simulation techniques for SMH (as described in this report) will pave the way for reliable, cost-effective methods for technology evaluation, optimization, and verification.

  5. “Candidatus Paraporphyromonas polyenzymogenes” encodes multi-modular cellulases linked to the type IX secretion system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naas, A. E.; Solden, L. M.; Norbeck, A. D.

    Background: In nature, obligate herbivorous ruminants have a close symbiotic relationship with their gastrointestinal microbiome, which proficiently deconstructs plant biomass. Despite decades of research, lignocellulose degradation in the rumen has thus far been attributed to a limited number of culturable microorganisms. Here, we combine meta-omics and enzymology to identify and describe a novel Bacteroidetes family ("Candidatus MH11") composed entirely of uncultivated strains that are predominant in ruminants and only distantly related to previously characterized taxa. Results: The first metabolic reconstruction of Ca. MH11-affiliated genome bins, with a particular focus on the provisionally named "Candidatus Paraporphyromonas polyenzymogenes", illustrated their capacity to degrade various lignocellulosic substrates via comprehensive inventories of singular and multi-modular carbohydrate active enzymes (CAZymes). Closer examination revealed an absence of the archetypical polysaccharide utilization loci found in human gut microbiota. Instead, we identified many multi-modular CAZymes putatively secreted via the Bacteroidetes-specific type IX secretion system (T9SS). This included cellulases with two or more catalytic domains, which are modular arrangements that are unique to Bacteroidetes species studied to date. Core metabolic proteins from Ca. P. polyenzymogenes were detected in metaproteomic data and were enriched in rumen-incubated plant biomass, indicating that active saccharification and fermentation of complex carbohydrates could be assigned to members of this novel family. Biochemical analysis of selected Ca. P. polyenzymogenes CAZymes further iterated the cellulolytic activity of this hitherto uncultured bacterium towards linear polymers, such as amorphous and crystalline cellulose as well as mixed linkage β-glucans. Conclusion: We propose that Ca. P. polyenzymogenes genotypes and other Ca. MH11 members actively degrade plant biomass in the rumen of cows, sheep, and most likely other ruminants, utilizing singular and multi-domain catalytic CAZymes secreted through the T9SS. The discovery of a prominent role of multi-modular cellulases in the Gram-negative Bacteroidetes, together with similar findings for Gram-positive cellulosomal bacteria (Ruminococcus flavefaciens) and anaerobic fungi (Orpinomyces sp.), suggests that complex enzymes are essential and have evolved within all major cellulolytic dominions inherent to the rumen.

  6. “Candidatus Paraporphyromonas polyenzymogenes” encodes multi-modular cellulases linked to the type IX secretion system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naas, A. E.; Solden, L. M.; Norbeck, A. D.

    Background: In nature, obligate herbivorous ruminants have a close symbiotic relationship with their gastrointestinal microbiome, which proficiently deconstructs plant biomass. Despite decades of research, lignocellulose degradation in the rumen has thus far been attributed to a limited number of culturable microorganisms. Here, we combine meta-omics and enzymology to identify and describe a novel Bacteroidetes family ("Candidatus MH11") composed entirely of uncultivated strains that are predominant in ruminants and only distantly related to previously characterized taxa. Results: The first metabolic reconstruction of Ca. MH11-affiliated genome bins, with a particular focus on the provisionally named "Candidatus Paraporphyromonas polyenzymogenes", illustrated their capacity to degrade various lignocellulosic substrates via comprehensive inventories of singular and multi-modular carbohydrate active enzymes (CAZymes). Closer examination revealed an absence of archetypical polysaccharide utilization loci found in human gut microbiota. Instead, we identified many multi-modular CAZymes putatively secreted via the Bacteroidetes-specific type IX secretion system (T9SS). This included cellulases with two or more catalytic domains, which are modular arrangements that are unique to Bacteroidetes species studied to date. Core metabolic proteins from Ca. P. polyenzymogenes were detected in metaproteomic data and were enriched in rumen-incubated plant biomass, indicating that active saccharification and fermentation of complex carbohydrates could be assigned to members of this novel family. Biochemical analysis of selected Ca. P. polyenzymogenes CAZymes further iterated the cellulolytic activity of this hitherto uncultured bacterium towards linear polymers, such as amorphous and crystalline cellulose as well as mixed linkage β-glucans. Conclusion: We propose that Ca. P. polyenzymogene genotypes and other Ca. MH11 members actively degrade plant biomass in the rumen of cows, sheep and most likely other ruminants, utilizing singular and multi-domain catalytic CAZymes secreted through the T9SS. The discovery of a prominent role of multi-modular cellulases in the Gram-negative Bacteroidetes, together with similar findings for Gram-positive cellulosomal bacteria (Ruminococcus flavefaciens) and anaerobic fungi (Orpinomyces sp.), suggests that complex enzymes are essential and have evolved within all major cellulolytic dominions inherent to the rumen.

  7. “Candidatus Paraporphyromonas polyenzymogenes” encodes multi-modular cellulases linked to the type IX secretion system

    DOE PAGES

    Naas, A. E.; Solden, L. M.; Norbeck, A. D.; ...

    2018-03-01

    Background: In nature, obligate herbivorous ruminants have a close symbiotic relationship with their gastrointestinal microbiome, which proficiently deconstructs plant biomass. Despite decades of research, lignocellulose degradation in the rumen has thus far been attributed to a limited number of culturable microorganisms. Here, we combine meta-omics and enzymology to identify and describe a novel Bacteroidetes family ("Candidatus MH11") composed entirely of uncultivated strains that are predominant in ruminants and only distantly related to previously characterized taxa. Results: The first metabolic reconstruction of Ca. MH11-affiliated genome bins, with a particular focus on the provisionally named "Candidatus Paraporphyromonas polyenzymogenes", illustrated their capacity to degrade various lignocellulosic substrates via comprehensive inventories of singular and multi-modular carbohydrate active enzymes (CAZymes). Closer examination revealed an absence of archetypical polysaccharide utilization loci found in human gut microbiota. Instead, we identified many multi-modular CAZymes putatively secreted via the Bacteroidetes-specific type IX secretion system (T9SS). This included cellulases with two or more catalytic domains, which are modular arrangements that are unique to Bacteroidetes species studied to date. Core metabolic proteins from Ca. P. polyenzymogenes were detected in metaproteomic data and were enriched in rumen-incubated plant biomass, indicating that active saccharification and fermentation of complex carbohydrates could be assigned to members of this novel family. Biochemical analysis of selected Ca. P. polyenzymogenes CAZymes further iterated the cellulolytic activity of this hitherto uncultured bacterium towards linear polymers, such as amorphous and crystalline cellulose as well as mixed linkage β-glucans. Conclusion: We propose that Ca. P. polyenzymogene genotypes and other Ca. MH11 members actively degrade plant biomass in the rumen of cows, sheep and most likely other ruminants, utilizing singular and multi-domain catalytic CAZymes secreted through the T9SS. The discovery of a prominent role of multi-modular cellulases in the Gram-negative Bacteroidetes, together with similar findings for Gram-positive cellulosomal bacteria (Ruminococcus flavefaciens) and anaerobic fungi (Orpinomyces sp.), suggests that complex enzymes are essential and have evolved within all major cellulolytic dominions inherent to the rumen.

  8. "Candidatus Paraporphyromonas polyenzymogenes" encodes multi-modular cellulases linked to the type IX secretion system.

    PubMed

    Naas, A E; Solden, L M; Norbeck, A D; Brewer, H; Hagen, L H; Heggenes, I M; McHardy, A C; Mackie, R I; Paša-Tolić, L; Arntzen, M Ø; Eijsink, V G H; Koropatkin, N M; Hess, M; Wrighton, K C; Pope, P B

    2018-03-01

    In nature, obligate herbivorous ruminants have a close symbiotic relationship with their gastrointestinal microbiome, which proficiently deconstructs plant biomass. Despite decades of research, lignocellulose degradation in the rumen has thus far been attributed to a limited number of culturable microorganisms. Here, we combine meta-omics and enzymology to identify and describe a novel Bacteroidetes family ("Candidatus MH11") composed entirely of uncultivated strains that are predominant in ruminants and only distantly related to previously characterized taxa. The first metabolic reconstruction of Ca. MH11-affiliated genome bins, with a particular focus on the provisionally named "Candidatus Paraporphyromonas polyenzymogenes", illustrated their capacity to degrade various lignocellulosic substrates via comprehensive inventories of singular and multi-modular carbohydrate active enzymes (CAZymes). Closer examination revealed an absence of archetypical polysaccharide utilization loci found in human gut microbiota. Instead, we identified many multi-modular CAZymes putatively secreted via the Bacteroidetes-specific type IX secretion system (T9SS). This included cellulases with two or more catalytic domains, which are modular arrangements that are unique to Bacteroidetes species studied to date. Core metabolic proteins from Ca. P. polyenzymogenes were detected in metaproteomic data and were enriched in rumen-incubated plant biomass, indicating that active saccharification and fermentation of complex carbohydrates could be assigned to members of this novel family. Biochemical analysis of selected Ca. P. polyenzymogenes CAZymes further iterated the cellulolytic activity of this hitherto uncultured bacterium towards linear polymers, such as amorphous and crystalline cellulose as well as mixed linkage β-glucans. We propose that Ca. P. polyenzymogene genotypes and other Ca. MH11 members actively degrade plant biomass in the rumen of cows, sheep and most likely other ruminants, utilizing singular and multi-domain catalytic CAZymes secreted through the T9SS. The discovery of a prominent role of multi-modular cellulases in the Gram-negative Bacteroidetes, together with similar findings for Gram-positive cellulosomal bacteria (Ruminococcus flavefaciens) and anaerobic fungi (Orpinomyces sp.), suggests that complex enzymes are essential and have evolved within all major cellulolytic dominions inherent to the rumen.

  9. Unconditionally energy stable time stepping scheme for Cahn–Morral equation: Application to multi-component spinodal decomposition and optimal space tiling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tavakoli, Rouhollah, E-mail: rtavakoli@sharif.ir

    An unconditionally energy stable time stepping scheme is introduced to solve Cahn–Morral-like equations in the present study. It is constructed by combining David Eyre's time stepping scheme with a Schur complement approach. Although the presented method is general and independent of the choice of the homogeneous free energy density function, logarithmic and polynomial energy functions are specifically considered in this paper. The method is applied to study spinodal decomposition in multi-component systems and optimal space tiling problems. A penalization strategy is developed, in the case of the latter problem, to avoid trivial solutions. Extensive numerical experiments demonstrate the success and performance of the presented method. According to the numerical results, the method is convergent and energy stable, independent of the choice of time stepsize. Its MATLAB implementation is included in the appendix for the numerical evaluation of the algorithm and reproduction of the presented results. Highlights: extension of Eyre's convex–concave splitting scheme to multiphase systems; efficient solution of spinodal decomposition in multi-component systems; efficient solution of the least-perimeter periodic space partitioning problem; development of a penalization strategy to avoid trivial solutions; presentation of a MATLAB implementation of the introduced algorithm.
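
    For orientation, the sketch below shows a linearly stabilized, Eyre-type semi-implicit step for the scalar (two-component) Cahn-Hilliard equation in one periodic dimension, solved spectrally. It is only a simplified cousin of the paper's scheme: the multi-component Cahn-Morral system, the Schur-complement construction, and the logarithmic free energy are not reproduced, and the stabilization constant is an assumption.

```python
import numpy as np

# 1D periodic grid and Fourier symbols
N, Lx = 256, 2.0 * np.pi
k = np.fft.fftfreq(N, d=Lx / N) * 2.0 * np.pi
k2, k4 = k**2, k**4

eps2, dt, S = 0.01, 0.1, 2.0        # interface parameter^2, time step, stabilization

def ch_step(c):
    """One linearly stabilized semi-implicit (Eyre-type) step for
    c_t = (c^3 - c)_xx - eps2 * c_xxxx on a periodic domain:
    the nonlinear chemical potential is explicit, the biharmonic term and a
    stabilizing Laplacian term are implicit."""
    c_hat = np.fft.fft(c)
    nl_hat = np.fft.fft(c**3 - c)                 # explicit nonlinear term
    num = c_hat * (1.0 + dt * S * k2) - dt * k2 * nl_hat
    den = 1.0 + dt * S * k2 + dt * eps2 * k4
    return np.real(np.fft.ifft(num / den))

rng = np.random.default_rng(0)
c = 0.05 * rng.standard_normal(N)                 # small perturbation of a 50/50 mixture
for _ in range(500):                              # spinodal decomposition proceeds stably
    c = ch_step(c)
print(c.min(), c.max())                           # phases separate toward roughly -1 and +1
```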

  10. Optimal pattern synthesis for speech recognition based on principal component analysis

    NASA Astrophysics Data System (ADS)

    Korsun, O. N.; Poliyev, A. V.

    2018-02-01

    The algorithm for building an optimal pattern for automatic speech recognition, which increases the probability of correct recognition, is developed and presented in this work. The optimal pattern formation is based on decomposing an initial pattern into principal components, which makes it possible to reduce the dimension of the multi-parameter optimization problem. At the next step, training samples are introduced and the optimal estimates of the principal-component decomposition coefficients are obtained by a numerical parameter optimization algorithm. Finally, we consider experimental results that show the improvement in speech recognition achieved by the proposed optimization algorithm.
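
    The two-step idea, reduce to principal components and then optimize the coefficients numerically, can be sketched as below. The scoring function used here (mean correlation with training samples) and all data are hypothetical placeholders for the recognition criterion of the paper.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.optimize import minimize

rng = np.random.default_rng(0)
train = rng.standard_normal((40, 64)) + np.sin(np.linspace(0, 6, 64))  # toy feature vectors

# Step 1: decompose the pattern space into a few principal components
pca = PCA(n_components=5).fit(train)

def negative_score(coeffs):
    """Hypothetical recognition score: mean correlation between the synthesized
    pattern and the training samples (negated so it can be minimized)."""
    pattern = pca.inverse_transform(coeffs.reshape(1, -1)).ravel()
    corr = [np.corrcoef(pattern, t)[0, 1] for t in train]
    return -float(np.mean(corr))

# Step 2: numerically optimize the low-dimensional decomposition coefficients
x0 = pca.transform(train.mean(axis=0, keepdims=True)).ravel()
res = minimize(negative_score, x0, method="Nelder-Mead")
optimal_pattern = pca.inverse_transform(res.x.reshape(1, -1)).ravel()
print(-res.fun)   # achieved mean correlation of the optimal pattern
```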

  11. A Method for Optimal Load Dispatch of a Multi-zone Power System with Zonal Exchange Constraints

    NASA Astrophysics Data System (ADS)

    Hazarika, Durlav; Das, Ranjay

    2018-04-01

    This paper presents a method for economic generation scheduling of a multi-zone power system with inter-zonal operational constraints. For this purpose, generator rescheduling for a multi-area power system with inter-zonal operational constraints is represented as a two-step optimal generation scheduling problem. First, optimal generation scheduling is carried out for the zone having surplus or deficient generation, with proper spinning reserve, using the coordination equation. The power exchange required for the deficit zones and the zones having no generation is estimated based on the load demand and generation of each zone. Incremental transmission loss formulas are then derived for the transmission lines participating in the power transfer among the zones. Using these incremental transmission loss expressions in the coordination equation, the optimal generation scheduling for the zonal exchange is determined. Simulation is carried out on the IEEE 118-bus test system to examine the applicability and validity of the method.
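
    The first step, intra-zone scheduling via the coordination equation with losses neglected, reduces to the classical equal-incremental-cost condition. The sketch below is a generic illustration with hypothetical cost coefficients, not the paper's 118-bus study: lambda is found by bisection so that the zone's units meet the zonal demand within their limits.

```python
def dispatch_zone(gens, demand_mw, tol=1e-6):
    """Equal-incremental-cost dispatch (coordination equation, losses neglected):
    each unit i with cost a + b*P + c*P^2 runs at dC/dP = b + 2cP = lambda,
    and lambda is found by bisection so that total generation meets demand.
    `gens` is a list of (b, c, Pmin, Pmax) tuples."""
    def total_output(lam):
        total = 0.0
        for b, c, pmin, pmax in gens:
            p = (lam - b) / (2.0 * c)
            total += min(max(p, pmin), pmax)      # respect unit limits
        return total

    lo, hi = 0.0, 200.0                           # assumed lambda bracket ($/MWh)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if total_output(mid) < demand_mw:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    schedule = [min(max((lam - b) / (2.0 * c), pmin), pmax) for b, c, pmin, pmax in gens]
    return lam, schedule

# hypothetical three-unit zone serving 500 MW
units = [(8.0, 0.010, 50, 300), (9.0, 0.008, 50, 300), (7.5, 0.012, 50, 250)]
lam, schedule = dispatch_zone(units, 500.0)
print(round(lam, 3), [round(p, 1) for p in schedule])
```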

  12. Epidemic outbreaks in growing scale-free networks with local structure

    NASA Astrophysics Data System (ADS)

    Ni, Shunjiang; Weng, Wenguo; Shen, Shifei; Fan, Weicheng

    2008-09-01

    The class of generative models has attracted considerable interest from researchers in recent years and has much expanded the original ideas described in the BA model. Most of these models assume that only one node joins the network per time step. In this paper, we grow the network by adding n interconnected nodes as a local structure at each time step, with each new node emanating m new edges linking it to the preexisting network by preferential attachment. This successfully generates key features observed in social networks. These include a power-law degree distribution p(k) ~ k^(-γ), whose exponent depends on μ = (n-1)/m, a tuning parameter defined as the modularity strength of the network, as well as nontrivial clustering, assortative mixing, and modular structure. Moreover, all these features depend in a similar way on the parameter μ. We then study susceptible-infected epidemics on this network with identical infectivity, and find that the initial epidemic behavior is governed by both the infection scheme and the network structure, especially the modularity strength. The modularity of the network makes the spreading velocity much lower than that of the BA model. On the other hand, increasing the modularity strength accelerates the propagation velocity.
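
    A minimal generator for this growth rule is sketched below: at each time step a fully interconnected group of n new nodes is added, and each new node also attaches m edges to existing nodes chosen with probability proportional to degree. Details such as sampling distinct targets and freezing the degree weights within a step are implementation assumptions.

```python
import random
import networkx as nx

def grow_local_structure_network(n, m, steps, seed=0):
    """Growing network: start from a small clique; at every time step add n new,
    fully interconnected nodes, and connect each new node to m existing nodes
    chosen with probability proportional to their current degree."""
    rng = random.Random(seed)
    G = nx.complete_graph(max(n, m + 1))          # small seed network
    for _ in range(steps):
        new_nodes = [G.number_of_nodes() + i for i in range(n)]
        existing = list(G.nodes())
        weights = [G.degree(v) for v in existing]  # degrees frozen within the step
        for u in new_nodes:
            G.add_node(u)
            # preferential attachment to m distinct existing nodes
            targets = set()
            while len(targets) < m:
                targets.add(rng.choices(existing, weights=weights, k=1)[0])
            G.add_edges_from((u, t) for t in targets)
        # local structure: interconnect the n new nodes with each other
        G.add_edges_from((a, b) for i, a in enumerate(new_nodes) for b in new_nodes[i + 1:])
    return G

G = grow_local_structure_network(n=3, m=2, steps=200)
print(G.number_of_nodes(), G.number_of_edges())
```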

  13. Supervisory Control System Architecture for Advanced Small Modular Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cetiner, Sacit M; Cole, Daniel L; Fugate, David L

    2013-08-01

    This technical report was generated as a product of the Supervisory Control for Multi-Modular SMR Plants project within the Instrumentation, Control and Human-Machine Interface technology area under the Advanced Small Modular Reactor (SMR) Research and Development Program of the U.S. Department of Energy. The report documents the definition of strategies, functional elements, and the structural architecture of a supervisory control system for multi-modular advanced SMR (AdvSMR) plants. This research activity advances the state of the art by incorporating decision making into the supervisory control system architectural layers through the introduction of a tiered-plant system approach. The report provides a brief history of hierarchical functional architectures and the current state-of-the-art, describes a reference AdvSMR to show the dependencies between systems, presents a hierarchical structure for supervisory control, indicates the importance of understanding trip setpoints, applies a new theoretic approach for comparing architectures, identifies cyber security controls that should be addressed early in system design, and describes ongoing work to develop system requirements and hardware/software configurations.

  14. Program document for Energy Systems Optimization Program 2 (ESOP2). Volume 1: Engineering manual

    NASA Technical Reports Server (NTRS)

    Hamil, R. G.; Ferden, S. L.

    1977-01-01

    The Energy Systems Optimization Program, which is used to provide analyses of Modular Integrated Utility Systems (MIUS), is discussed. Modifications to the input format to allow modular inputs in specified blocks of data are described. An optimization feature which enables the program to search automatically for the minimum value of one parameter while varying the value of other parameters is reported. New program option flags for prime mover analyses and solar energy for space heating and domestic hot water are also covered.

  15. Learning Multirobot Hose Transportation and Deployment by Distributed Round-Robin Q-Learning.

    PubMed

    Fernandez-Gauna, Borja; Etxeberria-Agiriano, Ismael; Graña, Manuel

    2015-01-01

    Multi-Agent Reinforcement Learning (MARL) algorithms face two main difficulties: the curse of dimensionality, and environment non-stationarity due to the independent learning processes carried out by the agents concurrently. In this paper we formalize and prove the convergence of a Distributed Round Robin Q-learning (D-RR-QL) algorithm for cooperative systems. The computational complexity of this algorithm increases linearly with the number of agents. Moreover, it eliminates environment non-stationarity by carrying out a round-robin scheduling of the action selection and execution. This learning scheme allows the implementation of Modular State-Action Vetoes (MSAV) in cooperative multi-agent systems, which speeds up learning convergence in over-constrained systems by vetoing state-action pairs which lead to undesired termination states (UTS) in the relevant state-action subspace. Each agent's local state-action value function learning is an independent process, including the MSAV policies. Coordination of locally optimal policies to obtain the global optimal joint policy is achieved by a greedy selection procedure using message passing. We show that D-RR-QL improves over state-of-the-art approaches, such as Distributed Q-Learning, Team Q-Learning and Coordinated Reinforcement Learning, in a paradigmatic Linked Multi-Component Robotic System (L-MCRS) control problem: the hose transportation task. L-MCRS are over-constrained systems with many UTS induced by the interaction of the passive linking element and the active mobile robots.
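
    The round-robin idea itself is simple to sketch. The code below is a schematic tabular Q-learner in which agents act and update strictly in turn, so the environment looks stationary to each agent between its own turns; the MSAV vetoes, the convergence proof, and the message-passing coordination of the paper are not reproduced, and the environment interface is hypothetical.

```python
import random
from collections import defaultdict

class RoundRobinQLearners:
    """Tabular Q-learning where agents act and learn one at a time, in a fixed
    round-robin order."""
    def __init__(self, n_agents, actions, alpha=0.1, gamma=0.95, eps=0.1, seed=0):
        self.q = [defaultdict(float) for _ in range(n_agents)]
        self.actions, self.alpha, self.gamma, self.eps = actions, alpha, gamma, eps
        self.rng = random.Random(seed)

    def act(self, agent, state):
        if self.rng.random() < self.eps:                       # epsilon-greedy exploration
            return self.rng.choice(self.actions)
        return max(self.actions, key=lambda a: self.q[agent][(state, a)])

    def learn(self, agent, state, action, reward, next_state):
        best_next = max(self.q[agent][(next_state, a)] for a in self.actions)
        td = reward + self.gamma * best_next - self.q[agent][(state, action)]
        self.q[agent][(state, action)] += self.alpha * td

def run(env, learners, episodes=100):
    """Usage skeleton against a hypothetical environment exposing
    reset() -> state and step(agent, action) -> (next_state, reward, done)."""
    for _ in range(episodes):
        state, done, turn = env.reset(), False, 0
        while not done:
            agent = turn % len(learners.q)            # round-robin scheduling
            action = learners.act(agent, state)
            next_state, reward, done = env.step(agent, action)
            learners.learn(agent, state, action, reward, next_state)
            state, turn = next_state, turn + 1
```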

  16. 76 FR 31951 - Energy Conservation Program for Certain Commercial and Industrial Equipment: Decision and Order...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-02

    ... specific to the Carrier Super Modular Multi-System (SMMSi) variable refrigerant flow (VRF) multi-split... in this notice to test and rate its SMMSi VRF multi-split commercial heat pumps. DATES: This Decision... its SMMSi VRF multi-split products. Carrier must use the alternate test procedure provided in this...

  17. Accurate, predictable, repeatable micro-assembly technology for polymer, microfluidic modules.

    PubMed

    Lee, Tae Yoon; Han, Kyudong; Barrett, Dwhyte O; Park, Sunggook; Soper, Steven A; Murphy, Michael C

    2018-01-01

    A method for the design, construction, and assembly of modular, polymer-based, microfluidic devices using simple micro-assembly technology was demonstrated to build an integrated fluidic system consisting of vertically stacked modules for carrying out multi-step molecular assays. As an example of the utility of the modular system, point mutation detection using the ligase detection reaction (LDR) following amplification by the polymerase chain reaction (PCR) was carried out. Fluid interconnects and standoffs ensured that temperatures in the vertically stacked reactors were within ±0.2 °C at the center of the temperature zones and ±1.1 °C overall. The vertical spacing between modules was confirmed using finite element models (ANSYS, Inc., Canonsburg, PA) to simulate the steady-state temperature distribution for the assembly. Passive alignment structures, including a hemispherical pin-in-hole, a hemispherical pin-in-slot, and a plate-plate lap joint, were developed using screw theory to enable accurate, exactly constrained assembly of the microfluidic reactors, cover sheets, and fluid interconnects to facilitate the modular approach. The mean mismatch between the centers of adjacent through holes was 64 ± 7.7 μm, significantly reducing the dead volume necessary to accommodate manufacturing variation. The microfluidic components were easily assembled by hand, and the assembly of several different configurations of microfluidic modules for executing the assay was evaluated. Temperatures were measured in the desired range in each reactor. The biochemical performance was comparable to that obtained with benchtop instruments, but the assay took less than 45 min to execute, half the time.

  18. Modeling hospital surgical delivery process design using system simulation: optimizing patient flow and bed capacity as an illustration.

    PubMed

    Kumar, Sameer

    2011-01-01

    It is increasingly recognized that hospital operation is an intricate system with limited resources and many interacting sources of both positive and negative feedback. The purpose of this study is to design a surgical delivery process in a U.S. county hospital in which patient flow through a surgical ward is optimized. System simulation modeling is used to address questions of capacity planning, throughput management, and interacting resources, which constitute the constantly changing complexity that characterizes designing a contemporary surgical delivery process in a hospital. The steps in building a system simulation model are demonstrated using the example of a county hospital in a small U.S. city, which is used to illustrate modular system simulation modeling of patient surgery process flows. The resulting model shows planners and designers how they can build overall efficiencies into a healthcare facility through optimal bed capacity for peak patient flow of emergency and routine patients.
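
    A minimal discrete-event sketch of the bed-capacity question is given below, using the simpy library (assumed available) with hypothetical arrival and length-of-stay parameters; it is an illustration of the modeling style, not the county-hospital model of the study.

```python
import random
import simpy

RANDOM_SEED, SIM_HOURS, N_BEDS = 42, 24 * 30, 20
ARRIVAL_MEAN_H, LOS_MEAN_H = 3.0, 36.0      # hypothetical arrival gap and length of stay
waits = []

def patient(env, beds):
    arrived = env.now
    with beds.request() as req:             # wait for a free surgical bed
        yield req
        waits.append(env.now - arrived)
        yield env.timeout(random.expovariate(1.0 / LOS_MEAN_H))  # occupy the bed

def arrivals(env, beds):
    while True:
        yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN_H))
        env.process(patient(env, beds))

random.seed(RANDOM_SEED)
env = simpy.Environment()
beds = simpy.Resource(env, capacity=N_BEDS)
env.process(arrivals(env, beds))
env.run(until=SIM_HOURS)
print(f"patients seen: {len(waits)}, mean wait for a bed: {sum(waits)/len(waits):.1f} h")
```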

  19. Road screening and distribution route multi-objective robust optimization for hazardous materials based on neural network and genetic algorithm.

    PubMed

    Ma, Changxi; Hao, Wei; Pan, Fuquan; Xiang, Wang

    2018-01-01

    Route optimization for hazardous materials transportation is one of the basic steps in ensuring the safety of hazardous materials transportation. The optimization scheme may pose a safety risk if road screening is not completed before the distribution route is optimized. For the road screening problem in hazardous materials transportation, a road screening algorithm is built based on a genetic algorithm and a Levenberg-Marquardt neural network (GA-LM-NN) by analyzing 15 attributes of each road network section. A multi-objective robust optimization model with adjustable robustness is then constructed for the hazardous materials transportation problem of a single distribution center, minimizing transportation risk and time. A multi-objective genetic algorithm is designed to solve the problem according to the characteristics of the model. The algorithm uses an improved strategy to complete the selection operation, applies partial matching cross shift and single ortho swap methods to complete the crossover and mutation operations, and employs an exclusive method to construct Pareto optimal solutions. Studies show that the set of roads suitable for hazardous materials transportation can be identified quickly with the proposed GA-LM-NN screening algorithm, while distribution-route Pareto solutions with different levels of robustness can be found rapidly with the proposed multi-objective robust optimization model and algorithm.

  20. The multi-purpose three-axis spectrometer (TAS) MIRA at FRM II

    NASA Astrophysics Data System (ADS)

    Georgii, R.; Weber, T.; Brandl, G.; Skoulatos, M.; Janoschek, M.; Mühlbauer, S.; Pfleiderer, C.; Böni, P.

    2018-02-01

    The cold-neutron three-axis spectrometer MIRA is an instrument optimized for low-energy excitations. Its excellent intrinsic Q-resolution makes it ideal for studying incommensurate magnetic systems (elastic and inelastic). MIRA is at the forefront of using advanced neutron focusing optics such as elliptic guides, which enable the investigation of small samples under extreme conditions. Another advantage of MIRA is the modular assembly allowing for instrumental adaption to the needs of the experiment within a few hours. The development of new methods such as the spin-echo technique MIEZE is another important application at MIRA. Scientific topics include the investigation of complex inter-metallic alloys and spectroscopy on incommensurate magnetic structures.

  1. Programmable Bio-Nano-Chip Systems for Serum CA125 Quantification: Towards Ovarian Cancer Diagnostics at the Point-of-Care

    PubMed Central

    Raamanathan, Archana; Simmons, Glennon W.; Christodoulides, Nicolaos; Floriano, Pierre N.; Furmaga, Wieslaw B.; Redding, Spencer W.; Lu, Karen H.; Bast, Robert C.; McDevitt, John T.

    2013-01-01

    Point-of-care (POC) implementation of early detection and screening methodologies for ovarian cancer may enable improved survival rates through early intervention. Current laboratory-confined immunoanalyzers have long turnaround times and are often incompatible with multiplexing and POC implementation. Rapid, sensitive and multiplexable POC diagnostic platforms compatible with promising early detection approaches for ovarian cancer are needed. To this end, we report the adaptation of the programmable bio-nano-chip (p-BNC), an integrated, microfluidic, modular (Programmable) platform for CA125 serum quantitation, a biomarker prominently implicated in multi-modal and multi-marker screening approaches. In the p-BNC, CA125 from diseased sera (Bio) is sequestered and assessed with a fluorescence-based sandwich immunoassay, completed in the nano-nets (Nano) of sensitized agarose microbeads localized in individually addressable wells (Chip), housed in a microfluidic module, capable of integrating multiple sample, reagent and biowaste processing and handling steps. Antibody pairs that bind to distinct epitopes on CA125 were screened. To permit efficient biomarker sequestration in a 3-D microfluidic environment, the p-BNC operating variables (incubation times, flow rates and reagent concentrations) were tuned to deliver optimal analytical performance under 45 minutes. With short analysis times, competitive analytical performance (Inter- and intra-assay precision of 1.2% and 1.9% and LODs of 1.0 U/mL) was achieved on this mini-sensor ensemble. Further validation with sera of ovarian cancer patients (n=20) demonstrated excellent correlation (R2 = 0.97) with gold-standard ELISA. Building on the integration capabilities of novel microfluidic systems programmed for ovarian cancer, the rapid, precise and sensitive miniaturized p-BNC system shows strong promise for ovarian cancer diagnostics. PMID:22490510

  2. Multi-GPU implementation of a VMAT treatment plan optimization algorithm.

    PubMed

    Tian, Zhen; Peng, Fei; Folkerts, Michael; Tan, Jun; Jia, Xun; Jiang, Steve B

    2015-06-01

    Volumetric modulated arc therapy (VMAT) optimization is a computationally challenging problem due to its large data size, high degrees of freedom, and many hardware constraints. High-performance graphics processing units (GPUs) have been used to speed up the computations. However, a GPU's relatively small memory cannot handle cases with a large dose-deposition coefficient (DDC) matrix, e.g., those with a large target size, multiple targets, multiple arcs, and/or small beamlet size. The main purpose of this paper is to report an implementation of a column-generation-based VMAT algorithm, previously developed in the authors' group, on a multi-GPU platform to solve the memory limitation problem. While the column-generation-based VMAT algorithm has been previously developed, the GPU implementation details have not been reported. Hence, another purpose is to present detailed techniques employed for GPU implementation. The authors also would like to utilize this particular problem as an example problem to study the feasibility of using a multi-GPU platform to solve large-scale problems in medical physics. The column-generation approach generates VMAT apertures sequentially by solving a pricing problem (PP) and a master problem (MP) iteratively. In the authors' method, the sparse DDC matrix is first stored on a CPU in coordinate list format (COO). On the GPU side, this matrix is split into four submatrices according to beam angles, which are stored on four GPUs in compressed sparse row format. Computation of the beamlet price, the first step in the PP, is accomplished using multiple GPUs. A fast inter-GPU data transfer scheme is accomplished using peer-to-peer access. The remaining steps of the PP and MP problems are implemented on the CPU or a single GPU due to their modest problem scale and computational loads. The Barzilai-Borwein algorithm with a subspace step scheme is adopted here to solve the MP problem. A head and neck (H&N) cancer case is then used to validate the authors' method. The authors also compare their multi-GPU implementation with three different single-GPU implementation strategies, i.e., truncating the DDC matrix (S1), repeatedly transferring the DDC matrix between CPU and GPU (S2), and porting computations involving the DDC matrix to the CPU (S3), in terms of both plan quality and computational efficiency. Two more H&N patient cases and three prostate cases are used to demonstrate the advantages of the authors' method. The authors' multi-GPU implementation can finish the optimization process within ∼1 min for the H&N patient case. S1 leads to an inferior plan quality although its total time was 10 s shorter than the multi-GPU implementation due to the reduced matrix size. S2 and S3 yield the same plan quality as the multi-GPU implementation but take ∼4 and ∼6 min, respectively. High computational efficiency was consistently achieved for the other five patient cases tested, with VMAT plans of clinically acceptable quality obtained within 23-46 s. Conversely, to obtain clinically comparable or acceptable plans for all six of the VMAT cases tested in this paper, the optimization time needed in a commercial TPS system on CPU was found to be on the order of several minutes. The results demonstrate that the multi-GPU implementation of the authors' column-generation-based VMAT optimization can handle the large-scale VMAT optimization problem efficiently without sacrificing plan quality. The authors' study may serve as an example to shed some light on other large-scale medical physics problems that require multi-GPU techniques.
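
    The CPU-side data layout step described above (COO storage split into per-beam-angle submatrices in CSR format) can be illustrated with scipy.sparse; the sketch below uses a synthetic DDC matrix and random angle groups, and the actual distribution of the four submatrices to four GPUs is not shown.

```python
import numpy as np
from scipy import sparse

rng = np.random.default_rng(0)
n_voxels, n_beamlets, n_groups = 5000, 1200, 4

# Toy sparse dose-deposition-coefficient (DDC) matrix in COO format (CPU side)
ddc = sparse.random(n_voxels, n_beamlets, density=0.01, format="coo", random_state=0)

# Each beamlet belongs to a beam angle; angles are binned into four groups,
# mirroring a four-GPU split (the group assignment here is synthetic)
beamlet_angle_group = rng.integers(0, n_groups, size=n_beamlets)

ddc_csr = ddc.tocsr()
submatrices = []
for g in range(n_groups):
    cols = np.flatnonzero(beamlet_angle_group == g)
    sub = ddc_csr[:, cols].tocsr()          # per-group submatrix in CSR format
    submatrices.append((cols, sub))         # keep the column map for reassembly

print([s.shape for _, s in submatrices], [s.nnz for _, s in submatrices])
```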

  3. Analysis of a modular generator for high-voltage, high-frequency pulsed applications, using low voltage semiconductors (< 1 kV) and series connected step-up (1:10) transformers.

    PubMed

    Redondo, L M; Fernando Silva, J; Margato, E

    2007-03-01

    This article discusses the operation of a modular generator topology, which has been developed for high-frequency (kHz), high-voltage (kV) pulsed applications. The proposed generator uses individual modules, each consisting of a pulse circuit based on a modified forward converter, which takes advantage of the required low duty cycle to operate with a low-voltage clamp reset circuit for the step-up transformer. This reduces the maximum voltage on the semiconductor devices on both the primary and secondary transformer sides. The secondary winding of each step-up transformer is series connected, delivering a fraction of the total voltage. Each individual pulsed module is supplied via an isolation transformer. The assembled modular laboratory prototype, with three 5 kV modules, 800 V semiconductor switches, and 1:10 step-up transformers, has 80% efficiency and is capable of delivering, into resistive loads, -15 kV, 1 A pulses of 5 μs width at a 10 kHz repetition rate, with less than 1 μs pulse rise time. Experimental results for resistive loads are presented and discussed.

  4. Modular "plug-and-play" capsules for multi-capsule environment in the gastrointestinal tract.

    PubMed

    Phee, S J; Ting, E K; Lin, L; Huynh, V A; Kencana, A P; Wong, K J; Tan, S L

    2009-01-01

    The invention of wireless capsule endoscopy has opened new ways of diagnosing and treating diseases in the gastrointestinal tract. Current wireless capsules can perform simple operations such as imaging and data collection (like temperature, pressure, and pH) in the gastrointestinal tract. Researchers are now focusing on adding more sophisticated functions such as drug delivery, surgical clips/tags deployment, and tissue samples collection. The finite on-board power on these capsules is one of the factors that limits the functionalities of these wireless capsules. Thus multiple application-specific capsules would be needed to complete an endoscopic operation. This would give rise to a multi-capsule environment. Having a modular "plug-and-play" capsule design would facilitate doctors in configuring multiple application-specific capsules, e.g. tagging capsule, for use in the gastrointestinal tract. This multi-capsule environment also has the advantage of reducing power consumption through asymmetric multi-hop communication.

  5. A simple and versatile design concept for fluorophore derivatives with intramolecular photostabilization

    NASA Astrophysics Data System (ADS)

    van der Velde, Jasper H. M.; Oelerich, Jens; Huang, Jingyi; Smit, Jochem H.; Aminian Jazi, Atieh; Galiani, Silvia; Kolmakov, Kirill; Guoridis, Giorgos; Eggeling, Christian; Herrmann, Andreas; Roelfes, Gerard; Cordes, Thorben

    2016-01-01

    Intramolecular photostabilization via triplet-state quenching was recently revived as a tool to impart synthetic organic fluorophores with 'self-healing' properties. To date, utilization of such fluorophore derivatives is rare due to their elaborate multi-step synthesis. Here we present a general strategy to covalently link a synthetic organic fluorophore simultaneously to a photostabilizer and biomolecular target via unnatural amino acids. The modular approach uses commercially available starting materials and simple chemical transformations. The resulting photostabilizer-dye conjugates are based on rhodamines, carbopyronines and cyanines with excellent photophysical properties, that is, high photostability and minimal signal fluctuations. Their versatile use is demonstrated by single-step labelling of DNA, antibodies and proteins, as well as applications in single-molecule and super-resolution fluorescence microscopy. We are convinced that the presented scaffolding strategy and the improved characteristics of the conjugates in applications will trigger the broader use of intramolecular photostabilization and help this approach emerge as a new gold standard.

  6. A simple and versatile design concept for fluorophore derivatives with intramolecular photostabilization

    PubMed Central

    van der Velde, Jasper H. M.; Oelerich, Jens; Huang, Jingyi; Smit, Jochem H.; Aminian Jazi, Atieh; Galiani, Silvia; Kolmakov, Kirill; Gouridis, Giorgos; Eggeling, Christian; Herrmann, Andreas; Roelfes, Gerard; Cordes, Thorben

    2016-01-01

    Intramolecular photostabilization via triplet-state quenching was recently revived as a tool to impart synthetic organic fluorophores with 'self-healing' properties. To date, utilization of such fluorophore derivatives is rare due to their elaborate multi-step synthesis. Here we present a general strategy to covalently link a synthetic organic fluorophore simultaneously to a photostabilizer and biomolecular target via unnatural amino acids. The modular approach uses commercially available starting materials and simple chemical transformations. The resulting photostabilizer-dye conjugates are based on rhodamines, carbopyronines and cyanines with excellent photophysical properties, that is, high photostability and minimal signal fluctuations. Their versatile use is demonstrated by single-step labelling of DNA, antibodies and proteins, as well as applications in single-molecule and super-resolution fluorescence microscopy. We are convinced that the presented scaffolding strategy and the improved characteristics of the conjugates in applications will trigger the broader use of intramolecular photostabilization and help this approach emerge as a new gold standard. PMID:26751640

  7. A modular platform for one-step assembly of multi-component membrane systems by fusion of charged proteoliposomes

    NASA Astrophysics Data System (ADS)

    Ishmukhametov, Robert R.; Russell, Aidan N.; Berry, Richard M.

    2016-10-01

    An important goal in synthetic biology is the assembly of biomimetic cell-like structures, which combine multiple biological components in synthetic lipid vesicles. A key limiting assembly step is the incorporation of membrane proteins into the lipid bilayer of the vesicles. Here we present a simple method for delivery of membrane proteins into a lipid bilayer within 5 min. Fusogenic proteoliposomes, containing charged lipids and membrane proteins, fuse with oppositely charged bilayers, with no requirement for detergent or fusion-promoting proteins, and deliver large, fragile membrane protein complexes into the target bilayers. We demonstrate the feasibility of our method by assembling a minimal electron transport chain capable of adenosine triphosphate (ATP) synthesis, combining Escherichia coli F1Fo ATP-synthase and the primary proton pump bo3-oxidase, into synthetic lipid vesicles with sizes ranging from 100 nm to ~10 μm. This provides a platform for the combination of multiple sets of membrane protein complexes into cell-like artificial structures.

  8. FACETS: multi-faceted functional decomposition of protein interaction networks

    PubMed Central

    Seah, Boon-Siew; Bhowmick, Sourav S.; Forbes Dewey, C.

    2012-01-01

    Motivation: The availability of large-scale curated protein interaction datasets has given rise to the opportunity to investigate higher level organization and modularity within the protein–protein interaction (PPI) network using graph theoretic analysis. Despite the recent progress, systems level analysis of high-throughput PPIs remains a daunting task because of the amount of data they present. In this article, we propose a novel PPI network decomposition algorithm called FACETS in order to make sense of the deluge of interaction data using Gene Ontology (GO) annotations. FACETS finds not just a single functional decomposition of the PPI network, but a multi-faceted atlas of functional decompositions that portray alternative perspectives of the functional landscape of the underlying PPI network. Each facet in the atlas represents a distinct interpretation of how the network can be functionally decomposed and organized. Our algorithm maximizes the interpretative value of the atlas by optimizing inter-facet orthogonality and intra-facet cluster modularity. Results: We tested our algorithm on the global networks from IntAct, compared it with gold standard datasets from MIPS and KEGG, and demonstrated the performance of FACETS. We also performed a case study that illustrates the utility of our approach. Contact: seah0097@ntu.edu.sg or assourav@ntu.edu.sg Supplementary information: Supplementary data are available at Bioinformatics online. Availability: Our software is available freely for non-commercial purposes from: http://www.cais.ntu.edu.sg/∼assourav/Facets/ PMID:22908217

  9. Treatment Effects of a Modular Intervention for Early-Onset Child Behavior Problems on Family Contextual Outcomes

    ERIC Educational Resources Information Center

    Shaffer, Anne; Lindhiem, Oliver; Kolko, David J.

    2013-01-01

    The overall aim of this multi-informant study was to examine pre-post treatment changes, and maintenance at 3-year follow-up, for multiple dimensions of the family context, for a modular intervention that has previously demonstrated significant clinical improvements in child behavior and maintenance of these effects. Family outcomes included…

  10. Multi-objective optimization for generating a weighted multi-model ensemble

    NASA Astrophysics Data System (ADS)

    Lee, H.

    2017-12-01

    Many studies have demonstrated that multi-model ensembles generally show better skill than each ensemble member. When generating weighted multi-model ensembles, the first step is measuring the performance of individual model simulations using observations. There is a consensus on the assignment of weighting factors based on a single evaluation metric. When considering only one evaluation metric, the weighting factor for each model is proportional to a performance score or inversely proportional to an error for the model. While this conventional approach can provide appropriate combinations of multiple models, the approach confronts a big challenge when there are multiple metrics under consideration. When considering multiple evaluation metrics, it is obvious that a simple averaging of multiple performance scores or model ranks does not address the trade-off problem between conflicting metrics. So far, there seems to be no best method to generate weighted multi-model ensembles based on multiple performance metrics. The current study applies the multi-objective optimization, a mathematical process that provides a set of optimal trade-off solutions based on a range of evaluation metrics, to combining multiple performance metrics for the global climate models and their dynamically downscaled regional climate simulations over North America and generating a weighted multi-model ensemble. NASA satellite data and the Regional Climate Model Evaluation System (RCMES) software toolkit are used for assessment of the climate simulations. Overall, the performance of each model differs markedly with strong seasonal dependence. Because of the considerable variability across the climate simulations, it is important to evaluate models systematically and make future projections by assigning optimized weighting factors to the models with relatively good performance. Our results indicate that the optimally weighted multi-model ensemble always shows better performance than an arithmetic ensemble mean and may provide reliable future projections.
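
    As a rough illustration of the single-metric weighting step described above, the sketch below builds an inverse-error weighted ensemble from synthetic data; it is not the RCMES workflow, and the multi-objective trade-off across several metrics is not reproduced. All model and observation arrays are made up.

```python
# Minimal sketch (not the paper's workflow): inverse-error weighting of an
# ensemble for a single metric, compared with the arithmetic ensemble mean.
# All arrays and numbers here are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
truth = rng.normal(size=100)                      # stand-in "observations"
models = truth + rng.normal(scale=[[0.3], [0.6], [1.2]], size=(3, 100))  # 3 models

rmse = np.sqrt(((models - truth) ** 2).mean(axis=1))   # one evaluation metric
weights = (1.0 / rmse) / (1.0 / rmse).sum()            # weight ~ inverse error

weighted_mean = weights @ models
arithmetic_mean = models.mean(axis=0)

def skill(est):
    return np.sqrt(((est - truth) ** 2).mean())

print("per-model RMSE:", rmse.round(3))
print("weighted ensemble RMSE:  ", round(skill(weighted_mean), 3))
print("arithmetic ensemble RMSE:", round(skill(arithmetic_mean), 3))
```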

  11. Communication: An efficient approach to compute state-specific nuclear gradients for a generic state-averaged multi-configuration self consistent field wavefunction.

    PubMed

    Granovsky, Alexander A

    2015-12-21

    We present a new, very efficient semi-numerical approach for the computation of state-specific nuclear gradients of a generic state-averaged multi-configuration self consistent field wavefunction. Our approach eliminates the costly coupled-perturbed multi-configuration Hartree-Fock step as well as the associated integral transformation stage. The details of the implementation within the Firefly quantum chemistry package are discussed and several sample applications are given. The new approach is routinely applicable to geometry optimization of molecular systems with 1000+ basis functions using a standalone multi-core workstation.

  12. Communication: An efficient approach to compute state-specific nuclear gradients for a generic state-averaged multi-configuration self consistent field wavefunction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granovsky, Alexander A., E-mail: alex.granovsky@gmail.com

    We present a new, very efficient semi-numerical approach for the computation of state-specific nuclear gradients of a generic state-averaged multi-configuration self consistent field wavefunction. Our approach eliminates the costly coupled-perturbed multi-configuration Hartree-Fock step as well as the associated integral transformation stage. The details of the implementation within the Firefly quantum chemistry package are discussed and several sample applications are given. The new approach is routinely applicable to geometry optimization of molecular systems with 1000+ basis functions using a standalone multi-core workstation.

  13. Criteria for software modularization

    NASA Technical Reports Server (NTRS)

    Card, David N.; Page, Gerald T.; Mcgarry, Frank E.

    1985-01-01

    A central issue in programming practice involves determining the appropriate size and information content of a software module. This study attempted to determine the effectiveness of two widely used criteria for software modularization, strength and size, in reducing fault rate and development cost. Data from 453 FORTRAN modules developed by professional programmers were analyzed. The results indicated that module strength is a good criterion with respect to fault rate, whereas arbitrary module size limitations inhibit programmer productivity. This analysis is a first step toward defining empirically based standards for software modularization.

  14. Constant Communities in Complex Networks

    NASA Astrophysics Data System (ADS)

    Chakraborty, Tanmoy; Srinivasan, Sriram; Ganguly, Niloy; Bhowmick, Sanjukta; Mukherjee, Animesh

    2013-05-01

    Identifying community structure is a fundamental problem in network analysis. Most community detection algorithms are based on optimizing a combinatorial parameter, for example modularity. This optimization is generally NP-hard, and merely changing the vertex order can alter the assignment of vertices to communities. However, there has been little study of how vertex ordering influences the results of community detection algorithms. Here we identify and study the properties of invariant groups of vertices (constant communities) whose assignment to communities is, quite remarkably, not affected by vertex ordering. The percentage of constant communities can vary across different applications, and based on empirical results we propose metrics to evaluate these communities. Using constant communities as a pre-processing step, one can significantly reduce the variation of the results. Finally, we present a case study on a phoneme network and illustrate that constant communities, quite strikingly, form the core functional units of the larger communities.
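
    The sketch below illustrates, under simplifying assumptions, how constant communities can be extracted: an order-dependent community detector (plain label propagation here, not the algorithms used in the study) is run under several random vertex orderings, and vertices that are co-assigned in every run are grouped together. The toy graph is invented for illustration.

```python
# Minimal sketch of the "constant communities" idea (not the authors' code).
import random
from collections import Counter, defaultdict

graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}

def label_propagation(graph, order, seed=0):
    """Order-dependent label propagation: each vertex adopts its neighbours' majority label."""
    rnd = random.Random(seed)
    labels = {v: v for v in graph}
    for _ in range(20):                              # fixed number of sweeps for simplicity
        changed = False
        for v in order:
            counts = Counter(labels[u] for u in graph[v])
            best = max(counts.values())
            choice = rnd.choice([lab for lab, c in counts.items() if c == best])
            if labels[v] != choice:
                labels[v], changed = choice, True
        if not changed:
            break
    return labels

runs = []
vertices = list(graph)
for r in range(10):
    order = vertices[:]
    random.Random(r).shuffle(order)                  # vary the vertex ordering
    runs.append(label_propagation(graph, order, seed=r))

# Vertices sharing the same label in every run were co-assigned in every run.
constant = defaultdict(list)
for v in vertices:
    constant[tuple(run[v] for run in runs)].append(v)

print("constant communities:", [sorted(g) for g in constant.values()])
```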

  15. Multidimensional heuristic process for high-yield production of astaxanthin and fragrance molecules in Escherichia coli.

    PubMed

    Zhang, Congqiang; Seow, Vui Yin; Chen, Xixian; Too, Heng-Phon

    2018-05-11

    Optimization of metabolic pathways consisting of a large number of genes is challenging. Multivariate modular methods (MMMs) are currently available solutions, in which reduced regulatory complexity is achieved by grouping multiple genes into modules. However, these methods work well for balancing inter-module but not intra-module activities. In addition, application of MMMs to the 15-step heterologous route of astaxanthin biosynthesis has met with limited success. Here, we expand the solution space of MMMs and develop a multidimensional heuristic process (MHP). MHP can simultaneously balance different modules by varying promoter strength and coordinate intra-module activities by using ribosome binding sites (RBSs) and enzyme variants. Consequently, MHP increases enantiopure 3S,3'S-astaxanthin production to 184 mg l⁻¹ day⁻¹ or 320 mg l⁻¹. Similarly, MHP improves the yields of nerolidol and linalool. MHP may be useful for optimizing other complex biochemical pathways.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graf, Peter; Dykes, Katherine; Scott, George

    The layout of turbines in a wind farm is already a challenging nonlinear, nonconvex, nonlinearly constrained continuous global optimization problem. Here we begin to address the next generation of wind farm optimization problems by adding the complexity that there is more than one turbine type to choose from. The optimization becomes a nonlinear constrained mixed integer problem, which is a very difficult class of problems to solve. Furthermore, this document briefly summarizes the algorithm and code we have developed, the code validation steps we have performed, and the initial results for multi-turbine type and placement optimization (TTP_OPT) we have run.

  17. Airborne net-centric multi-INT sensor control, display, fusion, and exploitation systems

    NASA Astrophysics Data System (ADS)

    Linne von Berg, Dale C.; Lee, John N.; Kruer, Melvin R.; Duncan, Michael D.; Olchowski, Fred M.; Allman, Eric; Howard, Grant

    2004-08-01

    The NRL Optical Sciences Division has initiated a multi-year effort to develop and demonstrate an airborne net-centric suite of multi-intelligence (multi-INT) sensors and exploitation systems for real-time target detection and targeting product dissemination. The goal of this Net-centric Multi-Intelligence Fusion Targeting Initiative (NCMIFTI) is to develop an airborne real-time intelligence gathering and targeting system that can be used to detect concealed, camouflaged, and mobile targets. The multi-INT sensor suite will include high-resolution visible/infrared (EO/IR) dual-band cameras, hyperspectral imaging (HSI) sensors in the visible-to-near infrared, short-wave and long-wave infrared (VNIR/SWIR/LWIR) bands, Synthetic Aperture Radar (SAR), electronics intelligence sensors (ELINT), and off-board networked sensors. Other sensors are also being considered for inclusion in the suite to address unique target detection needs. Integrating a suite of multi-INT sensors on a single platform should optimize real-time fusion of the on-board sensor streams, thereby improving the detection probability and reducing the false alarms that occur in reconnaissance systems that use single-sensor types on separate platforms, or that use independent target detection algorithms on multiple sensors. In addition to the integration and fusion of the multi-INT sensors, the effort is establishing an open-systems net-centric architecture that will provide a modular "plug and play" capability for additional sensors and system components and provide distributed connectivity to multiple sites for remote system control and exploitation.

  18. A structural topological optimization method for multi-displacement constraints and any initial topology configuration

    NASA Astrophysics Data System (ADS)

    Rong, J. H.; Yi, J. H.

    2010-10-01

    In density-based topological design, one expects that the final result consists of elements either black (solid material) or white (void), without any grey areas. Moreover, one also expects that the optimal topology can be obtained by starting from any initial topology configuration. An improved structural topological optimization method for multi-displacement constraints is proposed in this paper. In the proposed method, the whole optimization process is divided into two optimization adjustment phases and a phase transferring step. Firstly, an optimization model is built to deal with the varied displacement limits, design space adjustments, and reasonable relations between the element stiffness matrix and mass and its element topology variable. Secondly, a procedure is proposed to solve the optimization problem formulated in the first optimization adjustment phase, by starting with a small design space and advancing to a larger design space. The design space adjustments are automatic when the design domain needs expansion, and the convergence of the proposed method is not affected. The final topology obtained by the proposed procedure in the first optimization phase can approach the vicinity of the optimum topology. Then, a heuristic algorithm is given to improve the efficiency and make the designed structural topology black/white in both the phase transferring step and the second optimization adjustment phase. The optimum topology can finally be obtained by the second-phase optimization adjustments. Two examples are presented to show that the topologies obtained by the proposed method have a very good 0/1 design distribution property, and that the computational efficiency is enhanced by reducing the number of elements in the structural finite element model during the two optimization adjustment phases. The examples also show that this method is robust and practicable.

  19. Design of a modular digital computer system

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A design tradeoff study is reported for a modular spaceborne computer system that is responsive to many mission types and phases. The computer uses redundancy to maximize reliability, and multiprocessing to maximize processing capacity. Fault detection and recovery features provide optimal reliability.

  20. An overview of instrumentation for the Large Binocular Telescope

    NASA Astrophysics Data System (ADS)

    Wagner, R. Mark

    2004-09-01

    An overview of instrumentation for the Large Binocular Telescope is presented. Optical instrumentation includes the Large Binocular Camera (LBC), a pair of wide-field (27' x 27') UB/VRI optimized mosaic CCD imagers at the prime focus, and the Multi-Object Double Spectrograph (MODS), a pair of dual-beam blue-red optimized long-slit spectrographs mounted at the straight-through F/15 Gregorian focus incorporating multiple slit masks for multi-object spectroscopy over a 6' field and spectral resolutions of up to 8000. Infrared instrumentation includes the LBT Near-IR Spectroscopic Utility with Camera and Integral Field Unit for Extragalactic Research (LUCIFER), a modular near-infrared (0.9-2.5 μm) imager and spectrograph pair mounted at a bent interior focal station and designed for seeing-limited (FOV: 4' x 4') imaging, long-slit spectroscopy, and multi-object spectroscopy utilizing cooled slit masks and diffraction-limited (FOV: 0'.5 x 0'.5) imaging and long-slit spectroscopy. Strategic instruments under development for the remaining two combined focal stations include an interferometric cryogenic beam combiner with near-infrared and thermal-infrared instruments for Fizeau imaging and nulling interferometry (LBTI) and an optical bench beam combiner with visible and near-infrared imagers utilizing multi-conjugate adaptive optics for high angular resolution and sensitivity (LINC/NIRVANA). In addition, a fiber-fed bench spectrograph (PEPSI) capable of ultra-high-resolution spectroscopy and spectropolarimetry (R = 40,000-300,000) will be available as a principal investigator instrument. The availability of all these instruments mounted simultaneously on the LBT permits unique science, flexible scheduling, and improved operational support.

  1. Wind Farm Turbine Type and Placement Optimization

    NASA Astrophysics Data System (ADS)

    Graf, Peter; Dykes, Katherine; Scott, George; Fields, Jason; Lunacek, Monte; Quick, Julian; Rethore, Pierre-Elouan

    2016-09-01

    The layout of turbines in a wind farm is already a challenging nonlinear, nonconvex, nonlinearly constrained continuous global optimization problem. Here we begin to address the next generation of wind farm optimization problems by adding the complexity that there is more than one turbine type to choose from. The optimization becomes a nonlinear constrained mixed integer problem, which is a very difficult class of problems to solve. This document briefly summarizes the algorithm and code we have developed, the code validation steps we have performed, and the initial results for multi-turbine type and placement optimization (TTP_OPT) we have run.

  2. Wind farm turbine type and placement optimization

    DOE PAGES

    Graf, Peter; Dykes, Katherine; Scott, George; ...

    2016-10-03

    The layout of turbines in a wind farm is already a challenging nonlinear, nonconvex, nonlinearly constrained continuous global optimization problem. Here we begin to address the next generation of wind farm optimization problems by adding the complexity that there is more than one turbine type to choose from. The optimization becomes a nonlinear constrained mixed integer problem, which is a very difficult class of problems to solve. Furthermore, this document briefly summarizes the algorithm and code we have developed, the code validation steps we have performed, and the initial results for multi-turbine type and placement optimization (TTP_OPT) we have run.

  3. Method of multi-dimensional moment analysis for the characterization of signal peaks

    DOEpatents

    Pfeifer, Kent B; Yelton, William G; Kerr, Dayle R; Bouchier, Francis A

    2012-10-23

    A method of multi-dimensional moment analysis for the characterization of signal peaks can be used to optimize the operation of an analytical system. With a two-dimensional Peclet analysis, the quality and signal fidelity of peaks in a two-dimensional experimental space can be analyzed and scored. This method is particularly useful in determining optimum operational parameters for an analytical system which requires the automated analysis of large numbers of analyte data peaks. For example, the method can be used to optimize analytical systems including an ion mobility spectrometer that uses a temperature stepped desorption technique for the detection of explosive mixtures.
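
    The following is a minimal one-dimensional illustration of moment-based peak characterization; the patented method performs a two-dimensional Peclet analysis, which is not reproduced here. The Gaussian peak and the centroid-squared-over-variance score are assumptions chosen only to show the moment algebra.

```python
# Minimal 1-D sketch of moment-based signal-peak characterization on a
# synthetic Gaussian peak (illustrative only, not the patented 2-D analysis).
import numpy as np

t = np.linspace(0.0, 10.0, 1001)                 # e.g. drift or desorption time axis
signal = np.exp(-0.5 * ((t - 4.0) / 0.3) ** 2)   # synthetic peak, unit height

m0 = np.trapz(signal, t)                         # zeroth moment: peak area
m1 = np.trapz(t * signal, t) / m0                # first moment: centroid
mu2 = np.trapz((t - m1) ** 2 * signal, t) / m0   # second central moment: variance

score = m1 ** 2 / mu2                            # dimensionless sharpness (Peclet-like) score
print(f"area={m0:.3f}  centroid={m1:.3f}  variance={mu2:.4f}  score={score:.1f}")
```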

  4. Multi-Time Step Service Restoration for Advanced Distribution Systems and Microgrids

    DOE PAGES

    Chen, Bo; Chen, Chen; Wang, Jianhui; ...

    2017-07-07

    Modern power systems are facing increased risk of disasters that can cause extended outages. The presence of remote control switches (RCSs), distributed generators (DGs), and energy storage systems (ESS) provides both challenges and opportunities for developing post-fault service restoration methodologies. Inter-temporal constraints of DGs, ESS, and loads under cold load pickup (CLPU) conditions impose extra complexity on problem formulation and solution. In this paper, a multi-time step service restoration methodology is proposed to optimally generate a sequence of control actions for controllable switches, ESSs, and dispatchable DGs to assist the system operator with decision making. The restoration sequence is determined to minimize the unserved customers by energizing the system step by step without violating operational constraints at each time step. The proposed methodology is formulated as a mixed-integer linear programming (MILP) model and can adapt to various operation conditions. Furthermore, the proposed method is validated through several case studies that are performed on modified IEEE 13-node and IEEE 123-node test feeders.
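
    A heavily simplified sketch of a multi-time-step restoration problem is given below, assuming the PuLP modelling library; it keeps only a per-step capacity limit and a no-de-energizing rule, whereas the paper's MILP includes network, DG, ESS, and cold-load-pickup constraints. Load sizes and capacities are invented.

```python
# Toy multi-time-step restoration MILP (a heavily simplified sketch, not the
# paper's model). Requires PuLP (pip install pulp); numbers are illustrative.
import pulp

loads = {"L1": 40, "L2": 30, "L3": 25, "L4": 20}     # kW demand per load (made up)
T = [0, 1, 2]                                         # three restoration time steps
capacity = {0: 50, 1: 80, 2: 120}                     # available DG/ESS power per step (made up)

# x[l, t] = 1 if load l is energized at step t.
x = {(l, t): pulp.LpVariable(f"on_{l}_{t}", cat="Binary") for l in loads for t in T}

prob = pulp.LpProblem("toy_restoration", pulp.LpMaximize)
prob += pulp.lpSum(loads[l] * x[l, t] for l in loads for t in T)          # serve as much as possible
for t in T:
    prob += pulp.lpSum(loads[l] * x[l, t] for l in loads) <= capacity[t]  # per-step capacity limit
for l in loads:
    for t in T[:-1]:
        prob += x[l, t] <= x[l, t + 1]                # once restored, a load stays energized

prob.solve()
for t in T:
    print(f"step {t}:", sorted(l for l in loads if x[l, t].value() == 1))
```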

  5. Multi-Time Step Service Restoration for Advanced Distribution Systems and Microgrids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Bo; Chen, Chen; Wang, Jianhui

    Modern power systems are facing increased risk of disasters that can cause extended outages. The presence of remote control switches (RCSs), distributed generators (DGs), and energy storage systems (ESS) provides both challenges and opportunities for developing post-fault service restoration methodologies. Inter-temporal constraints of DGs, ESS, and loads under cold load pickup (CLPU) conditions impose extra complexity on problem formulation and solution. In this paper, a multi-time step service restoration methodology is proposed to optimally generate a sequence of control actions for controllable switches, ESSs, and dispatchable DGs to assist the system operator with decision making. The restoration sequence is determined to minimize the unserved customers by energizing the system step by step without violating operational constraints at each time step. The proposed methodology is formulated as a mixed-integer linear programming (MILP) model and can adapt to various operation conditions. Furthermore, the proposed method is validated through several case studies that are performed on modified IEEE 13-node and IEEE 123-node test feeders.

  6. Integration of multi-interface conversion channel using FPGA for modular photonic network

    NASA Astrophysics Data System (ADS)

    Janicki, Tomasz; Pozniak, Krzysztof T.; Romaniuk, Ryszard S.

    2010-09-01

    The article discusses the integration of different types of interfaces with FPGA circuits using a reconfigurable communication platform. The solution has been implemented in practice in a single node of a distributed measurement system. The construction of the communication platform is presented together with selected hardware modules, described in VHDL and implemented in FPGA circuits. The graphical user interface (GUI) that allows a user to control the operation of the system is also described. In the final part of the article, selected practical solutions are introduced. The whole measurement system resides on a multi-gigabit optical network, whose construction is highly modular, reconfigurable and scalable.

  7. Development of Multi-Legged Walking Robot Using Reconfigurable Modular Design and Biomimetic Control Architecture

    NASA Astrophysics Data System (ADS)

    Chen, Xuedong; Sun, Yi; Huang, Qingjiu; Jia, Wenchuan; Pu, Huayan

    This paper focuses on the design of a modular multi-legged walking robot, MiniQuad-I, which can be reconfigured into a variety of configurations, including quadruped and hexapod configurations for different tasks, by changing the layout of modules. Critical design considerations when taking adaptability, maintainability and extensibility into account simultaneously are discussed, and detailed designs of each module are then presented. The biomimetic control architecture of MiniQuad-I is proposed, which can improve the agility and independence of the robot. Simulations and experiments on crawling, object picking and obstacle avoiding are performed to verify the functions of the MiniQuad-I.

  8. PRIMA Platform capability for satellite missions in LEO and MEO (SAR, Optical, GNSS, TLC, etc.)

    NASA Astrophysics Data System (ADS)

    Logue, T.; L'Abbate, M.

    2016-12-01

    PRIMA (Piattaforma Riconfigurabile Italiana Multi Applicativa) is a multi-mission 3-axis stabilized platform developed by Thales Alenia Space Italia under ASI contract. PRIMA is designed to operate for a wide variety of applications from LEO and MEO up to GEO and for different classes of satellites within the platform family. It has extensive flight heritage (LEO and MEO satellites already fully operational), in which it has successfully demonstrated flexibility of use, low management costs and the ability to adapt to changing operational conditions. The flexibility and modularity of PRIMA provide a unique capability to satisfy different payload designs and mission requirements, thanks to the utilization of recurrent adaptable modules (Service Module-SVM, Propulsion Module-PPM, Payload Module-PLM) to obtain mission-dependent configurations. PRIMA product line development is continuously progressing and is based on state-of-the-art technology, modular architecture and integrated avionics. The aim is to maintain and extend multi-mission capabilities to operate in different environments (LEO to GEO) with different payloads (SAR, Optical, GNSS, TLC, etc.). The design is compatible with a wide range of European and US equipment suppliers, thus maximising cooperation opportunities. Evolution activities are mainly focused on the following areas: Structure: to enable spacecraft configurations for multiple launch; Thermal Control: to guarantee thermal limits for new missions, more demanding in terms of environment and payload; Electrical: to cope with higher power demand (e.g. electrical propulsion, wide range of payloads, etc.) considering the orbital environment (e.g. lighting conditions); Avionics: AOCS solutions optimized for the mission (LEO observation driven by agility and pointing, agility not a driver for GEO), use of sensors and actuators tailored for the specific mission and related environments, and optimised propulsion control; Data Handling, SW and FDIR: mission customization, ensuring robust storage and downlink capability, long-lasting autonomy and flexible operations in all mission phases, nominal and non-nominal conditions. Starting from PRIMA flight achievements, this paper outlines the PRIMA family's multi-purpose features addressed to meet multi-mission requirements.

  9. The benefits of adaptive parametrization in multi-objective Tabu Search optimization

    NASA Astrophysics Data System (ADS)

    Ghisu, Tiziano; Parks, Geoffrey T.; Jaeggi, Daniel M.; Jarrett, Jerome P.; Clarkson, P. John

    2010-10-01

    In real-world optimization problems, large design spaces and conflicting objectives are often combined with a large number of constraints, resulting in a highly multi-modal, challenging, fragmented landscape. The local search at the heart of Tabu Search, while being one of its strengths in highly constrained optimization problems, requires a large number of evaluations per optimization step. In this work, a modification of the pattern search algorithm is proposed: this modification, based on a Principal Components' Analysis of the approximation set, allows both a re-alignment of the search directions, thereby creating a more effective parametrization, and also an informed reduction of the size of the design space itself. These changes make the optimization process more computationally efficient and more effective - higher quality solutions are identified in fewer iterations. These advantages are demonstrated on a number of standard analytical test functions (from the ZDT and DTLZ families) and on a real-world problem (the optimization of an axial compressor preliminary design).
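
    The core re-parametrization idea can be sketched as follows, assuming synthetic design points: principal axes of the current approximation set are extracted with an SVD and reused as pattern-search directions, with low-variance directions dropped to shrink the design space. This is an illustration of the concept, not the authors' Tabu Search implementation.

```python
# Minimal sketch of PCA-based re-alignment of pattern-search directions
# (illustrative only; the sample "approximation set" below is synthetic).
import numpy as np

rng = np.random.default_rng(1)
# Synthetic approximation set: 30 designs in a 5-D design space, deliberately correlated.
base = rng.normal(size=(30, 2))
designs = base @ rng.normal(size=(2, 5)) + 0.01 * rng.normal(size=(30, 5))

centered = designs - designs.mean(axis=0)
_, singular_values, axes = np.linalg.svd(centered, full_matrices=False)

explained = singular_values**2 / (singular_values**2).sum()
keep = explained > 0.05                      # informed reduction of the design space
search_directions = axes[keep]               # rows are the new, re-aligned directions

print("variance explained:", explained.round(3))
print("kept", int(keep.sum()), "of", len(explained), "directions")

# One pattern-search move set along the new directions, from the current best design:
current = designs[0]
step = 0.1
candidates = [current + s * d for d in search_directions for s in (+step, -step)]
```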

  10. Features of Modularly Assembled Compounds That Impart Bioactivity Against an RNA Target

    PubMed Central

    Rzuczek, Suzanne G.; Gao, Yu; Tang, Zhen-Zhi; Thornton, Charles A.; Kodadek, Thomas; Disney, Matthew D.

    2013-01-01

    Transcriptomes provide a myriad of potential RNAs that could be the targets of therapeutics or chemical genetic probes of function. Cell permeable small molecules, however, generally do not exploit these targets, owing to the difficulty in the design of high affinity, specific small molecules targeting RNA. As part of a general program to study RNA function using small molecules, we designed bioactive, modularly assembled small molecules that target the non-coding expanded RNA repeat that causes myotonic dystrophy type 1 (DM1), r(CUG)exp. Herein, we present a rigorous study to elucidate features in modularly assembled compounds that afford bioactivity. Different modular assembly scaffolds were investigated including polyamines, α-peptides, β-peptides, and peptide tertiary amides (PTAs). Based on activity as assessed by improvement of DM1-associated defects, stability against proteases, cellular permeability, and toxicity, we discovered that constrained backbones, namely PTAs, are optimal. Notably, we determined that r(CUG)exp is the target of the optimal PTA in cellular models and that the optimal PTA improves DM1-associated defects in a mouse model. Biophysical analyses were employed to investigate potential sources of bioactivity. These investigations show that modularly assembled compounds have increased residence times on their targets and faster on rates than the RNA-binding modules from which they were derived and faster on rates than the protein that binds r(CUG)exp, the inactivation of which gives rise to DM1-associated defects. These studies provide information about features of small molecules that are programmable for targeting RNA, allowing for the facile optimization of therapeutics or chemical probes against other cellular RNA targets. PMID:24032410

  11. Features of modularly assembled compounds that impart bioactivity against an RNA target.

    PubMed

    Rzuczek, Suzanne G; Gao, Yu; Tang, Zhen-Zhi; Thornton, Charles A; Kodadek, Thomas; Disney, Matthew D

    2013-10-18

    Transcriptomes provide a myriad of potential RNAs that could be the targets of therapeutics or chemical genetic probes of function. Cell-permeable small molecules, however, generally do not exploit these targets, owing to the difficulty in the design of high affinity, specific small molecules targeting RNA. As part of a general program to study RNA function using small molecules, we designed bioactive, modularly assembled small molecules that target the noncoding expanded RNA repeat that causes myotonic dystrophy type 1 (DM1), r(CUG)(exp). Herein, we present a rigorous study to elucidate features in modularly assembled compounds that afford bioactivity. Different modular assembly scaffolds were investigated, including polyamines, α-peptides, β-peptides, and peptide tertiary amides (PTAs). On the basis of activity as assessed by improvement of DM1-associated defects, stability against proteases, cellular permeability, and toxicity, we discovered that constrained backbones, namely, PTAs, are optimal. Notably, we determined that r(CUG)(exp) is the target of the optimal PTA in cellular models and that the optimal PTA improves DM1-associated defects in a mouse model. Biophysical analyses were employed to investigate potential sources of bioactivity. These investigations show that modularly assembled compounds have increased residence times on their targets and faster on rates than the RNA-binding modules from which they were derived. Moreover, they have faster on rates than the protein that binds r(CUG)(exp), the inactivation of which gives rise to DM1-associated defects. These studies provide information about features of small molecules that are programmable for targeting RNA, allowing for the facile optimization of therapeutics or chemical probes against other cellular RNA targets.

  12. Optimization of automated large-scale production of [(18)F]fluoroethylcholine for PET prostate cancer imaging.

    PubMed

    Pascali, Giancarlo; D'Antonio, Luca; Bovone, Paola; Gerundini, Paolo; August, Thorsten

    2009-07-01

    PET tumor imaging is gaining importance in current clinical practice. FDG-PET is the most utilized approach but suffers from inflammation influences and is not utilizable in prostate cancer detection. Recently, (11)C-choline analogues have been employed successfully in this field of imaging, leading to a growing interest in the utilization of (18)F-labeled analogues: [(18)F]fluoroethylcholine (FEC) has been demonstrated to be promising, especially in prostate cancer imaging. In this work we report an automatic radiosynthesis of this tracer with high yields, short synthesis time and ease of performance, potentially utilizable in routine production sites. We used a Modular Lab system to automatically perform the two-step/one-pot synthesis. In the first step, we labeled ethyleneglycolditosylate obtaining [(18)F]fluoroethyltosylate; in the second step, we performed the coupling of the latter intermediate with neat dimethylethanolamine. The final mixture was purified by means of solid phase extraction; in particular, the product was trapped on a cation-exchange resin and eluted with isotonic saline. The optimized procedure resulted in a non-decay-corrected yield of 36% and produced 30-45 GBq of product already in injectable form. The product was analyzed for quality control and was found to be pure and sterile; in addition, residual solvents were under the required threshold. In this work, we present an automatic FEC radiosynthesis that has been optimized for routine production. These findings should foster interest in a wider utilization of this radiomolecule for imaging of prostate cancer with PET, a field for which no gold-standard tracer has yet been validated.

  13. An Automated Pipeline for Engineering Many-Enzyme Pathways: Computational Sequence Design, Pathway Expression-Flux Mapping, and Scalable Pathway Optimization.

    PubMed

    Halper, Sean M; Cetnar, Daniel P; Salis, Howard M

    2018-01-01

    Engineering many-enzyme metabolic pathways suffers from the design curse of dimensionality. There are an astronomical number of synonymous DNA sequence choices, though relatively few will express an evolutionary robust, maximally productive pathway without metabolic bottlenecks. To solve this challenge, we have developed an integrated, automated computational-experimental pipeline that identifies a pathway's optimal DNA sequence without high-throughput screening or many cycles of design-build-test. The first step applies our Operon Calculator algorithm to design a host-specific evolutionary robust bacterial operon sequence with maximally tunable enzyme expression levels. The second step applies our RBS Library Calculator algorithm to systematically vary enzyme expression levels with the smallest-sized library. After characterizing a small number of constructed pathway variants, measurements are supplied to our Pathway Map Calculator algorithm, which then parameterizes a kinetic metabolic model that ultimately predicts the pathway's optimal enzyme expression levels and DNA sequences. Altogether, our algorithms provide the ability to efficiently map the pathway's sequence-expression-activity space and predict DNA sequences with desired metabolic fluxes. Here, we provide a step-by-step guide to applying the Pathway Optimization Pipeline on a desired multi-enzyme pathway in a bacterial host.

  14. Schrodinger's catapult II: entanglement between stationary and flying fields

    NASA Astrophysics Data System (ADS)

    Pfaff, W.; Axline, C.; Burkhart, L.; Vool, U.; Reinhold, P.; Frunzio, L.; Jiang, L.; Devoret, M.; Schoelkopf, R.

    Entanglement between nodes is an elementary resource in a quantum network. An important step towards its realization is entanglement between stationary and flying states. Here we experimentally demonstrate entanglement generation between a long-lived cavity memory and traveling mode in circuit QED. A large on/off ratio and fast control over a parametric mixing process allow us to realize conversion with tunable magnitude and duration between standing and flying mode. In the case of half-conversion, we observe correlations between the standing and flying state that confirm the generation of entangled states. We show this for both single-photon and multi-photon states, paving the way for error-correctable remote entanglement. Our system could serve as an essential component in a modular architecture for error-protected quantum information processing.

  15. Comparing Multi-Step IMAC and Multi-Step TiO2 Methods for Phosphopeptide Enrichment

    PubMed Central

    Yue, Xiaoshan; Schunter, Alissa; Hummon, Amanda B.

    2016-01-01

    Phosphopeptide enrichment from complicated peptide mixtures is an essential step for mass spectrometry-based phosphoproteomic studies to reduce sample complexity and ionization suppression effects. Typical methods for enriching phosphopeptides include immobilized metal affinity chromatography (IMAC) or titanium dioxide (TiO2) beads, which have selective affinity and can interact with phosphopeptides. In this study, the IMAC enrichment method was compared with the TiO2 enrichment method, using a multi-step enrichment strategy from whole cell lysate, to evaluate their abilities to enrich for different types of phosphopeptides. The peptide-to-beads ratios were optimized for both IMAC and TiO2 beads. Both IMAC and TiO2 enrichments were performed for three rounds to enable the maximum extraction of phosphopeptides from the whole cell lysates. The phosphopeptides that are unique to IMAC enrichment, unique to TiO2 enrichment, and identified with both IMAC and TiO2 enrichment were analyzed for their characteristics. Both IMAC and TiO2 enriched similar amounts of phosphopeptides with comparable enrichment efficiency. However, phosphopeptides that are unique to IMAC enrichment showed a higher percentage of multi-phosphopeptides, as well as a higher percentage of longer, basic, and hydrophilic phosphopeptides. Also, the IMAC and TiO2 procedures clearly enriched phosphopeptides with different motifs. Finally, further enriching with two rounds of TiO2 from the supernatant after IMAC enrichment, or further enriching with two rounds of IMAC from the supernatant TiO2 enrichment does not fully recover the phosphopeptides that are not identified with the corresponding multi-step enrichment. PMID:26237447

  16. Multi-contrast MRI registration of carotid arteries based on cross-sectional images and lumen boundaries

    NASA Astrophysics Data System (ADS)

    Wu, Yu-Xia; Zhang, Xi; Xu, Xiao-Pan; Liu, Yang; Zhang, Guo-Peng; Li, Bao-Juan; Chen, Hui-Jun; Lu, Hong-Bing

    2017-02-01

    Ischemic stroke is strongly correlated with carotid atherosclerosis and is mostly caused by vulnerable plaques. It is particularly important to analyze the components of plaques for the detection of vulnerable plaques. Recently, plaque analysis based on multi-contrast magnetic resonance imaging has attracted great attention. Though multi-contrast MR imaging has potential for enhanced demonstration of the carotid wall, its performance is hampered by the misalignment of different imaging sequences. In this study, a coarse-to-fine registration strategy based on cross-sectional images and wall boundaries is proposed to solve the problem. It includes two steps: a rigid step using the iterative closest points to register the centerlines of the carotid artery extracted from multi-contrast MR images, and a non-rigid step using the thin plate spline to register the lumen boundaries of the carotid artery. In the rigid step, the centerline is extracted by tracking the cross-sectional images along the vessel direction calculated by the Hessian matrix. In the non-rigid step, a shape context descriptor is introduced to find corresponding points of two similar boundaries. In addition, the deterministic annealing technique is used to find a globally optimized solution. The proposed strategy was evaluated on newly developed three-dimensional, fast and high-resolution multi-contrast black-blood MR imaging. Quantitative validation indicated that after registration, the overlap of the two boundaries from different sequences is 95%, and their mean surface distance is 0.12 mm. In conclusion, the proposed algorithm effectively improves the accuracy of registration for further component analysis of carotid plaques.
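
    The rigid step can be illustrated with a generic SVD-based iterative closest point routine, sketched below on synthetic 3-D centerlines; the non-rigid thin plate spline step and the shape-context matching are not reproduced, and the point sets are invented.

```python
# Minimal rigid ICP sketch for aligning two 3-D centerlines (a generic
# illustration of the rigid step, not the authors' pipeline).
import numpy as np
from scipy.spatial import cKDTree

def icp(source, target, iters=30):
    src = source.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    tree = cKDTree(target)
    for _ in range(iters):
        _, idx = tree.query(src)                 # closest target point for each source point
        matched = target[idx]
        mu_s, mu_t = src.mean(axis=0), matched.mean(axis=0)
        H = (src - mu_s).T @ (matched - mu_t)    # cross-covariance of matched pairs
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:                 # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_t - R @ mu_s
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total, src

# Synthetic centerline and a rotated, translated copy of it.
s = np.linspace(0, 1, 200)
target = np.c_[np.cos(2 * s), np.sin(2 * s), 3 * s]
angle = 0.2
Rz = np.array([[np.cos(angle), -np.sin(angle), 0],
               [np.sin(angle),  np.cos(angle), 0],
               [0, 0, 1]])
source = target @ Rz.T + np.array([0.05, -0.03, 0.10])

R, t, aligned = icp(source, target)
print("mean residual after ICP:", np.linalg.norm(aligned - target, axis=1).mean())
```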

  17. Efficient search, mapping, and optimization of multi-protein genetic systems in diverse bacteria

    PubMed Central

    Farasat, Iman; Kushwaha, Manish; Collens, Jason; Easterbrook, Michael; Guido, Matthew; Salis, Howard M

    2014-01-01

    Developing predictive models of multi-protein genetic systems to understand and optimize their behavior remains a combinatorial challenge, particularly when measurement throughput is limited. We developed a computational approach to build predictive models and identify optimal sequences and expression levels, while circumventing combinatorial explosion. Maximally informative genetic system variants were first designed by the RBS Library Calculator, an algorithm to design sequences for efficiently searching a multi-protein expression space across a > 10,000-fold range with tailored search parameters and well-predicted translation rates. We validated the algorithm's predictions by characterizing 646 genetic system variants, encoded in plasmids and genomes, expressed in six gram-positive and gram-negative bacterial hosts. We then combined the search algorithm with system-level kinetic modeling, requiring the construction and characterization of 73 variants to build a sequence-expression-activity map (SEAMAP) for a biosynthesis pathway. Using model predictions, we designed and characterized 47 additional pathway variants to navigate its activity space, find optimal expression regions with desired activity response curves, and relieve rate-limiting steps in metabolism. Creating sequence-expression-activity maps accelerates the optimization of many protein systems and allows previous measurements to quantitatively inform future designs. PMID:24952589

  18. Overtaking method based on sand-sifter mechanism: Why do optimistic value functions find optimal solutions in multi-armed bandit problems?

    PubMed

    Ochi, Kento; Kamiura, Moto

    2015-09-01

    A multi-armed bandit problem is a search problem in which a learning agent must select the optimal arm among multiple slot machines generating random rewards. The UCB algorithm is one of the most popular methods for solving multi-armed bandit problems. It achieves logarithmic regret performance by coordinating the balance between exploration and exploitation. Since the introduction of UCB algorithms, researchers have known empirically that optimistic value functions exhibit good performance in multi-armed bandit problems. The terms optimistic or optimism might suggest that the value function is sufficiently larger than the sample mean of rewards. The first definition of the UCB algorithm is focused on the optimization of regret and is not directly based on the optimism of a value function. We therefore need to consider why optimism yields good performance in multi-armed bandit problems. In the present article, we propose a new method, called the Overtaking method, to solve multi-armed bandit problems. The value function of the proposed method is defined as an upper bound of a confidence interval with respect to an estimator of the expected value of reward: the value function asymptotically approaches the expected value of reward from the upper bound. If the value function is larger than the expected value under the asymptote, then the learning agent is almost sure to be able to obtain the optimal arm. This structure is called the sand-sifter mechanism, in which the value functions of suboptimal arms do not regrow. It means that the learning agent can play only the current best arm in each time step. Consequently, the proposed method achieves a high accuracy rate and low regret, and some of its value functions can outperform UCB algorithms. This study suggests the advantage of optimism of agents in uncertain environments through one of the simplest frameworks. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
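
    For reference, a minimal UCB1 implementation is sketched below; it shows the optimistic value function (sample mean plus an exploration bonus) discussed above, while the Overtaking method itself is not reproduced. The arm reward probabilities are made up.

```python
# Minimal UCB1 sketch for a Bernoulli multi-armed bandit (generic baseline,
# not the Overtaking method). Reward probabilities are placeholders.
import math
import random

def ucb1(probs, horizon=5000, seed=0):
    rnd = random.Random(seed)
    n_arms = len(probs)
    counts = [0] * n_arms
    means = [0.0] * n_arms
    for t in range(1, horizon + 1):
        if t <= n_arms:                          # play each arm once to initialize
            arm = t - 1
        else:
            # Optimistic value: sample mean plus an exploration bonus.
            arm = max(range(n_arms),
                      key=lambda a: means[a] + math.sqrt(2 * math.log(t) / counts[a]))
        reward = 1.0 if rnd.random() < probs[arm] else 0.0
        counts[arm] += 1
        means[arm] += (reward - means[arm]) / counts[arm]   # incremental sample mean
    return counts, means

counts, means = ucb1([0.2, 0.5, 0.55])
print("pull counts per arm:", counts)            # the best arm should dominate
print("estimated means:    ", [round(m, 3) for m in means])
```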

  19. Development of a drone equipped with optimized sensors for nuclear and radiological risk characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boudergui, K.; Carrel, F.; Domenech, T.

    2011-07-01

    The MOBISIC project, funded by the Systematic Paris-Region cluster, is being developed in the context of local crises (attack bombing in urban environments, in confined spaces such as an underground train tunnel, etc.) or the securing of specific events (soccer world cup, political meeting, etc.). It consists in conceiving, developing and experimenting with a mobile, modular ('plug and play') and multi-sensor securing system. In this project, CEA LIST has suggested different solutions for nuclear risk detection and identification. This results in embedding a CZT sensor and a gamma camera in an indoor drone. This article first presents the different modifications carried out on the UAV and the different sensors, and then focuses on the experimental performances. (authors)

  20. Multivariate statistical analysis software technologies for astrophysical research involving large data bases

    NASA Technical Reports Server (NTRS)

    Djorgovski, Stanislav

    1992-01-01

    The existing and forthcoming data bases from NASA missions contain an abundance of information whose complexity cannot be efficiently tapped with simple statistical techniques. Powerful multivariate statistical methods already exist which can be used to harness much of the richness of these data. Automatic classification techniques have been developed to solve the problem of identifying known types of objects in multi-parameter data sets, in addition to leading to the discovery of new physical phenomena and classes of objects. We propose an exploratory study and integration of promising techniques in the development of a general and modular classification/analysis system for very large data bases, which would enhance and optimize data management and the use of human research resources.

  1. The multi-purpose three-axis spectrometer (TAS) MIRA at FRM II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Georgii, Robert; Weber, Tobias; Brandl, Georg

    The cold-neutron three-axis spectrometer MIRA is an instrument optimized for low-energy excitations. Its excellent intrinsic Q-resolution makes it ideal for studying incommensurate magnetic systems (elastic and inelastic). MIRA is at the forefront of using advanced neutron focusing optics such as elliptic guides, which enable the investigation of small samples under extreme conditions. Another advantage of MIRA is the modular assembly allowing for instrumental adaptation to the needs of the experiment within a few hours. The development of new methods such as the spin-echo technique MIEZE is another important application at MIRA. Finally, scientific topics include the investigation of complex inter-metallic alloys and spectroscopy on incommensurate magnetic structures.

  2. Evaluation of the Painful Dual Taper Modular Neck Stem Total Hip Arthroplasty: Do They All Require Revision?

    PubMed

    Kwon, Young-Min

    2016-07-01

    Although dual taper modular-neck total hip arthroplasty (THA) design with additional neck-stem modularity has the potential to optimize hip biomechanical parameters by facilitating adjustments of leg length, femoral neck version and offset, there is increasing concern regarding this stem design as a result of the growing numbers of adverse local tissue reactions due to fretting and corrosion at the neck-stem taper junction. Implant factors such as taper cone angle, taper surface roughness, taper contact area, modular neck taper metallurgy, and femoral head size play important roles in influencing extent of taper corrosion. There should be a low threshold to conduct a systematic clinical evaluation of patients with dual-taper modular-neck stem THA using systematic risk stratification algorithms as early recognition and diagnosis will ensure prompt and appropriate treatment. Although specialized tests such as metal ion analysis and cross-sectional imaging modalities such as metal artifact reduction sequence magnetic resonance imaging (MARS MRI) are useful in optimizing clinical decision-making, overreliance on any single investigative tool in the clinical decision-making process for revision surgery should be avoided. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Analysis of the structure of complex networks at different resolution levels

    NASA Astrophysics Data System (ADS)

    Arenas, A.; Fernández, A.; Gómez, S.

    2008-05-01

    Modular structure is ubiquitous in real-world complex networks, and its detection is important because it gives insights into the structure-functionality relationship. The standard approach is based on the optimization of a quality function, modularity, which is a relative quality measure for the partition of a network into modules. Recently, some authors (Fortunato and Barthélemy 2007 Proc. Natl Acad. Sci. USA 104 36 and Kumpula et al 2007 Eur. Phys. J. B 56 41) have pointed out that the optimization of modularity has a fundamental drawback: the existence of a resolution limit beyond which no modular structure can be detected even though these modules might have their own entity. The reason is that several topological descriptions of the network coexist at different scales, which is, in general, a fingerprint of complex systems. Here, we propose a method that allows for multiple resolution screening of the modular structure. The method has been validated using synthetic networks, discovering the predefined structures at all scales. Its application to two real social networks allows us to find the exact splits reported in the literature, as well as the substructure beyond the actual split.
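
    A minimal sketch of screening modularity at several resolutions is given below, using the common resolution-parameter form Q(gamma) = (1/2m) * sum_ij [A_ij - gamma * k_i * k_j / (2m)] * delta(c_i, c_j); this is a generic illustration, not the authors' multiple-resolution method, and the toy graph and partition are invented.

```python
# Minimal sketch: modularity of a fixed partition evaluated at several
# resolution parameters (generic illustration on a synthetic graph).
import numpy as np

# Two triangles joined by a single edge, and their "natural" two-community partition.
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
partition = np.array([0, 0, 0, 1, 1, 1])

def modularity(A, labels, gamma=1.0):
    k = A.sum(axis=1)                            # degrees
    two_m = A.sum()                              # 2m for an undirected graph
    same = labels[:, None] == labels[None, :]    # delta(c_i, c_j)
    return ((A - gamma * np.outer(k, k) / two_m) * same).sum() / two_m

for gamma in (0.5, 1.0, 2.0, 4.0):
    print(f"gamma={gamma:>3}: Q={modularity(A, partition, gamma):+.3f}")
```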

  4. Technology assessment of automation trends in the modular home industry

    Treesearch

    Phil Mitchell; Robert Russell Hurst

    2009-01-01

    This report provides an assessment of technology used in manufacturing modular homes in the United States, and that used in the German prefabricated wooden home industry. It is the first step toward identifying the research needs in automation and manufacturing methods that will facilitate mass customization in the home manufacturing industry. Within the United States...

  5. Optimal Runge-Kutta Schemes for High-order Spatial and Temporal Discretizations

    DTIC Science & Technology

    2015-06-01

    ...using larger time steps versus lower-order time integration with smaller time steps. In the present work, an attempt is made to generalize these... generality and because of interest in multi-speed and high Reynolds number, wall-bounded flow regimes, a dual-time framework is adopted in the present work... errors of general combinations of high-order spatial and temporal discretizations. Different Runge-Kutta time integrators are applied to central...

  6. Optimizing Aspect-Oriented Mechanisms for Embedded Applications

    NASA Astrophysics Data System (ADS)

    Hundt, Christine; Stöhr, Daniel; Glesner, Sabine

    As applications for small embedded mobile devices are getting larger and more complex, it becomes inevitable to adopt more advanced software engineering methods from the field of desktop application development. Aspect-oriented programming (AOP) is a promising approach due to its advanced modularization capabilities. However, existing AOP languages tend to add a substantial overhead in both execution time and code size which restricts their practicality for small devices with limited resources. In this paper, we present optimizations for aspect-oriented mechanisms at the level of the virtual machine. Our experiments show that these optimizations yield a considerable performance gain along with a reduction of the code size. Thus, our optimizations establish the base for using advanced aspect-oriented modularization techniques for developing Java applications on small embedded devices.

  7. A modular theory of multisensory integration for motor control

    PubMed Central

    Tagliabue, Michele; McIntyre, Joseph

    2014-01-01

    To control targeted movements, such as reaching to grasp an object or hammering a nail, the brain can use diverse sources of sensory information, such as vision and proprioception. Although a variety of studies have shown that sensory signals are optimally combined according to principles of maximum likelihood, increasing evidence indicates that the CNS does not compute a single, optimal estimation of the target's position to be compared with a single optimal estimation of the hand. Rather, it employs a more modular approach in which the overall behavior is built by computing multiple concurrent comparisons carried out simultaneously in a number of different reference frames. The results of these individual comparisons are then optimally combined in order to drive the hand. In this article we examine at a computational level two formulations of concurrent models for sensory integration and compare them to the more conventional model of converging multi-sensory signals. Through a review of published studies, both our own and those performed by others, we produce evidence favoring the concurrent formulations. We then examine in detail the effects of additive signal noise as information flows through the sensorimotor system. By taking into account the noise added by sensorimotor transformations, one can explain why the CNS may shift its reliance on one sensory modality toward a greater reliance on another and investigate under what conditions those sensory transformations occur. Careful consideration of how transformed signals will co-vary with the original source also provides insight into how the CNS chooses one sensory modality over another. These concepts can be used to explain why the CNS might, for instance, create a visual representation of a task that is otherwise limited to the kinesthetic domain (e.g., pointing with one hand to a finger on the other) and why the CNS might choose to recode sensory information in an external reference frame. PMID:24550816
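
    The conventional single-comparison model mentioned above reduces, for two cues, to inverse-variance weighting; a minimal sketch is given below with made-up visual and proprioceptive numbers. The concurrent, modular formulations favored in the article are not reproduced.

```python
# Minimal sketch of maximum-likelihood cue combination: the fused estimate
# weights each cue by its inverse variance. Numbers are illustrative only.
import numpy as np

def fuse(estimates, variances):
    inv_var = 1.0 / np.asarray(variances, dtype=float)
    weights = inv_var / inv_var.sum()
    fused = float(np.dot(weights, estimates))
    fused_var = 1.0 / inv_var.sum()              # fused estimate is more reliable than either cue
    return fused, fused_var

# Example: a visual and a proprioceptive estimate of hand position (made-up numbers).
vision, proprioception = 10.2, 9.5               # cm
var_vision, var_prop = 0.4, 1.6                  # cm^2 (vision more reliable here)
est, var = fuse([vision, proprioception], [var_vision, var_prop])
print(f"fused estimate = {est:.2f} cm, fused variance = {var:.2f} cm^2")
```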

  8. Clustering algorithm for determining community structure in large networks

    NASA Astrophysics Data System (ADS)

    Pujol, Josep M.; Béjar, Javier; Delgado, Jordi

    2006-07-01

    We propose an algorithm to find the community structure in complex networks based on the combination of spectral analysis and modularity optimization. The clustering produced by our algorithm is as accurate as the best algorithms in the literature on modularity optimization; however, the main asset of the algorithm is its efficiency. The best match for our algorithm is Newman’s fast algorithm, which is the reference algorithm for clustering in large networks due to its efficiency. When both algorithms are compared, our algorithm outperforms the fast algorithm both in efficiency and accuracy of the clustering, in terms of modularity. Thus, the results suggest that the proposed algorithm is a good choice to analyze the community structure of medium and large networks in the range of tens and hundreds of thousands of vertices.

  9. Multi-objective Optimization of a Solar Humidification Dehumidification Desalination Unit

    NASA Astrophysics Data System (ADS)

    Rafigh, M.; Mirzaeian, M.; Najafi, B.; Rinaldi, F.; Marchesi, R.

    2017-11-01

    In the present paper, a humidification-dehumidification desalination unit integrated with a solar system is considered. In the first step, a mathematical model of the whole plant is presented. Next, taking into account the logical constraints, the performance of the system is optimized. On one hand, higher energetic efficiency is desired; on the other hand, higher efficiency increases the required area of each subsystem and consequently the total cost of the plant. In the present work, the optimum solution is achieved when the specific energy of the solar heater and the areas of the humidifier and dehumidifier are minimized. Because the considered objective functions are in conflict, conventional optimization methods are not applicable. Hence, multi-objective optimization using a genetic algorithm, an efficient tool for dealing with conflicting objectives, has been utilized, and a set of optimal solutions called the Pareto front, each of which is a trade-off between the mentioned objectives, is generated.
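
    The following is a minimal Python illustration of the Pareto-front idea described above, assuming a random-search sampler in place of the genetic algorithm and simple placeholder objective functions standing in for the plant model; all variable names and formulas are invented for illustration.

        import numpy as np

        def pareto_front(points):
            """Return the non-dominated rows of `points`, all objectives to be minimized."""
            pts = np.asarray(points, dtype=float)
            keep = np.ones(len(pts), dtype=bool)
            for i, p in enumerate(pts):
                dominators = np.all(pts <= p, axis=1) & np.any(pts < p, axis=1)
                keep[i] = not dominators.any()
            return pts[keep]

        # Placeholder design variables and objectives (solar-heater specific energy vs.
        # total exchanger area); a genetic algorithm would replace this random sampler.
        rng = np.random.default_rng(0)
        x = rng.uniform([0.5, 60.0], [2.0, 90.0], size=(500, 2))
        energy = 1.0 / x[:, 0] + 0.01 * x[:, 1]
        area = 0.5 * x[:, 0] + 50.0 / (x[:, 1] - 50.0)
        front = pareto_front(np.column_stack([energy, area]))
        print(len(front), "non-dominated designs out of", len(x))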

  10. Coordinated control of active and reactive power of distribution network with distributed PV cluster via model predictive control

    NASA Astrophysics Data System (ADS)

    Ji, Yu; Sheng, Wanxing; Jin, Wei; Wu, Ming; Liu, Haitao; Chen, Feng

    2018-02-01

    A coordinated optimal control method for the active and reactive power of a distribution network with distributed PV clusters, based on model predictive control, is proposed in this paper. The method divides the control process into long-time-scale and short-time-scale optimal control with multi-step optimization. Because the optimization models are non-convex and nonlinear and therefore hard to solve, they are transformed into a second-order cone programming problem. An improved IEEE 33-bus distribution network system is used to analyse the feasibility and effectiveness of the proposed control method.
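
    The following is a minimal receding-horizon sketch of the model-predictive idea (optimize over several steps, apply only the first control move), assuming a linear voltage-sensitivity model and a regularized least-squares objective as placeholders for the paper's second-order cone program; all numbers are invented.

        import numpy as np

        def mpc_step(v0, v_ref, S, horizon=4, weight=0.1):
            """One receding-horizon step: optimize injections over the horizon, apply the first.

            Linear prediction model v_k = v0 + S @ (dq_0 + ... + dq_{k-1}); quadratic cost on
            voltage deviation and control effort, solved as regularized least squares.
            """
            n, m = S.shape
            rows = []
            for k in range(1, horizon + 1):
                # Voltages after k moves depend on the first k injection vectors.
                rows.append(np.hstack([S] * k + [np.zeros((n, m))] * (horizon - k)))
            A = np.vstack(rows + [np.sqrt(weight) * np.eye(horizon * m)])
            b = np.concatenate([np.tile(v_ref - v0, horizon), np.zeros(horizon * m)])
            dq = np.linalg.lstsq(A, b, rcond=None)[0]
            return dq[:m]          # only the first move is applied; the rest is re-optimized next step

        # Hypothetical 3-bus feeder with 2 controllable PV inverters (sensitivities in p.u.).
        S = np.array([[0.05, 0.02], [0.03, 0.04], [0.01, 0.03]])
        v0 = np.array([0.04, -0.02, 0.03])
        print(mpc_step(v0, np.zeros(3), S))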

  11. Towards a sustainable modular robot system for planetary exploration

    NASA Astrophysics Data System (ADS)

    Hossain, S. G. M.

    This thesis investigates multiple perspectives of developing an unmanned robotic system suited for planetary terrains. In this case, the unmanned system consists of unit-modular robots. This type of robot has the potential to be developed and maintained as a sustainable multi-robot system while located far from direct human intervention. Some characteristics that make this possible are: the cooperation, communication and connectivity among the robot modules, flexibility of individual robot modules, capability of self-healing in the case of a failed module and the ability to generate multiple gaits by means of reconfiguration. To demonstrate the effects of high flexibility of an individual robot module, multiple modules of a four-degree-of-freedom unit-modular robot were developed. The robot was equipped with a novel connector mechanism that made self-healing possible. Also, design strategies included the use of series elastic actuators for better robot-terrain interaction. In addition, various locomotion gaits were generated and explored using the robot modules, which is essential for a modular robot system to achieve robustness and thus successfully navigate and function in a planetary environment. To investigate multi-robot task completion, a biomimetic cooperative load transportation algorithm was developed and simulated. Also, a liquid motion-inspired theory was developed for systems consisting of a large number of robot modules; this can be used to traverse obstacles that inevitably occur when maneuvering over rough terrains such as those encountered in planetary exploration. Keywords: Modular robot, cooperative robots, biomimetics, planetary exploration, sustainability.

  12. Enhancement cavities for zero-offset-frequency pulse trains.

    PubMed

    Holzberger, S; Lilienfein, N; Trubetskov, M; Carstens, H; Lücking, F; Pervak, V; Krausz, F; Pupeza, I

    2015-05-15

    The optimal enhancement of broadband optical pulses in a passive resonator requires a seeding pulse train with a specific carrier-envelope-offset frequency. Here, we control the phase of the cavity mirrors to tune the offset frequency for which a given comb is optimally enhanced. This enables the enhancement of a zero-offset-frequency train of sub-30-fs pulses to multi-kW average powers. The combination of pulse duration, power, and zero phase slip constitutes a crucial step toward the generation of attosecond pulses at multi-10-MHz repetition rates. In addition, this control affords the enhancement of pulses generated by difference-frequency mixing, e.g., for mid-infrared spectroscopy.

  13. A seismic-network mission proposal as an example for modular robotic lunar exploration missions

    NASA Astrophysics Data System (ADS)

    Lange, C.; Witte, L.; Rosta, R.; Sohl, F.; Heffels, A.; Knapmeyer, M.

    2017-05-01

    This paper discusses an approach to reduce design costs for subsequent missions by introducing modularity, commonality and multi-mission capability, and thereby reuse of mission-specific investments, into the design of lunar exploration infrastructure systems. The presented approach has been developed within the German Helmholtz-Alliance on Robotic Exploration of Extreme Environments (ROBEX), a research alliance bringing together deep-sea and space research to jointly develop technologies and investigate problems for the exploration of highly inaccessible terrain - be it in the deep sea and polar regions or on the Moon and other planets. Although overall costs are much smaller for deep-sea missions than for lunar missions, a lot can be learned from modularity approaches in deep-sea research infrastructure design, which allow high operational flexibility in the planning phase of a mission as well as during its implementation. The research presented here is based on a review of existing modular solutions in Earth-orbiting satellites as well as science and exploration systems. This is followed by an investigation of lunar exploration scenarios from which we derive requirements for a multi-mission modular architecture. After analyzing possible options, an approach using a bus modular architecture for dedicated subsystems is presented. The approach is based on exchangeable modules, e.g. incorporating instruments, which are added to the baseline system platform according to the demands of the specific scenario. It is described in more detail, including arising problems, e.g. in the power or thermal domain. Finally, the technological building blocks needed to put the architecture into practical use are described in more detail.

  14. Multiple D3-Instantons and Mock Modular Forms II

    NASA Astrophysics Data System (ADS)

    Alexandrov, Sergei; Banerjee, Sibasish; Manschot, Jan; Pioline, Boris

    2018-03-01

    We analyze the modular properties of D3-brane instanton corrections to the hypermultiplet moduli space in type IIB string theory compactified on a Calabi-Yau threefold. In Part I, we found a necessary condition for the existence of an isometric action of S-duality on this moduli space: the generating function of DT invariants in the large volume attractor chamber must be a vector-valued mock modular form with specified modular properties. In this work, we prove that this condition is also sufficient at two-instanton order. This is achieved by producing a holomorphic action of {SL(2,Z)} on the twistor space which preserves the holomorphic contact structure. The key step is to cancel the anomalous modular variation of the Darboux coordinates by a local holomorphic contact transformation, which is generated by a suitable indefinite theta series. For this purpose we introduce a new family of theta series of signature (2, n - 2), find their modular completion, and conjecture sufficient conditions for their convergence, which may be of independent mathematical interest.

  15. Modular and Orthogonal Synthesis of Hybrid Polymers and Networks

    PubMed Central

    Liu, Shuang; Dicker, Kevin T.; Jia, Xinqiao

    2015-01-01

    Biomaterials scientists strive to develop polymeric materials with distinct chemical make-up, complex molecular architectures, robust mechanical properties and defined biological functions by drawing inspiration from biological systems. Salient features of biological designs include (1) repetitive presentation of basic motifs; and (2) efficient integration of diverse building blocks. Thus, an appealing approach to biomaterials synthesis is to combine synthetic and natural building blocks in a modular fashion employing novel chemical methods. Over the past decade, orthogonal chemistries have become powerful enabling tools for the modular synthesis of advanced biomaterials. These reactions require building blocks with complementary functionalities, occur under mild conditions in the presence of biological molecules and living cells and proceed with high yield and exceptional selectivity. These chemistries have facilitated the construction of complex polymers and networks in a step-growth fashion, allowing facile modulation of materials properties by simple variations of the building blocks. In this review, we first summarize features of several types of orthogonal chemistries. We then discuss recent progress in the synthesis of step-growth linear polymers, dendrimers and networks that find application in drug delivery, 3D cell culture and tissue engineering. Overall, orthogonal reactions and modular synthesis have not only minimized the steps needed for the desired chemical transformations but also maximized the diversity and functionality of the final products. The modular nature of the design, combined with the potential synergistic effect of the hybrid system, will likely result in novel hydrogel matrices with robust structures and defined functions. PMID:25572255

  16. Enantioselective modular synthesis of cyclohexenones: total syntheses of (+)-crypto- and (+)-infectocaryone.

    PubMed

    Franck, Géraldine; Brödner, Kerstin; Helmchen, Günter

    2010-09-03

    A modular synthesis of cyclohexenones is described and applied to the first enantioselective total syntheses of (+)-crypto- and (+)-infectocaryone. Key steps in the synthesis of cyclohexenones are an iridium-catalyzed allylic alkylation, nucleophilic allylation, and ring-closing metathesis. On the way to (+)-cryptocaryone, a catch and release strategy involving an iodolactonization/elimination and a regioselective C-acylation were used.

  17. Modular control subsystems for use in solar heating systems for multi-family dwellings

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Progress in the development of solar heating modular control subsystems is reported. Circuit design, circuit drawings, and printed circuit board layout are discussed along with maintenance manuals, installation instructions, and verification and acceptance tests. Calculations made to determine the predicted performance of the differential thermostat are given including details and results of tests for the offset temperature, and boil and freeze protect points.

  18. Development of modular control software for construction 3D-printer

    NASA Astrophysics Data System (ADS)

    Bazhanov, A.; Yudin, D.; Porkhalo, V.

    2018-03-01

    This article discusses an approach to developing modular software for real-time control of an industrial construction 3D printer. The proposed structure of a two-level software solution is implemented for a robotic system that moves in a Cartesian coordinate system with multi-axis interpolation. An algorithm for the formation and analysis of a path, based on dynamic programming, is considered to enable the most effective control of printing.
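
    As a loose illustration of path selection by dynamic programming, the following Python sketch chooses one candidate position per path segment so as to minimize an accumulated cost; the cost table, transition constraint and names are invented and are not the authors' algorithm.

        import numpy as np

        def best_path(cost, max_jump=1):
            """Dynamic-programming choice of one candidate per segment minimizing accumulated cost.

            cost[i, j] -- cost of candidate position i in segment j; transitions between
            segments are limited to +/- max_jump candidate indices. Returns chosen indices.
            """
            n, segments = cost.shape
            total = cost.astype(float).copy()
            back = np.zeros((n, segments), dtype=int)
            for j in range(1, segments):
                for i in range(n):
                    lo, hi = max(0, i - max_jump), min(n, i + max_jump + 1)
                    k = lo + int(np.argmin(total[lo:hi, j - 1]))   # cheapest reachable predecessor
                    back[i, j] = k
                    total[i, j] += total[k, j - 1]
            path = [int(np.argmin(total[:, -1]))]
            for j in range(segments - 1, 0, -1):                   # backtrack the optimal choices
                path.append(int(back[path[-1], j]))
            return path[::-1]

        # Toy cost table: 4 candidate positions over 5 path segments.
        print(best_path(np.array([[3, 1, 4, 1, 5],
                                  [2, 7, 1, 8, 2],
                                  [9, 2, 6, 5, 3],
                                  [4, 6, 2, 9, 7]])))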

  19. Multi-objective optimization of chromatographic rare earth element separation.

    PubMed

    Knutson, Hans-Kristian; Holmqvist, Anders; Nilsson, Bernt

    2015-10-16

    The importance of rare earth elements in modern technological industry is growing, and as a result the interest in developing separation processes increases. This work is part of developing chromatography as a rare earth element processing method. Process optimization is an important step in process development, and there are several competing objectives that need to be considered in a chromatographic separation process. Most studies are limited to evaluating the two competing objectives productivity and yield, and studies of scenarios with tri-objective optimizations are scarce. Tri-objective optimizations are much needed when evaluating the chromatographic separation of rare earth elements due to the importance of product pool concentration along with productivity and yield as process objectives. In this work, a multi-objective optimization strategy considering productivity, yield and pool concentration is proposed. This was carried out in the frame of a model-based optimization study on a batch chromatography separation of the rare earth elements samarium, europium and gadolinium. The findings from the multi-objective optimization were used to provide a general strategy for achieving desirable operating points, resulting in a productivity between 0.61 and 0.75 kg Eu m_column^-3 h^-1 and a pool concentration between 0.52 and 0.79 kg Eu m^-3, while maintaining a purity above 99% and never falling below an 80% yield for the main target component europium. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Surveillance and reconnaissance ground system architecture

    NASA Astrophysics Data System (ADS)

    Devambez, Francois

    2001-12-01

    Modern conflicts induce various modes of deployment, depending on the type of conflict, the type of mission, and the phase of the conflict. It is therefore impossible to define fixed-architecture systems for surveillance ground segments. Thales has developed a structure for a ground segment based on the operational functions required and on the definition of modules and networks. These modules are software and hardware modules, including communications and networks. This ground segment is called MGS (Modular Ground Segment) and is intended for use in airborne reconnaissance systems, surveillance systems, and UAV systems. The main parameters for the definition of a modular ground image exploitation system are: compliance with various operational configurations; easy adaptation to the evolution of these configurations; interoperability with NATO and multinational forces; security; multi-sensor, multi-platform capabilities; technical modularity; evolutivity; and reduction of life-cycle cost. The general performances of the MGS are presented: type of sensors, acquisition process, exploitation of images, report generation, database management, dissemination, and interface with C4I. The MGS is then described as a set of hardware and software modules, and their organization to build numerous operational configurations. Architectures range from a minimal configuration intended for a mono-sensor image exploitation system to a full image intelligence center for multilevel exploitation of multiple sensors.

  1. A modular modulation method for achieving increases in metabolite production.

    PubMed

    Acerenza, Luis; Monzon, Pablo; Ortega, Fernando

    2015-01-01

    Increasing the production of overproducing strains represents a great challenge. Here, we develop a modular modulation method to determine the key steps for genetic manipulation to increase metabolite production. The method consists of three steps: (i) modularization of the metabolic network into two modules connected by linking metabolites, (ii) change in the activity of the modules using auxiliary rates producing or consuming the linking metabolites in appropriate proportions and (iii) determination of the key modules and steps to increase production. The mathematical formulation of the method in matrix form shows that it may be applied to metabolic networks of any structure and size, with reactions showing any kind of rate laws. The results are valid for any type of conservation relationships in the metabolite concentrations or interactions between modules. The activity of the modules may, in principle, be changed by any large factor. The method may be applied recursively or combined with other methods devised to perform fine searches in smaller regions. In practice, it is implemented by integrating into the producer strain heterologous reactions or synthetic pathways that produce or consume the linking metabolites. The new procedure may contribute to developing metabolic engineering into a more systematic practice. © 2015 American Institute of Chemical Engineers.
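
    The following is a toy Python sketch of the modular idea, assuming a two-module pathway joined by a single linking metabolite and simple placeholder kinetics; scaling each module's activity and comparing the resulting steady-state production rate mimics step (iii) of identifying the key module.

        def steady_production(alpha1=1.0, alpha2=1.0, steps=50000, dt=0.005):
            """Toy pathway  S -(module 1)-> L -(module 2)-> P  with linking metabolite L.

            alpha1 and alpha2 scale the activity of each module, mimicking the auxiliary
            rates of the modular modulation method; returns the steady-state production rate.
            """
            L = 0.0
            for _ in range(steps):
                r1 = alpha1 * 1.0                        # supply of L by module 1 (saturated)
                r2 = alpha2 * 2.0 * L / (1.0 + L)        # consumption of L by module 2
                L += dt * (r1 - r2)                      # simple Euler integration to steady state
            return alpha2 * 2.0 * L / (1.0 + L)

        base = steady_production()
        # Raise each module's activity by 50% and compare the production gain:
        print(steady_production(alpha1=1.5) / base,      # ~1.5 -> module 1 is the key step
              steady_production(alpha2=1.5) / base)      # ~1.0 -> module 2 has little control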

  2. Toward a modular multi-material nanoparticle synthesis and assembly strategy via bionanocombinatorics: bifunctional peptides for linking Au and Ag nanomaterials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Briggs, Beverly D.; Palafox-Hernandez, J. Pablo; Li, Yue

    Materials-binding peptides represent a unique avenue towards controlling the shape and size of nanoparticles (NPs) grown under aqueous conditions. Here, employing a bionanocombinatorics approach, two such materials-binding peptides were linked at either end of a photoswitchable spacer, forming a multi-domain materials-binding molecule to control the in situ synthesis and organization of Ag and Au NPs under ambient conditions. These multi-domain molecules retained the peptides’ ability to nucleate, grow, and stabilize Ag and Au NPs in aqueous media. Disordered co-assemblies of the two nanomaterials were observed by TEM imaging of dried samples after sequential growth of the two metals, and showed a clustering behavior that was not observed without both metals and the linker molecules. While TEM evidence indicated the formation of AuNP/AgNP assemblies upon drying, SAXS analysis indicated that no extended assemblies existed in solution, suggesting that sample drying plays an important role in facilitating NP clustering. Molecular simulations and experimental data revealed tunable materials-binding based upon the isomerization state of the photoswitchable unit and metal employed. This work is a first step in generating externally actuated biomolecules with specific material-binding properties that could be used as the building blocks to achieve multi-material switchable NP assemblies.

  3. Modularity in developmental biology and artificial organs: a missing concept in tissue engineering.

    PubMed

    Lenas, Petros; Luyten, Frank P; Doblare, Manuel; Nicodemou-Lena, Eleni; Lanzara, Andreina Elena

    2011-06-01

    Tissue engineering is reviving itself, adopting the concept of biomimetics of in vivo tissue development. A basic concept of developmental biology is the modularity of the tissue architecture according to which intermediates in tissue development constitute semiautonomous entities. Both engineering and nature have chosen the modular architecture to optimize the product or organism development and evolution. Bioartificial tissues do not have a modular architecture. On the contrary, artificial organs of modular architecture have been already developed in the field of artificial organs. Therefore the conceptual support of tissue engineering by the field of artificial organs becomes critical in its new endeavor of recapitulating in vitro the in vivo tissue development. © 2011, Copyright the Authors. Artificial Organs © 2011, International Center for Artificial Organs and Transplantation and Wiley Periodicals, Inc.

  4. An Advice Mechanism for Heterogeneous Robot Teams

    NASA Astrophysics Data System (ADS)

    Daniluk, Steven

    The use of reinforcement learning for robot teams has enabled complex tasks to be performed, but at the cost of requiring a large amount of exploration. Exchanging information between robots in the form of advice is one method to accelerate performance improvements. This thesis presents an advice mechanism for robot teams that utilizes advice from heterogeneous advisers via a method guaranteeing convergence to an optimal policy. The presented mechanism has the capability to use multiple advisers at each time step, and to decide when advice should be requested and accepted, such that the use of advice decreases over time. Additionally, collective, collaborative, and cooperative behavioural algorithms are integrated into a robot team architecture, to create a new framework that provides fault tolerance and modularity for robot teams.

  5. Seeing the wood for the trees: a forest of methods for optimization and omic-network integration in metabolic modelling.

    PubMed

    Vijayakumar, Supreeta; Conway, Max; Lió, Pietro; Angione, Claudio

    2017-05-30

    Metabolic modelling has entered a mature phase with dozens of methods and software implementations available to the practitioner and the theoretician. It is not easy for a modeller to be able to see the wood (or the forest) for the trees. Driven by this analogy, we here present a 'forest' of principal methods used for constraint-based modelling in systems biology. This provides a tree-based view of methods available to prospective modellers, also available in interactive version at http://modellingmetabolism.net, where it will be kept updated with new methods after the publication of the present manuscript. Our updated classification of existing methods and tools highlights the most promising in the different branches, with the aim to develop a vision of how existing methods could hybridize and become more complex. We then provide the first hands-on tutorial for multi-objective optimization of metabolic models in R. We finally discuss the implementation of multi-view machine learning approaches in poly-omic integration. Throughout this work, we demonstrate the optimization of trade-offs between multiple metabolic objectives, with a focus on omic data integration through machine learning. We anticipate that the combination of a survey, a perspective on multi-view machine learning and a step-by-step R tutorial should be of interest for both the beginner and the advanced user. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. Technology-based design and scaling for RTGs for space exploration in the 100 W range

    NASA Astrophysics Data System (ADS)

    Summerer, Leopold; Pierre Roux, Jean; Pustovalov, Alexey; Gusev, Viacheslav; Rybkin, Nikolai

    2011-04-01

    This paper presents the results of a study on design considerations for a 100 W radioisotope thermo-electric generator (RTG). Special emphasis has been put on designing a modular, multi-purpose system with high overall TRL levels and making full use of the extensive Russian heritage in the design of radioisotope power systems. The modular approach allowed insight into the scaling of such RTGs covering the electric power range from 50 to 200 We (EoL). The retained concept is based on a modular thermal block structure, radiative inner-RTG heat transfer, and a two-stage thermo-electric conversion system.

  7. Upcycling UAS into modular platforms for Earth science and autonomy research

    NASA Astrophysics Data System (ADS)

    Dahlgren, R. P.; Dary, O. G.; Ogunbiyi, J. A.; Pinsker, E. A.; Reynolds, K. W.; Werner, C. A.

    2015-12-01

    This paper reports the results of a multidisciplinary project conducted at the NASA Ames Research Center (ARC) involving a number of student interns over the summers of 2014 and 2015. The project had the goal of applying rapid prototyping techniques, including 3D printing, to unmanned aircraft systems (UAS), and demonstrated that surplus UAS could be repurposed into new configurations suitable for conducting science missions. ARC received several units of the RQ-11 Raven and RQ-14 DragonEye manufactured by AeroVironment Corporation, along with ground stations and spare parts. These UAS have electric propulsion and a wingspan and length of roughly 1 m; they are designed to disassemble for transport, have a simple wing design with snap-together interfaces, and are made from lightweight materials. After removing all ITAR-restricted technology, these were made available to summer interns who also had access to 3D printing and CNC laser-cutting equipment through NASA's SpaceShop. The modular nature and simple wing profiles enabled the teams to deconstruct and subsequently reconfigure them into completely new airframes. Two multi-fuselage designs were assembled using an Ardupilot-based common avionics architecture (CAA), with extended wingspans, an H-tail and an innovative cambered flap system. After NASA internal design reviews, the students fabricated new control surfaces and subcomponents necessary to splice the RQ-14 subcomponents back together. Laboratory testing was performed on test articles to determine bending modulus and safety factors, and documentation was prepared for airworthiness flight safety review. Upon receiving approval of documentation and flight readiness certification, the repurposed UAS were flown at Crows Landing airfield in Stanislaus County, California, initially under RC pilot control and subsequently under fully autonomous control. The RQ-11 is now being used to expand on the modularity design, and the team has been designing different configurations and a payload pod that will allow flexible modular implementation. This project demonstrated that rapid prototyping combined with modular subcomponents can increase the rate of design iterations on aircraft optimized for science missions. Field data will be reported for missions at the Salton Sea and Crows Landing, California.

  8. A modular computational framework for automated peak extraction from ion mobility spectra

    PubMed Central

    2014-01-01

    Background An ion mobility (IM) spectrometer coupled with a multi-capillary column (MCC) measures volatile organic compounds (VOCs) in the air or in exhaled breath. This technique is utilized in several biotechnological and medical applications. Each peak in an MCC/IM measurement represents a certain compound, which may be known or unknown. For clustering and classification of measurements, the raw data matrix must be reduced to a set of peaks. Each peak is described by its coordinates (retention time in the MCC and reduced inverse ion mobility) and shape (signal intensity, further shape parameters). This fundamental step is referred to as peak extraction. It is the basis for identifying discriminating peaks, and hence putative biomarkers, between two classes of measurements, such as a healthy control group and a group of patients with a confirmed disease. Current state-of-the-art peak extraction methods require human interaction, such as hand-picking approximate peak locations, assisted by a visualization of the data matrix. In a high-throughput context, however, it is preferable to have robust methods for fully automated peak extraction. Results We introduce PEAX, a modular framework for automated peak extraction. The framework consists of several steps in a pipeline architecture. Each step performs a specific sub-task and can be instantiated by different methods implemented as modules. We provide open-source software for the framework and several modules for each step. Additionally, an interface that allows easy extension by a new module is provided. Combining the modules in all reasonable ways leads to a large number of peak extraction methods. We evaluate all combinations using intrinsic error measures and by comparing the resulting peak sets with an expert-picked one. Conclusions Our software PEAX is able to automatically extract peaks from MCC/IM measurements within a few seconds. The automatically obtained results keep up with the results provided by current state-of-the-art peak extraction methods. This opens a high-throughput context for the MCC/IM application field. Our software is available at http://www.rahmannlab.de/research/ims. PMID:24450533
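
    The following is a minimal Python sketch of such a pipeline architecture, in which each step is an interchangeable module with a common signature; the step names follow the description above, while the implementations and data layout are placeholders, not the PEAX modules.

        from typing import Callable, List

        # Each pipeline step maps the working data dict to its refined form; any function
        # with this signature can be dropped in as an alternative module for that step.
        Step = Callable[[dict], dict]

        def smooth_baseline(data: dict) -> dict:
            data["matrix"] = [[max(v - 0.1, 0.0) for v in row] for row in data["matrix"]]
            return data

        def pick_candidates(data: dict) -> dict:
            data["peaks"] = [(i, j) for i, row in enumerate(data["matrix"])
                             for j, v in enumerate(row) if v > 0.5]
            return data

        def model_peaks(data: dict) -> dict:
            data["peaks"] = [{"rt": i, "mobility": j, "intensity": data["matrix"][i][j]}
                             for i, j in data["peaks"]]
            return data

        def run_pipeline(data: dict, steps: List[Step]) -> dict:
            for step in steps:       # modules are interchangeable as long as the signature matches
                data = step(data)
            return data

        measurement = {"matrix": [[0.2, 0.9, 0.3], [0.1, 0.8, 0.7]]}
        result = run_pipeline(measurement, [smooth_baseline, pick_candidates, model_peaks])
        print(result["peaks"])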

  9. A modular computational framework for automated peak extraction from ion mobility spectra.

    PubMed

    D'Addario, Marianna; Kopczynski, Dominik; Baumbach, Jörg Ingo; Rahmann, Sven

    2014-01-22

    An ion mobility (IM) spectrometer coupled with a multi-capillary column (MCC) measures volatile organic compounds (VOCs) in the air or in exhaled breath. This technique is utilized in several biotechnological and medical applications. Each peak in an MCC/IM measurement represents a certain compound, which may be known or unknown. For clustering and classification of measurements, the raw data matrix must be reduced to a set of peaks. Each peak is described by its coordinates (retention time in the MCC and reduced inverse ion mobility) and shape (signal intensity, further shape parameters). This fundamental step is referred to as peak extraction. It is the basis for identifying discriminating peaks, and hence putative biomarkers, between two classes of measurements, such as a healthy control group and a group of patients with a confirmed disease. Current state-of-the-art peak extraction methods require human interaction, such as hand-picking approximate peak locations, assisted by a visualization of the data matrix. In a high-throughput context, however, it is preferable to have robust methods for fully automated peak extraction. We introduce PEAX, a modular framework for automated peak extraction. The framework consists of several steps in a pipeline architecture. Each step performs a specific sub-task and can be instantiated by different methods implemented as modules. We provide open-source software for the framework and several modules for each step. Additionally, an interface that allows easy extension by a new module is provided. Combining the modules in all reasonable ways leads to a large number of peak extraction methods. We evaluate all combinations using intrinsic error measures and by comparing the resulting peak sets with an expert-picked one. Our software PEAX is able to automatically extract peaks from MCC/IM measurements within a few seconds. The automatically obtained results keep up with the results provided by current state-of-the-art peak extraction methods. This opens a high-throughput context for the MCC/IM application field. Our software is available at http://www.rahmannlab.de/research/ims.

  10. Transparent DNA/RNA Co-extraction Workflow Protocol Suitable for Inhibitor-Rich Environmental Samples That Focuses on Complete DNA Removal for Transcriptomic Analyses

    PubMed Central

    Lim, Natalie Y. N.; Roco, Constance A.; Frostegård, Åsa

    2016-01-01

    Adequate comparisons of DNA and cDNA libraries from complex environments require methods for co-extraction of DNA and RNA due to the inherent heterogeneity of such samples, or risk bias caused by variations in lysis and extraction efficiencies. Still, there are few methods and kits allowing simultaneous extraction of DNA and RNA from the same sample, and the existing ones generally require optimization. The proprietary nature of kit components, however, makes modifications of individual steps in the manufacturer’s recommended procedure difficult. Surprisingly, enzymatic treatments are often performed before purification procedures are complete, which we have identified here as a major problem when seeking efficient genomic DNA removal from RNA extracts. Here, we tested several DNA/RNA co-extraction commercial kits on inhibitor-rich soils, and compared them to a commonly used phenol-chloroform co-extraction method. Since none of the kits/methods co-extracted high-quality nucleic acid material, we optimized the extraction workflow by introducing small but important improvements. In particular, we illustrate the need for extensive purification prior to all enzymatic procedures, with special focus on the DNase digestion step in RNA extraction. These adjustments led to the removal of enzymatic inhibition in RNA extracts and made it possible to reduce genomic DNA to below detectable levels as determined by quantitative PCR. Notably, we confirmed that DNase digestion may not be uniform in replicate extraction reactions, thus the analysis of “representative samples” is insufficient. The modular nature of our workflow protocol allows optimization of individual steps. It also increases focus on additional purification procedures prior to enzymatic processes, in particular DNases, yielding genomic DNA-free RNA extracts suitable for metatranscriptomic analysis. PMID:27803690

  11. Multi-GPU implementation of a VMAT treatment plan optimization algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Zhen, E-mail: Zhen.Tian@UTSouthwestern.edu, E-mail: Xun.Jia@UTSouthwestern.edu, E-mail: Steve.Jiang@UTSouthwestern.edu; Folkerts, Michael; Tan, Jun

    Purpose: Volumetric modulated arc therapy (VMAT) optimization is a computationally challenging problem due to its large data size, high degrees of freedom, and many hardware constraints. High-performance graphics processing units (GPUs) have been used to speed up the computations. However, a GPU’s relatively small memory size cannot handle cases with a large dose-deposition coefficient (DDC) matrix, such as those with a large target size, multiple targets, multiple arcs, and/or small beamlet size. The main purpose of this paper is to report an implementation of a column-generation-based VMAT algorithm, previously developed in the authors’ group, on a multi-GPU platform to solve the memory limitation problem. While the column-generation-based VMAT algorithm has been previously developed, the GPU implementation details have not been reported. Hence, another purpose is to present detailed techniques employed for GPU implementation. The authors also would like to utilize this particular problem as an example problem to study the feasibility of using a multi-GPU platform to solve large-scale problems in medical physics. Methods: The column-generation approach generates VMAT apertures sequentially by solving a pricing problem (PP) and a master problem (MP) iteratively. In the authors’ method, the sparse DDC matrix is first stored on a CPU in coordinate list format (COO). On the GPU side, this matrix is split into four submatrices according to beam angles, which are stored on four GPUs in compressed sparse row format. Computation of beamlet price, the first step in PP, is accomplished using multiple GPUs. A fast inter-GPU data transfer scheme is accomplished using peer-to-peer access. The remaining steps of the PP and MP problems are implemented on CPU or a single GPU due to their modest problem scale and computational loads. The Barzilai-Borwein algorithm with a subspace step scheme is adopted here to solve the MP problem. A head and neck (H and N) cancer case is then used to validate the authors’ method. The authors also compare their multi-GPU implementation with three different single-GPU implementation strategies, i.e., truncating the DDC matrix (S1), repeatedly transferring the DDC matrix between CPU and GPU (S2), and porting computations involving the DDC matrix to CPU (S3), in terms of both plan quality and computational efficiency. Two more H and N patient cases and three prostate cases are used to demonstrate the advantages of the authors’ method. Results: The authors’ multi-GPU implementation can finish the optimization process within ∼1 min for the H and N patient case. S1 leads to an inferior plan quality although its total time was 10 s shorter than the multi-GPU implementation due to the reduced matrix size. S2 and S3 yield the same plan quality as the multi-GPU implementation but take ∼4 and ∼6 min, respectively. High computational efficiency was consistently achieved for the other five patient cases tested, with VMAT plans of clinically acceptable quality obtained within 23–46 s. Conversely, to obtain clinically comparable or acceptable plans for all six of these VMAT cases that the authors have tested in this paper, the optimization time needed in a commercial TPS system on CPU was found to be on the order of several minutes. Conclusions: The results demonstrate that the multi-GPU implementation of the authors’ column-generation-based VMAT optimization can handle the large-scale VMAT optimization problem efficiently without sacrificing plan quality. The authors’ study may serve as an example to shed some light on other large-scale medical physics problems that require multi-GPU techniques.
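
    The following is a minimal Python sketch of the data layout described in the Methods, assuming beamlet columns are grouped contiguously by beam angle; scipy's sparse containers stand in for the GPU-side storage, and each CSR slice would be shipped to one of the four GPUs. The matrix values and sizes are invented.

        import numpy as np
        from scipy import sparse

        # Hypothetical small DDC matrix in COO format: rows = voxels, columns = beamlets.
        rows = np.array([0, 0, 1, 2, 3, 3, 4, 5])
        cols = np.array([0, 3, 1, 4, 2, 6, 5, 7])
        vals = np.array([0.7, 0.1, 0.5, 0.2, 0.9, 0.3, 0.4, 0.6])
        ddc_coo = sparse.coo_matrix((vals, (rows, cols)), shape=(6, 8))

        # Beamlet-to-angle assignment: two beamlets per angle, four angle groups,
        # mirroring the split of the matrix across four GPUs.
        angle_of_beamlet = np.repeat(np.arange(4), 2)

        per_gpu = []
        csc = ddc_coo.tocsc()                      # column slicing is cheap in CSC
        for a in range(4):
            cols_a = np.where(angle_of_beamlet == a)[0]
            per_gpu.append(csc[:, cols_a].tocsr()) # each submatrix stored in CSR, as on the GPUs

        # Beamlet "price" step: each GPU computes its partial product, results are gathered.
        dual = np.ones(6)
        prices = np.concatenate([sub.T.dot(dual) for sub in per_gpu])
        print(prices)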

  12. Robust and fast nonlinear optimization of diffusion MRI microstructure models.

    PubMed

    Harms, R L; Fritz, F J; Tobisch, A; Goebel, R; Roebroeck, A

    2017-07-15

    Advances in biophysical multi-compartment modeling for diffusion MRI (dMRI) have gained popularity because of greater specificity than DTI in relating the dMRI signal to underlying cellular microstructure. A large range of these diffusion microstructure models have been developed and each of the popular models comes with its own, often different, optimization algorithm, noise model and initialization strategy to estimate its parameter maps. Since data fit, accuracy and precision are hard to verify, this creates additional challenges to comparability and generalization of results from diffusion microstructure models. In addition, non-linear optimization is computationally expensive, leading to very long run times, which can be prohibitive in large group or population studies. In this technical note we investigate the performance of several optimization algorithms and initialization strategies over a few of the most popular diffusion microstructure models, including NODDI and CHARMED. We evaluate whether a single well-performing optimization approach exists that could be applied to many models and would equate both run time and fit aspects. All models, algorithms and strategies were implemented on the Graphics Processing Unit (GPU) to remove run time constraints, with which we achieve whole brain dataset fits in seconds to minutes. We then evaluated fit, accuracy, precision and run time for different models of differing complexity against three common optimization algorithms and three parameter initialization strategies. Variability of the achieved quality of fit in actual data was evaluated on ten subjects of each of two population studies with a different acquisition protocol. We find that optimization algorithms and multi-step optimization approaches have a considerable influence on performance and stability over subjects and over acquisition protocols. The gradient-free Powell conjugate-direction algorithm was found to outperform other common algorithms in terms of run time, fit, accuracy and precision. Parameter initialization approaches were found to be relevant especially for more complex models, such as those involving several fiber orientations per voxel. For these, a fitting cascade initializing or fixing parameter values in a later optimization step from simpler models in an earlier optimization step further improved run time, fit, accuracy and precision compared to a single-step fit. This establishes and makes available standards by which robust fit and accuracy can be achieved in shorter run times. This is especially relevant for the use of diffusion microstructure modeling in large group or population studies and in combining microstructure parameter maps with tractography results. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
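
    The following is a minimal Python sketch of the cascaded-initialization idea using scipy's gradient-free Powell optimizer on a toy two-compartment signal model; the model, protocol and parameter values are placeholders and not NODDI or CHARMED.

        import numpy as np
        from scipy.optimize import minimize

        b = np.linspace(0, 3000, 30)                        # b-values of a toy protocol (s/mm^2)
        rng = np.random.default_rng(1)
        signal = 0.6 * np.exp(-b * 2.0e-3) + 0.4 * np.exp(-b * 0.4e-3)
        signal += 0.01 * rng.standard_normal(b.size)        # synthetic measurement

        def sse_simple(p):                                  # single-compartment (ADC) model
            return np.sum((np.exp(-b * p[0]) - signal) ** 2)

        def sse_two(p):                                     # two-compartment model
            f, d1, d2 = p
            pred = f * np.exp(-b * d1) + (1 - f) * np.exp(-b * d2)
            return np.sum((pred - signal) ** 2)

        # Step 1 of the cascade: fit the simpler model.
        simple = minimize(sse_simple, x0=[1.0e-3], method="Powell")

        # Step 2: use its estimate to initialize the more complex model.
        x0 = [0.5, 2.0 * simple.x[0], 0.5 * simple.x[0]]
        full = minimize(sse_two, x0=x0, method="Powell")
        print(simple.x, full.x)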

  13. Configuration optimization of laser guide stars and wavefront correctors for multi-conjugation adaptive optics

    NASA Astrophysics Data System (ADS)

    Xuan, Li; He, Bin; Hu, Li-Fa; Li, Da-Yu; Xu, Huan-Yu; Zhang, Xing-Yun; Wang, Shao-Xin; Wang, Yu-Kun; Yang, Cheng-Liang; Cao, Zhao-Liang; Mu, Quan-Quan; Lu, Xing-Hai

    2016-09-01

    Multi-conjugation adaptive optics (MCAO) systems have been investigated and used in large-aperture optical telescopes for high-resolution imaging with a large field of view (FOV). The atmospheric tomographic phase reconstruction and the projection of the three-dimensional turbulence volume onto wavefront correctors, such as deformable mirrors (DMs) or liquid crystal wavefront correctors (LCWCs), is a very important step in the data processing of an MCAO controller. In this paper, a method based on the wavefront reconstruction performance of MCAO is presented to evaluate the optimized configuration of multiple laser guide stars (LGSs) and reasonable conjugation heights of the LCWCs. Analytical formulations are derived for the different configurations and are used to generate optimized parameters for MCAO. Several examples are given to demonstrate our LGS configuration optimization method. Compared with traditional methods, our method has the minimum wavefront tomographic error, which is helpful for achieving higher imaging resolution over a large FOV in MCAO. Project supported by the National Natural Science Foundation of China (Grant Nos. 11174274, 11174279, 61205021, 11204299, 61475152, and 61405194) and the State Key Laboratory of Applied Optics, Changchun Institute of Optics, Fine Mechanics and Physics, Chinese Academy of Sciences.

  14. A game theory-reinforcement learning (GT-RL) method to develop optimal operation policies for multi-operator reservoir systems

    NASA Astrophysics Data System (ADS)

    Madani, Kaveh; Hooshyar, Milad

    2014-11-01

    Reservoir systems with multiple operators can benefit from coordination of operation policies. To maximize the total benefit of these systems the literature has normally used the social planner's approach. Based on this approach operation decisions are optimized using a multi-objective optimization model with a compound system's objective. While the utility of the system can be increased this way, fair allocation of benefits among the operators remains challenging for the social planner who has to assign controversial weights to the system's beneficiaries and their objectives. Cooperative game theory provides an alternative framework for fair and efficient allocation of the incremental benefits of cooperation. To determine the fair and efficient utility shares of the beneficiaries, cooperative game theory solution methods consider the gains of each party in the status quo (non-cooperation) as well as what can be gained through the grand coalition (social planner's solution or full cooperation) and partial coalitions. Nevertheless, estimation of the benefits of different coalitions can be challenging in complex multi-beneficiary systems. Reinforcement learning can be used to address this challenge and determine the gains of the beneficiaries for different levels of cooperation, i.e., non-cooperation, partial cooperation, and full cooperation, providing the essential input for allocation based on cooperative game theory. This paper develops a game theory-reinforcement learning (GT-RL) method for determining the optimal operation policies in multi-operator multi-reservoir systems with respect to fairness and efficiency criteria. As the first step to underline the utility of the GT-RL method in solving complex multi-agent multi-reservoir problems without a need for developing compound objectives and weight assignment, the proposed method is applied to a hypothetical three-agent three-reservoir system.
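
    The following is a minimal Python sketch of the allocation step: once the value of every coalition has been estimated (by reinforcement learning in the paper), a cooperative-game solution such as the Shapley value splits the full-cooperation benefit among the operators. The coalition values below are invented, and the Shapley value is only one of several solution methods the paper could use.

        from itertools import combinations
        from math import factorial

        def shapley(players, value):
            """Shapley value of each player given a coalition value function v(frozenset)."""
            n = len(players)
            shares = {}
            for p in players:
                others = [q for q in players if q != p]
                total = 0.0
                for k in range(n):
                    for coal in combinations(others, k):
                        s = frozenset(coal)
                        weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                        total += weight * (value(s | {p}) - value(s))   # weighted marginal gain
                shares[p] = total
            return shares

        # Hypothetical benefits of reservoir-operator coalitions, as estimated by the RL step
        # for non-cooperation, partial cooperation, and full cooperation.
        v = {frozenset(): 0, frozenset("A"): 10, frozenset("B"): 12, frozenset("C"): 8,
             frozenset("AB"): 28, frozenset("AC"): 22, frozenset("BC"): 24,
             frozenset("ABC"): 42}
        print(shapley(["A", "B", "C"], lambda s: v[frozenset(s)]))   # shares sum to v(ABC) = 42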

  15. Modular High-Energy Systems for Solar Power Satellites

    NASA Technical Reports Server (NTRS)

    Howell, Joe T.; Carrington, Connie K.; Marzwell, Neville I.; Mankins, John C.

    2006-01-01

    Modular high-energy systems are stepping stones to provide capabilities for an energy-rich infrastructure located in space to support a variety of exploration scenarios, as well as a supplemental source of energy during peak demands for ground grid systems. Abundant renewable energy at lunar or other locations could support propellant production and storage in refueling scenarios that enable affordable exploration. Renewable energy platforms in geosynchronous Earth orbits can collect and transmit power to satellites, or to Earth-surface locations. Energy-rich space technologies also enable the use of electric-powered propulsion systems that could efficiently deliver cargo and exploration facilities to remote locations. A first step to an energy-rich space infrastructure is a 100-kWe class solar-powered platform in Earth orbit. The platform would utilize advanced technologies in solar power collection and generation, power management and distribution, thermal management, electric propulsion, wireless avionics, autonomous in-space rendezvous and docking, servicing, and robotic assembly. It would also provide an energy-rich free-flying platform to demonstrate in space a portfolio of technology flight experiments. This paper summarizes a preliminary design concept for a 100-kWe solar-powered satellite system to demonstrate in flight a variety of advanced technologies, each as a separate payload. These technologies include, but are not limited to, state-of-the-art solar concentrators, highly efficient multi-junction solar cells, integrated thermal management on the arrays, and innovative deployable structure design and packaging to make the 100-kW satellite feasible to launch on one existing launch vehicle. Higher-voltage arrays and power distribution systems (PDS) reduce or eliminate the need for massive power converters, and could enable direct drive of high-voltage solar electric thrusters.

  16. A Multi-object Exoplanet Detecting Technique

    NASA Astrophysics Data System (ADS)

    Zhang, K.

    2011-05-01

    Exoplanet exploration is not only a meaningful astronomical endeavor, but is also closely related to the search for extraterrestrial life. A high-resolution echelle spectrograph is the key instrument for measuring stellar radial velocity (RV), but higher precision demands better environmental stability and entails higher cost. An improved RV technique invented by David J. Erskine in 1997, Externally Dispersed Interferometry (EDI), can increase the RV measurement precision by combining a moderate-resolution spectrograph with a fixed-delay Michelson interferometer. LAMOST, with its large aperture and large field of view, is equipped with 16 multi-object low-resolution fiber spectrographs, and these spectrographs are capable of working in a medium-resolution mode (R = 5000-10000). By introducing the EDI technique, LAMOST can become one of the most powerful exoplanet detection systems in the world. The EDI technique is a new technique for developing astronomical instrumentation in China. The operating theory of EDI was generally verified by a feasibility experiment done in 2009, and a multi-object exoplanet survey system based on the LAMOST spectrographs was then proposed. Within this project, three important tasks have been completed. First, a simulation of the EDI operating theory was built, containing a stellar spectrum model, an interferometer transmission model, a spectrograph model and an RV solution model. To match the practical situation, two detecting modes, temporal and spatial phase-stepping methods, are simulated separately. The interference spectrum is analyzed with a Fourier-transform algorithm, and a higher-resolution conventional spectrum is recovered. Second, an EDI prototype was composed of a multi-object interferometer prototype and the LAMOST spectrograph. Some ideas are used in the design to reduce the effect of central obscuration, for example a modular structure and external/internal adjusting frames. Another feasibility experiment was done at Xinglong Station in 2010; a related spectrum reduction program and the instrumental stability were tested by obtaining multi-object interference spectra. Third, the parameters of the fixed-delay Michelson interferometer were optimized, which helps to increase its internal thermal stability and relax the external environmental requirements. Referring to the wide-angle Michelson interferometer successfully used for upper-atmospheric wind-field measurements, a glass-pair selection scheme is given. By choosing a suitable glass pair for the interferometer arms, the RV error can be kept stable to several hundred m·s^{-1}·°C^{-1}. This work is therefore helpful for the further study of the EDI technique and for speeding up the development of a multi-object exoplanet survey system. LAMOST will make an even greater contribution to astronomy when the combination of its spectrographs with the EDI technique is realized.

  17. Multi-species beam hardening calibration device for x-ray microtomography

    NASA Astrophysics Data System (ADS)

    Evershed, Anthony N. Z.; Mills, David; Davis, Graham

    2012-10-01

    Impact-source X-ray microtomography (XMT) is a widely-used benchtop alternative to synchrotron radiation microtomography. Since X-rays from a tube are polychromatic, however, greyscale `beam hardening' artefacts are produced by the preferential absorption of low-energy photons in the beam path. A multi-material `carousel' test piece was developed to offer a wider range of X-ray attenuations from well-characterised filters than single-material step wedges can practically produce, and optimization software was developed to produce a beam hardening correction by use of the Nelder-Mead optimization method, tuned for specimens composed of other materials (such as hydroxyapatite [HA] or barium for dental applications). The carousel test piece produced calibration polynomials reliably and with a significantly smaller discrepancy between the calculated and measured attenuations than the calibration step wedge previously in use. An immersion tank was constructed and used to simplify multi-material samples in order to negate the beam hardening effect of low atomic number materials within the specimen when measuring the mineral concentration of higher-Z regions. When scanned in water at an acceleration voltage of 90 kV, a Scanco AG hydroxyapatite / poly(methyl methacrylate) calibration phantom closely approximates a single-material system, producing accurate hydroxyapatite concentration measurements. This system can then be corrected for beam hardening for the material of interest.
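
    The following is a minimal Python sketch of the correction fit: polynomial coefficients mapping the measured, beam-hardened attenuation back to an ideal linear attenuation are found with the Nelder-Mead method; the carousel attenuation values are synthetic and the saturation model is a placeholder.

        import numpy as np
        from scipy.optimize import minimize

        # Synthetic carousel data: ideal (monochromatic-equivalent) attenuation of each filter
        # and the measured polychromatic attenuation, which saturates due to beam hardening.
        true_atten = np.linspace(0.1, 4.0, 12)
        measured = 1.8 * (1.0 - np.exp(-true_atten / 1.8))        # placeholder hardening curve

        def mismatch(coeffs):
            """Squared error of the polynomial correction applied to the measured values."""
            corrected = np.polyval(np.append(coeffs, 0.0), measured)   # force zero offset
            return np.sum((corrected - true_atten) ** 2)

        result = minimize(mismatch, x0=[0.0, 0.0, 1.0], method="Nelder-Mead")
        print(result.x)       # cubic, quadratic and linear coefficients of the correction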

  18. Automation of route identification and optimisation based on data-mining and chemical intuition.

    PubMed

    Lapkin, A A; Heer, P K; Jacob, P-M; Hutchby, M; Cunningham, W; Bull, S D; Davidson, M G

    2017-09-21

    Data-mining of Reaxys and network analysis of the combined literature and in-house reactions set were used to generate multiple possible reaction routes to convert a bio-waste feedstock, limonene, into a pharmaceutical API, paracetamol. The network analysis of data provides a rich knowledge-base for generation of the initial reaction screening and development programme. Based on the literature and the in-house data, an overall flowsheet for the conversion of limonene to paracetamol was proposed. Each individual reaction-separation step in the sequence was simulated as a combination of the continuous flow and batch steps. The linear model generation methodology allowed us to identify the reaction steps requiring further chemical optimisation. The generated model can be used for global optimisation and generation of environmental and other performance indicators, such as cost indicators. However, the identified further challenge is to automate model generation to evolve optimal multi-step chemical routes and optimal process configurations.
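
    As a loose illustration of route identification over a reaction network, the following Python sketch runs a weighted shortest-path search on a small directed compound graph; the intermediates, edge costs and use of networkx are invented stand-ins for the paper's data-mined network and its analysis.

        import networkx as nx

        # Directed compound graph: an edge is a literature or in-house reaction,
        # weighted by an illustrative per-step cost score (lower is better).
        G = nx.DiGraph()
        G.add_weighted_edges_from([
            ("limonene", "intermediate_A", 2.0),
            ("limonene", "intermediate_B", 1.5),
            ("intermediate_A", "intermediate_C", 1.0),
            ("intermediate_B", "intermediate_C", 2.5),
            ("intermediate_C", "paracetamol", 1.0),
            ("intermediate_B", "paracetamol", 4.0),
        ])

        route = nx.shortest_path(G, "limonene", "paracetamol", weight="weight")
        cost = nx.shortest_path_length(G, "limonene", "paracetamol", weight="weight")
        print(route, cost)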

  19. Detailed analysis of an optimized FPP-based 3D imaging system

    NASA Astrophysics Data System (ADS)

    Tran, Dat; Thai, Anh; Duong, Kiet; Nguyen, Thanh; Nehmetallah, Georges

    2016-05-01

    In this paper, we present a detailed analysis and a step-by-step implementation of an optimized fringe projection profilometry (FPP) based 3D shape measurement system. First, we propose a multi-frequency and multi-phase-shifting sinusoidal fringe pattern reconstruction approach to increase the accuracy and sensitivity of the system. Second, phase error compensation caused by the nonlinear transfer function of the projector and camera is performed through polynomial approximation. Third, phase unwrapping is performed using spatial and temporal techniques, and the tradeoff between processing speed and high accuracy is discussed in detail. Fourth, generalized camera and system calibration are developed for phase-to-real-world coordinate transformation. The calibration coefficients are estimated accurately using a reference plane and several gauge blocks with precisely known heights and by employing a nonlinear least-squares fitting method. Fifth, a texture is attached to the height profile by registering a 2D photograph to the 3D height map. The last step is to perform 3D image fusion and registration using an iterative closest point (ICP) algorithm for a full-field-of-view reconstruction. The system is experimentally constructed using compact, portable, and low-cost off-the-shelf components. A MATLAB® based GUI is developed to control and synchronize the whole system.
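
    The following is a minimal Python sketch of one ingredient of such a system, the N-step phase-shifting computation that recovers the wrapped phase from equally shifted fringe images; the synthetic frames stand in for real camera captures, and unwrapping and calibration are not shown.

        import numpy as np

        def wrapped_phase(images):
            """Wrapped phase from N equally shifted fringes I_k = A + B*cos(phi + 2*pi*k/N)."""
            images = np.asarray(images, dtype=float)
            n = images.shape[0]
            deltas = 2.0 * np.pi * np.arange(n) / n
            s = np.tensordot(np.sin(deltas), images, axes=1)
            c = np.tensordot(np.cos(deltas), images, axes=1)
            return np.arctan2(-s, c)     # in (-pi, pi]; to be unwrapped spatially or temporally

        # Synthetic test: a known phase ramp imaged with 4 phase shifts.
        x = np.linspace(0, 2 * np.pi, 100)
        phi_true = 0.8 * x
        frames = [0.5 + 0.4 * np.cos(phi_true + 2 * np.pi * k / 4) for k in range(4)]
        phi = wrapped_phase(frames)
        print(np.allclose(np.angle(np.exp(1j * (phi - phi_true))), 0, atol=1e-6))   # True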

  20. Non-aqueous Electrode Processing and Construction of Lithium-ion Coin Cells.

    PubMed

    Stein, Malcolm; Chen, Chien-Fan; Robles, Daniel J; Rhodes, Christopher; Mukherjee, Partha P

    2016-02-01

    Research into new and improved materials to be utilized in lithium-ion batteries (LIB) necessitates an experimental counterpart to any computational analysis. Testing of lithium-ion batteries in an academic setting has taken on several forms, but at the most basic level lies the coin cell construction. In traditional LIB electrode preparation, a multi-phase slurry composed of active material, binder, and conductive additive is cast out onto a substrate. An electrode disc can then be punched from the dried sheet and used in the construction of a coin cell for electrochemical evaluation. Utilization of the potential of the active material in a battery is critically dependent on the microstructure of the electrode, as an appropriate distribution of the primary components are crucial to ensuring optimal electrical conductivity, porosity, and tortuosity, such that electrochemical and transport interaction is optimized. Processing steps ranging from the combination of dry powder, wet mixing, and drying can all critically affect multi-phase interactions that influence the microstructure formation. Electrochemical probing necessitates the construction of electrodes and coin cells with the utmost care and precision. This paper aims at providing a step-by-step guide of non-aqueous electrode processing and coin cell construction for lithium-ion batteries within an academic setting and with emphasis on deciphering the influence of drying and calendering.

  1. Non-aqueous Electrode Processing and Construction of Lithium-ion Coin Cells

    PubMed Central

    Stein, Malcolm; Chen, Chien-Fan; Robles, Daniel J.; Rhodes, Christopher; Mukherjee, Partha P.

    2016-01-01

    Research into new and improved materials to be utilized in lithium-ion batteries (LIB) necessitates an experimental counterpart to any computational analysis. Testing of lithium-ion batteries in an academic setting has taken on several forms, but at the most basic level lies the coin cell construction. In traditional LIB electrode preparation, a multi-phase slurry composed of active material, binder, and conductive additive is cast out onto a substrate. An electrode disc can then be punched from the dried sheet and used in the construction of a coin cell for electrochemical evaluation. Utilization of the potential of the active material in a battery is critically dependent on the microstructure of the electrode, as an appropriate distribution of the primary components are crucial to ensuring optimal electrical conductivity, porosity, and tortuosity, such that electrochemical and transport interaction is optimized. Processing steps ranging from the combination of dry powder, wet mixing, and drying can all critically affect multi-phase interactions that influence the microstructure formation. Electrochemical probing necessitates the construction of electrodes and coin cells with the utmost care and precision. This paper aims at providing a step-by-step guide of non-aqueous electrode processing and coin cell construction for lithium-ion batteries within an academic setting and with emphasis on deciphering the influence of drying and calendering. PMID:26863503

  2. Research on Intelligent Control System of DC SQUID Magnetometer Parameters for Multi-channel System

    NASA Astrophysics Data System (ADS)

    Chen, Hua; Yang, Kang; Lu, Li; Kong, Xiangyan; Wang, Hai; Wu, Jun; Wang, Yongliang

    2018-07-01

    In a multi-channel SQUID measurement system, adjusting the device parameters of all channels to their optimal conditions is time-consuming. In this paper, an intelligent control system is presented that determines the optimal working point of the devices automatically and more efficiently than manual adjustment. An optimal working point searching algorithm is introduced as the core component of the control system. In this algorithm, the bias voltage V_bias is step-scanned to obtain the maximal peak-to-peak current I_pp of the SQUID magnetometer modulation curve, and this point is chosen as the optimal one. Using the above control system, more than 30 weakly damped SQUID magnetometers with areas of 5 × 5 mm^2 or 10 × 10 mm^2 were adjusted, and a 36-channel magnetocardiography system worked reliably in a magnetically shielded room. The average white flux noise is 15 μΦ_0/Hz^{1/2}.
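
    A minimal sketch of the working-point search described above: step-scan the bias voltage, measure the modulation curve, and keep the bias that maximizes I_pp. The scan range, step count, and the measure_modulation_curve callback are hypothetical placeholders for the real hardware interface.

```python
import numpy as np

def find_optimal_bias(measure_modulation_curve, v_min=0.0, v_max=200e-6, n_steps=101):
    """Step-scan V_bias and keep the point maximizing the peak-to-peak current
    I_pp of the SQUID modulation curve. measure_modulation_curve(v_bias) is a
    hypothetical hardware callback returning one modulation curve as an array."""
    best_v, best_ipp = None, -np.inf
    for v_bias in np.linspace(v_min, v_max, n_steps):
        curve = np.asarray(measure_modulation_curve(v_bias))
        ipp = curve.max() - curve.min()
        if ipp > best_ipp:
            best_v, best_ipp = v_bias, ipp
    return best_v, best_ipp

# Toy stand-in for the hardware: the modulation depth peaks near 120 uV bias.
def fake_curve(v_bias, phi=np.linspace(0, 2 * np.pi, 256)):
    depth = np.exp(-((v_bias - 120e-6) / 40e-6) ** 2)
    return depth * np.sin(phi)

v_opt, ipp = find_optimal_bias(fake_curve)
print(f"optimal V_bias = {v_opt * 1e6:.1f} uV, I_pp = {ipp:.3f}")
```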

  3. Research on Intelligent Control System of DC SQUID Magnetometer Parameters for Multi-channel System

    NASA Astrophysics Data System (ADS)

    Chen, Hua; Yang, Kang; Lu, Li; Kong, Xiangyan; Wang, Hai; Wu, Jun; Wang, Yongliang

    2018-03-01

    In a multi-channel SQUID measurement system, adjusting the device parameters of all channels to their optimal conditions is time-consuming. In this paper, an intelligent control system is presented that determines the optimal working point of the devices automatically and more efficiently than manual adjustment. An optimal working point searching algorithm is introduced as the core component of the control system. In this algorithm, the bias voltage V_bias is step-scanned to obtain the maximal peak-to-peak current I_pp of the SQUID magnetometer modulation curve, and this point is chosen as the optimal one. Using the above control system, more than 30 weakly damped SQUID magnetometers with areas of 5 × 5 mm^2 or 10 × 10 mm^2 were adjusted, and a 36-channel magnetocardiography system worked reliably in a magnetically shielded room. The average white flux noise is 15 μΦ_0/Hz^{1/2}.

  4. Size-guided multi-seed heuristic method for geometry optimization of clusters: Application to benzene clusters.

    PubMed

    Takeuchi, Hiroshi

    2018-05-08

    Since searching for the global minimum on the potential energy surface of a cluster is very difficult, many geometry optimization methods have been proposed, in which initial geometries are randomly generated and subsequently improved with different algorithms. In this study, a size-guided multi-seed heuristic method is developed and applied to benzene clusters. It produces initial configurations of the cluster with n molecules from the lowest-energy configurations of the cluster with n - 1 molecules (seeds). The initial geometries are further optimized with the geometrical perturbations previously used for molecular clusters. These steps are repeated until the size n reaches a predefined value. The method locates putative global minima of benzene clusters with up to 65 molecules. The performance of the method is discussed in terms of computational cost, success rates in locating the global minima, and the energies of the initial geometries. © 2018 Wiley Periodicals, Inc.
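
    The size-guided multi-seed idea can be sketched on Lennard-Jones particles, used here as a simple stand-in for benzene molecules (whose intermolecular potential is more involved): the lowest-energy clusters of size n-1 seed the candidates of size n, which are then locally optimized. All numerical settings below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def lj_energy(flat_coords):
    """Lennard-Jones energy; a simple stand-in for a benzene-benzene potential."""
    x = flat_coords.reshape(-1, 3)
    diff = x[:, None, :] - x[None, :, :]
    r = np.linalg.norm(diff, axis=-1)[np.triu_indices(len(x), k=1)]
    return np.sum(4.0 * (r ** -12 - r ** -6))

def grow_clusters(n_max, n_seeds=3, trials_per_seed=8, rng=np.random.default_rng(0)):
    """Size-guided multi-seed growth: size-n candidates are built from the
    lowest-energy size-(n-1) configurations (the seeds) plus one new particle,
    then locally optimized; the best survivors seed the next size."""
    seeds = [np.array([[0.0, 0.0, 0.0], [1.12, 0.0, 0.0]])]   # near-optimal dimer
    best = None
    for n in range(3, n_max + 1):
        candidates = []
        for seed in seeds:
            for _ in range(trials_per_seed):
                anchor = seed[rng.integers(len(seed))]
                direction = rng.normal(size=3)
                new = anchor + 1.2 * direction / np.linalg.norm(direction)
                res = minimize(lj_energy, np.vstack([seed, new]).ravel(), method="L-BFGS-B")
                candidates.append((res.fun, res.x.reshape(-1, 3)))
        candidates.sort(key=lambda c: c[0])
        seeds = [c[1] for c in candidates[:n_seeds]]   # keep the best as next seeds
        best = candidates[0]
    return best   # (energy, coordinates) of the putative global minimum

energy, coords = grow_clusters(6)
print(f"best LJ-6 energy found: {energy:.3f}")   # the known LJ-6 minimum is about -12.71
```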

  5. Astrionic system optimization and modular astrionics for NASA missions after 1974. Preliminary definition of astrionic system for space tug Mission Vehicle Payload (MVP)

    NASA Technical Reports Server (NTRS)

    1970-01-01

    Results of preliminary studies to define the space tug astrionic system, subsystems, and components to meet requirements for a variety of missions are reported. Emphasis is placed on demonstration of the modular astrionics approach in the design of the space tug astrionic system.

  6. DReAM: Demand Response Architecture for Multi-level District Heating and Cooling Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhattacharya, Saptarshi; Chandan, Vikas; Arya, Vijay

    In this paper, we exploit the inherent hierarchy of heat exchangers in District Heating and Cooling (DHC) networks and propose DReAM, a novel Demand Response (DR) architecture for Multi-level DHC networks. DReAM serves to economize system operation while still respecting the comfort requirements of individual consumers. Contrary to many present-day DR schemes that work at a consumer-level granularity, DReAM works at a level of hierarchy above buildings, i.e., substations that supply heat to a group of buildings. This improves the overall DR scalability and reduces the computational complexity. In the first step of the proposed approach, mathematical models of individual substations and their downstream networks are abstracted into appropriately constructed low-complexity structural forms. In the second step, this abstracted information is employed by the utility to perform DR optimization that determines the optimal heat inflow to individual substations rather than buildings, in order to achieve the targeted objectives across the network. We validate the proposed DReAM framework through experimental results under different scenarios on a test network.

  7. Static inverter with synchronous output waveform synthesized by time-optimal-response feedback

    NASA Technical Reports Server (NTRS)

    Kernick, A.; Stechschulte, D. L.; Shireman, D. W.

    1976-01-01

    A time-optimal-response 'bang-bang' or 'bang-hang' technique, using four feedback control loops, synthesizes the static-inverter sinusoidal output waveform by self-oscillatory yet synchronous pulse-frequency modulation (SPFM). A single modular power stage per phase of ac output entails minimum circuit complexity, while feedback synthesis provides individual phase-voltage regulation, phase-position control, and inherent compensation for line and load disturbances simultaneously. Clipped sinewave performance is described under off-limit load or input voltage conditions. Approaches to high power levels, 3-phase arraying, and parallel modular connection are also given.

  8. Modular magazine for suitable handling of microparts in industry

    NASA Astrophysics Data System (ADS)

    Grimme, Ralf; Schmutz, Wolfgang; Schlenker, Dirk; Schuenemann, Matthias; Stock, Achim; Schaefer, Wolfgang

    1998-01-01

    Microassembly and microadjustment techniques are key technologies in the industrial production of hybrid microelectromechanical systems. One focal point in current microproduction research and engineering is the design and development of high-precision microassembly and microadjustment equipment capable of operating within the framework of flexible automated industrial production. As well as these developments, suitable microassembly tools for industrial use also need to be equipped with interfaces for the supply and delivery of microcomponents. The microassembly process necessitates the supply of microparts in a geometrically defined manner. In order to reduce processing steps and production costs, there is a demand for magazines capable of providing free accessibility to the fixed microcomponents. Commonly used at present are feeding techniques, which originate from the field of semiconductor production. However, none of these techniques fully meets the requirements of industrial microassembly technology. A novel modular magazine set, developed and tested in a joint project, is presented here. The magazines are able to hold microcomponents during cleaning, inspection and assembly without any additional handling steps. The modularity of their design allows for maximum technical flexibility. The modular magazine fits into currently practiced SEMI standards. The design and concept of the magazine enables industrial manufacturers to promote a cost-efficient and flexible precision assembly of microelectromechanical systems.

  9. EMMA: An Extensible Mammalian Modular Assembly Toolkit for the Rapid Design and Production of Diverse Expression Vectors.

    PubMed

    Martella, Andrea; Matjusaitis, Mantas; Auxillos, Jamie; Pollard, Steven M; Cai, Yizhi

    2017-07-21

    Mammalian plasmid expression vectors are critical reagents underpinning many facets of research across biology, biomedical research, and the biotechnology industry. Traditional cloning methods often require laborious manual design and assembly of plasmids using tailored sequential cloning steps. This process can be protracted, complicated, expensive, and error-prone. New tools and strategies that facilitate the efficient design and production of bespoke vectors would help relieve a current bottleneck for researchers. To address this, we have developed an extensible mammalian modular assembly kit (EMMA). This enables rapid and efficient modular assembly of mammalian expression vectors in a one-tube, one-step Golden Gate cloning reaction, using a standardized library of compatible genetic parts. The high modularity, flexibility, and extensibility of EMMA provide a simple method for the production of functionally diverse mammalian expression vectors. We demonstrate the value of this toolkit by constructing and validating a range of representative vectors, such as transient and stable expression vectors (transposon-based vectors), targeting vectors, inducible systems, polycistronic expression cassettes, fusion proteins, and fluorescent reporters. The method also supports simple assembly of combinatorial libraries and hierarchical assembly for the production of larger multigenetic cargos. In summary, EMMA is compatible with automated production, and novel genetic parts can be easily incorporated, providing new opportunities for mammalian synthetic biology.

  10. Modular architecture of protein structures and allosteric communications: potential implications for signaling proteins and regulatory linkages

    PubMed Central

    del Sol, Antonio; Araúzo-Bravo, Marcos J; Amoros, Dolors; Nussinov, Ruth

    2007-01-01

    Background Allosteric communications are vital for cellular signaling. Here we explore a relationship between protein architectural organization and shortcuts in signaling pathways. Results We show that protein domains consist of modules interconnected by residues that mediate signaling through the shortest pathways. These mediating residues tend to be located at the inter-modular boundaries, which are more rigid and display a larger number of long-range interactions than intra-modular regions. The inter-modular boundaries contain most of the residues centrally conserved in the protein fold, which may be crucial for information transfer between amino acids. Our approach to modular decomposition relies on a representation of protein structures as residue-interacting networks, and removal of the most central residue contacts, which are assumed to be crucial for allosteric communications. The modular decomposition of 100 multi-domain protein structures indicates that modules constitute the building blocks of domains. The analysis of 13 allosteric proteins revealed that modules characterize experimentally identified functional regions. Based on the study of an additional functionally annotated dataset of 115 proteins, we propose that high-modularity modules include functional sites and are the basic functional units. We provide examples (the Gαs subunit and P450 cytochromes) to illustrate that the modular architecture of active sites is linked to their functional specialization. Conclusion Our method decomposes protein structures into modules, allowing the study of signal transmission between functional sites. A modular configuration might be advantageous: it allows signaling proteins to expand their regulatory linkages and may elicit a broader range of control mechanisms either via modular combinations or through modulation of inter-modular linkages. PMID:17531094

  11. New Modular Camera No Ordinary Joe

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Although dubbed 'Little Joe' for its small-format characteristics, a new wavefront sensor camera has proved that it is far from coming up short when paired with high-speed, low-noise applications. SciMeasure Analytical Systems, Inc., a provider of cameras and imaging accessories for use in biomedical research and industrial inspection and quality control, is the eye behind Little Joe's shutter, manufacturing and selling the modular, multi-purpose camera worldwide to advance fields such as astronomy, neurobiology, and cardiology.

  12. Inter-subject FDG PET Brain Networks Exhibit Multi-scale Community Structure with Different Normalization Techniques.

    PubMed

    Sperry, Megan M; Kartha, Sonia; Granquist, Eric J; Winkelstein, Beth A

    2018-07-01

    Inter-subject networks are used to model correlations between brain regions and are particularly useful for metabolic imaging techniques, like 18F-2-deoxy-2-(18F)fluoro-D-glucose (FDG) positron emission tomography (PET). Since FDG PET typically produces a single image, correlations cannot be calculated over time. Little focus has been placed on the basic properties of inter-subject networks and on whether they are affected by group size and image normalization. FDG PET images were acquired from rats (n = 18), normalized by whole brain, visual cortex, or cerebellar FDG uptake, and used to construct correlation matrices. Group size effects on network stability were investigated by systematically adding rats and evaluating local network connectivity (node strength and clustering coefficient). Modularity and community structure were also evaluated in the differently normalized networks to assess meso-scale network relationships. Local network properties are stable regardless of normalization region for groups of at least 10. Whole brain-normalized networks are more modular than visual cortex- or cerebellum-normalized networks (p < 0.00001); however, community structure is similar at network resolutions where modularity differs most between brain and randomized networks. Hierarchical analysis reveals consistent modules at different scales and clustering of spatially-proximate brain regions. Findings suggest inter-subject FDG PET networks are stable for reasonable group sizes and exhibit multi-scale modularity.
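
    A toy sketch of the inter-subject network construction and of the local metrics tracked in the study (node strength and weighted clustering), using random numbers in place of the rat FDG uptake values and an arbitrary correlation threshold; it assumes a recent networkx release providing from_numpy_array.

```python
import numpy as np
import networkx as nx

# Toy inter-subject data: rows = subjects, columns = brain regions (normalized
# FDG uptake); real values would come from the PET images described above.
rng = np.random.default_rng(1)
uptake = rng.normal(size=(18, 30))          # 18 subjects, 30 regions

# Inter-subject network: correlate regions across subjects, threshold, and
# compute local properties of the resulting weighted graph.
corr = np.corrcoef(uptake.T)                # region-by-region correlation matrix
np.fill_diagonal(corr, 0.0)
adj = np.where(np.abs(corr) > 0.3, np.abs(corr), 0.0)
G = nx.from_numpy_array(adj)

strength = dict(G.degree(weight="weight"))
clustering = nx.clustering(G, weight="weight")
print(f"mean node strength: {np.mean(list(strength.values())):.2f}")
print(f"mean clustering coefficient: {np.mean(list(clustering.values())):.2f}")
```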

  13. Multi-exponential analysis of magnitude MR images using a quantitative multispectral edge-preserving filter.

    PubMed

    Bonny, Jean Marie; Boespflug-Tanguly, Odile; Zanca, Michel; Renou, Jean Pierre

    2003-03-01

    A solution for discrete multi-exponential analysis of T(2) relaxation decay curves obtained under current multi-echo imaging protocol conditions is described. We propose a preprocessing step to improve the signal-to-noise ratio and thus lower the signal-to-noise ratio threshold above which a high percentage of true multi-exponential decays is detected. It consists of a multispectral nonlinear edge-preserving filter that takes into account the signal-dependent Rician distribution of noise affecting magnitude MR images. Discrete multi-exponential decomposition, which requires no a priori knowledge, is performed by a non-linear least-squares procedure initialized with estimates obtained from a total least-squares linear prediction algorithm. This approach was validated and optimized experimentally on simulated data sets of normal human brains.
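
    A minimal sketch of a discrete two-component T2 fit on synthetic multi-echo data. The paper initializes the nonlinear fit from a total-least-squares linear prediction estimate; here that initialization is replaced by a crude manual guess, and the echo times, amplitudes, and noise level are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def biexp(te, a1, t2_1, a2, t2_2):
    """Discrete two-component T2 decay model."""
    return a1 * np.exp(-te / t2_1) + a2 * np.exp(-te / t2_2)

# Synthetic multi-echo decay (echo times in ms) with additive Gaussian noise,
# standing in for the filtered magnitude data described in the abstract.
te = np.arange(10, 330, 10, dtype=float)
rng = np.random.default_rng(3)
signal = biexp(te, 60.0, 40.0, 40.0, 120.0) + rng.normal(scale=1.0, size=te.size)

# Nonlinear least squares from a rough starting point.
p0 = (50.0, 30.0, 50.0, 100.0)
popt, _ = curve_fit(biexp, te, signal, p0=p0, maxfev=10000)
print("fitted (A1, T2_1, A2, T2_2):", np.round(popt, 1))
```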

  14. Comparison of microbial community shifts in two parallel multi-step drinking water treatment processes.

    PubMed

    Xu, Jiajiong; Tang, Wei; Ma, Jun; Wang, Hong

    2017-07-01

    Drinking water treatment processes remove undesirable chemicals and microorganisms from source water, which is vital to public health protection. The purpose of this study was to investigate the effects of treatment processes and configuration on the microbiome by comparing microbial community shifts in two series of different treatment processes operated in parallel within a full-scale drinking water treatment plant (DWTP) in Southeast China. Illumina sequencing of 16S rRNA genes of water samples demonstrated little effect of coagulation/sedimentation and pre-oxidation steps on bacterial communities, in contrast to dramatic and concurrent microbial community shifts during ozonation, granular activated carbon treatment, sand filtration, and disinfection for both series. A large number of unique operational taxonomic units (OTUs) at these four treatment steps further illustrated their strong shaping power towards the drinking water microbial communities. Interestingly, multidimensional scaling analysis revealed tight clustering of biofilm samples collected from different treatment steps, with Nitrospira, the nitrite-oxidizing bacteria, noted at higher relative abundances in biofilm compared to water samples. Overall, this study provides a snapshot of step-to-step microbial evolution in multi-step drinking water treatment systems, and the results provide insight into the control and manipulation of the drinking water microbiome via optimization of DWTP design and operation.

  15. Multi-scale modularity and motif distributional effect in metabolic networks.

    PubMed

    Gao, Shang; Chen, Alan; Rahmani, Ali; Zeng, Jia; Tan, Mehmet; Alhajj, Reda; Rokne, Jon; Demetrick, Douglas; Wei, Xiaohui

    2016-01-01

    Metabolism is a set of fundamental processes that play important roles in a plethora of biological and medical contexts. It is understood that the topological information of reconstructed metabolic networks, such as modular organization, has crucial implications on biological functions. Recent interpretations of modularity in network settings provide a view of multiple network partitions induced by different resolution parameters. Here we ask the question: How do multiple network partitions affect the organization of metabolic networks? Since network motifs are often interpreted as the super families of evolved units, we further investigate their impact under multiple network partitions and investigate how the distribution of network motifs influences the organization of metabolic networks. We studied Homo sapiens, Saccharomyces cerevisiae and Escherichia coli metabolic networks; we analyzed the relationship between different community structures and motif distribution patterns. Further, we quantified the degree to which motifs participate in the modular organization of metabolic networks.
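
    The multiple-partition view rests on a resolution-parameterized modularity, Q(gamma) = sum_c [ l_c/m - gamma (d_c/2m)^2 ]; the sketch below evaluates it for a fixed two-community split of the karate-club graph, used here only as a convenient stand-in for the metabolic networks studied in the paper.

```python
import networkx as nx

def modularity(G, communities, gamma=1.0):
    """Resolution-parameterized Newman-Girvan modularity:
    Q(gamma) = sum_c [ l_c / m - gamma * (d_c / (2 m))^2 ]."""
    m = G.number_of_edges()
    q = 0.0
    for comm in communities:
        l_c = G.subgraph(comm).number_of_edges()        # intra-community edges
        d_c = sum(dict(G.degree(comm)).values())        # total degree of the community
        q += l_c / m - gamma * (d_c / (2 * m)) ** 2
    return q

G = nx.karate_club_graph()
# A two-community split given by the club-membership node attribute.
split = [
    {n for n, d in G.nodes(data=True) if d["club"] == "Mr. Hi"},
    {n for n, d in G.nodes(data=True) if d["club"] != "Mr. Hi"},
]
for gamma in (0.5, 1.0, 2.0):
    print(f"gamma={gamma:>3}: Q = {modularity(G, split, gamma):+.3f}")
```

    Larger gamma penalizes large communities more strongly, which is what lets different resolution parameters induce different optimal partitions of the same network.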

  16. Lifting scheme-based method for joint coding of 3D stereo digital cinema with luminance correction and optimized prediction

    NASA Astrophysics Data System (ADS)

    Darazi, R.; Gouze, A.; Macq, B.

    2009-01-01

    Reproducing a natural and real scene as we see it in the real world every day is becoming more and more popular. Stereoscopic and multi-view techniques are used to this end. However, because more information is displayed, supporting technologies such as digital compression are required to ensure the storage and transmission of the sequences. In this paper, a new scheme for stereo image coding is proposed. The original left and right images are jointly coded. The main idea is to optimally exploit the existing correlation between the two images. This is done by the design of an efficient transform that reduces the existing redundancy in the stereo image pair. This approach was inspired by the Lifting Scheme (LS). The novelty of our work is that the prediction step has been replaced by a hybrid step that consists of disparity compensation followed by luminance correction and an optimized prediction step. The proposed scheme can be used for lossless and for lossy coding. Experimental results show improvement in terms of performance and complexity compared to recently proposed methods.

  17. Negotiating designs of multi-purpose reservoir systems in international basins

    NASA Astrophysics Data System (ADS)

    Geressu, Robel; Harou, Julien

    2016-04-01

    Given increasing agricultural and energy demands, coordinated management of multi-reservoir systems could help increase production without further stressing available water resources. However, regional or international disputes about water-use rights pose a challenge to efficient expansion and management of many large reservoir systems. Even when projects are likely to benefit all stakeholders, agreeing on the design, operation, financing, and benefit sharing can be challenging. This is due to the difficulty of considering multiple stakeholder interests in the design of projects and understanding the benefit trade-offs that designs imply. Incommensurate performance metrics, incomplete knowledge of system requirements, a lack of objectivity in managing conflict, and difficulty communicating complex issues exacerbate the problem. This work proposes a multi-step hybrid multi-objective optimization and multi-criteria ranking approach for supporting negotiation in water resource systems. The approach uses many-objective optimization to generate alternative efficient designs and reveal the trade-offs between conflicting objectives. This enables informed elicitation of criteria weights for further multi-criteria ranking of alternatives. An ideal design would be ranked as best by all stakeholders. Resource-sharing mechanisms such as power trade and/or cost sharing may help competing stakeholders arrive at designs acceptable to all. Many-objective optimization helps suggest efficient designs (reservoir site, storage size, and operating rule) and coordination levels considering the perspectives of multiple stakeholders simultaneously. We apply the proposed approach to a proof-of-concept study of the expansion of the Blue Nile transboundary reservoir system.
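
    A compact sketch of the two-stage idea: filter a set of hypothetical designs down to the Pareto-efficient subset, then rank that subset with elicited criterion weights. The objective names, scores, and weights are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical design alternatives scored on three objectives to maximize,
# e.g. hydropower benefit, irrigation benefit, and negated downstream deficit.
scores = rng.random((50, 3))

def pareto_front(obj):
    """Indices of non-dominated alternatives (all objectives maximized)."""
    keep = []
    for i, row in enumerate(obj):
        dominated = np.any(np.all(obj >= row, axis=1) & np.any(obj > row, axis=1))
        if not dominated:
            keep.append(i)
    return np.array(keep)

front = pareto_front(scores)

# Second stage: rank the efficient designs with stakeholder-elicited weights.
weights = np.array([0.5, 0.3, 0.2])
ranking = front[np.argsort(scores[front] @ weights)[::-1]]
print("efficient designs:", front)
print("top-3 designs by weighted score:", ranking[:3])
```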

  18. Modular assembly of optical nanocircuits.

    PubMed

    Shi, Jinwei; Monticone, Francesco; Elias, Sarah; Wu, Yanwen; Ratchford, Daniel; Li, Xiaoqin; Alù, Andrea

    2014-05-29

    A key element enabling the microelectronic technology advances of the past decades has been the conceptualization of complex circuits with versatile functionalities as being composed of the proper combination of basic 'lumped' circuit elements (for example, inductors and capacitors). In contrast, modern nanophotonic systems are still far from a similar level of sophistication, partially because of the lack of modularization of their response in terms of basic building blocks. Here we demonstrate the design, assembly and characterization of relatively complex photonic nanocircuits by accurately positioning a number of metallic and dielectric nanoparticles acting as modular lumped elements. The nanoparticle clusters produce the desired spectral response described by simple circuit rules and are shown to be dynamically reconfigurable by modifying the direction or polarization of impinging signals. Our work represents an important step towards extending the powerful modular design tools of electronic circuits into nanophotonic systems.

  19. Modular assembly of optical nanocircuits

    NASA Astrophysics Data System (ADS)

    Shi, Jinwei; Monticone, Francesco; Elias, Sarah; Wu, Yanwen; Ratchford, Daniel; Li, Xiaoqin; Alù, Andrea

    2014-05-01

    A key element enabling the microelectronic technology advances of the past decades has been the conceptualization of complex circuits with versatile functionalities as being composed of the proper combination of basic ‘lumped’ circuit elements (for example, inductors and capacitors). In contrast, modern nanophotonic systems are still far from a similar level of sophistication, partially because of the lack of modularization of their response in terms of basic building blocks. Here we demonstrate the design, assembly and characterization of relatively complex photonic nanocircuits by accurately positioning a number of metallic and dielectric nanoparticles acting as modular lumped elements. The nanoparticle clusters produce the desired spectral response described by simple circuit rules and are shown to be dynamically reconfigurable by modifying the direction or polarization of impinging signals. Our work represents an important step towards extending the powerful modular design tools of electronic circuits into nanophotonic systems.

  20. Modular synthesis and in vitro and in vivo antimalarial assessment of C-10 pyrrole mannich base derivatives of artemisinin.

    PubMed

    Pacorel, Bénédicte; Leung, Suet C; Stachulski, Andrew V; Davies, Jill; Vivas, Livia; Lander, Hollie; Ward, Stephen A; Kaiser, Marcel; Brun, Reto; O'Neill, Paul M

    2010-01-28

    In two steps from dihydroartemisinin, a small array of 16 semisynthetic C-10 pyrrole Mannich artemisinin derivatives (7a-p) has been prepared in moderate to excellent yield. In vitro analysis against both chloroquine-sensitive and -resistant strains has demonstrated that these analogues have nanomolar antimalarial activity, with several compounds being more than 3 times more potent than the natural product artemisinin. In addition to a potent antimalarial profile, these molecules also have very high in vitro therapeutic indices. Analysis of the optimal Mannich side chain substitution for in vitro and in vivo activity reveals that the morpholine and N-methylpiperazine Mannich side chains provide analogues with the best activity profiles, both in vitro and in vivo in the Peters' 4-day test.

  1. Automated multiplex genome-scale engineering in yeast

    PubMed Central

    Si, Tong; Chao, Ran; Min, Yuhao; Wu, Yuying; Ren, Wen; Zhao, Huimin

    2017-01-01

    Genome-scale engineering is indispensable in understanding and engineering microorganisms, but the current tools are mainly limited to bacterial systems. Here we report an automated platform for multiplex genome-scale engineering in Saccharomyces cerevisiae, an important eukaryotic model and widely used microbial cell factory. Standardized genetic parts encoding overexpression and knockdown mutations of >90% yeast genes are created in a single step from a full-length cDNA library. With the aid of CRISPR-Cas, these genetic parts are iteratively integrated into the repetitive genomic sequences in a modular manner using robotic automation. This system allows functional mapping and multiplex optimization on a genome scale for diverse phenotypes including cellulase expression, isobutanol production, glycerol utilization and acetic acid tolerance, and may greatly accelerate future genome-scale engineering endeavours in yeast. PMID:28469255

  2. Combining analysis with optimization at Langley Research Center. An evolutionary process

    NASA Technical Reports Server (NTRS)

    Rogers, J. L., Jr.

    1982-01-01

    The evolutionary process of combining analysis and optimization codes was traced with a view toward providing insight into the long-term goal of developing the methodology for an integrated, multidisciplinary software system for the concurrent analysis and optimization of aerospace structures. It was traced along the lines of strength sizing, concurrent strength and flutter sizing, and general optimization to define a near-term goal for combining analysis and optimization codes. Development of a modular software system combining general-purpose, state-of-the-art, production-level analysis computer programs for structures, aerodynamics, and aeroelasticity with a state-of-the-art optimization program is required. Incorporation of a modular and flexible structural optimization software system into a state-of-the-art finite element analysis computer program will facilitate this effort. The resulting software system is controlled with a special-purpose language, communicates with a data management system, and is easily modified to add new programs and capabilities. A 337 degree-of-freedom finite element model is used in verifying the accuracy of this system.

  3. A Multi-Objective Decision Making Approach for Solving the Image Segmentation Fusion Problem.

    PubMed

    Khelifi, Lazhar; Mignotte, Max

    2017-08-01

    Image segmentation fusion is defined as the set of methods which aim at merging several image segmentations in a manner that takes full advantage of the complementarity of each one. Previous relevant research in this field has been impeded by the difficulty of identifying an appropriate single segmentation fusion criterion providing the best possible, i.e., the most informative, result of fusion. In this paper, we propose a new model of image segmentation fusion based on multi-objective optimization which can mitigate this problem, to obtain a final improved result of segmentation. Our fusion framework incorporates the dominance concept in order to efficiently combine and optimize two complementary segmentation criteria, namely, the global consistency error and the F-measure (precision-recall) criterion. To this end, we present a hierarchical and efficient way to optimize the multi-objective consensus energy function related to this fusion model, which exploits a simple and deterministic iterative relaxation strategy combining the different image segments. This step is followed by a decision-making task based on the so-called "technique for order preference by similarity to ideal solution". Results obtained on two publicly available databases with manual ground truth segmentations clearly show that our multi-objective energy-based model gives better results than the classical mono-objective one.
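
    The final decision-making step can be sketched with a generic TOPSIS ranking, shown below on hypothetical fusion candidates scored by the two criteria named in the abstract (global consistency error, to be minimized, and F-measure, to be maximized); the numbers and weights are illustrative and this is not the authors' implementation.

```python
import numpy as np

def topsis(decision_matrix, weights, benefit):
    """Rank alternatives by closeness to the ideal solution (TOPSIS).

    decision_matrix: alternatives x criteria; weights sum to 1;
    benefit[j] is True when criterion j is to be maximized.
    """
    m = decision_matrix / np.linalg.norm(decision_matrix, axis=0)   # vector-normalize columns
    v = m * weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_plus = np.linalg.norm(v - ideal, axis=1)
    d_minus = np.linalg.norm(v - anti, axis=1)
    closeness = d_minus / (d_plus + d_minus)
    return np.argsort(closeness)[::-1]          # best alternative first

# Hypothetical fusion candidates scored as (GCE, F-measure).
candidates = np.array([[0.12, 0.71],
                       [0.09, 0.65],
                       [0.15, 0.80]])
order = topsis(candidates, weights=np.array([0.5, 0.5]), benefit=np.array([False, True]))
print("candidate ranking (best first):", order)
```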

  4. Influence of multi-step washing using Na2EDTA, oxalic acid and phosphoric acid on metal fractionation and spectroscopy characteristics from contaminated soil.

    PubMed

    Wei, Meng; Chen, Jiajun

    2016-11-01

    A multi-step soil washing test using a typical chelating agent (Na2EDTA), organic acid (oxalic acid), and inorganic weak acid (phosphoric acid) was conducted to remediate soil contaminated with heavy metals near an arsenic mining area. The aim of the test was to improve the heavy metal removal efficiency and investigate its influence on metal fractionation and the spectroscopy characteristics of contaminated soil. The results indicated that the orders of the multi-step washing were critical for the removal efficiencies of the metal fractions, bioavailability, and potential mobility due to the different dissolution levels of mineral fractions and the inter-transformation of metal fractions by XRD and FT-IR spectral analyses. The optimal soil washing options were identified as the Na2EDTA-phosphoric-oxalic acid (EPO) and phosphoric-oxalic acid-Na2EDTA (POE) sequences because of their high removal efficiencies (approximately 45% for arsenic and 88% for cadmium) and the minimal harmful effects that were determined by the mobility and bioavailability of the remaining heavy metals based on the metal stability (I_R) and modified redistribution index ([Formula: see text]).

  5. Compiling for Application Specific Computational Acceleration in Reconfigurable Architectures Final Report CRADA No. TSB-2033-01

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Supinski, B.; Caliga, D.

    2017-09-28

    The primary objective of this project was to develop memory optimization technology to efficiently deliver data to, and distribute data within, the SRC-6's Field Programmable Gate Array ("FPGA")-based Multi-Adaptive Processors (MAPs). The hardware/software approach was to explore efficient MAP configurations and generate the compiler technology to exploit those configurations. This memory accessing technology represents an important step towards making reconfigurable symmetric multi-processor (SMP) architectures that will be a cost-effective solution for large-scale scientific computing.

  6. Modular neural networks: a survey.

    PubMed

    Auda, G; Kamel, M

    1999-04-01

    Modular Neural Networks (MNNs) are a rapidly growing field in artificial Neural Network (NN) research. This paper surveys the different motivations for creating MNNs: biological, psychological, hardware, and computational. Then, the general stages of MNN design are outlined and surveyed as well, viz., task decomposition techniques, learning schemes and multi-module decision-making strategies. Advantages and disadvantages of the surveyed methods are pointed out, and an assessment with respect to practical potential is provided. Finally, some general recommendations for future designs are presented.

  7. Modular reservoir concept for MEMS-based transdermal drug delivery systems

    NASA Astrophysics Data System (ADS)

    Cantwell, Cara T.; Wei, Pinghung; Ziaie, Babak; Rao, Masaru P.

    2014-11-01

    While MEMS-based transdermal drug delivery device development efforts have typically focused on tightly-integrated solutions, we propose an alternate conception based upon a novel, modular drug reservoir approach. By decoupling the drug storage functionality from the rest of the delivery system, this approach seeks to minimize cold chain storage volume, enhance compatibility with conventional pharmaceutical practices, and allow independent optimization of reservoir device design, materials, and fabrication. Herein, we report the design, fabrication, and preliminary characterization of modular reservoirs that demonstrate the virtue of this approach within the application context of transdermal insulin administration for diabetes management.

  8. Promoter library-based module combination (PLMC) technology for optimization of threonine biosynthesis in Corynebacterium glutamicum.

    PubMed

    Wei, Liang; Xu, Ning; Wang, Yiran; Zhou, Wei; Han, Guoqiang; Ma, Yanhe; Liu, Jun

    2018-05-01

    Due to the lack of efficient control elements and tools, the fine-tuning of gene expression in multi-gene metabolic pathways is still a great challenge for engineering microbial cell factories, especially for the important industrial microorganism Corynebacterium glutamicum. In this study, the promoter library-based module combination (PLMC) technology was developed to efficiently optimize the expression of genes in C. glutamicum. A random promoter library was designed to contain the putative -10 (NNTANANT) and -35 (NNGNCN) consensus motifs, and refined through a three-step screening procedure to achieve numerous genetic control elements with different strength levels, including fluorescence-activated cell sorting (FACS) screening, agar plate screening, and 96-well plate screening. Multiple conventional strategies were employed for further precise characterizations of the promoter library, such as real-time quantitative PCR, sodium dodecyl sulfate polyacrylamide gel electrophoresis, FACS analysis, and the lacZ reporter system. These results suggested that the established promoter elements effectively regulated gene expression and showed varying strengths over a wide range. Subsequently, a multi-module combination technology was created based on the efficient promoter elements for combination and optimization of modules in multi-gene pathways. Using this technology, the threonine biosynthesis pathway was reconstructed and optimized by predictably tuning the expression of five modules in C. glutamicum. The threonine titer of the optimized strain was significantly improved to 12.8 g/L, approximately 6.1-fold higher than that of the control strain. Overall, the PLMC technology presented in this study provides a rapid and effective method for combination and optimization of multi-gene pathways in C. glutamicum.
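
    A small sketch of how a random promoter library seeded with the stated -35 (NNGNCN) and -10 (NNTANANT) degenerate motifs could be generated in silico; the flank and spacer lengths are illustrative assumptions, not the paper's exact design.

```python
import random

random.seed(42)
BASES = "ACGT"

def expand(degenerate):
    """Expand a degenerate motif (N = any base) into one random realization."""
    return "".join(random.choice(BASES) if c == "N" else c for c in degenerate)

def random_promoter(spacer_len=17, upstream=5, downstream=6):
    """One candidate promoter carrying the -35 (NNGNCN) and -10 (NNTANANT)
    consensus motifs used to seed the library; flank and spacer lengths are
    hypothetical choices for illustration."""
    rand = lambda n: "".join(random.choice(BASES) for _ in range(n))
    return (rand(upstream) + expand("NNGNCN") + rand(spacer_len)
            + expand("NNTANANT") + rand(downstream))

library = [random_promoter() for _ in range(10)]
for seq in library[:3]:
    print(seq)
```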

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fong, Erika J.; Huang, Chao; Hamilton, Julie

    Here, a major advantage of microfluidic devices is the ability to manipulate small sample volumes, thus reducing reagent waste and preserving precious sample. However, to achieve robust sample manipulation it is necessary to address device integration with the macroscale environment. To realize repeatable, sensitive particle separation with microfluidic devices, this protocol presents a complete automated and integrated microfluidic platform that enables precise processing of 0.15–1.5 ml samples using microfluidic devices. Important aspects of this system include modular device layout and robust fixtures resulting in reliable and flexible world-to-chip connections, and fully-automated fluid handling which accomplishes closed-loop sample collection, system cleaning and priming steps to ensure repeatable operation. Different microfluidic devices can be used interchangeably with this architecture. Here we incorporate an acoustofluidic device, detail its characterization, performance optimization, and demonstrate its use for size-separation of biological samples. By using real-time feedback during separation experiments, sample collection is optimized to conserve and concentrate sample. Although requiring the integration of multiple pieces of equipment, advantages of this architecture include the ability to process unknown samples with no additional system optimization, ease of device replacement, and precise, robust sample processing.

  10. Towards a Multifunctional Electrochemical Sensing and Niosome Generation Lab-on-Chip Platform Based on a Plug-and-Play Concept.

    PubMed

    Kara, Adnane; Rouillard, Camille; Mathault, Jessy; Boisvert, Martin; Tessier, Frédéric; Landari, Hamza; Melki, Imene; Laprise-Pelletier, Myriam; Boisselier, Elodie; Fortin, Marc-André; Boilard, Eric; Greener, Jesse; Miled, Amine

    2016-05-28

    In this paper, we present a new modular lab on a chip design for multimodal neurotransmitter (NT) sensing and niosome generation based on a plug-and-play concept. This architecture is a first step toward an automated platform for an automated modulation of neurotransmitter concentration to understand and/or treat neurodegenerative diseases. A modular approach has been adopted in order to handle measurement or drug delivery or both measurement and drug delivery simultaneously. The system is composed of three fully independent modules: three-channel peristaltic micropumping system, a three-channel potentiostat and a multi-unit microfluidic system composed of pseudo-Y and cross-shape channels containing a miniature electrode array. The system was wirelessly controlled by a computer interface. The system is compact, with all the microfluidic and sensing components packaged in a 5 cm × 4 cm × 4 cm box. Applied to serotonin, a linear calibration curve down to 0.125 mM, with a limit of detection of 31 μM, was collected at unfunctionalized electrodes. Added sensitivity and selectivity were achieved by incorporating functionalized electrodes for dopamine sensing. Electrode functionalization was achieved with gold nanoparticles and using DNA and o-phenylene diamine polymer. The as-configured platform is demonstrated as a central component toward an "intelligent" drug delivery system based on a feedback loop to monitor drug delivery.

  11. Towards a Multifunctional Electrochemical Sensing and Niosome Generation Lab-on-Chip Platform Based on a Plug-and-Play Concept

    PubMed Central

    Kara, Adnane; Rouillard, Camille; Mathault, Jessy; Boisvert, Martin; Tessier, Frédéric; Landari, Hamza; Melki, Imene; Laprise-Pelletier, Myriam; Boisselier, Elodie; Fortin, Marc-André; Boilard, Eric; Greener, Jesse; Miled, Amine

    2016-01-01

    In this paper, we present a new modular lab on a chip design for multimodal neurotransmitter (NT) sensing and niosome generation based on a plug-and-play concept. This architecture is a first step toward an automated platform for an automated modulation of neurotransmitter concentration to understand and/or treat neurodegenerative diseases. A modular approach has been adopted in order to handle measurement or drug delivery or both measurement and drug delivery simultaneously. The system is composed of three fully independent modules: three-channel peristaltic micropumping system, a three-channel potentiostat and a multi-unit microfluidic system composed of pseudo-Y and cross-shape channels containing a miniature electrode array. The system was wirelessly controlled by a computer interface. The system is compact, with all the microfluidic and sensing components packaged in a 5 cm × 4 cm × 4 cm box. Applied to serotonin, a linear calibration curve down to 0.125 mM, with a limit of detection of 31 μM, was collected at unfunctionalized electrodes. Added sensitivity and selectivity were achieved by incorporating functionalized electrodes for dopamine sensing. Electrode functionalization was achieved with gold nanoparticles and using DNA and o-phenylene diamine polymer. The as-configured platform is demonstrated as a central component toward an “intelligent” drug delivery system based on a feedback loop to monitor drug delivery. PMID:27240377

  12. Chimeric Antigen Receptor-Redirected T cells return to the bench

    PubMed Central

    Geldres, Claudia; Savoldo, Barbara; Dotti, Gianpietro

    2016-01-01

    While the clinical progress of chimeric antigen receptor T cell (CAR-T) immunotherapy has garnered attention to the field, our understanding of the biology of these chimeric molecules is still emerging. Our aim within this review is to bring to light the mechanistic understanding of these multi-modular receptors and how these individual components confer particular properties to CAR-Ts. In addition, we will discuss extrinsic factors that can be manipulated to influence CAR-T performance such as choice of cellular population, culturing conditions and additional modifications that enhance their activity particularly in solid tumors. Finally, we will also consider the emerging toxicity associated with CAR-Ts. By breaking apart the CAR and examining the role of each piece, we can build a better functioning cellular vehicle for optimized treatment of cancer patients. PMID:26797495

  13. Modular Ligation Extension of Guide RNA Operons (LEGO) for Multiplexed dCas9 Regulation of Metabolic Pathways in Saccharomyces cerevisiae.

    PubMed

    Deaner, Matthew; Holzman, Allison; Alper, Hal S

    2018-04-16

    Metabolic engineering typically utilizes a suboptimal step-wise gene target optimization approach to parse a highly connected and regulated cellular metabolism. While the endonuclease-null CRISPR/Cas system has enabled gene expression perturbations without genetic modification, it has been mostly limited to small sets of gene targets in eukaryotes due to inefficient methods to assemble and express large sgRNA operons. In this work, we develop a TEF1p-tRNA expression system and demonstrate that the use of tRNAs as splicing elements flanking sgRNAs provides higher efficiency than both Pol III and ribozyme-based expression across a variety of single sgRNA and multiplexed contexts. Next, we devise and validate a scheme to allow modular construction of tRNA-sgRNA (TST) operons using an iterative Type IIs digestion/ligation extension approach, termed CRISPR-Ligation Extension of sgRNA Operons (LEGO). This approach enables facile construction of large TST operons. We demonstrate this utility by constructing a metabolic rewiring prototype for 2,3-butanediol production in 2 distinct yeast strain backgrounds. These results demonstrate that our approach can act as a surrogate for traditional genetic modification on a much shorter design-cycle timescale. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. RNA-SeQC: RNA-seq metrics for quality control and process optimization.

    PubMed

    DeLuca, David S; Levin, Joshua Z; Sivachenko, Andrey; Fennell, Timothy; Nazaire, Marc-Danie; Williams, Chris; Reich, Michael; Winckler, Wendy; Getz, Gad

    2012-06-01

    RNA-seq, the application of next-generation sequencing to RNA, provides transcriptome-wide characterization of cellular activity. Assessment of sequencing performance and library quality is critical to the interpretation of RNA-seq data, yet few tools exist to address this issue. We introduce RNA-SeQC, a program which provides key measures of data quality. These metrics include yield, alignment and duplication rates; GC bias, rRNA content, regions of alignment (exon, intron and intragenic), continuity of coverage, 3'/5' bias and count of detectable transcripts, among others. The software provides multi-sample evaluation of library construction protocols, input materials and other experimental parameters. The modularity of the software enables pipeline integration and the routine monitoring of key measures of data quality such as the number of alignable reads, duplication rates and rRNA contamination. RNA-SeQC allows investigators to make informed decisions about sample inclusion in downstream analysis. In summary, RNA-SeQC provides quality control measures critical to experiment design, process optimization and downstream computational analysis. See www.genepattern.org to run online, or www.broadinstitute.org/rna-seqc/ for a command line tool.

  15. Optimization of a novel large field of view distortion phantom for MR-only treatment planning.

    PubMed

    Price, Ryan G; Knight, Robert A; Hwang, Ken-Pin; Bayram, Ersin; Nejad-Davarani, Siamak P; Glide-Hurst, Carri K

    2017-07-01

    MR-only treatment planning requires images of high geometric fidelity, particularly for large fields of view (FOV). However, the availability of large FOV distortion phantoms with analysis software is currently limited. This work sought to optimize a modular distortion phantom to accommodate multiple bore configurations and implement distortion characterization in a widely implementable solution. To determine candidate materials, 1.0 T MR and CT images were acquired of twelve urethane foam samples of various densities and strengths. Samples were precision-machined to accommodate 6 mm diameter paintballs used as landmarks. Final material candidates were selected by balancing strength, machinability, weight, and cost. Bore sizes and minimum aperture width resulting from couch position were tabulated from the literature (14 systems, 5 vendors). Bore geometry and couch position were simulated using MATLAB to generate machine-specific models to optimize the phantom build. Previously developed software for distortion characterization was modified for several magnet geometries (1.0 T, 1.5 T, 3.0 T), compared against previously published 1.0 T results, and integrated into the 3D Slicer application platform. All foam samples provided sufficient MR image contrast with paintball landmarks. Urethane foam (compressive strength ∼1000 psi, density ~20 lb/ft^3) was selected for its accurate machinability and weight characteristics. For smaller bores, a phantom version with the following parameters was used: 15 foam plates, 55 × 55 × 37.5 cm^3 (L×W×H), 5,082 landmarks, and weight ~30 kg. To accommodate > 70 cm wide bores, an extended build used 20 plates spanning 55 × 55 × 50 cm^3 with 7,497 landmarks and weight ~44 kg. Distortion characterization software was implemented as an external module into 3D Slicer's plugin framework and results agreed with the literature. The design and implementation of a modular, extendable distortion phantom was optimized for several bore configurations. The phantom and analysis software will be available for multi-institutional collaborations and cross-validation trials to support MR-only planning. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  16. A two-phase control algorithm for gear-shifting in a novel multi-speed transmission for electric vehicles

    NASA Astrophysics Data System (ADS)

    Roozegar, M.; Angeles, J.

    2018-05-01

    In light of the current low energy-storage capacity of electric batteries, multi-speed transmissions (MSTs) are being considered for applications in electric vehicles (EVs), since MSTs decrease the energy consumption of the EV via gear-shifting. Nonetheless, swiftness and seamlessness are the major concerns in gear-shifting. This study focuses on developing a gear-shifting control scheme for a novel MST designed for EVs. The main advantages of the proposed MST are simplicity and modularity. Firstly, the dynamics model of the transmission is formulated. Then, a two-phase algorithm is proposed for shifting between each two gear ratios, which guarantees a smooth and swift shift. In other words, a separate control set is applied for shifting between each gear pair, which includes two independent PID controllers, tuned using trial-and-error and a genetic algorithm (GA), for the two steps of the algorithm and a switch. A supervisory controller is also employed to choose the proper PID gains, called PID gain-scheduling. Simulation results for various controllers and conditions are reported and compared, indicating that the proposed scheme is highly promising for a desired gear-shifting even in the presence of an unknown external disturbance.
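
    A minimal sketch of the two-phase, gear-pair-specific PID gain scheduling described above; the gain values and the supervisory lookup table are hypothetical placeholders for the trial-and-error / GA-tuned controllers in the paper.

```python
class PID:
    """Basic PID controller used by both phases of the shift algorithm."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hypothetical gain table: one (phase-1, phase-2) gain set per gear pair.
GAIN_TABLE = {
    (1, 2): ((4.0, 0.5, 0.05), (2.5, 0.8, 0.02)),
    (2, 3): ((3.5, 0.4, 0.04), (2.0, 0.7, 0.02)),
}
_controllers = {}

def shift_command(gear_pair, phase, speed_error, dt=1e-3):
    """Supervisory gain scheduling: pick the PID tuned for the requested gear
    pair and shift phase, then compute the actuator command."""
    key = (gear_pair, phase)
    if key not in _controllers:
        _controllers[key] = PID(*GAIN_TABLE[gear_pair][phase])
    return _controllers[key].step(speed_error, dt)

print(shift_command((1, 2), 0, speed_error=12.0))
```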

  17. Multidimensional bioseparation with modular microfluidics

    DOEpatents

    Chirica, Gabriela S.; Renzi, Ronald F.

    2013-08-27

    A multidimensional chemical separation and analysis system is described including a prototyping platform and modular microfluidic components capable of rapid and convenient assembly, alteration and disassembly of numerous candidate separation systems. Partial or total computer control of the separation system is possible. Single or multiple alternative processing trains can be tested, optimized and/or run in parallel. Examples related to the separation and analysis of human bodily fluids are given.

  18. Modularized battery management for large lithium ion cells

    NASA Astrophysics Data System (ADS)

    Stuart, Thomas A.; Zhu, Wei

    A modular electronic battery management system (BMS) is described along with important features for protecting and optimizing the performance of large lithium ion (LiIon) battery packs. Of particular interest is the use of a much improved cell equalization system that can increase or decrease individual cell voltages. Experimental results are included for a pack of six series connected 60 Ah (amp-hour) LiIon cells.

  19. Efficient Machine Learning Approach for Optimizing Scientific Computing Applications on Emerging HPC Architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arumugam, Kamesh

    Efficient parallel implementations of scientific applications on multi-core CPUs with accelerators such as GPUs and Xeon Phis are challenging. This requires exploiting the data parallel architecture of the accelerator along with the vector pipelines of modern x86 CPU architectures, load balancing, and efficient memory transfer between different devices. It is relatively easy to meet these requirements for highly structured scientific applications. In contrast, a number of scientific and engineering applications are unstructured. Getting performance on accelerators for these applications is extremely challenging because many of these applications employ irregular algorithms which exhibit data-dependent control-flow and irregular memory accesses. Furthermore, these applications are often iterative with dependencies between steps, making it hard to parallelize across steps. As a result, parallelism in these applications is often limited to a single step. Numerical simulation of charged particle beam dynamics is one such application where the distribution of work and memory access pattern at each time step is irregular. Applications with these properties tend to present significant branch and memory divergence, load imbalance between different processor cores, and poor compute and memory utilization. Prior research on parallelizing such irregular applications has been focused around optimizing the irregular, data-dependent memory accesses and control-flow during a single step of the application independent of the other steps, with the assumption that these patterns are completely unpredictable. We observed that the structure of computation leading to control-flow divergence and irregular memory accesses in one step is similar to that in the next step. It is possible to predict this structure in the current step by observing the computation structure of previous steps. In this dissertation, we present novel machine learning based optimization techniques to address the parallel implementation challenges of such irregular applications on different HPC architectures. In particular, we use supervised learning to predict the computation structure and use it to address the control-flow and memory access irregularities in the parallel implementation of such applications on GPUs, Xeon Phis, and heterogeneous architectures composed of multi-core CPUs with GPUs or Xeon Phis. We use numerical simulation of charged particle beam dynamics as a motivating example throughout the dissertation to present our new approach, though the techniques should be equally applicable to a wide range of irregular applications. The machine learning approach presented here uses predictive analytics and forecasting techniques to adaptively model and track the irregular memory access pattern at each time step of the simulation to anticipate the future memory access pattern. Access pattern forecasts can then be used to formulate optimization decisions during application execution which improve the performance of the application at a future time step based on the observations from earlier time steps. In heterogeneous architectures, forecasts can also be used to improve the memory performance and resource utilization of all the processing units to deliver a good aggregate performance.
We used these optimization techniques and this anticipation strategy to design a cache-aware, memory-efficient parallel algorithm to address the irregularities in the parallel implementation of charged particle beam dynamics simulation on different HPC architectures. Experimental results using a diverse mix of HPC architectures show that our anticipation strategy is effective in maximizing data reuse, ensuring workload balance, minimizing branch and memory divergence, and improving resource utilization.
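
    The anticipation idea can be sketched in a few lines: track per-bin access counts with an exponential moving average over past time steps and use the forecast to schedule the next step. The toy access model and the fixed smoothing weight are assumptions for illustration, not the dissertation's learned predictor.

```python
import numpy as np

rng = np.random.default_rng(0)
n_bins, n_steps = 16, 8

def observed_accesses(step):
    """Toy stand-in for per-bin memory access counts measured at one time step;
    the hot region drifts slowly, so history is predictive of the next step."""
    centre = 4 + 0.5 * step
    bins = np.arange(n_bins)
    return rng.poisson(200 * np.exp(-0.5 * (bins - centre) ** 2))

forecast = np.zeros(n_bins)
alpha = 0.6                       # weight on the most recent observation
for step in range(n_steps):
    counts = observed_accesses(step)
    # Schedule this step using the forecast made from previous steps,
    # e.g. process the predicted-hot bins first to improve data reuse.
    order = np.argsort(forecast)[::-1]
    hit = np.intersect1d(order[:4], np.argsort(counts)[::-1][:4]).size
    print(f"step {step}: {hit}/4 of the predicted-hot bins were actually hot")
    forecast = alpha * counts + (1 - alpha) * forecast   # update the predictor
```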

  20. Multi-parameter phenotypic profiling: using cellular effects to characterize small-molecule compounds.

    PubMed

    Feng, Yan; Mitchison, Timothy J; Bender, Andreas; Young, Daniel W; Tallarico, John A

    2009-07-01

    Multi-parameter phenotypic profiling of small molecules provides important insights into their mechanisms of action, as well as a systems level understanding of biological pathways and their responses to small molecule treatments. It therefore deserves more attention at an early step in the drug discovery pipeline. Here, we summarize the technologies that are currently in use for phenotypic profiling--including mRNA-, protein- and imaging-based multi-parameter profiling--in the drug discovery context. We think that an earlier integration of phenotypic profiling technologies, combined with effective experimental and in silico target identification approaches, can improve success rates of lead selection and optimization in the drug discovery process.

  1. Model of brain activation predicts the neural collective influence map of the brain

    PubMed Central

    Morone, Flaviano; Roth, Kevin; Min, Byungjoon; Makse, Hernán A.

    2017-01-01

    Efficient complex systems have a modular structure, but modularity does not guarantee robustness, because efficiency also requires an ingenious interplay of the interacting modular components. The human brain is the elemental paradigm of an efficient robust modular system interconnected as a network of networks (NoN). Understanding the emergence of robustness in such modular architectures from the interconnections of its parts is a longstanding challenge that has concerned many scientists. Current models of dependencies in NoN inspired by the power grid express interactions among modules with fragile couplings that amplify even small shocks, thus preventing functionality. Therefore, we introduce a model of NoN to shape the pattern of brain activations to form a modular environment that is robust. The model predicts the map of neural collective influencers (NCIs) in the brain, through the optimization of the influence of the minimal set of essential nodes responsible for broadcasting information to the whole-brain NoN. Our results suggest intervention protocols to control brain activity by targeting influential neural nodes predicted by network theory. PMID:28351973

  2. Modular Construction of Large Non-Immune Human Antibody Phage-Display Libraries from Variable Heavy and Light Chain Gene Cassettes.

    PubMed

    Lee, Nam-Kyung; Bidlingmaier, Scott; Su, Yang; Liu, Bin

    2018-01-01

    Monoclonal antibodies and antibody-derived therapeutics have emerged as a rapidly growing class of biological drugs for the treatment of cancer, autoimmunity, infection, and neurological diseases. To support the development of human antibodies, various display techniques based on antibody gene repertoires have been developed over the last two decades. In particular, scFv-antibody phage display has been extensively utilized to select lead antibodies against a variety of target antigens. To construct an scFv phage-display library that enables efficient antibody discovery and optimization, it is desirable to develop a system that allows modular assembly of highly diverse variable heavy chain and light chain (Vκ and Vλ) repertoires. Here, we describe modular construction of large non-immune human antibody phage-display libraries built on variable gene cassettes from heavy chain and light chain repertoires (Vκ- and Vλ-light can be made into independent cassettes). We describe the utility of such libraries in antibody discovery and optimization through chain shuffling.

  3. Multi-time Scale Coordination of Distributed Energy Resources in Isolated Power Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayhorn, Ebony; Xie, Le; Butler-Purry, Karen

    2016-03-31

    In isolated power systems, including microgrids, distributed assets, such as renewable energy resources (e.g. wind, solar) and energy storage, can be actively coordinated to reduce dependency on fossil fuel generation. The key challenge of such coordination arises from significant uncertainty and variability occurring at small time scales associated with increased penetration of renewables. Specifically, the problem is to ensure economic and efficient utilization of DERs while also meeting operational objectives such as adequate frequency performance. One possible solution is to reduce the time step at which tertiary controls are implemented and to ensure that feedback and look-ahead capability are incorporated to handle variability and uncertainty. However, reducing the time step of tertiary controls necessitates investigating time-scale coupling with primary controls so as not to exacerbate system stability issues. In this paper, an optimal coordination (OC) strategy, which considers multiple time-scales, is proposed for isolated microgrid systems with a mix of DERs. This coordination strategy is based on an online moving horizon optimization approach. The effectiveness of the strategy was evaluated in terms of economics, technical performance, and computation time by varying key parameters that significantly impact performance. The illustrative example with realistic scenarios on a simulated isolated microgrid test system suggests that the proposed approach is generalizable towards designing multi-time scale optimal coordination strategies for isolated power systems.
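    The moving horizon (receding horizon) idea can be sketched with a toy dispatch problem: at each step, plan fossil generation over a short look-ahead window against demand and renewable forecasts, commit only the first decision, then roll the window forward. The sketch below, which uses scipy's linprog, is a deliberately simplified stand-in for the paper's OC formulation; the single-generator model and the parameters (horizon, gen_max, cost) are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import linprog

def moving_horizon_dispatch(demand, renewables, horizon=4, gen_max=5.0, cost=1.0):
    """Toy receding-horizon dispatch: plan fossil generation over the next
    `horizon` steps against forecasts, apply only the first decision, roll forward."""
    T = len(demand)
    schedule = []
    for t in range(T):
        H = min(horizon, T - t)
        net = np.maximum(demand[t:t + H] - renewables[t:t + H], 0.0)  # residual load forecast
        # minimize sum of cost * g_k  s.t.  g_k >= net_k,  0 <= g_k <= gen_max
        res = linprog(c=cost * np.ones(H),
                      A_ub=-np.eye(H), b_ub=-net,
                      bounds=[(0.0, gen_max)] * H, method="highs")
        schedule.append(res.x[0])  # commit only the first step of the plan
    return np.array(schedule)

demand = np.array([3.0, 4.0, 6.0, 5.0, 2.0, 3.5])
solar = np.array([0.5, 1.0, 2.0, 2.5, 1.0, 0.0])
print(moving_horizon_dispatch(demand, solar))
```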

  4. Simultaneous segmentation of the bone and cartilage surfaces of a knee joint in 3D

    NASA Astrophysics Data System (ADS)

    Yin, Y.; Zhang, X.; Anderson, D. D.; Brown, T. D.; Hofwegen, C. Van; Sonka, M.

    2009-02-01

    We present a novel framework for the simultaneous segmentation of multiple interacting surfaces belonging to multiple mutually interacting objects. The method is a non-trivial extension of our previously reported optimal multi-surface segmentation. Considering an example application of knee-cartilage segmentation, the framework consists of the following main steps: 1) Shape model construction: Building a mean shape for each bone of the joint (femur, tibia, patella) from interactively segmented volumetric datasets. Using the resulting mean-shape model, identification of cartilage, non-cartilage, and transition areas on the mean-shape bone model surfaces. 2) Presegmentation: Employment of an iterative optimal surface detection method to achieve approximate segmentation of individual bone surfaces. 3) Cross-object surface mapping: Detection of inter-bone equidistant separating sheets to help identify corresponding vertex pairs for all interacting surfaces. 4) Multi-object, multi-surface graph construction and final segmentation: Construction of a single multi-bone, multi-surface graph so that two surfaces (bone and cartilage) with zero and non-zero intervening distances can be detected for each bone of the joint, according to whether cartilage is locally absent or present on the bone. To define inter-object relationships, corresponding vertex pairs identified using the separating sheets were interlinked in the graph. The graph optimization algorithm acted on the entire multi-object, multi-surface graph to yield a globally optimal solution. The segmentation framework was tested on 16 MR-DESS knee-joint datasets from the Osteoarthritis Initiative database. The average signed surface positioning error for the 6 detected surfaces ranged from 0.00 to 0.12 mm. When independently initialized, the signed reproducibility error of bone and cartilage segmentation ranged from 0.00 to 0.26 mm. The results showed that this framework provides robust, accurate, and reproducible segmentation of the knee joint bone and cartilage surfaces of the femur, tibia, and patella. As a general segmentation tool, the developed framework can be applied to a broad range of multi-object segmentation problems.

  5. A modular optical sensor

    NASA Astrophysics Data System (ADS)

    Conklin, John Albert

    This dissertation presents the design of a modular, fiber-optic sensor and the results obtained from testing the modular sensor. The modular fiber-optic sensor is constructed in such a manner that the sensor diaphragm can be replaced with different configurations to detect numerous physical phenomena. Additionally, different fiber-optic detection systems can be attached to the sensor. Initially, the modular sensor was developed to be used by university students to investigate realistic optical sensors and detection systems to prepare for advanced studies of micro-optical mechanical systems (MOMS). The design accomplishes this by doing two things. First, the design significantly lowers the costs associated with studying optical sensors by modularizing the sensor design. Second, the sensor broadens the number of physical phenomena that students can apply optical sensing techniques to in a fiber optics sensor course. The dissertation is divided into seven chapters covering the historical development of fiber-optic sensors, a theoretical overview of fiber-optic sensors, and the design, fabrication, and testing of the modular sensor developed in the course of this work. Chapter 1 discusses, in detail, how this dissertation is organized and states the purpose of the dissertation. Chapter 2 presents a historical overview of the development of optical fibers, optical pressure sensors, and optical microphones. Chapter 3 reviews the theory of multi-fiber optic detection systems, optical microphones, and pressure sensors. Chapter 4 presents the design details of the modular, optical sensor. Chapter 5 delves into how the modular sensor is fabricated and how the detection systems are constructed. Chapter 6 presents the data collected from the microphone and pressure sensor configurations of the modular sensor. Finally, Chapter 7 discusses the data collected and draws conclusions about the design based on the data collected. Chapter 7 also presents future work needed to expand the functionality and utility of the modular sensor.

  6. Differential evolution-based multi-objective optimization for the definition of a health indicator for fault diagnostics and prognostics

    NASA Astrophysics Data System (ADS)

    Baraldi, P.; Bonfanti, G.; Zio, E.

    2018-03-01

    The identification of the current degradation state of an industrial component and the prediction of its future evolution is a fundamental step for the development of condition-based and predictive maintenance approaches. The objective of the present work is to propose a general method for extracting a health indicator to measure the amount of component degradation from a set of signals measured during operation. The proposed method is based on the combined use of feature extraction techniques, such as Empirical Mode Decomposition and Auto-Associative Kernel Regression, and a multi-objective Binary Differential Evolution (BDE) algorithm for selecting the subset of features optimal for the definition of the health indicator. The objectives of the optimization are desired characteristics of the health indicator, such as monotonicity, trendability and prognosability. A case study is considered, concerning the prediction of the remaining useful life of turbofan engines. The obtained results confirm that the method is capable of extracting health indicators suitable for accurate prognostics.
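    The optimization objectives named above are usually scored with simple statistics of the candidate health indicator. The sketch below shows commonly used definitions of monotonicity, trendability and prognosability from the prognostics literature; they are assumptions for illustration and not necessarily the exact formulations used in the paper.

```python
import numpy as np

def monotonicity(hi):
    """Fraction-of-signed-differences monotonicity in [0, 1]."""
    d = np.diff(hi)
    return abs(np.sum(d > 0) - np.sum(d < 0)) / max(len(d), 1)

def trendability(hi):
    """Absolute correlation between the indicator and time."""
    t = np.arange(len(hi))
    return abs(np.corrcoef(hi, t)[0, 1])

def prognosability(end_values, start_values):
    """Spread of end-of-life values relative to the overall range,
    computed across a fleet of run-to-failure histories."""
    end_values, start_values = np.asarray(end_values), np.asarray(start_values)
    rng = np.mean(np.abs(end_values - start_values))
    return float(np.exp(-np.std(end_values) / (rng + 1e-12)))

hi = np.cumsum(np.random.default_rng(1).normal(0.1, 0.05, size=200))
print(monotonicity(hi), trendability(hi))
print(prognosability(end_values=[0.95, 1.0, 1.05], start_values=[0.0, 0.0, 0.0]))
```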

  7. Automatic segmentation of the liver using multi-planar anatomy and deformable surface model in abdominal contrast-enhanced CT images

    NASA Astrophysics Data System (ADS)

    Jang, Yujin; Hong, Helen; Chung, Jin Wook; Yoon, Young Ho

    2012-02-01

    We propose an effective technique for the extraction of the liver boundary based on multi-planar anatomy and a deformable surface model in abdominal contrast-enhanced CT images. Our method is composed of four main steps. First, for extracting an optimal volume circumscribing the liver, lower and side boundaries are defined by positional information of the pelvis and ribs. The upper boundary is defined by separating the lungs and heart from the CT images. Second, for extracting an initial liver volume, the optimal liver volume is smoothed by anisotropic diffusion filtering and segmented using an adaptively selected threshold value. Third, for removing neighboring organs from the initial liver volume, morphological opening and connected component labeling are applied to multiple planes. Finally, for refining the liver boundaries, a deformable surface model is applied to the posterior liver surface and the left lobe missed in the previous step. Then, a probability summation map is generated by calculating regional information of the segmented liver in the coronal plane, which is used for restoring inaccurate liver boundaries. Experimental results show that our segmentation method can accurately extract liver boundaries without leakage into neighboring organs in spite of various liver shapes and ambiguous boundaries.

  8. Computer Based Porosity Design by Multi Phase Topology Optimization

    NASA Astrophysics Data System (ADS)

    Burblies, Andreas; Busse, Matthias

    2008-02-01

    A numerical simulation technique called Multi Phase Topology Optimization (MPTO) based on the finite element method has been developed and refined by Fraunhofer IFAM during the last five years. MPTO is able to determine the optimum distribution of two or more different materials in components under thermal and mechanical loads. The objective of optimization is to minimize the component's elastic energy. Conventional topology optimization methods which simulate adaptive bone mineralization have the disadvantage that mass changes continuously through the growth processes. MPTO keeps all initial material concentrations and uses methods adapted from molecular dynamics to find the energy minimum. When MPTO is applied to mechanically loaded components with a high number of different material densities, the optimization results show graded and sometimes anisotropic porosity distributions which are very similar to natural bone structures. Now it is possible to design the macro- and microstructure of a mechanical component in one step. Computer-based porosity-design structures can be manufactured by new Rapid Prototyping technologies. Fraunhofer IFAM has successfully applied 3D printing and selective laser sintering methods in order to produce very stiff, lightweight components with graded porosities calculated by MPTO.

  9. A frozen Gaussian approximation-based multi-level particle swarm optimization for seismic inversion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Jinglai, E-mail: jinglaili@sjtu.edu.cn; Lin, Guang, E-mail: lin491@purdue.edu; Computational Sciences and Mathematics Division, Pacific Northwest National Laboratory, Richland, WA 99352

    2015-09-01

    In this paper, we propose a frozen Gaussian approximation (FGA)-based multi-level particle swarm optimization (MLPSO) method for seismic inversion of high-frequency wave data. The method addresses two challenges: First, the optimization problem is highly non-convex, which makes it hard for gradient-based methods to reach global minima. This is tackled by MLPSO, which can escape from undesired local minima. Second, the high-frequency character of seismic waves requires a large number of grid points in direct computational methods, and thus renders the simulation of each sample in MLPSO extremely computationally demanding. We overcome this difficulty in three steps: First, we use FGA to compute high-frequency wave propagation based on asymptotic analysis on the phase plane; then we design a constrained full waveform inversion problem to prevent the optimization search from getting into velocity regions where FGA is not accurate; last, we solve the constrained optimization problem by MLPSO, which employs FGA solvers with different fidelity. The performance of the proposed method is demonstrated by a two-dimensional full-waveform inversion example of the smoothed Marmousi model.
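    Stripped of the frozen Gaussian solver and the multi-level fidelity hierarchy, the optimization core is a particle swarm. The sketch below is a minimal single-level PSO on a standard non-convex test function, included only to fix ideas; it is not the authors' MLPSO.

```python
import numpy as np

def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5,
        bounds=(-5.0, 5.0), seed=0):
    """Minimal particle swarm optimizer for a non-convex objective f."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)   # velocity update
        x = np.clip(x + v, lo, hi)
        val = np.apply_along_axis(f, 1, x)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        g = pbest[np.argmin(pbest_val)].copy()                  # global best
    return g, f(g)

rastrigin = lambda z: 10 * len(z) + np.sum(z**2 - 10 * np.cos(2 * np.pi * z))
print(pso(rastrigin, dim=2))
```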

  10. ANAlyte: A modular image analysis tool for ANA testing with indirect immunofluorescence.

    PubMed

    Di Cataldo, Santa; Tonti, Simone; Bottino, Andrea; Ficarra, Elisa

    2016-05-01

    The automated analysis of indirect immunofluorescence images for Anti-Nuclear Autoantibody (ANA) testing is a fairly recent field that is receiving ever-growing interest from the research community. ANA testing leverages the categorization of the intensity level and fluorescent pattern of IIF images of HEp-2 cells to perform a differential diagnosis of important autoimmune diseases. Nevertheless, it suffers from a tremendous lack of repeatability due to subjectivity in the visual interpretation of the images. The automatization of the analysis is seen as the only valid solution to this problem. Several works in the literature address individual steps of the work-flow; nonetheless, integrating such steps and assessing their effectiveness as a whole is still an open challenge. We present a modular tool, ANAlyte, able to characterize an IIF image in terms of fluorescent intensity level and fluorescent pattern without any user interaction. For this purpose, ANAlyte integrates the following: (i) an Intensity Classifier module, which categorizes the intensity level of the input slide based on multi-scale contrast assessment; (ii) a Cell Segmenter module, which splits the input slide into individual HEp-2 cells; (iii) a Pattern Classifier module, which determines the fluorescent pattern of the slide based on the pattern of the individual cells. To demonstrate the accuracy and robustness of our tool, we experimentally validated ANAlyte on two different public benchmarks of IIF HEp-2 images with a rigorous leave-one-out cross-validation strategy. We obtained overall accuracies for fluorescent intensity and pattern classification of around 85% and above 90%, respectively. We assessed all results by comparisons with some of the most representative state of the art works. Unlike most of the other works in the recent literature, ANAlyte aims at the automatization of all the major steps of ANA image analysis. Results on public benchmarks demonstrate that the tool can characterize HEp-2 slides in terms of intensity and fluorescent pattern with accuracy better than or comparable to state of the art techniques, even when such techniques are run on manually segmented cells. Hence, ANAlyte can be proposed as a valid solution to the problem of ANA testing automatization. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  11. Computational techniques for design optimization of thermal protective systems for the space shuttle vehicle. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    1971-01-01

    A modular program for design optimization of thermal protection systems is discussed. Its capabilities and limitations are reviewed. Instructions for the operation of the program, output, and the program itself are given.

  12. MetAMOS: a modular and open source metagenomic assembly and analysis pipeline

    PubMed Central

    2013-01-01

    We describe MetAMOS, an open source and modular metagenomic assembly and analysis pipeline. MetAMOS represents an important step towards fully automated metagenomic analysis, starting with next-generation sequencing reads and producing genomic scaffolds, open-reading frames and taxonomic or functional annotations. MetAMOS can aid in reducing assembly errors, commonly encountered when assembling metagenomic samples, and improves taxonomic assignment accuracy while also reducing computational cost. MetAMOS can be downloaded from: https://github.com/treangen/MetAMOS. PMID:23320958

  13. Learning in Modular Systems

    DTIC Science & Technology

    2010-05-07

    ... important for deep modular systems is that taking a series of small update steps and stopping before convergence, so-called early stopping, is a form of regularization around the initial parameters of the system. For example, the stochastic gradient descent ... Aside from the overall speed of the classifier, no quantitative performance analysis was given, and the role played by the features in the larger system ...

  14. Modular cryogenic interconnects for multi-qubit devices.

    PubMed

    Colless, J I; Reilly, D J

    2014-11-01

    We have developed a modular interconnect platform for the control and readout of multiple solid-state qubits at cryogenic temperatures. The setup provides 74 filtered dc-bias connections, 32 control and readout connections with -3 dB frequency above 5 GHz, and 4 microwave feed lines that allow low loss (less than 3 dB) transmission up to 10 GHz. The incorporation of a radio-frequency interposer enables the platform to be separated into two printed circuit boards, decoupling the simple board that is bonded to the qubit chip from the multilayer board that incorporates expensive connectors and components. This modular approach lifts the burden of duplicating complex interconnect circuits for every prototype device. We report the performance of this platform at milli-Kelvin temperatures, including signal transmission and crosstalk measurements.

  15. A modular (almost) automatic set-up for elastic multi-tenants cloud (micro)infrastructures

    NASA Astrophysics Data System (ADS)

    Amoroso, A.; Astorino, F.; Bagnasco, S.; Balashov, N. A.; Bianchi, F.; Destefanis, M.; Lusso, S.; Maggiora, M.; Pellegrino, J.; Yan, L.; Yan, T.; Zhang, X.; Zhao, X.

    2017-10-01

    An auto-installing tool on a USB drive allows for quick and easy automatic deployment of OpenNebula-based cloud infrastructures remotely managed by a central VMDIRAC instance. A single team, in the main site of an HEP Collaboration or elsewhere, can manage and run a relatively large network of federated (micro-)cloud infrastructures, making highly dynamic and elastic use of computing resources. Exploiting such an approach can lead to modular systems of cloud-bursting infrastructures addressing complex real-life scenarios.

  16. Modular fuel-cell stack assembly

    DOEpatents

    Patel, Pinakin [Danbury, CT]; Urko, William [West Granby, CT]

    2008-01-29

    A modular multi-stack fuel-cell assembly in which the fuel-cell stacks are situated within a containment structure and in which a gas distributor is provided in the structure and distributes received fuel and oxidant gases to the stacks and receives exhausted fuel and oxidant gas from the stacks so as to realize a desired gas flow distribution and gas pressure differential through the stacks. The gas distributor is centrally and symmetrically arranged relative to the stacks so that it itself promotes realization of the desired gas flow distribution and pressure differential.

  17. Subcommunities and Their Mutual Relationships in a Transaction Network

    NASA Astrophysics Data System (ADS)

    Iino, T.; Iyetomi, H.

    We investigate a Japanese transaction network consisting of about 800 thousand firms (nodes) and four million business relations (links) with a focus on its modular structure. Communities detected by maximizing modularity are often dominated by firms with common features or behaviors in the network, such as those characterized by regions or industry sectors. However, it is well known that the modularity optimization approach has a resolution limit problem, that is, it fails to identify fine communities buried in large communities. To unfold such hidden structures, we apply community detection to each of the subnetworks formed by isolating those communities from the whole network. Subcommunities thus identified are composed of firms from finer regions, more specific sectors, or particular business affiliations. Also, we introduce a new idea of a reduced modularity matrix to measure the strength of relations between (sub)communities.
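    The two-pass procedure described above can be approximated with off-the-shelf tools: detect communities on the whole graph, then rerun community detection inside each detected community. The sketch below uses networkx's greedy modularity maximization on a toy graph; it does not reproduce the authors' reduced modularity matrix, and the min_size threshold is an assumed convenience parameter.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def subcommunities(G, min_size=10):
    """Detect top-level communities, then rerun community detection inside
    each community to expose structure hidden by the resolution limit of
    global modularity optimization."""
    result = []
    for com in greedy_modularity_communities(G):
        sub = G.subgraph(com)
        if len(sub) >= min_size:
            result.append([set(c) for c in greedy_modularity_communities(sub)])
        else:
            result.append([set(com)])
    return result

G = nx.karate_club_graph()
for i, subs in enumerate(subcommunities(G, min_size=8)):
    print(f"community {i}: {len(subs)} subcommunities")
```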

  18. Dynamics of intracellular information decoding.

    PubMed

    Kobayashi, Tetsuya J; Kamimura, Atsushi

    2011-10-01

    A variety of cellular functions are robust even to substantial intrinsic and extrinsic noise in intracellular reactions and the environment that could be strong enough to impair or limit them. In particular, of substantial importance is cellular decision-making in which a cell chooses a fate or behavior on the basis of information conveyed in noisy external signals. For robust decoding, the crucial step is filtering out the noise inevitably added during information transmission. As a minimal and optimal implementation of such an information decoding process, the autocatalytic phosphorylation and autocatalytic dephosphorylation (aPadP) cycle was recently proposed. Here, we analyze the dynamical properties of the aPadP cycle in detail. We describe the dynamical roles of the stationary and short-term responses in determining the efficiency of information decoding and clarify the optimality of the threshold value of the stationary response and its information-theoretical meaning. Furthermore, we investigate the robustness of the aPadP cycle against the receptor inactivation time and intrinsic noise. Finally, we discuss the relationship among information decoding with information-dependent actions, bet-hedging and network modularity.

  19. Post-processing interstitialcy diffusion from molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Bhardwaj, U.; Bukkuru, S.; Warrier, M.

    2016-01-01

    An algorithm to rigorously trace the interstitialcy diffusion trajectory in crystals is developed. The algorithm incorporates unsupervised learning and graph optimization which obviate the need to input extra domain specific information depending on crystal or temperature of the simulation. The algorithm is implemented in a flexible framework as a post-processor to molecular dynamics (MD) simulations. We describe in detail the reduction of interstitialcy diffusion into known computational problems of unsupervised clustering and graph optimization. We also discuss the steps, computational efficiency and key components of the algorithm. Using the algorithm, thermal interstitialcy diffusion from low to near-melting point temperatures is studied. We encapsulate the algorithms in a modular framework with functionality to calculate diffusion coefficients, migration energies and other trajectory properties. The study validates the algorithm by establishing the conformity of output parameters with experimental values and provides detailed insights for the interstitialcy diffusion mechanism. The algorithm along with the help of supporting visualizations and analysis gives convincing details and a new approach to quantifying diffusion jumps, jump-lengths, time between jumps and to identify interstitials from lattice atoms.
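    One of the quantities such a post-processor reports, the diffusion coefficient, can be estimated from a defect trajectory with the Einstein relation MSD(t) ~ 6*D*t in three dimensions. The sketch below fits the MSD slope over lag times with numpy; it uses a synthetic random-walk trajectory as a stand-in for the traced interstitialcy jumps and is not the authors' code.

```python
import numpy as np

def diffusion_coefficient(positions, dt):
    """Estimate D from a 3-D defect trajectory via the Einstein relation:
    fit the slope of mean squared displacement versus lag time, divide by 6."""
    n = len(positions)
    lags = np.arange(1, min(n // 2, 200))
    msd = np.array([np.mean(np.sum((positions[l:] - positions[:-l]) ** 2, axis=1))
                    for l in lags])
    slope = np.polyfit(lags * dt, msd, 1)[0]
    return slope / 6.0

rng = np.random.default_rng(2)
traj = np.cumsum(rng.normal(0.0, 0.1, size=(5000, 3)), axis=0)  # random-walk stand-in for jumps
print(diffusion_coefficient(traj, dt=1e-3))
```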

  20. Post-processing interstitialcy diffusion from molecular dynamics simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhardwaj, U., E-mail: haptork@gmail.com; Bukkuru, S.; Warrier, M.

    2016-01-15

    An algorithm to rigorously trace the interstitialcy diffusion trajectory in crystals is developed. The algorithm incorporates unsupervised learning and graph optimization which obviate the need to input extra domain specific information depending on crystal or temperature of the simulation. The algorithm is implemented in a flexible framework as a post-processor to molecular dynamics (MD) simulations. We describe in detail the reduction of interstitialcy diffusion into known computational problems of unsupervised clustering and graph optimization. We also discuss the steps, computational efficiency and key components of the algorithm. Using the algorithm, thermal interstitialcy diffusion from low to near-melting point temperatures is studied. We encapsulate the algorithms in a modular framework with functionality to calculate diffusion coefficients, migration energies and other trajectory properties. The study validates the algorithm by establishing the conformity of output parameters with experimental values and provides detailed insights for the interstitialcy diffusion mechanism. The algorithm along with the help of supporting visualizations and analysis gives convincing details and a new approach to quantifying diffusion jumps, jump-lengths, time between jumps and to identify interstitials from lattice atoms.

  1. Fast and Efficient Feature Engineering for Multi-Cohort Analysis of EHR Data.

    PubMed

    Ozery-Flato, Michal; Yanover, Chen; Gottlieb, Assaf; Weissbrod, Omer; Parush Shear-Yashuv, Naama; Goldschmidt, Yaara

    2017-01-01

    We present a framework for feature engineering, tailored for longitudinal structured data, such as electronic health records (EHRs). To fast-track feature engineering and extraction, the framework combines general-use plug-in extractors, a multi-cohort management mechanism, and modular memoization. Using this framework, we rapidly extracted thousands of features from diverse and large healthcare data sources in multiple projects.
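    The combination of plug-in extractors and memoization can be sketched in a few lines: extractors register themselves under a name, and repeated requests for the same cohort are served from a cache instead of re-querying the data source. The decorator and names below (extractor, EXTRACTORS, age_at_index) are hypothetical, not the framework's API.

```python
from functools import lru_cache

EXTRACTORS = {}

def extractor(name):
    """Register a plug-in feature extractor under a name, with memoization."""
    def wrap(fn):
        EXTRACTORS[name] = lru_cache(maxsize=None)(fn)
        return EXTRACTORS[name]
    return wrap

@extractor("age_at_index")
def age_at_index(cohort_id):
    # placeholder: a real extractor would query the EHR store for this cohort
    print(f"computing age_at_index for cohort {cohort_id}")
    return {f"patient_{i}": 40 + i for i in range(3)}

# first call computes, second call is served from the memo
print(EXTRACTORS["age_at_index"]("cohort_A"))
print(EXTRACTORS["age_at_index"]("cohort_A"))
```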

  2. Engineering of Data Acquiring Mobile Software and Sustainable End-User Applications

    NASA Technical Reports Server (NTRS)

    Smith, Benton T.

    2013-01-01

    The criteria by which data-acquiring software and its supporting infrastructure should be designed should take the following two points into account: the reusability and organization of stored online and remote data and content, and an assessment of whether abandoning a platform-optimized design in favor of a multi-platform solution significantly reduces the performance of an end-user application. Furthermore, in-house applications that control or process instrument-acquired data for end-users should be designed with a communication and control interface such that the application's modules can be reused as plug-in modular components in larger software systems. The above is applied using two loosely related projects: a mobile application, and a website containing live and simulated data. For the intelligent devices mobile application AIDM, the end-user interface has a platform- and data-type-optimized design, while the database and back-end applications store this information in an organized manner and manage access to that data, granting it only to authorized end-user applications. Finally, the content for the website was derived from a database such that the content can be included and kept uniform across all applications accessing it. With these projects ongoing, I have concluded from my research that the applicable methods presented are feasible for both projects, and that a multi-platform design for the mobile application only marginally drops its performance.

  3. A multi-period capacitated school location problem with modular equipment and closest assignment considerations

    NASA Astrophysics Data System (ADS)

    Delmelle, Eric M.; Thill, Jean-Claude; Peeters, Dominique; Thomas, Isabelle

    2014-07-01

    In rapidly growing urban areas, it is deemed vital to expand (or contract) an existing network of public facilities to meet anticipated changes in the level of demand. We present a multi-period capacitated median model for school network facility location planning that minimizes transportation costs, while functional costs are subject to a budget constraint. The proposed Vintage Flexible Capacitated Location Problem (ViFCLP) has the flexibility to account for a minimum school-age closing requirement, while the maximum capacity of each school can be adjusted by the addition of modular units. Non-closest assignments are controlled by the introduction of a parameter penalizing excess travel. The applicability of the ViFCLP is illustrated on a large US school system (Charlotte-Mecklenburg, North Carolina) where high school demand is expected to grow faster with distance to the city center. Higher school capacities and a greater penalty on the travel-impedance parameter reduce the number of non-closest assignments. The proposed model is beneficial to policy makers seeking to improve the provision and efficiency of public services over a multi-period planning horizon.
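    A heavily reduced illustration of this kind of model is sketched below with PuLP: a single-period assignment of zone demand to schools whose capacity can be expanded by integer modular units, minimizing travel plus module cost. It omits the multi-period dynamics, closure rules, budget constraint and closest-assignment penalty of the ViFCLP, and all data and cost values are made up for illustration.

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpInteger

demand = {"z1": 120, "z2": 80, "z3": 150}           # students per zone
sites = {"s1": 100, "s2": 100}                      # base capacity per school
dist = {("z1", "s1"): 2, ("z1", "s2"): 5, ("z2", "s1"): 4,
        ("z2", "s2"): 1, ("z3", "s1"): 3, ("z3", "s2"): 6}
module_cap, max_modules, module_cost = 50, 3, 10

prob = LpProblem("toy_school_location", LpMinimize)
x = LpVariable.dicts("assign", list(dist.keys()), lowBound=0)      # students sent zone -> school
m = LpVariable.dicts("modules", list(sites.keys()), lowBound=0,
                     upBound=max_modules, cat=LpInteger)           # modular units added

prob += lpSum(dist[k] * x[k] for k in dist) + module_cost * lpSum(m[s] for s in sites)
for z in demand:                                                   # serve all demand
    prob += lpSum(x[(z, s)] for s in sites) == demand[z]
for s in sites:                                                    # base capacity + modules
    prob += lpSum(x[(z, s)] for z in demand) <= sites[s] + module_cap * m[s]
prob.solve()
print({s: int(m[s].value()) for s in sites})
```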

  4. Designing a fully automated multi-bioreactor plant for fast DoE optimization of pharmaceutical protein production.

    PubMed

    Fricke, Jens; Pohlmann, Kristof; Jonescheit, Nils A; Ellert, Andree; Joksch, Burkhard; Luttmann, Reiner

    2013-06-01

    The identification of optimal expression conditions for state-of-the-art production of pharmaceutical proteins is a very time-consuming and expensive process. In this report a method for rapid and reproducible optimization of protein expression in an in-house designed small-scale BIOSTAT® multi-bioreactor plant is described. A newly developed BioPAT® MFCS/win Design of Experiments (DoE) module (Sartorius Stedim Systems, Germany) connects the process control system MFCS/win and the DoE software MODDE® (Umetrics AB, Sweden) and enables therefore the implementation of fully automated optimization procedures. As a proof of concept, a commercial Pichia pastoris strain KM71H has been transformed for the expression of potential malaria vaccines. This approach has allowed a doubling of intact protein secretion productivity due to the DoE optimization procedure compared to initial cultivation results. In a next step, robustness regarding the sensitivity to process parameter variability has been proven around the determined optimum. Thereby, a pharmaceutical production process that is significantly improved within seven 24-hour cultivation cycles was established. Specifically, regarding the regulatory demands pointed out in the process analytical technology (PAT) initiative of the United States Food and Drug Administration (FDA), the combination of a highly instrumented, fully automated multi-bioreactor platform with proper cultivation strategies and extended DoE software solutions opens up promising benefits and opportunities for pharmaceutical protein production. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Rapid automation of a cell-based assay using a modular approach: case study of a flow-based Varicella Zoster Virus infectivity assay.

    PubMed

    Joelsson, Daniel; Gates, Irina V; Pacchione, Diana; Wang, Christopher J; Bennett, Philip S; Zhang, Yuhua; McMackin, Jennifer; Frey, Tina; Brodbeck, Kristin C; Baxter, Heather; Barmat, Scott L; Benetti, Luca; Bodmer, Jean-Luc

    2010-06-01

    Vaccine manufacturing requires constant analytical monitoring to ensure reliable quality and a consistent safety profile of the final product. Concentration and bioactivity of active components of the vaccine are key attributes routinely evaluated throughout the manufacturing cycle and for product release and dosage. In the case of live attenuated virus vaccines, bioactivity is traditionally measured in vitro by infection of susceptible cells with the vaccine followed by quantification of virus replication, cytopathology or expression of viral markers. These assays are typically multi-day procedures that require trained technicians and constant attention. Considering the need for high volumes of testing, automation and streamlining of these assays is highly desirable. In this study, the automation and streamlining of a complex infectivity assay for Varicella Zoster Virus (VZV) containing test articles is presented. The automation procedure was completed using existing liquid handling infrastructure in a modular fashion, limiting custom-designed elements to a minimum to facilitate transposition. In addition, cellular senescence data provided an optimal population doubling range for long term, reliable assay operation at high throughput. The results presented in this study demonstrate a successful automation paradigm resulting in an eightfold increase in throughput while maintaining assay performance characteristics comparable to the original assay. Copyright 2010 Elsevier B.V. All rights reserved.

  6. Film grain synthesis and its application to re-graining

    NASA Astrophysics Data System (ADS)

    Schallauer, Peter; Mörzinger, Roland

    2006-01-01

    Digital film restoration and special effects compositing require more and more automatic procedures for movie re-graining. Missing or inhomogeneous grain decreases perceived quality. For the purpose of grain synthesis, an existing texture synthesis algorithm has been evaluated and optimized. We show that this algorithm can produce synthetic grain which is perceptually similar to a given grain template, which has high spatial and temporal variation and which can be applied to multi-spectral images. Furthermore, a re-graining application framework is proposed, which synthesises artificial grain based on an input grain template and composites it with the original image content. Due to its modular approach, this framework supports manual as well as automatic re-graining applications. Two example applications are presented, one for re-graining an entire movie and one for fully automatic re-graining of image regions produced by restoration algorithms. The low computational cost of the proposed algorithms allows application in industrial-grade software.

  7. Modular assembly of proteins on nanoparticles.

    PubMed

    Ma, Wenwei; Saccardo, Angela; Roccatano, Danilo; Aboagye-Mensah, Dorothy; Alkaseem, Mohammad; Jewkes, Matthew; Di Nezza, Francesca; Baron, Mark; Soloviev, Mikhail; Ferrari, Enrico

    2018-04-16

    Generally, the high diversity of protein properties necessitates the development of unique nanoparticle bio-conjugation methods, optimized for each different protein. Here we describe a universal bio-conjugation approach which makes use of a new recombinant fusion protein combining two distinct domains. The N-terminal part is Glutathione S-Transferase (GST) from Schistosoma japonicum, for which we identify and characterize the remarkable ability to bind gold nanoparticles (GNPs) by forming gold-sulfur bonds (Au-S). The C-terminal part of this multi-domain construct is the SpyCatcher from Streptococcus pyogenes, which provides the ability to capture recombinant proteins encoding a SpyTag. Here we show that SpyCatcher can be immobilized covalently on GNPs through GST without the loss of its full functionality. We then show that GST-SpyCatcher activated particles are able to covalently bind a SpyTag modified protein by simple mixing, through the spontaneous formation of an unusual isopeptide bond.

  8. System architecture for asynchronous multi-processor robotic control system

    NASA Technical Reports Server (NTRS)

    Steele, Robert D.; Long, Mark; Backes, Paul

    1993-01-01

    The architecture for the Modular Telerobot Task Execution System (MOTES) as implemented in the Supervisory Telerobotics (STELER) Laboratory is described. MOTES is the software component of the remote site of a local-remote telerobotic system which is being developed for NASA for space applications, in particular Space Station Freedom applications. The system is being developed to provide control and supervised autonomous control to support both space based operation and ground-remote control with time delay. The local-remote architecture places task planning responsibilities at the local site and task execution responsibilities at the remote site. This separation allows the remote site to be designed to optimize task execution capability within a limited computational environment such as is expected in flight systems. The local site task planning system could be placed on the ground where few computational limitations are expected. MOTES is written in the Ada programming language for a multiprocessor environment.

  9. Significant Scales in Community Structure

    NASA Astrophysics Data System (ADS)

    Traag, V. A.; Krings, G.; van Dooren, P.

    2013-10-01

    Many complex networks show signs of modular structure, uncovered by community detection. Although many methods succeed in revealing various partitions, it remains difficult to detect at what scale some partition is significant. This problem shows foremost in multi-resolution methods. We here introduce an efficient method for scanning for resolutions in one such method. Additionally, we introduce the notion of ``significance'' of a partition, based on subgraph probabilities. Significance is independent of the exact method used, so could also be applied in other methods, and can be interpreted as the gain in encoding a graph by making use of a partition. Using significance, we can determine ``good'' resolution parameters, which we demonstrate on benchmark networks. Moreover, optimizing significance itself also shows excellent performance. We demonstrate our method on voting data from the European Parliament. Our analysis suggests the European Parliament has become increasingly ideologically divided and that nationality plays no role.
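    In practice, a first step toward finding significant scales is simply to scan the resolution parameter and look for ranges over which the recovered partition is stable. The sketch below does such a scan with networkx's Louvain implementation (available in recent networkx releases with a resolution argument); it is a proxy diagnostic, not the significance measure defined in the paper.

```python
import networkx as nx
from networkx.algorithms.community import louvain_communities

def scan_resolutions(G, resolutions, seed=0):
    """Run Louvain at several resolutions; plateaus in the number of
    communities hint at scales where the partition is stable."""
    out = []
    for r in resolutions:
        parts = louvain_communities(G, resolution=r, seed=seed)
        out.append((r, len(parts)))
    return out

G = nx.les_miserables_graph()
for r, k in scan_resolutions(G, [0.2, 0.5, 1.0, 1.5, 2.0, 3.0]):
    print(f"resolution {r:>4}: {k} communities")
```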

  10. A multi-objective optimization approach accurately resolves protein domain architectures

    PubMed Central

    Bernardes, J.S.; Vieira, F.R.J.; Zaverucha, G.; Carbone, A.

    2016-01-01

    Motivation: Given a protein sequence and a number of potential domains matching it, what are the domain content and the most likely domain architecture for the sequence? This problem is of fundamental importance in protein annotation, constituting one of the main steps of all predictive annotation strategies. On the other hand, when potential domains are several and in conflict because of overlapping domain boundaries, finding a solution for the problem might become difficult. An accurate prediction of the domain architecture of a multi-domain protein provides important information for function prediction, comparative genomics and molecular evolution. Results: We developed DAMA (Domain Annotation by a Multi-objective Approach), a novel approach that identifies architectures through a multi-objective optimization algorithm combining scores of domain matches, previously observed multi-domain co-occurrence and domain overlapping. DAMA has been validated on a known benchmark dataset based on CATH structural domain assignments and on the set of Plasmodium falciparum proteins. When compared with existing tools on both datasets, it outperforms all of them. Availability and implementation: DAMA software is implemented in C++ and the source code can be found at http://www.lcqb.upmc.fr/DAMA. Contact: juliana.silva_bernardes@upmc.fr or alessandra.carbone@lip6.fr Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26458889
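    A much-simplified, single-objective version of the underlying combinatorial problem (pick a maximum-score set of mutually non-overlapping domain matches) reduces to weighted interval scheduling and can be solved by dynamic programming, as sketched below. DAMA's actual treatment is multi-objective and also uses co-occurrence evidence and tolerates limited overlaps, none of which is reproduced here.

```python
import bisect

def best_architecture(matches):
    """matches: list of (start, end, score). Returns the maximum total score and
    the corresponding set of mutually non-overlapping matches
    (classic weighted interval scheduling DP)."""
    matches = sorted(matches, key=lambda m: m[1])            # sort by end position
    ends = [m[1] for m in matches]
    best, choice = [0.0] * (len(matches) + 1), [None] * (len(matches) + 1)
    for i, (s, e, w) in enumerate(matches, start=1):
        p = bisect.bisect_left(ends, s, 0, i - 1)            # matches ending before s
        take = best[p] + w
        if take > best[i - 1]:
            best[i], choice[i] = take, (i - 1, p)
        else:
            best[i], choice[i] = best[i - 1], None
    sel, i = [], len(matches)                                # backtrack the choices
    while i > 0:
        if choice[i] is None:
            i -= 1
        else:
            idx, p = choice[i]
            sel.append(matches[idx])
            i = p
    return best[-1], sel[::-1]

doms = [(1, 120, 30.0), (100, 220, 25.0), (130, 300, 40.0), (310, 400, 15.0)]
print(best_architecture(doms))
```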

  11. Toward modular biological models: defining analog modules based on referent physiological mechanisms

    PubMed Central

    2014-01-01

    Background Currently, most biomedical models exist in isolation. It is often difficult to reuse or integrate models or their components, in part because they are not modular. Modular components allow the modeler to think more deeply about the role of the model and to more completely address a modeling project’s requirements. In particular, modularity facilitates component reuse and model integration for models with different use cases, including the ability to exchange modules during or between simulations. The heterogeneous nature of biology and vast range of wet-lab experimental platforms call for modular models designed to satisfy a variety of use cases. We argue that software analogs of biological mechanisms are reasonable candidates for modularization. Biomimetic software mechanisms comprised of physiomimetic mechanism modules offer benefits that are unique or especially important to multi-scale, biomedical modeling and simulation. Results We present a general, scientific method of modularizing mechanisms into reusable software components that we call physiomimetic mechanism modules (PMMs). PMMs utilize parametric containers that partition and expose state information into physiologically meaningful groupings. To demonstrate, we modularize four pharmacodynamic response mechanisms adapted from an in silico liver (ISL). We verified the modularization process by showing that drug clearance results from in silico experiments are identical before and after modularization. The modularized ISL achieves validation targets drawn from propranolol outflow profile data. In addition, an in silico hepatocyte culture (ISHC) is created. The ISHC uses the same PMMs and required no refactoring. The ISHC achieves validation targets drawn from propranolol intrinsic clearance data exhibiting considerable between-lab variability. The data used as validation targets for PMMs originate from both in vitro to in vivo experiments exhibiting large fold differences in time scale. Conclusions This report demonstrates the feasibility of PMMs and their usefulness across multiple model use cases. The pharmacodynamic response module developed here is robust to changes in model context and flexible in its ability to achieve validation targets in the face of considerable experimental uncertainty. Adopting the modularization methods presented here is expected to facilitate model reuse and integration, thereby accelerating the pace of biomedical research. PMID:25123169

  12. Toward modular biological models: defining analog modules based on referent physiological mechanisms.

    PubMed

    Petersen, Brenden K; Ropella, Glen E P; Hunt, C Anthony

    2014-08-16

    Currently, most biomedical models exist in isolation. It is often difficult to reuse or integrate models or their components, in part because they are not modular. Modular components allow the modeler to think more deeply about the role of the model and to more completely address a modeling project's requirements. In particular, modularity facilitates component reuse and model integration for models with different use cases, including the ability to exchange modules during or between simulations. The heterogeneous nature of biology and vast range of wet-lab experimental platforms call for modular models designed to satisfy a variety of use cases. We argue that software analogs of biological mechanisms are reasonable candidates for modularization. Biomimetic software mechanisms comprised of physiomimetic mechanism modules offer benefits that are unique or especially important to multi-scale, biomedical modeling and simulation. We present a general, scientific method of modularizing mechanisms into reusable software components that we call physiomimetic mechanism modules (PMMs). PMMs utilize parametric containers that partition and expose state information into physiologically meaningful groupings. To demonstrate, we modularize four pharmacodynamic response mechanisms adapted from an in silico liver (ISL). We verified the modularization process by showing that drug clearance results from in silico experiments are identical before and after modularization. The modularized ISL achieves validation targets drawn from propranolol outflow profile data. In addition, an in silico hepatocyte culture (ISHC) is created. The ISHC uses the same PMMs and required no refactoring. The ISHC achieves validation targets drawn from propranolol intrinsic clearance data exhibiting considerable between-lab variability. The data used as validation targets for PMMs originate from both in vitro to in vivo experiments exhibiting large fold differences in time scale. This report demonstrates the feasibility of PMMs and their usefulness across multiple model use cases. The pharmacodynamic response module developed here is robust to changes in model context and flexible in its ability to achieve validation targets in the face of considerable experimental uncertainty. Adopting the modularization methods presented here is expected to facilitate model reuse and integration, thereby accelerating the pace of biomedical research.

  13. Development of a Robust and Cost-Effective Friction Stir Welding Process for Use in Advanced Military Vehicles

    NASA Astrophysics Data System (ADS)

    Grujicic, M.; Arakere, G.; Pandurangan, B.; Hariharan, A.; Yen, C.-F.; Cheeseman, B. A.

    2011-02-01

    To respond to the advent of more lethal threats, recently designed aluminum-armor-based military-vehicle systems have resorted to an increasing use of higher-strength aluminum alloys (with superior ballistic resistance against armor-piercing (AP) threats and high vehicle light-weighting potential). Unfortunately, these alloys are not very amenable to conventional fusion-based welding technologies and, in order to obtain high-quality welds, solid-state joining technologies such as friction stir welding (FSW) have to be employed. However, since FSW is a relatively new and fairly complex joining technology, its introduction into advanced military vehicle structures is not straightforward and entails a comprehensive multi-step approach. One such (three-step) approach is developed in the present work. Within the first step, experimental and computational techniques are utilized to determine the optimal tool design and the optimal FSW process parameters which result in maximal productivity of the joining process and the highest quality of the weld. Within the second step, techniques are developed for the identification and qualification of the optimal weld joint designs in different sections of a prototypical military vehicle structure. In the third step, problems associated with the fabrication of a sub-scale military vehicle test structure and the blast survivability of the structure are assessed. The results obtained and the lessons learned are used to judge the potential of the current approach in shortening the development time and in enhancing reliability and blast survivability of military vehicle structures.

  14. Detecting communities using asymptotical surprise

    NASA Astrophysics Data System (ADS)

    Traag, V. A.; Aldecoa, R.; Delvenne, J.-C.

    2015-08-01

    Nodes in real-world networks are repeatedly observed to form dense clusters, often referred to as communities. Methods to detect these groups of nodes usually maximize an objective function, which implicitly contains the definition of a community. We here analyze a recently proposed measure called surprise, which assesses the quality of the partition of a network into communities. In its current form, the formulation of surprise is rather difficult to analyze. We here therefore develop an accurate asymptotic approximation. This allows for the development of an efficient algorithm for optimizing surprise. Incidentally, this leads to a straightforward extension of surprise to weighted graphs. Additionally, the approximation makes it possible to analyze surprise more closely and compare it to other methods, especially modularity. We show that surprise is (nearly) unaffected by the well-known resolution limit, a particular problem for modularity. However, surprise may tend to overestimate the number of communities, whereas they may be underestimated by modularity. In short, surprise works well in the limit of many small communities, whereas modularity works better in the limit of few large communities. In this sense, surprise is more discriminative than modularity and may find communities where modularity fails to discern any structure.
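    One commonly quoted asymptotic form of surprise is S approximately equal to m times the binary Kullback-Leibler divergence between q, the observed fraction of intra-community edges, and <q>, the fraction of intra-community node pairs. The sketch below evaluates this quantity for a given partition; the formula is an assumption stated here for illustration, and an actual optimizer would search over partitions rather than score a fixed one.

```python
import math
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def kl(p, q):
    """Binary Kullback-Leibler divergence D(p || q)."""
    eps = 1e-12
    p, q = min(max(p, eps), 1 - eps), min(max(q, eps), 1 - eps)
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def asymptotic_surprise(G, communities):
    m = G.number_of_edges()
    n = G.number_of_nodes()
    m_int = sum(G.subgraph(c).number_of_edges() for c in communities)
    q = m_int / m                                                    # intra-community edge fraction
    p_int = sum(len(c) * (len(c) - 1) / 2 for c in communities) / (n * (n - 1) / 2)
    return m * kl(q, p_int)

G = nx.karate_club_graph()
parts = greedy_modularity_communities(G)
print(asymptotic_surprise(G, parts))
```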

  15. A General-Purpose Optimization Engine for Multi-Disciplinary Design Applications

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Hopkins, Dale A.; Berke, Laszlo

    1996-01-01

    A general purpose optimization tool for multidisciplinary applications, which in the literature is known as COMETBOARDS, is being developed at NASA Lewis Research Center. The modular organization of COMETBOARDS includes several analyzers and state-of-the-art optimization algorithms along with their cascading strategy. The code structure allows quick integration of new analyzers and optimizers. The COMETBOARDS code reads input information from a number of data files, formulates a design as a set of multidisciplinary nonlinear programming problems, and then solves the resulting problems. COMETBOARDS can be used to solve a large problem which can be defined through multiple disciplines, each of which can be further broken down into several subproblems. Alternatively, a small portion of a large problem can be optimized in an effort to improve an existing system. Some of the other unique features of COMETBOARDS include design variable formulation, constraint formulation, subproblem coupling strategy, global scaling technique, analysis approximation, use of either sequential or parallel computational modes, and so forth. The special features and unique strengths of COMETBOARDS assist convergence and reduce the amount of CPU time used to solve the difficult optimization problems of aerospace industries. COMETBOARDS has been successfully used to solve a number of problems, including structural design of space station components, design of nozzle components of an air-breathing engine, configuration design of subsonic and supersonic aircraft, mixed flow turbofan engines, wave rotor topped engines, and so forth. This paper introduces the COMETBOARDS design tool and its versatility, which is illustrated by citing examples from structures, aircraft design, and air-breathing propulsion engine design.

  16. Constrained Multi-Level Algorithm for Trajectory Optimization

    NASA Astrophysics Data System (ADS)

    Adimurthy, V.; Tandon, S. R.; Jessy, Antony; Kumar, C. Ravi

    The emphasis on low cost access to space inspired many recent developments in the methodology of trajectory optimization. Ref.1 uses a spectral patching method for optimization, where global orthogonal polynomials are used to describe the dynamical constraints. A two-tier approach of optimization is used in Ref.2 for a missile mid-course trajectory optimization. A hybrid analytical/numerical approach is described in Ref.3, where an initial analytical vacuum solution is taken and gradually atmospheric effects are introduced. Ref.4 emphasizes the fact that the nonlinear constraints which occur in the initial and middle portions of the trajectory behave very nonlinearly with respect to the variables, making the optimization very difficult to solve in the direct and indirect shooting methods. The problem is further made complex when different phases of the trajectory have different objectives of optimization and also have different path constraints. Such problems can be effectively addressed by multi-level optimization. In the multi-level methods reported so far, optimization is first done in identified sub-level problems, where some coordination variables are kept fixed for global iteration. After all the sub-optimizations are completed, higher-level optimization iteration with all the coordination and main variables is done. This is followed by further subsystem optimizations with new coordination variables. This process is continued until convergence. In this paper, we use a multi-level constrained optimization algorithm which avoids the repeated local subsystem optimizations and which also removes the problem of non-linear sensitivity inherent in the single-step approaches. Fall-zone constraints, structural load constraints and thermal constraints are considered. In this algorithm, there is only a single multi-level sequence of state and multiplier updates in a framework of an augmented Lagrangian. Han-Tapia multiplier updates are used in view of their special role in diagonalised methods, being the only single update with quadratic convergence. For a single level, the diagonalised multiplier method (DMM) is described in Ref.5. The main advantage of the two-level analogue of the DMM approach is that it avoids the inner loop optimizations required in the other methods. The scheme also introduces a gradient change measure to reduce the computational time needed to calculate the gradients. It is demonstrated that the new multi-level scheme leads to a robust procedure to handle the sensitivity of the constraints, and the multiple objectives of different trajectory phases. Ref. 1. Fahroo, F. and Ross, M., "A Spectral Patching Method for Direct Trajectory Optimization", The Journal of the Astronautical Sciences, Vol. 48, 2000, pp. 269-286. Ref. 2. Phillips, C.A. and Drake, J.C., "Trajectory Optimization for a Missile using a Multitier Approach", Journal of Spacecraft and Rockets, Vol. 37, 2000, pp. 663-669. Ref. 3. Gath, P.F. and Calise, A.J., "Optimization of Launch Vehicle Ascent Trajectories with Path Constraints and Coast Arcs", Journal of Guidance, Control, and Dynamics, Vol. 24, 2001, pp. 296-304. Ref. 4. Betts, J.T., "Survey of Numerical Methods for Trajectory Optimization", Journal of Guidance, Control, and Dynamics, Vol. 21, 1998, pp. 193-207. Ref. 5. Adimurthy, V., "Launch Vehicle Trajectory Optimization", Acta Astronautica, Vol. 15, 1987, pp. 845-850.
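    To fix ideas, the sketch below shows a textbook augmented-Lagrangian loop for a single equality-constrained problem, with repeated inner minimizations and the standard first-order multiplier update. The paper's diagonalised scheme is precisely what replaces these inner loops with a single interleaved sequence of state and Han-Tapia multiplier updates, so this is background illustration only, not the proposed multi-level algorithm.

```python
import numpy as np
from scipy.optimize import minimize

def augmented_lagrangian(f, c, x0, lam0=0.0, rho=10.0, outer_iters=10):
    """Textbook augmented-Lagrangian loop for min f(x) s.t. c(x) = 0:
    minimize L_A = f + lam*c + (rho/2)*c^2, then update lam <- lam + rho*c(x)."""
    x, lam = np.asarray(x0, dtype=float), lam0
    for _ in range(outer_iters):
        L = lambda z: f(z) + lam * c(z) + 0.5 * rho * c(z) ** 2
        x = minimize(L, x, method="BFGS").x   # inner (unconstrained) step
        lam += rho * c(x)                     # first-order multiplier update
    return x, lam

# toy: minimize x0^2 + x1^2 subject to x0 + x1 - 1 = 0  (optimum at (0.5, 0.5))
f = lambda z: z[0] ** 2 + z[1] ** 2
c = lambda z: z[0] + z[1] - 1.0
print(augmented_lagrangian(f, c, x0=[0.0, 0.0]))
```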

  17. Spectral anomaly methods for aerial detection using KUT nuisance rejection

    NASA Astrophysics Data System (ADS)

    Detwiler, R. S.; Pfund, D. M.; Myjak, M. J.; Kulisek, J. A.; Seifert, C. E.

    2015-06-01

    This work discusses the application and optimization of a spectral anomaly method for the real-time detection of gamma radiation sources from an aerial helicopter platform. Aerial detection presents several key challenges over ground-based detection. For one, larger and more rapid background fluctuations are typical due to higher speeds, larger field of view, and geographically induced background changes. As well, the possible large altitude or stand-off distance variations cause significant steps in background count rate as well as spectral changes due to increased gamma-ray scatter with detection at higher altitudes. The work here details the adaptation and optimization of the PNNL-developed algorithm Nuisance-Rejecting Spectral Comparison Ratios for Anomaly Detection (NSCRAD), a spectral anomaly method previously developed for ground-based applications, for an aerial platform. The algorithm has been optimized for two multi-detector systems; a NaI(Tl)-detector-based system and a CsI detector array. The optimization here details the adaptation of the spectral windows for a particular set of target sources to aerial detection and the tailoring for the specific detectors. As well, the methodology and results for background rejection methods optimized for the aerial gamma-ray detection using Potassium, Uranium and Thorium (KUT) nuisance rejection are shown. Results indicate that use of a realistic KUT nuisance rejection may eliminate metric rises due to background magnitude and spectral steps encountered in aerial detection due to altitude changes and geographically induced steps such as at land-water interfaces.
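    The flavor of spectral-comparison anomaly detection can be conveyed with a rough sketch: sum each spectrum into a few energy windows, model the windowed background from recent history, and flag spectra whose deviation from that model, measured by a Mahalanobis-style distance, is large. The code below is a generic stand-in under those assumptions; it is not PNNL's NSCRAD algorithm and performs no KUT nuisance rejection.

```python
import numpy as np

def window_counts(spectrum, edges):
    """Sum a gamma spectrum into a few energy windows given bin-edge indices."""
    return np.array([spectrum[a:b].sum() for a, b in zip(edges[:-1], edges[1:])])

def anomaly_metric(obs, background_windows):
    """Mahalanobis-style distance between observed window counts and the mean
    of recent background spectra, using the background covariance."""
    bg = np.asarray(background_windows, dtype=float)
    mu, cov = bg.mean(axis=0), np.cov(bg, rowvar=False) + 1e-6 * np.eye(bg.shape[1])
    d = obs - mu
    return float(d @ np.linalg.solve(cov, d))

rng = np.random.default_rng(3)
edges = [0, 50, 120, 300, 512]
background = [window_counts(rng.poisson(5.0, size=512), edges) for _ in range(100)]
source = rng.poisson(5.0, size=512)
source[60:80] += rng.poisson(30.0, size=20)   # injected peak in one window
print(anomaly_metric(window_counts(source, edges), background))
```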

  18. Earth As An Unstructured Mesh and Its Recovery from Seismic Waveform Data

    NASA Astrophysics Data System (ADS)

    De Hoop, M. V.

    2015-12-01

    We consider multi-scale representations of Earth's interior from the point of view of their possible recovery from multi- and high-frequency seismic waveform data. These representations are intrinsically connected to (geologic, tectonic) structures, that is, geometric parametrizations of Earth's interior. Indeed, we address the construction and recovery of such parametrizations using local iterative methods with appropriately designed data misfits and guaranteed convergence. The geometric parametrizations contain interior boundaries (defining, for example, faults, salt bodies, tectonic blocks, slabs) which can, in principle, be obtained from successive segmentation. We make use of unstructured meshes. For the adaptation and recovery of an unstructured mesh we introduce an energy functional which is derived from the Hausdorff distance. Via an augmented Lagrangian method, we incorporate the mentioned data misfit. The recovery is constrained by shape optimization of the interior boundaries, and is reminiscent of Hausdorff warping. We use elastic deformation via finite elements as a regularization while following a two-step procedure. The first step is an update determined by the energy functional; in the second step, we modify the outcome of the first step where necessary to ensure that the new mesh is regular. This modification entails an array of techniques including topology correction involving interior boundary contacting and breakup, edge warping and edge removal. We implement this as a feedback mechanism from volume to interior boundary mesh optimization. We invoke and apply a criterion of mesh quality control for coarsening, and for dynamical local multi-scale refinement. We present a novel (fluid-solid) numerical framework based on the Discontinuous Galerkin method.
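    The Hausdorff-distance ingredient of the energy functional can be evaluated directly with scipy for two sampled boundaries, as in the sketch below. This is only the misfit-style distance term; the augmented Lagrangian coupling to the data misfit, the elastic regularization and the mesh-regularity repair steps described above are not reproduced.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def hausdorff_energy(boundary_a, boundary_b):
    """Symmetric Hausdorff distance between two sampled interior boundaries;
    used here as a stand-in for a misfit term an optimizer could reduce."""
    d_ab = directed_hausdorff(boundary_a, boundary_b)[0]
    d_ba = directed_hausdorff(boundary_b, boundary_a)[0]
    return max(d_ab, d_ba)

theta = np.linspace(0.0, 2.0 * np.pi, 200)
circle = np.column_stack([np.cos(theta), np.sin(theta)])
warped = np.column_stack([1.1 * np.cos(theta), 0.9 * np.sin(theta) + 0.05])
print(hausdorff_energy(circle, warped))
```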

  19. Revealing in-block nestedness: Detection and benchmarking

    NASA Astrophysics Data System (ADS)

    Solé-Ribalta, Albert; Tessone, Claudio J.; Mariani, Manuel S.; Borge-Holthoefer, Javier

    2018-06-01

    As new instances of nested organization—beyond ecological networks—are discovered, scholars are debating the coexistence of two apparently incompatible macroscale architectures: nestedness and modularity. The discussion is far from being solved, mainly for two reasons. First, nestedness and modularity appear to emerge from two contradictory dynamics, cooperation and competition. Second, existing methods to assess the presence of nestedness and modularity are flawed when it comes to the evaluation of concurrently nested and modular structures. In this work, we tackle the latter problem, presenting the concept of in-block nestedness, a structural property determining to what extent a network is composed of blocks whose internal connectivity exhibits nestedness. We then put forward a set of optimization methods that allow us to identify such organization successfully, in synthetic and in a large number of real networks. These findings challenge our understanding of the topology of ecological and social systems, calling for new models to explain how such patterns emerge.

  20. A generic template for automated bioanalytical ligand-binding assays using modular robotic scripts in support of discovery biotherapeutic programs.

    PubMed

    Duo, Jia; Dong, Huijin; DeSilva, Binodh; Zhang, Yan J

    2013-07-01

    Sample dilution and reagent pipetting are time-consuming steps in ligand-binding assays (LBAs). Traditional automation-assisted LBAs use assay-specific scripts that require labor-intensive script writing and user training. Five major script modules were developed on Tecan Freedom EVO liquid handling software to facilitate the automated sample preparation and LBA procedure: sample dilution, sample minimum required dilution, standard/QC minimum required dilution, standard/QC/sample addition, and reagent addition. The modular design of automation scripts allowed the users to assemble an automated assay with minimal script modification. The application of the template was demonstrated in three LBAs to support discovery biotherapeutic programs. The results demonstrated that the modular scripts provided flexibility in adapting to various LBA formats and significant time savings in script writing and scientist training. Data generated by the automated process were comparable to those generated by the manual process, while bioanalytical productivity was significantly improved using the modular robotic scripts.

  1. Automatization of hardware configuration for plasma diagnostic system

    NASA Astrophysics Data System (ADS)

    Wojenski, A.; Pozniak, K. T.; Kasprowicz, G.; Kolasinski, P.; Krawczyk, R. D.; Zabolotny, W.; Linczuk, P.; Chernyshova, M.; Czarski, T.; Malinowski, K.

    2016-09-01

    Soft X-ray plasma measurement systems are mostly multi-channel, high-performance systems. In the case of a modular construction, it is necessary to perform sophisticated system discovery in parallel with automatic system configuration. In this paper the structure of the modular system designed for tokamak plasma soft X-ray measurements is described. The concept of system discovery and subsequent automatic configuration is also presented. The FCS application (FMC/FPGA Configuration Software) is used for running sophisticated system setup with automatic verification of proper configuration. In order to provide flexibility for further system configurations (e.g. user setup), a common communication interface is also described. The approach presented here is related to the automatic system firmware building presented in previous papers. Modular construction and multi-channel measurements are key requirements for SXR diagnostics using GEM detectors.

  2. Parallel workflow tools to facilitate human brain MRI post-processing

    PubMed Central

    Cui, Zaixu; Zhao, Chenxi; Gong, Gaolang

    2015-01-01

    Multi-modal magnetic resonance imaging (MRI) techniques are widely applied in human brain studies. To obtain specific brain measures of interest from MRI datasets, a number of complex image post-processing steps are typically required. Parallel workflow tools have recently been developed, concatenating individual processing steps and enabling fully automated processing of raw MRI data to obtain the final results. These workflow tools are also designed to make optimal use of available computational resources and to support the parallel processing of different subjects or of independent processing steps for a single subject. Automated, parallel MRI post-processing tools can greatly facilitate relevant brain investigations and are being increasingly applied. In this review, we briefly summarize these parallel workflow tools and discuss relevant issues. PMID:26029043
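
    As a schematic of how such workflow tools parallelize processing, the sketch below chains placeholder post-processing steps into a per-subject pipeline and runs subjects in parallel with Python's multiprocessing. The step functions and subject names are hypothetical stand-ins, not any particular MRI toolbox.

    ```python
    from multiprocessing import Pool

    # Placeholder post-processing steps; real pipelines would call FSL/FreeSurfer/etc.
    def skull_strip(subject):     return f"{subject}: stripped"
    def register(subject):        return f"{subject}: registered"
    def extract_measure(subject): return f"{subject}: measure=42"

    STEPS = [skull_strip, register, extract_measure]   # ordered steps

    def run_pipeline(subject):
        result = None
        for step in STEPS:                 # steps for one subject run sequentially
            result = step(subject)
        return result

    if __name__ == "__main__":
        subjects = [f"sub-{i:02d}" for i in range(1, 9)]
        with Pool(processes=4) as pool:    # subjects processed in parallel
            print(pool.map(run_pipeline, subjects))
    ```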

  3. Improving Efficiency in Multi-Strange Baryon Reconstruction in d-Au at STAR

    NASA Astrophysics Data System (ADS)

    Leight, William

    2003-10-01

    We report preliminary multi-strange baryon measurements for d-Au collisions recorded at RHIC by the STAR experiment. After using classical topological analysis, in which cuts for each discriminating variable are adjusted by hand, we investigate improvements in signal-to-noise optimization using Linear Discriminant Analysis (LDA). LDA is an algorithm for finding, in the n-dimensional space of the n discriminating variables, the axis on which the signal and noise distributions are most separated. LDA is the first step in moving towards more sophisticated techniques for signal-to-noise optimization, such as Artificial Neural Nets. Due to the relatively low background and sufficiently high yields of d-Au collisions, they form an ideal system to study these possibilities for improving reconstruction methods. Such improvements will be extremely important for forthcoming Au-Au runs in which the size of the combinatoric background is a major problem in reconstruction efforts.
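
    For illustration, a minimal sketch of the LDA step on synthetic data follows: Fisher's linear discriminant finds the single axis in the space of discriminating variables along which the signal and background classes are most separated. The variables, sample sizes, and distributions are made up; this is not the STAR analysis code.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(1)
    # Synthetic "signal" and "background" candidates in 3 discriminating variables
    signal = rng.normal(loc=[1.0, 0.5, 2.0], scale=0.5, size=(500, 3))
    background = rng.normal(loc=[0.0, 0.0, 0.0], scale=0.8, size=(5000, 3))
    X = np.vstack([signal, background])
    y = np.array([1] * len(signal) + [0] * len(background))

    lda = LinearDiscriminantAnalysis()
    lda.fit(X, y)
    projection = X @ lda.coef_.ravel()     # 1-D axis of maximal class separation
    print("axis direction:", lda.coef_.ravel())
    print("mean separation:", projection[y == 1].mean() - projection[y == 0].mean())
    ```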

  4. Data Processing And Machine Learning Methods For Multi-Modal Operator State Classification Systems

    NASA Technical Reports Server (NTRS)

    Hearn, Tristan A.

    2015-01-01

    This document is intended as an introduction to a set of common signal processing learning methods that may be used in the software portion of a functional crew state monitoring system. This includes overviews of both the theory of the methods involved, as well as examples of implementation. Practical considerations are discussed for implementing modular, flexible, and scalable processing and classification software for a multi-modal, multi-channel monitoring system. Example source code is also given for all of the discussed processing and classification methods.

  5. A highly modular beamline electrostatic levitation facility, optimized for in situ high-energy x-ray scattering studies of equilibrium and supercooled liquids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mauro, N.A.; Kelton, K.F.

    2011-10-27

    High-energy x-ray diffraction studies of metallic liquids provide valuable information about structural evolution on the atomic length scale, leading to insights into the origin of the nucleation barrier and the processes of supercooling and glass formation. The containerless processing of the beamline electrostatic levitation (BESL) facility allows coordinated thermophysical and structural studies of equilibrium and supercooled liquids to be made in a contamination-free, high-vacuum (~10^-8 Torr) environment. To date, the incorporation of electrostatic levitation facilities into synchrotron beamlines has been difficult due to the large footprint of the apparatus and the difficulties associated with its transportation and implementation. Here, we describe a modular levitation facility that is optimized for diffraction studies of high-temperature liquids at high-energy synchrotron beamlines. The modular approach used in the apparatus design allows it to be easily transported and quickly set up. Unlike most previous electrostatic levitation facilities, BESL can be operated by a single user instead of a user team.

  6. Knowledge-based modularization and global optimization of artificial neural network models in hydrological forecasting.

    PubMed

    Corzo, Gerald; Solomatine, Dimitri

    2007-05-01

    Natural phenomena are multistationary and are composed of a number of interacting processes, so one single model handling all processes often suffers from inaccuracies. A solution is to partition data in relation to such processes using the available domain knowledge or expert judgment, to train separate models for each of the processes, and to merge them in a modular model (committee). In this paper a problem of water flow forecast in watershed hydrology is considered where the flow process can be presented as consisting of two subprocesses -- base flow and excess flow, so that these two processes can be separated. Several approaches to data separation techniques are studied. Two case studies with different forecast horizons are considered. Parameters of the algorithms responsible for data partitioning are optimized using genetic algorithms and global pattern search. It was found that modularization of ANN models using domain knowledge makes models more accurate, if compared with a global model trained on the whole data set, especially when forecast horizon (and hence the complexity of the modelled processes) is increased.
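
    A minimal sketch of the modular (committee) idea is given below: the training data are split into a low-flow and a high-flow regime, a separate regressor is trained for each, and predictions are routed to the appropriate specialist. The synthetic data, the quantile-based split, and the MLP regressors are illustrative assumptions; the paper optimizes the partitioning parameters with genetic algorithms and global pattern search.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(2)
    rain = rng.gamma(2.0, 2.0, size=2000)                       # synthetic driver
    flow = 0.3 * rain + 2.0 * np.maximum(rain - 6.0, 0.0) + rng.normal(0, 0.2, 2000)
    X, y = rain.reshape(-1, 1), flow

    q = np.quantile(y, 0.7)              # regime split (GA/pattern search in the paper)
    low, high = y < q, y >= q

    base_model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=3000, random_state=0)
    excess_model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=3000, random_state=0)
    base_model.fit(X[low], y[low])       # "base flow" specialist
    excess_model.fit(X[high], y[high])   # "excess flow" specialist

    def committee_predict(x):
        # Route each sample to the specialist for its (predicted) regime.
        base = base_model.predict(x)
        return np.where(base < q, base, excess_model.predict(x))

    print(committee_predict(np.array([[2.0], [12.0]])))
    ```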

  7. What is the role of curvature on the properties of nanomaterials for biomedical applications?

    PubMed Central

    Solveyra, Estefania Gonzalez

    2015-01-01

    The use of nanomaterials for drug delivery and theranostics applications is a promising paradigm in nanomedicine, as it brings together the best features of nanotechnology, molecular biology and medicine. To fully exploit the synergistic potential of such an interdisciplinary strategy, a comprehensive description of the interactions at the interface between nanomaterials and biological systems is not only crucial, but also mandatory. Routine strategies to engineer nanomaterial-based drugs comprise modifying their surface with biocompatible and targeting ligands, in many cases resorting to modular approaches that assume additive behavior. However, emergent behavior can be observed when combining confinement and curvature. The final properties of functionalized nanomaterials become dependent not only on the properties of their constituents but also on the geometry of the nano-bio interface, and on the local molecular environment. Modularity no longer holds, and the coupling between interactions, chemical equilibrium and molecular organization has to be directly addressed in order to design smart nanomaterials with controlled spatial functionalization envisioning optimized biomedical applications. The nanoparticle's curvature becomes an integral part of the design strategy, enabling the chemical and surface properties to be controlled and engineered with molecular precision. Understanding how NP size, morphology, and surface chemistry are interrelated will put us one step closer to engineering nanobiomaterials capable of mimicking biological structures and their behaviors, paving the way toward applications and the possibility of elucidating the use of curvature by biological systems. PMID:26310432

  8. Investigation of structure in the modular light pipe component for LED automotive lamp

    NASA Astrophysics Data System (ADS)

    Chen, Hsi-Chao; Zhou, Yang; Huang, Chien-Sheng; Jhong, Wan-Ling; Cheng, Bo-Wei; Jhang, Jhe-Ming

    2014-09-01

    Light-Emitting Diodes (LEDs) have the advantages of small size, long lifetime, fast response time (μs), low operating voltage, good mechanical properties and environmental friendliness. Furthermore, LEDs can replace halogen lamps, avoiding mercury pollution and saving energy, and so may supersede traditional lamps and become an important light source in the future. The purpose of this study was to investigate the effects of the structure and length of the reflector component of an LED automotive lamp. The novel LED automotive lamp was assembled from several different modular columnar components. The optimized design of the different reflector structures and lengths was simulated with the TracePro software. The design results must meet the vehicle regulations of the United Nations Economic Commission for Europe (UNECE), such as ECE-R19. The light pipe can be designed with a two-step structure, after which the proper structure is selected and LEDs of different power are chosen to meet the luminous intensity required by the vehicle regulation. The simulation results show that the proper structure and length give the best total luminous flux and a high luminous efficiency for the system, and the stray light also meets the ECE R19 vehicle regulation. Finally, the experimental results for the selected structure and length of the light pipe match the simulation results to above 80%.

  9. Susceptibility-weighted imaging using inter-echo-variance channel combination for improved contrast at 7 tesla.

    PubMed

    Hosseini, Zahra; Liu, Junmin; Solovey, Igor; Menon, Ravi S; Drangova, Maria

    2017-04-01

    To implement and optimize a new approach for susceptibility-weighted image (SWI) generation from multi-echo multi-channel image data and compare its performance against optimized traditional SWI pipelines, five healthy volunteers were imaged at 7 Tesla. The inter-echo-variance (IEV) channel combination, which uses the variance of the local frequency shift at multiple echo times as a weighting factor during channel combination, was used to calculate multi-echo local phase shift maps. Linear phase masks were combined with the magnitude to generate IEV-SWI. The performance of the IEV-SWI pipeline was compared with that of two accepted SWI pipelines: channel combination followed by (i) Homodyne filtering (HPH-SWI) and (ii) unwrapping and high-pass filtering (SVD-SWI). The filtering steps of each pipeline were optimized. Contrast-to-noise ratio was used as the comparison metric. Qualitative assessment of artifact and vessel conspicuity was performed and the processing time of the pipelines was evaluated. The optimized IEV-SWI pipeline (σ = 7 mm) resulted in continuous vessel visibility throughout the brain. IEV-SWI had significantly higher contrast compared with HPH-SWI and SVD-SWI (P < 0.001, Friedman nonparametric test). Residual background fields and phase wraps in HPH-SWI and SVD-SWI corrupted the vessel signal and/or generated vessel-mimicking artifact. Optimized implementation of the IEV-SWI pipeline processed a six-echo 16-channel dataset in under 10 min. IEV-SWI benefits from channel-by-channel processing of phase data and results in high contrast images with an optimal balance between contrast and background noise removal, thereby presenting evidence of the importance of the order in which postprocessing techniques are applied for multi-channel SWI generation. J. Magn. Reson. Imaging 2017;45:1113-1124. © 2016 International Society for Magnetic Resonance in Medicine.
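
    To make the weighting idea concrete, the numpy sketch below fits a per-channel local frequency shift across echoes and combines channels with inverse-variance weights derived from the inter-echo fit residuals. The array shapes, the linear phase-versus-echo-time fit, and the inverse-variance weighting are simplifying assumptions for illustration, not the published IEV-SWI pipeline.

    ```python
    import numpy as np

    def iev_combine(phase, echo_times):
        """Combine multi-channel, multi-echo phase into one frequency-shift map.

        phase:      unwrapped phase array, shape (channels, echoes, ny, nx)
        echo_times: 1-D array of echo times in seconds, length = echoes
        """
        # Per-channel local frequency shift: slope of phase vs. echo time.
        te = echo_times[None, :, None, None]
        freq = np.sum(phase * (te - te.mean()), axis=1) \
               / np.sum((echo_times - echo_times.mean()) ** 2)
        # Inter-echo variance of the fit residuals, per channel and voxel.
        fit = freq[:, None] * (te - te.mean()) + phase.mean(axis=1, keepdims=True)
        var = np.var(phase - fit, axis=1) + 1e-12
        w = 1.0 / var                                   # low variance -> high weight
        return np.sum(w * freq, axis=0) / np.sum(w, axis=0)

    phase = np.random.default_rng(3).normal(size=(16, 6, 32, 32))   # toy data
    print(iev_combine(phase, np.linspace(0.005, 0.030, 6)).shape)
    ```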

  10. Multi-objective optimization of solid waste flows: environmentally sustainable strategies for municipalities.

    PubMed

    Minciardi, Riccardo; Paolucci, Massimo; Robba, Michela; Sacile, Roberto

    2008-11-01

    An approach to sustainable municipal solid waste (MSW) management is presented, with the aim of supporting the decision on the optimal flows of solid waste sent to landfill, recycling and different types of treatment plants, whose sizes are also decision variables. This problem is modeled with a non-linear, multi-objective formulation. Specifically, four objectives to be minimized have been taken into account, which are related to economic costs, unrecycled waste, sanitary landfill disposal and environmental impact (incinerator emissions). An interactive reference point procedure has been developed to support decision making; these methods are considered appropriate for multi-objective decision problems in environmental applications. In addition, interactive methods are generally preferred by decision makers as they can be directly involved in the various steps of the decision process. Some results deriving from the application of the proposed procedure are presented. The application of the procedure is exemplified by considering the interaction with two different decision makers who are assumed to be in charge of planning the MSW system in the municipality of Genova (Italy).
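
    A common building block of such interactive reference point procedures is an achievement (augmented Chebyshev) scalarization of the objectives around the decision maker's reference point; a hedged sketch follows. The toy waste-flow variables, objective coefficients, weights, and reference values are invented for illustration and do not reflect the Genova case study.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Toy 2-variable decision: x = (fraction recycled, fraction incinerated),
    # remainder to landfill.  Objectives (to minimize) are illustrative only.
    def objectives(x):
        recycle, incin = x
        landfill = 1.0 - recycle - incin
        cost = 3.0 * recycle + 2.0 * incin + 1.0 * landfill
        unrecycled = 1.0 - recycle
        emissions = 4.0 * incin
        return np.array([cost, unrecycled, landfill, emissions])

    def achievement(x, ref, w, rho=1e-3):
        f = objectives(x)
        # Augmented Chebyshev distance from the decision maker's reference point.
        return np.max(w * (f - ref)) + rho * np.sum(w * (f - ref))

    ref = np.array([1.5, 0.3, 0.2, 0.5])          # aspiration levels from the DM
    w = np.ones(4)
    res = minimize(achievement, x0=[0.3, 0.3], args=(ref, w),
                   bounds=[(0, 1), (0, 1)],
                   constraints=[{"type": "ineq", "fun": lambda x: 1.0 - x[0] - x[1]}])
    print(res.x, objectives(res.x))
    ```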

  11. A Modular Pipelined Processor for High Resolution Gamma-Ray Spectroscopy

    NASA Astrophysics Data System (ADS)

    Veiga, Alejandro; Grunfeld, Christian

    2016-02-01

    The design of a digital signal processor for gamma-ray applications is presented in which a single ADC input can simultaneously provide temporal and energy characterization of gamma radiation for a wide range of applications. Applying pipelining techniques, the processor is able to manage and synchronize very large volumes of streamed real-time data. Its modular user interface provides a flexible environment for experimental design. The processor can fit in a medium-sized FPGA device operating at ADC sampling frequency, providing an efficient solution for multi-channel applications. Two experiments are presented in order to characterize its temporal and energy resolution.

  12. OpenGeoSys-GEMS: Hybrid parallelization of a reactive transport code with MPI and threads

    NASA Astrophysics Data System (ADS)

    Kosakowski, G.; Kulik, D. A.; Shao, H.

    2012-04-01

    OpenGeoSys-GEMS is a general-purpose reactive transport code based on the operator splitting approach. The code couples the Finite-Element groundwater flow and multi-species transport modules of the OpenGeoSys (OGS) project (http://www.ufz.de/index.php?en=18345) with the GEM-Selektor research package to model thermodynamic equilibrium of aquatic (geo)chemical systems utilizing the Gibbs Energy Minimization approach (http://gems.web.psi.ch/). The combination of OGS and the GEM-Selektor kernel (GEMS3K) is highly flexible due to the object-oriented modular code structures and the well-defined (memory-based) data exchange modules. Like other reactive transport codes, the practical applicability of OGS-GEMS is often hampered by the long calculation time and large memory requirements. • For realistic geochemical systems which might include dozens of mineral phases and several (non-ideal) solid solutions the time needed to solve the chemical system with GEMS3K may increase exceptionally. • The codes are coupled in a sequential non-iterative loop. In order to keep the accuracy, the time step size is restricted. In combination with a fine spatial discretization the time step size may become very small, which increases calculation times drastically even for small 1D problems. • The current version of OGS is not optimized for memory use and the MPI version of OGS does not distribute data between nodes. Even for moderately small 2D problems the number of MPI processes that fit into memory of up-to-date workstations or HPC hardware is limited. One strategy to overcome the above mentioned restrictions of OGS-GEMS is to parallelize the coupled code. For OGS a parallelized version already exists. It is based on a domain decomposition method implemented with MPI and provides a parallel solver for fluid and mass transport processes. In the coupled code, after solving fluid flow and solute transport, geochemical calculations are done in the form of a central loop over all finite element nodes with calls to GEMS3K and consecutive calculations of changed material parameters. In a first step the existing MPI implementation was utilized to parallelize this loop. Calculations were split between the MPI processes and afterwards data was synchronized by using MPI communication routines. Furthermore, multi-threaded calculation of the loop was implemented with the help of the boost thread library (http://www.boost.org). This implementation provides a flexible environment to distribute calculations between several threads. For each MPI process at least one and up to several dozens of worker threads are spawned. These threads do not replicate the complete OGS-GEM data structure and use only a limited amount of memory. Calculation of the central geochemical loop is shared between all threads. Synchronization between the threads is done by barrier commands. The overall number of local threads times MPI processes should match the number of available computing nodes. The combination of multi-threading and MPI provides an effective and flexible environment to speed up OGS-GEMS calculations while limiting the required memory use. Test calculations on different hardware show that for certain types of applications tremendous speedups are possible.
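
    The hybrid parallelization of the node loop can be sketched schematically in Python with mpi4py and threads: nodes are partitioned across MPI ranks, each rank's worker threads share its slice and synchronize on a barrier, and the ranks then exchange results. The chemistry call, array sizes, and thread counts are placeholders, not GEMS3K or the OGS implementation.

    ```python
    # Schematic of the hybrid MPI + thread parallelization of the node loop.
    # Run e.g. with: mpirun -n 4 python hybrid_loop.py
    import threading
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    N_NODES, N_THREADS = 1000, 4
    concentrations = np.ones(N_NODES)            # transported species (toy)

    def solve_chemistry(c):                      # stand-in for a GEMS3K call
        return 0.5 * (c + np.sqrt(c))

    my_nodes = np.array_split(np.arange(N_NODES), size)[rank]   # MPI decomposition
    local_result = np.zeros(len(my_nodes))
    barrier = threading.Barrier(N_THREADS)

    def worker(tid):
        chunk = np.array_split(np.arange(len(my_nodes)), N_THREADS)[tid]
        for i in chunk:                          # threads share this rank's nodes
            local_result[i] = solve_chemistry(concentrations[my_nodes[i]])
        barrier.wait()                           # synchronize threads on this rank

    threads = [threading.Thread(target=worker, args=(t,)) for t in range(N_THREADS)]
    for t in threads: t.start()
    for t in threads: t.join()

    gathered = comm.allgather(local_result)      # synchronize between MPI ranks
    if rank == 0:
        print(np.concatenate(gathered).shape)
    ```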

  13. Synthesis of nano-sized lithium cobalt oxide via a sol-gel method

    NASA Astrophysics Data System (ADS)

    Li, Guangfen; Zhang, Jing

    2012-07-01

    In this study, nano-structured LiCoO2 thin films were synthesized by coupling a sol-gel process with a spin-coating method using polyacrylic acid (PAA) as the chelating agent. The optimized conditions for obtaining a better gel formulation and subsequent homogeneous dense films were investigated by varying the calcination temperature, the molar mass of PAA, and the precursor's molar ratios of PAA, lithium, and cobalt ions. The gel films on the silicon substrate surfaces were deposited by a multi-step spin-coating process for either increasing the density of the gel film or adjusting the quantity of PAA in the film. The gel film was calcined by an optimized two-step heating procedure in order to obtain regular nano-structured LiCoO2 materials. Atomic force microscopy (AFM) and scanning electron microscopy (SEM) were utilized to analyze the crystallinity and the morphology of the films, respectively.

  14. Asynchronous Incremental Stochastic Dual Descent Algorithm for Network Resource Allocation

    NASA Astrophysics Data System (ADS)

    Bedi, Amrit Singh; Rajawat, Ketan

    2018-05-01

    Stochastic network optimization problems entail finding resource allocation policies that are optimum on average but must be designed in an online fashion. Such problems are ubiquitous in communication networks, where resources such as energy and bandwidth are divided among nodes to satisfy certain long-term objectives. This paper proposes an asynchronous incremental dual descent resource allocation algorithm that utilizes delayed stochastic gradients for carrying out its updates. The proposed algorithm is well-suited to heterogeneous networks as it allows the computationally-challenged or energy-starved nodes to, at times, postpone the updates. The asymptotic analysis of the proposed algorithm is carried out, establishing dual convergence under both constant and diminishing step sizes. It is also shown that with a constant step size, the proposed resource allocation policy is asymptotically near-optimal. An application involving multi-cell coordinated beamforming is detailed, demonstrating the usefulness of the proposed algorithm.
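
    The delayed-update structure can be illustrated with a toy dual ascent: a single dual variable enforcing an average-power budget is updated with stochastic constraint violations that arrive a few slots late, using a constant step size. The water-filling primal step, budget, delay, and step size are invented for illustration and carry none of the paper's convergence guarantees.

    ```python
    import numpy as np
    from collections import deque

    rng = np.random.default_rng(4)
    BUDGET, STEP, DELAY = 1.0, 0.05, 3        # average power budget, step size, delay

    def allocate(lmbd, channel):
        # Primal step: water-filling-style allocation maximizing log(1+p*h) - lmbd*p.
        return max(1.0 / max(lmbd, 1e-6) - 1.0 / channel, 0.0)

    lmbd = 1.0
    pending = deque()                          # (sub)gradients still "in flight"
    for t in range(2000):
        h = rng.exponential(1.0)               # random channel state
        p = allocate(lmbd, h)
        pending.append(p - BUDGET)             # stochastic constraint violation
        if len(pending) > DELAY:               # update arrives DELAY slots late
            g = pending.popleft()
            lmbd = max(lmbd + STEP * g, 0.0)   # delayed dual ascent, constant step

    print("final dual variable:", round(lmbd, 3))
    ```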

  15. Theoretical and field experimental evaluation of skewed modular slab bridges : [research summary].

    DOT National Transportation Integrated Search

    2012-12-01

    Adjacent precast, prestressed concrete multi-beam bridges have become more : prevalent due to their rapid construction time and cost effectiveness. Over the : years, various adjustments and refinements have been made to the design of : these bridges ...

  16. Nonlinear robust controller design for multi-robot systems with unknown payloads

    NASA Technical Reports Server (NTRS)

    Song, Y. D.; Anderson, J. N.; Homaifar, A.; Lai, H. Y.

    1992-01-01

    This work is concerned with the control problem of a multi-robot system handling a payload with unknown mass properties. Force constraints at the grasp points are considered. Robust control schemes are proposed that cope with the model uncertainty and achieve asymptotic path tracking. To deal with the force constraints, a strategy for optimally sharing the task is suggested. This strategy basically consists of two steps. The first detects the robots that need help and the second arranges that help. It is shown that the overall system is not only robust to uncertain payload parameters, but also satisfies the force constraints.

  17. A Scalable, Parallel Approach for Multi-Point, High-Fidelity Aerostructural Optimization of Aircraft Configurations

    NASA Astrophysics Data System (ADS)

    Kenway, Gaetan K. W.

    This thesis presents new tools and techniques developed to address the challenging problem of high-fidelity aerostructural optimization with respect to large numbers of design variables. A new mesh-movement scheme is developed that is both computationally efficient and sufficiently robust to accommodate large geometric design changes and aerostructural deformations. A fully coupled Newton-Krylov method is presented that accelerates the convergence of aerostructural systems, provides a 20% performance improvement over the traditional nonlinear block Gauss-Seidel approach, and can handle more flexible structures. A coupled adjoint method is used that efficiently computes derivatives for a gradient-based optimization algorithm. The implementation uses only machine-accurate derivative techniques and is verified to yield fully consistent derivatives by comparing against the complex step method. The fully-coupled large-scale coupled adjoint solution method is shown to have 30% better performance than the segregated approach. The parallel scalability of the coupled adjoint technique is demonstrated on an Euler Computational Fluid Dynamics (CFD) model with more than 80 million state variables coupled to a detailed structural finite-element model of the wing with more than 1 million degrees of freedom. Multi-point high-fidelity aerostructural optimizations of a long-range wide-body, transonic transport aircraft configuration are performed using the developed techniques. The aerostructural analysis employs Euler CFD with a 2 million cell mesh and a structural finite element model with 300 000 DOF. Two design optimization problems are solved: one where takeoff gross weight is minimized, and another where fuel burn is minimized. Each optimization uses a multi-point formulation with 5 cruise conditions and 2 maneuver conditions. The optimization problems have 476 design variables, and optimal results are obtained within 36 hours of wall time using 435 processors. The TOGW minimization results in a 4.2% reduction in TOGW with a 6.6% fuel burn reduction, while the fuel burn optimization results in an 11.2% fuel burn reduction with no change to the takeoff gross weight.
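
    The complex-step verification mentioned above is easy to illustrate on a stand-in scalar function: perturbing the input by a tiny imaginary step gives a first derivative free of subtractive cancellation, which serves as a machine-accurate reference for adjoint derivatives. The function below is hypothetical; it is not the aerostructural solver.

    ```python
    import numpy as np

    def f(x):                      # stand-in for an aerostructural output of interest
        return np.sin(x) * np.exp(x**2) / (1.0 + x**2)

    def complex_step(f, x, h=1e-30):
        # Machine-accurate first derivative: no subtractive cancellation.
        return f(x + 1j * h).imag / h

    def finite_difference(f, x, h=1e-6):
        return (f(x + h) - f(x - h)) / (2.0 * h)

    x0 = 0.7
    print("complex step     :", complex_step(f, x0))
    print("finite difference:", finite_difference(f, x0))
    ```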

  18. 48 CFR 15.202 - Advisory multi-step process.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Advisory multi-step... Information 15.202 Advisory multi-step process. (a) The agency may publish a presolicitation notice (see 5.204... participate in the acquisition. This process should not be used for multi-step acquisitions where it would...

  19. Phage-bacteria infection networks: From nestedness to modularity

    NASA Astrophysics Data System (ADS)

    Flores, Cesar O.; Valverde, Sergi; Weitz, Joshua S.

    2013-03-01

    Bacteriophages (viruses that infect bacteria) are the most abundant biological life-forms on Earth. However, very little is known regarding the structure of phage-bacteria infections. In a recent study we re-evaluated 38 prior studies and demonstrated that phage-bacteria infection networks tend to be statistically nested in small-scale communities (Flores et al 2011). Nestedness is consistent with a hierarchy of infection and resistance within phages and bacteria, respectively. However, we predicted that at large scales, phage-bacteria infection networks should be typified by a modular structure. We evaluate and confirm this hypothesis using the most extensive study of phage-bacteria infections (Moebus and Nattkemper 1981). In this study, cross-infections were evaluated between 215 marine phages and 286 marine bacteria. We develop a novel multi-scale network analysis and find that the Moebus and Nattkemper (1981) study is highly modular (at the whole-network scale), yet also exhibits nestedness and modularity at the within-module scale. We examine the role of geography in driving these modular patterns and find evidence that phage-bacteria interactions can exhibit strong similarity despite large distances between sites. CFG acknowledges the support of CONACyT Foundation. JSW holds a Career Award at the Scientific Interface from the Burroughs Wellcome Fund and acknowledges the support of the James S. McDonnell Foundation.

  20. Multicycle rapid thermal annealing optimization of Mg-implanted GaN: Evolution of surface, optical, and structural properties

    NASA Astrophysics Data System (ADS)

    Greenlee, Jordan D.; Feigelson, Boris N.; Anderson, Travis J.; Tadjer, Marko J.; Hite, Jennifer K.; Mastro, Michael A.; Eddy, Charles R.; Hobart, Karl D.; Kub, Francis J.

    2014-08-01

    The first step of a multi-cycle rapid thermal annealing process was systematically studied. The surface, structural, and optical properties of Mg-implanted GaN thin films annealed at temperatures ranging from 900 to 1200 °C were investigated by Raman spectroscopy, photoluminescence, UV-visible spectroscopy, atomic force microscopy, and Nomarski microscopy. The GaN thin films are capped with two layers of in-situ metal organic chemical vapor deposition-grown AlN and annealed in 24 bar of N2 overpressure to avoid GaN decomposition. The crystal quality of the GaN improves with increasing annealing temperature as confirmed by UV-visible spectroscopy and the full widths at half maximum of the E2 and A1 (LO) Raman modes. The crystal quality of films annealed above 1100 °C exceeds the quality of the as-grown films. At 1200 °C, Mg is optically activated, as determined by photoluminescence measurements. However, at 1200 °C, the GaN begins to decompose as evidenced by pit formation on the surface of the samples. Therefore, it was determined that the first step in a multi-cycle rapid thermal anneal process should be conducted at an optimal temperature of 1150 °C due to crystal quality and surface morphology considerations.

  1. Assessing the weighted multi-objective adaptive surrogate model optimization to derive large-scale reservoir operating rules with sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Jingwen; Wang, Xu; Liu, Pan; Lei, Xiaohui; Li, Zejun; Gong, Wei; Duan, Qingyun; Wang, Hao

    2017-01-01

    The optimization of large-scale reservoir system is time-consuming due to its intrinsic characteristics of non-commensurable objectives and high dimensionality. One way to solve the problem is to employ an efficient multi-objective optimization algorithm in the derivation of large-scale reservoir operating rules. In this study, the Weighted Multi-Objective Adaptive Surrogate Model Optimization (WMO-ASMO) algorithm is used. It consists of three steps: (1) simplifying the large-scale reservoir operating rules by the aggregation-decomposition model, (2) identifying the most sensitive parameters through multivariate adaptive regression splines (MARS) for dimensional reduction, and (3) reducing computational cost and speeding the searching process by WMO-ASMO, embedded with weighted non-dominated sorting genetic algorithm II (WNSGAII). The intercomparison of non-dominated sorting genetic algorithm (NSGAII), WNSGAII and WMO-ASMO are conducted in the large-scale reservoir system of Xijiang river basin in China. Results indicate that: (1) WNSGAII surpasses NSGAII in the median of annual power generation, increased by 1.03% (from 523.29 to 528.67 billion kW h), and the median of ecological index, optimized by 3.87% (from 1.879 to 1.809) with 500 simulations, because of the weighted crowding distance and (2) WMO-ASMO outperforms NSGAII and WNSGAII in terms of better solutions (annual power generation (530.032 billion kW h) and ecological index (1.675)) with 1000 simulations and computational time reduced by 25% (from 10 h to 8 h) with 500 simulations. Therefore, the proposed method is proved to be more efficient and could provide better Pareto frontier.

  2. A modular robust control framework for control of movement elicited by multi-electrode intraspinal microstimulation

    NASA Astrophysics Data System (ADS)

    Roshani, Amir; Erfanian, Abbas

    2016-08-01

    Objective. An important issue in restoring motor function through intraspinal microstimulation (ISMS) is the motor control. To provide a physiologically plausible motor control using ISMS, it should be able to control the individual motor unit which is the lowest functional unit of motor control. By focal stimulation only a small group of motor neurons (MNs) within a motor pool can be activated. Different groups of MNs within a motor pool can potentially be activated without involving adjacent motor pools by local stimulation of different parts of a motor pool via microelectrode array implanted into a motor pool. However, since the system has multiple inputs with single output during multi-electrode ISMS, it poses a challenge to movement control. In this paper, we proposed a modular robust control strategy for movement control, whereas multi-electrode array is implanted into each motor activation pool of a muscle. Approach. The controller was based on the combination of proportional-integral-derivative and adaptive fuzzy sliding mode control. The global stability of the controller was guaranteed. Main results. The results of the experiments on rat models showed that the multi-electrode control can provide a more robust control and accurate tracking performance than a single-electrode control. The control output can be pulse amplitude (pulse amplitude modulation, PAM) or pulse width (pulse width modulation, PWM) of the stimulation signal. The results demonstrated that the controller with PAM provided faster convergence rate and better tracking performance than the controller with PWM. Significance. This work represents a promising control approach to the restoring motor functions using ISMS. The proposed controller requires no prior knowledge about the dynamics of the system to be controlled and no offline learning phase. The proposed control design is modular in the sense that each motor pool has an independent controller and each controller is able to control ISMS through an array of microelectrodes.

  3. Identification of immiscible NAPL contaminant sources in aquifers by a modified two-level saturation based imperialist competitive algorithm

    NASA Astrophysics Data System (ADS)

    Ghafouri, H. R.; Mosharaf-Dehkordi, M.; Afzalan, B.

    2017-07-01

    A simulation-optimization model is proposed for identifying the characteristics of local immiscible NAPL contaminant sources inside aquifers. This model employs the UTCHEM 9.0 software as its simulator for solving the governing equations associated with multi-phase flow in porous media. As the optimization model, a novel two-level saturation-based Imperialist Competitive Algorithm (ICA) is proposed to estimate the parameters of contaminant sources. The first level consists of three parallel independent ICAs and acts as a pre-conditioner for the second level, which is a single modified ICA. The ICA in the second level is modified by dividing each country into a number of provinces (smaller parts). Similar to countries in the classical ICA, these provinces are optimized by the assimilation, competition, and revolution steps in the ICA. To increase the diversity of populations, a new approach named the knock-the-base method is proposed. The performance and accuracy of the simulation-optimization model are assessed by solving a set of two- and three-dimensional problems considering the effects of different parameters such as the grid size, rock heterogeneity and designated monitoring networks. The obtained numerical results indicate that using this simulation-optimization model provides accurate results in fewer iterations than the model employing the classical one-level ICA. A model is proposed to identify characteristics of immiscible NAPL contaminant sources. The contaminant is immiscible in water and multi-phase flow is simulated. The model is a multi-level saturation-based optimization algorithm based on ICA. Each answer string in the second level is divided into a set of provinces. Each ICA is modified by incorporating the new knock-the-base approach.

  4. An optimized immunohistochemistry protocol for detecting the guidance cue Netrin-1 in neural tissue.

    PubMed

    Salameh, Samer; Nouel, Dominique; Flores, Cecilia; Hoops, Daniel

    2018-01-01

    Netrin-1, an axon guidance protein, is difficult to detect using immunohistochemistry. We performed a multi-step, blinded, and controlled protocol optimization procedure to establish an efficient and effective fluorescent immunohistochemistry protocol for characterizing Netrin-1 expression. Coronal mouse brain sections were used to test numerous antigen retrieval methods and combinations thereof in order to optimize the stain quality of a commercially available Netrin-1 antibody. Stain quality was evaluated by experienced neuroanatomists for two criteria: signal intensity and signal-to-noise ratio. After five rounds of testing protocol variants, we established a modified immunohistochemistry protocol that produced a Netrin-1 signal with good signal intensity and a high signal-to-noise ratio. The key protocol modifications are as follows: • Use phosphate buffer (PB) as the blocking solution solvent. • Use 1% sodium dodecyl sulfate (SDS) treatment for antigen retrieval. The original protocol was optimized for use with the Netrin-1 antibody produced by Novus Biologicals. However, we subsequently further modified the protocol to work with the antibody produced by Abcam. The Abcam protocol uses PBS as the blocking solution solvent and adds a citrate buffer antigen retrieval step.

  5. Optimization of the Production of Inactivated Clostridium novyi Type B Vaccine Using Computational Intelligence Techniques.

    PubMed

    Aquino, P L M; Fonseca, F S; Mozzer, O D; Giordano, R C; Sousa, R

    2016-07-01

    Clostridium novyi causes necrotic hepatitis in sheep and cattle, as well as gas gangrene. The microorganism is strictly anaerobic, fastidious, and difficult to cultivate in industrial scale. C. novyi type B produces alpha and beta toxins, with the alpha toxin being linked to the presence of specific bacteriophages. The main strategy to combat diseases caused by C. novyi is vaccination, employing vaccines produced with toxoids or with toxoids and bacterins. In order to identify culture medium components and concentrations that maximized cell density and alpha toxin production, a neuro-fuzzy algorithm was applied to predict the yields of the fermentation process for production of C. novyi type B, within a global search procedure using the simulated annealing technique. Maximizing cell density and toxin production is a multi-objective optimization problem and could be treated by a Pareto approach. Nevertheless, the approach chosen here was a step-by-step one. The optimum values obtained with this approach were validated in laboratory scale, and the results were used to reload the data matrix for re-parameterization of the neuro-fuzzy model, which was implemented for a final optimization step with regards to the alpha toxin productivity. With this methodology, a threefold increase of alpha toxin could be achieved.

  6. An Alternative Approach to the Operation of Multinational Reservoir Systems: Application to the Amistad & Falcon System (Lower Rio Grande/Rí-o Bravo)

    NASA Astrophysics Data System (ADS)

    Serrat-Capdevila, A.; Valdes, J. B.

    2005-12-01

    An optimization approach for the operation of international multi-reservoir systems is presented. The approach uses Stochastic Dynamic Programming (SDP) algorithms, both steady-state and real-time, to develop two models. In the first model, the reservoirs and flows of the system are aggregated to yield an equivalent reservoir, and the obtained operating policies are disaggregated using a non-linear optimization procedure for each reservoir and for each nation's water balance. In the second model a multi-reservoir approach is applied, disaggregating the releases for each country's water share in each reservoir. The non-linear disaggregation algorithm uses SDP-derived operating policies as boundary conditions for a local time-step optimization. Finally, the performance of the different approaches and methods is compared. These models are applied to the Amistad-Falcon International Reservoir System as part of a binational dynamic modeling effort to develop a decision support system tool for better management of the water resources in the Lower Rio Grande Basin, which is currently enduring a severe drought.

  7. Simulating an underwater vehicle self-correcting guidance system with Simulink

    NASA Astrophysics Data System (ADS)

    Fan, Hui; Zhang, Yu-Wen; Li, Wen-Zhe

    2008-09-01

    Underwater vehicles have already adopted self-correcting directional guidance algorithms based on multi-beam self-guidance systems, even though research has yet to determine the most effective algorithms. The main challenges facing research on these guidance systems have been effective modeling of the guidance algorithm and a means to analyze the simulation results. A simulation structure based on Simulink that dealt with both issues was proposed. Initially, a mathematical model of relative motion between the vehicle and the target was developed, which was then encapsulated as a subsystem. Next, steps for constructing a model of the self-correcting guidance algorithm based on the Stateflow module were examined in detail. Finally, a 3-D model of the vehicle and target was created in VRML, and by processing the mathematical results, the model was shown moving in a visual environment. This process gives more intuitive results for analyzing the simulation. The results showed that the simulation structure performs well. The simulation program makes heavy use of modularization and encapsulation, so it has broad applicability to simulations of other dynamic systems.

  8. Newmark local time stepping on high-performance computing architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rietmann, Max, E-mail: max.rietmann@erdw.ethz.ch; Institute of Geophysics, ETH Zurich; Grote, Marcus, E-mail: marcus.grote@unibas.ch

    In multi-scale complex media, finite element meshes often require areas of local refinement, creating small elements that can dramatically reduce the global time-step for wave-propagation problems due to the CFL condition. Local time stepping (LTS) algorithms allow an explicit time-stepping scheme to adapt the time-step to the element size, allowing near-optimal time-steps everywhere in the mesh. We develop an efficient multilevel LTS-Newmark scheme and implement it in a widely used continuous finite element seismic wave-propagation package. In particular, we extend the standard LTS formulation with adaptations to continuous finite element methods that can be implemented very efficiently with very strong element-size contrasts (more than 100x). Capable of running on large CPU and GPU clusters, we present both synthetic validation examples and large scale, realistic application examples to demonstrate the performance and applicability of the method and implementation on thousands of CPU cores and hundreds of GPUs.

  9. Optimization of magnetic flux density measurement using multiple RF receiver coils and multi-echo in MREIT.

    PubMed

    Jeong, Woo Chul; Chauhan, Munish; Sajib, Saurav Z K; Kim, Hyung Joong; Serša, Igor; Kwon, Oh In; Woo, Eung Je

    2014-09-07

    Magnetic Resonance Electrical Impedance Tomography (MREIT) is an MRI method that enables mapping of internal conductivity and/or current density via measurements of magnetic flux density signals. MREIT measures only the z-component of the induced magnetic flux density B = (Bx, By, Bz) arising from external current injection. The measured noise of Bz complicates recovery of magnetic flux density maps, resulting in lower quality conductivity and current-density maps. We present a new method for more accurate measurement of the spatial gradient of the magnetic flux density (∇Bz). The method relies on the use of multiple radio-frequency receiver coils and an interleaved multi-echo pulse sequence that acquires multiple sampling points within each repetition time. The noise level of the measured magnetic flux density Bz depends on the decay rate of the signal magnitude, the injection current duration, and the coil sensitivity map. The proposed method uses three key steps. The first step is to determine a representative magnetic flux density gradient from multiple receiver coils by using a weighted combination and by denoising the measured noisy data. The second step is to optimize the magnetic flux density gradient by using multi-echo magnetic flux densities at each pixel in order to reduce the noise level of ∇Bz, and the third step is to remove a random noise component from the recovered ∇Bz by solving an elliptic partial differential equation in a region of interest. Numerical simulation experiments using a cylindrical phantom model with included regions of low MRI signal to noise ('defects') verified the proposed method. Experimental results using a real phantom experiment, which included three different kinds of anomalies, demonstrated that the proposed method reduced the noise level of the measured magnetic flux density. The quality of the recovered conductivity maps using denoised ∇Bz data showed that the proposed method reduced the conductivity noise level up to 3-4 times in each anomaly region in comparison to the conventional method.
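
    The first step (a weighted combination over coils and echoes) can be sketched in numpy as below, where per-coil, per-echo gradient measurements are averaged with weights proportional to squared magnitude (a rough proxy for inverse phase-noise variance). The array shapes and weighting rule are illustrative assumptions; the paper's optimization and PDE-based denoising steps are not reproduced.

    ```python
    import numpy as np

    def combine_bz_gradient(grad_bz, magnitude):
        """Weighted combination of per-coil, per-echo flux-density gradients.

        grad_bz:   array (coils, echoes, 2, ny, nx) of measured d(Bz)/dx, d(Bz)/dy
        magnitude: array (coils, echoes, ny, nx) of image magnitudes
        """
        # Phase-noise variance scales roughly as 1/|magnitude|^2, so weight by |M|^2.
        w = magnitude ** 2
        w = w / (w.sum(axis=(0, 1), keepdims=True) + 1e-12)
        return np.sum(w[:, :, None] * grad_bz, axis=(0, 1))   # -> (2, ny, nx)

    rng = np.random.default_rng(5)
    coils, echoes, ny, nx = 4, 6, 32, 32
    true = np.stack(np.gradient(rng.normal(size=(ny, nx))))            # (2, ny, nx)
    mag = rng.uniform(0.2, 1.0, size=(coils, echoes, ny, nx))
    noisy = true + rng.normal(scale=0.1, size=(coils, echoes, 2, ny, nx)) / mag[:, :, None]
    print(combine_bz_gradient(noisy, mag).shape)
    ```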

  10. Stability of discrete time recurrent neural networks and nonlinear optimization problems.

    PubMed

    Singh, Jayant; Barabanov, Nikita

    2016-02-01

    We consider the method of Reduction of Dissipativity Domain to prove global Lyapunov stability of Discrete Time Recurrent Neural Networks. The standard and advanced criteria for Absolute Stability of these essentially nonlinear systems produce rather weak results. The method mentioned above is proved to be more powerful. It involves a multi-step procedure with maximization of special nonconvex functions over polytopes on every step. We derive conditions which guarantee an existence of at most one point of local maximum for such functions over every hyperplane. This nontrivial result is valid for wide range of neuron transfer functions. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Synthesis of robust water-soluble ZnS:Mn/SiO2 core/shell nanoparticles

    NASA Astrophysics Data System (ADS)

    Sun, Jing; Zhuang, Jiaqi; Guan, Shaowei; Yang, Wensheng

    2008-04-01

    Water-soluble Mn doped ZnS (ZnS:Mn) nanocrystals synthesized by using 3-mercaptopropionic acid (MPA) as stabilizer were homogeneously coated with a dense silica shell through a multi-step procedure. First, 3-mercaptopropyl triethoxy silane (MPS) was used to replace MPA on the particle surface to form a vitreophilic layer for further silica deposition under optimal experimental conditions. Then a two-step silica deposition was performed to form the final water-soluble ZnS:Mn/SiO2 core/shell nanoparticles. The as-prepared core/shell nanoparticles show little change in fluorescence intensity in a wide range of pH value.

  12. A Simple Method to Simultaneously Detect and Identify Spikes from Raw Extracellular Recordings.

    PubMed

    Petrantonakis, Panagiotis C; Poirazi, Panayiota

    2015-01-01

    The ability to track when and which neurons fire in the vicinity of an electrode, in an efficient and reliable manner can revolutionize the neuroscience field. The current bottleneck lies in spike sorting algorithms; existing methods for detecting and discriminating the activity of multiple neurons rely on inefficient, multi-step processing of extracellular recordings. In this work, we show that a single-step processing of raw (unfiltered) extracellular signals is sufficient for both the detection and identification of active neurons, thus greatly simplifying and optimizing the spike sorting approach. The efficiency and reliability of our method is demonstrated in both real and simulated data.
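
    As a toy illustration of single-step processing of a raw trace, the sketch below flags samples whose amplitude deviates from a robust baseline by more than a multiple of the median absolute deviation, with a simple refractory rule. The threshold, refractory period, and synthetic data are assumptions; this is not the authors' detector.

    ```python
    import numpy as np

    def detect_spikes(raw, k=5.0, refractory=30):
        """Detect spike times directly from a raw (unfiltered) trace.

        Threshold = k * robust noise estimate (median absolute deviation).
        """
        baseline = np.median(raw)
        mad = np.median(np.abs(raw - baseline)) / 0.6745
        above = np.where(np.abs(raw - baseline) > k * mad)[0]
        spikes, last = [], -refractory
        for idx in above:                     # enforce a refractory period
            if idx - last >= refractory:
                spikes.append(idx)
                last = idx
        return np.array(spikes)

    rng = np.random.default_rng(6)
    trace = rng.normal(0, 1, 30000)
    true_times = rng.choice(30000, size=40, replace=False)
    trace[true_times] += 12.0                 # injected "spikes"
    print(len(detect_spikes(trace)), "spikes detected")
    ```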

  13. Synthesis of the trisaccharide outer core fragment of Burkholderia cepacia pv. vietnamiensis lipooligosaccharide.

    PubMed

    Bedini, Emiliano; Cirillo, Luigi; Parrilli, Michelangelo

    2012-02-15

    The synthesis of β-Gal-(1→3)-α-GalNAc-(1→3)-β-GalNAc allyl trisaccharide as the outer core fragment of Burkholderia cepacia pv. vietnamiensis lipooligosaccharide was accomplished through a concise, optimized, multi-step synthesis with three glycosylations as key steps, each of which was studied in depth under several reaction conditions. The target trisaccharide was designed with an allyl aglycone in order to allow future conjugation with an immunogenic protein en route to the development of a synthetic neoglycoconjugate vaccine against this Burkholderia pathogen. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. An overview of instrumentation for the Large Binocular Telescope

    NASA Astrophysics Data System (ADS)

    Wagner, R. Mark

    2006-06-01

    An overview of instrumentation for the Large Binocular Telescope is presented. Optical instrumentation includes the Large Binocular Camera (LBC), a pair of wide-field (27' × 27') mosaic CCD imagers at the prime focus, and the Multi-Object Double Spectrograph (MODS), a pair of dual-beam blue-red optimized long-slit spectrographs mounted at the straight-through F/15 Gregorian focus incorporating multiple slit masks for multi-object spectroscopy over a 6' field and spectral resolutions of up to 8000. Infrared instrumentation includes the LBT Near-IR Spectroscopic Utility with Camera and Integral Field Unit for Extragalactic Research (LUCIFER), a modular near-infrared (0.9-2.5 μm) imager and spectrograph pair mounted at a bent interior focal station and designed for seeing-limited (FOV: 4' × 4') imaging, long-slit spectroscopy, and multi-object spectroscopy utilizing cooled slit masks and diffraction limited (FOV: 0'.5 × 0'.5) imaging and long-slit spectroscopy. Strategic instruments under development for the remaining two combined focal stations include an interferometric cryogenic beam combiner with near-infrared and thermal-infrared instruments for Fizeau imaging and nulling interferometry (LBTI) and an optical bench near-infrared beam combiner utilizing multi-conjugate adaptive optics for high angular resolution and sensitivity (LINC-NIRVANA). In addition, a fiber-fed bench spectrograph (PEPSI) capable of ultra high resolution spectroscopy and spectropolarimetry (R = 40,000-300,000) will be available as a principal investigator instrument. The availability of all these instruments mounted simultaneously on the LBT permits unique science, flexible scheduling, and improved operational support.

  15. An overview of instrumentation for the Large Binocular Telescope

    NASA Astrophysics Data System (ADS)

    Wagner, R. Mark

    2008-07-01

    An overview of instrumentation for the Large Binocular Telescope is presented. Optical instrumentation includes the Large Binocular Camera (LBC), a pair of wide-field (27' × 27') mosaic CCD imagers at the prime focus, and the Multi-Object Double Spectrograph (MODS), a pair of dual-beam blue-red optimized long-slit spectrographs mounted at the straight-through F/15 Gregorian focus incorporating multiple slit masks for multi-object spectroscopy over a 6 field and spectral resolutions of up to 8000. Infrared instrumentation includes the LBT Near-IR Spectroscopic Utility with Camera and Integral Field Unit for Extragalactic Research (LUCIFER), a modular near-infrared (0.9-2.5 μm) imager and spectrograph pair mounted at a bent interior focal station and designed for seeing-limited (FOV: 4' × 4') imaging, long-slit spectroscopy, and multi-object spectroscopy utilizing cooled slit masks and diffraction limited (FOV: 0.5' × 0.5') imaging and long-slit spectroscopy. Strategic instruments under development for the remaining two combined focal stations include an interferometric cryogenic beam combiner with near-infrared and thermal-infrared instruments for Fizeau imaging and nulling interferometry (LBTI) and an optical bench near-infrared beam combiner utilizing multi-conjugate adaptive optics for high angular resolution and sensitivity (LINC-NIRVANA). In addition, a fiber-fed bench spectrograph (PEPSI) capable of ultra high resolution spectroscopy and spectropolarimetry (R = 40,000-300,000) will be available as a principal investigator instrument. The availability of all these instruments mounted simultaneously on the LBT permits unique science, flexible scheduling, and improved operational support.

  16. Inertial Motion Capture Costume Design Study

    PubMed Central

    Szczęsna, Agnieszka; Skurowski, Przemysław; Lach, Ewa; Pruszowski, Przemysław; Pęszor, Damian; Paszkuta, Marcin; Słupik, Janusz; Lebek, Kamil; Janiak, Mateusz; Polański, Andrzej; Wojciechowski, Konrad

    2017-01-01

    The paper describes a scalable, wearable multi-sensor system for motion capture based on inertial measurement units (IMUs). Such a unit is composed of accelerometer, gyroscope and magnetometer. The final quality of an obtained motion arises from all the individual parts of the described system. The proposed system is a sequence of the following stages: sensor data acquisition, sensor orientation estimation, system calibration, pose estimation and data visualisation. The construction of the system’s architecture with the dataflow programming paradigm makes it easy to add, remove and replace the data processing steps. The modular architecture of the system allows an effortless introduction of a new sensor orientation estimation algorithms. The original contribution of the paper is the design study of the individual components used in the motion capture system. The two key steps of the system design are explored in this paper: the evaluation of sensors and algorithms for the orientation estimation. The three chosen algorithms have been implemented and investigated as part of the experiment. Due to the fact that the selection of the sensor has a significant impact on the final result, the sensor evaluation process is also explained and tested. The experimental results confirmed that the choice of sensor and orientation estimation algorithm affect the quality of the final results. PMID:28304337
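
    The dataflow idea described above can be sketched as a list of interchangeable stages chained into a pipeline, so that, for example, the orientation estimation algorithm can be swapped without touching acquisition, calibration, or visualisation. The stage functions below are hypothetical placeholders, not the system's actual modules.

    ```python
    from typing import Callable, List

    Stage = Callable[[dict], dict]            # each stage maps a data packet to a packet

    def acquire(packet):        packet["imu"] = [0.01, -0.02, 9.81]; return packet
    def estimate_orientation(packet):
        packet["quat"] = [1.0, 0.0, 0.0, 0.0]  # placeholder estimator (e.g. a Kalman filter)
        return packet
    def calibrate(packet):      packet["calibrated"] = True; return packet
    def estimate_pose(packet):  packet["pose"] = "T-pose"; return packet
    def visualise(packet):      print(packet); return packet

    def run_pipeline(stages: List[Stage], packet: dict) -> dict:
        for stage in stages:                  # dataflow: each stage feeds the next
            packet = stage(packet)
        return packet

    # Swapping the orientation estimator only means replacing one element of the list.
    pipeline = [acquire, estimate_orientation, calibrate, estimate_pose, visualise]
    run_pipeline(pipeline, {"frame": 0})
    ```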

  17. Modular Adder Designs Using Optimal Reversible and Fault Tolerant Gates in Field-Coupled QCA Nanocomputing

    NASA Astrophysics Data System (ADS)

    Bilal, Bisma; Ahmed, Suhaib; Kakkar, Vipan

    2018-02-01

    The challenges which CMOS technology is facing toward the end of the technology roadmap call for an investigation of various logical and technological alternatives to CMOS at the nanoscale. Two such paradigms considered in this paper are reversible logic and the quantum-dot cellular automata (QCA) nanotechnology. First, a new 3 × 3 reversible and universal gate, RG-QCA, is proposed and implemented in QCA technology using conventional 3-input majority voter based logic. The gate is then further optimized by using explicit interaction of cells, and this optimized gate is used to design an optimized modular full adder in QCA. Another configuration of the RG-QCA gate, CRG-QCA, is then proposed, which is a 4 × 4 gate that includes fault tolerant characteristics and a parity preserving nature. The proposed CRG-QCA gate is then tested to design a fault tolerant full adder circuit. Extensive comparisons of the gate and adder circuits are drawn with the existing literature, and it is envisaged that our proposed designs perform better and are cost-efficient in QCA technology.
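
    Since the designs are built from 3-input majority voters, a short brute-force check of the standard majority-gate full-adder identities is given below. This verifies only the Boolean logic; it is not a QCA layout and not the proposed RG-QCA or CRG-QCA gate.

    ```python
    from itertools import product

    def maj(a, b, c):                 # 3-input majority voter, the basic QCA primitive
        return (a & b) | (b & c) | (a & c)

    def full_adder(a, b, cin):
        cout = maj(a, b, cin)
        s = maj(~cout & 1, cin, maj(a, b, ~cin & 1))
        return s, cout

    for a, b, cin in product((0, 1), repeat=3):
        s, cout = full_adder(a, b, cin)
        assert s + 2 * cout == a + b + cin
    print("majority-gate full adder verified for all 8 input combinations")
    ```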

  18. Modular Countermine Payload for Small Robots

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herman Herman; Doug Few; Roelof Versteeg

    2010-04-01

    Payloads for small robotic platforms have historically been designed and implemented as platform and task specific solutions. A consequence of this approach is that payloads cannot be deployed on different robotic platforms without substantial re-engineering efforts. To address this issue, we developed a modular countermine payload that is designed from the ground-up to be platform agnostic. The payload consists of the multi-mission payload controller unit (PCU) coupled with the configurable mission specific threat detection, navigation and marking payloads. The multi-mission PCU has all the common electronics to control and interface to all the payloads. It also contains the embedded processor that can be used to run the navigational and control software. The PCU has a very flexible robot interface which can be configured to interface to various robot platforms. The threat detection payload consists of a two axis sweeping arm and the detector. The navigation payload consists of several perception sensors that are used for terrain mapping, obstacle detection and navigation. Finally, the marking payload consists of a dual-color paint marking system. Through the multi-mission PCU, all these payloads are packaged in a platform agnostic way to allow deployment on multiple robotic platforms, including Talon and Packbot.

  19. [The modular principle in the organization of territorial andrological service].

    PubMed

    Aliev, R T; Koliado, V B; Neĭmark, A I; Burdeĭn, A V

    2010-01-01

    To establish an andrological service across the territory of the Russian Federation, it is proposed to use the modular principle, organizing a network of primary and consolidated modules in the regions of the Altai Krai and in the Coordinating Federal Center. The project is to be realized step by step. At the first stage, a network of men's consultations is created, with staffing, material and technical provision, a set of premises, auxiliary services, and delivery of a certain amount of care under the interregional urological departments. At the second stage, it is planned to amalgamate the men's consultations into a single telemedicine network and to construct the buildings of the Consultative Andrological Center (CAC) and of a sanatorium. At the third stage, it is planned to complete the equipping of the CAC and the sanatorium in the town of Belokurikha.

  20. Development and comparison of Bayesian modularization method in uncertainty assessment of hydrological models

    NASA Astrophysics Data System (ADS)

    Li, L.; Xu, C.-Y.; Engeland, K.

    2012-04-01

    With respect to model calibration, parameter estimation and analysis of uncertainty sources, different approaches have been used in hydrological models. The Bayesian method is one of the most widely used approaches for uncertainty assessment of hydrological models; it incorporates different sources of information into a single analysis through Bayes' theorem. However, none of these applications treats the uncertainty in the extreme flows of hydrological simulations well. This study proposes a Bayesian modularization approach for uncertainty assessment of conceptual hydrological models that explicitly considers the extreme flows. It includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization approach and traditional Bayesian methods using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions are used in combination with the traditional Bayesian approach: the AR(1) plus Normal, time-period-independent model (Model 1); the AR(1) plus Normal, time-period-dependent model (Model 2); and the AR(1) plus multi-normal model (Model 3). The results reveal that (1) the simulations derived from the Bayesian modularization method are more accurate, with the highest Nash-Sutcliffe efficiency value, and (2) the Bayesian modularization method performs best in uncertainty estimation over the entire flow range as well as in applicability and computational efficiency. The study thus introduces a new approach for reducing the effect of extreme flows on the discharge uncertainty assessment of hydrological models within a Bayesian framework. Keywords: extreme flow, uncertainty assessment, Bayesian modularization, hydrological model, WASMOD
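
    As a point of reference for the sampling machinery mentioned above, here is a minimal random-walk Metropolis-Hastings loop in Python. The Gaussian toy log-posterior and tuning constants are placeholders, not the WASMOD likelihoods from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def log_posterior(theta):
        # placeholder: a standard normal log-density standing in for a real hydrological likelihood
        return -0.5 * np.sum(theta ** 2)

    def metropolis_hastings(log_post, theta0, n_iter=5000, step=0.5):
        theta = np.asarray(theta0, dtype=float)
        lp = log_post(theta)
        chain = np.empty((n_iter, theta.size))
        accepted = 0
        for i in range(n_iter):
            proposal = theta + step * rng.standard_normal(theta.size)   # symmetric proposal
            lp_prop = log_post(proposal)
            if np.log(rng.uniform()) < lp_prop - lp:                    # MH acceptance rule
                theta, lp = proposal, lp_prop
                accepted += 1
            chain[i] = theta
        return chain, accepted / n_iter

    chain, acc_rate = metropolis_hastings(log_posterior, theta0=[0.0, 0.0])
    print("acceptance rate:", round(acc_rate, 2))
    ```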

  1. Automated Grading of Gliomas using Deep Learning in Digital Pathology Images: A modular approach with ensemble of convolutional neural networks.

    PubMed

    Ertosun, Mehmet Günhan; Rubin, Daniel L

    2015-01-01

    Brain glioma is the most common primary malignant brain tumor in adults, with different pathologic subtypes: Lower Grade Glioma (LGG) Grade II, Lower Grade Glioma (LGG) Grade III, and Glioblastoma Multiforme (GBM) Grade IV. Survival and treatment options are highly dependent on the glioma grade. We propose a deep learning-based, modular classification pipeline for automated grading of gliomas using digital pathology images. Whole-tissue digitized images of pathology slides obtained from The Cancer Genome Atlas (TCGA) were used to train our deep learning modules. Our modular pipeline provides diagnostic quality statistics, such as precision, sensitivity and specificity, of the individual deep learning modules, and (1) facilitates training given the limited data in this domain, (2) enables exploration of different deep learning structures for each module, (3) leads to less complex modules that are simpler to analyze, and (4) provides flexibility, permitting use of single modules within the framework or use of other modeling or machine learning applications, such as probabilistic graphical models or support vector machines. Our modular approach helps us meet the requirements of minimum accuracy levels that are demanded by the context of different decision points within a multi-class classification scheme. Convolutional neural networks were trained for each module and sub-task, reaching more than 90% classification accuracy on the validation data set, and achieved a classification accuracy of 96% for the task of GBM vs. LGG classification and 71% for further identifying the grade of LGG as Grade II or Grade III on an independent data set of new patients from the multi-institutional repository.
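
    To make the modular routing concrete, the sketch below wires two small convolutional networks into a two-stage decision: module 1 separates GBM from LGG, and module 2 is consulted only when module 1 predicts LGG, to distinguish Grade II from Grade III. It is a hedged PyTorch illustration of the pipeline structure; the layer sizes and the absence of training are assumptions, not the architectures used in the paper.

    ```python
    import torch
    import torch.nn as nn

    class SmallCNN(nn.Module):
        """A deliberately tiny binary classifier standing in for one pipeline module."""
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.classifier = nn.Linear(32, 2)

        def forward(self, x):
            return self.classifier(self.features(x).flatten(1))

    class ModularGrader:
        """Stage 1: GBM vs LGG.  Stage 2 (only for LGG): Grade II vs Grade III."""
        def __init__(self):
            self.gbm_vs_lgg = SmallCNN()
            self.lgg_grade = SmallCNN()

        @torch.no_grad()
        def predict(self, tile):                       # tile: (1, 3, H, W) tensor
            if self.gbm_vs_lgg(tile).argmax(1).item() == 0:
                return "GBM (Grade IV)"
            grade = self.lgg_grade(tile).argmax(1).item()
            return "LGG Grade II" if grade == 0 else "LGG Grade III"

    grader = ModularGrader()                            # untrained weights, structure only
    print(grader.predict(torch.randn(1, 3, 128, 128)))
    ```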

  2. Automated Grading of Gliomas using Deep Learning in Digital Pathology Images: A modular approach with ensemble of convolutional neural networks

    PubMed Central

    Ertosun, Mehmet Günhan; Rubin, Daniel L.

    2015-01-01

    Brain glioma is the most common primary malignant brain tumor in adults, with different pathologic subtypes: Lower Grade Glioma (LGG) Grade II, Lower Grade Glioma (LGG) Grade III, and Glioblastoma Multiforme (GBM) Grade IV. Survival and treatment options are highly dependent on the glioma grade. We propose a deep learning-based, modular classification pipeline for automated grading of gliomas using digital pathology images. Whole-tissue digitized images of pathology slides obtained from The Cancer Genome Atlas (TCGA) were used to train our deep learning modules. Our modular pipeline provides diagnostic quality statistics, such as precision, sensitivity and specificity, of the individual deep learning modules, and (1) facilitates training given the limited data in this domain, (2) enables exploration of different deep learning structures for each module, (3) leads to less complex modules that are simpler to analyze, and (4) provides flexibility, permitting use of single modules within the framework or use of other modeling or machine learning applications, such as probabilistic graphical models or support vector machines. Our modular approach helps us meet the requirements of minimum accuracy levels that are demanded by the context of different decision points within a multi-class classification scheme. Convolutional neural networks were trained for each module and sub-task, reaching more than 90% classification accuracy on the validation data set, and achieved a classification accuracy of 96% for the task of GBM vs. LGG classification and 71% for further identifying the grade of LGG as Grade II or Grade III on an independent data set of new patients from the multi-institutional repository. PMID:26958289

  3. SMARBot: a modular miniature mobile robot platform

    NASA Astrophysics Data System (ADS)

    Meng, Yan; Johnson, Kerry; Simms, Brian; Conforth, Matthew

    2008-04-01

    Miniature robots have many advantages over their larger counterparts, such as low cost, low power, and the ease of building a large-scale team for complex tasks. Heterogeneous teams of miniature robots could provide powerful situation awareness capability owing to their different locomotion capabilities and sensor information. However, it would be expensive and time consuming to develop a specific embedded system for each type of robot. In this paper, we propose a generic modular embedded system architecture called SMARbot (Stevens Modular Autonomous Robot), which consists of a set of hardware and software modules that can be configured to construct various types of robot systems. These modules include a high-performance microprocessor, a reconfigurable hardware component, wireless communication, and diverse sensor and actuator interfaces. The design of all the modules in the electrical subsystem, the selection criteria for module components, and the real-time operating system are described. Some proof-of-concept experimental results are also presented.
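
    The configurable-module idea can be expressed as a small registry that assembles a robot from named hardware/software modules. This is a generic Python sketch of the pattern; the module names and robot configurations are invented for illustration rather than taken from SMARbot.

    ```python
    MODULE_REGISTRY = {}

    def register(name):
        """Decorator that adds a module factory to the shared registry."""
        def wrap(factory):
            MODULE_REGISTRY[name] = factory
            return factory
        return wrap

    @register("wifi_comm")
    def make_wifi(cfg):
        return {"kind": "comm", "channel": cfg.get("channel", 6)}

    @register("ir_rangefinder")
    def make_ir(cfg):
        return {"kind": "sensor", "range_m": cfg.get("range_m", 1.5)}

    @register("dc_motor_driver")
    def make_motor(cfg):
        return {"kind": "actuator", "channels": cfg.get("channels", 2)}

    def build_robot(spec):
        """spec maps module names to per-module configuration dictionaries."""
        return {name: MODULE_REGISTRY[name](cfg) for name, cfg in spec.items()}

    # Two different robot types assembled from the same module set.
    crawler = build_robot({"wifi_comm": {}, "ir_rangefinder": {"range_m": 0.8},
                           "dc_motor_driver": {"channels": 4}})
    scout = build_robot({"wifi_comm": {"channel": 11}, "ir_rangefinder": {}})
    print(sorted(crawler), sorted(scout))
    ```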

  4. Modular health services: a single case study approach to the applicability of modularity to residential mental healthcare.

    PubMed

    Soffers, Rutger; Meijboom, Bert; van Zaanen, Jos; van der Feltz-Cornelis, Christina

    2014-05-09

    The Dutch mental healthcare sector has to decrease costs by reducing intramural capacity by one third by 2020 and treating more patients in outpatient care. This transition necessitates enabling patients to become as self-supporting as possible, by customising the residential care they receive to their needs for self-development. Theoretically, modularity might help mental healthcare institutions with this. Modularity entails the decomposition of a healthcare service into parts that can be mixed and matched in a variety of ways and that, combined, form a functional whole. It brings about easier and better configuration, increased transparency and more variety without increasing costs. This study aims to explore the applicability of the modularity concept to the residential care provided in Assisted Living Facilities (ALFs) of Dutch mental healthcare institutions. A single case study was carried out at the centre for psychosis in Etten-Leur, part of the GGz Breburg IMPACT care group. The design enables in-depth analysis of a case in a specific context, which is considered appropriate since theory concerning healthcare modularity is in an early stage of development; the present study can be considered a pilot case. Data were gathered by means of interviews, observations and documentary analysis. At the centre for psychosis, the majority of the residential care can be decomposed into modules, which can be grouped into service bundles and sub-bundles; the service customisation process is sufficiently fit to apply modular thinking; and interfaces for most of the categories are present. Hence, the prerequisites for modular residential care offerings are already largely fulfilled, and remedies are available for the aspects not yet fulfilled. The modularity concept seems applicable to the residential care offered by the ALF of the mental healthcare institution under study. For a successful implementation of modularity, however, some steps should be taken by the ALF, such as developing a catalogue of modules and a method for the personnel to work with this catalogue when applying the modules. Whether implementation of modular residential care might facilitate the transition from intramural residential care to outpatient care should be the subject of future research.

  5. Modular health services: a single case study approach to the applicability of modularity to residential mental healthcare

    PubMed Central

    2014-01-01

    Background The Dutch mental healthcare sector has to decrease costs by reducing intramural capacity by one third by 2020 and treating more patients in outpatient care. This transition necessitates enabling patients to become as self-supporting as possible, by customising the residential care they receive to their needs for self-development. Theoretically, modularity might help mental healthcare institutions with this. Modularity entails the decomposition of a healthcare service into parts that can be mixed and matched in a variety of ways and that, combined, form a functional whole. It brings about easier and better configuration, increased transparency and more variety without increasing costs. Aim: This study aims to explore the applicability of the modularity concept to the residential care provided in Assisted Living Facilities (ALFs) of Dutch mental healthcare institutions. Methods A single case study was carried out at the centre for psychosis in Etten-Leur, part of the GGz Breburg IMPACT care group. The design enables in-depth analysis of a case in a specific context, which is considered appropriate since theory concerning healthcare modularity is in an early stage of development; the present study can be considered a pilot case. Data were gathered by means of interviews, observations and documentary analysis. Results At the centre for psychosis, the majority of the residential care can be decomposed into modules, which can be grouped into service bundles and sub-bundles; the service customisation process is sufficiently fit to apply modular thinking; and interfaces for most of the categories are present. Hence, the prerequisites for modular residential care offerings are already largely fulfilled, and remedies are available for the aspects not yet fulfilled. Conclusion The modularity concept seems applicable to the residential care offered by the ALF of the mental healthcare institution under study. For a successful implementation of modularity, however, some steps should be taken by the ALF, such as developing a catalogue of modules and a method for the personnel to work with this catalogue when applying the modules. Whether implementation of modular residential care might facilitate the transition from intramural residential care to outpatient care should be the subject of future research. PMID:24886367

  6. A Fuzzy Goal Programming for a Multi-Depot Distribution Problem

    NASA Astrophysics Data System (ADS)

    Nunkaew, Wuttinan; Phruksaphanrat, Busaba

    2010-10-01

    A fuzzy goal programming model for solving a Multi-Depot Distribution Problem (MDDP) is proposed in this research. The proposed model is applied to the first step of the Assignment First-Routing Second (AFRS) approach. In practice, a basic transportation model is first used to solve this kind of problem in the assignment step, after which a Vehicle Routing Problem (VRP) model is used to compute the delivery cost in the routing step. However, the basic transportation model considers only the depot-to-customer relationship. The customer-to-customer relationship should also be considered, since this relationship exists in the routing step. Both relationships are handled here using Preemptive Fuzzy Goal Programming (P-FGP). The first fuzzy goal is set on the total transportation cost and the second fuzzy goal is set on the satisfactory level of the overall independence value. A case study is used to demonstrate the effectiveness of the proposed model. Results from the proposed model are compared with the basic transportation model that had previously been used in the company. The proposed model can reduce the actual delivery cost in the routing step owing to the better result in the assignment step. Defining fuzzy goals by membership functions is more realistic than using crisp goals. Furthermore, the flexibility to adjust goals and an acceptable satisfactory level for the decision maker can also be increased, and the optimal solution can still be obtained.
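
    The two ingredients of the method above, linear fuzzy membership functions for each goal and a preemptive (lexicographic) ordering between them, can be sketched in a few lines of Python. The goal targets, tolerances and candidate assignments below are made-up illustration values, not the case-study data.

    ```python
    def linear_membership(value, target, tolerance):
        """Satisfaction is 1 at or below the target and falls linearly to 0 at target + tolerance."""
        if value <= target:
            return 1.0
        if value >= target + tolerance:
            return 0.0
        return (target + tolerance - value) / tolerance

    # Hypothetical depot-customer assignments: (transportation cost, overall independence value).
    candidates = {"plan_A": (1250.0, 0.62), "plan_B": (1180.0, 0.45), "plan_C": (1205.0, 0.58)}

    def preemptive_score(cost, independence):
        """Lexicographic key: first maximise the cost goal's membership, then the independence goal's."""
        mu_cost = linear_membership(cost, target=1150.0, tolerance=150.0)
        mu_indep = linear_membership(-independence, target=-0.60, tolerance=0.20)  # higher independence is better
        return (mu_cost, mu_indep)

    best = max(candidates, key=lambda name: preemptive_score(*candidates[name]))
    print(best, preemptive_score(*candidates[best]))
    ```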

  7. Robust model predictive control for multi-step short range spacecraft rendezvous

    NASA Astrophysics Data System (ADS)

    Zhu, Shuyi; Sun, Ran; Wang, Jiaolong; Wang, Jihe; Shao, Xiaowei

    2018-07-01

    This work presents a robust model predictive control (MPC) approach for the multi-step short range spacecraft rendezvous problem. During the short range phase concerned, the chaser is assumed to be initially outside the line-of-sight (LOS) cone. The rendezvous process therefore naturally includes two steps: the first step transfers the chaser into the LOS cone and the second step transfers the chaser into the aimed region with its motion confined within the LOS cone. A novel MPC framework named Mixed MPC (M-MPC) is proposed, which combines the Variable-Horizon MPC (VH-MPC) framework and the Fixed-Instant MPC (FI-MPC) framework. The M-MPC framework enables the optimization for the two steps to be implemented jointly rather than artificially separated, and its computation workload is acceptable for the usually low-power processors onboard spacecraft. Then, considering that disturbances including modeling error, sensor noise and thrust uncertainty may induce undesired constraint violations, a robust technique is developed and attached to the M-MPC framework to form a robust M-MPC approach. The robust technique is based on the chance-constrained idea, which ensures that constraints are satisfied with a prescribed probability. It improves on the robust technique proposed by Gavilan et al. by eliminating unnecessary conservativeness through explicit incorporation of known statistical properties of the navigation uncertainty. The efficacy of the robust M-MPC approach is shown in a simulation study.
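
    A compact way to see the chance-constrained flavor of such an MPC is to tighten state constraints by a margin proportional to the disturbance standard deviation and the desired confidence level. The convex-optimization sketch below (using cvxpy) does this for a toy discrete double integrator rather than the relative orbital dynamics and LOS geometry of the paper; the horizon, bounds and noise levels are assumptions.

    ```python
    import numpy as np
    import cvxpy as cp

    # Toy 1-D double integrator: state = [position, velocity], input = acceleration.
    dt, N = 1.0, 20
    A = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([[0.5 * dt ** 2], [dt]])
    x0, x_goal = np.array([50.0, 0.0]), np.array([0.0, 0.0])

    u_max = 0.5
    pos_limit = 60.0                    # nominal corridor half-width (LOS-like bound)
    sigma_pos = 0.4                     # assumed 1-sigma position uncertainty per step
    z_95 = 1.645                        # one-sided 95% Gaussian quantile
    tight_limit = pos_limit - z_95 * sigma_pos   # constraint tightening, chance-constraint style

    x = cp.Variable((2, N + 1))
    u = cp.Variable((1, N))
    constraints = [x[:, 0] == x0]
    for k in range(N):
        constraints += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                        cp.abs(u[:, k]) <= u_max,
                        cp.abs(x[0, k + 1]) <= tight_limit]

    cost = cp.sum(cp.abs(u)) + 100 * cp.sum_squares(x[:, N] - x_goal)   # fuel proxy + terminal error
    prob = cp.Problem(cp.Minimize(cost), constraints)
    prob.solve()
    print("status:", prob.status, " fuel proxy:", round(float(cp.sum(cp.abs(u)).value), 3))
    ```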

  8. Toward the Modularization of Decision Support Systems

    NASA Astrophysics Data System (ADS)

    Raskin, R. G.

    2009-12-01

    Decision support systems are typically developed entirely from scratch without the use of modular components. This “stovepiped” approach is inefficient and costly because it prevents a developer from leveraging the data, models, tools, and services of other developers. Even when a decision support component is made available, it is difficult to know what problem it solves, how it relates to other components, or even that the component exists. The Spatial Decision Support (SDS) Consortium was formed in 2008 to organize the body of knowledge in SDS within a common portal. The portal identifies the canonical steps in the decision process and enables decision support components to be registered, categorized, and searched. This presentation describes how a decision support system can be assembled from modular models, data, tools and services, based on the needs of the Earth science application.

  9. Crystal Structure of Chitinase ChiW from Paenibacillus sp. str. FPU-7 Reveals a Novel Type of Bacterial Cell-Surface-Expressed Multi-Modular Enzyme Machinery

    PubMed Central

    Itoh, Takafumi; Hibi, Takao; Suzuki, Fumiko; Sugimoto, Ikumi; Fujiwara, Akihiro; Inaka, Koji; Tanaka, Hiroaki; Ohta, Kazunori; Fujii, Yutaka; Taketo, Akira; Kimoto, Hisashi

    2016-01-01

    The Gram-positive bacterium Paenibacillus sp. str. FPU-7 effectively hydrolyzes chitin by using a number of chitinases. A unique chitinase with two catalytic domains, ChiW, is expressed on the cell surface of this bacterium and has high activity towards various chitins, even crystalline chitin. Here, the crystal structure of ChiW at 2.1 Å resolution is presented and describes how the enzyme degrades chitin on the bacterial cell surface. The crystal structure revealed a unique multi-modular architecture composed of six domains that function efficiently on the cell surface: a right-handed β-helix domain (carbohydrate-binding module family 54, CBM-54), a Gly-Ser-rich loop, a first immunoglobulin-like (Ig-like) fold domain, a first β/α-barrel catalytic domain (glycoside hydrolase family 18, GH-18), a second Ig-like fold domain and a second β/α-barrel catalytic domain (GH-18). The structure of the CBM-54, flexibly linked to the catalytic region of ChiW, is described here for the first time. It is similar to the structures of carbohydrate lyases but displayed no detectable carbohydrate degradation activity. The CBM-54 of ChiW bound to cell wall polysaccharides such as chitin, chitosan, β-1,3-glucan, xylan and cellulose. The structural and biochemical data obtained here also indicate that the enzyme has deep, short active-site clefts with endo-acting character. The affinity of CBM-54 towards cell wall polysaccharides and the degradation pattern of the catalytic domains may help to efficiently decompose cell wall chitin through the contact surface. Furthermore, we show that other Gram-positive bacteria possess similar cell-surface-expressed multi-modular enzymes for cell wall polysaccharide degradation. PMID:27907169

  10. Creating the Infrastructure for Rapid Application Development and Processing Response to the HIRDLS Radiance Anomaly

    NASA Astrophysics Data System (ADS)

    Cavanaugh, C.; Gille, J.; Francis, G.; Nardi, B.; Hannigan, J.; McInerney, J.; Krinsky, C.; Barnett, J.; Dean, V.; Craig, C.

    2005-12-01

    The High Resolution Dynamics Limb Sounder (HIRDLS) instrument onboard the NASA Aura spacecraft experienced a rupture of the thermal blanketing material (Kapton) during the rapid depressurization of launch. The Kapton draped over the HIRDLS scan mirror, severely limiting the aperture through which HIRDLS views space and Earth's atmospheric limb. In order for HIRDLS to achieve its intended measurement goals, rapid characterization of the anomaly, and rapid recovery from it were required. The recovery centered around a new processing module inserted into the standard HIRDLS processing scheme, with a goal of minimizing the effect of the anomaly on the already existing processing modules. We describe the software infrastructure on which the new processing module was built, and how that infrastructure allows for rapid application development and processing response. The scope of the infrastructure spans three distinct anomaly recovery steps and the means for their intercommunication. Each of the three recovery steps (removing the Kapton-induced oscillation in the radiometric signal, removing the Kapton signal contamination upon the radiometric signal, and correcting for the partially-obscured atmospheric view) is completely modularized and insulated from the other steps, allowing focused and rapid application development towards a specific step, and neutralizing unintended inter-step influences, thus greatly shortening the design-development-test lifecycle. The intercommunication is also completely modularized and has a simple interface to which the three recovery steps adhere, allowing easy modification and replacement of specific recovery scenarios, thereby heightening the processing response.
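
    The three insulated recovery steps and their simple shared interface, as described above, map naturally onto a small plug-in pattern. The Python sketch below shows the structure only, with placeholder step names mirroring the three recovery steps and trivial numeric corrections standing in for the real radiance-processing algorithms.

    ```python
    from abc import ABC, abstractmethod

    class RecoveryStep(ABC):
        """Common interface so each anomaly-recovery step can be developed and swapped in isolation."""
        @abstractmethod
        def process(self, radiances):
            ...

    class RemoveOscillation(RecoveryStep):
        def process(self, radiances):
            return [r - 0.01 for r in radiances]          # placeholder correction

    class RemoveContamination(RecoveryStep):
        def process(self, radiances):
            return [r * 0.98 for r in radiances]          # placeholder correction

    class CorrectObscuration(RecoveryStep):
        def process(self, radiances):
            return [r / 0.80 for r in radiances]          # placeholder view-factor correction

    def run_recovery(radiances, steps):
        """The intercommunication layer: each step only sees the output of the previous one."""
        for step in steps:
            radiances = step.process(radiances)
        return radiances

    pipeline = [RemoveOscillation(), RemoveContamination(), CorrectObscuration()]
    print(run_recovery([1.00, 1.02, 0.97], pipeline))
    ```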

  11. Modular Coils with Low Hydrogen Content Especially for MRI of Dry Solids.

    PubMed

    Eichhorn, Timon; Ludwig, Ute; Fischer, Elmar; Gröbner, Jens; Göpper, Michael; Eisenbeiss, Anne-Katrin; Flügge, Tabea; Hennig, Jürgen; von Elverfeldt, Dominik; Hövener, Jan-Bernd

    2015-01-01

    Recent advances have enabled fast magnetic resonance imaging (MRI) of solid materials. This development has opened up new applications for MRI, but, at the same time, uncovered new challenges. Previously, MRI-invisible materials like the housing of MRI detection coils are now readily depicted and either cause artifacts or lead to a decreased image resolution. In this contribution, we present versatile, multi-nuclear single and dual-tune MRI coils that stand out by (1) a low hydrogen content for high-resolution MRI of dry solids without artifacts; (2) a modular approach with exchangeable inductors of variable volumes to optimally enclose the given object; (3) low cost and low manufacturing effort that is associated with the modular approach; (4) accurate sample placement in the coil outside of the bore, and (5) a wide, single- or dual-tune frequency range that covers several nuclei and enables multinuclear MRI without moving the sample. The inductors of the coils were constructed from self-supporting copper sheets to avoid all plastic materials within or around the resonator. The components that were mounted at a distance from the inductor, including the circuit board, coaxial cable and holder were manufactured from polytetrafluoroethylene. Residual hydrogen signal was sufficiently well suppressed to allow 1H-MRI of dry solids with a minimum field of view that was smaller than the sensitive volume of the coil. The SNR was found to be comparable but somewhat lower with respect to commercial, proton-rich quadrature coils, and higher with respect to a linearly-polarized commercial coil. The potential of the setup presented was exemplified by 1H/23Na high-resolution zero echo time (ZTE) MRI of a model solution and a dried human molar at 9.4 T. A full 3D image dataset of the tooth was obtained, rich in contrast and similar to the resolution of standard cone-beam computed tomography.

  12. Modular Coils with Low Hydrogen Content Especially for MRI of Dry Solids

    PubMed Central

    Fischer, Elmar; Gröbner, Jens; Göpper, Michael; Eisenbeiss, Anne-Katrin; Flügge, Tabea; Hennig, Jürgen; von Elverfeldt, Dominik; Hövener, Jan-Bernd

    2015-01-01

    Introduction Recent advances have enabled fast magnetic resonance imaging (MRI) of solid materials. This development has opened up new applications for MRI, but, at the same time, uncovered new challenges. Previously, MRI-invisible materials like the housing of MRI detection coils are now readily depicted and either cause artifacts or lead to a decreased image resolution. In this contribution, we present versatile, multi-nuclear single and dual-tune MRI coils that stand out by (1) a low hydrogen content for high-resolution MRI of dry solids without artifacts; (2) a modular approach with exchangeable inductors of variable volumes to optimally enclose the given object; (3) low cost and low manufacturing effort that is associated with the modular approach; (4) accurate sample placement in the coil outside of the bore, and (5) a wide, single- or dual-tune frequency range that covers several nuclei and enables multinuclear MRI without moving the sample. Materials and Methods The inductors of the coils were constructed from self-supporting copper sheets to avoid all plastic materials within or around the resonator. The components that were mounted at a distance from the inductor, including the circuit board, coaxial cable and holder were manufactured from polytetrafluoroethylene. Results and Conclusion Residual hydrogen signal was sufficiently well suppressed to allow 1H-MRI of dry solids with a minimum field of view that was smaller than the sensitive volume of the coil. The SNR was found to be comparable but somewhat lower with respect to commercial, proton-rich quadrature coils, and higher with respect to a linearly-polarized commercial coil. The potential of the setup presented was exemplified by 1H / 23Na high-resolution zero echo time (ZTE) MRI of a model solution and a dried human molar at 9.4 T. A full 3D image dataset of the tooth was obtained, rich in contrast and similar to the resolution of standard cone-beam computed tomography. PMID:26496192

  13. Structural design of composite rotor blades with consideration of manufacturability, durability, and manufacturing uncertainties

    NASA Astrophysics Data System (ADS)

    Li, Leihong

    A modular structural design methodology for composite rotor blades is developed. This design method can be used to design composite rotor blades with sophisticated cross-sectional geometries. The method hierarchically decomposes the highly coupled interdisciplinary rotor analysis into global and local levels. At the global level, aeroelastic response analysis and rotor trim are conducted based on multi-body dynamic models. At the local level, variational asymptotic beam sectional analysis methods are used to obtain equivalent one-dimensional beam properties. Compared with the traditional design methodology, the proposed method is more efficient and accurate. The proposed method is then used to study three design problems that have not been investigated before. The first is the addition of manufacturing constraints to the design optimization. The introduction of manufacturing constraints complicates the optimization process, but the resulting design benefits the manufacturing process and reduces the risk of violating major performance constraints. Next, a new design procedure for structural design against fatigue failure is proposed. This procedure combines fatigue analysis with the optimization process; the durability or fatigue analysis employs a strength-based model, and the design is subject to stiffness, frequency, and durability constraints. Finally, the impacts of manufacturing uncertainty on rotor blade aeroelastic behavior are investigated, and a probabilistic design method is proposed to control the impacts of uncertainty on blade structural performance. The uncertainty factors include dimensions, shapes, material properties, and service loads.

  14. Adaptive surrogate model based multi-objective transfer trajectory optimization between different libration points

    NASA Astrophysics Data System (ADS)

    Peng, Haijun; Wang, Wei

    2016-10-01

    An adaptive surrogate model-based multi-objective optimization strategy that combines the benefits of invariant manifolds and low-thrust control is proposed for developing low-computational-cost transfer trajectories between libration orbits around the L1 and L2 libration points in the Sun-Earth system. A new structure for the multi-objective transfer trajectory optimization model is established that divides the transfer trajectory into several segments and assigns the dominant roles of invariant manifolds and low-thrust control in the different segments. To reduce the computational cost of multi-objective transfer trajectory optimization, an adaptive surrogate model based on a mixed sampling strategy is proposed. Numerical simulations show that the results obtained from the adaptive surrogate-based multi-objective optimization agree with those obtained using direct multi-objective optimization methods, while the computational workload of the adaptive surrogate-based multi-objective optimization is only approximately 10% of that of direct multi-objective optimization. Furthermore, the generating efficiency of the Pareto points of the adaptive surrogate-based multi-objective optimization is approximately 8 times that of the direct multi-objective optimization. The proposed adaptive surrogate-based multi-objective optimization therefore provides obvious advantages over direct multi-objective optimization methods.
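
    The core loop of surrogate-assisted multi-objective optimization, evaluate a few expensive designs, fit cheap surrogates, and screen many candidates for non-dominated predictions, can be sketched generically in Python with scikit-learn Gaussian processes. The two analytic objective functions below are placeholders for the expensive trajectory propagation, not the dynamics used in the paper.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    rng = np.random.default_rng(1)

    def expensive_objectives(x):
        """Placeholder for costly trajectory evaluation: returns (fuel-like, time-like) objectives."""
        f1 = np.sum(x ** 2, axis=1)
        f2 = np.sum((x - 1.0) ** 2, axis=1)
        return np.column_stack([f1, f2])

    # 1) A small budget of expensive evaluations.
    X_train = rng.uniform(0, 1, size=(20, 3))
    Y_train = expensive_objectives(X_train)

    # 2) One surrogate per objective.
    surrogates = [GaussianProcessRegressor().fit(X_train, Y_train[:, j]) for j in range(2)]

    # 3) Cheap screening of many candidates with the surrogates only.
    X_cand = rng.uniform(0, 1, size=(2000, 3))
    Y_pred = np.column_stack([gp.predict(X_cand) for gp in surrogates])

    def non_dominated(points):
        keep = np.ones(len(points), dtype=bool)
        for i, p in enumerate(points):
            if keep[i]:
                dominated = np.all(points >= p, axis=1) & np.any(points > p, axis=1)
                keep &= ~dominated
        return keep

    pareto = X_cand[non_dominated(Y_pred)]
    print("approximate Pareto set size:", len(pareto))
    ```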

  15. Pebble bed modular reactor safeguards: developing new approaches and implementing safeguards by design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beyer, Brian David; Beddingfield, David H; Durst, Philip

    2010-01-01

    The design of the Pebble Bed Modular Reactor (PBMR) does not fit or seem appropriate to the IAEA safeguards approach under the categories of light water reactor (LWR), on-load refueled reactor (OLR, i.e. CANDU), or Other (prismatic HTGR), because the fuel is in a bulk form rather than discrete items. Because the nuclear fuel is a collection of nuclear material inserted in tennis-ball sized spheres containing structural and moderating material, and a PBMR core will contain a bulk load on the order of 500,000 spheres, it could be classified as a 'Bulk-Fuel Reactor.' Hence, the IAEA should develop unique safeguards criteria. In a multi-lab DOE study, it was found that an optimized blend of (i) developing techniques to verify the plutonium content in spent fuel pebbles, (ii) improving burn-up computer codes for PBMR spent fuel to provide better understanding of the core and spent fuel makeup, and (iii) utilizing bulk verification techniques for PBMR spent fuel storage bins should be combined with the historic IAEA and South African approaches of containment and surveillance to verify and maintain continuity of knowledge of PBMR fuel. For all of these techniques to work, the design of the reactor will need to accommodate safeguards and material accountancy measures to a far greater extent than has thus far been the case. The implementation of Safeguards-by-Design as the PBMR design progresses provides an approach to meet these safeguards and accountancy needs.

  16. FoCa: a modular treatment planning system for proton radiotherapy with research and educational purposes

    NASA Astrophysics Data System (ADS)

    Sánchez-Parcerisa, D.; Kondrla, M.; Shaindlin, A.; Carabe, A.

    2014-12-01

    FoCa is an in-house modular treatment planning system, developed entirely in MATLAB, which includes forward dose calculation of proton radiotherapy plans in both active and passive modalities as well as a generic optimization suite for inverse treatment planning. The software has a dual education and research purpose. From the educational point of view, it can be an invaluable teaching tool for educating medical physicists, showing the insights of a treatment planning system from a well-known and widely accessible software platform. From the research point of view, its current and potential uses range from the fast calculation of any physical, radiobiological or clinical quantity in a patient CT geometry, to the development of new treatment modalities not yet available in commercial treatment planning systems. The physical models in FoCa were compared with the commissioning data from our institution and show an excellent agreement in depth dose distributions and longitudinal and transversal fluence profiles for both passive scattering and active scanning modalities. 3D dose distributions in phantom and patient geometries were compared with a commercial treatment planning system, yielding a gamma-index pass rate of above 94% (using FoCa’s most accurate algorithm) for all cases considered. Finally, the inverse treatment planning suite was used to produce the first prototype of intensity-modulated, passive-scattered proton therapy, using 13 passive scattering proton fields and multi-leaf modulation to produce a concave dose distribution on a cylindrical solid water phantom without any field-specific compensator.
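
    Since the comparison above is summarized with a gamma-index pass rate, here is a hedged, one-dimensional illustration of how such a pass rate can be computed from two dose profiles. The 3%/3 mm criteria and the synthetic profiles are assumptions for the example; clinical gamma evaluations are normally done on 2D or 3D dose distributions.

    ```python
    import numpy as np

    def gamma_pass_rate_1d(x_ref, d_ref, x_eval, d_eval, dose_tol=0.03, dist_tol=3.0):
        """Fraction of evaluated points with gamma <= 1 against a reference profile.

        dose_tol is a fraction of the maximum reference dose; dist_tol is in the
        same length unit as the position arrays (e.g. mm)."""
        dd = dose_tol * np.max(d_ref)
        gammas = []
        for xe, de in zip(x_eval, d_eval):
            dist_term = ((xe - x_ref) / dist_tol) ** 2
            dose_term = ((de - d_ref) / dd) ** 2
            gammas.append(np.sqrt(np.min(dist_term + dose_term)))
        gammas = np.array(gammas)
        return np.mean(gammas <= 1.0), gammas

    # Synthetic example: a Gaussian-like dose profile and a slightly shifted copy.
    x = np.linspace(0.0, 100.0, 201)                 # mm
    reference = np.exp(-((x - 60.0) / 15.0) ** 2)
    evaluated = np.exp(-((x - 61.0) / 15.0) ** 2)    # 1 mm shift
    rate, _ = gamma_pass_rate_1d(x, reference, x, evaluated)
    print(f"gamma pass rate: {100 * rate:.1f}%")
    ```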

  17. A beam optics study of a modular multi-source X-ray tube for novel computed tomography applications

    NASA Astrophysics Data System (ADS)

    Walker, Brandon J.; Radtke, Jeff; Chen, Guang-Hong; Eliceiri, Kevin W.; Mackie, Thomas R.

    2017-10-01

    A modular implementation of a scanning multi-source X-ray tube is designed for the increasing number of multi-source imaging applications in computed tomography (CT). An electron beam array coupled with an oscillating magnetic deflector is proposed as a means for producing an X-ray focal spot at any position along a line. The preliminary multi-source model includes three thermionic electron guns that are deflected in tandem by a slowly varying magnetic field and pulsed according to a scanning sequence that is dependent on the intended imaging application. Particle tracking simulations with particle dynamics analysis software demonstrate that three 100 keV electron beams are laterally swept a combined distance of 15 cm over a stationary target with an oscillating magnetic field of 102 G perpendicular to the beam axis. Beam modulation is accomplished using 25 μs pulse widths to a grid electrode with a reverse gate bias of -500 V and an extraction voltage of +1000 V. Projected focal spot diameters are approximately 1 mm for 138 mA electron beams and the stationary target stays within thermal limits for the 14 kW module. This concept could be used as a research platform for investigating high-speed stationary CT scanners, for lowering dose with virtual fan beam formation, for reducing scatter radiation in cone-beam CT, or for other industrial applications.

  18. Multi-unit Operations in Non-Nuclear Systems: Lessons Learned for Small Modular Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    OHara J. M.; Higgins, J.; DAgostino, A.

    2012-01-17

    The nuclear-power community has reached the stage of proposing advanced reactor designs to support power generation for decades to come. Small modular reactors (SMRs) are one approach to meet these energy needs. While the power output of individual reactor modules is relatively small, they can be grouped to produce reactor sites with different outputs. Also, they can be designed to generate hydrogen or to provide process heat. Many characteristics of SMRs are quite different from those of current plants, and they may be operated quite differently. One difference is that multiple units may be operated by a single crew (or a single operator) from one control room. The U.S. Nuclear Regulatory Commission (NRC) is examining the human factors engineering (HFE) aspects of SMRs to support licensing reviews. While we reviewed information on SMR designs, the designs are not completed and not all of the design and operational information is yet available; nor is information on multi-unit operations as envisioned for SMRs available from operating experience. Thus, to gain a better understanding of multi-unit operations we sought the lessons learned from non-nuclear systems that have experience in multi-unit operations, specifically refineries, unmanned aerial vehicles and tele-intensive care units. In this paper we report the lessons learned from these systems and the implications for SMRs.

  19. Modular evolution of phosphorylation-based signalling systems

    PubMed Central

    Jin, Jing; Pawson, Tony

    2012-01-01

    Phosphorylation sites are formed by protein kinases (‘writers’), frequently exert their effects following recognition by phospho-binding proteins (‘readers’) and are removed by protein phosphatases (‘erasers’). This writer–reader–eraser toolkit allows phosphorylation events to control a broad range of regulatory processes, and has been pivotal in the evolution of new functions required for the development of multi-cellular animals. The proteins that comprise this system of protein kinases, phospho-binding targets and phosphatases are typically modular in organization, in the sense that they are composed of multiple globular domains and smaller peptide motifs with binding or catalytic properties. The linkage of these binding and catalytic modules in new ways through genetic recombination, and the selection of particular domain combinations, has promoted the evolution of novel, biologically useful processes. Conversely, the joining of domains in aberrant combinations can subvert cell signalling and be causative in diseases such as cancer. Major inventions such as phosphotyrosine (pTyr)-mediated signalling that flourished in the first multi-cellular animals and their immediate predecessors resulted from stepwise evolutionary progression. This involved changes in the binding properties of interaction domains such as SH2 and their linkage to new domain types, and alterations in the catalytic specificities of kinases and phosphatases. This review will focus on the modular aspects of signalling networks and the mechanism by which they may have evolved. PMID:22889906

  20. A multipurpose modular drone with adjustable arms produced via the FDM additive manufacturing process

    NASA Astrophysics Data System (ADS)

    Brischetto, Salvatore; Ciano, Alessandro; Ferro, Carlo Giovanni

    2016-07-01

    The present paper shows an innovative multirotor Unmanned Aerial Vehicle (UAV) which is able to change its configuration easily and quickly. To achieve this, the principal structure is made of a universal plate combined with a circular ring, creating a rail guide able to host the arms, in a variable number from 3 to 8, and the legs. The arms are adjustable and contain all the avionics and motor drivers needed to connect the main structure with each electric motor. The unique arm design, defined as all-in-one, allows classical single-rotor configurations, double-rotor configurations and amphibious configurations that include inflatable elements positioned at the bottom of the arms. The proposed multi-rotor system is inexpensive because of the few universal pieces needed to compose the platform, which allows the creation of a kit. This modular kit yields a modular drone with different configurations, distinguished by the number of arms, number of legs, number of rotors and motors, and landing capability. Another innovative feature is the use of 3D printing technology to produce all the structural elements. All the pieces are designed to be produced via the Fused Deposition Modelling (FDM) technology using desktop 3D printers. Therefore, a universal, dynamic and economic multi-rotor UAV has been developed.

  1. Algorithmic-Reducibility = Renormalization-Group Fixed-Points; ``Noise''-Induced Phase-Transitions (NITs) to Accelerate Algorithmics (``NIT-Picking'') Replacing CRUTCHES!!!: Gauss Modular/Clock-Arithmetic Congruences = Signal X Noise PRODUCTS..

    NASA Astrophysics Data System (ADS)

    Siegel, J.; Siegel, Edward Carl-Ludwig

    2011-03-01

    Cook-Levin computational-"complexity"(C-C) algorithmic-equivalence reduction-theorem reducibility equivalence to renormalization-(semi)-group phase-transitions critical-phenomena statistical-physics universality-classes fixed-points, is exploited with Gauss modular/clock-arithmetic/model congruences = signal X noise PRODUCT reinterpretation. Siegel-Baez FUZZYICS=CATEGORYICS(SON of ``TRIZ''): Category-Semantics(C-S) tabular list-format truth-table matrix analytics predicts and implements "noise"-induced phase-transitions (NITs) to accelerate versus to decelerate Harel [Algorithmics(1987)]-Sipser[Intro. Theory Computation(1997) algorithmic C-C: "NIT-picking" to optimize optimization-problems optimally(OOPO). Versus iso-"noise" power-spectrum quantitative-only amplitude/magnitude-only variation stochastic-resonance, this "NIT-picking" is "noise" power-spectrum QUALitative-type variation via quantitative critical-exponents variation. Computer-"science" algorithmic C-C models: Turing-machine, finite-state-models/automata, are identified as early-days once-workable but NOW ONLY LIMITING CRUTCHES IMPEDING latter-days new-insights!!!

  2. Rapid construction of insulated genetic circuits via synthetic sequence-guided isothermal assembly

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Torella, JP; Boehm, CR; Lienert, F

    2013-12-28

    In vitro recombination methods have enabled one-step construction of large DNA sequences from multiple parts. Although synthetic biological circuits can in principle be assembled in the same fashion, they typically contain repeated sequence elements such as standard promoters and terminators that interfere with homologous recombination. Here we use a computational approach to design synthetic, biologically inactive unique nucleotide sequences (UNSes) that facilitate accurate ordered assembly. Importantly, our designed UNSes make it possible to assemble parts with repeated terminator and insulator sequences, and thereby create insulated functional genetic circuits in bacteria and mammalian cells. Using UNS-guided assembly to construct repeating promoter-gene-terminator parts, we systematically varied gene expression to optimize production of a deoxychromoviridans biosynthetic pathway in Escherichia coli. We then used this system to construct complex eukaryotic AND-logic gates for genomic integration into embryonic stem cells. Construction was performed by using a standardized series of UNS-bearing BioBrick-compatible vectors, which enable modular assembly and facilitate reuse of individual parts. UNS-guided isothermal assembly is broadly applicable to the construction and optimization of genetic circuits and particularly those requiring tight insulation, such as complex biosynthetic pathways, sensors, counters and logic gates.
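
    The UNS design problem above is, at its core, a sequence-screening task: candidate spacers should not share long stretches with each other (or with their reverse complements) that could cross-react during homology-based assembly. The short Python sketch below illustrates that kind of k-mer screen; the sequences, the k value, and the screening rule are invented for illustration and are not the criteria used by the authors.

    ```python
    from itertools import combinations

    COMPLEMENT = str.maketrans("ACGT", "TGCA")

    def revcomp(seq):
        return seq.translate(COMPLEMENT)[::-1]

    def kmers(seq, k):
        s = seq.upper()
        return {s[i:i + k] for i in range(len(s) - k + 1)}

    def cross_reactive(seq_a, seq_b, k=8):
        """True if the two sequences share any k-mer, in either orientation."""
        a = kmers(seq_a, k) | kmers(revcomp(seq_a), k)
        return bool(a & (kmers(seq_b, k) | kmers(revcomp(seq_b), k)))

    # Toy candidate unique nucleotide sequences (purely illustrative).
    candidates = {
        "UNS1": "ATCGGTACCTAGGCTTAACGGATCCTTAG",
        "UNS2": "GGATTACCAGTCGATCCGGTTAACCGTAA",
        "UNS3": "ATCGGTACCTAGGAAATTTCCCGGGTTTA",   # shares a prefix with UNS1 on purpose
    }

    for (n1, s1), (n2, s2) in combinations(candidates.items(), 2):
        print(n1, n2, "cross-reactive:", cross_reactive(s1, s2))
    ```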

  3. Preliminary design of 1 kW bipolar Ni-MH battery for LEO-satellite application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, J.H.; Reisner, D.E.; Klein, M.G.

    1996-12-31

    Electro Energy, Inc. (EEI) is developing a bipolar nickel-metal hydride rechargeable battery based upon the use of stackable wafer cells. The key to viable bipolar operation has been this unique modular (unitized) approach. The patented unit wafer-cell construct exploits the chemical and thermal properties of a proprietary electrically conductive plastic film. Characteristic of bipolar batteries, current flows across the cell interfaces, perpendicular to the electrode plane. EEI has recently contracted with NASA Lewis Research Center (LeRC) to develop an optimized design 1 kW flightweight battery, for low-earth-orbit (LEO) satellite applications, over a 4-year period with a deliverable flightweight design package. The contract includes an option for EEI to deliver up to three flight quality batteries in an 18-month follow-on program. NASA LeRC has promulgated that the program steps include the design, fabrication, and evaluation of four evolutionary stages of the final battery design which have been designated Preliminary, Improved, Optimized, and Flightweight Design. Initial results from the Preliminary Stage are presented including a 1 kW battery design, thermal design, parameter study, and component development in subscale bipolar batteries.

  4. Nucleic acid amplification using modular branched primers

    DOEpatents

    Ulanovsky, Levy; Raja, Mugasimangalam C.

    2001-01-01

    Methods and compositions expand the options for making primers for use in amplifying nucleic acid segments. The invention eliminates the step of custom synthesis of primers for Polymerase Chain Reactions (PCR). Instead of being custom-synthesized, a primer is replaced by a combination of several oligonucleotide modules selected from a pre-synthesized library. A modular combination of just a few oligonucleotides essentially mimics the performance of a conventional, custom-made primer by matching the sequence of the priming site in the template. Each oligonucleotide module has a segment that matches one of the stretches within the priming site.

  5. Selforganization of modular activity of grid cells

    PubMed Central

    Urdapilleta, Eugenio; Si, Bailu

    2017-01-01

    A unique topographical representation of space is found in the concerted activity of grid cells in the rodent medial entorhinal cortex. Many among the principal cells in this region exhibit a hexagonal firing pattern, in which each cell expresses its own set of place fields (spatial phases) at the vertices of a triangular grid, the spacing and orientation of which are typically shared with neighboring cells. Grid spacing, in particular, has been found to increase along the dorso‐ventral axis of the entorhinal cortex but in discrete steps, that is, with a modular structure. In this study, we show that such a modular activity may result from the self‐organization of interacting units, which individually would not show discrete but rather continuously varying grid spacing. Within our “adaptation” network model, the effect of a continuously varying time constant, which determines grid spacing in the isolated cell model, is modulated by recurrent collateral connections, which tend to produce a few subnetworks, akin to magnetic domains, each with its own grid spacing. In agreement with experimental evidence, the modular structure is tightly defined by grid spacing, but also involves grid orientation and distortion, due to interactions across modules. Thus, our study sheds light onto a possible mechanism, other than simply assuming separate networks a priori, underlying the formation of modular grid representations. PMID:28768062

  6. Modeling urban air pollution with optimized hierarchical fuzzy inference system.

    PubMed

    Tashayo, Behnam; Alimohammadi, Abbas

    2016-10-01

    Environmental exposure assessments (EEA) and epidemiological studies require urban air pollution models with appropriate spatial and temporal resolutions. Uncertain available data and inflexible models can limit air pollution modeling techniques, particularly in developing countries. This paper develops a hierarchical fuzzy inference system (HFIS) to model air pollution under different land use, transportation, and meteorological conditions. To improve performance, the system treats the issue as a large-scale, high-dimensional problem and develops the proposed model using a three-step approach. In the first step, a geospatial information system (GIS) and probabilistic methods are used to preprocess the data. In the second step, a hierarchical structure is generated based on the problem. In the third step, the accuracy and complexity of the model are simultaneously optimized with a multiple objective particle swarm optimization (MOPSO) algorithm. We examine the capabilities of the proposed model for predicting daily and annual mean PM2.5 and NO2 and compare the accuracy of the results with representative models from the existing literature. The benefits provided by the model features, including probabilistic preprocessing, multi-objective optimization, and the hierarchical structure, are evaluated by comparing five different consecutive models in terms of accuracy and complexity criteria. Fivefold cross validation is used to assess the performance of the generated models. The respective average RMSEs and coefficients of determination (R2) for the test datasets using the proposed model are as follows: daily PM2.5 = (8.13, 0.78), annual mean PM2.5 = (4.96, 0.80), daily NO2 = (5.63, 0.79), and annual mean NO2 = (2.89, 0.83). The results demonstrate that the developed hierarchical fuzzy inference system can be utilized for modeling air pollution in EEA and epidemiological studies.
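
    The evaluation protocol described above (fivefold cross validation reporting RMSE and R2) is easy to reproduce generically. The Python sketch below uses scikit-learn with a random-forest stand-in for the hierarchical fuzzy model and synthetic features, so the numbers it prints are illustrative only.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import KFold
    from sklearn.metrics import mean_squared_error, r2_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 6))                       # stand-ins for land-use / traffic / weather features
    y = 20 + 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=2.0, size=500)   # synthetic "daily PM2.5"

    rmses, r2s = [], []
    for train_idx, test_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
        model = RandomForestRegressor(n_estimators=200, random_state=0)
        model.fit(X[train_idx], y[train_idx])
        pred = model.predict(X[test_idx])
        rmses.append(np.sqrt(mean_squared_error(y[test_idx], pred)))
        r2s.append(r2_score(y[test_idx], pred))

    print(f"RMSE = {np.mean(rmses):.2f}, R2 = {np.mean(r2s):.2f}")
    ```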

  7. Exploring synergistic benefits of Water-Food-Energy Nexus through multi-objective reservoir optimization schemes.

    PubMed

    Uen, Tinn-Shuan; Chang, Fi-John; Zhou, Yanlai; Tsai, Wen-Ping

    2018-08-15

    This study proposed a holistic three-fold scheme that synergistically optimizes the benefits of the Water-Food-Energy (WFE) Nexus by integrating the short/long-term joint operation of a multi-objective reservoir with irrigation ponds in response to urbanization. The three-fold scheme was implemented step by step: (1) optimizing short-term (daily scale) reservoir operation for maximizing hydropower output and final reservoir storage during typhoon seasons; (2) simulating long-term (ten-day scale) water shortage rates in consideration of the availability of irrigation ponds for both agricultural and public sectors during non-typhoon seasons; and (3) promoting the synergistic benefits of the WFE Nexus from a year-round perspective by integrating the short-term optimization and long-term simulation of reservoir operations. The pivotal Shihmen Reservoir and 745 irrigation ponds located in Taoyuan City of Taiwan, together with the surrounding urban areas, formed the study case. The results indicated that the optimal short-term reservoir operation obtained from the non-dominated sorting genetic algorithm II (NSGA-II) could largely increase hydropower output while only slightly affecting water supply. The simulation results of the reservoir coupled with irrigation ponds indicated that such joint operation could significantly reduce agricultural and public water shortage rates by 22.2% and 23.7% on average, respectively, as compared with reservoir operation excluding irrigation ponds. The results of year-round short/long-term joint operation showed that, in a wet year, water shortage rates could be reduced by up to 10%, the food production rate could be increased by up to 47%, and the hydropower benefit could increase by up to 9.33 million USD per year. Consequently, the proposed methodology could be a viable approach to promoting the synergistic benefits of the WFE Nexus, and the results provide unique insights for stakeholders and policymakers pursuing sustainable urban development plans.
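
    The multi-objective step of such a scheme rests on identifying non-dominated trade-offs between objectives such as hydropower output and end-of-period storage. Below is a minimal pure-Python illustration of that Pareto screening over randomly generated release schedules; the toy reservoir balance and objective formulas are assumptions for the example, not the Shihmen Reservoir model.

    ```python
    import random

    random.seed(7)
    INFLOW = [80.0, 120.0, 60.0, 90.0]      # toy inflows per period (assumed units)
    S0, S_MAX = 300.0, 500.0                # initial and maximum storage

    def simulate(releases):
        """Return (hydropower proxy, final storage) for a candidate release schedule."""
        storage, energy = S0, 0.0
        for inflow, release in zip(INFLOW, releases):
            release = min(release, storage + inflow)            # cannot release more than available
            storage = min(storage + inflow - release, S_MAX)    # spill anything above capacity
            energy += release * (storage / S_MAX)               # crude head-dependent power proxy
        return energy, storage

    def dominates(a, b):
        """Maximization in both objectives."""
        return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

    candidates = [[random.uniform(20.0, 150.0) for _ in INFLOW] for _ in range(300)]
    scores = [simulate(c) for c in candidates]
    pareto = [c for c, s in zip(candidates, scores)
              if not any(dominates(t, s) for t in scores)]
    print("non-dominated schedules:", len(pareto))
    ```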

  8. Modular Rocket Engine Control Software (MRECS)

    NASA Technical Reports Server (NTRS)

    Tarrant, Charlie; Crook, Jerry

    1997-01-01

    The Modular Rocket Engine Control Software (MRECS) Program is a technology demonstration effort designed to advance the state of the art in launch vehicle propulsion systems. Its emphasis is on developing and demonstrating a modular software architecture for a generic, advanced engine control system that will result in lower software maintenance (operations) costs. It effectively accommodates software requirements changes that occur due to hardware technology upgrades and engine development testing. Ground rules directed by MSFC were to optimize modularity and implement the software in the Ada programming language. MRECS system software and the software development environment utilize Commercial-Off-the-Shelf (COTS) products. This paper presents the objectives and benefits of the program. The software architecture, design, and development environment are described. MRECS tasks are defined and timing relationships given. Major accomplishments are listed. MRECS offers benefits to a wide variety of advanced technology programs in the areas of modular software architecture, software reuse, and reduced software re-verification time related to software changes. Currently, the program is focused on supporting MSFC in accomplishing a Space Shuttle Main Engine (SSME) hot-fire test at Stennis Space Center and the Low Cost Boost Technology (LCBT) Program.

  9. Modular protein expression by RNA trans-splicing enables flexible expression of antibody formats in mammalian cells from a dual-host phage display vector.

    PubMed

    Shang, Yonglei; Tesar, Devin; Hötzel, Isidro

    2015-10-01

    A recently described dual-host phage display vector that allows expression of immunoglobulin G (IgG) in mammalian cells bypasses the need for subcloning of phage display clone inserts to mammalian vectors for IgG expression in large antibody discovery and optimization campaigns. However, antibody discovery and optimization campaigns usually need different antibody formats for screening, requiring reformatting of the clones in the dual-host phage display vector to an alternative vector. We developed a modular protein expression system mediated by RNA trans-splicing to enable the expression of different antibody formats from the same phage display vector. The heavy-chain region encoded by the phage display vector is directly and precisely fused to different downstream heavy-chain sequences encoded by complementing plasmids simply by joining exons in different pre-mRNAs by trans-splicing. The modular expression system can be used to efficiently express structurally correct IgG and Fab fragments or other antibody formats from the same phage display clone in mammalian cells without clone reformatting.

  10. Rational modular design of metabolic network for efficient production of plant polyphenol pinosylvin.

    PubMed

    Wu, Junjun; Zhang, Xia; Zhu, Yingjie; Tan, Qinyu; He, Jiacheng; Dong, Mingsheng

    2017-05-03

    Efficient biosynthesis of the plant polyphenol pinosylvin, which has numerous applications in nutraceuticals and pharmaceuticals, is necessary to make biological production economically viable. To this end, an efficient Escherichia coli platform for pinosylvin production was developed via a rational modular design approach. Initially, different candidate pathway enzymes were screened to construct a de novo pinosylvin pathway directly from D-glucose. A comparative analysis of pathway intermediate pools identified that this initial construct led to accumulation of the intermediate cinnamic acid. The pinosylvin synthetic pathway was then divided into two new modules separated at cinnamic acid. Combinatorial optimization of the transcriptional and translational levels of these two modules resulted in a 16-fold increase in pinosylvin titer. To further improve the concentration of the limiting precursor malonyl-CoA, a malonyl-CoA synthesis module based on clustered regularly interspaced short palindromic repeats interference was assembled and optimized together with the other two modules. The final pinosylvin titer was improved to 281 mg/L, the highest pinosylvin titer reported directly from D-glucose without any additional precursor supplementation. The rational modular design approach described here could bolster our capabilities in synthetic biology for value-added chemical production.
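
    The combinatorial tuning step described above, trying every pairing of expression strengths for the two modules and keeping the best performer, has a simple generic shape in code. In the Python sketch below the promoter/RBS strength values and the titer model are invented placeholders used only to show the enumeration pattern, not the strains or measurements from the study.

    ```python
    from itertools import product

    # Hypothetical relative expression strengths for each module (arbitrary units).
    upstream_module = {"weak": 0.3, "medium": 1.0, "strong": 2.5}     # D-glucose -> cinnamic acid
    downstream_module = {"weak": 0.3, "medium": 1.0, "strong": 2.5}   # cinnamic acid -> pinosylvin

    def predicted_titer(up, down):
        """Toy balance model: flux is limited by the weaker module and penalized
        when the upstream module outruns the downstream one (intermediate build-up)."""
        flux = min(up, down)
        imbalance_penalty = 0.2 * max(0.0, up - down)
        return max(0.0, 100.0 * flux - 50.0 * imbalance_penalty)

    results = {
        (u_name, d_name): predicted_titer(u, d)
        for (u_name, u), (d_name, d) in product(upstream_module.items(), downstream_module.items())
    }
    best = max(results, key=results.get)
    print("best combination:", best, "predicted titer:", round(results[best], 1))
    ```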

  11. Dynamic implicit 3D adaptive mesh refinement for non-equilibrium radiation diffusion

    NASA Astrophysics Data System (ADS)

    Philip, B.; Wang, Z.; Berrill, M. A.; Birke, M.; Pernice, M.

    2014-04-01

    The time dependent non-equilibrium radiation diffusion equations are important for solving the transport of energy through radiation in optically thick regimes and find applications in several fields including astrophysics and inertial confinement fusion. The associated initial boundary value problems that are encountered often exhibit a wide range of scales in space and time and are extremely challenging to solve. To efficiently and accurately simulate these systems we describe our research on combining techniques that will also find use more broadly for long term time integration of nonlinear multi-physics systems: implicit time integration for efficient long term time integration of stiff multi-physics systems, local control theory based step size control to minimize the required global number of time steps while controlling accuracy, dynamic 3D adaptive mesh refinement (AMR) to minimize memory and computational costs, Jacobian Free Newton-Krylov methods on AMR grids for efficient nonlinear solution, and optimal multilevel preconditioner components that provide level independent solver convergence.
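
    The abstract combines implicit time integration with Jacobian-free Newton-Krylov (JFNK) solves. As a minimal illustration (not the authors' AMR code), the sketch below takes a single backward-Euler step of a 1D nonlinear diffusion problem using SciPy's matrix-free Newton-Krylov solver; the grid size, time step, boundary treatment, and the u^4 nonlinearity are illustrative assumptions.

      # Minimal sketch: one implicit (backward-Euler) step of a 1D nonlinear
      # diffusion problem solved matrix-free with a Jacobian-free Newton-Krylov
      # method. All problem parameters are illustrative placeholders.
      import numpy as np
      from scipy.optimize import newton_krylov

      nx, dt = 64, 1e-3
      dx = 1.0 / nx
      u_old = np.linspace(0.1, 1.0, nx)        # previous time level (assumed data)

      def residual(u_new):
          """Backward-Euler residual F(u) = u - u_old - dt * d2(u^4)/dx2."""
          e = u_new ** 4
          lap = np.zeros_like(u_new)
          lap[1:-1] = (e[2:] - 2.0 * e[1:-1] + e[:-2]) / dx**2
          # lap = 0 at the ends, so the boundary values simply stay at u_old
          return u_new - u_old - dt * lap

      u_new = newton_krylov(residual, u_old, method="lgmres", f_tol=1e-8)
      print("max change over the step:", np.abs(u_new - u_old).max())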

  12. Fully implicit adaptive mesh refinement solver for 2D MHD

    NASA Astrophysics Data System (ADS)

    Philip, B.; Chacon, L.; Pernice, M.

    2008-11-01

    Application of implicit adaptive mesh refinement (AMR) to simulate resistive magnetohydrodynamics is described. Solving this challenging multi-scale, multi-physics problem can improve understanding of reconnection in magnetically-confined plasmas. AMR is employed to resolve extremely thin current sheets, essential for an accurate macroscopic description. Implicit time stepping allows us to accurately follow the dynamical time scale of the developing magnetic field, without being restricted by fast Alfvén time scales. At each time step, the large-scale system of nonlinear equations is solved by a Jacobian-free Newton-Krylov method together with a physics-based preconditioner. Each block within the preconditioner is solved optimally using the Fast Adaptive Composite grid method, which can be considered as a multiplicative Schwarz method on AMR grids. We will demonstrate the excellent accuracy and efficiency properties of the method with several challenging reduced MHD applications, including tearing, island coalescence, and tilt instabilities. B. Philip, L. Chacón, M. Pernice, J. Comput. Phys., in press (2008)

  13. Synthetic spider silk sustainability verification by techno-economic and life cycle analysis

    NASA Astrophysics Data System (ADS)

    Edlund, Alan

    Major ampullate spider silk represents a promising biomaterial with diverse commercial potential ranging from textiles to medical devices due to the excellent physical and thermal properties arising from its protein structure. Recent advancements in synthetic biology have facilitated the development of recombinant spider silk proteins from Escherichia coli (E. coli), alfalfa, and goats. This study specifically investigates the economic feasibility and environmental impact of synthetic spider silk manufacturing. Pilot-scale data were used to validate an engineering process model that includes all of the required sub-processing steps for synthetic fiber manufacture: production, harvesting, purification, drying, and spinning. Modeling was constructed modularly to support assessment of alternative protein production methods (alfalfa and goats) as well as alternative downstream processing technologies. The techno-economic analysis indicates a minimum sale price from pioneer and optimized E. coli plants of $761 kg-1 and $23 kg-1, with greenhouse gas emissions of 572 kg CO2-eq. kg-1 and 55 kg CO2-eq. kg-1, respectively. Spider silk sale price estimates for pioneer and optimized goat plants are $730 kg-1 and $54 kg-1, respectively, while pioneer and optimized alfalfa plants yield $207 kg-1 and $9.22 kg-1, respectively. Elevated costs and emissions from the pioneer plant can be directly tied to the high material consumption and low protein yield. Decreased production costs associated with the optimized plants include improved protein yield, process optimization, and an Nth-plant assumption. Discussion focuses on the commercial potential of spider silk, the production performance requirements for commercialization, and the impact of alternative technologies on the sustainability of the system.
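
    A minimum sale price in a techno-economic analysis of this kind is essentially the product price at which the plant's discounted cash flows break even. The toy sketch below (not the study's process model) finds such a break-even price by a simple NPV root-find; the capital cost, operating cost, annual production, discount rate, and plant life are all placeholder assumptions.

      # Toy minimum-sale-price calculation: find the price at which NPV = 0.
      # Every number below is a hypothetical placeholder, not study data.
      import numpy as np
      from scipy.optimize import brentq

      capex = 50e6            # up-front capital cost, $ (assumed)
      opex = 8e6              # annual operating cost, $/yr (assumed)
      production = 20e3       # annual silk output, kg/yr (assumed)
      rate, life = 0.10, 20   # discount rate and plant lifetime in years (assumed)

      def npv(price):
          """Net present value of the plant at a given sale price ($/kg)."""
          annual_cash = price * production - opex
          years = np.arange(1, life + 1)
          return -capex + np.sum(annual_cash / (1.0 + rate) ** years)

      msp = brentq(npv, 0.0, 1e4)   # minimum sale price is the zero crossing
      print(f"minimum sale price ~ ${msp:,.0f} per kg")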

  14. Heuristic decomposition for non-hierarchic systems

    NASA Technical Reports Server (NTRS)

    Bloebaum, Christina L.; Hajela, P.

    1991-01-01

    Design and optimization are substantially more complex in multidisciplinary and large-scale engineering applications due to the inherently coupled interactions among disciplines. The paper introduces a quasi-procedural methodology for multidisciplinary optimization that is applicable to nonhierarchic systems. The necessary decision-making support for the design process is provided by means of an embedded expert systems capability. The method employs a decomposition approach whose modularity allows for implementation of specialized methods for analysis and optimization within disciplines.

  15. jInv: A Modular and Scalable Framework for Electromagnetic Inverse Problems

    NASA Astrophysics Data System (ADS)

    Belliveau, P. T.; Haber, E.

    2016-12-01

    Inversion is a key tool in the interpretation of geophysical electromagnetic (EM) data. Three-dimensional (3D) EM inversion is very computationally expensive and practical software for inverting large 3D EM surveys must be able to take advantage of high performance computing (HPC) resources. It has traditionally been difficult to achieve those goals in a high level dynamic programming environment that allows rapid development and testing of new algorithms, which is important in a research setting. With those goals in mind, we have developed jInv, a framework for PDE constrained parameter estimation problems. jInv provides optimization and regularization routines, a framework for user defined forward problems, and interfaces to several direct and iterative solvers for sparse linear systems. The forward modeling framework provides finite volume discretizations of differential operators on rectangular tensor product meshes and tetrahedral unstructured meshes that can be used to easily construct forward modeling and sensitivity routines for forward problems described by partial differential equations. jInv is written in the emerging programming language Julia. Julia is a dynamic language targeted at the computational science community with a focus on high performance and native support for parallel programming. We have developed frequency and time-domain EM forward modeling and sensitivity routines for jInv. We will illustrate its capabilities and performance with two synthetic time-domain EM inversion examples. First, in airborne surveys, which use many sources, we achieve distributed memory parallelism by decoupling the forward and inverse meshes and performing forward modeling for each source on small, locally refined meshes. Secondly, we invert grounded source time-domain data from a gradient array style induced polarization survey using a novel time-stepping technique that allows us to compute data from different time-steps in parallel. These examples both show that it is possible to invert large scale 3D time-domain EM datasets within a modular, extensible framework written in a high-level, easy to use programming language.

  16. Multiple R&D projects scheduling optimization with improved particle swarm algorithm.

    PubMed

    Liu, Mengqi; Shan, Miyuan; Wu, Juan

    2014-01-01

    For most enterprises, in order to win the initiative in the fierce competition of the market, a key step is to improve their R&D ability so as to meet the various demands of customers more quickly and at lower cost. This paper discusses the features of multiple R&D environments in large make-to-order enterprises under constrained human resources and budget, and puts forward a multi-project scheduling model for a certain period. Furthermore, we make some improvements to the existing particle swarm algorithm and apply the version developed here to the resource-constrained multi-project scheduling model in a simulation experiment. The feasibility of the model and the validity of the algorithm are demonstrated in the experiment.
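
    The abstract does not detail the improvements made to the particle swarm algorithm, so the sketch below shows only the generic PSO core (inertia, cognitive, and social terms) minimizing a stand-in continuous objective; encoding particle positions as project schedules and enforcing resource and budget constraints would require additional, problem-specific machinery.

      # Generic particle swarm optimization core (a sketch, not the paper's
      # improved variant), minimizing the sphere function as a stand-in objective.
      import numpy as np

      rng = np.random.default_rng(0)

      def objective(x):
          return np.sum(x ** 2, axis=-1)   # placeholder for a schedule cost

      n_particles, dim, iters = 30, 5, 200
      w, c1, c2 = 0.72, 1.49, 1.49         # inertia and acceleration coefficients

      x = rng.uniform(-10, 10, (n_particles, dim))   # positions
      v = np.zeros_like(x)                           # velocities
      pbest, pbest_f = x.copy(), objective(x)        # personal bests
      gbest = pbest[np.argmin(pbest_f)].copy()       # global best

      for _ in range(iters):
          r1, r2 = rng.random((2, n_particles, dim))
          v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
          x = x + v
          f = objective(x)
          improved = f < pbest_f
          pbest[improved], pbest_f[improved] = x[improved], f[improved]
          gbest = pbest[np.argmin(pbest_f)].copy()

      print("best objective found:", objective(gbest))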

  17. Longitudinal in vivo evaluation of bone regeneration by combined measurement of multi-pinhole SPECT and micro-CT for tissue engineering

    NASA Astrophysics Data System (ADS)

    Lienemann, Philipp S.; Metzger, Stéphanie; Kiveliö, Anna-Sofia; Blanc, Alain; Papageorgiou, Panagiota; Astolfo, Alberto; Pinzer, Bernd R.; Cinelli, Paolo; Weber, Franz E.; Schibli, Roger; Béhé, Martin; Ehrbar, Martin

    2015-05-01

    Over the last decades, great strides have been made in the development of novel implants for the treatment of bone defects. The increasing versatility and complexity of these implant designs call for concurrent advances in the means to assess, in vivo, the course of induced bone formation in preclinical models. Since its introduction, micro-computed tomography (micro-CT) has excelled as a powerful high-resolution technique for non-invasive assessment of newly formed bone tissue. However, micro-CT fails to provide spatiotemporal information on the biological processes ongoing during bone regeneration. Conversely, owing to its versatile applicability and cost-effectiveness, single photon emission computed tomography (SPECT) is an ideal technique for assessing such biological processes with high sensitivity and, for nuclear imaging, comparably high resolution (<1 mm). Herein, we employ modularly designed poly(ethylene glycol)-based hydrogels that release bone morphogenetic protein to guide the healing of critical-sized calvarial bone defects. By combined in vivo longitudinal multi-pinhole SPECT and micro-CT evaluations we determine the spatiotemporal course of bone formation and remodeling within this synthetic hydrogel implant. End-point evaluations by high-resolution micro-CT and histological evaluation confirm the value of this approach to follow and optimize bone-inducing biomaterials.

  18. A Novel Multi-Sensor Environmental Perception Method Using Low-Rank Representation and a Particle Filter for Vehicle Reversing Safety

    PubMed Central

    Zhang, Zutao; Li, Yanjun; Wang, Fubing; Meng, Guanjun; Salman, Waleed; Saleem, Layth; Zhang, Xiaoliang; Wang, Chunbai; Hu, Guangdi; Liu, Yugang

    2016-01-01

    Environmental perception and information processing are two key steps of active safety for vehicle reversing. Single-sensor environmental perception cannot meet the need for vehicle reversing safety due to its low reliability. In this paper, we present a novel multi-sensor environmental perception method using low-rank representation and a particle filter for vehicle reversing safety. The proposed system consists of four main steps, namely multi-sensor environmental perception, information fusion, target recognition and tracking using low-rank representation and a particle filter, and vehicle reversing speed control modules. First of all, the multi-sensor environmental perception module, based on a binocular-camera system and ultrasonic range finders, obtains the distance data for obstacles behind the vehicle when the vehicle is reversing. Secondly, the information fusion algorithm using an adaptive Kalman filter is used to process the data obtained with the multi-sensor environmental perception module, which greatly improves the robustness of the sensors. Then the framework of a particle filter and low-rank representation is used to track the main obstacles. The low-rank representation is used to optimize an objective particle template that has the smallest L-1 norm. Finally, the electronic throttle opening and automatic braking are controlled by the proposed vehicle reversing control strategy prior to any potential collision, making the reversing control safer and more reliable. The final system simulation and practical testing results demonstrate the validity of the proposed multi-sensor environmental perception method using low-rank representation and a particle filter for vehicle reversing safety. PMID:27294931

  19. A Novel Multi-Sensor Environmental Perception Method Using Low-Rank Representation and a Particle Filter for Vehicle Reversing Safety.

    PubMed

    Zhang, Zutao; Li, Yanjun; Wang, Fubing; Meng, Guanjun; Salman, Waleed; Saleem, Layth; Zhang, Xiaoliang; Wang, Chunbai; Hu, Guangdi; Liu, Yugang

    2016-06-09

    Environmental perception and information processing are two key steps of active safety for vehicle reversing. Single-sensor environmental perception cannot meet the need for vehicle reversing safety due to its low reliability. In this paper, we present a novel multi-sensor environmental perception method using low-rank representation and a particle filter for vehicle reversing safety. The proposed system consists of four main steps, namely multi-sensor environmental perception, information fusion, target recognition and tracking using low-rank representation and a particle filter, and vehicle reversing speed control modules. First of all, the multi-sensor environmental perception module, based on a binocular-camera system and ultrasonic range finders, obtains the distance data for obstacles behind the vehicle when the vehicle is reversing. Secondly, the information fusion algorithm using an adaptive Kalman filter is used to process the data obtained with the multi-sensor environmental perception module, which greatly improves the robustness of the sensors. Then the framework of a particle filter and low-rank representation is used to track the main obstacles. The low-rank representation is used to optimize an objective particle template that has the smallest L-1 norm. Finally, the electronic throttle opening and automatic braking are controlled by the proposed vehicle reversing control strategy prior to any potential collision, making the reversing control safer and more reliable. The final system simulation and practical testing results demonstrate the validity of the proposed multi-sensor environmental perception method using low-rank representation and a particle filter for vehicle reversing safety.
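
    To make the information-fusion step concrete, the sketch below fuses two noisy range measurements (standing in for the ultrasonic and binocular-camera distances) with a plain linear Kalman filter on a constant-velocity model. The paper's adaptive Kalman filter and the low-rank particle tracker are not reproduced here, and all noise levels and motion parameters are assumed.

      # Toy Kalman-filter fusion of two range sensors on a constant-velocity model
      # (a simplified sketch; the paper uses an adaptive Kalman filter).
      import numpy as np

      dt = 0.1
      F = np.array([[1.0, dt], [0.0, 1.0]])    # state transition (range, range rate)
      H = np.array([[1.0, 0.0], [1.0, 0.0]])   # both sensors observe range only
      Q = 0.01 * np.eye(2)                     # process noise (assumed)
      R = np.diag([0.30**2, 0.10**2])          # ultrasonic vs. camera variance (assumed)

      x = np.array([5.0, -0.5])                # initial range [m] and range rate [m/s]
      P = np.eye(2)

      rng = np.random.default_rng(1)
      true_range = 5.0
      for _ in range(50):
          true_range += -0.5 * dt                          # obstacle closing at 0.5 m/s
          z = true_range + rng.normal(0.0, [0.30, 0.10])   # two noisy measurements

          x = F @ x                                        # predict
          P = F @ P @ F.T + Q
          S = H @ P @ H.T + R                              # update with both sensors
          K = P @ H.T @ np.linalg.inv(S)
          x = x + K @ (z - H @ x)
          P = (np.eye(2) - K @ H) @ P

      print(f"true range {true_range:.2f} m, fused estimate {x[0]:.2f} m")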

  20. Cardiac imaging with multi-sector data acquisition in volumetric CT: variation of effective temporal resolution and its potential clinical consequences

    NASA Astrophysics Data System (ADS)

    Tang, Xiangyang; Hsieh, Jiang; Taha, Basel H.; Vass, Melissa L.; Seamans, John L.; Okerlund, Darin R.

    2009-02-01

    With the increasing longitudinal detector dimension available in diagnostic volumetric CT, the step-and-shoot scan is becoming popular for cardiac imaging. In comparison to the helical scan, the step-and-shoot scan decouples patient table movement from cardiac gating/triggering, which facilitates cardiac imaging via multi-sector data acquisition, as well as the management of inter-cycle heart beat variation (arrhythmia) and radiation dose efficiency. Ideally, multi-sector data acquisition can improve temporal resolution by a factor equal to the number of sectors (best scenario). In reality, however, the effective temporal resolution is jointly determined by the gantry rotation speed and the patient's heart beat rate, and it may be significantly lower than the ideal value or show no improvement at all (worst scenario). Hence, it is clinically relevant to investigate the behavior of the effective temporal resolution in cardiac imaging with multi-sector data acquisition. In this study, a 5-second cine scan of a porcine heart, spanning 6 cardiac cycles, is acquired. In addition to theoretical analysis and a motion phantom study, the clinical consequences of the effective temporal resolution variation are evaluated qualitatively or quantitatively. By employing a 2-sector image reconstruction strategy, a total of 15 cases (the pairings of 2 of the 6 cardiac cycles) between the best and worst scenarios are studied, providing informative guidance for the design and optimization of cardiac imaging in volumetric CT with multi-sector data acquisition.

  1. Heterogeneously Assembled Metamaterials and Metadevices via 3D Modular Transfer Printing

    NASA Astrophysics Data System (ADS)

    Lee, Seungwoo; Kang, Byungsoo; Keum, Hohyun; Ahmed, Numair; Rogers, John A.; Ferreira, Placid M.; Kim, Seok; Min, Bumki

    2016-06-01

    Metamaterials have made the exotic control of the flow of electromagnetic waves possible, which is difficult to achieve with natural materials. In recent years, the emergence of functional metadevices has shown immense potential for the practical realization of highly efficient photonic devices. However, complex and heterogeneous architectures that enable diverse functionalities of metamaterials and metadevices have been challenging to realize because of the limited manufacturing capabilities of conventional fabrication methods. Here, we show that three-dimensional (3D) modular transfer printing can be used to construct diverse metamaterials in complex 3D architectures on universal substrates, which is attractive for achieving on-demand photonic properties. Few repetitive processing steps and rapid constructions are additional advantages of 3D modular transfer printing. Thus, this method provides a fascinating route to generate flexible and stretchable 2D/3D metamaterials and metadevices with heterogeneous material components, complex device architectures, and diverse functionalities.

  2. Heterogeneously Assembled Metamaterials and Metadevices via 3D Modular Transfer Printing.

    PubMed

    Lee, Seungwoo; Kang, Byungsoo; Keum, Hohyun; Ahmed, Numair; Rogers, John A; Ferreira, Placid M; Kim, Seok; Min, Bumki

    2016-06-10

    Metamaterials have made the exotic control of the flow of electromagnetic waves possible, which is difficult to achieve with natural materials. In recent years, the emergence of functional metadevices has shown immense potential for the practical realization of highly efficient photonic devices. However, complex and heterogeneous architectures that enable diverse functionalities of metamaterials and metadevices have been challenging to realize because of the limited manufacturing capabilities of conventional fabrication methods. Here, we show that three-dimensional (3D) modular transfer printing can be used to construct diverse metamaterials in complex 3D architectures on universal substrates, which is attractive for achieving on-demand photonic properties. Few repetitive processing steps and rapid constructions are additional advantages of 3D modular transfer printing. Thus, this method provides a fascinating route to generate flexible and stretchable 2D/3D metamaterials and metadevices with heterogeneous material components, complex device architectures, and diverse functionalities.

  3. Heterogeneously Assembled Metamaterials and Metadevices via 3D Modular Transfer Printing

    PubMed Central

    Lee, Seungwoo; Kang, Byungsoo; Keum, Hohyun; Ahmed, Numair; Rogers, John A.; Ferreira, Placid M.; Kim, Seok; Min, Bumki

    2016-01-01

    Metamaterials have made the exotic control of the flow of electromagnetic waves possible, which is difficult to achieve with natural materials. In recent years, the emergence of functional metadevices has shown immense potential for the practical realization of highly efficient photonic devices. However, complex and heterogeneous architectures that enable diverse functionalities of metamaterials and metadevices have been challenging to realize because of the limited manufacturing capabilities of conventional fabrication methods. Here, we show that three-dimensional (3D) modular transfer printing can be used to construct diverse metamaterials in complex 3D architectures on universal substrates, which is attractive for achieving on-demand photonic properties. Few repetitive processing steps and rapid constructions are additional advantages of 3D modular transfer printing. Thus, this method provides a fascinating route to generate flexible and stretchable 2D/3D metamaterials and metadevices with heterogeneous material components, complex device architectures, and diverse functionalities. PMID:27283594

  4. Modular Multi-Function Multi-Band Airborne Radio System (MFBARS). Volume II. Detailed Report.

    DTIC Science & Technology

    1981-06-01

    [Only indexing and table fragments of this record are recoverable.] Figure titles include "Three Platforms in a Field of Hyperbolic LOPs", "Comparison, MFBARS Versus Baseline", and "Program Flow Chart". The recoverable text describes the ability to configure, from a set of common modules, a given total CNI capability on specific platforms for a given mission, and a signal table listing GPS L-band spread-spectrum navigation, SEEK TALK UHF spread-spectrum communications, SINCGARS VHF frequency-hopping communications (some platforms), and AFSATCOM UHF.

  5. Multichannel microscale system for high throughput preparative separation with comprehensive collection and analysis

    DOEpatents

    Karger, Barry L.; Kotler, Lev; Foret, Frantisek; Minarik, Marek; Kleparnik, Karel

    2003-12-09

    A modular multiple lane or capillary electrophoresis (chromatography) system that permits automated parallel separation and comprehensive collection of all fractions from samples in all lanes or columns, with the option of further on-line automated sample fraction analysis, is disclosed. Preferably, fractions are collected in a multi-well fraction collection unit, or plate (40). The multi-well collection plate (40) is preferably made of a solvent permeable gel, most preferably a hydrophilic, polymeric gel such as agarose or cross-linked polyacrylamide.

  6. Economic feasibility and environmental impact of synthetic spider silk production from Escherichia coli.

    PubMed

    Edlund, Alan M; Jones, Justin; Lewis, Randolph; Quinn, Jason C

    2018-05-25

    Major ampullate spider silk represents a promising protein-based biomaterial with diverse commercial potential ranging from textiles to medical devices due to its excellent physical and thermal properties. Recent advancements in synthetic biology have facilitated the development of recombinant spider silk proteins from Escherichia coli (E. coli). This study specifically investigates the economic feasibility and environmental impact of synthetic spider silk manufacturing. Pilot-scale data were used to validate an engineering process model that includes all of the required sub-processing steps for synthetic fiber manufacture: production, harvesting, purification, drying, and spinning. Modeling was constructed modularly to support assessment of alternative downstream processing technologies. The techno-economic analysis indicates a minimum sale price from pioneer and optimized E. coli plants of $761 kg-1 and $23 kg-1, with greenhouse gas emissions of 572 kg CO2-eq. kg-1 and 55 kg CO2-eq. kg-1, respectively. Elevated costs and emissions from the pioneer plant can be directly tied to the high material consumption and low protein yield. Decreased production costs associated with the optimized plant include improved protein yield, process optimization, and an Nth-plant assumption. Discussion focuses on the commercial potential of spider silk, the production performance requirements for commercialization, and the impact of alternative technologies on the system.

  7. A microfluidic platform for precision small-volume sample processing and its use to size separate biological particles with an acoustic microdevice [Precision size separation of biological particles in small-volume samples by an acoustic microfluidic system]

    DOE PAGES

    Fong, Erika J.; Huang, Chao; Hamilton, Julie; ...

    2015-11-23

    Here, a major advantage of microfluidic devices is the ability to manipulate small sample volumes, thus reducing reagent waste and preserving precious sample. However, to achieve robust sample manipulation it is necessary to address device integration with the macroscale environment. To realize repeatable, sensitive particle separation with microfluidic devices, this protocol presents a complete automated and integrated microfluidic platform that enables precise processing of 0.15–1.5 ml samples using microfluidic devices. Important aspects of this system include modular device layout and robust fixtures resulting in reliable and flexible world to chip connections, and fully-automated fluid handling which accomplishes closed-loop sample collection, system cleaning and priming steps to ensure repeatable operation. Different microfluidic devices can be used interchangeably with this architecture. Here we incorporate an acoustofluidic device, detail its characterization, performance optimization, and demonstrate its use for size-separation of biological samples. By using real-time feedback during separation experiments, sample collection is optimized to conserve and concentrate sample. Although requiring the integration of multiple pieces of equipment, advantages of this architecture include the ability to process unknown samples with no additional system optimization, ease of device replacement, and precise, robust sample processing.

  8. An Optimal Method for Detecting Internal and External Intrusion in MANET

    NASA Astrophysics Data System (ADS)

    Rafsanjani, Marjan Kuchaki; Aliahmadipour, Laya; Javidi, Mohammad M.

    A Mobile Ad hoc Network (MANET) is formed by a set of mobile hosts which communicate among themselves through radio waves. The hosts establish the infrastructure and cooperate to forward data in a multi-hop fashion without a central administration. Due to their communication type and resource constraints, MANETs are vulnerable to diverse types of attacks and intrusions. In this paper, we propose a method for preventing internal intrusion and detecting external intrusion in a mobile ad hoc network by using game theory. One solution for reducing the resource consumption of external intrusion detection is to elect a leader for each cluster to provide intrusion detection service to the other nodes in its cluster; we call this the moderate mode. The moderate mode is suitable only when the probability of attack is low. Once the probability of attack is high, victim nodes should launch their own IDS to detect and thwart intrusions; we call this the robust mode. The leader should not be a malicious or selfish node and must detect external intrusion in its cluster with minimum cost. Our proposed method has three steps: the first step builds trust relationships between nodes and estimates a trust value for each node to prevent internal intrusion; in the second step, we propose an optimal method for leader election using the trust values; and in the third step, we find the threshold value for notifying the victim node to launch its IDS once the probability of attack exceeds that value. In the first and third steps we apply Bayesian game theory. Owing to the use of game theory, trust values, and an honest leader, our method can effectively improve network security and performance while reducing resource consumption.
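
    As a toy illustration of the second step (trust-based leader election), the sketch below elects, within each cluster, the node with the best trust-to-cost ratio while excluding nodes whose trust falls below a threshold. The node names, trust values, costs, and the simple score are assumptions for illustration only; the paper derives its thresholds and incentives from a Bayesian game.

      # Toy trust-weighted leader election per cluster (illustrative only).
      TRUST_MIN = 0.5   # nodes below this trust are treated as potentially selfish/malicious

      clusters = {
          "c1": {"n1": {"trust": 0.9, "cost": 2.0},
                 "n2": {"trust": 0.8, "cost": 1.0},
                 "n3": {"trust": 0.4, "cost": 0.5}},   # n3 excluded: low trust
          "c2": {"n4": {"trust": 0.7, "cost": 1.5},
                 "n5": {"trust": 0.95, "cost": 3.0}},
      }

      def elect_leader(members):
          """Return the eligible node with the best trust-to-cost ratio."""
          eligible = {n: m for n, m in members.items() if m["trust"] >= TRUST_MIN}
          if not eligible:
              return None   # fall back to robust mode: every node runs its own IDS
          return max(eligible, key=lambda n: eligible[n]["trust"] / eligible[n]["cost"])

      for cid, members in clusters.items():
          print(cid, "->", elect_leader(members))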

  9. Model-free learning on robot kinematic chains using a nested multi-agent topology

    NASA Astrophysics Data System (ADS)

    Karigiannis, John N.; Tzafestas, Costas S.

    2016-11-01

    This paper proposes a model-free learning scheme for the developmental acquisition of robot kinematic control and dexterous manipulation skills. The approach is based on a nested-hierarchical multi-agent architecture that intuitively encapsulates the topology of robot kinematic chains, where the activity of each independent degree-of-freedom (DOF) is finally mapped onto a distinct agent. Each one of those agents progressively evolves a local kinematic control strategy in a game-theoretic sense, that is, based on a partial (local) view of the whole system topology, which is incrementally updated through a recursive communication process according to the nested-hierarchical topology. Learning is thus approached not through demonstration and training but through an autonomous self-exploration process. A fuzzy reinforcement learning scheme is employed within each agent to enable efficient exploration in a continuous state-action domain. This paper constitutes in fact a proof of concept, demonstrating that global dexterous manipulation skills can indeed evolve through such a distributed iterative learning of local agent sensorimotor mappings. The main motivation behind the development of such an incremental multi-agent topology is to enhance system modularity, to facilitate extensibility to more complex problem domains and to improve robustness with respect to structural variations including unpredictable internal failures. These attributes of the proposed system are assessed in this paper through numerical experiments in different robot manipulation task scenarios, involving both single and multi-robot kinematic chains. The generalisation capacity of the learning scheme is experimentally assessed and robustness properties of the multi-agent system are also evaluated with respect to unpredictable variations in the kinematic topology. Furthermore, these numerical experiments demonstrate the scalability properties of the proposed nested-hierarchical architecture, where new agents can be recursively added in the hierarchy to encapsulate individual active DOFs. The results presented in this paper demonstrate the feasibility of such a distributed multi-agent control framework, showing that the solutions which emerge are plausible and near-optimal. Numerical efficiency and computational cost issues are also discussed.

  10. Algorithm for parametric community detection in networks.

    PubMed

    Bettinelli, Andrea; Hansen, Pierre; Liberti, Leo

    2012-07-01

    Modularity maximization is extensively used to detect communities in complex networks. It has been shown, however, that this method suffers from a resolution limit: Small communities may be undetectable in the presence of larger ones even if they are very dense. To alleviate this defect, various modifications of the modularity function have been proposed as well as multiresolution methods. In this paper we systematically study a simple model (proposed by Pons and Latapy [Theor. Comput. Sci. 412, 892 (2011)] and similar to the parametric model of Reichardt and Bornholdt [Phys. Rev. E 74, 016110 (2006)]) with a single parameter α that balances the fraction of within community edges and the expected fraction of edges according to the configuration model. An exact algorithm is proposed to find optimal solutions for all values of α as well as the corresponding successive intervals of α values for which they are optimal. This algorithm relies upon a routine for exact modularity maximization and is limited to moderate size instances. An agglomerative hierarchical heuristic is therefore proposed to address parametric modularity detection in large networks. At each iteration the smallest value of α for which it is worthwhile to merge two communities of the current partition is found. Then merging is performed and the data are updated accordingly. An implementation is proposed with the same time and space complexity as the well-known Clauset-Newman-Moore (CNM) heuristic [Phys. Rev. E 70, 066111 (2004)]. Experimental results on artificial and real world problems show that (i) communities are detected by both exact and heuristic methods for all values of the parameter α; (ii) the dendrogram summarizing the results of the heuristic method provides a useful tool for substantive analysis, as illustrated particularly on a Les Misérables data set; (iii) the difference between the parametric modularity values given by the exact method and those given by the heuristic is moderate; (iv) the heuristic version of the proposed parametric method, viewed as a modularity maximization tool, gives better results than the CNM heuristic for large instances.
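
    To make the parametric quality function concrete, the sketch below evaluates a resolution-style parametric modularity, Q(alpha) = sum over communities of [ e_c/m - alpha*(d_c/2m)^2 ], for one partition of the Zachary karate-club graph and sweeps alpha. This particular functional form is an assumption in the spirit of the Reichardt-Bornholdt model rather than the exact Pons-Latapy parameterization used in the paper, and the partition comes from a generic greedy (CNM-style) heuristic.

      # Sweep a resolution-style parametric modularity over alpha for one partition.
      import networkx as nx
      from networkx.algorithms.community import greedy_modularity_communities

      def parametric_modularity(G, communities, alpha):
          """Q(alpha) = sum_c [ e_c/m - alpha * (d_c / 2m)^2 ]."""
          m = G.number_of_edges()
          deg = dict(G.degree())
          Q = 0.0
          for c in communities:
              c = set(c)
              e_c = sum(1 for u, v in G.edges(c) if u in c and v in c)  # intra-community edges
              d_c = sum(deg[n] for n in c)                              # total degree of the community
              Q += e_c / m - alpha * (d_c / (2.0 * m)) ** 2
          return Q

      G = nx.karate_club_graph()
      partition = greedy_modularity_communities(G)   # CNM-style agglomerative heuristic

      for alpha in (0.5, 1.0, 2.0, 4.0):
          print(f"alpha={alpha:3.1f}  Q(alpha)={parametric_modularity(G, partition, alpha):+.3f}")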

  11. Multi-Objective Control Optimization for Greenhouse Environment Using Evolutionary Algorithms

    PubMed Central

    Hu, Haigen; Xu, Lihong; Wei, Ruihua; Zhu, Bingkun

    2011-01-01

    This paper investigates the issue of tuning the Proportional Integral and Derivative (PID) controller parameters for a greenhouse climate control system using an Evolutionary Algorithm (EA) based on multiple performance measures such as good static-dynamic performance specifications and the smooth process of control. A model of nonlinear thermodynamic laws between numerous system variables affecting the greenhouse climate is formulated. The proposed tuning scheme is tested for greenhouse climate control by minimizing the integrated time square error (ITSE) and the control increment or rate in a simulation experiment. The results show that by tuning the gain parameters the controllers can achieve good control performance through step responses such as small overshoot, fast settling time, and less rise time and steady-state error. Moreover, the approach can be applied to tuning systems with different properties, such as strong interactions among variables, nonlinearities and conflicting performance criteria. The results indicate that multi-objective optimization algorithms provide an effective and promising tuning method for complex greenhouse production. PMID:22163927
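
    As a minimal illustration of the performance measures being traded off, the sketch below simulates a discrete PID loop around an assumed first-order plant and reports the integrated time square error (ITSE) of the step response together with the summed squared control increments. The plant, gains, and sampling time are placeholders, and the evolutionary tuning itself is not reproduced.

      # Evaluate ITSE and control-increment effort for one PID gain set on a toy
      # first-order plant (illustrative; the paper tunes the gains with an EA).
      def evaluate_pid(Kp, Ki, Kd, dt=0.05, T=20.0, tau=2.0, gain=1.0, setpoint=1.0):
          y = 0.0                                   # plant output
          integral, prev_err, prev_u = 0.0, setpoint, 0.0
          itse, du2 = 0.0, 0.0
          for k in range(int(T / dt)):
              t = k * dt
              err = setpoint - y
              integral += err * dt
              deriv = (err - prev_err) / dt
              u = Kp * err + Ki * integral + Kd * deriv
              itse += t * err**2 * dt               # integrated time square error
              du2 += (u - prev_u) ** 2              # smoothness-of-control penalty
              y += dt * (-y + gain * u) / tau       # first-order plant step
              prev_err, prev_u = err, u
          return itse, du2

      for gains in [(1.0, 0.2, 0.1), (3.0, 0.8, 0.2)]:
          itse, du2 = evaluate_pid(*gains)
          print(f"Kp,Ki,Kd={gains}  ITSE={itse:.3f}  sum(du^2)={du2:.3f}")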

  12. Automated microfluidic platform for systematic studies of colloidal perovskite nanocrystals: towards continuous nano-manufacturing.

    PubMed

    Epps, Robert W; Felton, Kobi C; Coley, Connor W; Abolhasani, Milad

    2017-11-21

    Colloidal organic/inorganic metal-halide perovskite nanocrystals have recently emerged as a potential low-cost replacement for the semiconductor materials in commercial photovoltaics and light emitting diodes. However, unlike III-V and IV-VI semiconductor nanocrystals, studies of colloidal perovskite nanocrystals have yet to develop a fundamental and comprehensive understanding of nucleation and growth kinetics. Here, we introduce a modular and automated microfluidic platform for the systematic studies of room-temperature synthesized cesium-lead halide perovskite nanocrystals. With abundant data collection across the entirety of a four orders of magnitude reaction time span, we comprehensively characterize nanocrystal growth within a modular microfluidic reactor. The developed high-throughput screening platform features a custom-designed three-port flow cell with translational capability for in situ spectral characterization of the in-flow synthesized perovskite nanocrystals along a tubular microreactor with an adjustable length, ranging from 3 cm to 196 cm. The translational flow cell allows for sampling of twenty unique residence times at a single equilibrated flow rate. The developed technique requires an average total liquid consumption of 20 μL per spectrum and as little as 2 μL at the time of sampling. It may continuously sample up to 30 000 unique spectra per day in both single and multi-phase flow formats. Using the developed plug-and-play microfluidic platform, we study the growth of cesium lead trihalide perovskite nanocrystals through in situ monitoring of their absorption and emission band-gaps at residence times ranging from 100 ms to 17 min. The automated microfluidic platform enables a systematic study of the effect of mixing enhancement on the quality of the synthesized nanocrystals through a direct comparison between single- and multi-phase flow systems at similar reaction time scales. The improved mixing characteristics of the multi-phase flow format result in high-quality perovskite nanocrystals with kinetically tunable emission wavelength, ranging as much as 25 nm at equivalent residence times. Further application of this unique platform would allow rapid parameter optimization in the colloidal synthesis of a wide range of nanomaterials (e.g., metal or semiconductor), which is directly transferable to continuous manufacturing in a numbered-up platform with a similar characteristic length scale.

  13. Distributed learning automata-based algorithm for community detection in complex networks

    NASA Astrophysics Data System (ADS)

    Khomami, Mohammad Mehdi Daliri; Rezvanian, Alireza; Meybodi, Mohammad Reza

    2016-03-01

    Community structure is an important and universal topological property of many complex networks such as social and information networks. The detection of the communities of a network is a significant technique for understanding the structure and function of networks. In this paper, we propose an algorithm based on distributed learning automata for community detection (DLACD) in complex networks. In the proposed algorithm, each vertex of the network is equipped with a learning automaton. Through cooperation among the network of learning automata and updating of each automaton's action probabilities, the algorithm iteratively tries to identify high-density local communities. The performance of the proposed algorithm is investigated through a number of simulations on popular synthetic and real networks. Experimental results, in comparison with popular community detection algorithms such as Walktrap, Danon greedy optimization, fuzzy community detection, multi-resolution community detection and label propagation, demonstrated the superiority of DLACD in terms of modularity, NMI, performance, min-max-cut and coverage.

  14. Predictor - Predictive Reaction Design via Informatics, Computation and Theories of Reactivity

    DTIC Science & Technology

    2017-10-10

    [Only fragments of this record are recoverable.] The PREDICTOR program (Predictive REaction Design via Informatics, Computation and Theories of Reactivity) concerns converting […] into more complex and valuable molecules, an effort limited by (1) the extensive time it takes to design and optimize a synthesis and (2) multi-step […]. The recoverable text also notes that, as the system is fully compatible with the industry-standard SQL, designing a server-based system at a later time will be trivial, and mentions producing a JAVA front […]. The stated goal of this program was to create a cyber[…].

  15. Multi-model groundwater-management optimization: reconciling disparate conceptual models

    NASA Astrophysics Data System (ADS)

    Timani, Bassel; Peralta, Richard

    2015-09-01

    Disagreement among policymakers often involves policy issues and differences between the decision makers' implicit utility functions. Significant disagreement can also exist concerning conceptual models of the physical system. Disagreement on the validity of a single simulation model delays discussion of policy issues and prevents the adoption of consensus management strategies. For such a contentious situation, the proposed multi-conceptual model optimization (MCMO) can help stakeholders reach a compromise strategy. MCMO computes mathematically optimal strategies that simultaneously satisfy analogous constraints and bounds in multiple numerical models that differ in boundary conditions, hydrogeologic stratigraphy, and discretization. Shadow prices and trade-offs guide the process of refining the first MCMO-developed multi-model strategy into a realistic compromise management strategy. By employing automated cycling, MCMO is practical for linear and nonlinear aquifer systems. In this reconnaissance study, MCMO application to the multilayer Cache Valley (Utah and Idaho, USA) river-aquifer system employs two simulation models with analogous background conditions but different vertical discretization and boundary conditions. The objective is to maximize additional safe pumping (beyond current pumping), subject to constraints on groundwater head and seepage from the aquifer to surface waters. MCMO application reveals that, in order to protect the local ecosystem, increased groundwater pumping can satisfy only 40% of the projected increase in water demand. To explore the possibility of increasing that pumping while protecting the ecosystem, MCMO clearly identifies localities requiring additional field data. MCMO is applicable to other areas and optimization problems than those addressed here. Steps to prepare comparable sub-models for MCMO use are area-dependent.
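
    The core idea of MCMO, satisfying analogous constraints in several conceptual models at once, can be sketched as a small linear program: maximize total additional pumping at two wells subject to drawdown limits evaluated with two different (hypothetical) linear response matrices, one per conceptual model. The response coefficients, bounds, and well count below are invented purely for illustration.

      # Toy multi-conceptual-model optimization: maximize added pumping at two wells
      # while keeping predicted drawdown within limits in BOTH conceptual models.
      import numpy as np
      from scipy.optimize import linprog

      # drawdown (m) per unit of added pumping at two control points, per model (assumed)
      A_model1 = np.array([[0.8, 0.3],
                           [0.2, 0.9]])
      A_model2 = np.array([[1.1, 0.4],
                           [0.3, 0.7]])
      max_drawdown = np.array([2.0, 2.0])      # allowed drawdown at each control point

      c = [-1.0, -1.0]                         # linprog minimizes, so negate total pumping
      A_ub = np.vstack([A_model1, A_model2])   # constraints from both models simultaneously
      b_ub = np.concatenate([max_drawdown, max_drawdown])
      bounds = [(0.0, 5.0), (0.0, 5.0)]        # per-well pumping limits

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
      print("optimal added pumping per well:", np.round(res.x, 3))
      print("total added pumping:", round(-res.fun, 3))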

  16. Multiple Integrated Examinations: An Observational Study of Different Academic Curricula Based on a Business Administration Assessment

    ERIC Educational Resources Information Center

    Ardolino, Piermatteo; Noventa, Stefano; Formicuzzi, Maddalena; Cubico, Serena; Favretto, Giuseppe

    2016-01-01

    An observational study has been carried out to analyse differences in performance between students of different undergraduate curricula in the same written business administration examination, focusing particularly on possible effects of "integrated" or "multi-modular" examinations, a recently widespread format in Italian…

  17. Multicycle rapid thermal annealing optimization of Mg-implanted GaN: Evolution of surface, optical, and structural properties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenlee, Jordan D., E-mail: jordan.greenlee.ctr@nrl.navy.mil; Feigelson, Boris N.; Anderson, Travis J.

    2014-08-14

    The first step of a multi-cycle rapid thermal annealing process was systematically studied. The surface, structure, and optical properties of Mg-implanted GaN thin films annealed at temperatures ranging from 900 to 1200 °C were investigated by Raman spectroscopy, photoluminescence, UV-visible spectroscopy, atomic force microscopy, and Nomarski microscopy. The GaN thin films are capped with two layers of in-situ metal organic chemical vapor deposition-grown AlN and annealed in 24 bar of N2 overpressure to avoid GaN decomposition. The crystal quality of the GaN improves with increasing annealing temperature, as confirmed by UV-visible spectroscopy and the full widths at half maximum of the E2 and A1 (LO) Raman modes. The crystal quality of films annealed above 1100 °C exceeds the quality of the as-grown films. At 1200 °C, Mg is optically activated, as determined by photoluminescence measurements. However, at 1200 °C the GaN begins to decompose, as evidenced by pit formation on the surface of the samples. Therefore, it was determined that the first step in a multi-cycle rapid thermal anneal process should be conducted at 1150 °C due to crystal quality and surface morphology considerations.

  18. A ruthenium dimer complex with a flexible linker slowly threads between DNA bases in two distinct steps.

    PubMed

    Bahira, Meriem; McCauley, Micah J; Almaqwashi, Ali A; Lincoln, Per; Westerlund, Fredrik; Rouzina, Ioulia; Williams, Mark C

    2015-10-15

    Several multi-component DNA intercalating small molecules have been designed around ruthenium-based intercalating monomers to optimize DNA binding properties for therapeutic use. Here we probe the DNA binding ligand [μ-C4(cpdppz)2(phen)4Ru2](4+), which consists of two Ru(phen)2dppz(2+) moieties joined by a flexible linker. To quantify ligand binding, double-stranded DNA is stretched with optical tweezers and exposed to ligand under constant applied force. In contrast to other bis-intercalators, we find that ligand association is described by a two-step process, which consists of fast bimolecular intercalation of the first dppz moiety followed by ∼10-fold slower intercalation of the second dppz moiety. The second step is rate-limited by the requirement for a DNA-ligand conformational change that allows the flexible linker to pass through the DNA duplex. Based on our measured force-dependent binding rates and ligand-induced DNA elongation measurements, we are able to map out the energy landscape and structural dynamics for both ligand binding steps. In addition, we find that at zero force the overall binding process involves fast association (∼10 s), slow dissociation (∼300 s), and very high affinity (Kd ∼10 nM). The methodology developed in this work will be useful for studying the mechanism of DNA binding by other multi-step intercalating ligands and proteins.

  19. An optimized routing algorithm for the automated assembly of standard multimode ribbon fibers in a full-mesh optical backplane

    NASA Astrophysics Data System (ADS)

    Basile, Vito; Guadagno, Gianluca; Ferrario, Maddalena; Fassi, Irene

    2018-03-01

    In this paper a parametric, modular and scalable algorithm allowing a fully automated assembly of a backplane fiber-optic interconnection circuit is presented. This approach guarantees the optimization of the optical fiber routing inside the backplane with respect to specific criteria (e.g., bending power losses), addressing both transmission performance and overall cost issues. Graph theory has been exploited to simplify the complexity of the NxN full-mesh backplane interconnection topology, first into N independent sub-circuits and then, recursively, into a limited number of loops that are easier to generate. Afterwards, the proposed algorithm selects a set of geometrical and architectural parameters whose optimization allows the identification of the optimal fiber-optic routing for each sub-circuit of the backplane. The topological and numerical information provided by the algorithm is then exploited to control a robot which performs the automated assembly of the backplane sub-circuits. The proposed routing algorithm can be extended to any array architecture and number of connections thanks to its modularity and scalability. Finally, the algorithm has been exploited for the automated assembly of an 8x8 optical backplane realized with standard multimode (MM) 12-fiber ribbons.

  20. Automated multi-plug filtration cleanup for liquid chromatographic-tandem mass spectrometric pesticide multi-residue analysis in representative crop commodities.

    PubMed

    Qin, Yuhong; Zhang, Jingru; Zhang, Yuan; Li, Fangbing; Han, Yongtao; Zou, Nan; Xu, Haowei; Qian, Meiyuan; Pan, Canping

    2016-09-02

    An automated multi-plug filtration cleanup (m-PFC) method for modified QuEChERS (quick, easy, cheap, effective, rugged, and safe) extracts was developed. The automated device is intended to reduce the labor-intensive manual workload in the cleanup steps, and it accurately controls the volume and speed of the pulling and pushing cycles. In this work, m-PFC was based on multi-walled carbon nanotubes (MWCNTs) mixed with other sorbents and anhydrous magnesium sulfate (MgSO4) in a packed tip for pesticide multi-residue analysis in crop commodities, followed by liquid chromatography with tandem mass spectrometric (LC-MS/MS) detection. It was validated by analyzing 25 pesticides in six representative matrices spiked at two concentration levels of 10 and 100 μg/kg. Salts, sorbents, the m-PFC procedure, automated pulling and pushing volume, automated pulling speed, and pushing speed were optimized for each matrix. After optimization, two general automated m-PFC methods were introduced for relatively simple (apple, citrus fruit, peanut) and relatively complex (spinach, leek, green tea) matrices. Spike recoveries were between 83 and 108%, with 1-14% RSD, for most analytes in the tested matrices. Matrix-matched calibrations were performed with coefficients of determination >0.997 between concentration levels of 10 and 1000 μg/kg. The developed method was successfully applied to the determination of pesticide residues in market samples.

  1. Optimization of multi-objective micro-grid based on improved particle swarm optimization algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, Jian; Gan, Yang

    2018-04-01

    The paper presents a multi-objective optimal configuration model for an independent micro-grid with the aims of economy and environmental protection. The Pareto solution set is obtained by solving the multi-objective optimization configuration model of the micro-grid with an improved particle swarm algorithm. The feasibility of the improved particle swarm optimization algorithm for the multi-objective model is verified, which provides a useful reference for the multi-objective optimization of independent micro-grids.
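
    Once a population of candidate micro-grid configurations has been evaluated on the two objectives (say, cost and emissions, both to be minimized), the Pareto solution set is simply the non-dominated subset. The sketch below applies a plain non-dominated filter to hypothetical objective values; the improved PSO that would generate the candidates is not reproduced.

      # Extract the Pareto (non-dominated) set from (cost, emissions) pairs, both minimized.
      # The candidate values are hypothetical placeholders.
      def pareto_front(points):
          front = []
          for i, p in enumerate(points):
              dominated = any(
                  all(q[k] <= p[k] for k in range(len(p))) and q != p
                  for j, q in enumerate(points) if j != i
              )
              if not dominated:
                  front.append(p)
          return front

      candidates = [(120.0, 8.5), (100.0, 9.0), (95.0, 12.0), (130.0, 7.0), (125.0, 9.5)]
      print(pareto_front(candidates))   # (125.0, 9.5) is dominated and dropped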

  2. Stored program concept for analog computers

    NASA Technical Reports Server (NTRS)

    Hannauer, G., III; Patmore, J. R.

    1971-01-01

    Optimization of three-stage matrices, modularization, and black-box design techniques provides for automatically interconnecting computing component inputs and outputs in a general-purpose analog computer. The design also produces a relatively inexpensive and less complex automatic patching system.

  3. An overview of instrumentation for the Large Binocular Telescope

    NASA Astrophysics Data System (ADS)

    Wagner, R. Mark

    2010-07-01

    An overview of instrumentation for the Large Binocular Telescope is presented. Optical instrumentation includes the Large Binocular Camera (LBC), a pair of wide-field (27 × 27) mosaic CCD imagers at the prime focus, and the Multi-Object Double Spectrograph (MODS), a pair of dual-beam blue-red optimized long-slit spectrographs mounted at the straight-through F/15 Gregorian focus incorporating multiple slit masks for multi-object spectroscopy over a 6 field and spectral resolutions of up to 8000. Infrared instrumentation includes the LBT Near-IR Spectroscopic Utility with Camera and Integral Field Unit for Extragalactic Research (LUCIFER), a modular near-infrared (0.9-2.5 μm) imager and spectrograph pair mounted at a bent interior focal station and designed for seeing-limited (FOV: 4 × 4) imaging, long-slit spectroscopy, and multi-object spectroscopy utilizing cooled slit masks and diffraction limited (FOV: 0.5 × 0.5) imaging and long-slit spectroscopy. Strategic instruments under development for the remaining two combined focal stations include an interferometric cryogenic beam combiner with near-infrared and thermal-infrared instruments for Fizeau imaging and nulling interferometry (LBTI) and an optical bench near-infrared beam combiner utilizing multi-conjugate adaptive optics for high angular resolution and sensitivity (LINC-NIRVANA). In addition, a fiber-fed bench spectrograph (PEPSI) capable of ultra high resolution spectroscopy and spectropolarimetry (R = 40,000-300,000) will be available as a principal investigator instrument. The availability of all these instruments mounted simultaneously on the LBT permits unique science, flexible scheduling, and improved operational support. Over the past two years the LBC and the first LUCIFER instrument have been brought into routine scientific operation and MODS1 commissioning is set to begin in the fall of 2010.

  4. Multi Objective Controller Design for Linear System via Optimal Interpolation

    NASA Technical Reports Server (NTRS)

    Ozbay, Hitay

    1996-01-01

    We propose a methodology for the design of a controller which satisfies a set of closed-loop objectives simultaneously. The set of objectives consists of: (1) pole placement, (2) decoupled command tracking of step inputs at steady-state, and (3) minimization of step response transients with respect to envelope specifications. We first obtain a characterization of all controllers placing the closed-loop poles in a prescribed region of the complex plane. In this characterization, the free parameter matrix Q(s) is to be determined to attain objectives (2) and (3). Objective (2) is expressed as determining a Pareto optimal solution to a vector valued optimization problem. The solution of this problem is obtained by transforming it to a scalar convex optimization problem. This solution determines Q(0) and the remaining freedom in choosing Q(s) is used to satisfy objective (3). We write Q(s) = (1/v(s))bar-Q(s) for a prescribed polynomial v(s). Bar-Q(s) is a polynomial matrix which is arbitrary except that Q(0) and the order of bar-Q(s) are fixed. Obeying these constraints bar-Q(s) is now to be 'shaped' to minimize the step response characteristics of specific input/output pairs according to the maximum envelope violations. This problem is expressed as a vector valued optimization problem using the concept of Pareto optimality. We then investigate a scalar optimization problem associated with this vector valued problem and show that it is convex. The organization of the report is as follows. The next section includes some definitions and preliminary lemmas. We then give the problem statement which is followed by a section including a detailed development of the design procedure. We then consider an aircraft control example. The last section gives some concluding remarks. The Appendix includes the proofs of technical lemmas, printouts of computer programs, and figures.

  5. Manual of phosphoric acid fuel cell power plant optimization model and computer program

    NASA Technical Reports Server (NTRS)

    Lu, C. Y.; Alkasab, K. A.

    1984-01-01

    An optimized cost and performance model for a phosphoric acid fuel cell power plant system was derived and developed into a modular FORTRAN computer code. Cost, energy, mass, and electrochemical analyses were combined to develop a mathematical model for optimizing the steam-to-methane ratio in the reformer, the hydrogen utilization in the PAFC, and the plates per stack. The nonlinear programming code, COMPUTE, was used to solve this model, in which the method of a mixed penalty function combined with Hooke and Jeeves pattern search was chosen to evaluate this specific optimization problem.
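
    The record names the numerical approach, a mixed penalty function combined with Hooke and Jeeves pattern search, so the sketch below shows a compact pattern-search loop of that family applied to a toy constrained minimum. The objective, constraint, and penalty weight are placeholders, and the original FORTRAN code (COMPUTE) is not reproduced.

      # Simplified Hooke-and-Jeeves-style pattern search with a quadratic penalty,
      # applied to a toy constrained problem (placeholders throughout).
      import numpy as np

      def penalized(x, mu=100.0):
          f = (x[0] - 3.0) ** 2 + (x[1] - 2.0) ** 2    # stand-in "plant cost"
          g = x[0] + x[1] - 4.0                        # constraint g(x) <= 0
          return f + mu * max(0.0, g) ** 2

      def pattern_search(f, x0, step=1.0, shrink=0.5, tol=1e-6):
          x = np.asarray(x0, dtype=float)
          fx = f(x)
          while step > tol:
              best_x, best_f = x.copy(), fx            # exploratory moves per coordinate
              for i in range(len(x)):
                  for d in (step, -step):
                      trial = best_x.copy()
                      trial[i] += d
                      if f(trial) < best_f:
                          best_x, best_f = trial, f(trial)
                          break
              if best_f < fx:
                  pattern = best_x + (best_x - x)      # pattern move along improving direction
                  if f(pattern) < best_f:
                      best_x, best_f = pattern, f(pattern)
                  x, fx = best_x, best_f
              else:
                  step *= shrink                       # no improvement: refine the step
          return x, fx

      x_opt, f_opt = pattern_search(penalized, [0.0, 0.0])
      print("x* ~", np.round(x_opt, 3), " penalized objective ~", round(f_opt, 4))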

  6. A Modular Aerospike Engine Design Using Additive Manufacturing

    NASA Technical Reports Server (NTRS)

    Peugeot, John; Garcia, Chance; Burkhardt, Wendel

    2014-01-01

    A modular aerospike engine concept has been developed with the objective of demonstrating the viability of the aerospike design using additive manufacturing techniques. The aerospike system is a self-compensating design that allows for optimal performance over the entire flight regime and allows for the lowest possible mass vehicle designs. At low altitudes, improvements in Isp can be traded against chamber pressure, staging, and payload. In upper stage applications, expansion ratio and engine envelope can be traded against nozzle efficiency. These features provide flexibility to the System Designer optimizing a complete vehicle stage. The aerospike concept is a good example of a component that has demonstrated improved performance capability, but traditionally has manufacturing requirements that are too expensive and complex to use in a production vehicle. In recent years, additive manufacturing has emerged as a potential method for improving the speed and cost of building geometrically complex components in rocket engines. It offers a reduction in tooling overhead and significant improvements in the integration of the designer and manufacturing method. In addition, the modularity of the engine design provides the ability to perform full scale testing on the combustion devices outside of the full engine configuration. The proposed design uses a hydrocarbon based gas-generator cycle, with plans to take advantage of existing powerhead hardware while focusing DDT&E resources on manufacturing and sub-system testing of the combustion devices. The major risks for the modular aerospike concept lie in the performance of the propellant feed system, the structural integrity of the additive manufactured components, and the aerodynamic efficiency of the exhaust flow.

  7. A novel method for a multi-level hierarchical composite with brick-and-mortar structure

    PubMed Central

    Brandt, Kristina; Wolff, Michael F. H.; Salikov, Vitalij; Heinrich, Stefan; Schneider, Gerold A.

    2013-01-01

    The fascination for hierarchically structured hard tissues such as enamel or nacre arises from their unique structure-properties relationship. Over the last decades, this has motivated numerous syntheses of composites mimicking the brick-and-mortar structure of nacre. However, there is still a lack of synthetic engineering materials displaying a true hierarchical structure. Here, we present a novel multi-step processing route for anisotropic 2-level hierarchical composites by combining different coating techniques on different length scales. It comprises polymer-encapsulated ceramic particles as building blocks for the first level, followed by spouted bed spray granulation for a second level, and finally directional hot pressing to anisotropically consolidate the composite. The microstructure achieved reveals a brick-and-mortar hierarchical structure with distinct, though not yet optimized, mechanical properties on each level. It opens up a completely new processing route for the synthesis of multi-level hierarchically structured composites, offering prospects for multi-functional structure-property relationships. PMID:23900554

  8. A novel method for a multi-level hierarchical composite with brick-and-mortar structure.

    PubMed

    Brandt, Kristina; Wolff, Michael F H; Salikov, Vitalij; Heinrich, Stefan; Schneider, Gerold A

    2013-01-01

    The fascination with hierarchically structured hard tissues such as enamel or nacre arises from their unique structure-property relationship. Over the last decades this has motivated numerous syntheses of composites mimicking the brick-and-mortar structure of nacre. However, there is still a lack of synthetic engineering materials displaying a true hierarchical structure. Here, we present a novel multi-step processing route for anisotropic 2-level hierarchical composites that combines different coating techniques on different length scales. It comprises polymer-encapsulated ceramic particles as building blocks for the first level, followed by spouted bed spray granulation for a second level, and finally directional hot pressing to anisotropically consolidate the composite. The resulting microstructure reveals a brick-and-mortar hierarchical structure with distinct, though not yet optimized, mechanical properties on each level. This opens up a completely new processing route for the synthesis of multi-level hierarchically structured composites, giving prospects of multi-functional structure-property relationships.

  9. A novel method for a multi-level hierarchical composite with brick-and-mortar structure

    NASA Astrophysics Data System (ADS)

    Brandt, Kristina; Wolff, Michael F. H.; Salikov, Vitalij; Heinrich, Stefan; Schneider, Gerold A.

    2013-07-01

    The fascination with hierarchically structured hard tissues such as enamel or nacre arises from their unique structure-property relationship. Over the last decades this has motivated numerous syntheses of composites mimicking the brick-and-mortar structure of nacre. However, there is still a lack of synthetic engineering materials displaying a true hierarchical structure. Here, we present a novel multi-step processing route for anisotropic 2-level hierarchical composites that combines different coating techniques on different length scales. It comprises polymer-encapsulated ceramic particles as building blocks for the first level, followed by spouted bed spray granulation for a second level, and finally directional hot pressing to anisotropically consolidate the composite. The resulting microstructure reveals a brick-and-mortar hierarchical structure with distinct, though not yet optimized, mechanical properties on each level. This opens up a completely new processing route for the synthesis of multi-level hierarchically structured composites, giving prospects of multi-functional structure-property relationships.

  10. A modular positron camera for the study of industrial processes

    NASA Astrophysics Data System (ADS)

    Leadbeater, T. W.; Parker, D. J.

    2011-10-01

    Positron imaging techniques rely on the detection of the back-to-back annihilation photons arising from positron decay within the system under study. A standard technique, called positron emitting particle tracking (PEPT) [1], uses a number of these detected events to rapidly determine the position of a positron emitting tracer particle introduced into the system under study. Typical applications of PEPT are in the study of granular and multi-phase materials in the disciplines of engineering and the physical sciences. Using components from redundant medical PET scanners a modular positron camera has been developed. This camera consists of a number of small independent detector modules, which can be arranged in custom geometries tailored towards the application in question. The flexibility of the modular camera geometry allows for high photon detection efficiency within specific regions of interest, the ability to study large and bulky systems and the application of PEPT to difficult or remote processes as the camera is inherently transportable.

  11. Ultramap: the all in One Photogrammetric Solution

    NASA Astrophysics Data System (ADS)

    Wiechert, A.; Gruber, M.; Karner, K.

    2012-07-01

    This paper describes in detail the dense matcher developed over several years by Vexcel Imaging in Graz for Microsoft's Bing Maps project. This dense matcher was developed exclusively for and used by Microsoft for the production of the 3D city models of Virtual Earth. It will now be made available to the public with the UltraMap software release in mid-2012, which represents a revolutionary step in digital photogrammetry. The dense matcher generates digital surface models (DSM) and digital terrain models (DTM) automatically from a set of overlapping UltraCam images, with an outstanding point density of several hundred points per square meter and sub-pixel accuracy. The dense matcher consists of two steps. The first step rectifies overlapping image areas to speed up the dense image matching process; this rectification ensures very efficient processing and detects occluded areas by applying a back-matching step. In this dense image matching process, a cost function consisting of a matching score and a smoothness term is minimized. In the second step, the resulting range image patches are fused into a DSM by optimizing a global cost function. The whole process is optimized for multi-core CPUs and optionally uses GPUs if available. UltraMap 3.0 also features an additional step, presented in this paper: a completely automated true-ortho and ortho workflow, in which the UltraCam images are combined with the DSM or DTM in an automated rectification step to produce high-quality true-ortho or ortho images. The paper presents the new workflow and first results.
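    The matching energy is described above only in words (a per-pixel matching score plus a smoothness term, minimized over candidate correspondences). The sketch below is a generic, schematic version of such an energy for a 1-D disparity labelling with illustrative weights; it is not Vexcel's dense matcher.

    import numpy as np

    def matching_energy(cost_volume, disparity, smoothness_weight=0.1):
        # Schematic energy of the kind minimized by dense matchers: a per-pixel
        # matching cost plus a smoothness term penalizing disparity jumps
        # between neighbouring pixels.
        rows = np.arange(len(disparity))
        data_term = cost_volume[rows, disparity].sum()
        smoothness_term = np.abs(np.diff(disparity)).sum()
        return data_term + smoothness_weight * smoothness_term

    # Toy example: 5 pixels, 3 candidate disparities, random matching scores.
    rng = np.random.default_rng(0)
    cost_volume = rng.random((5, 3))               # cost_volume[pixel, candidate_disparity]
    labelling = np.array([0, 0, 1, 1, 2])          # one candidate disparity assignment
    print(matching_energy(cost_volume, labelling))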

  12. Enantioselective total synthesis of hyperforin.

    PubMed

    Sparling, Brian A; Moebius, David C; Shair, Matthew D

    2013-01-16

    A modular, 18-step total synthesis of hyperforin is described. The natural product was quickly accessed using latent symmetry elements, whereby a group-selective, Lewis acid-catalyzed epoxide-opening cascade cyclization was used to furnish the bicyclo[3.3.1]nonane core and set two key quaternary stereocenters.

  13. Effects of a 1 year development programme for recently graduated veterinary professionals on personal and job resources: a combined quantitative and qualitative approach.

    PubMed

    Mastenbroek, N J J M; van Beukelen, P; Demerouti, E; Scherpbier, A J J A; Jaarsma, A D C

    2015-12-30

    The early years in professional practice are for many veterinary and medical professionals a period of great challenges and consequently increased stress levels. Personal resources appear to have a positive impact on the course of this transition period. Personal resources are defined as developable systems of positive beliefs about one's self and the world that are generally linked to resilience. They are negatively related to burnout and positively and reciprocally to job resources, work engagement and job performance. With the aim of enhancing personal resources of recently graduated veterinarians, a 1 year multi-modular resources development programme was designed. This study was conducted to analyse: 1. if and how the development programme affected participants' personal resources, and 2. if and how personal resources affected participants' work characteristics and work engagement. Quantitative study: Twenty-five participants and ten non-participants completed an online survey covering personal resources, job resources and work engagement at the start and finish of the programme. Results showed a significant increase of personal resources in participants for self-reported ratings of proactive behaviour (Effect Size=-0.4), self-efficacy (Effect Size=-0.6) and reflective behaviour (Effect Size=-0.6). Results of the control group were not significant, although some moderate effect sizes were found. Qualitative study: Additionally 16 semi-structured interviews with participants of the programme were taken 6 months after finishing the programme. Analysis of the interviews revealed that participants also developed other important personal resources namely self-acceptance, self-esteem, awareness of own influence and responsibility. The reflection process, which took place in the course of the programme, seemed to be a necessary step for the development of the other personal resources. According to participants of the resources development programme, the increase in personal resources also gave rise to an increase in job resources. The multi-modular resources development programme seems to support development of participants' personal resources. Because personal resources are beneficial in improving well-being irrespective of where an individual starts working, it is important to give them explicit attention in educational settings.

  14. Micro-Vibration Performance Prediction of SEPTA24 Using SMeSim (RUAG Space Mechanism Simulator Tool)

    NASA Astrophysics Data System (ADS)

    Omiciuolo, Manolo; Lang, Andreas; Wismer, Stefan; Barth, Stephan; Szekely, Gerhard

    2013-09-01

    Scientific space missions are currently challenging the performances of their payloads. The performances can be dramatically restricted by micro-vibration loads generated by any moving parts of the satellites, thus by Solar Array Drive Assemblies too. Micro-vibration prediction of SADAs is therefore very important to support their design and optimization in the early stages of a programme. The Space Mechanism Simulator (SMeSim) tool, developed by RUAG, enhances the capability of analysing the micro-vibration emissivity of a Solar Array Drive Assembly (SADA) under a specified set of boundary conditions. The tool is developed in the Matlab/Simulink® environment through a library of blocks simulating the different components a SADA is made of. The modular architecture of the blocks, assembled by the user, and the set-up of the boundary conditions allow time-domain and frequency-domain analyses of a rigid multi-body model with concentrated flexibilities and coupled electronic control of the mechanism. SMeSim is used to model the SEPTA24 Solar Array Drive Mechanism and predict its micro-vibration emissivity. SMeSim and the return of experience earned throughout its development and use can now support activities like verification by analysis of micro-vibration emissivity requirements and/or design optimization to minimize the micro-vibration emissivity of a SADA.

  15. Experimental high-speed network

    NASA Astrophysics Data System (ADS)

    McNeill, Kevin M.; Klein, William P.; Vercillo, Richard; Alsafadi, Yasser H.; Parra, Miguel V.; Dallas, William J.

    1993-09-01

    Many existing local area networking protocols currently applied in medical imaging were originally designed for relatively low-speed, low-volume networking. These protocols utilize small packet sizes appropriate for text based communication. Local area networks of this type typically provide raw bandwidth under 125 MHz. These older network technologies are not optimized for the low delay, high data traffic environment of a totally digital radiology department. Some current implementations use point-to-point links when greater bandwidth is required. However, the use of point-to-point communications for a total digital radiology department network presents many disadvantages. This paper describes work on an experimental multi-access local area network called XFT. The work includes the protocol specification, and the design and implementation of network interface hardware and software. The protocol specifies the Physical and Data Link layers (OSI layers 1 & 2) for a fiber-optic based token ring providing a raw bandwidth of 500 MHz. The protocol design and implementation of the XFT interface hardware includes many features to optimize image transfer and provide flexibility for additional future enhancements which include: a modular hardware design supporting easy portability to a variety of host system buses, a versatile message buffer design providing 16 MB of memory, and the capability to extend the raw bandwidth of the network to 3.0 GHz.

  16. Scalable Triadic Analysis of Large-Scale Graphs: Multi-Core vs. Multi-Processor vs. Multi-Threaded Shared Memory Architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, George; Marquez, Andres; Choudhury, Sutanay

    2012-09-01

    Triadic analysis encompasses a useful set of graph mining methods that is centered on the concept of a triad, which is a subgraph of three nodes and the configuration of directed edges across the nodes. Such methods are often applied in the social sciences as well as many other diverse fields. Triadic methods commonly operate on a triad census that counts the number of triads of every possible edge configuration in a graph. Like other graph algorithms, triadic census algorithms do not scale well when graphs reach tens of millions to billions of nodes. To enable the triadic analysis of large-scale graphs, we developed and optimized a triad census algorithm to efficiently execute on shared memory architectures. We will retrace the development and evolution of a parallel triad census algorithm. Over the course of several versions, we continually adapted the code’s data structures and program logic to expose more opportunities to exploit parallelism on shared memory that would translate into improved computational performance. We will recall the critical steps and modifications that occurred during code development and optimization. Furthermore, we will compare the performances of triad census algorithm versions on three specific systems: Cray XMT, HP Superdome, and AMD multi-core NUMA machine. These three systems have shared memory architectures but with markedly different hardware capabilities to manage parallelism.
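    For readers unfamiliar with the underlying computation, the following sketch shows what a (simplified) triad census counts: it classifies every node triple by its dyad composition. It is a naive, single-threaded coarsening of the full 16-type census and bears no relation to the optimized parallel algorithm described in the report.

    from itertools import combinations
    from collections import Counter

    def dyad_type(adj, u, v):
        # Classify the pair {u, v} as a mutual, asymmetric, or null dyad.
        uv, vu = v in adj[u], u in adj[v]
        return "M" if (uv and vu) else ("A" if (uv or vu) else "N")

    def coarse_triad_census(adj):
        # Count every node triple by its (mutual, asymmetric, null) dyad composition.
        # The full 16-type census further distinguishes edge orientations within each
        # composition; this naive O(n^3) loop is for illustration only.
        census = Counter()
        for a, b, c in combinations(adj, 3):
            dyads = [dyad_type(adj, x, y) for x, y in ((a, b), (a, c), (b, c))]
            census[(dyads.count("M"), dyads.count("A"), dyads.count("N"))] += 1
        return census

    # Toy directed graph as an adjacency dict: node -> set of out-neighbours.
    graph = {1: {2}, 2: {1, 3}, 3: {4}, 4: set()}
    print(coarse_triad_census(graph))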

  17. A mass, momentum, and energy conserving, fully implicit, scalable algorithm for the multi-dimensional, multi-species Rosenbluth-Fokker-Planck equation

    NASA Astrophysics Data System (ADS)

    Taitano, W. T.; Chacón, L.; Simakov, A. N.; Molvig, K.

    2015-09-01

    In this study, we demonstrate a fully implicit algorithm for the multi-species, multidimensional Rosenbluth-Fokker-Planck equation which is exactly mass-, momentum-, and energy-conserving, and which preserves positivity. Unlike most earlier studies, we base our development on the Rosenbluth (rather than Landau) form of the Fokker-Planck collision operator, which reduces complexity while allowing for an optimal fully implicit treatment. Our discrete conservation strategy employs nonlinear constraints that force the continuum symmetries of the collision operator to be satisfied upon discretization. We converge the resulting nonlinear system iteratively using Jacobian-free Newton-Krylov methods, effectively preconditioned with multigrid methods for efficiency. Single- and multi-species numerical examples demonstrate the advertised accuracy properties of the scheme, and the superior algorithmic performance of our approach. In particular, the discretization approach is numerically shown to be second-order accurate in time and velocity space and to exhibit manifestly positive entropy production. That is, H-theorem behavior is indicated for all the examples we have tested. The solution approach is demonstrated to scale optimally with respect to grid refinement (with CPU time growing linearly with the number of mesh points), and timestep (showing very weak dependence of CPU time with time-step size). As a result, the proposed algorithm delivers several orders-of-magnitude speedup vs. explicit algorithms.
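    The conservative discretization and multigrid preconditioning described above are not reproduced here. Purely as an illustration of the Jacobian-free Newton-Krylov idea mentioned, the sketch below solves a toy nonlinear residual with SciPy's newton_krylov, where the Jacobian is never formed explicitly and only its action on Krylov vectors is approximated by finite differences.

    import numpy as np
    from scipy.optimize import newton_krylov

    def residual(f):
        # Toy nonlinear residual standing in for a discretized kinetic equation;
        # the paper's discrete conservation constraints and preconditioning are omitted.
        return np.cos(f) - 0.5 * f

    f0 = np.zeros(64)                       # initial guess on a toy velocity grid
    f_sol = newton_krylov(residual, f0, f_tol=1e-10)
    print(np.abs(residual(f_sol)).max())    # residual norm after convergence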

  18. The Package-Based Development Process in the Flight Dynamics Division

    NASA Technical Reports Server (NTRS)

    Parra, Amalia; Seaman, Carolyn; Basili, Victor; Kraft, Stephen; Condon, Steven; Burke, Steven; Yakimovich, Daniil

    1997-01-01

    The Software Engineering Laboratory (SEL) has been operating for more than two decades in the Flight Dynamics Division (FDD) and has adapted to the constant movement of the software development environment. The SEL's Improvement Paradigm shows that process improvement is an iterative process. Understanding, Assessing and Packaging are the three steps that are followed in this cyclical paradigm. As the improvement process cycles back to the first step, after having packaged some experience, the level of understanding will be greater. In the past, products resulting from the packaging step have been large process documents, guidebooks, and training programs. As the technical world moves toward more modularized software, we have made a move toward more modularized software development process documentation; as such, the products of the packaging step are becoming smaller and more frequent. In this manner, the Quality Improvement Paradigm (QIP) takes on a more spiral than waterfall approach. This paper describes the state of the FDD in the area of software development processes, as revealed through the understanding and assessing activities conducted by the COTS study team. The insights presented include: (1) a characterization of a typical FDD Commercial Off the Shelf (COTS) intensive software development life-cycle process, (2) lessons learned through the COTS study interviews, and (3) a description of changes in the SEL due to the changing and accelerating nature of software development in the FDD.

  19. Modular Rocket Engine Control Software (MRECS)

    NASA Technical Reports Server (NTRS)

    Tarrant, C.; Crook, J.

    1998-01-01

    The Modular Rocket Engine Control Software (MRECS) Program is a technology demonstration effort designed to advance the state-of-the-art in launch vehicle propulsion systems. Its emphasis is on developing and demonstrating a modular software architecture for advanced engine control systems that will result in lower software maintenance (operations) costs. It effectively accommodates software requirement changes that occur due to hardware technology upgrades and engine development testing. Ground rules directed by MSFC were to optimize modularity and implement the software in the Ada programming language. MRECS system software and the software development environment utilize Commercial-Off-the-Shelf (COTS) products. This paper presents the objectives, benefits, and status of the program. The software architecture, design, and development environment are described. MRECS tasks are defined and timing relationships given. Major accomplishments are listed. MRECS offers benefits to a wide variety of advanced technology programs in the areas of modular software architecture, reuse software, and reduced software reverification time related to software changes. MRECS was recently modified to support a Space Shuttle Main Engine (SSME) hot-fire test. Cold Flow and Flight Readiness Testing were completed before the test was cancelled. Currently, the program is focused on supporting NASA MSFC in accomplishing development testing of the Fastrac Engine, part of NASA's Low Cost Technologies (LCT) Program. MRECS will be used for all engine development testing.

  20. Modular closed-loop control of diabetes.

    PubMed

    Patek, S D; Magni, L; Dassau, E; Karvetski, C; Toffanin, C; De Nicolao, G; Del Favero, S; Breton, M; Man, C Dalla; Renard, E; Zisser, H; Doyle, F J; Cobelli, C; Kovatchev, B P

    2012-11-01

    Modularity plays a key role in many engineering systems, allowing for plug-and-play integration of components, enhancing flexibility and adaptability, and facilitating standardization. In the control of diabetes, i.e., the so-called "artificial pancreas," modularity allows for the step-wise introduction of (and regulatory approval for) algorithmic components, starting with subsystems for assured patient safety and followed by higher layer components that serve to modify the patient's basal rate in real time. In this paper, we introduce a three-layer modular architecture for the control of diabetes, consisting in a sensor/pump interface module (IM), a continuous safety module (CSM), and a real-time control module (RTCM), which separates the functions of insulin recommendation (postmeal insulin for mitigating hyperglycemia) and safety (prevention of hypoglycemia). In addition, we provide details of instances of all three layers of the architecture: the APS© serving as the IM, the safety supervision module (SSM) serving as the CSM, and the range correction module (RCM) serving as the RTCM. We evaluate the performance of the integrated system via in silico preclinical trials, demonstrating 1) the ability of the SSM to reduce the incidence of hypoglycemia under nonideal operating conditions and 2) the ability of the RCM to reduce glycemic variability.
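    The APS, SSM, and RCM algorithms themselves are not specified in this abstract. Purely as a structural sketch of the three-layer modular idea it describes (interface, continuous safety, real-time control), the Python classes below pass a glucose reading up the stack and an insulin command back down; all thresholds, gains, and method names are invented placeholders, not the published modules.

    class SensorPumpInterface:
        # Interface module (IM): abstracts the glucose sensor and insulin pump.
        def read_glucose(self):
            return 95.0                      # mg/dL, placeholder reading
        def command_basal(self, rate):
            print(f"pump basal rate set to {rate:.2f} U/h")

    class ContinuousSafetyModule:
        # Safety layer (CSM): attenuates or suspends insulin when hypoglycemia threatens.
        def filter(self, glucose, requested_rate):
            if glucose < 80.0:               # illustrative threshold, not the SSM's actual rule
                return 0.0                   # suspend delivery
            return requested_rate

    class RealTimeControlModule:
        # Control layer (RTCM): adjusts the basal rate around a nominal value.
        def __init__(self, nominal_rate=1.0, target=120.0, gain=0.005):
            self.nominal, self.target, self.gain = nominal_rate, target, gain
        def recommend(self, glucose):
            return max(0.0, self.nominal + self.gain * (glucose - self.target))

    # One control step through the layered architecture: sense, recommend, safety-filter, actuate.
    im, csm, rtcm = SensorPumpInterface(), ContinuousSafetyModule(), RealTimeControlModule()
    glucose = im.read_glucose()
    im.command_basal(csm.filter(glucose, rtcm.recommend(glucose)))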

  1. Multi-Sensor Registration of Earth Remotely Sensed Imagery

    NASA Technical Reports Server (NTRS)

    LeMoigne, Jacqueline; Cole-Rhodes, Arlene; Eastman, Roger; Johnson, Kisha; Morisette, Jeffrey; Netanyahu, Nathan S.; Stone, Harold S.; Zavorin, Ilya; Zukor, Dorothy (Technical Monitor)

    2001-01-01

    Assuming that approximate registration is given within a few pixels by a systematic correction system, we develop automatic image registration methods for multi-sensor data with the goal of achieving sub-pixel accuracy. Automatic image registration is usually defined by three steps: feature extraction, feature matching, and data resampling or fusion. Our previous work focused on image correlation methods based on the use of different features. In this paper, we study different feature matching techniques and present five algorithms where the features are either original gray levels or wavelet-like features, and the feature matching is based on gradient descent optimization, statistical robust matching, and mutual information. These algorithms are tested and compared on several multi-sensor datasets covering one of the EOS Core Sites, the Konza Prairie in Kansas, from four different sensors: IKONOS (4m), Landsat-7/ETM+ (30m), MODIS (500m), and SeaWIFS (1000m).
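    One of the matching criteria named above is mutual information. As a minimal, generic illustration (not the authors' implementation), the sketch below estimates mutual information between two equally sized images from their joint histogram; a registration search would maximize this score over candidate shifts or transforms.

    import numpy as np

    def mutual_information(img_a, img_b, bins=32):
        # Histogram-based mutual information between two equally sized images;
        # higher values indicate better alignment.
        joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
        pxy = joint / joint.sum()
        px, py = pxy.sum(axis=1), pxy.sum(axis=0)
        nz = pxy > 0
        return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])))

    # Toy usage: MI of an image with itself exceeds MI with an unrelated image.
    rng = np.random.default_rng(1)
    a = rng.random((64, 64))
    print(mutual_information(a, a), mutual_information(a, rng.random((64, 64))))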

  2. A modular and optimized single marker system for generating Trypanosoma brucei cell lines expressing T7 RNA polymerase and the tetracycline repressor.

    PubMed

    Poon, S K; Peacock, L; Gibson, W; Gull, K; Kelly, S

    2012-02-01

    Here, we present a simple modular extendable vector system for introducing the T7 RNA polymerase and tetracycline repressor genes into Trypanosoma brucei. This novel system exploits developments in our understanding of gene expression and genome organization to produce a streamlined plasmid optimized for high levels of expression of the introduced transgenes. We demonstrate the utility of this novel system in bloodstream and procyclic forms of Trypanosoma brucei, including the genome strain TREU927/4. We validate these cell lines using a variety of inducible experiments that recapture previously published lethal and non-lethal phenotypes. We further demonstrate the utility of the single marker (SmOx) TREU927/4 cell line for in vivo experiments in the tsetse fly and provide a set of plasmids that enable both whole-fly and salivary gland-specific inducible expression of transgenes.

  3. A modular and optimized single marker system for generating Trypanosoma brucei cell lines expressing T7 RNA polymerase and the tetracycline repressor

    PubMed Central

    Poon, S. K.; Peacock, L.; Gibson, W.; Gull, K.; Kelly, S.

    2012-01-01

    Here, we present a simple modular extendable vector system for introducing the T7 RNA polymerase and tetracycline repressor genes into Trypanosoma brucei. This novel system exploits developments in our understanding of gene expression and genome organization to produce a streamlined plasmid optimized for high levels of expression of the introduced transgenes. We demonstrate the utility of this novel system in bloodstream and procyclic forms of Trypanosoma brucei, including the genome strain TREU927/4. We validate these cell lines using a variety of inducible experiments that recapture previously published lethal and non-lethal phenotypes. We further demonstrate the utility of the single marker (SmOx) TREU927/4 cell line for in vivo experiments in the tsetse fly and provide a set of plasmids that enable both whole-fly and salivary gland-specific inducible expression of transgenes. PMID:22645659

  4. Work Domain Analysis of a Predecessor Sodium-cooled Reactor as Baseline for AdvSMR Operational Concepts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald Farris; David Gertman; Jacques Hugo

    This report presents the results of the Work Domain Analysis for the Experimental Breeder Reactor (EBR-II). This is part of the phase of the research designed to incorporate Cognitive Work Analysis in the development of a framework for the formalization of an Operational Concept (OpsCon) for Advanced Small Modular Reactors (AdvSMRs). For a new AdvSMR design, information obtained through Cognitive Work Analysis, combined with human performance criteria, can and should be used during the operational phase of a plant to assess the crew performance aspects associated with identified AdvSMR operational concepts. The main objective of this phase was to develop an analytical and descriptive framework that will help systems and human factors engineers to understand the design and operational requirements of the emerging generation of small, advanced, multi-modular reactors. Using EBR-II as a predecessor to emerging sodium-cooled reactor designs required the application of a method suitable to the structured and systematic analysis of the plant to assist in identifying key features of the work associated with it and to clarify the operational and other constraints. The analysis included the identification and description of operating scenarios that were considered characteristic of this type of nuclear power plant. This is an invaluable aspect of Operational Concept development since it typically reveals aspects of future plant configurations that will have an impact on operations. These include, for example, the effect of core design, different coolants, reactor-to-power conversion unit ratios, modular plant layout, modular versus central control rooms, plant siting, and many more. Multi-modular plants in particular are expected to have a significant impact on overall OpsCon in general, and human performance in particular. To support unconventional modes of operation, the modern control room of a multi-module plant would typically require advanced HSIs that would provide sophisticated operational information visualization, coupled with adaptive automation schemes and operator support systems to reduce complexity. These all have to be mapped at some point to human performance requirements. The EBR-II results will be used as a baseline that will be extrapolated in the extended Cognitive Work Analysis phase to the analysis of a selected advanced sodium-cooled SMR design as a way to establish non-conventional operational concepts. The Work Domain Analysis results achieved during this phase have not only established an organizing and analytical framework for describing existing sociotechnical systems, but have also indicated that the method is particularly suited to the analysis of prospective and immature designs. The results of the EBR-II Work Domain Analysis have indicated that the methodology is scientifically sound and generalizable to any operating environment.

  5. Statistical Mechanics and Dynamics of the Outer Solar System.I. The Jupiter/Saturn Zone

    NASA Technical Reports Server (NTRS)

    Grazier, K. R.; Newman, W. I.; Kaula, W. M.; Hyman, J. M.

    1996-01-01

    We report on numerical simulations designed to understand how the solar system evolved through a winnowing of planetesimals accreted from the early solar nebula. This sorting process is driven by the energy and angular momentum and continues to the present day. We reconsider the existence and importance of stable niches in the Jupiter/Saturn Zone using greatly improved numerical techniques based on high-order optimized multi-step integration schemes coupled to roundoff error minimizing methods.
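    The specific optimized multi-step scheme used by the authors is not given in this abstract. As a generic illustration of what a multi-step integrator is, the sketch below advances a toy two-body orbit with a second-order Adams-Bashforth method, which reuses the previous derivative evaluation at each step; the step size, units, and Euler bootstrap are illustrative choices.

    import numpy as np

    def two_body(state, mu=1.0):
        # Derivative of [x, y, vx, vy] for a test particle orbiting a central mass.
        x, y, vx, vy = state
        r3 = (x * x + y * y) ** 1.5
        return np.array([vx, vy, -mu * x / r3, -mu * y / r3])

    def adams_bashforth2(f, state, h, n_steps):
        # Two-step Adams-Bashforth: y_{n+1} = y_n + h*(3/2*f_n - 1/2*f_{n-1}).
        # Multi-step schemes reuse previous derivative evaluations, which is what
        # makes high-order variants attractive for long planetary integrations.
        f_prev = f(state)
        state = state + h * f_prev                  # bootstrap the first step with Euler
        for _ in range(n_steps - 1):
            f_curr = f(state)
            state = state + h * (1.5 * f_curr - 0.5 * f_prev)
            f_prev = f_curr
        return state

    orbit = np.array([1.0, 0.0, 0.0, 1.0])          # circular orbit in scaled units
    print(adams_bashforth2(two_body, orbit, h=1e-3, n_steps=10000))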

  6. Research of flaw image collecting and processing technology based on multi-baseline stereo imaging

    NASA Astrophysics Data System (ADS)

    Yao, Yong; Zhao, Jiguang; Pang, Xiaoyan

    2008-03-01

    To address the practical demands of gun bore flaw image collection, such as accurate optimal design, complex algorithms and precise technical requirements, this paper presents the design framework of a 3-D image collection and processing system based on multi-baseline stereo imaging. The system mainly comprises a computer, an electrical control box, a stepping motor and a CCD camera, and it realizes image collection, stereo matching, 3-D information reconstruction, after-treatment and related functions. Theoretical analysis and experimental results show that the images collected by this system are precise and that it can efficiently resolve the ambiguity problem produced by uniform or repeated vein patterns. At the same time, the system offers higher measurement speed and precision.

  7. The Fluids And Combustion Facility Combustion Integrated Rack And The Multi-User Droplet Combustion Apparatus: Microgravity Combustion Science Using Modular Multi-User Hardware

    NASA Technical Reports Server (NTRS)

    OMalley, Terence F.; Myhre, Craig A.

    2000-01-01

    The Fluids and Combustion Facility (FCF) is a multi-rack payload planned for the International Space Station (ISS) that will enable the study of fluid physics and combustion science in a microgravity environment. The Combustion Integrated Rack (CIR) is one of two International Standard Payload Racks of the FCF and is being designed primarily to support combustion science experiments. The Multi-user Droplet Combustion Apparatus (MDCA) is a multi-user apparatus designed to accommodate four different droplet combustion science experiments and is the first payload for CIR. The CIR will function independently until the later launch of the Fluids Integrated Rack component of the FCF. This paper provides an overview of the capabilities and the development status of the CIR and MDCA.

  8. Synthesis of water-soluble mono- and ditopic imidazoliums for carbene ligands

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anstey, Mitchell; Murtagh, Dustin; Cordaro, Joseph Gabriel

    2015-09-01

    Synthesis of ditopic imidazoliums was achieved using a modular, step-wise procedure. The procedure itself is amenable to a wide array of functional groups that can be incorporated into the imidazolium architecture. The resulting compounds range from ditopic zwitterions to highly soluble dicationic aromatics.

  9. SENSITIVITY OF OZONE AND AEROSOL PREDICTIONS TO THE TRANSPORT ALGORITHMS IN THE MODELS-3 COMMUNITY MULTI-SCALE AIR QUALITY (CMAQ) MODELING SYSTEM

    EPA Science Inventory

    EPA's Models-3 CMAQ system is intended to provide a community modeling paradigm that allows continuous improvement of the one-atmosphere modeling capability in a unified fashion. CMAQ's modular design promotes incorporation of several sets of science process modules representing ...

  10. Design and Implementation of Multi-Campus, Modular Master Classes in Biochemical Engineering

    ERIC Educational Resources Information Center

    Wuyts, Niek; Bruneel, Dorine; Meyers, Myriam; Van Hoof, Etienne; De Vos, Leander; Langie, Greet; Rediers, Hans

    2015-01-01

    The Master of Science in engineering technology: biochemical engineering is organised in KU Leuven at four geographically dispersed campuses. To sustain the Master's programmes at all campuses, it is clear that a unique education profile at each campus is crucial. In addition, a rationalisation is required by increased cooperation, increased…

  11. Medusa: A Scalable MR Console Using USB

    PubMed Central

    Stang, Pascal P.; Conolly, Steven M.; Santos, Juan M.; Pauly, John M.; Scott, Greig C.

    2012-01-01

    MRI pulse sequence consoles typically employ closed proprietary hardware, software, and interfaces, making difficult any adaptation for innovative experimental technology. Yet MRI systems research is trending to higher channel count receivers, transmitters, gradient/shims, and unique interfaces for interventional applications. Customized console designs are now feasible for researchers with modern electronic components, but high data rates, synchronization, scalability, and cost present important challenges. Implementing large multi-channel MR systems with efficiency and flexibility requires a scalable modular architecture. With Medusa, we propose an open system architecture using the Universal Serial Bus (USB) for scalability, combined with distributed processing and buffering to address the high data rates and strict synchronization required by multi-channel MRI. Medusa uses a modular design concept based on digital synthesizer, receiver, and gradient blocks, in conjunction with fast programmable logic for sampling and synchronization. Medusa is a form of synthetic instrument, being reconfigurable for a variety of medical/scientific instrumentation needs. The Medusa distributed architecture, scalability, and data bandwidth limits are presented, and its flexibility is demonstrated in a variety of novel MRI applications. PMID:21954200

  12. GELATIO: a general framework for modular digital analysis of high-purity Ge detector signals

    NASA Astrophysics Data System (ADS)

    Agostini, M.; Pandola, L.; Zavarise, P.; Volynets, O.

    2011-08-01

    GELATIO is a new software framework for advanced data analysis and digital signal processing developed for the GERDA neutrinoless double beta decay experiment. The framework is tailored to handle the full analysis flow of signals recorded by high purity Ge detectors and photo-multipliers from the veto counters. It is designed to support a multi-channel modular and flexible analysis, widely customizable by the user either via human-readable initialization files or via a graphical interface. The framework organizes the data into a multi-level structure, from the raw data up to the condensed analysis parameters, and includes tools and utilities to handle the data stream between the different levels. GELATIO is implemented in C++. It relies upon ROOT and its extension TAM, which provides compatibility with PROOF, enabling the software to run in parallel on clusters of computers or many-core machines. It was tested on different platforms and benchmarked in several GERDA-related applications. A stable version is presently available for the GERDA Collaboration and it is used to provide the reference analysis of the experiment data.
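    GELATIO itself is implemented in C++ on top of ROOT/TAM, so the sketch below is only a language-agnostic Python analogue of the modular idea it describes: an ordered, user-configurable chain of analysis modules that turns a raw waveform into condensed parameters. The module names and toy processing steps are invented for illustration.

    import numpy as np

    class AnalysisModule:
        # Base class: each module consumes the event dict and adds parameters to it.
        def process(self, event):
            raise NotImplementedError

    class BaselineSubtraction(AnalysisModule):
        def process(self, event):
            wf = event["waveform"]
            event["waveform"] = wf - wf[:100].mean()          # pre-trigger baseline estimate

    class EnergyEstimator(AnalysisModule):
        def process(self, event):
            event["energy"] = float(event["waveform"].max())  # crude pulse-height energy

    class AnalysisChain:
        # Runs an ordered, user-configurable list of modules over each event.
        def __init__(self, modules):
            self.modules = modules
        def run(self, events):
            for event in events:
                for module in self.modules:
                    module.process(event)
            return events

    chain = AnalysisChain([BaselineSubtraction(), EnergyEstimator()])
    toy = np.random.default_rng(2).normal(0.0, 1.0, 1000) + 50.0   # baseline ~50 counts
    toy[500:520] += 300.0                                          # injected fake pulse
    print(chain.run([{"waveform": toy}])[0]["energy"])             # ~300 after baseline removal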

  13. BeBot: A Modular Mobile Miniature Robot Platform Supporting Hardware Reconfiguration and Multi-standard Communication

    NASA Astrophysics Data System (ADS)

    Herbrechtsmeier, Stefan; Witkowski, Ulf; Rückert, Ulrich

    Mobile robots are becoming increasingly important in research and education. Small 'on the table' experiments attract particular interest because they need no additional or special laboratory equipment. In this context, platforms are desirable that are small, simple to access and relatively easy to program. An additional powerful information processing unit is advantageous to simplify the implementation of algorithms and the porting of software from desktop computers to the robot platform. In this paper we present a new versatile miniature robot that is ideally suited to research and education. The robot's small size of about 9 cm edge length, its robust drive and its modular structure make it a general device for single- and multi-robot experiments executed 'on the table'. For programming and evaluation, the robot can be connected wirelessly via Bluetooth or WiFi. The operating system of the robot is based on the standard Linux kernel and the GNU C standard library, and a Player/Stage model eases software development and testing.

  14. Integrated multisensor perimeter detection systems

    NASA Astrophysics Data System (ADS)

    Kent, P. J.; Fretwell, P.; Barrett, D. J.; Faulkner, D. A.

    2007-10-01

    The report describes the results of a multi-year programme of research aimed at the development of an integrated multi-sensor perimeter detection system capable of being deployed at an operational site. The research was driven by end user requirements in protective security, particularly in threat detection and assessment, where effective capability was either not available or prohibitively expensive. Novel video analytics have been designed to provide robust detection of pedestrians in clutter while new radar detection and tracking algorithms provide wide area day/night surveillance. A modular integrated architecture based on commercially available components has been developed. A graphical user interface allows intuitive interaction and visualisation with the sensors. The fusion of video, radar and other sensor data provides the basis of a threat detection capability for real life conditions. The system was designed to be modular and extendable in order to accommodate future and legacy surveillance sensors. The current sensor mix includes stereoscopic video cameras, mmWave ground movement radar, CCTV and a commercially available perimeter detection cable. The paper outlines the development of the system and describes the lessons learnt after deployment in a pilot trial.

  15. Generalization and modularization of two-dimensional adaptive coordinate transformations for the Fourier modal method.

    PubMed

    Küchenmeister, Jens

    2014-04-21

    The Fourier modal method (FMM) has advanced greatly by using adaptive coordinates and adaptive spatial resolution. The convergence characteristics were shown to be improved significantly, a construction principle for suitable meshes was demonstrated and a guideline for the optimal choice of the coordinate transformation parameters was found. However, the construction guidelines published so far rely on a certain restriction that is overcome with the formulation presented in this paper. Moreover, a modularization principle is formulated that significantly eases the construction of coordinate transformations in unit cells with reappearing shapes and complex sub-structures.

  16. 3D printed modular centrifugal contactors and method for separating moieties using 3D printed optimized surfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wardle, Kent E.

    The present invention provides an annular centrifugal contactor, having a housing to receive a plurality of liquids; a rotor inside the housing; an annular mixing zone, with a plurality of fluid retention reservoirs; and an adjustable stem that can be raised to restrict the flow of a liquid into the rotor or lowered to increase the flow of liquid into the rotor. The invention also provides a method for transferring moieties from a first liquid to a second liquid, the method having the steps of combining the fluids in a housing whose interior has helically shaped first channels; subjecting the fluids to a spinning rotor to produce a mixture, whereby the channels simultaneously conduct the mixture downwardly and upwardly; and passing the mixture through the rotor to contact second channels, whereby the channels pump the second liquid through a first aperture while the first fluid exits a second aperture.

  17. CERES: A Set of Automated Routines for Echelle Spectra

    NASA Astrophysics Data System (ADS)

    Brahm, Rafael; Jordán, Andrés; Espinoza, Néstor

    2017-03-01

    We present the Collection of Elemental Routines for Echelle Spectra (CERES). These routines were developed for the construction of automated pipelines for the reduction, extraction, and analysis of spectra acquired with different instruments, allowing the obtention of homogeneous and standardized results. This modular code includes tools for handling the different steps of the processing: CCD image reductions; identification and tracing of the echelle orders; optimal and rectangular extraction; computation of the wavelength solution; estimation of radial velocities; and rough and fast estimation of the atmospheric parameters. Currently, CERES has been used to develop automated pipelines for 13 different spectrographs, namely CORALIE, FEROS, HARPS, ESPaDOnS, FIES, PUCHEROS, FIDEOS, CAFE, DuPont/Echelle, Magellan/Mike, Keck/HIRES, Magellan/PFS, and APO/ARCES, but the routines can be easily used to deal with data coming from other spectrographs. We show the high precision in radial velocity that CERES achieves for some of these instruments, and we briefly summarize some results that have already been obtained using the CERES pipelines.

  18. One step DNA assembly for combinatorial metabolic engineering.

    PubMed

    Coussement, Pieter; Maertens, Jo; Beauprez, Joeri; Van Bellegem, Wouter; De Mey, Marjan

    2014-05-01

    The rapid and efficient assembly of multi-step metabolic pathways for generating microbial strains with desirable phenotypes is a critical procedure for metabolic engineering, and remains a significant challenge in synthetic biology. Although several DNA assembly methods have been developed and applied for metabolic pathway engineering, many of them are limited by their suitability for combinatorial pathway assembly. The introduction of transcriptional (promoters), translational (ribosome binding site (RBS)) and enzyme (mutant genes) variability to modulate pathway expression levels is essential for generating balanced metabolic pathways and maximizing the productivity of a strain. We report a novel, highly reliable and rapid single strand assembly (SSA) method for pathway engineering. The method was successfully optimized and applied to create constructs containing promoter, RBS and/or mutant enzyme libraries. To demonstrate its efficiency and reliability, the method was applied to fine-tune multi-gene pathways. Two promoter libraries were simultaneously introduced in front of two target genes, enabling orthogonal expression as demonstrated by principal component analysis. This shows that SSA will increase our ability to tune multi-gene pathways at all control levels for the biotechnological production of complex metabolites, achievable through the combinatorial modulation of transcription, translation and enzyme activity. Copyright © 2014 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.
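    The biochemistry of single strand assembly is not reproduced here; the short sketch below only illustrates the combinatorial design space (promoter x RBS x gene variant) that such assembly methods are meant to sample, using hypothetical part names.

    from itertools import product

    # Hypothetical part libraries; a real library would hold actual sequences.
    promoters = ["P_weak", "P_medium", "P_strong"]
    rbs_sites = ["RBS_low", "RBS_high"]
    gene_variants = ["enzymeA_wt", "enzymeA_mut1"]

    # Enumerate every construct in the combinatorial pathway library.
    library = [
        {"promoter": p, "rbs": r, "gene": g}
        for p, r, g in product(promoters, rbs_sites, gene_variants)
    ]
    print(len(library), "constructs, e.g.", library[0])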

  19. Beam shaping as an enabler for new applications

    NASA Astrophysics Data System (ADS)

    Guertler, Yvonne; Kahmann, Max; Havrilla, David

    2017-02-01

    For many years, laser beam shaping has enabled users to achieve optimized process results as well as manage challenging applications. The latest advancements in industrial lasers and processing optics have taken this a step further, as users are able to adapt the beam shape to meet specific application requirements in a very flexible way. TRUMPF has developed a wide range of experience in creating beam profiles at the workpiece for optimized material processing. This technology is based on the physical model of wave optics and can be used with ultra-short pulse lasers as well as multi-kW cw lasers. Basically, the beam shape can be adapted in all three dimensions in space, which allows maximum flexibility. Besides adaptation of the intensity profile, even multi-spot geometries can be produced. This approach is very cost-efficient, because a standard laser source and (in the case of cw lasers) a standard fiber can be used without any special modifications. Based on this innovative beam shaping technology, TRUMPF has developed new and optimized processes. Two of the most recent application developments using these techniques are cutting glass and synthetic sapphire with ultra-short pulse lasers and enhanced brazing of hot-dip zinc-coated steel for automotive applications. Both developments lead to more efficient and flexible production processes, enabled by laser technology, and open the door to new opportunities. They also indicate the potential of beam shaping techniques since they can be applied to both single-mode laser sources (TOP Cleave) and multi-mode laser sources (brazing).

  20. Directivity pattern of the sound radiated from axisymmetric stepped plates.

    PubMed

    He, Xiping; Yan, Xiuli; Li, Na

    2016-08-01

    For the optimal design and efficient utilization of this kind of stepped plate radiator in air, this contribution develops an approach for calculating the directivity pattern of the sound radiated from a stepped plate in flexural vibration with a free edge, based on the Kirchhoff-Love hypothesis and the Rayleigh integral principle. Experimental tests of the directivity pattern were carried out for a fabricated flat plate and for two fabricated plates with one and two steps. The measured directivity patterns are similar in shape to those calculated with the proposed analytic approach. Comparison between the calculated directivity pattern of a stepped plate and that of its corresponding theoretical piston shows that the two radiators are equivalent and that the diffraction field generated by the unbaffled upper surface may be small. It also shows that the directivity pattern of a stepped radiator is independent of the metallic material but depends on the thickness of the base plate and on the resonant frequency: the thicker the base plate, the more directive the radiation. The proposed analytic approach may be adopted for other plates with multiple steps.
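    The stepped-plate calculation itself is not reproduced here. For orientation, the sketch below evaluates the textbook far-field directivity of the baffled circular piston against which the stepped plate is compared, D(theta) = |2 J1(ka sin theta) / (ka sin theta)|; the frequency and radius are illustrative values, not those of the fabricated plates.

    import numpy as np
    from scipy.special import j1

    def piston_directivity(theta, k, a):
        # Far-field directivity of a baffled circular piston of radius a at wavenumber k:
        #     D(theta) = |2 * J1(k*a*sin(theta)) / (k*a*sin(theta))|
        x = k * a * np.sin(theta)
        out = np.ones_like(x)                       # D -> 1 on axis (x -> 0)
        nz = np.abs(x) > 1e-12
        out[nz] = np.abs(2.0 * j1(x[nz]) / x[nz])
        return out

    theta = np.linspace(-np.pi / 2, np.pi / 2, 181)
    k = 2.0 * np.pi * 25e3 / 343.0                  # illustrative 25 kHz tone in air (c ~ 343 m/s)
    print(piston_directivity(theta, k, a=0.05).max())   # a = 5 cm piston radius, illustrative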

  1. A single-phase multi-level D-STATCOM inverter using modular multi-level converter (MMC) topology for renewable energy sources

    NASA Astrophysics Data System (ADS)

    Sotoodeh, Pedram

    This dissertation presents the design of a novel multi-level inverter with FACTS capability for small to mid-size (10-20kW) permanent-magnet wind installations using modular multi-level converter (MMC) topology. The aim of the work is to design a new type of inverter with D-STATCOM option to provide utilities with more control on active and reactive power transfer of distribution lines. The inverter is placed between the renewable energy source, specifically a wind turbine, and the distribution grid in order to fix the power factor of the grid at a target value, regardless of wind speed, by regulating active and reactive power required by the grid. The inverter is capable of controlling active and reactive power by controlling the phase angle and modulation index, respectively. The unique contribution of the proposed work is to combine the two concepts of inverter and D-STATCOM using a novel voltage source converter (VSC) multi-level topology in a single unit without additional cost. Simulations of the proposed inverter, with 5 and 11 levels, have been conducted in MATLAB/Simulink for two systems including 20 kW/kVAR and 250 W/VAR. To validate the simulation results, a scaled version (250 kW/kVAR) of the proposed inverter with 5 and 11 levels has been built and tested in the laboratory. Experimental results show that the reduced-scale 5- and 11-level inverter is able to fix PF of the grid as well as being compatible with IEEE standards. Furthermore, total cost of the prototype models, which is one of the major objectives of this research, is comparable with market prices.
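    The dissertation's controller is only summarized in this abstract (phase angle for active power, modulation index for reactive power). The sketch below evaluates the standard steady-state power-transfer relations for an inverter tied to the grid through a mostly inductive link, which is the textbook basis for that kind of control; the voltages, angle, and reactance are illustrative per-unit numbers, not the prototype's parameters.

    import numpy as np

    def power_transfer(v_inv, v_grid, delta, x_link):
        # Steady-state power injected into the grid by an inverter behind a link reactance:
        #     P = V_inv * V_grid * sin(delta) / X
        #     Q = (V_inv * V_grid * cos(delta) - V_grid**2) / X
        # The phase angle delta mainly sets P, while the inverter voltage magnitude
        # (set by the modulation index) mainly sets Q.
        p = v_inv * v_grid * np.sin(delta) / x_link
        q = (v_inv * v_grid * np.cos(delta) - v_grid ** 2) / x_link
        return p, q

    # Illustrative per-unit numbers only, not the dissertation's design values.
    print(power_transfer(v_inv=1.05, v_grid=1.0, delta=np.deg2rad(5.0), x_link=0.1))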

  2. Construction and analysis of lncRNA-lncRNA synergistic networks to reveal clinically relevant lncRNAs in cancer.

    PubMed

    Li, Yongsheng; Chen, Juan; Zhang, Jinwen; Wang, Zishan; Shao, Tingting; Jiang, Chunjie; Xu, Juan; Li, Xia

    2015-09-22

    Long non-coding RNAs (lncRNAs) play key roles in diverse biological processes. Moreover, the development and progression of cancer often involves the combined actions of several lncRNAs. Here we propose a multi-step method for constructing lncRNA-lncRNA functional synergistic networks (LFSNs) through co-regulation of functional modules having three features: common coexpressed genes of lncRNA pairs, enrichment in the same functional category and close proximity within protein interaction networks. Applied to three cancers, we constructed cancer-specific LFSNs and found that they exhibit a scale-free and modular architecture. In addition, cancer-associated lncRNAs tend to be hubs and are enriched within modules. Although there is little synergistic pairing of lncRNAs across cancers, lncRNA pairs are involved in the same cancer hallmarks by regulating the same or different biological processes. Finally, we identify prognostic biomarkers within cancer lncRNA expression datasets using modules derived from LFSNs. In summary, this proof-of-principle study indicates that synergistic lncRNA pairs can be identified through integrative analysis of genome-wide expression data sets and functional information.

  3. Discovering Multimodal Behavior in Ms. Pac-Man through Evolution of Modular Neural Networks.

    PubMed

    Schrum, Jacob; Miikkulainen, Risto

    2016-03-12

    Ms. Pac-Man is a challenging video game in which multiple modes of behavior are required: Ms. Pac-Man must escape ghosts when they are threats and catch them when they are edible, in addition to eating all pills in each level. Past approaches to learning behavior in Ms. Pac-Man have treated the game as a single task to be learned using monolithic policy representations. In contrast, this paper uses a framework called Modular Multi-objective NEAT (MM-NEAT) to evolve modular neural networks. Each module defines a separate behavior. The modules are used at different times according to a policy that can be human-designed (i.e. Multitask) or discovered automatically by evolution. The appropriate number of modules can be fixed or discovered using a genetic operator called Module Mutation. Several versions of Module Mutation are evaluated in this paper. Both fixed modular networks and Module Mutation networks outperform monolithic networks and Multitask networks. Interestingly, the best networks dedicate modules to critical behaviors (such as escaping when surrounded after luring ghosts near a power pill) that do not follow the customary division of the game into chasing edible and escaping threat ghosts. The results demonstrate that MM-NEAT can discover interesting and effective behavior for agents in challenging games.

  4. MOLAR: Modular Linux and Adaptive Runtime Support for HEC OS/R Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frank Mueller

    2009-02-05

    MOLAR is a multi-institution research effort that concentrates on adaptive, reliable, and efficient operating and runtime system solutions for ultra-scale high-end scientific computing on the next generation of supercomputers. This research addresses the challenges outlined by the FAST-OS (forum to address scalable technology for runtime and operating systems) and HECRTF (high-end computing revitalization task force) activities by providing a modular Linux and adaptable runtime support for high-end computing operating and runtime systems. The MOLAR research has the following goals to address these issues. (1) Create a modular and configurable Linux system that allows customized changes based on the requirements of the applications, runtime systems, and cluster management software. (2) Build runtime systems that leverage the OS modularity and configurability to improve efficiency, reliability, scalability, ease-of-use, and provide support to legacy and promising programming models. (3) Advance computer reliability, availability and serviceability (RAS) management systems to work cooperatively with the OS/R to identify and preemptively resolve system issues. (4) Explore the use of advanced monitoring and adaptation to improve application performance and predictability of system interruptions. The overall goal of the research conducted at NCSU is to develop scalable algorithms for high-availability without single points of failure and without single points of control.

  5. Discovering Multimodal Behavior in Ms. Pac-Man through Evolution of Modular Neural Networks

    PubMed Central

    Schrum, Jacob; Miikkulainen, Risto

    2015-01-01

    Ms. Pac-Man is a challenging video game in which multiple modes of behavior are required: Ms. Pac-Man must escape ghosts when they are threats and catch them when they are edible, in addition to eating all pills in each level. Past approaches to learning behavior in Ms. Pac-Man have treated the game as a single task to be learned using monolithic policy representations. In contrast, this paper uses a framework called Modular Multi-objective NEAT (MM-NEAT) to evolve modular neural networks. Each module defines a separate behavior. The modules are used at different times according to a policy that can be human-designed (i.e. Multitask) or discovered automatically by evolution. The appropriate number of modules can be fixed or discovered using a genetic operator called Module Mutation. Several versions of Module Mutation are evaluated in this paper. Both fixed modular networks and Module Mutation networks outperform monolithic networks and Multitask networks. Interestingly, the best networks dedicate modules to critical behaviors (such as escaping when surrounded after luring ghosts near a power pill) that do not follow the customary division of the game into chasing edible and escaping threat ghosts. The results demonstrate that MM-NEAT can discover interesting and effective behavior for agents in challenging games. PMID:27030803

  6. Traffic-aware energy saving scheme with modularization supporting in TWDM-PON

    NASA Astrophysics Data System (ADS)

    Xiong, Yu; Sun, Peng; Liu, Chuanbo; Guan, Jianjun

    2017-01-01

    Time and wavelength division multiplexed passive optical network (TWDM-PON) is considered to be a primary solution for next-generation passive optical network stage 2 (NG-PON2). Due to the feature of multi-wavelength transmission of TWDM-PON, some of the transmitters/receivers at the optical line terminal (OLT) could be shut down to reduce the energy consumption. Therefore, a novel scheme called traffic-aware energy saving scheme with modularization supporting is proposed. Through establishing the modular energy consumption model of OLT, the wavelength transmitters/receivers at OLT could be switched on or shut down adaptively depending on sensing the status of network traffic load, thus the energy consumption of OLT will be effectively reduced. Furthermore, exploring the technology of optical network unit (ONU) modularization, each module of ONU could be switched to sleep or active mode independently in order to reduce the energy consumption of ONU. Simultaneously, the polling sequence of ONU could be changed dynamically via sensing the packet arrival time. In order to guarantee the delay performance of network traffic, the sub-cycle division strategy is designed to transmit the real-time traffic preferentially. Finally, simulation results verify that the proposed scheme is able to reduce the energy consumption of the network while maintaining the traffic delay performance.

  7. Chitosan from shrimp shells: A renewable sorbent applied to the clean-up step of the QuEChERS method in order to determine multi-residues of veterinary drugs in different types of milk.

    PubMed

    Arias, Jean Lucas de Oliveira; Schneider, Antunielle; Batista-Andrade, Jahir Antonio; Vieira, Augusto Alves; Caldas, Sergiane Souza; Primel, Ednei Gilberto

    2018-02-01

    Clean extracts are essential in LC-MS/MS, since the matrix effect can interfere in the analysis. Alternative materials which can be used as sorbents, such as chitosan in the clean-up step, are cheap and green options. In this study, chitosan from shrimp shell waste was evaluated as a sorbent in the QuEChERS method in order to determine multi-residues of veterinary drugs in different types of milk, i.e., fatty matrices. After optimization, the method showed correlation coefficients above 0.99, LOQs between 1 and 50 μg kg-1, and recoveries between 62 and 125%, with RSD<20% for all veterinary drugs in all types of milk under study. The clean-up step which employed chitosan proved to be effective, since it reduced both the matrix effect (from values between -40 and -10% to values from -10 to +10%) and the extract turbidity (up to 95%). When the proposed method was applied to different milk samples, residues of albendazole (49 μg kg-1), sulfamethazine (

  8. Clustering procedures for the optimal selection of data sets from multiple crystals in macromolecular crystallography.

    PubMed

    Foadi, James; Aller, Pierre; Alguel, Yilmaz; Cameron, Alex; Axford, Danny; Owen, Robin L; Armour, Wes; Waterman, David G; Iwata, So; Evans, Gwyndaf

    2013-08-01

    The availability of intense microbeam macromolecular crystallography beamlines at third-generation synchrotron sources has enabled data collection and structure solution from microcrystals of <10 µm in size. The increased likelihood of severe radiation damage where microcrystals or particularly sensitive crystals are used forces crystallographers to acquire large numbers of data sets from many crystals of the same protein structure. The associated analysis and merging of multi-crystal data is currently a manual and time-consuming step. Here, a computer program, BLEND, that has been written to assist with and automate many of the steps in this process is described. It is demonstrated how BLEND has successfully been used in the solution of a novel membrane protein.

  9. Clustering procedures for the optimal selection of data sets from multiple crystals in macromolecular crystallography

    PubMed Central

    Foadi, James; Aller, Pierre; Alguel, Yilmaz; Cameron, Alex; Axford, Danny; Owen, Robin L.; Armour, Wes; Waterman, David G.; Iwata, So; Evans, Gwyndaf

    2013-01-01

    The availability of intense microbeam macromolecular crystallography beamlines at third-generation synchrotron sources has enabled data collection and structure solution from microcrystals of <10 µm in size. The increased likelihood of severe radiation damage where microcrystals or particularly sensitive crystals are used forces crystallographers to acquire large numbers of data sets from many crystals of the same protein structure. The associated analysis and merging of multi-crystal data is currently a manual and time-consuming step. Here, a computer program, BLEND, that has been written to assist with and automate many of the steps in this process is described. It is demonstrated how BLEND has successfully been used in the solution of a novel membrane protein. PMID:23897484

  10. Distributed optimisation problem with communication delay and external disturbance

    NASA Astrophysics Data System (ADS)

    Tran, Ngoc-Tu; Xiao, Jiang-Wen; Wang, Yan-Wu; Yang, Wu

    2017-12-01

    This paper investigates the distributed optimisation problem for multi-agent systems (MASs) with the simultaneous presence of external disturbance and communication delay. To solve this problem, a two-step design scheme is introduced. In the first step, based on the internal model principle, an internal model term is constructed to compensate for the disturbance asymptotically. In the second step, a distributed optimisation algorithm is designed to solve the optimisation problem for the MASs under the simultaneous presence of disturbance and communication delay. In the proposed algorithm, each agent interacts with its neighbours through the connected topology, and the delay occurs during the information exchange. By utilising a Lyapunov-Krasovskii functional, delay-dependent conditions are derived for both slowly and fast time-varying delays to ensure convergence of the algorithm to the optimal solution of the optimisation problem. Several numerical simulation examples are provided to illustrate the effectiveness of the theoretical results.
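
    The sketch below is a minimal consensus-plus-gradient loop of the kind this class of algorithms builds on: each agent mixes its estimate with its neighbours' and takes a local gradient step. It deliberately omits the internal-model disturbance compensation and the communication delay analysed in the paper; the local costs, mixing matrix and step size are invented for illustration.

```python
# Minimal consensus-based distributed optimisation over a ring topology.
# Disturbance compensation and communication delay (the paper's focus) are
# omitted; with a constant step size the agents only reach a neighbourhood of
# the optimum (exact consensus would need a diminishing step size).
import numpy as np

# Local quadratic costs f_i(x) = 0.5*(x - a_i)^2; the global optimum is mean(a).
a = np.array([1.0, 3.0, -2.0, 4.0])
n = len(a)
# Doubly stochastic mixing matrix for a 4-agent ring (illustrative).
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])
x = np.zeros(n)          # each agent's estimate of the decision variable
alpha = 0.1              # constant step size (assumed)
for _ in range(200):
    grad = x - a                 # local gradients
    x = W @ x - alpha * grad     # mix with neighbours, then descend
print("agent estimates:", x, " global optimum:", a.mean())
```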

  11. NH11B-1726: FrankenRaven: A New Platform for Remote Sensing

    NASA Technical Reports Server (NTRS)

    Dahlgren, Robert; Fladeland, Matthew M.; Pinsker, Ethan A.; Jasionowicz, John P.; Jones, Lowell L.; Pscheid, Matthew J.

    2016-01-01

    Small, modular aircraft are an emerging technology with a goal to maximize flexibility and enable multi-mission support. This reports the progress of an unmanned aerial system (UAS) project conducted at the NASA Ames Research Center (ARC) in 2016. This interdisciplinary effort builds upon the success of the 2014 FrankenEye project to apply rapid prototyping techniques to UAS, to develop a variety of platforms to host remote sensing instruments. In 2016, ARC received AeroVironment RQ-11A and RQ-11B Raven UAS from the US Department of the Interior, Office of Aviation Services. These aircraft have electric propulsion, a wingspan of roughly 1.3m, and have demonstrated reliability in challenging environments. The Raven airframe is an ideal foundation to construct more complex aircraft, and student interns using 3D printing were able to graft multiple Raven wings and fuselages into FrankenRaven aircraft. Aeronautical analysis shows that the new configuration has enhanced flight time, payload capacity, and distance compared to the original Raven. The FrankenRaven avionics architecture replaces the mil-spec avionics with COTS technology based upon the 3DR Pixhawk PX4 autopilot with a safety multiplexer for failsafe handoff to 2.4 GHz RC control and 915 MHz telemetry. This project demonstrates how design reuse, rapid prototyping, and modular subcomponents can be leveraged into flexible airborne platforms that can host a variety of remote sensing payloads and even multiple payloads. Modularity advances a new paradigm: mass-customization of aircraft around given payload(s). Multi-fuselage designs are currently under development to host a wide variety of payloads including a zenith-pointing spectrometer, a magnetometer, a multi-spectral camera, and a RGB camera. After airworthiness certification, flight readiness review, and test flights are performed at Crows Landing airfield in central California, field data will be taken at Kilauea volcano in Hawaii and other locations.

  12. FrankenRaven: A New Platform for Remote Sensing

    NASA Astrophysics Data System (ADS)

    Dahlgren, R. P.; Fladeland, M. M.; Pinsker, E. A.; Jasionowicz, J. P.; Jones, L. L.; Mosser, C. D.; Pscheid, M. J.; Weidow, N. L.; Kelly, P. J.; Kern, C.; Werner, C. A.; Johnson, M. S.

    2016-12-01

    Small, modular aircraft are an emerging technology with a goal to maximize flexibility and enable multi-mission support. This reports the progress of an unmanned aerial system (UAS) project conducted at the NASA Ames Research Center (ARC) in 2016. This interdisciplinary effort builds upon the success of the 2014 FrankenEye project to apply rapid prototyping techniques to UAS, to develop a variety of platforms to host remote sensing instruments. In 2016, ARC received AeroVironment RQ-11A and RQ-11B Raven UAS from the US Department of the Interior, Office of Aviation Services. These aircraft have electric propulsion, a wingspan of roughly 1.3m, and have demonstrated reliability in challenging environments. The Raven airframe is an ideal foundation to construct more complex aircraft, and student interns using 3D printing were able to graft multiple Raven wings and fuselages into "FrankenRaven" aircraft. Aeronautical analysis shows that the new configuration has enhanced flight time, payload capacity, and distance compared to the original Raven. The FrankenRaven avionics architecture replaces the mil-spec avionics with COTS technology based upon the 3DR Pixhawk PX4 autopilot with a safety multiplexer for failsafe handoff to 2.4 GHz RC control and 915 MHz telemetry. This project demonstrates how design reuse, rapid prototyping, and modular subcomponents can be leveraged into flexible airborne platforms that can host a variety of remote sensing payloads and even multiple payloads. Modularity advances a new paradigm: mass-customization of aircraft around given payload(s). Multi-fuselage designs are currently under development to host a wide variety of payloads including a zenith-pointing spectrometer, a magnetometer, a multi-spectral camera, and a RGB camera. After airworthiness certification, flight readiness review, and test flights are performed at Crows Landing airfield in central California, field data will be taken at Kilauea volcano in Hawaii and other locations.

  13. A modular approach to adaptive structures.

    PubMed

    Pagitz, Markus; Pagitz, Manuel; Hühne, Christian

    2014-10-07

    A remarkable property of nastic, shape-changing plants is their complete fusion between actuators and structure. This is achieved by combining a large number of cells whose geometry, internal pressures and material properties are optimized for a given set of target shapes and stiffness requirements. An advantage of such a fusion is that the cell walls are prestressed by the cell pressures, which increases the overall structural stiffness while reducing the weight. Inspired by the nastic movement of plants, Pagitz et al. (2012 Bioinspir. Biomim. 7) published a novel concept for pressure-actuated cellular structures. This article extends previous work by introducing a modular approach to adaptive structures. An algorithm that breaks down any continuous target shape into a small number of standardized modules is presented. Furthermore, it is shown how cytoskeletons within each cell enhance the properties of adaptive modules. An adaptive passenger seat and an aircraft's leading and trailing edges are used to demonstrate the potential of the modular approach.

  14. Low-cost modular array-field designs for flat-panel and concentrator photovoltaic systems

    NASA Astrophysics Data System (ADS)

    Post, H. N.; Carmichael, D. C.; Alexander, G.; Castle, J. A.

    1982-09-01

    Described are the design and development of low-cost, modular array fields for flat-panel and concentrator photovoltaic (PV) systems. The objective of the work was to reduce substantially the cost of the array-field Balance-of-System (BOS) subsystems and site-specific design costs as compared to previous PV installations. These subsystems include site preparation, foundations, support structures, electrical wiring, grounding, lightning protection, electromagnetic interference considerations, and controls. To reduce these BOS and design costs, standardized modular (building-block) designs for flat-panel and concentrator array fields have been developed that are fully integrated and optimized for lowest life-cycle costs. Using drawings and specifications now available, these building-block designs can be used in multiples to install array fields of various sizes. The developed designs are immediately applicable (1982) and reduce the array-field BOS costs to a fraction of previous costs.

  15. A mathematical formulation for interface-based modular product design with geometric and weight constraints

    NASA Astrophysics Data System (ADS)

    Jung-Woon Yoo, John

    2016-06-01

    Since customer preferences change rapidly, there is a need for design processes with shorter product development cycles. Modularization plays a key role in achieving mass customization, which is crucial in today's competitive global market environments. Standardized interfaces among modularized parts have facilitated computational product design. To incorporate product size and weight constraints during computational design procedures, a mixed integer programming formulation is presented in this article. Product size and weight are two of the most important design parameters, as evidenced by recent smart-phone products. This article focuses on the integration of geometric, weight and interface constraints into the proposed mathematical formulation. The formulation generates the optimal selection of components for a target product, which satisfies geometric, weight and interface constraints. The formulation is verified through a case study and experiments are performed to demonstrate the performance of the formulation.
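
    A hedged sketch of the kind of component-selection MIP the article describes is given below, using the PuLP modelling library. The candidate data, the one-component-per-module rule and the pairwise interface-compatibility constraints are illustrative stand-ins, not the article's exact formulation.

```python
# Illustrative mixed-integer selection of one component per module subject to
# volume, weight and interface-compatibility constraints (a simplified stand-in
# for the article's formulation; all data are invented).
from pulp import LpProblem, LpVariable, LpMaximize, lpSum, LpBinary, value

# candidate components: (module, name, performance, volume, weight, interface)
candidates = [
    ("battery", "B1", 5, 20, 45, "ifA"), ("battery", "B2", 8, 30, 60, "ifB"),
    ("screen",  "S1", 6, 25, 30, "ifA"), ("screen",  "S2", 9, 35, 40, "ifB"),
]
MAX_VOLUME, MAX_WEIGHT = 60, 95

prob = LpProblem("modular_product_config", LpMaximize)
x = {c[1]: LpVariable(f"pick_{c[1]}", cat=LpBinary) for c in candidates}

prob += lpSum(c[2] * x[c[1]] for c in candidates)                  # maximise performance
for module in {c[0] for c in candidates}:                          # exactly one per module
    prob += lpSum(x[c[1]] for c in candidates if c[0] == module) == 1
prob += lpSum(c[3] * x[c[1]] for c in candidates) <= MAX_VOLUME    # size constraint
prob += lpSum(c[4] * x[c[1]] for c in candidates) <= MAX_WEIGHT    # weight constraint
# interface compatibility: selected components must share the same interface type
for i in candidates:
    for j in candidates:
        if i[0] != j[0] and i[5] != j[5]:
            prob += x[i[1]] + x[j[1]] <= 1

prob.solve()
print("selected:", [name for name, var in x.items() if value(var) == 1])
```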

  16. Divergent Synthesis of Quinolone Natural Products from Pseudonocardia sp. CL38489.

    PubMed

    Geddis, Stephen M; Carro, Laura; Hodgkinson, James T; Spring, David R

    2016-12-01

    Two divergent synthetic routes are reported offering access to four quinolone natural products from Pseudonocardia sp. CL38489. Key steps to the natural products involved a regioselective epoxidation, an intramolecular Buchwald-Hartwig amination and a final acid-catalysed 1,3-allylic-alcohol rearrangement to give two of the natural products in one step. This study completes the synthesis of all eight antibacterial quinolone natural products reported in the family. In addition, this modular strategy enables an improved synthesis towards two natural products previously reported.

  17. A real-time multi-channel monitoring system for stem cell culture process.

    PubMed

    Xicai Yue; Drakakis, E M; Lim, M; Radomska, A; Hua Ye; Mantalaris, A; Panoskaltsis, N; Cass, A

    2008-06-01

    A novel multi-parametric physiological measurement system with up to 128 channels, suitable for monitoring hematopoietic stem cell culture processes and cell cultures in general, is presented in this paper. The system aims to measure in real time the most important physical and chemical culture parameters of hematopoietic stem cells, including physicochemical parameters, nutrients, and metabolites, in a long-term culture process. The overarching scope of this research effort is to control and optimize the whole bioprocess by means of the acquisition of real-time quantitative physiological information from the culture. The system is designed in a modular manner: each hardware module can operate as an independent gain-programmable, level-shift-adjustable, 16-channel data acquisition system specific to a sensor type. Up to eight such data acquisition modules can be combined and connected to the host PC to realize the whole system hardware. The control of data acquisition and the subsequent management of data are performed by the system's software, which is coded in LabVIEW. Preliminary experimental results presented here show that the system not only has the ability to interface to various types of sensors, allowing the monitoring of different types of culture parameters, but can also capture dynamic variations of culture parameters by means of real-time multi-channel measurements, thus providing additional information on both temporal and spatial profiles of these parameters within a bioreactor. The system is by no means constrained to the hematopoietic stem cell culture field; it is suitable for cell growth monitoring applications in general.

  18. Harmony Theory: A Mathematical Framework for Stochastic Parallel Processing.

    ERIC Educational Resources Information Center

    Smolensky, Paul

    This paper presents preliminary results of research founded on the hypothesis that in real environments there exist regularities that can be idealized as mathematical structures that are simple enough to be analyzed. The author considered three steps in analyzing the encoding of modularity of the environment. First, a general information…

  19. Design of Ultra-Wideband Tapered Slot Antenna by Using Binomial Transformer with Corrugation

    NASA Astrophysics Data System (ADS)

    Chareonsiri, Yosita; Thaiwirot, Wanwisa; Akkaraekthalin, Prayoot

    2017-05-01

    In this paper, a tapered slot antenna (TSA) with corrugation is proposed for UWB applications. A multi-section binomial transformer is used to design the taper profile of the proposed TSA without resorting to time-consuming optimization. A step-by-step procedure for synthesizing the step impedance values related to the step slot widths of the taper profile is presented. A smooth taper can then be achieved by fitting a smoothing curve to the entire set of step slots. The TSA designed with this method yields a fairly flat gain and a wide impedance bandwidth covering the UWB spectrum from 3.1 GHz to 10.6 GHz. To further improve the radiation characteristics, corrugation is added on both edges of the proposed TSA. The effects of different corrugation shapes on the improvement of antenna gain and front-to-back ratio (F-to-B ratio) are investigated. To demonstrate the validity of the design, prototypes of the TSA without and with corrugation are fabricated and measured. The results show good agreement between simulation and measurement.
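
    Under small-reflection theory, the section impedances of an N-section binomial transformer follow the recursion ln(Z_{n+1}/Z_n) ≈ 2^(-N) C(N, n) ln(Z_L/Z_0). The sketch below evaluates this recursion; mapping the resulting impedances to the step slot widths of the antenna is the paper's own design step and is not reproduced here.

```python
# Section impedances of an N-section binomial quarter-wave transformer under
# small-reflection theory: ln(Z_{n+1}/Z_n) ~= 2^-N * C(N, n) * ln(Z_L/Z_0), n = 0..N.
# Translating these impedances into step slot widths of the TSA is the design
# step described in the paper and is not reproduced here.
from math import comb, exp, log

def binomial_transformer_impedances(z0, zl, n_sections):
    """Return [Z0, Z1, ..., ZN, ZL] for an N-section binomial transformer."""
    z = [z0]
    for n in range(n_sections + 1):
        step = 2.0 ** (-n_sections) * comb(n_sections, n) * log(zl / z0)
        z.append(z[-1] * exp(step))
    return z

# Example: match 50 ohm to 100 ohm with a 4-section transformer.
print([round(zi, 2) for zi in binomial_transformer_impedances(50.0, 100.0, 4)])
```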

  20. Metaphase II oocytes from human unilaminar follicles grown in a multi-step culture system.

    PubMed

    McLaughlin, M; Albertini, D F; Wallace, W H B; Anderson, R A; Telfer, E E

    2018-03-01

    Can complete oocyte development be achieved from human ovarian tissue containing primordial/unilaminar follicles, grown in vitro in a multi-step culture to meiotic maturation, as demonstrated by the formation of polar bodies and a Metaphase II spindle? Development of human oocytes from primordial/unilaminar stages to resumption of meiosis (Metaphase II) and emission of a polar body was achieved within a serum-free multi-step culture system. Complete development of oocytes in vitro has been achieved in the mouse, where in vitro grown (IVG) oocytes from primordial follicles have resulted in the production of live offspring. Human oocytes have been grown in vitro from the secondary/multi-laminar stage to obtain fully grown oocytes capable of meiotic maturation. However, there are no reports of a culture system supporting complete growth from the earliest stages of human follicle development through to Metaphase II. Ovarian cortical biopsies were obtained with informed consent from women undergoing elective caesarean section (mean age: 30.7 ± 1.7 years; range: 25-39 years; n = 10). The study was performed in a laboratory setting. Ovarian biopsies were dissected into thin strips, and after removal of growing follicles were cultured in serum-free medium for 8 days (Step 1). At the end of this period, secondary/multi-laminar follicles were dissected from the strips and intact follicles 100-150 μm in diameter were selected for further culture. Isolated follicles were cultured individually in serum-free medium in the presence of 100 ng/ml of human recombinant Activin A (Step 2). Individual follicles were monitored, and after 8 days cumulus oocyte complexes (COCs) were retrieved by gentle pressure on the cultured follicles. Complexes with complete cumulus and adherent mural granulosa cells were selected and cultured in the presence of Activin A and FSH on membranes for a further 4 days (Step 3). At the end of Step 3, complexes containing oocytes >100 μm in diameter were selected for IVM in SAGE medium (Step 4) and then fixed for analysis. Pieces of human ovarian cortex cultured in serum-free medium for 8 days (Step 1) supported early follicle growth, and 87 secondary follicles of diameter 120 ± 6 μm (mean ± SEM) could be dissected for further culture. After a further 8 days, 54 of the 87 follicles had reached the antral stage of development. COCs were retrieved by gentle pressure from the cultured follicles and those with adherent mural granulosa cells (n = 48) were selected and cultured for a further 4 days (Step 3). At the end of Step 3, 32 complexes containing oocytes >100 μm in diameter were selected for IVM (Step 4). Nine of these complexes contained polar bodies within 24 h and all polar bodies were abnormally large. Confocal immuno-histochemical analysis showed the presence of a Metaphase II spindle, confirming that these IVG oocytes had resumed meiosis, but their developmental potential is unknown. This is a small number of samples but provides proof of concept that complete development of human oocytes can occur in vitro. Further optimization with morphological evaluation and fertilization potential of IVG oocytes is required to determine whether they are normal. The ability to develop human oocytes from the earliest follicular stages in vitro through to maturation and fertilization would benefit fertility preservation practice. Funded by MRC Grants (G0901839 and MR/L00299X/1). No competing interests.

  1. Task-based image quality assessment in radiation therapy: initial characterization and demonstration with CT simulation images

    NASA Astrophysics Data System (ADS)

    Dolly, Steven R.; Anastasio, Mark A.; Yu, Lifeng; Li, Hua

    2017-03-01

    In current radiation therapy practice, image quality is still assessed subjectively or by utilizing physically-based metrics. Recently, a methodology for objective task-based image quality (IQ) assessment in radiation therapy was proposed by Barrett et al. [1]. In this work, we present a comprehensive implementation and evaluation of this new IQ assessment methodology. A modular simulation framework was designed to perform an automated, computer-simulated end-to-end radiation therapy treatment. A fully simulated framework was created that utilizes new learning-based stochastic object models (SOM) to obtain known organ boundaries, generates a set of images directly from the numerical phantoms created with the SOM, and automates the image segmentation and treatment planning steps of a radiation therapy workflow. By use of this computational framework, therapeutic operating characteristic (TOC) curves can be computed and the area under the TOC curve (AUTOC) can be employed as a figure-of-merit to guide optimization of different components of the treatment planning process. The developed computational framework is employed to optimize X-ray CT pre-treatment imaging. We demonstrate that use of the radiation therapy-based IQ measures leads to different imaging parameters than those obtained by use of physically-based measures.
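
    As a toy illustration of the AUTOC figure-of-merit, the sketch below sweeps a dose-scaling factor through placeholder logistic models for tumour-control and normal-tissue-complication probabilities and integrates the resulting TOC curve. The models and parameters are invented; the paper obtains these probabilities from full end-to-end treatment simulations.

```python
# Illustrative AUTOC computation: sweep an overall dose-scaling factor, evaluate
# placeholder logistic models for tumour-control probability (TCP) and
# normal-tissue complication probability (NTCP), and integrate TCP over NTCP.
# All parameters are invented for illustration only.
import numpy as np

dose_scale = np.linspace(0.0, 2.0, 200)
tcp  = 1.0 / (1.0 + np.exp(-8.0 * (dose_scale - 1.0)))   # tumour control
ntcp = 1.0 / (1.0 + np.exp(-6.0 * (dose_scale - 1.3)))   # complications

order = np.argsort(ntcp)                    # TOC curve: TCP plotted against NTCP
x, y = ntcp[order], tcp[order]
autoc = float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))   # trapezoidal area
print(f"AUTOC ~ {autoc:.3f}  (values near 1 indicate a favourable trade-off)")
```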

  2. Phase change cellular automata modeling of GeTe, GaSb and SnSe stacked chalcogenide films

    NASA Astrophysics Data System (ADS)

    Mihai, C.; Velea, A.

    2018-06-01

    Data storage needs are increasing at a rapid pace across all economic sectors, so the need for new memory technologies with adequate capabilities is also high. Phase change memories (PCMs) are a leading contender in the emerging race for non-volatile memories due to their fast operation speed, high scalability, good reliability and low power consumption. However, in order to meet the present and future storage demands, PCM technologies must further increase the storage density. Here, we employ a probabilistic cellular automata approach to explore the multi-step threshold switching from the reset (off) to the set (on) state in chalcogenide stacked structures. Simulations have shown that in order to obtain multi-step switching with high contrast among different resistance states, the stacked structure needs to contain materials with a large difference among their crystallization temperatures and careful tuning of strata thicknesses. The crystallization dynamics can be controlled through the external energy pulses applied to the system, in such a way that a balance between nucleation and growth in phase change behavior can be achieved, optimized for PCMs.
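
    A minimal one-dimensional probabilistic cellular automaton in the spirit of the model described above is sketched below: each amorphous cell crystallises with a probability that depends on whether the pulse temperature exceeds its stratum's crystallisation temperature and whether a neighbour has already crystallised. The strata temperatures, probabilities and pulse schedule are invented, not the paper's simulation parameters.

```python
# Minimal probabilistic cellular-automaton sketch for crystallisation in a
# two-strata phase-change stack. A cell can crystallise only when the applied
# pulse temperature exceeds its stratum's crystallisation temperature; growth
# next to a crystalline neighbour is more likely than fresh nucleation.
import random

random.seed(0)
N = 40
tc = [160.0] * (N // 2) + [320.0] * (N // 2)   # crystallisation temp per cell (two strata)
state = [0] * N                                 # 0 = amorphous, 1 = crystalline

def step(temp, p_nucleate=0.02, p_grow=0.4):
    new = state[:]
    for i, s in enumerate(state):
        if s or temp < tc[i]:
            continue
        has_xtal_neighbour = (i > 0 and state[i - 1]) or (i < N - 1 and state[i + 1])
        if random.random() < (p_grow if has_xtal_neighbour else p_nucleate):
            new[i] = 1
    state[:] = new

# Two-step set: low-temperature pulses switch only the first stratum
# (intermediate resistance state), a hotter pulse then switches the second.
for temp in (200.0, 200.0, 200.0, 350.0, 350.0):
    for _ in range(30):
        step(temp)
    print(f"pulse at T={temp:5.1f}: crystalline fraction = {sum(state)/N:.2f}")
```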

  3. Application of dragonfly algorithm for optimal performance analysis of process parameters in turn-mill operations- A case study

    NASA Astrophysics Data System (ADS)

    Vikram, K. Arun; Ratnam, Ch; Lakshmi, VVK; Kumar, A. Sunny; Ramakanth, RT

    2018-02-01

    Meta-heuristic multi-response optimization methods are widely used to solve multi-objective problems and obtain Pareto-optimal solutions. This work focuses on the optimal multi-response evaluation of process parameters for responses such as surface roughness (Ra), surface hardness (H) and tool vibration displacement amplitude (Vib) in tangential and orthogonal turn-mill operations on an A-axis Computer Numerical Control vertical milling center. Process parameters such as tool speed, feed rate and depth of cut are considered; brass material is machined under dry conditions with high-speed steel end milling cutters using a Taguchi design of experiments (DOE). The dragonfly algorithm, a meta-heuristic, is used to optimize the objectives Ra, H and Vib and to identify the optimal multi-response process parameter combination. The results obtained from the multi-objective dragonfly algorithm (MODA) are then compared with another multi-response optimization technique, viz. grey relational analysis (GRA).

  4. Hardware Design and Testing of SUPERball, A Modular Tensegrity Robot

    NASA Technical Reports Server (NTRS)

    Sabelhaus, Andrew P.; Bruce, Jonathan; Caluwaerts, Ken; Chen, Yangxin; Lu, Dizhou; Liu, Yuejia; Agogino, Adrian K.; SunSpiral, Vytas; Agogino, Alice M.

    2014-01-01

    We are developing a system of modular, autonomous "tensegrity end-caps" to enable the rapid exploration of untethered tensegrity robot morphologies and functions. By adopting a self-contained modular approach, different end-caps with various capabilities (such as peak torques, or motor speeds), can be easily combined into new tensegrity robots composed of rods, cables, and actuators of different scale (such as in length, mass, peak loads, etc). As a first step in developing this concept, we are in the process of designing and testing the end-caps for SUPERball (Spherical Underactuated Planetary Exploration Robot), a project at the Dynamic Tensegrity Robotics Lab (DTRL) within NASA Ames's Intelligent Robotics Group. This work discusses the evolving design concepts and test results that have gone into the structural, mechanical, and sensing aspects of SUPERball. This representative tensegrity end-cap design supports robust and repeatable untethered mobility tests of the SUPERball, while providing high force, high displacement actuation, with a low-friction, compliant cabling system.

  5. Generation and development of RNA ligase ribozymes with modular architecture through "design and selection".

    PubMed

    Fujita, Yuki; Ishikawa, Junya; Furuta, Hiroyuki; Ikawa, Yoshiya

    2010-08-26

    In vitro selection with long random RNA libraries has been used as a powerful method to generate novel functional RNAs, although it often requires laborious structural analysis of isolated RNA molecules. Rational RNA design is an attractive alternative to avoid this laborious step, but rational design of catalytic modules is still a challenging task. A hybrid strategy of in vitro selection and rational design has been proposed. With this strategy termed "design and selection," new ribozymes can be generated through installation of catalytic modules onto RNA scaffolds with defined 3D structures. This approach, the concept of which was inspired by the modular architecture of naturally occurring ribozymes, allows prediction of the overall architectures of the resulting ribozymes, and the structural modularity of the resulting ribozymes allows modification of their structures and functions. In this review, we summarize the design, generation, properties, and engineering of four classes of ligase ribozyme generated by design and selection.

  6. Structural damage detection-oriented multi-type sensor placement with multi-objective optimization

    NASA Astrophysics Data System (ADS)

    Lin, Jian-Fu; Xu, You-Lin; Law, Siu-Seong

    2018-05-01

    A structural damage detection-oriented multi-type sensor placement method with multi-objective optimization is developed in this study. The multi-type response covariance sensitivity-based damage detection method is first introduced. Two objective functions for optimal sensor placement are then introduced in terms of the response covariance sensitivity and the response independence. The multi-objective optimization problem is formed by using the two objective functions, and the non-dominated sorting genetic algorithm (NSGA)-II is adopted to find the solution for the optimal multi-type sensor placement to achieve the best structural damage detection. The proposed method is finally applied to a nine-bay three-dimensional frame structure. Numerical results show that the optimal multi-type sensor placement determined by the proposed method can avoid redundant sensors and provide satisfactory results for structural damage detection. Restricting the number of sensors of each type reduces the search space of the optimization and makes the proposed method more effective. Moreover, how to select the most suitable sensor placement from the Pareto solutions via the utility function and the knee-point method is demonstrated in the case study.
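
    The core of the multi-objective step is the extraction of non-dominated (Pareto) solutions. The sketch below shows this for two minimisation objectives on randomly generated placeholder sensor layouts; it stands in for, and does not reproduce, the NSGA-II machinery used in the paper.

```python
# Minimal extraction of the Pareto (non-dominated) set for two objectives to be
# minimised, standing in for the NSGA-II machinery used in the paper. Candidate
# sensor layouts and their objective values are random placeholders.
import random

random.seed(1)
# each candidate: (layout_id, f1, f2), e.g. f1 = -response-covariance sensitivity,
# f2 = a response-dependence (redundancy) measure -- both to be minimised
candidates = [(i, random.random(), random.random()) for i in range(50)]

def dominates(a, b):
    """True if candidate a is at least as good as b in both objectives and better in one."""
    return a[1] <= b[1] and a[2] <= b[2] and (a[1] < b[1] or a[2] < b[2])

pareto = [c for c in candidates
          if not any(dominates(other, c) for other in candidates if other is not c)]
for layout_id, f1, f2 in sorted(pareto, key=lambda c: c[1]):
    print(f"layout {layout_id:2d}:  f1={f1:.3f}  f2={f2:.3f}")
```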

  7. An integrated approach for the knowledge discovery in computer simulation models with a multi-dimensional parameter space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khawli, Toufik Al; Eppelt, Urs; Hermanns, Torsten

    2016-06-08

    In production industries, parameter identification, sensitivity analysis and multi-dimensional visualization are vital steps in the planning process for achieving optimal designs and gaining valuable information. Sensitivity analysis and visualization can help in identifying the most-influential parameters and quantify their contribution to the model output, reduce the model complexity, and enhance the understanding of the model behavior. Typically, this requires a large number of simulations, which can be both very expensive and time consuming when the simulation models are numerically complex and the number of parameter inputs increases. There are three main constituent parts in this work. The first part is to substitute the numerical, physical model by an accurate surrogate model, the so-called metamodel. The second part includes a multi-dimensional visualization approach for the visual exploration of metamodels. In the third part, the metamodel is used to provide the two global sensitivity measures: i) the Elementary Effect for screening the parameters, and ii) the variance decomposition method for calculating the Sobol indices that quantify both the main and interaction effects. The application of the proposed approach is illustrated with an industrial application with the goal of optimizing a drilling process using a Gaussian laser beam.

  8. An integrated approach for the knowledge discovery in computer simulation models with a multi-dimensional parameter space

    NASA Astrophysics Data System (ADS)

    Khawli, Toufik Al; Gebhardt, Sascha; Eppelt, Urs; Hermanns, Torsten; Kuhlen, Torsten; Schulz, Wolfgang

    2016-06-01

    In production industries, parameter identification, sensitivity analysis and multi-dimensional visualization are vital steps in the planning process for achieving optimal designs and gaining valuable information. Sensitivity analysis and visualization can help in identifying the most-influential parameters and quantify their contribution to the model output, reduce the model complexity, and enhance the understanding of the model behavior. Typically, this requires a large number of simulations, which can be both very expensive and time consuming when the simulation models are numerically complex and the number of parameter inputs increases. There are three main constituent parts in this work. The first part is to substitute the numerical, physical model by an accurate surrogate model, the so-called metamodel. The second part includes a multi-dimensional visualization approach for the visual exploration of metamodels. In the third part, the metamodel is used to provide the two global sensitivity measures: i) the Elementary Effect for screening the parameters, and ii) the variance decomposition method for calculating the Sobol indices that quantify both the main and interaction effects. The application of the proposed approach is illustrated with an industrial application with the goal of optimizing a drilling process using a Gaussian laser beam.

  9. STS multimission modular spacecraft - A new horizon in social and industrial benefits

    NASA Technical Reports Server (NTRS)

    Cepollina, F. J.; Pritchard, E. I.

    1977-01-01

    Economics and benefits of orbiting observatory Multi-mission Modular Spacecraft are discussed. The Space Shuttle can be used both to place these satellites in low-altitude workhorse orbits and to maintain their functioning in any of three ways - on-orbit service (by visiting the satellite with the Shuttle), Shuttle retrieval and ground refurbishment of the entire satellite, and replacement of a failed satellite with a new one. Individuals could receive information from these satellites either indirectly, by turning to a specialty television station, or directly, by calling up information on their television sets as needed. Such information, for example, might include reconnoitering interesting areas, following the weather, or locating migratory fish. Equipment and costs of the proposed Landsats and Seasats are discussed.

  10. Design and Development of a Low-Cost Aerial Mobile Mapping System for Multi-Purpose Applications

    NASA Astrophysics Data System (ADS)

    Acevedo Pardo, C.; Farjas Abadía, M.; Sternberg, H.

    2015-08-01

    The research project with the working title "Design and development of a low-cost modular Aerial Mobile Mapping System" was formed during the last year as the result of numerous discussions and considerations with colleagues from the HafenCity University Hamburg, Department Geomatics. The aim of the project is to design a sensor platform which can be embedded preferentially on a UAV, but which can also be integrated on any adaptable vehicle. The system should perform direct scanning of surfaces with a laser scanner, supported by sensors for determining the position and attitude of the platform. The modular design allows its extension with other sensors such as multispectral cameras, digital cameras or multi-camera systems.

  11. A modular solid state detector for measuring high energy heavy ion fragmentation near the beam axis

    NASA Technical Reports Server (NTRS)

    Zeitlin, C. J.; Frankel, K. A.; Gong, W.; Heilbronn, L.; Lampo, E. J.; Leres, R.; Miller, J.; Schimmerling, W.

    1994-01-01

    A multi-element solid state detector has been designed to measure fluences of fragments produced near the beam axis by high energy heavy ion beams in thick targets. The detector is compact and modular, so as to be readily reconfigured according to the range of fragment charges and energies to be measured. Preamplifier gain settings and detector calibrations are adjustable remotely under computer control. We describe the central detector, its associated detectors and electronics, triggering scheme, data acquisition and particle identification techniques, illustrated by data taken with 600 MeV/u 56Fe beams and thick polyethylene targets at the LBL Bevalac. The applications of this work to space radiation protection are discussed.

  12. Using CamiTK for rapid prototyping of interactive computer assisted medical intervention applications.

    PubMed

    Promayon, Emmanuel; Fouard, Céline; Bailet, Mathieu; Deram, Aurélien; Fiard, Gaëlle; Hungr, Nikolai; Luboz, Vincent; Payan, Yohan; Sarrazin, Johan; Saubat, Nicolas; Selmi, Sonia Yuki; Voros, Sandrine; Cinquin, Philippe; Troccaz, Jocelyne

    2013-01-01

    Computer Assisted Medical Intervention (CAMI hereafter) is a complex multi-disciplinary field. CAMI research requires the collaboration of experts in several fields as diverse as medicine, computer science, mathematics, instrumentation, signal processing, mechanics, modeling, automatics, optics, etc. CamiTK is a modular framework that helps researchers and clinicians collaborate in order to prototype CAMI applications by regrouping the knowledge and expertise from each discipline. It is an open-source, cross-platform, generic and modular tool written in C++ which can handle medical images, surgical navigation, biomedical simulations and robot control. This paper presents the Computer Assisted Medical Intervention ToolKit (CamiTK) and how it is used in various applications in our research team.

  13. Design of the VISITOR Tool: A Versatile ImpulSive Interplanetary Trajectory OptimizeR

    NASA Technical Reports Server (NTRS)

    Corpaccioli, Luca; Linskens, Harry; Komar, David R.

    2014-01-01

    The design of trajectories for interplanetary missions represents one of the most complex and important problems to solve during conceptual space mission design. To facilitate conceptual mission sizing activities, it is essential to obtain sufficiently accurate trajectories in a fast and repeatable manner. To this end, the VISITOR tool was developed. This tool modularly augments a patched conic MGA-1DSM model with a mass model, launch window analysis, and the ability to simulate more realistic arrival and departure operations. This was implemented in MATLAB, exploiting the built-in optimization tools and vector analysis routines. The chosen optimization strategy uses a grid search and pattern search, an iterative variable grid method. A genetic algorithm can be selectively used to improve search space pruning, at the cost of losing the repeatability of the results and increased computation time. The tool was validated against seven flown missions: the average total mission ΔV offset from the nominal trajectory was 9.1%, which was reduced to 7.3% when using the genetic algorithm at the cost of an increase in computation time by a factor of 5.7. It was found that VISITOR was well-suited for the conceptual design of interplanetary trajectories, while also facilitating future improvements due to its modular structure.
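
    The sketch below is a Python illustration of a compass-style pattern search with a shrinking step, the kind of iterative variable-grid refinement mentioned above (VISITOR itself is implemented in MATLAB). The test objective is a generic smooth function, not a trajectory ΔV model.

```python
# Compass-style pattern search with a shrinking step, illustrating an iterative
# variable-grid refinement. The objective is a stand-in (a smooth 2-D test
# function), not VISITOR's trajectory Delta-V model.
def pattern_search(f, x0, step=1.0, tol=1e-6, max_iter=10_000):
    x, fx = list(x0), f(x0)
    it = 0
    while step > tol and it < max_iter:
        improved = False
        for i in range(len(x)):
            for delta in (+step, -step):
                trial = x[:]
                trial[i] += delta
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5          # refine the grid around the current best point
        it += 1
    return x, fx

rosenbrock = lambda p: (1 - p[0]) ** 2 + 100 * (p[1] - p[0] ** 2) ** 2
print(pattern_search(rosenbrock, [-1.5, 2.0]))
```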

  14. A derived heuristics based multi-objective optimization procedure for micro-grid scheduling

    NASA Astrophysics Data System (ADS)

    Li, Xin; Deb, Kalyanmoy; Fang, Yanjun

    2017-06-01

    With the availability of different types of power generators to be used in an electric micro-grid system, their operation scheduling as the load demand changes with time becomes an important task. Besides satisfying load balance constraints and the generator's rated power, several other practicalities, such as limited availability of grid power and restricted ramping of power output from generators, must all be considered during the operation scheduling process, which makes it difficult to decide whether the optimization results are accurate and satisfactory. In solving such complex practical problems, heuristics-based customized optimization algorithms are suggested. However, due to nonlinear and complex interactions of variables, it is difficult to come up with heuristics in such problems off-hand. In this article, a two-step strategy is proposed in which the first task deciphers important heuristics about the problem and the second task utilizes the derived heuristics to solve the original problem in a computationally fast manner. Specifically, the specific operation scheduling is considered from a two-objective (cost and emission) point of view. The first task develops basic and advanced level knowledge bases offline from a series of prior demand-wise optimization runs and then the second task utilizes them to modify optimized solutions in an application scenario. Results on island and grid connected modes and several pragmatic formulations of the micro-grid operation scheduling problem clearly indicate the merit of the proposed two-step procedure.

  15. Parallel Multi-Step/Multi-Rate Integration of Two-Time Scale Dynamic Systems

    NASA Technical Reports Server (NTRS)

    Chang, Johnny T.; Ploen, Scott R.; Sohl, Garett. A,; Martin, Bryan J.

    2004-01-01

    Increasing fidelity demands on real-time and high-fidelity simulations are stressing the capacity of modern processors. New integration techniques are required that provide maximum efficiency for systems that are parallelizable. However, many current techniques make assumptions that are at odds with non-cascadable systems. A new serial multi-step/multi-rate integration algorithm for dual-timescale continuous-state systems is presented which applies to these systems, and it is extended to a parallel multi-step/multi-rate algorithm. The superior performance of both algorithms is demonstrated through a representative example.
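
    A minimal serial multi-rate step is sketched below: the fast state is sub-cycled with a small step while the slow state advances once per macro step. Forward Euler is used for clarity, and the example dynamics are invented; the paper's parallel multi-step scheme is not reproduced.

```python
# Minimal serial multi-rate integration of a two-time-scale system: the slow
# state takes one step of size H while the fast state is sub-cycled with h = H/m,
# holding the slow state frozen across the substeps. Forward Euler is used for
# clarity; the paper's parallel multi-step scheme is not reproduced here.
def multirate_step(xs, xf, H, m, f_slow, f_fast):
    h = H / m
    for _ in range(m):                 # sub-cycle the fast dynamics
        xf = xf + h * f_fast(xs, xf)
    xs = xs + H * f_slow(xs, xf)       # then advance the slow dynamics
    return xs, xf

# Example (invented): slow decay weakly coupled to a fast, strongly damped state.
f_slow = lambda xs, xf: -0.1 * xs + 0.05 * xf
f_fast = lambda xs, xf: -50.0 * (xf - xs)

xs, xf = 1.0, 0.0
for _ in range(100):
    xs, xf = multirate_step(xs, xf, H=0.05, m=20, f_slow=f_slow, f_fast=f_fast)
print(f"slow state {xs:.4f}, fast state {xf:.4f} after 100 macro steps")
```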

  16. A multi-scale convolutional neural network for phenotyping high-content cellular images.

    PubMed

    Godinez, William J; Hossain, Imtiaz; Lazic, Stanley E; Davies, John W; Zhang, Xian

    2017-07-01

    Identifying phenotypes based on high-content cellular images is challenging. Conventional image analysis pipelines for phenotype identification comprise multiple independent steps, with each step requiring method customization and adjustment of multiple parameters. Here, we present an approach based on a multi-scale convolutional neural network (M-CNN) that classifies, in a single cohesive step, cellular images into phenotypes by using directly and solely the images' pixel intensity values. The only parameters in the approach are the weights of the neural network, which are automatically optimized based on training images. The approach requires no a priori knowledge or manual customization, and is applicable to single- or multi-channel images displaying single or multiple cells. We evaluated the classification performance of the approach on eight diverse benchmark datasets. Overall, the approach yielded a higher classification accuracy compared with state-of-the-art results, including those of other deep CNN architectures. In addition to using the network to simply obtain a yes-or-no prediction for a given phenotype, we use the probability outputs calculated by the network to quantitatively describe the phenotypes. This study shows that these probability values correlate with chemical treatment concentrations. This finding further validates our approach and enables chemical treatment potency estimation via CNNs. The network specifications and solver definitions are provided in Supplementary Software 1. Contact: william_jose.godinez_navarro@novartis.com or xian-1.zhang@novartis.com. Supplementary data are available at Bioinformatics online.
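
    A compact sketch of a multi-scale CNN in PyTorch is given below: parallel branches see the image at different downsampling factors and their pooled features are concatenated before classification. The channel counts, scales and layer choices are illustrative and do not reproduce the published M-CNN architecture.

```python
# Compact multi-scale CNN sketch in PyTorch: parallel branches process the image
# at different downsampling factors; their globally pooled features are
# concatenated and classified. Channel counts and scales are illustrative, not
# the published M-CNN architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiScaleCNN(nn.Module):
    def __init__(self, in_channels=1, num_phenotypes=8, scales=(1, 4, 16)):
        super().__init__()
        self.scales = scales
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(in_channels, 16, 3, padding=1), nn.ReLU(),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),            # global pooling per branch
            )
            for _ in scales
        ])
        self.classifier = nn.Linear(32 * len(scales), num_phenotypes)

    def forward(self, x):
        feats = []
        for scale, branch in zip(self.scales, self.branches):
            xi = F.avg_pool2d(x, scale) if scale > 1 else x   # view at this scale
            feats.append(branch(xi).flatten(1))
        return self.classifier(torch.cat(feats, dim=1))       # phenotype logits

probs = torch.softmax(MultiScaleCNN()(torch.randn(2, 1, 256, 256)), dim=1)
print(probs.shape)   # (2, 8): per-phenotype probabilities for two images
```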

  17. A modular approach to large-scale design optimization of aerospace systems

    NASA Astrophysics Data System (ADS)

    Hwang, John T.

    Gradient-based optimization and the adjoint method form a synergistic combination that enables the efficient solution of large-scale optimization problems. Though the gradient-based approach struggles with non-smooth or multi-modal problems, the capability to efficiently optimize up to tens of thousands of design variables provides a valuable design tool for exploring complex tradeoffs and finding unintuitive designs. However, the widespread adoption of gradient-based optimization is limited by the implementation challenges for computing derivatives efficiently and accurately, particularly in multidisciplinary and shape design problems. This thesis addresses these difficulties in two ways. First, to deal with the heterogeneity and integration challenges of multidisciplinary problems, this thesis presents a computational modeling framework that solves multidisciplinary systems and computes their derivatives in a semi-automated fashion. This framework is built upon a new mathematical formulation developed in this thesis that expresses any computational model as a system of algebraic equations and unifies all methods for computing derivatives using a single equation. The framework is applied to two engineering problems: the optimization of a nanosatellite with 7 disciplines and over 25,000 design variables; and simultaneous allocation and mission optimization for commercial aircraft involving 330 design variables, 12 of which are integer variables handled using the branch-and-bound method. In both cases, the framework makes large-scale optimization possible by reducing the implementation effort and code complexity. The second half of this thesis presents a differentiable parametrization of aircraft geometries and structures for high-fidelity shape optimization. Existing geometry parametrizations are not differentiable, or they are limited in the types of shape changes they allow. This is addressed by a novel parametrization that smoothly interpolates aircraft components, providing differentiability. An unstructured quadrilateral mesh generation algorithm is also developed to automate the creation of detailed meshes for aircraft structures, and a mesh convergence study is performed to verify that the quality of the mesh is maintained as it is refined. As a demonstration, high-fidelity aerostructural analysis is performed for two unconventional configurations with detailed structures included, and aerodynamic shape optimization is applied to the truss-braced wing, which finds and eliminates a shock in the region bounded by the struts and the wing.
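
    The efficiency argument above rests on the adjoint identity df/dx = ∂f/∂x − ψᵀ ∂R/∂x, where ψ solves (∂R/∂u)ᵀ ψ = (∂f/∂u)ᵀ. The sketch below verifies it on a small linear model; it illustrates the principle only and is not the thesis framework itself.

```python
# Tiny illustration of the adjoint total derivative that makes gradient-based
# optimisation scale with many design variables: for residuals R(u, x) = 0 and
# objective f(u, x),
#   df/dx = df/dx|_partial - psi^T dR/dx,  with (dR/du)^T psi = (df/du)^T.
# A small linear system stands in for a discipline model (invented data).
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 3
A = rng.normal(size=(n, n)) + 5 * np.eye(n)   # dR/du for R = A u - B x
B = rng.normal(size=(n, m))                   # dR/dx = -B
c = rng.normal(size=n)                        # f = c^T u, so df/du = c, df/dx|_partial = 0

x = rng.normal(size=m)
u = np.linalg.solve(A, B @ x)                 # state solution with R(u, x) = 0

psi = np.linalg.solve(A.T, c)                 # one adjoint solve, independent of m
df_dx_adjoint = psi @ B                       # 0 - psi^T (-B) = psi^T B

df_dx_direct = c @ np.linalg.solve(A, B)      # direct method: m linear solves
print(np.allclose(df_dx_adjoint, df_dx_direct))   # True
```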

  18. Resolving anatomical and functional structure in human brain organization: identifying mesoscale organization in weighted network representations.

    PubMed

    Lohse, Christian; Bassett, Danielle S; Lim, Kelvin O; Carlson, Jean M

    2014-10-01

    Human brain anatomy and function display a combination of modular and hierarchical organization, suggesting the importance of both cohesive structures and variable resolutions in the facilitation of healthy cognitive processes. However, tools to simultaneously probe these features of brain architecture require further development. We propose and apply a set of methods to extract cohesive structures in network representations of brain connectivity using multi-resolution techniques. We employ a combination of soft thresholding, windowed thresholding, and resolution in community detection, that enable us to identify and isolate structures associated with different weights. One such mesoscale structure is bipartivity, which quantifies the extent to which the brain is divided into two partitions with high connectivity between partitions and low connectivity within partitions. A second, complementary mesoscale structure is modularity, which quantifies the extent to which the brain is divided into multiple communities with strong connectivity within each community and weak connectivity between communities. Our methods lead to multi-resolution curves of these network diagnostics over a range of spatial, geometric, and structural scales. For statistical comparison, we contrast our results with those obtained for several benchmark null models. Our work demonstrates that multi-resolution diagnostic curves capture complex organizational profiles in weighted graphs. We apply these methods to the identification of resolution-specific characteristics of healthy weighted graph architecture and altered connectivity profiles in psychiatric disease.
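
    The sketch below illustrates one ingredient of such an analysis: sweeping a threshold over a weighted connectivity matrix and tracking the modularity of greedily detected communities with networkx. The random matrix is a placeholder for brain connectivity, and the paper's full multi-resolution pipeline (including the bipartivity diagnostic) is considerably richer.

```python
# Sketch of a threshold sweep on a weighted connectivity matrix, tracking the
# modularity of greedily detected communities at each threshold. A random
# symmetric matrix stands in for brain connectivity (placeholder data).
import numpy as np
import networkx as nx
from networkx.algorithms import community

rng = np.random.default_rng(0)
W = rng.random((30, 30))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)

for thr in np.linspace(0.1, 0.9, 5):
    A = np.where(W >= thr, W, 0.0)              # keep only edges above the threshold
    G = nx.from_numpy_array(A)
    G.remove_nodes_from(list(nx.isolates(G)))
    if G.number_of_edges() == 0:
        continue
    comms = community.greedy_modularity_communities(G, weight="weight")
    Q = community.modularity(G, comms, weight="weight")
    print(f"threshold {thr:.2f}: {len(comms)} communities, modularity Q = {Q:.3f}")
```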

  19. Designing nacre-like materials for simultaneous stiffness, strength and toughness: Optimum materials, composition, microstructure and size

    NASA Astrophysics Data System (ADS)

    Barthelat, Francois

    2014-12-01

    Nacre, bone and spider silk are staggered composites where inclusions of high aspect ratio reinforce a softer matrix. Such staggered composites have emerged through natural selection as the best configuration to produce stiffness, strength and toughness simultaneously. As a result, these remarkable materials are increasingly serving as model for synthetic composites with unusual and attractive performance. While several models have been developed to predict basic properties for biological and bio-inspired staggered composites, the designer is still left to struggle with finding optimum parameters. Unresolved issues include choosing optimum properties for inclusions and matrix, and resolving the contradictory effects of certain design variables. Here we overcome these difficulties with a multi-objective optimization for simultaneous high stiffness, strength and energy absorption in staggered composites. Our optimization scheme includes material properties for inclusions and matrix as design variables. This process reveals new guidelines, for example the staggered microstructure is only advantageous if the tablets are at least five times stronger than the interfaces, and only if high volume concentrations of tablets are used. We finally compile the results into a step-by-step optimization procedure which can be applied for the design of any type of high-performance staggered composite and at any length scale. The procedure produces optimum designs which are consistent with the materials and microstructure of natural nacre, confirming that this natural material is indeed optimized for mechanical performance.

  20. Multi-period equilibrium/near-equilibrium in electricity markets based on locational marginal prices

    NASA Astrophysics Data System (ADS)

    Garcia Bertrand, Raquel

    In this dissertation we propose an equilibrium procedure that coordinates the point of view of every market agent resulting in an equilibrium that simultaneously maximizes the independent objective of every market agent and satisfies network constraints. Therefore, the activities of the generating companies, consumers and an independent system operator are modeled: (1) The generating companies seek to maximize profits by specifying hourly step functions of productions and minimum selling prices, and bounds on productions. (2) The goals of the consumers are to maximize their economic utilities by specifying hourly step functions of demands and maximum buying prices, and bounds on demands. (3) The independent system operator then clears the market taking into account consistency conditions as well as capacity and line losses so as to achieve maximum social welfare. Then, we approach this equilibrium problem using complementarity theory in order to have the capability of imposing constraints on dual variables, i.e., on prices, such as minimum profit conditions for the generating units or maximum cost conditions for the consumers. In this way, given the form of the individual optimization problems, the Karush-Kuhn-Tucker conditions for the generating companies, the consumers and the independent system operator are both necessary and sufficient. The simultaneous solution to all these conditions constitutes a mixed linear complementarity problem. We include minimum profit constraints imposed by the units in the market equilibrium model. These constraints are added as additional constraints to the equivalent quadratic programming problem of the mixed linear complementarity problem previously described. For the sake of clarity, the proposed equilibrium or near-equilibrium is first developed for the particular case considering only one time period. Afterwards, we consider an equilibrium or near-equilibrium applied to a multi-period framework. This model embodies binary decisions, i.e., on/off status for the units, and therefore optimality conditions cannot be directly applied. To avoid limitations provoked by binary variables, while retaining the advantages of using optimality conditions, we define the multi-period market equilibrium using Benders decomposition, which allows computing binary variables through the master problem and continuous variables through the subproblem. Finally, we illustrate these market equilibrium concepts through several case studies.

  1. What is the role of curvature on the properties of nanomaterials for biomedical applications?

    PubMed

    Gonzalez Solveyra, Estefania; Szleifer, Igal

    2016-05-01

    The use of nanomaterials for drug delivery and theranostics applications is a promising paradigm in nanomedicine, as it brings together the best features of nanotechnology, molecular biology, and medicine. To fully exploit the synergistic potential of such an interdisciplinary strategy, a comprehensive description of the interactions at the interface between nanomaterials and biological systems is not only crucial, but also mandatory. Routine strategies to engineer nanomaterial-based drugs comprise modifying their surface with biocompatible and targeting ligands, in many cases resorting to modular approaches that assume additive behavior. However, emergent behavior can be observed when combining confinement and curvature. The final properties of functionalized nanomaterials become dependent not only on the properties of their constituents but also on the geometry of the nano-bio interface, and on the local molecular environment. Modularity no longer holds, and the coupling between interactions, chemical equilibrium, and molecular organization has to be directly addressed in order to design smart nanomaterials with controlled spatial functionalization for optimized biomedical applications. Nanoparticle curvature becomes an integral part of the design strategy, enabling control and engineering of the chemical and surface properties with molecular precision. Understanding how nanoparticle size, morphology, and surface chemistry are interrelated will put us one step closer to engineering nanobiomaterials capable of mimicking biological structures and their behaviors, paving the way to applications and the possibility to elucidate the use of curvature by biological systems. WIREs Nanomed Nanobiotechnol 2016, 8:334-354. doi: 10.1002/wnan.1365

  2. Performance study of the gamma-ray bursts polarimeter POLAR

    NASA Astrophysics Data System (ADS)

    Sun, J. C.; Wu, B. B.; Bao, T. W.; Batsch, T.; Bernasconi, T.; Britvitch, I.; Cadoux, F.; Cernuda, I.; Chai, J. Y.; Dong, Y. W.; Gauvin, N.; Hajdas, W.; He, J. J.; Kole, M.; Kong, M. N.; Kong, S. W.; Lechanoine-Leluc, C.; Li, Lu; Liu, J. T.; Liu, X.; Marcinkowski, R.; Orsi, S.; Pohl, M.; Produit, N.; Rapin, D.; Rutczynska, A.; Rybka, D.; Shi, H. L.; Song, L. M.; Szabelski, J.; Wang, R. J.; Wen, X.; Xiao, H. L.; Xiong, S. L.; Xu, H. H.; Xu, M.; Zhang, L.; Zhang, L. Y.; Zhang, S. N.; Zhang, X. F.; Zhang, Y. J.; Zwolinska, A.

    2016-07-01

    The gamma-ray burst polarimeter POLAR is a highly sensitive detector dedicated to the measurement of GRB polarization, with a large effective detection area and a large field of view (FOV). The optimized performance of POLAR will contribute to the capture and measurement of transient sources such as GRBs and solar flares. The detection energy range of POLAR is 50-500 keV, where detection is dominated by the Compton scattering effect. POLAR consists of 25 detector modular units (DMUs), and each DMU is composed of low-Z plastic scintillators (PS), multi-anode photomultipliers (MAPMTs) and multi-channel ASIC front-end electronics (FEE). The POLAR experiment is an international collaboration involving China, Switzerland and Poland, and is expected to be launched in September 2016 onboard the Chinese space laboratory "Tiangong-2" (TG-2). Through the efforts of the collaboration, POLAR has passed through the Demonstration Model (DM), Engineering and Qualification Model (EQM) and Qualification Model (QM) phases, and a full Flight Model (FM) has now been constructed. After its construction, the FM of POLAR passed the environmental acceptance tests (thermal cycling, vibration, shock and thermal vacuum tests) and underwent calibration tests with both radioactive sources and a 100% polarized gamma-ray beam at the ESRF. The design of POLAR, the Monte Carlo simulation analysis, and the performance test results are all presented in this paper.

  3. Multi-Mission System Architecture Platform: Design and Verification of the Remote Engineering Unit

    NASA Technical Reports Server (NTRS)

    Sartori, John

    2005-01-01

    The Multi-Mission System Architecture Platform (MSAP) represents an effort to bolster efficiency in the spacecraft design process. By incorporating essential spacecraft functionality into a modular, expandable system, the MSAP provides a foundation on which future spacecraft missions can be developed. Once completed, the MSAP will provide support for missions with varying objectives, while maintaining a level of standardization that will minimize redesign of general system components. One subsystem of the MSAP, the Remote Engineering Unit (REU), functions by gathering engineering telemetry from strategic points on the spacecraft and providing these measurements to the spacecraft's Command and Data Handling (C&DH) subsystem. Before the MSAP Project reaches completion, all hardware, including the REU, must be verified. However, the speed and complexity of the REU circuitry rule out the possibility of physical prototyping. Instead, the MSAP hardware is designed and verified using the Verilog Hardware Description Language (HDL). An increasingly popular means of digital design, HDL programming provides a level of abstraction, which allows the designer to focus on functionality while logic synthesis tools take care of gate-level design and optimization. As verification of the REU proceeds, errors are quickly remedied, preventing costly changes during hardware validation. After undergoing the careful, iterative processes of verification and validation, the REU and MSAP will prove their readiness for use in a multitude of spacecraft missions.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meng, F.; Banks, J. W.; Henshaw, W. D.

    We describe a new partitioned approach for solving conjugate heat transfer (CHT) problems where the governing temperature equations in different material domains are time-stepped in an implicit manner, but where the interface coupling is explicit. The new approach, called the CHAMP scheme (Conjugate Heat transfer Advanced Multi-domain Partitioned), is based on a discretization of the interface coupling conditions using a generalized Robin (mixed) condition. The weights in the Robin condition are determined from the optimization of a condition derived from a local stability analysis of the coupling scheme. The interface treatment combines ideas from optimized-Schwarz methods for domain-decomposition problems together with the interface jump conditions and additional compatibility jump conditions derived from the governing equations. For many problems (i.e. for a wide range of material properties, grid-spacings and time-steps) the CHAMP algorithm is stable and second-order accurate using no sub-time-step iterations (i.e. a single implicit solve of the temperature equation in each domain). In extreme cases (e.g. very fine grids with very large time-steps) it may be necessary to perform one or more sub-iterations. Each sub-iteration generally increases the range of stability substantially and thus one sub-iteration is likely sufficient for the vast majority of practical problems. The CHAMP algorithm is developed first for a model problem and analyzed using normal-mode theory. The theory provides a mechanism for choosing optimal parameters in the mixed interface condition. A comparison is made to the classical Dirichlet-Neumann (DN) method and, where applicable, to the optimized-Schwarz (OS) domain-decomposition method. For problems with different thermal conductivities and diffusivities, the CHAMP algorithm outperforms the DN scheme. For domain-decomposition problems with uniform conductivities and diffusivities, the CHAMP algorithm performs better than the typical OS scheme with one grid-cell overlap. Lastly, the CHAMP scheme is also developed for general curvilinear grids and CHT examples are presented using composite overset grids that confirm the theory and demonstrate the effectiveness of the approach.
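
    For orientation, the generalized Robin (mixed) interface coupling used by such partitioned schemes can be written in the following generic form (a sketch only; the specific CHAMP weights come from the paper's local stability analysis and are not reproduced here):

        \[
        \alpha_1 T_1 + k_1\,\partial_n T_1 \;=\; \alpha_1 T_2 + k_2\,\partial_n T_2 ,
        \qquad
        \alpha_2 T_2 + k_2\,\partial_n T_2 \;=\; \alpha_2 T_1 + k_1\,\partial_n T_1 ,
        \]

    with the normal derivatives taken consistently across the interface and the right-hand sides evaluated from the neighbouring domain's latest available data. Letting a weight tend to infinity recovers a Dirichlet (temperature-continuity) condition, while letting it tend to zero recovers a Neumann (flux-continuity) condition, so the classical DN scheme is a special case; optimizing the weights is what yields the improved stability described above.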

  5. Automated Detection of Clinically Significant Prostate Cancer in mp-MRI Images Based on an End-to-End Deep Neural Network.

    PubMed

    Wang, Zhiwei; Liu, Chaoyue; Cheng, Danpeng; Wang, Liang; Yang, Xin; Cheng, Kwang-Ting

    2018-05-01

    Automated methods for detecting clinically significant (CS) prostate cancer (PCa) in multi-parametric magnetic resonance images (mp-MRI) are in high demand. Existing methods typically employ several separate steps, each of which is optimized individually without considering the error tolerance of the other steps. As a result, they may either incur unnecessary computational cost or suffer from errors accumulated over steps. In this paper, we present an automated CS PCa detection system in which all steps are optimized jointly in an end-to-end trainable deep neural network. The proposed neural network consists of concatenated subnets: 1) a novel tissue deformation network (TDN) for automated prostate detection and multimodal registration and 2) a dual-path convolutional neural network (CNN) for CS PCa detection. Three types of loss functions, i.e., classification loss, inconsistency loss, and overlap loss, are employed for optimizing all parameters of the proposed TDN and CNN. In the training phase, the two nets mutually affect each other and effectively guide registration and extraction of representative CS PCa-relevant features to achieve results with sufficient accuracy. The entire network is trained in a weakly supervised manner by providing only image-level annotations (i.e., presence/absence of PCa) without exact priors of lesions' locations. Compared with most existing systems, which require supervised labels such as manual delineation of PCa lesions, it is much more convenient for clinical use. Comprehensive evaluation based on five-fold cross-validation on data from 360 patients demonstrates that our system achieves a high accuracy for CS PCa detection, i.e., a sensitivity of 0.6374 and 0.8978 at 0.1 and 1 false positives per normal/benign patient.
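
    The end-to-end aspect amounts to back-propagating a single combined objective through both subnets. The sketch below shows that structure only, with hypothetical stand-in networks and placeholder inconsistency/overlap terms (PyTorch assumed available); it is not the paper's actual architecture or loss definitions.

        import torch
        import torch.nn as nn

        # Hypothetical stand-ins for the tissue deformation network (TDN) and the
        # dual-path CNN; the three input channels mimic three mp-MRI sequences.
        tdn = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
                            nn.Conv2d(8, 3, 3, padding=1))
        cnn = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
                            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 1))

        # A single optimizer over all parameters: both subnets are trained jointly.
        opt = torch.optim.Adam(list(tdn.parameters()) + list(cnn.parameters()), lr=1e-4)
        bce = nn.BCEWithLogitsLoss()

        def training_step(images, labels, w_incons=0.5, w_overlap=0.5):
            aligned = tdn(images)                       # registration / deformation stage
            logits = cnn(aligned).squeeze(1)            # image-level CS PCa score
            l_cls = bce(logits, labels)                 # weak, image-level supervision
            l_incons = (aligned - images).abs().mean()  # placeholder inconsistency term
            l_overlap = torch.zeros(())                 # placeholder overlap term
            loss = l_cls + w_incons * l_incons + w_overlap * l_overlap
            opt.zero_grad(); loss.backward(); opt.step()
            return float(loss)

        print(training_step(torch.randn(4, 3, 64, 64), torch.randint(0, 2, (4,)).float()))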

  6. Multi-Criteria Adaptation in a Personalized Multimedia Testing Tool Based on Semantic Technologies

    ERIC Educational Resources Information Center

    Lazarinis, Fotis; Green, Steve; Pearson, Elaine

    2011-01-01

    In this article, we present the characteristics and the design of a modular personalized multimedia testing tool based fully on XML learning specifications. Personalization is based on the characteristics of the individual learners, thus the testing paths are tailored to their needs and goals. The system maintains learner profiles rich in content…

  7. FLUKA: A Multi-Particle Transport Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrari, A.; Sala, P.R.; /CERN /INFN, Milan

    2005-12-14

    This report describes the 2005 version of the Fluka particle transport code. The first part introduces the basic notions, describes the modular structure of the system, and contains an installation and beginner's guide. The second part complements this initial information with details about the various components of Fluka and how to use them. It concludes with a detailed history and bibliography.

  8. Multi-objective optimization of a continuous bio-dissimilation process of glycerol to 1,3-propanediol.

    PubMed

    Xu, Gongxian; Liu, Ying; Gao, Qunwang

    2016-02-10

    This paper deals with the multi-objective optimization of a continuous bio-dissimilation process of glycerol to 1,3-propanediol. In order to maximize the production rate of 1,3-propanediol, maximize the conversion rate of glycerol to 1,3-propanediol, maximize the conversion rate of glycerol, and minimize the concentration of the by-product ethanol, we first propose six new multi-objective optimization models, each of which simultaneously optimizes two of the four objectives above. These multi-objective optimization problems are then solved using the weighted-sum and normal-boundary intersection methods. Both the Pareto filter algorithm and removal criteria are used to remove the non-Pareto-optimal points obtained by the normal-boundary intersection method. The results show that the normal-boundary intersection method can successfully obtain approximate Pareto-optimal sets for all the proposed multi-objective optimization problems, while the weighted-sum approach cannot achieve the overall Pareto-optimal solutions of some multi-objective problems. Copyright © 2015 Elsevier B.V. All rights reserved.
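
    For reference, the weighted-sum approach scalarizes two objectives into one and sweeps the weights. The minimal sketch below uses a toy bi-objective problem (not the paper's bioprocess model) and illustrates the procedure; its known limitation, noted above, is that it only recovers points on convex portions of the Pareto front.

        import numpy as np
        from scipy.optimize import minimize

        # Toy bi-objective problem (illustrative only): minimize f1 and f2 over x in [0, 2].
        f1 = lambda x: x[0] ** 2
        f2 = lambda x: (x[0] - 2.0) ** 2

        front = []
        for w in np.linspace(0.0, 1.0, 11):
            # Weighted-sum scalarization: one single-objective solve per weight vector.
            scalar = lambda x, w=w: w * f1(x) + (1.0 - w) * f2(x)
            res = minimize(scalar, x0=[1.0], bounds=[(0.0, 2.0)])
            front.append((f1(res.x), f2(res.x)))

        print(front)  # approximate Pareto front of the toy problem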

  9. Mining the modular structure of protein interaction networks.

    PubMed

    Berenstein, Ariel José; Piñero, Janet; Furlong, Laura Inés; Chernomoretz, Ariel

    2015-01-01

    Cluster-based descriptions of biological networks have received much attention in recent years, fostered by accumulated evidence of meaningful correlations between topological network clusters and biological functional modules. Several well-performing clustering algorithms exist to infer topological network partitions. However, owing to their respective technical idiosyncrasies, they might produce dissimilar modular decompositions of a given network. In this contribution, we aimed to analyze how alternative modular descriptions could condition the outcome of follow-up network biology analysis. We considered a human protein interaction network and two paradigmatic cluster recognition algorithms, namely the Clauset-Newman-Moore and infomap procedures. We analyzed to what extent the two methodologies yielded different results in terms of granularity and biological congruency. In addition, taking into account Guimera's cartographic role characterization of network nodes, we explored how the adoption of a given clustering methodology impinged on the ability to highlight relevant network meso-scale connectivity patterns. As a case study we considered a set of aging-related proteins and showed that only the high-resolution modular description provided by infomap could unveil statistically significant associations between them and inter/intra-modular cartographic features. Besides reporting novel biological insights gained from the discovered associations, our contribution warns of technical concerns that might affect the tools used to mine for interaction patterns in network biology studies. In particular, our results suggest that partitions that are sub-optimal from the strict point of view of their modularity levels might still be worth analyzing when meso-scale features are explored in connection with external sources of biological knowledge.
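
    To make the algorithmic comparison concrete, the sketch below runs a Clauset-Newman-Moore-style greedy modularity clustering on a small benchmark graph (networkx assumed available; the human interactome itself is not reproduced here) and reports the modularity of the resulting partition. Infomap, being flow-based, typically yields a finer-grained partition but requires the separate infomap package, so it is only noted in a comment.

        import networkx as nx
        from networkx.algorithms import community

        # Small benchmark graph standing in for the protein interaction network.
        G = nx.karate_club_graph()

        # Greedy modularity maximization (Clauset-Newman-Moore agglomeration).
        cnm_partition = community.greedy_modularity_communities(G)
        print("number of CNM communities:", len(cnm_partition))
        print("modularity of CNM partition:", community.modularity(G, cnm_partition))

        # For the infomap decomposition, see the separate 'infomap' Python package; its
        # finer granularity is what drives the differences discussed above.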

  10. Multi-objective Optimization Design of Gear Reducer Based on Adaptive Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Li, Rui; Chang, Tian; Wang, Jianwei; Wei, Xiaopeng; Wang, Jinming

    2008-11-01

    An adaptive Genetic Algorithm (GA) is introduced to solve the multi-objective optimal design of a gear reducer. First, a multi-objective optimization model of the helical gear reducer is established according to its structure, strength and other requirements. An adaptive GA based on a fuzzy controller is then introduced to address the multi-objective, multi-parameter, multi-constraint nature of the problem. Finally, a numerical example illustrates the advantages of this approach and the effectiveness of the adaptive genetic algorithm in the optimal design of a reducer.
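
    The paper adapts the GA's parameters with a fuzzy controller; the sketch below substitutes a simpler, commonly used diversity-based rule purely to illustrate the idea of adapting crossover and mutation rates to the state of the population (all names and constants are illustrative).

        def adaptive_rates(f_best, f_avg, f_parent, pc_max=0.9, pc_min=0.5,
                           pm_max=0.10, pm_min=0.01):
            """Return (crossover rate, mutation rate) for a maximization GA:
            above-average individuals get gentler operators, below-average ones
            keep the maximum rates to promote exploration."""
            spread = max(f_best - f_avg, 1e-12)
            if f_parent >= f_avg:
                pc = pc_max - (pc_max - pc_min) * (f_parent - f_avg) / spread
                pm = pm_min + (pm_max - pm_min) * (f_best - f_parent) / spread
            else:
                pc, pm = pc_max, pm_max
            return pc, pm

        # Example: a parent slightly above the population's average fitness.
        print(adaptive_rates(f_best=10.0, f_avg=6.0, f_parent=8.0))  # (0.7, 0.055)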

  11. Multi-electrolyte-step anodic aluminum oxide method for the fabrication of self-organized nanochannel arrays

    PubMed Central

    2012-01-01

    Nanochannel arrays were fabricated by the self-organized multi-electrolyte-step anodic aluminum oxide [AAO] method in this study. In the multi-electrolyte-step AAO method, anodization started with a phosphoric acid solution as the electrolyte at a high applied voltage; the electrolyte was then changed to an oxalic acid solution at a lower applied voltage. This method produced self-organized nanochannel arrays with good regularity and circularity, with less power loss and shorter processing time than the multi-step AAO method. PMID:22333268

  12. Optimization of sample preparation by central composite design for multi-class determination of veterinary drugs in bovine muscle, kidney and liver by ultra-high-performance liquid chromatographic-tandem mass spectrometry.

    PubMed

    Rizzetti, Tiele M; de Souza, Maiara P; Prestes, Osmar D; Adaime, Martha B; Zanella, Renato

    2018-04-25

    In this study, a simple and fast multi-class method for the determination of veterinary drugs in bovine liver, kidney and muscle was developed. The method employed acetonitrile for extraction followed by clean-up with EMR-Lipid® sorbent and trichloroacetic acid (TCA). Tests indicated that TCA was most effective when added in the final step of the clean-up procedure rather than during extraction. Different sorbents were tested and optimized using a central composite design, and the analytes were determined by ultra-high-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS). The method was validated according to European Commission Decision 2002/657, with satisfactory results for 69 veterinary drugs in bovine liver and 68 compounds in bovine muscle and kidney. The method was applied to real samples and in proficiency tests and proved adequate for routine analysis. Residues of abamectin, doramectin, eprinomectin and ivermectin were found in samples of bovine muscle, and only ivermectin in bovine liver. Copyright © 2017 Elsevier Ltd. All rights reserved.
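
    For readers unfamiliar with the design, a central composite design combines a two-level factorial core, axial (star) points and replicated center points. The minimal sketch below builds such a matrix in coded units for two factors (illustrative levels only, not the paper's actual factors or responses).

        import numpy as np
        from itertools import product

        def central_composite(n_factors=2, alpha=None, n_center=3):
            """CCD matrix in coded units: factorial core + axial points + center points."""
            if alpha is None:
                alpha = (2 ** n_factors) ** 0.25   # rotatable design
            factorial = np.array(list(product([-1.0, 1.0], repeat=n_factors)))
            axial = np.zeros((2 * n_factors, n_factors))
            for i in range(n_factors):
                axial[2 * i, i] = -alpha
                axial[2 * i + 1, i] = alpha
            center = np.zeros((n_center, n_factors))
            return np.vstack([factorial, axial, center])

        print(central_composite(n_factors=2))   # 4 factorial + 4 axial + 3 center runs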

  13. Multi-length scale tomography for the determination and optimization of the effective microstructural properties in novel hierarchical solid oxide fuel cell anodes

    NASA Astrophysics Data System (ADS)

    Lu, Xuekun; Taiwo, Oluwadamilola O.; Bertei, Antonio; Li, Tao; Li, Kang; Brett, Dan J. L.; Shearing, Paul R.

    2017-11-01

    Effective microstructural properties are critical in determining the electrochemical performance of solid oxide fuel cells (SOFCs), particularly when operating at high current densities. A novel tubular SOFC anode with a hierarchical microstructure, composed of self-organized micro-channels and sponge-like regions, has been fabricated by a phase inversion technique to mitigate concentration losses. However, since pore sizes span over two orders of magnitude, the determination of the effective transport parameters using image-based techniques remains challenging. Pioneering steps are made in this study to characterize and optimize the microstructure by coupling multi-length scale 3D tomography and modeling. The results conclusively show that embedding finger-like micro-channels into the tubular anode can improve the mass transport by 250% and the permeability by 2-3 orders of magnitude. Our parametric study shows that increasing the porosity in the spongy layer beyond 10% enhances the effective transport parameters of the spongy layer at an exponential rate, but linearly for the full anode. For the first time, local and global mass transport properties are correlated to the microstructure, which is of wide interest for rationalizing the design optimization of SOFC electrodes and more generally for hierarchical materials in batteries and membranes.

  14. Optimal Frequency-Domain System Realization with Weighting

    NASA Technical Reports Server (NTRS)

    Juang, Jer-Nan; Maghami, Peiman G.

    1999-01-01

    Several approaches are presented to identify an experimental system model directly from frequency response data. The formulation uses a matrix-fraction description as the model structure. Frequency weighting such as exponential weighting is introduced to solve a weighted least-squares problem to obtain the coefficient matrices for the matrix-fraction description. A multi-variable state-space model can then be formed using the coefficient matrices of the matrix-fraction description. Three different approaches are introduced to fine-tune the model using nonlinear programming methods to minimize the desired cost function. The first method uses an eigenvalue assignment technique to reassign a subset of system poles to improve the identified model. The second method deals with the model in the real Schur or modal form, reassigns a subset of system poles, and adjusts the columns (rows) of the input (output) influence matrix using a nonlinear optimizer. The third method also optimizes a subset of poles, but the input and output influence matrices are refined at every optimization step through least-squares procedures.
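
    As a simplified scalar illustration of the weighted least-squares step (the paper works with matrix-fraction descriptions of multi-variable systems, so this only conveys the general idea on hypothetical data), one can linearize H(jω) ≈ B(jω)/A(jω) and solve for the polynomial coefficients with an exponential frequency weighting:

        import numpy as np

        # Hypothetical frequency response data H(jw) of a second-order system.
        w = np.linspace(0.1, 10.0, 200)
        s = 1j * w
        H = 1.0 / (s ** 2 + 0.4 * s + 4.0)

        # Model H ~ b0 / (s^2 + a1 s + a0)  =>  H s^2 = b0 - a1 s H - a0 H (linear in unknowns).
        A = np.column_stack([np.ones_like(s), -s * H, -H])
        y = H * s ** 2

        # Exponential frequency weighting emphasizing lower frequencies (illustrative choice).
        weight = np.exp(-0.1 * w)
        Aw = A * weight[:, None]
        yw = y * weight

        # Stack real and imaginary parts so the weighted least-squares problem is real.
        theta, *_ = np.linalg.lstsq(np.vstack([Aw.real, Aw.imag]),
                                    np.concatenate([yw.real, yw.imag]), rcond=None)
        b0, a1, a0 = theta
        print(b0, a1, a0)   # should recover approximately 1.0, 0.4, 4.0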

  15. Mechanical and Metallurgical Evolution of Stainless Steel 321 in a Multi-step Forming Process

    NASA Astrophysics Data System (ADS)

    Anderson, M.; Bridier, F.; Gholipour, J.; Jahazi, M.; Wanjara, P.; Bocher, P.; Savoie, J.

    2016-04-01

    This paper examines the metallurgical evolution of AISI Stainless Steel 321 (SS 321) during multi-step forming, a process that involves cycles of deformation with intermediate heat treatment steps. The multi-step forming process was simulated by means of interrupted uniaxial tensile tests. Evolution of the mechanical properties as well as the microstructural features, such as twins and textures of the austenite and martensite phases, was studied as a function of the multi-step forming process. The characteristics of the Strain-Induced Martensite (SIM) were also documented for each deformation step and intermediate stress relief heat treatment. The results indicated that the intermediate heat treatments considerably increased the formability of SS 321. Texture analysis showed that the effect of the intermediate heat treatment on the austenite was minor and led to partial recrystallization, while deformation was observed to reinforce the crystallographic texture of austenite. For the SIM, an Olson-Cohen-type equation was identified to analytically predict its formation during the multi-step forming process. The generated SIM was textured and weakened with increasing deformation.
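
    For context, the Olson-Cohen model referred to above expresses the strain-induced martensite volume fraction as a sigmoidal function of true strain (generic form shown; the parameter values fitted for SS 321 are given in the paper, not here):

        \[
        f_{\alpha'}(\varepsilon) \;=\; 1 - \exp\!\left\{ -\beta \left[ 1 - \exp(-\alpha\,\varepsilon) \right]^{n} \right\},
        \]

    where α governs the rate of shear-band formation with strain, β is related to the probability that shear-band intersections nucleate martensite, and n is an exponent (fixed near 4.5 in the original model).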

  16. Modular Closed-Loop Control of Diabetes

    PubMed Central

    Magni, L.; Dassau, E.; Hughes-Karvetski, C.; Toffanin, C.; De Nicolao, G.; Del Favero, S.; Breton, M.; Man, C. Dalla; Renard, E.; Zisser, H.; Doyle, F. J.; Cobelli, C.; Kovatchev, B. P.

    2015-01-01

    Modularity plays a key role in many engineering systems, allowing for plug-and-play integration of components, enhancing flexibility and adaptability, and facilitating standardization. In the control of diabetes, i.e., the so-called “artificial pancreas,” modularity allows for the step-wise introduction of (and regulatory approval for) algorithmic components, starting with subsystems for assured patient safety and followed by higher layer components that serve to modify the patient’s basal rate in real time. In this paper, we introduce a three-layer modular architecture for the control of diabetes, consisting of a sensor/pump interface module (IM), a continuous safety module (CSM), and a real-time control module (RTCM), which separates the functions of insulin recommendation (postmeal insulin for mitigating hyperglycemia) and safety (prevention of hypoglycemia). In addition, we provide details of instances of all three layers of the architecture: the APS© serving as the IM, the safety supervision module (SSM) serving as the CSM, and the range correction module (RCM) serving as the RTCM. We evaluate the performance of the integrated system via in silico preclinical trials, demonstrating 1) the ability of the SSM to reduce the incidence of hypoglycemia under nonideal operating conditions and 2) the ability of the RCM to reduce glycemic variability. PMID:22481809
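
    The layering described above lends itself to a plug-and-play composition in which each module only sees the output of the layer below. A schematic sketch follows (hypothetical class and method names, simplistic placeholder logic; this is not the APS/SSM/RCM implementation).

        class InterfaceModule:
            """Sensor/pump interface layer: reads glucose, sends insulin commands."""
            def read_glucose(self):
                return 140.0                      # placeholder CGM reading (mg/dL)
            def deliver(self, insulin_rate):
                print(f"commanding pump: {insulin_rate:.2f} U/h")

        class ContinuousSafetyModule:
            """Safety layer: attenuates or blocks insulin when hypoglycemia threatens."""
            def supervise(self, glucose, proposed_rate):
                return 0.0 if glucose < 80.0 else proposed_rate

        class RealTimeControlModule:
            """Control layer: proposes a basal-rate adjustment from current glucose."""
            def __init__(self, basal_rate=1.0, target=120.0, gain=0.01):
                self.basal_rate, self.target, self.gain = basal_rate, target, gain
            def propose(self, glucose):
                return max(0.0, self.basal_rate + self.gain * (glucose - self.target))

        # One closed-loop cycle: IM -> RTCM proposal -> CSM supervision -> IM delivery.
        im, csm, rtcm = InterfaceModule(), ContinuousSafetyModule(), RealTimeControlModule()
        g = im.read_glucose()
        im.deliver(csm.supervise(g, rtcm.propose(g)))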

  17. One-pot DNA construction for synthetic biology: the Modular Overlap-Directed Assembly with Linkers (MODAL) strategy

    PubMed Central

    Casini, Arturo; MacDonald, James T.; Jonghe, Joachim De; Christodoulou, Georgia; Freemont, Paul S.; Baldwin, Geoff S.; Ellis, Tom

    2014-01-01

    Overlap-directed DNA assembly methods allow multiple DNA parts to be assembled together in one reaction. These methods, which rely on sequence homology between the ends of DNA parts, have become widely adopted in synthetic biology, despite being incompatible with a key principle of engineering: modularity. To address this, we present MODAL: a Modular Overlap-Directed Assembly with Linkers strategy that brings modularity to overlap-directed methods, allowing assembly of an initial set of DNA parts into a variety of arrangements in one-pot reactions. MODAL is accompanied by a custom software tool that designs overlap linkers to guide assembly, allowing parts to be assembled in any specified order and orientation. The in silico design of synthetic orthogonal overlapping junctions allows for much greater efficiency in DNA assembly across a variety of methods compared with using non-designed sequences. In tests with three different assembly technologies, the MODAL strategy gives assembly of both yeast and bacterial plasmids, composed of up to five DNA parts in the kilobase range, with efficiencies of between 75 and 100%. It also seamlessly allows mutagenesis to be performed on any specified DNA parts during the process, allowing the one-step creation of construct libraries valuable for synthetic biology applications. PMID:24153110

  18. An integrated cell culture lab on a chip: modular microdevices for cultivation of mammalian cells and delivery into microfluidic microdroplets.

    PubMed

    Hufnagel, Hansjörg; Huebner, Ansgar; Gülch, Carina; Güse, Katharina; Abell, Chris; Hollfelder, Florian

    2009-06-07

    We present a modular system of microfluidic PDMS devices designed to incorporate the steps necessary for cell biological assays based on mammalian tissue culture 'on-chip'. The methods described herein include the on-chip immobilization and culturing of cells as well as their manipulation by transfection. Assessment of cell viability by flow cytometry suggests low attrition rates (<3%) and excellent growth properties in the device for up to 7 days for CHO-K1 cells. To demonstrate that key procedures from the repertoire of cell biology are possible in this format, transfection of a reporter gene (encoding green fluorescent protein) was carried out. The modular design enables efficient detachment and recollection of cells and allows assessment of the success of transfection achieved on-chip. The transfection levels (20%) are comparable to standard large scale procedures and more than 500 cells could be transfected. Finally, cells are transferred into microfluidic microdroplets, where in principle a wide range of subsequent assays can be carried out at the single cell level in droplet compartments. The procedures developed for this modular device layout further demonstrate that commonly used methods in cell biology involving mammalian cells can be reliably scaled down to allow single cell investigations in picolitre volumes.

  19. Modular Expression Language for Ordinary Differential Equation Editing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blake, Robert C.

    MELODEE is a system for describing systems of initial-value-problem ordinary differential equations, and a compiler for the language that produces optimized code to integrate the differential equations. Features include rational polynomial approximation for expensive functions and automatic differentiation for symbolic Jacobians.
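
    MELODEE's own syntax is not reproduced here, but the core idea of deriving a symbolic Jacobian from an ODE right-hand side, which implicit integrators require, can be illustrated with SymPy:

        import sympy as sp

        # A small ODE system dy/dt = f(y): a damped oscillator written symbolically.
        y1, y2 = sp.symbols("y1 y2")
        f = sp.Matrix([y2, -4.0 * y1 - 0.4 * y2])

        # Symbolic Jacobian df/dy, then compiled into a fast numerical callable.
        J = f.jacobian(sp.Matrix([y1, y2]))
        J_func = sp.lambdify((y1, y2), J, "numpy")
        print(J)                 # Matrix([[0, 1], [-4.0, -0.4]])
        print(J_func(1.0, 0.0))  # the same matrix, evaluated numerically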

  20. Characterizing Variability of Modular Brain Connectivity with Constrained Principal Component Analysis

    PubMed Central

    Hirayama, Jun-ichiro; Hyvärinen, Aapo; Kiviniemi, Vesa; Kawanabe, Motoaki; Yamashita, Okito

    2016-01-01

    Characterizing the variability of resting-state functional brain connectivity across subjects and/or over time has recently attracted much attention. Principal component analysis (PCA) serves as a fundamental statistical technique for such analyses. However, performing PCA on high-dimensional connectivity matrices yields complicated “eigenconnectivity” patterns, for which systematic interpretation is a challenging issue. Here, we overcome this issue with a novel constrained PCA method for connectivity matrices by extending the idea of the previously proposed orthogonal connectivity factorization method. Our new method, modular connectivity factorization (MCF), explicitly introduces the modularity of brain networks as a parametric constraint on eigenconnectivity matrices. In particular, MCF analyzes the variability in both intra- and inter-module connectivities, simultaneously finding network modules in a principled, data-driven manner. The parametric constraint provides a compact module-based visualization scheme with which the result can be intuitively interpreted. We develop an optimization algorithm to solve the constrained PCA problem and validate our method in simulation studies and with a resting-state functional connectivity MRI dataset of 986 subjects. The results show that the proposed MCF method successfully reveals the underlying modular eigenconnectivity patterns in more general situations and is a promising alternative to existing methods. PMID:28002474
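
    As a baseline for comparison, conventional (unconstrained) PCA on vectorized connectivity matrices, which MCF extends with its modularity constraint, can be sketched as follows (synthetic data; scikit-learn assumed available):

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        n_subjects, n_regions = 50, 20

        # Synthetic per-subject connectivity matrices, vectorized over the upper triangle.
        iu = np.triu_indices(n_regions, k=1)
        conn = rng.standard_normal((n_subjects, n_regions, n_regions))
        conn = (conn + conn.transpose(0, 2, 1)) / 2          # symmetrize
        X = conn[:, iu[0], iu[1]]                            # (subjects, region pairs)

        # Unconstrained PCA yields "eigenconnectivity" patterns; MCF additionally
        # parameterizes these patterns by network modules, which aids interpretation.
        pca = PCA(n_components=5).fit(X)
        print(pca.explained_variance_ratio_)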
