Sample records for performing complex minimally

  1. AMMOS2: a web server for protein-ligand-water complexes refinement via molecular mechanics.

    PubMed

    Labbé, Céline M; Pencheva, Tania; Jereva, Dessislava; Desvillechabrol, Dimitri; Becot, Jérôme; Villoutreix, Bruno O; Pajeva, Ilza; Miteva, Maria A

    2017-07-03

    AMMOS2 is an interactive web server for efficient computational refinement of protein-small organic molecule complexes. The AMMOS2 protocol employs atomic-level energy minimization of a large number of experimental or modeled protein-ligand complexes. The web server is based on the previously developed standalone software AMMOS (Automatic Molecular Mechanics Optimization for in silico Screening). AMMOS utilizes the physics-based force field AMMP sp4 and performs optimization of protein-ligand interactions at five levels of flexibility of the protein receptor. The new version 2 of AMMOS implemented in the AMMOS2 web server allows the users to include explicit water molecules and individual metal ions in the protein-ligand complexes during minimization. The web server provides comprehensive analysis of computed energies and interactive visualization of refined protein-ligand complexes. The ligands are ranked by the minimized binding energies allowing the users to perform additional analysis for drug discovery or chemical biology projects. The web server has been extensively tested on 21 diverse protein-ligand complexes. AMMOS2 minimization shows consistent improvement over the initial complex structures in terms of minimized protein-ligand binding energies and water positions optimization. The AMMOS2 web server is freely available without any registration requirement at the URL: http://drugmod.rpbs.univ-paris-diderot.fr/ammosHome.php. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  2. AMMOS2: a web server for protein–ligand–water complexes refinement via molecular mechanics

    PubMed Central

    Labbé, Céline M.; Pencheva, Tania; Jereva, Dessislava; Desvillechabrol, Dimitri; Becot, Jérôme; Villoutreix, Bruno O.; Pajeva, Ilza

    2017-01-01

    AMMOS2 is an interactive web server for efficient computational refinement of protein–small organic molecule complexes. The AMMOS2 protocol employs atomic-level energy minimization of a large number of experimental or modeled protein–ligand complexes. The web server is based on the previously developed standalone software AMMOS (Automatic Molecular Mechanics Optimization for in silico Screening). AMMOS utilizes the physics-based force field AMMP sp4 and performs optimization of protein–ligand interactions at five levels of flexibility of the protein receptor. The new version 2 of AMMOS implemented in the AMMOS2 web server allows the users to include explicit water molecules and individual metal ions in the protein–ligand complexes during minimization. The web server provides comprehensive analysis of computed energies and interactive visualization of refined protein–ligand complexes. The ligands are ranked by the minimized binding energies allowing the users to perform additional analysis for drug discovery or chemical biology projects. The web server has been extensively tested on 21 diverse protein–ligand complexes. AMMOS2 minimization shows consistent improvement over the initial complex structures in terms of minimized protein–ligand binding energies and water positions optimization. The AMMOS2 web server is freely available without any registration requirement at the URL: http://drugmod.rpbs.univ-paris-diderot.fr/ammosHome.php. PMID:28486703

  3. Robust Fixed-Structure Controller Synthesis

    NASA Technical Reports Server (NTRS)

    Corrado, Joseph R.; Haddad, Wassim M.; Gupta, Kajal (Technical Monitor)

    2000-01-01

    The ability to develop an integrated control system design methodology for robust high-performance controllers satisfying multiple design criteria and real-world hardware constraints constitutes a challenging task. The increasingly stringent performance specifications required for controlling such systems necessitate a trade-off between controller complexity and robustness. The principal challenge of minimal complexity robust control design is to arrive at a tractable control design formulation in spite of the extreme complexity of such systems. Hence, the design of minimal complexity robust controllers for systems in the face of modeling errors has been a major preoccupation of system and control theorists and practitioners for the past several decades.

  4. Analysis of complex network performance and heuristic node removal strategies

    NASA Astrophysics Data System (ADS)

    Jahanpour, Ehsan; Chen, Xin

    2013-12-01

    Removing important nodes from complex networks is a great challenge in fighting against criminal organizations and preventing disease outbreaks. Six network performance metrics, including four new metrics, are applied to quantify networks' diffusion speed, diffusion scale, homogeneity, and diameter. In order to efficiently identify nodes whose removal maximally destroys a network, i.e., minimizes network performance, ten structured heuristic node removal strategies are designed using different node centrality metrics including degree, betweenness, reciprocal closeness, complement-derived closeness, and eigenvector centrality. These strategies are applied to remove nodes from the September 11, 2001 hijackers' network, and their performance is compared to that of a random strategy, which removes randomly selected nodes, and the locally optimal solution (LOS), which removes nodes to minimize network performance at each step. The computational complexity of the 11 strategies and LOS is also analyzed. Results show that the node removal strategies using degree and betweenness centralities are more efficient than other strategies.
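
    A minimal sketch of this kind of centrality-guided node removal, assuming the networkx package; a scale-free random graph stands in for the hijackers' network, and a single proxy (largest-component fraction) stands in for the paper's six performance metrics.

    ```python
    # Sketch of centrality-guided node removal (assumes the networkx package).
    # A single proxy metric (largest-component fraction) stands in for the
    # paper's six performance metrics.
    import networkx as nx

    def remove_by_centrality(G, centrality_fn, k):
        """Greedily remove the current highest-centrality node, k times."""
        G = G.copy()
        for _ in range(k):
            scores = centrality_fn(G)
            G.remove_node(max(scores, key=scores.get))
        return G

    def performance(G):
        """Proxy for network performance: fraction of nodes in the largest component."""
        largest = max(nx.connected_components(G), key=len)
        return len(largest) / G.number_of_nodes()

    G = nx.barabasi_albert_graph(100, 2, seed=1)   # stand-in for the real network
    strategies = {
        "degree": nx.degree_centrality,
        "betweenness": nx.betweenness_centrality,
        "closeness": nx.closeness_centrality,      # stand-in for the closeness variants
        "eigenvector": lambda g: nx.eigenvector_centrality(g, max_iter=1000),
    }
    for name, fn in strategies.items():
        print(f"{name:12s} -> residual performance "
              f"{performance(remove_by_centrality(G, fn, k=10)):.2f}")
    ```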

  5. Robotic pancreaticoduodenectomy in a case of duodenal gastrointestinal stromal tumor.

    PubMed

    Parisi, Amilcare; Desiderio, Jacopo; Trastulli, Stefano; Grassi, Veronica; Ricci, Francesco; Farinacci, Federico; Cacurri, Alban; Castellani, Elisa; Corsi, Alessia; Renzi, Claudio; Barberini, Francesco; D'Andrea, Vito; Santoro, Alberto; Cirocchi, Roberto

    2014-12-04

    Laparoscopic pancreaticoduodenectomy is rarely performed, and it has not been particularly successful due to its technical complexity. The objective of this study is to highlight how robotic surgery could improve a minimally invasive approach and to expose the usefulness of robotic surgery even in complex surgical procedures. The surgical technique employed in our center to perform a pancreaticoduodenectomy, by means of the da Vinci™ robotic system in order to remove a duodenal gastrointestinal stromal tumor, is reported. Robotic technology offers significant improvements over the traditional laparoscopic approach, representing an evolution of minimally invasive techniques and allowing the safe performance of procedures that are still considered scarcely feasible or reproducible.

  6. Boosted ARTMAP: modifications to fuzzy ARTMAP motivated by boosting theory.

    PubMed

    Verzi, Stephen J; Heileman, Gregory L; Georgiopoulos, Michael

    2006-05-01

    In this paper, several modifications to the Fuzzy ARTMAP neural network architecture are proposed for conducting classification in complex, possibly noisy, environments. The goal of these modifications is to improve upon the generalization performance of Fuzzy ART-based neural networks, such as Fuzzy ARTMAP, in these situations. One of the major difficulties of employing Fuzzy ARTMAP on such learning problems involves over-fitting of the training data. Structural risk minimization is a machine-learning framework that addresses the issue of over-fitting by providing a backbone for analysis as well as an impetus for the design of better learning algorithms. The theory of structural risk minimization reveals a trade-off between training error and classifier complexity in reducing generalization error, which will be exploited in the learning algorithms proposed in this paper. Boosted ART extends Fuzzy ART by allowing the spatial extent of each cluster formed to be adjusted independently. Boosted ARTMAP generalizes upon Fuzzy ARTMAP by allowing non-zero training error in an effort to reduce the hypothesis complexity and hence improve overall generalization performance. Although Boosted ARTMAP is strictly speaking not a boosting algorithm, the changes it encompasses were motivated by the goals that one strives to achieve when employing boosting. Boosted ARTMAP is an on-line learner, it does not require excessive parameter tuning to operate, and it reduces precisely to Fuzzy ARTMAP for particular parameter values. Another architecture described in this paper is Structural Boosted ARTMAP, which uses both Boosted ART and Boosted ARTMAP to perform structural risk minimization learning. Structural Boosted ARTMAP will allow comparison of the capabilities of off-line versus on-line learning as well as empirical risk minimization versus structural risk minimization using Fuzzy ARTMAP-based neural network architectures. Both empirical and theoretical results are presented to enhance the understanding of these architectures.
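
    Because Boosted ART modifies the cluster geometry of Fuzzy ART, a minimal sketch of the standard Fuzzy ART building block may help orient readers. This is the textbook category choice, vigilance test, and fast-learning update (our simplification, not the authors' Boosted ART/ARTMAP code), with choice parameter alpha, vigilance rho, and learning rate beta.

    ```python
    # Minimal Fuzzy ART sketch (standard formulation, not the paper's code).
    import numpy as np

    def fuzzy_art_step(I, weights, rho=0.75, alpha=0.001, beta=1.0):
        """One presentation: pick winning category, test vigilance, update or create."""
        I = np.concatenate([I, 1.0 - I])          # complement coding keeps |I| constant
        scores = [np.minimum(I, w).sum() / (alpha + w.sum()) for w in weights]
        for j in np.argsort(scores)[::-1]:        # categories in order of choice value
            match = np.minimum(I, weights[j]).sum() / I.sum()
            if match >= rho:                      # vigilance test passed
                weights[j] = beta * np.minimum(I, weights[j]) + (1 - beta) * weights[j]
                return j, weights
        weights.append(I.copy())                  # no category matches: create one
        return len(weights) - 1, weights

    weights = [np.ones(4)]                        # one uncommitted category (2-D inputs)
    for x in np.random.default_rng(0).random((20, 2)):
        j, weights = fuzzy_art_step(x, weights)
    print(len(weights), "categories formed")
    ```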

  7. The trellis complexity of convolutional codes

    NASA Technical Reports Server (NTRS)

    Mceliece, R. J.; Lin, W.

    1995-01-01

    It has long been known that convolutional codes have a natural, regular trellis structure that facilitates the implementation of Viterbi's algorithm. It has gradually become apparent that linear block codes also have a natural, though not in general a regular, 'minimal' trellis structure, which allows them to be decoded with a Viterbi-like algorithm. In both cases, the complexity of the Viterbi decoding algorithm can be accurately estimated by the number of trellis edges per encoded bit. It would, therefore, appear that we are in a good position to make a fair comparison of the Viterbi decoding complexity of block and convolutional codes. Unfortunately, however, this comparison is somewhat muddled by the fact that some convolutional codes, the punctured convolutional codes, are known to have trellis representations that are significantly less complex than the conventional trellis. In other words, the conventional trellis representation for a convolutional code may not be the minimal trellis representation. Thus, ironically, at present we seem to know more about the minimal trellis representation for block than for convolutional codes. In this article, we provide a remedy, by developing a theory of minimal trellises for convolutional codes. (A similar theory has recently been given by Sidorenko and Zyablov). This allows us to make a direct performance-complexity comparison for block and convolutional codes. A by-product of our work is an algorithm for choosing, from among all generator matrices for a given convolutional code, what we call a trellis-minimal generator matrix, from which the minimal trellis for the code can be directly constructed. Another by-product is that, in the new theory, punctured convolutional codes no longer appear as a special class, but simply as high-rate convolutional codes whose trellis complexity is unexpectedly small.
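
    Edge counts per encoded bit are concrete on a small trellis. The sketch below (a generic illustration, not the article's trellis-minimal construction) runs hard-decision Viterbi decoding on the 4-state trellis of the familiar rate-1/2, memory-2 convolutional code with octal generators (7,5).

    ```python
    # Hard-decision Viterbi decoding on the 4-state trellis of the rate-1/2,
    # memory-2 convolutional code with generators (7,5) in octal.
    G = (0b111, 0b101)          # generator taps acting on [u, u_{t-1}, u_{t-2}]

    def step(state, u):
        """Return (next_state, output_bits) for input bit u from a 2-bit state."""
        reg = (u << 2) | state                      # register bits [u, prev1, prev2]
        out = tuple(bin(reg & g).count("1") & 1 for g in G)
        return ((u << 1) | (state >> 1)), out

    def encode(bits):
        state, out = 0, []
        for u in bits:
            state, o = step(state, u)
            out.extend(o)
        return out

    def viterbi(received):
        """Minimize Hamming distance over all trellis paths (survivor selection)."""
        INF = float("inf")
        metric = [0] + [INF] * 3                    # start in state 0
        paths = [[] for _ in range(4)]
        for t in range(0, len(received), 2):
            r = received[t:t + 2]
            new_metric, new_paths = [INF] * 4, [None] * 4
            for s in range(4):
                if metric[s] == INF:
                    continue
                for u in (0, 1):
                    ns, o = step(s, u)
                    m = metric[s] + (o[0] != r[0]) + (o[1] != r[1])
                    if m < new_metric[ns]:
                        new_metric[ns], new_paths[ns] = m, paths[s] + [u]
            metric, paths = new_metric, new_paths
        return paths[min(range(4), key=lambda s: metric[s])]

    msg = [1, 0, 1, 1, 0, 0, 1]
    rx = encode(msg)
    rx[3] ^= 1                                      # inject a single channel error
    print(viterbi(rx) == msg)                       # should print True
    ```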

  8. Massively parallel GPU-accelerated minimization of classical density functional theory

    NASA Astrophysics Data System (ADS)

    Stopper, Daniel; Roth, Roland

    2017-08-01

    In this paper, we discuss the ability to numerically minimize the grand potential of hard disks in two-dimensional and of hard spheres in three-dimensional space within the framework of classical density functional and fundamental measure theory on modern graphics cards. Our main finding is that a massively parallel minimization leads to an enormous performance gain in comparison to standard sequential minimization schemes. Furthermore, the results indicate that in complex multi-dimensional situations, a heavily parallelized minimization of the grand potential seems to be mandatory in order to reach a reasonable balance between accuracy and computational cost.
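
    The numerical core is a per-grid-point self-consistency update, which is what makes massive parallelization pay off. The toy below (plain NumPy standing in for the GPU, ideal gas only, with the fundamental-measure excess term omitted) shows the embarrassingly parallel structure of a damped Picard iteration.

    ```python
    # Damped Picard iteration for a 1-D *ideal* gas in an external potential.
    # Every grid point updates independently, the structure a GPU exploits;
    # a full calculation would add the FMT excess term c1 at each point.
    import numpy as np

    z = np.linspace(0.0, 10.0, 1024)            # spatial grid
    beta_Vext = z                                # linear (gravity-like) potential
    rho_bulk = 0.5
    rho = np.full_like(z, rho_bulk)              # initial guess

    alpha = 0.2                                  # mixing (damping) parameter
    for _ in range(200):
        c1 = 0.0                                 # ideal gas: no excess correlations
        rho_target = rho_bulk * np.exp(-beta_Vext + c1)
        rho = (1 - alpha) * rho + alpha * rho_target

    # The fixed point is the exact barometric profile rho_bulk * exp(-beta*Vext).
    print(np.allclose(rho, rho_bulk * np.exp(-z), atol=1e-8))
    ```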

  9. Recognition of surgical skills using hidden Markov models

    NASA Astrophysics Data System (ADS)

    Speidel, Stefanie; Zentek, Tom; Sudra, Gunther; Gehrig, Tobias; Müller-Stich, Beat Peter; Gutt, Carsten; Dillmann, Rüdiger

    2009-02-01

    Minimally invasive surgery is a highly complex medical discipline and can be regarded as a major breakthrough in surgical technique. A minimally invasive intervention requires enhanced motor skills to deal with difficulties like the complex hand-eye coordination and restricted mobility. To alleviate these constraints we propose to enhance the surgeon's capabilities by providing a context-aware assistance using augmented reality techniques. To recognize and analyze the current situation for context-aware assistance, we need intraoperative sensor data and a model of the intervention. Characteristics of a situation are the performed activity, the used instruments, the surgical objects and the anatomical structures. Important information about the surgical activity can be acquired by recognizing the surgical gesture performed. Surgical gestures in minimally invasive surgery like cutting, knot-tying or suturing are here referred to as surgical skills. We use the motion data from the endoscopic instruments to classify and analyze the performed skill and even use it for skill evaluation in a training scenario. The system uses Hidden Markov Models (HMM) to model and recognize a specific surgical skill like knot-tying or suturing with an average recognition rate of 92%.
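
    A sketch of the recognition scheme described here, assuming the hmmlearn package and random stand-in motion features: one Gaussian HMM is fitted per skill, and a test sequence is labeled with the skill whose model scores the highest log-likelihood.

    ```python
    # Sketch of HMM-based skill classification (assumes the hmmlearn package;
    # the paper's features are endoscopic instrument motions, here random stand-ins).
    import numpy as np
    from hmmlearn import hmm

    def train_skill_model(sequences, n_states=5):
        """Fit one Gaussian HMM per surgical skill from several demonstrations."""
        X = np.vstack(sequences)
        lengths = [len(s) for s in sequences]
        model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag",
                                n_iter=50, random_state=0)
        model.fit(X, lengths)
        return model

    def classify(observation, models):
        """Label = skill whose HMM assigns the highest log-likelihood."""
        return max(models, key=lambda k: models[k].score(observation))

    rng = np.random.default_rng(0)
    train = {"knot_tying": [rng.normal(0, 1, (80, 6)) for _ in range(5)],
             "suturing":   [rng.normal(2, 1, (80, 6)) for _ in range(5)]}
    models = {skill: train_skill_model(seqs) for skill, seqs in train.items()}
    print(classify(rng.normal(2, 1, (80, 6)), models))   # expected: "suturing"
    ```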

  10. Hierarchical Control Using Networks Trained with Higher-Level Forward Models

    PubMed Central

    Wayne, Greg; Abbott, L.F.

    2015-01-01

    We propose and develop a hierarchical approach to network control of complex tasks. In this approach, a low-level controller directs the activity of a “plant,” the system that performs the task. However, the low-level controller may only be able to solve fairly simple problems involving the plant. To accomplish more complex tasks, we introduce a higher-level controller that controls the lower-level controller. We use this system to direct an articulated truck to a specified location through an environment filled with static or moving obstacles. The final system consists of networks that have memorized associations between the sensory data they receive and the commands they issue. These networks are trained on a set of optimal associations that are generated by minimizing cost functions. Cost function minimization requires predicting the consequences of sequences of commands, which is achieved by constructing forward models, including a model of the lower-level controller. The forward models and cost minimization are only used during training, allowing the trained networks to respond rapidly. In general, the hierarchical approach can be extended to larger numbers of levels, dividing complex tasks into more manageable sub-tasks. The optimization procedure and the construction of the forward models and controllers can be performed in similar ways at each level of the hierarchy, which allows the system to be modified to perform other tasks, or to be extended for more complex tasks without retraining lower-levels. PMID:25058706
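
    The training-data generation step, minimizing a cost function through a forward model to obtain one optimal association, can be illustrated with a toy plant (a point mass under velocity commands; our stand-in, not the paper's articulated truck):

    ```python
    # Toy version of "generate optimal associations by minimizing a cost through
    # a forward model": choose a command sequence that steers a point plant to a
    # target. The paper trains networks on many such optimal pairs; this sketch
    # produces just one, with finite-difference gradients for simplicity.
    import numpy as np

    DT, T = 0.1, 20

    def forward(x0, u_seq):
        """Forward model of the plant: commands are velocities, integrated in time."""
        x = x0.copy()
        for u in u_seq:
            x = x + DT * u
        return x

    def cost(u_seq, x0, target, lam=1e-3):
        final = forward(x0, u_seq)
        return np.sum((final - target) ** 2) + lam * np.sum(u_seq ** 2)

    x0, target = np.zeros(2), np.array([1.0, 2.0])
    u = np.zeros((T, 2))
    eps = 1e-5
    for _ in range(300):                       # plain gradient descent on commands
        base = cost(u, x0, target)
        grad = np.zeros_like(u)
        for idx in np.ndindex(*u.shape):
            u_pert = u.copy()
            u_pert[idx] += eps
            grad[idx] = (cost(u_pert, x0, target) - base) / eps
        u -= 0.5 * grad
    print(forward(x0, u))                      # approximately [1.0, 2.0]
    ```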

  11. Implementation of Complex Biological Logic Circuits Using Spatially Distributed Multicellular Consortia

    PubMed Central

    Urrios, Arturo; de Nadal, Eulàlia; Solé, Ricard; Posas, Francesc

    2016-01-01

    Engineered synthetic biological devices have been designed to perform a variety of functions from sensing molecules and bioremediation to energy production and biomedicine. Notwithstanding, a major limitation of in vivo circuit implementation is the constraint associated to the use of standard methodologies for circuit design. Thus, future success of these devices depends on obtaining circuits with scalable complexity and reusable parts. Here we show how to build complex computational devices using multicellular consortia and space as key computational elements. This spatial modular design grants scalability since its general architecture is independent of the circuit’s complexity, minimizes wiring requirements and allows component reusability with minimal genetic engineering. The potential use of this approach is demonstrated by implementation of complex logical functions with up to six inputs, thus demonstrating the scalability and flexibility of this method. The potential implications of our results are outlined. PMID:26829588

  12. Essays on wholesale auctions in deregulated electricity markets

    NASA Astrophysics Data System (ADS)

    Baltaduonis, Rimvydas

    2007-12-01

    The early experience in the restructured electric power markets raised several issues, including price spikes, inefficiency, security, and the overall relationship of market clearing prices to generation costs. Unsatisfactory outcomes in these markets are thought to have resulted in part from strategic generator behaviors encouraged by inappropriate market design features. In this dissertation, I examine the performance of three auction mechanisms for wholesale power markets - Offer Cost Minimization auction, Payment Cost Minimization auction and Simple-Offer auction - when electricity suppliers act strategically. A Payment Cost Minimization auction has been proposed as an alternative to the traditional Offer Cost Minimization auction with the intention to solve the problem of inflated wholesale electricity prices. Efficiency concerns for this proposal were voiced due to insights predicated on the assumption of true production cost revelation. Using a game theoretic approach and an experimental method, I compare the two auctions, strictly controlling for the level of unilateral market power. A specific feature of these complex-offer auctions is that the sellers submit not only the quantities and the minimum prices that they are willing to sell at, but also the start-up fees, which are designed to reimburse the fixed start-up costs of the generation plants. I find that the complex structure of the offers leaves considerable room for strategic behavior, which consequently leads to anti-competitive and inefficient market outcomes. In the last chapter of my dissertation, I use laboratory experiments to contrast the performance of two complex-offer auctions against the performance of a simple-offer auction, in which the sellers have to recover all their generation costs - fixed and variable - through a uniform market-clearing price. I find that a simple-offer auction significantly reduces consumer prices and lowers price volatility. It mitigates anti-competitive effects that are present in the complex-offer auctions and achieves allocative efficiency more quickly.
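
    One schematic way to contrast the two complex-offer auctions, in our stylized notation rather than the dissertation's model: with accepted quantities q_i, offered minimum prices c_i, start-up fees s_i, commitment indicators u_i, and uniform price p,

    ```latex
    \text{Offer cost:}\;\; \min_{q,\,u}\ \sum_i \bigl(c_i q_i + s_i u_i\bigr)
    \qquad\quad
    \text{Payment cost:}\;\; \min_{q,\,u,\,p}\ p \sum_i q_i + \sum_i s_i u_i
    ```

    both subject to meeting demand (the sum of the q_i equals D), and with p required to cover every dispatched offer (p >= c_i whenever q_i > 0) in the payment-cost case.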

  13. Robotic lateral pancreaticojejunostomy (Puestow).

    PubMed

    Meehan, John J; Sawin, Robert

    2011-06-01

    A lateral pancreaticojejunostomy (LPJ), also known as the Puestow procedure, is a complex procedure performed for chronic pancreatitis when the pancreatic duct is dilated and unable to drain properly. Traditionally, these procedures are performed with open surgery. A minimally invasive approach to the LPJ using rigid handheld nonarticulating instruments is tedious and rarely performed. In fact, there are no prior laparoscopic case reports for LPJ in children and only a small handful of cases in the adult literature. This lack of laparoscopic information may be an indication of the difficulty in performing this complex operation with nonarticulating laparoscopic instruments. The advantages of robotic surgery may help overcome these difficulties. We present the first robotic LPJ ever reported in a 14-year-old child with idiopathic chronic pancreatitis. This case demonstrates the utility of this advanced surgical technology and may lead to a new minimally invasive option for both adults and children with chronic pancreatitis requiring surgical intervention. Copyright © 2011 Elsevier Inc. All rights reserved.

  14. CCOMP: An efficient algorithm for complex roots computation of determinantal equations

    NASA Astrophysics Data System (ADS)

    Zouros, Grigorios P.

    2018-01-01

    In this paper a free Python algorithm, entitled CCOMP (Complex roots COMPutation), is developed for the efficient computation of complex roots of determinantal equations inside a prescribed complex domain. The key to the method presented is the efficient determination of candidate points inside the domain in whose close neighborhood a complex root may lie. Once these points are detected, the algorithm proceeds to a two-dimensional minimization problem with respect to the minimum modulus eigenvalue of the system matrix. In the core of CCOMP exist three sub-algorithms whose tasks are the efficient estimation of the minimum modulus eigenvalues of the system matrix inside the prescribed domain, the efficient computation of candidate points which guarantee the existence of minima, and finally, the computation of minima via bound constrained minimization algorithms. Theoretical results and heuristics support the development and the performance of the algorithm, which is discussed in detail. CCOMP supports general complex matrices, and its efficiency, applicability and validity are demonstrated on a variety of microwave applications.
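
    A sketch of the strategy described in the abstract (our reconstruction, not the published CCOMP code): scan the domain for candidate points where the minimum-modulus eigenvalue is small, then polish each candidate with bound-constrained minimization. The matrix A(z) below is a toy stand-in.

    ```python
    # CCOMP-style root search: coarse scan, then bound-constrained polishing.
    import numpy as np
    from scipy.optimize import minimize

    def A(z):
        """Toy matrix-valued function; det A(z) = 0 at z = +/-1j and z = 2."""
        return np.array([[z**2 + 1.0, 0.0], [0.0, z - 2.0]], dtype=complex)

    def objective(xy):
        """Squared minimum-modulus eigenvalue of A at z = x + iy (smooth at roots)."""
        lam = np.linalg.eigvals(A(xy[0] + 1j * xy[1]))
        return float(np.min(np.abs(lam)) ** 2)

    # Coarse scan of [-3,3] x [-2,2] for candidate neighborhoods.
    xs, ys = np.linspace(-3, 3, 61), np.linspace(-2, 2, 41)
    candidates = [(x, y) for x in xs for y in ys if objective((x, y)) < 0.05]

    roots = set()
    for c in candidates:
        res = minimize(objective, c, method="L-BFGS-B",
                       bounds=[(-3, 3), (-2, 2)], options={"ftol": 1e-14})
        if res.fun < 1e-8:
            roots.add((round(res.x[0], 3), round(res.x[1], 3)))
    print(sorted(roots))   # expected near (0, -1), (0, 1), and (2, 0)
    ```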

  15. Energy-efficient ECG compression on wireless biosensors via minimal coherence sensing and weighted ℓ₁ minimization reconstruction.

    PubMed

    Zhang, Jun; Gu, Zhenghui; Yu, Zhu Liang; Li, Yuanqing

    2015-03-01

    Low energy consumption is crucial for body area networks (BANs). In BAN-enabled ECG monitoring, continuous monitoring requires the sensor nodes to transmit a large amount of data to the sink node, which leads to excessive energy consumption. To reduce airtime over energy-hungry wireless links, this paper presents an energy-efficient compressed sensing (CS)-based approach for on-node ECG compression. At first, an algorithm called minimal mutual coherence pursuit is proposed to construct sparse binary measurement matrices, which can be used to encode the ECG signals with superior performance and extremely low complexity. Second, in order to minimize the data rate required for faithful reconstruction, a weighted ℓ₁ minimization model is derived by exploring the multisource prior knowledge in the wavelet domain. Experimental results on the MIT-BIH arrhythmia database reveal that the proposed approach can obtain a higher compression ratio than state-of-the-art CS-based methods. Together with its low encoding complexity, our approach can achieve significant energy savings in both the encoding process and wireless transmission.
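
    A compressed-sensing sketch of the pipeline, assuming the cvxpy package: a sparse binary measurement matrix encodes the signal cheaply on the node, and a weighted ℓ₁ program reconstructs it at the sink. The uniform weights, identity sparsifying basis, and synthetic "ECG" are toy stand-ins for the paper's coherence-optimized matrix, wavelet domain, and prior-derived weights.

    ```python
    # Sparse binary sensing + weighted-l1 reconstruction (assumes cvxpy).
    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(0)
    n, m, d = 128, 48, 4                       # signal length, measurements, 1s/column

    # Sparse binary measurement matrix: d ones per column (cheap on-node encoding).
    Phi = np.zeros((m, n))
    for j in range(n):
        Phi[rng.choice(m, size=d, replace=False), j] = 1.0

    # Toy "ECG": sparse in the identity basis here (a wavelet basis in the paper).
    x = np.zeros(n)
    x[rng.choice(n, size=5, replace=False)] = rng.normal(0, 1, 5)
    y = Phi @ x                                # measurements sent to the sink

    w = np.ones(n)                             # priors would reweight likely-active coefficients
    c = cp.Variable(n)
    prob = cp.Problem(cp.Minimize(cp.norm1(cp.multiply(w, c))), [Phi @ c == y])
    prob.solve()
    # Relative reconstruction error; should be near zero for exact measurements.
    print(np.linalg.norm(c.value - x) / np.linalg.norm(x))
    ```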

  16. Combined shape and topology optimization for minimization of maximal von Mises stress

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lian, Haojie; Christiansen, Asger N.; Tortorelli, Daniel A.

    This work shows that a combined shape and topology optimization method can produce optimal 2D designs with minimal stress subject to a volume constraint. The method represents the surface explicitly and discretizes the domain into a simplicial complex which adapts both structural shape and topology. By performing repeated topology and shape optimizations and adaptive mesh updates, we can minimize the maximum von Mises stress using the p-norm stress measure with p-values as high as 30, provided that the stress is calculated with sufficient accuracy.

  17. Combined shape and topology optimization for minimization of maximal von Mises stress

    DOE PAGES

    Lian, Haojie; Christiansen, Asger N.; Tortorelli, Daniel A.; ...

    2017-01-27

    This work shows that a combined shape and topology optimization method can produce optimal 2D designs with minimal stress subject to a volume constraint. The method represents the surface explicitly and discretizes the domain into a simplicial complex which adapts both structural shape and topology. By performing repeated topology and shape optimizations and adaptive mesh updates, we can minimize the maximum von Mises stress using the p-norm stress measure with p-values as high as 30, provided that the stress is calculated with sufficient accuracy.
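
    Both records refer to the p-norm stress measure. In a standard discretized form (our notation, not necessarily the authors'), with element von Mises stresses sigma_e and element volumes v_e:

    ```latex
    \sigma_{\mathrm{PN}} \;=\; \Bigl(\sum_{e=1}^{N} v_e\,\sigma_{e}^{\,p}\Bigr)^{1/p}
    \;\longrightarrow\; \max_{e}\,\sigma_{e} \quad (p \to \infty)
    ```

    A p of about 30 gives a differentiable surrogate that tracks the true maximum closely but is numerically delicate, which is why the abstract stresses computing the stress field with sufficient accuracy.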

  18. The Influence of Relational Complexity and Strategy Selection on Children's Reasoning in the Latin Square Task

    ERIC Educational Resources Information Center

    Perret, Patrick; Bailleux, Christine; Dauvier, Bruno

    2011-01-01

    The present study focused on children's deductive reasoning when performing the Latin Square Task, an experimental task designed to explore the influence of relational complexity. Building on Birney, Halford, and Andrew's (2006) research, we created a version of the task that minimized nonrelational factors and introduced new categories of items.…

  19. Aspects of transport system management within mining complex using information and telecommunication systems

    NASA Astrophysics Data System (ADS)

    Semykina, A. S.; Zagorodniy, N. A.; Konev, A. A.; Duganova, E. V.

    2018-05-01

    The paper considers aspects of transport system management within the mining complex. It identifies the information and telecommunication systems used to increase transportation efficiency and describes their key advantages and disadvantages. It is found that the Modular Company's software products used in pits make it possible to increase transport performance, minimize losses, and ensure efficient transportation of minerals.

  20. Balancing Near-Field Enhancement, Absorption, and Scattering for Effective Antenna-Reactor Plasmonic Photocatalysis.

    PubMed

    Li, Kun; Hogan, Nathaniel J; Kale, Matthew J; Halas, Naomi J; Nordlander, Peter; Christopher, Phillip

    2017-06-14

    Efficient photocatalysis requires multifunctional materials that absorb photons and generate energetic charge carriers at catalytic active sites to facilitate a desired chemical reaction. Antenna-reactor complexes are an emerging multifunctional photocatalytic structure where the strong, localized near field of the plasmonic metal nanoparticle (e.g., Ag) is coupled to the catalytic properties of the nonplasmonic metal nanoparticle (e.g., Pt) to enable chemical transformations. With an eye toward sustainable solar-driven photocatalysis, we investigate how the structure of antenna-reactor complexes governs their photocatalytic activity in the light-limited regime, where all photons need to be effectively utilized. By synthesizing core@shell/satellite (Ag@SiO2/Pt) antenna-reactor complexes with varying Ag nanoparticle diameters and performing photocatalytic CO oxidation, we observed plasmon-enhanced photocatalysis only for antenna-reactor complexes with antenna components of intermediate sizes (25 and 50 nm). Optimal photocatalytic performance was shown to be determined by a balance between maximized local field enhancements at the catalytically active Pt surface, minimized collective scattering of photons out of the catalyst bed by the complexes, and minimal light absorption in the Ag nanoparticle antenna. These results elucidate the critical aspects of local field enhancement, light scattering, and absorption in plasmonic photocatalyst design, especially under light-limited illumination conditions.

  1. Robotic surgery of the pancreas

    PubMed Central

    Joyce, Daniel; Morris-Stiff, Gareth; Falk, Gavin A; El-Hayek, Kevin; Chalikonda, Sricharan; Walsh, R Matthew

    2014-01-01

    Pancreatic surgery is one of the most challenging and complex fields in general surgery. While minimally invasive surgery has become the standard of care for many intra-abdominal pathologies the overwhelming majority of pancreatic surgery is performed in an open fashion. This is attributed to the retroperitoneal location of the pancreas, its intimate relationship to major vasculature and the complexity of reconstruction in the case of pancreatoduodenectomy. Herein, we describe the application of robotic technology to minimally invasive pancreatic surgery. The unique capabilities of the robotic platform have made the minimally invasive approach feasible and safe with equivalent if not better outcomes (e.g., decreased length of stay, less surgical site infections) to conventional open surgery. However, it is unclear whether the robotic approach is truly superior to traditional laparoscopy; this is a key point given the substantial costs associated with procuring and maintaining robotic capabilities. PMID:25356035

  2. Problem of quality assurance during metal constructions welding via robotic technological complexes

    NASA Astrophysics Data System (ADS)

    Fominykh, D. S.; Rezchikov, A. F.; Kushnikov, V. A.; Ivashchenko, V. A.; Bogomolov, A. S.; Filimonyuk, L. Yu; Dolinina, O. N.; Kushnikov, O. V.; Shulga, T. E.; Tverdokhlebov, V. A.

    2018-05-01

    The problem of minimizing the probability of critical combinations of events that lead to a loss of welding quality in robotic process automation is examined. The problem is formulated, and models and algorithms for its solution are developed. The problem is solved by minimizing a criterion characterizing the losses caused by defective products. Solving the problem may enhance the quality and accuracy of the operations performed and reduce the losses caused by defective products.

  3. Myocardial Protection and Financial Considerations of Custodiol Cardioplegia in Minimally Invasive and Open Valve Surgery.

    PubMed

    Hummel, Brian W; Buss, Randall W; DiGiorgi, Paul L; Laviano, Brittany N; Yaeger, Nalani A; Lucas, M Lee; Comas, George M

    Single-dose antegrade crystalloid cardioplegia with Custodiol-HTK (histidine-tryptophan-ketoglutarate) has been used for many years. Its safety and efficacy were established in experimental and clinical studies. It is beneficial in complex valve surgery because it provides a long period of myocardial protection with a single dose. Thus, valve procedures (minimally invasive or open) can be performed with limited interruption. The aim of this study is to compare the use of Custodiol-HTK cardioplegia with traditional blood cardioplegia in patients undergoing minimally invasive and open valve surgery. A single-institution, retrospective case-control review was performed on patients who underwent valve surgery in Lee Memorial Health System at either HealthPark Medical Center or Gulf Coast Medical Center from July 1, 2011, through March 7, 2015. A total of 181 valve cases (aortic or mitral) performed using Custodiol-HTK cardioplegia were compared with 181 cases performed with traditional blood cardioplegia. Each group had an equal distribution of minimally invasive and open valve cases. Right chest thoracotomy or partial sternotomy was performed on minimally invasive valve cases. Demographics, perioperative data, clinical outcomes, and financial data were collected and analyzed. Patient outcomes were superior in the Custodiol-HTK cardioplegia group for blood transfusion, stroke, and hospital readmission within 30 days (P < 0.05). No statistical differences were observed in the other outcomes categories. Hospital charges were reduced on average by $3013 per patient when using Custodiol-HTK cardioplegia. Use of Custodiol-HTK cardioplegia is safe and cost-effective when compared with traditional repetitive blood cardioplegia in patients undergoing minimally invasive and open valve surgery.

  4. Invasive Aspergillus niger complex infections in a Belgian tertiary care hospital.

    PubMed

    Vermeulen, E; Maertens, J; Meersseman, P; Saegeman, V; Dupont, L; Lagrou, K

    2014-05-01

    The incidence of invasive infections caused by the Aspergillus niger species complex was 0.043 cases/10 000 patient-days in a Belgian university hospital (2005-2011). Molecular typing was performed on six available A. niger complex isolates involved in invasive disease from 2010 to 2011, revealing A. tubingensis, which has higher triazole minimal inhibitory concentrations, in five out of six cases. © 2013 The Authors Clinical Microbiology and Infection © 2013 European Society of Clinical Microbiology and Infectious Diseases.

  5. Comparison of joint space versus task force load distribution optimization for a multiarm manipulator system

    NASA Technical Reports Server (NTRS)

    Soloway, Donald I.; Alberts, Thomas E.

    1989-01-01

    It is often proposed that the redundancy in choosing a force distribution for multiple arms grasping a single object should be handled by minimizing a quadratic performance index. The performance index may be formulated in terms of joint torques or in terms of the Cartesian space force/torque applied to the body by the grippers. The former seeks to minimize power consumption while the latter minimizes body stresses. Because the cost functions are related to each other by a joint angle dependent transformation on the weight matrix, it might be argued that either method tends to reduce power consumption, but clearly the joint space minimization is optimal. A comparison of these two options is presented with consideration given to computational cost and power consumption. Simulation results using a two arm robot system are presented to show the savings realized by employing the joint space optimization. These savings are offset by additional complexity, computation time and in some cases processor power consumption.
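
    The relation the abstract invokes between the two cost functions can be written compactly (standard manipulator notation, ours rather than the paper's): joint torques tau and the Cartesian wrench F applied by the grippers are linked through the grasp Jacobian, so the two quadratic costs differ by a joint-angle-dependent congruence of the weight matrix:

    ```latex
    \tau = J^{\mathsf{T}} F
    \quad\Longrightarrow\quad
    \tau^{\mathsf{T}} W \tau \;=\; F^{\mathsf{T}} \bigl(J W J^{\mathsf{T}}\bigr) F
    ```

    A Cartesian cost with weight W_F therefore coincides with a joint-space cost only when W_F = J W J^T for the current configuration, which is why the joint-space formulation is optimal for power consumption while the Cartesian one targets body stresses.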

  6. Influence of hospital-level practice patterns on variation in the application of minimally invasive surgery in United States pediatric patients.

    PubMed

    Train, Arianne T; Harmon, Carroll M; Rothstein, David H

    2017-10-01

    Although disparities in access to minimally invasive surgery are thought to exist in pediatric surgical patients in the United States, hospital-level practice patterns have not been evaluated as a possible contributing factor. Retrospective cohort study using the Kids' Inpatient Database, 2012. Odds ratios of undergoing a minimally invasive compared to open operation were calculated for six typical pediatric surgical operations after adjustment for multiple patient demographic and hospital-level variables. Further adjustment to the regression model was made by incorporating hospital practice patterns, defined as operation-specific minimally invasive frequency and volume. Age was the most significant patient demographic factor affecting application of minimally invasive surgery for all procedures. For several procedures, adjusting for individual hospital practice patterns removed race- and income-based disparities seen in performance of minimally invasive operations. Disparities related to insurance status were not affected by the same adjustment. Variation in the application of minimally invasive surgery in pediatric surgical patients is primarily influenced by patient age and the type of procedure performed. Perceived disparities in access related to some socioeconomic factors are decreased but not eliminated by accounting for individual hospital practice patterns, suggesting that complex underlying factors influence application of advanced surgical techniques. Level of evidence: II. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Design of a robust fuzzy controller for the arc stability of CO(2) welding process using the Taguchi method.

    PubMed

    Kim, Dongcheol; Rhee, Sehun

    2002-01-01

    CO(2) welding is a complex process. Weld quality depends on arc stability and on minimizing the effects of disturbances or changes in the operating conditions that commonly occur during the welding process. In order to minimize these effects, a controller can be used. In this study, a fuzzy controller was used to stabilize the arc during CO(2) welding. The input variable of the controller was the Mita index, which quantitatively estimates the arc stability that is influenced by many welding process parameters. Because the welding process is complex, a mathematical model of the Mita index was difficult to derive. Therefore, the parameter settings of the fuzzy controller were determined by performing actual control experiments without using a mathematical model of the controlled process. As a solution, the Taguchi method was used to determine the optimal control parameter settings of the fuzzy controller to make the control performance robust and insensitive to changes in the operating conditions.

  8. Computer Program for Calculation of Complex Chemical Equilibrium Compositions, Rocket Performance, Incident and Reflected Shocks, and Chapman-Jouguet Detonations. Interim Revision, March 1976

    NASA Technical Reports Server (NTRS)

    Gordon, S.; Mcbride, B. J.

    1976-01-01

    A detailed description of the equations and computer program for computations involving chemical equilibria in complex systems is given. A free-energy minimization technique is used. The program permits calculations such as (1) chemical equilibrium for assigned thermodynamic states (T,P), (H,P), (S,P), (T,V), (U,V), or (S,V), (2) theoretical rocket performance for both equilibrium and frozen compositions during expansion, (3) incident and reflected shock properties, and (4) Chapman-Jouguet detonation properties. The program considers condensed species as well as gaseous species.
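
    The free-energy minimization is, in standard form (our notation, with ideal-gas chemical potentials for brevity), a constrained problem over the species amounts n_j:

    ```latex
    \min_{n_j \ge 0}\; G(n) \;=\; \sum_{j} n_j \Bigl(\mu_j^{\circ} + RT \ln \frac{n_j\,p}{n_{\mathrm{tot}}\,p^{\circ}}\Bigr)
    \qquad \text{subject to} \quad \sum_{j} a_{ij}\, n_j = b_i \;\; \text{for each element } i
    ```

    Here a_ij counts atoms of element i in species j and b_i is the total abundance of element i; adjoining the element balances with Lagrange multipliers gives the nonlinear system that the program solves iteratively.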

  9. Copy Hiding Application Interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Holger; Poliakoff, David; Robinson, Peter

    2016-10-06

    CHAI is a light-weight framework which abstracts the automated movement of data (e.g., to/from host/device) via RAJA-like performance portability programming model constructs. It can be viewed as a utility framework and an adjunct to RAJA (a performance portability framework). Performance portability is a technique that abstracts the complexities of modern heterogeneous architectures while allowing the original program to undergo incremental, minimally invasive code changes in order to adapt to the newer architectures.

  10. AstroGrid: Taverna in the Virtual Observatory .

    NASA Astrophysics Data System (ADS)

    Benson, K. M.; Walton, N. A.

    This paper reports on the implementation of the Taverna workbench by AstroGrid, a tool for designing and executing workflows of tasks in the Virtual Observatory. The workflow approach helps astronomers perform complex task sequences with little technical effort. The visual approach to workflow construction streamlines highly complex analyses over public and private data and requires computational resources as minimal as a desktop computer. Some integration issues and future work are discussed in this article.

  11. Minimal perceptrons for memorizing complex patterns

    NASA Astrophysics Data System (ADS)

    Pastor, Marissa; Song, Juyong; Hoang, Danh-Tai; Jo, Junghyo

    2016-11-01

    Feedforward neural networks have been investigated to understand learning and memory, as well as applied to numerous practical problems in pattern classification. It is a rule of thumb that more complex tasks require larger networks. However, the design of optimal network architectures for specific tasks is still an unsolved fundamental problem. In this study, we consider three-layered neural networks for memorizing binary patterns. We developed a new complexity measure of binary patterns, and estimated the minimal network size for memorizing them as a function of their complexity. We formulated the minimal network size for regular, random, and complex patterns. In particular, the minimal size for complex patterns, which are neither ordered nor disordered, was predicted by measuring their Hamming distances from known ordered patterns. Our predictions agree with simulations based on the back-propagation algorithm.

  12. Robotic partial nephrectomy for complex renal tumors: surgical technique.

    PubMed

    Rogers, Craig G; Singh, Amar; Blatt, Adam M; Linehan, W Marston; Pinto, Peter A

    2008-03-01

    Laparoscopic partial nephrectomy requires advanced training to accomplish tumor resection and renal reconstruction while minimizing warm ischemia times. Complex renal tumors add an additional challenge to a minimally invasive approach to nephron-sparing surgery. We describe our technique, illustrated with video, of robotic partial nephrectomy for complex renal tumors, including hilar, endophytic, and multiple tumors. Robotic assistance was used to resect 14 tumors in eight patients (mean age: 50.3 yr; range: 30-68 yr). Three patients had hereditary kidney cancer. All patients had complex tumor features, including hilar tumors (n=5), endophytic tumors (n=4), and/or multiple tumors (n=3). Robotic partial nephrectomy procedures were performed successfully without complications. Hilar clamping was used with a mean warm ischemia time of 31 min (range: 24-45 min). Mean blood loss was 230 ml (range: 100-450 ml). Histopathology confirmed clear-cell renal cell carcinoma (n=3), hybrid oncocytic tumor (n=2), chromophobe renal cell carcinoma (n=2), and oncocytoma (n=1). All patients had negative surgical margins. Mean index tumor size was 3.6 cm (range: 2.6-6.4 cm). Mean hospital stay was 2.6 d. At 3-mo follow-up, no patients experienced a statistically significant change in serum creatinine or estimated glomerular filtration rate and there was no evidence of tumor recurrence. Robotic partial nephrectomy is safe and feasible for select patients with complex renal tumors, including hilar, endophytic, and multiple tumors. Robotic assistance may facilitate a minimally invasive, nephron-sparing approach for select patients with complex renal tumors who might otherwise require open surgery or total nephrectomy.

  13. A meta-cognitive learning algorithm for a Fully Complex-valued Relaxation Network.

    PubMed

    Savitha, R; Suresh, S; Sundararajan, N

    2012-08-01

    This paper presents a meta-cognitive learning algorithm for a single hidden layer complex-valued neural network called "Meta-cognitive Fully Complex-valued Relaxation Network (McFCRN)". McFCRN has two components: a cognitive component and a meta-cognitive component. A Fully Complex-valued Relaxation Network (FCRN) with a fully complex-valued Gaussian-like activation function (sech) in the hidden layer and an exponential activation function in the output layer forms the cognitive component. The meta-cognitive component contains a self-regulatory learning mechanism which controls the learning ability of FCRN by deciding what-to-learn, when-to-learn and how-to-learn from a sequence of training data. The input parameters of the cognitive component are chosen randomly and the output parameters are estimated by minimizing a logarithmic error function. The problem of explicit minimization of magnitude and phase errors in the logarithmic error function is converted to a system of linear equations and the output parameters of FCRN are computed analytically. McFCRN starts with zero hidden neurons and builds up the number of neurons required to approximate the target function. The meta-cognitive component selects the best learning strategy for FCRN to acquire knowledge from the training data and also adapts the learning strategies to implement the best human learning components. Performance studies on function approximation and real-valued classification problems show that the proposed McFCRN performs better than existing methods reported in the literature. Copyright © 2012 Elsevier Ltd. All rights reserved.
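
    One way to read the analytic output-parameter step (our reconstruction from the abstract, not the paper's exact derivation): with an exponential output activation, the logarithm of the network output is linear in the output weights W, so minimizing the logarithmic error is a linear least-squares problem whose real and imaginary parts carry the magnitude and phase errors:

    ```latex
    \hat{y}_t = \exp(W h_t)
    \;\Rightarrow\;
    E(W) = \sum_t \bigl\| \ln y_t - W h_t \bigr\|^2,
    \qquad \ln y = \ln\lvert y\rvert + i \arg(y)
    ```

    Setting the derivative of E with respect to W to zero then yields normal equations, a linear system solvable in closed form.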

  14. Minimally invasive surgery. Future developments.

    PubMed

    Wickham, J E

    1994-01-15

    The rapid development of minimally invasive surgery means that there will be fundamental changes in interventional treatment. Technological advances will allow new minimally invasive procedures to be developed. Application of robotics will allow some procedures to be done automatically, and coupling of slave robotic instruments with virtual reality images will allow surgeons to perform operations by remote control. Miniature motors and instruments designed by microengineering could be introduced into body cavities to perform operations that are currently impossible. New materials will allow changes in instrument construction, such as use of memory metals to make heat activated scissors or forceps. With the reduced trauma associated with minimally invasive surgery, fewer operations will require long hospital stays. Traditional surgical wards will become largely redundant, and hospitals will need to cope with increased throughput of patients. Operating theatres will have to be equipped with complex high technology equipment, and hospital staff will need to be trained to manage it. Conventional nursing care will be carried out more in the community. Many traditional specialties will be merged, and surgical training will need fundamental revision to ensure that surgeons are competent to carry out the new procedures.

  15. Robotics in surgery: is a robot necessary? For what?

    PubMed

    Ross, Sharona B; Downs, Darrell; Saeed, Sabrina M; Dolce, John K; Rosemurgy, Alexander S

    2017-02-01

    Every operation can be categorized along a spectrum from "most invasive" to "least invasive", based on the approach(es) through which it is commonly undertaken. Operations that are considered "most invasive" are characterized by "open" approaches with a relatively high degree of morbidity, while operations that are considered "least invasive" are undertaken with minimally invasive techniques and are associated with relatively improved patient outcomes, including faster recovery times and fewer complications. Because of the potential for reduced morbidity, movement along the spectrum towards minimally invasive surgery (MIS) is associated with a host of salutary benefits and, as well, lower costs of patient care. Accordingly, the goal of all stakeholders in surgery should be to attain universal application of the most minimally invasive approaches. Yet the difficulty of performing minimally invasive operations has largely limited its widespread application in surgery, particularly in the context of complex operations (i.e., those requiring complex extirpation and/or reconstruction). Robotic surgery, however, may facilitate application of minimally invasive techniques requisite for particular operations. Enhancements in visualization and dexterity offered by robotic surgical systems allow busy surgeons to quickly gain proficiency in demanding techniques (e.g., pancreaticojejunostomy), within a short learning curve. That is not to say, however, that all operations undertaken with minimally invasive techniques require robotic technology. Herein, we attempt to define how surgeon skill, operative difficulty, patient outcomes, and cost factors determine when robotic technology should be reasonably applied to patient care in surgery.

  16. Mini-Bentall: An Interesting Approach for Selected Patients.

    PubMed

    Mikus, Elisa; Micari, Antonio; Calvi, Simone; Salomone, Maria; Panzavolta, Marco; Paris, Marco; Del Giglio, Mauro

    Minimally invasive surgery through an upper hemisternotomy for aortic valve replacement has become the routine approach with excellent results. At present, however, the same minimally invasive access is used for complex ascending aorta procedures in only a few centers. We report our experience with a minimally invasive approach for aortic valve and ascending aorta replacement using the Bentall technique. From January 2010 to November 2015, a total of 238 patients received ascending aorta and aortic valve replacement using the Bentall De Bono procedure at our institution. Low- and intermediate-risk patients underwent elective surgery with a minimally invasive approach. The "J"-shaped partial upper sternotomy was performed through a 6-cm skin incision from the notch to the third right intercostal space. Patients who had previous cardiac surgery or were affected by active endocarditis were excluded. The study included 53 patients, 44 male (83%), with a median age of 63 years [interquartile range (IQR), 51-73 years]. A bicuspid aortic valve was diagnosed in 27 patients (51%). A biological Bentall using a pericardial Mitroflow or Crown bioprosthesis implanted in a Valsalva graft was performed in 49 patients. The remaining four patients were treated with a traditional mechanical conduit. Median cardiopulmonary bypass time and median cross-clamp time were 84 (IQR, 75-103) minutes and 73 (IQR, 64-89) minutes, respectively. Hospital mortality was zero, as was 30-day mortality. Median intensive care unit and hospital stays were 1.9 and 8 days, respectively. Compared with patients with similar preoperative characteristics treated with a standard full sternotomy, the study population showed similar postoperative outcomes, with a slight superiority of the minimally invasive group mainly regarding operative times, incidence of atrial fibrillation, and postoperative ventilation times. A partial upper sternotomy is considered a safe option for aortic valve replacement. Our experience confirms that a minimally invasive approach using a partial upper J-shaped sternotomy can be a safe alternative to the standard approach in selected patients presenting with complex aortic root pathology.

  17. Developing a robotic pancreas program: the Dutch experience

    PubMed Central

    Nota, Carolijn L.; Zwart, Maurice J.; Fong, Yuman; Hagendoorn, Jeroen; Hogg, Melissa E.; Koerkamp, Bas Groot; Besselink, Marc G.

    2017-01-01

    Robot-assisted surgery has been developed to overcome the limitations of conventional laparoscopy, aiming to further optimize minimally invasive surgery. Although robotics has already been widely adopted in urology, gynecology, and several gastrointestinal procedures, such as colorectal surgery, pancreatic surgery lags behind. Due to the complex nature of the procedure, surgeons have probably been hesitant to apply minimally invasive techniques in pancreatic surgery. Nevertheless, over the past few years pancreatic surgery has been catching up, and an increasing number of procedures are being performed laparoscopically and robotically, despite it being a highly complex procedure with high morbidity and mortality rates. Given the complex nature and extensiveness of the procedure, the start of a robotic pancreas program should be properly prepared and should comply with several conditions within high-volume centers. Robotic training plays a significant role in the preparation. In this review, we discuss the different aspects of preparation when working towards the start of a robotic pancreas program, against the background of our nationwide experience in the Netherlands. PMID:29078666

  18. Developing a robotic pancreas program: the Dutch experience.

    PubMed

    Nota, Carolijn L; Zwart, Maurice J; Fong, Yuman; Hagendoorn, Jeroen; Hogg, Melissa E; Koerkamp, Bas Groot; Besselink, Marc G; Molenaar, I Quintus

    2017-01-01

    Robot-assisted surgery has been developed to overcome the limitations of conventional laparoscopy, aiming to further optimize minimally invasive surgery. Although robotics has already been widely adopted in urology, gynecology, and several gastrointestinal procedures, such as colorectal surgery, pancreatic surgery lags behind. Due to the complex nature of the procedure, surgeons have probably been hesitant to apply minimally invasive techniques in pancreatic surgery. Nevertheless, over the past few years pancreatic surgery has been catching up, and an increasing number of procedures are being performed laparoscopically and robotically, despite it being a highly complex procedure with high morbidity and mortality rates. Given the complex nature and extensiveness of the procedure, the start of a robotic pancreas program should be properly prepared and should comply with several conditions within high-volume centers. Robotic training plays a significant role in the preparation. In this review, we discuss the different aspects of preparation when working towards the start of a robotic pancreas program, against the background of our nationwide experience in the Netherlands.

  19. Minimally invasive percutaneous pericardial ICD placement in an infant piglet model: Head-to-head comparison with an open surgical thoracotomy approach.

    PubMed

    Clark, Bradley C; Davis, Tanya D; El-Sayed Ahmed, Magdy M; McCarter, Robert; Ishibashi, Nobuyuki; Jordan, Christopher P; Kane, Timothy D; Kim, Peter C W; Krieger, Axel; Nath, Dilip S; Opfermann, Justin D; Berul, Charles I

    2016-05-01

    Epicardial implantable cardioverter-defibrillator (ICD) placement in infants, children, and patients with complex cardiac anatomy requires an open surgical thoracotomy and is associated with increased pain, longer length of stay, and higher cost. The purpose of this study was to compare an open surgical epicardial placement approach with percutaneous pericardial placement of an ICD lead system in an infant piglet model. Animals underwent either epicardial placement by direct suture fixation through a left thoracotomy or minimally invasive pericardial placement with thoracoscopic visualization. Initial lead testing and defibrillation threshold testing (DFT) were performed. After the 2-week survival period, repeat lead testing and DFT were performed before euthanasia. Minimally invasive placement was performed in 8 piglets and open surgical placement in 7 piglets without procedural morbidity or mortality. The mean initial DFT value was 10.5 J (range 3-28 J) in the minimally invasive group and 10.0 J (range 5-35 J) in the open surgical group (P = .90). After the survival period, the mean DFT value was 12.0 J (range 3-20 J) in the minimally invasive group and 12.3 J (range 3-35 J) in the open surgical group (P = .95). All lead and shock impedances, R-wave amplitudes, and ventricular pacing thresholds remained stable throughout the survival period. Compared with open surgical epicardial ICD lead placement, minimally invasive pericardial placement demonstrates an equivalent ability to effectively defibrillate the heart and has demonstrated similar lead stability. With continued technical development and operator experience, the minimally invasive method may provide a viable alternative to epicardial ICD lead placement in infants, children, and adults at risk of sudden cardiac death. Copyright © 2016 Heart Rhythm Society. All rights reserved.

  20. On complexity of trellis structure of linear block codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu

    1990-01-01

    The trellis structure of linear block codes (LBCs) is discussed. The state and branch complexities of a trellis diagram (TD) for an LBC are investigated. The TD with the minimum number of states is said to be minimal. The branch complexity of a minimal TD for an LBC is expressed in terms of the dimensions of specific subcodes of the given code. Upper and lower bounds are then derived on the number of states of a minimal TD for an LBC, and it is shown that a cyclic (or shortened cyclic) code is the worst in terms of state complexity among LBCs of the same length and dimension. Furthermore, it is shown that the structural complexity of a minimal TD for an LBC depends on the order of its bit positions. This fact suggests that an appropriate permutation of the bit positions of a code may result in an equivalent code with a much simpler minimal TD. Boolean polynomial representation of the codewords of an LBC is also considered; this representation helps in the study of the trellis structure of the code and is applied to construct its minimal TD. Particular emphasis is placed on the construction of minimal trellises for Reed-Muller codes and for the extended and permuted binary primitive BCH codes which contain Reed-Muller codes as subcodes. Finally, the structural complexity of minimal trellises for the extended and permuted double-error-correcting BCH codes is analyzed and presented. It is shown that these codes have a relatively simple trellis structure and hence can be decoded with the Viterbi decoding algorithm.
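
    The state-complexity upper bound alluded to is commonly stated as the Wolf bound (a standard result, quoted here in our notation): for an (n, k) binary linear block code, the number of states at any depth i of a minimal trellis satisfies

    ```latex
    \max_i \lvert \Sigma_i \rvert \;\le\; 2^{\min(k,\; n-k)}
    ```

    Cyclic (or shortened cyclic) codes attain this bound, which is the sense in which they are worst-case among codes of the same length and dimension.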

  1. Development of minimally invasive techniques for management of medically-complicated obesity

    PubMed Central

    Rashti, Farzin; Gupta, Ekta; Ebrahimi, Suzan; Shope, Timothy R; Koch, Timothy R; Gostout, Christopher J

    2014-01-01

    The field of bariatric surgery has been rapidly growing and evolving over the past several decades. During the period that obesity has become a worldwide epidemic, new interventions have been developed to combat this complex disorder. The development of new laparoscopic and minimally invasive treatments for medically-complicated obesity has made it essential that gastrointestinal physicians obtain a thorough understanding of past developments and possible future directions in bariatrics. New laparoscopic advancements provide patients and practitioners with a variety of options that have an improved safety profile and better efficacy without open, invasive surgery. The mechanisms of weight loss after bariatric surgery are complex and may in part be related to altered release of regulatory peptide hormones from the gut. Endoscopic techniques designed to mimic the effects of bariatric surgery and endolumenal interventions performed entirely through the gastrointestinal tract offer potential advantages. Several of these new techniques have demonstrated promising, preliminary results. We outline herein historical and current trends in the development of bariatric surgery and its transition to safer and more minimally invasive procedures designed to induce weight loss. PMID:25309074

  2. Energy and time determine scaling in biological and computer designs

    PubMed Central

    Bezerra, George; Edwards, Benjamin; Brown, James; Forrest, Stephanie

    2016-01-01

    Metabolic rate in animals and power consumption in computers are analogous quantities that scale similarly with size. We analyse vascular systems of mammals and on-chip networks of microprocessors, where natural selection and human engineering, respectively, have produced systems that minimize both energy dissipation and delivery times. Using a simple network model that simultaneously minimizes energy and time, our analysis explains empirically observed trends in the scaling of metabolic rate in mammals and power consumption and performance in microprocessors across several orders of magnitude in size. Just as the evolutionary transitions from unicellular to multicellular animals in biology are associated with shifts in metabolic scaling, our model suggests that the scaling of power and performance will change as computer designs transition to decentralized multi-core and distributed cyber-physical systems. More generally, a single energy–time minimization principle may govern the design of many complex systems that process energy, materials and information. This article is part of the themed issue ‘The major synthetic evolutionary transitions’. PMID:27431524

  3. Energy and time determine scaling in biological and computer designs.

    PubMed

    Moses, Melanie; Bezerra, George; Edwards, Benjamin; Brown, James; Forrest, Stephanie

    2016-08-19

    Metabolic rate in animals and power consumption in computers are analogous quantities that scale similarly with size. We analyse vascular systems of mammals and on-chip networks of microprocessors, where natural selection and human engineering, respectively, have produced systems that minimize both energy dissipation and delivery times. Using a simple network model that simultaneously minimizes energy and time, our analysis explains empirically observed trends in the scaling of metabolic rate in mammals and power consumption and performance in microprocessors across several orders of magnitude in size. Just as the evolutionary transitions from unicellular to multicellular animals in biology are associated with shifts in metabolic scaling, our model suggests that the scaling of power and performance will change as computer designs transition to decentralized multi-core and distributed cyber-physical systems. More generally, a single energy-time minimization principle may govern the design of many complex systems that process energy, materials and information. This article is part of the themed issue 'The major synthetic evolutionary transitions'. © 2016 The Author(s).

  4. A Parallel Rendering Algorithm for MIMD Architectures

    NASA Technical Reports Server (NTRS)

    Crockett, Thomas W.; Orloff, Tobias

    1991-01-01

    Applications such as animation and scientific visualization demand high performance rendering of complex three dimensional scenes. To deliver the necessary rendering rates, highly parallel hardware architectures are required. The challenge is then to design algorithms and software which effectively use the hardware parallelism. A rendering algorithm targeted to distributed memory MIMD architectures is described. For maximum performance, the algorithm exploits both object-level and pixel-level parallelism. The behavior of the algorithm is examined both analytically and experimentally. Its performance for large numbers of processors is found to be limited primarily by communication overheads. An experimental implementation for the Intel iPSC/860 shows increasing performance from 1 to 128 processors across a wide range of scene complexities. It is shown that minimal modifications to the algorithm will adapt it for use on shared memory architectures as well.

  5. Development of a parameter optimization technique for the design of automatic control systems

    NASA Technical Reports Server (NTRS)

    Whitaker, P. H.

    1977-01-01

    Parameter optimization techniques for the design of linear automatic control systems that are applicable to both continuous and digital systems are described. The model performance index is used as the optimization criterion because of the physical insight that can be attached to it. The design emphasis is to start with the simplest system configuration that experience indicates would be practical. Design parameters are specified, and a digital computer program is used to select that set of parameter values which minimizes the performance index. The resulting design is examined, and complexity, through the use of more complex information processing or more feedback paths, is added only if performance fails to meet operational specifications. System performance specifications are assumed to be such that the desired step function time response of the system can be inferred.
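
    The optimization loop the abstract describes is easy to reproduce in miniature. The sketch below is illustrative only: it assumes a first-order plant dy/dt = -y + u, a PI controller, and an integral-squared-error index against a desired first-order model response, none of which come from the paper, and lets a numeric optimizer pick the gains.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    dt = 0.01
    t = np.arange(0.0, 5.0, dt)
    y_model = 1.0 - np.exp(-2.0 * t)          # desired step response (assumed model)

    def closed_loop_step(params):
        """Simulate a PI controller on the assumed plant dy/dt = -y + u."""
        kp, ki = params
        y, integ, out = 0.0, 0.0, []
        for _ in t:
            e = 1.0 - y
            integ += e * dt
            y += dt * (-y + kp * e + ki * integ)   # forward-Euler step
            out.append(y)
        return np.array(out)

    def performance_index(params):
        """Model performance index: integral squared error vs. the model response."""
        return dt * np.sum((closed_loop_step(params) - y_model) ** 2)

    res = minimize(performance_index, x0=[1.0, 1.0], method="Nelder-Mead")
    print(res.x, res.fun)   # gains minimizing the index for this simple configuration
    ```

    The design philosophy in the abstract applies directly: only if this simplest configuration fails the specifications would one add feedback paths or processing and re-run the optimization.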

  6. Distributed query plan generation using multiobjective genetic algorithm.

    PubMed

    Panicker, Shina; Kumar, T V Vijay

    2014-01-01

    A distributed query processing strategy, which is a key performance determinant in accessing distributed databases, aims to minimize the total query processing cost. One way to achieve this is by generating efficient distributed query plans that involve fewer sites for processing a query. In the case of distributed relational databases, the number of possible query plans increases exponentially with respect to the number of relations accessed by the query and the number of sites where these relations reside. Consequently, computing optimal distributed query plans becomes a complex problem. This distributed query plan generation (DQPG) problem has already been addressed using single objective genetic algorithm, where the objective is to minimize the total query processing cost comprising the local processing cost (LPC) and the site-to-site communication cost (CC). In this paper, this DQPG problem is formulated and solved as a biobjective optimization problem with the two objectives being minimize total LPC and minimize total CC. These objectives are simultaneously optimized using a multiobjective genetic algorithm NSGA-II. Experimental comparison of the proposed NSGA-II based DQPG algorithm with the single objective genetic algorithm shows that the former performs comparatively better and converges quickly towards optimal solutions for an observed crossover and mutation probability.
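
    The biobjective formulation is straightforward to prototype. The sketch below builds random query plans (a plan assigns each relation to a site), scores them on invented LPC and CC matrices, and extracts the non-dominated (Pareto) set; a real implementation would evolve the population with NSGA-II's non-dominated sorting and crowding distance rather than random sampling, and the "ship to busiest site" communication model is purely an illustrative assumption.

    ```python
    import random

    random.seed(0)
    N_REL, N_SITES = 5, 4
    LPC = [[random.randint(1, 9) for _ in range(N_SITES)] for _ in range(N_REL)]
    CC = [[0 if i == j else random.randint(1, 9) for j in range(N_SITES)]
          for i in range(N_SITES)]

    def objectives(plan):
        """plan[r] = site processing relation r; returns (total LPC, total CC)."""
        total_lpc = sum(LPC[r][s] for r, s in enumerate(plan))
        assembly = max(set(plan), key=plan.count)   # assemble at the busiest site
        total_cc = sum(CC[s][assembly] for s in plan)
        return total_lpc, total_cc

    def dominates(a, b):
        """Pareto dominance: a is no worse in both objectives and better in one."""
        return all(x <= y for x, y in zip(a, b)) and a != b

    pop = {tuple(random.randrange(N_SITES) for _ in range(N_REL)) for _ in range(500)}
    scored = {p: objectives(p) for p in pop}
    front = [p for p in pop if not any(dominates(scored[q], scored[p]) for q in pop)]
    print(sorted({scored[p] for p in front}))        # Pareto-optimal (LPC, CC) pairs
    ```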

  7. Distributed Query Plan Generation Using Multiobjective Genetic Algorithm

    PubMed Central

    Panicker, Shina; Vijay Kumar, T. V.

    2014-01-01

    A distributed query processing strategy, which is a key performance determinant in accessing distributed databases, aims to minimize the total query processing cost. One way to achieve this is by generating efficient distributed query plans that involve fewer sites for processing a query. In the case of distributed relational databases, the number of possible query plans increases exponentially with respect to the number of relations accessed by the query and the number of sites where these relations reside. Consequently, computing optimal distributed query plans becomes a complex problem. This distributed query plan generation (DQPG) problem has already been addressed using single objective genetic algorithm, where the objective is to minimize the total query processing cost comprising the local processing cost (LPC) and the site-to-site communication cost (CC). In this paper, this DQPG problem is formulated and solved as a biobjective optimization problem with the two objectives being minimize total LPC and minimize total CC. These objectives are simultaneously optimized using a multiobjective genetic algorithm NSGA-II. Experimental comparison of the proposed NSGA-II based DQPG algorithm with the single objective genetic algorithm shows that the former performs comparatively better and converges quickly towards optimal solutions for an observed crossover and mutation probability. PMID:24963513

  8. Shape Sensing Techniques for Continuum Robots in Minimally Invasive Surgery: A Survey.

    PubMed

    Shi, Chaoyang; Luo, Xiongbiao; Qi, Peng; Li, Tianliang; Song, Shuang; Najdovski, Zoran; Fukuda, Toshio; Ren, Hongliang

    2017-08-01

    Continuum robots provide inherent structural compliance with high dexterity to access the surgical target sites along tortuous anatomical paths under constrained environments and enable to perform complex and delicate operations through small incisions in minimally invasive surgery. These advantages enable their broad applications with minimal trauma and make challenging clinical procedures possible with miniaturized instrumentation and high curvilinear access capabilities. However, their inherent deformable designs make it difficult to realize 3-D intraoperative real-time shape sensing to accurately model their shape. Solutions to this limitation can lead themselves to further develop closely associated techniques of closed-loop control, path planning, human-robot interaction, and surgical manipulation safety concerns in minimally invasive surgery. Although extensive model-based research that relies on kinematics and mechanics has been performed, accurate shape sensing of continuum robots remains challenging, particularly in cases of unknown and dynamic payloads. This survey investigates the recent advances in alternative emerging techniques for 3-D shape sensing in this field and focuses on the following categories: fiber-optic-sensor-based, electromagnetic-tracking-based, and intraoperative imaging modality-based shape-reconstruction methods. The limitations of existing technologies and prospects of new technologies are also discussed.

  9. Minimally invasive surgery. Future developments.

    PubMed Central

    Wickham, J. E.

    1994-01-01

    The rapid development of minimally invasive surgery means that there will be fundamental changes in interventional treatment. Technological advances will allow new minimally invasive procedures to be developed. Application of robotics will allow some procedures to be done automatically, and coupling of slave robotic instruments with virtual reality images will allow surgeons to perform operations by remote control. Miniature motors and instruments designed by microengineering could be introduced into body cavities to perform operations that are currently impossible. New materials will allow changes in instrument construction, such as use of memory metals to make heat activated scissors or forceps. With the reduced trauma associated with minimally invasive surgery, fewer operations will require long hospital stays. Traditional surgical wards will become largely redundant, and hospitals will need to cope with increased through-put of patients. Operating theatres will have to be equipped with complex high technology equipment, and hospital staff will need to be trained to manage it. Conventional nursing care will be carried out more in the community. Many traditional specialties will be merged, and surgical training will need fundamental revision to ensure that surgeons are competent to carry out the new procedures. Images Fig 1 Fig 2 Fig 3 Fig 4 Fig 5 PMID:8312776

  10. Minimizing makespan in a two-stage flow shop with parallel batch-processing machines and re-entrant jobs

    NASA Astrophysics Data System (ADS)

    Huang, J. D.; Liu, J. J.; Chen, Q. X.; Mao, N.

    2017-06-01

    Against a background of heat-treatment operations in mould manufacturing, a two-stage flow-shop scheduling problem is described for minimizing makespan with parallel batch-processing machines and re-entrant jobs. The weights and release dates of jobs are non-identical, but job processing times are equal. A mixed-integer linear programming model is developed and tested with small-scale scenarios. Given that the problem is NP-hard, three heuristic construction methods with polynomial complexity are proposed. The worst case of the new constructive heuristic is analysed in detail. A method for computing lower bounds is proposed to test heuristic performance. Heuristic efficiency is tested with sets of scenarios. Compared with the two improved heuristics, the performance of the new constructive heuristic is superior.
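
    To convey the structure of the problem, the toy evaluator below computes the makespan of a greedy batching of equal-length jobs through two stages of parallel batch machines. It is a heavily simplified sketch: the re-entrant aspect, job weights, and the paper's actual heuristics are omitted, and all capacities, release dates, and processing times are invented.

    ```python
    import heapq

    def makespan(release, cap, p1, p2, m1, m2):
        """Two-stage flow shop with parallel batch machines: equal-time jobs are
        batched greedily in release-date order (toy heuristic, not the paper's)."""
        jobs = sorted(release)
        batches = [jobs[i:i + cap] for i in range(0, len(jobs), cap)]
        stage1, stage2 = [0.0] * m1, [0.0] * m2     # earliest-free time per machine
        heapq.heapify(stage1)
        heapq.heapify(stage2)
        cmax = 0.0
        for batch in batches:
            start1 = max(batch[-1], heapq.heappop(stage1))  # wait for last release
            heapq.heappush(stage1, start1 + p1)
            start2 = max(start1 + p1, heapq.heappop(stage2))
            heapq.heappush(stage2, start2 + p2)
            cmax = max(cmax, start2 + p2)
        return cmax

    print(makespan([0, 0, 1, 2, 3, 5, 8, 9], cap=3, p1=4.0, p2=2.0, m1=2, m2=1))
    ```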

  11. Optimized Temporal Monitors for SystemC

    NASA Technical Reports Server (NTRS)

    Tabakov, Deian; Rozier, Kristin Y.; Vardi, Moshe Y.

    2012-01-01

    SystemC is a modeling language built as an extension of C++. Its growing popularity and the increasing complexity of designs have motivated research efforts aimed at the verification of SystemC models using assertion-based verification (ABV), where the designer asserts properties that capture the design intent in a formal language such as PSL or SVA. The model then can be verified against the properties using runtime or formal verification techniques. In this paper we focus on automated generation of runtime monitors from temporal properties. Our focus is on minimizing runtime overhead, rather than monitor size or monitor-generation time. We identify four issues in monitor generation: state minimization, alphabet representation, alphabet minimization, and monitor encoding. We conduct extensive experimentation and identify a combination of settings that offers the best performance in terms of runtime overhead.
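
    For a flavor of what a low-overhead runtime monitor looks like, here is a hand-written Python analogue for the bounded-response property "every req is followed by grant within N cycles". This is only a sketch of the constant-work-per-cycle goal; the tool described above generates monitors woven into SystemC/C++, and the property, trace, and same-cycle-grant semantics here are illustrative choices.

    ```python
    class BoundedResponseMonitor:
        """Checks G(req -> F[0,N] grant) over a cycle-by-cycle trace.
        One comparison and one counter per cycle, so overhead stays O(1)."""

        def __init__(self, bound):
            self.bound = bound
            self.deadline = None        # cycle by which a grant must arrive
            self.cycle = 0
            self.failed = False

        def step(self, req, grant):
            if req and self.deadline is None:
                self.deadline = self.cycle + self.bound
            if grant:                   # one grant serves all pending requests
                self.deadline = None
            if self.deadline is not None and self.cycle >= self.deadline:
                self.failed = True
            self.cycle += 1
            return not self.failed

    mon = BoundedResponseMonitor(bound=3)
    trace = [(1, 0), (0, 0), (0, 1), (1, 0), (0, 0), (0, 0), (0, 0)]  # (req, grant)
    print([mon.step(r, g) for r, g in trace])   # the second req misses its deadline
    ```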

  12. Sensitivity based coupling strengths in complex engineering systems

    NASA Technical Reports Server (NTRS)

    Bloebaum, C. L.; Sobieszczanski-Sobieski, J.

    1993-01-01

    The iterative design scheme necessary for complex engineering systems is generally time consuming and difficult to implement. Although a decomposition approach results in a more tractable problem, the inherent couplings make establishing the interdependencies of the various subsystems difficult. Another difficulty lies in identifying the most efficient order of execution for the subsystem analyses. The paper describes an approach for determining the dependencies that could be suspended during the system analysis with minimal accuracy losses, thereby reducing the system complexity. A new multidisciplinary testbed is presented, involving the interaction of structures, aerodynamics, and performance disciplines. Results are presented to demonstrate the effectiveness of the system reduction scheme.

  13. Hybrid acousto-optic and digital equalization for microwave digital radio channels

    NASA Astrophysics Data System (ADS)

    Anderson, C. S.; Vanderlugt, A.

    1990-11-01

    Digital radio transmission systems use complex modulation schemes that require powerful signal-processing techniques to correct channel distortions and to minimize BERs. This paper proposes combining the computation power of acoustooptic processing and the accuracy of digital processing to produce a hybrid channel equalizer that exceeds the performance of digital equalization alone. Analysis shows that a hybrid equalizer for 256-level quadrature amplitude modulation (QAM) performs better than a digital equalizer for 64-level QAM.

  14. Systems Biology Perspectives on Minimal and Simpler Cells

    PubMed Central

    Xavier, Joana C.; Patil, Kiran Raosaheb

    2014-01-01

    The concept of the minimal cell has fascinated scientists for a long time, from both fundamental and applied points of view. This broad concept encompasses extreme reductions of genomes, the last universal common ancestor (LUCA), the creation of semiartificial cells, and the design of protocells and chassis cells. Here we review these different areas of research and identify common and complementary aspects of each one. We focus on systems biology, a discipline that is greatly facilitating the classical top-down and bottom-up approaches toward minimal cells. In addition, we also review the so-called middle-out approach and its contributions to the field with mathematical and computational models. Owing to the advances in genomics technologies, much of the work in this area has been centered on minimal genomes, or rather minimal gene sets, required to sustain life. Nevertheless, a fundamental expansion has been taking place in the last few years wherein the minimal gene set is viewed as a backbone of a more complex system. Complementing genomics, progress is being made in understanding the system-wide properties at the levels of the transcriptome, proteome, and metabolome. Network modeling approaches are enabling the integration of these different omics data sets toward an understanding of the complex molecular pathways connecting genotype to phenotype. We review key concepts central to the mapping and modeling of this complexity, which is at the heart of research on minimal cells. Finally, we discuss the distinction between minimizing the number of cellular components and minimizing cellular complexity, toward an improved understanding and utilization of minimal and simpler cells. PMID:25184563

  15. Randomization in clinical trials: stratification or minimization? The HERMES free simulation software.

    PubMed

    Fron Chabouis, Hélène; Chabouis, Francis; Gillaizeau, Florence; Durieux, Pierre; Chatellier, Gilles; Ruse, N Dorin; Attal, Jean-Pierre

    2014-01-01

    Operative clinical trials are often small and open-label. Randomization is therefore very important. Stratification and minimization are two randomization options in such trials. The first aim of this study was to compare stratification and minimization in terms of predictability and balance in order to help investigators choose the most appropriate allocation method. Our second aim was to evaluate the influence of various parameters on the performance of these techniques. The created software generated patients according to chosen trial parameters (e.g., number of important prognostic factors, number of operators or centers, etc.) and computed predictability and balance indicators for several stratification and minimization methods over a given number of simulations. Block size and proportion of random allocations could be chosen. A reference trial was chosen (50 patients, 1 prognostic factor, and 2 operators) and eight other trials derived from this reference trial were modeled. Predictability and balance indicators were calculated from 10,000 simulations per trial. Minimization performed better with complex trials (e.g., smaller sample size, increasing number of prognostic factors, and operators); stratification imbalance increased when the number of strata increased. An inverse correlation between imbalance and predictability was observed. A compromise between predictability and imbalance still has to be found by the investigator but our software (HERMES) gives concrete reasons for choosing between stratification and minimization; it can be downloaded free of charge. This software will help investigators choose the appropriate randomization method in future two-arm trials.
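
    For readers unfamiliar with covariate-adaptive randomization, a minimal Pocock-Simon-style minimization step is sketched below. This is a generic sketch, not HERMES itself: the arm names, the 0.8 biased coin, the range imbalance measure, and the two-factor simulation are all illustrative assumptions.

    ```python
    import random

    def assign_minimization(history, patient, arms=("A", "B"), p_best=0.8):
        """history: list of (factor_levels, arm); patient: tuple of factor levels.
        Pick the arm minimizing summed marginal imbalance, via a biased coin."""
        def imbalance_if(arm):
            total = 0
            for f, level in enumerate(patient):
                counts = dict.fromkeys(arms, 0)
                for levels, a in history:
                    if levels[f] == level:
                        counts[a] += 1
                counts[arm] += 1                     # hypothetical assignment
                total += max(counts.values()) - min(counts.values())
            return total
        best = min(arms, key=imbalance_if)
        others = [a for a in arms if a != best]
        return best if random.random() < p_best else random.choice(others)

    # reference-trial-like simulation: 50 patients, 1 prognostic factor, 2 operators
    random.seed(1)
    history = []
    for _ in range(50):
        patient = (random.randint(0, 1), random.randint(0, 1))  # (factor, operator)
        history.append((patient, assign_minimization(history, patient)))
    print([sum(1 for _, a in history if a == arm) for arm in ("A", "B")])
    ```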

  16. Hemodynamic Performance and Thrombogenic Properties of a Superhydrophobic Bileaflet Mechanical Heart Valve

    PubMed Central

    Bark, David L.; Vahabi, Hamed; Bui, Hieu; Movafaghi, Sanli; Moore, Brandon; Kota, Arun K.; Popat, Ketul; Dasi, Lakshmi P.

    2016-01-01

    In this study, we explore how blood-material interactions and hemodynamics are impacted by rendering a clinical quality 25 mm St. Jude Medical Bileaflet mechanical heart valve (BMHV) superhydrophobic (SH), with the aim of reducing the thrombo-embolic complications associated with BMHVs. Basic cell adhesion is evaluated to assess blood-material interactions, while hemodynamic performance is analyzed with and without the SH coating. Results show that a SH coating with a receding contact angle (CA) of 160° strikingly eliminates platelet and leukocyte adhesion to the surface. Alternatively, many platelets attach to and activate on pyrolytic carbon (receding CA = 47°), the base material for BMHVs. We further show that the performance index increases by 2.5% for the coated valve relative to an uncoated valve, with a maximum possible improved performance of 5%. Both valves exhibit instantaneous shear stress below 10 N/m² and Reynolds shear stress below 100 N/m². Therefore, a SH BMHV has the potential to relax the requirement for antiplatelet and anticoagulant drug regimens typically required for patients receiving MHVs by minimizing blood-material interactions, while having a minimal impact on hemodynamics. We show for the first time that SH-coated surfaces may be a promising direction to minimize thrombotic complications in complex devices such as heart valves. PMID:27098219

  17. A perverse quality incentive in surgery: implications of reimbursing surgeons less for doing laparoscopic surgery.

    PubMed

    Fader, Amanda N; Xu, Tim; Dunkin, Brian J; Makary, Martin A

    2016-11-01

    Surgery is one of the highest priced services in health care, and complications from surgery can be serious and costly. Recently, advances in surgical techniques have allowed surgeons to perform many common operations using minimally invasive methods that result in fewer complications. Despite this, the rates of open surgery remain high across multiple surgical disciplines. This is an expert commentary and review of the contemporary literature regarding minimally invasive surgery practices nationwide, the benefits of less invasive approaches, and how minimally invasive compared with open procedures are differentially reimbursed in the United States. We explore the incentive of the current surgeon reimbursement fee schedule and its potential implications. A surgeon's preference to perform minimally invasive compared with open surgery remains highly variable in the U.S., even after adjustment for patient comorbidities and surgical complexity. Nationwide administrative claims data across several surgical disciplines demonstrates that minimally invasive surgery utilization in place of open surgery is associated with reduced adverse events and cost savings. Reducing surgical complications by increasing adoption of minimally invasive operations has significant cost implications for health care. However, current U.S. payment structures may perversely incentivize open surgery and financially reward physicians who do not necessarily embrace newer or best minimally invasive surgery practices. Utilization of minimally invasive surgery varies considerably in the U.S., representing one of the greatest disparities in health care. Existing physician payment models must translate the growing body of research in surgical care into physician-level rewards for quality, including choice of operation. Promoting safe surgery should be an important component of a strong, value-based healthcare system. Resolving the potentially perverse incentives in paying for surgical approaches may help address disparities in surgical care, reduce the prevalent problem of variation, and help contain health care costs.

  18. Reinforcement-learning-based dual-control methodology for complex nonlinear discrete-time systems with application to spark engine EGR operation.

    PubMed

    Shih, Peter; Kaul, Brian C; Jagannathan, S; Drallmeier, James A

    2008-08-01

    A novel reinforcement-learning-based dual-control methodology adaptive neural network (NN) controller is developed to deliver a desired tracking performance for a class of complex feedback nonlinear discrete-time systems, which consists of a second-order nonlinear discrete-time system in nonstrict feedback form and an affine nonlinear discrete-time system, in the presence of bounded and unknown disturbances. For example, the exhaust gas recirculation (EGR) operation of a spark ignition (SI) engine is modeled by using such a complex nonlinear discrete-time system. A dual-controller approach is undertaken where primary adaptive critic NN controller is designed for the nonstrict feedback nonlinear discrete-time system whereas the secondary one for the affine nonlinear discrete-time system but the controllers together offer the desired performance. The primary adaptive critic NN controller includes an NN observer for estimating the states and output, an NN critic, and two action NNs for generating virtual control and actual control inputs for the nonstrict feedback nonlinear discrete-time system, whereas an additional critic NN and an action NN are included for the affine nonlinear discrete-time system by assuming the state availability. All NN weights adapt online towards minimization of a certain performance index, utilizing gradient-descent-based rule. Using Lyapunov theory, the uniformly ultimate boundedness (UUB) of the closed-loop tracking error, weight estimates, and observer estimates are shown. The adaptive critic NN controller performance is evaluated on an SI engine operating with high EGR levels where the controller objective is to reduce cyclic dispersion in heat release while minimizing fuel intake. Simulation and experimental results indicate that engine out emissions drop significantly at 20% EGR due to reduction in dispersion in heat release thus verifying the dual-control approach.

  19. Complexity Management Using Metrics for Trajectory Flexibility Preservation and Constraint Minimization

    NASA Technical Reports Server (NTRS)

    Idris, Husni; Shen, Ni; Wing, David J.

    2011-01-01

    The growing demand for air travel is increasing the need for mitigating air traffic congestion and complexity problems, which are already at high levels. At the same time new surveillance, navigation, and communication technologies are enabling major transformations in the air traffic management system, including net-based information sharing and collaboration, performance-based access to airspace resources, and trajectory-based rather than clearance-based operations. The new system will feature different schemes for allocating tasks and responsibilities between the ground and airborne agents and between the human and automation, with potential capacity and cost benefits. Therefore, complexity management requires new metrics and methods that can support these new schemes. This paper presents metrics and methods for preserving trajectory flexibility that have been proposed to support a trajectory-based approach for complexity management by airborne or ground-based systems. It presents extensions to these metrics as well as to the initial research conducted to investigate the hypothesis that using these metrics to guide user and service provider actions will naturally mitigate traffic complexity. The analysis showed promising results in that: (1) Trajectory flexibility preservation mitigated traffic complexity as indicated by inducing self-organization in the traffic patterns and lowering traffic complexity indicators such as dynamic density and traffic entropy. (2) Trajectory flexibility preservation reduced the potential for secondary conflicts in separation assurance. (3) Trajectory flexibility metrics showed potential application to support user and service provider negotiations for minimizing the constraints imposed on trajectories without jeopardizing their objectives.

  20. Near DC force measurement using PVDF sensors

    NASA Astrophysics Data System (ADS)

    Ramanathan, Arun Kumar; Headings, Leon M.; Dapino, Marcelo J.

    2018-03-01

    There is a need for high-performance force sensors capable of operating at frequencies near DC while producing a minimal mass penalty. Example application areas include steering wheel sensors, powertrain torque sensors, robotic arms, and minimally invasive surgery. Beta crystallographic phase polyvinylidene fluoride (PVDF) films are suitable for this purpose owing to their large piezoelectric constant. Unlike conventional capacitive sensors, beta crystallographic phase PVDF films exhibit a broad linear range and can potentially be designed to operate without complex electronics or signal processing. A fundamental challenge that prevents the implementation of PVDF in certain high-performance applications is its inability to measure static signals, which results from its first-order electrical impedance. Charge readout algorithms have been implemented which address this issue only partially, as they often require integration of the output signal to obtain the applied force profile, resulting in signal drift and signal processing complexities. In this paper, we propose a straightforward real-time drift compensation strategy that is applicable to high output impedance PVDF films. This strategy makes it possible to utilize long sample times with a minimal loss of accuracy; our measurements show that the static output remains within 5% of the original value during half-hour measurements. The sensitivity and full-scale range are shown to be determined by the feedback capacitance of the charge amplifier. A linear model of the PVDF sensor system is developed and validated against experimental measurements, along with benchmark tests against a commercial load cell.

  1. Design optimization of transmitting antennas for weakly coupled magnetic induction communication systems

    PubMed Central

    2017-01-01

    This work focuses on the design of transmitting coils in weakly coupled magnetic induction communication systems. We propose several optimization methods that reduce the active, reactive and apparent power consumption of the coil. These problems are formulated as minimization problems, in which the power consumed by the transmitting coil is minimized, under the constraint of providing a required magnetic field at the receiver location. We develop efficient numeric and analytic methods to solve the resulting problems, which are of high dimension, and in certain cases non-convex. For the objective of minimal reactive power an analytic solution for the optimal current distribution in flat disc transmitting coils is provided. This problem is extended to general three-dimensional coils, for which we develop an expression for the optimal current distribution. Considering the objective of minimal apparent power, a method is developed to reduce the computational complexity of the problem by transforming it to an equivalent problem of lower dimension, allowing a quick and accurate numeric solution. These results are verified experimentally by testing a number of coil geometries. The results obtained allow reduced power consumption and increased performances in magnetic induction communication systems. Specifically, for wideband systems, an optimal design of the transmitter coil reduces the peak instantaneous power provided by the transmitter circuitry, and thus reduces its size, complexity and cost. PMID:28192463
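
    The minimal-active-power case has a closed form that makes a good sanity check for the numeric route. The sketch below assumes the field at the receiver is linear in the loop currents (B = a·x) and active power is the sum of R_i x_i²; minimizing power subject to a·x = B_req gives x_i proportional to a_i / R_i. All coil data are invented; this is not the paper's geometry.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(1)
    n = 10
    R = rng.uniform(0.5, 2.0, n)     # per-loop resistance (invented)
    a = rng.uniform(0.1, 1.0, n)     # field at receiver per unit loop current (invented)
    B_req = 1.0

    power = lambda x: float(np.sum(R * x**2))        # active power objective
    res = minimize(power, np.ones(n), method="SLSQP",
                   constraints=[{"type": "eq", "fun": lambda x: a @ x - B_req}])

    lam = 2.0 * B_req / np.sum(a**2 / R)             # Lagrange multiplier, closed form
    x_star = lam * a / (2.0 * R)                     # optimal current distribution
    print(res.fun, power(x_star))                    # the two should agree closely
    ```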

  2. The dynamics of discrete-time computation, with application to recurrent neural networks and finite state machine extraction.

    PubMed

    Casey, M

    1996-08-15

    Recurrent neural networks (RNNs) can learn to perform finite state computations. It is shown that an RNN performing a finite state computation must organize its state space to mimic the states in the minimal deterministic finite state machine that can perform that computation, and a precise description of the attractor structure of such systems is given. This knowledge effectively predicts activation space dynamics, which allows one to understand RNN computation dynamics in spite of complexity in activation dynamics. This theory provides a theoretical framework for understanding finite state machine (FSM) extraction techniques and can be used to improve training methods for RNNs performing FSM computations. This provides an example of a successful approach to understanding a general class of complex systems that has not been explicitly designed, e.g., systems that have evolved or learned their internal structure.

  3. RMP: Reduced-set matching pursuit approach for efficient compressed sensing signal reconstruction.

    PubMed

    Abdel-Sayed, Michael M; Khattab, Ahmed; Abu-Elyazeed, Mohamed F

    2016-11-01

    Compressed sensing enables the acquisition of sparse signals at a rate that is much lower than the Nyquist rate. Compressed sensing initially adopted ℓ1 minimization for signal reconstruction, which is computationally expensive. Several greedy recovery algorithms have recently been proposed for signal reconstruction at a lower computational complexity compared to the optimal ℓ1 minimization, while maintaining good reconstruction accuracy. In this paper, the Reduced-set Matching Pursuit (RMP) greedy recovery algorithm is proposed for compressed sensing. Unlike existing approaches, which either select too many or too few values per iteration, RMP aims at selecting the most sufficient number of correlation values per iteration, which improves both the reconstruction time and error. Furthermore, RMP prunes the estimated signal, and hence excludes the incorrectly selected values. The RMP algorithm achieves higher reconstruction accuracy at significantly lower computational complexity than existing greedy recovery algorithms. It is even superior to ℓ1 minimization in terms of the normalized time-error product, a new metric introduced to measure the trade-off between the reconstruction time and error. RMP's superior performance is illustrated with both noiseless and noisy samples.
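
    RMP's reduced-set selection rule and pruning step are the paper's contribution; as a baseline from the same greedy family, plain orthogonal matching pursuit fits in a few lines. The sizes and test signal below are arbitrary.

    ```python
    import numpy as np

    def omp(A, y, k):
        """Orthogonal Matching Pursuit: greedy recovery of a k-sparse x from y = Ax."""
        residual, support = y.astype(float), []
        for _ in range(k):
            j = int(np.argmax(np.abs(A.T @ residual)))   # best-correlated atom
            if j not in support:
                support.append(j)
            coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
            residual = y - A[:, support] @ coef          # re-project, update residual
        x = np.zeros(A.shape[1])
        x[support] = coef
        return x

    rng = np.random.default_rng(0)
    m, n, k = 40, 100, 5
    A = rng.standard_normal((m, n))
    A /= np.linalg.norm(A, axis=0)                       # unit-norm columns
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    x_hat = omp(A, A @ x_true, k)
    print(np.linalg.norm(x_hat - x_true))                # near zero with high probability
    ```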

  4. Development of a Paradigm to Assess Nutritive and Biochemical Substances in Humans: A Preliminary Report on the Effects of Tyrosine upon Altitude- and Cold-Induced Stress Responses

    DTIC Science & Technology

    1987-03-01

    3/4 hours. Performance tests evaluated simple and choice reaction time to visual stimuli, vigilance, and processing of symbolic, numerical, verbal... minimize the adverse consequences of these stressors. Tyrosine enhanced performance (e.g., complex information processing, vigilance, and reaction time)... processes inherent in many real-world tasks. For example, Map Compass requires association of direction and degree

  5. Charliecloud: Unprivileged containers for user-defined software stacks in HPC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Priedhorsky, Reid; Randles, Timothy C.

    Supercomputing centers are seeing increasing demand for user-defined software stacks (UDSS), instead of or in addition to the stack provided by the center. These UDSS support user needs such as complex dependencies or build requirements, externally required configurations, portability, and consistency. The challenge for centers is to provide these services in a usable manner while minimizing the risks: security, support burden, missing functionality, and performance. We present Charliecloud, which uses the Linux user and mount namespaces to run industry-standard Docker containers with no privileged operations or daemons on center resources. Our simple approach avoids most security risks while maintaining access to the performance and functionality already on offer, doing so in less than 500 lines of code. Charliecloud promises to bring an industry-standard UDSS user workflow to existing, minimally altered HPC resources.

  6. Time estimation as a secondary task to measure workload: Summary of research

    NASA Technical Reports Server (NTRS)

    Hart, S. G.; Mcpherson, D.; Loomis, L. L.

    1978-01-01

    Actively produced intervals of time were found to increase in length and variability, whereas retrospectively produced intervals decreased in length although they also increased in variability with the addition of a variety of flight-related tasks. If pilots counted aloud while making a production, however, the impact of concurrent activity was minimized, at least for the moderately demanding primary tasks that were selected. The effects of feedback on estimation accuracy and consistency were greatly enhanced if a counting or tapping production technique was used. This compares with the minimal effect that feedback had when no overt timekeeping technique was used. Actively made verbal estimates of sessions filled with different activities increased in length. Retrospectively made verbal estimates, however, increased in length as the amount and complexity of activities performed during the interval increased.

  7. Systems biology perspectives on minimal and simpler cells.

    PubMed

    Xavier, Joana C; Patil, Kiran Raosaheb; Rocha, Isabel

    2014-09-01

    The concept of the minimal cell has fascinated scientists for a long time, from both fundamental and applied points of view. This broad concept encompasses extreme reductions of genomes, the last universal common ancestor (LUCA), the creation of semiartificial cells, and the design of protocells and chassis cells. Here we review these different areas of research and identify common and complementary aspects of each one. We focus on systems biology, a discipline that is greatly facilitating the classical top-down and bottom-up approaches toward minimal cells. In addition, we also review the so-called middle-out approach and its contributions to the field with mathematical and computational models. Owing to the advances in genomics technologies, much of the work in this area has been centered on minimal genomes, or rather minimal gene sets, required to sustain life. Nevertheless, a fundamental expansion has been taking place in the last few years wherein the minimal gene set is viewed as a backbone of a more complex system. Complementing genomics, progress is being made in understanding the system-wide properties at the levels of the transcriptome, proteome, and metabolome. Network modeling approaches are enabling the integration of these different omics data sets toward an understanding of the complex molecular pathways connecting genotype to phenotype. We review key concepts central to the mapping and modeling of this complexity, which is at the heart of research on minimal cells. Finally, we discuss the distinction between minimizing the number of cellular components and minimizing cellular complexity, toward an improved understanding and utilization of minimal and simpler cells. Copyright © 2014, American Society for Microbiology. All Rights Reserved.

  8. Determining biosonar images using sparse representations.

    PubMed

    Fontaine, Bertrand; Peremans, Herbert

    2009-05-01

    Echolocating bats are thought to be able to create an image of their environment by emitting pulses and analyzing the reflected echoes. In this paper, the theory of sparse representations and its more recent further development into compressed sensing are applied to this biosonar image formation task. Considering the target image representation as sparse allows formulation of this inverse problem as a convex optimization problem for which well defined and efficient solution methods have been established. The resulting technique, referred to as L1-minimization, is applied to simulated data to analyze its performance relative to delay accuracy and delay resolution experiments. This method performs comparably to the coherent receiver for the delay accuracy experiments, is quite robust to noise, and can reconstruct complex target impulse responses as generated by many closely spaced reflectors with different reflection strengths. This same technique, in addition to reconstructing biosonar target images, can be used to simultaneously localize these complex targets by interpreting location cues induced by the bat's head related transfer function. Finally, a tentative explanation is proposed for specific bat behavioral experiments in terms of the properties of target images as reconstructed by the L1-minimization method.
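
    A compact way to experiment with this formulation is iterative soft-thresholding (ISTA) for the LASSO form of the ℓ1 problem. The sketch below recovers a sparse "impulse response" containing two closely spaced reflectors from noisy linear measurements; the sizes, noise level, and regularization weight are arbitrary, and the paper's own solver and echo model are not reproduced here.

    ```python
    import numpy as np

    def ista(A, y, lam, iters=500):
        """Soft-thresholding iterations for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
        step = 1.0 / np.linalg.norm(A, 2) ** 2          # 1 / Lipschitz constant
        x = np.zeros(A.shape[1])
        for _ in range(iters):
            z = x - step * (A.T @ (A @ x - y))          # gradient step on data term
            x = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)   # shrinkage
        return x

    rng = np.random.default_rng(2)
    A = rng.standard_normal((60, 200))
    x_true = np.zeros(200)
    x_true[[10, 50, 51, 120]] = [1.0, -0.6, 0.8, 0.4]   # two reflectors nearly coincide
    y = A @ x_true + 0.01 * rng.standard_normal(60)
    x_hat = ista(A, y, lam=0.05)
    print(np.flatnonzero(np.abs(x_hat) > 0.1))          # should find the reflectors
    ```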

  9. Reference Proteome Extracts for Mass Spec Instrument Performance Validation and Method Development

    PubMed Central

    Rosenblatt, Mike; Urh, Marjeta; Saveliev, Sergei

    2014-01-01

    Biological samples of high complexity are required to test protein mass spec sample preparation procedures and validate mass spec instrument performance. Total cell protein extracts provide the needed sample complexity. However, to be compatible with mass spec applications, such extracts should meet a number of design requirements: compatibility with LC/MS (free of detergents, etc.); high protein integrity (minimal level of protein degradation and non-biological PTMs); compatibility with common sample preparation methods such as proteolysis, PTM enrichment, and mass-tag labeling; and lot-to-lot reproducibility. Here we describe total protein extracts from yeast and human cells that meet the above criteria. Two extract formats have been developed: intact protein extracts, with primary use for sample preparation method development and optimization, and pre-digested extracts (peptides), with primary use for instrument validation and performance monitoring.

  10. Laparoscopic revolution in bariatric surgery

    PubMed Central

    Sundbom, Magnus

    2014-01-01

    The history of bariatric surgery is investigational. Dedicated surgeons have continuously sought an ideal procedure to relieve morbidly obese patients from their burden of comorbid conditions, reduced life expectancy and low quality of life. The ideal procedure must have low complication risk, both short- and long-term, as well as minimal impact on daily life. The revolution of laparoscopic techniques in bariatric surgery is described in this summary. Advances in minimally invasive techniques have contributed to reduced operative time, length of stay, and complications. The development of bariatric surgery has been exceptional, resulting in a dramatic increase in the number of procedures performed worldwide during the last decades. Although a complex bariatric procedure can be performed with operative mortality no greater than that of cholecystectomy, specific procedure-related complications and other drawbacks must be taken into account. The evolution of laparoscopy will be the legacy of the 21st century, and at present day-care surgery and further reduction of the operative trauma are in focus. The impressive effects on comorbid conditions have prompted the adoption of minimally invasive bariatric procedures into the field of metabolic surgery. PMID:25386062

  11. Efficient Deterministic Finite Automata Minimization Based on Backward Depth Information.

    PubMed

    Liu, Desheng; Huang, Zhiping; Zhang, Yimeng; Guo, Xiaojun; Su, Shaojing

    2016-01-01

    Obtaining a minimal automaton is a fundamental issue in the theory and practical implementation of deterministic finite automatons (DFAs). A minimization algorithm is presented in this paper that consists of two main phases. In the first phase, the backward depth information is built, and the state set of the DFA is partitioned into many blocks. In the second phase, the state set is refined using a hash table. The minimization algorithm has a lower time complexity, O(n), than the O(n²) of a naive comparison of transitions. Few states need to be refined by the hash table, because most states have been partitioned by the backward depth information in the coarse partition. This method achieves greater generality than previous methods because building the backward depth information is independent of the topological complexity of the DFA. The proposed algorithm can be applied not only to the minimization of acyclic automata or simple cyclic automata, but also to automata with high topological complexity. Overall, the proposal has three advantages: lower time complexity, greater generality, and scalability. A comparison to Hopcroft's algorithm demonstrates experimentally that the algorithm runs faster than traditional algorithms.
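
    For contrast with the paper's two-phase method, the classic fixed-point partition refinement (Moore-style) is sketched below; the paper's contribution is to replace most of this refinement work with a coarse partition built from backward depth and then refine the remainder with a hash table. The four-state DFA is an invented example in which two states merge.

    ```python
    def minimize_dfa(states, alphabet, delta, accepting):
        """Partition refinement: split blocks until all transitions respect the
        partition. delta[s][c] is the successor of state s on symbol c."""
        block = {s: int(s in accepting) for s in states}
        while True:
            sig = {s: (block[s],) + tuple(block[delta[s][c]] for c in alphabet)
                   for s in states}
            ids = {}                                  # renumber distinct signatures
            new = {s: ids.setdefault(sig[s], len(ids)) for s in states}
            if len(set(new.values())) == len(set(block.values())):
                return new                            # stable: block id per state
            block = new

    states = [0, 1, 2, 3]
    delta = {0: {"a": 1, "b": 2}, 1: {"a": 3, "b": 0},
             2: {"a": 3, "b": 0}, 3: {"a": 3, "b": 3}}
    print(minimize_dfa(states, "ab", delta, accepting={3}))   # states 1 and 2 merge
    ```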

  12. Robotics-based synthesis of human motion.

    PubMed

    Khatib, O; Demircan, E; De Sapio, V; Sentis, L; Besier, T; Delp, S

    2009-01-01

    The synthesis of human motion is a complex procedure that involves accurate reconstruction of movement sequences, modeling of musculoskeletal kinematics, dynamics and actuation, and characterization of reliable performance criteria. Many of these processes have much in common with the problems found in robotics research. Task-based methods used in robotics may be leveraged to provide novel musculoskeletal modeling methods and physiologically accurate performance predictions. In this paper, we present (i) a new method for the real-time reconstruction of human motion trajectories using direct marker tracking, (ii) a task-driven muscular effort minimization criterion and (iii) new human performance metrics for dynamic characterization of athletic skills. Dynamic motion reconstruction is achieved through the control of a simulated human model to follow the captured marker trajectories in real-time. The operational space control and real-time simulation provide human dynamics at any configuration of the performance. A new criterion of muscular effort minimization has been introduced to analyze human static postures. Extensive motion capture experiments were conducted to validate the new minimization criterion. Finally, new human performance metrics were introduced to study an athletic skill in detail. These metrics include the effort expenditure and the feasible set of operational space accelerations during the performance of the skill. The dynamic characterization takes into account skeletal kinematics as well as muscle routing kinematics and force generating capacities. The developments draw upon an advanced musculoskeletal modeling platform and a task-oriented framework for the effective integration of biomechanics and robotics methods.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strickland, Madeleine; Stanley, Ann Marie; Wang, Guangshun

    Paralogous enzymes arise from gene duplication events that confer a novel function, although it is unclear how cross-reaction between the original and duplicate protein interaction network is minimized. We investigated HPr:EIsugar and NPr:EINtr, the initial complexes of paralogous phosphorylation cascades involved in sugar import and nitrogen regulation in bacteria, respectively. Although the HPr:EIsugar interaction has been well characterized, involving multiple complexes and transient interactions, the exact nature of the NPr:EINtr complex was unknown. We set out to identify the key features of the interaction by performing binding assays and elucidating the structure of NPr in complex with the phosphorylation domain of EINtr (EINNtr), using a hybrid approach involving X-ray, homology, and sparse nuclear magnetic resonance. We found that the overall fold and active-site structure of the two complexes are conserved in order to maintain productive phosphorylation; however, the interface surface potential differs between the two complexes, which prevents cross-reaction.

  14. Vehicle routing problem with time windows using natural inspired algorithms

    NASA Astrophysics Data System (ADS)

    Pratiwi, A. B.; Pratama, A.; Sa’diyah, I.; Suprajitno, H.

    2018-03-01

    The process of distributing goods needs a strategy that minimizes the total cost of operational activities, while several constraints have to be satisfied, namely the capacity of the vehicles and the service time windows of the customers. This Vehicle Routing Problem with Time Windows (VRPTW) is thus a complex constrained problem. This paper proposes nature-inspired algorithms for dealing with the constraints of VRPTW, involving the Bat Algorithm and Cat Swarm Optimization. The Bat Algorithm is hybridized with Simulated Annealing: the worst solution of the Bat Algorithm is replaced by the solution from Simulated Annealing. Cat Swarm Optimization, an algorithm based on the behavior of cats, is improved using the Crow Search Algorithm to obtain simpler and faster convergence. From the computational results, these algorithms give good performance in minimizing the total distance. A larger population yields better computational performance. The improved Cat Swarm Optimization with Crow Search gives better performance than the hybridization of the Bat Algorithm and Simulated Annealing when dealing with big data.
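
    To make the constraint structure concrete, the sketch below evaluates time-window feasibility and builds routes with a nearest-feasible-neighbor heuristic; metaheuristics like those above would then improve such an initial solution. The instance (coordinates, windows, capacity) is entirely invented.

    ```python
    import math

    # toy instance: (x, y, ready, due, service); index 0 is the depot
    CUST = [(0, 0, 0, 999, 0), (2, 3, 5, 30, 2), (5, 1, 8, 40, 2),
            (1, 7, 10, 50, 2), (6, 5, 0, 60, 2)]
    CAPACITY = 3                                  # unit demand per customer

    def dist(i, j):
        return math.hypot(CUST[i][0] - CUST[j][0], CUST[i][1] - CUST[j][1])

    def arrival(t, i, j):
        """Arrival time at j when leaving i at t; None if j's window is missed."""
        a = max(t + dist(i, j), CUST[j][2])       # wait if arriving early
        return a if a <= CUST[j][3] else None

    def greedy_routes():
        """Nearest-feasible-neighbor construction; vehicles leave the depot at t=0."""
        unserved, routes = set(range(1, len(CUST))), []
        while unserved:
            route, t, cur = [0], 0.0, 0
            while len(route) - 1 < CAPACITY:
                cands = [(dist(cur, j), j, arrival(t, cur, j)) for j in unserved]
                cands = [c for c in cands if c[2] is not None]
                if not cands:
                    break
                _, j, a = min(cands)
                route.append(j)
                unserved.discard(j)
                t, cur = a + CUST[j][4], j
            if len(route) == 1:                   # nothing reachable: stop looping
                break
            routes.append(route + [0])
        return routes

    print(greedy_routes())
    ```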

  15. Uniportal anatomic combined unusual segmentectomies.

    PubMed

    González-Rivas, Diego; Lirio, Francisco; Sesma, Julio

    2017-01-01

    Nowadays, sublobar anatomic resections are gaining momentum as a valid alternative for early stage lung cancer. Despite being technically demanding, anatomic segmentectomies can be performed by uniportal video-assisted thoracic surgery (VATS) approach to combine the benefits of minimally invasiveness with the maximum lung sparing. This procedure can be even more complex if a combined resection of multiple segments from different lobes has to be done. Here we report five cases of combined and unusual segmentectomies done by the same experienced surgeon in high volume institutions to show uniportal VATS is a feasible approach for these complex resections and to share an excellent educational resource.

  16. TDRSS telecommunications system, PN code analysis

    NASA Technical Reports Server (NTRS)

    Dixon, R.; Gold, R.; Kaiser, F.

    1976-01-01

    The pseudo noise (PN) codes required to support the TDRSS telecommunications services are analyzed and the impact of alternate coding techniques on the user transponder equipment, the TDRSS equipment, and all factors that contribute to the acquisition and performance of these telecommunication services is assessed. Possible alternatives to the currently proposed hybrid FH/direct sequence acquisition procedures are considered and compared relative to acquisition time, implementation complexity, operational reliability, and cost. The hybrid FH/direct sequence technique is analyzed and rejected in favor of a recommended approach which minimizes acquisition time and user transponder complexity while maximizing probability of acquisition and overall link reliability.
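
    Whatever acquisition scheme is chosen, the underlying sequences are maximal-length LFSR outputs. The sketch below generates one small m-sequence (the degree-5 tap set is an arbitrary textbook example, not a TDRSS code) and verifies the two-valued periodic autocorrelation that makes PN codes attractive for acquisition.

    ```python
    def m_sequence(taps, nbits):
        """One period of a maximal-length sequence from a Fibonacci LFSR.
        taps are 1-indexed register positions XORed into the feedback."""
        state = [1] * nbits
        out = []
        for _ in range(2 ** nbits - 1):
            out.append(state[-1])
            fb = 0
            for t in taps:
                fb ^= state[t - 1]
            state = [fb] + state[:-1]
        return out

    # taps [5, 2]: x^5 + x^2 + 1 (or its reciprocal, depending on convention);
    # both are primitive, so the period is 2^5 - 1 = 31 either way
    seq = m_sequence([5, 2], 5)
    chips = [1 - 2 * b for b in seq]            # map {0,1} -> {+1,-1}
    N = len(chips)
    corr = [sum(chips[i] * chips[(i + k) % N] for i in range(N)) for k in range(N)]
    print(N, corr[0], set(corr[1:]))            # 31 31 {-1}: ideal PN autocorrelation
    ```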

  17. Active Correction of Aperture Discontinuities-Optimized Stroke Minimization. II. Optimization for Future Missions

    NASA Astrophysics Data System (ADS)

    Mazoyer, J.; Pueyo, L.; N'Diaye, M.; Fogarty, K.; Zimmerman, N.; Soummer, R.; Shaklan, S.; Norman, C.

    2018-01-01

    High-contrast imaging and spectroscopy provide unique constraints for exoplanet formation models as well as for planetary atmosphere models. Instrumentation techniques in this field have greatly improved over the last two decades, with the development of stellar coronagraphy, in parallel with specific methods of wavefront sensing and control. Next generation space- and ground-based telescopes will enable the characterization of cold solar-system-like planets for the first time and maybe even in situ detection of bio-markers. However, the growth of primary mirror diameters, necessary for these detections, comes with an increase of their complexity (segmentation, secondary mirror features). These discontinuities in the aperture can greatly limit the performance of coronagraphic instruments. In this context, we introduced a new technique, Active Correction of Aperture Discontinuities-Optimized Stroke Minimization (ACAD-OSM), to correct for the diffractive effects of aperture discontinuities in the final image plane of a coronagraph, using deformable mirrors. In this paper, we present several tools that can be used to optimize the performance of this technique for its application to future large missions. In particular, we analyzed the influence of the deformable setup (size and separating distance) and found that there is an optimal point for this setup, optimizing the performance of the instrument in contrast and throughput while minimizing the strokes applied to the deformable mirrors. These results will help us design future coronagraphic instruments to obtain the best performance.

  18. Minimally Invasive Surgery in Gastrointestinal Cancer: Benefits, Challenges, and Solutions for Underutilization

    PubMed Central

    Gusani, Niraj J.; Kimchi, Eric T.; Kavic, Stephen M.

    2014-01-01

    Background and Objectives: After the widespread application of minimally invasive surgery for benign diseases and given its proven safety and efficacy, minimally invasive surgery for gastrointestinal cancer has gained substantial attention in the past several years. Despite the large number of publications on the topic and level I evidence to support its use in colon cancer, minimally invasive surgery for most gastrointestinal malignancies is still underused. Methods: We explore some of the challenges that face the fusion of minimally invasive surgery technology in the management of gastrointestinal malignancies and propose solutions that may help increase the utilization in the future. These solutions are based on extensive literature review, observation of current trends and practices in this field, and discussion made with experts in the field. Results: We propose 4 different solutions to increase the use of minimally invasive surgery in the treatment of gastrointestinal malignancies: collaboration between surgical oncologists/hepatopancreatobiliary surgeons and minimally invasive surgeons at the same institution; a single surgeon performing 2 fellowships in surgical oncology/hepatopancreatobiliary surgery and minimally invasive surgery; establishing centers of excellence in minimally invasive gastrointestinal cancer management; and finally, using robotic technology to help with complex laparoscopic skills. Conclusions: Multiple studies have confirmed the utility of minimally invasive surgery techniques in dealing with patients with gastrointestinal malignancies. However, training continues to be the most important challenge that faces the use of minimally invasive surgery in the management of gastrointestinal malignancy; implementation of our proposed solutions may help increase the rate of adoption in the future. PMID:25489209

  19. Minimal-scan filtered backpropagation algorithms for diffraction tomography.

    PubMed

    Pan, X; Anastasio, M A

    1999-12-01

    The filtered backpropagation (FBPP) algorithm, originally developed by Devaney [Ultrason. Imaging 4, 336 (1982)], has been widely used for reconstructing images in diffraction tomography. It is generally known that the FBPP algorithm requires scattered data from a full angular range of 2π for exact reconstruction of a generally complex-valued object function. However, we reveal that one needs scattered data only over the angular range 0 ≤ φ ≤ 3π/2 for exact reconstruction of a generally complex-valued object function. Using this insight, we develop and analyze a family of minimal-scan filtered backpropagation (MS-FBPP) algorithms, which, unlike the FBPP algorithm, use scattered data acquired from view angles over the range 0 ≤ φ ≤ 3π/2. We show analytically that these MS-FBPP algorithms are mathematically identical to the FBPP algorithm. We also perform computer simulation studies for validation, demonstration, and comparison of these MS-FBPP algorithms. The numerical results in these simulation studies corroborate our theoretical assertions.

  20. Neural networks for vertical microcode compaction

    NASA Astrophysics Data System (ADS)

    Chu, Pong P.

    1992-09-01

    Neural networks provide an alternative way to solve complex optimization problems. Instead of performing a program of instructions sequentially as in a traditional computer, a neural network model explores many competing hypotheses simultaneously using its massively parallel net. The paper shows how to use the neural network approach to perform vertical microcode compaction for a microprogrammed control unit. The compaction procedure includes two basic steps. The first step determines the compatibility classes and the second step selects a minimal subset to cover the control signals. Since the selection process is an NP-complete problem, finding an optimal solution is impractical. In this study, we employ a customized neural network to obtain the minimal subset. We first formalize this problem, then define an 'energy function' and map it to a two-layer fully connected neural network. The modified network has two types of neurons and can always obtain a valid solution.
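
    The covering step can be prototyped as energy descent over binary neurons: one neuron per compatibility class, with an energy that penalizes uncovered control signals (weight A) and the number of selected classes (weight B). The classes, weights, and greedy asynchronous update below are illustrative assumptions; a Hopfield-style network performs essentially this descent in parallel rather than sequentially.

    ```python
    import random

    random.seed(3)
    SIGNALS = range(8)
    CLASSES = [{0, 1, 2}, {2, 3}, {3, 4, 5}, {5, 6}, {6, 7, 0}, {1, 4, 7}]  # invented
    A, B = 10.0, 1.0                      # coverage penalty must dominate class count

    def energy(v):
        """E = A * (#uncovered signals) + B * (#selected classes)."""
        uncovered = sum(1 for s in SIGNALS
                        if not any(on and s in c for on, c in zip(v, CLASSES)))
        return A * uncovered + B * sum(v)

    v = [random.randint(0, 1) for _ in CLASSES]
    improved = True
    while improved:                       # asynchronous flips until a local minimum
        improved = False
        for j in random.sample(range(len(CLASSES)), len(CLASSES)):
            w = v[:]
            w[j] ^= 1
            if energy(w) < energy(v):
                v, improved = w, True
    print(energy(v), [sorted(c) for on, c in zip(v, CLASSES) if on])
    ```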

  1. High-order sliding-mode control for blood glucose regulation in the presence of uncertain dynamics.

    PubMed

    Hernández, Ana Gabriela Gallardo; Fridman, Leonid; Leder, Ron; Andrade, Sergio Islas; Monsalve, Cristina Revilla; Shtessel, Yuri; Levant, Arie

    2011-01-01

    The success of automatic blood glucose regulation depends on the robustness of the control algorithm used. This is a difficult task due to the complexity of the glucose-insulin regulatory system. The variety of existing models reflects the great number of phenomena involved in the process, and the inter-patient variability of the parameters represents another challenge. In this research a High-Order Sliding-Mode Control is proposed. It is applied to two well-known models, the Bergman Minimal Model and the Sorensen Model, to test its robustness with respect to uncertain dynamics and patients' parameter variability. The controller designed on the basis of the simulations is tested with the specific Bergman Minimal Model of a diabetic patient whose parameters were identified from an in vivo assay. To minimize the insulin infusion rate and avoid the risk of hypoglycemia, the glucose target is a dynamic profile.
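
    For readers unfamiliar with the controller family, a generic super-twisting (second-order sliding-mode) loop on a toy scalar plant is sketched below. The plant, disturbance, and gains are illustrative only; they are not the Bergman or Sorensen dynamics or the paper's tuning.

    ```python
    import numpy as np

    k1, k2, dt = 2.5, 2.0, 1e-3           # gains sized for |d'(t)| below ~0.7 (assumed)
    s, v = 1.0, 0.0                       # sliding variable and integral term
    hist = []
    for i in range(20000):                # 20 s of simulated time
        t = i * dt
        d = 0.5 * np.sin(1.26 * t)        # bounded, unknown matched disturbance
        u = -k1 * np.sqrt(abs(s)) * np.sign(s) + v
        v += dt * (-k2 * np.sign(s))      # integral (twisting) term rejects d(t)
        s += dt * (u + d)                 # toy plant: ds/dt = u + d
        hist.append(abs(s))
    print(max(hist[:1000]), max(hist[-1000:]))   # |s| collapses despite the disturbance
    ```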

  2. Minimal-memory realization of pearl-necklace encoders of general quantum convolutional codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houshmand, Monireh; Hosseini-Khayat, Saied

    2011-02-15

    Quantum convolutional codes, like their classical counterparts, promise to offer higher error correction performance than block codes of equivalent encoding complexity, and are expected to find important applications in reliable quantum communication where a continuous stream of qubits is transmitted. Grassl and Roetteler devised an algorithm to encode a quantum convolutional code with a "pearl-necklace" encoder. Despite their algorithm's theoretical significance as a neat way of representing quantum convolutional codes, it is not well suited to practical realization. In fact, there is no straightforward way to implement any given pearl-necklace structure. This paper closes the gap between theoretical representation and practical implementation. In our previous work, we presented an efficient algorithm to find a minimal-memory realization of a pearl-necklace encoder for Calderbank-Shor-Steane (CSS) convolutional codes. This work is an extension of our previous work and presents an algorithm for turning a pearl-necklace encoder for a general (non-CSS) quantum convolutional code into a realizable quantum convolutional encoder. We show that a minimal-memory realization depends on the commutativity relations between the gate strings in the pearl-necklace encoder. We find a realization by means of a weighted graph which details the noncommutative paths through the pearl necklace. The weight of the longest path in this graph is equal to the minimal amount of memory needed to implement the encoder. The algorithm has a polynomial-time complexity in the number of gate strings in the pearl-necklace encoder.
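
    The key computation named here — the weight of the longest path in a weighted directed acyclic graph — has a standard linear-time solution. A minimal sketch on a hypothetical noncommutativity graph (the graph construction itself is the paper's contribution and is not reproduced):

    ```python
    from collections import defaultdict

    def longest_path_weight(n, edges):
        """Weight of the longest path in a DAG with nodes 0..n-1.

        edges: list of (u, v, w) tuples with w the weight of edge u -> v.
        Kahn topological sort plus dynamic programming, O(V + E).
        """
        adj, indeg = defaultdict(list), [0] * n
        for u, v, w in edges:
            adj[u].append((v, w))
            indeg[v] += 1
        order = [u for u in range(n) if indeg[u] == 0]
        for u in order:                      # the list grows as we iterate
            for v, _ in adj[u]:
                indeg[v] -= 1
                if indeg[v] == 0:
                    order.append(v)
        dist = [0] * n                       # longest path ending at each node
        for u in order:
            for v, w in adj[u]:
                dist[v] = max(dist[v], dist[u] + w)
        return max(dist)

    # Hypothetical graph over 5 gate strings; the answer would be the
    # minimal number of memory qubits needed by the realized encoder.
    print(longest_path_weight(5, [(0, 1, 2), (1, 2, 1), (0, 3, 4), (3, 4, 1)]))  # 5
    ```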

  3. Efficient Deterministic Finite Automata Minimization Based on Backward Depth Information

    PubMed Central

    Liu, Desheng; Huang, Zhiping; Zhang, Yimeng; Guo, Xiaojun; Su, Shaojing

    2016-01-01

    Obtaining a minimal automaton is a fundamental issue in the theory and practical implementation of deterministic finite automata (DFAs). A minimization algorithm is presented in this paper that consists of two main phases. In the first phase, the backward depth information is built, and the state set of the DFA is partitioned into many blocks. In the second phase, the state set is refined using a hash table. The minimization algorithm has a lower time complexity O(n) than a naive comparison of transitions O(n²). Few states need to be refined by the hash table, because most states have been partitioned by the backward depth information in the coarse partition. This method achieves greater generality than previous methods because building the backward depth information is independent of the topological complexity of the DFA. The proposed algorithm can be applied not only to the minimization of acyclic automata or simple cyclic automata, but also to automata with high topological complexity. Overall, the proposal has three advantages: lower time complexity, greater generality, and scalability. A comparison to Hopcroft's algorithm demonstrates experimentally that the algorithm runs faster than traditional algorithms. PMID:27806102
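
    The abstract does not define "backward depth" precisely; one plausible reading is the shortest distance from a state to an accepting state, computed by BFS over reversed transitions. That quantity is safe as a coarse partition key: equivalent states accept the same suffix language, hence share the same shortest accepted-suffix length, so the partition never separates equivalent states. A hedged sketch of just this pre-partition step (the hash-table refinement phase is omitted):

    ```python
    from collections import deque, defaultdict

    def backward_depth_partition(transitions, accepting, n):
        """Coarsely partition DFA states by shortest distance to acceptance.

        transitions: dict mapping (state, symbol) -> state; states are 0..n-1.
        """
        rev = defaultdict(list)                 # reversed transition graph
        for (s, _sym), t in transitions.items():
            rev[t].append(s)
        depth = {s: 0 for s in accepting}
        queue = deque(accepting)
        while queue:                            # backward BFS from acceptance
            t = queue.popleft()
            for s in rev[t]:
                if s not in depth:
                    depth[s] = depth[t] + 1
                    queue.append(s)
        blocks = defaultdict(set)
        for s in range(n):
            blocks[depth.get(s, -1)].add(s)     # -1: acceptance unreachable
        return list(blocks.values())

    # Toy DFA over {a}: 0 -a-> 1 -a-> 2, state 2 accepting with a self-loop.
    trans = {(0, "a"): 1, (1, "a"): 2, (2, "a"): 2}
    print(backward_depth_partition(trans, [2], 3))   # three singleton blocks
    ```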

  4. Minimally invasive surgery for pedal digital deformity: an audit of complications using national benchmark indicators.

    PubMed

    Gilheany, Mark; Baarini, Omar; Samaras, Dean

    2015-01-01

    There is increasing global interest in, and performance of, minimally invasive foot surgery (MIS); however, limited evidence is available on the complications associated with MIS for digital deformity correction. The aim of this prospective audit is to report the surgical and medical complications following MIS for digital deformity against standardised clinical indicators. A prospective clinical audit of 179 patients who underwent MIS to reduce simple and complex digital deformities was conducted between June 2011 and June 2013. All patients were followed up for a minimum of 12 months postoperatively. Data were collected according to a modified version of the Australian Council on Healthcare Standards (ACHS) clinical indicator program. The audit was conducted in accordance with the National Research Ethics Service (NRES) guidelines on clinical audit. The surgical complications included 1 superficial infection (0.53%) and 2 under-corrected digits (0.67%), which required revision surgery. Two patients who underwent isolated complex digital corrections had pain due to delayed union (0.7%), which resolved by 6 months post-op. No neurovascular compromise and no medical complications were encountered. The results compare favourably to rates reported in the literature for open reduction of digital deformity. This audit has illustrated that performing MIS to address simple and complex digital deformity results in low complication rates compared with published standards. MIS procedures were safely performed in a range of clinical settings, on varying degrees of digital deformity, and on a wide range of ages and health profiles. Further studies investigating the effectiveness of these techniques are warranted and should evaluate long-term patient-reported outcome measures, as well as develop treatment algorithms to guide clinical decision making.

  5. Analyzing Genome-Wide Association Studies with an FDR Controlling Modification of the Bayesian Information Criterion

    PubMed Central

    Dolejsi, Erich; Bodenstorfer, Bernhard; Frommlet, Florian

    2014-01-01

    The prevailing method of analyzing GWAS data is still to test each marker individually, although from a statistical point of view it is quite obvious that in the case of complex traits such single-marker tests are not ideal. Recently several model selection approaches for GWAS have been suggested, most of them based on LASSO-type procedures. Here we discuss an alternative model selection approach based on a modification of the Bayesian Information Criterion (mBIC2), which was previously shown to have certain asymptotic optimality properties in terms of minimizing the misclassification error. Heuristic search strategies are introduced which attempt to find the model that minimizes mBIC2, and which are efficient enough to allow the analysis of GWAS data. Our approach is implemented in a software package called MOSGWA. Its performance in case-control GWAS is compared with the two algorithms HLASSO and d-GWASelect, as well as with single-marker tests, in a simulation study based on real SNP data from the POPRES sample. Our results show that MOSGWA performs slightly better than HLASSO; specifically, for more complex models MOSGWA is more powerful, with only a slight increase in Type I error. On the other hand, according to our simulations GWASelect does not control the Type I error at all when used to automatically determine the number of important SNPs. We also reanalyze the GWAS data from the Wellcome Trust Case-Control Consortium and compare the findings of the different procedures, where for complex diseases MOSGWA detects a number of interesting SNPs that are not found by other methods. PMID:25061809

  6. Complexity growth in minimal massive 3D gravity

    NASA Astrophysics Data System (ADS)

    Qaemmaqami, Mohammad M.

    2018-01-01

    We study the complexity growth by using the "complexity = action" (CA) proposal in the minimal massive 3D gravity (MMG) model, which was proposed for resolving the bulk-boundary clash problem of topologically massive gravity (TMG). We observe that the rate of complexity growth for the Banados-Teitelboim-Zanelli (BTZ) black hole saturates the proposed bound set by the physical mass of the BTZ black hole in the MMG model, when the angular momentum parameter and the inner horizon of the black hole go to zero.
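
    For context, the "proposed bound" invoked by CA-proposal papers is generally the Lloyd-type bound on the growth rate of complexity, which for an uncharged, non-rotating black hole of mass M takes the form below; stating it here is an editorial addition, not a formula quoted from this abstract:

    ```latex
    \frac{d\mathcal{C}}{dt} \;\le\; \frac{2M}{\pi\hbar}
    ```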

  7. Practical synchronization on complex dynamical networks via optimal pinning control

    NASA Astrophysics Data System (ADS)

    Li, Kezan; Sun, Weigang; Small, Michael; Fu, Xinchu

    2015-07-01

    We consider practical synchronization on complex dynamical networks under linear feedback control designed by optimal control theory. The control goal is to minimize global synchronization error and control strength over a given finite time interval, as well as synchronization error at terminal time. By utilizing Pontryagin's minimum principle, and based on a general complex dynamical network, we obtain an optimal system to achieve the control goal. The result is verified by performing numerical simulations on star networks, Watts-Strogatz networks, and Barabási-Albert networks. Moreover, by combining optimal control and traditional pinning control, we propose an optimal pinning control strategy which depends on the network's topological structure. The results show that optimal pinning control is very effective for synchronization control in real applications.
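
    As a point of reference for the pinning idea (not the paper's Pontryagin-derived optimal law), the sketch below simulates plain linear-feedback pinning on a star network: only the hub is pinned to the target trajectory, and diffusive coupling drags the leaves along. The node dynamics f, the gains, and the network are illustrative assumptions:

    ```python
    import numpy as np

    n, c, dt, T = 10, 1.0, 0.01, 2000
    A = np.zeros((n, n))
    A[0, 1:] = A[1:, 0] = 1                     # star: node 0 is the hub
    L = np.diag(A.sum(1)) - A                   # graph Laplacian
    d = np.zeros(n)
    d[0] = 5.0                                  # pin the hub only

    f = lambda x: -x + np.tanh(2 * x)           # bistable node self-dynamics
    s = 1.0
    for _ in range(200):                        # target: a stable fixed point,
        s = np.tanh(2 * s)                      # so ds/dt = f(s) = 0

    # dx/dt = f(x) - c * L x - d * (x - s), integrated with Euler steps
    x = np.random.default_rng(0).uniform(-1, 1, n)
    for _ in range(T):
        x = x + dt * (f(x) - c * (L @ x) - d * (x - s))
    print("max sync error:", np.abs(x - s).max())
    ```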

  8. The Lewis Chemical Equilibrium Program with parametric study capability

    NASA Technical Reports Server (NTRS)

    Sevigny, R.

    1981-01-01

    The program was developed to determine chemical equilibrium in complex systems. Using a free energy minimization technique, the program permits calculations such as: chemical equilibrium for assigned thermodynamic states; theoretical rocket performance for both equilibrium and frozen compositions during expansion; incident and reflected shock properties; and Chapman-Jouget detonation properties. It is shown that the same program can handle solid coal in an entrained flow coal gasification problem.

  9. Complexity and compositionality in fluid intelligence.

    PubMed

    Duncan, John; Chylinski, Daphne; Mitchell, Daniel J; Bhandari, Apoorva

    2017-05-16

    Compositionality, or the ability to build complex cognitive structures from simple parts, is fundamental to the power of the human mind. Here we relate this principle to the psychometric concept of fluid intelligence, traditionally measured with tests of complex reasoning. Following the principle of compositionality, we propose that the critical function in fluid intelligence is splitting a complex whole into simple, separately attended parts. To test this proposal, we modify traditional matrix reasoning problems to minimize requirements on information integration, working memory, and processing speed, creating problems that are trivial once effectively divided into parts. Performance remains poor in participants with low fluid intelligence, but is radically improved by problem layout that aids cognitive segmentation. In line with the principle of compositionality, we suggest that effective cognitive segmentation is important in all organized behavior, explaining the broad role of fluid intelligence in successful cognition.

  10. Role of robotics in managing mesh and suture complications of prior pelvic organ prolapse surgery.

    PubMed

    Wilkinson, Michael N; O'Sullivan, Orfhlaith E; O'Reilly, Barry A

    2017-03-01

    Robotic surgery is proving essential in providing a minimally invasive approach to complex urogynaecological cases. This video highlights the diversity and complexity of cases performed using the robot-assisted approach. The robot-assisted approach was utilised to excellent effect in two complex urogynaecological cases. In the first case the entire left arm of an intravesically placed TVT was removed using a combined vaginal and robotic approach. The second case involved removing four paravaginal sutures, one of which breached the bladder and was encrusted with calculus. These had been placed during a laparoscopic paravaginal repair 2 years previously. The patient had a concomitant vaginal hysterectomy, McCall's culdoplasty and anterior wall repair. The robot-assisted approach allows for excellent access to the pelvis and retropubic space, facilitating the surgical management of complex urogynaecology cases.

  11. Complex Instruction Set Quantum Computing

    NASA Astrophysics Data System (ADS)

    Sanders, G. D.; Kim, K. W.; Holton, W. C.

    1998-03-01

    In proposed quantum computers, electromagnetic pulses are used to implement logic gates on quantum bits (qubits). Gates are unitary transformations applied to coherent qubit wavefunctions, and a universal computer can be created using a minimal set of gates. By applying many elementary gates in sequence, desired quantum computations can be performed. This reduced instruction set approach to quantum computing (RISC QC) is characterized by serial application of a few basic pulse shapes and a long coherence time. However, the unitary matrix of the overall computation is ultimately a unitary matrix of the same size as any of the elementary matrices. This suggests that we might replace a sequence of reduced instructions with a single complex instruction using an optimally tailored pulse. We refer to this approach as complex instruction set quantum computing (CISC QC). One trades the requirement for long coherence times for the ability to design and generate potentially more complex pulses. We consider a model system of coupled qubits interacting through nearest-neighbor coupling and show that CISC QC can reduce the time required to perform quantum computations.
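
    The observation that any gate sequence collapses to a single same-size unitary is easy to verify numerically. A minimal sketch with a hypothetical single-qubit "program" (the paper's coupled-qubit model and pulse design are not reproduced):

    ```python
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate
    T = np.diag([1, np.exp(1j * np.pi / 4)])        # pi/8 phase gate

    # A RISC-style program: elementary gates applied left to right.
    program = [H, T, H, T, T, H]
    U = np.eye(2, dtype=complex)
    for g in program:
        U = g @ U                                   # later gates act on the left

    # The whole sequence is one 2x2 unitary -- the "complex instruction".
    assert np.allclose(U.conj().T @ U, np.eye(2))   # unitarity check
    print(np.round(U, 3))
    ```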

  12. Minimizing embedding impact in steganography using trellis-coded quantization

    NASA Astrophysics Data System (ADS)

    Filler, Tomáš; Judas, Jan; Fridrich, Jessica

    2010-01-01

    In this paper, we propose a practical approach to minimizing embedding impact in steganography based on syndrome coding and trellis-coded quantization, and contrast its performance with appropriate rate-distortion bounds. We assume that each cover element can be assigned a positive scalar expressing the impact of making an embedding change at that element (single-letter distortion). The problem is to embed a given payload with the minimal possible average embedding impact. This task, which can be viewed as a generalization of matrix embedding or writing on wet paper, has been approached using heuristic and suboptimal tools in the past. Here, we propose a fast and very versatile solution to this problem that can theoretically achieve performance arbitrarily close to the bound. It is based on syndrome coding using linear convolutional codes with the optimal binary quantizer implemented using the Viterbi algorithm run in the dual domain. The complexity and memory requirements of the embedding algorithm are linear w.r.t. the number of cover elements. For practitioners, we include detailed algorithms for finding good codes and their implementation. Finally, we report extensive experimental results for a large set of relative payloads and for different distortion profiles, including the wet paper channel.
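
    The classical special case this work generalizes — matrix embedding with a [7,4] Hamming code, hiding three message bits in seven cover bits with at most one change — fits in a few lines. This sketch illustrates syndrome coding only; it is not the paper's trellis construction:

    ```python
    import numpy as np

    # Parity-check matrix of the [7,4] Hamming code: column j (1-indexed)
    # holds the binary digits of j, so flipping cover bit j adds column j
    # to the syndrome.
    H = np.array([[(j >> k) & 1 for j in range(1, 8)] for k in range(3)])

    def embed(cover, msg):
        """Flip at most one of 7 cover bits so the syndrome equals msg."""
        diff = (H @ cover % 2) ^ msg
        j = diff[0] + 2 * diff[1] + 4 * diff[2]   # column index to cancel diff
        stego = cover.copy()
        if j:
            stego[j - 1] ^= 1
        return stego

    def extract(stego):
        return H @ stego % 2

    cover = np.array([1, 0, 1, 1, 0, 0, 1])
    msg = np.array([1, 0, 1])
    stego = embed(cover, msg)
    assert (extract(stego) == msg).all() and (stego != cover).sum() <= 1
    ```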

  13. Energy minimization in medical image analysis: Methodologies and applications.

    PubMed

    Zhao, Feng; Xie, Xianghua

    2016-02-01

    Energy minimization is of particular interest in medical image analysis. In the past two decades, a variety of optimization schemes have been developed. In this paper, we present a comprehensive survey of the state-of-the-art optimization approaches. These algorithms are mainly classified into two categories: continuous methods and discrete methods. The former include the Newton-Raphson method, gradient descent method, conjugate gradient method, proximal gradient method, coordinate descent method, and genetic algorithm-based methods, while the latter cover the graph cuts method, belief propagation method, tree-reweighted message passing method, linear programming method, maximum margin learning method, simulated annealing method, and iterated conditional modes method. We also discuss the minimal surface method, primal-dual method, and the multi-objective optimization method. In addition, we review several comparative studies that evaluate the performance of different minimization techniques in terms of accuracy, efficiency, or complexity. These optimization techniques are widely used in many medical applications, for example, image segmentation, registration, reconstruction, motion tracking, and compressed sensing. We thus give an overview of those applications as well. Copyright © 2015 John Wiley & Sons, Ltd.
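
    As a concrete instance of the continuous family surveyed here, the sketch below runs plain gradient descent on a Tikhonov-style denoising energy E(u) = ½‖u − f‖² + ½λ‖∇u‖²; the image, λ, and step size are toy assumptions:

    ```python
    import numpy as np

    def denoise(f, lam=4.0, tau=0.05, iters=500):
        """Gradient descent on E(u) = 0.5||u-f||^2 + 0.5*lam*||grad u||^2.

        The gradient is (u - f) - lam * laplacian(u); tau is the step size
        (kept below 2 / (1 + 8*lam) for stability of the 5-point Laplacian).
        """
        u = f.copy()
        for _ in range(iters):
            lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                   np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)
            u -= tau * ((u - f) - lam * lap)
        return u

    rng = np.random.default_rng(0)
    clean = np.zeros((64, 64))
    clean[16:48, 16:48] = 1.0                       # synthetic square image
    noisy = clean + 0.3 * rng.standard_normal(clean.shape)
    print("MSE before:", ((noisy - clean) ** 2).mean())
    print("MSE after: ", ((denoise(noisy) - clean) ** 2).mean())
    ```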

  14. Interlaboratory Study Characterizing a Yeast Performance Standard for Benchmarking LC-MS Platform Performance*

    PubMed Central

    Paulovich, Amanda G.; Billheimer, Dean; Ham, Amy-Joan L.; Vega-Montoto, Lorenzo; Rudnick, Paul A.; Tabb, David L.; Wang, Pei; Blackman, Ronald K.; Bunk, David M.; Cardasis, Helene L.; Clauser, Karl R.; Kinsinger, Christopher R.; Schilling, Birgit; Tegeler, Tony J.; Variyath, Asokan Mulayath; Wang, Mu; Whiteaker, Jeffrey R.; Zimmerman, Lisa J.; Fenyo, David; Carr, Steven A.; Fisher, Susan J.; Gibson, Bradford W.; Mesri, Mehdi; Neubert, Thomas A.; Regnier, Fred E.; Rodriguez, Henry; Spiegelman, Cliff; Stein, Stephen E.; Tempst, Paul; Liebler, Daniel C.

    2010-01-01

    Optimal performance of LC-MS/MS platforms is critical to generating high quality proteomics data. Although individual laboratories have developed quality control samples, there is no widely available performance standard of biological complexity (and associated reference data sets) for benchmarking of platform performance for analysis of complex biological proteomes across different laboratories in the community. Individual preparations of the yeast Saccharomyces cerevisiae proteome have been used extensively by laboratories in the proteomics community to characterize LC-MS platform performance. The yeast proteome is uniquely attractive as a performance standard because it is the most extensively characterized complex biological proteome and the only one associated with several large scale studies estimating the abundance of all detectable proteins. In this study, we describe a standard operating protocol for large scale production of the yeast performance standard and offer aliquots to the community through the National Institute of Standards and Technology where the yeast proteome is under development as a certified reference material to meet the long term needs of the community. Using a series of metrics that characterize LC-MS performance, we provide a reference data set demonstrating typical performance of commonly used ion trap instrument platforms in expert laboratories; the results provide a basis for laboratories to benchmark their own performance, to improve upon current methods, and to evaluate new technologies. Additionally, we demonstrate how the yeast reference, spiked with human proteins, can be used to benchmark the power of proteomics platforms for detection of differentially expressed proteins at different levels of concentration in a complex matrix, thereby providing a metric to evaluate and minimize preanalytical and analytical variation in comparative proteomics experiments. PMID:19858499

  15. Uniportal anatomic combined unusual segmentectomies

    PubMed Central

    Lirio, Francisco; Sesma, Julio

    2017-01-01

    Nowadays, sublobar anatomic resections are gaining momentum as a valid alternative for early stage lung cancer. Despite being technically demanding, anatomic segmentectomies can be performed by a uniportal video-assisted thoracic surgery (VATS) approach to combine the benefits of minimal invasiveness with maximum lung sparing. This procedure can be even more complex if a combined resection of multiple segments from different lobes has to be done. Here we report five cases of combined and unusual segmentectomies performed by the same experienced surgeon in high-volume institutions to show that uniportal VATS is a feasible approach for these complex resections and to share an excellent educational resource. PMID:29078653

  16. Novel application of simultaneous multi-image display during complex robotic abdominal procedures

    PubMed Central

    2014-01-01

    Background The surgical robot offers the potential to integrate multiple views into the surgical console screen, and for the assistant's monitors to provide real-time views of both fields of operation. This function has the potential to increase patient safety and surgical efficiency during an operation. Herein, we present a novel application of the multi-image display system for simultaneous visualization of endoscopic views during various complex robotic gastrointestinal operations. All operations were performed using the da Vinci Surgical System (Intuitive Surgical, Sunnyvale, CA, USA) with the assistance of TilePro, multi-input display software, during employment of the intraoperative scopes. Three robotic operations, left hepatectomy with intraoperative common bile duct exploration, low anterior resection, and radical distal subtotal gastrectomy with intracorporeal gastrojejunostomy, were performed by three different surgeons at a tertiary academic medical center. Results The three complex robotic abdominal operations were successfully completed without difficulty or intraoperative complications. The use of TilePro to simultaneously visualize the images from the colonoscope, gastroscope, and choledochoscope made it possible to perform additional intraoperative endoscopic procedures without extra monitors or interference with the operations. Conclusion We present a novel use of the multi-input display program on the da Vinci Surgical System to facilitate the performance of intraoperative endoscopies during complex robotic operations. Our study offers another potentially beneficial application of the robotic surgery platform toward integration and simplification of combining additional procedures with complex minimally invasive operations. PMID:24628761

  17. Cavity-enhanced measurements for determining dielectric-membrane thickness and complex index of refraction.

    PubMed

    Stambaugh, Corey; Durand, Mathieu; Kemiktarak, Utku; Lawall, John

    2014-08-01

    The material properties of silicon nitride (SiN) play an important role in the performance of SiN membranes used in optomechanical applications. An optimum design of a subwavelength high-contrast grating requires accurate knowledge of the membrane thickness and index of refraction, and its performance is ultimately limited by material absorption. Here we describe a cavity-enhanced method to measure the thickness and complex index of refraction of dielectric membranes with small, but nonzero, absorption coefficients. By determining Brewster's angle and an angle at which reflection is minimized by means of destructive interference, both the real part of the index of refraction and the sample thickness can be measured. A comparison of the losses in the empty cavity and the cavity containing the dielectric sample provides a measurement of the absorption.
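
    For reference, the two angle conditions the method exploits, for a free-standing membrane of (real) refractive index n and thickness d probed at vacuum wavelength λ, are the Brewster condition and the destructive-interference (reflection-minimum) condition; θ_t denotes the internal refraction angle:

    ```latex
    \tan\theta_B = n, \qquad 2\,n\,d\,\cos\theta_t = m\,\lambda \quad (m \in \mathbb{Z})
    ```

    Measuring both angles therefore determines n and d separately, with the absorption coefficient then extracted from the cavity-loss comparison described above.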

  18. Modeling two-phase flow in three-dimensional complex flow-fields of proton exchange membrane fuel cells

    NASA Astrophysics Data System (ADS)

    Kim, Jinyong; Luo, Gang; Wang, Chao-Yang

    2017-10-01

    3D fine-mesh flow-fields recently developed by Toyota Mirai improved water management and mass transport in proton exchange membrane (PEM) fuel cell stacks, suggesting their potential value for robust and high-power PEM fuel cell stack performance. In such complex flow-fields, Forchheimer's inertial effect is dominant at high current density. In this work, a two-phase flow model of 3D complex flow-fields of PEMFCs is developed by accounting for Forchheimer's inertial effect, for the first time, to elucidate the underlying mechanism of liquid water behavior and mass transport inside 3D complex flow-fields and their adjacent gas diffusion layers (GDL). It is found that Forchheimer's inertial effect enhances liquid water removal from flow-fields and adds additional flow resistance around baffles, which improves interfacial liquid water and mass transport. As a result, substantial improvements in high current density cell performance and operational stability are expected in PEMFCs with 3D complex flow-fields, compared to PEMFCs with conventional flow-fields. Higher current density operation required to further reduce PEMFC stack cost per kW in the future will necessitate optimizing complex flow-field designs using the present model, in order to efficiently remove a large amount of product water and hence minimize the mass transport voltage loss.
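
    For reference, the Darcy-Forchheimer momentum relation that such a model adds is the standard one below (μ viscosity, K permeability, β the Forchheimer inertial coefficient); the quadratic term is what becomes dominant at the high flow velocities of high-current-density operation:

    ```latex
    -\nabla p \;=\; \frac{\mu}{K}\,\mathbf{u} \;+\; \beta\,\rho\,\lvert\mathbf{u}\rvert\,\mathbf{u}
    ```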

  19. Spectral Entropies as Information-Theoretic Tools for Complex Network Comparison

    NASA Astrophysics Data System (ADS)

    De Domenico, Manlio; Biamonte, Jacob

    2016-10-01

    Any physical system can be viewed from the perspective that information is implicitly represented in its state. However, the quantification of this information when it comes to complex networks has remained largely elusive. In this work, we use techniques inspired by quantum statistical mechanics to define an entropy measure for complex networks and to develop a set of information-theoretic tools, based on network spectral properties, such as Rényi q entropy, generalized Kullback-Leibler and Jensen-Shannon divergences, the latter allowing us to define a natural distance measure between complex networks. First, we show that by minimizing the Kullback-Leibler divergence between an observed network and a parametric network model, inference of model parameter(s) by means of maximum-likelihood estimation can be achieved and model selection can be performed with appropriate information criteria. Second, we show that the information-theoretic metric quantifies the distance between pairs of networks and we can use it, for instance, to cluster the layers of a multilayer system. By applying this framework to networks corresponding to sites of the human microbiome, we perform hierarchical cluster analysis and recover with high accuracy existing community-based associations. Our results imply that spectral-based statistical inference in complex networks results in demonstrably superior performance as well as a conceptual backbone, filling a gap towards a network information theory.
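
    The core entropy construction is compact enough to sketch: treat e^{-βL}/Tr(e^{-βL}) as a density matrix built from the graph Laplacian L and take its von Neumann entropy. The graphs and β below are toy assumptions:

    ```python
    import numpy as np

    def spectral_entropy(A, beta=1.0):
        """Von Neumann entropy of a graph from its Laplacian spectrum.

        rho = exp(-beta L) / Tr exp(-beta L);  S = -Tr(rho log rho).
        """
        L = np.diag(A.sum(1)) - A              # combinatorial Laplacian
        mu = np.linalg.eigvalsh(L)             # Laplacian eigenvalues
        p = np.exp(-beta * mu)
        p /= p.sum()                           # eigenvalues of rho
        p = p[p > 1e-15]                       # drop numerical zeros
        return float(-(p * np.log(p)).sum())

    # A path and a star on 5 nodes: same size, different structure/entropy.
    path = np.diag(np.ones(4), 1); path = path + path.T
    star = np.zeros((5, 5)); star[0, 1:] = star[1:, 0] = 1
    print(spectral_entropy(path), spectral_entropy(star))
    ```

    The Kullback-Leibler and Jensen-Shannon divergences used for model fitting and network clustering are built from the same spectral density matrices.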

  20. Project UNITY: Cross Domain Visualization Collaboration

    DTIC Science & Technology

    2015-10-18

    location is at the Space Operations Coordination Center (UK-SPOCC) in High Wycombe, UK. Identical AFRL-developed ErgoWorkstations (see Figure 2) were...installed in both locations. The AFRL ErgoWorkstation is made up of a high performance Windows-based PC with three displays: two 30” Dell Cinema ...system can be seen in Figure 1. The intent of using identical hardware is to minimize complexity, to simplify debugging, and to provide an opportunity

  1. Assessment of the relationship between renal volume and renal function after minimally-invasive partial nephrectomy: the role of computed tomography and nuclear renal scan.

    PubMed

    Bertolo, Riccardo; Fiori, Cristian; Piramide, Federico; Amparore, Daniele; Barrera, Monica; Sardo, Diego; Veltri, Andrea; Porpiglia, Francesco

    2018-05-14

    To evaluate the correlation between the loss of renal function as assessed by Tc99MAG-3 renal scan and the loss of renal volume as calculated by volumetric assessment on CT-scan in patients who underwent minimally-invasive partial nephrectomy (PN). PN prospectively-maintained database was retrospectively queried for patients who underwent minimally-invasive PN (2012-2017) for renal mass

  2. Simulated parallel annealing within a neighborhood for optimization of biomechanical systems.

    PubMed

    Higginson, J S; Neptune, R R; Anderson, F C

    2005-09-01

    Optimization problems for biomechanical systems have become extremely complex. Simulated annealing (SA) algorithms have performed well in a variety of test problems and biomechanical applications; however, despite advances in computer speed, convergence to optimal solutions for systems of even moderate complexity has remained prohibitive. The objective of this study was to develop a portable parallel version of a SA algorithm for solving optimization problems in biomechanics. The algorithm for simulated parallel annealing within a neighborhood (SPAN) was designed to minimize interprocessor communication time and closely retain the heuristics of the serial SA algorithm. The computational speed of the SPAN algorithm scaled linearly with the number of processors on different computer platforms for a simple quadratic test problem and for a more complex forward dynamic simulation of human pedaling.
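
    For readers who want the serial baseline that SPAN parallelizes, a minimal SA core follows; the cooling schedule, move generator, and the quadratic test cost (echoing the study's simple test problem) are illustrative assumptions:

    ```python
    import math
    import random

    def anneal(cost, x0, neighbor, T0=1.0, cooling=0.995, iters=20000):
        """Serial simulated annealing: accept uphill moves with prob exp(-dE/T)."""
        x, e = x0, cost(x0)
        best, best_e = x, e
        T = T0
        for _ in range(iters):
            y = neighbor(x)
            de = cost(y) - e
            if de < 0 or random.random() < math.exp(-de / T):
                x, e = y, e + de
                if e < best_e:
                    best, best_e = x, e
            T *= cooling
        return best, best_e

    # Toy 2-D quadratic test problem with optimum at (3, -1).
    cost = lambda v: (v[0] - 3) ** 2 + (v[1] + 1) ** 2
    move = lambda v: [v[0] + random.gauss(0, 0.1), v[1] + random.gauss(0, 0.1)]
    print(anneal(cost, [0.0, 0.0], move))
    ```

    Roughly speaking, SPAN distributes neighborhood move evaluations across processors while retaining these serial acceptance heuristics and minimizing interprocessor communication.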

  3. Energy minimization for self-organized structure formation and actuation

    NASA Astrophysics Data System (ADS)

    Kofod, Guggi; Wirges, Werner; Paajanen, Mika; Bauer, Siegfried

    2007-02-01

    An approach for creating complex structures with embedded actuation in planar manufacturing steps is presented. Self-organization and energy minimization are central to this approach, illustrated with a model based on minimization of the hyperelastic free energy strain function of a stretched elastomer and the bending elastic energy of a plastic frame. A tulip-shaped gripper structure illustrates the technological potential of the approach. Advantages are simplicity of manufacture, complexity of final structures, and the ease with which any electroactive material can be exploited as means of actuation.

  4. Adaptive Attitude Control of the Crew Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Muse, Jonathan

    2010-01-01

    An H∞-NMA architecture for the Crew Launch Vehicle was developed in a state feedback setting. The minimal-complexity adaptive law was shown to improve baseline performance relative to a performance metric based on Crew Launch Vehicle design requirements for almost all of the Worst-on-Worst dispersion cases. The adaptive law was able to maintain stability for some dispersions that are unstable with the nominal control law. Due to the nature of the H∞-NMA architecture, the augmented adaptive control signal has low bandwidth, which is a great benefit for a manned launch vehicle.

  5. Predicting protein interactions by Brownian dynamics simulations.

    PubMed

    Meng, Xuan-Yu; Xu, Yu; Zhang, Hong-Xing; Mezei, Mihaly; Cui, Meng

    2012-01-01

    We present a newly adapted Brownian-Dynamics (BD)-based protein docking method for predicting native protein complexes. The approach includes global BD conformational sampling, compact complex selection, and local energy minimization. In order to reduce the computational costs for energy evaluations, a shell-based grid force field was developed to represent the receptor protein and solvation effects. The performance of this BD protein docking approach has been evaluated on a test set of 24 crystal protein complexes. Reproduction of experimental structures in the test set indicates the adequate conformational sampling and accurate scoring of this BD protein docking approach. Furthermore, we have developed an approach to account for the flexibility of proteins, which has been successfully applied to reproduce the experimental complex structure from the structure of two unbounded proteins. These results indicate that this adapted BD protein docking approach can be useful for the prediction of protein-protein interactions.

  6. Complexity and compositionality in fluid intelligence

    PubMed Central

    Duncan, John; Chylinski, Daphne

    2017-01-01

    Compositionality, or the ability to build complex cognitive structures from simple parts, is fundamental to the power of the human mind. Here we relate this principle to the psychometric concept of fluid intelligence, traditionally measured with tests of complex reasoning. Following the principle of compositionality, we propose that the critical function in fluid intelligence is splitting a complex whole into simple, separately attended parts. To test this proposal, we modify traditional matrix reasoning problems to minimize requirements on information integration, working memory, and processing speed, creating problems that are trivial once effectively divided into parts. Performance remains poor in participants with low fluid intelligence, but is radically improved by problem layout that aids cognitive segmentation. In line with the principle of compositionality, we suggest that effective cognitive segmentation is important in all organized behavior, explaining the broad role of fluid intelligence in successful cognition. PMID:28461462

  7. Impact of Cognitive Abilities and Prior Knowledge on Complex Problem Solving Performance - Empirical Results and a Plea for Ecologically Valid Microworlds.

    PubMed

    Süß, Heinz-Martin; Kretzschmar, André

    2018-01-01

    The original aim of complex problem solving (CPS) research was to bring the cognitive demands of complex real-life problems into the lab in order to investigate problem solving behavior and performance under controlled conditions. Up until now, the validity of psychometric intelligence constructs has been scrutinized with regard to its importance for CPS performance. At the same time, different CPS measurement approaches competing for the title of the best way to assess CPS have been developed. In the first part of the paper, we investigate the predictability of CPS performance on the basis of the Berlin Intelligence Structure Model and Cattell's investment theory as well as an elaborated knowledge taxonomy. In the first study, 137 students managed a simulated shirt factory (Tailorshop; i.e., a complex real-life-oriented system) twice, while in the second study, 152 students completed a forestry scenario (FSYS; i.e., a complex artificial-world system). The results indicate that reasoning - specifically numerical reasoning (Studies 1 and 2) and figural reasoning (Study 2) - is the only relevant predictor among the intelligence constructs. We discuss the results with reference to the Brunswik symmetry principle. Path models suggest that reasoning and prior knowledge influence problem solving performance in the Tailorshop scenario mainly indirectly. In addition, different types of system-specific knowledge independently contribute to predicting CPS performance. The results of Study 2 indicate that working memory capacity, assessed as an additional predictor, has no incremental validity beyond reasoning. We conclude that (1) cognitive abilities and prior knowledge are substantial predictors of CPS performance, and (2) in contrast to former and recent interpretations, there is insufficient evidence to consider CPS a unique ability construct. In the second part of the paper, we discuss our results in light of recent CPS research, which predominantly utilizes the minimally complex systems (MCS) measurement approach. We suggest ecologically valid microworlds as an indispensable tool for future CPS research and applications.

  8. SEQUOIA: significance enhanced network querying through context-sensitive random walk and minimization of network conductance.

    PubMed

    Jeong, Hyundoo; Yoon, Byung-Jun

    2017-03-14

    Network querying algorithms provide computational means to identify conserved network modules in large-scale biological networks that are similar to known functional modules, such as pathways or molecular complexes. Two main challenges for network querying algorithms are the high computational complexity of detecting potential isomorphism between the query and the target graphs and ensuring the biological significance of the query results. In this paper, we propose SEQUOIA, a novel network querying algorithm that effectively addresses these issues by utilizing a context-sensitive random walk (CSRW) model for network comparison and minimizing the network conductance of potential matches in the target network. The CSRW model, inspired by the pair hidden Markov model (pair-HMM) that has been widely used for sequence comparison and alignment, can accurately assess the node-to-node correspondence between different graphs by accounting for node insertions and deletions. The proposed algorithm identifies high-scoring network regions based on the CSRW scores, which are subsequently extended by maximally reducing the network conductance of the identified subnetworks. Performance assessment based on real PPI networks and known molecular complexes shows that SEQUOIA outperforms existing methods and clearly enhances the biological significance of the query results. The source code and datasets can be downloaded from http://www.ece.tamu.edu/~bjyoon/SEQUOIA.
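
    The conductance objective is simple to state: for a node set S, φ(S) is the number of boundary edges divided by the smaller of the two volumes (degree sums). A toy sketch, assuming an unweighted graph (networkx is used for graph construction only):

    ```python
    import networkx as nx

    def conductance(G, S):
        """phi(S) = cut(S, complement) / min(vol(S), vol(complement))."""
        S = set(S)
        cut = sum(1 for u, v in G.edges if (u in S) != (v in S))
        vol_in = sum(deg for node, deg in G.degree if node in S)
        vol_out = sum(deg for node, deg in G.degree if node not in S)
        return cut / min(vol_in, vol_out)

    # Two 5-cliques joined by a single edge: each clique is a good module.
    G = nx.barbell_graph(5, 0)
    print(conductance(G, range(5)))   # low value -> strong module boundary
    ```

    Growing a match so as to keep this value small is what "maximally reducing the network conductance" amounts to.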

  9. [Minimally invasive reconstruction of the posterolateral corner with simultaneous replacement of the anterior cruciate ligament for complex knee ligament injuries].

    PubMed

    Vega-España, E A; Vilchis-Sámano, H; Ruiz-Mejía, O

    2017-01-01

    To evaluate and describe the results of simultaneous reconstruction of the posterolateral complex (PLC) and the anterior cruciate ligament (ACL) with a minimally invasive technique. ACL and PLC reconstruction was performed in seven patients using the technique described, in the period from March to November 2012. All patients were evaluated six months after the procedure using the IKDC and IKSS subjective tests. Their return to work activities and their level of satisfaction were assessed. Six male patients and one female patient, ranging in age from 26 to 46 years, were evaluated. The injuries were mostly caused by sports-related accidents. All patients were economically active and required an average period of three months of disability. The assessment at six months according to the IKDC scale yielded: one patient with IKDC A, four with IKDC B, one with C, and one with D. On the subjective IKSS scale, 80% of patients averaged knee stability of over 90 points; one patient scored 100 points and another 70 points.

  10. The marvel of percutaneous cardiovascular devices in the elderly.

    PubMed

    Guidoin, Robert; Douville, Yvan; Clavel, Marie-Annick; Zhang, Ze; Nutley, Mark; Pîbarot, Philippe; Dionne, Guy

    2010-06-01

    Thanks to minimally invasive procedures, frail and elderly patients can also benefit from innovative technologies. More than 14 million implanted pacemakers deliver impulses to the heart muscle to regulate the heart rate (treating bradycardias and blocks). The first human implantation of defibrillators was performed in early 2000. The defibrillator detects cardiac arrhythmias and corrects them by delivering electric shocks. The ongoing development of minimally invasive technologies has also broadened the scope of treatment for elderly patients with vascular stenosis and aneurysmal disease as well as other complex vascular pathologies. The nonsurgical cardiac valve replacement represents one of the most recent and exciting developments, demonstrating the feasibility of replacing a heart valve by way of placement through an intra-arterial or trans-ventricular sheath. Percutaneous devices are particularly well suited for the elderly as the surgical risks of minimally invasive surgery are considerably less as compared to open surgery, leading to a shorter hospital stay, a faster recovery, and improved quality of life.

  11. Displacement Based Multilevel Structural Optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Striz, A. G.

    1996-01-01

    In the complex environment of true multidisciplinary design optimization (MDO), efficiency is one of the most desirable attributes of any approach. In the present research, a new and highly efficient methodology for the MDO subset of structural optimization is proposed and detailed, i.e., for the weight minimization of a given structure under size, strength, and displacement constraints. Specifically, finite element based multilevel optimization of structures is performed. In the system level optimization, the design variables are the coefficients of assumed polynomially based global displacement functions, and the load unbalance resulting from the solution of the global stiffness equations is minimized. In the subsystems level optimizations, the weight of each element is minimized under the action of stress constraints, with the cross sectional dimensions as design variables. The approach is expected to prove very efficient since the design task is broken down into a large number of small and efficient subtasks, each with a small number of variables, which are amenable to parallel computing.

  12. Using percutaneous transhepatic cholangioscopic lithotripsy for intrahepatic calculus in hostile abdomen.

    PubMed

    Kow, A W C; Wang, B; Wong, D; Sundeep, P J; Chan, C Y; Ho, C K; Liau, K H

    2011-04-01

    Hepatolithiasis is a challenging condition to treat, especially in patients with previous hepatobiliary surgery. Percutaneous transhepatic cholangioscopic lithotripsy (PTCSL) is an attractive salvage option for the treatment of recurrent hepatolithiasis. We reviewed our experience using PTCSL to treat 4 patients with previous complex abdominal surgery. We studied the 4 patients who underwent PTCSL from October 2007 to July 2009, reviewing the operative procedures, the workflow of performing PTCSL in our institution, and the outcome of the procedure. PTCSL was performed in our institution using a 3 mm cholangioscope (Dornier MedTech®) and a holmium laser with settings of 0.8 J, 20 Hz and 16 W. This was performed through a percutaneous transhepatic cholangio-catheter inserted by interventional radiologists. There were 4 patients with a median age of 50 (43-69) years. The median duration of the condition prior to PTCSL was 102 (60-156) months. Three patients had recurrent pyogenic cholangitis (RPC) with recurrent intrahepatic stones. They all had prior complex hepatobiliary operations. The median duration of surgery was 130 (125-180) min. There was minimal intra-operative blood loss. The first procedure was performed under local anaesthesia and sedation; however, with experience, the subsequent 3 patients had the procedure performed under general anaesthesia. The median size of the bile duct was 18 (15-20) mm prior to the procedure. The number of stones ranged from one to three, with the largest stone comparable in size to the bile duct. The median follow-up was 18 (10-24) months. All patients were symptom-free with neither stone recurrence nor cholangitis at the last follow-up. PTCSL is a feasible and effective treatment method for patients with recurrent biliary stones following complex abdominal surgery, as the success rates of open surgery and endoscopic procedures are limited. Excellent results can be expected with this minimally invasive technique. Copyright © 2010 Royal College of Surgeons of Edinburgh (Scottish charity number SC005317) and Royal College of Surgeons in Ireland. Published by Elsevier Ltd. All rights reserved.

  13. Asynchronous oscillations of rigid rods drive viscous fluid to swirl

    NASA Astrophysics Data System (ADS)

    Hayashi, Rintaro; Takagi, Daisuke

    2017-12-01

    We present a minimal system for generating flow at low Reynolds number by oscillating a pair of rigid rods in silicone oil. Experiments show that oscillating them in phase produces no net flow, but a phase difference alone can generate rich flow fields. Tracer particles follow complex trajectory patterns consisting of small orbital movements every cycle and then drifting or swirling in larger regions after many cycles. Observations are consistent with simulations performed using the method of regularized Stokeslets, which reveal complex three-dimensional flow structures emerging from simple oscillatory actuation. Our findings reveal the basic underlying flow structure around oscillatory protrusions such as hairs and legs as commonly featured on living and nonliving bodies.

  14. A family of variable step-size affine projection adaptive filter algorithms using statistics of channel impulse response

    NASA Astrophysics Data System (ADS)

    Shams Esfand Abadi, Mohammad; AbbasZadeh Arani, Seyed Ali Asghar

    2011-12-01

    This paper extends the recently introduced variable step-size (VSS) approach to the family of adaptive filter algorithms. This method uses prior knowledge of the statistics of the channel impulse response. Accordingly, the optimal step-size vector is obtained by minimizing the mean-square deviation (MSD). The presented algorithms are the VSS affine projection algorithm (VSS-APA), the VSS selective partial update NLMS (VSS-SPU-NLMS), the VSS-SPU-APA, and the VSS selective regressor APA (VSS-SR-APA). In the VSS-SPU adaptive algorithms the filter coefficients are partially updated, which reduces the computational complexity. In VSS-SR-APA, the optimal selection of input regressors is performed during the adaptation. The presented algorithms feature good convergence speed, low steady-state mean square error (MSE), and low computational complexity. We demonstrate the good performance of the proposed algorithms through several simulations in a system identification scenario.
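
    For orientation, one affine projection update (fixed scalar step size; the paper's VSS variants replace it with an MSD-minimizing step-size vector) looks as follows in a toy system identification setting; the channel, projection order, and noise level are illustrative assumptions:

    ```python
    import numpy as np

    def apa_update(w, X, d, mu=0.5, delta=1e-4):
        """One affine projection step. X: N x P matrix of the P most recent
        regressors (columns); d: their desired outputs; delta regularizes."""
        e = d - X.T @ w                               # a-priori error vector
        P = X.shape[1]
        return w + mu * X @ np.linalg.solve(X.T @ X + delta * np.eye(P), e)

    rng = np.random.default_rng(1)
    h = rng.standard_normal(8)                        # unknown 8-tap channel
    x = rng.standard_normal(5000)                     # white input signal
    w, P, N = np.zeros(8), 4, 8
    for k in range(N + P, len(x)):
        X = np.column_stack([x[k - i - N + 1:k - i + 1] for i in range(P)])
        d = X.T @ h + 0.01 * rng.standard_normal(P)   # noisy channel output
        w = apa_update(w, X, d)
    print("misalignment:", np.linalg.norm(w - h) / np.linalg.norm(h))
    ```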

  15. Clustered functional MRI of overt speech production.

    PubMed

    Sörös, Peter; Sokoloff, Lisa Guttman; Bose, Arpita; McIntosh, Anthony R; Graham, Simon J; Stuss, Donald T

    2006-08-01

    To investigate the neural network of overt speech production, event-related fMRI was performed in 9 young healthy adult volunteers. A clustered image acquisition technique was chosen to minimize speech-related movement artifacts. Functional images were acquired during the production of oral movements and of speech of increasing complexity (isolated vowel as well as monosyllabic and trisyllabic utterances). This imaging technique and behavioral task enabled depiction of the articulo-phonologic network of speech production from the supplementary motor area at the cranial end to the red nucleus at the caudal end. Speaking a single vowel and performing simple oral movements involved very similar activation of the cortical and subcortical motor systems. More complex, polysyllabic utterances were associated with additional activation in the bilateral cerebellum, reflecting increased demand on speech motor control, and additional activation in the bilateral temporal cortex, reflecting the stronger involvement of phonologic processing.

  16. High-speed low-complexity video coding with EDiCTius: a DCT coding proposal for JPEG XS

    NASA Astrophysics Data System (ADS)

    Richter, Thomas; Fößel, Siegfried; Keinert, Joachim; Scherl, Christian

    2017-09-01

    At its 71st meeting, the JPEG committee issued a call for low complexity, high speed image coding, designed to address the needs of low-cost video-over-IP applications. As an answer to this call, Fraunhofer IIS and the Computing Center of the University of Stuttgart jointly developed an embedded DCT image codec requiring only minimal resources while maximizing throughput on FPGA and GPU implementations. Objective and subjective tests performed for the 73rd meeting confirmed its excellent performance and suitability for its purpose, and it was selected as one of the two key contributions for the development of a joint test model. In this paper, its authors describe the design principles of the codec, provide a high-level overview of the encoder and decoder chain, and provide evaluation results on the test corpus selected by the JPEG committee.
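
    The heart of any such low-complexity codec is the blockwise DCT followed by quantization; the sketch below shows just that generic core (not the EDiCTius codestream or its entropy coding), with an arbitrary quantizer step:

    ```python
    import numpy as np
    from scipy.fft import dctn, idctn

    def encode_block(block, q=16):
        """8x8 type-II DCT followed by uniform quantization (the lossy step)."""
        return np.round(dctn(block, norm="ortho") / q).astype(int)

    def decode_block(coeffs, q=16):
        """Dequantize and invert the DCT."""
        return idctn(coeffs * q, norm="ortho")

    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, (8, 8)).astype(float)
    rec = decode_block(encode_block(img))
    print("max abs error:", np.abs(img - rec).max())  # scales with q
    ```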

  17. Metasurface Freeform Nanophotonics.

    PubMed

    Zhan, Alan; Colburn, Shane; Dodson, Christopher M; Majumdar, Arka

    2017-05-10

    Freeform optics aims to expand the toolkit of optical elements by allowing for more complex phase geometries beyond rotational symmetry. Complex, asymmetric curvatures are employed to enhance the performance of optical components while minimizing their size. Unfortunately, these high curvatures and complex forms are often difficult to manufacture with current technologies, especially at the micron scale. Metasurfaces are planar sub-wavelength structures that can control the phase, amplitude, and polarization of incident light, and can thereby mimic complex geometric curvatures on a flat, wavelength-scale thick surface. We present a methodology for designing analogues of freeform optics using a silicon nitride based metasurface platform for operation at visible wavelengths. We demonstrate a cubic phase plate with a point spread function exhibiting enhanced depth of field over 300 microns along the optical axis, with potential for performing metasurface-based white light imaging, and an Alvarez lens with a tunable focal length range of over 2.5 mm, corresponding to a change in optical power of ~1600 diopters with 100 microns of total mechanical displacement. The adaptation of freeform optics to a sub-wavelength metasurface platform allows for further miniaturization of optical components and offers a scalable route toward implementing near-arbitrary geometric curvatures in nanophotonics.
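
    For context, the classic Alvarez principle behind that tunable focal length: two complementary cubic phase profiles, displaced laterally by ±δ, sum to a quadratic (lens) phase, so mechanical displacement tunes optical power linearly:

    ```latex
    \phi_{\pm}(x,y) = \pm\,\alpha\!\left(\tfrac{x^{3}}{3} + x y^{2}\right), \qquad
    \phi_{+}(x-\delta,y) + \phi_{-}(x+\delta,y) = -2\alpha\delta\,(x^{2}+y^{2}) + \text{const}
    ```

    This corresponds to a lens of focal length f = k/(4αδ) with k = 2π/λ, which is how 100 microns of displacement can sweep thousands of diopters.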

  18. Automatic classification of minimally invasive instruments based on endoscopic image sequences

    NASA Astrophysics Data System (ADS)

    Speidel, Stefanie; Benzko, Julia; Krappe, Sebastian; Sudra, Gunther; Azad, Pedram; Müller-Stich, Beat Peter; Gutt, Carsten; Dillmann, Rüdiger

    2009-02-01

    Minimally invasive surgery is nowadays a frequently applied technique and can be regarded as a major breakthrough in surgery. The surgeon has to adopt special operation-techniques and deal with difficulties like the complex hand-eye coordination and restricted mobility. To alleviate these constraints we propose to enhance the surgeon's capabilities by providing a context-aware assistance using augmented reality techniques. To analyze the current situation for context-aware assistance, we need intraoperatively gained sensor data and a model of the intervention. A situation consists of information about the performed activity, the used instruments, the surgical objects, the anatomical structures and defines the state of an intervention for a given moment in time. The endoscopic images provide a rich source of information which can be used for an image-based analysis. Different visual cues are observed in order to perform an image-based analysis with the objective to gain as much information as possible about the current situation. An important visual cue is the automatic recognition of the instruments which appear in the scene. In this paper we present the classification of minimally invasive instruments using the endoscopic images. The instruments are not modified by markers. The system segments the instruments in the current image and recognizes the instrument type based on three-dimensional instrument models.

  19. Analysis of Factors Influencing Building Refurbishment Project Performance

    NASA Astrophysics Data System (ADS)

    Ishak, Nurfadzillah; Aswad Ibrahim, Fazdliel; Azizi Azizan, Muhammad

    2018-03-01

    Presently, the refurbishment approach has become favourable as it creates opportunities to incorporate sustainable value with other building improvements. This approach needs to be implemented due to the overwhelming ratio of existing buildings to new construction, which also contributes to environmental problems. Refurbishment principles aim to minimize the environmental impact and to upgrade the performance of an existing building to meet new requirements. In theory, a building project's performance has a direct bearing on its potential for project success. In refurbishment building projects, however, the measurement criteria become wider, because the projects are complex and multi-dimensional, encompassing many factors that reflect the nature of the works. This aim could therefore be achieved by examining the direct empirical relationship between critical success factors (CSFs) and complexity factors (CFs) in managing the project in relation to delivering success on project performance. The research findings are expected to serve as the basis of future research to establish an appropriate framework that provides information on managing refurbishment building projects and enhances project management competency for a better built environment.

  20. Improving Minimally Invasive Adrenalectomy: Selection of Optimal Approach and Comparison of Outcomes.

    PubMed

    Lairmore, Terry C; Folek, Jessica; Govednik, Cara M; Snyder, Samuel K

    2016-07-01

    Minimally invasive adrenalectomy is commonly performed by either a transperitoneal laparoscopic (TLA) or posterior retroperitoneoscopic (PRA) approach. Our group described the technique for robot-assisted PRA (RAPRA) in 2010. Few studies are available that directly compare outcomes between the available operative approaches. We reviewed our results for minimally invasive adrenalectomy using the three different approaches over a 10-year period. Between January 2005 and April 2015, 160 minimally invasive adrenalectomies were performed. Clinicopathologic data were prospectively collected and retrospectively analyzed. The primary endpoints evaluated were operative time, blood loss, length of stay (LOS), and morbidity. The study included 67 TLA, 76 PRA, and 17 RAPRA procedures. Tumor size for PRA/RAPRA was smaller than for patients undergoing TLA (2.38 vs 3.6 cm, p ≤ 0.0001). Procedure time was shorter for PRA versus TLA (133.3 vs 152.8 min, p = 0.0381), as was LOS (1.85 vs 2.82 days, p = 0.0145). Procedure time was longer in RAPRA versus TLA/PRA (177 vs 153/133 min, p = 0.008), but LOS was significantly decreased (1.53 vs 2.82/1.85 days, p = 0.004). Minimally invasive adrenalectomy is associated with expected excellent outcomes regardless of approach. In our series, the posterior approach is associated with decreased operative time and LOS. Robotic technology provides potential advantages for the surgeon at the expense of more complex setup requirements and costs. Further study is required to demonstrate clear benefit of one surgical approach. Utilization of the entire spectrum of available operative techniques can allow for selection of the optimal approach based on individual patient factors.

  1. Complex Recanalization of Chronic Total Occlusion Supported by Minimal Extracorporeal Circulation in a Patient with an Aortic Valve Bioprosthesis in Extraanatomic Position

    PubMed Central

    Jansen, Ruben; Bathgate, Brigitte; Bufe, Alexander

    2018-01-01

    Percutaneous coronary intervention (PCI) of a chronic total occlusion (CTO) still remains a major challenge in interventional cardiology. This case describes a complex PCI of the left main coronary artery and of a CTO of the right coronary artery using a minimal extracorporeal circulation system (MECC) in a patient with an aortic valve bioprosthesis in extraanatomic position. It illustrates that complex recanalization strategies can succeed when combined with mechanical circulatory support technologies. PMID:29850264

  2. Systematic Review of the Feasibility and Advantage of Minimally Invasive Pancreaticoduodenectomy.

    PubMed

    Liao, Chien-Hung; Wu, Yu-Tung; Liu, Yu-Yin; Wang, Shang-Yu; Kang, Shih-Ching; Yeh, Chun-Nan; Yeh, Ta-Sen

    2016-05-01

    Minimally invasive pancreaticoduodenectomy (MIPD), which includes laparoscopic pancreaticoduodenectomy (LPD) and robotic pancreaticoduodenectomy (RPD), is a complex procedure that needs to be performed by experienced surgeons. However, its safety and oncologic performance have not yet been conclusively determined. A systematic literature search was performed using the Embase, Medline, and PubMed databases to identify all studies published up to March 2015. Articles written in English containing the keywords "pancreaticoduodenectomy" or "Whipple operation" combined with "laparoscopy," "laparoscopic," "robotic," "da Vinci," or "minimally invasive surgery" were selected. Furthermore, to increase the power of evidence, articles describing more than ten MIPDs were selected for this review. Twenty-six articles matched the review criteria. A total of 780 LPDs and 248 RPDs were included in the current review. The overall conversion rate to open surgery was 9.1 %. The weighted average operative time was 422.6 min, and the weighted average blood loss was 321.1 mL. The weighted average number of harvested lymph nodes was 17.1, and the rate of microscopically positive tumor margins was 8.4 %. The cumulative morbidity was 35.9 %, and a pancreatic fistula was reported in 17.0 % of cases. The average length of hospital stay was 12.4 days, and the mortality rate was 2.2 %. In conclusion, after reviewing one thousand cases in the current literature, we conclude that MIPD offers good perioperative, postoperative, and oncologic outcomes. MIPD is feasible and safe in well-selected patients.

  3. Direct comparison of the performance of a bio-inspired synthetic nickel catalyst and a [NiFe]-hydrogenase, both covalently attached to electrodes.

    PubMed

    Rodriguez-Maciá, Patricia; Dutta, Arnab; Lubitz, Wolfgang; Shaw, Wendy J; Rüdiger, Olaf

    2015-10-12

    The active site of hydrogenases has been a source of inspiration for the development of molecular catalysts. However, direct comparisons between molecular catalysts and enzymes have not been possible because different techniques are used to evaluate the two types of catalysts, limiting our ability to determine how far we have come in mimicking the enzymatic performance. The catalytic properties of the [Ni(PCy2NGly2)2]2+ complex and of the [NiFe]-hydrogenase from Desulfovibrio vulgaris, both immobilized on functionalized electrodes, were compared under identical conditions. At pH 7, the enzyme shows higher activity and lower overpotential with better stability, while at low pH, the molecular catalyst outperforms the enzyme in all respects. This is the first direct comparison of enzymes and molecular complexes, enabling a unique understanding of the benefits and detriments of both systems, and advancing our understanding of the utilization of these bio-inspired complexes in fuel cells. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Resource-Competing Oscillator Network as a Model of Amoeba-Based Neurocomputer

    NASA Astrophysics Data System (ADS)

    Aono, Masashi; Hirata, Yoshito; Hara, Masahiko; Aihara, Kazuyuki

    An amoeboid organism, Physarum, exhibits rich spatiotemporal oscillatory behavior and various computational capabilities. Previously, the authors created a recurrent neurocomputer incorporating the amoeba as a computing substrate to solve optimization problems. In this paper, considering the amoeba to be a network of oscillators coupled such that they compete for constant amounts of resources, we present a model of the amoeba-based neurocomputer. The model generates a number of oscillation modes and produces not only simple behavior that stabilizes a single mode but also complex behavior that spontaneously switches among different modes, reproducing well the experimentally observed behavior of the amoeba. To explore the significance of the complex behavior, we pose a test problem for comparing the computational performance of the oscillation modes. The problem is an optimization problem of how to allocate a limited amount of resource to oscillators such that conflicts among them are minimized. We show that the complex behavior attains a wider variety of solutions to the problem and produces better performance than the simple behavior.
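
    The resource-competition coupling described above can be made concrete with a toy simulation. The sketch below (Python; the dynamics, parameters, and renormalization rule are illustrative assumptions, not the authors' published equations) couples N phase oscillators purely through a fixed resource budget: each oscillator's share is renormalized every step so the total stays constant, so one oscillator's gain is necessarily another's loss.

    ```python
    import numpy as np

    # Toy resource-competing oscillator network (hypothetical, illustrative model).
    N, R_total, dt, steps = 8, 1.0, 0.01, 5000
    rng = np.random.default_rng(0)

    phase = rng.uniform(0, 2 * np.pi, N)   # oscillator phases
    share = np.full(N, R_total / N)        # resource share per oscillator
    omega = rng.uniform(0.9, 1.1, N)       # natural frequencies

    for _ in range(steps):
        # Each oscillator demands resources in proportion to its activity...
        demand = share * (1.0 + 0.5 * np.sin(phase))
        # ...but shares are renormalized so the total budget is conserved,
        # which couples every oscillator to every other one.
        share = R_total * demand / demand.sum()
        phase += dt * omega

    print("final resource shares:", np.round(share, 3))
    ```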

  5. Coevolution at protein complex interfaces can be detected by the complementarity trace with important impact for predictive docking

    PubMed Central

    Madaoui, Hocine; Guerois, Raphaël

    2008-01-01

    Protein surfaces are under significant selection pressure to maintain interactions with their partners throughout evolution. Capturing how selection pressure acts at the interfaces of protein–protein complexes is a fundamental issue with high interest for the structural prediction of macromolecular assemblies. We tackled this issue under the assumption that, throughout evolution, mutations should minimally disrupt the physicochemical compatibility between specific clusters of interacting residues. This constraint drove the development of the so-called Surface COmplementarity Trace in Complex History score (SCOTCH), which was found to discriminate with high efficiency the structure of biological complexes. SCOTCH performances were assessed not only with respect to other evolution-based approaches, such as conservation and coevolution analyses, but also with respect to statistically based scoring methods. Validated on a set of 129 complexes of known structure exhibiting both permanent and transient intermolecular interactions, SCOTCH appears as a robust strategy to guide the prediction of protein–protein complex structures. Of particular interest, it also provides a basic framework to efficiently track how protein surfaces could evolve while keeping their partners in contact. PMID:18511568

  6. Molecular identification and in vitro antifungal susceptibility of Scedosporium complex isolates from high-human-activity sites in Mexico.

    PubMed

    Elizondo-Zertuche, Mariana; de J Treviño-Rangel, Rogelio; Robledo-Leal, Efrén; Luna-Rodríguez, Carolina E; Martínez-Fierro, Margarita L; Rodríguez-Sánchez, Iram P; González, Gloria M

    2017-01-01

    The genus Scedosporium is a complex of ubiquitous moulds associated with a wide spectrum of clinical entities and high mortality, principally in immunocompromised hosts. The ecology of these microorganisms has been studied by performing isolations from environmental sources, showing a preference for human-impacted environments. This study aimed to evaluate the presence and antifungal susceptibility of Scedosporium complex species in soil samples collected at high-human-activity sites in Mexico. A total of 97 soil samples from 25 Mexican states were collected. Identifications were performed by microscopic morphology and confirmed by sequencing of the rDNA (internal transcribed spacer [ITS], D1/D2) and β-tubulin partial loci. Antifungal susceptibility testing was performed according to Clinical and Laboratory Standards Institute (CLSI) protocols. Soil samples from urban gardens and industrial parks constituted the best sources for isolation of Scedosporium complex species. S. apiospermum sensu stricto was the most prevalent species (69%), followed by S. boydii (16%). Voriconazole (minimal inhibitory concentration [MIC] geometric mean ≤2.08 µg/mL), followed by posaconazole (MIC geometric mean ≤2.64 µg/mL), exhibited excellent in vitro activity against most species. Amphotericin B and fluconazole demonstrated limited antifungal activity, and all of the strains were resistant to echinocandins. This is the first report in Mexico of the environmental distribution and in vitro antifungal susceptibility of these emergent pathogens.

  7. Direct Comparison of the Performance of a Bio-inspired Synthetic Nickel Catalyst and a [NiFe]-Hydrogenase, Both Covalently Attached to Electrodes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodriguez-Macia, Patricia; Dutta, Arnab; Lubitz, Wolfgang

    2015-10-12

    The active site of hydrogenases has been a source of inspiration for the development of molecular catalysts. However, direct comparisons between molecular catalysts and enzymes have not been possible because different techniques are used to evaluate the two types of catalysts, limiting our ability to determine how far we have come in mimicking the impressive enzymatic performance. Here we directly compare the catalytic properties of the [Ni(PCy2NGly2)2]2+ complex with those of the [NiFe]-hydrogenase from Desulfovibrio vulgaris Miyazaki F (DvMF), both immobilized on a functionalized electrode, under identical conditions. At pH 7, the enzyme has higher performance in both activity and overpotential, and is more stable, while at low pH, the molecular catalyst outperforms the enzyme in all respects. The Ni complex also has increased tolerance to CO. This is the first direct comparison of enzymes and molecular complexes, enabling a unique understanding of the benefits and detriments of both systems, and advancing our understanding of the utilization of these bioinspired complexes in fuel cells. AD and WJS acknowledge the Office of Science Early Career Research Program through the US Department of Energy (US DOE), Office of Science, Office of Basic Energy Sciences (BES), and Pacific Northwest National Laboratory (PNNL). PNNL is operated by Battelle for the US DOE.

  8. Performance study of large area encoding readout MRPC

    NASA Astrophysics Data System (ADS)

    Chen, X. L.; Wang, Y.; Chen, G.; Han, D.; Wang, X.; Zeng, M.; Zeng, Z.; Zhao, Z.; Guo, B.

    2018-02-01

    A muon tomography system based on high-spatial-resolution Multi-gap Resistive Plate Chamber (MRPC) detectors with 2-D readout is being developed at Tsinghua University. An encoding readout method based on the fine-fine configuration has been used to minimize the number of readout electronics channels, reducing both the complexity and the cost of the system. In this paper, we provide a systematic comparison of the MRPC detector performance with and without fine-fine encoding readout. Our results suggest that the fine-fine encoding readout yields a detection system with slightly worse spatial resolution but a dramatically reduced number of electronics channels.

  9. Performance of the Sleep-Mode Mechanism of the New IEEE 802.16m Proposal for Correlated Downlink Traffic

    NASA Astrophysics Data System (ADS)

    de Turck, Koen; de Vuyst, Stijn; Fiems, Dieter; Wittevrongel, Sabine; Bruneel, Herwig

    There is considerable interest nowadays in making wireless telecommunication more energy-efficient. The sleep-mode mechanism in WiMAX (IEEE 802.16e) is one such energy-saving measure. Recently, Samsung proposed some modifications of the sleep-mode mechanism, scheduled to appear in the forthcoming IEEE 802.16m standard, aimed at minimizing the signaling overhead. In this work, we present a performance analysis of this proposal and clarify the differences with the standard mechanism included in IEEE 802.16e. We also propose some special algorithms aimed at reducing the computational complexity of the analysis.
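
    For readers unfamiliar with the baseline mechanism being modified, the sketch below simulates an 802.16e-style sleep mode (a power saving class with binary-exponential sleep windows). All parameter values are illustrative, and the 802.16m modifications analyzed in the paper are not modeled here.

    ```python
    import random

    def sleep_mode(T_init=2, T_max=32, listen=1, p_arrival=0.05, frames=10000):
        """Toy 802.16e-style sleep mode: the sleep window doubles until traffic
        arrives during a listening window. Returns the fraction of frames spent
        asleep and the mean wake-up delay of buffered downlink packets."""
        random.seed(1)
        asleep = delay_sum = arrivals = t = 0
        T = T_init
        while t < frames:
            pending = []                       # packets arriving while asleep
            for i in range(T):
                if random.random() < p_arrival:
                    pending.append(t + i)
            asleep += T
            t += T
            t += listen                        # listening window: check traffic
            if pending:
                arrivals += len(pending)
                delay_sum += sum(t - a for a in pending)
                T = T_init                     # traffic -> reset sleep window
            else:
                T = min(2 * T, T_max)          # no traffic -> double the window
        return asleep / t, (delay_sum / arrivals if arrivals else 0.0)

    print(sleep_mode())   # (fraction of time asleep, mean packet delay in frames)
    ```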

  10. Design Considerations for a Computationally-Lightweight Authentication Mechanism for Passive RFID Tags

    DTIC Science & Technology

    2009-09-01

    suffer the power and complexity requirements of a public key system. In [18], a simulation of the SHA-1 algorithm is performed on a Xilinx FPGA ... 256 bits. Thus, the construction of a hash table would need 2^512 independent comparisons. It is known that hash collisions of the SHA-1 algorithm ... SHA-1 algorithm for small-core FPGA design. Small-core FPGA design is the process by which a circuit is adapted to use the minimal amount of logic

  11. Creep Behavior of Oxide/Oxide Composites with Monazite Fiber Coating at 1100 deg C in Air and in Steam Environments

    DTIC Science & Technology

    2008-09-01

    monolithic ceramics initiates at small defects formed during processing. Minimization of such defects may improve performance, but thermal shock and cyclic ... fiber tows are used in CMCs, where the use of small-diameter fibers causes a reduction in scale of microstructural defects associated with the fibers [7] ... Small Diameter · Improves matrix strength and facilitates fabrication of thin and complex-shaped CMCs. · Low Density · Improves CMC specific properties

  12. Family medicine outpatient encounters are more complex than those of cardiology and psychiatry.

    PubMed

    Katerndahl, David; Wood, Robert; Jaén, Carlos Roberto

    2011-01-01

    Comparison studies suggest that the guideline-concordant care provided for specific medical conditions is less optimal in primary care than in cardiology and psychiatry settings. The purpose of this study is to estimate the relative complexity of patient encounters in general/family practice, cardiology, and psychiatry settings. A secondary analysis of the 2000 National Ambulatory Medical Care Survey data for ambulatory patients seen in general/family practice, cardiology, and psychiatry settings was performed. The complexity for each variable was estimated as the quantity weighted by variability and diversity. There is minimal difference in the unadjusted input and total encounter complexity of general/family practice and cardiology; psychiatry's input is less complex. Cardiology encounters involved more input quantitatively, but the diversity of general/family practice input eliminated the difference. Cardiology also involved more complex output. However, when the duration of the visit is factored in, the complexity of care provided per hour in general/family practice is 33% greater than in cardiology and five times greater than in psychiatry. Care during family physician visits is more complex per hour than care during visits to cardiologists or psychiatrists. This may account for a lower rate of completion of process items measured for quality of care.

  13. Mononuclear nickel (II) and copper (II) coordination complexes supported by bispicen ligand derivatives: Experimental and computational studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Nirupama; Niklas, Jens; Poluektov, Oleg

    2017-01-01

    The synthesis, characterization and density functional theory calculations of mononuclear Ni and Cu complexes supported by the N,N’-dimethyl-N,N’-bis-(pyridine-2-ylmethyl)-1,2-diaminoethane ligand and its derivatives are reported. The complexes were characterized by X-ray crystallography as well as by UV-visible absorption spectroscopy and EPR spectroscopy. The solid-state structures of these coordination complexes revealed that the geometry of each complex depended on the identity of the metal center. Solution-phase characterization data are in accord with the solid-state structures, indicating minimal structural changes in solution. Optical spectroscopy revealed that all of the complexes exhibit color owing to d-d transition bands in the visible region. Magnetic parameters obtained from EPR spectroscopy, together with other structural data, suggest that the Ni(II) complexes adopt a pseudo-octahedral geometry and the Cu(II) complexes a distorted square-pyramidal geometry. In order to understand in detail how ligand sterics and electronics affect complex topology, detailed computational studies were performed. The series of complexes reported in this article will add significant value to the field of coordination chemistry, as Ni(II) and Cu(II) complexes supported by tetradentate pyridyl-based ligands are rather scarce.

  14. Emergent Embolization of a Very Late Detected Pseudoaneurysm at a Lower Pole Subsegmental Artery of the Kidney after Clampless Laparoscopic Partial Nephrectomy

    PubMed Central

    Chiancone, Francesco; Fedelini, Maurizio; Pucci, Luigi; Di Lorenzo, Domenico; Meccariello, Clemente; Fedelini, Paolo

    2017-01-01

    Renal artery pseudoaneurysm is a rare but life-threatening condition. Its incidence is higher after minimally invasive partial nephrectomy (PN) than after the open approach. We report a case of a renal artery pseudoaneurysm that occurred about four months after a clampless laparoscopic PN. A 49-year-old female underwent a clampless laparoscopic PN for a right renal tumor with high surgical complexity. The patient experienced intraoperative blood loss from the renal bed, and the surgeons performed a deep medullary absorbable suture. Three months after surgery the patient underwent renal ultrasonography with good results. The patient came to our emergency department 115 days after surgery with stage 3 hypovolemic shock. Her CT scan showed a pseudoaneurysm of a lower pole vessel of the right kidney. She underwent superselective embolization of the segmental renal artery. The surgical complexity of the tumor, the anatomical relationship with the renal sinus and the deep medullary suture could be responsible for the development of the pseudoaneurysm. The authors present an unusual case of a very late detected pseudoaneurysm of a renal vessel, suggesting that all very complex renal tumors removed with a minimally invasive technique should be followed up closely, at least during the first six months, in order to detect this major complication early. PMID:28785196

  15. High Performance Analytics with the R3-Cache

    NASA Astrophysics Data System (ADS)

    Eavis, Todd; Sayeed, Ruhan

    Contemporary data warehouses now represent some of the world’s largest databases. As these systems grow in size and complexity, however, it becomes increasingly difficult for brute force query processing approaches to meet the performance demands of end users. Certainly, improved indexing and more selective view materialization are helpful in this regard. Nevertheless, with warehouses moving into the multi-terabyte range, it is clear that the minimization of external memory accesses must be a primary performance objective. In this paper, we describe the R3-cache, a natively multi-dimensional caching framework designed specifically to support sophisticated warehouse/OLAP environments. The R3-cache is based upon an in-memory version of the R-tree that has been extended to support buffer pages rather than disk blocks. A key strength of the R3-cache is that it is able to utilize multi-dimensional fragments of previous query results so as to significantly minimize the frequency and scale of disk accesses. Moreover, the new caching model directly accommodates the standard relational storage model and provides mechanisms for pro-active updates that exploit the existence of query “hot spots”. The current prototype has been evaluated as a component of the Sidera DBMS, a “shared nothing” parallel OLAP server designed for multi-terabyte analytics. Experimental results demonstrate significant performance improvements relative to simpler alternatives.
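
    The fragment-reuse idea is easy to prototype. The sketch below (Python; the class, names, and flat-list index are illustrative stand-ins for the paper's in-memory R-tree, not the Sidera/R3-cache implementation) caches each finished query result with its bounding box and answers later range queries from every overlapping cached fragment.

    ```python
    # Toy multidimensional result cache: a flat list stands in for the R-tree.
    def overlaps(a, b):
        """Axis-aligned overlap test for boxes given as [(lo, hi), ...]."""
        return all(al <= bh and bl <= ah for (al, ah), (bl, bh) in zip(a, b))

    class FragmentCache:
        def __init__(self):
            self.fragments = []                      # list of (box, rows) pairs

        def insert(self, box, rows):
            self.fragments.append((box, rows))       # cache a finished result

        def probe(self, query_box):
            """Collect cached rows that fall inside the query box; in a real
            system only the uncovered remainder would be fetched from disk."""
            hits = []
            for box, rows in self.fragments:
                if overlaps(box, query_box):
                    hits.extend(r for r in rows
                                if all(lo <= v <= hi
                                       for v, (lo, hi) in zip(r, query_box)))
            return hits

    cache = FragmentCache()
    cache.insert(((0, 5), (0, 5)), [(1, 1), (4, 4)])   # an earlier query result
    print(cache.probe(((3, 10), (3, 10))))             # reuses the overlap: [(4, 4)]
    ```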

  16. Fundamental Factors Impacting the Stability of Phosphonate-Derivatized Ruthenium Polypyridyl Sensitizers Adsorbed on Metal Oxide Surfaces.

    PubMed

    Raber, McKenzie; Brady, Matthew David; Troian-Gautier, Ludovic; Dickenson, John; Marquard, Seth L; Hyde, Jacob; Lopez, Santiago; Meyer, Gerald J; Meyer, Thomas J; Harrison, Daniel P

    2018-06-08

    A series of 18 ruthenium(II) polypyridyl complexes were synthesized and evaluated under electrochemically oxidative conditions, which generate the Ru(III) oxidation state and mimic the harsh conditions experienced during the kinetically limited regime that can occur in dye-sensitized solar cells (DSSCs) and dye-sensitized photoelectrosynthesis cells (DSPECs), to further develop fundamental insights into the factors governing molecular sensitizer surface stability in aqueous 0.1 M HClO4. Both desorption and oxidatively induced ligand substitution were observed on planar fluorine-doped tin oxide (FTO) electrodes, with the E1/2 Ru(III/II) redox potential dictating the comparative ratios of the two processes. Complexes such as RuP4OMe (E1/2 = 0.91 V vs Ag/AgCl) displayed virtually only desorption, while complexes such as RuPbpz (E1/2 > 1.62 V vs Ag/AgCl) displayed only chemical decomposition. Comparing isomers of 4,4'- and 5,5'-disubstituted-2,2'-bipyridine ancillary polypyridyl ligands, a dramatic increase in the rate of desorption of the Ru(III) complexes was observed for the 5,5'-ligands. Nanoscopic indium-doped tin oxide thin films (nanoITO) were also sensitized and analyzed with cyclic voltammetry, UV-Vis absorption spectroscopy, and XPS, allowing for further distinction of desorption versus ligand substitution processes. Desorption loss to bulk solution associated with the planar surface of FTO is essentially non-existent on nanoITO, where both desorption and ligand substitution are shut down with RuP4OMe. These results revealed that minimizing time spent in the oxidized form, incorporating electron-donating groups, maximizing hydrophobicity, and minimizing molecular bulk near the adsorbed ligand are critical to optimizing the performance of ruthenium(II) polypyridyl complexes in dye-sensitized solar cell devices.

  17. Integrated immunoassay using tuneable surface acoustic waves and lensfree detection.

    PubMed

    Bourquin, Yannyk; Reboud, Julien; Wilson, Rab; Zhang, Yi; Cooper, Jonathan M

    2011-08-21

    The diagnosis of infectious diseases in the Developing World is technologically challenging, requiring complex biological assays with high analytical performance at minimal cost. Using an opto-acoustic immunoassay technology that integrates components commonly used in mobile phones, including surface acoustic wave (SAW) transducers to provide pressure-driven flow and a CMOS camera to enable a lensfree detection technique, we demonstrate the potential to produce such an assay. To achieve this, antibody-functionalised microparticles were manipulated on a low-cost disposable cartridge using the surface acoustic waves and were then detected optically. Our results show that the biomarker interferon-γ, used for the diagnosis of diseases such as latent tuberculosis, can be detected at pM concentrations within a few minutes, giving high sensitivity at minimal cost. This journal is © The Royal Society of Chemistry 2011

  18. Percutaneous Isolated Hepatic Perfusion for the Treatment of Unresectable Liver Malignancies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burgmans, Mark C., E-mail: m.c.burgmans@lumc.nl; Leede, Eleonora M. de, E-mail: e.m.de-leede@lumc.nl; Martini, Christian H., E-mail: c.h.martini@lumc.nl

    2016-06-15

    Liver malignancies are a major burden of disease worldwide. The long-term prognosis for patients with unresectable tumors remains poor, despite advances in systemic chemotherapy, targeted agents, and minimally invasive therapies such as ablation, chemoembolization, and radioembolization. Thus, the demand for new and better treatments for malignant liver tumors remains high. Surgical isolated hepatic perfusion (IHP) has been shown to be effective in patients with various hepatic malignancies, but it is complex, associated with high complication rates and not repeatable. Percutaneous isolated hepatic perfusion (PHP) is a novel minimally invasive, repeatable, and safer alternative to IHP. PHP is rapidly gaining interest, and the number of procedures performed in Europe now exceeds 200. This review discusses the indications, technique and patient management of PHP and provides an overview of the available data.

  19. Consensus guidelines on plasma cell myeloma minimal residual disease analysis and reporting.

    PubMed

    Arroz, Maria; Came, Neil; Lin, Pei; Chen, Weina; Yuan, Constance; Lagoo, Anand; Monreal, Mariela; de Tute, Ruth; Vergilio, Jo-Anne; Rawstron, Andy C; Paiva, Bruno

    2016-01-01

    Major heterogeneity between laboratories in flow cytometry (FC) minimal residual disease (MRD) testing in multiple myeloma (MM) must be overcome. Cytometry societies such as the International Clinical Cytometry Society and the European Society for Clinical Cell Analysis recognize a strong need to establish minimally acceptable requirements and recommendations for performing such complex testing. A group of 11 flow cytometrists currently performing FC testing in MM using different instrumentation, panel designs (≥6-color) and analysis software compared the procedures between their respective laboratories and reviewed the literature to propose a consensus guideline on flow-MRD analysis and reporting in MM. The consensus guidelines support i) the use of a minimum of five initial gating parameters (CD38, CD138, CD45, forward, and sideward light scatter) within the same aliquot for accurate identification of the total plasma cell compartment; ii) the analysis of potentially aberrant phenotypic markers, reporting the antigen expression pattern on neoplastic plasma cells as reduced, normal or increased relative to a normal reference plasma cell immunophenotype (obtained using the same instrument and parameters); and iii) reporting the percentage of total bone marrow plasma cells plus the percentages of both normal and neoplastic plasma cells within the total bone marrow plasma cell compartment, and over total bone marrow cells. Current and future MRD analyses should target a lower limit of detection of 0.001%, and ideally a limit of quantification of 0.001%, which require at least 3 × 10^6 and 5 × 10^6 bone marrow cells to be measured, respectively. © 2015 International Clinical Cytometry Society.

  20. Comparison of sorting algorithms to increase the range of Hartmann-Shack aberrometry.

    PubMed

    Bedggood, Phillip; Metha, Andrew

    2010-01-01

    Recently many software-based approaches have been suggested for improving the range and accuracy of Hartmann-Shack aberrometry. We compare the performance of four representative algorithms, with a focus on aberrometry for the human eye. Algorithms vary in complexity from the simplistic traditional approach to iterative spline extrapolation based on prior spot measurements. Range is assessed for a variety of aberration types in isolation using computer modeling, and also for complex wavefront shapes using a real adaptive optics system. The effects of common sources of error for ocular wavefront sensing are explored. The results show that the simplest possible iterative algorithm produces comparable range and robustness compared to the more complicated algorithms, while keeping processing time minimal to afford real-time analysis.
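
    To make the comparison concrete, here is a minimal sketch of the simplest spot-sorting step: each detected centroid is assigned to the nearest lenslet reference position within a search radius. This is an illustrative reading of the baseline approach, not the authors' code; the iterative variants discussed above re-centre the search using previously matched spots.

    ```python
    import numpy as np

    def sort_spots(spots, refs, search_radius):
        """Assign detected spot centroids to lenslet reference positions by
        nearest-neighbour search within a radius; unmatched lenslets get NaN."""
        spots = np.asarray(spots, dtype=float)
        assigned = np.full((len(refs), 2), np.nan)
        for i, ref in enumerate(refs):
            d = np.linalg.norm(spots - ref, axis=1)   # distance to every spot
            j = int(np.argmin(d))
            if d[j] < search_radius:                  # accept only nearby spots
                assigned[i] = spots[j]
        return assigned

    refs = np.array([[0, 0], [10, 0], [0, 10], [10, 10]], dtype=float)
    spots = np.array([[0.6, -0.2], [10.3, 0.4], [9.7, 10.2]])
    print(sort_spots(spots, refs, search_radius=2.0))  # lenslet (0, 10) unmatched
    ```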

  1. Comparison of sorting algorithms to increase the range of Hartmann-Shack aberrometry

    NASA Astrophysics Data System (ADS)

    Bedggood, Phillip; Metha, Andrew

    2010-11-01

    Recently many software-based approaches have been suggested for improving the range and accuracy of Hartmann-Shack aberrometry. We compare the performance of four representative algorithms, with a focus on aberrometry for the human eye. Algorithms vary in complexity from the simplistic traditional approach to iterative spline extrapolation based on prior spot measurements. Range is assessed for a variety of aberration types in isolation using computer modeling, and also for complex wavefront shapes using a real adaptive optics system. The effects of common sources of error for ocular wavefront sensing are explored. The results show that the simplest possible iterative algorithm produces comparable range and robustness compared to the more complicated algorithms, while keeping processing time minimal to afford real-time analysis.

  2. BFEE: A User-Friendly Graphical Interface Facilitating Absolute Binding Free-Energy Calculations.

    PubMed

    Fu, Haohao; Gumbart, James C; Chen, Haochuan; Shao, Xueguang; Cai, Wensheng; Chipot, Christophe

    2018-03-26

    Quantifying protein-ligand binding has attracted the attention of both theorists and experimentalists for decades. Many methods for estimating binding free energies in silico have been reported in recent years. Proper use of the proposed strategies requires, however, adequate knowledge of the protein-ligand complex, the mathematical background for deriving the underlying theory, and time for setting up the simulations, bookkeeping, and postprocessing. Here, to minimize human intervention, we propose a toolkit aimed at facilitating the accurate estimation of standard binding free energies using a geometrical route, coined the binding free-energy estimator (BFEE), and introduce it as a plug-in for the popular visualization program VMD. Benefiting from recent developments in new collective variables, BFEE can be used to generate the simulation input files based solely on the structure of the complex. Once the simulations are completed, BFEE can also be utilized to perform the post-treatment of the free-energy calculations, allowing the absolute binding free energy to be estimated directly from the one-dimensional potentials of mean force in the simulation outputs. The minimal amount of human intervention required during the whole process, combined with the ergonomic graphical interface, makes BFEE a very effective and practical tool for the end-user.
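
    The post-treatment step amounts to bookkeeping: summing the free-energy contributions extracted from the individual 1D potentials of mean force and adding a standard-state correction. The sketch below is a heavily simplified illustration of that assembly; the contribution names, signs, and values are hypothetical placeholders, not BFEE's actual output format or the geometrical route's exact bookkeeping.

    ```python
    import math

    kT = 0.593                        # kcal/mol at ~298 K
    contrib = {                       # hypothetical per-PMF free-energy terms
        "restrain_conformation_site": 2.1,
        "restrain_orientation_site": 1.4,
        "separation_pmf": -12.6,      # reversible work of moving ligand to bulk
        "release_orientation_bulk": -1.1,
        "release_conformation_bulk": -0.9,
    }
    # Standard-state correction for the unbound volume V sampled in bulk;
    # V0 = 1661 A^3 corresponds to the 1 M standard concentration.
    V, V0 = 1800.0, 1661.0
    dG_standard_state = -kT * math.log(V / V0)

    dG_bind = sum(contrib.values()) + dG_standard_state
    print(f"estimated standard binding free energy: {dG_bind:.1f} kcal/mol")
    ```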

  3. Impact of Cognitive Abilities and Prior Knowledge on Complex Problem Solving Performance – Empirical Results and a Plea for Ecologically Valid Microworlds

    PubMed Central

    Süß, Heinz-Martin; Kretzschmar, André

    2018-01-01

    The original aim of complex problem solving (CPS) research was to bring the cognitive demands of complex real-life problems into the lab in order to investigate problem solving behavior and performance under controlled conditions. Up until now, the validity of psychometric intelligence constructs has been scrutinized with regard to its importance for CPS performance. At the same time, different CPS measurement approaches competing for the title of the best way to assess CPS have been developed. In the first part of the paper, we investigate the predictability of CPS performance on the basis of the Berlin Intelligence Structure Model and Cattell’s investment theory as well as an elaborated knowledge taxonomy. In the first study, 137 students managed a simulated shirt factory (Tailorshop; i.e., a complex real-life-oriented system) twice, while in the second study, 152 students completed a forestry scenario (FSYS; i.e., a complex artificial-world system). The results indicate that reasoning – specifically numerical reasoning (Studies 1 and 2) and figural reasoning (Study 2) – is the only relevant predictor among the intelligence constructs. We discuss the results with reference to the Brunswik symmetry principle. Path models suggest that reasoning and prior knowledge influence problem solving performance in the Tailorshop scenario mainly indirectly. In addition, different types of system-specific knowledge independently contribute to predicting CPS performance. The results of Study 2 indicate that working memory capacity, assessed as an additional predictor, has no incremental validity beyond reasoning. We conclude that (1) cognitive abilities and prior knowledge are substantial predictors of CPS performance, and (2) in contrast to former and recent interpretations, there is insufficient evidence to consider CPS a unique ability construct. In the second part of the paper, we discuss our results in light of recent CPS research, which predominantly utilizes the minimally complex systems (MCS) measurement approach. We suggest ecologically valid microworlds as an indispensable tool for future CPS research and applications. PMID:29867627

  4. Stretchable ultrasonic transducer arrays for three-dimensional imaging on complex surfaces

    PubMed Central

    Zhu, Xuan; Li, Xiaoshi; Chen, Zeyu; Chen, Yimu; Lei, Yusheng; Li, Yang; Nomoto, Akihiro; Zhou, Qifa; di Scalea, Francesco Lanza

    2018-01-01

    Ultrasonic imaging has been implemented as a powerful tool for noninvasive subsurface inspections of both structural and biological media. Current ultrasound probes are rigid and bulky and cannot readily image through nonplanar three-dimensional (3D) surfaces. However, imaging through these complicated surfaces is vital because stress concentrations at geometrical discontinuities render these surfaces highly prone to defects. This study reports a stretchable ultrasound probe that can conform to and detect nonplanar complex surfaces. The probe consists of a 10 × 10 array of piezoelectric transducers that exploit an “island-bridge” layout with multilayer electrodes, encapsulated by thin and compliant silicone elastomers. The stretchable probe shows excellent electromechanical coupling, minimal cross-talk, and more than 50% stretchability. Its performance is demonstrated by reconstructing defects in 3D space with high spatial resolution through flat, concave, and convex surfaces. The results hold great implications for applications of ultrasound that require imaging through complex surfaces. PMID:29740603

  5. Optimization of Highway Work Zone Decisions Considering Short-Term and Long-Term Impacts

    DTIC Science & Technology

    2010-01-01

    strategies which can minimize the one-time work zone cost. Considering the complex and combinatorial nature of this optimization problem, a heuristic ... combination of lane closure and traffic control strategies which can minimize the one-time work zone cost ... Notation: NV = number of vehicle classes; NPV = net present value; p'(t) = adjusted traffic diversion rate at time t; p(t) = natural diversion rate

  6. An Approach for the Distance Delivery of AFIT/LS Resident Degree Curricula

    DTIC Science & Technology

    1991-12-01

    minimal (least complex) distance education technologies appropriate for each learning topic or task. This may be the most time-consuming step in the ... represents the least complex distance education technology that could be used to deliver the educational material for a particular learning objective. Careful ... minimal technology needed to accomplish the learning objective. Look at question Q2.1 (Figure 5.15). Since the lecture offers an essential educational

  7. Architecture of the Yeast RNA Polymerase II Open Complex and Regulation of Activity by TFIIF

    PubMed Central

    Fishburn, James

    2012-01-01

    To investigate the function and architecture of the open complex state of RNA polymerase II (Pol II), Saccharomyces cerevisiae minimal open complexes were assembled by using a series of heteroduplex HIS4 promoters, TATA binding protein (TBP), TFIIB, and Pol II. The yeast system demonstrates great flexibility in the position of active open complexes, spanning 30 to 80 bp downstream from TATA, consistent with the transcription start site scanning behavior of yeast Pol II. TFIIF unexpectedly modulates the activity of the open complexes, either repressing or stimulating initiation. The response to TFIIF was dependent on the sequence of the template strand within the single-stranded bubble. Mutations in the TFIIB reader and linker region, which were inactive on duplex DNA, were suppressed by the heteroduplex templates, showing that a major function of the TFIIB reader and linker is in the initiation or stabilization of single-stranded DNA. Probing of the architecture of the minimal open complexes with TFIIB-FeBABE [TFIIB–p-bromoacetamidobenzyl–EDTA-iron(III)] derivatives showed that the TFIIB core domain is surprisingly positioned away from Pol II, and the addition of TFIIF repositions the TFIIB core domain to the Pol II wall domain. Together, our results show an unexpected architecture of minimal open complexes and the regulation of activity by TFIIF and the TFIIB core domain. PMID:22025674

  8. Modular minimally invasive extracorporeal circulation systems; can they become the standard practice for performing cardiac surgery?

    PubMed

    Anastasiadis, K; Antonitsis, P; Argiriadou, H; Deliopoulos, A; Grosomanidis, V; Tossios, P

    2015-04-01

    Minimally invasive extracorporeal circulation (MiECC) has been developed in an attempt to integrate all advances in cardiopulmonary bypass (CPB) technology into one closed circuit that shows improved biocompatibility and minimizes the systemic detrimental effects of CPB. Despite well-evidenced clinical advantages, penetration of MiECC technology into clinical practice is hampered by concerns raised by perfusionists and surgeons regarding air handling together with blood and volume management during CPB. We designed a modular MiECC circuit, bearing an accessory circuit for immediate transition to an open system, that can be used in every adult cardiac surgical procedure, offering enhanced safety features. We challenged this modular circuit in a series of 50 consecutive patients. Our results showed that the modular AHEPA circuit design offers a 100% technical success rate in a cohort of random, high-risk patients who underwent complex procedures, including reoperations, valve and aortic surgery, and emergency cases. This pilot study reflects real-world practice and prompts further evaluation of modular MiECC systems through multicentre trials. © The Author(s) 2015.

  9. Management of pilonidal disease.

    PubMed

    Kallis, Michelle P; Maloney, Caroline; Lipskar, Aaron M

    2018-06-01

    Pilonidal disease, and the treatment associated with it, can cause significant morbidity and a substantial burden on patients' quality of life. Despite the plethora of surgical techniques that have been developed to treat pilonidal disease, discrepancies between treatment options in technique, recurrence rates, complications, time to return to work/school and patients' aesthetic satisfaction have led to controversy over the best approach to this common acquired disease of young adults. The management of pilonidal disease must strike a balance between recurrence and surgical morbidity. The commonly performed wide excision without closure entails prolonged recovery, while flap closures speed recovery time and improve aesthetics at the expense of increased wound complications. Less invasive surgical techniques have recently evolved and are straightforward, with minimal morbidity and satisfactory results. As with any surgical intervention, the ideal treatment for pilonidal disease would be simple and cost-effective, cause minimal pain, involve a limited hospital stay, have a low recurrence rate and require minimal time off from school or work. Less invasive procedures for pilonidal disease may be favourable as an initial approach for these patients, reserving complex surgical treatment for refractory disease.

  10. Local empathy provides global minimization of congestion in communication networks

    NASA Astrophysics Data System (ADS)

    Meloni, Sandro; Gómez-Gardeñes, Jesús

    2010-11-01

    We present a mechanism to avoid congestion in complex networks based on a local knowledge of traffic conditions and the ability of routers to self-coordinate their dynamical behavior. In particular, routers make use of local information about traffic conditions to either reject or accept information packets from their neighbors. We show that when nodes are only aware of their own congestion state, they self-organize into a hierarchical configuration that remarkably delays the onset of congestion, although it leads to a sharp first-order-like congestion transition. We also consider the case when nodes are aware of the congestion state of their neighbors. In this case, we show that empathy between nodes is strongly beneficial to the overall performance of the system, and it is possible to achieve larger values of the critical load together with a smooth, second-order-like transition. Finally, we show that local empathy minimizes the impact of congestion as effectively as global minimization. Therefore, we present an outstanding example of how local dynamical rules can optimize the system’s functioning up to the levels reached using global knowledge.
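
    A toy simulation conveys the flavor of such a local rule. In the sketch below (Python; the topology, routing, and acceptance rule are deliberately crude illustrations, not the paper's model), a node forwards a packet only if the receiving neighbor's queue is below a hard limit, and in the "empathic" variant also no longer than its own.

    ```python
    import random

    def simulate(n=50, steps=2000, load=0.6, qmax=20, empathy=True):
        """Toy congestion model on a fully mixed network of n routers."""
        random.seed(42)
        queues = [0] * n
        delivered = 0
        for _ in range(steps):
            for i in range(n):
                if random.random() < load:      # new packet injected at node i
                    queues[i] += 1
            for i in range(n):
                if queues[i] == 0:
                    continue
                j = random.randrange(n)         # candidate next hop
                accept = queues[j] < qmax       # receiver rejects if saturated
                if empathy:                     # empathic rule: compare loads
                    accept = accept and queues[j] <= queues[i]
                if accept:
                    queues[i] -= 1
                    if random.random() < 0.5:   # packet reaches its destination
                        delivered += 1
                    else:
                        queues[j] += 1          # otherwise it keeps travelling
        return delivered, sum(queues)           # throughput, remaining backlog

    print("empathic:", simulate(empathy=True))
    print("selfish: ", simulate(empathy=False))
    ```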

  11. Thymic minimally invasive surgery: state of the art across the world: Central-South America

    PubMed Central

    2017-01-01

    The literature suggests that, for thymectomy in myasthenia or resection of thymic tumors, minimally invasive surgery is equivalent to open surgery with regard to long-term outcomes. However, it could bring some benefits in immediate results, such as complication rate or length of stay. There are doubts about the worldwide adoption of the method, though. In Latin America, the implementation of video-assisted thoracic surgery (VATS) started in the 1990s, but it progressed slowly. The main barriers were the associated costs and training. Thymic surgery poses a bigger challenge due to its rarity, so only a few reports mention the use of the method in the region. Nonetheless, in recent years we have observed a faster dissemination of the method, both in the number and in the complexity of the procedures performed. Confirming this fact, half of the patients registered in the Brazilian Society of Thoracic Surgery database in the last 2 years as undergoing resection of thymic tumors underwent a minimally invasive procedure. Although promising, robotic surgery is still in its early days in Latin America. PMID:29078684

  12. Quantitative assessment of 12-lead ECG synthesis using CAVIAR.

    PubMed

    Scherer, J A; Rubel, P; Fayn, J; Willems, J L

    1992-01-01

    The objective of this study is to assess the performance of patient-specific segment-specific (PSSS) synthesis of QRST complexes using CAVIAR, a new method for the serial comparison of electrocardiograms and vectorcardiograms. A collection of 250 multi-lead recordings from the Common Standards for Quantitative Electrocardiography (CSE) diagnostic pilot study is employed. QRS and ST-T segments are independently synthesized using the PSSS algorithm so that the mean-squared error between the original and estimated waveforms is minimized. CAVIAR compares the recorded and synthesized QRS and ST-T segments and calculates the mean-quadratic deviation as a measure of error. The results of this study indicate that estimated QRS complexes are good representatives of their recorded counterparts, and the integrity of the spatial information is maintained by the PSSS synthesis process. Analysis of the ST-T segments suggests that the deviations between recorded and synthesized waveforms are considerably greater than those associated with the QRS complexes. The poorer performance on the ST-T segments is attributed to magnitude normalization of the spatial loops, low-voltage passages, and noise interference. Using the mean-quadratic deviation and CAVIAR as methods of performance assessment, this study indicates that the PSSS-synthesis algorithm accurately maintains the signal information within the 12-lead electrocardiogram.
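
    The core of patient-specific lead synthesis is an ordinary least-squares fit: coefficients mapping a measured lead subset to the remaining leads are estimated and then reused, which minimizes the mean-squared error exactly as described above. The sketch below uses synthetic signals and hypothetical lead counts purely for illustration, not the PSSS algorithm's actual segmentation or data.

    ```python
    import numpy as np

    def fit_synthesis(measured, target):
        """Least-squares coefficients mapping measured leads -> target leads.
        measured: (samples, m) array; target: (samples, k) array."""
        coeffs, *_ = np.linalg.lstsq(measured, target, rcond=None)
        return coeffs                                  # shape (m, k)

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 500)
    # Three 'measured' leads, and a 12-lead set that is (nearly) a linear mix
    basis = np.stack([np.sin(2 * np.pi * 3 * t),
                      np.cos(2 * np.pi * 3 * t), t], axis=1)
    mix = rng.normal(size=(3, 12))
    full = basis @ mix + 0.01 * rng.normal(size=(500, 12))   # additive noise

    W = fit_synthesis(basis, full)                     # patient-specific transform
    rmse = np.sqrt(np.mean((basis @ W - full) ** 2))
    print(f"reconstruction RMSE: {rmse:.4f}")          # ~0.01, the noise level
    ```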

  13. Component-Level Electronic-Assembly Repair (CLEAR) Synthetic Instrument Capabilities Assessment and Test Report

    NASA Technical Reports Server (NTRS)

    Oeftering, Richard C.; Bradish, Martin A.

    2011-01-01

    The role of synthetic instruments (SIs) in Component-Level Electronic-Assembly Repair (CLEAR) is to provide an external, lower-level diagnostic and functional test capability beyond the built-in test capabilities of spacecraft electronics. Built-in diagnostics can report faults and symptoms, but isolating the root cause and performing corrective action requires specialized instruments. Often a fault can be revealed by emulating the operation of external hardware. This implies complex hardware that is too massive to be accommodated in spacecraft. The SI strategy is aimed at minimizing complexity and mass by employing highly reconfigurable instruments that perform diagnostics and emulate external functions. In effect, an SI can synthesize an instrument on demand. The SI architecture section of this document summarizes the results of a recent program diagnostic and test needs assessment based on the International Space Station. The SI architecture addresses operational issues such as minimizing crew time and crew skill level, and the SI data transactions between the crew and the supporting ground engineering teams searching for the root cause and formulating corrective actions. SI technology is described within a teleoperations framework. The remaining sections describe a lab demonstration intended to show that a single SI circuit could synthesize an instrument in hardware and subsequently clear the hardware and synthesize a completely different instrument on demand. An analysis of the capabilities and limitations of commercially available SI hardware and programming tools is included. Future work in SI technology is also described.

  14. Modeling OPC complexity for design for manufacturability

    NASA Astrophysics Data System (ADS)

    Gupta, Puneet; Kahng, Andrew B.; Muddu, Swamy; Nakagawa, Sam; Park, Chul-Hong

    2005-11-01

    Increasing design complexity in sub-90nm designs results in increased mask complexity and cost. Resolution enhancement techniques (RET) such as assist feature addition, phase shifting (attenuated PSM) and aggressive optical proximity correction (OPC) help preserve feature fidelity in silicon but increase mask complexity and cost. The growth in data volume with rising mask complexity is becoming prohibitive for manufacturing. Mask cost is determined by mask write time and mask inspection time, which are directly related to the complexity of the features printed on the mask. Aggressive RETs increase complexity by adding assist features and by modifying existing features. Passing design intent to OPC has been identified as a solution for reducing mask complexity and cost in several recent works. The goal of design-aware OPC is to relax the OPC tolerances of layout features to minimize mask cost without sacrificing parametric yield. To convey optimal OPC tolerances for manufacturing, design optimization should drive OPC tolerance optimization using models of mask cost for devices and wires, and should be aware of the impact of OPC correction levels on the mask cost and performance of the design. This work introduces mask cost characterization (MCC), which quantifies OPC complexity, measured in terms of the fracture count of the mask, for different OPC tolerances. MCC with different OPC tolerances is a critical step in linking design and manufacturing. In this paper, we present an MCC methodology that provides models of the fracture count of standard cells and wire patterns for use in design optimization. MCC cannot be performed by designers, as they do not have access to foundry OPC recipes and RET tools. To build a fracture count model, we perform OPC and fracturing on a limited set of standard cells and wire configurations with all tolerance combinations. Separately, we identify the characteristics of the layout that impact fracture count. Based on the fracture count (FC) data from the OPC and mask data preparation runs, we build models of FC as a function of OPC tolerances and layout parameters.
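
    A fracture-count model of this kind can be as simple as a least-squares fit once the characterization runs have produced (tolerance, layout, FC) samples. The sketch below fits a hypothetical functional form on fabricated toy data purely to show the fitting step; the feature set and coefficients are illustrative assumptions, not the paper's model.

    ```python
    import numpy as np

    # Hypothetical model form: FC ~ a + b/tol + c*edges + d*edges/tol, where
    # 'tol' is the OPC tolerance and 'edges' a layout complexity measure.
    rng = np.random.default_rng(1)
    tol = rng.uniform(2, 12, 200)             # OPC tolerance (nm), toy values
    edges = rng.integers(20, 400, 200)        # edge count per cell, toy values
    fc = 50 + 900 / tol + 0.8 * edges + 30 * edges / tol + rng.normal(0, 10, 200)

    X = np.column_stack([np.ones_like(tol), 1 / tol, edges, edges / tol])
    beta, *_ = np.linalg.lstsq(X, fc, rcond=None)     # ordinary least squares
    pred = X @ beta
    r2 = 1 - ((fc - pred) ** 2).sum() / ((fc - fc.mean()) ** 2).sum()
    print("coefficients:", np.round(beta, 2), " R^2:", round(float(r2), 3))
    ```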

  15. [Radiation protection in interventional cardiology].

    PubMed

    Durán, Ariel

    2015-01-01

    Progress in interventional cardiology means that each year a greater number of increasingly complex procedures are performed, with a very good success rate. The problem is that this progress brings greater radiation doses, not only for the patient but for occupationally exposed workers as well. Simple methods for reducing or minimizing occupational radiation dose include: minimizing fluoroscopy time and the number of acquired images; using available patient dose reduction technologies; using good imaging-chain geometry; collimating; avoiding high-scatter areas; using protective shielding; using imaging equipment whose performance is controlled through a quality assurance programme; and wearing personal dosimeters so that you know your dose. Effective use of these methods requires both appropriate education and training in radiation protection for all interventional cardiology personnel, and the availability and use of appropriate protective tools and equipment. Regular review and investigation of personnel monitoring results, accompanied as appropriate by changes in how procedures are performed and equipment is used, will ensure continual improvement in the practice of radiation protection in the interventional suite. Copyright © 2014 Instituto Nacional de Cardiología Ignacio Chávez. Published by Masson Doyma México S.A. All rights reserved.

  16. 3D deformable image matching: a hierarchical approach over nested subspaces

    NASA Astrophysics Data System (ADS)

    Musse, Olivier; Heitz, Fabrice; Armspach, Jean-Paul

    2000-06-01

    This paper presents a fast hierarchical method to perform dense deformable inter-subject matching of 3D MR images of the brain. To recover the complex morphological variations in neuroanatomy, a hierarchy of 3D deformation fields is estimated by minimizing a global energy function over a sequence of nested subspaces. The nested subspaces, generated from a single scaling function, consist of deformation fields constrained at different scales. The highly nonlinear energy function, describing the interactions between the target and the source images, is minimized using a coarse-to-fine continuation strategy over this hierarchy. The resulting deformable matching method shows low sensitivity to local minima and is able to track large nonlinear deformations with moderate computational load. The performance of the approach is assessed both on simulated 3D transformations and on a real database of 3D brain MR images from different individuals. The method has proven efficient in bringing the principal anatomical structures of the brain into correspondence. An application to atlas-based MRI segmentation, by transporting a labeled segmentation map onto patient data, is also presented.
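
    The coarse-to-fine continuation idea can be illustrated with a drastically reduced 2D example, where the "deformation subspace" at every scale is just a global integer translation estimated by exhaustive SSD search and refined at the next finer scale. This is an illustrative toy, not the paper's nested scaling-function subspaces or energy function.

    ```python
    import numpy as np

    def shift(img, dy, dx):
        return np.roll(np.roll(img, dy, axis=0), dx, axis=1)

    def search(src, tgt, center, radius):
        """Exhaustive SSD search for the best integer shift around 'center'."""
        cy, cx = center
        best, best_dydx = np.inf, center
        for dy in range(cy - radius, cy + radius + 1):
            for dx in range(cx - radius, cx + radius + 1):
                e = float(np.mean((shift(src, dy, dx) - tgt) ** 2))
                if e < best:
                    best, best_dydx = e, (dy, dx)
        return best_dydx

    rng = np.random.default_rng(0)
    img = rng.normal(size=(64, 64)).cumsum(0).cumsum(1)    # smooth test image
    tgt = shift(img, 6, -4)                                # ground-truth motion

    est = (0, 0)
    for f in (4, 2, 1):                                    # coarse-to-fine levels
        coarse = (est[0] // f, est[1] // f)
        dy, dx = search(img[::f, ::f], tgt[::f, ::f], coarse, radius=2)
        est = (dy * f, dx * f)                             # upscale the estimate
    print("estimated shift:", est)                         # expect (6, -4)
    ```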

  17. A highly articulated robotic surgical system for minimally invasive surgery.

    PubMed

    Ota, Takeyoshi; Degani, Amir; Schwartzman, David; Zubiate, Brett; McGarvey, Jeremy; Choset, Howie; Zenati, Marco A

    2009-04-01

    We developed a novel, highly articulated robotic surgical system (CardioARM) to enable minimally invasive intrapericardial therapeutic delivery through a subxiphoid approach. We performed preliminary proof-of-concept studies in a porcine preparation by performing epicardial ablation. CardioARM is a robotic surgical system with an articulated design that provides unlimited but controllable flexibility. The CardioARM consists of serially connected, rigid cylindrical links housing flexible working ports through which catheter-based tools for therapy and imaging can be advanced. The CardioARM is controlled by a computer-driven user interface, which is operated outside the operative field. In six experimental subjects, the CardioARM was introduced percutaneously through subxiphoid access. A commercial 5-French radiofrequency ablation catheter was introduced through the working port, which was then used to guide deployment. In all subjects, regional ("linear") left atrial ablation was successfully achieved without complications. Based on these preliminary studies, we believe that the CardioARM promises to enable deployment of a number of epicardium-based therapies. Improvements in imaging techniques will likely facilitate increasingly complex procedures.

  18. Autism Center First to Study Minimally Verbal Children

    MedlinePlus

    Feature: Autism Center First to Study Minimally Verbal Children ... research exploring the causes, diagnosis, and treatment of autism spectrum disorder (ASD), a complex developmental disorder that ...

  19. Toward a Definition of Complexity for Quantum Field Theory States.

    PubMed

    Chapman, Shira; Heller, Michal P; Marrochio, Hugo; Pastawski, Fernando

    2018-03-23

    We investigate notions of complexity of states in continuous many-body quantum systems. We focus on Gaussian states which include ground states of free quantum field theories and their approximations encountered in the context of the continuous version of the multiscale entanglement renormalization ansatz. Our proposal for quantifying state complexity is based on the Fubini-Study metric. It leads to counting the number of applications of each gate (infinitesimal generator) in the transformation, subject to a state-dependent metric. We minimize the defined complexity with respect to momentum-preserving quadratic generators which form su(1,1) algebras. On the manifold of Gaussian states generated by these operations, the Fubini-Study metric factorizes into hyperbolic planes with minimal complexity circuits reducing to known geodesics. Despite working with quantum field theories far outside the regime where Einstein gravity duals exist, we find striking similarities between our results and those of holographic complexity proposals.
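
    For reference, the Fubini-Study metric underlying the proposal is the standard line element on normalized states. Written for a path |ψ(s)⟩ from a reference state ψ_R to the target ψ_T, with the state complexity defined as the minimal geodesic length (notation mine, following the standard definition rather than the paper's exact conventions):

    ```latex
    \mathrm{d}s_{\mathrm{FS}}^{2}
      = \Big( \langle \partial_{s}\psi \,|\, \partial_{s}\psi \rangle
            - \big| \langle \psi \,|\, \partial_{s}\psi \rangle \big|^{2} \Big)\,
        \mathrm{d}s^{2},
    \qquad
    \mathcal{C}\big(\psi_{\mathrm{T}}\big)
      = \min_{\psi(s)\,:\;\psi(0)=\psi_{\mathrm{R}},\;\psi(1)=\psi_{\mathrm{T}}}
        \int_{0}^{1}
        \sqrt{\,\langle \partial_{s}\psi | \partial_{s}\psi \rangle
             - \big|\langle \psi | \partial_{s}\psi \rangle\big|^{2}\,}\;
        \mathrm{d}s .
    ```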

  20. Toward a Definition of Complexity for Quantum Field Theory States

    NASA Astrophysics Data System (ADS)

    Chapman, Shira; Heller, Michal P.; Marrochio, Hugo; Pastawski, Fernando

    2018-03-01

    We investigate notions of complexity of states in continuous many-body quantum systems. We focus on Gaussian states which include ground states of free quantum field theories and their approximations encountered in the context of the continuous version of the multiscale entanglement renormalization ansatz. Our proposal for quantifying state complexity is based on the Fubini-Study metric. It leads to counting the number of applications of each gate (infinitesimal generator) in the transformation, subject to a state-dependent metric. We minimize the defined complexity with respect to momentum-preserving quadratic generators which form su(1,1) algebras. On the manifold of Gaussian states generated by these operations, the Fubini-Study metric factorizes into hyperbolic planes with minimal complexity circuits reducing to known geodesics. Despite working with quantum field theories far outside the regime where Einstein gravity duals exist, we find striking similarities between our results and those of holographic complexity proposals.

  1. Competitive pressures affect sexual signal complexity in Kurixalus odontotarsus: insights into the evolution of compound calls

    PubMed Central

    2017-01-01

    Male-male vocal competition in anuran species is critical for mating success; however, it is also energetically demanding and highly time-consuming. Thus, we hypothesized that males may change signal elaboration in response to competition in real time. Male serrate-legged small treefrogs (Kurixalus odontotarsus) produce compound calls that contain two kinds of notes: harmonic sounds called ‘A notes’ and short broadband sounds called ‘B notes’. Using evoked vocal response experiments, we found that competition influences the temporal structure and complexity of the vocal signals produced by males. With contest escalation, males produce calls with a higher ratio of notes per call, and more compound calls that include more A notes but fewer B notes. In doing so, males minimize the energy costs and maximize the benefits of competition when the level of competition is high. This means that the evolution of sexual signal complexity in frogs may be susceptible to selection for plasticity related to adjusting performance to the pressures of competition, and supports the idea that more complex social contexts can lead to greater vocal complexity. PMID:29175862

  2. Training-free compressed sensing for wireless neural recording using analysis model and group weighted ℓ1-minimization

    NASA Astrophysics Data System (ADS)

    Sun, Biao; Zhao, Wenfeng; Zhu, Xinshan

    2017-06-01

    Objective. Data compression is crucial for resource-constrained wireless neural recording applications with limited data bandwidth, and compressed sensing (CS) theory has successfully demonstrated its potential in neural recording applications. In this paper, an analytical, training-free CS recovery method, termed group weighted analysis ℓ1-minimization (GWALM), is proposed for wireless neural recording. Approach. The GWALM method consists of three parts: (1) the analysis model is adopted to enforce sparsity of the neural signals, therefore overcoming the drawbacks of conventional synthesis models and enhancing the recovery performance. (2) A multi-fractional-order difference matrix is constructed as the analysis operator, thus avoiding the dictionary learning procedure and reducing the need for previously acquired data and computational complexity. (3) By exploiting the statistical properties of the analysis coefficients, a group weighting approach is developed to enhance the performance of analysis ℓ1-minimization. Main results. Experimental results on synthetic and real datasets reveal that the proposed approach outperforms state-of-the-art CS-based methods in terms of both spike recovery quality and classification accuracy. Significance. The energy and area efficiency of the GWALM make it an ideal candidate for resource-constrained, large-scale wireless neural recording applications. The training-free feature of the GWALM further improves its robustness to spike shape variation, thus making it more practical for long-term wireless neural recording.

  3. Training-free compressed sensing for wireless neural recording using analysis model and group weighted ℓ1-minimization.

    PubMed

    Sun, Biao; Zhao, Wenfeng; Zhu, Xinshan

    2017-06-01

    Data compression is crucial for resource-constrained wireless neural recording applications with limited data bandwidth, and compressed sensing (CS) theory has successfully demonstrated its potential in neural recording applications. In this paper, an analytical, training-free CS recovery method, termed group weighted analysis ℓ1-minimization (GWALM), is proposed for wireless neural recording. The GWALM method consists of three parts: (1) the analysis model is adopted to enforce sparsity of the neural signals, therefore overcoming the drawbacks of conventional synthesis models and enhancing the recovery performance. (2) A multi-fractional-order difference matrix is constructed as the analysis operator, thus avoiding the dictionary learning procedure and reducing the need for previously acquired data and computational complexity. (3) By exploiting the statistical properties of the analysis coefficients, a group weighting approach is developed to enhance the performance of analysis ℓ1-minimization. Experimental results on synthetic and real datasets reveal that the proposed approach outperforms state-of-the-art CS-based methods in terms of both spike recovery quality and classification accuracy. The energy and area efficiency of the GWALM make it an ideal candidate for resource-constrained, large-scale wireless neural recording applications. The training-free feature of the GWALM further improves its robustness to spike shape variation, thus making it more practical for long-term wireless neural recording.
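
    A minimal working analogue of weighted analysis ℓ1 recovery can be written with ADMM, using a plain first-order difference operator in place of the paper's multi-fractional-order matrix and uniform weights in place of the data-driven group weights; everything below is a simplified sketch under those assumptions.

    ```python
    import numpy as np

    def soft(v, t):
        """Elementwise soft-thresholding (the prox of the weighted l1 norm)."""
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def weighted_analysis_l1(A, y, w, lam=0.05, rho=1.0, iters=300):
        """ADMM for: min_x 0.5*||Ax - y||^2 + lam * ||diag(w) (D x)||_1,
        with D a first-order difference operator (stand-in analysis operator)."""
        n = A.shape[1]
        D = (np.eye(n) - np.eye(n, k=1))[:-1]        # (n-1, n) differences
        x, z, u = np.zeros(n), np.zeros(n - 1), np.zeros(n - 1)
        lhs = A.T @ A + rho * D.T @ D
        for _ in range(iters):
            x = np.linalg.solve(lhs, A.T @ y + rho * D.T @ (z - u))
            z = soft(D @ x + u, lam * w / rho)       # weighted shrinkage
            u += D @ x - z
        return x

    # Piecewise-constant toy signal compressed by a random Gaussian matrix
    rng = np.random.default_rng(0)
    n, m = 128, 64
    x_true = np.concatenate([np.zeros(40), 2.0 * np.ones(48), np.zeros(40)])
    A = rng.normal(size=(m, n)) / np.sqrt(m)
    y = A @ x_true
    x_hat = weighted_analysis_l1(A, y, w=np.ones(n - 1))
    print("relative error:",
          np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
    ```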

  4. Efficient realization of 3D joint inversion of seismic and magnetotelluric data with cross gradient structure constraint

    NASA Astrophysics Data System (ADS)

    Luo, H.; Zhang, H.; Gao, J.

    2016-12-01

    Seismic and magnetotelluric (MT) imaging methods are generally used to characterize subsurface structures at various scales. The two methods are complementary, and integrating them helps determine the resistivity and velocity models of the target region more reliably. Because of the difficulty of finding an empirical relationship between resistivity and velocity parameters, Gallardo and Meju [2003] proposed a joint inversion method that enforces structural consistency between the resistivity and velocity models, realized by minimizing the cross gradients between the two models. However, it is extremely challenging to combine two different inversion systems along with the cross-gradient constraints. For this reason, Gallardo [2007] proposed a joint inversion scheme that decouples the seismic and MT inversion systems by iteratively performing seismic and MT inversions as well as cross-gradient minimization separately. This scheme avoids the complexity of combining two different systems, but it suffers from the difficulty of balancing data fitting against the structure constraint. In this study, we have developed a new joint inversion scheme that avoids the problem encountered by the scheme of Gallardo [2007]. In the new scheme, seismic and MT inversions are still performed separately, but the cross-gradient minimization is also constrained by the model perturbations from the separate inversions. In this way, the new scheme still avoids the complexity of combining two different systems and at the same time enforces the balance between data fitting and the structure-consistency constraint. We have tested our joint inversion algorithm for both 2D and 3D cases. Synthetic tests show that joint inversion reconstructs the velocity and resistivity models better than separate inversions. Compared to separate inversions, joint inversion can remove artifacts in the resistivity model and can improve the resolution of deeper resistivity structures. We also show results from applying the new joint seismic and MT inversion scheme to southwest China, where several MT profiles are available and earthquakes are very active.
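
    A minimal sketch of the cross-gradient quantity both schemes minimize, assuming numpy and 2D models on a regular grid with unit spacing; the two toy models below share the same structure, so their cross gradient vanishes.

    ```python
    # Cross-gradient structure measure t = grad(m1) x grad(m2) (2D sketch).
    import numpy as np

    def cross_gradient(m1, m2):
        dm1_dz, dm1_dx = np.gradient(m1)    # rows = depth z, columns = distance x
        dm2_dz, dm2_dx = np.gradient(m2)
        return dm1_dx * dm2_dz - dm1_dz * dm2_dx

    x = np.tile(np.arange(64.0), (64, 1))              # horizontal coordinate, 64 x 64 grid
    velocity = np.tanh((x - 32) / 8.0)                 # both models share one interface
    resistivity = 5.0 + 2.0 * np.tanh((x - 32) / 8.0)
    print(np.abs(cross_gradient(velocity, resistivity)).max())   # ~0: structurally consistent

    # A joint inversion penalizes sum(t**2), driving the models toward shared structure.
    ```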

  5. Non-Interfering Effects of Active Post-Encoding Tasks on Episodic Memory Consolidation in Humans

    PubMed Central

    Varma, Samarth; Takashima, Atsuko; Krewinkel, Sander; van Kooten, Maaike; Fu, Lily; Medendorp, W. Pieter; Kessels, Roy P. C.; Daselaar, Sander M.

    2017-01-01

    So far, studies that investigated interference effects of post-learning processes on episodic memory consolidation in humans have used tasks involving only complex and meaningful information. Such tasks require reallocation of general or encoding-specific resources away from consolidation-relevant activities. The possibility that interference can be elicited using a task that heavily taxes our limited brain resources, but has low semantic and hippocampal-related long-term memory processing demands, has never been tested. We address this question by investigating whether consolidation could persist in parallel with an active, encoding-irrelevant, minimally semantic task, regardless of its high resource demands for cognitive processing. We distinguish the impact of such a task on consolidation based on whether it engages resources that are: (1) general/executive, or (2) specific/overlapping with the encoding modality. Our experiments compared subsequent memory performance across two post-encoding consolidation periods: quiet wakeful rest and a cognitively demanding n-Back task. Across six different experiments (total N = 176), we carefully manipulated the design of the n-Back task to target general or specific resources engaged in the ongoing consolidation process. In contrast to previous studies that employed interference tasks involving conceptual stimuli and complex processing demands, we did not find any differences between n-Back and rest conditions on memory performance at delayed test, using both recall and recognition tests. Our results indicate that: (1) quiet, wakeful rest is not a necessary prerequisite for episodic memory consolidation; and (2) post-encoding cognitive engagement does not interfere with memory consolidation when task performance has minimal semantic and hippocampally-based episodic memory processing demands. We discuss our findings with reference to resource and reactivation-led interference theories. PMID:28424596

  6. Falling with Style: Bats Perform Complex Aerial Rotations by Adjusting Wing Inertia.

    PubMed

    Bergou, Attila J; Swartz, Sharon M; Vejdani, Hamid; Riskin, Daniel K; Reimnitz, Lauren; Taubin, Gabriel; Breuer, Kenneth S

    2015-01-01

    The remarkable maneuverability of flying animals results from precise movements of their highly specialized wings. Bats have evolved an impressive capacity to control their flight, in large part due to their ability to modulate wing shape, area, and angle of attack through many independently controlled joints. Bat wings, however, also contain many bones and relatively large muscles, and thus the ratio of bats' wing mass to their body mass is larger than it is for all other extant flyers. Although the inertia in bat wings would typically be associated with decreased aerial maneuverability, we show that bat maneuvers challenge this notion. We use a model-based tracking algorithm to measure the wing and body kinematics of bats performing complex aerial rotations. Using a minimal model of a bat with only six degrees of kinematic freedom, we show that bats can perform body rolls by selectively retracting one wing during the flapping cycle. We also show that this maneuver does not rely on aerodynamic forces, and furthermore that a fruit fly, with nearly massless wings, would not exhibit this effect. Similar results are shown for a pitching maneuver. Finally, we combine high-resolution kinematics of wing and body movements during landing and falling maneuvers with a 52-degree-of-freedom dynamical model of a bat to show that modulation of wing inertia plays the dominant role in reorienting the bat during landing and falling maneuvers, with minimal contribution from aerodynamic forces. Bats can, therefore, use their wings as multifunctional organs, capable of sophisticated aerodynamic and inertial dynamics not previously observed in other flying animals. This may also have implications for the control of aerial robotic vehicles.

  7. Non-Interfering Effects of Active Post-Encoding Tasks on Episodic Memory Consolidation in Humans.

    PubMed

    Varma, Samarth; Takashima, Atsuko; Krewinkel, Sander; van Kooten, Maaike; Fu, Lily; Medendorp, W Pieter; Kessels, Roy P C; Daselaar, Sander M

    2017-01-01

    So far, studies that investigated interference effects of post-learning processes on episodic memory consolidation in humans have used tasks involving only complex and meaningful information. Such tasks require reallocation of general or encoding-specific resources away from consolidation-relevant activities. The possibility that interference can be elicited using a task that heavily taxes our limited brain resources, but has low semantic and hippocampal-related long-term memory processing demands, has never been tested. We address this question by investigating whether consolidation could persist in parallel with an active, encoding-irrelevant, minimally semantic task, regardless of its high resource demands for cognitive processing. We distinguish the impact of such a task on consolidation based on whether it engages resources that are: (1) general/executive, or (2) specific/overlapping with the encoding modality. Our experiments compared subsequent memory performance across two post-encoding consolidation periods: quiet wakeful rest and a cognitively demanding n-Back task. Across six different experiments (total N = 176), we carefully manipulated the design of the n-Back task to target general or specific resources engaged in the ongoing consolidation process. In contrast to previous studies that employed interference tasks involving conceptual stimuli and complex processing demands, we did not find any differences between n-Back and rest conditions on memory performance at delayed test, using both recall and recognition tests. Our results indicate that: (1) quiet, wakeful rest is not a necessary prerequisite for episodic memory consolidation; and (2) post-encoding cognitive engagement does not interfere with memory consolidation when task performance has minimal semantic and hippocampally-based episodic memory processing demands. We discuss our findings with reference to resource and reactivation-led interference theories.

  8. Falling with Style: Bats Perform Complex Aerial Rotations by Adjusting Wing Inertia

    PubMed Central

    Bergou, Attila J.; Swartz, Sharon M.; Vejdani, Hamid; Riskin, Daniel K.; Reimnitz, Lauren; Taubin, Gabriel; Breuer, Kenneth S.

    2015-01-01

    The remarkable maneuverability of flying animals results from precise movements of their highly specialized wings. Bats have evolved an impressive capacity to control their flight, in large part due to their ability to modulate wing shape, area, and angle of attack through many independently controlled joints. Bat wings, however, also contain many bones and relatively large muscles, and thus the ratio of bats’ wing mass to their body mass is larger than it is for all other extant flyers. Although the inertia in bat wings would typically be associated with decreased aerial maneuverability, we show that bat maneuvers challenge this notion. We use a model-based tracking algorithm to measure the wing and body kinematics of bats performing complex aerial rotations. Using a minimal model of a bat with only six degrees of kinematic freedom, we show that bats can perform body rolls by selectively retracting one wing during the flapping cycle. We also show that this maneuver does not rely on aerodynamic forces, and furthermore that a fruit fly, with nearly massless wings, would not exhibit this effect. Similar results are shown for a pitching maneuver. Finally, we combine high-resolution kinematics of wing and body movements during landing and falling maneuvers with a 52-degree-of-freedom dynamical model of a bat to show that modulation of wing inertia plays the dominant role in reorienting the bat during landing and falling maneuvers, with minimal contribution from aerodynamic forces. Bats can, therefore, use their wings as multifunctional organs, capable of sophisticated aerodynamic and inertial dynamics not previously observed in other flying animals. This may also have implications for the control of aerial robotic vehicles. PMID:26569116

  9. Essential Requirements for Robust Signaling in Hfq Dependent Small RNA Networks

    PubMed Central

    Adamson, David N.; Lim, Han N.

    2011-01-01

    Bacteria possess networks of small RNAs (sRNAs) that are important for modulating gene expression. At the center of many of these sRNA networks is the Hfq protein. Hfq's role is to quickly match cognate sRNAs and target mRNAs from among a large number of possible combinations and anneal them to form duplexes. Here we show using a kinetic model that Hfq can efficiently and robustly achieve this difficult task by minimizing the sequestration of sRNAs and target mRNAs in Hfq complexes. This sequestration can be reduced by two non-mutually exclusive kinetic mechanisms. The first mechanism involves heterotropic cooperativity (where sRNA and target mRNA binding to Hfq is influenced by other RNAs bound to Hfq); this cooperativity can selectively decrease singly-bound Hfq complexes and ternary complexes with non-cognate sRNA-target mRNA pairs while increasing cognate ternary complexes. The second mechanism relies on frequent RNA dissociation enabling the rapid cycling of sRNAs and target mRNAs among different Hfq complexes; this increases the probability the cognate ternary complex forms before the sRNAs and target mRNAs degrade. We further demonstrate that the performance of sRNAs in isolation is not predictive of their performance within a network. These findings highlight the importance of experimentally characterizing duplex formation in physiologically relevant contexts with multiple RNAs competing for Hfq. The model will provide a valuable framework for guiding and interpreting these experiments. PMID:21876666

  10. Good trellises for IC implementation of viterbi decoders for linear block codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Moorthy, Hari T.; Uehara, Gregory T.

    1996-01-01

    This paper investigates trellis structures of linear block codes for the IC (integrated circuit) implementation of Viterbi decoders capable of achieving high decoding speed while satisfying a constraint on the structural complexity of the trellis in terms of the maximum number of states at any particular depth. Only uniform sectionalizations of the code trellis diagram are considered. An upper bound on the number of parallel and structurally identical (or isomorphic) subtrellises in a proper trellis for a code without exceeding the maximum state complexity of the minimal trellis of the code is first derived. Parallel structures of trellises with various section lengths for binary BCH and Reed-Muller (RM) codes of lengths 32 and 64 are analyzed. Next, the complexity of IC implementation of a Viterbi decoder based on an L-section trellis diagram for a code is investigated. A structural property of a Viterbi decoder called ACS-connectivity, which is related to state connectivity, is introduced. This parameter affects the complexity of wire routing (interconnections within the IC). The effects of five parameters on the VLSI complexity of a Viterbi decoder are investigated, namely: (1) effective computational complexity; (2) complexity of the ACS circuit; (3) traceback complexity; (4) ACS-connectivity; and (5) branch complexity of the trellis diagram. It is shown that an IC implementation of a Viterbi decoder based on a non-minimal trellis requires less area and is capable of operating at higher speed than one based on the minimal trellis when the commonly used ACS-array architecture is considered.
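
    As an illustration of the Add-Compare-Select (ACS) recursion that dominates such implementations, the sketch below runs Viterbi decoding over a toy sectionalized trellis; the memory-1 rate-1/2 terminated convolutional trellis is a hypothetical stand-in for the BCH/RM trellises analyzed in the paper.

    ```python
    # Viterbi decoding over a sectionalized trellis with Hamming branch metrics (sketch).

    def build_trellis(num_info_bits):
        """Each branch is (prev_state, next_state, output_bits, info_bit); the state
        is the previous input bit and the branch output is (u, u XOR state)."""
        sections = []
        for k in range(num_info_bits + 1):              # last section terminates in state 0
            inputs = (0,) if k == num_info_bits else (0, 1)
            sections.append([(s, u, (u, u ^ s), u) for s in (0, 1) for u in inputs])
        return sections

    def viterbi(trellis, received):
        survivors = {0: (0, [])}                        # state -> (path metric, info bits)
        for branches, r in zip(trellis, received):
            nxt = {}
            for prev, state, out, info in branches:     # ACS: add a Hamming branch metric,
                if prev not in survivors:               # compare candidates, select the best
                    continue
                metric = survivors[prev][0] + sum(b != x for b, x in zip(out, r))
                if state not in nxt or metric < nxt[state][0]:
                    nxt[state] = (metric, survivors[prev][1] + [info])
            survivors = nxt
        metric, bits = survivors[0]                     # terminated trellis ends in state 0
        return metric, bits[:-1]                        # drop the termination bit

    trellis = build_trellis(3)
    # Codeword for info bits 1, 0, 1 is (1,1) (0,1) (1,1) (0,1); one bit flipped here:
    received = [(0, 1), (0, 1), (1, 1), (0, 1)]
    print(viterbi(trellis, received))                   # -> (1, [1, 0, 1])
    ```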

  11. Optimization of multiple quality characteristics in bone drilling using grey relational analysis

    PubMed Central

    Pandey, Rupesh Kumar; Panda, Sudhansu Sekhar

    2014-01-01

    Purpose Drilling of bone is common during bone fracture treatment to fix the fractured parts with screws, wires, or plates. Minimally invasive drilling of bone is in great demand, as it helps in better fixation and quicker healing of broken bones. The purpose of the present investigation is to determine the optimum cutting conditions for minimizing the temperature, force, and surface roughness simultaneously during bone drilling. Method In this study, drilling experiments were performed on bovine bone under different conditions of feed rate and drill rotational speed using a full factorial design. The optimal level of the drilling parameters is determined by the grey relational grade (GRG) obtained from the GRA as the performance index of multiple quality characteristics. The effect of each drilling parameter on GRG is determined using analysis of variance (ANOVA), and the results obtained are validated by a confirmation experiment. Results Grey relational analysis showed that the run with a feed rate of 40 mm/min and a spindle speed of 500 rpm has the highest grey relational grade and is the recommended setting for minimum temperature, force, and surface roughness simultaneously during bone drilling. Feed rate has the highest contribution (59.49%) to the multiple performance characteristics, followed by spindle speed (37.69%), as obtained from the ANOVA analysis. Conclusions The use of grey relational analysis simplifies the complex process of optimizing multi-response characteristics in bone drilling by converting them into a single grey relational grade. The use of the suggested methodology can greatly minimize bone tissue injury during drilling. PMID:25829751
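
    A minimal sketch of the GRG computation for smaller-the-better responses, assuming numpy; the four runs and the distinguishing coefficient zeta = 0.5 are illustrative, not the paper's data.

    ```python
    # Grey relational grade for smaller-the-better responses (sketch).
    import numpy as np

    def grey_relational_grade(responses, zeta=0.5, weights=None):
        """responses: (n_runs, n_criteria) array, every criterion smaller-the-better."""
        r = np.asarray(responses, dtype=float)
        norm = (r.max(axis=0) - r) / (r.max(axis=0) - r.min(axis=0))   # 1 = best (minimum)
        delta = 1.0 - norm                                             # deviation from ideal
        coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
        w = np.full(r.shape[1], 1.0 / r.shape[1]) if weights is None else np.asarray(weights)
        return coeff @ w                                               # one GRG per run

    # Four toy runs: [temperature (C), thrust force (N), surface roughness (um)]
    runs = [[48, 11.0, 1.9], [52, 9.5, 2.3], [44, 12.5, 1.7], [58, 8.8, 2.6]]
    grg = grey_relational_grade(runs)
    print("GRGs:", grg.round(3), "-> best run:", int(grg.argmax()))
    ```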

  12. Optimization of multiple quality characteristics in bone drilling using grey relational analysis.

    PubMed

    Pandey, Rupesh Kumar; Panda, Sudhansu Sekhar

    2015-03-01

    Drilling of bone is common during bone fracture treatment to fix the fractured parts with screws, wires, or plates. Minimally invasive drilling of bone is in great demand, as it helps in better fixation and quicker healing of broken bones. The purpose of the present investigation is to determine the optimum cutting conditions for minimizing the temperature, force, and surface roughness simultaneously during bone drilling. In this study, drilling experiments were performed on bovine bone under different conditions of feed rate and drill rotational speed using a full factorial design. The optimal level of the drilling parameters is determined by the grey relational grade (GRG) obtained from the GRA as the performance index of multiple quality characteristics. The effect of each drilling parameter on GRG is determined using analysis of variance (ANOVA), and the results obtained are validated by a confirmation experiment. Grey relational analysis showed that the run with a feed rate of 40 mm/min and a spindle speed of 500 rpm has the highest grey relational grade and is the recommended setting for minimum temperature, force, and surface roughness simultaneously during bone drilling. Feed rate has the highest contribution (59.49%) to the multiple performance characteristics, followed by spindle speed (37.69%), as obtained from the ANOVA analysis. The use of grey relational analysis simplifies the complex process of optimizing multi-response characteristics in bone drilling by converting them into a single grey relational grade. The use of the suggested methodology can greatly minimize bone tissue injury during drilling.

  13. Strategies for Ground Based Testing of Manned Lunar Surface Systems

    NASA Technical Reports Server (NTRS)

    Beyer, Jeff; Peacock, Mike; Gill, Tracy

    2009-01-01

    Integrated testing (such as Multi-Element Integrated Test (MEIT)) is critical to reducing risks and minimizing problems encountered during assembly, activation, and on-orbit operation of large, complex manned spacecraft, and it provides the best implementation of "Test Like You Fly." Planning for integrated testing needs to begin at the earliest stages of Program definition, and Program leadership needs to fully understand and buy in to what integrated testing is and why it needs to be performed. As the Program evolves and the design and schedules mature, teams should continually look for suitable opportunities to perform testing where enough components are together in one place at one time. The benefits to be gained are well worth the costs.

  14. Real-time FPGA architectures for computer vision

    NASA Astrophysics Data System (ADS)

    Arias-Estrada, Miguel; Torres-Huitzil, Cesar

    2000-03-01

    This paper presents an architecture for real-time generic convolution of a mask and an image. The architecture is intended for fast low-level image processing. The FPGA-based architecture takes advantage of the availability of registers in FPGAs to implement an efficient and compact module to process the convolutions. The architecture is designed to minimize the number of accesses to the image memory and is based on parallel modules with internal pipeline operation in order to improve its performance. The architecture is prototyped in an FPGA, but it can be implemented as a dedicated VLSI circuit to reach higher clock frequencies. Complexity issues, FPGA resource utilization, FPGA limitations, and real-time performance are discussed. Some results are presented and discussed.
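
    A minimal software model of the access pattern such an architecture targets, assuming numpy: each image pixel is fetched from memory once per frame and a register window slides along buffered rows. This emulation is illustrative, not the paper's design.

    ```python
    # Streaming mask-image correlation with line buffers: one memory read per pixel (sketch).
    import numpy as np

    def convolve_streaming(image, mask):
        kh, kw = mask.shape
        h, w = image.shape
        line_buffer = np.zeros((kh, w))            # models kh-1 line buffers + current row
        out = np.zeros((h - kh + 1, w - kw + 1))
        for row in range(h):
            line_buffer = np.roll(line_buffer, -1, axis=0)
            line_buffer[-1] = image[row]           # the single memory access for this row
            if row >= kh - 1:
                for col in range(w - kw + 1):      # register window slides along the row
                    window = line_buffer[:, col:col + kw]
                    out[row - kh + 1, col] = np.sum(window * mask)
        return out                                 # cross-correlation; flip mask for convolution

    img = np.arange(36, dtype=float).reshape(6, 6)
    mask = np.ones((3, 3)) / 9.0                   # 3x3 mean filter
    reference = np.lib.stride_tricks.sliding_window_view(img, (3, 3)).reshape(4, 4, 9) @ mask.ravel()
    print(np.allclose(convolve_streaming(img, mask), reference))   # True
    ```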

  15. Analysis of Proteins, Protein Complexes, and Organellar Proteomes Using Sheathless Capillary Zone Electrophoresis - Native Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Belov, Arseniy M.; Viner, Rosa; Santos, Marcia R.; Horn, David M.; Bern, Marshall; Karger, Barry L.; Ivanov, Alexander R.

    2017-12-01

    Native mass spectrometry (MS) is a rapidly advancing field in the analysis of proteins, protein complexes, and macromolecular species of various types. The majority of native MS experiments reported to date have been conducted using direct infusion of purified analytes into a mass spectrometer. In this study, capillary zone electrophoresis (CZE) was coupled online to Orbitrap mass spectrometers using a commercial sheathless interface to enable high-performance separation, identification, and structural characterization of limited amounts of purified proteins and protein complexes, the latter with preserved non-covalent associations under native conditions. The performance of both bare fused-silica and polyacrylamide-coated capillaries was assessed using mixtures of protein standards known to form non-covalent protein-protein and protein-ligand complexes. High-efficiency separation of native complexes is demonstrated using both capillary types, while the neutral polyacrylamide-coated capillary showed better reproducibility and higher efficiency for more complex samples. The platform was then evaluated for the determination of monoclonal antibody aggregation and for analysis of proteomes of limited complexity using a ribosomal isolate from E. coli. Native CZE-MS, using accurate single-stage and tandem-MS measurements, enabled identification of proteoforms and non-covalent complexes at femtomole levels. This study demonstrates that native CZE-MS can serve as an orthogonal and complementary technique to conventional native MS methodologies, with the advantages of low sample consumption, minimal sample processing and losses, and high throughput and sensitivity. This study presents a novel platform for analysis of ribosomes and other macromolecular complexes and organelles, with the potential for discovery of novel structural features defining cellular phenotypes (e.g., specialized ribosomes).

  16. DECIDE: a software for computer-assisted evaluation of diagnostic test performance.

    PubMed

    Chiecchio, A; Bo, A; Manzone, P; Giglioli, F

    1993-05-01

    The evaluation of the performance of clinical tests is a complex problem involving different steps and many statistical tools, not always structured in an organic and rational system. This paper presents software that provides an organic system of statistical tools to support the evaluation of clinical test performance. The program allows (a) the building and organization of a working database, (b) the selection of the minimal set of tests with the maximum information content, (c) the search for the model best fitting the distribution of the test values, (d) the selection of the optimal diagnostic cut-off value of the test for every positive/negative situation, and (e) the evaluation of the performance of combinations of correlated and uncorrelated tests. The uncertainty associated with all the variables involved is evaluated. The program runs in an MS-DOS environment with an EGA or better graphics card.
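
    A minimal sketch of step (d), selecting an optimal diagnostic cut-off; using Youden's index as the optimality criterion is our assumption, and the data are synthetic.

    ```python
    # Cut-off selection by maximizing Youden's J = sensitivity + specificity - 1 (sketch).
    import numpy as np

    def best_cutoff(values, diseased):
        values, diseased = np.asarray(values, float), np.asarray(diseased, bool)
        def youden(c):
            sens = ((values >= c) & diseased).sum() / diseased.sum()
            spec = ((values < c) & ~diseased).sum() / (~diseased).sum()
            return sens + spec - 1.0
        return max(np.unique(values), key=youden)

    rng = np.random.default_rng(1)
    healthy_vals = rng.normal(1.0, 0.3, 200)   # synthetic test values, healthy subjects
    disease_vals = rng.normal(1.8, 0.4, 120)   # higher values indicate disease
    values = np.concatenate([healthy_vals, disease_vals])
    labels = np.concatenate([np.zeros(200, bool), np.ones(120, bool)])
    print("optimal cut-off:", round(float(best_cutoff(values, labels)), 2))
    ```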

  17. Optimization of wastewater treatment plant operation for greenhouse gas mitigation.

    PubMed

    Kim, Dongwook; Bowen, James D; Ozelkan, Ertunga C

    2015-11-01

    This study deals with the determination of optimal operation of a wastewater treatment system for minimizing greenhouse gas emissions, operating costs, and pollution loads in the effluent. To do this, an integrated performance index that includes the three objectives was established to assess system performance. The ASMN_G model was used to perform system optimization aimed at determining a set of operational parameters that can satisfy the three different objectives. The complex nonlinear optimization problem was solved using the Nelder-Mead simplex algorithm. A sensitivity analysis was performed to identify the operational parameters most influential on system performance. The results obtained from the optimization simulations for six scenarios demonstrated that there are apparent trade-offs among the three conflicting objectives. The best optimized system simultaneously reduced greenhouse gas emissions by 31%, reduced operating cost by 11%, and improved effluent quality by 2% compared to the base case operation. Copyright © 2015 Elsevier Ltd. All rights reserved.
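
    A minimal sketch of the optimization pattern described, assuming scipy; the two-parameter plant surrogate and the weights are illustrative stand-ins for the ASMN_G model and the paper's integrated performance index.

    ```python
    # Weighted integrated index minimized with Nelder-Mead (sketch; toy plant model).
    import numpy as np
    from scipy.optimize import minimize

    def plant_model(params):
        """Toy surrogate, NOT the ASMN_G model: params = [aeration intensity, recycle ratio]."""
        aeration, recycle = np.clip(params, 1e-3, None)         # keep inputs physical
        ghg = 0.4 * aeration**2 + 0.1 / recycle                 # energy-related emissions
        cost = 0.3 * aeration + 0.2 * recycle                   # operating cost
        effluent = 1.0 / (1.0 + aeration) + 0.05 * recycle      # pollution load proxy
        return np.array([ghg, cost, effluent])

    def integrated_index(params, weights=(0.4, 0.3, 0.3)):
        return float(np.dot(weights, plant_model(params)))

    result = minimize(integrated_index, x0=[1.0, 0.5], method="Nelder-Mead")
    print("optimal parameters:", result.x.round(3), "index:", round(result.fun, 4))
    ```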

  18. Models of resource allocation optimization when solving the control problems in organizational systems

    NASA Astrophysics Data System (ADS)

    Menshikh, V.; Samorokovskiy, A.; Avsentev, O.

    2018-03-01

    A mathematical model is presented for optimizing the allocation of resources to reduce the time needed for management decisions, together with algorithms for solving the general resource allocation problem. The problem of choosing resources in organizational systems so as to reduce the total execution time of a job is solved. This is a complex three-level combinatorial problem whose solution requires solving several specific subproblems: estimating the duration of each action as a function of the number of performers within the group that performs it; estimating the total execution time of all actions as a function of the quantitative composition of the groups of performers; and finding a distribution of the available pool of performers among groups that minimizes the total execution time of all actions. In addition, algorithms to solve the general resource allocation problem are proposed.
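
    A minimal sketch of the lowest-level subproblem, assuming parallel actions whose duration is workload divided by group size; the greedy critical-path heuristic is our illustration, not the paper's algorithm.

    ```python
    # Greedy critical-path allocation of performers to parallel actions (sketch).
    import heapq

    def allocate(workloads, total_performers):
        groups = [1] * len(workloads)                       # every action needs >= 1 performer
        heap = [(-w, i) for i, w in enumerate(workloads)]   # durations, longest first
        heapq.heapify(heap)
        for _ in range(total_performers - len(workloads)):
            _, i = heapq.heappop(heap)                      # action currently on the critical path
            groups[i] += 1
            heapq.heappush(heap, (-workloads[i] / groups[i], i))
        makespan = max(w / g for w, g in zip(workloads, groups))
        return groups, makespan

    print(allocate(workloads=[12.0, 8.0, 5.0], total_performers=10))
    # -> ([5, 3, 2], 2.666...): performers concentrate on the longest actions
    ```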

  19. Efficient Lookup Table-Based Adaptive Baseband Predistortion Architecture for Memoryless Nonlinearity

    NASA Astrophysics Data System (ADS)

    Ba, Seydou N.; Waheed, Khurram; Zhou, G. Tong

    2010-12-01

    Digital predistortion is an effective means to compensate for the nonlinear effects of a memoryless system. In the case of a cellular transmitter, a digital baseband predistorter can mitigate the undesirable nonlinear effects along the signal chain, particularly the nonlinear impairments in the radiofrequency (RF) amplifiers. To be practically feasible, the implementation complexity of the predistorter must be minimized so that it becomes a cost-effective solution for the resource-limited wireless handset. This paper proposes optimizations that facilitate the design of a low-cost, high-performance adaptive digital baseband predistorter for memoryless systems. A comparative performance analysis of the amplitude and power lookup table (LUT) indexing schemes is presented. An optimized low-complexity amplitude approximation and its hardware synthesis results are also studied. An efficient LUT predistorter training algorithm that combines the fast convergence speed of the normalized least mean squares (NLMS) algorithm with a small hardware footprint is proposed. Results of fixed-point simulations based on the measured nonlinear characteristics of an RF amplifier are presented.
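
    A minimal sketch of an amplitude-indexed LUT predistorter trained with an NLMS-style update, assuming numpy; the memoryless PA model, table size, and step size are illustrative assumptions rather than the paper's design values.

    ```python
    # Amplitude-indexed LUT predistorter with an NLMS-style update (sketch).
    import numpy as np

    def pa(x):
        """Toy memoryless saturating power amplifier."""
        return x / (1.0 + 0.25 * np.abs(x) ** 2)

    L, mu, eps = 64, 0.5, 1e-6
    lut = np.ones(L, dtype=complex)                   # one complex gain per amplitude bin
    rng = np.random.default_rng(2)

    for _ in range(20000):
        x = (rng.standard_normal() + 1j * rng.standard_normal()) * 0.4
        k = min(int(np.abs(x) / 2.0 * L), L - 1)      # amplitude indexing over [0, 2)
        z = lut[k] * x                                # predistorted sample
        err = x - pa(z)                               # want PA(LUT gain * x) == x
        lut[k] += mu * np.conj(x) * err / (np.abs(x) ** 2 + eps)   # NLMS-style step

    test = np.linspace(0.05, 0.9, 6)                  # below the toy PA's saturation
    idx = np.minimum((test / 2.0 * L).astype(int), L - 1)
    print("max residual error:", np.abs(pa(lut[idx] * test) - test).max())
    ```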

  20. High-contrast Imager for Complex Aperture Telescopes (HICAT): II. Design overview and first light results

    NASA Astrophysics Data System (ADS)

    N'Diaye, Mamadou; Choquet, Elodie; Egron, Sylvain; Pueyo, Laurent; Leboulleux, Lucie; Levecq, Olivier; Perrin, Marshall D.; Elliot, Erin; Wallace, J. Kent; Hugot, Emmanuel; Marcos, Michel; Ferrari, Marc; Long, Chris A.; Anderson, Rachel; DiFelice, Audrey; Soummer, Rémi

    2014-08-01

    We present a new high-contrast imaging testbed designed to provide complete solutions in wavefront sensing, control and starlight suppression with complex aperture telescopes. The testbed was designed to enable a wide range of studies of the effects of such telescope geometries, with primary mirror segmentation, central obstruction, and spiders. The associated diffraction features in the point spread function make high-contrast imaging more challenging. In particular the testbed will be compatible with both AFTA-like and ATLAST-like aperture shapes, respectively on-axis monolithic, and on-axis segmented telescopes. The testbed optical design was developed using a novel approach to define the layout and surface error requirements to minimize amplitude-induced errors at the target contrast level performance. In this communication we compare the as-built surface errors for each optic to their specifications based on end-to-end Fresnel modelling of the testbed. We also report on the testbed optical and optomechanical alignment performance, coronagraph design and manufacturing, and preliminary first light results.

  1. Human error mitigation initiative (HEMI) : summary report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, Susan M.; Ramos, M. Victoria; Wenner, Caren A.

    2004-11-01

    Despite continuing efforts to apply existing hazard analysis methods and comply with requirements, human errors persist across the nuclear weapons complex. Due to a number of factors, current retroactive and proactive methods to understand and minimize human error are highly subjective, inconsistent in numerous dimensions, and difficult to characterize as thorough. An alternative, proposed method begins by leveraging historical data to understand what the systemic issues are and where resources need to be brought to bear proactively to minimize the risk of future occurrences. An illustrative analysis was performed using existing incident databases specific to Pantex weapons operations, indicating systemic issues associated with operating procedures that undergo notably less development rigor relative to other task elements such as tooling and process flow. Future recommended steps to improve the objectivity, consistency, and thoroughness of hazard analysis and mitigation were delineated.

  2. Major design issues of molten carbonate fuel cell power generation unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, T.P.

    1996-04-01

    In addition to the stack, a fuel cell power generation unit requires fuel desulfurization and reforming, fuel and oxidant preheating, process heat removal, waste heat recovery, steam generation, oxidant supply, power conditioning, water supply and treatment, purge gas supply, instrument air supply, and system control. These support facilities add considerable cost and system complexity. Bechtel, as a system integrator of M-C Power's molten carbonate fuel cell development team, has spent substantial effort to simplify and minimize these supporting facilities to meet cost and reliability goals for commercialization. Similar to other fuel cells, MCFC faces the design challenge of complying with codes and standards and achieving high efficiency and part-load performance while minimizing utility requirements, weight, plot area, and cost. However, MCFC has several unique design issues due to its high operating temperature, use of molten electrolyte, and the requirement of CO2 recycle.

  3. Identifying multiple influential spreaders based on generalized closeness centrality

    NASA Astrophysics Data System (ADS)

    Liu, Huan-Li; Ma, Chuang; Xiang, Bing-Bing; Tang, Ming; Zhang, Hai-Feng

    2018-02-01

    To maximize the spreading influence of multiple spreaders in complex networks, one important fact cannot be ignored: the multiple spreaders should be dispersively distributed in the network, which can effectively reduce the redundancy of information spreading. For this purpose, we define a generalized closeness centrality (GCC) index by generalizing the closeness centrality index to a set of nodes. The problem then converts to identifying multiple spreaders such that an objective function attains its minimal value. By comparing with the K-means clustering algorithm, we find that this optimization problem is very similar to the problem of minimizing the objective function in the K-means method. Therefore, finding the set of nodes with the highest GCC value can be approximately solved by the K-means method. Two typical transmission dynamics, the epidemic spreading process and the rumor spreading process, are implemented on real networks to verify the good performance of our proposed method.
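
    A minimal sketch of the K-means analogy: alternately assign every node to its nearest spreader and re-center each group on its medoid, the node with the smallest total distance to the group (equivalently, the highest within-group closeness). The graph and the fixed iteration cap are illustrative.

    ```python
    # K-means-style selection of k dispersed spreaders (sketch, pure Python).
    from collections import deque

    def bfs_distances(adj, src):
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        return dist

    def select_spreaders(adj, k, iters=20):
        d = {u: bfs_distances(adj, u) for u in adj}   # all-pairs shortest paths
        centers = list(adj)[:k]                       # naive initialization
        for _ in range(iters):
            clusters = {c: [] for c in centers}
            for u in adj:                             # assignment step
                clusters[min(centers, key=lambda c: d[c][u])].append(u)
            new = [min(adj, key=lambda u: sum(d[u][v] for v in members))
                   for members in clusters.values()]  # re-center on the cluster medoid
            if set(new) == set(centers):
                break
            centers = new
        return centers

    # Two loosely coupled communities; the chosen spreaders disperse, one per community.
    adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4], 4: [3, 5, 6], 5: [4, 6], 6: [4, 5]}
    print(select_spreaders(adj, k=2))                 # -> [4, 0]
    ```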

  4. Optimized Projection Matrix for Compressive Sensing

    NASA Astrophysics Data System (ADS)

    Xu, Jianping; Pi, Yiming; Cao, Zongjie

    2010-12-01

    Compressive sensing (CS) is mainly concerned with low-coherence pairs, since the number of samples needed to recover the signal is proportional to the mutual coherence between the projection matrix and the sparsifying matrix. Until now, papers on CS have generally assumed the projection matrix to be a random matrix. In this paper, aiming at minimizing the mutual coherence, a method is proposed to optimize the projection matrix. The method is based on equiangular tight frame (ETF) design, because an ETF has minimum coherence. Solving the problem exactly is impossible because of its complexity; therefore, an alternating-minimization-type method is used to find a feasible solution. The optimally designed projection matrix can further reduce the number of samples needed for recovery or improve the recovery accuracy. The proposed method demonstrates better performance than conventional optimization methods, which benefits both basis pursuit and orthogonal matching pursuit.
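
    A minimal sketch of one family of such alternating minimizations, assuming numpy: shrink the off-diagonal Gram entries of the effective dictionary toward the Welch bound (the coherence an ETF attains), then recover a projection by least squares. The sizes and the hard-clipping rule are simplifying assumptions.

    ```python
    # Alternating minimization toward an ETF-like Gram matrix (simplified sketch).
    import numpy as np

    rng = np.random.default_rng(3)
    n, m, k = 64, 16, 100                      # ambient dim, measurements, dictionary atoms
    Psi = rng.standard_normal((n, k))          # sparsifying dictionary
    Phi = rng.standard_normal((m, n))          # initial random projection

    welch = np.sqrt((k - m) / (m * (k - 1)))   # Welch bound: the coherence of an ETF

    def coherence(D):
        Dn = D / np.linalg.norm(D, axis=0)
        return np.abs(Dn.T @ Dn - np.eye(k)).max()

    for _ in range(50):
        D = Phi @ Psi                          # effective dictionary
        Dn = D / np.linalg.norm(D, axis=0)
        G = np.clip(Dn.T @ Dn, -welch, welch)  # shrink off-diagonal correlations
        np.fill_diagonal(G, 1.0)
        vals, vecs = np.linalg.eigh(G)         # factor G back to an m-dimensional frame
        D_target = (vecs[:, -m:] * np.sqrt(np.maximum(vals[-m:], 0.0))).T
        Phi = D_target @ np.linalg.pinv(Psi)   # least-squares update of the projection

    print("Welch bound:", round(welch, 3), "| final coherence:", round(coherence(Phi @ Psi), 3))
    ```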

  5. Large Scale Multi-area Static/Dynamic Economic Dispatch using Nature Inspired Optimization

    NASA Astrophysics Data System (ADS)

    Pandit, Manjaree; Jain, Kalpana; Dubey, Hari Mohan; Singh, Rameshwar

    2017-04-01

    Economic dispatch (ED) ensures that the generation allocation to the power units is carried out such that the total fuel cost is minimized and all the operating equality/inequality constraints are satisfied. Classical ED does not take transmission constraints into consideration, but in present restructured power systems the tie-line limits play a very important role in deciding operational policies. ED is a dynamic problem which is performed on-line in the central load dispatch centre with changing load scenarios. The dynamic multi-area ED (MAED) problem is more complex due to the additional tie-line, ramp-rate, and area-wise power balance constraints. Nature-inspired (NI) heuristic optimization methods are gaining popularity over traditional methods for complex problems. This work presents modified particle swarm optimization (PSO) based techniques in which parameter automation is effectively used to improve search efficiency by avoiding stagnation at a sub-optimal result. This work validates the performance of the PSO variants against the traditional solver GAMS for single-area as well as multi-area economic dispatch (MAED) on three test cases of a large 140-unit standard test system with complex constraints.

  6. Early prediction of cardiac resynchronization therapy response by non-invasive electrocardiogram markers.

    PubMed

    Ortigosa, Nuria; Pérez-Roselló, Víctor; Donoso, Víctor; Osca, Joaquín; Martínez-Dolz, Luis; Fernández, Carmen; Galbis, Antonio

    2018-04-01

    Cardiac resynchronization therapy (CRT) is an effective treatment for patients with severe heart failure. Regrettably, about one third of patients are CRT "non-responders", i.e. patients who have undergone this form of device therapy but do not respond to it, which adversely affects the utility and cost-effectiveness of CRT. In this paper, we assess the ability of a novel surface ECG marker to predict CRT response. We performed a retrospective exploratory study of the ECG prior to CRT implantation in 43 consecutive patients with ischemic (17) or non-ischemic (26) cardiomyopathy. We extracted the QRST complexes (consisting of the QRS complex, the S-T segment, and the T wave) and obtained a measure of their energy by means of spectral analysis. This ECG marker showed statistically significantly lower values for non-responder patients and, combined with the duration of the QRS complexes (the current gold standard for predicting CRT response), achieved the following performance: 86% accuracy, 88% sensitivity, and 80% specificity. In this manner, the proposed ECG marker may help clinicians predict positive response to CRT in a non-invasive way, in order to minimize unsuccessful procedures.

  7. Isometric immersions, energy minimization and self-similar buckling in non-Euclidean elastic sheets

    NASA Astrophysics Data System (ADS)

    Gemmer, John; Sharon, Eran; Shearman, Toby; Venkataramani, Shankar C.

    2016-04-01

    The edges of torn plastic sheets and growing leaves often display hierarchical buckling patterns. We show that this complex morphology i) emerges even in zero strain configurations, and ii) is driven by a competition between the two principal curvatures, rather than between bending and stretching. We identify the key role of branch point (or “monkey saddle”) singularities in generating complex wrinkling patterns in isometric immersions, and show how they arise naturally from minimizing the elastic energy.

  8. Graphical approach for multiple values logic minimization

    NASA Astrophysics Data System (ADS)

    Awwal, Abdul Ahad S.; Iftekharuddin, Khan M.

    1999-03-01

    Multiple valued logic (MVL) is sought for designing high-complexity, highly compact, parallel digital circuits. However, the practical realization of an MVL-based system depends on optimizing its cost, which directly affects the optical setup. We propose a minimization technique for MVL optimization based on graphical visualization, such as a Karnaugh map. The proposed method is used to solve signed-digit binary and ternary logic minimization problems. The usefulness of the minimization technique is demonstrated for the optical implementation of MVL circuits.

  9. Investigation of Different Constituent Encoders in a Turbo-code Scheme for Reduced Decoder Complexity

    NASA Technical Reports Server (NTRS)

    Kwatra, S. C.

    1998-01-01

    A large number of papers have been published attempting to give some analytical basis for the performance of Turbo-codes. It has been shown that performance improves with increased interleaver length. Also, procedures have been given to pick the best constituent recursive systematic convolutional codes (RSCCs). However, testing by computer simulation is still required to verify these results. This thesis begins by describing the encoding and decoding schemes used. Next, simulation results on several memory-4 RSCCs are shown. It is found that the best BER performance at low $E_b/N_0$ is not given by the RSCCs that were found using the analytic techniques given so far. Next, results are given from simulations using a smaller-memory RSCC for one of the constituent encoders. A significant reduction in decoding complexity is obtained with minimal loss in performance. Simulation results are then given for a rate 1/3 Turbo-code, with the result that this code performed as well as a rate 1/2 Turbo-code as measured by the distance from their respective Shannon limits. Finally, the results of simulations where an inaccurate noise variance measurement was used are given. From these it was observed that Turbo-decoding is fairly stable with regard to noise variance measurement.

  10. A Rapid Convergent Low Complexity Interference Alignment Algorithm for Wireless Sensor Networks.

    PubMed

    Jiang, Lihui; Wu, Zhilu; Ren, Guanghui; Wang, Gangyi; Zhao, Nan

    2015-07-29

    Interference alignment (IA) is a novel technique that can effectively eliminate interference and approach the sum capacity of wireless sensor networks (WSNs) when the signal-to-noise ratio (SNR) is high, by casting the desired signal and the interference into different signal subspaces. The traditional alternating minimization interference leakage (AMIL) algorithm for IA shows good performance in high-SNR regimes; however, its complexity increases dramatically as the number of users and antennas increases, limiting its application in practical systems. In this paper, a novel IA algorithm, called the directional quartic optimal (DQO) algorithm, is proposed to minimize the interference leakage with rapid convergence and low complexity. The properties of the AMIL algorithm are investigated, and it is discovered that the difference between two consecutive iteration results of the AMIL algorithm approximately points to the convergence solution when the precoding and decoding matrices obtained from the intermediate iterations are sufficiently close to their convergence values. Based on this important property, the proposed DQO algorithm employs a line search procedure so that it can converge to the destination directly. In addition, the optimal step size can be determined analytically by optimizing a quartic function. Numerical results show that the proposed DQO algorithm can suppress the interference leakage more rapidly than the traditional AMIL algorithm, and can achieve the same sum rate as the AMIL algorithm with far fewer iterations and less execution time.

  11. Systematic review of learning curves for minimally invasive abdominal surgery: a review of the methodology of data collection, depiction of outcomes, and statistical analysis.

    PubMed

    Harrysson, Iliana J; Cook, Jonathan; Sirimanna, Pramudith; Feldman, Liane S; Darzi, Ara; Aggarwal, Rajesh

    2014-07-01

    To determine how minimally invasive surgical learning curves are assessed and to define an ideal framework for this assessment. Learning curves have implications for training and for the adoption of new procedures and devices. In 2000, Ramsay et al reviewed the learning curve literature and called for improved reporting and statistical evaluation of learning curves. Since then, a body of literature on learning curves has emerged, but the presentation and analysis vary. A systematic search was performed of MEDLINE, EMBASE, ISI Web of Science, ERIC, and the Cochrane Library from 1985 to August 2012. The inclusion criteria were English-language studies of minimally invasive abdominal surgery that formally analyzed the learning curve. 592 (11.1%) of the identified studies met the selection criteria. Time is the most commonly used proxy for the learning curve (508, 86%). Intraoperative outcomes were used in 316 (53%) of the articles, postoperative outcomes in 306 (52%), technical skills in 102 (17%), and patient-oriented outcomes in 38 (6%). Over time, there was evidence of an increase in the relative number of laparoscopic and robotic studies (P < 0.001) without statistical evidence of a change in the complexity of analysis (P = 0.121). Assessment of learning curves is needed to inform surgical training and to evaluate new clinical procedures. An ideal analysis would account for the degree of complexity of individual cases and the inherent differences between surgeons. There is no single proxy that best represents the success of surgery, and hence multiple outcomes should be collected.

  12. Singularities of Three-Layered Complex-Valued Neural Networks With Split Activation Function.

    PubMed

    Kobayashi, Masaki

    2018-05-01

    There are three important concepts related to learning processes in neural networks: reducibility, nonminimality, and singularity. Although the definitions of these three concepts differ, they are equivalent in real-valued neural networks. This is also true of complex-valued neural networks (CVNNs) with hidden neurons not employing biases. The situation of CVNNs with hidden neurons employing biases, however, is very complicated. Exceptional reducibility was found, and it was shown that reducibility and nonminimality are not the same. Irreducibility consists of minimality and exceptional reducibility. The relationship between minimality and singularity has not yet been established. In this paper, we describe our surprising finding that minimality and singularity are independent. We also provide several examples based on exceptional reducibility.

  13. Robotic thoracic surgery: The state of the art

    PubMed Central

    Kumar, Arvind; Asaf, Belal Bin

    2015-01-01

    Minimally invasive thoracic surgery has come a long way. It has rapidly progressed to complex procedures such as lobectomy, pneumonectomy, esophagectomy, and resection of mediastinal tumors. Video-assisted thoracic surgery (VATS) offered perceptible benefits over thoracotomy in terms of less postoperative pain and narcotic utilization, shorter ICU and hospital stay, decreased incidence of postoperative complications combined with quicker return to work, and better cosmesis. However, despite its obvious advantages, the general thoracic surgical community has been relatively slow in adopting VATS more widely. The introduction of the da Vinci surgical system has helped overcome certain inherent limitations of VATS, such as two-dimensional (2D) vision and counterintuitive movement using long rigid instruments, allowing thoracic surgeons to perform a plethora of minimally invasive thoracic procedures more efficiently. Although the cumulative experience worldwide is still limited and evolving, robotic thoracic surgery is an evolution over VATS. There is, however, considerable concern among established high-volume VATS centers regarding the superiority of the robotic technique. We have over 7 years' experience and believe that any new technology designed to make minimally invasive surgery easier and more comfortable for the surgeon is likely to have better and safer outcomes in the long run. Our only concern is its cost-effectiveness, and we believe that if the cost factor is removed, more and more surgeons will use the technology, which will increase the spectrum and the reach of minimally invasive thoracic surgery. This article reviews worldwide experience with robotic thoracic surgery and addresses the potential benefits and limitations of using the robotic platform for the performance of thoracic surgical procedures. PMID:25598601

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gebis, Joseph; Oliker, Leonid; Shalf, John

    The disparity between microprocessor clock frequencies and memory latency is a primary reason why many demanding applications run well below peak achievable performance. Software-controlled scratchpad memories, such as the Cell local store, attempt to ameliorate this discrepancy by enabling precise control over memory movement; however, scratchpad technology confronts the programmer and compiler with an unfamiliar and difficult programming model. In this work, we present the Virtual Vector Architecture (ViVA), which combines the memory semantics of vector computers with a software-controlled scratchpad memory in order to provide a more effective and practical approach to latency hiding. ViVA requires minimal changes to the core design and could thus be easily integrated with conventional processor cores. To validate our approach, we implemented ViVA on the Mambo cycle-accurate full system simulator, which was carefully calibrated to match the performance of our underlying PowerPC Apple G5 architecture. Results show that ViVA is able to deliver significant performance benefits over scalar techniques for a variety of memory access patterns as well as two important memory-bound compact kernels, corner turn and sparse matrix-vector multiplication, achieving a 2x-13x improvement compared to the scalar version. Overall, our preliminary ViVA exploration points to a promising approach for improving application performance on leading microprocessors with minimal design and complexity costs, in a power-efficient manner.

  15. Markovian robots: Minimal navigation strategies for active particles

    NASA Astrophysics Data System (ADS)

    Nava, Luis Gómez; Großmann, Robert; Peruani, Fernando

    2018-04-01

    We explore minimal navigation strategies for active particles in complex, dynamical, external fields, introducing a class of autonomous, self-propelled particles which we call Markovian robots (MR). These machines are equipped with a navigation control system (NCS) that triggers random changes in the direction of self-propulsion of the robots. The internal state of the NCS is described by a Boolean variable that adopts two values. The temporal dynamics of this Boolean variable is dictated by a closed Markov chain (ensuring the absence of fixed points in the dynamics) with transition rates that may depend exclusively on the instantaneous, local value of the external field. Importantly, the NCS does not store past measurements of this value in continuous, internal variables. We show that despite these strong constraints, it is possible to conceive closed Markov chain motifs that lead to nontrivial motility behaviors of the MR in one, two, and three dimensions. By analytically reducing the complexity of the NCS dynamics, we obtain an effective description of the long-time motility behavior of the MR that allows us to identify the minimum requirements in the design of NCS motifs and transition rates to perform complex navigation tasks such as adaptive gradient following, detection of minima or maxima, or selection of a desired value in a dynamical, external field. We put these ideas into practice by assembling a robot that operates by the proposed minimalistic NCS to evaluate the robustness of MR, providing a proof of concept that it is possible to navigate through complex information landscapes with such a simple NCS whose internal state can be stored in one bit. These ideas may prove useful for the engineering of miniaturized robots.
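
    A minimal 1D sketch of the idea, assuming numpy: a two-state NCS whose transition probabilities depend only on the instantaneous field value, with a tumble tied to one specific transition. The rates and field are illustrative choices; whether a given motif climbs the gradient depends on them.

    ```python
    # One-bit "Markovian robot" navigating a static 1D field (sketch).
    import numpy as np

    rng = np.random.default_rng(4)
    field = lambda x: np.exp(-(x - 50.0) ** 2 / 200.0)  # external field, maximum at x = 50

    def simulate(steps=200_000, dt=0.1, v=1.0):
        x, direction, q = 0.0, 1, 0                     # position, heading, NCS bit
        trace = np.empty(steps)
        for t in range(steps):
            c = field(x)                                # instantaneous local measurement only
            k01 = 1.0 - 0.9 * c                         # 0 -> 1 fires often where the field is low
            k10 = 0.5                                   # 1 -> 0 relaxation, field-independent
            if q == 0 and rng.random() < k01 * dt:
                q = 1
                direction = rng.choice((-1, 1))         # tumble tied to the 0 -> 1 transition
            elif q == 1 and rng.random() < k10 * dt:
                q = 0                                   # closed chain: no fixed point
            x += v * direction * dt
            trace[t] = x
        return trace

    trace = simulate()
    print("mean position (second half):", round(float(trace[len(trace) // 2:].mean()), 1))
    ```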

  16. A duality framework for stochastic optimal control of complex systems

    DOE PAGES

    Malikopoulos, Andreas A.

    2016-01-01

    In this study, we address the problem of minimizing the long-run expected average cost of a complex system consisting of interactive subsystems. We formulate a multiobjective optimization problem of the one-stage expected costs of the subsystems and provide a duality framework to prove that the control policy yielding the Pareto optimal solution minimizes the average cost criterion of the system. We provide the conditions of existence and a geometric interpretation of the solution. For practical situations having constraints consistent with those studied here, our results imply that the Pareto control policy may be of value when we seek to derive online the optimal control policy in complex systems.

  17. A Classification System to Guide Physical Therapy Management in Huntington Disease: A Case Series.

    PubMed

    Fritz, Nora E; Busse, Monica; Jones, Karen; Khalil, Hanan; Quinn, Lori

    2017-07-01

    Individuals with Huntington disease (HD), a rare neurological disease, experience impairments in mobility and cognition throughout their disease course. The Medical Research Council framework provides a schema that can be applied to the development and evaluation of complex interventions, such as those provided by physical therapists. Treatment-based classifications, based on expert consensus and available literature, are helpful in guiding physical therapy management across the stages of HD. Such classifications also contribute to the development and further evaluation of well-defined complex interventions in this highly variable and complex neurodegenerative disease. The purpose of this case series was to illustrate the use of these classifications in the management of 2 individuals with late-stage HD. Two females, 40 and 55 years of age, with late-stage HD participated in this case series. Both experienced progressive declines in ambulatory function and balance as well as falls or fear of falling. Both individuals received daily care in the home for activities of daily living. Physical therapy Treatment-Based Classifications for HD guided the interventions and outcomes. Eight weeks of in-home balance training, strength training, task-specific practice of functional activities including transfers and walking tasks, and family/carer education were provided. Both individuals demonstrated improvements that met or exceeded the established minimal detectable change values for gait speed and Timed Up and Go performance. Both also demonstrated improvements on Berg Balance Scale and Physical Performance Test performance, with 1 of the 2 individuals exceeding the established minimal detectable changes for both tests. Reductions in fall risk were evident in both cases. These cases provide proof-of-principle to support use of treatment-based classifications for physical therapy management in individuals with HD. Traditional classification of early-, mid-, and late-stage disease progression may not reflect patients' true capabilities; those with late-stage HD may be as responsive to interventions as those at an earlier disease stage.Video Abstract available for additional insights from the authors (see Supplemental Digital Content 1, available at: http://links.lww.com/JNPT/A172).

  18. Metabolic responses to the seated calf press exercise performed against inertial resistance.

    PubMed

    Caruso, John F; Herron, Jacquelyn C; Hernandez, Daniel A; Porter, Aaron; Schweickert, Torrey; Manning, Tommy F

    2005-11-01

    Future in-flight strength training devices may use inertial resistance to abate mass and strength losses to muscle groups such as the triceps surae, which incurs pronounced deficits from space travel. Yet little data exist regarding physiological outcomes of triceps surae exercise performed against inertial resistance. Two sets of subjects were employed to note either blood lactate (La⁻) or net caloric cost responses to seated calf presses done on an inertial resistance ergometer. Both sets of subjects performed 3 identical 3-set, 10-repetition workouts. Blood La⁻ measurements were made pre- and 5 min post-exercise. During workouts, breath-by-breath O2 uptake values were also recorded to help determine the net caloric cost of exercise. Compared to pre-exercise (mean ± SEM) blood La⁻ (2.01 ± 0.08 mmol·L⁻¹) values, post-exercise (4.73 ± 0.24 mmol·L⁻¹) measurements showed a significant increase. Delta (post/pre difference) La⁻ correlated significantly (r = 0.31-0.34) with several workout performance measures. Net caloric cost averaged 52.82 ± 3.26 kcal per workout; multivariate regression showed a subject's height, body mass, and body surface area described the variance associated with energy expenditure. Workouts evoked minimal energy expenditure, though anaerobic glycolysis likely played a major role in ATP resynthesis. Metabolic and exercise performance measures were likely influenced by series elastic element involvement of the triceps surae-Achilles tendon complex. Ergometer calf presses provided a high-intensity workout stimulus with a minimal metabolic cost.

  19. Free Energy Minimization Calculation of Complex Chemical Equilibria. Reduction of Silicon Dioxide with Carbon at High Temperature.

    ERIC Educational Resources Information Center

    Wai, C. M.; Hutchinson, S. G.

    1989-01-01

    Discusses the calculation of free energy in reactions between silicon dioxide and carbon. Describes several computer programs for calculating the free energy minimization and their uses in chemistry classrooms. Lists 16 references. (YP)

  20. Development of peptoid-based ligands for the removal of cadmium from biological media

    DOE PAGES

    Knight, Abigail S.; Zhou, Effie Y.; Francis, Matthew B.

    2015-05-14

    Cadmium poisoning poses a serious health concern due to cadmium's increasing industrial use, yet there is currently no recommended treatment. The selective coordination of cadmium in a biological environment, i.e. in the presence of serum ions, small molecules, and proteins, is a difficult task. To address this challenge, a combinatorial library of peptoid-based ligands has been evaluated to identify structures that selectively bind to cadmium in human serum with minimal chelation of essential metal ions. Eighteen unique ligands were identified in this screening procedure, and the binding affinity of each was measured using metal titrations monitored by UV-vis spectroscopy. To evaluate the significance of each chelating moiety, sequence rearrangements and substitutions were examined. Analysis of a metal-ligand complex by NMR spectroscopy highlighted the importance of particular residues. Depletion experiments were performed in serum mimetics and in human serum with exogenously added cadmium. These depletion experiments were used to compare and demonstrate the ability of these peptoids to remove cadmium from blood-like mixtures. In one of these depletion experiments, the peptoid sequence was able to deplete the cadmium to a level comparable to the reported acute toxicity limit. Evaluation of the metal selectivity in buffered solution and in human serum was performed to verify minimal off-target binding. These studies highlight a screening platform for the identification of metal-binding ligands that are capable of binding in a complex environment. They additionally demonstrate the potential utility of biologically compatible ligands for the treatment of heavy metal poisoning.

  1. Development of peptoid-based ligands for the removal of cadmium from biological media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knight, Abigail S.; Zhou, Effie Y.; Francis, Matthew B.

    Cadmium poisoning poses a serious health concern due to cadmium's increasing industrial use, yet there is currently no recommended treatment. The selective coordination of cadmium in a biological environment—i.e. in the presence of serum ions, small molecules, and proteins—is a difficult task. To address this challenge, a combinatorial library of peptoid-based ligands has been evaluated to identify structures that selectively bind to cadmium in human serum with minimal chelation of essential metal ions. Eighteen unique ligands were identified in this screening procedure, and the binding affinity of each was measured using metal titrations monitored by UV-vis spectroscopy. To evaluate the significance of each chelating moiety, sequence rearrangements and substitutions were examined. Analysis of a metal–ligand complex by NMR spectroscopy highlighted the importance of particular residues. Depletion experiments were performed in serum mimetics and human serum with exogenously added cadmium. These depletion experiments were used to compare and demonstrate the ability of these peptoids to remove cadmium from blood-like mixtures. In one of these depletion experiments, the peptoid sequence was able to deplete the cadmium to a level comparable to the reported acute toxicity limit. Evaluation of the metal selectivity in buffered solution and in human serum was performed to verify minimal off-target binding. These studies highlight a screening platform for the identification of metal ligands that are capable of binding in a complex environment. They additionally demonstrate the potential utility of biologically-compatible ligands for the treatment of heavy metal poisoning.

  2. Anatrophic Nephrolithotomy in the Management of Large Staghorn Calculi - A Single Centre Experience.

    PubMed

    Keshavamurthy, Ramaiah; Karthikeyan, Vilvapathy Senguttuvan; Mallya, Ashwin; Sreenivas, Jayaram; Nelivigi, Girish Gurubasappa; Kamath, Ananth Janarthan

    2017-05-01

    With advances in endourology, open stone surgery for staghorn calculi has markedly diminished. Anatrophic Nephrolithotomy (AN) is performed for complex staghorn stones which cannot be cleared by a reasonable number of Percutaneous Nephrolithotomy (PNL) attempts. The aim was to assess the indications and outcomes of AN in the modern era. Between April 2008 and July 2015, AN was done in 14 renal units in 13 patients. In this retrospective study, demography, stone characteristics, operative details, clearance and long term outcomes were assessed. AN was performed for complex staghorn calculi involving the pelvis and all calyces in 10 patients, infundibular stenosis in two patients and failed PNL in one patient. Mean (SD) in situ cold ischemia time was 47.64 (5.27) minutes. A retroperitoneal drain and double J stent were placed in all 13 patients. Median (IQR) estimated blood loss was 130 (75) ml. There was no perioperative mortality. Surgical site infection was seen in 2 patients and urosepsis in 2 patients. The drain was removed at a mean (SD) of 9.11 (6.15) days. Mean (SD) postoperative length of hospitalization was 15.44 (7.14) days. Stent removal was done in all patients between 2-8 weeks. Median (IQR) clearance was 95% (7.5%). There was no renal failure and no new calculi during the follow-up period {median (IQR): 1 (3) years}. AN is effective in the management of large staghorn calculi in which minimally invasive approaches have failed, achieving 80%-100% clearance with little need for secondary interventions. Renal function is preserved, and with the emergence of laparoscopy and robotics, postoperative stay is minimized, with expedited recovery and results comparable to open surgery.

  3. A comprehensive study of MPI parallelism in three-dimensional discrete element method (DEM) simulation of complex-shaped granular particles

    NASA Astrophysics Data System (ADS)

    Yan, Beichuan; Regueiro, Richard A.

    2018-02-01

    A three-dimensional (3D) DEM code for simulating complex-shaped granular particles is parallelized using the message-passing interface (MPI). The concepts of link-block, ghost/border layer, and migration layer are put forward for the design of the parallel algorithm, and a theoretical function for 3-D DEM scalability and memory usage is derived. Many performance-critical implementation details are managed optimally to achieve high performance and scalability, such as: minimizing communication overhead, maintaining dynamic load balance, handling particle migrations across block borders, transmitting C++ dynamic objects of particles between MPI processes efficiently, and eliminating redundant contact information between adjacent MPI processes. The code executes on multiple US Department of Defense (DoD) supercomputers and was tested on up to 2048 compute nodes simulating 10 million three-axis ellipsoidal particles. Performance analyses of the code including speedup, efficiency, scalability, and granularity across five orders of magnitude of simulation scale (number of particles) are provided, and they demonstrate high speedup and excellent scalability. It is also discovered that communication time is a decreasing function of the number of compute nodes in strong scaling measurements. The code's capability of simulating a large number of complex-shaped particles on modern supercomputers will be of value in both laboratory studies on micromechanical properties of granular materials and many realistic engineering applications involving granular materials.
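
    The ghost/border-layer idea is the heart of such a parallelization: each rank sends the particles near its block boundary to the neighboring rank, which stores them as read-only ghosts for contact detection. A minimal 1-D mpi4py sketch of that exchange follows; the actual code is C++ and 3-D, and the border width and particle counts here are hypothetical.

      # Sketch: 1-D ghost/border-layer exchange between neighboring MPI ranks
      # (mpi4py stand-in for the paper's 3-D C++ link-block scheme).
      from mpi4py import MPI
      import numpy as np

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      # Each rank owns particles in [rank, rank + 1); border width 0.1 (assumed).
      x = np.sort(np.random.uniform(rank, rank + 1, 100))
      to_left = x[x < rank + 0.1]            # border layer sent to left neighbor
      to_right = x[x > rank + 0.9]           # border layer sent to right neighbor

      left = rank - 1 if rank > 0 else MPI.PROC_NULL
      right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

      # Received border layers become this rank's ghost layers.
      ghosts_r = comm.sendrecv(to_left, dest=left, source=right)
      ghosts_l = comm.sendrecv(to_right, dest=right, source=left)
      count = lambda g: 0 if g is None else len(g)
      print(f"rank {rank}: {count(ghosts_l)} + {count(ghosts_r)} ghosts")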

  4. Factors Influencing New RNs' Supervisory Performance in Long-Term Care Facilities.

    PubMed

    Prentice, Dawn; Boscart, Veronique; McGilton, Katherine S; Escrig, Astrid

    2017-12-01

    In long-term care facilities (LTCF), registered nurses (RNs) perform both clinical and supervisory roles as part of a team aiming to provide high-quality care to residents. The residents have several co-morbidities and complex care needs. Unfortunately, new RNs receive minimal preparation in gerontology and supervisory experience during their program, leading to low retention rates and affecting resident outcomes. This qualitative study explored factors that influence supervisory performance of new RNs in LTCF from the perspective of 24 participants from Ontario, Canada. Data were collected through individual interviews, followed by a directed content analysis. Three levels of influences were identified: personal influences, organizational influences, and external influences. Each level presented with sub-elements, further describing the factors that impact the supervisory performance of the new RN. To retain new RNs in LTC, organizations must provide additional gerontological education and mentoring for new RNs to flourish in their supervisory roles.

  5. Reframed Genome-Scale Metabolic Model to Facilitate Genetic Design and Integration with Expression Data.

    PubMed

    Gu, Deqing; Jian, Xingxing; Zhang, Cheng; Hua, Qiang

    2017-01-01

    Genome-scale metabolic network models (GEMs) have played important roles in the design of genetically engineered strains and have helped biologists to decipher metabolism. However, due to the complex gene-reaction relationships that exist in model systems, most algorithms have limited capabilities with respect to directly predicting accurate genetic designs for metabolic engineering. In particular, methods that predict reaction knockout strategies leading to overproduction are often impractical in terms of gene manipulations. Recently, we proposed a method named logical transformation of model (LTM) to simplify the gene-reaction associations by introducing intermediate pseudo reactions, which makes it possible to generate genetic designs. Here, we propose an alternative method that relieves researchers from deciphering complex gene-reaction associations by adding pseudo gene-controlling reactions. In comparison to LTM, this new method introduces fewer pseudo reactions and generates a much smaller model system, named gModel. We showed that gModel allows two seldom reported applications: identification of minimal genomes and design of minimal cell factories within a modified OptKnock framework. In addition, gModel could be used to integrate expression data directly and improve the performance of the E-Fmin method for predicting fluxes. In conclusion, the model transformation procedure will facilitate genetic research based on GEMs, extending their applications.
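
    The complexity being removed is Boolean: each reaction's gene-protein-reaction (GPR) rule combines genes with and/or. As a point of reference, the sketch below evaluates such a rule against a knockout set to decide whether a reaction remains active; the rule string and gene names are hypothetical, and the paper's transformation goes further by encoding this logic as pseudo reactions inside the model itself.

      # Sketch: evaluating a Boolean gene-reaction (GPR) rule against knockouts
      # (rule strings and gene names are hypothetical).
      def reaction_active(gpr: str, knocked_out: set) -> bool:
          genes = {g for g in gpr.replace("(", " ").replace(")", " ").split()
                   if g not in ("and", "or")}
          state = {g: (g not in knocked_out) for g in genes}
          return eval(gpr, {"__builtins__": {}}, state)  # rule uses and/or only

      gpr = "(g1 and g2) or g3"
      print(reaction_active(gpr, knocked_out={"g2"}))        # True via g3
      print(reaction_active(gpr, knocked_out={"g2", "g3"}))  # False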

  6. Energy transfer in light-adapted photosynthetic membranes: from active to saturated photosynthesis.

    PubMed

    Fassioli, Francesca; Olaya-Castro, Alexandra; Scheuring, Simon; Sturgis, James N; Johnson, Neil F

    2009-11-04

    In bacterial photosynthesis, the light-harvesting complexes LH2 and LH1 absorb sunlight energy and deliver it to reaction centers (RCs) with extraordinarily high efficiency. Submolecular resolution images have revealed that both the LH2:LH1 ratio and the architecture of the photosynthetic membrane itself adapt to light intensity. We investigate the functional implications of structural adaptations in the energy transfer performance in natural in vivo low- and high-light-adapted membrane architectures of Rhodospirillum photometricum. A model is presented to describe excitation migration across the full range of light intensities that cover states from active photosynthesis, where all RCs are available for charge separation, to saturated photosynthesis, where all RCs are unavailable. Our study outlines three key findings. First, there is a critical light-energy density, below which the low-light-adapted membrane is more efficient at absorbing photons and generating a charge separation at RCs than the high-light-adapted membrane. Second, connectivity of core complexes is similar in both membranes, suggesting that, despite different growth conditions, a preferred transfer pathway is through core-core contacts. Third, there may be minimal subareas on the membrane which, containing the same LH2:LH1 ratio, behave as minimal functional units as far as excitation transfer efficiency is concerned.

  7. Web mining for topics defined by complex and precise predicates

    NASA Astrophysics Data System (ADS)

    Lee, Ching-Cheng; Sampathkumar, Sushma

    2004-04-01

    The enormous growth of the World Wide Web has made it important to perform resource discovery efficiently for any given topic. Several new techniques have been proposed in recent years for this kind of topic-specific web mining, among them a key technique called focused crawling, which is able to crawl topic-specific portions of the web without having to explore all pages. Most existing research on focused crawling considers a simple topic definition that typically consists of one or more keywords connected by an OR operator. However, this kind of simple topic definition may result in too many irrelevant pages in which the same keyword appears in a wrong context. In this research we explore new strategies for crawling topic-specific portions of the web using complex and precise predicates. A complex predicate allows the user to precisely specify a topic using Boolean operators such as "AND", "OR" and "NOT". Our work concentrates first on defining a format to specify this kind of complex topic definition and second on devising a crawl strategy to crawl the topic-specific portions of the web defined by the complex predicate, efficiently and with minimal overhead. Our new crawl strategy improves the performance of topic-specific web crawling by reducing the number of irrelevant pages crawled. In order to demonstrate the effectiveness of the above approach, we have built a complete focused crawler called "Eureka" with complex predicate support, and a search engine that indexes and supports end-user searches on the crawled pages.
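
    A complex predicate of this kind is naturally represented as an expression tree over Boolean operators. The sketch below shows one hypothetical representation and its evaluation against a page's term set; it is not Eureka's actual predicate format.

      # Sketch: scoring pages against a complex Boolean topic predicate
      # (the tuple representation is hypothetical, not Eureka's format).
      def matches(predicate, words: set) -> bool:
          op, *args = predicate
          if op == "TERM":
              return args[0] in words
          if op == "AND":
              return all(matches(a, words) for a in args)
          if op == "OR":
              return any(matches(a, words) for a in args)
          if op == "NOT":
              return not matches(args[0], words)
          raise ValueError(op)

      topic = ("AND", ("TERM", "jaguar"), ("NOT", ("TERM", "car")))
      page = set("the jaguar is a large cat of the americas".split())
      print(matches(topic, page))  # True: relevant context, no wrong-sense term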

  8. Minimally invasive treatment for pubic ramus fractures combined with a sacroiliac joint complex injury.

    PubMed

    Yu, Xiaowei; Tang, Mingjie; Zhou, Zubin; Peng, Xiaochun; Wu, Tianyi; Sun, Yuqiang

    2013-08-01

    Fractures of the pubic rami due to low energy trauma are common in the elderly, with an incidence of 26 per 100,000 people per year in those aged more than 60 years. The purpose of this study was to evaluate the clinical application of this minimally invasive technique in patients with pubic ramus fractures combined with a sacroiliac joint complex injury, including its feasibility, merits, and limitations. Fifteen patients with pubic ramus fractures combined with sacroiliac joint injury were treated with the minimally invasive technique from June 2008 until April 2012. The quality of fracture reduction was evaluated according to the Matta standard. Fourteen cases were excellent (93.3 %), and one case was good (6.7 %). The fracture lines were healed 12 weeks after the surgery. The 15 patients had follow-up visits between four to 50 months (mean, 22.47 months). All patients returned to their pre-injury jobs and lifestyles. One patient suffered a deep vein thrombosis during the peri-operative period. A filter was placed in the patient before the surgery and was removed six weeks later. There was no thrombus found at the follow-up visits of this patient. The minimally invasive technique in patients with pubic ramus fractures combined with a sacroiliac joint complex injury provided satisfactory efficacy.

  9. Fast Transformation of Temporal Plans for Efficient Execution

    NASA Technical Reports Server (NTRS)

    Tsamardinos, Ioannis; Muscettola, Nicola; Morris, Paul

    1998-01-01

    Temporal plans permit significant flexibility in specifying the occurrence time of events. Plan execution can make good use of that flexibility. However, the advantage of execution flexibility is counterbalanced by the cost during execution of propagating the time of occurrence of events throughout the flexible plan. To minimize execution latency, this propagation needs to be very efficient. Previous work showed that every temporal plan can be reformulated as a dispatchable plan, i.e., one for which propagation to immediate neighbors is sufficient. A simple algorithm was given that finds a dispatchable plan with a minimum number of edges in cubic time and quadratic space. In this paper, we focus on the efficiency of the reformulation process, and improve on that result. A new algorithm is presented that uses linear space and has time complexity equivalent to Johnson's algorithm for all-pairs shortest-path problems. Experimental evidence confirms the practical effectiveness of the new algorithm. For example, on a large commercial application, the performance is improved by at least two orders of magnitude. We further show that the dispatchable plan, already minimal in the total number of edges, can also be made minimal in the maximum number of edges incoming or outgoing at any node.
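
    Johnson's algorithm, whose complexity the new reformulation matches, runs Bellman-Ford once to compute node potentials h, reweights every edge to w + h(u) - h(v) >= 0, and then runs Dijkstra from each node. A compact sketch on a toy graph follows; a temporal plan's simple temporal network would supply the real edges.

      # Sketch: the reweighting at the heart of Johnson's all-pairs algorithm
      # (toy graph; negative-cycle detection omitted for brevity).
      import heapq

      def johnson(nodes, edges):               # edges: list of (u, v, w)
          # Bellman-Ford from a virtual source linked to all nodes with weight 0.
          h = {v: 0 for v in nodes}
          for _ in range(len(nodes)):
              for u, v, w in edges:
                  if h[u] + w < h[v]:
                      h[v] = h[u] + w
          # Reweight so all edges are non-negative, then Dijkstra per node.
          adj = {v: [] for v in nodes}
          for u, v, w in edges:
              adj[u].append((v, w + h[u] - h[v]))
          dist = {}
          for s in nodes:
              d, pq = {s: 0}, [(0, s)]
              while pq:
                  du, u = heapq.heappop(pq)
                  if du > d.get(u, float("inf")):
                      continue
                  for v, w in adj[u]:
                      if du + w < d.get(v, float("inf")):
                          d[v] = du + w
                          heapq.heappush(pq, (d[v], v))
              dist[s] = {v: dv - h[s] + h[v] for v, dv in d.items()}  # undo shift
          return dist

      print(johnson(["a", "b", "c"],
                    [("a", "b", 2), ("b", "c", -1), ("a", "c", 4)]))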

  10. The demodulated band transform

    PubMed Central

    Kovach, Christopher K.; Gander, Phillip E.

    2016-01-01

    Background: Windowed Fourier decompositions (WFD) are widely used in measuring stationary and non-stationary spectral phenomena and in describing pairwise relationships among multiple signals. Although a variety of WFDs see frequent application in electrophysiological research, including the short-time Fourier transform, continuous wavelets, band-pass filtering and multitaper-based approaches, each carries certain drawbacks related to computational efficiency and spectral leakage. This work surveys the advantages of a WFD not previously applied in electrophysiological settings. New Methods: A computationally efficient form of complex demodulation, the demodulated band transform (DBT), is described. Results: DBT is shown to provide an efficient approach to spectral estimation with minimal susceptibility to spectral leakage. In addition, it lends itself well to adaptive filtering of non-stationary narrowband noise. Comparison with existing methods: A detailed comparison with alternative WFDs is offered, with an emphasis on the relationship between DBT and Thomson's multitaper. DBT is shown to perform favorably in combining computational efficiency with minimal introduction of spectral leakage. Conclusion: DBT is ideally suited to efficient estimation of both stationary and non-stationary spectral and cross-spectral statistics with minimal susceptibility to spectral leakage. These qualities are broadly desirable in many settings. PMID:26711370
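
    Generic complex demodulation, of which the DBT is an efficient windowed variant, shifts the band of interest to 0 Hz, low-pass filters, and decimates. The numpy sketch below shows only that generic pipeline; the filter and decimation choices are illustrative and are not the DBT's actual basis.

      # Sketch: generic complex demodulation of one frequency band
      # (illustrative filter/decimation, not the DBT's exact construction).
      import numpy as np

      fs, f0, bw = 1000.0, 40.0, 8.0           # Hz: sampling, center, bandwidth
      t = np.arange(0, 2.0, 1 / fs)
      x = np.sin(2 * np.pi * 40 * t) + 0.5 * np.random.randn(t.size)

      z = x * np.exp(-2j * np.pi * f0 * t)     # shift the 40 Hz band to 0 Hz
      taps = np.hamming(257)
      taps /= taps.sum()                       # crude low-pass FIR window
      analytic = np.convolve(z, taps, mode="same")
      step = int(fs / (2 * bw))                # decimate the now-narrowband signal
      envelope = 2 * np.abs(analytic[::step])  # amplitude envelope at 40 Hz
      print(envelope[:5])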

  11. Optics measurement and correction during beam acceleration in the Relativistic Heavy Ion Collider

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, C.; Marusic, A.; Minty, M.

    2014-09-09

    To minimize operational complexities, setup of collisions in high-energy circular colliders typically involves acceleration with near-constant β-functions followed by application of strong focusing quadrupoles at the interaction points (IPs) for the final beta-squeeze. At the Relativistic Heavy Ion Collider (RHIC) beam acceleration and optics squeeze are performed simultaneously. In the past, beam optics correction at RHIC has taken place at injection and at final energy, with some interpolation of corrections into the acceleration cycle. Recent measurements of the beam optics during acceleration and squeeze have evidenced significant beta-beats which, if corrected, could minimize undesirable emittance dilutions and maximize the spin polarization of polarized proton beams by avoidance of higher-order multipole fields sampled by particles within the bunch. In this report the methodology now operational at RHIC for beam optics corrections during acceleration with simultaneous beta-squeeze is presented together with measurements which conclusively demonstrate the superior beam control. As a valuable by-product, the corrections have minimized the beta-beat at the profile monitors, so reducing the dominant error in, and providing more precise measurements of, the evolution of the beam emittances during acceleration.

  12. The inverse problem of brain energetics: ketone bodies as alternative substrates

    NASA Astrophysics Data System (ADS)

    Calvetti, D.; Occhipinti, R.; Somersalo, E.

    2008-07-01

    Little is known about brain energy metabolism under ketosis, although there is evidence that ketone bodies have a neuroprotective role in several neurological disorders. We investigate the inverse problem of estimating reaction fluxes and transport rates in the different cellular compartments of the brain, when the data amount to a few measured arterial-venous concentration differences. By using a recently developed methodology to perform Bayesian Flux Balance Analysis and a new five-compartment model of the astrocyte-glutamatergic neuron cellular complex, we are able to identify the preferred biochemical pathways during shortage of glucose and in the presence of ketone bodies in the arterial blood. The analysis is performed in a minimally biased way, therefore revealing the potential of this methodology for hypothesis testing.
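
    For readers unfamiliar with flux balance analysis, the deterministic core that the Bayesian variant generalizes is a linear program over the mass-balance constraint S v = 0. A toy sketch follows; the network, bounds, and objective are invented, and the paper's method instead samples the posterior over fluxes rather than solving a single LP.

      # Sketch: deterministic flux balance analysis on a toy network
      # (the paper's Bayesian FBA samples fluxes; this shows only the LP core).
      import numpy as np
      from scipy.optimize import linprog

      # Toy network: uptake -> A, A -> B, B -> export (stoichiometry made up).
      S = np.array([[1, -1,  0],    # metabolite A balance
                    [0,  1, -1]])   # metabolite B balance
      bounds = [(0, 10), (0, 5), (0, 10)]
      c = np.array([0, 0, -1.0])    # maximize the export flux v3

      res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
      print("fluxes:", res.x)       # expect [5, 5, 5]: limited by v2's cap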

  13. Control strategy optimization of HVAC plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Facci, Andrea Luigi; Zanfardino, Antonella; Martini, Fabrizio

    In this paper we present a methodology to optimize the operating conditions of heating, ventilation and air conditioning (HVAC) plants to achieve a higher energy efficiency in use. Semi-empiric numerical models of the plant components are used to predict their performances as a function of their set-point and the environmental and occupied space conditions. The optimization is performed through a graph-based algorithm that finds the set-points of the system components that minimize energy consumption and/or energy costs, while matching the user energy demands. The resulting model can be used with systems of almost any complexity, featuring both HVAC components and energy systems, and is sufficiently fast to make it applicable to real-time setting.

  14. Minimum Control Requirements for Advanced Life Support Systems

    NASA Technical Reports Server (NTRS)

    Boulange, Richard; Jones, Harry

    2002-01-01

    Advanced control technologies are not necessary for the safe, reliable and continuous operation of Advanced Life Support (ALS) systems. ALS systems can be, and are, adequately controlled by simple, reliable, low-level methodologies and algorithms. The automation provided by advanced control technologies is claimed to decrease system mass and necessary crew time by reducing buffer size and minimizing crew involvement. In truth, these approaches increase control system complexity without clearly demonstrating an increase in reliability across the ALS system. Unless these systems are as reliable as the hardware they control, there are no savings to be had. A baseline ALS system is presented with the minimal control system required for its continuous safe and reliable operation. This baseline control system uses simple algorithms and scheduling methodologies and relies on human intervention only in the event of failure of the redundant backup equipment. This ALS system architecture is designed for reliable operation, with minimal components and minimal control system complexity. The fundamental design precept followed is "If it isn't there, it can't fail".

  15. Determining the Composition and Stability of Protein Complexes Using an Integrated Label-Free and Stable Isotope Labeling Strategy

    PubMed Central

    Greco, Todd M.; Guise, Amanda J.; Cristea, Ileana M.

    2016-01-01

    In biological systems, proteins catalyze the fundamental reactions that underlie all cellular functions, including metabolic processes and cell survival and death pathways. These biochemical reactions are rarely accomplished alone. Rather, they involve a concerted effect from many proteins that may operate in a directed signaling pathway and/or may physically associate in a complex to achieve a specific enzymatic activity. Therefore, defining the composition and regulation of protein complexes is critical for understanding cellular functions. In this chapter, we describe an approach that uses quantitative mass spectrometry (MS) to assess the specificity and the relative stability of protein interactions. Isolation of protein complexes from mammalian cells is performed by rapid immunoaffinity purification, and followed by in-solution digestion and high-resolution mass spectrometry analysis. We employ complementary quantitative MS workflows to assess the specificity of protein interactions using label-free MS and statistical analysis, and the relative stability of the interactions using a metabolic labeling technique. For each candidate protein interaction, scores from the two workflows can be correlated to minimize nonspecific background and profile protein complex composition and relative stability. PMID:26867737

  16. Orthogonal Array Testing for Transmit Precoding based Codebooks in Space Shift Keying Systems

    NASA Astrophysics Data System (ADS)

    Al-Ansi, Mohammed; Alwee Aljunid, Syed; Sourour, Essam; Mat Safar, Anuar; Rashidi, C. B. M.

    2018-03-01

    In Space Shift Keying (SSK) systems, transmit precoding based codebook approaches have been proposed to improve performance in limited feedback channels. The receiver performs an exhaustive search in a predefined Full-Combination (FC) codebook to select the optimal codeword that maximizes the Minimum Euclidean Distance (MED) between the received constellations. This research aims to reduce the codebook size in order to minimize the selection time and the number of feedback bits. Therefore, we propose to construct the codebooks based on Orthogonal Array Testing (OAT) methods due to their powerful inherent properties. These methods make it possible to acquire a short codebook whose codewords are sufficient to cover almost all the possible effects included in the FC codebook. Numerical results show the effectiveness of the proposed OAT codebooks in terms of system performance and complexity.
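
    Codeword selection itself is simple: for each candidate precoder, form the received constellation points (one per transmit-antenna index), compute the minimum pairwise Euclidean distance, and keep the maximizer. A numpy sketch follows; the channel and the phase-only codebook are hypothetical stand-ins for an OAT-constructed codebook.

      # Sketch: choosing the precoding codeword that maximizes the minimum
      # Euclidean distance (MED) between received SSK constellation points.
      import numpy as np

      rng = np.random.default_rng(0)
      Nt, Nr = 4, 2
      H = (rng.standard_normal((Nr, Nt))
           + 1j * rng.standard_normal((Nr, Nt))) / 2**0.5

      # Hypothetical phase-only codebook (an OAT design would shrink this list).
      codebook = [np.exp(2j * np.pi * rng.random(Nt)) for _ in range(16)]

      def med(H, p):
          y = H * p                 # column i = received point for antenna i
          return min(np.linalg.norm(y[:, i] - y[:, j])
                     for i in range(Nt) for j in range(i + 1, Nt))

      best = max(codebook, key=lambda p: med(H, p))
      print("best codeword MED:", round(med(H, best), 3))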

  17. Recycling and source reduction for long duration space habitation

    NASA Technical Reports Server (NTRS)

    Hightower, T. M.

    1992-01-01

    A direct mathematical approach has been established for characterizing the performance of closed-loop life support systems. The understanding that this approach gives clearly illustrates the options available for increasing the performance of a life support system by changing various parameters. New terms are defined and utilized, such as Segregation Factor, Resource Recovery Efficiency, Overall Reclamation Efficiency, Resupply Reduction Factor, and Life Support Extension Factor. The effects of increases in the expendable system supplies required due to increases in life support system complexity are shown. Minimizing resupply through increased recycling and source reduction is illustrated. The effects of recycling upon resupply launch cost are also shown. Finally, material balance analyses have been performed based on quantity and composition data for both supplies and wastes, to illustrate the use of this approach by comparing ten different closed-loop life support system cases.
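
    The paper's exact definitions are not reproduced here, but the leverage of recycling is easy to illustrate with an assumed steady-state relation (this is an illustration, not necessarily the paper's Resupply Reduction Factor): if a fraction \(\eta\) of a consumable is recovered,

      \[
      m_{\text{resupply}} = (1 - \eta)\, m_{\text{demand}},
      \qquad
      \frac{m_{\text{demand}}}{m_{\text{resupply}}} = \frac{1}{1 - \eta},
      \]

    so recovering 90% of a resource (eta = 0.9) cuts its resupply tenfold.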

  18. Integrated source and channel encoded digital communication system design study

    NASA Technical Reports Server (NTRS)

    Alem, W. K.; Huth, G. K.; Simon, M. K.

    1978-01-01

    The particular Ku-band carrier, PN despreading, and symbol synchronization strategies, which were selected for implementation in the Ku-band transponder aboard the orbiter, were assessed and evaluated from a systems performance viewpoint, verifying that system specifications were met. A study was performed of the design and implementation of tracking techniques which are suitable for incorporation into the Orbiter Ku-band communication system. Emphasis was placed on maximizing tracking accuracy and communication system flexibility while minimizing cost, weight, and system complexity of Orbiter and ground systems hardware. The payload communication study assessed the design and performance of the forward link and return link bent-pipe relay modes for attached and detached payloads. As part of this study, a design for a forward link bent-pipe was proposed which employs a residual carrier but which is tracked by the existing Costas loop.

  19. MATLAB Simulation of UPQC for Power Quality Mitigation Using an Ant Colony Based Fuzzy Control Technique

    PubMed Central

    Kumarasabapathy, N.; Manoharan, P. S.

    2015-01-01

    This paper proposes a fuzzy logic based new control scheme for the Unified Power Quality Conditioner (UPQC) for minimizing the voltage sag and total harmonic distortion in the distribution system and consequently improving the power quality. UPQC is a recent power electronic module which guarantees better power quality mitigation as it has both series-active and shunt-active power filters (APFs). The fuzzy logic controller has recently attracted a great deal of attention and conceptually offers the quality of simplicity in tackling complex systems with vagueness and ambiguity. In this research, the fuzzy logic controller is utilized for the generation of the reference signal controlling the UPQC. To enable this, a systematic approach for creating the fuzzy membership functions is carried out by using an ant colony optimization technique for optimal fuzzy logic control. An exhaustive simulation study using MATLAB/Simulink is carried out to investigate and demonstrate the performance of the proposed fuzzy logic controller, and the simulation results are compared with the PI controller in terms of its performance in improving the power quality by minimizing the voltage sag and total harmonic distortion. PMID:26504895

  20. Dynamism in a Semiconductor Industrial Machine Allocation Problem using a Hybrid of the Bio-inspired and Musical-Harmony Approach

    NASA Astrophysics Data System (ADS)

    Kalsom Yusof, Umi; Nor Akmal Khalid, Mohd

    2015-05-01

    Semiconductor industries need to constantly adjust to the rapid pace of change in the market. Most manufactured products usually have a very short life cycle. These scenarios imply the need to improve the efficiency of capacity planning, an important aspect of the machine allocation plan known for its complexity. Various studies have been performed to balance productivity and flexibility in the flexible manufacturing system (FMS). Many approaches have been developed by researchers to determine a suitable balance between exploration (global improvement) and exploitation (local improvement). However, not much work has focused on the domain of the machine allocation problem that considers the effects of machine breakdowns. This paper develops a model to minimize the effect of machine breakdowns, thus increasing productivity. The objectives are to minimize system unbalance and makespan as well as increase throughput while satisfying technological constraints such as machine time availability. To examine the effectiveness of the proposed model, experiments measuring throughput, system unbalance and makespan on real industrial datasets were performed with applications of intelligence techniques, that is, a hybrid of genetic algorithm and harmony search. The results yield a feasible solution to the domain problem.

  1. Laparoscopic liver resection: when to use the laparoscopic stapler device

    PubMed Central

    Gumbs, Andrew A.; Gayet, Brice

    2008-01-01

    Minimally invasive hepatic resection was first described by Gagner et al. in the early 1990s and has since become increasingly adopted by hepatobiliary and liver transplant surgeons. Several techniques exist to transect the hepatic parenchyma laparoscopically, including transection with stapler and/or energy devices, such as ultrasonic shears, radiofrequency ablation and bipolar devices. We believe that coagulative techniques allow for superior anatomic resections and ultimately permit the performance of more complex hepatic resections. In the stapling technique, Glisson's capsule is usually incised with an energy device until the parenchyma is thinned out, and multiple firings of the staplers are then used to transect the remaining parenchyma and larger bridging segmental vessels and ducts. Besides the economic constraints of using multiple stapler firings, the remaining staples have the disadvantage of hindering and even preventing additional hemostasis of the raw liver surface with monopolar and bipolar electrocautery. The laparoscopic stapler device is, however, useful for transection of the main portal branches and hepatic veins during minimally invasive major hepatic resections. Techniques to safely perform major hepatic resection with the above techniques are described, with an emphasis on when and how laparoscopic vascular staplers should be used. PMID:18773113

  2. Ideal AFROC and FROC observers.

    PubMed

    Khurd, Parmeshwar; Liu, Bin; Gindi, Gene

    2010-02-01

    Detection of multiple lesions in images is a medically important task, and free-response receiver operating characteristic (FROC) analysis and its variants, such as alternative FROC (AFROC) analysis, are commonly used to quantify performance in such tasks. However, ideal observers that optimize FROC or AFROC performance metrics have not yet been formulated in the general case. If available, such ideal observers may turn out to be valuable for imaging system optimization and in the design of computer-aided diagnosis techniques for lesion detection in medical images. In this paper, we derive ideal AFROC and FROC observers. They are ideal in that they maximize, amongst all decision strategies, the area, or any partial area, under the associated AFROC or FROC curve. Calculation of observer performance for these ideal observers is computationally quite complex. We can reduce this complexity by considering forms of these observers that use false positive reports derived from signal-absent images only. We also consider a Bayes risk analysis for the multiple-signal detection task with an appropriate definition of costs. A general decision strategy that minimizes Bayes risk is derived. With particular cost constraints, this general decision strategy reduces to the decision strategy associated with the ideal AFROC or FROC observer.
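
    For a single location, the Bayes-risk-minimizing decision strategy reduces to the familiar likelihood-ratio test; the paper's multi-lesion derivation builds on the same structure. As a hedged reference point (the generic binary rule, not the paper's general result):

      \[
      \Lambda(\mathbf{x}) = \frac{p(\mathbf{x} \mid \text{signal})}
                                 {p(\mathbf{x} \mid \text{no signal})}
      \;\gtrless\;
      \frac{C_{FP} - C_{TN}}{C_{FN} - C_{TP}}
      \cdot \frac{P(\text{no signal})}{P(\text{signal})},
      \]

    deciding "signal present" when the ratio exceeds the cost-and-prior threshold.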

  3. Motor demands impact speed of information processing in Autism Spectrum Disorders

    PubMed Central

    Kenworthy, Lauren; Yerys, Benjamin E.; Weinblatt, Rachel; Abrams, Danielle N.; Wallace, Gregory L.

    2015-01-01

    Objective: The apparent contradiction between preserved or even enhanced perceptual processing speed on inspection time tasks in autism spectrum disorders (ASD) and impaired performance on complex processing speed tasks that require motor output (e.g. the Wechsler Processing Speed Index) has not yet been systematically investigated. This study investigates whether adding motor output demands to an inspection time task impairs ASD performance compared to that of typically developing control (TDC) children. Method: The performance of children with ASD (n=28; mean FSIQ=115) and TDC (n=25; mean FSIQ=122) children was compared on processing speed tasks with increasing motor demand. Correlations were run between ASD task performance and Autism Diagnostic Observation Schedule (ADOS) Communication scores. Results: Performance by the ASD and TDC groups on a simple perceptual processing speed task with minimal motor demand was equivalent, though it diverged (ASD worse than TDC) on two tasks with the same stimuli but increased motor output demands. ASD performance on the moderate but not the high speeded motor output demand task was negatively correlated with ADOS communication symptoms. Conclusions: These data address the apparent contradiction between preserved inspection time in the context of slowed "processing speed" in ASD. They show that processing speed is preserved when motor demands are minimized, but that increased motor output demands interfere with the ability to act on perceptual processing of simple stimuli. Reducing motor demands (e.g. through the use of computers) may increase the capacity of people with ASD to demonstrate good perceptual processing in a variety of educational, vocational and social settings. PMID:23937483

  4. Performance Enhancement of MC-CDMA System through Novel Sensitive Bit Algorithm Aided Turbo Multi User Detection

    PubMed Central

    Kumaravel, Rasadurai; Narayanaswamy, Kumaratharan

    2015-01-01

    Multi carrier code division multiple access (MC-CDMA) is a promising multi-carrier modulation (MCM) technique for high data rate wireless communication over frequency selective fading channels. The MC-CDMA system is a combination of code division multiple access (CDMA) and orthogonal frequency division multiplexing (OFDM). The OFDM part reduces multipath fading and inter-symbol interference (ISI), and the CDMA part increases spectrum utilization. Advantages of this technique are its robustness in the case of multipath propagation and improved security with minimized ISI. Nevertheless, due to the loss of orthogonality at the receiver in a mobile environment, multiple access interference (MAI) appears. MAI is one of the factors that degrade the bit error rate (BER) performance of MC-CDMA systems. Multiuser detection (MUD) and turbo coding are the two dominant techniques for enhancing the performance of MC-CDMA systems in terms of BER and overcoming the effects of MAI. In this paper a low-complexity iterative soft sensitive bits algorithm (SBA) aided logarithmic maximum a posteriori (Log-MAP) based turbo MUD is proposed. Simulation results show that the proposed method provides better BER performance with low-complexity decoding, by mitigating the detrimental effects of MAI. PMID:25714917

  5. The effects of deanol on cognitive performance and electrophysiology in elderly humans.

    PubMed

    Marsh, G R; Linnoila, M

    1979-01-01

    Deanol (900 mg/day for 21 days) had no effect on learning a list of words when tested at weekly intervals. Tests of simple and complex reaction time and a test of continuous serial decoding of digits showed no enhancement with the drug. Several components of evoked potentials recorded from several scalp sites did show enhanced amplitude under drug treatment. These changes were not accompanied by changes in the EEG spectrum, as are seen with some other psychoactive drugs. Deanol seems to be an ineffective treatment for the normal slowing of cognitive function seen in normal elderly persons or in those with only minimal cognitive decline who are free of symptoms of dementia. Contrary to earlier reports, elderly persons were found to be able to benefit from warning signals in a complex reaction time task.

  6. Efficient two-dimensional compressive sensing in MIMO radar

    NASA Astrophysics Data System (ADS)

    Shahbazi, Nafiseh; Abbasfar, Aliazam; Jabbarian-Jahromi, Mohammad

    2017-12-01

    Compressive sensing (CS) has been a way to lower the sampling rate, leading to data reduction for processing in multiple-input multiple-output (MIMO) radar systems. In this paper, we further reduce the computational complexity of a pulse-Doppler collocated MIMO radar by introducing two-dimensional (2D) compressive sensing. To do so, we first introduce a new 2D formulation for the compressed received signals and then propose a new measurement matrix design for our 2D compressive sensing model that is based on minimizing the coherence of the sensing matrix using a gradient descent algorithm. The simulation results show that our proposed 2D measurement matrix design using gradient descent (2D-MMDGD) has much lower computational complexity compared to one-dimensional (1D) methods while having better performance in comparison with conventional methods such as the Gaussian random measurement matrix.
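
    A common concrete version of this design step minimizes the smooth surrogate ||A^T A - I||_F^2 over unit-norm columns, whose gradient is 4A(A^T A - I). The sketch below runs plain gradient descent on that surrogate; the paper's exact cost function and update rule may differ.

      # Sketch: measurement-matrix design by gradient descent on a smooth
      # coherence surrogate ||A^T A - I||_F^2 (the paper's cost may differ).
      import numpy as np

      def coherence(A):
          G = np.abs(A.T @ A)
          np.fill_diagonal(G, 0)
          return G.max()

      rng = np.random.default_rng(1)
      A = rng.standard_normal((20, 50))
      A /= np.linalg.norm(A, axis=0)

      for _ in range(500):
          G = A.T @ A - np.eye(A.shape[1])
          A -= 0.01 * (4 * A @ G)            # gradient of ||A^T A - I||_F^2
          A /= np.linalg.norm(A, axis=0)     # project back to unit-norm columns
      print("mutual coherence:", round(coherence(A), 3))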

  7. Soft micromachines with programmable motility and morphology

    PubMed Central

    Huang, Hen-Wei; Sakar, Mahmut Selman; Petruska, Andrew J.; Pané, Salvador; Nelson, Bradley J.

    2016-01-01

    Nature provides a wide range of inspiration for building mobile micromachines that can navigate through confined heterogeneous environments and perform minimally invasive environmental and biomedical operations. For example, microstructures fabricated in the form of bacterial or eukaryotic flagella can act as artificial microswimmers. Due to limitations in their design and material properties, these simple micromachines lack multifunctionality, effective addressability and manoeuvrability in complex environments. Here we develop an origami-inspired rapid prototyping process for building self-folding, magnetically powered micromachines with complex body plans, reconfigurable shape and controllable motility. Selective reprogramming of the mechanical design and magnetic anisotropy of body parts dynamically modulates the swimming characteristics of the micromachines. We find that tail and body morphologies together determine swimming efficiency and, unlike for rigid swimmers, the choice of magnetic field can subtly change the motility of soft microswimmers. PMID:27447088

  8. Soft micromachines with programmable motility and morphology.

    PubMed

    Huang, Hen-Wei; Sakar, Mahmut Selman; Petruska, Andrew J; Pané, Salvador; Nelson, Bradley J

    2016-07-22

    Nature provides a wide range of inspiration for building mobile micromachines that can navigate through confined heterogeneous environments and perform minimally invasive environmental and biomedical operations. For example, microstructures fabricated in the form of bacterial or eukaryotic flagella can act as artificial microswimmers. Due to limitations in their design and material properties, these simple micromachines lack multifunctionality, effective addressability and manoeuvrability in complex environments. Here we develop an origami-inspired rapid prototyping process for building self-folding, magnetically powered micromachines with complex body plans, reconfigurable shape and controllable motility. Selective reprogramming of the mechanical design and magnetic anisotropy of body parts dynamically modulates the swimming characteristics of the micromachines. We find that tail and body morphologies together determine swimming efficiency and, unlike for rigid swimmers, the choice of magnetic field can subtly change the motility of soft microswimmers.

  9. Bandwidth Efficient Modulation and Coding Techniques for NASA's Existing Ku/Ka-Band 225 MHz Wide Service

    NASA Technical Reports Server (NTRS)

    Gioannini, Bryan; Wong, Yen; Wesdock, John

    2005-01-01

    The National Aeronautics and Space Administration (NASA) has recently established the Tracking and Data Relay Satellite System (TDRSS) K-band Upgrade Project (TKUP), a project intended to enhance the TDRSS Ku-band and Ka-band Single Access Return 225 MHz (Ku/KaSAR-225) data service by adding the capability to process bandwidth-efficient signal designs and to replace the White Sands Complex (WSC) KSAR high data rate ground equipment and high rate switches, which are nearing obsolescence. As a precursor to this project, a modulation and coding study was performed to identify signal structures which maximized the data rate through the Ku/KaSAR-225 channel, minimized the required customer EIRP and ensured acceptable hardware complexity on the customer platform. This paper presents the results and conclusions of the TKUP modulation and coding study.

  10. Fourier Transform Mass Spectrometry: The Transformation of Modern Environmental Analyses

    PubMed Central

    Lim, Lucy; Yan, Fangzhi; Bach, Stephen; Pihakari, Katianna; Klein, David

    2016-01-01

    Unknown compounds in environmental samples are difficult to identify using standard mass spectrometric methods. Fourier transform mass spectrometry (FTMS) has revolutionized how environmental analyses are performed. With its unsurpassed mass accuracy, high resolution and sensitivity, FTMS gives researchers a tool for difficult and complex environmental analyses. Two features of FTMS are responsible for changing how complex analyses are accomplished. The first is the ability to determine, quickly and with high mass accuracy, the presence of unknown chemical residues in samples. For years, the field was limited by mass spectrometric methods that required knowing in advance what the compounds of interest were. Second, by utilizing the high resolution capabilities coupled with the low detection limits of FTMS, analysts can also dilute the sample sufficiently to minimize the ionization changes from varied matrices. PMID:26784175

  11. Minimal access surgery for mitral valve endocarditis.

    PubMed

    Barbero, Cristina; Marchetto, Giovanni; Ricci, Davide; Mancuso, Samuel; Boffini, Massimo; Cecchi, Enrico; De Rosa, Francesco Giuseppe; Rinaldi, Mauro

    2017-08-01

    Minimal access mitral valve surgery (MVS) has already proved to be feasible and effective, with low perioperative mortality and excellent long-term outcomes. However, experience in more complex valve diseases such as infective endocarditis (IE) remains limited. The aim of this retrospective study was to evaluate early and long-term results of minimal access MVS for IE. Data were entered into a dedicated database. Analysis was performed retrospectively for the 8-year period between January 2007 and April 2015. During the study period, 35 consecutive patients underwent minimal access MVS for IE at our department. Twenty-four had a diagnosis of native MV endocarditis (68.6%) and 11 of mitral prosthesis endocarditis (31.4%). Thirty patients underwent early MVS (85.7%), and 5 patients were operated on after the completion of antibiotic treatment (14.3%). Seven patients underwent MV repair (20%), 17 patients underwent MV replacement (48.6%), and 11 patients underwent mitral prosthesis replacement (31.4%). Thirty-day mortality was 11.4% (4 patients). No neurological or vascular complications were reported. One patient underwent reoperation for prosthesis IE relapse after 37 days. The overall actuarial survival rate at 1 and 5 years was 83%; freedom from MV reoperation and/or recurrence of IE at 1 and 5 years was 97%. Minimally invasive MVS for IE is feasible and associated with good early and long-term results. Accurate preoperative patient selection and transoesophageal echocardiographic evaluation are mandatory for surgical planning. © The Author 2017. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  12. Minimal Model of Quantum Kinetic Clusters for the Energy-Transfer Network of a Light-Harvesting Protein Complex.

    PubMed

    Wu, Jianlan; Tang, Zhoufei; Gong, Zhihao; Cao, Jianshu; Mukamel, Shaul

    2015-04-02

    The energy absorbed in a light-harvesting protein complex is often transferred collectively through aggregated chromophore clusters. For the population evolution of chromophores, the time-integrated effective rate matrix allows us to construct quantum kinetic clusters quantitatively and determine the reduced cluster-cluster transfer rates systematically, thus defining a minimal model of energy-transfer kinetics. For Fenna-Matthews-Olson (FMO) and light-harvesting complex II (LHCII) monomers, quantum Markovian kinetics of clusters can accurately reproduce the overall energy-transfer process on the long time scale. The dominant energy-transfer pathways are identified in the picture of aggregated clusters. The chromophores distributed extensively in various clusters can assist a fast and long-range energy transfer.
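
    One standard way to turn a site-basis rate matrix plus a cluster assignment into cluster-to-cluster rates is to weight each donor site by its within-cluster equilibrium population. The sketch below implements that lumping on toy numbers; the paper's construction uses time-integrated effective rates, so treat this only as the generic coarse-graining idea.

      # Sketch: lumping a site-basis rate matrix into cluster-to-cluster rates,
      # weighting donor sites by within-cluster equilibrium populations
      # (generic coarse-graining; not the paper's exact construction).
      import numpy as np

      K = np.array([[0.0, 5.0, 0.1],   # K[i, j]: rate j -> i (toy numbers)
                    [5.0, 0.0, 0.3],
                    [0.2, 0.4, 0.0]])
      pi = np.array([0.5, 0.3, 0.2])   # equilibrium populations (toy)
      clusters = {"A": [0, 1], "B": [2]}

      def lumped_rate(src, dst):
          w = pi[clusters[src]] / pi[clusters[src]].sum()  # donor weights
          return sum(w[a] * K[i, j]
                     for a, j in enumerate(clusters[src])
                     for i in clusters[dst])

      print("k(A->B) =", lumped_rate("A", "B"))  # 0.625*0.2 + 0.375*0.4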

  13. Linear Matrix Inequality Method for a Quadratic Performance Index Minimization Problem with a class of Bilinear Matrix Inequality Conditions

    NASA Astrophysics Data System (ADS)

    Tanemura, M.; Chida, Y.

    2016-09-01

    Many control system design problems are expressed as the minimization of a performance index under BMI conditions. A minimization problem expressed as LMIs, by contrast, can be easily solved because of the convexity of LMIs. Therefore, many researchers have studied transforming a variety of control design problems into convex minimization problems expressed as LMIs. This paper proposes an LMI method for a quadratic performance index minimization problem with a class of BMI conditions. The minimization problem treated in this paper includes design problems of state-feedback gain for switched systems, among others. The effectiveness of the proposed method is verified through a state-feedback gain design for switched systems and a numerical simulation using the designed feedback gains.
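
    The workhorse for recasting a quadratic performance bound as an LMI is the Schur complement; a generic instance (not the paper's specific switched-system construction) is:

      \[
      x^{\top} Q^{-1} x \le \gamma, \quad Q \succ 0
      \quad\Longleftrightarrow\quad
      \begin{bmatrix} \gamma & x^{\top} \\ x & Q \end{bmatrix} \succeq 0,
      \]

    after which minimizing gamma subject to the right-hand LMI is a convex semidefinite program.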

  14. Combining Static Model Checking with Dynamic Enforcement Using the Statecall Policy Language

    NASA Astrophysics Data System (ADS)

    Madhavapeddy, Anil

    Internet protocols encapsulate a significant amount of state, making implementing the host software complex. In this paper, we define the Statecall Policy Language (SPL) which provides a usable middle ground between ad-hoc coding and formal reasoning. It enables programmers to embed automata in their code which can be statically model-checked using SPIN and dynamically enforced. The performance overheads are minimal, and the automata also provide higher-level debugging capabilities. We also describe some practical uses of SPL by describing the automata used in an SSH server written entirely in OCaml/SPL.

  15. Applications for a hybrid operating room in thoracic surgery: from multidisciplinary procedures to image-guided video-assisted thoracoscopic surgery

    PubMed Central

    Terra, Ricardo Mingarini; Andrade, Juliano Ribeiro; Mariani, Alessandro Wasum; Garcia, Rodrigo Gobbo; Succi, Jose Ernesto; Soares, Andrey; Zimmer, Paulo Marcelo

    2016-01-01

    The concept of a hybrid operating room represents the union of a high-complexity surgical apparatus with state-of-the-art radiological tools (ultrasound, CT, fluoroscopy, or magnetic resonance imaging), in order to perform highly effective, minimally invasive procedures. Although the use of a hybrid operating room is well established in specialties such as neurosurgery and cardiovascular surgery, it has rarely been explored in thoracic surgery. Our objective was to discuss the possible applications of this technology in thoracic surgery, through the reporting of three cases. PMID:27812640

  16. DIY: "Do Imaging Yourself" - Conventional microscopes as powerful tools for in vivo investigation.

    PubMed

    Antunes, Maísa Mota; Carvalho, Érika de; Menezes, Gustavo Batista

    2018-01-01

    Intravital imaging has been increasingly employed in cell biology studies and is becoming one of the most powerful tools for in vivo investigation. Although some protocols can be extremely complex, most intravital imaging procedures can be performed using basic surgery and animal maintenance techniques. More importantly, regular confocal microscopes, the same ones used for imaging immunofluorescence slides, can also acquire high-quality intravital images and movies after minor adaptations. Here we propose minimal adaptations to stock microscopes that allow major improvements in different fields of scientific investigation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Robotics control using isolated word recognition of voice input

    NASA Technical Reports Server (NTRS)

    Weiner, J. M.

    1977-01-01

    A speech input/output system is presented that can be used to communicate with a task-oriented system. Human speech commands and synthesized voice output extend conventional information exchange capabilities between man and machine by utilizing audio input and output channels. The speech input facility comprises a hardware feature extractor and a microprocessor-implemented isolated word or phrase recognition system. The recognizer offers a medium-sized (100-command), syntactically constrained vocabulary, and exhibits close to real-time performance. The major portion of the recognition processing required is accomplished through software, minimizing the complexity of the hardware feature extractor.

  18. Liquid film demonstration experiment Skylab SL-4

    NASA Technical Reports Server (NTRS)

    Darbro, W.

    1975-01-01

    The liquid film demonstration experiment performed on Skylab 4 by Astronaut Gerald Carr, which involved the construction of water and soap films by boundary expansion and inertia, is discussed. Results include a 1-ml globule of water expanded into a 7-cm-diameter film as well as complex film structures produced by inertia whose lifetimes are longer in the low-g environment. Also discussed are 1-g acceleration experiments in which the unprovoked rupture of films was photographed and film lifetimes of stationary and rotated soap films were compared. Finally, there is a mathematical discussion regarding minimal surfaces, an isoperimetric problem, and liquid films.

  19. An in-silico walker

    NASA Astrophysics Data System (ADS)

    Xiao, Qiran; Chen, Yanping; Bereau, Tristan; Shi, Yunfeng

    2016-08-01

    The paradox of biomimetic research is to achieve bio-functionality, usually associated with sophisticated structures optimized by nature, with minimal structural complexity for ease of fabrication. Here we show, via reactive molecular dynamics simulations, that a three-particle trimer can exhibit a kinesin-like autonomous walk on a track. The autonomous motion is due to imbalanced transitions resulting from exothermic catalytic reactions, and the spatial asymmetry from the track. This molecular design can be realized by reproducing the particle-particle interactions in functionalized nano- or colloidal particles. Our results open up the possibility of fabricating bio-mimetic nano-systems with a minimalist approach.

  20. Inversion of 2-D DC resistivity data using rapid optimization and minimal complexity neural network

    NASA Astrophysics Data System (ADS)

    Singh, U. K.; Tiwari, R. K.; Singh, S. B.

    2010-02-01

    The backpropagation (BP) artificial neural network (ANN) optimization technique, based on the steepest descent algorithm, is known for its poor performance and does not ensure global convergence. Nonlinear and complex DC resistivity data require an efficient ANN model and more intensive optimization procedures for better results and interpretations. Improvements in the computational ANN modeling process are described, with the goals of enhancing the optimization process and reducing ANN model complexity. Well-established optimization methods, such as the radial basis algorithm (RBA) and the Levenberg-Marquardt algorithm (LMA), have frequently been used to deal with complexity and nonlinearity in such complex geophysical records. We examined the efficiency of trained LMA and RB networks by using 2-D synthetic resistivity data and then applied them to actual field vertical electrical resistivity sounding (VES) data collected from the Puga Valley, Jammu and Kashmir, India. The resulting ANN reconstruction resistivity results are compared with the results of existing inversion approaches, which are in good agreement. The depths and resistivity structures obtained by the ANN methods also correlate well with the known drilling results and geologic boundaries. The application of the above ANN algorithms proves to be robust and could be used for fast estimation of resistive structures for other complex earth models as well.
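
    For intuition, Levenberg-Marquardt training of a small network can be prototyped with SciPy's least-squares solver, which exposes an LM mode. The sketch below fits a one-hidden-layer network to synthetic one-dimensional data; it is a stand-in for the paper's resistivity-inversion mapping, not its actual architecture or data.

      # Sketch: Levenberg-Marquardt training of a tiny one-hidden-layer network
      # on synthetic data (stand-in for the resistivity inversion mapping).
      import numpy as np
      from scipy.optimize import least_squares

      rng = np.random.default_rng(2)
      X = np.linspace(-1, 1, 40)[:, None]     # stand-in inputs
      y = np.tanh(3 * X[:, 0]) + 0.05 * rng.standard_normal(40)
      H = 5                                    # hidden units

      def unpack(p):
          W1, b1 = p[:H].reshape(1, H), p[H:2*H]
          W2, b2 = p[2*H:3*H], p[3*H]
          return W1, b1, W2, b2

      def residuals(p):
          W1, b1, W2, b2 = unpack(p)
          hidden = np.tanh(X @ W1 + b1)
          return hidden @ W2 + b2 - y

      p0 = 0.1 * rng.standard_normal(3 * H + 1)
      fit = least_squares(residuals, p0, method="lm")   # LM optimizer
      print("RMS misfit:", np.sqrt(np.mean(fit.fun ** 2)))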

  1. Asymmetric flow field flow fractionation with light scattering detection - an orthogonal sensitivity analysis.

    PubMed

    Galyean, Anne A; Filliben, James J; Holbrook, R David; Vreeland, Wyatt N; Weinberg, Howard S

    2016-11-18

    Asymmetric flow field flow fractionation (AF4) has several instrumental factors that may have a direct effect on separation performance. A sensitivity analysis was applied to ascertain the relative importance of AF4 primary instrument factor settings for the separation of a complex environmental sample. The analysis evaluated the impact of instrumental factors, namely cross flow, ramp time, focus flow, injection volume, and run buffer concentration, on the multi-angle light scattering measurement of natural organic matter (NOM) molar mass (MM). A 2^(5-1) orthogonal fractional factorial design was used to minimize analysis time while preserving the accuracy and robustness in the determination of the main effects and interactions between any two instrumental factors. By assuming that separations resulting in smaller MM measurements would be more accurate, the analysis produced a ranked list of effects estimates for factors and interactions of factors based on their relative importance in minimizing the MM. The most important and statistically significant AF4 instrumental factors were buffer concentration and cross flow. The least important was ramp time. A parallel 2^(5-2) orthogonal fractional factorial design was also employed on five environmental factors for synthetic natural water samples containing silver nanoparticles (NPs), namely: NP concentration, NP size, NOM concentration, specific conductance, and pH. None of the water quality characteristic effects or interactions were found to be significant in minimizing the measured MM; however, the interaction between NP concentration and NP size was an important effect when considering NOM recovery. This work presents a structured approach for the rigorous assessment of AF4 instrument factors and optimal settings for the separation of complex samples, utilizing an efficient orthogonal fractional factorial design and appropriate graphical analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
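
    The design logic is compact enough to show directly: a 2^(5-1) half fraction is built by aliasing the fifth factor with the four-way interaction (here E = ABCD, one common choice), and each main effect is the contrast between the high and low runs. The sketch below uses an invented response in which one factor dominates; it mirrors the structure, not the study's data.

      # Sketch: a 2^(5-1) fractional factorial design with defining relation
      # E = ABCD, plus main-effect estimation (response values are made up).
      import itertools
      import numpy as np

      runs = []
      for a, b, c, d in itertools.product([-1, 1], repeat=4):
          runs.append((a, b, c, d, a * b * c * d))  # E aliased with ABCD
      design = np.array(runs)                        # 16 runs, 5 factors

      rng = np.random.default_rng(3)
      # Hypothetical response: factor A dominates, E matters a little.
      y = 3.0 * design[:, 0] + 1.0 * design[:, 4] + rng.standard_normal(16)

      effects = design.T @ y / (len(design) / 2)     # mean(high) - mean(low)
      for name, e in zip("ABCDE", effects):
          print(f"effect {name}: {e:+.2f}")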

  2. Perceiving environmental properties from motion information: Minimal conditions

    NASA Technical Reports Server (NTRS)

    Proffitt, Dennis R.; Kaiser, Mary K.

    1989-01-01

    The status of motion as a minimal information source for perceiving the environmental properties of surface segregation, three-dimensional (3-D) form, displacement, and dynamics is discussed. The selection of these particular properties was motivated by a desire to present research on perceiving properties that span the range of dimensional complexity.

  3. Modeling of substrate and inhibitor binding to phospholipase A2.

    PubMed

    Sessions, R B; Dauber-Osguthorpe, P; Campbell, M M; Osguthorpe, D J

    1992-09-01

    Molecular graphics and molecular mechanics techniques have been used to study the mode of ligand binding and mechanism of action of the enzyme phospholipase A2. A substrate-enzyme complex was constructed based on the crystal structure of the apoenzyme. The complex was minimized to relieve initial strain, and the structural and energetic features of the resultant complex were analyzed in detail at the molecular and residue level. The minimized complex was then used as a basis for examining the action of the enzyme on modified substrates, the binding of inhibitors to the enzyme, and possible reaction intermediate complexes. The model is compatible with the suggested mechanism of hydrolysis and with experimental data on stereoselectivity, efficiency of hydrolysis of modified substrates, and inhibitor potency. In conclusion, the model can be used as a tool in evaluating new ligands as possible substrates and in the rational design of inhibitors for the therapeutic treatment of diseases such as rheumatoid arthritis, atherosclerosis, and asthma.

  4. Free Energy and Virtual Reality in Neuroscience and Psychoanalysis: A Complexity Theory of Dreaming and Mental Disorder.

    PubMed

    Hopkins, Jim

    2016-01-01

    The main concepts of the free energy (FE) neuroscience developed by Karl Friston and colleagues parallel those of Freud's Project for a Scientific Psychology. In Hobson et al. (2014) these include an innate virtual reality generator that produces the fictive prior beliefs that Freud described as the primary process. This enables Friston's account to encompass a unified treatment (a complexity theory) of the role of virtual reality in both dreaming and mental disorder. In both accounts the brain operates to minimize FE aroused by sensory impingements (including interoceptive impingements that report compliance with biological imperatives) and constructs a representation/model of the causes of impingement that enables this minimization. In Friston's account, (variational) FE equals complexity minus accuracy, and is minimized by increasing accuracy and decreasing complexity. Roughly, the brain (or model) increases accuracy together with complexity in waking. This is mediated by consciousness-creating active inference, by which it explains sensory impingements in terms of perceptual experiences of their causes. In sleep it reduces complexity by processes that include both synaptic pruning and consciousness/virtual reality/dreaming in REM. The consciousness-creating active inference that effects complexity-reduction in REM dreaming must operate on FE-arousing data distinct from sensory impingement. The most relevant source is remembered arousals of emotion, both recent and remote, as processed in SWS and REM on "active systems" accounts of memory consolidation/reconsolidation. Freud describes these remembered arousals as condensed in the dreamwork for use in the conscious contents of dreams, and similar condensation can be seen in symptoms. Complexity partly reflects emotional conflict and trauma. This indicates that dreams and symptoms are both produced to reduce complexity in the form of potentially adverse (traumatic or conflicting) arousals of amygdala-related emotions. Mental disorder is thus caused by computational complexity together with mechanisms like synaptic pruning that have evolved for complexity-reduction; and important features of disorder can be understood in these terms. Details of the consilience among Freudian, systems consolidation, and complexity-reduction accounts appear clearly in the analysis of a single fragment of a dream, indicating also how complexity reduction proceeds by a process resembling Bayesian model selection.
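
    The "complexity minus accuracy" identity invoked above has a standard form in the variational literature; a hedged rendering follows (the symbols q for the recognition density, ϑ for hidden causes, and o for observations follow common usage and are an assumption here, not quoted from the paper):

    ```latex
    \underbrace{F}_{\text{free energy}}
      = \underbrace{D_{\mathrm{KL}}\!\left[\,q(\vartheta)\;\|\;p(\vartheta)\,\right]}_{\text{complexity}}
      - \underbrace{\mathbb{E}_{q(\vartheta)}\!\left[\ln p(o\mid\vartheta)\right]}_{\text{accuracy}}
    ```

    Minimizing F thus rewards explanations that fit the observations (accuracy) while penalizing departures from prior beliefs (complexity), the trade-off the abstract maps onto waking and sleep.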

  5. Visualization of Topology through Simulation

    NASA Astrophysics Data System (ADS)

    Mulderig, Andrew; Beaucage, Gregory; Vogtt, Karsten; Jiang, Hanqiu

    Complex structures can be decomposed into their minimal topological description coupled with complications of tortuosity. We have found that a stick-figure representation can account for the topological content of any structure, and by coupling it with scaling measures of tortuosity we can reconstruct an object. This deconstruction is native to static small-angle scattering measurements, from which we can obtain quantitative measures of the tortuous structure and the minimal topological structure. For example, a crumpled sheet of paper is composed of a minimal sheet structure and parameters reflecting the extent of crumpling. This quantification yields information that can be used to calculate the hydrodynamic radius, radius of gyration, structural conductive pathway, modulus, and other properties of complex structures. The approach is general and has been applied to a wide range of nanostructures, from crumpled graphene to branched polymers and unfolded proteins and RNA. In this poster we demonstrate how simple structural simulations can be used to reconstruct from these parameters a 3-D representation of the complex structure through a heuristic approach. Several examples will be given, including nano-fractal aggregates.

  6. Spontaneous emergence of milling (vortex state) in a Vicsek-like model

    NASA Astrophysics Data System (ADS)

    Costanzo, A.; Hemelrijk, C. K.

    2018-04-01

    Collective motion is of interest to laymen and scientists in different fields. In groups of animals, many patterns of collective motion arise such as polarized schools and mills (i.e. circular motion). Collective motion can be generated in computational models of different degrees of complexity. In these models, moving individuals coordinate with others nearby. In the more complex models, individuals attract each other, aligning their headings, and avoiding collisions. Simpler models may include only one or two of these types of interactions. The collective pattern that interests us here is milling, which is observed in many animal species. It has been reproduced in the more complex models, but not in simpler models that are based only on alignment, such as the well-known Vicsek model. Our aim is to provide insight in the minimal conditions required for milling by making minimal modifications to the Vicsek model. Our results show that milling occurs when both the field of view and the maximal angular velocity are decreased. Remarkably, apart from milling, our minimal model also exhibits many of the other patterns of collective motion observed in animal groups.
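
    A minimal sketch of the modified model described above: alignment-only Vicsek dynamics with a restricted field of view and a capped turning rate. All parameter values are illustrative assumptions rather than those reported by Costanzo and Hemelrijk, so milling is not guaranteed at these exact settings.

    ```python
    import numpy as np

    # Vicsek-like model with the two modifications the abstract identifies as
    # enabling milling: a restricted field of view and a limited angular velocity.
    N, L, v0, R = 200, 10.0, 0.05, 1.0   # particles, box size, speed, interaction radius
    fov = np.pi / 2                      # half-angle of the (restricted) field of view
    w_max = 0.1                          # maximum turn per step (limited angular velocity)
    eta = 0.1                            # noise amplitude

    rng = np.random.default_rng(0)
    pos = rng.uniform(0, L, (N, 2))
    theta = rng.uniform(-np.pi, np.pi, N)

    def wrap(a):
        """Wrap angles to (-pi, pi]."""
        return (a + np.pi) % (2 * np.pi) - np.pi

    for _ in range(2000):
        d = pos[None, :, :] - pos[:, None, :]          # displacement i -> j
        d -= L * np.round(d / L)                       # periodic boundaries
        dist = np.hypot(d[..., 0], d[..., 1])
        bearing = np.arctan2(d[..., 1], d[..., 0])
        in_fov = np.abs(wrap(bearing - theta[:, None])) <= fov
        np.fill_diagonal(in_fov, True)                 # each particle "sees" itself
        nbr = (dist < R) & in_fov
        mean_dir = np.arctan2((nbr * np.sin(theta)).sum(1),
                              (nbr * np.cos(theta)).sum(1))
        turn = np.clip(wrap(mean_dir - theta), -w_max, w_max)   # capped turning rate
        theta = wrap(theta + turn + eta * rng.uniform(-0.5, 0.5, N))
        pos = (pos + v0 * np.column_stack((np.cos(theta), np.sin(theta)))) % L
    ```

    With fov = np.pi (full view) and no clipping this reduces to the standard Vicsek update, which is a quick way to check that the two restrictions are what changes the behaviour.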

  7. Determination of complex formation constants by phase sensitive alternating current polarography: Cadmium-polymethacrylic acid and cadmium-polygalacturonic acid.

    PubMed

    Garrigosa, Anna Maria; Gusmão, Rui; Ariño, Cristina; Díaz-Cruz, José Manuel; Esteban, Miquel

    2007-10-15

    The use of phase-sensitive alternating current polarography (ACP) for the evaluation of complex formation constants of systems where electrodic adsorption is present is proposed. Applying the technique requires the prior selection of the phase angle at which the contribution of the capacitive current is minimized. This is done using Multivariate Curve Resolution by Alternating Least Squares (MCR-ALS) in the analysis of ACP measurements at different phase angles. The method is checked by studying the complexation of Cd by polymethacrylic (PMA) and polygalacturonic (PGA) acids; the optimal phase angles were ca. -10 degrees for the Cd-PMA and ca. -15 degrees for the Cd-PGA systems. The soundness of phase-sensitive ACP has been demonstrated by comparing the determined complex formation constants with those obtained by reverse pulse polarography, a technique that minimizes electrode adsorption effects on the measured currents.
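
    Since MCR-ALS carries the analytical weight here, a minimal sketch of its alternating step may be useful; the random initialization, plain nonnegativity clipping, and toy two-component data are assumptions for illustration (practical MCR-ALS uses constrained solvers and informed initial estimates).

    ```python
    import numpy as np

    def mcr_als(D, n_components, n_iter=200, seed=0):
        """Alternately solve D ~ C @ S.T for concentrations C and spectra S."""
        rng = np.random.default_rng(seed)
        S = np.abs(rng.standard_normal((D.shape[1], n_components)))
        for _ in range(n_iter):
            C = np.clip(D @ S @ np.linalg.pinv(S.T @ S), 0, None)     # concentration profiles
            S = np.clip(D.T @ C @ np.linalg.pinv(C.T @ C), 0, None)   # pure "spectra"
        return C, S

    # toy data: two overlapping components evolving in time
    t = np.linspace(0, 1, 40)
    C_true = np.column_stack((np.exp(-3 * t), 1 - np.exp(-3 * t)))
    S_true = np.column_stack((np.exp(-((np.arange(60) - 20) / 6.0) ** 2),
                              np.exp(-((np.arange(60) - 35) / 6.0) ** 2)))
    C_est, S_est = mcr_als(C_true @ S_true.T, 2)
    ```

    In the polarographic application the rows of D would be measurements at different phase angles, so the resolved profiles separate the faradaic contribution from the capacitive one.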

  8. An accurate evaluation of the performance of asynchronous DS-CDMA systems with zero-correlation-zone coding in Rayleigh fading

    NASA Astrophysics Data System (ADS)

    Walker, Ernest; Chen, Xinjia; Cooper, Reginald L.

    2010-04-01

    An arbitrarily accurate approach is used to determine the bit-error rate (BER) performance for generalized asynchronous DS-CDMA systems in Gaussian noise with Rayleigh fading. In this paper, and the sequel, new theoretical work is contributed which substantially enhances existing performance analysis formulations. Major contributions include: substantial computational complexity reduction, including a priori BER accuracy bounding; and an analytical approach that facilitates performance evaluation for systems with arbitrary spectral spreading distributions and non-uniform transmission delay distributions. Using prior results, augmented by these enhancements, a generalized DS-CDMA system model is constructed and used to evaluate the BER performance in a variety of scenarios. In this paper, the generalized system modeling is used to evaluate the performance of both Walsh-Hadamard (WH) and Walsh-Hadamard-seeded zero-correlation-zone (WH-ZCZ) coding. The selection of these codes was informed by the observation that WH codes contain N spectral spreading values (0 to N - 1), one for each code sequence, while WH-ZCZ codes contain only two spectral spreading values (N/2 - 1, N/2), where N is the sequence length in chips. Since these codes span the spectral spreading range for DS-CDMA coding, by invoking an induction argument, the generalization of the system model is sufficiently supported. The results in this paper, and the sequel, support the claim that an arbitrarily accurate performance analysis for DS-CDMA systems can be evaluated over the full range of binary coding with minimal computational complexity.
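
    As a sanity anchor for such analyses, here is a minimal sketch of the single-user BPSK baseline over flat Rayleigh fading, comparing the classic closed form with a Monte Carlo estimate; this baseline, the SNR value, and the unit-energy fading model are assumptions for illustration and are far simpler than the paper's asynchronous multi-user formulation.

    ```python
    import numpy as np

    def ber_rayleigh_bpsk(snr_db):
        """Average BER of coherent BPSK over flat Rayleigh fading (closed form)."""
        g = 10 ** (snr_db / 10)                     # average SNR per bit
        return 0.5 * (1 - np.sqrt(g / (1 + g)))

    rng = np.random.default_rng(0)
    snr_db, n = 10.0, 1_000_000
    g = 10 ** (snr_db / 10)
    h = rng.rayleigh(scale=np.sqrt(0.5), size=n)    # fading amplitude, E[h^2] = 1
    noise = rng.standard_normal(n) / np.sqrt(2 * g) # AWGN scaled so BER_awgn = Q(sqrt(2g))
    errors = (h + noise) < 0                        # transmit +1, decide on sign
    print(ber_rayleigh_bpsk(snr_db), errors.mean()) # the two should agree closely
    ```

    The multi-user DS-CDMA analyses the abstract describes effectively replace this single fading term with the aggregate of spreading-code cross-correlations and asynchronous delays, which is where the computational complexity the authors reduce comes from.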

  9. Design of neural network model-based controller in a fed-batch microbial electrolysis cell reactor for bio-hydrogen gas production

    NASA Astrophysics Data System (ADS)

    Azwar; Hussain, M. A.; Abdul-Wahab, A. K.; Zanil, M. F.; Mukhlishien

    2018-03-01

    One major challenge in the bio-hydrogen production process using a microbial electrolysis cell (MEC) is that the system is nonlinear and highly complex, mainly due to the presence of microbial interactions and highly complex phenomena within it. This complexity makes the MEC system difficult to operate and control under optimal conditions. Thus, precise control is required for the MEC reactor, so that the amount of current required to produce hydrogen gas can be controlled according to the composition of the substrate in the reactor. In this work, two schemes for controlling the current and voltage of the MEC were evaluated: a PID controller and an inverse neural network (NN) controller. The comparative study was carried out under optimal conditions for the production of bio-hydrogen gas, wherein the controller output is based on the correlation of optimal current and voltage to the MEC. Various simulation tests involving multiple set-point changes and disturbance rejection were evaluated, and the performances of both controllers are discussed. The neural-network-based controller gives fast response times and less overshoot, while offset effects are minimal. In conclusion, the inverse neural network (NN) controller provides better control performance for the MEC system than the PID controller.
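
    A minimal sketch of the inverse-model idea behind such a controller: learn u = g(y) from plant data, then feed the network the set-point to obtain the required input. The toy static plant, the tiny network, and all training settings below are assumptions for illustration; the paper's MEC model is dynamic and far richer.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    f = lambda u: np.tanh(2.0 * u) + 0.1 * u       # toy monotonic plant (assumed)

    u_train = rng.uniform(-2, 2, 500)
    y_train = f(u_train)

    # tiny one-hidden-layer net trained by gradient descent on inverse (y -> u) pairs
    W1 = rng.standard_normal((1, 16)) * 0.5; b1 = np.zeros(16)
    W2 = rng.standard_normal((16, 1)) * 0.5; b2 = np.zeros(1)
    lr = 0.05
    for _ in range(3000):
        h = np.tanh(y_train[:, None] @ W1 + b1)    # forward pass
        u_hat = (h @ W2 + b2).ravel()
        err = u_hat - u_train                      # inverse-model error
        gW2 = h.T @ err[:, None] / len(err); gb2 = err.mean(keepdims=True)
        gh = err[:, None] @ W2.T * (1 - h**2)      # backprop through tanh
        gW1 = y_train[None, :] @ gh / len(err); gb1 = gh.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

    def controller(y_ref):
        """Feedforward control: ask the inverse model for the needed input."""
        h = np.tanh(np.atleast_2d(y_ref).T @ W1 + b1)
        return (h @ W2 + b2).ravel()

    print(f(controller(0.5)))                      # plant output should be near 0.5
    ```

    In practice such a feedforward inverse is usually paired with a feedback term to handle disturbances, which is what the set-point-change and disturbance-rejection tests in the abstract probe.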

  10. Minimum time search in uncertain dynamic domains with complex sensorial platforms.

    PubMed

    Lanillos, Pablo; Besada-Portas, Eva; Lopez-Orozco, Jose Antonio; de la Cruz, Jesus Manuel

    2014-08-04

    The minimum time search in uncertain domains is a searching task, which appears in real-world problems such as natural disasters and sea rescue operations, where a target has to be found, as soon as possible, by a set of sensor-equipped searchers. The automation of this task, where the time to detect the target is critical, can be achieved by new probabilistic techniques that directly minimize the Expected Time (ET) to detect a dynamic target using the observation probability models and actual observations collected by the sensors on board the searchers. The selected technique, described in algorithmic form in this paper for completeness, has previously been only partially tested with an ideal binary detection model, in spite of being designed to deal with complex non-linear/non-differentiable sensorial models. This paper covers the gap, testing its performance and applicability over different searching tasks with searchers equipped with different complex sensors. The sensorial models under test vary from stepped detection probabilities to continuous/discontinuous, differentiable/non-differentiable detection probabilities dependent on distance, orientation, and structured maps. The analysis of the simulated results of several static and dynamic scenarios performed in this paper validates the applicability of the technique with different types of sensor models.
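
    A toy illustration of the quantity being minimized: on a grid, ET is the sum over t of t times the probability that the first detection happens at t. The sketch below uses a myopic policy (maximize immediate expected detection probability) as a crude stand-in for the paper's direct ET optimization; the grid size, the distance-decay sensor, and the static target are all illustrative assumptions.

    ```python
    import numpy as np

    size = 15
    belief = np.full((size, size), 1.0 / size**2)       # uniform prior over target cell
    pos = np.array([0, 0])                              # searcher position
    p_detect = lambda d: np.exp(-0.5 * d**2)            # toy sensor: decays with distance

    ii, jj = np.meshgrid(np.arange(size), np.arange(size), indexing="ij")
    rng = np.random.default_rng(3)
    target = rng.integers(0, size, 2)                   # hidden true target cell

    for t in range(1, 500):
        d = np.hypot(ii - pos[0], jj - pos[1])
        pd = p_detect(d)
        if rng.random() < pd[tuple(target)]:            # did the sensor fire?
            print("detected at t =", t)
            break
        belief *= (1 - pd)                              # Bayes update on "no detection"
        belief /= belief.sum()
        moves = [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]
        def gain(m):                                    # expected detection prob. after move m
            q = np.clip(pos + m, 0, size - 1)
            return (belief * p_detect(np.hypot(ii - q[0], jj - q[1]))).sum()
        pos = np.clip(pos + max(moves, key=gain), 0, size - 1)
    ```

    The paper's technique improves on this myopic proxy by optimizing the expected time itself over a horizon, which matters precisely when the sensor model is stepped or non-differentiable.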

  11. Minimum Time Search in Uncertain Dynamic Domains with Complex Sensorial Platforms

    PubMed Central

    Lanillos, Pablo; Besada-Portas, Eva; Lopez-Orozco, Jose Antonio; de la Cruz, Jesus Manuel

    2014-01-01

    The minimum time search in uncertain domains is a searching task, which appears in real-world problems such as natural disasters and sea rescue operations, where a target has to be found, as soon as possible, by a set of sensor-equipped searchers. The automation of this task, where the time to detect the target is critical, can be achieved by new probabilistic techniques that directly minimize the Expected Time (ET) to detect a dynamic target using the observation probability models and actual observations collected by the sensors on board the searchers. The selected technique, described in algorithmic form in this paper for completeness, has previously been only partially tested with an ideal binary detection model, in spite of being designed to deal with complex non-linear/non-differentiable sensorial models. This paper covers the gap, testing its performance and applicability over different searching tasks with searchers equipped with different complex sensors. The sensorial models under test vary from stepped detection probabilities to continuous/discontinuous, differentiable/non-differentiable detection probabilities dependent on distance, orientation, and structured maps. The analysis of the simulated results of several static and dynamic scenarios performed in this paper validates the applicability of the technique with different types of sensor models. PMID:25093345

  12. Dynamic remapping of parallel computations with varying resource demands

    NASA Technical Reports Server (NTRS)

    Nicol, D. M.; Saltz, J. H.

    1986-01-01

    A large class of computational problems is characterized by frequent synchronization and computational requirements which change as a function of time. When such a problem must be solved on a message-passing multiprocessor machine, the combination of these characteristics leads to system performance which decreases in time. Performance can be improved with periodic redistribution of computational load; however, redistribution can exact a sometimes large delay cost. We study the issue of deciding when to invoke a global load remapping mechanism. Such a decision policy must effectively weigh the costs of remapping against the performance benefits. We treat this problem by constructing two analytic models which exhibit stochastically decreasing performance. One model is quite tractable; we are able to describe the optimal remapping algorithm and the optimal decision policy governing when to invoke that algorithm. However, computational complexity prohibits the use of the optimal remapping decision policy. We then study the performance of a general remapping policy on both analytic models. This policy attempts to minimize a statistic W(n) which measures the system degradation (including the cost of remapping) per computation step over a period of n steps. We show that, as a function of time, the expected value of W(n) has at most one minimum, and that when this minimum exists it defines the optimal fixed-interval remapping policy. Our decision policy appeals to this result by remapping when it estimates that W(n) is minimized. Our performance data suggest that this policy effectively finds the natural frequency of remapping. We also use the analytic models to express the relationship between performance and remapping cost, number of processors, and the computation's stochastic activity.
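
    The shape of W(n) is easy to see in a toy version: with a one-time remap cost C and an imbalance penalty that grows linearly after each remap (both made-up numbers here, not the paper's stochastic models), W(n) is the per-step cost of remapping every n steps and has a single interior minimum.

    ```python
    import numpy as np

    C = 50.0                                 # one-time cost of remapping (step-time units)
    drift = 0.4                              # per-step growth of load imbalance after a remap
    n = np.arange(1, 200)
    degradation = drift * n * (n + 1) / 2    # cumulative imbalance cost over n steps
    W = (C + degradation) / n                # degradation per step, incl. remap cost
    n_star = n[np.argmin(W)]
    print(n_star, W.min())                   # remap about every n* steps; here n* ~ sqrt(2C/drift)
    ```

    This mirrors the result quoted in the abstract: W(n) has at most one minimum, and remapping whenever the running estimate of W(n) bottoms out recovers the optimal fixed interval.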

  13. Electromagnetic interference-aware transmission scheduling and power control for dynamic wireless access in hospital environments.

    PubMed

    Phunchongharn, Phond; Hossain, Ekram; Camorlinga, Sergio

    2011-11-01

    We study the multiple access problem for e-Health applications (referred to as secondary users) coexisting with medical devices (referred to as primary or protected users) in a hospital environment. In particular, we focus on transmission scheduling and power control of secondary users in multiple spatial reuse time-division multiple access (STDMA) networks. The objective is to maximize the spectrum utilization of secondary users and minimize their power consumption, subject to the electromagnetic interference (EMI) constraints for active and passive medical devices and a minimum throughput guarantee for secondary users. The multiple access problem is formulated as a dual-objective optimization problem which is shown to be NP-complete. We propose a joint scheduling and power control algorithm based on a greedy approach to solve the problem with much lower computational complexity. To this end, an enhanced greedy algorithm is proposed to improve the performance of the greedy algorithm by finding the optimal sequence of secondary users for scheduling. Using extensive simulations, the tradeoff in performance in terms of spectrum utilization, energy consumption, and computational complexity is evaluated for both algorithms.
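
    A minimal sketch of the flavor of such a greedy pass: pack secondary users into slots at the minimum power meeting their SNR target while keeping the aggregate interference at a protected device under an EMI budget. All gains, thresholds, and the least-interference-first ordering are illustrative assumptions; the paper's algorithm handles full SINR, spatial reuse, and an optimized user sequence.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_users, noise = 12, 1e-9
    snr_min, emi_max, p_max = 10.0, 1e-4, 0.1

    g_link = rng.uniform(1e-7, 1e-5, n_users)      # gain: user -> own receiver
    g_emi = rng.uniform(1e-4, 1e-2, n_users)       # gain: user -> medical device

    p_req = snr_min * noise / g_link               # minimum power meeting the SNR target
    slots, order = [], np.argsort(p_req * g_emi)   # greedy: least-interfering users first

    for u in order:
        emi_u = p_req[u] * g_emi[u]                # EMI contribution at minimum power
        if p_req[u] > p_max or emi_u > emi_max:
            continue                               # infeasible even alone: skip user
        for slot in slots:                         # first slot with EMI headroom
            if slot["emi"] + emi_u <= emi_max:
                slot["users"].append(u); slot["emi"] += emi_u
                break
        else:                                      # no slot fits: open a new one
            slots.append({"users": [u], "emi": emi_u})

    print([s["users"] for s in slots])             # schedule; len(slots) = frame length
    ```

    Fewer slots for the same user set means higher spectrum utilization, which is why the ordering of users (the part the enhanced greedy algorithm optimizes) matters.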

  14. NASA ERA Integrated CFD for Wind Tunnel Testing of Hybrid Wing-Body Configuration

    NASA Technical Reports Server (NTRS)

    Garcia, Joseph A.; Melton, John E.; Schuh, Michael; James, Kevin D.; Long, Kurtis R.; Vicroy, Dan D.; Deere, Karen A.; Luckring, James M.; Carter, Melissa B.; Flamm, Jeffrey D.

    2016-01-01

    The NASA Environmentally Responsible Aviation (ERA) Project explored enabling technologies to reduce the impact of aviation on the environment. One project research challenge area was the study of advanced airframe and engine integration concepts to reduce community noise and fuel burn. To address this challenge, complex wind tunnel experiments at both the NASA Langley Research Center's (LaRC) 14'x22' and the Ames Research Center's 40'x80' low-speed wind tunnel facilities were conducted on a BOEING Hybrid Wing Body (HWB) configuration. These wind tunnel tests entailed various entries to evaluate the propulsion-airframe interference effects, including aerodynamic performance and aeroacoustics. In order to assist these tests in producing high quality data with minimal hardware interference, extensive Computational Fluid Dynamic (CFD) simulations were performed for everything from sting design and placement for both the wing body and powered ejector nacelle systems to the placement of aeroacoustic arrays to minimize their impact on vehicle aerodynamics. This paper presents a high-level summary of the CFD simulations that NASA performed in support of the model integration hardware design as well as the development of some CFD simulation guidelines based on post-test aerodynamic data. In addition, the paper includes details on how multiple CFD codes (OVERFLOW, STAR-CCM+, USM3D, and FUN3D) were efficiently used to provide timely insight into the wind tunnel experimental setup and execution.

  15. NASA ERA Integrated CFD for Wind Tunnel Testing of Hybrid Wing-Body Configuration

    NASA Technical Reports Server (NTRS)

    Garcia, Joseph A.; Melton, John E.; Schuh, Michael; James, Kevin D.; Long, Kurt R.; Vicroy, Dan D.; Deere, Karen A.; Luckring, James M.; Carter, Melissa B.; Flamm, Jeffrey D.

    2016-01-01

    NASA's Environmentally Responsible Aviation (ERA) Project explores enabling technologies to reduce aviation's impact on the environment. One research challenge area for the project has been to study advanced airframe and engine integration concepts to reduce community noise and fuel burn. In order to achieve this, complex wind tunnel experiments at both the NASA Langley Research Center's (LaRC) 14'x22' and the Ames Research Center's 40'x80' low-speed wind tunnel facilities were conducted on a Boeing Hybrid Wing Body (HWB) configuration. These wind tunnel tests entailed various entries to evaluate the propulsion airframe interference effects, including aerodynamic performance and aeroacoustics. In order to assist these tests in producing high quality data with minimal hardware interference, extensive Computational Fluid Dynamic (CFD) simulations were performed for everything from sting design and placement for both the wing body and powered ejector nacelle systems to the placement of aeroacoustic arrays to minimize their impact on the vehicle's aerodynamics. This paper will provide a high-level summary of the CFD simulations that NASA performed in support of the model integration hardware design as well as some simulation guideline development based on post-test aerodynamic data. In addition, the paper includes details on how multiple CFD codes (OVERFLOW, STAR-CCM+, USM3D, and FUN3D) were efficiently used to provide timely insight into the wind tunnel experimental setup and execution.

  16. Electromagnetic Design and Performance of a Conical Microwave Blackbody Target for Radiometer Calibration

    NASA Astrophysics Data System (ADS)

    Houtz, Derek A.; Emery, William; Gu, Dazhen; Jacob, Karl; Murk, Axel; Walker, David K.; Wylde, Richard J.

    2017-08-01

    A conical cavity has been designed and fabricated for use as a broadband passive microwave calibration source, or blackbody, at the National Institute of Standards and Technology. The blackbody will be used as a national primary standard for brightness temperature and will allow for the prelaunch calibration of spaceborne radiometers and calibration of ground-based systems to provide traceability among radiometric data. The conical geometry provides performance independent of polarization and minimizes reflections and standing waves, giving a high microwave emissivity. The conical blackbody has advantages over typical pyramidal array geometries, including reduced temperature gradients and excellent broadband electromagnetic performance over more than a frequency decade. The blackbody is designed for use between 18 and 230 GHz, at temperatures between 80 and 350 K, and is vacuum compatible. To approximate theoretical blackbody behavior, the design maximizes emissivity and thus minimizes reflectivity. A newly developed microwave absorber is demonstrated that uses a cryogenically compatible, thermally conductive two-part epoxy with magnetic carbonyl iron (CBI) powder loading. We measured the complex permittivity and permeability for different CBI-loading percentages; the conical absorber was then designed and optimized with geometric optics and finite-element modeling, and finally the reflectivity of the fabricated structure was measured. We demonstrated normal-incidence reflectivity considerably below -40 dB at all relevant remote sensing frequencies.

  17. Miniature high-performance infrared spectrometer for space applications

    NASA Astrophysics Data System (ADS)

    Kruzelecky, Roman V.; Haddad, Emile; Wong, Brian; Lafrance, Denis; Jamroz, Wes; Ghosh, Asoke K.; Zheng, Wanping; Phong, Linh

    2004-06-01

    Infrared spectroscopy probes the characteristic vibrational and rotational modes of chemical bonds in molecules to provide information about both the chemical composition and the bonding configuration of a sample. The significant advantage of the infrared spectral technique is that it can be used with minimal consumables to simultaneously detect a large variety of chemical and biochemical species with high chemical specificity. To date, relatively large Fourier transform (FT-IR) spectrometers employing variations of the Michelson interferometer have been successfully employed in space for various IR spectroscopy applications. However, FT-IR systems are mechanically complex, bulky (> 15 kg), and require considerable processing. This paper discusses the use of advanced integrated optics and smart optical coding techniques to significantly extend the performance of miniature IR spectrometers by several orders of magnitude in sensitivity. This can provide the next generation of compact, high-performance IR spectrometers with monolithically integrated optical systems for robust optical alignment. The entire module can weigh under 3 kg to minimize the mass penalty for space applications. Miniaturized IR spectrometers are versatile and very convenient for small and micro satellite based missions. They can be dedicated to the monitoring of CO2 in an Earth observation mission, to Mars exobiology exploration, and to vital life support in manned space systems, such as monitoring the cabin air quality and the quality of the recycled water supply.

  18. Miniature high-performance infrared spectrometer for space applications

    NASA Astrophysics Data System (ADS)

    Kruzelecky, Roman V.; Haddad, Emile; Wong, Brian; Lafrance, Denis; Jamroz, Wes; Ghosh, Asoke K.; Zheng, Wanping; Phong, Linh

    2017-11-01

    Infrared spectroscopy probes the characteristic vibrational and rotational modes of chemical bonds in molecules to provide information about both the chemical composition and the bonding configuration of a sample. The significant advantage of the infrared spectral technique is that it can be used with minimal consumables to simultaneously detect a large variety of chemical and biochemical species with high chemical specificity. To date, relatively large Fourier transform (FT-IR) spectrometers employing variations of the Michelson interferometer have been successfully employed in space for various IR spectroscopy applications. However, FT-IR systems are mechanically complex, bulky (> 15 kg), and require considerable processing. This paper discusses the use of advanced integrated optics and smart optical coding techniques to significantly extend the performance of miniature IR spectrometers by several orders of magnitude in sensitivity. This can provide the next generation of compact, high-performance IR spectrometers with monolithically integrated optical systems for robust optical alignment. The entire module can weigh under 3 kg to minimize the mass penalty for space applications. Miniaturized IR spectrometers are versatile and very convenient for small and micro satellite based missions. They can be dedicated to the monitoring of CO2 in an Earth observation mission, to Mars exobiology exploration, and to vital life support in manned space systems, such as monitoring the cabin air quality and the quality of the recycled water supply.

  19. Tensorial Minkowski functionals of triply periodic minimal surfaces

    PubMed Central

    Mickel, Walter; Schröder-Turk, Gerd E.; Mecke, Klaus

    2012-01-01

    A fundamental understanding of the formation and properties of a complex spatial structure relies on robust quantitative tools to characterize morphology. A systematic approach to the characterization of average properties of anisotropic complex interfacial geometries is provided by integral geometry which furnishes a family of morphological descriptors known as tensorial Minkowski functionals. These functionals are curvature-weighted integrals of tensor products of position vectors and surface normal vectors over the interfacial surface. We here demonstrate their use by application to non-cubic triply periodic minimal surface model geometries, whose Weierstrass parametrizations allow for accurate numerical computation of the Minkowski tensors. PMID:24098847
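
    For reference, the descriptors named above are commonly written as curvature-weighted surface integrals of symmetric tensor products; a hedged rendering of one standard form follows (normalization and index conventions vary across the integral-geometry literature and are an assumption here):

    ```latex
    W_\nu^{r,s}(K) \;\propto\; \int_{\partial K} G_\nu(\mathbf{x})\,
      \underbrace{\mathbf{x}\otimes\cdots\otimes\mathbf{x}}_{r\ \text{factors}}\,\otimes\,
      \underbrace{\mathbf{n}(\mathbf{x})\otimes\cdots\otimes\mathbf{n}(\mathbf{x})}_{s\ \text{factors}}\,
      \mathrm{d}A,
    \qquad G_1 = 1,\quad G_2 = H,\quad G_3 = G_{\mathrm{Gauss}}
    ```

    Here H and G_Gauss are the mean and Gaussian curvatures of the interface and n(x) is its unit normal; the rank-2 cases (r + s = 2) give matrix-valued descriptors whose eigenvalue ratios quantify the anisotropy of the structure.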

  20. Chlorhexidine: beta-cyclodextrin inhibits yeast growth by extraction of ergosterol.

    PubMed

    Teixeira, K I R; Araújo, P V; Sinisterra, R D; Cortés, M E

    2012-04-01

    Chlorhexidine (Cx) augmented with beta-cyclodextrin (β-cd) inclusion compounds, termed Cx:β-cd complexes, have been developed for use as antiseptic agents. The aim of this study was to examine the interactions of Cx:β-cd complexes, prepared at different molecular ratios, with sterol and yeast membranes. The Minimal Inhibitory Concentration (MIC) against the yeast Candida albicans (C.a.) was determined for each complex; the MICs were found to range from 0.5 to 2 μg/mL. To confirm the MIC data, quantitative analysis of viable cells was performed using trypan blue staining. Mechanistic characterization of the interactions that the Cx:β-cd complexes have with the yeast membrane and assessment of membrane morphology following exposure to Cx:β-cd complexes were performed using Sterol Quantification Method analysis (SQM) and scanning electron microscopy (SEM). SQM revealed that sterol extraction increased with increasing β-cd concentrations (1.71×10^3, 1.4×10^3, 3.45×10^3, and 3.74×10^3 CFU for 1:1, 1:2, 1:3, and 1:4, respectively), likely as a consequence of membrane ergosterol solubilization. SEM images demonstrated that cell membrane damage is a visible and significant mechanism that contributes to the antimicrobial effects of Cx:β-cd complexes. Cell disorganization increased significantly as the proportion of β-cyclodextrin present in the complex increased. Cells exposed to complexes with 1:3 and 1:4 molar ratios of Cx:β-cd were observed to have large aggregates mixed with yeast remains, representing more membrane disruption than that observed in cells treated with Cx alone. In conclusion, nanoaggregates of Cx:β-cd complexes block yeast growth via ergosterol extraction, permeabilizing the membrane by creating cluster-like structures within the cell membrane, possibly due to high amounts of hydrogen bonding.

  1. Evolution of an RNP assembly system: A minimal SMN complex facilitates formation of UsnRNPs in Drosophila melanogaster

    PubMed Central

    Kroiss, Matthias; Schultz, Jörg; Wiesner, Julia; Chari, Ashwin; Sickmann, Albert; Fischer, Utz

    2008-01-01

    In vertebrates, assembly of spliceosomal uridine-rich small nuclear ribonucleoproteins (UsnRNPs) is mediated by the SMN complex, a macromolecular entity composed of the proteins SMN and Gemins 2–8. Here we have studied the evolution of this machinery using complete genome assemblies of multiple model organisms. The SMN complex has gained complexity in evolution by a blockwise addition of Gemins onto an ancestral core complex composed of SMN and Gemin2. In contrast to this overall evolutionary trend to more complexity in metazoans, orthologs of most Gemins are missing in dipterans. In accordance with these bioinformatic data, a previously undescribed biochemical purification strategy revealed that the dipteran Drosophila melanogaster contains an SMN complex of remarkable simplicity. Surprisingly, this minimal complex not only mediates the assembly reaction in a manner very similar to its vertebrate counterpart, but also prevents misassembly onto nontarget RNAs. Our data suggest that only a minority of Gemins are required for the assembly reaction per se, whereas others may serve additional functions in the context of UsnRNP biogenesis. The evolution of the SMN complex is an interesting example of how the simplification of a biochemical process contributes to genome compaction. PMID:18621711

  2. Displacement based multilevel structural optimization

    NASA Technical Reports Server (NTRS)

    Striz, Alfred G.

    1995-01-01

    Multidisciplinary design optimization (MDO) is expected to play a major role in the competitive transportation industries of tomorrow, i.e., in the design of aircraft and spacecraft, of high speed trains, boats, and automobiles. All of these vehicles require maximum performance at minimum weight to keep fuel consumption low and conserve resources. Here, MDO can deliver mathematically based design tools to create systems with optimum performance subject to the constraints of disciplines such as structures, aerodynamics, controls, etc. Although some applications of MDO are beginning to surface, the key to a widespread use of this technology lies in the improvement of its efficiency. This aspect is investigated here for the MDO subset of structural optimization, i.e., for the weight minimization of a given structure under size, strength, and displacement constraints. Specifically, finite element based multilevel optimization of structures (here, statically indeterminate trusses and beams for proof of concept) is performed. In the system level optimization, the design variables are the coefficients of assumed displacement functions, and the load unbalance resulting from the solution of the stiffness equations is minimized. Constraints are placed on the deflection amplitudes and the weight of the structure. In the subsystems level optimizations, the weight of each element is minimized under the action of stress constraints, with the cross sectional dimensions as design variables. This approach is expected to prove very efficient, especially for complex structures, since the design task is broken down into a large number of small and efficiently handled subtasks, each with only a small number of variables. This partitioning will also allow for the use of parallel computing, first, by sending the system and subsystems level computations to two different processors, ultimately, by performing all subsystems level optimizations in a massively parallel manner on separate processors. It is expected that the subsystems level optimizations can be further improved through the use of controlled growth, a method which reduces an optimization to a more efficient analysis with only a slight degradation in accuracy. The efficiency of all proposed techniques is being evaluated relative to the performance of the standard single level optimization approach where the complete structure is weight minimized under the action of all given constraints by one processor and to the performance of simultaneous analysis and design which combines analysis and optimization into a single step. It is expected that the present approach can be expanded to include additional structural constraints (buckling, free and forced vibration, etc.) or other disciplines (passive and active controls, aerodynamics, etc.) for true MDO.
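
    At the subsystems level, the abstract describes many small, independent weight minimizations under stress constraints; a minimal sketch of one such element-level problem follows (the bar length, load, allowable stress, and material density are made-up numbers, and the paper's formulation couples many of these subproblems to a system-level displacement optimization):

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # one truss element: minimize weight w(A) = rho*L*A subject to stress F/A <= s_allow
    rho, L_bar, F, s_allow = 2700.0, 1.0, 1e4, 100e6   # density, length, load, allowable stress

    weight = lambda A: rho * L_bar * A[0]
    stress_ok = {"type": "ineq", "fun": lambda A: s_allow - F / A[0]}  # must stay >= 0

    res = minimize(weight, x0=[1e-3], bounds=[(1e-6, None)], constraints=stress_ok)
    print(res.x)   # optimal cross-sectional area, analytically F / s_allow = 1e-4 m^2
    ```

    Because each element's subproblem has only its own cross-sectional variables, hundreds of these solves can run independently, which is exactly the parallelism the abstract anticipates.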

  3. Signaling on the continuous spectrum of nonlinear optical fiber.

    PubMed

    Tavakkolnia, Iman; Safari, Majid

    2017-08-07

    This paper studies different signaling techniques on the continuous spectrum (CS) of nonlinear optical fiber defined by the nonlinear Fourier transform. Three different signaling techniques are proposed and analyzed based on the statistics of the noise added to the CS after propagation along the nonlinear optical fiber. The proposed methods are compared in terms of error performance, distance reach, and complexity. Furthermore, the effect of chromatic dispersion on the data rate and noise in the nonlinear spectral domain is investigated. It is demonstrated that, for a given sequence of CS symbols, an optimal bandwidth (or symbol rate) can be determined so that the temporal duration of the propagated signal at the end of the fiber is minimized. In effect, the required guard interval between subsequently transmitted data packets in time is minimized and the effective data rate is significantly enhanced. Moreover, by selecting the proper signaling method and design criteria, a distance reach of 7100 km is achieved by signaling only on the CS at a rate of 9.6 Gbps.

  4. Non-Invasive Breast Cancer Diagnosis through Electrochemical Biosensing at Different Molecular Levels

    PubMed Central

    Campuzano, Susana

    2017-01-01

    The rapid and accurate determination of specific circulating biomarkers at different molecular levels with non- or minimally invasive methods constitutes a major challenge in improving breast cancer outcomes and patients' quality of life. In this field, electrochemical biosensors have proved to be promising alternatives to more complex conventional strategies for performing fast, accurate, and on-site determination of circulating biomarkers at low concentrations in minimally treated body fluids. In this article, after briefly discussing the relevance of and current challenges associated with the determination of breast cancer circulating biomarkers, an updated overview of the electrochemical affinity biosensing strategies that have emerged in the last 5 years for this purpose is provided, highlighting the great potential of these methodologies. After critically discussing the most interesting features of the electrochemical strategies reported so far for the single or multiplexed determination of such biomarkers with demonstrated applicability in liquid biopsy analysis, existing challenges still to be addressed and future directions in this field are pointed out. PMID:28858236

  5. Minimally invasive endoscope-assisted trans-oral excision of huge parapharyngeal space tumors.

    PubMed

    Li, Shang-Yi; Hsu, Ching-Hui; Chen, Mu-Kuan

    2015-04-01

    Parapharyngeal space tumors are rare head and neck neoplasms, and most are benign lesions. Complete excision of these tumors is difficult because of the complexity of the surrounding anatomic structures. The algorithm for excision of these tumors is typically based on the tumor's characteristics; excision is performed via approaches such as the trans-oral route, the trans-cervical route, and even a combination of the trans-parotid route and mandibulotomy. However, each of these approaches is associated with some complications. Endoscope-assisted minimally invasive surgery is being increasingly employed for surgeries in the head and neck regions. It has the advantage of leaving no facial scars, and ensures better patient comfort after the operation. Here, we report the use of endoscope-assisted trans-oral surgery for excision of parapharyngeal space tumors. The technique yields an excellent outcome and should be a feasible, safe, and economic method for these patients. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  6. Preoperative planning with three-dimensional reconstruction of patient's anatomy, rapid prototyping and simulation for endoscopic mitral valve repair.

    PubMed

    Sardari Nia, Peyman; Heuts, Samuel; Daemen, Jean; Luyten, Peter; Vainer, Jindrich; Hoorntje, Jan; Cheriex, Emile; Maessen, Jos

    2017-02-01

    Mitral valve repair performed by an experienced surgeon is superior to mitral valve replacement for degenerative mitral valve disease; however, many surgeons are still deterred from adopting this procedure because of a steep learning curve. Simulation-based training and planning could improve surgical performance and reduce the learning curve. The aim of this study was to develop a patient-specific simulation for mitral valve repair and provide a proof of concept of personalized medicine in a patient prospectively planned for mitral valve surgery. A 65-year-old male with severe symptomatic mitral valve regurgitation was referred to our mitral valve heart team. On the basis of three-dimensional (3D) transoesophageal echocardiography and computed tomography, 3D reconstructions of the patient's anatomy were constructed. By navigating through these reconstructions, the repair options and surgical access were chosen (minimally invasive repair). Using rapid prototyping and negative mould fabrication, we developed a process to cast a patient-specific mitral valve silicone replica for preoperative repair in a high-fidelity simulator. The mitral valve and negative mould were printed in systole to capture the pathology when the valve closes. A patient-specific mitral valve silicone replica was cast and mounted in the simulator. All repair techniques could be performed in the simulator to choose the best repair strategy. As the valve was printed in systole, no special testing other than adjusting the coaptation area was required. Subsequently, the patient was operated on; the mitral valve pathology was validated and the repair was successfully performed as in the simulation. The patient-specific simulation and planning could be applied to surgical training, starting a (minimally invasive) mitral valve repair programme, planning of complex cases and the evaluation of new interventional techniques. Personalized medicine could be a possible pathway towards enhancing reproducibility, patient safety and the effectiveness of a complex surgical procedure. © The Author 2016. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  7. Silk-polypyrrole biocompatible actuator performance under biologically relevant conditions

    NASA Astrophysics Data System (ADS)

    Hagler, Jo'elen; Peterson, Ben; Murphy, Amanda; Leger, Janelle

    Biocompatible actuators that are capable of controlled movement and can function under biologically relevant conditions are of significant interest in biomedical fields. Previously, we have demonstrated that a composite material of silk biopolymer and the conducting polymer polypyrrole (PPy) can be formed into a bilayer device that bends under an applied voltage. Further, these silk-PPy composites can generate forces comparable to human muscle (>0.1 MPa), making them ideal candidates for interfacing with biological tissues. Here, silk-PPy composite films are tested for performance under biologically relevant conditions, including exposure to a complex protein serum and biologically relevant temperatures. Free-end bending actuation performance, current response, force generation, and mass degradation were investigated. Preliminary results show that when exposed to proteins and biologically relevant temperatures, these silk-PPy composites show minimal degradation and are able to generate forces and conduct currents comparable to devices tested under standard conditions. This work was supported by the NSF.

  8. Automatic translation of digraph to fault-tree models

    NASA Technical Reports Server (NTRS)

    Iverson, David L.

    1992-01-01

    The author presents a technique for converting digraph models, including those models containing cycles, to a fault-tree format. A computer program which automatically performs this translation using an object-oriented representation of the models has been developed. The fault-trees resulting from translations can be used for fault-tree analysis and diagnosis. Programs to calculate fault-tree and digraph cut sets and perform diagnosis with fault-tree models have also been developed. The digraph to fault-tree translation system has been successfully tested on several digraphs of varying size and complexity. Details of some representative translation problems are presented. Most of the computation performed by the program is dedicated to finding minimal cut sets for digraph nodes in order to break cycles in the digraph. Fault-trees produced by the translator have been successfully used with NASA's Fault-Tree Diagnosis System (FTDS) to produce automated diagnostic systems.
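
    Most of the translator's work, as noted above, is computing minimal cut sets; a minimal sketch of the classic top-down (MOCUS-style) expansion with absorption on a hand-built AND/OR tree follows (the example tree is an assumption for illustration, not one of NASA's models):

    ```python
    # fault tree: gate -> (type, children); leaves A, B, C are basic events
    tree = {
        "TOP": ("OR", ["G1", "C"]),
        "G1": ("AND", ["A", "G2"]),
        "G2": ("OR", ["B", "C"]),
    }

    def cut_sets(node):
        """Return the minimal cut sets of `node` as a list of sets of basic events."""
        if node not in tree:
            return [{node}]                        # basic event
        kind, children = tree[node]
        child_sets = [cut_sets(c) for c in children]
        if kind == "OR":                           # OR: union of children's cut sets
            out = [cs for sets in child_sets for cs in sets]
        else:                                      # AND: cross-product merge
            out = [set()]
            for sets in child_sets:
                out = [a | b for a in out for b in sets]
        # absorption: drop any cut set that strictly contains another
        return [c for c in out if not any(o < c for o in out)]

    print(cut_sets("TOP"))   # [{'A', 'B'}, {'C'}] ; {'A', 'C'} absorbed by {'C'}
    ```

    In the digraph setting the same minimal cut sets are what let the translator break feedback cycles: a cycle's nodes can be re-expressed through the cut sets that disable them.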

  9. Carotid Artery Stenting – Strategies to Improve Procedural Performance and Reduce the Learning Curve

    PubMed Central

    Van Herzeele, Isabelle

    2013-01-01

    Carotid artery stenting (CAS) remains an appealing intervention to reduce the stroke risk because of its minimally invasive nature. Nevertheless, landmark randomised controlled trials have not been able to resolve the controversies surrounding this complex procedure, as the peri-operative stroke risk in a non-selected patient population still seems to be higher after CAS in comparison to carotid endarterectomy. What is more, these trials have highlighted that patient outcome after CAS is influenced by patient- and operator-dependent factors. The CAS procedure exhibits a definite learning curve, resulting in higher complication rates if the procedure is performed by inexperienced interventionists or in low-volume centres. This article will outline strategies to improve the performance of physicians carrying out the CAS procedure by means of proficiency-based training, credentialing, virtual reality rehearsal and optimal patient selection. PMID:29588751

  10. Droplet-Based Segregation and Extraction of Concentrated Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buie, C R; Buckley, P; Hamilton, J

    2007-02-23

    Microfluidic analysis often requires sample concentration and separation techniques to isolate and detect analytes of interest. Complex or scarce samples may also require an orthogonal separation and detection method or off-chip analysis to confirm results. To perform these additional steps, the concentrated sample plug must be extracted from the primary microfluidic channel with minimal sample loss and dilution. We investigated two extraction techniques: injection of immiscible fluid droplets into the sample stream ("capping") and injection of the sample into an immiscible fluid stream ("extraction"). From our results we conclude that capping is the more effective partitioning technique. Furthermore, this functionality enables additional off-chip post-processing procedures such as DNA/RNA microarray analysis, real-time polymerase chain reaction (RT-PCR), and culture growth to validate chip performance.

  11. Real-time field programmable gate array architecture for computer vision

    NASA Astrophysics Data System (ADS)

    Arias-Estrada, Miguel; Torres-Huitzil, Cesar

    2001-01-01

    This paper presents an architecture for real-time generic convolution of a mask and an image. The architecture is intended for fast low-level image processing. The field programmable gate array (FPGA)-based architecture takes advantage of the availability of registers in FPGAs to implement an efficient and compact module to process the convolutions. The architecture is designed to minimize the number of accesses to the image memory, and it is based on parallel modules with internal pipeline operation in order to improve its performance. The architecture is prototyped in an FPGA, but it can be implemented on dedicated very-large-scale integration (VLSI) devices to reach higher clock frequencies. Complexity issues, FPGA resource utilization, FPGA limitations, and real-time performance are discussed. Some results are presented and discussed.
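
    The usual way such architectures minimize image-memory accesses is a line-buffer scheme; a minimal behavioural sketch follows (Python standing in for HDL; the 3x3 Gaussian mask and the test image are illustrative assumptions), in which each pixel is fetched from image memory exactly once:

    ```python
    import numpy as np

    K = 3
    mask = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], float) / 16  # e.g. Gaussian blur

    def stream_convolve(img):
        """Model of a pipelined convolver: K-1 line buffers + a KxK register window."""
        H, W = img.shape
        rows = np.zeros((K - 1, W))            # line buffers (one previous row each)
        win = np.zeros((K, K))                 # shift-register window
        out = np.full_like(img, np.nan, dtype=float)
        for y in range(H):
            for x in range(W):                 # one pixel per "clock"
                pix = img[y, x]                # the only image-memory access
                column = np.append(rows[:, x], pix)   # K vertically adjacent pixels
                rows[:, x] = column[1:]        # push this column down the line buffers
                win[:, :-1] = win[:, 1:]       # shift window left, insert new column
                win[:, -1] = column
                if y >= K - 1 and x >= K - 1:  # window full of valid data
                    out[y - K // 2, x - K // 2] = (win * mask).sum()  # multiply-accumulate
        return out

    img = np.arange(25, dtype=float).reshape(5, 5)
    print(stream_convolve(img))   # interior equals the direct sliding-window result; borders stay NaN
    ```

    In hardware the K*K multiply-accumulates happen in parallel each clock, so throughput is one output pixel per cycle regardless of mask size, at the cost of K-1 rows of on-chip storage.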

  12. A multiplanar complex resection of a low-grade chondrosarcoma of the distal femur guided by K-wires previously inserted under CT-guide: a case report.

    PubMed

    Zoccali, Carmine; Rossi, Barbara; Ferraresi, Virginia; Anelli, Vincenzo; Rita, Alessandro

    2014-08-13

    In musculoskeletal oncology, achieving a wide surgical margin is one of the main factors influencing patient prognosis. In cases where lesions are either meta- or epiphyseal, surgery most often compromises joint integrity and stability because muscles, tendons and ligaments are involved in wide resection. When lesions are well circumscribed they can be completely resected by performing multi-planar osteotomies guided by computer-assisted navigation. We describe a case of low-grade chondrosarcoma of the distal femur where a simple but effective technique was useful for performing complex multiplanar osteotomies. No similar techniques are reported in the literature. A 57-year-old Caucasian female was referred to our department for the presence of a distal femur chondrosarcoma. A resection with the presented technique was scheduled. The first step consists of inserting several K-wires under CT-scan control to delimit the tumor; the second step consists of tumor removal: in the operating theatre, following surgical access, the K-wires are used as positioning guides; scalpels are placed externally against the K-wires to perform a safe osteotomy. Computer-assisted resections can be considered the most advantageous method for reaching the best surgical outcome; unfortunately, navigation systems are only available in specialized centres. The present technique allows for a multiplanar complex resection when navigation systems are not available. This technique can be applied in low-grade tumours where a minimal wide margin can be considered sufficient.

  13. Development of a minimization instrument for allocation of a hospital-level performance improvement intervention to reduce waiting times in Ontario emergency departments.

    PubMed

    Leaver, Chad Andrew; Guttmann, Astrid; Zwarenstein, Merrick; Rowe, Brian H; Anderson, Geoff; Stukel, Therese; Golden, Brian; Bell, Robert; Morra, Dante; Abrams, Howard; Schull, Michael J

    2009-06-08

    Rigorous evaluation of an intervention requires that its allocation be unbiased with respect to confounders; this is especially difficult in complex, system-wide healthcare interventions. We developed a short survey instrument to identify factors for a minimization algorithm for the allocation of a hospital-level intervention to reduce emergency department (ED) waiting times in Ontario, Canada. Potential confounders influencing the intervention's success were identified by literature review and grouped by healthcare-setting-specific change stages. An international multi-disciplinary (clinical, administrative, decision maker, management) panel evaluated these factors in a two-stage modified-Delphi and nominal group process based on four domains: change readiness, evidence base, face validity, and clarity of definition. An original set of 33 factors was identified from the literature. The panel reduced the list to 12 in the first round survey. In the second survey, experts scored each factor according to the four domains; summary scores and consensus discussion resulted in the final selection and measurement of four hospital-level factors to be used in the minimization algorithm: improved patient flow as a hospital's leadership priority; physicians' receptiveness to organizational change; efficiency of bed management; and physician incentives supporting the change goal. We developed a simple tool designed to gather data from senior hospital administrators on factors likely to affect the success of a hospital patient flow improvement intervention. A minimization algorithm will ensure balanced allocation of the intervention with respect to these factors in study hospitals.
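
    For readers unfamiliar with minimization, here is a minimal sketch of how such an algorithm could allocate hospitals while balancing the four selected factors; the binary factor levels, the plain Pocock-Simon imbalance count, and the 0.8 biased coin are illustrative assumptions, not details from the study protocol.

    ```python
    import random
    from collections import defaultdict

    FACTORS = ["flow_priority", "md_receptiveness", "bed_mgmt_efficiency", "md_incentives"]
    counts = defaultdict(int)     # (factor, level, arm) -> hospitals already allocated
    random.seed(7)

    def allocate(hospital):
        """Assign the arm that minimizes marginal imbalance across all four factors."""
        imbalance = {arm: sum(counts[(f, hospital[f], arm)] for f in FACTORS)
                     for arm in ("intervention", "control")}
        best = min(imbalance, key=imbalance.get)
        if imbalance["intervention"] == imbalance["control"]:
            best = random.choice(["intervention", "control"])      # tie: pure chance
        elif random.random() > 0.8:                                # biased coin keeps
            best = ({"intervention", "control"} - {best}).pop()    # some randomness
        for f in FACTORS:
            counts[(f, hospital[f], best)] += 1
        return best

    hospitals = [{f: random.choice(["high", "low"]) for f in FACTORS} for _ in range(20)]
    print([allocate(h) for h in hospitals])
    ```

    The biased coin matters: deterministic minimization would make each allocation predictable from the running counts, whereas the random element preserves allocation concealment while still keeping the arms balanced on all four factors.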

  14. Retention of Habitat Complexity Minimizes Disassembly of Reef Fish Communities following Disturbance: A Large-Scale Natural Experiment

    PubMed Central

    Emslie, Michael J.; Cheal, Alistair J.; Johns, Kerryn A.

    2014-01-01

    High biodiversity ecosystems are commonly associated with complex habitats. Coral reefs are highly diverse ecosystems, but are under increasing pressure from numerous stressors, many of which reduce live coral cover and habitat complexity with concomitant effects on other organisms such as reef fishes. While previous studies have highlighted the importance of habitat complexity in structuring reef fish communities, they employed gradient or meta-analyses which lacked a controlled experimental design over broad spatial scales to explicitly separate the influence of live coral cover from overall habitat complexity. Here a natural experiment using a long term (20 year), spatially extensive (∼115,000 km²) dataset from the Great Barrier Reef revealed the fundamental importance of overall habitat complexity for reef fishes. Reductions of both live coral cover and habitat complexity had substantial impacts on fish communities compared to relatively minor impacts after major reductions in coral cover but not habitat complexity. Where habitat complexity was substantially reduced, species abundances broadly declined and a far greater number of fish species were locally extirpated, including economically important fishes. This resulted in decreased species richness and a loss of diversity within functional groups. Our results suggest that the retention of habitat complexity following disturbances can ameliorate the impacts of coral declines on reef fishes, so preserving their capacity to perform important functional roles essential to reef resilience. These results add to a growing body of evidence about the importance of habitat complexity for reef fishes, and represent the first large-scale examination of this question on the Great Barrier Reef. PMID:25140801

  15. Retention of habitat complexity minimizes disassembly of reef fish communities following disturbance: a large-scale natural experiment.

    PubMed

    Emslie, Michael J; Cheal, Alistair J; Johns, Kerryn A

    2014-01-01

    High biodiversity ecosystems are commonly associated with complex habitats. Coral reefs are highly diverse ecosystems, but are under increasing pressure from numerous stressors, many of which reduce live coral cover and habitat complexity with concomitant effects on other organisms such as reef fishes. While previous studies have highlighted the importance of habitat complexity in structuring reef fish communities, they employed gradient or meta-analyses which lacked a controlled experimental design over broad spatial scales to explicitly separate the influence of live coral cover from overall habitat complexity. Here a natural experiment using a long term (20 year), spatially extensive (∼115,000 km²) dataset from the Great Barrier Reef revealed the fundamental importance of overall habitat complexity for reef fishes. Reductions of both live coral cover and habitat complexity had substantial impacts on fish communities compared to relatively minor impacts after major reductions in coral cover but not habitat complexity. Where habitat complexity was substantially reduced, species abundances broadly declined and a far greater number of fish species were locally extirpated, including economically important fishes. This resulted in decreased species richness and a loss of diversity within functional groups. Our results suggest that the retention of habitat complexity following disturbances can ameliorate the impacts of coral declines on reef fishes, so preserving their capacity to perform important functional roles essential to reef resilience. These results add to a growing body of evidence about the importance of habitat complexity for reef fishes, and represent the first large-scale examination of this question on the Great Barrier Reef.

  16. [Minimal emotional dysfunction and first impression formation in personality disorders].

    PubMed

    Linden, M; Vilain, M

    2011-01-01

    "Minimal cerebral dysfunctions" are isolated impairments of basic mental functions, which are elements of complex functions like speech. The best described are cognitive dysfunctions such as reading and writing problems, dyscalculia, attention deficits, but also motor dysfunctions such as problems with articulation, hyperactivity or impulsivity. Personality disorders can be characterized by isolated emotional dysfunctions in relation to emotional adequacy, intensity and responsivity. For example, paranoid personality disorders can be characterized by continuous and inadequate distrust, as a disorder of emotional adequacy. Schizoid personality disorders can be characterized by low expressive emotionality, as a disorder of effect intensity, or dissocial personality disorders can be characterized by emotional non-responsivity. Minimal emotional dysfunctions cause interactional misunderstandings because of the psychology of "first impression formation". Studies have shown that in 100 ms persons build up complex and lasting emotional judgements about other persons. Therefore, minimal emotional dysfunctions result in interactional problems and adjustment disorders and in corresponding cognitive schemata.From the concept of minimal emotional dysfunctions specific psychotherapeutic interventions in respect to the patient-therapist relationship, the diagnostic process, the clarification of emotions and reality testing, and especially an understanding of personality disorders as impairment and "selection, optimization, and compensation" as a way of coping can be derived.

  17. Producing Zirconium Diboride Components with Complex, Near-Net Shape Geometries by Aqueous Room-Temperature Injection Molding

    NASA Technical Reports Server (NTRS)

    Wiesner, Valerie L.; Youngblood, Jeffrey; Trice, Rodney

    2014-01-01

    Room-temperature injection molding is proposed as a novel, low-cost and more energy-efficient manufacturing process capable of forming complex-shaped zirconium diboride (ZrB2) parts. This innovative processing method utilized aqueous suspensions with high powder loading and a minimal amount (5 vol.%) of water-soluble polyvinylpyrrolidone (PVP), which was used as a viscosity modifier. Rheological characterization was performed to evaluate the room-temperature flow properties of ZrB2-PVP suspensions. ZrB2 specimens were fabricated with high green-body strength and were machinable prior to binder removal despite their low polymer content. After binder burnout and pressureless sintering, the bulk density and microstructure of the specimens were characterized using the Archimedes technique and scanning electron microscopy. X-ray diffraction was used to determine the phase compositions present in sintered specimens. The ultimate strength of sintered specimens will be determined using the ASTM C1323-10 compressive C-ring test.

  18. Secondary task for full flight simulation incorporating tasks that commonly cause pilot error: Time estimation

    NASA Technical Reports Server (NTRS)

    Rosch, E.

    1975-01-01

    The task of time estimation, an activity occasionally performed by pilots during actual flight, was investigated with the objective of providing human factors investigators with an unobtrusive and minimally loading additional task that is sensitive to differences in flying conditions and flight instrumentation associated with the main task of piloting an aircraft simulator. Previous research indicated that the duration and consistency of time estimates are associated with the cognitive, perceptual, and motor loads imposed by concurrent simple tasks. The relationships between the length and variability of time estimates and concurrent task variables were clarified under a more complex situation involving simulated flight. The wrap-around effect with respect to baseline duration, a consequence of mode switching at intermediate levels of concurrent task distraction, should contribute substantially to estimate variability and have a complex effect on the shape of the resulting distribution of estimates.

  19. Artificial Intelligence Procedures for Tree Taper Estimation within a Complex Vegetation Mosaic in Brazil

    PubMed Central

    Nunes, Matheus Henrique

    2016-01-01

    Tree stem form in native tropical forests is very irregular, posing a challenge to establishing taper equations that can accurately predict the diameter at any height along the stem and subsequently merchantable volume. Artificial intelligence approaches can be useful techniques in minimizing estimation errors within complex variations of vegetation. We evaluated the performance of Random Forest® regression tree and Artificial Neural Network procedures in modelling stem taper. Diameters and volume outside bark were compared to a traditional taper-based equation across a tropical Brazilian savanna, a seasonal semi-deciduous forest and a rainforest. Neural network models were found to be more accurate than the traditional taper equation. Random forest showed trends in the residuals from the diameter prediction and provided the least precise and accurate estimations for all forest types. This study provides insights into the superiority of a neural network, which provided advantages regarding the handling of local effects. PMID:27187074
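
    A minimal sketch of such a comparison can be given, since the abstract names the two learners but not their implementations. Everything below (the synthetic taper shape, the predictors, the hyperparameters) is an illustrative assumption rather than the study's setup:

      # Hedged sketch: a random-forest and a neural-network taper model on
      # synthetic data. Predictors (dbh, total height, relative height) and the
      # toy taper shape are assumptions, not the study's data.
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.neural_network import MLPRegressor
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import mean_squared_error

      rng = np.random.default_rng(0)
      n = 2000
      dbh = rng.uniform(10, 60, n)            # diameter at breast height (cm)
      h_total = 1.3 + 0.9 * dbh ** 0.8        # total height (m), toy allometry
      h_rel = rng.uniform(0, 1, n)            # relative height along the stem
      # Toy taper: diameter shrinks nonlinearly toward the tip, plus noise.
      d = dbh * (1 - h_rel) ** 1.5 + rng.normal(0, 1.0, n)

      X = np.column_stack([dbh, h_total, h_rel])
      X_tr, X_te, y_tr, y_te = train_test_split(X, d, random_state=0)

      rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
      nn = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000,
                        random_state=0).fit(X_tr, y_tr)

      for name, model in [("random forest", rf), ("neural network", nn)]:
          rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
          print(f"{name}: RMSE = {rmse:.2f} cm")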

  20. Artificial Intelligence Procedures for Tree Taper Estimation within a Complex Vegetation Mosaic in Brazil.

    PubMed

    Nunes, Matheus Henrique; Görgens, Eric Bastos

    2016-01-01

    Tree stem form in native tropical forests is very irregular, posing a challenge to establishing taper equations that can accurately predict the diameter at any height along the stem and subsequently merchantable volume. Artificial intelligence approaches can be useful techniques in minimizing estimation errors within complex variations of vegetation. We evaluated the performance of Random Forest® regression tree and Artificial Neural Network procedures in modelling stem taper. Diameters and volume outside bark were compared to a traditional taper-based equation across a tropical Brazilian savanna, a seasonal semi-deciduous forest and a rainforest. Neural network models were found to be more accurate than the traditional taper equation. Random forest showed trends in the residuals from the diameter prediction and provided the least precise and accurate estimations for all forest types. This study provides insights into the superiority of a neural network, which provided advantages regarding the handling of local effects.

  1. Controlling molecular transport in minimal emulsions

    NASA Astrophysics Data System (ADS)

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of 'minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions.

  2. Hierarchical complexity and the size limits of life.

    PubMed

    Heim, Noel A; Payne, Jonathan L; Finnegan, Seth; Knope, Matthew L; Kowalewski, Michał; Lyons, S Kathleen; McShea, Daniel W; Novack-Gottshall, Philip M; Smith, Felisa A; Wang, Steve C

    2017-06-28

    Over the past 3.8 billion years, the maximum size of life has increased by approximately 18 orders of magnitude. Much of this increase is associated with two major evolutionary innovations: the evolution of eukaryotes from prokaryotic cells approximately 1.9 billion years ago (Ga), and multicellular life diversifying from unicellular ancestors approximately 0.6 Ga. However, the quantitative relationship between organismal size and structural complexity remains poorly documented. We assessed this relationship using a comprehensive dataset that includes organismal size and level of biological complexity for 11 172 extant genera. We find that the distributions of sizes within complexity levels are unimodal, whereas the aggregate distribution is multimodal. Moreover, both the mean size and the range of sizes occupied increase with each additional level of complexity. Increases in size range are non-symmetric: the maximum organismal size increases more than the minimum. The majority of the observed increase in organismal size over the history of life on the Earth is accounted for by two discrete jumps in complexity rather than by evolutionary trends within levels of complexity. Our results provide quantitative support for an evolutionary expansion away from a minimal size constraint and suggest a fundamental rescaling of the constraints on minimal and maximal size as biological complexity increases. © 2017 The Author(s).

  3. Information Search and Decision Making: The Effects of Age and Complexity on Strategy Use

    PubMed Central

    Queen, Tara L.; Hess, Thomas M.; Ennis, Gilda E.; Dowd, Keith; Grühn, Daniel

    2012-01-01

    The impact of task complexity on information search strategy and decision quality was examined in a sample of 135 young, middle-aged, and older adults. We were particularly interested in the competing roles of fluid cognitive ability and domain knowledge and experience, with the former being a negative influence and the latter being a positive influence on older adults’ performance. Participants utilized two decision matrices, which varied in complexity, regarding a consumer purchase. Using process tracing software and an algorithm developed to assess decision strategy, we recorded search behavior, strategy selection, and final decision. Contrary to expectations, older adults were not more likely than the younger age groups to engage in information-minimizing search behaviors in response to increases in task complexity. Similarly, adults of all ages used comparable decision strategies and adapted their strategies to the demands of the task. We also examined decision outcomes in relation to participants’ preferences. Overall, it seems that older adults utilize simpler sets of information primarily reflecting the most valued attributes in making their choice. The results of this study suggest that older adults are adaptive in their approach to decision making and this ability may benefit from accrued knowledge and experience. PMID:22663157

  4. Brain correlates of aesthetic judgment of beauty.

    PubMed

    Jacobsen, Thomas; Schubotz, Ricarda I; Höfel, Lea; Cramon, D Yves V

    2006-01-01

    Functional MRI was used to investigate the neural correlates of aesthetic judgments of beauty of geometrical shapes. Participants performed evaluative aesthetic judgments (beautiful or not?) and descriptive symmetry judgments (symmetric or not?) on the same stimulus material. Symmetry was employed because aesthetic judgments are known to be often guided by criteria of symmetry. Novel, abstract graphic patterns were presented to minimize influences of attitudes or memory-related processes and to test effects of stimulus symmetry and complexity. Behavioral results confirmed the influence of stimulus symmetry and complexity on aesthetic judgments. Direct contrasts showed specific activations for aesthetic judgments in the frontomedian cortex (BA 9/10), bilateral prefrontal BA 45/47, and posterior cingulate, left temporal pole, and the temporoparietal junction. In contrast, symmetry judgments elicited specific activations in parietal and premotor areas subserving spatial processing. Interestingly, beautiful judgments enhanced BOLD signals not only in the frontomedian cortex, but also in the left intraparietal sulcus of the symmetry network. Moreover, stimulus complexity caused differential effects for each of the two judgment types. Findings indicate aesthetic judgments of beauty to rely on a network partially overlapping with that underlying evaluative judgments on social and moral cues and substantiate the significance of symmetry and complexity for our judgment of beauty.

  5. Virtual reality system for planning minimally invasive neurosurgery. Technical note.

    PubMed

    Stadie, Axel Thomas; Kockro, Ralf Alfons; Reisch, Robert; Tropine, Andrei; Boor, Stephan; Stoeter, Peter; Perneczky, Axel

    2008-02-01

    The authors report on their experience with a 3D virtual reality system for planning minimally invasive neurosurgical procedures. Between October 2002 and April 2006, the authors used the Dextroscope (Volume Interactions, Ltd.) to plan neurosurgical procedures in 106 patients, including 100 with intracranial and 6 with spinal lesions. The planning was performed 1 to 3 days preoperatively, and in 12 cases, 3D prints of the planning procedure were taken into the operating room. A questionnaire was completed by the neurosurgeon after the planning procedure. After a short period of acclimatization, the system proved easy to operate and is currently used routinely for preoperative planning of difficult cases at the authors' institution. It was felt that working with a virtual reality multimodal model of the patient significantly improved surgical planning. The pathoanatomy in individual patients could easily be understood in great detail, enabling the authors to determine the surgical trajectory precisely and in the most minimally invasive way. The authors found the preoperative 3D model to be in high concordance with intraoperative conditions; the resulting intraoperative "déjà-vu" feeling enhanced surgical confidence. In all procedures planned with the Dextroscope, the chosen surgical strategy proved to be the correct choice. Three-dimensional virtual reality models of a patient allow quick and easy understanding of complex intracranial lesions.

  6. Incorporating Auditory Models in Speech/Audio Applications

    NASA Astrophysics Data System (ADS)

    Krishnamoorthi, Harish

    2011-12-01

    Following the success in incorporating perceptual models in audio coding algorithms, their application in other speech/audio processing systems is expanding. In general, all perceptual speech/audio processing algorithms involve minimization of an objective function that directly or indirectly incorporates properties of human perception. This dissertation primarily investigates the problems associated with directly embedding an auditory model in the objective function formulation and proposes possible solutions to overcome high-complexity issues for use in real-time speech/audio algorithms. Specific problems addressed in this dissertation include: 1) the development of approximate but computationally efficient auditory model implementations that are consistent with the principles of psychoacoustics, and 2) the development of a mapping scheme that allows synthesizing a time/frequency domain representation from its equivalent auditory model output. The first problem is aimed at addressing the high computational complexity involved in solving perceptual objective functions that require repeated application of the auditory model for evaluation of different candidate solutions. In this dissertation, frequency-pruning and detector-pruning algorithms are developed that efficiently implement the various auditory model stages. The performance of the pruned model is compared to that of the original auditory model for different types of test signals in the SQAM database. Experimental results indicate only a 4-7% relative error in loudness while attaining up to an 80-90% reduction in computational complexity. Similarly, a hybrid algorithm is developed specifically for use with sinusoidal signals; it employs the proposed auditory pattern combining technique together with a look-up table that stores representative auditory patterns. The second problem obtains an estimate of the auditory representation that minimizes a perceptual objective function and transforms the auditory pattern back to its equivalent time/frequency representation. This avoids the repeated application of auditory model stages to test different candidate time/frequency vectors in minimizing perceptual objective functions. In this dissertation, a constrained mapping scheme is developed by linearizing certain auditory model stages, which ensures obtaining a time/frequency mapping corresponding to the estimated auditory representation. This paradigm was successfully incorporated in a perceptual speech enhancement algorithm and a sinusoidal component selection task.
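
    The frequency-pruning idea lends itself to a compact illustration. The sketch below is an assumption-laden toy, not the dissertation's model: it pools a magnitude spectrum into bands and skips bands far below the peak before applying a compressive per-band loudness stage.

      # Hedged sketch of frequency pruning: bands whose excitation falls far
      # below the peak are dropped before the (notionally costly) per-band
      # loudness stage. Thresholds and the loudness law are illustrative.
      import numpy as np

      def band_excitation(signal, n_bands=64):
          """Toy excitation pattern: magnitude spectrum pooled into n_bands."""
          spec = np.abs(np.fft.rfft(signal))
          edges = np.linspace(0, len(spec), n_bands + 1, dtype=int)
          return np.array([spec[a:b].sum() for a, b in zip(edges[:-1], edges[1:])])

      def loudness(excitation, prune_db=40.0):
          """Sum a compressive per-band loudness over the surviving bands."""
          peak = excitation.max()
          keep = excitation > peak * 10 ** (-prune_db / 20)   # frequency pruning
          # Stevens-like compressive nonlinearity on the kept bands only.
          return (excitation[keep] ** 0.3).sum(), keep.sum()

      t = np.linspace(0, 0.05, 2205)
      sig = np.sin(2 * np.pi * 440 * t) + 0.1 * np.sin(2 * np.pi * 1320 * t)
      total, n_kept = loudness(band_excitation(sig))
      print(f"loudness proxy = {total:.2f} using {n_kept} of 64 bands")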

  7. [Analysis of Conformational Features of Watson-Crick Duplex Fragments by Molecular Mechanics and Quantum Mechanics Methods].

    PubMed

    Poltev, V I; Anisimov, V M; Sanchez, C; Deriabina, A; Gonzalez, E; Garcia, D; Rivas, F; Polteva, N A

    2016-01-01

    It is generally accepted that the important characteristic features of the Watson-Crick duplex originate from the molecular structure of its subunits. However, it still remains to elucidate what properties of each subunit are responsible for the significant characteristic features of the DNA structure. Computations of desoxydinucleoside monophosphate complexes with Na-ions using density functional theory revealed a pivotal role of the conformational properties of single-chain minimal DNA fragments in the development of unique features of the Watson-Crick duplex. We found that the directionality of the sugar-phosphate backbone and the preferable ranges of its torsion angles, combined with the difference between purines and pyrimidines in ring bases, define the dependence of the three-dimensional structure of the Watson-Crick duplex on nucleotide base sequence. In this work, we extended these density functional theory computations to the minimal fragments of the DNA duplex, complementary desoxydinucleoside monophosphate complexes with Na-ions. Using several computational methods and various functionals, we performed a search for energy minima of the BI-conformation for complementary desoxydinucleoside monophosphate complexes with different nucleoside sequences. Two sequences were optimized using an ab initio method at the MP2/6-31++G** level of theory. The analysis of torsion angles, sugar ring puckering and mutual base positions of the optimized structures demonstrates that the conformational characteristics of complementary desoxydinucleoside monophosphate complexes with Na-ions remain within BI ranges and become closer to the corresponding characteristics of Watson-Crick duplex crystals. Qualitatively, the main characteristic features of each studied complementary desoxydinucleoside monophosphate complex remain invariant when different computational methods are used, although the quantitative values of some conformational parameters can vary within the limits typical for the corresponding family. We observe that popular functionals in density functional theory calculations lead to overestimated distances between base pairs, while MP2 computations and the newer complex functionals produce structures that have too-close atom-atom contacts. A detailed study of some complementary desoxydinucleoside monophosphate complexes with Na-ions highlights the existence of several energy minima corresponding to BI-conformations, in other words, the complexity of the relief of the potential energy surface of complementary desoxydinucleoside monophosphate complexes. This accounts for the variability of conformational parameters of duplex fragments with the same base sequence. The popular molecular mechanics force fields AMBER and CHARMM reproduce most of the conformational characteristics of desoxydinucleoside monophosphates and their complementary complexes with Na-ions but fail to reproduce some details of the dependence of the Watson-Crick duplex conformation on the nucleotide sequence.

  8. Minimization of Dependency Length in Written English

    ERIC Educational Resources Information Center

    Temperley, David

    2007-01-01

    Gibson's Dependency Locality Theory (DLT) [Gibson, E. 1998. "Linguistic complexity: locality of syntactic dependencies." "Cognition," 68, 1-76; Gibson, E. 2000. "The dependency locality theory: A distance-based theory of linguistic complexity." In A. Marantz, Y. Miyashita, & W. O'Neil (Eds.), "Image,…

  9. Deep convolutional neural network based antenna selection in multiple-input multiple-output system

    NASA Astrophysics Data System (ADS)

    Cai, Jiaxin; Li, Yan; Hu, Ying

    2018-03-01

    Antenna selection for wireless communication systems has attracted increasing attention due to the challenge of keeping a balance between communication performance and computational complexity in large-scale Multiple-Input Multiple-Output antenna systems. Recently, deep learning based methods have achieved promising performance for large-scale data processing and analysis in many application fields. This paper is the first attempt to introduce the deep learning technique into the field of Multiple-Input Multiple-Output antenna selection in wireless communications. First, the label of the attenuation-coefficient channel matrix is generated by minimizing the key performance indicator of the training antenna systems. Then, a deep convolutional neural network that explicitly exploits the massive latent cues of the attenuation coefficients is learned on the training antenna systems. Finally, we use the trained deep convolutional neural network to classify the channel matrix labels of test antennas and select the optimal antenna subset. Simulation results demonstrate that our method can achieve better performance than the state-of-the-art baselines for data-driven wireless antenna selection.
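
    The three-step pipeline (label generation by optimizing a performance indicator, CNN training, subset classification) can be sketched compactly. All dimensions, the capacity proxy used for labeling, and the architecture below are illustrative assumptions, not the paper's configuration:

      # Hedged sketch: label each channel matrix with the antenna subset that
      # maximizes a simple gain proxy, then train a small CNN to predict that
      # label. Sizes, proxy, and architecture are assumptions.
      import torch
      import torch.nn as nn
      from itertools import combinations

      torch.manual_seed(0)
      n_tx, n_rx, n_select, n_samples = 8, 4, 2, 512
      subsets = list(combinations(range(n_tx), n_select))

      def best_subset(H):
          # Label = subset whose selected columns have the largest Frobenius norm.
          gains = torch.stack([H[:, list(s)].norm() for s in subsets])
          return int(gains.argmax())

      H = torch.randn(n_samples, n_rx, n_tx)
      y = torch.tensor([best_subset(h) for h in H])

      model = nn.Sequential(
          nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
          nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
          nn.Flatten(),
          nn.Linear(32 * n_rx * n_tx, len(subsets)),
      )
      opt = torch.optim.Adam(model.parameters(), lr=1e-3)
      loss_fn = nn.CrossEntropyLoss()

      for epoch in range(20):                 # full-batch training, for brevity
          opt.zero_grad()
          loss = loss_fn(model(H.unsqueeze(1)), y)
          loss.backward()
          opt.step()
      print(f"final training loss: {loss.item():.3f}")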

  10. Optimizing classification performance in an object-based very-high-resolution land use-land cover urban application

    NASA Astrophysics Data System (ADS)

    Georganos, Stefanos; Grippa, Tais; Vanhuysse, Sabine; Lennert, Moritz; Shimoni, Michal; Wolff, Eléonore

    2017-10-01

    This study evaluates the impact of three Feature Selection (FS) algorithms in an Object-Based Image Analysis (OBIA) framework for Very-High-Resolution (VHR) Land Use-Land Cover (LULC) classification. The three selected FS algorithms, Correlation-Based Selection (CFS), Mean Decrease in Accuracy (MDA) and Random Forest (RF) based Recursive Feature Elimination (RFE), were tested on Support Vector Machine (SVM), K-Nearest Neighbor (KNN), and Random Forest (RF) classifiers. The results demonstrate that the accuracies of the SVM and KNN classifiers are the most sensitive to FS. The RF appeared to be more robust to high dimensionality, although a significant increase in accuracy was found by using the RFE method. In terms of classification accuracy, SVM performed best using FS, followed by RF and KNN. Finally, only a small number of features is needed to achieve the highest performance with each classifier. This study emphasizes the benefits of rigorous FS for maximizing performance, as well as for minimizing model complexity and simplifying interpretation.
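
    As a rough illustration of the best-performing combination reported (RF-based RFE feeding an SVM), the following sketch uses synthetic stand-ins for the OBIA object features; the feature counts and classifier settings are assumptions:

      # Hedged sketch: RF-driven recursive feature elimination, then an SVM on
      # the selected features. Synthetic data stand in for OBIA object features.
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.feature_selection import RFE
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      X, y = make_classification(n_samples=600, n_features=80,
                                 n_informative=10, random_state=0)

      rfe = RFE(RandomForestClassifier(n_estimators=200, random_state=0),
                n_features_to_select=10, step=5)
      X_sel = rfe.fit_transform(X, y)

      for label, data in [("all 80 features", X), ("10 RFE-selected", X_sel)]:
          acc = cross_val_score(SVC(), data, y, cv=5).mean()
          print(f"SVM accuracy with {label}: {acc:.3f}")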

  11. Distributed Space Mission Design for Earth Observation Using Model-Based Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Nag, Sreeja; LeMoigne-Stewart, Jacqueline; Cervantes, Ben; DeWeck, Oliver

    2015-01-01

    Distributed Space Missions (DSMs) are gaining momentum in their application to earth observation missions owing to their unique ability to increase observation sampling in multiple dimensions. DSM design is a complex problem with many design variables, multiple objectives determining performance and cost, and emergent, often unexpected, behaviors. There are very few open-access tools available to explore the tradespace of variables, minimize cost and maximize performance for pre-defined science goals, and thereby select the most optimal design. This paper presents a software tool that can generate multiple DSM architectures based on pre-defined design variable ranges and size those architectures in terms of pre-defined science and cost metrics. The tool will help a user select Pareto-optimal DSM designs based on design-of-experiments techniques. The tool will be applied to some earth observation examples to demonstrate its applicability in making key decisions between different performance metrics and cost metrics early in the design lifecycle.

  12. Multi-Objective Control Optimization for Greenhouse Environment Using Evolutionary Algorithms

    PubMed Central

    Hu, Haigen; Xu, Lihong; Wei, Ruihua; Zhu, Bingkun

    2011-01-01

    This paper investigates the issue of tuning the Proportional Integral and Derivative (PID) controller parameters for a greenhouse climate control system using an Evolutionary Algorithm (EA) based on multiple performance measures, such as good static-dynamic performance specifications and a smooth control process. A model of the nonlinear thermodynamic laws between the numerous system variables affecting the greenhouse climate is formulated. The proposed tuning scheme is tested for greenhouse climate control by minimizing the integrated time square error (ITSE) and the control increment or rate in a simulation experiment. The results show that by tuning the gain parameters the controllers can achieve good control performance through step responses, such as small overshoot, fast settling time, and reduced rise time and steady-state error. Moreover, the scheme can be applied to tuning systems with different properties, such as strong interactions among variables, nonlinearities and conflicting performance criteria. The results indicate that multi-objective optimization algorithms offer an effective and promising tuning method for complex greenhouse production. PMID:22163927
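
    The ITSE criterion is simple to state, ITSE = ∫ t·e(t)² dt, and the tuning loop can be sketched with a toy plant. The first-order plant and the (1+1) evolutionary search below are illustrative assumptions, much simpler than the paper's greenhouse model and EA:

      # Hedged sketch: tune PID gains by minimizing ITSE with a simple (1+1)
      # evolutionary search on a first-order plant. Plant and EA are toys.
      import numpy as np

      def itse(gains, dt=0.01, t_end=10.0, setpoint=1.0, tau=2.0):
          kp, ki, kd = gains
          y, integ, e_prev, cost = 0.0, 0.0, setpoint, 0.0
          for k in range(int(t_end / dt)):
              e = setpoint - y
              integ += e * dt
              u = kp * e + ki * integ + kd * (e - e_prev) / dt
              y += dt * (-y + u) / tau        # first-order plant: tau*y' = -y + u
              cost += (k * dt) * e * e * dt   # ITSE = integral of t * e(t)^2
              e_prev = e
          return cost

      rng = np.random.default_rng(1)
      best = np.array([1.0, 0.1, 0.01])
      best_cost = itse(best)
      for _ in range(300):                    # (1+1)-EA: mutate, keep if better
          cand = np.abs(best + rng.normal(0, 0.2, 3))
          c = itse(cand)
          if c < best_cost:
              best, best_cost = cand, c
      print(f"best gains kp,ki,kd = {best.round(3)}, ITSE = {best_cost:.4f}")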

  13. Designing safety into the minimally invasive surgical revolution: a commentary based on the Jacques Perissat Lecture of the International Congress of the European Association for Endoscopic Surgery.

    PubMed

    Clarke, John R

    2009-01-01

    Surgical errors with minimally invasive surgery differ from those in open surgery. Perforations are typically the result of trocar introduction or electrosurgery. Infections include bioburdens, notably enteric viruses, on complex instruments. Retained foreign objects are primarily unretrieved device fragments and lost gallstones or other specimens. Fires and burns come from illuminated ends of fiber-optic cables and from electrosurgery. Pressure ischemia is more likely with longer endoscopic surgical procedures. Gas emboli can occur. Minimally invasive surgery is more dependent on complex equipment, with high likelihood of failures. Standardization, checklists, and problem reporting are solutions for minimizing failures. The necessity of electrosurgery makes education about best electrosurgical practices important. The recording of minimally invasive surgical procedures is an opportunity to debrief in a way that improves the reliability of future procedures. Safety depends on reliability, designing systems to withstand inevitable human errors. Safe systems are characterized by a commitment to safety, formal protocols for communications, teamwork, standardization around best practice, and reporting of problems for improvement of the system. Teamwork requires shared goals, mental models, and situational awareness in order to facilitate mutual monitoring and backup. An effective team has a flat hierarchy; team members are empowered to speak up if they are concerned about problems. Effective teams plan, rehearse, distribute the workload, and debrief. Surgeons doing minimally invasive surgery have a unique opportunity to incorporate the principles of safety into the development of their discipline.

  14. Complexes of a Zn-metalloenzyme binding site with hydroxamate-containing ligands. A case for detailed benchmarkings of polarizable molecular mechanics/dynamics potentials when the experimental binding structure is unknown.

    PubMed

    Gresh, Nohad; Perahia, David; de Courcy, Benoit; Foret, Johanna; Roux, Céline; El-Khoury, Lea; Piquemal, Jean-Philip; Salmon, Laurent

    2016-12-15

    Zn-metalloproteins are a major class of targets for drug design. They constitute a demanding testing ground for polarizable molecular mechanics/dynamics aimed at extending the realm of quantum chemistry (QC) to very long-duration molecular dynamics (MD). The reliability of such procedures needs to be demonstrated upon comparing the relative stabilities of competing candidate complexes of inhibitors with the recognition site stabilized in the course of MD. This could be necessary when no information is available regarding the experimental structure of the inhibitor-protein complex. Thus, this study bears on the phosphomannose isomerase (PMI) enzyme, considered as a potential therapeutic target for the treatment of several bacterial and parasitic diseases. We consider its complexes with 5-phospho-d-arabinonohydroxamate and three analog ligands differing by the number and location of their hydroxyl groups. We evaluate the energy accuracy expectable from a polarizable molecular mechanics procedure, SIBFA. This is done by comparisons with ab initio quantum-chemistry (QC) calculations in the following cases: (a) the complexes of the four ligands in three distinct structures extracted from the entire PMI-ligand energy-minimized structures, and totaling up to 264 atoms; (b) the solvation energies of several energy-minimized complexes of each ligand with a shell of 64 water molecules; (c) the conformational energy differences of each ligand in different conformations characterized in the course of energy-minimizations; and (d) the continuum solvation energies of the ligands in different conformations. The agreements with the QC results appear convincing. On these bases, we discuss the prospects of applying the procedure to ligand-macromolecule recognition problems. © 2016 Wiley Periodicals, Inc.

  15. Power Allocation and Outage Probability Analysis for SDN-based Radio Access Networks

    NASA Astrophysics Data System (ADS)

    Zhao, Yongxu; Chen, Yueyun; Mai, Zhiyuan

    2018-01-01

    In this paper, the performance of an SDN (Software Defined Network)-based radio access network architecture is analyzed with respect to power allocation. A power allocation scheme, the PSO-PA (Particle Swarm Optimization Power Allocation) algorithm, is proposed, subject to a constant total power constraint with the objective of minimizing the system outage probability. The entire access network resource configuration is controlled by the SDN controller, which then sends the optimized power distribution factor to the base station source node (SN) and the relay node (RN). Simulation results show that the proposed scheme reduces the system outage probability at low complexity.
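
    A minimal sketch of the PSO search over the source/relay power split follows; the outage-probability proxy and channel gains are illustrative assumptions standing in for the paper's system model:

      # Hedged sketch: PSO over the SN/RN power-split factor under a total-power
      # constraint. The Rayleigh-style outage proxy is an assumption.
      import numpy as np

      def outage_proxy(alpha, p_total=1.0, gain_sn=1.0, gain_rn=1.5, snr_min=0.5):
          # Two-hop link: outage roughly dominated by the weaker hop.
          snr1 = alpha * p_total * gain_sn
          snr2 = (1 - alpha) * p_total * gain_rn
          return np.exp(-min(snr1, snr2) / snr_min)

      rng = np.random.default_rng(0)
      n_particles, iters = 20, 50
      x = rng.uniform(0.01, 0.99, n_particles)    # power fraction for the SN
      v = np.zeros(n_particles)
      p_best = x.copy()
      p_cost = np.array([outage_proxy(a) for a in x])
      g_best = p_best[p_cost.argmin()]

      for _ in range(iters):
          r1, r2 = rng.random(n_particles), rng.random(n_particles)
          v = 0.7 * v + 1.5 * r1 * (p_best - x) + 1.5 * r2 * (g_best - x)
          x = np.clip(x + v, 0.01, 0.99)
          cost = np.array([outage_proxy(a) for a in x])
          improved = cost < p_cost
          p_best[improved], p_cost[improved] = x[improved], cost[improved]
          g_best = p_best[p_cost.argmin()]

      print(f"optimal SN power fraction ~= {g_best:.3f}")

    With the gains assumed above, the search settles near the split that equalizes the two hop SNRs, which is the expected behavior for a weakest-link objective.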

  16. Computationally efficient algorithms for real-time attitude estimation

    NASA Technical Reports Server (NTRS)

    Pringle, Steven R.

    1993-01-01

    For many practical spacecraft applications, algorithms for determining spacecraft attitude must combine inputs from diverse sensors and provide redundancy in the event of sensor failure. A Kalman filter is suitable for this task; however, it may impose a computational burden that suboptimal methods can avoid. A suboptimal estimator is presented that was implemented successfully on the Delta Star spacecraft, which performed a 9-month SDI flight experiment in 1989. This design sought to minimize algorithm complexity to accommodate the limitations of an 8K guidance computer. The algorithm used is interpreted in the framework of Kalman filtering, and a derivation is given for the computation.
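
    A fixed-gain filter of the general kind described, gyro propagation corrected at a constant blend gain rather than a time-varying Kalman gain, can be sketched on a single axis. This is a generic complementary-style filter under assumed noise levels, not the Delta Star algorithm:

      # Hedged sketch: single-axis attitude estimation with a constant blend
      # gain replacing the Kalman update. Noise levels are assumptions.
      import numpy as np

      rng = np.random.default_rng(2)
      dt, n = 0.1, 600
      true_rate = 0.05                        # rad/s, constant true body rate
      gyro = true_rate + 0.002 + rng.normal(0, 0.01, n)  # biased, noisy gyro
      truth = np.cumsum(np.full(n, true_rate) * dt)
      ref = truth + rng.normal(0, 0.05, n)    # noisy low-rate attitude reference

      k = 0.02                                # fixed gain instead of Kalman gain
      est, att = np.zeros(n), 0.0
      for i in range(n):
          att += gyro[i] * dt                 # propagate with the gyro
          att += k * (ref[i] - att)           # constant-gain correction
          est[i] = att

      print(f"RMS attitude error: {np.sqrt(np.mean((est - truth)**2)):.4f} rad")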

  17. Cassini Maneuver Experience for the Fourth Year of the Solstice Mission

    NASA Technical Reports Server (NTRS)

    Vaquero, Mar; Hahn, Yungsun; Stumpf, Paul; Valerino, Powtawche; Wagner, Sean; Wong, Mau

    2014-01-01

    After sixteen years of successful mission operations and invaluable scientific discoveries, the Cassini orbiter continues to tour Saturn on the most complex gravity-assist trajectory ever flown. To ensure that the end-of-mission target of September 2017 is achieved, propellant preservation is highly prioritized over maneuver cycle minimization. Thus, the maneuver decision process, which includes determining whether a maneuver is performed or canceled, designing a targeting strategy and selecting the engine for execution, is being continuously re-evaluated. This paper summarizes the maneuver experience throughout the fourth year of the Solstice Mission highlighting 27 maneuvers targeted to nine Titan flybys.

  18. Regeneration of anion exchange resins by catalyzed electrochemical reduction

    DOEpatents

    Gu, Baohua; Brown, Gilbert M.

    2002-01-01

    Anion exchange resins sorbed with perchlorate may be regenerated by a combination of chemical reduction of perchlorate to chloride using a reducing agent and an electrochemical reduction of the oxidized reducing agent. Transitional metals including Ti, Re, and V are preferred chemical reagents for the reduction of perchlorate to chloride. Complexing agents such as oxalate are used to prevent the precipitation of the oxidized Ti(IV) species, and ethyl alcohol may be added to accelerate the reduction kinetics of perchlorate. The regeneration may be performed by continuously recycling the regenerating solution through the resin bed and an electrochemical cell so that the secondary waste generation is minimized.

  19. Opportunistic Behavior in Motivated Learning Agents.

    PubMed

    Graham, James; Starzyk, Janusz A; Jachyra, Daniel

    2015-08-01

    This paper focuses on the novel motivated learning (ML) scheme and opportunistic behavior of an intelligent agent. It extends previously developed ML to opportunistic behavior in a multitask situation. Our paper describes the virtual world implementation of autonomous opportunistic agents learning in a dynamically changing environment, creating abstract goals, and taking advantage of arising opportunities to improve their performance. An opportunistic agent achieves better results than an agent based on ML only. It does so by minimizing the average value of all need signals rather than a dominating need. This paper applies to the design of autonomous embodied systems (robots) learning in real-time how to operate in a complex environment.
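
    The selection rule described, minimizing the average of all need signals rather than only the dominating need, can be made concrete in a few lines; the needs, actions, and their effects below are illustrative assumptions:

      # Hedged sketch: greedy choice serves only the dominant need, while the
      # opportunistic choice minimizes the mean of all remaining needs.
      import numpy as np

      needs = np.array([0.9, 0.4, 0.3])       # e.g. energy, safety, curiosity
      # effects[a, i] = how much action a reduces need i
      effects = np.array([[0.5, 0.0, 0.0],
                          [0.1, 0.3, 0.2],    # an "opportunistic" action
                          [0.0, 0.0, 0.4]])

      dominant = needs.argmax()
      greedy = effects[:, dominant].argmax()  # serve only the dominant need
      opportunistic = (needs - effects).clip(0).mean(axis=1).argmin()

      print(f"greedy action: {greedy}, opportunistic action: {opportunistic}")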

  20. Reliability-Based Control Design for Uncertain Systems

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.

    2005-01-01

    This paper presents a robust control design methodology for systems with probabilistic parametric uncertainty. Control design is carried out by solving a reliability-based multi-objective optimization problem where the probability of violating design requirements is minimized. Simultaneously, failure domains are optimally enlarged to enable global improvements in the closed-loop performance. To enable an efficient numerical implementation, a hybrid approach for estimating reliability metrics is developed. This approach, which integrates deterministic sampling and asymptotic approximations, greatly reduces the numerical burden associated with complex probabilistic computations without compromising the accuracy of the results. Examples using output-feedback and full-state feedback with state estimation are used to demonstrate the ideas proposed.
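
    The reliability metric at the heart of the formulation, the probability of violating a design requirement under parametric uncertainty, can be illustrated with plain Monte Carlo sampling (one ingredient of the paper's hybrid approach; the asymptotic approximations are omitted). The toy second-order system and distributions are assumptions:

      # Hedged sketch: estimate the probability that an uncertain closed-loop
      # system violates a pole-location requirement. System and distributions
      # are illustrative.
      import numpy as np

      rng = np.random.default_rng(3)
      n = 20000
      # Uncertain plant parameters: natural frequency and damping ratio.
      wn = rng.normal(2.0, 0.3, n)
      zeta = rng.normal(0.5, 0.1, n)

      # Requirement: poles of s^2 + 2*zeta*wn*s + wn^2 decay fast enough, i.e.
      # real part < -0.8. For underdamped poles the real part is -zeta*wn.
      violates = -zeta * wn > -0.8
      print(f"estimated probability of requirement violation: {violates.mean():.4f}")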

  1. AEROSOL PARTICLE COLLECTOR DESIGN STUDY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, S; Richard Dimenna, R

    2007-09-27

    A computational evaluation of a particle collector design was performed to evaluate the behavior of aerosol particles in a fast flowing gas stream. The objective of the work was to improve the collection efficiency of the device while maintaining a minimum specified air throughput, nominal collector size, and minimal power requirements. The impact of a range of parameters was considered subject to constraints on gas flow rate, overall collector dimensions, and power limitations. Potential improvements were identified, some of which have already been implemented. Other more complex changes were identified and are described here for further consideration. In addition, fruitfulmore » areas for further study are proposed.« less

  2. Design of linear quadratic regulators with eigenvalue placement in a specified region

    NASA Technical Reports Server (NTRS)

    Shieh, Leang-San; Zhen, Liu; Coleman, Norman P.

    1990-01-01

    Two linear quadratic regulators are developed for placing the closed-loop poles of linear multivariable continuous-time systems within the common region of an open sector, bounded by lines inclined at ±π/2k (for a specified integer k not less than 1) from the negative real axis, and the left-hand side of a line parallel to the imaginary axis in the complex s-plane, and simultaneously minimizing a quadratic performance index. The design procedure mainly involves the solution of either Liapunov equations or Riccati equations. The general expression for finding the lower bound of a constant gain gamma is also developed.
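
    The underlying LQR step, solving a Riccati equation and checking where the closed-loop poles land, can be sketched as follows; the plant and weights are illustrative, and the paper's sector test is reduced here to simply printing the eigenvalues:

      # Hedged sketch: standard continuous-time LQR via the algebraic Riccati
      # equation, then inspection of the closed-loop poles. Plant is a toy.
      import numpy as np
      from scipy.linalg import solve_continuous_are

      A = np.array([[0.0, 1.0],
                    [-1.0, -0.5]])
      B = np.array([[0.0], [1.0]])
      Q = np.eye(2)           # state weight
      R = np.array([[1.0]])   # control weight

      P = solve_continuous_are(A, B, Q, R)
      K = np.linalg.solve(R, B.T @ P)         # optimal gain, u = -K x
      poles = np.linalg.eigvals(A - B @ K)
      print("closed-loop poles:", np.round(poles, 3))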

  3. Laparoscopic paraesophageal hernia repair: current controversies.

    PubMed

    Soper, Nathaniel J; Teitelbaum, Ezra N

    2013-10-01

    The advent of laparoscopy has significantly improved postoperative outcomes in patients undergoing surgical repair of a paraesophageal hernia. Although this minimally invasive approach considerably reduces postoperative pain and recovery times, and may improve physiologic outcomes, laparoscopic paraesophageal hernia repair remains a complex operation requiring advanced laparoscopic skills and experience with the anatomy of the gastroesophageal junction and diaphragmatic hiatus. In this article, we describe our approach to patient selection, preoperative evaluation, operative technique, and postoperative management. Specific attention is paid to performing an adequate hiatal dissection and esophageal mobilization, the decision of whether to use a mesh to reinforce the crural repair, and construction of an adequate antireflux barrier (ie, fundoplication).

  4. Localized heating/bonding techniques in MEMS packaging

    NASA Astrophysics Data System (ADS)

    Mabesa, J. R., Jr.; Scott, A. J.; Wu, X.; Auner, G. W.

    2005-05-01

    Packaging is used to protect and enable intelligent sensor systems utilized in manned/unmanned ground vehicle systems and subsystems. Because microelectromechanical systems (MEMS) are often used in these sensing or actuation products, the package must interact with the surrounding environment, which may be in direct conflict with the desire to isolate the electronics for improved reliability and durability. For some very simple devices, performance requirements may allow a high degree of isolation from the environment (e.g., stents and accelerometers). Other more complex devices (i.e., chemical and biological analysis systems, particularly in vivo systems) present extremely complex packaging requirements. Power and communications to MEMS device arrays are also extremely problematic. The following describes the research being performed at the U.S. Army Research, Development, and Engineering Command (RDECOM) Tank and Automotive Research, Development, and Engineering Center (TARDEC), in collaboration with Wayne State University, in Detroit, MI. The focus of the packaging research is limited to six main categories: a) provision of feed-throughs for electrical, optical, thermal, and fluidic interfaces; b) environmental management including atmosphere, hermeticity, and temperature; c) control of stress and mechanical durability; d) management of thermal properties to minimize absorption and/or emission; e) durability and structural integrity; and f) management of RF/magnetic/electrical and optical interference and/or radiation properties and exposure.

  5. Process Mining-Based Method of Designing and Optimizing the Layouts of Emergency Departments in Hospitals.

    PubMed

    Rismanchian, Farhood; Lee, Young Hoon

    2017-07-01

    This article proposes an approach to help designers analyze complex care processes and identify the optimal layout of an emergency department (ED) considering several objectives simultaneously. These objectives include minimizing the distances traveled by patients, maximizing design preferences, and minimizing the relocation costs. Rising demand for healthcare services leads to increasing demand for new hospital buildings as well as renovating existing ones. Operations management techniques have been successfully applied in both manufacturing and service industries to design more efficient layouts. However, high complexity of healthcare processes makes it challenging to apply these techniques in healthcare environments. Process mining techniques were applied to address the problem of complexity and to enhance healthcare process analysis. Process-related information, such as information about the clinical pathways, was extracted from the information system of an ED. A goal programming approach was then employed to find a single layout that would simultaneously satisfy several objectives. The layout identified using the proposed method improved the distances traveled by noncritical and critical patients by 42.2% and 47.6%, respectively, and minimized the relocation costs. This study has shown that an efficient placement of the clinical units yields remarkable improvements in the distances traveled by patients.
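
    The multi-objective flavor of the layout problem can be illustrated with a tiny enumeration. The sketch below uses a weighted-sum stand-in for the paper's goal programming formulation, and the flows, distances, and weights are invented for illustration:

      # Hedged sketch: score candidate layouts by weighted deviations on travel
      # distance, design preference, and relocation cost. All numbers are toys.
      from itertools import permutations

      units = ["triage", "imaging", "treatment"]
      # Patient flow volume between unit pairs (notionally from process mining).
      flow = {("triage", "imaging"): 30, ("triage", "treatment"): 50,
              ("imaging", "treatment"): 20}
      dist = [[0, 10, 25], [10, 0, 12], [25, 12, 0]]  # location distances
      pref = {"triage": 0, "treatment": 2}            # preferred locations
      current = {"triage": 1, "imaging": 2, "treatment": 0}

      best, best_score = None, float("inf")
      for perm in permutations(range(3)):
          loc = dict(zip(units, perm))
          travel = sum(v * dist[loc[a]][loc[b]] for (a, b), v in flow.items())
          pref_miss = sum(loc[u] != p for u, p in pref.items())
          reloc = sum(loc[u] != current[u] for u in units)
          score = 1.0 * travel + 200.0 * pref_miss + 100.0 * reloc  # weights
          if score < best_score:
              best, best_score = loc, score

      print(f"best layout: {best} (weighted score {best_score:.0f})")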

  6. Karolinska prostatectomy: a robot-assisted laparoscopic radical prostatectomy technique.

    PubMed

    Nilsson, Andreas E; Carlsson, Stefan; Laven, Brett A; Wiklund, N Peter

    2006-01-01

    The last decade has witnessed an increasing trend towards minimally invasive management of prostate cancer, including laparoscopic and, more recently, robot-assisted laparoscopic prostatectomy. Several different laparoscopic approaches have been continuously developed during the last 5 years and it is still unclear which technique yields the best outcome. We present our current technique of robot-assisted laparoscopic radical prostatectomy. The technique described has evolved during the course of >400 robotic prostatectomies performed by the robotic team since the robot-assisted laparoscopic radical prostatectomy program was introduced at Karolinska University Hospital in January 2002. Our procedure comprises several modifications of previously reported ones, and we utilize fewer robotic instruments to reduce costs. An extended posterior dissection is performed to aid in the bladder neck-sparing dissection. In nerve-sparing procedures the vesicles are divided to avoid damage to the erectile nerves. In order to preserve the apical anatomy the dorsal venous complex is incised sharply and is first over-sewn after the apical dissection is completed. Our technique enables a more fluent dissection than previously described robotic techniques. Minimizing changes of instruments and the camera not only cuts costs but also reduces inefficient operating maneuvers, such as switching between 30 degrees and 0 degrees lenses during the procedure. We present a technique which in our hands has achieved excellent functional and oncological results.

  7. An Efficient Multiblock Method for Aerodynamic Analysis and Design on Distributed Memory Systems

    NASA Technical Reports Server (NTRS)

    Reuther, James; Alonso, Juan Jose; Vassberg, John C.; Jameson, Antony; Martinelli, Luigi

    1997-01-01

    The work presented in this paper describes the application of a multiblock gridding strategy to the solution of aerodynamic design optimization problems involving complex configurations. The design process is parallelized using the MPI (Message Passing Interface) Standard such that it can be efficiently run on a variety of distributed memory systems ranging from traditional parallel computers to networks of workstations. Substantial improvements to the parallel performance of the baseline method are presented, with particular attention to their impact on the scalability of the program as a function of the mesh size. Drag minimization calculations at a fixed coefficient of lift are presented for a business jet configuration that includes the wing, body, pylon, aft-mounted nacelle, and vertical and horizontal tails. An aerodynamic design optimization is performed with both the Euler and Reynolds Averaged Navier-Stokes (RANS) equations governing the flow solution and the results are compared. These sample calculations establish the feasibility of efficient aerodynamic optimization of complete aircraft configurations using the RANS equations as the flow model. There still exists, however, the need for detailed studies of the importance of a true viscous adjoint method which holds the promise of tackling the minimization of not only the wave and induced components of drag, but also the viscous drag.

  8. Vascular surgery trainees still need to learn how to sew: importance of learning surgical techniques in the era of endovascular surgery.

    PubMed

    Aziz, Faisal

    2015-01-01

    Vascular surgery represents one of the most rapidly evolving specialties in the field of surgery. It was merely 100 years ago that Dr. Alexis Carrel described vascular anastomosis. Over the course of the next several decades, vascular surgeons distinguished themselves from general surgeons by honing the techniques of vascular surgery operations. In the era of minimally invasive interventions, the number of endovascular interventions performed by vascular surgeons has increased exponentially. Vascular surgery trainees currently spend considerable time in mastering the techniques of endovascular operations. Unfortunately, the reduction in the number of open surgical operations has led to concerns regarding the adequacy of training in open surgical techniques. In the future, the majority of vascular interventions will be done with minimally invasive techniques. The combination of poor training in open operations and the increasing complexity of open surgical operations may lead to poor surgical outcomes. It is imperative for vascular surgery trainees to realize the importance of learning and mastering open surgical techniques. One of the most distinguishing features of contemporary vascular surgeons is their ability to perform both endovascular and open vascular surgery operations, and we should strive to maintain our excellence in both of these arenas.

  9. Smartphones for cell and biomolecular detection.

    PubMed

    Liu, Xiyuan; Lin, Tung-Yi; Lillehoj, Peter B

    2014-11-01

    Recent advances in biomedical science and technology have played a significant role in the development of new sensors and assays for cell and biomolecular detection. Generally, these efforts are aimed at reducing the complexity and costs associated with diagnostic testing so that it can be performed outside of a laboratory or hospital setting, requiring minimal equipment and user involvement. In particular, point-of-care (POC) testing offers immense potential for many important applications including medical diagnosis, environmental monitoring, food safety, and biosecurity. When coupled with smartphones, POC systems can offer portability, ease of use and enhanced functionality while maintaining performance. This review article focuses on recent advancements and developments in smartphone-based POC systems within the last 6 years with an emphasis on cell and biomolecular detection. These devices typically comprise multiple components, such as detectors, sample processors, disposable chips, batteries, and software, which are integrated with a commercial smartphone. One of the most important aspects of developing these systems is the integration of these components onto a compact and lightweight platform that requires minimal power. Researchers have demonstrated several promising approaches employing various detection schemes and device configurations, and it is expected that further developments in biosensors, battery technology and miniaturized electronics will enable smartphone-based POC technologies to become more mainstream tools in the scientific and biomedical communities.

  10. Endoscopically assisted tunnel approach for minimally invasive corticotomies: a preliminary report.

    PubMed

    Hernández-Alfaro, Federico; Guijarro-Martínez, Raquel

    2012-05-01

    The dental community has expressed low acceptance of traditional corticotomy techniques for corticotomy-facilitated orthodontics. These procedures are time consuming, entail substantial postoperative morbidity and periodontal risks, and are often perceived as highly invasive. A total of 114 interdental sites were treated in nine consecutive patients. Under local anesthesia, a tunnel approach requiring one to three vertical incisions per arch (depending on the targeted teeth) was used. Piezosurgical corticotomies and elective bone augmentation procedures were performed under endoscopic assistance. Postoperative cone-beam computerized tomography evaluation was used to confirm adequate corticotomy depth. Procedures were completed in a mean time of 26 minutes. Follow-up evaluations revealed no loss of tooth vitality, no changes in periodontal probing depth, good preservation of the papillae, and no gingival recession. No evidence of crestal bone height reduction or apical root resorption was detected. The tunnel approach minimizes soft-tissue debridement and permits effective cortical cuts. The combination of piezosurgery technique with endoscopic assistance provides a quick, reliable means to design and perform these corticotomies while maximizing root integrity preservation. Moreover, the sites needing bone augmentation are selected under direct vision. Compared to traditional corticotomies, this procedure has manifest advantages in surgical time, technical complexity, patient morbidity, and periodontium preservation.

  11. Dynamical minimalism: why less is more in psychology.

    PubMed

    Nowak, Andrzej

    2004-01-01

    The principle of parsimony, embraced in all areas of science, states that simple explanations are preferable to complex explanations in theory construction. Parsimony, however, can necessitate a trade-off with depth and richness in understanding. The approach of dynamical minimalism avoids this trade-off. The goal of this approach is to identify the simplest mechanisms and fewest variables capable of producing the phenomenon in question. A dynamical model in which change is produced by simple rules repetitively interacting with each other can exhibit unexpected and complex properties. It is thus possible to explain complex psychological and social phenomena with very simple models if these models are dynamic. In dynamical minimalist theories, then, the principle of parsimony can be followed without sacrificing depth in understanding. Computer simulations have proven especially useful for investigating the emergent properties of simple models.

  12. Lamellar cationic lipid-DNA complexes from lipids with a strong preference for planar geometry: A Minimal Electrostatic Model.

    PubMed

    Perico, Angelo; Manning, Gerald S

    2014-11-01

    We formulate and analyze a minimal model, based on condensation theory, of the lamellar cationic lipid (CL)-DNA complex of alternately charged lipid bilayers and DNA monolayers in a salt solution. Each lipid bilayer, composed of a random mixture of cationic and neutral lipids, is assumed to be a rigid, uniformly charged plane. Each DNA monolayer, located between two lipid bilayers, is formed by the same number of parallel DNAs with a uniform separation distance. For the electrostatic calculation, the model lipoplex is collapsed to a single plane with charge density equal to the net lipid and DNA charge. The free energy difference between the lamellar lipoplex and a reference state of the same number of free lipid bilayers and free DNAs is calculated as a function of the fraction of CLs, of the ratio of the number of CL charges to the number of negative charges of the DNA phosphates, and of the total number of planes. At the isoelectric point the free energy difference is minimal. The complex formation, already favoured by the decrease of the electrostatic charging free energy, is driven further by the free energy gain due to the release of counterions from the DNAs and from the lipid bilayers, if strongly charged. This minimal model compares well with experiment for lipids having a strong preference for planar geometry and with major features of more detailed models of the lipoplex. © 2014 Wiley Periodicals, Inc.

  13. Free Energy and Virtual Reality in Neuroscience and Psychoanalysis: A Complexity Theory of Dreaming and Mental Disorder

    PubMed Central

    Hopkins, Jim

    2016-01-01

    The main concepts of the free energy (FE) neuroscience developed by Karl Friston and colleagues parallel those of Freud's Project for a Scientific Psychology. In Hobson et al. (2014) these include an innate virtual reality generator that produces the fictive prior beliefs that Freud described as the primary process. This enables Friston's account to encompass a unified treatment—a complexity theory—of the role of virtual reality in both dreaming and mental disorder. In both accounts the brain operates to minimize FE aroused by sensory impingements—including interoceptive impingements that report compliance with biological imperatives—and constructs a representation/model of the causes of impingement that enables this minimization. In Friston's account (variational) FE equals complexity minus accuracy, and is minimized by increasing accuracy and decreasing complexity. Roughly the brain (or model) increases accuracy together with complexity in waking. This is mediated by consciousness-creating active inference—by which it explains sensory impingements in terms of perceptual experiences of their causes. In sleep it reduces complexity by processes that include both synaptic pruning and consciousness/virtual reality/dreaming in REM. The consciousness-creating active inference that effects complexity-reduction in REM dreaming must operate on FE-arousing data distinct from sensory impingement. The most relevant source is remembered arousals of emotion, both recent and remote, as processed in SWS and REM on “active systems” accounts of memory consolidation/reconsolidation. Freud describes these remembered arousals as condensed in the dreamwork for use in the conscious contents of dreams, and similar condensation can be seen in symptoms. Complexity partly reflects emotional conflict and trauma. This indicates that dreams and symptoms are both produced to reduce complexity in the form of potentially adverse (traumatic or conflicting) arousals of amygdala-related emotions. Mental disorder is thus caused by computational complexity together with mechanisms like synaptic pruning that have evolved for complexity-reduction; and important features of disorder can be understood in these terms. Details of the consilience among Freudian, systems consolidation, and complexity-reduction accounts appear clearly in the analysis of a single fragment of a dream, indicating also how complexity reduction proceeds by a process resembling Bayesian model selection. PMID:27471478
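
    The decomposition invoked above can be written out explicitly. In the usual convention of the variational free-energy literature (stated here as background, with q(s) the recognition density over hidden causes s and o the observations; the notation is not drawn from this paper):

      F \;=\; \underbrace{D_{\mathrm{KL}}\big[\,q(s)\,\|\,p(s)\,\big]}_{\text{complexity}}
          \;-\; \underbrace{\mathbb{E}_{q(s)}\big[\ln p(o \mid s)\big]}_{\text{accuracy}}

    Minimizing F thus trades accurate explanation of observations against departures of posterior beliefs from priors, which is the sense in which waking increases accuracy together with complexity while sleep reduces complexity.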

  14. Liouville action as path-integral complexity: from continuous tensor networks to AdS/CFT

    NASA Astrophysics Data System (ADS)

    Caputa, Pawel; Kundu, Nilay; Miyaji, Masamichi; Takayanagi, Tadashi; Watanabe, Kento

    2017-11-01

    We propose an optimization procedure for Euclidean path-integrals that evaluate CFT wave functionals in arbitrary dimensions. The optimization is performed by minimizing a certain functional, which can be interpreted as a measure of computational complexity, with respect to background metrics for the path-integrals. In two-dimensional CFTs, this functional is given by the Liouville action. We also formulate the optimization for higher-dimensional CFTs and, in various examples, find that the optimized hyperbolic metrics coincide with the time slices of the expected gravity duals. Moreover, if we optimize a reduced density matrix, the geometry becomes two copies of the entanglement wedge and reproduces the holographic entanglement entropy. Our approach resembles a continuous tensor network renormalization and provides a concrete realization of the proposed interpretation of AdS/CFT as tensor networks. The present paper is an extended version of our earlier report arXiv:1703.00456 and includes many new results such as evaluations of complexity functionals, the energy stress tensor, higher-dimensional extensions and time evolutions of thermofield double states.
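
    As a reminder of the two-dimensional case referred to above, the Liouville action in a common normalization (which may differ from the paper's conventions) reads:

      S_L \;=\; \frac{c}{24\pi} \int dx\, d\tau\,
          \Big[ (\partial_x \phi)^2 + (\partial_\tau \phi)^2 + \mu\, e^{2\phi} \Big]

    where φ is the Weyl factor of the background metric and c the central charge; minimizing S_L over φ selects the optimized hyperbolic metric.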

  15. 40 CFR 230.45 - Riffle and pool complexes.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Section 230.45 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) OCEAN DUMPING SECTION... Impacts on Special Aquatic Sites § 230.45 Riffle and pool complexes. (a) Steep gradient sections of... modification. Note: Possible actions to minimize adverse impacts on site or material characteristics can be...

  16. 40 CFR 230.45 - Riffle and pool complexes.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Section 230.45 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) OCEAN DUMPING SECTION... Impacts on Special Aquatic Sites § 230.45 Riffle and pool complexes. (a) Steep gradient sections of... modification. Note: Possible actions to minimize adverse impacts on site or material characteristics can be...

  17. 40 CFR 230.45 - Riffle and pool complexes.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Section 230.45 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) OCEAN DUMPING SECTION... Impacts on Special Aquatic Sites § 230.45 Riffle and pool complexes. (a) Steep gradient sections of... modification. Note: Possible actions to minimize adverse impacts on site or material characteristics can be...

  18. Model Based Autonomy for Robust Mars Operations

    NASA Technical Reports Server (NTRS)

    Kurien, James A.; Nayak, P. Pandurang; Williams, Brian C.; Lau, Sonie (Technical Monitor)

    1998-01-01

    Space missions have historically relied upon a large ground staff, numbering in the hundreds for complex missions, to maintain routine operations. When an anomaly occurs, this small army of engineers attempts to identify and work around the problem. A piloted Mars mission, with its multiyear duration, cost pressures, half-hour communication delays and two-week blackouts cannot be closely controlled by a battalion of engineers on Earth. Flight crew involvement in routine system operations must also be minimized to maximize science return. It also may be unrealistic to require the crew to have the expertise in each mission subsystem needed to diagnose a system failure and effect a timely repair, as engineers did for Apollo 13. Enter model-based autonomy, which allows complex systems to autonomously maintain operation despite failures or anomalous conditions, contributing to safe, robust, and minimally supervised operation of spacecraft, life support, In Situ Resource Utilization (ISRU) and power systems. Autonomous reasoning is central to the approach. A reasoning algorithm uses a logical or mathematical model of a system to infer how to operate the system, diagnose failures and generate appropriate behavior to repair or reconfigure the system in response. The 'plug and play' nature of the models enables low-cost development of autonomy for multiple platforms. Declarative, reusable models capture relevant aspects of the behavior of simple devices (e.g. valves or thrusters). Reasoning algorithms combine device models to create a model of the system-wide interactions and behavior of a complex, unique artifact such as a spacecraft. Rather than requiring engineers to anticipate all possible interactions and failures at design time or perform analysis during the mission, the reasoning engine generates the appropriate response to the current situation, taking into account its system-wide knowledge, the current state, and even sensor failures or unexpected behavior.
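
    The diagnostic core of model-based reasoning, composing declarative device models and searching for the component modes consistent with observations, can be sketched in miniature. The two-valve line, the mode set, and the minimality criterion below are illustrative assumptions:

      # Hedged sketch: consistency-based diagnosis over a toy two-valve line.
      # Device models, modes, and the preference for minimal diagnoses are toys.
      from itertools import product

      def valve(mode, cmd, inflow):
          """Flow through a valve given its mode, a command, and inflow."""
          if mode == "ok":
              return inflow if cmd == "open" else 0.0
          if mode == "stuck_closed":
              return 0.0
          return inflow                       # stuck_open passes flow through

      def system_flow(modes, cmds, source=1.0):
          # Two valves in series feeding a thruster line.
          return valve(modes[1], cmds[1], valve(modes[0], cmds[0], source))

      cmds = ("open", "open")
      observed = 0.0              # no flow despite both valves commanded open

      candidates = [modes
                    for modes in product(["ok", "stuck_closed", "stuck_open"],
                                         repeat=2)
                    if system_flow(modes, cmds) == observed]
      # Prefer diagnoses with the fewest failed components (minimal diagnoses).
      best = min(candidates, key=lambda m: sum(x != "ok" for x in m))
      print("consistent diagnoses:", candidates)
      print("minimal diagnosis:", best)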

  19. Optimizing for Large Planar Fractures in Multistage Horizontal Wells in Enhanced Geothermal Systems Using a Coupled Fluid and Geomechanics Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Xiexiaomen; Tutuncu, Azra; Eustes, Alfred

    Enhanced Geothermal Systems (EGS) could potentially use technological advancements in coupled implementation of horizontal drilling and multistage hydraulic fracturing techniques in tight oil and shale gas reservoirs along with improvements in reservoir simulation techniques to design and create EGS reservoirs. In this study, a commercial hydraulic fracture simulation package, Mangrove by Schlumberger, was used in an EGS model with largely distributed pre-existing natural fractures to model fracture propagation during the creation of a complex fracture network. The main goal of this study is to investigate optimum treatment parameters in creating multiple large, planar fractures to hydraulically connect a horizontal injection well and a horizontal production well that are 10,000 ft. deep and spaced 500 ft. apart from each other. A matrix of simulations for this study was carried out to determine the influence of reservoir and treatment parameters on preventing (or aiding) the creation of large planar fractures. The reservoir parameters investigated during the matrix simulations include the in-situ stress state and properties of the natural fracture set such as the primary and secondary fracture orientation, average fracture length, and average fracture spacing. The treatment parameters investigated during the simulations were fluid viscosity, proppant concentration, pump rate, and pump volume. A final simulation with optimized design parameters was performed. The optimized design simulation indicated that high fluid viscosity, high proppant concentration, large pump volume and pump rate tend to minimize the complexity of the created fracture network. Additionally, a reservoir with 'friendly' formation characteristics such as large stress anisotropy, natural fractures set parallel to the maximum horizontal principal stress (SHmax), and large natural fracture spacing also promote the creation of large planar fractures while minimizing fracture complexity.

  20. Alternative Techniques for Treatment of Complex Below-the Knee Arterial Occlusions in Diabetic Patients With Critical Limb Ischemia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gandini, Roberto; Uccioli, Luigi; Spinelli, Alessio

    The purpose of this study was to describe alternative endovascular (EV) techniques and assess their feasibility and efficacy in minimizing failure rates in limb salvage for the treatment of complex below-the-knee (BTK) occlusions that could not be crossed with a conventional antegrade access. Between December 2007 and November 2010, 1,035 patients (557 male) underwent EV treatment for critical limb ischemia in our institution. In 124 (12% [83 male], mean age 68.2 ± 0.5 years) patients, transfemoral antegrade revascularization attempt failed, and an alternative approach was used. Follow-up was performed at 1 and 6 months. Results were compared with 56 patients treated between November 2002 and November 2007, in whom conventional technique was unsuccessful and unconventional techniques were not adopted. Technical success was achieved in 119 (96%) patients. The limb-salvage rates were 96.8% and 83% at 1- and 6-month follow-up, respectively. Sixteen (12.9%) and 33 (26.6%) patients underwent reintervention at 1- and 6-month follow-up, respectively. Transcutaneous oxygen tension increased at 1 month (44.7 ± 1.1 vs. 15.7 ± 0.8 mmHg; p < 0.001) and remained stable at follow-up. Twenty (16.1%) patients required major amputation. Thirteen (10.4%) patients died during follow-up. In our previous experience, percutaneous transluminal angioplasty failure, amputation, and death rates were 10.9, 39.2, and 23.2%, respectively. Alternative techniques allowed a significant decrease of major amputation and death rates (p = 0.0001 and p = 0.02, respectively). The use of alternative techniques seems feasible in case of a failed antegrade BTK revascularization attempt and could minimize failure rates in the treatment of complex occlusions while providing satisfying clinical success rates at 6 months.

  1. Untangling complex networks: risk minimization in financial markets through accessible spin glass ground states

    PubMed Central

    Lisewski, Andreas Martin; Lichtarge, Olivier

    2010-01-01

    Recurrent international financial crises inflict significant damage to societies and stress the need for mechanisms or strategies to control risk and temper market uncertainties. Unfortunately, the complex network of market interactions often confounds rational approaches to optimize financial risks. Here we show that investors can overcome this complexity and globally minimize risk in portfolio models for any given expected return, provided the relative margin requirement remains below a critical, empirically measurable value. In practice, for markets with centrally regulated margin requirements, a rational stabilization strategy would be keeping margins small enough. This result follows from ground states of the random field spin glass Ising model that can be calculated exactly through convex optimization when relative spin coupling is limited by the norm of the network's Laplacian matrix. In that regime, this novel approach is robust to noise in empirical data and may be also broadly relevant to complex networks with frustrated interactions that are studied throughout scientific fields. PMID:20625477
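    For orientation, the random field Ising model mentioned in this record is conventionally defined by the Hamiltonian

        H(s) = -\sum_{i<j} J_{ij}\, s_i s_j - \sum_i h_i\, s_i, \qquad s_i \in \{-1, +1\},

    with couplings J_{ij} encoding the network of market interactions and local fields h_i; the record's claim is that the ground state of this system becomes computable by convex optimization when the relative coupling strength is bounded by the norm of the network's Laplacian matrix.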

  2. Untangling complex networks: Risk minimization in financial markets through accessible spin glass ground states

    NASA Astrophysics Data System (ADS)

    Lisewski, Andreas Martin; Lichtarge, Olivier

    2010-08-01

    Recurrent international financial crises inflict significant damage to societies and stress the need for mechanisms or strategies to control risk and temper market uncertainties. Unfortunately, the complex network of market interactions often confounds rational approaches to optimize financial risks. Here we show that investors can overcome this complexity and globally minimize risk in portfolio models for any given expected return, provided the margin requirement remains below a critical, empirically measurable value. In practice, for markets with centrally regulated margin requirements, a rational stabilization strategy would be keeping margins small enough. This result follows from ground states of the random field spin glass Ising model that can be calculated exactly through convex optimization when relative spin coupling is limited by the norm of the network’s Laplacian matrix. In that regime, this novel approach is robust to noise in empirical data and may be also broadly relevant to complex networks with frustrated interactions that are studied throughout scientific fields.

  3. Untangling complex networks: risk minimization in financial markets through accessible spin glass ground states.

    PubMed

    Lisewski, Andreas Martin; Lichtarge, Olivier

    2010-08-15

    Recurrent international financial crises inflict significant damage to societies and stress the need for mechanisms or strategies to control risk and temper market uncertainties. Unfortunately, the complex network of market interactions often confounds rational approaches to optimize financial risks. Here we show that investors can overcome this complexity and globally minimize risk in portfolio models for any given expected return, provided the relative margin requirement remains below a critical, empirically measurable value. In practice, for markets with centrally regulated margin requirements, a rational stabilization strategy would be keeping margins small enough. This result follows from ground states of the random field spin glass Ising model that can be calculated exactly through convex optimization when relative spin coupling is limited by the norm of the network's Laplacian matrix. In that regime, this novel approach is robust to noise in empirical data and may be also broadly relevant to complex networks with frustrated interactions that are studied throughout scientific fields.

  4. Optimized tuner selection for engine performance estimation

    NASA Technical Reports Server (NTRS)

    Simon, Donald L. (Inventor); Garg, Sanjay (Inventor)

    2013-01-01

    A methodology for minimizing the error in on-line Kalman filter-based aircraft engine performance estimation applications is presented. This technique specifically addresses the underdetermined estimation problem, where there are more unknown parameters than available sensor measurements. A systematic approach is applied to produce a model tuning parameter vector of appropriate dimension to enable estimation by a Kalman filter, while minimizing the estimation error in the parameters of interest. Tuning parameter selection is performed using a multi-variable iterative search routine which seeks to minimize the theoretical mean-squared estimation error. Theoretical Kalman filter estimation error bias and variance values are derived at steady-state operating conditions, and the tuner selection routine is applied to minimize these values. The new methodology yields an improvement in on-line engine performance estimation accuracy.

  5. Large Eddy Simulation of High Reynolds Number Complex Flows

    NASA Astrophysics Data System (ADS)

    Verma, Aman

    Marine configurations are subject to a variety of complex hydrodynamic phenomena affecting the overall performance of the vessel. The turbulent flow affects the hydrodynamic drag, propulsor performance and structural integrity, control-surface effectiveness, and acoustic signature of the marine vessel. Due to advances in massively parallel computers and numerical techniques, an unsteady numerical simulation methodology such as Large Eddy Simulation (LES) is well suited to study such complex turbulent flows, whose Reynolds numbers (Re) are typically on the order of 10^6. LES also promises increased accuracy over RANS-based methods in predicting unsteady phenomena such as cavitation and noise production. This dissertation develops the capability to enable LES of high Re flows in complex geometries (e.g. a marine vessel) on unstructured grids and provide physical insight into the turbulent flow. LES is performed to investigate the geometry-induced separated flow past a marine propeller attached to a hull, in an off-design condition called crashback. LES shows good quantitative agreement with experiments and provides a physical mechanism to explain the increase in side-force on the propeller blades below an advance ratio of J=-0.7. Fundamental developments in the dynamic subgrid-scale model for LES are pursued to improve the LES predictions, especially for complex flows on unstructured grids. A dynamic procedure is proposed to estimate a Lagrangian time scale based on a surrogate correlation without any adjustable parameter. The proposed model is applied to turbulent channel, cylinder and marine propeller flows and predicts improved results over other model variants due to a physically consistent Lagrangian time scale. A wall model is proposed for application to LES of high Reynolds number wall-bounded flows. The wall model is formulated as the minimization of a generalized constraint in the dynamic model for LES and applied to LES of turbulent channel flow at various Reynolds numbers up to Re_τ = 10000 and at coarse grid resolutions, yielding significant improvement.

  6. The Case for Simulation-Based Mastery Learning Education Courses for Practicing Surgeons.

    PubMed

    Baumann, Lauren M; Barsness, Katherine A

    2018-03-12

    Pediatric surgeons rely on simulation courses to develop skills for safe minimally invasive repair of complex congenital anomalies. The majority of minimally invasive surgery (MIS) training courses occur during short "exposure courses" at annual conferences. Little data are available to support the benefit of these courses relative to the safe implementation of new skills. The purpose of this article is to determine the impact of an exposure course for advanced neonatal MIS on self-perceived comfort levels with independent performance of advanced MISs. Participants of a 4-hour hands-on course for neonatal MIS were surveyed regarding clinical practices and pre- and post-training perceived "comfort levels" of MIS skills for thoracoscopic esophageal atresia with tracheoesophageal fistula (tTEF) repair, thoracoscopic left upper lobe pulmonary lobectomy (tLobe), and laparoscopic duodenal atresia (lapDA) repair. Descriptive analyses were performed. Seventeen participants completed pre- and postcourse surveys. The majority of participants had no prior experience with tLobe (59%) or lapDA (53%), and 35% had no experience with tTEF repair. Similarly, the majority were "not comfortable" with these procedures. After the short course, the majority of surgeons reported that they were "likely to perform" these operations within 6 months, despite low levels of baseline experience and comfort levels. An exposure training course led to immediate perception of increased skills and confidence. However, these courses typically do not provide basic tenets of expert performance that demands deliberate practice. Future course design should transition to a mastery learning framework wherein regular skill assessments, milestones, and unlimited education time are prioritized before implementation of the new skills.

  7. Minimal acceptable care as a vital component to Missouri's trauma system.

    PubMed

    Helling, Thomas S

    2002-07-01

    Immediate attention to life-threatening injuries and expeditious transfer of major and complex wounds to tertiary care trauma centers are the cornerstones of any trauma system. Rapid assessment and "minimalization" of care should be the buzz-word of rural (Level III) and suburban (Level II) trauma centers in order to provide quickest treatment of injuries by timely referral of patients for definitive attention. This concept is called minimal acceptable care and may serve to improve patient outcome by reducing the interval to ultimate treatment and avoidance of duplication of services.

  8. Improving the Efficiency and Safety of Aspirin by Complexation with the Natural Polysaccharide Arabinogalactan.

    PubMed

    Khvostov, Mikhail V; Tolstikova, Tatjana G; Borisov, Sergey A; Zhukova, Natalja A; Dushkin, Alexander V; Chistyachenko, Yulia S; Polyakov, Nikolay E

    2016-01-01

    The main undesirable side effect of aspirin is damage to the gastrointestinal mucosa, leading to the formation of erosions, peptic ulcers, and as a result, bleeding. To overcome this problem, "host-guest" complexation with the natural polysaccharide arabinogalactan can be applied. The complex with a weight ratio of ASA:AG = 1:10 was prepared by a solid-phase method in a rotary mill. The complex was administered orally to mice or rats at doses of 250, 500 or 1000 mg/kg. The "acetic acid induced writhing" and "hot plate" tests were used as in vivo pain models. The anti-inflammatory activity was studied using the "histamine swelling" test. Also, long-term (30 days) oral administration of the complex to rats was performed and gastric mucosal damage was evaluated. In all experiments pure aspirin (ASA) was used as a control in appropriate doses. The minimal effective analgesic dose of the complex was 250 mg/kg, equivalent to 23 mg/kg of ASA, a dose at which aspirin itself was not active. The anti-inflammatory effect was found at relatively higher doses: 500 and 1000 mg/kg (46 and 92 mg/kg of ASA, respectively) for the complex and only at 100 mg/kg for the ASA. Long-term administration of the complex at doses of 250 and 500 mg/kg was safe for the gastric mucosa, while ASA at the dose of 50 mg/kg caused strong gastric mucosal damage. The effective analgesic and anti-inflammatory doses of the 1:10 aspirin complex with arabinogalactan are half those of pure aspirin, and the complex is safer for the gastrointestinal mucosa.

  9. Recursive formulae and performance comparisons for first mode dynamics of periodic structures

    NASA Astrophysics Data System (ADS)

    Hobeck, Jared D.; Inman, Daniel J.

    2017-05-01

    Periodic structures are growing in popularity, especially in the energy harvesting and metastructures communities. Common types of these unique structures are referred to in the literature as zigzag, orthogonal spiral, fan-folded, and longitudinal zigzag structures. Many of these studies on periodic structures have two competing goals in common: (a) minimizing natural frequency, and (b) minimizing mass or volume. These goals suggest that no single design is best for all applications; therefore, there is a need for design optimization and comparison tools, which first require efficient, easy-to-implement models. All available structural dynamics models for these types of structures do provide exact analytical solutions; however, they are complex, require tedious implementation, and provide more information than is necessary for practical applications, making them computationally inefficient. This paper presents experimentally validated recursive models that are able to very accurately and efficiently predict the dynamics of the four most common types of periodic structures. The proposed modeling technique employs a combination of static deflection formulae and Rayleigh’s Quotient to estimate the first mode shape and natural frequency of periodic structures having any number of beams. Also included in this paper are the results of an extensive experimental validation study, which show excellent agreement between model prediction and measurement. Lastly, the proposed models are used to evaluate the performance of each type of structure. Results of this performance evaluation reveal key advantages and disadvantages associated with each type of structure.
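    For context, and as the standard textbook form rather than the authors' exact expressions, Rayleigh's Quotient estimates the fundamental frequency of a beam-like structure from an assumed (for instance, static-deflection) mode shape w(x):

        \omega_1^2 \approx \frac{\int_0^L EI(x)\, [w''(x)]^2\, dx}{\int_0^L m(x)\, [w(x)]^2\, dx},

    or, in discrete form, \omega_1^2 \approx (\phi^T K \phi) / (\phi^T M \phi) for stiffness and mass matrices K and M. Using the static deflection shape as the assumed mode is the standard way to keep such estimates both cheap and accurate for the first mode.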

  10. Electron Beam Freeform Fabrication: A Fabrication Process that Revolutionizes Aircraft Structural Designs and Spacecraft Supportability

    NASA Technical Reports Server (NTRS)

    Taminger, Karen M.

    2008-01-01

    The technological inception and challenges, as well as current applications, of the electron beam freeform fabrication (EBF3) process are outlined. The process was motivated by the need for a new metals technology that would be cost-effective, enable the production of new alloys, and could be used for efficient, lightweight structures. EBF3 is a rapid metal fabrication, layer-additive process that uses no molds or tools and yields properties equivalent to wrought material. The benefits of EBF3 include its near-net shape, which minimizes scrap and reduces part count; efficiency in design, which allows for lighter weight and enhanced performance; and its "green" manufacturing process, which yields minimal waste products. EBF3 material also has high tensile strength, although a structural test comparison found that EBF3 panels performed 5% lower than machined panels. Technical challenges in the EBF3 process include a need for process control monitoring and an improvement in localized heat response. Currently, the EBF3 process can be used to add details onto forgings and to construct and form complex shapes. However, it has potential uses in a variety of industries, including aerospace, automotive, sporting goods and medical implant devices. The novel structural design capabilities of EBF3 can yield curved stiffeners that may be optimized for performance, low weight, low noise and damage-tolerance applications. EBF3 has also demonstrated its usefulness in 0-gravity environments for supportability in space applications.

  11. Assessing the performance of MM/PBSA and MM/GBSA methods. 8. Predicting binding free energies and poses of protein-RNA complexes.

    PubMed

    Chen, Fu; Sun, Huiyong; Wang, Junmei; Zhu, Feng; Liu, Hui; Wang, Zhe; Lei, Tailong; Li, Youyong; Hou, Tingjun

    2018-06-21

    Molecular docking provides a computationally efficient way to predict the atomic structural details of protein-RNA interactions (PRI), but accurate prediction of the three-dimensional structures and binding affinities for PRI is still notoriously difficult, partly due to the unreliability of the existing scoring functions for PRI. MM/PBSA and MM/GBSA are more theoretically rigorous than most scoring functions for protein-RNA docking, but their prediction performance for protein-RNA systems remains unclear. Here, we systematically evaluated the capability of MM/PBSA and MM/GBSA to predict the binding affinities and recognize the near-native binding structures for protein-RNA systems with different solvent models and interior dielectric constants (ϵ_in). For predicting the binding affinities, the predictions given by MM/GBSA based on the minimized structures in explicit solvent and the GBGBn1 model with ϵ_in = 2 yielded the highest correlation with the experimental data. Moreover, the MM/GBSA calculations based on the minimized structures in implicit solvent and the GBGBn1 model distinguished the near-native binding structures within the top 10 decoys for 118 out of the 149 protein-RNA systems (79.2%). This performance is better than all docking scoring functions studied here. Therefore, MM/GBSA rescoring is an efficient way to improve the prediction capability of scoring functions for protein-RNA systems. Published by Cold Spring Harbor Laboratory Press for the RNA Society.
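    As background, the MM/PBSA and MM/GBSA binding free energies evaluated here follow the usual decomposition (entropy treatment and sign conventions vary between studies):

        \Delta G_{\mathrm{bind}} \approx \Delta E_{\mathrm{MM}} + \Delta G_{\mathrm{solv}} - T\Delta S, \qquad \Delta E_{\mathrm{MM}} = \Delta E_{\mathrm{ele}} + \Delta E_{\mathrm{vdW}} + \Delta E_{\mathrm{int}},

    where \Delta G_{\mathrm{solv}} combines a polar contribution from the Poisson-Boltzmann or generalized Born model with a nonpolar contribution usually taken proportional to the solvent-accessible surface area; the interior dielectric constant ϵ_in enters through the polar term.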

  12. Governing factors affecting the impacts of silver nanoparticles on wastewater treatment.

    PubMed

    Zhang, Chiqian; Hu, Zhiqiang; Li, Ping; Gajaraj, Shashikanth

    2016-12-01

    Silver nanoparticles (nanosilver or AgNPs) enter municipal wastewater from various sources, raising concerns about their potential adverse effects on wastewater treatment processes. We argue that the biological effects of silver nanoparticles at environmentally realistic concentrations (μg L⁻¹ or lower) on the performance of a full-scale municipal water resource recovery facility (WRRF) are minimal. Reactor configuration is a critical factor that reduces or even mutes the toxicity of silver nanoparticles towards wastewater microbes in a full-scale WRRF. Municipal sewage collection networks transform silver nanoparticles into silver(I)-complexes/precipitates with low ecotoxicity, and preliminary/primary treatment processes in front of biological treatment utilities partially remove silver nanoparticles to sludge. Microbial functional redundancy and microbial adaptability to silver nanoparticles also greatly alleviate the adverse effects of silver nanoparticles on the performance of a full-scale WRRF. Silver nanoparticles in a lab-scale bioreactor without a sewage collection system and/or a preliminary/primary treatment process, in contrast to being in a full-scale system, may deteriorate the reactor performance at relatively high concentrations (e.g., mg L⁻¹ levels or higher). However, in many cases, silver nanoparticles have minimal impacts on lab-scale bioreactors, such as sequencing batch bioreactors (SBRs), especially when at relatively low concentrations (e.g., less than 1 mg L⁻¹). The susceptibility of wastewater microbes to silver nanoparticles is species-specific. In general, silver nanoparticles have higher toxicity towards nitrifying bacteria than heterotrophic bacteria. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Improving the performance of minimizers and winnowing schemes

    PubMed Central

    Marçais, Guillaume; Pellow, David; Bork, Daniel; Orenstein, Yaron; Shamir, Ron; Kingsford, Carl

    2017-01-01

    Abstract Motivation: The minimizers scheme is a method for selecting k-mers from sequences. It is used in many bioinformatics software tools to bin comparable sequences or to sample a sequence in a deterministic fashion at approximately regular intervals, in order to reduce memory consumption and processing time. Although very useful, the minimizers selection procedure has undesirable behaviors (e.g. too many k-mers are selected when processing certain sequences). Some of these problems were already known to the authors of the minimizers technique, and the natural lexicographic ordering of k-mers used by minimizers was recognized as their origin. Many software tools using minimizers employ ad hoc variations of the lexicographic order to alleviate those issues. Results: We provide an in-depth analysis of the effect of k-mer ordering on the performance of the minimizers technique. By using small universal hitting sets (a recently defined concept), we show how to significantly improve the performance of minimizers and avoid some of its worse behaviors. Based on these results, we encourage bioinformatics software developers to use an ordering based on a universal hitting set or, if not possible, a randomized ordering, rather than the lexicographic order. This analysis also settles negatively a conjecture (by Schleimer et al.) on the expected density of minimizers in a random sequence. Availability and Implementation: The software used for this analysis is available on GitHub: https://github.com/gmarcais/minimizers.git. Contact: gmarcais@cs.cmu.edu or carlk@cs.cmu.edu PMID:28881970
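    To make the selection procedure concrete, here is a minimal sketch of a (w,k)-minimizers scheme under the standard definition (the smallest k-mer in each window of w consecutive k-mers is kept). The hash-based ordering is a stand-in for the universal-hitting-set or randomized orderings recommended by the authors, not their implementation:

        import hashlib

        def kmers(seq, k):
            return [seq[i:i+k] for i in range(len(seq) - k + 1)]

        def minimizers(seq, w, k, order=lambda km: km):
            """Return the set of (position, k-mer) pairs selected as minimizers."""
            kms = kmers(seq, k)
            selected = set()
            for i in range(len(kms) - w + 1):
                window = kms[i:i+w]
                j = min(range(w), key=lambda t: order(window[t]))
                selected.add((i + j, window[j]))
            return selected

        def random_order(km):
            # Pseudo-random ordering via a stable hash; in practice this yields
            # lower density than the lexicographic default, as the paper's
            # analysis predicts.
            return hashlib.sha1(km.encode()).digest()

        seq = "ACGTACGTTGCAACGTTGCAACGT"
        print(len(minimizers(seq, w=4, k=5)))                      # lexicographic
        print(len(minimizers(seq, w=4, k=5, order=random_order)))  # randomized

    The number of selected k-mers (the density) is the quantity the paper analyzes; swapping the ordering changes the density without changing the windows.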

  14. A Scalable, Parallel Approach for Multi-Point, High-Fidelity Aerostructural Optimization of Aircraft Configurations

    NASA Astrophysics Data System (ADS)

    Kenway, Gaetan K. W.

    This thesis presents new tools and techniques developed to address the challenging problem of high-fidelity aerostructural optimization with respect to large numbers of design variables. A new mesh-movement scheme is developed that is both computationally efficient and sufficiently robust to accommodate large geometric design changes and aerostructural deformations. A fully coupled Newton-Krylov method is presented that accelerates the convergence of aerostructural systems, provides a 20% performance improvement over the traditional nonlinear block Gauss-Seidel approach, and can handle more flexible structures. A coupled adjoint method is used that efficiently computes derivatives for a gradient-based optimization algorithm. The implementation uses only machine-accurate derivative techniques and is verified to yield fully consistent derivatives by comparing against the complex-step method. The fully coupled, large-scale adjoint solution method is shown to have 30% better performance than the segregated approach. The parallel scalability of the coupled adjoint technique is demonstrated on an Euler Computational Fluid Dynamics (CFD) model with more than 80 million state variables coupled to a detailed structural finite-element model of the wing with more than 1 million degrees of freedom. Multi-point high-fidelity aerostructural optimizations of a long-range wide-body, transonic transport aircraft configuration are performed using the developed techniques. The aerostructural analysis employs Euler CFD with a 2-million-cell mesh and a structural finite-element model with 300,000 DOF. Two design optimization problems are solved: one where takeoff gross weight is minimized, and another where fuel burn is minimized. Each optimization uses a multi-point formulation with 5 cruise conditions and 2 maneuver conditions. The optimization problems have 476 design variables, and optimal results are obtained within 36 hours of wall time using 435 processors. The TOGW minimization results in a 4.2% reduction in TOGW with a 6.6% fuel burn reduction, while the fuel burn optimization results in an 11.2% fuel burn reduction with no change to the takeoff gross weight.
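    The complex-step derivative check mentioned above is simple enough to illustrate directly. The function below is the classic Squire-Trapp test case, not one from the thesis:

        import numpy as np

        def f(x):
            # Classic complex-step test function (Squire & Trapp, 1998).
            return np.exp(x) / np.sqrt(np.sin(x)**3 + np.cos(x)**3)

        x, h = 1.5, 1e-200
        cs = np.imag(f(x + 1j * h)) / h            # complex step: f'(x) ~ Im f(x+ih)/h
        fd = (f(x + 1e-8) - f(x - 1e-8)) / 2e-8    # central finite difference
        print(cs, fd)

    Because the complex step involves no subtraction of nearly equal numbers, it is free of cancellation error, which is why it serves as a machine-accurate reference for verifying adjoint derivatives.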

  15. Associating optical measurements and estimating orbits of geocentric objects with a Genetic Algorithm: performance limitations.

    NASA Astrophysics Data System (ADS)

    Zittersteijn, Michiel; Schildknecht, Thomas; Vananti, Alessandro; Dolado Perez, Juan Carlos; Martinot, Vincent

    2016-07-01

    Currently several thousands of objects are being tracked in the MEO and GEO regions through optical means. With the advent of improved sensors and a heightened interest in the problem of space debris, it is expected that the number of tracked objects will grow by an order of magnitude in the near future. This research aims to provide a method that can treat the correlation and orbit determination problems simultaneously, and is able to efficiently process large data sets with minimal manual intervention. This problem is also known as the Multiple Target Tracking (MTT) problem. The complexity of the MTT problem is defined by its dimension S. Current research tends to focus on the S = 2 MTT problem, because for S = 2 the problem can be solved in polynomial time. However, with S = 2 the decision to associate a set of observations is based on the minimum amount of information; in ambiguous situations (e.g. satellite clusters) this will lead to incorrect associations. The S > 2 MTT problem is an NP-hard combinatorial optimization problem. In previous work an Elitist Genetic Algorithm (EGA) was proposed as a method to approximately solve this problem. It was shown that the EGA is able to find a good approximate solution with polynomial time complexity. The EGA relies on solving the Lambert problem in order to perform the necessary orbit determinations. This means that the algorithm is restricted to orbits that are described by Keplerian motion. The work presented in this paper focuses on the impact that this restriction has on the algorithm performance.

  16. A multiplanar complex resection of a low-grade chondrosarcoma of the distal femur guided by K-wires previously inserted under CT-guide: a case report

    PubMed Central

    2014-01-01

    Background In musculoskeletal oncology, achieving a wide surgical margin is one of the main factors influencing patient prognosis. In cases where lesions are either meta- or epiphyseal, surgery most often compromises joint integrity and stability because muscles, tendons and ligaments are involved in wide resection. When lesions are well circumscribed they can be completely resected by performing multi-planar osteotomies guided by computer-assisted navigation. We describe a case of low-grade chondrosarcoma of the distal femur where a simple but effective technique was useful to perform complex multiplanar osteotomies. No similar techniques are reported in the literature. Case presentation A 57-year-old Caucasian female was referred to our department for the presence of a distal femur chondrosarcoma. A resection with the presented technique was scheduled. The first step consists of inserting several K-wires under CT-scan control to delineate the tumor; the second step consists of tumor removal: in the operating theatre, following surgical access, the K-wires serve as positioning guides; scalpels are placed externally to the K-wires to perform a safe osteotomy. Conclusions Computer-assisted resections can be considered the most advantageous method to reach the best surgical outcome; unfortunately, navigation systems are only available in specialized centres. The present technique allows for a multiplanar complex resection when navigation systems are not available. This technique can be applied in low-grade tumours where a minimal wide margin can be considered sufficient. PMID:25123066

  17. Design and manufacturing considerations for high-performance gimbals used for land, sea, air, and space

    NASA Astrophysics Data System (ADS)

    Sweeney, Mike; Redd, Lafe; Vettese, Tom; Myatt, Ray; Uchida, David; Sellers, Del

    2015-09-01

    High performance stabilized EO/IR surveillance and targeting systems are in demand for a wide variety of military, law enforcement, and commercial assets for land, sea, air, and space. Operating ranges, wavelengths, and angular resolution capabilities define the requirements for EO/IR optics and sensors, and line-of-sight stabilization. Many materials and design configurations are available for EO/IR pointing gimbals, depending on trade-offs of size, weight, power (SWaP), performance, and cost. Space and high-performance military aircraft applications are often driven toward expensive but exceptionally high-performing beryllium and aluminum-beryllium components. Commercial applications often rely on aluminum and composite materials. Gimbal design considerations include minimizing mass and inertia while meeting demanding structural, thermal, optical, and scene-stabilization requirements in dynamic operational environments. Manufacturing considerations include precision lapping and honing of ball bearing interfaces; brazing, welding, and casting of complex aluminum and beryllium alloy structures; and molding of composite structures. Several notional and previously developed EO/IR gimbal platforms are profiled that exemplify applicable design and manufacturing technologies.

  18. Real-valued composite filters for correlation-based optical pattern recognition

    NASA Technical Reports Server (NTRS)

    Rajan, P. K.; Balendra, Anushia

    1992-01-01

    Advances in the technology of optical devices such as spatial light modulators (SLMs) have influenced the research and growth of optical pattern recognition. In the research leading to this report, the design of real-valued composite filters that can be implemented using currently available SLMs for optical pattern recognition and classification was investigated. First, the design of the real-valued minimum average correlation energy (RMACE) filter was studied. Proper selection of the phase of the output response was shown to reduce the correlation energy. The performance of the filter was evaluated using computer simulations and compared with that of the complex filters; it was found that the performance degraded only slightly. Continuing the above investigation, the design of a real filter that minimizes both the output correlation energy and the output variance due to noise was developed. Simulation studies showed that this filter had better tolerance to distortion and noise than the RMACE filter. Finally, the space-domain design of the RMACE filter was developed and implemented on the computer. It was found that the sharpness of the correlation peak was slightly reduced, but the filter design was more computationally efficient than the complex filter.
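    For reference, the complex frequency-domain MACE filter that the real-valued (RMACE) design adapts is conventionally given by

        \mathbf{h} = D^{-1} X \left( X^{+} D^{-1} X \right)^{-1} \mathbf{u},

    where the columns of X hold the DFTs of the training images, D is the diagonal matrix of their average power spectrum, \mathbf{u} prescribes the correlation-peak values at the origin, and ^{+} denotes the conjugate transpose. The real-valued designs impose, in effect, the additional constraint that the filter coefficients be real so the filter can be realized on currently available SLMs.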

  19. Inhibiting diffusion of complex contagions in social networks: theoretical and experimental results

    PubMed Central

    Anil Kumar, V.S.; Marathe, Madhav V.; Ravi, S.S.; Rosenkrantz, Daniel J.

    2014-01-01

    We consider the problem of inhibiting undesirable contagions (e.g. rumors, spread of mob behavior) in social networks. Much of the work in this context has been carried out under the 1-threshold model, where diffusion occurs when a node has just one neighbor with the contagion. We study the problem of inhibiting more complex contagions in social networks where nodes may have thresholds larger than 1. The goal is to minimize the propagation of the contagion by removing a small number of nodes (called critical nodes) from the network. We study several versions of this problem and prove that, in general, they cannot even be efficiently approximated to within any factor ρ ≥ 1, unless P = NP. We develop efficient and practical heuristics for these problems and carry out an experimental study of their performance on three well known social networks, namely epinions, wikipedia and slashdot. Our results show that these heuristics perform significantly better than five other known methods. We also establish an efficiently computable upper bound on the number of nodes to which a contagion can spread and evaluate this bound on many real and synthetic networks. PMID:25750583
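    For illustration, the t-threshold diffusion process underlying this work can be simulated in a few lines; the graph, seeds, and removed nodes below are arbitrary examples rather than the paper's benchmarks:

        import networkx as nx

        def spread(G, seeds, t, removed=frozenset()):
            """Simulate threshold-t contagion; return the final infected set."""
            infected = set(seeds) - set(removed)
            changed = True
            while changed:
                changed = False
                for v in G.nodes:
                    if v in infected or v in removed:
                        continue
                    # A node adopts the contagion once >= t neighbors have it.
                    if sum(1 for u in G.neighbors(v) if u in infected) >= t:
                        infected.add(v)
                        changed = True
            return infected

        G = nx.karate_club_graph()
        print(len(spread(G, seeds=[0, 33], t=2)))                   # unimpeded
        print(len(spread(G, seeds=[0, 33], t=2, removed={2, 32})))  # two nodes removed

    A heuristic for choosing critical nodes can then be scored by how much their removal shrinks the final infected set, which is essentially the experimental protocol described above.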

  20. Paediatric Palliative Care and Intellectual Disability--A Unique Context

    ERIC Educational Resources Information Center

    Duc, Jacqueline K.; Herbert, Anthony Robert; Heussler, Helen S.

    2017-01-01

    Background: Paediatric palliative care is a nuanced area of practice with additional complexities in the context of intellectual disability. There is currently minimal research to guide clinicians working in this challenging area of care. Method: This study describes the complex care of children with life-limiting conditions and intellectual…

  1. CONTINUOUS MICRO-SORTING OF COMPLEX WASTE PLASTICS PARTICLEMIXTURES VIA LIQUID-FLUIDIZED BED CLASSIFICATION (LFBC) FOR WASTE MINIMIZATIONAND RECYCLING

    EPA Science Inventory

    A fundamental investigation is proposed to provide a technical basis for the development of a novel, liquid-fluidized bed classification (LFBC) technology for the continuous separation of complex waste plastic mixtures for in-process recycling and waste minimization. Although ...

  2. Explosive Fracturing of an F-16 Canopy for Through-Canopy Crew Egress

    NASA Technical Reports Server (NTRS)

    Bement, Laurence J.

    2000-01-01

    Through-canopy crew egress, such as in the Harrier (AV-8B) aircraft, expands escape envelopes by reducing seat ejection delays in waiting for canopy jettison. Adverse aircraft attitude and reduced forward flight speed can further increase the times for canopy jettison. However, the advent of heavy, high-strength polycarbonate canopies for bird-strike resistance has not only increased jettison times, but has made seat penetration impossible. The goal of the effort described in this paper was to demonstrate a method of explosively fracturing the F-16 polycarbonate canopy to allow through-canopy crew ejection. The objectives of this effort were to: 1. Mount the explosive materials on the exterior of the canopy within the mold line, 2. Minimize visual obstructions, 3. Minimize internal debris on explosive activation, 4. Operate within less than 10 ms, 5. Maintain the shape of the canopy after functioning to prevent major pieces from entering the cockpit, and 6. Minimize the resistance of the canopy to seat penetration. All goals and objectives were met in a full-scale test demonstration. In addition to expanding crew escape envelopes, this canopy fracture approach offers the potential for reducing system complexity, weight and cost, while increasing overall reliability, compared to current canopy jettison approaches. To comply with International Traffic in Arms Regulations (ITAR) and permit public disclosure, this document addresses only the principles of explosive fracturing of the F-16 canopy materials and the end result. ITAR regulations restrict information on improving the performance of weapon systems. Therefore, details on the explosive loads and final assembly of this canopy fracture approach, necessary to assure functional performance, are not included.

  3. Leveraging advances in biology to design biomaterials

    NASA Astrophysics Data System (ADS)

    Darnell, Max; Mooney, David J.

    2017-12-01

    Biomaterials have dramatically increased in functionality and complexity, allowing unprecedented control over the cells that interact with them. From these engineering advances arises the prospect of improved biomaterial-based therapies, yet practical constraints favour simplicity. Tools from the biology community are enabling high-resolution and high-throughput bioassays that, if incorporated into a biomaterial design framework, could help achieve unprecedented functionality while minimizing the complexity of designs by identifying the most important material parameters and biological outputs. However, to avoid data explosions and to effectively match the information content of an assay with the goal of the experiment, material screens and bioassays must be arranged in specific ways. By borrowing methods to design experiments and workflows from the bioprocess engineering community, we outline a framework for the incorporation of next-generation bioassays into biomaterials design to effectively optimize function while minimizing complexity. This framework can inspire biomaterials designs that maximize functionality and translatability.

  4. Complex lasso: new entangled motifs in proteins

    NASA Astrophysics Data System (ADS)

    Niemyska, Wanda; Dabrowski-Tumanski, Pawel; Kadlof, Michal; Haglund, Ellinor; Sułkowski, Piotr; Sulkowska, Joanna I.

    2016-11-01

    We identify new entangled motifs in proteins that we call complex lassos. Lassos arise in proteins with disulfide bridges (or in proteins with amide linkages), when termini of a protein backbone pierce through an auxiliary surface of minimal area, spanned on a covalent loop. We find that as much as 18% of all proteins with disulfide bridges in a non-redundant subset of PDB form complex lassos, and classify them into six distinct geometric classes, one of which resembles supercoiling known from DNA. Based on biological classification of proteins we find that lassos are much more common in viruses, plants and fungi than in other kingdoms of life. We also discuss how changes in the oxidation/reduction potential may affect the function of proteins with lassos. Lassos and their associated surfaces of minimal area provide new and interesting geometric characteristics, with many potential applications, not only for proteins but also for other biomolecules.

  5. On conjugate gradient type methods and polynomial preconditioners for a class of complex non-Hermitian matrices

    NASA Technical Reports Server (NTRS)

    Freund, Roland

    1988-01-01

    Conjugate gradient type methods are considered for the solution of large linear systems Ax = b with complex coefficient matrices of the type A = T + i(sigma)I, where T is Hermitian and sigma a real scalar. Three different conjugate gradient type approaches with iterates defined by a minimal residual property, a Galerkin type condition, and a Euclidean error minimization, respectively, are investigated. In particular, numerically stable implementations based on the ideas behind Paige and Saunders's SYMMLQ and MINRES for real symmetric matrices are proposed. Error bounds for all three methods are derived. It is shown how the special shift structure of A can be preserved by using polynomial preconditioning. Results on the optimal choice of the polynomial preconditioner are given. Also, some numerical experiments for matrices arising from finite difference approximations to the complex Helmholtz equation are reported.
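    As a small numerical illustration of such shifted systems (using a generic Krylov solver from SciPy as a stand-in for the specialized SYMMLQ/MINRES-style methods proposed in the report), consider a Helmholtz-like tridiagonal example:

        import numpy as np
        import scipy.sparse as sp
        from scipy.sparse.linalg import gmres

        n, sigma = 200, 0.5
        T = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n))  # Hermitian part
        A = (T + 1j * sigma * sp.identity(n)).tocsc()              # A = T + i*sigma*I
        b = np.ones(n, dtype=complex)

        x, info = gmres(A, b)                  # info == 0 signals convergence
        print(info, np.linalg.norm(A @ x - b))

    The point of the report is that the shift structure permits methods cheaper than general-purpose GMRES, with short recurrences and polynomial preconditioners that preserve the form T + i(sigma)I.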

  6. The simplicity principle in perception and cognition

    PubMed Central

    Feldman, Jacob

    2016-01-01

    The simplicity principle, traditionally referred to as Occam’s razor, is the idea that simpler explanations of observations should be preferred to more complex ones. In recent decades the principle has been clarified via the incorporation of modern notions of computation and probability, allowing a more precise understanding of how exactly complexity minimization facilitates inference. The simplicity principle has found many applications in modern cognitive science, in contexts as diverse as perception, categorization, reasoning, and neuroscience. In all these areas, the common idea is that the mind seeks the simplest available interpretation of observations— or, more precisely, that it balances a bias towards simplicity with a somewhat opposed constraint to choose models consistent with perceptual or cognitive observations. This brief tutorial surveys some of the uses of the simplicity principle across cognitive science, emphasizing how complexity minimization in a number of forms has been incorporated into probabilistic models of inference. PMID:27470193

  7. Method for Hot Real-Time Sampling of Gasification Products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pomeroy, Marc D

    The Thermochemical Process Development Unit (TCPDU) at the National Renewable Energy Laboratory (NREL) is a highly instrumented half-ton/day pilot scale plant capable of demonstrating industrially relevant thermochemical technologies for lignocellulosic biomass conversion, including gasification. Gasification creates primarily syngas (a mixture of hydrogen and carbon monoxide) that can be utilized with synthesis catalysts to form transportation fuels and other valuable chemicals. Biomass-derived gasification products are a very complex mixture of chemical components that typically contain sulfur and nitrogen species that can act as catalyst poisons for tar reforming and synthesis catalysts. Real-time hot online sampling techniques, such as Molecular Beam Mass Spectrometry (MBMS), and gas chromatographs with sulfur- and nitrogen-specific detectors can provide real-time analysis and operational indicators of performance. Sampling typically requires coated sampling lines to minimize trace sulfur interactions with steel surfaces. Other materials used inline have also shown conversion of sulfur species into new components and must be minimized. Residence time within the sampling lines must also be kept to a minimum to reduce further reaction chemistry. Solids from ash and char contribute to plugging and must be filtered at temperature. Experience at NREL has shown several key factors to consider when designing and installing an analytical sampling system for biomass gasification products. They include minimizing sampling distance, filtering effectively as close to the source as possible, proper line sizing, proper line materials or coatings, even heating of all components, minimizing pressure drops, and additional filtering or traps after pressure drops.

  8. Minimizing End-to-End Interference in I/O Stacks Spanning Shared Multi-Level Buffer Caches

    ERIC Educational Resources Information Center

    Patrick, Christina M.

    2011-01-01

    This thesis presents an end-to-end interference minimizing uniquely designed high performance I/O stack that spans multi-level shared buffer cache hierarchies accessing shared I/O servers to deliver a seamless high performance I/O stack. In this thesis, I show that I can build a superior I/O stack which minimizes the inter-application interference…

  9. A negentropy minimization approach to adaptive equalization for digital communication systems.

    PubMed

    Choi, Sooyong; Lee, Te-Won

    2004-07-01

    In this paper, we introduce and investigate a new adaptive equalization method based on minimizing the approximate negentropy of the estimation error for a finite-length equalizer. We consider an approximate negentropy using nonpolynomial expansions of the estimation error as a new performance criterion to improve on the performance of a linear equalizer based on minimizing the mean squared error (MMSE). Negentropy includes higher order statistical information, and its minimization provides improved convergence, performance, and accuracy compared to traditional methods such as MMSE in terms of bit error rate (BER). The proposed negentropy minimization (NEGMIN) equalizer has two kinds of solutions, the MMSE solution and the other one, depending on the ratio of the normalization parameters. The NEGMIN equalizer has the best BER performance when the ratio of the normalization parameters is properly adjusted to maximize the output power (variance) of the NEGMIN equalizer. Simulation experiments show that the BER performance of the NEGMIN equalizer with the solution other than the MMSE one has similar characteristics to the adaptive minimum bit error rate (AMBER) equalizer. The main advantage of the proposed equalizer is that it needs significantly fewer training symbols than the AMBER equalizer. Furthermore, the proposed equalizer is more robust to nonlinear distortions than the MMSE equalizer.
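    For background, a widely used nonpolynomial-expansion approximation of negentropy (in the style of Hyvärinen; the exact expansion adopted by the authors may differ) is

        J(y) \approx \sum_i k_i \left[ E\{G_i(y)\} - E\{G_i(\nu)\} \right]^2,

    where \nu is a standard Gaussian variable, the k_i are positive constants, and typical nonpolynomial choices are G(y) = \log\cosh(y) or G(y) = -e^{-y^2/2}. Minimizing such a criterion over the equalizer taps drives the estimation error toward Gaussianity while exploiting higher-order statistics that the MMSE criterion ignores.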

  10. 77 FR 29749 - 74th Meeting: RTCA Special Committee 147, Minimal Operations Performance Standards for Traffic...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-18

    ... 147, Minimal Operations Performance Standards for Traffic Alert and Collision Avoidance Systems... Traffic Alert and Collision Avoidance Systems Airborne Equipment. SUMMARY: The FAA is issuing this notice... Performance Standards for Traffic Alert and Collision Avoidance Systems Airborne Equipment. DATES: The meeting...

  11. Influence maximization in complex networks through optimal percolation

    NASA Astrophysics Data System (ADS)

Morone, Flaviano; Makse, Hernan; CUNY Collaboration

    The whole frame of interconnections in complex networks hinges on a specific set of structural nodes, much smaller than the total size, which, if activated, would cause the spread of information to the whole network, or, if immunized, would prevent the diffusion of a large scale epidemic. Localizing this optimal, that is, minimal, set of structural nodes, called influencers, is one of the most important problems in network science. Here we map the problem onto optimal percolation in random networks to identify the minimal set of influencers, which arises by minimizing the energy of a many-body system, where the form of the interactions is fixed by the non-backtracking matrix of the network. Big data analyses reveal that the set of optimal influencers is much smaller than the one predicted by previous heuristic centralities. Remarkably, a large number of previously neglected weakly connected nodes emerges among the optimal influencers. Reference: F. Morone, H. A. Makse, Nature 524, 65-68 (2015)
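    A compact sketch of the Collective Influence heuristic from the referenced paper: the score CI_l(i) = (k_i - 1) * sum of (k_j - 1) over the frontier of the ball of radius l around node i, with the top-scoring node removed and scores recomputed adaptively. The example graph below is arbitrary:

        import networkx as nx

        def ci(G, i, l=2):
            dist = nx.single_source_shortest_path_length(G, i, cutoff=l)
            frontier = [j for j, d in dist.items() if d == l]
            return (G.degree(i) - 1) * sum(G.degree(j) - 1 for j in frontier)

        def top_influencers(G, n_remove, l=2):
            H, removed = G.copy(), []
            for _ in range(n_remove):
                v = max(H.nodes, key=lambda i: ci(H, i, l))  # adaptive recomputation
                removed.append(v)
                H.remove_node(v)
            return removed

        G = nx.erdos_renyi_graph(300, 0.02, seed=1)
        print(top_influencers(G, n_remove=5))

    This brute-force version recomputes every score after each removal; the published algorithm achieves near-linear time with more careful bookkeeping.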

  12. A 3D virtual reality simulator for training of minimally invasive surgery.

    PubMed

Mi, Shao-Hua; Hou, Zeng-Guang; Yang, Fan; Xie, Xiao-Liang; Bian, Gui-Bin

    2014-01-01

    For the last decade, remarkable progress has been made in the field of cardiovascular disease treatment. However, these complex medical procedures require a combination of rich experience and technical skills. In this paper, a 3D virtual reality simulator for core skills training in minimally invasive surgery is presented. The system can generate realistic 3D vascular models segmented from patient datasets, including a beating heart, and provides real-time computation of force and force feedback for surgical simulation. Instruments, such as a catheter or guide wire, are represented by a multi-body mass-spring model. In addition, a realistic user interface with multiple windows and real-time 3D views was developed. Moreover, the simulator is also provided with a human-machine interaction module that gives doctors the sense of touch during surgical training and enables them to control the motion of a virtual catheter/guide wire inside a complex vascular model. Experimental results show that the simulator is suitable for minimally invasive surgery training.
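    As a toy version of the multi-body mass-spring representation mentioned above (parameters are illustrative, not the simulator's), a 1-D chain of point masses can be integrated with semi-implicit Euler:

        import numpy as np

        n, m, k, c, dt = 20, 0.01, 50.0, 0.05, 1e-3   # nodes, mass, stiffness, damping, step
        x = np.linspace(0.0, 1.0, n)                  # node positions along the wire
        v = np.zeros(n)
        rest = x[1] - x[0]                            # rest length of each segment

        for _ in range(1000):
            f = np.zeros(n)
            stretch = np.diff(x) - rest               # elongation of each spring
            f[:-1] += k * stretch                     # spring pulls left node right...
            f[1:] -= k * stretch                      # ...and right node left
            f -= c * v                                # viscous damping
            f[-1] += 0.02                             # operator force applied at the tip
            v[1:] += dt * f[1:] / m                   # node 0 stays clamped at the base
            x[1:] += dt * v[1:]

        print(x[-1])                                  # tip position after 1 s of simulated time

    Real-time simulators add bending springs, collision response against the vessel wall, and haptic force output, but the integration loop has this same shape.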

  13. A defined, glucose-limited mineral medium for the cultivation of Listeria spp.

    PubMed

    Schneebeli, Rudolf; Egli, Thomas

    2013-04-01

    Members of the genus Listeria are fastidious bacteria with respect to their nutritional requirements, and several minimal media described in the literature fail to support growth of all Listeria spp. Furthermore, strict limitation by a single nutrient, e.g., the carbon source, has not been demonstrated for any of the published minimal media. This is an important prerequisite for defined studies of growth and physiology, including "omics." Based on a theoretical analysis of previously published mineral media for Listeria, an improved, well-balanced growth medium was designed. It supports the growth, not only of all tested Listeria monocytogenes strains, but of all other Listeria species, with the exception of L. ivanovii. The growth performance of L. monocytogenes strain Scott A was tested in the newly designed medium; glucose served as the only carbon and energy source for growth, whereas neither the supplied amino acids nor the buffering and complexing components (MOPS [morpholinepropanesulfonic acid] and EDTA) supported growth. Omission of amino acids, trace elements, or vitamins, alone or in combination, resulted in considerably reduced biomass yields. Furthermore, we monitored the specific growth rates of various Listeria strains cultivated in the designed mineral medium and compared them to growth in complex medium (brain heart infusion broth [BHI]). The novel mineral medium was optimized for the commonly used strain L. monocytogenes Scott A to achieve optimum cell yields and maximum specific growth rates. This mineral medium is the first published synthetic medium for Listeria that has been shown to be strictly carbon (glucose) limited.

  14. Transfusion-free cardiac reoperation in an 11-kg Jehovah's Witness child by use of a minimized cardiopulmonary bypass circuit.

    PubMed

    Huebler, Michael; Boettcher, Wolfgang; Koster, Andreas; Stiller, Brigitte; Kuppe, Hermann; Hetzer, Roland

    2007-01-01

    Herein, we describe the design of a perfusion system for a complex cardiovascular reoperation in an 11-kg Jehovah's Witness patient. The goal of safe, transfusion-free surgery was achieved chiefly by minimizing the priming volume of the cardiopulmonary bypass circuit to 200 mL while providing adequate flow and standard safety features.

  15. Effect of dual-dielectric hydrogen-diffusion barrier layers on the performance of low-temperature processed transparent InGaZnO thin-film transistors

    NASA Astrophysics Data System (ADS)

    Tari, Alireza; Wong, William S.

    2018-02-01

    Dual-dielectric SiOx/SiNx thin-film layers were used as back-channel and gate-dielectric barrier layers for bottom-gate InGaZnO (IGZO) thin-film transistors (TFTs). The concentration profiles of hydrogen, indium, gallium, and zinc oxide were analyzed using secondary-ion mass spectroscopy characterization. By implementing an effective H-diffusion barrier, the hydrogen concentration and the creation of H-induced oxygen deficiency (H-Vo complex) defects during the processing of passivated flexible IGZO TFTs were minimized. A bilayer back-channel passivation layer, consisting of electron-beam deposited SiOx on plasma-enhanced chemical vapor-deposition (PECVD) SiNx films, effectively protected the TFT active region from plasma damage and minimized changes in the chemical composition of the semiconductor layer. A dual-dielectric PECVD SiOx/PECVD SiNx gate-dielectric, using SiOx as a barrier layer, also effectively prevented out-diffusion of hydrogen atoms from the PECVD SiNx-gate dielectric to the IGZO channel layer during the device fabrication.

  16. Battling fire and ice: remote guidance ultrasound to diagnose injury on the International Space Station and the ice rink.

    PubMed

    Kwon, David; Bouffard, J Antonio; van Holsbeeck, Marnix; Sargsyan, Asot E; Hamilton, Douglas R; Melton, Shannon L; Dulchavsky, Scott A

    2007-03-01

    National Aeronautics and Space Administration (NASA) researchers have optimized training methods that allow minimally trained, non-physician operators to obtain diagnostic ultrasound (US) images for medical diagnosis, including of musculoskeletal injury. We hypothesize that these techniques could be extended to non-expert operators, including National Hockey League (NHL) and Olympic athletic trainers, to diagnose musculoskeletal injuries in athletes. NHL and Olympic athletic trainers received a brief course on musculoskeletal US. Remote guidance musculoskeletal examinations were conducted by athletic trainers, consisting of hockey groin hernia, knee, ankle, elbow, or shoulder evaluations. US images were transmitted to remote experts for interpretation. Groin, knee, ankle, elbow, or shoulder images were obtained on 32 athletes; all real-time US video stream and still-capture images were considered adequate for diagnostic interpretation. This experience suggests that US use can be expanded to locations without a high level of on-site expertise. A non-physician with minimal training can perform complex, diagnostic-quality examinations when directed by a remotely based expert.

  17. Markov random field model-based edge-directed image interpolation.

    PubMed

    Li, Min; Nguyen, Truong Q

    2008-07-01

    This paper presents an edge-directed image interpolation algorithm. In the proposed algorithm, the edge directions are implicitly estimated with a statistics-based approach. In contrast to explicit edge directions, the local edge directions are indicated by length-16 weighting vectors. Implicitly, the weighting vectors are used to formulate a geometric regularity (GR) constraint (smoothness along edges and sharpness across edges), and the GR constraint is imposed on the interpolated image through the Markov random field (MRF) model. Furthermore, under the maximum a posteriori-MRF framework, the desired interpolated image corresponds to the minimal energy state of a 2-D random field given the low-resolution image. Simulated annealing methods are used to search for the minimal energy state in the state space. To lower the computational complexity of MRF, a single-pass implementation is designed, which performs nearly as well as the iterative optimization. Simulation results show that the proposed MRF model-based edge-directed interpolation method produces edges with strong geometric regularity. Compared to traditional methods and other edge-directed interpolation methods, the proposed method improves the subjective quality of the interpolated edges while maintaining a high PSNR level.
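    The simulated-annealing search referred to above follows the generic accept/reject loop sketched below; the toy energy function is a stand-in for the paper's geometric-regularity MRF energy:

        import math, random

        def anneal(state, energy, propose, t0=1.0, cooling=0.995, steps=5000):
            e = energy(state)
            for step in range(steps):
                t = t0 * cooling**step                  # exponential cooling schedule
                cand = propose(state)
                de = energy(cand) - e
                # Accept improvements always; accept uphill moves with prob e^(-de/t).
                if de < 0 or random.random() < math.exp(-de / max(t, 1e-12)):
                    state, e = cand, e + de
            return state, e

        # Toy usage: minimize a quadratic energy by random perturbations.
        energy = lambda s: sum((x - 3.0) ** 2 for x in s)
        propose = lambda s: [x + random.uniform(-0.1, 0.1) for x in s]
        print(anneal([0.0] * 4, energy, propose)[1])

    The single-pass implementation described in the abstract replaces this stochastic search with one deterministic sweep, trading a small amount of quality for a large reduction in computation.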

  18. Multicategory nets of single-layer perceptrons: complexity and sample-size issues.

    PubMed

    Raudys, Sarunas; Kybartas, Rimantas; Zavadskas, Edmundas Kazimieras

    2010-05-01

    The standard cost function of multicategory single-layer perceptrons (SLPs) does not minimize the classification error rate. In order to reduce classification error, it is necessary to: 1) abandon the traditional cost function, 2) obtain near-optimal pairwise linear classifiers by specially organized SLP training and optimal stopping, and 3) fuse their decisions properly. To obtain better classification in unbalanced training set situations, we introduce an unbalance-correcting term. It was found that fusion based on the Kullback-Leibler (K-L) distance and the Wu-Lin-Weng (WLW) method result in approximately the same performance in situations where sample sizes are relatively small. This observation is explained by the theoretically known fact that excessive minimization of inexact criteria can become harmful. Comprehensive comparative investigations of six real-world pattern recognition (PR) problems demonstrated that SLP-based pairwise classifiers are comparable to, and as often as not outperform, the linear support vector (SV) classifiers in moderate-dimensional situations. The colored noise injection used to design pseudovalidation sets proves to be a powerful tool for facilitating finite sample problems in moderate-dimensional PR tasks.

  19. Demonstration of transoral robotic supraglottic laryngectomy and total laryngectomy in cadaveric specimens using the Medrobotics Flex System.

    PubMed

    Funk, Emily; Goldenberg, David; Goyal, Neerav

    2017-06-01

    Current management of laryngeal malignancies is associated with significant morbidity. Application of minimally invasive transoral techniques may reduce the morbidity associated with traditional procedures. The purpose of this study was to present our investigation of the utility of a novel flexible robotic system for transoral supraglottic laryngectomy and total laryngectomy. Transoral total laryngectomy and transoral supraglottic laryngectomy were performed in cadaveric specimens using the Flex Robotic System (Medrobotics, Raynham, MA). All procedures were completed successfully in the cadaveric models. The articulated endoscope allowed for access to the desired surgical site. Flexible instruments enabled an atraumatic approach and allowed for precise surgical technique. Access to deep anatomic structures remains problematic using current minimally invasive robotic approaches. Improvements in visualization and access to the laryngopharyngeal complex offered by this system may improve surgical applications to the larynx. This study demonstrates the technical feasibility of using the Flex Robotic System for transoral robotic supraglottic laryngectomy and total laryngectomy. © 2017 Wiley Periodicals, Inc. Head Neck 39: 1218-1225, 2017.

  20. A model for developing job rotation schedules that eliminate sequential high workloads and minimize between-worker variability in cumulative daily workloads: Application to automotive assembly lines.

    PubMed

    Yoon, Sang-Young; Ko, Jeonghan; Jung, Myung-Chul

    2016-07-01

    The aim of this study is to suggest a job rotation schedule by developing a mathematical model that reduces the cumulative workload arising from the successive use of the same body region. Workload assessment using rapid entire body assessment (REBA) was performed for the model in three automotive assembly lines of chassis, trim, and finishing to identify which body parts were exposed to relatively high workloads at workstations. The workloads were incorporated into the model to develop a job rotation schedule. The proposed schedules prevent successive exposure of the same body region to high workloads and minimize between-worker variance in cumulative daily workload, whereas under no job rotation and serial job rotation some workers were successively assigned to high-workload workstations. This model would help to reduce the potential for work-related musculoskeletal disorders (WMSDs) without additional cost for engineering work, although it may need more computational time and relatively complex job rotation sequences. Copyright © 2016 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  1. Robotic liver surgery: technical aspects and review of the literature

    PubMed Central

    Bianco, Francesco Maria; Daskalaki, Despoina; Gonzalez-Ciccarelli, Luis Fernando; Kim, Jihun; Benedetti, Enrico

    2016-01-01

    Minimally invasive surgery for liver resections has a defined role and represents an accepted alternative to open techniques for selected cases. Robotic technology can overcome some of the disadvantages of the laparoscopic technique, mainly in the most complex cases. Precise dissection and microsuturing is possible, even in narrow operative fields, allowing for a better dissection of the hepatic hilum, fine lymphadenectomy, and biliary reconstruction even with small bile ducts and easier bleeding control. This technique has the potential to allow for a greater number of major resections and difficult segmentectomies to be performed in a minimally invasive fashion. The implementation of near-infrared fluorescence with indocyanine green (ICG) also allows for a more accurate recognition of vascular and biliary anatomy. The prospects of this kind of virtually implemented imaging are very promising and may be reflected in better outcomes. The overall data present in the current literature suggest that robotic liver resections are at least comparable to both open and laparoscopic surgery in terms of perioperative and postoperative outcomes. This article provides technical details of robotic liver resections and a review of the current literature. PMID:27500143

  2. Study of holmium (III) and yttrium(III) with DOTA complexes as candidates for radiopharmaceutical use

    NASA Astrophysics Data System (ADS)

    Ernestová, M.; Jedináková-Křížová, V.

    2003-01-01

    Reaction conditions for complexation of radionuclides with DOTA were studied using thin-layer chromatography (TLC), paper chromatography (PC) and potentiometry. It was found that all of the studied complexes can reach very high radiochemical yields of about 95%. Optimal conditions for obtaining such high radiochemical yields are as follows: pH higher than 4, and an excess of chelating agent of at least 3:1. Potentiometric study showed that the formation of the complexes is characterised by very slow kinetics.

  3. National trends in minimally invasive and open operative experience of graduating general surgery residents: implications for surgical skills curricula development?

    PubMed

    Carson, Jeffrey S; Smith, Lynette; Are, Madhuri; Edney, James; Azarow, Kenneth; Mercer, David W; Thompson, Jon S; Are, Chandrakanth

    2011-12-01

    The aim of this study was to analyze national trends in minimally invasive and open cases of all graduating residents in general surgery. A retrospective analysis was performed on data obtained from Accreditation Council for Graduate Medical Education logs (1999-2008) of graduating residents from all US general surgery residency programs. Data were analyzed using Mantel-Haenszel χ² tests and the Bonferroni adjustment to detect trends in the number of minimally invasive and open cases. Minimally invasive procedures accounted for an increasing proportion of cases performed (3.7% to 11.1%, P < .0001), with a proportional decrease in open cases. An increase in minimally invasive procedures with a proportional decrease in open procedures was noted in subcategories such as alimentary tract, abdominal, vascular, thoracic, and pediatric surgery (P < .0001). The results of this study demonstrate that general surgery residents in the United States are performing a greater number of minimally invasive and fewer open procedures for common surgical conditions. Copyright © 2011 Elsevier Inc. All rights reserved.

  4. KDM2B Recruitment of the Polycomb Group Complex, PRC1.1, Requires Cooperation between PCGF1 and BCORL1.

    PubMed

    Wong, Sarah J; Gearhart, Micah D; Taylor, Alexander B; Nanyes, David R; Ha, Daniel J; Robinson, Angela K; Artigas, Jason A; Lee, Oliver J; Demeler, Borries; Hart, P John; Bardwell, Vivian J; Kim, Chongwoo A

    2016-10-04

    KDM2B recruits H2A-ubiquitinating activity of a non-canonical Polycomb Repression Complex 1 (PRC1.1) to CpG islands, facilitating gene repression. We investigated the molecular basis of recruitment using in vitro assembly assays to identify minimal components, subcomplexes, and domains required for recruitment. A minimal four-component PRC1.1 complex can be assembled by combining two separately isolated subcomplexes: the DNA-binding KDM2B/SKP1 heterodimer and the heterodimer of BCORL1 and PCGF1, a core component of PRC1.1. The crystal structure of the KDM2B/SKP1/BCORL1/PCGF1 complex illustrates the crucial role played by the PCGF1/BCORL1 heterodimer. The BCORL1 PUFD domain positions residues preceding the RAWUL domain of PCGF1 to create an extended interface for interaction with KDM2B, which is unique to the PCGF1-containing PRC1.1 complex. The structure also suggests how KDM2B might simultaneously function in PRC1.1 and an SCF ubiquitin ligase complex and the possible molecular consequences of BCOR PUFD internal tandem duplications found in pediatric kidney and brain tumors. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Improving the performance of minimizers and winnowing schemes.

    PubMed

    Marçais, Guillaume; Pellow, David; Bork, Daniel; Orenstein, Yaron; Shamir, Ron; Kingsford, Carl

    2017-07-15

    The minimizers scheme is a method for selecting k-mers from sequences. It is used in many bioinformatics software tools to bin comparable sequences or to sample a sequence in a deterministic fashion at approximately regular intervals, in order to reduce memory consumption and processing time. Although very useful, the minimizers selection procedure has undesirable behaviors (e.g. too many k-mers are selected when processing certain sequences). Some of these problems were already known to the authors of the minimizers technique, and the natural lexicographic ordering of k-mers used by minimizers was recognized as their origin. Many software tools using minimizers employ ad hoc variations of the lexicographic order to alleviate those issues. We provide an in-depth analysis of the effect of k-mer ordering on the performance of the minimizers technique. By using small universal hitting sets (a recently defined concept), we show how to significantly improve the performance of minimizers and avoid some of its worse behaviors. Based on these results, we encourage bioinformatics software developers to use an ordering based on a universal hitting set or, if not possible, a randomized ordering, rather than the lexicographic order. This analysis also settles negatively a conjecture (by Schleimer et al.) on the expected density of minimizers in a random sequence. The software used for this analysis is available on GitHub: https://github.com/gmarcais/minimizers.git. gmarcais@cs.cmu.edu or carlk@cs.cmu.edu. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
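
    As a concrete illustration of the scheme and of the ordering's effect, the sketch below selects window minimizers under both the lexicographic order and a hash-based randomized order. This is a minimal sketch, not the authors' implementation; the values of k and w and the choice of hash are arbitrary.

    ```python
    import hashlib

    def minimizers(seq, k=7, w=10, order=None):
        """Select the minimal k-mer (under `order`) in each window of w k-mers."""
        if order is None:                      # default: lexicographic order
            order = lambda kmer: kmer
        kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
        selected = set()
        for start in range(len(kmers) - w + 1):
            window = kmers[start:start + w]
            best = min(range(w), key=lambda j: order(window[j]))
            selected.add(start + best)         # position of the chosen k-mer
        return sorted(selected)

    def random_order(kmer):
        # Pseudo-random total order on k-mers via a stable hash.
        return hashlib.blake2b(kmer.encode(), digest_size=8).digest()

    seq = "ACGTACGTGGGTTTACGATCGATCGGGGAAATTTCAG" * 3
    lex = minimizers(seq)                          # lexicographic ordering
    rnd = minimizers(seq, order=random_order)      # randomized ordering
    print(len(lex), len(rnd))   # density: the randomized order typically selects fewer
    ```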

  6. Cooperation through Competition-Dynamics and Microeconomics of a Minimal Nutrient Trade System in Arbuscular Mycorrhizal Symbiosis.

    PubMed

    Schott, Stephan; Valdebenito, Braulio; Bustos, Daniel; Gomez-Porras, Judith L; Sharma, Tripti; Dreyer, Ingo

    2016-01-01

    In arbuscular mycorrhizal (AM) symbiosis, fungi and plants exchange nutrients (sugars and phosphate, for instance) for reciprocal benefit. It is still not clear how this nutrient exchange system works. Here, we used computational cell biology to simulate the dynamics of a network of proton pumps and proton-coupled transporters that are upregulated during AM formation. We show that this minimal network is sufficient to describe the nutrient trade system accurately and realistically. By applying basic principles of microeconomics, we link the biophysics of transmembrane nutrient transport with the ecology of organismic interactions and straightforwardly explain macroscopic scenarios of the relations between plant and AM fungus. This computational cell biology study allows far-reaching hypotheses about the mechanism and the regulation of nutrient exchange and proposes that the "cooperation" between plant and fungus may in fact be the result of a competition between both for the same resources in the tiny periarbuscular space. The minimal model presented here may serve as a benchmark to evaluate the performance of more complex models of AM nutrient exchange in the future. As a first step toward this goal, we included SWEET sugar transporters in the model and show that their co-occurrence with proton-coupled sugar transporters results in a futile carbon cycle at the plant plasma membrane, suggesting that two different pathways for the same substrate should not be active at the same time.

  7. Uterine fibroids: current perspectives

    PubMed Central

    Khan, Aamir T; Shehmar, Manjeet; Gupta, Janesh K

    2014-01-01

    Uterine fibroids are a major cause of morbidity in women of a reproductive age (and sometimes even after menopause). There are several factors that are attributed to underlie the development and incidence of these common tumors, but this further corroborates their relatively unknown etiology. The most likely presentation of fibroids is by their effect on the woman’s menstrual cycle or pelvic pressure symptoms. Leiomyosarcoma is a very rare entity that should be suspected in postmenopausal women with fibroid growth (and no concurrent hormone replacement therapy). The gold standard diagnostic modality for uterine fibroids appears to be gray-scale ultrasonography, with magnetic resonance imaging being a close second option in complex clinical circumstances. The management of uterine fibroids can be approached medically, surgically, and even by minimal access techniques. The recent introduction of selective progesterone receptor modulators (SPRMs) and aromatase inhibitors has added more armamentarium to the medical options of treatment. Uterine artery embolization (UAE) has now been well-recognized as a uterine-sparing (fertility-preserving) method of treating fibroids. More recently, the introduction of magnetic resonance-guided focused ultrasound (MRgFUS) or radiofrequency (VizAblate™ and Acessa™) for uterine fibroid ablation has added to the options of minimal access treatment. More definitive surgery in the form of myomectomy or hysterectomy can be performed via the minimal access or open route methods. Our article seeks to review the already established information on uterine fibroids with added emphasis on contemporary knowledge. PMID:24511243

  8. Data acquisition with a positron emission tomograph

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freifelder, R.; Karp, J.S.

    1997-12-31

    Positron Emission Tomography (PET) is a clinical imaging modality used in Nuclear Medicine. PET measures functionality rather than anatomical features and is therefore invaluable in the treatment of diseases which are characterized by functional changes in organs rather than anatomical changes. Typical diseases for which PET is used are cancer, epilepsy, and heart disease. While the scanners are not very complex, the performance demands on the devices are high. Excellent spatial resolution, 4-5 mm, and high sensitivity are key to maintaining high image quality. Compensation or suppression of scattered radiation is also necessary for good image quality. The ability to acquire data under high counting rates is also necessary in order to minimize the injected dose to the patient, minimize the patient's time in the scanner, and finally to minimize blurring due to patient motion. We have adapted various techniques in our data acquisition system which will be reported on in this talk. These include pulse clipping using lumped delay lines, flash ADCs with short sampling time, the use of a local positioning algorithm to limit the number of data words being used in subsequent second-level software triggers and calculations, and finally the use of high-speed dedicated calculator boards for on-line rebinning and reduction of the data. Modifications to the system to allow for transmission scanning will also be discussed.

  9. Sensorimotor Model of Obstacle Avoidance in Echolocating Bats

    PubMed Central

    Vanderelst, Dieter; Holderied, Marc W.; Peremans, Herbert

    2015-01-01

    Bat echolocation is an ability consisting of many subtasks such as navigation, prey detection and object recognition. Understanding the echolocation capabilities of bats comes down to isolating the minimal set of acoustic cues needed to complete each task. For some tasks, the minimal cues have already been identified. However, while a number of possible cues have been suggested, little is known about the minimal cues supporting obstacle avoidance in echolocating bats. In this paper, we propose that the Interaural Intensity Difference (IID) and travel time of the first millisecond of the echo train are sufficient cues for obstacle avoidance. We describe a simple control algorithm based on the use of these cues in combination with alternating ear positions modeled after the constant frequency bat Rhinolophus rouxii. Using spatial simulations (2D and 3D), we show that simple phonotaxis can steer a bat clear from obstacles without performing a reconstruction of the 3D layout of the scene. As such, this paper presents the first computationally explicit explanation for obstacle avoidance validated in complex simulated environments. Based on additional simulations modelling the FM bat Phyllostomus discolor, we conjecture that the proposed cues can be exploited by constant frequency (CF) bats and frequency modulated (FM) bats alike. We hypothesize that using a low level yet robust cue for obstacle avoidance allows bats to comply with the hard real-time constraints of this basic behaviour. PMID:26502063
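
    A minimal sketch of how such a low-level cue could drive steering, assuming only that a controller receives a left and a right echo intensity for the first millisecond of the echo train. The function name, gain, and tanh squashing are invented for illustration and are not the paper's controller.

    ```python
    import numpy as np

    def steer_from_echo(left_intensity, right_intensity, gain=0.8):
        """Steer away from the louder ear; positive output = turn left."""
        iid = right_intensity - left_intensity   # >0 when obstacle is to the right
        return gain * np.tanh(iid)               # bounded leftward-turn command

    # Obstacle louder on the right ear: a positive (leftward) turn command.
    print(steer_from_echo(0.2, 0.9))
    ```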

  10. Post-weaning and whole-of-life performance of pigs is determined by live weight at weaning and the complexity of the diet fed after weaning.

    PubMed

    Collins, Cherie L; Pluske, John R; Morrison, Rebecca S; McDonald, Trevor N; Smits, Robert J; Henman, David J; Stensland, Ingunn; Dunshea, Frank R

    2017-12-01

    The production performance and financial outcomes associated with weaner diet complexity for pigs of different weight classes at weaning were examined in this experiment. A total of 720 weaner pigs (360 entire males and 360 females) were selected at weaning (27 ± 3 d) and allocated to pens of 10 based on individual weaning weight (light weaning weight: pigs below 6.5 kg; medium weaning weight: 6.5 to 8 kg; heavy weaning weight: above 8.5 kg). Pens were then allocated in a 3 × 2 × 2 factorial arrangement of treatments with the respective factors being weaning weight (heavy, medium and light; H, M and L, respectively), weaner diet complexity (high complexity/cost, HC; low complexity/cost, LC), and gender (male and female). Common diets were fed to both treatment groups during the final 4 weeks of the weaner period (a period of 39 days). In the first 6 d after weaning, pigs offered the HC diets gained weight faster and used feed more efficiently than those offered the LC diets (P = 0.031). Pigs fed a HC diet after weaning tended to be heavier at the sale live weight at 123 d of age compared with pigs fed the LC diet (P = 0.056). There were no other main effects of the feeding program on growth performance through to slaughter. Weaning weight had a profound influence on lifetime growth performance and weight at 123 d of age, with H pigs at weaning increasing their weight advantage over the M and L pigs (101.3, 97.1 and 89.6 kg, respectively; P < 0.001). Cost-benefit analyses suggested a benefit in terms of cost per unit live weight gain over lifetime when the HC feeding program was offered to L pigs, with a lower feed cost/kg gain. The results from this investigation confirm the impact of weaning weight on lifetime growth performance, and suggest that a HC feeding program should be focused on L weaner pigs (i.e., weaning weight less than 6.5 kg at 27 d of age) in order to maximise financial returns.

  11. Trellises and Trellis-Based Decoding Algorithms for Linear Block Codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu

    1998-01-01

    A code trellis is a graphical representation of a code, block or convolutional, in which every path represents a codeword (or a code sequence for a convolutional code). This representation makes it possible to implement Maximum Likelihood Decoding (MLD) of a code with reduced decoding complexity. The most well known trellis-based MLD algorithm is the Viterbi algorithm. The trellis representation was first introduced and used for convolutional codes [23]. This representation, together with the Viterbi decoding algorithm, has resulted in a wide range of applications of convolutional codes for error control in digital communications over the last two decades. Research on trellis representations of block codes, by contrast, remained inactive for a long period, for two major reasons. First, most coding theorists at that time believed that block codes did not have simple trellis structure like convolutional codes, and that maximum likelihood decoding of linear block codes using the Viterbi algorithm was practically impossible, except for very short block codes. Second, since almost all linear block codes are constructed algebraically or based on finite geometries, it was the belief of many coding theorists that algebraic decoding was the only way to decode these codes. These two reasons seriously hindered the development of efficient soft-decision decoding methods for linear block codes and their applications to error control in digital communications. This led to a general belief that block codes were inferior to convolutional codes and hence not useful. Chapter 2 gives a brief review of linear block codes. The goal is to provide the essential background material for the development of trellis structure and trellis-based decoding algorithms for linear block codes in the later chapters. Chapters 3 through 6 present the fundamental concepts, finite-state machine model, state space formulation, basic structural properties, state labeling, construction procedures, complexity, minimality, and sectionalization of trellises. Chapter 7 discusses trellis decomposition and subtrellises for low-weight codewords. Chapter 8 first presents well known methods for constructing long powerful codes from short component codes or component codes of smaller dimensions, and then provides methods for constructing their trellises, which include Shannon and Cartesian product techniques. Chapter 9 deals with convolutional codes, puncturing, zero-tail termination and tail-biting. Chapters 10 through 13 present various trellis-based decoding algorithms, old and new. Chapter 10 first discusses the application of the well known Viterbi decoding algorithm to linear block codes, optimum sectionalization of a code trellis to minimize computation complexity, and design issues for IC (integrated circuit) implementation of a Viterbi decoder. Then it presents a new decoding algorithm for convolutional codes, named the Differential Trellis Decoding (DTD) algorithm. Chapter 12 presents a suboptimum reliability-based iterative decoding algorithm with a low-weight trellis search for the most likely codeword. This decoding algorithm provides a good trade-off between error performance and decoding complexity. All the decoding algorithms presented in Chapters 10 through 12 are devised to minimize word error probability. Chapter 13 presents decoding algorithms that minimize bit error probability and provide the corresponding soft (reliability) information at the output of the decoder. The decoding algorithms presented are the MAP (maximum a posteriori probability) decoding algorithm and the Soft-Output Viterbi Algorithm (SOVA). Finally, the minimization of bit error probability in trellis-based MLD is discussed.
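
    Since the Viterbi algorithm anchors the whole treatment, a minimal hard-decision sketch is given below for the standard rate-1/2, constraint-length-3 convolutional code (generators 7 and 5 octal). It is illustrative only; the report's block-code trellises, sectionalization, and soft-decision variants are not represented.

    ```python
    # Rate-1/2 convolutional code, constraint length 3 (generators 7 and 5 octal).
    G = [0b111, 0b101]

    def encode(bits, state=0):
        out = []
        for b in bits:
            reg = (b << 2) | state                       # [b_t, b_{t-1}, b_{t-2}]
            out += [bin(reg & g).count("1") & 1 for g in G]
            state = reg >> 1                             # shift-register update
        return out

    def viterbi(received):
        """Hard-decision ML decoding: the minimum-Hamming-distance trellis path."""
        INF = float("inf")
        metric, paths = {0: 0}, {0: []}
        for i in range(0, len(received), 2):
            r = received[i:i + 2]
            new_metric, new_paths = {}, {}
            for state, m in metric.items():
                for b in (0, 1):                         # extend every survivor
                    reg = (b << 2) | state
                    sym = [bin(reg & g).count("1") & 1 for g in G]
                    ns = reg >> 1
                    cand = m + sum(x != y for x, y in zip(sym, r))
                    if cand < new_metric.get(ns, INF):
                        new_metric[ns] = cand            # keep the better path
                        new_paths[ns] = paths[state] + [b]
            metric, paths = new_metric, new_paths
        return paths[min(metric, key=metric.get)]

    msg = [1, 0, 1, 1, 0, 0]        # two tail zeros terminate the trellis
    rx = encode(msg)
    rx[3] ^= 1                       # flip one channel bit
    print(viterbi(rx))               # recovers [1, 0, 1, 1, 0, 0]
    ```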

  12. Sparse RNA folding revisited: space-efficient minimum free energy structure prediction.

    PubMed

    Will, Sebastian; Jabbari, Hosna

    2016-01-01

    RNA secondary structure prediction by energy minimization is the central computational tool for the analysis of structural non-coding RNAs and their interactions. Sparsification has been successfully applied to improve the time efficiency of various structure prediction algorithms while guaranteeing the same result; however, for many such folding problems, space efficiency is of even greater concern, particularly for long RNA sequences. So far, space-efficient sparsified RNA folding with fold reconstruction was solved only for simple base-pair-based pseudo-energy models. Here, we revisit the problem of space-efficient free energy minimization. Whereas the space-efficient minimization of the free energy has been sketched before, the reconstruction of the optimum structure has not even been discussed. We show that this reconstruction is not possible in trivial extension of the method for simple energy models. Then, we present the time- and space-efficient sparsified free energy minimization algorithm SparseMFEFold that guarantees MFE structure prediction. In particular, this novel algorithm provides efficient fold reconstruction based on dynamically garbage-collected trace arrows. The complexity of our algorithm depends on two parameters, the number of candidates Z and the number of trace arrows T; both are bounded by O(n²), but are typically much smaller. The time complexity of RNA folding is reduced from O(n³) to O(n² + nZ); the space complexity, from O(n²) to O(n + T + Z). Our empirical results show more than 80% space savings over RNAfold [Vienna RNA package] on the long RNAs from the RNA STRAND database (≥2500 bases). The presented technique is intentionally generalizable to complex prediction algorithms; due to their high space demands, algorithms like pseudoknot prediction and RNA-RNA interaction prediction are expected to profit even more strongly than "standard" MFE folding. SparseMFEFold is free software, available at http://www.bioinf.uni-leipzig.de/~will/Software/SparseMFEFold.
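
    For orientation, the sketch below implements the classic dense Nussinov recursion (base-pair maximization, a far simpler model than the Turner free-energy model SparseMFEFold uses) to show the cubic-time, quadratic-space dynamic program that candidate lists and trace arrows are designed to prune. All names and parameters here are illustrative.

    ```python
    def nussinov(seq, min_loop=3):
        """O(n^3)-time, O(n^2)-space base-pair maximization: the dense DP
        whose matrices sparsification avoids storing in full."""
        pairs = {("A", "U"), ("U", "A"), ("G", "C"),
                 ("C", "G"), ("G", "U"), ("U", "G")}
        n = len(seq)
        M = [[0] * n for _ in range(n)]
        for span in range(min_loop + 1, n):
            for i in range(n - span):
                j = i + span
                best = M[i][j - 1]                    # j left unpaired
                for k in range(i, j - min_loop):      # j pairs with some k
                    if (seq[k], seq[j]) in pairs:
                        left = M[i][k - 1] if k > i else 0
                        best = max(best, left + M[k + 1][j - 1] + 1)
                M[i][j] = best
        return M[0][n - 1]

    print(nussinov("GGGAAAUCC"))   # 3 base pairs in an optimal structure
    ```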

  13. Information search and decision making: effects of age and complexity on strategy use.

    PubMed

    Queen, Tara L; Hess, Thomas M; Ennis, Gilda E; Dowd, Keith; Grühn, Daniel

    2012-12-01

    The impact of task complexity on information search strategy and decision quality was examined in a sample of 135 young, middle-aged, and older adults. We were particularly interested in the competing roles of fluid cognitive ability and domain knowledge and experience, with the former being a negative influence and the latter being a positive influence on older adults' performance. Participants utilized 2 decision matrices, which varied in complexity, regarding a consumer purchase. Using process tracing software and an algorithm developed to assess decision strategy, we recorded search behavior, strategy selection, and final decision. Contrary to expectations, older adults were not more likely than the younger age groups to engage in information-minimizing search behaviors in response to increases in task complexity. Similarly, adults of all ages used comparable decision strategies and adapted their strategies to the demands of the task. We also examined decision outcomes in relation to participants' preferences. Overall, it seems that older adults utilize simpler sets of information primarily reflecting the most valued attributes in making their choice. The results of this study suggest that older adults are adaptive in their approach to decision making and that this ability may benefit from accrued knowledge and experience. © 2013 APA, all rights reserved

  14. I-deas TMG to NX Space Systems Thermal Model Conversion and Computational Performance Comparison

    NASA Technical Reports Server (NTRS)

    Somawardhana, Ruwan

    2011-01-01

    CAD/CAE packages change on a continuous basis as the power of the tools increases to meet demands. End-users must adapt to new products as they come to market and replace legacy packages. CAE modeling has continued to evolve and is constantly becoming more detailed and complex, though this comes at the cost of increased computing requirements. Parallel processing coupled with appropriate hardware can minimize computation time. Users of Maya Thermal Model Generator (TMG) are faced with transitioning from NX I-deas to NX Space Systems Thermal (SST). It is important to understand what differences there are when changing software packages; we are looking for consistency in results.

  15. Inflight alignment of payload inertial reference from Shuttle navigation system

    NASA Astrophysics Data System (ADS)

    Treder, A. J.; Norris, R. E.; Ruprecht, R.

    Two methods for payload attitude initialization from the STS Orbiter have been proposed: body axis maneuvers (BAM) and star line maneuvers (SLM). The first achieves alignment directly through the Shuttle star tracker; the second, indirectly through the stellar-updated Shuttle inertial platform. The Inertial Upper Stage (IUS) with its strapdown navigation system is used to demonstrate in-flight alignment techniques. Significant accuracy can be obtained with minimal impact on Orbiter operations, with the payload inertial reference potentially approaching the accuracy of the Shuttle star tracker. STS-6 flight performance parameters, including alignment stability, are discussed and compared with operational complexity. Results indicate overall alignment stability of 0.06 deg, 3 sigma, per axis.

  16. A Minimum Delta V Orbit Maintenance Strategy for Low-Altitude Missions Using Burn Parameter Optimization

    NASA Technical Reports Server (NTRS)

    Brown, Aaron J.

    2011-01-01

    Orbit maintenance is the series of burns performed during a mission to ensure the orbit satisfies mission constraints. Low-altitude missions often require non-trivial orbit maintenance Delta V due to sizable orbital perturbations and minimum altitude thresholds. A strategy is presented for minimizing this Delta V using impulsive burn parameter optimization. An initial estimate for the burn parameters is generated by considering a feasible solution to the orbit maintenance problem. A low-lunar-orbit example demonstrates the Delta V savings from the feasible solution to the optimal solution. The strategy's extensibility to more complex missions is discussed, as well as the limitations of its use.
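
    To make "burn parameter optimization" concrete, here is a deliberately crude sketch: altitude decays linearly under a stand-in perturbation, two impulsive burns at fixed epochs raise it, and SLSQP minimizes total Delta V subject to a minimum-altitude floor. Every constant, the linear dynamics, and the fixed burn times are invented for illustration; the paper optimizes burns under real orbital dynamics.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Toy model (not the paper's dynamics): altitude decays linearly under
    # perturbations; each impulsive burn dv raises altitude by GAIN * dv.
    DECAY, GAIN, H0, H_MIN, T_END = 0.4, 50.0, 100.0, 80.0, 200.0
    t_burns = np.array([50.0, 120.0])            # fixed burn epochs (assumed)

    def altitude(t, dvs):
        boost = sum(GAIN * dv for tb, dv in zip(t_burns, dvs) if tb <= t)
        return H0 - DECAY * t + boost

    def objective(dvs):                           # total Delta V
        return np.sum(dvs)

    def constraints(dvs):                         # altitude floor on a time grid
        grid = np.linspace(0.0, T_END, 201)
        return np.array([altitude(t, dvs) - H_MIN for t in grid])

    res = minimize(objective, x0=[1.0, 1.0],
                   constraints={"type": "ineq", "fun": constraints},
                   bounds=[(0.0, None)] * 2, method="SLSQP")
    print(res.x, res.fun)                         # optimal burn sizes, total dv
    ```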

  17. Number-related discrimination and summation by squirrel monkeys (Saimiri sciureus sciureus and S. boliviensus boliviensus) on the basis of the number of sides of polygons.

    PubMed

    Terrell, D F; Thomas, R K

    1990-09-01

    In Experiment 1, with the number of sides or angles of irregular polygons as cues, programmed training, and a 90% correct criterion (36 of 40), 2 squirrel monkeys' (Saimiri sciureus sciureus and S. boliviensus boliviensus) best performances were to discriminate heptagons from octagons, a 3rd's best was hexagons from heptagons, and a 4th's best was pentagons from heptagons. In Experiment 2, on most trials 2 polygons on one or both discriminanda had to be summed to determine which discrimination had the total fewer sides. Only 1 monkey met criterion (27 of 30) on the 2 tasks, 6 vs. 8 and 7 vs. 8 sides, but the other 3 performed better than chance on the 6 vs. 8 task. We conclude that previous studies of animals' discrimination of polygons in terms of complexity were minimally relevant to this work, and counting and subitizing were rejected in favor of a prototype-matching process to explain our monkeys' performances.

  18. Electrical cardioversion

    PubMed Central

    Sucu, Murat; Davutoglu, Vedat; Ozer, Orhan

    2009-01-01

    External electrical cardioversion was first performed in the 1950s. Urgent or elective cardioversions have specific advantages, such as termination of atrial and ventricular tachycardia and recovery of sinus rhythm. Electrical cardioversion is life-saving when applied in urgent circumstances. The success rate is increased by accurate tachycardia diagnosis, careful patient selection, adequate electrode (paddle) application, determination of the optimal energy and anesthesia levels, prevention of embolic events and arrhythmia recurrence, and airway conservation while minimizing possible complications. Potential complications include ventricular fibrillation due to general anesthesia or lack of synchronization between the direct current (DC) shock and the QRS complex, thromboembolus due to insufficient anticoagulant therapy, non-sustained VT, atrial arrhythmia, heart block, bradycardia, transient left bundle branch block, myocardial necrosis, myocardial dysfunction, transient hypotension, pulmonary edema and skin burn. Electrical cardioversion performed in patients with a pacemaker or an incompatible cardioverter defibrillator may lead to dysfunction, namely acute or chronic changes in the pacing or sensitivity threshold. Although this procedure appears fairly simple, serious consequences might occur if it is inappropriately performed. PMID:19448376

  19. Efficiency in nonequilibrium molecular dynamics Monte Carlo simulations

    DOE PAGES

    Radak, Brian K.; Roux, Benoît

    2016-10-07

    Hybrid algorithms combining nonequilibrium molecular dynamics and Monte Carlo (neMD/MC) offer a powerful avenue for improving the sampling efficiency of computer simulations of complex systems. These neMD/MC algorithms are also increasingly finding use in applications where conventional approaches are impractical, such as constant-pH simulations with explicit solvent. However, selecting an optimal nonequilibrium protocol for maximum efficiency often represents a non-trivial challenge. This work evaluates the efficiency of a broad class of neMD/MC algorithms and protocols within the theoretical framework of linear response theory. The approximations are validated against constant-pH MD simulations and shown to provide accurate predictions of neMD/MC performance. An assessment of a large set of protocols confirms (both theoretically and empirically) that a linear work protocol gives the best neMD/MC performance. Lastly, a well-defined criterion for optimizing the time parameters of the protocol is proposed and demonstrated with an adaptive algorithm that improves the performance on-the-fly with minimal cost.

  20. Power optimization of digital baseband WCDMA receiver components on algorithmic and architectural level

    NASA Astrophysics Data System (ADS)

    Schämann, M.; Bücker, M.; Hessel, S.; Langmann, U.

    2008-05-01

    High data rates combined with high mobility represent a challenge for the design of cellular devices. Advanced algorithms are required which result in higher complexity, more chip area and increased power consumption. However, this contrasts to the limited power supply of mobile devices. This presentation discusses the application of an HSDPA receiver which has been optimized regarding power consumption with the focus on the algorithmic and architectural level. On algorithmic level the Rake combiner, Prefilter-Rake equalizer and MMSE equalizer are compared regarding their BER performance. Both equalizer approaches provide a significant increase of performance for high data rates compared to the Rake combiner which is commonly used for lower data rates. For both equalizer approaches several adaptive algorithms are available which differ in complexity and convergence properties. To identify the algorithm which achieves the required performance with the lowest power consumption the algorithms have been investigated using SystemC models regarding their performance and arithmetic complexity. Additionally, for the Prefilter Rake equalizer the power estimations of a modified Griffith (LMS) and a Levinson (RLS) algorithm have been compared with the tool ORINOCO supplied by ChipVision. The accuracy of this tool has been verified with a scalable architecture of the UMTS channel estimation described both in SystemC and VHDL targeting a 130 nm CMOS standard cell library. An architecture combining all three approaches combined with an adaptive control unit is presented. The control unit monitors the current condition of the propagation channel and adjusts parameters for the receiver like filter size and oversampling ratio to minimize the power consumption while maintaining the required performance. The optimization strategies result in a reduction of the number of arithmetic operations up to 70% for single components which leads to an estimated power reduction of up to 40% while the BER performance is not affected. This work utilizes SystemC and ORINOCO for the first estimation of power consumption in an early step of the design flow. Thereby algorithms can be compared in different operating modes including the effects of control units. Here an algorithm having higher peak complexity and power consumption but providing more flexibility showed less consumption for normal operating modes compared to the algorithm which is optimized for peak performance.
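
    The LMS-vs-RLS trade-off mentioned above (lower arithmetic complexity against faster convergence) can be illustrated with the low-complexity end of the spectrum. The sketch below trains an LMS adaptive equalizer on a toy baseband channel; it is not the HSDPA Prefilter-Rake design, and the channel, step size, and tap count are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def lms_equalize(x, d, taps=8, mu=0.01):
        """Least-mean-squares adaptive filter: O(taps) work per sample,
        in contrast to the O(taps^2) of an RLS (Levinson-type) update."""
        w = np.zeros(taps)
        y = np.zeros(len(x))
        for n in range(taps - 1, len(x)):
            u = x[n - taps + 1:n + 1][::-1]   # tap-delay line, newest first
            y[n] = w @ u                      # filter output
            e = d[n] - y[n]                   # error vs. training symbol
            w += mu * e * u                   # stochastic gradient step
        return w, y

    # BPSK symbols through a 2-tap channel plus noise, equalized with LMS.
    sym = rng.choice([-1.0, 1.0], size=5000)
    chan = np.convolve(sym, [1.0, 0.4])[: len(sym)] + 0.05 * rng.normal(size=len(sym))
    w, y = lms_equalize(chan, sym)
    print(np.mean(np.sign(y[500:]) != sym[500:]))   # residual symbol error rate
    ```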

  1. Spatially explicit risk assessment of an estuarine fish in Barataria Bay, Louisiana, following the Deepwater Horizon Oil spill: evaluating tradeoffs in model complexity and parsimony

    EPA Science Inventory

    As ecological risk assessments (ERA) move beyond organism-based determinations towards probabilistic population-level assessments, model complexity must be evaluated against the goals of the assessment, the information available to parameterize components with minimal dependence ...

  2. Comparison of fresh-frozen cadaver and high-fidelity virtual reality simulator as methods of laparoscopic training.

    PubMed

    Sharma, Mitesh; Horgan, Alan

    2012-08-01

    The aim of this study was to compare fresh-frozen cadavers (FFC) with a high-fidelity virtual reality simulator (VRS) as training tools in minimal access surgery for complex and relatively simple procedures. A prospective comparative face validity study between FFC and VRS (LAP Mentor™) was performed. Surgeons were recruited to perform tasks on both FFC and VRS appropriately paired to their experience level. Group A (senior) performed a laparoscopic sigmoid colectomy, Group B (intermediate) performed a laparoscopic incisional hernia repair, and Group C (junior) performed basic laparoscopic tasks (BLT) (camera manipulation, hand-eye coordination, tissue dissection and hand-transferring skills). Each subject completed a 5-point Likert-type questionnaire rating the training modalities in nine domains. Data were analysed using nonparametric tests. Forty-five surgeons were recruited to participate (15 per skill group). Median scores for subjects in Group A were significantly higher for evaluation of FFC in all nine domains compared to VRS (p < 0.01). Group B scored FFC significantly better (p < 0.05) in all domains except task replication (p = 0.06). Group C scored FFC significantly better (p < 0.01) in eight domains but not on performance feedback (p = 0.09). When compared across groups, juniors accepted VRS as a training model more than did intermediate and senior groups on most domains (p < 0.01) except teamwork. Fresh-frozen cadaver is perceived as a significantly better overall model for laparoscopic training than the high-fidelity VRS by all training grades, irrespective of the complexity of the operative procedure performed. VRS is still useful when training junior trainees in BLT.

  3. [Arthroscopically Assisted Minimally Invasive Fixation of a Type D2c Scapular Fracture].

    PubMed

    Kornherr, Patrick; Konerding, Christiane; Kovacevic, Mark; Wenda, Klaus

    2018-06-12

    Fractures of the scapula are rare and have an incidence of 1% of all fractures. Publications highlight glenoid rim fractures. The classifications by Ideberg and by Euler and Rüdi are accepted. Euler and Rüdi describe three extra-articular and two intra-articular fracture patterns. The indications for surgery are displaced glenoid fractures, scapula tilt of more than 40° and injuries to the superior shoulder suspensory complex. We describe the case of a 22-year-old man who, while cycling on a wet road, collided with a moving car. After his admission to hospital as a polytraumatised patient, the trauma CT scan showed haemothorax with several associated rib fractures, a displaced humeral shaft fracture and fractures of the acromion and glenoid, classified as type D2c according to Euler and Rüdi. Following damage control principles, drainage of the haemothorax was already performed in the ER and surgical treatment of the displaced humeral shaft fracture was performed on the day of admission. No peripheral neurological deficits were evident. After pulmonary stabilisation, surgery was performed 6 days later on the glenoid and acromion fractures, which in conjunction may be regarded as an injury to the superior shoulder suspensory complex. We performed an arthroscopically-assisted screw fixation of the glenoid fracture (type D2c according to Euler and Rüdi) and an ORIF procedure at the acromion. Postoperative rehabilitation was performed with passive abduction and elevation up to 90° for the first two weeks and active abduction and elevation up to 90° for weeks 3 to 6. Full ROM was allowed at week 7. Articular fractures of the glenoid are rare and mainly seen as rim fractures. The indications for surgery are displaced articular fractures and injury to the superior shoulder suspensory complex. As demonstrated by this article, type D2c fractures according to Euler and Rüdi can be treated effectively with an arthroscopically-assisted screw fixation procedure. Georg Thieme Verlag KG Stuttgart · New York.

  4. Enhanced Multiobjective Optimization Technique for Comprehensive Aerospace Design. Part A

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Rajadas, John N.

    1997-01-01

    A multidisciplinary design optimization procedure which couples formal multiobjective-based techniques and complex analysis procedures (such as computational fluid dynamics (CFD) codes) was developed. The procedure has been demonstrated on a specific high-speed flow application involving aerodynamics and acoustics (sonic boom minimization). In order to account for multiple design objectives arising from complex performance requirements, multiobjective formulation techniques are used to formulate the optimization problem. Techniques to enhance the existing Kreisselmeier-Steinhauser (K-S) function multiobjective formulation approach have been developed. The K-S function procedure used in the proposed work transforms a constrained multiple objective functions problem into an unconstrained problem which is then solved using the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm. Weight factors are introduced during the transformation process for each objective function. This enhanced procedure provides the designer the capability to emphasize specific design objectives during the optimization process. The demonstration of the procedure utilizes a computational fluid dynamics (CFD) code which solves the three-dimensional parabolized Navier-Stokes (PNS) equations for the flow field along with an appropriate sonic boom evaluation procedure, thus introducing both aerodynamic performance and sonic boom as the design objectives to be optimized simultaneously. Sensitivity analysis is performed using a discrete differentiation approach. An approximation technique has been used within the optimizer to improve the overall computational efficiency of the procedure in order to make it suitable for design applications in an industrial setting.
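
    The core of the K-S approach is an aggregation of several objectives into one smooth, conservative envelope of their maximum, which an unconstrained optimizer such as BFGS can then minimize. The sketch below shows that aggregation on an invented two-objective toy problem; the weights, objectives, and starting point are illustrative, not the CFD/sonic-boom case.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def ks_aggregate(values, rho=50.0):
        """Kreisselmeier-Steinhauser envelope: a smooth, conservative max().

        KS = f_max + (1/rho) * log(sum(exp(rho * (f_i - f_max))))
        Shifting by f_max keeps the exponentials numerically safe.
        """
        f = np.asarray(values)
        fmax = f.max()
        return fmax + np.log(np.sum(np.exp(rho * (f - fmax)))) / rho

    # Hypothetical bi-objective problem: drive both objectives down by
    # minimizing their weighted K-S envelope with BFGS.
    def objectives(x):
        return np.array([(x[0] - 1.0) ** 2,
                         (x[1] + 2.0) ** 2 + 0.5 * x[0] ** 2])

    weights = np.array([1.0, 1.0])     # designer-chosen emphasis per objective

    def ks_cost(x):
        return ks_aggregate(weights * objectives(x))

    res = minimize(ks_cost, x0=np.zeros(2), method="BFGS")
    print(res.x)                       # compromise design for the two objectives
    ```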

  5. Current surgical management of mitral regurgitation.

    PubMed

    Calvinho, Paulo; Antunes, Manuel

    2008-04-01

    From Walton Lillehei, who performed the first successful open mitral valve surgery in 1956, until the advent of robotic surgery in the 21st century, only 50 years have passed. The introduction of the first heart valve prosthesis, in 1960, was the next major step forward. Correction of mitral disease by valvuloplasty results in better survival and ventricular performance than mitral valve replacement; however, the European Heart Survey demonstrated that only 40% of the valves are repaired. The standard procedures (Carpentier's techniques and Alfieri's edge-to-edge suture) are the surgical basis for the new technical approaches. Minimally invasive surgery led to the development of video-assisted and robotic surgery, and interventional cardiology is already taking the first steps in endovascular procedures, using the classical concepts in highly differentiated approaches. Correction of mitral regurgitation is a complex field that is still growing, and classic surgery is still under debate as the new era arises.

  6. “Additive Manufacturing: Building the Pathway Towards Process and Material Qualification”

    DOE PAGES

    Carpenter, John S.; Beese, Allison M.; Bourell, David L.; ...

    2016-06-14

    The potential benefits of metal additive manufacturing, as compared with more traditional, subtractive-only approaches, have created excitement within design circles seeking to take advantage of the ability to build and repair complex shapes, to integrate or consolidate multiple parts and minimize joining concerns, and to locally tailor material properties to increase functionality. Tempering the excitement of designers, however, have been concerns with the material deposited by the process. It is not enough for a part to ‘look’ right from a geometric perspective. Rather, the metallurgical aspects associated with the material being deposited must ‘look’ and ‘behave’ correctly along with the aforementioned geometric accuracy. Finally, without elucidation of the connections between processing, microstructure, properties, and performance from a materials science perspective, metal additive manufacturing will not realize its potential to change the manufacturing world for property- and performance-critical engineering applications.

  7. Underwater wireless optical MIMO system with spatial modulation and adaptive power allocation

    NASA Astrophysics Data System (ADS)

    Huang, Aiping; Tao, Linwei; Niu, Yilong

    2018-04-01

    In this paper, we investigate the performance of an underwater wireless optical multiple-input multiple-output communication system combining spatial modulation (SM-UOMIMO) with flag dual amplitude pulse position modulation (FDAPPM). Channel impulse responses for coastal and harbor ocean water links are obtained by Monte Carlo (MC) simulation. Moreover, we obtain closed-form and upper-bound average bit error rate (BER) expressions for receiver diversity, including optical combining, equal gain combining and selection combining. A novel adaptive power allocation algorithm (PAA) is also proposed to minimize the average BER of the SM-UOMIMO system. Our numerical results indicate an excellent match between the analytical results and numerical simulations, which confirms the accuracy of our derived expressions. Furthermore, the results show that the adaptive PAA clearly outperforms conventional fixed-factor PAA and equal PAA. A multiple-input single-output system with adaptive PAA obtains even better BER performance than the MIMO one, while at the same time reducing receiver complexity effectively.

  8. A large flat panel multifunction display for military and space applications

    NASA Astrophysics Data System (ADS)

    Pruitt, James S.

    1992-09-01

    A flat panel multifunction display (MFD) that offers the size and reliability benefits of liquid crystal display technology while achieving near-CRT display quality is presented. Display generation algorithms that provide exceptional display quality are being implemented in custom VLSI components to minimize MFD size. A high-performance processor converts user-specified display lists to graphics commands used by these components, resulting in high-speed updates of two-dimensional and three-dimensional images. The MFD uses the MIL-STD-1553B data bus for compatibility with virtually all avionics systems. The MFD can generate displays directly from display lists received from the MIL-STD-1553B bus. Complex formats can be stored in the MFD and displayed using parameters from the data bus. The MFD also accepts direct video input and performs special processing on this input to enhance image quality.

  9. Robotic pancreaticoduodenectomy.

    PubMed

    Sola, Richard; Kirks, Russell C; Iannitti, David A; Vrochides, Dionisios; Martinie, John B

    2016-01-01

    Pancreaticoduodenectomy (PD) is considered one of the most complex and technically challenging abdominal surgeries performed by general surgeons. With increasing use of minimally invasive surgery, this operation continues to be performed most commonly in an open fashion. Open PD (OPD) is characterized by high morbidity and mortality rates in published series. Since the early 2000s, use of robotics for PD has slowly evolved. For appropriately selected patients, robotic PD (RPD) has been shown to have less intraoperative blood loss, decreased morbidity and mortality, shorter hospital length of stay, and similar oncological outcomes compared with OPD. At our high-volume center, we have found lower complication rates for RPD along with no difference in total cost when compared with OPD. With demonstrated non-inferior oncologic outcomes for RPD, the potential exists that RPD may be the future standard in surgical management for pancreatic disease. We present a case of a patient with a pancreatic head mass and describe our institution's surgical technique for RPD.

  10. Nano-scale hydrogen-bond network improves the durability of greener cements

    PubMed Central

    Jacobsen, Johan; Rodrigues, Michelle Santos; Telling, Mark T. F.; Beraldo, Antonio Ludovico; Santos, Sérgio Francisco; Aldridge, Laurence P.; Bordallo, Heloisa N.

    2013-01-01

    More than ever before, the world's increasing need for new infrastructure demands the construction of efficient, sustainable and durable buildings, requiring minimal climate-changing gas-generation in their production. Maintenance-free “greener” building materials made from blended cements have advantages over ordinary Portland cements, as they are cheaper, generate less carbon dioxide and are more durable. The key for the improved performance of blends (which substitute fine amorphous silicates for cement) is related to their resistance to water penetration. The mechanism of this water resistance is of great environmental and economical impact but is not yet understood due to the complexity of the cement's hydration reactions. Using neutron spectroscopy, we studied a blend where cement was replaced by ash from sugar cane residuals originating from agricultural waste. Our findings demonstrate that the development of a distinctive hydrogen bond network at the nano-scale is the key to the performance of these greener materials. PMID:24036676

  11. An Overview of Randomization and Minimization Programs for Randomized Clinical Trials

    PubMed Central

    Saghaei, Mahmoud

    2011-01-01

    Randomization is an essential component of sound clinical trials, which prevents selection biases and helps in blinding the allocations. Randomization is a process by which subsequent subjects are enrolled into trial groups only by chance, which essentially eliminates selection biases. A possible serious consequence of randomization is severe imbalance among the treatment groups with respect to some prognostic factors, which can invalidate the trial results or necessitate complex and usually unreliable secondary analyses to eradicate the source of the imbalances. Minimization, on the other hand, tends to allocate in such a way as to minimize the differences among groups with respect to prognostic factors. Pure minimization is therefore completely deterministic; that is, one can predict the allocation of the next subject by knowing the factor levels of the previously enrolled subjects and the properties of the next subject. To reduce this predictability, it is necessary to include some elements of randomness in the minimization algorithms. In this article, brief descriptions of randomization and minimization are presented, followed by an introduction to selected randomization and minimization programs. PMID:22606659
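
    A minimal sketch of minimization with a random element, in the spirit of Pocock-Simon-style schemes: the candidate group that minimizes total factor imbalance is chosen only with probability p, which keeps the allocation from being fully predictable. Function names, the imbalance measure, and p are illustrative assumptions, not any particular program's algorithm.

    ```python
    import random

    def minimize_assign(new_subject, subjects, groups=("A", "B"), p=0.8):
        """Allocate a new subject: with probability p pick the group that
        minimizes prognostic-factor imbalance, otherwise pick at random."""
        def imbalance(group):
            total = 0
            for factor, level in new_subject.items():
                counts = {g: 0 for g in groups}
                for levels, g in subjects:
                    if levels.get(factor) == level:
                        counts[g] += 1
                counts[group] += 1                    # hypothetical assignment
                total += max(counts.values()) - min(counts.values())
            return total

        best = min(groups, key=imbalance)
        return best if random.random() < p else random.choice(groups)

    # Example: allocate a new male, over-60 patient given two enrolled subjects.
    enrolled = [({"sex": "M", "age": "over60"}, "A"),
                ({"sex": "F", "age": "under60"}, "B")]
    print(minimize_assign({"sex": "M", "age": "over60"}, enrolled))
    ```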

  12. Propeller Flaps and Its Outcomes - A Prospective Study of 15 Cases Over Two-years.

    PubMed

    K T, Ramesha; J, Vijay; M, Shankarappa

    2014-01-01

    Cover flaps are needed in the management of any bodily defect involving bone, tendon, nerve or vessels. The major objective of a plastic surgeon facing a complex soft-tissue defect is to replace "like with like" tissues at minimal donor site "cost" and with maximal accuracy and efficacy. To study the utility of propeller flaps in reconstructive surgeries, and to evaluate their planning and complications, including donor site morbidity. This prospective study was conducted on 15 cases (11 males/4 females) of propeller flaps over a period of two years (2010-12) in the Department of Plastic Surgery and Burns, Bangalore Medical College and Research Institute (BMCRI), Karnataka, India. Propeller flaps were performed in cases with defects due to any cause; cases with peripheral vascular disease (PVD) were excluded. Flaps were performed and details recorded. Overall results revealed problem resolution in 87% of cases (13 cases). A comprehensive description of each flap type and its related cases is given in the table. There were two partial flap losses; partial necrosis was seen in heavy-smoker patients. This study shows that careful planning and judicious application of local perforator flaps, for lower-limb wounds and elsewhere on the body, can provide high-quality reconstruction with minimal morbidity. It is cost-effective as well as time-saving.

  13. Identification and Characterization of Outer Membrane Vesicle-Associated Proteins in Salmonella enterica Serovar Typhimurium

    PubMed Central

    Bai, Jaewoo; Kim, Seul I; Ryu, Sangryeol

    2014-01-01

    Salmonella enterica serovar Typhimurium is a primary cause of enteric diseases and has acquired a variety of virulence factors during its evolution into a pathogen. Secreted virulence factors interact with commensal flora and host cells and enable Salmonella to survive and thrive in hostile environments. Outer membrane vesicles (OMVs) released from many Gram-negative bacteria function as a mechanism for the secretion of complex mixtures, including virulence factors. We performed a proteomic analysis of OMVs that were isolated under standard laboratory and acidic minimal medium conditions and identified 14 OMV-associated proteins that were observed in the OMV fraction isolated only under the acidic minimal medium conditions, which reproduced the nutrient-deficient intracellular milieu. The inferred roles of these 14 proteins were diverse, including transporter, enzyme, and transcriptional regulator. The absence of these proteins influenced Salmonella survival inside murine macrophages. Eleven of these proteins were predicted to possess secretion signal sequences at their N termini, and three (HupA, GlnH, and PhoN) of the proteins were found to be translocated into the cytoplasm of host cells. The comparative proteomic profiling of OMVs performed in this study revealed different protein compositions in the OMVs isolated under the two different conditions, which indicates that the OMV cargo depends on the growth conditions and provides a deeper insight into how Salmonella utilizes OMVs to adapt to environmental changes. PMID:24935973

  14. Non-negative infrared patch-image model: Robust target-background separation via partial sum minimization of singular values

    NASA Astrophysics Data System (ADS)

    Dai, Yimian; Wu, Yiquan; Song, Yu; Guo, Jun

    2017-03-01

    To further enhance small targets and suppress heavy clutter simultaneously, a robust non-negative infrared patch-image model via partial sum minimization of singular values is proposed. First, the intrinsic reason behind the undesirable performance of the state-of-the-art infrared patch-image (IPI) model when facing extremely complex backgrounds is analyzed: it lies in the mismatch between the IPI model's implicit assumption of a large number of observations and the reality of deficient observations of strong edges. To fix this problem, instead of the nuclear norm, we adopt the partial sum of singular values to constrain the low-rank background patch-image, which provides a more accurate background estimation and almost eliminates all the salient residuals in the decomposed target image. In addition, considering the fact that the infrared small target is always brighter than its adjacent background, we propose an additional non-negative constraint on the sparse target patch-image, which not only further suppresses undesirable components but also accelerates the convergence rate. Finally, an algorithm based on the inexact augmented Lagrange multiplier method is developed to solve the proposed model. A large number of experiments demonstrate that the proposed model offers a significant improvement over the other nine competitive methods in terms of both clutter suppression performance and convergence rate.
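
    The operator at the heart of this idea, as opposed to plain nuclear-norm shrinkage, can be sketched in a few lines: keep the largest N singular values intact (the assumed true background rank) and soft-threshold only the rest. The example below is a simplified illustration on a synthetic patch, not the paper's full inexact-ALM solver; the data, tau, and N are invented.

    ```python
    import numpy as np

    def prox_pssv(M, tau, N=1):
        """Proximal step for the partial sum of singular values (PSSV):
        the first N singular values are preserved, trailing ones shrunk."""
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        s[N:] = np.maximum(s[N:] - tau, 0.0)       # shrink trailing values only
        return U @ np.diag(s) @ Vt

    # A rank-1 'background' plus a bright 'target' pixel plus noise:
    rng = np.random.default_rng(2)
    bg = np.outer(np.ones(20), np.linspace(1.0, 2.0, 20))
    D = bg + 0.05 * rng.normal(size=bg.shape)
    D[10, 10] += 3.0                               # small bright target
    B = prox_pssv(D, tau=1.0, N=1)                 # estimated background
    T = np.maximum(D - B, 0.0)                     # non-negative target residual
    print(np.unravel_index(np.argmax(T), T.shape)) # expected near (10, 10)
    ```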

  15. An information-theoretic approach to motor action decoding with a reconfigurable parallel architecture.

    PubMed

    Craciun, Stefan; Brockmeier, Austin J; George, Alan D; Lam, Herman; Príncipe, José C

    2011-01-01

    Methods for decoding movements from neural spike counts using adaptive filters often rely on minimizing the mean-squared error. However, for non-Gaussian error distributions this approach is not optimal. Therefore, rather than using probabilistic modeling, we propose an alternative non-parametric approach. In order to extract more structure from the input signal (neuronal spike counts), we propose using minimum error entropy (MEE), an information-theoretic approach that minimizes the error entropy as part of an iterative cost function. The disadvantage of using MEE as the cost function for adaptive filters, however, is the increase in computational complexity. In this paper we present a comparison between the decoding performance of the analytic Wiener filter and a linear filter trained with MEE, which is then mapped to a parallel architecture in reconfigurable hardware tailored to the computational needs of the MEE filter. We observe considerable speedup from the hardware design. The adaptation of filter weights for the multiple-input, multiple-output linear filters, necessary in motor decoding, is a highly parallelizable algorithm. It can be decomposed into many independent computational blocks with a parallel architecture readily mapped to a field-programmable gate array (FPGA) and scales to large numbers of neurons. By pipelining and parallelizing independent computations in the algorithm, the proposed parallel architecture has sublinear increases in execution time with respect to both window size and filter order.
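
    As a sketch of why MEE is costly yet parallel-friendly, one gradient step for a linear filter under the usual Gaussian-kernel information-potential formulation looks roughly as follows; the function name, window handling, and parameter defaults are our illustrative assumptions rather than the paper's implementation.

    ```python
    import numpy as np

    def mee_update(w, X, d, sigma=1.0, mu=0.1):
        """One minimum-error-entropy step for a linear filter: ascend the
        quadratic information potential of the window errors, which is
        equivalent to minimizing Renyi's quadratic error entropy.
        X: (N, p) window of inputs, d: (N,) desired outputs."""
        e = d - X @ w                        # errors over the window
        de = e[:, None] - e[None, :]         # all pairwise error differences
        k = np.exp(-de**2 / (2 * sigma**2))  # Gaussian kernel of differences
        dX = X[:, None, :] - X[None, :, :]   # pairwise input differences
        grad = ((k * de)[:, :, None] * dX).sum(axis=(0, 1))
        grad /= sigma**2 * len(e)**2
        return w + mu * grad                 # ascend the information potential
    ```

    The O(N^2) pairwise kernel sum is exactly the part that is independent across pairs, which is why the computation decomposes into identical blocks that map well onto an FPGA.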

  16. In-flight Evaluation of Aerodynamic Predictions of an Air-launched Space Booster

    NASA Technical Reports Server (NTRS)

    Curry, Robert E.; Mendenhall, Michael R.; Moulton, Bryan

    1992-01-01

    Several analytical aerodynamic design tools that were applied to the Pegasus (registered trademark) air-launched space booster were evaluated using flight measurements. The study was limited to existing codes and was conducted with limited computational resources. The flight instrumentation was constrained to have minimal impact on the primary Pegasus missions. Where appropriate, the flight measurements were compared with computational data. Aerodynamic performance and trim data from the first two flights were correlated with predictions. Local measurements in the wing and wing-body interference region were correlated with analytical data. This complex flow region includes the effect of aerothermal heating magnification caused by the presence of a corner vortex and interaction of the wing leading edge shock and fuselage boundary layer. The operation of the first two missions indicates that the aerodynamic design approach for Pegasus was adequate, and data show that acceptable margins were available. Additionally, the correlations provide insight into the capabilities of these analytical tools for more complex vehicles in which the design margins may be more stringent.

  17. In-flight evaluation of aerodynamic predictions of an air-launched space booster

    NASA Technical Reports Server (NTRS)

    Curry, Robert E.; Mendenhall, Michael R.; Moulton, Bryan

    1993-01-01

    Several analytical aerodynamic design tools that were applied to the Pegasus air-launched space booster were evaluated using flight measurements. The study was limited to existing codes and was conducted with limited computational resources. The flight instrumentation was constrained to have minimal impact on the primary Pegasus missions. Where appropriate, the flight measurements were compared with computational data. Aerodynamic performance and trim data from the first two flights were correlated with predictions. Local measurements in the wing and wing-body interference region were correlated with analytical data. This complex flow region includes the effect of aerothermal heating magnification caused by the presence of a corner vortex and interaction of the wing leading edge shock and fuselage boundary layer. The operation of the first two missions indicates that the aerodynamic design approach for Pegasus was adequate, and data show that acceptable margins were available. Additionally, the correlations provide insight into the capabilities of these analytical tools for more complex vehicles in which design margins may be more stringent.

  18. Processing and Structural Advantages of the Sylramic-iBN SiC Fiber for SiC/SiC Components

    NASA Technical Reports Server (NTRS)

    Yun, H. M.; Dicarlo, J. A.; Bhatt, R. T.; Hurst, J. B.

    2008-01-01

    The successful high-temperature application of complex-shaped SiC/SiC components will depend on achieving as high a fraction of the as-produced fiber strength as possible during component fabrication and service. Key issues center on a variety of component architecture, processing, and service-related factors that can reduce fiber strength, such as fiber-fiber abrasion during architecture shaping, surface chemical attack during interphase deposition and service, and intrinsic flaw growth during high-temperature matrix formation and composite creep. The objective of this paper is to show that the NASA-developed Sylramic-iBN SiC fiber minimizes many of these issues for state-of-the-art melt-infiltrated (MI) SiC/BN/SiC composites. To accomplish this, data from various mechanical tests are presented that compare how different high performance SiC fiber types retain strength during formation of complex architectures, during processing of BN interphases and MI matrices, and during simulated composite service at high temperatures.

  19. Hypophosphataemia: an easy strategy for diagnosis and treatment in HIV patients.

    PubMed

    Bagnis, Corinne Isnard; Karie, Svetlana; Deray, Gilbert; Essig, Marie

    2009-01-01

    Because HIV infection has become a chronic disease, it is crucial that metabolic complications secondary to HIV infection or prolonged therapy be diagnosed and managed appropriately over time. The optimal follow-up therefore becomes complex and time consuming. Our review aimed to provide physicians in charge of HIV-infected patients with key data helping them to diagnose and understand hypophosphataemia in HIV patients. Hypophosphataemia is frequent and sometimes secondary to renal phosphate wasting. It is very rarely a component of a complex proximal tubular disorder, such as Fanconi syndrome. When isolated, hypophosphataemia is easy to rule out and treat. In rare cases, prolonged hypophosphataemia related to renal phosphate wasting and tubular dysfunction might have consequences for bone outcomes; however, more studies are needed. HIV infection by itself might be a risk factor for bone metabolism abnormalities; antiretroviral drugs might also be involved. It therefore seems valuable for patients that minimal screening be performed routinely, in order to prevent long-term disabilities.

  20. Cardiorespiratory interactions: the relationship between mechanical ventilation and hemodynamics.

    PubMed

    Cheifetz, Ira M

    2014-12-01

    The overall goal of the cardiorespiratory system is to provide the organs and tissues of the body with an adequate supply of oxygen in relation to oxygen consumption. An understanding of the complex physiologic interactions between the respiratory and cardiac systems is essential to optimal patient management. Alterations in intrathoracic pressure are transmitted to the heart and lungs and can dramatically alter cardiovascular performance, with significant differences existing between the physiologic responses of the right and left ventricles to changes in intrathoracic pressure. In terms of cardiorespiratory interactions, the clinician should titrate the mean airway pressure to optimize the balance between mean lung volume (ie, arterial oxygenation) and ventricular function (ie, global cardiac output), minimize pulmonary vascular resistance, and routinely monitor cardiorespiratory parameters closely. Oxygen delivery to all organs and tissues of the body should be optimized, but not necessarily maximized. The heart and lungs are connected not only anatomically but also physiologically, in a complex relationship. Copyright © 2014 by Daedalus Enterprises.

  1. Lattice thermal conductivity of multi-component alloys

    DOE PAGES

    Caro, Magdalena; Béland, Laurent K.; Samolyuk, German D.; ...

    2015-06-12

    High entropy alloys (HEA) have unique properties including the potential to be radiation tolerant. These materials with extreme disorder could resist damage because disorder, stabilized by entropy, is the equilibrium thermodynamic state. Disorder also reduces electron and phonon conductivity, keeping the damage energy longer at the deposition locations and eventually favoring defect recombination. In the short time-scales related to thermal spikes induced by collision cascades, phonons become the relevant energy carrier. In this paper, we perform a systematic study of phonon thermal conductivity in multiple-component solid solutions represented by Lennard-Jones (LJ) potentials. We explore the conditions that minimize phonon mean free path via extreme alloy complexity, by varying the composition and the elements (differing in mass, atomic radii, and cohesive energy). We show that alloy complexity can be tailored to modify the scattering mechanisms that control energy transport in the phonon subsystem. Finally, our analysis provides qualitative guidance for the selection criteria used in the design of HEAs with low phonon thermal conductivity.

  2. Long-life, space-maintainable nuclear stage regulators and shutoff valves

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The six most promising valve, regulator, and remote coupling concepts, representing the more radical designs from the twenty concepts generated, were investigated. Of the three valves, one has no moving parts: shutoff sealing is accomplished by an electromagnetic field which ionizes the flowing fluid. Another valve uses liquid metal to obtain sealing. In the third valve, high sealing forces are generated by heating and expanding trapped hydrogen. The pressure regulator is an electronically controlled, electromechanically operated, single-stage valve. Its complexity lies in the electronic circuitry, and the design results in less weight, increased reliability and performance flexibility, and multipurpose application. The two remote couplings feature minimal weight and mechanical complexity. One concept uses a low-melting-temperature metal alloy which is injected into the joint cavity; upon solidification, the alloy provides a seal and a structural joint. The second concept is based on the differential thermal expansion of the coupling mating parts. At thermal equilibrium there is a predetermined interference between the parts, and sealing is achieved by interference loading.

  3. Replica Exchange Improves Sampling in Low-Resolution Docking Stage of RosettaDock

    PubMed Central

    Zhang, Zhe; Lange, Oliver F.

    2013-01-01

    Many protein-protein docking protocols are based on a shotgun approach, in which thousands of independent random-start trajectories minimize the rigid-body degrees of freedom. Another strategy is enumerative sampling as used in ZDOCK. Here, we introduce an alternative strategy, ReplicaDock, using a small number of long trajectories of temperature replica exchange. We compare replica exchange sampling as low-resolution stage of RosettaDock with RosettaDock's original shotgun sampling as well as with ZDOCK. A benchmark of 30 complexes starting from structures of the unbound binding partners shows improved performance for ReplicaDock and ZDOCK when compared to shotgun sampling at equal or less computational expense. ReplicaDock and ZDOCK consistently reach lower energies and generate significantly more near-native conformations than shotgun sampling. Accordingly, they both improve typical metrics of prediction quality of complex structures after refinement. Additionally, the refined ReplicaDock ensembles reach significantly lower interface energies and many previously hidden features of the docking energy landscape become visible when ReplicaDock is applied. PMID:24009670
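
    At the heart of temperature replica exchange is a simple Metropolis swap between neighbouring replicas; the following is a generic sketch of that criterion (not RosettaDock's actual source, and the function name is ours).

    ```python
    import math
    import random

    def accept_swap(E_i, E_j, T_i, T_j, rng=random):
        """Metropolis criterion for exchanging configurations between two
        temperature replicas. Acceptance probability is
        min(1, exp((1/T_i - 1/T_j) * (E_i - E_j))), so a low-energy pose
        found by a hot replica readily migrates to a colder one, where it
        is refined rather than shaken apart."""
        delta = (1.0 / T_i - 1.0 / T_j) * (E_i - E_j)
        return delta >= 0.0 or rng.random() < math.exp(delta)
    ```

    A small number of long, communicating trajectories can therefore cross energy barriers that defeat independent random-start minimizations, which is consistent with the lower energies the refined ReplicaDock ensembles reach.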

  4. Development of a 3-step straight-through purification strategy combining membrane adsorbers and resins.

    PubMed

    Hughson, Michael D; Cruz, Thayana A; Carvalho, Rimenys J; Castilho, Leda R

    2017-07-01

    The pressures to efficiently produce complex biopharmaceuticals at reduced costs are driving the development of novel techniques, such as in downstream processing with straight-through processing (STP). This method involves directly and sequentially purifying a particular target with minimal holding steps. This work developed and compared six different 3-step STP strategies, combining membrane adsorbers, monoliths, and resins, to purify a large, complex, and labile glycoprotein from Chinese hamster ovary cell culture supernatant. The best performing pathway was cation exchange chromatography to hydrophobic interaction chromatography to affinity chromatography with an overall product recovery of up to 88% across the process and significant clearance of DNA and protein impurities. This work establishes a platform and considerations for the development of STP of biopharmaceutical products and highlights its suitability for integration with single-use technologies and continuous production methods. © 2017 American Institute of Chemical Engineers Biotechnol. Prog., 33:931-940, 2017. © 2017 American Institute of Chemical Engineers.

  5. Data-collection strategy for challenging native SAD phasing.

    PubMed

    Olieric, Vincent; Weinert, Tobias; Finke, Aaron D; Anders, Carolin; Li, Dianfan; Olieric, Natacha; Borca, Camelia N; Steinmetz, Michel O; Caffrey, Martin; Jinek, Martin; Wang, Meitian

    2016-03-01

    Recent improvements in data-collection strategies have pushed the limits of native SAD (single-wavelength anomalous diffraction) phasing, a method that uses the weak anomalous signal of light elements naturally present in macromolecules. These involve merging multiple data sets, either from multiple crystals or from a single crystal collected in multiple orientations at a low X-ray dose. Both approaches yield data of high multiplicity while minimizing radiation damage and systematic error, thus ensuring accurate measurements of the anomalous differences. Here, the combined use of these two strategies is described to solve cases of native SAD phasing that posed particular challenges: the integral membrane diacylglycerol kinase (DgkA) with a low Bijvoet ratio of 1% and the large 200 kDa complex of the CRISPR-associated endonuclease (Cas9) bound to guide RNA and target DNA crystallized in the low-symmetry space group C2. The optimal native SAD data-collection strategy, based on systematic measurements performed on the 266 kDa multiprotein/multiligand tubulin complex, is discussed.

  6. 3-D video techniques in endoscopic surgery.

    PubMed

    Becker, H; Melzer, A; Schurr, M O; Buess, G

    1993-02-01

    Three-dimensional visualisation of the operative field is an important requisite for precise and fast handling of open surgical operations. Up to now it has only been possible to display a two-dimensional image on the monitor during endoscopic procedures. The increasing complexity of minimally invasive interventions requires endoscopic suturing and ligature of larger vessels, which are difficult to perform without an impression of depth. Three-dimensional vision therefore may decrease the operative risk, accelerate interventions and widen the operative spectrum. In April 1992 a 3-D video system developed at the Nuclear Research Center Karlsruhe, Germany (IAI Institute) was applied in various animal experimental procedures and clinically in laparoscopic cholecystectomy. The system works with a single monitor and active high-speed shutter glasses. Our first trials with this new 3-D imaging system clearly showed a facilitation of complex surgical manoeuvres such as mobilisation of organs, preparation in deep spaces and suture techniques. The 3-D system introduced in this article will enter the market in 1993 (Opticon Co., Karlsruhe, Germany).

  7. GaAs VLSI technology and circuit elements for DSP

    NASA Astrophysics Data System (ADS)

    Mikkelson, James M.

    1990-10-01

    Recent progress in digital GaAs circuit performance and complexity is presented to demonstrate the current capabilities of GaAs components. High-density GaAs process technology and circuit design techniques are described, and critical issues for achieving favorable complexity, speed, power, and cost tradeoffs are reviewed. Some DSP building blocks are described to provide examples of what types of DSP systems could be implemented with present GaAs technology. DIGITAL GaAs CIRCUIT CAPABILITIES In the past few years the capabilities of digital GaAs circuits have dramatically increased to the VLSI level. Major gains in circuit complexity and power-delay products have been achieved by the use of silicon-like process technologies and simple circuit topologies. The very high speed and low power consumption of digital GaAs VLSI circuits have made GaAs a desirable alternative to high-performance silicon in hardware-intensive, high-speed system applications. An example of the performance and integration complexity available with GaAs VLSI circuits is the 64x64 crosspoint switch shown in figure 1. This switch, which is the most complex GaAs circuit currently available, is designed on a 30 gate GaAs gate array. It operates at 200 MHz and dissipates only 8 watts of power. The reasons for increasing the level of integration of GaAs circuits are similar to the reasons for the continued increase of silicon circuit complexity. The market factors driving GaAs VLSI are system design methodology, system cost, power, and reliability. System designers are hesitant or unwilling to go backwards to previous design techniques and lower levels of integration. A more highly integrated system in a lower-performance technology can often approach the performance of a system in a higher-performance technology at a lower level of integration. Higher levels of integration also lower the system component count, which reduces the system cost, size, and power consumption while improving the system reliability. For large gate-count circuits the power per gate must be minimized to prevent reliability and cooling problems. The technical factors which favor increasing GaAs circuit complexity are primarily related to reducing the speed and power penalties incurred when crossing chip boundaries. Because the internal GaAs chip logic levels are not compatible with standard silicon I/O levels, input receivers and output drivers are needed to convert levels. These I/O circuits add significant delay to logic paths, consume large amounts of power, and use an appreciable portion of the die area. The effects of these I/O penalties can be reduced by increasing the ratio of core logic to I/O on a chip. DSP operations which have a large number of logic stages between the input and the output are ideal candidates to take advantage of the performance of GaAs digital circuits. Figure 2 is a schematic representation of the I/O penalties encountered when converting from ECL levels to GaAs

  8. Software Aids for radiologists: Part 1, Useful Photoshop skills.

    PubMed

    Gross, Joel A; Thapa, Mahesh M

    2012-12-01

    The purpose of this review is to describe the use of several essential techniques and tools in Adobe Photoshop image-editing software. The techniques shown expand on those previously described in the radiologic literature. Radiologists, especially those with minimal experience with image-editing software, can quickly apply a few essential Photoshop tools to minimize the frustration that can result from attempting to navigate a complex user interface.

  9. Da Vinci Robotic Surgery in a Pediatric Hospital.

    PubMed

    Mattioli, Girolamo; Pini Prato, Alessio; Razore, Barbara; Leonelli, Lorenzo; Pio, Luca; Avanzini, Stefano; Boscarelli, Alessandro; Barabino, Paola; Disma, Nicola Massimo; Zanaboni, Clelia; Garzi, Alfredo; Martigli, Sofia Paola; Buffi, Nicolò Maria; Rosati, Ubaldo; Petralia, Paolo

    2017-05-01

    Since the use of robotic surgery (RS) revolutionized some adult surgical procedures such as radical prostatectomy, it has been progressively and increasingly introduced in pediatric surgery. The aim of this study is to evaluate how the Da Vinci® Si HD technology impacts a pediatric public hospital and to define the use of a robotic system in pediatric surgery. We prospectively included patients older than 6 months of age undergoing RS or conventional minimal access surgery (MAS); the study period ranged from February 2015 to April 2016. Surgical indications were defined after a detailed disease-specific diagnostic work-up. We analyzed surgical outcomes and the most relevant economic aspects. The 30-day postoperative complications were evaluated and retrospectively collected in an electronic database. From February 2015 to April 2016, we performed 77 procedures with RS and 84 with conventional MAS in patients with a median age of 77 and 98 months at surgery and a median weight of 20 and 23 kg, respectively. Median operative times were 130 and 109 minutes, respectively. We observed 9.1% complications in the RS group and 6% in the MAS group; the difference was not statistically significant. Of note, 8 out of 77 RS procedures would have been performed with classic open surgery in case of conversion or failure of RS. This initial experience confirms that RS is as safe and effective as conventional MAS. A number of selected procedures performed with RS would only benefit from this approach, as they are not suitable for conventional MAS. Although economically demanding, in particular for a pediatric hospital, we firmly believe that centralization of care would allow pediatric surgeons adopting RS to perform complex reconstructive surgical procedures with great advantages for the patients and a minimal increase in overall costs for the health system.

  10. Hydrodynamically induced oscillations and traffic dynamics in 1D microfluidic networks

    NASA Astrophysics Data System (ADS)

    Bartolo, Denis; Jeanneret, Raphael

    2011-03-01

    We report on the traffic dynamics of particles driven through a minimal microfluidic network. Even in the minimal network consisting of a single loop, the traffic dynamics has proven to yield complex temporal patterns, including periodic, multi-periodic or chaotic sequences. This complex dynamics arises from the strongly nonlinear hydrodynamic interactions between the particles that take place at a junction. To better understand the consequences of this nontrivial coupling, we combined theoretical, numerical and experimental efforts and solved the 3-body problem in a 1D loop network. This apparently simple dynamical system revealed a rich and unexpected dynamics, including coherent spontaneous oscillations along closed orbits. Striking similarities between Hamiltonian systems and this driven dissipative system will be explained.

  11. Temperature Dependence of Uranium and Vanadium Adsorption on Amidoxime-Based Adsorbents in Natural Seawater

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuo, Li-Jung; Gill, Gary A.; Tsouris, Costas

    The apparent enthalpy and entropy of the complexation of uranium (VI) and vanadium (V) with amidoxime ligands grafted onto polyethylene fiber was determined using time series measurements of adsorption capacities in natural seawater at three different temperatures. The complexation of uranium was highly endothermic, while the complexation of vanadium showed minimal temperature sensitivity. Amidoxime-based polymeric adsorbents exhibit significantly increased uranium adsorption capacities and selectivity in warmer waters.

  12. Trajectory-Oriented Approach to Managing Traffic Complexity: Operational Concept and Preliminary Metrics Definition

    NASA Technical Reports Server (NTRS)

    Idris, Husni; Vivona, Robert; Garcia-Chico, Jose L.

    2008-01-01

    This document describes preliminary research on a distributed, trajectory-oriented approach for traffic complexity management. The approach is to manage traffic complexity in a distributed control environment, based on preserving trajectory flexibility and minimizing constraints. In particular, the document presents an analytical framework to study trajectory flexibility and the impact of trajectory constraints on it. The document proposes preliminary flexibility metrics that can be interpreted and measured within the framework.

  13. Minimal Intervention Dentistry – A New Frontier in Clinical Dentistry

    PubMed Central

    NK., Bajwa; A, Pathak

    2014-01-01

    Minimally invasive procedures are the new paradigm in health care. Everything from heart bypasses to gall bladder surgeries is being performed with these dynamic new techniques. Dentistry is joining this exciting revolution as well. Minimally invasive dentistry adopts a philosophy that integrates prevention, remineralisation and minimal intervention for the placement and replacement of restorations. Minimally invasive dentistry reaches the treatment objective using the least invasive surgical approach, with the removal of the minimal amount of healthy tissue. This paper reviews in brief the concept of minimal intervention in dentistry. PMID:25177659

  14. Minimal intervention dentistry - a new frontier in clinical dentistry.

    PubMed

    Mm, Jingarwar; Nk, Bajwa; A, Pathak

    2014-07-01

    Minimally invasive procedures are the new paradigm in health care. Everything from heart bypasses to gall bladder surgeries is being performed with these dynamic new techniques. Dentistry is joining this exciting revolution as well. Minimally invasive dentistry adopts a philosophy that integrates prevention, remineralisation and minimal intervention for the placement and replacement of restorations. Minimally invasive dentistry reaches the treatment objective using the least invasive surgical approach, with the removal of the minimal amount of healthy tissue. This paper reviews in brief the concept of minimal intervention in dentistry.

  15. Improving polymer/nanocrystal hybrid solar cell performance via tuning ligand orientation at CdSe quantum dot surface.

    PubMed

    Fu, Weifei; Wang, Ling; Zhang, Yanfang; Ma, Ruisong; Zuo, Lijian; Mai, Jiangquan; Lau, Tsz-Ki; Du, Shixuan; Lu, Xinhui; Shi, Minmin; Li, Hanying; Chen, Hongzheng

    2014-11-12

    Achieving superior solar cell performance based on colloidal nanocrystals remains challenging due to their complex surface composition. Much attention has been devoted to the development of effective surface modification strategies to enhance electronic coupling between the nanocrystals to promote charge carrier transport. Herein, we aim to attach benzenedithiol ligands onto the surface of CdSe nanocrystals in the "face-on" geometry to minimize the nanocrystal-nanocrystal or polymer-nanocrystal distance. Furthermore, the "electroactive" π-orbitals of the benzenedithiol are expected to further enhance the electronic coupling, which facilitates charge carrier dissociation and transport. The electron mobility of CdSe QD films was improved 20-fold by tuning the ligand orientation, and high-performance poly[2,6-(4,4-bis(2-ethylhexyl)-4H-cyclopenta[2,1-b;3,4-b']-dithiophene)-alt-4,7-(2,1,3-benzothiadiazole)] (PCPDTBT):CdSe nanocrystal hybrid solar cells were also achieved, showing a highest power conversion efficiency of 4.18%. This research could open up a new pathway to further improve the performance of colloidal nanocrystal based solar cells.

  16. Robotic Assisted Transanal Polypectomies: Is There Any Indication?

    PubMed

    Gómez Ruiz, Marcos; Cagigas Fernández, Carmen; Alonso Martín, Joaquín; Cristobal Poch, Lidia; Manuel Palazuelos, Carlos; Barredo Cañibano, Francisco Javier; Gómez Fleitas, Manuel; Castillo Diego, Julio

    2017-12-01

    Robotic assisted transanal polypectomy may have advantages compared with the conventional transanal minimally invasive surgery (TAMIS) technique. We evaluate the safety, feasibility and advantages of this technique. Between February 2014 and October 2015, 9 patients underwent robotic transanal polypectomy. We performed a retrospective study in which we analyse prospectively collected data regarding patient and tumor characteristics, perioperative outcomes, pathological report, morbidity and mortality. A total of 5 male and 4 female patients underwent robotic TAMIS. Lesions were located 6.22 cm from the anal verge. Mean lesion size was 15.8 cm². All procedures were performed in the lithotomy position. Closure of the defect was performed in all cases. Mean blood loss was 39.8 ml. Mean operative time was 71.9 min. No severe postoperative complications or readmissions occurred. Median hospital stay was 2.5 days. Robotic TAMIS is useful to treat complex rectal lesions. Our transanal platform allowed a wider range of movements of the robotic arms and allowed all procedures to be performed in the lithotomy position. Copyright © 2017 AEC. Publicado por Elsevier España, S.L.U. All rights reserved.

  17. Lunar Reconnaissance Orbiter (LRO) Rapid Thermal Design Development

    NASA Technical Reports Server (NTRS)

    Baker, Charles; Cottingham, Christine; Garrison, Matthew; Melak, Tony; Peabody, Sharon; Powers, Dan

    2009-01-01

    The Lunar Reconnaissance Orbiter (LRO) project had a rapid development schedule, starting with project conception in spring of 2004, instrument and launch vehicle selection late in 2005, and launch in early 2009. The lunar thermal environment is one of the harshest in our solar system, with heavy infrared loading from the moon due to low albedo, lack of a lunar atmosphere, and low effective regolith conduction. This set of constraints required a thermal design which maximized performance (minimized radiator area and cold-control heater power) and minimized thermal hardware build at the orbiter level (blanketing and heater services). The orbiter design located most of the avionics on an isothermalized heat pipe panel called the IsoThermal Panel (ITP). The ITP was coupled by dual-bore heat pipes to an Optical Solar Reflector (OSR)-covered heat pipe radiator. By coupling all of the avionics to one system, the hardware was simplified. The seven instruments were mainly heritage instruments, so their radiators were located per their heritage designs. This minimized instrument redesigns and therefore allowed them to be delivered earlier, though it resulted in a more complex orbiter-level blanket and heater service design. Three of the instruments were mounted on a tight-pointing M55J optical bench that needed to be covered in heaters to maintain pointing. Two were mounted to spacecraft-controlled radiators. One was mounted to the ITP dual bores. The last was mounted directly to the bus structure on the moon-facing panel. The propulsion system utilized four 20-pound insertion thrusters and eight 5-pound attitude control system (ACS) thrusters, in addition to 1000 kg of fuel in two large tanks. The propulsion system had a heater cylinder and a heated mounting deck for the insertion thrusters, which coupled most of the propulsion design together, simplifying the heater design. The High Gain Antenna System (HGAS) and Solar Array System (SAS) used dual-axis actuator gimbal systems. HGAS required additional boom heaters to cool the approximately 10 W of RF losses through the rotary joints and waveguides from the 40 W Ka system. By design this module needed a fair amount of heater, blanketing, and radiator complexity. The SAS required a separate cable-wrap radiator to help cool the solar array harness, which dissipated 30 W through the actuators and cable wraps. This module was also complex.

  18. Energy aware path planning in complex four dimensional environments

    NASA Astrophysics Data System (ADS)

    Chakrabarty, Anjan

    This dissertation addresses the problem of energy-aware path planning for small autonomous vehicles. While small autonomous vehicles can perform missions that are too risky (or infeasible) for larger vehicles, the missions are limited by the amount of energy that can be carried on board the vehicle. Path planning techniques that either minimize energy consumption or exploit energy available in the environment can thus increase range and endurance. Path planning is complicated by significant spatial (and potentially temporal) variations in the environment. While the main focus is on autonomous aircraft, this research also addresses autonomous ground vehicles. Range and endurance of small unmanned aerial vehicles (UAVs) can be greatly improved by utilizing energy from the atmosphere. Wind can be exploited to minimize energy consumption of a small UAV. But wind, like any other atmospheric component, is a space- and time-varying phenomenon. To effectively use wind for long-range missions, both exploration and exploitation of wind are critical. This research presents a kinematics-based tree algorithm which efficiently handles the four-dimensional (three spatial and time) path planning problem. The Kinematic Tree algorithm provides a sequence of waypoints, airspeeds, heading and bank angle commands for each segment of the path. The planner is shown to be resolution complete and computationally efficient. Global optimality of the cost function cannot be claimed, as energy is gained from the atmosphere, making the cost function inadmissible. However, the Kinematic Tree is shown to be optimal up to resolution if the cost function is admissible. Simulation results show the efficacy of this planning method for a glider in complex real wind data. Simulation results verify that the planner is able to extract energy from the atmosphere, enabling long-range missions. The Kinematic Tree planning framework, developed to minimize energy consumption of UAVs, is applied to path planning for ground robots. In the traditional path planning problem the focus is on obstacle avoidance and navigation. The optimal Kinematic Tree algorithm, named Kinematic Tree*, is shown to find optimal paths to reach the destination while avoiding obstacles. A more challenging path planning scenario arises when planning in complex terrain. This research shows how the Kinematic Tree* algorithm can be extended to find minimum-energy paths for a ground vehicle in difficult mountainous terrain.

  19. An Approximation of the Error Backpropagation Algorithm in a Predictive Coding Network with Local Hebbian Synaptic Plasticity

    PubMed Central

    Whittington, James C. R.; Bogacz, Rafal

    2017-01-01

    To efficiently learn from feedback, cortical networks need to update synaptic weights on multiple levels of cortical hierarchy. An effective and well-known algorithm for computing such changes in synaptic weights is the error backpropagation algorithm. However, in this algorithm, the change in synaptic weights is a complex function of weights and activities of neurons not directly connected with the synapse being modified, whereas the changes in biological synapses are determined only by the activity of presynaptic and postsynaptic neurons. Several models have been proposed that approximate the backpropagation algorithm with local synaptic plasticity, but these models require complex external control over the network or relatively complex plasticity rules. Here we show that a network developed in the predictive coding framework can efficiently perform supervised learning fully autonomously, employing only simple local Hebbian plasticity. Furthermore, for certain parameters, the weight change in the predictive coding model converges to that of the backpropagation algorithm. This suggests that it is possible for cortical networks with simple Hebbian synaptic plasticity to implement efficient learning algorithms in which synapses in areas on multiple levels of hierarchy are modified to minimize the error on the output. PMID:28333583
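
    The local learning rule the paper describes can be illustrated with a toy, linear two-layer sketch: activities relax against local prediction errors, then each weight changes by presynaptic activity times postsynaptic error. The two-layer structure, linear activations, and all names below are our simplifications for illustration, not the authors' exact model.

    ```python
    import numpy as np

    def pc_train_step(x_in, target, W0, W1, steps=50, lr_x=0.1, lr_w=0.01):
        """One supervised predictive coding step. Inference: hidden
        activity x_h relaxes to minimize squared prediction errors while
        the output is clamped to the target. Learning: purely local,
        Hebbian-like weight updates from pre- and postsynaptic terms."""
        x_h = W0 @ x_in                          # initial feedforward guess
        for _ in range(steps):                   # inference phase
            e_h = x_h - W0 @ x_in                # error node, hidden layer
            e_out = target - W1 @ x_h            # error node, clamped output
            x_h += lr_x * (W1.T @ e_out - e_h)   # relax hidden activity
        e_h = x_h - W0 @ x_in                    # errors at the relaxed state
        e_out = target - W1 @ x_h
        W0 += lr_w * np.outer(e_h, x_in)         # local Hebbian updates
        W1 += lr_w * np.outer(e_out, x_h)
        return W0, W1
    ```

    After relaxation the hidden error nodes carry approximately the same signal that backpropagation would deliver, which is the sense in which the weight changes can converge to those of the backpropagation algorithm.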

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chacón, L., E-mail: chacon@lanl.gov; Chen, G.; Knoll, D.A.

    We review the state of the art in the formulation, implementation, and performance of so-called high-order/low-order (HOLO) algorithms for challenging multiscale problems. HOLO algorithms attempt to couple one or several high-complexity physical models (the high-order model, HO) with low-complexity ones (the low-order model, LO). The primary goal of HOLO algorithms is to achieve nonlinear convergence between HO and LO components while minimizing memory footprint and managing the computational complexity in a practical manner. Key to the HOLO approach is the use of the LO representations to address temporal stiffness, effectively accelerating the convergence of the HO/LO coupled system. The HOLO approach is broadly underpinned by the concept of nonlinear elimination, which enables segregation of the HO and LO components in ways that can effectively use heterogeneous architectures. The accuracy and efficiency benefits of HOLO algorithms are demonstrated with specific applications to radiation transport, gas dynamics, plasmas (both Eulerian and Lagrangian formulations), and ocean modeling. Across this broad application spectrum, HOLO algorithms achieve significant accuracy improvements at a fraction of the cost compared to conventional approaches. It follows that HOLO algorithms hold significant potential for high-fidelity system scale multiscale simulations leveraging exascale computing.

  1. Multiscale high-order/low-order (HOLO) algorithms and applications

    NASA Astrophysics Data System (ADS)

    Chacón, L.; Chen, G.; Knoll, D. A.; Newman, C.; Park, H.; Taitano, W.; Willert, J. A.; Womeldorff, G.

    2017-02-01

    We review the state of the art in the formulation, implementation, and performance of so-called high-order/low-order (HOLO) algorithms for challenging multiscale problems. HOLO algorithms attempt to couple one or several high-complexity physical models (the high-order model, HO) with low-complexity ones (the low-order model, LO). The primary goal of HOLO algorithms is to achieve nonlinear convergence between HO and LO components while minimizing memory footprint and managing the computational complexity in a practical manner. Key to the HOLO approach is the use of the LO representations to address temporal stiffness, effectively accelerating the convergence of the HO/LO coupled system. The HOLO approach is broadly underpinned by the concept of nonlinear elimination, which enables segregation of the HO and LO components in ways that can effectively use heterogeneous architectures. The accuracy and efficiency benefits of HOLO algorithms are demonstrated with specific applications to radiation transport, gas dynamics, plasmas (both Eulerian and Lagrangian formulations), and ocean modeling. Across this broad application spectrum, HOLO algorithms achieve significant accuracy improvements at a fraction of the cost compared to conventional approaches. It follows that HOLO algorithms hold significant potential for high-fidelity system scale multiscale simulations leveraging exascale computing.

  2. An Approximation of the Error Backpropagation Algorithm in a Predictive Coding Network with Local Hebbian Synaptic Plasticity.

    PubMed

    Whittington, James C R; Bogacz, Rafal

    2017-05-01

    To efficiently learn from feedback, cortical networks need to update synaptic weights on multiple levels of cortical hierarchy. An effective and well-known algorithm for computing such changes in synaptic weights is the error backpropagation algorithm. However, in this algorithm, the change in synaptic weights is a complex function of weights and activities of neurons not directly connected with the synapse being modified, whereas the changes in biological synapses are determined only by the activity of presynaptic and postsynaptic neurons. Several models have been proposed that approximate the backpropagation algorithm with local synaptic plasticity, but these models require complex external control over the network or relatively complex plasticity rules. Here we show that a network developed in the predictive coding framework can efficiently perform supervised learning fully autonomously, employing only simple local Hebbian plasticity. Furthermore, for certain parameters, the weight change in the predictive coding model converges to that of the backpropagation algorithm. This suggests that it is possible for cortical networks with simple Hebbian synaptic plasticity to implement efficient learning algorithms in which synapses in areas on multiple levels of hierarchy are modified to minimize the error on the output.

  3. Complex angular and torsional deformities (distal femoral malunions). Preoperative planning using stereolithography and surgical correction with locking plate fixation in four dogs.

    PubMed

    DeTora, Michael D; Boudrieau, Randy J

    2016-09-20

    To describe the surgical technique of complex distal femoral deformity correction with the aid of stereolithography apparatus (SLA) biomodels, stabilized with locking plate fixation. Full-size replica epoxy bone biomodels of the affected femurs (4 dogs/5 limbs) were used as templates for surgical planning. A rehearsal procedure was performed on the biomodels, aided by a guide wire technique and stabilized with locking plate fixation. Surgery performed in all dogs was guided by the rehearsal procedure. All pre-contoured implants were subsequently used in the definitive surgical procedure with minimal modification. All dogs improved markedly, with near-normal functional outcomes; all but one had a mild persistent lameness at the final in-hospital follow-up examination (mean: 54.4 weeks; range: 24-113 weeks after surgery). All femurs healed without complications (mean: 34 weeks, median: 12 weeks; range: 8-12 weeks for closing osteotomies, and 26-113 weeks for opening wedge osteotomies). Long-term follow-up examination (mean: 28.6 months; range: 5-42 months) revealed all but one owner to be highly satisfied with the outcome. Complications were observed in two dogs: prolonged decreased flexion of the tibiotarsal joint that resolved with physical therapy. In one of these dogs, an iatrogenic transection of the long digital extensor tendon was repaired; the other had a peroneal nerve neurapraxia. Stereolithography apparatus biomodels and rehearsal surgery simplified the definitive surgical corrections of complex femoral malunions and resulted in good functional outcomes.

  4. Safety of robotic general surgery in elderly patients.

    PubMed

    Buchs, Nicolas C; Addeo, Pietro; Bianco, Francesco M; Ayloo, Subhashini; Elli, Enrique F; Giulianotti, Pier C

    2010-08-01

    As the life expectancy of people in Western countries continues to rise, so too does the number of elderly patients. In parallel, robotic surgery continues to gain increasing acceptance, allowing for more complex operations to be performed by minimally invasive approach and extending indications for surgery to this population. The aim of this study is to assess the safety of robotic general surgery in patients 70 years and older. From April 2007 to December 2009, patients 70 years and older, who underwent various robotic procedures at our institution, were stratified into three categories of surgical complexity (low, intermediate, and high). There were 73 patients, including 39 women (53.4%) and 34 men (46.6%). The median age was 75 years (range 70-88 years). There were 7, 24, and 42 patients included, respectively, in the low, intermediate, and high surgical complexity categories. Approximately 50% of patients underwent hepatic and pancreatic resections. There was no statistically significant difference between the three groups in terms of morbidity, mortality, readmission or transfusion. Mean overall operative time was 254 ± 133 min (range 15-560 min). Perioperative mortality and morbidity was 1.4% and 15.1%, respectively. Transfusion rate was 9.6%, and median length of stay was 6 days (range 0-30 days). Robotic surgery can be performed safely in the elderly population with low mortality, acceptable morbidity, and short hospital stay. Age should not be considered as a contraindication to robotic surgery even for advanced procedures.

  5. Sparse Representation with Spatio-Temporal Online Dictionary Learning for Efficient Video Coding.

    PubMed

    Dai, Wenrui; Shen, Yangmei; Tang, Xin; Zou, Junni; Xiong, Hongkai; Chen, Chang Wen

    2016-07-27

    Classical dictionary learning methods for video coding suffer from high computational complexity and impaired coding efficiency because they disregard the underlying signal distribution. This paper proposes a spatio-temporal online dictionary learning (STOL) algorithm to speed up the convergence rate of dictionary learning with a guarantee of approximation error. The proposed algorithm incorporates stochastic gradient descent to form a dictionary of pairs of 3-D low-frequency and high-frequency spatio-temporal volumes. In each iteration of the learning process, it randomly selects one sample volume and updates the atoms of the dictionary by minimizing the expected cost, rather than optimizing the empirical cost over the complete training data as batch learning methods, e.g. K-SVD, do. Since the selected volumes are supposed to be i.i.d. samples from the underlying distribution, decomposition coefficients attained from the trained dictionary are desirable for sparse representation. Theoretically, it is proved that the proposed STOL achieves better approximation for sparse representation than K-SVD and maintains both structured sparsity and hierarchical sparsity. It is shown to outperform batch gradient descent methods (K-SVD) in terms of convergence speed and computational complexity, and its upper bound for prediction error is asymptotically equal to the training error. With lower computational complexity, extensive experiments validate that the STOL-based coding scheme achieves performance improvements over H.264/AVC, HEVC, and existing super-resolution based methods in rate-distortion performance and visual quality.
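
    The per-sample update pattern the abstract describes (draw one volume, sparse-code it, take one stochastic gradient step on the dictionary) can be sketched as follows. This is a generic online dictionary-learning step under our own naming, with ISTA standing in for the sparse-coding stage; it is not the authors' implementation.

    ```python
    import numpy as np

    def online_dict_step(D, x, lam=0.1, lr=0.01, code_iters=50):
        """One online iteration: sparse-code a single sample x against the
        current dictionary D via ISTA, then take one stochastic gradient
        step on the reconstruction cost for that sample only, instead of
        optimizing the empirical cost over all training data as in K-SVD."""
        L = np.linalg.norm(D, 2) ** 2        # Lipschitz constant of the gradient
        a = np.zeros(D.shape[1])
        for _ in range(code_iters):          # ISTA: min 0.5||x-Da||^2 + lam||a||_1
            a -= D.T @ (D @ a - x) / L
            a = np.sign(a) * np.maximum(np.abs(a) - lam / L, 0.0)
        D -= lr * np.outer(D @ a - x, a)     # stochastic gradient step on D
        D /= np.maximum(np.linalg.norm(D, axis=0, keepdims=True), 1e-12)
        return D, a                          # unit-norm atoms, current code
    ```

    Because each step touches a single sample, the cost per iteration stays constant as the training set grows, which is the source of the convergence-speed advantage over batch K-SVD updates.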

  6. Good Trellises for IC Implementation of Viterbi Decoders for Linear Block Codes

    NASA Technical Reports Server (NTRS)

    Moorthy, Hari T.; Lin, Shu; Uehara, Gregory T.

    1997-01-01

    This paper investigates trellis structures of linear block codes for the integrated circuit (IC) implementation of Viterbi decoders capable of achieving high decoding speed while satisfying a constraint on the structural complexity of the trellis in terms of the maximum number of states at any particular depth. Only uniform sectionalizations of the code trellis diagram are considered. An upper bound on the number of parallel and structurally identical (or isomorphic) subtrellises in a proper trellis for a code, without exceeding the maximum state complexity of the minimal trellis of the code, is first derived. Parallel structures of trellises with various section lengths for binary BCH and Reed-Muller (RM) codes of lengths 32 and 64 are analyzed. Next, the complexity of IC implementation of a Viterbi decoder based on an L-section trellis diagram for a code is investigated. A structural property of a Viterbi decoder called add-compare-select (ACS) connectivity, which is related to state connectivity, is introduced. This parameter affects the complexity of wire routing (interconnections within the IC). The effect of five parameters, namely: (1) effective computational complexity; (2) complexity of the ACS circuit; (3) traceback complexity; (4) ACS connectivity; and (5) branch complexity of a trellis diagram, on the very large scale integration (VLSI) complexity of a Viterbi decoder is investigated. It is shown that an IC implementation of a Viterbi decoder based on a nonminimal trellis requires less area and is capable of operation at higher speed than one based on the minimal trellis when the commonly used ACS-array architecture is considered.
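
    The ACS recursion whose circuit cost the paper analyzes can be sketched per trellis section as below; this is a generic software rendering (names and data layout are ours), not tied to any particular code's trellis.

    ```python
    import numpy as np

    def viterbi_acs(metrics, branch_metric, predecessors):
        """One trellis-section add-compare-select step. For each state s,
        add each predecessor's path metric to the connecting branch metric,
        compare the candidates, and select the survivor. branch_metric[p][s]
        is the metric of the branch from state p to state s. Each state's
        ACS is independent of the others, which is why the ACS array maps
        naturally onto parallel hardware."""
        n_states = len(predecessors)
        new_metrics = np.empty(n_states)
        survivors = np.empty(n_states, dtype=int)
        for s in range(n_states):
            cands = [metrics[p] + branch_metric[p][s] for p in predecessors[s]]
            best = int(np.argmin(cands))            # compare
            new_metrics[s] = cands[best]            # select survivor metric
            survivors[s] = predecessors[s][best]    # record for traceback
        return new_metrics, survivors
    ```

    The state connectivity (how many predecessors feed each state, and from where) determines the wire routing between ACS blocks, which is the ACS-connectivity parameter the paper relates to interconnect complexity.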

  7. The reliability, accuracy and minimal detectable difference of a multi-segment kinematic model of the foot-shoe complex.

    PubMed

    Bishop, Chris; Paul, Gunther; Thewlis, Dominic

    2013-04-01

    Kinematic models are commonly used to quantify foot and ankle kinematics, yet no marker sets or models have been proven reliable or accurate when wearing shoes. Further, the minimal detectable difference of a developed model is often not reported. We present a kinematic model that is reliable, accurate and sensitive to describe the kinematics of the foot-shoe complex and lower leg during walking gait. In order to achieve this, a new marker set was established, consisting of 25 markers applied on the shoe and skin surface, which informed a four segment kinematic model of the foot-shoe complex and lower leg. Three independent experiments were conducted to determine the reliability, accuracy and minimal detectable difference of the marker set and model. Inter-rater reliability of marker placement on the shoe was proven to be good to excellent (ICC=0.75-0.98) indicating that markers could be applied reliably between raters. Intra-rater reliability was better for the experienced rater (ICC=0.68-0.99) than the inexperienced rater (ICC=0.38-0.97). The accuracy of marker placement along each axis was <6.7 mm for all markers studied. Minimal detectable difference (MDD90) thresholds were defined for each joint; tibiocalcaneal joint--MDD90=2.17-9.36°, tarsometatarsal joint--MDD90=1.03-9.29° and the metatarsophalangeal joint--MDD90=1.75-9.12°. These thresholds proposed are specific for the description of shod motion, and can be used in future research designed at comparing between different footwear. Copyright © 2012 Elsevier B.V. All rights reserved.
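
    For reference, minimal detectable difference thresholds of this kind are commonly derived from the reliability statistics via the standard error of measurement; the formulation below is the common one and is our assumption here, since the paper may compute its MDD90 values differently.

    ```python
    import math

    def mdd90(sd, icc):
        """Minimal detectable difference at the 90% confidence level from
        the common SEM-based formulation: SEM = SD * sqrt(1 - ICC), and
        MDD90 = 1.65 * SEM * sqrt(2) for a test-retest difference score."""
        sem = sd * math.sqrt(1.0 - icc)
        return 1.65 * sem * math.sqrt(2.0)
    ```

    Under this formulation, the joint-angle thresholds quoted above simply mark the smallest between-footwear difference that exceeds measurement noise with 90% confidence.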

  8. Genetic Circuit Performance under Conditions Relevant for Industrial Bioreactors

    PubMed Central

    Moser, Felix; Broers, Nicolette J.; Hartmans, Sybe; Tamsir, Alvin; Kerkman, Richard; Roubos, Johannes A.; Bovenberg, Roel; Voigt, Christopher A.

    2014-01-01

    Synthetic genetic programs promise to enable novel applications in industrial processes. For such applications, the genetic circuits that compose programs will require fidelity in varying and complex environments. In this work, we report the performance of two synthetic circuits in Escherichia coli under industrially relevant conditions, including the selection of media, strain, and growth rate. We test and compare two transcriptional circuits: an AND and a NOR gate. In E. coli DH10B, the AND gate is inactive in minimal media; activity can be rescued by supplementing the media and transferring the gate into the industrial strain E. coli DS68637 where normal function is observed in minimal media. In contrast, the NOR gate is robust to media composition and functions similarly in both strains. The AND gate is evaluated at three stages of early scale-up: 100 ml shake-flask experiments, a 1 ml MTP microreactor, and a 10 L bioreactor. A reference plasmid that constitutively produces a GFP reporter is used to make comparisons of circuit performance across conditions. The AND gate function is quantitatively different at each scale. The output deteriorates late in fermentation after the shift from exponential to constant feed rates, which induces rapid resource depletion and changes in growth rate. In addition, one of the output states of the AND gate failed in the bioreactor, effectively making it only responsive to a single input. Finally, cells carrying the AND gate show considerably less accumulation of biomass. Overall, these results highlight challenges and suggest modified strategies for developing and characterizing genetic circuits that function reliably during fermentation. PMID:23656232

  9. Standards and interdisciplinary treatment of boxing injuries of the head in professional boxing on the basis of an IBF World Championship Fight.

    PubMed

    Dragu, Adrian; Unglaub, Frank; Radomirovic, Sinisa; Schnürer, Stefan; Wagner, Walter; Horch, Raymund E; Hell, Berthold

    2010-12-01

    Boxing injuries are well known in hobby boxing as well as in professional boxing. Especially in professional boxing, it is of great importance to implement and follow prevention, diagnosis and therapy standards in order to prevent, or at least minimize, injuries to the athlete. The ultimate aim would be to establish international prevention, diagnosis and therapy standards for boxing injuries in professional boxing. However, in the short run this aim is unrealistic, as there are too many different professional boxing organisations with different regulations. A realistic short-term aim would be to develop a national standard in order to unify the management and medical treatment of boxing injuries in professional boxing. We present the management and interdisciplinary treatment of a professional boxer with a bilateral open fracture of the mandible sustained during a middleweight IBF World Championship Fight. On the basis of this case we present and discuss the possibilities of interdisciplinary and successful medical treatment. In order to prevent or minimize boxing injuries of professional boxers, annual MRI scans of the head and neck have to be performed as a prevention standard. Furthermore, neurocognitive tests must be performed on a regular basis. Boxing injuries in professional boxing need an interdisciplinary, unbiased and complex analysis directly at the boxing ring. The treatment of the injuries should only be performed in medical centres and thus under constant parameters. The needed qualifications must be learned in mandatory national licence courses for boxing physicians, referees and promoters.

  10. Exposing the Complex III Qo semiquinone radical

    PubMed Central

    Zhang, Haibo; Osyczka, Artur; Dutton, P. L.; Moser, Christopher C.

    2012-01-01

    Complex III Qo site semiquinone has been assigned pivotal roles in productive energy conversion and destructive superoxide generation. After a 30-year search, a genetic heme bH knockout arrests this transient semiquinone EPR radical, revealing the natural engineering balance pitting energy-conserving, short-circuit-minimizing, split electron transfer and catalytic speed against damaging oxygen reduction. PMID:17560537

  11. Accelerated construction

    DOT National Transportation Integrated Search

    2004-01-01

    Accelerated Construction Technology Transfer (ACTT) is a strategic process that uses various innovative techniques, strategies, and technologies to minimize actual construction time, while enhancing quality and safety on today's large, complex multip...

  12. Application of Ultrasound-Guided Core Biopsy to Minimal-Invasively Diagnose Supraclavicular Fossa Tumors and Minimize the Requirement of Invasive Diagnostic Surgery

    PubMed Central

    Chen, Chun-Nan; Lin, Che-Yi; Chi, Fan-Hsiang; Chou, Chen-Han; Hsu, Ya-Ching; Kuo, Yen-Lin; Lin, Chih-Feng; Chen, Tseng-Cheng; Wang, Cheng-Ping; Lou, Pei-Jen; Ko, Jenq-Yuh; Hsiao, Tzu-Yu; Yang, Tsung-Lin

    2016-01-01

    Tumors of the supraclavicular fossa (SC) are clinically challenging because of anatomical complexity and tumor pathological diversity. Because of the varied disease entities and treatment choices for SC tumors, making an accurate decision among numerous differential diagnoses is imperative. Sampling by open biopsy (OB) remains the standard procedure for pathological confirmation. However, the complicated anatomical structures of the SC always render surgical intervention difficult to perform. Ultrasound-guided core biopsy (USCB) is a minimally invasive, office-based procedure for tissue sampling widely applied in many diseases of the head and neck. This study aims to evaluate the clinical efficacy and utility of USCB as the sampling method for SC tumors. From 2009 to 2014, consecutive patients who presented with clinical symptoms and signs of supraclavicular tumors and were scheduled to receive sampling procedures for diagnostic confirmation were recruited. The patients received USCB or OB, respectively, for the initial tissue sampling. The accurate diagnosis rate based on pathological results was 90.2% for USCB and 93.6% for OB. No significant difference was noted between the USCB and OB groups in terms of diagnostic accuracy or the percentage of inadequate specimens. All cases in the USCB group had the sampling procedure completed within 10 minutes, but not in the OB group. No scars larger than 1 cm were found with USCB. Only patients in the OB group needed general anesthesia and hospitalization and had postoperative scars. Accordingly, USCB can serve as the first-line sampling tool for SC tumors, with high diagnostic accuracy, minimal invasiveness, and low medical cost. PMID:26825877

  13. Stability, Consistency and Performance of Distribution Entropy in Analysing Short Length Heart Rate Variability (HRV) Signal.

    PubMed

    Karmakar, Chandan; Udhayakumar, Radhagayathri K; Li, Peng; Venkatesh, Svetha; Palaniswami, Marimuthu

    2017-01-01

    Distribution entropy (DistEn) is a recently developed measure of complexity used to analyse heart rate variability (HRV) data. Its calculation requires two input parameters: the embedding dimension m, and the number of bins M, which replaces the tolerance parameter r used by the existing approximation entropy (ApEn) and sample entropy (SampEn) measures. The performance of DistEn can also be affected by the data length N. In our previous studies, we analyzed the stability and performance of DistEn with respect to one parameter (m or M) or a combination of two parameters (N and M); however, the impact of varying all three input parameters on DistEn has not yet been studied. Since DistEn is predominantly aimed at analysing short-length HRV signals, it is important to comprehensively study the stability, consistency, and performance of the measure using multiple case studies. In this study, we examined the impact of changing input parameters on DistEn for synthetic and physiological signals. We also compared the variation and performance of DistEn in distinguishing physiological (Elderly from Young) and pathological (Healthy from Arrhythmia) conditions with ApEn and SampEn. The results showed that DistEn values are minimally affected by variations of the input parameters compared to ApEn and SampEn. DistEn also showed the most consistent and the best performance in differentiating physiological and pathological conditions across input-parameter settings among the reported complexity measures. In conclusion, DistEn is found to be the best of these measures for analysing short-length HRV time series.
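
    The calculation itself is compact. The following is a minimal NumPy sketch of the DistEn recipe as commonly described (time-delay embedding, pairwise Chebyshev distances, an M-bin histogram, normalized Shannon entropy); the exact embedding and binning conventions of the paper may differ, and the function name and test signals are illustrative.

```python
import numpy as np

def dist_en(x, m=2, M=512):
    """Distribution entropy of a 1-D series (sketch).

    m is the embedding dimension; M is the number of histogram bins,
    which replaces the tolerance r of ApEn/SampEn."""
    x = np.asarray(x, dtype=float)
    # Time-delay embedding: each row is one m-dimensional state vector.
    emb = np.lib.stride_tricks.sliding_window_view(x, m)
    n = emb.shape[0]
    # Chebyshev (max-norm) distance between every distinct pair of vectors.
    d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
    d = d[np.triu_indices(n, k=1)]
    # Empirical distribution of the distances, then normalized entropy.
    p, _ = np.histogram(d, bins=M)
    p = p / p.sum()
    p = p[p > 0]                      # convention: 0 * log(0) = 0
    return -np.sum(p * np.log2(p)) / np.log2(M)

# White noise should score higher (more complex) than a pure tone.
rng = np.random.default_rng(0)
print(dist_en(rng.standard_normal(500)))
print(dist_en(np.sin(np.linspace(0, 20 * np.pi, 500))))
```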

  14. Performance of the new automated Abbott RealTime MTB assay for rapid detection of Mycobacterium tuberculosis complex in respiratory specimens.

    PubMed

    Chen, J H K; She, K K K; Kwong, T-C; Wong, O-Y; Siu, G K H; Leung, C-C; Chang, K-C; Tam, C-M; Ho, P-L; Cheng, V C C; Yuen, K-Y; Yam, W-C

    2015-09-01

    The automated high-throughput Abbott RealTime MTB real-time PCR assay has recently been launched for Mycobacterium tuberculosis complex (MTBC) clinical diagnosis; this study evaluated its performance. We first compared its diagnostic performance with the Roche Cobas TaqMan MTB assay on 214 clinical respiratory specimens. Prospective analysis of a total of 520 specimens was then performed to further evaluate the Abbott assay. The Abbott assay showed a lower limit of detection of 22.5 AFB/ml, more sensitive than the Cobas assay (167.5 AFB/ml). The two assays demonstrated a significant difference in diagnostic performance (McNemar's test; P = 0.0034), with the Abbott assay presenting a significantly higher area under the curve (AUC) than the Cobas assay (1.000 vs 0.880; P = 0.0002). The Abbott assay demonstrated extremely low PCR inhibition on clinical respiratory specimens and required only a very short manual handling time (0.5 h), which could help to improve laboratory management. In the prospective analysis, the overall estimates of sensitivity and specificity of the Abbott assay were both 100% among smear-positive specimens, whereas for smear-negative specimens they were 96.7% and 96.1%, respectively. No cross-reactivity with non-tuberculous mycobacterial species was observed. The superior sensitivity of the Abbott assay for detecting MTBC in smear-negative specimens could further minimize the risk of false-negative MTBC detection. The new Abbott RealTime MTB assay has good diagnostic performance and can be a useful tool for rapid MTBC detection in clinical laboratories.

  15. Stability, Consistency and Performance of Distribution Entropy in Analysing Short Length Heart Rate Variability (HRV) Signal

    PubMed Central

    Karmakar, Chandan; Udhayakumar, Radhagayathri K.; Li, Peng; Venkatesh, Svetha; Palaniswami, Marimuthu

    2017-01-01

    Distribution entropy (DistEn) is a recently developed measure of complexity used to analyse heart rate variability (HRV) data. Its calculation requires two input parameters: the embedding dimension m, and the number of bins M, which replaces the tolerance parameter r used by the existing approximation entropy (ApEn) and sample entropy (SampEn) measures. The performance of DistEn can also be affected by the data length N. In our previous studies, we analyzed the stability and performance of DistEn with respect to one parameter (m or M) or a combination of two parameters (N and M); however, the impact of varying all three input parameters on DistEn has not yet been studied. Since DistEn is predominantly aimed at analysing short-length HRV signals, it is important to comprehensively study the stability, consistency, and performance of the measure using multiple case studies. In this study, we examined the impact of changing input parameters on DistEn for synthetic and physiological signals. We also compared the variation and performance of DistEn in distinguishing physiological (Elderly from Young) and pathological (Healthy from Arrhythmia) conditions with ApEn and SampEn. The results showed that DistEn values are minimally affected by variations of the input parameters compared to ApEn and SampEn. DistEn also showed the most consistent and the best performance in differentiating physiological and pathological conditions across input-parameter settings among the reported complexity measures. In conclusion, DistEn is found to be the best of these measures for analysing short-length HRV time series. PMID:28979215

  16. Performance of chip seals using local and minimally processed aggregates for preservation of low traffic volume roadways.

    DOT National Transportation Integrated Search

    2013-07-01

    This report documents the performance of two low traffic volume experimental chip seals constructed using : locally available, minimally processed sand and gravel aggregates after four winters of service. The projects : were constructed by CDOT maint...

  17. Competitive two-agent scheduling problems to minimize the weighted combination of makespans in a two-machine open shop

    NASA Astrophysics Data System (ADS)

    Jiang, Fuhong; Zhang, Xingong; Bai, Danyu; Wu, Chin-Chia

    2018-04-01

    In this article, a competitive two-agent scheduling problem in a two-machine open shop is studied. The objective is to minimize the weighted sum of the makespans of the two competing agents. A complexity proof is presented for minimizing the weighted combination of the two agents' makespans when the weight α assigned to agent B is arbitrary. Furthermore, two pseudo-polynomial-time algorithms based on the largest alternate processing time (LAPT) rule are presented. Finally, two approximation algorithms are presented for the case where the weight equals one, and another approximation algorithm for the case where the weight is larger than one.

  18. Low-Complexity Polynomial Channel Estimation in Large-Scale MIMO With Arbitrary Statistics

    NASA Astrophysics Data System (ADS)

    Shariati, Nafiseh; Bjornson, Emil; Bengtsson, Mats; Debbah, Merouane

    2014-10-01

    This paper considers pilot-based channel estimation in large-scale multiple-input multiple-output (MIMO) communication systems, also known as massive MIMO, where there are hundreds of antennas at one side of the link. Motivated by the fact that computational complexity is one of the main challenges in such systems, a set of low-complexity Bayesian channel estimators, coined Polynomial ExpAnsion CHannel (PEACH) estimators, is introduced for arbitrary channel and interference statistics. While the conventional minimum mean square error (MMSE) estimator has cubic complexity in the dimension of the covariance matrices, due to an inversion operation, our proposed estimators significantly reduce this to square complexity by approximating the inverse with an L-degree matrix polynomial. The coefficients of the polynomial are optimized to minimize the mean square error (MSE) of the estimate. We show numerically that near-optimal MSEs are achieved with low polynomial degrees. We also derive the exact computational complexity of the proposed estimators, in terms of floating-point operations (FLOPs), by which we prove that the proposed estimators outperform the conventional estimators in large-scale MIMO systems of practical dimensions while providing reasonable MSEs. Moreover, we show that L need not scale with the system dimensions to maintain a certain normalized MSE. By analyzing different interference scenarios, we observe that the relative MSE loss of using the low-complexity PEACH estimators is smaller in realistic scenarios with pilot contamination. On the other hand, PEACH estimators are not well suited for noise-limited scenarios with high pilot power; therefore, we also introduce the low-complexity diagonalized estimator that performs well in this regime. Finally, we ...
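
    To see why a polynomial can stand in for the inverse, the sketch below approximates C^{-1}y with a truncated Neumann series using only matrix-vector products, which is the square-complexity mechanism behind polynomial-expansion estimators. The actual PEACH estimators optimize the polynomial coefficients for MSE rather than using these simple series weights, and the toy covariance and dimensions here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy Hermitian positive-definite matrix standing in for the pilot
# observation covariance C = R_channel + R_interference+noise.
n = 64
A = rng.standard_normal((n, n))
C = A @ A.T / n + np.eye(n)
y = rng.standard_normal(n)          # received pilot observation (toy)

# Degree-L polynomial approximation of C^{-1} y via the Neumann series
#   C^{-1} = a * sum_{l=0}^{inf} (I - a*C)^l,  convergent for 0 < a < 2/lambda_max.
# Each term costs one matrix-vector product, so the total cost is O(L*n^2)
# rather than the O(n^3) of a direct inverse.  (The eigenvalue call below is
# itself cubic; a practical estimator would use cheap spectral bounds.)
lam = np.linalg.eigvalsh(C)
a = 2.0 / (lam[0] + lam[-1])        # fastest-converging scaling
L = 8
z = np.zeros(n)
term = y.copy()
for _ in range(L + 1):
    z += term
    term = term - a * (C @ term)    # apply (I - a*C)
x_poly = a * z

x_exact = np.linalg.solve(C, y)
print(np.linalg.norm(x_poly - x_exact) / np.linalg.norm(x_exact))
```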

  19. Self-inflicted long complex urethro-vesical foreign body: is open surgery always needed?

    PubMed Central

    Garg, Manish; Kumar, Manoj; Sankhwar, Satyanarayan; Singh, Vishwajeet

    2013-01-01

    In this case report, we describe our experience with a self-inflicted long, complex urethrovesical foreign body managed suprapubically through a minimally invasive technique. A 21-year-old man on antipsychotic treatment for the past 10 years presented with a long electric cable wire in his bladder, with the distal end in the penile urethra, complaining of voiding difficulty and gross haematuria. An attempt at gentle retrieval of the wire with cystoscopic forceps was unsuccessful owing to a very complex knot of cable in the bladder. To avoid open surgery such as suprapubic cystotomy, a percutaneous minimally invasive approach was planned. Access to the bladder was achieved by suprapubic puncture, placement of a guide-wire, and serial dilation of the suprapubic tract. With the help of a nephroscope passed through the suprapubic tract, the cable wire was retrieved antegradely without causing undue trauma to the bladder or urethra. PMID:23749820

  20. System for decision analysis support on complex waste management issues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shropshire, D.E.

    1997-10-01

    A software system called the Waste Flow Analysis has been developed and applied to complex environmental management processes for the United States Department of Energy (US DOE). The system can evaluate proposed methods of waste retrieval, treatment, storage, transportation, and disposal. Analysts can evaluate various scenarios to see the impacts on waste flows and schedules, costs, and health and safety risks. Decision analysis capabilities have been integrated into the system to help identify preferred alternatives based on specific objectives, which may be to maximize the waste moved to final disposition during a given time period, to minimize health risks, to minimize costs, or combinations of these objectives. The decision analysis capabilities can support evaluation of large and complex problems rapidly and under conditions of variable uncertainty. The system is being used to evaluate environmental management strategies to safely disposition wastes in the next ten years and reduce the environmental legacy resulting from nuclear material production over the past forty years.

  1. Protective effects of a topical antioxidant complex containing vitamins C and E and ferulic acid against ultraviolet irradiation-induced photodamage in Chinese women.

    PubMed

    Wu, Yan; Zheng, Xin; Xu, Xue-Gang; Li, Yuan-Hong; Wang, Bin; Gao, Xing-Hua; Chen, Hong-Duo; Yatskayer, Margarita; Oresajo, Christian

    2013-04-01

    The objective of the study was to investigate whether a topical antioxidant complex containing vitamins C and E and ferulic acid can protect against solar-simulated ultraviolet irradiation (ssUVR)-induced acute photodamage in human skin. Twelve healthy female Chinese subjects were enrolled in this study. Four unexposed sites on the dorsal skin were marked for the experiment. Products containing the antioxidant complex and the vehicle were applied to two sites, respectively, for 4 consecutive days. On day 4, the antioxidant complex-treated site, the vehicle-treated site, and an untreated site (positive control) received ssUVR (5 times the minimal erythema dose). The fourth site (negative control) received neither ssUVR nor treatment. Digital photographs were taken, and skin color was measured pre- and postirradiation. Skin biopsies were obtained 24 hours after exposure to ssUVR for hematoxylin and eosin and immunohistochemical staining. A single dose of ssUVR at 5 times the minimal erythema dose induced substantial sunburn cell formation, thymine dimer formation, overexpression of p53 protein, and depletion of CD1a+ Langerhans cells. The antioxidant complex containing vitamins C and E and ferulic acid conferred significant protection against these biological events compared with the other irradiated sites. A topical antioxidant complex containing vitamins C and E and ferulic acid thus has potential photoprotective effects against ssUVR-induced acute photodamage in human skin.

  2. Controlled loading of cryoprotectants (CPAs) to oocyte with linear and complex CPA profiles on a microfluidic platform.

    PubMed

    Heo, Yun Seok; Lee, Ho-Joon; Hassell, Bryan A; Irimia, Daniel; Toth, Thomas L; Elmoazzen, Heidi; Toner, Mehmet

    2011-10-21

    Oocyte cryopreservation has become an essential tool in the treatment of infertility by preserving oocytes for women undergoing chemotherapy. However, despite recent advances, pregnancy rates from all cryopreserved oocytes remain low. The inevitable use of the cryoprotectants (CPAs) during preservation affects the viability of the preserved oocytes and pregnancy rates either through CPA toxicity or osmotic injury. Current protocols attempt to reduce CPA toxicity by minimizing CPA concentrations, or by minimizing the volume changes via the step-wise addition of CPAs to the cells. Although the step-wise addition decreases osmotic shock to oocytes, it unfortunately increases toxic injuries due to the long exposure times to CPAs. To address limitations of current protocols and to rationally design protocols that minimize the exposure to CPAs, we developed a microfluidic device for the quantitative measurements of oocyte volume during various CPA loading protocols. We spatially secured a single oocyte on the microfluidic device, created precisely controlled continuous CPA profiles (step-wise, linear and complex) for the addition of CPAs to the oocyte and measured the oocyte volumetric response to each profile. With both linear and complex profiles, we were able to load 1.5 M propanediol to oocytes in less than 15 min and with a volumetric change of less than 10%. Thus, we believe this single oocyte analysis technology will eventually help future advances in assisted reproductive technologies and fertility preservation.

  3. Knee point search using cascading top-k sorting with minimized time complexity.

    PubMed

    Wang, Zheng; Tseng, Shian-Shyong

    2013-01-01

    Anomaly detection systems and many other applications are frequently confronted with the problem of finding the largest knee point in the sorted curve of a set of unsorted points. This paper proposes an efficient knee point search algorithm with minimized time complexity, using cascading top-k sorting, for the case where a prior probability distribution of the knee point is known. First, a top-k sort algorithm is proposed based on a quicksort variation. We divide the knee point search problem into multiple steps, and in each step an optimization problem for the selection number k is solved, where the objective function is defined as the expected time cost. Because the expected time cost of one step depends on that of subsequent steps, we simplify the optimization problem by minimizing the maximum expected time cost. The posterior probability distribution of the largest knee point and the other parameters are updated before solving the optimization problem in each step. An example of source detection for DNS DoS flooding attacks is provided to illustrate an application of the proposed algorithm.
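
    As a concrete illustration of the top-k idea, the sketch below finds a knee without fully sorting the data: it partially selects the k largest values, locates the knee of that descending prefix geometrically, and enlarges k only when the knee is not safely inside the prefix. The cost-optimal choice of each k from a prior on the knee position, which is the paper's contribution, is replaced here by simple geometric growth; the function names, the acceptance margin, and the knee definition (farthest point below the chord) are illustrative.

```python
import numpy as np

def knee_index(desc):
    """Knee of a descending curve: the point farthest below the chord
    joining its endpoints (one common geometric definition)."""
    x = np.arange(len(desc))
    chord = desc[0] + (desc[-1] - desc[0]) * x / (len(desc) - 1)
    return int(np.argmax(chord - desc))

def knee_point(values, k0=64, grow=4):
    """Locate the knee without a full sort: select and sort only the top
    k values, and cascade to a larger k when the knee is not safely
    inside the sorted prefix."""
    v = np.asarray(values, dtype=float)
    k = min(k0, len(v))
    while True:
        top = np.sort(np.partition(v, len(v) - k)[-k:])[::-1]
        i = knee_index(top)
        # Margin heuristic: accept only a knee in the first half of the prefix.
        if i < k // 2 or k == len(v):
            return i, top
        k = min(k * grow, len(v))

# Toy data: 20 large anomaly scores on top of 10,000 background scores.
rng = np.random.default_rng(2)
scores = np.concatenate([rng.normal(10, 1, 20), rng.normal(0, 1, 10000)])
i, top = knee_point(scores)
print(i, top[i])                       # knee should land near index 20
```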

  4. Enabling Controlling Complex Networks with Local Topological Information.

    PubMed

    Li, Guoqi; Deng, Lei; Xiao, Gaoxi; Tang, Pei; Wen, Changyun; Hu, Wuhua; Pei, Jing; Shi, Luping; Stanley, H Eugene

    2018-03-15

    Complex networks characterize the nature of internal/external interactions in real-world systems, including social, economic, biological, ecological, and technological networks. Two issues remain obstacles to achieving control of large-scale networks: structural controllability, which describes the ability to guide a dynamical system from any initial state to any desired final state in finite time with a suitable choice of inputs; and optimal control, a typical control approach that minimizes the cost of driving the network to a predefined state with a given number of control inputs. For large complex networks without global information on the network topology, both problems remain essentially open. Here we combine graph theory and control theory to tackle the two problems in one go, using only local network topology information. For the structural controllability problem, a distributed local-game matching method is proposed, in which every node plays a simple Bayesian game with local information and local interactions with adjacent nodes, ensuring a suboptimal solution at linear complexity. Starting from any structural controllability solution, a minimizing-longest-control-path method can efficiently reach a good solution for optimal control in large networks. Our results provide solutions for distributed complex network control and demonstrate a way to link structural controllability and optimal control together.
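
    For reference, the global-information baseline that such distributed matching methods approximate fits in a few lines: in the classic maximum-matching formulation of structural controllability, the nodes left unmatched by a maximum matching of the directed network are exactly the driver nodes. The sketch below assumes networkx; the graph, function name, and the convention that a perfectly matched network still needs one driver are illustrative, and the paper's local Bayesian game itself is not reproduced here.

```python
import networkx as nx

def driver_nodes(digraph):
    """Minimum driver-node set via maximum matching (the centralized
    computation that a distributed matching method approximates)."""
    # Bipartite view: every directed edge u->v becomes ('out', u)--('in', v).
    B = nx.Graph()
    left = [('out', u) for u in digraph]
    B.add_nodes_from(left, bipartite=0)
    B.add_nodes_from([('in', v) for v in digraph], bipartite=1)
    B.add_edges_from((('out', u), ('in', v)) for u, v in digraph.edges)
    match = nx.bipartite.hopcroft_karp_matching(B, top_nodes=left)
    # Nodes whose 'in' copy is matched have a matched inbound edge.
    matched_heads = {v for side, v in match if side == 'in'}
    drivers = set(digraph) - matched_heads
    # A perfectly matched network still needs one control input.
    return drivers or {next(iter(digraph))}

G = nx.gnp_random_graph(50, 0.06, seed=3, directed=True)
print(len(driver_nodes(G)))            # number of required driver nodes
```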

  5. His-Tag-Mediated Dimerization of Chemoreceptors Leads to Assembly of Functional Nanoarrays.

    PubMed

    Haglin, Elizabeth R; Yang, Wen; Briegel, Ariane; Thompson, Lynmarie K

    2017-11-07

    Transmembrane chemotaxis receptors are found in bacteria in extended hexagonal arrays stabilized by the membrane and by cytosolic binding partners, the kinase CheA and coupling protein CheW. Models of array architecture and assembly propose receptors cluster into trimers of dimers that associate with one CheA dimer and two CheW monomers to form the minimal "core unit" necessary for signal transduction. Reconstructing in vitro chemoreceptor ternary complexes that are homogeneous and functional and exhibit native architecture remains a challenge. Here we report that His-tag-mediated receptor dimerization with divalent metals is sufficient to drive assembly of nativelike functional arrays of a receptor cytoplasmic fragment. Our results indicate receptor dimerization initiates assembly and precedes formation of ternary complexes with partial kinase activity. Restoration of maximal kinase activity coincides with a shift to larger complexes, suggesting that kinase activity depends on interactions beyond the core unit. We hypothesize that achieving maximal activity requires building core units into hexagons and/or coalescing hexagons into the extended lattice. Overall, the minimally perturbing His-tag-mediated dimerization leads to assembly of chemoreceptor arrays with native architecture and thus serves as a powerful tool for studying the assembly and mechanism of this complex and other multiprotein complexes.

  6. Composite Development and Applications for RLV Tankage

    NASA Technical Reports Server (NTRS)

    Wright, Richard J.; Achary, David C.; McBain, Michael C.

    2003-01-01

    The development of polymer composite cryogenic tanks is a critical step in creating the next generation of launch vehicles. Future launch vehicles need to minimize the gross liftoff weight (GLOW), which is possible due to the 28%-41% reduction in weight that composite materials can provide over current aluminum technology. The development of composite cryogenic tanks, feedlines, and unpressurized structures are key enabling technologies for performance and cost enhancements for Reusable Launch Vehicles (RLVs). The technology development of composite tanks has provided direct and applicable data for feedlines, unpressurized structures, material compatibility, and cryogenic fluid containment for highly loaded complex structures and interfaces. All three types of structure have similar material systems, processing parameters, scaling issues, analysis methodologies, NDE development, damage tolerance, and repair scenarios. Composite cryogenic tankage is the most complex of the 3 areas and provides the largest breakthrough in technology. A building block approach has been employed to bring this family of difficult technologies to maturity. This approach has built up composite materials, processes, design, analysis and test methods technology through a series of composite test programs beginning with the NASP program to meet aggressive performance goals for reusable launch vehicles. In this paper, the development and application of advanced composites for RLV use is described.

  7. Comparative RNA-sequencing of the acarbose producer Actinoplanes sp. SE50/110 cultivated in different growth media.

    PubMed

    Schwientek, Patrick; Wendler, Sergej; Neshat, Armin; Eirich, Christina; Rückert, Christian; Klein, Andreas; Wehmeier, Udo F; Kalinowski, Jörn; Stoye, Jens; Pühler, Alfred

    2013-08-20

    Actinoplanes sp. SE50/110 is known as the producer of the alpha-glucosidase inhibitor acarbose, a potent drug in the treatment of type-2 diabetes mellitus. We conducted the first whole-transcriptome analysis of Actinoplanes sp. SE50/110, using RNA-sequencing technology for comparative gene expression studies between cells grown in maltose minimal medium, maltose minimal medium with trace elements, and glucose complex medium. We first studied the behavior of Actinoplanes sp. SE50/110 cultivations in these three media and found that the different media had a significant impact on growth rate and, in particular, on acarbose production. It was demonstrated that Actinoplanes sp. SE50/110 grew well in all three media, but acarbose biosynthesis was observed only in cultures grown in maltose minimal medium with and without trace elements. When comparing the expression profiles between the maltose minimal media with and without trace elements, only a few significantly differentially expressed genes were found, which mainly code for uptake systems of the metal ions provided in the trace element solution. In contrast, the comparison of expression profiles from maltose minimal medium and glucose complex medium revealed a large number of differentially expressed genes, of which the most conspicuous account for iron storage and uptake. Furthermore, the acarbose gene cluster was found to be highly expressed in maltose-containing media and almost silent in the glucose-containing medium. In addition, a putative antibiotic biosynthesis gene cluster was found to be expressed similarly to the acarbose cluster. Copyright © 2012 Elsevier B.V. All rights reserved.

  8. Time Management in the Operating Room: An Analysis of the Dedicated Minimally Invasive Surgery Suite

    PubMed Central

    Hsiao, Kenneth C.; Machaidze, Zurab

    2004-01-01

    Background: Dedicated minimally invasive surgery suites are available that contain specialized equipment to facilitate endoscopic surgery. Laparoscopy performed in a general operating room is hampered by the multitude of additional equipment that must be transported into the room. The objective of this study was to compare preparation times between procedures performed in traditional operating rooms and in dedicated minimally invasive surgery suites, to see whether operating room efficiency is improved in the specialized room. Methods: The records of 50 patients who underwent laparoscopic procedures between September 2000 and April 2002 were retrospectively reviewed. Twenty-three patients underwent surgery in a general operating room and 18 patients in a minimally invasive surgery suite. Nine patients were excluded because of cystoscopic procedures undergone prior to laparoscopy. Various time points were recorded, from which time intervals such as preanesthesia time, anesthesia induction time, and total preparation time were derived. A 2-tailed, unpaired Student t test was used for statistical analysis. Results: The mean preanesthesia time was significantly shorter in the minimally invasive surgery suite (12.2 minutes) compared with the traditional operating room (17.8 minutes) (P=0.013). Mean anesthesia induction time in the minimally invasive surgery suite (47.5 minutes) was similar to that in the traditional operating room (45.7 minutes) (P=0.734). The average total preparation time for the minimally invasive surgery suite (59.6 minutes) was not significantly shorter than that in the general operating room (63.5 minutes) (P=0.481). Conclusion: The amount of time that elapses between the patient entering the room and anesthesia induction is statistically shorter in a dedicated minimally invasive surgery suite. Laparoscopic surgery is performed more efficiently in a dedicated minimally invasive surgery suite than in a traditional operating room. PMID:15554269

  9. A Defined, Glucose-Limited Mineral Medium for the Cultivation of Listeria spp.

    PubMed Central

    Schneebeli, Rudolf

    2013-01-01

    Members of the genus Listeria are fastidious bacteria with respect to their nutritional requirements, and several minimal media described in the literature fail to support growth of all Listeria spp. Furthermore, strict limitation by a single nutrient, e.g., the carbon source, has not been demonstrated for any of the published minimal media. This is an important prerequisite for defined studies of growth and physiology, including “omics.” Based on a theoretical analysis of previously published mineral media for Listeria, an improved, well-balanced growth medium was designed. It supports the growth, not only of all tested Listeria monocytogenes strains, but of all other Listeria species, with the exception of L. ivanovii. The growth performance of L. monocytogenes strain Scott A was tested in the newly designed medium; glucose served as the only carbon and energy source for growth, whereas neither the supplied amino acids nor the buffering and complexing components (MOPS [morpholinepropanesulfonic acid] and EDTA) supported growth. Omission of amino acids, trace elements, or vitamins, alone or in combination, resulted in considerably reduced biomass yields. Furthermore, we monitored the specific growth rates of various Listeria strains cultivated in the designed mineral medium and compared them to growth in complex medium (brain heart infusion broth [BHI]). The novel mineral medium was optimized for the commonly used strain L. monocytogenes Scott A to achieve optimum cell yields and maximum specific growth rates. This mineral medium is the first published synthetic medium for Listeria that has been shown to be strictly carbon (glucose) limited. PMID:23377938

  10. ECIRS (Endoscopic Combined Intrarenal Surgery) in the Galdakao-modified supine Valdivia position: a new life for percutaneous surgery?

    PubMed

    Cracco, Cecilia Maria; Scoffone, Cesare Marco

    2011-12-01

    Percutaneous nephrolithotomy (PNL) is still the gold-standard treatment for large and/or complex renal stones. Evolution of the endoscopic instrumentation and innovation in surgical skills have improved its success rate and reduced perioperative morbidity. ECIRS (Endoscopic Combined IntraRenal Surgery) is a new way of performing PNL in a modified supine position, approaching the renal cavities both antegradely and retrogradely, and exploiting the full array of endourologic equipment. ECIRS summarizes the main issues recently debated about PNL. The recent literature regarding supine PNL and ECIRS is reviewed here, namely regarding patient positioning, synergy between operators, procedures, instrumentation, accessories and diagnostic tools, step-by-step standardization along with versatility of the surgical sequence, minimization of radiation exposure, extension to special and/or complex patients, and limitation of postoperative renal damage. Supine PNL and ECIRS are not superior to prone PNL in terms of urological results, but they guarantee undeniable anesthesiological and management advantages for both patient and operators. In particular, ECIRS requires from the surgeon a permanent mental attitude toward synergy, standardized surgical steps, versatility, and adherence to the ongoing clinical requirements. ECIRS can also be performed in special cases, irrespective of age or body habitus. The use of flexible endoscopes during ECIRS contributes to minimizing radiation exposure, hemorrhagic risk, and post-PNL renal damage. ECIRS may be considered an evolution of the PNL procedure. Its proposal has the merit of having triggered critical analysis of the various PNL steps and of patient positioning, and of having transformed the old static PNL into an updated approach.

  11. Automated distribution system management for multichannel space power systems

    NASA Technical Reports Server (NTRS)

    Fleck, G. W.; Decker, D. K.; Graves, J.

    1983-01-01

    A NASA-sponsored study of space power distribution system technology is in progress to develop an autonomously managed power system (AMPS) for large space power platforms. The proposed multichannel, multikilowatt, utility-type power subsystem presents new survivability requirements and increased subsystem complexity. The computer controls under development for the power management system must optimize power subsystem performance and minimize the life-cycle cost of the platform. A distribution system management philosophy has been formulated that incorporates these constraints; its implementation using a TI9900 microprocessor and FORTH as the programming language is presented. The approach offers a novel solution to the perplexing problem of determining the optimal combination of loads that should be connected to each power channel in a versatile electrical distribution concept.
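
    The load-combination problem at the end can be illustrated with the simplest balancing heuristic: sort the loads by size and always place the next one on the least-loaded channel. The actual AMPS management algorithm is not described at this level of detail, and it would add load priorities, fault states, and source constraints; the function name and all numbers below are illustrative.

```python
def assign_loads(loads, n_channels, capacity):
    """Largest-first greedy balancing of loads (watts) across channels,
    a simple stand-in for the channel-assignment problem the study poses."""
    channels = [[] for _ in range(n_channels)]
    used = [0.0] * n_channels
    for w in sorted(loads, reverse=True):
        i = min(range(n_channels), key=lambda k: used[k])
        # If it does not fit on the least-loaded channel, it fits nowhere.
        if used[i] + w > capacity:
            raise ValueError(f"load {w} W does not fit on any channel")
        channels[i].append(w)
        used[i] += w
    return channels, used

chs, used = assign_loads([1200, 800, 650, 500, 450, 300, 300, 150],
                         n_channels=3, capacity=2000)
print(used)      # channel totals should come out nearly balanced
```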

  12. One way Doppler Extractor. Volume 2: Digital VCO technique

    NASA Technical Reports Server (NTRS)

    Nossen, E. J.; Starner, E. R.

    1974-01-01

    A feasibility analysis and trade-offs for a one-way Doppler extractor using digital VCO techniques are presented. The method of Doppler measurement involves the use of a digital phase-locked loop; once this loop is locked to the incoming signal, the precise frequency, and hence the Doppler component, can be determined directly from the contents of the digital control register. The only serious error source is internally generated noise. Techniques are presented for minimizing this error source and achieving an accuracy of 0.01 Hz over a one-second averaging period. A number of digitally controlled oscillators were analyzed from a performance and complexity point of view. The most promising technique uses an arithmetic synthesizer as a digital waveform generator.
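
    The arithmetic-synthesizer choice is easy to see in miniature: a phase accumulator advanced by a frequency control word produces a waveform whose frequency is known exactly from the register contents, which is what lets the Doppler component be read out digitally. The sketch below is a toy direct digital synthesizer; the clock rate, accumulator width, and table size are illustrative, not the report's values.

```python
import numpy as np

N_BITS = 32                   # phase accumulator width
F_CLK = 1.0e6                 # clock rate, Hz (illustrative)
TABLE = np.sin(2 * np.pi * np.arange(256) / 256)   # one-cycle sine table

def dds(f_out, n_samples):
    """Generate a sinusoid by phase accumulation: each tick adds the
    frequency control word; the top 8 bits address the sine table."""
    fcw = round(f_out * 2**N_BITS / F_CLK)          # frequency control word
    phase = (fcw * np.arange(n_samples)) % 2**N_BITS
    return TABLE[(phase >> (N_BITS - 8)) & 0xFF]

# Frequency resolution is F_CLK / 2**N_BITS, about 0.23 mHz here, so a
# 0.01 Hz Doppler readout from the register contents is comfortably within
# reach of the representation itself.
samples = dds(12345.6, 4096)
```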

  13. Network immunization under limited budget using graph spectra

    NASA Astrophysics Data System (ADS)

    Zahedi, R.; Khansari, M.

    2016-03-01

    In this paper, we propose a new algorithm that minimizes the worst expected growth of an epidemic by reducing the size of the largest connected component (LCC) of the underlying contact network. The proposed algorithm is applicable to any level of available resources and, in contrast to the greedy approaches of most immunization strategies, selects nodes simultaneously. In each iteration, the proposed method partitions the LCC into two groups, which are the best candidates for communities in that component that the available resources are sufficient to separate. Using Laplacian spectral partitioning, the proposed method performs community detection with a time complexity that rivals that of the best previous methods. Experiments show that our method outperforms targeted immunization approaches in both real and synthetic networks.
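
    The partitioning step can be sketched directly: the sign pattern of the Fiedler vector (the eigenvector for the second-smallest Laplacian eigenvalue) splits a connected component into its two best community candidates, and the edges crossing the split mark where immunization resources would separate them. A dense-matrix sketch assuming networkx; the budget logic and the iteration over the new LCC are omitted, and the graph is illustrative.

```python
import numpy as np
import networkx as nx

def fiedler_split(G):
    """Bipartition a connected graph by the sign of its Fiedler vector."""
    nodes = list(G)
    L = nx.laplacian_matrix(G, nodelist=nodes).toarray().astype(float)
    _, vecs = np.linalg.eigh(L)            # eigenvalues in ascending order
    fv = dict(zip(nodes, vecs[:, 1]))      # index 1 = Fiedler vector
    side = {n: fv[n] >= 0 for n in nodes}
    cut = [(u, v) for u, v in G.edges if side[u] != side[v]]
    return side, cut

# Two dense communities joined by a short bridge: the split recovers them,
# and immunizing one endpoint of each cut edge then disconnects the LCC.
G = nx.barbell_graph(10, 2)
side, cut = fiedler_split(G)
print(sum(side.values()), len(side) - sum(side.values()), cut)
```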

  14. Transoral robotic surgery for neurogenic tumors of the prestyloid parapharyngeal space.

    PubMed

    Lee, Hyoung Shin; Kim, Jinna; Lee, Hyun Jin; Koh, Yoon Woo; Choi, Eun Chang

    2012-08-01

    The parapharyngeal space is a difficult area for a surgical approach due to anatomical complexity. We performed a minimally invasive and precise surgical technique to remove neurogenic tumors of the prestyloid parapharyngeal space using transoral robotic instrumentation. The mass was successfully removed in the two cases with three-dimensional visualization providing an excellent view of the resection margin and the dissection plane preserving the vital structures. An adequate resection margin was acquired, and no violation of the tumor capsule occurred. No significant complications were noted. Transoral robotic surgery was feasible for neurogenic tumors of the prestyloid parapharyngeal space, providing a sufficient resection margin and delicate dissection through excellent surgical views and instrumentation. Copyright © 2012. Published by Elsevier Ireland Ltd.

  15. Protein-ligand docking using FFT based sampling: D3R case study.

    PubMed

    Padhorny, Dzmitry; Hall, David R; Mirzaei, Hanieh; Mamonov, Artem B; Moghadasi, Mohammad; Alekseenko, Andrey; Beglov, Dmitri; Kozakov, Dima

    2018-01-01

    Fast Fourier transform (FFT) based approaches have been successful in application to modeling of relatively rigid protein-protein complexes. Recently, we have been able to adapt the FFT methodology to treatment of flexible protein-peptide interactions. Here, we report our latest attempt to expand the capabilities of the FFT approach to treatment of flexible protein-ligand interactions in application to the D3R PL-2016-1 challenge. Based on the D3R assessment, our FFT approach in conjunction with Monte Carlo minimization off-grid refinement was among the top performing methods in the challenge. The potential advantage of our method is its ability to globally sample the protein-ligand interaction landscape, which will be explored in further applications.
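
    The FFT trick that makes global translational sampling cheap can be shown on toy occupancy grids in the classic Katchalski-Katzir style: surface voxels of the receptor score +1, buried core voxels carry a clash penalty, and a single FFT cross-correlation scores every translation of the ligand at once. Rotational sampling, real energy functions, and the off-grid Monte Carlo refinement used in this work are omitted; all grids and weights below are illustrative.

```python
import numpy as np
from scipy.ndimage import binary_erosion

n = 32
occ = np.zeros((n, n, n), dtype=bool)
occ[8:20, 8:20, 8:20] = True                     # toy receptor blob
core = binary_erosion(occ)                       # interior voxels
# Surface rewards contact (+1); core penalizes overlap (clash, -15).
receptor = np.where(occ & ~core, 1.0, 0.0) + np.where(core, -15.0, 0.0)

ligand = np.zeros((n, n, n))
ligand[:3, :3, :3] = 1.0                         # toy rigid ligand

# Cross-correlation theorem: score(t) = sum_x receptor(x) * ligand(x - t),
# for all n^3 translations t at O(n^3 log n) cost instead of O(n^6).
score = np.real(np.fft.ifftn(np.fft.fftn(receptor) *
                             np.fft.fftn(ligand).conj()))
t = np.unravel_index(np.argmax(score), score.shape)
print(t, score[t])   # best translation hugs the surface without clashing
```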

  16. Concatenated coding systems employing a unit-memory convolutional code and a byte-oriented decoding algorithm

    NASA Technical Reports Server (NTRS)

    Lee, L.-N.

    1977-01-01

    Concatenated coding systems utilizing a convolutional code as the inner code and a Reed-Solomon code as the outer code are considered. In order to obtain very reliable communications over a very noisy channel with relatively modest coding complexity, it is proposed to concatenate a byte-oriented unit-memory convolutional code with an RS outer code whose symbol size is one byte. It is further proposed to utilize a real-time minimal-byte-error probability decoding algorithm, together with feedback from the outer decoder, in the decoder for the inner convolutional code. The performance of the proposed concatenated coding system is studied, and the improvement over conventional concatenated systems due to each additional feature is isolated.

  17. Concatenated coding systems employing a unit-memory convolutional code and a byte-oriented decoding algorithm

    NASA Technical Reports Server (NTRS)

    Lee, L. N.

    1976-01-01

    Concatenated coding systems utilizing a convolutional code as the inner code and a Reed-Solomon code as the outer code are considered. In order to obtain very reliable communications over a very noisy channel with relatively small coding complexity, it is proposed to concatenate a byte oriented unit memory convolutional code with an RS outer code whose symbol size is one byte. It is further proposed to utilize a real time minimal byte error probability decoding algorithm, together with feedback from the outer decoder, in the decoder for the inner convolutional code. The performance of the proposed concatenated coding system is studied, and the improvement over conventional concatenated systems due to each additional feature is isolated.

  18. Graph cuts for curvature based image denoising.

    PubMed

    Bae, Egil; Shi, Juan; Tai, Xue-Cheng

    2011-05-01

    Minimization of total variation (TV) is a well-known method for image denoising. Recently, the relationship between TV minimization problems and binary MRF models has been much explored. This has resulted in some very efficient combinatorial optimization algorithms for the TV minimization problem in the discrete setting via graph cuts. To overcome limitations, such as staircasing effects, of the relatively simple TV model, variational models based upon higher order derivatives have been proposed. The Euler's elastica model is one such higher order model of central importance, which minimizes the curvature of all level lines in the image. Traditional numerical methods for minimizing the energy in such higher order models are complicated and computationally complex. In this paper, we will present an efficient minimization algorithm based upon graph cuts for minimizing the energy in the Euler's elastica model, by simplifying the problem to that of solving a sequence of easy graph representable problems. This sequence has connections to the gradient flow of the energy function, and converges to a minimum point. The numerical experiments show that our new approach is more effective in maintaining smooth visual results while preserving sharp features better than TV models.
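
    For orientation, the first-order TV baseline itself can be minimized with a few lines of gradient descent on a smoothed energy; the graph-cut machinery of the paper replaces exactly this kind of slow continuous minimization, and the elastica model adds the curvature term not present here. A sketch with illustrative parameters, not the paper's algorithm:

```python
import numpy as np

def tv_denoise(img, lam=2.0, step=0.05, eps=1e-3, n_iter=300):
    """ROF-style TV denoising by gradient descent on the smoothed energy
    sum sqrt(|grad u|^2 + eps) + (lam/2) * sum (u - f)^2,
    with periodic boundaries via np.roll."""
    u = img.copy()
    for _ in range(n_iter):
        ux = np.roll(u, -1, axis=1) - u          # forward differences
        uy = np.roll(u, -1, axis=0) - u
        mag = np.sqrt(ux ** 2 + uy ** 2 + eps)
        px, py = ux / mag, uy / mag              # normalized gradient field
        div = px - np.roll(px, 1, axis=1) + py - np.roll(py, 1, axis=0)
        u -= step * (lam * (u - img) - div)      # descend the energy
    return u

# Noisy square: TV smoothing should reduce the error while keeping edges
# sharp (at the price of the staircasing that higher-order models address).
rng = np.random.default_rng(4)
clean = np.zeros((64, 64)); clean[16:48, 16:48] = 1.0
noisy = clean + 0.3 * rng.standard_normal(clean.shape)
print(np.abs(noisy - clean).mean(), np.abs(tv_denoise(noisy) - clean).mean())
```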

  19. Construction of a minimal genome as a chassis for synthetic biology.

    PubMed

    Sung, Bong Hyun; Choe, Donghui; Kim, Sun Chang; Cho, Byung-Kwan

    2016-11-30

    Microbial diversity and complexity pose challenges in understanding the voluminous genetic information produced from whole-genome sequences, bioinformatics and high-throughput '-omics' research. These challenges can be overcome by a core blueprint of a genome drawn with a minimal gene set, which is essential for life. Systems biology and large-scale gene inactivation studies have estimated the number of essential genes to be ∼300-500 in many microbial genomes. On the basis of the essential gene set information, minimal-genome strains have been generated using sophisticated genome engineering techniques, such as genome reduction and chemical genome synthesis. Current size-reduced genomes are not perfect minimal genomes, but chemically synthesized genomes have just been constructed. Some minimal genomes provide various desirable functions for bioindustry, such as improved genome stability, increased transformation efficacy and improved production of biomaterials. The minimal genome as a chassis genome for synthetic biology can be used to construct custom-designed genomes for various practical and industrial applications. © 2016 The Author(s). published by Portland Press Limited on behalf of the Biochemical Society.

  20. The Intersection of Task-Based Interaction, Task Complexity, and Working Memory: L2 Question Development through Recasts in a Laboratory Setting

    ERIC Educational Resources Information Center

    Kim, YouJin; Payant, Caroline; Pearson, Pamela

    2015-01-01

    The extent to which individual differences in cognitive abilities affect the relationship among task complexity, attention to form, and second language development has been addressed only minimally in the cognition hypothesis literature. The present study explores how reasoning demands in tasks and working memory (WM) capacity predict learners'…

  1. Anti-Angiogenic Gene Therapy for Prostate Cancer

    DTIC Science & Technology

    2004-04-01

    …disease have been prostatectomy followed by chemotherapy and radiation therapy. Although these forms of palliative therapy have been successful in early…

  2. KRISSY: user's guide to modeling three-dimensional wind flow in complex terrain

    Treesearch

    Michael A. Fosberg; Michael L. Sestak

    1986-01-01

    KRISSY is a computer model for generating three-dimensional wind flows in complex terrain from data that were not, or perhaps cannot be, collected. The model is written in FORTRAN IV. This guide describes data requirements, modeling, and output from an applications viewpoint rather than that of programming or theoretical modeling. KRISSY is designed to minimize...

  3. Support for Self-Regulation in Learning Complex Topics from Multimedia Explanations: Do Learners Need Extensive or Minimal Support?

    ERIC Educational Resources Information Center

    Rodicio, Hector Garcia; Sanchez, Emilio; Acuna, Santiago R.

    2013-01-01

    Acquiring complex conceptual knowledge requires learners to self-regulate their learning by planning, monitoring, and adjusting the process but they find it difficult to do so. In one experiment, we examined whether learners need broad systems of support for self-regulation or whether they are also able to learn with more economical support…

  4. A generic minimization random allocation and blinding system on web.

    PubMed

    Cai, Hongwei; Xia, Jielai; Xu, Dezhong; Gao, Donghuai; Yan, Yongping

    2006-12-01

    Minimization is a dynamic randomization method for clinical trials. Although recommended by many researchers, the use of minimization has seldom been reported in randomized trials, mainly because of the controversy surrounding the validity of conventional analyses and its complexity in implementation. However, both the statistical and clinical validity of minimization have been demonstrated in recent studies. A minimization random allocation system integrated with a blinding function, which could facilitate the implementation of this method in general clinical trials, has not previously been reported. SYSTEM OVERVIEW: The system is a web-based random allocation system using the Pocock and Simon minimization method. It also supports multiple treatment arms within a trial, multiple simultaneous trials, and blinding, without further programming. The system was constructed with a generic database schema design method, the Pocock and Simon minimization method, and a blinding method. It was coded in the Microsoft Visual Basic and Active Server Pages (ASP) programming languages, and all datasets were managed with a Microsoft SQL Server database. Some critical programming code is also provided. SIMULATIONS AND RESULTS: Two clinical trials were simulated simultaneously to test the system's applicability. Both balanced groups and blinded allocation results were achieved in the two trials. Practical considerations for the minimization method, and the benefits, general applicability, and drawbacks of the technique implemented in this system, are discussed. Promising features of the proposed system are also summarized.
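
    The core of the Pocock and Simon method fits in a short function: for each candidate arm, compute the marginal imbalance that would result across all of the patient's factor levels, then assign the minimizing arm with high probability. A sketch with illustrative factor names and assignment probability; ties here fall to the first arm, whereas a production system would randomize them and add the blinding layer described above.

```python
import random

def minimization_assign(patient, history, arms=("A", "B"), p=0.8):
    """Pocock-Simon minimization (sketch): pick the arm that minimizes
    summed marginal imbalance, with probability p (else at random) so
    the allocation stays unpredictable."""
    def imbalance(candidate):
        total = 0
        for factor, level in patient.items():
            counts = {a: sum(1 for q, b in history
                             if b == a and q[factor] == level)
                      for a in arms}
            counts[candidate] += 1            # hypothetical assignment
            total += max(counts.values()) - min(counts.values())
        return total

    best = min(arms, key=imbalance)           # ties fall to the first arm
    return best if random.random() < p else random.choice(arms)

# Allocate 100 simulated patients and check overall balance.
random.seed(5)
history = []
for _ in range(100):
    pt = {"sex": random.choice("MF"), "centre": random.choice([1, 2, 3])}
    history.append((pt, minimization_assign(pt, history)))
print(sum(1 for _, a in history if a == "A"))  # should be close to 50
```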

  5. Use of simulation to compare the performance of minimization with stratified blocked randomization.

    PubMed

    Toorawa, Robert; Adena, Michael; Donovan, Mark; Jones, Steve; Conlon, John

    2009-01-01

    Minimization is an alternative method to stratified permuted block randomization, which may be more effective at balancing treatments when there are many strata. However, its use in the regulatory setting for industry trials remains controversial, primarily due to the difficulty in interpreting conventional asymptotic statistical tests under restricted methods of treatment allocation. We argue that the use of minimization should be critically evaluated when designing the study for which it is proposed. We demonstrate by example how simulation can be used to investigate whether minimization improves treatment balance compared with stratified randomization, and how much randomness can be incorporated into the minimization before any balance advantage is no longer retained. We also illustrate by example how the performance of the traditional model-based analysis can be assessed, by comparing the nominal test size with the observed test size over a large number of simulations. We recommend that the assignment probability for the minimization be selected using such simulations. Copyright (c) 2008 John Wiley & Sons, Ltd.
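
    A toy version of the recommended simulation shows the effect under investigation: with stratified permuted blocks, every stratum can end on an incomplete block, so overall imbalance grows with the number of strata, while a single stratum stays tightly balanced. Patient counts, block size, and replication numbers below are illustrative, not those of the paper.

```python
import random
from statistics import mean

def stratified_blocks_imbalance(n_pat, n_strata, block=4):
    """Overall |n_A - n_B| after permuted-block randomization within
    strata; patients arrive with uniformly random stratum membership."""
    counts = [0, 0]
    open_blocks = {}
    for _ in range(n_pat):
        s = random.randrange(n_strata)
        blk = open_blocks.get(s)
        if not blk:                            # start a fresh permuted block
            blk = [0] * (block // 2) + [1] * (block // 2)
            random.shuffle(blk)
            open_blocks[s] = blk
        counts[blk.pop()] += 1
    return abs(counts[0] - counts[1])

random.seed(6)
for n_strata in (1, 8, 64, 512):
    avg = mean(stratified_blocks_imbalance(200, n_strata) for _ in range(500))
    print(n_strata, round(avg, 2))   # mean imbalance rises with strata count
```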

  6. Trajectory-Oriented Approach to Managing Traffic Complexity: Trajectory Flexibility Metrics and Algorithms and Preliminary Complexity Impact Assessment

    NASA Technical Reports Server (NTRS)

    Idris, Husni; Vivona, Robert A.; Al-Wakil, Tarek

    2009-01-01

    This document describes exploratory research on a distributed, trajectory-oriented approach to traffic complexity management. The approach manages traffic complexity by preserving trajectory flexibility and minimizing constraints. In particular, the document presents metrics for trajectory flexibility; a method for estimating these metrics based on discrete-time and degree-of-freedom assumptions; a planning algorithm using these metrics to preserve flexibility; and preliminary experiments testing the impact of preserving trajectory flexibility on traffic complexity. The document also describes an early demonstration capability of the trajectory flexibility preservation function in the NASA Autonomous Operations Planner (AOP) platform.

  7. Economic and environmental optimization of a multi-site utility network for an industrial complex.

    PubMed

    Kim, Sang Hun; Yoon, Sung-Geun; Chae, Song Hwa; Park, Sunwon

    2010-01-01

    Most chemical companies consume a lot of steam, water and electrical resources in the production process. Given recent record fuel costs, utility networks must be optimized to reduce the overall cost of production. Environmental concerns must also be considered when preparing modifications to satisfy the requirements for industrial utilities, since wastes discharged from the utility networks are restricted by environmental regulations. Construction of Eco-Industrial Parks (EIPs) has drawn attention as a promising approach for retrofitting existing industrial parks to improve energy efficiency. The optimization of the utility network within an industrial complex is one of the most important undertakings to minimize energy consumption and waste loads in the EIP. In this work, a systematic approach to optimize the utility network of an industrial complex is presented. An important issue in the optimization of a utility network is the desire of the companies to achieve high profits while complying with the environmental regulations. Therefore, the proposed optimization was performed with consideration of both economic and environmental factors. The proposed approach consists of unit modeling using thermodynamic principles, mass and energy balances, development of a multi-period Mixed Integer Linear Programming (MILP) model for the integration of utility systems in an industrial complex, and an economic/environmental analysis of the results. This approach is applied to the Yeosu Industrial Complex, considering seasonal utility demands. The results show that both the total utility cost and waste load are reduced by optimizing the utility network of an industrial complex. 2009 Elsevier Ltd. All rights reserved.
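
    A single-period toy version of such a model shows the economic/environmental coupling: choose steam production from two plants to meet the complex's shared demand at minimum fuel cost, subject to an emissions cap. The full model is a multi-period MILP with binary unit-commitment decisions; this sketch keeps only the LP core (using scipy's linprog), and every number in it is illustrative.

```python
from scipy.optimize import linprog

cost     = [32.0, 40.0]        # $ per tonne of steam, plants 1 and 2
emission = [0.9, 0.4]          # tonnes CO2 per tonne of steam
demand   = 120.0               # t/h of steam required by the complex
cap      = 80.0                # t/h of CO2 allowed by regulation

res = linprog(c=cost,
              A_ub=[emission],   b_ub=[cap],       # environmental limit
              A_eq=[[1.0, 1.0]], b_eq=[demand],    # meet the steam demand
              bounds=[(0, 100), (0, 100)])         # plant capacities
print(res.x, res.fun)   # cheapest dispatch that honours the CO2 cap
```

    Without the emissions row the cheap, dirty plant would take the whole demand; the cap forces part of the load onto the cleaner plant, which is the cost-versus-environment trade-off the multi-period model resolves across seasons.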

  8. Can We Advance Macroscopic Quantum Systems Outside the Framework of Complex Decoherence Theory?

    PubMed Central

    Brezinski, Mark E; Rupnick, Maria

    2016-01-01

    Macroscopic quantum systems (MQS) are macroscopic systems driven by quantum rather than classical mechanics, a long-studied area with minimal success until recently. Harnessing the benefits of quantum mechanics on a macroscopic level would revolutionize fields ranging from telecommunication to biology, the latter focused on here for the reasons discussed. Contrary to misconceptions, there are no known physical laws that prevent the development of MQS. Instead, quantum behaviour is generally believed to be universally lost in complex systems through environmental entanglements (decoherence). But we argue that success is achievable for MQS if decoherence compensation is developed, naturally or artificially, from top-down rather than current reductionist approaches. This paper advances the MQS field with a complex-systems approach to decoherence. First, we discuss why complex-system (top-down) decoherence approaches are needed. Specifically, complex adaptive systems (CAS) are not amenable to reductionist models (and their master equations) because of emergent behaviour, approximation failures, failure to account for quantum compensatory mechanisms, neglect of path integrals, and the subentity problem. In addition, MQS must exist within the context of the classical world, where rapid decoherence and prolonged coherence are both needed; nature has already demonstrated this for quantum subsystems such as photosynthesis and magnetoreception. Second, we perform a preliminary study that illustrates a top-down approach to potential MQS. In summary, reductionist arguments against MQS are not justifiable. It is more likely that MQS are not easily detectable in large intact classical systems or have been destroyed by reductionist experimental set-ups. This complex-systems decoherence approach, using top-down investigations, is critical to paradigm shifts in MQS research in both biological and non-biological systems. PMID:29200743

  9. Can We Advance Macroscopic Quantum Systems Outside the Framework of Complex Decoherence Theory?

    PubMed

    Brezinski, Mark E; Rupnick, Maria

    2014-07-01

    Macroscopic quantum systems (MQS) are macroscopic systems driven by quantum rather than classical mechanics, a long-studied area with minimal success until recently. Harnessing the benefits of quantum mechanics on a macroscopic level would revolutionize fields ranging from telecommunication to biology, the latter focused on here for the reasons discussed. Contrary to misconceptions, there are no known physical laws that prevent the development of MQS. Instead, quantum behaviour is generally believed to be universally lost in complex systems through environmental entanglements (decoherence). But we argue that success is achievable for MQS if decoherence compensation is developed, naturally or artificially, from top-down rather than current reductionist approaches. This paper advances the MQS field with a complex-systems approach to decoherence. First, we discuss why complex-system (top-down) decoherence approaches are needed. Specifically, complex adaptive systems (CAS) are not amenable to reductionist models (and their master equations) because of emergent behaviour, approximation failures, failure to account for quantum compensatory mechanisms, neglect of path integrals, and the subentity problem. In addition, MQS must exist within the context of the classical world, where rapid decoherence and prolonged coherence are both needed; nature has already demonstrated this for quantum subsystems such as photosynthesis and magnetoreception. Second, we perform a preliminary study that illustrates a top-down approach to potential MQS. In summary, reductionist arguments against MQS are not justifiable. It is more likely that MQS are not easily detectable in large intact classical systems or have been destroyed by reductionist experimental set-ups. This complex-systems decoherence approach, using top-down investigations, is critical to paradigm shifts in MQS research in both biological and non-biological systems.

  10. Computing Pathways for Urban Decarbonization.

    NASA Astrophysics Data System (ADS)

    Cremades, R.; Sommer, P.

    2016-12-01

    Urban areas emit roughly three quarters of global carbon emissions, making cities crucial elements of a decarbonized society. Urban expansion and related transportation needs lead to increased energy use and to carbon-intensive lock-ins that create barriers to climate change mitigation globally. The authors present the Integrated Urban Complexity (IUC) model, based on self-organizing cellular automata (CA), and use it to produce a new kind of spatially explicit Transformation Pathways for Urban Decarbonization (TPUD). IUC is based on statistical evidence relating the energy needed for transportation to the spatial distribution of population; specifically, IUC incorporates variables from complexity science related to urban form, such as the slope of the rank-size rule and spatial entropy, which takes IUC a step beyond existing models. The CA starts its evolution with real-world urban land use and population distribution data from the Global Human Settlement Layer. The IUC model thus runs over existing urban settlements, transforming the spatial distribution of population so that energy consumption for transportation is minimized. The statistical evidence governing the evolution of the CA comes from the database of the International Association of Public Transport. A selected case is presented using Stuttgart (Germany) as an example. The results show how IUC varies urban density in those places where doing so improves the performance of crucial urban-form parameters, producing a TPUD that shows where the spatial distribution of population should be modified, at a spatial resolution of 250 meters. The TPUD shows how the urban complex system evolves over time to minimize energy consumption for transportation. The resulting dynamics of urban decarbonization show decreased energy per capita, although total energy increases with increasing population. The results provide innovative insights: by checking current urban planning against a TPUD, urban planners could understand where existing plans contradict the 2030 Agenda, primarily the Sustainable Development Goals (SDGs) on Climate Action (SDG 13) and Sustainable Cities and Communities (SDG 11). For the first time, evidence-based transformation pathways are produced to decarbonize cities.
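
    The mechanism can be caricatured in a few lines: represent population on a grid, proxy transportation energy by population-weighted distance to the centre, and let the automaton relocate people inward under a density cap so the proxy falls monotonically. The real IUC model evolves empirically grounded urban-form variables (rank-size slope, spatial entropy) on 250 m cells against transport data; everything in the sketch below, including the single-centre energy proxy, is illustrative.

```python
import numpy as np

n, cap = 21, 40                               # grid size, density cap
rng = np.random.default_rng(7)
pop = rng.integers(0, 15, (n, n)).astype(float)
yy, xx = np.indices((n, n))
dist = np.hypot(yy - n // 2, xx - n // 2)     # distance to the centre

def energy(p):
    """Transport-energy proxy: population-weighted distance to centre."""
    return float((p * dist).sum())

e0 = energy(pop)
for _ in range(2000):                         # CA relaxation steps
    # Source: the farthest still-populated cell.
    src = np.unravel_index(np.argmax(np.where(pop > 0, dist, -1)), dist.shape)
    # Destination: the closest cell with spare capacity, nearer than src.
    dst_mask = (pop < cap) & (dist < dist[src])
    if not dst_mask.any():
        break
    dst = np.unravel_index(np.argmin(np.where(dst_mask, dist, np.inf)),
                           dist.shape)
    pop[src] -= 1; pop[dst] += 1              # move one inhabitant inward

print(e0, energy(pop))   # the proxy falls as the settlement densifies
```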

  11. Increase in the efficiency of a high-speed ramjet on hydrocarbon fuel at the flying vehicle acceleration up to M = 6+

    NASA Astrophysics Data System (ADS)

    Abashev, V. M.; Korabelnikov, A. V.; Kuranov, A. L.; Tretyakov, P. K.

    2017-10-01

    In the analysis of the work process in a ramjet, a comprehensive consideration of the ensemble of problems whose solution determines the engine efficiency appears reasonable. The main problems are ensuring a high completeness of fuel combustion with minimal hydraulic losses, reliable cooling of high-heat areas using the cooling resource of the fuel, and ensuring the strength of the engine duct elements under the non-uniform heat loads caused by fuel combustion in complex gas-dynamic flow structures. The fundamental techniques and approaches to the solution of these problems are considered in the present report, and their novelty and advantages in comparison with conventional techniques are substantiated. In particular, a technique is proposed for establishing an intense (pre-detonation) combustion regime that ensures a high completeness of fuel combustion and minimal hydraulic losses during a smooth deceleration of the supersonic flow down to the sound velocity, using pulsed-periodic gas-dynamic flow control. A technique is also proposed for cooling the high-heat areas that employs the cooling resource of the hydrocarbon fuel, including the chemical transformation (conversion) of kerosene over nano-catalysts. An analysis has shown that the highly heated structure will operate in the elastic-plastic domain of the constructional materials, which directly affects the engine's operating life and raises the problem of deformation-dependent degradation of the ramjet shells. The deformations also significantly influence the work process in the combustor and, naturally, the heat transfer process and the performance of the catalysts (through the action of plastic and elastic deformations of the restrained shells). The report presents results illustrating these problems and concludes that a comprehensive investigation is necessary, combining model experiments with computational and theoretical studies.

  12. Cognition and aging in a complex work environment: relationships with performance among air traffic control specialists.

    PubMed

    Becker, J T; Milke, R M

    1998-10-01

    Chronological age affects the performance of demanding cognitive tasks within the aviation environment. Within the domain of air traffic control (ATC), the ability to handle simultaneous visual and auditory input, or to return to a task after a break to complete another task, is critical to success and is the sort of cognitive function most affected by age. The limited available data suggest a strong relationship between age and job performance among ATC specialists, whether measured at the time of entry into the system or during the working lifetime of a full-performance-level controller. An analysis of the distribution of the ages of controllers currently in the system, and a projection for the years 2001 and 2006, leads to the conclusion that a high proportion of the ATC work force will be at risk for displaying age-related changes in job performance efficiency over the next 10 yr. It seems important, therefore, to determine the nature and extent of the age-related cognitive changes that can occur during the lifespan of a controller (i.e., 25-55 yr of age) and how these changes may affect job performance. The results of such an analysis should aid in the design and implementation of new control systems to minimize any deleterious effects of aging on performance.

  13. Cooperation through Competition—Dynamics and Microeconomics of a Minimal Nutrient Trade System in Arbuscular Mycorrhizal Symbiosis

    PubMed Central

    Schott, Stephan; Valdebenito, Braulio; Bustos, Daniel; Gomez-Porras, Judith L.; Sharma, Tripti; Dreyer, Ingo

    2016-01-01

    In arbuscular mycorrhizal (AM) symbiosis, fungi and plants exchange nutrients (sugars and phosphate, for instance) for reciprocal benefit. To date, it is not clear how this nutrient exchange system works. Here, we used computational cell biology to simulate the dynamics of a network of proton pumps and proton-coupled transporters that are upregulated during AM formation. We show that this minimal network is sufficient to describe the nutrient trade system accurately and realistically. By applying basic principles of microeconomics, we link the biophysics of transmembrane nutrient transport with the ecology of organismic interactions and straightforwardly explain macroscopic scenarios of the relations between plant and AM fungus. This computational cell biology study allows far-reaching hypotheses to be drawn about the mechanism and regulation of nutrient exchange and proposes that the “cooperation” between plant and fungus can in fact be the result of a competition between the two for the same resources in the tiny periarbuscular space. The minimal model presented here may serve as a benchmark for evaluating the performance of more complex models of AM nutrient exchange in the future. As a first step toward this goal, we included SWEET sugar transporters in the model and show that their co-occurrence with proton-coupled sugar transporters results in a futile carbon cycle at the plant plasma membrane, suggesting that two different pathways for the same substrate should not be active at the same time. PMID:27446142
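    The futile cycle can be made concrete with a few lines of toy kinetics (a hypothetical sketch with made-up rate constants and stoichiometry, not the published model): a proton-coupled symporter imports sugar at the cost of pumped protons while a SWEET-type uniporter passively leaks it back out, so at steady state the sugar merely cycles and protons are continuously spent.

```python
import numpy as np

# Toy futile-cycle simulation (assumed linear kinetics and stoichiometry).
dt, T = 0.01, 50.0
s_in, s_out = 1.0, 1.0      # sugar (mM) inside / outside (outside fixed)
pmf = 2.0                   # proton motive force term (arbitrary units)
k_sym, k_sweet = 0.5, 0.5   # rate constants (assumptions)

protons_spent = 0.0
for _ in range(int(T / dt)):
    # H+/sugar symporter: imports sugar against its gradient using the pmf.
    j_sym = k_sym * (s_out * np.exp(pmf) - s_in)
    # SWEET uniporter: passive flux down the sugar gradient (efflux here).
    j_sweet = k_sweet * (s_in - s_out)
    s_in += dt * (j_sym - j_sweet)
    protons_spent += dt * max(j_sym, 0.0)  # one proton per imported sugar

print(f"steady internal sugar ~ {s_in:.2f} mM; protons spent ~ {protons_spent:.0f}")
```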

  14. Reagentless chemiluminescence-based fiber optic sensors for regenerative life support in space

    NASA Astrophysics Data System (ADS)

    Atwater, James E.; Akse, James R.; DeHart, Jeffrey; Wheeler, Richard R., Jr.

    1995-04-01

    The initial feasibility demonstration of a reagentless chemiluminescence-based fiber optic sensor technology for use in advanced regenerative life support applications in space and planetary outposts is described. The primary constraints for extraterrestrial deployment of any technology are compatibility with microgravity and hypogravity environments; minimal size, weight, and power consumption; and minimal use of expendables, owing to the great expense and difficulty inherent in resupply logistics. In the current research, we report the integration of solid-state flow-through modules for the production of aqueous-phase reagents into a system for the detection of important analytes by chemiluminescence, with fiber optic light transmission. By minimizing the need for resupply expendables, the use of solid-phase modules makes complex chemical detection schemes practical. For the proof of concept, hydrogen peroxide and glucose were chosen as analytes. The reaction is catalyzed by glucose oxidase, an immobilized enzyme. The aqueous-phase chemistry required for sensor operation is implemented using solid-phase modules which adjust the pH of the influent stream, catalyze the oxidation of the analyte, and provide the controlled addition of the luminophore to the flowing aqueous stream. Precise control of the pH has proven essential for the long-term sustained release of the luminophore. Electrocatalysis is achieved using a controlled potential across gold mesh and gold foil electrodes which undergo periodic polarity reversals. The development and initial performance characterization of the reagentless fiber optic chemiluminescence sensors are presented in this paper.

  15. Probabilistic sparse matching for robust 3D/3D fusion in minimally invasive surgery.

    PubMed

    Neumann, Dominik; Grbic, Sasa; John, Matthias; Navab, Nassir; Hornegger, Joachim; Ionasec, Razvan

    2015-01-01

    Classical surgery is being overtaken by minimally invasive and transcatheter procedures. As there is no direct view or access to the affected anatomy, advanced imaging techniques such as 3D C-arm computed tomography (CT) and C-arm fluoroscopy are routinely used in clinical practice for intraoperative guidance. However, due to constraints regarding acquisition time and device configuration, intraoperative modalities have limited soft tissue image quality and reliable assessment of the cardiac anatomy typically requires contrast agent, which is harmful to the patient and requires complex acquisition protocols. We propose a probabilistic sparse matching approach to fuse high-quality preoperative CT images and nongated, noncontrast intraoperative C-arm CT images by utilizing robust machine learning and numerical optimization techniques. Thus, high-quality patient-specific models can be extracted from the preoperative CT and mapped to the intraoperative imaging environment to guide minimally invasive procedures. Extensive quantitative experiments on 95 clinical datasets demonstrate that our model-based fusion approach has an average execution time of 1.56 s, while the accuracy of 5.48 mm between the anchor anatomy in both images lies within expert user confidence intervals. In direct comparison with image-to-image registration based on an open-source state-of-the-art medical imaging library and a recently proposed quasi-global, knowledge-driven multi-modal fusion approach for thoracic-abdominal images, our model-based method exhibits superior performance in terms of registration accuracy and robustness with respect to both target anatomy and anchor anatomy alignment errors.
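    Although the paper's pipeline is far richer, the core idea of fusing sparse anchor correspondences under per-point confidence can be sketched with a standard weighted Procrustes fit (a generic textbook method, not the authors' algorithm; the landmarks and confidence weights here are synthetic):

```python
import numpy as np

def weighted_rigid_fit(P, Q, w):
    """R, t minimizing sum_i w_i * ||R @ P[i] + t - Q[i]||^2 (weighted Kabsch)."""
    w = w / w.sum()
    mu_p = (w[:, None] * P).sum(axis=0)
    mu_q = (w[:, None] * Q).sum(axis=0)
    H = (w[:, None] * (P - mu_p)).T @ (Q - mu_q)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    return R, mu_q - R @ mu_p

# Synthetic anchors: preoperative landmarks P, intraoperative landmarks Q.
rng = np.random.default_rng(1)
P = rng.normal(size=(20, 3))
c, s = np.cos(0.3), np.sin(0.3)
R_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
Q = P @ R_true.T + np.array([5.0, -2.0, 1.0]) + 0.01 * rng.normal(size=(20, 3))
conf = rng.uniform(0.5, 1.0, size=20)   # e.g. detector probabilities

R, t = weighted_rigid_fit(P, Q, conf)
print("rotation recovery error:", np.linalg.norm(R - R_true))
```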

  16. Apparent complex partial seizures in a bipolar patient after withdrawal of carbamazepine.

    PubMed

    Garbutt, J C; Gillette, G M

    1988-10-01

    A 64-year-old woman with long-standing bipolar illness was treated with carbamazepine and clonazepam with minimal success. Discontinuation of carbamazepine and clonazepam was followed by episodic amnesia, purposeless behavior, déjà vu, and confusion. Although her EEG was normal, the episodes were compatible with complex partial seizures and ceased after carbamazepine and clonazepam were reinstituted. This case raises the question of whether discontinuing carbamazepine and clonazepam can induce complex partial seizures in bipolar patients.

  17. Using an Extended Dynamic Drag-and-Drop Assistive Program to Assist People with Multiple Disabilities and Minimal Motor Control to Improve Computer Drag-and-Drop Ability through a Mouse Wheel

    ERIC Educational Resources Information Center

    Shih, Ching-Hsiang

    2012-01-01

    Software technology is adopted by the current research to improve the Drag-and-Drop abilities of two people with multiple disabilities and minimal motor control. This goal was realized through a Dynamic Drag-and-Drop Assistive Program (DDnDAP) in which the complex dragging process is replaced by simply poking the mouse wheel and clicking. However,…

  18. What Are the Principles That Guide Behaviors in the Operating Room?: Creating a Framework to Define and Measure Performance.

    PubMed

    Madani, Amin; Vassiliou, Melina C; Watanabe, Yusuke; Al-Halabi, Becher; Al-Rowais, Mohammed S; Deckelbaum, Dan L; Fried, Gerald M; Feldman, Liane S

    2017-02-01

    To identify the core principles that guide expert intraoperative behaviors and to use these principles to develop a universal framework that defines intraoperative performance. Surgical outcomes are associated with intraoperative cognitive skills. Yet, our understanding of the factors that control intraoperative judgment and decision-making is limited. As a result, current methods for training and measuring performance are somewhat subjective, more task- than procedure-oriented, and usually not standardized. They thus provide minimal insight into the complex cognitive processes that are fundamental to patient safety. Cognitive task analyses for 6 diverse surgical procedures were performed using semistructured interviews and field observations to describe the thoughts, behaviors, and actions that characterize and guide expert performance. Verbal data were transcribed, supplemented with content from the published literature, coded, thematically analyzed using grounded theory by 4 independent reviewers, and synthesized into a list of items. A conceptual framework was developed based on 42 semistructured interviews lasting 45 to 120 minutes, 5 expert panels and 51 field observations involving 35 experts, and 135 sources from the literature. Five domains of intraoperative performance were identified: psychomotor skills, declarative knowledge, advanced cognitive skills, interpersonal skills, and personal resourcefulness. Within the advanced cognitive skills domain, 21 themes were perceived to guide the behaviors of surgeons: 18 for surgical planning and error prevention, and 3 for error/injury recognition, rescue, and recovery. The application of these thought patterns was highly case-specific and variable amongst subspecialties, environments, and individuals. This study provides a comprehensive definition of intraoperative expertise, with greater insight into the complex cognitive processes that seem to underlie optimal performance. This framework provides trainees and other nonexperts with the necessary information to use in deliberate practice and in the creation of effective thought habits that characterize expert performance. It may help to identify gaps in performance, and to isolate root causes of surgical errors, with the ultimate goal of improving patient safety.

  19. iATTRACT: simultaneous global and local interface optimization for protein-protein docking refinement.

    PubMed

    Schindler, Christina E M; de Vries, Sjoerd J; Zacharias, Martin

    2015-02-01

    Protein-protein interactions are abundant in the cell, but to date structural data are lacking for a large number of complexes. Computational docking methods can complement experiments by providing structural models of complexes based on the structures of the individual partners. A major challenge for docking success is accounting for protein flexibility. In particular, interface residues undergo significant conformational changes upon binding. This limits the performance of docking methods that keep the partner structures rigid or allow only limited flexibility. A new docking refinement approach, iATTRACT, has been developed which combines simultaneous full interface flexibility and rigid body optimizations during docking energy minimization. It employs an atomistic molecular mechanics force field for intermolecular interface interactions and a structure-based force field for intramolecular contributions. The approach was systematically evaluated on a large protein-protein docking benchmark, starting from an enriched decoy set of rigidly docked protein-protein complexes deviating by up to 15 Å from the native structure at the interface. Large improvements in sampling and slight but significant improvements in scoring/discrimination of near-native docking solutions were observed. Complexes with initial deviations at the interface of up to 5.5 Å were refined to significantly better agreement with the native structure. Improvements in the fraction of native contacts were especially favorable, yielding increases of up to 70%. © 2014 Wiley Periodicals, Inc.
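    The combination of an intermolecular interface term with an intramolecular structure-based term can be illustrated with a toy minimization (hypothetical energies and coordinates; this is not the iATTRACT force field): a Lennard-Jones-like contact term between two small atom sets plus a harmonic restraint toward the starting interface geometry, minimized together.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
A = rng.normal(size=(10, 3))          # receptor interface atoms (kept fixed)
B0 = rng.normal(size=(8, 3)) + 3.0    # ligand interface atoms (mobile)

def energy(x):
    B = x.reshape(-1, 3)
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)
    e_inter = np.sum((1.0 / d) ** 12 - 2.0 * (1.0 / d) ** 6)  # LJ-like, r_min = 1
    e_intra = 5.0 * np.sum((B - B0) ** 2)  # harmonic structure-based term
    return e_inter + e_intra

res = minimize(energy, B0.ravel(), method="L-BFGS-B")
print("refined interface energy:", round(res.fun, 3))
```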

  20. Collaborative virtual reality based advanced cardiac life support training simulator using virtual reality principles.

    PubMed

    Khanal, Prabal; Vankipuram, Akshay; Ashby, Aaron; Vankipuram, Mithra; Gupta, Ashish; Drumm-Gurnee, Denise; Josey, Karen; Tinker, Linda; Smith, Marshall

    2014-10-01

    Advanced Cardiac Life Support (ACLS) is a series of team-based, sequential and time-constrained interventions, requiring effective communication and coordination of activities that are performed by the care provider team on a patient undergoing cardiac arrest or respiratory failure. State-of-the-art ACLS training is conducted in a face-to-face environment under expert supervision and suffers from several drawbacks, including conflicting care provider schedules and the high cost of training equipment. The major objective of this study is to describe the design, implementation, and evaluation of a novel approach to delivering ACLS training to care providers, using a virtual reality simulator that can overcome the challenges and drawbacks imposed by the traditional face-to-face training method. We compare the efficacy and performance outcomes associated with traditional ACLS training with the proposed approach of using a virtual reality (VR) based ACLS training simulator. One hundred and forty-eight (148) ACLS-certified clinicians, forming 26 care provider teams, were enrolled in this study. Each team was randomly assigned to one of three treatment groups: control (traditional ACLS training), persuasive (VR ACLS training with comprehensive feedback components), or minimally persuasive (VR ACLS training with limited feedback components). The teams were tested across two ACLS procedures that vary in the degree of task complexity: ventricular fibrillation or tachycardia (VFib/VTach) and pulseless electric activity (PEA). The difference in performance between the control and persuasive groups was not statistically significant (P=.37 for PEA and P=.1 for VFib/VTach). However, the difference in performance between the control and minimally persuasive groups was significant (P=.05 for PEA and P=.02 for VFib/VTach). The pre-post comparison of group performances showed that the control (P=.017 for PEA, P=.01 for VFib/VTach) and persuasive (P=.02 for PEA, P=.048 for VFib/VTach) groups improved their performance significantly, whereas the minimally persuasive group did not (P=.45 for PEA, P=.46 for VFib/VTach). The results also suggest that the benefit of persuasiveness is constrained by the potentially interruptive nature of these features. Our results indicate that VR-based ACLS training with proper feedback components can provide a learning experience similar to face-to-face training, and could therefore serve as a more easily accessed supplementary training tool to traditional ACLS training. Our findings also suggest that persuasive features in VR environments have to be designed with the interruptive nature of the feedback elements in mind. Copyright © 2014 Elsevier Inc. All rights reserved.

  1. Trends in laparoscopic colorectal surgery over time from 2005-2014 using the NSQIP database.

    PubMed

    Davis, Catherine H; Shirkey, Beverly A; Moore, Linda W; Gaglani, Tanmay; Du, Xianglin L; Bailey, H Randolph; Cusick, Marianne V

    2018-03-01

    Laparoscopy, originally pioneered by gynecologists, was first adopted by general surgeons in the late 1980s. Since then, laparoscopy has been adopted across the surgical specialties, including colorectal surgery for the treatment of benign and malignant disease. Formal laparoscopic training became a required component of surgery residency programs, as validated by the Fundamentals of Laparoscopic Surgery curriculum; however, some surgeons may be more apprehensive about widespread adoption of minimally invasive techniques. Although an overall increase in the use of laparoscopy in colorectal surgery is anticipated over a 10-year period, it is unknown whether a similar increase will be seen in higher-risk or more acutely ill patients. Using the American College of Surgeons (ACS) National Surgical Quality Improvement Program (NSQIP) database from 2005-2014, colorectal procedures were identified by Current Procedural Terminology codes and categorized as open or laparoscopic surgery. The proportion of colorectal surgeries performed laparoscopically was calculated for each year. Separate descriptive statistics were calculated by age and body mass index (BMI). American Society of Anesthesiology (ASA) classification and emergency case status variables were added to the project to help assess case complexity. During the 10-year study period, the number of colorectal cases increased from 3114 in 2005 to 51,611 in 2014 as more hospitals joined NSQIP. A total of 277,376 colorectal cases were identified, of which 114,359 (41.2%) were performed laparoscopically. The use of laparoscopy gradually increased each year, from 22.7% in 2005 to 49.8% in 2014. Laparoscopic procedures were most commonly performed in the youngest age group (18-49 years), in overweight and obese patients (BMI 25-34.9), and in ASA class 1-2 patients. Over the 10-year period, there was an increase in the use of laparoscopy in every age, BMI, and ASA category, except ASA 5. The percentage of emergency cases receiving laparoscopic surgery also doubled, from 5.5% in 2005 to 11.5% in 2014. Over a 10-year period, there was a gradual increase in the use of laparoscopy in colorectal surgery. Further, there was a consistent increase in laparoscopic surgery in all age groups, including the elderly; in all BMI classes, including the obese and morbidly obese; and in most ASA classes, including ASA 3-4, as well as in emergency surgeries. These trends suggest that minimally invasive colorectal surgery has been widely adopted and is being performed on more complex or higher-risk patients. Copyright © 2017 Elsevier Inc. All rights reserved.
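    The year-by-year proportion reported above is simple to reproduce in outline (a hypothetical sketch: the file name, column names, and the illustrative CPT code lists are assumptions, not the actual NSQIP schema):

```python
import pandas as pd

# Illustrative (incomplete) CPT groupings for colectomy procedures.
LAP_CPT = {"44204", "44205", "44206", "44207", "44208", "44210"}
OPEN_CPT = {"44140", "44141", "44143", "44144", "44145", "44160"}

df = pd.read_csv("nsqip_colorectal_2005_2014.csv", dtype={"cpt": str})
df = df[df["cpt"].isin(LAP_CPT | OPEN_CPT)]
df["laparoscopic"] = df["cpt"].isin(LAP_CPT)

# Percent of colorectal cases performed laparoscopically, per year.
by_year = df.groupby("operation_year")["laparoscopic"].mean().mul(100).round(1)
print(by_year)
```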

  2. Cell-free protein synthesis in micro compartments: building a minimal cell from biobricks.

    PubMed

    Jia, Haiyang; Heymann, Michael; Bernhard, Frank; Schwille, Petra; Kai, Lei

    2017-10-25

    The construction of a minimal cell that exhibits the essential characteristics of life is a great challenge in the field of synthetic biology. Assembling a minimal cell requires multidisciplinary expertise from physics, chemistry and biology. Scientists from different backgrounds tend to define the essence of 'life' differently and have thus proposed different artificial cell models possessing one or several essential features of living cells. Using the tools and methods of molecular biology, the bottom-up engineering of a minimal cell appears within reach. However, several challenges remain. In particular, the integration of individual sub-systems required to achieve a self-reproducing cell model presents a complex optimization challenge. For example, multiple self-organisation and self-assembly processes have to be carefully tuned. We review advances and developments of new methods and techniques, for cell-free protein synthesis as well as micro-fabrication, for their potential to resolve these challenges and to accelerate the development of minimal cells. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Discriminating quantum-optical beam-splitter channels with number-diagonal signal states: Applications to quantum reading and target detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nair, Ranjith

    2011-09-15

    We consider the problem of distinguishing, with minimum probability of error, two optical beam-splitter channels with unequal complex-valued reflectivities using general quantum probe states entangled over M signal and M' idler mode pairs of which the signal modes are bounced off the beam splitter while the idler modes are retained losslessly. We obtain a lower bound on the output state fidelity valid for any pure input state. We define number-diagonal signal (NDS) states to be input states whose density operator in the signal modes is diagonal in the multimode number basis. For such input states, we derive series formulas for the optimal error probability, the output state fidelity, and the Chernoff-type upper bounds on the error probability. For the special cases of quantum reading of a classical digital memory and target detection (for which the reflectivities are real valued), we show that for a given input signal photon probability distribution, the fidelity is minimized by the NDS states with that distribution and that for a given average total signal energy N_s, the fidelity is minimized by any multimode Fock state with N_s total signal photons. For reading of an ideal memory, it is shown that Fock state inputs minimize the Chernoff bound. For target detection under high-loss conditions, a no-go result showing the lack of appreciable quantum advantage over coherent state transmitters is derived. A comparison of the error probability performance for quantum reading of number state and two-mode squeezed vacuum state (or EPR state) transmitters relative to coherent state transmitters is presented for various values of the reflectances. While the nonclassical states in general perform better than the coherent state, the quantitative performance gains differ depending on the values of the reflectances. The experimental outlook for realizing nonclassical gains from number state transmitters with current technology at moderate to high values of the reflectances is argued to be good.
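    For orientation, the "minimum probability of error" and the Chernoff-type bounds referred to above take their standard forms from quantum detection theory (general background, not formulas derived in this paper): for equiprobable output states $\rho_0$ and $\rho_1$,

$$
P_e^{\min} = \frac{1}{2}\left(1 - \tfrac{1}{2}\lVert \rho_0 - \rho_1 \rVert_1\right),
\qquad
P_e \le \frac{1}{2}\min_{0 \le s \le 1} \operatorname{Tr}\left(\rho_0^{\,s}\,\rho_1^{\,1-s}\right),
$$

    and a bound on the output state fidelity $F$ constrains $P_e^{\min}$ through the Fuchs-van de Graaf inequalities $1 - F \le \tfrac{1}{2}\lVert \rho_0 - \rho_1 \rVert_1 \le \sqrt{1 - F^{2}}$.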

  4. Ergonomic design of crane cabins: a case study from a steel plant in India.

    PubMed

    Ray, Pradip Kumar; Tewari, V K

    2012-01-01

    The study, carried out at the Batch Annealing Furnace (BAF) shop of the Cold Rolling Mill (CRM) at an integrated steel plant in India, concerns the ergonomic evaluation and redesign of a manually operated Electrical Overhead Travelling (EOT) crane cabin. The crane cabin is a complex worksystem consisting of the crane operator and twelve specific machine components embedded in a closed workspace. A crane operator has to perform various activities, such as loading and unloading of coils, setting and removal of convector plates, and routine maintenance work. Initially, an operator had to work in a standing posture with a bent back most of the time. Ergonomically poor design of the chair and controls, awkward work postures, and an insufficient vision angle resulting in musculoskeletal disorders (MSDs) are some of the critical problems observed. The study, conceived as an industry-academia joint initiative, was undertaken by a design team whose members were drawn from both the company concerned and the institute. With the project executed successfully, a number of lessons were learned, such as how to minimize anthropometric mismatch, how to improve the layout of the components and controls within an enclosed workspace, and how to improve work posture to minimize the risk of MSDs.

  5. Final Report - High-Order Spectral Volume Method for the Navier-Stokes Equations On Unstructured Tetrahedral Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Z J

    2012-12-06

    The overriding objective of this project is to develop an efficient and accurate method for capturing strong discontinuities and fine smooth flow structures of disparate length scales with unstructured grids, and to demonstrate its potential for problems relevant to DOE. More specifically, we plan to achieve the following objectives: 1. Extend the SV method to three dimensions, and develop a fourth-order accurate SV scheme for tetrahedral grids. Optimize the SV partition by minimizing a form of the Lebesgue constant. Verify the order of accuracy using scalar conservation laws with an analytical solution; 2. Extend the SV method to the Navier-Stokes equations for the simulation of viscous flow problems. Two promising approaches to computing the viscous fluxes will be tested and analyzed; 3. Parallelize the 3D viscous SV flow solver using domain decomposition and message passing. Optimize the cache performance of the flow solver by designing data structures that minimize data access times; 4. Demonstrate the SV method on a wide range of flow problems including both discontinuities and complex smooth structures. The objectives remain the same as those outlined in the original proposal. We anticipate no technical obstacles in meeting these objectives.

  6. Minimizing Barriers in Learning for On-Call Radiology Residents-End-to-End Web-Based Resident Feedback System.

    PubMed

    Choi, Hailey H; Clark, Jennifer; Jay, Ann K; Filice, Ross W

    2018-02-01

    Feedback is an essential part of medical training, where trainees are provided with information regarding their performance and further directions for improvement. In diagnostic radiology, feedback entails a detailed review of the differences between the residents' preliminary interpretation and the attendings' final interpretation of imaging studies. While the on-call experience of independently interpreting complex cases is important to resident education, the more traditional synchronous "read-out" or joint review is impossible due to multiple constraints. Without an efficient method to compare reports, grade discrepancies, convey salient teaching points, and view images, valuable lessons in image interpretation and report construction are lost. We developed a streamlined web-based system, including report comparison and image viewing, to minimize barriers in asynchronous communication between attending radiologists and on-call residents. Our system provides real-time, end-to-end delivery of case-specific and user-specific feedback in a streamlined, easy-to-view format. We assessed quality improvement subjectively through surveys and objectively through participation metrics. Our web-based feedback system improved user satisfaction for both attending and resident radiologists, and increased attending participation, particularly with regard to cases where substantive discrepancies were identified.

  7. A common minimal motif for the ligands of HLA-B*27 class I molecules.

    PubMed

    Barriga, Alejandro; Lorente, Elena; Johnstone, Carolina; Mir, Carmen; del Val, Margarita; López, Daniel

    2014-01-01

    CD8(+) T cells identify and kill infected cells through the specific recognition of short viral antigens bound to human major histocompatibility complex (HLA) class I molecules. The colossal number of polymorphisms in HLA molecules makes it essential to characterize the antigen-presenting properties common to large HLA families or supertypes. In this context, the HLA-B*27 family, comprising at least 100 different alleles, some of them widely distributed in the human population, is involved in the cellular immune response against pathogens and is also associated with autoimmune spondyloarthritis, making it a relevant target of study. To this end, HLA binding assays performed using nine HLA-B*2705-restricted ligands endogenously processed and presented in virus-infected cells revealed a common minimal peptide motif for efficient binding to the HLA-B*27 family. The motif was independently confirmed using four unrelated peptides. This experimental approach, which could easily be transferred to other HLA class I families and supertypes, has implications for the validation of new bioinformatics tools in the functional clustering of HLA molecules, for the identification of antiviral cytotoxic T lymphocyte responses, and for future vaccine development.

  8. Selecting a Control Strategy for Plug and Process Loads

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lobato, C.; Sheppy, M.; Brackney, L.

    2012-09-01

    Plug and Process Loads (PPLs) are building loads that are not related to general lighting, heating, ventilation, cooling, and water heating, and typically do not provide comfort to the building occupants. PPLs in commercial buildings account for almost 5% of U.S. primary energy consumption. On an individual building level, they account for approximately 25% of the total electrical load in a minimally code-compliant commercial building, and can exceed 50% in an ultra-high-efficiency building such as the National Renewable Energy Laboratory's (NREL) Research Support Facility (RSF) (Lobato et al. 2010). Minimizing these loads is a primary challenge in the design and operation of an energy-efficient building. A complex array of technologies that measure and manage PPLs has emerged in the marketplace. Some fall short of manufacturer performance claims, however. NREL has been actively engaged in developing an evaluation and selection process for PPL controls, and is using this process to evaluate a range of technologies for active PPL management that will cap RSF plug loads. Using a control strategy to match plug load use to users' required job functions is a huge untapped potential for energy savings.

  9. Optimizing Cluster Heads for Energy Efficiency in Large-Scale Heterogeneous Wireless Sensor Networks

    DOE PAGES

    Gu, Yi; Wu, Qishi; Rao, Nageswara S. V.

    2010-01-01

    Many complex sensor network applications require deploying a large number of inexpensive and small sensors in a vast geographical region to achieve quality through quantity. Hierarchical clustering is generally considered an efficient and scalable way to facilitate the management and operation of such large-scale networks and minimize the total energy consumption for prolonged lifetime. Judicious selection of cluster heads for data integration and communication is critical to the success of applications based on hierarchical sensor networks organized as layered clusters. We investigate the problem of selecting sensor nodes in a predeployed sensor network to be the cluster heads to minimize the total energy needed for data gathering. We rigorously derive an analytical formula to optimize the number of cluster heads in sensor networks under uniform node distribution, and propose a Distance-based Crowdedness Clustering algorithm to determine the cluster heads in sensor networks under general node distribution. The results from an extensive set of experiments on a large number of simulated sensor networks illustrate the performance superiority of the proposed solution over clustering schemes based on the k-means algorithm.
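    As a baseline for what such a selection procedure computes, here is a minimal k-means-style cluster-head picker (a generic sketch of the comparison baseline mentioned above, not the paper's Distance-based Crowdedness Clustering algorithm; the node layout and k are arbitrary):

```python
import numpy as np

def kmeans_heads(nodes, k, iters=50, seed=0):
    """Pick k cluster heads minimizing total squared node-to-head distance,
    a common proxy for intra-cluster communication energy."""
    rng = np.random.default_rng(seed)
    centers = nodes[rng.choice(len(nodes), size=k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(nodes[:, None, :] - centers[None, :, :], axis=-1)
        labels = d.argmin(axis=1)
        for c in range(k):                       # update non-empty clusters
            if np.any(labels == c):
                centers[c] = nodes[labels == c].mean(axis=0)
    # Promote the physical node nearest each centroid to cluster head.
    idx = [int(np.linalg.norm(nodes - c, axis=1).argmin()) for c in centers]
    return idx, labels

nodes = np.random.default_rng(3).uniform(0, 100, size=(500, 2))  # 500 sensors
heads, labels = kmeans_heads(nodes, k=10)
print("cluster-head node indices:", heads)
```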

  10. Is Hyperbaric Oxygen Therapy Effective for Traumatic Brain Injury? A Rapid Evidence Assessment of the Literature and Recommendations for the Field.

    PubMed

    Crawford, Cindy; Teo, Lynn; Yang, EunMee; Isbister, Caitlin; Berry, Kevin

    This systematic review examines the efficacy of hyperbaric oxygen (HBO2) for traumatic brain injury (TBI) to make evidence-based recommendations for its application and future research. A comprehensive search was conducted to identify studies through 2014. Methodological quality was assessed, and synthesis and interpretation of relevant data were performed. Twelve randomized trials were included. All mild TBI studies demonstrated minimal bias and no statistically significant differences between HBO2 and sham arms. Statistically significant improvement occurred over time within both groups. Moderate-to-severe TBI studies were of mixed quality, with the majority of results favoring HBO2 compared with "standard care." The placebo analysis conducted was limited by a lack of detail. For mild TBI, the results indicate that HBO2 is no better than sham treatment. Improvements within both the HBO2 and sham groups cannot be ignored. For acute treatment of moderate-to-severe TBI, although the methodology appears flawed across some studies, because of the complexity of brain injury, HBO2 may be beneficial as a relatively safe adjunctive therapy if feasible. Further research should be considered to resolve the controversy surrounding this field, but only if methodological flaws are avoided and bias is minimized.

  11. Emergence of complex behavior in pili-based motility in early stages of P. aeruginosa surface adaptation

    NASA Astrophysics Data System (ADS)

    Brill-Karniely, Yifat; Jin, Fan; Wong, Gerard C. L.; Frenkel, Daan; Dobnikar, Jure

    2017-04-01

    Pseudomonas aeruginosa move across surfaces by using multiple Type IV Pili (TFP), motorized appendages capable of force generation via linear extension/retraction cycles, to generate surface motions collectively known as twitching motility. Pseudomonas cells arrive at a surface with low levels of piliation and TFP activity, which both progressively increase as the cells sense the presence of a surface. At present, it is not clear how twitching motility emerges from these initial minimal conditions. Here, we build a simple model for TFP-driven surface motility without complications from viscous and solid friction on surfaces. We discover the unanticipated structural requirement that TFP motors need to have a minimal amount of effective angular rigidity in order for cells to perform the various classes of experimentally-observed motions. Moreover, a surprisingly small number of TFP are needed to recapitulate movement signatures associated with twitching: Two TFP can already produce movements reminiscent of recently observed slingshot type motion. Interestingly, jerky slingshot motions characteristic of twitching motility comprise the transition region between different types of observed crawling behavior in the dynamical phase diagram, such as self-trapped localized motion, 2-D diffusive exploration, and super-diffusive persistent motion.
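    A crude feel for the slingshot signature can be had from a toy force-balance simulation (entirely hypothetical dynamics and parameters, not the authors' model): a cell dragged toward two pilus anchor points relaxes smoothly, but each random release-and-reattachment event abruptly changes the net force, producing occasional large jumps against a background of small steps.

```python
import numpy as np

rng = np.random.default_rng(4)
pos = np.zeros(2)
pili = [pos + rng.normal(size=2) for _ in range(2)]  # two anchor points
track = [pos.copy()]

for t in range(500):
    forces = [anchor - pos for anchor in pili]        # linear pull per pilus
    pos = pos + 0.05 * np.sum(forces, axis=0)         # overdamped motion
    if rng.random() < 0.02:                           # random release/rebind
        k = rng.integers(2)
        pili[k] = pos + rng.normal(scale=2.0, size=2) # new attachment point
    track.append(pos.copy())

steps = np.linalg.norm(np.diff(np.array(track), axis=0), axis=1)
print("largest step / median step:", round(steps.max() / np.median(steps), 1))
```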

  12. Parallel particle filters for online identification of mechanistic mathematical models of physiology from monitoring data: performance and real-time scalability in simulation scenarios.

    PubMed

    Zenker, Sven

    2010-08-01

    Combining mechanistic mathematical models of physiology with quantitative observations using probabilistic inference may offer advantages over established approaches to computerized decision support in acute care medicine. Particle filters (PF) can perform such inference successively as data become available. The potential of PF for real-time state estimation (SE) for a model of cardiovascular physiology is explored using parallel computers, and the ability to achieve joint state and parameter estimation (JSPE) given minimal prior knowledge is tested. A parallelized sequential importance sampling/resampling algorithm was implemented, and its scalability for the pure SE problem for a non-linear five-dimensional ODE model of the cardiovascular system was evaluated on a Cray XT3 using up to 1,024 cores. JSPE was implemented using a state augmentation approach with artificial stochastic evolution of the parameters. Its performance when simultaneously estimating the 5 states and 18 unknown parameters, given observations only of arterial pressure, central venous pressure, heart rate, and, optionally, cardiac output, was evaluated in a simulated bleeding/resuscitation scenario. SE was successful and scaled up to 1,024 cores with appropriate algorithm parametrization, with real-time-equivalent performance for up to 10 million particles. JSPE in the described underdetermined scenario achieved excellent reproduction of observables and qualitative tracking of end-diastolic ventricular volumes and sympathetic nervous activity. However, only a subset of the posterior distributions of parameters concentrated around the true values for parts of the estimated trajectories. The performance of parallelized PFs makes their application to complex mathematical models of physiology for the purpose of clinical data interpretation, prediction, and therapy optimization appear promising. JSPE in the described extremely underdetermined scenario nevertheless extracted information of potential clinical relevance from the data in this simulation setting. However, fully satisfactory resolution of this problem when minimal prior knowledge about parameter values is available will require further methodological improvements, which are discussed.
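    The sequential importance sampling/resampling loop at the core of such filters fits in a few lines (a generic bootstrap particle filter on a scalar toy model, not the parallelized cardiovascular implementation; the model and noise levels are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)
N, T = 10_000, 100                      # particles, time steps
x_true = 0.0
particles = rng.normal(0.0, 1.0, size=N)

for t in range(T):
    # Simulate the "true" system and a noisy observation of it.
    x_true = 0.95 * x_true + rng.normal(0.0, 0.5)
    y = x_true + rng.normal(0.0, 1.0)
    # 1) Propagate every particle through the state model.
    particles = 0.95 * particles + rng.normal(0.0, 0.5, size=N)
    # 2) Weight particles by the observation likelihood.
    w = np.exp(-0.5 * (y - particles) ** 2)
    w /= w.sum()
    # 3) Resample: concentrate particles where the likelihood is high.
    particles = particles[rng.choice(N, size=N, p=w)]

print(f"filter estimate {particles.mean():+.3f} vs truth {x_true:+.3f}")
```

    State augmentation for JSPE, as described above, amounts to appending the unknown parameters to the state vector and giving them a small artificial random walk, so the same loop estimates them alongside the states.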

  13. Hierarchical coordinate systems for understanding complexity and its evolution, with applications to genetic regulatory networks.

    PubMed

    Egri-Nagy, Attila; Nehaniv, Chrystopher L

    2008-01-01

    Beyond computing complexity measures, it is sometimes worthwhile to investigate how complexity changes structurally, especially in artificial systems where we have complete knowledge of the evolutionary process. Hierarchical decomposition is a useful way of assessing structural complexity changes of organisms modeled as automata, and we show how recently developed computational tools can be used for this purpose, by computing holonomy decompositions and holonomy complexity. To gain insight into the evolution of complexity, we investigate the smoothness of the landscape structure of complexity under minimal transitions. As a proof of concept, we illustrate how the hierarchical complexity analysis reveals symmetries and irreversible structure in biological networks by applying the methods to the lac operon mechanism in the genetic regulatory network of Escherichia coli.

  14. Tailored minimally invasive management of complex calculi in horseshoe kidney.

    PubMed

    Ding, Jie; Zhang, Yuanyuan; Cao, Qifeng; Huang, Tao; Xu, Wei; Huang, Kai; Fang, Jing; Bai, Qiang; Qi, Jun; Huang, Yunteng

    2015-01-01

    Complex calculi in horseshoe kidney (HK) present a significant management challenge. Here, we report the clinical efficacy of extracorporeal shock wave lithotripsy (ESWL), minimally invasive percutaneous nephrolithotomy (MPCNL) and flexible ureteroscopy (FURS), combined with holmium laser lithotripsy, in the treatment of calculi in HK. From January 2005 to May 2014, 62 HK patients with renal calculi were reviewed in terms of medical history, treatment modality and therapeutic outcome in a single tertiary care hospital. Among the patients, 11 with a solitary stone ≤ 1.5 cm in diameter received ESWL, leading to an overall stone-free rate of 72.7%; 18 with stone diameters of 2-3 cm or less received retrograde flexible ureteroscopy, with a recorded mean digitized surface area (DSA) of 339.6 ± 103.9 mm2, mean operation time of 93.1 ± 11.5 minutes and overall stone-free rate of 88.9%; and 33 with staghorn or complex calculi (d ≥ 2 cm) had MPCNL or MPCNL-FURS, with a recorded mean DSA of 691.0 ± 329.9 vs. 802.9 ± 333.3 mm2, mean operation time of 106.4 ± 16.6 vs. 124.4 ± 15.1 min and overall stone-free rate of 89.5% vs. 92.9%. For complex calculi (d ≥ 2 cm), MPCNL combined with antegrade FURS was superior to MPCNL alone in reducing the number of tracts and controlling the mean hemoglobin drop, but required a longer operation time. As minimally invasive treatments, the combination of MPCNL and antegrade FURS provides a safe and effective modality for managing staghorn or complex calculi (d ≥ 2 cm) in HK, with significantly reduced blood loss compared with MPCNL alone, while retrograde FURS alone is favorable for stones with a diameter of 2-3 cm or less. ESWL is effective for small solitary stones (d ≤ 1.5 cm). The treatment modality should be tailored to the individual condition.

  15. Entropy and equilibrium via games of complexity

    NASA Astrophysics Data System (ADS)

    Topsøe, Flemming

    2004-09-01

    It is suggested that thermodynamical equilibrium equals game theoretical equilibrium. Aspects of this thesis are discussed. The philosophy is consistent with maximum entropy thinking of Jaynes, but goes one step deeper by deriving the maximum entropy principle from an underlying game theoretical principle. The games introduced are based on measures of complexity. Entropy is viewed as minimal complexity. It is demonstrated that Tsallis entropy ( q-entropy) and Kaniadakis entropy ( κ-entropy) can be obtained in this way, based on suitable complexity measures. A certain unifying effect is obtained by embedding these measures in a two-parameter family of entropy functions.
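    For reference, the two deformed entropies mentioned have the standard textbook forms (general definitions, not derivations from this abstract): for a probability distribution $(p_i)$,

$$
S_q = \frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
S_\kappa = -\sum_i p_i \ln_\kappa p_i,
\quad
\ln_\kappa x = \frac{x^{\kappa} - x^{-\kappa}}{2\kappa},
$$

    both of which recover the Shannon entropy $-\sum_i p_i \ln p_i$ in the limits $q \to 1$ and $\kappa \to 0$.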

  16. Visualizing feasible operating ranges within tissue engineering systems using a "windows of operation" approach: a perfusion-scaffold bioreactor case study.

    PubMed

    McCoy, Ryan J; O'Brien, Fergal J

    2012-12-01

    Tissue engineering approaches to developing functional substitutes are often highly complex, multivariate systems where many aspects of the biomaterials, bio-regulatory factors or cell sources may be controlled in an effort to enhance tissue formation. Furthermore, success is based on multiple performance criteria reflecting both the quantity and quality of the tissue produced. Managing the trade-offs between different performance criteria is a challenge. A "windows of operation" tool that graphically represents feasible operating spaces to achieve user-defined levels of performance has previously been described by researchers in the bio-processing industry. This paper demonstrates the value of "windows of operation" to the tissue engineering field using a perfusion-scaffold bioreactor system as a case study. In our laboratory, perfusion bioreactor systems are utilized in the context of bone tissue engineering to enhance the osteogenic differentiation of cell-seeded scaffolds. A key challenge of such perfusion bioreactor systems is to maximize the induction of osteogenesis while minimizing cell detachment from the scaffold. Two key operating variables that influence these performance criteria are the mean scaffold pore size and the flow rate. Using cyclooxygenase-2 and osteopontin gene expression levels as surrogate indicators of osteogenesis, we employed the "windows of operation" methodology to rapidly identify feasible operating ranges for the mean scaffold pore size and flow rate that achieved user-defined levels of performance for cell detachment and differentiation. Incorporating such tools into the tissue engineer's armory should yield a greater understanding of the highly complex systems used and aid decision-making in the future translation of products from the benchtop to the marketplace. Copyright © 2012 Wiley Periodicals, Inc.
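    Operationally, a window of operation is just the region of the operating-variable plane where every performance criterion clears its user-defined level, which is easy to sketch on a grid (hypothetical response surfaces and thresholds below; the actual study used measured gene expression and detachment data):

```python
import numpy as np

pore = np.linspace(100, 400, 61)   # mean scaffold pore size (um)
flow = np.linspace(0.1, 5.0, 50)   # flow rate (mL/min)
P, F = np.meshgrid(pore, flow)

# Toy response surfaces (assumptions): stimulation rises with flow and pore
# size; detachment worsens with flow and improves with larger pores.
osteo = F * (P / 400.0)
detach = F ** 2 / (P / 100.0)

# The window: both user-defined performance levels satisfied at once.
window = (osteo >= 1.0) & (detach <= 5.0)
print(f"feasible fraction of operating space: {window.mean():.1%}")
```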

  17. Effects of Camera Arrangement on Perceptual-Motor Performance in Minimally Invasive Surgery

    ERIC Educational Resources Information Center

    Delucia, Patricia R.; Griswold, John A.

    2011-01-01

    Minimally invasive surgery (MIS) is performed for a growing number of treatments. Whereas open surgery requires large incisions, MIS relies on small incisions through which instruments are inserted and tissues are visualized with a camera. MIS results in benefits for patients compared with open surgery, but degrades the surgeon's perceptual-motor…

  18. Optimal Tuner Selection for Kalman Filter-Based Aircraft Engine Performance Estimation

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Garg, Sanjay

    2010-01-01

    A linear point design methodology for minimizing the error in on-line Kalman filter-based aircraft engine performance estimation applications is presented. This technique specifically addresses the underdetermined estimation problem, where there are more unknown parameters than available sensor measurements. A systematic approach is applied to produce a model tuning parameter vector of appropriate dimension to enable estimation by a Kalman filter, while minimizing the estimation error in the parameters of interest. Tuning parameter selection is performed using a multi-variable iterative search routine which seeks to minimize the theoretical mean-squared estimation error. This paper derives theoretical Kalman filter estimation error bias and variance values at steady-state operating conditions, and presents the tuner selection routine applied to minimize these values. Results from the application of the technique to an aircraft engine simulation are presented and compared to the conventional approach of tuner selection. Experimental simulation results are found to be in agreement with theoretical predictions. The new methodology is shown to yield a significant improvement in on-line engine performance estimation accuracy.
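    The flavor of the selection problem can be sketched by brute force on a toy linear system (a hypothetical illustration, not the paper's engine model or its iterative search): augment the state with each candidate tuner subset, solve the steady-state filtering Riccati equation, and keep the subset with the smallest theoretical error variance.

```python
import itertools
import numpy as np
from scipy.linalg import solve_discrete_are

rng = np.random.default_rng(6)
n, m, k = 4, 6, 2                       # states, candidate params, sensors
A = 0.9 * np.eye(n)
G = rng.normal(size=(n, m))             # parameter-to-state influence (toy)
C = rng.normal(size=(k, n))
Q0, R = 0.01 * np.eye(n), 0.1 * np.eye(k)

best = None
for subset in itertools.combinations(range(m), k):  # tuner dim = sensor dim
    Gs = G[:, subset]                   # augment state with chosen tuners,
    Aa = np.block([[A, Gs],             # modeled as slow random walks
                   [np.zeros((k, n)), np.eye(k)]])
    Ca = np.hstack([C, np.zeros((k, k))])
    Qa = np.block([[Q0, np.zeros((n, k))],
                   [np.zeros((k, n)), 1e-4 * np.eye(k)]])
    P = solve_discrete_are(Aa.T, Ca.T, Qa, R)   # steady-state error covariance
    if best is None or np.trace(P) < best[0]:
        best = (np.trace(P), subset)

print("best tuner subset:", best[1], "trace(P) =", round(best[0], 4))
```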

  19. Design of the low area monotonic trim DAC in 40 nm CMOS technology for pixel readout chips

    NASA Astrophysics Data System (ADS)

    Drozd, A.; Szczygiel, R.; Maj, P.; Satlawa, T.; Grybos, P.

    2014-12-01

    Recent research in hybrid pixel detectors working in single photon counting mode focuses on nanometer or 3D technologies, which allow making pixels smaller and implementing more complex solutions in each pixel. A single pixel in readout electronics for X-ray detection usually comprises a charge amplifier, a shaper and a discriminator, which together allow classification of events occurring at the detector as true or false hits by comparing the amplitude of the obtained signal with a threshold voltage, minimizing the influence of noise effects. However, making the pixel size smaller often causes problems with pixel-to-pixel uniformity, and additional effects like charge sharing become more visible. To improve channel-to-channel uniformity or to implement an algorithm for minimizing the charge sharing effect, small-area trimming DACs working independently in each pixel are necessary. Meeting the requirement of small area, however, often results in poor linearity and even non-monotonicity. In this paper we present a novel low-area thermometer-coded 6-bit DAC implemented in 40 nm CMOS technology. Monte Carlo simulations were performed on the described design, proving that under all conditions the designed DAC is inherently monotonic. The presented DAC was implemented in a prototype readout chip with 432 pixels working in single photon counting mode, with two trimming DACs in each pixel. Each DAC occupies an area of 8 μm × 18.5 μm. Measurements and chip tests were performed to obtain reliable statistical results.
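    The claim of inherent monotonicity follows directly from the thermometer coding: each code step switches in one more unit element, and since every element contributes a positive weight, the output can only increase with the code regardless of mismatch. A ten-line model makes the point (illustrative mismatch level assumed):

```python
import numpy as np

rng = np.random.default_rng(7)
units = 1.0 + 0.05 * rng.normal(size=63)   # 63 unit elements, 5% mismatch

codes = np.arange(64)                       # 6-bit input code
out = np.array([units[:c].sum() for c in codes])

assert np.all(np.diff(out) > 0), "monotonic despite element mismatch"
print("max DNL (LSB):", round(np.abs(np.diff(out) - 1.0).max(), 3))
```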

  20. Qualification of the Tropical Rainfall Measuring Mission Solar Array Deployment System

    NASA Technical Reports Server (NTRS)

    Lawrence, Jon

    1998-01-01

    The Tropical Rainfall Measuring Mission (TRMM) solar arrays are placed into orbital configuration by a complex deployment system. Its two wings each comprise twin seven square solar panels located by a twelve foot articulated boom. The four spring-driven hinge lines per wing are rate-limited by viscous dampers. The wings are stowed against the spacecraft kinematically, and released by five pyrotechnically actuated mechanisms. Since deployment failure would be catastrophic, a total of 17 deployment tests were completed to qualify the system for the worst-case launch environment. This successful testing culminated in the flawless deployment of the solar arrays on orbit, 15 minutes after launch in November 1997. The custom gravity-negation system used to perform deployment testing is modular, allowing its setup in several locations, including the launch site in Japan. Both platform and height can be varied to meet the requirements of the test configuration and the test facility. Its air-pad flotation system meets tight packaging requirements, allowing installation while stowed against the spacecraft without breaking any flight interfaces, and avoiding interference during motion. This system was designed concurrently with the deployment system to facilitate its installation and to aid in the integration of the flight system to the spacecraft, while demonstrating deployment capabilities. Critical parameters for successful testing were the alignment of the deployment axes and tables to gravity, the alignment of table seams to minimize discontinuities, and the minimization of pressure drops in the air supply system. Orbital performance was similar to that predicted by ground testing.
