Science.gov

Sample records for minimal cut-set methodology

  1. Minimal cut-set methodology for artificial intelligence applications

    SciTech Connect

    Weisbin, C.R.; de Saussure, G.; Barhen, J.; Oblow, E.M.; White, J.C.

    1984-01-01

    This paper reviews minimal cut-set theory and illustrates its application with an example. The minimal cut-set approach uses disjunctive normal form in Boolean algebra and various Boolean operators to simplify very complicated tree structures composed of AND/OR gates. The simplification process is automated and performed off-line using existing computer codes to implement the Boolean reduction on the finite but large tree structure. With this approach, on-line expert diagnostic systems, whose response time is critical, could directly determine whether a goal is achievable by comparing the actual system state to a concisely stored set of preprocessed critical state elements.
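
    As a small constructed example of this reduction (ours, not the paper's): a tree whose top event expands in disjunctive normal form to

      TOP = (A ∧ B) ∨ (A ∧ B ∧ C) ∨ (A ∧ D)

    simplifies under the absorption law X ∨ (X ∧ Y) = X to TOP = (A ∧ B) ∨ (A ∧ D), so the concisely stored minimal cut sets are {A, B} and {A, D}.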

  2. CUTSETS - MINIMAL CUT SET CALCULATION FOR DIGRAPH AND FAULT TREE RELIABILITY MODELS

    NASA Technical Reports Server (NTRS)

    Iverson, D. L.

    1994-01-01

    Fault tree and digraph models are frequently used for system failure analysis. Both types of models represent a failure space view of the system using AND and OR nodes in a directed graph structure. Fault trees must have a tree structure and do not allow cycles or loops in the graph. Digraphs allow any pattern of interconnection, including cycles and loops. A common operation performed on digraph and fault tree models is the calculation of minimal cut sets. A cut set is a set of basic failures that could cause a given target failure event to occur. A minimal cut set for a target event node in a fault tree or digraph is any cut set for the node with the property that if any one of the failures in the set is removed, the occurrence of the other failures in the set will not cause the target failure event. CUTSETS will identify all the minimal cut sets for a given node. The CUTSETS package contains programs that solve for minimal cut sets of fault trees and digraphs using object-oriented programming techniques. These cut set codes can be used to solve graph models for reliability analysis and identify potential single point failures in a modeled system. The fault tree minimal cut set code reads in a fault tree model input file with each node listed in a text format. In the input file the user specifies a top node of the fault tree and a maximum cut set size to be calculated. CUTSETS will find minimal sets of basic events which would cause the failure at the output of a given fault tree gate. The program can find all the minimal cut sets of a node, or minimal cut sets up to a specified size. The algorithm performs a recursive top-down parse of the fault tree, starting at the specified top node, and combines the cut sets of each child node into sets of basic event failures that would cause the failure event at the output of that gate. Minimal cut set solutions can be found for all nodes in the fault tree or just for the top node. The digraph cut set code uses the same
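
    The recursive combination step described above can be sketched as follows (a minimal illustration of the general algorithm, not the CUTSETS code itself; the tree encoding and node names are invented for the example):

      from itertools import product

      # Fault tree: gates map to ("AND"|"OR", children); other names are basic events.
      TREE = {
          "TOP": ("OR",  ["G1", "G2"]),
          "G1":  ("AND", ["A", "B"]),
          "G2":  ("AND", ["A", "G3"]),
          "G3":  ("OR",  ["B", "C"]),
      }

      def minimize(sets):
          """Drop any cut set that contains another one (Boolean absorption)."""
          keep = []
          for s in sorted(set(sets), key=len):
              if not any(k <= s for k in keep):
                  keep.append(s)
          return keep

      def cut_sets(node):
          """Recursive top-down parse: OR unions child cut sets, AND crosses them."""
          if node not in TREE:                       # basic event
              return [frozenset([node])]
          gate, children = TREE[node]
          child_sets = [cut_sets(c) for c in children]
          if gate == "OR":
              combined = [s for cs in child_sets for s in cs]
          else:  # AND: one cut set from each child, unioned
              combined = [frozenset().union(*combo) for combo in product(*child_sets)]
          return minimize(combined)

      print(cut_sets("TOP"))   # the minimal cut sets {A, B} and {A, C}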

  3. Cut set-based risk and reliability analysis for arbitrarily interconnected networks

    DOEpatents

    Wyss, Gregory D.

    2000-01-01

    Method for computing all-terminal reliability for arbitrarily interconnected networks such as the United States public switched telephone network. The method includes an efficient search algorithm to generate minimal cut sets for nonhierarchical networks directly from the network connectivity diagram. Efficiency of the search algorithm stems in part from its basis on only link failures. The method also includes a novel quantification scheme that likewise reduces computational effort associated with assessing network reliability based on traditional risk importance measures. Vast reductions in computational effort are realized since combinatorial expansion and subsequent Boolean reduction steps are eliminated through analysis of network segmentations using a technique of assuming node failures to occur on only one side of a break in the network, and repeating the technique for all minimal cut sets generated with the search algorithm. The method functions equally well for planar and non-planar networks.
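
    The core concept, a minimal set of link failures that disconnects the network, can be sketched by brute force (illustration only; the patented search algorithm generates these far more efficiently, and networkx plus the ring example are our choices):

      from itertools import combinations
      import networkx as nx

      def minimal_link_cut_sets(g, max_size=2):
          """Minimal sets of link failures that disconnect g, by brute force."""
          found = []
          for k in range(1, max_size + 1):
              for links in combinations(g.edges, k):
                  if any(set(prev) <= set(links) for prev in found):
                      continue                       # already contains a cut set
                  h = g.copy()
                  h.remove_edges_from(links)
                  if not nx.is_connected(h):
                      found.append(links)
          return found

      ring = nx.cycle_graph(4)                       # 4 nodes connected in a ring
      print(minimal_link_cut_sets(ring))             # every pair of links cuts the ring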

  4. Energy minimization in medical image analysis: Methodologies and applications.

    PubMed

    Zhao, Feng; Xie, Xianghua

    2016-02-01

    Energy minimization is of particular interest in medical image analysis. In the past two decades, a variety of optimization schemes have been developed. In this paper, we present a comprehensive survey of the state-of-the-art optimization approaches. These algorithms are mainly classified into two categories: continuous methods and discrete methods. The former include the Newton-Raphson, gradient descent, conjugate gradient, proximal gradient, coordinate descent, and genetic algorithm-based methods, while the latter cover the graph cuts, belief propagation, tree-reweighted message passing, linear programming, maximum margin learning, simulated annealing, and iterated conditional modes methods. We also discuss the minimal surface, primal-dual, and multi-objective optimization methods. In addition, we review several comparative studies that evaluate the performance of different minimization techniques in terms of accuracy, efficiency, or complexity. These optimization techniques are widely used in many medical applications, for example, image segmentation, registration, reconstruction, motion tracking, and compressed sensing. We thus give an overview of those applications as well.
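
    As a toy instance of the continuous category (plain gradient descent on a one-dimensional quadratic energy; the function and step size are invented for illustration):

      # Minimize E(u) = (u - 3)^2 by gradient descent: u <- u - step * dE/du.
      u, step = 0.0, 0.1
      for _ in range(100):
          u -= step * 2 * (u - 3)
      print(u)   # converges to 3, the minimizer of E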

  5. Pollution balance. A new methodology for minimizing waste production in manufacturing processes

    SciTech Connect

    Hilaly, A.K.; Sikdar, S.K.

    1994-11-01

    A new methodology based on a generic pollution balance equation has been developed for minimizing waste production in manufacturing processes. A "pollution index," defined as the mass of waste produced per unit mass of a product, has been introduced to provide a quantitative measure of waste generation in a process. A waste reduction algorithm has also been developed from the pollution balance equation. This paper explains this methodology and demonstrates the applicability of the method by a case study. 8 refs., 7 figs.
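
    A worked illustration of the index defined above (numbers invented for the example): a process that turns out 1,000 kg of product while generating 250 kg of waste has a pollution index of 250/1000 = 0.25 kg of waste per kg of product; the waste reduction algorithm then seeks process modifications that drive this index toward zero.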

  6. Anthropogenic microfibres pollution in marine biota. A new and simple methodology to minimize airborne contamination.

    PubMed

    Torre, Michele; Digka, Nikoletta; Anastasopoulou, Aikaterini; Tsangaris, Catherine; Mytilineou, Chryssi

    2016-12-15

    Research studies on the effects of microlitter on marine biota have become more and more frequent in the last few years. However, there is strong evidence that scientific results based on microlitter analyses can be biased by contamination from airborne fibres. This study demonstrates a low-cost and easy-to-apply methodology to minimize background contamination and thus increase the validity of results. Contamination during the gastrointestinal content analysis of 400 fishes was tested for several sample processing steps at high risk of airborne contamination (e.g. dissection, stereomicroscopic analysis, and chemical digestion treatment for microlitter extraction). It was demonstrated that, using our methodology based on hermetic enclosure devices isolating the working areas during the various processing steps, airborne contamination was reduced by 95.3%. The simplicity and low cost of this methodology mean that it could be applied not only in the laboratory but also in field or onboard work.

  7. A methodology for formulating a minimal uncertainty model for robust control system design and analysis

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Chang, B.-C.; Fischl, Robert

    1989-01-01

    In the design and analysis of robust control systems for uncertain plants, the technique of formulating what is termed an M-delta model has become widely accepted and applied in the robust control literature. The M represents the transfer function matrix M(s) of the nominal system, and delta represents an uncertainty matrix acting on M(s). The uncertainty can arise from various sources, such as structured uncertainty from parameter variations or multiple unstructured uncertainties from unmodeled dynamics and other neglected phenomena. In general, delta is a block diagonal matrix, and for real parameter variations the diagonal elements are real. As stated in the literature, this structure can always be formed for any linear interconnection of inputs, outputs, transfer functions, parameter variations, and perturbations. However, very little of the literature addresses methods for obtaining this structure, and none of this literature addresses a general methodology for obtaining a minimal M-delta model for a wide class of uncertainty. Since having a delta matrix of minimum order would improve the efficiency of structured singular value (or multivariable stability margin) computations, a method of obtaining a minimal M-delta model would be useful. A generalized method of obtaining a minimal M-delta structure for systems with real parameter variations is given.
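
    For context, the M-delta interconnection the abstract refers to (notation as commonly used in the robust control literature, stated here for orientation) is the feedback loop z = M(s)w, w = Δz with block diagonal uncertainty

      Δ = diag(δ₁ I_{k₁}, …, δ_r I_{k_r}, Δ₁, …, Δ_f),

    where the real scalars δᵢ capture parameter variations and the full blocks Δⱼ capture unstructured uncertainty; a minimal M-delta model is one in which the repeated-block dimensions kᵢ, and hence the overall size of Δ, are as small as possible.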

  8. Using benchmarking to minimize common DOE waste streams. Volume 1, Methodology and liquid photographic waste

    SciTech Connect

    Levin, V.

    1994-04-01

    Finding innovative ways to reduce waste streams generated at Department of Energy (DOE) sites by 50% by the year 2000 is a challenge for DOE's waste minimization efforts. This report examines the usefulness of benchmarking as a waste minimization tool, specifically regarding common waste streams at DOE sites. A team of process experts from a variety of sites, a project leader, and benchmarking consultants completed the project with management support provided by the Waste Minimization Division EM-352. Using a 12-step benchmarking process, the team examined current waste minimization processes for liquid photographic waste used at their sites and used telephone and written questionnaires to find "best-in-class" industry partners willing to share information about their best waste minimization techniques and technologies through a site visit. Eastman Kodak Co. and Johnson Space Center/National Aeronautics and Space Administration (NASA) agreed to be partners. The site visits yielded strategies for source reduction, recycle/recovery of components, regeneration/reuse of solutions, and treatment of residuals, as well as best management practices. An additional benefit of the work was the opportunity for DOE process experts to network and exchange ideas with their peers at similar sites.

  9. POLLUTION BALANCE: A NEW METHODOLOGY FOR MINIMIZING WASTE PRODUCTION IN MANUFACTURING PROCESSES.

    EPA Science Inventory

    A new methodology based on a generic pollution balance equation has been developed for minimizing waste production in manufacturing processes. A "pollution index," defined as the mass of waste produced per unit mass of a product, has been introduced to provide a quantitative meas...

  10. Methodology for Minimizing Losses for the Harman Technique at High Temperatures

    NASA Astrophysics Data System (ADS)

    McCarty, R.; Thompson, J.; Sharp, J.; Thompson, A.; Bierschenk, J.

    2012-06-01

    A high-temperature Harman technique for measuring material ZT, or thermoelectric figure of merit, with increased measurement accuracy is presented. Traditional Harman tests are sensitive to radiation heat losses at elevated temperature, and measurement errors are minimized by applying current in positive and negative polarities while thermally sinking the sample base to a constant temperature for both polarities (referred to here as bottom temperature match, BTM). Since the sample top temperature differs between polarities in BTM, the heat losses are not equivalent and still add error to the ZT measurement. A modification is presented in which the sample base temperature is adjusted until the sample top temperature is the same in both polarities (referred to as top temperature match, TTM). This ensures that heat losses from the top of the sample are nearly identical and cancel out of the ZT calculation. A temperature-controlled radiation guard maintained at the sample top temperature is employed to minimize radiation loss and increase ZT calculation accuracy. Finite-element analysis (FEA) models suggest that ZT errors less than 5% for Bi2Te3 alloys tested at 250°C are achievable, a 30% improvement over the conventional BTM approach. Experimental results support these trends.
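
    For orientation, the ideal loss-free Harman relation behind these measurements (standard in the thermoelectric literature; stated here for context, not taken from the paper): the steady-state DC voltage across the sample separates into a resistive part V_R and a Seebeck part V_S, and

      ZT = V_S / V_R = (V_total − V_R) / V_R,

    so any heat loss that perturbs the temperature difference across the sample perturbs V_S, and with it the measured ZT.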

  11. A minimally invasive methodology based on morphometric parameters for day 2 embryo quality assessment.

    PubMed

    Molina, Inmaculada; Lázaro-Ibáñez, Elisa; Pertusa, Jose; Debón, Ana; Martínez-Sanchís, Juan Vicente; Pellicer, Antonio

    2014-10-01

    The risk of multiple pregnancy to maternal-fetal health can be minimized by reducing the number of embryos transferred. New tools for selecting embryos with the highest implantation potential should be developed. The aim of this study was to evaluate the ability of morphological and morphometric variables to predict implantation by analysing images of embryos. This was a retrospective study of 135 embryo photographs from 112 IVF-ICSI cycles carried out between January and March 2011. The embryos were photographed immediately before transfer using Cronus 3 software. Their images were analysed using the public program ImageJ. Significant effects (P < 0.05) and higher discriminant power to predict implantation were observed for the morphometric embryo variables compared with morphological ones. The features for successfully implanted embryos were as follows: four cells on day 2 of development; all blastomeres of circular shape (roundness factor greater than 0.9); an average zona pellucida thickness of 13 µm; and an average embryo area of 17695.1 µm². Embryo size, which is described by its area and the average roundness factor for each cell, provides two objective variables to consider when predicting implantation. This approach should be further investigated for its potential ability to improve embryo scoring.
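
    For context, the roundness factor quoted is consistent with the standard circularity measure of image-analysis packages such as ImageJ (the exact formula is our assumption): roundness = 4πA/P², which equals 1 for a perfect circle. A blastomere of area 900 µm² and perimeter 110 µm, for instance, scores 4π·900/110² ≈ 0.93, above the 0.9 threshold.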

  12. Ensuring transparency and minimization of methodologic bias in preclinical pain research: PPRECISE considerations.

    PubMed

    Andrews, Nick A; Latrémolière, Alban; Basbaum, Allan I; Mogil, Jeffrey S; Porreca, Frank; Rice, Andrew S C; Woolf, Clifford J; Currie, Gillian L; Dworkin, Robert H; Eisenach, James C; Evans, Scott; Gewandter, Jennifer S; Gover, Tony D; Handwerker, Hermann; Huang, Wenlong; Iyengar, Smriti; Jensen, Mark P; Kennedy, Jeffrey D; Lee, Nancy; Levine, Jon; Lidster, Katie; Machin, Ian; McDermott, Michael P; McMahon, Stephen B; Price, Theodore J; Ross, Sarah E; Scherrer, Grégory; Seal, Rebecca P; Sena, Emily S; Silva, Elizabeth; Stone, Laura; Svensson, Camilla I; Turk, Dennis C; Whiteside, Garth

    2016-04-01

    There is growing concern about lack of scientific rigor and transparent reporting across many preclinical fields of biological research. Poor experimental design and lack of transparent reporting can result in conscious or unconscious experimental bias, producing results that are not replicable. The Analgesic, Anesthetic, and Addiction Clinical Trial Translations, Innovations, Opportunities, and Networks (ACTTION) public-private partnership with the U.S. Food and Drug Administration sponsored a consensus meeting of the Preclinical Pain Research Consortium for Investigating Safety and Efficacy (PPRECISE) Working Group. International participants from universities, funding agencies, government agencies, industry, and a patient advocacy organization attended. Reduction of publication bias, increasing the ability of others to faithfully repeat experimental methods, and increased transparency of data reporting were specifically discussed. Parameters deemed essential to increase confidence in the published literature were clear, specific reporting of an a priori hypothesis and definition of the primary outcome measure. Whether power calculations, and measurement of a minimal meaningful effect size to inform them, should be a core component of the preclinical research effort provoked considerable discussion, with many but not all agreeing. Greater transparency of reporting should be driven by scientists, journal editors, reviewers, and grant funders. The conduct of high-quality science that is fully reported should not preclude novelty and innovation in preclinical pain research, and indeed, any efforts that curtail such innovation would be misguided. We believe that to achieve the goal of finding effective new treatments for patients with pain, the pain field needs to deal with these challenging issues.

  13. Ensuring transparency and minimization of methodologic bias in preclinical pain research: PPRECISE considerations

    PubMed Central

    Andrews, Nick A.; Latrémolière, Alban; Basbaum, Allan I.; Mogil, Jeffrey S.; Porreca, Frank; Rice, Andrew S.C.; Woolf, Clifford J.; Currie, Gillian L.; Dworkin, Robert H.; Eisenach, James C.; Evans, Scott; Gewandter, Jennifer S.; Gover, Tony D.; Handwerker, Hermann; Huang, Wenlong; Iyengar, Smriti; Jensen, Mark P.; Kennedy, Jeffrey D.; Lee, Nancy; Levine, Jon; Lidster, Katie; Machin, Ian; McDermott, Michael P.; McMahon, Stephen B.; Price, Theodore J.; Ross, Sarah E.; Scherrer, Grégory; Seal, Rebecca P.; Sena, Emily S.; Silva, Elizabeth; Stone, Laura; Svensson, Camilla I.; Turk, Dennis C.; Whiteside, Garth

    2015-01-01

    There is growing concern about lack of scientific rigor and transparent reporting across many preclinical fields of biological research. Poor experimental design and lack of transparent reporting can result in conscious or unconscious experimental bias, producing results that are not replicable. The Analgesic, Anesthetic, and Addiction Clinical Trial Translations, Innovations, Opportunities, and Networks (ACTTION) public–private partnership with the U.S. Food and Drug Administration sponsored a consensus meeting of the Preclinical Pain Research Consortium for Investigating Safety and Efficacy (PPRECISE) Working Group. International participants from universities, funding agencies, government agencies, industry, and a patient advocacy organization attended. Reduction of publication bias, increasing the ability of others to faithfully repeat experimental methods, and increased transparency of data reporting were specifically discussed. Parameters deemed essential to increase confidence in the published literature were clear, specific reporting of an a priori hypothesis and definition of the primary outcome measure. Whether power calculations, and measurement of a minimal meaningful effect size to inform them, should be a core component of the preclinical research effort provoked considerable discussion, with many but not all agreeing. Greater transparency of reporting should be driven by scientists, journal editors, reviewers, and grant funders. The conduct of high-quality science that is fully reported should not preclude novelty and innovation in preclinical pain research, and indeed, any efforts that curtail such innovation would be misguided. We believe that to achieve the goal of finding effective new treatments for patients with pain, the pain field needs to deal with these challenging issues. PMID:26683237

  14. Towards uniform accelerometry analysis: a standardization methodology to minimize measurement bias due to systematic accelerometer wear-time variation.

    PubMed

    Katapally, Tarun R; Muhajarine, Nazeem

    2014-05-01

    Accelerometers are predominantly used to objectively measure the entire range of activity intensities - sedentary behaviour (SED), light physical activity (LPA) and moderate to vigorous physical activity (MVPA). However, studies consistently report results without accounting for systematic accelerometer wear-time variation (within and between participants), jeopardizing the validity of these results. This study describes the development of a standardization methodology to understand and minimize measurement bias due to wear-time variation. Accelerometry is generally conducted over seven consecutive days, with participants' data being commonly considered 'valid' only if wear-time is at least 10 hours/day. However, even within 'valid' data, there could be systematic wear-time variation. To explore this variation, accelerometer data of Smart Cities, Healthy Kids study (www.smartcitieshealthykids.com) were analyzed descriptively and with repeated measures multivariate analysis of variance (MANOVA). Subsequently, a standardization method was developed, where case-specific observed wear-time is controlled to an analyst specified time period. Next, case-specific accelerometer data are interpolated to this controlled wear-time to produce standardized variables. To understand discrepancies owing to wear-time variation, all analyses were conducted pre- and post-standardization. Descriptive analyses revealed systematic wear-time variation, both between and within participants. Pre- and post-standardized descriptive analyses of SED, LPA and MVPA revealed a persistent and often significant trend of wear-time's influence on activity. SED was consistently higher on weekdays before standardization; however, this trend was reversed post-standardization. Even though MVPA was significantly higher on weekdays both pre- and post-standardization, the magnitude of this difference decreased post-standardization. Multivariable analyses with standardized SED, LPA and MVPA as outcome
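
    A minimal sketch of the controlled wear-time idea (the simple proportional rescaling and the 600-minute target are assumptions chosen for illustration; the published method interpolates case-specific data to an analyst-specified wear-time):

      def standardize(activity_min, observed_wear_min, controlled_wear_min=600):
          """Rescale an activity total from its observed wear-time to a
          controlled wear-time (here 600 min = 10 h/day), so that days with
          unequal wear are compared on an equal footing."""
          return activity_min * controlled_wear_min / observed_wear_min

      # 90 min of MVPA over 12 h of wear vs. 80 min over 10 h of wear:
      print(standardize(90, 720))   # 75.0 -- lower once wear-time is controlled
      print(standardize(80, 600))   # 80.0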

  15. Minimal Pairs: Minimal Importance?

    ERIC Educational Resources Information Center

    Brown, Adam

    1995-01-01

    This article argues that minimal pairs do not merit as much attention as they receive in pronunciation instruction. There are other aspects of pronunciation that are of greater importance, and there are other ways of teaching vowel and consonant pronunciation. (13 references) (VWL)

  16. Up-cycling waste glass to minimal water adsorption/absorption lightweight aggregate by rapid low temperature sintering: optimization by dual process-mixture response surface methodology.

    PubMed

    Velis, Costas A; Franco-Salinas, Claudia; O'Sullivan, Catherine; Najorka, Jens; Boccaccini, Aldo R; Cheeseman, Christopher R

    2014-07-01

    Mixed color waste glass extracted from municipal solid waste is either not recycled, in which case it is an environmental and financial liability, or it is used in relatively low value applications such as normal weight aggregate. Here, we report on converting it into a novel glass-ceramic lightweight aggregate (LWA), potentially suitable for high added value applications in structural concrete (upcycling). The artificial LWA particles were formed by rapidly sintering (<10 min) waste glass powder with clay mixes using sodium silicate as binder and borate salt as flux. Composition and processing were optimized using response surface methodology (RSM) modeling, and specifically (i) a combined process-mixture dual RSM, and (ii) multiobjective optimization functions. The optimization considered raw materials and energy costs. Mineralogical and physical transformations occur during sintering and a cellular vesicular glass-ceramic composite microstructure is formed, with strong correlations existing between bloating/shrinkage during sintering, density and water adsorption/absorption. The diametrical expansion could be effectively modeled via the RSM and controlled to meet a wide range of specifications; here we optimized for LWA structural concrete. The optimally designed LWA is sintered at comparatively low temperatures (825-835 °C), thus potentially saving costs and lowering emissions; it had exceptionally low water adsorption/absorption (6.1-7.2% w/w_d; optimization target: 1.5-7.5% w/w_d); while remaining substantially lightweight (density: 1.24-1.28 g·cm⁻³; target: 0.9-1.3 g·cm⁻³). This is a considerable advancement for designing effective environmentally friendly lightweight concrete constructions, and boosting resource efficiency of waste glass flows.
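
    For orientation, the generic second-order model fitted in response surface methodology (the standard RSM form, stated here for context; the paper's dual model couples such a polynomial in the process variables with a Scheffé-type polynomial in the mixture components, whose proportions sum to 1):

      y = β₀ + Σᵢ βᵢxᵢ + Σᵢ βᵢᵢxᵢ² + Σᵢ<ⱼ βᵢⱼxᵢxⱼ + ε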

  17. Minimal Reduplication

    ERIC Educational Resources Information Center

    Kirchner, Jesse Saba

    2010-01-01

    This dissertation introduces Minimal Reduplication, a new theory and framework within generative grammar for analyzing reduplication in human language. I argue that reduplication is an emergent property in multiple components of the grammar. In particular, reduplication occurs independently in the phonology and syntax components, and in both cases…

  18. Minimal cosmography

    NASA Astrophysics Data System (ADS)

    Piazza, Federico; Schücker, Thomas

    2016-04-01

    The minimal requirement for cosmography—a non-dynamical description of the universe—is a prescription for calculating null geodesics, and time-like geodesics as a function of their proper time. In this paper, we consider the most general linear connection compatible with homogeneity and isotropy, but not necessarily with a metric. A light-cone structure is assigned by choosing a set of geodesics representing light rays. This defines a "scale factor" and a local notion of distance, as that travelled by light in a given proper time interval. We find that the velocities and relativistic energies of free-falling bodies decrease in time as a consequence of cosmic expansion, but at a rate that can be different than that dictated by the usual metric framework. By extrapolating this behavior to photons' redshift, we find that the latter is in principle independent of the "scale factor". Interestingly, redshift-distance relations and other standard geometric observables are modified in this extended framework, in a way that could be experimentally tested. An extremely tight constraint on the model, however, is represented by the blackbody-ness of the cosmic microwave background. Finally, as a check, we also consider the effects of a non-metric connection in a different set-up, namely, that of a static, spherically symmetric spacetime.

  19. Esophagectomy - minimally invasive

    MedlinePlus

    Minimally invasive esophagectomy; Robotic esophagectomy; Removal of the esophagus - minimally invasive; Achalasia - esophagectomy; Barrett esophagus - esophagectomy; Esophageal cancer - esophagectomy - laparoscopic; Cancer of the ...

  20. Minimal covering problem and PLA minimization

    SciTech Connect

    Young, M.H.; Muroga, S.

    1985-12-01

    Solving the minimal covering problem by an implicit enumeration method is discussed. The implicit enumeration method in this paper is a modification of the Quine-McCluskey method tailored to computer processing and also its extension, utilizing some new properties of the minimal covering problem for speedup. A heuristic algorithm is also presented to solve large-scale problems. Its application to the minimization of programmable logic arrays (i.e., PLAs) is shown as an example. Computational experiences are presented to confirm the improvements by the implicit enumeration method discussed.
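
    The minimal covering problem itself is easy to state in code. The sketch below solves it by explicit enumeration in order of increasing size (the toy covering table is invented; implicit enumeration as in the paper prunes most of these candidates rather than testing them all):

      from itertools import combinations

      def minimal_cover(universe, subsets):
          """Smallest collection of subsets whose union covers the universe."""
          for k in range(1, len(subsets) + 1):
              for combo in combinations(subsets, k):
                  if set().union(*combo) >= universe:
                      return combo
          return None

      # Rows = prime implicants, columns = minterms, as in PLA minimization.
      universe = {1, 2, 3, 4, 5}
      subsets = [frozenset({1, 2}), frozenset({2, 3, 4}),
                 frozenset({4, 5}), frozenset({1, 3, 5})]
      print(minimal_cover(universe, subsets))   # ({2, 3, 4}, {1, 3, 5})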

  1. Methodological Gravitism

    ERIC Educational Resources Information Center

    Zaman, Muhammad

    2011-01-01

    In this paper the author presents the case of the exchange marriage system to delineate a model of methodological gravitism. Such a model is not a deviation from or alteration to the existing qualitative research approaches. I have adopted culturally specific methodology to investigate spouse selection in line with the Grounded Theory Method. This…

  2. Minimal change disease

    MedlinePlus

    ... seen under a very powerful microscope called an electron microscope. Minimal change disease is the most common ... biopsy and examination of the tissue with an electron microscope can show signs of minimal change disease.

  3. Better Hyper-minimization

    NASA Astrophysics Data System (ADS)

    Maletti, Andreas

    Hyper-minimization aims to compute a minimal deterministic finite automaton (dfa) that recognizes the same language as a given dfa up to a finite number of errors. Algorithms for hyper-minimization that run in time O(n log n), where n is the number of states of the given dfa, have been reported recently in [Gawrychowski and Jeż: Hyper-minimisation made efficient. Proc. MFCS, LNCS 5734, 2009] and [Holzer and Maletti: An n log n algorithm for hyper-minimizing a (minimized) deterministic automaton. Theor. Comput. Sci. 411, 2010]. These algorithms are improved to return a hyper-minimal dfa that commits the least number of errors. This closes another open problem of [Badr, Geffert, and Shipman: Hyper-minimizing minimized deterministic finite state automata. RAIRO Theor. Inf. Appl. 43, 2009]. Unfortunately, the time complexity of the obtained algorithm increases to O(n²).

  4. Increasingly minimal bias routing

    DOEpatents

    Bataineh, Abdulla; Court, Thomas; Roweth, Duncan

    2017-02-21

    A system and algorithm configured to generate diversity at the traffic source so that packets are uniformly distributed over all of the available paths, but to increase the likelihood of taking a minimal path with each hop the packet takes. This is achieved by configuring routing biases so as to prefer non-minimal paths at the injection point, but increasingly prefer minimal paths as the packet proceeds, referred to herein as Increasing Minimal Bias (IMB).
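
    A toy sketch of the idea (the linear bias schedule and all names are invented for illustration; the patent configures real per-hop routing biases in the network hardware):

      import random

      def choose_path(hops_taken, max_hops, minimal_paths, nonminimal_paths):
          """Prefer non-minimal paths at injection, but make a minimal path
          increasingly likely with each hop the packet has already taken."""
          p_minimal = min(1.0, 0.25 + 0.75 * hops_taken / max_hops)
          pool = minimal_paths if random.random() < p_minimal else nonminimal_paths
          return random.choice(pool)

      # At injection (hop 0) non-minimal paths win 3:1; by the last hop the
      # packet takes a minimal path with certainty.
      print(choose_path(0, 8, ["minimal-A"], ["detour-B", "detour-C"]))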

  5. Regional Shelter Analysis Methodology

    SciTech Connect

    Dillon, Michael B.; Dennison, Deborah; Kane, Jave; Walker, Hoyt; Miller, Paul

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.

  6. Probabilistic inspection strategies for minimizing service failures

    NASA Astrophysics Data System (ADS)

    Brot, Abraham

    1994-09-01

    The INSIM computer program, which simulates the 'limited fatigue life' environment in which aircraft structures generally operate, is described. The use of INSIM to develop inspection strategies which aim to minimize service failures is demonstrated. Damage-tolerance methodology, inspection thresholds, and customized inspections are simulated using the probability of failure as the driving parameter.

  7. Minimizing Classroom Interruptions.

    ERIC Educational Resources Information Center

    Partin, Ronald L.

    1987-01-01

    Offers suggestions for minimizing classroom interruptions, such as suggesting to the principal that announcements not be read over the intercom during class time and arranging desks and chairs so as to minimize visual distractions. Contains a school interruption survey form. (JC)

  8. Minimal Orderings Revisited

    SciTech Connect

    Peyton, B.W.

    1999-07-01

    When minimum orderings proved too difficult to deal with, Rose, Tarjan, and Lueker instead studied minimal orderings and how to compute them (Algorithmic aspects of vertex elimination on graphs, SIAM J. Comput., 5:266-283, 1976). This paper introduces an algorithm that is capable of computing much better minimal orderings much more efficiently than the algorithm in Rose et al. The new insight is a way to use certain structures and concepts from modern sparse Cholesky solvers to re-express one of the basic results in Rose et al. The new algorithm begins with any initial ordering and then refines it until a minimal ordering is obtained. It is simple to obtain high-quality low-cost minimal orderings by using fill-reducing heuristic orderings as initial orderings for the algorithm. We examine several such initial orderings in some detail.

  9. Minimally invasive stomas.

    PubMed

    Hellinger, Michael D; Al Haddad, Abdullah

    2008-02-01

    Traditionally, stoma creation and end stoma reversal have been performed via a laparotomy incision. However, in many situations, stoma construction may be safely performed in a minimally invasive manner. This may include a trephine, laparoscopic, or combined approach. Furthermore, Hartmann's colostomy reversal, a procedure traditionally associated with substantial morbidity, may also be performed laparoscopically. The authors briefly review patient selection, preparation, and indications, and focus primarily on surgical techniques and results of minimally invasive stoma creation and Hartmann's reversal.

  10. Minimally invasive lumbar foraminotomy.

    PubMed

    Deutsch, Harel

    2013-07-01

    Lumbar radiculopathy is a common problem. Nerve root compression can occur at different places along a nerve root's course, including in the foramina. Minimally invasive approaches allow easier exposure of the lateral foramina and decompression of the nerve root in the foramina. This video demonstrates a minimally invasive approach to decompress the lumbar nerve root in the foramina with a lateral to medial decompression. The video can be found here: http://youtu.be/jqa61HSpzIA.

  11. Methodology for assessing systems materials requirements

    SciTech Connect

    Culver, D.H.; Teeter, R.R.; Jamieson, W.M.

    1980-01-01

    A potential stumbling block to new system planning and design is imprecise, confusing, or contradictory data regarding materials - their availability and costs. A methodology is now available that removes this barrier by minimizing uncertainties regarding materials availability. Using this methodology, a planner can assess materials requirements more quickly, at lower cost, and with much greater confidence in the results. Developed specifically for energy systems, its potential application is much broader. This methodology and examples of its use are discussed.

  12. Minimally invasive procedures

    PubMed Central

    Baltayiannis, Nikolaos; Michail, Chandrinos; Lazaridis, George; Anagnostopoulos, Dimitrios; Baka, Sofia; Mpoukovinas, Ioannis; Karavasilis, Vasilis; Lampaki, Sofia; Papaiwannou, Antonis; Karavergou, Anastasia; Kioumis, Ioannis; Pitsiou, Georgia; Katsikogiannis, Nikolaos; Tsakiridis, Kosmas; Rapti, Aggeliki; Trakada, Georgia; Zissimopoulos, Athanasios; Zarogoulidis, Konstantinos

    2015-01-01

    Minimally invasive procedures, which include laparoscopic surgery, use state-of-the-art technology to reduce the damage to human tissue when performing surgery. Minimally invasive procedures require small “ports” from which the surgeon inserts thin tubes called trocars. Carbon dioxide gas may be used to inflate the area, creating a space between the internal organs and the skin. Then a miniature camera (usually a laparoscope or endoscope) is placed through one of the trocars so the surgical team can view the procedure as a magnified image on video monitors in the operating room. Specialized equipment is inserted through the trocars based on the type of surgery. There are some advanced minimally invasive surgical procedures that can be performed almost exclusively through a single point of entry—meaning only one small incision, like the “uniport” video-assisted thoracoscopic surgery (VATS). Not only do these procedures usually provide equivalent outcomes to traditional “open” surgery (which sometimes requires a large incision), but minimally invasive procedures (using small incisions) may offer significant benefits as well: (I) faster recovery; (II) the patient remains hospitalized for fewer days; (III) less scarring and (IV) less pain. In our current mini review we will present the minimally invasive procedures for thoracic surgery. PMID:25861610

  13. Approximate fault-tree analysis without cut sets

    NASA Astrophysics Data System (ADS)

    Schneeweiss, Winfrid G.

    It is shown that a rather efficient approximate fault tree analysis is possible on the basis of the Shannon decomposition. The main advantages are: (1) no preprocessing is necessary to determine all the mincuts; (2) the maximum error can be prespecified; and (3) noncoherent systems and systems with dependent component states can be treated. The main disadvantage is the fact that the cutting off of certain subtrees of the decomposition tree (for upper bound results) may need some trial and error test calculations.
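
    The decomposition referred to is the Shannon expansion f = x·f|x=1 ∨ x̄·f|x=0, which turns directly into a probability recursion with no cut sets. A minimal sketch of the exact version (without the paper's error-bounded truncation of subtrees; the example tree is invented):

      def top_event_prob(f, probs, variables):
          """P(f) by Shannon decomposition:
          P(f) = p_x * P(f | x=1) + (1 - p_x) * P(f | x=0)."""
          def rec(assignment, remaining):
              if not remaining:
                  return 1.0 if f(assignment) else 0.0
              x, rest = remaining[0], remaining[1:]
              return (probs[x] * rec({**assignment, x: True}, rest)
                      + (1 - probs[x]) * rec({**assignment, x: False}, rest))
          return rec({}, list(variables))

      # TOP = A AND (B OR C), each component failing with probability 0.1
      f = lambda v: v["A"] and (v["B"] or v["C"])
      print(top_event_prob(f, {"A": 0.1, "B": 0.1, "C": 0.1}, "ABC"))   # 0.019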

  14. Minimally invasive valve surgery.

    PubMed

    Woo, Y Joseph

    2009-08-01

    Traditional cardiac valve replacement surgery is being rapidly supplanted by innovative, minimally invasive approaches toward the repair of these valves. Patients are experiencing benefits ranging from less bleeding and pain to faster recovery and greater satisfaction. These operations are proving to be safe, highly effective, and durable, and their use will likely continue to increase and become even more widely applicable.

  15. CONMIN- CONSTRAINED FUNCTION MINIMIZATION

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.

    1994-01-01

    In many mathematical problems, it is necessary to determine the minimum and maximum of a function of several variables, limited by various linear and nonlinear inequality constraints. It is seldom possible, in practical applications, to solve these problems directly. In most cases, an iterative method must be used to numerically obtain a solution. The CONMIN program was developed to numerically perform the minimization of a multi-variable function subject to a set of inequality constraints. The function need not be a simple analytical equation; it may be any function which can be numerically evaluated. The basic analytic technique used by CONMIN is to minimize the function until one or more of the constraints become active. The minimization process then continues by following the constraint boundaries in a direction such that the value of the function continues to decrease. When a point is reached where no further decrease in the function can be obtained, the process is terminated. Function maximization may be achieved by minimizing the negative of the function. This program is written in FORTRAN IV for batch execution and has been implemented on a CDC 6000 series computer with a central memory requirement of approximately 43K (octal) of 60 bit words. The CONMIN program was originally developed in 1973 and last updated in 1978.
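
    The same class of problem in modern terms (an illustrative analogue using SciPy's SLSQP solver, not CONMIN's FORTRAN interface; the objective and constraints are invented):

      from scipy.optimize import minimize

      # Minimize f(x, y) = (x - 2)^2 + (y - 1)^2 subject to x + y <= 2 and x >= 0.
      # As in CONMIN's description: descend until a constraint becomes active,
      # then follow the constraint boundary while the function still decreases.
      objective = lambda v: (v[0] - 2.0) ** 2 + (v[1] - 1.0) ** 2
      constraints = [{"type": "ineq", "fun": lambda v: 2.0 - v[0] - v[1]},  # x + y <= 2
                     {"type": "ineq", "fun": lambda v: v[0]}]               # x >= 0
      result = minimize(objective, [0.0, 0.0], method="SLSQP", constraints=constraints)
      print(result.x)   # approx. [1.5, 0.5] -- the minimum on the boundary x + y = 2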

  16. Periodic minimal surfaces

    NASA Astrophysics Data System (ADS)

    Mackay, Alan L.

    1985-04-01

    A minimal surface is one for which, like a soap film with the same pressure on each side, the mean curvature is zero and, thus, is one where the two principal curvatures are equal and opposite at every point. For every closed circuit in the surface, the area is a minimum. Schwarz [1] and Neovius [2] showed that elements of such surfaces could be put together to give surfaces periodic in three dimensions. These periodic minimal surfaces are geometrical invariants, as are the regular polyhedra, but the former are curved. Minimal surfaces are appropriate for the description of various structures where internal surfaces are prominent and seek to adopt a minimum area or a zero mean curvature subject to their topology; thus they merit more complete numerical characterization. There seem to be at least 18 such surfaces [3], with various symmetries and topologies, related to the crystallographic space groups. Recently, glyceryl mono-oleate (GMO) was shown by Longley and McIntosh [4] to take the shape of the F-surface. The structure postulated is shown here to be in good agreement with an analysis of the fundamental geometry of periodic minimal surfaces.
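
    In symbols (standard differential geometry, stated for context): with principal curvatures κ₁ and κ₂, the mean curvature is H = (κ₁ + κ₂)/2, and a minimal surface satisfies H = 0, i.e. κ₁ = −κ₂ at every point; for a soap film this follows from the Young-Laplace relation Δp = 2γH with equal pressure (Δp = 0) on the two sides.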

  17. Testing methodologies

    SciTech Connect

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical "signal to noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational, or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.

  18. Discrete Minimal Surface Algebras

    NASA Astrophysics Data System (ADS)

    Arnlind, Joakim; Hoppe, Jens

    2010-05-01

    We consider discrete minimal surface algebras (DMSA) as generalized noncommutative analogues of minimal surfaces in higher dimensional spheres. These algebras appear naturally in membrane theory, where sequences of their representations are used as a regularization. After showing that the defining relations of the algebra are consistent, and that one can compute a basis of the enveloping algebra, we give several explicit examples of DMSAs in terms of subsets of sl(n) (any semi-simple Lie algebra providing a trivial example by itself). A special class of DMSAs are Yang-Mills algebras. The representation graph is introduced to study representations of DMSAs of dimension d ≤ 4, and properties of representations are related to properties of graphs. The representation graph of a tensor product is (generically) the Cartesian product of the corresponding graphs. We provide explicit examples of irreducible representations and, for coinciding eigenvalues, classify all the unitary representations of the corresponding algebras.

  19. Minimal hepatic encephalopathy.

    PubMed

    Zamora Nava, Luis Eduardo; Torre Delgadillo, Aldo

    2011-06-01

    The term minimal hepatic encephalopathy (MHE) refers to the subtle changes in cognitive function, electrophysiological parameters, cerebral neurochemical/neurotransmitter homeostasis, cerebral blood flow, metabolism, and fluid homeostasis that can be observed in patients with cirrhosis who have no clinical evidence of hepatic encephalopathy; the prevalence is as high as 84% in patients with hepatic cirrhosis. Physicians generally do not perceive these complications of cirrhosis, and the diagnosis can be made only with neuropsychological tests and other special measurements such as evoked potentials and imaging studies like positron emission tomography. Diagnosis of minimal hepatic encephalopathy may have prognostic and therapeutic implications in cirrhotic patients. The present review aims to explore the clinical, therapeutic, diagnostic and prognostic aspects of this complication.

  20. [Minimally invasive thymus surgery].

    PubMed

    Rückert, J C; Ismail, M; Swierzy, M; Braumann, C; Badakhshi, H; Rogalla, P; Meisel, A; Rückert, R I; Müller, J M

    2008-01-01

    There are absolute and relative indications for complete removal of the thymus gland. In the complex therapy of autoimmune-related myasthenia gravis, thymectomy plays a central role and is performed with relative indication. In cases of thymoma with or without myasthenia, thymectomy is absolutely indicated. Thymus resection is also necessary in cases of hyperparathyroidism with ectopic intrathymic parathyroids or with certain forms of multiple endocrine neoplasia. The transcervical operation technique traditionally reflected the well-founded desire for minimal invasiveness in thymectomy. Due to the requirement of radicality, however, most of these operations were performed using sternotomy. With the evolution of therapeutic thoracoscopy in thoracic surgery, several pure or extended minimally invasive operation techniques for thymectomy have been developed. At present uni- or bilateral, subxiphoid, and modified transcervical single or combination thoracoscopic techniques are in use. Recently a very precise new level of thoracoscopic operation technique was developed using robotic-assisted surgery. There are special advantages of this technique for thymectomy. An overview of the development of and experience with minimally invasive thymectomy is presented, including data from the largest series published so far.

  1. Waste Minimization Crosscut Plan

    SciTech Connect

    Not Available

    1992-05-13

    On November 27, 1991, the Secretary of Energy directed that a Department of Energy (DOE) crosscut plan for waste minimization (WMin) be prepared and submitted by March 1, 1992. This Waste Minimization Crosscut Plan responds to the Secretary's direction and supports the National Energy Strategy (NES) goals of achieving greater energy security, increasing energy and economic efficiency, and enhancing environmental quality. It provides a DOE-wide planning framework for effective coordination of all DOE WMin activities. This Plan was jointly prepared by the following Program Secretarial Officer (PSO) organizations: Civilian Radioactive Waste Management (RW); Conservation and Renewable Energy (CE); Defense Programs (DP); Environmental Restoration and Waste Management (EM), lead; Energy Research (ER); Fossil Energy (FE); Nuclear Energy (NE); and New Production Reactors (NP). Assistance and guidance was provided by the offices of Policy, Planning, and Analysis (PE) and Environment, Safety and Health (EH). Comprehensive application of waste minimization within the Department and in both the public and private sectors will provide significant benefits and support National Energy Strategy goals. These benefits include conservation of a substantial proportion of the energy now used by industry and Government, improved environmental quality, reduced health risks, improved production efficiencies, and longer useful life of disposal capacity. Taken together, these benefits will mean improved US global competitiveness, expanded job opportunities, and a better quality of life for all citizens.

  2. Minimally invasive mediastinal surgery

    PubMed Central

    Melfi, Franca M. A.; Mussi, Alfredo

    2016-01-01

    In the past, mediastinal surgery was associated with the necessity of a maximum exposure, which was accomplished through various approaches. In the early 1990s, many surgical fields, including thoracic surgery, observed the development of minimally invasive techniques. These included video-assisted thoracic surgery (VATS), which confers clear advantages over an open approach, such as less trauma, short hospital stay, increased cosmetic results and preservation of lung function. However, VATS is associated with several disadvantages. For this reason, it is not routinely performed for resection of mediastinal mass lesions, especially those located in the anterior mediastinum, a tiny and remote space that contains vital structures at risk of injury. Robotic systems can overcome the limits of VATS, offering three-dimensional (3D) vision and wristed instrumentation, and are being increasingly used. With regard to thymectomy for myasthenia gravis (MG), unilateral and bilateral VATS approaches have demonstrated good long-term neurologic results with low complication rates. Nevertheless, some authors still advocate the necessity of maximum exposure, especially when considering the distribution of normal and ectopic thymic tissue. In recent studies, the robotic approach has been shown to provide similar neurological outcomes when compared to transsternal and VATS approaches, and is associated with low morbidity. Importantly, through a unilateral robotic technique, it is possible to dissect and remove at least the same amount of mediastinal fat tissue. Preliminary results on early-stage thymomatous disease indicated that minimally invasive approaches are safe and feasible, with a low rate of pleural recurrence, underlining the necessity of a “no-touch” technique. However, especially for thymomatous disease characterized by an indolent nature, further studies with a long follow-up period are necessary in order to assess oncologic and neurologic results through minimally

  3. The ZOOM minimization package

    SciTech Connect

    Fischler, Mark S.; Sachs, D.

    2004-11-01

    A new object-oriented Minimization package is available for distribution in the same manner as CLHEP. This package, designed for use in HEP applications, has all the capabilities of Minuit, but is a re-write from scratch, adhering to modern C++ design principles. A primary goal of this package is extensibility in several directions, so that its capabilities can be kept fresh with as little maintenance effort as possible. This package is distinguished by the priority that was assigned to C++ design issues, and the focus on producing an extensible system that will resist becoming obsolete.

  4. Wake Vortex Minimization

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A status report is presented on research directed at reducing the vortex disturbances of aircraft wakes. The objective of such a reduction is to minimize the hazard to smaller aircraft that might encounter these wakes. Inviscid modeling was used to study trailing vortices, and viscous effects were investigated. Laser velocimeters were utilized in the measurement of aircraft wakes. Flight and wind tunnel tests were performed on scale-model and full-scale aircraft of various designs. Parameters investigated included the effect of wing span, wing flaps, spoilers, splines and engine thrust on vortex attenuation. Results indicate that vortices may be alleviated through aerodynamic means.

  5. Minimally refined biomass fuel

    DOEpatents

    Pearson, Richard K.; Hirschfeld, Tomas B.

    1984-01-01

    A minimally refined fluid composition, suitable as a fuel mixture and derived from biomass material, is comprised of one or more water-soluble carbohydrates such as sucrose, one or more alcohols having less than four carbons, and water. The carbohydrate provides the fuel source; water solubilizes the carbohydrates; and the alcohol aids in the combustion of the carbohydrate and reduces the viscosity of the carbohydrate/water solution. Because less energy is required to obtain the carbohydrate from the raw biomass than to obtain alcohol, an overall energy savings is realized compared to fuels employing alcohol as the primary fuel.

  6. Logarithmic superconformal minimal models

    NASA Astrophysics Data System (ADS)

    Pearce, Paul A.; Rasmussen, Jørgen; Tartaglia, Elena

    2014-05-01

    The higher fusion level logarithmic minimal models $\mathcal{LM}(P,P';n)$ have recently been constructed as the diagonal GKO cosets $(A_1^{(1)})_k \oplus (A_1^{(1)})_n / (A_1^{(1)})_{k+n}$, where n ≥ 1 is an integer fusion level and k = nP/(P'-P) - 2 is a fractional level. For n = 1, these are the well-studied logarithmic minimal models $\mathcal{LM}(P,P') \equiv \mathcal{LM}(P,P';1)$. For n ≥ 2, we argue that these critical theories are realized on the lattice by n × n fusion of the n = 1 models. We study the critical fused lattice models $\mathcal{LM}(p,p')_{n\times n}$ within a lattice approach and focus our study on the n = 2 models. We call these logarithmic superconformal minimal models $\mathcal{LSM}(p,p') \equiv \mathcal{LM}(P,P';2)$, where P = |2p - p'|, P' = p' and p, p' are coprime. These models share the central charges $c = c^{P,P';2} = \frac{3}{2}\big(1 - 2(P'-P)^2/(PP')\big)$ of the rational superconformal minimal models $\mathcal{SM}(P,P')$. Lattice realizations of these theories are constructed by fusing 2 × 2 blocks of the elementary face operators of the n = 1 logarithmic minimal models $\mathcal{LM}(p,p')$. Algebraically, this entails the fused planar Temperley-Lieb algebra, which is a spin-1 Birman-Murakami-Wenzl tangle algebra with loop fugacity $\beta_2 = [x]_3 = x^2 + 1 + x^{-2}$ and twist $\omega = x^4$, where $x = e^{i\lambda}$ and $\lambda = (p'-p)\pi/p'$. The first two members of this n = 2 series are superconformal dense polymers $\mathcal{LSM}(2,3)$ with $c = -\frac{5}{2}$, $\beta_2 = 0$, and superconformal percolation $\mathcal{LSM}(3,4)$ with $c = 0$, $\beta_2 = 1$. We calculate the bulk and boundary free energies analytically. By numerically studying finite-size conformal spectra on the strip with appropriate boundary conditions, we argue that, in the continuum scaling limit, these lattice models are associated with the logarithmic superconformal models $\mathcal{LM}(P,P';2)$. For system size N, we propose finitized Kac character formulae of the form $q^{-c^{P,P';2}/24+\Delta^{P,P';2}_{r

  7. Minimally invasive valve surgery.

    PubMed

    Woo, Y Joseph; Seeburger, Joerg; Mohr, Friedrich W

    2007-01-01

    As alternatives to standard sternotomy, surgeons have developed innovative, minimally invasive approaches to conducting valve surgery. Through very small skin incisions and partial upper sternal division for aortic valve surgery and right minithoracotomy for mitral surgery, surgeons have become adept at performing complex valve procedures. Beyond cosmetic appeal, apparent benefits range from decreased pain and bleeding to improved respiratory function and recovery time. The large retrospective studies and few small prospective randomized studies are herein briefly summarized. The focus is then directed toward describing specific intraoperative technical details in current clinical use, covering anesthetic preparation, incision, mediastinal access, cardiovascular cannulation, valve exposure, and valve reconstruction. Finally, unique situations such as pulmonic valve surgery, reoperations, beating heart surgery, and robotics are discussed.

  9. Transanal Minimally Invasive Surgery

    PubMed Central

    deBeche-Adams, Teresa; Nassif, George

    2015-01-01

    Transanal minimally invasive surgery (TAMIS) was first described in 2010 as a crossover between single-incision laparoscopic surgery and transanal endoscopic microsurgery (TEM) to allow access to the proximal and mid-rectum for resection of benign and early-stage malignant rectal lesions. The TAMIS technique can also be used for noncurative intent surgery of more advanced lesions in patients who are not candidates for radical surgery. Proper workup and staging should be done before surgical decision-making. In addition to the TAMIS port, instrumentation and setup include readily available equipment found in most operating suites. TAMIS has proven its usefulness in a wide range of applications outside of local excision, including repair of rectourethral fistula, removal of rectal foreign body, control of rectal hemorrhage, and as an adjunct in total mesorectal excision for rectal cancer. TAMIS is an easily accessible, technically feasible, and cost-effective alternative to TEM. PMID:26491410

  10. Minimally invasive esophagectomy

    PubMed Central

    Herbella, Fernando A; Patti, Marco G

    2010-01-01

    Esophageal resection is associated with a high morbidity and mortality rate. Minimally invasive esophagectomy (MIE) might theoretically decrease this rate. We reviewed the current literature on MIE, with a focus on the available techniques, outcomes and comparison with open surgery. This review shows that the available literature on MIE is still crowded with heterogeneous studies with different techniques. There are no controlled and randomized trials, and the few retrospective comparative cohort studies are limited by small numbers of patients and biased by historical controls of open surgery. Based on the available literature, there is no evidence that MIE brings clear benefits compared to conventional esophagectomy. Increasing experience and the report of larger series might change this scenario. PMID:20698044

  11. Minimally invasive parathyroid surgery

    PubMed Central

    Noureldine, Salem I.; Gooi, Zhen

    2015-01-01

    Traditionally, bilateral cervical exploration for localization of all four parathyroid glands and removal of any that are grossly enlarged has been the standard surgical treatment for primary hyperparathyroidism (PHPT). With the advances in preoperative localization studies and greater public demand for less invasive procedures, novel targeted, minimally invasive techniques to the parathyroid glands have been described and practiced over the past 2 decades. Minimally invasive parathyroidectomy (MIP) can be done either through the standard Kocher incision, a smaller midline incision, with video assistance (purely endoscopic and video-assisted techniques), or through an ectopically placed, extracervical incision. In current practice, once PHPT is diagnosed, preoperative evaluation using high-resolution radiographic imaging to localize the offending parathyroid gland is essential if MIP is to be considered. The imaging study results suggest where the surgeon should begin the focused procedure and serve as a road map to allow tailoring of an efficient, imaging-guided dissection while eliminating the unnecessary dissection of multiple glands or a bilateral exploration. Intraoperative parathyroid hormone (IOPTH) levels may be measured during the procedure, or a gamma probe used during radioguided parathyroidectomy, to ascertain that the correct gland has been excised and that no other hyperfunctional tissue is present. MIP has many advantages over the traditional bilateral, four-gland exploration. MIP can be performed using local anesthesia, requires less operative time, results in fewer complications, and offers an improved cosmetic result and greater patient satisfaction. Additional advantages of MIP are earlier hospital discharge and decreased overall associated costs. This article aims to address the considerations for accomplishing MIP, including the role of preoperative imaging studies, intraoperative adjuncts, and surgical techniques. PMID:26425454

  12. A perturbation technique for shield weight minimization

    SciTech Connect

    Watkins, E.F.; Greenspan, E.

    1993-01-01

    The radiation shield optimization code SWAN (Ref. 1) was originally developed for minimizing the thickness of a shield that will meet a given dose (or other) constraint, or for extremizing a performance parameter of interest (e.g., maximizing energy multiplication or minimizing dose) while maintaining a shield volume constraint. The SWAN optimization process proved to be highly effective (e.g., see Refs. 2, 3, and 4). The purpose of this work is to investigate the applicability of the SWAN methodology to problems in which the weight, rather than the volume, is the relevant shield characteristic. Such problems are encountered in shield design for space nuclear power systems. The investigation is carried out using SWAN with the coupled neutron-photon cross-section library FLUNG (Ref. 5).
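
    To see why the weight and volume objectives pull in different directions, consider a one-dimensional slab with exponential dose attenuation. This is only a cartoon of the optimization problem SWAN addresses, not the SWAN algorithm; the material data below are hypothetical:

      import math

      # Hypothetical material data: attenuation coefficient mu (1/cm), density rho (g/cm^3).
      materials = {"tungsten": (0.9, 19.3), "polyethylene": (0.25, 0.95)}
      D0, D_limit = 1.0e4, 1.0           # incident and allowed dose (arbitrary units)
      needed = math.log(D0 / D_limit)    # required attenuation: mu * t = ln(D0 / D_limit)

      for name, (mu, rho) in materials.items():
          t = needed / mu                # thickness that just meets the dose constraint
          print(f"{name:12s}: t = {t:6.2f} cm, areal mass = {rho * t:7.2f} g/cm^2")

      # Minimizing thickness (volume) favors the largest mu; minimizing weight favors
      # the largest mu/rho, so the volume-optimal shield need not be weight-optimal.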

  13. Minimal Marking: A Success Story

    ERIC Educational Resources Information Center

    McNeilly, Anne

    2014-01-01

    The minimal-marking project conducted in Ryerson's School of Journalism throughout 2012 and early 2013 resulted in significantly higher grammar scores in two first-year classes of minimally marked university students when compared to two traditionally marked classes. The "minimal-marking" concept (Haswell, 1983), which requires…

  14. Discrete minimal flavor violation

    SciTech Connect

    Zwicky, Roman; Fischbacher, Thomas

    2009-10-01

    We investigate the consequences of replacing the global flavor symmetry of minimal flavor violation (MFV), SU(3)_Q × SU(3)_U × SU(3)_D × ⋯, by a discrete D_Q × D_U × D_D × ⋯ symmetry. Goldstone bosons resulting from the breaking of the flavor symmetry generically lead to bounds on new flavor structure many orders of magnitude above the TeV scale. The absence of Goldstone bosons for discrete symmetries constitutes the primary motivation of our work. Less symmetry implies further invariants, renders the mass-flavor basis transformation observable in principle, and calls for a hierarchy in the Yukawa matrix expansion. We show, through the dimension of the representations, that the (discrete) symmetry does in principle allow for additional ΔF = 2 operators. If, though, the ΔF = 2 transitions are generated by two subsequent ΔF = 1 processes, as, for example, in the standard model, then the four crystal-like groups Σ(168) ≅ PSL(2,F_7), Σ(72φ), Σ(216φ), and especially Σ(360φ) do provide enough protection for a TeV-scale discrete MFV scenario. Models where this is not the case have to be investigated case by case. Interestingly, Σ(216φ) has a (nonfaithful) representation corresponding to an A_4 symmetry. Moreover, we argue that the apparently often omitted (D) groups are subgroups of an appropriate Δ(6g²). We would like to stress that we do not provide an actual model that realizes the MFV scenario, nor any other theory of flavor.

  15. USGS Methodology for Assessing Continuous Petroleum Resources

    USGS Publications Warehouse

    Charpentier, Ronald R.; Cook, Troy A.

    2011-01-01

    The U.S. Geological Survey (USGS) has developed a new quantitative methodology for assessing resources in continuous (unconventional) petroleum deposits. Continuous petroleum resources include shale gas, coalbed gas, and other oil and gas deposits in low-permeability ("tight") reservoirs. The methodology is based on an approach combining geologic understanding with well productivities. The methodology is probabilistic, with both input and output variables as probability distributions, and uses Monte Carlo simulation to calculate the estimates. The new methodology is an improvement of previous USGS methodologies in that it better accommodates the uncertainties in undrilled or minimally drilled deposits that must be assessed using analogs. The publication is a collection of PowerPoint slides with accompanying comments.
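
    As a rough illustration of this style of assessment (the distributions and numbers below are hypothetical, not USGS calibrations), a Monte Carlo simulation can combine input distributions for productive area, well spacing, success ratio, and estimated ultimate recovery (EUR) per well into an output distribution of total resource:

      import numpy as np

      rng = np.random.default_rng(42)
      N = 100_000  # Monte Carlo trials

      # Hypothetical inputs as probability distributions:
      area = rng.triangular(2_000, 5_000, 9_000, N)             # productive area, acres
      spacing = rng.triangular(80, 120, 160, N)                 # acres per well
      eur = rng.lognormal(mean=np.log(0.4), sigma=0.6, size=N)  # EUR per well, BCF
      success = rng.uniform(0.7, 0.95, N)                       # fraction of productive wells

      resource = area / spacing * success * eur                 # total undiscovered gas, BCF

      # Report the conventional fractiles: F95 (95% chance of at least this much), etc.
      for frac, label in [(95, "F95"), (50, "F50"), (5, "F5")]:
          print(f"{label}: {np.percentile(resource, 100 - frac):8.0f} BCF")
      print(f"mean: {resource.mean():8.0f} BCF")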

  16. Minimizing Launch Mass for ISRU Processes

    NASA Technical Reports Server (NTRS)

    England, C.; Hallinan, K. P.

    2004-01-01

    The University of Dayton and the Jet Propulsion Laboratory are developing a methodology for estimating the Earth launch mass (ELM) of processes for In-Situ Resource Utilization (ISRU), with a focus on lunar resource recovery. ISRU may be enabling for an extended presence on the Moon, for large sample-return missions, and for a human presence on Mars. To accomplish these exploration goals, the resources recovered by ISRU must offset the ELM for the recovery process. An appropriate figure of merit is the cost of the exploration mission, which is closely related to ELM. For a given production rate and resource concentration, the lowest ELM - and the best ISRU process - is achieved by minimizing capital equipment for both the ISRU process and energy production. ISRU processes incur Carnot limitations and second law losses (irreversibilities) that ultimately determine production rate, material utilization, and energy efficiencies. Heat transfer, chemical reaction, and mechanical operations affect the ELM in ways that are best understood by examining the process's detailed energetics. Schemes for chemical and thermal processing that do not incorporate an understanding of second law losses will be incompletely understood. Our team is developing a methodology that will aid the design and selection of ISRU processes by identifying the impact of thermodynamic losses on ELM. The methodology includes mechanical, thermal, and chemical operations and, when completed, will provide a procedure and rationale for optimizing their design and minimizing their cost. The technique for optimizing ISRU with respect to ELM draws from the work of England and Funk, which relates the cost of endothermic processes to their second law efficiencies. Our team joins their approach to recovering resources by chemical processing with analysis of thermal and mechanical operations in space. Commercial firms provide cost inputs for ELM and planetary landing. Additional information is included in the
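
    One simple way to make the link between process energetics and launch mass explicit is the following accounting; the symbols are illustrative assumptions, not notation taken from the study:

      \mathrm{ELM} \approx m_{\mathrm{process}} + \frac{P}{\alpha},
      \qquad
      P = \frac{\dot{m}\,\Delta G}{\eta_{II}},

    where m_process is the process hardware mass, α the power system's specific power (W/kg), ṁ the production rate, ΔG the minimum (reversible) work per unit product, and η_II the second-law efficiency. Irreversibilities lower η_II, raise the required power P, and thus raise the ELM directly.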

  17. Payload training methodology study

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The results of the Payload Training Methodology Study (PTMS) are documented. Methods and procedures are defined for the development of payload training programs to be conducted at the Marshall Space Flight Center Payload Training Complex (PTC) for the Space Station Freedom program. The study outlines the overall training program concept as well as the six methodologies associated with the program implementation. The program concept outlines the entire payload training program, from initial identification of training requirements to the development of detailed design specifications for simulators and instructional material. The following six methodologies are defined: (1) The Training and Simulation Needs Assessment Methodology; (2) The Simulation Approach Methodology; (3) The Simulation Definition Analysis Methodology; (4) The Simulator Requirements Standardization Methodology; (5) The Simulator Development Verification Methodology; and (6) The Simulator Validation Methodology.

  18. Guidelines for mixed waste minimization

    SciTech Connect

    Owens, C.

    1992-02-01

    Currently, there is no commercial mixed waste disposal available in the United States. Storage and treatment options for commercial mixed waste are limited. Host state and compact region officials are encouraging their mixed waste generators to minimize their mixed wastes because of these management limitations. This document provides a guide to mixed waste minimization.

  19. Influenza SIRS with Minimal Pneumonitis.

    PubMed

    Erramilli, Shruti; Mannam, Praveen; Manthous, Constantine A

    2016-01-01

    Although systemic inflammatory response syndrome (SIRS) is a known complication of severe influenza pneumonia, it has been reported very rarely in patients with minimal parenchymal lung disease. We here report a case of severe SIRS, anasarca, and marked vascular phenomena with minimal or no pneumonitis. This case highlights that viruses, including influenza, may cause vascular dysregulation causing SIRS, even without substantial visceral organ involvement.

  20. Waste minimization handbook, Volume 1

    SciTech Connect

    Boing, L.E.; Coffey, M.J.

    1995-12-01

    This technical guide presents various methods used by industry to minimize low-level radioactive waste (LLW) generated during decommissioning and decontamination (D and D) activities. Such activities generate significant amounts of LLW during their operations. Waste minimization refers to any measure, procedure, or technique that reduces the amount of waste generated during a specific operation or project. Preventive waste minimization techniques implemented when a project is initiated can significantly reduce waste. Techniques implemented during decontamination activities reduce the cost of decommissioning. The application of waste minimization techniques is not limited to D and D activities; it is also useful during any phase of a facility's life cycle. This compendium will be supplemented with a second volume of abstracts of hundreds of papers related to minimizing low-level nuclear waste. This second volume is expected to be released in late 1996.

  1. Microbiological Methodology in Astrobiology

    NASA Technical Reports Server (NTRS)

    Abyzov, S. S.; Gerasimenko, L. M.; Hoover, R. B.; Mitskevich, I. N.; Mulyukin, A. L.; Poglazova, M. N.; Rozanov, A. Y.

    2005-01-01

    Searching for life in astromaterials to be delivered from future missions to extraterrestrial bodies is undoubtedly related to studies of the properties and signatures of living microbial cells and microfossils on Earth. Antarctic glacier and Earth permafrost habitats, where living microbial cells have preserved viability for millennia by entering the anabiotic state, are often regarded as model terrestrial analogs of Martian polar subsurface layers. For future findings of viable microorganisms in samples from extraterrestrial objects, it is important to use a combined methodology that includes classical microbiological methods, plating onto nutrient media, direct epifluorescence and electron microscopy examinations, detection of the elemental composition of cells, radiolabeling techniques, and PCR and FISH methods. It is of great importance to ensure the authenticity of microorganisms (if any are found in studied samples) and to standardize the protocols used, to minimize the risk of external contamination. Although convincing evidence of extraterrestrial microbial life will most likely come from the discovery of living cells in astromaterials, biomorphs and microfossils must also be regarded as targets in the search for evidence of life, bearing in mind the scenario that living microorganisms were not preserved and underwent mineralization. Under laboratory conditions, the processes that accompany fossilization of cyanobacteria were reconstructed, and artificially produced cyanobacterial stromatolites resemble in their morphological properties those found in natural Earth habitats. Given the vital importance of distinguishing between biogenic and abiogenic signatures and between living and fossil microorganisms in analyzed samples, it is worthwhile to use some previously developed approaches based on electron microscopy examinations and analysis of the elemental composition of biomorphs in situ and comparison with the analogous data obtained for laboratory microbial cultures and

  2. The development and characterization of synthetic minimal yeast promoters

    PubMed Central

    Redden, Heidi; Alper, Hal S.

    2015-01-01

    Synthetic promoters, especially minimally sized ones, are critical for advancing fungal synthetic biology. Fungal promoters often span hundreds of base pairs, nearly ten times the length of their bacterial counterparts. This size limits large-scale synthetic biology efforts in yeasts. Here we address this shortcoming by establishing a methodical workflow necessary to identify robust minimal core elements that can be linked with minimal upstream activating sequences to develop short, yet strong yeast promoters. Through a series of library-based synthesis, analysis, and robustness tests, we create a set of non-homologous, purely synthetic, minimal promoters for yeast. These promoters are composed of short core elements that are generic and interoperable and 10 bp UAS elements that impart strong, constitutive function. Through this methodology, we are able to generate the shortest fungal promoters to date, which can achieve high levels of both inducible and constitutive expression with up to an 80% reduction in size. PMID:26183606

  3. Response Surface Methodology

    DTIC Science & Technology

    2004-10-01

    ...methods. All three of these topics are usually combined into Response Surface Methodology (RSM). Also the experimenter may encounter situations where... Keywords: Response Surface Methodology (RSM), regression analysis, linear...

  4. Aortic valve surgery - minimally invasive

    MedlinePlus

    ... of the heart is reduced. This is called aortic stenosis. The aortic valve can be replaced using: Minimally ... RN, Wang A. Percutaneous heart valve replacement for aortic stenosis: state of the evidence. Ann Intern Med. 2010; ...

  5. Shapes of embedded minimal surfaces.

    PubMed

    Colding, Tobias H; Minicozzi, William P

    2006-07-25

    Surfaces that locally minimize area have been extensively used to model physical phenomena, including soap films, black holes, compound polymers, protein folding, etc. The mathematical field dates to the 1740s but has recently become an area of intense mathematical and scientific study, specifically in the areas of molecular engineering, materials science, and nanotechnology because of their many anticipated applications. In this work, we show that all minimal surfaces are built out of pieces of the surfaces in Figs. 1 and 2.
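
    The figures referred to are not reproduced in this record; presumably they depict the classical model surfaces. For reference, the standard parametrizations of the two best-known embedded minimal surfaces, the helicoid and the catenoid (both with vanishing mean curvature, H = 0), are:

      \text{helicoid: } (u,v) \mapsto (v\cos u,\; v\sin u,\; c\,u),
      \qquad
      \text{catenoid: } (u,v) \mapsto (c\cosh v\cos u,\; c\cosh v\sin u,\; c\,v).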

  6. Specialized minimal PDFs for optimized LHC calculations.

    PubMed

    Carrazza, Stefano; Forte, Stefano; Kassabov, Zahari; Rojo, Juan

    2016-01-01

    We present a methodology for the construction of parton distribution functions (PDFs) designed to provide an accurate representation of PDF uncertainties for specific processes or classes of processes with a minimal number of PDF error sets: specialized minimal PDF sets, or SM-PDFs. We construct these SM-PDFs in such a way that sets corresponding to different input processes can be combined without losing information, specifically as regards their correlations, and that they are robust upon smooth variations of the kinematic cuts. The proposed strategy never discards information, so that the SM-PDF sets can be enlarged by the addition of new processes, until the prior PDF set is eventually recovered for a large enough set of processes. We illustrate the method by producing SM-PDFs tailored to Higgs, top-quark pair, and electroweak gauge boson physics, and we determine that, when the PDF4LHC15 combined set is used as the prior, around 11, 4, and 11 Hessian eigenvectors, respectively, are enough to fully describe the corresponding processes.
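
    In the same spirit (this is an illustrative sketch, not the authors' SM-PDF code), a singular value decomposition of the matrix of prediction shifts under each Hessian eigenvector shows how few directions dominate the uncertainty for a given observable set:

      import numpy as np

      rng = np.random.default_rng(0)
      n_eig, n_obs = 100, 30  # hypothetical prior eigenvectors and target observables

      # X[i, j]: shift of observable j under Hessian eigenvector i; the row scaling
      # mimics a hierarchy of eigenvector relevance for one class of processes.
      X = rng.normal(size=(n_eig, n_obs)) * rng.exponential(1.0, size=(n_eig, 1))

      # Singular values rank linear combinations of eigenvectors by how much of the
      # observables' PDF variance they carry.
      _, S, _ = np.linalg.svd(X, full_matrices=False)
      frac = np.cumsum(S**2) / np.sum(S**2)
      n_keep = int(np.searchsorted(frac, 0.999)) + 1
      print(f"{n_keep} reduced eigenvectors retain 99.9% of the variance (prior: {n_eig})")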

  7. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed
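
    For context, the traditional nested (double-loop) formulation that the unilevel and decoupled methods reformulate can be written as follows; the notation is generic, not taken from the thesis:

      \begin{align*}
        \min_{d}\; & f(d) \\
        \text{s.t.}\; & P\big[g_i(d, X) \le 0\big] \le P_{f_i}^{\mathrm{allow}},
          \quad i = 1, \dots, m, \\
        & d^{L} \le d \le d^{U},
      \end{align*}

    where d are the design variables and X the random variables. Each failure probability requires an inner reliability analysis (e.g., FORM), which is what makes the nested approach expensive and motivates the unilevel and decoupled reformulations.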

  8. Minimally invasive surgery for atrial fibrillation

    PubMed Central

    Suwalski, Piotr

    2013-01-01

    Atrial fibrillation (AF) remains the most common cardiac arrhythmia, affecting nearly 2% of the general population worldwide. Minimally invasive surgical ablation remains one of the most dynamically evolving fields of modern cardiac surgery. While there are more than a dozen issues driving this development, two seem to play the most important role. First, there is a lack of evidence supporting the percutaneous catheter-based approach to treat patients with persistent and long-standing persistent AF. The paucity of such data offers the surgical community an unparalleled opportunity to challenge guidelines and change indications for surgical intervention. Large, multicenter prospective clinical studies are therefore of utmost importance, as well as honest, clear data reporting. Second, a collaborative methodology started a long-awaited debate on a Heart Team approach to AF, similar to the debate on coronary artery disease and transcatheter valves. Appropriate patient selection and tailored treatment options will most certainly result in better outcomes and patient satisfaction, coupled with appropriate use of always-limited institutional resources. The aim of this review, unlike other reviews of minimally invasive surgical ablation, is to present medical professionals with two distinctly different approaches. The first is purely surgical: standalone surgical isolation of the pulmonary veins using a bipolar energy source with concomitant amputation of the left atrial appendage, the method of choice in one of the most important clinical trials on AF, the Atrial Fibrillation Catheter Ablation Versus Surgical Ablation Treatment (FAST) Trial. The second represents the most complex approach to this problem: a multidisciplinary, combined effort of a cardiac surgeon and an electrophysiologist. The Convergent Procedure, which includes both endocardial and epicardial unipolar ablation, bonds together minimally invasive endoscopic surgery with electroanatomical mapping to deliver best of

  9. Minimal but non-minimal inflation and electroweak symmetry breaking

    SciTech Connect

    Marzola, Luca; Racioppi, Antonio

    2016-10-07

    We consider the most minimal scale-invariant extension of the standard model that allows for successful radiative electroweak symmetry breaking and inflation. The framework involves an extra scalar singlet that plays the rôle of the inflaton and is compatible with current experimental bounds owing to the non-minimal coupling of the latter to gravity. This inflationary scenario predicts a very low tensor-to-scalar ratio r ≈ 10⁻³, typical of Higgs-inflation models, but in contrast yields a scalar spectral index n_s ≃ 0.97, which departs from the Starobinsky limit. We briefly discuss the collider phenomenology of the framework.

  10. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  11. The New Minimal Standard Model

    SciTech Connect

    Davoudiasl, Hooman; Kitano, Ryuichiro; Li, Tianjun; Murayama, Hitoshi

    2005-01-13

    We construct the New Minimal Standard Model that incorporates the new discoveries of physics beyond the Minimal Standard Model (MSM): Dark Energy, non-baryonic Dark Matter, neutrino masses, as well as baryon asymmetry and cosmic inflation, adopting the principle of minimal particle content and the most general renormalizable Lagrangian. We base the model purely on empirical facts rather than aesthetics. We need only six new degrees of freedom beyond the MSM. It is free from excessive flavor-changing effects, CP violation, too-rapid proton decay, problems with electroweak precision data, and unwanted cosmological relics. Any model of physics beyond the MSM should be measured against the phenomenological success of this model.

  12. Does Minimally Invasive Spine Surgery Minimize Surgical Site Infections?

    PubMed Central

    Patel, Ravish Shammi; Dutta, Shumayou

    2016-01-01

    Study Design Retrospective review of prospectively collected data. Purpose To evaluate the incidence of surgical site infections (SSIs) in minimally invasive spine surgery (MISS) in a cohort of patients and compare with available historical data on SSI in open spinal surgery cohorts, and to evaluate additional direct costs incurred due to SSI. Overview of Literature SSI can lead to prolonged antibiotic therapy, extended hospitalization, repeated operations, and implant removal. Small incisions and minimal dissection intrinsic to MISS may minimize the risk of postoperative infections. However, there is a dearth of literature on infections after MISS and their additional direct financial implications. Methods All patients from January 2007 to January 2015 undergoing posterior spinal surgery with tubular retractor system and microscope in our institution were included. The procedures performed included tubular discectomies, tubular decompressions for spinal stenosis and minimally invasive transforaminal lumbar interbody fusion (TLIF). The incidence of postoperative SSI was calculated and compared to the range of cited SSI rates from published studies. Direct costs were calculated from medical billing for index cases and for patients with SSI. Results A total of 1,043 patients underwent 763 noninstrumented surgeries (discectomies, decompressions) and 280 instrumented (TLIF) procedures. The mean age was 52.2 years with male:female ratio of 1.08:1. Three infections were encountered with fusion surgeries (mean detection time, 7 days). All three required wound wash and debridement with one patient requiring unilateral implant removal. Additional direct cost due to infection was $2,678 per 100 MISS-TLIF. SSI increased hospital expenditure per patient 1.5-fold after instrumented MISS. Conclusions Overall infection rate after MISS was 0.29%, with SSI rate of 0% in non-instrumented MISS and 1.07% with instrumented MISS. MISS can markedly reduce the SSI rate and can be an
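
    The quoted rates follow directly from the cohort counts; a short check (counts taken from the abstract):

      # Recomputing the infection rates quoted in the abstract from the cohort counts.
      total, noninstr, instr = 1043, 763, 280
      ssi_noninstr, ssi_instr = 0, 3

      print(f"non-instrumented SSI rate: {ssi_noninstr / noninstr:.2%}")    # 0.00%
      print(f"instrumented (TLIF) SSI rate: {ssi_instr / instr:.2%}")       # 1.07%
      print(f"overall SSI rate: {(ssi_noninstr + ssi_instr) / total:.2%}")  # 0.29%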

  13. Shapes of embedded minimal surfaces

    PubMed Central

    Colding, Tobias H.; Minicozzi, William P.

    2006-01-01

    Surfaces that locally minimize area have been extensively used to model physical phenomena, including soap films, black holes, compound polymers, protein folding, etc. The mathematical field dates to the 1740s but has recently become an area of intense mathematical and scientific study, specifically in the areas of molecular engineering, materials science, and nanotechnology because of their many anticipated applications. In this work, we show that all minimal surfaces are built out of pieces of the surfaces in Figs. 1 and 2. PMID:16847265

  14. Rovers minimize human disturbance in research on wild animals.

    PubMed

    Le Maho, Yvon; Whittington, Jason D; Hanuise, Nicolas; Pereira, Louise; Boureau, Matthieu; Brucker, Mathieu; Chatelain, Nicolas; Courtecuisse, Julien; Crenner, Francis; Friess, Benjamin; Grosbellet, Edith; Kernaléguen, Laëtitia; Olivier, Frédérique; Saraux, Claire; Vetter, Nathanaël; Viblanc, Vincent A; Thierry, Bernard; Tremblay, Pascale; Groscolas, René; Le Bohec, Céline

    2014-12-01

    Investigating wild animals while minimizing human disturbance remains an important methodological challenge. When approached by a remote-operated vehicle (rover), which can be equipped to make radio-frequency identifications, wild penguins had significantly lower and shorter stress responses (determined by heart rate and behavior) than when approached by humans. Upon immobilization, the rover, unlike humans, did not disorganize colony structure, and stress rapidly ceased. Thus, rovers can reduce human disturbance of wild animals and the resulting scientific bias.

  15. Minimally invasive aortic valve surgery

    PubMed Central

    Castrovinci, Sebastiano; Emmanuel, Sam; Moscarelli, Marco; Murana, Giacomo; Caccamo, Giuseppa; Bertolino, Emanuela Clara; Nasso, Giuseppe; Speziale, Giuseppe; Fattouch, Khalil

    2016-01-01

    Aortic valve disease is a prevalent disorder that affects approximately 2% of the general adult population. Surgical aortic valve replacement is the gold standard treatment for symptomatic patients. This treatment has demonstrably proven to be both safe and effective. Over the last few decades, in an attempt to reduce surgical trauma, different minimally invasive approaches for aortic valve replacement have been developed and are now being increasingly utilized. A narrative review of the literature was carried out to describe the surgical techniques for minimally invasive aortic valve surgery and report the results from different experienced centers. Minimally invasive aortic valve replacement is associated with low perioperative morbidity, mortality and a low conversion rate to full sternotomy. Long-term survival appears to be at least comparable to that reported for conventional full sternotomy. Minimally invasive aortic valve surgery, either with a partial upper sternotomy or a right anterior minithoracotomy provides early- and long-term benefits. Given these benefits, it may be considered the standard of care for isolated aortic valve disease. PMID:27582764

  16. Wilson loops in minimal surfaces

    SciTech Connect

    Drukker, Nadav; Gross, David J.; Ooguri, Hirosi

    1999-04-27

    The AdS/CFT correspondence suggests that the Wilson loop of the large N gauge theory with N = 4 supersymmetry in 4 dimensions is described by a minimal surface in AdS₅ × S⁵. The authors examine various aspects of this proposal, comparing gauge theory expectations with computations of minimal surfaces. There is a distinguished class of loops, which the authors call BPS loops, whose expectation values are free from ultraviolet divergence. They formulate the loop equation for such loops. To the extent that they have checked, the minimal surface in AdS₅ × S⁵ gives a solution of the equation. The authors also discuss the zig-zag symmetry of the loop operator. In the N = 4 gauge theory, they expect the zig-zag symmetry to hold when the loop does not couple to the scalar fields in the supermultiplet. They show how this is realized for the minimal surface.

  17. A Defense of Semantic Minimalism

    ERIC Educational Resources Information Center

    Kim, Su

    2012-01-01

    Semantic Minimalism is a position about the semantic content of declarative sentences, i.e., the content that is determined entirely by syntax. It is defined by the following two points: "Point 1": The semantic content is a complete/truth-conditional proposition. "Point 2": The semantic content is useful to a theory of…

  18. LLNL Waste Minimization Program Plan

    SciTech Connect

    Not Available

    1990-02-14

    This document is the February 14, 1990 version of the LLNL Waste Minimization Program Plan (WMPP). The Waste Minimization Policy field has undergone continuous change since its formal inception in the 1984 HSWA legislation. The first LLNL WMPP, Revision A, is dated March 1985. A series of informal revisions was made on approximately a semi-annual basis. This Revision 2 is the third formal issuance of the WMPP document. EPA has issued a proposed new policy statement on source reduction and recycling. This policy reflects a preventative strategy to reduce or eliminate the generation of environmentally harmful pollutants which may be released to the air, land surface, water, or ground water. In accordance with this new policy, new guidance to hazardous waste generators on the elements of a Waste Minimization Program was issued. In response to these policies, DOE has revised and issued implementation guidance for DOE Order 5400.1, Waste Minimization Plan and Waste Reduction Reporting of DOE Hazardous, Radioactive, and Radioactive Mixed Wastes, final draft January 1990. This WMPP is formatted to meet the current DOE guidance outlines. The current WMPP will be revised to reflect all of these proposed changes when guidelines are established. Updates, changes, and revisions to the overall LLNL WMPP will be made as appropriate to reflect ever-changing regulatory requirements. 3 figs., 4 tabs.

  19. What is minimally invasive dentistry?

    PubMed

    Ericson, Dan

    2004-01-01

    Minimally Invasive Dentistry is the application of "a systematic respect for the original tissue." This implies that the dental profession recognizes that an artifact is of less biological value than the original healthy tissue. Minimally invasive dentistry is a concept that can embrace all aspects of the profession. The common denominator is tissue preservation, preferably by preventing disease from occurring and intercepting its progress, but also by removing and replacing with as little tissue loss as possible. It does not suggest that we make small fillings to restore incipient lesions or surgically remove impacted third molars without symptoms as routine procedures. The introduction of predictable adhesive technologies has led to a giant leap in interest in minimally invasive dentistry. The concept bridges the traditional gap between prevention and surgical procedures, which is just what dentistry needs today. The evidence base for survival of restorations clearly indicates that restoring teeth is a temporary palliative measure that is doomed to fail if the disease that caused the condition is not addressed properly. Today, the means, motives, and opportunities for minimally invasive dentistry are at hand, but incentives are definitely lacking. Patients and third parties seem to be convinced that the only things that count are replacements. Namely, they are prepared to pay for a filling but not for a procedure that can help avoid having one.

  20. Menopause and Methodological Doubt

    ERIC Educational Resources Information Center

    Spence, Sheila

    2005-01-01

    Menopause and methodological doubt begins by making a tongue-in-cheek comparison between Descartes' methodological doubt and the self-doubt that can arise around menopause. A hermeneutic approach is taken in which Cartesian dualism and its implications for the way women are viewed in society are examined, both through the experiences of women…

  1. Data Centric Development Methodology

    ERIC Educational Resources Information Center

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly been adopting a software methodology, such as a waterfall or Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…

  2. The Methodology of Magpies

    ERIC Educational Resources Information Center

    Carter, Susan

    2014-01-01

    Arts/Humanities researchers frequently do not explain methodology overtly; instead, they "perform" it through their use of language, textual and historic cross-reference, and theory. Here, methodologies from literary studies are shown to add to Higher Education (HE) an exegetical and critically pluralist approach. This includes…

  3. Propeller aeroacoustic methodologies

    NASA Technical Reports Server (NTRS)

    Korkan, K. D.; Gregorek, G. M.

    1980-01-01

    Aspects related to propeller performance are addressed by means of a review of propeller methodologies. Preliminary wind tunnel propeller performance data are presented and the predominant limitations of existing propeller performance methodologies are discussed. Airfoil developments appropriate for propeller applications are also reviewed.

  4. Light modular rig for minimal environment impact

    SciTech Connect

    Mehra, S.; Abedrabbo, A.

    1996-12-31

    The first plenary meeting of the United Nations on the Human Environment in 1972 considered the need for a common outlook and for common principles to inspire and guide the people and industries of the world in the preservation and enhancement of the human environment. Since then, many countries have enacted, or are now enacting, environmental legislation covering the wide spectrum of environmental protection issues. The petroleum industry has not been immune to such scrutiny; however, little has changed in land-based drilling operations, especially in remote areas. A major aspect of the ongoing program in the design of a light modular land rig has been minimization of the environmental impact. Today, concerns for protection of the environment have spread in many drilling areas: the use of some traditional drilling techniques, such as waste pits, is now banned. When rethinking rig hardware and design today, environmental protection needs to be considered at an early stage. There are many incentives for implementation of environmental protection programs, in design and in operation, aside from the regulatory/compliance issue. Waste disposal costs have risen dramatically over the last few years and the trend is expected to continue. Improvements in environmental conditions improve morale and image. There is growing public awareness and realization of the man-made harm in many regions of the earth: dangerous levels of pollution in water, air, earth, and living beings; major and undesirable disturbances to the ecological balance of the biosphere; destruction and depletion of irreplaceable resources; and gross deficiencies harmful to the physical, mental, and social health of man in the living and working environment. This paper discusses the steps taken, early on in the design stage and in the operations methodology, to minimize the environmental impact.

  5. Minimally invasive surgical approach to pancreatic malignancies

    PubMed Central

    Bencini, Lapo; Annecchiarico, Mario; Farsi, Marco; Bartolini, Ilenia; Mirasolo, Vita; Guerra, Francesco; Coratti, Andrea

    2015-01-01

    Pancreatic surgery for malignancy is recognized as challenging for surgeons and risky for patients due to substantial perioperative morbidity and mortality. Furthermore, the oncological long-term results are largely disappointing, even for those patients who experience an uneventful hospital stay. Nevertheless, surgery still remains the cornerstone of a multidisciplinary treatment for pancreatic cancer. In order to maximize the benefits of surgery, the advent of both laparoscopy and robotics has led many surgeons to treat pancreatic cancers with these new methodologies. The reduction of postoperative complications, length of hospital stay, and pain, together with a shorter interval between surgery and the beginning of adjuvant chemotherapy, represent the potential advantages over conventional surgery. Lastly, a better cosmetic result, although not crucial in any cancer patient, could also play a role by improving overall well-being and patient self-perception. The laparoscopic approach to pancreatic surgery is, however, difficult in inexperienced hands and requires dedicated training in both advanced laparoscopy and pancreatic surgery. The recent wide diffusion of the da Vinci® robotic platform seems to facilitate many of the technical maneuvers, such as anastomotic biliary and pancreatic reconstructions, accurate lymphadenectomy, and vascular sutures. The two main pancreatic operations, distal pancreatectomy and pancreaticoduodenectomy, are approachable by a minimally invasive path, but more limited interventions such as enucleation are also feasible. Nevertheless, a word of caution is warranted regarding the increasing costs of these newest technologies; the main concerns are the maintenance of all oncological standards and the lack of long-term follow-up. The purpose of this review is to examine the evidence for the use of minimally invasive surgery in pancreatic cancer (and less aggressive tumors

  6. Anaesthesia for minimally invasive surgery

    PubMed Central

    Dec, Marta

    2015-01-01

    Minimally invasive surgery (MIS) is rising in popularity. It offers well-known benefits to the patient. However, restricted access to the surgical site and gas insufflation into the body cavities may result in severe complications. From the anaesthetic point of view MIS poses unique challenges associated with creation of pneumoperitoneum, carbon dioxide absorption, specific positioning and monitoring a patient to whom the anaesthetist has often restricted access, in a poorly lit environment. Moreover, with refinement of surgical procedures and growing experience the anaesthetist is presented with patients from high-risk groups (obese, elderly, with advanced cardiac and respiratory disease) who once were deemed unsuitable for the laparoscopic technique. Anaesthetic management is aimed at getting the patient safely through the procedure, minimizing the specific risks arising from laparoscopy and the patient's coexisting medical problems, ensuring quick recovery and a relatively pain-free postoperative course with early return to normal function. PMID:26865885

  7. Minimal universal quantum heat machine.

    PubMed

    Gelbwaser-Klimovsky, D; Alicki, R; Kurizki, G

    2013-01-01

    In traditional thermodynamics the Carnot cycle yields the ideal performance bound of heat engines and refrigerators. We propose and analyze a minimal model of a heat machine that can play a similar role in quantum regimes. The minimal model consists of a single two-level system with periodically modulated energy splitting that is permanently and weakly coupled to two spectrally separated heat baths at different temperatures. The equation of motion allows us to compute the stationary power and heat currents in the machine consistent with the second law of thermodynamics. This dual-purpose machine can act as either an engine or a refrigerator (heat pump) depending on the modulation rate. In both modes of operation, the maximal Carnot efficiency is reached at zero power. We study the conditions for finite-time optimal performance for several variants of the model. Possible realizations of the model are discussed.

  8. Minimal Doubling and Point Splitting

    SciTech Connect

    Creutz, M.

    2010-06-14

    Minimally-doubled chiral fermions have the unusual property of a single local field creating two fermionic species. Spreading the field over hypercubes allows construction of combinations that isolate specific modes. Combining these fields into bilinears produces meson fields of specific quantum numbers. Minimally-doubled fermion actions present the possibility of fast simulations while maintaining one exact chiral symmetry. They do, however, introduce some peculiar aspects. An explicit breaking of hyper-cubic symmetry allows additional counter-terms to appear in the renormalization. While a single field creates two different species, spreading this field over nearby sites allows isolation of specific states and the construction of physical meson operators. Finally, lattice artifacts break isospin and give two of the three pseudoscalar mesons an additional contribution to their mass. Depending on the sign of this mass splitting, one can either have a traditional Goldstone pseudoscalar meson or a parity breaking Aoki-like phase.

  9. Methodologies for clinical ethics.

    PubMed

    Drane, J F

    1990-01-01

    Truly professional medical ethics requires a methodology that generates both moral discernment and consistently right judgments. In this article the author briefly reviews difficulties involved in ethical decision-making, the historical development of casuistry, and four ethical methodologies employed in clinical medicine today. These latter, which are outlined and compared, are as follows: the methodology developed by David Thomasma in the 1960s and 1970s; one created by Jonsen, Siegler, and Winslade; another developed by the author; and the Bochum Protocol authored by Hans-Martin Sass et al. of the Bochum Center for Medical Ethics in the Federal Republic of Germany.

  10. Principle of minimal work fluctuations.

    PubMed

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality ⟨e^{-βW}⟩ = e^{-βΔF}, a change in the fluctuations of e^{-βW} may impact how rapidly the statistical average of e^{-βW} converges towards the theoretical value e^{-βΔF}, where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^{-βW}. In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^{-βW}, where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain, where the classical work protocol is realizable by an adiabatic process, the classical adiabatic process also yields the minimal fluctuations in e^{-βW}. Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014)].
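
    The convergence issue is easy to see numerically. The toy Python sketch below uses a Gaussian work distribution (not the Landau-Zener model of the paper), for which the Jarzynski equality gives ΔF = μ - βσ²/2, so larger work fluctuations make the sample average of e^{-βW} converge more slowly:

      import numpy as np

      rng = np.random.default_rng(1)
      beta, mu = 1.0, 2.0  # inverse temperature and mean work (arbitrary units)

      # For W ~ N(mu, sigma^2): <e^{-beta W}> = e^{-beta mu + beta^2 sigma^2 / 2},
      # hence dF = mu - beta * sigma^2 / 2 exactly.
      for sigma in (0.5, 1.0, 2.0):
          W = rng.normal(mu, sigma, size=50_000)
          dF_est = -np.log(np.mean(np.exp(-beta * W))) / beta
          dF_exact = mu - beta * sigma**2 / 2
          print(f"sigma={sigma}: dF_est={dF_est:+.3f}, exact={dF_exact:+.3f}")
      # The estimator's scatter around the exact value grows rapidly with sigma,
      # illustrating why suppressing fluctuations in e^{-beta W} matters.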

  11. Outcomes After Minimally Invasive Esophagectomy

    PubMed Central

    Luketich, James D.; Pennathur, Arjun; Awais, Omar; Levy, Ryan M.; Keeley, Samuel; Shende, Manisha; Christie, Neil A.; Weksler, Benny; Landreneau, Rodney J.; Abbas, Ghulam; Schuchert, Matthew J.; Nason, Katie S.

    2014-01-01

    Background Esophagectomy is a complex operation and is associated with significant morbidity and mortality. In an attempt to lower morbidity, we have adopted a minimally invasive approach to esophagectomy. Objectives Our primary objective was to evaluate the outcomes of minimally invasive esophagectomy (MIE) in a large group of patients. Our secondary objective was to compare the modified McKeown minimally invasive approach (videothoracoscopic surgery, laparoscopy, neck anastomosis [MIE-neck]) with our current approach, a modified Ivor Lewis approach (laparoscopy, videothoracoscopic surgery, chest anastomosis [MIE-chest]). Methods We reviewed 1033 consecutive patients undergoing MIE. Elective operation was performed on 1011 patients; 22 patients with nonelective operations were excluded. Patients were stratified by surgical approach and perioperative outcomes analyzed. The primary endpoint studied was 30-day mortality. Results The MIE-neck was performed in 481 (48%) and MIE-Ivor Lewis in 530 (52%). Patients undergoing MIE-Ivor Lewis were operated on in the current era. The median number of lymph nodes resected was 21. The operative mortality was 1.68%. Median length of stay (8 days) and ICU stay (2 days) were similar between the 2 approaches. Mortality rate was 0.9%, and recurrent nerve injury was less frequent in the Ivor Lewis MIE group (P < 0.001). Conclusions MIE in our center resulted in acceptable lymph node resection, postoperative outcomes, and low mortality using either an MIE-neck or an MIE-chest approach. The MIE Ivor Lewis approach was associated with reduced recurrent laryngeal nerve injury and mortality of 0.9% and is now our preferred approach. Minimally invasive esophagectomy can be performed safely, with good results in an experienced center. PMID:22668811

  12. Principle of minimal work fluctuations

    NASA Astrophysics Data System (ADS)

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality ⟨e^{-βW}⟩ = e^{-βΔF}, a change in the fluctuations of e^{-βW} may impact how rapidly the statistical average of e^{-βW} converges towards the theoretical value e^{-βΔF}, where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^{-βW}. In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^{-βW}, where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain, where the classical work protocol is realizable by an adiabatic process, the classical adiabatic process also yields the minimal fluctuations in e^{-βW}. Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014), 10.1103/PhysRevE.90.052132].

  13. Minimal massive 3D gravity

    NASA Astrophysics Data System (ADS)

    Bergshoeff, Eric; Hohm, Olaf; Merbis, Wout; Routh, Alasdair J.; Townsend, Paul K.

    2014-07-01

    We present an alternative to topologically massive gravity (TMG) with the same ‘minimal’ bulk properties; i.e. a single local degree of freedom that is realized as a massive graviton in linearization about an anti-de Sitter (AdS) vacuum. However, in contrast to TMG, the new ‘minimal massive gravity’ has both a positive energy graviton and positive central charges for the asymptotic AdS-boundary conformal algebra.

  14. Technology transfer methodology

    NASA Technical Reports Server (NTRS)

    Labotz, Rich

    1991-01-01

    Information on technology transfer methodology is given in viewgraph form. Topics covered include problems in economics, technology drivers, inhibitors to using improved technology in development, technology application opportunities, and co-sponsorship of technology.

  15. Methodology for Stochastic Modeling.

    DTIC Science & Technology

    1985-01-01

    Methodology for Stochastic Modeling; Army Materiel Systems Analysis Activity, Aberdeen Proving Ground, MD; H. E. Cohen. Keywords: autoregression models, moving average models, ARMA, adaptive modeling, covariance methods, singular value decomposition, order determination, rational...

  16. Minimal Absent Words in Four Human Genome Assemblies

    PubMed Central

    Garcia, Sara P.; Pinho, Armando J.

    2011-01-01

    Minimal absent words have been computed in genomes of organisms from all domains of life. Here, we aim to contribute to the catalogue of human genomic variation by investigating the variation in number and content of minimal absent words within a species, using four human genome assemblies. We compare the reference human genome GRCh37 assembly, the HuRef assembly of the genome of Craig Venter, the NA12878 assembly from cell line GM12878, and the YH assembly of the genome of a Han Chinese individual. We find the variation in number and content of minimal absent words between assemblies more significant for large and very large minimal absent words, where the biases of sequencing and assembly methodologies become more pronounced. Moreover, we find generally greater similarity between the human genome assemblies sequenced with capillary-based technologies (GRCh37 and HuRef) than between the human genome assemblies sequenced with massively parallel technologies (NA12878 and YH). Finally, as expected, we find the overall variation in number and content of minimal absent words within a species to be generally smaller than the variation between species. PMID:22220210
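
    For concreteness, here is a naive reference implementation of the definition: a word w = aub is a minimal absent word if w is absent from the sequence while its maximal proper factors au and ub both occur. Genome-scale tools use suffix automata or suffix arrays, so this sketch is only for small inputs:

      def minimal_absent_words(s, alphabet="ACGT", max_len=8):
          """Naive minimal-absent-word finder for short strings."""
          # All factors (substrings) of s up to length max_len.
          factors = {s[i:j] for i in range(len(s))
                     for j in range(i + 1, min(i + max_len, len(s)) + 1)}
          # Candidate cores u, so that |a + u + b| <= max_len.
          cores = [""] + [f for f in factors if len(f) <= max_len - 2]
          maws = set()
          for u in cores:
              for a in alphabet:
                  for b in alphabet:
                      w = a + u + b
                      if w not in factors and a + u in factors and u + b in factors:
                          maws.add(w)
          return sorted(maws, key=lambda w: (len(w), w))

      print(minimal_absent_words("ACGTACGG", max_len=4))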

  17. Multiple myeloma, immunotherapy and minimal residual disease.

    PubMed

    Kusenda, J; Kovarikova, A

    2016-01-01

    Multiple myeloma (MM) is an incurable, heterogeneous hematological malignancy in which relapse is characterized by re-growth of residual tumor and immune suppression, with a complex biology that affects many aspects of the disease and its response to treatment. The bone marrow microenvironment, including immune cells, plays a central role in MM pathogenesis, survival, and drug resistance. Advances in basic and translational research and the introduction of novel agents, particularly combination therapies, have improved indicators of quality of life and survival. Minimal residual disease (MRD) detection by multiparameter flow cytometry (MFC) has revolutionized monitoring of treatment response in MM. The importance of MFC methodology will be further strengthened by the ongoing international standardization efforts. Results of MRD testing provide unique and clinically important information and have demonstrated the prognostic significance of MRD in patients, leading to the regulation of treatment intensity in many contemporary protocols. In this review, we summarize the principal approaches in MM immunotherapy, focusing on how new agents have potential in the treatment of MM and on how application of MRD detection by MFC as a surrogate endpoint would allow quicker evaluation of treatment outcomes and rapid identification of effective new therapies.

  18. Temporal structure of consciousness and minimal self in schizophrenia.

    PubMed

    Martin, Brice; Wittmann, Marc; Franck, Nicolas; Cermolacce, Michel; Berna, Fabrice; Giersch, Anne

    2014-01-01

    The concept of the minimal self refers to the consciousness of oneself as an immediate subject of experience. According to recent studies, disturbances of the minimal self may be a core feature of schizophrenia. They are emphasized in classical psychiatry literature and in phenomenological work. Impaired minimal self-experience may be defined as a distortion of one's first-person experiential perspective as, for example, an "altered presence" during which the sense of the experienced self ("mineness") is subtly affected, or "altered sense of demarcation," i.e., a difficulty discriminating the self from the non-self. Little is known, however, about the cognitive basis of these disturbances. In fact, recent work indicates that disorders of the self are not correlated with cognitive impairments commonly found in schizophrenia such as working-memory and attention disorders. In addition, a major difficulty with exploring the minimal self experimentally lies in its definition as being non-self-reflexive, and distinct from the verbalized, explicit awareness of an "I." In this paper, we shall discuss the possibility that disturbances of the minimal self observed in patients with schizophrenia are related to alterations in time processing. We shall review the literature on schizophrenia and time processing that lends support to this possibility. In particular we shall discuss the involvement of temporal integration windows on different time scales (implicit time processing) as well as duration perception disturbances (explicit time processing) in disorders of the minimal self. We argue that a better understanding of the relationship between time and the minimal self as well of issues of embodiment require research that looks more specifically at implicit time processing. Some methodological issues will be discussed.

  19. Temporal structure of consciousness and minimal self in schizophrenia

    PubMed Central

    Martin, Brice; Wittmann, Marc; Franck, Nicolas; Cermolacce, Michel; Berna, Fabrice; Giersch, Anne

    2014-01-01

    The concept of the minimal self refers to the consciousness of oneself as an immediate subject of experience. According to recent studies, disturbances of the minimal self may be a core feature of schizophrenia. They are emphasized in classical psychiatry literature and in phenomenological work. Impaired minimal self-experience may be defined as a distortion of one’s first-person experiential perspective as, for example, an “altered presence” during which the sense of the experienced self (“mineness”) is subtly affected, or “altered sense of demarcation,” i.e., a difficulty discriminating the self from the non-self. Little is known, however, about the cognitive basis of these disturbances. In fact, recent work indicates that disorders of the self are not correlated with cognitive impairments commonly found in schizophrenia such as working-memory and attention disorders. In addition, a major difficulty with exploring the minimal self experimentally lies in its definition as being non-self-reflexive, and distinct from the verbalized, explicit awareness of an “I.” In this paper, we shall discuss the possibility that disturbances of the minimal self observed in patients with schizophrenia are related to alterations in time processing. We shall review the literature on schizophrenia and time processing that lends support to this possibility. In particular we shall discuss the involvement of temporal integration windows on different time scales (implicit time processing) as well as duration perception disturbances (explicit time processing) in disorders of the minimal self. We argue that a better understanding of the relationship between time and the minimal self as well of issues of embodiment require research that looks more specifically at implicit time processing. Some methodological issues will be discussed. PMID:25400597

  20. Minimally invasive PCNL-MIP.

    PubMed

    Zanetti, Stefano Paolo; Boeri, Luca; Gallioli, Andrea; Talso, Michele; Montanari, Emanuele

    2017-01-01

    Miniaturized percutaneous nephrolithotomy (mini-PCNL) has increased in popularity in recent years and is now widely used to overcome the therapeutic gap between conventional PCNL and less-invasive procedures such as shock wave lithotripsy (SWL) or flexible ureterorenoscopy (URS) for the treatment of renal stones. However, despite its minimally invasive nature, the superiority in terms of safety, as well as the similar efficacy of mini-PCNL compared to conventional procedures, is still under debate. The aim of this chapter is to present one of the most recent advancements in mini-PCNL: the Karl Storz "minimally invasive PCNL" (MIP). A literature search for original and review articles either published or e-published up to December 2016 was performed using Google and the PubMed database. Keywords included: minimally invasive PCNL; MIP. The retrieved articles were gathered and examined. The complete MIP set is composed of different-sized rigid metallic fiber-optic nephroscopes and different-sized metallic operating sheaths, according to which the MIP is categorized into extra-small (XS), small (S), medium (M) and large (L). Dilation can be performed either in one step or with a progressive technique, as needed. The reusable devices of the MIP and the vacuum-cleaner effect make PCNL with this set an inexpensive procedure. The possibility to shift from a small to a larger instrument within the same set (Matrioska technique) makes MIP a very versatile technique suitable for the treatment of almost any stone. Studies in the literature have shown that MIP is equally effective, with comparable rates of post-operative complications, as conventional PCNL, independently of stone size. MIP does not represent a new technique, but rather a combination of the last ten years of PCNL improvements in a single system that can transversally cover all available techniques in the panorama of percutaneous stone treatment.

  1. Prepulse minimization in KALI-5000.

    PubMed

    Kumar, D Durga Praveen; Mitra, S; Senthil, K; Sharma, Vishnu K; Singh, S K; Roy, A; Sharma, Archana; Nagesh, K V; Chakravarthy, D P

    2009-07-01

    A pulse power system (1 MV, 50 kA, and 100 ns) based on a Marx generator and a Blumlein pulse forming line has been built for generating high power microwaves. The Blumlein configuration poses a prepulse problem, and hence the diode gap had to be increased to match the diode impedance to the Blumlein impedance during the main pulse. A simple method to eliminate the prepulse voltage using a vacuum spark gap and a resistor is given. Another fundamental approach, increasing the inductance of the Marx generator to minimize the prepulse voltage, is also presented. Experimental results for both of these configurations are given.

  2. Prepulse minimization in KALI-5000

    NASA Astrophysics Data System (ADS)

    Kumar, D. Durga Praveen; Mitra, S.; Senthil, K.; Sharma, Vishnu K.; Singh, S. K.; Roy, A.; Sharma, Archana; Nagesh, K. V.; Chakravarthy, D. P.

    2009-07-01

    A pulse power system (1 MV, 50 kA, and 100 ns) based on a Marx generator and a Blumlein pulse forming line has been built for generating high power microwaves. The Blumlein configuration poses a prepulse problem, and hence the diode gap had to be increased to match the diode impedance to the Blumlein impedance during the main pulse. A simple method to eliminate the prepulse voltage using a vacuum spark gap and a resistor is given. Another fundamental approach, increasing the inductance of the Marx generator to minimize the prepulse voltage, is also presented. Experimental results for both of these configurations are given.

  3. Minimally invasive therapy in Denmark.

    PubMed

    Schou, I

    1993-01-01

    Minimally invasive therapy (MIT) is beginning to have an impact on health care in Denmark, although its diffusion has been delayed compared with other European countries. Policy makers are now beginning to appreciate the potential advantages in terms of closing hospitals and shifting treatment to the out-patient setting, and diffusion will probably proceed faster in the future. Denmark does not have a system for technology assessment, either central or regional, and there is no early warning mechanism to survey international developments. This implies a lack of possibilities for planning diffusion, training, and criteria for treatment.

  4. About the ZOOM minimization package

    SciTech Connect

    Fischler, M.; Sachs, D.; /Fermilab

    2004-11-01

    A new object-oriented Minimization package is available for distribution in the same manner as CLHEP. This package, designed for use in HEP applications, has all the capabilities of Minuit, but is a re-write from scratch, adhering to modern C++ design principles. A primary goal of this package is extensibility in several directions, so that its capabilities can be kept fresh with as little maintenance effort as possible. This package is distinguished by the priority that was assigned to C++ design issues, and the focus on producing an extensible system that will resist becoming obsolete.

  5. Evidence-Based Integrated Environmental Solutions For Secondary Lead Smelters: Pollution Prevention And Waste Minimization Technologies And Practices

    EPA Science Inventory

    An evidence-based methodology was adopted in this research to establish strategies to increase lead recovery and recycling via a systematic review and critical appraisal of the published literature. In particular, the research examines pollution prevention and waste minimization...

  6. Minimizing travel claims cost with minimal-spanning tree model

    NASA Astrophysics Data System (ADS)

    Jamalluddin, Mohd Helmi; Jaafar, Mohd Azrul; Amran, Mohd Iskandar; Ainul, Mohd Sharizal; Hamid, Aqmar; Mansor, Zafirah Mohd; Nopiah, Zulkifli Mohd

    2014-06-01

    Travel demand necessitates large expenditures, as has been shown by the National Audit Department (NAD). Every year the auditing process is carried out throughout the country involving official travel claims. This study focuses on the use of the Spanning Tree model to determine the shortest path network and thereby minimize the cost of the NAD's official travel claims. The objective is to study the possibility of running a network based at the Kluang District Health Office to eight Rural Clinics in Johor state, using Spanning Tree model applications to optimize travelling distances, and to make recommendations to the senior management of the Audit Department on analyzing travelling details before an audit is conducted. The results of this study reveal savings of up to 47.4% of the original claims over the travel distances involved.
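
    The Spanning Tree model referred to above connects all sites while minimizing total edge weight. A minimal sketch of Prim's algorithm over hypothetical office-to-clinic distances follows; the site names and distances are illustrative assumptions, not the NAD data.

```python
import heapq

def minimum_spanning_tree(graph, start):
    """Prim's algorithm; graph maps node -> {neighbour: distance_km}."""
    visited = {start}
    edges = [(w, start, v) for v, w in graph[start].items()]
    heapq.heapify(edges)
    tree, total = [], 0.0
    while edges and len(visited) < len(graph):
        w, u, v = heapq.heappop(edges)   # cheapest edge leaving the tree
        if v in visited:
            continue
        visited.add(v)
        tree.append((u, v, w))
        total += w
        for nxt, wn in graph[v].items():
            if nxt not in visited:
                heapq.heappush(edges, (wn, v, nxt))
    return tree, total

# Hypothetical distances (km) between a district office and three clinics.
graph = {
    "Office":  {"ClinicA": 12, "ClinicB": 20, "ClinicC": 31},
    "ClinicA": {"Office": 12, "ClinicB": 9,  "ClinicC": 25},
    "ClinicB": {"Office": 20, "ClinicA": 9,  "ClinicC": 14},
    "ClinicC": {"Office": 31, "ClinicA": 25, "ClinicB": 14},
}
tree, total = minimum_spanning_tree(graph, "Office")
print(tree, total)  # connects all sites with minimum total distance (35 km)
```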

  7. The minimal time detection algorithm

    NASA Technical Reports Server (NTRS)

    Kim, Sungwan

    1995-01-01

    An aerospace vehicle may operate throughout a wide range of flight environmental conditions that affect its dynamic characteristics. Even when the control design incorporates a degree of robustness, system parameters may drift enough to cause its performance to degrade below an acceptable level. The object of this paper is to develop a change detection algorithm so that we can build a highly adaptive control system applicable to aircraft systems. The idea is to detect system changes with minimal time delay. The algorithm developed is called the Minimal Time-Change Detection Algorithm (MT-CDA), which detects the instant of change as quickly as possible while keeping the false-alarm probability below a specified level. Simulation results for aircraft lateral motion with a known or unknown change in the control gain matrices, in the presence of a doublet input, indicate that the algorithm works fairly well, as theory indicates, though there is difficulty in deciding the exact amount of change in some situations. One of MT-CDA's distinguishing properties is that its detection delay is superior to that of the Whiteness Test.
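
    The MT-CDA itself is not specified in this record. As a generic illustration of the same trade-off — flagging a change as early as possible subject to a false-alarm constraint — a standard one-sided CUSUM detector can be sketched as follows; the drift, threshold, and signal are hypothetical.

```python
import numpy as np

def cusum_detect(residuals, drift=0.5, threshold=8.0):
    """One-sided CUSUM: flag the first instant the cumulative
    evidence of a mean shift exceeds `threshold`.

    `drift` discounts normal fluctuation; raising `threshold`
    lowers the false-alarm probability at the cost of delay.
    """
    s = 0.0
    for t, r in enumerate(residuals):
        s = max(0.0, s + r - drift)
        if s > threshold:
            return t  # detection instant
    return None

rng = np.random.default_rng(0)
# Residuals: zero-mean noise, then a gain change at t = 100 shifts the mean.
res = np.concatenate([rng.normal(0, 1, 100), rng.normal(2, 1, 100)])
print(cusum_detect(res))  # detects shortly after t = 100
```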

  8. Less minimal supersymmetric standard model

    SciTech Connect

    de Gouvea, Andre; Friedland, Alexander; Murayama, Hitoshi

    1998-03-28

    Most of the phenomenological studies of supersymmetry have been carried out using the so-called minimal supergravity scenario, where one assumes a universal scalar mass, gaugino mass, and trilinear coupling at M_GUT. Even though this is a useful simplifying assumption for phenomenological analyses, it is rather too restrictive to accommodate a large variety of phenomenological possibilities. It predicts, among other things, that the lightest supersymmetric particle (LSP) is an almost pure B-ino, and that the mu-parameter is larger than the masses of the SU(2)_L and U(1)_Y gauginos. We extend the minimal supergravity framework by introducing one extra parameter: the Fayet-Iliopoulos D-term for the hypercharge U(1), D_Y. Allowing for this extra parameter, we find a much more diverse phenomenology, where the LSP is the tau sneutrino, the stau, or a neutralino with a large higgsino content. We discuss the relevance of the different possibilities to collider signatures. The same type of extension can be done to models with gauge mediation of supersymmetry breaking. We argue that it is not wise to impose cosmological constraints on the parameter space.

  9. Next-to-minimal SOFTSUSY

    NASA Astrophysics Data System (ADS)

    Allanach, B. C.; Athron, P.; Tunstall, Lewis C.; Voigt, A.; Williams, A. G.

    2014-09-01

    We describe an extension to the SOFTSUSY program that provides for the calculation of the sparticle spectrum in the Next-to-Minimal Supersymmetric Standard Model (NMSSM), where a chiral superfield that is a singlet of the Standard Model gauge group is added to the Minimal Supersymmetric Standard Model (MSSM) fields. Often, a Z3 symmetry is imposed upon the model. SOFTSUSY can calculate the spectrum in this case as well as the case where general Z3-violating terms are added to the soft supersymmetry breaking terms and the superpotential. The user provides a theoretical boundary condition for the couplings and mass terms of the singlet. Radiative electroweak symmetry breaking data along with electroweak and CKM matrix data are used as weak-scale boundary conditions. The renormalisation group equations are solved numerically between the weak scale and a high energy scale using a nested iterative algorithm. This paper serves as a manual to the NMSSM mode of the program, detailing the approximations and conventions used. Catalogue identifier: ADPM_v4_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADPM_v4_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 154886 No. of bytes in distributed program, including test data, etc.: 1870890 Distribution format: tar.gz Programming language: C++, Fortran. Computer: Personal computer. Operating system: Tested on Linux 3.x. Word size: 64 bits Classification: 11.1, 11.6. Does the new version supersede the previous version?: Yes Catalogue identifier of previous version: ADPM_v3_0 Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 785 Nature of problem: Calculating the supersymmetric particle spectrum and mixing parameters in the next-to-minimal supersymmetric standard model. The solution to the

  10. Methodology for research I

    PubMed Central

    Garg, Rakesh

    2016-01-01

    The conduct of research requires a systematic approach involving diligent planning and its execution as planned. It comprises various essential predefined components such as aims, population, conduct/technique, outcome and statistical considerations. These need to be objective, reliable and in a repeatable format. Hence, an understanding of the basic aspects of methodology is essential for any researcher. This is a narrative review focusing on various aspects of the methodology for the conduct of clinical research. Relevant keywords were used for the literature search in various databases and in bibliographies of the articles. PMID:27729690

  11. Methodology for research I.

    PubMed

    Garg, Rakesh

    2016-09-01

    The conduct of research requires a systematic approach involving diligent planning and its execution as planned. It comprises various essential predefined components such as aims, population, conduct/technique, outcome and statistical considerations. These need to be objective, reliable and in a repeatable format. Hence, an understanding of the basic aspects of methodology is essential for any researcher. This is a narrative review focusing on various aspects of the methodology for the conduct of clinical research. Relevant keywords were used for the literature search in various databases and in bibliographies of the articles.

  12. Update on designing and building minimal cells

    PubMed Central

    Jewett, Michael C.; Forster, Anthony C.

    2010-01-01

    Summary Minimal cells comprise only the genes and biomolecular machinery necessary for basic life. Synthesizing minimal and minimized cells will improve understanding of core biology, enhance development of biotechnology strains of bacteria, and enable evolutionary optimization of natural and unnatural biopolymers. Design and construction of minimal cells is proceeding in two different directions: “top-down” reduction of bacterial genomes in vivo and “bottom-up” integration of DNA/RNA/protein/membrane syntheses in vitro. Major progress in the last 5 years has occurred in synthetic genomics, minimization of the Escherichia coli genome, sequencing of minimal bacterial endosymbionts, identification of essential genes, and integration of biochemical systems. PMID:20638265

  13. Strategies to Minimize Antibiotic Resistance

    PubMed Central

    Lee, Chang-Ro; Cho, Ill Hwan; Jeong, Byeong Chul; Lee, Sang Hee

    2013-01-01

    Antibiotic resistance can be reduced by using antibiotics prudently based on guidelines of antimicrobial stewardship programs (ASPs) and various data such as pharmacokinetic (PK) and pharmacodynamic (PD) properties of antibiotics, diagnostic testing, antimicrobial susceptibility testing (AST), clinical response, and effects on the microbiota, as well as by new antibiotic developments. The controlled use of antibiotics in food animals is another cornerstone among efforts to reduce antibiotic resistance. All major resistance-control strategies recommend education for patients, children (e.g., through schools and day care), the public, and relevant healthcare professionals (e.g., primary-care physicians, pharmacists, and medical students) regarding unique features of bacterial infections and antibiotics, prudent antibiotic prescribing as a positive construct, and personal hygiene (e.g., handwashing). The problem of antibiotic resistance can be minimized only by concerted efforts of all members of society for ensuring the continued efficiency of antibiotics. PMID:24036486

  14. Minimally packed phases in holography

    NASA Astrophysics Data System (ADS)

    Donos, Aristomenis; Gauntlett, Jerome P.

    2016-03-01

    We numerically construct asymptotically AdS black brane solutions of D = 4 Einstein-Maxwell theory coupled to a pseudoscalar. The solutions are holographically dual to d = 3 CFTs at finite chemical potential and in a constant magnetic field, which spontaneously break translation invariance leading to the spontaneous formation of abelian and momentum magnetisation currents flowing around the plaquettes of a periodic Bravais lattice. We analyse the three-dimensional moduli space of lattice solutions, which are generically oblique, and show, for a specific value of the magnetic field, that the free energy is minimised by the triangular lattice, associated with minimal packing of circles in the plane. We show that the average stress tensor for the thermodynamically preferred phase is that of a perfect fluid and that this result applies more generally to spontaneously generated periodic phases. The triangular structure persists at low temperatures indicating the existence of novel crystalline ground states.

  15. [MINIMALLY INVASIVE AORTIC VALVE REPLACEMENT].

    PubMed

    Tabata, Minoru

    2016-03-01

    Minimally invasive aortic valve replacement (MIAVR) is defined as aortic valve replacement avoiding full sternotomy. Common approaches include a partial sternotomy, right thoracotomy, and a parasternal approach. MIAVR has been shown to have advantages over conventional AVR, such as shorter length of stay, smaller amounts of blood transfusion, and better cosmesis. However, it is also known to have disadvantages such as longer cardiopulmonary bypass and aortic cross-clamp times and potential complications related to peripheral cannulation. Appropriate patient selection is very important. Since the procedure is more complex than conventional AVR, more intensive teamwork in the operating room is essential. Additionally, a team approach during postoperative management is critical to maximize the benefits of MIAVR.

  16. Minimally Invasive Spigelian Hernia Repair

    PubMed Central

    Baucom, Catherine; Nguyen, Quan D.; Hidalgo, Marco

    2009-01-01

    Introduction: Spigelian hernia is an uncommon ventral hernia characterized by a defect in the linea semilunaris. Repair of spigelian hernia has traditionally been accomplished via an open transverse incision and primary repair. The purpose of this article is to present 2 case reports of incarcerated spigelian hernia that were successfully repaired laparoscopically using Gortex mesh and to present a review of the literature regarding laparoscopic repair of spigelian hernias. Methods: Retrospective chart review and Medline literature search. Results: Two patients underwent laparoscopic mesh repair of incarcerated spigelian hernias. Both were started on a regular diet on postoperative day 1 and discharged on postoperative days 2 and 3. One patient developed a seroma that resolved without intervention. There was complete resolution of preoperative symptoms at the 12-month follow-up. Conclusion: Minimally invasive repair of spigelian hernias is an alternative to the traditional open surgical technique. Further studies are needed to directly compare the open and the laparoscopic repair. PMID:19660230

  17. Strategies to minimize antibiotic resistance.

    PubMed

    Lee, Chang-Ro; Cho, Ill Hwan; Jeong, Byeong Chul; Lee, Sang Hee

    2013-09-12

    Antibiotic resistance can be reduced by using antibiotics prudently based on guidelines of antimicrobial stewardship programs (ASPs) and various data such as pharmacokinetic (PK) and pharmacodynamic (PD) properties of antibiotics, diagnostic testing, antimicrobial susceptibility testing (AST), clinical response, and effects on the microbiota, as well as by new antibiotic developments. The controlled use of antibiotics in food animals is another cornerstone among efforts to reduce antibiotic resistance. All major resistance-control strategies recommend education for patients, children (e.g., through schools and day care), the public, and relevant healthcare professionals (e.g., primary-care physicians, pharmacists, and medical students) regarding unique features of bacterial infections and antibiotics, prudent antibiotic prescribing as a positive construct, and personal hygiene (e.g., handwashing). The problem of antibiotic resistance can be minimized only by concerted efforts of all members of society for ensuring the continued efficiency of antibiotics.

  18. Empowering Research Methodologies.

    ERIC Educational Resources Information Center

    Lather, Patti

    Neo-marxist theory provides a better tool for educational researchers than other research methodologies because of its focus on empowering the dispossessed and its interest in the relationships between human activity and material circumstances. Traditional educational research is rooted in the positivist tradition and claims to be value neutral…

  19. Courseware Engineering Methodology.

    ERIC Educational Resources Information Center

    Uden, Lorna

    2002-01-01

    Describes development of the Courseware Engineering Methodology (CEM), created to guide novices in designing effective courseware. Discusses CEM's four models: pedagogical (concerned with the courseware's pedagogical aspects), conceptual (dealing with software engineering), interface (relating to human-computer interaction), and hypermedia…

  20. Video: Modalities and Methodologies

    ERIC Educational Resources Information Center

    Hadfield, Mark; Haw, Kaye

    2012-01-01

    In this article, we set out to explore what we describe as the use of video in various modalities. For us, modality is a synthesizing construct that draws together and differentiates between the notion of "video" both as a method and as a methodology. It encompasses the use of the term video as both product and process, and as a data…

  1. Minimizing communication cost among distributed controllers in software defined networks

    NASA Astrophysics Data System (ADS)

    Arlimatti, Shivaleela; Elbreiki, Walid; Hassan, Suhaidi; Habbal, Adib; Elshaikh, Mohamed

    2016-08-01

    Software Defined Networking (SDN) is a new paradigm that increases the flexibility of today's networks by providing a programmable network. The fundamental idea behind this new architecture is to simplify network complexity by decoupling the control plane and data plane of the network devices, and by making the control plane centralized. Recently, controllers have been distributed to solve the problem of a single point of failure and to increase scalability and flexibility during workload distribution. Even though controllers are flexible and scalable enough to accommodate more network switches, the intercommunication cost between distributed controllers is still a challenging issue in the Software Defined Network environment. This paper aims to fill the gap by proposing a new mechanism that minimizes intercommunication cost using a graph partitioning algorithm, an NP-hard problem. The methodology proposed in this paper is the swapping of network elements between controller domains to minimize communication cost by calculating the communication gain. The swapping of elements minimizes inter- and intra-domain communication cost. We validate our work with the OMNeT++ simulation environment. Simulation results show that the proposed mechanism minimizes the inter-domain communication cost among controllers compared to traditional distributed controllers.
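
    The communication gain mentioned above can be read as the reduction in inter-domain (cut) cost obtained by swapping two switches between controller domains. Below is a minimal sketch assuming a toy traffic matrix and two domains; the switch names, weights, and partition are hypothetical, not the paper's model.

```python
def cut_cost(edges, part):
    """Total weight of edges crossing controller domains."""
    return sum(w for (u, v), w in edges.items() if part[u] != part[v])

def swap_gain(edges, part, a, b):
    """Change in inter-domain cost if switches a and b swap domains."""
    trial = dict(part)
    trial[a], trial[b] = part[b], part[a]
    return cut_cost(edges, part) - cut_cost(edges, trial)

# Hypothetical switch-to-switch traffic (edge weights) and a 2-domain split.
edges = {("s1", "s2"): 5, ("s2", "s3"): 1, ("s3", "s4"): 4, ("s1", "s4"): 1}
part = {"s1": 0, "s2": 1, "s3": 1, "s4": 0}
print(cut_cost(edges, part))               # initial inter-controller cost: 9
print(swap_gain(edges, part, "s2", "s4"))  # positive gain (7) => beneficial swap
```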

  2. Development of a flight software testing methodology

    NASA Technical Reports Server (NTRS)

    Mccluskey, E. J.; Andrews, D. M.

    1985-01-01

    The research to develop a testing methodology for flight software is described. An experiment was conducted in using assertions to dynamically test digital flight control software. The experiment showed that 87% of typical errors introduced into the program would be detected by assertions. Detailed analysis of the test data showed that the number of assertions needed to detect those errors could be reduced to a minimal set. The analysis also revealed that the most effective assertions tested program parameters that provided greater indirect (collateral) testing of other parameters. In addition, a prototype watchdog task system was built to evaluate the effectiveness of executing assertions in parallel by using the multitasking features of Ada.

  3. Improved methodology for generating controlled test atmospheres.

    PubMed

    Miller, R R; Letts, R L; Potts, W J; McKenna, M J

    1980-11-01

    Improved methodology has been developed for generating controlled test atmospheres. Vaporization of volatile liquids is accomplished in a 28 mm (O.D.) glass J-tube in conjunction with a compressed air flameless heat torch, a pressure-sensitive switch, and a positive displacement piston pump. The vaporization system has been very reliable with a variety of test materials in studies ranging from a few days to several months. The J-tube vaporization assembly minimizes the possibility of thermal decomposition of the test material and affords a better margin of safety when vaporizing potentially explosive materials.

  4. A methodology for distributed fault diagnosis

    NASA Astrophysics Data System (ADS)

    Gupta, V.; Puig, V.; Blesa, J.

    2017-01-01

    In this paper, a methodology for distributed fault diagnosis is proposed. The algorithm places the sensors in a system in such a manner that partitioning the system into various subsystems becomes easier, facilitating the implementation of a distributed fault diagnosis system. The algorithm also minimizes the number of sensors to be installed, thus reducing overall cost. Binary integer linear programming is used for optimization in this algorithm. A real case study of the Barcelona water network has been used to demonstrate and validate the proposed algorithm.
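
    As a toy illustration of the optimization step, selecting the fewest sensors so that every fault remains detectable is a set-cover-style binary program. The sketch below brute-forces it on a hypothetical fault/sensor incidence table; a real instance would use a BILP solver and the paper's water-network constraints.

```python
from itertools import combinations

def min_sensor_set(sensors, faults, detects):
    """Smallest sensor set such that every fault is detected by at
    least one chosen sensor (brute force; BILP solvers scale better).

    detects[(s, f)] is True when sensor s can observe fault f.
    """
    for k in range(1, len(sensors) + 1):
        for subset in combinations(sensors, k):
            if all(any(detects.get((s, f)) for s in subset) for f in faults):
                return subset
    return None

# Hypothetical incidence table for a small water network.
sensors = ["p1", "p2", "p3"]
faults = ["leak_A", "leak_B", "leak_C"]
detects = {("p1", "leak_A"): True, ("p1", "leak_B"): True,
           ("p2", "leak_B"): True, ("p2", "leak_C"): True,
           ("p3", "leak_A"): True}
print(min_sensor_set(sensors, faults, detects))  # -> ('p1', 'p2')
```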

  5. Minimizing Variation in Outdoor CPV Power Ratings: Preprint

    SciTech Connect

    Muller, M.; Marion, B.; Rodriguez, J.; Kurtz, S.

    2011-07-01

    The CPV community has agreed to have both indoor and outdoor power ratings at the module level. The indoor rating provides a repeatable measure of module performance as it leaves the factory line, while the outdoor rating provides a measure of true performance under real-world conditions. The challenge with an outdoor rating is that the spectrum, temperature, wind speed, etc., are constantly in flux, and therefore the resulting power rating varies from day to day and month to month. This work examines different methodologies for determining the outdoor power rating with the goal of minimizing variation even if data are collected under changing meteorological conditions.

  6. A POLLUTION REDUCTION METHODOLOGY FOR CHEMICAL PROCESS SIMULATORS

    EPA Science Inventory

    A pollution minimization methodology was developed for chemical process design using computer simulation. It is based on a pollution balance that at steady state is used to define a pollution index with units of mass of pollution per mass of products. The pollution balance has be...
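
    On the definition above, the pollution index follows directly from a steady-state mass balance: pollutant mass out per unit mass of product. A one-line sketch with hypothetical stream masses:

```python
def pollution_index(pollutant_masses_kg, product_masses_kg):
    """Pollution index: mass of pollutants leaving the process per
    unit mass of products, from a steady-state pollution balance."""
    return sum(pollutant_masses_kg) / sum(product_masses_kg)

# Hypothetical stream data from a simulated process.
print(pollution_index([120.0, 30.0], [5000.0]))  # -> 0.03 kg pollutant / kg product
```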

  7. Perturbation resilience and superiorization methodology of averaged mappings

    NASA Astrophysics Data System (ADS)

    He, Hongjin; Xu, Hong-Kun

    2017-04-01

    We first prove the bounded perturbation resilience for the successive fixed point algorithm of averaged mappings, which extends the string-averaging projection and block-iterative projection methods. We then apply the superiorization methodology to a constrained convex minimization problem where the constraint set is the intersection of fixed point sets of a finite family of averaged mappings.
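
    For readers unfamiliar with the terms, the standard definitions behind this abstract can be written as follows; this is a sketch using common conventions from the superiorization literature, not necessarily the authors' exact notation.

```latex
% T is \alpha-averaged when it interpolates the identity with a
% nonexpansive map N; bounded perturbations v_k enter the iteration
% with summable step sizes \beta_k.
\begin{align}
  T &= (1-\alpha) I + \alpha N, \qquad \alpha \in (0,1), \\
  x_{k+1} &= T\bigl(x_k + \beta_k v_k\bigr), \qquad
  \sum_k \beta_k < \infty, \quad \|v_k\| \le M .
\end{align}
```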

  8. Minimally Invasive Mitral Valve Surgery II

    PubMed Central

    Wolfe, J. Alan; Malaisrie, S. Chris; Farivar, R. Saeid; Khan, Junaid H.; Hargrove, W. Clark; Moront, Michael G.; Ryan, William H.; Ailawadi, Gorav; Agnihotri, Arvind K.; Hummel, Brian W.; Fayers, Trevor M.; Grossi, Eugene A.; Guy, T. Sloane; Lehr, Eric J.; Mehall, John R.; Murphy, Douglas A.; Rodriguez, Evelio; Salemi, Arash; Segurola, Romualdo J.; Shemin, Richard J.; Smith, J. Michael; Smith, Robert L.; Weldner, Paul W.; Lewis, Clifton T. P.; Barnhart, Glenn R.; Goldman, Scott M.

    2016-01-01

    Abstract Techniques for minimally invasive mitral valve repair and replacement continue to evolve. This expert opinion, the second of a 3-part series, outlines current best practices for nonrobotic, minimally invasive mitral valve procedures, and for postoperative care after minimally invasive mitral valve surgery. PMID:27654406

  9. Mini-Med School Planning Guide

    ERIC Educational Resources Information Center

    National Institutes of Health, Office of Science Education, 2008

    2008-01-01

    Mini-Med Schools are public education programs now offered by more than 70 medical schools, universities, research institutions, and hospitals across the nation. There are even Mini-Med Schools in Ireland, Malta, and Canada! The program is typically a lecture series that meets once a week and provides "mini-med students" information on some of the…

  10. Differentially Private Empirical Risk Minimization.

    PubMed

    Chaudhuri, Kamalika; Monteleoni, Claire; Sarwate, Anand D

    2011-03-01

    Privacy-preserving machine learning algorithms are crucial for the increasingly common setting in which personal data, such as medical or financial records, are analyzed. We provide general techniques to produce privacy-preserving approximations of classifiers learned via (regularized) empirical risk minimization (ERM). These algorithms are private under the ε-differential privacy definition due to Dwork et al. (2006). First we apply the output perturbation ideas of Dwork et al. (2006) to ERM classification. Then we propose a new method, objective perturbation, for privacy-preserving machine learning algorithm design. This method entails perturbing the objective function before optimizing over classifiers. If the loss and regularizer satisfy certain convexity and differentiability criteria, we prove theoretical results showing that our algorithms preserve privacy, and provide generalization bounds for linear and nonlinear kernels. We further present a privacy-preserving technique for tuning the parameters in general machine learning algorithms, thereby providing end-to-end privacy guarantees for the training process. We apply these results to produce privacy-preserving analogues of regularized logistic regression and support vector machines. We obtain encouraging results from evaluating their performance on real demographic and benchmark data sets. Our results show that both theoretically and empirically, objective perturbation is superior to the previous state-of-the-art, output perturbation, in managing the inherent tradeoff between privacy and learning performance.
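
    Output perturbation, the first of the two methods described, releases the regularized ERM minimizer plus noise calibrated to its sensitivity. A minimal numpy sketch assuming the paper's setting (L2 regularization, 1-Lipschitz loss, feature norms at most 1); the weights and parameter values are hypothetical.

```python
import numpy as np

def private_weights(w, n, lam, eps, rng):
    """Output perturbation: add noise with density proportional to
    exp(-||b|| / beta), beta = 2 / (n * lam * eps), to the non-private
    L2-regularized ERM solution w (sensitivity 2 / (n * lam))."""
    d = w.shape[0]
    beta = 2.0 / (n * lam * eps)
    direction = rng.normal(size=d)
    direction /= np.linalg.norm(direction)   # uniform random direction
    magnitude = rng.gamma(shape=d, scale=beta)  # ||b|| ~ Gamma(d, beta)
    return w + magnitude * direction

rng = np.random.default_rng(0)
w_nonprivate = np.array([0.8, -1.2, 0.3])   # hypothetical trained weights
w_private = private_weights(w_nonprivate, n=10_000, lam=0.01, eps=0.5, rng=rng)
print(w_private)
```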

  11. Differentially Private Empirical Risk Minimization

    PubMed Central

    Chaudhuri, Kamalika; Monteleoni, Claire; Sarwate, Anand D.

    2011-01-01

    Privacy-preserving machine learning algorithms are crucial for the increasingly common setting in which personal data, such as medical or financial records, are analyzed. We provide general techniques to produce privacy-preserving approximations of classifiers learned via (regularized) empirical risk minimization (ERM). These algorithms are private under the ε-differential privacy definition due to Dwork et al. (2006). First we apply the output perturbation ideas of Dwork et al. (2006) to ERM classification. Then we propose a new method, objective perturbation, for privacy-preserving machine learning algorithm design. This method entails perturbing the objective function before optimizing over classifiers. If the loss and regularizer satisfy certain convexity and differentiability criteria, we prove theoretical results showing that our algorithms preserve privacy, and provide generalization bounds for linear and nonlinear kernels. We further present a privacy-preserving technique for tuning the parameters in general machine learning algorithms, thereby providing end-to-end privacy guarantees for the training process. We apply these results to produce privacy-preserving analogues of regularized logistic regression and support vector machines. We obtain encouraging results from evaluating their performance on real demographic and benchmark data sets. Our results show that both theoretically and empirically, objective perturbation is superior to the previous state-of-the-art, output perturbation, in managing the inherent tradeoff between privacy and learning performance. PMID:21892342

  12. Minimal hepatic encephalopathy: A review.

    PubMed

    Nardone, Raffaele; Taylor, Alexandra C; Höller, Yvonne; Brigo, Francesco; Lochner, Piergiorgio; Trinka, Eugen

    2016-10-01

    Minimal hepatic encephalopathy (MHE) is the earliest form of hepatic encephalopathy and can affect up to 80% of patients with liver cirrhosis. By definition, MHE is characterized by cognitive impairment in the domains of attention, vigilance and integrative function, but obvious clinical manifestations are lacking. MHE has been shown to affect daily functioning, quality of life, driving and overall mortality. The diagnosis can be achieved through neuropsychological testing, recently developed computerized psychometric tests such as the critical flicker frequency and the inhibitory control tests, as well as neurophysiological procedures. Event-related potentials can reveal subtle changes in patients with normal neuropsychological performance. Spectral analysis of electroencephalography (EEG) and quantitative analysis of sleep EEG provide early markers of cerebral dysfunction in cirrhotic patients with MHE. Neuroimaging, in particular MRI, also increasingly reveals diffuse abnormalities in intrinsic brain activity and altered organization of functional connectivity networks. Medical treatment for MHE to date has focused on reducing serum ammonia levels and includes non-absorbable disaccharides, probiotics or rifaximin. Liver transplantation may not reverse the cognitive deficits associated with MHE. Here we present an updated review on the epidemiology, burden and quality of life, neuropsychological testing, neuroimaging, neurophysiology and therapy in subjects with MHE.

  13. Against Explanatory Minimalism in Psychiatry

    PubMed Central

    Thornton, Tim

    2015-01-01

    The idea that psychiatry contains, in principle, a series of levels of explanation has been criticized not only as empirically false but also, by Campbell, as unintelligible because it presupposes a discredited pre-Humean view of causation. Campbell’s criticism is based on an interventionist-inspired denial that mechanisms and rational connections underpin physical and mental causation, respectively, and hence underpin levels of explanation. These claims echo some superficially similar remarks in Wittgenstein’s Zettel. But attention to the context of Wittgenstein’s remarks suggests a reason to reject explanatory minimalism in psychiatry and reinstate a Wittgensteinian notion of levels of explanation. Only in a context broader than the one provided by interventionism is the ascription of propositional attitudes, even in the puzzling case of delusions, justified. Such a view, informed by Wittgenstein, can reconcile the idea that the ascription of mental phenomena presupposes a particular level of explanation with the rejection of an a priori claim about its connection to a neurological level of explanation. PMID:26696908

  14. Against Explanatory Minimalism in Psychiatry.

    PubMed

    Thornton, Tim

    2015-01-01

    The idea that psychiatry contains, in principle, a series of levels of explanation has been criticized not only as empirically false but also, by Campbell, as unintelligible because it presupposes a discredited pre-Humean view of causation. Campbell's criticism is based on an interventionist-inspired denial that mechanisms and rational connections underpin physical and mental causation, respectively, and hence underpin levels of explanation. These claims echo some superficially similar remarks in Wittgenstein's Zettel. But attention to the context of Wittgenstein's remarks suggests a reason to reject explanatory minimalism in psychiatry and reinstate a Wittgensteinian notion of levels of explanation. Only in a context broader than the one provided by interventionism is the ascription of propositional attitudes, even in the puzzling case of delusions, justified. Such a view, informed by Wittgenstein, can reconcile the idea that the ascription of mental phenomena presupposes a particular level of explanation with the rejection of an a priori claim about its connection to a neurological level of explanation.

  15. Soft Systems Methodology

    NASA Astrophysics Data System (ADS)

    Checkland, Peter; Poulter, John

    Soft systems methodology (SSM) is an approach for tackling problematical, messy situations of all kinds. It is an action-oriented process of inquiry into problematic situations in which users learn their way from finding out about the situation to taking action to improve it. The learning emerges via an organised process in which the situation is explored using a set of models of purposeful action (each built to encapsulate a single worldview) as intellectual devices, or tools, to inform and structure discussion about a situation and how it might be improved. This paper, written by the original developer Peter Checkland and practitioner John Poulter, gives a clear and concise account of the approach that covers SSM's specific techniques, the learning cycle process of the methodology and the craft skills which practitioners develop. This concise but theoretically robust account nevertheless includes the fundamental concepts, techniques, and core tenets, described through a wide range of settings.

  16. Acoustic methodology review

    NASA Technical Reports Server (NTRS)

    Schlegel, R. G.

    1982-01-01

    It is important for industry and NASA to assess the status of acoustic design technology for predicting and controlling helicopter external noise in order for a meaningful research program to be formulated which will address this problem. The prediction methodologies available to the designer and the acoustic engineer are three-fold. First is what has been described as a first principle analysis. This analysis approach attempts to remove any empiricism from the analysis process and deals with a theoretical mechanism approach to predicting the noise. The second approach attempts to combine first principle methodology (when available) with empirical data to formulate source predictors which can be combined to predict vehicle levels. The third is an empirical analysis, which attempts to generalize measured trends into a vehicle noise prediction method. This paper will briefly address each.

  17. Tobacco documents research methodology

    PubMed Central

    McCandless, Phyra M; Klausner, Kim; Taketa, Rachel; Yerger, Valerie B

    2011-01-01

    Tobacco documents research has developed into a thriving academic enterprise since its inception in 1995. The technology supporting tobacco documents archiving, searching and retrieval has improved greatly since that time, and consequently tobacco documents researchers have considerably more access to resources than was the case when researchers had to travel to physical archives and/or electronically search poorly and incompletely indexed documents. The authors of the papers presented in this supplement all followed the same basic research methodology. Rather than leave the reader of the supplement to read the same discussion of methods in each individual paper, presented here is an overview of the methods all authors followed. In the individual articles that follow in this supplement, the authors present the additional methodological information specific to their topics. This brief discussion also highlights technological capabilities in the Legacy Tobacco Documents Library and updates methods for organising internal tobacco documents data and findings. PMID:21504933

  18. Autonomous spacecraft design methodology

    SciTech Connect

    Divita, E.L.; Turner, P.R.

    1984-08-01

    A methodology for autonomous spacecraft design blends autonomy requirements with traditional mission requirements and assesses the impact of autonomy upon the total system resources available to support fault tolerance and automation. A baseline functional design can be examined for autonomy implementation impacts, and the costs, risks, and benefits of various options can be assessed. The result of the process is a baseline design that includes autonomous control functions.

  19. Methodology for research II

    PubMed Central

    Bhaskar, S Bala; Manjuladevi, M

    2016-01-01

    Research is a systematic process which uses scientific methods to generate new knowledge that can be used to solve a query or improve on an existing system. Any research on human subjects is associated with varying degrees of risk to the participating individuals, and it is important to safeguard the welfare and rights of the participants. This review focuses on the various steps involved in methodology (in continuation with the previous section) before the data are submitted for publication. PMID:27729691

  20. NAVOSH Priority Methodology.

    DTIC Science & Technology

    1982-03-01

    The extent to which the results of previous prioritization investigations might benefit this research was not known. In 1978, SRI developed a method for the U.S. Environmental Protection Agency (EPA) to use in rapid ranking of environmental pollutants. The following are representative of the state of development of relevant prioritization methodology techniques: (a) cost-benefit fault tree analysis; (b) cost-benefit type methods; (c) ...

  1. Expert Systems Development Methodology

    DTIC Science & Technology

    1989-07-28

    Presented in two volumes: Volume 1 is the Development Methodology and Volume 2 is an Evaluation Methodology containing methods for evaluation, validation and... Rules in such a system are written in an English-like language which almost anyone can understand; thus programming in rule-based systems can become "programming for... ...computers, and others have little understanding about how computers work. The knowledge engineer must therefore be willing and able to teach the expert.

  2. Darwin's Methodological Evolution.

    PubMed

    Lennox, James G

    2005-01-01

    A necessary condition for having a revolution named after you is that you are an innovator in your field. I argue that if Charles Darwin meets this condition, it is as a philosopher and methodologist. In 1991, I made the case for Darwin's innovative use of "thought experiment" in the Origin. Here I place this innovative practice in the context of Darwin's methodological commitments, trace its origins back into Darwin's notebooks, and pursue Darwin's suggestion that it owes its inspiration to Charles Lyell.

  3. Solubility curves and nucleation rates from molecular dynamics for polymorph prediction - moving beyond lattice energy minimization.

    PubMed

    Parks, Conor; Koswara, Andy; DeVilbiss, Frank; Tung, Hsien-Hsin; Nere, Nandkishor K; Bordawekar, Shailendra; Nagy, Zoltan K; Ramkrishna, Doraiswami

    2017-02-15

    Current polymorph prediction methods, known as lattice energy minimization, seek to determine the crystal lattice with the lowest potential energy, rendering them unable to predict solvent-dependent metastable form crystallization. Facilitated by embarrassingly parallel, multiple-replica, large-scale molecular dynamics simulations, we report on a new method concerned with predicting crystal structures using the kinetics and solubility of the low energy polymorphs predicted by lattice energy minimization. The proposed molecular dynamics simulation methodology provides several new predictions to the field of crystallization. (1) The methodology is shown to correctly predict the kinetic preference for β-glycine nucleation in water relative to α- and γ-glycine. (2) Analysis of nanocrystal melting temperatures shows γ-nanocrystals have melting temperatures up to 20 K lower than either α- or β-glycine. This provides a striking explanation of how an energetically unstable classical nucleation theory (CNT) transition state complex leads to kinetic inaccessibility of γ-glycine in water, despite being the thermodynamically preferred polymorph predicted by lattice energy minimization. (3) The methodology also predicts polymorph-specific solubility curves, where the α-glycine solubility curve is reproduced to within 19% error, over a 45 K temperature range, using nothing but atomistic-level information provided from nucleation simulations. (4) Finally, the methodology produces the correct solubility ranking of β- > α-glycine. In this work, we demonstrate how the methodology supplements lattice energy minimization with molecular dynamics nucleation simulations to give the correct polymorph prediction, at different length scales, when lattice energy minimization alone would incorrectly predict the formation of γ-glycine in water from the ranking of lattice energies. Thus, lattice energy minimization optimization algorithms are supplemented with the necessary solvent

  4. Minimal breast cancer: a clinical appraisal.

    PubMed Central

    Peters, T G; Donegan, W L; Burg, E A

    1977-01-01

    Eighty-five patients with a diagnosis of minimal breast cancer were evaluated. The predominant lesion was intraductal carcinoma, and axillary metastases occurred in association with minimal breast cancer in seven of 96 cases. One death occurred due to minimal breast cancer. Bilateral mammary carcinoma was evident in 24% and bilateral minimal breast cancer in 13% of the patients. The component lesions of minimal breast cancer have varied biologic activity, but prognosis is good with a variety of operations. The multifocal nature of minimal breast cancer and the potential for metastases should be recognized. Therapy should include removal of the entire mammary parenchyma and low axillary nodes. The high incidence of bilateral malignancy supports elective contralateral biopsy at the time of therapy for minimal breast cancer. PMID:203233

  5. Minimal Models of Multidimensional Computations

    PubMed Central

    Fitzgerald, Jeffrey D.; Sincich, Lawrence C.; Sharpee, Tatyana O.

    2011-01-01

    The multidimensional computations performed by many biological systems are often characterized with limited information about the correlations between inputs and outputs. Given this limitation, our approach is to construct the maximum noise entropy response function of the system, leading to a closed-form and minimally biased model consistent with a given set of constraints on the input/output moments; the result is equivalent to conditional random field models from machine learning. For systems with binary outputs, such as neurons encoding sensory stimuli, the maximum noise entropy models are logistic functions whose arguments depend on the constraints. A constraint on the average output turns the binary maximum noise entropy models into minimum mutual information models, allowing for the calculation of the information content of the constraints and an information theoretic characterization of the system's computations. We use this approach to analyze the nonlinear input/output functions in macaque retina and thalamus; although these systems have been previously shown to be responsive to two input dimensions, the functional form of the response function in this reduced space had not been unambiguously identified. A second order model based on the logistic function is found to be both necessary and sufficient to accurately describe the neural responses to naturalistic stimuli, accounting for an average of 93% of the mutual information with a small number of parameters. Thus, despite the fact that the stimulus is highly non-Gaussian, the vast majority of the information in the neural responses is related to first and second order correlations. Our results suggest a principled and unbiased way to model multidimensional computations and determine the statistics of the inputs that are being encoded in the outputs. PMID:21455284
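
    The second-order model described above is a logistic function of a quadratic form in the stimulus. A minimal sketch with hypothetical parameters follows; the sign convention and parameter values are illustrative assumptions, not the fitted retinal or thalamic models.

```python
import numpy as np

def mne_second_order(stim, a, h, J):
    """Second-order maximum-noise-entropy response: P(spike | s) is a
    logistic function of a quadratic form in the stimulus s."""
    quad = stim @ J @ stim
    return 1.0 / (1.0 + np.exp(a + h @ stim + quad))

# Hypothetical parameters for a 2-D stimulus subspace.
a = 0.5
h = np.array([1.0, -0.5])
J = np.array([[0.2, 0.0], [0.0, -0.3]])
s = np.array([0.8, 1.1])
print(mne_second_order(s, a, h, J))  # spike probability for this stimulus
```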

  6. Missile Misdistance Reduction: An Instructive Methodology for Developing Terminal Guidance Control Systems to Minimize Missile Misdistance.

    DTIC Science & Technology

    1982-10-01

    The abstract excerpt derives the constant-bearing course relation (Equations IV.C-9 and IV.C-10, with separate forms for navigation ratio n ≠ 1 and n = 1) and re-expresses Equation VI.B-2 in state form (Equation VI.B-2a).

  7. WARRP Decon-13: Subject Matter Expert (SME) Meeting Waste Screening and Waste Minimization Methodologies Project

    DTIC Science & Technology

    2012-08-01

    ...cesium-contaminated debris. He explained that Denver had previously had to remove radium tailings from approximately five miles of Denver... Grandview, Idaho; Clive, Utah; and Deer Trail, Colorado for disposal. The required cleanup level was less than 5 picocuries per gram (pCi/g) of Radium-226.

  8. Regional Expansion of Minimally Invasive Surgery for Hysterectomy: Implementation and Methodology in a Large Multispecialty Group

    PubMed Central

    Andryjowicz, Esteban; Wray, Teresa

    2011-01-01

    Introduction: Approximately 600,000 hysterectomies are performed in the US each year, making hysterectomy the second most common major operation performed in women. Several methods can be used to perform this procedure. In 2009, a Cochrane Review concluded “that vaginal hysterectomy should be performed in preference to abdominal hysterectomy, where possible. Where vaginal hysterectomy is not possible, a laparoscopic approach may avoid the need for an abdominal hysterectomy. Risks and benefits of different approaches may however be influenced by the surgeon's experience. More research is needed, particularly to examine the long-term effects of the different types of surgery.” This article reviews the steps that a large multispecialty group used to teach non-open hysterectomy methods to improve the quality of care for their patients and to decrease the number of inpatient procedures and therefore costs. The percentages of each type of hysterectomy performed yearly between 2005 and 2010 were calculated, as well as the length of stay (LOS) for each method. Methods: A structured educational intervention with both didactic and hands-on exercises was created and rolled out to 12 medical centers. All patients undergoing hysterectomy for benign conditions through the Southern California Permanente Medical Group (a large multispecialty group that provides medical care to Kaiser Permanente patients in Southern California) between 2005 and 2010 were included. This amounted to 26,055 hysterectomies for benign conditions being performed by more than 350 obstetrician/gynecologists (Ob/Gyns). Results: More than 300 Ob/Gyns took the course across 12 medical centers. On the basis of hospital discharge data, the total number of hysterectomies, types of hysterectomies, and LOS for each type were identified for each year. Between 2005 and 2010, the rate of non-open hysterectomies has increased 120% (from 38% to 78%) and the average LOS has decreased 31%. PMID:22319415

  9. A simple efficient methodology for Dirac equation in minimal length quantum mechanics

    NASA Astrophysics Data System (ADS)

    Hassanabadi, H.; Zarrinkamar, S.; Rajabi, A. A.

    2013-01-01

    We solve the modified Dirac equation with an added harmonic oscillator potential by implementing the Nikiforov-Uvarov technique. The closed-form solutions are reported in a quite simple and systematic manner.
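
    For context, minimal-length quantum mechanics is usually built on a generalized commutation relation; a common form is shown below. This is illustrative rather than the paper's exact setup, and conventions vary by author.

```latex
% Generalized uncertainty algebra commonly assumed in minimal-length
% quantum mechanics, with the implied smallest resolvable length scale.
\begin{align}
  [\hat{x}, \hat{p}] &= i\hbar \left( 1 + \beta \hat{p}^{2} \right), \\
  (\Delta x)_{\min} &= \hbar\sqrt{\beta}.
\end{align}
```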

  10. Architectural Methodology Report

    NASA Technical Reports Server (NTRS)

    Dhas, Chris

    2000-01-01

    The establishment of conventions between two communicating entities in the end systems is essential for communications. Examples of the kinds of decisions that need to be made in establishing a protocol convention include the nature of the data representation, the format and the speed of the data representation over the communications path, and the sequence of control messages (if any) which are sent. One of the main functions of a protocol is to establish a standard path between the communicating entities. This is necessary to create a virtual communications medium with certain desirable characteristics. In essence, it is the function of the protocol to transform the characteristics of the physical communications environment into a more useful virtual communications model. The final function of a protocol is to establish standard data elements for communications over the path; that is, the protocol serves to create a virtual data element for exchange. Other systems may be constructed in which the transferred element is a program or a job. Finally, there are special purpose applications in which the element to be transferred may be a complex structure such as all or part of a graphic display. NASA's Glenn Research Center (GRC) defines and develops advanced technology for high priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to describe the methodologies used in developing a protocol architecture for an in-space Internet node. The node would support NASA's four mission areas: Earth Science; Space Science; Human Exploration and Development of Space (HEDS); and Aerospace Technology. This report presents the methodology for developing the protocol architecture. The methodology addresses the architecture for a computer communications environment. It does not address an analog voice architecture.

  11. Injector element characterization methodology

    NASA Technical Reports Server (NTRS)

    Cox, George B., Jr.

    1988-01-01

    Characterization of liquid rocket engine injector elements is an important part of the development process for rocket engine combustion devices. Modern nonintrusive instrumentation for flow velocity and spray droplet size measurement, and automated, computer-controlled test facilities allow rapid, low-cost evaluation of injector element performance and behavior. Application of these methods in rocket engine development, paralleling their use in gas turbine engine development, will reduce rocket engine development cost and risk. The Alternate Turbopump (ATP) Hot Gas Systems (HGS) preburner injector elements were characterized using such methods, and the methodology and some of the results obtained will be shown.

  12. Supply chain assessment methodology.

    PubMed

    Topor, E

    2000-08-01

    This article describes an assessment methodology based on the supply chain proficiency model that can be used to set realistic supply chain objectives. The assessment centers on a business model that identifies the logical stages of supply chain proficiency as measured against a comprehensive set of business characteristics. For each characteristic, an enterprise evolves from one stage to the next. The magnitude of change inherent in moving forward usually prohibits skipping stages. Although it is possible to be at different stages for each characteristic, it is usually desirable to maintain balance.

  13. Neuropathography: origins and methodology.

    PubMed

    Bradford, David T

    2006-10-01

    Neuropathography is a genre of case study which balances the clinical neuroscientific perspective with the descriptive acuity and existential interests of phenomenological psychopathology. Its subjects are persons of exceptional talent whose contributions are widely recognized, and also those whose seemingly ordinary lives include personally profound experiences of discernible cultural significance. In all instances, the chief focus is on the shaping influence of brain dysfunction in the subject's life and work. Six methodological guidelines are outlined, their topics ranging from the subjects, source material, aesthetic standards, and multidisciplinary character of neuropathography to normative standards and concepts of neuropsychological causation.

  14. Minimal Cells-Real and Imagined.

    PubMed

    Glass, John I; Merryman, Chuck; Wise, Kim S; Hutchison, Clyde A; Smith, Hamilton O

    2017-03-27

    A minimal cell is one whose genome only encodes the minimal set of genes necessary for the cell to survive. Scientific reductionism postulates the best way to learn the first principles of cellular biology would be to use a minimal cell in which the functions of all genes and components are understood. The genes in a minimal cell are, by definition, essential. In 2016, synthesis of a genome comprised of only the set of essential and quasi-essential genes encoded by the bacterium Mycoplasma mycoides created a near-minimal bacterial cell. This organism performs the cellular functions common to all organisms. It replicates DNA, transcribes RNA, translates proteins, undergoes cell division, and little else. In this review, we examine this organism and contrast it with other bacteria that have been used as surrogates for a minimal cell.

  15. On eco-efficient technologies to minimize industrial water consumption

    NASA Astrophysics Data System (ADS)

    Amiri, Mohammad C.; Mohammadifard, Hossein; Ghaffari, Ghasem

    2016-07-01

    Purpose - Water scarcity will place further stress on available water systems and decrease the security of water in many areas. Therefore, innovative methods to minimize industrial water usage and waste production are of paramount importance in the process of extending fresh water resources, which happen to be the main life support systems in many arid regions of the world. This paper demonstrates that there are good opportunities for many industries to save water and decrease waste water in the softening process by substituting eco-friendly methods for traditional ones. The patented puffing method is an eco-efficient and viable technology for water saving and waste reduction in the lime softening process. Design/methodology/approach - The lime softening process (LSP) is very sensitive to chemical reactions. In addition, optimal monitoring not only minimizes the sludge that must be disposed of but also reduces the operating costs of water conditioning. The weakness of the current (regular) control of LSP based on chemical analysis has been demonstrated experimentally and compared with the eco-efficient puffing method. Findings - This paper demonstrates that there is a good opportunity for many industries to save water and decrease waste water in the softening process by substituting the traditional method with the puffing method, a patented eco-efficient technology. Originality/value - Details of the required innovative works to minimize industrial water usage and waste production are outlined in this paper. Employing the novel puffing method for monitoring of the lime softening process saves a considerable amount of water while reducing chemical sludge.

  16. Methodology for Teachers. Volunteer's Manual.

    ERIC Educational Resources Information Center

    Holt, Daniel D.; And Others

    The Volunteer's Manual of "Methodology for Teachers" was written to (1) provide Peace Corps/Korea TESOL volunteers with a simple, complete guide to methodology for teaching English in Korea; and (2) provide these volunteers with a simple, complete guide for teaching this methodology to Korean English teachers in inservice training programs. For…

  17. Relative Hazard Calculation Methodology

    SciTech Connect

    DL Strenge; MK White; RD Stenner; WB Andrews

    1999-09-07

    The methodology presented in this document was developed to provide a means of calculating relative hazard (RH) ratios to use in developing useful graphic illustrations. The RH equation, as presented in this methodology, is primarily a collection of key factors relevant to understanding the hazards and risks associated with projected risk management activities. The RH equation has the potential for much broader application than generating risk profiles. For example, it can be used to compare one risk management activity with another, instead of just comparing it to a fixed baseline as was done for the risk profiles. If the appropriate source term data are available, it could be used in its non-ratio form to estimate absolute values of the associated hazards. These estimated values of hazard could then be examined to help understand which risk management activities are addressing the higher hazard conditions at a site. Graphics could be generated from these absolute hazard values to compare high-hazard conditions. If the RH equation is used in this manner, care must be taken to specifically define and qualify the estimated absolute hazard values (e.g., identify which factors were considered and which ones tended to drive the hazard estimation).

  18. The methodology of neuroproteomics.

    PubMed

    Ottens, Andrew K

    2009-01-01

    The human central nervous system (CNS) is the most complex organ in nature, composed of ten trillion cells forming complex neural networks using a quadrillion synaptic connections. Proteins, their modifications, and their interactions are integral to CNS function. The emerging field of neuroproteomics provides us with a wide-scope view of posttranslation protein dynamics within the CNS to better our understanding of its function, and more often, its dysfunction consequent to neurodegenerative disorders. This chapter reviews methodology employed in the neurosciences to study the neuroproteome in health and disease. The chapter layout parallels this volume's four parts. Part I focuses on modeling human neuropathology in animals as surrogate, accessible, and controllable platforms in our research. Part II discusses methodology used to focus analysis onto a subneuroproteome. Part III reviews analytical and bioinformatic technologies applied in neuroproteomics. Part IV discusses clinical neuroproteomics, from processing of human biofluids to translation in biomarkers research. Neuroproteomics continues to mature as a discipline, confronting the extreme complexity of the CNS proteome and its dynamics, and providing insight into the molecular mechanisms underlying how our nervous system works and how it is compromised by injury and disease.

  19. Waste Minimization Study on Pyrochemical Reprocessing Processes

    SciTech Connect

    Boussier, H.; Conocar, O.; Lacquement, J.

    2006-07-01

    Ideally a new pyro-process should not generate more waste, and should be at least as safe and cost-effective as the hydrometallurgical processes currently implemented at industrial scale. This paper describes the thought process, the methodology and some results obtained by process integration studies to devise potential pyro-processes and to assess their capability of achieving this challenging objective. As an example, the assessment of a process based on salt/metal reductive extraction, designed for the reprocessing of Generation IV carbide spent fuels, is developed. Salt/metal reductive extraction uses the capability of some metals, aluminum in this case, to selectively reduce actinide fluorides previously dissolved in a fluoride salt bath. The reduced actinides enter the metal phase, from which they are subsequently recovered; the fission products remain in the salt phase. In fact, the process is not so simple, as it requires upstream and downstream subsidiary steps. All these process steps generate secondary waste flows representing sources of actinide leakage and/or FP discharge. In aqueous processes the main solvent (nitric acid solution) has a low boiling point and evaporates easily or can be removed by distillation, leaving behind only a limited flow containing the dissolved substances to be incorporated in a confinement matrix. From the point of view of waste generation, one main handicap of molten salt processes is that the saline phase (fluoride in our case) used as the solvent is of the same nature as the solutes (radionuclide fluorides) and has a quite high boiling point. It is therefore not as easy as it is with aqueous solutions to separate solvent and solutes in order to confine only radioactive material and limit the final waste flows. Starting from the initial block diagram devised two years ago, the paper shows how process integration studies were able to propose process fittings which led to a reduction of the waste variety and flows, leading to an 'ideal

  20. Contemporary review of minimally invasive pancreaticoduodenectomy

    PubMed Central

    Dai, Rui; Turley, Ryan S; Blazer, Dan G

    2016-01-01

    AIM To assess the current literature describing various minimally invasive techniques for pancreaticoduodenectomy (PD) and to review short-term outcomes after minimally invasive PD. METHODS PD remains the only potentially curative treatment for periampullary malignancies, including, most commonly, pancreatic adenocarcinoma. Minimally invasive approaches to this complex operation have begun to be increasingly reported in the literature and are purported by some to reduce the historically high morbidity of PD associated with the open technique. In this systematic review, we have searched the literature for high-quality publications describing minimally invasive techniques for PD, including laparoscopic, robotic, and laparoscopic-assisted robotic approaches (hybrid approach). We have identified publications with the largest operative experiences from well-known centers of excellence for this complex procedure. We report primarily short-term operative and perioperative results and some short-term oncologic endpoints. RESULTS Minimally invasive techniques include laparoscopic, robotic and hybrid approaches, and each of these techniques has strong advocates. Consistently, across all minimally invasive modalities, these techniques are associated with less intraoperative blood loss than traditional open PD (OPD), but in exchange for longer operating times. These techniques are relatively equivalent in terms of perioperative morbidity and short-term oncologic outcomes. Importantly, the pancreatic fistula rate appears to be comparable in most minimally invasive series compared to the open technique. The impact of minimally invasive technique on length of stay is mixed compared to some traditional open series. A few series have suggested that initiation of and time to adjuvant therapy may be improved with minimally invasive techniques; however, this assertion remains controversial. In terms of short-term costs, minimally invasive PD is significantly more expensive than OPD. CONCLUSION Minimally

  1. Situating methodology within qualitative research.

    PubMed

    Kramer-Kile, Marnie L

    2012-01-01

    Qualitative nurse researchers are required to make deliberate and sometimes complex methodological decisions about their work. Methodology in qualitative research is a comprehensive approach in which theory (ideas) and method (doing) are brought into close alignment. It can be difficult, at times, to understand the concept of methodology. The purpose of this research column is to: (1) define qualitative methodology; (2) illuminate the relationship between epistemology, ontology and methodology; (3) explicate the connection between theory and method in qualitative research design; and (4) highlight relevant examples of methodological decisions made within cardiovascular nursing research. Although there is no "one set way" to do qualitative research, all qualitative researchers should account for the choices they make throughout the research process and articulate their methodological decision-making along the way.

  2. Minimally Invasive Mitral Valve Surgery I

    PubMed Central

    Ailawadi, Gorav; Agnihotri, Arvind K.; Mehall, John R.; Wolfe, J. Alan; Hummel, Brian W.; Fayers, Trevor M.; Farivar, R. Saeid; Grossi, Eugene A.; Guy, T. Sloane; Hargrove, W. Clark; Khan, Junaid H.; Lehr, Eric J.; Malaisrie, S. Chris; Murphy, Douglas A.; Rodriguez, Evelio; Ryan, William H.; Salemi, Arash; Segurola, Romualdo J.; Shemin, Richard J.; Smith, J. Michael; Smith, Robert L.; Weldner, Paul W.; Goldman, Scott M.; Lewis, Clifton T. P.; Barnhart, Glenn R.

    2016-01-01

    Abstract Widespread adoption of minimally invasive mitral valve repair and replacement may be fostered by practice consensus and standardization. This expert opinion, first of a 3-part series, outlines current best practices in patient evaluation and selection for minimally invasive mitral valve procedures, and discusses preoperative planning for cannulation and myocardial protection. PMID:27654407

  3. Minimizing electrode contamination in an electrochemical cell

    DOEpatents

    Kim, Yu Seung; Zelenay, Piotr; Johnston, Christina

    2014-12-09

    An electrochemical cell assembly that is expected to prevent or at least minimize electrode contamination includes one or more getters that trap a component or components leached from a first electrode and prevent, or at least minimize, contamination of a second electrode.

  4. Locus minimization in breed prediction using artificial neural network approach.

    PubMed

    Iquebal, M A; Ansari, M S; Sarika; Dixit, S P; Verma, N K; Aggarwal, R A K; Jayakumar, S; Rai, A; Kumar, D

    2014-12-01

    Molecular markers, viz. microsatellites and single nucleotide polymorphisms, have revolutionized breed identification through the use of small samples of biological tissue or germplasm, such as blood, carcass samples, embryos, ova and semen, that show no evident phenotype. Classical tools of molecular data analysis for breed identification have limitations, such as the unavailability of referral breed data, causing increased collection costs each time, compromised computational accuracy and complexity of the methodology used. We report here the successful use of an artificial neural network (ANN) running in the background to decrease the cost of genotyping by locus minimization. The webserver is freely accessible (http://nabg.iasri.res.in/bisgoat) to the research community. We demonstrate that the machine learning (ANN) approach for breed identification is capable of multifold advantages, such as locus minimization, leading to a drastic reduction in cost, and web availability of reference breed data, alleviating the need for repeated genotyping each time one investigates the identity of an unknown breed. To develop this ANN-based web implementation, we used 51,850 samples of allelic data from microsatellite-marker-based DNA fingerprinting on 25 loci covering 22 registered goat breeds of India for training. Minimizing the panel to nine loci with a multilayer perceptron model, we achieved 96.63% training accuracy. This server can be an indispensable tool for identification of existing breeds and new synthetic commercial breeds, leading to protection of intellectual property in cases of sovereignty and bio-piracy disputes. This server can be widely used as a model for cost reduction by locus minimization for various other flora and fauna in terms of variety, breed and/or line identification, especially in conservation and improvement programs.
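
    As an illustration of the locus-minimization idea above, the hedged sketch below trains a multilayer perceptron on synthetic microsatellite genotypes and ranks loci by permutation importance to shortlist a reduced panel. The data, network settings, and nine-locus cutoff are all assumptions for demonstration; this is not the authors' trained model or their 51,850-sample dataset.

        # Hedged sketch: synthetic genotypes and hypothetical network settings.
        import numpy as np
        from sklearn.inspection import permutation_importance
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(0)
        n_breeds, n_loci = 22, 25                  # mirrors the paper's setup
        X = rng.integers(0, 10, size=(2200, n_loci)).astype(float)  # allele codes
        y = rng.integers(0, n_breeds, size=2200)                    # breed labels

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = MLPClassifier(hidden_layer_sizes=(50,), max_iter=500, random_state=0)
        clf.fit(X_tr, y_tr)

        # Rank loci by how much shuffling each one degrades accuracy, then keep
        # the nine strongest as a candidate minimal panel.
        imp = permutation_importance(clf, X_te, y_te, n_repeats=5, random_state=0)
        top9 = np.argsort(imp.importances_mean)[::-1][:9]
        print("candidate minimal locus panel:", sorted(top9.tolist()))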

  5. Minimizing Expected Maximum Risk from Cyber-Attacks with Probabilistic Attack Success

    SciTech Connect

    Bhuiyan, Tanveer H.; Nandi, Apurba; Medal, Hugh; Halappanavar, Mahantesh

    2016-07-16

    The goal of our work is to enhance network security by generating partial cut-sets, i.e., subsets of edges whose removal eliminates paths from initially vulnerable nodes (initial security conditions) to goal nodes (critical assets) on an attack graph, given a cost for cutting each edge and a limited overall budget.
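
    The edge-cutting idea is easy to prototype. The sketch below uses a hypothetical attack graph with assumed node names and cut costs, and finds the cheapest edge set separating vulnerable nodes from critical assets through a standard max-flow/min-cut reduction; the budget constraint and the authors' actual optimization model are not reproduced here.

        # Hedged sketch: toy attack graph; edge capacities encode cutting costs.
        import networkx as nx

        G = nx.DiGraph()
        edges = [  # (from, to, cost of cutting this edge)
            ("web", "app", 3), ("web", "db", 8),
            ("app", "db", 2), ("app", "admin", 4), ("db", "admin", 5),
        ]
        for u, v, cost in edges:
            G.add_edge(u, v, capacity=cost)

        vulnerable, assets = ["web"], ["admin"]

        # A super-source/sink reduces cutting between *sets* of nodes to one s-t cut.
        for v in vulnerable:
            G.add_edge("S", v, capacity=float("inf"))
        for a in assets:
            G.add_edge(a, "T", capacity=float("inf"))

        cut_value, (reach, non_reach) = nx.minimum_cut(G, "S", "T")
        cut_set = [(u, v) for u, v in G.edges if u in reach and v in non_reach]
        print(f"cut cost {cut_value}: {cut_set}")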

  6. Scientific methodology applied.

    PubMed

    Lussier, A

    1975-04-01

    The subject of this symposium is naproxen, a new drug that resulted from an investigation to find a superior anti-inflammatory agent. It was synthesized by Harrison et al. in 1970 at the Syntex Institute of Organic Chemistry and Biological Sciences. How can we chart the evolution of this or any other drug? Three steps are necessary: first, chemical studies (synthesis, analysis); second, animal pharmacology; third, human pharmacology. The last step can additionally be divided into four phases: metabolism and toxicology of the drug in normal volunteers; dose titration and initial clinical trials with sick subjects (pharmacometry); confirmatory clinical trials when the drug is accepted on the market; and re-evaluation (familiarization trials). To discover the truth about naproxen, we must all participate actively with a critical mind, following the principles of scientific methodology. We shall find that the papers to be presented today all deal with the third step in the evaluation process: clinical pharmacology. It is quite evident that the final and most decisive test must be aimed at the most valuable target: the human being. The end product of this day's work for each of us should be the formation of an opinion based on solid scientific proofs. And let us hope that we will all enjoy fulfilling the symposium in its entire etymological meaning this evening. In vino veritas.

  7. Engineering radioecology: Methodological considerations

    SciTech Connect

    Nechaev, A.F.; Projaev, V.V.; Sobolev, I.A.; Dmitriev, S.A.

    1995-12-31

    The term "radioecology" has been widely recognized in scientific and technical societies. At the same time, this scientific school (radioecology) does not have a precise, generally acknowledged structure, unified methodical basis, fixed subjects of investigation, etc. In other words, radioecology is a vast, important but rather amorphous conglomerate of various ideas, amalgamated mostly by their involvement in biospheric effects of ionizing radiation and some conceptual stereotypes. This paradox was acceptable up to a certain time. However, with the termination of the Cold War and because of remarkable political changes in the world, it has become possible to move the problem of environmental restoration from the scientific sphere into particularly practical terms. Already the first steps clearly showed an imperfection of existing technologies, managerial and regulatory schemes; a lack of qualified specialists, relevant methods and techniques; uncertainties in the methodology of decision-making, etc. Thus, building up (or maybe structuring) a special scientific and technological basis, which the authors call "engineering radioecology", seems to be an important task. In this paper they endeavored to substantiate the last thesis and to suggest some preliminary ideas concerning the subject matter of engineering radioecology.

  8. Cancer cytogenetics: methodology revisited.

    PubMed

    Wan, Thomas S K

    2014-11-01

    The Philadelphia chromosome was the first genetic abnormality discovered in cancer (in 1960), and it was found to be consistently associated with CML. The description of the Philadelphia chromosome ushered in a new era in the field of cancer cytogenetics. Accumulating genetic data have been shown to be intimately associated with the diagnosis and prognosis of neoplasms; thus, karyotyping is now considered a mandatory investigation for all newly diagnosed leukemias. The development of FISH in the 1980s overcame many of the drawbacks of assessing the genetic alterations in cancer cells by karyotyping. Karyotyping of cancer cells remains the gold standard since it provides a global analysis of the abnormalities in the entire genome of a single cell. However, subsequent methodological advances in molecular cytogenetics based on the principle of FISH that were initiated in the early 1990s have greatly enhanced the efficiency and accuracy of karyotype analysis by marrying conventional cytogenetics with molecular technologies. In this review, the development, current utilization, and technical pitfalls of both the conventional and molecular cytogenetics approaches used for cancer diagnosis over the past five decades will be discussed.

  9. Minimal representations, geometric quantization, and unitarity.

    PubMed Central

    Brylinski, R; Kostant, B

    1994-01-01

    In the framework of geometric quantization we explicitly construct, in a uniform fashion, a unitary minimal representation π_0 of every simply-connected real Lie group G_0 such that the maximal compact subgroup of G_0 has finite center and G_0 admits some minimal representation. We obtain algebraic and analytic results about π_0. We give several results on the algebraic and symplectic geometry of the minimal nilpotent orbits and then "quantize" these results to obtain the corresponding representations. We assume (Lie G_0)_C is simple. PMID:11607478

  10. Minimal covariant observables identifying all pure states

    NASA Astrophysics Data System (ADS)

    Carmeli, Claudio; Heinosaari, Teiko; Toigo, Alessandro

    2013-09-01

    It has been recently shown by Heinosaari, Mazzarella and Wolf (2013) [1] that an observable that identifies all pure states of a d-dimensional quantum system has minimally 4d-4 outcomes or slightly fewer (the exact number depending on d). However, no simple construction of this type of minimal observable is known. We investigate covariant observables that identify all pure states and have the minimal number of outcomes. It is shown that the existence of this kind of observable depends on the dimension of the Hilbert space.

  11. Technology applications for radioactive waste minimization

    SciTech Connect

    Devgun, J.S.

    1994-07-01

    The nuclear power industry has achieved one of the most successful examples of waste minimization. The annual volume of low-level radioactive waste shipped for disposal per reactor has decreased to approximately one-fifth of the volume shipped about a decade ago. In addition, the curie content of the total waste shipped for disposal has decreased. This paper will discuss the regulatory drivers and economic factors for waste minimization and describe the application of technologies for achieving waste minimization for low-level radioactive waste, with examples from the nuclear power industry.

  12. Prioritization Methodology for Chemical Replacement

    NASA Technical Reports Server (NTRS)

    Cruit, W.; Schutzenhofer, S.; Goldberg, B.; Everhart, K.

    1993-01-01

    This project serves to define an appropriate methodology for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weigh the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results are being implemented as a guideline for consideration for current NASA propulsion systems.

  13. Status of sonic boom methodology and understanding

    NASA Technical Reports Server (NTRS)

    Darden, Christine M.; Powell, Clemans A.; Hayes, Wallace D.; George, Albert R.; Pierce, Allan D.

    1989-01-01

    In January 1988, approximately 60 representatives of industry, academia, government, and the military gathered at NASA-Langley for a 2 day workshop on the state-of-the-art of sonic boom physics, methodology, and understanding. The purpose of the workshop was to assess the sonic boom area, to determine areas where additional sonic boom research is needed, and to establish some strategies and priorities in this sonic boom research. Attendees included many internationally recognized sonic boom experts who had been very active in the Supersonic Transport (SST) and Supersonic Cruise Aircraft Research Programs of the 60's and 70's. Summaries of the assessed state-of-the-art and the research needs in theory, minimization, atmospheric effects during propagation, and human response are given.

  14. Development methodology for scientific software

    SciTech Connect

    Cort, G.; Goldstone, J.A.; Nelson, R.O.; Poore, R.V.; Miller, L.; Barrus, D.M.

    1985-01-01

    We present the details of a software development methodology that addresses all phases of the software life cycle, yet is well suited for application by small projects with limited resources. The methodology has been developed at the Los Alamos Weapons Neutron Research (WNR) Facility and was utilized during the recent development of the WNR Data Acquisition Command Language. The methodology emphasizes the development and maintenance of comprehensive documentation for all software components. The impact of the methodology upon software quality and programmer productivity is assessed.

  15. Nursing research methodology: transcending Cartesianism.

    PubMed

    Walters, A J

    1996-06-01

    Nurses involved in research are concerned with methodological issues. This paper explores the Cartesian debate that has polarized the discourse on nursing research methodology. It is argued that methodologies exclusively based on objectivism, one pole of the Cartesian debate, or subjectivism, the other, do not provide nurses with adequate research foundations to understand the complexity of the lifeworld of nursing practice. This paper provides nurse researchers with an alternative methodological perspective, Gadamerian hermeneutics, which is in harmony with the clinical world of nursing practice.

  16. Dosimetric methodology of the ICRP

    SciTech Connect

    Eckerman, K.F.

    1994-12-31

    Establishment of guidance for the protection of workers and members of the public from radiation exposures necessitates estimation of the radiation dose to tissues of the body at risk. The dosimetric methodology formulated by the International Commission on Radiological Protection (ICRP) is intended to be responsive to this need. While developed for radiation protection, elements of the methodology are often applied in addressing other radiation issues; e.g., risk assessment. This chapter provides an overview of the methodology, discusses its recent extension to age-dependent considerations, and illustrates specific aspects of the methodology through a number of numerical examples.

  17. Minimization of power consumption during charging of superconducting accelerating cavities

    NASA Astrophysics Data System (ADS)

    Bhattacharyya, Anirban Krishna; Ziemann, Volker; Ruber, Roger; Goryashko, Vitaliy

    2015-11-01

    The radio frequency cavities used to accelerate charged particle beams need to be charged to their nominal voltage, after which the beam can be injected into them. The standard procedure for such cavity filling is to use a step charging profile. However, during the initial stages of such a filling process, a substantial amount of the total energy is wasted in reflection for superconducting cavities because of their extremely narrow bandwidth. The paper presents a novel strategy for charging cavities which reduces total energy reflection. We use variational calculus to obtain an analytical expression for the optimal charging profile. Reflected and required energies and generator peak power are compared between the charging schemes, and practical aspects (saturation, efficiency and gain characteristics) of power sources (tetrodes, IOTs and solid-state power amplifiers) are also considered and analysed. The paper presents a methodology to successfully identify the optimal charging scheme for different power sources to minimize the total energy requirement.
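
    The energy argument admits a simple first-order illustration. The sketch below is not the paper's cavity model or its variational optimum; it integrates the classical RC-charging equation, assumed here as a stand-in for a narrowband resonator, to show that a step drive dissipates far more energy than a slow ramp reaching the same final voltage.

        # Hedged sketch: RC analogy for cavity filling; all values are arbitrary.
        import numpy as np

        def charge(profile, t, R=1.0, C=1.0):
            """Euler-integrate dVc/dt = (Vs - Vc)/(RC); return (stored, dissipated)."""
            Vc, dt, dissipated = 0.0, t[1] - t[0], 0.0
            for Vs in profile:
                i = (Vs - Vc) / R             # charging current
                dissipated += i * i * R * dt  # energy lost in the resistor
                Vc += i / C * dt
            return 0.5 * C * Vc ** 2, dissipated

        t = np.linspace(0, 10, 10000)
        step = np.ones_like(t)                # step profile: Vs = 1 from t = 0
        ramp = np.minimum(t / 8, 1.0)         # slow ramp to the same voltage
        for name, p in [("step", step), ("ramp", ramp)]:
            stored, lost = charge(p, t)
            print(f"{name}: stored={stored:.3f}  dissipated={lost:.3f}")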

  18. Minimizing Variation in Outdoor CPV Power Ratings (Presentation)

    SciTech Connect

    Muller, M.

    2011-04-01

    Presented at the 7th International Conference on Concentrating Photovoltaic Systems (CPV-7), 4-6 April 2011, Las Vegas, Nevada. The CPV community has agreed to have both indoor and outdoor power ratings at the module level. The indoor rating provides a repeatable measure of module performance as it leaves the factory line, while the outdoor rating provides a measure of true performance under real-world conditions. The challenge with an outdoor rating is that the spectrum, temperature, wind speed, etc., are constantly in flux, and therefore the resulting power rating varies from day to day and month to month. This work examines different methodologies for determining the outdoor power rating with the goal of minimizing variation even if data are collected under changing meteorological conditions.

  19. Kaupapa Maori Methodology: Trusting the Methodology through Thick and Thin

    ERIC Educational Resources Information Center

    Hiha, Anne Aroha

    2016-01-01

    Kaupapa Maori is thoroughly theorised in academia in Aotearoa and those wishing to use it as their research methodology can find support through the writing of a number of Maori academics. What is not so well articulated, is the experiential voice of those who have used Kaupapa Maori as research methodology. My identity as a Maori woman…

  20. Degreasing of titanium to minimize stress corrosion

    NASA Technical Reports Server (NTRS)

    Carpenter, S. R.

    1967-01-01

    Stress corrosion of titanium and its alloys at elevated temperatures is minimized by replacing trichloroethylene with methanol or methyl ethyl ketone as a degreasing agent. Wearing cotton gloves reduces stress corrosion from perspiration before the metal components are processed.

  1. Academic Achievement and Minimal Brain Dysfunction

    ERIC Educational Resources Information Center

    Edwards, R. Philip; And Others

    1971-01-01

    The investigation provided no evidence that a diagnosis of minimal brain dysfunction based on a pediatric neurological evaluation and/or visual-motor impairment as measured by the Bender-Gestalt, is a useful predictor of academic achievement. (Author)

  2. Controlling molecular transport in minimal emulsions

    PubMed Central

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of ‘minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions. PMID:26797564

  3. Genetic algorithms for minimal source reconstructions

    SciTech Connect

    Lewis, P.S.; Mosher, J.C.

    1993-12-01

    Under-determined linear inverse problems arise in applications in which signals must be estimated from insufficient data. In these problems the number of potentially active sources is greater than the number of observations. In many situations, it is desirable to find a minimal source solution. This can be accomplished by minimizing a cost function that accounts both for the compatibility of the solution with the observations and for its "sparseness". Minimizing functions of this form can be a difficult optimization problem. Genetic algorithms are a relatively new and robust approach to the solution of difficult optimization problems, providing a global framework that is not dependent on local continuity or on explicit starting values. In this paper, the authors describe the use of genetic algorithms to find minimal source solutions, using as an example a simulation inspired by the reconstruction of neural currents in the human brain from magnetoencephalographic (MEG) measurements.
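
    A toy version of this formulation is sketched below: a genetic algorithm evolves binary source-support masks, and each mask's fitness combines the data misfit with a sparseness penalty, in the spirit of the cost function described above. The forward matrix, penalty weight, and GA settings are all hypothetical.

        # Hedged sketch: GA over binary source supports; toy data, not MEG.
        import numpy as np

        rng = np.random.default_rng(1)
        m, n = 10, 30                        # 10 observations, 30 candidate sources
        A = rng.normal(size=(m, n))
        x_true = np.zeros(n); x_true[[3, 17]] = 1.0
        b = A @ x_true

        def fitness(mask):
            if not mask.any():
                return np.inf
            sub = A[:, mask]
            coef, *_ = np.linalg.lstsq(sub, b, rcond=None)
            misfit = np.linalg.norm(sub @ coef - b) ** 2
            return misfit + 0.5 * mask.sum()     # compatibility + sparseness

        pop = rng.random((40, n)) < 0.2          # initial random masks
        for _ in range(200):
            scores = np.array([fitness(ind) for ind in pop])
            parents = pop[np.argsort(scores)[:20]]            # truncation selection
            cross = rng.integers(0, 2, size=(40, n)).astype(bool)
            kids = np.where(cross, parents[rng.integers(0, 20, 40)],
                            parents[rng.integers(0, 20, 40)])  # uniform crossover
            pop = kids ^ (rng.random((40, n)) < 0.01)          # bit-flip mutation

        best = pop[np.argmin([fitness(ind) for ind in pop])]
        print("active sources:", np.flatnonzero(best))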

  4. Heart bypass surgery - minimally invasive - discharge

    MedlinePlus

    ... thrombosis, 9th ed: American College of Chest Physicians evidence-based clinical practice guidelines. Chest. 2012;141(2 ...

  5. Waste minimization and pollution prevention awareness plan

    SciTech Connect

    Not Available

    1991-05-31

    The purpose of this plan is to document the Lawrence Livermore National Laboratory (LLNL) Waste Minimization and Pollution Prevention Awareness Program. The plan specifies those activities and methods that are or will be employed to reduce the quantity and toxicity of wastes generated at the site. The intent of this plan is to respond to and comply with Department of Energy (DOE) policy and guidelines concerning the need for pollution prevention. The Plan is composed of a LLNL Waste Minimization and Pollution Prevention Awareness Program Plan and, as attachments, Program- and Department-specific waste minimization plans. This format reflects the fact that waste minimization is considered a line management responsibility and is to be addressed by each of the Programs and Departments. 14 refs.

  6. Mixed waste minimization in a research environment

    SciTech Connect

    Kirner, N.

    1994-12-31

    This presentation describes minimization efforts and processes for mixed waste generated by research facilities. Waste stream assessment and treatment, and database management for various research-related waste streams is detailed.

  7. Controlling molecular transport in minimal emulsions

    NASA Astrophysics Data System (ADS)

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of ‘minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions.

  8. Current research in sonic-boom minimization

    NASA Technical Reports Server (NTRS)

    Darden, C. M.; Mack, R. J.

    1976-01-01

    A review is given of several questions as yet unanswered in the area of sonic-boom research. Efforts, both here at Langley and elsewhere, in the area of minimization, human response, design techniques and in developing higher order propagation methods are discussed. In addition, a wind-tunnel test program being conducted to assess the validity of minimization methods based on a forward spike in the F-function is described.

  9. Mesonic spectroscopy of minimal walking technicolor

    SciTech Connect

    Del Debbio, Luigi; Lucini, Biagio; Patella, Agostino; Pica, Claudio; Rago, Antonio

    2010-07-01

    We investigate the structure and the novel emerging features of the mesonic nonsinglet spectrum of the minimal walking technicolor theory. Precision measurements in the nonsinglet pseudoscalar and vector channels are compared to the expectations for an IR-conformal field theory and a QCD-like theory. Our results favor a scenario in which minimal walking technicolor is (almost) conformal in the infrared, while spontaneous chiral symmetry breaking seems less plausible.

  10. Minimally invasive treatment of infected pancreatic necrosis

    PubMed Central

    Cebulski, Włodzimierz; Słodkowski, Maciej; Krasnodębski, Ireneusz W.

    2014-01-01

    Infected pancreatic necrosis is a challenging complication that worsens prognosis in acute pancreatitis. For years, open necrosectomy has been the mainstay treatment option in infected pancreatic necrosis, although surgical debridement still results in high morbidity and mortality rates. Recently, many reports on minimally invasive treatment in infected pancreatic necrosis have been published. This paper presents a review of minimally invasive techniques and attempts to define their role in the management of infected pancreatic necrosis. PMID:25653725

  11. A modified secant method for unconstrained minimization

    NASA Technical Reports Server (NTRS)

    Polak, E.

    1972-01-01

    A gradient-secant algorithm for unconstrained optimization problems is presented. The algorithm uses Armijo gradient method iterations until it reaches a region where the Newton method is more efficient, and then switches over to a secant form of operation. It is concluded that an efficient method for unconstrained minimization has been developed, and that any convergent minimization method can be substituted for the Armijo gradient method.
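
    As a hedged one-dimensional illustration of such a hybrid scheme (not Polak's exact algorithm), the sketch below takes Armijo-damped gradient steps while far from the minimizer and switches to a secant iteration on f'(x) = 0 once the steps become small.

        # Hedged sketch: Armijo gradient phase, then a secant phase (1-D only).
        def minimize_hybrid(f, df, x0, switch_tol=1e-2, tol=1e-10, max_iter=200):
            x = x0
            for _ in range(max_iter):
                g = df(x)
                step = 1.0
                # Armijo backtracking: shrink until sufficient decrease holds.
                while f(x - step * g) > f(x) - 1e-4 * step * g * g:
                    step *= 0.5
                x_new = x - step * g
                if abs(x_new - x) < switch_tol:    # near the minimum: switch over
                    x_prev, x = x, x_new
                    while abs(df(x)) > tol:        # secant update on the gradient
                        x, x_prev = x - df(x) * (x - x_prev) / (df(x) - df(x_prev)), x
                    return x
                x = x_new
            return x

        # Example: minimize f(x) = (x - 2)^4 + x^2 from a distant start.
        f = lambda x: (x - 2) ** 4 + x ** 2
        df = lambda x: 4 * (x - 2) ** 3 + 2 * x
        print(minimize_hybrid(f, df, x0=10.0))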

  12. Minimally Invasive Osteotomies of the Calcaneus.

    PubMed

    Guyton, Gregory P

    2016-09-01

    Osteotomies of the calcaneus are powerful surgical tools, representing a critical component of the surgical reconstruction of pes planus and pes cavus deformity. Modern minimally invasive calcaneal osteotomies can be performed safely with a burr through a lateral incision. Although greater kerf is generated with the burr, the effect is modest, can be minimized, and is compatible with many fixation techniques. A hinged jig renders the procedure more reproducible and accessible.

  13. Minimally Invasive Forefoot Surgery in France.

    PubMed

    Meusnier, Tristan; Mukish, Prikesht

    2016-06-01

    Study groups have been formed in France to advance the use of minimally invasive surgery. These techniques are becoming more frequently used, and their nuances are continuing to evolve. The objective of this article was to raise awareness of current trends in minimally invasive surgery for common diseases of the forefoot. Percutaneous surgery of the forefoot, which is less developed at this time, is also discussed.

  14. Gravitino problem in minimal supergravity inflation

    NASA Astrophysics Data System (ADS)

    Hasegawa, Fuminori; Mukaida, Kyohei; Nakayama, Kazunori; Terada, Takahiro; Yamada, Yusuke

    2017-04-01

    We study non-thermal gravitino production in the minimal supergravity inflation. In this minimal model utilizing orthogonal nilpotent superfields, the particle spectrum includes only graviton, gravitino, inflaton, and goldstino. We find that a substantial fraction of the cosmic energy density can be transferred to the longitudinal gravitino due to non-trivial change of its sound speed. This implies either a breakdown of the effective theory after inflation or a serious gravitino problem.

  15. Minimally invasive osteosynthesis technique for articular fractures.

    PubMed

    Beale, Brian S; Cole, Grayson

    2012-09-01

    Articular fractures require accurate reduction and rigid stabilization to decrease the chance of osteoarthritis and joint dysfunction. Articular fractures have been traditionally repaired by arthrotomy and internal fixation. Recently, minimally invasive techniques have been introduced to treat articular fractures, reducing patient morbidity and improving the accuracy of reduction. A variety of techniques, including distraction, radiographic imaging, and arthroscopy, are used with the minimally invasive osteosynthesis technique of articular fractures to achieve a successful repair and outcome.

  16. Alternating minimization and Boltzmann machine learning.

    PubMed

    Byrne, W

    1992-01-01

    Training a Boltzmann machine with hidden units is appropriately treated in information geometry using the information divergence and the technique of alternating minimization. The resulting algorithm is shown to be closely related to gradient descent Boltzmann machine learning rules, and the close relationship of both to the EM algorithm is described. An iterative proportional fitting procedure for training machines without hidden units is described and incorporated into the alternating minimization algorithm.
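
    A generic numeric illustration of alternating minimization (the optimization pattern named above, not the Boltzmann-machine learning rule itself): each block variable is minimized exactly while the other is held fixed, so the objective can only decrease at every half-step. The objective function here is an arbitrary choice with a closed-form update per block.

        # Hedged sketch: minimize f(x, y) = x^2 + y^2 + (x*y - 1)^2 by alternation.
        def alternating_minimization(steps=200):
            x = y = 2.0
            for _ in range(steps):
                # argmin over x with y fixed: solve 2x + 2y(xy - 1) = 0
                x = y / (1 + y * y)
                # argmin over y with x fixed, by symmetry
                y = x / (1 + x * x)
            return x, y

        print(alternating_minimization())  # drifts toward the minimizer (0, 0)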

  17. Minimalism in Art, Medical Science and Neurosurgery.

    PubMed

    Ökten, Ali İhsan

    2016-12-21

    The word "minimalism" derives from the French word "minimum". Whereas the lexical meaning of minimum is "the least or the smallest quantity necessary for something", its expression in mathematics can be described as "the lowest step a variable number can descend to; least, minimal". Minimalism, which advocates an extreme simplicity of the artistic form, is a current in modern art and music whose origins go back to the 1960s and which features simplicity and objectivity. Although art, science and philosophy are different disciplines, they support each other from time to time; sometimes they intertwine and sometimes they copy each other. A school or teaching arising in one of them can, in a given period, be taken up by the others, so that they proceed on their ways empowering each other. This is also true of minimalism in art and minimally invasive surgical approaches in science. Concepts such as doing with less, avoiding unnecessary materials and reducing the number of elements in order to increase the effect of the expression, which are the main elements of minimalism in art, found their equivalents in medicine and neurosurgery. Their equivalents in medicine or neurosurgery have been to protect the physical integrity of the patient with less iatrogenic injury, minimal damage and the same therapeutic effect in the most effective way, and to enable the patient to regain health in the shortest span of time.

  18. Economic impact of minimally invasive lumbar surgery

    PubMed Central

    Hofstetter, Christoph P; Hofer, Anna S; Wang, Michael Y

    2015-01-01

    Cost effectiveness has been demonstrated for traditional lumbar discectomy, lumbar laminectomy as well as for instrumented and noninstrumented arthrodesis. While emerging evidence suggests that minimally invasive spine surgery reduces morbidity, shortens the duration of hospitalization, and accelerates return to activities of daily living, data regarding the cost effectiveness of these novel techniques are limited. The current study analyzes all available data on minimally invasive techniques for lumbar discectomy, decompression, short-segment fusion and deformity surgery. In general, minimally invasive spine procedures appear to hold promise in quicker patient recovery times and earlier return to work. Thus, minimally invasive lumbar spine surgery appears to have the potential to be a cost-effective intervention. Moreover, novel less invasive procedures are less destabilizing and may therefore be utilized in certain indications that traditionally required arthrodesis procedures. However, there is a lack of studies analyzing the economic impact of minimally invasive spine surgery. Future studies are necessary to confirm the durability and further define indications for minimally invasive lumbar spine procedures. PMID:25793159

  19. Minimizing liability by properly planning UST system upgrades

    SciTech Connect

    Kroon, D.H.; Baach, M.K.

    1995-12-31

    Existing underground storage tank (UST) systems containing regulated substances, including petroleum products, are defined by the Environmental Protection Agency (EPA) as those installed prior to December 22, 1988. Under the federal regulations (40 CFR Parts 280 and 281), these systems must be upgraded to new standards by December 22, 1998 in the areas of spill and overfill prevention, corrosion protection, and leak detection. Properly planned UST system upgrades provide safety and environmental protection plus: compliance with federal regulations; minimum public liability; and reduced insurance premiums. Some modification to this program will be required where state and local regulations are more strict than the federal requirements. Minimizing liability at reduced costs is the key element of the upgrade program. Although the regulatory requirements must be satisfied, the paramount issue is to minimize exposure to public liability. The methodology presented has been demonstrated to economically achieve that very important goal. In a recent case history, a major operator of UST systems adopted this program and was rewarded by his insurance company with over a 50% reduction in premiums for pollution liability insurance. The upgrade program for existing UST systems consists of: general planning; site investigation; specific plan development; implementation; and monitoring and records.

  20. Cluster Stability Estimation Based on a Minimal Spanning Trees Approach

    NASA Astrophysics Data System (ADS)

    Volkovich, Zeev (Vladimir); Barzily, Zeev; Weber, Gerhard-Wilhelm; Toledano-Kitai, Dvora

    2009-08-01

    Among the areas of data and text mining which are employed today in science, economy and technology, clustering theory serves as a preprocessing step in data analysis. However, there are many open questions still waiting for a theoretical and practical treatment; e.g., the problem of determining the true number of clusters has not been satisfactorily solved. In the current paper, this problem is addressed by the cluster stability approach. For several possible numbers of clusters, we estimate the stability of the partitions obtained from clustering of samples. Partitions are considered consistent if their clusters are stable. Cluster validity is measured as the total number of edges, in the clusters' minimal spanning trees, connecting points from different samples; this is the Friedman and Rafsky two-sample test statistic. The homogeneity hypothesis, of well-mingled samples within the clusters, leads to an asymptotic normal distribution of the considered statistic. Resting upon this fact, the standard score of the mentioned edge count is computed, and the partition quality is represented by the worst cluster, corresponding to the minimal standard score value. It is natural to expect that the true number of clusters can be characterized by the empirical distribution having the shortest left tail. The proposed methodology sequentially creates the described value distribution and estimates its left asymmetry. Numerical experiments, presented in the paper, demonstrate the ability of the approach to detect the true number of clusters.
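
    The core measurement is straightforward to reproduce in miniature. The sketch below (arbitrary two-dimensional toy samples) builds the minimal spanning tree over two pooled samples and counts the edges joining points from different samples, i.e., the Friedman and Rafsky statistic used above; well-mingled samples, indicating a stable cluster, yield many such cross edges.

        # Hedged sketch: Friedman-Rafsky cross-edge count on toy data.
        import numpy as np
        from scipy.sparse.csgraph import minimum_spanning_tree
        from scipy.spatial.distance import pdist, squareform

        rng = np.random.default_rng(0)
        sample_a = rng.normal(0, 1, size=(30, 2))   # two samples drawn from
        sample_b = rng.normal(0, 1, size=(30, 2))   # the same cluster
        points = np.vstack([sample_a, sample_b])
        labels = np.array([0] * 30 + [1] * 30)      # source sample of each point

        dist = squareform(pdist(points))
        mst = minimum_spanning_tree(dist).tocoo()   # the (n - 1) MST edges

        cross_edges = np.sum(labels[mst.row] != labels[mst.col])
        print(f"{cross_edges} of {mst.nnz} MST edges connect the two samples")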

  1. Choosing a Methodology: Philosophical Underpinning

    ERIC Educational Resources Information Center

    Jackson, Elizabeth

    2013-01-01

    As a university lecturer, I find that a frequent question raised by Masters students concerns the methodology chosen for research and the rationale required in dissertations. This paper unpicks some of the philosophical coherence that can inform choices to be made regarding methodology and a well-thought out rationale that can add to the rigour of…

  2. Methodological Pluralism and Narrative Inquiry

    ERIC Educational Resources Information Center

    Michie, Michael

    2013-01-01

    This paper considers how the integral theory model of Nancy Davis and Laurie Callihan might be enacted using a different qualitative methodology, in this case the narrative methodology. The focus of narrative research is shown to be on "what meaning is being made" rather than "what is happening here" (quadrant 2 rather than…

  3. Structural design methodology for large space structures

    NASA Astrophysics Data System (ADS)

    Dornsife, Ralph J.

    1992-02-01

    The Department of Defense requires research and development in designing, fabricating, deploying, and maintaining large space structures (LSS) in support of Army and Strategic Defense Initiative military objectives. Because of their large size, extreme flexibility, and the unique loading conditions in the space environment, LSS will present engineers with problems unlike those encountered in designing conventional civil engineering or aerospace structures. LSS will require sophisticated passive damping and active control systems in order to meet stringent mission requirements. These structures must also be optimally designed to minimize high launch costs. This report outlines a methodology for the structural design of LSS. It includes a definition of mission requirements, structural modeling and analysis, passive damping and active control system design, ground-based testing, payload integration, on-orbit system verification, and on-orbit assessment of structural damage. In support of this methodology, analyses of candidate LSS truss configurations are presented, and an algorithm correlating ground-based test behavior to expected microgravity behavior is developed.

  4. Structural design methodology for large space structures

    NASA Astrophysics Data System (ADS)

    Dornsife, Ralph J.

    The Department of Defense requires research and development in designing, fabricating, deploying, and maintaining large space structures (LSS) in support of Army and Strategic Defense Initiative military objectives. Because of their large size, extreme flexibility, and the unique loading conditions in the space environment, LSS will present engineers with problems unlike those encountered in designing conventional civil engineering or aerospace structures. LSS will require sophisticated passive damping and active control systems in order to meet stringent mission requirements. These structures must also be optimally designed to minimize high launch costs. This report outlines a methodology for the structural design of LSS. It includes a definition of mission requirements, structural modeling and analysis, passive damping and active control system design, ground-based testing, payload integration, on-orbit system verification, and on-orbit assessment of structural damage. In support of this methodology, analyses of candidate LSS truss configurations are presented, and an algorithm correlating ground-based test behavior to expected microgravity behavior is developed.

  5. Sequential unconstrained minimization algorithms for constrained optimization

    NASA Astrophysics Data System (ADS)

    Byrne, Charles

    2008-02-01

    The problem of minimizing a function $f: \mathbb{R}^J \to \mathbb{R}$, subject to constraints on the vector variable $x$, occurs frequently in inverse problems. Even without constraints, finding a minimizer of $f(x)$ may require iterative methods. We consider here a general class of iterative algorithms that find a solution to the constrained minimization problem as the limit of a sequence of vectors, each solving an unconstrained minimization problem. Our sequential unconstrained minimization algorithm (SUMMA) is an iterative procedure for constrained minimization. At the $k$th step we minimize the function $G_k(x) = f(x) + g_k(x)$ to obtain $x^k$. The auxiliary functions $g_k: D \subseteq \mathbb{R}^J \to \mathbb{R}_+$ are nonnegative on the set $D$, each $x^k$ is assumed to lie within $D$, and the objective is to minimize the continuous function $f$ over $x$ in the set $C = \overline{D}$, the closure of $D$. We assume that such minimizers exist, and denote one such by $\hat{x}$. We assume that the functions $g_k$ satisfy the inequalities $0 \le g_k(x) \le G_{k-1}(x) - G_{k-1}(x^{k-1})$ for $k = 2, 3, \ldots$. Using this assumption, we show that the sequence $\{f(x^k)\}$ is decreasing and converges to $f(\hat{x})$. If the restriction of $f$ to $D$ has bounded level sets, which happens if $\hat{x}$ is unique and $f$ is closed, proper and convex, then the sequence $\{x^k\}$ is bounded, and $f(x^*) = f(\hat{x})$ for any cluster point $x^*$. Therefore, if $\hat{x}$ is unique, then $x^* = \hat{x}$ and $\{x^k\} \to \hat{x}$. When $\hat{x}$ is not unique, convergence can still be obtained in particular cases. The SUMMA includes, as particular cases, the well-known barrier- and penalty-function methods, the simultaneous multiplicative algebraic reconstruction technique (SMART), the proximal minimization algorithm of Censor and Zenios, the entropic proximal methods of Teboulle, as well as certain cases of gradient descent and the Newton-Raphson method. The proof techniques used for SUMMA can be extended to obtain related results for the induced proximal
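
    A small numeric sketch of the SUMMA pattern, using the log-barrier method that the abstract lists as a special case: minimize $f(x) = x^2$ over the constraint $x \ge 1$ through a sequence of unconstrained problems $G_k(x) = f(x) + g_k(x)$. The one-dimensional problem and the barrier weights are assumptions chosen so that each $G_k$ has a closed-form minimizer.

        # Hedged sketch: log-barrier instance of sequential unconstrained steps.
        import math

        def barrier_minimizer(k):
            # g_k(x) = -(1/k) * log(x - 1); setting G_k'(x) = 2x - 1/(k(x - 1)) = 0
            # gives the closed-form unconstrained minimizer below.
            return (1 + math.sqrt(1 + 2 / k)) / 2

        for k in [1, 10, 100, 1000]:
            print(f"k={k:5d}  x_k={barrier_minimizer(k):.6f}")  # x_k -> 1, the constrained minimum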

  6. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving

    PubMed Central

    Elfring, Jos; Appeldoorn, Rein; van den Dries, Sjoerd; Kwakkernaat, Maurice

    2016-01-01

    The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle’s surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture. PMID:27727171
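
    The architectural point, that sensors should be swappable without touching the fusion core, can be sketched in a few lines. All class names and numbers below are hypothetical, and a real system would perform data association and filtering (e.g., Kalman tracking) where the comment indicates.

        # Hedged sketch: plug-in sensors behind one interface feeding a world model.
        from dataclasses import dataclass
        from typing import Protocol

        @dataclass
        class Detection:
            x: float
            y: float
            confidence: float

        class Sensor(Protocol):
            def detections(self) -> list[Detection]: ...

        class Radar:
            def detections(self) -> list[Detection]:
                return [Detection(12.0, -1.5, 0.9)]

        class Camera:
            def detections(self) -> list[Detection]:
                return [Detection(11.8, -1.4, 0.7)]

        class WorldModel:
            def __init__(self, sensors: list[Sensor]):
                self.sensors = sensors   # adding a lidar touches only this list

            def update(self) -> list[Detection]:
                fused = [d for s in self.sensors for d in s.detections()]
                # a real system would associate and filter here (e.g., Kalman)
                return fused

        print(WorldModel([Radar(), Camera()]).update())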

  7. Intrasulcal electrocorticography in macaque monkeys with minimally invasive neurosurgical protocols.

    PubMed

    Matsuo, Takeshi; Kawasaki, Keisuke; Osada, Takahiro; Sawahata, Hirohito; Suzuki, Takafumi; Shibata, Masahiro; Miyakawa, Naohisa; Nakahara, Kiyoshi; Iijima, Atsuhiko; Sato, Noboru; Kawai, Kensuke; Saito, Nobuhito; Hasegawa, Isao

    2011-01-01

    Electrocorticography (ECoG), multichannel brain-surface recording and stimulation with probe electrode arrays, has become a potent methodology not only for clinical neurosurgery but also for basic neuroscience using animal models. The highly evolved primate's brain has deep cerebral sulci, and both gyral and intrasulcal cortical regions have been implicated in important functional processes. However, direct experimental access is typically limited to gyral regions, since placing probes into sulci is difficult without damaging the surrounding tissues. Here we describe a novel methodology for intrasulcal ECoG in macaque monkeys. We designed and fabricated ultra-thin flexible probes for macaques with micro-electro-mechanical systems technology. We developed minimally invasive operative protocols to implant the probes by introducing cutting-edge devices for human neurosurgery. To evaluate the feasibility of intrasulcal ECoG, we conducted electrophysiological recording and stimulation experiments. First, we inserted parts of the Parylene-C-based probe into the superior temporal sulcus to compare visually evoked ECoG responses from the ventral bank of the sulcus with those from the surface of the inferior temporal cortex. Analyses of power spectral density and signal-to-noise ratio revealed that the quality of the ECoG signal was comparable inside and outside of the sulcus. Histological examination revealed no obvious physical damage in the implanted areas. Second, we placed a modified silicone ECoG probe into the central sulcus and also on the surface of the precentral gyrus for stimulation. Thresholds for muscle twitching were significantly lower during intrasulcal stimulation compared to gyral stimulation. These results demonstrate the feasibility of intrasulcal ECoG in macaques. The novel methodology proposed here opens up a new frontier in neuroscience research, enabling the direct measurement and manipulation of electrical activity in the whole brain.

  8. Blackfolds, plane waves and minimal surfaces

    NASA Astrophysics Data System (ADS)

    Armas, Jay; Blau, Matthias

    2015-07-01

    Minimal surfaces in Euclidean space provide examples of possible non-compact horizon geometries and topologies in asymptotically flat space-time. On the other hand, the existence of limiting surfaces in the space-time provides a simple mechanism for making these configurations compact. Limiting surfaces appear naturally in a given space-time by making minimal surfaces rotate but they are also inherent to plane wave or de Sitter space-times in which case minimal surfaces can be static and compact. We use the blackfold approach in order to scan for possible black hole horizon geometries and topologies in asymptotically flat, plane wave and de Sitter space-times. In the process we uncover several new configurations, such as black helicoids and catenoids, some of which have an asymptotically flat counterpart. In particular, we find that the ultraspinning regime of singly-spinning Myers-Perry black holes, described in terms of the simplest minimal surface (the plane), can be obtained as a limit of a black helicoid, suggesting that these two families of black holes are connected. We also show that minimal surfaces embedded in spheres rather than Euclidean space can be used to construct static compact horizons in asymptotically de Sitter space-times.

  9. Approximate error conjugate gradient minimization methods

    DOEpatents

    Kallman, Jeffrey S

    2013-05-21

    In one embodiment, a method includes selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, calculating an approximate error using the subset of rays, and calculating a minimum in a conjugate gradient direction based on the approximate error. In another embodiment, a system includes a processor for executing logic, logic for selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, logic for calculating an approximate error using the subset of rays, and logic for calculating a minimum in a conjugate gradient direction based on the approximate error. In other embodiments, computer program products, methods, and systems are described capable of using approximate error in constrained conjugate gradient minimization problems.
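
    The following minimal Python sketch illustrates the idea the patent abstract describes: estimate the error of a least-squares reconstruction from a subset of rays only, and step along a conjugate direction. The toy system, subset size, nonnegativity constraint, and Polak-Ribiere update are illustrative assumptions, not the patented method.

      import numpy as np

      rng = np.random.default_rng(0)
      n_rays, n_vox = 2000, 100                 # each row of A is one "ray"
      A = rng.random((n_rays, n_vox))
      x_true = rng.random(n_vox)
      b = A @ x_true

      def subset_error_grad(x, rows):
          # error and gradient of 0.5*||Ax - b||^2 estimated from a subset of rays
          r = A[rows] @ x - b[rows]
          scale = n_rays / len(rows)            # rescale so the subset estimates the full sums
          return 0.5 * scale * (r @ r), scale * (A[rows].T @ r)

      x, d, g_prev = np.zeros(n_vox), None, None
      for _ in range(300):
          rows = rng.choice(n_rays, size=200, replace=False)   # ray subset
          err, g = subset_error_grad(x, rows)
          if d is None:
              d = -g
          else:
              beta = max(0.0, g @ (g - g_prev) / (g_prev @ g_prev))  # Polak-Ribiere (heuristic with subsets)
              d = -g + beta * d
          Ad = A[rows] @ d
          alpha = -(g @ d) / ((n_rays / len(rows)) * (Ad @ Ad) + 1e-12)  # exact step for the subset error
          x = np.clip(x + alpha * d, 0.0, None) # simple nonnegativity constraint
          g_prev = g
      print(np.linalg.norm(x - x_true))         # reconstruction error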

  10. One-dimensional Gromov minimal filling problem

    NASA Astrophysics Data System (ADS)

    Ivanov, Alexandr O.; Tuzhilin, Alexey A.

    2012-05-01

    The paper is devoted to a new branch in the theory of one-dimensional variational problems with branching extremals, the investigation of one-dimensional minimal fillings introduced by the authors. On the one hand, this problem is a one-dimensional version of a generalization of Gromov's minimal fillings problem to the case of stratified manifolds. On the other hand, this problem is interesting in itself and also can be considered as a generalization of another classical problem, the Steiner problem on the construction of a shortest network connecting a given set of terminals. Besides the statement of the problem, we discuss several properties of the minimal fillings and state several conjectures. Bibliography: 38 titles.

  11. Waste Minimization Measurement and Progress Reporting

    SciTech Connect

    Stone, K.A.

    1995-02-13

    Westinghouse Savannah River Company is implementing productivity improvement concepts into the Waste Minimization Program by focusing on the positive initiatives taken to reduce waste generation at the Savannah River Site. Previous performance measures, based only on waste generation rates, proved to be an ineffective metric for measuring performance and promoting continuous improvements within the Program. Impacts of mission changes and non-routine operations impeded development of baseline waste generation rates and often negated waste generation trending reports. A system was developed to quantify, document and track innovative activities that impact waste volume and radioactivity/toxicity reductions. This system, coupled with Management-driven waste disposal avoidance goals, is proving to be a powerful tool to promote waste minimization awareness and the implementation of waste reduction initiatives. Measurement of waste not generated, in addition to waste generated, increases the credibility of the Waste Minimization Program, improves sharing of success stories, and supports development of regulatory and management reports.

  12. Minimal Length Scale Scenarios for Quantum Gravity.

    PubMed

    Hossenfelder, Sabine

    2013-01-01

    We review the question of whether the fundamental laws of nature limit our ability to probe arbitrarily short distances. First, we examine what insights can be gained from thought experiments for probes of shortest distances, and summarize what can be learned from different approaches to a theory of quantum gravity. Then we discuss some models that have been developed to implement a minimal length scale in quantum mechanics and quantum field theory. These models have entered the literature as the generalized uncertainty principle or the modified dispersion relation, and have allowed the study of the effects of a minimal length scale in quantum mechanics, quantum electrodynamics, thermodynamics, black-hole physics and cosmology. Finally, we touch upon the question of ways to circumvent the manifestation of a minimal length scale in short-distance physics.

  13. Genetic Research on Biospecimens Poses Minimal Risk

    PubMed Central

    Wendler, David S.; Rid, Annette

    2014-01-01

    Genetic research on human biospecimens is increasingly common. Yet, debate continues over the level of risk that this research poses to sample donors. Some argue that genetic research on biospecimens poses minimal risk; others argue that it poses greater than minimal risk and therefore needs additional requirements and limitations. This debate raises concern that some donors are not receiving appropriate protection or, conversely, that valuable research is being subject to unnecessary requirements and limitations. The present paper attempts to address this concern using the widely-endorsed ‘risks of daily life’ standard. The three extant versions of this standard all suggest that, with proper measures in place to protect donor confidentiality, most genetic research on human biospecimens poses minimal risk to donors. PMID:25530152

  14. Minimal perceptrons for memorizing complex patterns

    NASA Astrophysics Data System (ADS)

    Pastor, Marissa; Song, Juyong; Hoang, Danh-Tai; Jo, Junghyo

    2016-11-01

    Feedforward neural networks have been investigated to understand learning and memory, as well as applied to numerous practical problems in pattern classification. It is a rule of thumb that more complex tasks require larger networks. However, the design of optimal network architectures for specific tasks is still an unsolved fundamental problem. In this study, we consider three-layered neural networks for memorizing binary patterns. We developed a new complexity measure of binary patterns, and estimated the minimal network size for memorizing them as a function of their complexity. We formulated the minimal network size for regular, random, and complex patterns. In particular, the minimal size for complex patterns, which are neither ordered nor disordered, was predicted by measuring their Hamming distances from known ordered patterns. Our predictions agree with simulations based on the back-propagation algorithm.

  15. PRIME: Phase Retrieval via Majorization-Minimization

    NASA Astrophysics Data System (ADS)

    Qiu, Tianyu; Babu, Prabhu; Palomar, Daniel P.

    2016-10-01

    This paper considers the phase retrieval problem in which measurements consist of only the magnitude of several linear measurements of the unknown, e.g., spectral components of a time sequence. We develop low-complexity algorithms with superior performance based on the majorization-minimization (MM) framework. The proposed algorithms are referred to as PRIME: Phase Retrieval vIa the Majorization-minimization techniquE. They are preferred to existing benchmark methods since at each iteration a simple surrogate problem is solved with a closed-form solution that monotonically decreases the original objective function. In total, four algorithms are proposed using different majorization-minimization techniques. Experimental results validate that our algorithms outperform existing methods in terms of successful recovery and mean square error under various settings.
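
    PRIME's surrogates are specific to the phase-retrieval objective, but the general MM recipe the abstract describes (majorize the objective at the current iterate with a surrogate whose minimizer is in closed form, then iterate) can be shown on a toy problem. In the sketch below, an l1 regression problem stands in for the phase-retrieval cost, an assumption made purely for illustration; the standard quadratic majorizer |r| ≤ r^2/(2|r0|) + |r0|/2 makes each iteration a weighted least-squares solve, and the objective decreases monotonically.

      import numpy as np

      rng = np.random.default_rng(1)
      A = rng.normal(size=(200, 5))
      b = A @ rng.normal(size=5) + rng.laplace(scale=0.1, size=200)

      x = np.linalg.lstsq(A, b, rcond=None)[0]      # least-squares initialization
      for it in range(30):
          r = A @ x - b
          w = 1.0 / np.maximum(np.abs(r), 1e-8)     # majorizer weights at the current iterate
          # closed-form minimizer of the quadratic surrogate: weighted least squares
          x = np.linalg.solve(A.T @ (A * w[:, None]), A.T @ (w * b))
          if it % 10 == 0:
              print(np.abs(A @ x - b).sum())        # monotonically decreasing objective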

  16. Minimally invasive neurosurgery for cerebrospinal fluid disorders.

    PubMed

    Guillaume, Daniel J

    2010-10-01

    This article focuses on minimally invasive approaches used to address disorders of cerebrospinal fluid (CSF) circulation. The author covers the primary CSF disorders that are amenable to minimally invasive treatment, including aqueductal stenosis, fourth ventricular outlet obstruction (including Chiari malformation), isolated lateral ventricle, isolated fourth ventricle, multiloculated hydrocephalus, arachnoid cysts, and tumors that block CSF flow. General approaches to evaluating disorders of CSF circulation, including detailed imaging studies, are discussed. Approaches to minimally invasive management of such disorders are described in general, and for each specific entity. For each procedure, indications, surgical technique, and known outcomes are detailed. Specific complications as well as strategies for their avoidance and management are addressed. Lastly, future directions and the need for structured outcome studies are discussed.

  17. The GO-FLOW methodology

    SciTech Connect

    Matsuoka, T.; Kobayashi, M.; Takemura, K.

    1989-03-01

    A reliability analysis using the GO-FLOW methodology is given for the emergency core cooling system (ECCS) of a marine reactor experiencing either a collision or a grounding accident. The analysis is an example of a phased mission problem, and the system is a relatively large system with 90 components. An overview of the GO-FLOW methodology, a description of the ECCS, and the analysis procedure are given. Time-dependent mission unreliabilities under three accident conditions are obtained by one GO-FLOW chart with one computer run. The GO-FLOW methodology has proved to be a useful tool for probabilistic safety assessments of actual systems.

  18. Prioritization methodology for chemical replacement

    NASA Technical Reports Server (NTRS)

    Goldberg, Ben; Cruit, Wendy; Schutzenhofer, Scott

    1995-01-01

    This methodology serves to define a system for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semi quantitative approach derived from quality function deployment techniques (QFD Matrix). QFD is a conceptual map that provides a method of transforming customer wants and needs into quantitative engineering terms. This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives.
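
    For illustration only, such a QFD-style weighting reduces to a weighted sum per candidate; every criterion weight, candidate name, and 1-9 score below is hypothetical, not a value from the study.

      # Illustrative QFD-style prioritization matrix (all numbers hypothetical).
      criteria = {"environmental": 0.30, "cost": 0.20, "safety": 0.25,
                  "reliability": 0.15, "programmatic": 0.10}
      candidates = {
          "aqueous cleaner": {"environmental": 9, "cost": 5, "safety": 8,
                              "reliability": 6, "programmatic": 7},
          "alternative solvent": {"environmental": 4, "cost": 8, "safety": 5,
                                  "reliability": 8, "programmatic": 6},
      }
      scores = {name: sum(criteria[c] * s[c] for c in criteria)
                for name, s in candidates.items()}
      for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
          print(f"{name}: {score:.2f}")   # higher score = higher replacement priority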

  19. Minimally invasive surgical techniques in periodontal regeneration.

    PubMed

    Cortellini, Pierpaolo

    2012-09-01

    A review of the current scientific literature was undertaken to evaluate the efficacy of minimally invasive periodontal regenerative surgery in the treatment of periodontal defects. The impact on clinical outcomes, surgical chair-time, side effects and patient morbidity were evaluated. An electronic search of PUBMED database from January 1987 to December 2011 was undertaken on dental journals using the key-word "minimally invasive surgery". Cohort studies, retrospective studies and randomized controlled clinical trials referring to treatment of periodontal defects with at least 6 months of follow-up were selected. Quality assessment of the selected studies was done through the Strength of Recommendation Taxonomy Grading (SORT) System. Ten studies (1 retrospective, 5 cohorts and 4 RCTs) were included. All the studies consistently support the efficacy of minimally invasive surgery in the treatment of periodontal defects in terms of clinical attachment level gain, probing pocket depth reduction and minimal gingival recession. Six studies reporting on side effects and patient morbidity consistently indicate very low levels of pain and discomfort during and after surgery resulting in a reduced intake of pain-killers and very limited interference with daily activities in the post-operative period. Minimally invasive surgery might be considered a true reality in the field of periodontal regeneration. The observed clinical improvements are consistently associated with very limited morbidity to the patient during the surgical procedure as well as in the post-operative period. Minimally invasive surgery, however, cannot be applied at all cases. A stepwise decisional algorithm should support clinicians in choosing the treatment approach.

  20. Minimally Invasive Treatment of Spine Trauma.

    PubMed

    McGowan, Jason E; Ricks, Christian B; Kanter, Adam S

    2017-01-01

    The role for minimally invasive surgery (MIS) continues to expand in the management of spinal pathology. In the setting of trauma, operative techniques that can minimize morbidity without compromising clinical efficacy have significant value. MIS techniques are associated with decreased intraoperative blood loss, operative time, and morbidity, while providing patients with comparable outcomes when compared with conventional open procedures. MIS interventions further enable earlier mobilization, decreased hospital stay, decreased pain, and an earlier return to baseline function when compared with traditional techniques. This article reviews patient selection and select MIS techniques for those who have suffered traumatic spinal injury.

  1. The Parisi Formula has a Unique Minimizer

    NASA Astrophysics Data System (ADS)

    Auffinger, Antonio; Chen, Wei-Kuo

    2015-05-01

    In 1979, Parisi (Phys Rev Lett 43:1754-1756, 1979) predicted a variational formula for the thermodynamic limit of the free energy in the Sherrington-Kirkpatrick model, and described the role played by its minimizer. This formula was verified in the seminal work of Talagrand (Ann Math 163(1):221-263, 2006) and later generalized to the mixed p-spin models by Panchenko (Ann Probab 42(3):946-958, 2014). In this paper, we prove that the minimizer in Parisi's formula is unique at any temperature and external field by establishing the strict convexity of the Parisi functional.

  2. Minimally invasive treatments for venous compression syndromes

    PubMed Central

    Hulsberg, Paul C.; McLoney, Eric; Partovi, Sasan; Davidson, Jon C.

    2016-01-01

    The management of venous compression syndromes has historically been reliant on surgical treatment when conservative measures fail. There are, however, several settings in which endovascular therapy can play a significant role as an adjunct or even a replacement to more invasive surgical methods. We explore the role of minimally invasive treatment options for three of the most well-studied venous compression syndromes. The clinical aspects and pathophysiology of Paget-Schroetter syndrome (PSS), nutcracker syndrome, and May-Thurner syndrome are discussed in detail, with particular emphasis on the role that interventionalists can play in minimally invasive treatment. PMID:28123978

  3. Instabilities and Solitons in Minimal Strips.

    PubMed

    Machon, Thomas; Alexander, Gareth P; Goldstein, Raymond E; Pesci, Adriana I

    2016-07-01

    We show that highly twisted minimal strips can undergo a nonsingular transition, unlike the singular transitions seen in the Möbius strip and the catenoid. If the strip is nonorientable, this transition is topologically frustrated, and the resulting surface contains a helicoidal defect. Through a controlled analytic approximation, the system can be mapped onto a scalar ϕ^4 theory on a nonorientable line bundle over the circle, where the defect becomes a topologically protected kink soliton or domain wall, thus establishing their existence in minimal surfaces. Demonstrations with soap films confirm these results and show how the position of the defect can be controlled through boundary deformation.

  4. Reversible Rings with Involutions and Some Minimalities

    PubMed Central

    Fakieh, W. M.; Nauman, S. K.

    2013-01-01

    In continuation of the recent developments on extended reversibilities on rings, we initiate here a study on reversible rings with involutions, or, in short, ∗-reversible rings. These rings are symmetric, reversible, reflexive, and semicommutative. In this note we will study some properties and examples of ∗-reversible rings. It is proved here that the polynomial rings of ∗-reversible rings may not be ∗-reversible. A criterion for rings which cannot adhere to any involution is developed and it is observed that a minimal noninvolutary ring is of order 4 and that a minimal noncommutative ∗-reversible ring is of order 16. PMID:24489510

  5. Pattern Search Methods for Linearly Constrained Minimization

    NASA Technical Reports Server (NTRS)

    Lewis, Robert Michael; Torczon, Virginia

    1998-01-01

    We extend pattern search methods to linearly constrained minimization. We develop a general class of feasible point pattern search algorithms and prove global convergence to a Karush-Kuhn-Tucker point. As in the case of unconstrained minimization, pattern search methods for linearly constrained problems accomplish this without explicit recourse to the gradient or the directional derivative. Key to the analysis of the algorithms is the way in which the local search patterns conform to the geometry of the boundary of the feasible region.
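
    A minimal sketch of the feasible-point idea follows: poll a fixed pattern of directions, accept only feasible improving points, and shrink the step when no poll point improves. The paper's central contribution, choosing patterns that conform to the geometry of the constraint boundary, is not reproduced here; the plain compass directions and the toy constraint are assumptions for illustration, and plain compass directions can stall short of the constrained optimum, which is exactly what the conforming patterns are designed to avoid.

      import numpy as np

      def pattern_search(f, x0, is_feasible, step=1.0, tol=1e-6, max_iter=10000):
          # Feasible-point compass search: poll +/- coordinate directions only.
          x = np.asarray(x0, float)
          fx = f(x)
          n = len(x)
          for _ in range(max_iter):
              improved = False
              for d in np.vstack([np.eye(n), -np.eye(n)]):
                  y = x + step * d
                  fy = f(y)
                  if is_feasible(y) and fy < fx:
                      x, fx, improved = y, fy, True
                      break
              if not improved:
                  step *= 0.5              # refine the pattern
                  if step < tol:
                      break
          return x, fx

      # minimize (x-3)^2 + (y+1)^2 subject to x + y <= 1 and x >= 0
      x, fx = pattern_search(lambda z: (z[0] - 3) ** 2 + (z[1] + 1) ** 2,
                             [0.0, 0.0],
                             lambda z: z[0] + z[1] <= 1 and z[0] >= 0)
      print(x, fx)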

  6. Minimally invasive treatments for perforator vein insufficiency

    PubMed Central

    Salazar, Gloria Maria; Prabhakar, Anand M.; Ganguli, Suvranu

    2016-01-01

    Incompetent superficial veins are the most common cause of lower extremity superficial venous reflux and varicose veins; however, incompetent or insufficient perforator veins are the most common, and often unrecognized, cause of recurrent varicose veins after treatment. Perforator vein insufficiency can result in pain, skin changes, and skin ulcers, and often merits intervention. Minimally invasive treatments have replaced traditional surgical treatments for incompetent perforator veins. Current minimally invasive treatment options include ultrasound guided sclerotherapy (USGS) and endovascular thermal ablation (EVTA) with either laser or radiofrequency energy sources. Knowledge of these treatments, and of the advantages and disadvantages of each modality, is required to adequately address perforator venous disease. PMID:28123979

  7. Minimizing radiation damage in nonlinear optical crystals

    DOEpatents

    Cooke, D.W.; Bennett, B.L.; Cockroft, N.J.

    1998-09-08

    Methods are disclosed for minimizing laser induced damage to nonlinear crystals, such as KTP crystals, involving various means for electrically grounding the crystals in order to diffuse electrical discharges within the crystals caused by the incident laser beam. In certain embodiments, electrically conductive material is deposited onto or into surfaces of the nonlinear crystals and the electrically conductive surfaces are connected to an electrical ground. To minimize electrical discharges on crystal surfaces that are not covered by the grounded electrically conductive material, a vacuum may be created around the nonlinear crystal. 5 figs.

  8. Minimizing radiation damage in nonlinear optical crystals

    DOEpatents

    Cooke, D. Wayne; Bennett, Bryan L.; Cockroft, Nigel J.

    1998-01-01

    Methods are disclosed for minimizing laser induced damage to nonlinear crystals, such as KTP crystals, involving various means for electrically grounding the crystals in order to diffuse electrical discharges within the crystals caused by the incident laser beam. In certain embodiments, electrically conductive material is deposited onto or into surfaces of the nonlinear crystals and the electrically conductive surfaces are connected to an electrical ground. To minimize electrical discharges on crystal surfaces that are not covered by the grounded electrically conductive material, a vacuum may be created around the nonlinear crystal.

  9. Instabilities and Solitons in Minimal Strips

    NASA Astrophysics Data System (ADS)

    Machon, Thomas; Alexander, Gareth P.; Goldstein, Raymond E.; Pesci, Adriana I.

    2016-07-01

    We show that highly twisted minimal strips can undergo a nonsingular transition, unlike the singular transitions seen in the Möbius strip and the catenoid. If the strip is nonorientable, this transition is topologically frustrated, and the resulting surface contains a helicoidal defect. Through a controlled analytic approximation, the system can be mapped onto a scalar ϕ^4 theory on a nonorientable line bundle over the circle, where the defect becomes a topologically protected kink soliton or domain wall, thus establishing their existence in minimal surfaces. Demonstrations with soap films confirm these results and show how the position of the defect can be controlled through boundary deformation.

  10. Linearized non-minimal higher curvature supergravity

    NASA Astrophysics Data System (ADS)

    Farakos, Fotis; Kehagias, Alex; Koutrolikos, Konstantinos

    2015-05-01

    In the framework of linearized non-minimal supergravity (20/20), we present the embedding of the R + R^2 model and we analyze its field spectrum. As usual, the auxiliary fields of the Einstein theory now become propagating, giving rise to additional degrees of freedom, which organize themselves into on-shell irreducible supermultiplets. By performing the analysis both in component and superspace formulations we identify the new supermultiplets. On top of the two massive chiral superfields reminiscent of the old-minimal supergravity embedding, the spectrum contains also a consistent physical, massive, vector supermultiplet and a tachyonic ghost, massive, vector supermultiplet.

  11. Minimally invasive plate osteosynthesis: tibia and fibula.

    PubMed

    Beale, Brian S; McCally, Ryan

    2012-09-01

    Fractures of the tibia and fibula are common in dogs and cats and occur most commonly as a result of substantial trauma. Tibial fractures are often amenable to repair using the minimally invasive plate osteosynthesis (MIPO) technique because of the minimal soft tissue covering of the tibia and relative ease of indirect reduction and application of the implant system on the tibia. Treatment of tibial fractures by MIPO has been found to reduce surgical time, reduce the time for fracture healing, and decrease patient morbidity, while at the same time reducing complications compared with traditional open reduction and internal fixation.

  12. DSN data systems software methodology

    NASA Technical Reports Server (NTRS)

    Hung, C. K.

    1982-01-01

    A software methodology for JPL deep space network (DSN) data systems software implementations through transfer and delivery is presented. The DSN Data Systems Software Methodology is compatible with and depends on DSN software methodology and also incorporates the characteristics of real-time program development in a DSN environment. The DSN Data Systems software implementation consists of a series of six distinct phases. An Independent Group is responsible for verification and validation of the DSN Data Systems software during developing phases. The DSN data systems software methodology is applied to all development software provided for or by the DSN data systems section in Mark IV where there is a desire for reliability, maintainability, and usability within budget and schedule constraints.

  13. Environmental probabilistic quantitative assessment methodologies

    USGS Publications Warehouse

    Crovelli, R.A.

    1995-01-01

    In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: 1) direct assessment, 2) accumulation size, 3) volumetric yield, and 4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz. TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are computational speed and flexibility, making it ideal for a microcomputer. -from Author

  14. Methodological Problems of Soviet Pedagogy

    ERIC Educational Resources Information Center

    Noah, Harold J., Ed.; Beach, Beatrice S., Ed.

    1974-01-01

    Selected papers presented at the First Scientific Conference of Pedagogical Scholars of Socialist Countries, Moscow, 1971, deal with methodology in relation to science, human development, sociology, psychology, cybernetics, and the learning process. (KM)

  15. Mach, methodology, hysteresis and economics

    NASA Astrophysics Data System (ADS)

    Cross, R.

    2008-11-01

    This methodological note examines the epistemological foundations of hysteresis with particular reference to applications to economic systems. The economy principles of Ernst Mach are advocated and used in this assessment.

  16. [Guidelines for nursing methodology implantation].

    PubMed

    Alberdi Castell, Rosamaría; Artigas Lelong, Berta; Cuxart Ainaud, Núria; Agüera Ponce, Ana

    2003-09-01

    The authors introduce three guidelines as part of the process to implant the nursing methodology based on the Virginia Henderson Conceptual Model; they propose to help nurses adopt the aforementioned method in their daily practice. These three guidelines shall be published in successive articles: Guidelines to identify attitudes and aptitudes related to the nursing profession; Guidelines to implant the nursing methodology based on the Virginia Henderson Conceptual Model; and Guidelines to plan areas for improvement.

  17. Banach spaces that realize minimal fillings

    NASA Astrophysics Data System (ADS)

    Bednov, B. B.; Borodin, P. A.

    2014-04-01

    It is proved that a real Banach space realizes minimal fillings for all its finite subsets (a shortest network spanning a fixed finite subset always exists and has the minimum possible length) if and only if it is a predual of L_1. The spaces L_1 are characterized in terms of Steiner points (medians). Bibliography: 25 titles.

  18. The Biochemical Basis of Minimal Brain Dysfunction

    ERIC Educational Resources Information Center

    Shaywitz, Sally E.; And Others

    1978-01-01

    Available from: C. V. Mosby Company, 11830 Westline Industrial Drive, St. Louis, Missouri 63141. The research review examines evidence suggesting a biochemical basis for minimal brain dysfunction (MBD), which includes both a relationship between MBD and metabolic abnormalities and a significant genetic influence on the disorder in children. (IM)

  19. Botulinum toxin to minimize facial scarring.

    PubMed

    Jablonka, Eric M; Sherris, David A; Gassner, Holger G

    2012-10-01

    Chemoimmobilization with botulinum toxin A is an ideal biochemical agent that allows near-total elimination of muscle pull on the healing facial wound. The goal of chemoimmobilization of facial cutaneous wounds is to eliminate dynamic tension on the healing tissues to improve wound healing and minimize scarring for optimal aesthetic results.

  20. Banach spaces that realize minimal fillings

    SciTech Connect

    Bednov, B. B.; Borodin, P. A. E-mail: pborodin@inbox.ru

    2014-04-30

    It is proved that a real Banach space realizes minimal fillings for all its finite subsets (a shortest network spanning a fixed finite subset always exists and has the minimum possible length) if and only if it is a predual of L_1. The spaces L_1 are characterized in terms of Steiner points (medians). Bibliography: 25 titles. (paper)

  1. Minimization of Salmonella Contamination on Raw Poultry

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Many reviews have discussed Salmonella in poultry and suggested best practices to minimize this organism on raw poultry meat. Despite years of research and conscientious control efforts by industry and regulatory agencies, human salmonellosis rates have declined only modestly and Salmonella is stil...

  2. DUPONT CHAMBERS WORKS WASTE MINIMIZATION PROJECT

    EPA Science Inventory

    In a joint U.S. Environmental Protection Agency (EPA) and DuPont waste minimization project, fifteen waste streams were selected for assessment. The intent was to develop assessments diverse in terms of process type, mode of operation, waste type, disposal needed, and relative s...

  3. Minimal Guidelines for Authors of Web Pages.

    ERIC Educational Resources Information Center

    ADE Bulletin, 2002

    2002-01-01

    Presents guidelines that recommend the minimal reference information that should be provided on Web pages intended for use by students, teachers, and scholars in the modern languages. Suggests the inclusion of information about responsible parties, copyright declaration, privacy statements, and site information. Makes a note on Web page style. (SG)

  4. MULTIOBJECTIVE PARALLEL GENETIC ALGORITHM FOR WASTE MINIMIZATION

    EPA Science Inventory

    In this research we have developed an efficient multiobjective parallel genetic algorithm (MOPGA) for waste minimization problems. This MOPGA integrates PGAPack (Levine, 1996) and NSGA-II (Deb, 2000) with novel modifications. PGAPack is a master-slave parallel implementation of a...

  5. Challenging the minimal supersymmetric SU(5) model

    SciTech Connect

    Bajc, Borut; Lavignac, Stéphane; Mede, Timon

    2014-06-24

    We review the main constraints on the parameter space of the minimal renormalizable supersymmetric SU(5) grand unified theory. They consist of the Higgs mass, proton decay, electroweak symmetry breaking and fermion masses. Superpartner masses are constrained both from below and from above, giving hope for confirming or definitely ruling out the theory in the future. This contribution is based on Ref. [1].

  6. Mixed waste minimization/mixed waste avoidance

    SciTech Connect

    Todisco, L.R.

    1994-12-31

    This presentation describes methods for the minimization and volume reduction of low-level radioactive and mixed wastes. Many methods are presented including: source reduction, better waste monitoring activities, waste segregation, recycling, administrative controls, and optimization of waste-generating processes.

  7. Minimally Invasive Mitral Valve Surgery III

    PubMed Central

    Lehr, Eric J.; Guy, T. Sloane; Smith, Robert L.; Grossi, Eugene A.; Shemin, Richard J.; Rodriguez, Evelio; Ailawadi, Gorav; Agnihotri, Arvind K.; Fayers, Trevor M.; Hargrove, W. Clark; Hummel, Brian W.; Khan, Junaid H.; Malaisrie, S. Chris; Mehall, John R.; Murphy, Douglas A.; Ryan, William H.; Salemi, Arash; Segurola, Romualdo J.; Smith, J. Michael; Wolfe, J. Alan; Weldner, Paul W.; Barnhart, Glenn R.; Goldman, Scott M.; Lewis, Clifton T. P.

    2016-01-01

    Abstract Minimally invasive mitral valve operations are increasingly common in the United States, but robotic-assisted approaches have not been widely adopted for a variety of reasons. This expert opinion reviews the state of the art and defines best practices, training, and techniques for developing a successful robotics program. PMID:27662478

  8. Minimizing risk in anonymous egg donation.

    PubMed

    Ahuja, K K; Simons, E G; Nair, S; Rimington, M R; Armar, N A

    2003-11-01

    Assisted conception carries with it known and putative medical and surgical risks. Exposing healthy women to these risks in order to harvest eggs for donation when a safer alternative exists is morally and ethically unacceptable. Egg sharing minimizes risk and provides a source of eggs for donation. Anonymity protects all parties involved and should not be removed.

  9. Minimally Invasive Surgery for Inflammatory Bowel Disease

    PubMed Central

    Holder-Murray, Jennifer; Marsicovetere, Priscilla

    2015-01-01

    Abstract: Surgical management of inflammatory bowel disease is a challenging endeavor given infectious and inflammatory complications, such as fistula and abscess, and complex, often postoperative, anatomy, including adhesive disease from previous open operations. Patients with Crohn's disease and ulcerative colitis also bring to the table the burden of their chronic illness with anemia, malnutrition, and immunosuppression, all common and contributing independently as risk factors for increased surgical morbidity in this high-risk population. However, to reduce the physical trauma of surgery, technologic advances and worldwide experience with minimally invasive surgery have allowed laparoscopic management of patients to become standard of care, with significant short- and long-term patient benefits compared with the open approach. In this review, we will describe the current state-of-the-art for minimally invasive surgery for inflammatory bowel disease and the caveats inherent in this practice in this complex patient population. Also, we will review the applicability of current and future trends in minimally invasive surgical technique, such as laparoscopic “incisionless,” single-incision laparoscopic surgery (SILS), robotic-assisted, and other techniques for the patient with inflammatory bowel disease. There can be no doubt that minimally invasive surgery has been proven to decrease the short- and long-term burden of surgery of these chronic illnesses and represents high-value care for both patient and society. PMID:25989341

  10. Minimal Interventions in the Teaching of Mathematics

    ERIC Educational Resources Information Center

    Foster, Colin

    2014-01-01

    This paper addresses ways in which mathematics pedagogy can benefit from insights gleaned from counselling. Person-centred counselling stresses the value of genuineness, warm empathetic listening and minimal intervention to support people in solving their own problems and developing increased autonomy. Such an approach contrasts starkly with the…

  11. New Diagnostic Terminology for Minimal Brain Dysfunction.

    ERIC Educational Resources Information Center

    Shaywitz, Bennett A.; And Others

    1979-01-01

    Minimal brain dysfunction has been redefined by the American Psychological Association as attention deficit disorder (ADD) and subdivided into categories with and without hyperactivity. The revised 'Diagnostic and Statistical Manual' (DSM III) is now undergoing field trials. Journal Availability: C. V. Mosby Company, 11830 Westline Industrial…

  12. Minimal Mimicry: Mere Effector Matching Induces Preference

    ERIC Educational Resources Information Center

    Sparenberg, Peggy; Topolinski, Sascha; Springer, Anne; Prinz, Wolfgang

    2012-01-01

    Both mimicking and being mimicked induces preference for a target. The present experiments investigate the minimal sufficient conditions for this mimicry-preference link to occur. We argue that mere effector matching between one's own and the other person's movement is sufficient to induce preference, independent of which movement is actually…

  13. Waste minimization in environmental sampling and analysis

    SciTech Connect

    Brice, D.A.; Nixon, J. (Fernald Environmental Management Project); Lewis, E.T.

    1992-01-01

    Environmental investigations of the extent and effect of contamination, and projects to remediate such contamination, are designed to mitigate perceived threats to human health and the environment. During the course of these investigations, excavations, borings, and monitoring wells are constructed; monitoring wells are developed and purged prior to sampling; samples are collected; equipment is decontaminated; constituents are extracted and analyzed; and personal protective equipment is used to keep workers safe. All of these activities generate waste. A large portion of this waste may be classified as hazardous based on characteristics or constituent components. Waste minimization is defined as reducing the volume and/or toxicity of waste generated by a process. Waste minimization has proven to be an effective means of cost reduction and improving worker health, safety, and environmental awareness in the industrial workplace through pollution prevention. Building waste minimization goals into a project during the planning phase is both cost effective and consistent with total quality management principles. Application of waste minimization principles should be an integral part of the planning and conduct of environmental investigations. Current regulatory guidance on planning environmental investigations focuses on data quality and risk assessment objectives. Waste minimization should also be a scoping priority, along with meeting worker protection requirements, protection of human health and the environment, and achieving data quality objectives. Waste volume or toxicity can be reduced through the use of smaller sample sizes, less toxic extraction solvents, less hazardous decontamination materials, smaller excavations and borings, smaller diameter monitoring wells, dedicated sampling equipment, well-fitting personal protective equipment, judicious use of screening technologies, and analyzing only for parameters of concern.

  14. Waste minimization in environmental sampling and analysis

    SciTech Connect

    Brice, D.A.; Nixon, J.; Lewis, E.T.

    1992-03-01

    Environmental investigations of the extent and effect of contamination, and projects to remediate such contamination, are designed to mitigate perceived threats to human health and the environment. During the course of these investigations, excavations, borings, and monitoring wells are constructed; monitoring wells are developed and purged prior to sampling; samples are collected; equipment is decontaminated; constituents are extracted and analyzed; and personal protective equipment is used to keep workers safe. All of these activities generate waste. A large portion of this waste may be classified as hazardous based on characteristics or constituent components. Waste minimization is defined as reducing the volume and/or toxicity of waste generated by a process. Waste minimization has proven to be an effective means of cost reduction and improving worker health, safety, and environmental awareness in the industrial workplace through pollution prevention. Building waste minimization goals into a project during the planning phase is both cost effective and consistent with total quality management principles. Application of waste minimization principles should be an integral part of the planning and conduct of environmental investigations. Current regulatory guidance on planning environmental investigations focuses on data quality and risk assessment objectives. Waste minimization should also be a scoping priority, along with meeting worker protection requirements, protection of human health and the environment, and achieving data quality objectives. Waste volume or toxicity can be reduced through the use of smaller sample sizes, less toxic extraction solvents, less hazardous decontamination materials, smaller excavations and borings, smaller diameter monitoring wells, dedicated sampling equipment, well-fitting personal protective equipment, judicious use of screening technologies, and analyzing only for parameters of concern.

  15. Spatially explicit methodology for coordinated manure management in shared watersheds.

    PubMed

    Sharara, Mahmoud; Sampat, Apoorva; Good, Laura W; Smith, Amanda S; Porter, Pamela; Zavala, Victor M; Larson, Rebecca; Runge, Troy

    2017-05-01

    Increased clustering and consolidation of livestock production systems has been linked to adverse impacts on water quality. This study presents a methodology to optimize manure management within a hydrologic region to minimize agricultural phosphorus (P) loss associated with winter manure application. Spatial and non-spatial data representing livestock, crop, soil, terrain and hydrography were compiled to determine manure P production rates, crop P uptake, existing manure storage capabilities, and transportation distances. Field slope, hydrologic soil group (HSG), and proximity to waterbodies were used to classify crop fields according to their runoff risk for winter-applied manure. We use these data to construct a comprehensive optimization model that identifies optimal location, size, and transportation strategy to achieve environmental and economic goals. The environmental goal was the minimization of daily hauling of manure to environmentally sensitive crop fields, i.e., those classified as high P-loss fields, whereas the economic goal was the minimization of the transportation costs across the entire study area. A case study encompassing two contiguous 10-digit hydrologic unit subwatersheds (HUC-10) in South Central Wisconsin, USA was developed to demonstrate the proposed methodology. Additionally, scenarios representing different management decisions (storage facility maximum volume, and project capital) and production conditions (increased milk production and 20-year future projection) were analyzed to determine their impact on optimal decisions.
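
    The core transport step of such a model can be sketched as a small linear program. In the toy formulation below (all farm counts, hauling costs, supplies, and field limits are invented for illustration; the study's actual model is richer, covering storage siting and sizing), manure phosphorus from two farms is assigned to three fields so that hauling cost is minimized while per-field application limits are respected.

      import numpy as np
      from scipy.optimize import linprog

      cost = np.array([[4.0, 6.0, 9.0],       # $/ton hauled, farm i -> field j
                       [5.0, 3.0, 7.0]]).ravel()
      supply = [100.0, 80.0]                   # tons of manure produced per farm
      capacity = [70.0, 90.0, 60.0]            # agronomic P limit per field
      A_eq = np.zeros((2, 6))                  # each farm's manure must all be placed
      A_ub = np.zeros((3, 6))                  # field capacity rows
      for i in range(2):
          A_eq[i, i * 3:(i + 1) * 3] = 1.0
      for j in range(3):
          A_ub[j, [j, 3 + j]] = 1.0
      res = linprog(cost, A_ub=A_ub, b_ub=capacity, A_eq=A_eq, b_eq=supply)
      print(res.x.reshape(2, 3))               # tons hauled, farm -> field
      print(res.fun)                           # minimal total hauling cost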

  16. Q methodology in health economics.

    PubMed

    Baker, Rachel; Thompson, Carl; Mannion, Russell

    2006-01-01

    The recognition that health economists need to understand the meaning of data if they are to adequately understand research findings which challenge conventional economic theory has led to the growth of qualitative modes of enquiry in health economics. The use of qualitative methods of exploration and description alongside quantitative techniques gives rise to a number of epistemological, ontological and methodological challenges: difficulties in accounting for subjectivity in choices, the need for rigour and transparency in method, and problems of disciplinary acceptability to health economists. Q methodology is introduced as a means of overcoming some of these challenges. We argue that Q offers a means of exploring subjectivity, beliefs and values while retaining the transparency, rigour and mathematical underpinnings of quantitative techniques. The various stages of Q methodological enquiry are outlined alongside potential areas of application in health economics, before discussing the strengths and limitations of the approach. We conclude that Q methodology is a useful addition to economists' methodological armoury and one that merits further consideration and evaluation in the study of health services.

  17. Minimizing proteome redundancy in the UniProt Knowledgebase

    PubMed Central

    Bursteinas, Borisas; Britto, Ramona; Bely, Benoit; Auchincloss, Andrea; Rivoire, Catherine; Redaschi, Nicole; O'Donovan, Claire; Martin, Maria Jesus

    2016-01-01

    Advances in high-throughput sequencing have led to an unprecedented growth in genome sequences being submitted to biological databases. In particular, the sequencing of large numbers of nearly identical bacterial genomes during infection outbreaks and for other large-scale studies has resulted in a high level of redundancy in nucleotide databases and consequently in the UniProt Knowledgebase (UniProtKB). Redundancy negatively impacts on database searches by causing slower searches, an increase in statistical bias and cumbersome result analysis. The redundancy combined with the large data volume increases the computational costs for most reuses of UniProtKB data. All of this poses challenges for effective discovery in this wealth of data. With the continuing development of sequencing technologies, it is clear that finding ways to minimize redundancy is crucial to maintaining UniProt's essential contribution to data interpretation by our users. We have developed a methodology to identify and remove highly redundant proteomes from UniProtKB. The procedure identifies redundant proteomes by performing pairwise alignments of sets of sequences for pairs of proteomes and subsequently, applies graph theory to find dominating sets that provide a set of non-redundant proteomes with a minimal loss of information. This method was implemented for bacteria in mid-2015, resulting in a removal of 50 million proteins in UniProtKB. With every new release, this procedure is used to filter new incoming proteomes, resulting in a more scalable and scientifically valuable growth of UniProtKB. Database URL: http://www.uniprot.org/proteomes/ PMID:28025334
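
    The graph step described above can be illustrated in a few lines: build a redundancy graph, then greedily pick representatives until every proteome is covered by a kept neighbor. The tiny graph below is invented for illustration, and UniProt's production procedure is not claimed to use this exact greedy rule.

      # Nodes are proteomes; an edge means a pair was judged redundant by the
      # pairwise sequence comparison. (Toy data, not UniProtKB content.)
      edges = {("p1", "p2"), ("p2", "p3"), ("p3", "p4"), ("p5", "p2")}
      nodes = {n for e in edges for n in e}
      adj = {n: set() for n in nodes}
      for a, b in edges:
          adj[a].add(b)
          adj[b].add(a)

      uncovered, kept = set(nodes), []
      while uncovered:
          # keep the proteome that represents the most still-uncovered ones
          best = max(nodes, key=lambda n: len((adj[n] | {n}) & uncovered))
          kept.append(best)
          uncovered -= adj[best] | {best}
      print(kept)    # a (greedy) dominating set: the non-redundant representatives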

  18. The problems of the minimal surface and minimal lineal measure in three dimensions

    SciTech Connect

    Christensen, R.M.

    1994-02-01

    A solution is given to the classical problem of the minimal surface in three dimensions formed from a repeating cell microstructure under isotropic conditions. The solution is found through a global/local minimization procedure and the resulting basic cell is composed of 14 faces. At the junctions where the intersections between faces meet at a point, half of the junctions involve 4 intersections and half involve 3 intersections. The same general solution also applies to the related minimal lineal measure problem where the measure is that of the length of the intersections connecting the junctions. Some implications and applications for materials science are given.

  19. Minimal Left-Right Symmetric Dark Matter.

    PubMed

    Heeck, Julian; Patra, Sudhanwa

    2015-09-18

    We show that left-right symmetric models can easily accommodate stable TeV-scale dark matter particles without the need for an ad hoc stabilizing symmetry. The stability of a newly introduced multiplet either arises accidentally as in the minimal dark matter framework or comes courtesy of the remaining unbroken Z_{2} subgroup of B-L. Only one new parameter is introduced: the mass of the new multiplet. As minimal examples, we study left-right fermion triplets and quintuplets and show that they can form viable two-component dark matter. This approach is, in particular, valid for SU(2)×SU(2)×U(1) models that explain the recent diboson excess at ATLAS in terms of a new charged gauge boson of mass 2 TeV.

  20. Waste Minimization Program. Air Force Plant 6.

    DTIC Science & Technology

    1986-02-01

    [Abstract OCR largely illegible; recoverable fragments follow.] Contents include: 3.0 Waste Minimization Program, AFP 6; 3.1 Machine Coolant Waste; 3.2 Engine Oil and Hydraulic Fluid Waste; 3.3 Paint Sludge; 3.4 ... "... Boynton Beach, Florida, for the purpose of aiding in minimizing waste generation from Air Force industrial facilities. It is not an endorsement of any product."

  1. [Invasive and minimally invasive hemodynamic monitoring].

    PubMed

    Hansen, Matthias

    2016-10-01

    Advanced hemodynamic monitoring is necessary for the adequate management of high-risk patients or patients with circulatory derangement. Studies demonstrate a benefit of early goal-directed therapy in unstable cardiopulmonary situations. Several minimally invasive and invasive hemodynamic monitoring options are now available. Minimally invasive measurements such as pulse contour analysis or pulse wave analysis can be less accurate under some circumstances, but require only an arterial catheter for cardiac output monitoring. Pulmonary artery, transpulmonary thermodilution and lithium dilution technologies have acceptable accuracy in cardiac output measurement, and additionally provide parameters useful for the therapy of unstable circulation. The pulmonary artery catheter is the device with the highest complication rate; used by a trained team and with a correct indication, its use remains justified.

  2. Advances in minimally invasive neonatal colorectal surgery

    PubMed Central

    Bandi, Ashwath S; Bradshaw, Catherine J; Giuliani, Stefano

    2016-01-01

    Over the last two decades, advances in laparoscopic surgery and minimally invasive techniques have transformed the operative management of neonatal colorectal surgery for conditions such as anorectal malformations (ARMs) and Hirschsprung’s disease. Evolution of surgical care has mainly occurred due to the use of laparoscopy, as opposed to a laparotomy, for intra-abdominal procedures and the development of trans-anal techniques. This review describes these advances and outlines the main minimally invasive techniques currently used for management of ARMs and Hirschsprung’s disease. There does still remain significant variation in the procedures used and this review aims to report the current literature comparing techniques with an emphasis on the short- and long-term clinical outcomes. PMID:27830038

  3. Minimal walking technicolor: Setup for collider physics

    SciTech Connect

    Foadi, Roshan; Frandsen, Mads T.; Ryttov, Thomas A.; Sannino, Francesco

    2007-09-01

    Different theoretical and phenomenological aspects of the minimal and nonminimal walking technicolor theories have recently been studied. The goal here is to make the models ready for collider phenomenology. We do this by constructing the low energy effective theory containing scalars, pseudoscalars, vector mesons, and other fields predicted by the minimal walking theory. We construct their self-interactions and interactions with standard model fields. Using the Weinberg sum rules, suitably modified to take into account the walking behavior of the underlying gauge theory, we find interesting relations for the spin-one spectrum. We derive the electroweak parameters using the newly constructed effective theory and compare the results with the underlying gauge theory. Our analysis is sufficiently general such that the resulting model can be used to represent a generic walking technicolor theory not at odds with precision data.

  4. The Minimal Supersymmetric Fat Higgs Model

    SciTech Connect

    Harnik, Roni; Kribs, Graham D.; Larson, Daniel T.; Murayama, Hitoshi

    2003-11-26

    We present a calculable supersymmetric theory of a composite "fat" Higgs boson. Electroweak symmetry is broken dynamically through a new gauge interaction that becomes strong at an intermediate scale. The Higgs mass can easily be 200-450 GeV along with the superpartner masses, solving the supersymmetric little hierarchy problem. We explicitly verify that the model is consistent with precision electroweak data without fine-tuning. Gauge coupling unification can be maintained despite the inherently strong dynamics involved in electroweak symmetry breaking. Supersymmetrizing the Standard Model therefore does not imply a light Higgs mass, contrary to the lore in the literature. The Higgs sector of the minimal Fat Higgs model has a mass spectrum that is distinctly different from the Minimal Supersymmetric Standard Model.

  5. A Minimal Periods Algorithm with Applications

    NASA Astrophysics Data System (ADS)

    Xu, Zhi

    Kosaraju in "Computation of squares in a string" briefly described a linear-time algorithm for computing the minimal squares starting at each position in a word. Using the same construction of suffix trees, we generalize his result and describe in detail how to compute the minimal α power, with a period of length longer than s, starting at each position in a word w for arbitrary exponent α> 1 and integer s ≥ 0. The algorithm runs in O(α|w|)-time for s = 0 and in O(|w|2)-time otherwise. We provide a complete proof of the correctness and computational complexity of the algorithm. The algorithm can be used to detect certain types of pseudo-patterns in words, which was our original goal in studying this generalization.

  6. Towards synthesis of a minimal cell.

    PubMed

    Forster, Anthony C; Church, George M

    2006-01-01

    Construction of a chemical system capable of replication and evolution, fed only by small molecule nutrients, is now conceivable. This could be achieved by stepwise integration of decades of work on the reconstitution of DNA, RNA and protein syntheses from pure components. Such a minimal cell project would initially define the components sufficient for each subsystem, allow detailed kinetic analyses and lead to improved in vitro methods for synthesis of biopolymers, therapeutics and biosensors. Completion would yield a functionally and structurally understood self-replicating biosystem. Safety concerns for synthetic life will be alleviated by extreme dependence on elaborate laboratory reagents and conditions for viability. Our proposed minimal genome is 113 kbp long and contains 151 genes. We detail building blocks already in place and major hurdles to overcome for completion.

  7. Topological minimally entangled states via geometric measure

    NASA Astrophysics Data System (ADS)

    Buerschaper, Oliver; García-Saez, Artur; Orús, Román; Wei, Tzu-Chieh

    2014-11-01

    Here we show how the Minimally Entangled States (MES) of a 2d system with topological order can be identified using the geometric measure of entanglement. We show this by minimizing this measure for the doubled semion, doubled Fibonacci and toric code models on a torus with non-trivial topological partitions. Our calculations are done either quasi-exactly for small system sizes, or using the tensor network approach in Orús et al (arXiv:1406.0585) for large sizes. As a byproduct of our methods, we see that the minimisation of the geometric entanglement can also determine the number of Abelian quasiparticle excitations in a given model. The results in this paper provide a very efficient and accurate way of extracting the full topological information of a 2d quantum lattice model from the multipartite entanglement structure of its ground states.

  8. Minimizing broadband excitation under dissipative conditions

    NASA Astrophysics Data System (ADS)

    Gelman, David; Kosloff, Ronnie

    2005-12-01

    Optimal control theory is employed for the task of minimizing the excited-state population of a dye molecule in solution. The spectrum of the excitation pulse is contained completely in the absorption band of the molecule. Only phase control is studied, which is equivalent to optimizing the transmission of the pulse through the medium. The molecular model explicitly includes two electronic states and a single vibrational mode. The other degrees of freedom are classified as bath modes. The surrogate Hamiltonian method is employed to incorporate these bath degrees of freedom. Their influence can be classified as electronic dephasing and vibrational relaxation. In accordance with experimental results, minimal excitation is associated with negatively chirped pulses. Optimal pulses with more complex transient structure are found to be superior to linearly chirped pulses. The difference is enhanced when the fluence is increased. The improvement degrades when dissipative effects become more dominant.

  9. [Minimally invasive operations in vascular surgery].

    PubMed

    Stádler, Petr; Sedivý, Petr; Dvorácek, Libor; Slais, Marek; Vitásek, Petr; El Samman, Khaled; Matous, Pavel

    2011-01-01

    Minimally invasive surgery provides an attractive alternative to conventional surgical approaches and is popular with patients, particularly because of its favourable cosmetic results. Vascular surgery has taken its inspiration from general surgery and, over the past few years, has also been reducing the invasiveness of its operating methods. In addition to traditional laparoscopic techniques, we most frequently encounter the endovascular treatment of aneurysms of the thoracic and abdominal aorta and, most recently, robot-assisted surgery in the area of the abdominal aorta and pelvic arteries. Minimally invasive surgical interventions also have other advantages, including less operative trauma, a reduction in post-operative pain, shorter periods spent in the intensive care unit and overall hospitalization times, an earlier return to normal life and, finally, a reduction in total treatment costs.

  10. Minimize reference sideband generation in microwave PLLs

    NASA Astrophysics Data System (ADS)

    Goldman, Stan

    1991-02-01

    The processes responsible for producing reference sidebands are outlined, and the sources of coupling to the microwave voltage-controlled oscillator (VCO) tune line, including power-supply-generated signals, TTL-controlled interface signals, intermediate programmable-divider signals, and radiated TTL signals, are discussed. It is noted that filtering alone is inadequate for reference-sideband suppression, while minimizing the tuning slope and maximizing the reference frequency will result in a reduced reference-sideband level. Minimizing offset currents by using a differential amplifier connection may reduce the reference-sideband level contributed by an op amp. The choice of a TTL, ECL, or GaAs phase/frequency detector, as well as PCB isolation techniques, can determine the level of reference sidebands.
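
    To see why tuning slope and reference frequency matter, narrowband-FM theory gives a standard spur-level estimate; the sketch below applies that approximation with invented numbers (the article itself gives no such worked example).

    ```python
    # Back-of-envelope sketch: a spur of amplitude V on the VCO tune line
    # frequency-modulates the carrier; for small modulation index
    # beta = Kv * V / f_ref, narrowband-FM theory puts the reference
    # sidebands at roughly 20*log10(beta/2) dBc.
    import math

    def sideband_level_dbc(kv_hz_per_v, v_spur_volts, f_ref_hz):
        beta = kv_hz_per_v * v_spur_volts / f_ref_hz  # FM modulation index
        return 20.0 * math.log10(beta / 2.0)          # valid for beta << 1

    # Halving the tuning slope (or doubling f_ref) buys about 6 dB:
    print(sideband_level_dbc(10e6, 100e-6, 1e6))  # ~ -66 dBc
    print(sideband_level_dbc(5e6, 100e-6, 1e6))   # ~ -72 dBc
    ```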

  11. The Sense of Commitment: A Minimal Approach

    PubMed Central

    Michael, John; Sebanz, Natalie; Knoblich, Günther

    2016-01-01

    This paper provides a starting point for psychological research on the sense of commitment within the context of joint action. We begin by formulating three desiderata: to illuminate the motivational factors that lead agents to feel and act committed, to pick out the cognitive processes and situational factors that lead agents to sense that implicit commitments are in place, and to illuminate the development of an understanding of commitment in ontogeny. In order to satisfy these three desiderata, we propose a minimal framework, the core of which is an analysis of the minimal structure of situations which can elicit a sense of commitment. We then propose a way of conceptualizing and operationalizing the sense of commitment, and discuss cognitive and motivational processes which may underpin the sense of commitment. PMID:26779080

  12. Tall sections from non-minimal transformations

    NASA Astrophysics Data System (ADS)

    Morrison, David R.; Park, Daniel S.

    2016-10-01

    In previous work, we have shown that elliptic fibrations with two sections, or Mordell-Weil rank one, can always be mapped birationally to a Weierstrass model of a certain form, namely, the Jacobian of a P^{112} model. Most constructions of elliptically fibered Calabi-Yau manifolds with two sections have been carried out assuming that the image of this birational map was a "minimal" Weierstrass model. In this paper, we show that for some elliptically fibered Calabi-Yau manifolds with Mordell-Weil rank one, the Jacobian of the P^{112} model is not minimal. Said another way, starting from a Calabi-Yau Weierstrass model, the total space must be blown up (thereby destroying the "Calabi-Yau" property) in order to embed the model into P^{112}. In particular, we show that the elliptic fibrations studied recently by Klevers and Taylor fall into this class of models.

  13. Minimally processed vegetable salads: microbial quality evaluation.

    PubMed

    Fröder, Hans; Martins, Cecília Geraldes; De Souza, Katia Leani Oliveira; Landgraf, Mariza; Franco, Bernadette D G M; Destro, Maria Teresa

    2007-05-01

    The increasing demand for fresh fruits and vegetables and for convenience foods is causing an expansion of the market share for minimally processed vegetables. Among the more common pathogenic microorganisms that can be transmitted to humans by these products are Listeria monocytogenes, Escherichia coli O157:H7, and Salmonella. The aim of this study was to evaluate the microbial quality of a selection of minimally processed vegetables. A total of 181 samples of minimally processed leafy salads were collected from retailers in the city of Sao Paulo, Brazil. Counts of total coliforms, fecal coliforms, Enterobacteriaceae, psychrotrophic microorganisms, and Salmonella were conducted for 133 samples. L. monocytogenes was assessed in 181 samples using the BAX System and by plating the enrichment broth onto Palcam and Oxford agars. Suspected Listeria colonies were submitted to classical biochemical tests. Populations of psychrotrophic microorganisms >10^6 CFU/g were found in 51% of the 133 samples, and Enterobacteriaceae populations between 10^5 and 10^6 CFU/g were found in 42% of the samples. Fecal coliform concentrations higher than 10^2 CFU/g (Brazilian standard) were found in 97 (73%) of the samples, and Salmonella was detected in 4 (3%) of the samples. Two of the Salmonella-positive samples had fecal coliform concentrations below 10^2 CFU/g. L. monocytogenes was detected in only 1 (0.6%) of the 181 samples examined. This positive sample was simultaneously detected by both methods. The other Listeria species identified by plating were L. welshimeri (one sample of curly lettuce) and L. innocua (2 samples of watercress). The results indicate that minimally processed vegetables had poor microbiological quality, and these products could be a vehicle for pathogens such as Salmonella and L. monocytogenes.

  14. [Minimally invasive spine surgery: past and present].

    PubMed

    Corniola, M V; Stienen, M N; Tessitore, E; Schaller, K; Gautschi, O P

    2015-11-18

    In the early twentieth century, advances in the understanding of spine biomechanics and in surgical techniques for the lumbar spine led to the currently emerging concept of minimally invasive spine surgery. By reducing surgical access, blood loss, infection rate and general morbidity, the functional prognosis of patients is improved. This is a real challenge for the spine surgeon, who has to maintain good operative results while significantly reducing the collateral surgical damage associated with the relatively traumatic conventional access.

  15. Asymptotic safety, emergence and minimal length

    NASA Astrophysics Data System (ADS)

    Percacci, Roberto; Vacca, Gian Paolo

    2010-12-01

    There seems to be a common prejudice that asymptotic safety is either incompatible with, or at best unrelated to, the other topics in the title. This is not the case. In fact, we show that (1) the existence of a fixed point with suitable properties is a promising way of deriving emergent properties of gravity, and (2) there is a sense in which asymptotic safety implies a minimal length. In doing so we also discuss possible signatures of asymptotic safety in scattering experiments.

  16. Heroin-associated anthrax with minimal morbidity.

    PubMed

    Black, Heather; Chapman, Ann; Inverarity, Donald; Sinha, Satyajit

    2017-03-08

    In 2010, during an outbreak of anthrax affecting people who inject drugs, a heroin user aged 37 years presented with soft tissue infection. He subsequently was found to have anthrax. We describe his management and the difficulty in distinguishing anthrax from non-anthrax lesions. His full recovery, despite an overall mortality of 30% for injectional anthrax, demonstrates that some heroin-related anthrax cases can be managed predominantly with oral antibiotics and minimal surgical intervention.

  17. Minimally invasive aesthetic procedures in young adults

    PubMed Central

    Wollina, Uwe; Goldman, Alberto

    2011-01-01

    Age is a significant factor in modifying specific needs when it comes to medical aesthetic procedures. In this review we will focus on young adults in their third decade of life and review minimally invasive aesthetic procedures other than cosmetics and cosmeceuticals. Correction of asymmetries, correction after body modifying procedures, and facial sculpturing are important issues for young adults. The implication of aesthetic medicine as part of preventive medicine is a major ethical challenge that differentiates aesthetic medicine from fashion. PMID:21673871

  18. Minimal Basis for Gauge Theory Amplitudes

    SciTech Connect

    Bjerrum-Bohr, N. E. J.; Damgaard, Poul H.; Vanhove, Pierre

    2009-10-16

    Identities based on monodromy for integrations in string theory are used to derive relations between different color-ordered tree-level amplitudes in both bosonic and supersymmetric string theory. These relations imply that the color-ordered tree-level n-point gauge theory amplitudes can be expanded in a minimal basis of (n-3)! amplitudes. This result holds for any choice of polarizations of the external states and in any number of dimensions.

  19. Cigarette price minimization strategies used by adults.

    PubMed

    Pesko, Michael F; Kruger, Judy; Hyland, Andrew

    2012-09-01

    We used multivariate logistic regressions to analyze data from the 2006 to 2007 Tobacco Use Supplement of the Current Population Survey, a nationally representative sample of adults. We explored use of cigarette price minimization strategies, such as purchasing cartons of cigarettes, purchasing in states with lower after-tax cigarette prices, and purchasing on the Internet. Racial/ethnic minorities and persons with low socioeconomic status used these strategies less frequently at last purchase than did White and high-socioeconomic-status respondents.
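
    The kind of model described here can be sketched in a few lines; the data below are synthetic stand-ins and the covariate names are hypothetical, not the actual Tobacco Use Supplement variables.

    ```python
    # Sketch of a multivariate logistic regression of the sort used in such
    # survey analyses, fit on synthetic data for illustration only.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(42)
    n = 2000
    df = pd.DataFrame({
        "low_ses": rng.integers(0, 2, n),   # hypothetical covariates
        "minority": rng.integers(0, 2, n),
        "age": rng.integers(18, 80, n),
    })
    # Synthetic outcome: bought cartons (a price-minimizing strategy)?
    logit_p = -0.5 - 0.4 * df["low_ses"] - 0.3 * df["minority"] + 0.01 * df["age"]
    df["bought_carton"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

    fit = smf.logit("bought_carton ~ low_ses + minority + age", data=df).fit()
    print(np.exp(fit.params))  # odds ratios for each covariate
    ```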

  20. Minimally Invasive Diagnosis of Secondary Intracranial Lymphoma

    PubMed Central

    Healy, G. M.; Redmond, C. E.; Stocker, E.; Connaghan, G.; Skehan, S. J.; Killeen, R. P.

    2016-01-01

    Diffuse large B cell lymphomas (DLBCL) are an aggressive group of non-Hodgkin lymphoid malignancies which have diverse presentation and can have high mortality. Central nervous system relapse is rare but has poor survival. We present the diagnosis of primary mandibular DLBCL and a unique minimally invasive diagnosis of secondary intracranial recurrence. This case highlights the manifold radiological contributions to the diagnosis and management of lymphoma. PMID:28018686

  1. On 3D minimal massive gravity

    NASA Astrophysics Data System (ADS)

    Alishahiha, Mohsen; Qaemmaqami, Mohammad M.; Naseh, Ali; Shirzad, Ahmad

    2014-12-01

    We study linearized equations of motion of the newly proposed three dimensional gravity, known as minimal massive gravity, using its metric formulation. By making use of a redefinition of the parameters of the model, we observe that the resulting linearized equations are exactly the same as those of TMG. In particular the model admits logarithmic modes at critical points. We also study several vacuum solutions of the model, especially in a certain limit where the contribution of the Chern-Simons term vanishes.

  2. Smooth GERBS, orthogonal systems and energy minimization

    NASA Astrophysics Data System (ADS)

    Dechevsky, Lubomir T.; Zanaty, Peter

    2013-12-01

    New results are obtained in three mutually related directions of the rapidly developing theory of generalized expo-rational B-splines (GERBS) [7, 6]: closed-form computability of C∞-smooth GERBS in terms of elementary and special functions, Hermite interpolation and least-squares best approximation via smooth GERBS, energy minimizing properties of smooth GERBS similar to those of the classical cubic polynomial B-splines.

  3. Smooth GERBS, orthogonal systems and energy minimization

    SciTech Connect

    Dechevsky, Lubomir T.; Zanaty, Peter

    2013-12-18

    New results are obtained in three mutually related directions of the rapidly developing theory of generalized expo-rational B-splines (GERBS) [7, 6]: closed-form computability of C∞-smooth GERBS in terms of elementary and special functions, Hermite interpolation and least-squares best approximation via smooth GERBS, energy minimizing properties of smooth GERBS similar to those of the classical cubic polynomial B-splines.

  4. Nonlinear transient analysis via energy minimization

    NASA Technical Reports Server (NTRS)

    Kamat, M. P.; Knight, N. F., Jr.

    1978-01-01

    The formulation basis for nonlinear transient analysis of finite element models of structures using energy minimization is provided. Geometric and material nonlinearities are included. The development is restricted to simple one and two dimensional finite elements which are regarded as being the basic elements for modeling full aircraft-like structures under crash conditions. The results indicate the effectiveness of the technique as a viable tool for this purpose.

  5. The NLC Software Requirements Methodology

    SciTech Connect

    Shoaee, Hamid

    2002-08-20

    We describe the software requirements and development methodology developed for the NLC control system. Given the longevity of that project, and the likely geographical distribution of the collaborating engineers, the planned requirements management process is somewhat more formal than the norm in high energy physics projects. The short term goals of the requirements process are to accurately estimate costs, to decompose the problem, and to determine likely technologies. The long term goal is to enable a smooth transition from high level functional requirements to specific subsystem and component requirements for individual programmers, and to support distributed development. The methodology covers both ends of that life cycle. It covers both the analytical and documentary tools for software engineering, and project management support. This paper introduces the methodology, which is fully described in [1].

  6. [Minimally Invasive Treatment of Esophageal Benign Diseases].

    PubMed

    Inoue, Haruhiro

    2016-07-01

    As a minimally invasive treatment for esophageal achalasia, per-oral endoscopic myotomy (POEM) was developed in 2008. More than 1,100 patients with achalasia-related diseases have received POEM. The success rate of the procedure was more than 95% (Eckardt score improvement of 3 points or more). No serious complication (Clavien-Dindo classification IIIb or higher) was experienced. These results suggest that POEM is becoming a standard minimally invasive treatment for achalasia-related diseases. As an offshoot of POEM, submucosal tumor removal through a submucosal tunnel (per-oral endoscopic tumor resection: POET) was developed and is safely performed. The best indication for POET is an esophageal leiomyoma of less than 5 cm. A novel endoscopic treatment of gastroesophageal reflux disease (GERD) was also developed. Anti-reflux mucosectomy (ARMS) is a nearly circumferential mucosal reduction of the gastric cardia mucosa. ARMS has been performed in 56 consecutive cases of refractory GERD, with no major complications and excellent clinical results. The best indication for ARMS is refractory GERD without a long sliding hernia. The longest follow-up now exceeds 10 years. Minimally invasive treatments for esophageal benign diseases are currently performed by therapeutic endoscopy.

  7. Minimally invasive thyroidectomy: a ten years experience

    PubMed Central

    Viani, Lorenzo; Montana, Chiara Montana; Cozzani, Federico; Sianesi, Mario

    2016-01-01

    Background The conventional thyroidectomy is the most frequent surgical procedure for thyroid disease. Minimally invasive approaches to thyroid surgery were introduced several years ago. These new procedures have reduced postoperative pain and morbidity and improved cosmetic results and patients' quality of life. Minimally invasive video-assisted thyroidectomy (MIVAT) is a minimally invasive procedure that uses a minicervicotomy to treat thyroid diseases. Methods We present our experience with 497 consecutively treated patients using the MIVAT technique. We analyzed mean age, sex, mean operative time, and the rates of bleeding, hypocalcemia, and transitory and definitive nerve palsy (6 months after the procedure), postoperative pain on a scale from 0 to 10 at 1 hour and 24 hours after surgery, and mean hospital stay. Results The indications to treat were related to the preoperative diagnosis: 182 THYR 6, 184 THYR 3–4, 27 Plummer disease, 24 Basedow disease, 28 toxic goiter, 52 goiter. In 497 cases we recorded 1 case of bleeding (0.2%), 12 cases (2.4%) of transitory nerve palsy and 4 (0.8%) of definitive nerve palsy. The rate of serologic hypocalcemia was 24.9% (124 cases) and of clinical hypocalcemia 7.2% (36 cases); there was 1 case of hypoparathyroidism (0.2%). Conclusions MIVAT is a safe approach to surgical thyroid disease; its costs and adverse events are similar to those of conventional thyroidectomy. The minicervicotomy provides genuinely minimally invasive tissue dissection. PMID:27294036

  8. Esophageal surgery in minimally invasive era

    PubMed Central

    Bencini, Lapo; Moraldi, Luca; Bartolini, Ilenia; Coratti, Andrea

    2016-01-01

    The widespread popularity of new surgical technologies such as laparoscopy, thoracoscopy and robotics has led many surgeons to treat esophageal diseases with these methods. The expected benefits of minimally invasive surgery (MIS) mainly include reductions of postoperative complications, length of hospital stay, and pain and better cosmetic results. All of these benefits could potentially be of great interest when dealing with the esophagus due to the potentially severe complications that can occur after conventional surgery. Moreover, robotic platforms are expected to reduce many of the difficulties encountered during advanced laparoscopic and thoracoscopic procedures such as anastomotic reconstructions, accurate lymphadenectomies, and vascular sutures. Almost all esophageal diseases are approachable in a minimally invasive way, including diverticula, gastro-esophageal reflux disease, achalasia, perforations and cancer. Nevertheless, while the limits of MIS for benign esophageal diseases are mainly technical issues and costs, oncologic outcomes remain the cornerstone of any procedure to cure malignancies, for which the long-term results are critical. Furthermore, many of the minimally invasive esophageal operations should be compared to pharmacologic interventions and advanced pure endoscopic procedures; such a comparison requires a difficult literature analysis and leads to some confounding results of clinical trials. This review aims to examine the evidence for the use of MIS in both malignancies and more common benign disease of the esophagus, with a particular emphasis on future developments and ongoing areas of research. PMID:26843913

  9. The non-minimal ekpyrotic trispectrum

    SciTech Connect

    Fertig, Angelika; Lehners, Jean-Luc

    2016-01-01

    Employing the covariant formalism, we derive the evolution equations for two scalar fields with non-canonical field space metric up to third order in perturbation theory. These equations can be used to derive predictions for local bi- and trispectra of multi-field cosmological models. Our main application is to ekpyrotic models in which the primordial curvature perturbations are generated via the non-minimal entropic mechanism. In these models, nearly scale-invariant entropy perturbations are generated first due to a non-minimal kinetic coupling between two scalar fields, and subsequently these perturbations are converted into curvature perturbations. Remarkably, the entropy perturbations have vanishing bi- and trispectra during the ekpyrotic phase. However, as we show, the conversion process to curvature perturbations induces local non-Gaussianity parameters f_NL and g_NL at levels that should be detectable by near-future observations. In fact, in order to obtain a large enough amplitude and small enough bispectrum of the curvature perturbations, as seen in current measurements, the conversion process must be very efficient. Interestingly, for such efficient conversions the trispectrum parameter g_NL remains negative and typically of a magnitude O(10^2)–O(10^3), resulting in a distinguishing feature of non-minimally coupled ekpyrotic models.

  10. Waste minimization in an autobody repair shop

    SciTech Connect

    Baria, D.N.; Dorland, D.; Bergeron, J.T.

    1994-12-31

    This work was done to document the waste minimization incorporated in a new autobody repair facility in Hermantown, Minnesota. Humes Collision Center incorporated new waste reduction techniques when it expanded its old facilities in 1992 and was able to achieve the benefits of cost reduction and waste reduction. Humes Collision Center repairs an average of 500 cars annually and is a very small quantity generator (VSQG) of hazardous waste, as defined by the Minnesota Pollution Control Agency (MPCA). The hazardous waste consists of antifreeze, batteries, paint sludge, refrigerants, and used oil, while the nonhazardous waste consists of cardboard, glass, paint filters, plastic, sanding dust, scrap metal, and wastewater. The hazardous and nonhazardous waste output was decreased by 72%. In addition, there was a 63% reduction in the operating costs. The waste minimization includes antifreeze recovery and recycling, reduction in unused waste paint, reduction, recovery and recycling of waste lacquer thinner for cleaning spray guns and paint cups, elimination of used plastic car bags, recovery and recycling of refrigerant, reduction in waste sandpaper and elimination of sanding dust, and elimination of waste paint filters. The rate of return on the investment in waste minimization equipment is estimated from 37% per year for the distillation unit, 80% for vacuum sanding, 146% for computerized paint mixing, and 211% for the refrigerant recycler, to 588% per year for the gun washer. The corresponding payback time varies from 3 years to 2 months.

  11. The non-minimal ekpyrotic trispectrum

    NASA Astrophysics Data System (ADS)

    Fertig, Angelika; Lehners, Jean-Luc

    2016-01-01

    Employing the covariant formalism, we derive the evolution equations for two scalar fields with non-canonical field space metric up to third order in perturbation theory. These equations can be used to derive predictions for local bi- and trispectra of multi-field cosmological models. Our main application is to ekpyrotic models in which the primordial curvature perturbations are generated via the non-minimal entropic mechanism. In these models, nearly scale-invariant entropy perturbations are generated first due to a non-minimal kinetic coupling between two scalar fields, and subsequently these perturbations are converted into curvature perturbations. Remarkably, the entropy perturbations have vanishing bi- and trispectra during the ekpyrotic phase. However, as we show, the conversion process to curvature perturbations induces local non-Gaussianity parameters fNL and gNL at levels that should be detectable by near-future observations. In fact, in order to obtain a large enough amplitude and small enough bispectrum of the curvature perturbations, as seen in current measurements, the conversion process must be very efficient. Interestingly, for such efficient conversions the trispectrum parameter gNL remains negative and typically of a magnitude O(10^2)–O(10^3), resulting in a distinguishing feature of non-minimally coupled ekpyrotic models.

  12. Income Smoothing: Methodology and Models.

    DTIC Science & Technology

    1986-05-01

    …that managers desire a pattern of income that has low variability relative to a linear time trend. 2. Industry Trend. Target 2 assumes that firms… Income Smoothing: Methodology and Models, by O. Douglas Moses, Naval Postgraduate School, Monterey, California, May 1986. Approved for public release; distribution unlimited.

  13. Methodological pluralism and narrative inquiry

    NASA Astrophysics Data System (ADS)

    Michie, Michael

    2013-09-01

    This paper considers how the integral theory model of Nancy Davis and Laurie Callihan might be enacted using a different qualitative methodology, in this case the narrative methodology. The focus of narrative research is shown to be on 'what meaning is being made' rather than 'what is happening here' (quadrant 2 rather than quadrant 1). It is suggested that in using the integral theory model, a qualitative research project focuses primarily on one quadrant and is enhanced by approaches suggested in the other quadrants.

  14. Minimal entropy probability paths between genome families.

    PubMed

    Ahlbrandt, Calvin; Benson, Gary; Casey, William

    2004-05-01

    We develop a metric for probability distributions with applications to biological sequence analysis. Our distance metric is obtained by minimizing a functional defined on the class of paths over probability measures on N categories. The underlying mathematical theory is connected to a constrained problem in the calculus of variations. The solution presented is a numerical solution, which approximates the true solution in a set of cases called rich paths where none of the components of the path is zero. The functional to be minimized is motivated by entropy considerations, reflecting the idea that nature might efficiently carry out mutations of genome sequences in such a way that the increase in entropy involved in transformation is as small as possible. We characterize sequences by frequency profiles or probability vectors; in the case of DNA, N is 4 and the components of the probability vector are the frequencies of occurrence of each of the bases A, C, G and T. Given two probability vectors a and b, we define a distance function based on the infimum of path integrals of the entropy function H(p) over all admissible paths p(t), 0 ≤ t ≤ 1, with p(t) a probability vector such that p(0) = a and p(1) = b. If the probability paths p(t) are parameterized as y(s) in terms of arc length s and the optimal path is smooth with arc length L, then smooth and "rich" optimal probability paths may be numerically estimated by a hybrid method: iterating Newton's method on solutions of a two-point boundary value problem, with unknown distance L between the abscissas, for the Euler-Lagrange equations resulting from a multiplier rule for the constrained optimization problem, together with linear regression to improve the arc length estimate L. Matlab code for these numerical methods is provided which works only for "rich" optimal probability vectors. These methods motivate a definition of an elementary distance function which is easier and faster to calculate, works on non…
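
    To make the objective concrete, the following sketch evaluates the path integral of H(p) along the straight-line path between two base-frequency vectors. This gives only an upper bound on the infimum-based distance (the true minimizer requires the boundary-value problem described above); the example vectors are illustrative.

    ```python
    # Numerical sketch: entropy path length of the *linear* path between
    # two probability vectors, an upper bound on the infimum-based metric.
    import numpy as np

    def entropy(p):
        p = np.clip(p, 1e-12, 1.0)
        return -np.sum(p * np.log(p))

    def linear_path_length(a, b, steps=1000):
        """Midpoint-rule estimate of  integral_0^1 H(p(t)) |p'(t)| dt."""
        h = 1.0 / steps
        speed = np.linalg.norm(b - a)             # constant for a linear path
        ts = np.linspace(h / 2.0, 1.0 - h / 2.0, steps)
        return speed * h * sum(entropy(a + t * (b - a)) for t in ts)

    a = np.array([0.70, 0.10, 0.10, 0.10])  # frequencies of bases A, C, G, T
    b = np.array([0.25, 0.25, 0.25, 0.25])
    print(linear_path_length(a, b))
    ```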

  15. INHALATION EXPOSURE-RESPONSE METHODOLOGY

    EPA Science Inventory

    The Inhalation Exposure-Response Analysis Methodology Document is expected to provide guidance on the development of the basic toxicological foundations for deriving reference values for human health effects, focusing on the hazard identification and dose-response aspects of the ...

  16. ALTERNATIVES TO DUPLICATE DIET METHODOLOGY

    EPA Science Inventory

    Duplicate Diet (DD) methodology has been used to collect information about the dietary exposure component in the context of total exposure studies. DD methods have been used to characterize the dietary exposure component in the NHEXAS pilot studies. NERL desired to evaluate it...

  17. ESP Methodology for Science Lecturers.

    ERIC Educational Resources Information Center

    Rogers, Angela; Mulyana, Cukup

    A program designed to teach university science lecturers in Indonesia how to design and teach one-semester courses in English for special purposes (ESP) is described. The program provided lecturers with training in language teaching methodology and course design. The piloting of the teacher training course, focusing on physics instruction, is…

  18. A methodology for string resolution

    SciTech Connect

    Karonis, N.T.

    1992-11-01

    In this paper we present a methodology, not a tool. We present this methodology with the intent that it be adopted, on a case by case basis, by each of the existing tools in EPICS. In presenting this methodology, we describe each of its two components in detail and conclude with an example depicting how the methodology can be used across a pair of tools. The task of any control system is to provide access to the various components of the machine being controlled, for example, the Advanced Photon Source (APS). By access, we mean the ability to monitor the machine's status (reading) as well as the ability to explicitly change its status (writing). The Experimental Physics and Industrial Control System (EPICS) is a set of tools, designed to act in concert, that allows one to construct a control system. EPICS provides the ability to construct a control system that allows reading and writing access to the machine. It does this through the notion of databases. Each of the components of the APS that is accessed by the control system is represented in EPICS by a set of named database records. Once this abstraction is made, from physical device to named database records, the process of monitoring and changing the state of that device becomes the simple process of reading and writing information from and to its associated named records.
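
    The named-record abstraction can be illustrated with a modern Channel Access client; the sketch below uses the pyepics library (which postdates this paper) and hypothetical record names, not real APS records, to show what reading and writing named database records looks like.

    ```python
    # Illustration only: read (monitor) and write (control) access to named
    # EPICS database records over Channel Access, via the pyepics client.
    from epics import caget, caput

    current = caget("SR:DCCT:CURRENT")     # read a named record (monitoring)
    print("stored beam current:", current)

    caput("SR:CORR:H01:SETPOINT", 1.25)    # write a named record (control)
    ```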

  19. TESOL Methodology: Five Annotated Bibliographies.

    ERIC Educational Resources Information Center

    Antoun, Elizabeth; Gebhard, Jerry G.; Gutwein, Geraldine; Kim, Won-Hyeong; Staben, Jennifer; York, Aimee

    The five bibliographies included here were selected from those of a graduate-level class in methodology for teaching English to speakers of other languages (TESOL). They were selected based on the quality of research and writing, interest the topic might have for other English-as-a-second-language teachers, and student permission. They include:…

  20. Philosophy, Methodology and Action Research

    ERIC Educational Resources Information Center

    Carr, Wilfred

    2006-01-01

    The aim of this paper is to examine the role of methodology in action research. It begins by showing how, as a form of inquiry concerned with the development of practice, action research is nothing other than a modern 20th century manifestation of the pre-modern tradition of practical philosophy. It then draws in Gadamer's powerful vindication of…

  1. Analytical Utility of Campylobacter Methodologies

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The National Advisory Committee on Microbiological Criteria for Foods (NACMCF, or the Committee) was asked to address the analytical utility of Campylobacter methodologies in preparation for an upcoming United States Food Safety and Inspection Service (FSIS) baseline study to enumerate Campylobacter...

  2. Analyzing Media: Metaphors as Methodologies.

    ERIC Educational Resources Information Center

    Meyrowitz, Joshua

    Students have little intuitive insight into the process of thinking and structuring ideas. The image of metaphor for a phenomenon acts as a kind of methodology for the study of the phenomenon by (1) defining the key issues or problems; (2) shaping the type of research questions that are asked; (3) defining the type of data that are searched out;…

  3. Psychophysical estimation of speed discrimination. I. Methodology.

    PubMed

    Lakshminarayanan, Vasudevan; Raghuram, Aparna; Khanna, Ritu

    2005-10-01

    Thresholds were assessed for a speed discrimination task with a pair of luminance-defined drifting gratings. The design and results of a series of experiments dealing in general with speed discrimination are described. Results show that for a speed discrimination task using drifting gratings, simultaneous presentation of the pair of gratings (spatially separated) was preferred over sequential presentation (temporally separated) in order to minimize the effects of eye movements and tracking. An interstimulus interval of at least 1000 ms was necessary to prevent motion aftereffects on subsequently viewed stimuli. For the two reference speeds tested (2 and 8 deg/s), using identical spatial frequencies or randomizing the spatial frequency of the pair of gratings did not affect speed discrimination thresholds. Implementing a staircase method of estimating thresholds was preferred over the method of constant stimuli or the method of limits. The results of these experiments were used to define the methodology for an investigation of aging and motion perception. These results will be of interest and use to psychophysicists designing and implementing speed discrimination paradigms.
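
    The staircase procedure preferred above can be sketched as follows; the 1-up/2-down rule and the simulated observer are illustrative assumptions (the abstract does not specify the exact staircase rule used).

    ```python
    # Minimal sketch of a 1-up/2-down staircase: the speed increment shrinks
    # after two consecutive correct responses and grows after each error,
    # converging near the 70.7%-correct point of the psychometric function.
    import math
    import random

    def p_correct(delta, threshold=1.0, slope=0.2):
        """Toy psychometric function rising from 0.5 (guessing)."""
        return 0.5 + 0.5 / (1.0 + math.exp(-(delta - threshold) / slope))

    def staircase(trials=400, delta=3.0, step=1.26):
        run, last_dir, reversals = 0, None, []
        for _ in range(trials):
            if random.random() < p_correct(delta):
                run += 1
                if run < 2:
                    continue
                run, direction = 0, "down"
                delta /= step                   # two correct: make it harder
            else:
                run, direction = 0, "up"
                delta *= step                   # one error: make it easier
            if last_dir and direction != last_dir:
                reversals.append(delta)         # record each reversal point
            last_dir = direction
        tail = reversals[-8:]
        return sum(tail) / len(tail)            # mean of the final reversals

    random.seed(1)
    print("threshold estimate:", round(staircase(), 3))
    ```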

  4. Cancer and Aging: Epidemiology and Methodological Challenges

    PubMed Central

    Pedersen, Jacob K; Engholm, Gerda; Skytthe, Axel; Christensen, Kaare

    2016-01-01

    Epidemiological cancer data shed light on key questions within basic science, clinical medicine and public health. For decades, Denmark has had linkable health registers that contain individual-level data on the entire population with virtually complete follow-up. This has enabled high quality studies of cancer epidemiology and minimized the challenges often faced in many countries, such as uncertain identification of the study base, age misreporting, and low validity of the cancer diagnoses. However, methodological challenges still remain to be addressed, especially in cancer epidemiology studies among the elderly and the oldest-old. For example, a characteristic pattern for many cancer types is that the incidence increases up to a maximum at about ages 75 to 90 years and is then followed by a decline or a leveling off at the oldest ages. It has been suggested that the oldest individuals may be asymptomatic, or even insusceptible to cancer. An alternative interpretation is that this pattern is an artifact due to lower diagnostic intensity among the elderly and oldest-old caused by higher levels of co-morbidities in this age group. Currently, the available cancer epidemiology data are not able to provide clear evidence for any of these hypotheses. PMID:26825001

  5. A pollution reduction methodology for chemical process simulators

    SciTech Connect

    Mallick, S.K.; Cabezas, H.; Bare, J.C.; Sikdar, S.K.

    1996-11-01

    A pollution minimization methodology was developed for chemical process design using computer simulation. It is based on a pollution balance that at steady state is used to define a pollution index with units of mass of pollution per mass of products. The pollution balance has been modified by weighting the mass flowrate of each pollutant by its potential environmental impact score. This converts the mass balance into an environmental impact balance. This balance defines an impact index with units of environmental impact per mass of products. The impact index measures the potential environmental effects of process wastes. Three different schemes for chemical ranking were considered: (1) no ranking, (2) simple ranking from 0 to 3, and (3) ranking by a scientifically derived measure of human health and environmental effects. Use of the methodology is illustrated with two examples from the production of (1) methyl ethyl ketone and (2) synthetic ammonia.
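
    The two indices can be made concrete in a few lines; the streams, flowrates and impact scores below are illustrative stand-ins, not values from the paper.

    ```python
    # Sketch of the pollution index (mass of pollutants per mass of products)
    # and the impact-weighted index described above.
    pollutant_streams = {
        # name: (mass flowrate, kg/h;  potential environmental impact score)
        "acetone_vent":  (12.0, 1.8),
        "aqueous_purge": (340.0, 0.3),
        "heavy_ends":    (25.0, 2.5),
    }
    product_rate = 5000.0  # kg/h of products leaving the process

    pollution_index = sum(m for m, _ in pollutant_streams.values()) / product_rate
    impact_index = sum(m * s for m, s in pollutant_streams.values()) / product_rate

    print(f"pollution index: {pollution_index:.4f} kg waste / kg product")
    print(f"impact index:    {impact_index:.4f} impact units / kg product")
    ```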

  6. Minimizing inter-microscope variability in dental microwear texture analysis

    NASA Astrophysics Data System (ADS)

    Arman, Samuel D.; Ungar, Peter S.; Brown, Christopher A.; DeSantis, Larisa R. G.; Schmidt, Christopher; Prideaux, Gavin J.

    2016-06-01

    A common approach to dental microwear texture analysis (DMTA) uses confocal profilometry in concert with scale-sensitive fractal analysis to help understand the diets of extinct mammals. One of the main benefits of DMTA over other methods is the repeatable, objective manner of data collection. This repeatability, however, is threatened by variation in the results of DMTA of the same dental surfaces yielded by different microscopes. Here we compare DMTA data of five species of kangaroos measured on seven profilers of varying specifications. Comparison between microscopes confirms that inter-microscope differences are present, but we show that deployment of a number of automated treatments to remove measurement noise can help minimize inter-microscope differences. Applying these same treatments to a published hominin DMTA dataset shows that they alter some significant differences between dietary groups. Minimizing microscope variability while maintaining interspecific dietary differences therefore requires that these factors be balanced in determining appropriate treatments. The process outlined here offers a way to compare data between microscopes, which is essential for ongoing DMTA research. In addition, the process undertaken, including consideration of other elements of DMTA protocols, also promises to streamline methodology, remove measurement noise and, in doing so, optimize recovery of a reliable dietary signature.

  7. Feminist Methodologies and Engineering Education Research

    ERIC Educational Resources Information Center

    Beddoes, Kacey

    2013-01-01

    This paper introduces feminist methodologies in the context of engineering education research. It builds upon other recent methodology articles in engineering education journals and presents feminist research methodologies as a concrete engineering education setting in which to explore the connections between epistemology, methodology and theory.…

  8. Minimal formulation of joint motion for biomechanisms

    PubMed Central

    Seth, Ajay; Sherman, Michael; Eastman, Peter; Delp, Scott

    2010-01-01

    Biomechanical systems share many properties with mechanically engineered systems, and researchers have successfully employed mechanical engineering simulation software to investigate the mechanical behavior of diverse biological mechanisms, ranging from biomolecules to human joints. Unlike their man-made counterparts, however, biomechanisms rarely exhibit the simple, uncoupled, pure-axial motion that is engineered into mechanical joints such as sliders, pins, and ball-and-socket joints. Current mechanical modeling software based on internal-coordinate multibody dynamics can formulate engineered joints directly in minimal coordinates, but requires additional coordinates restricted by constraints to model more complex motions. This approach can be inefficient, inaccurate, and difficult for biomechanists to customize. Since complex motion is the rule rather than the exception in biomechanisms, the benefits of minimal coordinate modeling are not fully realized in biomedical research. Here we introduce a practical implementation for empirically-defined internal-coordinate joints, which we call “mobilizers.” A mobilizer encapsulates the observations, measurement frame, and modeling requirements into a hinge specification of the permissible-motion manifold for a minimal set of internal coordinates. Mobilizers support nonlinear mappings that are mathematically equivalent to constraint manifolds but have the advantages of fewer coordinates, no constraints, and exact representation of the biomechanical motion-space—the benefits long enjoyed for internal-coordinate models of mechanical joints. Hinge matrices within the mobilizer are easily specified by user-supplied functions, and provide a direct means of mapping permissible motion derived from empirical data. We present computational results showing substantial performance and accuracy gains for mobilizers versus equivalent joints implemented with constraints. Examples of mobilizers for joints from human biomechanics
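
    Schematically, the hinge-function idea can be sketched as below; the API and the knee-like coupling are invented for illustration and are not the authors' actual implementation.

    ```python
    # Schematic of a "mobilizer": one minimal coordinate q is mapped by a
    # user-supplied hinge function onto a coupled rotation-plus-translation,
    # with no constraint equations at all.
    import numpy as np

    def knee_like_mobilizer(q):
        """Map flexion angle q (rad) to a parent-to-child transform in which
        a small, empirically motivated translation is slaved to the rotation."""
        c, s = np.cos(q), np.sin(q)
        T = np.eye(4)
        T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
        T[:3, 3] = [0.002 * q, -0.005 * q ** 2, 0.0]  # hypothetical coupling
        return T

    print(knee_like_mobilizer(np.deg2rad(30.0)))  # one coordinate, full pose
    ```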

  9. Sensorless Force Sensing for Minimally Invasive Surgery

    PubMed Central

    Zhao, Baoliang; Nelson, Carl A.

    2015-01-01

    Robotic minimally invasive surgery (R-MIS) has achieved success in various procedures; however, the lack of haptic feedback is considered by some to be a limiting factor. The typical method to acquire tool–tissue reaction forces is attaching force sensors on surgical tools, but this complicates sterilization and makes the tool bulky. This paper explores the feasibility of using motor current to estimate tool-tissue forces and demonstrates acceptable results in terms of time delay and accuracy. This sensorless force estimation method sheds new light on the possibility of equipping existing robotic surgical systems with haptic interfaces that require no sensors and are compatible with existing sterilization methods. PMID:27222680
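
    A back-of-envelope version of the current-to-force mapping is below; the torque constant, gear ratio, efficiency and pulley radius are illustrative assumptions, not the authors' calibration.

    ```python
    # Sketch: map measured motor current to an estimated tool-tip force
    # through the torque constant and the drive geometry.
    def tip_force_newtons(current_amps,
                          kt_nm_per_amp=0.05,   # motor torque constant
                          gear_ratio=20.0,      # motor-to-joint reduction
                          efficiency=0.85,      # drive-train losses
                          pulley_radius_m=0.01):
        joint_torque = kt_nm_per_amp * current_amps * gear_ratio * efficiency
        return joint_torque / pulley_radius_m

    print(tip_force_newtons(0.12))  # 120 mA of motor current -> ~10 N
    ```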

  10. Functional minimization problems in image processing

    NASA Astrophysics Data System (ADS)

    Kim, Yunho; Vese, Luminita A.

    2008-02-01

    In this work we wish to recover an unknown image from a blurry version. We solve this inverse problem by energy minimization and regularization. We seek a solution of the form u + v, where u is a function of bounded variation (cartoon component), while v is an oscillatory component (texture), modeled by a Sobolev function with negative degree of differentiability. Experimental results show that this cartoon + texture model better recovers textured details in natural images, by comparison with the more standard models where the unknown is restricted only to the space of functions of bounded variation.
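
    Schematically, an energy of the kind described (written generically here; the paper's exact norms, exponents and weights may differ) is:

    ```latex
    \min_{u \in BV,\; v \in H^{-s}}\;
    \int_\Omega |\nabla u|
    \;+\; \mu\,\|v\|_{H^{-s}}
    \;+\; \lambda \int_\Omega \bigl(f - k * (u + v)\bigr)^2 \, dx,
    \qquad s > 0,
    ```

    where f is the observed blurry image and k the blur kernel, so that u captures the cartoon component and v the oscillatory texture.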

  11. Nonunity gain minimal-disturbance measurement

    SciTech Connect

    Sabuncu, Metin; Andersen, Ulrik L.; Mista, Ladislav Jr.; Fiurasek, Jaromir; Filip, Radim; Leuchs, Gerd

    2007-09-15

    We propose and experimentally demonstrate an optimal nonunity gain Gaussian scheme for partial measurement of an unknown coherent state that causes minimal disturbance of the state. The information gain and the state disturbance are quantified by the noise added to the measurement outcomes and to the output state, respectively. We derive the optimal trade-off relation between the two noises and we show that the trade-off is saturated by nonunity gain teleportation. Optimal partial measurement is demonstrated experimentally using a linear optics scheme with feedforward.

  12. Minimal Hepatic Encephalopathy Impairs Quality of Life

    PubMed Central

    Agrawal, Swastik; Umapathy, Sridharan; Dhiman, Radha K.

    2015-01-01

    Minimal hepatic encephalopathy (MHE) is the mildest form of the spectrum of neurocognitive impairment in cirrhosis. It is a frequent occurrence in patients of cirrhosis and is detectable only by specialized neurocognitive testing. MHE is a clinically significant disorder which impairs daily functioning, driving performance, work capability and learning ability. It also predisposes to the development of overt hepatic encephalopathy, increased falls and increased mortality. This results in impaired quality of life for the patient as well as significant social and economic burden for health providers and care givers. Early detection and treatment of MHE with ammonia lowering therapy can reverse MHE and improve quality of life. PMID:26041957

  13. Minimally invasive splenectomy: an update and review.

    PubMed

    Gamme, Gary; Birch, Daniel W; Karmali, Shahzeer

    2013-08-01

    Laparoscopic splenectomy (LS) has become an established standard of care in the management of surgical diseases of the spleen. The present article is an update and review of current procedures and controversies regarding minimally invasive splenectomy. We review the indications and contraindications for LS as well as preoperative considerations. An individual assessment of the procedures and outcomes of multiport laparoscopic splenectomy, hand-assisted laparoscopic splenectomy, robotic splenectomy, natural orifice transluminal endoscopic splenectomy and single-port splenectomy is included. Furthermore, this review examines postoperative considerations after LS, including the postoperative course of uncomplicated patients, postoperative portal vein thrombosis, infections and malignancy.

  14. Strategies for minimizing nosocomial measles transmission.

    PubMed Central

    Biellik, R. J.; Clements, C. J.

    1997-01-01

    As a result of the highly contagious nature of measles before the onset of rash, nosocomial transmission will remain a threat until the disease is eradicated. However, a number of strategies can minimize its nosocomial spread. It is therefore vital to maximize awareness among health care staff that an individual with measles can enter a health facility at any time and that a continual risk of the nosocomial transmission of measles exists. The present review makes two groups of recommendations: those which are generally applicable to all countries, and certain additional recommendations which may be suitable only for industrialized countries. PMID:9342896

  15. Solar array stepping to minimize array excitation

    NASA Technical Reports Server (NTRS)

    Bhat, Mahabaleshwar K. P. (Inventor); Liu, Tung Y. (Inventor); Plescia, Carl T. (Inventor)

    1989-01-01

    Mechanical oscillations of a mechanism containing a stepper motor, such as a solar-array powered spacecraft, are reduced and minimized by the execution of step movements in pairs of steps, the period between steps being equal to one-half of the period of torsional oscillation of the mechanism. Each pair of steps is repeated at needed intervals to maintain desired continuous movement of the portion of elements to be moved, such as the solar array of a spacecraft. In order to account for uncertainty as well as slow change in the period of torsional oscillation, a command unit may be provided for varying the interval between steps in a pair.
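
    A sketch of the timing rule follows; the numbers are illustrative (the patent text fixes only the half-period spacing between the steps of a pair).

    ```python
    # Sketch: fire stepper steps in pairs separated by half the torsional
    # oscillation period, so the oscillation excited by the second step
    # arrives in anti-phase with, and cancels, that of the first.
    def step_schedule(t_osc_s, pair_interval_s, n_pairs):
        """Firing times: each pair is (t, t + t_osc/2); pairs repeat at the
        interval needed to keep the array tracking."""
        times = []
        for k in range(n_pairs):
            t0 = k * pair_interval_s
            times.extend([t0, t0 + t_osc_s / 2.0])
        return times

    print(step_schedule(t_osc_s=4.0, pair_interval_s=60.0, n_pairs=3))
    # [0.0, 2.0, 60.0, 62.0, 120.0, 122.0]
    ```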

  16. Qualifying and quantifying minimal hepatic encephalopathy.

    PubMed

    Morgan, Marsha Y; Amodio, Piero; Cook, Nicola A; Jackson, Clive D; Kircheis, Gerald; Lauridsen, Mette M; Montagnese, Sara; Schiff, Sami; Weissenborn, Karin

    2016-12-01

    Minimal hepatic encephalopathy is the term applied to the neuropsychiatric status of patients with cirrhosis who are unimpaired on clinical examination but show alterations in neuropsychological tests exploring psychomotor speed/executive function and/or in neurophysiological variables. There is no gold standard for the diagnosis of this syndrome. As these patients have, by definition, no recognizable clinical features of brain dysfunction, the primary prerequisite for the diagnosis is careful exclusion of clinical symptoms and signs. A large number of psychometric tests/test systems have been evaluated in this patient group. Of these the best known and validated is the Portal Systemic Hepatic Encephalopathy Score (PHES) derived from a test battery of five paper and pencil tests; normative reference data are available in several countries. The electroencephalogram (EEG) has been used to diagnose hepatic encephalopathy since the 1950s but, once popular, the technology is not as accessible now as it once was. The performance characteristics of the EEG are critically dependent on the type of analysis undertaken; spectral analysis has better performance characteristics than visual analysis; evolving analytical techniques may provide better diagnostic information while the advent of portable wireless headsets may facilitate more widespread use. A large number of other diagnostic tools have been validated for the diagnosis of minimal hepatic encephalopathy including Critical Flicker Frequency, the Inhibitory Control Test, the Stroop test, the Scan package and the Continuous Reaction Time; each has its pros and cons; strengths and weaknesses; protagonists and detractors. Recent AASLD/EASL Practice Guidelines suggest that the diagnosis of minimal hepatic encephalopathy should be based on the PHES test together with one of the validated alternative techniques or the EEG. Minimal hepatic encephalopathy has a detrimental effect on the well-being of patients and their care

  17. The minimal length and quantum partition functions

    NASA Astrophysics Data System (ADS)

    Abbasiyan-Motlaq, M.; Pedram, P.

    2014-08-01

    We study the thermodynamics of various physical systems in the framework of the generalized uncertainty principle that implies a minimal length uncertainty proportional to the Planck length. We present a general scheme to analytically calculate the quantum partition function of the physical systems to first order of the deformation parameter based on the behavior of the modified energy spectrum and compare our results with the classical approach. Also, we find the modified internal energy and heat capacity of the systems for the anti-Snyder framework.
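
    The deformed algebra usually meant by this framework, and the minimal length it implies, are (this form is standard in the minimal-length literature; the paper's conventions may differ by factors):

    ```latex
    [\hat{x}, \hat{p}] = i\hbar \left(1 + \beta \hat{p}^{\,2}\right)
    \;\Longrightarrow\;
    \Delta x\,\Delta p \ \ge\ \frac{\hbar}{2}\left(1 + \beta (\Delta p)^2\right)
    \;\Longrightarrow\;
    (\Delta x)_{\min} = \hbar\sqrt{\beta},
    ```

    and the partition functions are then expanded to first order in the deformation parameter β.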

  18. Minimally Invasive Approach of a Retrocaval Ureter

    PubMed Central

    Pinheiro, Hugo; Ferronha, Frederico; Morales, Jorge; Campos Pinheiro, Luís

    2016-01-01

    The retrocaval ureter is a rare congenital entity, classically managed with open pyeloplasty techniques. The experience obtained with the laparoscopic approach to other, more frequent causes of ureteropelvic junction (UPJ) obstruction has opened the way for the minimally invasive approach to the retrocaval ureter. In our paper, we describe a clinical case of a right retrocaval ureter managed successfully with laparoscopic dismembered pyeloplasty. The key points of the procedure are described. Our results were similar to those published by other urologic centers, which demonstrates the safety and feasibility of the procedure for this condition. PMID:27635277

  19. Area Minimizing Discs in Metric Spaces

    NASA Astrophysics Data System (ADS)

    Lytchak, Alexander; Wenger, Stefan

    2017-03-01

    We solve the classical problem of Plateau in the setting of proper metric spaces. Precisely, we prove that among all disc-type surfaces with prescribed Jordan boundary in a proper metric space there exists an area minimizing disc which moreover has a quasi-conformal parametrization. If the space supports a local quadratic isoperimetric inequality for curves we prove that such a solution is locally Hölder continuous in the interior and continuous up to the boundary. Our results generalize corresponding results of Douglas, Radó and Morrey from the setting of Euclidean space and Riemannian manifolds to that of proper metric spaces.

  20. Minimal model for spoof acoustoelastic surface states

    SciTech Connect

    Christensen, J.; Willatzen, M.; Liang, Z.

    2014-12-15

    Similar to textured perfect electric conductors for electromagnetic waves sustaining artificial or spoof surface plasmons, we present an equivalent phenomenon for the case of sound. Aided by a minimal model that is able to capture the complex wave interaction of elastic cavity modes and airborne sound radiation in perfectly rigid panels, we construct designer acoustoelastic surface waves that are entirely controlled by the geometrical environment. Comparisons to results obtained by full-wave simulations confirm the feasibility of the model, and we demonstrate illustrative examples such as resonant transmissions and waveguiding to show a few of the many cases where spoof elastic surface waves are useful.

  1. Periodical cicadas: A minimal automaton model

    NASA Astrophysics Data System (ADS)

    de O. Cardozo, Giovano; de A. M. M. Silvestre, Daniel; Colato, Alexandre

    2007-08-01

    The Magicicada spp. life cycles, with their prime periods and highly synchronized emergence, have defied reasonable scientific explanation since their discovery. During the last decade several models and explanations for this phenomenon appeared in the literature, along with a great deal of discussion. Despite this considerable effort, there is no final conclusion about this long-standing biological problem. Here, we construct a minimal automaton model without predation/parasitism which reproduces some of these aspects. Our results point towards competition between different strains with a limited dispersal threshold as the main factor leading to the emergence of prime-numbered life cycles.
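
    The arithmetic that makes prime periods special is easy to exhibit (this is a companion calculation, not the automaton model itself): two strains co-emerge, and hence compete, once every least common multiple of their cycle lengths.

    ```python
    # Strains with cycle lengths a and b co-emerge every lcm(a, b) years,
    # so prime-numbered cycles meet rival strains least often.
    from math import lcm

    cycles = range(12, 19)
    for a in cycles:
        soonest = min(lcm(a, b) for b in cycles if b != a)
        print(f"{a}-year strain first meets a competitor after {soonest} years")
    # The primes 13 and 17 stand out with the rarest co-emergences.
    ```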

  2. Minimal relativistic three-particle equations

    SciTech Connect

    Lindesay, J.

    1981-07-01

    A minimal self-consistent set of covariant and unitary three-particle equations is presented. Numerical results are obtained for three-particle bound states, elastic scattering and rearrangement of bound pairs with a third particle, and amplitudes for breakup into states of three free particles. The mathematical form of the three-particle bound state equations is explored; constraints are set upon the range of eigenvalues and number of eigenstates of these one parameter equations. The behavior of the number of eigenstates as the two-body binding energy decreases to zero in a covariant context generalizes results previously obtained non-relativistically by V. Efimov.

  3. Minimally invasive surgery for esophageal achalasia.

    PubMed

    Chen, Huan-Wen; Du, Ming

    2016-07-01

    Esophageal achalasia is a functional disorder of the esophagus caused by neuromuscular dysfunction. Its main features are the absence of esophageal peristalsis, elevated lower esophageal sphincter pressure, and reduced sphincter relaxation in response to swallowing. Dissection of the lower esophageal muscular layer is one of the main treatments for esophageal achalasia. At present, complete thoracoscopic dissection of the esophageal muscular layer is one of the treatments for this disease. Drawing on our experience in minimally invasive esophageal surgery, we improved the incision and operative procedure and adopted complete thoracoscopic dissection of the esophageal muscular layer in the treatment of esophageal achalasia.

  4. Flavored dark matter beyond Minimal Flavor Violation

    DOE PAGES

    Agrawal, Prateek; Blanke, Monika; Gemmler, Katrin

    2014-10-13

    We study the interplay of flavor and dark matter phenomenology for models of flavored dark matter interacting with quarks. We allow an arbitrary flavor structure in the coupling of dark matter with quarks. This coupling is assumed to be the only new source of violation of the Standard Model flavor symmetry extended by a U(3)χ associated with the dark matter. We call this ansatz Dark Minimal Flavor Violation (DMFV) and highlight its various implications, including an unbroken discrete symmetry that can stabilize the dark matter. As an illustration we study a Dirac fermionic dark matter χ which transforms as a triplet under U(3)χ, and is a singlet under the Standard Model. The dark matter couples to right-handed down-type quarks via a colored scalar mediator Φ with a coupling λ. We identify a number of "flavor-safe" scenarios for the structure of λ which are beyond Minimal Flavor Violation. Also, for dark matter and collider phenomenology we focus on the well-motivated case of b-flavored dark matter. Furthermore, the combined flavor and dark matter constraints on the parameter space of λ turn out to be interesting intersections of the individual ones. LHC constraints on simplified models of squarks and sbottoms can be adapted to our case, and monojet searches can be relevant if the spectrum is compressed.

  5. Flavored dark matter beyond Minimal Flavor Violation

    SciTech Connect

    Agrawal, Prateek; Blanke, Monika; Gemmler, Katrin

    2014-10-13

    We study the interplay of flavor and dark matter phenomenology for models of flavored dark matter interacting with quarks. We allow an arbitrary flavor structure in the coupling of dark matter with quarks. This coupling is assumed to be the only new source of violation of the Standard Model flavor symmetry extended by a U(3)χ associated with the dark matter. We call this ansatz Dark Minimal Flavor Violation (DMFV) and highlight its various implications, including an unbroken discrete symmetry that can stabilize the dark matter. As an illustration we study a Dirac fermionic dark matter χ which transforms as a triplet under U(3)χ, and is a singlet under the Standard Model. The dark matter couples to right-handed down-type quarks via a colored scalar mediator Φ with a coupling λ. We identify a number of "flavor-safe" scenarios for the structure of λ which are beyond Minimal Flavor Violation. Also, for dark matter and collider phenomenology we focus on the well-motivated case of b-flavored dark matter. Furthermore, the combined flavor and dark matter constraints on the parameter space of λ turn out to be interesting intersections of the individual ones. LHC constraints on simplified models of squarks and sbottoms can be adapted to our case, and monojet searches can be relevant if the spectrum is compressed.

  6. One hospital's road to waste minimization.

    PubMed

    Hooper, D M

    1994-05-01

    There are many new and exciting waste minimization programs being offered to healthcare facilities. Companies are now making reusable operating packs and gowns that are more efficient than disposables. The selling point is that the system will save healthcare money! The reusable programs do save disposal costs for an institution. Shore Memorial has scheduled a trial evaluation for reusable operating room linens to begin May 1, 1994. The concept has not been difficult to sell to physicians and staff. Perhaps this is because people are generally more aware of their environment and the reasons why it should be protected. The hospital will also be evaluating an IV bottle and bag recycling program. The New Jersey Department of Environmental Protection Agency has given approval to proceed with this type of recycling program, and Shore Memorial is in the process of scheduling this trial program with a local vendor. Waste reduction and recycling in healthcare settings will continue to be challenging because of the diversity of the wastestream and the changing environment facing healthcare. Certainly, healthcare has as much of a responsibility to the well-being of patients as it does to keeping the environment healthy. Returning to the "old way" of doing things, such as reusables, does not have a negative impact on people, but it does have an impact on the environment. Shore Memorial believes it is moving in the right direction with its waste minimization program to make a positive environmental impact.

  7. Performance monitoring during a minimal group manipulation.

    PubMed

    Pfabigan, Daniela M; Holzner, Marie-Theres; Lamm, Claus

    2016-10-01

    The on-going (self-)monitoring of our behaviour is inextricably intertwined with the surrounding social context. In this study, we investigated whether a minimal group paradigm assigning individuals to arbitrary group categories is powerful enough to induce changes in behavioural, psychophysiological and event-related potential correlates of performance monitoring. Following arbitrary group assignment based on ostensible task performance and a group identification task, 22 volunteers performed a flanker task in both in-group and out-group contexts while electroencephalography was recorded. More errors were committed in the out-group compared with the in-group context. Error-related negativity amplitudes were larger for in-group compared with out-group errors. However, subsequent processing reflected in late Pe amplitudes and stimulus-driven conflict reflected in N2 amplitudes were not affected by the group context. Heart rate deceleration (during both correct and incorrect trials) tended to be more pronounced during the out-group compared with the in-group context. This surprising observation was corroborated by subjective ratings of performance satisfaction, in which participants reported higher satisfaction with their out-group performance. This study identified specific stimulus evaluation processes to be affected by a minimal group manipulation and thereby demonstrated transient top-down effects of a social context manipulation on performance monitoring.

  8. Minimally Invasive Laminectomy in Spondylolisthetic Lumbar Stenosis

    PubMed Central

    Caralopoulos, Ilias N.; Bui, Cuong J.

    2014-01-01

    Background: Degenerative lumbar stenosis associated with spondylolisthesis is common in elderly patients. The most common symptoms are those of neurogenic claudication with leg pain. Surgery is indicated for those who fail conservative management. The generally accepted recommendation is to perform a laminectomy and a fusion at the involved level. Methods: We reviewed our results for minimally invasive single-level decompression without fusion performed by the senior author in patients with symptomatic lumbar stenosis with spondylolisthesis with no dynamic instability from 2008 to 2011 at a single institution. Outcomes were measured using the visual analog scale (VAS), Prolo Economic Functional Rating Scale, and revised Oswestry Disability Index (ODI) at initial presentation and at 3-month, 6-month, and 1-year follow-up time points. Results: Records for 28 patients (19 males, 9 females) were reviewed. The success rate, defined as improvement in pain and functional outcome without the need for surgical fusion, was 86%. VAS scores decreased by 6.3 points, Prolo scores increased by 3.5 points, and the ODI decreased by 31% at 1 year. All changes were statistically significant. Conclusion: Minimally invasive decompression alone can be a reasonable alternative to decompression and fusion for patients with spondylolisthetic lumbar stenosis and neurogenic claudication with leg pain. Decompression without fusion should be considered for older patients and for patients who are not ideal fusion candidates. PMID:24688331

  9. Environmental projects. Volume 16: Waste minimization assessment

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The Goldstone Deep Space Communications Complex (GDSCC), located in the Mojave Desert, is part of the National Aeronautics and Space Administration's (NASA's) Deep Space Network (DSN), the world's largest and most sensitive scientific telecommunications and radio navigation network. The Goldstone Complex is operated for NASA by the Jet Propulsion Laboratory. At present, activities at the GDSCC support the operation of nine parabolic dish antennas situated at five separate locations known as 'sites.' Each of the five sites at the GDSCC has one or more antennas, called 'Deep Space Stations' (DSS's). In the course of operation of these DSS's, various hazardous and non-hazardous wastes are generated. In 1992, JPL retained Kleinfelder, Inc., San Diego, California, to quantify the various streams of hazardous and non-hazardous wastes generated at the GDSCC. In June 1992, Kleinfelder, Inc., submitted a report to JPL entitled 'Waste Minimization Assessment.' This present volume is a JPL-expanded version of the Kleinfelder, Inc. report. The 'Waste Minimization Assessment' report did not find any deficiencies in the various waste-management programs now practiced at the GDSCC, and it found that these programs are being carried out in accordance with environmental rules and regulations.

  10. Osmosis in a minimal model system

    NASA Astrophysics Data System (ADS)

    Lion, Thomas W.; Allen, Rosalind J.

    2012-12-01

    Osmosis is one of the most important physical phenomena in living and soft matter systems. While the thermodynamics of osmosis is well understood, the underlying microscopic dynamical mechanisms remain the subject of discussion. Unravelling these mechanisms is a prerequisite for understanding osmosis in non-equilibrium systems. Here, we investigate the microscopic basis of osmosis, in a system at equilibrium, using molecular dynamics simulations of a minimal model in which repulsive solute and solvent particles differ only in their interactions with an external potential. For this system, we can derive a simple virial-like relation for the osmotic pressure. Our simulations support an intuitive picture in which the solvent concentration gradient, at osmotic equilibrium, arises from the balance between an outward force, caused by the increased total density in the solution, and an inward diffusive flux caused by the decreased solvent density in the solution. While more complex effects may occur in other osmotic systems, our results suggest that they are not required for a minimal picture of the dynamic mechanisms underlying osmosis.

  11. Singlet-stabilized minimal gauge mediation

    NASA Astrophysics Data System (ADS)

    Curtin, David; Tsai, Yuhsin

    2011-04-01

    We propose singlet-stabilized minimal gauge mediation as a simple Intriligator, Seiberg and Shih-based model of direct gauge mediation which avoids both light gauginos and Landau poles. The hidden sector is a massive s-confining supersymmetric QCD that is distinguished by a minimal SU(5) flavor group. The uplifted vacuum is stabilized by coupling the meson to an additional singlet sector with its own U(1) gauge symmetry via nonrenormalizable interactions suppressed by a higher scale Λ_UV in the electric theory. This generates a nonzero vacuum expectation value for the singlet meson via the inverted hierarchy mechanism, but requires tuning to a precision ~(Λ/Λ_UV)^2, which is ~10^-4. In the course of this analysis we also outline some simple model-building rules for stabilizing uplifted-ISS models, which lead us to conclude that meson deformations are required (or at least heavily favored) to stabilize the adjoint component of the magnetic meson.

  12. Gamma ray tests of Minimal Dark Matter

    SciTech Connect

    Cirelli, Marco; Sala, Filippo; Taoso, Marco; Hambye, Thomas; Panci, Paolo

    2015-10-01

    We reconsider the model of Minimal Dark Matter (a fermionic, hypercharge-less quintuplet of the EW interactions) and compute its gamma ray signatures. We compare them with a number of gamma ray probes: the galactic halo diffuse measurements, the galactic center line searches and recent dwarf galaxy observations. We find that the original minimal model, whose mass is fixed at 9.4 TeV by the relic abundance requirement, is constrained by the line searches from the Galactic Center: it is ruled out if the Milky Way possesses a cuspy profile such as NFW but it is still allowed if it has a cored one. Observations of dwarf spheroidal galaxies are also relevant (in particular searches for lines), and ongoing astrophysical progress on these systems has the potential to eventually rule out the model. We also explore a wider mass range, which applies to the case in which the relic abundance requirement is relaxed. Most of our results can be safely extended to the larger class of multi-TeV WIMP DM annihilating into massive gauge bosons.

  13. Minimal size of a barchan dune.

    PubMed

    Parteli, E J R; Durán, O; Herrmann, H J

    2007-01-01

    Barchans are dunes of high mobility which have a crescent shape and propagate under conditions of unidirectional wind. However, sand dunes only appear above a critical size, which scales with the saturation distance of the sand flux [P. Hersen, S. Douady, and B. Andreotti, Phys. Rev. Lett. 89, 264301 (2002); B. Andreotti, P. Claudin, and S. Douady, Eur. Phys. J. B 28, 321 (2002); G. Sauermann, K. Kroy, and H. J. Herrmann, Phys. Rev. E 64, 31305 (2001)]. It has been suggested [P. Hersen, S. Douady, and B. Andreotti, Phys. Rev. Lett. 89, 264301 (2002)] that this flux fetch distance is itself constant. This, however, could not explain the proto-size of barchan dunes, which often occur in coastal areas of high littoral drift, and the scale of dunes on Mars. In the present work, we show from three-dimensional calculations of sand transport that the size and the shape of the minimal barchan dune depend on the wind friction speed and the sand flux on the area between dunes in a field. Our results explain the common appearance of barchans a few tens of centimeters high which are observed along coasts. Furthermore, we find that the rate at which grains enter saltation on Mars is one order of magnitude higher than on Earth, and is relevant to correctly obtain the minimal dune size on Mars.

  14. [Theory and practice of minimally invasive endodontics].

    PubMed

    Jiang, H W

    2016-08-01

    The primary goal of modern endodontic therapy is to achieve the long-term retention of a functional tooth by preventing or treating pulpitis or apical periodontitis. The long-term retention of an endodontically treated tooth is correlated with the remaining amount of tooth tissue and the quality of the restoration after root canal filling. In recent years, there has been rapid progress and development in the basic research of endodontic biology, instruments and applied materials, making treatment procedures safer, more accurate, and more efficient. Thus, minimally invasive endodontics (MIE) has received increasing attention at present. MIE aims to preserve the maximum amount of tooth structure during root canal therapy, and the concept covers the whole process of diagnosis and treatment of teeth. This review article focuses on describing the minimally invasive concepts and operating essentials in endodontics, from diagnosis and treatment planning to the access opening, pulp cavity finishing, root canal cleaning and shaping, 3-dimensional root canal filling and restoration after root canal treatment.

  15. Surgical efficacy of minimally invasive thoracic discectomy.

    PubMed

    Elhadi, Ali M; Zehri, Aqib H; Zaidi, Hasan A; Almefty, Kaith K; Preul, Mark C; Theodore, Nicholas; Dickman, Curtis A

    2015-11-01

    We aimed to determine the clinical indications and surgical outcomes for thoracoscopic discectomy. Thoracic disc disease is a rare degenerative process. Thoracoscopic approaches serve to minimize tissue injury during the approach, but critics argue that this comes at the cost of surgical efficacy. Current reports in the literature are limited to small institutional patient series. We systematically identified all English language articles on thoracoscopic discectomy with at least two patients, published from 1994 to 2013 on MEDLINE, Science Direct, and Google Scholar. We analyzed 12 articles that met the inclusion criteria, five prospective and seven retrospective studies comprising 545 surgical patients. The overall complication rate was 24% (n=129), with reported complications ranging from intercostal neuralgia (6.1%), atelectasis (2.8%), and pleural effusion (2.6%), to more severe complications such as pneumonia (0.8%), pneumothorax (1.3%), and venous thrombosis (0.2%). The average reported postoperative follow-up was 20.5 months. Complete resolution of symptoms was reported in 79% of patients, improvement with residual symptoms in 10.2%, no change in 9.6%, and worsening in 1.2%. The minimally invasive endoscopic approaches to the thoracic spine among selected patients demonstrate excellent clinical efficacy and acceptable complication rates, comparable to the open approaches. Disc herniations confined to a single level, with small or no calcifications, are ideal for such an approach, whereas patients with calcified discs adherent to the dura would benefit from an open approach.

  16. Minimally invasive treatment options in fixed prosthodontics.

    PubMed

    Edelhoff, Daniel; Liebermann, Anja; Beuer, Florian; Stimmelmayr, Michael; Güth, Jan-Frederik

    2016-03-01

    Minimally invasive treatment options have become increasingly feasible in restorative dentistry, due to the introduction of the adhesive technique in combination with restorative materials featuring translucent properties similar to those of natural teeth. Mechanical anchoring of restorations via conventional cementation represents a predominantly subtractive treatment approach that is gradually being superseded by a primarily defect-oriented additive method in prosthodontics. Modifications of conventional treatment procedures have led to the development of an economical approach to the removal of healthy tooth structure. This is possible because the planned treatment outcome is defined in a wax-up before the treatment is commenced and this wax-up is subsequently used as a reference during tooth preparation. Similarly, resin-bonded FDPs and implants have made it possible to preserve the natural tooth structure of potential abutment teeth. This report describes a number of clinical cases to demonstrate the principles of modern prosthetic treatment strategies and discusses these approaches in the context of minimally invasive prosthetic dentistry.

  17. Navy Shipboard Hazardous Material Minimization Program

    SciTech Connect

    Bieberich, M.J.; Robinson, P.; Chastain, B.

    1994-12-31

    The use of hazardous (and potentially hazardous) materials in shipboard cleaning applications has proliferated as new systems and equipment have entered the fleet to reside alongside existing equipment. With the growing environmental awareness (and additional, more restrictive regulations) at all levels/echelon commands of the DoD, the Navy has initiated a proactive program to minimize or eliminate these hazardous materials at the source. This paper will focus on the current Shipboard Hazardous Materials Minimization Program initiatives, including the identification of authorized HM currently used onboard, identification of potential substitute materials for HM replacement, identification of new cleaning technologies and processes/procedures, and identification of technical documents which will require revision to eliminate the procurement of HMs into the federal supply system. Also discussed will be the anticipated path required to implement the changes into the fleet and the automated decision processes (substitution algorithm) currently employed. The paper will also present the most recent technologies identified for approval or additional testing and analysis, including: supercritical CO₂ cleaning, high pressure blasting (H₂O + baking soda), aqueous and semi-aqueous cleaning materials and processes, solvent replacements and dedicated parts washing systems with internal filtering capabilities, and automated software for solvent/cleaning process substitute selection. Along with these technological advances, data availability (from on-line databases and CD-ROM database libraries) will be identified and discussed.

  18. Minimally invasive colopexy for pediatric Chilaiditi syndrome.

    PubMed

    Blevins, Wayne A; Cafasso, Danielle E; Fernandez, Minela; Edwards, Mary J

    2011-03-01

    Chilaiditi syndrome is a rare disorder characterized by abdominal pain, respiratory distress, constipation, and vomiting in association with Chilaiditi's sign. Chilaiditi's sign is the finding on plain roentgenogram of colonic interposition between the liver and diaphragm and is usually asymptomatic. Surgery is typically reserved for cases of catastrophic colonic volvulus or perforation because of the syndrome. We present a case of a 6-year-old boy who presented with Chilaiditi syndrome and resulting failure to thrive because of severe abdominal pain and vomiting, which did not improve with laxatives and dietary changes. He underwent a laparoscopic gastrostomy tube placement and laparoscopic colopexy of the transverse colon to the falciform ligament and anterior abdominal wall. Postoperatively, his symptoms resolved completely, as did his failure to thrive. His gastrostomy tube was removed 3 months after surgery and never required use. This is the first case of Chilaiditi syndrome in the pediatric literature we are aware of that was treated with an elective, minimally invasive colopexy. In cases of severe Chilaiditi syndrome refractory to medical treatment, a minimally invasive colopexy should be considered as a possible treatment option and potentially offered before development of life-threatening complications such as volvulus or perforation.

  19. Gamma ray tests of Minimal Dark Matter

    SciTech Connect

    Cirelli, Marco; Hambye, Thomas; Panci, Paolo; Sala, Filippo; Taoso, Marco

    2015-10-12

    We reconsider the model of Minimal Dark Matter (a fermionic, hypercharge-less quintuplet of the EW interactions) and compute its gamma ray signatures. We compare them with a number of gamma ray probes: the galactic halo diffuse measurements, the galactic center line searches and recent dwarf galaxy observations. We find that the original minimal model, whose mass is fixed at 9.4 TeV by the relic abundance requirement, is constrained by the line searches from the Galactic Center: it is ruled out if the Milky Way possesses a cuspy profile such as NFW but it is still allowed if it has a cored one. Observations of dwarf spheroidal galaxies are also relevant (in particular searches for lines), and ongoing astrophysical progress on these systems has the potential to eventually rule out the model. We also explore a wider mass range, which applies to the case in which the relic abundance requirement is relaxed. Most of our results can be safely extended to the larger class of multi-TeV WIMP DM annihilating into massive gauge bosons.

  20. MR imaging guidance for minimally invasive procedures

    NASA Astrophysics Data System (ADS)

    Wong, Terence Z.; Kettenbach, Joachim; Silverman, Stuart G.; Schwartz, Richard B.; Morrison, Paul R.; Kacher, Daniel F.; Jolesz, Ferenc A.

    1998-04-01

    Image guidance is one of the major challenges common to all minimally invasive procedures including biopsy, thermal ablation, endoscopy, and laparoscopy. This is essential for (1) identifying the target lesion, (2) planning the minimally invasive approach, and (3) monitoring the therapy as it progresses. MRI is an ideal imaging modality for this purpose, providing high soft tissue contrast and multiplanar imaging capability with no ionizing radiation. An interventional/surgical MRI suite has been developed at Brigham and Women's Hospital which provides multiplanar imaging guidance during surgery, biopsy, and thermal ablation procedures. The 0.5T MRI system (General Electric Signa SP) features open vertical access, allowing intraoperative imaging to be performed. An integrated navigational system permits near real-time control of imaging planes, and provides interactive guidance for positioning various diagnostic and therapeutic probes. MR imaging can also be used to monitor cryotherapy as well as high temperature thermal ablation procedures using RF, laser, microwave, or focused ultrasound. Design features of the interventional MRI system will be discussed, and techniques will be described for interactive image acquisition and tracking of interventional instruments. Applications for interactive and near-real-time imaging will be presented as well as examples of specific procedures performed using MRI guidance.

  1. Utilization of biocatalysts in cellulose waste minimization

    SciTech Connect

    Woodward, J.; Evans, B.R.

    1996-09-01

    Cellulose, a polymer of glucose, is the principal component of biomass and, therefore, a major source of waste that is either buried or burned. Examples of biomass waste include agricultural crop residues, forestry products, and municipal wastes. Recycling of this waste is important for energy conservation as well as waste minimization, and there is some probability that in the future biomass could become a major energy source and replace fossil fuels that are currently used for fuels and chemicals production. It has been estimated that in the United States, between 100 and 450 million dry tons of agricultural waste and approximately 6 million dry tons of animal waste are produced annually; of the 190 million tons of municipal solid waste (MSW) generated annually, approximately two-thirds is cellulosic in nature and over one-third is paper waste. Interestingly, more than 70% of MSW is landfilled or burned; however, landfill space is becoming increasingly scarce. On a smaller scale, important cellulosic products such as cellulose acetate also present waste problems; an estimated 43 thousand tons of cellulose ester waste are generated annually in the United States. Biocatalysts could be used in cellulose waste minimization and this chapter describes their characteristics and potential in bioconversion and bioremediation processes.

  2. The minimal curvaton-higgs model

    SciTech Connect

    Enqvist, Kari; Lerner, Rose N.; Takahashi, Tomo E-mail: rose.lerner@desy.de

    2014-01-01

    We present the first full study of the minimal curvaton-higgs (MCH) model, which is a minimal interpretation of the curvaton scenario with one real scalar coupled to the standard model Higgs boson. The standard model coupling allows the dynamics of the model to be determined in detail, including effects from the thermal background and from radiative corrections to the potential. The relevant mechanisms for curvaton decay are incomplete non-perturbative decay (delayed by thermal blocking), followed by decay via a dimension-5 non-renormalisable operator. To avoid spoiling the predictions of big bang nucleosynthesis, we find the "bare" curvaton mass to be m_σ ≥ 8 × 10^4 GeV. To match observational data from Planck there is an upper limit on the curvaton-higgs coupling g, between 10^-3 and 10^-2, depending on the mass. This is due to interactions with the thermal background. We find that typically non-Gaussianities are small but that if f_NL is observed in the near future then m_σ ≲ 5 × 10^9 GeV, depending on the Hubble scale during inflation. In a thermal dark matter model, the lower bound on m_σ can increase substantially. The parameter space may also be affected once the baryogenesis mechanism is specified.

  3. Minimally invasive surgery in cancer. Immunological response.

    PubMed

    Bobocea, A C; Trandafir, B; Bolca, C; Cordoş, I

    2012-01-01

    Minimally invasive surgery has produced major changes in treating abdominal malignancies and early stage lung cancer. Laparoscopy and thoracoscopy are less traumatic than open surgery: they allow faster recovery, shorter hospital stays, and better cosmesis. Although these clinical benefits are important, prolonged disease-free interval and long-term survival with improved quality of life are the most important endpoints for oncologic surgery. Major surgery causes significant alteration of the immunological response, of particular importance in oncologic patients, as postoperative immunosuppression has been related to septic complications, lower survival rate, tumor spread and metastases. Clinical studies have shown laparoscopic surgery better preserves the patient's immunological function. Postoperative plasma peak concentrations of IL-6, IL-10, C-reactive protein (CRP) and TNF-alpha were lower after laparoscopic colonic resection. Prospective thoracoscopic VATS lobectomy trials found better preservation of lymphocyte T-cell function and quicker return of proliferative responses to normal, lower levels of CRP, thromboxane and prostacyclin. Immune function is influenced by the extent of surgical trauma. Minimally invasive surgery shows reduced acute-phase responses compared with open procedures and better preservation of cellular immune mechanisms.

  4. Power Minimization techniques for Networked Data Centers.

    SciTech Connect

    Low, Steven; Tang, Kevin

    2011-09-28

    Our objective is to develop a mathematical model to optimize energy consumption at multiple levels in networked data centers, and develop abstract algorithms to optimize not only individual servers, but also coordinate the energy consumption of clusters of servers within a data center and across geographically distributed data centers to minimize the overall energy cost and consumption of brown energy of an enterprise. In this project, we have formulated a variety of optimization models, some stochastic, others deterministic, and have obtained a variety of qualitative results on the structural properties, robustness, and scalability of the optimal policies. We have also systematically derived from these models decentralized algorithms to optimize energy efficiency, and analyzed their optimality and stability properties. Finally, we have conducted preliminary numerical simulations to illustrate the behavior of these algorithms. We draw the following conclusions. First, there is a substantial opportunity to minimize both the amount and the cost of electricity consumption in a network of datacenters, by exploiting the fact that traffic load, electricity cost, and availability of renewable generation fluctuate over time and across geographical locations. Judiciously matching these stochastic processes can optimize the tradeoff between brown energy consumption, electricity cost, and response time. Second, given the stochastic nature of these three processes, real-time dynamic feedback should form the core of any optimization strategy. The key is to develop decentralized algorithms that can be implemented at different parts of the network as simple, local algorithms that coordinate through asynchronous message passing.
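
    As a toy illustration of the kind of optimization these models formalize, the sketch below routes a fixed total workload across three data centers with different electricity prices and capacities by solving a small linear program. All prices, capacities, and the demand figure are hypothetical assumptions; the project's actual models are stochastic, dynamic, and decentralized.

      # Minimal geographical load-balancing sketch (hypothetical numbers):
      # choose the load split x that minimizes total electricity cost.
      import numpy as np
      from scipy.optimize import linprog

      price = np.array([0.12, 0.08, 0.10])     # $/kWh at each site (assumed)
      capacity = np.array([40.0, 25.0, 35.0])  # max load per site (MW)
      demand = 70.0                            # total load to serve (MW)

      # minimize price . x  subject to  sum(x) == demand, 0 <= x_i <= capacity_i
      res = linprog(c=price,
                    A_eq=np.ones((1, 3)), b_eq=[demand],
                    bounds=list(zip(np.zeros(3), capacity)))
      print(res.x, res.fun)                    # optimal split and total cost

    In the setting described above, a program of this kind would be re-solved continuously as loads, prices, and renewable availability fluctuate, which is where the real-time feedback and decentralization questions arise.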

  5. Defect reduction through Lean methodology

    NASA Astrophysics Data System (ADS)

    Purdy, Kathleen; Kindt, Louis; Densmore, Jim; Benson, Craig; Zhou, Nancy; Leonard, John; Whiteside, Cynthia; Nolan, Robert; Shanks, David

    2010-09-01

    Lean manufacturing is a systematic method of identifying and eliminating waste. Use of Lean manufacturing techniques at the IBM photomask manufacturing facility has increased the efficiency and productivity of the photomask process. Tools such as value stream mapping, 5S, and structured problem solving are widely used today. In this paper we describe a step-by-step Lean technique used to systematically decrease defects, resulting in reduced material costs, inspection costs, and cycle time. The method used consists of an 8-step approach commonly referred to as the 8D problem solving process. This process allowed us to identify both prominent issues as well as more subtle problems requiring in-depth investigation. The methodology used is flexible and can be applied to numerous situations. Advantages of the Lean methodology are also discussed.

  6. Diffusion methodology: time to innovate?

    PubMed

    Meyer, Gary

    2004-01-01

    Over the past 60 years, thousands of diffusion studies have been conducted in numerous disciplines of study including sociology, education, communication, marketing, and public health. With few exceptions, these studies have been driven by a methodological approach that has become institutionalized in diffusion research. This approach is characterized by the collection of quantitative data about one innovation gathered from adopters at a single point in time after widespread diffusion has occurred. This dominant approach is examined here in terms of both its strengths and weaknesses and with regard to its contribution to the collective base of understanding the diffusion of innovations. Alternative methodological approaches are proposed and reviewed with consideration for the means by which they may expand the knowledge base.

  7. Methodological assessment of HCC literature

    PubMed Central

    Daniele, G.; Costa, N.; Lorusso, V.; Costa-Maia, J.; Pache, I.; Pirisi, M.

    2013-01-01

    Despite the fact that hepatocellular carcinoma (HCC) represents a major health problem, very few interventions are available for this disease, and only sorafenib is approved for the treatment of advanced disease. Of note, only very few interventions have been thoroughly evaluated over time for HCC patients compared with several hundred in other, equally highly lethal, tumours. Additionally, clinical trials in HCC have often been questioned for poor design and methodological issues. As a consequence, a gap between what is measured in clinical trials and what clinicians have to face in daily practice often occurs. As a result of this scenario, even the most recent guidelines for treatment of HCC patients use low-strength evidence to make recommendations. In this review, we will discuss some of the potential methodological issues hindering a rational development of new treatments for HCC patients. PMID:23715943

  8. Some methodological issues in biosurveillance.

    PubMed

    Fricker, Ronald D

    2011-02-28

    This paper briefly summarizes a short course I gave at the 12th Biennial Centers for Disease Control and Prevention (CDC) and Agency for Toxic Substances and Disease Registry (ATSDR) Symposium held in Decatur, Georgia on April 6, 2009. The goal of this short course was to discuss various methodological issues of biosurveillance detection algorithms, with a focus on the issues related to developing, evaluating, and implementing such algorithms.

  9. [Methods and methodology of pathology].

    PubMed

    Lushnikov, E F

    2016-01-01

    The lecture presents the state of the art of the methodology of human pathology, an area of scientific and practical activity in which specialists produce and systematize objective knowledge of pathology and apply that knowledge in clinical medicine. It considers the objects and subjects of an investigation, the materials and methods of a pathologist, and the results of his/her work.

  10. Software engineering methodologies and tools

    NASA Technical Reports Server (NTRS)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  11. Methodology for astronaut reconditioning research.

    PubMed

    Beard, David J; Cook, Jonathan A

    2017-01-01

    Space medicine offers some unique challenges, especially in terms of research methodology. A specific challenge for astronaut reconditioning involves identifying which aspects of terrestrial research methodology hold and which require modification. This paper reviews this area and presents appropriate solutions where possible. It is concluded that spaceflight rehabilitation research should remain question/problem driven and is broadly similar to terrestrial research on small populations, such as rare diseases and various sports. Astronauts and Medical Operations personnel should be involved at all levels to ensure feasibility of research protocols. There is room for creative and hybrid methodology but careful systematic observation is likely to be more achievable and fruitful than complex trial-based comparisons. Multi-space agency collaboration will be critical to pool data from small groups of astronauts with the accepted use of standardised outcome measures across all agencies. Systematic reviews will be an essential component. Most limitations relate to the inherent small sample size available for human spaceflight research. Early adoption of a co-operative model for spaceflight rehabilitation research is therefore advised.

  12. Energy Efficiency Indicators Methodology Booklet

    SciTech Connect

    Sathaye, Jayant; Price, Lynn; McNeil, Michael; de la rue du Can, Stephane

    2010-05-01

    This Methodology Booklet provides a comprehensive review and guiding principles for constructing energy efficiency indicators, with illustrative examples of application to individual countries. It reviews work done by international agencies and national governments in constructing meaningful energy efficiency indicators that help policy makers to assess changes in energy efficiency over time. Building on past OECD experience and best practices, and the knowledge of these countries' institutions, relevant sources of information to construct an energy indicator database are identified. A framework based on levels of hierarchy of indicators -- spanning from aggregate, macro-level to disaggregated end-use-level metrics -- is presented to help structure the assessment of energy efficiency. In each sector of activity: industry, commercial, residential, agriculture and transport, indicators are presented and recommendations to distinguish the different factors affecting energy use are highlighted. The methodology booklet specifically addresses issues that are relevant to developing indicators where activity is a major factor driving energy demand. A companion spreadsheet tool is available upon request.

  13. Expert System Development Methodology (ESDM)

    NASA Technical Reports Server (NTRS)

    Sary, Charisse; Gilstrap, Lewey; Hull, Larry G.

    1990-01-01

    The Expert System Development Methodology (ESDM) provides an approach to developing expert system software. Because of the uncertainty associated with this process, an element of risk is involved. ESDM is designed to address the issue of risk and to acquire the information needed for this purpose in an evolutionary manner. ESDM presents a life cycle in which a prototype evolves through five stages of development. Each stage consists of five steps, leading to a prototype for that stage. Development may proceed to a conventional development methodology (CDM) at any time if enough has been learned about the problem to write requirements. ESDM produces requirements so that a product may be built with a CDM. ESDM is considered preliminary because it has not yet been applied to actual projects. It has been retrospectively evaluated by comparing the methods used in two ongoing expert system development projects that did not explicitly choose to use this methodology but which provided useful insights into actual expert system development practices and problems.

  14. Minimally Informative Prior Distributions for PSA

    SciTech Connect

    Dana L. Kelly; Robert W. Youngblood; Kurt G. Vedros

    2010-06-01

    A salient feature of Bayesian inference is its ability to incorporate information from a variety of sources into the inference model, via the prior distribution (hereafter simply “the prior”). However, over-reliance on old information can lead to priors that dominate new data. Some analysts seek to avoid this by trying to work with a minimally informative prior distribution. Another reason for choosing a minimally informative prior is to avoid the often-voiced criticism of subjectivity in the choice of prior. Minimally informative priors fall into two broad classes: 1) so-called noninformative priors, which attempt to be completely objective, in that the posterior distribution is determined as completely as possible by the observed data, the most well known example in this class being the Jeffreys prior, and 2) priors that are diffuse over the region where the likelihood function is nonnegligible, but that incorporate some information about the parameters being estimated, such as a mean value. In this paper, we compare four approaches in the second class, with respect to their practical implications for Bayesian inference in Probabilistic Safety Assessment (PSA). The most commonly used such prior, the so-called constrained noninformative prior, is a special case of the maximum entropy prior. This is formulated as a conjugate distribution for the most commonly encountered aleatory models in PSA, and is correspondingly mathematically convenient; however, it has a relatively light tail and this can cause the posterior mean to be overly influenced by the prior in updates with sparse data. A more informative prior that is capable, in principle, of dealing more effectively with sparse data is a mixture of conjugate priors. A particular diffuse nonconjugate prior, the logistic-normal, is shown to behave similarly for some purposes. Finally, we review the so-called robust prior. Rather than relying on the mathematical abstraction of entropy, as does the constrained noninformative prior, the robust prior is constructed with a heavy tail so that the posterior is less sensitive to conflict between the prior and sparse or surprising data.
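
    As a concrete, deliberately simple illustration of such updates: for a Poisson failure rate, the Jeffreys prior leads to a Gamma(x + 0.5, t) posterior after observing x events in exposure time t. The sketch below uses invented data; the paper's comparison concerns alternatives to exactly this kind of light-tailed choice.

      # Jeffreys-prior Bayesian update for a Poisson failure rate (sketch).
      from scipy import stats

      events, exposure_hours = 2, 1.0e4         # hypothetical observed data
      posterior = stats.gamma(a=events + 0.5,   # Jeffreys prior ~ Gamma(0.5, 0)
                              scale=1.0 / exposure_hours)
      print(posterior.mean())                   # posterior mean rate per hour
      print(posterior.interval(0.90))           # 90% credible interval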

  15. Minimizing water consumption when producing hydropower

    NASA Astrophysics Data System (ADS)

    Leon, A. S.

    2015-12-01

    In 2007, hydropower accounted for only 16% of world electricity production, with other renewable sources totaling 3%. Thus, it is not surprising that when alternatives are evaluated for new energy developments, there is a strong impulse toward fossil fuel or nuclear energy as opposed to renewable sources. However, as hydropower schemes are often part of a multipurpose water resources development project, they can often help to finance other components of the project. In addition, hydropower systems and their associated dams and reservoirs provide human well-being benefits, such as flood control and irrigation, and societal benefits such as increased recreational activities and improved navigation. Furthermore, hydropower, due to its associated reservoir storage, can provide flexibility and reliability for energy production in integrated energy systems. The storage capability of hydropower systems acts as a regulating mechanism by which other intermittent and variable renewable energy sources (wind, wave, solar) can play a larger role in providing electricity of commercial quality. Minimizing water consumption for producing hydropower is critical given that overuse of water for energy production may result in a shortage of water for other purposes such as irrigation, navigation or fish passage. This paper presents a dimensional analysis for finding optimal flow discharge and optimal penstock diameter when designing impulse and reaction water turbines for hydropower systems. The objective of this analysis is to provide general insights for minimizing water consumption when producing hydropower. This analysis is based on the geometric and hydraulic characteristics of the penstock, the total hydraulic head and the desired power production. As part of this analysis, various dimensionless relationships between power production, flow discharge and head losses were derived. These relationships were used to draw general insights on determining optimal flow discharge and optimal penstock diameter.
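
    A rough numerical companion to that analysis, under assumed parameter values rather than the paper's dimensionless formulation: with Darcy-Weisbach friction losses in the penstock, the net power is P(Q) = ρgQ(H − h_f(Q))η, and the smallest discharge meeting a power target can be found by a simple scan.

      # Smallest discharge Q meeting a target power, with penstock losses.
      # All parameter values are illustrative assumptions.
      import numpy as np

      rho, g, eta = 1000.0, 9.81, 0.9       # density, gravity, turbine efficiency
      H, L, D, f = 120.0, 500.0, 1.5, 0.02  # head (m), penstock length (m),
                                            # diameter (m), friction factor
      P_target = 2.0e6                      # required power (W)

      def power(Q):
          V = 4.0 * Q / (np.pi * D**2)            # velocity in the penstock
          h_f = f * (L / D) * V**2 / (2.0 * g)    # Darcy-Weisbach head loss
          return rho * g * Q * (H - h_f) * eta    # net hydraulic power

      Q = np.linspace(0.1, 10.0, 2000)
      feasible = Q[power(Q) >= P_target]
      print(feasible[0] if feasible.size else "target unreachable")

    For this simple loss model the power-maximizing discharge occurs where h_f = H/3; pushing the discharge toward that point feeds progressively more water into friction, which is the sort of trade-off the paper's dimensionless relationships make general.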

  16. Simulating granular materials by energy minimization

    NASA Astrophysics Data System (ADS)

    Krijgsman, D.; Luding, S.

    2016-11-01

    Discrete element methods are extremely helpful in understanding the complex behaviors of granular media, as they give valuable insight into all internal variables of the system. In this paper, a novel discrete element method for performing simulations of granular media is presented, based on the minimization of the potential energy in the system. Contrary to most discrete element methods (i.e., soft-particle method, event-driven method, and non-smooth contact dynamics), the system does not evolve by (approximately) integrating Newton's equations of motion in time, but rather by searching for mechanical equilibrium solutions for the positions of all particles in the system, which is mathematically equivalent to locally minimizing the potential energy. The new method allows for the rapid creation of jammed initial conditions (to be used for further studies) and for the simulation of quasi-static deformation problems. The major advantage of the new method is that it allows for truly static deformations. The system does not evolve with time, but rather with the externally applied strain or load, so that there is no kinetic energy in the system, in contrast to other quasi-static methods. The performance of the algorithm is tested for both types of application by looking at the number of iterations required for the system to converge to a stable solution. For each single iteration, the required computational effort scales linearly with the number of particles. During the process of creating initial conditions, the required number of iterations for two-dimensional systems scales with the square root of the number of particles in the system. The required number of iterations increases for systems closer to the jamming packing fraction. For a quasi-static pure shear deformation simulation, the results of the new method are validated by regular soft-particle dynamics simulations. The energy minimization algorithm is able to capture the evolution of the
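
    A stripped-down sketch of the idea, not the authors' algorithm: give soft disks a harmonic overlap repulsion and hand the total potential energy to a generic minimizer, so the packing relaxes to mechanical equilibrium without any time integration. Particle count, stiffness, and box size are arbitrary choices here.

      # Energy-minimization relaxation of overlapping soft disks (toy).
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(0)
      n, radius, k = 12, 0.5, 1.0
      x0 = rng.uniform(0.0, 4.0, size=(n, 2)).ravel()  # random start

      def energy(x):
          p = x.reshape(n, 2)
          d = np.linalg.norm(p[:, None, :] - p[None, :, :], axis=-1)
          overlap = np.clip(2.0 * radius - d, 0.0, None)
          np.fill_diagonal(overlap, 0.0)               # ignore self-distances
          # each pair appears twice in the sum, hence 1/4 rather than 1/2
          return 0.25 * k * np.sum(overlap**2)

      res = minimize(energy, x0, method="L-BFGS-B")
      print(energy(x0), "->", res.fun)                 # energy before/after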

  17. Minimal Intervention Dentistry – A New Frontier in Clinical Dentistry

    PubMed Central

    NK., Bajwa; A, Pathak

    2014-01-01

    Minimally invasive procedures are the new paradigm in health care. Everything from heart bypasses to gall bladder surgeries is being performed with these dynamic new techniques. Dentistry is joining this exciting revolution as well. Minimally invasive dentistry adopts a philosophy that integrates prevention, remineralisation and minimal intervention for the placement and replacement of restorations. Minimally invasive dentistry reaches the treatment objective using the least invasive surgical approach, with the removal of the minimal amount of healthy tissues. This paper reviews in brief the concept of minimal intervention in dentistry. PMID:25177659

  18. Prioritization methodology for chemical replacement

    NASA Technical Reports Server (NTRS)

    Cruit, Wendy; Goldberg, Ben; Schutzenhofer, Scott

    1995-01-01

    Since United States federal legislation has required ozone-depleting chemicals (Class 1 and 2) to be banned from production, the National Aeronautics and Space Administration (NASA) and industry have been required to find other chemicals and methods to replace these target chemicals. This project was initiated to develop a prioritization methodology suitable for assessing and ranking existing processes for replacement 'urgency.' The methodology was produced in the form of a workbook (NASA Technical Paper 3421). The final workbook contains two tools, one for evaluation and one for prioritization. The two tools are interconnected in that they were developed from one central theme - chemical replacement due to imposed laws and regulations. This workbook provides matrices, detailed explanations of how to use them, and a detailed methodology for prioritization of replacement technology. The main objective is to provide a GUIDELINE to help direct the research for replacement technology. The approach for prioritization called for a system which would result in a numerical rating for the chemicals and processes being assessed. A Quality Function Deployment (QFD) technique was used in order to determine numerical values which would correspond to the concerns raised and their respective importance to the process. This workbook defines the approach and the application of the QFD matrix. This technique: (1) provides a standard database for technology that can be easily reviewed, and (2) provides a standard format for information when requesting resources for further research for chemical replacement technology. Originally, this workbook was to be used for Class 1 and Class 2 chemicals, but it was specifically designed to be flexible enough to be used for any chemical used in a process (if the chemical and/or process needs to be replaced). The methodology consists of comparison matrices (and the smaller comparison components) which allow replacement technology
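
    The workbook itself holds the real matrices; the sketch below only mimics the QFD arithmetic it describes, multiplying ratings by concern weights to rank processes by replacement urgency. Every concern, weight, and rating shown is invented for illustration.

      # Toy QFD-style prioritization: weighted ratings -> urgency ranking.
      import numpy as np

      concerns = ["regulatory deadline", "worker exposure", "volume used", "cost"]
      weights = np.array([9, 7, 5, 3])         # relative importance (assumed)

      processes = ["vapor degreasing", "precision wipe", "flux removal"]
      ratings = np.array([[9, 7, 8, 4],        # rows: processes
                          [3, 5, 2, 6],        # columns: concerns, rated 1-9
                          [7, 4, 6, 5]])

      scores = ratings @ weights
      for name, s in sorted(zip(processes, scores), key=lambda t: -t[1]):
          print(name, int(s))                  # higher score = replace sooner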

  19. Minimally invasive surgery for thyroid eye disease.

    PubMed

    Naik, Milind Neilkant; Nair, Akshay Gopinathan; Gupta, Adit; Kamal, Saurabh

    2015-11-01

    Thyroid eye disease (TED) can affect the eye in myriad ways: proptosis, strabismus, eyelid retraction, optic neuropathy, soft tissue changes around the eye and an unstable ocular surface. TED consists of two phases: active, and inactive. The active phase of TED is limited to a period of 12-18 months and is mainly managed medically with immunosuppression. The residual structural changes due to the resultant fibrosis are usually addressed with surgery, the mainstay of which is orbital decompression. These surgeries are performed during the inactive phase. The surgical rehabilitation of TED has evolved over the years: not only the surgical techniques, but also the concepts, and the surgical tools available. The indications for decompression surgery have also expanded in the recent past. This article discusses the technological and conceptual advances of minimally invasive surgery for TED that decrease complications and speed up recovery. Current surgical techniques offer predictable, consistent results with better esthetics.

  20. Complications in gynecological minimal-access oncosurgery.

    PubMed

    Becker, Sven; De Wilde, Rudy Leon

    2016-08-01

    Complications are the limiting factors of all surgeries. More than performing the actual surgery, learning how to avoid complications before, during, and after surgery is the most important task of every surgeon. Severe complications can lead to patient death. Complications such as ureterovaginal fistulas, resulting from <2 s of inattentive preparation, can lead to years of hardship, suffering, accusation, and litigation. Excellent surgery is about performing the right surgery for the right patient without any complications. Minimally invasive surgery in complex cases is technically challenging. This article details the major causes of complications in laparoscopy for the gynecologic cancer patient and presents strategies for prevention, early detection, and intra- and postoperative management.

  1. Reflections concerning triply-periodic minimal surfaces.

    PubMed

    Schoen, Alan H

    2012-10-06

    In recent decades, there has been an explosion in the number and variety of embedded triply-periodic minimal surfaces (TPMS) identified by mathematicians and materials scientists. Only the rare examples of low genus, however, are commonly invoked as shape templates in scientific applications. Exact analytic solutions are now known for many of the low genus examples. The more complex surfaces are readily defined with numerical tools such as Surface Evolver software or the Landau-Ginzburg model. Even though table-top versions of several TPMS have been placed within easy reach by rapid prototyping methods, the inherent complexity of many of these surfaces makes it challenging to grasp their structure. The problem of distinguishing TPMS, which is now acute because of the proliferation of examples, has been addressed by Lord & Mackay (Lord & Mackay 2003 Curr. Sci. 85, 346-362).
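
    For readers who want to handle one of these structures concretely, the level set sin x cos y + sin y cos z + sin z cos x = 0 is a well-known nodal approximation to Schoen's gyroid -- close in appearance, though not an exact minimal surface. A sketch of how one might triangulate it, assuming scikit-image is available:

      # Nodal-surface approximation to the gyroid, triangulated per unit cell.
      import numpy as np
      from skimage.measure import marching_cubes

      t = np.linspace(0.0, 2.0 * np.pi, 64)
      x, y, z = np.meshgrid(t, t, t, indexing="ij")
      g = (np.sin(x) * np.cos(y) + np.sin(y) * np.cos(z)
           + np.sin(z) * np.cos(x))

      verts, faces, normals, values = marching_cubes(g, level=0.0)
      print(len(verts), "vertices,", len(faces), "triangles")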

  2. Convex Lower Bounds for Free Energy Minimization

    NASA Astrophysics Data System (ADS)

    Moussa, Jonathan

    We construct lower bounds on free energy with convex relaxations from the nonlinear minimization over probabilities to linear programs over expectation values. Finite-temperature expectation values are further resolved into distributions over energy. A superset of valid expectation values is delineated by an incomplete set of linear constraints. Free energy bounds can be improved systematically by adding constraints, which also increases their computational cost. We compute several free energy bounds of increasing accuracy for the triangular-lattice Ising model to assess the utility of this method. This work was supported by the Laboratory Directed Research and Development program at Sandia National Laboratories. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000.
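
    A much cruder cousin of that relaxation, to fix ideas: since S(p) ≤ ln N, the free energy F = min_p (⟨E⟩ − T S(p)) is bounded below by the linear-program value min_p ⟨E⟩ minus T ln N, and adding linear constraints on expectation values is what tightens bounds of this type. The three-level system below is invented, not the paper's triangular-lattice Ising model.

      # LP lower bound on the free energy of a toy three-level system.
      import numpy as np
      from scipy.optimize import linprog

      E = np.array([0.0, 1.0, 2.0])    # energy levels (assumed)
      T, N = 0.5, 3

      res = linprog(c=E, A_eq=np.ones((1, N)), b_eq=[1.0],
                    bounds=[(0.0, 1.0)] * N)    # min <E> over the simplex
      lower = res.fun - T * np.log(N)           # uses S(p) <= ln N

      F_exact = -T * np.log(np.sum(np.exp(-E / T)))
      print(lower, "<=", F_exact)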

  3. Minimal residual method stronger than polynomial preconditioning

    SciTech Connect

    Faber, V.; Joubert, W.; Knill, E.

    1994-12-31

    Two popular methods for solving symmetric and nonsymmetric systems of equations are the minimal residual method, implemented by algorithms such as GMRES, and polynomial preconditioning methods. In this study results are given on the convergence rates of these methods for various classes of matrices. It is shown that for some matrices, such as normal matrices, the convergence rates for GMRES and for the optimal polynomial preconditioning are the same, and for other matrices such as the upper triangular Toeplitz matrices, it is at least assured that if one method converges then the other must converge. On the other hand, it is shown that matrices exist for which restarted GMRES always converges but any polynomial preconditioning of corresponding degree makes no progress toward the solution for some initial error. The implications of these results for these and other iterative methods are discussed.
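
    To see the two approaches side by side on a single (well-conditioned, hence unrepresentative) random matrix, the sketch below runs GMRES with and without a degree-2 Neumann-series polynomial preconditioner p(A) ≈ A⁻¹. The paper's results concern worst-case convergence, which no one example can exhibit.

      # GMRES with and without a simple polynomial preconditioner (toy).
      import numpy as np
      from scipy.sparse.linalg import LinearOperator, gmres

      rng = np.random.default_rng(1)
      n = 200
      A = np.eye(n) + 0.1 * rng.standard_normal((n, n)) / np.sqrt(n)
      b = rng.standard_normal(n)

      def poly_prec(v):                # p(A) v with p(t) = 1 + (1-t) + (1-t)^2
          r = v - A @ v
          return v + r + (r - A @ r)

      M = LinearOperator((n, n), matvec=poly_prec)
      for prec in (None, M):
          residuals = []
          x, info = gmres(A, b, M=prec, callback=residuals.append)
          print(len(residuals), "iterations; converged:", info == 0)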

  4. Reflections concerning triply-periodic minimal surfaces

    PubMed Central

    Schoen, Alan H.

    2012-01-01

    In recent decades, there has been an explosion in the number and variety of embedded triply-periodic minimal surfaces (TPMS) identified by mathematicians and materials scientists. Only the rare examples of low genus, however, are commonly invoked as shape templates in scientific applications. Exact analytic solutions are now known for many of the low genus examples. The more complex surfaces are readily defined with numerical tools such as Surface Evolver software or the Landau–Ginzburg model. Even though table-top versions of several TPMS have been placed within easy reach by rapid prototyping methods, the inherent complexity of many of these surfaces makes it challenging to grasp their structure. The problem of distinguishing TPMS, which is now acute because of the proliferation of examples, has been addressed by Lord & Mackay (Lord & Mackay 2003 Curr. Sci. 85, 346–362). PMID:24098851

  5. The minimal work cost of information processing

    PubMed Central

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-01-01

    Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes the development of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditional to the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics. PMID:26151678
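
    The headline formula is easy to exercise numerically: for a joint distribution of discarded data X and computation output Y, the bound reads W ≥ k_B T ln 2 · H(X|Y) when entropies are measured in bits. The distribution below (one fair bit erased, independent of the output) is an assumption for illustration, and the paper's single-shot statement is sharper than this average-case reading.

      # Landauer-type work bound from a conditional entropy (toy numbers).
      import numpy as np

      kB, T = 1.380649e-23, 300.0         # Boltzmann constant (J/K), temp. (K)

      p = np.array([[0.25, 0.25],         # joint p(x, y): discarded X, output Y
                    [0.25, 0.25]])
      p_y = p.sum(axis=0)
      H_xy = -np.sum(p * np.log2(p))      # joint entropy (bits)
      H_y = -np.sum(p_y * np.log2(p_y))   # output entropy (bits)
      H_x_given_y = H_xy - H_y            # conditional entropy (bits)

      print(kB * T * np.log(2) * H_x_given_y, "J minimum work")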

  6. Perceptions of Sexual Orientation From Minimal Cues.

    PubMed

    Rule, Nicholas O

    2017-01-01

    People derive considerable amounts of information about each other from minimal nonverbal cues. Apart from characteristics typically regarded as obvious when encountering another person (e.g., age, race, and sex), perceivers can identify many other qualities about a person that are typically rather subtle. One such feature is sexual orientation. Here, I review the literature documenting the accurate perception of sexual orientation from nonverbal cues related to one's adornment, acoustics, actions, and appearance. In addition to chronicling studies that have demonstrated how people express and extract sexual orientation in each of these domains, I discuss some of the basic cognitive and perceptual processes that support these judgments, including how cues to sexual orientation manifest in behavioral (e.g., clothing choices) and structural (e.g., facial morphology) signals. Finally, I attend to boundary conditions in the accurate perception of sexual orientation, such as the states, traits, and group memberships that moderate individuals' ability to reliably decipher others' sexual orientation.

  7. Design and Demonstration of Minimal Lunar Base

    NASA Astrophysics Data System (ADS)

    Boche-Sauvan, L.; Foing, B. H.; Exohab Team

    2009-04-01

    Introduction: We propose a conceptual analysis of a first minimal lunar base, focusing on the system aspects and coordinating the different parts of an evolving architecture [1-3]. We justify the case for a scientific outpost allowing experiments and sample analysis in a laboratory (relevant to the origin and evolution of the Earth, geophysical and geochemical studies of the Moon, life sciences, and observation from the Moon). Research: Research activities will be conducted with this first settlement in: - science (of, from and on the Moon) - exploration (robotic mobility, rover, drilling), - technology (communication, command, organisation, automatism). Life sciences. The life sciences aspects are considered through life support for a crew of 4 (habitat) and laboratory activity with biological experiments of the kind performed on Earth or in LEO, but here without any magnetosphere protection and therefore with direct cosmic-ray and solar-particle effects. Moreover, the ability to study the lunar environment in the field will be a major asset before settling a permanent base [3-5]. Lunar environment. The lunar environment adds constraints to instrument specifications (vacuum, extreme temperatures, regolith, seismic activity, micrometeorites). Data from SMART-1 and other missions will provide geometrical, chemical and physical details about the environment (soil material characteristics, on-surface conditions …). Test bench. The base will serve to assess planetary technologies and operations in preparation for human exploration of Mars. Lunar outpost predesign modular concept: To allow a human presence on the Moon and to carry out these experiments, we will give a pre-design of a minimal human lunar base. Through a modular concept, this base could later evolve into a long-duration or permanent base. We will analyse the possibilities of settling such a minimal base by means of current and near-term propulsion technology, such as a full Ariane 5 ME carrying 1.7 T of gross payload to the surface of the Moon.

  8. Minimally disruptive schedule repair for MCM missions

    NASA Astrophysics Data System (ADS)

    Molineaux, Matthew; Auslander, Bryan; Moore, Philip G.; Gupta, Kalyan M.

    2015-05-01

    Mine countermeasures (MCM) missions entail planning and operations in very dynamic and uncertain operating environments, which pose considerable risk to personnel and equipment. Frequent schedule repairs that consider the latest operating conditions are needed to keep the mission on target. Presently, no decision support tools are available for the challenging task of MCM mission rescheduling. To address this capability gap, we have developed the CARPE system to assist operation planners. CARPE constantly monitors the operational environment for changes and recommends alternative repaired schedules in response. It includes a novel schedule repair algorithm called Case-Based Local Schedule Repair (CLOSR) that automatically repairs broken schedules while satisfying the requirement of minimal operational disruption. It uses a case-based approach to represent repair strategies and apply them to new situations. Evaluation of CLOSR on simulated MCM operations demonstrates the effectiveness of the case-based strategy: schedule repairs are generated rapidly, ensure the elimination of all mines, and achieve required levels of clearance.

  9. Waste Minimization and Pollution Prevention Awareness Plan

    SciTech Connect

    Not Available

    1994-04-01

    The purpose of this plan is to document Lawrence Livermore National Laboratory (LLNL) projections for present and future waste minimization and pollution prevention. The plan specifies those activities and methods that are or will be used to reduce the quantity and toxicity of wastes generated at the site. It is intended to satisfy Department of Energy (DOE) requirements. This Plan provides an overview of projected activities from FY 1994 through FY 1999. The plans are broken into site-wide and problem-specific activities. All directorates at LLNL have had an opportunity to contribute input, to estimate budget, and to review the plan. In addition to the above, this plan records LLNL's goals for pollution prevention, regulatory drivers for those activities, assumptions on which the cost estimates are based, analyses of the strengths of the projects, and the barriers to increasing pollution prevention activities.

  10. Massive neutrinos and invisible axion minimally connected

    NASA Astrophysics Data System (ADS)

    Bertolini, Stefano; Di Luzio, Luca; Kolešová, Helena; Malinský, Michal

    2015-03-01

    We survey a few minimal scalar extensions of the standard electroweak model that provide a simple setup for massive neutrinos in connection with an invisible axion. The presence of a chiral U(1) à la Peccei-Quinn drives the pattern of Majorana neutrino masses while providing a dynamical solution to the strong CP problem and an axion as a dark matter candidate. We paradigmatically apply such a renormalizable framework to type-II seesaw and to two viable models for neutrino oscillations where the neutrino masses arise at one and two loops, respectively. We comment on the naturalness of the effective setups as well as on their implications for vacuum stability and electroweak baryogenesis.

  11. Error minimizing algorithms for nearest neighbor classifiers

    SciTech Connect

    Porter, Reid B; Hush, Don; Zimmer, G. Beate

    2011-01-03

    Stack Filters define a large class of discrete nonlinear filters first introduced in image and signal processing for noise removal. In recent years we have suggested their application to classification problems, and investigated their relationship to other types of discrete classifiers such as Decision Trees. In this paper we focus on a continuous-domain version of Stack Filter Classifiers which we call Ordered Hypothesis Machines (OHM), and investigate their relationship to Nearest Neighbor classifiers. We show that OHM classifiers provide a novel framework in which to train Nearest Neighbor type classifiers by minimizing empirical-error-based loss functions. We use the framework to investigate a new cost-sensitive loss function that allows us to train a Nearest Neighbor type classifier for low false alarm rate applications. We report results on both synthetic data and real-world image data.
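
    The sketch below illustrates the kind of cost-sensitive empirical loss described above, applied to a biased nearest-neighbor rule. It is a hypothetical illustration of the loss being minimized, not the OHM training algorithm itself; the data, prototypes, and cost values are invented.

        import numpy as np

        def nn_margin(x, pos, neg):
            # Signed nearest-neighbor margin: distance to the nearest negative
            # prototype minus distance to the nearest positive one.
            d_pos = np.min(np.linalg.norm(pos - x, axis=1))
            d_neg = np.min(np.linalg.norm(neg - x, axis=1))
            return d_neg - d_pos

        def empirical_cost(bias, X, y, pos, neg, c_fa=5.0, c_miss=1.0):
            # Cost-weighted empirical error of the biased rule: predict positive
            # when the margin exceeds the bias; false alarms cost more than misses.
            cost = 0.0
            for xi, yi in zip(X, y):
                pred = 1 if nn_margin(xi, pos, neg) > bias else 0
                if pred == 1 and yi == 0:
                    cost += c_fa        # false alarm
                elif pred == 0 and yi == 1:
                    cost += c_miss      # missed detection
            return cost / len(y)

        rng = np.random.default_rng(0)
        pos = rng.normal(1.0, 1.0, size=(30, 2))    # positive-class prototypes
        neg = rng.normal(-1.0, 1.0, size=(30, 2))   # negative-class prototypes
        X = np.vstack([rng.normal(1.0, 1.0, size=(100, 2)),
                       rng.normal(-1.0, 1.0, size=(100, 2))])
        y = np.array([1] * 100 + [0] * 100)

        # Sweep the bias to trade missed detections against costly false alarms.
        best = min(np.linspace(-2, 2, 41),
                   key=lambda b: empirical_cost(b, X, y, pos, neg))
        print(f"bias minimizing the cost-weighted empirical error: {best:.2f}")

    Raising the false-alarm cost c_fa pushes the selected bias upward, which is the low-false-alarm operating regime the abstract targets.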

  12. Minimally invasive surgery for thyroid eye disease

    PubMed Central

    Naik, Milind Neilkant; Nair, Akshay Gopinathan; Gupta, Adit; Kamal, Saurabh

    2015-01-01

    Thyroid eye disease (TED) can affect the eye in myriad ways: proptosis, strabismus, eyelid retraction, optic neuropathy, soft tissue changes around the eye and an unstable ocular surface. TED consists of two phases: active, and inactive. The active phase of TED is limited to a period of 12–18 months and is mainly managed medically with immunosuppression. The residual structural changes due to the resultant fibrosis are usually addressed with surgery, the mainstay of which is orbital decompression. These surgeries are performed during the inactive phase. The surgical rehabilitation of TED has evolved over the years: not only the surgical techniques, but also the concepts, and the surgical tools available. The indications for decompression surgery have also expanded in the recent past. This article discusses the technological and conceptual advances of minimally invasive surgery for TED that decrease complications and speed up recovery. Current surgical techniques offer predictable, consistent results with better esthetics. PMID:26669337

  13. Minimal Increase Network Coding for Dynamic Networks.

    PubMed

    Zhang, Guoyin; Fan, Xu; Wu, Yanxia

    2016-01-01

    Because of the mobility, computing power and changeable topology of dynamic networks, it is difficult for random linear network coding (RLNC) designed for static networks to satisfy the requirements of dynamic networks. To alleviate this problem, a minimal increase network coding (MINC) algorithm is proposed. By identifying the nonzero elements of an encoding vector, it selects blocks to be encoded on the basis of the relationship between the nonzero elements, which controls changes in the degrees of the blocks; the encoding time in a dynamic network is thereby shortened. The results of simulations show that, compared with existing encoding algorithms, the MINC algorithm provides reduced computational complexity of encoding and an increased probability of delivery.
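
    One plausible reading of the degree-minimizing selection step, sketched here over GF(2) for simplicity: when forming a new coded block, prefer the combination of buffered encoding vectors that is still innovative (linearly independent of what was already sent) while having the fewest nonzero elements. The paper's actual field, data structures, and selection rule are not specified in the record; everything below is an illustrative assumption.

        from itertools import combinations

        def add_to_basis(vec, basis):
            # Gaussian elimination over GF(2); basis maps leading bit -> row.
            while vec:
                lead = vec.bit_length() - 1
                if lead not in basis:
                    basis[lead] = vec
                    return True      # vec was innovative (independent)
                vec ^= basis[lead]
            return False             # vec reduced to zero: already spanned

        def minc_select(buffered, basis, max_combine=3):
            # Among XOR combinations of buffered encoding vectors, return the
            # innovative one of least degree (fewest nonzero coefficients).
            best = None
            for r in range(1, max_combine + 1):
                for combo in combinations(buffered, r):
                    v = 0
                    for c in combo:
                        v ^= c
                    if v and add_to_basis(v, dict(basis)):  # test on a copy
                        if best is None or bin(v).count("1") < bin(best).count("1"):
                            best = v
            return best

        basis = {}
        for sent in (0b0011, 0b0101):           # vectors already transmitted
            add_to_basis(sent, basis)
        buffered = [0b0011, 0b0101, 0b1001]     # encoding vectors in the buffer
        print(bin(minc_select(buffered, basis)))  # lowest-degree innovative choice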

  14. Linear functional minimization for inverse modeling

    SciTech Connect

    Barajas-Solano, David A.; Wohlberg, Brendt Egon; Vesselinov, Velimir Valentinov; Tartakovsky, Daniel M.

    2015-06-01

    In this paper, we present a novel inverse modeling strategy to estimate spatially distributed parameters of nonlinear models. The maximum a posteriori (MAP) estimators of these parameters are based on a likelihood functional, which contains spatially discrete measurements of the system parameters and spatiotemporally discrete measurements of the transient system states. The piecewise continuity prior for the parameters is expressed via Total Variation (TV) regularization. The MAP estimator is computed by minimizing a nonquadratic objective equipped with the TV operator. We apply this inversion algorithm to estimate hydraulic conductivity of a synthetic confined aquifer from measurements of conductivity and hydraulic head. The synthetic conductivity field is composed of a low-conductivity heterogeneous intrusion into a high-conductivity heterogeneous medium. Our algorithm accurately reconstructs the location, orientation, and extent of the intrusion from the steady-state data only. Finally, addition of transient measurements of hydraulic head improves the parameter estimation, accurately reconstructing the conductivity field in the vicinity of observation locations.
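
    A one-dimensional toy analogue of the TV-regularized MAP objective may help fix ideas. It omits the flow-equation coupling between conductivity and head that the paper uses, and fits only sparse direct conductivity measurements; all names and values below are hypothetical.

        import numpy as np
        from scipy.optimize import minimize

        def objective(k, k_obs, idx, lam, eps=1e-6):
            # Least-squares misfit at measured cells plus a smoothed TV penalty
            # that favors piecewise-constant parameter fields.
            misfit = 0.5 * np.sum((k[idx] - k_obs) ** 2)
            tv = np.sum(np.sqrt(np.diff(k) ** 2 + eps))
            return misfit + lam * tv

        rng = np.random.default_rng(1)
        true_k = np.concatenate([np.full(40, 1.0),   # high-conductivity medium
                                 np.full(20, 0.2),   # low-conductivity intrusion
                                 np.full(40, 1.0)])
        idx = rng.choice(100, size=25, replace=False)          # sparse cells
        k_obs = true_k[idx] + 0.05 * rng.standard_normal(25)   # noisy data

        res = minimize(objective, x0=np.full(100, 0.6),
                       args=(k_obs, idx, 0.5), method="L-BFGS-B")
        k_map = res.x  # MAP-style estimate: roughly piecewise constant
        print(k_map[:10])

    The smoothing parameter eps makes the nondifferentiable TV term tractable for a quasi-Newton solver; the paper's nonquadratic objective is minimized with methods suited to the exact TV operator.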

  15. How nanomechanical systems can minimize dissipation.

    PubMed

    Muratore-Ginanneschi, Paolo; Schwieger, Kay

    2014-12-01

    Information processing machines at the nanoscale are unavoidably affected by thermal fluctuations. Efficient design requires understanding how nanomachines can operate at minimal energy dissipation. Here we focus on mechanical systems controlled by smoothly varying potential forces. We show that optimal control equations come about in a natural way if the energy cost to manipulate the potential is taken into account. When such a cost becomes negligible, an optimal control strategy can be constructed by transparent geometrical methods which recover the solution of optimal mass transport equations in the overdamped limit. Our equations are equivalent to hierarchies of kinetic equations of a form well known in the theory of dilute gases. From our results, optimal strategies for energy efficient nanosystems may be devised by established techniques from kinetic theory.

  16. Minimal Increase Network Coding for Dynamic Networks

    PubMed Central

    Wu, Yanxia

    2016-01-01

    Because of the mobility, computing power and changeable topology of dynamic networks, it is difficult for random linear network coding (RLNC) designed for static networks to satisfy the requirements of dynamic networks. To alleviate this problem, a minimal increase network coding (MINC) algorithm is proposed. By identifying the nonzero elements of an encoding vector, it selects blocks to be encoded on the basis of the relationship between the nonzero elements, which controls changes in the degrees of the blocks; the encoding time in a dynamic network is thereby shortened. The results of simulations show that, compared with existing encoding algorithms, the MINC algorithm provides reduced computational complexity of encoding and an increased probability of delivery. PMID:26867211

  17. Minimal Joule dissipation models of magnetospheric convection

    NASA Astrophysics Data System (ADS)

    Barbosa, D. D.

    This paper gives a topical review of theoretical models of magnetospheric convection based on the concept of minimal Joule dissipation. A two-dimensional slab model of the ionosphere featuring an enhanced conductivity auroral oval is used to compute high-latitude electric fields and currents. Mathematical methods used in the modeling include Fourier analysis, fast Fourier transforms, and variational calculus. Also, conformal transformations are introduced in the analysis, which enable the auroral oval to be represented as a nonconcentric, crescent-shaped figure. Convection patterns appropriate to geomagnetic quiet and disturbed conditions are computed, the differentiating variable being the relative amount of power dissipated in the magnetospheric ring current. When ring current dissipation is small, the convection electric field is restricted to high latitudes (shielding regime), and when it is large, a significant penetration of the field to low latitudes occurs, accompanied by an increase in the ratio of the region 1 current to the region 2 current.

  18. Weight minimization of a support structure

    NASA Technical Reports Server (NTRS)

    Kluberdanz, Donald J.; Segalman, Helaine J.

    1990-01-01

    This paper addresses the weight minimization of a circular plate-like structure which resulted in a 26 percent weight reduction. The optimization was performed numerically with the COPES/ADS program using the modified method of feasible directions. Design parameters were the inner thickness and outer thickness of the plate with constraints on maximum yield stress and maximum transverse displacement. Also, constraints were specified for the upper and lower bounds of the fundamental frequency and plate thicknesses. The MSC/NASTRAN finite element program was used for the evaluation of response variables. Original and final designs of the plate were tested using an Instron tension-compression machine to compare finite element results to measured strain data. The difference between finite element strain components and measured strain data was within engineering accuracy.

  19. Minimal Joule dissipation models of magnetospheric convection

    NASA Technical Reports Server (NTRS)

    Barbosa, D. D.

    1988-01-01

    This paper gives a topical review of theoretical models of magnetospheric convection based on the concept of minimal Joule dissipation. A two-dimensional slab model of the ionosphere featuring an enhanced conductivity auroral oval is used to compute high-latitude electric fields and currents. Mathematical methods used in the modeling include Fourier analysis, fast Fourier transforms, and variational calculus. Also, conformal transformations are introduced in the analysis, which enable the auroral oval to be represented as a nonconcentric, crescent-shaped figure. Convection patterns appropriate to geomagnetic quiet and disturbed conditions are computed, the differentiating variable being the relative amount of power dissipated in the magnetospheric ring current. When ring current dissipation is small, the convection electric field is restricted to high latitudes (shielding regime), and when it is large, a significant penetration of the field to low latitudes occurs, accompanied by an increase in the ratio of the region 1 current to the region 2 current.

  20. JSC Metal Finishing Waste Minimization Methods

    NASA Technical Reports Server (NTRS)

    Sullivan, Erica

    2003-01-01

    The paper discusses the following: Johnson Space Center (JSC) has achieved VPP Star status and is ISO 9001 compliant. The Structural Engineering Division in the Engineering Directorate is responsible for operating the metal finishing facility at JSC. The Engineering Directorate is responsible for $71.4 million of space flight hardware design, fabrication and testing. The JSC Metal Finishing Facility processes flight hardware to support the programs, in particular schedule- and mission-critical flight hardware. The JSC Metal Finishing Facility is operated by Rothe Joint Venture and provides the following processes: anodizing, alodining, passivation, and pickling. The facility was completely rebuilt in 1998 at a total cost of $366,000; all new tanks, electrical, plumbing, and ventilation were installed. It was designed to meet modern safety, environmental, and quality requirements, to minimize contamination, and to provide the highest quality finishes.

  1. Minimizing forced outage risk in generator bidding

    NASA Astrophysics Data System (ADS)

    Das, Dibyendu

    Competition in power markets has exposed the participating companies to physical and financial uncertainties. Generator companies bid to supply power in a day-ahead market. Once their bids are accepted by the ISO they are bound to supply power. A random outage after acceptance of bids forces a generator to buy power from the expensive real-time hourly spot market and sell it to the ISO at the set day-ahead market clearing price, incurring losses. A risk management technique is developed to assess this financial risk associated with forced outages of generators and then minimize it. This work presents a risk assessment module which measures the financial risk of generators bidding in an open market for different bidding scenarios. The day-ahead power market auction is modeled using a unit commitment algorithm, and a combination of Normal and Cauchy distributions generates the real-time hourly spot market. Risk profiles are derived and VaRs are calculated at the 98 percent confidence level as a measure of financial risk. Risk profiles and VaRs help the generators analyze the forced outage risk and the different factors affecting it. The VaRs and the estimated total earnings for different bidding scenarios are used to develop a risk minimization module. This module develops a bidding strategy for the generator company such that its estimated total earnings are maximized while keeping the VaR below a tolerable limit. This general framework of a risk management technique for generating companies bidding in a competitive day-ahead market can also help them in decisions related to building new generators.
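
    A minimal Monte Carlo sketch of the VaR computation described above (the earnings model, outage probability, and dollar figures are invented for illustration; the thesis uses a unit-commitment market model instead):

        import numpy as np

        rng = np.random.default_rng(42)
        n = 10_000  # Monte Carlo scenarios for one candidate bidding strategy

        # Hypothetical earnings model: fixed day-ahead revenue, minus the premium
        # paid in the real-time spot market when a forced outage occurs.
        day_ahead_revenue = 50_000.0
        outage = rng.random(n) < 0.05            # assumed forced-outage probability
        spot_premium = np.maximum(rng.normal(20_000, 8_000, n), 0.0)
        earnings = day_ahead_revenue - np.where(outage, spot_premium, 0.0)

        # VaR at the 98 percent confidence level: the earnings shortfall that is
        # exceeded in only 2 percent of scenarios.
        var_98 = day_ahead_revenue - np.percentile(earnings, 2)
        print(f"expected earnings: {earnings.mean():,.0f}  98% VaR: {var_98:,.0f}")

    Repeating this for each bidding scenario and keeping the strategy with the highest expected earnings subject to a VaR ceiling mirrors the risk minimization module outlined in the abstract.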

  2. Minimally invasive total hip arthroplasty: in opposition.

    PubMed

    Hungerford, David S

    2004-06-01

    At the Knee Society Winter Meeting in 2003, Seth Greenwald and I debated about whether there should be new standards (ie, regulations) applied to the release of information to the public on "new developments." I argued for the public's "right to know" prior to the publication of peer-reviewed literature. He argued for regulatory constraint or "proving by peer-reviewed publication" before alerting the public. It is not a contradiction for me to currently argue against the public advertising of minimally invasive (MIS) total hip arthroplasty as not yet being in the best interest of the public. It is hard to remember a concept that has so captured both the public's and the surgical community's fancy as MIS. Patients are "demanding" MIS without knowing why. Surgeons are offering it as the next best, greatest thing without having developed the skill and experience to avoid the surgery's risks. If you put "minimally invasive hip replacement" into the Google search engine (http://www.google.com), you get 5,170 matches. If you put the same words in PubMed (http://www.ncbi.nlm.nih.gov/entrez/query.fcgi), referencing the National Library of Medicine database, you get SEVENTEEN; none is really a peer-reviewed article. Most are one-page papers in orthopedics from medical education meetings. On the other hand, there are over 6,000 peer-reviewed articles on total hip arthroplasty. Dr. Thomas Sculco, my counterpart in this debate, wrote an insightful editorial in the American Journal of Orthopedic Surgery in which he stated: "Although these procedures have generated incredible interest and enthusiasm, I am concerned that they may be performed to the detriment of our patients." I couldn't agree with him more. Smaller is not necessarily better and, when it is worse, it will be the "smaller" that is held accountable.

  3. Towards a Minimal System for Cell Division

    NASA Astrophysics Data System (ADS)

    Schwille, Petra

    We have entered the "omics" era of the life sciences, meaning that our general knowledge about biological systems has become vast, complex, and almost impossible to fully comprehend. Consequently, the challenge for quantitative biology and biophysics is to identify appropriate procedures and protocols that allow the researcher to strip down the complexity of a biological system to a level that can be reliably modeled but still retains the essential features of its "real" counterpart. The virtue of physics has always been the reductionist approach, which allowed scientists to identify the underlying basic principles of seemingly complex phenomena, and subject them to rigorous mathematical treatment. Biological systems are obviously among the most complex phenomena we can think of, and it is fair to state that our rapidly increasing knowledge does not make it easier to identify a small set of fundamental principles of the big concept of "life" that can be defined and quantitatively understood. Nevertheless, it is becoming evident that only by tight cooperation and interdisciplinary exchange between the life sciences and quantitative sciences, and by applying intelligent reductionist approaches also to biology, will we be able to meet the intellectual challenges of the twenty-first century. These include not only the collection and proper categorization of the data, but also their true understanding and harnessing such that we can solve important practical problems imposed by medicine or the worldwide need for new energy sources. Many of these approaches are reflected by the modern buzz word "synthetic biology", therefore I briefly discuss this term in the first section. Further, I outline some endeavors of our and other groups to model minimal biological systems, with particular focus on the possibility of generating a minimal system for cell division.

  4. Optimal pulsed pumping schedule using calculus of variation methodology

    SciTech Connect

    Johannes, T.W.

    1999-03-01

    The application of a variational optimization technique has demonstrated the potential strength of pulsed pumping operations for use at existing pump-and-treat aquifer remediation sites. The optimized pulsed pumping technique has exhibited notable improvements in operational effectiveness over continuous pumping, as well as an advantage over uniform time intervals for pumping and resting cycles. The most important finding supports the potential for managing and improving pumping operations in the absence of complete knowledge of plume characteristics. An objective functional was selected to minimize the mass of water removed and the nonessential mass of contaminant removed. General forms of an essential concentration function were analyzed to determine the appropriate form required for compliance with management preferences. Third-order essential concentration functions provided optimal solutions for the objective functional, and using this form of the essential concentration function in the methodology provided optimal solutions for switching times. The methodology was applied to a hypothetical, two-dimensional aquifer influenced by specified and no-flow boundaries, injection wells and extraction wells. Flow simulations used MODFLOW, transport simulations used MT3D, and the graphical interface for obtaining concentration time series data and flow/transport links was generated by GMS version 2.1.

  5. Event-scale power law recession analysis: quantifying methodological uncertainty

    NASA Astrophysics Data System (ADS)

    Dralle, David N.; Karst, Nathaniel J.; Charalampous, Kyriakos; Veenstra, Andrew; Thompson, Sally E.

    2017-01-01

    between the power-law recession scale parameter and catchment antecedent wetness varies depending on recession definition and fitting choices. Considering study results, we recommend a combination of four key methodological decisions to maximize the quality of fitted recession curves, and to minimize bias in the related populations of fitted recession parameters.

  6. Design for minimizing fracture risk of all-ceramic cantilever dental bridge.

    PubMed

    Zhang, Zhongpu; Zhou, Shiwei; Li, Eric; Li, Wei; Swain, Michael V; Li, Qing

    2015-01-01

    Minimization of the peak stresses and fracture incidence induced by mastication is considered critical in the design of all-ceramic dental restorations, especially for cantilever fixed partial dentures (FPDs). The focus of this study is on developing a mechanically sound optimal design for an all-ceramic cantilever dental bridge in a posterior region. A topology optimization procedure in association with the Extended Finite Element Method (XFEM) is implemented to search for the best possible distribution of porcelain and zirconia materials in the bridge structure. Designs with different volume fractions of zirconia are considered. The results show that this new methodology is capable of improving FPD design by minimizing the incidence of cracking in comparison with the initial design. Potentially, it provides dental technicians with a new design tool to develop mechanically sound cantilever fixed partial dentures for more complicated clinical situations.

  7. A Piecewise Solution to the Reconfiguration Problem by a Minimal Spanning Tree Algorithm

    NASA Astrophysics Data System (ADS)

    Ramirez, Juan M.; Montoya, Diana P.

    2014-10-01

    This paper proposes a minimal spanning tree (MST) algorithm to solve the network reconfiguration problem in radial distribution systems (RDS). The paper focuses on reducing power losses by selecting the best radial configuration. The reconfiguration problem is a non-differentiable and highly combinatorial optimization problem. The proposed methodology is a deterministic Kruskal algorithm based on graph theory, which is appropriate for this application because it generates only feasible radial topologies. The proposed MST algorithm has been tested on an actual RDS, which has been split into subsystems.
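
    A compact sketch of the Kruskal-based idea: if each candidate branch carries an estimated loss weight, the minimum spanning tree is a feasible radial topology of least total weight, and the branches left out are the tie switches kept open. The network data below are hypothetical, not the test system used in the paper.

        class UnionFind:
            def __init__(self, n):
                self.parent = list(range(n))
            def find(self, a):
                while self.parent[a] != a:
                    self.parent[a] = self.parent[self.parent[a]]  # path halving
                    a = self.parent[a]
                return a
            def union(self, a, b):
                ra, rb = self.find(a), self.find(b)
                if ra == rb:
                    return False       # would create a loop: keep switch open
                self.parent[ra] = rb
                return True

        def radial_configuration(n_buses, branches):
            # branches: (loss_weight, bus_u, bus_v); Kruskal's greedy selection.
            uf = UnionFind(n_buses)
            closed, opened = [], []
            for w, u, v in sorted(branches):
                (closed if uf.union(u, v) else opened).append((w, u, v))
            return closed, opened

        branches = [(0.4, 0, 1), (0.3, 1, 2), (0.9, 0, 2),
                    (0.2, 2, 3), (0.7, 1, 3)]
        closed, opened = radial_configuration(4, branches)
        print("closed:", closed)   # spanning tree = radial topology
        print("open  :", opened)   # tie switches left open

    Because a spanning tree of a connected graph is exactly a loop-free configuration that serves every bus, radiality is enforced by construction rather than checked after the fact.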

  8. Note: A method for minimizing oxide formation during elevated temperature nanoindentation

    SciTech Connect

    Cheng, I. C.; Hodge, A. M.; Garcia-Sanchez, E.

    2014-09-15

    A standardized method to protect metallic samples and minimize oxide formation during elevated-temperature nanoindentation was adapted to a commercial instrument. Nanoindentation was performed on Al (100), Cu (100), and W (100) single crystals submerged in vacuum oil at 200 °C, while the surface morphology and oxidation were carefully monitored using atomic force microscopy (AFM) and X-ray photoelectron spectroscopy (XPS). The results were compared to room temperature and 200 °C nanoindentation tests performed without oil, in order to evaluate the feasibility of using the oil as a protective medium. Extensive surface characterization demonstrated that this methodology is effective for nanoscale testing.

  9. Minimal trellises for linear block codes and their duals

    NASA Technical Reports Server (NTRS)

    Kiely, A. B.; Dolinar, S.; Ekroot, L.; Mceliece, R. J.; Lin, W.

    1995-01-01

    We consider the problem of finding a trellis for a linear block code that minimizes one or more measures of trellis complexity for a fixed permutation of the code. We examine constraints on trellises, including relationships between the minimal trellis of a code and that of the dual code. We identify the primitive structures that can appear in a minimal trellis and relate this to those for the minimal trellis of the dual code.

  10. Minimization of Gibbs free energy in compositional reservoir simulation

    SciTech Connect

    Trungenstein, J.A.

    1985-02-01

    This paper describes the formulation of vapor-liquid phase equilibrium as a linearly constrained minimization problem. It also describes a second minimization problem designed to test for local phase stability. Vectorized unconstrained minimization techniques can be used to solve this pair of constrained minimization problems. The methods of this paper are applied to liquid-vapor equilibria for mixtures both far from and near to the phase boundary. Significant improvements over the standard successive substitution algorithm are demonstrated.
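
    As a schematic analogue of the constrained formulation (a single ideal phase with invented standard potentials, rather than the paper's vapor-liquid system and stability test), Gibbs energy can be minimized subject to linear element-balance constraints:

        import numpy as np
        from scipy.optimize import minimize

        # Hypothetical dimensionless standard potentials mu0/RT for N2, H2, NH3.
        mu0 = np.array([0.0, 0.0, -5.0])
        # Element-balance matrix (rows: N, H; columns: N2, H2, NH3).
        A = np.array([[2, 0, 1],
                      [0, 2, 3]])
        b = A @ np.array([1.0, 3.0, 0.0])   # feed: 1 mol N2 + 3 mol H2

        def gibbs(n):
            # Dimensionless ideal-mixture Gibbs energy G/RT.
            n = np.maximum(n, 1e-12)        # keep the logarithm well-defined
            return np.sum(n * (mu0 + np.log(n / n.sum())))

        res = minimize(gibbs, x0=np.array([0.5, 1.5, 1.0]),
                       constraints={"type": "eq", "fun": lambda n: A @ n - b},
                       bounds=[(1e-12, None)] * 3, method="SLSQP")
        print("equilibrium moles:", res.x)

    The linear constraints play the role of the material balances in the paper's formulation; the vapor-liquid case adds a second phase and the separate minimization used as a local phase-stability test.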

  11. The Stereo-Electroencephalography Methodology.

    PubMed

    Alomar, Soha; Jones, Jaes; Maldonado, Andres; Gonzalez-Martinez, Jorge

    2016-01-01

    The stereo-electroencephalography (SEEG) methodology and technique was developed almost 60 years ago in Europe. The efficacy and safety of SEEG have been proven. The main advantage is the possibility of studying the epileptogenic neuronal network in its dynamic and 3-dimensional aspects, with optimal time and space correlation with the clinical semiology of the patient's seizures. The main clinical challenge for the near future remains the further refinement of specific selection criteria for the different methods of invasive monitoring, with the ultimate goal of comparing and validating the results (long-term seizure-free outcome) obtained from different methods of invasive monitoring.

  12. EXCOMP: an exposure comparison methodology

    SciTech Connect

    Lavender, J.C.; Franklin, A.L.

    1986-01-01

    When designing new facilities or modifying existing facilities that involve radioactive material handling or processing, an area of concern is the radiological exposure received by facility personnel and the environment. The computerized models that are currently used for exposure evaluations are capable of evaluating only one relationship at a time, i.e., the effects of one source, its strength and location, on one work location. EXCOMP (EXposure COMParison) is a methodology developed for the IBM-PC to evaluate radiological exposures. It is capable of evaluating each identified work location in a facility with respect to each identified source affecting it.

  13. Feminist methodologies and engineering education research

    NASA Astrophysics Data System (ADS)

    Beddoes, Kacey

    2013-03-01

    This paper introduces feminist methodologies in the context of engineering education research. It builds upon other recent methodology articles in engineering education journals and presents feminist research methodologies as a concrete engineering education setting in which to explore the connections between epistemology, methodology and theory. The paper begins with a literature review that covers a broad range of topics featured in the literature on feminist methodologies. Next, data from interviews with engineering educators and researchers who have engaged with feminist methodologies are presented. The ways in which feminist methodologies shape their research topics, questions, frameworks of analysis, methods, practices and reporting are each discussed. The challenges and barriers they have faced are then discussed. Finally, the benefits of further and broader engagement with feminist methodologies within the engineering education community are identified.

  14. Suggested criteria for evaluating systems engineering methodologies

    NASA Technical Reports Server (NTRS)

    Gates, Audrey; Paul, Arthur S.; Gill, Tepper L.

    1989-01-01

    Systems engineering is the application of mathematical and scientific principles to practical ends in the life-cycle of a system. A methodology for systems engineering is a carefully developed, relatively complex procedure or process for applying these mathematical and scientific principles. There are many systems engineering methodologies (or possibly many versions of a few methodologies) currently in use in government and industry. These methodologies are usually designed to meet the needs of a particular organization. It has been observed, however, that many technical and non-technical problems arise when inadequate systems engineering methodologies are applied by organizations to their systems development projects. Various criteria for evaluating systems engineering methodologies are discussed. Such criteria are developed to assist methodology-users in identifying and selecting methodologies that best fit the needs of the organization.

  15. 10 CFR 20.1406 - Minimization of contamination.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Minimization of contamination. 20.1406 Section 20.1406... License Termination § 20.1406 Minimization of contamination. (a) Applicants for licenses, other than early... procedures for operation will minimize, to the extent practicable, contamination of the facility and...

  16. 10 CFR 20.1406 - Minimization of contamination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Minimization of contamination. 20.1406 Section 20.1406... License Termination § 20.1406 Minimization of contamination. (a) Applicants for licenses, other than early... procedures for operation will minimize, to the extent practicable, contamination of the facility and...

  17. The Minimal Cost of Life in Space

    NASA Astrophysics Data System (ADS)

    Drysdale, A.; Rutkze, C.; Albright, L.; Ladue, R.

    Life in space requires protection from the external environment, provision of a suitable internal environment, provision of consumables to maintain life, and removal of wastes. Protection from the external environment will mainly require shielding from radiation and meteoroids. Provision of a suitable environment inside the spacecraft will require provision of suitable air pressure and composition, temperature, and protection from environmental toxins (trace contaminants) and pathogenic micro-organisms. Gravity may be needed for longer missions to avoid excessive changes such as decalcification and muscle degeneration. Similarly, the volume required per crewmember will increase as the mission duration increases. Consumables required include oxygen, food, and water. Nitrogen might be required, depending on the total pressure and non-metabolic losses. We normally provide these consumables from the Earth, with a greater or lesser degree of regeneration. In principle, all consumables can be regenerated. Water and air are easiest to regenerate. At the present time, food can only be regenerated by using plants, and higher plants at that. Waste must be removed, including carbon dioxide and other metabolic waste as well as trash such as food packaging, filters, and expended spare parts. This can be done by dumping or regeneration. The minimal cost of life in space would be to use a synthesis process or system to regenerate all consumables from wastes. As the efficiency of the various processes rises, the minimal cost of life support will fall. However, real world regeneration requires significant equipment, power, and crew time. Make-up will be required for those items that cannot be economically regenerated. For very inefficient processes, it might be cheaper to ship all or part of the consumables. We are currently far down the development curve, and for short missions it is cheaper to ship consumables. For longer duration missions, greater closure is cost effective

  18. Minimizing Glovebox Glove Breaches: PART II.

    SciTech Connect

    Cournoyer, M. E.; Andrade, R.M.; Taylor, D. J.; Stimmel, J. J.; Zaelke, R. L.; Balkey, J. J.

    2005-01-01

    As a matter of good business practices, a team of glovebox experts from Los Alamos National Laboratory (LANL) has been assembled to proactively investigate processes and procedures that minimize unplanned breaches in the glovebox, e.g., glove failures. A major part of this effort involves the review of glovebox glove failures that have occurred at the Plutonium Facility and at the Chemical and Metallurgy Research Facility. Information dating back to 1993 has been compiled from formal records. This data has been combined with information obtained from a baseline inventory of about 9,000 glovebox gloves. The key attributes tracked include those related to location, the glovebox glove, type and location of breaches, the worker, and the consequences resulting from breaches. This glovebox glove failure analysis yielded results in the areas of the ease of collecting this type of data, the causes of most glove failures that have occurred, the effectiveness of current controls, and recommendations to improve hazard control systems. As expected, a significant number of breaches involve high-risk operations such as grinding, hammering, using sharps (especially screwdrivers), and assembling equipment. Surprisingly, tasks such as the movement of equipment and material between gloveboxes and the opening of cans are also major contributors to breaches. Almost half the gloves fail within a year of their install date. The greatest consequence for over 90% of glovebox glove failures is alpha contamination of protective clothing. Personnel self-monitoring at the gloveboxes continues to be the most effective way of detecting glovebox glove failures. Glove failures from these tasks can be reduced through changes in procedures and the design of remote-handling apparatus. The Nuclear Materials Technology Division management uses this information to improve hazard control systems to reduce the number of unplanned breaches in the glovebox further. As a result, excursions of contaminants

  19. Minimally Invasive Versus Conventional Aortic Valve Replacement

    PubMed Central

    Attia, Rizwan Q.; Hickey, Graeme L.; Grant, Stuart W.; Bridgewater, Ben; Roxburgh, James C.; Kumar, Pankaj; Ridley, Paul; Bhabra, Moninder; Millner, Russell W. J.; Athanasiou, Thanos; Casula, Roberto; Chukwuemka, Andrew; Pillay, Thasee; Young, Christopher P.

    2016-01-01

    Objective Minimally invasive aortic valve replacement (MIAVR) has been demonstrated as a safe and effective option but remains underused. We aimed to evaluate outcomes of isolated MIAVR compared with conventional aortic valve replacement (CAVR). Methods Data from The National Institute for Cardiovascular Outcomes Research (NICOR) were analyzed at seven volunteer centers (2006–2012). Primary outcomes were in-hospital mortality and midterm survival. Secondary outcomes were postoperative length of stay as well as cumulative bypass and cross-clamp times. Propensity modeling with matched cohort analysis was used. Results Of 307 consecutive MIAVR patients, 151 (49%) were performed during the last 2 years of study with a continued increase in numbers. The 307 MIAVR patients were matched on a 1:1 ratio. In the matched CAVR group, there was no statistically significant difference in in-hospital mortality [MIAVR, 4/307 (1.3%); 95% confidence interval (CI), 0.4%–3.4% vs CAVR, 6/307 (2.0%); 95% CI, 0.8%–4.3%; P = 0.752]. One-year survival rates in the MIAVR and CAVR groups were 94.4% and 94.6%, respectively. There was no statistically significant difference in midterm survival (P = 0.677; hazard ratio, 0.90; 95% CI, 0.56–1.46). Median postoperative length of stay was lower in the MIAVR patients by 1 day (P = 0.009). The mean cumulative bypass time (94.8 vs 91.3 minutes; P = 0.333) and cross-clamp time (74.6 vs 68.4 minutes; P = 0.006) were longer in the MIAVR group; however, this was significant only in the cross-clamp time comparison. Conclusions Minimally invasive aortic valve replacement is a safe alternative to CAVR with respect to operative and 1-year mortality and is associated with a shorter postoperative stay. Further studies are required in high-risk (logistic EuroSCORE > 10) patients to define the role of MIAVR. PMID:26926521

  20. [Minimal Residual Disease (MRD) in gastric carcinoma--an overview].

    PubMed

    Garlipp, B; Steinert, R; Lippert, H; Meyer, F

    2011-02-01

    Despite recent developments in therapy for gastric cancer, the prognosis of this disease remains poor in advanced stages. In many cases even curatively treated patients without any residual tumour develop metachronous metastases. As in other solid tumours, adjuvant therapies can reduce the metastatic risk, which implies that some of these patients harbour isolated tumour cells or micrometastases (minimal residual disease, MRD) that are undetectable by radiological imaging and conventional histopathology but can still be the cause of tumour recurrence. Therefore, reliable methods for diagnosing MRD would be desirable for individually tailoring therapy for these patients. Unfortunately, testing methods for MRD and interpretation of their results are not standardised, and studies published on this topic are difficult to interpret due to methodological differences and small sample sizes. As of now, testing for MRD has not become relevant in clinical routine in the Western hemisphere for any of the anatomic compartments (lymph nodes, peritoneal lavage fluid, peripheral blood, and bone marrow). Most reliable data on MRD in gastric cancer patients have been reported for peritoneal lavage fluid. In some centres in Japan, this test is routinely being used for making therapeutic decisions, e. g., on the use of intraperitoneal chemotherapy. MRD in resected lymph nodes will be further evaluated in the context of the sentinel lymph node concept and possibly be employed for designing individualised therapy for patients in early disease stages who are not routinely candidates for multimodal treatment. As for tumour cells in peripheral blood and in bone marrow, studies suggest that these cells are only able to form metastases in the presence of certain molecular factors. Therefore, rather than simply confirming the existence of isolated tumour cells in blood or bone marrow, future studies should concentrate on defining their molecular characteristics and the conditions required for

  1. Randomness Amplification under Minimal Fundamental Assumptions on the Devices

    NASA Astrophysics Data System (ADS)

    Ramanathan, Ravishankar; Brandão, Fernando G. S. L.; Horodecki, Karol; Horodecki, Michał; Horodecki, Paweł; Wojewódka, Hanna

    2016-12-01

    Recently, a physically realistic protocol was proposed for amplifying the randomness of Santha-Vazirani sources to produce cryptographically secure random bits; however, for reasons of practical relevance, the crucial question remained open regarding whether this can be accomplished under the minimal conditions necessary for the task. Namely, is it possible to achieve randomness amplification using only two no-signaling components and in a situation where the violation of a Bell inequality only guarantees that some outcomes of the device for specific inputs exhibit randomness? Here, we solve this question and present a device-independent protocol for randomness amplification of Santha-Vazirani sources using a device consisting of two nonsignaling components. We show that the protocol can amplify any such source that is not fully deterministic into a fully random source while tolerating a constant noise rate, and we prove the composable security of the protocol against general no-signaling adversaries. Our main innovation is the proof that even the partial randomness certified by the two-party Bell test [a single input-output pair (u*, x*) for which the conditional probability P(x*|u*) is bounded away from 1 for all no-signaling strategies that optimally violate the Bell inequality] can be used for amplification. We introduce the methodology of a partial tomographic procedure on the empirical statistics obtained in the Bell test that ensures that the outputs constitute a linear min-entropy source of randomness. As a technical novelty that may be of independent interest, we prove that the Santha-Vazirani source satisfies an exponential concentration property given by a recently discovered generalized Chernoff bound.

  2. Information technology security system engineering methodology

    NASA Technical Reports Server (NTRS)

    Childs, D.

    2003-01-01

    A methodology is described for system engineering security into large information technology systems under development. The methodology is an integration of a risk management process and a generic system development life cycle process. The methodology is to be used by Security System Engineers to effectively engineer and integrate information technology security into a target system as it progresses through the development life cycle. The methodology can also be used to re-engineer security into a legacy system.

  3. 42 CFR 441.472 - Budget methodology.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 4 2013-10-01 2013-10-01 false Budget methodology. 441.472 Section 441.472 Public... Self-Directed Personal Assistance Services Program § 441.472 Budget methodology. (a) The State shall set forth a budget methodology that ensures service authorization resides with the State and meets...

  4. 42 CFR 441.472 - Budget methodology.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 4 2011-10-01 2011-10-01 false Budget methodology. 441.472 Section 441.472 Public... Self-Directed Personal Assistance Services Program § 441.472 Budget methodology. (a) The State shall set forth a budget methodology that ensures service authorization resides with the State and meets...

  5. 42 CFR 441.472 - Budget methodology.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 4 2012-10-01 2012-10-01 false Budget methodology. 441.472 Section 441.472 Public... Self-Directed Personal Assistance Services Program § 441.472 Budget methodology. (a) The State shall set forth a budget methodology that ensures service authorization resides with the State and meets...

  6. 42 CFR 441.472 - Budget methodology.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 4 2014-10-01 2014-10-01 false Budget methodology. 441.472 Section 441.472 Public... Self-Directed Personal Assistance Services Program § 441.472 Budget methodology. (a) The State shall set forth a budget methodology that ensures service authorization resides with the State and meets...

  7. Security Requirements Reusability and the SQUARE Methodology

    DTIC Science & Technology

    2010-09-01

    Front matter only; no abstract is available for this technical note (September 2010) by Travis Christian, with faculty advisor Nancy Mead. The table of contents lists sections on security requirements in current practice and the SQUARE methodology. Mead is a member of the technical staff at the Software Engineering Institute and principal investigator for the SQUARE methodology.

  8. 24 CFR 904.205 - Training methodology.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Training methodology. 904.205... Training methodology. Equal in importance to the content of the pre- and post-occupancy training is the training methodology. Because groups vary, there should be adaptability in the communication and...

  9. 42 CFR 441.472 - Budget methodology.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Budget methodology. 441.472 Section 441.472 Public... Self-Directed Personal Assistance Services Program § 441.472 Budget methodology. (a) The State shall set forth a budget methodology that ensures service authorization resides with the State and meets...

  10. ADVISORY ON UPDATED METHODOLOGY FOR ...

    EPA Pesticide Factsheets

    The National Academy of Sciences (NAS) published the Biological Effects of Ionizing Radiation (BEIR) committee's report (BEIR VII) on risks from ionizing radiation exposures in 2006. The Committee analyzed the most recent epidemiology from the important exposed cohorts and factored in changes resulting from the updated analysis of dosimetry for the Japanese atomic bomb survivors. To the extent practical, the Committee also considered relevant radiobiological data, including that from the Department of Energy's low dose effects research program. Based on the review of this information, the Committee proposed a set of models for estimating risks from low-dose ionizing radiation. ORIA then prepared a white paper revising the Agency's methodology for estimating cancer risks from exposure to ionizing radiation in light of this report and other relevant information. This is the first product to be developed as a result of the BEIR VII report. We requested that the SAB conduct an advisory during the development of this methodology. The second product to be prepared will be a revised version of the document,

  11. Minimizing radiation exposure during percutaneous nephrolithotomy.

    PubMed

    Chen, T T; Preminger, G M; Lipkin, M E

    2015-12-01

    Given the recent trends in growing per capita radiation dose from medical sources, there have been increasing concerns over patient radiation exposure. Patients with kidney stones undergoing percutaneous nephrolithotomy (PNL) are at particular risk for high radiation exposure. Several risk factors for increased radiation exposure during PNL exist, including a high body mass index, multiple access tracts, and increased stone burden. We herein review recent trends in radiation exposure, radiation exposure during PNL to both patients and urologists, and various approaches to reduce radiation exposure. We discuss incorporating the principles of As Low As Reasonably Achievable (ALARA) into clinical practice and review imaging techniques such as ultrasound and air contrast to guide PNL access. Alternative surgical techniques and approaches to reducing radiation exposure, including retrograde intra-renal surgery, retrograde nephrostomy, endoscopic-guided PNL, and minimally invasive PNL, are also highlighted. It is important for urologists to be aware of these concepts and techniques when treating stone patients with PNL. The discussions outlined will assist urologists in providing patient counseling and a high quality of care.

  12. Prochlorococcus: advantages and limits of minimalism.

    PubMed

    Partensky, Frédéric; Garczarek, Laurence

    2010-01-01

    Prochlorococcus is the key phytoplanktonic organism of tropical gyres, large ocean regions that are depleted of the essential macronutrients needed for photosynthesis and cell growth. This cyanobacterium has adapted itself to oligotrophy by minimizing the resources necessary for life through a drastic reduction of cell and genome sizes. This strategy, rarely observed in free-living organisms, has conferred on Prochlorococcus a considerable advantage over other phototrophs, including its closest relative Synechococcus, for life in this vast yet little-variable ecosystem. However, the strategy seems to reach its limits in the upper layer of the South Pacific gyre, the most oligotrophic region of the world ocean. By losing some important genes and/or functions during evolution, Prochlorococcus has seemingly become dependent on co-occurring microorganisms. In this review, we present some of the recent advances in the ecology, biology, and evolution of Prochlorococcus, which, because of its ecological importance and tiny genome, is rapidly imposing itself as a model organism in environmental microbiology.

  13. Wormholes minimally violating the null energy condition

    SciTech Connect

    Bouhmadi-López, Mariam; Lobo, Francisco S N; Martín-Moruno, Prado E-mail: fslobo@fc.ul.pt

    2014-11-01

    We consider novel wormhole solutions supported by a matter content that minimally violates the null energy condition. More specifically, we consider an equation of state in which the sum of the energy density and radial pressure is proportional to a constant with a value smaller than that of the inverse area characterising the system, i.e., the area of the wormhole mouth. This approach is motivated by a recently proposed cosmological event, denoted "the little sibling of the big rip", where the Hubble rate and the scale factor blow up but the cosmic derivative of the Hubble rate does not [1]. By using the cut-and-paste approach, we match interior spherically symmetric wormhole solutions to an exterior Schwarzschild geometry, and analyse the stability of the thin shell to linearized spherically symmetric perturbations around static solutions, by choosing suitable properties for the exotic material residing on the junction interface radius. Furthermore, we also consider an inhomogeneous generalization of the equation of state considered above and analyse the respective stability regions. In particular, we obtain a specific wormhole solution with an asymptotic behaviour corresponding to a global monopole.

  14. Cultural change and support of waste minimization

    SciTech Connect

    Boylan, M.S.

    1991-12-31

    The process of bringing a subject like pollution prevention to top-of-mind awareness, where designing to prevent waste becomes part of business as usual, is called cultural change. With Department of Energy orders and management waste minimization commitment statements on file, the REAL work is just beginning at the Idaho National Engineering Laboratory (INEL): shaping the attitudes of 11,000+ employees. The difficulties of such a task are daunting. The 890-square-mile INEL site and in-town support offices mean a huge diversity of employee jobs and waste streams, from cafeteria and auto maintenance wastes to high-level nuclear waste casks. INEL is pursuing a three-component cultural change strategy: training, publicity, and public outreach. To meet the intent of DOE orders, all INEL employees are slated to receive pollution prevention orientation training. More technical training is given to targeted groups like purchasing and design engineering. To keep newly learned pollution prevention concepts top of mind, extensive site-wide publicity is being developed and conducted, culminating in the April Pollution Prevention Awareness Week coinciding with Earth Day 1992. Finally, news of INEL pollution prevention successes is shared with the public to increase their overall environmental awareness and their knowledge of INEL activities. An important added benefit is the sense of pride the program instills in INEL employees to have their successes displayed so publicly.

  15. Minimal mimicry: mere effector matching induces preference.

    PubMed

    Sparenberg, Peggy; Topolinski, Sascha; Springer, Anne; Prinz, Wolfgang

    2012-12-01

    Both mimicking and being mimicked induce preference for a target. The present experiments investigate the minimal sufficient conditions for this mimicry-preference link to occur. We argue that mere effector matching between one's own and the other person's movement is sufficient to induce preference, independent of which movement is actually performed. In Experiments 1 and 2, participants moved either their arms or legs, and watched avatars that moved either their arms or legs, respectively, without any instructions to mimic. The executed movements themselves and their pace were completely different between participants (fast circular movements) and targets (slow linear movements). Participants preferred avatars that moved the same body part as they did over avatars that moved a different body part. In Experiment 3, using human targets and differently paced movements, movement similarity was manipulated in addition to effector overlap (moving forward-backward or sideways with arms or legs, respectively). Only effector matching, but not movement matching, influenced preference ratings. These findings suggest that mere effector overlap is sufficient to trigger preference by mimicry.

  16. Process optimized minimally invasive total hip replacement

    PubMed Central

    Gebel, Philipp; Oszwald, Markus; Ishaque, Bernd; Ahmed, Gaffar; Blessing, Recha; Thorey, Fritz; Ottersbach, Andreas

    2012-01-01

    The purpose of this study was to analyse a new concept of using the minimally invasive direct anterior approach (DAA) in total hip replacement (THR) in combination with the leg positioner (Rotex-Table) and a modified retractor system (Condor). We retrospectively evaluated the first 100 primary THRs operated with the new concept between 2009 and 2010, regarding operative data and radiological and clinical outcome (HOOS). All surgeries were performed in a standardized operative technique including navigation. The average age of the patients was 68 years (37 to 92 years), with a mean BMI of 26.5 (17 to 43). The mean time of surgery was 80 min (55 to 130 min). The blood loss showed an average of 511.5 mL (200 to 1000 mL). No intra-operative complications occurred. The postoperative complication rate was 6%. The HOOS increased from 43 points pre-operatively to 90 (max 100 points) 3 months after surgery. The radiological analysis showed an average cup inclination of 43° and a leg length discrepancy within a range of +/− 5 mm in 99%. The presented technique led to excellent clinical results, showed low complication rates and allowed correct implant positioning while saving manpower. PMID:22577504

  17. Infrared dynamics of minimal walking technicolor

    SciTech Connect

    Del Debbio, Luigi; Lucini, Biagio; Patella, Agostino; Pica, Claudio; Rago, Antonio

    2010-07-01

    We study the gauge sector of minimal walking technicolor, which is an SU(2) gauge theory with n_f = 2 flavors of Wilson fermions in the adjoint representation. Numerical simulations are performed on lattices N_t × N_s³, with N_s ranging from 8 to 16 and N_t = 2N_s, at fixed β = 2.25, and varying the fermion bare mass m_0, so that our numerical results cover the full range of fermion masses from the quenched region to the chiral limit. We present results for the string tension and the glueball spectrum. A comparison of mesonic and gluonic observables leads to the conclusion that the infrared dynamics is given by an SU(2) pure Yang-Mills theory with a typical energy scale for the spectrum sliding to zero with the fermion mass. The typical mesonic mass scale is proportional to and much larger than this gluonic scale. Our findings are compatible with a scenario in which the massless theory is conformal in the infrared. An analysis of the scaling of the string tension with the fermion mass toward the massless limit allows us to extract the chiral condensate anomalous dimension γ*, which is found to be γ* = 0.22 ± 0.06.

  18. Orbital debris minimization and mitigation techniques

    NASA Technical Reports Server (NTRS)

    Loftus, Joseph P., Jr.; Anz-Meador, Phillip D.; Reynolds, Robert

    1992-01-01

    Man's activity in space has generated significant amounts of debris that remain in orbit for periods of sufficient duration to become a hazard to future space activities. Upper stages and spacecraft that have ended their functional life are the largest objects. In the past, additional debris has been generated by inadvertent explosions of upper stages and spacecraft, by intentional explosions for military reasons, and possibly by a few breakups resulting from collisions. In the future, debris can be generated by collisions among spacecraft as the number of orbital objects continues to grow at rates greater than natural forces remove them from orbit. There are design and operations practices that can minimize the inadvertent generation of debris. There are other design and operations options for removing objects from space at the end of their useful service so they are not available as a source for the generation of future debris. Those studies are the primary concern of this paper. The most economic removal of objects is achieved when those objects have the capability to execute the necessary maneuvers with their own systems and resources. The most costly option is to have some other system remove the spacecraft after it has become a derelict. Numerous options are being studied to develop systems and techniques that can remove spacecraft from useful orbits at the end of their useful life and do so for the least mass penalty and economic cost.

  19. Linearized Functional Minimization for Inverse Modeling

    SciTech Connect

    Wohlberg, Brendt; Tartakovsky, Daniel M.; Dentz, Marco

    2012-06-21

    Heterogeneous aquifers typically consist of multiple lithofacies, whose spatial arrangement significantly affects flow and transport. The estimation of these lithofacies is complicated by the scarcity of data and by the lack of a clear correlation between identifiable geologic indicators and attributes. We introduce a new inverse-modeling approach to estimate both the spatial extent of hydrofacies and their properties from sparse measurements of hydraulic conductivity and hydraulic head. Our approach is to minimize a functional defined on the vectors of values of hydraulic conductivity and hydraulic head fields defined on regular grids at a user-determined resolution. This functional is constructed to (i) enforce the relationship between conductivity and heads provided by the groundwater flow equation, (ii) penalize deviations of the reconstructed fields from measurements where they are available, and (iii) penalize reconstructed fields that are not piece-wise smooth. We develop an iterative solver for this functional that exploits a local linearization of the mapping from conductivity to head. This approach provides a computationally efficient algorithm that rapidly converges to a solution. A series of numerical experiments demonstrates the robustness of our approach.
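
    As a toy illustration of this linearize-and-iterate idea, the following Python sketch inverts a 1D steady-flow analogue by Gauss-Newton iteration with a data-misfit term and a roughness penalty. The grid size, observation points, penalty weight, and helper names (solve_head, residual) are illustrative assumptions, not the authors' setup:

        import numpy as np

        n = 20                       # grid cells
        h_left, h_right = 1.0, 0.0   # fixed-head boundary conditions

        def solve_head(logK):
            """Solve -d/dx(K dh/dx) = 0 on a regular grid for given log-conductivity."""
            K = np.exp(logK)
            Kf = 0.5 * (K[:-1] + K[1:])          # interface conductivities
            A = np.zeros((n, n)); b = np.zeros(n)
            A[0, 0] = A[-1, -1] = 1.0; b[0], b[-1] = h_left, h_right
            for i in range(1, n - 1):
                A[i, i - 1], A[i, i + 1] = Kf[i - 1], Kf[i]
                A[i, i] = -(Kf[i - 1] + Kf[i])
            return np.linalg.solve(A, b)

        # synthetic "truth" and sparse head observations
        logK_true = np.where(np.arange(n) < n // 2, 0.0, -1.5)
        h_obs = solve_head(logK_true)
        obs_idx = [3, 9, 15]

        def residual(logK, beta=0.3):
            r_data = solve_head(logK)[obs_idx] - h_obs[obs_idx]   # (ii) data misfit
            r_smooth = beta * np.diff(logK)                       # (iii) roughness penalty
            return np.concatenate([r_data, r_smooth])

        # Gauss-Newton: local linearization of the conductivity-to-head map (i)
        logK = np.zeros(n)
        for it in range(10):
            r = residual(logK)
            J = np.zeros((len(r), n))
            for j in range(n):                    # finite-difference Jacobian
                e = np.zeros(n); e[j] = 1e-6
                J[:, j] = (residual(logK + e) - r) / 1e-6
            step = np.linalg.lstsq(J, -r, rcond=None)[0]
            logK = logK + step
            print(f"iter {it}: misfit {np.linalg.norm(r):.3e}")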

  20. Minimal genetic device with multiple tunable functions

    NASA Astrophysics Data System (ADS)

    Bagh, Sangram; Mandal, Mahuya; McMillen, David R.

    2010-08-01

    The ability to design artificial genetic devices with predictable functions is critical to the development of synthetic biology. Given the highly variable requirements of biological designs, the ability to tune the behavior of a genetic device is also of key importance; such tuning will allow devices to be matched with other components into larger systems, and to be shifted into the correct parameter regimes to elicit desired behaviors. Here, we have developed a minimal synthetic genetic system that acts as a multifunction, tunable biodevice in the bacterium Escherichia coli. First, it acts as a biochemical AND gate, sensing the extracellular small molecules isopropyl β-D-1-thiogalactopyranoside and anhydrotetracycline as two input signals and expressing enhanced green fluorescent protein as an output signal. Next, the output signal of the AND gate can be amplified by the application of another extracellular chemical, arabinose. Further, the system can generate a wide range of chemically tunable single input-output response curves, without any genetic alteration of the circuit, by varying the concentrations of a set of extracellular small molecules. We have developed and parameterized a simple transfer function model for the system, and shown that the model successfully explains and predicts the quantitative relationships between input and output signals in the system.
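
    A transfer-function model of this general kind can be sketched with generic Hill functions; the parameter names and values below (K_iptg, K_atc, n, v_max) are illustrative assumptions, not the paper's fitted parameters:

        import numpy as np

        def hill(x, K, n):
            """Fractional activation of a promoter by inducer concentration x."""
            return x**n / (K**n + x**n)

        def gfp_output(iptg, atc, K_iptg=50.0, K_atc=20.0, n=2.0, v_max=1000.0):
            """AND-gate output: both inducers must be present for strong expression."""
            return v_max * hill(iptg, K_iptg, n) * hill(atc, K_atc, n)

        # response is low when either input is absent, high when both are present
        for iptg, atc in [(0, 0), (200, 0), (0, 100), (200, 100)]:
            print(f"IPTG={iptg:4} aTc={atc:4} -> GFP {gfp_output(iptg, atc):8.1f}")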

  1. Modeling minimal residual disease (MRD)-testing.

    PubMed

    Butturini, Anna; Klein, John; Gale, Robert Peter

    2003-04-01

    There is considerable effort to develop more sensitive methods to detect minimal residual disease (MRD) in bone marrow and blood samples of persons with cancer. Results of MRD-testing are used to predict clinical outcome and determine if more anti-cancer therapy is needed. Mathematical models were developed to assess factors affecting sensitivity and specificity of MRD-testing at diverse cancer cell prevalences. Modeling results and predictions were compared to results of large published studies. Accuracy of MRD-testing depends on cancer cell prevalence and distribution in the blood or bone marrow of the subject, sensitivity and specificity of the MRD-test, and sample size. In subjects with low cancer cell prevalences (≤ 10^-4), results of MRD-testing are likely inaccurate. Increasingly sensitive MRD-tests are only marginally useful; the major obstacle to accuracy is inadequate sampling. Increasing the sensitivity of methods to detect MRD is unlikely to be sufficient to increase accuracy of MRD-testing. In contrast, increased sampling (size and frequency) and assigning a high cut-off value (for example, ≥ 10^-3) to declare a MRD-test positive will increase sensitivity and specificity, respectively.
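
    The sampling argument can be made concrete with a short calculation: even a perfectly sensitive assay cannot detect MRD if no cancer cell lands in the sample. The binomial sampling assumption and the numbers below are illustrative, not the authors' exact model:

        def p_detect(prevalence, n_cells, test_sens=1.0):
            """P(at least one cancer cell is sampled and detected)."""
            p_none = (1.0 - prevalence) ** n_cells   # no cancer cell in the sample
            return (1.0 - p_none) * test_sens

        for prev in (1e-3, 1e-4, 1e-5):
            for n in (10_000, 100_000, 1_000_000):
                print(f"prevalence {prev:.0e}, sample {n:>9,} cells: "
                      f"P(detect) = {p_detect(prev, n):.3f}")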

  2. Bacterial Stressors in Minimally Processed Food

    PubMed Central

    Capozzi, Vittorio; Fiocco, Daniela; Amodio, Maria Luisa; Gallone, Anna; Spano, Giuseppe

    2009-01-01

    Stress responses are of particular importance to microorganisms, because their habitats are subjected to continual changes in temperature, osmotic pressure, and nutrient availability. Stressors (and stress factors) may be of a chemical, physical, or biological nature. While stress to microorganisms is frequently caused by the surrounding environment, the growth of microbial cells on its own may also result in the induction of some kinds of stress, such as starvation and acidity. During production of fresh-cut produce, cumulative mild processing steps are employed to control the growth of microorganisms. Pathogens on plant surfaces are already stressed, and stress may be increased during the multiple mild processing steps, potentially leading to very hardy bacteria geared towards enhanced survival. Cross-protection can occur because overlapping stress responses enable bacteria exposed to one stress to become resistant to another stress. A number of stresses have been shown to induce cross-protection, including heat, cold, acid, and osmotic stress. Among other factors, adaptation to heat stress appears to provide bacterial cells with more pronounced cross-protection against several other stresses. Understanding how pathogens sense and respond to mild stresses is essential in order to design safe and effective minimal processing regimes. PMID:19742126

  3. Prochlorococcus: Advantages and Limits of Minimalism

    NASA Astrophysics Data System (ADS)

    Partensky, Frédéric; Garczarek, Laurence

    2010-01-01

    Prochlorococcus is the key phytoplanktonic organism of tropical gyres, large ocean regions that are depleted of the essential macronutrients needed for photosynthesis and cell growth. This cyanobacterium has adapted itself to oligotrophy by minimizing the resources necessary for life through a drastic reduction of cell and genome sizes. This strategy, rarely observed in free-living organisms, has conferred on Prochlorococcus a considerable advantage over other phototrophs, including its closest relative Synechococcus, for life in this vast yet scarcely variable ecosystem. However, this strategy seems to reach its limits in the upper layer of the South Pacific gyre, the most oligotrophic region of the world ocean. By losing some important genes and/or functions during evolution, Prochlorococcus has seemingly become dependent on co-occurring microorganisms. In this review, we present some of the recent advances in the ecology, biology, and evolution of Prochlorococcus, which, because of its ecological importance and tiny genome, is rapidly establishing itself as a model organism in environmental microbiology.

  4. Minimizing metastatic risk in radiotherapy fractionation schedules

    NASA Astrophysics Data System (ADS)

    Badri, Hamidreza; Ramakrishnan, Jagdish; Leder, Kevin

    2015-11-01

    Metastasis is the process by which cells from a primary tumor disperse and form new tumors at distant anatomical locations. The treatment and prevention of metastatic cancer remains an extremely challenging problem. This work introduces a novel biologically motivated objective function to the radiation optimization community that takes into account metastatic risk instead of the status of the primary tumor. In this work, we consider the problem of developing fractionated irradiation schedules that minimize production of metastatic cancer cells while keeping normal tissue damage below an acceptable level. A dynamic programming framework is utilized to determine the optimal fractionation scheme. We evaluated our approach on a breast cancer case using the heart and the lung as organs-at-risk (OAR). For small tumor α/β values, hypo-fractionated schedules were optimal, which is consistent with standard models. However, for relatively larger α/β values, we found the type of schedule depended on various parameters such as the time when metastatic risk was evaluated, the α/β values of the OARs, and the normal tissue sparing factors. Interestingly, in contrast to standard models, hypo-fractionated and semi-hypo-fractionated schedules (large initial doses with doses tapering off with time) were suggested even with large tumor α/β values. Numerical results indicate the potential for significant reduction in metastatic risk.
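
    The fractionation trade-off can be illustrated with the standard linear-quadratic biologically effective dose (BED) that underlies such comparisons; the schedules and α/β values below are illustrative assumptions, and the paper's dynamic program is not reproduced here:

        def bed(doses, alpha_beta):
            """BED of a schedule: sum_i d_i * (1 + d_i / (alpha/beta))."""
            return sum(d * (1.0 + d / alpha_beta) for d in doses)

        conventional = [2.0] * 30          # 30 x 2 Gy
        hypofractionated = [5.0] * 12      # 12 x 5 Gy, same physical dose (60 Gy)

        for ab, tissue in [(10.0, "tumour"), (3.0, "late-responding OAR")]:
            print(f"alpha/beta = {ab:4.1f} ({tissue}):"
                  f" conv BED {bed(conventional, ab):6.1f} Gy,"
                  f" hypo BED {bed(hypofractionated, ab):6.1f} Gy")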

  5. Minimal Technologies Application Project: Planning and installation

    SciTech Connect

    Zellmer, S.D.; Hinchman, R.R.; Severinghaus, W.D.; Johnson, D.O.; Brent, J.J.

    1989-03-01

    Intensive and continuous tactical training during the last 35 years at the Hohenfels Training Area in West Germany has caused the loss of vegetative ground cover and has accelerated soil erosion rates, resulting in extensive environmental damage, safety hazards, and unrealistic training habitats. The objectives of this project are to develop and evaluate revegetation procedures for establishing adequate vegetative cover to control erosion at minimal cost and disruption to training activities. This project involved the development and installation of 12 revegetation procedures that combined four seedbed preparation methods and seeding options with three site-closure periods. In March 1987, the four seedbed preparation/seeding options and closure periods were selected, a study site design and location chosen, and specifications for the revegetation procedures developed. A German rehabilitation contractor attempted the specified seedbed preparation and seeding on the 13.5-ha site in June, but abnormally high rainfall, unusually wet site conditions, and a lack of adequate equipment prevented the contractor from completing six of the 12 planned procedures. Planning and execution of the project have nonetheless provided valuable information on the importance and use of soil analytical results, seed availability and cost data, contractor equipment requirements, and the time required for planning future revegetation efforts. Continued monitoring of vegetative ground cover at the site for the next two years, combined with cost information, will provide the necessary data to determine which of the six revegetation procedures is most effective. These data will be used in planning future rehabilitation efforts on tactical training areas.

  6. Linear functional minimization for inverse modeling

    DOE PAGES

    Barajas-Solano, David A.; Wohlberg, Brendt Egon; Vesselinov, Velimir Valentinov; ...

    2015-06-01

    In this paper, we present a novel inverse modeling strategy to estimate spatially distributed parameters of nonlinear models. The maximum a posteriori (MAP) estimators of these parameters are based on a likelihood functional, which contains spatially discrete measurements of the system parameters and spatiotemporally discrete measurements of the transient system states. The piecewise continuity prior for the parameters is expressed via Total Variation (TV) regularization. The MAP estimator is computed by minimizing a nonquadratic objective equipped with the TV operator. We apply this inversion algorithm to estimate hydraulic conductivity of a synthetic confined aquifer from measurements of conductivity and hydraulic head. The synthetic conductivity field is composed of a low-conductivity heterogeneous intrusion into a high-conductivity heterogeneous medium. Our algorithm accurately reconstructs the location, orientation, and extent of the intrusion from the steady-state data only. Finally, addition of transient measurements of hydraulic head improves the parameter estimation, accurately reconstructing the conductivity field in the vicinity of observation locations.
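
    A minimal sketch of a TV-regularized MAP objective of this type, with the forward model abstracted away and illustrative weights (mu, lam) that stand in for the likelihood and prior terms:

        import numpy as np

        def tv(K):
            """Isotropic total variation of a 2D field: sum of gradient magnitudes."""
            gx = np.diff(K, axis=0); gy = np.diff(K, axis=1)
            return np.sum(np.sqrt(gx[:, :-1]**2 + gy[:-1, :]**2 + 1e-12))

        def map_objective(K, K_obs, mask, mu=1.0, lam=0.1):
            """Data misfit at observed cells + TV prior on the whole field."""
            misfit = np.sum(((K - K_obs)[mask])**2)
            return mu * misfit + lam * tv(K)

        # a piecewise-constant field scores a lower TV penalty than a noisy one
        rng = np.random.default_rng(0)
        K_true = np.zeros((32, 32)); K_true[8:20, 10:25] = 1.0   # intrusion
        mask = rng.random(K_true.shape) < 0.05                   # sparse observations
        noisy = K_true + 0.1 * rng.standard_normal(K_true.shape)
        print("TV of piecewise-constant field:", tv(K_true))
        print("TV of noisy field:             ", tv(noisy))
        print("objective at truth:", map_objective(K_true, K_true, mask))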

  7. Flavor mixing democracy and minimal CP violation

    NASA Astrophysics Data System (ADS)

    Gerard, Jean-Marc; Xing, Zhi-zhong

    2012-06-01

    We point out that there is a unique parametrization of quark flavor mixing in which every angle is close to the Cabibbo angle θC ≃ 13° with the CP-violating phase ϕq around 1°, implying that they might all be related to the strong hierarchy among quark masses. Applying the same parametrization to lepton flavor mixing, we find that all three mixing angles are comparably large (around π/4) and the Dirac CP-violating phase ϕl is also minimal as compared with its values in the other eight possible parametrizations. In this spirit, we propose a simple neutrino mixing ansatz which is equivalent to the tri-bimaximal flavor mixing pattern in the ϕl → 0 limit and predicts sin θ13 = (1/√2) sin(ϕl/2) for reactor antineutrino oscillations. Hence the Jarlskog invariant of leptonic CP violation, Jl = (sin ϕl)/12, can reach a few percent if θ13 lies in the range 7° ⩽ θ13 ⩽ 10°.
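
    The two quoted relations can be checked numerically; the ϕl values below are illustrative:

        import math

        # sin(theta13) = sin(phi_l / 2) / sqrt(2)  and  J_l = sin(phi_l) / 12
        for phi_deg in (10, 20, 25):
            phi = math.radians(phi_deg)
            theta13 = math.degrees(math.asin(math.sin(phi / 2) / math.sqrt(2)))
            J = math.sin(phi) / 12.0
            print(f"phi_l = {phi_deg:2d} deg -> theta13 = {theta13:5.2f} deg, J_l = {J:.4f}")
        # phi_l around 20-25 deg gives theta13 in the quoted 7-10 deg window
        # and J_l of a few percent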

  8. Hazardous waste minimization report for CY 1986

    SciTech Connect

    Kendrick, C.M.

    1990-12-01

    Oak Ridge National Laboratory (ORNL) is a multipurpose research and development facility. Its primary role is the support of energy technology through applied research and engineering development and scientific research in basic and physical sciences. ORNL also is a valuable resource in the solution of problems of national importance, such as nuclear and chemical waste management. In addition, useful radioactive and stable isotopes which are unavailable from the private sector are produced at ORNL. As a result of these activities, hazardous, radioactive, and mixed wastes are generated at ORNL. A formal hazardous waste minimization program for ORNL was launched in mid 1985 in response to the requirements of Section 3002 of the Resource Conservation and Recovery Act (RCRA). During 1986, a task plan was developed. The six major tasks include: planning and implementation of a laboratory-wide chemical inventory and the subsequent distribution, treatment, storage, and/or disposal (TSD) of unneeded chemicals; establishment and implementation of a distribution system for surplus chemicals to other (internal and external) organizations; training and communication functions necessary to inform and motivate laboratory personnel; evaluation of current procurement and tracking systems for hazardous materials and recommendation and implementation of improvements; systematic review of applicable current and proposed ORNL procedures and ongoing and proposed activities for waste volume and/or toxicity reduction potential; and establishment of criteria by which to measure progress and reporting of significant achievements. 8 refs., 1 fig., 5 tabs.

  9. Minimizing or eliminating refueling of nuclear reactor

    DOEpatents

    Doncals, Richard A.; Paik, Nam-Chin; Andre, Sandra V.; Porter, Charles A.; Rathbun, Roy W.; Schwallie, Ambrose L.; Petras, Diane S.

    1989-01-01

    Demand for refueling of a liquid metal fast nuclear reactor having a life of 30 years is eliminated or reduced to intervals of at least 10 years by operating the reactor at a low linear-power density, typically 2.5 kW/ft of fuel rod, rather than 7.5 or 15 kW/ft, which is the prior art practice. So that power of the same magnitude as for prior art reactors is produced, the volume of the core is increased. In addition, the height of the core and its diameter are dimensioned so that the ratio of the height to the diameter approximates 1 to the extent practicable, considering the requirements of control and that the pressure drop in the coolant shall not be excessive. The surface area of a cylinder of given volume is a minimum if the ratio of the height to the diameter is 1; by minimizing the surface area, the leakage of neutrons is reduced. By reducing the linear-power density, increasing core volume, reducing fissile enrichment, and optimizing core geometry, internal-core breeding of fissionable fuel is substantially enhanced. As a result, core operational life, limited by control worth requirements and fuel burnup capability, is extended up to 30 years of continuous power operation.
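
    The geometric claim is easy to verify numerically: for a cylinder of fixed volume, the total surface area S = 2πr² + 2πrh is smallest when the height equals the diameter. A short check, with an arbitrary illustrative volume:

        import math

        V = 100.0  # fixed core volume, arbitrary units

        def surface(ratio):
            """Surface area of a cylinder with h/d = ratio and volume V."""
            # V = pi r^2 h = 2 pi r^3 ratio  =>  r = (V / (2 pi ratio))^(1/3)
            r = (V / (2.0 * math.pi * ratio)) ** (1.0 / 3.0)
            h = 2.0 * r * ratio
            return 2.0 * math.pi * r**2 + 2.0 * math.pi * r * h

        for ratio in (0.5, 0.8, 1.0, 1.2, 2.0):
            print(f"h/d = {ratio:3.1f} -> surface area {surface(ratio):7.3f}")
        # the minimum, and hence minimal neutron-leakage area, occurs at h/d = 1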

  10. Minimal flow units for magnetohydrodynamic turbulence

    NASA Astrophysics Data System (ADS)

    Orlandi, P.

    2016-08-01

    We present direct numerical simulations of two minimal flow units (MFUs) to investigate the differences between inviscid and viscous simulations, and the different behavior of the evolution for conducting fluids. In these circumstances the introduction of the Lorentz force in the momentum equation produces different scenarios. The Taylor-Green vortex was, in the past, an MFU widely considered for both conducting and non-conducting fluids. The simulations were performed by pseudo-spectral numerical methods; these are repeated here using a second-order accurate, energy-conserving finite-difference scheme for ν = 0. Having observed that this initial condition could be inefficient for capturing the eventual occurrence of a finite-time singularity, a potentially more efficient MFU consisting of two interacting Lamb dipoles was considered. It was found that the two flows have a different time evolution in the vortex-dominated stage. In this stage, turbulent structures of different size are generated, leading to spectra, in the inviscid conditions, with a k^-3 range. In real conditions the viscosity produces smaller scales characteristic of fully developed turbulence, with energy spectra showing well-defined exponential and inertial ranges. In the presence of non-conducting conditions the passive vector behaves as the vorticity. The evolution is different in the presence of conducting conditions. Although the time evolution is different, both flows lead to spectra in Kolmogorov units with the same shape at high and intermediate wave numbers.

  11. Minimal realistic SU(5) Grand Unified Theory

    NASA Astrophysics Data System (ADS)

    Assad, Nima

    2016-03-01

    Although the Standard Model (SM) of particle physics makes predictions in unprecedented agreement with experiment, such as the magnetic dipole moment of the electron to one part in a billion, the experimental confirmation of neutrino flavor oscillations, and thus of massive neutrinos, implies that the SM is incomplete. An extension of the SM, which retains its low-energy predictions while accounting for massive neutrinos, is achieved through the introduction of the dimension-5 Weinberg operator and its associated energy scale above the electroweak scale (10^2 GeV) but below the Planck scale (10^19 GeV). The Beyond Standard Model (BSM) class of Grand Unified Theories (GUTs) implicates such a scale (10^16 GeV) in the unification of the three SM gauge couplings, thus making the origin of neutrino mass a theoretically appealing probe into particle behavior at energies currently inaccessible to experiment. Here, we compare the 24F and 15H extensions of the Georgi-Glashow SU(5) GUT in their ability to accommodate massive neutrinos and to unify SM gauge couplings while minimizing the theory's additional field content. Using the Monte Carlo event generator MadGraph, each extension is found to produce distinct signatures at run II of the LHC.

  12. New Methodology for Estimating Fuel Economy by Vehicle Class

    SciTech Connect

    Chin, Shih-Miao; Dabbs, Kathryn; Hwang, Ho-Ling

    2011-01-01

    This work was carried out for the Office of Highway Policy Information to develop a new methodology for generating annual estimates of average fuel efficiency and the number of motor vehicles registered by vehicle class for Table VM-1 of the Highway Statistics annual publication. This paper describes the new methodology developed under this effort and compares the results of the existing manual method and the new systematic approach. The methodology takes a two-step approach. First, preliminary fuel efficiency rates are estimated based on vehicle stock models for different classes of vehicles. Then, a reconciliation model is used to adjust the initial fuel consumption rates from the vehicle stock models and match the VMT information for each vehicle class to the reported total fuel consumption. This reconciliation model utilizes a systematic approach that produces documentable and reproducible results. The basic framework utilizes a mathematical programming formulation to minimize the deviations between the fuel economy estimates published in the previous year's Highway Statistics and the results from the vehicle stock models, subject to the constraint that fuel consumption for the different vehicle classes must sum to the total fuel consumption estimate published in Table MF-21 of the current year's Highway Statistics. The results generated from this new approach provide a smoother time series for the fuel economies by vehicle class. The approach also utilizes the most up-to-date and best available data with sound econometric models to generate MPG estimates by vehicle class.
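
    A reconciliation step of this general shape can be sketched as a constrained minimization: stay close to the prior-year estimates while forcing class fuel consumptions to sum to the published total. The vehicle classes, numbers, and quadratic objective below are illustrative assumptions, not the actual model:

        import numpy as np
        from scipy.optimize import minimize

        vmt = np.array([1100.0, 600.0, 170.0, 280.0])   # billion vehicle-miles by class
        mpg_prior = np.array([23.0, 17.5, 7.0, 44.0])   # prior-year MPG estimates
        total_fuel = 120.0                              # published total, billion gallons

        def objective(mpg):
            return np.sum(((mpg - mpg_prior) / mpg_prior) ** 2)  # stay near priors

        def fuel_gap(mpg):
            return np.sum(vmt / mpg) - total_fuel               # must equal zero

        res = minimize(objective, mpg_prior, method="SLSQP",
                       constraints=[{"type": "eq", "fun": fuel_gap}],
                       bounds=[(1.0, None)] * 4)
        print("reconciled MPG by class:", np.round(res.x, 2))
        print("fuel totals match:", abs(fuel_gap(res.x)) < 1e-3)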

  13. A new drop-shape methodology for surface tension measurement

    NASA Astrophysics Data System (ADS)

    Cabezas, M. G.; Bateni, A.; Montanero, J. M.; Neumann, A. W.

    2004-11-01

    Drop-shape techniques, such as axisymmetric drop-shape analysis (ADSA), have been widely used to measure surface tension. In the current schemes, theoretical curves are fitted to the experimental profiles by adjusting the value of surface tension. The best match between theoretical and experimental profiles identifies the surface tension of the drop. Extracting the experimental drop profile using edge detection is an important part of current drop-shape techniques. However, edge detection fails when acquisition of sharp images is not possible due to experimental or optical limitations. A new drop-shape approach is presented which eliminates the need for edge detection and provides a wider range of applicability. The new methodology, called theoretical image fitting analysis (TIFA), generates theoretical images of the drop and forms an error function that describes the pixel-by-pixel deviation of the theoretical image from the experimental one. Taking surface tension as an adjustable parameter, TIFA minimizes the error function, i.e. fits the theoretical image to the experimental one. The validity of the new methodology is examined by comparing its results with those of ADSA. Using the new methodology it is finally possible to extend the study of the surface tension of lung surfactants to higher concentrations; due to the opaqueness of the solution, such studies were heretofore limited to low surfactant concentrations.
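
    The core idea, fitting a rendered image directly to the observed one without edge detection, can be sketched in one dimension; the render function and shape parameter below are illustrative stand-ins for the full Young-Laplace image model:

        import numpy as np
        from scipy.optimize import minimize_scalar

        x = np.linspace(-1.0, 1.0, 200)

        def render(gamma):
            """Toy theoretical 'image': a profile whose width scales with tension."""
            return np.exp(-x**2 / (2.0 * gamma**2))

        rng = np.random.default_rng(1)
        observed = render(0.35) + 0.05 * rng.standard_normal(x.size)  # blurry, noisy

        def pixel_error(gamma):
            return np.sum((render(gamma) - observed) ** 2)   # no edge detection needed

        fit = minimize_scalar(pixel_error, bounds=(0.05, 1.0), method="bounded")
        print(f"recovered shape parameter: {fit.x:.3f} (truth 0.35)")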

  14. Minimal flavor violation in the minimal U(1)B-L model and resonant leptogenesis

    NASA Astrophysics Data System (ADS)

    Okada, Nobuchika; Orikasa, Yuta; Yamada, Toshifumi

    2012-10-01

    We investigate the resonant leptogenesis scenario in the minimally U(1)B-L extended standard model with minimal flavor violation. In our model, the U(1)B-L gauge symmetry is broken at the TeV scale and standard model singlet neutrinos gain Majorana masses of order TeV. In addition, we introduce a flavor symmetry on the singlet neutrinos at a scale higher than TeV. The flavor symmetry is explicitly broken by the neutrino Dirac Yukawa coupling, which induces splittings in the singlet neutrino Majorana masses at lower scales through renormalization group evolutions. We call this setup minimal flavor violation. The mass splittings are proportional to the tiny Dirac Yukawa coupling, and hence they automatically enhance the CP asymmetry parameter necessary for the resonant leptogenesis mechanism. In this paper, we calculate the baryon number yield by solving the Boltzmann equations, including the effects of U(1)B-L gauge boson that also has TeV scale mass and causes washing-out of the singlet neutrinos in the course of thermal leptogenesis. The Dirac Yukawa coupling for neutrinos is fixed in terms of neutrino oscillation data and an arbitrary 3×3 complex-valued orthogonal matrix. We show that the right amount of baryon number asymmetry can be achieved through thermal leptogenesis in the context of the minimal flavor violation with singlet neutrinos and U(1)B-L gauge boson at the TeV scale. These particles can be discovered at the LHC in the near future.

  15. A new minimally invasive technique for cholecystectomy. Subxiphoid "minimal stress triangle": microceliotomy.

    PubMed Central

    Tyagi, N S; Meredith, M C; Lumb, J C; Cacdac, R G; Vanterpool, C C; Rayls, K R; Zerega, W D; Silbergleit, A

    1994-01-01

    OBJECTIVE: The authors devised a minimally invasive technique for cholecystectomy via microceliotomy that provides safety attainable with the open conventional approach and postoperative results comparable to laparoscopic cholecystectomy. SUMMARY BACKGROUND DATA: Laparoscopic cholecystectomy has evolved as a minimally invasive outpatient procedure. Patients can return rapidly to preoperative status with minimal postoperative morbidity and pain, and the small scar size is cosmetically desirable. Unfortunately, there are reports of serious intraoperative complications, including injury to blood vessels, bowel, and the bile ducts, caused by failure to identify structures properly. The conventional cholecystectomy technique currently is relegated to patients on whom the laparoscopic procedure cannot be performed. METHODS: Cholecystectomy was performed through a 3-cm transverse high subxiphoid incision in the "minimal stress triangle." The location, anterior to Calot's triangle, was critical in providing a direct vertical view of the biliary ducts during dissection. Direct view cholecystectomy was performed using endoscopic instruments without pneumoperitoneum. Postoperative data were compared with both laparoscopic and open cholecystectomy results. RESULTS: Using the microceliotomy technique in the ambulatory setting, cholecystectomy was performed successfully in 99.3% (N = 143) of cases. Biliary leakage beyond the third postoperative day was caused by failure of clips or obstruction to bile flow. The postoperative morbidity, acceptability of scar, and analgesic requirements compare favorably with other techniques. Microceliotomy is cost effective. Portal hypertension is a contraindication for this procedure. CONCLUSIONS: The microceliotomy approach offers a viable, safe, and cost-effective alternative to the laparoscopic technique for cholecystectomy, especially when facilities for laparoscopy are not available or when the laparoscopic procedure cannot be performed.

  16. Is it possible to standardize the treatment of primary spontaneous pneumothorax? Part 1: etiology, symptoms, diagnostics, minimally invasive treatment

    PubMed Central

    Rokicki, Marek; Wojtacha, Jacek; Filipowski, Marek; Dżejlili, Agata; Czyżewski, Damian

    2016-01-01

    The authors of this report present the history of primary spontaneous pneumothorax (PSP) treatment, its etiology, clinical symptoms, and diagnostic methodology. Further, they discuss minimally invasive methods of treating PSP such as thoracentesis and chemical pleurodesis. They discuss the pros and cons of each method, emphasizing that, according to the international recommendations, they should be used as the first line of treatment for PSP. PMID:28096829

  17. Beyond minimal lepton-flavored Dark Matter

    SciTech Connect

    Chen, Mu-Chun; Huang, Jinrui; Takhistov, Volodymyr

    2016-02-09

    In this paper, we consider a class of flavored dark matter (DM) theories where dark matter interacts with the Standard Model lepton fields at the renormalizable level. We allow for a general coupling matrix between the dark matter and leptons whose structure is beyond the one permitted by the minimal flavor violation (MFV) assumption. It is assumed that this is the only new source of flavor violation in addition to the Standard Model (SM) Yukawa interactions. The setup can be described by augmenting the SM flavor symmetry by an additional SU(3)χ, under which the dark matter χ transforms. This framework is especially phenomenologically rich, due to possible novel flavor-changing interactions which are not present within the more restrictive MFV framework. As a representative case study of this setting, which we call “beyond MFV” (BMFV), we consider Dirac fermion dark matter which transforms as a singlet under the SM gauge group and a triplet under SU(3)χ. The DM fermion couples to the SM lepton sector through a scalar mediator Φ. Unlike the case of quark-flavored DM, we show that there is no Z3 symmetry within either the MFV or BMFV settings which automatically stabilizes the lepton-flavored DM. We discuss constraints on this setup from flavor-changing processes, DM relic abundance as well as direct and indirect detections. We find that relatively large flavor-changing couplings are possible, while the dark matter mass is still within the phenomenologically interesting region below the TeV scale. Collider signatures which can be potentially searched for at the lepton and hadron colliders are discussed. Finally, we discuss the implications for decaying dark matter, which can appear if an additional stabilizing symmetry is not imposed.

  18. Beyond minimal lepton-flavored Dark Matter

    DOE PAGES

    Chen, Mu-Chun; Huang, Jinrui; Takhistov, Volodymyr

    2016-02-09

    In this paper, we consider a class of flavored dark matter (DM) theories where dark matter interacts with the Standard Model lepton fields at the renormalizable level. We allow for a general coupling matrix between the dark matter and leptons whose structure is beyond the one permitted by the minimal flavor violation (MFV) assumption. It is assumed that this is the only new source of flavor violation in addition to the Standard Model (SM) Yukawa interactions. The setup can be described by augmenting the SM flavor symmetry by an additional SU(3)χ, under which the dark matter χ transforms. This framework is especially phenomenologically rich, due to possible novel flavor-changing interactions which are not present within the more restrictive MFV framework. As a representative case study of this setting, which we call “beyond MFV” (BMFV), we consider Dirac fermion dark matter which transforms as a singlet under the SM gauge group and a triplet under SU(3)χ. The DM fermion couples to the SM lepton sector through a scalar mediator Φ. Unlike the case of quark-flavored DM, we show that there is no Z3 symmetry within either the MFV or BMFV settings which automatically stabilizes the lepton-flavored DM. We discuss constraints on this setup from flavor-changing processes, DM relic abundance as well as direct and indirect detections. We find that relatively large flavor-changing couplings are possible, while the dark matter mass is still within the phenomenologically interesting region below the TeV scale. Collider signatures which can be potentially searched for at the lepton and hadron colliders are discussed. Finally, we discuss the implications for decaying dark matter, which can appear if an additional stabilizing symmetry is not imposed.

  19. Software Replica of Minimal Living Processes

    NASA Astrophysics Data System (ADS)

    Bersini, Hugues

    2010-04-01

    There is a long tradition of software simulation in theoretical biology, complementing pure analytical mathematics, which is often limited in reproducing and understanding the self-organization phenomena that result from the non-linear and spatially grounded interactions of a huge number of diverse biological objects. Since John von Neumann's and Alan Turing's pioneering works on self-replication and morphogenesis, proponents of artificial life have chosen to resolutely neglect a lot of materialistic and quantitative information deemed not indispensable and have focused on the rule-based mechanisms making life possible, supposedly neutral with respect to their underlying material embodiment. Minimal life begins at the intersection of a series of processes which need to be isolated, differentiated, and duplicated as such in computers. Only the development and running of software makes it possible to understand the way these processes are intimately interconnected in order for life to appear at the crossroad. In this paper, I attempt to set out the history of life as the disciples of artificial life understand it, by placing these different lessons on a temporal and causal axis, showing which one is indispensable to the appearance of the next and how it connects to the next. I discuss the task of artificial life as setting up experimental software platforms where these different lessons, whether taken in isolation or together, are tested, simulated, and, more systematically, analyzed. I sketch some of these existing software platforms: chemical reaction networks, Varela's autopoietic cellular automata, and Ganti's chemoton model, whose running delivers interesting take-home messages to open-minded biologists.

  20. Planetary protection: elements for cost minimization

    NASA Astrophysics Data System (ADS)

    Debus, Andre

    2003-11-01

    In line with the UN Outer Space Treaty (article IX of the Outer Space Treaty, London/Washington, January 27, 1967) and with COSPAR recommendations, for ethical, safety, and scientific reasons, exploration of the solar system must comply with planetary protection constraints in order to avoid contamination of extraterrestrial bodies, particularly biological contamination by terrestrial microorganisms. It is also required to protect Earth from eventual contamination carried by return systems or samples. The search for life in extraterrestrial samples, in situ or in the frame of sample-return missions, must be conducted so that its conclusions can be stated with the maximum possible confidence, because the discovery or non-discovery of life in a sample has a direct impact on updates of planetary protection specifications for future missions. This last requirement consequently also imposes implementation measures to preserve the properties of extraterrestrial samples, thereby also indirectly protecting exobiological science. These constraints impose unusual requirements on project teams involved in such solar system exploration missions: hardware sterilization, sterile integration, organic cleanliness, microbiological and cleanliness control, the use of high-reliability systems in order to avoid crashes, the definition of specific trajectories and their control, recontamination prevention, etc. Implementation of such requirements induces costs that are difficult to estimate but can be important depending on the solar system target and the mission definition (fly-by, orbiter, or lander). The cost impact of a planetary protection program can be significant if some basic rules are not taken into account early enough; consequently, based on past experience, some recommendations are proposed here for managing such programs properly and minimizing their cost.

  1. Logarithmic minimal models with Robin boundary conditions

    NASA Astrophysics Data System (ADS)

    Bourgine, Jean-Emile; Pearce, Paul A.; Tartaglia, Elena

    2016-06-01

    We consider general logarithmic minimal models LM(p, p′), with p, p′ coprime, on a strip of N columns with the (r, s) Robin boundary conditions introduced by Pearce, Rasmussen and Tipunin. On the lattice, these models are Yang-Baxter integrable loop models that are described algebraically by the one-boundary Temperley-Lieb algebra. The (r, s) Robin boundary conditions are a class of integrable boundary conditions satisfying the boundary Yang-Baxter equations which allow loop segments to either reflect or terminate on the boundary. The associated conformal boundary conditions are organized into infinitely extended Kac tables labelled by the Kac labels r ∈ ℤ and s ∈ ℕ. The Robin vacuum boundary condition, labelled by (r, s − 1/2) = (0, 1/2), is given as a linear combination of Neumann and Dirichlet boundary conditions. The general (r, s) Robin boundary conditions are constructed, using fusion, by acting on the Robin vacuum boundary with an (r, s)-type seam consisting of an r-type seam of width w columns and an s-type seam of width d = s − 1 columns. The r-type seam admits an arbitrary boundary field which we fix to the special value ξ = −λ/2, where λ = (p′ − p)π/p′ is the crossing parameter. The s-type boundary introduces d defects into the bulk. We consider the commuting double-row transfer matrices and their associated quantum Hamiltonians and calculate analytically the boundary free energies of the (r, s) Robin boundary conditions. Using finite-size corrections and sequence extrapolation out to system sizes N + w + d ≤ 26, the conformal spectrum of boundary operators is accessible by numerical diagonalization of the Hamiltonians. Fixing the parity of N for r…

  2. An H-infinity norm minimization approach

    NASA Astrophysics Data System (ADS)

    Muse, Jonathan A.

    This dissertation seeks to merge ideas from robust control theory, such as H-infinity control design and the Small Gain Theorem, L stability theory and Lyapunov stability from nonlinear control, and recent theoretical achievements in adaptive control. The fusion of frequency-domain and linear time-domain ideas allows the derivation of an H-infinity Norm Minimization Approach (H-infinity-NMA) for adaptive control architectures that permits a control designer to simplify the adaptive tuning process, tune the uncertainty compensation characteristics via linear control design techniques, band-limit the adaptive control signal, efficiently handle redundant actuators, and handle unmatched and matched uncertainty in a single design framework. The two-stage design framework is similar to that used in robust control, but without sacrificing performance. The first stage of the design considers an ideal system with the system uncertainty completely known. For this system, a control law is designed using linear H-infinity theory. In the second stage, an adaptive process is implemented that emulates the behavior of the ideal system. If the linear H-infinity design is applied to control the emulated system, it then guarantees closed-loop system stability of the actual system. All of this is accomplished while providing notions of transient performance bounds between the ideal system and the true system. Extensions to the theory include architectures for a class of output feedback systems, limiting the authority of an adaptive control system, and a method for improving the performance of an adaptive system with slow dynamics without any modification terms. Applications focus on using aerodynamic flow control for aircraft flight control and the Crew Launch Vehicle.
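
    For reference, the H-infinity norm being minimized is the peak gain of the frequency response, sup over ω of the largest singular value of G(jω). A simple grid-based approximation for an illustrative second-order system (the state-space matrices below are assumptions chosen for the example):

        import numpy as np

        # G(s) = C (sI - A)^{-1} B for a lightly damped oscillator (illustrative)
        A = np.array([[0.0, 1.0], [-4.0, -0.4]])
        B = np.array([[0.0], [1.0]])
        C = np.array([[1.0, 0.0]])

        def hinf_norm(A, B, C, w_grid):
            """Approximate ||G||_inf as the peak singular value over a frequency grid."""
            peak = 0.0
            I = np.eye(A.shape[0])
            for w in w_grid:
                G = C @ np.linalg.solve(1j * w * I - A, B)
                peak = max(peak, np.linalg.svd(G, compute_uv=False)[0])
            return peak

        w = np.logspace(-2, 2, 2000)
        print(f"||G||_inf ~= {hinf_norm(A, B, C, w):.3f}")  # resonance near w = 2 rad/s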

  3. Inconsistent reporting of minimally invasive surgery errors

    PubMed Central

    White, AD; Skelton, M; Mushtaq, F; Pike, TW; Mon-Williams, M; Lodge, JPA; Wilkie, RM

    2015-01-01

    Introduction Minimally invasive surgery (MIS) is a complex task requiring dexterity and high level cognitive function. Unlike surgical ‘never events’, potentially important (and frequent) manual or cognitive slips (‘technical errors’) are underresearched. Little is known about the occurrence of routine errors in MIS, their relationship to patient outcome, and whether they are reported accurately and/or consistently. Methods An electronic survey was sent to all members of the Association of Surgeons of Great Britain and Ireland, gathering demographic information, experience and reporting of MIS errors, and a rating of factors affecting error prevalence. Results Of 249 responses, 203 completed more than 80% of the questions regarding the surgery they had performed in the preceding 12 months. Of these, 47% reported a significant error in their own performance and 75% were aware of a colleague experiencing error. Technical skill, knowledge, situational awareness and decision making were all identified as particularly important for avoiding errors in MIS. Reporting of errors was variable: 15% did not necessarily report an intraoperative error to a patient while 50% did not consistently report at an institutional level. Critically, 12% of surgeons were unaware of the procedure for reporting a technical error and 59% felt guidance is needed. Overall, 40% believed a confidential reporting system would increase their likelihood of reporting an error. Conclusion These data indicate inconsistent reporting of operative errors, and highlight the need to better understand how and why technical errors occur in MIS. A confidential ‘no blame’ reporting system might help improve patient outcomes and avoid a closed culture that can undermine public confidence. PMID:26492908

  4. Phenomenology in minimal theory of massive gravity

    NASA Astrophysics Data System (ADS)

    De Felice, Antonio; Mukohyama, Shinji

    2016-04-01

    We investigate the minimal theory of massive gravity (MTMG) recently introduced. After reviewing the original construction based on its Hamiltonian in the vielbein formalism, we reformulate it in terms of its Lagrangian in both the vielbein and the metric formalisms. It then becomes obvious that, unlike previous attempts in the literature of Lorentz-violating massive gravity, not only the potential but also the kinetic structure of the action is modified from the de Rham-Gabadadze-Tolley (dRGT) massive gravity theory. We confirm that the number of physical degrees of freedom in MTMG is two at fully nonlinear level. This proves the absence of various possible pathologies such as superluminality, acausality and strong coupling. Afterwards, we discuss the phenomenology of MTMG in the presence of a dust fluid. We find that on a flat homogeneous and isotropic background we have two branches. One of them (self-accelerating branch) naturally leads to acceleration without the genuine cosmological constant or dark energy. For this branch both the scalar and the vector modes behave exactly as in general relativity (GR). The phenomenology of this branch differs from GR in the tensor modes sector, as the tensor modes acquire a non-zero mass. Hence, MTMG serves as a stable nonlinear completion of the self-accelerating cosmological solution found originally in dRGT theory. The other branch (normal branch) has a dynamics which depends on the time-dependent fiducial metric. For the normal branch, the scalar mode sector, even though as in GR only one scalar mode is present (due to the dust fluid), differs from the one in GR, and, in general, structure formation will follow a different phenomenology. The tensor modes will be massive, whereas the vector modes, for both branches, will have the same phenomenology as in GR.

  5. Phenomenology in minimal theory of massive gravity

    SciTech Connect

    Felice, Antonio De; Mukohyama, Shinji

    2016-04-15

    We investigate the minimal theory of massive gravity (MTMG) recently introduced. After reviewing the original construction based on its Hamiltonian in the vielbein formalism, we reformulate it in terms of its Lagrangian in both the vielbein and the metric formalisms. It then becomes obvious that, unlike previous attempts in the literature of Lorentz-violating massive gravity, not only the potential but also the kinetic structure of the action is modified from the de Rham-Gabadadze-Tolley (dRGT) massive gravity theory. We confirm that the number of physical degrees of freedom in MTMG is two at fully nonlinear level. This proves the absence of various possible pathologies such as superluminality, acausality and strong coupling. Afterwards, we discuss the phenomenology of MTMG in the presence of a dust fluid. We find that on a flat homogeneous and isotropic background we have two branches. One of them (self-accelerating branch) naturally leads to acceleration without the genuine cosmological constant or dark energy. For this branch both the scalar and the vector modes behave exactly as in general relativity (GR). The phenomenology of this branch differs from GR in the tensor modes sector, as the tensor modes acquire a non-zero mass. Hence, MTMG serves as a stable nonlinear completion of the self-accelerating cosmological solution found originally in dRGT theory. The other branch (normal branch) has a dynamics which depends on the time-dependent fiducial metric. For the normal branch, the scalar mode sector, even though as in GR only one scalar mode is present (due to the dust fluid), differs from the one in GR, and, in general, structure formation will follow a different phenomenology. The tensor modes will be massive, whereas the vector modes, for both branches, will have the same phenomenology as in GR.

  6. Methodology for flammable gas evaluations

    SciTech Connect

    Hopkins, J.D., Westinghouse Hanford

    1996-06-12

    There are 177 radioactive waste storage tanks at the Hanford Site. The waste generates flammable gases. The waste releases gas continuously, but in some tanks the waste has shown a tendency to trap these flammable gases. When enough gas is trapped in a tank's waste matrix, it may be released in a way that renders part or all of the tank atmosphere flammable for a period of time. Tanks must be evaluated against previously defined criteria to determine whether they can present a flammable gas hazard. This document presents the methodology for evaluating tanks in two areas of concern in the tank headspace: the steady-state flammable-gas concentration resulting from continuous release, and the concentration resulting from an episodic gas release.
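
    A screening calculation of the steady-state concern can be sketched with a well-mixed headspace balance, C = G/Q, compared against the lower flammability limit; the rates below are illustrative assumptions, not tank data or the document's actual criteria:

        G = 0.02    # continuous gas generation rate, m^3/h (assumed)
        Q = 10.0    # headspace ventilation rate, m^3/h (assumed)
        LFL = 0.04  # lower flammability limit of hydrogen in air, ~4 vol%

        C_ss = G / Q  # steady-state volume fraction in a well-mixed headspace
        print(f"steady-state concentration: {100 * C_ss:.2f} vol%")
        print(f"fraction of LFL: {C_ss / LFL:.1%}")  # acceptance criteria cap this ratio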

  7. Indirect Lightning Safety Assessment Methodology

    SciTech Connect

    Ong, M M; Perkins, M P; Brown, C G; Crull, E W; Streit, R D

    2009-04-24

    Lightning is a safety hazard for high explosives (HE) and their detonators. Facilities are designed to act as Faraday cages; however, the current flowing from the strike point through the rebar of the building can still couple electromagnetic energy into the interior. The methodology for estimating the risk from indirect lightning effects is presented here. It has two parts: a method to determine the likelihood of a detonation given a lightning strike, and an approach for estimating the likelihood of a strike. The results of these two parts produce an overall probability of a detonation. The probability calculations are complex for five reasons: (1) lightning strikes are stochastic and relatively rare, (2) the quality of the Faraday cage varies from one facility to the next, (3) RF coupling is inherently a complex subject, (4) performance data for abnormally stressed detonators is scarce, and (5) the arc plasma physics is not well understood. Therefore, a rigorous mathematical analysis would be too complex. Instead, our methodology takes a more practical approach, combining rigorous mathematical calculations where possible with empirical data when necessary. Where there is uncertainty, we compensate with conservative approximations. The goal is to determine a conservative estimate of the odds of a detonation. In Section 2, the methodology is explained. This report discusses topics at a high level; the reasons for selecting an approach are justified, and references are provided for those interested in technical details. In Section 3, a simple hypothetical example is given to reinforce the concepts. While the methodology touches on all the items shown in Figure 1, the focus of this report is the indirect effect, i.e., determining the odds of a detonation from given EM fields. Professor Martin Uman from the University of Florida has been characterizing and defining extreme lightning strikes. Using Professor Uman's research, Dr. Kimball Merewether at Sandia National Laboratory in Albuquerque calculated the EM fields inside a Faraday-cage type structure.
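
    The two-part composition can be written out directly; the probabilities below are illustrative placeholders, not values from the assessment:

        # P(detonation) = P(strike) * P(detonation | strike)
        p_strike_per_year = 1e-3     # likelihood of a strike at the facility (assumed)
        p_det_given_strike = 1e-4    # coupling + detonator response, conservative (assumed)

        p_det_per_year = p_strike_per_year * p_det_given_strike
        print(f"P(detonation per year) ~ {p_det_per_year:.1e}")

        years = 30
        p_over_lifetime = 1.0 - (1.0 - p_det_per_year) ** years
        print(f"P(detonation over {years} years) ~ {p_over_lifetime:.1e}")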

  8. Clinical indicators: a methodological approach.

    PubMed

    Scott, L; Grimmer, K

    1995-03-01

    Clinical indicators offer physiotherapists a tool by which quality of care can be flagged and evaluated. Such a flag requires a measure of the cost of treatment as well as the outcome achieved for that cost. Traditionally, the number of treatments to discharge has been used as a proxy measure of cost as well as outcome of physiotherapy care. However, physiotherapists recognize that the cost of treatment is an inappropriate reflection of the outcome of care in many instances. The challenge for physiotherapists in developing clinical indicators is to set appropriate flags of quality of care. This paper presents a methodological approach to the development of a flag of performance for acute lower back pain.

  9. Simulation Enabled Safeguards Assessment Methodology

    SciTech Connect

    Robert Bean; Trond Bjornard; Thomas Larson

    2007-09-01

    It is expected that nuclear energy will be a significant component of future energy supplies. New facilities, operating under a strengthened international nonproliferation regime, will be needed. There is good reason to believe that virtual engineering, applied to facility design as well as to safeguards system design, will reduce total project cost and improve efficiency in the design cycle. The Simulation Enabled Safeguards Assessment MEthodology (SESAME) has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag-and-drop wireframe construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed.

  10. Lean methodology in health care.

    PubMed

    Kimsey, Diane B

    2010-07-01

    Lean production is a process management philosophy that examines organizational processes from a customer perspective with the goal of limiting the use of resources to those processes that create value for the end customer. Lean manufacturing emphasizes increasing efficiency, decreasing waste, and using methods to decide what matters rather than accepting preexisting practices. A rapid improvement team at Lehigh Valley Health Network, Allentown, Pennsylvania, implemented a plan, do, check, act cycle to determine problems in the central sterile processing department, test solutions, and document improved processes. By using A3 thinking, a consensus building process that graphically depicts the current state, the target state, and the gaps between the two, the team worked to improve efficiency and safety, and to decrease costs. Use of this methodology has increased teamwork, created user-friendly work areas and processes, changed management styles and expectations, increased staff empowerment and involvement, and streamlined the supply chain within the perioperative area.

  11. Nuclear weapon reliability evaluation methodology

    SciTech Connect

    Wright, D.L.

    1993-06-01

    This document provides an overview of those activities that are normally performed by Sandia National Laboratories to provide nuclear weapon reliability evaluations for the Department of Energy. These reliability evaluations are first provided as a prediction of the attainable stockpile reliability of a proposed weapon design. Stockpile reliability assessments are provided for each weapon type as the weapon is fielded and are continuously updated throughout the weapon's stockpile life. The reliability predictions and assessments depend heavily on data from both laboratory simulation and actual flight tests. An important part of the methodology is the set of review opportunities that occur throughout the entire process, which assure a consistent approach and appropriate use of the data for reliability evaluation purposes.

  12. Comparing open and minimally invasive surgical procedures for oesophagectomy in the treatment of cancer: the ROMIO (Randomised Oesophagectomy: Minimally Invasive or Open) feasibility study and pilot trial.

    PubMed Central

    Metcalfe, Chris; Avery, Kerry; Berrisford, Richard; Barham, Paul; Noble, Sian M; Fernandez, Aida Moure; Hanna, George; Goldin, Robert; Elliott, Jackie; Wheatley, Timothy; Sanders, Grant; Hollowood, Andrew; Falk, Stephen; Titcomb, Dan; Streets, Christopher; Donovan, Jenny L; Blazeby, Jane M

    2016-01-01

    BACKGROUND Localised oesophageal cancer can be curatively treated with surgery (oesophagectomy) but the procedure is complex with a risk of complications, negative effects on quality of life and a recovery period of 6-9 months. Minimal-access surgery may accelerate recovery. OBJECTIVES The ROMIO (Randomised Oesophagectomy: Minimally Invasive or Open) study aimed to establish the feasibility of, and methodology for, a definitive trial comparing minimally invasive and open surgery for oesophagectomy. Objectives were to quantify the number of eligible patients in a pilot trial; develop surgical manuals as the basis for quality assurance; standardise pathological processing; establish a method to blind patients to their allocation in the first week post surgery; identify measures of postsurgical outcome of importance to patients and clinicians; and establish the main cost differences between the surgical approaches. DESIGN Pilot parallel three-arm randomised controlled trial nested within feasibility work. SETTING Two UK NHS departments of upper gastrointestinal surgery. PARTICIPANTS Patients aged ≥ 18 years with histopathological evidence of oesophageal or oesophagogastric junctional adenocarcinoma, squamous cell cancer or high-grade dysplasia, referred for oesophagectomy or oesophagectomy following neoadjuvant chemo(radio)therapy. INTERVENTIONS Oesophagectomy, with patients randomised to open surgery, a hybrid open chest and minimally invasive abdomen or totally minimally invasive access. MAIN OUTCOME MEASURE The primary outcome measure for the pilot trial was the number of patients recruited per month, with the main trial considered feasible if at least 2.5 patients per month were recruited. RESULTS During 21 months of recruitment, 263 patients were assessed for eligibility; of these, 135 (51%) were found to be eligible and 104 (77%) agreed to participate, an average of five patients per month. In total, 41 patients were allocated to open surgery, 43 to the

  13. Identifying a minimal rheological configuration: a tool for effective and efficient constitutive modeling of soft tissues.

    PubMed

    Jordan, Petr; Kerdok, Amy E; Howe, Robert D; Socrate, Simona

    2011-04-01

    We describe a modeling methodology intended as a preliminary step in the identification of appropriate constitutive frameworks for the time-dependent response of biological tissues. The modeling approach comprises a customizable rheological network of viscous and elastic elements governed by user-defined 1D constitutive relationships. The model parameters are identified by iterative nonlinear optimization, minimizing the error between experimental and model-predicted structural (load-displacement) tissue response under a specific mode of deformation. We demonstrate the use of this methodology by determining the minimal rheological arrangement, constitutive relationships, and model parameters for the structural response of various soft tissues, including ex vivo perfused porcine liver in indentation, ex vivo porcine brain cortical tissue in indentation, and ex vivo human cervical tissue in unconfined compression. Our results indicate that the identified rheological configurations provide good agreement with experimental data, including multiple constant strain rate load/unload tests and stress relaxation tests. Our experience suggests that the described modeling framework is an efficient tool for exploring a wide array of constitutive relationships and rheological arrangements, which can subsequently serve as a basis for 3D constitutive model development and finite-element implementations. The proposed approach can also be employed as a self-contained tool to obtain simplified 1D phenomenological models of the structural response of biological tissue to single-axis manipulations for applications in haptic technologies.
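
    The parameter-identification step can be sketched for one candidate configuration, a standard linear solid (a spring in parallel with a spring-dashpot Maxwell arm) fitted to synthetic relaxation data; the element arrangement, parameter names, and values are illustrative, not the authors' models:

        import numpy as np
        from scipy.optimize import curve_fit

        def sls_relaxation(t, k0, k1, tau):
            """Stress-relaxation modulus of a standard linear solid."""
            return k0 + k1 * np.exp(-t / tau)

        t = np.linspace(0.0, 10.0, 100)
        rng = np.random.default_rng(2)
        data = sls_relaxation(t, 1.0, 2.0, 1.5) + 0.05 * rng.standard_normal(t.size)

        popt, _ = curve_fit(sls_relaxation, t, data, p0=[0.5, 1.0, 1.0])
        print("fitted (k0, k1, tau):", np.round(popt, 3))
        # if the fit residual stays high, a richer network (e.g. an extra Maxwell
        # arm) is tried; if parameters turn out redundant, an element is removed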

  14. Unrestricted disposal of minimal activity levels of radioactive wastes: exposure and risk calculations

    SciTech Connect

    Fields, D.E.; Emerson, C.J.

    1984-08-01

    The US Nuclear Regulatory Commission is currently considering revision of rule 10 CFR Part 20, which covers disposal of solid wastes containing minimal radioactivity. In support of these revised rules, we have evaluated the consequences of disposing of four waste streams at four types of disposal areas located in three different geographic regions. Consequences are expressed in terms of human exposures and associated health effects. Each geographic region has its own climate and geology. Example waste streams, waste disposal methods, and geographic regions chosen for this study are clearly specified. The monetary consequences of minimal-activity waste disposal are briefly discussed. The PRESTO methodology was used to evaluate radionuclide transport and health effects. This methodology was developed to assess radiological impacts to a static local population for a 1000-year period following disposal. Pathways and processes of transit from the trench to exposed populations included the following considerations: groundwater transport, overland flow, erosion, surface water dilution, resuspension, atmospheric transport, deposition, inhalation, and ingestion of contaminated beef, milk, crops, and water. 12 references, 2 figures, 8 tables.

  15. Gadamerian philosophical hermeneutics as a useful methodological framework for the Delphi technique.

    PubMed

    Guzys, Diana; Dickson-Swift, Virginia; Kenny, Amanda; Threlkeld, Guinever

    2015-01-01

    In this article we aim to demonstrate how Gadamerian philosophical hermeneutics may provide a sound methodological framework for researchers using the Delphi Technique (Delphi) in studies exploring health and well-being. Reporting of the use of Delphi in health and well-being research is increasing, but less attention has been given to its methodological underpinnings. In Delphi, a structured anonymous conversation between participants is facilitated via an iterative survey process. Participants are specifically selected for their knowledge of and experience with the topic of interest. The purpose of structuring conversation in this manner is to cultivate collective opinion and highlight areas of disagreement, using a process that minimizes the influence of group dynamics. The underlying premise is that the opinion of a collective is more useful than that of an individual. In designing our study into health literacy, Delphi aligned well with our research focus and would enable us to capture collective views. However, we were interested in the methodology that would inform our study. As researchers, we believe that methodology provides the framework and principles for a study and is integral to research integrity. In assessing the suitability of Delphi for our research purpose, we found little information about its underpinning methodology. The absence of a universally recognized or consistent methodology associated with Delphi was highlighted through a scoping review we undertook to assist our methodological thinking. This led us to consider alternative methodologies that might be congruent with the key principles of Delphi. We identified Gadamerian philosophical hermeneutics as a methodology that could provide a supportive framework and principles. We suggest that this methodology may be useful in health and well-being studies utilizing the Delphi method.

  17. A review of the A400m final assembly line balancing methodology

    NASA Astrophysics Data System (ADS)

    Ríos, J.; Mas, F.; Menéndez, J. L.

    2012-04-01

    Assembly Line Balancing (ALB) comprises the ordering of tasks among workstations so as to satisfy precedence constraints and objective functions. However, due to the specific features of an aeronautical Final Assembly Line (FAL), such an approach is not fully suitable. In a FAL, the number of workstations relates to technological criteria rather than to a calculation aiming to minimize the total number of stations. To improve current practices, a methodological approach was taken to address the conceptual modeling of an assembly line, reviewing state-of-the-art balancing techniques and the methodology used in the AIRBUS A400M FAL.
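
    The station-oriented greedy assignment below is a minimal sketch of the classical ALB formulation that the review contrasts with aeronautical practice; the task times, precedence graph, cycle time, and priority rule are all hypothetical.

```python
from typing import Dict, List, Set

# Illustrative ALB instance (hypothetical data, not the A400M FAL).
# Assumes every task time fits within the cycle time.
times: Dict[str, float] = {"a": 4, "b": 3, "c": 5, "d": 2, "e": 6}
preds: Dict[str, Set[str]] = {"a": set(), "b": {"a"}, "c": {"a"},
                              "d": {"b", "c"}, "e": {"d"}}
cycle_time = 8.0

def greedy_balance(times, preds, cycle_time) -> List[List[str]]:
    """Station-oriented greedy: fill a station with eligible tasks (all
    predecessors done, time fits), longest task first; open a new
    station when nothing else fits."""
    done: Set[str] = set()
    stations: List[List[str]] = []
    while len(done) < len(times):
        station, remaining = [], cycle_time
        while True:
            eligible = [t for t in times
                        if t not in done and preds[t] <= done
                        and times[t] <= remaining]
            if not eligible:
                break
            task = max(eligible, key=lambda t: times[t])
            station.append(task)
            done.add(task)
            remaining -= times[task]
        stations.append(station)
    return stations

print(greedy_balance(times, preds, cycle_time))  # [['a', 'b'], ['c', 'd'], ['e']]
```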

  18. [Developing the methodology of examining the lower limb veins in cosmonauts for the space medicine practice].

    PubMed

    Kotovskaia, A R; Fomina, G A; Sal'nikov, A V; Iarmanova, E N

    2014-01-01

    The article centres on the development of a methodology for evaluating the function of the lower limb veins of cosmonauts in microgravity. The whys and wherefores of the choice of occlusive plethysmography equipment and procedure are explained. Much attention is given to arguments for the requisite body and limb positioning during venous plethysmography before launch and on return from space flight. To minimize the gravity effect on venous blood flow, the body should be in a level (horizontal) position and the calf aligned with the hydrodynamically indifferent point. Determining the type of test occlusion, the occlusion adjustments, the venous parameters of interest, and the data processing procedure constitutes the methodology.

  19. Orbital debris minimization and mitigation techniques

    NASA Astrophysics Data System (ADS)

    Loftus, Joseph P.; Anz-Meador, Philip D.; Reynolds, Robert

    1993-08-01

    Man's activity in space has generated significant amounts of debris that remain in orbit long enough to become a hazard to future space activities. Upper stages and spacecraft that have ended their functional life are the largest objects. In the past, additional debris has been generated by inadvertent explosions of upper stages and spacecraft, by intentional explosions for military reasons, and possibly by a few breakups resulting from collisions. In the future, debris can be generated by collisions among spacecraft as the number of orbital objects continues to grow at a rate greater than the rate at which natural forces remove them from orbit. Some design and operations practices can minimize the inadvertent generation of debris, and others can remove objects from space at the end of their useful service so they are not a source for the generation of future debris. Those studies are the primary concern of this paper. The issues are different in low Earth orbits and in geosynchronous orbits. In low Earth orbit, the hazards generated by potential collisions among spacecraft are severe because the events take place at such high velocities. In geosynchronous orbit, the collision consequence is not so severe because the relative velocities are low (less than 1 km/s). But because of the value of the limited arc and the extremely long lifetime of the satellites, debris generated in the orbit must be removed to a different orbit at the end of life if it is not to be a hazard to future operational spacecraft. The issue at present seems to be how high the reboost maneuver must be and what the system design and maneuver strategy should be to ensure effectiveness. The most economic removal of objects is achieved when those objects have the capability to execute the necessary maneuvers with their own systems and resources. The most costly option is to have some other system remove the object after it has become a derelict. Numerous options are being studied to develop

  20. Minimally Invasive Colorectal Cancer Surgery in Europe

    PubMed Central

    Babaei, Masoud; Balavarca, Yesilda; Jansen, Lina; Gondos, Adam; Lemmens, Valery; Sjövall, Annika; Børge Johannesen, Tom; Moreau, Michel; Gabriel, Liberale; Gonçalves, Ana Filipa; Bento, Maria José; van de Velde, Tony; Kempfer, Lana Raffaela; Becker, Nikolaus; Ulrich, Alexis; Ulrich, Cornelia M.; Schrotz-King, Petra; Brenner, Hermann

    2016-01-01

    Abstract Minimally invasive surgery (MIS) of colorectal cancer (CRC) was first introduced over 20 years ago and recently has gained increasing acceptance and usage beyond clinical trials. However, data on dissemination of the method across countries and on long-term outcomes are still sparse. In the context of a European collaborative study, a total of 112,023 CRC cases from 3 population-based (N = 109,695) and 4 institute-based clinical cancer registries (N = 2328) were studied and compared on the utilization of MIS versus open surgery. Cox regression models were applied to study associations between surgery type and survival of patients from the population-based registries. The study considered adjustment for potential confounders. The percentage of CRC patients undergoing MIS differed substantially between centers and generally increased over time. MIS was significantly less often used in stage II to IV colon cancer compared with stage I in most centers. MIS tended to be less often used in older (70+) than in younger colon cancer patients. MIS tended to be more often used in women than in men with rectal cancer. MIS was associated with significantly reduced mortality among colon cancer patients in the Netherlands (hazard ratio [HR] 0.66, 95% confidence interval [CI] 0.63–0.69), Sweden (HR 0.68, 95% CI 0.60–0.76), and Norway (HR 0.73, 95% CI 0.67–0.79). Likewise, MIS was associated with reduced mortality of rectal cancer patients in the Netherlands (HR 0.74, 95% CI 0.68–0.80) and Sweden (HR 0.77, 95% CI 0.66–0.90). Utilization of MIS in CRC resection is increasing, but large variation between European countries and clinical centers prevails. Our results support an association of MIS with substantially enhanced survival among colon cancer patients. Further studies controlling for selection bias and residual confounding are needed to establish the role of MIS in the survival of patients. PMID:27258522
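
    A minimal sketch of the kind of Cox proportional hazards analysis described, using the lifelines library; the data frame, column names, and covariates are hypothetical stand-ins for the registry variables.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical registry extract; all column names and values are invented.
df = pd.DataFrame({
    "time_months": [12, 34, 8, 60, 25, 41, 5, 55, 30, 48],
    "died":        [1, 0, 1, 0, 1, 1, 1, 0, 0, 1],
    "mis":         [0, 1, 0, 1, 0, 1, 0, 1, 0, 1],  # 1 = minimally invasive
    "age":         [72, 64, 81, 58, 69, 55, 77, 61, 70, 59],
    "stage":       [2, 1, 3, 1, 2, 1, 4, 2, 3, 1],
})

# Cox proportional hazards model of mortality by surgery type, adjusting
# for potential confounders (here just age and stage).
cph = CoxPHFitter()
cph.fit(df, duration_col="time_months", event_col="died")
cph.print_summary()  # an HR < 1 for 'mis' would indicate reduced mortality
```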

  1. MINIMIZATION OF CARBON LOSS IN COAL REBURNING

    SciTech Connect

    Vladimir Zamansky; Vitali Lissianski; Pete Maly; Richard Koppang

    2002-09-10

    This project develops Fuel-Flexible Reburning (FFR) technology, an improved version of conventional reburning. In FFR, solid fuel is partially gasified before injection into the reburning zone of a boiler. Partial gasification of the solid fuel improves the efficiency of NOx reduction and decreases LOI (loss on ignition) by increasing fuel reactivity. The objectives of this project were to develop the engineering and scientific information and know-how needed to improve the cost-effectiveness of reburning via increased efficiency and minimized LOI, and to move the FFR technology to the demonstration and commercialization stage. All project objectives and technical performance goals have been met, and competitive advantages of FFR have been demonstrated. The work included a combination of experimental and modeling studies designed to identify optimum process conditions, confirm the process mechanism, and estimate the cost effectiveness of the FFR technology. Experimental results demonstrated that partial gasification of a solid fuel prior to injection into the reburning zone improved the efficiency of NOx reduction and decreased LOI. Several coals with different volatiles content were tested. Testing suggested that the incremental increase in the efficiency of NOx reduction due to coal gasification was more significant for coals with low volatiles content. Up to a 14% increase in the efficiency of NOx reduction in comparison with basic reburning was achieved with coal gasification. Tests also demonstrated that FFR improved the efficiency of NOx reduction for renewable fuels with high fuel-N content. Modeling efforts focused on the development of a model describing reburning with gaseous gasification products. Modeling predicted that the composition of coal gasification products depended on temperature. Comparison of experimental results and modeling predictions suggested that heterogeneous NOx reduction on the surface of char played an important role. Economic analysis confirmed

  2. Pesticides and other chemicals: minimizing worker exposures.

    PubMed

    Keifer, Matthew; Gasperini, Frank; Robson, Mark

    2010-07-01

    Pesticides, ammonia, and sanitizers, all used in agricultural production, present ongoing risks for exposed workers. Pesticides continue to poison workers despite the elimination of some of the most toxic older products. Obligatory reporting of pesticide poisonings exists in 30 states, and surveillance of poisoning occurs in only 12. Estimates of poisoning numbers have been based on sampling, but funding for this is scant and in constant jeopardy. There appears to be a downward trend in poisonings nationally based on SENSOR data. Newer, more pest-specific pesticides are generally less toxic and present fewer health risks, but may have unpredicted health effects in humans that may not emerge until they are used widely. Internationally, older, cheaper chemicals continue to be used with serious consequences in many developing countries. Monitoring workers for overexposure to pesticides broadly is impractical, with the exception of the cholinesterase inhibitors. We can learn much from monitoring systems. Unfortunately, monitoring tools are economically inaccessible for most other chemical groups. New technologies for toxicity testing will necessitate new biomonitoring tools, which should be supplied by the producers of these chemicals and made available for protecting workers and the public. Protection of workers from pesticides is primarily based on personal protective equipment use, which presents significant hardship for workers in hot environments and is generally considered the least effective approach in the hierarchy of controls in worker protection. Isolation through the use of closed systems has been employed, though rarely studied as to effectiveness in field use. Substitution, or replacing harmful substances with safer ones, is underway as more pest-specific chemicals enter the pesticide portfolio and older ones drop out. This paper summarizes the panel presentation, "Minimizing Exposures to Pesticides and Other Chemicals," at the Agricultural Safety and Health Council of America

  3. On the formulation of a minimal uncertainty model for robust control with structured uncertainty

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Chang, B.-C.; Fischl, Robert

    1991-01-01

    In the design and analysis of robust control systems for uncertain plants, representing the system transfer matrix in the form of what has come to be termed an M-delta model has become widely accepted and applied in the robust control literature. The M represents a transfer function matrix M(s) of the nominal closed-loop system, and the delta represents an uncertainty matrix acting on M(s). The nominal closed-loop system M(s) results from closing the feedback control system, K(s), around a nominal plant interconnection structure P(s). The uncertainty can arise from various sources, such as structured uncertainty from parameter variations or unstructured uncertainty from unmodeled dynamics and other neglected phenomena. In general, delta is a block-diagonal matrix, but for real parameter variations delta is a diagonal matrix of real elements. Conceptually, the M-delta structure can always be formed for any linear interconnection of inputs, outputs, transfer functions, parameter variations, and perturbations. However, very little of the currently available literature addresses computational methods for obtaining this structure, and none of it addresses a general methodology for obtaining a minimal M-delta model for a wide class of uncertainty, where the term minimal refers to the dimension of the delta matrix. Since a minimally dimensioned delta matrix would improve the efficiency of structured singular value (or multivariable stability margin) computations, a method of obtaining a minimal M-delta model would be useful; this in turn requires a method of obtaining the interconnection structure P(s). A generalized procedure for obtaining a minimal P-delta structure for systems with real parameter variations is presented. Using this model, the minimal M-delta model can then be easily obtained by closing the feedback loop. The procedure involves representing the system in a cascade-form state-space realization, determining the minimal uncertainty matrix
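
    As a numeric illustration of the M-delta idea (not the paper's procedure), the sketch below pulls a single real parameter variation out of a first-order plant into an upper linear fractional transformation and checks, at one frequency, that closing the delta loop reproduces the perturbed plant.

```python
import numpy as np

# First-order plant 1/(s + a) with uncertain a = a0 + delta. Pulling delta
# out as a feedback loop (w = delta * z) around the nominal dynamics gives
#   xdot = -a0*x - w + u,   z = x,   y = x,
# so at a fixed frequency s the interconnection matrix is
#   M(s) = 1/(s + a0) * [[-1, 1], [-1, 1]]   (inputs [w, u], outputs [z, y]).
a0, delta, s = 2.0, 0.3, 1j * 0.7            # illustrative values

g = 1.0 / (s + a0)
M = g * np.array([[-1.0, 1.0],
                  [-1.0, 1.0]])

# Upper LFT: F_u(M, delta) = M22 + M21 * delta * (1 - M11*delta)^-1 * M12
lft = M[1, 1] + M[1, 0] * delta * M[0, 1] / (1.0 - M[0, 0] * delta)

direct = 1.0 / (s + a0 + delta)              # perturbed plant, evaluated directly
print(abs(lft - direct))                     # agreement to machine precision
```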

  4. Emerging Seafood Preservation Techniques to Extend Freshness and Minimize Vibrio Contamination

    PubMed Central

    Ronholm, Jennifer; Lau, Fiona; Banerjee, Swapan K.

    2016-01-01

    Globally, the popularity of seafood consumption is increasing exponentially. To meet the demands of a growing market, the seafood industry has increasingly been innovating ways to keep its products fresh and safe while increasing production. Marine environments harbor several species of indigenous microorganisms, some of which, including Vibrio spp., may be harmful to humans, and all of which are part of the natural microbiota of the seafood. After harvest, seafood products are often shipped over large geographic distances, sometimes for prolonged periods, during which the food must stay fresh and pathogen proliferation must be minimized. Upon arrival there is often a strong desire, arising from both culinary and nutritional considerations, to consume seafood products raw or minimally cooked. This supply chain, along with popular preferences, has increased challenges for the seafood industry. This has resulted in a desire to develop methodologies that reduce pathogenic and spoilage organisms in seafood items to comply with regulations while causing minimal changes to the taste, texture, and nutritional content of the final product. This mini-review discusses and compares several emerging technologies, such as treatment with plant-derived natural compounds, phage lysis, high-pressure processing, and irradiation, for their ability to control pathogenic vibrios, limit the growth of spoilage organisms, and keep the desired organoleptic properties of the seafood product intact. PMID:27047466

  5. Application of Sequential Quadratic Programming to Minimize Smart Active Flap Rotor Hub Loads

    NASA Technical Reports Server (NTRS)

    Kottapalli, Sesi; Leyland, Jane

    2014-01-01

    In an analytical study, SMART active flap rotor hub loads have been minimized using nonlinear programming constrained optimization methodology. The recently developed NLPQLP system (Schittkowski, 2010) that employs Sequential Quadratic Programming (SQP) as its core algorithm was embedded into a driver code (NLP10x10) specifically designed to minimize active flap rotor hub loads (Leyland, 2014). Three types of practical constraints on the flap deflections have been considered. To validate the current application, two other optimization methods have been used: i) the standard, linear unconstrained method, and ii) the nonlinear Generalized Reduced Gradient (GRG) method with constraints. The new software code NLP10x10 has been systematically checked out. It has been verified that NLP10x10 is functioning as desired. The following are briefly covered in this paper: relevant optimization theory; implementation of the capability of minimizing a metric of all, or a subset, of the hub loads as well as the capability of using all, or a subset, of the flap harmonics; and finally, solutions for the SMART rotor. The eventual goal is to implement NLP10x10 in a real-time wind tunnel environment.
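
    A minimal sketch of SQP-style constrained minimization of a hub-load metric, using SciPy's SLSQP in place of NLPQLP; the linear load model, harmonic count, and deflection bounds are invented placeholders for the rotor analysis.

```python
import numpy as np
from scipy.optimize import minimize

# Invented linear hub-load model: loads = T @ x + baseline, where x holds
# four flap-harmonic amplitudes (deg). T and baseline stand in for the
# rotor aeroelastic response used in the actual study.
rng = np.random.default_rng(1)
T = rng.normal(size=(6, 4))          # 6 hub-load components, 4 harmonics
baseline = rng.normal(size=6)

def hub_load_metric(x):
    # Metric of all hub loads: sum of squares (a weighted subset of the
    # loads or harmonics is a straightforward variation).
    loads = T @ x + baseline
    return float(loads @ loads)

# Practical constraint on flap deflections: each amplitude within +/- 2 deg.
res = minimize(hub_load_metric, x0=np.zeros(4), method="SLSQP",
               bounds=[(-2.0, 2.0)] * 4)
print("optimal harmonics:", res.x, " metric:", res.fun)
```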

  6. Miniature temperature insensitive fiber optic sensors for minimally invasive surgical devices

    NASA Astrophysics Data System (ADS)

    Rajan, Ginu; Callaghan, Dean; Semenova, Yuliya; Farrell, Gerald

    2011-05-01

    This paper presents the concept of implementing miniature temperature-insensitive optical fiber sensors in minimally invasive surgical devices such as graspers, staplers and scissors. The lack of temperature-insensitive and accurate force-feedback end effectors makes current minimally invasive surgeries (MIS) less effective, especially in the area of electrosurgery. The failure to provide accurate force feedback information reduces the user's sense of immersion in the operating procedure. In this paper we present fiber sensors based on photonic crystal fibers (PCF) for force feedback from the end effectors. Two types of miniature temperature-insensitive PCF sensors can be implemented for MIS applications: a Fabry-Perot interferometric sensor based on hollow-core PCF and a tapered modal interferometric sensor based on a solid-core PCF. A concept for interrogating these sensors effectively at minimal cost is also presented. The integration of sensors onto the end effectors is also important, as one has to find an optimum position for maximum strain/force transfer to the fiber sensor without interfering with the operation of the surgical tool. We also present the methodology for incorporating the sensors into surgical end effectors.

  7. VISION 21 SYSTEMS ANALYSIS METHODOLOGIES

    SciTech Connect

    G.S. Samuelsen; A. Rao; F. Robson; B. Washom

    2003-08-11

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into power plant systems that meet performance and emission goals of the Vision 21 program. The study efforts have narrowed down the myriad of fuel processing, power generation, and emission control technologies to selected scenarios that identify those combinations having the potential to achieve the Vision 21 program goals of high efficiency and minimized environmental impact while using fossil fuels. The technology levels considered are based on projected technical and manufacturing advances being made in industry and on advances identified in current and future government supported research. Included in these advanced systems are solid oxide fuel cells and advanced cycle gas turbines. The results of this investigation will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

  8. Why the debate over minimal risk needs to be reconsidered.

    PubMed

    Binik, Ariella; Weijer, Charles

    2014-08-01

    Minimal risk is a central concept in the ethical analysis of research with children. It is defined as the risks ". . . ordinarily encountered in daily life . . . ." But the question arises: who is the referent for minimal risk? Commentators in the research ethics literature often answer this question by endorsing one of two possible interpretations: the uniform interpretation (which is also known as the absolute interpretation) or the relative interpretation of minimal risk. We argue that describing the debate over minimal risk as a disagreement between the uniform and the relative interpretation impedes progress on the identification of a justifiable referent for minimal risk. There are two main problems with this approach: (1) constructing the debate over minimal risk as a disagreement between a uniform and a relative interpretation misconstrues the main difference between competing interpretations and (2) neither the uniform nor the relative interpretation identifies one unique and consistent group of children as the referent for minimal risk. We conclude that progress on the debate over minimal risk requires that we abandon the uniform and relative interpretations and address the main moral problem at stake: whether healthy children or the subjects of the research should be the referent for minimal risk.

  9. CONSULTATION ON UPDATED METHODOLOGY FOR ...

    EPA Pesticide Factsheets

    The National Academy of Sciences (NAS) expects to publish the Biological Effects of Ionizing Radiation (BEIR) committee's report (BEIR VII) on risks from ionizing radiation exposures in calendar year 2005. The committee is expected to have analyzed the most recent epidemiology from the important exposed cohorts and to have factored in any changes resulting from the updated analysis of dosimetry for the Japanese atomic bomb survivors. To the extent practical, the Committee will also consider any relevant radiobiological data, including those from the Department of Energy's low dose effects research program. Based on their evaluation of relevant information, the Committee is then expected to propose a set of models for estimating risks from low-dose ionizing radiation. ORIA will review the BEIR VII report and consider revisions to the Agency's methodology for estimating cancer risks from exposure to ionizing radiation in light of this report and other relevant information. This will be the subject of the Consultation. This project supports a major risk management initiative to improve the basis on which radiation risk decisions are made. This project, funded by several Federal Agencies, reflects an attempt to characterize risks where there are substantial uncertainties. The outcome will improve our ability to assess risks well into the future and will strengthen EPA's overall capability for assessing and managing radiation risks. The BEIR VII report is funde

  10. Waste Package Design Methodology Report

    SciTech Connect

    D.A. Brownson

    2001-09-28

    The objective of this report is to describe the analytical methods and processes used by the Waste Package Design Section to establish the integrity of the various waste package designs, the emplacement pallet, and the drip shield. The scope of this report shall be the methodology used in criticality, risk-informed, shielding, source term, structural, and thermal analyses. The basic features and appropriateness of the methods are illustrated, and the processes are defined whereby input values and assumptions flow through the application of those methods to obtain designs that ensure defense-in-depth as well as satisfy requirements on system performance. Such requirements include those imposed by federal regulation, from both the U.S. Department of Energy (DOE) and U.S. Nuclear Regulatory Commission (NRC), and those imposed by the Yucca Mountain Project to meet repository performance goals. The report is to be used, in part, to describe the waste package design methods and techniques to be used for producing input to the License Application Report.

  11. 331 models and grand unification: From minimal SU(5) to minimal SU(6)

    NASA Astrophysics Data System (ADS)

    Deppisch, Frank F.; Hati, Chandan; Patra, Sudhanwa; Sarkar, Utpal; Valle, José W. F.

    2016-11-01

    We consider the possibility of grand unification of the SU(3)c ⊗ SU(3)L ⊗ U(1)X model in an SU(6) gauge unification group. Two possibilities arise. Unlike other conventional grand unified theories, in SU(6) one can embed the 331 model as a subgroup such that different multiplets appear with different multiplicities. Such a scenario may emerge from the flux breaking of the unified group in an E(6) F-theory GUT. This provides new ways of achieving gauge coupling unification in 331 models while providing the radiative origin of neutrino masses. Alternatively, a sequential variant of the SU(3)c ⊗ SU(3)L ⊗ U(1)X model can fit within a minimal SU(6) grand unification, which in turn can be a natural E(6) subgroup. This minimal SU(6) embedding does not require any bulk exotics to account for the chiral families while allowing for a TeV scale SU(3)c ⊗ SU(3)L ⊗ U(1)X model with seesaw-type neutrino masses.
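
    For context, the standard group-theory branchings under SU(6) ⊃ SU(3) ⊗ SU(3) ⊗ U(1) are sketched below; this is my addition for orientation, with charges in an arbitrary normalization, not a quotation from the paper.

```latex
% Branchings of the fundamental and antisymmetric representations under
% SU(6) -> SU(3) x SU(3) x U(1), U(1) charges in an arbitrary normalization:
\[ \mathbf{6} \ \to\ (\mathbf{3},\mathbf{1})_{1} \oplus (\mathbf{1},\mathbf{3})_{-1},
   \qquad
   \mathbf{15} = \Lambda^{2}\mathbf{6} \ \to\
   (\bar{\mathbf{3}},\mathbf{1})_{2} \oplus (\mathbf{3},\mathbf{3})_{0}
   \oplus (\mathbf{1},\bar{\mathbf{3}})_{-2}. \]
```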

  12. Minimization of Basis Risk in Parametric Earthquake Cat Bonds

    NASA Astrophysics Data System (ADS)

    Franco, G.

    2009-12-01

    second disfavors the investor who loses part of the investment without a reasonable cause. A streamlined and fairly automated methodology has been developed to design parametric triggers that minimize the basis risk while still maintaining their level of relative simplicity. Basis risk is minimized in both, first and second generation, parametric cat bonds through an optimization procedure that aims to find the most appropriate magnitude thresholds, geographic zones, and weight index values. Sensitivity analyses to different design assumptions show that first generation cat bonds are typically affected by a large negative basis risk, namely the risk that the bond will not trigger for events within the risk level transferred, unless a sufficiently small geographic resolution is selected to define the trigger zones. Second generation cat bonds in contrast display a bias towards negative or positive basis risk depending on the degree of the polynomial used as well as on other design parameters. Two examples are presented, the construction of a first generation parametric trigger mechanism for Costa Rica and the design of a second generation parametric index for Japan.
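
    The toy sketch below illustrates the first-generation design step described: searching per-zone magnitude thresholds to minimize a basis-risk measure over a simulated event catalog. The catalog, loss model, and risk measure are invented for illustration only.

```python
import numpy as np

# Toy event catalog: magnitude, zone index, and modeled portfolio loss
# (all numbers invented for illustration).
rng = np.random.default_rng(7)
n = 500
mag = rng.uniform(5.0, 8.5, n)
zone = rng.integers(0, 3, n)
loss = np.where(mag > 7.0, (mag - 7.0) * rng.uniform(0.5, 1.5, n), 0.0)

attachment = 0.3                 # loss level the bond is meant to cover

def basis_risk(thresholds):
    """Mean squared mismatch between the binary parametric payout and the
    actual loss exceedance; penalizes both negative basis risk (loss but
    no trigger) and positive basis risk (trigger but no loss)."""
    payout = (mag >= thresholds[zone]).astype(float)
    target = (loss >= attachment).astype(float)
    return float(np.mean((payout - target) ** 2))

# Brute-force search over per-zone magnitude thresholds.
grid = np.arange(6.0, 8.5, 0.1)
best = min(((basis_risk(np.array([t0, t1, t2])), (t0, t1, t2))
            for t0 in grid for t1 in grid for t2 in grid),
           key=lambda p: p[0])
print("min basis risk %.4f at thresholds %s" % best)
```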

  13. Work Domain Analysis: Theoretical Concepts and Methodology

    DTIC Science & Technology

    2005-02-01

    Naikar, Neelam; Hopcroft, Robyn; Moylan, Anna (Air Operations). This report presents a theoretical and methodological approach for work domain analysis (WDA), the first phase of cognitive work analysis. The report: (1) addresses a number of

  14. Simplified Methodology for Calculating Building Heating Loads.

    DTIC Science & Technology

    1980-11-01

    Develops an inexpensive, accurate, and reliable simplified methodology, termed the "Modified Bin Method", for calculating building heating loads (Air Force Institute of Technology, Wright-Patterson AFB, OH; report AD-A101 725, November 1980).
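
    For orientation, a generic (unmodified) bin-method heating-load estimate looks like the sketch below; the report's specific modifications are not reproduced, and all numbers are illustrative.

```python
# Generic bin-method heating-load estimate: group annual weather hours into
# outdoor-temperature bins and sum the load contribution of each bin.
# (A sketch of the family of methods the report's "Modified Bin Method"
# belongs to; the report's modifications are not reproduced here.)
ua = 450.0        # overall building conductance, W/K (illustrative)
t_balance = 15.0  # balance-point temperature, deg C
bins = [          # (bin midpoint temperature deg C, hours per year in bin)
    (-10.0, 120), (-5.0, 300), (0.0, 700),
    (5.0, 1100), (10.0, 1400), (15.0, 1500),
]

heating_kwh = sum(
    ua * max(t_balance - t_bin, 0.0) * hours / 1000.0  # W*h -> kWh
    for t_bin, hours in bins
)
print(f"annual heating load ~ {heating_kwh:.0f} kWh")
```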

  15. Methodology for Validating Building Energy Analysis Simulations

    SciTech Connect

    Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  16. Constructivism: a naturalistic methodology for nursing inquiry.

    PubMed

    Appleton, J V; King, L

    1997-12-01

    This article will explore the philosophical underpinnings of the constructivist research paradigm. Despite its increasing popularity in evaluative health research studies there is limited recognition of constructivism in popular research texts. Lincoln and Guba's original approach to constructivist methodology is outlined and a detailed framework for nursing research is offered. Fundamental issues and concerns surrounding this methodology are debated and differences between method and methodology are highlighted.

  17. Grounded theory methodology--narrativity revisited.

    PubMed

    Ruppel, Paul Sebastian; Mey, Günter

    2015-06-01

    This article aims to illuminate the role of narrativity in Grounded Theory Methodology and to explore an approach within Grounded Theory Methodology that is sensitized towards aspects of narrativity. The suggested approach takes into account narrativity as an aspect of the underlying data. It reflects how narrativity could be conceptually integrated and systematically used for shaping the way in which coding, category development and the presentation of results in a Grounded Theory Methodology study proceed.

  18. Nuclear insurance risk assessment using risk-based methodology

    SciTech Connect

    Wendland, W.G. )

    1992-01-01

    This paper presents American Nuclear Insurers' (ANI's) and Mutual Atomic Energy Liability Underwriters' (MAELU's) process and experience for conducting nuclear insurance risk assessments using a risk-based methodology. The process is primarily qualitative and uses traditional insurance risk assessment methods and an approach developed under the auspices of the American Society of Mechanical Engineers (ASME) in which ANI/MAELU is an active sponsor. This process assists ANI's technical resources in identifying where to look for insurance risk in an industry in which insurance exposure tends to be dynamic and nonactuarial. The process is an evolving one that also seeks to minimize the impact on insureds while maintaining a mutually agreeable risk tolerance.

  19. Sonic Boom Mitigation Through Aircraft Design and Adjoint Methodology

    NASA Technical Reports Server (NTRS)

    Rallabhandi, Siriam K.; Diskin, Boris; Nielsen, Eric J.

    2012-01-01

    This paper presents a novel approach to the design of the supersonic aircraft outer mold line (OML) by optimizing the A-weighted loudness of the sonic boom signature predicted on the ground. The optimization process uses sensitivity information obtained by coupling the discrete adjoint formulations for the augmented Burgers equation and the Computational Fluid Dynamics (CFD) equations. This coupled formulation links the loudness of the ground boom signature to the aircraft geometry, thus allowing efficient shape optimization for the purpose of minimizing loudness. The accuracy of the adjoint-based sensitivities is verified against sensitivities obtained using an independent complex-variable approach. The adjoint-based optimization methodology is applied to a configuration previously optimized using alternative state-of-the-art optimization methods and produces additional loudness reduction. The results of the optimizations are reported and discussed.
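
    The complex-variable check mentioned above is commonly the complex-step derivative; the sketch below shows the idea on a toy function (not the Burgers/CFD system).

```python
import numpy as np

# Complex-step differentiation: f'(x) ~= Im(f(x + i*h)) / h, accurate to
# machine precision because there is no subtractive cancellation. This is
# the kind of independent check used to verify adjoint sensitivities.
def f(x):
    return np.exp(x) * np.sin(3.0 * x)   # toy function

x0, h = 0.8, 1e-30
cs = np.imag(f(x0 + 1j * h)) / h                       # complex-step derivative
exact = np.exp(x0) * (np.sin(3 * x0) + 3 * np.cos(3 * x0))
print(cs - exact)                                      # ~0 to machine precision
```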

  20. Spectral zone selection methodology for pebble bed reactors

    SciTech Connect

    Ramatsemela Mphahlele; Abderrafi M. Ougouag; Kostadin N. Ivanov; Hans D. Gougar

    2011-01-01

    A methodology is developed for determining the boundaries of spectral zones for pebble bed reactors. A spectral zone is defined as a region made up of a number of nodes whose characteristics are collectively similar and that are assigned the same few-group diffusion constants. The spectral zones are selected in such a manner that the difference (error) between the reference transport solution and the diffusion code solution takes a minimum value; this is achieved by choosing the spectral zones that optimally minimize this error. The objective function for the optimization algorithm is the total reaction-rate error, defined as the sum of the leakage, absorption, and fission reaction-rate errors in each zone. The spectral zones are selected such that the core calculation results based on diffusion theory are within an acceptable tolerance of a proper transport reference solution. Through this work, a consistent approach for identifying spectral zones that yield more accurate diffusion results is introduced.
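
    A toy version of the zone-selection objective, assuming node-wise reference and diffusion reaction rates are available: enumerate candidate 1D zonings and keep the one with the smallest total reaction-rate error. The data and partition scheme are illustrative, not the paper's algorithm.

```python
import numpy as np
from itertools import combinations

# Toy node-wise reaction rates from a "transport reference" and a
# "diffusion" solution (illustrative numbers; in practice these come
# from the transport and diffusion codes).
rng = np.random.default_rng(3)
n_nodes = 8
ref = rng.uniform(1.0, 2.0, (n_nodes, 3))      # leakage, absorption, fission
diff = ref + rng.normal(0.0, 0.05, (n_nodes, 3))

def zoning_error(boundaries):
    """Total reaction-rate error for a 1D zoning given interior boundary
    indices: within each zone, rates are lumped (summed) and compared."""
    edges = [0, *boundaries, n_nodes]
    err = 0.0
    for a, b in zip(edges, edges[1:]):
        err += np.abs(ref[a:b].sum(axis=0) - diff[a:b].sum(axis=0)).sum()
    return err

# Exhaustive search over all 3-zone partitions of the 1D node line.
best = min(combinations(range(1, n_nodes), 2), key=zoning_error)
print("best zone boundaries:", best, "error:", zoning_error(best))
```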

  1. Approaches to semi-synthetic minimal cells: a review

    NASA Astrophysics Data System (ADS)

    Luisi, Pier Luigi; Ferri, Francesca; Stano, Pasquale

    2006-01-01

    What follows is a synthetic review of the minimal living cell, defined as an artificial or semi-artificial cell having the minimal and sufficient number of components to be considered alive. We describe concepts and experiments based on these constructions, and we point out that an operational definition of the minimal cell does not define a single species, but rather a broad family of interrelated cell-like structures. The relevance of this research, considering that the minimal cell should also correspond to the early simple cell in the origin of life and early evolution, is also explained. In addition, we present detailed data on the minimal genome, with observations cited by several authors who agree on setting the theoretical full-fledged minimal genome at a figure between 200 and 300 genes. However, further theoretical assumptions may significantly reduce this number (i.e. by eliminating ribosomal proteins and by limiting DNA and RNA polymerases to only a few, less specific molecular species). Generally, the experimental approach to minimal cells consists in utilizing liposomes as cell models and in filling them with genes/enzymes corresponding to minimal cellular functions. To date, a few research groups have successfully induced the expression of single proteins, such as the green fluorescent protein, inside liposomes. Here, different approaches are described and compared. Present constructs are still rather far from the minimal cell, and experimental as well as theoretical difficulties opposing further reduction of complexity are discussed. While most of these minimal cell constructions may represent relatively poor imitations of a modern full-fledged cell, further studies will begin precisely from these constructs. In conclusion, we give a brief outline of the next possible steps on the road map to the minimal cell.

  2. [Phase I cancer trials methodology].

    PubMed

    Le Tourneau, Christophe; Faivre, Sandrine; Raymond, Eric; Diéras, Véronique

    2007-11-01

    The main objective of phase I cancer trials is to determine precisely the recommended dose of an anticancer agent, as a single agent or in combinations of anticancer agents (including cytotoxic agents, immunotherapy, radiotherapy...), administered for the first time in man, before proceeding with clinical development in phase II and III trials. The recommended dose must have the greatest efficacy with acceptable toxicity. For anticancer agents, the risk/benefit ratio is high, since toxicities associated with many cancer therapeutic agents are substantial and efficacy is often limited. Thus, phase I cancer trials present unique challenges in comparison with other therapeutic areas. Indeed, it is essential to minimize the number of patients treated at subefficient dose levels and, at the same time, not to expose patients to unacceptable toxicity. Historically, the first method used was Fibonacci escalation. The major problems raised with this method have been the length of the trials and the risk of treating substantial numbers of patients at nontherapeutic doses. Novel methods have therefore been developed that modify the number of patients included at each dose level and the rapidity of dose escalation. These methods include pharmacologically guided dose escalation, escalation with overdose control and the continual reassessment method, both statistically based dose escalation methods, and the accelerated titration designs. Concerning targeted anticancer therapies, the therapeutic effect on the target, owing to their higher specificity, can be obtained at doses that have little toxicity. Using toxicity to determine the recommended dose for phase II trials, as is the case for "classical" anticancer agents, does not seem to be sufficient. Alternatives for determining the optimal biological dose include measurement of target inhibition, pharmacokinetic analysis and functional imaging.
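
    As an illustration of the historical scheme mentioned, the sketch below generates a textbook modified-Fibonacci dose ladder (conventional increments of 100%, 67%, 50%, 40%, then 33% per level); the starting dose is arbitrary.

```python
# Classic "modified Fibonacci" dose-escalation ladder: each level increases
# the previous dose by a shrinking percentage (100%, 67%, 50%, 40%, then
# 33% for all later steps). Starting dose is illustrative.
def modified_fibonacci_doses(start_dose: float, n_levels: int):
    increments = [1.00, 0.67, 0.50, 0.40]   # then 0.33 thereafter
    doses, d = [start_dose], start_dose
    for i in range(n_levels - 1):
        inc = increments[i] if i < len(increments) else 0.33
        d *= 1.0 + inc
        doses.append(round(d, 1))
    return doses

print(modified_fibonacci_doses(10.0, 7))
# [10.0, 20.0, 33.4, 50.1, 70.1, 93.3, 124.1]
```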

  3. WASTE MINIMIZATION ASSESSMENT FOR A MANUFACTURER OF REFURBISHED RAILCAR ASSEMBLIES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has funded a pilot project to assist small- and medium-size manufacturers who want to minimize their generation of waste but who lack the expertise to do so. Waste Minimization Assessment Centers (WMACs) were established at selected ...

  4. Non-Minimal RF2-TYPE Corrections to Holographic Superconductor

    NASA Astrophysics Data System (ADS)

    Sert, Özcan; Adak, Muzaffer

    2013-12-01

    We study (2+1)-dimensional holographic superconductors in the presence of an electromagnetic field non-minimally coupled to gravity, considering an arbitrary linear combination of RF2-type invariants with three parameters. Our analytical analysis shows that the non-minimal couplings affect the condensate and the critical temperature.
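
    The three RF2-type invariants usually meant by such a linear combination are, in a common basis (my reconstruction of the standard couplings, not a quotation from the paper):

```latex
% A common basis of RF^2-type non-minimal couplings with three parameters:
\[ \mathcal{L}_{\mathrm{int}} =
   a_{1}\, R\, F_{\mu\nu}F^{\mu\nu}
 + a_{2}\, R_{\mu\nu}\, F^{\mu\alpha}F^{\nu}{}_{\alpha}
 + a_{3}\, R_{\mu\nu\alpha\beta}\, F^{\mu\nu}F^{\alpha\beta}. \]
```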

  5. Probing the non-minimal Higgs sector at the SSC

    SciTech Connect

    Gunion, J.F.; Haber, H.E.; Komamiya, S.; Yamamoto, H.; Barbaro-Galtieri, A.

    1987-11-01

    Non-minimal Higgs sectors occur in the Standard Model with more than one Higgs doublet, as well as in theories that go beyond the Standard Model. In this report, we discuss how Higgs search strategies must be altered, with respect to the Standard Model approaches, in order to probe the non-minimal Higgs sectors at the SSC.

  6. Waste Minimization Assessment for Multilayered Printed Circuit Board Manufacturing

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has funded a pilot project to assist small- and medium-size manufacturers who want to minimize their generation of hazardous waste but lack the expertise to do so. Waste Minimization Assessment Centers (WMACs) were established at s...

  7. 10 CFR 850.25 - Exposure reduction and minimization.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Exposure reduction and minimization. 850.25 Section 850.25 Energy DEPARTMENT OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.25 Exposure reduction and minimization. (a) The responsible employer must ensure that no worker...

  8. 10 CFR 850.25 - Exposure reduction and minimization.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Exposure reduction and minimization. 850.25 Section 850.25 Energy DEPARTMENT OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.25 Exposure reduction and minimization. (a) The responsible employer must ensure that no worker...

  9. 10 CFR 850.25 - Exposure reduction and minimization.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Exposure reduction and minimization. 850.25 Section 850.25 Energy DEPARTMENT OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.25 Exposure reduction and minimization. (a) The responsible employer must ensure that no worker...

  10. 10 CFR 850.25 - Exposure reduction and minimization.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false Exposure reduction and minimization. 850.25 Section 850.25 Energy DEPARTMENT OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.25 Exposure reduction and minimization. (a) The responsible employer must ensure that no worker...

  11. 10 CFR 850.25 - Exposure reduction and minimization.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Exposure reduction and minimization. 850.25 Section 850.25 Energy DEPARTMENT OF ENERGY CHRONIC BERYLLIUM DISEASE PREVENTION PROGRAM Specific Program Requirements § 850.25 Exposure reduction and minimization. (a) The responsible employer must ensure that no worker...

  12. Minimally invasive treatment of multilevel spinal epidural abscess.

    PubMed

    Safavi-Abbasi, Sam; Maurer, Adrian J; Rabb, Craig H

    2013-01-01

    The use of minimally invasive tubular retractor microsurgery for treatment of multilevel spinal epidural abscess is described. This technique was used in 3 cases, and excellent results were achieved. The authors conclude that multilevel spinal epidural abscesses can be safely and effectively managed using microsurgery via a minimally invasive tubular retractor system.

  13. Many Denjoy minimal sets for monotone recurrence relations

    NASA Astrophysics Data System (ADS)

    Wang, Ya-Nan; Qin, Wen-Xin

    2014-09-01

    We extend Mather's work (1985 Comment. Math. Helv. 60 508-57) to high-dimensional cylinder maps defined by monotone recurrence relations, e.g. the generalized Frenkel-Kontorova model with finite range interactions. We construct uncountably many Denjoy minimal sets provided that the Birkhoff minimizers with some irrational rotation number ω do not form a foliation.

  14. WASTE MINIMIZATION ASSESSMENT FOR A MANUFACTURER OF PRINTED LABELS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has funded a pilot project to assist small- and medium-size manufacturers who want to minimize their generation of hazardous waste but lack the expertise to do so. Waste Minimization Assessment Centers (WMACs) were established at sel...

  15. Dens in dente: A minimally invasive nonsurgical approach!

    PubMed Central

    Hegde, Vivek; Morawala, Abdul; Gupta, Abhilasha; Khandwawala, Naqiyaa

    2016-01-01

    Dens invaginatus, also known as dens in dente, is a rare anomaly affecting human dentition. The condition results in invagination of an amelodental structure within the pulp. This case report discusses the current management protocol of dens invaginatus using a minimally invasive and nonsurgical treatment option. As with most conditions, early diagnosis and preventive measures help minimize complications in dens invaginatus cases. PMID:27656073

  16. WASTE MINIMIZATION ASSESSMENT FOR A MANUFACTURER OF SPEED REDUCTION EQUIPMENT

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has funded a pilot project to assist small- and medium-size manufacturers who want to minimize their generation of hazardous waste but lack the expertise to do so. Waste Minimization Assessment Centers (WMACs) were established at sel...

  17. Minimal mass size of a stable ³He cluster

    SciTech Connect

    Guardiola, R.; Navarro, J.

    2005-03-01

    The minimal number of ³He atoms required to form a bound cluster has been estimated by means of a diffusion Monte Carlo procedure within the fixed-node approximation. Several importance sampling wave functions have been employed in order to consider different shell-model configurations. The resulting upper bound for the minimal number is 32 atoms.

  18. WASTE MINIMIZATION ASSESSMENT FOR A MANUFACTURER OF SHEET METAL COMPONENTS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has funded a pilot project to assist small- and medium-size manufacturers who want to minimize their generation of waste but who lack the expertise to do so. In an effort to assist these manufacturers, Waste Minimization Assessment Cente...

  19. Anyons in quantum mechanics with a minimal length

    NASA Astrophysics Data System (ADS)

    Buisseret, Fabien

    2017-02-01

    The existence of anyons, i.e. quantum states with arbitrary spin, is a generic feature of standard quantum mechanics in (2+1)-dimensional Minkowski spacetime. Here it is shown that relativistic anyons may also exist in quantum theories in which a minimal length is present. The interplay between minimal-length and arbitrary-spin effects is discussed.
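
    One common minimal-length realization, given here only as context (the paper may use a different deformation), is the Kempf-type generalized uncertainty principle:

```latex
% Deformed commutator and the resulting generalized uncertainty relation:
\[ [\hat{x}, \hat{p}] = i\hbar\left(1 + \beta\,\hat{p}^{\,2}\right), \qquad
   \Delta x\,\Delta p \ \ge\ \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^{2}\right), \]
% minimizing the right-hand side over \Delta p gives a nonzero minimal length:
\[ (\Delta x)_{\min} = \hbar\sqrt{\beta}. \]
```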

  20. National Institutes of Health: Mixed waste minimization and treatment

    SciTech Connect

    1995-08-01

    The Appalachian States Low-Level Radioactive Waste Commission requested the US Department of Energy's National Low-Level Waste Management Program (NLLWMP) to assist the biomedical community in becoming more knowledgeable about its mixed waste streams, to help minimize the mixed waste stream generated by the biomedical community, and to identify applicable treatment technologies for these mixed waste streams. As the first step in the waste minimization process, liquid low-level radioactive mixed waste (LLMW) streams generated at the National Institutes of Health (NIH) were characterized and combined into similar process categories. This report identifies possible waste minimization and treatment approaches for the LLMW generated by the biomedical community identified in DOE/LLW-208. In developing the report, on-site meetings were conducted with NIH personnel responsible for generating each category of waste identified as lacking disposal options. Based on the meetings and general waste minimization guidelines, potential waste minimization options were identified.