Science.gov

Sample records for minimal cut-set methodology

  1. Computing complex metabolic intervention strategies using constrained minimal cut sets.

    PubMed

    Hädicke, Oliver; Klamt, Steffen

    2011-03-01

The model-driven search for gene deletion strategies that increase the production performance of microorganisms is an essential part of metabolic engineering. One theoretical approach is based on Minimal Cut Sets (MCSs) which are minimal sets of knockouts disabling the operation of a specified set of target elementary modes. A limitation of the current approach is that MCSs can induce side effects disabling also desired functionalities. We, therefore, generalize MCSs to Constrained MCSs (cMCSs) allowing for the additional definition of a set of desired modes of which a minimum number must be preserved. Using ethanol production by Escherichia coli as an example, we demonstrate that this approach offers enormous flexibility in defining and solving knockout problems. Moreover, many existing methods can be reformulated as special cMCS problems. The cMCSs approach allows systematic enumeration of all equivalent gene deletion combinations and also helps to determine robust knockout strategies for coupled product and biomass synthesis. PMID:21147248
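
In its simplest form, the cMCS idea reduces to a constrained minimal hitting-set problem: knock out a set of reactions that intersects every target elementary mode while leaving at least a given number of desired modes untouched. The following brute-force sketch illustrates this; the modes and reaction names are invented toy data, not from the paper, and real implementations use far more efficient algorithms.

```python
from itertools import combinations

# Invented toy data: each elementary mode is the set of reactions it uses.
target_modes = [{"r1", "r2"}, {"r1", "r3"}, {"r2", "r4"}]   # modes to disable
desired_modes = [{"r3", "r5"}, {"r4", "r5"}, {"r2", "r5"}]  # modes to protect
reactions = sorted(set().union(*target_modes, *desired_modes))
min_desired = 1   # at least this many desired modes must keep operating

def is_cmcs(knockouts):
    """Hit every target mode while sparing at least min_desired desired modes."""
    hits_all = all(knockouts & m for m in target_modes)
    survivors = sum(1 for m in desired_modes if not (knockouts & m))
    return hits_all and survivors >= min_desired

def enumerate_cmcs():
    """Enumerate all constrained minimal cut sets by increasing cardinality."""
    found = []
    for k in range(1, len(reactions) + 1):
        for combo in combinations(reactions, k):
            s = set(combo)
            if any(f <= s for f in found):  # contains a smaller cMCS: not minimal
                continue
            if is_cmcs(s):
                found.append(s)
    return found

for cmcs in enumerate_cmcs():
    print(sorted(cmcs))
```

Because adding knockouts can only reduce the number of surviving desired modes, enumerating by cardinality with the subset check is sufficient for minimality.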

  2. Comparison and improvement of algorithms for computing minimal cut sets

    PubMed Central

    2013-01-01

Background Constrained minimal cut sets (cMCSs) have recently been introduced as a framework to enumerate minimal genetic intervention strategies for targeted optimization of metabolic networks. Two different algorithmic schemes (adapted Berge algorithm and binary integer programming) have been proposed to compute cMCSs from elementary modes. However, in their original formulation both algorithms are not fully comparable. Results Here we show that by a small extension to the integer program both methods become equivalent. Furthermore, based on well-known preprocessing procedures for integer programming we present efficient preprocessing steps which can be used for both algorithms. We then benchmark the numerical performance of the algorithms in several realistic medium-scale metabolic models. The benchmark calculations reveal (i) that these preprocessing steps can lead to an enormous speed-up under both algorithms, and (ii) that the adapted Berge algorithm outperforms the binary integer approach. Conclusions Generally, both of our new implementations are at least one order of magnitude faster than other currently available implementations. PMID:24191903

  3. CUTSETS - MINIMAL CUT SET CALCULATION FOR DIGRAPH AND FAULT TREE RELIABILITY MODELS

    NASA Technical Reports Server (NTRS)

    Iverson, D. L.

    1994-01-01

Fault tree and digraph models are frequently used for system failure analysis. Both types of models represent a failure space view of the system using AND and OR nodes in a directed graph structure. Fault trees must have a tree structure and do not allow cycles or loops in the graph. Digraphs allow any pattern of interconnection between nodes, including loops in the graph. A common operation performed on digraph and fault tree models is the calculation of minimal cut sets. A cut set is a set of basic failures that could cause a given target failure event to occur. A minimal cut set for a target event node in a fault tree or digraph is any cut set for the node with the property that if any one of the failures in the set is removed, the occurrence of the other failures in the set will not cause the target failure event. CUTSETS will identify all the minimal cut sets for a given node. The CUTSETS package contains programs that solve for minimal cut sets of fault trees and digraphs using object-oriented programming techniques. These cut set codes can be used to solve graph models for reliability analysis and identify potential single point failures in a modeled system. The fault tree minimal cut set code reads in a fault tree model input file with each node listed in a text format. In the input file the user specifies a top node of the fault tree and a maximum cut set size to be calculated. CUTSETS will find minimal sets of basic events which would cause the failure at the output of a given fault tree gate. The program can find all the minimal cut sets of a node, or minimal cut sets up to a specified size. The algorithm performs a recursive top down parse of the fault tree, starting at the specified top node, and combines the cut sets of each child node into sets of basic event failures that would cause the failure event at the output of that gate. Minimal cut set solutions can be found for all nodes in the fault tree or just for the top node. The digraph cut set code uses the same
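
The recursive top-down parse described above can be sketched as follows: an OR gate unions its children's cut sets, an AND gate combines one cut set from each child, and a subset-minimization pass is applied at each gate. The fault tree below is a hypothetical example, not CUTSETS' actual input format.

```python
# Hypothetical fault tree: gates map to ("AND"|"OR", children); leaves are basic events.
tree = {
    "TOP": ("OR", ["G1", "G2"]),
    "G1": ("AND", ["pump_a_fails", "pump_b_fails"]),
    "G2": ("AND", ["power_loss", "G3"]),
    "G3": ("OR", ["pump_a_fails", "valve_stuck"]),
}

def minimize(sets):
    """Keep only sets that contain no other set as a proper subset."""
    return [s for s in sets if not any(t < s for t in sets)]

def cut_sets(node):
    """Recursive top-down parse: OR gates union their children's cut sets,
    AND gates combine one cut set from each child."""
    if node not in tree:                          # basic event (leaf)
        return [frozenset([node])]
    gate, children = tree[node]
    child_sets = [cut_sets(c) for c in children]
    if gate == "OR":
        combined = [s for cs in child_sets for s in cs]
    else:  # AND
        combined = child_sets[0]
        for cs in child_sets[1:]:
            combined = [a | b for a in combined for b in cs]
    return minimize(list(set(combined)))

for mcs in sorted(cut_sets("TOP"), key=sorted):
    print(sorted(mcs))
```

For this toy tree the minimal cut sets are {pump_a_fails, pump_b_fails}, {power_loss, pump_a_fails}, and {power_loss, valve_stuck}.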

  4. Fast computation of minimal cut sets in metabolic networks with a Berge algorithm that utilizes binary bit pattern trees.

    PubMed

    Jungreuthmayer, Christian; Beurton-Aimar, Marie; Zanghellini, Jürgen

    2013-01-01

Minimal cut sets are a valuable tool for analyzing metabolic networks and for identifying optimal gene intervention strategies by eliminating unwanted metabolic functions and keeping desired functionality. Minimal cut sets rely on the concept of elementary flux modes, which are sets of indivisible metabolic pathways under steady-state conditions. However, the computation of minimal cut sets is nontrivial, as even medium-sized metabolic networks with just 100 reactions easily have several hundred million elementary flux modes. We developed a minimal cut set tool that implements the well-known Berge algorithm and utilizes a novel approach to significantly reduce the program run time by using binary bit pattern trees. By using the introduced tree approach, the size of metabolic models that can be analyzed and optimized by minimal cut sets is pushed to new and considerably higher limits.
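
Minimal cut sets are exactly the minimal hitting sets of the target elementary modes, which is what the Berge algorithm computes incrementally: process one mode at a time, extend candidates that miss it, and minimize after each step. A plain-Python sketch (without the paper's bit-pattern-tree speed-up, and on invented toy modes):

```python
def berge_minimal_hitting_sets(modes):
    """Berge-style incremental computation of the minimal hitting sets
    (= minimal cut sets) of a list of sets (= target elementary modes)."""
    candidates = {frozenset([r]) for r in modes[0]}
    for mode in modes[1:]:
        expanded = set()
        for cand in candidates:
            if cand & mode:               # candidate already hits the new mode
                expanded.add(cand)
            else:                         # otherwise extend it by each element
                for r in mode:
                    expanded.add(cand | {r})
        # minimization pass: discard proper supersets of other candidates
        candidates = {s for s in expanded if not any(t < s for t in expanded)}
    return candidates

# Three toy modes whose minimal cut sets are all 2-element reaction pairs:
modes = [{"r1", "r2"}, {"r2", "r3"}, {"r1", "r3"}]
for mcs in sorted(berge_minimal_hitting_sets(modes), key=sorted):
    print(sorted(mcs))
```

The minimization pass after every mode is the expensive step; the paper's contribution is accelerating precisely these superset tests with binary bit pattern trees.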

  5. A new efficient algorithm generating all minimal S-T cut-sets in a graph-modeled network

    NASA Astrophysics Data System (ADS)

    Malinowski, Jacek

    2016-06-01

A new algorithm finding all minimal s-t cut-sets in a graph-modeled network with failing links and nodes is presented. It is based on the analysis of the tree of acyclic s-t paths connecting a given pair of nodes in the considered structure. The construction of such a tree is required by many existing algorithms for s-t cut-sets generation in order to eliminate "stub" edges or subgraphs through which no acyclic path passes. The algorithm operates on the acyclic paths tree alone, i.e. no other analysis of the network's topology is necessary. It can be applied to both directed and undirected graphs, as well as partly directed ones. It is worth noting that the cut-sets can be composed of both links and nodes, while many known algorithms do not take nodes into account, which is quite restrictive from the practical point of view. The developed cut-sets generation technique makes the algorithm significantly faster than most of the previous methods, as demonstrated by experiments.
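
The core idea, enumerate the acyclic s-t paths and then find every minimal set of failable components that breaks all of them, can be sketched as follows. The small directed network, and the convention that intermediate nodes as well as links may fail, are illustrative assumptions; the brute-force transversal search below is far simpler (and slower) than the paper's algorithm.

```python
from itertools import chain, combinations

# Hypothetical directed network; cut sets may contain both links and nodes.
edges = {("s", "a"), ("s", "b"), ("a", "t"), ("b", "t"), ("a", "b")}

def acyclic_paths(src, dst, visited=()):
    """DFS enumeration of all acyclic s-t paths; each path is returned as the
    set of its failable components: traversed links plus intermediate nodes."""
    if src == dst:
        yield frozenset()
        return
    for (u, v) in edges:
        if u == src and v not in visited:
            for rest in acyclic_paths(v, dst, visited + (src,)):
                comps = {(u, v)} | rest
                if v != dst:
                    comps.add(v)          # intermediate node can also fail
                yield frozenset(comps)

def minimal_st_cut_sets(src, dst):
    """Brute-force the minimal transversals of the path set by cardinality."""
    paths = list(acyclic_paths(src, dst))
    components = sorted(set(chain.from_iterable(paths)), key=str)
    found = []
    for k in range(1, len(components) + 1):
        for combo in combinations(components, k):
            s = frozenset(combo)
            if any(f <= s for f in found):
                continue                   # contains a smaller cut set
            if all(s & p for p in paths):  # breaks every s-t path
                found.append(s)
    return found

print(len(minimal_st_cut_sets("s", "t")))
```

For this network there are three acyclic s-t paths, and the search finds nine minimal cut sets, including mixed ones such as {node a, link (b, t)}.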

  6. FTA Basic Event & Cut Set Ranking.

    1999-05-04

Version 00 IMPORTANCE computes various measures of probabilistic importance of basic events and minimal cut sets to a fault tree or reliability network diagram. The minimal cut sets, the failure rates and the fault duration times (i.e., the repair times) of all basic events contained in the minimal cut sets are supplied as input data. The failure and repair distributions are assumed to be exponential. IMPORTANCE, a quantitative evaluation code, then determines the probability of the top event and computes the importance of minimal cut sets and basic events by a numerical ranking. Two measures are computed. The first describes system behavior at one point in time; the second describes sequences of failures that cause the system to fail in time. All measures are computed assuming statistical independence of basic events. In addition, system unavailability and expected number of system failures are computed by the code.
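
The quantification IMPORTANCE performs can be illustrated in miniature: derive each basic event's steady-state unavailability from its exponential failure rate and repair time, combine minimal cut sets under independence, and rank events by a Fussell-Vesely-style importance. The exact measures the code computes may differ; all cut sets and parameter values below are invented.

```python
# Invented inputs in the spirit of IMPORTANCE: minimal cut sets plus exponential
# failure rates (per hour) and mean repair times (hours) for each basic event.
cut_sets = [{"A", "B"}, {"A", "C"}, {"D"}]
rate = {"A": 1e-3, "B": 5e-4, "C": 2e-3, "D": 1e-5}
repair = {"A": 10.0, "B": 24.0, "C": 8.0, "D": 48.0}

def unavailability(e):
    """Steady-state unavailability q = (lambda * tau) / (1 + lambda * tau)."""
    lt = rate[e] * repair[e]
    return lt / (1.0 + lt)

def cut_set_prob(cs):
    """Cut set probability assuming statistically independent basic events."""
    p = 1.0
    for e in cs:
        p *= unavailability(e)
    return p

def top_event_prob():
    """Min-cut upper bound: 1 - product over cut sets of (1 - P(cut set))."""
    p = 1.0
    for cs in cut_sets:
        p *= 1.0 - cut_set_prob(cs)
    return 1.0 - p

def fussell_vesely(e):
    """Fraction of top event probability contributed by cut sets containing e."""
    contrib = sum(cut_set_prob(cs) for cs in cut_sets if e in cs)
    return contrib / top_event_prob()

for e in sorted(rate):
    print(e, round(fussell_vesely(e), 4))
```

With these numbers the single-event cut set {D} dominates the ranking despite D's low failure rate, which is the kind of insight a numerical importance ranking is meant to surface.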

  7. SIGPI. Fault Tree Cut Set System Performance

    SciTech Connect

    Patenaude, C.J.

    1992-01-13

    SIGPI computes the probabilistic performance of complex systems by combining cut set or other binary product data with probability information on each basic event. SIGPI is designed to work with either coherent systems, where the system fails when certain combinations of components fail, or noncoherent systems, where at least one cut set occurs only if at least one component of the system is operating properly. The program can handle conditionally independent components, dependent components, or a combination of component types and has been used to evaluate responses to environmental threats and seismic events. The three data types that can be input are cut set data in disjoint normal form, basic component probabilities for independent basic components, and mean and covariance data for statistically dependent basic components.

  8. SIGPI. Fault Tree Cut Set System Performance

    SciTech Connect

    Patenaude, C.J.

    1992-01-14

    SIGPI computes the probabilistic performance of complex systems by combining cut set or other binary product data with probability information on each basic event. SIGPI is designed to work with either coherent systems, where the system fails when certain combinations of components fail, or noncoherent systems, where at least one cut set occurs only if at least one component of the system is operating properly. The program can handle conditionally independent components, dependent components, or a combination of component types and has been used to evaluate responses to environmental threats and seismic events. The three data types that can be input are cut set data in disjoint normal form, basic component probabilities for independent basic components, and mean and covariance data for statistically dependent basic components.

  9. Fault Tree Cut Set System Performance.

    2000-02-21

Version 00 SIGPI computes the probabilistic performance of complex systems by combining cut set or other binary product data with probability information on each basic event. SIGPI is designed to work with either coherent systems, where the system fails when certain combinations of components fail, or noncoherent systems, where at least one cut set occurs only if at least one component of the system is operating properly. The program can handle conditionally independent components, dependent components, or a combination of component types and has been used to evaluate responses to environmental threats and seismic events. The three data types that can be input are cut set data in disjoint normal form, basic component probabilities for independent basic components, and mean and covariance data for statistically dependent basic components.

  10. Cut set-based risk and reliability analysis for arbitrarily interconnected networks

    DOEpatents

    Wyss, Gregory D.

    2000-01-01

Method for computing all-terminal reliability for arbitrarily interconnected networks such as the United States public switched telephone network. The method includes an efficient search algorithm to generate minimal cut sets for nonhierarchical networks directly from the network connectivity diagram. Efficiency of the search algorithm stems in part from considering only link failures. The method also includes a novel quantification scheme that likewise reduces computational effort associated with assessing network reliability based on traditional risk importance measures. Vast reductions in computational effort are realized since combinatorial expansion and subsequent Boolean reduction steps are eliminated through analysis of network segmentations using a technique of assuming node failures to occur on only one side of a break in the network, and repeating the technique for all minimal cut sets generated with the search algorithm. The method functions equally well for planar and non-planar networks.
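
For a toy network, all-terminal reliability (the probability that all nodes remain mutually connected) can be computed exactly by exhausting link states, which is exactly the combinatorial expansion the patented method avoids; the brute-force version below serves only as a reference point. The link failure probabilities are illustrative.

```python
from itertools import product

# Hypothetical triangle network: undirected links with independent failure probs.
links = {("A", "B"): 0.01, ("B", "C"): 0.02, ("A", "C"): 0.05}
nodes = {"A", "B", "C"}

def connected(up_links):
    """Check all-terminal connectivity over the surviving (undirected) links."""
    seen, stack = set(), [next(iter(nodes))]
    while stack:
        n = stack.pop()
        if n in seen:
            continue
        seen.add(n)
        for (u, v) in up_links:
            if u == n and v not in seen:
                stack.append(v)
            elif v == n and u not in seen:
                stack.append(u)
    return seen == nodes

def all_terminal_reliability():
    """Exact reliability by exhausting the 2^n link states (fine for toys only)."""
    total = 0.0
    link_list = list(links)
    for states in product([True, False], repeat=len(link_list)):
        prob = 1.0
        up = []
        for link, ok in zip(link_list, states):
            prob *= (1 - links[link]) if ok else links[link]
            if ok:
                up.append(link)
        if connected(up):
            total += prob
    return total

print(round(all_terminal_reliability(), 6))
```

A triangle stays connected whenever at least two of its three links survive, so the exact answer here is the sum of four state probabilities.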

  11. Critical methodological factors in diagnosing minimal residual disease in hematological malignancies using quantitative PCR.

    PubMed

    Nyvold, Charlotte Guldborg

    2015-05-01

    Hematological malignancies are a heterogeneous group of cancers with respect to both presentation and prognosis, and many subtypes are nowadays associated with aberrations that make up excellent molecular targets for the quantification of minimal residual disease. The quantitative PCR methodology is outstanding in terms of sensitivity, specificity and reproducibility and thus an excellent choice for minimal residual disease assessment. However, the methodology still has pitfalls that should be carefully considered when the technique is integrated in a clinical setting.

  12. Proposed SPAR Modeling Method for Quantifying Time Dependent Station Blackout Cut Sets

    SciTech Connect

    John A. Schroeder

    2010-06-01

The U.S. Nuclear Regulatory Commission’s (USNRC’s) Standardized Plant Analysis Risk (SPAR) models and industry risk models take similar approaches to analyzing the risk associated with loss of offsite power and station blackout (LOOP/SBO) events at nuclear reactor plants. In both SPAR models and industry models, core damage risk resulting from a LOOP/SBO event is analyzed using a combination of event trees and fault trees that produce cut sets that are, in turn, quantified to obtain a numerical estimate of the resulting core damage risk. A proposed SPAR method for quantifying the time-dependent cut sets is sometimes referred to as a convolution method. The SPAR method reflects assumptions about the timing of emergency diesel failures, the timing of subsequent attempts at emergency diesel repair, and the timing of core damage that may be different from those often used in industry models. This paper describes the proposed SPAR method.
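
A convolution-style quantification integrates over the time at which the diesel fails, weighting each failure time by the probability that repair is not completed before core damage. The sketch below is a heavily simplified numerical illustration, not the SPAR method itself: one diesel, exponential failure and repair, a fixed battery coping time, and invented parameter values.

```python
import math

# Illustrative parameters (all assumed, not from SPAR models):
lam = 1e-3      # emergency diesel failure rate, per hour
mu = 1.0 / 8.0  # diesel repair rate, per hour (8 h mean time to repair)
mission = 24.0  # mission time after the LOOP event, hours
coping = 4.0    # battery coping time after the diesel fails, hours

def p_core_damage(steps=100000):
    """P(diesel fails during the mission AND is not repaired within the coping
    window), via midpoint-rule discretization of the integral over failure time."""
    dt = mission / steps
    total = 0.0
    for i in range(steps):
        t = (i + 0.5) * dt
        f_fail = lam * math.exp(-lam * t)      # failure-time density at t
        p_no_repair = math.exp(-mu * coping)   # repair not done in coping window
        total += f_fail * p_no_repair * dt
    return total

print(p_core_damage())
```

With a fixed coping window and exponential repair the integral collapses to (1 - e^(-lam*T)) * e^(-mu*c); the discretization is shown because realistic repair-time models (as in the SPAR approach) make the integrand genuinely time dependent.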

  13. A methodology for formulating a minimal uncertainty model for robust control system design and analysis

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Chang, B.-C.; Fischl, Robert

    1989-01-01

In the design and analysis of robust control systems for uncertain plants, the technique of formulating what is termed an M-delta model has become widely accepted and applied in the robust control literature. The M represents the transfer function matrix M(s) of the nominal system, and delta represents an uncertainty matrix acting on M(s). The uncertainty can arise from various sources, such as structured uncertainty from parameter variations or multiple unstructured uncertainties from unmodeled dynamics and other neglected phenomena. In general, delta is a block diagonal matrix, and for real parameter variations the diagonal elements are real. As stated in the literature, this structure can always be formed for any linear interconnection of inputs, outputs, transfer functions, parameter variations, and perturbations. However, very little of the literature addresses methods for obtaining this structure, and none of this literature addresses a general methodology for obtaining a minimal M-delta model for a wide class of uncertainty. Since having a delta matrix of minimum order would improve the efficiency of structured singular value (or multivariable stability margin) computations, a method of obtaining a minimal M-delta model would be useful. A generalized method of obtaining a minimal M-delta structure for systems with real parameter variations is given.

  14. Using benchmarking to minimize common DOE waste streams. Volume 1, Methodology and liquid photographic waste

    SciTech Connect

    Levin, V.

    1994-04-01

Finding innovative ways to reduce waste streams generated at Department of Energy (DOE) sites by 50% by the year 2000 is a challenge for DOE's waste minimization efforts. This report examines the usefulness of benchmarking as a waste minimization tool, specifically regarding common waste streams at DOE sites. A team of process experts from a variety of sites, a project leader, and benchmarking consultants completed the project with management support provided by the Waste Minimization Division EM-352. Using a 12-step benchmarking process, the team examined current waste minimization processes for liquid photographic waste used at their sites and used telephone and written questionnaires to find "best-in-class" industry partners willing to share information about their best waste minimization techniques and technologies through a site visit. Eastman Kodak Co., and Johnson Space Center/National Aeronautics and Space Administration (NASA) agreed to be partners. The site visits yielded strategies for source reduction, recycle/recovery of components, regeneration/reuse of solutions, and treatment of residuals, as well as best management practices. An additional benefit of the work was the opportunity for DOE process experts to network and exchange ideas with their peers at similar sites.

  15. POLLUTION BALANCE: A NEW METHODOLOGY FOR MINIMIZING WASTE PRODUCTION IN MANUFACTURING PROCESSES.

    EPA Science Inventory

A new methodology, based on a generic pollution balance equation, has been developed for minimizing waste production in manufacturing processes. A "pollution index," defined as the mass of waste produced per unit mass of a product, has been introduced to provide a quantitative meas...

  16. Ensuring transparency and minimization of methodologic bias in preclinical pain research: PPRECISE considerations.

    PubMed

    Andrews, Nick A; Latrémolière, Alban; Basbaum, Allan I; Mogil, Jeffrey S; Porreca, Frank; Rice, Andrew S C; Woolf, Clifford J; Currie, Gillian L; Dworkin, Robert H; Eisenach, James C; Evans, Scott; Gewandter, Jennifer S; Gover, Tony D; Handwerker, Hermann; Huang, Wenlong; Iyengar, Smriti; Jensen, Mark P; Kennedy, Jeffrey D; Lee, Nancy; Levine, Jon; Lidster, Katie; Machin, Ian; McDermott, Michael P; McMahon, Stephen B; Price, Theodore J; Ross, Sarah E; Scherrer, Grégory; Seal, Rebecca P; Sena, Emily S; Silva, Elizabeth; Stone, Laura; Svensson, Camilla I; Turk, Dennis C; Whiteside, Garth

    2016-04-01

    There is growing concern about lack of scientific rigor and transparent reporting across many preclinical fields of biological research. Poor experimental design and lack of transparent reporting can result in conscious or unconscious experimental bias, producing results that are not replicable. The Analgesic, Anesthetic, and Addiction Clinical Trial Translations, Innovations, Opportunities, and Networks (ACTTION) public-private partnership with the U.S. Food and Drug Administration sponsored a consensus meeting of the Preclinical Pain Research Consortium for Investigating Safety and Efficacy (PPRECISE) Working Group. International participants from universities, funding agencies, government agencies, industry, and a patient advocacy organization attended. Reduction of publication bias, increasing the ability of others to faithfully repeat experimental methods, and increased transparency of data reporting were specifically discussed. Parameters deemed essential to increase confidence in the published literature were clear, specific reporting of an a priori hypothesis and definition of primary outcome measure. Power calculations and whether measurement of minimal meaningful effect size to determine these should be a core component of the preclinical research effort provoked considerable discussion, with many but not all agreeing. Greater transparency of reporting should be driven by scientists, journal editors, reviewers, and grant funders. The conduct of high-quality science that is fully reported should not preclude novelty and innovation in preclinical pain research, and indeed, any efforts that curtail such innovation would be misguided. We believe that to achieve the goal of finding effective new treatments for patients with pain, the pain field needs to deal with these challenging issues. PMID:26683237

  17. Ensuring transparency and minimization of methodologic bias in preclinical pain research: PPRECISE considerations.

    PubMed

    Andrews, Nick A; Latrémolière, Alban; Basbaum, Allan I; Mogil, Jeffrey S; Porreca, Frank; Rice, Andrew S C; Woolf, Clifford J; Currie, Gillian L; Dworkin, Robert H; Eisenach, James C; Evans, Scott; Gewandter, Jennifer S; Gover, Tony D; Handwerker, Hermann; Huang, Wenlong; Iyengar, Smriti; Jensen, Mark P; Kennedy, Jeffrey D; Lee, Nancy; Levine, Jon; Lidster, Katie; Machin, Ian; McDermott, Michael P; McMahon, Stephen B; Price, Theodore J; Ross, Sarah E; Scherrer, Grégory; Seal, Rebecca P; Sena, Emily S; Silva, Elizabeth; Stone, Laura; Svensson, Camilla I; Turk, Dennis C; Whiteside, Garth

    2016-04-01

    There is growing concern about lack of scientific rigor and transparent reporting across many preclinical fields of biological research. Poor experimental design and lack of transparent reporting can result in conscious or unconscious experimental bias, producing results that are not replicable. The Analgesic, Anesthetic, and Addiction Clinical Trial Translations, Innovations, Opportunities, and Networks (ACTTION) public-private partnership with the U.S. Food and Drug Administration sponsored a consensus meeting of the Preclinical Pain Research Consortium for Investigating Safety and Efficacy (PPRECISE) Working Group. International participants from universities, funding agencies, government agencies, industry, and a patient advocacy organization attended. Reduction of publication bias, increasing the ability of others to faithfully repeat experimental methods, and increased transparency of data reporting were specifically discussed. Parameters deemed essential to increase confidence in the published literature were clear, specific reporting of an a priori hypothesis and definition of primary outcome measure. Power calculations and whether measurement of minimal meaningful effect size to determine these should be a core component of the preclinical research effort provoked considerable discussion, with many but not all agreeing. Greater transparency of reporting should be driven by scientists, journal editors, reviewers, and grant funders. The conduct of high-quality science that is fully reported should not preclude novelty and innovation in preclinical pain research, and indeed, any efforts that curtail such innovation would be misguided. We believe that to achieve the goal of finding effective new treatments for patients with pain, the pain field needs to deal with these challenging issues.

  18. A minimally invasive methodology based on morphometric parameters for day 2 embryo quality assessment.

    PubMed

    Molina, Inmaculada; Lázaro-Ibáñez, Elisa; Pertusa, Jose; Debón, Ana; Martínez-Sanchís, Juan Vicente; Pellicer, Antonio

    2014-10-01

    The risk of multiple pregnancy to maternal-fetal health can be minimized by reducing the number of embryos transferred. New tools for selecting embryos with the highest implantation potential should be developed. The aim of this study was to evaluate the ability of morphological and morphometric variables to predict implantation by analysing images of embryos. This was a retrospective study of 135 embryo photographs from 112 IVF-ICSI cycles carried out between January and March 2011. The embryos were photographed immediately before transfer using Cronus 3 software. Their images were analysed using the public program ImageJ. Significant effects (P < 0.05), and higher discriminant power to predict implantation were observed for the morphometric embryo variables compared with morphological ones. The features for successfully implanted embryos were as follows: four cells on day 2 of development; all blastomeres with circular shape (roundness factor greater than 0.9), an average zona pellucida thickness of 13 µm and an average of 17695.1 µm² for the embryo area. Embryo size, which is described by its area and the average roundness factor for each cell, provides two objective variables to consider when predicting implantation. This approach should be further investigated for its potential ability to improve embryo scoring.
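
The "roundness factor" used above is presumably the standard circularity measure 4πA/P², which equals 1 for a perfect circle and decreases for irregular shapes; this formula is an assumption, since the abstract does not define it.

```python
import math

# Assumed definition of the abstract's "roundness factor": the circularity
# 4 * pi * area / perimeter**2, which is 1.0 for a perfect circle.
def roundness(area_um2, perimeter_um):
    return 4.0 * math.pi * area_um2 / perimeter_um ** 2

# A perfectly circular blastomere of radius 40 um scores (numerically) 1.0;
# irregular outlines score lower, so the paper's >0.9 threshold selects
# near-circular blastomeres.
r = 40.0
print(roundness(math.pi * r ** 2, 2.0 * math.pi * r))
```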

  19. Ensuring transparency and minimization of methodologic bias in preclinical pain research: PPRECISE considerations

    PubMed Central

    Andrews, Nick A.; Latrémolière, Alban; Basbaum, Allan I.; Mogil, Jeffrey S.; Porreca, Frank; Rice, Andrew S.C.; Woolf, Clifford J.; Currie, Gillian L.; Dworkin, Robert H.; Eisenach, James C.; Evans, Scott; Gewandter, Jennifer S.; Gover, Tony D.; Handwerker, Hermann; Huang, Wenlong; Iyengar, Smriti; Jensen, Mark P.; Kennedy, Jeffrey D.; Lee, Nancy; Levine, Jon; Lidster, Katie; Machin, Ian; McDermott, Michael P.; McMahon, Stephen B.; Price, Theodore J.; Ross, Sarah E.; Scherrer, Grégory; Seal, Rebecca P.; Sena, Emily S.; Silva, Elizabeth; Stone, Laura; Svensson, Camilla I.; Turk, Dennis C.; Whiteside, Garth

    2015-01-01

There is growing concern about lack of scientific rigor and transparent reporting across many preclinical fields of biological research. Poor experimental design and lack of transparent reporting can result in conscious or unconscious experimental bias, producing results that are not replicable. The Analgesic, Anesthetic, and Addiction Clinical Trial Translations, Innovations, Opportunities, and Networks (ACTTION) public–private partnership with the U.S. Food and Drug Administration sponsored a consensus meeting of the Preclinical Pain Research Consortium for Investigating Safety and Efficacy (PPRECISE) Working Group. International participants from universities, funding agencies, government agencies, industry, and a patient advocacy organization attended. Reduction of publication bias, increasing the ability of others to faithfully repeat experimental methods, and increased transparency of data reporting were specifically discussed. Parameters deemed essential to increase confidence in the published literature were clear, specific reporting of an a priori hypothesis and definition of primary outcome measure. Power calculations and whether measurement of minimal meaningful effect size to determine these should be a core component of the preclinical research effort provoked considerable discussion, with many but not all agreeing. Greater transparency of reporting should be driven by scientists, journal editors, reviewers, and grant funders. The conduct of high-quality science that is fully reported should not preclude novelty and innovation in preclinical pain research, and indeed, any efforts that curtail such innovation would be misguided. We believe that to achieve the goal of finding effective new treatments for patients with pain, the pain field needs to deal with these challenging issues. PMID:26683237

  20. Relay chatter and operator response after a large earthquake: An improved PRA methodology with case studies

    SciTech Connect

    Budnitz, R.J.; Lambert, H.E.; Hill, E.E.

    1987-08-01

    The purpose of this project has been to develop and demonstrate improvements in the PRA methodology used for analyzing earthquake-induced accidents at nuclear power reactors. Specifically, the project addresses methodological weaknesses in the PRA systems analysis used for studying post-earthquake relay chatter and for quantifying human response under high stress. An improved PRA methodology for relay-chatter analysis is developed, and its use is demonstrated through analysis of the Zion-1 and LaSalle-2 reactors as case studies. This demonstration analysis is intended to show that the methodology can be applied in actual cases, and the numerical values of core-damage frequency are not realistic. The analysis relies on SSMRP-based methodologies and data bases. For both Zion-1 and LaSalle-2, assuming that loss of offsite power (LOSP) occurs after a large earthquake and that there are no operator recovery actions, the analysis finds very many combinations (Boolean minimal cut sets) involving chatter of three or four relays and/or pressure switch contacts. The analysis finds that the number of min-cut-set combinations is so large that there is a very high likelihood (of the order of unity) that at least one combination will occur after earthquake-caused LOSP. This conclusion depends in detail on the fragility curves and response assumptions used for chatter. Core-damage frequencies are calculated, but they are probably pessimistic because assuming zero credit for operator recovery is pessimistic. The project has also developed an improved PRA methodology for quantifying operator error under high-stress conditions such as after a large earthquake. Single-operator and multiple-operator error rates are developed, and a case study involving an 8-step procedure (establishing feed-and-bleed in a PWR after an earthquake-initiated accident) is used to demonstrate the methodology.

  1. Up-cycling waste glass to minimal water adsorption/absorption lightweight aggregate by rapid low temperature sintering: optimization by dual process-mixture response surface methodology.

    PubMed

    Velis, Costas A; Franco-Salinas, Claudia; O'Sullivan, Catherine; Najorka, Jens; Boccaccini, Aldo R; Cheeseman, Christopher R

    2014-07-01

Mixed color waste glass extracted from municipal solid waste is either not recycled, in which case it is an environmental and financial liability, or it is used in relatively low value applications such as normal weight aggregate. Here, we report on converting it into a novel glass-ceramic lightweight aggregate (LWA), potentially suitable for high added value applications in structural concrete (upcycling). The artificial LWA particles were formed by rapidly sintering (<10 min) waste glass powder with clay mixes using sodium silicate as binder and borate salt as flux. Composition and processing were optimized using response surface methodology (RSM) modeling, and specifically (i) a combined process-mixture dual RSM, and (ii) multiobjective optimization functions. The optimization considered raw materials and energy costs. Mineralogical and physical transformations occur during sintering and a cellular vesicular glass-ceramic composite microstructure is formed, with strong correlations existing between bloating/shrinkage during sintering, density and water adsorption/absorption. The diametrical expansion could be effectively modeled via the RSM and controlled to meet a wide range of specifications; here we optimized for LWA structural concrete. The optimally designed LWA is sintered in comparatively low temperatures (825-835 °C), thus potentially saving costs and lowering emissions; it had exceptionally low water adsorption/absorption (6.1-7.2% w/wd; optimization target: 1.5-7.5% w/wd); while remaining substantially lightweight (density: 1.24-1.28 g·cm⁻³; target: 0.9-1.3 g·cm⁻³). This is a considerable advancement for designing effective environmentally friendly lightweight concrete constructions, and boosting resource efficiency of waste glass flows.
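
The multiobjective optimization step can be illustrated with a one-factor sketch: given fitted response surfaces, scalarize the normalized objectives and search a grid of sintering temperatures. The quadratic/linear forms and coefficients below are invented stand-ins for the paper's dual process-mixture RSM models; only the target bands (0.9-1.3 g·cm⁻³, 1.5-7.5% w/w) come from the abstract.

```python
# Invented one-factor "response surfaces" standing in for the fitted RSM models:
# density (g/cm^3) and water absorption (% w/w) vs. sintering temperature.
def density(temp_c):
    return 1.1 + 2e-6 * (temp_c - 830.0) ** 2

def water_absorption(temp_c):
    return 6.5 + 0.05 * abs(temp_c - 828.0)

def cost(temp_c):
    """Equal-weight scalarization of the two objectives, each normalized by the
    upper end of the abstract's target band (1.3 g/cm^3 and 7.5% w/w)."""
    return density(temp_c) / 1.3 + water_absorption(temp_c) / 7.5

def best_temperature():
    """Grid search over candidate sintering temperatures, 800-850 degC."""
    grid = [800.0 + 0.5 * i for i in range(101)]
    return min(grid, key=cost)

print(best_temperature())
```

In a real RSM workflow the surfaces are fitted from designed experiments and the scalarization is replaced by desirability functions over several responses, but the optimize-over-a-fitted-model structure is the same.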

  2. Up-cycling waste glass to minimal water adsorption/absorption lightweight aggregate by rapid low temperature sintering: optimization by dual process-mixture response surface methodology.

    PubMed

    Velis, Costas A; Franco-Salinas, Claudia; O'Sullivan, Catherine; Najorka, Jens; Boccaccini, Aldo R; Cheeseman, Christopher R

    2014-07-01

Mixed color waste glass extracted from municipal solid waste is either not recycled, in which case it is an environmental and financial liability, or it is used in relatively low value applications such as normal weight aggregate. Here, we report on converting it into a novel glass-ceramic lightweight aggregate (LWA), potentially suitable for high added value applications in structural concrete (upcycling). The artificial LWA particles were formed by rapidly sintering (<10 min) waste glass powder with clay mixes using sodium silicate as binder and borate salt as flux. Composition and processing were optimized using response surface methodology (RSM) modeling, and specifically (i) a combined process-mixture dual RSM, and (ii) multiobjective optimization functions. The optimization considered raw materials and energy costs. Mineralogical and physical transformations occur during sintering and a cellular vesicular glass-ceramic composite microstructure is formed, with strong correlations existing between bloating/shrinkage during sintering, density and water adsorption/absorption. The diametrical expansion could be effectively modeled via the RSM and controlled to meet a wide range of specifications; here we optimized for LWA structural concrete. The optimally designed LWA is sintered in comparatively low temperatures (825-835 °C), thus potentially saving costs and lowering emissions; it had exceptionally low water adsorption/absorption (6.1-7.2% w/wd; optimization target: 1.5-7.5% w/wd); while remaining substantially lightweight (density: 1.24-1.28 g·cm⁻³; target: 0.9-1.3 g·cm⁻³). This is a considerable advancement for designing effective environmentally friendly lightweight concrete constructions, and boosting resource efficiency of waste glass flows. PMID:24871934

  3. Minimal Reduplication

    ERIC Educational Resources Information Center

    Kirchner, Jesse Saba

    2010-01-01

    This dissertation introduces Minimal Reduplication, a new theory and framework within generative grammar for analyzing reduplication in human language. I argue that reduplication is an emergent property in multiple components of the grammar. In particular, reduplication occurs independently in the phonology and syntax components, and in both cases…

  4. Minimal cosmography

    NASA Astrophysics Data System (ADS)

    Piazza, Federico; Schücker, Thomas

    2016-04-01

The minimal requirement for cosmography—a non-dynamical description of the universe—is a prescription for calculating null geodesics, and time-like geodesics as a function of their proper time. In this paper, we consider the most general linear connection compatible with homogeneity and isotropy, but not necessarily with a metric. A light-cone structure is assigned by choosing a set of geodesics representing light rays. This defines a "scale factor" and a local notion of distance, as that travelled by light in a given proper time interval. We find that the velocities and relativistic energies of free-falling bodies decrease in time as a consequence of cosmic expansion, but at a rate that can differ from that dictated by the usual metric framework. By extrapolating this behavior to photons' redshift, we find that the latter is in principle independent of the "scale factor". Interestingly, redshift-distance relations and other standard geometric observables are modified in this extended framework, in a way that could be experimentally tested. An extremely tight constraint on the model, however, is represented by the blackbody-ness of the cosmic microwave background. Finally, as a check, we also consider the effects of a non-metric connection in a different set-up, namely, that of a static, spherically symmetric spacetime.

  5. Esophagectomy - minimally invasive

    MedlinePlus

    Minimally invasive esophagectomy; Robotic esophagectomy; Removal of the esophagus - minimally invasive; Achalasia - esophagectomy; Barrett esophagus - esophagectomy; Esophageal cancer - esophagectomy - laparoscopic; Cancer of the ...

  6. Methodological Gravitism

    ERIC Educational Resources Information Center

    Zaman, Muhammad

    2011-01-01

    In this paper the author presents the case of the exchange marriage system to delineate a model of methodological gravitism. Such a model is not a deviation from or alteration to the existing qualitative research approaches. I have adopted culturally specific methodology to investigate spouse selection in line with the Grounded Theory Method. This…

  7. Minimal change disease

    MedlinePlus

    ... seen under a very powerful microscope called an electron microscope. Minimal change disease is the most common ... biopsy and examination of the tissue with an electron microscope can show signs of minimal change disease.

  8. Regional Shelter Analysis Methodology

    SciTech Connect

    Dillon, Michael B.; Dennison, Deborah; Kane, Jave; Walker, Hoyt; Miller, Paul

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
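The central combination step of a Regional Shelter Analysis can be sketched as a population-weighted average of building protection factors. All protection factors and occupancy fractions below are hypothetical placeholders chosen for illustration, not LLNL values.

```python
# Illustrative sketch of the Regional Shelter Analysis combination step:
# weight each building type's protection by the fraction of people there.
# All numbers are assumed, not taken from the report.
protection_factor = {
    # PF = outdoor dose / indoor dose (assumed values)
    "outdoors":        1.0,
    "wood frame home": 3.0,
    "brick home":      10.0,
    "office basement": 50.0,
}
# Fraction of the population in each location for one posture/time of day
occupancy = {
    "outdoors":        0.05,
    "wood frame home": 0.50,
    "brick home":      0.35,
    "office basement": 0.10,
}

# Population-averaged received dose, as a fraction of the outdoor dose:
# sum over locations of (fraction of people there) * (1 / PF)
avg_dose_fraction = sum(occupancy[b] / protection_factor[b] for b in occupancy)
print(f"average received dose: {avg_dose_fraction:.1%} of outdoor dose")
```

Re-running the same calculation with a different occupancy table (e.g., a daytime posture with more people in offices) is what lets the method compare postures, warning levels, and times of day.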

  9. Probabilistic inspection strategies for minimizing service failures

    NASA Technical Reports Server (NTRS)

    Brot, Abraham

    1994-01-01

The INSIM computer program, which simulates the 'limited fatigue life' environment in which aircraft structures generally operate, is described. The use of INSIM to develop inspection strategies that aim to minimize service failures is demonstrated. Damage-tolerance methodology, inspection thresholds and customized inspections are simulated using the probability of failure as the driving parameter.

  10. Minimally Invasive Valve Surgery

    PubMed Central

    Pope, Nicolas H.; Ailawadi, Gorav

    2014-01-01

    Cardiac valve surgery is life saving for many patients. The advent of minimally invasive surgical techniques has historically allowed for improvement in both post-operative convalescence and important clinical outcomes. The development of minimally invasive cardiac valve repair and replacement surgery over the past decade is poised to revolutionize the care of cardiac valve patients. Here, we present a review of the history and current trends in minimally invasive aortic and mitral valve repair and replacement, including the development of sutureless bioprosthetic valves. PMID:24797148

  11. [Minimal Change Esophagitis].

    PubMed

    Ryu, Han Seung; Choi, Suck Chei

    2016-01-25

Gastroesophageal reflux disease (GERD) is defined as a condition which develops when the reflux of gastric contents causes troublesome symptoms and long-term complications. GERD can be divided into erosive reflux disease and non-erosive reflux disease based on endoscopic findings defined by the presence of mucosal break. The Los Angeles classification excludes minimal changes as evidence of reflux esophagitis because of poor interobserver agreement. In the Asian literature, minimal changes are considered as one of the endoscopic findings of reflux esophagitis, but their clinical significance is still controversial. Minimal change esophagitis is encountered quite frequently among patients with GERD, and many endoscopists recognize such findings in their clinical practice. This review is intended to clarify the definition of minimal change esophagitis, its histology, interobserver agreement, and symptom association with GERD.

  12. Minimizing Shortness of Breath

    MedlinePlus


  13. Inverse Modeling Via Linearized Functional Minimization

    NASA Astrophysics Data System (ADS)

    Barajas-Solano, D. A.; Wohlberg, B.; Vesselinov, V. V.; Tartakovsky, D. M.

    2014-12-01

We present a novel parameter estimation methodology for transient models of geophysical systems with uncertain, spatially distributed, heterogeneous and piece-wise continuous parameters. The methodology employs a Bayesian approach to pose an inverse modeling problem for the spatial configuration of the model parameters. The likelihood of the configuration is formulated using sparse measurements of both model parameters and transient states. We propose using total variation (TV) regularization as the prior, reflecting the heterogeneous, piece-wise continuity assumption on the parameter distribution. The maximum a posteriori (MAP) estimator of the parameter configuration is then computed by minimizing the negative Bayesian log-posterior using a linearized functional minimization approach. The computation of the MAP estimator is a large-dimensional nonlinear minimization problem with two sources of nonlinearity: (1) the TV operator, and (2) the nonlinear relation between states and parameters provided by the model's governing equations. We propose a hybrid linearized functional minimization (LFM) algorithm in two stages to efficiently treat both sources of nonlinearity. The relation between states and parameters is linearized, resulting in a linear minimization sub-problem equipped with the TV operator; this sub-problem is then minimized using the Alternating Direction Method of Multipliers (ADMM). The methodology is illustrated with a transient saturated groundwater flow application in a synthetic domain, driven by external point-wise loadings representing aquifer pumping, together with an array of discrete measurements of hydraulic conductivity and transient measurements of hydraulic head. We show that our inversion strategy is able to recover the overall large-scale features of the parameter configuration, and that the reconstruction is improved by the addition of transient information of the state variable.
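The TV-regularized sub-problem solved by ADMM can be illustrated on the simplest possible case. The sketch below is generic 1-D TV denoising, not the authors' groundwater inversion; the problem sizes and penalty parameters are assumptions.

```python
import numpy as np

# Generic ADMM sketch for a total-variation (TV) regularized sub-problem:
#   minimize_x  0.5*||x - y||^2 + lam*||D x||_1
# where D is the first-difference operator. This mirrors the piece-wise
# continuity prior, but is 1-D denoising, not the paper's inversion.
def tv_admm(y, lam=1.0, rho=1.0, iters=200):
    n = len(y)
    D = np.diff(np.eye(n), axis=0)      # (n-1) x n first-difference operator
    A = np.eye(n) + rho * D.T @ D       # x-update system matrix
    x = y.copy()
    z = np.zeros(n - 1)                 # auxiliary variable for D x
    u = np.zeros(n - 1)                 # scaled dual variable
    for _ in range(iters):
        x = np.linalg.solve(A, y + rho * D.T @ (z - u))      # quadratic step
        Dx = D @ x
        # soft-thresholding (proximal operator of the l1 norm)
        z = np.sign(Dx + u) * np.maximum(np.abs(Dx + u) - lam / rho, 0.0)
        u += Dx - z                                          # dual ascent
    return x

# Piece-wise constant truth plus noise, echoing the piece-wise continuity prior
rng = np.random.default_rng(1)
truth = np.concatenate([np.full(50, 1.0), np.full(50, 3.0)])
y = truth + rng.normal(0, 0.3, 100)
x_hat = tv_admm(y, lam=2.0)
print("recovered plateaus:", x_hat[:50].mean(), "->", x_hat[50:].mean())
```

The reconstruction flattens each plateau while keeping the jump sharp, which is exactly why TV is the natural prior for piece-wise continuous parameter fields.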

  14. Mycotoxin methodology.

    PubMed

    Scott, P M

    1995-01-01

Sensitive, specific, accurate and precise methods of analysis are needed for enforcement of mycotoxin regulations, other monitoring programmes, and research studies. Rapid screening tests are useful for control at all stages of food and feed production. There is a wide choice of both quantitative and qualitative methods for the better-known mycotoxins. Those at present covered by method standardization organizations such as AOAC International are aflatoxins (including M1), Alternaria toxins, citrinin, cyclopiazonic acid, ergot alkaloids, fumonisins, ochratoxins, patulin, trichothecenes, and zearalenone. Methodology for mycotoxins is selectively reviewed in this paper, with emphasis on the procedures comprising the analytical method: sampling, extraction of naturally contaminated samples, clean-up, detection and determination, and confirmation. Also covered are automation, method comparison, and method assessment.

  15. Minimally invasive procedures

    PubMed Central

    Baltayiannis, Nikolaos; Michail, Chandrinos; Lazaridis, George; Anagnostopoulos, Dimitrios; Baka, Sofia; Mpoukovinas, Ioannis; Karavasilis, Vasilis; Lampaki, Sofia; Papaiwannou, Antonis; Karavergou, Anastasia; Kioumis, Ioannis; Pitsiou, Georgia; Katsikogiannis, Nikolaos; Tsakiridis, Kosmas; Rapti, Aggeliki; Trakada, Georgia; Zissimopoulos, Athanasios; Zarogoulidis, Konstantinos

    2015-01-01

Minimally invasive procedures, which include laparoscopic surgery, use state-of-the-art technology to reduce the damage to human tissue when performing surgery. Minimally invasive procedures require small “ports” through which the surgeon inserts thin tubes called trocars. Carbon dioxide gas may be used to inflate the area, creating a space between the internal organs and the skin. Then a miniature camera (usually a laparoscope or endoscope) is placed through one of the trocars so the surgical team can view the procedure as a magnified image on video monitors in the operating room. Specialized equipment is inserted through the trocars based on the type of surgery. Some advanced minimally invasive surgical procedures can be performed almost exclusively through a single point of entry, meaning only one small incision, such as “uniport” video-assisted thoracoscopic surgery (VATS). Not only do these procedures usually provide outcomes equivalent to traditional “open” surgery (which sometimes requires a large incision), but minimally invasive procedures (using small incisions) may offer significant benefits as well: (I) faster recovery; (II) shorter hospital stays; (III) less scarring; and (IV) less pain. In our current mini review we present the minimally invasive procedures for thoracic surgery. PMID:25861610

  16. Testing methodologies

    SciTech Connect

    Bender, M.A.

    1990-01-01

Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low level radiation exposures lie mainly in the statistical "signal to noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.
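The "signal to noise" limitation can be made concrete with Poisson counting statistics: spontaneous aberrations set a background rate, and a low dose raises it only slightly. The rates below are illustrative assumptions, not measured aberration frequencies.

```python
import math

# Back-of-envelope Poisson illustration of the "signal to noise" problem.
# Both rates are assumed for illustration only.
background = 0.002   # spontaneous aberrations per cell scored (assumed)
induced    = 0.0005  # extra aberrations per cell from a low dose (assumed)

def cells_needed(bg, extra, z=1.645):
    # Number of cells N scored so the expected excess exceeds z standard
    # deviations of the Poisson background (one-sided 95% detection):
    #   extra * N > z * sqrt(bg * N)   =>   N > (z / extra)^2 * bg
    return math.ceil((z / extra) ** 2 * bg)

print(cells_needed(background, induced))
```

Even with these optimistic numbers the count runs to tens of thousands of cells per subject, which is why low-dose detection is statistically, not technically, limited.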

  17. Ways To Minimize Bullying.

    ERIC Educational Resources Information Center

    Mueller, Mary Ellen; Parisi, Mary Joy

This report delineates a series of interventions aimed at minimizing incidents of bullying in a suburban elementary school. The social services staff was scheduled to initiate an anti-bullying initiative in fall 2001 due to the increased occurrences of bullying during the prior year. The target population consisted of third- and fourth-grade…

  18. Minimally invasive periodontal therapy.

    PubMed

    Dannan, Aous

    2011-10-01

Minimally invasive dentistry is a concept that preserves dentition and supporting structures. In periodontal treatment, however, minimally invasive procedures are largely confined to periodontal surgery, where they represent alternative approaches developed to allow less extensive manipulation of surrounding tissues than conventional procedures while accomplishing the same objectives. In this review, the concept of minimally invasive periodontal surgery (MIPS) is first explained. An electronic search for all studies regarding the efficacy and effectiveness of MIPS between 2001 and 2009 was conducted. For this purpose, suitable key words from Medical Subject Headings on PubMed were used to extract the required studies. All studies are demonstrated and important results are concluded. Preliminary data from case cohorts and from many studies reveal that the microsurgical access flap, in terms of MIPS, has a high potential to seal the healing wound from the contaminated oral environment by achieving and maintaining primary closure. Soft tissues are mostly preserved and minimal gingival recession is observed, an important feature to meet the demands of the patient and the clinician in the esthetic zone. However, although the potential efficacy of MIPS in the treatment of deep intrabony defects has been proved, larger studies are required to confirm and extend the reported positive preliminary outcomes.

  19. Minimizing Promotion Trauma.

    ERIC Educational Resources Information Center

    Darling, LuAnn W.; McGrath, Loraine

    1983-01-01

    Nursing administrators can minimize promotion trauma and its unnecessary cost by building awareness of the transition process, clarifying roles and expectations, and attending to the promoted employee's needs. This article will help nursing administrators develop a concept of manager care combined with programs for orientation of new managers,…

  20. Periodic minimal surfaces

    NASA Astrophysics Data System (ADS)

    Mackay, Alan L.

    1985-04-01

A minimal surface is one for which, like a soap film with the same pressure on each side, the mean curvature is zero and, thus, is one where the two principal curvatures are equal and opposite at every point. For every closed circuit in the surface, the area is a minimum. Schwarz [1] and Neovius [2] showed that elements of such surfaces could be put together to give surfaces periodic in three dimensions. These periodic minimal surfaces are geometrical invariants, as are the regular polyhedra, but the former are curved. Minimal surfaces are appropriate for the description of various structures where internal surfaces are prominent and seek to adopt a minimum area or a zero mean curvature subject to their topology; thus they merit more complete numerical characterization. There seem to be at least 18 such surfaces [3], with various symmetries and topologies, related to the crystallographic space groups. Recently, glyceryl mono-oleate (GMO) was shown by Longley and McIntosh [4] to take the shape of the F-surface. The structure postulated is shown here to be in good agreement with an analysis of the fundamental geometry of periodic minimal surfaces.
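In symbols, the defining condition described above is that the mean curvature vanishes everywhere, i.e. the two principal curvatures cancel:

```latex
H \;=\; \frac{\kappa_1 + \kappa_2}{2} \;=\; 0
\qquad\Longleftrightarrow\qquad
\kappa_1 \;=\; -\,\kappa_2 \quad \text{at every point.}
```

A convenient concrete example is the Schwarz P-surface, whose shape is closely tracked by the nodal surface $\cos x + \cos y + \cos z = 0$ (a well-known trigonometric approximation, not the exact minimal surface).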

  1. Minimally invasive pancreatic surgery.

    PubMed

    Yiannakopoulou, E

    2015-12-01

Minimally invasive pancreatic surgery is feasible and safe. Laparoscopic distal pancreatectomy should be widely adopted for benign lesions of the pancreas. Laparoscopic pancreaticoduodenectomy, although technically demanding, has a number of advantages in the setting of pancreatic ductal adenocarcinoma, including shorter hospital stay and faster recovery, allowing patients to recover in a timelier manner and pursue adjuvant treatment options. Furthermore, progression-free survival seems longer in patients undergoing laparoscopic pancreaticoduodenectomy than in those undergoing open pancreaticoduodenectomy. Minimally invasive middle pancreatectomy seems appropriate for benign or borderline tumors of the neck of the pancreas. Technological advances, including intraoperative ultrasound and intraoperative fluorescence imaging systems, are expected to facilitate the wide adoption of minimally invasive pancreatic surgery. Although the oncological outcome seems similar to that of open surgery, concerns remain, as the majority of relevant evidence comes from retrospective studies. Large multicenter randomized studies comparing laparoscopic with open pancreatectomy, as well as robotic-assisted with both open and laparoscopic approaches, are needed. The robotic approach could possibly prove less invasive than the conventional laparoscopic approach through less traumatic intra-abdominal handling of tissues. In addition, the robotic approach could enable wide adoption of the technique by surgeons who are less trained in advanced laparoscopic surgery. A putative clinical benefit of minimally invasive pancreatic surgery could be an attenuated surgical stress response leading to reduced morbidity and mortality, as well as the lack of a detrimental immunosuppressive effect, especially for oncological patients. PMID:26530291

  2. Discrete Minimal Surface Algebras

    NASA Astrophysics Data System (ADS)

    Arnlind, Joakim; Hoppe, Jens

    2010-05-01

We consider discrete minimal surface algebras (DMSA) as generalized noncommutative analogues of minimal surfaces in higher dimensional spheres. These algebras appear naturally in membrane theory, where sequences of their representations are used as a regularization. After showing that the defining relations of the algebra are consistent, and that one can compute a basis of the enveloping algebra, we give several explicit examples of DMSAs in terms of subsets of sl(n) (any semi-simple Lie algebra providing a trivial example by itself). A special class of DMSAs are Yang-Mills algebras. The representation graph is introduced to study representations of DMSAs of dimension d ≤ 4, and properties of representations are related to properties of graphs. The representation graph of a tensor product is (generically) the Cartesian product of the corresponding graphs. We provide explicit examples of irreducible representations and, for coinciding eigenvalues, classify all the unitary representations of the corresponding algebras.

  3. Minimally invasive mediastinal surgery

    PubMed Central

    Melfi, Franca M. A.; Mussi, Alfredo

    2016-01-01

In the past, mediastinal surgery was associated with the necessity of a maximum exposure, which was accomplished through various approaches. In the early 1990s, many surgical fields, including thoracic surgery, observed the development of minimally invasive techniques. These included video-assisted thoracic surgery (VATS), which confers clear advantages over an open approach, such as less trauma, shorter hospital stay, better cosmetic results and preservation of lung function. However, VATS is associated with several disadvantages. For this reason, it is not routinely performed for resection of mediastinal mass lesions, especially those located in the anterior mediastinum, a tiny and remote space that contains vital structures at risk of injury. Robotic systems can overcome the limits of VATS, offering three-dimensional (3D) vision and wristed instrumentation, and are being increasingly used. With regards to thymectomy for myasthenia gravis (MG), unilateral and bilateral VATS approaches have demonstrated good long-term neurologic results with low complication rates. Nevertheless, some authors still advocate the necessity of maximum exposure, especially when considering the distribution of normal and ectopic thymic tissue. In recent studies, the robotic approach has been shown to provide similar neurological outcomes when compared to transsternal and VATS approaches, and is associated with a low morbidity. Importantly, through a unilateral robotic technique, it is possible to dissect and remove at least the same amount of mediastinal fat tissue. Preliminary results on early-stage thymomatous disease indicated that minimally invasive approaches are safe and feasible, with a low rate of pleural recurrence, underlining the necessity of a “no-touch” technique. However, especially for thymomatous disease characterized by an indolent nature, further studies with long follow-up period are necessary in order to assess oncologic and neurologic results through minimally…

  4. The ZOOM minimization package

    SciTech Connect

    Fischler, Mark S.; Sachs, D.; /Fermilab

    2004-11-01

    A new object-oriented Minimization package is available for distribution in the same manner as CLHEP. This package, designed for use in HEP applications, has all the capabilities of Minuit, but is a re-write from scratch, adhering to modern C++ design principles. A primary goal of this package is extensibility in several directions, so that its capabilities can be kept fresh with as little maintenance effort as possible. This package is distinguished by the priority that was assigned to C++ design issues, and the focus on producing an extensible system that will resist becoming obsolete.

  5. Minimally refined biomass fuel

    DOEpatents

    Pearson, Richard K.; Hirschfeld, Tomas B.

    1984-01-01

A minimally refined fluid composition, suitable as a fuel mixture and derived from biomass material, is comprised of one or more water-soluble carbohydrates such as sucrose, one or more alcohols having less than four carbons, and water. The carbohydrate provides the fuel source; water solubilizes the carbohydrates; and the alcohol aids in the combustion of the carbohydrate and reduces the viscosity of the carbohydrate/water solution. Because less energy is required to obtain the carbohydrate from the raw biomass than to obtain alcohol, an overall energy savings is realized compared to fuels employing alcohol as the primary fuel.

  6. Minimal E6 unification

    NASA Astrophysics Data System (ADS)

    Susič, Vasja

    2016-06-01

A realistic model in the class of renormalizable supersymmetric E6 Grand Unified Theories is constructed. Its matter sector consists of 3 × 27 representations, while the Higgs sector is $27 + \overline{27} + 351' + \overline{351'} + 78$. An analytic solution for a Standard Model vacuum is found and the Yukawa sector analyzed. It is argued that if one considers the increased predictability due to only two symmetric Yukawa matrices in this model, it can be considered a minimal SUSY E6 model with this type of matter sector. This contribution is based on Ref. [1].

  7. Logarithmic superconformal minimal models

    NASA Astrophysics Data System (ADS)

    Pearce, Paul A.; Rasmussen, Jørgen; Tartaglia, Elena

    2014-05-01

The higher fusion level logarithmic minimal models {\cal LM}(P,P';n) have recently been constructed as the diagonal GKO cosets $(A_1^{(1)})_k \oplus (A_1^{(1)})_n / (A_1^{(1)})_{k+n}$, where n ≥ 1 is an integer fusion level and k = nP/(P'-P) - 2 is a fractional level. For n = 1, these are the well-studied logarithmic minimal models {\cal LM}(P,P') ≡ {\cal LM}(P,P';1). For n ≥ 2, we argue that these critical theories are realized on the lattice by n × n fusion of the n = 1 models. We study the critical fused lattice models {\cal LM}(p,p')_{n×n} within a lattice approach and focus our study on the n = 2 models. We call these logarithmic superconformal minimal models {\cal LSM}(p,p') ≡ {\cal LM}(P,P';2), where P = |2p - p'|, P' = p' and p, p' are coprime. These models share the central charges $c = c^{P,P';2} = \frac{3}{2}\bigl(1 - 2(P'-P)^2/(PP')\bigr)$ of the rational superconformal minimal models {\cal SM}(P,P'). Lattice realizations of these theories are constructed by fusing 2 × 2 blocks of the elementary face operators of the n = 1 logarithmic minimal models {\cal LM}(p,p'). Algebraically, this entails the fused planar Temperley-Lieb algebra, which is a spin-1 Birman-Murakami-Wenzl tangle algebra with loop fugacity $\beta_2 = [x]_3 = x^2 + 1 + x^{-2}$ and twist $\omega = x^4$, where $x = e^{i\lambda}$ and $\lambda = (p'-p)\pi/p'$. The first two members of this n = 2 series are superconformal dense polymers {\cal LSM}(2,3) with c = -5/2, β₂ = 0, and superconformal percolation {\cal LSM}(3,4) with c = 0, β₂ = 1. We calculate the bulk and boundary free energies analytically. By numerically studying finite-size conformal spectra on the strip with appropriate boundary conditions, we argue that, in the continuum scaling limit, these lattice models are associated with the logarithmic superconformal models {\cal LM}(P,P';2). For system size N, we propose finitized Kac character formulae of the form $q^{-c^{P,P';2}/24 + \Delta^{P,P';2}_{r…
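As a quick consistency check, the two central charges quoted for the first series members follow directly from the formula stated in the abstract:

```latex
c^{P,P';2} \;=\; \frac{3}{2}\left(1 - \frac{2(P'-P)^2}{PP'}\right),
\qquad P = |2p - p'|,\ \ P' = p'.
\\[4pt]
\mathcal{LSM}(2,3):\ (p,p') = (2,3)\ \Rightarrow\ (P,P') = (1,3),\quad
c = \frac{3}{2}\left(1 - \frac{2\cdot 2^2}{1\cdot 3}\right) = -\frac{5}{2};
\\[4pt]
\mathcal{LSM}(3,4):\ (p,p') = (3,4)\ \Rightarrow\ (P,P') = (2,4),\quad
c = \frac{3}{2}\left(1 - \frac{2\cdot 2^2}{2\cdot 4}\right) = 0.
```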

  8. Minimizing fan energy costs

    SciTech Connect

    Monroe, R.C.

    1985-05-27

Minimizing fan energy costs and maximizing fan efficiency are the subject of this paper. Blade design itself can cause poor flow distribution and inefficiency. A basic design criterion is that a blade should produce uniform flow over the entire plane of the fan. An inherent problem with the axial fan is swirl, the tangential deflection of exit flow caused by torque. Swirl can be prevented with an inexpensive hub component. Basic efficiency can be checked by means of the fan's performance curve. Generally, fewer blades translate into higher axial-fan efficiency. A crowded inboard area creates hub turbulence, which lessens efficiency. Whether the pitch of the fan blades is fixed or variable also affects energy consumption. Power savings of 50% per year or more can be realized by replacing fixed-pitch, continuously operating fans with fans whose blade pitch or speed is automatically varied.
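The scale of the quoted savings is consistent with the fan affinity laws, under which shaft power varies with the cube of speed. The duty-cycle fractions below are assumptions chosen for illustration, not data from the paper.

```python
# Fan affinity laws: flow ~ speed, pressure ~ speed^2, power ~ speed^3.
# Assumed duty cycle: required airflow fraction -> fraction of operating time.
# A fixed-pitch fan runs at full power regardless of demand; a
# variable-speed (or variable-pitch) fan tracks the demand.
duty = {1.0: 0.2, 0.8: 0.5, 0.6: 0.3}

fixed_energy = sum(t * 1.0 for t in duty.values())            # always full power
variable_energy = sum(t * q ** 3 for q, t in duty.items())    # cube-law power
print(f"variable-speed energy use: {variable_energy / fixed_energy:.0%} of fixed")
```

With this assumed profile the variable fan uses roughly half the energy of the fixed one, in line with the 50% savings figure cited in the abstract.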

  9. Transanal Minimally Invasive Surgery

    PubMed Central

    deBeche-Adams, Teresa; Nassif, George

    2015-01-01

Transanal minimally invasive surgery (TAMIS) was first described in 2010 as a crossover between single-incision laparoscopic surgery and transanal endoscopic microsurgery (TEM) to allow access to the proximal and mid-rectum for resection of benign and early-stage malignant rectal lesions. The TAMIS technique can also be used for noncurative-intent surgery of more advanced lesions in patients who are not candidates for radical surgery. Proper workup and staging should be done before surgical decision-making. In addition to the TAMIS port, instrumentation and setup include readily available equipment found in most operating suites. TAMIS has proven its usefulness in a wide range of applications outside of local excision, including repair of rectourethral fistula, removal of rectal foreign body, control of rectal hemorrhage, and as an adjunct in total mesorectal excision for rectal cancer. TAMIS is an easily accessible, technically feasible, and cost-effective alternative to TEM. PMID:26491410

  10. [Minimal invasive implantology].

    PubMed

    Bruck, N; Zagury, A; Nahlieli, O

    2015-07-01

Endoscopic surgery has changed the philosophy and practice of modern surgery in all fields of medicine. It gave rise to minimally invasive surgical procedures based on the ability to visualize and operate via small channels. In maxillofacial surgery, the ability to see the surgical field clearly opened an entirely new world of exploration, as conditions that were once almost impossible to control, and whose outcome was uncertain, can now be predictably managed. In this article we describe the advantages of using the oral endoscope during dental implantology procedures, and we describe a unique implant which, in combination with the oral endoscope, enables a maxillary sinus lift without the need for major surgery with all of its risks and complications.

  11. [Minimally invasive breast surgery].

    PubMed

    Mátrai, Zoltán; Gulyás, Gusztáv; Kunos, Csaba; Sávolt, Akos; Farkas, Emil; Szollár, András; Kásler, Miklós

    2014-02-01

Due to developments in medical science and industrial technology, minimally invasive procedures have appeared in the surgery of benign and malignant breast diseases. In general, such interventions result in significantly reduced breast and chest wall scars, shorter hospitalization and less pain, but they require specific, expensive devices and longer surgical time compared to open surgery. Furthermore, indications and oncological safety have not yet been established. It is quite likely that minimally invasive surgical procedures with high-tech devices - similar to other surgical subspecialties - will gradually become popular and may even form part of routine breast surgery. Vacuum-assisted core biopsy with a therapeutic indication is suitable for the removal of benign fibroadenomas, leaving behind an almost invisible scar, while endoscopically assisted skin-sparing and nipple-sparing mastectomy, axillary staging and reconstruction with latissimus dorsi muscle flap are all feasible through the same short axillary incision. Endoscopic techniques are also suitable for the diagnosis and treatment of intracapsular complications of implant-based breast reconstructions (intracapsular fluid, implant rupture, capsular contracture) and for the biopsy of intracapsular lesions with uncertain pathology. Perception of the role of radiofrequency ablation of breast tumors requires further hands-on experience, but due to developments in functional imaging and anticancer drugs it is likely to replace surgical removal of a portion of primary tumors in the future. With the reduction of the price of ductoscopes, routine examination of the ductal branch system, guided microdochectomy and targeted surgical removal of terminal ducto-lobular units or a "sick lobe" as an anatomical unit may become feasible. The paper presents the experience of the authors and provides a literature review, for the first time in the Hungarian language on this subject. Orv. Hetil.

  12. Minimally invasive parathyroid surgery

    PubMed Central

    Noureldine, Salem I.; Gooi, Zhen

    2015-01-01

    Traditionally, bilateral cervical exploration for localization of all four parathyroid glands and removal of any that are grossly enlarged has been the standard surgical treatment for primary hyperparathyroidism (PHPT). With the advances in preoperative localization studies and greater public demand for less invasive procedures, novel targeted, minimally invasive techniques to the parathyroid glands have been described and practiced over the past 2 decades. Minimally invasive parathyroidectomy (MIP) can be done either through the standard Kocher incision, a smaller midline incision, with video assistance (purely endoscopic and video-assisted techniques), or through an ectopically placed, extracervical, incision. In current practice, once PHPT is diagnosed, preoperative evaluation using high-resolution radiographic imaging to localize the offending parathyroid gland is essential if MIP is to be considered. The imaging study results suggest where the surgeon should begin the focused procedure and serve as a road map to allow tailoring of an efficient, imaging-guided dissection while eliminating the unnecessary dissection of multiple glands or a bilateral exploration. Intraoperative parathyroid hormone (IOPTH) levels may be measured during the procedure, or a gamma probe used during radioguided parathyroidectomy, to ascertain that the correct gland has been excised and that no other hyperfunctional tissue is present. MIP has many advantages over the traditional bilateral, four-gland exploration. MIP can be performed using local anesthesia, requires less operative time, results in fewer complications, and offers an improved cosmetic result and greater patient satisfaction. Additional advantages of MIP are earlier hospital discharge and decreased overall associated costs. This article aims to address the considerations for accomplishing MIP, including the role of preoperative imaging studies, intraoperative adjuncts, and surgical techniques. PMID:26425454

  13. A perturbation technique for shield weight minimization

    SciTech Connect

    Watkins, E.F.; Greenspan, E.

    1993-01-01

    The radiation shield optimization code SWAN (Ref. 1) was originally developed for minimizing the thickness of a shield that will meet a given dose (or other) constraint, or for extremizing a performance parameter of interest (e.g., maximizing energy multiplication or minimizing dose) while maintaining the shield volume constraint. The SWAN optimization process proved to be highly effective (e.g., see Refs. 2, 3, and 4). The purpose of this work is to investigate the applicability of the SWAN methodology to problems in which the weight rather than the volume is the relevant shield characteristic. Such problems are encountered in shield design for space nuclear power systems. The investigation is carried out using SWAN with the coupled neutron-photon cross-section library FLUNG (Ref. 5).

  14. A minimal lentivirus Tat.

    PubMed Central

    Derse, D; Carvalho, M; Carroll, R; Peterlin, B M

    1991-01-01

    Transcriptional regulatory mechanisms found in lentiviruses employ RNA enhancer elements called trans-activation responsive (TAR) elements. These nascent RNA stem-loops are cis-acting targets of virally encoded Tat effectors. Interactions between Tat and TAR increase the processivity of transcription complexes and lead to efficient copying of viral genomes. To study essential elements of this trans activation, peptide motifs from Tats of two distantly related lentiviruses, equine infectious anemia virus (EIAV) and human immunodeficiency virus type 1 (HIV-1), were fused to the coat protein of bacteriophage R17 and tested on the long terminal repeat of EIAV, where TAR was replaced by the R17 operator, the target of the coat protein. This independent RNA-tethering mechanism mapped activation domains of Tats from HIV-1 and EIAV to 47 and 15 amino acids and RNA-binding domains to 10 and 26 amino acids, respectively. Thus, a minimal lentivirus Tat consists of 25 amino acids, of which 15 modify viral transcription and 10 bind to the target RNA stem-loop. Images PMID:1658392

  15. USGS Methodology for Assessing Continuous Petroleum Resources

    USGS Publications Warehouse

    Charpentier, Ronald R.; Cook, Troy A.

    2011-01-01

    The U.S. Geological Survey (USGS) has developed a new quantitative methodology for assessing resources in continuous (unconventional) petroleum deposits. Continuous petroleum resources include shale gas, coalbed gas, and other oil and gas deposits in low-permeability ("tight") reservoirs. The methodology is based on an approach combining geologic understanding with well productivities. The methodology is probabilistic, with both input and output variables as probability distributions, and uses Monte Carlo simulation to calculate the estimates. The new methodology is an improvement of previous USGS methodologies in that it better accommodates the uncertainties in undrilled or minimally drilled deposits that must be assessed using analogs. The publication is a collection of PowerPoint slides with accompanying comments.
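    The probabilistic, Monte Carlo character of such an assessment can be illustrated with a minimal sketch. The input distributions, variable names, and numbers below are purely illustrative assumptions, not values from the USGS methodology; the point is only that uncertain inputs propagate to a resource estimate reported as probability fractiles.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000  # Monte Carlo trials

    # Hypothetical input distributions for a minimally drilled deposit
    # (illustrative only; a real assessment would derive these from analogs).
    productive_area = rng.triangular(500, 2000, 5000, n)            # acres
    wells_per_acre = 1.0 / rng.triangular(80, 120, 160, n)          # ~80-160 acres drained per well
    eur_per_well = rng.lognormal(mean=np.log(0.5), sigma=0.6, size=n)  # EUR per well, BCF gas

    # Each trial combines one draw from every input distribution.
    total_resource = productive_area * wells_per_acre * eur_per_well   # BCF

    # Output is itself a distribution, summarized as fractiles.
    f95, f50, f5 = np.percentile(total_resource, [5, 50, 95])
    print(f"F95={f95:.0f}  F50={f50:.0f}  F5={f5:.0f} BCF")
    ```

    Reporting F95/F50/F5 fractiles rather than a single number is what distinguishes a probabilistic assessment of this kind from a deterministic one.
    
    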

  16. Payload training methodology study

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The results of the Payload Training Methodology Study (PTMS) are documented. Methods and procedures are defined for the development of payload training programs to be conducted at the Marshall Space Flight Center Payload Training Complex (PCT) for the Space Station Freedom program. The study outlines the overall training program concept as well as the six methodologies associated with the program implementation. The program concept outlines the entire payload training program from initial identification of training requirements to the development of detailed design specifications for simulators and instructional material. The following six methodologies are defined: (1) The Training and Simulation Needs Assessment Methodology; (2) The Simulation Approach Methodology; (3) The Simulation Definition Analysis Methodology; (4) The Simulator Requirements Standardization Methodology; (5) The Simulator Development Verification Methodology; and (6) The Simulator Validation Methodology.

  17. Minimizing Launch Mass for ISRU Processes

    NASA Technical Reports Server (NTRS)

    England, C.; Hallinan, K. P.

    2004-01-01

    The University of Dayton and the Jet Propulsion Laboratory are developing a methodology for estimating the Earth launch mass (ELM) of processes for In-Situ Resource Utilization (ISRU), with a focus on lunar resource recovery. ISRU may be enabling for an extended presence on the Moon, for large sample return missions, and for a human presence on Mars. To accomplish these exploration goals, the resources recovered by ISRU must offset the ELM of the recovery process. An appropriate figure of merit is the cost of the exploration mission, which is closely related to ELM. For a given production rate and resource concentration, the lowest ELM - and the best ISRU process - is achieved by minimizing capital equipment for both the ISRU process and energy production. ISRU processes incur Carnot limitations and second-law losses (irreversibilities) that ultimately determine production rate, material utilization and energy efficiencies. Heat transfer, chemical reaction, and mechanical operations affect the ELM in ways that are best understood by examining the process's detailed energetics. Schemes for chemical and thermal processing that do not incorporate an understanding of second-law losses will be incompletely understood. Our team is developing a methodology that will aid the design and selection of ISRU processes by identifying the impact of thermodynamic losses on ELM. The methodology includes mechanical, thermal and chemical operations and, when completed, will provide a procedure and rationale for optimizing their design and minimizing their cost. The technique for optimizing ISRU with respect to ELM draws from the work of England and Funk that relates the cost of endothermic processes to their second-law efficiencies. Our team joins their approach for recovering resources by chemical processing with analysis of thermal and mechanical operations in space. Commercial firms provide cost inputs for ELM and planetary landing. Additional information is included in the

  18. Microbiological Methodology in Astrobiology

    NASA Technical Reports Server (NTRS)

    Abyzov, S. S.; Gerasimenko, L. M.; Hoover, R. B.; Mitskevich, I. N.; Mulyukin, A. L.; Poglazova, M. N.; Rozanov, A. Y.

    2005-01-01

    Searching for life in astromaterials to be delivered from future missions to extraterrestrial bodies is undoubtedly related to studies of the properties and signatures of living microbial cells and microfossils on Earth. Antarctic glaciers and Earth permafrost habitats, where living microbial cells have preserved viability for millennia by entering an anabiotic state, are often regarded as model terrestrial analogs of Martian polar subsurface layers. For future findings of viable microorganisms in samples from extraterrestrial objects, it is important to use a combined methodology that includes classical microbiological methods, plating onto nutrient media, direct epifluorescence and electron microscopy examinations, detection of the elemental composition of cells, radiolabeling techniques, and PCR and FISH methods. It is of great importance to ensure the authenticity of microorganisms (if any) in studied samples and to standardize the protocols used to minimize the risk of external contamination. Although convincing evidence of extraterrestrial microbial life may come from the discovery of living cells in astromaterials, biomorphs and microfossils must also be regarded as targets in the search for evidence of life, bearing in mind the scenario that living microorganisms were not preserved and underwent mineralization. Under laboratory conditions, the processes that accompany fossilization of cyanobacteria were reconstructed, and artificially produced cyanobacterial stromatolites resemble, in their morphological properties, those found in natural Earth habitats. Regarding the vital importance of distinguishing between biogenic and abiogenic signatures and between living and fossil microorganisms in analyzed samples, it is worthwhile to use previously developed approaches based on electron microscopy examinations and analysis of the elemental composition of biomorphs in situ and comparison with the analogous data obtained for laboratory microbial cultures and

  19. Language Policy and Methodology

    ERIC Educational Resources Information Center

    Liddicoat, Antony J.

    2004-01-01

    The implementation of a language policy is crucially associated with questions of methodology. This paper explores approaches to language policy, approaches to methodology and the impact that these have on language teaching practice. Language policies can influence decisions about teaching methodologies either directly, by making explicit…

  20. Guidelines for mixed waste minimization

    SciTech Connect

    Owens, C.

    1992-02-01

    Currently, there is no commercial mixed waste disposal available in the United States. Storage and treatment for commercial mixed waste is limited. Host States and compacts region officials are encouraging their mixed waste generators to minimize their mixed wastes because of management limitations. This document provides a guide to mixed waste minimization.

  1. Influenza SIRS with Minimal Pneumonitis

    PubMed Central

    Erramilli, Shruti; Mannam, Praveen; Manthous, Constantine A.

    2016-01-01

    Although systemic inflammatory response syndrome (SIRS) is a known complication of severe influenza pneumonia, it has been reported very rarely in patients with minimal parenchymal lung disease. We here report a case of severe SIRS, anasarca, and marked vascular phenomena with minimal or no pneumonitis. This case highlights that viruses, including influenza, may cause vascular dysregulation causing SIRS, even without substantial visceral organ involvement.

  2. Waste minimization handbook, Volume 1

    SciTech Connect

    Boing, L.E.; Coffey, M.J.

    1995-12-01

    This technical guide presents various methods used by industry to minimize low-level radioactive waste (LLW) generated during decommissioning and decontamination (D and D) activities. Such activities generate significant amounts of LLW during their operations. Waste minimization refers to any measure, procedure, or technique that reduces the amount of waste generated during a specific operation or project. Preventive waste minimization techniques implemented when a project is initiated can significantly reduce waste. Techniques implemented during decontamination activities reduce the cost of decommissioning. The application of waste minimization techniques is not limited to D and D activities; it is also useful during any phase of a facility's life cycle. This compendium will be supplemented with a second volume of abstracts of hundreds of papers related to minimizing low-level nuclear waste. This second volume is expected to be released in late 1996.

  3. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  4. Influenza SIRS with Minimal Pneumonitis

    PubMed Central

    Erramilli, Shruti; Mannam, Praveen; Manthous, Constantine A.

    2016-01-01

    Although systemic inflammatory response syndrome (SIRS) is a known complication of severe influenza pneumonia, it has been reported very rarely in patients with minimal parenchymal lung disease. We here report a case of severe SIRS, anasarca, and marked vascular phenomena with minimal or no pneumonitis. This case highlights that viruses, including influenza, may cause vascular dysregulation causing SIRS, even without substantial visceral organ involvement. PMID:27630988

  5. Influenza SIRS with Minimal Pneumonitis.

    PubMed

    Erramilli, Shruti; Mannam, Praveen; Manthous, Constantine A

    2016-01-01

    Although systemic inflammatory response syndrome (SIRS) is a known complication of severe influenza pneumonia, it has been reported very rarely in patients with minimal parenchymal lung disease. We here report a case of severe SIRS, anasarca, and marked vascular phenomena with minimal or no pneumonitis. This case highlights that viruses, including influenza, may cause vascular dysregulation causing SIRS, even without substantial visceral organ involvement. PMID:27630988

  6. Minimal but non-minimal inflation and electroweak symmetry breaking

    NASA Astrophysics Data System (ADS)

    Marzola, Luca; Racioppi, Antonio

    2016-10-01

    We consider the most minimal scale-invariant extension of the standard model that allows for successful radiative electroweak symmetry breaking and inflation. The framework involves an extra scalar singlet, which plays the role of the inflaton, and is compatible with current experimental bounds owing to the non-minimal coupling of the latter to gravity. This inflationary scenario predicts a very low tensor-to-scalar ratio r ≈ 10^-3, typical of Higgs-inflation models, but in contrast yields a scalar spectral index n_s ≈ 0.97, which departs from the Starobinsky limit. We briefly discuss the collider phenomenology of the framework.

  7. The Methodology of Magpies

    ERIC Educational Resources Information Center

    Carter, Susan

    2014-01-01

    Arts/Humanities researchers frequently do not explain methodology overtly; instead, they "perform" it through their use of language, textual and historic cross-reference, and theory. Here, methodologies from literary studies are shown to add to Higher Education (HE) an exegetical and critically pluralist approach. This includes…

  8. Menopause and Methodological Doubt

    ERIC Educational Resources Information Center

    Spence, Sheila

    2005-01-01

    Menopause and methodological doubt begins by making a tongue-in-cheek comparison between Descartes' methodological doubt and the self-doubt that can arise around menopause. A hermeneutic approach is taken in which Cartesian dualism and its implications for the way women are viewed in society are examined, both through the experiences of women…

  9. Data Centric Development Methodology

    ERIC Educational Resources Information Center

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have been mostly adopting a software methodology, such as a waterfall or Rational Unified Process, as the framework for its development. These methodologies could work on structural, procedural, or object oriented based applications, but fails to capture…

  10. Rovers minimize human disturbance in research on wild animals.

    PubMed

    Le Maho, Yvon; Whittington, Jason D; Hanuise, Nicolas; Pereira, Louise; Boureau, Matthieu; Brucker, Mathieu; Chatelain, Nicolas; Courtecuisse, Julien; Crenner, Francis; Friess, Benjamin; Grosbellet, Edith; Kernaléguen, Laëtitia; Olivier, Frédérique; Saraux, Claire; Vetter, Nathanaël; Viblanc, Vincent A; Thierry, Bernard; Tremblay, Pascale; Groscolas, René; Le Bohec, Céline

    2014-12-01

    Investigating wild animals while minimizing human disturbance remains an important methodological challenge. When approached by a remote-operated vehicle (rover) which can be equipped to make radio-frequency identifications, wild penguins had significantly lower and shorter stress responses (determined by heart rate and behavior) than when approached by humans. Upon immobilization, the rover-unlike humans-did not disorganize colony structure, and stress rapidly ceased. Thus, rovers can reduce human disturbance of wild animals and the resulting scientific bias.

  11. LLNL Waste Minimization Program Plan

    SciTech Connect

    Not Available

    1990-02-14

    This document is the February 14, 1990 version of the LLNL Waste Minimization Program Plan (WMPP). The waste minimization policy field has undergone continuous changes since its formal inception in the 1984 HSWA legislation. The first LLNL WMPP, Revision A, is dated March 1985. A series of informal revisions were made on approximately a semi-annual basis. This Revision 2 is the third formal issuance of the WMPP document. EPA has issued a proposed new policy statement on source reduction and recycling. This policy reflects a preventative strategy to reduce or eliminate the generation of environmentally harmful pollutants which may be released to the air, land surface, water, or ground water. In accordance with this new policy, new guidance to hazardous waste generators on the elements of a waste minimization program was issued. In response to these policies, DOE has revised and issued implementation guidance for DOE Order 5400.1, Waste Minimization Plan and Waste Reduction Reporting of DOE Hazardous, Radioactive, and Radioactive Mixed Wastes, final draft January 1990. This WMPP is formatted to meet the current DOE guidance outlines. The current WMPP will be revised to reflect all of these proposed changes when guidelines are established. Updates, changes, and revisions to the overall LLNL WMPP will be made as appropriate to reflect ever-changing regulatory requirements. 3 figs., 4 tabs.

  12. Minimally invasive aortic valve surgery

    PubMed Central

    Castrovinci, Sebastiano; Emmanuel, Sam; Moscarelli, Marco; Murana, Giacomo; Caccamo, Giuseppa; Bertolino, Emanuela Clara; Nasso, Giuseppe; Speziale, Giuseppe; Fattouch, Khalil

    2016-01-01

    Aortic valve disease is a prevalent disorder that affects approximately 2% of the general adult population. Surgical aortic valve replacement is the gold standard treatment for symptomatic patients. This treatment has demonstrably proven to be both safe and effective. Over the last few decades, in an attempt to reduce surgical trauma, different minimally invasive approaches for aortic valve replacement have been developed and are now being increasingly utilized. A narrative review of the literature was carried out to describe the surgical techniques for minimally invasive aortic valve surgery and report the results from different experienced centers. Minimally invasive aortic valve replacement is associated with low perioperative morbidity, mortality and a low conversion rate to full sternotomy. Long-term survival appears to be at least comparable to that reported for conventional full sternotomy. Minimally invasive aortic valve surgery, either with a partial upper sternotomy or a right anterior minithoracotomy provides early- and long-term benefits. Given these benefits, it may be considered the standard of care for isolated aortic valve disease. PMID:27582764

  13. Minimally invasive aortic valve surgery.

    PubMed

    Castrovinci, Sebastiano; Emmanuel, Sam; Moscarelli, Marco; Murana, Giacomo; Caccamo, Giuseppa; Bertolino, Emanuela Clara; Nasso, Giuseppe; Speziale, Giuseppe; Fattouch, Khalil

    2016-09-01

    Aortic valve disease is a prevalent disorder that affects approximately 2% of the general adult population. Surgical aortic valve replacement is the gold standard treatment for symptomatic patients. This treatment has demonstrably proven to be both safe and effective. Over the last few decades, in an attempt to reduce surgical trauma, different minimally invasive approaches for aortic valve replacement have been developed and are now being increasingly utilized. A narrative review of the literature was carried out to describe the surgical techniques for minimally invasive aortic valve surgery and report the results from different experienced centers. Minimally invasive aortic valve replacement is associated with low perioperative morbidity, mortality and a low conversion rate to full sternotomy. Long-term survival appears to be at least comparable to that reported for conventional full sternotomy. Minimally invasive aortic valve surgery, either with a partial upper sternotomy or a right anterior minithoracotomy provides early- and long-term benefits. Given these benefits, it may be considered the standard of care for isolated aortic valve disease. PMID:27582764

  14. What is minimally invasive dentistry?

    PubMed

    Ericson, Dan

    2004-01-01

    Minimally Invasive Dentistry is the application of "a systematic respect for the original tissue." This implies that the dental profession recognizes that an artifact is of less biological value than the original healthy tissue. Minimally invasive dentistry is a concept that can embrace all aspects of the profession. The common delineator is tissue preservation, preferably by preventing disease from occurring and intercepting its progress, but also removing and replacing with as little tissue loss as possible. It does not suggest that we make small fillings to restore incipient lesions or surgically remove impacted third molars without symptoms as routine procedures. The introduction of predictable adhesive technologies has led to a giant leap in interest in minimally invasive dentistry. The concept bridges the traditional gap between prevention and surgical procedures, which is just what dentistry needs today. The evidence-base for survival of restorations clearly indicates that restoring teeth is a temporary palliative measure that is doomed to fail if the disease that caused the condition is not addressed properly. Today, the means, motives and opportunities for minimally invasive dentistry are at hand, but incentives are definitely lacking. Patients and third parties seem to be convinced that the only things that count are replacements. Namely, they are prepared to pay for a filling but not for a procedure that can help avoid having one.

  15. A Defense of Semantic Minimalism

    ERIC Educational Resources Information Center

    Kim, Su

    2012-01-01

    Semantic Minimalism is a position about the semantic content of declarative sentences, i.e., the content that is determined entirely by syntax. It is defined by the following two points: "Point 1": The semantic content is a complete/truth-conditional proposition. "Point 2": The semantic content is useful to a theory of…

  16. Assembly of a minimal protocell

    NASA Astrophysics Data System (ADS)

    Rasmussen, Steen

    2007-03-01

    What is minimal life, how can we make it, and how can it be useful? We present experimental and computational results towards bridging nonliving and living matter, which results in life that is different and much simpler than contemporary life. A simple yet tightly coupled catalytic cooperation between genes, metabolism, and container forms the design underpinnings of our protocell, which is a minimal self-replicating molecular machine. Experimentally, we have recently demonstrated this coupling by having an informational molecule (8-oxoguanine) catalytically control the light driven metabolic (Ru-bpy based) production of container materials (fatty acids). This is a significant milestone towards assembling a minimal self-replicating molecular machine. Recent theoretical investigations indicate that coordinated exponential component growth should naturally emerge as a result from such a catalytic coupling between the main protocellular components. A 3-D dissipative particle simulation (DPD) study of the full protocell life-cycle exposes a number of anticipated systemic issues associated with the remaining experimental challenges for the implementation of the minimal protocell. Finally we outline how more general self-replicating materials could be useful.

  17. GPS system simulation methodology

    NASA Technical Reports Server (NTRS)

    Ewing, Thomas F.

    1993-01-01

    The following topics are presented: background; Global Positioning System (GPS) methodology overview; the graphical user interface (GUI); current models; application to space nuclear power/propulsion; and interfacing requirements. The discussion is presented in vugraph form.

  18. Minimally invasive surgical approach to pancreatic malignancies.

    PubMed

    Bencini, Lapo; Annecchiarico, Mario; Farsi, Marco; Bartolini, Ilenia; Mirasolo, Vita; Guerra, Francesco; Coratti, Andrea

    2015-12-15

    Pancreatic surgery for malignancy is recognized as challenging for surgeons and risky for patients due to consistent perioperative morbidity and mortality. Furthermore, the oncological long-term results are largely disappointing, even for those patients who experience an uneventful hospital stay. Nevertheless, surgery still remains the cornerstone of a multidisciplinary treatment for pancreatic cancer. In order to maximize the benefits of surgery, the advent of both laparoscopy and robotics has led many surgeons to treat pancreatic cancers with these new methodologies. The reduction of postoperative complications, length of hospital stay and pain, together with a shorter interval between surgery and the beginning of adjuvant chemotherapy, represent the potential advantages over conventional surgery. Lastly, a better cosmetic result, although not crucial in any cancerous patient, could also play a role by improving overall well-being and patient self-perception. The laparoscopic approach to pancreatic surgery is, however, difficult in inexperienced hands and requires dedicated training in both advanced laparoscopy and pancreatic surgery. The recent large diffusion of the da Vinci(®) robotic platform seems to facilitate many of the technical maneuvers, such as anastomotic biliary and pancreatic reconstructions, accurate lymphadenectomy, and vascular sutures. The two main pancreatic operations, distal pancreatectomy and pancreaticoduodenectomy, are approachable by a minimally invasive path, but more limited interventions such as enucleation are also feasible. Nevertheless, a word of caution is warranted regarding the increasing costs of these newest technologies, the main concerns being the maintenance of all oncological standards and the lack of long-term follow-up. The purpose of this review is to examine the evidence for the use of minimally invasive surgery in pancreatic cancer (and less aggressive tumors

  19. Anaesthesia for minimally invasive surgery

    PubMed Central

    Dec, Marta

    2015-01-01

    Minimally invasive surgery (MIS) is rising in popularity. It offers well-known benefits to the patient. However, restricted access to the surgical site and gas insufflation into the body cavities may result in severe complications. From the anaesthetic point of view MIS poses unique challenges associated with creation of pneumoperitoneum, carbon dioxide absorption, specific positioning and monitoring a patient to whom the anaesthetist has often restricted access, in a poorly lit environment. Moreover, with refinement of surgical procedures and growing experience the anaesthetist is presented with patients from high-risk groups (obese, elderly, with advanced cardiac and respiratory disease) who once were deemed unsuitable for the laparoscopic technique. Anaesthetic management is aimed at getting the patient safely through the procedure, minimizing the specific risks arising from laparoscopy and the patient's coexisting medical problems, ensuring quick recovery and a relatively pain-free postoperative course with early return to normal function. PMID:26865885

  20. Minimal universal quantum heat machine.

    PubMed

    Gelbwaser-Klimovsky, D; Alicki, R; Kurizki, G

    2013-01-01

    In traditional thermodynamics the Carnot cycle yields the ideal performance bound of heat engines and refrigerators. We propose and analyze a minimal model of a heat machine that can play a similar role in quantum regimes. The minimal model consists of a single two-level system with periodically modulated energy splitting that is permanently, weakly, coupled to two spectrally separated heat baths at different temperatures. The equation of motion allows us to compute the stationary power and heat currents in the machine consistent with the second law of thermodynamics. This dual-purpose machine can act as either an engine or a refrigerator (heat pump) depending on the modulation rate. In both modes of operation, the maximal Carnot efficiency is reached at zero power. We study the conditions for finite-time optimal performance for several variants of the model. Possible realizations of the model are discussed.
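    As a small numeric companion to the Carnot bound discussed above, the sketch below computes the textbook Carnot limits for the two modes of a two-bath machine (engine efficiency and refrigerator coefficient of performance). The bath temperatures are arbitrary illustrative values, and the formulas are standard thermodynamics, not the paper's two-level model itself.

    ```python
    # Textbook Carnot bounds for a machine coupled to a cold bath (Tc) and a
    # hot bath (Th). These are the zero-power limits a dual-purpose machine
    # can approach in either mode of operation.

    def carnot_engine_efficiency(t_cold: float, t_hot: float) -> float:
        """Maximal work extracted per unit heat drawn from the hot bath."""
        return 1.0 - t_cold / t_hot

    def carnot_cop_refrigerator(t_cold: float, t_hot: float) -> float:
        """Maximal heat pumped from the cold bath per unit work invested."""
        return t_cold / (t_hot - t_cold)

    # Illustrative bath temperatures (kelvin, arbitrary choice):
    tc, th = 300.0, 600.0
    print("engine efficiency bound:", carnot_engine_efficiency(tc, th))  # 0.5
    print("refrigerator COP bound:", carnot_cop_refrigerator(tc, th))    # 1.0
    ```

    The trade-off noted in the abstract is that these bounds are saturated only at zero power; any finite-rate operation of the machine falls below them.
    
    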

  1. Principle of minimal work fluctuations

    NASA Astrophysics Data System (ADS)

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality 〈e-βW〉=e-βΔF, a change in the fluctuations of e-βW may impact how rapidly the statistical average of e-βW converges towards the theoretical value e-βΔF, where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e-βW. In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e-βW, where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain where the classical work protocol is realizable by an adiabatic process, the classical adiabatic process also yields the minimal fluctuations in e-βW. Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014), 10.1103/PhysRevE.90.052132].
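    Why fluctuations in e-βW govern convergence of the Jarzynski average can be seen in a small Monte Carlo sketch. It assumes Gaussian work distributions, for which 〈e-βW〉 has the closed form exp(-βμ + β²σ²/2), so ΔF = μ - βσ²/2; this Gaussian test case is a standard closed-form check chosen for illustration, not a model taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    beta = 1.0

    def jarzynski_estimate(mu, sigma, n):
        """Monte Carlo estimate of dF = -(1/beta) * log <exp(-beta*W)>
        for Gaussian work samples W ~ N(mu, sigma^2)."""
        w = rng.normal(mu, sigma, n)
        return -np.log(np.mean(np.exp(-beta * w))) / beta

    mu = 2.0
    n = 200_000
    for sigma in (0.5, 2.0):
        # Exact free-energy difference for the Gaussian case (textbook result).
        dF_exact = mu - beta * sigma**2 / 2
        dF_mc = jarzynski_estimate(mu, sigma, n)
        print(f"sigma={sigma}: exact dF={dF_exact:.3f}, estimate={dF_mc:.3f}")
    ```

    With the same sample budget, the small-σ estimate sits essentially on top of the exact ΔF, while the large-σ estimate scatters: larger fluctuations in e-βW mean slower convergence of the Jarzynski average, which is what motivates seeking protocols with minimal work fluctuations.
    
    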

  2. Principle of minimal work fluctuations.

    PubMed

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality 〈e-βW〉=e-βΔF, a change in the fluctuations of e-βW may impact how rapidly the statistical average of e-βW converges towards the theoretical value e-βΔF, where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e-βW. In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e-βW, where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain where the classical work protocol is realizable by an adiabatic process, then the classical adiabatic process also yields the minimal fluctuations in e-βW. Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014)].
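The convergence behaviour described in this abstract can be checked numerically for the special case of a Gaussian work distribution, for which the Jarzynski equality gives ΔF = μ_W − βσ²/2 exactly. The sketch below is illustrative only; the parameter values are assumptions, not taken from the paper:

```python
import numpy as np

# Illustrative check of the Jarzynski equality <e^{-beta W}> = e^{-beta dF}
# for Gaussian work statistics, where dF = mu_W - beta*sigma**2/2 holds exactly.
rng = np.random.default_rng(0)
beta, mu_W, sigma = 1.0, 1.0, 0.5        # hypothetical protocol parameters

W = rng.normal(mu_W, sigma, size=200_000)  # sampled work values
estimate = np.exp(-beta * W).mean()        # statistical average of e^{-beta W}
exact = np.exp(-beta * (mu_W - beta * sigma**2 / 2))

print(estimate, exact)
```

Increasing sigma widens the fluctuations of e-βW and visibly slows the convergence of the estimate towards the exact value, which is exactly why suppressing those fluctuations matters.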

  3. Minimally invasive surgery. Future developments.

    PubMed

    Wickham, J E

    1994-01-15

The rapid development of minimally invasive surgery means that there will be fundamental changes in interventional treatment. Technological advances will allow new minimally invasive procedures to be developed. Application of robotics will allow some procedures to be done automatically, and coupling of slave robotic instruments with virtual reality images will allow surgeons to perform operations by remote control. Miniature motors and instruments designed by microengineering could be introduced into body cavities to perform operations that are currently impossible. New materials will allow changes in instrument construction, such as use of memory metals to make heat-activated scissors or forceps. With the reduced trauma associated with minimally invasive surgery, fewer operations will require long hospital stays. Traditional surgical wards will become largely redundant, and hospitals will need to cope with increased throughput of patients. Operating theatres will have to be equipped with complex high-technology equipment, and hospital staff will need to be trained to manage it. Conventional nursing care will be carried out more in the community. Many traditional specialties will be merged, and surgical training will need fundamental revision to ensure that surgeons are competent to carry out the new procedures. PMID:8312776

  4. Minimal absent words in four human genome assemblies.

    PubMed

    Garcia, Sara P; Pinho, Armando J

    2011-01-01

    Minimal absent words have been computed in genomes of organisms from all domains of life. Here, we aim to contribute to the catalogue of human genomic variation by investigating the variation in number and content of minimal absent words within a species, using four human genome assemblies. We compare the reference human genome GRCh37 assembly, the HuRef assembly of the genome of Craig Venter, the NA12878 assembly from cell line GM12878, and the YH assembly of the genome of a Han Chinese individual. We find the variation in number and content of minimal absent words between assemblies more significant for large and very large minimal absent words, where the biases of sequencing and assembly methodologies become more pronounced. Moreover, we find generally greater similarity between the human genome assemblies sequenced with capillary-based technologies (GRCh37 and HuRef) than between the human genome assemblies sequenced with massively parallel technologies (NA12878 and YH). Finally, as expected, we find the overall variation in number and content of minimal absent words within a species to be generally smaller than the variation between species.

  5. Methodology for research I

    PubMed Central

    Garg, Rakesh

    2016-01-01

The conduct of research requires a systematic approach involving diligent planning and faithful execution of that plan. It comprises various essential predefined components such as aims, population, conduct/technique, outcome and statistical considerations. These need to be objective, reliable and in a repeatable format. Hence, an understanding of the basic aspects of methodology is essential for any researcher. This narrative review focuses on various aspects of the methodology for the conduct of clinical research. The relevant keywords were used for literature searches of various databases and of the bibliographies of the retrieved articles. PMID:27729690

  6. Multiple myeloma, immunotherapy and minimal residual disease.

    PubMed

    Kusenda, J; Kovarikova, A

    2016-01-01

Multiple myeloma (MM) is an incurable, heterogeneous hematological malignancy in which relapse is characterized by re-growth of residual tumor and immune suppression, with a complex biology that affects many aspects of the disease and its response to treatment. The bone marrow microenvironment, including immune cells, plays a central role in MM pathogenesis, survival, and drug resistance. Advances in basic and translational research and the introduction of novel agents, particularly combination therapies, have improved quality of life and survival. Minimal residual disease (MRD) detection by multiparameter flow cytometry (MFC) has revolutionized monitoring of treatment response in MM. The importance of MFC methodology will be further strengthened by the ongoing international standardization efforts. Results of MRD testing provide unique and clinically important information and have demonstrated the prognostic significance of MRD in patients, leading many contemporary protocols to adjust treatment intensity accordingly. In this review, we summarize the principal approaches in MM immunotherapy, focusing on how new agents show potential in the treatment of MM and on how application of MRD detection by MFC as a surrogate endpoint would allow quicker evaluation of treatment outcomes and rapid identification of effective new therapies.

  7. MINIMAL RESIDUAL DISEASE IN ACUTE LYMPHOBLASTIC LEUKEMIA

    PubMed Central

    Campana, Dario

    2009-01-01

In patients with acute lymphoblastic leukemia (ALL), monitoring of minimal residual disease (MRD) offers a way to precisely assess early treatment response and detect relapse. Established methods to study MRD are flow cytometric detection of abnormal immunophenotypes, polymerase chain reaction (PCR) amplification of antigen-receptor genes, and PCR amplification of fusion transcripts. The strong correlation between MRD levels and risk of relapse in childhood ALL is well established; studies in adult patients also support its prognostic value. Hence, results of MRD studies can be used to select treatment intensity and duration, and estimate the optimal timing for hematopoietic stem cell transplantation. Practical issues in the implementation of MRD assays in clinical studies include determining the most informative time point to study MRD, the levels of MRD that will trigger changes in treatment intensity, as well as the relative cost and informative power of different methodologies. The identification of new markers of leukemia and the use of increasingly refined assays should further facilitate routine monitoring of MRD and help clarify the cellular and biologic features of leukemic cells that resist chemotherapy in vivo. PMID:19100372

  8. Unsupported standing with minimized ankle muscle fatigue.

    PubMed

    Mihelj, Matjaz; Munih, Marko

    2004-08-01

In the past, limited unsupported standing has been restored in patients with thoracic spinal cord injury through open-loop functional electrical stimulation of paralyzed knee extensor muscles and the support of intact arm musculature. Here an optimal control system for paralyzed ankle muscles was designed that enables the subject to stand without hand support in a sagittal plane. The paraplegic subject was conceptualized as an underactuated double inverted pendulum structure with an active degree of freedom in the upper trunk and a passive degree of freedom in the paralyzed ankle joints. Control system design is based on the minimization of a cost function that estimates the effort of ankle joint muscles via observation of the ground reaction force position relative to the ankle joint axis. Furthermore, such a control system integrates voluntary upper trunk activity and artificial control of ankle joint muscles, resulting in a robust standing posture. Figures are shown for the initial simulation study, followed by disturbance tests on an intact volunteer and several laboratory trials with a paraplegic person. Benefits of the presented methodology are prolonged standing sessions and the fact that the subject is able to maintain voluntary control over upper body orientation in space, enabling simple functional standing. PMID:15311817

  9. Unsupported standing with minimized ankle muscle fatigue.

    PubMed

    Mihelj, Matjaz; Munih, Marko

    2004-08-01

In the past, limited unsupported standing has been restored in patients with thoracic spinal cord injury through open-loop functional electrical stimulation of paralyzed knee extensor muscles and the support of intact arm musculature. Here an optimal control system for paralyzed ankle muscles was designed that enables the subject to stand without hand support in a sagittal plane. The paraplegic subject was conceptualized as an underactuated double inverted pendulum structure with an active degree of freedom in the upper trunk and a passive degree of freedom in the paralyzed ankle joints. Control system design is based on the minimization of a cost function that estimates the effort of ankle joint muscles via observation of the ground reaction force position relative to the ankle joint axis. Furthermore, such a control system integrates voluntary upper trunk activity and artificial control of ankle joint muscles, resulting in a robust standing posture. Figures are shown for the initial simulation study, followed by disturbance tests on an intact volunteer and several laboratory trials with a paraplegic person. Benefits of the presented methodology are prolonged standing sessions and the fact that the subject is able to maintain voluntary control over upper body orientation in space, enabling simple functional standing.

  10. Temporal structure of consciousness and minimal self in schizophrenia.

    PubMed

    Martin, Brice; Wittmann, Marc; Franck, Nicolas; Cermolacce, Michel; Berna, Fabrice; Giersch, Anne

    2014-01-01

The concept of the minimal self refers to the consciousness of oneself as an immediate subject of experience. According to recent studies, disturbances of the minimal self may be a core feature of schizophrenia. They are emphasized in classical psychiatry literature and in phenomenological work. Impaired minimal self-experience may be defined as a distortion of one's first-person experiential perspective as, for example, an "altered presence" during which the sense of the experienced self ("mineness") is subtly affected, or "altered sense of demarcation," i.e., a difficulty discriminating the self from the non-self. Little is known, however, about the cognitive basis of these disturbances. In fact, recent work indicates that disorders of the self are not correlated with cognitive impairments commonly found in schizophrenia such as working-memory and attention disorders. In addition, a major difficulty with exploring the minimal self experimentally lies in its definition as being non-self-reflexive, and distinct from the verbalized, explicit awareness of an "I." In this paper, we shall discuss the possibility that disturbances of the minimal self observed in patients with schizophrenia are related to alterations in time processing. We shall review the literature on schizophrenia and time processing that lends support to this possibility. In particular we shall discuss the involvement of temporal integration windows on different time scales (implicit time processing) as well as duration perception disturbances (explicit time processing) in disorders of the minimal self. We argue that a better understanding of the relationship between time and the minimal self, as well as of issues of embodiment, requires research that looks more specifically at implicit time processing. Some methodological issues will be discussed.

  11. Temporal structure of consciousness and minimal self in schizophrenia

    PubMed Central

    Martin, Brice; Wittmann, Marc; Franck, Nicolas; Cermolacce, Michel; Berna, Fabrice; Giersch, Anne

    2014-01-01

The concept of the minimal self refers to the consciousness of oneself as an immediate subject of experience. According to recent studies, disturbances of the minimal self may be a core feature of schizophrenia. They are emphasized in classical psychiatry literature and in phenomenological work. Impaired minimal self-experience may be defined as a distortion of one’s first-person experiential perspective as, for example, an “altered presence” during which the sense of the experienced self (“mineness”) is subtly affected, or “altered sense of demarcation,” i.e., a difficulty discriminating the self from the non-self. Little is known, however, about the cognitive basis of these disturbances. In fact, recent work indicates that disorders of the self are not correlated with cognitive impairments commonly found in schizophrenia such as working-memory and attention disorders. In addition, a major difficulty with exploring the minimal self experimentally lies in its definition as being non-self-reflexive, and distinct from the verbalized, explicit awareness of an “I.” In this paper, we shall discuss the possibility that disturbances of the minimal self observed in patients with schizophrenia are related to alterations in time processing. We shall review the literature on schizophrenia and time processing that lends support to this possibility. In particular we shall discuss the involvement of temporal integration windows on different time scales (implicit time processing) as well as duration perception disturbances (explicit time processing) in disorders of the minimal self. We argue that a better understanding of the relationship between time and the minimal self, as well as of issues of embodiment, requires research that looks more specifically at implicit time processing. Some methodological issues will be discussed. PMID:25400597

  12. Evidence-Based Integrated Environmental Solutions For Secondary Lead Smelters: Pollution Prevention And Waste Minimization Technologies And Practices

    EPA Science Inventory

    An evidence-based methodology was adopted in this research to establish strategies to increase lead recovery and recycling via a systematic review and critical appraisal of the published literature. In particular, the research examines pollution prevention and waste minimization...

  13. Minimal model for Brownian vortexes.

    PubMed

    Sun, Bo; Grier, David G; Grosberg, Alexander Y

    2010-08-01

    A Brownian vortex is a noise-driven machine that uses thermal fluctuations to extract a steady-state flow of work from a static force field. Its operation is characterized by loops in a probability current whose topology and direction can change with changes in temperature. We present discrete three- and four-state minimal models for Brownian vortexes that can be solved exactly with a master-equation formalism. These models elucidate conditions required for flux reversal in Brownian vortexes and provide insights into their thermodynamic efficiency through the rate of entropy production. PMID:20866791
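The master-equation treatment this abstract describes can be sketched for a three-state ring: the stationary distribution follows from the generator matrix, and a nonzero loop current appears exactly when the rates break detailed balance. The rate values below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def stationary_current(f, b):
    """Stationary distribution and net loop current for a three-state ring
    with uniform forward rate f (i -> i+1) and backward rate b (i -> i-1)."""
    n = 3
    L = np.zeros((n, n))               # generator matrix: L[i, j] = rate j -> i
    for j in range(n):
        L[(j + 1) % n, j] += f
        L[(j - 1) % n, j] += b
        L[j, j] -= f + b
    # solve L p = 0 together with the normalization sum(p) = 1
    A = np.vstack([L[:-1], np.ones(n)])
    p = np.linalg.solve(A, np.array([0.0, 0.0, 1.0]))
    current = f * p[0] - b * p[1]      # net probability current on one link
    return p, current

p, J = stationary_current(f=2.0, b=1.0)        # broken detailed balance: J != 0
p_eq, J_eq = stationary_current(f=1.0, b=1.0)  # detailed balance: J = 0

# entropy production rate for this uniform ring (3 links, p uniform);
# it vanishes exactly when detailed balance holds
entropy_rate = 3 * J * np.log(2.0 / 1.0)
```

At steady state the current is uniform around the ring, and the sign of `J` flips when the rate bias flips, mirroring the flux reversal the exact solutions of the paper's minimal models exhibit.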

  14. [Minimally invasive iridocorneal angle surgery].

    PubMed

    Jordan, J F

    2012-07-01

Classical filtration surgery with trabeculectomy, or drainage of aqueous humor via episcleral implants, is the most effective method for permanently reducing intraocular pressure to low and normal levels. Although both operative procedures are well established, the very efficacy of these methods brings potentially dangerous intraoperative and postoperative complications at a frequency that cannot be ignored. This led, and continues to lead, to a search for low-complication alternatives such as non-penetrating glaucoma surgery (NPGS). Trabecular meshwork surgery in particular, with the continuous development of new operative techniques, has shifted the focus toward low-complication, minimally invasive, gonioscopic glaucoma surgery.

  15. The minimal scenario of leptogenesis

    NASA Astrophysics Data System (ADS)

    Blanchet, Steve; Di Bari, Pasquale

    2012-12-01

    We review the main features and results of thermal leptogenesis within the type I seesaw mechanism, the minimal extension of the Standard Model explaining neutrino masses and mixing. After presenting the simplest approach, the vanilla scenario, we discuss various important developments of recent years, such as the inclusion of lepton and heavy neutrino flavour effects, a description beyond a hierarchical heavy neutrino mass spectrum and an improved kinetic description within the density matrix and the closed-time-path formalisms. We also discuss how leptogenesis can ultimately represent an important phenomenological tool to test the seesaw mechanism and the underlying model of new physics.

  16. Radiometric calibration by rank minimization.

    PubMed

    Lee, Joon-Young; Matsushita, Yasuyuki; Shi, Boxin; Kweon, In So; Ikeuchi, Katsushi

    2013-01-01

We present a robust radiometric calibration framework that capitalizes on the transform-invariant low-rank structure in various types of observations, such as sensor irradiances recorded from a static scene with different exposure times, or the linear structure of irradiance color mixtures around edges. We show that various radiometric calibration problems can be treated in a principled framework based on rank minimization, which provides a unified way of solving them across these different settings. The proposed approach is evaluated using both simulation and real-world datasets and shows superior performance to previous approaches.
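The low-rank structure this abstract exploits can be illustrated directly: for a linear (radiometrically calibrated) sensor, the stack of observations of a static scene at different exposure times is a rank-1 matrix (exposure times irradiance), while a non-multiplicative response curve destroys that structure. The sketch below is a toy demonstration under those assumptions; the response curve and sizes are illustrative, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(1)
irradiance = rng.uniform(0.1, 1.0, size=50)   # static scene, 50 pixels
exposures = np.array([0.25, 0.5, 1.0, 2.0])   # four exposure times

linear = np.outer(exposures, irradiance)      # ideal linear sensor: rank 1
nonlinear = linear / (1.0 + linear)           # toy saturating camera response

def numerical_rank(M, tol=1e-6):
    """Count singular values above tol relative to the largest one."""
    s = np.linalg.svd(M, compute_uv=False)
    return int((s > tol * s[0]).sum())

print(numerical_rank(linear), numerical_rank(nonlinear))
```

Recovering the inverse response is then naturally posed as finding the transform that restores the rank-1 structure, which is the rank-minimization idea in the abstract.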

  17. Minimizing medical litigation, part 2.

    PubMed

    Harold, Tan Keng Boon

    2006-01-01

Provider-patient disputes are inevitable in the healthcare sector. Healthcare providers and regulators should recognize this and plan opportunities to apply alternative dispute resolution (ADR) as early as possible in the care delivery process. Negotiation is often the main dispute resolution method used by local healthcare providers, failing which litigation usually follows. The role of mediation in resolving malpractice disputes has been minimal. Healthcare providers, administrators, and regulators should therefore look toward a post-event communication-cum-mediation framework as the key national strategy for resolving malpractice disputes. PMID:16711089

  18. Integrated decision support system for waste minimization analysis in chemical processes.

    PubMed

    Halim, Iskandar; Srinivasan, Rajagopalan

    2002-04-01

The need to build and operate environmentally friendly plants has challenged the chemical industry to consider waste minimization or even elimination starting from the early stages of process development. A thorough waste minimization analysis requires specialized expertise and is laborious, time-consuming, expensive, and knowledge-intensive. This has been a major technical barrier to implementing waste minimization programs within the industry. Previously, we reported a systematic methodology and a knowledge-based system, called ENVOPExpert, for identifying waste minimization opportunities in chemical processes. In this paper, we propose an integrated qualitative-quantitative methodology to identify waste minimization alternatives and assess their efficacy in terms of environmental impact and process economics. A qualitative analysis is first conducted to identify the sources of wastes and to propose alternatives for eliminating or minimizing them. The environmental impact of each alternative is then calculated through a quantitative pollutant balance. The capital expenditure required for implementing the alternative and the resulting plant operating costs are also calculated and used in the evaluation of the waste minimization alternatives. Through this, practical and cost-effective options can be identified. This methodology has been implemented as an integrated decision support system and tested using the hydrodealkylation process case study with satisfactory results.

  19. Courseware Engineering Methodology.

    ERIC Educational Resources Information Center

    Uden, Lorna

    2002-01-01

    Describes development of the Courseware Engineering Methodology (CEM), created to guide novices in designing effective courseware. Discusses CEM's four models: pedagogical (concerned with the courseware's pedagogical aspects), conceptual (dealing with software engineering), interface (relating to human-computer interaction), and hypermedia…

  20. Complicating Methodological Transparency

    ERIC Educational Resources Information Center

    Bridges-Rhoads, Sarah; Van Cleave, Jessica; Hughes, Hilary E.

    2016-01-01

    A historical indicator of the quality, validity, and rigor of qualitative research has been the documentation and disclosure of the behind-the-scenes work of the researcher. In this paper, we use what we call "methodological data" as a tool to complicate the possibility and desirability of such transparency. Specifically, we draw on our…

  1. Video: Modalities and Methodologies

    ERIC Educational Resources Information Center

    Hadfield, Mark; Haw, Kaye

    2012-01-01

    In this article, we set out to explore what we describe as the use of video in various modalities. For us, modality is a synthesizing construct that draws together and differentiates between the notion of "video" both as a method and as a methodology. It encompasses the use of the term video as both product and process, and as a data collection…

  2. Minimizing travel claims cost with minimal-spanning tree model

    NASA Astrophysics Data System (ADS)

    Jamalluddin, Mohd Helmi; Jaafar, Mohd Azrul; Amran, Mohd Iskandar; Ainul, Mohd Sharizal; Hamid, Aqmar; Mansor, Zafirah Mohd; Nopiah, Zulkifli Mohd

    2014-06-01

Official travel generates substantial expenditure, as shown by the National Audit Department (NAD). Every year the auditing process is carried out throughout the country, covering official travel claims. This study focuses on the use of the minimal-spanning tree model to determine the shortest connecting network and thereby minimize the cost of the NAD's official travel claims. The objective is to study the feasibility of running a network from the Kluang District Health Office to eight rural clinics in Johor state, using minimal-spanning tree applications to optimize travelling distances, and to recommend that senior management of the Audit Department analyze travelling details before an audit is conducted. The results reveal savings of up to 47.4% relative to the original claims over the travel distances considered.
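The minimal-spanning tree computation behind this kind of cost analysis can be sketched with Kruskal's algorithm; the office, clinic names, and distances below are hypothetical placeholders, not the study's data:

```python
def kruskal_mst(nodes, edges):
    """Return the minimum spanning tree edges and their total weight.

    edges: list of (weight, u, v) tuples."""
    parent = {n: n for n in nodes}

    def find(x):  # union-find root lookup with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst, total = [], 0
    for w, u, v in sorted(edges):          # consider cheapest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                       # edge joins two components: keep it
            parent[ru] = rv
            mst.append((u, v, w))
            total += w
    return mst, total

# hypothetical road distances (km) between a district office and clinics
edges = [(4, "Office", "Clinic1"), (2, "Office", "Clinic2"),
         (1, "Clinic1", "Clinic2"), (5, "Clinic1", "Clinic3"),
         (8, "Clinic2", "Clinic3"), (10, "Clinic2", "Clinic4"),
         (2, "Clinic3", "Clinic4")]
nodes = ["Office", "Clinic1", "Clinic2", "Clinic3", "Clinic4"]
mst, total = kruskal_mst(nodes, edges)
print(total)  # 10: the cheapest network connecting all five sites
```

Comparing `total` against the mileage actually claimed is the kind of gap analysis the study proposes for auditors.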

  3. Annual Waste Minimization Summary Report

    SciTech Connect

    Alfred J. Karns

    2007-01-01

    This report summarizes the waste minimization efforts undertaken by National Security Technologies, LLC (NSTec), for the U. S. Department of Energy (DOE) National Nuclear Security Administration Nevada Site Office (NNSA/NSO), during CY06. This report was developed in accordance with the requirements of the Nevada Test Site (NTS) Resource Conservation and Recovery Act (RCRA) Permit (No. NEV HW0021) and as clarified in a letter dated April 21, 1995, from Paul Liebendorfer of the Nevada Division of Environmental Protection to Donald Elle of the DOE, Nevada Operations Office. The NNSA/NSO Pollution Prevention (P2) Program establishes a process to reduce the volume and toxicity of waste generated by the NNSA/NSO and ensures that proposed methods of treatment, storage, and/or disposal of waste minimize potential threats to human health and the environment. The following information provides an overview of the P2 Program, major P2 accomplishments during the reporting year, a comparison of the current year waste generation to prior years, and a description of efforts undertaken during the year to reduce the volume and toxicity of waste generated by the NNSA/NSO.

  4. Less minimal supersymmetric standard model

    SciTech Connect

    de Gouvea, Andre; Friedland, Alexander; Murayama, Hitoshi

    1998-03-28

Most of the phenomenological studies of supersymmetry have been carried out using the so-called minimal supergravity scenario, where one assumes a universal scalar mass, gaugino mass, and trilinear coupling at M_GUT. Even though this is a useful simplifying assumption for phenomenological analyses, it is rather too restrictive to accommodate a large variety of phenomenological possibilities. It predicts, among other things, that the lightest supersymmetric particle (LSP) is an almost pure B-ino, and that the μ-parameter is larger than the masses of the SU(2)_L and U(1)_Y gauginos. We extend the minimal supergravity framework by introducing one extra parameter: the Fayet-Iliopoulos D-term for the hypercharge U(1), D_Y. Allowing for this extra parameter, we find a much more diverse phenomenology, where the LSP is the tau sneutrino, the stau, or a neutralino with a large higgsino content. We discuss the relevance of the different possibilities to collider signatures. The same type of extension can be done to models with gauge mediation of supersymmetry breaking. We argue that it is not wise to impose cosmological constraints on the parameter space.

  5. Next-to-minimal SOFTSUSY

    NASA Astrophysics Data System (ADS)

    Allanach, B. C.; Athron, P.; Tunstall, Lewis C.; Voigt, A.; Williams, A. G.

    2014-09-01

We describe an extension to the SOFTSUSY program that provides for the calculation of the sparticle spectrum in the Next-to-Minimal Supersymmetric Standard Model (NMSSM), where a chiral superfield that is a singlet of the Standard Model gauge group is added to the Minimal Supersymmetric Standard Model (MSSM) fields. Often, a Z3 symmetry is imposed upon the model. SOFTSUSY can calculate the spectrum in this case as well as the case where general Z3-violating terms are added to the soft supersymmetry breaking terms and the superpotential. The user provides a theoretical boundary condition for the couplings and mass terms of the singlet. Radiative electroweak symmetry breaking data along with electroweak and CKM matrix data are used as weak-scale boundary conditions. The renormalisation group equations are solved numerically between the weak scale and a high energy scale using a nested iterative algorithm. This paper serves as a manual to the NMSSM mode of the program, detailing the approximations and conventions used. Catalogue identifier: ADPM_v4_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADPM_v4_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 154886 No. of bytes in distributed program, including test data, etc.: 1870890 Distribution format: tar.gz Programming language: C++, Fortran. Computer: Personal computer. Operating system: Tested on Linux 3.x. Word size: 64 bits Classification: 11.1, 11.6. Does the new version supersede the previous version?: Yes Catalogue identifier of previous version: ADPM_v3_0 Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 785 Nature of problem: Calculating supersymmetric particle spectrum and mixing parameters in the next-to-minimal supersymmetric standard model. The solution to the…

  6. Development of a flight software testing methodology

    NASA Technical Reports Server (NTRS)

    Mccluskey, E. J.; Andrews, D. M.

    1985-01-01

    The research to develop a testing methodology for flight software is described. An experiment was conducted in using assertions to dynamically test digital flight control software. The experiment showed that 87% of typical errors introduced into the program would be detected by assertions. Detailed analysis of the test data showed that the number of assertions needed to detect those errors could be reduced to a minimal set. The analysis also revealed that the most effective assertions tested program parameters that provided greater indirect (collateral) testing of other parameters. In addition, a prototype watchdog task system was built to evaluate the effectiveness of executing assertions in parallel by using the multitasking features of Ada.
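The assertion-based dynamic testing this record describes can be sketched as executable checks embedded alongside a control-law computation. The control law, parameter names, and bounds below are hypothetical illustrations, not the study's flight software:

```python
def pitch_command(airspeed_mps, target_alt_m, current_alt_m):
    """Toy altitude-hold step instrumented with runtime assertions."""
    # direct assertions on the input parameters
    assert 0.0 < airspeed_mps < 340.0, "airspeed outside flight envelope"
    assert 0.0 <= current_alt_m < 20_000.0, "altitude out of range"

    gain = 0.002
    cmd = gain * (target_alt_m - current_alt_m) / max(airspeed_mps, 1.0)

    # assertion on the output: it also tests gain and the altitude error
    # indirectly, the collateral coverage the study found most effective
    assert -1.0 <= cmd <= 1.0, "pitch command out of bounds"
    return cmd

cmd = pitch_command(airspeed_mps=80.0, target_alt_m=1500.0, current_alt_m=1400.0)

# a seeded error (gain mistyped as 2.0) is caught by the same output check
detected = False
try:
    bad_cmd = 2.0 * (1500.0 - 1400.0) / 80.0   # = 2.5, outside bounds
    assert -1.0 <= bad_cmd <= 1.0, "pitch command out of bounds"
except AssertionError:
    detected = True
```

The output assertion is the analogue of the study's finding that checks on derived parameters give the greatest indirect coverage of upstream computations.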

  7. A POLLUTION REDUCTION METHODOLOGY FOR CHEMICAL PROCESS SIMULATORS

    EPA Science Inventory

    A pollution minimization methodology was developed for chemical process design using computer simulation. It is based on a pollution balance that at steady state is used to define a pollution index with units of mass of pollution per mass of products. The pollution balance has be...
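The pollution index defined in this record (mass of pollution per mass of products, from a steady-state pollution balance) reduces to a simple ratio over the simulated process streams. A minimal sketch with hypothetical stream data, not the EPA methodology's actual interface:

```python
def pollution_index(pollutant_out_kg_h, product_out_kg_h):
    """Steady-state pollution index: total pollutant mass leaving the
    process per unit mass of product made (kg pollutant / kg product)."""
    total_pollutant = sum(pollutant_out_kg_h)
    total_product = sum(product_out_kg_h)
    if total_product <= 0:
        raise ValueError("process must generate products")
    return total_pollutant / total_product

# hypothetical simulator output: two waste streams, two product streams
index = pollution_index([12.0, 3.0], [500.0, 250.0])
print(index)  # 0.02 kg pollutant per kg product
```

Comparing the index across candidate flowsheets is what lets a simulator rank design alternatives by pollution per unit of product.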

  8. [MINIMALLY INVASIVE AORTIC VALVE REPLACEMENT].

    PubMed

    Tabata, Minoru

    2016-03-01

Minimally invasive aortic valve replacement (MIAVR) is defined as aortic valve replacement avoiding full sternotomy. Common approaches include a partial sternotomy, a right thoracotomy, and a parasternal approach. MIAVR has been shown to have advantages over conventional AVR, such as shorter length of stay, less blood transfusion, and better cosmesis. However, it is also known to have disadvantages, such as longer cardiopulmonary bypass and aortic cross-clamp times and potential complications related to peripheral cannulation. Appropriate patient selection is very important. Since the procedure is more complex than conventional AVR, more intensive teamwork in the operating room is essential. Additionally, a team approach during postoperative management is critical to maximize the benefits of MIAVR.

  9. Minimal unitary (covariant) scattering theory

    SciTech Connect

    Lindesay, J.V.; Markevich, A.

    1983-06-01

In the minimal three-particle equations developed by Lindesay, the two-body input amplitude was an on-shell relativistic generalization of the non-relativistic scattering model characterized by a single mass parameter μ, which in the two-body (m + m) system looks like an s-channel bound state (μ < 2m) or virtual state (μ > 2m). Using this driving term in covariant Faddeev equations generates a rich covariant and unitary three-particle dynamics. However, the simplest way of writing the relativistic generalization of the Faddeev equations can take the on-shell Mandelstam parameter s = 4(q² + m²), in terms of which the two-particle input is expressed, to negative values in the range of integration required by the dynamics. This problem was met in the original treatment by multiplying the two-particle input amplitude by Θ(s). This paper provides what we hope to be a more direct way of meeting the problem.

  10. A minimally invasive smile enhancement.

    PubMed

    Peck, Fred H

    2014-01-01

    Minimally invasive dentistry refers to a wide variety of dental treatments. On the restorative aspect of dental procedures, direct resin bonding can be a very conservative treatment option for the patient. When tooth structure does not need to be removed, the patient benefits. Proper treatment planning is essential to determine how conservative the restorative treatment will be. This article describes the diagnosis, treatment options, and procedural techniques in the restoration of 4 maxillary anterior teeth with direct composite resin. The procedural steps are reviewed with regard to placing the composite and the variety of colors needed to ensure a natural result. Finishing and polishing of the composite are critical to ending with a natural looking dentition that the patient will be pleased with for many years.

  11. Strategies to Minimize Antibiotic Resistance

    PubMed Central

    Lee, Chang-Ro; Cho, Ill Hwan; Jeong, Byeong Chul; Lee, Sang Hee

    2013-01-01

    Antibiotic resistance can be reduced by using antibiotics prudently based on guidelines of antimicrobial stewardship programs (ASPs) and various data such as pharmacokinetic (PK) and pharmacodynamic (PD) properties of antibiotics, diagnostic testing, antimicrobial susceptibility testing (AST), clinical response, and effects on the microbiota, as well as by new antibiotic developments. The controlled use of antibiotics in food animals is another cornerstone among efforts to reduce antibiotic resistance. All major resistance-control strategies recommend education for patients, children (e.g., through schools and day care), the public, and relevant healthcare professionals (e.g., primary-care physicians, pharmacists, and medical students) regarding unique features of bacterial infections and antibiotics, prudent antibiotic prescribing as a positive construct, and personal hygiene (e.g., handwashing). The problem of antibiotic resistance can be minimized only by concerted efforts of all members of society for ensuring the continued efficiency of antibiotics. PMID:24036486

  12. Minimally packed phases in holography

    NASA Astrophysics Data System (ADS)

    Donos, Aristomenis; Gauntlett, Jerome P.

    2016-03-01

    We numerically construct asymptotically AdS black brane solutions of D = 4 Einstein-Maxwell theory coupled to a pseudoscalar. The solutions are holographically dual to d = 3 CFTs at finite chemical potential and in a constant magnetic field, which spontaneously break translation invariance leading to the spontaneous formation of abelian and momentum magnetisation currents flowing around the plaquettes of a periodic Bravais lattice. We analyse the three-dimensional moduli space of lattice solutions, which are generically oblique, and show, for a specific value of the magnetic field, that the free energy is minimised by the triangular lattice, associated with minimal packing of circles in the plane. We show that the average stress tensor for the thermodynamically preferred phase is that of a perfect fluid and that this result applies more generally to spontaneously generated periodic phases. The triangular structure persists at low temperatures indicating the existence of novel crystalline ground states.

  13. Waste minimization in chrome plating

    SciTech Connect

    Scheuer, J.; Walter, K.; Nastasi, M.

    1996-09-01

    This is the final report of a one-year laboratory directed research and development project at the Los Alamos National Laboratory (LANL). Traditional wet chemical electroplating techniques utilize toxic materials and pose environmental hazards in the disposal of primary baths and waste waters. Pollutants include metals and nonmetals, such as oil, grease, phosphates, and toxic and organic compounds. This project is focused on development of plasma source ion implantation (PSII), a novel and cost-effective surface modification technique, to minimize and ultimately eliminate waste generated in chrome plating. We are collaborating with an industrial partner to design material systems, utilize the PSII processes in existing Los Alamos experimental facilities, and analyze both material and performance characteristics.

  14. Non-minimal Inflationary Attractors

    SciTech Connect

    Kallosh, Renata; Linde, Andrei E-mail: alinde@stanford.edu

    2013-10-01

    Recently we identified a new class of (super)conformally invariant theories which allow inflation even if the scalar potential is very steep in terms of the original conformal variables. Observational predictions of a broad class of such theories are nearly model-independent. In this paper we consider generalized versions of these models where the inflaton has a non-minimal coupling to gravity with a negative parameter ξ different from its conformal value -1/6. We show that these models exhibit attractor behavior. With even a slight increase of |ξ| from |ξ| = 0, predictions of these models for n_s and r rapidly converge to their universal model-independent values corresponding to conformal coupling ξ = −1/6. These values of n_s and r practically coincide with the corresponding values in the limit ξ → −∞.

  15. Waste minimization in analytical methods

    SciTech Connect

    Green, D.W.; Smith, L.L.; Crain, J.S.; Boparai, A.S.; Kiely, J.T.; Yaeger, J.S.; Schilling, J.B.

    1995-05-01

    The US Department of Energy (DOE) will require a large number of waste characterizations over a multi-year period to accomplish the Department's goals in environmental restoration and waste management. Estimates vary, but two million analyses annually are expected. The waste generated by the analytical procedures used for characterizations is a significant source of new DOE waste. Success in reducing the volume of secondary waste and the costs of handling this waste would significantly decrease the overall cost of this DOE program. Selection of appropriate analytical methods depends on the intended use of the resultant data. It is not always necessary to use a high-powered analytical method, typically at higher cost, to obtain data needed to make decisions about waste management. Indeed, for samples taken from some heterogeneous systems, the meaning of high accuracy becomes clouded if the data generated are intended to measure a property of this system. Among the factors to be considered in selecting the analytical method are the lower limit of detection, accuracy, turnaround time, cost, reproducibility (precision), interferences, and simplicity. Occasionally, there must be tradeoffs among these factors to achieve the multiple goals of a characterization program. The purpose of the work described here is to add waste minimization to the list of characteristics to be considered. In this paper the authors present results of modifying analytical methods for waste characterization to reduce both the cost of analysis and volume of secondary wastes. Although tradeoffs may be required to minimize waste while still generating data of acceptable quality for the decision-making process, they have data demonstrating that wastes can be reduced in some cases without sacrificing accuracy or precision.

  16. Tobacco documents research methodology.

    PubMed

    Anderson, Stacey J; McCandless, Phyra M; Klausner, Kim; Taketa, Rachel; Yerger, Valerie B

    2011-05-01

    Tobacco documents research has developed into a thriving academic enterprise since its inception in 1995. The technology supporting tobacco documents archiving, searching and retrieval has improved greatly since that time, and consequently tobacco documents researchers have considerably more access to resources than was the case when researchers had to travel to physical archives and/or electronically search poorly and incompletely indexed documents. The authors of the papers presented in this supplement all followed the same basic research methodology. Rather than leave the reader of the supplement to read the same discussion of methods in each individual paper, presented here is an overview of the methods all authors followed. In the individual articles that follow in this supplement, the authors present the additional methodological information specific to their topics. This brief discussion also highlights technological capabilities in the Legacy Tobacco Documents Library and updates methods for organising internal tobacco documents data and findings.

  17. Soft Systems Methodology

    NASA Astrophysics Data System (ADS)

    Checkland, Peter; Poulter, John

    Soft systems methodology (SSM) is an approach for tackling problematical, messy situations of all kinds. It is an action-oriented process of inquiry into problematic situations in which users learn their way from finding out about the situation, to taking action to improve it. The learning emerges via an organised process in which the situation is explored using a set of models of purposeful action (each built to encapsulate a single worldview) as intellectual devices, or tools, to inform and structure discussion about a situation and how it might be improved. This paper, written by the original developer Peter Checkland and practitioner John Poulter, gives a clear and concise account of the approach that covers SSM's specific techniques, the learning cycle process of the methodology and the craft skills which practitioners develop. This concise but theoretically robust account also covers the fundamental concepts, techniques, and core tenets of SSM, described through a wide range of settings.

  18. Acoustic methodology review

    NASA Technical Reports Server (NTRS)

    Schlegel, R. G.

    1982-01-01

    It is important for industry and NASA to assess the status of acoustic design technology for predicting and controlling helicopter external noise in order for a meaningful research program to be formulated which will address this problem. The prediction methodologies available to the designer and the acoustic engineer are three-fold. First is what has been described as a first principle analysis. This analysis approach attempts to remove any empiricism from the analysis process and deals with a theoretical mechanism approach to predicting the noise. The second approach attempts to combine first principle methodology (when available) with empirical data to formulate source predictors which can be combined to predict vehicle levels. The third is an empirical analysis, which attempts to generalize measured trends into a vehicle noise prediction method. This paper will briefly address each.

  19. Methodology for research II

    PubMed Central

    Bhaskar, S Bala; Manjuladevi, M

    2016-01-01

    Research is a systematic process, which uses scientific methods to generate new knowledge that can be used to solve a query or improve on the existing system. Any research on human subjects is associated with varying degrees of risk to the participating individual, and it is important to safeguard the welfare and rights of the participants. This review focuses on various steps involved in methodology (in continuation with the previous section) before the data are submitted for publication. PMID:27729691

  20. Fast track evaluation methodology.

    PubMed

    Duke, J R

    1991-06-01

    Evaluating hospital information systems has taken a variety of forms since the initial development and use of automation. The process itself has moved from a hardware-based orientation controlled by data processing professionals to systems solutions and a user-driven process overseen by management. At Harbor Hospital Center in Baltimore, a fast track methodology has been introduced to shorten system evaluation time to meet the rapid changes that constantly affect the healthcare industry.

  1. Darwin's Methodological Evolution.

    PubMed

    Lennox, James G

    2005-01-01

    A necessary condition for having a revolution named after you is that you are an innovator in your field. I argue that if Charles Darwin meets this condition, it is as a philosopher and methodologist. In 1991, I made the case for Darwin's innovative use of "thought experiment" in the Origin. Here I place this innovative practice in the context of Darwin's methodological commitments, trace its origins back into Darwin's notebooks, and pursue Darwin's suggestion that it owes its inspiration to Charles Lyell.

  2. MINIMIZATION OF CARBON LOSS IN COAL REBURNING

    SciTech Connect

    Vladimir M. Zamansky; Vitali V. Lissianski

    2001-09-07

    This project develops Fuel-Flexible Reburning (FFR), which combines conventional reburning and Advanced Reburning (AR) technologies with an innovative method of delivering coal as the reburning fuel. The overall objective of this project is to develop engineering and scientific information and know-how needed to improve the cost of reburning via increased efficiency and minimized carbon in ash, and to move the FFR technology to the demonstration and commercialization stage. Specifically, the project entails: (1) optimizing FFR with injection of gasified and partially gasified fuels with respect to NOx and carbon in ash reduction; (2) characterizing flue gas emissions; (3) developing a process model to predict FFR performance; (4) completing an engineering and economic analysis of FFR as compared to conventional reburning and other commercial NOx control technologies, and (5) developing a full-scale FFR design methodology. The project started in August 2000 and will be conducted over a two-year period. The work includes a combination of analytical and experimental studies to identify optimum process configurations and develop a design methodology for full-scale applications. The first year of the program included pilot-scale tests to evaluate performances of two bituminous coals in basic reburning and modeling studies designed to identify parameters that affect the FFR performance and to evaluate efficiency of coal pyrolysis products as a reburning fuel. Tests were performed in a 300 kW Boiler Simulator Facility to characterize bituminous coals as reburning fuels. Tests showed that NOx reduction in basic coal reburning depends on process conditions, initial NOx and coal type. Up to 60% NOx reduction was achieved at optimized conditions. Modeling activities during the first year concentrated on the development of a coal reburning model and on the prediction of NOx reduction in reburning by coal gasification products. Modeling predicted that

  3. Minimizing communication cost among distributed controllers in software defined networks

    NASA Astrophysics Data System (ADS)

    Arlimatti, Shivaleela; Elbreiki, Walid; Hassan, Suhaidi; Habbal, Adib; Elshaikh, Mohamed

    2016-08-01

    Software Defined Networking (SDN) is a new paradigm to increase the flexibility of today's network by promising a programmable network. The fundamental idea behind this new architecture is to simplify network complexity by decoupling the control plane and data plane of the network devices, and by making the control plane centralized. Recently, controllers have been distributed to solve the problem of a single point of failure, and to increase scalability and flexibility during workload distribution. Even though controllers are flexible and scalable enough to accommodate a greater number of network switches, the intercommunication cost between distributed controllers is still a challenging issue in the Software Defined Network environment. This paper aims to fill the gap by proposing a new mechanism that minimizes intercommunication cost using a graph partitioning algorithm (an NP-hard problem). The methodology proposed in this paper is the swapping of network elements between controller domains to minimize communication cost by calculating communication gain. The swapping of elements minimizes inter and intra communication cost among network domains. We validate our work with the OMNeT++ simulation environment tool. Simulation results show that the proposed mechanism minimizes the inter-domain communication cost among controllers compared to traditional distributed controllers.
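
    The element-swapping step described above can be sketched as a Kernighan-Lin-style local search. The toy graph, traffic weights, and greedy loop below are illustrative assumptions for this listing, not the paper's implementation.

```python
import itertools

# Toy model (assumed for illustration): switches are graph nodes, edge
# weights are traffic volumes, and each controller domain owns a set of
# switches. Inter-domain cost = total weight of edges crossing domains.

def inter_domain_cost(edges, assign):
    """Total weight of edges whose endpoints lie in different domains."""
    return sum(w for (u, v), w in edges.items() if assign[u] != assign[v])

def swap_gain(edges, assign, a, b):
    """Communication-cost reduction from swapping switches a and b."""
    trial = dict(assign)
    trial[a], trial[b] = trial[b], trial[a]
    return inter_domain_cost(edges, assign) - inter_domain_cost(edges, trial)

def greedy_swaps(edges, assign):
    """Repeatedly apply the best positive-gain swap until none remains."""
    while True:
        pairs = [(a, b) for a, b in itertools.combinations(assign, 2)
                 if assign[a] != assign[b]]
        best = max(pairs, key=lambda p: swap_gain(edges, assign, *p),
                   default=None)
        if best is None or swap_gain(edges, assign, *best) <= 0:
            return assign
        a, b = best
        assign[a], assign[b] = assign[b], assign[a]
```

    With two domains {s1, s3} / {s2, s4} and heavy s1-s2 and s3-s4 traffic, a single swap places each heavy edge inside one domain, cutting the inter-domain cost of the toy instance from 10 to 1.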

  4. Optimized 3D watermarking for minimal surface distortion.

    PubMed

    Bors, Adrian G; Luo, Ming

    2013-05-01

    This paper proposes a new approach to 3D watermarking by ensuring the optimal preservation of mesh surfaces. A new 3D surface preservation function metric is defined, consisting of the distance of a vertex displaced by watermarking to the original surface, the distance to the watermarked object surface, and the actual vertex displacement. The proposed method is statistical, blind, and robust. Minimal surface distortion according to the proposed function metric is enforced during the statistical watermark embedding stage using the Levenberg-Marquardt optimization method. A study of the watermark code crypto-security is provided for the proposed methodology. According to the experimental results, the proposed methodology has high robustness against the common mesh attacks while preserving the original object surface during watermarking.

  5. Minimizing Variation in Outdoor CPV Power Ratings: Preprint

    SciTech Connect

    Muller, M.; Marion, B.; Rodriguez, J.; Kurtz, S.

    2011-07-01

    The CPV community has agreed to have both indoor and outdoor power ratings at the module level. The indoor rating provides a repeatable measure of module performance as it leaves the factory line, while the outdoor rating provides a measure of true performance under real-world conditions. The challenge with an outdoor rating is that the spectrum, temperature, wind speed, etc. are constantly in flux, and therefore the resulting power rating varies from day to day and month to month. This work examines different methodologies for determining the outdoor power rating with the goal of minimizing variation even if data are collected under changing meteorological conditions.

  6. Multidimensional minimal spanning tree: The Dow Jones case

    NASA Astrophysics Data System (ADS)

    Brida, Juan Gabriel; Risso, Wiston Adrián

    2008-09-01

    This paper introduces a new methodology in order to construct Minimal Spanning Trees (MST) and Hierarchical Trees (HT) using the information provided by more than one variable. In fact, the Symbolic Time Series Analysis (STSA) approach is applied to the Dow Jones companies using information not only from asset returns but also from trading volume. The US stock market structure is obtained, showing eight clusters of companies and General Electric as a central node in the tree. We use different partitions showing that the results do not depend on the particular partition. In addition, we apply Monte Carlo simulations suggesting that the tree is not the result of random connections.
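
    The single-variable construction that the multidimensional method above generalizes (Mantegna's correlation-based MST, using the distance d = sqrt(2(1 - rho))) can be sketched as follows; the three-stock correlation values are invented for illustration.

```python
import math

def mst_edges(names, corr):
    """Kruskal's MST over the distances d = sqrt(2*(1 - rho))."""
    dist = {}
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            dist[(a, b)] = math.sqrt(2.0 * (1.0 - corr[(a, b)]))

    parent = {n: n for n in names}          # union-find forest

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    tree = []
    for (a, b), d in sorted(dist.items(), key=lambda kv: kv[1]):
        ra, rb = find(a), find(b)
        if ra != rb:                        # edge joins two components
            parent[ra] = rb
            tree.append((a, b, round(d, 3)))
    return tree

# Hypothetical correlations; strongly correlated stocks end up adjacent.
names = ["GE", "JPM", "XOM"]
corr = {("GE", "JPM"): 0.6, ("GE", "XOM"): 0.5, ("JPM", "XOM"): 0.2}
tree = mst_edges(names, corr)
```

    In this toy tree GE links to both other stocks, i.e. it is the central node, mirroring the role the abstract reports for General Electric in the full Dow Jones tree.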

  7. Mini-Med School Planning Guide

    ERIC Educational Resources Information Center

    National Institutes of Health, Office of Science Education, 2008

    2008-01-01

    Mini-Med Schools are public education programs now offered by more than 70 medical schools, universities, research institutions, and hospitals across the nation. There are even Mini-Med Schools in Ireland, Malta, and Canada! The program is typically a lecture series that meets once a week and provides "mini-med students" information on some of the…

  8. Closed locally minimal nets on tetrahedra

    SciTech Connect

    Strelkova, Nataliya P

    2011-01-31

    Closed locally minimal networks are in a sense a generalization of closed geodesics. A complete classification is known of closed locally minimal networks on regular (and generally any equihedral) tetrahedra. In the present paper certain necessary and certain sufficient conditions are given for at least one closed locally minimal network to exist on a given non-equihedral tetrahedron. Bibliography: 6 titles.

  9. Minimally Invasive Mitral Valve Surgery II

    PubMed Central

    Wolfe, J. Alan; Malaisrie, S. Chris; Farivar, R. Saeid; Khan, Junaid H.; Hargrove, W. Clark; Moront, Michael G.; Ryan, William H.; Ailawadi, Gorav; Agnihotri, Arvind K.; Hummel, Brian W.; Fayers, Trevor M.; Grossi, Eugene A.; Guy, T. Sloane; Lehr, Eric J.; Mehall, John R.; Murphy, Douglas A.; Rodriguez, Evelio; Salemi, Arash; Segurola, Romualdo J.; Shemin, Richard J.; Smith, J. Michael; Smith, Robert L.; Weldner, Paul W.; Lewis, Clifton T. P.; Barnhart, Glenn R.; Goldman, Scott M.

    2016-01-01

    Abstract Techniques for minimally invasive mitral valve repair and replacement continue to evolve. This expert opinion, the second of a 3-part series, outlines current best practices for nonrobotic, minimally invasive mitral valve procedures, and for postoperative care after minimally invasive mitral valve surgery. PMID:27654406

  10. Recursively minimally-deformed oscillators

    NASA Astrophysics Data System (ADS)

    Katriel, J.; Quesne, C.

    1996-04-01

    A recursive deformation of the boson commutation relation is introduced. Each step consists of a minimal deformation of a commutator [a,a†]=f_k(... ;n̂) into [a,a†]_{q_{k+1}}=f_k(... ;n̂), where ... stands for the set of deformation parameters that f_k depends on, followed by a transformation into the commutator [a,a†]=f_{k+1}(...,q_{k+1};n̂) to which the deformed commutator is equivalent within the Fock space. Starting from the harmonic oscillator commutation relation [a,a†]=1 we obtain the Arik-Coon and Macfarlane-Biedenharn oscillators at the first and second steps, respectively, followed by a sequence of multiparameter generalizations. Several other types of deformed commutation relations related to the treatment of integrable models and to parastatistics are also obtained. The "generic" form consists of a linear combination of exponentials of the number operator, and the various recursive families can be classified according to the number of free linear parameters involved, which depends on the form of the initial commutator.
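
    In this notation (with a† the creation operator and n̂ the number operator), the first recursion step can be reconstructed as below; this is a hedged reading of the abstract, not text from the paper.

```latex
% First step (reconstruction): minimally deform the oscillator bracket
[a, a^{\dagger}] = 1
\;\longrightarrow\;
[a, a^{\dagger}]_{q_1} \equiv a\,a^{\dagger} - q_1\, a^{\dagger} a = 1 ,
% which, on the Fock space where
a^{\dagger} a = \frac{q_1^{\hat n} - 1}{q_1 - 1} ,
% is equivalent to the undeformed-bracket commutator
[a, a^{\dagger}] = q_1^{\hat n}
% -- the Arik-Coon oscillator: a single exponential of the number
% operator, consistent with the "generic" form quoted in the abstract.
```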

  11. Differentially Private Empirical Risk Minimization

    PubMed Central

    Chaudhuri, Kamalika; Monteleoni, Claire; Sarwate, Anand D.

    2011-01-01

    Privacy-preserving machine learning algorithms are crucial for the increasingly common setting in which personal data, such as medical or financial records, are analyzed. We provide general techniques to produce privacy-preserving approximations of classifiers learned via (regularized) empirical risk minimization (ERM). These algorithms are private under the ε-differential privacy definition due to Dwork et al. (2006). First we apply the output perturbation ideas of Dwork et al. (2006), to ERM classification. Then we propose a new method, objective perturbation, for privacy-preserving machine learning algorithm design. This method entails perturbing the objective function before optimizing over classifiers. If the loss and regularizer satisfy certain convexity and differentiability criteria, we prove theoretical results showing that our algorithms preserve privacy, and provide generalization bounds for linear and nonlinear kernels. We further present a privacy-preserving technique for tuning the parameters in general machine learning algorithms, thereby providing end-to-end privacy guarantees for the training process. We apply these results to produce privacy-preserving analogues of regularized logistic regression and support vector machines. We obtain encouraging results from evaluating their performance on real demographic and benchmark data sets. Our results show that both theoretically and empirically, objective perturbation is superior to the previous state-of-the-art, output perturbation, in managing the inherent tradeoff between privacy and learning performance. PMID:21892342
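
    A minimal sketch of the output-perturbation baseline discussed above, assuming 2-D features, a toy gradient-descent trainer, and the noise density exp(-(ελn/2)·||b||) appropriate for a 1-Lipschitz loss; none of this reproduces the authors' actual code.

```python
import math
import random

def train_logreg(data, lam, steps=500, lr=0.1):
    """L2-regularized logistic regression by plain gradient descent.

    data is a list of ((x1, x2), y) pairs with labels y in {-1, +1}.
    """
    w = [0.0, 0.0]
    n = len(data)
    for _ in range(steps):
        g = [lam * w[0], lam * w[1]]            # gradient of the L2 term
        for x, y in data:
            s = y * (w[0] * x[0] + w[1] * x[1])
            c = -y / (1.0 + math.exp(s)) / n    # logistic-loss gradient
            g[0] += c * x[0]
            g[1] += c * x[1]
        w = [w[0] - lr * g[0], w[1] - lr * g[1]]
    return w

def output_perturb(w, n, lam, eps, rng):
    """Release w + b, with density of b proportional to exp(-beta*||b||)."""
    beta = eps * lam * n / 2.0                  # assumed noise scale
    norm = rng.gammavariate(2, 1.0 / beta)      # ||b|| ~ Gamma(d=2, 1/beta)
    theta = rng.uniform(0.0, 2.0 * math.pi)     # uniform direction in 2-D
    return [w[0] + norm * math.cos(theta), w[1] + norm * math.sin(theta)]
```

    Smaller ε (stronger privacy) inflates the noise norm and degrades accuracy, which is the privacy/learning tradeoff the abstract refers to.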

  12. Against Explanatory Minimalism in Psychiatry.

    PubMed

    Thornton, Tim

    2015-01-01

    The idea that psychiatry contains, in principle, a series of levels of explanation has been criticized not only as empirically false but also, by Campbell, as unintelligible because it presupposes a discredited pre-Humean view of causation. Campbell's criticism is based on an interventionist-inspired denial that mechanisms and rational connections underpin physical and mental causation, respectively, and hence underpin levels of explanation. These claims echo some superficially similar remarks in Wittgenstein's Zettel. But attention to the context of Wittgenstein's remarks suggests a reason to reject explanatory minimalism in psychiatry and reinstate a Wittgensteinian notion of levels of explanation. Only in a context broader than the one provided by interventionism is the ascription of propositional attitudes, even in the puzzling case of delusions, justified. Such a view, informed by Wittgenstein, can reconcile the idea that the ascription of mental phenomena presupposes a particular level of explanation with the rejection of an a priori claim about its connection to a neurological level of explanation.

  14. LESSons in minimally invasive urology.

    PubMed

    Dev, Harveer; Sooriakumaran, Prasanna; Tewari, Ashutosh; Rane, Abhay

    2011-05-01

    Since the introduction of laparoscopic surgery, the promise of lower postoperative morbidity and improved cosmesis has been achieved. LaparoEndoscopic Single Site (LESS) surgery potentially takes this further. Following the first human urological LESS report in 2007, numerous case series have emerged, as well as studies comparing LESS with standard laparoscopy. Technological developments in instrumentation, access and optics devices are overcoming some of the challenges that are raised when operating through a single site. Further advances in the technique have included the incorporation of robotics (R-LESS), which exploit the ergonomic benefits of ex vivo robotic platforms in an attempt to further improve the implementation of LESS procedures. In the future, urologists may be able to benefit from in vivo micro-robots that will allow the manipulation of tissue from internal repositionable platforms. The use of magnetic anchoring and guidance systems (MAGS) might allow the external manoeuvring of intra-corporeal instruments to reduce clashing and facilitate triangulation. However, the final promise in minimally invasive surgery is natural orifice transluminal endoscopic surgery (NOTES), with its scarless technique. It remains to be seen whether NOTES, LESS, or any of these future developments will prove their clinical utility over standard laparoscopic methods.

  15. Medical waste: a minimal hazard.

    PubMed

    Keene, J H

    1991-11-01

    Medical waste is a subset of municipal waste, and regulated medical waste comprises less than 1% of the total municipal waste volume in the United States. As part of the overall waste stream, medical waste does contribute in a relative way to the aesthetic damage of the environment. Likewise, some small portion of the total release of hazardous chemicals and radioactive materials is derived from medical wastes. These comments can be made about any generated waste, regulated or unregulated. Healthcare professionals, including infection control personnel, microbiologists, public health officials, and others, have unsuccessfully argued that there is no evidence that past methods of treatment and disposal of regulated medical waste constitute any public health hazard. Historically, discovery of environmental contamination by toxic chemical disposal has followed assurances that the material was being disposed of in a safe manner. Therefore, a cynical public and its elected officials have demanded proof that the treatment and disposal of medical waste (i.e., infectious waste) do not constitute a public health hazard. Existent studies on municipal waste provide that proof. In order to argue that the results of these municipal waste studies are demonstrative of the minimal potential infectious environmental impact and lack of public health hazard associated with medical waste, we must accept the following: that the pathogens are the same whether they come from the hospital or the community, and that the municipal waste studied contained waste materials we now define as regulated medical waste.(ABSTRACT TRUNCATED AT 250 WORDS)

  16. Architectural Methodology Report

    NASA Technical Reports Server (NTRS)

    Dhas, Chris

    2000-01-01

    The establishment of conventions between two communicating entities in the end systems is essential for communications. Examples of the kind of decisions that need to be made in establishing a protocol convention include the nature of the data representation, the format and the speed of the data representation over the communications path, and the sequence of control messages (if any) which are sent. One of the main functions of a protocol is to establish a standard path between the communicating entities. This is necessary to create a virtual communications medium with certain desirable characteristics. In essence, it is the function of the protocol to transform the characteristics of the physical communications environment into a more useful virtual communications model. The final function of a protocol is to establish standard data elements for communications over the path; that is, the protocol serves to create a virtual data element for exchange. Other systems may be constructed in which the transferred element is a program or a job. Finally, there are special purpose applications in which the element to be transferred may be a complex structure such as all or part of a graphic display. NASA's Glenn Research Center (GRC) defines and develops advanced technology for high priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to describe the methodologies used in developing a protocol architecture for an in-space Internet node. The node would support NASA's four mission areas: Earth Science; Space Science; Human Exploration and Development of Space (HEDS); Aerospace Technology. This report presents the methodology for developing the protocol architecture. The methodology addresses the architecture for a computer communications environment. It does not address an analog voice architecture.

  17. Probabilistic river forecast methodology

    NASA Astrophysics Data System (ADS)

    Kelly, Karen Suzanne

    1997-09-01

    The National Weather Service (NWS) operates deterministic conceptual models to predict the hydrologic response of a river basin to precipitation. The output from these models are forecasted hydrographs (time series of the future river stage) at certain locations along a river. In order for the forecasts to be useful for optimal decision making, the uncertainty associated with them must be quantified. A methodology is developed for this purpose that (i) can be implemented with any deterministic hydrologic model, (ii) receives a probabilistic forecast of precipitation as input, (iii) quantifies all sources of uncertainty, (iv) operates in real-time and within computing constraints, and (v) produces probability distributions of future river stages. The Bayesian theory which supports the methodology involves transformation of a distribution of future precipitation into one of future river stage, and statistical characterization of the uncertainty in the hydrologic model. This is accomplished by decomposing total uncertainty into that associated with future precipitation and that associated with the hydrologic transformations. These are processed independently and then integrated into a predictive distribution which constitutes a probabilistic river stage forecast. A variety of models are presented for implementation of the methodology. In the most general model, a probability of exceedance associated with a given future hydrograph is specified. In the simplest model, a probability of exceedance associated with a given future river stage is specified. In conjunction with the Ohio River Forecast Center of the NWS, the simplest model is used to demonstrate the feasibility of producing probabilistic river stage forecasts for a river basin located in headwaters. Previous efforts to quantify uncertainty in river forecasting have only considered selected sources of uncertainty, been specific to a particular hydrologic model, or have not obtained an entire probability
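
    The core idea (propagating a probabilistic precipitation forecast through a deterministic hydrologic model to obtain a predictive stage distribution) can be sketched with a Monte Carlo loop; the linear stage model and the exponential precipitation forecast below are invented stand-ins, not NWS models.

```python
import random

def toy_stage_model(precip_mm, base_stage_m=2.0):
    """Deterministic stand-in for an NWS conceptual hydrologic model."""
    return base_stage_m + 0.05 * precip_mm      # assumed linear response

def stage_exceedance_prob(precip_sampler, threshold_m, n=10_000):
    """Monte Carlo estimate of P(stage > threshold)."""
    hits = sum(toy_stage_model(precip_sampler()) > threshold_m
               for _ in range(n))
    return hits / n

rng = random.Random(42)
sampler = lambda: rng.expovariate(1.0 / 20.0)   # forecast: mean 20 mm rain
p = stage_exceedance_prob(sampler, threshold_m=3.0)
```

    With these stand-in numbers, stage exceeds 3 m exactly when precipitation exceeds 20 mm, so the estimate converges to exp(-1), about 0.37.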

  18. Injector element characterization methodology

    NASA Technical Reports Server (NTRS)

    Cox, George B., Jr.

    1988-01-01

    Characterization of liquid rocket engine injector elements is an important part of the development process for rocket engine combustion devices. Modern nonintrusive instrumentation for flow velocity and spray droplet size measurement, and automated, computer-controlled test facilities allow rapid, low-cost evaluation of injector element performance and behavior. Application of these methods in rocket engine development, paralleling their use in gas turbine engine development, will reduce rocket engine development cost and risk. The Alternate Turbopump (ATP) Hot Gas Systems (HGS) preburner injector elements were characterized using such methods, and the methodology and some of the results obtained will be shown.

  19. SHRS RRCL selection methodology

    SciTech Connect

    Trainer, J E; Delaney, H M

    1980-05-01

    The CRBRP Safety Related Reliability Program requires a listing of those design items whose failure would impair the plant's ability to accomplish a safe shutdown or shutdown heat removal mission. This listing is referred to as the Reliability Related Components List (RRCL) and, heretofore, has been based on sound engineering judgement. As a check for completeness and as a means of providing a probabilistic-based rationale for component selection, this report presents a methodology by which this selection can be made for each of the decay heat removal loops comprising the Shutdown Heat Removal System (SHRS) using existing EG&G fault tree models.

  20. Emergency exercise methodology

    SciTech Connect

    Klimczak, C.A.

    1993-01-01

    Competence for proper response to hazardous materials emergencies is enhanced and effectively measured by exercises which test plans and procedures and validate training. Emergency exercises are most effective when realistic criteria are used and a sequence of events is followed. The scenario is developed from pre-determined exercise objectives based on hazard analyses and actual plans and procedures. The scenario should address findings from previous exercises and actual emergencies. Exercise rules establish the extent of play and address contingencies during the exercise. All exercise personnel are assigned roles as players, controllers, or evaluators. These participants should receive specialized training in advance. A methodology for writing an emergency exercise plan will be detailed.

  1. Methodology of stereotactic biopsy.

    PubMed

    Carapella, C M; Mastrostefano, R; Raus, L; Riccio, A

    1989-01-01

    Great technological improvements in neurosurgical tools and neuroradiological imaging have led to the widespread adoption of stereotactic techniques. They are crucial for the diagnosis and treatment of intracranial expanding lesions that are small or located in sites inaccessible to conventional techniques. The authors describe the most common systems and methodologies for stereotactic biopsy. They stress the importance of performing serial explorations, which can provide evidence of the heterogeneity of the neoplastic lesion and of the infiltration of the brain adjacent to the tumor.

  2. RHIC DATA CORRELATION METHODOLOGY.

    SciTech Connect

    Michnoff, R.; D'Ottavio, T.; Hoff, L.; MacKay, W.; Satogata, T.

    1999-03-29

    A requirement for RHIC data plotting software and physics analysis is the correlation of data from all accelerator data gathering systems. Data correlation provides the capability for a user to request a plot of multiple data channels vs. time, and to make meaningful time-correlated data comparisons. The task of data correlation for RHIC requires careful consideration because data acquisition triggers are generated from various asynchronous sources including events from the RHIC Event Link, events from the two Beam Sync Links, and other unrelated clocks. In order to correlate data from asynchronous acquisition systems a common time reference is required. The RHIC data correlation methodology will allow all RHIC data to be converted to a common wall clock time, while still preserving native acquisition trigger information. A data correlation task force team, composed of the authors of this paper, has been formed to develop data correlation design details and provide guidelines for software developers. The overall data correlation methodology will be presented in this paper.
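
    The conversion to a common wall-clock time can be sketched as follows (a hypothetical simplification: the per-system epochs, tick periods, and the nearest-neighbour pairing are illustrative, not the RHIC task force's actual design):

```python
from bisect import bisect_left

def to_wall_clock(ticks, epoch_s, tick_period_s):
    # Convert a system's native trigger counts to common wall-clock seconds,
    # given a measured epoch (wall-clock time of tick 0) and tick period.
    # The native tick values can be kept alongside the converted times.
    return [epoch_s + t * tick_period_s for t in ticks]

def correlate(times_a, values_a, times_b, values_b, tolerance_s):
    """Pair each sample of channel A with the nearest-in-time sample of
    channel B, keeping only pairs closer than tolerance_s apart."""
    pairs = []
    for ta, va in zip(times_a, values_a):
        i = bisect_left(times_b, ta)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times_b)]
        j = min(candidates, key=lambda k: abs(times_b[k] - ta))
        if abs(times_b[j] - ta) <= tolerance_s:
            pairs.append((va, values_b[j]))
    return pairs

# Two asynchronous systems with different epochs and tick periods,
# e.g. an event-link channel at 10 ms and a beam-sync channel at 13 ms.
t_a = to_wall_clock(range(5), epoch_s=100.000, tick_period_s=0.010)
t_b = to_wall_clock(range(5), epoch_s=100.002, tick_period_s=0.013)
pairs = correlate(t_a, [1, 2, 3, 4, 5], t_b, [10, 20, 30, 40, 50], 0.006)
```

    Once both channels share a wall-clock axis, a time-correlated plot of channel A vs. channel B reduces to pairing samples this way (or interpolating one channel onto the other's times).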

  3. Relative Hazard Calculation Methodology

    SciTech Connect

    Strenge, D.L.; White, M.K.; Stenner, R.D.; Andrews, W.B.

    1999-09-07

    The methodology presented in this document was developed to provide a means of calculating relative hazard (RH) ratios for use in developing useful graphic illustrations. The RH equation, as presented in this methodology, is primarily a collection of key factors relevant to understanding the hazards and risks associated with projected risk management activities. The RH equation has the potential for much broader application than generating risk profiles. For example, it can be used to compare one risk management activity with another, instead of just comparing it to a fixed baseline as was done for the risk profiles. If the appropriate source term data are available, it could be used in its non-ratio form to estimate absolute values of the associated hazards. These estimated values of hazard could then be examined to help understand which risk management activities are addressing the higher hazard conditions at a site. Graphics could be generated from these absolute hazard values to compare high-hazard conditions. If the RH equation is used in this manner, care must be taken to specifically define and qualify the estimated absolute hazard values (e.g., identify which factors were considered and which ones tended to drive the hazard estimation).

  4. [Minimally invasive percutaneous nephrolitholapaxy (MIP)].

    PubMed

    Nagele, U; Schilling, D; Anastasiadis, A G; Walcher, U; Sievert, K D; Merseburger, A S; Kuczyk, M; Stenzl, A

    2008-09-01

    Minimally invasive percutaneous nephrolitholapaxy (MIP) was developed to combine the excellent stone-free rates of the conventional percutaneous nephrolitholapaxy (PCNL) technique with the low morbidity of the miniaturized PCNL (Mini-Perc) and, at the same time, achieve a high level of patient comfort. The procedure is characterized not only by the diameter of the miniaturized 18-Fr Amplatz sheath that was adopted from the Mini-Perc but also by the following features: ultrasound-guided puncture of the kidney; single-step dilatation of the access tract; ballistic lithotripsy; a low-pressure irrigation system together with stone retraction by irrigation with a specially designed nephroscope sheath, for the so-called vacuum cleaner effect; and a sealed and tubeless access tract with primary closure of the channel independent of hemorrhage and without a second-look procedure. The results of the first 57 patients demonstrate primary stone-free rates of 92.9% with operating times averaging 62 (25-123) min. Severe complications, such as sepsis or bleeding requiring blood transfusion, did not occur. The high and predictable stone-free rate and a low morbidity comparable to that of ureteroscopy and extracorporeal shock-wave lithotripsy make MIP an attractive option for patients and urologists. The "vacuum cleaner effect" with quick removal of stone fragments reduces operating time and prevents new stone formation by avoiding residual fragments. The direct and primary closure of the access tract increases patient comfort and is justified by the reintervention rate of less than 8% in the presented cohort. The lack of a need for second-look nephroscopies, the vacuum cleaner effect, improved patient comfort without nephrostomy tubes, as well as surgery times comparable to that of traditional PCNL demonstrate a consequent evolution of the Mini-Perc. MIP therefore represents a promising and future-oriented module in modern stone therapy.

  5. Situating methodology within qualitative research.

    PubMed

    Kramer-Kile, Marnie L

    2012-01-01

    Qualitative nurse researchers are required to make deliberate and sometimes complex methodological decisions about their work. Methodology in qualitative research is a comprehensive approach in which theory (ideas) and method (doing) are brought into close alignment. It can be difficult, at times, to understand the concept of methodology. The purpose of this research column is to: (1) define qualitative methodology; (2) illuminate the relationship between epistemology, ontology and methodology; (3) explicate the connection between theory and method in qualitative research design; and (4) highlight relevant examples of methodological decisions made within cardiovascular nursing research. Although there is no "one set way" to do qualitative research, all qualitative researchers should account for the choices they make throughout the research process and articulate their methodological decision-making along the way.

  6. On eco-efficient technologies to minimize industrial water consumption

    NASA Astrophysics Data System (ADS)

    Amiri, Mohammad C.; Mohammadifard, Hossein; Ghaffari, Ghasem

    2016-07-01

    Purpose - Water scarcity will place further stress on available water systems and decrease water security in many areas. Innovative methods to minimize industrial water usage and waste production are therefore of paramount importance for extending fresh water resources, which are the main life-support systems in many arid regions of the world. This paper demonstrates that there are good opportunities for many industries to save water and decrease waste water in the softening process by substituting traditional methods with eco-friendly ones. The patented puffing method is an eco-efficient and viable technology for water saving and waste reduction in the lime softening process. Design/methodology/approach - The lime softening process (LSP) is highly sensitive to its chemical reaction conditions. Optimal monitoring not only minimizes the sludge that must be disposed of but also reduces the operating costs of water conditioning. The weakness of the current (regular) control of LSP based on chemical analysis has been demonstrated experimentally and compared with the eco-efficient puffing method. Findings - This paper demonstrates that there is a good opportunity for many industries to save water and decrease waste water in the softening process by substituting the traditional method with the puffing method, a patented eco-efficient technology. Originality/value - Details of the innovative work required to minimize industrial water usage and waste production are outlined in this paper. Employing the novel puffing method for monitoring of the lime softening process saves a considerable amount of water while reducing chemical sludge.

  7. General prevention and risk minimization in LCA: a combined approach.

    PubMed

    Sleeswijk, Anneke Wegener

    2003-01-01

    Methods for life cycle assessment of products (LCA) are most often based on the general prevention principle, as opposed to the risk minimization principle. Here, the desirability and feasibility of a combined approach are discussed, along with the conditions for elaboration in the framework of LCA methodology, and the consequences for LCA practice. A combined approach provides a separate assessment of above- and below-threshold pollution, offering the possibility to combat above-threshold impacts with priority. Spatial differentiation in fate, exposure, and effect modelling is identified to play a central role in the implementation. The collection of region-specific data turns out to be the most elaborate requirement for the implementation in both methodology and practice. A methodological framework for the construction of characterization factors is provided. Along with spatial differentiation of existing parameters, two newly introduced spatial parameters play a key role: the sensitivity factor and the threshold factor. The practicability of the proposed procedure is illustrated by an example of its application. Provided that data availability is reasonable, the development of separate LCA characterization factors for the respective assessment of pollution levels above and below environmental threshold values seems to be a feasible task that may add to LCA credibility. PMID:12635961

  8. Glycaemic index methodology.

    PubMed

    Brouns, F; Bjorck, I; Frayn, K N; Gibbs, A L; Lang, V; Slama, G; Wolever, T M S

    2005-06-01

    The glycaemic index (GI) concept was originally introduced to classify different sources of carbohydrate (CHO)-rich foods, usually having an energy content of >80% from CHO, according to their effect on post-meal glycaemia. It was assumed to apply to foods that primarily deliver available CHO, causing hyperglycaemia. Low-GI foods were classified as being digested and absorbed slowly and high-GI foods as being rapidly digested and absorbed, resulting in different glycaemic responses. Low-GI foods were found to induce benefits on certain risk factors for CVD and diabetes. Accordingly it has been proposed that GI classification of foods and drinks could be useful to help consumers make 'healthy food choices' within specific food groups. Classification of foods according to their impact on blood glucose responses requires a standardised way of measuring such responses. The present review discusses the most relevant methodological considerations and highlights specific recommendations regarding number of subjects, sex, subject status, inclusion and exclusion criteria, pre-test conditions, CHO test dose, blood sampling procedures, sampling times, test randomisation and calculation of glycaemic response area under the curve. Altogether, these technical recommendations will help to implement or reinforce measurement of GI in laboratories and help to ensure quality of results. Since there is current international interest in alternative ways of expressing glycaemic responses to foods, some of these methods are discussed.
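
    The area-under-the-curve calculation that such protocols standardize is typically an incremental AUC taken above the fasting baseline, with the GI expressed relative to a reference food. A minimal sketch (clipping sub-baseline segments at the sample points is a common simplification of the exact geometric method, and the glucose values below are invented):

```python
def incremental_auc(times_min, glucose, baseline=None):
    """Incremental area under the glycaemic response curve (trapezoidal
    rule), counting only area above the fasting (t = 0) baseline."""
    if baseline is None:
        baseline = glucose[0]
    area = 0.0
    for i in range(1, len(times_min)):
        dt = times_min[i] - times_min[i - 1]
        y0 = max(glucose[i - 1] - baseline, 0.0)
        y1 = max(glucose[i] - baseline, 0.0)
        area += 0.5 * (y0 + y1) * dt
    return area

def glycaemic_index(test_response, reference_response, times_min):
    # GI = 100 * iAUC(test food) / iAUC(reference food, e.g. glucose drink).
    return (100.0 * incremental_auc(times_min, test_response)
            / incremental_auc(times_min, reference_response))

times = [0, 15, 30, 45, 60, 90, 120]                # sampling times (min)
glucose_ref = [5.0, 7.5, 8.5, 7.8, 7.0, 6.0, 5.2]   # reference drink (mmol/l)
test_food = [5.0, 6.0, 6.8, 6.5, 6.2, 5.6, 5.1]     # lower-GI test food (mmol/l)
gi = glycaemic_index(test_food, glucose_ref, times)
```

    In practice each subject's GI is computed against that subject's own reference response and the values are then averaged across subjects, which is why the review's recommendations on subject numbers and repeat reference tests matter.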

  9. Engineering radioecology: Methodological considerations

    SciTech Connect

    Nechaev, A.F.; Projaev, V.V.; Sobolev, I.A.; Dmitriev, S.A.

    1995-12-31

    The term "radioecology" has been widely recognized in scientific and technical societies. At the same time, this scientific school (radioecology) does not have a precise, generally acknowledged structure, a unified methodological basis, fixed subjects of investigation, etc. In other words, radioecology is a vast, important, but rather amorphous conglomerate of various ideas, amalgamated mostly by their involvement in biospheric effects of ionizing radiation and some conceptual stereotypes. This paradox was acceptable up to a certain time. However, with the end of the Cold War and the remarkable political changes in the world, it has become possible to move the problem of environmental restoration from the scientific sphere into particularly practical terms. Already the first steps clearly showed the imperfection of existing technologies and managerial and regulatory schemes; a lack of qualified specialists, relevant methods, and techniques; uncertainties in the methodology of decision-making, etc. Thus, building up (or perhaps structuring) a special scientific and technological basis, which the authors call "engineering radioecology", seems to be an important task. In this paper they endeavor to substantiate this thesis and to suggest some preliminary ideas concerning the subject matter of engineering radioecology.

  10. Cancer Cytogenetics: Methodology Revisited

    PubMed Central

    2014-01-01

    The Philadelphia chromosome was the first genetic abnormality discovered in cancer (in 1960), and it was found to be consistently associated with CML. The description of the Philadelphia chromosome ushered in a new era in the field of cancer cytogenetics. Accumulating genetic data have been shown to be intimately associated with the diagnosis and prognosis of neoplasms; thus, karyotyping is now considered a mandatory investigation for all newly diagnosed leukemias. The development of FISH in the 1980s overcame many of the drawbacks of assessing the genetic alterations in cancer cells by karyotyping. Karyotyping of cancer cells remains the gold standard since it provides a global analysis of the abnormalities in the entire genome of a single cell. However, subsequent methodological advances in molecular cytogenetics based on the principle of FISH that were initiated in the early 1990s have greatly enhanced the efficiency and accuracy of karyotype analysis by marrying conventional cytogenetics with molecular technologies. In this review, the development, current utilization, and technical pitfalls of both the conventional and molecular cytogenetics approaches used for cancer diagnosis over the past five decades will be discussed. PMID:25368816

  11. Minimizing electrode contamination in an electrochemical cell

    DOEpatents

    Kim, Yu Seung; Zelenay, Piotr; Johnston, Christina

    2014-12-09

    An electrochemical cell assembly that is expected to prevent or at least minimize electrode contamination includes one or more getters that trap a component or components leached from a first electrode and prevent, or at least minimize, contamination of a second electrode.

  12. Is goal ascription possible in minimal mindreading?

    PubMed

    Butterfill, Stephen A; Apperly, Ian A

    2016-03-01

    In this response to the commentary by Michael and Christensen, we first explain how minimal mindreading is compatible with the development of increasingly sophisticated mindreading behaviors that involve both executive functions and general knowledge and then sketch one approach to a minimal account of goal ascription. PMID:26901746

  13. Minimally invasive surgery in neonates and infants

    PubMed Central

    Lin, Tiffany; Pimpalwar, Ashwin

    2010-01-01

    Minimally invasive surgery (MIS) has significantly improved the field of surgery, with benefits including shorter operating time, improved recovery time, minimizing stress and pain due to smaller incisions, and even improving mortality. MIS procedures, including their indications, impact, limitations, and possible future evolution in neonates and infants, are discussed in this article. PMID:21180496

  14. Minimally Invasive Mitral Valve Surgery I

    PubMed Central

    Ailawadi, Gorav; Agnihotri, Arvind K.; Mehall, John R.; Wolfe, J. Alan; Hummel, Brian W.; Fayers, Trevor M.; Farivar, R. Saeid; Grossi, Eugene A.; Guy, T. Sloane; Hargrove, W. Clark; Khan, Junaid H.; Lehr, Eric J.; Malaisrie, S. Chris; Murphy, Douglas A.; Rodriguez, Evelio; Ryan, William H.; Salemi, Arash; Segurola, Romualdo J.; Shemin, Richard J.; Smith, J. Michael; Smith, Robert L.; Weldner, Paul W.; Goldman, Scott M.; Lewis, Clifton T. P.; Barnhart, Glenn R.

    2016-01-01

    Abstract Widespread adoption of minimally invasive mitral valve repair and replacement may be fostered by practice consensus and standardization. This expert opinion, first of a 3-part series, outlines current best practices in patient evaluation and selection for minimally invasive mitral valve procedures, and discusses preoperative planning for cannulation and myocardial protection. PMID:27654407

  15. Prioritization Methodology for Chemical Replacement

    NASA Technical Reports Server (NTRS)

    Cruit, W.; Schutzenhofer, S.; Goldberg, B.; Everhart, K.

    1993-01-01

    This project serves to define an appropriate methodology for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weigh the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results are being implemented as a guideline for consideration for current NASA propulsion systems.
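
    A QFD-style weighted matrix of this kind can be sketched as below (the criteria weights, candidate names, and scores are hypothetical, not NASA data; the criteria themselves follow the abstract):

```python
# Hypothetical QFD-style prioritization matrix: each replacement-technology
# candidate is scored 1-9 against weighted criteria, and the weighted sum
# ranks the candidates for development effort.
criteria_weights = {
    "environmental": 0.30,
    "cost": 0.20,
    "safety": 0.25,
    "reliability": 0.15,
    "programmatic": 0.10,
}

candidates = {  # illustrative scores only
    "replacement solvent A": {"environmental": 8, "cost": 4, "safety": 7,
                              "reliability": 6, "programmatic": 5},
    "replacement solvent B": {"environmental": 5, "cost": 8, "safety": 5,
                              "reliability": 7, "programmatic": 6},
}

def prioritize(candidates, weights):
    """Return (candidate, weighted score) pairs sorted best-first."""
    scored = {
        name: sum(weights[c] * score for c, score in scores.items())
        for name, scores in candidates.items()
    }
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

ranking = prioritize(candidates, criteria_weights)
```

    The semiquantitative character of the methodology lies in the scoring: the 1-9 scores come from engineering judgement, while the arithmetic that turns them into a ranking is fixed.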

  16. Development methodology for scientific software

    SciTech Connect

    Cort, G.; Goldstone, J.A.; Nelson, R.O.; Poore, R.V.; Miller, L.; Barrus, D.M.

    1985-01-01

    We present the details of a software development methodology that addresses all phases of the software life cycle, yet is well suited for application by small projects with limited resources. The methodology has been developed at the Los Alamos Weapons Neutron Research (WNR) Facility and was utilized during the recent development of the WNR Data Acquisition Command Language. The methodology emphasizes the development and maintenance of comprehensive documentation for all software components. The impact of the methodology upon software quality and programmer productivity is assessed.

  17. Dosimetric methodology of the ICRP

    SciTech Connect

    Eckerman, K.F.

    1994-12-31

    Establishment of guidance for the protection of workers and members of the public from radiation exposures necessitates estimation of the radiation dose to tissues of the body at risk. The dosimetric methodology formulated by the International Commission on Radiological Protection (ICRP) is intended to be responsive to this need. While developed for radiation protection, elements of the methodology are often applied in addressing other radiation issues; e.g., risk assessment. This chapter provides an overview of the methodology, discusses its recent extension to age-dependent considerations, and illustrates specific aspects of the methodology through a number of numerical examples.
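
    A core element of the ICRP methodology is the effective dose, the tissue-weighted sum of equivalent doses, E = Σ_T w_T H_T. A minimal sketch follows; note it uses the current tissue weighting factors of ICRP Publication 103, whereas this 1994 chapter would have used the earlier ICRP 60 values, and the exposure figures are invented (in internal dosimetry the H_T would come from biokinetic models, not modelled here):

```python
# Tissue weighting factors w_T from ICRP Publication 103 (they sum to 1.0).
W_T = {
    "red bone marrow": 0.12, "colon": 0.12, "lung": 0.12, "stomach": 0.12,
    "breast": 0.12, "remainder tissues": 0.12,
    "gonads": 0.08,
    "bladder": 0.04, "liver": 0.04, "oesophagus": 0.04, "thyroid": 0.04,
    "bone surface": 0.01, "brain": 0.01, "salivary glands": 0.01, "skin": 0.01,
}

def effective_dose(equivalent_doses_sv):
    """Effective dose E = sum_T w_T * H_T in Sv; tissues not listed in the
    exposure are taken to have received no dose."""
    return sum(W_T[tissue] * h_t for tissue, h_t in equivalent_doses_sv.items())

# Illustrative (invented) partial-body exposure: equivalent doses H_T in Sv.
exposure = {"lung": 0.010, "stomach": 0.002, "thyroid": 0.050}
e = effective_dose(exposure)
```

    The weighting collapses a non-uniform dose pattern into a single risk-related quantity, which is what makes the methodology usable for protection guidance and, as the chapter notes, attractive for broader risk assessment.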

  18. Status of sonic boom methodology and understanding

    NASA Technical Reports Server (NTRS)

    Darden, Christine M.; Powell, Clemans A.; Hayes, Wallace D.; George, Albert R.; Pierce, Allan D.

    1989-01-01

    In January 1988, approximately 60 representatives of industry, academia, government, and the military gathered at NASA-Langley for a 2 day workshop on the state-of-the-art of sonic boom physics, methodology, and understanding. The purpose of the workshop was to assess the sonic boom area, to determine areas where additional sonic boom research is needed, and to establish some strategies and priorities in this sonic boom research. Attendees included many internationally recognized sonic boom experts who had been very active in the Supersonic Transport (SST) and Supersonic Cruise Aircraft Research Programs of the 60's and 70's. Summaries of the assessed state-of-the-art and the research needs in theory, minimization, atmospheric effects during propagation, and human response are given.

  19. Minimal change disease: a CD80 podocytopathy?

    PubMed

    Ishimoto, Takuji; Shimada, Michiko; Araya, Carlos E; Huskey, Janna; Garin, Eduardo H; Johnson, Richard J

    2011-07-01

    Minimal change disease is the most common nephrotic syndrome in children. Although the etiology of minimal change disease remains to be elucidated, it has been postulated that it is the result of a circulating T-cell factor that causes podocyte cytoskeleton disorganization leading to increased glomerular capillary permeability and/or changes in glomerular basement membrane heparan sulfate glycosaminoglycans resulting in proteinuria. Minimal change disease has been associated with allergies and Hodgkin disease. Consistent with these associations, a role for interleukin-13 with minimal change disease has been proposed. Furthermore, studies evaluating podocytes also have evolved. Recently, increased expression of CD80 (also termed B7-1) on podocytes was identified as a mechanism for proteinuria. CD80 is inhibited by binding to CTLA-4, which is expressed on regulatory T cells. Recently, we showed that urinary CD80 is increased in minimal change disease patients and limited studies have suggested that it is not commonly present in the urine of patients with other glomerular diseases. Interleukin-13 or microbial products via Toll-like receptors could be factors that induce CD80 expression on podocytes. CTLA-4 appears to regulate CD80 expression in podocytes, and to be altered in minimal change disease patients. These findings lead us to suggest that proteinuria in minimal change disease is caused by persistent CD80 expression in podocytes, possibly initiated by stimulation of these cells by antigens or cytokines.

  20. Methodological Pluralism and Narrative Inquiry

    ERIC Educational Resources Information Center

    Michie, Michael

    2013-01-01

    This paper considers how the integral theory model of Nancy Davis and Laurie Callihan might be enacted using a different qualitative methodology, in this case the narrative methodology. The focus of narrative research is shown to be on "what meaning is being made" rather than "what is happening here" (quadrant 2 rather than…

  1. Adopting a new philosophy: minimal invasion.

    PubMed

    Whitehouse, Joseph A

    2006-06-01

    Dentistry is a dynamic profession with new trends evolving. Minimally invasive dentistry is becoming not just a concept but a way of practicing. Creative people are finding ways, materials, and technology that enable patients to experience less hard-tissue or soft-tissue removal, improved prevention and maintenance, and increased attention to a philosophy of "less is more." The World Congress of Minimally Invasive Dentistry was formed to facilitate the sharing of these new concepts. The members embrace change, and dentistry offers the constant opportunity for such. As the standard of care moves toward minimally invasive dentistry, patients will benefit. PMID:16792118

  2. Technology applications for radioactive waste minimization

    SciTech Connect

    Devgun, J.S.

    1994-07-01

    The nuclear power industry has achieved one of the most successful examples of waste minimization. The annual volume of low-level radioactive waste shipped for disposal per reactor has decreased to approximately one-fifth of what it was a decade ago. In addition, the curie content of the total waste shipped for disposal has decreased. This paper will discuss the regulatory drivers and economic factors for waste minimization and describe the application of technologies for achieving waste minimization for low-level radioactive waste, with examples from the nuclear power industry.

  3. Minimally Invasive Cardiovascular Surgery: Incisions and Approaches

    PubMed Central

    Langer, Nathaniel B.; Argenziano, Michael

    2016-01-01

    Throughout the modern era of cardiac surgery, most operations have been performed via median sternotomy with cardiopulmonary bypass. This paradigm is changing, however, as cardiovascular surgery is increasingly adopting minimally invasive techniques. Advances in patient evaluation, instrumentation, and operative technique have allowed surgeons to perform a wide variety of complex operations through smaller incisions and, in some cases, without cardiopulmonary bypass. With patients desiring less invasive operations and the literature supporting decreased blood loss, shorter hospital length of stay, improved postoperative pain, and better cosmesis, minimally invasive cardiac surgery should be widely practiced. Here, we review the incisions and approaches currently used in minimally invasive cardiovascular surgery. PMID:27127555

  4. Structural design methodology for large space structures

    NASA Astrophysics Data System (ADS)

    Dornsife, Ralph J.

    1992-02-01

    The Department of Defense requires research and development in designing, fabricating, deploying, and maintaining large space structures (LSS) in support of Army and Strategic Defense Initiative military objectives. Because of their large size, extreme flexibility, and the unique loading conditions in the space environment, LSS will present engineers with problems unlike those encountered in designing conventional civil engineering or aerospace structures. LSS will require sophisticated passive damping and active control systems in order to meet stringent mission requirements. These structures must also be optimally designed to minimize high launch costs. This report outlines a methodology for the structural design of LSS. It includes a definition of mission requirements, structural modeling and analysis, passive damping and active control system design, ground-based testing, payload integration, on-orbit system verification, and on-orbit assessment of structural damage. In support of this methodology, analyses of candidate LSS truss configurations are presented, and an algorithm correlating ground-based test behavior to expected microgravity behavior is developed.

  6. Minimization of power consumption during charging of superconducting accelerating cavities

    NASA Astrophysics Data System (ADS)

    Bhattacharyya, Anirban Krishna; Ziemann, Volker; Ruber, Roger; Goryashko, Vitaliy

    2015-11-01

    Radio frequency cavities, used to accelerate charged particle beams, must be charged to their nominal voltage before the beam can be injected into them. The standard procedure for such cavity filling is to use a step charging profile. However, during the initial stages of such a filling process, a substantial amount of the total energy is wasted in reflection for superconducting cavities because of their extremely narrow bandwidth. The paper presents a novel strategy for charging cavities which reduces total energy reflection. We use variational calculus to obtain an analytical expression for the optimal charging profile. The reflected and required energies and the generator peak power are compared between the charging schemes, and practical aspects (saturation, efficiency, and gain characteristics) of power sources (tetrodes, IOTs, and solid-state power amplifiers) are considered and analysed. The paper presents a methodology to identify the optimal charging scheme for different power sources so as to minimize the total energy requirement.
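
    The wastefulness of step charging can be reproduced in a toy first-order cavity-envelope model (an illustrative simulation, not the paper's variational solution; the time constant, fill target, and ramp length are arbitrary choices):

```python
# Toy first-order cavity-envelope model: dV/dt = (g(t) - V)/tau, with the
# instantaneous reflected power taken as (g - V)**2.  We compare the energy
# reflected by a step drive against a linear ramp, integrating until the
# cavity reaches 95% of its nominal voltage V0 = 1.

def fill(drive, tau=1.0, v0=1.0, dt=1e-4, t_max=20.0):
    """Euler-integrate the envelope ODE; return the reflected energy
    accumulated by the time V first reaches 0.95 * v0."""
    v = t = reflected = 0.0
    while v < 0.95 * v0 and t < t_max:
        g = drive(t)
        reflected += (g - v) ** 2 * dt
        v += (g - v) / tau * dt
        t += dt
    return reflected

def step(t):
    return 1.0                # step immediately to V0

def ramp(t):
    return min(t / 3.0, 1.0)  # ramp to V0 over 3*tau, then hold

r_step = fill(step)
r_ramp = fill(ramp)           # the ramp wastes markedly less in reflection
```

    Even this crude ramp cuts the reflected energy by roughly half relative to the step, because the large early mismatch between drive and cavity voltage is avoided; the paper's variational profile optimizes this trade-off exactly.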

  7. Testing Gaugino Universality in Minimal Supergravity at the LHC

    NASA Astrophysics Data System (ADS)

    Krislock, Abram; Arnowitt, Richard; Dutta, Bhaskar; Gurrola, Alfredo; Kamon, Teruki; Kolev, Nikolay; Simeon, Paul

    2006-10-01

    Supersymmetry (SUSY) is a leading theory that opens the possibility of unifying the fundamental forces. The well-motivated minimal supergravity (mSUGRA) models predict a particular mass relation among three supersymmetric gaugino states (the gluino, the next-to-lightest neutralino, and the lightest neutralino). This relation, which originates from gaugino mass universality, will give insight into Grand Unified Theories. A previous study showed that the tau lepton must be identified with a transverse energy above 20 GeV to probe the cosmologically allowed mSUGRA parameter space at the LHC. We extend that study by investigating a methodology for testing the mass-universality hypothesis as well as the maximum reach in the gaugino masses.

  8. A non-parametric segmentation methodology for oral videocapillaroscopic images.

    PubMed

    Bellavia, Fabio; Cacioppo, Antonino; Lupaşcu, Carmen Alina; Messina, Pietro; Scardina, Giuseppe; Tegolo, Domenico; Valenti, Cesare

    2014-05-01

    We aim to describe a new non-parametric methodology to support the clinician during the diagnostic process of oral videocapillaroscopy to evaluate peripheral microcirculation. Our methodology, mainly based on wavelet analysis and mathematical morphology to preprocess the images, segments them by minimizing the within-class luminosity variance of both capillaries and background. Experiments were carried out on a set of real microphotographs to validate this approach versus handmade segmentations provided by physicians. By using a leave-one-patient-out approach, we pointed out that our methodology is robust, according to precision-recall criteria (average precision and recall are equal to 0.924 and 0.923, respectively) and it acts as a physician in terms of the Jaccard index (mean and standard deviation equal to 0.858 and 0.064, respectively). PMID:24657094
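    The within-class luminosity variance criterion described here is the same objective minimized by Otsu-style thresholding; a minimal sketch on toy gray levels (the wavelet and morphological preprocessing of the actual pipeline is not reproduced):

```python
import numpy as np

def otsu_threshold(pixels, levels=256):
    """Gray level minimizing the within-class variance of foreground and
    background (equivalently, maximizing the between-class variance)."""
    hist = np.bincount(pixels.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()
    best_t, best_score = 0, -1.0
    for t in range(1, levels):
        w0, w1 = p[:t].sum(), p[t:].sum()        # class probabilities
        if w0 == 0.0 or w1 == 0.0:
            continue
        mu0 = (np.arange(t) * p[:t]).sum() / w0  # class means
        mu1 = (np.arange(t, levels) * p[t:]).sum() / w1
        score = w0 * w1 * (mu0 - mu1) ** 2       # between-class variance
        if score > best_score:
            best_t, best_score = t, score
    return best_t
```

    On a bimodal image (dark capillaries against a bright background, or vice versa) the returned threshold falls between the two modes.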

  10. Academic Achievement and Minimal Brain Dysfunction

    ERIC Educational Resources Information Center

    Edwards, R. Philip; And Others

    1971-01-01

    The investigation provided no evidence that a diagnosis of minimal brain dysfunction based on a pediatric neurological evaluation and/or visual-motor impairment as measured by the Bender-Gestalt, is a useful predictor of academic achievement. (Author)

  11. Minimally Invasive Treatments for Breast Cancer

    MedlinePlus

    ... Interventional Radiology Minimally Invasive Treatments for Breast Cancer Interventional Radiology Treatments Offer New Options and Hope ... have in the fight against breast cancer. About Breast Cancer When breast tissue divides and grows at an ...

  12. Waste minimization and pollution prevention awareness plan

    SciTech Connect

    Not Available

    1991-05-31

    The purpose of this plan is to document the Lawrence Livermore National Laboratory (LLNL) Waste Minimization and Pollution Prevention Awareness Program. The plan specifies those activities and methods that are or will be employed to reduce the quantity and toxicity of wastes generated at the site. The intent of this plan is to respond to and comply with the Department of Energy's (DOE) policy and guidelines concerning the need for pollution prevention. The Plan is composed of a LLNL Waste Minimization and Pollution Prevention Awareness Program Plan and, as attachments, Program- and Department-specific waste minimization plans. This format reflects the fact that waste minimization is considered a line management responsibility and is to be addressed by each of the Programs and Departments. 14 refs.

  13. Genetic algorithms for minimal source reconstructions

    SciTech Connect

    Lewis, P.S.; Mosher, J.C.

    1993-12-01

    Under-determined linear inverse problems arise in applications in which signals must be estimated from insufficient data. In these problems the number of potentially active sources is greater than the number of observations. In many situations, it is desirable to find a minimal source solution. This can be accomplished by minimizing a cost function that accounts both for the compatibility of the solution with the observations and for its "sparseness". Minimizing functions of this form can be a difficult optimization problem. Genetic algorithms are a relatively new and robust approach to the solution of difficult optimization problems, providing a global framework that is not dependent on local continuity or on explicit starting values. In this paper, the authors describe the use of genetic algorithms to find minimal source solutions, using as an example a simulation inspired by the reconstruction of neural currents in the human brain from magnetoencephalographic (MEG) measurements.
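    A hedged sketch of the idea (binary chromosomes select active sources; fitness combines data misfit with a sparseness penalty; the operators, penalty weight and problem sizes are illustrative, not the authors' settings):

```python
import numpy as np

rng = np.random.default_rng(0)

def cost(mask, A, b, lam=0.1):
    """Least-squares misfit of the best solution restricted to the
    selected sources, plus a sparseness penalty on the support size."""
    if not mask.any():
        return float(b @ b)
    x, *_ = np.linalg.lstsq(A[:, mask], b, rcond=None)
    r = b - A[:, mask] @ x
    return float(r @ r) + lam * mask.sum()

def ga_minimal_sources(A, b, pop=40, gens=60, p_mut=0.1):
    """Evolve binary chromosomes, each bit marking an active source."""
    n = A.shape[1]
    population = rng.random((pop, n)) < 0.2               # sparse initial guesses
    for _ in range(gens):
        fitness = np.array([cost(m, A, b) for m in population])
        parents = population[np.argsort(fitness)[: pop // 2]]  # truncation selection
        children = []
        for _ in range(pop - len(parents)):
            i, j = rng.integers(0, len(parents), size=2)
            cut = int(rng.integers(1, n))                 # one-point crossover
            child = np.concatenate([parents[i][:cut], parents[j][cut:]])
            child ^= rng.random(n) < p_mut                # bit-flip mutation
            children.append(child)
        population = np.vstack([parents, *children])
    fitness = np.array([cost(m, A, b) for m in population])
    best = np.argmin(fitness)
    return population[best], float(fitness[best])

# Toy under-determined problem: 10 candidate sources, 6 observations.
A = rng.standard_normal((6, 10))
x_true = np.zeros(10)
x_true[[2, 7]] = [1.5, -2.0]
b = A @ x_true
best_mask, best_cost = ga_minimal_sources(A, b)
```

    Because the search is over discrete supports, no continuity or starting point is needed, which is the property the abstract highlights.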

  14. Controlling molecular transport in minimal emulsions

    PubMed Central

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of ‘minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions. PMID:26797564

  15. Heart bypass surgery - minimally invasive - discharge

    MedlinePlus

    ... coronary artery bypass - discharge; RACAB - discharge; Keyhole heart surgery - discharge ... You had minimally invasive coronary artery bypass surgery on one ... an artery from your chest to create a detour, or bypass, around ...

  16. Controlling molecular transport in minimal emulsions

    NASA Astrophysics Data System (ADS)

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of `minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions.

  17. Sludge minimization technologies--an overview.

    PubMed

    Odegaard, H

    2004-01-01

    The management of wastewater sludge from wastewater treatment plants represents one of the major challenges in wastewater treatment today. In many cases, the cost of sludge treatment exceeds the cost of treating the liquid stream. Therefore the focus on and interest in sludge minimization is steadily increasing. In this paper an overview is given of sludge minimization (sludge mass reduction) options. It is demonstrated that sludge minimization may be a result of reduced production of sludge and/or of disintegration processes that may take place both in the wastewater treatment stage and in the sludge stage. Various sludge disintegration technologies for sludge minimization are discussed, including mechanical methods (focusing on the stirred ball mill, high-pressure homogenizer and ultrasonic disintegrator), chemical methods (focusing on the use of ozone), physical methods (focusing on thermal and thermal/chemical hydrolysis) and biological methods (focusing on enzymatic processes).

  18. Analysis of lipid flow on minimal surfaces

    NASA Astrophysics Data System (ADS)

    Bahmani, Fatemeh; Christenson, Joel; Rangamani, Padmini

    2016-03-01

    Interaction between the bilayer shape and surface flow is important for capturing the flow of lipids in many biological membranes. Recent microscopy evidence has shown that minimal surfaces (planes, catenoids, and helicoids) occur often in cellular membranes. In this study, we explore lipid flow in these geometries using a `stream function' formulation for viscoelastic lipid bilayers. Using this formulation, we derive two-dimensional lipid flow equations for the commonly occurring minimal surfaces in lipid bilayers. We show that for three minimal surfaces (planes, catenoids, and helicoids), the surface flow equations satisfy Stokes flow equations. In helicoids and catenoids, we show that the tangential velocity field is a Killing vector field. Thus, our analysis provides fundamental insight into the flow patterns of lipids on intracellular organelle membranes that are characterized by fixed shapes reminiscent of minimal surfaces.

  19. Minimally Invasive Forefoot Surgery in France.

    PubMed

    Meusnier, Tristan; Mukish, Prikesht

    2016-06-01

    Study groups have been formed in France to advance the use of minimally invasive surgery. These techniques are becoming more frequently used, and their technical nuances continue to evolve. The objective of this article is to raise awareness of current trends in minimally invasive surgery for common diseases of the forefoot. Percutaneous surgery of the forefoot is less developed at this time, but will also be discussed.

  20. The advantages of minimally invasive dentistry.

    PubMed

    Christensen, Gordon J

    2005-11-01

    Minimally invasive dentistry, in cases in which it is appropriate, is a concept that preserves dentitions and supporting structures. In this column, I have discussed several examples of minimally invasive dental techniques. This type of dentistry is gratifying for dentists and appreciated by patients. If more dentists would practice it, the dental profession could enhance the public's perception of its honesty and increase its professionalism as well.

  1. Current research in sonic-boom minimization

    NASA Technical Reports Server (NTRS)

    Darden, C. M.; Mack, R. J.

    1976-01-01

    A review is given of several questions as yet unanswered in the area of sonic-boom research. Efforts, both at Langley and elsewhere, in the areas of minimization, human response, and design techniques, and in developing higher-order propagation methods, are discussed. In addition, a wind-tunnel test program being conducted to assess the validity of minimization methods based on a forward spike in the F-function is described.

  2. Minimally invasive treatment of infected pancreatic necrosis

    PubMed Central

    Cebulski, Włodzimierz; Słodkowski, Maciej; Krasnodębski, Ireneusz W.

    2014-01-01

    Infected pancreatic necrosis is a challenging complication that worsens prognosis in acute pancreatitis. For years, open necrosectomy has been the mainstay treatment option in infected pancreatic necrosis, although surgical debridement still results in high morbidity and mortality rates. Recently, many reports on minimally invasive treatment in infected pancreatic necrosis have been published. This paper presents a review of minimally invasive techniques and attempts to define their role in the management of infected pancreatic necrosis. PMID:25653725

  3. Aortic Valve Surgery: Minimally Invasive Options

    PubMed Central

    Ramlawi, Basel; Bedeir, Kareem; Lamelas, Joseph

    2016-01-01

    Minimally invasive aortic valve surgery has not been adopted by a significant proportion of cardiac surgeons despite proven benefits. This may be related to a high learning curve and technical issues requiring retraining. In this review, we discuss the data for minimally invasive aortic valve surgery and describe our operative technique for both ministernotomy and anterior thoracotomy approaches. We also discuss the advent of novel sutureless valves and how these techniques compare to available transcatheter aortic valve procedures. PMID:27127559

  4. Minimally Invasive Osteotomies of the Calcaneus.

    PubMed

    Guyton, Gregory P

    2016-09-01

    Osteotomies of the calcaneus are powerful surgical tools, representing a critical component of the surgical reconstruction of pes planus and pes cavus deformity. Modern minimally invasive calcaneal osteotomies can be performed safely with a burr through a lateral incision. Although greater kerf is generated with the burr, the effect is modest, can be minimized, and is compatible with many fixation techniques. A hinged jig renders the procedure more reproducible and accessible.

  5. Future of Minimally Invasive Colorectal Surgery.

    PubMed

    Whealon, Matthew; Vinci, Alessio; Pigazzi, Alessio

    2016-09-01

    Minimally invasive surgery is slowly taking over as the preferred operative approach for colorectal diseases. However, many of the procedures remain technically difficult. This article will give an overview of the state of minimally invasive surgery and the many advances that have been made over the last two decades. Specifically, we discuss the introduction of the robotic platform and some of its benefits and limitations. We also describe some newer techniques related to robotics. PMID:27582647

  6. Minimally Invasive Surgery in Gynecologic Oncology

    PubMed Central

    Mori, Kristina M.; Neubauer, Nikki L.

    2013-01-01

    Minimally invasive surgery has been utilized in the field of obstetrics and gynecology as far back as the 1940s when culdoscopy was first introduced as a visualization tool. Gynecologists then began to employ minimally invasive surgery for adhesiolysis and obtaining biopsies but then expanded its use to include procedures such as tubal sterilization (Clyman (1963), L. E. Smale and M. L. Smale (1973), Thompson and Wheeless (1971), Peterson and Behrman (1971)). With advances in instrumentation, the first laparoscopic hysterectomy was successfully performed in 1989 by Reich et al. At the same time, minimally invasive surgery in gynecologic oncology was being developed alongside its benign counterpart. In 1975, Rosenoff et al. reported using peritoneoscopy for pretreatment evaluation in ovarian cancer, and Spinelli et al. reported on using laparoscopy for the staging of ovarian cancer. In 1993, Nichols used operative laparoscopy to perform pelvic lymphadenectomy in cervical cancer patients. The initial goals of minimally invasive surgery, not dissimilar to those of modern medicine, were to decrease the morbidity and mortality associated with surgery and therefore improve patient outcomes and patient satisfaction. This review will summarize the history and use of minimally invasive surgery in gynecologic oncology and also highlight new minimally invasive surgical approaches currently in development. PMID:23997959

  7. Economic impact of minimally invasive lumbar surgery

    PubMed Central

    Hofstetter, Christoph P; Hofer, Anna S; Wang, Michael Y

    2015-01-01

    Cost effectiveness has been demonstrated for traditional lumbar discectomy, lumbar laminectomy as well as for instrumented and noninstrumented arthrodesis. While emerging evidence suggests that minimally invasive spine surgery reduces morbidity, duration of hospitalization, and accelerates return to activites of daily living, data regarding cost effectiveness of these novel techniques is limited. The current study analyzes all available data on minimally invasive techniques for lumbar discectomy, decompression, short-segment fusion and deformity surgery. In general, minimally invasive spine procedures appear to hold promise in quicker patient recovery times and earlier return to work. Thus, minimally invasive lumbar spine surgery appears to have the potential to be a cost-effective intervention. Moreover, novel less invasive procedures are less destabilizing and may therefore be utilized in certain indications that traditionally required arthrodesis procedures. However, there is a lack of studies analyzing the economic impact of minimally invasive spine surgery. Future studies are necessary to confirm the durability and further define indications for minimally invasive lumbar spine procedures. PMID:25793159

  8. Multifunction minimization for programmable logic arrays

    SciTech Connect

    Campbell, J.A.

    1984-01-01

    The problem of minimizing two-level AND/OR Boolean algebraic functions of n inputs and m outputs for implementation on programmable logic arrays (PLAs) is examined. The theory of multiple-output functions and the historical alternative approaches to reckoning the cost of an equation implementation are reviewed. The PLA is shown to be a realization of the least product gate equation cost criterion. The multifunction minimization is dealt with in the context of a directed tree search algorithm developed in previous research. The PLA-oriented minimization is shown to alter the nature of each of the basic tenets of multiple-output minimization used in earlier work. The concept of a non-prime but selectable implicant is introduced. A new cost criterion, the quantum cost, is discussed, and an approximation algorithm utilizing this criterion is developed. A timing analysis of a cyclic resolution algorithm for PLA-based functions is presented. Lastly, the question of efficiency in automated minimization algorithms is examined. The application of the PLA cost criterion is shown to exhibit intrinsic increases in computational efficiency. A minterm classification algorithm is suggested and a PLA minimization algorithm is implemented in the FORTRAN language.
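    The "least product gate" criterion the PLA realizes can be made concrete: a product term used by several outputs occupies a single PLA row, so shared terms count once (the string cube notation below is illustrative):

```python
def pla_cost(output_covers):
    """PLA product-term cost: a term shared between outputs uses one row,
    unlike a per-output sum-of-products gate count."""
    return len(set().union(*output_covers))

# Two outputs sharing the term ab: a per-output gate count sees four
# products, but the PLA needs only three rows.
f1 = {"ab", "a'c"}
f2 = {"ab", "bc"}
```

    Here a per-output count gives 4 while pla_cost([f1, f2]) gives 3, which is why PLA-oriented minimization favors sharing implicants, even non-prime ones.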

  9. Prioritization methodology for chemical replacement

    NASA Technical Reports Server (NTRS)

    Goldberg, Ben; Cruit, Wendy; Schutzenhofer, Scott

    1995-01-01

    This methodology serves to define a system for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semi-quantitative approach derived from quality function deployment techniques (QFD Matrix). QFD is a conceptual map that provides a method of transforming customer wants and needs into quantitative engineering terms. This methodology aims to weigh the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives.

  10. Sequential unconstrained minimization algorithms for constrained optimization

    NASA Astrophysics Data System (ADS)

    Byrne, Charles

    2008-02-01

    The problem of minimizing a function f(x): R^J → R, subject to constraints on the vector variable x, occurs frequently in inverse problems. Even without constraints, finding a minimizer of f(x) may require iterative methods. We consider here a general class of iterative algorithms that find a solution to the constrained minimization problem as the limit of a sequence of vectors, each solving an unconstrained minimization problem. Our sequential unconstrained minimization algorithm (SUMMA) is an iterative procedure for constrained minimization. At the kth step we minimize the function G_k(x) = f(x) + g_k(x) to obtain x^k. The auxiliary functions g_k(x): D ⊆ R^J → R_+ are nonnegative on the set D, each x^k is assumed to lie within D, and the objective is to minimize the continuous function f: R^J → R over x in the set C = cl(D), the closure of D. We assume that such minimizers exist, and denote one such by x̂. We assume that the functions g_k(x) satisfy the inequalities 0 ≤ g_k(x) ≤ G_{k−1}(x) − G_{k−1}(x^{k−1}), for k = 2, 3, .... Using this assumption, we show that the sequence {f(x^k)} is decreasing and converges to f(x̂). If the restriction of f(x) to D has bounded level sets, which happens if x̂ is unique and f(x) is closed, proper and convex, then the sequence {x^k} is bounded, and f(x*) = f(x̂) for any cluster point x*. Therefore, if x̂ is unique, x* = x̂ and {x^k} → x̂. When x̂ is not unique, convergence can still be obtained in particular cases. The SUMMA includes, as particular cases, the well-known barrier- and penalty-function methods, the simultaneous multiplicative algebraic reconstruction technique (SMART), the proximal minimization algorithm of Censor and Zenios, the entropic proximal methods of Teboulle, as well as certain cases of gradient descent and the Newton-Raphson method. The proof techniques used for SUMMA can be extended to obtain related results for the induced proximal
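    The barrier-function case of SUMMA can be sketched concretely: minimize f(x) = (x − 2)^2 over C = {x ≤ 1} by minimizing G_k(x) = f(x) − μ_k·log(1 − x) with μ_k ↓ 0. Each inner problem is unconstrained on x < 1, and f(x^k) decreases to f(x̂) = 1 at x̂ = 1 (this particular f and barrier are illustrative choices, not taken from the paper):

```python
import math

def barrier_minimizer(mu):
    """Unconstrained minimizer of G(x) = (x - 2)^2 - mu*log(1 - x) on x < 1.
    Stationarity 2(x - 2) + mu/(1 - x) = 0 gives 2x^2 - 6x + (4 - mu) = 0,
    whose root below 1 is returned."""
    return (6.0 - math.sqrt(4.0 + 8.0 * mu)) / 4.0

mus = [2.0 ** -k for k in range(1, 25)]   # mu_k decreasing to 0
xs = [barrier_minimizer(mu) for mu in mus]
fs = [(x - 2.0) ** 2 for x in xs]         # f(x^k), monotonically decreasing
```

    The iterates x^k increase toward the constrained minimizer x̂ = 1 and f(x^k) decreases to f(x̂) = 1, exactly the monotone behavior the SUMMA inequality guarantees.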

  11. Mach, methodology, hysteresis and economics

    NASA Astrophysics Data System (ADS)

    Cross, R.

    2008-11-01

    This methodological note examines the epistemological foundations of hysteresis, with particular reference to applications to economic systems. Ernst Mach's principles of the economy of thought are advocated and applied in this assessment.

  12. [Ancient DNA: principles and methodologies].

    PubMed

    De Angelis, Flavio; Scorrano, Gabriele; Rickards, Olga

    2013-01-01

    Paleogenetics is providing increasing evidence about the biological characteristics of ancient populations. This paper examines the guiding principles and methodologies of the study of ancient DNA, with constant reference to the state of the art in this fascinating discipline.

  13. Methodological Problems of Soviet Pedagogy

    ERIC Educational Resources Information Center

    Noah, Harold J., Ed.; Beach, Beatrice S., Ed.

    1974-01-01

    Selected papers presented at the First Scientific Conference of Pedagogical Scholars of Socialist Countries, Moscow, 1971, deal with methodology in relation to science, human development, sociology, psychology, cybernetics, and the learning process. (KM)

  14. Environmental probabilistic quantitative assessment methodologies

    USGS Publications Warehouse

    Crovelli, R.A.

    1995-01-01

    In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: 1) direct assessment, 2) accumulation size, 3) volumetric yield, and 4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz. TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are computational speed and flexibility, making it ideal for a microcomputer. -from Author
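    A probabilistic method of the kind such systems implement can be sketched with Monte Carlo aggregation (the triangular low/mode/high judgments below are illustrative inputs, not the actual TRIAGG, APRAS or FASPU models):

```python
import random

def aggregate(prospects, n=50_000, seed=7):
    """Monte Carlo aggregation: sample each prospect's triangular
    (low, mode, high) judgment, sum across prospects, and report the
    median and 95th-percentile totals."""
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.triangular(lo, hi, mode) for lo, mode, hi in prospects)
        for _ in range(n)
    )
    return totals[n // 2], totals[int(0.95 * n) - 1]

# Two hypothetical prospects, judged as (low, mode, high) volumes.
median, p95 = aggregate([(0.0, 1.0, 2.0), (1.0, 2.0, 3.0)])
```

    An analytic method, where available, replaces the sampling loop with closed-form convolution, which is the speed advantage the abstract notes.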

  15. A methodology for chair evaluation.

    PubMed

    Drury, C G; Coury, B G

    1982-09-01

    A methodology for evaluating a single chair, rather than making a comparison among chairs, was developed from previous chair studies. The methodology was found to be rapid and effective when applied to a prototype chair, giving information to the manufacturer on overall comfort and good and bad points in the design. Testing took place on three tasks and showed that chair comfort is influenced by the task as well as the chair. PMID:15676443

  16. Blackfolds, plane waves and minimal surfaces

    NASA Astrophysics Data System (ADS)

    Armas, Jay; Blau, Matthias

    2015-07-01

    Minimal surfaces in Euclidean space provide examples of possible non-compact horizon geometries and topologies in asymptotically flat space-time. On the other hand, the existence of limiting surfaces in the space-time provides a simple mechanism for making these configurations compact. Limiting surfaces appear naturally in a given space-time by making minimal surfaces rotate but they are also inherent to plane wave or de Sitter space-times in which case minimal surfaces can be static and compact. We use the blackfold approach in order to scan for possible black hole horizon geometries and topologies in asymptotically flat, plane wave and de Sitter space-times. In the process we uncover several new configurations, such as black helicoids and catenoids, some of which have an asymptotically flat counterpart. In particular, we find that the ultraspinning regime of singly-spinning Myers-Perry black holes, described in terms of the simplest minimal surface (the plane), can be obtained as a limit of a black helicoid, suggesting that these two families of black holes are connected. We also show that minimal surfaces embedded in spheres rather than Euclidean space can be used to construct static compact horizons in asymptotically de Sitter space-times.

  17. Minimally invasive procedures on the lumbar spine

    PubMed Central

    Skovrlj, Branko; Gilligan, Jeffrey; Cutler, Holt S; Qureshi, Sheeraz A

    2015-01-01

    Degenerative disease of the lumbar spine is a common and increasingly prevalent condition that is often implicated as the primary reason for chronic low back pain and the leading cause of disability in the western world. Surgical management of lumbar degenerative disease has historically been approached by way of open surgical procedures aimed at decompressing and/or stabilizing the lumbar spine. Advances in technology and surgical instrumentation have led to minimally invasive surgical techniques being developed and increasingly used in the treatment of lumbar degenerative disease. Compared to the traditional open spine surgery, minimally invasive techniques require smaller incisions and decrease approach-related morbidity by avoiding muscle crush injury by self-retaining retractors, preventing the disruption of tendon attachment sites of important muscles at the spinous processes, using known anatomic neurovascular and muscle planes, and minimizing collateral soft-tissue injury by limiting the width of the surgical corridor. The theoretical benefits of minimally invasive surgery over traditional open surgery include reduced blood loss, decreased postoperative pain and narcotics use, shorter hospital length of stay, faster recovery and quicker return to work and normal activity. This paper describes the different minimally invasive techniques that are currently available for the treatment of degenerative disease of the lumbar spine. PMID:25610845

  18. Minimal control power of the controlled teleportation

    NASA Astrophysics Data System (ADS)

    Jeong, Kabgyun; Kim, Jaewan; Lee, Soojoon

    2016-03-01

    We generalize the control power of a perfect controlled teleportation of an entangled three-qubit pure state, suggested by Li and Ghose [Phys. Rev. A 90, 052305 (2014), 10.1103/PhysRevA.90.052305], to the control power of a general controlled teleportation of a multiqubit pure state. Thus, we define the minimal control power, and calculate the values of the minimal control power for a class of general three-qubit Greenberger-Horne-Zeilinger (GHZ) states and the three-qubit W class whose states have zero three-tangles. Moreover, we show that the standard three-qubit GHZ state and the standard three-qubit W state have the maximal values of the minimal control power for the two classes, respectively. This means that the minimal control power can be interpreted as not only an operational quantity of a three-qubit quantum communication but also a degree of three-qubit entanglement. In addition, we calculate the values of the minimal control power for general n -qubit GHZ states and the n -qubit W -type states.

  19. Genetic research on biospecimens poses minimal risk.

    PubMed

    Wendler, David S; Rid, Annette

    2015-01-01

    Genetic research on human biospecimens is increasingly common. However, debate continues over the level of risk that this research poses to sample donors. Some argue that genetic research on biospecimens poses minimal risk; others argue that it poses greater than minimal risk and therefore needs additional requirements and limitations. This debate raises concern that some donors are not receiving appropriate protection or, conversely, that valuable research is being subject to unnecessary requirements and limitations. The present paper attempts to resolve this debate using the widely-endorsed 'risks of daily life' standard. The three extant versions of this standard all suggest that, with proper measures in place to protect confidentiality, most genetic research on human biospecimens poses minimal risk to donors.

  20. Approximate error conjugate gradient minimization methods

    DOEpatents

    Kallman, Jeffrey S

    2013-05-21

    In one embodiment, a method includes selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, calculating an approximate error using the subset of rays, and calculating a minimum in a conjugate gradient direction based on the approximate error. In another embodiment, a system includes a processor for executing logic, logic for selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, logic for calculating an approximate error using the subset of rays, and logic for calculating a minimum in a conjugate gradient direction based on the approximate error. In other embodiments, computer program products, methods, and systems are described capable of using approximate error in constrained conjugate gradient minimization problems.
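    The ray-subset idea can be sketched for a linear tomography-style model Ax = b (the subset fraction, outer/inner iteration counts, and least-squares setting are illustrative, not the patented embodiment): each outer pass draws a random subset of rays and takes a few conjugate-gradient steps using only that subset's error.

```python
import numpy as np

rng = np.random.default_rng(1)

def subset_cg(A, b, n_outer=20, n_inner=5, frac=0.5):
    """Each outer pass draws a random subset of rays (rows of A) and
    takes a few conjugate-gradient steps on the subset's normal
    equations, so the error is only ever computed on the subset."""
    m, n = A.shape
    x = np.zeros(n)
    k = max(1, int(frac * m))
    for _ in range(n_outer):
        rows = rng.choice(m, size=k, replace=False)
        M, v = A[rows].T @ A[rows], A[rows].T @ b[rows]
        r = v - M @ x                 # approximate (subset) residual
        p = r.copy()
        for _ in range(n_inner):
            Mp = M @ p
            denom = p @ Mp
            if denom <= 1e-12:
                break
            alpha = (r @ r) / denom   # minimum along the conjugate direction
            x += alpha * p
            r_new = r - alpha * Mp
            beta = (r_new @ r_new) / (r @ r)
            p, r = r_new + beta * p, r_new
    return x

# Consistent toy system: 60 rays, 20 unknowns.
A = rng.standard_normal((60, 20))
b = A @ rng.standard_normal(20)
x = subset_cg(A, b)
```

    Because every row subset of a consistent system shares the same solution, the subsampled passes still drive the full residual down while each error evaluation costs only a fraction of the rays.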

  1. Minimally invasive surgery for atrial fibrillation.

    PubMed

    Lancaster, Timothy S; Melby, Spencer J; Damiano, Ralph J

    2016-04-01

    The surgical treatment of atrial fibrillation (AF) has been revolutionized over the past two decades through surgical innovation and improvements in endoscopic imaging, ablation technology, and surgical instrumentation. These advances have prompted the development of the less complex and less morbid Cox-Maze IV procedure, and have allowed its adaptation to a minimally invasive right mini-thoracotomy approach that can be used in stand-alone AF ablation and in patients undergoing concomitant mitral and tricuspid valve surgery. Other minimally invasive ablation techniques have been developed for stand-alone AF ablation, including video-assisted pulmonary vein isolation, extended left atrial lesion sets, and a hybrid approach. This review will discuss the tools, techniques, and outcomes of minimally invasive surgical procedures currently being practiced for AF ablation.

  2. Advanced pyrochemical technologies for minimizing nuclear waste

    SciTech Connect

    Bronson, M.C.; Dodson, K.E.; Riley, D.C.

    1994-06-01

    The Department of Energy (DOE) is seeking to reduce the size of the current nuclear weapons complex and consequently minimize operating costs. To meet this DOE objective, the national laboratories have been asked to develop advanced technologies that take uranium and plutonium from retired weapons and prepare them for new weapons, long-term storage, and/or final disposition. Current pyrochemical processes generate residue salts and ceramic wastes that require aqueous processing to remove and recover the actinides. However, the aqueous treatment of these residues generates an estimated 100 liters of acidic transuranic (TRU) waste per kilogram of plutonium in the residue. Lawrence Livermore National Laboratory (LLNL) is developing pyrochemical techniques to eliminate, minimize, or more efficiently treat these residue streams. This paper will present technologies being developed at LLNL on advanced materials for actinide containment, reactors that minimize residues, and pyrochemical processes that remove actinides from waste salts.

  3. Robotically assisted minimally invasive mitral valve surgery

    PubMed Central

    Alwair, Hazaim; Nifong, Wiley L; Chitwood, W Randolph

    2013-01-01

    Over the last decade, increased recognition of the advantages of minimizing surgical trauma by operating through smaller incisions, with its direct impact on reduced postoperative pain, quicker recovery, improved cosmesis, and earlier return to work, has spurred the minimally invasive cardiac surgical revolution. This transition began in the early 1990s with advancements in endoscopic instruments, video and fiberoptic technology, and improvements in perfusion systems for establishing cardiopulmonary bypass (CPB) via peripheral cannulation. Society of Thoracic Surgeons data show that 20% of all mitral valve surgeries are performed using minimally invasive techniques, with half being robotically assisted. This article reviews the current status of robotically assisted mitral valve surgery, its advantages and technical modifications for optimizing clinical outcomes. PMID:24251030

  4. [EVOLUTION OF MINIMALLY INVASIVE CARDIAC SURGERY].

    PubMed

    Fujita, Tomoyuki; Kobayashi, Junjiro

    2016-03-01

    Minimally invasive surgery is an attractive choice for patients undergoing major cardiac surgery. We review the history of minimally invasive valve surgery in this article. Due to many innovations in surgical tools, cardiopulmonary bypass systems, visualization systems, and robotic systems as well as surgical techniques, minimally invasive cardiac surgery has become standard care for valve lesion repair. In particular, aortic cross-clamp techniques and methods for cardioplegia using the Chitwood clamp and root cannula or endoballoon catheter in combination with femoro-femoral bypass systems have made such procedures safer and more practical. On the other hand, robotically assisted surgery has not become standard due to the cost and slow learning curve. However, along with the development of robotics, this less-invasive technique may provide another choice for patients in the near future. PMID:27295770

  5. Minimal perceptrons for memorizing complex patterns

    NASA Astrophysics Data System (ADS)

    Pastor, Marissa; Song, Juyong; Hoang, Danh-Tai; Jo, Junghyo

    2016-11-01

    Feedforward neural networks have been investigated to understand learning and memory, as well as applied to numerous practical problems in pattern classification. It is a rule of thumb that more complex tasks require larger networks. However, the design of optimal network architectures for specific tasks is still an unsolved fundamental problem. In this study, we consider three-layered neural networks for memorizing binary patterns. We developed a new complexity measure of binary patterns, and estimated the minimal network size for memorizing them as a function of their complexity. We formulated the minimal network size for regular, random, and complex patterns. In particular, the minimal size for complex patterns, which are neither ordered nor disordered, was predicted by measuring their Hamming distances from known ordered patterns. Our predictions agree with simulations based on the back-propagation algorithm.
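
    The memorization setup described above can be sketched with a toy three-layered network trained by plain back-propagation. This is an illustrative sketch only: the binary pattern set, network width, learning rate, and iteration count are arbitrary choices, not taken from the paper.

```python
# Sketch: a three-layered perceptron memorizing binary patterns via backprop.
# All sizes and hyperparameters are arbitrary illustration values.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Four 4-bit binary patterns and one target bit each (toy data).
X = np.array([[0, 0, 1, 1], [0, 1, 0, 1], [1, 0, 0, 1], [1, 1, 1, 0]], dtype=float)
y = np.array([[0.0], [1.0], [1.0], [0.0]])

W1 = rng.normal(0.0, 1.0, (4, 8))   # input -> hidden weights
W2 = rng.normal(0.0, 1.0, (8, 1))   # hidden -> output weights

def loss():
    return float(np.mean((sigmoid(sigmoid(X @ W1) @ W2) - y) ** 2))

loss_before = loss()
for _ in range(5000):               # full-batch gradient descent on squared error
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    d_out = (out - y) * out * (1.0 - out)   # back-propagated output delta
    d_h = (d_out @ W2.T) * h * (1.0 - h)    # hidden-layer delta
    W2 -= 0.5 * h.T @ d_out
    W1 -= 0.5 * X.T @ d_h

loss_after = loss()
print(loss_before, loss_after)
```

    Scaling the hidden width up or down for harder or easier pattern sets is the knob the paper's complexity measure is meant to predict.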

  6. Improved rapid prototyping methodology for MPEG-4 IC development

    NASA Astrophysics Data System (ADS)

    Tang, Clive K. K.; Moseler, Kathy; Levi, Sami

    1998-12-01

    One important factor in deciding the success of a new consumer product or integrated circuit is minimized time-to-market. A rapid prototyping methodology that encompasses algorithm development in the hardware design phase will have great impact on reducing time-to-market. In this paper, a proven hardware design methodology and a novel top-down design methodology based on Frontier Design's DSP Station tool are described. The proven methodology was used during development of the MC149570 H.261/H.263 video codec manufactured by Motorola. This paper discusses an improvement to this method that creates an integrated environment for both system and hardware development, thereby further reducing time-to-market. The rich features of the DSP Station tool are described, showing how they can be useful in designing from algorithm to silicon. How this methodology may be used in the development of a new MPEG-4 video communication ASIC is outlined, along with a brief comparison with a popular alternative, Cadence's Signal Processing WorkSystem tool.

  7. Minimally invasive surgical techniques in periodontal regeneration.

    PubMed

    Cortellini, Pierpaolo

    2012-09-01

    A review of the current scientific literature was undertaken to evaluate the efficacy of minimally invasive periodontal regenerative surgery in the treatment of periodontal defects. The impact on clinical outcomes, surgical chair-time, side effects and patient morbidity were evaluated. An electronic search of the PUBMED database from January 1987 to December 2011 was undertaken on dental journals using the keyword "minimally invasive surgery". Cohort studies, retrospective studies and randomized controlled clinical trials referring to treatment of periodontal defects with at least 6 months of follow-up were selected. Quality assessment of the selected studies was done through the Strength of Recommendation Taxonomy Grading (SORT) System. Ten studies (1 retrospective, 5 cohorts and 4 RCTs) were included. All the studies consistently support the efficacy of minimally invasive surgery in the treatment of periodontal defects in terms of clinical attachment level gain, probing pocket depth reduction and minimal gingival recession. Six studies reporting on side effects and patient morbidity consistently indicate very low levels of pain and discomfort during and after surgery, resulting in a reduced intake of pain-killers and very limited interference with daily activities in the post-operative period. Minimally invasive surgery can now be considered a reality in the field of periodontal regeneration. The observed clinical improvements are consistently associated with very limited morbidity to the patient during the surgical procedure as well as in the post-operative period. Minimally invasive surgery, however, cannot be applied in all cases. A stepwise decisional algorithm should support clinicians in choosing the treatment approach.

  8. Minimally invasive transforaminal lumbosacral interbody fusion.

    PubMed

    Chang, Peng-Yuan; Wang, Michael Y

    2016-07-01

    In minimally invasive spinal fusion surgery, transforaminal lumbar (sacral) interbody fusion (TLIF) is one of the most common procedures, providing both anterior and posterior column support without retraction or violation of the neural structures. Direct and indirect decompression can be achieved through this single approach. Preoperative plain radiographs and MR scans should be carefully evaluated. This video demonstrates a standard approach to performing a minimally invasive transforaminal lumbosacral interbody fusion. The video can be found here: https://youtu.be/bhEeafKJ370 . PMID:27364426

  9. The Parisi Formula has a Unique Minimizer

    NASA Astrophysics Data System (ADS)

    Auffinger, Antonio; Chen, Wei-Kuo

    2015-05-01

    In 1979, Parisi (Phys Rev Lett 43:1754-1756, 1979) predicted a variational formula for the thermodynamic limit of the free energy in the Sherrington-Kirkpatrick model, and described the role played by its minimizer. This formula was verified in the seminal work of Talagrand (Ann Math 163(1):221-263, 2006) and later generalized to the mixed p-spin models by Panchenko (Ann Probab 42(3):946-958, 2014). In this paper, we prove that the minimizer in Parisi's formula is unique at any temperature and external field by establishing the strict convexity of the Parisi functional.
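
    For orientation, the Parisi formula for the SK model is commonly written as follows (a sketch in one common normalization; conventions for the log 2 term and the temperature scaling vary between references):

```latex
% Parisi formula for the SK free energy (common form; conventions vary):
\lim_{N\to\infty} \frac{1}{N}\,\mathbb{E}\log Z_N(\beta,h)
  \;=\; \inf_{\zeta}\,\mathcal{P}(\zeta),
\qquad
\mathcal{P}(\zeta) \;=\; \log 2 \;+\; \Phi_\zeta(0,h)
  \;-\; \frac{\beta^2}{2}\int_0^1 t\,\zeta(t)\,\mathrm{d}t,
% where \Phi_\zeta solves the Parisi PDE backward from t = 1:
\partial_t \Phi_\zeta
  + \frac{\beta^2}{2}\Bigl(\partial_{xx}\Phi_\zeta
  + \zeta(t)\,\bigl(\partial_x \Phi_\zeta\bigr)^2\Bigr) = 0,
\qquad \Phi_\zeta(1,x) = \log\cosh x .
```

    The infimum runs over cumulative distribution functions ζ on [0,1]; the uniqueness result above follows from strict convexity of the map ζ ↦ P(ζ).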

  10. Instabilities and Solitons in Minimal Strips

    NASA Astrophysics Data System (ADS)

    Machon, Thomas; Alexander, Gareth P.; Goldstein, Raymond E.; Pesci, Adriana I.

    2016-07-01

    We show that highly twisted minimal strips can undergo a nonsingular transition, unlike the singular transitions seen in the Möbius strip and the catenoid. If the strip is nonorientable, this transition is topologically frustrated, and the resulting surface contains a helicoidal defect. Through a controlled analytic approximation, the system can be mapped onto a scalar ϕ⁴ theory on a nonorientable line bundle over the circle, where the defect becomes a topologically protected kink soliton or domain wall, thus establishing their existence in minimal surfaces. Demonstrations with soap films confirm these results and show how the position of the defect can be controlled through boundary deformation.
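
    For reference, the kink soliton of a scalar ϕ⁴ theory can be written down explicitly; the normalization below is a standard textbook choice, not necessarily the one used in the paper.

```latex
% Standard phi^4 kink (illustrative normalization):
V(\varphi) \;=\; \frac{\lambda}{4}\,\bigl(\varphi^2 - v^2\bigr)^2,
\qquad
\varphi_{\mathrm{kink}}(x) \;=\; v\,\tanh\!\Bigl(\sqrt{\tfrac{\lambda}{2}}\,v\,x\Bigr).
```

    The kink interpolates between the two vacua ∓v. On a nonorientable line bundle over the circle, any continuous section must change sign somewhere, which is the sense in which the helicoidal defect cannot be removed by a continuous deformation.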

  11. The concept of minimally invasive dentistry.

    PubMed

    Ericson, Dan

    2007-01-01

    This paper reviews Minimally Invasive Dentistry (MID) from a day-to-day dentistry perspective, focusing mostly on cariology and restorative dentistry, even though it embraces many aspects of dentistry. The concept of MID supports a systematic respect for the original tissue, including diagnosis, risk assessment, preventive treatment, and minimal tissue removal upon restoration. The motivation for MID emerges from the fact that fillings are not permanent and that the main reasons for failure are secondary caries and filling fracture. To address these flaws, there is a need for economic re-routing so that practices can survive by maintaining dental health and not only through operative procedures.

  12. Minimally invasive restorative dentistry: a biomimetic approach.

    PubMed

    Malterud, Mark I

    2006-08-01

    When providing dental treatment for a given patient, the practitioner should use a minimally invasive technique that conserves sound tooth structure as a clinical imperative. Biomimetics is a tenet that guides the author's practice and is generally described as the mimicking of natural life. This can be accomplished in many cases using contemporary composite resins and adhesive dental procedures. Both provide clinical benefits and support the biomimetic philosophy for treatment. This article illustrates a minimally invasive approach for the restoration of carious cervical defects created by poor hygiene exacerbated by the presence of orthodontic brackets.

  13. Minimally invasive repair of meta-bones.

    PubMed

    Piras, Alessandro; Guerrero, Tomás G

    2012-09-01

    Metacarpal and metatarsal fractures are common injuries in small animals and, in most cases, can be treated by minimally invasive techniques. Bone plates applied through epi-periosteal tunnels can stabilize meta-bones. Meta-bones III and IV are stabilized by dorsally applied plates. Meta-bones II and V are stabilized using plates applied medially and laterally. The scarcity of soft tissue coverage and the simple anatomy of meta-bones make these fractures amenable to fixation using minimally invasive techniques. This practice should reduce morbidity and shorten healing time.

  15. Minimal mass design of tensegrity structures

    NASA Astrophysics Data System (ADS)

    Nagase, Kenji; Skelton, R. E.

    2014-03-01

    This paper provides a unified framework for minimal mass design of tensegrity systems. For any given configuration and any given set of external forces, we design force density (member force divided by length) and cross-section area to minimize the structural mass subject to an equilibrium condition and a maximum stress constraint. The answer is provided by a linear program. Stability is assured by a positive definite stiffness matrix. This condition is described by a linear matrix inequality. Numerical examples are shown to illustrate the proposed method.
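
    The linear program described above can be sketched in a toy form: given fixed member lengths and member forces from a prescribed equilibrium, the yield constraint alone fixes the minimum cross-section areas. This is not the paper's formulation (it omits the stiffness and equilibrium machinery); all numbers are hypothetical illustration values, and `scipy.optimize.linprog` is used as a generic LP solver.

```python
# Toy minimal-mass LP sketch (illustration only, hypothetical data):
# minimize total mass  m = sum_i rho * l_i * a_i
# subject to the stress constraint  sigma_max * a_i >= |f_i|.
import numpy as np
from scipy.optimize import linprog

rho, sigma_max = 1.0, 100.0               # density, allowable stress (arbitrary units)
lengths = np.array([1.0, 2.0, 1.5])       # member lengths
forces = np.array([50.0, -30.0, 80.0])    # member forces (+tension, -compression)

c = rho * lengths                         # objective coefficients: rho * l_i
# sigma_max * a_i >= |f_i|   rewritten as   -a_i <= -|f_i| / sigma_max
res = linprog(c, A_ub=-np.eye(3), b_ub=-np.abs(forces) / sigma_max,
              bounds=[(0, None)] * 3)
print(res.x, res.fun)                     # expected: a ≈ [0.5, 0.3, 0.8], mass ≈ 2.3
```

    In this degenerate toy the constraints decouple per member; the paper's interest is that the coupled problem, with equilibrium and force-density variables included, is still a linear program.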

  16. Minimizing radiation damage in nonlinear optical crystals

    DOEpatents

    Cooke, D.W.; Bennett, B.L.; Cockroft, N.J.

    1998-09-08

    Methods are disclosed for minimizing laser induced damage to nonlinear crystals, such as KTP crystals, involving various means for electrically grounding the crystals in order to diffuse electrical discharges within the crystals caused by the incident laser beam. In certain embodiments, electrically conductive material is deposited onto or into surfaces of the nonlinear crystals and the electrically conductive surfaces are connected to an electrical ground. To minimize electrical discharges on crystal surfaces that are not covered by the grounded electrically conductive material, a vacuum may be created around the nonlinear crystal.

  18. Minimal scales from an extended Hilbert space

    NASA Astrophysics Data System (ADS)

    Kober, Martin; Nicolini, Piero

    2010-12-01

    We consider an extension of the conventional quantum Heisenberg algebra, assuming that coordinates as well as momenta fulfil nontrivial commutation relations. As a consequence, a minimal length and a minimal mass scale are implemented. Our commutators do not depend on positions and momenta and we provide an extension of the coordinate coherent state approach to noncommutative geometry. We explore, as a toy model, the corresponding quantum field theory in a (2+1)-dimensional spacetime. Then we investigate the more realistic case of a (3+1)-dimensional spacetime, foliated into noncommutative planes. As a result, we obtain propagators, which are finite in the ultraviolet as well as the infrared regime.
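
    Schematically, an extended Heisenberg algebra of this kind can be written as below; this is an illustrative sketch, and the paper's precise tensor structure may differ.

```latex
% Schematic extended Heisenberg algebra (illustrative form):
[\hat{x}_\mu, \hat{x}_\nu] = i\,\theta_{\mu\nu}, \qquad
[\hat{p}_\mu, \hat{p}_\nu] = i\,\tau_{\mu\nu}, \qquad
[\hat{x}_\mu, \hat{p}_\nu] = i\,\hbar\,\delta_{\mu\nu},
```

    with constant antisymmetric θ and τ: a nonvanishing θ implements a minimal length of order √θ, while a nonvanishing τ implements a minimal momentum (mass) scale of order √τ.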

  19. Pattern Search Methods for Linearly Constrained Minimization

    NASA Technical Reports Server (NTRS)

    Lewis, Robert Michael; Torczon, Virginia

    1998-01-01

    We extend pattern search methods to linearly constrained minimization. We develop a general class of feasible point pattern search algorithms and prove global convergence to a Karush-Kuhn-Tucker point. As in the case of unconstrained minimization, pattern search methods for linearly constrained problems accomplish this without explicit recourse to the gradient or the directional derivative. Key to the analysis of the algorithms is the way in which the local search patterns conform to the geometry of the boundary of the feasible region.
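
    The unconstrained special case can be sketched as a simple compass search; the paper's contribution, conforming the pattern directions to the geometry of the linear constraints, is omitted here. Function, starting point, and step schedule are arbitrary illustration choices.

```python
# Minimal sketch of a derivative-free compass/pattern search (unconstrained
# case only; the paper adapts the pattern to linear constraints).
import numpy as np

def pattern_search(f, x0, step=1.0, tol=1e-6, max_iter=10000):
    x = np.asarray(x0, dtype=float)
    n = x.size
    dirs = np.vstack([np.eye(n), -np.eye(n)])   # coordinate pattern +-e_i
    for _ in range(max_iter):
        improved = False
        for d in dirs:
            trial = x + step * d
            if f(trial) < f(x):                 # accept any improving pattern point
                x, improved = trial, True
                break
        if not improved:
            step *= 0.5                         # contract the mesh on failure
            if step < tol:
                break
    return x

x_star = pattern_search(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2, [0.0, 0.0])
print(x_star)
```

    Note that no gradient or directional derivative is ever evaluated; progress comes entirely from comparing function values at pattern points, which is the property the convergence analysis exploits.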

  20. From Jack polynomials to minimal model spectra

    NASA Astrophysics Data System (ADS)

    Ridout, David; Wood, Simon

    2015-01-01

    In this note, a deep connection between free field realizations of conformal field theories and symmetric polynomials is presented. We give a brief introduction into the necessary prerequisites of both free field realizations and symmetric polynomials, in particular Jack symmetric polynomials. Then we combine these two fields to classify the irreducible representations of the minimal model vertex operator algebras as an illuminating example of the power of these methods. While these results on the representation theory of the minimal models are all known, this note exploits the full power of Jack polynomials to present significant simplifications of the original proofs in the literature.

  1. Non-minimal inflation and SUSY GUTs

    SciTech Connect

    Okada, Nobuchika

    2012-07-27

    The Standard Model Higgs boson with the nonminimal coupling to the gravitational curvature can drive cosmological inflation. We study this type of inflationary scenario in the context of supergravity. We first point out that it is naturally implemented in the minimal supersymmetric SU(5) model, and hence in virtually any GUT model. Next we propose another scenario based on the Minimal Supersymmetric Standard Model supplemented by the right-handed neutrinos. These models can be tested by new observational data from the Planck satellite experiments within a few years.

  2. Methodology of metal criticality determination.

    PubMed

    Graedel, T E; Barr, Rachel; Chandler, Chelsea; Chase, Thomas; Choi, Joanne; Christoffersen, Lee; Friedlander, Elizabeth; Henly, Claire; Jun, Christine; Nassar, Nedal T; Schechner, Daniel; Warren, Simon; Yang, Man-Yu; Zhu, Charles

    2012-01-17

    A comprehensive methodology has been created to quantify the degree of criticality of the metals of the periodic table. In this paper, we present and discuss the methodology, which comprises three dimensions: supply risk, environmental implications, and vulnerability to supply restriction. Supply risk differs with the time scale (medium or long), and at its most complex involves several components, themselves composed of a number of distinct indicators drawn from readily available peer-reviewed indexes and public information. Vulnerability to supply restriction differs with the organizational level (i.e., global, national, and corporate). The criticality methodology, an enhancement of a United States National Research Council template, is designed to help corporate, national, and global stakeholders conduct risk evaluation and to inform resource utilization and strategic decision-making. Although we believe our methodological choices lead to the most robust results, the framework has been constructed to permit flexibility by the user. Specific indicators can be deleted or added as desired and weighted as the user deems appropriate. The value of each indicator will evolve over time, and our future research will focus on this evolution. The methodology has proven to be sufficiently robust as to make it applicable across the entire spectrum of metals and organizational levels and provides a structural approach that reflects the multifaceted factors influencing the availability of metals in the 21st century.
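
    The flexible weighted-indicator aggregation the methodology permits can be sketched as follows. The indicator names, 0-100 scores, and weights below are invented for this illustration and are not taken from the paper.

```python
# Hypothetical sketch of a user-weighted indicator aggregation for one
# criticality dimension; all names, values, and weights are invented.
def criticality_score(indicators, weights):
    """Weighted average of indicator scores; weights need not sum to 1."""
    total_w = sum(weights[k] for k in indicators)
    return sum(indicators[k] * weights[k] for k in indicators) / total_w

supply_risk = {"depletion_time": 60, "companion_fraction": 80, "policy_risk": 40}
weights = {"depletion_time": 2, "companion_fraction": 1, "policy_risk": 1}
print(criticality_score(supply_risk, weights))  # (60*2 + 80 + 40) / 4 = 60.0
```

    Dropping an indicator or reweighting it is then a one-line change, which mirrors the flexibility the authors describe.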

  3. Q methodology in health economics.

    PubMed

    Baker, Rachel; Thompson, Carl; Mannion, Russell

    2006-01-01

    The recognition that health economists need to understand the meaning of data if they are to adequately understand research findings which challenge conventional economic theory has led to the growth of qualitative modes of enquiry in health economics. The use of qualitative methods of exploration and description alongside quantitative techniques gives rise to a number of epistemological, ontological and methodological challenges: difficulties in accounting for subjectivity in choices, the need for rigour and transparency in method, and problems of disciplinary acceptability to health economists. Q methodology is introduced as a means of overcoming some of these challenges. We argue that Q offers a means of exploring subjectivity, beliefs and values while retaining the transparency, rigour and mathematical underpinnings of quantitative techniques. The various stages of Q methodological enquiry are outlined alongside potential areas of application in health economics, before discussing the strengths and limitations of the approach. We conclude that Q methodology is a useful addition to economists' methodological armoury and one that merits further consideration and evaluation in the study of health services.

  5. Minimal Mimicry: Mere Effector Matching Induces Preference

    ERIC Educational Resources Information Center

    Sparenberg, Peggy; Topolinski, Sascha; Springer, Anne; Prinz, Wolfgang

    2012-01-01

    Both mimicking and being mimicked induces preference for a target. The present experiments investigate the minimal sufficient conditions for this mimicry-preference link to occur. We argue that mere effector matching between one's own and the other person's movement is sufficient to induce preference, independent of which movement is actually…

  6. Minimal Interventions in the Teaching of Mathematics

    ERIC Educational Resources Information Center

    Foster, Colin

    2014-01-01

    This paper addresses ways in which mathematics pedagogy can benefit from insights gleaned from counselling. Person-centred counselling stresses the value of genuineness, warm empathetic listening and minimal intervention to support people in solving their own problems and developing increased autonomy. Such an approach contrasts starkly with the…

  7. Challenging the minimal supersymmetric SU(5) model

    SciTech Connect

    Bajc, Borut; Lavignac, Stéphane; Mede, Timon

    2014-06-24

    We review the main constraints on the parameter space of the minimal renormalizable supersymmetric SU(5) grand unified theory. They consist of the Higgs mass, proton decay, electroweak symmetry breaking and fermion masses. Superpartner masses are constrained both from below and from above, giving hope for confirming or definitely ruling out the theory in the future. This contribution is based on Ref. [1].

  8. DUPONT CHAMBERS WORKS WASTE MINIMIZATION PROJECT

    EPA Science Inventory

    In a joint U.S. Environmental Protection Agency (EPA) and DuPont waste minimization project, fifteen waste streams were-selected for assessment. The intent was to develop assessments diverse in terms of process type, mode of operation, waste type, disposal needed, and relative s...

  9. Tsallis distribution from minimally selected order statistics

    SciTech Connect

    Wilk, G.; Wlodarczyk, Z.

    2007-12-06

    We demonstrate that selection of the minimal value of ordered variables leads in a natural way to its distribution being given by the Tsallis distribution, the same as that resulting from Tsallis nonextensive statistics. The possible application of this result to the multiparticle production processes is indicated.
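
    The flavor of this result can be illustrated with the simplest case (not the paper's derivation): the survival function of the minimum of n i.i.d. uniform variables, P(min > x) = (1 - x)^n, is exactly a Tsallis q-exponential e_q(-nx) with q = (n - 1)/n. Below this identity is checked against a Monte Carlo estimate; n and the sample size are arbitrary.

```python
# Minimum order statistic of uniforms vs. the Tsallis q-exponential.
import numpy as np

rng = np.random.default_rng(1)
n, samples = 5, 200_000
mins = rng.random((samples, n)).min(axis=1)       # minimum of n uniforms

def tsallis_exp(x, q):
    """Tsallis q-exponential e_q(x) = [1 + (1-q) x]_+^{1/(1-q)}."""
    return np.maximum(1.0 + (1.0 - q) * x, 0.0) ** (1.0 / (1.0 - q))

q = (n - 1) / n
x = 0.1
empirical = float(np.mean(mins > x))              # Monte Carlo survival P(min > x)
analytic = float(tsallis_exp(-n * x, q))          # equals (1 - x)^n
print(empirical, analytic)
```

    As n grows, q → 1 and the q-exponential approaches the ordinary exponential, consistent with the extreme-value limit.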

  10. Banach spaces that realize minimal fillings

    SciTech Connect

    Bednov, B. B.; Borodin, P. A. E-mail: pborodin@inbox.ru

    2014-04-30

    It is proved that a real Banach space realizes minimal fillings for all its finite subsets (a shortest network spanning a fixed finite subset always exists and has the minimum possible length) if and only if it is a predual of L₁. The spaces L₁ are characterized in terms of Steiner points (medians). Bibliography: 25 titles.

  11. Minimal Brain Dysfunction: Associations with Perinatal Complications.

    ERIC Educational Resources Information Center

    Nichols, Paul L.

    Examined with over 28,000 7-year-old children whose mothers registered for prenatal care was the relationship between perinatal complications and such characteristics as poor school achievement, hyperactivity, and neurological soft signs associated with the diagnosis of minimal brain dysfunction (MBD). Ten perinatal antecedents were studied:…

  12. Pancreatic cancer: Open or minimally invasive surgery?

    PubMed

    Zhang, Yu-Hua; Zhang, Cheng-Wu; Hu, Zhi-Ming; Hong, De-Fei

    2016-08-28

    Pancreatic duct adenocarcinoma is one of the most fatal malignancies, with R0 resection remaining the most important part of treatment of this malignancy. However, pancreatectomy is believed to be one of the most challenging procedures and R0 resection remains the only chance for patients with pancreatic cancer to have a good prognosis. Some surgeons have tried minimally invasive pancreatic surgery, but the short- and long-term outcomes of pancreatic malignancy remain controversial between open and minimally invasive procedures. We collected comparative data about minimally invasive and open pancreatic surgery. The available evidence suggests that minimally invasive pancreaticoduodenectomy (MIPD) is as safe and feasible as open PD (OPD), and shows some benefit, such as less intraoperative blood loss and shorter postoperative hospital stay. Despite the limited evidence for MIPD in pancreatic cancer, most of the available data show that the short-term oncological adequacy is similar between MIPD and OPD. Some surgical techniques, including superior mesenteric artery-first approach and laparoscopic pancreatoduodenectomy with major vein resection, are believed to improve the rate of R0 resection. Laparoscopic distal pancreatectomy is less technically demanding and is accepted in more pancreatic centers. It is technically safe and feasible and has similar short-term oncological prognosis compared with open distal pancreatectomy. PMID:27621576

  13. Minimally Invasive Surgery for Inflammatory Bowel Disease

    PubMed Central

    Holder-Murray, Jennifer; Marsicovetere, Priscilla

    2015-01-01

    Abstract: Surgical management of inflammatory bowel disease is a challenging endeavor given infectious and inflammatory complications, such as fistula and abscess, and complex, often postoperative, anatomy, including adhesive disease from previous open operations. Patients with Crohn's disease and ulcerative colitis also bring the burden of their chronic illness, with anemia, malnutrition, and immunosuppression all common and each contributing independently as a risk factor for increased surgical morbidity in this high-risk population. However, to reduce the physical trauma of surgery, technologic advances and worldwide experience with minimally invasive surgery have allowed laparoscopic management of patients to become the standard of care, with significant short- and long-term patient benefits compared with the open approach. In this review, we describe the current state of the art of minimally invasive surgery for inflammatory bowel disease and the caveats inherent in this practice in this complex patient population. We also review the applicability of current and future trends in minimally invasive surgical technique, such as laparoscopic "incisionless," single-incision laparoscopic surgery (SILS), robotic-assisted, and other techniques, for the patient with inflammatory bowel disease. There can be no doubt that minimally invasive surgery has been proven to decrease the short- and long-term burden of surgery for these chronic illnesses and represents high-value care for both patient and society. PMID:25989341

  14. Minimally Invasive Mitral Valve Surgery III

    PubMed Central

    Lehr, Eric J.; Guy, T. Sloane; Smith, Robert L.; Grossi, Eugene A.; Shemin, Richard J.; Rodriguez, Evelio; Ailawadi, Gorav; Agnihotri, Arvind K.; Fayers, Trevor M.; Hargrove, W. Clark; Hummel, Brian W.; Khan, Junaid H.; Malaisrie, S. Chris; Mehall, John R.; Murphy, Douglas A.; Ryan, William H.; Salemi, Arash; Segurola, Romualdo J.; Smith, J. Michael; Wolfe, J. Alan; Weldner, Paul W.; Barnhart, Glenn R.; Goldman, Scott M.; Lewis, Clifton T. P.

    2016-01-01

    Abstract Minimally invasive mitral valve operations are increasingly common in the United States, but robotic-assisted approaches have not been widely adopted for a variety of reasons. This expert opinion reviews the state of the art and defines best practices, training, and techniques for developing a successful robotics program. PMID:27662478

  15. Consequences of "Minimal" Group Affiliations in Children

    ERIC Educational Resources Information Center

    Dunham, Yarrow; Baron, Andrew Scott; Carey, Susan

    2011-01-01

    Three experiments (total N = 140) tested the hypothesis that 5-year-old children's membership in randomly assigned "minimal" groups would be sufficient to induce intergroup bias. Children were randomly assigned to groups and engaged in tasks involving judgments of unfamiliar in-group or out-group children. Despite an absence of information…

  16. Practice Enables Successful Learning under Minimal Guidance

    ERIC Educational Resources Information Center

    Brunstein, Angela; Betts, Shawn; Anderson, John R.

    2009-01-01

    Two experiments were conducted, contrasting a minimally guided discovery condition with a variety of instructional conditions. College students interacted with a computer-based tutor that presented algebra-like problems in a novel graphical representation. Although the tutor provided no instruction in a discovery condition, it constrained the…

  17. Minimally invasive pancreatic surgery – a review

    PubMed Central

    Damoli, Isacco; Ramera, Marco; Paiella, Salvatore; Marchegiani, Giovanni; Bassi, Claudio

    2015-01-01

    During the past 20 years the application of a minimally invasive approach to pancreatic surgery has progressively increased. Distal pancreatectomy is the most frequently performed procedure, because of the absence of a reconstructive phase. However, middle pancreatectomy and pancreatoduodenectomy have been demonstrated to be safe and feasible as well. Laparoscopic distal pancreatectomy is recognized as the gold standard treatment for small tumors of the pancreatic body-tail, with several advantages over the traditional open approach in terms of patient recovery. The surgical treatment of lesions of the pancreatic head via a minimally invasive approach is still limited to a few highly experienced surgeons, due to the very challenging resection and complex anastomoses. Middle pancreatectomy and enucleation are indicated for small and benign tumors and offer the maximum preservation of the parenchyma. The introduction of a robotic platform more than ten years ago increased the interest of many surgeons in minimally invasive treatment of pancreatic diseases. This new technology overcomes all the limitations of laparoscopic surgery, but actual benefits for the patients are still under investigation. The increased costs associated with robotic surgery are under debate too. This article presents the state of the art of minimally invasive pancreatic surgery. PMID:26240612

  18. Minimization of Salmonella Contamination on Raw Poultry

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Many reviews have discussed Salmonella in poultry and suggested best practices to minimize this organism on raw poultry meat. Despite years of research and conscientious control efforts by industry and regulatory agencies, human salmonellosis rates have declined only modestly and Salmonella is stil...

  19. MULTIOBJECTIVE PARALLEL GENETIC ALGORITHM FOR WASTE MINIMIZATION

    EPA Science Inventory

    In this research we have developed an efficient multiobjective parallel genetic algorithm (MOPGA) for waste minimization problems. This MOPGA integrates PGAPack (Levine, 1996) and NSGA-II (Deb, 2000) with novel modifications. PGAPack is a master-slave parallel implementation of a...

  20. Pancreatic cancer: Open or minimally invasive surgery?

    PubMed Central

    Zhang, Yu-Hua; Zhang, Cheng-Wu; Hu, Zhi-Ming; Hong, De-Fei

    2016-01-01

Pancreatic duct adenocarcinoma is one of the most fatal malignancies, and R0 resection remains the most important part of its treatment. However, pancreatectomy is believed to be one of the most challenging procedures, and R0 resection remains the only chance for patients with pancreatic cancer to have a good prognosis. Some surgeons have tried minimally invasive pancreatic surgery, but the short- and long-term outcomes of open versus minimally invasive procedures for pancreatic malignancy remain controversial. We collected comparative data about minimally invasive and open pancreatic surgery. The available evidence suggests that minimally invasive pancreaticoduodenectomy (MIPD) is as safe and feasible as open PD (OPD), and shows some benefits, such as less intraoperative blood loss and shorter postoperative hospital stay. Despite the limited evidence for MIPD in pancreatic cancer, most of the available data show that short-term oncological adequacy is similar between MIPD and OPD. Some surgical techniques, including the superior mesenteric artery-first approach and laparoscopic pancreatoduodenectomy with major vein resection, are believed to improve the rate of R0 resection. Laparoscopic distal pancreatectomy is less technically demanding and is accepted in more pancreatic centers. It is technically safe and feasible and has similar short-term oncological prognosis compared with open distal pancreatectomy. PMID:27621576

  2. Methodology in clinical sleep research.

    PubMed

    Rosa, A; Poyares, D; Moraes, W; Cintra, F

    2007-05-01

This review presents traditional and cutting-edge interventions in sleep research, including descriptions of the relationship of rapid eye movement-non-rapid eye movement sleep with the autonomic nervous system, and dream research methodology. Although sleep and dreaming are overlapping and non-separable phenomena, they are not typically addressed simultaneously in the scientific sleep research literature. Therefore, a more extensive overview of dream research has been included, with a focus on objective dream content analysis and the theory of neurocognitive analysis. A bridge is made between dream content analysis and current sleep research methodologies.

  3. Analyzing Media: Metaphors as Methodologies.

    ERIC Educational Resources Information Center

    Meyrowitz, Joshua

    Students have little intuitive insight into the process of thinking and structuring ideas. The image of metaphor for a phenomenon acts as a kind of methodology for the study of the phenomenon by (1) defining the key issues or problems; (2) shaping the type of research questions that are asked; (3) defining the type of data that are searched out;…

  4. The Library Space Utilization Methodology.

    ERIC Educational Resources Information Center

    Hall, Richard B.

    1978-01-01

    Describes the Library Space Utilization (LSU) methodology, which demonstrates that significant information about the functional requirements of a library can be measured and displayed in a quantitative and graphic form. It measures "spatial" relationships between selected functional divisions; it also determines how many people--staff and…

  5. INHALATION EXPOSURE-RESPONSE METHODOLOGY

    EPA Science Inventory

    The Inhalation Exposure-Response Analysis Methodology Document is expected to provide guidance on the development of the basic toxicological foundations for deriving reference values for human health effects, focusing on the hazard identification and dose-response aspects of the ...

  6. The Question of Research Methodologies.

    ERIC Educational Resources Information Center

    Aanstoos, Christopher M.

    This paper argues that a human science approach should be included in the American Psychological Association's (APA) pending reconsideration of accreditation specifications. Psychology's curriculum will remain incomplete and sterile until it assimilates this approach. Some of the key procedures of human science research methodology are outlined,…

  7. A Methodological Investigation of Cultivation.

    ERIC Educational Resources Information Center

    Rubin, Alan M.; And Others

    Cultivation theory states that television engenders negative emotions in heavy viewers. Noting that cultivation methodology contains an apparent response bias, a study examined relationships between television exposure and positive restatements of cultivation concepts and tested a more instrumental media uses and effects model. Cultivation was…

  8. ESP Methodology for Science Lecturers.

    ERIC Educational Resources Information Center

    Rogers, Angela; Mulyana, Cukup

    A program designed to teach university science lecturers in Indonesia how to design and teach one-semester courses in English for special purposes (ESP) is described. The program provided lecturers with training in language teaching methodology and course design. The piloting of the teacher training course, focusing on physics instruction, is…

  9. ALTERNATIVES TO DUPLICATE DIET METHODOLOGY

    EPA Science Inventory

    Duplicate Diet (DD) methodology has been used to collect information about the dietary exposure component in the context of total exposure studies. DD methods have been used to characterize the dietary exposure component in the NHEXAS pilot studies. NERL desired to evaluate it...

  10. Philosophy, Methodology and Action Research

    ERIC Educational Resources Information Center

    Carr, Wilfred

    2006-01-01

    The aim of this paper is to examine the role of methodology in action research. It begins by showing how, as a form of inquiry concerned with the development of practice, action research is nothing other than a modern 20th century manifestation of the pre-modern tradition of practical philosophy. It then draws in Gadamer's powerful vindication of…

  11. Should minimal residual disease guide therapy in AML?

    PubMed

    Paietta, Elisabeth

    2015-01-01

The prognostic power of minimal residual disease after therapy for acute leukemias is not in question. It is only logical that the finding of leukemic blast cells after therapy predicts for impending relapse or at least the need for additional treatment. Which level of what is called minimal residual disease (MRD) is clinically relevant, however, depends on the efficacy of the initial treatment as well as the treatment strategies available to target MRD. There are a multitude of additional factors that can alter the clinical significance of MRD, including the genotype of the patient's leukemic cells. The fact that methodologies of MRD detection are not standardized and thresholds for defining MRD positivity vary depending upon MRD detection method and the operator's skills or convictions only adds to the complexity of MRD interpretation. While enormous efforts are devoted to enhancing the sensitivity of MRD detection, e.g., by next-generation sequencing, improvements of methods for detecting MRD per se will not automatically lead to a more reliable estimation of total tumor burden. Most importantly, even the best assay will yield accurate MRD results only if the tissue source for MRD determination is of good quality. Another aspect of potentially crucial importance is the heterogeneous distribution of leukemic cells throughout the skeleton after treatment, recently demonstrated for acute myeloid leukemia (AML) by bone marrow imaging. Once technical difficulties of MRD measurement are resolved and better MRD-targeting drugs are developed, we still need to learn about alternate proposed mechanisms to explain MRD-independent prognostication, well described in acute lymphoid leukemia, before MRD can be included routinely in the guidance of therapy in AML.

  12. Methodology for determining multilayered temperature inversions

    NASA Astrophysics Data System (ADS)

    Fochesatto, G. J.

    2015-05-01

Temperature sounding of the atmospheric boundary layer (ABL) and lower troposphere exhibits multilayered temperature inversions, especially at high latitudes during extreme winters. These inversion layers originate from the combined forcing of local-scale and large-scale synoptic meteorology. At the local scale, the thermal inversion layer forms near the surface and plays a central role in controlling surface radiative cooling and air pollution dispersion; however, depending upon the large-scale synoptic meteorological forcing, an upper-level thermal inversion can also exist topping the local ABL. In this article a numerical methodology is reported to determine the thermal inversion layers present in a given temperature profile and deduce some of their thermodynamic properties. The algorithm extracts from the temperature profile the most important temperature variations defining thermal inversion layers. This is accomplished by a linear interpolation function of variable length that minimizes an error function. The algorithm functionality is demonstrated on actual radiosonde profiles to deduce the multilayered temperature inversion structure with an error fraction set independently.
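The layer-detection idea described in this abstract can be illustrated with a minimal sketch: scan a sounding for contiguous height intervals where temperature increases with altitude. This is a hypothetical simplification; the paper's variable-length linear interpolation and error-function minimization are not reproduced here, and the function name and toy profile are illustrative only.

```python
def find_inversion_layers(heights, temps):
    """Flag contiguous altitude ranges where temperature increases with height.

    heights, temps: equal-length sequences sorted by increasing altitude.
    Returns a list of (base_height, top_height, delta_T) tuples.
    A full implementation would additionally fit variable-length linear
    segments that minimize an error function, as the abstract describes.
    """
    layers = []
    start = None
    for i in range(1, len(heights)):
        if temps[i] > temps[i - 1]:       # warming with height -> inversion
            if start is None:
                start = i - 1
        else:
            if start is not None:
                layers.append((heights[start], heights[i - 1],
                               temps[i - 1] - temps[start]))
                start = None
    if start is not None:
        layers.append((heights[start], heights[-1], temps[-1] - temps[start]))
    return layers

# Toy winter profile (heights in m, temperatures in deg C): a surface-based
# inversion plus an elevated one, as in the multilayered case discussed above.
profile_h = [0, 100, 200, 300, 400, 500, 600]
profile_t = [-20.0, -18.5, -19.0, -19.5, -18.0, -17.5, -19.0]
print(find_inversion_layers(profile_h, profile_t))
```

Here the scan reports both the surface-based layer (0-100 m) and the elevated layer (300-500 m), each with its temperature jump.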

  13. Directional reflectance characterization facility and measurement methodology

    NASA Astrophysics Data System (ADS)

    McGuckin, B. T.; Haner, D. A.; Menzies, R. T.; Esproles, C.; Brothers, A. M.

    1996-08-01

A precision reflectance characterization facility, constructed specifically for the measurement of the bidirectional reflectance properties of Spectralon panels planned for use as in-flight calibrators on the NASA Multiangle Imaging Spectroradiometer (MISR) instrument, is described. The incident linearly polarized radiation is provided at three laser wavelengths: 442, 632.8, and 859.9 nm. Each beam is collimated when incident on the Spectralon. The illuminated area of the panel is viewed with a silicon photodetector that revolves around the panel (360°) on a 30-cm boom extending from a common rotational axis. The reflected radiance detector signal is ratioed with the signal from a reference detector to minimize the effect of amplitude instabilities in the laser sources. This and other measures adopted to reduce noise have resulted in a bidirectional reflection function (BRF) calibration facility with a measurement precision, with regard to a BRF measurement, of 0.002 at the 1σ confidence level. The Spectralon test piece panel is held in a computer-controlled three-axis rotational assembly capable of a full 360° rotation in the horizontal plane and 90° in the vertical. The angular positioning system has repeatability and resolution of 0.001°. Design details and an outline of the measurement methodology are presented.

  14. The Sense of Commitment: A Minimal Approach

    PubMed Central

    Michael, John; Sebanz, Natalie; Knoblich, Günther

    2016-01-01

    This paper provides a starting point for psychological research on the sense of commitment within the context of joint action. We begin by formulating three desiderata: to illuminate the motivational factors that lead agents to feel and act committed, to pick out the cognitive processes and situational factors that lead agents to sense that implicit commitments are in place, and to illuminate the development of an understanding of commitment in ontogeny. In order to satisfy these three desiderata, we propose a minimal framework, the core of which is an analysis of the minimal structure of situations which can elicit a sense of commitment. We then propose a way of conceptualizing and operationalizing the sense of commitment, and discuss cognitive and motivational processes which may underpin the sense of commitment. PMID:26779080

  15. Commercial radioactive waste minimization program development guidance

    SciTech Connect

    Fischer, D.K.

    1991-01-01

This document is one of two prepared by the EG&G Idaho, Inc., Waste Management Technical Support Program Group, National Low-Level Waste Management Program Unit. One of several Department of Energy responsibilities stated in the Amendments Act of 1985 is to provide technical assistance to compact regions, Host States, and nonmember States (to the extent provided in appropriations acts) in establishing waste minimization program plans. Technical assistance includes, among other things, the development of technical guidelines for volume reduction options. Pursuant to this defined responsibility, the Department of Energy (through EG&G Idaho, Inc.) has prepared this report, which includes guidance on defining a program, State/compact commission participation, and waste minimization program plans.

  17. Minimal Left-Right Symmetric Dark Matter.

    PubMed

    Heeck, Julian; Patra, Sudhanwa

    2015-09-18

    We show that left-right symmetric models can easily accommodate stable TeV-scale dark matter particles without the need for an ad hoc stabilizing symmetry. The stability of a newly introduced multiplet either arises accidentally as in the minimal dark matter framework or comes courtesy of the remaining unbroken Z_{2} subgroup of B-L. Only one new parameter is introduced: the mass of the new multiplet. As minimal examples, we study left-right fermion triplets and quintuplets and show that they can form viable two-component dark matter. This approach is, in particular, valid for SU(2)×SU(2)×U(1) models that explain the recent diboson excess at ATLAS in terms of a new charged gauge boson of mass 2 TeV.

  18. The method of minimal normal forms

    SciTech Connect

    Mane, S.R.; Weng, W.T.

    1992-01-01

    Normal form methods for solving nonlinear differential equations are reviewed and the comparative merits of three methods are evaluated. The concept of the minimal normal form is explained and is shown to be superior to other choices. The method is then extended to apply to the evaluation of discrete maps of an accelerator or storage ring. Such an extension, as suggested in this paper, is more suited for accelerator-based applications than a formulation utilizing continuous differential equations. A computer code has been generated to systematically implement various normal form formulations for maps in two-dimensional phase space. Specific examples of quadratic and cubic nonlinear fields were used and solved by the method developed. The minimal normal form method shown here gives good results using relatively low order expansions.

  20. Minimal conditions for protocell stationary growth.

    PubMed

    Bigan, Erwan; Steyaert, Jean-Marc; Douady, Stéphane

    2015-01-01

    We show that self-replication of a chemical system encapsulated within a membrane growing from within is possible without any explicit feature such as autocatalysis or metabolic closure, and without the need for their emergence through complexity. We use a protocell model relying upon random conservative chemical reaction networks with arbitrary stoichiometry, and we investigate the protocell's capability for self-replication, for various numbers of reactions in the network. We elucidate the underlying mechanisms in terms of simple minimal conditions pertaining only to the topology of the embedded chemical reaction network. A necessary condition is that each moiety must be fed, and a sufficient condition is that each siphon is fed. Although these minimal conditions are purely topological, by further endowing conservative chemical reaction networks with thermodynamically consistent kinetics, we show that the growth rate tends to increase on increasing the Gibbs energy per unit molecular weight of the nutrient and on decreasing that of the membrane precursor. PMID:25951201

  1. The Minimal Supersymmetric Fat Higgs Model

    SciTech Connect

    Harnik, Roni; Kribs, Graham D.; Larson, Daniel T.; Murayama, Hitoshi

    2003-11-26

We present a calculable supersymmetric theory of a composite "fat" Higgs boson. Electroweak symmetry is broken dynamically through a new gauge interaction that becomes strong at an intermediate scale. The Higgs mass can easily be 200-450 GeV along with the superpartner masses, solving the supersymmetric little hierarchy problem. We explicitly verify that the model is consistent with precision electroweak data without fine-tuning. Gauge coupling unification can be maintained despite the inherently strong dynamics involved in electroweak symmetry breaking. Supersymmetrizing the Standard Model therefore does not imply a light Higgs mass, contrary to the lore in the literature. The Higgs sector of the minimal Fat Higgs model has a mass spectrum that is distinctly different from the Minimal Supersymmetric Standard Model.

  2. [Minimally invasive operations in vascular surgery].

    PubMed

    Stádler, Petr; Sedivý, Petr; Dvorácek, Libor; Slais, Marek; Vitásek, Petr; El Samman, Khaled; Matous, Pavel

    2011-01-01

    Minimally invasive surgery provides an attractive alternative compared with conventional surgical approaches and is popular with patients, particularly because of its favourable cosmetic results. Vascular surgery has taken its inspiration from general surgery and, over the past few years, has also been reducing the invasiveness of its operating methods. In addition to traditional laparoscopic techniques, we most frequently encounter the endovascular treatment of aneurysms of the thoracic and abdominal aorta and, most recently, robot-assisted surgery in the area of the abdominal aorta and pelvic arteries. Minimally invasive surgical interventions also have other advantages, including less operative trauma, a reduction in post-operative pain, shorter periods spent in the intensive care unit and overall hospitalization times, an earlier return to normal life and, finally, a reduction in total treatment costs.

  3. Neurocontroller analysis via evolutionary network minimization.

    PubMed

    Ganon, Zohar; Keinan, Alon; Ruppin, Eytan

    2006-01-01

This study presents a new evolutionary network minimization (ENM) algorithm. Neurocontroller minimization is beneficial for finding small parsimonious networks that permit a better understanding of their workings. The ENM algorithm is specifically geared to an evolutionary agents setup, as it does not require any explicit supervised training error, and is very easily incorporated in current evolutionary algorithms. ENM is based on a standard genetic algorithm with an additional step during reproduction in which synaptic connections are irreversibly eliminated. It receives as input a successfully evolved neurocontroller and aims to output a pruned neurocontroller, while maintaining the original fitness level. The small neurocontrollers produced by ENM provide upper bounds on the neurocontroller size needed to perform a given task successfully, and can provide for more efficient hardware implementations. PMID:16859448

  4. Tall sections from non-minimal transformations

    NASA Astrophysics Data System (ADS)

    Morrison, David R.; Park, Daniel S.

    2016-10-01

    In previous work, we have shown that elliptic fibrations with two sections, or Mordell-Weil rank one, can always be mapped birationally to a Weierstrass model of a certain form, namely, the Jacobian of a P^{112} model. Most constructions of elliptically fibered Calabi-Yau manifolds with two sections have been carried out assuming that the image of this birational map was a "minimal" Weierstrass model. In this paper, we show that for some elliptically fibered Calabi-Yau manifolds with Mordell-Weil rank-one, the Jacobian of the P^{112} model is not minimal. Said another way, starting from a Calabi-Yau Weierstrass model, the total space must be blown up (thereby destroying the "Calabi-Yau" property) in order to embed the model into P^{112} . In particular, we show that the elliptic fibrations studied recently by Klevers and Taylor fall into this class of models.

  5. Heavier Higgs particles: Indications from Minimal Supersymmetry

    NASA Astrophysics Data System (ADS)

    Maiani, L.; Polosa, A. D.; Riquer, V.

    2012-12-01

    We use the most recent data on the Higgs-like resonance h observed at 125 GeV to derive information about the mass of the heavier Higgs particles predicted by Minimal Supersymmetry. We treat as independent parameters the couplings of h to top quark, beauty and massive vector bosons and, in this three-dimensional space, we locate the point realizing the best fit to data and compare it to the position of the Standard Model point and to the region of coupling values accommodating heavier Higgs particles in Minimal Supersymmetry. We conclude that mass values 320 ≲MH ≲ 360 GeV are compatible at 2σ with the best fit of couplings to present data, larger values being compatible at the 1σ level. Values of 1 ≲ tan β ≲ 6 are compatible with data.

  6. Topological minimally entangled states via geometric measure

    NASA Astrophysics Data System (ADS)

    Buerschaper, Oliver; García-Saez, Artur; Orús, Román; Wei, Tzu-Chieh

    2014-11-01

    Here we show how the Minimally Entangled States (MES) of a 2d system with topological order can be identified using the geometric measure of entanglement. We show this by minimizing this measure for the doubled semion, doubled Fibonacci and toric code models on a torus with non-trivial topological partitions. Our calculations are done either quasi-exactly for small system sizes, or using the tensor network approach in Orús et al (arXiv:1406.0585) for large sizes. As a byproduct of our methods, we see that the minimisation of the geometric entanglement can also determine the number of Abelian quasiparticle excitations in a given model. The results in this paper provide a very efficient and accurate way of extracting the full topological information of a 2d quantum lattice model from the multipartite entanglement structure of its ground states.

  7. Minimally Invasive Surgical Therapies for Atrial Fibrillation

    PubMed Central

    Nakamura, Yoshitsugu; Kiaii, Bob; Chu, Michael W. A.

    2012-01-01

    Atrial fibrillation is the most common sustained arrhythmia and is associated with significant risks of thromboembolism, stroke, congestive heart failure, and death. There have been major advances in the management of atrial fibrillation including pharmacologic therapies, antithrombotic therapies, and ablation techniques. Surgery for atrial fibrillation, including both concomitant and stand-alone interventions, is an effective therapy to restore sinus rhythm. Minimally invasive surgical ablation is an emerging field that aims for the superior results of the traditional Cox-Maze procedure through a less invasive operation with lower morbidity, quicker recovery, and improved patient satisfaction. These novel techniques utilize endoscopic or minithoracotomy approaches with various energy sources to achieve electrical isolation of the pulmonary veins in addition to other ablation lines. We review advancements in minimally invasive techniques for atrial fibrillation surgery, including management of the left atrial appendage. PMID:22666609

  8. Minimizing nonadiabaticities in optical-lattice loading

    NASA Astrophysics Data System (ADS)

    Dolfi, Michele; Kantian, Adrian; Bauer, Bela; Troyer, Matthias

    2015-03-01

    In the quest to reach lower temperatures of ultracold gases in optical-lattice experiments, nonadiabaticities during lattice loading represent one of the limiting factors that prevent the same low temperatures being reached as in experiments without lattices. Simulating the loading of a bosonic quantum gas into a one-dimensional optical lattice with and without a trap, we find that the redistribution of atomic density inside a global confining potential is by far the dominant source of heating. Based on these results we propose adjusting the trapping potential during loading to minimize changes to the density distribution. Our simulations confirm that a very simple linear interpolation of the trapping potential during loading already significantly decreases the heating of a quantum gas, and we discuss how loading protocols minimizing density redistributions can be designed.
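The proposed adjustment of the trapping potential can be sketched as a simple linear ramp between the initial and final trap profiles. This is an illustrative sketch only; the function name, grid sampling, and clamping behavior are assumptions, not the authors' implementation.

```python
def ramp_trap_potential(v_initial, v_final, t, t_total):
    """Linear interpolation of the trapping potential during lattice loading:

        V(x, t) = (1 - s) * V_initial(x) + s * V_final(x),

    with s = t / t_total clamped to [0, 1].

    v_initial, v_final: the potential sampled on the same spatial grid.
    """
    s = min(max(t / t_total, 0.0), 1.0)
    return [(1.0 - s) * vi + s * vf for vi, vf in zip(v_initial, v_final)]

# Halfway through the ramp, each grid point sits midway between the profiles.
print(ramp_trap_potential([0.0, 2.0, 8.0], [4.0, 6.0, 0.0], t=5.0, t_total=10.0))
```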

  9. Nonlinear transient analysis via energy minimization

    NASA Technical Reports Server (NTRS)

    Kamat, M. P.; Knight, N. F., Jr.

    1978-01-01

    The formulation basis for nonlinear transient analysis of finite element models of structures using energy minimization is provided. Geometric and material nonlinearities are included. The development is restricted to simple one and two dimensional finite elements which are regarded as being the basic elements for modeling full aircraft-like structures under crash conditions. The results indicate the effectiveness of the technique as a viable tool for this purpose.

  10. Minimizing the Fluid Used to Induce Fracturing

    NASA Astrophysics Data System (ADS)

    Boyle, E. J.

    2015-12-01

Injecting less fluid to induce fracturing means less fluid must be produced back before gas is produced. One method is to inject as fast as possible until the desired fracture length is obtained. Presented is an alternative injection strategy derived by applying optimal control theory to the macroscopic mass balance. The picture is that the fracture is constant in aperture, fluid is injected at a controlled rate at the near end, and the fracture unzips at the far end until the desired length is obtained. The velocity of the fluid is governed by Darcy's law, with larger permeability for flow along the fracture length. Fracture growth is monitored through micro-seismicity. Since the fluid is assumed to be incompressible, the rate at which fluid is injected is balanced by the rate of fracture growth and the rate of loss to the bounding rock. Minimizing injected fluid loss to the bounding rock is therefore the same as minimizing total injected fluid. How to change the injection rate so as to minimize the total injected fluid is a problem in optimal control. For a given total length, the variation of the injection rate is determined by variations in the overall time needed to obtain the desired fracture length, the length at any time, and the rate at which the fracture is growing at that time. Optimal control theory leads to a boundary condition and an ordinary differential equation in time whose solution is an injection protocol that minimizes the fluid used under the stated assumptions. That method is to monitor the rate at which the square of the fracture length is growing and adjust the injection rate proportionately.
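The closing rule of the abstract, injecting in proportion to the growth rate of the squared fracture length, can be sketched in discrete time from monitored length measurements. The function, the proportionality constant k, and the sample values are hypothetical illustrations, not the paper's actual protocol.

```python
def injection_rate(lengths, dt, k=1.0):
    """Injection rate proportional to d(L^2)/dt, estimated from discrete
    fracture-length measurements (e.g. inferred from micro-seismicity).

    lengths: fracture length at successive times, spaced dt apart.
    k: proportionality constant (illustrative; in practice it would be set
       by the fracture aperture and leak-off properties).
    Returns one rate per measurement interval.
    """
    return [k * (lengths[i] ** 2 - lengths[i - 1] ** 2) / dt
            for i in range(1, len(lengths))]

# A fracture growing from 10 m to 40 m, sampled every 60 s: the prescribed
# injection rate rises as the squared length grows faster.
rates = injection_rate([10.0, 20.0, 30.0, 40.0], dt=60.0)
print(rates)
```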

  11. Holographic dark energy from minimal supergravity

    NASA Astrophysics Data System (ADS)

    Landim, Ricardo C. G.

    2016-02-01

    We embed models of holographic dark energy (HDE) coupled to dark matter (DM) in minimal supergravity plus matter, with one chiral superfield. We analyze two cases. The first one has the Hubble radius as the infrared (IR) cutoff and the interaction between the two fluids is proportional to the energy density of the DE. The second case has the future event horizon as IR cutoff while the interaction is proportional to the energy density of both components of the dark sector.

  12. Minimal Basis for Gauge Theory Amplitudes

    SciTech Connect

    Bjerrum-Bohr, N. E. J.; Damgaard, Poul H.; Vanhove, Pierre

    2009-10-16

Identities based on monodromy for integrations in string theory are used to derive relations between different color-ordered tree-level amplitudes in both bosonic and supersymmetric string theory. These relations imply that the color-ordered tree-level n-point gauge theory amplitudes can be expanded in a minimal basis of (n-3)! amplitudes. This result holds for any choice of polarizations of the external states and in any number of dimensions.
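The counting in this abstract can be made concrete. As a sketch, the standard reduction chain for color-ordered tree amplitudes runs from (n-1)! distinct cyclic orderings, to (n-2)! after the Kleiss-Kuijf relations (a known fact, though not stated in the abstract), down to the minimal (n-3)! monodromy basis; the function name is illustrative.

```python
from math import factorial

def color_ordered_counts(n):
    """Counts of independent color-ordered n-point tree amplitudes at
    successive levels of reduction. The final entry is the minimal
    (n-3)! basis implied by the monodromy relations."""
    return {
        "cyclic":          factorial(n - 1),  # distinct cyclic orderings
        "kleiss_kuijf":    factorial(n - 2),  # after Kleiss-Kuijf relations
        "monodromy_basis": factorial(n - 3),  # minimal (n-3)! basis
    }

# E.g. at 6 points, 120 orderings collapse to a basis of just 6 amplitudes.
for n in (4, 5, 6):
    print(n, color_ordered_counts(n))
```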

  13. Minimally invasive dentistry: a review and update.

    PubMed

    Brostek, Andrew M; Bochenek, Andrew J; Walsh, Laurence J

    2006-06-01

    The term "Minimal Invasive (MI) Dentistry" can best be defined as the management of caries with a biological approach rather than with a traditional (surgical) operative dentistry approach. Where operative dentistry is required, it is now carried out in the most conservative manner, with minimal destruction of tooth structure. This new approach to caries management shifts the emphasis from diagnosing carious lesions as cavities (and a repeating cycle of restorations) to diagnosing the oral ecological imbalance and effecting biological changes in the biofilm. The goal of MI is to stop the disease process and then to restore lost tooth structure and function, maximizing the healing potential of the tooth. The thought process that underpins this minimally invasive approach can be organized into three main categories: (1) Recognize, which means identify patient caries risk; (2) Remineralize, which means prevent caries and reverse non-cavitated caries; and (3) Repair, which means control caries activity, maximize healing, and repair the damage. The disease of dental caries is not just demineralization but a process of repeated demineralization cycles caused by an imbalance in the ecological and chemical equilibrium of the biofilm/tooth interface (the ecological plaque hypothesis). Dietary and lifestyle patterns, especially carbohydrate frequency, water intake and smoking, play an important role in changing the biofilm ecology and pathogenicity. Tools for chairside assessment of saliva and plaque allow risk to be assessed and patient compliance monitored. The remineralizing properties of saliva can be enhanced using materials that release biologically available calcium, phosphate and fluoride ions (CPP-ACP and CPP-ACFP). Use of biocides can also alter the pathogenic properties of plaque. Use of these MI treatment protocols can repair early lesions and improve patient understanding and compliance. This review article introduces some of the key concepts

  14. Smooth GERBS, orthogonal systems and energy minimization

    SciTech Connect

    Dechevsky, Lubomir T. E-mail: pza@hin.no; Zanaty, Peter E-mail: pza@hin.no

    2013-12-18

    New results are obtained in three mutually related directions of the rapidly developing theory of generalized expo-rational B-splines (GERBS) [7, 6]: closed-form computability of C∞-smooth GERBS in terms of elementary and special functions, Hermite interpolation and least-squares best approximation via smooth GERBS, and energy-minimizing properties of smooth GERBS similar to those of the classical cubic polynomial B-splines.

  15. Perennial CBL centering problem can be minimized

    SciTech Connect

    Pilkington, P.E.

    1987-11-30

    The cement bond log (CBL) has been plagued by the problem of poor tool centering since it was introduced to the oil industry. There are reasons why poor tool centering continues to cause problems, and there are steps that operators and service companies can take to minimize this unnecessary phenomenon. Before going into these proposed solutions, some case histories are presented to illustrate the problem.

  16. Asymptotic safety, emergence and minimal length

    NASA Astrophysics Data System (ADS)

    Percacci, Roberto; Vacca, Gian Paolo

    2010-12-01

    There seems to be a common prejudice that asymptotic safety is either incompatible with, or at best unrelated to, the other topics in the title. This is not the case. In fact, we show that (1) the existence of a fixed point with suitable properties is a promising way of deriving emergent properties of gravity, and (2) there is a sense in which asymptotic safety implies a minimal length. In doing so we also discuss possible signatures of asymptotic safety in scattering experiments.

  17. [Minimally Invasive Treatment of Esophageal Benign Diseases].

    PubMed

    Inoue, Haruhiro

    2016-07-01

    As a minimally invasive treatment of esophageal achalasia, per-oral endoscopic myotomy (POEM) was developed in 2008. More than 1,100 cases of achalasia-related diseases have received POEM. The success rate of the procedure was more than 95% (Eckardt score improvement of 3 points or more). No serious (Clavien-Dindo classification IIIb or higher) complication was experienced. These results suggest that POEM is becoming a standard minimally invasive treatment for achalasia-related diseases. As an offshoot of POEM, submucosal tumor removal through a submucosal tunnel (per-oral endoscopic tumor resection: POET) was developed and is safely performed. The best indication for POET is esophageal leiomyoma of less than 5 cm. A novel endoscopic treatment of gastroesophageal reflux disease (GERD) was also developed: anti-reflux mucosectomy (ARMS), a nearly circumferential mucosal reduction of the gastric cardia mucosa. ARMS has been performed in 56 consecutive cases of refractory GERD, with no major complications and excellent clinical results. The best indication for ARMS is refractory GERD without a long sliding hernia. The longest follow-up case is more than 10 years. Minimally invasive treatments for esophageal benign diseases are currently performed by therapeutic endoscopy. PMID:27440038

  18. New algorithms for the "minimal form" problem

    SciTech Connect

    Oliveira, J.S.; Cook, G.O. Jr.; Purtill, M.R. (Center for Communications Research)

    1991-12-20

    It is widely appreciated that large-scale algebraic computation (performing computer algebra operations on large symbolic expressions) places very significant demands upon existing computer algebra systems. Because of this, parallel versions of many important algorithms have been successfully sought, and clever techniques have been found for improving the speed of the algebraic simplification process. In addition, some attention has been given to the issue of restructuring large expressions, or transforming them into "minimal forms." By "minimal form," we mean that form of an expression that involves a minimum number of operations, in the sense that no simple transformation on the expression leads to a form involving fewer operations. Unfortunately, the progress that has been achieved to date on this very hard problem is not adequate for the very significant demands of large computer algebra problems. In response to this situation, we have developed some efficient algorithms for constructing "minimal forms." In this paper, the multi-stage algorithm in which these new algorithms operate is defined and the features of these algorithms are developed. In a companion paper, we introduce the core algebra engine of a new tool that provides the algebraic framework required for the implementation of these new algorithms.

  19. Waste minimization in an autobody repair shop

    SciTech Connect

    Baria, D.N.; Dorland, D.; Bergeron, J.T.

    1994-12-31

    This work was done to document the waste minimization incorporated in a new autobody repair facility in Hermantown, Minnesota. Humes Collision Center incorporated new waste reduction techniques when it expanded its old facilities in 1992 and was able to achieve the benefits of cost reduction and waste reduction. Humes Collision Center repairs an average of 500 cars annually and is a very small quantity generator (VSQG) of hazardous waste, as defined by the Minnesota Pollution Control Agency (MPCA). The hazardous waste consists of antifreeze, batteries, paint sludge, refrigerants, and used oil, while the nonhazardous waste consists of cardboard, glass, paint filters, plastic, sanding dust, scrap metal, and wastewater. The hazardous and nonhazardous waste outputs were decreased by 72%. In addition, there was a 63% reduction in operating costs. The waste minimization measures include antifreeze recovery and recycling; reduction of unused waste paint; reduction, recovery, and recycling of waste lacquer thinner used for cleaning spray guns and paint cups; elimination of used plastic car bags; recovery and recycling of refrigerant; reduction of waste sandpaper and elimination of sanding dust; and elimination of waste paint filters. The rate of return on the investment in waste minimization equipment is estimated to range from 37% per year for the distillation unit, through 80% for vacuum sanding, 146% for computerized paint mixing, and 211% for the refrigerant recycler, to 588% per year for the gun washer. The corresponding payback time varies from three years down to two months.

  20. The non-minimal ekpyrotic trispectrum

    SciTech Connect

    Fertig, Angelika; Lehners, Jean-Luc E-mail: jlehners@aei.mpg.de

    2016-01-01

    Employing the covariant formalism, we derive the evolution equations for two scalar fields with non-canonical field space metric up to third order in perturbation theory. These equations can be used to derive predictions for local bi- and trispectra of multi-field cosmological models. Our main application is to ekpyrotic models in which the primordial curvature perturbations are generated via the non-minimal entropic mechanism. In these models, nearly scale-invariant entropy perturbations are generated first due to a non-minimal kinetic coupling between two scalar fields, and subsequently these perturbations are converted into curvature perturbations. Remarkably, the entropy perturbations have vanishing bi- and trispectra during the ekpyrotic phase. However, as we show, the conversion process to curvature perturbations induces local non-Gaussianity parameters f_NL and g_NL at levels that should be detectable by near-future observations. In fact, in order to obtain a large enough amplitude and small enough bispectrum of the curvature perturbations, as seen in current measurements, the conversion process must be very efficient. Interestingly, for such efficient conversions the trispectrum parameter g_NL remains negative and typically of a magnitude O(10^2)–O(10^3), resulting in a distinguishing feature of non-minimally coupled ekpyrotic models.

  1. Esophageal surgery in minimally invasive era

    PubMed Central

    Bencini, Lapo; Moraldi, Luca; Bartolini, Ilenia; Coratti, Andrea

    2016-01-01

    The widespread popularity of new surgical technologies such as laparoscopy, thoracoscopy and robotics has led many surgeons to treat esophageal diseases with these methods. The expected benefits of minimally invasive surgery (MIS) mainly include reductions of postoperative complications, length of hospital stay, and pain and better cosmetic results. All of these benefits could potentially be of great interest when dealing with the esophagus due to the potentially severe complications that can occur after conventional surgery. Moreover, robotic platforms are expected to reduce many of the difficulties encountered during advanced laparoscopic and thoracoscopic procedures such as anastomotic reconstructions, accurate lymphadenectomies, and vascular sutures. Almost all esophageal diseases are approachable in a minimally invasive way, including diverticula, gastro-esophageal reflux disease, achalasia, perforations and cancer. Nevertheless, while the limits of MIS for benign esophageal diseases are mainly technical issues and costs, oncologic outcomes remain the cornerstone of any procedure to cure malignancies, for which the long-term results are critical. Furthermore, many of the minimally invasive esophageal operations should be compared to pharmacologic interventions and advanced pure endoscopic procedures; such a comparison requires a difficult literature analysis and leads to some confounding results of clinical trials. This review aims to examine the evidence for the use of MIS in both malignancies and more common benign disease of the esophagus, with a particular emphasis on future developments and ongoing areas of research. PMID:26843913

  2. Minimally invasive local therapies for liver cancer

    PubMed Central

    Li, David; Kang, Josephine; Golas, Benjamin J.; Yeung, Vincent W.; Madoff, David C.

    2014-01-01

    Primary and metastatic liver tumors are an increasing global health problem, with hepatocellular carcinoma (HCC) now being the third leading cause of cancer-related mortality worldwide. Systemic treatment options for HCC remain limited, with Sorafenib as the only prospectively validated agent shown to increase overall survival. Surgical resection and/or transplantation, locally ablative therapies and regional or locoregional therapies have filled the gap in liver tumor treatments, providing improved survival outcomes for both primary and metastatic tumors. Minimally invasive local therapies have an increasing role in the treatment of both primary and metastatic liver tumors. For patients with low volume disease, these therapies have now been established into consensus practice guidelines. This review highlights technical aspects and outcomes of commonly utilized, minimally invasive local therapies including laparoscopic liver resection (LLR), radiofrequency ablation (RFA), microwave ablation (MWA), high-intensity focused ultrasound (HIFU), irreversible electroporation (IRE), and stereotactic body radiation therapy (SBRT). In addition, the role of combination treatment strategies utilizing these minimally invasive techniques is reviewed. PMID:25610708

  3. Minimally invasive thyroidectomy: a ten years experience

    PubMed Central

    Viani, Lorenzo; Montana, Chiara Montana; Cozzani, Federico; Sianesi, Mario

    2016-01-01

    Background The conventional thyroidectomy is the most frequent surgical procedure for thyroid surgical disease. Minimally invasive approaches to thyroid surgery were introduced several years ago. These new procedures improved the incidence of postoperative pain, cosmetic results, patients' quality of life, and postoperative morbidity. The minimally invasive video-assisted thyroidectomy (MIVAT) is a minimally invasive procedure that uses a mini-cervicotomy to treat thyroid diseases. Methods We present our experience with 497 consecutively treated patients using the MIVAT technique. We analyzed mean age, sex, mean operative time, rates of bleeding, hypocalcemia, and transitory and definitive nerve palsy (6 months after the procedure), postoperative pain on a scale from 0 to 10 at 1 hour and 24 hours after surgery, and mean hospital stay. Results The indications to treat were related to preoperative diagnosis: 182 THYR 6, 184 THYR 3–4, 27 Plummer disease, 24 Basedow disease, 28 toxic goiter, 52 goiter. In 497 cases we reported 1 case of bleeding (0.2%), 12 cases (2.4%) of transitory nerve palsy, and 4 (0.8%) of definitive nerve palsy. The rate of serologic hypocalcemia was 24.9% (124 cases) and of clinical hypocalcemia 7.2% (36 cases); there was 1 case of hypoparathyroidism (0.2%). Conclusions MIVAT is a safe approach to surgical thyroid disease; its costs and adverse events are similar to those of conventional thyroidectomy. The mini-cervicotomy truly provides minimally invasive tissue dissection. PMID:27294036

  4. Feminist Methodologies and Engineering Education Research

    ERIC Educational Resources Information Center

    Beddoes, Kacey

    2013-01-01

    This paper introduces feminist methodologies in the context of engineering education research. It builds upon other recent methodology articles in engineering education journals and presents feminist research methodologies as a concrete engineering education setting in which to explore the connections between epistemology, methodology and theory.…

  5. A Mixed Methodological Analysis of the Role of Culture in the Clinical Decision-Making Process

    ERIC Educational Resources Information Center

    Hays, Danica G.; Prosek, Elizabeth A.; McLeod, Amy L.

    2010-01-01

    Even though literature indicates that particular cultural groups receive more severe diagnoses at disproportionate rates, there has been minimal research that addresses how culture interfaces specifically with clinical decision making. This mixed methodological study of 41 counselors indicated that cultural characteristics of both counselors and…

  6. Defect reduction through Lean methodology

    NASA Astrophysics Data System (ADS)

    Purdy, Kathleen; Kindt, Louis; Densmore, Jim; Benson, Craig; Zhou, Nancy; Leonard, John; Whiteside, Cynthia; Nolan, Robert; Shanks, David

    2010-09-01

    Lean manufacturing is a systematic method of identifying and eliminating waste. Use of Lean manufacturing techniques at the IBM photomask manufacturing facility has increased the efficiency and productivity of the photomask process. Tools such as value stream mapping, 5S, and structured problem solving are widely used today. In this paper we describe a step-by-step Lean technique used to systematically decrease defects, resulting in reduced material costs, inspection costs, and cycle time. The method consists of an 8-step approach commonly referred to as the 8D problem solving process. This process allowed us to identify both prominent issues and more subtle problems requiring in-depth investigation. The methodology is flexible and can be applied to numerous situations. Advantages of the Lean methodology are also discussed.

  7. Methodological Challenges in Online Trials

    PubMed Central

    Khadjesari, Zarnie; White, Ian R; Kalaitzaki, Eleftheria; Godfrey, Christine; McCambridge, Jim; Thompson, Simon G; Wallace, Paul

    2009-01-01

    Health care and health care services are increasingly being delivered over the Internet. There is a strong argument that interventions delivered online should also be evaluated online to maximize the trial’s external validity. Conducting a trial online can help reduce research costs and improve some aspects of internal validity. To date, there are relatively few trials of health interventions that have been conducted entirely online. In this paper we describe the major methodological issues that arise in trials (recruitment, randomization, fidelity of the intervention, retention, and data quality), consider how the online context affects these issues, and use our experience of one online trial evaluating an intervention to help hazardous drinkers drink less (DownYourDrink) to illustrate potential solutions. Further work is needed to develop online trial methodology. PMID:19403465

  8. [CODESIGN METHODOLOGIES: AN ENABLING RESOURCE?].

    PubMed

    Oboeuf, Alexandre; Aiguier, Grégory; Loute, Alain

    2016-01-01

    To reflect on the learning of relationships in care, seventeen people were mobilized to participate in a codesign day. This methodology aims to foster the creativity of a group through a succession of creativity exercises. This article is primarily intended to reflect on the conditions under which such a methodology can become a resource for thinking about the learning of ethics. The role of affectivity in the success of a codesign day is questioned. This work highlights, in particular, its central place in the construction of an innovative climate and in the mechanism of divergent thinking. The article aims to open new questions on the articulation of the exercises, affectivity, and the role of the animator or that of the patient. These research perspectives invite interdisciplinary dialogue. PMID:27305797

  9. Minimal selective concentrations of tetracycline in complex aquatic bacterial biofilms.

    PubMed

    Lundström, Sara V; Östman, Marcus; Bengtsson-Palme, Johan; Rutgersson, Carolin; Thoudal, Malin; Sircar, Triranta; Blanck, Hans; Eriksson, K Martin; Tysklind, Mats; Flach, Carl-Fredrik; Larsson, D G Joakim

    2016-05-15

    Selection pressure generated by antibiotics released into the environment could enrich for antibiotic resistance genes and antibiotic resistant bacteria, thereby increasing the risk for transmission to humans and animals. Tetracyclines comprise an antibiotic class of great importance to both human and animal health. Accordingly, residues of tetracycline are commonly detected in aquatic environments. To assess if tetracycline pollution in aquatic environments promotes development of resistance, we determined minimal selective concentrations (MSCs) in biofilms of complex aquatic bacterial communities using both phenotypic and genotypic assays. Tetracycline significantly increased the relative abundance of resistant bacteria at 10 μg/L, while specific tet genes (tetA and tetG) increased significantly at the lowest concentration tested (1 μg/L). Taxonomic composition of the biofilm communities was altered with increasing tetracycline concentrations. Metagenomic analysis revealed a concurrent increase of several tet genes and a range of other genes providing resistance to different classes of antibiotics (e.g. cmlA, floR, sul1, and mphA), indicating potential for co-selection. Consequently, MSCs for the tet genes of ≤ 1 μg/L suggest that current exposure levels in e.g. sewage treatment plants could be sufficient to promote resistance. The methodology used here to assess MSCs could be applied in risk assessment of other antibiotics as well. PMID:26938321

  10. Iterative minimization algorithm for efficient calculations of transition states

    NASA Astrophysics Data System (ADS)

    Gao, Weiguo; Leng, Jing; Zhou, Xiang

    2016-03-01

    This paper presents an efficient algorithmic implementation of the iterative minimization formulation (IMF) for fast local search of transition state on potential energy surface. The IMF is a second order iterative scheme providing a general and rigorous description for the eigenvector-following (min-mode following) methodology. We offer a unified interpretation in numerics via the IMF for existing eigenvector-following methods, such as the gentlest ascent dynamics, the dimer method and many other variants. We then propose our new algorithm based on the IMF. The main feature of our algorithm is that the translation step is replaced by solving an optimization subproblem associated with an auxiliary objective function which is constructed from the min-mode information. We show that using an efficient scheme for the inexact solver and enforcing an adaptive stopping criterion for this subproblem, the overall computational cost will be effectively reduced and a super-linear rate between the accuracy and the computational cost can be achieved. A series of numerical tests demonstrate the significant improvement in the computational efficiency for the new algorithm.
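
    The eigenvector-following idea that the IMF generalizes can be sketched in a few lines: walk uphill along the minimum-curvature mode and downhill in all other directions until a saddle (transition state) is reached. The Python below is a minimal illustration on an assumed two-dimensional test potential V(x, y) = (x² − 1)² + y², not the authors' accelerated algorithm; it uses the exact Hessian where practical methods would use a dimer or Lanczos estimate of the min-mode.

```python
import numpy as np

# Minimal sketch of eigenvector-following (min-mode following) on an
# assumed test potential V(x, y) = (x^2 - 1)^2 + y^2; NOT the authors'
# accelerated IMF-based algorithm.

def grad(p):
    x, y = p
    return np.array([4 * x * (x**2 - 1), 2 * y])

def hessian(p):
    x, _ = p
    return np.array([[12 * x**2 - 4, 0.0], [0.0, 2.0]])

def find_saddle(p0, h=0.05, steps=500):
    """Walk uphill along the min-mode, downhill in other directions."""
    p = np.asarray(p0, dtype=float)
    for _ in range(steps):
        _, vecs = np.linalg.eigh(hessian(p))
        v = vecs[:, 0]                       # lowest-curvature eigenvector
        g = grad(p)
        # Reverse the force component along the min-mode: step with
        # -(I - 2 v v^T) grad V.
        p -= h * (g - 2 * np.dot(v, g) * v)
    return p

print(find_saddle([0.5, 0.5]))
```

    Starting from (0.5, 0.5), the iteration converges to the saddle at the origin, where the curvature is negative along x and positive along y.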

  12. Minimizing inter-microscope variability in dental microwear texture analysis

    NASA Astrophysics Data System (ADS)

    Arman, Samuel D.; Ungar, Peter S.; Brown, Christopher A.; DeSantis, Larisa R. G.; Schmidt, Christopher; Prideaux, Gavin J.

    2016-06-01

    A common approach to dental microwear texture analysis (DMTA) uses confocal profilometry in concert with scale-sensitive fractal analysis to help understand the diets of extinct mammals. One of the main benefits of DMTA over other methods is the repeatable, objective manner of data collection. This repeatability, however, is threatened by variation in the results of DMTA of the same dental surfaces yielded by different microscopes. Here we compare DMTA data of five species of kangaroos measured on seven profilers of varying specifications. Comparison between microscopes confirms that inter-microscope differences are present, but we show that deploying a number of automated treatments to remove measurement noise can help minimize these differences. Applying the same treatments to a published hominin DMTA dataset shows that they alter some significant differences between dietary groups. Minimising microscope variability while maintaining interspecific dietary differences therefore requires that these factors be balanced in determining appropriate treatments. The process outlined here offers a solution for allowing comparison of data between microscopes, which is essential for ongoing DMTA research. In addition, the process undertaken, including consideration of other elements of DMTA protocols, also promises to streamline methodology, remove measurement noise, and in doing so optimize recovery of a reliable dietary signature.

  13. Minimizing variability of cascade impaction measurements in inhalers and nebulizers.

    PubMed

    Bonam, Matthew; Christopher, David; Cipolla, David; Donovan, Brent; Goodwin, David; Holmes, Susan; Lyapustina, Svetlana; Mitchell, Jolyon; Nichols, Steve; Pettersson, Gunilla; Quale, Chris; Rao, Nagaraja; Singh, Dilraj; Tougas, Terrence; Van Oort, Mike; Walther, Bernd; Wyka, Bruce

    2008-01-01

    The purpose of this article is to catalogue in a systematic way the available information about factors that may influence the outcome and variability of cascade impactor (CI) measurements of pharmaceutical aerosols for inhalation, such as those obtained from metered dose inhalers (MDIs), dry powder inhalers (DPIs) or products for nebulization; and to suggest ways to minimize the influence of such factors. To accomplish this task, the authors constructed a cause-and-effect Ishikawa diagram for a CI measurement and considered the influence of each root cause based on industry experience and thorough literature review. The results illustrate the intricate network of underlying causes of CI variability, with the potential for several multi-way statistical interactions. It was also found that significantly more quantitative information exists about impactor-related causes than about operator-derived influences, the contribution of drug assay methodology and product-related causes, suggesting a need for further research in those areas. The understanding and awareness of all these factors should aid in the development of optimized CI methods and appropriate quality control measures for aerodynamic particle size distribution (APSD) of pharmaceutical aerosols, in line with the current regulatory initiatives involving quality-by-design (QbD).

  14. Software engineering methodologies and tools

    NASA Technical Reports Server (NTRS)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for a general evaluation of computer-assisted software engineering (CASE) tools, drawing on actual installation of and experimentation with some specific tools.

  15. Energy Efficiency Indicators Methodology Booklet

    SciTech Connect

    Sathaye, Jayant; Price, Lynn; McNeil, Michael; de la rue du Can, Stephane

    2010-05-01

    This Methodology Booklet provides a comprehensive review and methodology guiding principles for constructing energy efficiency indicators, with illustrative examples of application to individual countries. It reviews work done by international agencies and national governments in constructing meaningful energy efficiency indicators that help policy makers to assess changes in energy efficiency over time. Building on past OECD experience and best practices, and the knowledge of these countries' institutions, relevant sources of information to construct an energy indicator database are identified. A framework based on levels of hierarchy of indicators -- spanning from aggregate, macro-level to disaggregated end-use-level metrics -- is presented to help shape the understanding of assessing energy efficiency. In each sector of activity (industry, commercial, residential, agriculture, and transport), indicators are presented and recommendations to distinguish the different factors affecting energy use are highlighted. The methodology booklet specifically addresses issues that are relevant to developing indicators where activity is a major factor driving energy demand. A companion spreadsheet tool is available upon request.
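
    The lowest level of the indicator hierarchy described above reduces to a simple ratio: energy use per unit of activity. The sketch below computes such intensities for two invented sectors; the sector names, activity drivers, and figures are assumptions for illustration, not data from the booklet.

```python
# Hedged sketch of a disaggregated, activity-based efficiency indicator:
# sector energy intensity = energy use / activity. All figures below are
# invented for illustration.

def energy_intensity(energy, activity):
    """Energy per unit of activity (e.g., PJ per billion USD value added)."""
    return energy / activity

sectors = {
    # sector: (energy use in PJ, activity driver)
    "industry":    (1200.0, 300.0),  # activity: value added, billion USD
    "residential": (800.0, 40.0),    # activity: millions of households
}

for name, (energy, activity) in sectors.items():
    print(f"{name}: {energy_intensity(energy, activity):.1f}")
```

    Tracking such ratios over time, at each level of the hierarchy, is what lets policy makers separate changes in efficiency from changes in activity.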

  16. Methodology Rigor in Clinical Research

    PubMed Central

    Yang, Lynda J-S.; Chang, Kate W-C.; Chung, Kevin C.

    2012-01-01

    Background Methodology rigor increases the quality of clinical research by encouraging freedom from the biases inherent in clinical studies. As randomized controlled studies (clinical trial designs) are rarely applicable to surgical research, we address the commonly used observational study designs and methodologies by presenting guidelines for rigor. Methods Review of study designs including cohort, case-control, and cross-sectional studies and case series/reports, as well as biases and confounders of study design. Results Details about biases and confounders at each study stage, study characteristics, rigor checklists, and published literature examples for each study design are summarized and presented in this report. Conclusions For those surgeons interested in pursuing clinical research, mastery of the principles of methodology rigor is imperative in the context of evidence-based medicine and widespread publication of clinical studies. Knowledge of the study designs, their appropriate application, and strict adherence to study design methods can provide high-quality evidence to serve as the basis for rational clinical decision-making. PMID:22634695

  17. Calculating averted caries attributable to school-based sealant programs with a minimal data set

    PubMed Central

    Griffin, Susan O.; Jones, Kari; Crespin, Matthew

    2016-01-01

    Objectives We describe a methodology for school-based sealant programs (SBSPs) to estimate averted cavities (i.e., the difference in cavities without and with the SBSP) over 9 years using a minimal data set. Methods A Markov model was used to estimate averted cavities. SBSPs would input estimates of their annual attack rate (AR) and 1-year retention rate. The model estimated retention 2+ years after placement with a functional form obtained from the literature. Assuming a constant AR, an SBSP can estimate its AR with child-level data collected prior to sealant placement on sealant presence, number of decayed/filled first molars, and age. We demonstrate the methodology with data from the Wisconsin SBSP. Finally, we examine how sensitive the averted-cavities estimate obtained with this methodology is if an SBSP were to over- or underestimate its AR or 1-year retention. Results Demonstrating the methodology with the estimated AR (= 7 percent) and 1-year retention (= 92 percent) from the Wisconsin SBSP data, we found that placing 31,324 sealants averted 10,718 cavities. Sensitivity analysis indicated that, for any AR, the magnitude of the error (percent) in estimating averted cavities was always less than the magnitude of the error in specifying the AR and equal to the error in specifying the 1-year retention rate. We also found that estimates of averted cavities were more robust to misspecifications of AR for higher- versus lower-risk children. Conclusions With Excel (Microsoft Corporation, Redmond, WA, USA) spreadsheets available upon request, SBSPs can use this methodology to generate reasonable estimates of their impact with a minimal data set. PMID:24423023
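
    A minimal sketch of such a cohort Markov calculation might look like the following; note that the paper takes its retention curve from the literature, whereas this sketch simply assumes, purely for illustration, that retention decays geometrically after placement:

```python
# Hypothetical sketch of a cohort Markov model for averted cavities.
# ASSUMPTION (for illustration only): retention at year t is r1 ** t,
# i.e. geometric decay from the 1-year retention rate. The paper's
# actual functional form differs.

def averted_cavities(n_sealants, attack_rate, r1, years=9):
    p_sound_no = 1.0   # P(first molar still sound) without a sealant
    p_sound_yes = 1.0  # P(first molar still sound) with a sealant
    for t in range(1, years + 1):
        retention = r1 ** t          # assumed retention curve
        p_sound_no *= (1.0 - attack_rate)
        # While the sealant is retained the tooth is protected, so the
        # effective attack rate is scaled by the unprotected fraction.
        p_sound_yes *= (1.0 - attack_rate * (1.0 - retention))
    averted_per_tooth = p_sound_yes - p_sound_no
    return n_sealants * averted_per_tooth

# With the Wisconsin-like inputs (AR = 7%, 1-year retention = 92%):
print(round(averted_cavities(31324, 0.07, 0.92)))
```

    Because the retention curve here is an assumption, the output is only indicative of the mechanics, not a reproduction of the paper's 10,718 figure.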

  18. Minimal-change disease secondary to etanercept.

    PubMed

    Koya, Mariko; Pichler, Raimund; Jefferson, J Ashley

    2012-10-01

    Etanercept is a soluble tumor necrosis factor alpha (TNFα) receptor which is widely used in the treatment of rheumatoid arthritis, psoriasis and other autoimmune inflammatory disorders. It is known for its relative lack of nephrotoxicity; however, there are reports on the development of nephrotic syndrome associated with the treatment with TNFα antagonists. Here, we describe a patient with psoriasis who developed biopsy-proven minimal-change disease (MCD) shortly after initiating etanercept. Our case is unique in that the MCD resolved after discontinuation of this medication, notably without the use of corticosteroids, strongly suggesting a drug-related phenomenon.

  19. Minimally Invasive Approach of a Retrocaval Ureter

    PubMed Central

    Pinheiro, Hugo; Ferronha, Frederico; Morales, Jorge; Campos Pinheiro, Luís

    2016-01-01

    The retrocaval ureter is a rare congenital entity, classically managed with open pyeloplasty techniques. The experience obtained with the laparoscopic approach to other, more frequent causes of ureteropelvic junction (UPJ) obstruction has opened the way for a minimally invasive approach to the retrocaval ureter. In this paper, we describe a clinical case of a right retrocaval ureter managed successfully with laparoscopic dismembered pyeloplasty. The main steps of the procedure are described. Our results were similar to those published by other urologic centers, which demonstrates the safety and feasibility of the procedure for this condition.

  20. [Minimally-invasive therapy of urinary stones].

    PubMed

    Knoll, T; Trojan, L; Haecker, A; Michel, M S; Köhrmann, K U; Alken, P

    2003-09-01

    Open surgery was the standard therapy for urinary calculi up to about 30 years ago. This changed upon introduction of extracorporeal shockwave lithotripsy (ESWL) in 1980, a procedure that is now the primary therapy for 70 % of the patients in western countries. Simultaneously, endourological procedures like ureterorenoscopy (URS) and percutaneous nephrolithotripsy (PCNL) have been improved, and now, modern small diameter and highly efficient instruments offer an ideal alternative to shockwave lithotripsy. Today, minimally-invasive stone treatment has replaced open stone surgery almost completely. This article introduces ESWL, URS and PCNL and discusses indications, outcomes and limitations.

  1. The minimal length and quantum partition functions

    NASA Astrophysics Data System (ADS)

    Abbasiyan-Motlaq, M.; Pedram, P.

    2014-08-01

    We study the thermodynamics of various physical systems in the framework of the generalized uncertainty principle that implies a minimal length uncertainty proportional to the Planck length. We present a general scheme to analytically calculate the quantum partition function of the physical systems to first order of the deformation parameter based on the behavior of the modified energy spectrum and compare our results with the classical approach. Also, we find the modified internal energy and heat capacity of the systems for the anti-Snyder framework.

  2. Temperature-Transformed ``Minimal Coupling'': Magnetofluid Unification

    NASA Astrophysics Data System (ADS)

    Mahajan, S. M.

    2003-01-01

    The dynamics of a relativistic, hot charged fluid is expressed in terms of a hybrid magnetofluid field which unifies the electromagnetic field with an appropriately defined but analogous flow field. The unification is effected by a well-defined prescription that allows the derivation of the equations of motion of a plasma embedded in an electromagnetic field from the field-free equations. The relationship of this prescription with the minimal coupling prescription of particle dynamics is discussed; the changes brought about by the plasma temperature are highlighted. A few consequences of the unification are worked out.

  3. Minimally Invasive Approach of a Retrocaval Ureter

    PubMed Central

    Pinheiro, Hugo; Ferronha, Frederico; Morales, Jorge; Campos Pinheiro, Luís

    2016-01-01

    The retrocaval ureter is a rare congenital entity, classically managed with open pyeloplasty techniques. The experience obtained with the laparoscopic approach to other, more frequent causes of ureteropelvic junction (UPJ) obstruction has opened the way for a minimally invasive approach to the retrocaval ureter. In this paper, we describe a clinical case of a right retrocaval ureter managed successfully with laparoscopic dismembered pyeloplasty. The main steps of the procedure are described. Our results were similar to those published by other urologic centers, which demonstrates the safety and feasibility of the procedure for this condition. PMID:27635277

  4. Minimally Invasive Transforaminal Lumbar Interbody Fusion.

    PubMed

    Ahn, Junyoung; Tabaraee, Ehsan; Singh, Kern

    2015-07-01

    Minimally invasive transforaminal lumbar interbody fusion (MIS TLIF) is performed via tubular dilators, thereby preserving the integrity of the paraspinal musculature. The decreased soft tissue disruption in the MIS technique has been associated with significantly decreased blood loss, shorter length of hospitalization, and an expedited return to work, while maintaining arthrodesis rates comparable to the open technique, particularly in the setting of spondylolisthesis (isthmic and degenerative), recurrent symptomatic disk herniation, spinal stenosis, pseudoarthrosis, iatrogenic instability, and spinal trauma. The purpose of this article and the accompanying video was to demonstrate the techniques for a primary, single-level MIS TLIF. PMID:26079840

  5. Minimal electroweak model for monopole annihilation

    SciTech Connect

    Farris, T.H.; Kephart, T.W.; Weiler, T.J.; Yuan, T.C.

    1992-02-03

    We construct the minimal (most economical in fields) extension of the standard model implementing the Langacker-Pi mechanism for reducing the grand unified theory (GUT) monopole cosmic density to an allowed level. The model contains just a single charged scalar field in addition to the standard Higgs doublet, and is easily embeddable in any GUT. We identify the region of parameter space where monopoles annihilate in the higher temperature early Universe. A particularly alluring possibility is that the demise of monopoles at the electroweak scale is in fact the origin of the Universe's net baryon number.

  6. Minimal resonant leptogenesis and lepton flavour violation

    SciTech Connect

    Deppisch, Frank F.; Pilaftsis, Apostolos

    2012-07-27

    We discuss minimal non-supersymmetric models of resonant leptogenesis based on approximate flavour symmetries. As an illustrative example, we consider a resonant τ-leptogenesis model, compatible with universal right-handed neutrino masses at the GUT scale, where the required heavy-neutrino mass splittings are generated radiatively. In particular, we explicitly demonstrate how a minimum of three heavy Majorana neutrinos is needed to obtain successful leptogenesis and experimentally testable rates for processes of lepton flavour violation, such as μ→eγ and μ→e conversion in nuclei.

  7. Minimally invasive surgery for esophageal achalasia

    PubMed Central

    Chen, Huan-Wen

    2016-01-01

    Esophageal achalasia is a functional esophageal disorder caused by neuromuscular dysfunction. Its main features are the absence of esophageal peristalsis, elevated lower esophageal sphincter pressure, and impaired sphincter relaxation on swallowing. Dissection of the lower esophageal muscular layer is one of the main treatments for esophageal achalasia, and complete thoracoscopic esophageal myotomy is currently one such treatment. Drawing on our experience in minimally invasive esophageal surgery, we improved the incision and operative procedure and adopted complete thoracoscopic esophageal myotomy for the treatment of esophageal achalasia. PMID:27499977

  8. Minimal polar swimmer at low Reynolds number.

    PubMed

    Pandey, Ankita; Simha, R Aditi

    2012-06-01

    We propose a minimal model for a polar swimmer, consisting of two spheres connected by a rigid slender arm, at low Reynolds number. The propulsive velocity for the proposed model is the maximum for any swimming cycle with the same variations in its two degrees of freedom and its displacement in a cycle is achieved entirely in one step. The stroke averaged flow field generated by the contractile swimmer at large distances is found to be dipolar. In addition, the changing radius of one of the spheres generates the field of a potential doublet centered at its initial position.

  9. Resin composites in minimally invasive dentistry.

    PubMed

    Jacobsen, Thomas

    2004-01-01

    The concept of minimally invasive dentistry will provide favorable conditions for the use of composite resin. However, a number of factors must be considered when placing composite resins in conservatively prepared cavities, including the adaptation of the composite resin to the cavity walls, the use of adhesives, and techniques for obtaining adequate proximal contacts. The clinician must also adopt an equally conservative approach when treating failed restorations. The quality of the composite resin restoration will be affected not only by the outline form of the preparation but also by the clinician's technique and understanding of the materials.

  10. Periodical cicadas: A minimal automaton model

    NASA Astrophysics Data System (ADS)

    de O. Cardozo, Giovano; de A. M. M. Silvestre, Daniel; Colato, Alexandre

    2007-08-01

    The Magicicada spp. life cycles, with their prime periods and highly synchronized emergence, have defied reasonable scientific explanation since their discovery. During the last decade several models and explanations for this phenomenon appeared in the literature, along with a great deal of discussion. Despite this considerable effort, there is no final conclusion about this long-standing biological problem. Here, we construct a minimal automaton model without predation/parasitism which reproduces some of these aspects. Our results point towards competition between different strains with a limited dispersal threshold as the main factor leading to the emergence of prime-numbered life cycles.
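
    Why competition favors prime periods can be illustrated with a toy calculation (this is not the paper's automaton model; the horizon and competitor range are arbitrary assumptions): two strains with periods p and q co-emerge every lcm(p, q) years, so prime periods minimize overlap with competitors.

```python
# Toy illustration: count co-emergence years of periodic strains.
# Prime periods share few emergence years with other cycles because
# lcm(p, q) = p*q whenever q is not a multiple of p.
from math import gcd

def co_emergences(p, q, horizon=1500):
    lcm = p * q // gcd(p, q)
    return horizon // lcm

# Total overlap of each candidate cycle with competitors of period 2..18.
candidates = range(12, 19)
overlap = {p: sum(co_emergences(p, q) for q in range(2, 19) if q != p)
           for p in candidates}
best = min(overlap, key=overlap.get)
print(overlap)
print(best)   # the prime 17 overlaps least among 12..18
```

    The cicadas' observed 13- and 17-year cycles are exactly the periods this crude overlap count favors within their range.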

  11. Minimal relativistic three-particle equations

    SciTech Connect

    Lindesay, J.

    1981-07-01

    A minimal self-consistent set of covariant and unitary three-particle equations is presented. Numerical results are obtained for three-particle bound states, elastic scattering and rearrangement of bound pairs with a third particle, and amplitudes for breakup into states of three free particles. The mathematical form of the three-particle bound state equations is explored; constraints are set upon the range of eigenvalues and number of eigenstates of these one parameter equations. The behavior of the number of eigenstates as the two-body binding energy decreases to zero in a covariant context generalizes results previously obtained non-relativistically by V. Efimov.

  12. Beyond the minimal top partner decay

    NASA Astrophysics Data System (ADS)

    Serra, Javi

    2015-09-01

    Light top partners are the prime sign of naturalness in composite Higgs models. We explore here the possibility of non-standard top partner phenomenology. We show that even in the simplest extension of the minimal composite Higgs model, featuring an extra singlet pseudo Nambu-Goldstone boson, the branching ratios of the top partners into standard channels can be significantly altered, with no substantial change in the generated Higgs potential. Together with the variety of possible final states from the decay of the pseudo-scalar singlet, this motivates more extensive analyses in the search for the top partners.

  13. Minimizing Occupational Exposure to Antineoplastic Agents.

    PubMed

    Polovich, Martha

    2016-01-01

    The inherent toxicity of antineoplastic drugs used for the treatment of cancer makes them harmful to healthy cells as well as to cancer cells. Nurses who prepare and/or administer the agents potentially are exposed to the drugs and their negative effects. Knowledge about these drugs and the precautions aimed at reducing exposure are essential aspects of infusion nursing practice. This article briefly reviews the mechanisms of action of common antineoplastic drugs, the adverse outcomes associated with exposure, the potential for occupational exposure from preparation and administration, and recommended strategies for minimizing occupational exposure. PMID:27598070

  14. Minimally Invasive Atrial Fibrillation Surgery: Hybrid Approach

    PubMed Central

    Beller, Jared P.; Downs, Emily A.; Ailawadi, Gorav

    2016-01-01

    Atrial fibrillation is a challenging pathologic process. There continues to be a great need for the development of a reproducible, durable cure when medical management has failed. An effective, minimally invasive, sternal-sparing intervention without the need for cardiopulmonary bypass is a promising treatment approach. In this article, we describe a hybrid technique being refined at our center that combines a thoracoscopic epicardial surgical approach with an endocardial catheter-based procedure. We also discuss our results and review the literature describing this unique treatment approach. PMID:27127561

  15. The minimal geometric deformation approach extended

    NASA Astrophysics Data System (ADS)

    Casadio, R.; Ovalle, J.; da Rocha, Roldão

    2015-11-01

    The minimal geometric deformation approach was introduced in order to study the exterior spacetime around spherically symmetric self-gravitating systems, such as stars or similar astrophysical objects, in the Randall-Sundrum brane-world framework. A consistent extension of this approach is developed here, which contains modifications of both the time component and the radial component of a spherically symmetric metric. A modified Schwarzschild geometry is obtained as an example of its simplest application, and a new solution that is potentially useful to describe stars in the brane-world is also presented.

  16. Minimal model for spoof acoustoelastic surface states

    SciTech Connect

    Christensen, J.; Willatzen, M.; Liang, Z.

    2014-12-15

    In analogy with textured perfect electric conductors that sustain artificial or spoof surface plasmons for electromagnetic waves, we present an equivalent phenomenon for the case of sound. Aided by a minimal model that is able to capture the complex wave interaction of elastic cavity modes and airborne sound radiation in perfectly rigid panels, we construct designer acoustoelastic surface waves that are entirely controlled by the geometrical environment. Comparisons to results obtained by full-wave simulations confirm the feasibility of the model, and we demonstrate illustrative examples such as resonant transmission and waveguiding to show a few of the many cases where spoof elastic surface waves are useful.

  17. Minimizing physical restraints in acute care.

    PubMed

    Struck, Bryan D

    2005-08-01

    The use of restraints to protect patients and ensure continuation of care is an accepted fact in today's medical practice. However, over the last 20 years a growing body of evidence supports the idea that restraints are harmful and should be used only as a last resort. Since 1987, federal law has required long-term care facilities to be restraint free. This article describes the use of restraints in the acute care setting, the complications of using restraints, and efforts to minimize restraint use in order to comply with national policies.

  18. Status of the minimal supersymmetric SO(10)

    SciTech Connect

    Dorsner, Ilja

    2010-02-10

    We discuss the status of the minimal supersymmetric SO(10) in both the low and split supersymmetry regimes. To demonstrate the viability of the model we present a good fit of the fermion masses and their mixings. The solution needs a strongly split supersymmetry with gauginos and higgsinos around 10^2 TeV, sfermions close to 10^14 GeV and a GUT scale of around 6×10^15 GeV. It predicts fast proton decay rates, hierarchical neutrino masses and a large leptonic mixing angle sin θ13 ≈ 0.1.

  19. Mercury Contamination: Fate and Risk Minimization Strategies

    NASA Astrophysics Data System (ADS)

    Charlet, L.

    Two river basins in French Guiana that are subject to heavy mercury contamination due to illegal gold mining have been studied. Within the framework of an interdisciplinary European project, the fate of mercury in water, air, soil and sediment has been studied, as well as its bioaccumulation in the food chain. This bioaccumulation results in the contamination of Amerindian populations through fish consumption. The study was conducted in close contact with the economic and political actors. The results of the interdisciplinary scientific study have been translated into risk minimization strategies, which are analyzed in the framework of the European Water Framework Directive.

  20. Learning Minimal Latent Directed Information Polytrees.

    PubMed

    Etesami, Jalal; Kiyavash, Negar; Coleman, Todd

    2016-09-01

    We propose an approach for learning latent directed polytrees as long as there exists an appropriately defined discrepancy measure between the observed nodes. Specifically, we use our approach for learning directed information polytrees where samples are available from only a subset of processes. Directed information trees are a new type of probabilistic graphical models that represent the causal dynamics among a set of random processes in a stochastic system. We prove that the approach is consistent for learning minimal latent directed trees. We analyze the sample complexity of the learning task when the empirical estimator of mutual information is used as the discrepancy measure. PMID:27391682

  1. Minimally Invasive Approach of a Retrocaval Ureter.

    PubMed

    Fidalgo, Nuno; Pinheiro, Hugo; Ferronha, Frederico; Morales, Jorge; Campos Pinheiro, Luís

    2016-01-01

    The retrocaval ureter is a rare congenital entity, classically managed with open pyeloplasty techniques. The experience obtained with the laparoscopic approach to other, more frequent causes of ureteropelvic junction (UPJ) obstruction has opened the way for a minimally invasive approach to the retrocaval ureter. In this paper, we describe a clinical case of a right retrocaval ureter managed successfully with laparoscopic dismembered pyeloplasty. The main steps of the procedure are described. Our results were similar to those published by other urologic centers, which demonstrates the safety and feasibility of the procedure for this condition. PMID:27635277

  2. Prioritization methodology for chemical replacement

    NASA Technical Reports Server (NTRS)

    Cruit, Wendy; Goldberg, Ben; Schutzenhofer, Scott

    1995-01-01

    Since United States federal legislation has required ozone-depleting chemicals (Class 1 and 2) to be banned from production, the National Aeronautics and Space Administration (NASA) and industry have been required to find other chemicals and methods to replace these target chemicals. This project was initiated to develop a prioritization methodology suitable for assessing and ranking existing processes for replacement 'urgency.' The methodology was produced in the form of a workbook (NASA Technical Paper 3421). The final workbook contains two tools, one for evaluation and one for prioritization. The two tools are interconnected in that they were developed from one central theme - chemical replacement due to imposed laws and regulations. This workbook provides matrices, detailed explanations of how to use them, and a detailed methodology for prioritization of replacement technology. The main objective is to provide a GUIDELINE to help direct the research for replacement technology. The approach for prioritization called for a system which would result in a numerical rating for the chemicals and processes being assessed. A Quality Function Deployment (QFD) technique was used in order to determine numerical values which would correspond to the concerns raised and their respective importance to the process. This workbook defines the approach and the application of the QFD matrix. This technique: (1) provides a standard database for technology that can be easily reviewed, and (2) provides a standard format for information when requesting resources for further research for chemical replacement technology. Originally, this workbook was to be used for Class 1 and Class 2 chemicals, but it was specifically designed to be flexible enough to be used for any chemical used in a process (if the chemical and/or process needs to be replaced).
The methodology consists of comparison matrices (and the smaller comparison components) which allow replacement technology
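
    A QFD-style weighted scoring matrix of the kind described might be sketched as follows; the criteria, weights, ratings, and chemicals below are hypothetical illustrations, not those of NASA Technical Paper 3421:

```python
# Hedged sketch of a QFD-style prioritization matrix: each concern gets
# an importance weight, each chemical/process gets a 1-9 rating against
# each concern, and the weighted sum is the replacement-urgency score.
# All names and numbers here are made up for illustration.

criteria = {  # concern: importance weight
    "ozone depletion potential":  5,
    "regulatory deadline":        4,
    "worker exposure":            3,
    "availability of substitute": 2,
}

ratings = {
    "CFC-113 degreasing":   {"ozone depletion potential": 9,
                             "regulatory deadline": 9,
                             "worker exposure": 3,
                             "availability of substitute": 7},
    "HCFC-141b foam agent": {"ozone depletion potential": 5,
                             "regulatory deadline": 6,
                             "worker exposure": 2,
                             "availability of substitute": 4},
}

def priority(chemical):
    """Weighted QFD score: higher means more urgent to replace."""
    r = ratings[chemical]
    return sum(weight * r[c] for c, weight in criteria.items())

ranked = sorted(ratings, key=priority, reverse=True)
for chem in ranked:
    print(chem, priority(chem))
```

    The numerical scores make the ranking auditable: anyone can trace a chemical's priority back to the individual concerns and weights.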

  3. A geometric approach to direct minimization

    NASA Astrophysics Data System (ADS)

    van Voorhis, Troy; Head-Gordon, Martin

    The approach presented, geometric direct minimization (GDM), is derived from purely geometrical arguments, and is designed to minimize a function of a set of orthonormal orbitals. The optimization steps consist of sequential unitary transformations of the orbitals, and convergence is accelerated using the Broyden-Fletcher-Goldfarb-Shanno (BFGS) approach in the iterative subspace, together with a diagonal approximation to the Hessian for the remaining degrees of freedom. The approach is tested by implementing the solution of the self-consistent field (SCF) equations and comparing results with the standard direct inversion in the iterative subspace (DIIS) method. It is found that GDM is very robust and converges in every system studied, including several cases in which DIIS fails to find a solution. For main group compounds, GDM convergence is nearly as rapid as DIIS, whereas for transition metal-containing systems we find that GDM is significantly slower than DIIS. A hybrid procedure where DIIS is used for the first several iterations and GDM is used thereafter is found to provide a robust solution for transition metal-containing systems.
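
    The core geometric idea, sequential unitary (here orthogonal) transformations that keep the orbitals orthonormal by construction, can be illustrated with a toy sketch. This is not the paper's GDM/BFGS machinery: the Hamiltonian, step size, and plain steepest-descent update are illustrative assumptions.

```python
# Toy sketch: minimize E = tr(C^T H C) over orthonormal orbitals C by
# repeated orthogonal rotations C <- expm(eta*G) C, where G = [P, H]
# (P = C C^T) is antisymmetric, so each step exactly preserves C^T C = I.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n, n_occ = 6, 2
M = rng.standard_normal((n, n))
H = (M + M.T) / 2                  # symmetric toy "Hamiltonian"
C = np.eye(n)[:, :n_occ]           # initial orthonormal orbitals

def energy(C):
    return np.trace(C.T @ H @ C)

E0 = energy(C)
for _ in range(2000):
    P = C @ C.T                    # projector onto occupied space
    G = P @ H - H @ P              # antisymmetric descent generator
    C = expm(0.1 * G) @ C          # rotation step: stays orthonormal

print(round(float(energy(C)), 6), bool(np.allclose(C.T @ C, np.eye(n_occ))))
```

    The contrast with unconstrained minimizers is that orthonormality is never imposed after the fact; it is built into the parameterization of each step, which is the geometric viewpoint of GDM.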

  4. Minimal model for aeolian sand dunes.

    PubMed

    Kroy, Klaus; Sauermann, Gerd; Herrmann, Hans J

    2002-09-01

    We present a minimal model for the formation and migration of aeolian sand dunes in unidirectional winds. It combines a perturbative description of the turbulent wind velocity field above the dune with a continuum saltation model that allows for saturation transients in the sand flux. The latter are shown to provide a characteristic length scale, called saturation length, which is distinct from the saltation length of the grains. The model admits two different classes of solutions for the steady-state profile along the wind direction: smooth heaps and dunes with slip face. We clarify the origin of the characteristic properties of these solutions and analyze their scaling behavior. We also investigate in some detail the dynamic evolution of heaps and dunes, including the steady-state migration velocity and transient shape relaxation. Although the minimal model employs nonlocal expressions for the wind shear stress as well as for the sand flux, it is simple enough to serve as a very efficient tool for analytical and numerical investigations and opens up the way to simulations of large scale desert topographies. PMID:12366107

  5. Osmosis in a minimal model system.

    PubMed

    Lion, Thomas W; Allen, Rosalind J

    2012-12-28

    Osmosis is one of the most important physical phenomena in living and soft matter systems. While the thermodynamics of osmosis is well understood, the underlying microscopic dynamical mechanisms remain the subject of discussion. Unravelling these mechanisms is a prerequisite for understanding osmosis in non-equilibrium systems. Here, we investigate the microscopic basis of osmosis, in a system at equilibrium, using molecular dynamics simulations of a minimal model in which repulsive solute and solvent particles differ only in their interactions with an external potential. For this system, we can derive a simple virial-like relation for the osmotic pressure. Our simulations support an intuitive picture in which the solvent concentration gradient, at osmotic equilibrium, arises from the balance between an outward force, caused by the increased total density in the solution, and an inward diffusive flux caused by the decreased solvent density in the solution. While more complex effects may occur in other osmotic systems, our results suggest that they are not required for a minimal picture of the dynamic mechanisms underlying osmosis.

  6. [Theory and practice of minimally invasive endodontics].

    PubMed

    Jiang, H W

    2016-08-01

    The primary goal of modern endodontic therapy is to achieve the long-term retention of a functional tooth by preventing or treating pulpitis or apical periodontitis. The long-term retention of an endodontically treated tooth is correlated with the remaining amount of tooth tissue and the quality of the restoration after root canal filling. In recent years, there has been rapid progress and development in the basic research of endodontic biology, instruments and applied materials, making treatment procedures safer, more accurate, and more efficient. Thus, minimally invasive endodontics (MIE) has received increasing attention at present. MIE aims to preserve the maximum of tooth structure during root canal therapy, and the concept covers the whole process of diagnosis and treatment of teeth. This review article focuses on describing the minimally invasive concepts and operating essentials in endodontics, from diagnosis and treatment planning to the access opening, pulp cavity finishing, root canal cleaning and shaping, 3-dimensional root canal filling and restoration after root canal treatment. PMID:27511034

  7. Environmental projects. Volume 16: Waste minimization assessment

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The Goldstone Deep Space Communications Complex (GDSCC), located in the Mojave Desert, is part of the National Aeronautics and Space Administration's (NASA's) Deep Space Network (DSN), the world's largest and most sensitive scientific telecommunications and radio navigation network. The Goldstone Complex is operated for NASA by the Jet Propulsion Laboratory. At present, activities at the GDSCC support the operation of nine parabolic dish antennas situated at five separate locations known as 'sites.' Each of the five sites at the GDSCC has one or more antennas, called 'Deep Space Stations' (DSS's). In the course of operation of these DSS's, various hazardous and non-hazardous wastes are generated. In 1992, JPL retained Kleinfelder, Inc., San Diego, California, to quantify the various streams of hazardous and non-hazardous wastes generated at the GDSCC. In June 1992, Kleinfelder, Inc., submitted a report to JPL entitled 'Waste Minimization Assessment.' This present volume is a JPL-expanded version of the Kleinfelder, Inc. report. The 'Waste Minimization Assessment' report did not find any deficiencies in the various waste-management programs now practiced at the GDSCC, and it found that these programs are being carried out in accordance with environmental rules and regulations.

  8. Power Minimization techniques for Networked Data Centers.

    SciTech Connect

    Low, Steven; Tang, Kevin

    2011-09-28

    Our objective is to develop a mathematical model to optimize energy consumption at multiple levels in networked data centers, and to develop abstract algorithms that optimize not only individual servers but also coordinate the energy consumption of clusters of servers within a data center and across geographically distributed data centers, minimizing the overall energy cost and brown-energy consumption of an enterprise. In this project, we have formulated a variety of optimization models, some stochastic and others deterministic, and have obtained a variety of qualitative results on the structural properties, robustness, and scalability of the optimal policies. We have also systematically derived from these models decentralized algorithms to optimize energy efficiency, and analyzed their optimality and stability properties. Finally, we have conducted preliminary numerical simulations to illustrate the behavior of these algorithms. We draw the following conclusions. First, there is a substantial opportunity to minimize both the amount and the cost of electricity consumption in a network of data centers, by exploiting the fact that traffic load, electricity cost, and availability of renewable generation fluctuate over time and across geographical locations. Judiciously matching these stochastic processes can optimize the tradeoff between brown energy consumption, electricity cost, and response time. Second, given the stochastic nature of these three processes, real-time dynamic feedback should form the core of any optimization strategy. The key is to develop decentralized algorithms that can be implemented at different parts of the network as simple, local algorithms that coordinate through asynchronous message passing.
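
    One ingredient of exploiting geographic price differences, cheapest-price-first load placement under capacity constraints, might be sketched as follows. The names, prices, and capacities are hypothetical, and the project's actual models also trade off response time and renewable availability:

```python
# Hypothetical sketch: route demand to the data centers with the
# cheapest electricity first, subject to per-center capacity, to
# minimize total electricity cost. Greedy is optimal for this simple
# linear cost with only a total-demand and capacity constraints.

def allocate(demand, centers):
    """centers: list of (name, capacity_MW, price_per_MWh)."""
    plan, remaining, cost = {}, demand, 0.0
    for name, cap, price in sorted(centers, key=lambda c: c[2]):
        take = min(cap, remaining)   # fill cheapest centers first
        plan[name] = take
        cost += take * price
        remaining -= take
    if remaining > 0:
        raise ValueError("demand exceeds total capacity")
    return plan, cost

centers = [("east", 40, 55.0), ("west", 60, 30.0), ("eu", 50, 45.0)]
plan, cost = allocate(100, centers)
print(plan, cost)
```

    As the abstract notes, the real problem is stochastic and decentralized; this static snapshot only shows why matching load to cheap electricity creates the savings opportunity.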

  9. Minimally invasive surgery for gastric cancer.

    PubMed

    Güner, Ali; Hyung, Woo Jin

    2014-01-01

    The interest in minimally invasive surgery (MIS) has rapidly increased in recent decades and surgeons have adopted minimally invasive techniques due to its reduced invasiveness and numerous advantages for patients. With increased surgical experience and newly developed surgical instruments, MIS has become the preferred approach not only for benign disease but also for oncologic surgery. Recently, robotic systems have been developed to overcome difficulties of standard laparoscopic instruments during complex procedures. Its advantages including three-dimensional images, tremor filtering, motion scaling, articulated instruments, and stable retraction have created the opportunity to use robotic technology in many procedures including cancer surgery. Gastric cancer is one of the most common causes of cancer-related deaths worldwide. While its overall incidence has decreased worldwide, the proportion of early gastric cancer has increased mainly in eastern countries following mass screening programs. The shift in the paradigm of gastric cancer treatment is toward less invasive approaches in order to improve the patient's quality of life while adhering to oncological principles. In this review, we aimed to summarize the operative strategy and current literature in laparoscopic and robotic surgery for gastric cancer.

  10. [Minimally Invasive Open Surgery for Lung Cancer].

    PubMed

    Nakagawa, Kazuo; Watanabe, Shunichi

    2016-07-01

    Surgeons have long made significant efforts to reduce the invasiveness of surgical procedures. They keep in mind that the basic principle of performing less invasive surgical procedures for malignant tumors is to decrease the invasiveness for patients without compromising oncological curability or surgical safety. Video-assisted thoracic surgery (VATS) has been used increasingly as a minimally invasive approach to lung cancer surgery. However, whether VATS lobectomy is a less invasive procedure with equivalent or better clinical effect compared with open lobectomy for patients with lung cancer remains controversial because of the absence of randomized prospective studies. The degree of difficulty of anatomical lung resection depends on the degree of fissure development, the mobility of hilar lymph nodes, and the degree of pleural adhesions. During pulmonary surgery, thoracic surgeons always have to deal not only with these difficulties but also with unexpected events such as intraoperative bleeding. Recently, we have performed pulmonary resection for lung cancer with a minimally invasive open surgery (MIOS) approach. In this article, we introduce the surgical procedure of MIOS and demonstrate short-term results. Of course, the efficacy of MIOS needs to be further evaluated with long-term results. PMID:27440030

  11. [Minimally invasive glaucoma surgery using the trabectome].

    PubMed

    Wecker, T; Jordan, J F

    2015-03-01

    The main barrier reducing outflow of aqueous humor in open angle glaucomas is the juxtacanalicular trabecular meshwork. The trabectome removes this pathophysiologically altered tissue by electroablation, thus allowing the collector channels draining Schlemm's canal to communicate directly with the anterior chamber. In studies published so far, about a 30% decrease of intraocular pressure and a simultaneous 42% reduction of pressure-lowering eyedrops could be achieved in primary and secondary open angle glaucomas. A clear cornea tunnel is used to advance the trabectome to the trabecular meshwork, leaving the conjunctiva unaffected. Hence minimally invasive chamber angle surgery using this device is particularly suitable for patients with an altered ocular surface. The lowering of intraocular pressure and the reduction of required topical medication appear especially pronounced in pseudoexfoliative glaucoma. Surgery with the trabectome and phacoemulsification can easily be combined in one procedure. Owing to the minimally invasive approach, the complication profile of the trabectome is rather advantageous, not exceeding the general risks of globe-opening surgery. Ab-interno trabeculotomy is a safe and effective method for the treatment of patients with primary or secondary open angle glaucomas and moderate target pressures.

  12. Minimally invasive treatment options in fixed prosthodontics.

    PubMed

    Edelhoff, Daniel; Liebermann, Anja; Beuer, Florian; Stimmelmayr, Michael; Güth, Jan-Frederik

    2016-03-01

    Minimally invasive treatment options have become increasingly feasible in restorative dentistry, due to the introduction of the adhesive technique in combination with restorative materials featuring translucent properties similar to those of natural teeth. Mechanical anchoring of restorations via conventional cementation represents a predominantly subtractive treatment approach that is gradually being superseded by a primarily defect-oriented additive method in prosthodontics. Modifications of conventional treatment procedures have led to the development of an economical approach to the removal of healthy tooth structure. This is possible because the planned treatment outcome is defined in a wax-up before the treatment is commenced and this wax-up is subsequently used as a reference during tooth preparation. Similarly, resin-bonded FDPs and implants have made it possible to preserve the natural tooth structure of potential abutment teeth. This report describes a number of clinical cases to demonstrate the principles of modern prosthetic treatment strategies and discusses these approaches in the context of minimally invasive prosthetic dentistry.

  13. Osmosis in a minimal model system.

    PubMed

    Lion, Thomas W; Allen, Rosalind J

    2012-12-28

    Osmosis is one of the most important physical phenomena in living and soft matter systems. While the thermodynamics of osmosis is well understood, the underlying microscopic dynamical mechanisms remain the subject of discussion. Unravelling these mechanisms is a prerequisite for understanding osmosis in non-equilibrium systems. Here, we investigate the microscopic basis of osmosis, in a system at equilibrium, using molecular dynamics simulations of a minimal model in which repulsive solute and solvent particles differ only in their interactions with an external potential. For this system, we can derive a simple virial-like relation for the osmotic pressure. Our simulations support an intuitive picture in which the solvent concentration gradient, at osmotic equilibrium, arises from the balance between an outward force, caused by the increased total density in the solution, and an inward diffusive flux caused by the decreased solvent density in the solution. While more complex effects may occur in other osmotic systems, our results suggest that they are not required for a minimal picture of the dynamic mechanisms underlying osmosis. PMID:23277960
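    The abstract refers to "a simple virial-like relation for the osmotic pressure" without reproducing it, and the model-specific form is not given here. As a hedged point of reference only, such relations reduce in the ideal dilute limit to the familiar van 't Hoff law, where Π is the osmotic pressure and c_s the solute number density:

    ```latex
    % Van 't Hoff law: the ideal dilute-solution limit of osmotic pressure
    % relations; the paper's own virial-like relation is model-specific.
    \Pi \simeq c_s \, k_B T
    ```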

  14. The minimal curvaton-higgs model

    SciTech Connect

    Enqvist, Kari; Lerner, Rose N.; Takahashi, Tomo E-mail: rose.lerner@desy.de

    2014-01-01

    We present the first full study of the minimal curvaton-higgs (MCH) model, which is a minimal interpretation of the curvaton scenario with one real scalar coupled to the standard model Higgs boson. The standard model coupling allows the dynamics of the model to be determined in detail, including effects from the thermal background and from radiative corrections to the potential. The relevant mechanisms for curvaton decay are incomplete non-perturbative decay (delayed by thermal blocking), followed by decay via a dimension-5 non-renormalisable operator. To avoid spoiling the predictions of big bang nucleosynthesis, we find the "bare" curvaton mass to be m_σ ≥ 8 × 10⁴ GeV. To match observational data from Planck there is an upper limit on the curvaton-higgs coupling g, between 10⁻³ and 10⁻², depending on the mass. This is due to interactions with the thermal background. We find that typically non-Gaussianities are small but that if f_NL is observed in the near future then m_σ ≲ 5 × 10⁹ GeV, depending on the Hubble scale during inflation. In a thermal dark matter model, the lower bound on m_σ can increase substantially. The parameter space may also be affected once the baryogenesis mechanism is specified.

  15. Gamma ray tests of Minimal Dark Matter

    SciTech Connect

    Cirelli, Marco; Hambye, Thomas; Panci, Paolo; Sala, Filippo; Taoso, Marco

    2015-10-12

    We reconsider the model of Minimal Dark Matter (a fermionic, hypercharge-less quintuplet of the EW interactions) and compute its gamma ray signatures. We compare them with a number of gamma ray probes: the galactic halo diffuse measurements, the galactic center line searches and recent dwarf galaxies observations. We find that the original minimal model, whose mass is fixed at 9.4 TeV by the relic abundance requirement, is constrained by the line searches from the Galactic Center: it is ruled out if the Milky Way possesses a cuspy profile such as NFW but it is still allowed if it has a cored one. Observations of dwarf spheroidal galaxies are also relevant (in particular searches for lines), and ongoing astrophysical progress on these systems has the potential to eventually rule out the model. We also explore a wider mass range, which applies to the case in which the relic abundance requirement is relaxed. Most of our results can be safely extended to the larger class of multi-TeV WIMP DM annihilating into massive gauge bosons.

  16. An Aristotelian Account of Minimal Chemical Life

    NASA Astrophysics Data System (ADS)

    Bedau, Mark A.

    2010-12-01

    This paper addresses the open philosophical and scientific problem of explaining and defining life. This problem is controversial, and there is nothing approaching a consensus about what life is. This raises a philosophical meta-question: Why is life so controversial and so difficult to define? This paper proposes that we can attribute a significant part of the controversy over life to use of a Cartesian approach to explaining life, which seeks necessary and sufficient conditions for being an individual living organism, out of the context of other organisms and the abiotic environment. The Cartesian approach contrasts with an Aristotelian approach to explaining life, which considers life only in the whole context in which it actually exists, looks at the characteristic phenomena involving actual life, and seeks the deepest and most unified explanation for those phenomena. The phenomena of life might be difficult to delimit precisely, but they certainly include life's characteristic hallmarks, borderline cases, and puzzles. The Program-Metabolism-Container (PMC) model construes minimal chemical life as a functionally integrated triad of chemical systems, which are identified as the Program, Metabolism, and Container. Rasmussen diagrams precisely depict the functional definition of minimal chemical life. The PMC model illustrates the Aristotelian approach to life, because it explains eight of life's hallmarks, one of life's borderline cases (the virus), and two of life's puzzles.

  17. An Aristotelian account of minimal chemical life.

    PubMed

    Bedau, Mark A

    2010-12-01

    This paper addresses the open philosophical and scientific problem of explaining and defining life. This problem is controversial, and there is nothing approaching a consensus about what life is. This raises a philosophical meta-question: Why is life so controversial and so difficult to define? This paper proposes that we can attribute a significant part of the controversy over life to use of a Cartesian approach to explaining life, which seeks necessary and sufficient conditions for being an individual living organism, out of the context of other organisms and the abiotic environment. The Cartesian approach contrasts with an Aristotelian approach to explaining life, which considers life only in the whole context in which it actually exists, looks at the characteristic phenomena involving actual life, and seeks the deepest and most unified explanation for those phenomena. The phenomena of life might be difficult to delimit precisely, but they certainly include life's characteristic hallmarks, borderline cases, and puzzles. The Program-Metabolism-Container (PMC) model construes minimal chemical life as a functionally integrated triad of chemical systems, which are identified as the Program, Metabolism, and Container. Rasmussen diagrams precisely depict the functional definition of minimal chemical life. The PMC model illustrates the Aristotelian approach to life, because it explains eight of life's hallmarks, one of life's borderline cases (the virus), and two of life's puzzles.

  18. Free energies for singleton minimal states

    NASA Astrophysics Data System (ADS)

    Golden, J. M.

    2016-11-01

    It is assumed that any free energy function exhibits strict periodic behavior for histories that have been periodic for all past times. This is not the case for the work function, which, however, has the usual defining properties of a free energy. Forms given in fairly recent years for the minimum and related free energies of linear materials with memory have this property. Materials for which the minimal states are all singletons are those for which at least some of the singularities of the Fourier transform of the relaxation function are not isolated. For such materials, the maximum free energy is the work function, and free energies intermediate between the minimum free energy and the work function should be given by a linear relation involving these two quantities. All such functionals, except the minimum free energy, therefore do not have strict periodic behavior for periodic histories, which contradicts our assumption. A way out of the difficulty is explored which involves approximating the relaxation function by a form for which the minimal states are no longer singletons. A representation can then be given of an arbitrary free energy as a linear combination of the minimum, maximum and intermediate free energies derived in earlier work. This representation obeys our periodicity assumption. Numerical data are presented, supporting the consistency of this approach.

  19. Osmosis in a minimal model system

    NASA Astrophysics Data System (ADS)

    Lion, Thomas W.; Allen, Rosalind J.

    2012-12-01

    Osmosis is one of the most important physical phenomena in living and soft matter systems. While the thermodynamics of osmosis is well understood, the underlying microscopic dynamical mechanisms remain the subject of discussion. Unravelling these mechanisms is a prerequisite for understanding osmosis in non-equilibrium systems. Here, we investigate the microscopic basis of osmosis, in a system at equilibrium, using molecular dynamics simulations of a minimal model in which repulsive solute and solvent particles differ only in their interactions with an external potential. For this system, we can derive a simple virial-like relation for the osmotic pressure. Our simulations support an intuitive picture in which the solvent concentration gradient, at osmotic equilibrium, arises from the balance between an outward force, caused by the increased total density in the solution, and an inward diffusive flux caused by the decreased solvent density in the solution. While more complex effects may occur in other osmotic systems, our results suggest that they are not required for a minimal picture of the dynamic mechanisms underlying osmosis.

  20. Vortex Patterns in Ginzburg-Landau Minimizers

    NASA Astrophysics Data System (ADS)

    Serfaty, Sylvia; Sandier, Etienne

    2010-03-01

    We present a survey of results obtained with Etienne Sandier on vortices in the minimizers of the 2D Ginzburg-Landau energy of superconductivity with an applied magnetic field, in the asymptotic regime of large kappa where vortices become point-like. We describe results which characterize the critical values of the applied field for which vortices appear, their numbers and locations. If the applied field is large enough, it is observed in experiments that vortices are densely packed and form triangular (hexagonal) lattices named Abrikosov lattices. Part of our results is the rigorous derivation of a mean field model describing the optimal density of vortices at leading order in the energy, and then the derivation of a next order limiting energy which governs the positions of the vortices after blow-up at their inter-distance scale. This limiting energy is a logarithmic-type interaction between points in the plane. Among lattice configurations it is uniquely minimized by the hexagonal lattice, thus providing a first justification of the Abrikosov lattice in this regime.
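    The energy the survey studies is the standard 2D Ginzburg-Landau functional. The convention below (with ε = 1/κ the inverse Ginzburg-Landau parameter, u the order parameter, A the vector potential, and h_ex the applied field) is one common form, supplied here as background rather than quoted from the abstract:

    ```latex
    % 2D Ginzburg-Landau energy; vortices are the isolated zeros of u
    % around which the phase of u carries a nonzero winding number.
    G_\varepsilon(u, A) = \frac{1}{2} \int_{\Omega} |(\nabla - iA)u|^2
      + |\operatorname{curl} A - h_{\mathrm{ex}}|^2
      + \frac{1}{2\varepsilon^2} \bigl(1 - |u|^2\bigr)^2
    ```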

  1. MR imaging guidance for minimally invasive procedures

    NASA Astrophysics Data System (ADS)

    Wong, Terence Z.; Kettenbach, Joachim; Silverman, Stuart G.; Schwartz, Richard B.; Morrison, Paul R.; Kacher, Daniel F.; Jolesz, Ferenc A.

    1998-04-01

    Image guidance is one of the major challenges common to all minimally invasive procedures including biopsy, thermal ablation, endoscopy, and laparoscopy. This is essential for (1) identifying the target lesion, (2) planning the minimally invasive approach, and (3) monitoring the therapy as it progresses. MRI is an ideal imaging modality for this purpose, providing high soft tissue contrast and multiplanar imaging capability with no ionizing radiation. An interventional/surgical MRI suite has been developed at Brigham and Women's Hospital which provides multiplanar imaging guidance during surgery, biopsy, and thermal ablation procedures. The 0.5T MRI system (General Electric Signa SP) features open vertical access, allowing intraoperative imaging to be performed. An integrated navigational system permits near real-time control of imaging planes, and provides interactive guidance for positioning various diagnostic and therapeutic probes. MR imaging can also be used to monitor cryotherapy as well as high temperature thermal ablation procedures using RF, laser, microwave, or focused ultrasound. Design features of the interventional MRI system will be discussed, and techniques will be described for interactive image acquisition and tracking of interventional instruments. Applications for interactive and near-real-time imaging will be presented as well as examples of specific procedures performed using MRI guidance.

  2. Minimal nitrogen requirements of Corynebacterium renale strains.

    PubMed

    VanEseltine, W P; Cox, W M; Kadis, S

    1978-01-01

    Corynebacterium renale strain 10849 was grown in a chemically defined medium containing glucosamine, ammonium sulfate, and 5 amino acids as possible nitrogen sources. Although glucosamine was slightly stimulatory, its omission from the medium had a minimal effect on growth, and washed cells introduced into glucosamine-free medium grew readily through 10 serial transfers, demonstrating that this compound was not required for growth. Individual omissions of isoleucine, valine, methionine, and glutamine resulted in lengthened lag periods and reduced growth rates in initial transfers, but recovery occurred in subsequent serial transfers so that by the 3rd or 4th transfer, growth rates and cell crops were only slightly less than in control cultures in complete medium. Omission of cystine resulted in a permanently low growth rate and reduced cell crop, but this was remedied by substituting various nonnitrogenous compounds containing reduced sulfur. Strain 10849 and 6 additional strains were then serially cultured in a minimal defined medium in which sodium thioglycolate provided reduced sulfur and ammonium sulfate served as sole nitrogen source. Since only ammonium ion was required as the nitrogen source, it could be concluded that C. renale, which rapidly hydrolyzes urea, should find an adequate source of nitrogen for growth in the urinary tract of animals. PMID:629434

  3. Gamma ray tests of Minimal Dark Matter

    SciTech Connect

    Cirelli, Marco; Sala, Filippo; Taoso, Marco; Hambye, Thomas; Panci, Paolo E-mail: thambye@ulb.ac.be E-mail: filippo.sala@cea.fr

    2015-10-01

    We reconsider the model of Minimal Dark Matter (a fermionic, hypercharge-less quintuplet of the EW interactions) and compute its gamma ray signatures. We compare them with a number of gamma ray probes: the galactic halo diffuse measurements, the galactic center line searches and recent dwarf galaxies observations. We find that the original minimal model, whose mass is fixed at 9.4 TeV by the relic abundance requirement, is constrained by the line searches from the Galactic Center: it is ruled out if the Milky Way possesses a cuspy profile such as NFW but it is still allowed if it has a cored one. Observations of dwarf spheroidal galaxies are also relevant (in particular searches for lines), and ongoing astrophysical progress on these systems has the potential to eventually rule out the model. We also explore a wider mass range, which applies to the case in which the relic abundance requirement is relaxed. Most of our results can be safely extended to the larger class of multi-TeV WIMP DM annihilating into massive gauge bosons.

  4. Utilization of biocatalysts in cellulose waste minimization

    SciTech Connect

    Woodward, J.; Evans, B.R.

    1996-09-01

    Cellulose, a polymer of glucose, is the principal component of biomass and, therefore, a major source of waste that is either buried or burned. Examples of biomass waste include agricultural crop residues, forestry products, and municipal wastes. Recycling of this waste is important for energy conservation as well as waste minimization, and there is some probability that in the future biomass could become a major energy source and replace fossil fuels that are currently used for fuels and chemicals production. It has been estimated that in the United States, between 100 and 450 million dry tons of agricultural waste and approximately 6 million dry tons of animal waste are produced annually, and that of the 190 million tons of municipal solid waste (MSW) generated annually, approximately two-thirds is cellulosic in nature and over one-third is paper waste. Interestingly, more than 70% of MSW is landfilled or burned; however, landfill space is becoming increasingly scarce. On a smaller scale, important cellulosic products such as cellulose acetate also present waste problems; an estimated 43 thousand tons of cellulose ester waste are generated annually in the United States. Biocatalysts could be used in cellulose waste minimization, and this chapter describes their characteristics and potential in bioconversion and bioremediation processes.

  5. Microbial life detection with minimal assumptions

    NASA Astrophysics Data System (ADS)

    Kounaves, Samuel P.; Noll, Rebecca A.; Buehler, Martin G.; Hecht, Michael H.; Lankford, Kurt; West, Steven J.

    2002-02-01

    To produce definitive and unambiguous results, any life detection experiment must make minimal assumptions about the nature of extraterrestrial life. The only criterion that fits this definition is the ability to reproduce and, in the process, create a disequilibrium in the chemical and redox environment. The Life Detection Array (LIDA), an instrument proposed for the 2007 NASA Mars Scout Mission, and in the future for the Jovian moons, enables such an experiment. LIDA responds to minute biogenic chemical and physical changes in two identical 'growth' chambers. The sensitivity is provided by two differentially monitored electrochemical sensor arrays. Growth in one of the chambers alters the chemistry and ionic properties and results in a signal. This life detection system makes minimal assumptions: that after the addition of water the microorganism replicates and in the process will produce small changes in its immediate surroundings by consuming, metabolizing, and excreting a number of molecules and/or ionic species. The experiment begins by placing a homogenized split-sample of soil or water into each chamber, adding water if soil, sterilizing via high temperature, and equilibrating. In the absence of any microorganism in either chamber, no signal will be detected. The inoculation of one chamber with even a few microorganisms which reproduce will create a sufficient disequilibrium in the system (compared to the control) to be detectable. Replication of the experiment and positive results would lead to a definitive conclusion of biologically induced changes. The split sample and the nanogram inoculation eliminate chemistry as a causal agent.

  6. Soft tissue damage after minimally invasive THA

    PubMed Central

    2010-01-01

    Background and purpose Minimally invasive surgery (MIS) for hip replacement is thought to minimize soft tissue damage. We determined the damage caused by 4 different MIS approaches as compared to a conventional lateral transgluteal approach. Methods 5 surgeons each performed a total hip arthroplasty on 5 fresh frozen cadaver hips, using either a MIS anterior, MIS anterolateral, MIS 2-incision, MIS posterior, or lateral transgluteal approach. Postoperatively, the hips were dissected and muscle damage color-stained. We measured proportional muscle damage relative to the midsubstance cross-sectional surface area (MCSA) using computerized color detection. The integrity of external rotator muscles, nerves, and ligaments was assessed by direct observation. Results None of the other MIS approaches resulted in less gluteus medius muscle damage than the lateral transgluteal approach. However, the MIS anterior approach completely preserved the gluteus medius muscle in 4 cases while partial damage occurred in 1 case. Furthermore, the superior gluteal nerve was transected in 4 cases after a MIS anterolateral approach and in 1 after the lateral transgluteal approach. The lateral femoral cutaneous nerve was transected once after both the MIS anterior approach and the MIS 2-incision approach. Interpretation The MIS anterior approach may preserve the gluteus medius muscle during total hip arthroplasty, but with a risk of damaging the lateral femoral cutaneous nerve. PMID:21110702

  7. Navy Shipboard Hazardous Material Minimization Program

    SciTech Connect

    Bieberich, M.J.; Robinson, P.; Chastain, B.

    1994-12-31

    The use of hazardous (and potentially hazardous) materials in shipboard cleaning applications has proliferated as new systems and equipments have entered the fleet to reside alongside existing equipments. With the growing environmental awareness (and additional, more restrictive regulations) at all levels/echelon commands of the DoD, the Navy has initiated a proactive program to undertake the minimization/elimination of these hazardous materials in order to eliminate HMs at the source. This paper will focus on the current Shipboard Hazardous Materials Minimization Program initiatives including the identification of authorized HM currently used onboard, identification of potential substitute materials for HM replacement, identification of new cleaning technologies and processes/procedures, and identification of technical documents which will require revision to eliminate the procurement of HMs into the federal supply system. Also discussed will be the anticipated path required to implement the changes into the fleet and automated decision processes (substitution algorithm) currently employed. The paper will also present the most recent technologies identified for approval or additional testing and analysis, including: supercritical CO₂ cleaning, high pressure blasting (H₂O + baking soda), aqueous and semi-aqueous cleaning materials and processes, solvent replacements and dedicated parts washing systems with internal filtering capabilities, and automated software for solvent/cleaning process substitute selection. Along with these technological advances, data availability (from on-line databases and CD-ROM database libraries) will be identified and discussed.

  8. Flavored dark matter beyond Minimal Flavor Violation

    DOE PAGES

    Agrawal, Prateek; Blanke, Monika; Gemmler, Katrin

    2014-10-13

    We study the interplay of flavor and dark matter phenomenology for models of flavored dark matter interacting with quarks. We allow an arbitrary flavor structure in the coupling of dark matter with quarks. This coupling is assumed to be the only new source of violation of the Standard Model flavor symmetry extended by a U(3) χ associated with the dark matter. We call this ansatz Dark Minimal Flavor Violation (DMFV) and highlight its various implications, including an unbroken discrete symmetry that can stabilize the dark matter. As an illustration we study a Dirac fermionic dark matter χ which transforms as a triplet under U(3) χ, and is a singlet under the Standard Model. The dark matter couples to right-handed down-type quarks via a colored scalar mediator Φ with a coupling λ. We identify a number of “flavor-safe” scenarios for the structure of λ which are beyond Minimal Flavor Violation. Also, for dark matter and collider phenomenology we focus on the well-motivated case of b-flavored dark matter. Furthermore, the combined flavor and dark matter constraints on the parameter space of λ turn out to be interesting intersections of the individual ones. LHC constraints on simplified models of squarks and sbottoms can be adapted to our case, and monojet searches can be relevant if the spectrum is compressed.

  9. Flavored dark matter beyond Minimal Flavor Violation

    SciTech Connect

    Agrawal, Prateek; Blanke, Monika; Gemmler, Katrin

    2014-10-13

    We study the interplay of flavor and dark matter phenomenology for models of flavored dark matter interacting with quarks. We allow an arbitrary flavor structure in the coupling of dark matter with quarks. This coupling is assumed to be the only new source of violation of the Standard Model flavor symmetry extended by a U(3) χ associated with the dark matter. We call this ansatz Dark Minimal Flavor Violation (DMFV) and highlight its various implications, including an unbroken discrete symmetry that can stabilize the dark matter. As an illustration we study a Dirac fermionic dark matter χ which transforms as a triplet under U(3) χ, and is a singlet under the Standard Model. The dark matter couples to right-handed down-type quarks via a colored scalar mediator Φ with a coupling λ. We identify a number of “flavor-safe” scenarios for the structure of λ which are beyond Minimal Flavor Violation. Also, for dark matter and collider phenomenology we focus on the well-motivated case of b-flavored dark matter. Furthermore, the combined flavor and dark matter constraints on the parameter space of λ turn out to be interesting intersections of the individual ones. LHC constraints on simplified models of squarks and sbottoms can be adapted to our case, and monojet searches can be relevant if the spectrum is compressed.

  10. Minimally invasive treatment options in fixed prosthodontics.

    PubMed

    Edelhoff, Daniel; Liebermann, Anja; Beuer, Florian; Stimmelmayr, Michael; Güth, Jan-Frederik

    2016-03-01

    Minimally invasive treatment options have become increasingly feasible in restorative dentistry, due to the introduction of the adhesive technique in combination with restorative materials featuring translucent properties similar to those of natural teeth. Mechanical anchoring of restorations via conventional cementation represents a predominantly subtractive treatment approach that is gradually being superseded by a primarily defect-oriented additive method in prosthodontics. Modifications of conventional treatment procedures have led to the development of an economical approach to the removal of healthy tooth structure. This is possible because the planned treatment outcome is defined in a wax-up before the treatment is commenced and this wax-up is subsequently used as a reference during tooth preparation. Similarly, resin-bonded FDPs and implants have made it possible to preserve the natural tooth structure of potential abutment teeth. This report describes a number of clinical cases to demonstrate the principles of modern prosthetic treatment strategies and discusses these approaches in the context of minimally invasive prosthetic dentistry. PMID:26925471

  11. Psychometric tests for diagnosing minimal hepatic encephalopathy.

    PubMed

    Weissenborn, Karin

    2013-06-01

    While there is consensus that minimal hepatic encephalopathy (mHE) has a significant impact on a patient's daily living, and thus should be diagnosed and treated, there is no consensus about the optimal diagnostic tools. At present the most frequently used psychometric methods for diagnosing minimal hepatic encephalopathy are the Inhibitory Control Test and the Psychometric Hepatic Encephalopathy Score (PHES). Another frequently used method is the Critical Flicker Frequency. The PHES and the Repeatable Battery for the Assessment of Neuropsychological Status have been recommended for diagnosing mHE by a working party commissioned by the International Society for Hepatic Encephalopathy and Nitrogen Metabolism. Recently the Continuous Reaction Time Test, which was used in the 1980s, has gained new interest. Today, no data are available that allow one to decide which of these methods is the most appropriate. In fact, even basic information, such as the dependence of test results on age, sex, and education, or the influence of diseases that frequently accompany liver cirrhosis, is missing for most of them. Future studies must address these questions to improve the diagnosis of mHE. PMID:22993201

  12. Contrast-detail phantom scoring methodology.

    PubMed

    Thomas, Jerry A; Chakrabarti, Kish; Kaczmarek, Richard; Romanyukha, Alexander

    2005-03-01

    Published results of medical imaging studies which make use of contrast detail mammography (CDMAM) phantom images for analysis are difficult to compare since data are often not analyzed in the same way. In order to address this situation, the concept of ideal contrast detail curves is suggested. The ideal contrast detail curves are constructed based on the requirement of having the same product of the diameter and contrast (disk thickness) of the minimal correctly determined object for every row of the CDMAM phantom image. A correlation and comparison of five different quality parameters of the CDMAM phantom image determined for obtained ideal contrast detail curves is performed. The image quality parameters compared include: (1) contrast detail curve--a graphical correlation between "minimal correct reading" diameter and disk thickness; (2) correct observation ratio--the ratio of the number of correctly identified objects to the actual total number of objects multiplied by 100; (3) image quality figure--the sum of the product of the diameter of the smallest scored object and its relative contrast; (4) figure-of-merit--the zero disk diameter value obtained from extrapolation of the contrast detail curve to the origin (e.g., zero disk diameter); and (5) k-factor--the product of the thickness and the diameter of the smallest correctly identified disks. The analysis carried out showed the existence of a nonlinear relationship between the above parameters, which means that use of different parameters of CDMAM image quality potentially can cause different conclusions about changes in image quality. Construction of the ideal contrast detail curves for CDMAM phantom is an attempt to determine the quantitative limits of the CDMAM phantom as employed for image quality evaluation. These limits are determined by the relationship between certain parameters of a digital mammography system and the set of gold disk sizes in the CDMAM phantom. Recommendations are made on
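    Two of the listed quality parameters are simple enough to state directly in code. The sketch below is illustrative only: the disk diameters, thicknesses, and counts are made-up values, not the actual CDMAM phantom specification.

```python
# Illustrative sketch of two CDMAM quality parameters (values are hypothetical,
# not the real phantom geometry).

def correct_observation_ratio(n_correct, n_total):
    """Correctly identified objects as a percentage of all objects."""
    return 100.0 * n_correct / n_total

def k_factor(smallest_correct):
    """Product of thickness and diameter of the smallest correctly identified
    disks; `smallest_correct` is a list of (diameter, thickness) pairs."""
    return min(d * t for d, t in smallest_correct)

ratio = correct_observation_ratio(147, 205)
k = k_factor([(0.1, 0.36), (0.13, 0.25), (0.16, 0.2)])
print(round(ratio, 1), round(k, 4))
```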

  14. Legal methodology as nursing problem solving.

    PubMed

    Weiler, K; Rhodes, A M

    1991-01-01

    This article presents legal methodology as a form of nursing research. The four elements of the legal methodology are examined. The sources of legal authority are explained, and the essential components of one case, Sermchief v Gonzales, are presented in detail as an illustration of the legal methodology. In addition to the elements of the methodology, the methodological tools for the process are described. The computer and manual searching strategies are identified. Finally, the citation system, which serves as an integral portion of the methodological process, is discussed.

  15. Optimal pulsed pumping schedule using calculus of variation methodology

    SciTech Connect

    Johannes, T.W.

    1999-03-01

    The application of a variational optimization technique has demonstrated the potential strength of pulsed pumping operations for use at existing pump-and-treat aquifer remediation sites. The optimized pulsed pumping technique has exhibited notable improvements in operational effectiveness over continuous pumping. The optimized pulsed pumping technique has also exhibited an advantage over uniform time intervals for pumping and resting cycles. The most important finding supports the potential for managing and improving pumping operations in the absence of complete knowledge of plume characteristics. An objective functional was selected to minimize mass of water removed and minimize the non-essential mass of contaminant removed. General forms of an essential concentration function were analyzed to determine the appropriate form required for compliance with management preferences. Third-order essential concentration functions provided optimal solutions for the objective functional. Results of using this form of the essential concentration function in the methodology provided optimal solutions for switching times. The methodology was applied to a hypothetical, two-dimensional aquifer influenced by specified and no-flow boundaries, injection wells and extraction wells. Flow simulations used MODFLOW, transport simulations used MT3D, and the graphical interface for obtaining concentration time series data and flow/transport links were generated by GMS version 2.1.

  16. Recent Methodology in Ginseng Analysis

    PubMed Central

    Baek, Seung-Hoon; Bae, Ok-Nam; Park, Jeong Hill

    2012-01-01

    As much as the popularity of ginseng in herbal prescriptions or remedies, ginseng has become the focus of research in many scientific fields. Analytical methodologies for ginseng, referred to as ginseng analysis hereafter, have been developed for bioactive component discovery, phytochemical profiling, quality control, and pharmacokinetic studies. This review summarizes the most recent advances in ginseng analysis in the past half-decade including emerging techniques and analytical trends. Ginseng analysis includes all of the leading analytical tools and serves as a representative model for the analytical research of herbal medicines. PMID:23717112

  17. Proof test methodology for composites

    NASA Technical Reports Server (NTRS)

    Wu, Edward M.; Bell, David K.

    1992-01-01

    The special requirements for proof test of composites are identified based on the underlying failure process of composites. Two proof test methods are developed to eliminate the inevitable weak fiber sites without also causing flaw clustering which weakens the post-proof-test composite. Significant reliability enhancement by these proof test methods has been experimentally demonstrated for composite strength and composite life in tension. This basic proof test methodology is relevant to the certification and acceptance of critical composite structures. It can also be applied to the manufacturing process development to achieve zero-reject for very large composite structures.

  18. Feminist methodology in nursing research.

    PubMed

    Webb, C

    1984-05-01

    In this paper the author discusses her experiences as a feminist, nurse and sociologist carrying out a study of hysterectomy. Difficulties in setting up the research, carrying out interviews and publishing reports result from masculine models of sociological research and from the fact that nursing work, including research, is carried out in a context of medical domination. These experiences are analysed in terms of feminist research methodology in sociology, and it is argued that in the case of nursing and women's health a feminist perspective offers opportunities for mutual consciousness-raising and for working together to challenge male medical control over these aspects of women's lives.

  19. APMS SVD methodology and implementation

    SciTech Connect

    BG Amidan; TA Ferryman

    2000-04-17

    One of the main tasks within the Aviation Performance Measurement System (APMS) program uses statistical methodologies to find atypical flights. With thousands of flights a day and hundreds of parameters being recorded every second for each flight, the amount of data escalates and the ability to find atypical flights becomes more difficult. The purpose of this paper is to explain the method known as singular value decomposition (SVD) employed to search for the atypical flights and display useful graphics that facilitate understanding the causes of atypicality for these flights. Other methods could also perform this search and some are planned for future implementation.
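    The abstract names the method but not its details. As an illustration of the general idea, the sketch below flags an atypical record by the magnitude of its projection onto the leading singular subspace; the scoring criterion and the synthetic data are our assumptions, not the APMS implementation.

```python
# Hedged sketch of SVD-based atypicality screening on synthetic flight data.
import numpy as np

rng = np.random.default_rng(0)
flights = rng.normal(size=(200, 12))       # 200 flights x 12 summary features
flights[17] += 8.0                         # plant one atypical flight

X = flights - flights.mean(axis=0)         # center the data
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 3                                      # retained singular vectors
score = np.linalg.norm(U[:, :k] * s[:k], axis=1)  # projection magnitude per flight

print(int(np.argmax(score)))               # the planted flight dominates
```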

  20. Simulating granular materials by energy minimization

    NASA Astrophysics Data System (ADS)

    Krijgsman, D.; Luding, S.

    2016-03-01

    Discrete element methods are extremely helpful in understanding the complex behaviors of granular media, as they give valuable insight into all internal variables of the system. In this paper, a novel discrete element method for performing simulations of granular media is presented, based on the minimization of the potential energy in the system. Contrary to most discrete element methods (i.e., soft-particle method, event-driven method, and non-smooth contact dynamics), the system does not evolve by (approximately) integrating Newton's equations of motion in time, but rather by searching for mechanical equilibrium solutions for the positions of all particles in the system, which is mathematically equivalent to locally minimizing the potential energy. The new method allows for the rapid creation of jammed initial conditions (to be used for further studies) and for the simulation of quasi-static deformation problems. The major advantage of the new method is that it allows for truly static deformations. The system does not evolve with time, but rather with the externally applied strain or load, so that there is no kinetic energy in the system, in contrast to other quasi-static methods. To test the performance of the algorithm for both types of application, we look at the number of iterations required for the system to converge to a stable solution. For each single iteration, the required computational effort scales linearly with the number of particles. During the process of creating initial conditions, the required number of iterations for two-dimensional systems scales with the square root of the number of particles in the system. The required number of iterations increases for systems closer to the jamming packing fraction. For a quasi-static pure shear deformation simulation, the results of the new method are validated by regular soft-particle dynamics simulations. 
The energy minimization algorithm is able to capture the evolution of the
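    As a toy illustration of the idea (our sketch, not the authors' algorithm), the following code finds a static equilibrium of a one-dimensional chain of soft particles by descending the gradient of a harmonic overlap energy, rather than integrating equations of motion.

```python
# Toy energy-minimization "DEM": relax overlapping 1-D particles to equilibrium.
import numpy as np

R = 0.6                                   # particle radius; contact if gap < 2R
k = 1.0                                   # contact stiffness

def energy_and_grad(x):
    E = 0.0
    g = np.zeros_like(x)
    for i in range(len(x) - 1):           # neighbors along the line
        overlap = 2 * R - (x[i + 1] - x[i])
        if overlap > 0:                   # harmonic contact: E = k/2 * overlap^2
            E += 0.5 * k * overlap ** 2
            g[i] += k * overlap           # dE/dx_i
            g[i + 1] -= k * overlap
    return E, g

x = np.array([0.0, 0.8, 1.6])             # overlapping initial packing
for _ in range(2000):                     # plain gradient descent on the energy
    E, g = energy_and_grad(x)
    x -= 0.1 * g

print(E < 1e-10)                          # residual energy is essentially zero
```

    At convergence the residual overlap energy vanishes and all gaps reach the contact distance 2R, i.e., a mechanically equilibrated (unjammed) configuration, with no kinetic energy ever entering the calculation.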

  2. Minimizing water consumption when producing hydropower

    NASA Astrophysics Data System (ADS)

    Leon, A. S.

    2015-12-01

    In 2007, hydropower accounted for only 16% of the world electricity production, with other renewable sources totaling 3%. Thus, it is not surprising that when alternatives are evaluated for new energy developments, there is a strong impulse toward fossil fuel or nuclear energy as opposed to renewable sources. However, as hydropower schemes are often part of a multipurpose water resources development project, they can often help to finance other components of the project. In addition, hydropower systems and their associated dams and reservoirs provide human well-being benefits, such as flood control and irrigation, and societal benefits such as increased recreational activities and improved navigation. Furthermore, hydropower, due to its associated reservoir storage, can provide flexibility and reliability for energy production in integrated energy systems. The storage capability of hydropower systems acts as a regulating mechanism by which other intermittent and variable renewable energy sources (wind, wave, solar) can play a larger role in providing electricity of commercial quality. Minimizing water consumption for producing hydropower is critical given that overuse of water for energy production may result in a shortage of water for other purposes such as irrigation, navigation or fish passage. This paper presents a dimensional analysis for finding optimal flow discharge and optimal penstock diameter when designing impulse and reaction water turbines for hydropower systems. The objective of this analysis is to provide general insights for minimizing water consumption when producing hydropower. This analysis is based on the geometric and hydraulic characteristics of the penstock, the total hydraulic head and the desired power production. As part of this analysis, various dimensionless relationships between power production, flow discharge and head losses were derived. These relationships were used to draw general insights on determining optimal flow discharge and
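    The dimensionless relationships themselves are not reproduced in the abstract. The sketch below illustrates the underlying trade-off with assumed parameter values, using the Darcy-Weisbach head loss, and recovers the classical result that net power peaks when the head loss equals one third of the gross head.

```python
# Illustrative sketch (assumed parameters): net hydropower vs. discharge with
# Darcy-Weisbach penstock losses, h_f = f (L/D) V^2 / (2g).
import math

rho, g = 1000.0, 9.81                     # water density (kg/m^3), gravity (m/s^2)
H = 100.0                                 # gross head, m
f, L, D = 0.02, 500.0, 1.0                # friction factor, penstock length/diameter
A = math.pi * D ** 2 / 4                  # penstock cross-section, m^2

def net_power(Q):
    V = Q / A
    h_f = f * (L / D) * V ** 2 / (2 * g)  # friction head loss
    return rho * g * Q * (H - h_f)        # net hydraulic power, W

# Scan discharges from 0.01 to 10 m^3/s and take the power-maximizing one.
best_Q = max((q / 100 for q in range(1, 1001)), key=net_power)
V = best_Q / A
print(round(f * (L / D) * V ** 2 / (2 * g) / H, 2))  # h_f/H at the optimum
```

    Analytically, with h_f proportional to Q^2 the net power is rho*g*(H*Q - c*Q^3), whose maximum satisfies c*Q^2 = H/3, i.e., a head-loss ratio of 1/3.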

  3. Minimal Intervention Dentistry – A New Frontier in Clinical Dentistry

    PubMed Central

    NK., Bajwa; A, Pathak

    2014-01-01

    Minimally invasive procedures are the new paradigm in health care. Everything from heart bypasses to gall bladder surgeries is being performed with these dynamic new techniques. Dentistry is joining this exciting revolution as well. Minimally invasive dentistry adopts a philosophy that integrates prevention, remineralisation and minimal intervention for the placement and replacement of restorations. Minimally invasive dentistry reaches the treatment objective using the least invasive surgical approach, with the removal of the minimal amount of healthy tissues. This paper reviews in brief the concept of minimal intervention in dentistry. PMID:25177659

  4. Minimal intervention dentistry - a new frontier in clinical dentistry.

    PubMed

    Mm, Jingarwar; Nk, Bajwa; A, Pathak

    2014-07-01

    Minimally invasive procedures are the new paradigm in health care. Everything from heart bypasses to gall bladder surgeries is being performed with these dynamic new techniques. Dentistry is joining this exciting revolution as well. Minimally invasive dentistry adopts a philosophy that integrates prevention, remineralisation and minimal intervention for the placement and replacement of restorations. Minimally invasive dentistry reaches the treatment objective using the least invasive surgical approach, with the removal of the minimal amount of healthy tissues. This paper reviews in brief the concept of minimal intervention in dentistry.

  5. Feminist methodologies and engineering education research

    NASA Astrophysics Data System (ADS)

    Beddoes, Kacey

    2013-03-01

    This paper introduces feminist methodologies in the context of engineering education research. It builds upon other recent methodology articles in engineering education journals and presents feminist research methodologies as a concrete engineering education setting in which to explore the connections between epistemology, methodology and theory. The paper begins with a literature review that covers a broad range of topics featured in the literature on feminist methodologies. Next, data from interviews with engineering educators and researchers who have engaged with feminist methodologies are presented. The ways in which feminist methodologies shape their research topics, questions, frameworks of analysis, methods, practices and reporting are each discussed. The challenges and barriers they have faced are then discussed. Finally, the benefits of further and broader engagement with feminist methodologies within the engineering education community are identified.

  6. 76 FR 71431 - Civil Penalty Calculation Methodology

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-17

    ... TRANSPORTATION Federal Motor Carrier Safety Administration Civil Penalty Calculation Methodology AGENCY: Federal... its civil penalty methodology. Part of this evaluation includes a forthcoming explanation of the Uniform Fine Assessment (UFA) algorithm, which FMCSA currently uses for calculation of civil...

  7. Hamiltonian formalism of minimal massive gravity

    NASA Astrophysics Data System (ADS)

    Mahdavian Yekta, Davood

    2015-09-01

    In this paper, we study the three-dimensional minimal massive gravity (MMG) in the Hamiltonian formalism. At first, we define the canonical gauge generators as building blocks in this formalism and then derive the canonical expressions for the asymptotic conserved charges. The construction of a consistent asymptotic structure of MMG requires introducing suitable boundary conditions. In the second step, we show that the Poisson bracket algebra of the improved canonical gauge generators produces an asymptotic gauge group, which includes two separable versions of the Virasoro algebras. For instance, we study the Banados-Teitelboim-Zanelli (BTZ) black hole as a solution of the MMG field equations, and the conserved charges give the energy and angular momentum of the BTZ black hole. Finally, we compute the black hole entropy from the Cardy formula in the dual conformal field theory and show our result is consistent with the value obtained by using the Smarr formula from the holographic principle.

  8. The minimal work cost of information processing.

    PubMed

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-07-07

    Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes the development of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditional to the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics.
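    The stated formula can be illustrated with a small worked example. The sketch below is our construction, not the paper's derivation: it takes an AND gate on two uniform input bits as the logical process, computes the entropy of the discarded inputs conditioned on the output, and converts it to a work bound at room temperature via Landauer's k_B T ln 2 per bit.

```python
# Worked example of the minimal-work formula for an AND gate (our illustration).
import math

k_B, T = 1.380649e-23, 300.0              # Boltzmann constant (J/K), temperature (K)

def cond_entropy_bits(joint):
    """H(X|Y) in bits from a {(x, y): p} joint distribution."""
    p_y = {}
    for (x, y), p in joint.items():
        p_y[y] = p_y.get(y, 0.0) + p
    return -sum(p * math.log2(p / p_y[y]) for (x, y), p in joint.items() if p > 0)

# AND gate: inputs (a, b) uniform; output y = a & b; the inputs are then discarded.
joint = {((a, b), a & b): 0.25 for a in (0, 1) for b in (0, 1)}
H = cond_entropy_bits(joint)              # = 0.75 * log2(3), about 1.19 bits
work_bound = H * k_B * T * math.log(2)    # minimal work in joules
print(round(H, 3))
```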

  9. The minimal work cost of information processing

    PubMed Central

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-01-01

    Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes the development of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditional to the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics. PMID:26151678

  10. Quasistatic Cracks and Minimal Energy Surfaces

    SciTech Connect

    Raeisaenen, V.I.; Seppala, E.T.; Alava, M.J.; Duxbury, P.M.

    1998-01-01

    We compare the roughness of minimal energy (ME) surfaces and scalar quasistatic fracture (SQF) surfaces. Two-dimensional ME and SQF surfaces have the same roughness scaling, w ~ L^ζ (L is the system size) with ζ = 2/3. The 3d ME and SQF results at strong disorder are consistent with the random-bond Ising exponent ζ(d ≥ 3) ≈ 0.21(5 − d) (d is the bulk dimension). However, 3d SQF surfaces are rougher than ME surfaces due to a larger prefactor. ME surfaces undergo a "weakly rough" to "algebraically rough" transition in 3d, suggesting a similar behavior in fracture. © 1998 The American Physical Society

  11. Intravital microscopy of the lung: minimizing invasiveness.

    PubMed

    Fiole, Daniel; Tournier, Jean-Nicolas

    2016-09-01

    In vivo microscopy has recently become a gold standard in lung immunology studies involving small animals, largely benefiting from the democratization of multiphoton microscopy allowing for deep tissue imaging. This technology represents currently our only way of exploring the lungs and inferring what happens in human respiratory medicine. The interest of lung in vivo microscopy essentially relies upon its relevance as a study model, fulfilling physiological requirements in comparison with in vitro and ex vivo experiments. However, strategies developed in order to overcome movements of the thorax caused by breathing and heartbeats remain the chief drawback of the technique and a major source of invasiveness. In this context, minimizing invasiveness is an unavoidable prerequisite for any improvement of lung in vivo microscopy. This review puts into perspective the main techniques enabling lung in vivo microscopy, providing pros and cons regarding invasiveness. PMID:26846880

  12. "Analytic continuation" of 𝒩 = 2 minimal model

    NASA Astrophysics Data System (ADS)

    Sugawara, Yuji

    2014-04-01

    In this paper we discuss what theory should be identified as the "analytic continuation" with N → −N of the 𝒩 = 2 minimal model with the central charge ĉ = 1 − 2/N. We clarify how the elliptic genus of the expected model is written in terms of holomorphic linear combinations of the "modular completions" introduced in [T. Eguchi and Y. Sugawara, JHEP 1103, 107 (2011)] in the SL(2)_{N+2}/U(1) supercoset theory. We further discuss how this model could be interpreted as a kind of model of the SL(2)_{N+2}/U(1) supercoset in the (R̃, R̃) sector, in which only the discrete spectrum appears in the torus partition function and the potential IR divergence due to the non-compactness of the target space is removed. We also briefly discuss possible definitions of the sectors with other spin structures.

  13. Mechatronic Feasibility of Minimally Invasive, Atraumatic Cochleostomy

    PubMed Central

    Caversaccio, Marco; Proops, David; Brett, Peter

    2014-01-01

    Robotic assistance in the context of lateral skull base surgery, particularly during cochlear implantation procedures, has been the subject of considerable research over the last decade. The use of robotics during these procedures has the potential to provide significant benefits to the patient by reducing invasiveness when gaining access to the cochlea, as well as reducing intracochlear trauma when performing a cochleostomy. Presented herein is preliminary work on the combination of two robotic systems for reducing invasiveness and trauma in cochlear implantation procedures. A robotic system for minimally invasive inner ear access was combined with a smart drilling tool for robust and safe cochleostomy; evaluation was completed on a single human cadaver specimen. Access to the middle ear was successfully achieved through the facial recess without damage to surrounding anatomical structures; cochleostomy was completed at the planned position with the endosteum remaining intact after drilling as confirmed by microscope evaluation. PMID:25110661

  14. A minimal fate-selection switch.

    PubMed

    Weinberger, Leor S

    2015-12-01

    To preserve fitness in unpredictable, fluctuating environments, a range of biological systems probabilistically generate variant phenotypes--a process often referred to as 'bet-hedging', after the financial practice of diversifying assets to minimize risk in volatile markets. The molecular mechanisms enabling bet-hedging have remained elusive. Here, we review how HIV makes a bet-hedging decision between active replication and proviral latency, a long-lived dormant state that is the chief barrier to an HIV cure. The discovery of a virus-encoded bet-hedging circuit in HIV revealed an ancient evolutionary role for latency and identified core regulatory principles, such as feedback and stochastic 'noise', that enable cell-fate decisions. These core principles were later extended to fate selection in stem cells and cancer, exposed new therapeutic targets for HIV, and led to a potentially broad strategy of using 'noise modulation' to redirect cell fate. PMID:26611210

  15. LHC prospects for minimal decaying dark matter

    SciTech Connect

    Arcadi, Giorgio; Covi, Laura; Dradi, Federico

    2014-10-01

    We study the possible signals at the LHC of the minimal models of decaying dark matter. These models are characterized by the fact that DM interacts with SM particles through a renormalizable coupling with an additional heavier charged state. This interaction allows a substantial abundance of DM to be produced in the early Universe via the decay of the charged heavy state, either in- or out-of-equilibrium. Moreover, additional couplings of the charged particle open up decay channels for the DM, which can nevertheless be sufficiently long-lived to be a good DM candidate and within reach of future Indirect Detection observations. We compare the cosmologically favored parameter regions to the LHC discovery reach and discuss the possibility of simultaneous detection of DM decay in Indirect Detection.

  16. The minimal nanowire: Mechanical properties of carbyne

    NASA Astrophysics Data System (ADS)

    Nair, A. K.; Cranford, S. W.; Buehler, M. J.

    2011-07-01

    Advances in molecular assembly are converging on an ultimate in atomistic precision: nanostructures built from single atoms. Recent experimental studies confirm that single chains of carbon atoms (carbyne) exist in stable polyyne structures and can be synthesized, representing the minimal possible nanowire. Here we report the mechanical properties of carbyne obtained by first-principles-based ReaxFF molecular simulation. A peak Young's modulus of 288 GPa is found, with linear stiffnesses ranging from 64.6 to 5 N/m for lengths of 5-64 Å. We identify a size-dependent strength that ranges from 11 GPa (1.3 nN) for the shortest chains to a constant 8 GPa (0.9 nN) for longer carbyne chains. We demonstrate that carbyne chains exhibit extremely high vibrational frequencies, close to 6 THz for the shortest chains, which are found to be highly length-dependent.
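    The reported numbers permit a rough consistency check. The sketch below treats the shortest chain as a single lumped mass of five carbon atoms on the reported 64.6 N/m stiffness (a deliberate oversimplification, not the paper's model) and estimates the resulting vibrational frequency.

```python
# Order-of-magnitude check (our simplification): f = (1/2*pi) * sqrt(k/m).
import math

k = 64.6                                  # N/m, stiffest (shortest) chain
m = 5 * 12.011 * 1.66054e-27              # kg, lumped mass of five carbon atoms
f = math.sqrt(k / m) / (2 * math.pi)      # Hz
print(f / 1e12)                           # THz, same order as the reported ~6 THz
```

    A lumped-mass estimate of a few THz landing within a factor of about two of the reported ~6 THz is the expected level of agreement for so crude a model.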

  17. Constraints on grip-selection: minimizing awkwardness.

    PubMed

    Fischman, M G

    1998-02-01

    In picking up and manipulating an object, the selection of an initial grip (overhand versus underhand) often depends on how comfortable the hand and arm will be at the end of the movement. This effect has been called "end-state comfort" and seems to be an important constraint in grip-selection. The present experiment further explored this effect by selecting a task that would ensure a comfortable ending position regardless of the initial choice of grip. 206 undergraduates picked up a cardboard paper-towel roll from a horizontal position and placed one end down on a table. Analysis showed a clear preference for the overhand grip, as 78% of the participants chose this grip. In addition, more women preferred the overhand grip than men. The findings indicate that people may be sensitive to minimizing awkwardness in both terminal and initial positions. PMID:9530757

  18. MINIMIZATION OF CARBON LOSS IN COAL REBURNING

    SciTech Connect

    Vladimir M. Zamansky; Vitali V. Lissianski

    2001-02-10

    This project develops Fuel-Flexible Reburning (FFR), which combines conventional reburning and Advanced Reburning (AR) technologies with an innovative method of delivering coal as the reburning fuel. The overall objective of this project is to develop engineering and scientific information and know-how needed to improve the cost of reburning via increased efficiency and minimized carbon-in-ash and move the FFR technology to the demonstration and commercialization stage. The first reporting period (August 11, 2000-February 10, 2001) included experimental activities with the primary objective to characterize the impacts of reburning process parameters on NOx reduction at conditions typical of the full-scale boilers. Tests were conducted in GE EER's Boiler Simulator Facility (BSF). Tests showed that NOx reduction in basic coal reburning depends on process conditions, initial NOx and coal type. Up to 60% NOx reduction was achieved at optimized conditions.

  19. Error minimizing algorithms for nearest neighbor classifiers

    SciTech Connect

    Porter, Reid B; Hush, Don; Zimmer, G. Beate

    2011-01-03

    Stack Filters define a large class of discrete nonlinear filters first introduced in image and signal processing for noise removal. In recent years we have suggested their application to classification problems, and investigated their relationship to other types of discrete classifiers such as Decision Trees. In this paper we focus on a continuous-domain version of Stack Filter Classifiers which we call Ordered Hypothesis Machines (OHM), and investigate their relationship to Nearest Neighbor classifiers. We show that OHM classifiers provide a novel framework in which to train Nearest Neighbor type classifiers by minimizing empirical-error-based loss functions. We use the framework to investigate a new cost-sensitive loss function that allows us to train a Nearest Neighbor type classifier for low false alarm rate applications. We report results on both synthetic data and real-world image data.
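
    The OHM training procedure itself is not given in the abstract, but its core ingredients — a nearest-neighbor decision rule and a cost-sensitive empirical loss that penalizes false alarms more heavily than misses — can be sketched as follows. This is a minimal numpy illustration; the cost weights and data are invented for the example, not taken from the paper:

```python
import numpy as np

def nn_predict(X_train, y_train, X_test):
    """1-nearest-neighbor prediction by Euclidean distance."""
    # Pairwise squared distances between test and train points.
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    return y_train[d2.argmin(axis=1)]

def cost_sensitive_loss(y_true, y_pred, c_fa=5.0, c_miss=1.0):
    """Asymmetric empirical loss: a false alarm (predicting 1 on a true 0)
    costs c_fa, a miss costs c_miss, as in low-false-alarm applications.
    The weights here are arbitrary illustrative values."""
    fa = np.sum((y_true == 0) & (y_pred == 1)) * c_fa
    miss = np.sum((y_true == 1) & (y_pred == 0)) * c_miss
    return (fa + miss) / len(y_true)

# Synthetic two-class data separated by a linear rule.
rng = np.random.default_rng(0)
X_tr = rng.normal(size=(100, 2))
y_tr = (X_tr[:, 0] + X_tr[:, 1] > 0).astype(int)
X_te = rng.normal(size=(50, 2))
y_te = (X_te[:, 0] + X_te[:, 1] > 0).astype(int)
print(cost_sensitive_loss(y_te, nn_predict(X_tr, y_tr, X_te)))
```

    A classifier tuned for low false alarm rates would be trained to minimize this weighted loss rather than plain error counts.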

  20. Minimally invasive surgery for esophageal cancer.

    PubMed

    Santillan, Alfredo A; Farma, Jeffrey M; Meredith, Kenneth L; Shah, Nilay R; Kelley, Scott T

    2008-10-01

    Esophageal cancer represents a major public health problem worldwide. Several minimally invasive esophagectomy (MIE) techniques have been described and represent a safe alternative for the surgical management of esophageal cancer in selected high-volume centers with expertise in these techniques. This article reviews the most recent and largest series evaluating MIE techniques. Recent larger series have shown MIE to be equivalent in postoperative morbidity and mortality rates to conventional surgery. MIE has been associated with less blood loss, less postoperative pain, and decreased intensive care unit and hospital length of stay compared with conventional surgery. Despite limited data, conventional surgery and MIE have shown no significant difference in survival, stage for stage. The myriad of MIE techniques complicates the debate of defining the optimal surgical approach for treating esophageal cancer. Randomized controlled trials comparing MIE with conventional open esophagectomy are needed to clarify the ideal procedure with the lowest postoperative morbidity, best quality of life after surgery, and long-term survival.

  1. Minimizing Reheat Energy Use in Laboratories

    SciTech Connect

    Frenze, David; Mathew, Paul; Morehead, Michael; Sartor, Dale; Starr Jr., William

    2005-11-29

    HVAC systems that are designed without properly accounting for equipment load variation across laboratory spaces in a facility can significantly increase simultaneous heating and cooling, particularly for systems that use zone reheat for temperature control. This best practice guide describes the problem of simultaneous heating and cooling resulting from load variations, and presents several technological and design process strategies to minimize it. This guide is one in a series created by the Laboratories for the 21st century ('Labs21') program, a joint program of the U.S. Environmental Protection Agency and U.S. Department of Energy. Geared towards architects, engineers, and facilities managers, these guides provide information about technologies and practices to use in designing, constructing, and operating safe, sustainable, high-performance laboratories.

  2. A review of minimally invasive cosmetic procedures.

    PubMed

    Ogden, S; Griffiths, T W

    2008-11-01

    In today's society the desire to maintain a youthful appearance has driven the development of minimally invasive dermatological procedures that are designed to rejuvenate the ageing face. The aim of this review is to present evidence for the use of techniques which can easily be incorporated into outpatient dermatology practice with low overhead expenditure. For this reason, laser and light-based treatments have been omitted. This review will instead focus on chemical peels, intradermal fillers and botulinum toxin. These techniques address the main aspects of facial ageing, namely photodamage, volume loss and dynamic lines, which correlate anatomically to skin, subcutaneous fat and muscle. A combination of such techniques will provide the practitioner with a reasonable portfolio of treatments for a balanced, holistic result.

  3. Reflections concerning triply-periodic minimal surfaces.

    PubMed

    Schoen, Alan H

    2012-10-01

    In recent decades, there has been an explosion in the number and variety of embedded triply-periodic minimal surfaces (TPMS) identified by mathematicians and materials scientists. Only the rare examples of low genus, however, are commonly invoked as shape templates in scientific applications. Exact analytic solutions are now known for many of the low genus examples. The more complex surfaces are readily defined with numerical tools such as Surface Evolver software or the Landau-Ginzburg model. Even though table-top versions of several TPMS have been placed within easy reach by rapid prototyping methods, the inherent complexity of many of these surfaces makes it challenging to grasp their structure. The problem of distinguishing TPMS, which is now acute because of the proliferation of examples, has been addressed by Lord & Mackay (Lord & Mackay 2003 Curr. Sci. 85, 346-362).

  4. Reflections concerning triply-periodic minimal surfaces

    PubMed Central

    Schoen, Alan H.

    2012-01-01

    In recent decades, there has been an explosion in the number and variety of embedded triply-periodic minimal surfaces (TPMS) identified by mathematicians and materials scientists. Only the rare examples of low genus, however, are commonly invoked as shape templates in scientific applications. Exact analytic solutions are now known for many of the low genus examples. The more complex surfaces are readily defined with numerical tools such as Surface Evolver software or the Landau–Ginzburg model. Even though table-top versions of several TPMS have been placed within easy reach by rapid prototyping methods, the inherent complexity of many of these surfaces makes it challenging to grasp their structure. The problem of distinguishing TPMS, which is now acute because of the proliferation of examples, has been addressed by Lord & Mackay (Lord & Mackay 2003 Curr. Sci. 85, 346–362). PMID:24098851

  5. Minimal residual disease in acute promyelocytic leukemia.

    PubMed

    Weil, S C

    2000-03-01

    In the last decade our understanding of acute promyelocytic leukemia (APL) has advanced tremendously. The recognition of all-trans retinoic acid (ATRA) as a powerful therapeutic agent paralleled the cloning of the t(15;17) breakpoint. RT-PCR for the PML-RARA hybrid mRNA has become the hallmark of molecular diagnosis and molecular monitoring in APL. Current techniques are useful in predicting complete remission and a possible cure in many patients who repeatedly test negative by PCR. Standardizing techniques and improving the sensitivity of the assay are important. Doing this in a way that clinically relevant minimal residual disease can be distinguished from "indolent disease" remains among the future challenges in APL. PMID:10702899

  6. Minimally invasive training in urologic oncology.

    PubMed

    Liu, Jen-Jane; Gonzalgo, Mark L

    2011-11-01

    Use of minimally invasive surgical (MIS) techniques continues to expand in the field of urologic oncology; however, proficiency in these techniques is subject to a learning curve. Current training paradigms have incorporated MIS, but in a non-standardized fashion. Residency work-hour restrictions and ethical concerns may influence efforts to deliver adequate training during a defined residency period. Post-residency fellowships or mini-courses may help urologists gain proficiency in these skills, but are time-consuming and may not provide adequate exposure. Surgical simulation with dry labs and augmentation with virtual reality are important adjuncts to operative training for MIS. The urologic oncologist must be familiar with open and MIS techniques to effectively treat cancer in the least morbid way possible and adapt to the ever-changing field of MIS with dynamic training paradigms. PMID:22155873

  7. Linear functional minimization for inverse modeling

    SciTech Connect

    Barajas-Solano, David A.; Wohlberg, Brendt Egon; Vesselinov, Velimir Valentinov; Tartakovsky, Daniel M.

    2015-06-01

    In this paper, we present a novel inverse modeling strategy to estimate spatially distributed parameters of nonlinear models. The maximum a posteriori (MAP) estimators of these parameters are based on a likelihood functional, which contains spatially discrete measurements of the system parameters and spatiotemporally discrete measurements of the transient system states. The piecewise continuity prior for the parameters is expressed via Total Variation (TV) regularization. The MAP estimator is computed by minimizing a nonquadratic objective equipped with the TV operator. We apply this inversion algorithm to estimate hydraulic conductivity of a synthetic confined aquifer from measurements of conductivity and hydraulic head. The synthetic conductivity field is composed of a low-conductivity heterogeneous intrusion into a high-conductivity heterogeneous medium. Our algorithm accurately reconstructs the location, orientation, and extent of the intrusion from the steady-state data only. Finally, addition of transient measurements of hydraulic head improves the parameter estimation, accurately reconstructing the conductivity field in the vicinity of observation locations.
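
    The abstract's MAP objective — a data misfit plus a Total Variation penalty — can be illustrated on a 1-D toy problem. The sketch below minimizes 0.5·||x − y||² + λ·TV(x) by plain (sub)gradient descent; the paper's actual inversion involves a nonlinear forward model and a more sophisticated solver, so this is only a schematic analogue with invented parameters:

```python
import numpy as np

def tv_map_estimate(y, lam=0.5, step=0.05, n_iter=2000):
    """MAP-style estimate of a piecewise-constant signal x from noisy
    data y, minimizing 0.5*||x - y||^2 + lam * TV(x) with TV(x) =
    sum_i |x[i+1] - x[i]|, via fixed-step (sub)gradient descent."""
    x = y.copy()
    for _ in range(n_iter):
        grad = x - y                # gradient of the quadratic data misfit
        sub = np.sign(np.diff(x))   # subgradient of each |x[i+1] - x[i]|
        g_tv = np.zeros_like(x)
        g_tv[:-1] -= sub            # d/dx[i]   of |x[i+1]-x[i]| is -sign
        g_tv[1:] += sub             # d/dx[i+1] of |x[i+1]-x[i]| is +sign
        x -= step * (grad + lam * g_tv)
    return x

# Piecewise-constant "conductivity profile" with one sharp interface.
rng = np.random.default_rng(1)
truth = np.r_[np.zeros(50), np.ones(50)]
noisy = truth + 0.3 * rng.normal(size=100)
est = tv_map_estimate(noisy)
```

    The TV prior suppresses the noise while largely preserving the jump, which is why it suits piecewise-continuous parameter fields such as the paper's low-conductivity intrusion.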

  8. Minimally invasive procedures for neuropathic pain.

    PubMed

    Sdrulla, Andrei; Chen, Grace

    2016-04-01

    Neuropathic pain is "pain arising as a direct consequence of a lesion or disease affecting the somatosensory system". The prevalence of neuropathic pain ranges from 7 to 11% of the population and minimally invasive procedures have been used to both diagnose and treat neuropathic pain. Diagnostic procedures consist of nerve blocks aimed to isolate the peripheral nerve implicated, whereas therapeutic interventions either modify or destroy nerve function. Procedures that modify how nerves function include epidural steroid injections, peripheral nerve blocks and sympathetic nerve blocks. Neuroablative procedures include radiofrequency ablation, cryoanalgesia and neurectomies. Currently, neuromodulation with peripheral nerve stimulators and spinal cord stimulators are the most evidence-based treatments of neuropathic pain. PMID:26988024

  9. Minimal Increase Network Coding for Dynamic Networks.

    PubMed

    Zhang, Guoyin; Fan, Xu; Wu, Yanxia

    2016-01-01

    Because of the mobility, computing power and changeable topology of dynamic networks, it is difficult for random linear network coding (RLNC) as used in static networks to satisfy the requirements of dynamic networks. To alleviate this problem, a minimal increase network coding (MINC) algorithm is proposed. By identifying the nonzero elements of an encoding vector, it selects blocks to be encoded on the basis of the relationship between the nonzero elements, which controls changes in the degrees of the blocks; the encoding time in a dynamic network is thereby shortened. The results of simulations show that, compared with existing encoding algorithms, the MINC algorithm provides reduced computational complexity of encoding and an increased probability of delivery. PMID:26867211
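
    The MINC algorithm itself is not specified in the abstract, but the objects it manipulates — encoding vectors, their nonzero elements, and block degrees — come from standard random linear network coding. A minimal GF(2) sketch of RLNC (illustrative only; block sizes and counts are invented):

```python
import numpy as np

def rlnc_encode(blocks, n_packets, rng):
    """Random linear network coding over GF(2): each coded packet is the
    XOR of a random subset of source blocks; the 0/1 coefficient vector
    travels with the packet. The number of nonzero coefficients is the
    packet's degree -- the quantity MINC aims to increase minimally."""
    k = len(blocks)
    coeffs = rng.integers(0, 2, size=(n_packets, k), dtype=np.uint8)
    payloads = (coeffs @ blocks) % 2   # XOR-combinations in GF(2)
    return coeffs, payloads

def gf2_rank(m):
    """Rank over GF(2) by Gaussian elimination; a receiver can decode
    once its collected coefficient matrix reaches full rank k."""
    m = m.copy() % 2
    rank = 0
    for col in range(m.shape[1]):
        pivot = next((r for r in range(rank, m.shape[0]) if m[r, col]), None)
        if pivot is None:
            continue
        m[[rank, pivot]] = m[[pivot, rank]]
        for r in range(m.shape[0]):
            if r != rank and m[r, col]:
                m[r] ^= m[rank]
        rank += 1
    return rank

rng = np.random.default_rng(2)
blocks = rng.integers(0, 2, size=(4, 8), dtype=np.uint8)  # 4 blocks of 8 bits
coeffs, payloads = rlnc_encode(blocks, n_packets=8, rng=rng)
print(gf2_rank(coeffs), "of", len(blocks))  # rank k => decodable
```

    A degree-aware variant would bias the choice of which blocks to combine instead of drawing coefficients uniformly, which is the kind of selection the MINC abstract describes.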

  10. Endoscopic navigation for minimally invasive suturing.

    PubMed

    Wengert, Christian; Bossard, Lukas; Häberling, Armin; Baur, Charles; Székely, Gábor; Cattin, Philippe C

    2007-01-01

    Manipulating small objects such as needles, screws or plates inside the human body during minimally invasive surgery can be very difficult for less experienced surgeons, due to the loss of 3D depth perception. This paper presents an approach for tracking a suturing needle using a standard endoscope. The resulting pose information of the needle is then used to generate artificial 3D cues on the 2D screen to optimally support surgeons during tissue suturing. Additionally, if an external tracking device is provided to report the endoscope's position, the suturing needle can be tracked in a hybrid fashion with sub-millimeter accuracy. Finally, a visual navigation aid can be incorporated, if a 3D surface is intraoperatively reconstructed from video or registered from preoperative imaging. PMID:18044620

  11. Minimally disruptive schedule repair for MCM missions

    NASA Astrophysics Data System (ADS)

    Molineaux, Matthew; Auslander, Bryan; Moore, Philip G.; Gupta, Kalyan M.

    2015-05-01

    Mine countermeasures (MCM) missions entail planning and operations in very dynamic and uncertain operating environments, which pose considerable risk to personnel and equipment. Frequent schedule repairs are needed that consider the latest operating conditions to keep the mission on target. Presently no decision support tools are available for the challenging task of MCM mission rescheduling. To address this capability gap, we have developed the CARPE system to assist operation planners. CARPE constantly monitors the operational environment for changes and recommends alternative repaired schedules in response. It includes a novel schedule repair algorithm called Case-Based Local Schedule Repair (CLOSR) that automatically repairs broken schedules while satisfying the requirement of minimal operational disruption. It uses a case-based approach to represent repair strategies and apply them to new situations. Evaluation of CLOSR on simulated MCM operations demonstrates the effectiveness of the case-based strategy. Schedule repairs are generated rapidly, ensure the elimination of all mines, and achieve required levels of clearance.

  12. Minimal residual method stronger than polynomial preconditioning

    SciTech Connect

    Faber, V.; Joubert, W.; Knill, E.

    1994-12-31

    Two popular methods for solving symmetric and nonsymmetric systems of equations are the minimal residual method, implemented by algorithms such as GMRES, and polynomial preconditioning methods. In this study results are given on the convergence rates of these methods for various classes of matrices. It is shown that for some matrices, such as normal matrices, the convergence rates for GMRES and for the optimal polynomial preconditioning are the same, and for other matrices such as the upper triangular Toeplitz matrices, it is at least assured that if one method converges then the other must converge. On the other hand, it is shown that matrices exist for which restarted GMRES always converges but any polynomial preconditioning of corresponding degree makes no progress toward the solution for some initial error. The implications of these results for these and other iterative methods are discussed.
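
    To make concrete what "minimal residual" means here: GMRES picks, from the m-th Krylov subspace, the iterate whose residual norm is smallest. A numpy-only sketch of unrestarted GMRES via the Arnoldi process (a textbook illustration, not the paper's implementation; the test matrix is invented):

```python
import numpy as np

def gmres(A, b, m=20):
    """Minimal-residual approximation from the m-th Krylov subspace:
    build an Arnoldi basis V with A V_m = V_{m+1} H, then choose y
    minimizing ||beta*e1 - H y||, so x = V_m y minimizes ||b - A x||
    over the subspace (zero initial guess)."""
    n = len(b)
    beta = np.linalg.norm(b)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = b / beta
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):              # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:             # happy breakdown: exact solve
            m = j + 1
            break
        V[:, j + 1] = w / H[j + 1, j]
    e1 = np.zeros(m + 1)
    e1[0] = beta
    y, *_ = np.linalg.lstsq(H[:m + 1, :m], e1, rcond=None)
    return V[:, :m] @ y

rng = np.random.default_rng(3)
A = np.eye(30) + 0.1 * rng.normal(size=(30, 30))  # well-conditioned test matrix
b = rng.normal(size=30)
x = gmres(A, b, m=30)
print(np.linalg.norm(b - A @ x))
```

    Because the Krylov subspaces are nested, the residual norm is non-increasing in m; restarting (as in restarted GMRES) trades that monotone guarantee for bounded memory.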

  13. Minimal Increase Network Coding for Dynamic Networks.

    PubMed

    Zhang, Guoyin; Fan, Xu; Wu, Yanxia

    2016-01-01

    Because of the mobility, computing power and changeable topology of dynamic networks, it is difficult for random linear network coding (RLNC) as used in static networks to satisfy the requirements of dynamic networks. To alleviate this problem, a minimal increase network coding (MINC) algorithm is proposed. By identifying the nonzero elements of an encoding vector, it selects blocks to be encoded on the basis of the relationship between the nonzero elements, which controls changes in the degrees of the blocks; the encoding time in a dynamic network is thereby shortened. The results of simulations show that, compared with existing encoding algorithms, the MINC algorithm provides reduced computational complexity of encoding and an increased probability of delivery.

  14. JSC Metal Finishing Waste Minimization Methods

    NASA Technical Reports Server (NTRS)

    Sullivan, Erica

    2003-01-01

    The paper discusses the following: Johnson Space Center (JSC) has achieved VPP Star status and is ISO 9001 compliant. The Structural Engineering Division in the Engineering Directorate is responsible for operating the metal finishing facility at JSC. The Engineering Directorate is responsible for $71.4 million of space flight hardware design, fabrication and testing. The JSC Metal Finishing Facility processes flight hardware to support the programs, in particular schedule- and mission-critical flight hardware. The JSC Metal Finishing Facility is operated by Rothe Joint Venture. The Facility provides the following processes: anodizing, alodining, passivation, and pickling. The JSC Metal Finishing Facility was completely rebuilt in 1998 at a total cost of $366,000. All new tanks, electrical, plumbing, and ventilation were installed. It was designed to meet modern safety, environmental, and quality requirements, to minimize contamination, and to provide the highest quality finishes.

  15. Design and Demonstration of Minimal Lunar Base

    NASA Astrophysics Data System (ADS)

    Boche-Sauvan, L.; Foing, B. H.; Exohab Team

    2009-04-01

    Introduction: We propose a conceptual analysis of a first minimal lunar base, focusing on the system aspects and coordinating every different part as part of an evolving architecture [1-3]. We justify the case for a scientific outpost allowing experiments and sample analysis in a laboratory (relevant to the origin and evolution of the Earth, geophysical and geochemical studies of the Moon, life sciences, observation from the Moon). Research: Research activities will be conducted with this first settlement in: - science (of, from and on the Moon) - exploration (robotic mobility, rover, drilling) - technology (communication, command, organisation, automatism). Life sciences. The life sciences aspects are considered through life support for a crew of 4 (habitat) and laboratory activity with biological experiments as performed on Earth or in LEO, but without any magnetosphere protection and therefore with direct cosmic ray and solar particle effects. Moreover, the ability to study the lunar environment in the field will be a big asset before settling a permanent base [3-5]. Lunar environment. The lunar environment adds constraints to instrument specifications (vacuum, extreme temperature, regolith, seism, micrometeorites). SMART-1 and other missions' data will bring geometrical, chemical and physical details about the environment (soil material characteristics, on-surface conditions …). Test bench. To assess planetary technologies and operations preparing for human Mars exploration. Lunar outpost predesign modular concept: To allow a human presence on the Moon and to carry out these experiments, we will give a pre-design of a minimal human lunar base. Through a modular concept, this base can evolve into a long-duration or permanent base. We will analyse the possibilities of settling such a minimal base by means of current and near-term propulsion technology, such as a full Ariane 5 ME carrying 1.7 T of gross payload to the surface of the Moon.

  16. Minimal dilaton model and the diphoton excess

    NASA Astrophysics Data System (ADS)

    Agarwal, Bakul; Isaacson, Joshua; Mohan, Kirtimaan A.

    2016-08-01

    In light of the recent 750 GeV diphoton excesses reported by the ATLAS and CMS collaborations, we investigate the possibility of explaining this excess using the minimal dilaton model. We find that this model is able to explain the observed excess with the presence of additional top partner(s), with the same charge as the top quark but with mass in the TeV region. First, we constrain the model parameters using, in addition to the 750 GeV diphoton signal strength, precision electroweak tests, single top production measurements, and Higgs signal strength data collected in the earlier runs of the LHC. In addition, we discuss interesting phenomenology that could arise in this model, relevant for future runs of the LHC.

  17. Convex Lower Bounds for Free Energy Minimization

    NASA Astrophysics Data System (ADS)

    Moussa, Jonathan

    We construct lower bounds on free energy with convex relaxations from the nonlinear minimization over probabilities to linear programs over expectation values. Finite-temperature expectation values are further resolved into distributions over energy. A superset of valid expectation values is delineated by an incomplete set of linear constraints. Free energy bounds can be improved systematically by adding constraints, which also increases their computational cost. We compute several free energy bounds of increasing accuracy for the triangular-lattice Ising model to assess the utility of this method. This work was supported by the Laboratory Directed Research and Development program at Sandia National Laboratories. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000.

  18. Waste Minimization and Pollution Prevention Awareness Plan

    SciTech Connect

    Not Available

    1994-04-01

    The purpose of this plan is to document Lawrence Livermore National Laboratory (LLNL) projections for present and future waste minimization and pollution prevention. The plan specifies those activities and methods that are or will be used to reduce the quantity and toxicity of wastes generated at the site. It is intended to satisfy Department of Energy (DOE) requirements. This Plan provides an overview of projected activities from FY 1994 through FY 1999. The plans are broken into site-wide and problem-specific activities. All directorates at LLNL have had an opportunity to contribute input, to estimate budget, and to review the plan. In addition to the above, this plan records LLNL's goals for pollution prevention, regulatory drivers for those activities, assumptions on which the cost estimates are based, analyses of the strengths of the projects, and the barriers to increasing pollution prevention activities.

  19. Endoscopic navigation for minimally invasive suturing.

    PubMed

    Wengert, Christian; Bossard, Lukas; Häberling, Armin; Baur, Charles; Székely, Gábor; Cattin, Philippe C

    2007-01-01

    Manipulating small objects such as needles, screws or plates inside the human body during minimally invasive surgery can be very difficult for less experienced surgeons, due to the loss of 3D depth perception. This paper presents an approach for tracking a suturing needle using a standard endoscope. The resulting pose information of the needle is then used to generate artificial 3D cues on the 2D screen to optimally support surgeons during tissue suturing. Additionally, if an external tracking device is provided to report the endoscope's position, the suturing needle can be tracked in a hybrid fashion with sub-millimeter accuracy. Finally, a visual navigation aid can be incorporated, if a 3D surface is intraoperatively reconstructed from video or registered from preoperative imaging.

  20. Radiation oncology: physics advances that minimize morbidity.

    PubMed

    Allison, Ron R; Patel, Rajen M; McLawhorn, Robert A

    2014-12-01

    Radiation therapy has become an ever more successful treatment for many cancer patients. This is due in large part to advances in physics, including the expanded use of imaging protocols combined with ever more precise therapy devices such as linear and particle beam accelerators, all contributing to treatments with far fewer side effects. This paper will review current state-of-the-art physics maneuvers that minimize morbidity, such as intensity-modulated radiation therapy, volumetric arc therapy, image-guided radiation, radiosurgery and particle beam treatment. We will also highlight future physics enhancements on the horizon such as MRI during treatment and intensity-modulated hadron therapy, all with the continued goal of improved clinical outcomes.

  1. Minimal Increase Network Coding for Dynamic Networks

    PubMed Central

    Wu, Yanxia

    2016-01-01

    Because of the mobility, computing power and changeable topology of dynamic networks, it is difficult for random linear network coding (RLNC) as used in static networks to satisfy the requirements of dynamic networks. To alleviate this problem, a minimal increase network coding (MINC) algorithm is proposed. By identifying the nonzero elements of an encoding vector, it selects blocks to be encoded on the basis of the relationship between the nonzero elements, which controls changes in the degrees of the blocks; the encoding time in a dynamic network is thereby shortened. The results of simulations show that, compared with existing encoding algorithms, the MINC algorithm provides reduced computational complexity of encoding and an increased probability of delivery. PMID:26867211

  2. New identities between unitary minimal Virasoro characters

    NASA Astrophysics Data System (ADS)

    Taormina, Anne

    1994-10-01

    Two sets of identities between unitary minimal Virasoro characters at levels m=3, 4, 5 are presented and proven. The first identity suggests a connection between the Ising and the tricritical Ising models, since the m=3 Virasoro characters are obtained as bilinears of m=4 Virasoro characters. The second identity gives the tricritical Ising model characters as bilinears in the Ising model characters and the six combinations of m=5 Virasoro characters which do not appear in the spectrum of the three-state Potts model. The implication of these identities for the study of the branching rules of N=4 superconformal characters into \widehat{SU(2)} × \widehat{SU(2)} characters is discussed.

  3. Minimally invasive training in urologic oncology.

    PubMed

    Liu, Jen-Jane; Gonzalgo, Mark L

    2011-11-01

    Use of minimally invasive surgical (MIS) techniques continues to expand in the field of urologic oncology; however, proficiency in these techniques is subject to a learning curve. Current training paradigms have incorporated MIS, but in a non-standardized fashion. Residency work-hour restrictions and ethical concerns may influence efforts to deliver adequate training during a defined residency period. Post-residency fellowships or mini-courses may help urologists gain proficiency in these skills, but are time-consuming and may not provide adequate exposure. Surgical simulation with dry labs and augmentation with virtual reality are important adjuncts to operative training for MIS. The urologic oncologist must be familiar with open and MIS techniques to effectively treat cancer in the least morbid way possible and adapt to the ever-changing field of MIS with dynamic training paradigms.

  4. Minimally invasive surgery for thyroid eye disease.

    PubMed

    Naik, Milind Neilkant; Nair, Akshay Gopinathan; Gupta, Adit; Kamal, Saurabh

    2015-11-01

    Thyroid eye disease (TED) can affect the eye in myriad ways: proptosis, strabismus, eyelid retraction, optic neuropathy, soft tissue changes around the eye and an unstable ocular surface. TED consists of two phases: active, and inactive. The active phase of TED is limited to a period of 12-18 months and is mainly managed medically with immunosuppression. The residual structural changes due to the resultant fibrosis are usually addressed with surgery, the mainstay of which is orbital decompression. These surgeries are performed during the inactive phase. The surgical rehabilitation of TED has evolved over the years: not only the surgical techniques, but also the concepts, and the surgical tools available. The indications for decompression surgery have also expanded in the recent past. This article discusses the technological and conceptual advances of minimally invasive surgery for TED that decrease complications and speed up recovery. Current surgical techniques offer predictable, consistent results with better esthetics.

  5. Minimal five dimensional supergravities and complex geometries

    SciTech Connect

    Herdeiro, Carlos A. R.

    2010-07-28

    We discuss the relation between solutions admitting Killing spinors of minimal supergravities in five dimensions, both timelike and null, and complex geometries. For the timelike solutions the results may be summarised as follows. In the ungauged case (vanishing cosmological constant {Lambda} = 0) the solutions are determined in terms of a hyper-Kaehler base space; in the gauged case ({Lambda}<0) the complex geometry is Kaehler; in the de Sitter case ({Lambda}>0) the complex geometry is hyper-Kaehler with torsion (HKT). For the null solutions we shall focus on the de Sitter case, for which the solutions are determined by a constrained Einstein-Weyl 3-geometry called Gauduchon-Tod space. The method for constructing explicit solutions is discussed in each case.

  6. Information technology security system engineering methodology

    NASA Technical Reports Server (NTRS)

    Childs, D.

    2003-01-01

    A methodology is described for system engineering security into large information technology systems under development. The methodology is an integration of a risk management process and a generic system development life cycle process. The methodology is to be used by Security System Engineers to effectively engineer and integrate information technology security into a target system as it progresses through the development life cycle. The methodology can also be used to re-engineer security into a legacy system.

  7. Minimizing forced outage risk in generator bidding

    NASA Astrophysics Data System (ADS)

    Das, Dibyendu

    Competition in power markets has exposed the participating companies to physical and financial uncertainties. Generator companies bid to supply power in a day-ahead market. Once their bids are accepted by the ISO they are bound to supply power. A random outage after acceptance of bids forces a generator to buy power from the expensive real-time hourly spot market and sell to the ISO at the set day-ahead market clearing price, incurring losses. A risk management technique is developed to assess this financial risk associated with forced outages of generators and then minimize it. This work presents a risk assessment module which measures the financial risk of generators bidding in an open market for different bidding scenarios. The day-ahead power market auction is modeled using a Unit Commitment algorithm, and a combination of Normal and Cauchy distributions generates the real-time hourly spot market prices. Risk profiles are derived and VaRs are calculated at the 98 percent confidence level as a measure of financial risk. Risk profiles and VaRs help the generators analyze the forced outage risk and the different factors affecting it. The VaRs and the estimated total earnings for different bidding scenarios are used to develop a risk minimization module. This module develops a bidding strategy for the generator company such that its estimated total earnings are maximized while keeping the VaR below a tolerable limit. This general framework of a risk management technique for generating companies bidding in a competitive day-ahead market can also help them in decisions related to building new generators.
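
    The VaR computation described — a percentile of a simulated earnings distribution driven by forced outages and a Normal-plus-Cauchy spot price — can be sketched in a few lines. All prices, probabilities and quantities below are invented for illustration; the abstract's actual Unit Commitment and market models are far richer:

```python
import numpy as np

def earnings_var(samples, confidence=0.98):
    """Value-at-Risk: the earnings threshold that the generator falls
    below with probability (1 - confidence), i.e. the 2nd percentile
    of the simulated earnings distribution at 98% confidence."""
    return np.percentile(samples, 100 * (1 - confidence))

rng = np.random.default_rng(4)
n_scenarios = 100_000
da_price, spot_mu, qty = 40.0, 45.0, 100.0   # illustrative $/MWh and MWh

# Day-ahead revenue is locked in; a forced outage (5% probability here)
# forces the generator to buy replacement power at a heavy-tailed spot
# price, loosely mimicking the Normal + Cauchy spot model.
spot = spot_mu + 5.0 * rng.standard_normal(n_scenarios) \
              + 1.0 * rng.standard_cauchy(n_scenarios)
outage = rng.random(n_scenarios) < 0.05
earnings = qty * (da_price - np.where(outage, spot, 30.0))  # 30 = running cost
var_98 = earnings_var(earnings)
print(var_98)
```

    A risk minimization module would then search over bidding scenarios for the one maximizing expected earnings subject to this VaR staying above a tolerable floor.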

  8. Towards a Minimal System for Cell Division

    NASA Astrophysics Data System (ADS)

    Schwille, Petra

    We have entered the "omics" era of the life sciences, meaning that our general knowledge about biological systems has become vast, complex, and almost impossible to fully comprehend. Consequently, the challenge for quantitative biology and biophysics is to identify appropriate procedures and protocols that allow the researcher to strip down the complexity of a biological system to a level that can be reliably modeled but still retains the essential features of its "real" counterpart. The virtue of physics has always been the reductionist approach, which allowed scientists to identify the underlying basic principles of seemingly complex phenomena, and subject them to rigorous mathematical treatment. Biological systems are obviously among the most complex phenomena we can think of, and it is fair to state that our rapidly increasing knowledge does not make it easier to identify a small set of fundamental principles of the big concept of "life" that can be defined and quantitatively understood. Nevertheless, it is becoming evident that only by tight cooperation and interdisciplinary exchange between the life sciences and quantitative sciences, and by applying intelligent reductionist approaches also to biology, will we be able to meet the intellectual challenges of the twenty-first century. These include not only the collection and proper categorization of the data, but also their true understanding and harnessing such that we can solve important practical problems imposed by medicine or the worldwide need for new energy sources. Many of these approaches are reflected by the modern buzz word "synthetic biology", therefore I briefly discuss this term in the first section. Further, I outline some endeavors of our and other groups to model minimal biological systems, with particular focus on the possibility of generating a minimal system for cell division.

  9. Managing sediment to minimize environmental impacts

    SciTech Connect

    Sherman, K.

    1995-12-31

    When considering licensing of a hydroelectric project, FERC must give equal consideration to power and nonpower values such as environmental resources. A case study is the existing Rock Creek-Cresta Project, located on the North Fork of the Feather River in northern California, which is in the process of relicensing by the Commission. This project includes two reservoirs - Rock Creek and Cresta Reservoirs, each formed by a dam that diverts water from the river into a tunnel and to a powerhouse. The watershed includes large natural and man-made sediment sources. Rock Creek Reservoir has accumulated 3.9 million cubic yards (cy) of sediments since the dam was built in 1950; Cresta Reservoir has accumulated 2.9 million cy of sediments since 1949. Operational problems began in the 1980s. As part of the relicensing process, Pacific Gas & Electric Company (PG&E) initially proposed a combination of dredging 500,000 cy of sediment from each reservoir, land disposal of dredged sediments, followed by sediment pass-through to achieve a long-term net balance of sediment inflow and outflow. This proposal had substantial economic costs and environmental impact. Potential environmental effects included impacts to water quality and aquatic organisms and to terrestrial habitat from disposal of a million cy of dredged sediments. PG&E used physical and mathematical models to develop an innovative approach that minimized the amount of sediment that needed to be dredged by limiting dredging to the area immediately adjacent to the intake structures. This would also tend to minimize impacts to water quality and aquatic habitat by reducing the area of disturbance within the reservoirs. PG&E proposes to keep the intake areas open and provide for long-term sediment pass-through by providing additional low-level outlet capacity. This would permit reservoir drawdown, which would increase velocities and sediment movement out of the reservoirs.

  10. Minimally invasive total hip arthroplasty: in opposition.

    PubMed

    Hungerford, David S

    2004-06-01

    At the Knee Society Winter Meeting in 2003, Seth Greenwald and I debated about whether there should be new standards (ie, regulations) applied to the release of information to the public on "new developments." I argued for the public's "right to know" prior to the publication of peer-reviewed literature. He argued for regulatory constraint or "proving by peer-reviewed publication" before alerting the public. It is not a contradiction for me to currently argue against the public advertising of minimally invasive (MIS) total hip arthroplasty as not yet being in the best interest of the public. It is hard to remember a concept that has so captured both the public's and the surgical community's fancy as MIS. Patients are "demanding" MIS without knowing why. Surgeons are offering it as the next best, greatest thing without having developed the skill and experience to avoid the surgery's risks. If you put "minimally invasive hip replacement" into the Google search engine (http://www.google.com), you get 5,170 matches. If you put the same words in PubMed (http://www.ncbi.nlm.nih.gov/entrez/query.fcgi), referencing the National Library of Medicine database, you get SEVENTEEN; none is really a peer-reviewed article. Most are 1-page papers in orthopedics from medical education meetings. On the other hand, there are over 6,000 peer-reviewed articles on total hip arthroplasty. Dr. Thomas Sculco, my counterpart in this debate, wrote an insightful editorial in the American Journal of Orthopedic Surgery in which he stated: "Although these procedures have generated incredible interest and enthusiasm, I am concerned that they may be performed to the detriment of our patients." I couldn't agree with him more. Smaller is not necessarily better and, when it is worse, it will be the "smaller" that is held accountable.

  11. Minimal distortion pathways in polyhedral rearrangements.

    PubMed

    Casanova, David; Cirera, Jordi; Llunell, Miquel; Alemany, Pere; Avnir, David; Alvarez, Santiago

    2004-02-18

    A definition of minimum distortion paths between two polyhedra in terms of continuous shape measures (CShM) is presented. A general analytical expression deduced for such pathways makes use of one parameter, the minimum distortion constant, that can be easily obtained through the CShM methodology and is herein tabulated for pairs of polyhedra having four to eight vertexes. The work presented here also allows us to obtain representative model molecular structures along the interconversion pathways. Several commonly used polytopal rearrangement pathways are shown to be in fact minimum distortion pathways: the spread path leading from the tetrahedron to the square, the Berry pseudorotation that interconverts a square pyramid and a trigonal bipyramid, and the Bailar twist for the interconversion of the octahedron and the trigonal prism. Examples of applications to the analysis of the stereochemistries of several families of metal complexes are presented. PMID:14871107

  12. Heuristic Methodology and New Science Studies.

    ERIC Educational Resources Information Center

    Erwin, Susan L.; Erwin, John R.

    This paper considers the use of heuristic methodology as a research vehicle for new science investigations in education. The paper describes heuristic methodology and its use as a means of new science-based research in schools. It also describes how heuristic methodology was used in a 2002 study to explain educational practices through the…

  13. Methodology and the Research-Practice Gap.

    ERIC Educational Resources Information Center

    Robinson, Viviane M. J.

    1998-01-01

    Addresses the mismatch between educational research methodologies and its application to generic features of practice and proposes a problem-based methodology that better links research with problem solving. Implications of this methodology are discussed from recent research on school tracking. (GR)

  14. Design for minimizing fracture risk of all-ceramic cantilever dental bridge.

    PubMed

    Zhang, Zhongpu; Zhou, Shiwei; Li, Eric; Li, Wei; Swain, Michael V; Li, Qing

    2015-01-01

    Minimization of the peak stresses and fracture incidence induced by mastication function is considered critical in design of all-ceramic dental restorations, especially for cantilever fixed partial dentures (FPDs). The focus of this study is on developing a mechanically sound optimal design for an all-ceramic cantilever dental bridge in a posterior region. The topology optimization procedure in association with the Extended Finite Element Method (XFEM) is implemented here to search for the best possible distribution of porcelain and zirconia materials in the bridge structure. The designs with different volume fractions of zirconia are considered. The results show that this new methodology is capable of improving FPD design by minimizing the incidence of cracking in comparison with the initial design. Potentially, it provides dental technicians with a new design tool to develop mechanically sound cantilever fixed partial dentures for more complicated clinical situations. PMID:26405963

  16. Minimal Residual Disease Assessment in the Context of Multiple Myeloma Treatment.

    PubMed

    Nishihori, Taiga; Song, Jinming; Shain, Kenneth H

    2016-04-01

    With contemporary therapeutic strategies in multiple myeloma, heretofore unseen depth and rate of responses are being achieved. These strategies have paralleled improvements in outcome of multiple myeloma patients. The integration of the next generation of proteasome inhibitors and antibody therapeutics promises continued improvements in therapy with the expectation of consistent depth of response not quantifiable by current clinical methods. As such, there is a growing need to develop adequate tools to evaluate deeper disease response after therapy and to refine the response criteria, including minimal residual disease. Several emerging techniques are being evaluated for these purposes, including multi-parameter flow cytometry, allele-specific oligonucleotide polymerase chain reaction, next-generation sequencing, and imaging modalities. In this review, we highlight the recent developments and evaluate advantages and limitations of the current technologies to assess minimal residual disease. We also discuss future applications of these methodologies in potentially guiding multiple myeloma treatment decisions.

  17. Methodological quality of behavioural weight loss studies: a systematic review.

    PubMed

    Lemon, S C; Wang, M L; Haughton, C F; Estabrook, D P; Frisard, C F; Pagoto, S L

    2016-07-01

    This systematic review assessed the methodological quality of behavioural weight loss intervention studies conducted among adults and associations between quality and statistically significant weight loss outcome, strength of intervention effectiveness and sample size. Searches for trials published between January, 2009 and December, 2014 were conducted using PUBMED, MEDLINE and PSYCINFO and identified ninety studies. Methodological quality indicators included study design, anthropometric measurement approach, sample size calculations, intent-to-treat (ITT) analysis, loss to follow-up rate, missing data strategy, sampling strategy, report of treatment receipt and report of intervention fidelity (mean = 6.3). Indicators most commonly utilized included randomized design (100%), objectively measured anthropometrics (96.7%), ITT analysis (86.7%) and reporting treatment adherence (76.7%). Most studies (62.2%) had a follow-up rate > 75% and reported a loss to follow-up analytic strategy or minimal missing data (69.9%). Describing intervention fidelity (34.4%) and sampling from a known population (41.1%) were least common. Methodological quality was not associated with reporting a statistically significant result, effect size or sample size. This review found the published literature of behavioural weight loss trials to be of high quality for specific indicators, including study design and measurement. Areas identified for improvement include the use of more rigorous statistical approaches to loss to follow-up and better fidelity reporting.

  18. Note: A method for minimizing oxide formation during elevated temperature nanoindentation

    SciTech Connect

    Cheng, I. C.; Hodge, A. M.; Garcia-Sanchez, E.

    2014-09-15

    A standardized method to protect metallic samples and minimize oxide formation during elevated-temperature nanoindentation was adapted to a commercial instrument. Nanoindentation was performed on Al (100), Cu (100), and W (100) single crystals submerged in vacuum oil at 200 °C, while the surface morphology and oxidation was carefully monitored using atomic force microscopy (AFM) and X-ray photoelectron spectroscopy (XPS). The results were compared to room temperature and 200 °C nanoindentation tests performed without oil, in order to evaluate the feasibility of using the oil as a protective medium. Extensive surface characterization demonstrated that this methodology is effective for nanoscale testing.

  19. Minimal trellises for linear block codes and their duals

    NASA Technical Reports Server (NTRS)

    Kiely, A. B.; Dolinar, S.; Ekroot, L.; Mceliece, R. J.; Lin, W.

    1995-01-01

    We consider the problem of finding a trellis for a linear block code that minimizes one or more measures of trellis complexity for a fixed permutation of the code. We examine constraints on trellises, including relationships between the minimal trellis of a code and that of the dual code. We identify the primitive structures that can appear in a minimal trellis and relate this to those for the minimal trellis of the dual code.
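
    The central object of this abstract — the state complexity of a minimal trellis — can be computed directly once the generator matrix is in minimal span form: the state-space dimension at each depth equals the number of generator rows whose span crosses that boundary. The sketch below assumes a binary generator matrix already in minimal span form (the example [4, 2] code is my own illustration, not from the paper).

```python
import numpy as np

def state_profile(G):
    """State-space dimensions of the minimal (BCJR) trellis of a linear
    block code, given a generator matrix G in minimal span form.
    Boundary i sits between symbol positions i-1 and i."""
    n = G.shape[1]
    # Span of a row: indices of its first and last nonzero entries.
    spans = []
    for row in G:
        nz = np.nonzero(row)[0]
        spans.append((nz[0], nz[-1]))
    # A row contributes a state bit at boundary i iff its span crosses i.
    return [sum(1 for a, b in spans if a < i <= b) for i in range(n + 1)]

# Illustrative [4, 2] binary code with rows in minimal span form.
G = np.array([[1, 1, 0, 0],
              [0, 1, 1, 1]])
print(state_profile(G))  # -> [0, 1, 1, 1, 0]
```

    The profile always starts and ends at dimension 0; the maximum entry bounds the number of trellis states at any depth by 2 raised to that dimension.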

  20. Simulation enabled safeguards assessment methodology

    SciTech Connect

    Bean, Robert; Bjornard, Trond; Larson, Tom

    2007-07-01

    It is expected that nuclear energy will be a significant component of future supplies. New facilities, operating under a strengthened international nonproliferation regime will be needed. There is good reason to believe virtual engineering applied to the facility design, as well as to the safeguards system design will reduce total project cost and improve efficiency in the design cycle. Simulation Enabled Safeguards Assessment MEthodology has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag and drop wire-frame construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed. (authors)

  1. Simulation Enabled Safeguards Assessment Methodology

    SciTech Connect

    Robert Bean; Trond Bjornard; Thomas Larson

    2007-09-01

    It is expected that nuclear energy will be a significant component of future supplies. New facilities, operating under a strengthened international nonproliferation regime will be needed. There is good reason to believe virtual engineering applied to the facility design, as well as to the safeguards system design will reduce total project cost and improve efficiency in the design cycle. Simulation Enabled Safeguards Assessment MEthodology (SESAME) has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag and drop wireframe construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed.

  2. Indirect Lightning Safety Assessment Methodology

    SciTech Connect

    Ong, M M; Perkins, M P; Brown, C G; Crull, E W; Streit, R D

    2009-04-24

    Lightning is a safety hazard for high-explosives (HE) and their detonators. In an indirect strike, the current flowing from the strike point through the rebar of the building generates electromagnetic (EM) fields inside the facility. The methodology for estimating the risk from indirect lightning effects is presented here. It has two parts: a method to determine the likelihood of a detonation given a lightning strike, and an approach for estimating the likelihood of a strike. The results of these two parts produce an overall probability of a detonation. The probability calculations are complex for five reasons: (1) lightning strikes are stochastic and relatively rare, (2) the quality of the Faraday cage varies from one facility to the next, (3) RF coupling is inherently a complex subject, (4) performance data for abnormally stressed detonators is scarce, and (5) the arc plasma physics is not well understood. Therefore, a rigorous mathematical analysis would be too complex. Instead, our methodology takes a more practical approach combining rigorous mathematical calculations where possible with empirical data when necessary. Where there is uncertainty, we compensate with conservative approximations. The goal is to determine a conservative estimate of the odds of a detonation. In Section 2, the methodology will be explained. This report will discuss topics at a high level. The reasons for selecting an approach will be justified. For those interested in technical details, references will be provided. In Section 3, a simple hypothetical example will be given to reinforce the concepts. While the methodology will touch on all the items shown in Figure 1, the focus of this report is the indirect effect, i.e., determining the odds of a detonation from given EM fields. Professor Martin Uman from the University of Florida has been characterizing and defining extreme lightning strikes. Using Professor Uman's research, Dr. Kimball Merewether at Sandia National Laboratory in Albuquerque calculated the EM fields inside a Faraday-cage type
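
    The two-part structure the abstract describes multiplies out to a single number: the probability of a strike times the conditional probability of detonation given a strike. The values below are purely illustrative placeholders, not figures from the report.

```python
# Combine the two parts of the methodology into an overall probability.
# Both input probabilities are assumed values for illustration only.
p_strike_per_year = 1e-3          # assumed annual probability of a strike
p_detonation_given_strike = 1e-4  # assumed conditional detonation probability
p_detonation = p_strike_per_year * p_detonation_given_strike
print(f"annual detonation probability ~ {p_detonation:.1e}")
```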

  3. Lean methodology in health care.

    PubMed

    Kimsey, Diane B

    2010-07-01

    Lean production is a process management philosophy that examines organizational processes from a customer perspective with the goal of limiting the use of resources to those processes that create value for the end customer. Lean manufacturing emphasizes increasing efficiency, decreasing waste, and using methods to decide what matters rather than accepting preexisting practices. A rapid improvement team at Lehigh Valley Health Network, Allentown, Pennsylvania, implemented a plan, do, check, act cycle to determine problems in the central sterile processing department, test solutions, and document improved processes. By using A3 thinking, a consensus building process that graphically depicts the current state, the target state, and the gaps between the two, the team worked to improve efficiency and safety, and to decrease costs. Use of this methodology has increased teamwork, created user-friendly work areas and processes, changed management styles and expectations, increased staff empowerment and involvement, and streamlined the supply chain within the perioperative area. PMID:20619772

  4. Methodology for flammable gas evaluations

    SciTech Connect

    Hopkins, J.D., Westinghouse Hanford

    1996-06-12

    There are 177 radioactive waste storage tanks at the Hanford Site. The waste generates flammable gases. The waste releases gas continuously, but in some tanks the waste has shown a tendency to trap these flammable gases. When enough gas is trapped in a tank's waste matrix, it may be released in a way that renders part or all of the tank atmosphere flammable for a period of time. Tanks must be evaluated against previously defined criteria to determine whether they can present a flammable gas hazard. This document presents the methodology for evaluating tanks in two areas of concern in the tank headspace: steady-state flammable-gas concentration resulting from continuous release, and concentration resulting from an episodic gas release.

  5. Nuclear weapon reliability evaluation methodology

    SciTech Connect

    Wright, D.L.

    1993-06-01

    This document provides an overview of those activities that are normally performed by Sandia National Laboratories to provide nuclear weapon reliability evaluations for the Department of Energy. These reliability evaluations are first provided as a prediction of the attainable stockpile reliability of a proposed weapon design. Stockpile reliability assessments are provided for each weapon type as the weapon is fielded and are continuously updated throughout the weapon stockpile life. The reliability predictions and assessments depend heavily on data from both laboratory simulation and actual flight tests. An important part of the methodology is the set of opportunities for review that occur throughout the entire process, assuring a consistent approach and appropriate use of the data for reliability evaluation purposes.

  7. 10 CFR 20.1406 - Minimization of contamination.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Minimization of contamination. 20.1406 Section 20.1406... License Termination § 20.1406 Minimization of contamination. (a) Applicants for licenses, other than early... procedures for operation will minimize, to the extent practicable, contamination of the facility and...

  8. 10 CFR 20.1406 - Minimization of contamination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Minimization of contamination. 20.1406 Section 20.1406... License Termination § 20.1406 Minimization of contamination. (a) Applicants for licenses, other than early... procedures for operation will minimize, to the extent practicable, contamination of the facility and...

  9. Support minimized inversion of acoustic and elastic wave scattering

    SciTech Connect

    Safaeinili, A.

    1994-04-24

    This report discusses the following topics on support minimized inversion of acoustic and elastic wave scattering: Minimum support inversion; forward modelling of elastodynamic wave scattering; minimum support linearized acoustic inversion; support minimized nonlinear acoustic inversion without absolute phase; and support minimized nonlinear elastic inversion.

  10. Minimally invasive spine stabilisation with long implants

    PubMed Central

    Logroscino, Carlo Ambrogio; Proietti, Luca

    2009-01-01

    Originally aimed at treating degenerative syndromes of the lumbar spine, percutaneous minimally invasive posterior fixation is nowadays even more frequently used to treat some thoracolumbar fractures. According to the modern principles of saving segment of motion, a short implant (one level above and one level below the injured vertebra) is generally used to stabilise the injured spine. Although the authors generally use a short percutaneous fixation in treating thoracolumbar fractures with good results, they observed some cases in which the high fragmentation of the vertebral body and the presence of other associated diseases (co-morbidities) did not recommend the use of a short construct. The authors identified nine cases, in which a long implant (two levels above and two levels below the injured vertebra) was performed by a percutaneous minimally invasive approach. Seven patients (five males/two females) were affected by thoracolumbar fractures. T12 vertebra was involved in three cases, L1 in two cases, T10 and L2 in one case, respectively. Two fractures were classified as type A 3.1, two as A 3.2, two as A 3.3 and one as B 2.3, according to Magerl. In the present series, there were also two patients affected by a severe osteolysis of the spine (T9 and T12) due to tumoral localisation. All patients operated on with long instrumentation had a good outcome with prompt and uneventful clinical recovery. At the 1-year follow-up, all patients except one, who died 11 months after the operation, did not show any radiologic signs of mobilisation or failure of the implant. Based on the results of the present series, the long percutaneous fixation seems to represent an effective and safe system to treat particular cases of vertebral lesions. In conclusion, the authors believe that a long implant might be an alternative surgical method compared to more aggressive or demanding procedures, which in a few patients could represent an overtreatment. PMID:19399530

  11. Waste Minimization via Radiological Hazard Reduction

    SciTech Connect

    Stone, K.A.; Coffield, T.; Hooker, K.L.

    1998-03-01

    The Savannah River Site (SRS), an 803 km² U.S. Department of Energy (DOE) facility in south-western South Carolina, incorporates pollution prevention as a fundamental component of its Environmental Management System. A comprehensive pollution prevention program was implemented as part of an overall business strategy to reduce waste generation and pollution releases, minimize environmental impacts, and to reduce future waste management and pollution control costs. In fiscal years 1995 through 1997, the Site focused on implementing specific waste reduction initiatives identified while benchmarking industry best practices. These efforts resulted in greater than $25 million in documented cost avoidance. While these results have been dramatic to date, the Site is further challenged to maximize resource utilization and deploy new technologies and practices to achieve further waste reductions. The Site has elected to target a site-wide reduction of contaminated work spaces in fiscal year 1998 as the primary source reduction initiative. Over 120,900 m² of radiologically contaminated work areas (approximately 600 separate inside areas) exist at SRS. Reduction of these areas reduces future waste generation, minimizes worker exposure, and reduces surveillance and maintenance costs. This is a major focus of the Site's As Low As Reasonably Achievable (ALARA) program by reducing sources of worker exposure. The basis for this approach was demonstrated during 1997 as part of a successful Enhanced Work Planning pilot conducted at several specific contamination areas at SRS. An economic-based prioritization process was utilized to develop a model for prioritizing areas to reclaim. In the H-Canyon Separation facility, over 3,900 m² of potentially contaminated area was rolled back to a Radiation Buffer Area. The facility estimated nearly 420 m³ of low level radioactive waste will be avoided each year, and overall cost savings and productivity gains will reach

  12. Minimizing Glovebox Glove Breaches: PART II.

    SciTech Connect

    Cournoyer, M. E.; Andrade, R.M.; Taylor, D. J.; Stimmel, J. J.; Zaelke, R. L.; Balkey, J. J.

    2005-01-01

    As a matter of good business practices, a team of glovebox experts from Los Alamos National Laboratory (LANL) has been assembled to proactively investigate processes and procedures that minimize unplanned breaches in the glovebox, e.g., glove failures. A major part of this effort involves the review of glovebox glove failures that have occurred at the Plutonium Facility and at the Chemical and Metallurgy Research Facility. Information dating back to 1993 has been compiled from formal records. This data has been combined with information obtained from a baseline inventory of about 9,000 glovebox gloves. The key attributes tracked include those related to location, the glovebox glove, type and location of breaches, the worker, and the consequences resulting from breaches. This glovebox glove failure analysis yielded results in the areas of the ease of collecting this type of data, the causes of most glove failures that have occurred, the effectiveness of current controls, and recommendations to improve hazard control systems. As expected, a significant number of breaches involve high-risk operations such as grinding, hammering, using sharps (especially screwdrivers), and assembling equipment. Surprisingly, tasks such as the movement of equipment and material between gloveboxes and the opening of cans are also major contributions of breaches. Almost half the gloves fail within a year of their install date. The greatest consequence for over 90% of glovebox glove failures is alpha contamination of protective clothing. Personnel self-monitoring at the gloveboxes continues to be the most effective way of detecting glovebox glove failures. Glove failures from these tasks can be reduced through changes in procedures and the design of remote-handling apparatus. The Nuclear Materials Technology Division management uses this information to improve hazard control systems to reduce the number of unplanned breaches in the glovebox further. 
As a result, excursions of contaminants

  13. The Minimal Cost of Life in Space

    NASA Astrophysics Data System (ADS)

    Drysdale, A.; Rutkze, C.; Albright, L.; Ladue, R.

    Life in space requires protection from the external environment, provision of a suitable internal environment, provision of consumables to maintain life, and removal of wastes. Protection from the external environment will mainly require shielding from radiation and meteoroids. Provision of a suitable environment inside the spacecraft will require provision of suitable air pressure and composition, temperature, and protection from environmental toxins (trace contaminants) and pathogenic micro-organisms. Gravity may be needed for longer missions to avoid excessive changes such as decalcification and muscle degeneration. Similarly, the volume required per crewmember will increase as the mission duration increases. Consumables required include oxygen, food, and water. Nitrogen might be required, depending on the total pressure and non-metabolic losses. We normally provide these consumables from the Earth, with a greater or lesser degree of regeneration. In principle, all consumables can be regenerated. Water and air are easiest to regenerate. At the present time, food can only be regenerated by using plants, and higher plants at that. Waste must be removed, including carbon dioxide and other metabolic waste as well as trash such as food packaging, filters, and expended spare parts. This can be done by dumping or regeneration. The minimal cost of life in space would be achieved by using a synthesis process or system to regenerate all consumables from wastes. As the efficiency of the various processes rises, the minimal cost of life support will fall. However, real-world regeneration requires significant equipment, power, and crew time. Make-up will be required for those items that cannot be economically regenerated. For very inefficient processes, it might be cheaper to ship all or part of the consumables. We are currently far down the development curve, and for short missions it is cheaper to ship consumables. For longer-duration missions, greater closure is cost effective.
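
    The ship-versus-regenerate trade-off in the abstract reduces to a simple break-even calculation: shipped consumable mass grows linearly with mission duration, while a regeneration system carries a fixed equipment mass plus a small make-up rate. All numbers below are illustrative assumptions, not values from the abstract.

```python
# Break-even mission duration between shipping a consumable and
# regenerating it on board. Illustrative water-loop numbers only.
ship_rate = 3.5          # kg/day of water per crewmember if shipped
regen_equipment = 500.0  # kg fixed equipment mass of a recovery system
regen_makeup = 0.35      # kg/day make-up for a 90%-efficient loop

def breakeven_days(rate, fixed, makeup):
    # Shipping beats regeneration while rate*d < fixed + makeup*d,
    # so regeneration wins for d > fixed / (rate - makeup).
    return fixed / (rate - makeup)

d = breakeven_days(ship_rate, regen_equipment, regen_makeup)
print(f"regeneration pays off after about {d:.0f} days")
```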

  14. Minimally Invasive Versus Conventional Aortic Valve Replacement

    PubMed Central

    Attia, Rizwan Q.; Hickey, Graeme L.; Grant, Stuart W.; Bridgewater, Ben; Roxburgh, James C.; Kumar, Pankaj; Ridley, Paul; Bhabra, Moninder; Millner, Russell W. J.; Athanasiou, Thanos; Casula, Roberto; Chukwuemka, Andrew; Pillay, Thasee; Young, Christopher P.

    2016-01-01

Objective Minimally invasive aortic valve replacement (MIAVR) has been demonstrated as a safe and effective option but remains underused. We aimed to evaluate outcomes of isolated MIAVR compared with conventional aortic valve replacement (CAVR). Methods Data from the National Institute for Cardiovascular Outcomes Research (NICOR) were analyzed at seven volunteer centers (2006–2012). Primary outcomes were in-hospital mortality and midterm survival. Secondary outcomes were postoperative length of stay as well as cumulative bypass and cross-clamp times. Propensity modeling with matched cohort analysis was used. Results Of 307 consecutive MIAVR procedures, 151 (49%) were performed during the last 2 years of the study, with a continued increase in numbers. The 307 MIAVR patients were matched on a 1:1 ratio. Compared with the matched CAVR group, there was no statistically significant difference in in-hospital mortality [MIAVR, 4/307 (1.3%); 95% confidence interval (CI), 0.4%–3.4% vs CAVR, 6/307 (2.0%); 95% CI, 0.8%–4.3%; P = 0.752]. One-year survival rates in the MIAVR and CAVR groups were 94.4% and 94.6%, respectively. There was no statistically significant difference in midterm survival (P = 0.677; hazard ratio, 0.90; 95% CI, 0.56–1.46). Median postoperative length of stay was lower in the MIAVR patients by 1 day (P = 0.009). The mean cumulative bypass time (94.8 vs 91.3 minutes; P = 0.333) and cross-clamp time (74.6 vs 68.4 minutes; P = 0.006) were longer in the MIAVR group; however, the difference was significant only for cross-clamp time. Conclusions Minimally invasive aortic valve replacement is a safe alternative to CAVR with respect to operative and 1-year mortality and is associated with a shorter postoperative stay. Further studies are required in high-risk (logistic EuroSCORE > 10) patients to define the role of MIAVR. PMID:26926521
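
    The exact binomial confidence intervals quoted above (e.g., 4/307 in-hospital deaths) can be approximately reproduced with a Clopper-Pearson calculation. The abstract does not state which interval method was used, so this stdlib-only sketch is illustrative:

```python
from math import comb

def binom_cdf(k, n, p):
    # P(X <= k) for X ~ Binomial(n, p); only the first k+1 terms are needed
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def clopper_pearson(k, n, alpha=0.05):
    # Exact two-sided CI for a binomial proportion, by bisecting the tails.
    def invert(f):
        lo, hi = 0.0, 1.0
        for _ in range(200):       # f(p) is decreasing in p; bisect its root
            mid = (lo + hi) / 2
            if f(mid) > 0:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2
    lower = 0.0 if k == 0 else invert(lambda p: binom_cdf(k - 1, n, p) - (1 - alpha / 2))
    upper = 1.0 if k == n else invert(lambda p: binom_cdf(k, n, p) - alpha / 2)
    return lower, upper

lo, hi = clopper_pearson(4, 307)   # MIAVR arm: 4 deaths in 307 patients
```

    This yields an interval close to the 0.4%–3.4% reported for the MIAVR arm; small differences would simply reflect a different choice of interval method.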

  15. [Minimally Invasive Thoracoscopic Surgery for Mediastinal Lesions].

    PubMed

    Maeda, Sumiko

    2016-07-01

This review article describes minimally invasive thoracoscopic surgery for anterior mediastinal lesions. Over the past couple of decades, the operative procedures for anterior mediastinal lesions have changed from open surgery under median sternotomy to complete thoracoscopic mediastinal surgery with sternal lifting or carbon dioxide insufflation. Carbon dioxide insufflation of the thoracic cavity or the mediastinum is now prevailing to improve the surgical field and facilitate the operative procedures. Surgical indications for complete thoracoscopic mediastinal surgery include benign cystic lesions, generally regardless of their size, and non-invasive anterior mediastinal tumors usually less than 50–60 mm in the greatest dimension. There are currently three surgical approaches in complete thoracoscopic surgery for anterior mediastinal lesions. One is the unilateral or bilateral transthoracic approach. The second is the combination of the subxiphoid and the transthoracic approach. The last is the subxiphoid approach. The selection of the surgical approach depends on the surgeon's preference and experience. When carbon dioxide insufflation is applied during the operation, the following complications may occur: hypercapnia, gas embolism, subcutaneous emphysema, endotracheal tube dislocation due to mediastinal shift, and hypotension. Special safety considerations are necessary during complete thoracoscopic mediastinal surgery with carbon dioxide insufflation. PMID:27440034

  16. Minimal model for dark matter and unification

    SciTech Connect

    Mahbubani, Rakhi; Senatore, Leonardo

    2006-02-15

Gauge coupling unification and the success of TeV-scale weakly-interacting dark matter are usually taken as evidence of low-energy supersymmetry (SUSY). However, if we assume that the tuning of the Higgs can be explained in some unnatural way, from environmental considerations for example, SUSY is no longer a necessary component of any beyond-the-standard-model theory. In this paper we study the minimal model with a dark matter candidate and gauge coupling unification. This consists of the standard model plus fermions with the quantum numbers of SUSY Higgsinos, and a singlet. It predicts thermal dark matter with a mass that can range from 100 GeV to around 2 TeV and generically gives rise to an electric dipole moment (EDM) that is just beyond current experimental limits, with a large portion of its allowed parameter space accessible to next-generation EDM and direct detection experiments. We study precision unification in this model by embedding it in a 5D orbifold GUT where certain large threshold corrections are calculable, achieving gauge coupling and b–τ unification, and predicting a rate of proton decay just beyond current limits.

  17. Minimizing metastatic risk in radiotherapy fractionation schedules

    NASA Astrophysics Data System (ADS)

    Badri, Hamidreza; Ramakrishnan, Jagdish; Leder, Kevin

    2015-11-01

    Metastasis is the process by which cells from a primary tumor disperse and form new tumors at distant anatomical locations. The treatment and prevention of metastatic cancer remains an extremely challenging problem. This work introduces a novel biologically motivated objective function to the radiation optimization community that takes into account metastatic risk instead of the status of the primary tumor. In this work, we consider the problem of developing fractionated irradiation schedules that minimize production of metastatic cancer cells while keeping normal tissue damage below an acceptable level. A dynamic programming framework is utilized to determine the optimal fractionation scheme. We evaluated our approach on a breast cancer case using the heart and the lung as organs-at-risk (OAR). For small tumor α /β values, hypo-fractionated schedules were optimal, which is consistent with standard models. However, for relatively larger α /β values, we found the type of schedule depended on various parameters such as the time when metastatic risk was evaluated, the α /β values of the OARs, and the normal tissue sparing factors. Interestingly, in contrast to standard models, hypo-fractionated and semi-hypo-fractionated schedules (large initial doses with doses tapering off with time) were suggested even with large tumor α/β values. Numerical results indicate the potential for significant reduction in metastatic risk.
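
    The dynamic-programming idea can be sketched with a toy stand-in: choose per-fraction doses from a grid to maximize tumor biologically effective dose (BED) while keeping the organ-at-risk (OAR) BED under a cap, using the standard linear-quadratic model. The objective and all parameter values below are hypothetical illustrations of the DP machinery, not the authors' metastatic-risk model:

```python
def plan_fractions(n_frac, dose_grid, ab_tumor, ab_oar, sparing, oar_cap, step=0.1):
    # DP over (fraction index, discretized remaining OAR-BED budget).
    # Maximizes total tumor BED  sum d*(1 + d/ab_tumor)  subject to
    # sum s*d*(1 + s*d/ab_oar) <= oar_cap  (s = normal-tissue sparing factor).
    nb = int(round(oar_cap / step)) + 1
    NEG = float("-inf")
    value = [NEG] * nb
    value[nb - 1] = 0.0                      # start with the full OAR budget
    back = []
    for _ in range(n_frac):
        nxt, bk = [NEG] * nb, [None] * nb
        for b in range(nb):
            if value[b] == NEG:
                continue
            budget = b * step
            for d in dose_grid:
                cost = sparing * d * (1 + sparing * d / ab_oar)
                if cost > budget + 1e-9:
                    continue                 # would exceed the OAR constraint
                gain = d * (1 + d / ab_tumor)
                b2 = int(round((budget - cost) / step))
                if value[b] + gain > nxt[b2]:
                    nxt[b2], bk[b2] = value[b] + gain, (b, d)
        value = nxt
        back.append(bk)
    b = max(range(nb), key=lambda i: value[i])
    best_bed, doses = value[b], []
    for bk in reversed(back):                # walk the back-pointers
        prev_b, d = bk[b]
        doses.append(d)
        b = prev_b
    return list(reversed(doses)), best_bed
```

    Including 0.0 in the dose grid guarantees a feasible plan; with, say, 5 fractions, a 0–3 Gy grid, tumor α/β = 10, OAR α/β = 3, a sparing factor of 0.7, and an OAR cap of 20, the program fills the budget with the largest feasible doses.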

  18. Digital breast tomosynthesis with minimal breast compression

    NASA Astrophysics Data System (ADS)

    Scaduto, David A.; Yang, Min; Ripton-Snyder, Jennifer; Fisher, Paul R.; Zhao, Wei

    2015-03-01

    Breast compression is utilized in mammography to improve image quality and reduce radiation dose. Lesion conspicuity is improved by reducing scatter effects on contrast and by reducing the superposition of tissue structures. However, patient discomfort due to breast compression has been cited as a potential cause of noncompliance with recommended screening practices. Further, compression may also occlude blood flow in the breast, complicating imaging with intravenous contrast agents and preventing accurate quantification of contrast enhancement and kinetics. Previous studies have investigated reducing breast compression in planar mammography and digital breast tomosynthesis (DBT), though this typically comes at the expense of degradation in image quality or increase in mean glandular dose (MGD). We propose to optimize the image acquisition technique for reduced compression in DBT without compromising image quality or increasing MGD. A zero-frequency signal-difference-to-noise ratio model is employed to investigate the relationship between tube potential, SDNR and MGD. Phantom and patient images are acquired on a prototype DBT system using the optimized imaging parameters and are assessed for image quality and lesion conspicuity. A preliminary assessment of patient motion during DBT with minimal compression is presented.

  19. Waste minimization and pollution prevention awareness plan

    SciTech Connect

    1994-08-01

The primary mission of DOE/NV is to manage and operate the Nevada Test Site (NTS) and other designated test locations, within and outside the United States; provide facilities and services to DOE and non-DOE NTS users; and plan, coordinate, and execute nuclear weapons tests and related test activities. DOE/NV also: (a) Supports operations under interagency agreements pertaining to tests, emergencies, and related functions/activities, (b) Plans, coordinates, and executes environmental restoration, (c) Provides support to the Yucca Mountain Site Characterization Project Office in conjunction with DOE/HQ oversight, (d) Manages the Radioactive Waste Management Sites (RWMS) for disposal of low-level and mixed wastes received from the NTS and off-site generators, and (e) Implements waste minimization programs to reduce the amount of hazardous, mixed, radioactive, and nonhazardous solid waste that is generated and disposed. The NTS, which is the primary facility controlled by DOE/NV, occupies 1,350 square miles of restricted-access, federally-owned land located in Nye County in Southern Nevada. The NTS is located in a sparsely populated area, approximately 65 miles northwest of Las Vegas, Nevada.

  20. Management options for minimal hepatic encephalopathy.

    PubMed

    Bajaj, Jasmohan S

    2008-12-01

    Minimal hepatic encephalopathy (MHE) is a neurocognitive dysfunction that is present in the majority of patients with cirrhosis. MHE has a characteristic cognitive profile that cannot be diagnosed clinically. This cognitive dysfunction is independent of sleep dysfunction or problems with overall intelligence. MHE has a significant impact on quality of life, the ability to function in daily life and progression to overt hepatic encephalopathy. Driving ability can be impaired in MHE and this may be a significant factor behind motor vehicle accidents. A crucial aspect of the clinical care of MHE patients is their driving history, which is often ignored during routine care and can add a vital dimension to the overall disease assessment. Driving history should be an integral part of the care of patients with MHE. The preserved communication skills and lack of specific signs and insight make MHE difficult to diagnose. The predominant strategies for MHE diagnosis are psychometric or neurophysiological testing. These are usually limited by financial, normative or time constraints. Studies into inhibitory control, cognitive drug research and critical flicker frequency tests are encouraging. These tests do not require a psychologist for administration and interpretation. Lactulose and probiotics have been studied for their potential use as therapies for MHE, but these are not standard-of-care practices at this time. Therapy can improve the quality of life in MHE patients but the natural history, specific diagnostic strategies and treatment options are still being investigated. PMID:19090738

  1. Process optimized minimally invasive total hip replacement

    PubMed Central

    Gebel, Philipp; Oszwald, Markus; Ishaque, Bernd; Ahmed, Gaffar; Blessing, Recha; Thorey, Fritz; Ottersbach, Andreas

    2012-01-01

The purpose of this study was to analyse a new concept of using the minimally invasive direct anterior approach (DAA) in total hip replacement (THR) in combination with a leg positioner (Rotex-Table) and a modified retractor system (Condor). We retrospectively evaluated the first 100 primary THRs performed with the new concept between 2009 and 2010, regarding operative data and radiological and clinical outcome (HOOS). All surgeries were performed in a standardized operative technique including navigation. The average age of the patients was 68 years (37 to 92 years), with a mean BMI of 26.5 (17 to 43). The mean time of surgery was 80 min (55 to 130 min). Blood loss averaged 511.5 mL (200 to 1000 mL). No intra-operative complications occurred. The postoperative complication rate was 6%. The HOOS increased from 43 points pre-operatively to 90 (max 100 points) 3 months after surgery. The radiological analysis showed an average cup inclination of 43° and a leg length discrepancy within +/− 5 mm in 99%. The presented technique led to excellent clinical results, showed low complication rates, and allowed correct implant positioning while saving manpower. PMID:22577504

  2. Minimally invasive knee arthroplasty: An overview

    PubMed Central

    Tria, Alfred J; Scuderi, Giles R

    2015-01-01

Minimally invasive surgery (MIS) for arthroplasty of the knee began with surgery for unicondylar knee arthroplasty (UKA). Partial knee replacements were designed in the 1970s and were amenable to a more limited exposure. In the 1990s Repicci popularized MIS for UKA. Surgeons began to apply his concepts to total knee arthroplasty. Four MIS surgical techniques were developed: quadriceps sparing, mini-midvastus, mini-subvastus, and mini-medial parapatellar. The quadriceps sparing technique is the most limited one and is also the most difficult. However, it is the least invasive and allows rapid recovery. The mini-midvastus is the most common technique because it affords slightly better exposure and can be extended. The mini-subvastus technique entirely avoids incising the quadriceps extensor mechanism but is time consuming and difficult in the obese and in the muscular male patient. The mini-parapatellar technique is most familiar to surgeons and represents a good starting point for surgeons who are learning the techniques. The surgeries are easier with smaller instruments but can be performed with standard ones. The techniques are accurate and do lead to a more rapid recovery, with less pain, less blood loss, and greater motion if they are appropriately performed. PMID:26601062

  3. Minimally Invasive Procedures for Nasal Aesthetics

    PubMed Central

    Redaelli, Alessio; Limardo, Pietro

    2012-01-01

The nose plays an important role in the aesthetics of the face. It is easy to understand the reason for the major interest that has revolved around the correction of its imperfections for several centuries, or even since ancient times. In the last decade, all the surgical and medical minimally invasive techniques evolved exponentially. The techniques for rejuvenation and correction of nasal imperfections did not escape this development, which is widespread in the medicine of the third millennium. In many cases, the techniques of surgical correction involve invasive procedures that necessitate, in the majority of cases, hospitalisation. The author, using a different approach, has developed minimally invasive techniques using botulinum toxin A (BTxA) and absorbable fillers for the correction of nasal imperfections. BTxA makes it possible to reduce the imperfections due to hypertension of the muscles, while the absorbable fillers make it possible to correct all the imperfections of the nasal profile from the root to the tip in total safety. The correction is based on precise rules that allow the majority of side effects to be avoided. Results are long lasting and well appreciated by patients. PMID:23060706

  4. Minimally Invasive Approach to Achilles Tendon Pathology.

    PubMed

    Hegewald, Kenneth W; Doyle, Matthew D; Todd, Nicholas W; Rush, Shannon M

    2016-01-01

    Many surgical procedures have been described for Achilles tendon pathology; however, no overwhelming consensus has been reached for surgical treatment. Open repair using a central or paramedian incision allows excellent visualization for end-to-end anastomosis in the case of a complete rupture and detachment and reattachment for insertional pathologies. Postoperative wound dehiscence and infection in the Achilles tendon have considerable deleterious effects on overall functional recovery and outcome and sometimes require plastic surgery techniques to achieve coverage. With the aim of avoiding such complications, foot and ankle surgeons have studied less invasive techniques for repair. We describe a percutaneous approach to Achilles tendinopathy using a modification of the Bunnell suture weave technique combined with the use of interference screws. No direct end-to-end repair of the tendon is performed, rather, the proximal stump is brought in direct proximity of the distal stump, preventing overlengthening and proximal stump retraction. This technique also reduces the suture creep often seen with end-to-end tendon repair by providing a direct, rigid suture to bone interface. We have used the new technique to minimize dissection and exposure while restoring function and accelerating recovery postoperatively. PMID:26385574

  5. Minimally invasive dentistry and the dental enterprise.

    PubMed

    Rossomando, Edward F

    2007-03-01

Improvements in understanding the process of remineralization have resulted in a reappraisal of the repair of damaged tooth structure and call into question GV Black's principles of cavity preparation, including his principle of "extension for prevention." From this reappraisal has emerged the idea of minimally invasive dentistry (MID). The goal of MID is to remove as little of the sound tooth structure as possible during the restoration phase. This goal is within our reach, in part because of the availability of products that promote mineralization and of dental excavation instruments, like the dental laser, that can be managed to remove only damaged tooth structure. It is critical that the leaders of the dental enterprise endorse MID. Delay could allow new products to move from the dental profession to other health care providers. For example, a caries vaccine will soon enter the marketplace. Will dentists expand the scope of their practices to include the application of this vaccine, or will they ignore this new product and allow the new technology to enter the scope of practice of other health providers?

  6. Design and Demonstration of Minimal Lunar Base

    NASA Astrophysics Data System (ADS)

    Boche-Sauvan, L.; Foing, B. H.; Exohab Team

    2009-04-01

Introduction: We propose a conceptual analysis of a first minimal lunar base, focusing on the system aspects and coordinating the different parts of an evolving architecture [1-3]. We justify the case for a scientific outpost allowing experiments and sample analysis in a laboratory (relevant to the origin and evolution of the Earth, geophysical and geochemical studies of the Moon, life sciences, observation from the Moon). Research: Research activities will be conducted with this first settlement in: - science (of, from and on the Moon) - exploration (robotic mobility, rover, drilling), - technology (communication, command, organisation, automatism). Life sciences. The life sciences aspects are considered through life support for a crew of 4 (habitat) and laboratory activity with biological experiments of the kind performed on Earth or in LEO, but here without any magnetosphere protection and therefore with direct cosmic ray and solar particle effects. Moreover, the ability to study the lunar environment in the field will be a big asset before establishing a permanent base [3-5]. Lunar environment. The lunar environment adds constraints to instrument specifications (vacuum, extreme temperature, regolith, seisms, micrometeorites). SMART-1 and other missions will bring geometrical, chemical and physical details about the environment (soil material characteristics, surface conditions …). Test bench. To assess planetary technologies and operations preparing for human Mars exploration. Lunar outpost predesign modular concept: To allow a human presence on the Moon and to carry out these experiments, we will give a pre-design of a minimal human lunar base. Through a modular concept, this base could possibly be evolved into a long-duration or permanent base. We will analyse the possibilities of establishing such a minimal base by means of current and near-term propulsion technology, i.e. a full Ariane 5 ME carrying 1.7 t of gross payload to the surface of the Moon.

  7. Minimizing radiation exposure during percutaneous nephrolithotomy.

    PubMed

    Chen, T T; Preminger, G M; Lipkin, M E

    2015-12-01

Given the recent trends in growing per capita radiation dose from medical sources, there have been increasing concerns over patient radiation exposure. Patients with kidney stones undergoing percutaneous nephrolithotomy (PNL) are at particular risk for high radiation exposure. Several risk factors for increased radiation exposure during PNL exist, including high body mass index, multiple access tracts, and increased stone burden. We herein review recent trends in radiation exposure, radiation exposure during PNL to both patients and urologists, and various approaches to reduce radiation exposure. We discuss incorporating the principles of As Low As Reasonably Achievable (ALARA) into clinical practice and review imaging techniques such as ultrasound and air contrast to guide PNL access. Alternative surgical techniques and approaches to reducing radiation exposure, including retrograde intra-renal surgery, retrograde nephrostomy, endoscopic-guided PNL, and minimally invasive PNL, are also highlighted. It is important for urologists to be aware of these concepts and techniques when treating stone patients with PNL. The discussions outlined will assist urologists in providing patient counseling and high quality of care.

  8. Minimal flow units for magnetohydrodynamic turbulence

    NASA Astrophysics Data System (ADS)

    Orlandi, P.

    2016-08-01

We present direct numerical simulations of two minimal flow units (MFUs) to investigate the differences between inviscid and viscous simulations, and the different behavior of the evolution for conducting fluids. In these circumstances the introduction of the Lorentz force in the momentum equation produces different scenarios. The Taylor–Green vortex, in the past, was an MFU widely considered for both conducting and non-conducting fluids. The simulations were performed by pseudo-spectral numerical methods; these are repeated here by using a finite-difference, second-order accurate, energy-conserving scheme for ν = 0. Having observed that this initial condition could be inefficient for capturing the eventual occurrence of a finite time singularity, a potentially more efficient MFU consisting of two interacting Lamb dipoles was considered. It was found that the two flows have a different time evolution in the vorticity-dominated stage. In this stage, turbulent structures of different size are generated leading to spectra, in the inviscid conditions, with a k^(-3) range. In real conditions the viscosity produces smaller scales characteristic of fully developed turbulence, with energy spectra with well defined exponential and inertial ranges. In non-conducting conditions the passive vector behaves as the vorticity. The evolution is different in conducting conditions. Although the time evolution is different, both flows lead to spectra in Kolmogorov units with the same shape at high and intermediate wave numbers.

  9. The minimal power spectrum: Higher order contributions

    NASA Technical Reports Server (NTRS)

    Fry, J. N.

    1994-01-01

It has been an accepted belief for some time that gravity induces a minimal tail P(k) ~ k^4 in the power spectrum as k → 0 for distributions with no initial power on large scales. In a recent numerical experiment with initial power confined to a restricted range in k, Shandarin and Melott (1990) found a k → 0 tail that at early stages of evolution behaves as k^4 and grows with time as a^4(t), where a(t) is the cosmological expansion factor, and at late times depends on scale as k^3 and grows with time as a^2(t). I compute analytically several contributions to the power spectrum of higher order than those included in earlier work, and I apply the results to the particular case of initial power restricted to a finite range of k. As expected, in the perturbative regime P(k) ~ a^4 k^4 from the first correction to linear perturbation theory is the dominant term as k → 0. Numerical investigations show that the higher order contributions go as k^4 also. However, perturbation theory alone cannot tell whether the P ~ a^2 k^3 result is 'nonperturbative' or a numerical artifact.

  10. Minimal detectable outliers as measures of reliability

    NASA Astrophysics Data System (ADS)

    Koch, Karl-Rudolf

    2015-05-01

    The concept of reliability was introduced into geodesy by Baarda (A testing procedure for use in geodetic networks. Publications on Geodesy, vol. 2. Netherlands Geodetic Commission, Delft, 1968). It gives a measure for the ability of a parameter estimation to detect outliers and leads in case of one outlier to the MDB, the minimal detectable bias or outlier. The MDB depends on the non-centrality parameter of the -distribution, as the variance factor of the linear model is assumed to be known, on the size of the outlier test of an individual observation which is set to 0.001 and on the power of the test which is generally chosen to be 0.80. Starting from an estimated variance factor, the -distribution is applied here. Furthermore, the size of the test of the individual observation is a function of the number of outliers to keep the size of the test of all observations constant, say 0.05. The power of the test is set to 0.80. The MDBs for multiple outliers are derived here under these assumptions. The method is applied to the reconstruction of a bell-shaped surface measured by a laser scanner. The MDBs are introduced as outliers for the alternative hypotheses of the outlier tests. A Monte Carlo method reveals that due to the way of introducing the outliers, the false null hypotheses cannot be rejected on the average with a power of 0.80 if the MDBs are not enlarged by a factor.
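
    The non-centrality parameter underlying the MDB follows directly from the chosen size and power of the test: with a size of 0.001 and a power of 0.80 one obtains Baarda's classical value λ0 ≈ 17.1 (δ0 ≈ 4.13). The following stdlib sketch uses the textbook MDB formula MDB = σ·δ0/√r with a redundancy number r; the numeric inputs to mdb() below are hypothetical:

```python
from math import sqrt
from statistics import NormalDist

z = NormalDist().inv_cdf

alpha0 = 0.001   # size of the outlier test for an individual observation
power = 0.80     # required power of the test

# non-centrality of the test statistic for a two-sided test
delta0 = z(1 - alpha0 / 2) + z(power)   # ~ 4.13
lambda0 = delta0 ** 2                   # Baarda's non-centrality, ~ 17.1

def mdb(sigma, r):
    # minimal detectable bias of an observation with standard deviation
    # sigma and redundancy number r (0 < r <= 1)
    return sigma * delta0 / sqrt(r)
```

    For example, an observation with σ = 5 mm and redundancy number r = 0.5 has an MDB of about 29 mm.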

  11. Wormholes minimally violating the null energy condition

    SciTech Connect

Bouhmadi-López, Mariam; Lobo, Francisco S N; Martín-Moruno, Prado

    2014-11-01

We consider novel wormhole solutions supported by a matter content that minimally violates the null energy condition. More specifically, we consider an equation of state in which the sum of the energy density and radial pressure is proportional to a constant with a value smaller than that of the inverse area characterising the system, i.e., the area of the wormhole mouth. This approach is motivated by a recently proposed cosmological event, denoted 'the little sibling of the big rip', where the Hubble rate and the scale factor blow up but the cosmic derivative of the Hubble rate does not [1]. By using the cut-and-paste approach, we match interior spherically symmetric wormhole solutions to an exterior Schwarzschild geometry, and analyse the stability of the thin-shell to linearized spherically symmetric perturbations around static solutions, by choosing suitable properties for the exotic material residing on the junction interface radius. Furthermore, we also consider an inhomogeneous generalization of the equation of state considered above and analyse the respective stability regions. In particular, we obtain a specific wormhole solution with an asymptotic behaviour corresponding to a global monopole.

  12. Cultural change and support of waste minimization

    SciTech Connect

    Boylan, M.S.

    1991-12-31

The process of bringing a subject like pollution prevention to top-of-mind awareness, where designing to prevent waste becomes part of business as usual, is called cultural change. With Department of Energy orders and management waste minimization commitment statements on file, the REAL work is just beginning at the Idaho National Engineering Laboratory (INEL): shaping the attitudes of 11,000+ employees. The difficulties of such a task are daunting. The 890 square mile INEL site and in-town support offices mean a huge diversity of employee jobs and waste streams, from cafeteria and auto maintenance wastes to high-level nuclear waste casks. INEL is pursuing a three-component cultural change strategy: training, publicity, and public outreach. To meet the intent of DOE orders, all INEL employees are slated to receive pollution prevention orientation training. More technical training is given to targeted groups like purchasing and design engineering. To keep newly learned pollution prevention concepts top-of-mind, extensive site-wide publicity is being developed and conducted, culminating in the April Pollution Prevention Awareness Week coinciding with Earth Day 1992. Finally, news of INEL pollution prevention successes is shared with the public to increase their overall environmental awareness and their knowledge of INEL activities. An important added benefit is the sense of pride the program instills in INEL employees to have their successes displayed so publicly.

  13. Flavor mixing democracy and minimal CP violation

    NASA Astrophysics Data System (ADS)

    Gerard, Jean-Marc; Xing, Zhi-zhong

    2012-06-01

We point out that there is a unique parametrization of quark flavor mixing in which every angle is close to the Cabibbo angle θC ≃ 13° with the CP-violating phase ϕq around 1°, implying that they might all be related to the strong hierarchy among quark masses. Applying the same parametrization to lepton flavor mixing, we find that all three mixing angles are comparably large (around π/4) and the Dirac CP-violating phase ϕl is also minimal as compared with its values in the other eight possible parametrizations. In this spirit, we propose a simple neutrino mixing ansatz which is equivalent to the tri-bimaximal flavor mixing pattern in the ϕl → 0 limit and predicts sin θ13 = (1/√2) sin(ϕl/2) for reactor antineutrino oscillations. Hence the Jarlskog invariant of leptonic CP violation Jl = (sin ϕl)/12 can reach a few percent if θ13 lies in the range 7° ⩽ θ13 ⩽ 10°.
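
    The closing claim can be checked numerically: inverting sin θ13 = (1/√2) sin(ϕl/2) for ϕl and inserting the result into Jl = (sin ϕl)/12 indeed gives a few percent across the quoted θ13 range. A small stdlib check using only the formulas stated in the abstract:

```python
from math import asin, degrees, radians, sin, sqrt

def jarlskog_from_theta13(theta13_deg):
    # invert sin(theta13) = (1/sqrt(2)) * sin(phi_l / 2) for phi_l,
    # then evaluate J_l = sin(phi_l) / 12
    phi_l = 2 * asin(sqrt(2) * sin(radians(theta13_deg)))
    return sin(phi_l) / 12, degrees(phi_l)

J7, _ = jarlskog_from_theta13(7)    # ~ 0.028
J10, _ = jarlskog_from_theta13(10)  # ~ 0.040
```

    So Jl runs from roughly 2.8% at θ13 = 7° to about 4.0% at θ13 = 10°, consistent with the "few percent" estimate.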

  14. Subjective loudness of "minimized" sonic boom waveforms.

    PubMed

    Niedzwiecki, A; Ribner, H S

    1978-12-01

For very long supersonic aircraft the "midfield" sonic boom signature may not have evolved fully into an N wave at ground level. Thus in current boom minimization techniques the shape of the aircraft may be tailored to optimize this midfield waveform for reduced subjective loudness. The present investigation tests a family of "flat-top" waveforms cited by Darden: all but one have a front shock height (ΔpSH) less than the peak amplitude (ΔpMAX). For equal subjective loudness, "flat-top" vs N wave (peak overpressure ΔpN), the peak amplitude of the "flat-top" signature was found to be substantially higher than that of the N wave; thus for equal peak amplitude the "flat-top" signature was quieter. The results for equal loudness were well fitted by an empirical law ΔpSH + 0.11 ΔpMAX = ΔpN; the equivalence shows how the front shock amplitude (ΔpSH) dominates the loudness. All this was found compatible with predictions by the method of Johnson and Robinson. PMID:739097
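
    The empirical law can be rearranged to make the trade-off explicit: writing the flat-top shape as a shock-height ratio r = ΔpSH/ΔpMAX, the flat-top peak that is equally loud as an N wave of overpressure ΔpN is ΔpMAX = ΔpN/(r + 0.11). A small illustrative sketch (the ratio value chosen here is hypothetical):

```python
def flattop_peak(dp_n, shock_ratio):
    # peak amplitude of a flat-top signature judged equally loud as an
    # N wave of peak overpressure dp_n, from dpSH + 0.11*dpMAX = dpN
    # with dpSH = shock_ratio * dpMAX
    return dp_n / (shock_ratio + 0.11)

# a flat-top wave whose front shock is half its peak can be ~64% larger in
# peak amplitude than an equally loud N wave
peak = flattop_peak(1.0, 0.5)   # ~ 1.64
```

    This is consistent with the abstract's finding that, at equal loudness, the flat-top peak exceeds the N-wave peak, i.e. at equal peak amplitude the flat-top signature is quieter.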

  16. Hibernation and daily torpor minimize mammalian extinctions

    NASA Astrophysics Data System (ADS)

    Geiser, Fritz; Turbill, Christopher

    2009-10-01

    Small mammals appear to be less vulnerable to extinction than large species, but the underlying reasons are poorly understood. Here, we provide evidence that almost all (93.5%) of 61 recently extinct mammal species were homeothermic, maintaining a constant high body temperature and thus energy expenditure, which demands a high intake of food, long foraging times, and thus exposure to predators. In contrast, only 6.5% of extinct mammals were likely heterothermic and employed multi-day torpor (hibernation) or daily torpor, even though torpor is widespread within more than half of all mammalian orders. Torpor is characterized by substantial reductions of body temperature and energy expenditure and enhances survival during adverse conditions by minimizing food and water requirements, and consequently reduces foraging requirements and exposure to predators. Moreover, because life span is generally longer in heterothermic mammals than in related homeotherms, heterotherms can employ a ‘sit-and-wait’ strategy to withstand adverse periods and then repopulate when circumstances improve. Thus, torpor is a crucial but hitherto unappreciated attribute of small mammals for avoiding extinction. Many opportunistic heterothermic species, because of their plastic energetic requirements, may also stand a better chance of future survival than homeothermic species in the face of greater climatic extremes and changes in environmental conditions caused by global warming.

  17. Minimal model for tag-based cooperation

    NASA Astrophysics Data System (ADS)

    Traulsen, Arne; Schuster, Heinz Georg

    2003-10-01

    Recently, Riolo et al. [Nature (London) 414, 441 (2001)] showed by computer simulations that cooperation can arise without reciprocity when agents donate only to partners who are sufficiently similar to themselves. One striking outcome of their simulations was the observation that the number of tolerant agents that support a wide range of players was not constant in time, but showed characteristic fluctuations. The cause and robustness of these tides of tolerance remained to be explored. Here we clarify the situation by solving a minimal version of the model of Riolo et al. It allows us to identify a net surplus of random changes from intolerant to tolerant agents as a necessary mechanism that produces these oscillations of tolerance, which segregate different agents in time. This provides a new mechanism for maintaining different agents, i.e., for creating biodiversity. In our model the transition to the oscillating state is caused by a saddle node bifurcation. The frequency of the oscillations increases linearly with the transition rate from tolerant to intolerant agents.

  18. Disk Acceleration Experiment Utilizing Minimal Material (DAXUMM)

    NASA Astrophysics Data System (ADS)

    Biss, Matthew; Lorenz, Thomas; Sutherland, Gerrit

    2015-06-01

    A venture between the US Army Research Laboratory (ARL) and Lawrence Livermore National Laboratory (LLNL) is currently underway in an effort to characterize novel energetic material performance properties using a single, high-precision, gram-range charge. A nearly all-inclusive characterization experiment is proposed by combining LLNL's disk acceleration experiment (DAX) with the ARL explosive evaluation utilizing minimal material (AXEUMM) experiment. Spherical-cap charges fitted with a flat circular metal disk are centrally initiated using an exploding bridgewire detonator, while photonic Doppler velocimetry is used to probe the metal disk surface velocity and measure its temporal history. The metal disk's jump-off-velocity measurement is combined with conservation equations, material Hugoniots, and select empirical relationships to determine performance properties of the detonation wave (i.e., velocity, pressure, particle velocity, and density). Using the temporal velocity history with the numerical hydrocode CTH, a determination of the energetic material's equation of state and material expansion energy is possible. Initial experimental and computational results for the plastic-bonded energetic formulation PBXN-5 are presented.

  19. Minimal realistic SU(5) Grand Unified Theory

    NASA Astrophysics Data System (ADS)

    Assad, Nima

    2016-03-01

    Despite the Standard Model (SM) of particle physics making predictions in unprecedented agreement with experiment, such as the magnetic dipole moment of the electron to one part in a billion, the experimental confirmation of neutrino flavor oscillations, and thus of massive neutrinos, implies that the SM is incomplete. An extension of the SM, which retains its low-energy predictions while accounting for massive neutrinos, is achieved through the introduction of the dimension-5 Weinberg operator and its associated energy scale above the electroweak scale (10^2 GeV) but below the Planck scale (10^19 GeV). The Beyond Standard Model (BSM) class of Grand Unified Theories (GUTs) implicates such a scale (10^16 GeV) in the unification of the three SM gauge couplings, thus making the origin of neutrino mass a theoretically appealing probe into particle behavior at energies currently inaccessible to experiment. Here, we compare the 24F and 15H extensions of the Georgi-Glashow SU(5) GUT, which accommodate massive neutrinos and unify the SM gauge couplings while minimizing the theory's additional field content. Using the Monte Carlo event generator MadGraph, each extension is found to produce distinct signatures at Run II of the LHC.

  20. Minimizing radiation exposure during percutaneous nephrolithotomy.

    PubMed

    Chen, T T; Preminger, G M; Lipkin, M E

    2015-12-01

    Given the recent trend of growing per capita radiation dose from medical sources, there have been increasing concerns over patient radiation exposure. Patients with kidney stones undergoing percutaneous nephrolithotomy (PNL) are at particular risk of high radiation exposure. Several risk factors increase radiation exposure during PNL, including high body mass index, multiple access tracts, and increased stone burden. We herein review recent trends in radiation exposure, radiation exposure during PNL to both patients and urologists, and various approaches to reduce radiation exposure. We discuss incorporating the principles of As Low As Reasonably Achievable (ALARA) into clinical practice and review imaging techniques such as ultrasound and air contrast to guide PNL access. Alternative surgical techniques and approaches to reducing radiation exposure, including retrograde intrarenal surgery, retrograde nephrostomy, endoscopic-guided PNL, and minimally invasive PNL, are also highlighted. It is important for urologists to be aware of these concepts and techniques when treating stone patients with PNL. The discussions outlined will assist urologists in providing patient counseling and a high quality of care. PMID:26354615

  1. Minimally invasive mitral surgery: dangerous to dabble.

    PubMed

    Edwards, James; Mazzone, Annette; Crouch, Gareth

    2012-03-01

    The introduction of any new surgical technique is fraught with dangers and difficulties, and in cardiac surgery these potential negative outcomes are magnified by inherently small margins for error. Buxton's law states that it is always too early for rigorous evaluation (of a new technique) until, unfortunately, it is suddenly too late (1). This insightful statement was used to describe the phenomenon too often seen in the introduction of new technologies or procedures in medicine. There is a natural reluctance to subject new techniques to standardized assessment too early in the introductory phase, in an attempt to avoid negatively biased results while operator learning is still occurring (2). Over the last two or three decades, this phenomenon has been described as the learning curve and has most often been applied to minimally invasive surgery across specialties, including general surgery, gynecology, and cardiothoracic surgery. Buxton's concern was justified, because by the time a procedure has become well practiced, there is a reluctance to subject it to rigorous trials, on the argument that this would deny the latest, and perhaps greatest, treatment to patients. Whereas each argument, pre-emptive assessment or delaying access, is valid in isolation, the combination is a dangerous system to follow because it prevents rigorous evaluation and denies best practice.

  2. Bacterial Stressors in Minimally Processed Food

    PubMed Central

    Capozzi, Vittorio; Fiocco, Daniela; Amodio, Maria Luisa; Gallone, Anna; Spano, Giuseppe

    2009-01-01

    Stress responses are of particular importance to microorganisms, because their habitats are subjected to continual changes in temperature, osmotic pressure, and nutrient availability. Stressors (or stress factors) may be of a chemical, physical, or biological nature. While stress to microorganisms is frequently caused by the surrounding environment, the growth of microbial cells on its own may also induce some kinds of stress, such as starvation and acidity. During production of fresh-cut produce, cumulative mild processing steps are employed to control the growth of microorganisms. Pathogens on plant surfaces are already stressed, and stress may be increased during the multiple mild processing steps, potentially leading to very hardy bacteria geared towards enhanced survival. Cross-protection can occur because overlapping stress responses enable bacteria exposed to one stress to become resistant to another stress. A number of stresses have been shown to induce cross-protection, including heat, cold, acid, and osmotic stress. Among other factors, adaptation to heat stress appears to provide bacterial cells with more pronounced cross-protection against several other stresses. Understanding how pathogens sense and respond to mild stresses is essential in order to design safe and effective minimal processing regimes. PMID:19742126

  3. Minimal genetic device with multiple tunable functions.

    PubMed

    Bagh, Sangram; Mandal, Mahuya; McMillen, David R

    2010-08-01

    The ability to design artificial genetic devices with predictable functions is critical to the development of synthetic biology. Given the highly variable requirements of biological designs, the ability to tune the behavior of a genetic device is also of key importance; such tuning will allow devices to be matched with other components into larger systems, and to be shifted into the correct parameter regimes to elicit desired behaviors. Here, we have developed a minimal synthetic genetic system that acts as a multifunction, tunable biodevice in the bacterium Escherichia coli. First, it acts as a biochemical AND gate, sensing the extracellular small molecules isopropyl β-D-1-thiogalactopyranoside and anhydrotetracycline as two input signals and expressing enhanced green fluorescent protein as an output signal. Next, the output signal of the AND gate can be amplified by the application of another extracellular chemical, arabinose. Further, the system can generate a wide range of chemically tunable single input-output response curves, without any genetic alteration of the circuit, by varying the concentrations of a set of extracellular small molecules. We have developed and parameterized a simple transfer function model for the system, and shown that the model successfully explains and predicts the quantitative relationships between input and output signals in the system.
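
The two-input AND-gate behavior described in this record can be sketched with a simple transfer-function model. The Hill-function form and every parameter value below are illustrative assumptions, not the parameterization from the paper:

```python
def hill(x, k, n):
    """Hill activation function: rises from 0 toward 1 as x exceeds k."""
    return x**n / (k**n + x**n)

def and_gate_output(iptg, atc, k_iptg=50.0, k_atc=20.0, n=2.0, v_max=1.0):
    """Illustrative two-input AND gate: the (GFP-like) output is high only
    when both inducer concentrations are high. All thresholds, the Hill
    coefficient, and v_max are hypothetical placeholder values."""
    return v_max * hill(iptg, k_iptg, n) * hill(atc, k_atc, n)

# Output stays near zero unless both inputs are present together.
only_atc = and_gate_output(0.0, 100.0)
both_on  = and_gate_output(500.0, 100.0)
```

The multiplicative structure is the standard way to encode AND logic in such models: either input near zero drives the product, and hence the output, to zero.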

  4. Linearized Functional Minimization for Inverse Modeling

    SciTech Connect

    Wohlberg, Brendt; Tartakovsky, Daniel M.; Dentz, Marco

    2012-06-21

    Heterogeneous aquifers typically consist of multiple lithofacies, whose spatial arrangement significantly affects flow and transport. The estimation of these lithofacies is complicated by the scarcity of data and by the lack of a clear correlation between identifiable geologic indicators and attributes. We introduce a new inverse-modeling approach to estimate both the spatial extent of hydrofacies and their properties from sparse measurements of hydraulic conductivity and hydraulic head. Our approach is to minimize a functional defined on the vectors of values of hydraulic conductivity and hydraulic head fields defined on regular grids at a user-determined resolution. This functional is constructed to (i) enforce the relationship between conductivity and heads provided by the groundwater flow equation, (ii) penalize deviations of the reconstructed fields from measurements where they are available, and (iii) penalize reconstructed fields that are not piece-wise smooth. We develop an iterative solver for this functional that exploits a local linearization of the mapping from conductivity to head. This approach provides a computationally efficient algorithm that rapidly converges to a solution. A series of numerical experiments demonstrates the robustness of our approach.
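
An iterative scheme of the kind this record describes, local linearization of the forward map inside a regularized least-squares functional, can be sketched generically. The toy exponential forward model and the Tikhonov damping term below are illustrative stand-ins, not the paper's groundwater-flow formulation or its piecewise-smoothness penalty:

```python
import numpy as np

def gauss_newton(forward, jacobian, d_obs, m0, alpha=1e-2, iters=20):
    """Iteratively linearize a nonlinear forward map around the current
    estimate and solve a damped least-squares update for the parameters."""
    m = m0.copy()
    for _ in range(iters):
        r = forward(m) - d_obs                 # misfit to the measurements
        J = jacobian(m)                        # local linearization of the map
        # Solve (J^T J + alpha*I) dm = -J^T r : Tikhonov-damped normal equations
        A = J.T @ J + alpha * np.eye(m.size)
        dm = np.linalg.solve(A, -J.T @ r)
        m = m + dm
    return m

# Toy diagonal forward map d_i = exp(m_i), recovered from exact data.
forward = lambda m: np.exp(m)
jacobian = lambda m: np.diag(np.exp(m))
m_true = np.array([0.5, -0.3, 1.0])
m_est = gauss_newton(forward, jacobian, np.exp(m_true), np.zeros(3))
```

Because the damping only slows the step without shifting the fixed point, the iteration converges to the exact parameters here; on sparse, noisy field data the regularization terms described in the abstract do the real work.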

  5. Linear functional minimization for inverse modeling

    DOE PAGES

    Barajas-Solano, David A.; Wohlberg, Brendt Egon; Vesselinov, Velimir Valentinov; Tartakovsky, Daniel M.

    2015-06-01

    In this paper, we present a novel inverse modeling strategy to estimate spatially distributed parameters of nonlinear models. The maximum a posteriori (MAP) estimators of these parameters are based on a likelihood functional, which contains spatially discrete measurements of the system parameters and spatiotemporally discrete measurements of the transient system states. The piecewise continuity prior for the parameters is expressed via Total Variation (TV) regularization. The MAP estimator is computed by minimizing a nonquadratic objective equipped with the TV operator. We apply this inversion algorithm to estimate the hydraulic conductivity of a synthetic confined aquifer from measurements of conductivity and hydraulic head. The synthetic conductivity field is composed of a low-conductivity heterogeneous intrusion into a high-conductivity heterogeneous medium. Our algorithm accurately reconstructs the location, orientation, and extent of the intrusion from the steady-state data only. Finally, the addition of transient measurements of hydraulic head improves the parameter estimation, accurately reconstructing the conductivity field in the vicinity of observation locations.

  6. Minimal change glomerulopathy in a cat.

    PubMed

    Backlund, Brianna; Cianciolo, Rachel E; Cook, Audrey K; Clubb, Fred J; Lees, George E

    2011-04-01

    A 6-year-old castrated male domestic shorthair cat was evaluated for sudden onset of vomiting and anorexia. A diagnosis of hypereosinophilic syndrome (HES) was made, and the cat was treated with imatinib mesylate. The cat showed initial clinical improvement with normalization of the peripheral eosinophil count. After approximately 8 weeks of treatment, lethargy and anorexia recurred despite the normal eosinophil count, and a significant proteinuric nephropathy was identified. Treatment with imatinib was discontinued. Ultrasound-guided renal biopsies exhibited histologic, ultrastructural, and immunostaining changes indicative of a minimal change glomerulopathy (MCG), which has not previously been reported in a cat. The proteinuria and HES initially improved while the cat was treated with more traditional medications; however, both problems persisted throughout the 30 months of subsequent follow-up. Previous studies demonstrating the safety and efficacy of imatinib in cats do not report any glomerular injury or significant adverse drug reactions, and the exact cause of this cat's proteinuric nephropathy is uncertain. Nonetheless, the possibility of an adverse drug reaction causing proteinuria should be considered when initiating treatment with imatinib in a cat. PMID:21414552

  7. [Minimally invasive adrenalectomy: transperitoneal vs. retroperitoneal approach].

    PubMed

    Ramacciato, Giovanni; Nigri, Giuseppe; Di Santo, Vincenzo; Piccoli, Michaela; Pansadoro, Vito; Buniva, Paolo; Bellagamba, Riccardo; Cescon, Matteo; Ercolani, Giorgio; Cucchetti, Alessandro; Lauro, Augusto; Del Gaudio, Massimo; Ravaioli, Matteo; Valabrega, Stefano; D'Angelo, Francesco; Aurello, Paolo; Stigliano, Antonio; Toscano, Vincenzo; Melotti, Gianluigi

    2008-01-01

    Laparoscopic adrenalectomy is now regarded as the procedure of choice for most adrenal pathology requiring surgery. The primary adrenal-specific contraindication to laparoscopic adrenalectomy today is the presence of a large adrenal mass with evidence of local infiltration or venous invasion. We used our multicentre experience to compare the transperitoneal (TLA) and retroperitoneal (RLA) minimally invasive approaches. In our study we found statistically significant differences between RLA and TLA in terms of duration of surgery (148 min vs. 112 min; p < 0.005), intra-operative blood loss (439 cc vs. 333 cc; p < 0.005) and time to first oral intake (1.2 +/- 0.5 days vs. 1.8 +/- 1.08 days; p < 0.005). The RLA approach is preferable in cases of previous abdominal surgery, but its learning curve is extremely steep. TLA access requires a less demanding learning curve and tends to be faster than RLA, where the working area is penalised by limited manoeuvring space. There is no clear preference between TLA and RLA in the literature. However, the experience of the surgeon still remains the most important variable when choosing between the two approaches.

  8. Process optimized minimally invasive total hip replacement.

    PubMed

    Gebel, Philipp; Oszwald, Markus; Ishaque, Bernd; Ahmed, Gaffar; Blessing, Recha; Thorey, Fritz; Ottersbach, Andreas

    2012-01-01

    The purpose of this study was to analyse a new concept of using the minimally invasive direct anterior approach (DAA) in total hip replacement (THR) in combination with a leg positioner (Rotex-Table) and a modified retractor system (Condor). We retrospectively evaluated the first 100 primary THRs operated with the new concept between 2009 and 2010, regarding operative data and radiological and clinical outcome (HOOS). All surgeries were performed with a standardized operative technique including navigation. The average age of the patients was 68 years (37 to 92 years), with a mean BMI of 26.5 (17 to 43). The mean time of surgery was 80 min (55 to 130 min). The blood loss showed an average of 511.5 mL (200 to 1000 mL). No intra-operative complications occurred. The postoperative complication rate was 6%. The HOOS increased from 43 points pre-operatively to 90 (max. 100 points) 3 months after surgery. The radiological analysis showed an average cup inclination of 43° and a leg length discrepancy within +/- 5 mm in 99%. The presented technique led to excellent clinical results, showed low complication rates, and allowed correct implant positioning while saving manpower. PMID:22577504

  9. Minimizing or eliminating refueling of nuclear reactor

    DOEpatents

    Doncals, Richard A.; Paik, Nam-Chin; Andre, Sandra V.; Porter, Charles A.; Rathbun, Roy W.; Schwallie, Ambrose L.; Petras, Diane S.

    1989-01-01

    Demand for refueling of a liquid metal fast nuclear reactor having a life of 30 years is eliminated or reduced to intervals of at least 10 years by operating the reactor at a low linear power density, typically 2.5 kW/ft of fuel rod, rather than the 7.5 or 15 kW/ft of prior art practice. So that power of the same magnitude as for prior art reactors is produced, the volume of the core is increased. In addition, the height and diameter of the core are dimensioned so that the ratio of height to diameter approximates 1 to the extent practicable, considering the requirements of control and that the pressure drop in the coolant shall not be excessive. The surface area of a cylinder of given volume is a minimum when the ratio of height to diameter is 1; minimizing the surface area reduces the leakage of neutrons. By reducing the linear power density, increasing core volume, reducing fissile enrichment and optimizing core geometry, internal-core breeding of fissionable fuel is substantially enhanced. As a result, core operational life, limited by control worth requirements and fuel burnup capability, is extended up to 30 years of continuous power operation.
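
The geometric claim in this record, that a closed cylinder of fixed volume has minimal surface area when its height equals its diameter, is easy to verify numerically; a quick sketch (the unit volume and the aspect ratios sampled are arbitrary):

```python
import math

def cylinder_area(volume, ratio):
    """Total surface area of a closed cylinder of the given volume whose
    height-to-diameter ratio is `ratio` (i.e., h = ratio * d)."""
    # V = (pi/4) * d^2 * h = (pi/4) * ratio * d^3  =>  d = (4V / (pi*ratio))^(1/3)
    d = (4.0 * volume / (math.pi * ratio)) ** (1.0 / 3.0)
    h = ratio * d
    return math.pi * d * h + 2.0 * math.pi * (d / 2.0) ** 2  # side + two ends

# Sweep the aspect ratio at unit volume: the area is smallest at h/d = 1.
areas = {r: cylinder_area(1.0, r) for r in (0.5, 0.75, 1.0, 1.5, 2.0)}
best = min(areas, key=areas.get)
```

The same result follows from calculus: at fixed volume, dA/dh = 0 gives h = d, the patent's rationale for the near-unity core aspect ratio.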

  10. A minimal model of neutrino flavor

    NASA Astrophysics Data System (ADS)

    Luhn, Christoph; Parattu, Krishna Mohan; Wingerter, Akın

    2012-12-01

    Models of neutrino mass which attempt to describe the observed lepton mixing pattern are typically based on discrete family symmetries with a non-Abelian factor and one or more Abelian factors. The latter, so-called shaping symmetries, are imposed in order to yield a realistic phenomenology by forbidding unwanted operators. Here we propose a supersymmetric model of neutrino flavor which is based on the group T7 and does not require extra Z_N or U(1) factors in the Yukawa sector, which makes it the smallest realistic family symmetry that has been considered so far. At leading order, the model predicts tribimaximal mixing, which arises completely accidentally from a combination of the T7 Clebsch-Gordan coefficients and suitable flavon alignments. Next-to-leading order (NLO) operators break the simple tribimaximal structure and render the model compatible with the recent results of the Daya Bay and RENO collaborations, which have measured a reactor angle of around 9°. Problematic NLO deviations of the other two mixing angles can be controlled in an ultraviolet completion of the model. The vacuum alignment mechanism that we use necessitates the introduction of a hidden flavon sector that transforms under a Z6 symmetry, thereby spoiling the minimality of our model, whose flavor symmetry is then T7 × Z6.

  11. Generalized scaling ansatz and minimal seesaw mechanism

    NASA Astrophysics Data System (ADS)

    Yasuè, Masaki

    2012-12-01

    Generalized scaling in the flavor neutrino masses Mij (i, j = e, μ, τ), expressed in terms of θSC and the atmospheric neutrino mixing angle θ23, is defined by Miτ/Miμ = -κi t23 (i = e, μ, τ) with κe = 1, κμ = B/A and κτ = 1/B, where t23 = tan θ23, A = cos 2θSC + sin 2θSC t23^4 and B = cos 2θSC - sin 2θSC t23^2. The generalized scaling ansatz predicts a vanishing reactor neutrino mixing angle, θ13 = 0. It is shown that the minimal seesaw mechanism naturally implements this scaling ansatz. There are textures satisfying the generalized scaling ansatz that yield vanishing baryon asymmetry of the Universe (BAU). Focusing on these textures, we discuss effects of θ13 ≠ 0 to evaluate a CP-violating Dirac phase δ and the BAU, and find that the BAU is approximately controlled by the factor sin 2θ13 sin(2δ - ϕ), where ϕ stands for the CP-violating Majorana phase, whose magnitude turns out to be at most 0.1.

  12. Minimal genetic device with multiple tunable functions

    NASA Astrophysics Data System (ADS)

    Bagh, Sangram; Mandal, Mahuya; McMillen, David R.

    2010-08-01

    The ability to design artificial genetic devices with predictable functions is critical to the development of synthetic biology. Given the highly variable requirements of biological designs, the ability to tune the behavior of a genetic device is also of key importance; such tuning will allow devices to be matched with other components into larger systems, and to be shifted into the correct parameter regimes to elicit desired behaviors. Here, we have developed a minimal synthetic genetic system that acts as a multifunction, tunable biodevice in the bacterium Escherichia coli. First, it acts as a biochemical AND gate, sensing the extracellular small molecules isopropyl β-D-1-thiogalactopyranoside and anhydrotetracycline as two input signals and expressing enhanced green fluorescent protein as an output signal. Next, the output signal of the AND gate can be amplified by the application of another extracellular chemical, arabinose. Further, the system can generate a wide range of chemically tunable single input-output response curves, without any genetic alteration of the circuit, by varying the concentrations of a set of extracellular small molecules. We have developed and parameterized a simple transfer function model for the system, and shown that the model successfully explains and predicts the quantitative relationships between input and output signals in the system.

  13. Emerging robotic platforms for minimally invasive surgery.

    PubMed

    Vitiello, Valentina; Lee, Su-Lin; Cundy, Thomas P; Yang, Guang-Zhong

    2013-01-01

    Recent technological advances in surgery have resulted in the development of a range of new techniques that have reduced patient trauma, shortened hospitalization, and improved diagnostic accuracy and therapeutic outcome. Despite the many appreciated benefits of minimally invasive surgery (MIS) compared to traditional approaches, there are still significant drawbacks associated with conventional MIS including poor instrument control and ergonomics caused by rigid instrumentation and its associated fulcrum effect. The use of robot assistance has helped to realize the full potential of MIS with improved consistency, safety and accuracy. The development of articulated, precision tools to enhance the surgeon's dexterity has evolved in parallel with advances in imaging and human-robot interaction. This has improved hand-eye coordination and manual precision down to micron scales, with the capability of navigating through complex anatomical pathways. In this review paper, clinical requirements and technical challenges related to the design of robotic platforms for flexible access surgery are discussed. Allied technical approaches and engineering challenges related to instrument design, intraoperative guidance, and intelligent human-robot interaction are reviewed. We also highlight emerging designs and research opportunities in the field by assessing the current limitations and open technical challenges for the wider clinical uptake of robotic platforms in MIS.

  14. New methodology in biomedical science: methodological errors in classical science.

    PubMed

    Skurvydas, Albertas

    2005-01-01

    The following methodological errors are observed in the biomedical sciences: paradigmatic errors; those of an exaggerated search for certainty; the dehumanisation of science; those of determinism and linearity; errors in drawing conclusions; errors of reductionism, of decomposing quality, or of exaggerated enlargement; errors connected with discarding odd, unexpected, or awkward facts; those of exaggerated mathematization; the isolation of science; the error of "common sense"; the error of the Ceteris Paribus ("other things being equal") law; "youth" and common sense; inflexibility of the criteria of truth; errors of restricting the sources of truth and the ways of searching for truth; the error connected with wisdom gained post factum; errors of misinterpreting the research mission; "laziness" to repeat an experiment; and errors of the coordination of errors. One of the basic aims for present-day scholars of biomedicine is, therefore, mastering a new non-linear, holistic, complex way of thinking that will, undoubtedly, enable one to make fewer errors when doing research. The aim of "scientific travelling" will be achieved with greater probability if the "travelling" itself is performed with great care. PMID:15687745

  15. On Ramsey (P3, P6)-minimal graphs

    NASA Astrophysics Data System (ADS)

    Rahmadani, Desi; Baskoro, Edy Tri; Assiyatun, Hilda

    2016-02-01

    Finding all Ramsey (G, H)-minimal graphs for a given pair of graphs G and H is an interesting and difficult problem, even for small graphs G and H. In this paper, we determine some Ramsey (P3, P6)-minimal graphs of small order. We also characterize all such Ramsey minimal graphs of order 6 by using their degree sequences. We prove that Ramsey (P3, P6)-minimal graphs have diameter at least two. We construct an infinite class of trees [6] which provide Ramsey (P3, P6)-minimal graphs.

  16. Minimal flavor violation in the minimal U(1)B-L model and resonant leptogenesis

    NASA Astrophysics Data System (ADS)

    Okada, Nobuchika; Orikasa, Yuta; Yamada, Toshifumi

    2012-10-01

    We investigate the resonant leptogenesis scenario in the minimal U(1)B-L extension of the standard model with minimal flavor violation. In our model, the U(1)B-L gauge symmetry is broken at the TeV scale and standard model singlet neutrinos gain Majorana masses of order TeV. In addition, we impose a flavor symmetry on the singlet neutrinos at a scale higher than a TeV. The flavor symmetry is explicitly broken by the neutrino Dirac Yukawa coupling, which induces splittings in the singlet neutrino Majorana masses at lower scales through renormalization group evolution. We call this setup minimal flavor violation. The mass splittings are proportional to the tiny Dirac Yukawa coupling, and hence they automatically enhance the CP asymmetry parameter necessary for the resonant leptogenesis mechanism. In this paper, we calculate the baryon number yield by solving the Boltzmann equations, including the effects of the U(1)B-L gauge boson, which also has a TeV-scale mass and causes wash-out of the singlet neutrinos in the course of thermal leptogenesis. The Dirac Yukawa coupling for neutrinos is fixed in terms of neutrino oscillation data and an arbitrary 3×3 complex-valued orthogonal matrix. We show that the right amount of baryon number asymmetry can be achieved through thermal leptogenesis in the context of minimal flavor violation with singlet neutrinos and a U(1)B-L gauge boson at the TeV scale. These particles can be discovered at the LHC in the near future.

  17. [Significance of Minimal Residual Disease in Chronic Lymphocytic Leukemia].

    PubMed

    Doubek, M

    2015-01-01

    Newly introduced, highly effective treatment options increase the importance of minimal residual disease measurement in chronic lymphocytic leukemia. Minimal residual disease is gaining interest mainly as a predictive marker; however, the clinical significance of minimal residual disease in chronic lymphocytic leukemia in many situations remains unresolved. Factors with a possible impact on the clinical significance of minimal residual disease include the technique for minimal residual disease quantification, the treatment regimen, peripheral blood vs. bone marrow analysis, and the time-point of sampling. Highly sensitive methods now available to evaluate minimal residual disease can detect a single chronic lymphocytic leukemia cell among 10^4-10^5 leukocytes, using either allele-specific oligonucleotide polymerase chain reaction or multicolor flow cytometry. Minimal residual disease quantification as a surrogate marker to assess treatment efficacy in routine hematological practice has to be further evaluated.

  18. Gadamerian philosophical hermeneutics as a useful methodological framework for the Delphi technique

    PubMed Central

    Guzys, Diana; Dickson-Swift, Virginia; Kenny, Amanda; Threlkeld, Guinever

    2015-01-01

    In this article we aim to demonstrate how Gadamerian philosophical hermeneutics may provide a sound methodological framework for researchers using the Delphi Technique (Delphi) in studies exploring health and well-being. Reporting of the use of Delphi in health and well-being research is increasing, but less attention has been given to covering its methodological underpinnings. In Delphi, a structured anonymous conversation between participants is facilitated, via an iterative survey process. Participants are specifically selected for their knowledge and experience with the topic of interest. The purpose of structuring conversation in this manner is to cultivate collective opinion and highlight areas of disagreement, using a process that minimizes the influence of group dynamics. The underlying premise is that the opinion of a collective is more useful than that of an individual. In designing our study into health literacy, Delphi aligned well with our research focus and would enable us to capture collective views. However, we were interested in the methodology that would inform our study. As researchers, we believe that methodology provides the framework and principles for a study and is integral to research integrity. In assessing the suitability of Delphi for our research purpose, we found little information about underpinning methodology. The absence of a universally recognized or consistent methodology associated with Delphi was highlighted through a scoping review we undertook to assist us in our methodological thinking. This led us to consider alternative methodologies, which might be congruent with the key principles of Delphi. We identified Gadamerian philosophical hermeneutics as a methodology that could provide a supportive framework and principles. We suggest that this methodology may be useful in health and well-being studies utilizing the Delphi method. PMID:25948132

  19. Null-polygonal minimal surfaces in AdS4 from perturbed W minimal models

    NASA Astrophysics Data System (ADS)

    Hatsuda, Yasuyuki; Ito, Katsushi; Satoh, Yuji

    2013-02-01

    We study the null-polygonal minimal surfaces in AdS4, which correspond to the gluon scattering amplitudes/Wilson loops in N = 4 super Yang-Mills theory at strong coupling. The area of the minimal surfaces with n cusps is characterized by the thermodynamic Bethe ansatz (TBA) integral equations or the Y-system of the homogeneous sine-Gordon model, which is regarded as the SU(n-4)_4/U(1)^(n-5) generalized parafermion theory perturbed by the weight-zero adjoint operators. Based on the relation to the TBA systems of the perturbed W minimal models, we solve the TBA equations by using the conformal perturbation theory, and obtain the analytic expansion of the remainder function around the UV/regular-polygonal limit for n = 6 and 7. We compare the rescaled remainder function for n = 6 with the two-loop one, to observe that they are close to each other similarly to the AdS3 case.

  20. Software Replica of Minimal Living Processes

    NASA Astrophysics Data System (ADS)

    Bersini, Hugues

    2010-04-01

    There is a long tradition of software simulation in theoretical biology to complement pure analytical mathematics, which is often too limited to reproduce and understand the self-organization phenomena resulting from the non-linear and spatially grounded interactions of the huge number of diverse biological objects. Since John von Neumann's and Alan Turing's pioneering works on self-replication and morphogenesis, proponents of artificial life have chosen to resolutely neglect much of the materialistic and quantitative information deemed not indispensable and have focused on the rule-based mechanisms making life possible, supposedly neutral with respect to their underlying material embodiment. Minimal life begins at the intersection of a series of processes which need to be isolated, differentiated and duplicated as such in computers. Only the development and running of software make it possible to understand the way these processes are intimately interconnected in order for life to appear at the crossroad. In this paper, I will attempt to set out the history of life as the disciples of artificial life understand it, by placing these different lessons on a temporal and causal axis, showing which one is indispensable to the appearance of the next and how it connects to the next. I will discuss the task of artificial life as setting up experimental software platforms where these different lessons, whether taken in isolation or together, are tested, simulated and, more systematically, analyzed. I will sketch some of these existing software platforms: chemical reaction networks, Varela’s autopoietic cellular automata, and Ganti’s chemoton model, whose running delivers interesting take-home messages to open-minded biologists.

  1. Minimal SUSY SO(10) and Yukawa unification

    SciTech Connect

    Okada, Nobuchika

    2013-05-23

    The minimal supersymmetric (SUSY) SO(10) model, where only two Higgs multiplets {10 ⊕ 126-bar} are utilized for Yukawa couplings with matter fields, can nicely fit the neutrino oscillation parameters as well as charged fermion masses and mixing angles. In the fitting of the fermion mass matrix data, the largest element in the Yukawa coupling with the 126-bar-plet Higgs (Y^126) is found to be of order one, so that the right see-saw scale should be provided by Higgs vacuum expectation values (VEVs) of O(10^14 GeV). This fact causes a serious problem, namely, the gauge coupling unification is spoiled because of the presence of many exotic Higgs multiplets emerging at the see-saw scale. In order to solve this problem, we consider a unification between bottom-quark and tau Yukawa couplings (b-τ Yukawa coupling unification) at the grand unified theory (GUT) scale, due to threshold corrections of superpartners to the Yukawa couplings at the 1 TeV scale. When the b-τ Yukawa coupling unification is very accurate, the largest element in Y^126 can become O(0.01), so that the right see-saw scale is realized by the GUT-scale VEV and the usual gauge coupling unification is maintained. Since the b-τ Yukawa unification alters the Yukawa coupling data at the GUT scale, we re-analyze the fitting of the fermion mass matrix data by taking all the relevant free parameters into account. Unfortunately, we find that no parameter region gives a nice fit for the current neutrino oscillation data, and therefore the usual picture of the gauge coupling unification cannot accommodate the fermion mass matrix data fitting in our procedure.

  2. An H-infinity norm minimization approach

    NASA Astrophysics Data System (ADS)

    Muse, Jonathan A.

    This dissertation seeks to merge ideas from robust control theory such as H-infinity control design and the Small Gain Theorem, L stability theory and Lyapunov stability from nonlinear control, and recent theoretical achievements in adaptive control. The fusion of frequency-domain and linear time-domain ideas allows the derivation of an H-infinity Norm Minimization Approach (H-infinity-NMA) for adaptive control architecture that permits a control designer to simplify the adaptive tuning process and tune the uncertainty compensation characteristics via linear control design techniques, band-limit the adaptive control signal, efficiently handle redundant actuators, and handle unmatched and matched uncertainty in a single design framework. The two-stage design framework is similar to that used in robust control, but without sacrificing performance. The first stage of the design considers an ideal system with the system uncertainty completely known. For this system, a control law is designed using linear H-infinity theory. Then, in the second stage, an adaptive process is implemented that emulates the behavior of the ideal system. If the linear H-infinity design is applied to control the emulated system, it then guarantees closed-loop system stability of the actual system. All of this is accomplished while providing notions of transient performance bounds between the ideal system and the true system. Extensions to the theory include architectures for a class of output feedback systems, limiting the authority of an adaptive control system, and a method for improving the performance of an adaptive system with slow dynamics without any modification terms. Applications focus on using aerodynamic flow control for aircraft flight control and the Crew Launch Vehicle.
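
    The small-gain reasoning invoked in this abstract can be illustrated numerically. The sketch below is a stand-alone toy, not the dissertation's design: the first-order plants, gains, and the helper name `hinf_norm` are invented for illustration. For G(s) = k/(s + a) the H-infinity norm is the peak of |G(jw)| over frequency (here k/a, attained at w = 0), and the feedback interconnection of two stable systems is guaranteed stable when the product of their H-infinity norms is below one.

```python
# Small Gain Theorem sketch with invented first-order plants (not the
# dissertation's systems).  For G(s) = k/(s + a), the H-infinity norm is
# sup_w |G(jw)| = k/a, attained at w = 0; the feedback interconnection of
# two stable systems is guaranteed stable when the product of their
# H-infinity norms is below one.

def hinf_norm(k, a, n=20000, w_max=100.0):
    """Numerically estimate ||k/(s + a)||_inf by sweeping s = jw."""
    return max(abs(k / (1j * (w_max * i / n) + a)) for i in range(n + 1))

g1 = hinf_norm(k=0.5, a=1.0)   # analytic value: 0.5
g2 = hinf_norm(k=1.2, a=2.0)   # analytic value: 0.6
print(g1 * g2 < 1.0)           # small-gain condition holds for this pair
```

Note the frequency sweep includes w = 0, so for these low-pass plants the numerical estimate matches the analytic value exactly.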

  3. Software replica of minimal living processes.

    PubMed

    Bersini, Hugues

    2010-04-01

    There is a long tradition of software simulation in theoretical biology to complement pure analytical mathematics, which is often too limited to reproduce and understand the self-organization phenomena resulting from the non-linear and spatially grounded interactions of the huge number of diverse biological objects. Since John von Neumann's and Alan Turing's pioneering works on self-replication and morphogenesis, proponents of artificial life have chosen to resolutely neglect much of the materialistic and quantitative information deemed not indispensable and have focused on the rule-based mechanisms making life possible, supposedly neutral with respect to their underlying material embodiment. Minimal life begins at the intersection of a series of processes which need to be isolated, differentiated and duplicated as such in computers. Only the development and running of software make it possible to understand the way these processes are intimately interconnected in order for life to appear at the crossroad. In this paper, I will attempt to set out the history of life as the disciples of artificial life understand it, by placing these different lessons on a temporal and causal axis, showing which one is indispensable to the appearance of the next and how it connects to the next. I will discuss the task of artificial life as setting up experimental software platforms where these different lessons, whether taken in isolation or together, are tested, simulated and, more systematically, analyzed. I will sketch some of these existing software platforms: chemical reaction networks, Varela's autopoietic cellular automata, and Ganti's chemoton model, whose running delivers interesting take-home messages to open-minded biologists.

  4. Logarithmic minimal models with Robin boundary conditions

    NASA Astrophysics Data System (ADS)

    Bourgine, Jean-Emile; Pearce, Paul A.; Tartaglia, Elena

    2016-06-01

    We consider general logarithmic minimal models LM(p, p′), with p, p′ coprime, on a strip of N columns with the (r, s) Robin boundary conditions introduced by Pearce, Rasmussen and Tipunin. On the lattice, these models are Yang-Baxter integrable loop models that are described algebraically by the one-boundary Temperley-Lieb algebra. The (r, s) Robin boundary conditions are a class of integrable boundary conditions satisfying the boundary Yang-Baxter equations which allow loop segments to either reflect or terminate on the boundary. The associated conformal boundary conditions are organized into infinitely extended Kac tables labelled by the Kac labels r ∈ ℤ and s ∈ ℕ. The Robin vacuum boundary condition, labelled by (r, s - 1/2) = (0, 1/2), is given as a linear combination of Neumann and Dirichlet boundary conditions. The general (r, s) Robin boundary conditions are constructed, using fusion, by acting on the Robin vacuum boundary with an (r, s)-type seam consisting of an r-type seam of width w columns and an s-type seam of width d = s - 1 columns. The r-type seam admits an arbitrary boundary field which we fix to the special value ξ = -λ/2, where λ = (p′ - p)π/p′ is the crossing parameter. The s-type boundary introduces d defects into the bulk. We consider the commuting double-row transfer matrices and their associated quantum Hamiltonians and calculate analytically the boundary free energies of the (r, s) Robin boundary conditions. Using finite-size corrections and sequence extrapolation out to system sizes N + w + d ≤ 26, the conformal spectrum of boundary operators is accessible by numerical diagonalization of the Hamiltonians. Fixing the parity of N for r

  5. Software replica of minimal living processes.

    PubMed

    Bersini, Hugues

    2010-04-01

    There is a long tradition of software simulation in theoretical biology to complement pure analytical mathematics, which is often too limited to reproduce and understand the self-organization phenomena resulting from the non-linear and spatially grounded interactions of the huge number of diverse biological objects. Since John von Neumann's and Alan Turing's pioneering works on self-replication and morphogenesis, proponents of artificial life have chosen to resolutely neglect much of the materialistic and quantitative information deemed not indispensable and have focused on the rule-based mechanisms making life possible, supposedly neutral with respect to their underlying material embodiment. Minimal life begins at the intersection of a series of processes which need to be isolated, differentiated and duplicated as such in computers. Only the development and running of software make it possible to understand the way these processes are intimately interconnected in order for life to appear at the crossroad. In this paper, I will attempt to set out the history of life as the disciples of artificial life understand it, by placing these different lessons on a temporal and causal axis, showing which one is indispensable to the appearance of the next and how it connects to the next. I will discuss the task of artificial life as setting up experimental software platforms where these different lessons, whether taken in isolation or together, are tested, simulated and, more systematically, analyzed. I will sketch some of these existing software platforms: chemical reaction networks, Varela's autopoietic cellular automata, and Ganti's chemoton model, whose running delivers interesting take-home messages to open-minded biologists. PMID:20204519

  6. Phenomenology in minimal theory of massive gravity

    NASA Astrophysics Data System (ADS)

    De Felice, Antonio; Mukohyama, Shinji

    2016-04-01

    We investigate the recently introduced minimal theory of massive gravity (MTMG). After reviewing the original construction based on its Hamiltonian in the vielbein formalism, we reformulate it in terms of its Lagrangian in both the vielbein and the metric formalisms. It then becomes obvious that, unlike previous attempts in the literature of Lorentz-violating massive gravity, not only the potential but also the kinetic structure of the action is modified from the de Rham-Gabadadze-Tolley (dRGT) massive gravity theory. We confirm that the number of physical degrees of freedom in MTMG is two at the fully nonlinear level. This proves the absence of various possible pathologies such as superluminality, acausality and strong coupling. Afterwards, we discuss the phenomenology of MTMG in the presence of a dust fluid. We find that on a flat homogeneous and isotropic background we have two branches. One of them (self-accelerating branch) naturally leads to acceleration without the genuine cosmological constant or dark energy. For this branch both the scalar and the vector modes behave exactly as in general relativity (GR). The phenomenology of this branch differs from GR in the tensor modes sector, as the tensor modes acquire a non-zero mass. Hence, MTMG serves as a stable nonlinear completion of the self-accelerating cosmological solution found originally in dRGT theory. The other branch (normal branch) has a dynamics which depends on the time-dependent fiducial metric. For the normal branch, the scalar mode sector, even though as in GR only one scalar mode is present (due to the dust fluid), differs from the one in GR, and, in general, structure formation will follow a different phenomenology. The tensor modes will be massive, whereas the vector modes, for both branches, will have the same phenomenology as in GR.

  7. [Minimally invasive approach in the pleural fluids].

    PubMed

    Sen, Serdar; Sentürk, Ekrem; Pabuşcu, Engin; Cokpinar, Salih; Yaman, Ertan

    2010-01-01

    The excess production or decreased absorption of pleural fluid is the major mechanism of pleural effusion formation. Primary lung pathologies or pathologies originating from other organs can cause pleural effusion. The search for a suitable, practical, and ideal treatment continues to the present day. We reviewed 94 patients with pleural effusion who were treated with a 10F catheter under local anesthesia in 2007-2008. Patients with dyspnea, massive effusion, or recurrent pleural effusion had a pleural catheter placed through the 7th or 8th intercostal space under local anesthesia. The mean age of the patients (58 male, 36 female) was 57.2 (26-94). The most common etiologic causes were primary bronchogenic carcinoma (34 cases, 36.1%), cardiac failure (11 cases, 11.1%), and empyema (eight cases, 9.5%). Fifty-three patients (56.3%) underwent pleurodesis because of treatment failure or recurrence. In 19 of these cases (20.2%), pleurodesis was successful. The pleurodesis agent was talc or tetracycline, according to the patient's pain threshold. The treatment methods for pleural effusion include thoracentesis, thoracoscopy, tube thoracostomy, and permanently tunneled catheters. Simple, small-diameter catheters are placed easily, with minimal morbidity and no mortality. They are used not only in malignant effusions but also in benign effusions. Finally, a simple catheter can be the first treatment choice in short-term therapy and an alternative choice in long-term therapy because of its ease of placement, effectiveness in pleurodesis, and cost-effectiveness. PMID:20517732

  8. VISION 21 SYSTEMS ANALYSIS METHODOLOGIES

    SciTech Connect

    G.S. Samuelsen; A. Rao; F. Robson; B. Washom

    2003-08-11

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into power plant systems that meet performance and emission goals of the Vision 21 program. The study efforts have narrowed down the myriad of fuel processing, power generation, and emission control technologies to selected scenarios that identify those combinations having the potential to achieve the Vision 21 program goals of high efficiency and minimized environmental impact while using fossil fuels. The technology levels considered are based on projected technical and manufacturing advances being made in industry and on advances identified in current and future government supported research. Included in these advanced systems are solid oxide fuel cells and advanced cycle gas turbines. The results of this investigation will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

  9. Methodological challenges in meditation research.

    PubMed

    Caspi, Opher; Burleson, Katharine O

    2005-01-01

    Like other complex, multifaceted interventions in medicine, meditation represents a mixture of specific and not-so-specific elements of therapy. However, meditation is somewhat unique in that it is difficult to standardize, quantify, and authenticate for a given sample of research subjects. Thus, it is often challenging to discern its specific effects in order to satisfy the scientific method of causal inferences that underlies evidence-based medicine. Therefore, it is important to consider the key methodological challenges that affect both the design and analysis of meditation research. The goal of this paper is to review those challenges and to offer some practical solutions. Among the challenges discussed are the mismatches between questions and designs, the variability in meditation types, problems associated with meditation implementation, individual differences across meditators, and the impossibility of double-blind, placebo-controlled meditation studies. Among the design solutions offered are aptitude x treatment interaction (ATI) research, mixed quantitative-qualitative methods, and practical (pragmatic) clinical trials. Similar issues and solutions can be applied more generally to the entire domain of mind-body therapies.

  10. Methodological challenges in meditation research.

    PubMed

    Caspi, Opher; Burleson, Katherine O

    2007-01-01

    Like other complex, multifaceted interventions in medicine, meditation represents a mixture of specific and not-so-specific elements of therapy. However, meditation is somewhat unique in that it is difficult to standardize, quantify, and authenticate for a given sample of research subjects. Thus, it is often challenging to discern its specific effects in order to satisfy the scientific method of causal inferences that underlies evidence-based medicine. Therefore, it is important to consider the key methodological challenges that affect both the design and analysis of meditation research. The goal of this paper is to review those challenges and to offer some practical solutions. Among the challenges discussed are the mismatches between questions and designs, the variability in meditation types, problems associated with meditation implementation, individual differences across meditators, and the impossibility of double-blind, placebo-controlled meditation studies. Among the design solutions offered are aptitude x treatment interaction (ATI) research, mixed quantitative-qualitative methods, and practical (pragmatic) clinical trials. Similar issues and solutions can be applied more generally to the entire domain of mind-body therapies.

  11. Waste Package Design Methodology Report

    SciTech Connect

    D.A. Brownson

    2001-09-28

    The objective of this report is to describe the analytical methods and processes used by the Waste Package Design Section to establish the integrity of the various waste package designs, the emplacement pallet, and the drip shield. The scope of this report shall be the methodology used in criticality, risk-informed, shielding, source term, structural, and thermal analyses. The basic features and appropriateness of the methods are illustrated, and the processes are defined whereby input values and assumptions flow through the application of those methods to obtain designs that ensure defense-in-depth as well as satisfy requirements on system performance. Such requirements include those imposed by federal regulation, from both the U.S. Department of Energy (DOE) and U.S. Nuclear Regulatory Commission (NRC), and those imposed by the Yucca Mountain Project to meet repository performance goals. The report is to be used, in part, to describe the waste package design methods and techniques to be used for producing input to the License Application Report.

  12. Unrestricted disposal of minimal activity levels of radioactive wastes: exposure and risk calculations

    SciTech Connect

    Fields, D.E.; Emerson, C.J.

    1984-08-01

    The US Nuclear Regulatory Commission is currently considering revision of rule 10 CFR Part 20, which covers disposal of solid wastes containing minimal radioactivity. In support of these revised rules, we have evaluated the consequences of disposing of four waste streams at four types of disposal areas located in three different geographic regions. Consequences are expressed in terms of human exposures and associated health effects. Each geographic region has its own climate and geology. Example waste streams, waste disposal methods, and geographic regions chosen for this study are clearly specified. Monetary consequences of minimal activity waste disposal are briefly discussed. The PRESTO methodology was used to evaluate radionuclide transport and health effects. This methodology was developed to assess radiological impacts to a static local population for a 1000-year period following disposal. Pathways and processes of transit from the trench to exposed populations included the following considerations: groundwater transport, overland flow, erosion, surface water dilution, resuspension, atmospheric transport, deposition, inhalation, and ingestion of contaminated beef, milk, crops, and water. 12 references, 2 figures, 8 tables.
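
    The pathway summation at the heart of assessments like the one described above can be caricatured in a few lines. This is a toy sketch only, not the PRESTO code: every number below (concentrations, intake rates, dose coefficients) is a hypothetical placeholder, and the dictionary layout is invented. The point is just the bookkeeping: annual dose is the sum over pathways of medium concentration times annual intake times a dose coefficient.

```python
# Toy pathway-dose bookkeeping in the spirit of a PRESTO-style pathway
# summation: annual dose = sum over pathways of
#   (medium concentration) x (annual intake) x (dose coefficient).
# Every number below is an illustrative placeholder, not PRESTO data.
pathways = {
    # name:          (concentration, annual intake, Sv per Bq taken in)
    "drinking_water": (2.0e1,   730.0, 2.8e-8),   # Bq/L,  L/yr
    "crops":          (5.0e0,   200.0, 2.8e-8),   # Bq/kg, kg/yr
    "milk":           (1.0e0,   300.0, 2.8e-8),   # Bq/L,  L/yr
    "inhalation":     (4.0e-3, 8000.0, 5.0e-8),   # Bq/m3, m3/yr
}

annual_dose_sv = sum(c * intake * dcf for c, intake, dcf in pathways.values())
print(f"annual dose: {annual_dose_sv:.2e} Sv")
```

A real assessment would add time dependence, decay, transport between compartments, and population averaging; this shows only the final exposure-pathway sum.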

  13. Constructivism: a naturalistic methodology for nursing inquiry.

    PubMed

    Appleton, J V; King, L

    1997-12-01

    This article will explore the philosophical underpinnings of the constructivist research paradigm. Despite its increasing popularity in evaluative health research studies there is limited recognition of constructivism in popular research texts. Lincoln and Guba's original approach to constructivist methodology is outlined and a detailed framework for nursing research is offered. Fundamental issues and concerns surrounding this methodology are debated and differences between method and methodology are highlighted.

  14. Methodology for Validating Building Energy Analysis Simulations

    SciTech Connect

    Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.
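
    The "analytical verification" step mentioned above (checking a time-stepping simulation against a case with a known closed-form solution) can be sketched in miniature with a one-node lumped-capacitance model. The numbers and the helper name `simulate` are hypothetical; none of the programs listed in the report are involved.

```python
# "Analytical verification" in miniature: run a time-stepping thermal model
# against a case with a known closed-form solution.  One-node lumped
# capacitance room cooling toward ambient:
#   dT/dt = -(T - T_amb) / tau,   T(t) = T_amb + (T0 - T_amb) * exp(-t / tau)
# All numbers are hypothetical placeholders.
import math

def simulate(T0, T_amb, tau, t_end, dt):
    """Explicit-Euler integration of the one-node model."""
    T, t = T0, 0.0
    while t < t_end - 1e-9:
        T += dt * (-(T - T_amb) / tau)
        t += dt
    return T

T0, T_amb, tau, t_end = 25.0, 5.0, 3600.0, 7200.0   # degC, degC, s, s
analytic = T_amb + (T0 - T_amb) * math.exp(-t_end / tau)
numeric = simulate(T0, T_amb, tau, t_end, dt=1.0)
print(abs(numeric - analytic) < 0.01)   # discretization error is tiny here
```

The same pattern scales up: pick a configuration simple enough to solve by hand, run the full code on it, and bound the disagreement before trusting the code on cases with no analytic answer.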

  15. Grounded theory methodology--narrativity revisited.

    PubMed

    Ruppel, Paul Sebastian; Mey, Günter

    2015-06-01

    This article aims to illuminate the role of narrativity in Grounded Theory Methodology and to explore an approach within Grounded Theory Methodology that is sensitized towards aspects of narrativity. The suggested approach takes into account narrativity as an aspect of the underlying data. It reflects how narrativity could be conceptually integrated and systematically used for shaping the way in which coding, category development and the presentation of results in a Grounded Theory Methodology study proceed.

  16. Minimizing instrumental polarization in the Multiangle SpectroPolarimetric Imager (MSPI) using diattenuation balancing between the three mirror coatings

    NASA Astrophysics Data System (ADS)

    Mahler, Anna-Britt; Raouf, Nasrat A.; Smith, Paula K.; McClain, Stephen C.; Chipman, Russell A.

    2008-07-01

    Special enhanced silver mirror coatings were designed and fabricated to minimize the polarization introduced by a three-mirror off-axis high-accuracy telescope. A system diattenuation of approximately 1% in the VIS-NIR was achieved by both reducing the diattenuation from each mirror individually and by balancing the diattenuations introduced by the three mirrors over the spectral range. This process of low-polarization engineering involves minimizing system polarization introduced by surface geometry, thin film coatings and birefringent elements, and measuring the system. In this report we will outline a methodology to minimize instrumental polarization aberrations, with an emphasis on achieving low diattenuation in the MSPI camera, given its off-axis geometry and coating design constraints imposed by the space-based application. This polarization balancing technique for mirror coatings can be applied to astrophysics applications.
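
    The balancing idea can be illustrated with a deliberately simplified model: assume a common s/p basis for all three mirrors (which the real off-axis geometry does not strictly allow) and invented coating reflectances; the function names `diattenuation` and `cascade` are my own. A mirror with s- and p-reflectances Rs, Rp has diattenuation d = (Rs - Rp)/(Rs + Rp), and in this idealization the reflectances of the three-mirror cascade simply multiply.

```python
# Deliberately simplified diattenuation-balancing model.  Assumptions:
# a shared s/p basis for all three mirrors and invented coating numbers.
# Single mirror: d = (Rs - Rp) / (Rs + Rp); cascade: reflectances multiply.
def diattenuation(Rs, Rp):
    return (Rs - Rp) / (Rs + Rp)

def cascade(mirrors):
    Rs = Rp = 1.0
    for rs, rp in mirrors:
        Rs *= rs
        Rp *= rp
    return diattenuation(Rs, Rp)

unbalanced = [(0.980, 0.960)] * 3                            # all same sign
balanced = [(0.980, 0.960), (0.970, 0.982), (0.975, 0.984)]  # signs mixed
print(abs(cascade(balanced)) < abs(cascade(unbalanced)))
```

With all three coatings diattenuating in the same sense the system diattenuation is roughly three times that of one mirror; mixing the signs cancels most of it, which is the balancing effect the coatings were engineered for.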

  17. Research Methodologies in Science Education: Qualitative Data.

    ERIC Educational Resources Information Center

    Libarkin, Julie C.; Kurdziel, Josepha P.

    2002-01-01

    Introduces the concepts and terminology of qualitative research methodologies in the context of science education. Discusses interviewing, observing, validity, reliability, and confirmability. (Author/MM)

  18. Minimally Invasive Colorectal Cancer Surgery in Europe

    PubMed Central

    Babaei, Masoud; Balavarca, Yesilda; Jansen, Lina; Gondos, Adam; Lemmens, Valery; Sjövall, Annika; Børge Johannesen, Tom; Moreau, Michel; Gabriel, Liberale; Gonçalves, Ana Filipa; Bento, Maria José; van de Velde, Tony; Kempfer, Lana Raffaela; Becker, Nikolaus; Ulrich, Alexis; Ulrich, Cornelia M.; Schrotz-King, Petra; Brenner, Hermann

    2016-01-01

    Abstract Minimally invasive surgery (MIS) of colorectal cancer (CRC) was first introduced over 20 years ago and recently has gained increasing acceptance and usage beyond clinical trials. However, data on dissemination of the method across countries and on long-term outcomes are still sparse. In the context of a European collaborative study, a total of 112,023 CRC cases from 3 population-based (N = 109,695) and 4 institute-based clinical cancer registries (N = 2328) were studied and compared on the utilization of MIS versus open surgery. Cox regression models were applied to study associations between surgery type and survival of patients from the population-based registries. The study considered adjustment for potential confounders. The percentage of CRC patients undergoing MIS differed substantially between centers and generally increased over time. MIS was significantly less often used in stage II to IV colon cancer compared with stage I in most centers. MIS tended to be less often used in older (70+) than in younger colon cancer patients. MIS tended to be more often used in women than in men with rectal cancer. MIS was associated with significantly reduced mortality among colon cancer patients in the Netherlands (hazard ratio [HR] 0.66, 95% confidence interval [CI] 0.63–0.69), Sweden (HR 0.68, 95% CI 0.60–0.76), and Norway (HR 0.73, 95% CI 0.67–0.79). Likewise, MIS was associated with reduced mortality of rectal cancer patients in the Netherlands (HR 0.74, 95% CI 0.68–0.80) and Sweden (HR 0.77, 95% CI 0.66–0.90). Utilization of MIS in CRC resection is increasing, but large variation between European countries and clinical centers prevails. Our results support an association of MIS with substantially enhanced survival among colon cancer patients. Further studies controlling for selection bias and residual confounding are needed to establish the role of MIS in the survival of patients. PMID:27258522
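
    A quick consistency check one can run on reported Cox-model results like those above is to recover the standard error of log(HR) from the 95% CI, since the CI bounds are exp(log(HR) ± 1.96·SE). The figures below are the Netherlands colon-cancer numbers quoted in the abstract; the check itself is generic back-of-envelope statistics, not part of the study's analysis.

```python
# Back-of-envelope check on reported Cox-model output: recover the standard
# error of log(HR) from a 95% CI, since the CI bounds are
#   exp(log(HR) +/- 1.96 * SE).
# Figures: Netherlands colon-cancer result quoted in the abstract.
import math

hr, lo, hi = 0.66, 0.63, 0.69
se_log_hr = (math.log(hi) - math.log(lo)) / (2 * 1.96)
z = math.log(hr) / se_log_hr      # Wald z-statistic against HR = 1
print(round(se_log_hr, 4), round(z, 1))
```

The large Wald statistic is consistent with the "significantly reduced mortality" wording: a CI this narrow around 0.66 sits far from HR = 1.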

  19. Pesticides and other chemicals: minimizing worker exposures.

    PubMed

    Keifer, Matthew; Gasperini, Frank; Robson, Mark

    2010-07-01

    Pesticides, ammonia, and sanitizers, all used in agricultural production, present ongoing risks for exposed workers. Pesticides continue to poison workers despite elimination of some of the most toxic older products. Obligatory reporting of pesticide poisonings exists in 30 states, and surveillance of poisoning occurs in only 12. Estimates of poisoning numbers have been based on sampling, but funding for this is scant and in constant jeopardy. There appears to be a downward trend in poisonings nationally based on SENSOR data. Newer, more pest-specific pesticides are generally less toxic and present fewer health risks, but may have unpredicted health effects in humans that may not emerge until they are used widely. Internationally, older, cheaper chemicals continue to be used with serious consequences in many developing countries. Broadly monitoring workers for overexposure to pesticides is impractical, with the exception of the cholinesterase inhibitors. We can learn much from monitoring systems. Unfortunately, monitoring tools are economically inaccessible for most other chemical groups. New technologies for toxicity testing will necessitate new biomonitoring tools that should be supplied by the producers of these chemicals and made available for protecting workers and the public. Protection of workers from pesticides is primarily based on personal protective equipment use, which presents significant hardship for workers in hot environments and is generally considered the least effective approach in the hierarchy of controls for worker protection. Isolation through the use of closed systems has been employed, though rarely studied as to effectiveness in field use. Substitution, or replacing harmful substances with safer ones, is underway as more pest-specific chemicals enter the pesticide portfolio and older ones drop out. This paper summarizes the panel presentation, "Minimizing Exposures to Pesticides and Other Chemicals," at the Agricultural Safety and Health Council of America

  20. MINIMIZATION OF CARBON LOSS IN COAL REBURNING

    SciTech Connect

    Vladimir Zamansky; Vitali Lissianski; Pete Maly; Richard Koppang

    2002-09-10

    This project develops Fuel-Flexible Reburning (FFR) technology, an improved version of conventional reburning. In FFR, solid fuel is partially gasified before injection into the reburning zone of a boiler. Partial gasification of the solid fuel improves the efficiency of NOx reduction and decreases loss on ignition (LOI) by increasing fuel reactivity. The objectives of this project were to develop the engineering and scientific information and know-how needed to improve the cost of reburning via increased efficiency and minimized LOI, and to move the FFR technology to the demonstration and commercialization stage. All project objectives and technical performance goals have been met, and competitive advantages of FFR have been demonstrated. The work included a combination of experimental and modeling studies designed to identify optimum process conditions, confirm the process mechanism, and estimate the cost effectiveness of the FFR technology. Experimental results demonstrated that partial gasification of a solid fuel prior to injection into the reburning zone improved the efficiency of NOx reduction and decreased LOI. Several coals with different volatiles content were tested. Testing suggested that the incremental increase in the efficiency of NOx reduction due to coal gasification was more significant for coals with low volatiles content. Up to a 14% increase in the efficiency of NOx reduction in comparison with basic reburning was achieved with coal gasification. Tests also demonstrated that FFR improved the efficiency of NOx reduction for renewable fuels with high fuel-N content. Modeling efforts focused on the development of a model describing reburning with gaseous gasification products. Modeling predicted that the composition of coal gasification products depended on temperature. Comparison of experimental results and modeling predictions suggested that heterogeneous NOx reduction on the surface of char played an important role. Economic analysis confirmed