Science.gov

Sample records for minimal cut-set methodology

  1. Minimal cut-set methodology for artificial intelligence applications

    SciTech Connect

    Weisbin, C.R.; de Saussure, G.; Barhen, J.; Oblow, E.M.; White, J.C.

    1984-01-01

    This paper reviews minimal cut-set theory and illustrates its application with an example. The minimal cut-set approach uses disjunctive normal form in Boolean algebra and various Boolean operators to simplify very complicated tree structures composed of AND/OR gates. The simplification process is automated and performed off-line using existing computer codes to implement the Boolean reduction on the finite but large tree structure. With this approach, on-line expert diagnostic systems, whose response time is critical, could determine directly whether a goal is achievable by comparing the actual system state to a concisely stored set of preprocessed critical state elements.
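
    A minimal sketch of the reduction described above, assuming a toy AND/OR tree encoded as nested tuples (the encoding and event names are invented, and this is not the cited SciTech code): expand the tree into disjunctive normal form, then apply Boolean absorption so that only the minimal cut sets remain.

    ```python
    from itertools import product

    # Hypothetical fault tree: TOP = (A AND B) OR (A AND C)
    tree = ("OR", ("AND", "A", "B"), ("AND", "A", "C"))

    def cut_sets(node):
        """Expand a gate into disjunctive normal form: a list of event sets."""
        if isinstance(node, str):                      # basic event
            return [frozenset([node])]
        op, *kids = node
        kid_sets = [cut_sets(k) for k in kids]
        if op == "OR":                                 # union of children's terms
            return [s for sets in kid_sets for s in sets]
        # AND: merge one term taken from each child
        return [frozenset().union(*combo) for combo in product(*kid_sets)]

    def minimize(sets):
        """Boolean absorption: drop any cut set that contains another cut set."""
        sets = set(sets)
        return [s for s in sets if not any(t < s for t in sets)]

    print(minimize(cut_sets(tree)))   # [{'A', 'B'}, {'A', 'C'}] in some order
    ```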

  2. CUTSETS - MINIMAL CUT SET CALCULATION FOR DIGRAPH AND FAULT TREE RELIABILITY MODELS

    NASA Technical Reports Server (NTRS)

    Iverson, D. L.

    1994-01-01

    Fault tree and digraph models are frequently used for system failure analysis. Both types of models represent a failure-space view of the system using AND and OR nodes in a directed graph structure. Fault trees must have a tree structure and do not allow cycles or loops in the graph. Digraphs allow any pattern of interconnection between nodes, including cycles and loops. A common operation performed on digraph and fault tree models is the calculation of minimal cut sets. A cut set is a set of basic failures that could cause a given target failure event to occur. A minimal cut set for a target event node in a fault tree or digraph is any cut set for the node with the property that if any one of the failures in the set is removed, the occurrence of the other failures in the set will not cause the target failure event. CUTSETS will identify all the minimal cut sets for a given node. The CUTSETS package contains programs that solve for minimal cut sets of fault trees and digraphs using object-oriented programming techniques. These cut set codes can be used to solve graph models for reliability analysis and identify potential single point failures in a modeled system. The fault tree minimal cut set code reads in a fault tree model input file with each node listed in a text format. In the input file the user specifies a top node of the fault tree and a maximum cut set size to be calculated. CUTSETS will find minimal sets of basic events which would cause the failure at the output of a given fault tree gate. The program can find all the minimal cut sets of a node, or minimal cut sets up to a specified size. The algorithm performs a recursive top-down parse of the fault tree, starting at the specified top node, and combines the cut sets of each child node into sets of basic event failures that would cause the failure event at the output of that gate. Minimal cut set solutions can be found for all nodes in the fault tree or just for the top node. The digraph cut set code uses the same
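
    The recursive gate-by-gate combination and the "maximum cut set size" cap described above can be sketched as follows; the dictionary layout and event names are assumptions for illustration, not the NASA input format.

    ```python
    from itertools import product

    # Hypothetical fault tree: gates reference children; leaves are basic events.
    gates = {
        "TOP": ("AND", ["G1", "G2"]),
        "G1":  ("OR",  ["pump_a_fails", "valve_x_stuck"]),
        "G2":  ("OR",  ["pump_b_fails", "valve_x_stuck"]),
    }

    def solve(node, max_size=None):
        """Minimal cut sets for a node, optionally capped at max_size events."""
        if node not in gates:                          # basic event
            return {frozenset([node])}
        op, children = gates[node]
        child_sets = [solve(c, max_size) for c in children]
        if op == "OR":                                 # any child's cut set works
            combined = set().union(*child_sets)
        else:                                          # AND: one set per child
            combined = {frozenset().union(*combo)
                        for combo in product(*child_sets)}
        if max_size is not None:
            combined = {s for s in combined if len(s) <= max_size}
        # keep only minimal sets (absorption)
        return {s for s in combined if not any(t < s for t in combined)}

    print(solve("TOP"))               # includes {'valve_x_stuck'}: a single point failure
    print(solve("TOP", max_size=1))   # only the single-event cut set survives
    ```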

  3. Direct calculation of minimal cut sets involving a specific reaction knock-out.

    PubMed

    Tobalina, Luis; Pey, Jon; Planes, Francisco J

    2016-07-01

    The concept of Minimal Cut Sets (MCSs) is used in metabolic network modeling to describe minimal groups of reactions or genes whose simultaneous deletion eliminates the capability of the network to perform a specific task. Previous work showed that MCSs were closely related to Elementary Flux Modes (EFMs) in a particular dual problem, opening up the possibility of using the tools developed for computing EFMs to compute MCSs. Until recently, however, there existed no method to compute an EFM with some specific characteristic, meaning that, in the case of MCSs, the only strategy to obtain them was to enumerate them using, for example, the standard K-shortest EFMs algorithm. In this work, we adapt the recently developed theory for computing EFMs satisfying several constraints to the calculation of MCSs involving a specific reaction knock-out. Importantly, we emphasize that not all the EFMs in the dual problem correspond to real MCSs, and propose a new formulation capable of correctly identifying the desired MCSs. Furthermore, this formulation brings interesting insights about the relationship between the primal and the dual problem of the MCS computation. A Matlab-Cplex implementation of the proposed algorithm is available as supplementary material. Contact: fplanes@ceit.es. Supplementary data are available at Bioinformatics online.
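
    The definition at work here can be checked numerically: a reaction set is an MCS for a target if deleting all of its members blocks the target flux while every proper subset leaves it feasible. The sketch below verifies this on an invented four-reaction toy network with SciPy's LP solver; it is a brute-force illustration of the concept, not the paper's dual-EFM algorithm.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy network (invented): R1: -> A, R2: A -> B, R3: A -> B, R4: B -> (target)
    S = np.array([[1, -1, -1,  0],    # metabolite A balance
                  [0,  1,  1, -1]])   # metabolite B balance
    n, target = S.shape[1], 3

    def max_target_flux(knockouts):
        ub = [10.0] * n
        for k in knockouts:
            ub[k] = 0.0                # deletion: force zero flux
        # maximize v_target subject to S v = 0 and 0 <= v <= ub
        res = linprog(-np.eye(n)[target], A_eq=S, b_eq=np.zeros(2),
                      bounds=list(zip([0.0] * n, ub)))
        return -res.fun

    def is_minimal_cut_set(K):
        blocks = max_target_flux(K) < 1e-9
        proper = all(max_target_flux(K - {k}) > 1e-9 for k in K)
        return blocks and proper

    print(is_minimal_cut_set({1, 2}))   # True: both A -> B routes must go
    print(is_minimal_cut_set({1}))      # False: the isoenzyme R3 still works
    ```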

  4. Sequential computation of elementary modes and minimal cut sets in genome-scale metabolic networks using alternate integer linear programming.

    PubMed

    Song, Hyun-Seob; Goldberg, Noam; Mahajan, Ashutosh; Ramkrishna, Doraiswami

    2017-08-01

    Elementary (flux) modes (EMs) have served as a valuable tool for investigating structural and functional properties of metabolic networks. Identification of the full set of EMs in genome-scale networks remains challenging due to the combinatorial explosion of EMs in complex networks. Often, however, only a small subset of relevant EMs needs to be known, for which optimization-based sequential computation is a useful alternative. Most of the currently available methods along this line are based on the iterative use of mixed integer linear programming (MILP), the effectiveness of which significantly deteriorates as the number of iterations builds up. To alleviate the computational burden associated with the MILP implementation, we here present a novel optimization algorithm termed alternate integer linear programming (AILP). Our algorithm was designed to iteratively solve a pair of integer programming (IP) and linear programming (LP) problems to compute EMs in a sequential manner. In each step, the IP identifies a minimal subset of reactions, the deletion of which disables all previously identified EMs. Thus, a subsequent LP solution subject to this reaction deletion constraint becomes a distinct EM. In cases where no feasible LP solution is available, IP-derived reaction deletion sets represent minimal cut sets (MCSs). Despite the additional computation of MCSs, AILP achieved significant time reduction in computing EMs, by orders of magnitude. The proposed AILP algorithm not only offers a computational advantage in the EM analysis of genome-scale networks, but also improves the understanding of the linkage between EMs and MCSs. The software is implemented in Matlab and is provided as supplementary information. Contact: hyunseob.song@pnnl.gov. Supplementary data are available at Bioinformatics online.
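
    Under the same toy assumptions as the previous sketch, the alternation itself looks like this: a (here brute-forced) "IP" step finds the smallest reaction set hitting every EM found so far, and an LP step under that deletion either yields a new mode or, when no nonzero flux survives, certifies the deletion set as an MCS. The real AILP uses proper integer programs and a rigorous EM characterization; this only illustrates the control flow.

    ```python
    import numpy as np
    from scipy.optimize import linprog
    from itertools import combinations

    S = np.array([[1, -1, -1,  0],         # same invented network as above
                  [0,  1,  1, -1]])
    n, target = S.shape[1], 3

    def lp_mode(deleted):
        """LP step: maximize target flux with `deleted` reactions at zero."""
        ub = [10.0] * n
        for k in deleted:
            ub[k] = 0.0
        res = linprog(-np.eye(n)[target], A_eq=S, b_eq=[0.0, 0.0],
                      bounds=list(zip([0.0] * n, ub)))
        if -res.fun < 1e-9:
            return None                    # no flux: the deletion kills all modes
        return frozenset(np.flatnonzero(res.x > 1e-9))  # support ~ one EM here

    ems, mcss = [], []
    for _ in range(10):
        # "IP" step: smallest reaction set intersecting every EM support so far
        hit = next(set(c) for size in range(n + 1)
                   for c in combinations(range(n), size)
                   if all(set(c) & em for em in ems))
        mode = lp_mode(hit)
        if mode is None:
            mcss.append(hit)               # infeasible LP: `hit` is an MCS
            break                          # (a full version would keep going)
        ems.append(mode)

    print("EMs:", ems)    # one route via R2 or R3 on the first pass
    print("MCSs:", mcss)  # then e.g. {0}: cutting uptake alone blocks the target
    ```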

  5. SET OF CUT SETS AND OPTIMUM FLOW,

    DTIC Science & Technology

    maintain the same terminal flow. The method presented stems from the work of Ford and Fulkerson which relates maximum terminal flow to the cut set...separating the terminals. A new set of cut sets called a 'set of M-cut sets' is introduced from which it is possible to improve edge flows while maintaining maximum terminal flow.
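
    The Ford-Fulkerson duality referenced above is easy to demonstrate numerically: the maximum terminal flow equals the capacity of the minimum cut set separating the terminals. A small check with networkx (the graph and capacities are invented):

    ```python
    import networkx as nx

    G = nx.DiGraph()
    G.add_edge("s", "a", capacity=3)
    G.add_edge("s", "b", capacity=2)
    G.add_edge("a", "b", capacity=1)
    G.add_edge("a", "t", capacity=2)
    G.add_edge("b", "t", capacity=3)

    cut_value, (source_side, sink_side) = nx.minimum_cut(G, "s", "t")
    flow_value, _ = nx.maximum_flow(G, "s", "t")
    assert cut_value == flow_value    # max-flow equals min-cut capacity
    print(cut_value, source_side, sink_side)   # 5 ({'s', 'a', 'b'}, {'t'})
    ```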

  6. Cut set-based risk and reliability analysis for arbitrarily interconnected networks

    DOEpatents

    Wyss, Gregory D.

    2000-01-01

    Method for computing all-terminal reliability for arbitrarily interconnected networks such as the United States public switched telephone network. The method includes an efficient search algorithm to generate minimal cut sets for nonhierarchical networks directly from the network connectivity diagram. Efficiency of the search algorithm stems in part from its consideration of only link failures. The method also includes a novel quantification scheme that likewise reduces the computational effort associated with assessing network reliability based on traditional risk importance measures. Vast reductions in computational effort are realized since combinatorial expansion and subsequent Boolean reduction steps are eliminated through analysis of network segmentations, using a technique of assuming node failures to occur on only one side of a break in the network and repeating the technique for all minimal cut sets generated with the search algorithm. The method functions equally well for planar and non-planar networks.
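
    Because the search is based only on link failures, the all-terminal criterion reduces to: a minimal cut set is a minimal set of links whose removal disconnects the graph. The brute-force sketch below illustrates that criterion on an invented four-node ring; the patented search algorithm finds the same sets far more efficiently.

    ```python
    import networkx as nx
    from itertools import combinations

    G = nx.cycle_graph(4)                  # 4-node ring network
    edges = list(G.edges())

    def disconnects(edge_subset):
        H = G.copy()
        H.remove_edges_from(edge_subset)
        return not nx.is_connected(H)

    cuts = []
    for size in range(1, len(edges) + 1):  # smallest sets first => minimality
        for combo in combinations(edges, size):
            if disconnects(combo) and not any(set(c) <= set(combo) for c in cuts):
                cuts.append(combo)

    print(len(cuts), cuts)   # 6: any two links of a ring form a minimal cut set
    ```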

  7. Energy minimization in medical image analysis: Methodologies and applications.

    PubMed

    Zhao, Feng; Xie, Xianghua

    2016-02-01

    Energy minimization is of particular interest in medical image analysis. In the past two decades, a variety of optimization schemes have been developed. In this paper, we present a comprehensive survey of the state-of-the-art optimization approaches. These algorithms are mainly classified into two categories: continuous methods and discrete methods. The former include the Newton-Raphson, gradient descent, conjugate gradient, proximal gradient, coordinate descent, and genetic algorithm-based methods, while the latter cover the graph cuts, belief propagation, tree-reweighted message passing, linear programming, maximum margin learning, simulated annealing, and iterated conditional modes methods. We also discuss the minimal surface method, primal-dual method, and the multi-objective optimization method. In addition, we review several comparative studies that evaluate the performance of different minimization techniques in terms of accuracy, efficiency, or complexity. These optimization techniques are widely used in many medical applications, for example, image segmentation, registration, reconstruction, motion tracking, and compressed sensing. We thus give an overview of those applications as well.
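
    As a concrete instance of the "continuous" family surveyed here, gradient descent on a tiny 1-D denoising energy looks like this (the signal, smoothing weight, and step size are invented for illustration):

    ```python
    import numpy as np

    # Energy E(u) = ||u - f||^2 + lam * ||D u||^2, with D = finite differences
    rng = np.random.default_rng(0)
    f = np.sin(np.linspace(0, np.pi, 50)) + 0.1 * rng.standard_normal(50)
    lam, step = 5.0, 0.01
    u = f.copy()

    def grad(u):
        # dE/du = 2 (u - f) + 2 lam D^T D u; D^T D is the discrete Laplacian
        lap = np.zeros_like(u)
        lap[1:-1] = 2 * u[1:-1] - u[:-2] - u[2:]
        lap[0], lap[-1] = u[0] - u[1], u[-1] - u[-2]
        return 2 * (u - f) + 2 * lam * lap

    for _ in range(500):
        u -= step * grad(u)       # plain fixed-step gradient descent

    print(round(float(np.mean((u - f) ** 2)), 5))   # small: u smooths f gently
    ```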

  8. Pollution balance. A new methodology for minimizing waste production in manufacturing processes

    SciTech Connect

    Hilaly, A.K.; Sikdar, S.K.

    1994-11-01

    A new methodology based on a generic pollution balance equation has been developed for minimizing waste production in manufacturing processes. A "pollution index," defined as the mass of waste produced per unit mass of a product, has been introduced to provide a quantitative measure of waste generation in a process. A waste reduction algorithm also has been developed from the pollution balance equation. This paper explains this methodology and demonstrates the applicability of the method by a case study. 8 refs., 7 figs.
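
    The index itself is a one-liner; the masses below are invented for illustration.

    ```python
    def pollution_index(mass_waste_kg, mass_product_kg):
        """Mass of waste produced per unit mass of product, as defined above."""
        return mass_waste_kg / mass_product_kg

    # A process emitting 120 kg of waste while making 500 kg of product:
    print(pollution_index(120.0, 500.0))   # 0.24 kg waste per kg product
    ```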

  9. Anthropogenic microfibres pollution in marine biota. A new and simple methodology to minimize airborne contamination.

    PubMed

    Torre, Michele; Digka, Nikoletta; Anastasopoulou, Aikaterini; Tsangaris, Catherine; Mytilineou, Chryssi

    2016-12-15

    Research studies on the effects of microlitter on marine biota have become more and more frequent in the last few years. However, there is strong evidence that scientific results based on microlitter analyses can be biased by contamination from air-transported fibres. This study demonstrates a low-cost and easy-to-apply methodology to minimize background contamination and thus increase the validity of results. Contamination during the gastrointestinal content analysis of 400 fishes was tested for several sample processing steps at high risk of airborne contamination (e.g. dissection, stereomicroscopic analysis, and chemical digestion treatment for microlitter extraction). It was demonstrated that, using our methodology based on hermetic enclosure devices to isolate the working areas during the various processing steps, airborne contamination was reduced by 95.3%. The simplicity and low cost of this methodology provide the benefit that it could be applied not only to laboratory work but also to field or on-board work.

  10. A methodology for formulating a minimal uncertainty model for robust control system design and analysis

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Chang, B.-C.; Fischl, Robert

    1989-01-01

    In the design and analysis of robust control systems for uncertain plants, the technique of formulating what is termed an M-delta model has become widely accepted and applied in the robust control literature. The M represents the transfer function matrix M(s) of the nominal system, and delta represents an uncertainty matrix acting on M(s). The uncertainty can arise from various sources, such as structured uncertainty from parameter variations or multiple unstructured uncertainties from unmodeled dynamics and other neglected phenomena. In general, delta is a block diagonal matrix, and for real parameter variations the diagonal elements are real. As stated in the literature, this structure can always be formed for any linear interconnection of inputs, outputs, transfer functions, parameter variations, and perturbations. However, very little of the literature addresses methods for obtaining this structure, and none of it addresses a general methodology for obtaining a minimal M-delta model for a wide class of uncertainty. Since having a delta matrix of minimum order would improve the efficiency of structured singular value (or multivariable stability margin) computations, a method of obtaining a minimal M-delta model would be useful. A generalized method of obtaining a minimal M-delta structure for systems with real parameter variations is given.
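
    For reference, the standard interconnection this literature works with can be written as an upper linear fractional transformation with a block-diagonal uncertainty; this is textbook robust-control notation, not a reconstruction of the paper's particular minimal realization.

    ```latex
    \Delta = \operatorname{diag}\left(\delta_1 I_{r_1}, \dots, \delta_p I_{r_p},
             \Delta_1, \dots, \Delta_f\right), \qquad
    F_u(M, \Delta) = M_{22} + M_{21}\, \Delta\, (I - M_{11}\Delta)^{-1} M_{12}
    ```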

  11. Proposed SPAR Modeling Method for Quantifying Time Dependent Station Blackout Cut Sets

    SciTech Connect

    John A. Schroeder

    2010-06-01

    The U.S. Nuclear Regulatory Commission's (USNRC's) Standardized Plant Analysis Risk (SPAR) models and industry risk models take similar approaches to analyzing the risk associated with loss of offsite power and station blackout (LOOP/SBO) events at nuclear reactor plants. In both SPAR models and industry models, core damage risk resulting from a LOOP/SBO event is analyzed using a combination of event trees and fault trees that produce cut sets that are, in turn, quantified to obtain a numerical estimate of the resulting core damage risk. A proposed SPAR method for quantifying the time-dependent cut sets is sometimes referred to as a convolution method. The SPAR method reflects assumptions about the timing of emergency diesel failures, the timing of subsequent attempts at emergency diesel repair, and the timing of core damage that may be different from those often used in industry models. This paper describes the proposed SPAR method.
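
    The flavor of such a convolution quantification can be sketched numerically: weight each possible diesel failure time by the chance that neither repair nor offsite power recovery arrives before batteries deplete. All rates below are invented, and the model is far simpler than either the SPAR or the industry treatments.

    ```python
    import numpy as np

    lam, mu, theta = 1e-3, 0.25, 0.1   # failure / repair / recovery rates, per hour
    B, T = 4.0, 24.0                   # battery life and mission time, hours

    t = np.linspace(0.0, T, 2401)
    integrand = (lam * np.exp(-lam * t)        # diesel fails near time t
                 * np.exp(-mu * B)             # no repair within B hours
                 * np.exp(-theta * (t + B)))   # offsite power still out at t + B
    p_cd = float(np.sum(integrand) * (t[1] - t[0]))   # simple rectangle rule
    print(p_cd)   # time-integrated core damage contribution
    ```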

  12. A Modulator Design Methodology Minimizing Power Dissipation in a Quantum Well Modulator-Based Optical Interconnect

    NASA Astrophysics Data System (ADS)

    Cho, Hoyeol; Kapur, Pawan; Saraswat, Krishna C.

    2007-06-01

    There is a strong need for a methodology that minimizes total power, and that inherently includes device design, for short-distance optical link applications (chip-to-chip or board-to-board communications). We present such a power optimization methodology for a modulator-based optical link, where we perform a full 3-D modulator parameter optimization while keeping the power of the entire link in mind. We find that for low bit rates (10 Gb/s) the optimum operational voltage for the modulator is within the supply voltage at the 65-nm technology node. At higher bit rates, the optimum voltage is found to increase and go beyond the stipulated supply voltage. In such a case, suboptimum operation at the supply voltage incurs a 46% power penalty at 25 Gb/s. Having obtained the optimum modulator design and operation parameters and the corresponding total link power dissipation, we examine the impact of device and system parameters on the optimization. We find that a smaller device capacitance is an efficient way to push the optimum swing voltage to within the supply voltage. This is feasible using monolithically integrated Ge-based complementary metal-oxide-semiconductor-compatible modulators and metal-semiconductor-metal photodetectors.

  13. Using benchmarking to minimize common DOE waste streams. Volume 1, Methodology and liquid photographic waste

    SciTech Connect

    Levin, V.

    1994-04-01

    Finding innovative ways to reduce waste streams generated at Department of Energy (DOE) sites by 50% by the year 2000 is a challenge for DOE's waste minimization efforts. This report examines the usefulness of benchmarking as a waste minimization tool, specifically regarding common waste streams at DOE sites. A team of process experts from a variety of sites, a project leader, and benchmarking consultants completed the project, with management support provided by the Waste Minimization Division EM-352. Using a 12-step benchmarking process, the team examined current waste minimization processes for liquid photographic waste used at their sites and used telephone and written questionnaires to find "best-in-class" industry partners willing to share information about their best waste minimization techniques and technologies through a site visit. Eastman Kodak Co. and Johnson Space Center/National Aeronautics and Space Administration (NASA) agreed to be partners. The site visits yielded strategies for source reduction, recycle/recovery of components, regeneration/reuse of solutions, and treatment of residuals, as well as best management practices. An additional benefit of the work was the opportunity for DOE process experts to network and exchange ideas with their peers at similar sites.

  14. POLLUTION BALANCE: A NEW METHODOLOGY FOR MINIMIZING WASTE PRODUCTION IN MANUFACTURING PROCESSES.

    EPA Science Inventory

    A new methodology based on a generic pollution balance equation has been developed for minimizing waste production in manufacturing processes. A "pollution index," defined as the mass of waste produced per unit mass of a product, has been introduced to provide a quantitative meas...

  15. Methodology for Minimizing Losses for the Harman Technique at High Temperatures

    NASA Astrophysics Data System (ADS)

    McCarty, R.; Thompson, J.; Sharp, J.; Thompson, A.; Bierschenk, J.

    2012-06-01

    A high-temperature Harman technique for measuring material ZT, or thermoelectric figure of merit, with increased measurement accuracy is presented. Traditional Harman tests are sensitive to radiation heat losses at elevated temperature, and measurement errors are minimized by applying current in positive and negative polarities while thermally sinking the sample base to a constant temperature for both polarities (referred to here as bottom temperature match, BTM). Since the sample top temperature differs between polarities in BTM, the heat losses are not equivalent and still add error to the ZT measurement. A modification is presented in which the sample base temperature is adjusted until the sample top temperature is the same in both polarities (referred to as top temperature match, TTM). This ensures that heat losses from the top of the sample are nearly identical and cancel out of the ZT calculation. A temperature-controlled radiation guard maintained at the sample top temperature is employed to minimize radiation loss and increase ZT calculation accuracy. Finite-element analysis (FEA) models suggest that ZT errors less than 5% for Bi2Te3 alloys tested at 250°C are achievable, a 30% improvement over the conventional BTM approach. Experimental results support these trends.

  16. Periodic Application of Stochastic Cost Optimization Methodology to Achieve Remediation Objectives with Minimized Life Cycle Cost

    NASA Astrophysics Data System (ADS)

    Kim, U.; Parker, J.

    2016-12-01

    Many dense non-aqueous phase liquid (DNAPL) contaminated sites in the U.S. are reported as "remediation in progress" (RIP). However, the cost to complete (CTC) remediation at these sites is highly uncertain, and in many cases the current remediation plan may need to be modified or replaced to achieve remediation objectives. This study evaluates the effectiveness of iterative stochastic cost optimization that incorporates new field data for periodic parameter recalibration to incrementally reduce prediction uncertainty and implement remediation design modifications as needed to minimize the life cycle cost (i.e., CTC). This systematic approach, using the Stochastic Cost Optimization Toolkit (SCOToolkit), enables early identification and correction of problems to stay on track for completion while minimizing the expected (i.e., probability-weighted average) CTC. This study considers a hypothetical site involving multiple DNAPL sources in an unconfined aquifer, using thermal treatment for source reduction and electron donor injection for dissolved plume control. The initial design is based on stochastic optimization using model parameters and their joint uncertainty based on calibration to site characterization data. The model is periodically recalibrated using new monitoring data and performance data for the operating remediation systems. Projected future performance under the current remediation plan is assessed, and depending on the assessment results, operational variables for the current system are reoptimized or alternative designs are considered. We compare remediation duration and cost for the stepwise re-optimization approach with single-stage optimization as well as with a non-optimized design based on typical engineering practice.

  17. Minimally invasive metabolic testing for malignant hyperthermia susceptibility: a systematic review of the methodology and results.

    PubMed

    Metterlein, Thomas; Schuster, Frank; Kranke, Peter; Roewer, Norbert; Anetseder, Martin

    2010-03-01

    Malignant hyperthermia (MH) is a potentially lethal hypermetabolic syndrome that develops in susceptible individuals exposed to volatile anesthetics or depolarizing neuromuscular blocking agents. Because genetic screening is successful only in 30-50% of all suspected cases, contracture testing following an open muscle biopsy is performed to diagnose MH susceptibility. Two different protocols exist: the in vitro contracture test (IVCT) in Europe and the caffeine halothane contracture test in the US. As a replacement for the IVCT, an in vivo metabolic test might allow an equal discrimination of MH-susceptible individuals. In this systematic review, all available metabolic testing methods are analyzed. The reader will gain insight into the methods and results of alternative approaches to diagnosing MH. Relevant studies involving in vivo metabolic testing were systematically searched (Medline) and reviewed. Their ability to discriminate MH-susceptible individuals was analyzed and compared. Any systemic or local side effects were documented and evaluated in order to allow more robust conclusions based on larger sample sizes than the single trials. All discussed study protocols allowed an adequate discrimination of MH-susceptible individuals. The latest study protocol reaches a specificity of 79% with a sensitivity of 100%. No severe systemic or local adverse effects could be seen in the pooled analysis. Minimally invasive metabolic testing is a promising novel approach to diagnosing MH. Further multi-center studies have to be conducted to optimize the results in order to replace the IVCT.

  18. A minimally invasive methodology based on morphometric parameters for day 2 embryo quality assessment.

    PubMed

    Molina, Inmaculada; Lázaro-Ibáñez, Elisa; Pertusa, Jose; Debón, Ana; Martínez-Sanchís, Juan Vicente; Pellicer, Antonio

    2014-10-01

    The risk of multiple pregnancy to maternal-fetal health can be minimized by reducing the number of embryos transferred. New tools for selecting embryos with the highest implantation potential should be developed. The aim of this study was to evaluate the ability of morphological and morphometric variables to predict implantation by analysing images of embryos. This was a retrospective study of 135 embryo photographs from 112 IVF-ICSI cycles carried out between January and March 2011. The embryos were photographed immediately before transfer using Cronus 3 software. Their images were analysed using the public program ImageJ. Significant effects (P < 0.05) and higher discriminant power to predict implantation were observed for the morphometric embryo variables compared with morphological ones. The features of successfully implanted embryos were as follows: four cells on day 2 of development; all blastomeres of circular shape (roundness factor greater than 0.9); an average zona pellucida thickness of 13 µm; and an average embryo area of 17695.1 µm². Embryo size, which is described by its area and the average roundness factor for each cell, provides two objective variables to consider when predicting implantation. This approach should be further investigated for its potential ability to improve embryo scoring.

  19. Ensuring transparency and minimization of methodologic bias in preclinical pain research: PPRECISE considerations.

    PubMed

    Andrews, Nick A; Latrémolière, Alban; Basbaum, Allan I; Mogil, Jeffrey S; Porreca, Frank; Rice, Andrew S C; Woolf, Clifford J; Currie, Gillian L; Dworkin, Robert H; Eisenach, James C; Evans, Scott; Gewandter, Jennifer S; Gover, Tony D; Handwerker, Hermann; Huang, Wenlong; Iyengar, Smriti; Jensen, Mark P; Kennedy, Jeffrey D; Lee, Nancy; Levine, Jon; Lidster, Katie; Machin, Ian; McDermott, Michael P; McMahon, Stephen B; Price, Theodore J; Ross, Sarah E; Scherrer, Grégory; Seal, Rebecca P; Sena, Emily S; Silva, Elizabeth; Stone, Laura; Svensson, Camilla I; Turk, Dennis C; Whiteside, Garth

    2016-04-01

    There is growing concern about lack of scientific rigor and transparent reporting across many preclinical fields of biological research. Poor experimental design and lack of transparent reporting can result in conscious or unconscious experimental bias, producing results that are not replicable. The Analgesic, Anesthetic, and Addiction Clinical Trial Translations, Innovations, Opportunities, and Networks (ACTTION) public-private partnership with the U.S. Food and Drug Administration sponsored a consensus meeting of the Preclinical Pain Research Consortium for Investigating Safety and Efficacy (PPRECISE) Working Group. International participants from universities, funding agencies, government agencies, industry, and a patient advocacy organization attended. Reduction of publication bias, increasing the ability of others to faithfully repeat experimental methods, and increased transparency of data reporting were specifically discussed. Parameters deemed essential to increase confidence in the published literature were clear, specific reporting of an a priori hypothesis and definition of the primary outcome measure. Whether power calculations and measurement of the minimal meaningful effect size should be a core component of the preclinical research effort provoked considerable discussion, with many but not all agreeing. Greater transparency of reporting should be driven by scientists, journal editors, reviewers, and grant funders. The conduct of high-quality science that is fully reported should not preclude novelty and innovation in preclinical pain research, and indeed, any efforts that curtail such innovation would be misguided. We believe that to achieve the goal of finding effective new treatments for patients with pain, the pain field needs to deal with these challenging issues.

  1. Towards uniform accelerometry analysis: a standardization methodology to minimize measurement bias due to systematic accelerometer wear-time variation.

    PubMed

    Katapally, Tarun R; Muhajarine, Nazeem

    2014-05-01

    Accelerometers are predominantly used to objectively measure the entire range of activity intensities - sedentary behaviour (SED), light physical activity (LPA) and moderate to vigorous physical activity (MVPA). However, studies consistently report results without accounting for systematic accelerometer wear-time variation (within and between participants), jeopardizing the validity of these results. This study describes the development of a standardization methodology to understand and minimize measurement bias due to wear-time variation. Accelerometry is generally conducted over seven consecutive days, with participants' data being commonly considered 'valid' only if wear-time is at least 10 hours/day. However, even within 'valid' data, there could be systematic wear-time variation. To explore this variation, accelerometer data from the Smart Cities, Healthy Kids study (www.smartcitieshealthykids.com) were analyzed descriptively and with repeated measures multivariate analysis of variance (MANOVA). Subsequently, a standardization method was developed, where case-specific observed wear-time is controlled to an analyst-specified time period. Next, case-specific accelerometer data are interpolated to this controlled wear-time to produce standardized variables. To understand discrepancies owing to wear-time variation, all analyses were conducted pre- and post-standardization. Descriptive analyses revealed systematic wear-time variation, both between and within participants. Pre- and post-standardized descriptive analyses of SED, LPA and MVPA revealed a persistent and often significant trend of wear-time's influence on activity. SED was consistently higher on weekdays before standardization; however, this trend was reversed post-standardization. Even though MVPA was significantly higher on weekdays both pre- and post-standardization, the magnitude of this difference decreased post-standardization. Multivariable analyses with standardized SED, LPA and MVPA as outcome
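
    A naive version of the standardization step reads like this: rescale each day's activity minutes to an analyst-specified controlled wear-time so that days with different wear are comparable. The published method interpolates case by case; this proportional sketch and the records in it are invented.

    ```python
    CONTROLLED_WEAR_MIN = 13 * 60      # analyst-specified wear-time target

    def standardize(day):
        """Rescale activity minutes to the controlled wear-time."""
        scale = CONTROLLED_WEAR_MIN / day["wear_min"]
        return {k: round(day[k] * scale)
                for k in ("sed_min", "lpa_min", "mvpa_min")}

    day1 = {"wear_min": 10 * 60, "sed_min": 420, "lpa_min": 150, "mvpa_min": 30}
    day2 = {"wear_min": 14 * 60, "sed_min": 600, "lpa_min": 190, "mvpa_min": 50}
    print(standardize(day1), standardize(day2))   # both per 13 h of wear
    ```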

  2. Minimizing effects of methodological decisions on interpretation and prediction in species distribution studies: An example with background selection

    USGS Publications Warehouse

    Jarnevich, Catherine S.; Talbert, Marian; Morisette, Jeffrey T.; Aldridge, Cameron; Brown, Cynthia; Kumar, Sunil; Manier, Daniel; Talbert, Colin; Holcombe, Tracy R.

    2017-01-01

    Evaluating the conditions where a species can persist is an important question in ecology both to understand tolerances of organisms and to predict distributions across landscapes. Presence data combined with background or pseudo-absence locations are commonly used with species distribution modeling to develop these relationships. However, there is not a standard method to generate background or pseudo-absence locations, and method choice affects model outcomes. We evaluated combinations of both model algorithms (simple and complex generalized linear models, multivariate adaptive regression splines, Maxent, boosted regression trees, and random forest) and background methods (random, minimum convex polygon, and continuous and binary kernel density estimator (KDE)) to assess the sensitivity of model outcomes to choices made. We evaluated six questions related to model results, including five beyond the common comparison of model accuracy assessment metrics (biological interpretability of response curves, cross-validation robustness, independent data accuracy and robustness, and prediction consistency). For our case study with cheatgrass in the western US, random forest was least sensitive to background choice and the binary KDE method was least sensitive to model algorithm choice. While this outcome may not hold for other locations or species, the methods we used can be implemented to help determine appropriate methodologies for particular research questions.

  3. Minimal Pairs: Minimal Importance?

    ERIC Educational Resources Information Center

    Brown, Adam

    1995-01-01

    This article argues that minimal pairs do not merit as much attention as they receive in pronunciation instruction. There are other aspects of pronunciation that are of greater importance, and there are other ways of teaching vowel and consonant pronunciation. (13 references) (VWL)

  4. Endovascular treatment for Small Core and Anterior circulation Proximal occlusion with Emphasis on minimizing CT to recanalization times (ESCAPE) trial: methodology.

    PubMed

    Demchuk, Andrew M; Goyal, Mayank; Menon, Bijoy K; Eesa, Muneer; Ryckborst, Karla J; Kamal, Noreen; Patil, Shivanand; Mishra, Sachin; Almekhlafi, Mohammed; Randhawa, Privia A; Roy, Daniel; Willinsky, Robert; Montanera, Walter; Silver, Frank L; Shuaib, Ashfaq; Rempel, Jeremy; Jovin, Tudor; Frei, Donald; Sapkota, Biggya; Thornton, J Michael; Poppe, Alexandre; Tampieri, Donatella; Lum, Cheemun; Weill, Alain; Sajobi, Tolulope T; Hill, Michael D

    2015-04-01

    ESCAPE is a prospective, multicenter, randomized clinical trial that will enroll subjects with the following main inclusion criteria: less than 12 h from symptom onset, age >18, baseline NIHSS >5, ASPECTS score >5, CTA evidence of carotid T/L or M1 segment MCA occlusion, and at least moderate collaterals by CTA. The trial will determine whether endovascular treatment results in higher rates of favorable outcome compared with standard medical therapy alone. Eligible patient populations include those receiving IV tPA, tPA-ineligible patients, and unwitnessed-onset or wake-up strokes within 12 h of last seen normal. The primary end-point, based on intention-to-treat criteria, is the distribution of modified Rankin Scale scores at 90 days assessed using a proportional odds model. The projected maximum sample size is 500 subjects. Randomization is stratified under a minimization process using age, gender, baseline NIHSS, baseline ASPECTS (8-10 vs. 6-7), IV tPA treatment, and occlusion location (ICA vs. MCA) as covariates. The study will have one formal interim analysis after 300 subjects have been accrued. Secondary end-points at 90 days include the following: mRS 0-1; mRS 0-2; Barthel 95-100; EuroQOL; and a cognitive battery. Safety outcomes are symptomatic ICH, major bleeding, contrast nephropathy, total radiation dose, malignant MCA infarction, hemicraniectomy, and mortality at 90 days.

  5. Up-cycling waste glass to minimal water adsorption/absorption lightweight aggregate by rapid low temperature sintering: optimization by dual process-mixture response surface methodology.

    PubMed

    Velis, Costas A; Franco-Salinas, Claudia; O'Sullivan, Catherine; Najorka, Jens; Boccaccini, Aldo R; Cheeseman, Christopher R

    2014-07-01

    Mixed color waste glass extracted from municipal solid waste is either not recycled, in which case it is an environmental and financial liability, or it is used in relatively low value applications such as normal weight aggregate. Here, we report on converting it into a novel glass-ceramic lightweight aggregate (LWA), potentially suitable for high added value applications in structural concrete (upcycling). The artificial LWA particles were formed by rapidly sintering (<10 min) waste glass powder with clay mixes using sodium silicate as binder and borate salt as flux. Composition and processing were optimized using response surface methodology (RSM) modeling, and specifically (i) a combined process-mixture dual RSM, and (ii) multiobjective optimization functions. The optimization considered raw materials and energy costs. Mineralogical and physical transformations occur during sintering and a cellular vesicular glass-ceramic composite microstructure is formed, with strong correlations existing between bloating/shrinkage during sintering, density and water adsorption/absorption. The diametrical expansion could be effectively modeled via the RSM and controlled to meet a wide range of specifications; here we optimized for LWA structural concrete. The optimally designed LWA is sintered at comparatively low temperatures (825-835 °C), thus potentially saving costs and lowering emissions; it had exceptionally low water adsorption/absorption (6.1-7.2% w/wd; optimization target: 1.5-7.5% w/wd); while remaining substantially lightweight (density: 1.24-1.28 g·cm⁻³; target: 0.9-1.3 g·cm⁻³). This is a considerable advancement for designing effective environmentally friendly lightweight concrete constructions, and boosting resource efficiency of waste glass flows.

  6. Minimal Reduplication

    ERIC Educational Resources Information Center

    Kirchner, Jesse Saba

    2010-01-01

    This dissertation introduces Minimal Reduplication, a new theory and framework within generative grammar for analyzing reduplication in human language. I argue that reduplication is an emergent property in multiple components of the grammar. In particular, reduplication occurs independently in the phonology and syntax components, and in both cases…

  7. Minimal cosmography

    NASA Astrophysics Data System (ADS)

    Piazza, Federico; Schücker, Thomas

    2016-04-01

    The minimal requirement for cosmography—a non-dynamical description of the universe—is a prescription for calculating null geodesics, and time-like geodesics as a function of their proper time. In this paper, we consider the most general linear connection compatible with homogeneity and isotropy, but not necessarily with a metric. A light-cone structure is assigned by choosing a set of geodesics representing light rays. This defines a "scale factor" and a local notion of distance, as that travelled by light in a given proper time interval. We find that the velocities and relativistic energies of free-falling bodies decrease in time as a consequence of cosmic expansion, but at a rate that can be different than that dictated by the usual metric framework. By extrapolating this behavior to photons' redshift, we find that the latter is in principle independent of the "scale factor". Interestingly, redshift-distance relations and other standard geometric observables are modified in this extended framework, in a way that could be experimentally tested. An extremely tight constraint on the model, however, is represented by the blackbody-ness of the cosmic microwave background. Finally, as a check, we also consider the effects of a non-metric connection in a different set-up, namely, that of a static, spherically symmetric spacetime.

  8. Esophagectomy - minimally invasive

    MedlinePlus

    Minimally invasive esophagectomy; Robotic esophagectomy; Removal of the esophagus - minimally invasive; Achalasia - esophagectomy; Barrett esophagus - esophagectomy; Esophageal cancer - esophagectomy - laparoscopic; Cancer of the ...

  9. Methodological Gravitism

    ERIC Educational Resources Information Center

    Zaman, Muhammad

    2011-01-01

    In this paper the author presents the case of the exchange marriage system to delineate a model of methodological gravitism. Such a model is not a deviation from or alteration to the existing qualitative research approaches. I have adopted culturally specific methodology to investigate spouse selection in line with the Grounded Theory Method. This…

  10. Carotene-rich plant foods ingested with minimal dietary fat enhance the total-body vitamin A pool size in Filipino schoolchildren as assessed by stable-isotope-dilution methodology.

    PubMed

    Ribaya-Mercado, Judy D; Maramag, Cherry C; Tengco, Lorena W; Dolnikowski, Gregory G; Blumberg, Jeffrey B; Solon, Florentino S

    2007-04-01

    Strategies for improving the vitamin A status of vulnerable populations are needed. We studied the influence of the amounts of dietary fat on the effectiveness of carotene-rich plant foods in improving vitamin A status. Schoolchildren aged 9-12 y were fed standardized meals 3 times/d, 5 d/wk, for 9 wk. The meals provided 4.2 mg provitamin A carotenoids/d (mainly beta-carotene) from yellow and green leafy vegetables [carrots, pechay (bok choy), squash, and kangkong (swamp cabbage)] and 7, 15, or 29 g fat/d (2.4, 5, or 10 g fat/meal) in groups A, B, and C (n = 39, 39, and 38, respectively). Other self-selected foods eaten were recorded daily. Before and after the intervention, total-body vitamin A pool sizes and liver vitamin A concentrations were measured with the deuterated-retinol-dilution method; serum retinol and carotenoid concentrations were measured by HPLC. Similar increases in mean serum beta-carotene (5-fold), alpha-carotene (19-fold), and beta-cryptoxanthin (2-fold) concentrations; total-body vitamin A pool size (2-fold); and liver vitamin A (2-fold) concentrations were observed after 9 wk in the 3 study groups; mean serum retinol concentrations did not change significantly. The total daily beta-carotene intake from study meals plus self-selected foods was similar between the 3 groups and was 14 times the usual intake; total fat intake was 0.9, 1.4, or 2.0 times the usual intake in groups A, B, and C, respectively. The overall prevalence of low liver vitamin A (<0.07 mumol/g) decreased from 35% to 7%. Carotene-rich yellow and green leafy vegetables, when ingested with minimal fat, enhance serum carotenoids and the total-body vitamin A pool size and can restore low liver vitamin A concentrations to normal concentrations.

  11. Minimal covering problem and PLA minimization

    SciTech Connect

    Young, M.H.; Muroga, S.

    1985-12-01

    Solving the minimal covering problem by an implicit enumeration method is discussed. The implicit enumeration method in this paper is a modification of the Quine-McCluskey method tailored to computer processing, together with an extension that utilizes some new properties of the minimal covering problem for speedup. A heuristic algorithm is also presented to solve large-scale problems. Its application to the minimization of programmable logic arrays (i.e., PLAs) is shown as an example. Computational experience is presented to confirm the improvements achieved by the implicit enumeration method discussed.
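
    The covering problem itself is compact to state; the exhaustive sketch below (implicant and minterm labels invented) shows what the implicit enumeration method solves cleverly rather than by brute force.

    ```python
    from itertools import combinations

    # Prime implicants and the ON-set minterms each one covers
    cover = {"P1": {0, 1}, "P2": {1, 5}, "P3": {0, 4},
             "P4": {4, 5}, "P5": {1, 4}}
    minterms = {0, 1, 4, 5}

    def minimal_cover(cover, minterms):
        names = sorted(cover)
        for size in range(1, len(names) + 1):   # smallest covers first
            for combo in combinations(names, size):
                if set().union(*(cover[p] for p in combo)) >= minterms:
                    return combo                # first hit has minimum size
        return None

    print(minimal_cover(cover, minterms))       # ('P1', 'P4')
    ```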

  12. Better Hyper-minimization

    NASA Astrophysics Data System (ADS)

    Maletti, Andreas

    Hyper-minimization aims to compute a minimal deterministic finite automaton (dfa) that recognizes the same language as a given dfa up to a finite number of errors. Algorithms for hyper-minimization that run in time O(n log n), where n is the number of states of the given dfa, have been reported recently in [Gawrychowski and Jeż: Hyper-minimisation made efficient. Proc. MFCS, LNCS 5734, 2009] and [Holzer and Maletti: An n log n algorithm for hyper-minimizing a (minimized) deterministic automaton. Theor. Comput. Sci. 411, 2010]. These algorithms are improved to return a hyper-minimal dfa that commits the least number of errors. This closes another open problem of [Badr, Geffert, and Shipman: Hyper-minimizing minimized deterministic finite state automata. RAIRO Theor. Inf. Appl. 43, 2009]. Unfortunately, the time complexity for the obtained algorithm increases to O(n^2).

  13. Increasingly minimal bias routing

    DOEpatents

    Bataineh, Abdulla; Court, Thomas; Roweth, Duncan

    2017-02-21

    A system and algorithm configured to generate diversity at the traffic source so that packets are uniformly distributed over all of the available paths, but to increase the likelihood of taking a minimal path with each hop the packet takes. This is achieved by configuring routing biases so as to prefer non-minimal paths at the injection point, but increasingly prefer minimal paths as the packet proceeds, referred to herein as Increasing Minimal Bias (IMB).
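
    The stated behavior translates into a per-hop bias schedule; the probabilities below are invented, and real implementations encode the bias in routing tables rather than in a coin flip.

    ```python
    import random

    def choose_path(hop_count, initial_minimal_bias=0.25, bias_step=0.25):
        """Prefer non-minimal paths at injection, minimal paths as hops grow."""
        p_minimal = min(1.0, initial_minimal_bias + bias_step * hop_count)
        return "minimal" if random.random() < p_minimal else "non-minimal"

    for hop in range(5):
        picks = sum(choose_path(hop) == "minimal" for _ in range(10000))
        print(hop, picks / 10000)   # fraction rises toward 1.0 with each hop
    ```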

  14. Regional Shelter Analysis Methodology

    SciTech Connect

    Dillon, Michael B.; Dennison, Deborah; Kane, Jave; Walker, Hoyt; Miller, Paul

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
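
    The combination step reduces to a population-weighted average of location-specific protection; every number below is invented for illustration.

    ```python
    # Fraction of people at each location and the protection factor it affords
    population_fraction = {"wood_frame_home": 0.55, "concrete_office": 0.30,
                           "outdoors": 0.15}
    protection_factor = {"wood_frame_home": 3.0, "concrete_office": 20.0,
                         "outdoors": 1.0}

    outdoor_dose = 10.0   # hypothetical unsheltered fallout dose (arbitrary units)
    mean_dose = sum(frac * outdoor_dose / protection_factor[loc]
                    for loc, frac in population_fraction.items())
    print(round(mean_dose, 2))   # 3.48: population-averaged sheltered dose
    ```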

  15. Probabilistic inspection strategies for minimizing service failures

    NASA Astrophysics Data System (ADS)

    Brot, Abraham

    1994-09-01

    The INSIM computer program is described, which simulates the 'limited fatigue life' environment in which aircraft structures generally operate. The use of INSIM to develop inspection strategies that aim to minimize service failures is demonstrated. Damage-tolerance methodology, inspection thresholds, and customized inspections are simulated using the probability of failure as the driving parameter.

  16. Minimizing Classroom Interruptions.

    ERIC Educational Resources Information Center

    Partin, Ronald L.

    1987-01-01

    Offers suggestions for minimizing classroom interruptions, such as suggesting to the principal that announcements not be read over the intercom during class time and arranging desks and chairs so as to minimize visual distractions. Contains a school interruption survey form. (JC)

  17. Minimal Orderings Revisited

    SciTech Connect

    Peyton, B.W.

    1999-07-01

    When minimum orderings proved too difficult to deal with, Rose, Tarjan, and Lueker instead studied minimal orderings and how to compute them (Algorithmic aspects of vertex elimination on graphs, SIAM J. Comput., 5:266-283, 1976). This paper introduces an algorithm that is capable of computing much better minimal orderings much more efficiently than the algorithm in Rose et al. The new insight is a way to use certain structures and concepts from modern sparse Cholesky solvers to re-express one of the basic results in Rose et al. The new algorithm begins with any initial ordering and then refines it until a minimal ordering is obtained. It is simple to obtain high-quality low-cost minimal orderings by using fill-reducing heuristic orderings as initial orderings for the algorithm. We examine several such initial orderings in some detail.

  18. Minimally invasive stomas.

    PubMed

    Hellinger, Michael D; Al Haddad, Abdullah

    2008-02-01

    Traditionally, stoma creation and end stoma reversal have been performed via a laparotomy incision. However, in many situations stoma construction may be safely performed in a minimally invasive manner. This may include a trephine, laparoscopic, or combined approach. Furthermore, Hartmann's colostomy reversal, a procedure traditionally associated with substantial morbidity, may also be performed laparoscopically. The authors briefly review patient selection, preparation, and indications, and focus primarily on surgical techniques and results of minimally invasive stoma creation and Hartmann's reversal.

  19. Minimally invasive lumbar foraminotomy.

    PubMed

    Deutsch, Harel

    2013-07-01

    Lumbar radiculopathy is a common problem. Nerve root compression can occur at different places along a nerve root's course including in the foramina. Minimal invasive approaches allow easier exposure of the lateral foramina and decompression of the nerve root in the foramina. This video demonstrates a minimally invasive approach to decompress the lumbar nerve root in the foramina with a lateral to medial decompression. The video can be found here: http://youtu.be/jqa61HSpzIA.

  1. Methodology for assessing systems materials requirements

    SciTech Connect

    Culver, D.H.; Teeter, R.R.; Jamieson, W.M.

    1980-01-01

    A potential stumbling block to new system planning and design is imprecise, confusing, or contradictory data regarding materials - their availability and costs. A methodology is now available that removes this barrier by minimizing uncertainties regarding materials availability. Using this methodology, a planner can assess materials requirements more quickly, at lower cost, and with much greater confidence in the results. Although developed specifically for energy systems, the methodology has much broader potential application. This methodology and examples of its use are discussed.

  2. Minimally invasive procedures

    PubMed Central

    Baltayiannis, Nikolaos; Michail, Chandrinos; Lazaridis, George; Anagnostopoulos, Dimitrios; Baka, Sofia; Mpoukovinas, Ioannis; Karavasilis, Vasilis; Lampaki, Sofia; Papaiwannou, Antonis; Karavergou, Anastasia; Kioumis, Ioannis; Pitsiou, Georgia; Katsikogiannis, Nikolaos; Tsakiridis, Kosmas; Rapti, Aggeliki; Trakada, Georgia; Zissimopoulos, Athanasios; Zarogoulidis, Konstantinos

    2015-01-01

    Minimally invasive procedures, which include laparoscopic surgery, use state-of-the-art technology to reduce the damage to human tissue when performing surgery. Minimally invasive procedures require small "ports" through which the surgeon inserts thin tubes called trocars. Carbon dioxide gas may be used to inflate the area, creating a space between the internal organs and the skin. Then a miniature camera (usually a laparoscope or endoscope) is placed through one of the trocars so the surgical team can view the procedure as a magnified image on video monitors in the operating room. Specialized equipment is inserted through the trocars based on the type of surgery. Some advanced minimally invasive surgical procedures can be performed almost exclusively through a single point of entry, meaning only one small incision, as in "uniport" video-assisted thoracoscopic surgery (VATS). Not only do these procedures usually provide outcomes equivalent to traditional "open" surgery (which sometimes requires a large incision), but minimally invasive procedures (using small incisions) may offer significant benefits as well: (I) faster recovery; (II) fewer days in hospital; (III) less scarring; and (IV) less pain. In this mini review we present the minimally invasive procedures for thoracic surgery. PMID:25861610

  3. Minimal gaugino mediation

    SciTech Connect

    Schmaltz, Martin; Skiba, Witold

    2000-11-01

    We propose minimal gaugino mediation as the simplest known solution to the supersymmetric flavor and CP problems. The framework predicts a very minimal structure for the soft parameters at ultrahigh energies: gaugino masses are unified and non-vanishing whereas all other soft supersymmetry breaking parameters vanish. We show that this boundary condition naturally arises from a small extra dimension and present a complete model which includes a new extra-dimensional solution to the μ problem. We briefly discuss the predicted superpartner spectrum as a function of the two parameters of the model. The commonly ignored renormalization group evolution above the GUT scale is crucial to the viability of minimal gaugino mediation but does not introduce new model dependence.

  4. Minimal Gaugino Mediation

    SciTech Connect

    Schmaltz, M.

    2000-01-19

    The authors propose Minimal Gaugino Mediation as the simplest known solution to the supersymmetric flavor and CP problems. The framework predicts a very minimal structure for the soft parameters at ultra-high energies: gaugino masses are unified and non-vanishing whereas all other soft supersymmetry breaking parameters vanish. The authors show that this boundary condition naturally arises from a small extra dimension and present a complete model which includes a new extra-dimensional solution to the mu problem. The authors briefly discuss the predicted superpartner spectrum as a function of the two parameters of the model. The commonly ignored renormalization group evolution above the GUT scale is crucial to the viability of Minimal Gaugino Mediation but does not introduce new model dependence.

  5. Approximate fault-tree analysis without cut sets

    NASA Astrophysics Data System (ADS)

    Schneeweiss, Winfrid G.

    It is shown that a rather efficient approximate fault tree analysis is possible on the basis of the Shannon decomposition. The main advantages are: (1) no preprocessing is necessary to determine all the mincuts; (2) the maximum error can be prespecified; and (3) noncoherent systems and systems with dependent component states can be treated. The main disadvantage is the fact that the cutting off of certain subtrees of the decomposition tree (for upper bound results) may need some trial and error test calculations.
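
    A minimal sketch of the pivotal (Shannon) decomposition on an invented two-gate tree, assuming independent components and no subtree truncation: phi = x_i * phi|x_i=1 + (1 - x_i) * phi|x_i=0, applied recursively over the basic events, with no cut-set preprocessing.

    ```python
    p = {"A": 0.01, "B": 0.02, "C": 0.05}   # basic event probabilities (invented)

    def phi(state):
        """Structure function of a toy tree: TOP = A OR (B AND C)."""
        return state["A"] or (state["B"] and state["C"])

    def top_probability(names, state=None):
        state = state or {}
        if not names:
            return 1.0 if phi(state) else 0.0
        x, rest = names[0], names[1:]
        return (p[x] * top_probability(rest, {**state, x: True})
                + (1 - p[x]) * top_probability(rest, {**state, x: False}))

    exact = p["A"] + p["B"] * p["C"] - p["A"] * p["B"] * p["C"]
    print(top_probability(list(p)), exact)   # both 0.01099, no cut sets needed
    ```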

  6. Testing methodologies

    SciTech Connect

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical "signal-to-noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational, or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.

  7. Periodic minimal surfaces

    NASA Astrophysics Data System (ADS)

    Mackay, Alan L.

    1985-04-01

    A minimal surface is one for which, like a soap film with the same pressure on each side, the mean curvature is zero and, thus, is one where the two principal curvatures are equal and opposite at every point. For every closed circuit in the surface, the area is a minimum. Schwarz [1] and Neovius [2] showed that elements of such surfaces could be put together to give surfaces periodic in three dimensions. These periodic minimal surfaces are geometrical invariants, as are the regular polyhedra, but the former are curved. Minimal surfaces are appropriate for the description of various structures where internal surfaces are prominent and seek to adopt a minimum area or a zero mean curvature subject to their topology; thus they merit more complete numerical characterization. There seem to be at least 18 such surfaces [3], with various symmetries and topologies, related to the crystallographic space groups. Recently, glyceryl mono-oleate (GMO) was shown by Longley and McIntosh [4] to take the shape of the F-surface. The structure postulated is shown here to be in good agreement with an analysis of the fundamental geometry of periodic minimal surfaces.
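
    As a concrete statement of the zero-mean-curvature condition above: a graph z = f(x, y) is a minimal surface exactly when (1 + f_y^2) f_xx - 2 f_x f_y f_xy + (1 + f_x^2) f_yy = 0. The short sympy check below, added here for illustration, verifies this for Scherk's classical doubly periodic surface f = ln(cos y / cos x):

        import sympy as sp

        x, y = sp.symbols('x y', real=True)
        f = sp.log(sp.cos(y) / sp.cos(x))   # Scherk's doubly periodic surface
        fx, fy = sp.diff(f, x), sp.diff(f, y)
        fxx, fyy, fxy = sp.diff(f, x, 2), sp.diff(f, y, 2), sp.diff(fx, y)

        # Mean curvature of a graph vanishes iff this numerator vanishes
        H_num = (1 + fy**2) * fxx - 2 * fx * fy * fxy + (1 + fx**2) * fyy
        print(sp.simplify(H_num))           # 0, so the surface is minimal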

  8. CONMIN- CONSTRAINED FUNCTION MINIMIZATION

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.

    1994-01-01

    In many mathematical problems, it is necessary to determine the minimum and maximum of a function of several variables, limited by various linear and nonlinear inequality constraints. It is seldom possible, in practical applications, to solve these problems directly. In most cases, an iterative method must be used to numerically obtain a solution. The CONMIN program was developed to numerically perform the minimization of a multi-variable function subject to a set of inequality constraints. The function need not be a simple analytical equation; it may be any function which can be numerically evaluated. The basic analytic technique used by CONMIN is to minimize the function until one or more of the constraints become active. The minimization process then continues by following the constraint boundaries in a direction such that the value of the function continues to decrease. When a point is reached where no further decrease in the function can be obtained, the process is terminated. Function maximization may be achieved by minimizing the negative of the function. This program is written in FORTRAN IV for batch execution and has been implemented on a CDC 6000 series computer with a central memory requirement of approximately 43K (octal) of 60 bit words. The CONMIN program was originally developed in 1973 and last updated in 1978.
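
    CONMIN itself is a FORTRAN IV code; as a modern illustration of the same task, minimizing a numerically evaluated function subject to inequality constraints, the sketch below uses scipy's SLSQP method, which likewise follows active constraint boundaries. The objective and constraints are invented for the example.

        import numpy as np
        from scipy.optimize import minimize

        def objective(x):                   # any numerically evaluable function
            return (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2

        constraints = [
            {'type': 'ineq', 'fun': lambda x: 1.0 - x[0] - x[1]},  # x0 + x1 <= 1
            {'type': 'ineq', 'fun': lambda x: x[0]},               # x0 >= 0
        ]

        res = minimize(objective, x0=np.zeros(2), method='SLSQP',
                       constraints=constraints)
        print(res.x, res.fun)               # constrained minimum at (2, -1)

        # Maximization, as the abstract notes, is minimization of the negative.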

  9. Minimally invasive valve surgery.

    PubMed

    Woo, Y Joseph

    2009-08-01

    Traditional cardiac valve replacement surgery is being rapidly supplanted by innovative, minimally invasive approaches toward the repair of these valves. Patients are experiencing benefits ranging from less bleeding and pain to faster recovery and greater satisfaction. These operations are proving to be safe, highly effective, and durable, and their use will likely continue to increase and become even more widely applicable.

  10. Minimal hepatic encephalopathy.

    PubMed

    Zamora Nava, Luis Eduardo; Torre Delgadillo, Aldo

    2011-06-01

    The term minimal hepatic encephalopathy (MHE) refers to the subtle changes in cognitive function, electrophysiological parameters, cerebral neurochemical/neurotransmitter homeostasis, cerebral blood flow, metabolism, and fluid homeostasis that can be observed in patients with cirrhosis who have no clinical evidence of hepatic encephalopathy; the prevalence is as high as 84% in patients with hepatic cirrhosis. Physicians generally do not perceive this complication of cirrhosis; the diagnosis can be made only with neuropsychological tests and other special measurements, such as evoked potentials, and imaging studies such as positron emission tomography. Diagnosis of minimal hepatic encephalopathy may have prognostic and therapeutic implications in cirrhotic patients. The present review explores the clinical, therapeutic, diagnostic, and prognostic aspects of this complication.

  11. Discrete Minimal Surface Algebras

    NASA Astrophysics Data System (ADS)

    Arnlind, Joakim; Hoppe, Jens

    2010-05-01

    We consider discrete minimal surface algebras (DMSA) as generalized noncommutative analogues of minimal surfaces in higher dimensional spheres. These algebras appear naturally in membrane theory, where sequences of their representations are used as a regularization. After showing that the defining relations of the algebra are consistent, and that one can compute a basis of the enveloping algebra, we give several explicit examples of DMSAs in terms of subsets of sl_n (any semi-simple Lie algebra providing a trivial example by itself). A special class of DMSAs are Yang-Mills algebras. The representation graph is introduced to study representations of DMSAs of dimension d ≤ 4, and properties of representations are related to properties of graphs. The representation graph of a tensor product is (generically) the Cartesian product of the corresponding graphs. We provide explicit examples of irreducible representations and, for coinciding eigenvalues, classify all the unitary representations of the corresponding algebras.

  12. Minimally symmetric Higgs boson

    SciTech Connect

    Low, Ian

    2015-06-17

    Models addressing the naturalness of a light Higgs boson typically employ symmetries, either bosonic or fermionic, to stabilize the Higgs mass. We consider a setup with the minimal amount of symmetries: four shift symmetries acting on the four components of the Higgs doublet, subject to the constraints of linearly realized SU(2)_L × U(1)_Y electroweak symmetry. Up to terms that explicitly violate the shift symmetries, the effective Lagrangian can be derived, irrespective of the spontaneously broken group G in the ultraviolet, and is universal among all models where the Higgs arises as a pseudo-Nambu-Goldstone boson. Very high energy scatterings of vector bosons could provide smoking gun signals of a minimally symmetric Higgs boson.

  13. Waste Minimization Crosscut Plan

    SciTech Connect

    Not Available

    1992-05-13

    On November 27, 1991, the Secretary of Energy directed that a Department of Energy (DOE) crosscut plan for waste minimization (WMin) be prepared and submitted by March 1, 1992. This Waste Minimization Crosscut Plan responds to the Secretary's direction and supports the National Energy Strategy (NES) goals of achieving greater energy security, increasing energy and economic efficiency, and enhancing environmental quality. It provides a DOE-wide planning framework for effective coordination of all DOE WMin activities. This Plan was jointly prepared by the following Program Secretarial Officer (PSO) organizations: Civilian Radioactive Waste Management (RW); Conservation and Renewable Energy (CE); Defense Programs (DP); Environmental Restoration and Waste Management (EM), lead; Energy Research (ER); Fossil Energy (FE); Nuclear Energy (NE); and New Production Reactors (NP). Assistance and guidance was provided by the offices of Policy, Planning, and Analysis (PE) and Environment, Safety and Health (EH). Comprehensive application of waste minimization within the Department and in both the public and private sectors will provide significant benefits and support National Energy Strategy goals. These benefits include conservation of a substantial proportion of the energy now used by industry and Government, improved environmental quality, reduced health risks, improved production efficiencies, and longer useful life of disposal capacity. Taken together, these benefits will mean improved US global competitiveness, expanded job opportunities, and a better quality of life for all citizens.

  15. Minimal surfaces over stars

    NASA Astrophysics Data System (ADS)

    McDougall, Jane; Schaubroeck, Lisbeth

    2008-04-01

    A JS surface is a minimal graph over a polygonal domain that becomes infinite in magnitude at the domain boundary. Jenkins and Serrin characterized the existence of these minimal graphs in terms of the signs of the boundary values and the side-lengths of the polygon. For a convex polygon, there can be essentially only one JS surface, but a non-convex domain may admit several distinct JS surfaces. We consider two families of JS surfaces corresponding to different boundary values, namely JS0 and JS1, over domains in the form of regular stars. We give parameterizations for these surfaces as lifts of harmonic maps, and observe that all previously constructed JS surfaces have been of type JS0. We give an example of a JS1 surface that is a new complete embedded minimal surface generalizing Scherk's doubly periodic surface, and show also that the JS0 surface over a regular convex 2n-gon is the limit of JS1 surfaces over non-convex stars. Finally we consider the construction of other JS surfaces over stars that belong neither to JS0 nor to JS1.

  16. [Minimally invasive thymus surgery].

    PubMed

    Rückert, J C; Ismail, M; Swierzy, M; Braumann, C; Badakhshi, H; Rogalla, P; Meisel, A; Rückert, R I; Müller, J M

    2008-01-01

    There are absolute and relative indications for complete removal of the thymus gland. In the complex therapy of autoimmune-related myasthenia gravis, thymectomy plays a central role and is performed with relative indication. In case of thymoma with or without myasthenia, thymectomy is absolutely indicated. Thymus resection is further necessary for cases of hyperparathyroidism with ectopic intrathymic parathyroids or with certain forms of multiple endocrine neoplasia. The transcervical operation technique traditionally reflected the well-founded desire for minimal invasiveness for thymectomy. Due to the requirement of radicality however, most of these operations were performed using sternotomy. With the evolution of therapeutic thoracoscopy in thoracic surgery, several pure or extended minimally invasive operation techniques for thymectomy have been developed. At present uni- or bilateral, subxiphoid, and modified transcervical single or combination thoracoscopic techniques are in use. Recently a very precise new level of thoracoscopic operation technique was developed using robotic-assisted surgery. There are special advantages of this technique for thymectomy. An overview of the development and experiences with minimally invasive thymectomy is presented, including data from the largest series published so far.

  17. Minimally invasive mediastinal surgery

    PubMed Central

    Melfi, Franca M. A.; Mussi, Alfredo

    2016-01-01

    In the past, mediastinal surgery was associated with the necessity of a maximum exposure, which was accomplished through various approaches. In the early 1990s, many surgical fields, including thoracic surgery, observed the development of minimally invasive techniques. These included video-assisted thoracic surgery (VATS), which confers clear advantages over an open approach, such as less trauma, short hospital stay, increased cosmetic results and preservation of lung function. However, VATS is associated with several disadvantages. For this reason, it is not routinely performed for resection of mediastinal mass lesions, especially those located in the anterior mediastinum, a tiny and remote space that contains vital structures at risk of injury. Robotic systems can overcome the limits of VATS, offering three-dimensional (3D) vision and wristed instrumentations, and are being increasingly used. With regard to thymectomy for myasthenia gravis (MG), unilateral and bilateral VATS approaches have demonstrated good long-term neurologic results with low complication rates. Nevertheless, some authors still advocate the necessity of maximum exposure, especially when considering the distribution of normal and ectopic thymic tissue. In recent studies, the robotic approach has shown to provide similar neurological outcomes when compared to transsternal and VATS approaches, and is associated with a low morbidity. Importantly, through a unilateral robotic technique, it is possible to dissect and remove at least the same amount of mediastinal fat tissue. Preliminary results on early-stage thymomatous disease indicated that minimally invasive approaches are safe and feasible, with a low rate of pleural recurrence, underlining the necessity of a “no-touch” technique. However, especially for thymomatous disease characterized by an indolent nature, further studies with a long follow-up period are necessary in order to assess oncologic and neurologic results of minimally invasive approaches.

  18. Minimally refined biomass fuel

    DOEpatents

    Pearson, Richard K.; Hirschfeld, Tomas B.

    1984-01-01

    A minimally refined fluid composition, suitable as a fuel mixture and derived from biomass material, is comprised of one or more water-soluble carbohydrates such as sucrose, one or more alcohols having less than four carbons, and water. The carbohydrate provides the fuel source; water solubilizes the carbohydrates; and the alcohol aids in the combustion of the carbohydrate and reduces the vicosity of the carbohydrate/water solution. Because less energy is required to obtain the carbohydrate from the raw biomass than alcohol, an overall energy savings is realized compared to fuels employing alcohol as the primary fuel.

  19. Wake Vortex Minimization

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A status report is presented on research directed at reducing the vortex disturbances of aircraft wakes. The objective of such a reduction is to minimize the hazard to smaller aircraft that might encounter these wakes. Inviscid modeling was used to study trailing vortices, and viscous effects were investigated. Laser velocimeters were utilized in the measurement of aircraft wakes. Flight and wind tunnel tests were performed on scale-model and full-scale aircraft of various designs. Parameters investigated included the effect of wing span, wing flaps, spoilers, splines, and engine thrust on vortex attenuation. Results indicate that vortices may be alleviated through aerodynamic means.

  20. The ZOOM minimization package

    SciTech Connect

    Fischler, Mark S.; Sachs, D.; /Fermilab

    2004-11-01

    A new object-oriented Minimization package is available for distribution in the same manner as CLHEP. This package, designed for use in HEP applications, has all the capabilities of Minuit, but is a re-write from scratch, adhering to modern C++ design principles. A primary goal of this package is extensibility in several directions, so that its capabilities can be kept fresh with as little maintenance effort as possible. This package is distinguished by the priority that was assigned to C++ design issues, and the focus on producing an extensible system that will resist becoming obsolete.

  1. Logarithmic superconformal minimal models

    NASA Astrophysics Data System (ADS)

    Pearce, Paul A.; Rasmussen, Jørgen; Tartaglia, Elena

    2014-05-01

    The higher fusion level logarithmic minimal models LM(P,P′;n) have recently been constructed as the diagonal GKO cosets (A_1^{(1)})_k ⊕ (A_1^{(1)})_n / (A_1^{(1)})_{k+n}, where n ≥ 1 is an integer fusion level and k = nP/(P′−P) − 2 is a fractional level. For n = 1, these are the well-studied logarithmic minimal models LM(P,P′) ≡ LM(P,P′;1). For n ≥ 2, we argue that these critical theories are realized on the lattice by n × n fusion of the n = 1 models. We study the critical fused lattice models LM(p,p′)_{n×n} within a lattice approach and focus our study on the n = 2 models. We call these logarithmic superconformal minimal models LSM(p,p′) ≡ LM(P,P′;2), where P = |2p − p′|, P′ = p′ and p, p′ are coprime. These models share the central charges c = c^{P,P′;2} = (3/2)(1 − 2(P′−P)²/(PP′)) of the rational superconformal minimal models SM(P,P′). Lattice realizations of these theories are constructed by fusing 2 × 2 blocks of the elementary face operators of the n = 1 logarithmic minimal models LM(p,p′). Algebraically, this entails the fused planar Temperley-Lieb algebra, which is a spin-1 Birman-Murakami-Wenzl tangle algebra with loop fugacity β₂ = [x]₃ = x² + 1 + x⁻² and twist ω = x⁴, where x = e^{iλ} and λ = (p′−p)π/p′. The first two members of this n = 2 series are superconformal dense polymers LSM(2,3) with c = −5/2, β₂ = 0 and superconformal percolation LSM(3,4) with c = 0, β₂ = 1. We calculate the bulk and boundary free energies analytically. By numerically studying finite-size conformal spectra on the strip with appropriate boundary conditions, we argue that, in the continuum scaling limit, these lattice models are associated with the logarithmic superconformal models LM(P,P′;2). For system size N, we propose finitized Kac character formulae of the form q^{−c^{P,P′;2}/24 + Δ^{P,P′;2}_{r,s}}
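
    As a quick arithmetic check of the central charges quoted above (added here, assuming the formula from the abstract with P = |2p − p′| and P′ = p′):

        def central_charge(p, p_prime):
            """c^{P,P';2} = (3/2) * (1 - 2*(P' - P)**2 / (P * P'))."""
            P, Pp = abs(2 * p - p_prime), p_prime
            return 1.5 * (1 - 2 * (Pp - P) ** 2 / (P * Pp))

        print(central_charge(2, 3))   # superconformal dense polymers: -2.5
        print(central_charge(3, 4))   # superconformal percolation:     0.0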

  2. Transanal Minimally Invasive Surgery

    PubMed Central

    deBeche-Adams, Teresa; Nassif, George

    2015-01-01

    Transanal minimally invasive surgery (TAMIS) was first described in 2010 as a crossover between single-incision laparoscopic surgery and transanal endoscopic microsurgery (TEM) to allow access to the proximal and mid-rectum for resection of benign and early-stage malignant rectal lesions. The TAMIS technique can also be used for noncurative intent surgery of more advanced lesions in patients who are not candidates for radical surgery. Proper workup and staging should be done before surgical decision-making. In addition to the TAMIS port, instrumentation and set up include readily available equipment found in most operating suites. TAMIS has proven its usefulness in a wide range of applications outside of local excision, including repair of rectourethral fistula, removal of rectal foreign body, control of rectal hemorrhage, and as an adjunct in total mesorectal excision for rectal cancer. TAMIS is an easily accessible, technically feasible, and cost-effective alternative to TEM. PMID:26491410

  3. Membranes minimize liquid discharge

    SciTech Connect

    Cappos, S.

    1995-07-01

    Zero discharge is a matter of concentration. Liquid and solid waste are repeatedly reduced to minimize or eliminate their discharge. But the process is intense, requiring an array of filtering and purifying technologies to achieve discharge goals. One of the most productive and effective technologies for this purpose is reverse osmosis (RO). Developed in the 1960s, RO produces a high-quality permeate for reuse, and a small concentrated stream for further treatment. The addition of RO to a wastewater treatment system can reduce overall operating costs and the capital costs of other components, as well as reduce a waste treatment system`s reliance on chemical treatment. The paper discusses how RO works, when RO is the best solution, where the waste goes, alternative technologies (clarifiers, vapor compression evaporators, and ion exchange demineralizers), and recent advances in membrane technology.

  4. Minimally invasive esophagectomy

    PubMed Central

    Herbella, Fernando A; Patti, Marco G

    2010-01-01

    Esophageal resection is associated with a high morbidity and mortality rate. Minimally invasive esophagectomy (MIE) might theoretically decrease this rate. We reviewed the current literature on MIE, with a focus on the available techniques, outcomes and comparison with open surgery. This review shows that the available literature on MIE is still crowded with heterogeneous studies with different techniques. There are no controlled and randomized trials, and the few retrospective comparative cohort studies are limited by small numbers of patients and biased by historical controls of open surgery. Based on the available literature, there is no evidence that MIE brings clear benefits compared to conventional esophagectomy. Increasing experience and the report of larger series might change this scenario. PMID:20698044

  5. Minimally invasive valve surgery.

    PubMed

    Woo, Y Joseph; Seeburger, Joerg; Mohr, Friedrich W

    2007-01-01

    As alternatives to standard sternotomy, surgeons have developed innovative, minimally invasive approaches to conducting valve surgery. Through very small skin incisions and partial upper sternal division for aortic valve surgery and right minithoracotomy for mitral surgery, surgeons have become adept at performing complex valve procedures. Beyond cosmetic appeal, apparent benefits range from decreased pain and bleeding to improved respiratory function and recovery time. The large retrospective studies and few small prospective randomized studies are herein briefly summarized. The focus is then directed toward describing specific intraoperative technical details in current clinical use, covering anesthetic preparation, incision, mediastinal access, cardiovascular cannulation, valve exposure, and valve reconstruction. Finally, unique situations such as pulmonic valve surgery, reoperations, beating heart surgery, and robotics are discussed.

  6. Minimally invasive parathyroid surgery

    PubMed Central

    Noureldine, Salem I.; Gooi, Zhen

    2015-01-01

    Traditionally, bilateral cervical exploration for localization of all four parathyroid glands and removal of any that are grossly enlarged has been the standard surgical treatment for primary hyperparathyroidism (PHPT). With the advances in preoperative localization studies and greater public demand for less invasive procedures, novel targeted, minimally invasive techniques to the parathyroid glands have been described and practiced over the past 2 decades. Minimally invasive parathyroidectomy (MIP) can be done either through the standard Kocher incision, a smaller midline incision, with video assistance (purely endoscopic and video-assisted techniques), or through an ectopically placed, extracervical, incision. In current practice, once PHPT is diagnosed, preoperative evaluation using high-resolution radiographic imaging to localize the offending parathyroid gland is essential if MIP is to be considered. The imaging study results suggest where the surgeon should begin the focused procedure and serve as a road map to allow tailoring of an efficient, imaging-guided dissection while eliminating the unnecessary dissection of multiple glands or a bilateral exploration. Intraoperative parathyroid hormone (IOPTH) levels may be measured during the procedure, or a gamma probe used during radioguided parathyroidectomy, to ascertain that the correct gland has been excised and that no other hyperfunctional tissue is present. MIP has many advantages over the traditional bilateral, four-gland exploration. MIP can be performed using local anesthesia, requires less operative time, results in fewer complications, and offers an improved cosmetic result and greater patient satisfaction. Additional advantages of MIP are earlier hospital discharge and decreased overall associated costs. This article aims to address the considerations for accomplishing MIP, including the role of preoperative imaging studies, intraoperative adjuncts, and surgical techniques. PMID:26425454

  7. A perturbation technique for shield weight minimization

    SciTech Connect

    Watkins, E.F.; Greenspan, E. )

    1993-01-01

    The radiation shield optimization code SWAN (Ref. 1) was originally developed for minimizing the thickness of a shield that will meet a given dose (or another) constraint or for extremizing a performance parameter of interest (e.g., maximizing energy multiplication or minimizing dose) while maintaining the shield volume constraint. The SWAN optimization process proved to be highly effective (e.g., see Refs. 2, 3, and 4). The purpose of this work is to investigate the applicability of the SWAN methodology to problems in which the weight rather than the volume is the relevant shield characteristic. Such problems are encountered in shield design for space nuclear power systems. The investigation is carried out using SWAN with the coupled neutron-photon cross-section library FLUNG (Ref. 5).

  8. Minimizing Variation In Outdoor CPV Power Ratings

    NASA Astrophysics Data System (ADS)

    Muller, Matthew; Marion, Bill; Rodriguez, Jose; Kurtz, Sarah

    2011-12-01

    The CPV community has agreed to have both indoor and outdoor module power ratings. The indoor rating provides a repeatable measurement off the factory line, while the outdoor rating provides a measure of true on-sun performance. The challenge with an outdoor rating is that the conditions that impact the measurement, such as the spectrum, temperature, and wind speed, are constantly in flux. This work examines methodologies for determining the outdoor power rating with the goal of minimizing variation even if data are collected under changing meteorological conditions.
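
    One common way to realize such a rating, sketched below with invented data and coefficients rather than the authors' methodology, is to regress measured power on the meteorological drivers and then report the fitted model's prediction at fixed reference conditions:

        import numpy as np

        rng = np.random.default_rng(5)
        n = 500
        dni = rng.uniform(700, 1000, n)           # direct normal irradiance, W/m^2
        t_air = rng.uniform(10, 35, n)            # ambient temperature, C
        power = 0.25 * dni - 1.2 * (t_air - 20) + rng.normal(0, 3, n)  # toy module

        # Fit power ~ b0 + b1*DNI + b2*T_air over the varying outdoor conditions
        X = np.column_stack([np.ones(n), dni, t_air])
        beta, *_ = np.linalg.lstsq(X, power, rcond=None)

        reference = np.array([1.0, 900.0, 20.0])  # e.g. 900 W/m^2 DNI at 20 C
        print(f"rated power: {reference @ beta:.1f} W")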

  9. Minimal Marking: A Success Story

    ERIC Educational Resources Information Center

    McNeilly, Anne

    2014-01-01

    The minimal-marking project conducted in Ryerson's School of Journalism throughout 2012 and early 2013 resulted in significantly higher grammar scores in two first-year classes of minimally marked university students when compared to two traditionally marked classes. The "minimal-marking" concept (Haswell, 1983), which requires…

  10. Minimal complexity control law synthesis

    NASA Technical Reports Server (NTRS)

    Bernstein, Dennis S.; Haddad, Wassim M.; Nett, Carl N.

    1989-01-01

    A paradigm for control law design for modern engineering systems is proposed: Minimize control law complexity subject to the achievement of a specified accuracy in the face of a specified level of uncertainty. Correspondingly, the overall goal is to make progress towards the development of a control law design methodology which supports this paradigm. Researchers achieve this goal by developing a general theory of optimal constrained-structure dynamic output feedback compensation, where here constrained-structure means that the dynamic-structure (e.g., dynamic order, pole locations, zero locations, etc.) of the output feedback compensation is constrained in some way. By applying this theory in an innovative fashion, where here the indicated iteration occurs over the choice of the compensator dynamic-structure, the paradigm stated above can, in principle, be realized. The optimal constrained-structure dynamic output feedback problem is formulated in general terms. An elegant method for reducing optimal constrained-structure dynamic output feedback problems to optimal static output feedback problems is then developed. This reduction procedure makes use of star products, linear fractional transformations, and linear fractional decompositions, and yields as a byproduct a complete characterization of the class of optimal constrained-structure dynamic output feedback problems which can be reduced to optimal static output feedback problems. Issues such as operational/physical constraints, operating-point variations, and processor throughput/memory limitations are considered, and it is shown how anti-windup/bumpless transfer, gain-scheduling, and digital processor implementation can be facilitated by constraining the controller dynamic-structure in an appropriate fashion.

  11. Minimal Higgs inflation

    NASA Astrophysics Data System (ADS)

    Maity, Debaprasad

    2017-06-01

    In this paper we propose minimal Higgs inflation scenarios by non-polynomial modification of the Higgs potential. The modification is done in such a way that it creates a flat plateau for a huge range of field values at the inflationary energy scale μ ≃ λ^{1/4} α. Assuming the perturbative Higgs quartic coupling, λ ≃ O(1), our model predictions for all the cosmologically relevant quantities, (n_s, r, dn_s/dk), fit extremely well with observations made by PLANCK. For both models the inflation energy scale turned out to be μ ≃ (10^{14}, 10^{15}) GeV. Considering the observed central value of the scalar spectral index, n_s = 0.968, the models predict e-folding numbers N = (52, 47). Within a wide range of viable parameter space, we found that the prediction for the tensor-to-scalar ratio, r ≤ 10^{-5}, is far below the current experimental limit. The prediction for the running of the scalar spectral index, dn_s/dk, remains very small. We also computed the background-field-dependent unitarity scale Λ(h), which turned out to be much larger than the aforementioned inflationary energy scale.

  12. USGS Methodology for Assessing Continuous Petroleum Resources

    USGS Publications Warehouse

    Charpentier, Ronald R.; Cook, Troy A.

    2011-01-01

    The U.S. Geological Survey (USGS) has developed a new quantitative methodology for assessing resources in continuous (unconventional) petroleum deposits. Continuous petroleum resources include shale gas, coalbed gas, and other oil and gas deposits in low-permeability ("tight") reservoirs. The methodology is based on an approach combining geologic understanding with well productivities. The methodology is probabilistic, with both input and output variables as probability distributions, and uses Monte Carlo simulation to calculate the estimates. The new methodology is an improvement of previous USGS methodologies in that it better accommodates the uncertainties in undrilled or minimally drilled deposits that must be assessed using analogs. The publication is a collection of PowerPoint slides with accompanying comments.
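
    A toy sketch of the probabilistic core of such an assessment (distributions and parameters below are illustrative assumptions, not USGS figures): uncertain inputs are sampled, combined, and the output resource is reported as percentiles of the resulting distribution.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000
        # Uncertain inputs as probability distributions (illustrative only)
        wells = rng.uniform(1_000, 5_000, n)                      # productive wells
        eur = rng.lognormal(mean=np.log(0.8), sigma=0.6, size=n)  # BCF gas per well
        resource = wells * eur                                    # total resource, BCF

        # Probabilistic output, quoted at the customary fractiles
        for q in (5, 50, 95):
            print(f"F{100 - q:02d}: {np.percentile(resource, q):,.0f} BCF")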

  13. New Methodology of ENSO Forecast

    NASA Astrophysics Data System (ADS)

    Feigin, A. M.; Gavrilov, A.; Mukhin, D.; Loskutov, E.; Seleznev, A.

    2016-12-01

    We describe a methodology of ENSO forecast based on data-driven construction of the evolution operator of the underlying climate sub-system. The methodology is composed of two key algorithms: (i) space-distributed data preparation aiming to reduce data dimensionality with minimal loss of information about the system's dynamics, and (ii) construction of an operator that reproduces the evolution of the system in the reduced data space. The first algorithm combines several known data preprocessing techniques: decomposition via an empirical orthogonal function basis, its spatiotemporal generalization, as well as singular value decomposition techniques. The second algorithm constructs the evolution operator in the form of a random dynamical system realized as a nonlinear random mapping; the latter is parameterized by artificial neural networks. A general Bayesian approach is applied for mutually searching the optimal parameters of both algorithms: the optimal dimensionality of the reduced data space and the optimal complexity of the evolution operator. The abilities of the suggested methodology will be demonstrated via reproduction and forecast of different ENSO-related indexes, including a comparison of the prediction skill of the new methodology with that of other existing techniques. This research was supported by the Government of the Russian Federation (Agreement No.14.Z50.31.0033 with the Institute of Applied Physics RAS).
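
    A heavily simplified sketch of the two-step methodology, with synthetic data and a linear least-squares map standing in for the neural-network-parameterized random dynamical system:

        import numpy as np

        rng = np.random.default_rng(1)
        T, G = 400, 50                      # time steps, spatial grid points
        field = np.cumsum(rng.standard_normal((T, G)), axis=0)  # toy anomaly field

        # (i) dimensionality reduction: EOFs via SVD, keeping d leading modes
        anomaly = field - field.mean(axis=0)
        U, s, Vt = np.linalg.svd(anomaly, full_matrices=False)
        d = 3
        pcs = anomaly @ Vt[:d].T            # principal-component time series

        # (ii) evolution operator in the reduced space: pcs[t+1] ~ pcs[t] @ A
        X, Y = pcs[:-1], pcs[1:]
        A, *_ = np.linalg.lstsq(X, Y, rcond=None)
        forecast = pcs[-1] @ A              # one-step forecast in EOF space
        print(forecast)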

  14. Discrete minimal flavor violation

    SciTech Connect

    Zwicky, Roman; Fischbacher, Thomas

    2009-10-01

    We investigate the consequences of replacing the global flavor symmetry of minimal flavor violation (MFV) SU(3)_Q × SU(3)_U × SU(3)_D × ⋯ by a discrete D_Q × D_U × D_D × ⋯ symmetry. Goldstone bosons resulting from the breaking of the flavor symmetry generically lead to bounds on new flavor structure many orders of magnitude above the TeV scale. The absence of Goldstone bosons for discrete symmetries constitutes the primary motivation of our work. Less symmetry implies further invariants and renders the mass-flavor basis transformation observable in principle and calls for a hierarchy in the Yukawa matrix expansion. We show, through the dimension of the representations, that the (discrete) symmetry in principle does allow for additional ΔF=2 operators. If though the ΔF=2 transitions are generated by two subsequent ΔF=1 processes, as, for example, in the standard model, then the four crystal-like groups Σ(168) ≅ PSL(2,F_7), Σ(72φ), Σ(216φ) and especially Σ(360φ) do provide enough protection for a TeV-scale discrete MFV scenario. Models where this is not the case have to be investigated case by case. Interestingly Σ(216φ) has a (nonfaithful) representation corresponding to an A_4 symmetry. Moreover we argue that the, apparently often omitted, (D) groups are subgroups of an appropriate Δ(6g²). We would like to stress that we do not provide an actual model that realizes the MFV scenario nor any other theory of flavor.

  15. Minimal Change Disease.

    PubMed

    Vivarelli, Marina; Massella, Laura; Ruggiero, Barbara; Emma, Francesco

    2017-02-07

    Minimal change disease (MCD) is a major cause of idiopathic nephrotic syndrome (NS), characterized by intense proteinuria leading to edema and intravascular volume depletion. In adults, it accounts for approximately 15% of patients with idiopathic NS, reaching a much higher percentage at younger ages, up to 70%-90% in children >1 year of age. In the pediatric setting, a renal biopsy is usually not performed if presentation is typical and the patient responds to therapy with oral prednisone at conventional doses. Therefore, in this setting steroid-sensitive NS can be considered synonymous with MCD. The pathologic hallmark of disease is absence of visible alterations by light microscopy and effacement of foot processes by electron microscopy. Although the cause is unknown and it is likely that different subgroups of disease recognize a different pathogenesis, immunologic dysregulation and modifications of the podocyte are thought to synergize in altering the integrity of the glomerular basement membrane and therefore determining proteinuria. The mainstay of therapy is prednisone, but steroid-sensitive forms frequently relapse and this leads to a percentage of patients requiring second-line steroid-sparing immunosuppression. The outcome is variable, but forms of MCD that respond to steroids usually do not lead to chronic renal damage, whereas forms that are unresponsive to steroids may subsequently reveal themselves as FSGS. However, in a substantial number of patients the disease is recurrent and requires long-term immunosuppression, with significant morbidity because of side effects. Recent therapeutic advances, such as the use of anti-CD20 antibodies, have provided long-term remission off-therapy and suggest new hypotheses for disease pathogenesis. Copyright © 2017 by the American Society of Nephrology.

  16. Payload training methodology study

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The results of the Payload Training Methodology Study (PTMS) are documented. Methods and procedures are defined for the development of payload training programs to be conducted at the Marshall Space Flight Center Payload Training Complex (PCT) for the Space Station Freedom program. The study outlines the overall training program concept as well as the six methodologies associated with the program implementation. The program concept outlines the entire payload training program from initial identification of training requirements to the development of detailed design specifications for simulators and instructional material. The following six methodologies are defined: (1) The Training and Simulation Needs Assessment Methodology; (2) The Simulation Approach Methodology; (3) The Simulation Definition Analysis Methodology; (4) The Simulator Requirements Standardization Methodology; (5) The Simulator Development Verification Methodology; and (6) The Simulator Validation Methodology.

  17. Minimizing Launch Mass for ISRU Processes

    NASA Technical Reports Server (NTRS)

    England, C.; Hallinan, K. P.

    2004-01-01

    The University of Dayton and the Jet Propulsion Laboratory are developing a methodology for estimating the Earth launch mass (ELM) of processes for In-Situ Resource Utilization (ISRU) with a focus on lunar resource recovery. ISRU may be enabling for an extended presence on the Moon, for large sample return missions, and for a human presence on Mars. To accomplish these exploration goals, the resources recovered by ISRU must offset the ELM for the recovery process. An appropriate figure of merit is the cost of the exploration mission, which is closely related to ELM. For a given production rate and resource concentration, the lowest ELM - and the best ISRU process - is achieved by minimizing capital equipment for both the ISRU process and energy production. ISRU processes incur Carnot limitations and second law losses (irreversibilities) that ultimately determine production rate, material utilization, and energy efficiencies. Heat transfer, chemical reaction, and mechanical operations affect the ELM in ways that are best understood by examining the process's detailed energetics. Schemes for chemical and thermal processing that do not incorporate an understanding of second law losses will be incompletely understood. Our team is developing a methodology that will aid design and selection of ISRU processes by identifying the impact of thermodynamic losses on ELM. The methodology includes mechanical, thermal, and chemical operations and, when completed, will provide a procedure and rationale for optimizing their design and minimizing their cost. The technique for optimizing ISRU with respect to ELM draws from work of England and Funk that relates the cost of endothermic processes to their second law efficiencies. Our team joins their approach for recovering resources by chemical processing with analysis of thermal and mechanical operations in space. Commercial firms provide cost inputs for ELM and planetary landing. Additional information is included in the original extended abstract.

  19. What kind of consciousness is minimal?

    PubMed

    Kotchoubey, Boris; Vogel, Dominik; Lang, Simone; Müller, Friedemann

    2014-01-01

    A comparison between unitary and non-unitary views on minimal consciousness. First, unitary (all-or-none) and non-unitary (gradual or continuous) models of consciousness are defined as they have been developed in both philosophy and neurophysiology. Then, the implications of these ideas for the notion of the minimally conscious state (MCS) are discussed. Review and analysis of theoretical conceptions and empirical data. Both kinds of models are compatible with the actual definitions of MCS. Although unitary views may seem to contradict the description of the MCS in 'Neurology' 2002, the apparent contradiction can easily be solved. Most recent data, particularly those obtained using fMRI and concerning learning, emotional responsiveness, and pain and suffering, speak for non-unitary models. Most evidence speaks for non-unitary models of minimal consciousness. If these models are correct, patients with MCS may have, in addition to temporal fluctuations, a lower level of consciousness compared with fully conscious individuals. A still lower level could characterize patients diagnosed with unresponsive wakefulness syndrome (UWS). From this point of view, therefore, the difference between UWS and MCS is gradual rather than qualitative. However, due to methodological limitations of the available studies, the evidence for non-unitary models cannot be regarded as definite.

  20. Microbiological Methodology in Astrobiology

    NASA Technical Reports Server (NTRS)

    Abyzov, S. S.; Gerasimenko, L. M.; Hoover, R. B.; Mitskevich, I. N.; Mulyukin, A. L.; Poglazova, M. N.; Rozanov, A. Y.

    2005-01-01

    Searching for life in astromaterials to be delivered from future missions to extraterrestrial bodies is closely related to studies of the properties and signatures of living microbial cells and microfossils on Earth. The Antarctic glacier and Earth permafrost habitats, where microbial cells have preserved viability for millennia by entering the anabiotic state, are often regarded as model terrestrial analogs of Martian polar subsurface layers. For future findings of viable microorganisms in samples from extraterrestrial objects, it is important to use a combined methodology that includes classical microbiological methods, plating onto nutrient media, direct epifluorescence and electron microscopy examinations, detection of the elemental composition of cells, radiolabeling techniques, and PCR and FISH methods. It is of great importance to ensure the authenticity of any microorganisms in studied samples and to standardize the protocols used, in order to minimize the risk of external contamination. Although convincing evidence of extraterrestrial microbial life may come from the discovery of living cells in astromaterials, biomorphs and microfossils must also be regarded as targets in the search for evidence of life, bearing in mind the scenario that living microorganisms were not preserved and underwent mineralization. Under laboratory conditions, the processes that accompany fossilization of cyanobacteria were reconstructed, and artificially produced cyanobacterial stromatolites resemble, in their morphological properties, those found in natural Earth habitats. Given the vital importance of distinguishing between biogenic and abiogenic signatures and between living and fossil microorganisms in analyzed samples, it is worthwhile to use previously developed approaches based on electron microscopy examination and analysis of the elemental composition of biomorphs in situ, and comparison with analogous data obtained for laboratory microbial cultures.

  2. Guidelines for mixed waste minimization

    SciTech Connect

    Owens, C.

    1992-02-01

    Currently, there is no commercial mixed waste disposal available in the United States. Storage and treatment for commercial mixed waste are limited. Host state and compact region officials are encouraging their mixed waste generators to minimize their mixed wastes because of these management limitations. This document provides a guide to mixed waste minimization.

  3. Influenza SIRS with Minimal Pneumonitis.

    PubMed

    Erramilli, Shruti; Mannam, Praveen; Manthous, Constantine A

    2016-01-01

    Although systemic inflammatory response syndrome (SIRS) is a known complication of severe influenza pneumonia, it has been reported very rarely in patients with minimal parenchymal lung disease. We here report a case of severe SIRS, anasarca, and marked vascular phenomena with minimal or no pneumonitis. This case highlights that viruses, including influenza, may cause vascular dysregulation causing SIRS, even without substantial visceral organ involvement.

  4. Response Surface Methodology

    DTIC Science & Technology

    2004-10-01

    … methods. All three of these topics are usually combined into Response Surface Methodology (RSM). Also the experimenter may encounter situations where … Keywords: Response Surface Methodology (RSM), regression analysis, linear …
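
    As a minimal illustration of the RSM workflow (invented data, not from the report), the sketch below fits a second-order polynomial response surface to designed-experiment data and solves for its stationary point:

        import numpy as np

        rng = np.random.default_rng(2)
        x1, x2 = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
        x1, x2 = x1.ravel(), x2.ravel()     # a 5 x 5 full factorial design
        y = (5 - (x1 - 0.3) ** 2 - 2 * (x2 + 0.2) ** 2
             + 0.05 * rng.standard_normal(x1.size))

        # Second-order model: y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
        X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
        b, *_ = np.linalg.lstsq(X, y, rcond=None)

        # Stationary point: set the gradient of the fitted quadratic to zero
        B = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
        g = -np.array([b[1], b[2]])
        print(np.linalg.solve(B, g))        # close to the true optimum (0.3, -0.2)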

  5. Waste minimization handbook, Volume 1

    SciTech Connect

    Boing, L.E.; Coffey, M.J.

    1995-12-01

    This technical guide presents various methods used by industry to minimize low-level radioactive waste (LLW) generated during decommissioning and decontamination (D and D) activities. Such activities generate significant amounts of LLW during their operations. Waste minimization refers to any measure, procedure, or technique that reduces the amount of waste generated during a specific operation or project. Preventive waste minimization techniques implemented when a project is initiated can significantly reduce waste. Techniques implemented during decontamination activities reduce the cost of decommissioning. The application of waste minimization techniques is not limited to D and D activities; it is also useful during any phase of a facility`s life cycle. This compendium will be supplemented with a second volume of abstracts of hundreds of papers related to minimizing low-level nuclear waste. This second volume is expected to be released in late 1996.

  6. The development and characterization of synthetic minimal yeast promoters

    PubMed Central

    Redden, Heidi; Alper, Hal S.

    2015-01-01

    Synthetic promoters, especially minimally sized, are critical for advancing fungal synthetic biology. Fungal promoters often span hundreds of base pairs, nearly ten times the amount of bacterial counterparts. This size limits large-scale synthetic biology efforts in yeasts. Here we address this shortcoming by establishing a methodical workflow necessary to identify robust minimal core elements that can be linked with minimal upstream activating sequences to develop short, yet strong yeast promoters. Through a series of library-based synthesis, analysis and robustness tests, we create a set of non-homologous, purely synthetic, minimal promoters for yeast. These promoters are comprised of short core elements that are generic and interoperable and 10 bp UAS elements that impart strong, constitutive function. Through this methodology, we are able to generate the shortest fungal promoters to date, which can achieve high levels of both inducible and constitutive expression with up to an 80% reduction in size. PMID:26183606

  7. Minimizing waste in environmental restoration

    SciTech Connect

    Moos, L.; Thuot, J.R.

    1996-07-01

    Environmental restoration, decontamination and decommissioning, and facility dismantlement projects are not typically known for their waste minimization and pollution prevention efforts. Typical projects are driven by schedules and milestones, with little attention given to cost or waste minimization. Conventional wisdom in these projects is that the waste already exists and cannot be reduced or minimized. In fact, however, there are three significant areas where waste and cost can be reduced. Waste reduction can occur in three ways: beneficial reuse or recycling; segregation of waste types; and reducing generation of secondary waste. This paper will discuss several examples of reuse, recycling, segregation, and secondary waste reduction at ANL restoration programs.

  8. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed
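
    For orientation, the classical nested (double-loop) formulation that the unilevel method is shown to be equivalent to can be sketched as follows, with an invented limit state and Monte Carlo standing in for a proper reliability analysis:

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(3)
        xi = rng.standard_normal(20_000)        # samples of the random parameter
        p_target = 0.01                         # allowed failure probability

        def cost(d):                            # outer objective, e.g. weight
            return d[0]

        def failure_probability(d):             # inner loop: P(limit state < 0)
            g = d[0] - (2.0 + 0.5 * xi)         # fails when capacity < demand
            return np.mean(g < 0)

        res = minimize(
            cost, x0=[4.0], method='COBYLA',
            constraints=[{'type': 'ineq',
                          'fun': lambda d: p_target - failure_probability(d)}])
        print(res.x)    # smallest capacity still meeting the target, ~3.16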

  9. Aortic valve surgery - minimally invasive

    MedlinePlus

    ... of the heart is reduced. This is called aortic stenosis. The aortic valve can be replaced using: Minimally ... RN, Wang A. Percutaneous heart valve replacement for aortic stenosis: state of the evidence. Ann Intern Med . 2010; ...

  10. Constrained minimization for monotonic reconstruction

    SciTech Connect

    Rider, W.J.; Kothe, D.B.

    1996-08-20

    The authors present several innovations in a method for monotonic reconstructions. It is based on the application of constrained minimization techniques for the imposition of monotonicity on a reconstruction. In addition, they present extensions of several classical TVD limiters to a genuinely multidimensional setting. In this case the linear least squares reconstruction method is expanded upon. They also clarify data dependent weighting techniques used with the minimization process.
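
    For background, the sketch below shows the classical one-dimensional minmod limiter, the kind of TVD construction the authors extend to a genuinely multidimensional, minimization-based setting (illustrative only):

        import numpy as np

        def minmod(a, b):
            """Zero at local extrema, else the smaller-magnitude one-sided slope."""
            return np.where(a * b <= 0, 0.0, np.where(np.abs(a) < np.abs(b), a, b))

        u = np.array([0.0, 0.2, 1.0, 1.4, 1.5])   # cell averages
        left = np.diff(u)[:-1]                    # u[i] - u[i-1] for interior cells
        right = np.diff(u)[1:]                    # u[i+1] - u[i]
        slopes = minmod(left, right)              # limited slope in each interior cell

        # Reconstructed interface values u[i] +/- slopes/2 stay between the
        # neighboring cell averages, so no new extrema are created.
        print(slopes)                             # [0.2 0.4 0.1]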

  11. Shapes of embedded minimal surfaces.

    PubMed

    Colding, Tobias H; Minicozzi, William P

    2006-07-25

    Surfaces that locally minimize area have been extensively used to model physical phenomena, including soap films, black holes, compound polymers, protein folding, etc. The mathematical field dates to the 1740s but has recently become an area of intense mathematical and scientific study, specifically in the areas of molecular engineering, materials science, and nanotechnology because of their many anticipated applications. In this work, we show that all minimal surfaces are built out of pieces of the surfaces in Figs. 1 and 2.

  12. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  13. Specialized minimal PDFs for optimized LHC calculations.

    PubMed

    Carrazza, Stefano; Forte, Stefano; Kassabov, Zahari; Rojo, Juan

    2016-01-01

    We present a methodology for the construction of parton distribution functions (PDFs) designed to provide an accurate representation of PDF uncertainties for specific processes or classes of processes with a minimal number of PDF error sets: specialized minimal PDF sets, or SM-PDFs. We construct these SM-PDFs in such a way that sets corresponding to different input processes can be combined without losing information, specifically as regards their correlations, and that they are robust upon smooth variations of the kinematic cuts. The proposed strategy never discards information, so that the SM-PDF sets can be enlarged by the addition of new processes, until the prior PDF set is eventually recovered for a large enough set of processes. We illustrate the method by producing SM-PDFs tailored to Higgs, top-quark pair, and electroweak gauge boson physics, and we determine that, when the PDF4LHC15 combined set is used as the prior, around 11, 4, and 11 Hessian eigenvectors, respectively, are enough to fully describe the corresponding processes.
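
    Very loosely, the reduction can be pictured as a principal-component exercise: tabulate how each observable responds to each Hessian eigenvector and keep only the directions that matter. The numpy toy below is not the SM-PDF algorithm itself; the response matrix and tolerance are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(2)
        n_obs, n_eig = 40, 30
        # toy response table: variation of each observable along each Hessian
        # eigenvector, built with rank 5 so that only a few directions matter
        D = rng.standard_normal((n_obs, 5)) @ rng.standard_normal((5, n_eig))

        U, S, Vt = np.linalg.svd(D, full_matrices=False)
        frac = np.cumsum(S**2) / np.sum(S**2)
        k = int(np.searchsorted(frac, 0.999)) + 1   # smallest k keeping 99.9%
        D_k = (U[:, :k] * S[:k]) @ Vt[:k]           # observables from k directions only
        print("kept directions:", k)                # at most 5 for this rank-5 toy
        print("relative residual:", np.linalg.norm(D - D_k) / np.linalg.norm(D))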

  14. Minimally invasive surgery for atrial fibrillation

    PubMed Central

    Suwalski, Piotr

    2013-01-01

    Atrial fibrillation (AF) remains the most common cardiac arrhythmia, affecting nearly 2% of the general population worldwide. Minimally invasive surgical ablation remains one of the most dynamically evolving fields of modern cardiac surgery. While there are more than a dozen issues driving this development, two seem to play the most important role: first, there is a lack of evidence supporting a percutaneous catheter-based approach to treat patients with persistent and long-standing persistent AF. The paucity of such data offers the surgical community an unparalleled opportunity to challenge guidelines and change indications for surgical intervention. Large, multicenter prospective clinical studies are therefore of utmost importance, as well as honest, clear data reporting. Second, a collaborative methodology started a long-awaited debate on a Heart Team approach to AF, similar to the debate on coronary artery disease and transcatheter valves. Appropriate patient selection and tailored treatment options will most certainly result in better outcomes and patient satisfaction, coupled with appropriate use of always-limited institutional resources. The aim of this review, unlike other reviews of minimally invasive surgical ablation, is to present medical professionals with two distinctly different approaches. The first one is purely surgical: standalone surgical isolation of the pulmonary veins using a bipolar energy source with concomitant amputation of the left atrial appendage—a method of choice in one of the most important clinical trials on AF—The Atrial Fibrillation Catheter Ablation Versus Surgical Ablation Treatment (FAST) Trial. The second one represents the most complex approach to this problem: a multidisciplinary, combined effort of a cardiac surgeon and electrophysiologist. The Convergent Procedure, which includes both endocardial and epicardial unipolar ablation, bonds together minimally invasive endoscopic surgery with electroanatomical mapping, to deliver best of

  15. Minimal but non-minimal inflation and electroweak symmetry breaking

    SciTech Connect

    Marzola, Luca; Racioppi, Antonio

    2016-10-07

    We consider the most minimal scale invariant extension of the standard model that allows for successful radiative electroweak symmetry breaking and inflation. The framework involves an extra scalar singlet, that plays the role of the inflaton, and is compatible with current experimental bounds owing to the non-minimal coupling of the latter to gravity. This inflationary scenario predicts a very low tensor-to-scalar ratio r≈10{sup −3}, typical of Higgs-inflation models, but in contrast yields a scalar spectral index n{sub s}≃0.97 which departs from the Starobinsky limit. We briefly discuss the collider phenomenology of the framework.

  16. Propeller aeroacoustic methodologies

    NASA Technical Reports Server (NTRS)

    Korkan, K. D.; Gregorek, G. M.

    1980-01-01

    Aspects related to propeller performance are addressed by means of a review of propeller methodologies. Preliminary wind tunnel propeller performance data are presented and the predominant limitations of existing propeller performance methodologies are discussed. Airfoil developments appropriate for propeller applications are also reviewed.

  17. Data Centric Development Methodology

    ERIC Educational Resources Information Center

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly been adopting a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies could work on structural, procedural, or object-oriented applications, but fail to capture…

  18. The Methodology of Magpies

    ERIC Educational Resources Information Center

    Carter, Susan

    2014-01-01

    Arts/Humanities researchers frequently do not explain methodology overtly; instead, they "perform" it through their use of language, textual and historic cross-reference, and theory. Here, methodologies from literary studies are shown to add to Higher Education (HE) an exegetical and critically pluralist approach. This includes…

  19. Menopause and Methodological Doubt

    ERIC Educational Resources Information Center

    Spence, Sheila

    2005-01-01

    Menopause and methodological doubt begins by making a tongue-in-cheek comparison between Descartes' methodological doubt and the self-doubt that can arise around menopause. A hermeneutic approach is taken in which Cartesian dualism and its implications for the way women are viewed in society are examined, both through the experiences of women…

  3. Minimal Erythema Dose (MED) Testing

    PubMed Central

    Heckman, Carolyn J.; Chandler, Rachel; Kloss, Jacqueline D.; Benson, Amy; Rooney, Deborah; Munshi, Teja; Darlow, Susan D.; Perlis, Clifford; Manne, Sharon L.; Oslin, David W.

    2013-01-01

    Ultraviolet radiation (UV) therapy is sometimes used as a treatment for various common skin conditions, including psoriasis, acne, and eczema. The dosage of UV light is prescribed according to an individual's skin sensitivity. Thus, to establish the proper dosage of UV light to administer to a patient, the patient is sometimes screened to determine a minimal erythema dose (MED), which is the amount of UV radiation that will produce minimal erythema (sunburn or redness caused by engorgement of capillaries) of an individual's skin within a few hours following exposure. This article describes how to conduct minimal erythema dose (MED) testing. There is currently no easy way to determine an appropriate UV dose for clinical or research purposes without conducting formal MED testing, requiring observation hours after testing, or informal trial and error testing with the risks of under- or over-dosing. However, some alternative methods are discussed. PMID:23748556
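
    For illustration only, and not as a clinical protocol: the patch doses in an MED test are often laid out as a geometric ladder. The starting dose and 25% increment in this Python snippet are hypothetical numbers.

        # one test patch per dose; the MED is read off afterwards as the lowest
        # dose producing perceptible erythema (assumed ladder, illustrative only)
        start_dose = 20.0   # mJ/cm^2, hypothetical starting dose
        step = 1.25         # hypothetical 25% increment between adjacent patches
        doses = [round(start_dose * step ** i, 1) for i in range(6)]
        print("dose series:", doses)   # [20.0, 25.0, 31.2, 39.1, 48.8, 61.0]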

  4. [Vegetative or minimally conscious state?].

    PubMed

    Overbeek, Berno U H; Lavrijsen, Jan C M; Eilander, Henk J

    2010-01-01

    We describe the clinical course of a 51-year-old woman in a vegetative state and of a 63-year-old woman in a minimally conscious state. The difference between these two states is an important one, as clinical course, prognosis and medical-ethical considerations of both are different. In practice it is difficult to distinguish between a vegetative state and a minimally conscious state, but the use of a Post-Acute Level of Consciousness scale helps to illustrate the differences. Expertise, research, and application of functional neuro-imaging techniques (PET, fMRI) might also be useful. The differences between these two states regarding rehabilitation, pain management and medical-ethical decisions are important. The effects of neuro-rehabilitation and the implications of a minimally conscious state for patients and their proxies need further investigation.

  5. The New Minimal Standard Model

    SciTech Connect

    Davoudiasl, Hooman; Kitano, Ryuichiro; Li, Tianjun; Murayama, Hitoshi

    2005-01-13

    We construct the New Minimal Standard Model that incorporates the new discoveries of physics beyond the Minimal Standard Model (MSM): Dark Energy, non-baryonic Dark Matter, neutrino masses, as well as baryon asymmetry and cosmic inflation, adopting the principle of minimal particle content and the most general renormalizable Lagrangian. We base the model purely on empirical facts rather than aesthetics. We need only six new degrees of freedom beyond the MSM. It is free from excessive flavor-changing effects, CP violation, too-rapid proton decay, problems with electroweak precision data, and unwanted cosmological relics. Any model of physics beyond the MSM should be measured against the phenomenological success of this model.

  6. Does Minimally Invasive Spine Surgery Minimize Surgical Site Infections?

    PubMed Central

    Patel, Ravish Shammi; Dutta, Shumayou

    2016-01-01

    Study Design Retrospective review of prospectively collected data. Purpose To evaluate the incidence of surgical site infections (SSIs) in minimally invasive spine surgery (MISS) in a cohort of patients and compare with available historical data on SSI in open spinal surgery cohorts, and to evaluate additional direct costs incurred due to SSI. Overview of Literature SSI can lead to prolonged antibiotic therapy, extended hospitalization, repeated operations, and implant removal. Small incisions and minimal dissection intrinsic to MISS may minimize the risk of postoperative infections. However, there is a dearth of literature on infections after MISS and their additional direct financial implications. Methods All patients from January 2007 to January 2015 undergoing posterior spinal surgery with tubular retractor system and microscope in our institution were included. The procedures performed included tubular discectomies, tubular decompressions for spinal stenosis and minimal invasive transforaminal lumbar interbody fusion (TLIF). The incidence of postoperative SSI was calculated and compared to the range of cited SSI rates from published studies. Direct costs were calculated from medical billing for index cases and for patients with SSI. Results A total of 1,043 patients underwent 763 noninstrumented surgeries (discectomies, decompressions) and 280 instrumented (TLIF) procedures. The mean age was 52.2 years with male:female ratio of 1.08:1. Three infections were encountered with fusion surgeries (mean detection time, 7 days). All three required wound wash and debridement with one patient requiring unilateral implant removal. Additional direct cost due to infection was $2,678 per 100 MISS-TLIF. SSI increased hospital expenditure per patient 1.5-fold after instrumented MISS. Conclusions Overall infection rate after MISS was 0.29%, with SSI rate of 0% in non-instrumented MISS and 1.07% with instrumented MISS. MISS can markedly reduce the SSI rate and can be an

  7. Rovers minimize human disturbance in research on wild animals.

    PubMed

    Le Maho, Yvon; Whittington, Jason D; Hanuise, Nicolas; Pereira, Louise; Boureau, Matthieu; Brucker, Mathieu; Chatelain, Nicolas; Courtecuisse, Julien; Crenner, Francis; Friess, Benjamin; Grosbellet, Edith; Kernaléguen, Laëtitia; Olivier, Frédérique; Saraux, Claire; Vetter, Nathanaël; Viblanc, Vincent A; Thierry, Bernard; Tremblay, Pascale; Groscolas, René; Le Bohec, Céline

    2014-12-01

    Investigating wild animals while minimizing human disturbance remains an important methodological challenge. When approached by a remote-operated vehicle (rover) which can be equipped to make radio-frequency identifications, wild penguins had significantly lower and shorter stress responses (determined by heart rate and behavior) than when approached by humans. Upon immobilization, the rover-unlike humans-did not disorganize colony structure, and stress rapidly ceased. Thus, rovers can reduce human disturbance of wild animals and the resulting scientific bias.

  8. Shapes of embedded minimal surfaces

    PubMed Central

    Colding, Tobias H.; Minicozzi, William P.

    2006-01-01

    Surfaces that locally minimize area have been extensively used to model physical phenomena, including soap films, black holes, compound polymers, protein folding, etc. The mathematical field dates to the 1740s but has recently become an area of intense mathematical and scientific study, specifically in the areas of molecular engineering, materials science, and nanotechnology because of their many anticipated applications. In this work, we show that all minimal surfaces are built out of pieces of the surfaces in Figs. 1 and 2. PMID:16847265

  9. Minimal Incision Congenital Cardiac Surgery

    PubMed Central

    del Nido, Pedro J.

    2008-01-01

    Minimally invasive techniques have had limited application in congenital cardiac surgery, primarily due to the complexity of the defects, small working area, and the fact that most defects require exposure to intracardiac structures. Advances in cannula design and instrumentation have allowed application of minimal incision techniques but in most cases, cardiopulmonary bypass is still required. Image guided surgery, which uses non-invasive imaging to guide intracardiac procedures, holds the promise of permitting performance of reconstructive surgery in the beating heart in children. PMID:18395631

  10. LLNL Waste Minimization Program Plan

    SciTech Connect

    Not Available

    1990-02-14

    This document is the February 14, 1990 version of the LLNL Waste Minimization Program Plan (WMPP). The Waste Minimization Policy field has undergone continuous changes since its formal inception in the 1984 HSWA legislation. The first LLNL WMPP, Revision A, is dated March 1985. A series of informal revisions was made on approximately a semi-annual basis. This Revision 2 is the third formal issuance of the WMPP document. EPA has issued a proposed new policy statement on source reduction and recycling. This policy reflects a preventative strategy to reduce or eliminate the generation of environmentally-harmful pollutants which may be released to the air, land surface, water, or ground water. In accordance with this new policy, new guidance to hazardous waste generators on the elements of a Waste Minimization Program was issued. In response to these policies, DOE has revised and issued implementation guidance for DOE Order 5400.1, Waste Minimization Plan and Waste Reduction reporting of DOE Hazardous, Radioactive, and Radioactive Mixed Wastes, final draft January 1990. This WMPP is formatted to meet the current DOE guidance outlines. The current WMPP will be revised to reflect all of these proposed changes when guidelines are established. Updates, changes and revisions to the overall LLNL WMPP will be made as appropriate to reflect ever-changing regulatory requirements. 3 figs., 4 tabs.

  11. A Defense of Semantic Minimalism

    ERIC Educational Resources Information Center

    Kim, Su

    2012-01-01

    Semantic Minimalism is a position about the semantic content of declarative sentences, i.e., the content that is determined entirely by syntax. It is defined by the following two points: "Point 1": The semantic content is a complete/truth-conditional proposition. "Point 2": The semantic content is useful to a theory of…

  12. Wilson loops in minimal surfaces

    SciTech Connect

    Drukker, Nadav; Gross, David J.; Ooguri, Hirosi

    1999-04-27

    The AdS/CFT correspondence suggests that the Wilson loop of the large N gauge theory with N = 4 supersymmetry in 4 dimensions is described by a minimal surface in AdS{sub 5} x S{sup 5}. The authors examine various aspects of this proposal, comparing gauge theory expectations with computations of minimal surfaces. There is a distinguished class of loops, which the authors call BPS loops, whose expectation values are free from ultra-violet divergence. They formulate the loop equation for such loops. To the extent that they have checked, the minimal surface in AdS{sub 5} x S{sup 5} gives a solution of the equation. The authors also discuss the zig-zag symmetry of the loop operator. In the N = 4 gauge theory, they expect the zig-zag symmetry to hold when the loop does not couple the scalar fields in the supermultiplet. They will show how this is realized for the minimal surface.

  13. Dubin's Minimal Linkage Construct Revisited.

    ERIC Educational Resources Information Center

    Rogers, Donald P.

    This paper contains a theoretical analysis and empirical study that support the major premise of Robert Dubin's minimal-linkage construct: that restricting communication links increases organizational stability. The theoretical analysis shows that fewer communication links are associated with less uncertainty, more redundancy, and greater…

  14. Minimally invasive aortic valve surgery

    PubMed Central

    Castrovinci, Sebastiano; Emmanuel, Sam; Moscarelli, Marco; Murana, Giacomo; Caccamo, Giuseppa; Bertolino, Emanuela Clara; Nasso, Giuseppe; Speziale, Giuseppe; Fattouch, Khalil

    2016-01-01

    Aortic valve disease is a prevalent disorder that affects approximately 2% of the general adult population. Surgical aortic valve replacement is the gold standard treatment for symptomatic patients. This treatment has demonstrably proven to be both safe and effective. Over the last few decades, in an attempt to reduce surgical trauma, different minimally invasive approaches for aortic valve replacement have been developed and are now being increasingly utilized. A narrative review of the literature was carried out to describe the surgical techniques for minimally invasive aortic valve surgery and report the results from different experienced centers. Minimally invasive aortic valve replacement is associated with low perioperative morbidity, mortality and a low conversion rate to full sternotomy. Long-term survival appears to be at least comparable to that reported for conventional full sternotomy. Minimally invasive aortic valve surgery, either with a partial upper sternotomy or a right anterior minithoracotomy provides early- and long-term benefits. Given these benefits, it may be considered the standard of care for isolated aortic valve disease. PMID:27582764

  16. What is minimally invasive dentistry?

    PubMed

    Ericson, Dan

    2004-01-01

    Minimally Invasive Dentistry is the application of "a systematic respect for the original tissue." This implies that the dental profession recognizes that an artifact is of less biological value than the original healthy tissue. Minimally invasive dentistry is a concept that can embrace all aspects of the profession. The common denominator is tissue preservation, preferably by preventing disease from occurring and intercepting its progress, but also removing and replacing with as little tissue loss as possible. It does not suggest that we make small fillings to restore incipient lesions or surgically remove impacted third molars without symptoms as routine procedures. The introduction of predictable adhesive technologies has led to a giant leap in interest in minimally invasive dentistry. The concept bridges the traditional gap between prevention and surgical procedures, which is just what dentistry needs today. The evidence-base for survival of restorations clearly indicates that restoring teeth is a temporary palliative measure that is doomed to fail if the disease that caused the condition is not addressed properly. Today, the means, motives and opportunities for minimally invasive dentistry are at hand, but incentives are definitely lacking. Patients and third parties seem to be convinced that the only things that count are replacements. Namely, they are prepared to pay for a filling but not for a procedure that can help avoid having one.

  17. Methodologies for clinical ethics.

    PubMed

    Drane, J F

    1990-01-01

    Truly professional medical ethics requires a methodology that generates both moral discernment and consistently right judgments. In this article the author briefly reviews difficulties involved in ethical decision-making, the historical development of casuistry, and four ethical methodologies employed in clinical medicine today. The latter, which are outlined and compared, are as follows: the methodology developed by David Thomasma in the 1960s and 1970s; one created by Jonsen, Siegler, and Winslade; another developed by the author; and the Bochum Protocol authored by Hans-Martin Sass et al. of the Bochum Center for Medical Ethics in the Federal Republic of Germany.

  18. Technology transfer methodology

    NASA Technical Reports Server (NTRS)

    Labotz, Rich

    1991-01-01

    Information on technology transfer methodology is given in viewgraph form. Topics covered include problems in economics, technology drivers, inhibitors to using improved technology in development, technology application opportunities, and co-sponsorship of technology.

  19. Methodology for Stochastic Modeling.

    DTIC Science & Technology

    1985-01-01

    Scanned report form (DTIC OCR residue). Recoverable details: Methodology for Stochastic Modeling; U.S. Army Materiel Systems Analysis Activity, Aberdeen Proving Ground, MD; H. E. Cohen; January 1985. Keywords: autoregression models, moving average models, ARMA, adaptive modeling, covariance methods, singular value decomposition, order determination, rational…

  20. Light modular rig for minimal environment impact

    SciTech Connect

    Mehra, S.; Abedrabbo, A.

    1996-12-31

    The first plenary meeting of the United Nations Conference on the Human Environment in 1972 considered the need for a common outlook and for common principles to inspire and guide the people and industries of the world in the preservation and enhancement of the human environment. Since then many countries have enacted, or are now enacting, environmental legislation covering the wide spectrum of environmental protection issues. The petroleum industry has not been immune to such scrutiny; however, little has changed in land-based drilling operations, especially in remote areas. A major aspect of the ongoing program in the design of a light modular land rig has been minimization of the environmental impact. Today, concerns for protection of the environment have spread to many drilling areas: the use of some traditional drilling techniques, such as waste pits, is now banned. When rethinking rig hardware and design today, environmental protection needs to be considered at an early stage. There are many incentives for implementation of environmental protection programs, in design and in operation, aside from the regulatory/compliance issue. Waste disposal costs have risen dramatically over the last few years and the trend is expected to continue. Improvement in environmental conditions improves morale and image. There is growing public awareness and realization of the man-made harm in many regions of the earth: dangerous levels of pollution in water, air, earth and living beings; major and undesirable disturbances to the ecological balance of the biosphere; destruction and depletion of irreplaceable resources; and gross deficiencies harmful to the physical, mental and social health of man in the living and working environment. This paper discusses the steps taken, early on in the design stage and in the operations methodology, to minimize the environmental impact.

  1. Minimally invasive surgical approach to pancreatic malignancies

    PubMed Central

    Bencini, Lapo; Annecchiarico, Mario; Farsi, Marco; Bartolini, Ilenia; Mirasolo, Vita; Guerra, Francesco; Coratti, Andrea

    2015-01-01

    Pancreatic surgery for malignancy is recognized as challenging for the surgeons and risky for the patients due to consistent perioperative morbidity and mortality. Furthermore, the oncological long-term results are largely disappointing, even for those patients who experience an uneventful hospital stay. Nevertheless, surgery still remains the cornerstone of a multidisciplinary treatment for pancreatic cancer. In order to maximize the benefits of surgery, the advent of both laparoscopy and robotics has led many surgeons to treat pancreatic cancers with these new methodologies. The reduction of postoperative complications, length of hospital stay and pain, together with a shorter interval between surgery and the beginning of adjuvant chemotherapy, represent the potential advantages over conventional surgery. Lastly, a better cosmetic result, although not crucial in any cancerous patient, could also play a role by improving overall well-being and patient self-perception. The laparoscopic approach to pancreatic surgery is, however, difficult in inexperienced hands and requires dedicated training in both advanced laparoscopy and pancreatic surgery. The recent large diffusion of the da Vinci® robotic platform seems to facilitate many of the technical maneuvers, such as anastomotic biliary and pancreatic reconstructions, accurate lymphadenectomy, and vascular sutures. The two main pancreatic operations, distal pancreatectomy and pancreaticoduodenectomy, are approachable by a minimally invasive path, but more limited interventions such as enucleation are also feasible. Nevertheless, a word of caution is warranted regarding the increasing costs of these newest technologies, because the main concerns are the maintenance of all oncological standards and the lack of long-term follow-up. The purpose of this review is to examine the evidence for the use of minimally invasive surgery in pancreatic cancer (and less aggressive tumors

  2. Minimal universal quantum heat machine.

    PubMed

    Gelbwaser-Klimovsky, D; Alicki, R; Kurizki, G

    2013-01-01

    In traditional thermodynamics the Carnot cycle yields the ideal performance bound of heat engines and refrigerators. We propose and analyze a minimal model of a heat machine that can play a similar role in quantum regimes. The minimal model consists of a single two-level system with periodically modulated energy splitting that is permanently, weakly, coupled to two spectrally separated heat baths at different temperatures. The equation of motion allows us to compute the stationary power and heat currents in the machine consistent with the second law of thermodynamics. This dual-purpose machine can act as either an engine or a refrigerator (heat pump) depending on the modulation rate. In both modes of operation, the maximal Carnot efficiency is reached at zero power. We study the conditions for finite-time optimal performance for several variants of the model. Possible realizations of the model are discussed.

  3. Anaesthesia for minimally invasive surgery

    PubMed Central

    Dec, Marta

    2015-01-01

    Minimally invasive surgery (MIS) is rising in popularity. It offers well-known benefits to the patient. However, restricted access to the surgical site and gas insufflation into the body cavities may result in severe complications. From the anaesthetic point of view MIS poses unique challenges associated with creation of pneumoperitoneum, carbon dioxide absorption, specific positioning and monitoring a patient to whom the anaesthetist has often restricted access, in a poorly lit environment. Moreover, with refinement of surgical procedures and growing experience the anaesthetist is presented with patients from high-risk groups (obese, elderly, with advanced cardiac and respiratory disease) who once were deemed unsuitable for the laparoscopic technique. Anaesthetic management is aimed at getting the patient safely through the procedure, minimizing the specific risks arising from laparoscopy and the patient's coexisting medical problems, ensuring quick recovery and a relatively pain-free postoperative course with early return to normal function. PMID:26865885

  4. Minimal Doubling and Point Splitting

    SciTech Connect

    Creutz, M.

    2010-06-14

    Minimally-doubled chiral fermions have the unusual property of a single local field creating two fermionic species. Spreading the field over hypercubes allows construction of combinations that isolate specific modes. Combining these fields into bilinears produces meson fields of specific quantum numbers. Minimally-doubled fermion actions present the possibility of fast simulations while maintaining one exact chiral symmetry. They do, however, introduce some peculiar aspects. An explicit breaking of hyper-cubic symmetry allows additional counter-terms to appear in the renormalization. While a single field creates two different species, spreading this field over nearby sites allows isolation of specific states and the construction of physical meson operators. Finally, lattice artifacts break isospin and give two of the three pseudoscalar mesons an additional contribution to their mass. Depending on the sign of this mass splitting, one can either have a traditional Goldstone pseudoscalar meson or a parity breaking Aoki-like phase.

  5. Optimizing Processes to Minimize Risk

    NASA Technical Reports Server (NTRS)

    Loyd, David

    2017-01-01

    NASA, like other hazardous industries, has suffered catastrophic losses. Human error will likely never be completely eliminated as a factor in our failures. When you can't eliminate risk, focus on mitigating the worst consequences and recovering operations. Bolstering processes to emphasize the role of integration and problem solving is key to success. Building an effective Safety Culture bolsters skill-based performance that minimizes risk and encourages successful engagement.

  6. Principle of minimal work fluctuations

    NASA Astrophysics Data System (ADS)

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality 〈e-βW〉=e-βΔF, a change in the fluctuations of e-βW may impact how rapidly the statistical average of e-βW converges towards the theoretical value e-βΔF, where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e-βW. In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e-βW, where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain where the classical work protocol is realizable by an adiabatic process, then the classical adiabatic process also yields the minimal fluctuations in e-βW. Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014), 10.1103/PhysRevE.90.052132].

  7. Principle of minimal work fluctuations.

    PubMed

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality 〈e-βW〉=e-βΔF, a change in the fluctuations of e-βW may impact how rapidly the statistical average of e-βW converges towards the theoretical value e-βΔF, where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e-βW. In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e-βW, where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain where the classical work protocol is realizable by an adiabatic process, then the classical adiabatic process also yields the minimal fluctuations in e-βW. Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014)].
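
    The role of the fluctuations of e-βW is easy to see numerically. The Python sketch below assumes a toy Gaussian work distribution, for which the Jarzynski equality gives ΔF = μ - βσ²/2 exactly, and shows how slowly the finite-sample estimator converges:

        import numpy as np

        rng = np.random.default_rng(3)
        beta, mu, sigma = 1.0, 2.0, 1.2
        # for Gaussian work W ~ N(mu, sigma^2) one has
        # <exp(-beta*W)> = exp(-beta*mu + beta^2*sigma^2/2),
        # hence DeltaF = mu - beta * sigma**2 / 2 exactly
        dF_exact = mu - beta * sigma**2 / 2

        for n in (10**2, 10**4, 10**6):
            W = rng.normal(mu, sigma, n)
            dF_est = -np.log(np.mean(np.exp(-beta * W))) / beta
            print(f"n={n:>7}  estimate={dF_est:.4f}  exact={dF_exact:.4f}")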

  8. Outcomes After Minimally Invasive Esophagectomy

    PubMed Central

    Luketich, James D.; Pennathur, Arjun; Awais, Omar; Levy, Ryan M.; Keeley, Samuel; Shende, Manisha; Christie, Neil A.; Weksler, Benny; Landreneau, Rodney J.; Abbas, Ghulam; Schuchert, Matthew J.; Nason, Katie S.

    2014-01-01

    Background Esophagectomy is a complex operation and is associated with significant morbidity and mortality. In an attempt to lower morbidity, we have adopted a minimally invasive approach to esophagectomy. Objectives Our primary objective was to evaluate the outcomes of minimally invasive esophagectomy (MIE) in a large group of patients. Our secondary objective was to compare the modified McKeown minimally invasive approach (videothoracoscopic surgery, laparoscopy, neck anastomosis [MIE-neck]) with our current approach, a modified Ivor Lewis approach (laparoscopy, videothoracoscopic surgery, chest anastomosis [MIE-chest]). Methods We reviewed 1033 consecutive patients undergoing MIE. Elective operation was performed on 1011 patients; 22 patients with nonelective operations were excluded. Patients were stratified by surgical approach and perioperative outcomes analyzed. The primary endpoint studied was 30-day mortality. Results The MIE-neck was performed in 481 (48%) and MIE-Ivor Lewis in 530 (52%). Patients undergoing MIE-Ivor Lewis were operated in the current era. The median number of lymph nodes resected was 21. The operative mortality was 1.68%. Median length of stay (8 days) and ICU stay (2 days) were similar between the 2 approaches. Mortality rate was 0.9%, and recurrent nerve injury was less frequent in the Ivor Lewis MIE group (P < 0.001). Conclusions MIE in our center resulted in acceptable lymph node resection, postoperative outcomes, and low mortality using either an MIE-neck or an MIE-chest approach. The MIE Ivor Lewis approach was associated with reduced recurrent laryngeal nerve injury and mortality of 0.9% and is now our preferred approach. Minimally invasive esophagectomy can be performed safely, with good results in an experienced center. PMID:22668811

  9. Minimal massive 3D gravity

    NASA Astrophysics Data System (ADS)

    Bergshoeff, Eric; Hohm, Olaf; Merbis, Wout; Routh, Alasdair J.; Townsend, Paul K.

    2014-07-01

    We present an alternative to topologically massive gravity (TMG) with the same ‘minimal’ bulk properties; i.e. a single local degree of freedom that is realized as a massive graviton in linearization about an anti-de Sitter (AdS) vacuum. However, in contrast to TMG, the new ‘minimal massive gravity’ has both a positive energy graviton and positive central charges for the asymptotic AdS-boundary conformal algebra.

  10. Construction schedules slack time minimizing

    NASA Astrophysics Data System (ADS)

    Krzemiński, Michał

    2017-07-01

    The article presents two original models for minimizing the downtime of work brigades. The models have been developed for construction schedules executed using the uniform work method. Application of flow-shop models is possible and useful for the implementation of large objects, which can be divided into plots. The article also presents a condition describing which model should be used, as well as a brief example of schedule optimization. The optimization results confirm the validity of the work on the newly developed models.
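
    The article's own models are not reproduced here; as a classical point of reference from the same flow-shop family, Johnson's rule orders jobs on two machines so as to minimize the makespan. A minimal Python sketch with invented job times:

        def johnson_order(jobs):
            # jobs: (name, time on stage 1, time on stage 2); Johnson's rule
            # returns a sequence minimizing two-machine flow-shop makespan
            front, back = [], []
            for job in sorted(jobs, key=lambda j: min(j[1], j[2])):
                if job[1] <= job[2]:
                    front.append(job)    # bottleneck on stage 1: schedule early
                else:
                    back.insert(0, job)  # bottleneck on stage 2: schedule late
            return front + back

        jobs = [("A", 3, 6), ("B", 5, 2), ("C", 1, 2), ("D", 6, 6), ("E", 7, 5)]
        print([name for name, _, _ in johnson_order(jobs)])   # ['C', 'A', 'D', 'E', 'B']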

  11. Minimally invasive surgery. Future developments.

    PubMed Central

    Wickham, J. E.

    1994-01-01

    The rapid development of minimally invasive surgery means that there will be fundamental changes in interventional treatment. Technological advances will allow new minimally invasive procedures to be developed. Application of robotics will allow some procedures to be done automatically, and coupling of slave robotic instruments with virtual reality images will allow surgeons to perform operations by remote control. Miniature motors and instruments designed by microengineering could be introduced into body cavities to perform operations that are currently impossible. New materials will allow changes in instrument construction, such as use of memory metals to make heat activated scissors or forceps. With the reduced trauma associated with minimally invasive surgery, fewer operations will require long hospital stays. Traditional surgical wards will become largely redundant, and hospitals will need to cope with increased through-put of patients. Operating theatres will have to be equipped with complex high technology equipment, and hospital staff will need to be trained to manage it. Conventional nursing care will be carried out more in the community. Many traditional specialties will be merged, and surgical training will need fundamental revision to ensure that surgeons are competent to carry out the new procedures. PMID:8312776

  12. Minimally invasive paediatric cardiac surgery.

    PubMed

    Bacha, Emile; Kalfa, David

    2014-01-01

    The concept of minimally invasive surgery for congenital heart disease in paediatric patients is broad, and has the aim of reducing the trauma of the operation at each stage of management. Firstly, in the operating room using minimally invasive incisions, video-assisted thoracoscopic and robotically assisted surgery, hybrid procedures, image-guided intracardiac surgery, and minimally invasive cardiopulmonary bypass strategies. Secondly, in the intensive-care unit with neuroprotection and 'fast-tracking' strategies that involve early extubation, early hospital discharge, and less exposure to transfused blood products. Thirdly, during postoperative mid-term and long-term follow-up by providing the children and their families with adequate support after hospital discharge. Improvement of these strategies relies on the development of new devices, real-time multimodality imaging, aids to instrument navigation, miniaturized and specialized instrumentation, robotic technology, and computer-assisted modelling of flow dynamics and tissue mechanics. In addition, dedicated multidisciplinary co-ordinated teams involving congenital cardiac surgeons, perfusionists, intensivists, anaesthesiologists, cardiologists, nurses, psychologists, and counsellors are needed before, during, and after surgery to go beyond apparent technological and medical limitations with the goal to 'treat more while hurting less'.

  13. Minimal Absent Words in Four Human Genome Assemblies

    PubMed Central

    Garcia, Sara P.; Pinho, Armando J.

    2011-01-01

    Minimal absent words have been computed in genomes of organisms from all domains of life. Here, we aim to contribute to the catalogue of human genomic variation by investigating the variation in number and content of minimal absent words within a species, using four human genome assemblies. We compare the reference human genome GRCh37 assembly, the HuRef assembly of the genome of Craig Venter, the NA12878 assembly from cell line GM12878, and the YH assembly of the genome of a Han Chinese individual. We find the variation in number and content of minimal absent words between assemblies more significant for large and very large minimal absent words, where the biases of sequencing and assembly methodologies become more pronounced. Moreover, we find generally greater similarity between the human genome assemblies sequenced with capillary-based technologies (GRCh37 and HuRef) than between the human genome assemblies sequenced with massively parallel technologies (NA12878 and YH). Finally, as expected, we find the overall variation in number and content of minimal absent words within a species to be generally smaller than the variation between species. PMID:22220210
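
    Operationally, a word is a minimal absent word if it does not occur in the sequence but both of its maximal proper factors do. The naive Python enumeration below illustrates the definition on a toy sequence; genome-scale computations use suffix structures instead.

        from itertools import product

        def minimal_absent_words(s, alphabet, max_len=4):
            # brute force: w is a MAW if w is absent but w[1:] and w[:-1] occur
            factors = {s[i:j] for i in range(len(s)) for j in range(i + 1, len(s) + 1)}
            factors.add("")   # the empty word occurs everywhere
            maws = []
            for k in range(1, max_len + 1):
                for w in map("".join, product(alphabet, repeat=k)):
                    if w not in factors and w[1:] in factors and w[:-1] in factors:
                        maws.append(w)
            return maws

        # the 12 dinucleotides other than AC, CG, GT and TA are the MAWs here
        print(minimal_absent_words("ACGTACGT", "ACGT", max_len=3))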

  14. Multiple myeloma, immunotherapy and minimal residual disease.

    PubMed

    Kusenda, J; Kovarikova, A

    2016-01-01

    Multiple myeloma (MM) is an incurable, heterogeneous hematological malignancy in which relapse is characterized by re-growth of residual tumor and immune suppression, with a complex biology that affects many aspects of the disease and its response to treatment. The bone marrow microenvironment, including immune cells, plays a central role in MM pathogenesis, survival, and drug resistance. Advances in basic and translational research and the introduction of novel agents, particularly combination therapies, have improved indicators of quality of life and survival. Minimal residual disease (MRD) detection by multiparameter flow cytometry (MFC) has revolutionized monitoring of treatment response in MM. The importance of MFC methodology will be further strengthened by the ongoing international standardization efforts. Results of MRD testing provide unique and clinically important information and have demonstrated the prognostic significance of MRD, which is used to regulate treatment intensity in many contemporary protocols. In this review, we summarize the principal approaches in MM immunotherapy, focusing on how new agents have potential in the treatment of MM and on how application of MRD detection by MFC as a surrogate endpoint would allow quicker evaluation of treatment outcomes and rapid identification of effective new therapies.

  15. Temporal structure of consciousness and minimal self in schizophrenia

    PubMed Central

    Martin, Brice; Wittmann, Marc; Franck, Nicolas; Cermolacce, Michel; Berna, Fabrice; Giersch, Anne

    2014-01-01

    The concept of the minimal self refers to the consciousness of oneself as an immediate subject of experience. According to recent studies, disturbances of the minimal self may be a core feature of schizophrenia. They are emphasized in classical psychiatry literature and in phenomenological work. Impaired minimal self-experience may be defined as a distortion of one’s first-person experiential perspective as, for example, an “altered presence” during which the sense of the experienced self (“mineness”) is subtly affected, or “altered sense of demarcation,” i.e., a difficulty discriminating the self from the non-self. Little is known, however, about the cognitive basis of these disturbances. In fact, recent work indicates that disorders of the self are not correlated with cognitive impairments commonly found in schizophrenia such as working-memory and attention disorders. In addition, a major difficulty with exploring the minimal self experimentally lies in its definition as being non-self-reflexive, and distinct from the verbalized, explicit awareness of an “I.” In this paper, we shall discuss the possibility that disturbances of the minimal self observed in patients with schizophrenia are related to alterations in time processing. We shall review the literature on schizophrenia and time processing that lends support to this possibility. In particular we shall discuss the involvement of temporal integration windows on different time scales (implicit time processing) as well as duration perception disturbances (explicit time processing) in disorders of the minimal self. We argue that a better understanding of the relationship between time and the minimal self as well of issues of embodiment require research that looks more specifically at implicit time processing. Some methodological issues will be discussed. PMID:25400597

  16. Temporal structure of consciousness and minimal self in schizophrenia.

    PubMed

    Martin, Brice; Wittmann, Marc; Franck, Nicolas; Cermolacce, Michel; Berna, Fabrice; Giersch, Anne

    2014-01-01

    The concept of the minimal self refers to the consciousness of oneself as an immediate subject of experience. According to recent studies, disturbances of the minimal self may be a core feature of schizophrenia. They are emphasized in classical psychiatry literature and in phenomenological work. Impaired minimal self-experience may be defined as a distortion of one's first-person experiential perspective as, for example, an "altered presence" during which the sense of the experienced self ("mineness") is subtly affected, or "altered sense of demarcation," i.e., a difficulty discriminating the self from the non-self. Little is known, however, about the cognitive basis of these disturbances. In fact, recent work indicates that disorders of the self are not correlated with cognitive impairments commonly found in schizophrenia such as working-memory and attention disorders. In addition, a major difficulty with exploring the minimal self experimentally lies in its definition as being non-self-reflexive, and distinct from the verbalized, explicit awareness of an "I." In this paper, we shall discuss the possibility that disturbances of the minimal self observed in patients with schizophrenia are related to alterations in time processing. We shall review the literature on schizophrenia and time processing that lends support to this possibility. In particular we shall discuss the involvement of temporal integration windows on different time scales (implicit time processing) as well as duration perception disturbances (explicit time processing) in disorders of the minimal self. We argue that a better understanding of the relationship between time and the minimal self as well of issues of embodiment require research that looks more specifically at implicit time processing. Some methodological issues will be discussed.

  17. Methodology for research I

    PubMed Central

    Garg, Rakesh

    2016-01-01

    The conduct of research requires a systematic approach involving diligent planning and its execution as planned. It comprises various essential predefined components such as aims, population, conduct/technique, outcome and statistical considerations. These need to be objective, reliable and in a repeatable format. Hence, the understanding of the basic aspects of methodology is essential for any researcher. This is a narrative review and focuses on various aspects of the methodology for conduct of a clinical research. The relevant keywords were used for literature search from various databases and from bibliographies of the articles. PMID:27729690

  18. Minimally invasive therapy in Denmark.

    PubMed

    Schou, I

    1993-01-01

    Minimally invasive therapy (MIT) is beginning to have impacts on health care in Denmark, although diffusion has been delayed compared to diffusion in other European countries. Now policy makers are beginning to appreciate the potential advantages in terms of closing hospitals and shifting treatment to the out-patient setting, and diffusion will probably go faster in the future. Denmark does not have a system for technology assessment, neither central nor regional, and there is no early warning mechanism to survey international developments. This implies lack of possibilities for the planning of diffusion, training, and criteria for treatment.

  19. Risk minimization through portfolio replication

    NASA Astrophysics Data System (ADS)

    Ciliberti, S.; Mézard, M.

    2007-05-01

    We use a replica approach to deal with portfolio optimization problems. A given risk measure is minimized using empirical estimates of asset values correlations. We study the phase transition which happens when the time series is too short with respect to the size of the portfolio. We also study the noise sensitivity of portfolio allocation when this transition is approached. We consider explicitly the cases where the absolute deviation and the conditional value-at-risk are chosen as a risk measure. We show how the replica method can study a wide range of risk measures, and deal with various types of time series correlations, including realistic ones with volatility clustering.
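
    The noise sensitivity can be reproduced with a toy numpy experiment (this is not the replica calculation): estimate a covariance matrix from a finite time series, build the minimum-variance portfolio from it, and compare its true risk with the optimum. The inflation factor grows without bound as the series length T approaches the portfolio size N.

        import numpy as np

        rng = np.random.default_rng(4)
        N, T = 30, 60           # portfolio size vs. length of the return time series
        true_cov = np.eye(N)    # toy market: independent unit-variance assets
        returns = rng.multivariate_normal(np.zeros(N), true_cov, size=T)
        emp_cov = np.cov(returns, rowvar=False)

        def min_variance_weights(cov):
            # fully invested minimum-variance portfolio: w proportional to cov^{-1} 1
            w = np.linalg.solve(cov, np.ones(cov.shape[0]))
            return w / w.sum()

        w = min_variance_weights(emp_cov)
        realized = w @ true_cov @ w   # true risk of the noise-fitted portfolio
        print("risk inflation over the optimum:", realized / (1.0 / N))   # roughly 2 here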

  20. Prepulse minimization in KALI-5000.

    PubMed

    Kumar, D Durga Praveen; Mitra, S; Senthil, K; Sharma, Vishnu K; Singh, S K; Roy, A; Sharma, Archana; Nagesh, K V; Chakravarthy, D P

    2009-07-01

    A pulse power system (1 MV, 50 kA, and 100 ns) based on Marx generator and Blumlein pulse forming line has been built for generating high power microwaves. The Blumlein configuration poses a prepulse problem and hence the diode gap had to be increased to match the diode impedance to the Blumlein impedance during the main pulse. A simple method to eliminate prepulse voltage using a vacuum sparkgap and a resistor is given. Another fundamental approach of increasing the inductance of Marx generator to minimize the prepulse voltage is also presented. Experimental results for both of these configurations are given.

  1. Prepulse minimization in KALI-5000

    NASA Astrophysics Data System (ADS)

    Kumar, D. Durga Praveen; Mitra, S.; Senthil, K.; Sharma, Vishnu K.; Singh, S. K.; Roy, A.; Sharma, Archana; Nagesh, K. V.; Chakravarthy, D. P.

    2009-07-01

    A pulse power system (1 MV, 50 kA, and 100 ns) based on Marx generator and Blumlein pulse forming line has been built for generating high power microwaves. The Blumlein configuration poses a prepulse problem and hence the diode gap had to be increased to match the diode impedance to the Blumlein impedance during the main pulse. A simple method to eliminate prepulse voltage using a vacuum sparkgap and a resistor is given. Another fundamental approach of increasing the inductance of Marx generator to minimize the prepulse voltage is also presented. Experimental results for both of these configurations are given.

  2. About the ZOOM minimization package

    SciTech Connect

    Fischler, M.; Sachs, D.; /Fermilab

    2004-11-01

    A new object-oriented Minimization package is available for distribution in the same manner as CLHEP. This package, designed for use in HEP applications, has all the capabilities of Minuit, but is a re-write from scratch, adhering to modern C++ design principles. A primary goal of this package is extensibility in several directions, so that its capabilities can be kept fresh with as little maintenance effort as possible. This package is distinguished by the priority that was assigned to C++ design issues, and the focus on producing an extensible system that will resist becoming obsolete.

  3. Evidence-Based Integrated Environmental Solutions For Secondary Lead Smelters: Pollution Prevention And Waste Minimization Technologies And Practices

    EPA Science Inventory

    An evidence-based methodology was adopted in this research to establish strategies to increase lead recovery and recycling via a systematic review and critical appraisal of the published literature. In particular, the research examines pollution prevention and waste minimization...

  5. Minimizing travel claims cost with minimal-spanning tree model

    NASA Astrophysics Data System (ADS)

    Jamalluddin, Mohd Helmi; Jaafar, Mohd Azrul; Amran, Mohd Iskandar; Ainul, Mohd Sharizal; Hamid, Aqmar; Mansor, Zafirah Mohd; Nopiah, Zulkifli Mohd

    2014-06-01

    Official travel necessitates substantial expenditure, as has been shown by the National Audit Department (NAD). Every year an auditing process involving official travel claims is carried out throughout the country. This study focuses on the use of the Spanning Tree model to determine the shortest paths so as to minimize the cost of the NAD's official travel claims. The objective is to study the possibility of running a network from the Kluang District Health Office to eight rural clinics in Johor state, using Spanning Tree model applications to optimize travelling distances, and to make recommendations to the senior management of the Audit Department to analyze travelling details before an audit is conducted. The results of this study reveal savings of up to 47.4% of the original claims over the travelled distances.
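
    For reference, the minimal spanning tree of a small symmetric distance matrix can be computed with Prim's algorithm in a few lines of Python. The distances below are hypothetical stand-ins for office-to-clinic road distances, not the NAD's data.

        import heapq

        def minimum_spanning_tree(dist):
            # Prim's algorithm on a dense symmetric distance matrix
            n = len(dist)
            in_tree = [False] * n
            edges, total = [], 0.0
            heap = [(0.0, 0, -1)]   # (cost, node, parent); start from node 0
            while heap:
                cost, u, parent = heapq.heappop(heap)
                if in_tree[u]:
                    continue
                in_tree[u] = True
                total += cost
                if parent >= 0:
                    edges.append((parent, u, cost))
                for v in range(n):
                    if not in_tree[v]:
                        heapq.heappush(heap, (dist[u][v], v, u))
            return edges, total

        dist = [[0, 12, 19, 8],    # hypothetical distances (km) between an
                [12, 0, 7, 14],    # office (node 0) and three clinics (1-3)
                [19, 7, 0, 9],
                [8, 14, 9, 0]]
        edges, total = minimum_spanning_tree(dist)
        print("tree edges:", edges)       # [(0, 3, 8), (3, 2, 9), (2, 1, 7)]
        print("total distance:", total)   # 24.0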

  6. The minimal time detection algorithm

    NASA Technical Reports Server (NTRS)

    Kim, Sungwan

    1995-01-01

    An aerospace vehicle may operate throughout a wide range of flight environmental conditions that affect its dynamic characteristics. Even when the control design incorporates a degree of robustness, system parameters may drift enough to cause its performance to degrade below an acceptable level. The object of this paper is to develop a change detection algorithm so that we can build a highly adaptive control system applicable to aircraft systems. The idea is to detect system changes with minimal time delay. The algorithm developed is called the Minimal Time-Change Detection Algorithm (MT-CDA), which detects the instant of change as quickly as possible with false-alarm probability below a certain specified level. Simulation results for the aircraft lateral motion with a known or unknown change in control gain matrices, in the presence of a doublet input, indicate that the algorithm works fairly well as theory indicates, though there is a difficulty in deciding the exact amount of change in some situations. One of MT-CDA's distinguishing properties is that its detection delay is superior to that of the Whiteness Test.
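
    MT-CDA itself is specified in the paper. As a classical point of comparison, the CUSUM detector, which minimizes worst-case detection delay subject to a false-alarm constraint, fits in a few lines of Python; all parameters below are illustrative and this is not the paper's algorithm.

        import numpy as np

        def cusum_detect(x, mu0, mu1, sigma, threshold):
            # one-sided CUSUM for a mean shift mu0 -> mu1 in Gaussian noise; a
            # larger threshold lowers the false-alarm rate but lengthens the delay
            llr = (mu1 - mu0) / sigma**2 * (x - (mu0 + mu1) / 2.0)
            s = 0.0
            for i, step in enumerate(llr):
                s = max(0.0, s + step)
                if s > threshold:
                    return i   # first sample at which the change is declared
            return -1

        rng = np.random.default_rng(5)
        signal = np.concatenate([rng.normal(0.0, 1.0, 200),    # nominal regime
                                 rng.normal(0.8, 1.0, 100)])   # change at sample 200
        print("declared change at sample:", cusum_detect(signal, 0.0, 0.8, 1.0, 8.0))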

  6. Less minimal supersymmetric standard model

    SciTech Connect

    de Gouvea, Andre; Friedland, Alexander; Murayama, Hitoshi

    1998-03-28

    Most of the phenomenological studies of supersymmetry have been carried out using the so-called minimal supergravity scenario, where one assumes a universal scalar mass, gaugino mass, and trilinear coupling at M{sub GUT}. Even though this is a useful simplifying assumption for phenomenological analyses, it is rather too restrictive to accommodate a large variety of phenomenological possibilities. It predicts, among other things, that the lightest supersymmetric particle (LSP) is an almost pure B-ino, and that the {mu}-parameter is larger than the masses of the SU(2){sub L} and U(1){sub Y} gauginos. We extend the minimal supergravity framework by introducing one extra parameter: the Fayet-Iliopoulos D-term for the hypercharge U(1), D{sub Y}. Allowing for this extra parameter, we find a much more diverse phenomenology, where the LSP is {tilde {nu}}{sub {tau}}, {tilde {tau}} or a neutralino with a large higgsino content. We discuss the relevance of the different possibilities to collider signatures. The same type of extension can be done to models with the gauge mediation of supersymmetry breaking. We argue that it is not wise to impose cosmological constraints on the parameter space.

  7. Annual Waste Minimization Summary Report

    SciTech Connect

    Alfred J. Karns

    2007-01-01

    This report summarizes the waste minimization efforts undertaken by National Security Technologies, LLC (NSTec), for the U. S. Department of Energy (DOE) National Nuclear Security Administration Nevada Site Office (NNSA/NSO), during CY06. This report was developed in accordance with the requirements of the Nevada Test Site (NTS) Resource Conservation and Recovery Act (RCRA) Permit (No. NEV HW0021) and as clarified in a letter dated April 21, 1995, from Paul Liebendorfer of the Nevada Division of Environmental Protection to Donald Elle of the DOE, Nevada Operations Office. The NNSA/NSO Pollution Prevention (P2) Program establishes a process to reduce the volume and toxicity of waste generated by the NNSA/NSO and ensures that proposed methods of treatment, storage, and/or disposal of waste minimize potential threats to human health and the environment. The following information provides an overview of the P2 Program, major P2 accomplishments during the reporting year, a comparison of the current year waste generation to prior years, and a description of efforts undertaken during the year to reduce the volume and toxicity of waste generated by the NNSA/NSO.

  8. Next-to-minimal SOFTSUSY

    NASA Astrophysics Data System (ADS)

    Allanach, B. C.; Athron, P.; Tunstall, Lewis C.; Voigt, A.; Williams, A. G.

    2014-09-01

    We describe an extension to the SOFTSUSY program that provides for the calculation of the sparticle spectrum in the Next-to-Minimal Supersymmetric Standard Model (NMSSM), where a chiral superfield that is a singlet of the Standard Model gauge group is added to the Minimal Supersymmetric Standard Model (MSSM) fields. Often, a Z3 symmetry is imposed upon the model. SOFTSUSY can calculate the spectrum in this case as well as the case where general Z3-violating terms are added to the soft supersymmetry breaking terms and the superpotential. The user provides a theoretical boundary condition for the couplings and mass terms of the singlet. Radiative electroweak symmetry breaking data along with electroweak and CKM matrix data are used as weak-scale boundary conditions. The renormalisation group equations are solved numerically between the weak scale and a high energy scale using a nested iterative algorithm. This paper serves as a manual to the NMSSM mode of the program, detailing the approximations and conventions used. Catalogue identifier: ADPM_v4_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADPM_v4_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 154886 No. of bytes in distributed program, including test data, etc.: 1870890 Distribution format: tar.gz Programming language: C++, Fortran. Computer: Personal computer. Operating system: Tested on Linux 3.x. Word size: 64 bits Classification: 11.1, 11.6. Does the new version supersede the previous version?: Yes Catalogue identifier of previous version: ADPM_v3_0 Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 785 Nature of problem: Calculating supersymmetric particle spectrum and mixing parameters in the next-to-minimal supersymmetric standard model. The solution to the

  9. Empowering Research Methodologies.

    ERIC Educational Resources Information Center

    Lather, Patti

    Neo-Marxist theory provides a better tool for educational researchers than other research methodologies because of its focus on empowering the dispossessed and its interest in the relationships between human activity and material circumstances. Traditional educational research is rooted in the positivist tradition and claims to be value neutral…

  10. Video: Modalities and Methodologies

    ERIC Educational Resources Information Center

    Hadfield, Mark; Haw, Kaye

    2012-01-01

    In this article, we set out to explore what we describe as the use of video in various modalities. For us, modality is a synthesizing construct that draws together and differentiates between the notion of "video" both as a method and as a methodology. It encompasses the use of the term video as both product and process, and as a data…

  11. Courseware Engineering Methodology.

    ERIC Educational Resources Information Center

    Uden, Lorna

    2002-01-01

    Describes development of the Courseware Engineering Methodology (CEM), created to guide novices in designing effective courseware. Discusses CEM's four models: pedagogical (concerned with the courseware's pedagogical aspects), conceptual (dealing with software engineering), interface (relating to human-computer interaction), and hypermedia…

  12. Complicating Methodological Transparency

    ERIC Educational Resources Information Center

    Bridges-Rhoads, Sarah; Van Cleave, Jessica; Hughes, Hilary E.

    2016-01-01

    A historical indicator of the quality, validity, and rigor of qualitative research has been the documentation and disclosure of the behind-the-scenes work of the researcher. In this paper, we use what we call "methodological data" as a tool to complicate the possibility and desirability of such transparency. Specifically, we draw on our…

  13. A methodology for distributed fault diagnosis

    NASA Astrophysics Data System (ADS)

    Gupta, V.; Puig, V.; Blesa, J.

    2017-01-01

    In this paper, a methodology for distributed fault diagnosis is proposed. The algorithm places the sensors in a system in such a manner that partitioning the system into various subsystems becomes easier, facilitating the implementation of a distributed fault diagnosis system. The algorithm also minimizes the number of sensors to be installed, thus reducing overall cost. Binary integer linear programming is used for the optimization. A real case study of the Barcelona water network has been used to demonstrate and validate the proposed algorithm.
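
    The abstract does not give the exact formulation; one common binary integer linear programming reading of sensor placement is a set-cover problem: install the fewest sensors such that every fault is detectable by at least one of them. The sketch below, with an invented fault/sensor incidence table, uses the PuLP modeller:

        from pulp import LpProblem, LpMinimize, LpVariable, lpSum

        # covers[f] lists the candidate sensors able to detect fault f
        # (hypothetical incidence data).
        covers = {"f1": ["s1", "s2"], "f2": ["s2", "s3"], "f3": ["s1", "s3", "s4"]}
        sensors = sorted({s for ss in covers.values() for s in ss})

        prob = LpProblem("sensor_placement", LpMinimize)
        x = {s: LpVariable(s, cat="Binary") for s in sensors}
        prob += lpSum(x.values())                 # minimize installed sensors
        for ss in covers.values():
            prob += lpSum(x[s] for s in ss) >= 1  # every fault stays detectable

        prob.solve()
        print([s for s in sensors if x[s].value() == 1])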

  14. Improved methodology for generating controlled test atmospheres.

    PubMed

    Miller, R R; Letts, R L; Potts, W J; McKenna, M J

    1980-11-01

    Improved methodology has been developed for generating controlled test atmospheres. Vaporization of volatile liquids is accomplished in a 28 mm (O.D.) glass J-tube in conjunction with a compressed air flameless heat torch, a pressure-sensitive switch, and a positive displacement piston pump. The vaporization system has been very reliable with a variety of test materials in studies ranging from a few days to several months. The J-tube vaporization assembly minimizes the possibility of thermal decomposition of the test material and affords a better margin of safety when vaporizing potentially explosive materials.

  15. Development of a flight software testing methodology

    NASA Technical Reports Server (NTRS)

    Mccluskey, E. J.; Andrews, D. M.

    1985-01-01

    The research to develop a testing methodology for flight software is described. An experiment was conducted in using assertions to dynamically test digital flight control software. The experiment showed that 87% of typical errors introduced into the program would be detected by assertions. Detailed analysis of the test data showed that the number of assertions needed to detect those errors could be reduced to a minimal set. The analysis also revealed that the most effective assertions tested program parameters that provided greater indirect (collateral) testing of other parameters. In addition, a prototype watchdog task system was built to evaluate the effectiveness of executing assertions in parallel by using the multitasking features of Ada.
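
    The experiment's assertions were embedded in the flight software itself; purely as an illustration of the idea, the Python fragment below shows an executable assertion that checks a computed control parameter against a plausible range (the function, gain, and limits are invented):

        def update_pitch_command(pitch_rate_dps, gain=0.8):
            cmd = gain * pitch_rate_dps
            # Executable assertion: a violated range check exposes a latent
            # error immediately instead of letting it propagate downstream.
            assert -25.0 <= cmd <= 25.0, f"pitch command out of range: {cmd}"
            return cmd

        print(update_pitch_command(10.0))  # 8.0, assertion passes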

  16. Update on designing and building minimal cells

    PubMed Central

    Jewett, Michael C.; Forster, Anthony C.

    2010-01-01

    Summary Minimal cells comprise only the genes and biomolecular machinery necessary for basic life. Synthesizing minimal and minimized cells will improve understanding of core biology, enhance development of biotechnology strains of bacteria, and enable evolutionary optimization of natural and unnatural biopolymers. Design and construction of minimal cells is proceeding in two different directions: “top-down” reduction of bacterial genomes in vivo and “bottom-up” integration of DNA/RNA/protein/membrane syntheses in vitro. Major progress in the last 5 years has occurred in synthetic genomics, minimization of the Escherichia coli genome, sequencing of minimal bacterial endosymbionts, identification of essential genes, and integration of biochemical systems. PMID:20638265

  17. Perturbation resilience and superiorization methodology of averaged mappings

    NASA Astrophysics Data System (ADS)

    He, Hongjin; Xu, Hong-Kun

    2017-04-01

    We first prove the bounded perturbation resilience for the successive fixed point algorithm of averaged mappings, which extends the string-averaging projection and block-iterative projection methods. We then apply the superiorization methodology to a constrained convex minimization problem where the constraint set is the intersection of fixed point sets of a finite family of averaged mappings.
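
    In the standard notation of the superiorization literature (a sketch of the setting, not the paper's exact statement), bounded perturbation resilience of the fixed-point iteration means that convergence survives summable perturbations:

        x_{k+1} = T\left(x_k + \beta_k v_k\right), \qquad \sum_{k=0}^{\infty} \beta_k < \infty, \quad \sup_k \|v_k\| < \infty,

    where T is an averaged mapping; the superiorization methodology then steers the iterates by choosing the v_k as descent directions for the objective being minimized.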

  18. A POLLUTION REDUCTION METHODOLOGY FOR CHEMICAL PROCESS SIMULATORS

    EPA Science Inventory

    A pollution minimization methodology was developed for chemical process design using computer simulation. It is based on a pollution balance that at steady state is used to define a pollution index with units of mass of pollution per mass of products. The pollution balance has be...
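
    Written out (a sketch consistent with the definition quoted above; the symbols are ours, not the report's), the steady-state pollution index is the ratio of pollutant to product mass flows:

        I = \frac{\sum_j \dot{m}_j^{\mathrm{pollutant}}}{\sum_i \dot{m}_i^{\mathrm{product}}}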

  1. Strategies to Minimize Antibiotic Resistance

    PubMed Central

    Lee, Chang-Ro; Cho, Ill Hwan; Jeong, Byeong Chul; Lee, Sang Hee

    2013-01-01

    Antibiotic resistance can be reduced by using antibiotics prudently based on guidelines of antimicrobial stewardship programs (ASPs) and various data such as pharmacokinetic (PK) and pharmacodynamic (PD) properties of antibiotics, diagnostic testing, antimicrobial susceptibility testing (AST), clinical response, and effects on the microbiota, as well as by new antibiotic developments. The controlled use of antibiotics in food animals is another cornerstone among efforts to reduce antibiotic resistance. All major resistance-control strategies recommend education for patients, children (e.g., through schools and day care), the public, and relevant healthcare professionals (e.g., primary-care physicians, pharmacists, and medical students) regarding unique features of bacterial infections and antibiotics, prudent antibiotic prescribing as a positive construct, and personal hygiene (e.g., handwashing). The problem of antibiotic resistance can be minimized only by concerted efforts of all members of society for ensuring the continued efficiency of antibiotics. PMID:24036486

  2. Minimally Invasive Spigelian Hernia Repair

    PubMed Central

    Baucom, Catherine; Nguyen, Quan D.; Hidalgo, Marco

    2009-01-01

    Introduction: Spigelian hernia is an uncommon ventral hernia characterized by a defect in the linea semilunaris. Repair of spigelian hernia has traditionally been accomplished via an open transverse incision and primary repair. The purpose of this article is to present 2 case reports of incarcerated spigelian hernia that were successfully repaired laparoscopically using Gortex mesh and to present a review of the literature regarding laparoscopic repair of spigelian hernias. Methods: Retrospective chart review and Medline literature search. Results: Two patients underwent laparoscopic mesh repair of incarcerated spigelian hernias. Both were started on a regular diet on postoperative day 1 and discharged on postoperative days 2 and 3. One patient developed a seroma that resolved without intervention. There was complete resolution of preoperative symptoms at the 12-month follow-up. Conclusion: Minimally invasive repair of spigelian hernias is an alternative to the traditional open surgical technique. Further studies are needed to directly compare the open and the laparoscopic repair. PMID:19660230

  3. [MINIMALLY INVASIVE AORTIC VALVE REPLACEMENT].

    PubMed

    Tabata, Minoru

    2016-03-01

    Minimally invasive aortic valve replacement (MIAVR) is defined as aortic valve replacement avoiding full sternotomy. Common approaches include a partial sternotomy, a right thoracotomy, and a parasternal approach. MIAVR has been shown to have advantages over conventional AVR, such as shorter length of stay, a smaller amount of blood transfusion, and better cosmesis. However, it is also known to have disadvantages, such as longer cardiopulmonary bypass and aortic cross-clamp times and potential complications related to peripheral cannulation. Appropriate patient selection is very important. Since the procedure is more complex than conventional AVR, more intensive teamwork in the operating room is essential. Additionally, a team approach during postoperative management is critical to maximize the benefits of MIAVR.

  4. Minimally packed phases in holography

    NASA Astrophysics Data System (ADS)

    Donos, Aristomenis; Gauntlett, Jerome P.

    2016-03-01

    We numerically construct asymptotically AdS black brane solutions of D = 4 Einstein-Maxwell theory coupled to a pseudoscalar. The solutions are holographically dual to d = 3 CFTs at finite chemical potential and in a constant magnetic field, which spontaneously break translation invariance leading to the spontaneous formation of abelian and momentum magnetisation currents flowing around the plaquettes of a periodic Bravais lattice. We analyse the three-dimensional moduli space of lattice solutions, which are generically oblique, and show, for a specific value of the magnetic field, that the free energy is minimised by the triangular lattice, associated with minimal packing of circles in the plane. We show that the average stress tensor for the thermodynamically preferred phase is that of a perfect fluid and that this result applies more generally to spontaneously generated periodic phases. The triangular structure persists at low temperatures indicating the existence of novel crystalline ground states.

  5. Strategies to minimize antibiotic resistance.

    PubMed

    Lee, Chang-Ro; Cho, Ill Hwan; Jeong, Byeong Chul; Lee, Sang Hee

    2013-09-12

    Antibiotic resistance can be reduced by using antibiotics prudently based on guidelines of antimicrobial stewardship programs (ASPs) and various data such as pharmacokinetic (PK) and pharmacodynamic (PD) properties of antibiotics, diagnostic testing, antimicrobial susceptibility testing (AST), clinical response, and effects on the microbiota, as well as by new antibiotic developments. The controlled use of antibiotics in food animals is another cornerstone among efforts to reduce antibiotic resistance. All major resistance-control strategies recommend education for patients, children (e.g., through schools and day care), the public, and relevant healthcare professionals (e.g., primary-care physicians, pharmacists, and medical students) regarding unique features of bacterial infections and antibiotics, prudent antibiotic prescribing as a positive construct, and personal hygiene (e.g., handwashing). The problem of antibiotic resistance can be minimized only by concerted efforts of all members of society for ensuring the continued efficiency of antibiotics.

  6. Air Pollutants Minimalization of Pollutant Absorber with Condensation System

    NASA Astrophysics Data System (ADS)

    Ruhiat, Yayat; Catur Wibowo, Firmanul; Oktarisa, Yuvita

    2017-05-01

    Industrial development has implications for pollution, one of which is air pollution. The amount of air pollutants emitted by an industrial plant depends on several factors: fuel capacity, chimney height, and atmospheric stability. To minimize pollutants emitted by industry, a tool called the Pollutant Absorber (PA), based on a condensing system, was created. Research & Development with a Design for Production approach was used as the methodology for building the PA. To test the function of the PA, a simulation was run using emissions data from the Cilegon industrial area. The simulation results over a 15-year period showed that the PA was able to reduce pollutant emissions of SO2 by 38%, NOx by 37%, and dust by 64%. The differences in absorption among pollutants reveal a weakness in the particle separation process in the separator. This occurs because the condensation process is less than optimal during absorption and separation in the separator.

  7. MINIMIZATION OF CARBON LOSS IN COAL REBURNING

    SciTech Connect

    Vladimir M. Zamansky; Vitali V. Lissianski

    2001-09-07

    This project develops Fuel-Flexible Reburning (FFR), which combines conventional reburning and Advanced Reburning (AR) technologies with an innovative method of delivering coal as the reburning fuel. The overall objective of this project is to develop the engineering and scientific information and know-how needed to reduce the cost of reburning via increased efficiency and minimized carbon in ash, and to move the FFR technology to the demonstration and commercialization stage. Specifically, the project entails: (1) optimizing FFR with injection of gasified and partially gasified fuels with respect to NO{sub x} and carbon-in-ash reduction; (2) characterizing flue gas emissions; (3) developing a process model to predict FFR performance; (4) completing an engineering and economic analysis of FFR as compared to conventional reburning and other commercial NO{sub x} control technologies; and (5) developing a full-scale FFR design methodology. The project started in August 2000 and will be conducted over a two-year period. The work includes a combination of analytical and experimental studies to identify optimum process configurations and develop a design methodology for full-scale applications. The first year of the program included pilot-scale tests to evaluate the performance of two bituminous coals in basic reburning and modeling studies designed to identify parameters that affect FFR performance and to evaluate the efficiency of coal pyrolysis products as a reburning fuel. Tests were performed in a 300 kW Boiler Simulator Facility to characterize bituminous coals as reburning fuels. Tests showed that NO{sub x} reduction in basic coal reburning depends on process conditions, initial NO{sub x} and coal type. Up to 60% NO{sub x} reduction was achieved at optimized conditions. Modeling activities during the first year concentrated on the development of a coal reburning model and on the prediction of NO{sub x} reduction in reburning by coal gasification products. Modeling predicted that

  8. Soft Systems Methodology

    NASA Astrophysics Data System (ADS)

    Checkland, Peter; Poulter, John

    Soft systems methodology (SSM) is an approach for tackling problematical, messy situations of all kinds. It is an action-oriented process of inquiry into problematic situations in which users learn their way from finding out about the situation to taking action to improve it. The learning emerges via an organised process in which the situation is explored using a set of models of purposeful action (each built to encapsulate a single worldview) as intellectual devices, or tools, to inform and structure discussion about a situation and how it might be improved. This paper, written by the original developer Peter Checkland and practitioner John Poulter, gives a clear and concise account of the approach that covers SSM's specific techniques, the learning cycle process of the methodology and the craft skills which practitioners develop. This concise but theoretically robust account nevertheless includes the fundamental concepts, techniques, and core tenets, described through a wide range of settings.

  9. Tobacco documents research methodology

    PubMed Central

    McCandless, Phyra M; Klausner, Kim; Taketa, Rachel; Yerger, Valerie B

    2011-01-01

    Tobacco documents research has developed into a thriving academic enterprise since its inception in 1995. The technology supporting tobacco documents archiving, searching and retrieval has improved greatly since that time, and consequently tobacco documents researchers have considerably more access to resources than was the case when researchers had to travel to physical archives and/or electronically search poorly and incompletely indexed documents. The authors of the papers presented in this supplement all followed the same basic research methodology. Rather than leave the reader of the supplement to read the same discussion of methods in each individual paper, presented here is an overview of the methods all authors followed. In the individual articles that follow in this supplement, the authors present the additional methodological information specific to their topics. This brief discussion also highlights technological capabilities in the Legacy Tobacco Documents Library and updates methods for organising internal tobacco documents data and findings. PMID:21504933

  10. Acoustic methodology review

    NASA Technical Reports Server (NTRS)

    Schlegel, R. G.

    1982-01-01

    It is important for industry and NASA to assess the status of acoustic design technology for predicting and controlling helicopter external noise in order for a meaningful research program to be formulated which will address this problem. The prediction methodologies available to the designer and the acoustic engineer are three-fold. First is what has been described as a first principle analysis. This analysis approach attempts to remove any empiricism from the analysis process and deals with a theoretical mechanism approach to predicting the noise. The second approach attempts to combine first principle methodology (when available) with empirical data to formulate source predictors which can be combined to predict vehicle levels. The third is an empirical analysis, which attempts to generalize measured trends into a vehicle noise prediction method. This paper will briefly address each.

  11. Minimizing communication cost among distributed controllers in software defined networks

    NASA Astrophysics Data System (ADS)

    Arlimatti, Shivaleela; Elbreiki, Walid; Hassan, Suhaidi; Habbal, Adib; Elshaikh, Mohamed

    2016-08-01

    Software Defined Networking (SDN) is a new paradigm that increases the flexibility of today's networks by promising a programmable network. The fundamental idea behind this new architecture is to simplify network complexity by decoupling the control plane and the data plane of network devices and by making the control plane centralized. Recently, controllers have been distributed to solve the problem of a single point of failure and to increase scalability and flexibility during workload distribution. Although distributed controllers are flexible and scalable enough to accommodate a larger number of network switches, the intercommunication cost between distributed controllers remains a challenging issue in the Software Defined Network environment. This paper aims to fill the gap by proposing a new mechanism that minimizes intercommunication cost via graph partitioning, an NP-hard problem. The methodology proposed in this paper swaps network elements between controller domains to minimize communication cost by calculating the communication gain. The swapping of elements minimizes inter- and intra-domain communication cost. We validate our work with the OMNeT++ simulation environment tool. Simulation results show that the proposed mechanism minimizes the inter-domain communication cost among controllers compared to traditional distributed controllers.
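
    The swap-based gain calculation is in the spirit of classical Kernighan-Lin partitioning. The sketch below (with an invented switch topology, not the paper's mechanism) bisects a switch graph and reports the number of cut edges, which is the inter-domain traffic a controller placement would pay for:

        import networkx as nx
        from networkx.algorithms.community import kernighan_lin_bisection

        # Hypothetical switch connectivity: two dense clusters joined by a bridge.
        G = nx.barbell_graph(5, 1)

        # Kernighan-Lin repeatedly swaps node pairs between the two domains
        # whenever the swap gain (reduction in cut edges) is positive.
        left, right = kernighan_lin_bisection(G, seed=1)
        print(sorted(left), sorted(right))
        print("inter-domain edges:", nx.cut_size(G, left, right))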

  12. Expert Systems Development Methodology

    DTIC Science & Technology

    1989-07-28

    two volumes. Volume 1 is the Development Methodology and Volume 2 is an Evaluation Methodology containing methods for evaluation, validation and...system are written in an English-like language which almost anyone can understand. Thus programming in rule-based systems can become "programming for...computers and others have little understanding about how computers work. The knowledge engineer must therefore be willing and able to teach the expert

  13. NAVOSH Priority Methodology.

    DTIC Science & Technology

    1982-03-01

    studies were available. However, the extent to which the results of previous prioritization investigations might benefit this research was not known. By...In 1978, SRI developed a method for the U.S. Environmental Protection Agency (EPA) to use in rapid ranking of environmental pollutants. The method is...representative of the state of development of relevant prioritization methodology techniques: a. Cost-Benefit Fault Tree Analysis b. Cost-Benefit Type Methods c

  14. Methodology for research II

    PubMed Central

    Bhaskar, S Bala; Manjuladevi, M

    2016-01-01

    Research is a systematic process, which uses scientific methods to generate new knowledge that can be used to solve a query or improve on the existing system. Any research on human subjects is associated with varying degree of risk to the participating individual and it is important to safeguard the welfare and rights of the participants. This review focuses on various steps involved in methodology (in continuation with the previous section) before the data are submitted for publication. PMID:27729691

  15. Autonomous spacecraft design methodology

    SciTech Connect

    Divita, E.L.; Turner, P.R.

    1984-08-01

    A methodology for autonomous spacecraft design blends autonomy requirements with traditional mission requirements and assesses the impact of autonomy upon the total system resources available to support fault tolerance and automation. A baseline functional design can be examined for autonomy implementation impacts, and the costs, risks, and benefits of various options can be assessed. The result of the process is a baseline design that includes autonomous control functions.

  16. Darwin's Methodological Evolution.

    PubMed

    Lennox, James G

    2005-01-01

    A necessary condition for having a revolution named after you is that you are an innovator in your field. I argue that if Charles Darwin meets this condition, it is as a philosopher and methodologist. In 1991, I made the case for Darwin's innovative use of "thought experiment" in the Origin. Here I place this innovative practice in the context of Darwin's methodological commitments, trace its origins back into Darwin's notebooks, and pursue Darwin's suggestion that it owes its inspiration to Charles Lyell.

  17. Minimizing Variation in Outdoor CPV Power Ratings: Preprint

    SciTech Connect

    Muller, M.; Marion, B.; Rodriguez, J.; Kurtz, S.

    2011-07-01

    The CPV community has agreed to have both indoor and outdoor power ratings at the module level. The indoor rating provides a repeatable measure of module performance as it leaves the factory line, while the outdoor rating provides a measure of true performance under real-world conditions. The challenge with an outdoor rating is that the spectrum, temperature, wind speed, etc., are constantly in flux, and therefore the resulting power rating varies from day to day and month to month. This work examines different methodologies for determining the outdoor power rating with the goal of minimizing variation even if data are collected under changing meteorological conditions.

  18. Methodologically rigorous clinical research.

    PubMed

    Yang, Lynda J-S; Chang, Kate W-C; Chung, Kevin C

    2012-06-01

    Rigorous methodology increases the quality of clinical research by encouraging freedom from the biases inherent in clinical studies. As randomized controlled studies (clinical trial design) are rarely applicable to surgical research, the authors address the commonly used observational study designs and methodologies by presenting guidelines for rigor. The authors performed a review of study designs, including cohort, case-control, and cross-sectional studies and case series/reports, and biases and confounders of study design. Details about biases and confounders at each study stage, study characteristics, rigor checklists, and published literature examples for each study design are summarized and presented in this report. For those surgeons interested in pursuing clinical research, mastery of the principles of methodologic rigor is imperative in the context of evidence-based medicine and widespread publication of clinical studies. Knowledge of the study designs and their appropriate application, and strict adherence to study design methods can provide high-quality evidence to serve as the basis for rational clinical decision-making.

  19. Mini-Med School Planning Guide

    ERIC Educational Resources Information Center

    National Institutes of Health, Office of Science Education, 2008

    2008-01-01

    Mini-Med Schools are public education programs now offered by more than 70 medical schools, universities, research institutions, and hospitals across the nation. There are even Mini-Med Schools in Ireland, Malta, and Canada! The program is typically a lecture series that meets once a week and provides "mini-med students" information on some of the…

  20. Minimally Invasive Mitral Valve Surgery II

    PubMed Central

    Wolfe, J. Alan; Malaisrie, S. Chris; Farivar, R. Saeid; Khan, Junaid H.; Hargrove, W. Clark; Moront, Michael G.; Ryan, William H.; Ailawadi, Gorav; Agnihotri, Arvind K.; Hummel, Brian W.; Fayers, Trevor M.; Grossi, Eugene A.; Guy, T. Sloane; Lehr, Eric J.; Mehall, John R.; Murphy, Douglas A.; Rodriguez, Evelio; Salemi, Arash; Segurola, Romualdo J.; Shemin, Richard J.; Smith, J. Michael; Smith, Robert L.; Weldner, Paul W.; Lewis, Clifton T. P.; Barnhart, Glenn R.; Goldman, Scott M.

    2016-01-01

    Abstract Techniques for minimally invasive mitral valve repair and replacement continue to evolve. This expert opinion, the second of a 3-part series, outlines current best practices for nonrobotic, minimally invasive mitral valve procedures, and for postoperative care after minimally invasive mitral valve surgery. PMID:27654406

  1. Cyclone Simulation via Action Minimization

    NASA Astrophysics Data System (ADS)

    Plotkin, D. A.; Weare, J.; Abbot, D. S.

    2016-12-01

    A postulated impact of climate change is an increase in intensity of tropical cyclones (TCs). This hypothesized effect results from the fact that TCs are powered by subsaturated boundary layer air picking up water vapor from the surface ocean as it flows inwards towards the eye. This water vapor serves as the energy input for TCs, which can be idealized as heat engines. The inflowing air has nearly the same temperature as the surface ocean; therefore, warming of the surface leads to a warmer atmospheric boundary layer. By the Clausius-Clapeyron relationship, warmer boundary layer air can hold more water vapor and thus results in more energetic storms. Changes in TC intensity are difficult to predict due to the presence of fine structures (e.g. convective structures and rainbands) with length scales of less than 1 km, while general circulation models (GCMs) generally have horizontal resolutions of tens of kilometers. The models are therefore unable to capture these features, which are critical to accurately simulating cyclone structure and intensity. Further, strong TCs are rare events, meaning that long multi-decadal simulations are necessary to generate meaningful statistics about intense TC activity. This adds to the computational expense, making it yet more difficult to generate accurate statistics about long-term changes in TC intensity due to global warming via direct simulation. We take an alternative approach, applying action minimization techniques developed in molecular dynamics to the WRF weather/climate model. We construct artificial model trajectories that lead from quiescent (TC-free) states to TC states, then minimize the deviation of these trajectories from true model dynamics. We can thus create Monte Carlo model ensembles that are biased towards cyclogenesis, which reduces computational expense by limiting time spent in non-TC states. This allows for: 1) selective interrogation of model states with TCs; 2) finding the likeliest paths for
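
    As a toy version of this approach (not the WRF configuration), one can discretize a trajectory, pin its endpoints at a quiescent state and a cyclone-like state, and minimize the summed squared deviation from one-step model dynamics; the dynamics, dimensions, and endpoints below are invented:

        import numpy as np
        from scipy.optimize import minimize

        T, dim = 10, 2
        x0 = np.zeros(dim)         # quiescent state
        xT = np.array([1.0, 1.0])  # "TC" state

        def F(x):
            # Stand-in one-step model dynamics (a simple contraction).
            return 0.9 * x

        def action(interior):
            # Deviation of the candidate path from the true model dynamics.
            path = np.vstack([x0, interior.reshape(T - 1, dim), xT])
            return sum(np.sum((path[t + 1] - F(path[t])) ** 2) for t in range(T))

        init = np.linspace(x0, xT, T + 1)[1:-1]  # straight-line initial guess
        res = minimize(action, init.ravel(), method="L-BFGS-B")
        print("minimal action:", res.fun)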

  2. Against Explanatory Minimalism in Psychiatry

    PubMed Central

    Thornton, Tim

    2015-01-01

    The idea that psychiatry contains, in principle, a series of levels of explanation has been criticized not only as empirically false but also, by Campbell, as unintelligible because it presupposes a discredited pre-Humean view of causation. Campbell’s criticism is based on an interventionist-inspired denial that mechanisms and rational connections underpin physical and mental causation, respectively, and hence underpin levels of explanation. These claims echo some superficially similar remarks in Wittgenstein’s Zettel. But attention to the context of Wittgenstein’s remarks suggests a reason to reject explanatory minimalism in psychiatry and reinstate a Wittgensteinian notion of levels of explanation. Only in a context broader than the one provided by interventionism is the ascription of propositional attitudes, even in the puzzling case of delusions, justified. Such a view, informed by Wittgenstein, can reconcile the idea that the ascription of mental phenomena presupposes a particular level of explanation with the rejection of an a priori claim about its connection to a neurological level of explanation. PMID:26696908

  3. Differentially Private Empirical Risk Minimization

    PubMed Central

    Chaudhuri, Kamalika; Monteleoni, Claire; Sarwate, Anand D.

    2011-01-01

    Privacy-preserving machine learning algorithms are crucial for the increasingly common setting in which personal data, such as medical or financial records, are analyzed. We provide general techniques to produce privacy-preserving approximations of classifiers learned via (regularized) empirical risk minimization (ERM). These algorithms are private under the ε-differential privacy definition due to Dwork et al. (2006). First we apply the output perturbation ideas of Dwork et al. (2006), to ERM classification. Then we propose a new method, objective perturbation, for privacy-preserving machine learning algorithm design. This method entails perturbing the objective function before optimizing over classifiers. If the loss and regularizer satisfy certain convexity and differentiability criteria, we prove theoretical results showing that our algorithms preserve privacy, and provide generalization bounds for linear and nonlinear kernels. We further present a privacy-preserving technique for tuning the parameters in general machine learning algorithms, thereby providing end-to-end privacy guarantees for the training process. We apply these results to produce privacy-preserving analogues of regularized logistic regression and support vector machines. We obtain encouraging results from evaluating their performance on real demographic and benchmark data sets. Our results show that both theoretically and empirically, objective perturbation is superior to the previous state-of-the-art, output perturbation, in managing the inherent tradeoff between privacy and learning performance. PMID:21892342
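
    A minimal sketch of the output-perturbation idea follows. It is deliberately simplified: per-coordinate Laplace noise is added to the learned weights, and the sensitivity constant is asserted rather than derived as in the paper, so treat the scale formula as an assumption for illustration:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 3))
        y = (X @ np.array([1.5, -2.0, 0.5]) > 0).astype(int)

        n, eps, lam = len(X), 1.0, 0.1           # privacy budget and L2 penalty
        clf = LogisticRegression(C=1.0 / (lam * n)).fit(X, y)

        # Output perturbation: noise calibrated to the (assumed) sensitivity
        # of regularized ERM is added before the weights are released.
        scale = 2.0 / (n * lam * eps)
        noisy_w = clf.coef_.ravel() + rng.laplace(scale=scale, size=3)
        print(noisy_w)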

  4. Differentially Private Empirical Risk Minimization.

    PubMed

    Chaudhuri, Kamalika; Monteleoni, Claire; Sarwate, Anand D

    2011-03-01

    Privacy-preserving machine learning algorithms are crucial for the increasingly common setting in which personal data, such as medical or financial records, are analyzed. We provide general techniques to produce privacy-preserving approximations of classifiers learned via (regularized) empirical risk minimization (ERM). These algorithms are private under the ε-differential privacy definition due to Dwork et al. (2006). First we apply the output perturbation ideas of Dwork et al. (2006), to ERM classification. Then we propose a new method, objective perturbation, for privacy-preserving machine learning algorithm design. This method entails perturbing the objective function before optimizing over classifiers. If the loss and regularizer satisfy certain convexity and differentiability criteria, we prove theoretical results showing that our algorithms preserve privacy, and provide generalization bounds for linear and nonlinear kernels. We further present a privacy-preserving technique for tuning the parameters in general machine learning algorithms, thereby providing end-to-end privacy guarantees for the training process. We apply these results to produce privacy-preserving analogues of regularized logistic regression and support vector machines. We obtain encouraging results from evaluating their performance on real demographic and benchmark data sets. Our results show that both theoretically and empirically, objective perturbation is superior to the previous state-of-the-art, output perturbation, in managing the inherent tradeoff between privacy and learning performance.

  5. Minimal hepatic encephalopathy: A review.

    PubMed

    Nardone, Raffaele; Taylor, Alexandra C; Höller, Yvonne; Brigo, Francesco; Lochner, Piergiorgio; Trinka, Eugen

    2016-10-01

    Minimal hepatic encephalopathy (MHE) is the earliest form of hepatic encephalopathy and can affect up to 80% of patients with liver cirrhosis. By definition, MHE is characterized by cognitive function impairment in the domains of attention, vigilance and integrative function, but obvious clinical manifestations are lacking. MHE has been shown to affect daily functioning, quality of life, driving and overall mortality. The diagnosis can be achieved through neuropsychological testing, recently developed computerized psychometric tests, such as the critical flicker frequency and the inhibitory control tests, as well as neurophysiological procedures. Event-related potentials can reveal subtle changes in patients with normal neuropsychological performances. Spectral analysis of electroencephalography (EEG) and quantitative analysis of sleep EEG provide early markers of cerebral dysfunction in cirrhotic patients with MHE. Neuroimaging, in particular MRI, also increasingly reveals diffuse abnormalities in intrinsic brain activity and altered organization of functional connectivity networks. Medical treatment for MHE to date has been focused on reducing serum ammonia levels and includes non-absorbable disaccharides, probiotics or rifaximin. Liver transplantation may not reverse the cognitive deficits associated with MHE. Here we present an updated review of the epidemiology, burden and quality of life, neuropsychological testing, neuroimaging, neurophysiology and therapy in subjects with MHE. Copyright © 2016 Elsevier Ireland Ltd and Japan Neuroscience Society. All rights reserved.

  6. Against Explanatory Minimalism in Psychiatry.

    PubMed

    Thornton, Tim

    2015-01-01

    The idea that psychiatry contains, in principle, a series of levels of explanation has been criticized not only as empirically false but also, by Campbell, as unintelligible because it presupposes a discredited pre-Humean view of causation. Campbell's criticism is based on an interventionist-inspired denial that mechanisms and rational connections underpin physical and mental causation, respectively, and hence underpin levels of explanation. These claims echo some superficially similar remarks in Wittgenstein's Zettel. But attention to the context of Wittgenstein's remarks suggests a reason to reject explanatory minimalism in psychiatry and reinstate a Wittgensteinian notion of levels of explanation. Only in a context broader than the one provided by interventionism is the ascription of propositional attitudes, even in the puzzling case of delusions, justified. Such a view, informed by Wittgenstein, can reconcile the idea that the ascription of mental phenomena presupposes a particular level of explanation with the rejection of an a priori claim about its connection to a neurological level of explanation.

  7. Solubility curves and nucleation rates from molecular dynamics for polymorph prediction - moving beyond lattice energy minimization.

    PubMed

    Parks, Conor; Koswara, Andy; DeVilbiss, Frank; Tung, Hsien-Hsin; Nere, Nandkishor K; Bordawekar, Shailendra; Nagy, Zoltan K; Ramkrishna, Doraiswami

    2017-02-15

    Current polymorph prediction methods, known as lattice energy minimization, seek to determine the crystal lattice with the lowest potential energy, rendering them unable to predict solvent-dependent metastable form crystallization. Facilitated by embarrassingly parallel, multiple-replica, large-scale molecular dynamics simulations, we report on a new method concerned with predicting crystal structures using the kinetics and solubility of the low energy polymorphs predicted by lattice energy minimization. The proposed molecular dynamics simulation methodology provides several new predictions to the field of crystallization. (1) The methodology is shown to correctly predict the kinetic preference for β-glycine nucleation in water relative to α- and γ-glycine. (2) Analysis of nanocrystal melting temperatures shows γ-nanocrystals have melting temperatures up to 20 K lower than either α- or β-glycine. This provides a striking explanation of how an energetically unstable classical nucleation theory (CNT) transition state complex leads to kinetic inaccessibility of γ-glycine in water, despite being the thermodynamically preferred polymorph predicted by lattice energy minimization. (3) The methodology also predicts polymorph-specific solubility curves, where the α-glycine solubility curve is reproduced to within 19% error, over a 45 K temperature range, using nothing but atomistic-level information provided from nucleation simulations. (4) Finally, the methodology produces the correct solubility ranking of β- > α-glycine. In this work, we demonstrate how the methodology supplements lattice energy minimization with molecular dynamics nucleation simulations to give the correct polymorph prediction, at different length scales, when lattice energy minimization alone would incorrectly predict the formation of γ-glycine in water from the ranking of lattice energies. Thus, lattice energy minimization optimization algorithms are supplemented with the necessary solvent
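
    For reference (these are the standard classical nucleation theory expressions the abstract alludes to, not results computed in the paper), the CNT nucleation rate and barrier take the form

        J = A \exp\!\left(-\frac{\Delta G^{*}}{k_B T}\right), \qquad \Delta G^{*} = \frac{16 \pi \gamma^{3} v_{0}^{2}}{3 \left(k_B T \ln S\right)^{2}},

    where \gamma is the crystal-solution interfacial free energy, v_0 the molecular volume, and S the supersaturation; polymorph-specific \gamma and solubility (through S) make both the barrier and the rate polymorph dependent.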

  8. WARRP Decon-13: Subject Matter Expert (SME) Meeting Waste Screening and Waste Minimization Methodologies Project

    DTIC Science & Technology

    2012-08-01

    radioactive material compares to Chernobyl. In looking at atmospheric releases of Cs-137, there seems to be agreement that Fukushima releases were about 10 to 20% of those produced by the Chernobyl event. However, the Fukushima event has resulted in significant releases of contaminated water to the ocean. Also, the Chernobyl releases occurred over about 10 days, while releases from Fukushima continued over a longer period of time. Mr. Tupin

  9. A simple efficient methodology for Dirac equation in minimal length quantum mechanics

    NASA Astrophysics Data System (ADS)

    Hassanabadi, H.; Zarrinkamar, S.; Rajabi, A. A.

    2013-01-01

    We solve the modified Dirac equation by adding a harmonic oscillator potential and implementing the Nikiforov-Uvarov technique. The closed-form solutions are reported in a quite simple and systematic manner.

  10. Regional Expansion of Minimally Invasive Surgery for Hysterectomy: Implementation and Methodology in a Large Multispecialty Group

    PubMed Central

    Andryjowicz, Esteban; Wray, Teresa

    2011-01-01

    Introduction: Approximately 600,000 hysterectomies are performed in the US each year, making hysterectomy the second most common major operation performed in women. Several methods can be used to perform this procedure. In 2009, a Cochrane Review concluded “that vaginal hysterectomy should be performed in preference to abdominal hysterectomy, where possible. Where vaginal hysterectomy is not possible, a laparoscopic approach may avoid the need for an abdominal hysterectomy. Risks and benefits of different approaches may however be influenced by the surgeon's experience. More research is needed, particularly to examine the long-term effects of the different types of surgery.” This article reviews the steps that a large multispecialty group used to teach non-open hysterectomy methods to improve the quality of care for their patients and to decrease the number of inpatient procedures and therefore costs. The percentages of each type of hysterectomy performed yearly between 2005 and 2010 were calculated, as well as the length of stay (LOS) for each method. Methods: A structured educational intervention with both didactic and hands-on exercises was created and rolled out to 12 medical centers. All patients undergoing hysterectomy for benign conditions through the Southern California Permanente Medical Group (a large multispecialty group that provides medical care to Kaiser Permanente patients in Southern California) between 2005 and 2010 were included. This amounted to 26,055 hysterectomies for benign conditions being performed by more than 350 obstetrician/gynecologists (Ob/Gyns). Results: More than 300 Ob/Gyns took the course across 12 medical centers. On the basis of hospital discharge data, the total number of hysterectomies, types of hysterectomies, and LOS for each type were identified for each year. Between 2005 and 2010, the rate of non-open hysterectomies has increased 120% (from 38% to 78%) and the average LOS has decreased 31%. PMID:22319415

  11. Missile Misdistance Reduction: An Instructive Methodology for Developing Terminal Guidance Control Systems to Minimize Missile Misdistance.

    DTIC Science & Technology

    1982-10-01

    [Snippet consists of OCR-garbled excerpts of the report's guidance-law equations (IV.C-9), (IV.C-10), (VI.B-2), and (VI.B-2a); the mathematical content is not recoverable.]

  12. Architectural Methodology Report

    NASA Technical Reports Server (NTRS)

    Dhas, Chris

    2000-01-01

    The establishment of conventions between two communicating entities in the end systems is essential for communications. Examples of the kind of decisions that need to be made in establishing a protocol convention include the nature of the data representation, the format and the speed of the data representation over the communications path, and the sequence of control messages (if any) which are sent. One of the main functions of a protocol is to establish a standard path between the communicating entities. This is necessary to create a virtual communications medium with certain desirable characteristics. In essence, it is the function of the protocol to transform the characteristics of the physical communications environment into a more useful virtual communications model. The final function of a protocol is to establish standard data elements for communications over the path; that is, the protocol serves to create a virtual data element for exchange. Other systems may be constructed in which the transferred element is a program or a job. Finally, there are special purpose applications in which the element to be transferred may be a complex structure such as all or part of a graphic display. NASA's Glenn Research Center (GRC) defines and develops advanced technology for high priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to describe the methodologies used in developing a protocol architecture for an in-space Internet node. The node would support NASA's four mission areas: Earth Science; Space Science; Human Exploration and Development of Space (HEDS); Aerospace Technology. This report presents the methodology for developing the protocol architecture. The methodology addresses the architecture for a computer communications environment. It does not address an analog voice architecture.

  13. Differing antidepressant maintenance methodologies.

    PubMed

    Safer, Daniel J

    2017-10-01

    The principal evidence that antidepressant medication (ADM) is an effective maintenance treatment for adults with major depressive disorder (MDD) is from placebo substitution trials. These trials enter responders from ADM efficacy trials into randomized, double-blind placebo-controlled (RDBPC) effectiveness trials to measure the rate of MDD relapse over time. However, other randomized maintenance trial methodologies merit consideration and comparison. A systematic review of ADM randomized maintenance trials included research reports from multiple databases. Relapse rate was the main effectiveness outcome assessed. Five ADM randomized maintenance methodologies for MDD responders are described and compared for outcome. These effectiveness trials include: placebo substitution, ADM/placebo extension, ADM extension, ADM vs. psychotherapy, and treatment as usual. The placebo substitution trials for those abruptly switched to placebo resulted in unusually high (46%) rates of relapse over 6-12 months, twice the continuing ADM rate. These trials were characterized by selective screening, high attrition, an anxious anticipation of a switch to placebo, and a risk of drug withdrawal symptoms. Selectively screened ADM efficacy responders who entered into 4-12-month extension trials experienced relapse rates averaging ~10% with a low attrition rate. Non-industry-sponsored randomized trials of adults with multiple prior MDD episodes who were treated with ADM maintenance for 1-2 years experienced relapse rates averaging 40%. Placebo substitution trial methodology represents only one approach to assessing ADM maintenance. Antidepressant maintenance research for adults with MDD should be evaluated for industry sponsorship, attrition, the impact of the switch to placebo, and major relapse differences in MDD subpopulations. Copyright © 2017. Published by Elsevier Inc.

  14. Minimal breast cancer: a clinical appraisal.

    PubMed Central

    Peters, T G; Donegan, W L; Burg, E A

    1977-01-01

    Eighty-five patients with a diagnosis of minimal breast cancer were evaluated. The predominant lesion was intraductal carcinoma, and axillary metastases occurred in association with minimal breast cancer in seven of 96 cases. One death occurred due to minimal breast cancer. Bilateral mammary carcinoma was evident in 24% and bilateral minimal breast cancer in 13% of the patients. The component lesions of minimal breast cancer have varied biologic activity, but prognosis is good with a variety of operations. The multifocal nature of minimal breast cancer and the potential for metastases should be recognized. Therapy should include removal of the entire mammary parenchyma and low axillary nodes. The high incidence of bilateral malignancy supports elective contralateral biopsy at the time of therapy for minimal breast cancer. PMID:203233

  15. Supply chain assessment methodology.

    PubMed

    Topor, E

    2000-08-01

    This article describes an assessment methodology based on the supply chain proficiency model that can be used to set realistic supply chain objectives. The assessment centers on a business model that identifies the logical stages of supply chain proficiency as measured against a comprehensive set of business characteristics. For each characteristic, an enterprise evolves from one stage to the next. The magnitude of change inherent in moving forward usually prohibits skipping stages. Although it is possible to be at different stages for each characteristic, it is usually desirable to maintain balance.

  16. Neuropathography: origins and methodology.

    PubMed

    Bradford, David T

    2006-10-01

    Neuropathography is a genre of case study which balances the clinical neuroscientific perspective with the descriptive acuity and existential interests of phenomenological psychopathology. Its subjects are persons of exceptional talent whose contributions are widely recognized, and also those whose seemingly ordinary lives include personally profound experiences of discernible cultural significance. In all instances, the chief focus is on the shaping influence of brain dysfunction in the subject's life and work. Six methodological guidelines are outlined, their topics ranging from the subjects, source material, aesthetic standards, and multidisciplinary character of neuropathography to normative standards and concepts of neuropsychological causation.

  17. Injector element characterization methodology

    NASA Technical Reports Server (NTRS)

    Cox, George B., Jr.

    1988-01-01

    Characterization of liquid rocket engine injector elements is an important part of the development process for rocket engine combustion devices. Modern nonintrusive instrumentation for flow velocity and spray droplet size measurement, and automated, computer-controlled test facilities allow rapid, low-cost evaluation of injector element performance and behavior. Application of these methods in rocket engine development, paralleling their use in gas turbine engine development, will reduce rocket engine development cost and risk. The Alternate Turbopump (ATP) Hot Gas Systems (HGS) preburner injector elements were characterized using such methods, and the methodology and some of the results obtained will be shown.

  18. Emergency exercise methodology

    SciTech Connect

    Klimczak, C.A.

    1993-01-01

    Competence for proper response to hazardous materials emergencies is enhanced and effectively measured by exercises which test plans and procedures and validate training. Emergency exercises are most effective when realistic criteria are used and a sequence of events is followed. The scenario is developed from pre-determined exercise objectives based on hazard analyses and actual plans and procedures. The scenario should address findings from previous exercises and actual emergencies. Exercise rules establish the extent of play and address contingencies during the exercise. All exercise personnel are assigned roles as players, controllers or evaluators. These participants should receive specialized training in advance. A methodology for writing an emergency exercise plan will be detailed.

  19. Emergency exercise methodology

    SciTech Connect

    Klimczak, C.A.

    1993-03-01

    Competence for proper response to hazardous materials emergencies is enhanced and effectively measured by exercises which test plans and procedures and validate training. Emergency exercises are most effective when realistic criteria are used and a sequence of events is followed. The scenario is developed from pre-determined exercise objectives based on hazard analyses and actual plans and procedures. The scenario should address findings from previous exercises and actual emergencies. Exercise rules establish the extent of play and address contingencies during the exercise. All exercise personnel are assigned roles as players, controllers or evaluators. These participants should receive specialized training in advance. A methodology for writing an emergency exercise plan will be detailed.

  20. Minimal Models of Multidimensional Computations

    PubMed Central

    Fitzgerald, Jeffrey D.; Sincich, Lawrence C.; Sharpee, Tatyana O.

    2011-01-01

    The multidimensional computations performed by many biological systems are often characterized with limited information about the correlations between inputs and outputs. Given this limitation, our approach is to construct the maximum noise entropy response function of the system, leading to a closed-form and minimally biased model consistent with a given set of constraints on the input/output moments; the result is equivalent to conditional random field models from machine learning. For systems with binary outputs, such as neurons encoding sensory stimuli, the maximum noise entropy models are logistic functions whose arguments depend on the constraints. A constraint on the average output turns the binary maximum noise entropy models into minimum mutual information models, allowing for the calculation of the information content of the constraints and an information theoretic characterization of the system's computations. We use this approach to analyze the nonlinear input/output functions in macaque retina and thalamus; although these systems have been previously shown to be responsive to two input dimensions, the functional form of the response function in this reduced space had not been unambiguously identified. A second order model based on the logistic function is found to be both necessary and sufficient to accurately describe the neural responses to naturalistic stimuli, accounting for an average of 93% of the mutual information with a small number of parameters. Thus, despite the fact that the stimulus is highly non-Gaussian, the vast majority of the information in the neural responses is related to first and second order correlations. Our results suggest a principled and unbiased way to model multidimensional computations and determine the statistics of the inputs that are being encoded in the outputs. PMID:21455284
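
    To make the model class concrete, the sketch below fits a second-order logistic (maximum-noise-entropy-style) response function, P(spike | s) = 1/(1 + exp(-(a + b·s + sᵀCs))), to surrogate data by logistic regression on first- and second-order stimulus features. The data and parameter values are illustrative assumptions, not the authors' retinal or thalamic analysis.

      # Toy sketch: fit a second-order logistic response model to surrogate data.
      # All values are illustrative; this is not the authors' analysis pipeline.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.preprocessing import PolynomialFeatures

      rng = np.random.default_rng(0)
      S = rng.standard_normal((5000, 2))            # stimuli in a 2-D relevant subspace
      a, b = -1.0, np.array([1.5, -0.5])            # assumed "true" first-order parameters
      C = np.array([[0.8, 0.2], [0.2, -0.4]])       # assumed "true" second-order parameters
      logit = a + S @ b + np.einsum("ni,ij,nj->n", S, C, S)
      spikes = rng.random(5000) < 1.0 / (1.0 + np.exp(-logit))

      # Logistic regression on (s1, s2, s1^2, s1*s2, s2^2) recovers the model.
      X = PolynomialFeatures(degree=2, include_bias=False).fit_transform(S)
      model = LogisticRegression(max_iter=1000).fit(X, spikes)
      print("fitted coefficients:", model.coef_.round(2))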

  1. Robotic assisted minimally invasive surgery

    PubMed Central

    Palep, Jaydeep H

    2009-01-01

    The term “robot” was coined by the Czech playwright Karel Capek in 1921 in his play Rossum's Universal Robots; it derives from the Czech word robota, which means forced labor. The era of robots in surgery commenced with AESOP (a voice-controlled camera holder), whose prototype was used clinically in 1993 and which was approved by the US FDA and marketed as the first surgical robot in 1994. Since then, robot prototypes such as the Endoassist (Armstrong Healthcare Ltd., High Wycombe, Bucks, UK) and the FIPS endoarm (Karlsruhe Research Center, Karlsruhe, Germany) have been developed to add to the functions of the robot and increase its utility. Integrated Surgical Systems (now Intuitive Surgical, Inc.) redesigned the SRI Green Telepresence Surgery system and created the da Vinci Surgical System®, classified as a master-slave surgical system. It uses true 3-D visualization and EndoWrist® instruments. It was approved by the FDA in July 2000 for general laparoscopic surgery and in November 2002 for mitral valve repair surgery. The da Vinci robot is currently used in various fields such as urology, general surgery, gynecology, cardiothoracic, pediatric and ENT surgery. It provides several advantages over conventional laparoscopy, such as 3D vision, motion scaling, intuitive movements, visual immersion and tremor filtration. The advent of robotics has increased the use of minimally invasive surgery among laparoscopically naïve surgeons and expanded the repertoire of experienced surgeons to include more advanced and complex reconstructions. PMID:19547687

  2. [History of minimally invasive surgery].

    PubMed

    Radojcić, Branka; Jokić, Radoica; Grebeldinger, Slobodan; Meljnikov, Igor; Radojić, Nikola

    2009-01-01

    This paper presents a historical review of the development of minimally invasive surgery. The interest of physicians in "looking into the internal organs" has existed since ancient times. The first described endoscopy was by Hippocrates, who made reference to a rectal speculum. The credit for modern endoscopy belongs to Bozzini, who developed a light conductor, which he called the "Lichtleiter", to avoid the problems of inadequate illumination. In 1853, Desormeaux first introduced the "Lichtleiter" of Bozzini to a patient. Many developments, which occurred independently but almost simultaneously, produced breakthroughs for endoscopy and laparoscopy that were the bases for modern instruments. In 1901, Kelling coined the term "coelioskope" to describe the technique that used a cystoscope to examine the abdominal cavity of dogs. In 1910, Jacobaeus used the term "laparothorakoskopie" for the first time. In 1938, Veress developed the spring-loaded needle for draining ascites and evacuating fluid and air from the chest. Its current modifications make the "Veress" needle a perfect tool for achieving pneumoperitoneum during laparoscopic surgery. In 1970, Hasson developed a technique for performing laparoscopy through a miniature laparotomy incision. The first solid-state camera was introduced in 1982, which marked the start of "video-laparoscopy". In 1981, Kurt Semm performed the first laparoscopic appendectomy. Within a year, all standard surgical procedures were being performed laparoscopically. The authors also analyze newer surgical techniques, such as telesurgery, robotics and virtual reality, in current surgical practice. They especially emphasize the use of laparoscopic access in pediatric surgery, which has become a new gold standard in the surgical treatment of pediatric patients.

  3. A minimal axisymmetric hurricane model

    NASA Astrophysics Data System (ADS)

    Mai, Nguyen Chi; Smith, Roger K.; Zhu, Hongyan; Ulrich, Wolfgang

    2002-10-01

    Solutions of an axisymmetric version of the minimal three-dimensional numerical model of a tropical cyclone developed by Zhu et al. (2001) are described and compared with those of the three-dimensional model. Vortex evolution is similar in the two models during the early stages of intensification, but the period of rapid intensification occurs earlier in the axisymmetric model due to the higher effective resolution obtained using a staggered grid. There are marked differences at later times, when, in the three-dimensional model, asymmetric structures develop. The findings are compared with those of an earlier study by Anthes (1972). The axisymmetric model is used to investigate certain fundamental aspects of tropical-cyclone dynamics, including the emergence of a region of supergradient winds in the boundary layer and the evolution of regions satisfying necessary conditions for inertial and barotropic instability. Supergradient winds develop in the boundary layer within a radius of about 100 km of the vortex axis at an early stage of evolution and appear to be a natural feature of the vortex boundary layer. The development of flow regions satisfying necessary conditions for inertial and barotropic instability occurs later, and may be attributed inter alia to the upward transfer of air with relatively high angular momentum, from the boundary layer to the middle and upper layers, by the secondary circulation of the vortex, and the downward transfer of air with relatively low angular momentum to the middle layer. A linear analysis of a two-layer slab-symmetric flow suggests why inertial instability does not occur in the axisymmetric model. Barotropic instability does not appear to be the mechanism responsible for the growth of asymmetries in the calculations using the three-dimensional version of the model.

  4. Intelligent systems engineering methodology

    NASA Technical Reports Server (NTRS)

    Fouse, Scott

    1990-01-01

    An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.

  5. The methodology of neuroproteomics.

    PubMed

    Ottens, Andrew K

    2009-01-01

    The human central nervous system (CNS) is the most complex organ in nature, composed of ten trillion cells forming complex neural networks using a quadrillion synaptic connections. Proteins, their modifications, and their interactions are integral to CNS function. The emerging field of neuroproteomics provides us with a wide-scope view of posttranslation protein dynamics within the CNS to better our understanding of its function, and more often, its dysfunction consequent to neurodegenerative disorders. This chapter reviews methodology employed in the neurosciences to study the neuroproteome in health and disease. The chapter layout parallels this volume's four parts. Part I focuses on modeling human neuropathology in animals as surrogate, accessible, and controllable platforms in our research. Part II discusses methodology used to focus analysis onto a subneuroproteome. Part III reviews analytical and bioinformatic technologies applied in neuroproteomics. Part IV discusses clinical neuroproteomics, from processing of human biofluids to translation in biomarkers research. Neuroproteomics continues to mature as a discipline, confronting the extreme complexity of the CNS proteome and its dynamics, and providing insight into the molecular mechanisms underlying how our nervous system works and how it is compromised by injury and disease.

  6. Relative Hazard Calculation Methodology

    SciTech Connect

    DL Strenge; MK White; RD Stenner; WB Andrews

    1999-09-07

    The methodology presented in this document was developed to provide a means of calculating relative hazard (RH) ratios for use in developing useful graphic illustrations. The RH equation, as presented in this methodology, is primarily a collection of key factors relevant to understanding the hazards and risks associated with projected risk management activities. The RH equation has the potential for much broader application than generating risk profiles. For example, it can be used to compare one risk management activity with another, instead of just comparing it to a fixed baseline as was done for the risk profiles. If the appropriate source term data are available, it could be used in its non-ratio form to estimate absolute values of the associated hazards. These estimated values of hazard could then be examined to help understand which risk management activities are addressing the higher hazard conditions at a site. Graphics could be generated from these absolute hazard values to compare high-hazard conditions. If the RH equation is used in this manner, care must be taken to specifically define and qualify the estimated absolute hazard values (e.g., identify which factors were considered and which ones tended to drive the hazard estimation).

  7. Methodology for Teachers. Volunteer's Manual.

    ERIC Educational Resources Information Center

    Holt, Daniel D.; And Others

    The Volunteer's Manual of "Methodology for Teachers" was written to (1) provide Peace Corps/Korea TESOL volunteers with a simple, complete guide to methodology for teaching English in Korea; and (2) provide these volunteers with a simple, complete guide for teaching this methodology to Korean English teachers in inservice training programs. For…

  8. Minimal Cells-Real and Imagined.

    PubMed

    Glass, John I; Merryman, Chuck; Wise, Kim S; Hutchison, Clyde A; Smith, Hamilton O

    2017-03-27

    A minimal cell is one whose genome encodes only the minimal set of genes necessary for the cell to survive. Scientific reductionism postulates that the best way to learn the first principles of cellular biology would be to use a minimal cell in which the functions of all genes and components are understood. The genes in a minimal cell are, by definition, essential. In 2016, synthesis of a genome comprising only the set of essential and quasi-essential genes encoded by the bacterium Mycoplasma mycoides created a near-minimal bacterial cell. This organism performs the cellular functions common to all organisms. It replicates DNA, transcribes RNA, translates proteins, undergoes cell division, and little else. In this review, we examine this organism and contrast it with other bacteria that have been used as surrogates for a minimal cell.

  9. Situating methodology within qualitative research.

    PubMed

    Kramer-Kile, Marnie L

    2012-01-01

    Qualitative nurse researchers are required to make deliberate and sometimes complex methodological decisions about their work. Methodology in qualitative research is a comprehensive approach in which theory (ideas) and method (doing) are brought into close alignment. It can be difficult, at times, to understand the concept of methodology. The purpose of this research column is to: (1) define qualitative methodology; (2) illuminate the relationship between epistemology, ontology and methodology; (3) explicate the connection between theory and method in qualitative research design; and (4) highlight relevant examples of methodological decisions made within cardiovascular nursing research. Although there is no "one set way" to do qualitative research, all qualitative researchers should account for the choices they make throughout the research process and articulate their methodological decision-making along the way.

  10. On eco-efficient technologies to minimize industrial water consumption

    NASA Astrophysics Data System (ADS)

    Amiri, Mohammad C.; Mohammadifard, Hossein; Ghaffari, Ghasem

    2016-07-01

    Purpose - Water scarcity will further stress available water systems and decrease the security of water in many areas. Innovative methods to minimize industrial water usage and waste production are therefore of paramount importance in extending fresh water resources, which are the main life-support systems in many arid regions of the world. This paper demonstrates that there are good opportunities for many industries to save water and decrease waste water in the softening process by substituting eco-friendly methods for traditional ones. The patented puffing method is an eco-efficient and viable technology for water saving and waste reduction in the lime softening process. Design/methodology/approach - The lime softening process (LSP) is very sensitive to chemical reactions. In addition, optimal monitoring not only minimizes the sludge that must be disposed of but also reduces the operating costs of water conditioning. The weakness of the current (regular) control of LSP based on chemical analysis has been demonstrated experimentally and compared with the eco-efficient puffing method. Findings - This paper demonstrates that there is a good opportunity for many industries to save water and decrease waste water in the softening process by substituting the puffing method, a patented eco-efficient technology, for the traditional method. Originality/value - Details of the innovative work required to minimize industrial water usage and waste production are outlined in this paper. Employing the novel puffing method for monitoring of the lime softening process saves a considerable amount of water while reducing chemical sludge.

  11. Why minimally invasive skin sampling techniques? A bright scientific future.

    PubMed

    Wang, Christina Y; Maibach, Howard I

    2011-03-01

    There is increasing interest in minimally invasive skin sampling techniques to assay markers of molecular biology and biochemical processes. This overview examines methodology strengths and limitations, and exciting developments pending in the scientific community. Publications were searched via PubMed, the U.S. Patent and Trademark Office Website, the DermTech Website and the CuDerm Website. The keywords used were noninvasive skin sampling, skin stripping, skin taping, detergent method, ring method, mechanical scrub, reverse iontophoresis, glucose monitoring, buccal smear, hair root sampling, mRNA, DNA, RNA, and amino acid. There is strong interest in finding methods to access internal biochemical, molecular, and genetic processes through noninvasive and minimally invasive external means. Minimally invasive techniques include the widely used skin tape stripping, the abrasion method that includes scraping and detergent, and reverse iontophoresis. The first two methods largely harvest the stratum corneum. Hair root sampling (material deeper than the epidermis), buccal smear, shave biopsy, punch biopsy, and suction blistering are also methods used to obtain cellular material for analysis, but involve some degree of increased invasiveness and thus are only briefly mentioned. Existing and new sampling methods are being refined and validated, offering exciting, different noninvasive means of quickly and efficiently obtaining molecular material with which to monitor bodily functions and responses, assess drug levels, and follow disease processes without subjecting patients to unnecessary discomfort and risk.

  12. Minimal normal measurement models of quantum instruments

    NASA Astrophysics Data System (ADS)

    Pellonpää, Juha-Pekka; Tukiainen, Mikko

    2017-06-01

    In this work we study the minimal normal measurement models of quantum instruments. We show that usually the apparatus' Hilbert space in such a model is unitarily isomorphic to the minimal Stinespring dilation space of the instrument. However, if the Hilbert space of the system is infinite-dimensional and the multiplicities of the outcomes of the associated observable (POVM) are all infinite then this may not be the case. In these pathological cases the minimal apparatus' Hilbert space is shown to be unitarily isomorphic to the instrument's minimal dilation space augmented by one extra dimension.

  13. MACT: A Manageable Minimization Allocation System

    PubMed Central

    Cui, Yan; Bu, Huaien; Liao, Shizhong

    2014-01-01

    Background. Minimization is a case allocation method for randomized controlled trials (RCT). Evidence suggests that the minimization method achieves balanced groups with respect to numbers and participant characteristics, and can incorporate more prognostic factors compared to other randomization methods. Although several automatic allocation systems exist (e.g., randoWeb and MagMin), the minimization method is still difficult to implement, and RCTs seldom employ minimization. Therefore, we developed the minimization allocation controlled trials (MACT) system, a generic manageable minimization allocation system. System Outline. The MACT system implements minimization allocation by Web and email. It has a unified interface that manages trials, participants, and allocation. It simultaneously supports multiple trials, centers, groups, prognostic factors, and factor levels. Methods. Unlike previous systems, MACT utilizes an optimized database that greatly improves manageability. Simulations and Results. MACT was assessed in a series of experiments and evaluations. Relative to simple randomization, minimization produces better balance among groups and similar unpredictability. Applications. MACT has been employed in two RCTs that lasted three years. During this period, MACT steadily and simultaneously satisfied the requirements of both trials. Conclusions. MACT is a manageable, easy-to-use case allocation system. Its outstanding features are attracting more RCTs to use the minimization allocation method. PMID:24701251
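
    The heart of a minimization allocation scheme is compact enough to sketch directly. The toy below illustrates Pocock-Simon-style minimization (an assumed, generic variant, not the MACT implementation): each new participant goes to the arm yielding the smallest total imbalance across that participant's prognostic factor levels, with ties broken at random.

      # Toy Pocock-Simon-style minimization allocation (not the MACT system).
      import random
      from collections import defaultdict

      ARMS = ("treatment", "control")
      # counts[factor][level][arm] = participants already allocated
      counts = defaultdict(lambda: defaultdict(lambda: dict.fromkeys(ARMS, 0)))

      def allocate(participant):
          """participant: dict of prognostic factors, e.g. {'sex': 'F', 'age': '<60'}."""
          imbalance = {}
          for arm in ARMS:
              total = 0
              for factor, level in participant.items():
                  c = dict(counts[factor][level])
                  c[arm] += 1                      # hypothetically assign to this arm
                  total += max(c.values()) - min(c.values())
              imbalance[arm] = total               # summed range across factors
          best = min(imbalance.values())
          arm = random.choice([a for a in ARMS if imbalance[a] == best])
          for factor, level in participant.items():
              counts[factor][level][arm] += 1      # commit the assignment
          return arm

      print(allocate({"sex": "F", "age": "<60"}))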

  14. Waste Minimization Study on Pyrochemical Reprocessing Processes

    SciTech Connect

    Boussier, H.; Conocar, O.; Lacquement, J.

    2006-07-01

    Ideally, a new pyro-process should generate no more waste than, and be at least as safe and cost-effective as, the hydrometallurgical processes currently implemented at industrial scale. This paper describes the thought process, the methodology and some results obtained by process integration studies to devise potential pyro-processes and to assess their capability of achieving this challenging objective. As an example, the assessment of a process based on salt/metal reductive extraction, designed for the reprocessing of Generation IV carbide spent fuels, is developed. Salt/metal reductive extraction uses the capability of some metals, aluminum in this case, to selectively reduce actinide fluorides previously dissolved in a fluoride salt bath. The reduced actinides enter the metal phase, from which they are subsequently recovered; the fission products remain in the salt phase. In fact, the process is not so simple, as it requires upstream and downstream subsidiary steps. All these process steps generate secondary waste flows representing sources of actinide leakage and/or FP discharge. In aqueous processes the main solvent (a nitric acid solution) has a low boiling point and evaporates easily or can be removed by distillation, thereby leaving behind only a limited flow containing the dissolved substances to be incorporated in a confinement matrix. From the point of view of waste generation, one main handicap of molten salt processes is that the saline phase (a fluoride in our case) used as the solvent is of the same nature as the solutes (radionuclide fluorides) and has a quite high boiling point. It is therefore not as easy as with aqueous solutions to separate solvent and solutes in order to confine only the radioactive material and limit the final waste flows. Starting from the initial block diagram devised two years ago, the paper shows how process integration studies were able to propose process fittings which lead to a reduction of the waste variety and flows leading at an 'ideal

  15. Contemporary review of minimally invasive pancreaticoduodenectomy

    PubMed Central

    Dai, Rui; Turley, Ryan S; Blazer, Dan G

    2016-01-01

    AIM To assess the current literature describing various minimally invasive techniques for, and to review short-term outcomes after, minimally invasive pancreaticoduodenectomy (PD). METHODS PD remains the only potentially curative treatment for periampullary malignancies, including, most commonly, pancreatic adenocarcinoma. Minimally invasive approaches to this complex operation have been increasingly reported in the literature and are purported by some to reduce the historically high morbidity of PD associated with the open technique. In this systematic review, we have searched the literature for high-quality publications describing minimally invasive techniques for PD-including laparoscopic, robotic, and laparoscopic-assisted robotic approaches (hybrid approach). We have identified publications with the largest operative experiences from well-known centers of excellence for this complex procedure. We report primarily short-term operative and perioperative results and some short-term oncologic endpoints. RESULTS Minimally invasive techniques include laparoscopic, robotic and hybrid approaches, and each of these techniques has strong advocates. Consistently, across all minimally invasive modalities, these techniques are associated with less intraoperative blood loss than traditional open PD (OPD), but in exchange for longer operating times. These techniques are relatively equivalent in terms of perioperative morbidity and short-term oncologic outcomes. Importantly, the pancreatic fistula rate appears to be comparable in most minimally invasive series compared to the open technique. The impact of minimally invasive technique on length of stay is mixed compared to some traditional open series. A few series have suggested that initiation of and time to adjuvant therapy may be improved with minimally invasive techniques; however, this assertion remains controversial. In terms of short-term costs, minimally invasive PD is significantly higher than that of OPD. CONCLUSION Minimally

  16. Minimizing electrode contamination in an electrochemical cell

    DOEpatents

    Kim, Yu Seung; Zelenay, Piotr; Johnston, Christina

    2014-12-09

    An electrochemical cell assembly that is expected to prevent, or at least minimize, electrode contamination includes one or more getters that trap a component or components leached from a first electrode and prevent, or at least minimize, their contamination of a second electrode.

  17. Minimally Invasive Mitral Valve Surgery I

    PubMed Central

    Ailawadi, Gorav; Agnihotri, Arvind K.; Mehall, John R.; Wolfe, J. Alan; Hummel, Brian W.; Fayers, Trevor M.; Farivar, R. Saeid; Grossi, Eugene A.; Guy, T. Sloane; Hargrove, W. Clark; Khan, Junaid H.; Lehr, Eric J.; Malaisrie, S. Chris; Murphy, Douglas A.; Rodriguez, Evelio; Ryan, William H.; Salemi, Arash; Segurola, Romualdo J.; Shemin, Richard J.; Smith, J. Michael; Smith, Robert L.; Weldner, Paul W.; Goldman, Scott M.; Lewis, Clifton T. P.; Barnhart, Glenn R.

    2016-01-01

    Abstract Widespread adoption of minimally invasive mitral valve repair and replacement may be fostered by practice consensus and standardization. This expert opinion, first of a 3-part series, outlines current best practices in patient evaluation and selection for minimally invasive mitral valve procedures, and discusses preoperative planning for cannulation and myocardial protection. PMID:27654407

  18. Locus minimization in breed prediction using artificial neural network approach.

    PubMed

    Iquebal, M A; Ansari, M S; Sarika; Dixit, S P; Verma, N K; Aggarwal, R A K; Jayakumar, S; Rai, A; Kumar, D

    2014-12-01

    Molecular markers, viz. microsatellites and single nucleotide polymorphisms, have revolutionized breed identification through the use of small samples of biological tissue or germplasm, such as blood, carcass samples, embryos, ova and semen, that show no evident phenotype. Classical tools of molecular data analysis for breed identification have limitations, such as the unavailability of reference breed data, causing increased cost of collection each time, compromised computational accuracy and complexity of the methodology used. We report here the successful use of an artificial neural network (ANN), running behind a webserver, to decrease the cost of genotyping through locus minimization. The webserver is freely accessible (http://nabg.iasri.res.in/bisgoat) to the research community. We demonstrate that the machine learning (ANN) approach for breed identification is capable of multifold advantages such as locus minimization, leading to a drastic reduction in cost, and web availability of reference breed data, alleviating the need for repeated genotyping each time one investigates the identity of an unknown breed. To develop this ANN-based web implementation, we used 51,850 samples of allelic data of microsatellite-marker-based DNA fingerprinting on 25 loci covering 22 registered goat breeds of India for training. By minimizing the panel to as few as nine loci with a multilayer perceptron model, we achieved 96.63% training accuracy. This server can be an indispensable tool for identification of existing breeds and new synthetic commercial breeds, leading to protection of intellectual property in case of sovereignty and bio-piracy disputes. This server can be widely used as a model for cost reduction by locus minimization for various other flora and fauna in terms of variety, breed and/or line identification, especially in conservation and improvement programs.
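
    The locus-minimization idea itself, rank loci by informativeness and retrain the classifier on progressively smaller panels, can be sketched generically. Everything below is an illustrative assumption (random surrogate genotypes and off-the-shelf scikit-learn components), not the authors' webserver or the goat dataset.

      # Sketch of locus minimization: rank loci, retrain on smaller panels.
      # Surrogate random genotypes -- not the goat microsatellite dataset.
      import numpy as np
      from sklearn.feature_selection import SelectKBest, mutual_info_classif
      from sklearn.model_selection import cross_val_score
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(1)
      X = rng.integers(0, 10, size=(600, 25)).astype(float)  # 25 loci, allele codes
      y = rng.integers(0, 22, size=600)                      # 22 breed labels

      for k in (25, 15, 9):                                  # shrink the locus panel
          Xk = SelectKBest(mutual_info_classif, k=k).fit_transform(X, y)
          mlp = MLPClassifier(hidden_layer_sizes=(50,), max_iter=500, random_state=0)
          print(f"{k} loci: CV accuracy = {cross_val_score(mlp, Xk, y, cv=3).mean():.2f}")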

  19. Cancer cytogenetics: methodology revisited.

    PubMed

    Wan, Thomas S K

    2014-11-01

    The Philadelphia chromosome was the first genetic abnormality discovered in cancer (in 1960), and it was found to be consistently associated with CML. The description of the Philadelphia chromosome ushered in a new era in the field of cancer cytogenetics. Accumulating genetic data have been shown to be intimately associated with the diagnosis and prognosis of neoplasms; thus, karyotyping is now considered a mandatory investigation for all newly diagnosed leukemias. The development of FISH in the 1980s overcame many of the drawbacks of assessing the genetic alterations in cancer cells by karyotyping. Karyotyping of cancer cells remains the gold standard since it provides a global analysis of the abnormalities in the entire genome of a single cell. However, subsequent methodological advances in molecular cytogenetics based on the principle of FISH that were initiated in the early 1990s have greatly enhanced the efficiency and accuracy of karyotype analysis by marrying conventional cytogenetics with molecular technologies. In this review, the development, current utilization, and technical pitfalls of both the conventional and molecular cytogenetics approaches used for cancer diagnosis over the past five decades will be discussed.

  20. Methodological Problems of Nanotechnoscience

    NASA Astrophysics Data System (ADS)

    Gorokhov, V. G.

    Recently, we have reported on definitions of nanotechnology as a new type of NanoTechnoScience and on nanotheory as a cluster of different natural and engineering theories. Nanotechnology is not only a new type of scientific-engineering discipline; it also evolves in a “nonclassical” way. Nanoontology, or the nano scientific world view, serves as a methodological orientation for choosing the theoretical means and methods for solving scientific and engineering problems. This makes it possible to move from one explanation and scientific world view to another without difficulty. Thus, nanotechnology is both a field of scientific knowledge and a sphere of engineering activity; in other words, NanoTechnoScience is similar to systems engineering, the analysis and design of large-scale, complex man/machine systems, but applied to micro- and nanosystems. Nano systems engineering, like macro systems engineering, includes not only systems design but also complex research. This design orientation changes the priorities within complex research and the relation to knowledge, not only as “the knowledge about something” but also as knowledge that is a means of activity: from the beginning, control and restructuring of matter at the nanoscale is a necessary element of nanoscience.

  1. Scientific methodology applied.

    PubMed

    Lussier, A

    1975-04-01

    The subject of this symposium is naproxen, a new drug that resulted from an investigation to find a superior anti-inflammatory agent. It was synthesized by Harrison et al. in 1970 at the Syntex Institute of Organic Chemistry and Biological Sciences. How can we chart the evolution of this or any other drug? Three steps are necessary: first, chemical studies (synthesis, analysis); second, animal pharmacology; third, human pharmacology. The last step can additionally be divided into four phases: metabolism and toxicology of the drug in normal volunteers; dose titration and initial clinical trials with sick subjects (pharmacometry); confirmatory clinical trials when the drug is accepted on the market; and re-evaluation (familiarization trials). To discover the truth about naproxen, we must all participate actively with a critical mind, following the principles of scientific methodology. We shall find that the papers to be presented today all deal with the third step in the evaluation process--clinical pharmacology. It is quite evident that the final and most decisive test must be aimed at the most valuable target: the human being. The end product of this day's work for each of us should be the formation of an opinion based on solid scientific proofs. And let us hope that we will all enjoy fulfilling the symposium in its entire etymological meaning this evening. In vino veritas.

  2. Engineering radioecology: Methodological considerations

    SciTech Connect

    Nechaev, A.F.; Projaev, V.V.; Sobolev, I.A.; Dmitriev, S.A.

    1995-12-31

    The term "radioecology" has been widely recognized in scientific and technical societies. At the same time, this scientific school (radioecology) does not have a precise, generally acknowledged structure, a unified methodical basis, fixed subjects of investigation, etc. In other words, radioecology is a vast, important but rather amorphous conglomerate of various ideas, amalgamated mostly by their involvement in biospheric effects of ionizing radiation and some conceptual stereotypes. This paradox was acceptable up to a certain time. However, with the termination of the Cold War and because of remarkable political changes in the world, it has become possible to convert the problem of environmental restoration from the scientific sphere into particularly practical terms. Already the first steps clearly showed the imperfection of existing technologies, managerial and regulatory schemes; a lack of qualified specialists, relevant methods and techniques; uncertainties in the methodology of decision-making, etc. Thus, building up (or perhaps structuring) a special scientific and technological basis, which the authors call "engineering radioecology", seems to be an important task. In this paper they endeavor to substantiate this thesis and to suggest some preliminary ideas concerning the subject matter of engineering radioecology.

  3. Prioritization Methodology for Chemical Replacement

    NASA Technical Reports Server (NTRS)

    Cruit, W.; Schutzenhofer, S.; Goldberg, B.; Everhart, K.

    1993-01-01

    This project serves to define an appropriate methodology for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weigh the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results are being implemented as a guideline for consideration in current NASA propulsion systems.

  4. Nursing research methodology: transcending Cartesianism.

    PubMed

    Walters, A J

    1996-06-01

    Nurses involved in research are concerned with methodological issues. This paper explores the Cartesian debate that has polarized the discourse on nursing research methodology. It is argued that methodologies exclusively based on objectivism, one pole of the Cartesian debate, or subjectivism, the other, do not provide nurses with adequate research foundations to understand the complexity of the lifeworld of nursing practice. This paper provides nurse researchers with an alternative methodological perspective, Gadamerian hermeneutics, which is in harmony with the clinical world of nursing practice.

  5. Development methodology for scientific software

    SciTech Connect

    Cort, G.; Goldstone, J.A.; Nelson, R.O.; Poore, R.V.; Miller, L.; Barrus, D.M.

    1985-01-01

    We present the details of a software development methodology that addresses all phases of the software life cycle, yet is well suited for application by small projects with limited resources. The methodology has been developed at the Los Alamos Weapons Neutron Research (WNR) Facility and was utilized during the recent development of the WNR Data Acquisition Command Language. The methodology emphasizes the development and maintenance of comprehensive documentation for all software components. The impact of the methodology upon software quality and programmer productivity is assessed.

  6. Dosimetric methodology of the ICRP

    SciTech Connect

    Eckerman, K.F.

    1994-12-31

    Establishment of guidance for the protection of workers and members of the public from radiation exposures necessitates estimation of the radiation dose to tissues of the body at risk. The dosimetric methodology formulated by the International Commission on Radiological Protection (ICRP) is intended to be responsive to this need. While developed for radiation protection, elements of the methodology are often applied in addressing other radiation issues; e.g., risk assessment. This chapter provides an overview of the methodology, discusses its recent extension to age-dependent considerations, and illustrates specific aspects of the methodology through a number of numerical examples.

  7. Status of sonic boom methodology and understanding

    NASA Technical Reports Server (NTRS)

    Darden, Christine M.; Powell, Clemans A.; Hayes, Wallace D.; George, Albert R.; Pierce, Allan D.

    1989-01-01

    In January 1988, approximately 60 representatives of industry, academia, government, and the military gathered at NASA-Langley for a 2 day workshop on the state-of-the-art of sonic boom physics, methodology, and understanding. The purpose of the workshop was to assess the sonic boom area, to determine areas where additional sonic boom research is needed, and to establish some strategies and priorities in this sonic boom research. Attendees included many internationally recognized sonic boom experts who had been very active in the Supersonic Transport (SST) and Supersonic Cruise Aircraft Research Programs of the 60's and 70's. Summaries of the assessed state-of-the-art and the research needs in theory, minimization, atmospheric effects during propagation, and human response are given.

  8. Kaupapa Maori Methodology: Trusting the Methodology through Thick and Thin

    ERIC Educational Resources Information Center

    Hiha, Anne Aroha

    2016-01-01

    Kaupapa Maori is thoroughly theorised in academia in Aotearoa and those wishing to use it as their research methodology can find support through the writing of a number of Maori academics. What is not so well articulated, is the experiential voice of those who have used Kaupapa Maori as research methodology. My identity as a Maori woman…

  9. Minimizing Expected Maximum Risk from Cyber-Attacks with Probabilistic Attack Success

    SciTech Connect

    Bhuiyan, Tanveer H.; Nandi, Apurba; Medal, Hugh; Halappanavar, Mahantesh

    2016-07-16

    The goal of our work is to enhance network security by generating partial cut-sets: subsets of edges in an attack graph whose removal eliminates paths from initially vulnerable nodes (initial security conditions) to goal nodes (critical assets), given a cost for cutting each edge and a limited overall budget.
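
    The unbudgeted special case of this problem reduces to a minimum-cost s-t cut, which standard max-flow/min-cut routines solve directly. The sketch below (a hypothetical four-node attack graph, using the networkx library) illustrates that special case; the budgeted, multi-source variant studied in this work requires the more elaborate formulation described in the abstract.

      # Sketch: cheapest edge set separating a vulnerable node from a critical
      # asset (unbudgeted special case), on a hypothetical attack graph.
      import networkx as nx

      G = nx.DiGraph()
      G.add_edge("internet", "webserver", capacity=10)  # capacity = cost to cut
      G.add_edge("webserver", "db", capacity=1)
      G.add_edge("webserver", "fileserver", capacity=5)
      G.add_edge("fileserver", "db", capacity=2)

      cost, (reachable, cut_off) = nx.minimum_cut(G, "internet", "db")
      cut_edges = [(u, v) for u in reachable for v in G[u] if v in cut_off]
      print("total cut cost:", cost)      # 3: the two edges into db
      print("edges to cut:", cut_edges)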

  10. Minimal representations, geometric quantization, and unitarity.

    PubMed Central

    Brylinski, R; Kostant, B

    1994-01-01

    In the framework of geometric quantization we explicitly construct, in a uniform fashion, a unitary minimal representation π_o of every simply-connected real Lie group G_o such that the maximal compact subgroup of G_o has finite center and G_o admits some minimal representation. We obtain algebraic and analytic results about π_o. We give several results on the algebraic and symplectic geometry of the minimal nilpotent orbits and then "quantize" these results to obtain the corresponding representations. We assume (Lie G_o)_C is simple. PMID:11607478

  11. Minimal covariant observables identifying all pure states

    NASA Astrophysics Data System (ADS)

    Carmeli, Claudio; Heinosaari, Teiko; Toigo, Alessandro

    2013-09-01

    It has been recently shown by Heinosaari, Mazzarella and Wolf (2013) [1] that an observable that identifies all pure states of a d-dimensional quantum system has minimally 4d-4 outcomes or slightly less (the exact number depending on d). However, no simple construction of this type of minimal observable is known. We investigate covariant observables that identify all pure states and have the minimal number of outcomes. It is shown that the existence of this kind of observable depends on the dimension of the Hilbert space.

  12. Minimal representations, geometric quantization, and unitarity.

    PubMed

    Brylinski, R; Kostant, B

    1994-06-21

    In the framework of geometric quantization we explicitly construct, in a uniform fashion, a unitary minimal representation π_o of every simply-connected real Lie group G_o such that the maximal compact subgroup of G_o has finite center and G_o admits some minimal representation. We obtain algebraic and analytic results about π_o. We give several results on the algebraic and symplectic geometry of the minimal nilpotent orbits and then "quantize" these results to obtain the corresponding representations. We assume (Lie G_o)_C is simple.

  13. An algorithm for constructing minimal order inverses

    NASA Technical Reports Server (NTRS)

    Patel, R. V.

    1976-01-01

    In this paper an algorithm is presented for constructing minimal order inverses of linear, time invariant, controllable and observable, multivariable systems. By means of simple matrix operations, a 'state-overdescribed' system is first constructed which is an inverse of the given multivariable system. A simple Gauss-Jordan type reduction procedure is then used to remove the redundancy in the state vector of the inverse system to obtain a minimal order inverse. When the given multivariable system is not invertible, the algorithm enables a minimal order inverse of an invertible subsystem to be constructed. Numerical examples are given to illustrate the use of the algorithm.
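
    For orientation, in the special case where the system ẋ = Ax + Bu, y = Cx + Du has a square, invertible direct-feedthrough matrix D, an inverse system follows in closed form by solving the output equation for u (a standard state-space identity, quoted here as context; the paper's algorithm targets the general case, where this shortcut is unavailable):

      \dot{x} = (A - B D^{-1} C)\,x + B D^{-1} y, \qquad u = -D^{-1} C\,x + D^{-1} y .

    When D is singular or non-square, no such direct formula exists, which is what motivates the "state-overdescribed" construction and the Gauss-Jordan reduction described in the abstract.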

  14. Technology applications for radioactive waste minimization

    SciTech Connect

    Devgun, J.S.

    1994-07-01

    The nuclear power industry has achieved one of the most successful examples of waste minimization. The annual volume of low-level radioactive waste shipped for disposal per reactor has decreased to approximately one-fifth of what it was a decade ago. In addition, the curie content of the total waste shipped for disposal has decreased. This paper will discuss the regulatory drivers and economic factors for waste minimization and describe the application of technologies for achieving waste minimization for low-level radioactive waste, with examples from the nuclear power industry.

  15. Workshops as a Research Methodology

    ERIC Educational Resources Information Center

    Ørngreen, Rikke; Levinsen, Karin

    2017-01-01

    This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on…

  16. Methodological Pluralism and Narrative Inquiry

    ERIC Educational Resources Information Center

    Michie, Michael

    2013-01-01

    This paper considers how the integral theory model of Nancy Davis and Laurie Callihan might be enacted using a different qualitative methodology, in this case the narrative methodology. The focus of narrative research is shown to be on "what meaning is being made" rather than "what is happening here" (quadrant 2 rather than…

  17. Choosing a Methodology: Philosophical Underpinning

    ERIC Educational Resources Information Center

    Jackson, Elizabeth

    2013-01-01

    As a university lecturer, I find that a frequent question raised by Masters students concerns the methodology chosen for research and the rationale required in dissertations. This paper unpicks some of the philosophical coherence that can inform choices to be made regarding methodology and a well-thought out rationale that can add to the rigour of…

  18. Minimizing Variation in Outdoor CPV Power Ratings (Presentation)

    SciTech Connect

    Muller, M.

    2011-04-01

    Presented at the 7th International Conference on Concentrating Photovoltaic Systems (CPV-7), 4-6 April 2011, Las Vegas, Nevada. The CPV community has agreed to have both indoor and outdoor power ratings at the module level. The indoor rating provides a repeatable measure of module performance as it leaves the factory line, while the outdoor rating provides a measure of true performance under real world conditions. The challenge with an outdoor rating is that the spectrum, temperature, wind speed, etc. are constantly in flux, and therefore the resulting power rating varies from day to day and month to month. This work examines different methodologies for determining the outdoor power rating with the goal of minimizing variation even if data are collected under changing meteorological conditions.
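
    One common way to tame such variation is to regress measured power on the prevailing meteorological conditions and then evaluate the fitted model at fixed reference conditions. The sketch below shows the idea under stated assumptions (synthetic data, a simple linear basis, and reference values of 900 W/m² and 20 °C chosen for illustration); it is not NREL's rating procedure.

      # Sketch: regress outdoor power on conditions, rate at fixed reference
      # values. Synthetic data and basis -- illustrative assumptions only.
      import numpy as np

      rng = np.random.default_rng(3)
      dni = rng.uniform(700.0, 1000.0, 500)     # direct normal irradiance, W/m^2
      tamb = rng.uniform(10.0, 40.0, 500)       # ambient temperature, C
      power = 0.25 * dni * (1 - 0.004 * (tamb - 20.0)) + rng.normal(0.0, 5.0, 500)

      X = np.column_stack([np.ones_like(dni), dni, dni * tamb])
      coef, *_ = np.linalg.lstsq(X, power, rcond=None)
      rating = coef @ np.array([1.0, 900.0, 900.0 * 20.0])  # at 900 W/m^2, 20 C
      print(f"outdoor power rating: {rating:.1f} W")        # ~225 W here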

  19. Minimization of power consumption during charging of superconducting accelerating cavities

    NASA Astrophysics Data System (ADS)

    Bhattacharyya, Anirban Krishna; Ziemann, Volker; Ruber, Roger; Goryashko, Vitaliy

    2015-11-01

    The radio frequency cavities used to accelerate charged particle beams need to be charged to their nominal voltage, after which the beam can be injected into them. The standard procedure for such cavity filling is to use a step charging profile. However, during the initial stages of such a filling process, a substantial amount of the total energy is wasted in reflection for superconducting cavities because of their extremely narrow bandwidth. The paper presents a novel strategy to charge cavities which reduces total energy reflection. We use variational calculus to obtain an analytical expression for the optimal charging profile. The reflected and required energies and the generator peak power are compared between the charging schemes, and practical aspects (saturation, efficiency and gain characteristics) of power sources (tetrodes, IOTs and solid state power amplifiers) are also considered and analysed. The paper presents a methodology to successfully identify the optimal charging scheme for different power sources to minimize the total energy requirement.
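
    The effect the paper exploits can be reproduced with a deliberately crude toy model: treat the cavity envelope as a first-order low-pass with fill time constant τ and take the squared generator-cavity mismatch as a proxy for reflected energy. Under those assumptions (which ignore beam loading, detuning and coupling details, and are not the paper's cavity model), a shaped ramp already reflects noticeably less than a step:

      # Toy model: cavity envelope as a first-order low-pass; squared
      # generator-cavity mismatch as a reflected-energy proxy.
      import numpy as np

      tau, T = 1.0, 8.0                        # fill time constant, charging window
      t = np.linspace(0.0, T, 4000)
      dt = t[1] - t[0]

      def charge(vg):
          v = np.zeros_like(t)
          for i in range(1, len(t)):           # forward-Euler integration
              v[i] = v[i - 1] + dt * (vg[i - 1] - v[i - 1]) / tau
          return v

      profiles = {
          "step": np.ones_like(t),
          "ramp": np.clip(t / (0.75 * T), 0.0, 1.0),
      }
      for name, vg in profiles.items():
          v = charge(vg)
          reflected = ((vg - v) ** 2).sum() * dt     # proxy, arbitrary units
          print(f"{name}: final V = {v[-1]:.3f}, reflected ~ {reflected:.3f}")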

  1. Conceptual design methodology for vibration isolation

    NASA Astrophysics Data System (ADS)

    Hyde, T. Tupper

    1997-06-01

    High performance dynamic structures have strict requirements on structural motion that are emphasized by the flexibility inherent in lightweight space systems. Vibration isolation is used to prevent disturbances from affecting critical payload components where motion is to be minimized. Isolation, however, is often an engineering solution that is not properly considered in the early conceptual design of the spacecraft. It is at this key stage of a program that mission-driving performance targets and resource allocations are made, yet little analysis has been performed. A conceptual design methodology for isolation is developed and applied to the conceptual design of a proposed space shuttle based telescope system. In the developed methodology, frequency domain computation of the closed loop performance without isolation pinpoints frequency regimes and disturbance to performance channels targeted for improvement. A coarse fidelity structural model, with well defined disturbance and performance characterization, is more useful than a costly high fidelity analysis when evaluating the many isolation options available early in a project. Isolation design choices are made by trading their performance improvement against their complexity/cost. Simple, idealized mechanical descriptions of the passive or active isolation system provide the needed frequency domain effect on performance without the costly analysis that a detailed isolator design entails. Similarly, the effects of other integrating subsystems, such as structural or optical control, are approximated by frequency domain descriptions.
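
    The frequency-domain trade behind such isolation choices can be seen in the textbook transmissibility of an idealized single-degree-of-freedom passive mount with natural frequency ω_n and damping ratio ζ (quoted for orientation; the methodology applies this kind of simple description channel by channel):

      T(\omega) = \frac{1 + 2 i \zeta\, \omega/\omega_n}{1 - (\omega/\omega_n)^2 + 2 i \zeta\, \omega/\omega_n}

    Attenuation (|T| < 1) occurs only for ω > √2·ω_n, so placing ω_n well below the dominant disturbance frequencies is the central conceptual-design decision.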

  2. Structural design methodology for large space structures

    NASA Astrophysics Data System (ADS)

    Dornsife, Ralph J.

    The Department of Defense requires research and development in designing, fabricating, deploying, and maintaining large space structures (LSS) in support of Army and Strategic Defense Initiative military objectives. Because of their large size, extreme flexibility, and the unique loading conditions in the space environment, LSS will present engineers with problems unlike those encountered in designing conventional civil engineering or aerospace structures. LSS will require sophisticated passive damping and active control systems in order to meet stringent mission requirements. These structures must also be optimally designed to minimize high launch costs. This report outlines a methodology for the structural design of LSS. It includes a definition of mission requirements, structural modeling and analysis, passive damping and active control system design, ground-based testing, payload integration, on-orbit system verification, and on-orbit assessment of structural damage. In support of this methodology, analyses of candidate LSS truss configurations are presented, and an algorithm correlating ground-based test behavior to expected microgravity behavior is developed.

  3. Structural design methodology for large space structures

    NASA Astrophysics Data System (ADS)

    Dornsife, Ralph J.

    1992-02-01

    The Department of Defense requires research and development in designing, fabricating, deploying, and maintaining large space structures (LSS) in support of Army and Strategic Defense Initiative military objectives. Because of their large size, extreme flexibility, and the unique loading conditions in the space environment, LSS will present engineers with problems unlike those encountered in designing conventional civil engineering or aerospace structures. LSS will require sophisticated passive damping and active control systems in order to meet stringent mission requirements. These structures must also be optimally designed to minimize high launch costs. This report outlines a methodology for the structural design of LSS. It includes a definition of mission requirements, structural modeling and analysis, passive damping and active control system design, ground-based testing, payload integration, on-orbit system verification, and on-orbit assessment of structural damage. In support of this methodology, analyses of candidate LSS truss configurations are presented, and an algorithm correlating ground-based test behavior to expected microgravity behavior is developed.

  4. Minimally invasive surgery for Achilles tendon pathologies

    PubMed Central

    Maffulli, Nicola; Longo, Umile Giuseppe; Spiezia, Filippo; Denaro, Vincenzo

    2010-01-01

    Minimally invasive trauma and orthopedic surgery is increasingly common, though technically demanding. Its use for pathologies of the Achilles tendon (AT) holds the promise of faster recovery times, shorter hospital stays, and improved functional outcomes when compared to traditional open procedures, which can lead to difficulty with wound healing because of the tenuous blood supply and increased chance of wound breakdown and infection. We present the recent advances in the field of minimally invasive AT surgery for tendinopathy, acute ruptures, chronic tears, and chronic avulsions of the AT. In our hands, minimally invasive surgery has provided similar results to those obtained with open surgery, with decreased perioperative morbidity, decreased duration of hospital stay, and reduced costs. So far, the studies on minimally invasive orthopedic techniques are of moderate scientific quality with short follow-up periods. Multicenter studies with longer follow-up are needed to justify the long-term advantages of these techniques over traditional ones. PMID:24198547

  5. Waste minimization and pollution prevention awareness plan

    SciTech Connect

    Not Available

    1991-05-31

    The purpose of this plan is to document the Lawrence Livermore National Laboratory (LLNL) Waste Minimization and Pollution Prevention Awareness Program. The plan specifies those activities and methods that are or will be employed to reduce the quantity and toxicity of wastes generated at the site. The intent of this plan is to respond to and comply with the Department of Energy's (DOE's) policy and guidelines concerning the need for pollution prevention. The Plan is composed of an LLNL Waste Minimization and Pollution Prevention Awareness Program Plan and, as attachments, Program- and Department-specific waste minimization plans. This format reflects the fact that waste minimization is considered a line management responsibility and is to be addressed by each of the Programs and Departments. 14 refs.

  6. Controlling molecular transport in minimal emulsions

    PubMed Central

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of ‘minimal emulsions’, which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions. PMID:26797564

  7. Heart bypass surgery - minimally invasive - discharge

    MedlinePlus

    ... thrombosis, 9th ed: American College of Chest Physicians evidence-based clinical practice guidelines. Chest. 2012;141(2 ...

  8. Genetic algorithms for minimal source reconstructions

    SciTech Connect

    Lewis, P.S.; Mosher, J.C.

    1993-12-01

    Under-determined linear inverse problems arise in applications in which signals must be estimated from insufficient data. In these problems the number of potentially active sources is greater than the number of observations. In many situations, it is desirable to find a minimal source solution. This can be accomplished by minimizing a cost function that accounts both for the compatibility of the solution with the observations and for its "sparseness". Minimizing functions of this form can be a difficult optimization problem. Genetic algorithms are a relatively new and robust approach to the solution of difficult optimization problems, providing a global framework that is not dependent on local continuity or on explicit starting values. In this paper, the authors describe the use of genetic algorithms to find minimal source solutions, using as an example a simulation inspired by the reconstruction of neural currents in the human brain from magnetoencephalographic (MEG) measurements.
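
    The cost function described here can be explored with a very small genetic algorithm over binary source-activation patterns: amplitudes for a candidate support are fit by least squares, and the support size is penalized. The sketch below is an illustrative toy (random test problem; simple truncation selection, uniform crossover and bit-flip mutation), not the authors' implementation.

      # Toy GA for minimal source reconstruction: find a sparse x with y ~ A x.
      import numpy as np

      rng = np.random.default_rng(2)
      m, n, lam = 8, 20, 0.5                  # observations, sources, sparseness weight
      A = rng.standard_normal((m, n))
      x_true = np.zeros(n)
      x_true[[3, 11]] = 1.0                   # two truly active sources
      y = A @ x_true

      def cost(mask):
          idx = np.flatnonzero(mask)
          if idx.size == 0:
              return float(np.linalg.norm(y) ** 2)
          amp, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)   # best amplitudes
          return float(np.linalg.norm(y - A[:, idx] @ amp) ** 2 + lam * idx.size)

      pop = rng.random((40, n)) < 0.2         # population of binary source masks
      for gen in range(100):
          fit = np.array([cost(p) for p in pop])
          parents = pop[np.argsort(fit)[:20]]        # truncation selection
          i, j = rng.integers(0, 20, (2, 40))
          pop = np.where(rng.random((40, n)) < 0.5,  # uniform crossover
                         parents[i], parents[j])
          pop ^= rng.random((40, n)) < 0.01          # bit-flip mutation

      best = min(pop, key=cost)
      print("active sources:", np.flatnonzero(best))  # typically recovers [3, 11]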

  9. Minimally Invasive Transcatheter Aortic Valve Replacement (TAVR)

    MedlinePlus Videos and Cool Tools

    Watch a Broward Health surgeon perform a minimally invasive Transcatheter Aortic Valve Replacement (TAVR).

  10. Academic Achievement and Minimal Brain Dysfunction

    ERIC Educational Resources Information Center

    Edwards, R. Philip; And Others

    1971-01-01

    The investigation provided no evidence that a diagnosis of minimal brain dysfunction, based on a pediatric neurological evaluation and/or visual-motor impairment as measured by the Bender-Gestalt, is a useful predictor of academic achievement. (Author)

  11. Mixed waste minimization in a research environment

    SciTech Connect

    Kirner, N.

    1994-12-31

    This presentation describes minimization efforts and processes for mixed waste generated by research facilities. Waste stream assessment and treatment, and database management for various research-related waste streams, are detailed.

  12. Degreasing of titanium to minimize stress corrosion

    NASA Technical Reports Server (NTRS)

    Carpenter, S. R.

    1967-01-01

    Stress corrosion of titanium and its alloys at elevated temperatures is minimized by replacing trichloroethylene with methanol or methyl ethyl ketone as a degreasing agent. Wearing cotton gloves reduces stress corrosion from perspiration before the metal components are processed.

  13. Bi-quartic parametric polynomial minimal surfaces

    NASA Astrophysics Data System (ADS)

    Kassabov, O.; Vlachkova, K.

    2015-10-01

    Minimal surfaces with isothermal parameters admitting Bézier representation were studied by Cosín and Monterde. They showed that, up to an affine transformation, the Enneper surface is the only bi-cubic isothermal minimal surface. Here we study bi-quartic isothermal minimal surfaces and establish the general form of their generating functions in the Weierstrass representation formula. We apply an approach proposed by Ganchev to compute the normal curvature and show that, in contrast to the bi-cubic case, there is a variety of bi-quartic isothermal minimal surfaces. Based on the Bézier representation we establish some geometric properties of the bi-quartic harmonic surfaces. Numerical experiments are visualized and presented to illustrate and support our results.
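
    For reference, the Weierstrass representation formula mentioned above builds a minimal surface from a holomorphic function $f$ and a meromorphic function $g$ (standard form; the paper's normalization may differ):

      \[
        X(w) = \operatorname{Re} \int_{w_0}^{w} \Bigl( \tfrac{1}{2} f (1 - g^{2}),\; \tfrac{i}{2} f (1 + g^{2}),\; f g \Bigr) \, d\zeta ,
      \]

    with the Enneper surface recovered, up to scaling, from the simplest choice $f(\zeta) = 1$, $g(\zeta) = \zeta$.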

  14. Controlling molecular transport in minimal emulsions

    NASA Astrophysics Data System (ADS)

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of `minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions.

  15. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving

    PubMed Central

    Elfring, Jos; Appeldoorn, Rein; van den Dries, Sjoerd; Kwakkernaat, Maurice

    2016-01-01

    The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle’s surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture. PMID:27727171

  16. A modified secant method for unconstrained minimization

    NASA Technical Reports Server (NTRS)

    Polak, E.

    1972-01-01

    A gradient-secant algorithm for unconstrained optimization problems is presented. The algorithm uses Armijo gradient method iterations until it reaches a region where the Newton method is more efficient, and then switches over to a secant form of operation. It is concluded that an efficient method for unconstrained minimization has been developed, and that any convergent minimization method can be substituted for the Armijo gradient method.
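
    The switch-over logic is easy to sketch. The one-dimensional Python toy below (an invented quartic objective and arbitrary thresholds, a schematic of the idea rather than the paper's algorithm) takes Armijo-backtracked gradient steps while far from a stationary point, then hands over to a secant iteration on the derivative for fast local convergence.

      def f(x):
          return x**4 - 3 * x**2 + x        # toy smooth objective

      def df(x):
          return 4 * x**3 - 6 * x + 1

      def armijo_step(x, alpha0=1.0, beta=0.5, sigma=1e-4):
          # backtracking (Armijo) line search along the steepest-descent direction
          g, a = df(x), alpha0
          while f(x - a * g) > f(x) - sigma * a * g * g:
              a *= beta
          return x - a * g

      x = 2.0
      while abs(df(x)) > 1e-1:              # Armijo gradient phase: robust globally
          x = armijo_step(x)

      x_prev, x = x, armijo_step(x)         # seed the secant phase with two iterates
      while abs(df(x)) > 1e-8:              # secant phase: fast local convergence
          x, x_prev = x - df(x) * (x - x_prev) / (df(x) - df(x_prev)), x

      print(x, f(x))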

  17. Future of Minimally Invasive Colorectal Surgery

    PubMed Central

    Whealon, Matthew; Vinci, Alessio; Pigazzi, Alessio

    2016-01-01

    Minimally invasive surgery is slowly taking over as the preferred operative approach for colorectal diseases. However, many of the procedures remain technically difficult. This article will give an overview of the state of minimally invasive surgery and the many advances that have been made over the last two decades. Specifically, we discuss the introduction of the robotic platform and some of its benefits and limitations. We also describe some newer techniques related to robotics. PMID:27582647

  18. Gravitino problem in minimal supergravity inflation

    NASA Astrophysics Data System (ADS)

    Hasegawa, Fuminori; Mukaida, Kyohei; Nakayama, Kazunori; Terada, Takahiro; Yamada, Yusuke

    2017-04-01

    We study non-thermal gravitino production in the minimal supergravity inflation. In this minimal model utilizing orthogonal nilpotent superfields, the particle spectrum includes only graviton, gravitino, inflaton, and goldstino. We find that a substantial fraction of the cosmic energy density can be transferred to the longitudinal gravitino due to non-trivial change of its sound speed. This implies either a breakdown of the effective theory after inflation or a serious gravitino problem.

  19. Minimally Invasive Forefoot Surgery in France.

    PubMed

    Meusnier, Tristan; Mukish, Prikesht

    2016-06-01

    Study groups have been formed in France to advance the use of minimally invasive surgery. These techniques are being used more frequently, and their nuances continue to evolve. The objective of this article was to raise awareness of current trends in minimally invasive surgery for common diseases of the forefoot. Percutaneous surgery of the forefoot is less developed at this time, but it will also be discussed.

  20. Current research in sonic-boom minimization

    NASA Technical Reports Server (NTRS)

    Darden, C. M.; Mack, R. J.

    1976-01-01

    A review is given of several questions as yet unanswered in the area of sonic-boom research. Efforts, both at Langley and elsewhere, in the areas of minimization, human response, and design techniques, and in developing higher-order propagation methods, are discussed. In addition, a wind-tunnel test program being conducted to assess the validity of minimization methods based on a forward spike in the F-function is described.

  1. Minimally invasive treatment of infected pancreatic necrosis

    PubMed Central

    Cebulski, Włodzimierz; Słodkowski, Maciej; Krasnodębski, Ireneusz W.

    2014-01-01

    Infected pancreatic necrosis is a challenging complication that worsens prognosis in acute pancreatitis. For years, open necrosectomy has been the mainstay treatment option in infected pancreatic necrosis, although surgical debridement still results in high morbidity and mortality rates. Recently, many reports on minimally invasive treatment in infected pancreatic necrosis have been published. This paper presents a review of minimally invasive techniques and attempts to define their role in the management of infected pancreatic necrosis. PMID:25653725

  2. Mesonic spectroscopy of minimal walking technicolor

    SciTech Connect

    Del Debbio, Luigi; Lucini, Biagio; Patella, Agostino; Pica, Claudio; Rago, Antonio

    2010-07-01

    We investigate the structure and the novel emerging features of the mesonic nonsinglet spectrum of the minimal walking technicolor theory. Precision measurements in the nonsinglet pseudoscalar and vector channels are compared to the expectations for an IR-conformal field theory and a QCD-like theory. Our results favor a scenario in which minimal walking technicolor is (almost) conformal in the infrared, while spontaneous chiral symmetry breaking seems less plausible.

  3. Minimally invasive osteosynthesis technique for articular fractures.

    PubMed

    Beale, Brian S; Cole, Grayson

    2012-09-01

    Articular fractures require accurate reduction and rigid stabilization to decrease the chance of osteoarthritis and joint dysfunction. Articular fractures have been traditionally repaired by arthrotomy and internal fixation. Recently, minimally invasive techniques have been introduced to treat articular fractures, reducing patient morbidity and improving the accuracy of reduction. A variety of techniques, including distraction, radiographic imaging, and arthroscopy, are used with the minimally invasive osteosynthesis technique of articular fractures to achieve a successful repair and outcome.

  4. Alternating minimization and Boltzmann machine learning.

    PubMed

    Byrne, W

    1992-01-01

    Training a Boltzmann machine with hidden units is appropriately treated in information geometry using the information divergence and the technique of alternating minimization. The resulting algorithm is shown to be closely related to gradient descent Boltzmann machine learning rules, and the close relationship of both to the EM algorithm is described. An iterative proportional fitting procedure for training machines without hidden units is described and incorporated into the alternating minimization algorithm.
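
    The alternating-minimization pattern itself can be shown on a much simpler distance-like objective than the information divergence. The Python sketch below alternately minimizes ||x - y||^2 over two invented convex sets (a line and a disc), each half-step being a closed-form projection; the objective can only decrease, which is the same monotonicity the divergence-based treatment relies on.

      import numpy as np

      def proj_line(y):                     # closest point on the line x2 = 2*x1 + 1
          d, p0 = np.array([1.0, 2.0]), np.array([0.0, 1.0])
          return p0 + (np.dot(y - p0, d) / np.dot(d, d)) * d

      def proj_disc(x, c=np.array([3.0, 0.0]), r=1.0):   # closest point in a disc
          v = x - c
          n = np.linalg.norm(v)
          return x if n <= r else c + r * v / n

      x = np.array([0.0, 1.0])
      for _ in range(50):
          y = proj_disc(x)                  # minimize over y with x held fixed
          x = proj_line(y)                  # minimize over x with y held fixed
      print(x, y, np.linalg.norm(x - y))    # distance has decreased monotonically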

  5. Minimally Invasive Osteotomies of the Calcaneus.

    PubMed

    Guyton, Gregory P

    2016-09-01

    Osteotomies of the calcaneus are powerful surgical tools, representing a critical component of the surgical reconstruction of pes planus and pes cavus deformity. Modern minimally invasive calcaneal osteotomies can be performed safely with a burr through a lateral incision. Although greater kerf is generated with the burr, the effect is modest, can be minimized, and is compatible with many fixation techniques. A hinged jig renders the procedure more reproducible and accessible.

  6. Minimalism in Art, Medical Science and Neurosurgery.

    PubMed

    Ökten, Ali İhsan

    2016-12-21

    The word 'minimalism' derives from the French word 'minimum'. Whereas the lexical meaning of minimum is 'the least or the smallest quantity necessary for something', in mathematics it denotes the lowest value a variable can reach. Minimalism, which advocates an extreme simplicity of artistic form, is a current in modern art and music that originated in the 1960s and features simplicity and objectivity. Although art, science and philosophy are different disciplines, they support each other from time to time; sometimes they intertwine and sometimes they copy each other. A school or teaching arising in one of them can be absorbed by the others, so that they advance by empowering one another. This also holds for minimalism in art and minimally invasive surgical approaches in science. Concepts central to minimalism in art, such as doing with less, avoiding unnecessary materials and reducing the number of elements in order to increase the effect of the expression, have found their equivalents in medicine and neurosurgery: to protect the physical integrity of the patient with less iatrogenic injury and minimum damage while achieving the same therapeutic effect in the most effective way, and to enable the patient to regain health in the shortest span of time.

  7. Economic impact of minimally invasive lumbar surgery

    PubMed Central

    Hofstetter, Christoph P; Hofer, Anna S; Wang, Michael Y

    2015-01-01

    Cost effectiveness has been demonstrated for traditional lumbar discectomy, lumbar laminectomy as well as for instrumented and noninstrumented arthrodesis. While emerging evidence suggests that minimally invasive spine surgery reduces morbidity and duration of hospitalization and accelerates return to activities of daily living, data regarding the cost effectiveness of these novel techniques are limited. The current study analyzes all available data on minimally invasive techniques for lumbar discectomy, decompression, short-segment fusion and deformity surgery. In general, minimally invasive spine procedures appear to hold promise in quicker patient recovery times and earlier return to work. Thus, minimally invasive lumbar spine surgery appears to have the potential to be a cost-effective intervention. Moreover, novel less invasive procedures are less destabilizing and may therefore be utilized in certain indications that traditionally required arthrodesis procedures. However, there is a lack of studies analyzing the economic impact of minimally invasive spine surgery. Future studies are necessary to confirm the durability and further define indications for minimally invasive lumbar spine procedures. PMID:25793159

  8. Minimally Invasive Surgery in Gynecologic Oncology

    PubMed Central

    Mori, Kristina M.; Neubauer, Nikki L.

    2013-01-01

    Minimally invasive surgery has been utilized in the field of obstetrics and gynecology as far back as the 1940s when culdoscopy was first introduced as a visualization tool. Gynecologists then began to employ minimally invasive surgery for adhesiolysis and obtaining biopsies but then expanded its use to include procedures such as tubal sterilization (Clyman (1963), L. E. Smale and M. L. Smale (1973), Thompson and Wheeless (1971), Peterson and Behrman (1971)). With advances in instrumentation, the first laparoscopic hysterectomy was successfully performed in 1989 by Reich et al. At the same time, minimally invasive surgery in gynecologic oncology was being developed alongside its benign counterpart. In 1975, Rosenoff et al. reported using peritoneoscopy for pretreatment evaluation in ovarian cancer, and Spinelli et al. reported on using laparoscopy for the staging of ovarian cancer. In 1993, Nichols used operative laparoscopy to perform pelvic lymphadenectomy in cervical cancer patients. The initial goals of minimally invasive surgery, not dissimilar to those of modern medicine, were to decrease the morbidity and mortality associated with surgery and therefore improve patient outcomes and patient satisfaction. This review will summarize the history and use of minimally invasive surgery in gynecologic oncology and also highlight new minimally invasive surgical approaches currently in development. PMID:23997959

  9. Cluster Stability Estimation Based on a Minimal Spanning Trees Approach

    NASA Astrophysics Data System (ADS)

    Volkovich, Zeev (Vladimir); Barzily, Zeev; Weber, Gerhard-Wilhelm; Toledano-Kitai, Dvora

    2009-08-01

    Among the areas of data and text mining which are employed today in science, economy and technology, clustering theory serves as a preprocessing step in data analysis. However, there are many open questions still waiting for a theoretical and practical treatment, e.g., the problem of determining the true number of clusters has not been satisfactorily solved. In the current paper, this problem is addressed by the cluster stability approach. For several possible numbers of clusters we estimate the stability of partitions obtained from clustering of samples. Partitions are considered consistent if their clusters are stable. Cluster validity is measured as the total number of edges, in the clusters' minimal spanning trees, connecting points from different samples; this is the Friedman and Rafsky two-sample test statistic. The homogeneity hypothesis, of well-mingled samples within the clusters, leads to an asymptotically normal distribution of this statistic. Resting upon this fact, the standard score of the edge count is computed, and the partition quality is represented by the worst cluster, the one with the minimal standard score. It is natural to expect that the true number of clusters can be characterized by the empirical distribution having the shortest left tail. The proposed methodology sequentially creates the described value distribution and estimates its left-asymmetry. Numerical experiments, presented in the paper, demonstrate the ability of the approach to detect the true number of clusters.
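
    A minimal sketch of the edge-count statistic, assuming Euclidean data and substituting a permutation null for the asymptotic normal approximation (SciPy's MST routine stands in for whatever implementation the authors used; all sizes are invented):

      import numpy as np
      from scipy.spatial.distance import pdist, squareform
      from scipy.sparse.csgraph import minimum_spanning_tree

      rng = np.random.default_rng(1)

      def cross_edges(points, labels):
          # number of MST edges joining points from different samples
          mst = minimum_spanning_tree(squareform(pdist(points)))
          i, j = mst.nonzero()
          return int(np.sum(labels[i] != labels[j]))

      pts = rng.normal(size=(60, 2))                  # two samples, well mingled
      lab = np.array([0] * 30 + [1] * 30)
      t_obs = cross_edges(pts, lab)

      null = [cross_edges(pts, rng.permutation(lab)) for _ in range(500)]
      z = (t_obs - np.mean(null)) / np.std(null)      # standard score of the count
      print(t_obs, z)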

  10. Minimizing liability by properly planning UST system upgrades

    SciTech Connect

    Kroon, D.H.; Baach, M.K.

    1995-12-31

    Existing underground storage tank (UST) systems containing regulated substances, including petroleum products, are defined by the Environmental Protection Agency (EPA) as those installed prior to December 22, 1988. Under the federal regulations (40 CFR Parts 280 and 281), these systems must be upgraded to new standards by December 22, 1998 in the areas of spill and overfill prevention, corrosion protection, and leak detection. Properly planned UST system upgrades provide safety and environmental protection plus: compliance with federal regulations; minimum public liability; and reduced insurance premiums. Some modification to this program will be required where state and local regulations are more strict than the federal requirements. Minimizing liability at reduced costs is the key element of the upgrade program. Although the regulatory requirements must be satisfied, the paramount issue is to minimize exposure to public liability. The methodology presented has been demonstrated to economically achieve that very important goal. In a recent case history, a major operator of UST systems adopted this program and was rewarded by his insurance company with over a 50% reduction in premiums for pollution liability insurance. The upgrade program for existing UST systems consists of: general planning; site investigation; specific plan development; implementation; and monitoring and records.

  11. Sequential unconstrained minimization algorithms for constrained optimization

    NASA Astrophysics Data System (ADS)

    Byrne, Charles

    2008-02-01

    The problem of minimizing a function $f(x): \mathbb{R}^J \rightarrow \mathbb{R}$, subject to constraints on the vector variable $x$, occurs frequently in inverse problems. Even without constraints, finding a minimizer of $f(x)$ may require iterative methods. We consider here a general class of iterative algorithms that find a solution to the constrained minimization problem as the limit of a sequence of vectors, each solving an unconstrained minimization problem. Our sequential unconstrained minimization algorithm (SUMMA) is an iterative procedure for constrained minimization. At the $k$th step we minimize the function $G_k(x) = f(x) + g_k(x)$ to obtain $x^k$. The auxiliary functions $g_k(x): D \subseteq \mathbb{R}^J \rightarrow \mathbb{R}_+$ are nonnegative on the set $D$, each $x^k$ is assumed to lie within $D$, and the objective is to minimize the continuous function $f: \mathbb{R}^J \rightarrow \mathbb{R}$ over $x$ in the set $C = \overline{D}$, the closure of $D$. We assume that such minimizers exist, and denote one such by $\hat{x}$. We assume that the functions $g_k(x)$ satisfy the inequalities $0 \leq g_k(x) \leq G_{k-1}(x) - G_{k-1}(x^{k-1})$ for $k = 2, 3, \ldots$. Using this assumption, we show that the sequence $\{f(x^k)\}$ is decreasing and converges to $f(\hat{x})$. If the restriction of $f(x)$ to $D$ has bounded level sets, which happens if $\hat{x}$ is unique and $f(x)$ is closed, proper and convex, then the sequence $\{x^k\}$ is bounded, and $f(x^*) = f(\hat{x})$ for any cluster point $x^*$. Therefore, if $\hat{x}$ is unique, $x^* = \hat{x}$ and $\{x^k\} \rightarrow \hat{x}$. When $\hat{x}$ is not unique, convergence can still be obtained in particular cases. The SUMMA includes, as particular cases, the well-known barrier- and penalty-function methods, the simultaneous multiplicative algebraic reconstruction technique (SMART), the proximal minimization algorithm of Censor and Zenios, the entropic proximal methods of Teboulle, as well as certain cases of gradient descent and the Newton-Raphson method. The proof techniques used for SUMMA can be extended to obtain related results for the induced proximal
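
    A barrier-function method, which the abstract cites as a particular case of SUMMA, is simple to sketch. In the Python toy below (invented objective, feasible set $C = [1, 10]$ and $1/k$ barrier weights, not Byrne's notation), each outer step solves an unconstrained problem $G_k = f + g_k$ and the iterates approach the constrained minimizer on the boundary from the interior:

      import numpy as np
      from scipy.optimize import minimize_scalar

      f = lambda x: (x - 0.5) ** 2           # unconstrained minimum at 0.5, infeasible

      xk = 5.0
      for k in range(1, 13):
          mu = 1.0 / k                                  # decreasing barrier weight
          Gk = lambda x: f(x) - mu * np.log(x - 1.0)    # G_k(x) = f(x) + g_k(x)
          xk = minimize_scalar(Gk, bounds=(1.0 + 1e-9, 10.0), method="bounded").x
      print(xk, f(xk))    # xk approaches the constrained minimizer x = 1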

  12. Teaching Stimulus-Stimulus Relations to Minimally Verbal Individuals: Reflections on Technology and Future Directions

    PubMed Central

    McIlvane, W. J.; Gerard, C. J.; Kledaras, J. B.; Mackay, H. A.; Lionello-DeNolf, K. M.

    2016-01-01

    This paper discusses recent methodological approaches and investigations that are aimed at developing reliable behavioral technology for teaching stimulus-stimulus relations to individuals who are minimally verbal and show protracted difficulty in acquiring such relations. The paper has both empirical and theoretical content. The empirical component presents recent data concerning the possibility of generating rapid relational learning in individuals who do not initially show it. The theoretical component (1) considers decades of methodological investigations with this population and (2) suggests a testable hypothesis concerning why some individuals exhibit unusual difficulties in learning. Given this background, we suggest a way forward to better understand and perhaps resolve these learning challenges. PMID:28490976

  13. Prioritization methodology for chemical replacement

    NASA Technical Reports Server (NTRS)

    Goldberg, Ben; Cruit, Wendy; Schutzenhofer, Scott

    1995-01-01

    This methodology serves to define a system for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semi quantitative approach derived from quality function deployment techniques (QFD Matrix). QFD is a conceptual map that provides a method of transforming customer wants and needs into quantitative engineering terms. This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives.
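
    A minimal sketch of the QFD-style weighting described above, with invented criteria weights and invented 1-9 candidate ratings (a real matrix would come from the requirements analysis):

      import numpy as np

      criteria = ["environmental", "cost", "safety", "reliability", "programmatic"]
      weights = np.array([0.30, 0.20, 0.25, 0.15, 0.10])      # must sum to 1

      scores = {                       # hypothetical replacement candidates
          "solvent A": np.array([8, 4, 7, 6, 5]),
          "solvent B": np.array([5, 8, 6, 7, 6]),
      }
      priority = {name: float(weights @ s) for name, s in scores.items()}
      print(sorted(priority.items(), key=lambda kv: -kv[1]))  # ranked candidates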

  15. The GO-FLOW methodology

    SciTech Connect

    Matsuoka, T.; Kobayashi, M.; Takemura, K.

    1989-03-01

    A reliability analysis using the GO-FLOW methodology is given for the emergency core cooling system (ECCS) of a marine reactor experiencing either a collision or a grounding accident. The analysis is an example of a phased mission problem, and the system is a relatively large system with 90 components. An overview of the GO-FLOW methodology, a description of the ECCS, and the analysis procedure are given. Time-dependent mission unreliabilities under three accident conditions are obtained by one GO-FLOW chart with one computer run. The GO-FLOW methodology has proved to be a useful tool for probabilistic safety assessments of actual systems.

  16. Intrasulcal electrocorticography in macaque monkeys with minimally invasive neurosurgical protocols.

    PubMed

    Matsuo, Takeshi; Kawasaki, Keisuke; Osada, Takahiro; Sawahata, Hirohito; Suzuki, Takafumi; Shibata, Masahiro; Miyakawa, Naohisa; Nakahara, Kiyoshi; Iijima, Atsuhiko; Sato, Noboru; Kawai, Kensuke; Saito, Nobuhito; Hasegawa, Isao

    2011-01-01

    Electrocorticography (ECoG), multichannel brain-surface recording and stimulation with probe electrode arrays, has become a potent methodology not only for clinical neurosurgery but also for basic neuroscience using animal models. The highly evolved primate's brain has deep cerebral sulci, and both gyral and intrasulcal cortical regions have been implicated in important functional processes. However, direct experimental access is typically limited to gyral regions, since placing probes into sulci is difficult without damaging the surrounding tissues. Here we describe a novel methodology for intrasulcal ECoG in macaque monkeys. We designed and fabricated ultra-thin flexible probes for macaques with micro-electro-mechanical systems technology. We developed minimally invasive operative protocols to implant the probes by introducing cutting-edge devices for human neurosurgery. To evaluate the feasibility of intrasulcal ECoG, we conducted electrophysiological recording and stimulation experiments. First, we inserted parts of the Parylene-C-based probe into the superior temporal sulcus to compare visually evoked ECoG responses from the ventral bank of the sulcus with those from the surface of the inferior temporal cortex. Analyses of power spectral density and signal-to-noise ratio revealed that the quality of the ECoG signal was comparable inside and outside of the sulcus. Histological examination revealed no obvious physical damage in the implanted areas. Second, we placed a modified silicone ECoG probe into the central sulcus and also on the surface of the precentral gyrus for stimulation. Thresholds for muscle twitching were significantly lower during intrasulcal stimulation compared to gyral stimulation. These results demonstrate the feasibility of intrasulcal ECoG in macaques. The novel methodology proposed here opens up a new frontier in neuroscience research, enabling the direct measurement and manipulation of electrical activity in the whole brain.

  17. [Ancient DNA: principles and methodologies].

    PubMed

    De Angelis, Flavio; Scorrano, Gabriele; Rickards, Olga

    2013-01-01

    Paleogenetics is providing increasing evidence about the biological characteristics of ancient populations. This paper examines the guiding principles and methodologiesic to the study of ancient DNA with constant references to the state of the art in this fascinating discipline.

  18. Environmental probabilistic quantitative assessment methodologies

    USGS Publications Warehouse

    Crovelli, R.A.

    1995-01-01

    In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: 1) direct assessment, 2) accumulation size, 3) volumetric yield, and 4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz. TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are computational speed and flexibility, making it ideal for a microcomputer. -from Author

  19. Methodological Problems of Soviet Pedagogy

    ERIC Educational Resources Information Center

    Noah, Harold J., Ed.; Beach, Beatrice S., Ed.

    1974-01-01

    Selected papers presented at the First Scientific Conference of Pedagogical Scholars of Socialist Countries, Moscow, 1971, deal with methodology in relation to science, human development, sociology, psychology, cybernetics, and the learning process. (KM)

  20. Mach, methodology, hysteresis and economics

    NASA Astrophysics Data System (ADS)

    Cross, R.

    2008-11-01

    This methodological note examines the epistemological foundations of hysteresis with particular reference to applications to economic systems. Ernst Mach's principles of economy are advocated and used in this assessment.

  1. DSN data systems software methodology

    NASA Technical Reports Server (NTRS)

    Hung, C. K.

    1982-01-01

    A software methodology for JPL Deep Space Network (DSN) data systems software implementations, through transfer and delivery, is presented. The DSN Data Systems Software Methodology is compatible with and depends on the DSN software methodology, and also incorporates the characteristics of real-time program development in a DSN environment. The DSN Data Systems software implementation consists of a series of six distinct phases. An independent group is responsible for verification and validation of the DSN Data Systems software during the development phases. The DSN data systems software methodology is applied to all development software provided for or by the DSN data systems section in Mark IV where there is a desire for reliability, maintainability, and usability within budget and schedule constraints.

  2. Reflective Methodology: The Beginning Teacher

    ERIC Educational Resources Information Center

    Templeton, Ronald K.; Siefert, Thomas E.

    1970-01-01

    Offers a variety of specific techniques which will help the beginning teacher to implement reflective methodology and create an inquiry-centered classroom atmosphere, at the same time meeting the many more pressing demands of first-year teaching. (JES)

  3. Minimally invasive procedures on the lumbar spine

    PubMed Central

    Skovrlj, Branko; Gilligan, Jeffrey; Cutler, Holt S; Qureshi, Sheeraz A

    2015-01-01

    Degenerative disease of the lumbar spine is a common and increasingly prevalent condition that is often implicated as the primary reason for chronic low back pain and the leading cause of disability in the western world. Surgical management of lumbar degenerative disease has historically been approached by way of open surgical procedures aimed at decompressing and/or stabilizing the lumbar spine. Advances in technology and surgical instrumentation have led to minimally invasive surgical techniques being developed and increasingly used in the treatment of lumbar degenerative disease. Compared to the traditional open spine surgery, minimally invasive techniques require smaller incisions and decrease approach-related morbidity by avoiding muscle crush injury by self-retaining retractors, preventing the disruption of tendon attachment sites of important muscles at the spinous processes, using known anatomic neurovascular and muscle planes, and minimizing collateral soft-tissue injury by limiting the width of the surgical corridor. The theoretical benefits of minimally invasive surgery over traditional open surgery include reduced blood loss, decreased postoperative pain and narcotics use, shorter hospital length of stay, faster recovery and quicker return to work and normal activity. This paper describes the different minimally invasive techniques that are currently available for the treatment of degenerative disease of the lumbar spine. PMID:25610845

  4. Exploration, novelty, surprise, and free energy minimization.

    PubMed

    Schwartenbeck, Philipp; Fitzgerald, Thomas; Dolan, Raymond J; Friston, Karl

    2013-01-01

    This paper reviews recent developments under the free energy principle that introduce a normative perspective on classical economic (utilitarian) decision-making based on (active) Bayesian inference. It has been suggested that the free energy principle precludes novelty and complexity, because it assumes that biological systems, like ourselves, try to minimize the long-term average of surprise to maintain their homeostasis. However, recent formulations show that minimizing surprise leads naturally to concepts such as exploration and novelty bonuses. In this approach, agents infer a policy that minimizes surprise by minimizing the difference (or relative entropy) between likely and desired outcomes, which involves both pursuing the goal-state that has the highest expected utility (often termed "exploitation") and visiting a number of different goal-states ("exploration"). Crucially, the opportunity to visit new states increases the value of the current state. Casting decision-making problems within a variational framework, therefore, predicts that our behavior is governed by both the entropy and expected utility of future states. This dissolves any dialectic between minimizing surprise and exploration or novelty seeking.
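
    A toy numeric illustration of the selection rule sketched above: each hypothetical policy predicts a distribution over outcome states, and the agent picks the policy minimizing the relative entropy to the desired (prior preference) distribution. All numbers are invented.

      import numpy as np

      def kl(p, q):
          return float(np.sum(p * np.log(p / q)))

      desired = np.array([0.7, 0.2, 0.1])           # prior preference over outcomes

      policies = {                                  # predicted outcome distributions
          "exploit": np.array([0.80, 0.15, 0.05]),  # heads straight for the goal
          "explore": np.array([0.50, 0.30, 0.20]),  # samples a wider range of states
      }
      scores = {name: kl(p, desired) for name, p in policies.items()}
      print(scores, "->", min(scores, key=scores.get))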

  5. Blackfolds, plane waves and minimal surfaces

    NASA Astrophysics Data System (ADS)

    Armas, Jay; Blau, Matthias

    2015-07-01

    Minimal surfaces in Euclidean space provide examples of possible non-compact horizon geometries and topologies in asymptotically flat space-time. On the other hand, the existence of limiting surfaces in the space-time provides a simple mechanism for making these configurations compact. Limiting surfaces appear naturally in a given space-time by making minimal surfaces rotate but they are also inherent to plane wave or de Sitter space-times in which case minimal surfaces can be static and compact. We use the blackfold approach in order to scan for possible black hole horizon geometries and topologies in asymptotically flat, plane wave and de Sitter space-times. In the process we uncover several new configurations, such as black helicoids and catenoids, some of which have an asymptotically flat counterpart. In particular, we find that the ultraspinning regime of singly-spinning Myers-Perry black holes, described in terms of the simplest minimal surface (the plane), can be obtained as a limit of a black helicoid, suggesting that these two families of black holes are connected. We also show that minimal surfaces embedded in spheres rather than Euclidean space can be used to construct static compact horizons in asymptotically de Sitter space-times.
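
    For orientation, the helicoid and catenoid named above are the classical minimal surfaces with standard parametrizations (pitch/neck parameter $c > 0$; conventions may differ from the paper's):

      \[
        X_{\mathrm{hel}}(u, v) = ( u \cos v,\; u \sin v,\; c\, v ) ,
        \qquad
        X_{\mathrm{cat}}(u, v) = \Bigl( c \cosh\tfrac{u}{c} \cos v,\; c \cosh\tfrac{u}{c} \sin v,\; u \Bigr) .
      \]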

  6. Methodological Innovations for Studying Organizations.

    DTIC Science & Technology

    1981-01-01

    A project sponsored by Division 14 (Industrial and Organizational Psychology) of the American Psychological Association. In 1977, the president of the division (John P. Campbell) appointed

  7. [Guidelines for nursing methodology implantation].

    PubMed

    Alberdi Castell, Rosamaría; Artigas Lelong, Berta; Cuxart Ainaud, Núria; Agüera Ponce, Ana

    2003-09-01

    The authors introduce three guidelines as part of the process to implement the nursing methodology based on the Virginia Henderson Conceptual Model; they are intended to help nurses adopt the aforementioned method in their daily practice. These three guidelines shall be published in successive articles: Guidelines to identify attitudes and aptitudes related to the nursing profession; Guidelines to implement the nursing methodology based on the Virginia Henderson Conceptual Model; and Guidelines to plan areas for improvement.

  8. Minimally invasive optical biopsy for oximetry

    NASA Astrophysics Data System (ADS)

    van der Putten, Marieke A.; Brewer, James M.; Harvey, Andrew R.

    2017-02-01

    The study of localised oxygen saturation in blood vessels can shed light on the etiology and progression of many diseases with which hypoxia is associated. For example, hypoxia in the tendon has been linked to early stages of rheumatoid arthritis, an auto-immune inflammatory disease. Vascular oximetry of deep tissue presents significant challenges as vessels are not optically accessible. In this paper, we present a novel multispectral imaging technique for vascular oximetry, and recent developments made towards its adaptation for minimally invasive imaging. We present proof-of-concept of the system and illumination scheme as well as the analysis technique. We present results of a validation study performed in vivo on mice with acutely inflamed tendons. Adaptation of the technique for minimally invasive microendoscopy is also presented, along with preliminary results of minimally invasive ex vivo vascular oximetry.

  9. Minimal perceptrons for memorizing complex patterns

    NASA Astrophysics Data System (ADS)

    Pastor, Marissa; Song, Juyong; Hoang, Danh-Tai; Jo, Junghyo

    2016-11-01

    Feedforward neural networks have been investigated to understand learning and memory, as well as applied to numerous practical problems in pattern classification. It is a rule of thumb that more complex tasks require larger networks. However, the design of optimal network architectures for specific tasks is still an unsolved fundamental problem. In this study, we consider three-layered neural networks for memorizing binary patterns. We developed a new complexity measure of binary patterns, and estimated the minimal network size for memorizing them as a function of their complexity. We formulated the minimal network size for regular, random, and complex patterns. In particular, the minimal size for complex patterns, which are neither ordered nor disordered, was predicted by measuring their Hamming distances from known ordered patterns. Our predictions agree with simulations based on the back-propagation algorithm.
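
    A minimal sketch of the size-search idea, using scikit-learn's back-propagation-trained MLPClassifier as a stand-in for the paper's networks; the random pattern set, labels and candidate sizes are invented:

      import numpy as np
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(0)
      X = rng.integers(0, 2, size=(40, 16)).astype(float)   # 40 binary patterns
      y = rng.integers(0, 2, size=40)                       # arbitrary binary labels

      for hidden in range(1, 21):        # grow the hidden layer until memorization
          net = MLPClassifier(hidden_layer_sizes=(hidden,), activation="tanh",
                              max_iter=5000, tol=1e-6, random_state=0)
          net.fit(X, y)
          if net.score(X, y) == 1.0:
              print("memorized with", hidden, "hidden units")
              break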

  10. Minimal Length Scale Scenarios for Quantum Gravity.

    PubMed

    Hossenfelder, Sabine

    2013-01-01

    We review the question of whether the fundamental laws of nature limit our ability to probe arbitrarily short distances. First, we examine what insights can be gained from thought experiments for probes of shortest distances, and summarize what can be learned from different approaches to a theory of quantum gravity. Then we discuss some models that have been developed to implement a minimal length scale in quantum mechanics and quantum field theory. These models have entered the literature as the generalized uncertainty principle or the modified dispersion relation, and have allowed the study of the effects of a minimal length scale in quantum mechanics, quantum electrodynamics, thermodynamics, black-hole physics and cosmology. Finally, we touch upon the question of ways to circumvent the manifestation of a minimal length scale in short-distance physics.

  11. Minimally invasive neurosurgery for cerebrospinal fluid disorders.

    PubMed

    Guillaume, Daniel J

    2010-10-01

    This article focuses on minimally invasive approaches used to address disorders of cerebrospinal fluid (CSF) circulation. The author covers the primary CSF disorders that are amenable to minimally invasive treatment, including aqueductal stenosis, fourth ventricular outlet obstruction (including Chiari malformation), isolated lateral ventricle, isolated fourth ventricle, multiloculated hydrocephalus, arachnoid cysts, and tumors that block CSF flow. General approaches to evaluating disorders of CSF circulation, including detailed imaging studies, are discussed. Approaches to minimally invasive management of such disorders are described in general, and for each specific entity. For each procedure, indications, surgical technique, and known outcomes are detailed. Specific complications as well as strategies for their avoidance and management are addressed. Lastly, future directions and the need for structured outcome studies are discussed.

  12. Genetic Research on Biospecimens Poses Minimal Risk

    PubMed Central

    Wendler, David S.; Rid, Annette

    2014-01-01

    Genetic research on human biospecimens is increasingly common. Yet, debate continues over the level of risk that this research poses to sample donors. Some argue that genetic research on biospecimens poses minimal risk; others argue that it poses greater than minimal risk and therefore needs additional requirements and limitations. This debate raises concern that some donors are not receiving appropriate protection or, conversely, that valuable research is being subject to unnecessary requirements and limitations. The present paper attempts to address this concern using the widely-endorsed ‘risks of daily life’ standard. The three extant versions of this standard all suggest that, with proper measures in place to protect donor confidentiality, most genetic research on human biospecimens poses minimal risk to donors. PMID:25530152

  13. Approximate error conjugation gradient minimization methods

    DOEpatents

    Kallman, Jeffrey S

    2013-05-21

    In one embodiment, a method includes selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, calculating an approximate error using the subset of rays, and calculating a minimum in a conjugate gradient direction based on the approximate error. In another embodiment, a system includes a processor for executing logic, logic for selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, logic for calculating an approximate error using the subset of rays, and logic for calculating a minimum in a conjugate gradient direction based on the approximate error. In other embodiments, computer program products, methods, and systems are described capable of using approximate error in constrained conjugate gradient minimization problems.
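
    The following Python sketch is only a schematic of the subset-of-rays idea (the error and its gradient are approximated from a random subset of rows of a hypothetical ray matrix inside a conjugate-gradient-style loop); it is not the patented method and omits the constraint handling.

      import numpy as np

      rng = np.random.default_rng(0)
      A = rng.normal(size=(400, 50))          # one row per simulated ray
      b = A @ rng.normal(size=50)

      x, d, g_old = np.zeros(50), np.zeros(50), None
      for _ in range(200):
          rows = rng.choice(400, size=80, replace=False)   # subset of rays
          As, bs = A[rows], b[rows]
          g = As.T @ (As @ x - bs)             # approximate error gradient
          if g_old is None:
              d = -g
          else:
              beta = max(0.0, g @ (g - g_old) / (g_old @ g_old))  # Polak-Ribiere
              d = -g + beta * d                # heuristic: gradient is re-subsampled
          Ad = As @ d
          x = x - (g @ d) / (Ad @ Ad) * d      # exact line minimum on the subsample
          g_old = g
      print(np.linalg.norm(A @ x - b))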

  14. PRIME: Phase Retrieval via Majorization-Minimization

    NASA Astrophysics Data System (ADS)

    Qiu, Tianyu; Babu, Prabhu; Palomar, Daniel P.

    2016-10-01

    This paper considers the phase retrieval problem in which measurements consist of only the magnitude of several linear measurements of the unknown, e.g., spectral components of a time sequence. We develop low-complexity algorithms with superior performance based on the majorization-minimization (MM) framework. The proposed algorithms are referred to as PRIME: Phase Retrieval vIa the Majorization-minimization techniquE. They are preferred to existing benchmark methods since at each iteration a simple surrogate problem is solved with a closed-form solution that monotonically decreases the original objective function. In total, four algorithms are proposed using different majorization-minimization techniques. Experimental results validate that our algorithms outperform existing methods in terms of successful recovery and mean square error under various settings.
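
    The sketch below is not PRIME itself; it shows the MM principle the abstract describes, using the classical fixed-phase surrogate: freezing the phase of Ax gives ||Ax - y exp(i arg(Ax_k))||^2, which majorizes the magnitude-fit objective and is tight at the current iterate, so its least-squares minimizer cannot increase the objective. Problem sizes are invented.

      import numpy as np

      rng = np.random.default_rng(0)
      m, n = 128, 32
      A = rng.normal(size=(m, n)) + 1j * rng.normal(size=(m, n))
      y = np.abs(A @ (rng.normal(size=n) + 1j * rng.normal(size=n)))  # magnitudes

      x = rng.normal(size=n) + 1j * rng.normal(size=n)    # random initialization
      for _ in range(300):
          c = y * np.exp(1j * np.angle(A @ x))   # surrogate target: phases frozen
          x, *_ = np.linalg.lstsq(A, c, rcond=None)
      print(np.sum((np.abs(A @ x) - y) ** 2))    # monotonically decreased objective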

  15. Responsible gambling: general principles and minimal requirements.

    PubMed

    Blaszczynski, Alex; Collins, Peter; Fong, Davis; Ladouceur, Robert; Nower, Lia; Shaffer, Howard J; Tavares, Hermano; Venisse, Jean-Luc

    2011-12-01

    Many international jurisdictions have introduced responsible gambling programs. These programs intend to minimize negative consequences of excessive gambling, but vary considerably in their aims, focus, and content. Many responsible gambling programs lack a conceptual framework and, in the absence of empirical data, their components are based only on general considerations and impressions. This paper outlines the consensus viewpoint of an international group of researchers suggesting fundamental responsible gambling principles, roles of key stakeholders, and minimal requirements that stakeholders can use to frame and inform responsible gambling programs across jurisdictions. Such a framework does not purport to offer value statements regarding the legal status of gambling or its expansion. Rather, it proposes gambling-related initiatives aimed at government, industry, and individuals to promote responsible gambling and consumer protection. This paper argues that there is a set of basic principles and minimal requirements that should form the basis for every responsible gambling program.

  16. On Equilibria for ADM Minimization Games

    NASA Astrophysics Data System (ADS)

    Epstein, Leah; Levin, Asaf

    In the ADM minimization problem, the input is a set of arcs along a directed ring. The input arcs need to be partitioned into non-overlapping chains and cycles so as to minimize the total number of endpoints, where a k-arc cycle contributes k endpoints and a k-arc chain contributes k + 1 endpoints. We study the ADM minimization problem both as a non-cooperative and as a cooperative game. In these games, each arc corresponds to a player, and the players share the cost of the ADM switches. We consider two cost allocation models, a model which was considered by Flammini et al., and a new cost allocation model, which is inspired by congestion games. We compare the price of anarchy and price of stability in the two cost allocation models, as well as the strong price of anarchy and the strong price of stability.
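
    The endpoint accounting is easy to make concrete; here is a hypothetical partition of six ring arcs:

      def adm_cost(chains, cycles):
          # a k-arc cycle contributes k endpoints; a k-arc chain contributes k + 1
          return sum(len(c) for c in cycles) + sum(len(c) + 1 for c in chains)

      # one 2-arc chain (3 endpoints) and one 4-arc cycle (4 endpoints): cost 7
      print(adm_cost(chains=[["a1", "a2"]], cycles=[["a3", "a4", "a5", "a6"]]))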

  17. Waste Minimization Measurement and Progress Reporting

    SciTech Connect

    Stone, K.A.

    1995-02-13

    Westinghouse Savannah River Company is implementing productivity improvement concepts into the Waste Minimization Program by focusing on the positive initiatives taken to reduce waste generation at the Savannah River Site. Previous performance measures, based only on waste generation rates, proved to be an ineffective metric for measuring performance and promoting continuous improvements within the Program. Impacts of mission changes and non-routine operations impeded development of baseline waste generation rates and often negated waste generation trending reports. A system was developed to quantify, document and track innovative activities that impact waste volume and radioactivity/toxicity reductions. This system, coupled with management-driven waste disposal avoidance goals, is proving to be a powerful tool to promote waste minimization awareness and the implementation of waste reduction initiatives. Measurement of waste not generated, in addition to waste generated, increases the credibility of the Waste Minimization Program, improves sharing of success stories, and supports development of regulatory and management reports.

  18. One-dimensional Gromov minimal filling problem

    NASA Astrophysics Data System (ADS)

    Ivanov, Alexandr O.; Tuzhilin, Alexey A.

    2012-05-01

    The paper is devoted to a new branch in the theory of one-dimensional variational problems with branching extremals, the investigation of one-dimensional minimal fillings introduced by the authors. On the one hand, this problem is a one-dimensional version of a generalization of Gromov's minimal fillings problem to the case of stratified manifolds. On the other hand, this problem is interesting in itself and also can be considered as a generalization of another classical problem, the Steiner problem on the construction of a shortest network connecting a given set of terminals. Besides the statement of the problem, we discuss several properties of the minimal fillings and state several conjectures. Bibliography: 38 titles.

  19. Advanced pyrochemical technologies for minimizing nuclear waste

    SciTech Connect

    Bronson, M.C.; Dodson, K.E.; Riley, D.C.

    1994-12-31

    The US Department of Energy (DOE) is seeking to reduce the size of the current nuclear weapons complex and consequently minimize operating costs. To meet this DOE objective, the national laboratories have been asked to develop advanced technologies that take uranium and plutonium from retired weapons and prepare it for new weapons, long-term storage, and/or final disposition. Current pyrochemical processes generate residue salts and ceramic wastes that require aqueous processing to remove and recover the actinides. However, the aqueous treatment of these residues generates an estimated 100 l of acidic transuranic (TRU) waste per kilogram of plutonium in the residue. Lawrence Livermore National Laboratory (LLNL) is developing pyrochemical techniques to eliminate, minimize, or more efficiently treat these residue streams. This paper presents technologies being developed at LLNL on advanced materials for actinide containment, reactors that minimize residues, and pyrochemical processes that remove actinides from waste salts.

  20. Minimally invasive surgical techniques in periodontal regeneration.

    PubMed

    Cortellini, Pierpaolo

    2012-09-01

    A review of the current scientific literature was undertaken to evaluate the efficacy of minimally invasive periodontal regenerative surgery in the treatment of periodontal defects. The impact on clinical outcomes, surgical chair-time, side effects and patient morbidity were evaluated. An electronic search of the PUBMED database from January 1987 to December 2011 was undertaken on dental journals using the keyword "minimally invasive surgery". Cohort studies, retrospective studies and randomized controlled clinical trials referring to treatment of periodontal defects with at least 6 months of follow-up were selected. Quality assessment of the selected studies was done through the Strength of Recommendation Taxonomy Grading (SORT) System. Ten studies (1 retrospective, 5 cohorts and 4 RCTs) were included. All the studies consistently support the efficacy of minimally invasive surgery in the treatment of periodontal defects in terms of clinical attachment level gain, probing pocket depth reduction and minimal gingival recession. Six studies reporting on side effects and patient morbidity consistently indicate very low levels of pain and discomfort during and after surgery resulting in a reduced intake of pain-killers and very limited interference with daily activities in the post-operative period. Minimally invasive surgery might be considered a true reality in the field of periodontal regeneration. The observed clinical improvements are consistently associated with very limited morbidity to the patient during the surgical procedure as well as in the post-operative period. Minimally invasive surgery, however, cannot be applied in all cases. A stepwise decisional algorithm should support clinicians in choosing the treatment approach.

  1. Minimizing radiation damage in nonlinear optical crystals

    DOEpatents

    Cooke, D.W.; Bennett, B.L.; Cockroft, N.J.

    1998-09-08

    Methods are disclosed for minimizing laser induced damage to nonlinear crystals, such as KTP crystals, involving various means for electrically grounding the crystals in order to diffuse electrical discharges within the crystals caused by the incident laser beam. In certain embodiments, electrically conductive material is deposited onto or into surfaces of the nonlinear crystals and the electrically conductive surfaces are connected to an electrical ground. To minimize electrical discharges on crystal surfaces that are not covered by the grounded electrically conductive material, a vacuum may be created around the nonlinear crystal. 5 figs.

  2. Minimizing radiation damage in nonlinear optical crystals

    DOEpatents

    Cooke, D. Wayne; Bennett, Bryan L.; Cockroft, Nigel J.

    1998-01-01

    Methods are disclosed for minimizing laser induced damage to nonlinear crystals, such as KTP crystals, involving various means for electrically grounding the crystals in order to diffuse electrical discharges within the crystals caused by the incident laser beam. In certain embodiments, electrically conductive material is deposited onto or into surfaces of the nonlinear crystals and the electrically conductive surfaces are connected to an electrical ground. To minimize electrical discharges on crystal surfaces that are not covered by the grounded electrically conductive material, a vacuum may be created around the nonlinear crystal.

  3. Reversible Rings with Involutions and Some Minimalities

    PubMed Central

    Fakieh, W. M.; Nauman, S. K.

    2013-01-01

    In continuation of the recent developments on extended reversibilities on rings, we initiate here a study on reversible rings with involutions, or, in short, ∗-reversible rings. These rings are symmetric, reversible, reflexive, and semicommutative. In this note we will study some properties and examples of ∗-reversible rings. It is proved here that the polynomial rings of ∗-reversible rings may not be ∗-reversible. A criterion for rings which cannot adhere to any involution is developed and it is observed that a minimal noninvolutary ring is of order 4 and that a minimal noncommutative ∗-reversible ring is of order 16. PMID:24489510

  4. Linearized non-minimal higher curvature supergravity

    NASA Astrophysics Data System (ADS)

    Farakos, Fotis; Kehagias, Alex; Koutrolikos, Konstantinos

    2015-05-01

    In the framework of linearized non-minimal supergravity (20/20), we present the embedding of the R + R^2 model and we analyze its field spectrum. As usual, the auxiliary fields of the Einstein theory now become propagating, giving rise to additional degrees of freedom, which organize themselves into on-shell irreducible supermultiplets. By performing the analysis both in component and superspace formulations we identify the new supermultiplets. On top of the two massive chiral superfields reminiscent of the old-minimal supergravity embedding, the spectrum contains also a consistent physical, massive, vector supermultiplet and a tachyonic ghost, massive, vector supermultiplet.

  5. Pattern Search Methods for Linearly Constrained Minimization

    NASA Technical Reports Server (NTRS)

    Lewis, Robert Michael; Torczon, Virginia

    1998-01-01

    We extend pattern search methods to linearly constrained minimization. We develop a general class of feasible point pattern search algorithms and prove global convergence to a Karush-Kuhn-Tucker point. As in the case of unconstrained minimization, pattern search methods for linearly constrained problems accomplish this without explicit recourse to the gradient or the directional derivative. Key to the analysis of the algorithms is the way in which the local search patterns conform to the geometry of the boundary of the feasible region.
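
    A minimal compass-search sketch of the feasible-point idea, with an invented quadratic objective and the single linear constraint x >= 0; the coordinate-axis pattern conforms to the boundary of this feasible region, and no gradient or directional derivative is used:

      import numpy as np

      def f(x):                                  # invented objective
          return (x[0] - 1.0) ** 2 + (x[1] + 0.5) ** 2

      feasible = lambda x: np.all(x >= 0.0)      # linear constraints x >= 0

      x, step = np.array([2.0, 2.0]), 1.0
      pattern = [np.array([1.0, 0.0]), np.array([-1.0, 0.0]),
                 np.array([0.0, 1.0]), np.array([0.0, -1.0])]
      while step > 1e-8:
          for d in pattern:
              trial = x + step * d
              if feasible(trial) and f(trial) < f(x):   # feasible descent only
                  x = trial
                  break
          else:
              step *= 0.5    # no improving feasible pattern point: refine the mesh
      print(x, f(x))         # converges to the constrained minimizer (1, 0)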

  6. Minimally invasive treatments for venous compression syndromes

    PubMed Central

    Hulsberg, Paul C.; McLoney, Eric; Partovi, Sasan; Davidson, Jon C.

    2016-01-01

    The management of venous compression syndromes has historically been reliant on surgical treatment when conservative measures fail. There are, however, several settings in which endovascular therapy can play a significant role as an adjunct or even a replacement to more invasive surgical methods. We explore the role of minimally invasive treatment options for three of the most well-studied venous compression syndromes. The clinical aspects and pathophysiology of Paget-Schroetter syndrome (PSS), nutcracker syndrome, and May-Thurner syndrome are discussed in detail, with particular emphasis on the role that interventionalists can play in minimally invasive treatment. PMID:28123978

  7. Minimal String Theory and the Douglas Equation

    NASA Astrophysics Data System (ADS)

    Belavin, A. A.; Belavin, V. A.

    We use the connection between the Frobenius manifold and the Douglas string equation to further investigate Minimal Liouville gravity. We search for a solution of the Douglas string equation and simultaneously a proper transformation from the KdV to the Liouville frame which ensures the fulfilment of the conformal and fusion selection rules. We find that the desired solution of the string equation has an explicit and simple form in the flat coordinates on the Frobenius manifold in the general case of (p,q) Minimal Liouville gravity.

  8. MINIMAL IMMERSIONS OF SPHERES INTO SPHERES

    PubMed Central

    Do Carmo, Manfredo P.; Wallach, Nolan R.

    1969-01-01

    In this paper we announce a qualitative description of an important class of closed n-dimensional submanifolds of the m-dimensional sphere, namely, those which locally minimize the n-area in the same way that geodesics minimize the arc length and are themselves locally n-spheres of constant radius r; those r that may appear are called admissible. It is known that for n = 2 each admissible r determines a unique element of the above class. The main result here is that for each n ≥ 3 and each admissible r ≥ [unk]8 there exists a continuum of distinct such submanifolds. PMID:16591771

  9. The Parisi Formula has a Unique Minimizer

    NASA Astrophysics Data System (ADS)

    Auffinger, Antonio; Chen, Wei-Kuo

    2015-05-01

    In 1979, Parisi (Phys Rev Lett 43:1754-1756, 1979) predicted a variational formula for the thermodynamic limit of the free energy in the Sherrington-Kirkpatrick model, and described the role played by its minimizer. This formula was verified in the seminal work of Talagrand (Ann Math 163(1):221-263, 2006) and later generalized to the mixed p-spin models by Panchenko (Ann Probab 42(3):946-958, 2014). In this paper, we prove that the minimizer in Parisi's formula is unique at any temperature and external field by establishing the strict convexity of the Parisi functional.

  10. Minimally Invasive Treatment of Spine Trauma.

    PubMed

    McGowan, Jason E; Ricks, Christian B; Kanter, Adam S

    2017-01-01

    The role for minimally invasive surgery (MIS) continues to expand in the management of spinal pathology. In the setting of trauma, operative techniques that can minimize morbidity without compromising clinical efficacy have significant value. MIS techniques are associated with decreased intraoperative blood loss, operative time, and morbidity, while providing patients with comparable outcomes when compared with conventional open procedures. MIS interventions further enable earlier mobilization, decreased hospital stay, decreased pain, and an earlier return to baseline function when compared with traditional techniques. This article reviews patient selection and select MIS techniques for those who have suffered traumatic spinal injury.

  11. Instabilities and Solitons in Minimal Strips

    NASA Astrophysics Data System (ADS)

    Machon, Thomas; Alexander, Gareth P.; Goldstein, Raymond E.; Pesci, Adriana I.

    2016-07-01

    We show that highly twisted minimal strips can undergo a nonsingular transition, unlike the singular transitions seen in the Möbius strip and the catenoid. If the strip is nonorientable, this transition is topologically frustrated, and the resulting surface contains a helicoidal defect. Through a controlled analytic approximation, the system can be mapped onto a scalar ϕ4 theory on a nonorientable line bundle over the circle, where the defect becomes a topologically protected kink soliton or domain wall, thus establishing their existence in minimal surfaces. Demonstrations with soap films confirm these results and show how the position of the defect can be controlled through boundary deformation.

  12. Minimally invasive treatments for perforator vein insufficiency.

    PubMed

    Kuyumcu, Gokhan; Salazar, Gloria Maria; Prabhakar, Anand M; Ganguli, Suvranu

    2016-12-01

    Incompetent superficial veins are the most common cause of lower extremity superficial venous reflux and varicose veins; however, incompetent or insufficient perforator veins are the most common cause of recurrent varicose veins after treatment, and often go unrecognized. Perforator vein insufficiency can result in pain, skin changes, and skin ulcers, and often merits intervention. Minimally invasive treatments have replaced traditional surgical treatments for incompetent perforator veins. Current minimally invasive treatment options include ultrasound guided sclerotherapy (USGS) and endovascular thermal ablation (EVTA) with either laser or radiofrequency energy sources. Knowledge of these treatments, and of the advantages and disadvantages of each modality, is required to adequately address perforator venous disease.

  13. Design of batch minimal bromate oscillator

    NASA Astrophysics Data System (ADS)

    Li, Jun; Wang, Jichang

    2011-05-01

    A new type of minimal bromate oscillator that could exhibit spontaneous oscillations in a closed system was developed in this research. The newly developed oscillator contains the reagent benzoquinone, which does not react with the metal catalyst ferroin/ferriin but modulates the evolution of bromide ions. Since the only role of the organic substrate is bromine removal, we define this system as a batch minimal bromate oscillator. The bromination of benzoquinone was confirmed with NMR and mass spectrometry measurements. Experiments showed that transient oscillations emerged in a closed reactor when the benzoquinone concentration was above a threshold level.

  14. Instabilities and Solitons in Minimal Strips.

    PubMed

    Machon, Thomas; Alexander, Gareth P; Goldstein, Raymond E; Pesci, Adriana I

    2016-07-01

    We show that highly twisted minimal strips can undergo a nonsingular transition, unlike the singular transitions seen in the Möbius strip and the catenoid. If the strip is nonorientable, this transition is topologically frustrated, and the resulting surface contains a helicoidal defect. Through a controlled analytic approximation, the system can be mapped onto a scalar ϕ^{4} theory on a nonorientable line bundle over the circle, where the defect becomes a topologically protected kink soliton or domain wall, thus establishing their existence in minimal surfaces. Demonstrations with soap films confirm these results and show how the position of the defect can be controlled through boundary deformation.

  15. Minimally invasive treatments for perforator vein insufficiency

    PubMed Central

    Salazar, Gloria Maria; Prabhakar, Anand M.; Ganguli, Suvranu

    2016-01-01

    Incompetent superficial veins are the most common cause of lower extremity superficial venous reflux and varicose veins; however, incompetent or insufficient perforator veins are the most common cause of recurrent varicose veins after treatment, and often go unrecognized. Perforator vein insufficiency can result in pain, skin changes, and skin ulcers, and often merits intervention. Minimally invasive treatments have replaced traditional surgical treatments for incompetent perforator veins. Current minimally invasive treatment options include ultrasound guided sclerotherapy (USGS) and endovascular thermal ablation (EVTA) with either laser or radiofrequency energy sources. Knowledge of these treatments, and of the advantages and disadvantages of each modality, is required to adequately address perforator venous disease. PMID:28123979

  16. Minimally invasive plate osteosynthesis: tibia and fibula.

    PubMed

    Beale, Brian S; McCally, Ryan

    2012-09-01

    Fractures of the tibia and fibula are common in dogs and cats and occur most commonly as a result of substantial trauma. Tibial fractures are often amenable to repair using the minimally invasive plate osteosynthesis (MIPO) technique because of the minimal soft tissue covering of the tibia and relative ease of indirect reduction and application of the implant system on the tibia. Treatment of tibial fractures by MIPO has been found to reduce surgical time, reduce the time for fracture healing, and decrease patient morbidity, while at the same time reducing complications compared with traditional open reduction and internal fixation.

  17. Efficiency of Analytical Methodologies in Uncertainty Analysis of Seismic Core Damage Frequency

    NASA Astrophysics Data System (ADS)

    Kawaguchi, Kenji; Uchiyama, Tomoaki; Muramatsu, Ken

    Fault Tree and Event Tree analysis is almost exclusively relied upon in assessments of seismic Core Damage Frequency (CDF). In this approach, the Direct Quantification of Fault trees using Monte Carlo simulation (DQFM) method, also simply called the Monte Carlo (MC) method, and the Binary Decision Diagram (BDD) method were introduced as alternatives to the traditional approximation method, namely the Minimal Cut Set (MCS) method. However, there is still no agreement as to which method should be used in a risk assessment of seismic CDF, especially for uncertainty analysis. The purpose of this study is to examine the efficiencies of the three methods in uncertainty analysis as well as in point estimation, so that the decision to select a proper method can be made effectively. The results show that the most efficient method would be the BDD method in terms of accuracy and computational time. However, we discuss that the BDD method is not always applicable to PSA models, whereas the MC method is, at least in theory. In turn, the MC method was confirmed to agree with the exact solution obtained by the BDD method, but it took a large amount of time, in particular for uncertainty analysis. On the other hand, it was shown that the approximation error of the MCS method may not be as bad in uncertainty analysis as it is in point estimation. Based on these results and previous works, this paper proposes a scheme for selecting an appropriate analytical method for a seismic PSA study. Throughout this study, the SECOM2-DQFM code was expanded to utilize the BDD method and to conduct uncertainty analysis with both the MC and BDD methods.
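
    To make the contrast between the MCS approximation and direct Monte Carlo quantification concrete, here is a small sketch on an invented toy fault tree; the structure and basic-event probabilities are illustrative only, and BDD quantification is omitted.

```python
import math
import random

# Toy fault tree: TOP = (A AND B) OR (A AND C) OR D, with independent
# basic events; probabilities are illustrative, not from the paper.
p = {"A": 0.1, "B": 0.2, "C": 0.05, "D": 0.01}
cut_sets = [{"A", "B"}, {"A", "C"}, {"D"}]   # minimal cut sets of TOP

# MCS method (rare-event approximation): sum of cut-set probabilities.
mcs_approx = sum(math.prod(p[e] for e in cs) for cs in cut_sets)

# Direct Monte Carlo quantification: sample basic-event states and count
# how often at least one minimal cut set is fully failed.
def top_fails(state):
    return any(all(state[e] for e in cs) for cs in cut_sets)

random.seed(0)
n = 200_000
hits = sum(top_fails({e: random.random() < q for e, q in p.items()})
           for _ in range(n))

# The MCS sum (0.035) slightly overestimates the exact value (~0.0338)
# because cut sets are not mutually exclusive; MC converges to the exact value.
print(f"MCS approximation: {mcs_approx:.4f}  Monte Carlo: {hits / n:.4f}")
```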

  18. A methodology for adaptive scheduling of radar intervals based on a cost-function methodology

    NASA Astrophysics Data System (ADS)

    Gray, John E.; Smith-Carroll, Amy S.; Zaffram, Christopher

    2004-07-01

    In this note we introduce the idea of adaptive scheduling based on a cost-function methodology. As the warfare environment becomes more complex, individual sensor resources are stretched and sensor usage has grown. In a multi-ship, multi-platform environment, one has the potential to share information across platforms, which would dramatically enrich the strategic and tactical picture available to mission planners and commanders at all force levels. To accomplish this mission, the sensors must all be coordinated so that adaptability and multi-force tasking can be achieved with netted sensors. Adaptive sensor management expands group capabilities by freeing up resources such as dwells and energy. Savings arise from effective use of tracking resources: threats are revisited with radar only when needed. This can be done by introducing analytic cost functions of the revisit time that enable one to minimize revisit time while maintaining error within acceptable bounds.
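
    The record does not reproduce the analytic cost functions, so the sketch below only illustrates the general idea under assumed functional forms: a load term that grows as revisits become more frequent, an error term that grows with the revisit interval, and a hard bound on acceptable error. All constants and exponents are invented.

```python
import numpy as np

# Illustrative cost model (assumed forms, not the authors'):
#   radar load  ~ c_load / T      (more frequent revisits cost more dwells)
#   track error ~ c_err * T**1.5  (error grows between updates)
c_load, c_err, err_bound = 2.0, 0.5, 4.0

T = np.linspace(0.1, 10.0, 1000)           # candidate revisit intervals (s)
cost = c_load / T + c_err * T**1.5

feasible = c_err * T**1.5 <= err_bound     # keep error within acceptable bounds
best = T[feasible][np.argmin(cost[feasible])]
print(f"chosen revisit interval: {best:.2f} s")
```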

  19. 76 FR 71431 - Civil Penalty Calculation Methodology

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-17

    ... TRANSPORTATION Federal Motor Carrier Safety Administration Civil Penalty Calculation Methodology AGENCY: Federal... its civil penalty methodology. Part of this evaluation includes a forthcoming explanation of the... methodology for calculation of certain civil penalties. To induce compliance with federal regulations,...

  20. Environmental probabilistic quantitative assessment methodologies

    NASA Astrophysics Data System (ADS)

    Crovelli, Robert A.

    1995-10-01

    Probabilistic methodologies developed originally for one area of application may be applicable in another area. Therefore, it is extremely important to communicate across disciplines. Of course, a physical reinterpretation is necessary and perhaps some modification of the methodology. This seems to be the situation in applying resource assessment methodologies as environmental assessment methodologies. In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. It is ironic that oil as a precious resource in the ground can become a serious pollutant as a spill in the ocean. There are similarities in both situations where the quantity of undiscovered crude oil and natural gas resources, and the quantity of a pollutant or contaminant are to be estimated. Obviously, we are interested in making a quantitative assessment in order to answer the question, "How much material is there?" For situations in which there are a lack of statistical data, risk analysis is used rather than classical statistical analysis. That is, a relatively subjective evaluation is made rather than an evaluation based on random sampling which may be impossible. Hence, probabilistic quantitative assessment methodologies are needed for the risk analysis. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: (1) direct assessment, (2) accumulation size, (3) volumetric yield, and (4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz., TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. TRIAGG
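
    The record names TRIAGG as the triangular direct-assessment system without describing it; assuming it follows the usual triangular-distribution elicitation, a direct assessment could be aggregated by Monte Carlo as in the sketch below (the source areas and all numbers are invented for illustration).

```python
import numpy as np

rng = np.random.default_rng(42)

# Expert-elicited (minimum, mode, maximum) estimates for three source
# areas of a hypothetical pollutant, in tonnes (illustrative numbers).
areas = [(0.0, 5.0, 20.0), (1.0, 3.0, 10.0), (0.0, 8.0, 30.0)]

# Direct assessment: sample each triangular distribution and aggregate
# the draws to get a probabilistic estimate of the total quantity.
draws = sum(rng.triangular(lo, mode, hi, size=100_000)
            for lo, mode, hi in areas)

p5, p50, p95 = np.percentile(draws, [5, 50, 95])
print(f"total quantity: 5th={p5:.1f}, median={p50:.1f}, 95th={p95:.1f} t")
```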

  1. Minimizing risk in anonymous egg donation.

    PubMed

    Ahuja, K K; Simons, E G; Nair, S; Rimington, M R; Armar, N A

    2003-11-01

    Assisted conception carries with it known and putative medical and surgical risks. Exposing healthy women to these risks in order to harvest eggs for donation when a safer alternative exists is morally and ethically unacceptable. Egg sharing minimizes risk and provides a source of eggs for donation. Anonymity protects all parties involved and should not be removed.

  2. Platform Technologies for Minimally Invasive Physiological Monitoring

    DTIC Science & Technology

    2006-11-01

    Platform Technologies for Minimally Invasive Physiological Monitoring. Mingui Sun, Steven A. Hackworth, Zhide Tang, Jun Zhao, Daliang…

  3. Minimally Invasive Surgery for Inflammatory Bowel Disease

    PubMed Central

    Holder-Murray, Jennifer; Marsicovetere, Priscilla

    2015-01-01

    Abstract: Surgical management of inflammatory bowel disease is a challenging endeavor given infectious and inflammatory complications, such as fistula and abscess, and complex, often postoperative, anatomy, including adhesive disease from previous open operations. Patients with Crohn's disease and ulcerative colitis also bring the burden of their chronic illness, with anemia, malnutrition, and immunosuppression, all common and each contributing independently as a risk factor for increased surgical morbidity in this high-risk population. However, to reduce the physical trauma of surgery, technologic advances and worldwide experience with minimally invasive surgery have allowed laparoscopic management of patients to become the standard of care, with significant short- and long-term patient benefits compared with the open approach. In this review, we describe the current state of the art for minimally invasive surgery for inflammatory bowel disease and the caveats inherent in this practice in this complex patient population. We also review the applicability of current and future trends in minimally invasive surgical technique, such as laparoscopic “incisionless” surgery, single-incision laparoscopic surgery (SILS), robotic-assisted surgery, and other techniques for the patient with inflammatory bowel disease. There can be no doubt that minimally invasive surgery has been proven to decrease the short- and long-term burden of surgery for these chronic illnesses and represents high-value care for both patient and society. PMID:25989341

  4. Fractality in selfsimilar minimal mass structures

    NASA Astrophysics Data System (ADS)

    De Tommasi, D.; Maddalena, F.; Puglisi, G.; Trentadue, F.

    2017-10-01

    In this paper we study the widely observed occurrence of fractality and self-organized criticality in mechanical systems. We show analytically, based on a prototypical compressed tensegrity structure, that these phenomena can be viewed as the result of the simultaneous attainment of mass minimization and global stability in elastic systems.

  5. New Diagnostic Terminology for Minimal Brain Dysfunction.

    ERIC Educational Resources Information Center

    Shaywitz, Bennett A.; And Others

    1979-01-01

    Minimal brain dysfunction has been redefined by the American Psychological Association as attention deficit disorder (ADD) and subdivided into categories with and without hyperactivity. The revised 'Diagnostic and Statistical Manual' (DSM III) is now undergoing field trials. Journal Availability: C. V. Mosby Company, 11830 Westline Industrial…

  6. Minimal Interventions in the Teaching of Mathematics

    ERIC Educational Resources Information Center

    Foster, Colin

    2014-01-01

    This paper addresses ways in which mathematics pedagogy can benefit from insights gleaned from counselling. Person-centred counselling stresses the value of genuineness, warm empathetic listening and minimal intervention to support people in solving their own problems and developing increased autonomy. Such an approach contrasts starkly with the…

  7. Using silviculture to minimize gypsy moth impacts

    Treesearch

    Kurt W. Gottschalk

    1989-01-01

    Silvicultural treatments can be used to minimize gypsy moth impacts on hardwood stands. There are two major strategies of these treatments: (1) to decrease susceptibility to defoliation by gypsy moth and (2) to strengthen the stand against mortality and encourage growth after defoliation.

  8. Using silviculture to minimize gypsy moth impacts

    Treesearch

    Kurt W. Gottschalk

    1991-01-01

    Several studies are underway to test and evaluate the use of silvicultural treatments to minimize gypsy moth impacts. Treatment objectives are to change stand susceptibility to gypsy moth defoliation or stand vulnerability to damage after defoliation. Decision charts have been developed to help forest and land managers to select the appropriate treatment for their...

  9. Minimally invasive excision of thoracic arachnoid web.

    PubMed

    Vergara, Pierluigi; Barone, Damiano Giuseppe

    2017-09-23

    Arachnoid webs are rare intradural lesions which can cause direct spinal cord compression and/or alteration of the CSF flow with syringomyelia. Surgery has historically been performed via wide open laminectomies. The aim of this study is to prove the feasibility of minimally invasive techniques for the excision of arachnoid webs. A retrospective review of two cases of minimally invasive excision of thoracic arachnoid webs was performed. Surgery was carried out through expandable tubular retractors. Complete excision was achieved through the described approach, with minimal bony removal and soft tissue disruption. There were no intra- or perioperative complications. Both patients were mobilised early and discharged home within 24 hours post-surgery. Postoperative imaging showed good re-expansion of the spinal cord, with no evidence of residual compression or tethering. For symptomatic arachnoid webs, surgery remains the only definitive treatment. In expert hands, the excision of arachnoid webs can be successfully achieved with tubular retractors and minimally invasive techniques. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Minimization of Salmonella Contamination on Raw Poultry

    USDA-ARS?s Scientific Manuscript database

    Many reviews have discussed Salmonella in poultry and suggested best practices to minimize this organism on raw poultry meat. Despite years of research and conscientious control efforts by industry and regulatory agencies, human salmonellosis rates have declined only modestly and Salmonella is stil...

  11. Data clustering and visualization via energy minimization

    NASA Astrophysics Data System (ADS)

    Andrecut, M.

    2011-09-01

    We discuss a stochastic method for configurational energy minimization, with applications to high-dimensional data clustering and visualization. Also, we demonstrate numerically the ability of the method to capture meaningful biological information from cancer-related microarray data, and to differentiate between different leukemia cancer subtypes.
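
    The abstract does not specify the energy function or the stochastic scheme. One natural reading, sketched below under that assumption, treats the within-cluster sum of squared distances as a configurational energy and minimizes it by simulated annealing over label assignments; the data and annealing schedule are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(X, labels, k):
    """Configurational energy: within-cluster sum of squared distances."""
    return sum(((X[labels == c] - X[labels == c].mean(axis=0))**2).sum()
               for c in range(k) if (labels == c).any())

def anneal_cluster(X, k, steps=20_000, t0=1.0, cooling=0.9995):
    labels = rng.integers(k, size=len(X))
    e, t = energy(X, labels, k), t0
    for _ in range(steps):
        i, new = rng.integers(len(X)), rng.integers(k)   # random relabel move
        old = labels[i]
        labels[i] = new
        e_new = energy(X, labels, k)
        # Metropolis rule: always accept downhill, sometimes accept uphill.
        if e_new <= e or rng.random() < np.exp((e - e_new) / t):
            e = e_new
        else:
            labels[i] = old
        t *= cooling                                     # cool the temperature
    return labels

# Toy data: two well-separated 2-D blobs should end up in separate clusters.
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
print(anneal_cluster(X, k=2)[:10])
```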

  12. Minimizing emissions from existing ESP equipped MWCs

    SciTech Connect

    Rigo, H.G.; Chandler, A.J.

    1996-09-01

    A number of municipal waste combustors (MWCs) built before the mid-1980s, when local permitting processes resulted in the application of acid gas control technology, were equipped with electrostatic precipitators to minimize particulate emissions and comply with the then applicable New Source Performance Standards for Municipal Waste Incinerators (40 CFR 60, Subpart C). Polychlorinated dibenzo-p-dioxin and dibenzo-furan (PCDD/F) emissions can be minimized from these facilities by improving combustion to minimize furnace carryover and flame formation. Unfortunately, this can do nothing about the gas and particle phase formation of PCDD/F when the products of combustion are held in the dioxin formation window--250 to 400 C or 482 to 752 F. A proof-of-concept demonstration test program was conducted to demonstrate that the flue gas temperature entering existing ESPs can be practically reduced below 175 C (350 F) to minimize PCDD/F formation in the air pollution control system (APCS). At the same time, the performance of powdered activated carbon (PAC) and of dry acid gas control reagent injection was demonstrated to establish the practicability of bringing existing ESP-equipped MWCs into compliance with the USEPA's Emissions Guidelines without scrapping the sunk investment in functional, high efficiency ESPs.

  13. Minimally inconsistent reasoning in Semantic Web.

    PubMed

    Zhang, Xiaowang

    2017-01-01

    Reasoning with inconsistencies is an important issue for the Semantic Web, as imperfect information is unavoidable in real applications. For this, different paraconsistent approaches, due to their capacity to draw nontrivial conclusions while tolerating inconsistencies, have been proposed to reason with inconsistent description logic knowledge bases. However, existing paraconsistent approaches are often criticized for being too skeptical. To this end, this paper presents a non-monotonic paraconsistent version of description logic reasoning, called minimally inconsistent reasoning, where the inconsistencies tolerated in the reasoning are minimized so that more reasonable conclusions can be inferred. Some desirable properties are studied, showing that the new semantics inherits advantages of both non-monotonic reasoning and paraconsistent reasoning. A complete and sound tableau-based algorithm, called multi-valued tableaux, is developed to capture the minimally inconsistent reasoning. In fact, the tableaux algorithm is designed, as a framework for multi-valued DL, to allow for different underlying paraconsistent semantics, with the mere difference in the clash conditions. Finally, the complexity of minimally inconsistent description logic reasoning is shown to be on the same level as (classical) description logic reasoning.

  14. Minimal Guidelines for Authors of Web Pages.

    ERIC Educational Resources Information Center

    ADE Bulletin, 2002

    2002-01-01

    Presents guidelines that recommend the minimal reference information that should be provided on Web pages intended for use by students, teachers, and scholars in the modern languages. Suggests the inclusion of information about responsible parties, copyright declaration, privacy statements, and site information. Makes a note on Web page style. (SG)

  15. DUPONT CHAMBERS WORKS WASTE MINIMIZATION PROJECT

    EPA Science Inventory

    In a joint U.S. Environmental Protection Agency (EPA) and DuPont waste minimization project, fifteen waste streams were-selected for assessment. The intent was to develop assessments diverse in terms of process type, mode of operation, waste type, disposal needed, and relative s...

  16. Minimal theory of quasidilaton massive gravity

    NASA Astrophysics Data System (ADS)

    De Felice, Antonio; Mukohyama, Shinji; Oliosi, Michele

    2017-07-01

    We introduce a quasidilaton scalar field to the minimal theory of massive gravity with the Minkowski fiducial metric, in such a way that the quasidilaton global symmetry is maintained and that the theory admits a stable self-accelerating de Sitter solution. We start with a precursor theory that contains three propagating gravitational degrees of freedom without a quasidilaton scalar and introduce Stückelberg fields to covariantize its action. This makes it possible for us to formulate the quasidilaton global symmetry that mixes the Stückelberg fields and the quasidilaton scalar field. By the Hamiltonian analysis we confirm that the precursor theory with the quasidilaton scalar contains 4 degrees of freedom, three from the precursor massive gravity and one from the quasidilaton scalar. We further remove one propagating degree of freedom to construct the minimal quasidilaton theory with three propagating degrees of freedom, corresponding to two polarizations of gravitational waves from the minimal theory of massive gravity and one scalar from the quasidilaton field, by carefully introducing two additional constraints to the system in the Hamiltonian language. Switching to the Lagrangian language, we find self-accelerating de Sitter solutions in the minimal quasidilaton theory and analyze their stability. It is found that the self-accelerating de Sitter solution is stable in a wide range of parameters.

  17. Minimal Mimicry: Mere Effector Matching Induces Preference

    ERIC Educational Resources Information Center

    Sparenberg, Peggy; Topolinski, Sascha; Springer, Anne; Prinz, Wolfgang

    2012-01-01

    Both mimicking and being mimicked induces preference for a target. The present experiments investigate the minimal sufficient conditions for this mimicry-preference link to occur. We argue that mere effector matching between one's own and the other person's movement is sufficient to induce preference, independent of which movement is actually…

  18. Kundt spacetimes minimally coupled to scalar field

    NASA Astrophysics Data System (ADS)

    Tahamtan, T.; Svítek, O.

    2017-06-01

    We derive an exact solution belonging to the Kundt class of spacetimes both with and without a cosmological constant that are minimally coupled to a free massless scalar field. We show the algebraic type of these solutions and give interpretation of the results. Subsequently, we look for solutions additionally containing an electromagnetic field satisfying nonlinear field equations.

  19. New Diagnostic Terminology for Minimal Brain Dysfunction.

    ERIC Educational Resources Information Center

    Shaywitz, Bennett A.; And Others

    1979-01-01

    Minimal brain dysfunction has been redefined by the American Psychological Association as attention deficit disorder (ADD) and subdivided into categories with and without hyperactivity. The revised 'Diagnostic and Statistical Manual' (DSM III) is now undergoing field trials. Journal Availability: C. V. Mosby Company, 11830 Westline Industrial…

  20. The Biochemical Basis of Minimal Brain Dysfunction

    ERIC Educational Resources Information Center

    Shaywitz, Sally E.; And Others

    1978-01-01

    Available from: C. V. Mosby Company, 11830 Westline Industrial Drive, St. Louis, Missouri 63141. The research review examines evidence suggesting a biochemical basis for minimal brain dysfunction (MBD), which includes both a relationship between MBD and metabolic abnormalities and a significant genetic influence on the disorder in children. (IM)

  1. Botulinum toxin to minimize facial scarring.

    PubMed

    Jablonka, Eric M; Sherris, David A; Gassner, Holger G

    2012-10-01

    Botulinum toxin A is an ideal biochemical agent for chemoimmobilization, allowing near-total elimination of muscle pull on the healing facial wound. The goal of chemoimmobilization of facial cutaneous wounds is to eliminate dynamic tension on the healing tissues to improve wound healing and minimize scarring for optimal aesthetic results.

  2. Mixed waste minimization/mixed waste avoidance

    SciTech Connect

    Todisco, L.R.

    1994-12-31

    This presentation describes methods for the minimization and volume reduction of low-level radioactive and mixed wastes. Many methods are presented including: source reduction, better waste monitoring activities, waste segregation, recycling, administrative controls, and optimization of waste-generating processes.

  3. Minimally Invasive Mitral Valve Surgery III

    PubMed Central

    Lehr, Eric J.; Guy, T. Sloane; Smith, Robert L.; Grossi, Eugene A.; Shemin, Richard J.; Rodriguez, Evelio; Ailawadi, Gorav; Agnihotri, Arvind K.; Fayers, Trevor M.; Hargrove, W. Clark; Hummel, Brian W.; Khan, Junaid H.; Malaisrie, S. Chris; Mehall, John R.; Murphy, Douglas A.; Ryan, William H.; Salemi, Arash; Segurola, Romualdo J.; Smith, J. Michael; Wolfe, J. Alan; Weldner, Paul W.; Barnhart, Glenn R.; Goldman, Scott M.; Lewis, Clifton T. P.

    2016-01-01

    Abstract Minimally invasive mitral valve operations are increasingly common in the United States, but robotic-assisted approaches have not been widely adopted for a variety of reasons. This expert opinion reviews the state of the art and defines best practices, training, and techniques for developing a successful robotics program. PMID:27662478

  4. DUPONT CHAMBERS WORKS WASTE MINIMIZATION PROJECT

    EPA Science Inventory

    In a joint U.S. Environmental Protection Agency (EPA) and DuPont waste minimization project, fifteen waste streams were-selected for assessment. The intent was to develop assessments diverse in terms of process type, mode of operation, waste type, disposal needed, and relative s...

  5. Minimally inconsistent reasoning in Semantic Web

    PubMed Central

    Zhang, Xiaowang

    2017-01-01

    Reasoning with inconsistencies is an important issue for the Semantic Web, as imperfect information is unavoidable in real applications. For this, different paraconsistent approaches, due to their capacity to draw nontrivial conclusions while tolerating inconsistencies, have been proposed to reason with inconsistent description logic knowledge bases. However, existing paraconsistent approaches are often criticized for being too skeptical. To this end, this paper presents a non-monotonic paraconsistent version of description logic reasoning, called minimally inconsistent reasoning, where the inconsistencies tolerated in the reasoning are minimized so that more reasonable conclusions can be inferred. Some desirable properties are studied, showing that the new semantics inherits advantages of both non-monotonic reasoning and paraconsistent reasoning. A complete and sound tableau-based algorithm, called multi-valued tableaux, is developed to capture the minimally inconsistent reasoning. In fact, the tableaux algorithm is designed, as a framework for multi-valued DL, to allow for different underlying paraconsistent semantics, with the mere difference in the clash conditions. Finally, the complexity of minimally inconsistent description logic reasoning is shown to be on the same level as (classical) description logic reasoning. PMID:28750030

  6. MULTIOBJECTIVE PARALLEL GENETIC ALGORITHM FOR WASTE MINIMIZATION

    EPA Science Inventory

    In this research we have developed an efficient multiobjective parallel genetic algorithm (MOPGA) for waste minimization problems. This MOPGA integrates PGAPack (Levine, 1996) and NSGA-II (Deb, 2000) with novel modifications. PGAPack is a master-slave parallel implementation of a...

  7. Banach spaces that realize minimal fillings

    SciTech Connect

    Bednov, B. B.; Borodin, P. A. E-mail: pborodin@inbox.ru

    2014-04-30

    It is proved that a real Banach space realizes minimal fillings for all its finite subsets (a shortest network spanning a fixed finite subset always exists and has the minimum possible length) if and only if it is a predual of L_1. The spaces L_1 are characterized in terms of Steiner points (medians). Bibliography: 25 titles.

  8. Banach spaces that realize minimal fillings

    NASA Astrophysics Data System (ADS)

    Bednov, B. B.; Borodin, P. A.

    2014-04-01

    It is proved that a real Banach space realizes minimal fillings for all its finite subsets (a shortest network spanning a fixed finite subset always exists and has the minimum possible length) if and only if it is a predual of L_1. The spaces L_1 are characterized in terms of Steiner points (medians). Bibliography: 25 titles.

  9. MULTIOBJECTIVE PARALLEL GENETIC ALGORITHM FOR WASTE MINIMIZATION

    EPA Science Inventory

    In this research we have developed an efficient multiobjective parallel genetic algorithm (MOPGA) for waste minimization problems. This MOPGA integrates PGAPack (Levine, 1996) and NSGA-II (Deb, 2000) with novel modifications. PGAPack is a master-slave parallel implementation of a...

  10. Minimal Mimicry: Mere Effector Matching Induces Preference

    ERIC Educational Resources Information Center

    Sparenberg, Peggy; Topolinski, Sascha; Springer, Anne; Prinz, Wolfgang

    2012-01-01

    Both mimicking and being mimicked induces preference for a target. The present experiments investigate the minimal sufficient conditions for this mimicry-preference link to occur. We argue that mere effector matching between one's own and the other person's movement is sufficient to induce preference, independent of which movement is actually…

  11. Minimal Brain Dysfunction: Associations with Perinatal Complications.

    ERIC Educational Resources Information Center

    Nichols, Paul L.

    The relationship between perinatal complications and such characteristics as poor school achievement, hyperactivity, and neurological soft signs associated with the diagnosis of minimal brain dysfunction (MBD) was examined in over 28,000 7-year-old children whose mothers registered for prenatal care. Ten perinatal antecedents were studied:…

  12. Challenging the minimal supersymmetric SU(5) model

    SciTech Connect

    Bajc, Borut; Lavignac, Stéphane; Mede, Timon

    2014-06-24

    We review the main constraints on the parameter space of the minimal renormalizable supersymmetric SU(5) grand unified theory. They consist of the Higgs mass, proton decay, electroweak symmetry breaking and fermion masses. Superpartner masses are constrained both from below and from above, giving hope for confirming or definitely ruling out the theory in the future. This contribution is based on Ref. [1].

  13. Q methodology in health economics.

    PubMed

    Baker, Rachel; Thompson, Carl; Mannion, Russell

    2006-01-01

    The recognition that health economists need to understand the meaning of data if they are to adequately understand research findings which challenge conventional economic theory has led to the growth of qualitative modes of enquiry in health economics. The use of qualitative methods of exploration and description alongside quantitative techniques gives rise to a number of epistemological, ontological and methodological challenges: difficulties in accounting for subjectivity in choices, the need for rigour and transparency in method, and problems of disciplinary acceptability to health economists. Q methodology is introduced as a means of overcoming some of these challenges. We argue that Q offers a means of exploring subjectivity, beliefs and values while retaining the transparency, rigour and mathematical underpinnings of quantitative techniques. The various stages of Q methodological enquiry are outlined alongside potential areas of application in health economics, before discussing the strengths and limitations of the approach. We conclude that Q methodology is a useful addition to economists' methodological armoury and one that merits further consideration and evaluation in the study of health services.

  14. Methodology of metal criticality determination.

    PubMed

    Graedel, T E; Barr, Rachel; Chandler, Chelsea; Chase, Thomas; Choi, Joanne; Christoffersen, Lee; Friedlander, Elizabeth; Henly, Claire; Jun, Christine; Nassar, Nedal T; Schechner, Daniel; Warren, Simon; Yang, Man-Yu; Zhu, Charles

    2012-01-17

    A comprehensive methodology has been created to quantify the degree of criticality of the metals of the periodic table. In this paper, we present and discuss the methodology, which comprises three dimensions: supply risk, environmental implications, and vulnerability to supply restriction. Supply risk differs with the time scale (medium or long), and at its most complex involves several components, themselves composed of a number of distinct indicators drawn from readily available peer-reviewed indexes and public information. Vulnerability to supply restriction differs with the organizational level (i.e., global, national, and corporate). The criticality methodology, an enhancement of a United States National Research Council template, is designed to help corporate, national, and global stakeholders conduct risk evaluation and to inform resource utilization and strategic decision-making. Although we believe our methodological choices lead to the most robust results, the framework has been constructed to permit flexibility by the user. Specific indicators can be deleted or added as desired and weighted as the user deems appropriate. The value of each indicator will evolve over time, and our future research will focus on this evolution. The methodology has proven to be sufficiently robust as to make it applicable across the entire spectrum of metals and organizational levels and provides a structural approach that reflects the multifaceted factors influencing the availability of metals in the 21st century.
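
    As a toy illustration of the user-adjustable weighting the abstract describes, the sketch below scores one criticality dimension from weighted indicators; the indicator names, values, and weights are invented, not taken from the paper.

```python
# Hypothetical indicators on a common 0-100 scale with user-chosen
# weights; names, values, and weights are invented for illustration.
supply_risk = {"depletion_time": 60, "policy_stability": 45, "substitutability": 70}
weights     = {"depletion_time": 0.5, "policy_stability": 0.3, "substitutability": 0.2}

def dimension_score(indicators, weights):
    """Weighted mean over whichever indicators the user keeps; weights
    renormalize automatically when an indicator is deleted or added."""
    total = sum(weights[k] for k in indicators)
    return sum(v * weights[k] for k, v in indicators.items()) / total

print(f"supply risk: {dimension_score(supply_risk, weights):.1f}")

# The flexibility the methodology allows: drop an indicator and rescore.
del supply_risk["substitutability"]
print(f"supply risk without substitutability: {dimension_score(supply_risk, weights):.1f}")
```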

  15. Spatially explicit methodology for coordinated manure management in shared watersheds.

    PubMed

    Sharara, Mahmoud; Sampat, Apoorva; Good, Laura W; Smith, Amanda S; Porter, Pamela; Zavala, Victor M; Larson, Rebecca; Runge, Troy

    2017-05-01

    Increased clustering and consolidation of livestock production systems has been linked to adverse impacts on water quality. This study presents a methodology to optimize manure management within a hydrologic region to minimize agricultural phosphorus (P) loss associated with winter manure application. Spatial and non-spatial data representing livestock, crop, soil, terrain and hydrography were compiled to determine manure P production rates, crop P uptake, existing manure storage capabilities, and transportation distances. Field slope, hydrologic soil group (HSG), and proximity to waterbodies were used to classify crop fields according to their runoff risk for winter-applied manure. We use these data to construct a comprehensive optimization model that identifies optimal location, size, and transportation strategy to achieve environmental and economic goals. The environmental goal was the minimization of daily hauling of manure to environmentally sensitive crop fields, i.e., those classified as high P-loss fields, whereas the economic goal was the minimization of the transportation costs across the entire study area. A case study encompassing two contiguous 10-digit hydrologic unit subwatersheds (HUC-10) in South Central Wisconsin, USA was developed to demonstrate the proposed methodology. Additionally, scenarios representing different management decisions (storage facility maximum volume, and project capital) and production conditions (increased milk production and 20-year future projection) were analyzed to determine their impact on optimal decisions. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
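
    The paper's full model couples storage siting, runoff-risk classes, and transport. As a much smaller illustration, the sketch below solves just a cost-minimizing manure-transportation core as a linear program with scipy, using invented farms, fields, supplies, capacities, and costs.

```python
import numpy as np
from scipy.optimize import linprog

# 2 livestock farms (manure P supply, kg) and 3 receiving fields
# (crop P uptake capacity, kg); all numbers are illustrative.
supply   = np.array([400.0, 250.0])
capacity = np.array([300.0, 200.0, 250.0])
cost     = np.array([[2.0, 5.0, 9.0],      # $/kg hauled, farm i -> field j
                     [4.0, 3.0, 6.0]])

res = linprog(
    c=cost.ravel(),                                      # minimize hauling cost
    A_eq=np.kron(np.eye(2), np.ones(3)), b_eq=supply,    # ship all manure out
    A_ub=np.kron(np.ones(2), np.eye(3)), b_ub=capacity,  # respect field uptake
    bounds=(0, None),
)
print(res.x.reshape(2, 3))   # kg shipped on each farm -> field route
```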

  16. Waste minimization applications at a remediation site

    SciTech Connect

    Allmon, L.A.

    1995-01-23

    The Fernald Environmental Management Project (FEMP), owned by the Department of Energy, was used for the processing of uranium. In 1989 Fernald suspended production of uranium metals and was placed on the National Priorities List (NPL). The site's mission has changed from one of production to environmental restoration. Many groups necessary for producing a product were deemed irrelevant for remediation work, including Waste Minimization. Waste Minimization does not readily appear to be applicable to remediation work. Environmental remediation is designed to correct adverse impacts to the environment from past operations and generates significant amounts of waste requiring management. The premise of pollution prevention is to avoid waste generation; thus remediation is in direct conflict with this premise. Although greater amounts of waste will be generated during environmental remediation, treatment capacities are not always available and disposal is becoming more difficult and costly. This creates the need for pollution prevention and waste minimization. Applying waste minimization principles at a remediation site is an enormous challenge. If the remediation site is also radiologically contaminated, it is an even bigger challenge. Innovative techniques and ideas must be utilized to achieve reductions in the amount of waste that must be managed or dispositioned. At Fernald the waste minimization paradigm was shifted from focusing efforts on source reduction to focusing efforts on recycle/reuse by inverting the EPA waste management hierarchy. A fundamental difference at remediation sites is that source reduction has limited applicability to legacy wastes but can be applied successfully to secondary waste generation. The bulk of measurable waste reduction will be achieved by the recycle/reuse of primary wastes and by segregation and decontamination of secondary wastestreams. Each effort must be measured in terms of being economically and ecologically beneficial.

  17. Waste minimization in environmental sampling and analysis

    SciTech Connect

    Brice, D.A.; Nixon, J. (Fernald Environmental Management Project); Lewis, E.T.

    1992-01-01

    Environmental investigations of the extent and effect of contamination, and projects to remediate such contamination, are designed to mitigate perceived threats to human health and the environment. During the course of these investigations, excavations, borings, and monitoring wells are constructed: monitoring wells are developed and purged prior to sampling; samples are collected; equipment is decontaminated; constituents extracted and analyzed; and personal protective equipment is used to keep workers safe. All of these activities generate waste. A large portion of this waste may be classified as hazardous based on characteristics or constituent components. Waste minimization is defined as reducing the volume and/or toxicity of waste generated by a process. Waste minimization has proven to be an effective means of cost reduction and improving worker health, safety, and environmental awareness in the industrial workplace through pollution prevention. Building waste minimization goals into a project during the planning phase is both cost effective and consistent with total quality management principles. Application of waste minimization principles should be an integral part of the planning and conduct of environmental investigations. Current regulatory guidance on planning environmental investigations focuses on data quality and risk assessment objectives. Waste minimization should also be a scoping priority, along with meeting worker protection requirements, protection of human health and the environment, and achieving data quality objectives. Waste volume or toxicity can be reduced through the use of smaller sample sizes, less toxic extraction solvents, less hazardous decontamination materials, smaller excavations and borings, smaller diameter monitoring wells, dedicated sampling equipment, well-fitting personal protective equipment, judicious use of screening technologies, and analyzing only for parameters of concern.

  18. Waste minimization in environmental sampling and analysis

    SciTech Connect

    Brice, D.A.; Nixon, J.; Lewis, E.T.

    1992-03-01

    Environmental investigations of the extent and effect of contamination, and projects to remediate such contamination, are designed to mitigate perceived threats to human health and the environment. During the course of these investigations, excavations, borings, and monitoring wells are constructed: monitoring wells are developed and purged prior to sampling; samples are collected; equipment is decontaminated; constituents extracted and analyzed; and personal protective equipment is used to keep workers safe. All of these activities generate waste. A large portion of this waste may be classified as hazardous based on characteristics or constituent components. Waste minimization is defined as reducing the volume and/or toxicity of waste generated by a process. Waste minimization has proven to be an effective means of cost reduction and improving worker health, safety, and environmental awareness in the industrial workplace through pollution prevention. Building waste minimization goals into a project during the planning phase is both cost effective and consistent with total quality management principles. Application of waste minimization principles should be an integral part of the planning and conduct of environmental investigations. Current regulatory guidance on planning environmental investigations focuses on data quality and risk assessment objectives. Waste minimization should also be a scoping priority, along with meeting worker protection requirements, protection of human health and the environment, and achieving data quality objectives. Waste volume or toxicity can be reduced through the use of smaller sample sizes, less toxic extraction solvents, less hazardous decontamination materials, smaller excavations and borings, smaller diameter monitoring wells, dedicated sampling equipment, well-fitting personal protective equipment, judicious use of screening technologies, and analyzing only for parameters of concern.

  19. The NLC Software Requirements Methodology

    SciTech Connect

    Shoaee, Hamid

    2002-08-20

    We describe the software requirements and development methodology developed for the NLC control system. Given the longevity of that project, and the likely geographical distribution of the collaborating engineers, the planned requirements management process is somewhat more formal than the norm in high energy physics projects. The short term goals of the requirements process are to accurately estimate costs, to decompose the problem, and to determine likely technologies. The long term goal is to enable a smooth transition from high level functional requirements to specific subsystem and component requirements for individual programmers, and to support distributed development. The methodology covers both ends of that life cycle. It covers both the analytical and documentary tools for software engineering, and project management support. This paper introduces the methodology, which is fully described in [1].

  20. Income Smoothing: Methodology and Models.

    DTIC Science & Technology

    1986-05-01

    that managers desire a pattern of income that has low variability relative to a linear time trend. 2. Industry Trend. Target 2 assumes that firms… Income Smoothing: Methodology and Models, O. Douglas Moses, Naval Postgraduate School, Monterey, California, May 1986. Approved for public release; distribution unlimited.

  1. New methodologies for patients rehabilitation.

    PubMed

    Fardoun, H M; Mashat, A S; Lange, B

    2015-01-01

    The present editorial is part of the focus theme of Methods of Information in Medicine titled "New Methodologies for Patients Rehabilitation", with a specific focus on technologies and human factors related to the use of Information and Communication Technologies (ICT) for improving patient rehabilitation. The focus theme explores different dimensions of empowerment methodologies for disabled people in terms of rehabilitation and health care, and the extent to which ICT is a useful tool in this process. The focus theme lists a set of research papers that present different ways of using ICT to develop advanced systems that help disabled people in their rehabilitation process.

  2. Methodological pluralism and narrative inquiry

    NASA Astrophysics Data System (ADS)

    Michie, Michael

    2013-09-01

    This paper considers how the integral theory model of Nancy Davis and Laurie Callihan might be enacted using a different qualitative methodology, in this case the narrative methodology. The focus of narrative research is shown to be on `what meaning is being made' rather than `what is happening here' (quadrant 2 rather than quadrant 1). It is suggested that in using the integral theory model, a qualitative research project focuses primarily on one quadrant and is enhanced by approaches suggested in the other quadrants.

  3. Minimizing proteome redundancy in the UniProt Knowledgebase

    PubMed Central

    Bursteinas, Borisas; Britto, Ramona; Bely, Benoit; Auchincloss, Andrea; Rivoire, Catherine; Redaschi, Nicole; O'Donovan, Claire; Martin, Maria Jesus

    2016-01-01

    Advances in high-throughput sequencing have led to an unprecedented growth in genome sequences being submitted to biological databases. In particular, the sequencing of large numbers of nearly identical bacterial genomes during infection outbreaks and for other large-scale studies has resulted in a high level of redundancy in nucleotide databases and consequently in the UniProt Knowledgebase (UniProtKB). Redundancy negatively impacts on database searches by causing slower searches, an increase in statistical bias and cumbersome result analysis. The redundancy combined with the large data volume increases the computational costs for most reuses of UniProtKB data. All of this poses challenges for effective discovery in this wealth of data. With the continuing development of sequencing technologies, it is clear that finding ways to minimize redundancy is crucial to maintaining UniProt's essential contribution to data interpretation by our users. We have developed a methodology to identify and remove highly redundant proteomes from UniProtKB. The procedure identifies redundant proteomes by performing pairwise alignments of sets of sequences for pairs of proteomes and subsequently, applies graph theory to find dominating sets that provide a set of non-redundant proteomes with a minimal loss of information. This method was implemented for bacteria in mid-2015, resulting in a removal of 50 million proteins in UniProtKB. With every new release, this procedure is used to filter new incoming proteomes, resulting in a more scalable and scientifically valuable growth of UniProtKB. Database URL: http://www.uniprot.org/proteomes/ PMID:28025334
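
    Under the assumption that the dominating-set step works roughly as in textbook graph theory, the sketch below runs a greedy dominating-set approximation on a toy proteome-similarity graph; the real UniProt pipeline builds this graph from pairwise sequence alignments at a very different scale.

```python
# Toy redundancy graph: an edge means two proteomes are highly similar
# (illustrative structure; the actual pipeline derives it from pairwise
# alignments of protein sequence sets).
graph = {
    "P1": {"P2", "P3"}, "P2": {"P1", "P3"}, "P3": {"P1", "P2"},
    "P4": {"P5"}, "P5": {"P4"}, "P6": set(),
}

def greedy_dominating_set(graph):
    """Greedy approximation: repeatedly keep the proteome that covers
    the most still-uncovered proteomes; everything it covers can then
    be treated as redundant and removed."""
    uncovered, kept = set(graph), []
    while uncovered:
        best = max(graph, key=lambda v: len(({v} | graph[v]) & uncovered))
        kept.append(best)
        uncovered -= {best} | graph[best]
    return kept

print(greedy_dominating_set(graph))   # e.g. ['P1', 'P4', 'P6']
```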

  4. The problems of the minimal surface and minimal lineal measure in three dimensions

    SciTech Connect

    Christensen, R.M.

    1994-02-01

    A solution is given to the classical problem of the minimal surface in three dimensions formed from a repeating cell microstructure under isotropic conditions. The solution is found through a global/local minimization procedure and the resulting basic cell is composed of 14 faces. At the junctions where the intersections between faces meet at a point, half of the junctions involve 4 intersections and half involve 3 intersections. The same general solution also applies to the related minimal lineal measure problem where the measure is that of the length of the intersections connecting the junctions. Some implications and applications for materials science are given.

  5. Advances in minimally invasive neonatal colorectal surgery

    PubMed Central

    Bandi, Ashwath S; Bradshaw, Catherine J; Giuliani, Stefano

    2016-01-01

    Over the last two decades, advances in laparoscopic surgery and minimally invasive techniques have transformed the operative management of neonatal colorectal surgery for conditions such as anorectal malformations (ARMs) and Hirschsprung’s disease. Evolution of surgical care has mainly occurred due to the use of laparoscopy, as opposed to a laparotomy, for intra-abdominal procedures and the development of trans-anal techniques. This review describes these advances and outlines the main minimally invasive techniques currently used for management of ARMs and Hirschsprung’s disease. There does still remain significant variation in the procedures used and this review aims to report the current literature comparing techniques with an emphasis on the short- and long-term clinical outcomes. PMID:27830038

  6. Advances in minimally invasive neonatal colorectal surgery.

    PubMed

    Bandi, Ashwath S; Bradshaw, Catherine J; Giuliani, Stefano

    2016-10-27

    Over the last two decades, advances in laparoscopic surgery and minimally invasive techniques have transformed the operative management of neonatal colorectal surgery for conditions such as anorectal malformations (ARMs) and Hirschsprung's disease. Evolution of surgical care has mainly occurred due to the use of laparoscopy, as opposed to a laparotomy, for intra-abdominal procedures and the development of trans-anal techniques. This review describes these advances and outlines the main minimally invasive techniques currently used for management of ARMs and Hirschsprung's disease. There does still remain significant variation in the procedures used and this review aims to report the current literature comparing techniques with an emphasis on the short- and long-term clinical outcomes.

  7. Mirror glasses for minimally invasive surgery.

    PubMed

    Ishikawa, Norihiko; Sun, You Su; Nifong, L Wiley; Oda, Makoto; Ohta, Yasuhiko; Watanabe, Go; Chitwood, W Randolph

    2007-07-01

    The restricted small thoracotomy incision prevents the operator performing minimally invasive surgery from seeing the whole field with both eyes. To overcome this problem, we developed mirror glasses. Use of these glasses was evaluated in terms of the time required for threading of sutures with endoscopic forceps. Three surgeons each threaded and ligated a suture five times with and without the glasses in the box, and the mean time was calculated for each surgeon. The time required for ligation (mean +/- SD) was 24.2 +/- 2.9 s with the mirror glasses and 27.0 +/- 2.5 s without them (p = 0.01). The mirror glasses may prove useful for fine manipulation in minimally invasive surgery.

  8. [Minimally invasive operations in vascular surgery].

    PubMed

    Stádler, Petr; Sedivý, Petr; Dvorácek, Libor; Slais, Marek; Vitásek, Petr; El Samman, Khaled; Matous, Pavel

    2011-01-01

    Minimally invasive surgery provides an attractive alternative to conventional surgical approaches and is popular with patients, particularly because of its favourable cosmetic results. Vascular surgery has taken its inspiration from general surgery and, over the past few years, has also been reducing the invasiveness of its operating methods. In addition to traditional laparoscopic techniques, we most frequently encounter the endovascular treatment of aneurysms of the thoracic and abdominal aorta and, most recently, robot-assisted surgery in the area of the abdominal aorta and pelvic arteries. Minimally invasive surgical interventions also have other advantages, including less operative trauma, a reduction in post-operative pain, shorter periods spent in the intensive care unit and overall hospitalization times, an earlier return to normal life and, finally, a reduction in total treatment costs.

  9. Minimally Disruptive Medicine for Patients with Diabetes.

    PubMed

    Serrano, Valentina; Spencer-Bonilla, Gabriela; Boehmer, Kasey R; Montori, Victor M

    2017-09-23

    Patients with diabetes must deal with the burden of symptoms and complications (burden of illness). Simultaneously, diabetes care demands practical and emotional work from patients and their families, work to access and use healthcare and to enact self-care (burden of treatment). Patient work must compete with the demands of family, job, and community life. Overwhelmed patients may not have the capacity to access care or enact self-care and will thus experience suboptimal diabetes outcomes. Minimally disruptive medicine (MDM) is a patient-centered approach to healthcare that prioritizes patients' goals for life and health while minimizing the healthcare disruption on patients' lives. In patients with diabetes, particularly in those with complex lives and multimorbidity, MDM coordinates healthcare and community responses to improve outcomes, reduce treatment burden, and enable patients to pursue their life's hopes and dreams.

  10. Topological minimally entangled states via geometric measure

    NASA Astrophysics Data System (ADS)

    Buerschaper, Oliver; García-Saez, Artur; Orús, Román; Wei, Tzu-Chieh

    2014-11-01

    Here we show how the Minimally Entangled States (MES) of a 2d system with topological order can be identified using the geometric measure of entanglement. We show this by minimizing this measure for the doubled semion, doubled Fibonacci and toric code models on a torus with non-trivial topological partitions. Our calculations are done either quasi-exactly for small system sizes, or using the tensor network approach in Orús et al (arXiv:1406.0585) for large sizes. As a byproduct of our methods, we see that the minimisation of the geometric entanglement can also determine the number of Abelian quasiparticle excitations in a given model. The results in this paper provide a very efficient and accurate way of extracting the full topological information of a 2d quantum lattice model from the multipartite entanglement structure of its ground states.

  11. Minimal Left-Right Symmetric Dark Matter.

    PubMed

    Heeck, Julian; Patra, Sudhanwa

    2015-09-18

    We show that left-right symmetric models can easily accommodate stable TeV-scale dark matter particles without the need for an ad hoc stabilizing symmetry. The stability of a newly introduced multiplet either arises accidentally as in the minimal dark matter framework or comes courtesy of the remaining unbroken Z_{2} subgroup of B-L. Only one new parameter is introduced: the mass of the new multiplet. As minimal examples, we study left-right fermion triplets and quintuplets and show that they can form viable two-component dark matter. This approach is, in particular, valid for SU(2)×SU(2)×U(1) models that explain the recent diboson excess at ATLAS in terms of a new charged gauge boson of mass 2 TeV.

  12. Activity recognition from minimal distinguishing subsequence mining

    NASA Astrophysics Data System (ADS)

    Iqbal, Mohammad; Pao, Hsing-Kuo

    2017-08-01

    Human activity recognition is one of the most important research topics in the era of the Internet of Things. To separate different activities given sensory data, we utilize a Minimal Distinguishing Subsequence (MDS) mining approach to efficiently find distinguishing patterns among different activities. We first transform the sensory data into a series of sensor triggering events and then operate the MDS mining procedure. Gap constraints are also considered in the MDS mining. Given the multi-class nature of most activity recognition tasks, we modify the MDS mining approach from the binary case to a multi-class one to fit the need for multiple activity recognition. We also study how to select the best parameter set, including the minimal and maximal support thresholds, for finding MDSs for effective activity recognition. Overall, the prediction accuracy is 86.59% on the van Kasteren dataset, which consists of four different activities.
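
    As a rough, hypothetical illustration of the mining step (not the authors' implementation, which also handles gap constraints and multi-class output), the Python sketch below grows event patterns level-wise and emits a pattern as soon as it is frequent in the target activity and rare in the others. All function names, thresholds, and the toy sensor data are invented for the example.

        def occurs(sub, seq):
            # True if sub occurs in seq as an ordered (possibly gapped) subsequence.
            it = iter(seq)
            return all(any(e == want for e in it) for want in sub)

        def support(sub, seqs):
            return sum(occurs(sub, s) for s in seqs) / len(seqs)

        def mine_mds(pos, neg, min_sup=0.6, max_sup=0.1, max_len=4):
            # Grow event patterns that stay frequent in pos; emit a pattern as soon
            # as it is rare in neg (distinguishing) and stop extending it, so only
            # growth-minimal patterns are kept.
            alphabet = sorted({e for s in pos for e in s})
            mds, frontier = [], [()]
            for _ in range(max_len):
                nxt = []
                for base in frontier:
                    for ev in alphabet:
                        cand = base + (ev,)
                        if support(cand, pos) < min_sup:
                            continue              # infrequent: prune (anti-monotone)
                        if support(cand, neg) <= max_sup:
                            mds.append(cand)      # distinguishing: emit, stop growing
                        else:
                            nxt.append(cand)      # frequent but not yet distinguishing
                frontier = nxt
            return mds

        # Toy events: 'k' = kettle, 'd' = door, 't' = tap (invented sensor labels).
        make_tea = [list("kdk"), list("kkd"), list("dkk")]
        leave_house = [list("dd"), list("dtd"), list("td")]
        print(mine_mds(make_tea, leave_house))    # -> [('k',), ('d', 'k')]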

  13. The Sense of Commitment: A Minimal Approach

    PubMed Central

    Michael, John; Sebanz, Natalie; Knoblich, Günther

    2016-01-01

    This paper provides a starting point for psychological research on the sense of commitment within the context of joint action. We begin by formulating three desiderata: to illuminate the motivational factors that lead agents to feel and act committed, to pick out the cognitive processes and situational factors that lead agents to sense that implicit commitments are in place, and to illuminate the development of an understanding of commitment in ontogeny. In order to satisfy these three desiderata, we propose a minimal framework, the core of which is an analysis of the minimal structure of situations which can elicit a sense of commitment. We then propose a way of conceptualizing and operationalizing the sense of commitment, and discuss cognitive and motivational processes which may underpin the sense of commitment. PMID:26779080

  14. The method of minimal normal forms

    SciTech Connect

    Mane, S.R.; Weng, W.T.

    1992-01-01

    Normal form methods for solving nonlinear differential equations are reviewed and the comparative merits of three methods are evaluated. The concept of the minimal normal form is explained and is shown to be superior to other choices. The method is then extended to apply to the evaluation of discrete maps of an accelerator or storage ring. Such an extension, as suggested in this paper, is more suited for accelerator-based applications than a formulation utilizing continuous differential equations. A computer code has been generated to systematically implement various normal form formulations for maps in two-dimensional phase space. Specific examples of quadratic and cubic nonlinear fields were used and solved by the method developed. The minimal normal form method shown here gives good results using relatively low order expansions.

  15. The method of minimal normal forms

    SciTech Connect

    Mane, S.R.; Weng, W.T.

    1992-12-31

    Normal form methods for solving nonlinear differential equations are reviewed and the comparative merits of three methods are evaluated. The concept of the minimal normal form is explained and is shown to be superior to other choices. The method is then extended to apply to the evaluation of discrete maps of an accelerator or storage ring. Such an extension, as suggested in this paper, is more suited for accelerator-based applications than a formulation utilizing continuous differential equations. A computer code has been generated to systematically implement various normal form formulations for maps in two-dimensional phase space. Specific examples of quadratic and cubic nonlinear fields were used and solved by the method developed. The minimal normal form method shown here gives good results using relatively low order expansions.

  16. The Minimal Supersymmetric Fat Higgs Model

    SciTech Connect

    Harnik, Roni; Kribs, Graham D.; Larson, Daniel T.; Murayama, Hitoshi

    2003-11-26

    We present a calculable supersymmetric theory of a composite "fat" Higgs boson. Electroweak symmetry is broken dynamically through a new gauge interaction that becomes strong at an intermediate scale. The Higgs mass can easily be 200-450 GeV along with the superpartner masses, solving the supersymmetric little hierarchy problem. We explicitly verify that the model is consistent with precision electroweak data without fine-tuning. Gauge coupling unification can be maintained despite the inherently strong dynamics involved in electroweak symmetry breaking. Supersymmetrizing the Standard Model therefore does not imply a light Higgs mass, contrary to the lore in the literature. The Higgs sector of the minimal Fat Higgs model has a mass spectrum that is distinctly different from the Minimal Supersymmetric Standard Model.

  17. Minimizing broadband excitation under dissipative conditions

    NASA Astrophysics Data System (ADS)

    Gelman, David; Kosloff, Ronnie

    2005-12-01

    Optimal control theory is employed for the task of minimizing the excited-state population of a dye molecule in solution. The spectrum of the excitation pulse is contained completely in the absorption band of the molecule. Only phase control is studied, which is equivalent to optimizing the transmission of the pulse through the medium. The molecular model explicitly includes two electronic states and a single vibrational mode. The other degrees of freedom are classified as bath modes. The surrogate Hamiltonian method is employed to incorporate these bath degrees of freedom. Their influence can be classified as electronic dephasing and vibrational relaxation. In accordance with experimental results, minimal excitation is associated with negatively chirped pulses. Optimal pulses with more complex transient structure are found to be superior to linearly chirped pulses. The difference is enhanced when the fluence is increased. The improvement degrades when dissipative effects become more dominant.

  18. A Minimal Periods Algorithm with Applications

    NASA Astrophysics Data System (ADS)

    Xu, Zhi

    Kosaraju in "Computation of squares in a string" briefly described a linear-time algorithm for computing the minimal squares starting at each position in a word. Using the same construction of suffix trees, we generalize his result and describe in detail how to compute the minimal α power, with a period of length longer than s, starting at each position in a word w for arbitrary exponent α> 1 and integer s ≥ 0. The algorithm runs in O(α|w|)-time for s = 0 and in O(|w|2)-time otherwise. We provide a complete proof of the correctness and computational complexity of the algorithm. The algorithm can be used to detect certain types of pseudo-patterns in words, which was our original goal in studying this generalization.

  19. Facets of the balanced minimal evolution polytope.

    PubMed

    Forcey, Stefan; Keefe, Logan; Sands, William

    2016-08-01

    The balanced minimal evolution (BME) method of creating phylogenetic trees can be formulated as a linear programming problem, minimizing an inner product over the vertices of the BME polytope. In this paper we undertake the project of describing the facets of this polytope. We classify and identify the combinatorial structure and geometry (facet inequalities) of all the facets in dimensions up to five, and classify even more facets in all dimensions. A full set of facet inequalities would allow a full implementation of the simplex method for finding the BME tree, although there are reasons to think this an unreachable goal. However, our results provide the crucial first steps for a more likely-to-be-successful program: finding efficient relaxations of the BME polytope.
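
    To make the "inner product over vertices" concrete: by Pauplin's formula, the BME vertex of a tree has coordinates 2^(1-p_ij), where p_ij is the topological path length between leaves i and j. The sketch below (a toy, not the paper's facet machinery; the helper names and example distances are invented) enumerates the three unrooted four-taxon topologies and picks the one minimizing the inner product with a distance matrix.

        import numpy as np

        def bme_vertex(cherry):
            # Pauplin coordinates x_ij = 2**(1 - p_ij) for an unrooted 4-taxon tree,
            # where p_ij is the number of edges between leaves i and j: cherry pairs
            # have p = 2 (weight 1/2), all mixed pairs have p = 3 (weight 1/4).
            i, j = cherry
            k, l = (t for t in range(4) if t not in cherry)
            x = np.full((4, 4), 0.25)
            x[i, j] = x[j, i] = x[k, l] = x[l, k] = 0.5
            np.fill_diagonal(x, 0.0)
            return x

        def bme_tree(d):
            # Choose the 4-taxon topology minimizing the BME length <d, x>.
            topologies = [(0, 1), (0, 2), (0, 3)]      # cherry containing taxon 0
            scores = {c: 0.5 * np.sum(d * bme_vertex(c)) for c in topologies}
            return min(scores, key=scores.get), scores

        # Distances additive on the tree pairing (0,1) and (2,3):
        d = np.array([[0, 2, 5, 5],
                      [2, 0, 5, 5],
                      [5, 5, 0, 2],
                      [5, 5, 2, 0]], dtype=float)
        print(bme_tree(d))   # cherry (0, 1) wins with BME length 7.0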

  20. [Invasive and minimally invasive hemodynamic monitoring].

    PubMed

    Hansen, Matthias

    2016-10-01

    Advanced hemodynamic monitoring is necessary for the adequate management of high-risk patients or patients with circulatory derangement. Studies demonstrate a benefit of early goal-directed therapy in unstable cardiopulmonary situations. Several options for minimally invasive or invasive hemodynamic monitoring are now available. Minimally invasive measurements such as pulse contour analysis or pulse wave analysis are less accurate under some circumstances, but only an arterial catheter is needed for cardiac output monitoring. Pulmonary artery, transpulmonary thermodilution and lithium dilution technologies have acceptable accuracy in cardiac output measurement, and additional parameters can be obtained to guide therapy of an unstable circulation. The pulmonary artery catheter is the device with the largest rate of complications; used by a trained team and with a correct indication, its use remains justified.

  1. Commercial radioactive waste minimization program development guidance

    SciTech Connect

    Fischer, D.K.

    1991-01-01

    This document is one of two prepared by the EG&G Idaho, Inc., Waste Management Technical Support Program Group, National Low-Level Waste Management Program Unit. One of several Department of Energy responsibilities stated in the Amendments Act of 1985 is to provide technical assistance to compact regions, Host States, and nonmember States (to the extent provided in appropriations acts) in establishing waste minimization program plans. Technical assistance includes, among other things, the development of technical guidelines for volume reduction options. Pursuant to this defined responsibility, the Department of Energy (through EG&G Idaho, Inc.) has prepared this report, which includes guidance on defining a program, State/compact commission participation, and waste minimization program plans.

  2. Commercial radioactive waste minimization program development guidance

    SciTech Connect

    Fischer, D.K.

    1991-01-01

    This document is one of two prepared by the EG&G Idaho, Inc., Waste Management Technical Support Program Group, National Low-Level Waste Management Program Unit. One of several Department of Energy responsibilities stated in the Amendments Act of 1985 is to provide technical assistance to compact regions, Host States, and nonmember States (to the extent provided in appropriations acts) in establishing waste minimization program plans. Technical assistance includes, among other things, the development of technical guidelines for volume reduction options. Pursuant to this defined responsibility, the Department of Energy (through EG&G Idaho, Inc.) has prepared this report, which includes guidance on defining a program, State/compact commission participation, and waste minimization program plans.

  3. Minimally Invasive Surgical Therapies for Atrial Fibrillation

    PubMed Central

    Nakamura, Yoshitsugu; Kiaii, Bob; Chu, Michael W. A.

    2012-01-01

    Atrial fibrillation is the most common sustained arrhythmia and is associated with significant risks of thromboembolism, stroke, congestive heart failure, and death. There have been major advances in the management of atrial fibrillation including pharmacologic therapies, antithrombotic therapies, and ablation techniques. Surgery for atrial fibrillation, including both concomitant and stand-alone interventions, is an effective therapy to restore sinus rhythm. Minimally invasive surgical ablation is an emerging field that aims for the superior results of the traditional Cox-Maze procedure through a less invasive operation with lower morbidity, quicker recovery, and improved patient satisfaction. These novel techniques utilize endoscopic or minithoracotomy approaches with various energy sources to achieve electrical isolation of the pulmonary veins in addition to other ablation lines. We review advancements in minimally invasive techniques for atrial fibrillation surgery, including management of the left atrial appendage. PMID:22666609

  4. Minimal walking technicolor: Setup for collider physics

    SciTech Connect

    Foadi, Roshan; Frandsen, Mads T.; Ryttov, Thomas A.; Sannino, Francesco

    2007-09-01

    Different theoretical and phenomenological aspects of the minimal and nonminimal walking technicolor theories have recently been studied. The goal here is to make the models ready for collider phenomenology. We do this by constructing the low energy effective theory containing scalars, pseudoscalars, vector mesons, and other fields predicted by the minimal walking theory. We construct their self-interactions and interactions with standard model fields. Using the Weinberg sum rules, opportunely modified to take into account the walking behavior of the underlying gauge theory, we find interesting relations for the spin-one spectrum. We derive the electroweak parameters using the newly constructed effective theory and compare the results with the underlying gauge theory. Our analysis is sufficiently general such that the resulting model can be used to represent a generic walking technicolor theory not at odds with precision data.

  5. Towards synthesis of a minimal cell.

    PubMed

    Forster, Anthony C; Church, George M

    2006-01-01

    Construction of a chemical system capable of replication and evolution, fed only by small molecule nutrients, is now conceivable. This could be achieved by stepwise integration of decades of work on the reconstitution of DNA, RNA and protein syntheses from pure components. Such a minimal cell project would initially define the components sufficient for each subsystem, allow detailed kinetic analyses and lead to improved in vitro methods for synthesis of biopolymers, therapeutics and biosensors. Completion would yield a functionally and structurally understood self-replicating biosystem. Safety concerns for synthetic life will be alleviated by extreme dependence on elaborate laboratory reagents and conditions for viability. Our proposed minimal genome is 113 kbp long and contains 151 genes. We detail building blocks already in place and major hurdles to overcome for completion.

  6. Essential genes of a minimal bacterium.

    PubMed

    Glass, John I; Assad-Garcia, Nacyra; Alperovich, Nina; Yooseph, Shibu; Lewis, Matthew R; Maruf, Mahir; Hutchison, Clyde A; Smith, Hamilton O; Venter, J Craig

    2006-01-10

    Mycoplasma genitalium has the smallest genome of any organism that can be grown in pure culture. It has a minimal metabolism and little genomic redundancy. Consequently, its genome is expected to be a close approximation to the minimal set of genes needed to sustain bacterial life. Using global transposon mutagenesis, we isolated and characterized gene disruption mutants for 100 different nonessential protein-coding genes. None of the 43 RNA-coding genes were disrupted. Herein, we identify 382 of the 482 M. genitalium protein-coding genes as essential, plus five sets of disrupted genes that encode proteins with potentially redundant essential functions, such as phosphate transport. Genes encoding proteins of unknown function constitute 28% of the essential protein-coding genes set. Disruption of some genes accelerated M. genitalium growth.

  7. Essential genes of a minimal bacterium

    PubMed Central

    Glass, John I.; Assad-Garcia, Nacyra; Alperovich, Nina; Yooseph, Shibu; Lewis, Matthew R.; Maruf, Mahir; Hutchison, Clyde A.; Smith, Hamilton O.; Venter, J. Craig

    2006-01-01

    Mycoplasma genitalium has the smallest genome of any organism that can be grown in pure culture. It has a minimal metabolism and little genomic redundancy. Consequently, its genome is expected to be a close approximation to the minimal set of genes needed to sustain bacterial life. Using global transposon mutagenesis, we isolated and characterized gene disruption mutants for 100 different nonessential protein-coding genes. None of the 43 RNA-coding genes were disrupted. Herein, we identify 382 of the 482 M. genitalium protein-coding genes as essential, plus five sets of disrupted genes that encode proteins with potentially redundant essential functions, such as phosphate transport. Genes encoding proteins of unknown function constitute 28% of the essential protein-coding genes set. Disruption of some genes accelerated M. genitalium growth. PMID:16407165

  8. Minimal conformal extensions of the Higgs sector

    NASA Astrophysics Data System (ADS)

    Helmboldt, Alexander J.; Humbert, Pascal; Lindner, Manfred; Smirnov, Juri

    2017-07-01

    In this work we find the minimal extension of the Standard Model's Higgs sector which can lead to a light Higgs boson via radiative symmetry breaking and is consistent with the phenomenological requirements for a low-energy realization of a conformal theory. The model which turns out to be stable under renormalization group translations is an extension of the Standard Model by two scalar fields, one of which acquires a finite vacuum expectation value and therefore mixes into the physical Higgs. We find that the minimal model predicts a sizable amount of mixing which makes it testable at a collider. In addition to the physical Higgs, the theory's scalar spectrum contains one light and one heavy boson. The heavy scalar's properties render it a potential dark matter candidate.

  9. Tall sections from non-minimal transformations

    NASA Astrophysics Data System (ADS)

    Morrison, David R.; Park, Daniel S.

    2016-10-01

    In previous work, we have shown that elliptic fibrations with two sections, or Mordell-Weil rank one, can always be mapped birationally to a Weierstrass model of a certain form, namely, the Jacobian of a P^{112} model. Most constructions of elliptically fibered Calabi-Yau manifolds with two sections have been carried out assuming that the image of this birational map was a "minimal" Weierstrass model. In this paper, we show that for some elliptically fibered Calabi-Yau manifolds with Mordell-Weil rank-one, the Jacobian of the P^{112} model is not minimal. Said another way, starting from a Calabi-Yau Weierstrass model, the total space must be blown up (thereby destroying the "Calabi-Yau" property) in order to embed the model into P^{112} . In particular, we show that the elliptic fibrations studied recently by Klevers and Taylor fall into this class of models.

  10. ALTERNATIVES TO DUPLICATE DIET METHODOLOGY

    EPA Science Inventory

    Duplicate Diet (DD) methodology has been used to collect information about the dietary exposure component in the context of total exposure studies. DD methods have been used to characterize the dietary exposure component in the NHEXAS pilot studies. NERL desired to evaluate it...

  11. Philosophy, Methodology and Action Research

    ERIC Educational Resources Information Center

    Carr, Wilfred

    2006-01-01

    The aim of this paper is to examine the role of methodology in action research. It begins by showing how, as a form of inquiry concerned with the development of practice, action research is nothing other than a modern 20th century manifestation of the pre-modern tradition of practical philosophy. It then draws in Gadamer's powerful vindication of…

  12. Analytical Utility of Campylobacter Methodologies

    USDA-ARS?s Scientific Manuscript database

    The National Advisory Committee on Microbiological Criteria for Foods (NACMCF, or the Committee) was asked to address the analytical utility of Campylobacter methodologies in preparation for an upcoming United States Food Safety and Inspection Service (FSIS) baseline study to enumerate Campylobacter...

  13. Philosophy, Methodology and Action Research

    ERIC Educational Resources Information Center

    Carr, Wilfred

    2006-01-01

    The aim of this paper is to examine the role of methodology in action research. It begins by showing how, as a form of inquiry concerned with the development of practice, action research is nothing other than a modern 20th century manifestation of the pre-modern tradition of practical philosophy. It then draws in Gadamer's powerful vindication of…

  14. TESOL Methodology: Five Annotated Bibliographies.

    ERIC Educational Resources Information Center

    Antoun, Elizabeth; Gebhard, Jerry G.; Gutwein, Geraldine; Kim, Won-Hyeong; Staben, Jennifer; York, Aimee

    The five bibliographies included here were selected from those of a graduate-level class in methodology for teaching English to speakers of other languages (TESOL). They were selected based on the quality of research and writing, interest the topic might have for other English-as-a-second-language teachers, and student permission. They include:…

  15. ESP Methodology for Science Lecturers.

    ERIC Educational Resources Information Center

    Rogers, Angela; Mulyana, Cukup

    A program designed to teach university science lecturers in Indonesia how to design and teach one-semester courses in English for special purposes (ESP) is described. The program provided lecturers with training in language teaching methodology and course design. The piloting of the teacher training course, focusing on physics instruction, is…

  16. INHALATION EXPOSURE-RESPONSE METHODOLOGY

    EPA Science Inventory

    The Inhalation Exposure-Response Analysis Methodology Document is expected to provide guidance on the development of the basic toxicological foundations for deriving reference values for human health effects, focusing on the hazard identification and dose-response aspects of the ...

  17. A methodology for string resolution

    SciTech Connect

    Karonis, N.T.

    1992-11-01

    In this paper we present a methodology, not a tool. We present this methodology with the intent that it be adopted, on a case by case basis, by each of the existing tools in EPICS. In presenting this methodology, we describe each of its two components in detail and conclude with an example depicting how the methodology can be used across a pair of tools. The task of any control system is to provide access to the various components of the machine being controlled, for example, the Advanced Photon Source (APS). By access, we mean the ability to monitor the machine's status (reading) as well as the ability to explicitly change its status (writing). The Experimental Physics and Industrial Control System (EPICS) is a set of tools, designed to act in concert, that allows one to construct a control system. EPICS provides the ability to construct a control system that allows reading and writing access to the machine. It does this through the notion of databases. Each of the components of the APS that is accessed by the control system is represented in EPICS by a set of named database records. Once this abstraction is made, from physical device to named database records, the process of monitoring and changing the state of that device becomes the simple process of reading and writing information from and to its associated named records.

  18. ALTERNATIVES TO DUPLICATE DIET METHODOLOGY

    EPA Science Inventory

    Duplicate Diet (DD) methodology has been used to collect information about the dietary exposure component in the context of total exposure studies. DD methods have been used to characterize the dietary exposure component in the NHEXAS pilot studies. NERL desired to evaluate it...

  19. Unattended Monitoring System Design Methodology

    SciTech Connect

    Drayer, D.D.; DeLand, S.M.; Harmon, C.D.; Matter, J.C.; Martinez, R.L.; Smith, J.D.

    1999-07-08

    A methodology for designing Unattended Monitoring Systems starting at a systems level has been developed at Sandia National Laboratories. This proven methodology provides a template that describes the process for selecting and applying appropriate technologies to meet unattended system requirements, as well as providing a framework for development of both training courses and workshops associated with unattended monitoring. The design and implementation of unattended monitoring systems is generally intended to respond to some form of policy based requirements resulting from international agreements or domestic regulations. Once the monitoring requirements are established, a review of the associated process and its related facilities enables identification of strategic monitoring locations and development of a conceptual system design. The detailed design effort results in the definition of detection components as well as the supporting communications network and data management scheme. The data analysis then enables a coherent display of the knowledge generated during the monitoring effort. The resultant knowledge is then compared to the original system objectives to ensure that the design adequately addresses the fundamental principles stated in the policy agreements. Implementation of this design methodology will ensure that comprehensive unattended monitoring system designs provide appropriate answers to those critical questions imposed by specific agreements or regulations. This paper describes the main features of the methodology and discusses how it can be applied in real world situations.

  20. A Methodological Investigation of Cultivation.

    ERIC Educational Resources Information Center

    Rubin, Alan M.; And Others

    Cultivation theory states that television engenders negative emotions in heavy viewers. Noting that cultivation methodology contains an apparent response bias, a study examined relationships between television exposure and positive restatements of cultivation concepts and tested a more instrumental media uses and effects model. Cultivation was…

  1. Analyzing Media: Metaphors as Methodologies.

    ERIC Educational Resources Information Center

    Meyrowitz, Joshua

    Students have little intuitive insight into the process of thinking and structuring ideas. The image of metaphor for a phenomenon acts as a kind of methodology for the study of the phenomenon by (1) defining the key issues or problems; (2) shaping the type of research questions that are asked; (3) defining the type of data that are searched out;…

  2. INHALATION EXPOSURE-RESPONSE METHODOLOGY

    EPA Science Inventory

    The Inhalation Exposure-Response Analysis Methodology Document is expected to provide guidance on the development of the basic toxicological foundations for deriving reference values for human health effects, focusing on the hazard identification and dose-response aspects of the ...

  3. Cigarette price minimization strategies used by adults.

    PubMed

    Pesko, Michael F; Kruger, Judy; Hyland, Andrew

    2012-09-01

    We used multivariate logistic regressions to analyze data from the 2006 to 2007 Tobacco Use Supplement of the Current Population Survey, a nationally representative sample of adults. We explored use of cigarette price minimization strategies, such as purchasing cartons of cigarettes, purchasing in states with lower after-tax cigarette prices, and purchasing on the Internet. Racial/ethnic minorities and persons with low socioeconomic status used these strategies less frequently at last purchase than did White and high-socioeconomic-status respondents.

  4. Minimal Basis for Gauge Theory Amplitudes

    SciTech Connect

    Bjerrum-Bohr, N. E. J.; Damgaard, Poul H.; Vanhove, Pierre

    2009-10-16

    Identities based on monodromy for integrations in string theory are used to derive relations between different color-ordered tree-level amplitudes in both bosonic and supersymmetric string theory. These relations imply that the color-ordered tree-level n-point gauge theory amplitudes can be expanded in a minimal basis of (n-3)! amplitudes. This result holds for any choice of polarizations of the external states and in any number of dimensions.

  5. Minimally Invasive Diagnosis of Secondary Intracranial Lymphoma

    PubMed Central

    Healy, G. M.; Redmond, C. E.; Stocker, E.; Connaghan, G.; Skehan, S. J.; Killeen, R. P.

    2016-01-01

    Diffuse large B cell lymphomas (DLBCL) are an aggressive group of non-Hodgkin lymphoid malignancies which have diverse presentation and can have high mortality. Central nervous system relapse is rare but has poor survival. We present the diagnosis of primary mandibular DLBCL and a unique minimally invasive diagnosis of secondary intracranial recurrence. This case highlights the manifold radiological contributions to the diagnosis and management of lymphoma. PMID:28018686

  6. Minimally processed vegetable salads: microbial quality evaluation.

    PubMed

    Fröder, Hans; Martins, Cecília Geraldes; De Souza, Katia Leani Oliveira; Landgraf, Mariza; Franco, Bernadette D G M; Destro, Maria Teresa

    2007-05-01

    The increasing demand for fresh fruits and vegetables and for convenience foods is causing an expansion of the market share for minimally processed vegetables. Among the more common pathogenic microorganisms that can be transmitted to humans by these products are Listeria monocytogenes, Escherichia coli O157:H7, and Salmonella. The aim of this study was to evaluate the microbial quality of a selection of minimally processed vegetables. A total of 181 samples of minimally processed leafy salads were collected from retailers in the city of Sao Paulo, Brazil. Counts of total coliforms, fecal coliforms, Enterobacteriaceae, psychrotrophic microorganisms, and Salmonella were conducted for 133 samples. L. monocytogenes was assessed in 181 samples using the BAX System and by plating the enrichment broth onto Palcam and Oxford agars. Suspected Listeria colonies were submitted to classical biochemical tests. Populations of psychrotrophic microorganisms >10(6) CFU/g were found in 51% of the 133 samples, and Enterobacteriaceae populations between 10(5) and 10(6) CFU/g were found in 42% of the samples. Fecal coliform concentrations higher than 10(2) CFU/g (Brazilian standard) were found in 97 (73%) of the samples, and Salmonella was detected in 4 (3%) of the samples. Two of the Salmonella-positive samples had <10(2) CFU/g concentrations of fecal coliforms. L. monocytogenes was detected in only 1 (0.6%) of the 181 samples examined. This positive sample was simultaneously detected by both methods. The other Listeria species identified by plating were L. welshimeri (one sample of curly lettuce) and L. innocua (2 samples of watercress). The results indicate that minimally processed vegetables had poor microbiological quality, and these products could be a vehicle for pathogens such as Salmonella and L. monocytogenes.

  7. Minimally invasive aesthetic procedures in young adults

    PubMed Central

    Wollina, Uwe; Goldman, Alberto

    2011-01-01

    Age is a significant factor in modifying specific needs when it comes to medical aesthetic procedures. In this review we will focus on young adults in their third decade of life and review minimally invasive aesthetic procedures other than cosmetics and cosmeceuticals. Correction of asymmetries, correction after body modifying procedures, and facial sculpturing are important issues for young adults. The implication of aesthetic medicine as part of preventive medicine is a major ethical challenge that differentiates aesthetic medicine from fashion. PMID:21673871

  8. Heroin-associated anthrax with minimal morbidity.

    PubMed

    Black, Heather; Chapman, Ann; Inverarity, Donald; Sinha, Satyajit

    2017-03-08

    In 2010, during an outbreak of anthrax affecting people who inject drugs, a heroin user aged 37 years presented with soft tissue infection. He subsequently was found to have anthrax. We describe his management and the difficulty in distinguishing anthrax from non-anthrax lesions. His full recovery, despite an overall mortality of 30% for injectional anthrax, demonstrates that some heroin-related anthrax cases can be managed predominantly with oral antibiotics and minimal surgical intervention.

  9. The theory of Minimal Massive Gravity

    NASA Astrophysics Data System (ADS)

    De Felice, Antonio

    2017-08-01

    Here we give a short summary of the theory of minimal massive gravity. This theory, by construction, possesses only two tensor modes, exactly as in General Relativity. It is able to reconcile the original background of dRGT, which was proven to possess a ghost, with a stable homogeneous and isotropic cosmological evolution. The theory is constructed by imposing two constraints in a non-linear, background-independent way. The phenomenology of the theory is also briefly discussed.

  10. Isolated subtalar arthrodesis through minimal incision surgery.

    PubMed

    Carranza-Bencano, A; Tejero-García, S; Del Castillo-Blanco, G; Fernández-Torres, J J; Alegrete-Parra, A

    2013-08-01

    In recent years there has been an increase in the use of minimally invasive techniques, such as arthroscopy, percutaneous, and minimally invasive incisions, for foot and ankle surgery. The purpose of this study was to analyze the fusion rate and clinical results of isolated subtalar arthrodesis (ISA) using the novel and original technique of minimal incision surgery (MIS). There were a total of 77 feet in 76 patients who underwent ISA and were followed for 50 months on average (range, 15-108). The first 30 cases were evaluated retrospectively, and 47 cases were evaluated prospectively. MIS without tourniquet was used in all cases and fusion was assessed radiographically and clinically. Clinical outcome measures used were the Angus and Cowell Scoring System, AOFAS Ankle-Hindfoot, the SF-36, and a patient satisfaction questionnaire 12 months after the intervention. Radiographic and clinical consolidation was achieved in 92% of cases. Main outcomes were "good" in 57 patients as determined by the Angus and Cowell criteria, with 13 "fair" and 7 "poor" results. In the prospective group, AOFAS scores improved by 47.6 points (95% CI: 50.7-42.5) 12 months after surgical intervention. SF-36 outcomes improved by 14.5 points (95% CI: 11.58-17.31) in the mental summary component and 4.2 points (95% CI: 2.2-6.1) in the physical summary component. We recorded no cases of early complications such as wound infections, neurovascular damage, or delayed wound healing. To our knowledge, the present series represents the largest study on subtalar arthrodesis using minimally invasive surgery. The data obtained showed a similar rate of bony union and clinical outcomes compared with the literature, but without early wound complications. ISA using the MIS technique was a good option for patients at greater risk of wound healing complications. Level IV, case series.

  11. Asymptotic safety, emergence and minimal length

    NASA Astrophysics Data System (ADS)

    Percacci, Roberto; Vacca, Gian Paolo

    2010-12-01

    There seems to be a common prejudice that asymptotic safety is either incompatible with, or at best unrelated to, the other topics in the title. This is not the case. In fact, we show that (1) the existence of a fixed point with suitable properties is a promising way of deriving emergent properties of gravity, and (2) there is a sense in which asymptotic safety implies a minimal length. In doing so we also discuss possible signatures of asymptotic safety in scattering experiments.

  12. [Minimally invasive spine surgery: past and present].

    PubMed

    Corniola, M V; Stienen, M N; Tessitore, E; Schaller, K; Gautschi, O P

    2015-11-18

    In the early twentieth century, the understanding of spine biomechanics and the advent of surgical techniques for the lumbar spine led to the currently emerging concept of minimally invasive spine surgery. By reducing surgical access trauma, blood loss, infection rate and general morbidity, the functional prognosis of patients is improved. This is a real challenge for the spine surgeon, who has to maintain a good operative result while significantly reducing the collateral surgical damage caused by the relatively traumatic conventional access.

  13. Acoustic spectrometer with minimized background dissipation.

    PubMed

    Driaev, D; Iashvili, A; Kankadze, L; Tsakadze, S

    2017-05-01

    An apparatus for measurements of internal friction from Q(-1)∼10(-6) and of elastic moduli in the kilohertz frequency range is described, in which external friction losses are minimized by using a new type of three-reed tuning fork as the sample under study. The high sensitivity of the apparatus made it possible to observe resonance plasticization of diamagnetic LiF crystals under the action of crossed magnetic fields (≈100 μT) under EPR conditions.

  14. Telementoring and Telesurgery for Minimally Invasive Surgery.

    PubMed

    Hung, Andrew J; Chen, Jian; Shah, Ankeet; Gill, Inderbir S

    2017-06-24

    Tremendous interest and need lie in the intersection of telemedicine and minimally invasive surgery. Robotics provides an ideal environment for surgical telementoring and telesurgery, given its endoscopic optics and mechanized instrument movement. We review the current status, challenges and future promise of telemedicine in endoscopic and minimally invasive surgery, with particular focus on urologic applications. Two paired investigators screened the Pubmed(®), Scopus(®) and Web of Science(®) databases for all full-text English language articles published between 1995 and 2016, using the keywords: telemedicine, minimally invasive surgical procedure, robotic surgical procedure, education, distance. We categorized included studies by the level of interaction between proctors and trainees. The research design, special equipment, telecommunication network bandwidth, and research outcomes of each study were summarized and analyzed. Of 65 identified articles, 38 peer-reviewed manuscripts qualified for inclusion. Studies were categorized into four advancing levels: verbal guidance, guidance with telestration, guidance with tele-assist, and telesurgery. More advanced levels of surgical telementoring provide more effective and experiential teaching, but are associated with higher telecommunication network bandwidth requirements and expense. Concerns from patient safety, legal, financial, economic, and ethical standpoints remain to be reconciled. Telementoring and telesurgery in minimally invasive surgery are becoming more practical and cost-effective in facilitating the teaching of advanced surgical skills worldwide and the delivery of surgical care to underserved areas. Yet, many challenges remain. Maturity of these modalities depends on financial incentives, favorable legislation and collaboration with cybersecurity experts to ensure safety and cost-effectiveness.

  15. Reaction torque minimization techniques for articulated payloads

    NASA Technical Reports Server (NTRS)

    Kral, Kevin; Aleman, Roberto M.

    1988-01-01

    Articulated payloads on spacecraft, such as antenna telemetry systems and robotic elements, impart reaction torques back into the vehicle which can significantly affect the performance of other payloads. This paper discusses ways to minimize the reaction torques of articulated payloads through command-shaping algorithms and unique control implementations. The effects of reaction torques encountered on Landsat are presented and compared with simulated and measured data of prototype systems employing these improvements.

  16. Smooth GERBS, orthogonal systems and energy minimization

    NASA Astrophysics Data System (ADS)

    Dechevsky, Lubomir T.; Zanaty, Peter

    2013-12-01

    New results are obtained in three mutually related directions of the rapidly developing theory of generalized expo-rational B-splines (GERBS) [7, 6]: closed-form computability of C∞-smooth GERBS in terms of elementary and special functions, Hermite interpolation and least-squares best approximation via smooth GERBS, energy minimizing properties of smooth GERBS similar to those of the classical cubic polynomial B-splines.

  17. On 3D minimal massive gravity

    NASA Astrophysics Data System (ADS)

    Alishahiha, Mohsen; Qaemmaqami, Mohammad M.; Naseh, Ali; Shirzad, Ahmad

    2014-12-01

    We study linearized equations of motion of the newly proposed three dimensional gravity, known as minimal massive gravity, using its metric formulation. By making use of a redefinition of the parameters of the model, we observe that the resulting linearized equations are exactly the same as that of TMG. In particular the model admits logarithmic modes at critical points. We also study several vacuum solutions of the model, specially at a certain limit where the contribution of Chern-Simons term vanishes.

  18. Smooth GERBS, orthogonal systems and energy minimization

    SciTech Connect

    Dechevsky, Lubomir T. E-mail: pza@hin.no; Zanaty, Peter E-mail: pza@hin.no

    2013-12-18

    New results are obtained in three mutually related directions of the rapidly developing theory of generalized expo-rational B-splines (GERBS) [7, 6]: closed-form computability of C∞-smooth GERBS in terms of elementary and special functions, Hermite interpolation and least-squares best approximation via smooth GERBS, energy minimizing properties of smooth GERBS similar to those of the classical cubic polynomial B-splines.

  19. Nonlinear transient analysis via energy minimization

    NASA Technical Reports Server (NTRS)

    Kamat, M. P.; Knight, N. F., Jr.

    1978-01-01

    The formulation basis for nonlinear transient analysis of finite element models of structures using energy minimization is provided. Geometric and material nonlinearities are included. The development is restricted to simple one and two dimensional finite elements which are regarded as being the basic elements for modeling full aircraft-like structures under crash conditions. The results indicate the effectiveness of the technique as a viable tool for this purpose.

  20. Cancer and Aging: Epidemiology and Methodological Challenges

    PubMed Central

    Pedersen, Jacob K; Engholm, Gerda; Skytthe, Axel; Christensen, Kaare

    2016-01-01

    Epidemiological cancer data shed light on key questions within basic science, clinical medicine and public health. For decades, Denmark has had linkable health registers that contain individual level data on the entire population with virtually complete follow-up. This has enabled high quality studies of cancer epidemiology and minimized the challenges often faced in many countries, such as uncertain identification of the study base, age misreporting, and low validity of the cancer diagnoses. However, methodological challenges still remain to be addressed, especially in cancer epidemiology studies among the elderly and the oldest-old. For example, a characteristic pattern for many cancer types is that the incidence increases up to a maximum at about ages 75 to 90 years and is then followed by a decline or a leveling off at the oldest ages. It has been suggested that the oldest individuals may be asymptomatic, or even insusceptible to cancer. An alternative interpretation is that this pattern is an artifact due to lower diagnostic intensity among the elderly and oldest-old caused by higher levels of co-morbidities in this age group. Currently, the available cancer epidemiology data are not able to provide clear evidence for any of these hypotheses. PMID:26825001

  1. Psychophysical estimation of speed discrimination. I. Methodology.

    PubMed

    Lakshminarayanan, Vasudevan; Raghuram, Aparna; Khanna, Ritu

    2005-10-01

    Thresholds were assessed for a speed discrimination task with a pair of luminance-defined drifting gratings. The design and results of a series of experiments dealing in general with speed discrimination are described. Results show that for a speed discrimination task using drifting gratings, simultaneous presentation of the pair of gratings (spatially separated) was preferred over sequential presentation (temporally separated) in order to minimize the effects of eye movements and tracking. An interstimulus interval of at least 1000 ms was necessary to prevent motion aftereffects on subsequently viewed stimuli. For the two reference speeds tested, 2 and 8 deg/s, using identical or randomized spatial frequencies for the pair of gratings did not affect speed discrimination thresholds. Implementing a staircase method of estimating thresholds was preferred over the method of constant stimuli or the method of limits. The results of these experiments were used to define the methodology for an investigation of aging and motion perception. These results will be of interest and use to psychophysicists designing and implementing speed discrimination paradigms.

  2. Methodology investigations for shear wave splitting analysis

    NASA Astrophysics Data System (ADS)

    Kong, Fansheng

    Over the past several decades, shear wave splitting analyses have been increasingly utilized to delineate mantle structure and probe mantle dynamics. However, the reported splitting parameters (fast polarization orientations and splitting times) are frequently inconsistent among different studies, partially due to the different techniques used to estimate the splitting parameters. Here we investigate the methodology of shear wave splitting analysis through two sub-topics: a systematic comparison of the transverse minimization (TM) and splitting intensity (SI) techniques, and the applicability of the multiple-event stacking (MES) technique. Numerical experiments are conducted using both synthetic and observed data. In addition, crustal anisotropy beneath 71 broadband seismic stations situated at the eastern Tibetan Plateau and adjacent areas is investigated based on the sinusoidal moveout of P-to-S conversions from the Moho and an intra-crustal discontinuity, yielding an average splitting time of 0.39 +/- 0.19 s and dominantly fracture-parallel fast orientations. The crustal anisotropy measurements support the existence of mid/lower crustal flow in the southern Songpan-Ganzi Terrane and crustal shortening deformation beneath the Longmenshan fault zone.

  3. Directional reflectance characterization facility and measurement methodology.

    PubMed

    McGuckin, B T; Haner, D A; Menzies, R T; Esproles, C; Brothers, A M

    1996-08-20

    A precision reflectance characterization facility, constructed specifically for the measurement of the bidirectional reflectance properties of Spectralon panels planned for use as in-flight calibrators on the NASA Multiangle Imaging Spectroradiometer (MISR) instrument, is described. The incident linearly polarized radiation is provided at three laser wavelengths: 442, 632.8, and 859.9 nm. Each beam is collimated when incident on the Spectralon. The illuminated area of the panel is viewed with a silicon photodetector that revolves around the panel (360°) on a 30-cm boom extending from a common rotational axis. The reflected radiance detector signal is ratioed with the signal from a reference detector to minimize the effect of amplitude instabilities in the laser sources. This and other measures adopted to reduce noise have resulted in a bidirectional reflection function (BRF) calibration facility with a measurement precision of ±0.002 in BRF at the 1σ confidence level. The Spectralon test panel is held in a computer-controlled three-axis rotational assembly capable of a full 360° rotation in the horizontal plane and 90° in the vertical. The angular positioning system has a repeatability and resolution of 0.001°. Design details and an outline of the measurement methodology are presented.

  4. The non-minimal ekpyrotic trispectrum

    NASA Astrophysics Data System (ADS)

    Fertig, Angelika; Lehners, Jean-Luc

    2016-01-01

    Employing the covariant formalism, we derive the evolution equations for two scalar fields with non-canonical field space metric up to third order in perturbation theory. These equations can be used to derive predictions for local bi- and trispectra of multi-field cosmological models. Our main application is to ekpyrotic models in which the primordial curvature perturbations are generated via the non-minimal entropic mechanism. In these models, nearly scale-invariant entropy perturbations are generated first due to a non-minimal kinetic coupling between two scalar fields, and subsequently these perturbations are converted into curvature perturbations. Remarkably, the entropy perturbations have vanishing bi- and trispectra during the ekpyrotic phase. However, as we show, the conversion process to curvature perturbations induces local non-Gaussianity parameters f_NL and g_NL at levels that should be detectable by near-future observations. In fact, in order to obtain a large enough amplitude and small enough bispectrum of the curvature perturbations, as seen in current measurements, the conversion process must be very efficient. Interestingly, for such efficient conversions the trispectrum parameter g_NL remains negative and typically of a magnitude O(10^2)-O(10^3), resulting in a distinguishing feature of non-minimally coupled ekpyrotic models.

  5. The non-minimal ekpyrotic trispectrum

    SciTech Connect

    Fertig, Angelika; Lehners, Jean-Luc E-mail: jlehners@aei.mpg.de

    2016-01-01

    Employing the covariant formalism, we derive the evolution equations for two scalar fields with non-canonical field space metric up to third order in perturbation theory. These equations can be used to derive predictions for local bi- and trispectra of multi-field cosmological models. Our main application is to ekpyrotic models in which the primordial curvature perturbations are generated via the non-minimal entropic mechanism. In these models, nearly scale-invariant entropy perturbations are generated first due to a non-minimal kinetic coupling between two scalar fields, and subsequently these perturbations are converted into curvature perturbations. Remarkably, the entropy perturbations have vanishing bi- and trispectra during the ekpyrotic phase. However, as we show, the conversion process to curvature perturbations induces local non-Gaussianity parameters f_NL and g_NL at levels that should be detectable by near-future observations. In fact, in order to obtain a large enough amplitude and small enough bispectrum of the curvature perturbations, as seen in current measurements, the conversion process must be very efficient. Interestingly, for such efficient conversions the trispectrum parameter g_NL remains negative and typically of a magnitude O(10^2)-O(10^3), resulting in a distinguishing feature of non-minimally coupled ekpyrotic models.

  6. Esophageal surgery in minimally invasive era

    PubMed Central

    Bencini, Lapo; Moraldi, Luca; Bartolini, Ilenia; Coratti, Andrea

    2016-01-01

    The widespread popularity of new surgical technologies such as laparoscopy, thoracoscopy and robotics has led many surgeons to treat esophageal diseases with these methods. The expected benefits of minimally invasive surgery (MIS) mainly include reductions of postoperative complications, length of hospital stay, and pain and better cosmetic results. All of these benefits could potentially be of great interest when dealing with the esophagus due to the potentially severe complications that can occur after conventional surgery. Moreover, robotic platforms are expected to reduce many of the difficulties encountered during advanced laparoscopic and thoracoscopic procedures such as anastomotic reconstructions, accurate lymphadenectomies, and vascular sutures. Almost all esophageal diseases are approachable in a minimally invasive way, including diverticula, gastro-esophageal reflux disease, achalasia, perforations and cancer. Nevertheless, while the limits of MIS for benign esophageal diseases are mainly technical issues and costs, oncologic outcomes remain the cornerstone of any procedure to cure malignancies, for which the long-term results are critical. Furthermore, many of the minimally invasive esophageal operations should be compared to pharmacologic interventions and advanced pure endoscopic procedures; such a comparison requires a difficult literature analysis and leads to some confounding results of clinical trials. This review aims to examine the evidence for the use of MIS in both malignancies and more common benign disease of the esophagus, with a particular emphasis on future developments and ongoing areas of research. PMID:26843913

  7. Waste minimization in an autobody repair shop

    SciTech Connect

    Baria, D.N.; Dorland, D.; Bergeron, J.T.

    1994-12-31

    This work was done to document the waste minimization incorporated in a new autobody repair facility in Hermantown, Minnesota. Humes Collision Center incorporated new waste reduction techniques when it expanded its old facilities in 1992 and was able to achieve the benefits of cost reduction and waste reduction. Humes Collision Center repairs an average of 500 cars annually and is a very small quantity generator (VSQG) of hazardous waste, as defined by the Minnesota Pollution Control Agency (MPCA). The hazardous waste consists of antifreeze, batteries, paint sludge, refrigerants, and used oil, while the nonhazardous waste consists of cardboard, glass, paint filters, plastic, sanding dust, scrap metal, and wastewater. The hazardous and nonhazardous waste output was decreased by 72%. In addition, there was a 63% reduction in the operating costs. The waste minimization includes antifreeze recovery and recycling, reduction in unused waste paint, reduction, recovery and recycling of waste lacquer thinner for cleaning spray guns and paint cups, elimination of used plastic car bags, recovery and recycling of refrigerant, reduction in waste sandpaper and elimination of sanding dust, and elimination of waste paint filters. The rate of return on the investment in waste minimization equipment is estimated from 37% per year for the distillation unit, 80% for vacuum sanding, 146% for computerized paint mixing, 211% for the refrigerant recycler, to 588% per year for the gun washer. The corresponding payback time varies from 3 years to 2 months.
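
    As a quick sanity check on the quoted figures, simple payback time is roughly the reciprocal of the annual rate of return. The short Python sketch below (illustrative only; the dictionary layout is invented) recovers the reported range from the stated returns.

        # Sanity check: simple payback (months) is roughly 12 / annual rate of return.
        equipment = {"distillation unit": 0.37, "vacuum sanding": 0.80,
                     "computerized paint mixing": 1.46, "refrigerant recycler": 2.11,
                     "gun washer": 5.88}
        for name, roi in equipment.items():
            print(f"{name}: ~{12 / roi:.0f} months to pay back")
        # distillation unit: ~32 months (about 3 years) ... gun washer: ~2 months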

  8. Minimally invasive thyroidectomy: a ten years experience

    PubMed Central

    Viani, Lorenzo; Montana, Chiara Montana; Cozzani, Federico; Sianesi, Mario

    2016-01-01

    Background The conventional thyroidectomy is the most frequent surgical procedure for thyroid surgical disease. Over the past several years, minimally invasive approaches to thyroid surgery have been introduced. These new procedures improved the incidence of postoperative pain, the cosmetic results, patients' quality of life, and postoperative morbidity. The minimally invasive video-assisted thyroidectomy (MIVAT) is a minimally invasive procedure that uses a minicervicotomy to treat thyroid diseases. Methods We present our experience with 497 consecutively treated patients who underwent the MIVAT technique. We analyzed the mean age, sex, mean operative time, rates of bleeding, hypocalcemia, and transitory and definitive nerve palsy (6 months after the procedure), postoperative pain on a scale from 0 to 10 at 1 hour and 24 hours after surgery, and mean hospital stay. Results The indications to treat were related to the preoperative diagnosis: 182 THYR 6, 184 THYR 3-4, 27 Plummer disease, 24 Basedow disease, 28 toxic goiter, 52 goiter. In 497 cases we recorded 1 case of bleeding (0.2%), 12 cases (2.4%) of transitory nerve palsy and 4 (0.8%) of definitive nerve palsy. The rate of serologic hypocalcemia was 24.9% (124 cases) and of clinical hypocalcemia 7.2% (36 cases); there was 1 case of hypoparathyroidism (0.2%). Conclusions MIVAT is a safe approach to surgical thyroid disease; the costs and adverse events are similar to those of conventional thyroidectomy. The minicervicotomy allows a truly minimally invasive tissue dissection. PMID:27294036

  9. Minimally invasive local therapies for liver cancer

    PubMed Central

    Li, David; Kang, Josephine; Golas, Benjamin J.; Yeung, Vincent W.; Madoff, David C.

    2014-01-01

    Primary and metastatic liver tumors are an increasing global health problem, with hepatocellular carcinoma (HCC) now being the third leading cause of cancer-related mortality worldwide. Systemic treatment options for HCC remain limited, with Sorafenib as the only prospectively validated agent shown to increase overall survival. Surgical resection and/or transplantation, locally ablative therapies and regional or locoregional therapies have filled the gap in liver tumor treatments, providing improved survival outcomes for both primary and metastatic tumors. Minimally invasive local therapies have an increasing role in the treatment of both primary and metastatic liver tumors. For patients with low volume disease, these therapies have now been established into consensus practice guidelines. This review highlights technical aspects and outcomes of commonly utilized, minimally invasive local therapies including laparoscopic liver resection (LLR), radiofrequency ablation (RFA), microwave ablation (MWA), high-intensity focused ultrasound (HIFU), irreversible electroporation (IRE), and stereotactic body radiation therapy (SBRT). In addition, the role of combination treatment strategies utilizing these minimally invasive techniques is reviewed. PMID:25610708

  10. Direct energy functional minimization under orthogonality constraints

    NASA Astrophysics Data System (ADS)

    Weber, Valéry; VandeVondele, Joost; Hutter, Jürg; Niklasson, Anders M. N.

    2008-02-01

    The direct energy functional minimization problem in electronic structure theory, where the single-particle orbitals are optimized under the constraint of orthogonality, is explored. We present an orbital transformation based on an efficient expansion of the inverse factorization of the overlap matrix that keeps orbitals orthonormal. The orbital transformation maps the orthogonality constrained energy functional to an approximate unconstrained functional, which is correct to some order in a neighborhood of an orthogonal but approximate solution. A conjugate gradient scheme can then be used to find the ground state orbitals from the minimization of a sequence of transformed unconstrained electronic energy functionals. The technique provides an efficient, robust, and numerically stable approach to direct total energy minimization in first principles electronic structure theory based on tight-binding, Hartree-Fock, or density functional theory. For sparse problems, where both the orbitals and the effective single-particle Hamiltonians have sparse matrix representations, the effort scales linearly with the number of basis functions N in each iteration. For problems where only the overlap and Hamiltonian matrices are sparse the computational cost scales as O(M^2 N), where M is the number of occupied orbitals. We report a single point density functional energy calculation of a DNA decamer hydrated with 4003 water molecules under periodic boundary conditions. The DNA fragment containing a cis-syn thymine dimer is composed of 634 atoms and the whole system contains a total of 12 661 atoms and 103 333 spherical Gaussian basis functions.
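
    The abstract does not reproduce the expansion itself. As a hedged illustration of the underlying idea, the sketch below orthonormalizes a set of orbitals with a Newton-Schulz-type iteration for the inverse square root of the orbital overlap matrix, which likewise converges only in a neighborhood of an (approximately) orthonormal solution; the function name, scaling step, and toy example are ours, not the authors' code.

    ```python
    import numpy as np

    def invsqrt_newton_schulz(S, n_iter=30):
        """Iterative S^(-1/2) for symmetric positive-definite S.

        Converges when the scaled S is close to the identity, i.e. near an
        already-orthonormal solution -- mirroring the 'neighborhood of an
        orthogonal but approximate solution' caveat in the abstract.
        """
        I = np.eye(S.shape[0])
        s = np.linalg.norm(S)          # scale eigenvalues into (0, 1]
        Y, Z = S / s, I.copy()         # Y -> (S/s)^(1/2), Z -> (S/s)^(-1/2)
        for _ in range(n_iter):
            T = 0.5 * (3.0 * I - Z @ Y)
            Y, Z = Y @ T, T @ Z
        return Z / np.sqrt(s)

    # Orthonormalize slightly non-orthonormal orbitals C (toy example,
    # with an identity AO overlap for brevity).
    rng = np.random.default_rng(0)
    C = np.linalg.qr(rng.standard_normal((50, 8)))[0]
    C += 0.01 * rng.standard_normal(C.shape)       # perturb orthonormality
    C_orth = C @ invsqrt_newton_schulz(C.T @ C)
    print(np.abs(C_orth.T @ C_orth - np.eye(8)).max())   # ~1e-15
    ```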

  11. Minimally radiating sources for personal audio.

    PubMed

    Elliott, Stephen J; Cheer, Jordan; Murfet, Harry; Holland, Keith R

    2010-10-01

    In order to reduce annoyance from the audio output of personal devices, it is necessary to maintain the sound level at the user position while minimizing the levels elsewhere. If the dark zone, within which the sound is to be minimized, extends over the whole far field of the source, the problem reduces to that of minimizing the radiated sound power while maintaining the pressure level at the user position. It is shown analytically that the optimum two-source array then has a hypercardioid directivity and gives about 7 dB reduction in radiated sound power, compared with a monopole producing the same on-axis pressure. The performance of other linear arrays is studied using monopole simulations for the motivating example of a mobile phone. The trade-off is investigated between the performance in reducing radiated noise and the electrical power required to drive the array for different numbers of elements. It is shown, for both simulations and experiments conducted on a small array of loudspeakers under anechoic conditions, that both two- and three-element arrays provide a reasonable compromise between these competing requirements. The implementation of the two-source array in a coupled enclosure is also shown to reduce the electrical power requirements.
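
    A hedged numerical sketch of the core optimization: minimize the radiated power of a two-monopole array subject to a fixed far-field pressure in the user direction, a constrained quadratic problem with a standard closed-form solution. The normalization and the endfire steering choice are our assumptions; with them, this toy model approaches about 6 dB in the small-separation limit, in the same range as the ~7 dB quoted above.

    ```python
    import numpy as np

    def power_reduction_db(kd):
        # Power matrix for two monopoles: W ~ q^H A q, A_mn = sinc(k*r_mn).
        s = np.sinc(kd / np.pi)            # np.sinc(x) = sin(pi*x)/(pi*x)
        A = np.array([[1.0, s], [s, 1.0]])
        # On-axis (endfire) steering vector: far-field pressure ~ b^H q.
        b = np.array([1.0, np.exp(1j * kd)])
        # Minimize q^H A q subject to b^H q = 1 (same on-axis pressure as
        # a unit monopole, whose radiated power is 1 in these units); the
        # minimum power is 1 / (b^H A^{-1} b).
        Ainv_b = np.linalg.solve(A, b)
        W = 1.0 / np.real(b.conj() @ Ainv_b)
        return -10.0 * np.log10(W)

    for kd in [0.1, 0.5, 1.0, 2.0]:
        print(f"kd = {kd:3.1f}: power reduction {power_reduction_db(kd):.1f} dB")
    # -> approaches 10*log10(4) ~ 6 dB as kd -> 0 with this normalization
    ```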

  12. [Minimally Invasive Treatment of Esophageal Benign Diseases].

    PubMed

    Inoue, Haruhiro

    2016-07-01

    As a minimally invasive treatment of esophageal achalasia, per-oral endoscopic myotomy (POEM) was developed in 2008. More than 1,100 cases of achalasia-related diseases have received POEM. The success rate of the procedure was more than 95% (Eckardt score improvement of 3 points or more). No serious complication (Clavien-Dindo classification IIIb or higher) was experienced. These results suggest that POEM is becoming a standard minimally invasive treatment for achalasia-related diseases. As an offshoot of POEM, submucosal tumor removal through a submucosal tunnel (per-oral endoscopic tumor resection: POET) was developed and has been safely performed. The best indication for POET is an esophageal leiomyoma of less than 5 cm. A novel endoscopic treatment of gastroesophageal reflux disease (GERD) was also developed. Anti-reflux mucosectomy (ARMS) is a nearly circumferential mucosal reduction of the gastric cardia mucosa. ARMS has been performed in 56 consecutive cases of refractory GERD, with no major complications and excellent clinical results. The best indication for ARMS is refractory GERD without a long sliding hernia. The longest follow-up case is more than 10 years. Minimally invasive treatments for esophageal benign diseases are currently performed by therapeutic endoscopy.

  13. [Minimal cerebral dysfunctions and ADHD in adulthood].

    PubMed

    Linden, M; Weddigen, J

    2016-11-01

    Attention deficit hyperactivity disorder (ADHD) is of great importance not only in children but also in adults; however, despite extensive research there are still many unsolved questions with respect to the diagnosis. Patients not only suffer from attention deficits and hyperactivity but also from a variety of other problems, such as dyspraxia, problems with stimulus discrimination, dysgrammatism, legasthenia, or motor coordination problems. Furthermore, there are also psychopathological disorders, such as problems with memory, formal thinking, emotional modulation, drive and vegetative stability, in the sense of a psycho-organic syndrome. Such syndromes have long been known in psychiatry under terms such as complex capacity disorders, minimal cerebral dysfunction (MCD), minimal brain dysfunction (MBD), mild psycho-organic syndrome, psycho-organic axis syndrome, mild cognitive impairment, developmental disorder and developmental biological syndrome. Etiological data with respect to genetics and early childhood brain trauma support the notion of a psychobiological disorder for complex cerebral dysfunction in the sense of a psycho-organic syndrome. Depending on the individual life and work situation, these additional symptoms of ADHD are in many cases of greater relevance for life adjustment than the core symptoms. The concept of minimal cerebral dysfunction describes the ADHD problem better and has a direct bearing on the diagnosis, therapy and sociomedical care of the patients.

  14. A pollution reduction methodology for chemical process simulators

    SciTech Connect

    Mallick, S.K.; Cabezas, H.; Bare, J.C.; Sikdar, S.K.

    1996-11-01

    A pollution minimization methodology was developed for chemical process design using computer simulation. It is based on a pollution balance that at steady state is used to define a pollution index with units of mass of pollution per mass of products. The pollution balance has been modified by weighting the mass flowrate of each pollutant by its potential environmental impact score. This converts the mass balance into an environmental impact balance. This balance defines an impact index with units of environmental impact per mass of products. The impact index measures the potential environmental effects of process wastes. Three different schemes for chemical ranking were considered: (1) no ranking, (2) simple ranking from 0 to 3, and (3) ranking by a scientifically derived measure of human health and environmental effects. Use of the methodology is illustrated with two examples from the production of (1) methyl ethyl ketone and (2) synthetic ammonia.
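
    In schematic form (notation ours, consistent with the description above), the mass-based index and the impact-weighted index read:

    ```latex
    % Schematic statement of the two indices (notation ours):
    \[
      I_{\text{mass}}
        = \frac{\sum_{i}\dot{m}_{i}^{\text{waste}}}
               {\sum_{j}\dot{m}_{j}^{\text{product}}},
      \qquad
      I_{\text{impact}}
        = \frac{\sum_{i}\psi_{i}\,\dot{m}_{i}^{\text{waste}}}
               {\sum_{j}\dot{m}_{j}^{\text{product}}},
    \]
    % where \psi_i is the potential environmental impact score of
    % pollutant i, so I_impact has units of impact per mass of product.
    ```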

  15. Feminist Methodologies and Engineering Education Research

    ERIC Educational Resources Information Center

    Beddoes, Kacey

    2013-01-01

    This paper introduces feminist methodologies in the context of engineering education research. It builds upon other recent methodology articles in engineering education journals and presents feminist research methodologies as a concrete engineering education setting in which to explore the connections between epistemology, methodology and theory.…

  16. The threshold algorithm: Description of the methodology and new developments

    NASA Astrophysics Data System (ADS)

    Neelamraju, Sridhar; Oligschleger, Christina; Schön, J. Christian

    2017-10-01

    Understanding the dynamics of complex systems requires the investigation of their energy landscape. In particular, the flow of probability on such landscapes is a central feature in visualizing the time evolution of complex systems. To obtain such flows, and the concomitant stable states of the systems and the generalized barriers among them, the threshold algorithm has been developed. Here, we describe the methodology of this approach starting from the fundamental concepts in complex energy landscapes and present recent new developments, the threshold-minimization algorithm and the molecular dynamics threshold algorithm. For applications of these new algorithms, we draw on landscape studies of three disaccharide molecules: lactose, maltose, and sucrose.
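
    The following toy run illustrates the basic mechanics of a threshold ("lid") exploration on a one-dimensional landscape. It is a hedged sketch, not the authors' implementation: the landscape, step size, and bookkeeping are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def E(x):                          # toy 1-D energy landscape
        return 0.02 * x**2 + np.sin(x) * np.cos(2.0 * x)

    def threshold_run(x0, lid, steps=20000, dx=0.3):
        """Random walk accepting every move whose energy stays below `lid`;
        returns the (coarse-grained) regions of the landscape reached."""
        x, visited = x0, set()
        for _ in range(steps):
            x_try = x + rng.uniform(-dx, dx)
            if E(x_try) < lid:          # accept any move under the lid
                x = x_try
            visited.add(int(round(x)))  # coarse record of regions reached
        return visited

    # Raising the lid lets the walker cross higher generalized barriers,
    # so more basins become mutually accessible.
    for lid in [0.0, 0.5, 1.0, 2.0]:
        print(f"lid {lid:>4}: regions reached {sorted(threshold_run(1.0, lid))}")
    ```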

  17. Minimal entropy probability paths between genome families.

    PubMed

    Ahlbrandt, Calvin; Benson, Gary; Casey, William

    2004-05-01

    We develop a metric for probability distributions with applications to biological sequence analysis. Our distance metric is obtained by minimizing a functional defined on the class of paths over probability measures on N categories. The underlying mathematical theory is connected to a constrained problem in the calculus of variations. The solution presented is a numerical solution, which approximates the true solution in a set of cases called rich paths, where none of the components of the path is zero. The functional to be minimized is motivated by entropy considerations, reflecting the idea that nature might efficiently carry out mutations of genome sequences in such a way that the increase in entropy involved in transformation is as small as possible. We characterize sequences by frequency profiles or probability vectors; in the case of DNA, N is 4 and the components of the probability vector are the frequencies of occurrence of each of the bases A, C, G and T. Given two probability vectors a and b, we define a distance function as the infimum of path integrals of the entropy function H(p) over all admissible paths p(t), 0 ≤ t ≤ 1, with p(t) a probability vector such that p(0)=a and p(1)=b. If the probability paths p(t) are parameterized as y(s) in terms of arc length s and the optimal path is smooth with arc length L, then smooth and "rich" optimal probability paths may be numerically estimated by a hybrid method of iterating Newton's method on solutions of a two point boundary value problem, with unknown distance L between the abscissas, for the Euler-Lagrange equations resulting from a multiplier rule for the constrained optimization problem, together with linear regression to improve the arc length estimate L. Matlab code for these numerical methods is provided which works only for "rich" optimal probability vectors. These methods motivate a definition of an elementary distance function which is easier and faster to calculate, works on non
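
    One plausible reading of the variational problem, written out explicitly (notation ours; the arc-length weighting matches the abstract's reparameterization y(s) and makes the value independent of parameterization):

    ```latex
    % One reading of the functional described above (notation ours):
    \[
      d(a,b) \;=\; \inf_{p}\; \int_{0}^{1}
        H\bigl(p(t)\bigr)\,\lVert \dot{p}(t)\rVert \, dt ,
      \qquad
      H(p) \;=\; -\sum_{i=1}^{N} p_i \log p_i ,
    \]
    % over admissible paths p(t) in the probability simplex with
    % p(0) = a and p(1) = b.
    ```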

  18. Methodology of management of dredging operations I. Conceptual developments.

    PubMed

    Abriak, N E; Junqua, G; Dubois, V; Gregoire, P; Mac Farlane, F; Damidot, D

    2006-04-01

    This article presents a new methodology for the management of dredging operations. Partly derived from existing methodologies (OECD, UNEP, AIPCN), its aim is to be more complete, integrating the qualities and complementarities of the earlier methodologies. Moreover, it was developed in a context of sustainable development. Indeed, following a framework of industrial ecology, it supports the development and implementation of solutions for upgrading dredged materials from waste to resource, in order to minimize the environmental impact of dredging. It also uses a MultiCriteria Decision-Making Aid (M.C.D.M.A.) tool, in order to integrate local characteristics. In addition, this tool, called DRAGSED, allows a dialogue to be established between all the parties concerned with a dredging project (harbour authorities, industrialists, municipalities, administrations, populations, associations,...). Thus, the implementation of this methodology enables consensus to be reached on the dredging solution retained. It also proposes an environmental follow-up, which will allow an evaluation during its application.

  19. Minimizing inter-microscope variability in dental microwear texture analysis

    NASA Astrophysics Data System (ADS)

    Arman, Samuel D.; Ungar, Peter S.; Brown, Christopher A.; DeSantis, Larisa R. G.; Schmidt, Christopher; Prideaux, Gavin J.

    2016-06-01

    A common approach to dental microwear texture analysis (DMTA) uses confocal profilometry in concert with scale-sensitive fractal analysis to help understand the diets of extinct mammals. One of the main benefits of DMTA over other methods is the repeatable, objective manner of data collection. This repeatability, however, is threatened by variation in the results of DMTA of the same dental surfaces yielded by different microscopes. Here we compare DMTA data of five species of kangaroos measured on seven profilers of varying specifications. Comparison between microscopes confirms that inter-microscope differences are present, but we show that deployment of a number of automated treatments to remove measurement noise can help minimize these differences. Applying the same treatments to a published hominin DMTA dataset shows that they alter some significant differences between dietary groups. Minimising microscope variability while maintaining interspecific dietary differences therefore requires that these factors be balanced in determining appropriate treatments. The process outlined here offers a solution for allowing comparison of data between microscopes, which is essential for ongoing DMTA research. In addition, the process undertaken, including consideration of other elements of DMTA protocols, also promises to streamline methodology, remove measurement noise and, in doing so, optimize recovery of a reliable dietary signature.

  20. Minimal selective concentrations of tetracycline in complex aquatic bacterial biofilms.

    PubMed

    Lundström, Sara V; Östman, Marcus; Bengtsson-Palme, Johan; Rutgersson, Carolin; Thoudal, Malin; Sircar, Triranta; Blanck, Hans; Eriksson, K Martin; Tysklind, Mats; Flach, Carl-Fredrik; Larsson, D G Joakim

    2016-05-15

    Selection pressure generated by antibiotics released into the environment could enrich for antibiotic resistance genes and antibiotic resistant bacteria, thereby increasing the risk for transmission to humans and animals. Tetracyclines comprise an antibiotic class of great importance to both human and animal health. Accordingly, residues of tetracycline are commonly detected in aquatic environments. To assess if tetracycline pollution in aquatic environments promotes development of resistance, we determined minimal selective concentrations (MSCs) in biofilms of complex aquatic bacterial communities using both phenotypic and genotypic assays. Tetracycline significantly increased the relative abundance of resistant bacteria at 10 μg/L, while specific tet genes (tetA and tetG) increased significantly at the lowest concentration tested (1 μg/L). Taxonomic composition of the biofilm communities was altered with increasing tetracycline concentrations. Metagenomic analysis revealed a concurrent increase of several tet genes and a range of other genes providing resistance to different classes of antibiotics (e.g. cmlA, floR, sul1, and mphA), indicating potential for co-selection. Consequently, MSCs for the tet genes of ≤ 1 μg/L suggests that current exposure levels in e.g. sewage treatment plants could be sufficient to promote resistance. The methodology used here to assess MSCs could be applied in risk assessment of other antibiotics as well. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Minimizing variability of cascade impaction measurements in inhalers and nebulizers.

    PubMed

    Bonam, Matthew; Christopher, David; Cipolla, David; Donovan, Brent; Goodwin, David; Holmes, Susan; Lyapustina, Svetlana; Mitchell, Jolyon; Nichols, Steve; Pettersson, Gunilla; Quale, Chris; Rao, Nagaraja; Singh, Dilraj; Tougas, Terrence; Van Oort, Mike; Walther, Bernd; Wyka, Bruce

    2008-01-01

    The purpose of this article is to catalogue in a systematic way the available information about factors that may influence the outcome and variability of cascade impactor (CI) measurements of pharmaceutical aerosols for inhalation, such as those obtained from metered dose inhalers (MDIs), dry powder inhalers (DPIs) or products for nebulization; and to suggest ways to minimize the influence of such factors. To accomplish this task, the authors constructed a cause-and-effect Ishikawa diagram for a CI measurement and considered the influence of each root cause based on industry experience and thorough literature review. The results illustrate the intricate network of underlying causes of CI variability, with the potential for several multi-way statistical interactions. It was also found that significantly more quantitative information exists about impactor-related causes than about operator-derived influences, the contribution of drug assay methodology and product-related causes, suggesting a need for further research in those areas. The understanding and awareness of all these factors should aid in the development of optimized CI methods and appropriate quality control measures for aerodynamic particle size distribution (APSD) of pharmaceutical aerosols, in line with the current regulatory initiatives involving quality-by-design (QbD).

  2. NUREG-1150 risk assessment methodology

    SciTech Connect

    Benjamin, A.S.; Amos, C.N.; Cunningham, M.A.; Murphy, J.A.

    1987-01-01

    This paper describes the methodology developed in support of the US Nuclear Regulatory Commission's (NRC's) evaluation of severe accident risks in NUREG-1150. After the accident at Three Mile Island, Unit 2, the NRC initiated a severe accident research program to develop an improved understanding of severe accidents and to provide a second technical basis to support regulatory decisions in this area. A key product of this program is NUREG-1150, which provides estimates of risk for several nuclear reactors of different design. The principal technical analyses for NUREG-1150 were performed at Sandia National Labs. under the Severe Accident Risk Reduction Program and the Accident Sequence Evaluation Program. A major aspect of the work was the development of a methodology that improved upon previous full-scale probabilistic risk assessments (PRAs) in several areas, which are described.

  3. Methodology issues in implementation science.

    PubMed

    Newhouse, Robin; Bobay, Kathleen; Dykes, Patricia C; Stevens, Kathleen R; Titler, Marita

    2013-04-01

    Putting evidence into practice at the point of care delivery requires an understanding of implementation strategies that work, in what context and how. To identify methodological issues in implementation science using 4 studies as cases and make recommendations for further methods development. Four cases are presented and methodological issues identified. For each issue raised, evidence on the state of the science is described. Issues in implementation science identified include diverse conceptual frameworks, potential weaknesses in pragmatic study designs, and the paucity of standard concepts and measurement. Recommendations to advance methods in implementation include developing a core set of implementation concepts and metrics, generating standards for implementation methods including pragmatic trials, mixed methods designs, complex interventions and measurement, and endorsing reporting standards for implementation studies.

  4. Design methodology of Dutch banknotes

    NASA Astrophysics Data System (ADS)

    de Heij, Hans A. M.

    2000-04-01

    Since the introduction of a design methodology for Dutch banknotes, the quality of Dutch paper currency has improved in more than one way. The methodology in question provides for (i) a design policy, which helps fix clear objectives; (ii) design management, to ensure smooth cooperation between the graphic designer, printer, papermaker and central bank; and (iii) a program of requirements, a banknote development guideline for all parties involved. This systematic approach enables an objective selection of design proposals, including security features. Furthermore, the project manager obtains regular feedback from the public by conducting market surveys. Each new design of a Netherlands Guilder banknote issued by the Nederlandsche Bank over the past 50 years has been an improvement on its predecessor in terms of value recognition, security and durability.

  5. [CODESIGN METHODOLOGIES: AN ENABLING RESOURCE?].

    PubMed

    Oboeuf, Alexandre; Aiguier, Grégory; Loute, Alain

    2016-01-01

    To reflect on how the caring relationship is learned, seventeen people were brought together to participate in a day of codesign. This methodology aims to foster the creativity of a group through a succession of creativity exercises. This article is primarily intended to reflect on the conditions under which such a methodology can become a resource for thinking about the learning of ethics. The role of affectivity in the success of a codesign day is examined. This work highlights its central place in the construction of an innovative climate and in the mechanism of divergent thinking. The article aims to open new questions on the articulation of the exercises, affectivity, and the roles of the facilitator and of the patient. The research perspectives invite disciplinary dialogue.

  6. Methodologies for building robust schedules

    NASA Technical Reports Server (NTRS)

    Dean, John H.

    1992-01-01

    COMPASS is the name of a Computer Aided Scheduling System designed and built for NASA. COMPASS can be used to develop schedules of activities based upon the temporal relationships of the activities and their resource requirements. COMPASS uses this information and, guided by the user, develops precise start and stop times for the activities. In actual practice, however, it is impossible to know with complete certainty what the actual durations of the scheduled activities will really be. The best that one can hope for is knowledge of the probability distribution of the durations. This paper investigates methodologies for using a scheduling tool like COMPASS, which is based upon definite values for the resource requirements, while building schedules that remain valid in the face of perturbations during schedule execution. Representations for the schedules developed by these methodologies are presented, along with a discussion of the algorithm that could be used by a computer onboard a spacecraft to efficiently monitor and execute these schedules.

  7. Methodology for optimizing transistor performance

    NASA Astrophysics Data System (ADS)

    Waldo, Whitson G.

    1997-08-01

    A methodology is presented for optimizing transistor performance by considering the coupled response of the on-state and off-state parameters of saturated drain current and subthreshold drain leakage current, respectively. Good die yield in a CMOS digital logic integrated circuit is shown to be highly correlated to a multiple linear model of these transistor performance parameters using empirical data. These currents are correspondingly highly correlated to threshold voltage and effective channel length. Monte Carlo simulation is used to predict the distributions of saturated drain current and subthreshold drain current based on the natural variation of threshold voltage and effective channel length. A case study is presented using this methodology for the development of a CMOS retrograde well process using high energy implants from a baseline process using conventional diffused wells.
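
    A hedged sketch of the Monte Carlo step described above: sample the two underlying process variables and push them through simple first-order device models. The alpha-power and exponential-subthreshold expressions, and all numbers, are illustrative assumptions, not the paper's.

    ```python
    import numpy as np

    # Illustrative Monte Carlo in the spirit of the abstract: sample
    # threshold voltage Vt and effective channel length Leff, then
    # propagate them through first-order equations for on-current
    # (alpha-power law) and subthreshold leakage (exponential in Vt).
    rng = np.random.default_rng(42)
    n = 100_000
    Vdd, alpha, n_factor, vT = 2.5, 1.3, 1.4, 0.0259    # assumed constants

    Vt = rng.normal(0.45, 0.03, n)      # V, assumed process variation
    Leff = rng.normal(0.25, 0.02, n)    # um, assumed process variation

    Idsat = 500.0 * (Vdd - Vt) ** alpha / Leff          # uA/um (toy scale)
    Ioff = 10.0 * np.exp(-Vt / (n_factor * vT)) / Leff  # nA/um (toy scale)

    print(f"Idsat: mean {Idsat.mean():.0f}, sigma {Idsat.std():.0f} uA/um")
    print(f"Ioff:  median {np.median(Ioff):.3g} nA/um")
    # Both currents move together with Vt and Leff -- the coupled
    # on-state/off-state response the methodology exploits.
    print(f"corr(Idsat, Ioff) = {np.corrcoef(Idsat, Ioff)[0, 1]:+.2f}")
    ```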

  8. Diffusion methodology: time to innovate?

    PubMed

    Meyer, Gary

    2004-01-01

    Over the past 60 years, thousands of diffusion studies have been conducted in numerous disciplines of study including sociology, education, communication, marketing, and public health. With few exceptions, these studies have been driven by a methodological approach that has become institutionalized in diffusion research. This approach is characterized by the collection of quantitative data about one innovation gathered from adopters at a single point in time after widespread diffusion has occurred. This dominant approach is examined here in terms of both its strengths and weaknesses and with regard to its contribution to the collective base of understanding the diffusion of innovations. Alternative methodological approaches are proposed and reviewed with consideration for the means by which they may expand the knowledge base.

  9. Defect reduction through Lean methodology

    NASA Astrophysics Data System (ADS)

    Purdy, Kathleen; Kindt, Louis; Densmore, Jim; Benson, Craig; Zhou, Nancy; Leonard, John; Whiteside, Cynthia; Nolan, Robert; Shanks, David

    2010-09-01

    Lean manufacturing is a systematic method of identifying and eliminating waste. Use of Lean manufacturing techniques at the IBM photomask manufacturing facility has increased the efficiency and productivity of the photomask process. Tools such as value stream mapping, 5S, and structured problem solving are widely used today. In this paper we describe a step-by-step Lean technique used to systematically decrease defects, resulting in reduced material costs, inspection costs and cycle time. The method used consists of an 8-step approach commonly referred to as the 8D problem solving process. This process allowed us to identify both prominent issues and more subtle problems requiring in-depth investigation. The methodology used is flexible and can be applied to numerous situations. Advantages of the Lean methodology are also discussed.

  10. Structural methodologies for auditing SNOMED.

    PubMed

    Wang, Yue; Halper, Michael; Min, Hua; Perl, Yehoshua; Chen, Yan; Spackman, Kent A

    2007-10-01

    SNOMED is one of the leading health care terminologies being used worldwide. As such, quality assurance is an important part of its maintenance cycle. Methodologies for auditing SNOMED based on structural aspects of its organization are presented. In particular, automated techniques for partitioning SNOMED into smaller groups of concepts based primarily on relationships patterns are defined. Two abstraction networks, the area taxonomy and p-area taxonomy, are derived from the partitions. The high-level views afforded by these abstraction networks form the basis for systematic auditing. The networks tend to highlight errors that manifest themselves as irregularities at the abstract level. They also support group-based auditing, where sets of purportedly similar concepts are focused on for review. The auditing methodologies are demonstrated on one of SNOMED's top-level hierarchies. Errors discovered during the auditing process are reported.

  11. Methodological assessment of HCC literature

    PubMed Central

    Daniele, G.; Costa, N.; Lorusso, V.; Costa-Maia, J.; Pache, I.; Pirisi, M.

    2013-01-01

    Despite the fact that the hepatocellular carcinoma (HCC) represents a major health problem, very few interventions are available for this disease, and only sorafenib is approved for the treatment of advanced disease. Of note, only very few interventions have been thoroughly evaluated over time for HCC patients compared with several hundreds in other, equally highly lethal, tumours. Additionally, clinical trials in HCC have often been questioned for poor design and methodological issues. As a consequence, a gap between what is measured in clinical trials and what clinicians have to face in daily practice often occurs. As a result of this scenario, even the most recent guidelines for treatment of HCC patients use low strength evidence to make recommendations. In this review, we will discuss some of the potential methodological issues hindering a rational development of new treatments for HCC patients. PMID:23715943

  12. Some methodological issues in biosurveillance.

    PubMed

    Fricker, Ronald D

    2011-02-28

    This paper briefly summarizes a short course I gave at the 12th Biennial Centers for Disease Control and Prevention (CDC) and Agency for Toxic Substances and Disease Registry (ATSDR) Symposium held in Decatur, Georgia on April 6, 2009. The goal of this short course was to discuss various methodological issues of biosurveillance detection algorithms, with a focus on the issues related to developing, evaluating, and implementing such algorithms.

  13. [Methods and methodology of pathology].

    PubMed

    Lushnikov, E F

    2016-01-01

    The lecture presents the state of the art of the methodology of human pathology, an area of scientific and practical activity in which specialists produce and systematize objective knowledge of pathology and apply that knowledge in clinical medicine. It considers the objects and subjects of investigation, the materials and methods of the pathologist, and the results of his/her work.

  14. ISE System Development Methodology Manual

    SciTech Connect

    Hayhoe, G.F.

    1992-02-17

    The Information Systems Engineering (ISE) System Development Methodology Manual (SDM) is a framework of life cycle management guidelines that provide ISE personnel with direction, organization, consistency, and improved communication when developing and maintaining systems. These guidelines were designed to allow ISE to build and deliver Total Quality products, and to meet the goals and requirements of the US Department of Energy (DOE), Westinghouse Savannah River Company, and Westinghouse Electric Corporation.

  15. Micromanipulator: effectiveness in minimally invasive neurosurgery.

    PubMed

    Jain, R; Kato, Y; Sano, H; Imizu, S; Watanabe, S; Yamaguchi, S; Shinya, N; Jindal, V; Kanno, T

    2003-08-01

    Minimally invasive surgeries using innovative approaches are practiced in all fields. The evolution of microneurosurgery has revolutionized the results in neurosurgery. Use of endoscopes and navigation has made microsurgery less invasive. A further development to make minimally invasive microneurosurgery even less invasive is the use of a micromanipulator. The use and effectiveness of a manually controlled micromanipulator system is presented. The manually controlled micromanipulator system consists of three parts, i.e., a basic micromanipulator, a manipulator supporting device and the manual control. The micromanipulator fitted in the supporting device is arranged before the start of surgery. The supporting device used is a pneumatically driven powered endoscopic holding device (Mitaka Kohki Co., Tokyo). We used the system most often for endoscope-assisted cerebrovascular microneurosurgery; in a span of two months we used it in thirty aneurysm clipping surgeries. The endoscope fitted in the system has three ranges of motion (forward/backward, up/down and sideways). We use a MACHIDA rigid endoscope with an internal diameter of 2.7 mm (the smallest-diameter endoscope available). Special features of this endoscope are accurate visualization at a deeper plane, stable movements and availability of a single focus point for a long time. All these features are valuable during pre- and post-clipping observation. The aim of developing the micromanipulator system was to further reduce invasiveness. A significant improvement in manual dexterity is possible when working through the micromanipulator interface, which dampens human physiological tremor. Physiological tremor would otherwise render manual dexterity unsafe at the end of the lever arm of long instruments; with it damped, the use of the endoscope becomes practical. Minimally invasive microneurosurgery can thus be made even less invasive by use of the micromanipulator, and we are convinced that this will facilitate more accurate and

  16. Fusion algebras of logarithmic minimal models

    NASA Astrophysics Data System (ADS)

    Rasmussen, Jørgen; Pearce, Paul A.

    2007-11-01

    We present explicit conjectures for the chiral fusion algebras of the logarithmic minimal models LM(p,p′), considering Virasoro representations with no enlarged or extended symmetry algebra. The generators of fusion are countably infinite in number but the ensuing fusion rules are quasi-rational in the sense that the fusion of a finite number of representations decomposes into a finite direct sum of representations. The fusion rules are commutative, associative and exhibit an sl(2) structure, but require so-called Kac representations which are typically reducible yet indecomposable representations of rank 1. In particular, for p ≠ 1 the identity of the fundamental fusion algebra is a reducible yet indecomposable Kac representation of rank 1. We make detailed comparisons of our fusion rules with the results of Gaberdiel and Kausch for p = 1 and with Eberle and Flohr for (p, p′) = (2, 5), corresponding to the logarithmic Yang-Lee model. In the latter case, we confirm the appearance of indecomposable representations of rank 3. We also find that closure of a fundamental fusion algebra is achieved without the introduction of indecomposable representations of rank higher than 3. The conjectured fusion rules are supported, within our lattice approach, by extensive numerical studies of the associated integrable lattice models. Details of our lattice findings and numerical results will be presented elsewhere. The agreement of our fusion rules with the previous fusion rules lends considerable support to the identification of the logarithmic minimal models LM(p,p′) with the augmented c_{p,p′} (minimal) models defined algebraically.

  17. Minimal formulation of joint motion for biomechanisms

    PubMed Central

    Seth, Ajay; Sherman, Michael; Eastman, Peter; Delp, Scott

    2010-01-01

    Biomechanical systems share many properties with mechanically engineered systems, and researchers have successfully employed mechanical engineering simulation software to investigate the mechanical behavior of diverse biological mechanisms, ranging from biomolecules to human joints. Unlike their man-made counterparts, however, biomechanisms rarely exhibit the simple, uncoupled, pure-axial motion that is engineered into mechanical joints such as sliders, pins, and ball-and-socket joints. Current mechanical modeling software based on internal-coordinate multibody dynamics can formulate engineered joints directly in minimal coordinates, but requires additional coordinates restricted by constraints to model more complex motions. This approach can be inefficient, inaccurate, and difficult for biomechanists to customize. Since complex motion is the rule rather than the exception in biomechanisms, the benefits of minimal coordinate modeling are not fully realized in biomedical research. Here we introduce a practical implementation for empirically-defined internal-coordinate joints, which we call “mobilizers.” A mobilizer encapsulates the observations, measurement frame, and modeling requirements into a hinge specification of the permissible-motion manifold for a minimal set of internal coordinates. Mobilizers support nonlinear mappings that are mathematically equivalent to constraint manifolds but have the advantages of fewer coordinates, no constraints, and exact representation of the biomechanical motion-space—the benefits long enjoyed for internal-coordinate models of mechanical joints. Hinge matrices within the mobilizer are easily specified by user-supplied functions, and provide a direct means of mapping permissible motion derived from empirical data. We present computational results showing substantial performance and accuracy gains for mobilizers versus equivalent joints implemented with constraints. Examples of mobilizers for joints from human biomechanics

  18. Minimally Invasive Approach of a Retrocaval Ureter

    PubMed Central

    Pinheiro, Hugo; Ferronha, Frederico; Morales, Jorge; Campos Pinheiro, Luís

    2016-01-01

    The retrocaval ureter is a rare congenital entity, classically managed with open pyeloplasty techniques. The experience obtained with the laparoscopic approach to other, more frequent causes of ureteropelvic junction (UPJ) obstruction has opened the way for the minimally invasive approach to the retrocaval ureter. In our paper, we describe a clinical case of a right retrocaval ureter managed successfully with laparoscopic dismembered pyeloplasty. The key points of the procedure are described. Our results were similar to those published by other urologic centers, which demonstrates the safety and feasibility of the procedure for this condition. PMID:27635277

  19. Minimal model for spoof acoustoelastic surface states

    SciTech Connect

    Christensen, J.; Willatzen, M.; Liang, Z.

    2014-12-15

    Similar to textured perfect electric conductors, which sustain artificial or spoof surface plasmons for electromagnetic waves, we present an equivalent phenomenon for the case of sound. Aided by a minimal model that is able to capture the complex wave interaction of elastic cavity modes and airborne sound radiation in perfectly rigid panels, we construct designer acoustoelastic surface waves that are entirely controlled by the geometrical environment. Comparisons to results obtained by full-wave simulations confirm the feasibility of the model, and we demonstrate illustrative examples such as resonant transmissions and waveguiding, a few examples of the many cases where spoof elastic surface waves are useful.

  20. Mercury Contamination: Fate and Risk Minimization Strategies

    NASA Astrophysics Data System (ADS)

    Charlet, L.

    Two river basins have been studied in French Guyana which are subject to heavy mercury contamination due to illegal gold mining. Within the framework of an interdisciplinary European project, the fate of mercury in water, air, soil and sediment has been studied, as well as its bio-accumulation in the food chain. This bioaccumulation results in the contamination of Amerindian populations through fish consumption. The study has been done in close contact with the economic and political actors. The results of the interdisciplinary scientific study have been translated into risk minimization strategies, which are analyzed in the framework of the European Water Framework Directive.

  1. Qualifying and quantifying minimal hepatic encephalopathy.

    PubMed

    Morgan, Marsha Y; Amodio, Piero; Cook, Nicola A; Jackson, Clive D; Kircheis, Gerald; Lauridsen, Mette M; Montagnese, Sara; Schiff, Sami; Weissenborn, Karin

    2016-12-01

    Minimal hepatic encephalopathy is the term applied to the neuropsychiatric status of patients with cirrhosis who are unimpaired on clinical examination but show alterations in neuropsychological tests exploring psychomotor speed/executive function and/or in neurophysiological variables. There is no gold standard for the diagnosis of this syndrome. As these patients have, by definition, no recognizable clinical features of brain dysfunction, the primary prerequisite for the diagnosis is careful exclusion of clinical symptoms and signs. A large number of psychometric tests/test systems have been evaluated in this patient group. Of these the best known and validated is the Portal Systemic Hepatic Encephalopathy Score (PHES) derived from a test battery of five paper and pencil tests; normative reference data are available in several countries. The electroencephalogram (EEG) has been used to diagnose hepatic encephalopathy since the 1950s but, once popular, the technology is not as accessible now as it once was. The performance characteristics of the EEG are critically dependent on the type of analysis undertaken; spectral analysis has better performance characteristics than visual analysis; evolving analytical techniques may provide better diagnostic information while the advent of portable wireless headsets may facilitate more widespread use. A large number of other diagnostic tools have been validated for the diagnosis of minimal hepatic encephalopathy including Critical Flicker Frequency, the Inhibitory Control Test, the Stroop test, the Scan package and the Continuous Reaction Time; each has its pros and cons; strengths and weaknesses; protagonists and detractors. Recent AASLD/EASL Practice Guidelines suggest that the diagnosis of minimal hepatic encephalopathy should be based on the PHES test together with one of the validated alternative techniques or the EEG. Minimal hepatic encephalopathy has a detrimental effect on the well-being of patients and their care

  2. Strategies for minimizing nosocomial measles transmission.

    PubMed Central

    Biellik, R. J.; Clements, C. J.

    1997-01-01

    As a result of the highly contagious nature of measles before the onset of rash, nosocomial transmission will remain a threat until the disease is eradicated. However, a number of strategies can minimize its nosocomial spread. It is therefore vital to maximize awareness among health care staff that an individual with measles can enter a health facility at any time and that a continual risk of the nosocomial transmission of measles exists. The present review makes two groups of recommendations: those which are generally applicable to all countries, and certain additional recommendations which may be suitable only for industrialized countries. PMID:9342896

  3. Lowered ultraviolet minimal erythema dose in hemiplegia.

    PubMed Central

    Cox, N. H.; Williams, S. J.

    1985-01-01

    In view of recent reports of increased tanning in hemiplegic limbs, we have investigated ultraviolet (u.v.) minimal erythema dose (MED) in hemiplegia using the bilateral comparison technique. Seven of 10 patients had a lower MED in the hemiplegic arm compared to the normal side, the mean reduction being 16% (range 0-33%, P = 0.003). No patients had a higher MED in the hemiplegic arm. We review the literature regarding other non-neurological features of hemiplegia, in particular asymmetry of temperature, oedema, and finger clubbing, and we propose a vasomotor or trophic aetiology for these findings. PMID:4022889

  4. The minimal length and quantum partition functions

    NASA Astrophysics Data System (ADS)

    Abbasiyan-Motlaq, M.; Pedram, P.

    2014-08-01

    We study the thermodynamics of various physical systems in the framework of the generalized uncertainty principle that implies a minimal length uncertainty proportional to the Planck length. We present a general scheme to analytically calculate the quantum partition function of the physical systems to first order of the deformation parameter based on the behavior of the modified energy spectrum and compare our results with the classical approach. Also, we find the modified internal energy and heat capacity of the systems for the anti-Snyder framework.
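
    For reference, the standard minimal-length form of the generalized uncertainty principle assumed in such studies (the Kempf-type deformation, stated here since the abstract presumes it):

    ```latex
    % Deformed commutator and the resulting minimal position uncertainty:
    \[
      [\hat{x},\hat{p}] = i\hbar\,\bigl(1 + \beta \hat{p}^{2}\bigr)
      \;\Longrightarrow\;
      \Delta x \,\ge\, \frac{\hbar}{2}
        \left(\frac{1}{\Delta p} + \beta\,\Delta p\right)
      \;\Longrightarrow\;
      (\Delta x)_{\min} = \hbar\sqrt{\beta}\,,
    \]
    % with \beta the deformation parameter of the abstract,
    % proportional to the square of the Planck length.
    ```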

  5. Minimizing Occupational Exposure to Antineoplastic Agents.

    PubMed

    Polovich, Martha

    2016-01-01

    The inherent toxicity of antineoplastic drugs used for the treatment of cancer makes them harmful to healthy cells as well as to cancer cells. Nurses who prepare and/or administer the agents potentially are exposed to the drugs and their negative effects. Knowledge about these drugs and the precautions aimed at reducing exposure are essential aspects of infusion nursing practice. This article briefly reviews the mechanisms of action of common antineoplastic drugs, the adverse outcomes associated with exposure, the potential for occupational exposure from preparation and administration, and recommended strategies for minimizing occupational exposure.

  6. Reducing robotic prostatectomy costs by minimizing instrumentation.

    PubMed

    Delto, Joan C; Wayne, George; Yanes, Rafael; Nieder, Alan M; Bhandari, Akshay

    2015-05-01

    Since the introduction of robotic surgery for radical prostatectomy, the cost-benefit of this technology has been under scrutiny. While robotic surgery professes to offer multiple advantages, including reduced blood loss, reduced length of stay, and expedient recovery, the associated costs tend to be significantly higher, secondary to the fixed cost of the robot as well as the variable costs associated with instrumentation. This study provides a simple framework for the careful consideration of costs during the selection of equipment and materials. Two experienced robotic surgeons at our institution as well as several at other institutions were queried about their preferred instrument usage for robot-assisted prostatectomy. Costs of instruments and materials were obtained and clustered by type and price. A minimal set of instruments was identified and compared against alternative instrumentation. A retrospective review of 125 patients who underwent robotically assisted laparoscopic prostatectomy for prostate cancer at our institution was performed to compare estimated blood loss (EBL), operative times, and intraoperative complications for both surgeons. Our surgeons now conceptualize instrument costs as proportional changes to the cost of the baseline minimal combination. Robotic costs at our institution were reduced by eliminating an energy source like the Ligasure or vessel sealer, exploiting instrument versatility, and utilizing inexpensive tools such as Hem-o-lok clips. Such modifications reduced surgeon 1's cost of instrumentation to ∼40% less compared with surgeon 2 and up to 32% less than instrumentation used by surgeons at other institutions. Surgeon 1's combination may not be optimal for all robotic surgeons; however, it establishes a minimally viable toolbox for our institution through a rudimentary cost analysis. A similar analysis may aid others in better conceptualizing long-term costs not as nominal, often unwieldy prices, but as percent changes in

  7. Nonunity gain minimal-disturbance measurement

    SciTech Connect

    Sabuncu, Metin; Andersen, Ulrik L.; Mista, Ladislav Jr.; Fiurasek, Jaromir; Filip, Radim; Leuchs, Gerd

    2007-09-15

    We propose and experimentally demonstrate an optimal nonunity gain Gaussian scheme for partial measurement of an unknown coherent state that causes minimal disturbance of the state. The information gain and the state disturbance are quantified by the noise added to the measurement outcomes and to the output state, respectively. We derive the optimal trade-off relation between the two noises and we show that the trade-off is saturated by nonunity gain teleportation. Optimal partial measurement is demonstrated experimentally using a linear optics scheme with feedforward.

  8. Minimally invasive splenectomy: an update and review.

    PubMed

    Gamme, Gary; Birch, Daniel W; Karmali, Shahzeer

    2013-08-01

    Laparoscopic splenectomy (LS) has become an established standard of care in the management of surgical diseases of the spleen. The present article is an update and review of current procedures and controversies regarding minimally invasive splenectomy. We review the indications and contraindications for LS as well as preoperative considerations. An individual assessment of the procedures and outcomes of multiport laparoscopic splenectomy, hand-assisted laparoscopic splenectomy, robotic splenectomy, natural orifice transluminal endoscopic splenectomy and single-port splenectomy is included. Furthermore, this review examines postoperative considerations after LS, including the postoperative course of uncomplicated patients, postoperative portal vein thrombosis, infections and malignancy.

  9. Sensorless Force Sensing for Minimally Invasive Surgery

    PubMed Central

    Zhao, Baoliang; Nelson, Carl A.

    2015-01-01

    Robotic minimally invasive surgery (R-MIS) has achieved success in various procedures; however, the lack of haptic feedback is considered by some to be a limiting factor. The typical method to acquire tool–tissue reaction forces is attaching force sensors on surgical tools, but this complicates sterilization and makes the tool bulky. This paper explores the feasibility of using motor current to estimate tool-tissue forces and demonstrates acceptable results in terms of time delay and accuracy. This sensorless force estimation method sheds new light on the possibility of equipping existing robotic surgical systems with haptic interfaces that require no sensors and are compatible with existing sterilization methods. PMID:27222680
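
    The underlying estimate is simple enough to sketch. In the model below all constants and the signal interface are hypothetical, not from the paper; the point is that once motor torque is inferred from current, subtracting inertial and friction torques leaves the load torque, hence the tip force:

    ```python
    # Sketch of sensorless force estimation from motor current
    # (constants and signal names are hypothetical assumptions).
    def estimate_tip_force(current_A, omega, alpha,
                           kt=0.05,   # Nm/A, motor torque constant (assumed)
                           J=2e-5,    # kg*m^2, reflected inertia (assumed)
                           b=1e-4,    # Nm*s, viscous friction (assumed)
                           r=0.01):   # m, effective moment arm (assumed)
        tau_motor = kt * current_A            # electromagnetic torque
        tau_load = tau_motor - J * alpha - b * omega
        return tau_load / r                   # N at the instrument tip

    # e.g. 0.4 A during slow, near-constant motion -> ~2 N contact force
    print(f"{estimate_tip_force(0.4, omega=0.1, alpha=0.0):.2f} N")
    ```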

  10. Area Minimizing Discs in Metric Spaces

    NASA Astrophysics Data System (ADS)

    Lytchak, Alexander; Wenger, Stefan

    2017-03-01

    We solve the classical problem of Plateau in the setting of proper metric spaces. Precisely, we prove that among all disc-type surfaces with prescribed Jordan boundary in a proper metric space there exists an area minimizing disc which moreover has a quasi-conformal parametrization. If the space supports a local quadratic isoperimetric inequality for curves, we prove that such a solution is locally Hölder continuous in the interior and continuous up to the boundary. Our results generalize corresponding results of Douglas, Radó and Morrey from the setting of Euclidean space and Riemannian manifolds to that of proper metric spaces.

  11. Minimal Hepatic Encephalopathy Impairs Quality of Life

    PubMed Central

    Agrawal, Swastik; Umapathy, Sridharan; Dhiman, Radha K.

    2015-01-01

    Minimal hepatic encephalopathy (MHE) is the mildest form of the spectrum of neurocognitive impairment in cirrhosis. It is a frequent occurrence in patients of cirrhosis and is detectable only by specialized neurocognitive testing. MHE is a clinically significant disorder which impairs daily functioning, driving performance, work capability and learning ability. It also predisposes to the development of overt hepatic encephalopathy, increased falls and increased mortality. This results in impaired quality of life for the patient as well as significant social and economic burden for health providers and care givers. Early detection and treatment of MHE with ammonia lowering therapy can reverse MHE and improve quality of life. PMID:26041957

  12. Functional minimization problems in image processing

    NASA Astrophysics Data System (ADS)

    Kim, Yunho; Vese, Luminita A.

    2008-02-01

    In this work we wish to recover an unknown image from a blurry version. We solve this inverse problem by energy minimization and regularization. We seek a solution of the form u + v, where u is a function of bounded variation (cartoon component), while v is an oscillatory component (texture), modeled by a Sobolev function with negative degree of differentiability. Experimental results show that this cartoon + texture model better recovers textured details in natural images, by comparison with the more standard models where the unknown is restricted only to the space of functions of bounded variation.
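
    A schematic energy consistent with this description (notation ours; K denotes the blur operator and f the observed image):

    ```latex
    % Cartoon + texture deblurring energy, in schematic form:
    \[
      \inf_{u,v}\; \Bigl\{\, |u|_{BV}
      \;+\; \mu\,\lVert v\rVert_{H^{-s}}
      \;+\; \lambda\,\lVert f - K(u+v)\rVert_{L^{2}}^{2} \,\Bigr\},
      \qquad s > 0,
    \]
    % so the recovered image u + v splits into a bounded-variation
    % cartoon u and an oscillatory texture v modeled in a Sobolev
    % space of negative differentiability.
    ```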

  13. Minimally invasive surgery for esophageal achalasia.

    PubMed

    Chen, Huan-Wen; Du, Ming

    2016-07-01

    Esophageal achalasia is a functional esophageal disorder caused by neuromuscular dysfunction of the esophagus. Its main features are the absence of esophageal peristalsis, elevated lower esophageal sphincter pressure, and impaired relaxation of the sphincter in response to swallowing. Myotomy of the lower esophagus is one of the main treatments for esophageal achalasia, and thoracoscopic esophageal myotomy is currently one of its established forms. Drawing on our experience in minimally invasive esophageal surgery, we have improved the incision and operative procedure and adopted complete thoracoscopic esophageal myotomy in the treatment of esophageal achalasia.

  14. Periodical cicadas: A minimal automaton model

    NASA Astrophysics Data System (ADS)

    de O. Cardozo, Giovano; de A. M. M. Silvestre, Daniel; Colato, Alexandre

    2007-08-01

    The Magicicada spp. life cycles, with their prime periods and highly synchronized emergence, have defied reasonable scientific explanation since their discovery. During the last decade several models and explanations for this phenomenon appeared in the literature, along with a great deal of discussion. Despite this considerable effort, there is no final conclusion about this long-standing biological problem. Here, we construct a minimal automaton model without predation/parasitism which reproduces some of these aspects. Our results point towards competition between different strains with a limited dispersal threshold as the main factor leading to the emergence of prime-numbered life cycles.

  15. Minimal relativistic three-particle equations

    SciTech Connect

    Lindesay, J.

    1981-07-01

    A minimal self-consistent set of covariant and unitary three-particle equations is presented. Numerical results are obtained for three-particle bound states, elastic scattering and rearrangement of bound pairs with a third particle, and amplitudes for breakup into states of three free particles. The mathematical form of the three-particle bound state equations is explored; constraints are set upon the range of eigenvalues and number of eigenstates of these one parameter equations. The behavior of the number of eigenstates as the two-body binding energy decreases to zero in a covariant context generalizes results previously obtained non-relativistically by V. Efimov.

  16. Solar array stepping to minimize array excitation

    NASA Technical Reports Server (NTRS)

    Bhat, Mahabaleshwar K. P. (Inventor); Liu, Tung Y. (Inventor); Plescia, Carl T. (Inventor)

    1989-01-01

    Mechanical oscillations of a mechanism containing a stepper motor, such as a solar-array powered spacecraft, are reduced and minimized by the execution of step movements in pairs of steps, the period between steps being equal to one-half of the period of torsional oscillation of the mechanism. Each pair of steps is repeated at needed intervals to maintain desired continuous movement of the portion of elements to be moved, such as the solar array of a spacecraft. In order to account for uncertainty as well as slow change in the period of torsional oscillation, a command unit may be provided for varying the interval between steps in a pair.
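
    The cancellation claim is easy to verify for an idealized undamped mode: two equal impulses half a period apart excite responses 180 degrees out of phase. A toy check, with an assumed mode frequency:

    ```python
    import numpy as np

    # Toy check of the step-pairing idea (parameters are assumed): for an
    # undamped torsional mode, sin(w*t) + sin(w*(t - T/2)) = 0 for t > T/2,
    # so a pair of steps separated by half the oscillation period leaves
    # (ideally) no residual vibration.
    omega = 2 * np.pi * 0.5          # rad/s, assumed 0.5 Hz torsional mode
    T = 2 * np.pi / omega            # oscillation period
    t = np.linspace(0, 20, 4001)

    def impulse_response(t, t0):     # unit-impulse response of the mode
        return np.where(t >= t0, np.sin(omega * (t - t0)), 0.0)

    single = impulse_response(t, 0.0)
    pair = impulse_response(t, 0.0) + impulse_response(t, T / 2)

    mask = t > T                     # residual after both steps have fired
    print(f"residual, single step: {np.abs(single[mask]).max():.3f}")
    print(f"residual, paired step: {np.abs(pair[mask]).max():.1e}")   # ~0
    ```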

  17. Software engineering methodologies and tools

    NASA Technical Reports Server (NTRS)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has only increased one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  18. Expert System Development Methodology (ESDM)

    NASA Technical Reports Server (NTRS)

    Sary, Charisse; Gilstrap, Lewey; Hull, Larry G.

    1990-01-01

    The Expert System Development Methodology (ESDM) provides an approach to developing expert system software. Because of the uncertainty associated with this process, an element of risk is involved. ESDM is designed to address the issue of risk and to acquire the information needed for this purpose in an evolutionary manner. ESDM presents a life cycle in which a prototype evolves through five stages of development. Each stage consists of five steps, leading to a prototype for that stage. Development may proceed to a conventional development methodology (CDM) at any time if enough has been learned about the problem to write requirements. ESDM produces requirements so that a product may be built with a CDM. ESDM is considered preliminary because it has not yet been applied to actual projects. It has been retrospectively evaluated by comparing it with the methods used in two ongoing expert system development projects that did not explicitly choose to use this methodology but which provided useful insights into actual expert system development practices and problems.

  19. Energy Efficiency Indicators Methodology Booklet

    SciTech Connect

    Sathaye, Jayant; Price, Lynn; McNeil, Michael; de la rue du Can, Stephane

    2010-05-01

    This Methodology Booklet provides a comprehensive review and guiding methodological principles for constructing energy efficiency indicators, with illustrative examples of application to individual countries. It reviews work done by international agencies and national governments in constructing meaningful energy efficiency indicators that help policy makers assess changes in energy efficiency over time. Building on past OECD experience and best practices, and the knowledge of these countries' institutions, relevant sources of information for constructing an energy indicator database are identified. A framework based on a hierarchy of indicators -- spanning from aggregate, macro-level to disaggregated end-use-level metrics -- is presented to help structure the assessment of energy efficiency. For each sector of activity (industry, commercial, residential, agriculture, and transport), indicators are presented and recommendations to distinguish the different factors affecting energy use are highlighted. The methodology booklet specifically addresses issues that are relevant to developing indicators where activity is a major factor driving energy demand. A companion spreadsheet tool is available upon request.
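
    A standard identity of the kind used in such indicator hierarchies decomposes sectoral energy use into activity, structure, and intensity terms (the notation here is generic, not the booklet's):

      E \;=\; \sum_i A \, S_i \, I_i, \qquad S_i = \frac{A_i}{A}, \qquad I_i = \frac{E_i}{A_i}

    where A is total activity, A_i and E_i are the activity and energy use of sub-sector i, S_i captures structural shifts, and I_i is the sub-sector's energy intensity. Tracking I_i over time isolates efficiency changes from activity and structure effects.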

  20. Methodology for astronaut reconditioning research.

    PubMed

    Beard, David J; Cook, Jonathan A

    2017-01-01

    Space medicine offers some unique challenges, especially in terms of research methodology. A specific challenge for astronaut reconditioning involves identifying which aspects of terrestrial research methodology hold and which require modification. This paper reviews this area and presents appropriate solutions where possible. It is concluded that spaceflight rehabilitation research should remain question/problem driven and is broadly similar to terrestrial research on small populations, such as rare diseases and various sports. Astronauts and Medical Operations personnel should be involved at all levels to ensure feasibility of research protocols. There is room for creative and hybrid methodology, but careful systematic observation is likely to be more achievable and fruitful than complex trial-based comparisons. Multi-space-agency collaboration will be critical to pool data from small groups of astronauts, with the accepted use of standardised outcome measures across all agencies. Systematic reviews will be an essential component. Most limitations relate to the inherent small sample size available for human spaceflight research. Early adoption of a co-operative model for spaceflight rehabilitation research is therefore advised. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Minimal model of financial stylized facts.

    PubMed

    Delpini, Danilo; Bormetti, Giacomo

    2011-04-01

    In this work we propose a statistical characterization of a linear stochastic volatility model featuring inverse-gamma stationary distribution for the instantaneous volatility. We detail the derivation of the moments of the return distribution, revealing the role of the inverse-gamma law in the emergence of fat tails and of the relevant correlation functions. We also propose a systematic methodology for estimating the parameters and we describe the empirical analysis of the Standard & Poor's 500 index daily returns, confirming the ability of the model to capture many of the established stylized facts as well as the scaling properties of empirical distributions over different time horizons. ©2011 American Physical Society
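
    A minimal simulation sketch of a volatility process in this family: a linear mean-reverting diffusion of GARCH-diffusion form, dv = a(b - v)dt + c v dW, whose stationary law for the variance v is inverse-gamma, with returns dx = sqrt(v) dB. The parameters below are illustrative and need not match the paper's exact specification or fit.

      # Euler-Maruyama simulation of a GARCH-diffusion volatility model
      # (illustrative parameters; the stationary variance is inverse-gamma).
      import math
      import random

      random.seed(0)
      a, b, c = 2.0, 0.04, 0.8    # mean reversion, long-run variance, vol-of-vol
      dt, n = 1e-3, 200_000
      v, returns = b, []
      for _ in range(n):
          dW = random.gauss(0.0, math.sqrt(dt))
          dB = random.gauss(0.0, math.sqrt(dt))
          v = max(v + a * (b - v) * dt + c * v * dW, 1e-8)  # keep variance positive
          returns.append(math.sqrt(v) * dB)

      # positive excess kurtosis signals the fat tails induced by the
      # inverse-gamma volatility law
      m = sum(returns) / n
      m2 = sum((r - m) ** 2 for r in returns) / n
      m4 = sum((r - m) ** 4 for r in returns) / n
      print("excess kurtosis:", m4 / m2 ** 2 - 3)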

  2. Prioritization methodology for chemical replacement

    NASA Technical Reports Server (NTRS)

    Cruit, Wendy; Goldberg, Ben; Schutzenhofer, Scott

    1995-01-01

    Since United States federal legislation banned the production of ozone-depleting chemicals (Class 1 and 2), the National Aeronautics and Space Administration (NASA) and industry have been required to find other chemicals and methods to replace these target chemicals. This project was initiated to develop a prioritization methodology suitable for assessing and ranking existing processes for replacement 'urgency.' The methodology was produced in the form of a workbook (NASA Technical Paper 3421). The final workbook contains two tools, one for evaluation and one for prioritization. The two tools are interconnected in that they were developed from one central theme - chemical replacement due to imposed laws and regulations. This workbook provides matrices, detailed explanations of how to use them, and a detailed methodology for prioritization of replacement technology. The main objective is to provide a guideline to help direct the research for replacement technology. The approach to prioritization called for a system which would result in a numerical rating for the chemicals and processes being assessed. A Quality Function Deployment (QFD) technique was used to determine numerical values corresponding to the concerns raised and their respective importance to the process. This workbook defines the approach and the application of the QFD matrix. This technique: (1) provides a standard database for technology that can be easily reviewed, and (2) provides a standard format for information when requesting resources for further research for chemical replacement technology. Originally, this workbook was to be used for Class 1 and Class 2 chemicals, but it was specifically designed to be flexible enough to be used for any chemical used in a process (if the chemical and/or process needs to be replaced). The methodology consists of comparison matrices (and the smaller comparison components) which allow replacement technology
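
    The QFD-style scoring idea can be sketched in a few lines: rate each process against weighted concerns and rank by the weighted sum. The criteria, weights, and ratings below are hypothetical placeholders, not the workbook's actual matrices.

      # Hypothetical QFD-style prioritization: weighted-sum scoring of
      # processes against concerns (all names and numbers illustrative).
      criteria = {"ozone depletion": 0.4, "worker exposure": 0.3,
                  "regulatory deadline": 0.2, "usage volume": 0.1}

      processes = {  # process -> rating (1-9) against each concern
          "vapor degreasing (CFC-113)": {"ozone depletion": 9, "worker exposure": 6,
                                         "regulatory deadline": 9, "usage volume": 7},
          "hand-wipe cleaning (IPA)":   {"ozone depletion": 1, "worker exposure": 4,
                                         "regulatory deadline": 2, "usage volume": 5},
      }

      scores = {p: sum(criteria[c] * r for c, r in ratings.items())
                for p, ratings in processes.items()}
      for p, s in sorted(scores.items(), key=lambda kv: -kv[1]):
          print(f"{s:5.2f}  {p}")  # higher score = higher replacement urgency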

  3. Minimizing solids buildup in cooling towers

    SciTech Connect

    Barzuza, I.

    1995-10-01

    The quality of water passing through a cooling tower affects its operation and performance as a heat exchanger. Since evaporative cooling is usually the primary mode of operation, any dirt or solid impurity in the water tends to become concentrated in the closed-cycle tower as the water is evaporated. To reduce the buildup of impurities and thus minimize fouling, the cooling tower is periodically or continuously discharged or blown down and fresh makeup water added. An effective technique used to further minimize dirt load is to install a filter on a side stream of the tower. But it is often difficult to prove any cost justification for a particular type of filtration equipment. Presented below is an algorithm to calculate the changes in solids load in a cooling tower after the installation of a filtration system of known performance. This calculation constitutes the first step toward full-fledged economic evaluation of a filter's cost-effectiveness. Furthermore, a method is introduced to operate self-cleaning strainers installed on the side stream of a cooling tower. The method increases the removal efficiency of particles smaller than the mesh size of the filter cake.
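
    The first step of such a calculation is a steady-state solids balance: solids enter with the makeup water and leave via blowdown and the side-stream filter. A sketch with illustrative numbers (the symbols and the single-pass removal efficiency are assumptions, not the article's algorithm):

      # Steady-state solids balance around a cooling tower with a
      # side-stream filter:  M*Cm = B*C + eta*F*C,  so
      # C = M*Cm / (B + eta*F).  All numbers are illustrative.
      E = 50.0      # evaporation, gpm (removes pure water only)
      B = 10.0      # blowdown, gpm
      M = E + B     # makeup flow from the water balance, gpm
      Cm = 100.0    # solids concentration in makeup water, ppm
      F = 20.0      # side-stream flow through the filter, gpm
      eta = 0.6     # assumed single-pass solids removal efficiency

      for flow, eff in [(0.0, 0.0), (F, eta)]:
          C = M * Cm / (B + eff * flow)
          print(f"filter flow {flow:4.1f} gpm -> circulating solids {C:6.1f} ppm")
      # the filter cuts the circulating solids load at the same blowdown rate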

  4. Minimally invasive treatment options in fixed prosthodontics.

    PubMed

    Edelhoff, Daniel; Liebermann, Anja; Beuer, Florian; Stimmelmayr, Michael; Güth, Jan-Frederik

    2016-03-01

    Minimally invasive treatment options have become increasingly feasible in restorative dentistry, due to the introduction of the adhesive technique in combination with restorative materials featuring translucent properties similar to those of natural teeth. Mechanical anchoring of restorations via conventional cementation represents a predominantly subtractive treatment approach that is gradually being superseded by a primarily defect-oriented additive method in prosthodontics. Modifications of conventional treatment procedures have led to the development of an economical approach to the removal of healthy tooth structure. This is possible because the planned treatment outcome is defined in a wax-up before the treatment is commenced and this wax-up is subsequently used as a reference during tooth preparation. Similarly, resin-bonded FDPs and implants have made it possible to preserve the natural tooth structure of potential abutment teeth. This report describes a number of clinical cases to demonstrate the principles of modern prosthetic treatment strategies and discusses these approaches in the context of minimally invasive prosthetic dentistry.

  5. Navy Shipboard Hazardous Material Minimization Program

    SciTech Connect

    Bieberich, M.J.; Robinson, P.; Chastain, B.

    1994-12-31

    The use of hazardous (and potentially hazardous) materials in shipboard cleaning applications has proliferated as new systems and equipment have entered the fleet to reside alongside existing equipment. With growing environmental awareness (and additional, more restrictive regulations) at all levels/echelon commands of the DoD, the Navy has initiated a proactive program to minimize or eliminate these hazardous materials (HMs) at the source. This paper will focus on the current Shipboard Hazardous Material Minimization Program initiatives, including the identification of authorized HM currently used onboard, identification of potential substitute materials for HM replacement, identification of new cleaning technologies and processes/procedures, and identification of technical documents which will require revision to eliminate the procurement of HMs into the federal supply system. Also discussed will be the anticipated path required to implement the changes into the fleet and the automated decision processes (substitution algorithm) currently employed. The paper will also present the most recent technologies identified for approval or additional testing and analysis, including: supercritical CO₂ cleaning, high-pressure blasting (H₂O + baking soda), aqueous and semi-aqueous cleaning materials and processes, solvent replacements and dedicated parts-washing systems with internal filtering capabilities, and automated software for solvent/cleaning process substitute selection. Along with these technological advances, data availability (from on-line databases and CD-ROM database libraries) will be identified and discussed.

  6. Surgical efficacy of minimally invasive thoracic discectomy.

    PubMed

    Elhadi, Ali M; Zehri, Aqib H; Zaidi, Hasan A; Almefty, Kaith K; Preul, Mark C; Theodore, Nicholas; Dickman, Curtis A

    2015-11-01

    We aimed to determine the clinical indications and surgical outcomes for thoracoscopic discectomy. Thoracic disc disease is a rare degenerative process. Thoracoscopic approaches serve to minimize tissue injury during the approach, but critics argue that this comes at the cost of surgical efficacy. Current reports in the literature are limited to small institutional patient series. We systematically identified all English language articles on thoracoscopic discectomy with at least two patients, published from 1994 to 2013 on MEDLINE, Science Direct, and Google Scholar. We analyzed 12 articles that met the inclusion criteria, five prospective and seven retrospective studies comprising 545 surgical patients. The overall complication rate was 24% (n=129), with reported complications ranging from intercostal neuralgia (6.1%), atelectasis (2.8%), and pleural effusion (2.6%), to more severe complications such as pneumonia (0.8%), pneumothorax (1.3%), and venous thrombosis (0.2%). The average reported postoperative follow-up was 20.5 months. Complete resolution of symptoms was reported in 79% of patients, improvement with residual symptoms in 10.2%, no change in 9.6%, and worsening in 1.2%. The minimally invasive endoscopic approaches to the thoracic spine among selected patients demonstrate excellent clinical efficacy and acceptable complication rates, comparable to the open approaches. Disc herniations confined to a single level, with small or no calcifications, are ideal for such an approach, whereas patients with calcified discs adherent to the dura would benefit from an open approach.

  7. Utilization of biocatalysts in cellulose waste minimization

    SciTech Connect

    Woodward, J.; Evans, B.R.

    1996-09-01

    Cellulose, a polymer of glucose, is the principal component of biomass and, therefore, a major source of waste that is either buried or burned. Examples of biomass waste include agricultural crop residues, forestry products, and municipal wastes. Recycling of this waste is important for energy conservation as well as waste minimization, and there is some probability that in the future biomass could become a major energy source and replace the fossil fuels that are currently used for fuels and chemicals production. It has been estimated that in the United States, between 100 and 450 million dry tons of agricultural waste and approximately 6 million dry tons of animal waste are produced annually; of the 190 million tons of municipal solid waste (MSW) generated annually, approximately two-thirds is cellulosic in nature and over one-third is paper waste. Interestingly, more than 70% of MSW is landfilled or burned; however, landfill space is becoming increasingly scarce. On a smaller scale, important cellulosic products such as cellulose acetate also present waste problems; an estimated 43 thousand tons of cellulose ester waste are generated annually in the United States. Biocatalysts could be used in cellulose waste minimization, and this chapter describes their characteristics and potential in bioconversion and bioremediation processes.

  8. Minimally invasive thyroidectomy (MIT): indications and results.

    PubMed

    Docimo, Giovanni; Salvatore Tolone, Salvatore; Gili, Simona; d'Alessandro, A; Casalino, G; Brusciano, L; Ruggiero, Roberto; Docimo, Ludovico

    2013-01-01

    To establish whether selecting the approach to thyroidectomy and the incision length by means of pre-operative assessment of gland volume and nodule size results in safe and effective outcomes and in any notable aesthetic or quality-of-life impact on patients. Nine hundred eighty-two consecutive patients undergoing total thyroidectomy were enrolled. The thyroid volume and maximal nodule diameter were measured by means of ultrasound. Based on the ultrasound findings, patients were divided into three groups: minimally invasive video-assisted thyroidectomy (MIVAT), minimally invasive thyroidectomy (MIT) and conventional thyroidectomy (CT) groups. Data were collected on the following parameters: operative time, postoperative complications, postoperative pain and cosmetic results. The MIVAT group included 179 patients, the MIT group 592 patients and the CT group 211 patients. The incidence of complications did not differ significantly among the groups. In the MIVAT and MIT groups, the perception of postoperative pain was less intense than in the CT group. The patients in the MIVAT (7±1.5) and MIT (8±2) groups were more satisfied with the cosmetic results than those in the CT group (5±1.3) (p < 0.05). MIT is a fully reproducible technique, easily convertible to a conventional procedure, that respects the patient, without additional complications or increased costs, and with better aesthetic results.

  9. Singlet-stabilized minimal gauge mediation

    SciTech Connect

    Curtin, David; Tsai, Yuhsin

    2011-04-01

    We propose singlet-stabilized minimal gauge mediation as a simple Intriligator, Seiberg and Shih (ISS)-based model of direct gauge mediation which avoids both light gauginos and Landau poles. The hidden sector is a massive s-confining supersymmetric QCD that is distinguished by a minimal SU(5) flavor group. The uplifted vacuum is stabilized by coupling the meson to an additional singlet sector with its own U(1) gauge symmetry via nonrenormalizable interactions suppressed by a higher scale Λ_UV in the electric theory. This generates a nonzero vacuum expectation value for the singlet meson via the inverted hierarchy mechanism, but requires tuning to a precision ~(Λ/Λ_UV)², which is ~10⁻⁴. In the course of this analysis we also outline some simple model-building rules for stabilizing uplifted-ISS models, which lead us to conclude that meson deformations are required (or at least heavily favored) to stabilize the adjoint component of the magnetic meson.

  10. Osmosis in a minimal model system

    NASA Astrophysics Data System (ADS)

    Lion, Thomas W.; Allen, Rosalind J.

    2012-12-01

    Osmosis is one of the most important physical phenomena in living and soft matter systems. While the thermodynamics of osmosis is well understood, the underlying microscopic dynamical mechanisms remain the subject of discussion. Unravelling these mechanisms is a prerequisite for understanding osmosis in non-equilibrium systems. Here, we investigate the microscopic basis of osmosis, in a system at equilibrium, using molecular dynamics simulations of a minimal model in which repulsive solute and solvent particles differ only in their interactions with an external potential. For this system, we can derive a simple virial-like relation for the osmotic pressure. Our simulations support an intuitive picture in which the solvent concentration gradient, at osmotic equilibrium, arises from the balance between an outward force, caused by the increased total density in the solution, and an inward diffusive flux caused by the decreased solvent density in the solution. While more complex effects may occur in other osmotic systems, our results suggest that they are not required for a minimal picture of the dynamic mechanisms underlying osmosis.
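
    In the dilute limit, a virial-like relation for the osmotic pressure reduces to the familiar van 't Hoff form (the notation here is generic, not the paper's exact expression):

      \Pi \;=\; \rho_s k_B T \;+\; B_2 \rho_s^2 \;+\; \cdots

    where ρ_s is the solute number density and B_2 a second-virial-type coefficient encoding particle interactions; the leading ideal term alone is the textbook osmotic pressure of a dilute solution.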

  11. Flavored dark matter beyond Minimal Flavor Violation

    SciTech Connect

    Agrawal, Prateek; Blanke, Monika; Gemmler, Katrin

    2014-10-13

    We study the interplay of flavor and dark matter phenomenology for models of flavored dark matter interacting with quarks. We allow an arbitrary flavor structure in the coupling of dark matter with quarks. This coupling is assumed to be the only new source of violation of the Standard Model flavor symmetry extended by a U(3)_χ associated with the dark matter. We call this ansatz Dark Minimal Flavor Violation (DMFV) and highlight its various implications, including an unbroken discrete symmetry that can stabilize the dark matter. As an illustration we study a Dirac fermionic dark matter χ which transforms as a triplet under U(3)_χ, and is a singlet under the Standard Model. The dark matter couples to right-handed down-type quarks via a colored scalar mediator Φ with a coupling λ. We identify a number of “flavor-safe” scenarios for the structure of λ which are beyond Minimal Flavor Violation. Also, for dark matter and collider phenomenology we focus on the well-motivated case of b-flavored dark matter. Furthermore, the combined flavor and dark matter constraints on the parameter space of λ turn out to be interesting intersections of the individual ones. LHC constraints on simplified models of squarks and sbottoms can be adapted to our case, and monojet searches can be relevant if the spectrum is compressed.

  12. Gamma ray tests of Minimal Dark Matter

    SciTech Connect

    Cirelli, Marco; Sala, Filippo; Taoso, Marco; Hambye, Thomas; Panci, Paolo E-mail: thambye@ulb.ac.be E-mail: filippo.sala@cea.fr

    2015-10-01

    We reconsider the model of Minimal Dark Matter (a fermionic, hypercharge-less quintuplet of the EW interactions) and compute its gamma ray signatures. We compare them with a number of gamma ray probes: the galactic halo diffuse measurements, the galactic center line searches and recent dwarf galaxy observations. We find that the original minimal model, whose mass is fixed at 9.4 TeV by the relic abundance requirement, is constrained by the line searches from the Galactic Center: it is ruled out if the Milky Way possesses a cuspy profile such as NFW, but it is still allowed if it has a cored one. Observations of dwarf spheroidal galaxies are also relevant (in particular searches for lines), and ongoing astrophysical progress on these systems has the potential to eventually rule out the model. We also explore a wider mass range, which applies to the case in which the relic abundance requirement is relaxed. Most of our results can be safely extended to the larger class of multi-TeV WIMP DM annihilating into massive gauge bosons.

  13. Environmental projects. Volume 16: Waste minimization assessment

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The Goldstone Deep Space Communications Complex (GDSCC), located in the Mojave Desert, is part of the National Aeronautics and Space Administration's (NASA's) Deep Space Network (DSN), the world's largest and most sensitive scientific telecommunications and radio navigation network. The Goldstone Complex is operated for NASA by the Jet Propulsion Laboratory. At present, activities at the GDSCC support the operation of nine parabolic dish antennas situated at five separate locations known as 'sites.' Each of the five sites at the GDSCC has one or more antennas, called 'Deep Space Stations' (DSSs). In the course of operation of these DSSs, various hazardous and non-hazardous wastes are generated. In 1992, JPL retained Kleinfelder, Inc., San Diego, California, to quantify the various streams of hazardous and non-hazardous wastes generated at the GDSCC. In June 1992, Kleinfelder, Inc., submitted a report to JPL entitled 'Waste Minimization Assessment.' The present volume is a JPL-expanded version of the Kleinfelder, Inc. report. The 'Waste Minimization Assessment' report did not find any deficiencies in the various waste-management programs now practiced at the GDSCC, and it found that these programs are being carried out in accordance with environmental rules and regulations.

  14. Minimally invasive "pocket incision" aortic valve surgery.

    PubMed

    Yakub, M A; Pau, K K; Awang, Y

    1999-02-01

    A minimally invasive approach to aortic valve surgery through a transverse incision ("pocket incision") at the right second intercostal space was examined. Sixteen patients with a mean age of 30 years underwent this approach. The third costal cartilage was either excised (n = 5) or dislocated (n = 11). The right internal mammary artery was preserved. Cardiopulmonary bypass (CPB) was established with aortic-right atrial cannulation in all except the first case. Aortic valve replacements (AVR) were performed in 15 patients and one patient had aortic valve repair with concomitant ventricular septal defect closure. There was no mortality and there were no major complications. The aortic cross-clamp, CPB and operative times were 72 ± 19 min, 105 ± 26 min and 3 hr 00 min ± 29 min respectively. The mean time to extubation was 5.7 ± 4.0 hr, ICU stay 27 ± 9 hr and postoperative hospital stay 5.1 ± 1.2 days. Minimally invasive "pocket incision" aortic valve surgery is technically feasible and safe. It has the advantages of central cannulation for CPB, preservation of the internal mammary artery and avoidance of sternotomy. This approach is cosmetically acceptable and allows rapid patient recovery.

  15. Flavored dark matter beyond Minimal Flavor Violation

    DOE PAGES

    Agrawal, Prateek; Blanke, Monika; Gemmler, Katrin

    2014-10-13

    We study the interplay of flavor and dark matter phenomenology for models of flavored dark matter interacting with quarks. We allow an arbitrary flavor structure in the coupling of dark matter with quarks. This coupling is assumed to be the only new source of violation of the Standard Model flavor symmetry extended by a U(3)_χ associated with the dark matter. We call this ansatz Dark Minimal Flavor Violation (DMFV) and highlight its various implications, including an unbroken discrete symmetry that can stabilize the dark matter. As an illustration we study a Dirac fermionic dark matter χ which transforms as a triplet under U(3)_χ, and is a singlet under the Standard Model. The dark matter couples to right-handed down-type quarks via a colored scalar mediator Φ with a coupling λ. We identify a number of “flavor-safe” scenarios for the structure of λ which are beyond Minimal Flavor Violation. Also, for dark matter and collider phenomenology we focus on the well-motivated case of b-flavored dark matter. Furthermore, the combined flavor and dark matter constraints on the parameter space of λ turn out to be interesting intersections of the individual ones. LHC constraints on simplified models of squarks and sbottoms can be adapted to our case, and monojet searches can be relevant if the spectrum is compressed.

  16. Gamma ray tests of Minimal Dark Matter

    SciTech Connect

    Cirelli, Marco; Hambye, Thomas; Panci, Paolo; Sala, Filippo; Taoso, Marco

    2015-10-12

    We reconsider the model of Minimal Dark Matter (a fermionic, hypercharge-less quintuplet of the EW interactions) and compute its gamma ray signatures. We compare them with a number of gamma ray probes: the galactic halo diffuse measurements, the galactic center line searches and recent dwarf galaxy observations. We find that the original minimal model, whose mass is fixed at 9.4 TeV by the relic abundance requirement, is constrained by the line searches from the Galactic Center: it is ruled out if the Milky Way possesses a cuspy profile such as NFW, but it is still allowed if it has a cored one. Observations of dwarf spheroidal galaxies are also relevant (in particular searches for lines), and ongoing astrophysical progress on these systems has the potential to eventually rule out the model. We also explore a wider mass range, which applies to the case in which the relic abundance requirement is relaxed. Most of our results can be safely extended to the larger class of multi-TeV WIMP DM annihilating into massive gauge bosons.

  17. One hospital's road to waste minimization.

    PubMed

    Hooper, D M

    1994-05-01

    There are many new and exciting waste minimization programs being offered to healthcare facilities. Companies are now making reusable operating packs and gowns that are more efficient than disposables. The selling point is that the system will save healthcare money! The reusable programs do save disposal costs for an institution. Shore Memorial has scheduled a trial evaluation for reusable operating room linens to begin May 1, 1994. The concept has not been difficult to sell to physicians and staff. Perhaps this is because people are generally more aware of their environment and the reasons why it should be protected. The hospital will also be evaluating an IV bottle and bag recycling program. The New Jersey Department of Environmental Protection Agency has given approval to proceed with this type of recycling program, and Shore Memorial is in the process of scheduling this trial program with a local vendor. Waste reduction and recycling in healthcare settings will continue to be challenging because of the diversity of the wastestream and the changing environment facing healthcare. Certainly, healthcare has as much of a responsibility to the well-being of patients as it does to keeping the environment healthy. Returning to the "old way" of doing things, such as reusables, does not have a negative impact on people, but it does have an impact on the environment. Shore Memorial believes it is moving in the right direction with its waste minimization program to make a positive environmental impact.

  18. Direct solution to the minimal generalized pose.

    PubMed

    Miraldo, Pedro; Araujo, Helder

    2015-03-01

    Pose estimation is a relevant problem for imaging systems whose applications range from augmented reality to robotics. In this paper we propose a novel solution for the minimal pose problem, within the framework of generalized camera models and using a planar homography. Within this framework and considering only the geometric elements of the generalized camera models, an imaging system can be modeled by a set of mappings associating image pixels to 3-D straight lines. This mapping is defined in a 3-D world coordinate system. Pose estimation performs the computation of the rigid transformation between the original 3-D world coordinate system and the one in which the camera was calibrated. Using synthetic data, we compare the proposed minimal-based method with the state-of-the-art methods in terms of numerical errors, number of solutions and processing time. From the experiments, we conclude that the proposed method performs better, especially because there is a smaller variation in numerical errors, while results are similar in terms of number of solutions and computation time. To further evaluate the proposed approach we tested our method with real data. One of the relevant contributions of this paper is theoretical. When compared to the state-of-the-art approaches, we propose a completely new parametrization of the problem that can be solved in four simple steps. In addition, our approach does not require any predefined transformation of the dataset, which yields a simpler solution for the problem.

  19. [Theory and practice of minimally invasive endodontics].

    PubMed

    Jiang, H W

    2016-08-01

    The primary goal of modern endodontic therapy is to achieve the long-term retention of a functional tooth by preventing or treating pulpitis or apical periodontitis. The long-term retention of an endodontically treated tooth is correlated with the remaining amount of tooth tissue and with the quality of the restoration after root canal filling. In recent years, there has been rapid progress in basic research on endodontic biology, instruments, and applied materials, making treatment procedures safer, more accurate, and more efficient. Thus, minimally invasive endodontics (MIE) has received increasing attention. MIE aims to preserve the maximum of tooth structure during root canal therapy, and the concept covers the whole process of diagnosis and treatment of teeth. This review article focuses on describing the minimally invasive concepts and operating essentials in endodontics, from diagnosis and treatment planning to access opening, pulp cavity finishing, root canal cleaning and shaping, 3-dimensional root canal filling, and restoration after root canal treatment.

  20. Minimally Invasive Laminectomy in Spondylolisthetic Lumbar Stenosis

    PubMed Central

    Caralopoulos, Ilias N.; Bui, Cuong J.

    2014-01-01

    Background Degenerative lumbar stenosis associated with spondylolisthesis is common in elderly patients. The most common symptoms are those of neurogenic claudication with leg pain. Surgery is indicated for those who fail conservative management. The generally accepted recommendation is to perform a laminectomy and a fusion at the involved level. Methods We reviewed our results for minimally invasive single-level decompression without fusion performed by the senior author in patients with symptomatic lumbar stenosis with spondylolisthesis with no dynamic instability from 2008 to 2011 at a single institution. Outcomes were measured using the visual analog scale (VAS), Prolo Economic Functional Rating Scale, and revised Oswestry Disability Index (ODI) at initial presentation and at 3-month, 6-month, and 1-year follow-up time points. Results Records for 28 patients (19 males, 9 females) were reviewed. The success rate, defined as improvement in pain and functional outcome without the need for surgical fusion, was 86%. VAS scores decreased by 6.3 points, Prolo scores increased by 3.5 points, and the ODI decreased by 31% at 1 year. All changes were statistically significant. Conclusion Minimally invasive decompression alone can be a reasonable alternative to decompression and fusion for patients with spondylolisthetic lumbar stenosis and neurogenic claudication with leg pain. Decompression without fusion should be considered for older patients and for patients who are not ideal fusion candidates. PMID:24688331