Science.gov

Sample records for minimal cut-set methodology

  1. Minimal cut-set methodology for artificial intelligence applications

    SciTech Connect

    Weisbin, C.R.; de Saussure, G.; Barhen, J.; Oblow, E.M.; White, J.C.

    1984-01-01

    This paper reviews minimal cut-set theory and illustrates its application with an example. The minimal cut-set approach uses disjunctive normal form in Boolean algebra and various Boolean operators to simplify very complicated tree structures composed of AND/OR gates. The simplification process is automated and performed off-line using existing computer codes to implement the Boolean reduction on the finite but large tree structure. With this approach, on-line expert diagnostic systems, whose response time is critical, could determine directly whether a goal is achievable by comparing the actual system state to a concisely stored set of preprocessed critical state elements.
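    A minimal sketch (in Python, not from the paper) of the on-line check this enables: once the off-line Boolean reduction has produced the minimal cut sets, deciding whether the top event is reachable reduces to subset tests against the observed set of failed components. The event names and cut sets below are hypothetical.

      # Hypothetical preprocessed minimal cut sets and observed failures;
      # the on-line diagnostic step is a plain subset test.
      failed = {"pump_A", "valve_3", "sensor_7"}

      minimal_cut_sets = [
          {"pump_A", "pump_B"},
          {"valve_3", "sensor_7"},
          {"power_bus"},
      ]

      # The top event is implied as soon as every element of some minimal
      # cut set has actually failed.
      top_event = any(cs <= failed for cs in minimal_cut_sets)
      print("top event reachable:", top_event)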

  2. Minimal cut-set methodology for artificial intelligence applications

    SciTech Connect

    Weisbin, C.R.; de Saussure, G.; Barhen, J.; Oblow, E.M.; White, J.C.

    1984-01-01

    This paper suggests that given the considerable (and growing) literature of expert systems for diagnostics and maintenance, consideration of the theory of minimal cut sets should be most beneficial. The minimal cut-set approach uses disjunctive normal form in Boolean algebra and various Boolean operators to simplify very complicated tree structures composed of AND/OR gates. The simplification reduces the tree to an equivalent diagram displaying the smallest combinations of independent component failures which could result in the fault symbolized by the root of the tree and called the top event. This paper reviews minimal cut-set theory and illustrates its application with an example. Using this approach, expert diagnostic systems would have a tool in which, with minimum search, the description of fault causes is made clear and explicit, contributor sequences to a top event fault are easily quantified and ranked, and the probability of the top event is easily computed. Finally, the application of minimal cut sets to planning and problem solving is developed.
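    To make the quantification step concrete, the sketch below ranks hypothetical minimal cut sets and approximates the top-event probability with the standard rare-event (sum of cut-set probabilities) approximation, assuming independent basic events; the numbers and the choice of approximation are illustrative, not taken from the paper.

      from math import prod

      p = {"A": 1e-3, "B": 2e-3, "C": 5e-4}        # hypothetical basic-event probabilities
      minimal_cut_sets = [{"A", "B"}, {"C"}]

      # Probability of each cut set = product of its independent basic events.
      cut_probs = {frozenset(cs): prod(p[e] for e in cs) for cs in minimal_cut_sets}

      # Rank contributors and approximate the top-event probability.
      ranked = sorted(cut_probs.items(), key=lambda kv: kv[1], reverse=True)
      top_event_prob = sum(cut_probs.values())
      for cs, q in ranked:
          print(sorted(cs), q)
      print("top event probability (rare-event approx.):", top_event_prob)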

  3. CUTSETS - MINIMAL CUT SET CALCULATION FOR DIGRAPH AND FAULT TREE RELIABILITY MODELS

    NASA Technical Reports Server (NTRS)

    Iverson, D. L.

    1994-01-01

    Fault tree and digraph models are frequently used for system failure analysis. Both types of models represent a failure space view of the system using AND and OR nodes in a directed graph structure. Fault trees must have a tree structure and do not allow cycles or loops in the graph. Digraphs allow any pattern of interconnection between nodes, including loops, in the graph. A common operation performed on digraph and fault tree models is the calculation of minimal cut sets. A cut set is a set of basic failures that could cause a given target failure event to occur. A minimal cut set for a target event node in a fault tree or digraph is any cut set for the node with the property that if any one of the failures in the set is removed, the occurrence of the other failures in the set will not cause the target failure event. CUTSETS will identify all the minimal cut sets for a given node. The CUTSETS package contains programs that solve for minimal cut sets of fault trees and digraphs using object-oriented programming techniques. These cut set codes can be used to solve graph models for reliability analysis and identify potential single point failures in a modeled system. The fault tree minimal cut set code reads in a fault tree model input file with each node listed in a text format. In the input file the user specifies a top node of the fault tree and a maximum cut set size to be calculated. CUTSETS will find minimal sets of basic events which would cause the failure at the output of a given fault tree gate. The program can find all the minimal cut sets of a node, or minimal cut sets up to a specified size. The algorithm performs a recursive top-down parse of the fault tree, starting at the specified top node, and combines the cut sets of each child node into sets of basic event failures that would cause the failure event at the output of that gate. Minimal cut set solutions can be found for all nodes in the fault tree or just for the top node. The digraph cut set code uses the same
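    The recursive top-down expansion described above can be sketched in a few lines of Python; the nested AND/OR encoding and the absorption step are my own illustration, not the CUTSETS code itself.

      def cut_sets(node, tree):
          """Return the minimal cut sets (frozensets of basic events) for `node`."""
          if node not in tree:                       # basic event: its own cut set
              return [frozenset([node])]
          gate, children = tree[node]
          child_sets = [cut_sets(c, tree) for c in children]
          if gate == "OR":                           # union of the children's cut sets
              combined = [cs for sets in child_sets for cs in sets]
          else:                                      # AND: combine one cut set from each child
              combined = [frozenset()]
              for sets in child_sets:
                  combined = [a | b for a in combined for b in sets]
          combined = list(set(combined))             # remove duplicates
          # Absorption: drop any cut set that strictly contains another one.
          return [cs for cs in combined if not any(other < cs for other in combined)]

      # Hypothetical fault tree: TOP fails if (E1 and E2) or E3.
      tree = {"TOP": ("OR", ["G1", "E3"]), "G1": ("AND", ["E1", "E2"])}
      print(cut_sets("TOP", tree))                   # two minimal cut sets: {E1, E2} and {E3}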

  4. A new efficient algorithm generating all minimal S-T cut-sets in a graph-modeled network

    NASA Astrophysics Data System (ADS)

    Malinowski, Jacek

    2016-06-01

    A new algorithm finding all minimal s-t cut-sets in a graph-modeled network with failing links and nodes is presented. It is based on the analysis of the tree of acyclic s-t paths connecting a given pair of nodes in the considered structure. The construction of such a tree is required by many existing algorithms for s-t cut-set generation in order to eliminate "stub" edges or subgraphs through which no acyclic path passes. The algorithm operates on the acyclic paths tree alone, i.e. no other analysis of the network's topology is necessary. It can be applied to both directed and undirected graphs, as well as partly directed ones. It is worth noting that the cut-sets can be composed of both link and node failures, while many known algorithms do not take nodes into account, which is quite restricting from the practical point of view. The developed cut-set generation technique makes the algorithm significantly faster than most of the previous methods, as proved by the experiments.
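    The path/cut duality the algorithm exploits can be illustrated by brute force: every minimal s-t cut set is a minimal "hitting set" of the acyclic s-t paths. The exhaustive scan below is only an illustration of that relationship, not the paper's (much faster) algorithm, and the network elements are hypothetical.

      from itertools import combinations

      # Each acyclic s-t path is the set of elements (links and/or nodes) it uses.
      paths = [{"L1", "n2", "L3"}, {"L2", "n2", "L3"}, {"L4"}]
      elements = sorted(set().union(*paths))

      def is_cut(candidate):
          return all(candidate & path for path in paths)   # breaks every s-t path

      minimal_cuts = []
      for r in range(1, len(elements) + 1):                # smallest candidates first
          for cand in map(set, combinations(elements, r)):
              if is_cut(cand) and not any(mc <= cand for mc in minimal_cuts):
                  minimal_cuts.append(cand)
      print(minimal_cuts)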

  5. FTA Basic Event & Cut Set Ranking.

    Energy Science and Technology Software Center (ESTSC)

    1999-05-04

    Version 00 IMPORTANCE computes various measures of probabilistic importance of basic events and minimal cut sets to a fault tree or reliability network diagram. The minimal cut sets, the failure rates and the fault duration times (i.e., the repair times) of all basic events contained in the minimal cut sets are supplied as input data. The failure and repair distributions are assumed to be exponential. IMPORTANCE, a quantitative evaluation code, then determines the probability of the top event and computes the importance of minimal cut sets and basic events by a numerical ranking. Two measures are computed. The first describes system behavior at one point in time; the second describes sequences of failures that cause the system to fail in time. All measures are computed assuming statistical independence of basic events. In addition, system unavailability and expected number of system failures are computed by the code.
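    A sketch of this kind of quantification, under the same assumptions stated above (independent basic events, exponential failure and repair): steady-state unavailabilities are formed from the failure rates and fault duration times, the top event is approximated by the sum of cut-set probabilities, and basic events are ranked by the Fussell-Vesely measure. The specific measures in IMPORTANCE may differ, and all numbers are hypothetical.

      from math import prod

      lam = {"A": 1e-4, "B": 5e-5, "C": 2e-4}      # failure rates (per hour)
      tau = {"A": 24.0, "B": 8.0, "C": 24.0}       # fault duration (repair) times, hours
      # Steady-state unavailability for exponential failure/repair.
      q = {e: lam[e] * tau[e] / (1 + lam[e] * tau[e]) for e in lam}

      cut_sets = [{"A", "B"}, {"C"}]
      cs_prob = {frozenset(cs): prod(q[e] for e in cs) for cs in cut_sets}
      top = sum(cs_prob.values())                  # rare-event approximation

      # Fussell-Vesely importance: share of the top-event probability carried
      # by the cut sets containing each basic event.
      fv = {e: sum(pr for cs, pr in cs_prob.items() if e in cs) / top for e in lam}
      print(sorted(fv.items(), key=lambda kv: kv[1], reverse=True))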

  6. SIGPI. Fault Tree Cut Set System Performance

    SciTech Connect

    Patenaude, C.J.

    1992-01-14

    SIGPI computes the probabilistic performance of complex systems by combining cut set or other binary product data with probability information on each basic event. SIGPI is designed to work with either coherent systems, where the system fails when certain combinations of components fail, or noncoherent systems, where at least one cut set occurs only if at least one component of the system is operating properly. The program can handle conditionally independent components, dependent components, or a combination of component types and has been used to evaluate responses to environmental threats and seismic events. The three data types that can be input are cut set data in disjoint normal form, basic component probabilities for independent basic components, and mean and covariance data for statistically dependent basic components.
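    The reason SIGPI wants cut-set input in disjoint form can be shown in a few lines: when the Boolean products are mutually exclusive, the system probability is simply the sum of the product probabilities, with no inclusion-exclusion step. The encoding and numbers below are illustrative, not SIGPI's input format.

      p = {"A": 0.01, "B": 0.02}                   # independent basic-event probabilities

      # Each disjoint product lists events that must occur and events that must not.
      disjoint_products = [
          ({"A"}, set()),                          # A
          ({"B"}, {"A"}),                          # B and not A (disjoint with the first)
      ]

      def product_prob(occur, not_occur):
          prob = 1.0
          for e in occur:
              prob *= p[e]
          for e in not_occur:
              prob *= (1 - p[e])
          return prob

      system_prob = sum(product_prob(o, n) for o, n in disjoint_products)
      print(system_prob)                           # equals P(A or B) = 1 - (1-0.01)(1-0.02)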

  7. SIGPI. Fault Tree Cut Set System Performance

    SciTech Connect

    Patenaude, C.J.

    1992-01-13

    SIGPI computes the probabilistic performance of complex systems by combining cut set or other binary product data with probability information on each basic event. SIGPI is designed to work with either coherent systems, where the system fails when certain combinations of components fail, or noncoherent systems, where at least one cut set occurs only if at least one component of the system is operating properly. The program can handle conditionally independent components, dependent components, or a combination of component types and has been used to evaluate responses to environmental threats and seismic events. The three data types that can be input are cut set data in disjoint normal form, basic component probabilities for independent basic components, and mean and covariance data for statistically dependent basic components.

  8. Fault Tree Cut Set System Performance.

    Energy Science and Technology Software Center (ESTSC)

    2000-02-21

    Version 00 SIGPI computes the probabilistic performance of complex systems by combining cut set or other binary product data with probability information on each basic event. SIGPI is designed to work with either coherent systems, where the system fails when certain combinations of components fail, or noncoherent systems, where at least one cut set occurs only if at least one component of the system is operating properly. The program can handle conditionally independent components, dependent components, or a combination of component types and has been used to evaluate responses to environmental threats and seismic events. The three data types that can be input are cut set data in disjoint normal form, basic component probabilities for independent basic components, and mean and covariance data for statistically dependent basic components.

  9. Cut set-based risk and reliability analysis for arbitrarily interconnected networks

    DOEpatents

    Wyss, Gregory D.

    2000-01-01

    Method for computing all-terminal reliability for arbitrarily interconnected networks such as the United States public switched telephone network. The method includes an efficient search algorithm to generate minimal cut sets for nonhierarchical networks directly from the network connectivity diagram. Efficiency of the search algorithm stems in part from its basis on only link failures. The method also includes a novel quantification scheme that likewise reduces computational effort associated with assessing network reliability based on traditional risk importance measures. Vast reductions in computational effort are realized since combinatorial expansion and subsequent Boolean reduction steps are eliminated through analysis of network segmentations using a technique of assuming node failures to occur on only one side of a break in the network, and repeating the technique for all minimal cut sets generated with the search algorithm. The method functions equally well for planar and non-planar networks.
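    For illustration only, the exhaustive search below generates minimal link cut sets directly from a small connectivity diagram by testing which sets of link failures disconnect the network; the patented method replaces this combinatorial scan with a far more efficient search, and the example network is hypothetical.

      from itertools import combinations

      nodes = {"a", "b", "c", "d"}
      links = {("a", "b"), ("b", "c"), ("a", "c"), ("c", "d")}

      def connected(remaining_links):
          adj = {n: set() for n in nodes}
          for u, v in remaining_links:
              adj[u].add(v)
              adj[v].add(u)
          seen, stack = {"a"}, ["a"]
          while stack:
              for nbr in adj[stack.pop()]:
                  if nbr not in seen:
                      seen.add(nbr)
                      stack.append(nbr)
          return seen == nodes                     # all-terminal connectivity

      minimal_cuts = []
      for r in range(1, len(links) + 1):           # smallest candidates first
          for cand in map(set, combinations(links, r)):
              if not connected(links - cand) and not any(mc <= cand for mc in minimal_cuts):
                  minimal_cuts.append(cand)
      print(minimal_cuts)                          # the lone link ("c", "d") is a single point of failure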

  10. Energy minimization in medical image analysis: Methodologies and applications.

    PubMed

    Zhao, Feng; Xie, Xianghua

    2016-02-01

    Energy minimization is of particular interest in medical image analysis. In the past two decades, a variety of optimization schemes have been developed. In this paper, we present a comprehensive survey of the state-of-the-art optimization approaches. These algorithms are mainly classified into two categories: continuous method and discrete method. The former includes Newton-Raphson method, gradient descent method, conjugate gradient method, proximal gradient method, coordinate descent method, and genetic algorithm-based method, while the latter covers graph cuts method, belief propagation method, tree-reweighted message passing method, linear programming method, maximum margin learning method, simulated annealing method, and iterated conditional modes method. We also discuss the minimal surface method, primal-dual method, and the multi-objective optimization method. In addition, we review several comparative studies that evaluate the performance of different minimization techniques in terms of accuracy, efficiency, or complexity. These optimization techniques are widely used in many medical applications, for example, image segmentation, registration, reconstruction, motion tracking, and compressed sensing. We thus give an overview on those applications as well. Copyright © 2015 John Wiley & Sons, Ltd. PMID:26186171
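    As a tiny concrete instance of the "continuous" class of methods surveyed here, the sketch below runs gradient descent on a one-dimensional denoising energy (data fidelity plus a quadratic smoothness penalty); it is generic textbook material, not a method reproduced from the survey.

      import numpy as np

      rng = np.random.default_rng(0)
      y = np.sin(np.linspace(0, 3, 200)) + 0.2 * rng.standard_normal(200)   # noisy signal
      w, step = 5.0, 0.05
      x = y.copy()
      for _ in range(500):
          grad_data = 2 * (x - y)                                # d/dx of ||x - y||^2
          grad_smooth = -2 * w * np.gradient(np.gradient(x))     # d/dx of w * ||x'||^2 (approx.)
          x -= step * (grad_data + grad_smooth)
      energy = np.sum((x - y) ** 2) + w * np.sum(np.gradient(x) ** 2)
      print(float(energy))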

  11. Knowledge-based and model-based hybrid methodology for comprehensive waste minimization in electroplating plants

    NASA Astrophysics Data System (ADS)

    Luo, Keqin

    1999-11-01

    The electroplating industry, with over 10,000 plating plants nationwide, is one of the major waste generators in industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are the major wastes generated daily in plants, which costs the industry tremendously for waste treatment and disposal and hinders the further development of the industry. It has therefore become urgent for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize the waste while production competitiveness is still maintained. This dissertation aims at developing a novel WM methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, and an intelligent decision support tool. The WM methodology consists of two parts: the heuristic knowledge-based qualitative WM decision analysis and support methodology and the fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. This becomes the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows us to perform a thorough process analysis on bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation that are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner. The unique contribution of this research is that it is the first time for the electroplating industry to (i) use systematically available WM strategies, (ii) know quantitatively and

  12. A methodology for formulating a minimal uncertainty model for robust control system design and analysis

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Chang, B.-C.; Fischl, Robert

    1989-01-01

    In the design and analysis of robust control systems for uncertain plants, the technique of formulating what is termed an M-delta model has become widely accepted and applied in the robust control literature. The M represents the transfer function matrix M(s) of the nominal system, and delta represents an uncertainty matrix acting on M(s). The uncertainty can arise from various sources, such as structured uncertainty from parameter variations or multiple unstructured uncertainties from unmodeled dynamics and other neglected phenomena. In general, delta is a block diagonal matrix, and for real parameter variations the diagonal elements are real. As stated in the literature, this structure can always be formed for any linear interconnection of inputs, outputs, transfer functions, parameter variations, and perturbations. However, very little of the literature addresses methods for obtaining this structure, and none of this literature addresses a general methodology for obtaining a minimal M-delta model for a wide class of uncertainty. Since having a delta matrix of minimum order would improve the efficiency of structured singular value (or multivariable stability margin) computations, a method of obtaining a minimal M-delta model would be useful. A generalized method of obtaining a minimal M-delta structure for systems with real parameter variations is given.

  13. Using benchmarking to minimize common DOE waste streams. Volume 1, Methodology and liquid photographic waste

    SciTech Connect

    Levin, V.

    1994-04-01

    Finding innovative ways to reduce waste streams generated at Department of Energy (DOE) sites by 50% by the year 2000 is a challenge for DOE's waste minimization efforts. This report examines the usefulness of benchmarking as a waste minimization tool, specifically regarding common waste streams at DOE sites. A team of process experts from a variety of sites, a project leader, and benchmarking consultants completed the project with management support provided by the Waste Minimization Division EM-352. Using a 12-step benchmarking process, the team examined current waste minimization processes for liquid photographic waste used at their sites and used telephone and written questionnaires to find "best-in-class" industry partners willing to share information about their best waste minimization techniques and technologies through a site visit. Eastman Kodak Co. and Johnson Space Center/National Aeronautics and Space Administration (NASA) agreed to be partners. The site visits yielded strategies for source reduction, recycle/recovery of components, regeneration/reuse of solutions, and treatment of residuals, as well as best management practices. An additional benefit of the work was the opportunity for DOE process experts to network and exchange ideas with their peers at similar sites.

  14. Methodology to optimize detector geometry in fluorescence tomography of tissue using the minimized curvature of the summed diffuse sensitivity projections.

    PubMed

    Holt, Robert W; Leblond, Frederic L; Pogue, Brian W

    2013-08-01

    The dependence of the sensitivity function in fluorescence tomography on the geometry of the excitation source and detection locations can severely influence an imaging system's ability to recover fluorescent distributions. Here a methodology for choosing imaging configuration based on the uniformity of the sensitivity function is presented. The uniformity of detection sensitivity is correlated with reconstruction accuracy in silico, and reconstructions in a murine head model show that a detector configuration optimized using Nelder-Mead minimization improves recovery over uniformly sampled tomography. PMID:24323220
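    A sketch of the optimization step named in the title: Nelder-Mead applied to a scalar objective that penalizes non-uniform summed sensitivity over a 1-D "tissue" grid. The toy Gaussian sensitivity model and the std/mean objective are stand-ins for the authors' diffuse sensitivity projections and curvature measure.

      import numpy as np
      from scipy.optimize import minimize

      grid = np.linspace(0.0, 1.0, 100)                    # 1-D tissue positions

      def summed_sensitivity(detectors):
          # Toy sensitivity: a Gaussian centred on each detector position.
          return sum(np.exp(-((grid - d) ** 2) / 0.02) for d in detectors)

      def non_uniformity(detectors):
          s = summed_sensitivity(detectors)
          return np.std(s) / np.mean(s)                    # flatter profile = smaller value

      x0 = np.array([0.2, 0.4, 0.6, 0.8])                  # initial detector layout
      res = minimize(non_uniformity, x0, method="Nelder-Mead")
      print(res.x, res.fun)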

  15. Ensuring transparency and minimization of methodologic bias in preclinical pain research: PPRECISE considerations

    PubMed Central

    Andrews, Nick A.; Latrémolière, Alban; Basbaum, Allan I.; Mogil, Jeffrey S.; Porreca, Frank; Rice, Andrew S.C.; Woolf, Clifford J.; Currie, Gillian L.; Dworkin, Robert H.; Eisenach, James C.; Evans, Scott; Gewandter, Jennifer S.; Gover, Tony D.; Handwerker, Hermann; Huang, Wenlong; Iyengar, Smriti; Jensen, Mark P.; Kennedy, Jeffrey D.; Lee, Nancy; Levine, Jon; Lidster, Katie; Machin, Ian; McDermott, Michael P.; McMahon, Stephen B.; Price, Theodore J.; Ross, Sarah E.; Scherrer, Grégory; Seal, Rebecca P.; Sena, Emily S.; Silva, Elizabeth; Stone, Laura; Svensson, Camilla I.; Turk, Dennis C.; Whiteside, Garth

    2015-01-01

    There is growing concern about lack of scientific rigor and transparent reporting across many preclinical fields of biological research. Poor experimental design and lack of transparent reporting can result in conscious or unconscious experimental bias, producing results that are not replicable. The Analgesic, Anesthetic, and Addiction Clinical Trial Translations, Innovations, Opportunities, and Networks (ACTTION) public–private partnership with the U.S. Food and Drug Administration sponsored a consensus meeting of the Preclinical Pain Research Consortium for Investigating Safety and Efficacy (PPRECISE) Working Group. International participants from universities, funding agencies, government agencies, industry, and a patient advocacy organization attended. Reduction of publication bias, increasing the ability of others to faithfully repeat experimental methods, and increased transparency of data reporting were specifically discussed. Parameters deemed essential to increase confidence in the published literature were clear, specific reporting of an a priori hypothesis and definition of primary outcome measure. Power calculations and whether measurement of minimal meaningful effect size to determine these should be a core component of the preclinical research effort provoked considerable discussion, with many but not all agreeing. Greater transparency of reporting should be driven by scientists, journal editors, reviewers, and grant funders. The conduct of high-quality science that is fully reported should not preclude novelty and innovation in preclinical pain research, and indeed, any efforts that curtail such innovation would be misguided. We believe that to achieve the goal of finding effective new treatments for patients with pain, the pain field needs to deal with these challenging issues. PMID:26683237

  16. A minimally invasive methodology based on morphometric parameters for day 2 embryo quality assessment.

    PubMed

    Molina, Inmaculada; Lázaro-Ibáñez, Elisa; Pertusa, Jose; Debón, Ana; Martínez-Sanchís, Juan Vicente; Pellicer, Antonio

    2014-10-01

    The risk of multiple pregnancy to maternal-fetal health can be minimized by reducing the number of embryos transferred. New tools for selecting embryos with the highest implantation potential should be developed. The aim of this study was to evaluate the ability of morphological and morphometric variables to predict implantation by analysing images of embryos. This was a retrospective study of 135 embryo photographs from 112 IVF-ICSI cycles carried out between January and March 2011. The embryos were photographed immediately before transfer using Cronus 3 software. Their images were analysed using the public program ImageJ. Significant effects (P < 0.05), and higher discriminant power to predict implantation were observed for the morphometric embryo variables compared with morphological ones. The features for successfully implanted embryos were as follows: four cells on day 2 of development; all blastomeres with circular shape (roundness factor greater than 0.9), an average zona pellucida thickness of 13 µm and an average of 17695.1 µm² for the embryo area. Embryo size, which is described by its area and the average roundness factor for each cell, provides two objective variables to consider when predicting implantation. This approach should be further investigated for its potential ability to improve embryo scoring. PMID:25154014

  17. Ensuring transparency and minimization of methodologic bias in preclinical pain research: PPRECISE considerations.

    PubMed

    Andrews, Nick A; Latrémolière, Alban; Basbaum, Allan I; Mogil, Jeffrey S; Porreca, Frank; Rice, Andrew S C; Woolf, Clifford J; Currie, Gillian L; Dworkin, Robert H; Eisenach, James C; Evans, Scott; Gewandter, Jennifer S; Gover, Tony D; Handwerker, Hermann; Huang, Wenlong; Iyengar, Smriti; Jensen, Mark P; Kennedy, Jeffrey D; Lee, Nancy; Levine, Jon; Lidster, Katie; Machin, Ian; McDermott, Michael P; McMahon, Stephen B; Price, Theodore J; Ross, Sarah E; Scherrer, Grégory; Seal, Rebecca P; Sena, Emily S; Silva, Elizabeth; Stone, Laura; Svensson, Camilla I; Turk, Dennis C; Whiteside, Garth

    2016-04-01

    There is growing concern about lack of scientific rigor and transparent reporting across many preclinical fields of biological research. Poor experimental design and lack of transparent reporting can result in conscious or unconscious experimental bias, producing results that are not replicable. The Analgesic, Anesthetic, and Addiction Clinical Trial Translations, Innovations, Opportunities, and Networks (ACTTION) public-private partnership with the U.S. Food and Drug Administration sponsored a consensus meeting of the Preclinical Pain Research Consortium for Investigating Safety and Efficacy (PPRECISE) Working Group. International participants from universities, funding agencies, government agencies, industry, and a patient advocacy organization attended. Reduction of publication bias, increasing the ability of others to faithfully repeat experimental methods, and increased transparency of data reporting were specifically discussed. Parameters deemed essential to increase confidence in the published literature were clear, specific reporting of an a priori hypothesis and definition of primary outcome measure. Power calculations and whether measurement of minimal meaningful effect size to determine these should be a core component of the preclinical research effort provoked considerable discussion, with many but not all agreeing. Greater transparency of reporting should be driven by scientists, journal editors, reviewers, and grant funders. The conduct of high-quality science that is fully reported should not preclude novelty and innovation in preclinical pain research, and indeed, any efforts that curtail such innovation would be misguided. We believe that to achieve the goal of finding effective new treatments for patients with pain, the pain field needs to deal with these challenging issues. PMID:26683237

  18. Towards uniform accelerometry analysis: a standardization methodology to minimize measurement bias due to systematic accelerometer wear-time variation.

    PubMed

    Katapally, Tarun R; Muhajarine, Nazeem

    2014-05-01

    Accelerometers are predominantly used to objectively measure the entire range of activity intensities - sedentary behaviour (SED), light physical activity (LPA) and moderate to vigorous physical activity (MVPA). However, studies consistently report results without accounting for systematic accelerometer wear-time variation (within and between participants), jeopardizing the validity of these results. This study describes the development of a standardization methodology to understand and minimize measurement bias due to wear-time variation. Accelerometry is generally conducted over seven consecutive days, with participants' data being commonly considered 'valid' only if wear-time is at least 10 hours/day. However, even within 'valid' data, there could be systematic wear-time variation. To explore this variation, accelerometer data of Smart Cities, Healthy Kids study (www.smartcitieshealthykids.com) were analyzed descriptively and with repeated measures multivariate analysis of variance (MANOVA). Subsequently, a standardization method was developed, where case-specific observed wear-time is controlled to an analyst specified time period. Next, case-specific accelerometer data are interpolated to this controlled wear-time to produce standardized variables. To understand discrepancies owing to wear-time variation, all analyses were conducted pre- and post-standardization. Descriptive analyses revealed systematic wear-time variation, both between and within participants. Pre- and post-standardized descriptive analyses of SED, LPA and MVPA revealed a persistent and often significant trend of wear-time's influence on activity. SED was consistently higher on weekdays before standardization; however, this trend was reversed post-standardization. Even though MVPA was significantly higher on weekdays both pre- and post-standardization, the magnitude of this difference decreased post-standardization. Multivariable analyses with standardized SED, LPA and MVPA as outcome
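    One reading of the standardization step described above, sketched with hypothetical person-days: wear-time is held at an analyst-chosen value and each day's activity minutes are rescaled to it. This linear rescaling is my interpretation of the abstract, not the authors' code.

      import pandas as pd

      controlled_weartime = 13.0                   # hours/day chosen by the analyst

      days = pd.DataFrame({
          "participant": [1, 1, 2],
          "weartime_h":  [10.5, 14.0, 12.0],
          "sed_min":     [430, 560, 480],
          "lpa_min":     [150, 210, 170],
          "mvpa_min":    [35, 50, 42],
      })

      scale = controlled_weartime / days["weartime_h"]
      for col in ["sed_min", "lpa_min", "mvpa_min"]:
          days[col + "_std"] = days[col] * scale   # interpolate to the controlled wear-time
      print(days)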

  19. Relay chatter and operator response after a large earthquake: An improved PRA methodology with case studies

    SciTech Connect

    Budnitz, R.J.; Lambert, H.E.; Hill, E.E.

    1987-08-01

    The purpose of this project has been to develop and demonstrate improvements in the PRA methodology used for analyzing earthquake-induced accidents at nuclear power reactors. Specifically, the project addresses methodological weaknesses in the PRA systems analysis used for studying post-earthquake relay chatter and for quantifying human response under high stress. An improved PRA methodology for relay-chatter analysis is developed, and its use is demonstrated through analysis of the Zion-1 and LaSalle-2 reactors as case studies. This demonstration analysis is intended to show that the methodology can be applied in actual cases, and the numerical values of core-damage frequency are not realistic. The analysis relies on SSMRP-based methodologies and data bases. For both Zion-1 and LaSalle-2, assuming that loss of offsite power (LOSP) occurs after a large earthquake and that there are no operator recovery actions, the analysis finds very many combinations (Boolean minimal cut sets) involving chatter of three or four relays and/or pressure switch contacts. The analysis finds that the number of min-cut-set combinations is so large that there is a very high likelihood (of the order of unity) that at least one combination will occur after earthquake-caused LOSP. This conclusion depends in detail on the fragility curves and response assumptions used for chatter. Core-damage frequencies are calculated, but they are probably pessimistic because assuming zero credit for operator recovery is pessimistic. The project has also developed an improved PRA methodology for quantifying operator error under high-stress conditions such as after a large earthquake. Single-operator and multiple-operator error rates are developed, and a case study involving an 8-step procedure (establishing feed-and-bleed in a PWR after an earthquake-initiated accident) is used to demonstrate the methodology.

  20. Endovascular treatment for Small Core and Anterior circulation Proximal occlusion with Emphasis on minimizing CT to recanalization times (ESCAPE) trial: methodology.

    PubMed

    Demchuk, Andrew M; Goyal, Mayank; Menon, Bijoy K; Eesa, Muneer; Ryckborst, Karla J; Kamal, Noreen; Patil, Shivanand; Mishra, Sachin; Almekhlafi, Mohammed; Randhawa, Privia A; Roy, Daniel; Willinsky, Robert; Montanera, Walter; Silver, Frank L; Shuaib, Ashfaq; Rempel, Jeremy; Jovin, Tudor; Frei, Donald; Sapkota, Biggya; Thornton, J Michael; Poppe, Alexandre; Tampieri, Donatella; Lum, Cheemun; Weill, Alain; Sajobi, Tolulope T; Hill, Michael D

    2015-04-01

    ESCAPE is a prospective, multicenter, randomized clinical trial that will enroll subjects with the following main inclusion criteria: less than 12 h from symptom onset, age > 18, baseline NIHSS > 5, ASPECTS score of > 5 and CTA evidence of carotid T/L or M1 segment MCA occlusion, and at least moderate collaterals by CTA. The trial will determine if endovascular treatment will result in higher rates of favorable outcome compared with standard medical therapy alone. Eligible patient populations include those receiving IV tPA, those ineligible for tPA, and unwitnessed-onset or wake-up strokes within 12 h of last seen normal. The primary end-point, based on intention-to-treat criteria, is the distribution of modified Rankin Scale scores at 90 days assessed using a proportional odds model. The projected maximum sample size is 500 subjects. Randomization is stratified under a minimization process using age, gender, baseline NIHSS, baseline ASPECTS (8-10 vs. 6-7), IV tPA treatment and occlusion location (ICA vs. MCA) as covariates. The study will have one formal interim analysis after 300 subjects have been accrued. Secondary end-points at 90 days include the following: mRS 0-1; mRS 0-2; Barthel 95-100, EuroQOL and a cognitive battery. Safety outcomes are symptomatic ICH, major bleeding, contrast nephropathy, total radiation dose, malignant MCA infarction, hemicraniectomy and mortality at 90 days. PMID:25546514

  1. Up-cycling waste glass to minimal water adsorption/absorption lightweight aggregate by rapid low temperature sintering: optimization by dual process-mixture response surface methodology.

    PubMed

    Velis, Costas A; Franco-Salinas, Claudia; O'Sullivan, Catherine; Najorka, Jens; Boccaccini, Aldo R; Cheeseman, Christopher R

    2014-07-01

    Mixed color waste glass extracted from municipal solid waste is either not recycled, in which case it is an environmental and financial liability, or it is used in relatively low value applications such as normal weight aggregate. Here, we report on converting it into a novel glass-ceramic lightweight aggregate (LWA), potentially suitable for high added value applications in structural concrete (upcycling). The artificial LWA particles were formed by rapidly sintering (<10 min) waste glass powder with clay mixes using sodium silicate as binder and borate salt as flux. Composition and processing were optimized using response surface methodology (RSM) modeling, and specifically (i) a combined process-mixture dual RSM, and (ii) multiobjective optimization functions. The optimization considered raw materials and energy costs. Mineralogical and physical transformations occur during sintering and a cellular vesicular glass-ceramic composite microstructure is formed, with strong correlations existing between bloating/shrinkage during sintering, density and water adsorption/absorption. The diametrical expansion could be effectively modeled via the RSM and controlled to meet a wide range of specifications; here we optimized for LWA structural concrete. The optimally designed LWA is sintered in comparatively low temperatures (825-835 °C), thus potentially saving costs and lowering emissions; it had exceptionally low water adsorption/absorption (6.1-7.2% w/wd; optimization target: 1.5-7.5% w/wd); while remaining substantially lightweight (density: 1.24-1.28 g.cm(-3); target: 0.9-1.3 g.cm(-3)). This is a considerable advancement for designing effective environmentally friendly lightweight concrete constructions, and boosting resource efficiency of waste glass flows. PMID:24871934

  2. Minimal Reduplication

    ERIC Educational Resources Information Center

    Kirchner, Jesse Saba

    2010-01-01

    This dissertation introduces Minimal Reduplication, a new theory and framework within generative grammar for analyzing reduplication in human language. I argue that reduplication is an emergent property in multiple components of the grammar. In particular, reduplication occurs independently in the phonology and syntax components, and in both cases…

  3. Taxonomic minimalism.

    PubMed

    Beattle, A J; Oliver, I

    1994-12-01

    Biological surveys are in increasing demand while taxonomic resources continue to decline. How much formal taxonomy is required to get the job done? The answer depends on the kind of job, but it is possible that taxonomic minimalism, especially (1) the use of higher taxonomic ranks, (2) the use of morphospecies rather than species (as identified by Latin binomials), and (3) the involvement of taxonomic specialists only for training and verification, may offer advantages for biodiversity assessment, environmental monitoring and ecological research. As such, formal taxonomy remains central to the process of biological inventory and survey but resources may be allocated more efficiently. For example, if formal identification is not required, resources may be concentrated on replication and increasing sample sizes. Taxonomic minimalism may also facilitate the inclusion in these activities of important but neglected groups, especially among the invertebrates, and perhaps even microorganisms. PMID:21236933

  4. Minimal cosmography

    NASA Astrophysics Data System (ADS)

    Piazza, Federico; Schücker, Thomas

    2016-04-01

    The minimal requirement for cosmography—a non-dynamical description of the universe—is a prescription for calculating null geodesics, and time-like geodesics as a function of their proper time. In this paper, we consider the most general linear connection compatible with homogeneity and isotropy, but not necessarily with a metric. A light-cone structure is assigned by choosing a set of geodesics representing light rays. This defines a "scale factor" and a local notion of distance, as that travelled by light in a given proper time interval. We find that the velocities and relativistic energies of free-falling bodies decrease in time as a consequence of cosmic expansion, but at a rate that can be different than that dictated by the usual metric framework. By extrapolating this behavior to photons' redshift, we find that the latter is in principle independent of the "scale factor". Interestingly, redshift-distance relations and other standard geometric observables are modified in this extended framework, in a way that could be experimentally tested. An extremely tight constraint on the model, however, is represented by the blackbody-ness of the cosmic microwave background. Finally, as a check, we also consider the effects of a non-metric connection in a different set-up, namely, that of a static, spherically symmetric spacetime.

  5. Esophagectomy - minimally invasive

    MedlinePlus

    Minimally invasive esophagectomy; Robotic esophagectomy; Removal of the esophagus - minimally invasive; Achalasia - esophagectomy; Barrett esophagus - esophagectomy; Esophageal cancer - esophagectomy - laparoscopic; Cancer of the ...

  6. Regional Shelter Analysis Methodology

    SciTech Connect

    Dillon, Michael B.; Dennison, Deborah; Kane, Jave; Walker, Hoyt; Miller, Paul

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.

  7. Minimal change disease

    MedlinePlus

    ... seen under a very powerful microscope called an electron microscope. Minimal change disease is the most common ... biopsy and examination of the tissue with an electron microscope can show signs of minimal change disease.

  8. Minimal change disease

    MedlinePlus

    Minimal change nephrotic syndrome; Nil disease; Lipoid nephrosis; Idiopathic nephrotic syndrome of childhood ... which filter blood and produce urine. In minimal change disease, there is damage to the glomeruli. These ...

  9. Minimally Invasive Valve Surgery

    PubMed Central

    Pope, Nicolas H.; Ailawadi, Gorav

    2014-01-01

    Cardiac valve surgery is life saving for many patients. The advent of minimally invasive surgical techniques has historically allowed for improvement in both post-operative convalescence and important clinical outcomes. The development of minimally invasive cardiac valve repair and replacement surgery over the past decade is poised to revolutionize the care of cardiac valve patients. Here, we present a review of the history and current trends in minimally invasive aortic and mitral valve repair and replacement, including the development of sutureless bioprosthetic valves. PMID:24797148

  10. Inverse Modeling Via Linearized Functional Minimization

    NASA Astrophysics Data System (ADS)

    Barajas-Solano, D. A.; Wohlberg, B.; Vesselinov, V. V.; Tartakovsky, D. M.

    2014-12-01

    We present a novel parameter estimation methodology for transient models of geophysical systems with uncertain, spatially distributed, heterogeneous and piece-wise continuous parameters. The methodology employs a Bayesian approach to propose an inverse modeling problem for the spatial configuration of the model parameters. The likelihood of the configuration is formulated using sparse measurements of both model parameters and transient states. We propose using total variation regularization (TV) as the prior reflecting the heterogeneous, piece-wise continuity assumption on the parameter distribution. The maximum a posteriori (MAP) estimator of the parameter configuration is then computed by minimizing the negative Bayesian log-posterior using a linearized functional minimization approach. The computation of the MAP estimator is a large-dimensional nonlinear minimization problem with two sources of nonlinearity: (1) the TV operator, and (2) the nonlinear relation between states and parameters provided by the model's governing equations. We propose a hybrid linearized functional minimization (LFM) algorithm in two stages to efficiently treat both sources of nonlinearity. The relation between states and parameters is linearized, resulting in a linear minimization sub-problem equipped with the TV operator; this sub-problem is then minimized using the Alternating Direction Method of Multipliers (ADMM). The methodology is illustrated with a transient saturated groundwater flow application in a synthetic domain, stimulated by external point-wise loadings representing aquifer pumping, together with an array of discrete measurements of hydraulic conductivity and transient measurements of hydraulic head. We show that our inversion strategy is able to recover the overall large-scale features of the parameter configuration, and that the reconstruction is improved by the addition of transient information of the state variable.
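    A toy version of the MAP objective described above (least-squares data misfit plus a total-variation prior), solved here with a smoothed TV term and a generic quasi-Newton solver rather than the paper's linearized-functional/ADMM scheme; the forward map, noise level, and 1-D piece-wise constant parameter field are hypothetical.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(1)
      n = 60
      x_true = np.concatenate([np.full(30, 1.0), np.full(30, 3.0)])   # piece-wise constant field
      A = rng.standard_normal((40, n)) / np.sqrt(n)                    # toy sparse-measurement map
      b = A @ x_true + 0.01 * rng.standard_normal(40)

      lam, eps = 0.05, 1e-4
      def objective(x):
          misfit = 0.5 * np.sum((A @ x - b) ** 2)
          tv = np.sum(np.sqrt(np.diff(x) ** 2 + eps))                  # smoothed total variation
          return misfit + lam * tv

      res = minimize(objective, np.zeros(n), method="L-BFGS-B")
      print(np.round(res.x[:5], 2), np.round(res.x[-5:], 2))           # ideally near 1.0 and 3.0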

  11. Prostate resection - minimally invasive

    MedlinePlus

    ... are: Erection problems (impotence) No symptom improvement Passing semen back into your bladder instead of out through ... Whelan JP, Goeree L. Systematic review and meta-analysis of transurethral resection of the prostate versus minimally ...

  12. Minimizing Shortness of Breath

    MedlinePlus


  13. Minimalism. Clip and Save.

    ERIC Educational Resources Information Center

    Hubbard, Guy

    2002-01-01

    Provides background information on the art movement called "Minimalism" discussing why it started and its characteristics. Includes learning activities and information on the artist, Donald Judd. Includes a reproduction of one of his art works and discusses its content. (CMK)

  14. Minimal Orderings Revisited

    SciTech Connect

    Peyton, B.W.

    1999-07-01

    When minimum orderings proved too difficult to deal with, Rose, Tarjan, and Lueker instead studied minimal orderings and how to compute them (Algorithmic aspects of vertex elimination on graphs, SIAM J. Comput., 5:266-283, 1976). This paper introduces an algorithm that is capable of computing much better minimal orderings much more efficiently than the algorithm in Rose et al. The new insight is a way to use certain structures and concepts from modern sparse Cholesky solvers to re-express one of the basic results in Rose et al. The new algorithm begins with any initial ordering and then refines it until a minimal ordering is obtained. It is simple to obtain high-quality, low-cost minimal orderings by using fill-reducing heuristic orderings as initial orderings for the algorithm. We examine several such initial orderings in some detail.

  15. Minimally invasive hip replacement

    MedlinePlus

    ... Smits SA, Swinford RR, Bahamonde RE. A randomized, prospective study of 3 minimally invasive surgical approaches in total hip arthroplasty: comprehensive gait analysis. J Arthroplasty . 2008;23:68-73. PMID: 18722305 ...

  16. Testing methodologies

    SciTech Connect

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical "signal to noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational, or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.

  17. Minimally invasive procedures

    PubMed Central

    Baltayiannis, Nikolaos; Michail, Chandrinos; Lazaridis, George; Anagnostopoulos, Dimitrios; Baka, Sofia; Mpoukovinas, Ioannis; Karavasilis, Vasilis; Lampaki, Sofia; Papaiwannou, Antonis; Karavergou, Anastasia; Kioumis, Ioannis; Pitsiou, Georgia; Katsikogiannis, Nikolaos; Tsakiridis, Kosmas; Rapti, Aggeliki; Trakada, Georgia; Zissimopoulos, Athanasios; Zarogoulidis, Konstantinos

    2015-01-01

    Minimally invasive procedures, which include laparoscopic surgery, use state-of-the-art technology to reduce the damage to human tissue when performing surgery. Minimally invasive procedures require small “ports” from which the surgeon inserts thin tubes called trocars. Carbon dioxide gas may be used to inflate the area, creating a space between the internal organs and the skin. Then a miniature camera (usually a laparoscope or endoscope) is placed through one of the trocars so the surgical team can view the procedure as a magnified image on video monitors in the operating room. Specialized equipment is inserted through the trocars based on the type of surgery. There are some advanced minimally invasive surgical procedures that can be performed almost exclusively through a single point of entry—meaning only one small incision, like the “uniport” video-assisted thoracoscopic surgery (VATS). Not only do these procedures usually provide equivalent outcomes to traditional “open” surgery (which sometimes requires a large incision), but minimally invasive procedures (using small incisions) may offer significant benefits as well: (I) faster recovery; (II) the patient remains hospitalized for fewer days; (III) less scarring and (IV) less pain. In our current mini review we will present the minimally invasive procedures for thoracic surgery. PMID:25861610

  18. Minimally Invasive Radiofrequency Devices.

    PubMed

    Sadick, Neil; Rothaus, Kenneth O

    2016-07-01

    This article reviews minimally invasive radiofrequency options for skin tightening, focusing on describing their mechanism of action and clinical profile in terms of safety and efficacy and presenting peer-reviewed articles associated with the specific technologies. Treatments offered by minimally invasive radiofrequency devices (fractional, microneedling, temperature-controlled) are increasing in popularity due to the dramatic effects they can have without requiring skin excision, downtime, or even extreme financial burden from the patient's perspective. Clinical applications thus far have yielded impressive results in treating signs of the aging face and neck, either as stand-alone or as postoperative maintenance treatments. PMID:27363771

  19. Ways To Minimize Bullying.

    ERIC Educational Resources Information Center

    Mueller, Mary Ellen; Parisi, Mary Joy

    This report delineates a series of interventions aimed at minimizing incidences of bullying in a suburban elementary school. The social services staff was scheduled to initiate an anti-bullying incentive in fall 2001 due to the increased occurrences of bullying during the prior year. The target population consisted of third- and fourth-grade…

  20. Periodic minimal surfaces

    NASA Astrophysics Data System (ADS)

    Mackay, Alan L.

    1985-04-01

    A minimal surface is one for which, like a soap film with the same pressure on each side, the mean curvature is zero and, thus, is one where the two principal curvatures are equal and opposite at every point. For every closed circuit in the surface, the area is a minimum. Schwarz [1] and Neovius [2] showed that elements of such surfaces could be put together to give surfaces periodic in three dimensions. These periodic minimal surfaces are geometrical invariants, as are the regular polyhedra, but the former are curved. Minimal surfaces are appropriate for the description of various structures where internal surfaces are prominent and seek to adopt a minimum area or a zero mean curvature subject to their topology; thus they merit more complete numerical characterization. There seem to be at least 18 such surfaces [3], with various symmetries and topologies, related to the crystallographic space groups. Recently, glyceryl mono-oleate (GMO) was shown by Longley and McIntosh [4] to take the shape of the F-surface. The structure postulated is shown here to be in good agreement with an analysis of the fundamental geometry of periodic minimal surfaces.

  1. The Minimal Era

    ERIC Educational Resources Information Center

    Van Ness, Wilhelmina

    1974-01-01

    Described the development of Minimal Art, a composite name that has been applied to the scattering of bland, bleak, non-objective fine arts painting and sculpture forms that proliferated slightly mysteriously in the middle 1960's as Pop Art began to decline. (Author/RK)

  2. Minimally invasive pancreatic surgery.

    PubMed

    Yiannakopoulou, E

    2015-12-01

    Minimally invasive pancreatic surgery is feasible and safe. Laparoscopic distal pancreatectomy should be widely adopted for benign lesions of the pancreas. Laparoscopic pancreaticoduodenectomy, although technically demanding, in the setting of pancreatic ductal adenocarcinoma has a number of advantages including shorter hospital stay, faster recovery, allowing patients to recover in a timelier manner and pursue adjuvant treatment options. Furthermore, it seems that progression-free survival is longer in patients undergoing laparoscopic pancreaticoduodenectomy in comparison with those undergoing open pancreaticoduodenectomy. Minimally invasive middle pancreatectomy seems appropriate for benign or borderline tumors of the neck of the pancreas. Technological advances including intraoperative ultrasound and intraoperative fluorescence imaging systems are expected to facilitate the wide adoption of minimally invasive pancreatic surgery. Although, the oncological outcome seems similar with that of open surgery, there are still concerns, as the majority of relevant evidence comes from retrospective studies. Large multicenter randomized studies comparing laparoscopic with open pancreatectomy as well as robotic assisted with both open and laparoscopic approaches are needed. Robotic approach could be possibly shown to be less invasive than conventional laparoscopic approach through the less traumatic intra-abdominal handling of tissues. In addition, robotic approach could enable the wide adoption of the technique by surgeon who is not that trained in advanced laparoscopic surgery. A putative clinical benefit of minimally invasive pancreatic surgery could be the attenuated surgical stress response leading to reduced morbidity and mortality as well as lack of the detrimental immunosuppressive effect especially for the oncological patients. PMID:26530291

  3. Minimally invasive radioguided parathyroidectomy.

    PubMed

    Costello, D; Norman, J

    1999-07-01

    The last decade has been characterized by an emphasis on minimizing interventional techniques, hospital stays, and overall costs of patient care. It is clear that most patients with sporadic HPT do not require a complete neck exploration. We now know that a minimal approach is appropriate for this disease. Importantly, the MIRP technique can be applied to most patients with sporadic HPT and can be performed by surgeons with modest advanced training. The use of a gamma probe as a surgical tool converts the sestamibi to a functional and anatomical scan, eliminating the need for any other preoperative localizing study. Quantification of the radioactivity within the removed gland eliminates the need for routine frozen section histologic examination and obviates the need for costly intraoperative parathyroid hormone measurements. This radioguided technique allows the benefit of local anesthesia, dramatically reduces operative times, eliminates postoperative blood tests, provides a smaller scar, requires minimal time spent in the hospital, and almost assures a rapid, near pain-free recovery. This combination is beneficial to the patient while helping achieve a reduction in overall costs. PMID:10448697

  4. Waste Minimization Crosscut Plan

    SciTech Connect

    Not Available

    1992-05-13

    On November 27, 1991, the Secretary of Energy directed that a Department of Energy (DOE) crosscut plan for waste minimization (WMin) be prepared and submitted by March 1, 1992. This Waste Minimization Crosscut Plan responds to the Secretary's direction and supports the National Energy Strategy (NES) goals of achieving greater energy security, increasing energy and economic efficiency, and enhancing environmental quality. It provides a DOE-wide planning framework for effective coordination of all DOE WMin activities. This Plan was jointly prepared by the following Program Secretarial Officer (PSO) organizations: Civilian Radioactive Waste Management (RW); Conservation and Renewable Energy (CE); Defense Programs (DP); Environmental Restoration and Waste Management (EM), lead; Energy Research (ER); Fossil Energy (FE); Nuclear Energy (NE); and New Production Reactors (NP). Assistance and guidance were provided by the offices of Policy, Planning, and Analysis (PE) and Environment, Safety and Health (EH). Comprehensive application of waste minimization within the Department and in both the public and private sectors will provide significant benefits and support National Energy Strategy goals. These benefits include conservation of a substantial proportion of the energy now used by industry and Government, improved environmental quality, reduced health risks, improved production efficiencies, and longer useful life of disposal capacity. Taken together, these benefits will mean improved US global competitiveness, expanded job opportunities, and a better quality of life for all citizens.

  5. Minimally invasive mediastinal surgery.

    PubMed

    Melfi, Franca M A; Fanucchi, Olivia; Mussi, Alfredo

    2016-01-01

    In the past, mediastinal surgery was associated with the necessity of a maximum exposure, which was accomplished through various approaches. In the early 1990s, many surgical fields, including thoracic surgery, observed the development of minimally invasive techniques. These included video-assisted thoracic surgery (VATS), which confers clear advantages over an open approach, such as less trauma, short hospital stay, increased cosmetic results and preservation of lung function. However, VATS is associated with several disadvantages. For this reason, it is not routinely performed for resection of mediastinal mass lesions, especially those located in the anterior mediastinum, a tiny and remote space that contains vital structures at risk of injury. Robotic systems can overcome the limits of VATS, offering three-dimensional (3D) vision and wristed instrumentations, and are being increasingly used. With regards to thymectomy for myasthenia gravis (MG), unilateral and bilateral VATS approaches have demonstrated good long-term neurologic results with low complication rates. Nevertheless, some authors still advocate the necessity of maximum exposure, especially when considering the distribution of normal and ectopic thymic tissue. In recent studies, the robotic approach has shown to provide similar neurological outcomes when compared to transsternal and VATS approaches, and is associated with a low morbidity. Importantly, through a unilateral robotic technique, it is possible to dissect and remove at least the same amount of mediastinal fat tissue. Preliminary results on early-stage thymomatous disease indicated that minimally invasive approaches are safe and feasible, with a low rate of pleural recurrence, underlining the necessity of a "no-touch" technique. However, especially for thymomatous disease characterized by an indolent nature, further studies with long follow-up period are necessary in order to assess oncologic and neurologic results through minimally invasive

  6. Minimally invasive mediastinal surgery

    PubMed Central

    Melfi, Franca M. A.; Mussi, Alfredo

    2016-01-01

    In the past, mediastinal surgery was associated with the necessity of a maximum exposure, which was accomplished through various approaches. In the early 1990s, many surgical fields, including thoracic surgery, observed the development of minimally invasive techniques. These included video-assisted thoracic surgery (VATS), which confers clear advantages over an open approach, such as less trauma, short hospital stay, improved cosmetic results and preservation of lung function. However, VATS is associated with several disadvantages. For this reason, it is not routinely performed for resection of mediastinal mass lesions, especially those located in the anterior mediastinum, a tiny and remote space that contains vital structures at risk of injury. Robotic systems can overcome the limits of VATS, offering three-dimensional (3D) vision and wristed instrumentation, and are being increasingly used. With regard to thymectomy for myasthenia gravis (MG), unilateral and bilateral VATS approaches have demonstrated good long-term neurologic results with low complication rates. Nevertheless, some authors still advocate the necessity of maximum exposure, especially when considering the distribution of normal and ectopic thymic tissue. In recent studies, the robotic approach has been shown to provide similar neurological outcomes when compared to transsternal and VATS approaches, and is associated with low morbidity. Importantly, through a unilateral robotic technique, it is possible to dissect and remove at least the same amount of mediastinal fat tissue. Preliminary results on early-stage thymomatous disease indicated that minimally invasive approaches are safe and feasible, with a low rate of pleural recurrence, underlining the necessity of a “no-touch” technique. However, especially for thymomatous disease characterized by an indolent nature, further studies with long follow-up periods are necessary in order to assess oncologic and neurologic results through minimally

  7. The ZOOM minimization package

    SciTech Connect

    Fischler, Mark S.; Sachs, D.; /Fermilab

    2004-11-01

    A new object-oriented Minimization package is available for distribution in the same manner as CLHEP. This package, designed for use in HEP applications, has all the capabilities of Minuit, but is a re-write from scratch, adhering to modern C++ design principles. A primary goal of this package is extensibility in several directions, so that its capabilities can be kept fresh with as little maintenance effort as possible. This package is distinguished by the priority that was assigned to C++ design issues, and the focus on producing an extensible system that will resist becoming obsolete.
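
    To illustrate the kind of task such a package addresses, the sketch below minimizes a simple least-squares chi-square with a naive fixed-step gradient descent. It is a hedged, hypothetical Python example; the function names and the fitting problem are invented for illustration and are not the ZOOM or Minuit interface.

      # Minimal sketch of objective-function minimization (illustrative only; these
      # names are not the ZOOM or Minuit API). A naive fixed-step gradient descent
      # with forward-difference gradients minimizes a least-squares chi-square.

      def chi_square(params, data, model):
          """Sum of squared residuals between data points (x, y) and model(x, params)."""
          return sum((y - model(x, params)) ** 2 for x, y in data)

      def minimize(objective, start, step=1e-3, tol=1e-6, max_iter=50000):
          p, h = list(start), 1e-6
          for _ in range(max_iter):
              f0 = objective(p)
              grad = []
              for i in range(len(p)):
                  q = list(p)
                  q[i] += h
                  grad.append((objective(q) - f0) / h)
              p = [pi - step * gi for pi, gi in zip(p, grad)]
              if sum(g * g for g in grad) ** 0.5 < tol:
                  break
          return p, objective(p)

      if __name__ == "__main__":
          line = lambda x, p: p[0] * x + p[1]              # straight-line model y = a*x + b
          data = [(0.0, 1.1), (1.0, 2.9), (2.0, 5.2)]
          best, value = minimize(lambda p: chi_square(p, data, line), start=[0.0, 0.0])
          print("best-fit parameters:", [round(v, 3) for v in best], "chi^2:", round(value, 4))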

  8. Minimally Invasive Parathyroidectomy

    PubMed Central

    Starker, Lee F.; Fonseca, Annabelle L.; Carling, Tobias; Udelsman, Robert

    2011-01-01

    Minimally invasive parathyroidectomy (MIP) is an operative approach for the treatment of primary hyperparathyroidism (pHPT). Currently, routine use of improved preoperative localization studies, cervical block anesthesia in the conscious patient, and intraoperative parathyroid hormone analyses aid in guiding surgical therapy. MIP requires less surgical dissection causing decreased trauma to tissues, can be performed safely in the ambulatory setting, and is at least as effective as standard cervical exploration. This paper reviews advances in preoperative localization, anesthetic techniques, and intraoperative management of patients undergoing MIP for the treatment of pHPT. PMID:21747851

  9. Minimally refined biomass fuel

    DOEpatents

    Pearson, Richard K.; Hirschfeld, Tomas B.

    1984-01-01

    A minimally refined fluid composition, suitable as a fuel mixture and derived from biomass material, is comprised of one or more water-soluble carbohydrates such as sucrose, one or more alcohols having less than four carbons, and water. The carbohydrate provides the fuel source; water solubilizes the carbohydrates; and the alcohol aids in the combustion of the carbohydrate and reduces the viscosity of the carbohydrate/water solution. Because less energy is required to obtain the carbohydrate from the raw biomass than to obtain alcohol, an overall energy savings is realized compared to fuels employing alcohol as the primary fuel.

  10. Wake Vortex Minimization

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A status report is presented on research directed at reducing the vortex disturbances of aircraft wakes. The objective of such a reduction is to minimize the hazard to smaller aircraft that might encounter these wakes. Inviscid modeling was used to study trailing vortices, and viscous effects were investigated. Laser velocimeters were utilized in the measurement of aircraft wakes. Flight and wind tunnel tests were performed on scale-model and full-scale aircraft of various designs. Parameters investigated included the effect of wing span, wing flaps, spoilers, splines and engine thrust on vortex attenuation. Results indicate that vortices may be alleviated through aerodynamic means.

  11. Minimizing hazardous waste

    SciTech Connect

    DeClue, S.C.

    1996-06-01

    Hazardous waste minimization is a broad term often associated with pollution prevention, saving the environment or protecting Mother Earth. Some associate hazardous waste minimization with saving money. Thousands of hazardous materials are used in processes every day, but when these hazardous materials become hazardous wastes, dollars must be spent for disposal. When hazardous waste is reduced, an organization will spend less money on hazardous waste disposal. In 1993, Fort Bragg reduced its hazardous waste generation by over 100,000 pounds and spent nearly $90,000 less on hazardous waste disposal costs than in 1992. Fort Bragg generates a variety of wastes: vehicle maintenance wastes such as antifreeze, oil, grease and solvents; helicopter maintenance wastes, including solvents, adhesives, lubricants and paints; communication operation wastes such as lithium, magnesium, mercury and nickel-cadmium batteries; and chemical defense wastes such as detection and decontamination materials and protective mask filters. The Hazardous Waste Office has the responsibility to properly identify, characterize, classify and dispose of these waste items in accordance with US Environmental Protection Agency (EPA) and US Department of Transportation (DOT) regulations.

  12. Microbiological methodology in astrobiology

    NASA Astrophysics Data System (ADS)

    Abyzov, S. S.; Gerasimenko, L. M.; Hoover, R. B.; Mitskevich, I. N.; Mulyukin, A. L.; Poglazova, M. N.; Rozanov, A. Y.

    2005-09-01

    Searching for life in astromaterials to be delivered from future missions to extraterrestrial bodies is undoubtedly related to studies of the properties and signatures of living microbial cells and microfossils on Earth. The Antarctic glacier and Earth permafrost habitats, where living microbial cells have preserved viability for millennia by entering the anabiotic state, are often regarded as terrestrial analogs of Martian polar subsurface layers. For future findings of viable microorganisms in samples from extraterrestrial objects, it is important to use a combined methodology that includes classical microbiological methods, plating onto nutrient media, direct epifluorescence and electron microscopy examinations, detection of the elemental composition of cells, and PCR and FISH methods. Of great importance is to ensure the authenticity of microorganisms (if any are present in the studied samples) and to standardize the protocols used to minimize the risk of external contamination. Although convincing evidence of extraterrestrial microbial life may well come from the discovery of living cells in astromaterials, biomorphs and microfossils must also be regarded as targets in the search for evidence of life, bearing in mind a scenario in which living microorganisms were not preserved and underwent mineralization. Given the vital importance of distinguishing between biogenic and abiogenic signatures and between living and fossil microorganisms in analyzed samples, it is worthwhile to use previously developed approaches based on electron microscopy examinations and analysis of the elemental composition of biomorphs in situ.

  13. A perturbation technique for shield weight minimization

    SciTech Connect

    Watkins, E.F.; Greenspan, E.

    1993-01-01

    The radiation shield optimization code SWAN (Ref. 1) was originally developed for minimizing the thickness of a shield that will meet a given dose (or other) constraint, or for extremizing a performance parameter of interest (e.g., maximizing energy multiplication or minimizing dose) while maintaining the shield volume constraint. The SWAN optimization process proved to be highly effective (e.g., see Refs. 2, 3, and 4). The purpose of this work is to investigate the applicability of the SWAN methodology to problems in which the weight rather than the volume is the relevant shield characteristic. Such problems are encountered in shield design for space nuclear power systems. The investigation is carried out using SWAN with the coupled neutron-photon cross-section library FLUNG (Ref. 5).

  14. Minimal noise subsystems

    NASA Astrophysics Data System (ADS)

    Wang, Xiaoting; Byrd, Mark; Jacobs, Kurt

    2016-03-01

    A system subjected to noise contains a decoherence-free subspace or subsystem (DFS) only if the noise possesses an exact symmetry. Here we consider noise models in which a perturbation breaks a symmetry of the noise, so that if S is a DFS under a given noise process, it is no longer so under the new perturbed noise process. We ask whether there is a subspace or subsystem that is more robust to the perturbed noise than S. To answer this question we develop a numerical method that allows us to search for subspaces or subsystems that are maximally robust to arbitrary noise processes. We apply this method to a number of examples, and find that a subsystem that is a DFS is often not the subsystem that experiences minimal noise when the symmetry of the noise is broken by a perturbation. We discuss which classes of noise have this property.
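
    As a hedged illustration of the kind of numerical search described, the sketch below scores how strongly a candidate subspace leaks under a noise operator, and compares a symmetry-preserved subspace with randomly drawn ones once the symmetry is perturbed. The operators, dimensions, and random-search strategy are stand-ins chosen for illustration, not the authors' method.

      import numpy as np

      def leakage(noise_op, basis):
          """Norm of the component of noise_op applied to the subspace that falls
          outside the subspace; zero means the subspace is exactly preserved."""
          Q, _ = np.linalg.qr(basis)                       # orthonormal basis of the candidate subspace
          P_perp = np.eye(Q.shape[0]) - Q @ Q.conj().T     # projector onto the orthogonal complement
          return np.linalg.norm(P_perp @ noise_op @ Q)

      rng = np.random.default_rng(0)
      dim, sub_dim = 4, 2
      L0 = np.kron(np.diag([1.0, -1.0]), np.eye(2))        # symmetric noise: acts only on the first tensor factor
      L = L0 + 0.05 * rng.standard_normal((dim, dim))      # perturbation breaking the symmetry
      preserved = np.eye(dim)[:, :sub_dim]                 # subspace exactly preserved by L0

      # Crude random search for a 2-dimensional subspace with minimal leakage under L.
      best = min(leakage(L, rng.standard_normal((dim, sub_dim))) for _ in range(2000))

      print("leakage of the symmetric subspace under L0:", round(leakage(L0, preserved), 4))
      print("leakage of the same subspace under perturbed noise:", round(leakage(L, preserved), 4))
      print("best leakage found by random search:", round(best, 4))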

  15. Minimal quiver standard model

    SciTech Connect

    Berenstein, David; Pinansky, Samuel

    2007-05-01

    This paper discusses the minimal quiver gauge theory embedding of the standard model that could arise from brane world type string theory constructions. It is based on the low energy effective field theory of D branes in the perturbative regime. The model differs from the standard model by the addition of one extra massive gauge boson, and contains only one parameter beyond those of the standard model: the mass of this new particle. The coupling of this new particle to the standard model is uniquely determined by input from the standard model and consistency conditions of perturbative string theory. We also study some aspects of the phenomenology of this model and bounds on its possible observation at the Large Hadron Collider.

  16. [Minimally invasive breast surgery].

    PubMed

    Mátrai, Zoltán; Gulyás, Gusztáv; Kunos, Csaba; Sávolt, Akos; Farkas, Emil; Szollár, András; Kásler, Miklós

    2014-02-01

    Due to developments in medical science and industrial technology, minimally invasive procedures have appeared in the surgery of benign and malignant breast diseases. In general, such interventions result in significantly reduced breast and chest wall scars, shorter hospitalization and less pain, but they require specific, expensive devices and longer surgical times compared to open surgery. Furthermore, indications and oncological safety have not yet been established. It is quite likely that minimally invasive surgical procedures with high-tech devices - similar to other surgical subspecialties - will gradually become popular and may even become part of routine breast surgery. Vacuum-assisted core biopsy with a therapeutic indication is suitable for the removal of benign fibroadenomas, leaving behind an almost invisible scar, while endoscopically assisted skin-sparing and nipple-sparing mastectomy, axillary staging and reconstruction with a latissimus dorsi muscle flap are all feasible through the same short axillary incision. Endoscopic techniques are also suitable for the diagnosis and treatment of intracapsular complications of implant-based breast reconstructions (intracapsular fluid, implant rupture, capsular contracture) and for the biopsy of intracapsular lesions with uncertain pathology. Assessment of the role of radiofrequency ablation of breast tumors requires further hands-on experience, but it is likely that it can serve as a replacement for surgical removal of a portion of primary tumors in the future, owing to developments in functional imaging and anticancer drugs. With the falling price of ductoscopes, routine examination of the ductal branch system, guided microdochectomy and targeted surgical removal of terminal ducto-lobular units or a "sick lobe" as an anatomical unit may become feasible. The paper presents the experience of the authors and provides a literature review of the subject, for the first time in Hungarian. Orv. Hetil

  17. Minimally invasive parathyroid surgery

    PubMed Central

    Noureldine, Salem I.; Gooi, Zhen

    2015-01-01

    Traditionally, bilateral cervical exploration for localization of all four parathyroid glands and removal of any that are grossly enlarged has been the standard surgical treatment for primary hyperparathyroidism (PHPT). With the advances in preoperative localization studies and greater public demand for less invasive procedures, novel targeted, minimally invasive techniques to the parathyroid glands have been described and practiced over the past 2 decades. Minimally invasive parathyroidectomy (MIP) can be done either through the standard Kocher incision, a smaller midline incision, with video assistance (purely endoscopic and video-assisted techniques), or through an ectopically placed, extracervical, incision. In current practice, once PHPT is diagnosed, preoperative evaluation using high-resolution radiographic imaging to localize the offending parathyroid gland is essential if MIP is to be considered. The imaging study results suggest where the surgeon should begin the focused procedure and serve as a road map to allow tailoring of an efficient, imaging-guided dissection while eliminating the unnecessary dissection of multiple glands or a bilateral exploration. Intraoperative parathyroid hormone (IOPTH) levels may be measured during the procedure, or a gamma probe used during radioguided parathyroidectomy, to ascertain that the correct gland has been excised and that no other hyperfunctional tissue is present. MIP has many advantages over the traditional bilateral, four-gland exploration. MIP can be performed using local anesthesia, requires less operative time, results in fewer complications, and offers an improved cosmetic result and greater patient satisfaction. Additional advantages of MIP are earlier hospital discharge and decreased overall associated costs. This article aims to address the considerations for accomplishing MIP, including the role of preoperative imaging studies, intraoperative adjuncts, and surgical techniques. PMID:26425454

  18. Minimal Marking: A Success Story

    ERIC Educational Resources Information Center

    McNeilly, Anne

    2014-01-01

    The minimal-marking project conducted in Ryerson's School of Journalism throughout 2012 and early 2013 resulted in significantly higher grammar scores in two first-year classes of minimally marked university students when compared to two traditionally marked classes. The "minimal-marking" concept (Haswell, 1983), which requires…

  19. Minimal complexity control law synthesis

    NASA Technical Reports Server (NTRS)

    Bernstein, Dennis S.; Haddad, Wassim M.; Nett, Carl N.

    1989-01-01

    A paradigm for control law design for modern engineering systems is proposed: Minimize control law complexity subject to the achievement of a specified accuracy in the face of a specified level of uncertainty. Correspondingly, the overall goal is to make progress towards the development of a control law design methodology which supports this paradigm. Researchers achieve this goal by developing a general theory of optimal constrained-structure dynamic output feedback compensation, where here constrained-structure means that the dynamic-structure (e.g., dynamic order, pole locations, zero locations, etc.) of the output feedback compensation is constrained in some way. By applying this theory in an innovative fashion, where here the indicated iteration occurs over the choice of the compensator dynamic-structure, the paradigm stated above can, in principle, be realized. The optimal constrained-structure dynamic output feedback problem is formulated in general terms. An elegant method for reducing optimal constrained-structure dynamic output feedback problems to optimal static output feedback problems is then developed. This reduction procedure makes use of star products, linear fractional transformations, and linear fractional decompositions, and yields as a byproduct a complete characterization of the class of optimal constrained-structure dynamic output feedback problems which can be reduced to optimal static output feedback problems. Issues such as operational/physical constraints, operating-point variations, and processor throughput/memory limitations are considered, and it is shown how anti-windup/bumpless transfer, gain-scheduling, and digital processor implementation can be facilitated by constraining the controller dynamic-structure in an appropriate fashion.
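
    The reduction described above ends in an optimal static output feedback problem. Assuming a standard LQR-type quadratic cost and a stabilizing gain, the hedged sketch below shows what evaluating one candidate static output feedback gain involves, using a closed-loop Lyapunov equation; the plant matrices are made-up examples, not taken from this work.

      import numpy as np
      from scipy.linalg import solve_continuous_lyapunov

      def static_output_feedback_cost(A, B, C, K, Q, R, X0):
          """Quadratic cost trace(P @ X0) of the control law u = -K y = -K C x for the
          closed loop xdot = (A - B K C) x, with state/input weights Q and R and
          initial-state covariance X0. Assumes A - B K C is Hurwitz (stable)."""
          Acl = A - B @ K @ C
          Qcl = Q + C.T @ K.T @ R @ K @ C
          P = solve_continuous_lyapunov(Acl.T, -Qcl)   # solves Acl.T P + P Acl = -Qcl
          return np.trace(P @ X0)

      # Made-up second-order plant with a single measured output (illustration only).
      A = np.array([[0.0, 1.0], [-2.0, -0.5]])
      B = np.array([[0.0], [1.0]])
      C = np.array([[1.0, 0.0]])
      Q, R, X0 = np.eye(2), np.eye(1), np.eye(2)

      for k in (0.5, 1.0, 2.0):                        # scan a scalar output-feedback gain
          K = np.array([[k]])
          print(f"gain {k}: cost {static_output_feedback_cost(A, B, C, K, Q, R, X0):.3f}")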

  20. On Modelling Minimal Disease Activity

    PubMed Central

    Jackson, Christopher H.; Su, Li; Gladman, Dafna D.

    2016-01-01

    Objective To explore methods for statistical modelling of minimal disease activity (MDA) based on data from intermittent clinic visits. Methods The analysis was based on a 2-state model. Comparisons were made between analyses based on “complete case” data from visits at which MDA status was known, and the use of hidden model methodology that incorporated information from visits at which only some MDA defining criteria could be established. Analyses were based on an observational psoriatic arthritis cohort. Results With data from 856 patients and 7,024 clinic visits, analysis was based on virtually all visits, although only 62.6% provided enough information to determine MDA status. Estimated mean times for an episode of MDA varied from 4.18 years to 3.10 years, with smaller estimates derived from the hidden 2-state model analysis. Over a 10-year period, the estimated expected times spent in MDA episodes of longer than 1 year were 3.90 to 4.22 years, and the probability of having such an MDA episode was estimated to be 0.85 to 0.91, with longer times and greater probabilities seen with the hidden 2-state model analysis. Conclusion A 2-state model provides a useful framework for the analysis of MDA. Use of data from visits at which MDA status cannot be determined provides more precision, and notable differences are seen in estimated quantities related to MDA episodes based on complete case and hidden 2-state model analyses. The possibility of bias, as well as loss of precision, should be recognized when complete case analyses are used. PMID:26315478
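
    As a hedged illustration of the 2-state framework (with assumed transition rates, not the estimates reported above), the sketch below derives the mean MDA episode length, the long-run fraction of time in MDA, and the expected years spent in MDA over a 10-year window for a continuous-time Markov chain.

      import numpy as np
      from scipy.linalg import expm

      # Illustrative 2-state continuous-time Markov chain: state 0 = MDA, state 1 = not in MDA.
      # The rates are assumed for illustration only; they are not estimates from this study.
      exit_rate = 1.0 / 3.5        # per year: leaving an MDA episode (mean episode ~3.5 years)
      entry_rate = 1.0 / 2.0       # per year: entering MDA from the non-MDA state

      Q = np.array([[-exit_rate, exit_rate],
                    [entry_rate, -entry_rate]])                 # generator matrix (row convention)

      mean_episode = 1.0 / exit_rate                            # mean sojourn time in MDA
      long_run_mda = entry_rate / (entry_rate + exit_rate)      # stationary fraction of time in MDA

      # Expected years spent in MDA over a 10-year window, starting outside MDA,
      # by numerically integrating the occupancy probability of state 0.
      horizon, dt = 10.0, 0.01
      p_start, in_mda = np.array([0.0, 1.0]), np.array([1.0, 0.0])
      expected_years = sum(p_start @ expm(Q * t) @ in_mda * dt for t in np.arange(0.0, horizon, dt))

      print(f"mean MDA episode length:          {mean_episode:.2f} years")
      print(f"long-run fraction of time in MDA: {long_run_mda:.2f}")
      print(f"expected years in MDA over 10 y:  {expected_years:.2f}")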

  1. Swarm robotics and minimalism

    NASA Astrophysics Data System (ADS)

    Sharkey, Amanda J. C.

    2007-09-01

    Swarm Robotics (SR) is closely related to Swarm Intelligence, and both were initially inspired by studies of social insects. Their guiding principles are based on their biological inspiration and take the form of an emphasis on decentralized local control and communication. Earlier studies went a step further in emphasizing the use of simple reactive robots that only communicate indirectly through the environment. More recently SR studies have moved beyond these constraints to explore the use of non-reactive robots that communicate directly, and that can learn and represent their environment. There is no clear agreement in the literature about how far such extensions of the original principles could go. Should there be any limitations on the individual abilities of the robots used in SR studies? Should knowledge of the capabilities of social insects lead to constraints on the capabilities of individual robots in SR studies? There is a lack of explicit discussion of such questions, and researchers have adopted a variety of constraints for a variety of reasons. A simple taxonomy of swarm robotics is presented here with the aim of addressing and clarifying these questions. The taxonomy distinguishes subareas of SR based on the emphases and justifications for minimalism and individual simplicity.

  2. Minimal distances between SCFTs

    NASA Astrophysics Data System (ADS)

    Buican, Matthew

    2014-01-01

    We study lower bounds on the minimal distance in theory space between four-dimensional superconformal field theories (SCFTs) connected via broad classes of renormalization group (RG) flows preserving various amounts of supersymmetry (SUSY). For N = 1 RG flows, the ultraviolet (UV) and infrared (IR) endpoints of the flow can be parametrically close. On the other hand, for RG flows emanating from a maximally supersymmetric SCFT, the distance to the IR theory cannot be arbitrarily small regardless of the amount of (non-trivial) SUSY preserved along the flow. The case of RG flows from N = 2 UV SCFTs is more subtle. We argue that for RG flows preserving the full N = 2 SUSY, there are various obstructions to finding examples with parametrically close UV and IR endpoints. Under reasonable assumptions, these obstructions include: unitarity, known bounds on the c central charge derived from associativity of the operator product expansion, and the central charge bounds of Hofman and Maldacena. On the other hand, for RG flows that break N = 2 → N = 1, it is possible to find IR fixed points that are parametrically close to the UV ones. In this case, we argue that if the UV SCFT possesses a single stress tensor, then such RG flows excite of order all the degrees of freedom of the UV theory. Furthermore, if the UV theory has some flavor symmetry, we argue that the UV central charges should not be too large relative to certain parameters in the theory.

  3. Payload training methodology study

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The results of the Payload Training Methodology Study (PTMS) are documented. Methods and procedures are defined for the development of payload training programs to be conducted at the Marshall Space Flight Center Payload Training Complex (PCT) for the Space Station Freedom program. The study outlines the overall training program concept as well as the six methodologies associated with the program implementation. The program concept outlines the entire payload training program from initial identification of training requirements to the development of detailed design specifications for simulators and instructional material. The following six methodologies are defined: (1) The Training and Simulation Needs Assessment Methodology; (2) The Simulation Approach Methodology; (3) The Simulation Definition Analysis Methodology; (4) The Simulator Requirements Standardization Methodology; (5) The Simulator Development Verification Methodology; and (6) The Simulator Validation Methodology.

  4. Minimal Higgs inflation

    NASA Astrophysics Data System (ADS)

    Hamada, Yuta; Kawai, Hikaru; Oda, Kin-ya

    2014-02-01

    We consider a possibility that the Higgs field in the Standard Model (SM) serves as an inflaton when its value is around the Planck scale. We assume that the SM is valid up to an ultraviolet cutoff scale Λ, which is slightly below the Planck scale, and that the Higgs potential becomes almost flat above Λ. Contrary to the ordinary Higgs inflation scenario, we do not assume the huge non-minimal coupling, of O(10^4), of the Higgs field to the Ricci scalar. We find that Λ must be less than 5×10^17 GeV in order to explain the observed fluctuation of the cosmic microwave background, no matter how we extrapolate the Higgs potential above Λ. The scale 10^17 GeV coincides with the perturbative string scale, which suggests that the SM is directly connected with string theory. For this to be true, the top quark mass is restricted to around 171 GeV, with which Λ can exceed 10^17 GeV. As a concrete example of the potential above Λ, we propose a simple log-type potential. The predictions of this specific model for the e-foldings N* = 50-60 are consistent with the current observation, namely, the scalar spectral index is n_s = 0.977-0.983 and the tensor-to-scalar ratio 0
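
    For context, the observables quoted above are conventionally computed from the slow-roll parameters of the inflaton potential; the relations below are the standard textbook formulas for generic single-field slow-roll inflation, not results specific to this model.

      % Standard slow-roll relations for generic single-field inflation
      % (textbook formulas, not specific to this model); M_Pl is the reduced Planck mass.
      \epsilon = \frac{M_{\mathrm{Pl}}^{2}}{2}\left(\frac{V'}{V}\right)^{2}, \qquad
      \eta = M_{\mathrm{Pl}}^{2}\,\frac{V''}{V}, \qquad
      n_{s} \simeq 1 - 6\epsilon + 2\eta, \qquad
      r \simeq 16\,\epsilon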

  5. Microbiological Methodology in Astrobiology

    NASA Technical Reports Server (NTRS)

    Abyzov, S. S.; Gerasimenko, L. M.; Hoover, R. B.; Mitskevich, I. N.; Mulyukin, A. L.; Poglazova, M. N.; Rozanov, A. Y.

    2005-01-01

    Searching for life in astromaterials to be delivered from future missions to extraterrestrial bodies is undoubtedly related to studies of the properties and signatures of living microbial cells and microfossils on Earth. The Antarctic glacier and Earth permafrost habitats, where living microbial cells have preserved viability for millennia by entering the anabiotic state, are often regarded as model terrestrial analogs of Martian polar subsurface layers. For future findings of viable microorganisms in samples from extraterrestrial objects, it is important to use a combined methodology that includes classical microbiological methods, plating onto nutrient media, direct epifluorescence and electron microscopy examinations, detection of the elemental composition of cells, radiolabeling techniques, and PCR and FISH methods. Of great importance is to ensure the authenticity of microorganisms (if any are present in the studied samples) and to standardize the protocols used to minimize the risk of external contamination. Although convincing evidence of extraterrestrial microbial life may well come from the discovery of living cells in astromaterials, biomorphs and microfossils must also be regarded as targets in the search for evidence of life, bearing in mind a scenario in which living microorganisms were not preserved and underwent mineralization. Under laboratory conditions, processes that accompanied the fossilization of cyanobacteria were reconstructed, and artificially produced cyanobacterial stromatolites resemble in their morphological properties those found in natural Earth habitats. Given the vital importance of distinguishing between biogenic and abiogenic signatures and between living and fossil microorganisms in analyzed samples, it is worthwhile to use some previously developed approaches based on electron microscopy examinations and analysis of the elemental composition of biomorphs in situ and comparison with the analogous data obtained for laboratory microbial cultures and

  6. Minimizing Launch Mass for ISRU Processes

    NASA Technical Reports Server (NTRS)

    England, C.; Hallinan, K. P.

    2004-01-01

    The University of Dayton and the Jet Propulsion Laboratory are developing a methodology for estimating the Earth launch mass (ELM) of processes for In-Situ Resource Utilization (ISRU) with a focus on lunar resource recovery. ISRU may be enabling for an extended presence on the Moon, for large sample return missions, and for a human presence on Mars. To accomplish these exploration goals, the resources recovered by ISRU must offset the ELM for the recovery process. An appropriate figure of merit is the cost of the exploration mission, which is closely related to ELM. For a given production rate and resource concentration, the lowest ELM - and the best ISRU process - is achieved by minimizing capital equipment for both the ISRU process and energy production. ISRU processes incur Carnot limitations and second-law losses (irreversibilities) that ultimately determine production rate, material utilization and energy efficiencies. Heat transfer, chemical reaction, and mechanical operations affect the ELM in ways that are best understood by examining the process's detailed energetics. Schemes for chemical and thermal processing that do not incorporate an understanding of second-law losses will be incompletely understood. Our team is developing a methodology that will aid design and selection of ISRU processes by identifying the impact of thermodynamic losses on ELM. The methodology includes mechanical, thermal and chemical operations, and, when completed, will provide a procedure and rationale for optimizing their design and minimizing their cost. The technique for optimizing ISRU with respect to ELM draws on the work of England and Funk that relates the cost of endothermic processes to their second-law efficiencies. Our team combines their approach for recovering resources by chemical processing with analysis of thermal and mechanical operations in space. Commercial firms provide cost inputs for ELM and planetary landing. Additional information is included in the
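
    As a purely illustrative back-of-the-envelope version of such a figure of merit, the sketch below compares the launch mass of a hypothetical ISRU plant plus its power system against the mass of product it makes in situ over a mission; every number is an invented placeholder, not data from this study.

      # Toy Earth-launch-mass (ELM) comparison for an ISRU process.
      # Every number is a hypothetical placeholder used only for illustration.
      plant_mass_kg = 800.0              # ISRU reactor, mechanical handling, tanks
      specific_power_kg_per_kw = 30.0    # launch mass of the power system per kW delivered
      process_energy_kwh_per_kg = 8.0    # energy per kg of product, including second-law losses
      production_rate_kg_per_day = 10.0
      mission_days = 500.0

      power_kw = production_rate_kg_per_day * process_energy_kwh_per_kg / 24.0
      elm_isru_kg = plant_mass_kg + specific_power_kg_per_kw * power_kw
      product_kg = production_rate_kg_per_day * mission_days

      print(f"ISRU launch mass:     {elm_isru_kg:8.1f} kg")
      print(f"product made in situ: {product_kg:8.1f} kg")
      print(f"mass leverage:        {product_kg / elm_isru_kg:8.2f}x")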

  7. Influenza SIRS with Minimal Pneumonitis

    PubMed Central

    Erramilli, Shruti; Mannam, Praveen; Manthous, Constantine A.

    2016-01-01

    Although systemic inflammatory response syndrome (SIRS) is a known complication of severe influenza pneumonia, it has been reported very rarely in patients with minimal parenchymal lung disease. We here report a case of severe SIRS, anasarca, and marked vascular phenomena with minimal or no pneumonitis. This case highlights that viruses, including influenza, may cause vascular dysregulation causing SIRS, even without substantial visceral organ involvement.

  8. Guidelines for mixed waste minimization

    SciTech Connect

    Owens, C.

    1992-02-01

    Currently, there is no commercial mixed waste disposal available in the United States. Storage and treatment for commercial mixed waste is limited. Host state and compact region officials are encouraging their mixed waste generators to minimize their mixed wastes because of management limitations. This document provides a guide to mixed waste minimization.

  9. Waste minimization handbook, Volume 1

    SciTech Connect

    Boing, L.E.; Coffey, M.J.

    1995-12-01

    This technical guide presents various methods used by industry to minimize low-level radioactive waste (LLW) generated during decommissioning and decontamination (D and D) activities. Such activities generate significant amounts of LLW during their operations. Waste minimization refers to any measure, procedure, or technique that reduces the amount of waste generated during a specific operation or project. Preventive waste minimization techniques implemented when a project is initiated can significantly reduce waste. Techniques implemented during decontamination activities reduce the cost of decommissioning. The application of waste minimization techniques is not limited to D and D activities; it is also useful during any phase of a facility's life cycle. This compendium will be supplemented with a second volume of abstracts of hundreds of papers related to minimizing low-level nuclear waste. This second volume is expected to be released in late 1996.

  10. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  11. Minimizing waste in environmental restoration

    SciTech Connect

    Moos, L.; Thuot, J.R.

    1996-07-01

    Environmental restoration, decontamination and decommissioning, and facility dismantlement projects are not typically known for their waste minimization and pollution prevention efforts. Typical projects are driven by schedules and milestones with little attention given to cost or waste minimization. Conventional wisdom in these projects is that the waste already exists and cannot be reduced or minimized. In fact, however, there are three significant areas where waste and cost can be reduced. Waste reduction can occur in three ways: beneficial reuse or recycling; segregation of waste types; and reducing generation of secondary waste. This paper will discuss several examples of reuse, recycling, segregation, and secondary waste reduction in ANL restoration programs.

  12. Process waste assessment methodology for mechanical departments

    SciTech Connect

    Hedrick, R.B.

    1992-12-01

    Process waste assessments (PWAs) were performed for three pilot processes to develop a methodology for performing PWAs for all the various processes used throughout the mechanical departments. For each PWA, a material balance and a process flow diagram are defined, identifying the raw materials utilized in the process and the quantity and types of materials entering the waste streams from the process. The data and information are used to determine potential options for eliminating hazardous materials or minimizing the wastes generated.
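
    A material balance of the kind described is, at bottom, simple bookkeeping: total input mass must equal product plus waste streams plus any unaccounted losses. The sketch below is a generic illustration with made-up streams, not data from the pilot processes.

      # Generic process material balance (hypothetical numbers, kg per batch).
      inputs = {"solvent": 120.0, "metal stock": 500.0, "cutting fluid": 40.0}
      product_mass = 470.0
      waste_streams = {"spent solvent": 110.0, "metal chips": 30.0, "spent fluid": 38.0}

      total_in = sum(inputs.values())
      total_out = product_mass + sum(waste_streams.values())
      unaccounted = total_in - total_out            # losses, evaporation, measurement error

      print(f"total inputs:    {total_in:7.1f} kg")
      print(f"product + waste: {total_out:7.1f} kg")
      print(f"unaccounted:     {unaccounted:7.1f} kg")
      for stream, mass in waste_streams.items():
          print(f"  waste '{stream}': {mass:.1f} kg ({100 * mass / total_in:.1f}% of input)")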

  13. Heart bypass surgery - minimally invasive

    MedlinePlus

    ... in 30-day outcomes in high-risk patients randomized to off-pump versus on-pump coronary bypass ... Thiele H, Neumann-Schniedewind P, Jacobs S, et al. Randomized comparison of minimally invasive direct coronary artery bypass ...

  14. Mitral valve surgery - minimally invasive

    MedlinePlus

    ... that does many of these procedures. Minimally invasive heart valve surgery has improved greatly in recent years. These ... WT, Mack MJ. Transcatheter cardiac valve interventions. Surg Clin North Am . 2009;89:951-66. ...

  15. The Methodology of Magpies

    ERIC Educational Resources Information Center

    Carter, Susan

    2014-01-01

    Arts/Humanities researchers frequently do not explain methodology overtly; instead, they "perform" it through their use of language, textual and historic cross-reference, and theory. Here, methodologies from literary studies are shown to add to Higher Education (HE) an exegetical and critically pluralist approach. This includes…

  16. Menopause and Methodological Doubt

    ERIC Educational Resources Information Center

    Spence, Sheila

    2005-01-01

    Menopause and methodological doubt begins by making a tongue-in-cheek comparison between Descartes' methodological doubt and the self-doubt that can arise around menopause. A hermeneutic approach is taken in which Cartesian dualism and its implications for the way women are viewed in society are examined, both through the experiences of women…

  17. Theories and Methodologies.

    ERIC Educational Resources Information Center

    Skemp, Richard R.

    Provided is an examination of the methodology used to study the problems of learning addition and subtraction skills used by developmental researchers. The report has sections on categories of theory and their methodologies, which review: (1) Behaviorist, Neo-Behaviorist and Piagetian Theories; (2) the Behaviorist and Piagetian Paradigms; (3)…

  18. Data Centric Development Methodology

    ERIC Educational Resources Information Center

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly been adopting a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…

  19. Minimizing pollutants with multimedia strategies

    SciTech Connect

    Phillips, J.B.; Hindawi, M.A.

    1997-01-01

    A multimedia approach to pollution prevention that focuses on minimizing or eliminating production of pollutants is one of the most advantageous strategies to adopt in preparing an overall facility environmental plan. If processes are optimized to preclude or minimize the manufacture of streams containing pollutants, or to reduce the levels of pollutants in waste streams, then the task of multimedia pollution prevention becomes more manageable simply as a result of a smaller problem needing to be addressed. An orderly and systematic approach to waste minimization can result in a comprehensive strategy to reduce the production of waste streams and simultaneously improve the profitability of a process or industrial operation. There are a number of strategies for waste minimization that attack the problem via process chemistry or engineering. Examples include installation of low-NOx burners, selection of valves that minimize fugitive emissions, high-level switches on storage tanks, the use of in-plant stills for recycling and reusing solvents, and the use of water-based products instead of hydrocarbon-based products wherever possible. Other waste minimization countermeasures can focus on operations and maintenance (O&M) issues.

  20. Specialized minimal PDFs for optimized LHC calculations

    NASA Astrophysics Data System (ADS)

    Carrazza, Stefano; Forte, Stefano; Kassabov, Zahari; Rojo, Juan

    2016-04-01

    We present a methodology for the construction of parton distribution functions (PDFs) designed to provide an accurate representation of PDF uncertainties for specific processes or classes of processes with a minimal number of PDF error sets: specialized minimal PDF sets, or SM-PDFs. We construct these SM-PDFs in such a way that sets corresponding to different input processes can be combined without losing information, specifically as regards their correlations, and that they are robust upon smooth variations of the kinematic cuts. The proposed strategy never discards information, so that the SM-PDF sets can be enlarged by the addition of new processes, until the prior PDF set is eventually recovered for a large enough set of processes. We illustrate the method by producing SM-PDFs tailored to Higgs, top-quark pair, and electroweak gauge boson physics, and we determine that, when the PDF4LHC15 combined set is used as the prior, around 11, 4, and 11 Hessian eigenvectors, respectively, are enough to fully describe the corresponding processes.
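
    The general idea of compressing a Hessian PDF set for a chosen group of observables can be illustrated with a simple principal-component reduction; the sketch below is a schematic of that idea using random stand-in numbers, and is not the actual SM-PDF algorithm or the PDF4LHC15 set.

      import numpy as np

      rng = np.random.default_rng(1)
      n_eig, n_obs = 30, 12        # prior Hessian error directions, observables in the process group

      # Stand-in sensitivity matrix: change of each observable along each Hessian direction.
      S = rng.standard_normal((n_obs, n_eig)) * rng.uniform(0.05, 1.0, n_eig)

      # A singular value decomposition identifies the few combinations of Hessian
      # directions that dominate the PDF uncertainty of this group of observables.
      _, sing, _ = np.linalg.svd(S, full_matrices=False)
      frac = np.cumsum(sing**2) / np.sum(sing**2)
      kept = int(np.searchsorted(frac, 0.999)) + 1

      print(f"prior error sets:                       {n_eig}")
      print(f"reduced sets keeping 99.9% of variance: {kept}")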

  1. [Essential genes, minimal genome and synthetic cell of bacteria: a review].

    PubMed

    Qiu, Dongru

    2012-05-01

    Single-cell prokaryotes represent a simple and primitive cellular life form. The identification of the essential genes of bacteria and the minimal genome for the free-living cellular life could provide insights into the origin, evolution, and essence of life forms. The principles, methodology, and recent progresses in the identification of essential genes and minimal genome and the creation of synthetic cells are reviewed and particularly the strategies for creating the minimal genome and the potential applications are introduced. PMID:22916492

  2. Minimally invasive surgery for atrial fibrillation

    PubMed Central

    Suwalski, Piotr

    2013-01-01

    Atrial fibrillation (AF) remains the most common cardiac arrhythmia, affecting nearly 2% of the general population worldwide. Minimally invasive surgical ablation remains one of the most dynamically evolving fields of modern cardiac surgery. While there are more than a dozen issues driving this development, two seem to play the most important role: first, there is a lack of evidence supporting a percutaneous catheter-based approach to treat patients with persistent and long-standing persistent AF. The paucity of such data offers the surgical community an unparalleled opportunity to challenge guidelines and change indications for surgical intervention. Large, multicenter prospective clinical studies are therefore of utmost importance, as well as honest, clear data reporting. Second, a collaborative methodology started a long-awaited debate on a Heart Team approach to AF, similar to the debate on coronary artery disease and transcatheter valves. Appropriate patient selection and tailored treatment options will most certainly result in better outcomes and patient satisfaction, coupled with appropriate use of always-limited institutional resources. The aim of this review, unlike other reviews of minimally invasive surgical ablation, is to present medical professionals with two distinctly different approaches. The first one is purely surgical: standalone surgical isolation of the pulmonary veins using a bipolar energy source with concomitant amputation of the left atrial appendage—the method of choice in one of the most important clinical trials on AF, the Atrial Fibrillation Catheter Ablation Versus Surgical Ablation Treatment (FAST) Trial. The second one represents the most complex approach to this problem: a multidisciplinary, combined effort of a cardiac surgeon and an electrophysiologist. The Convergent Procedure, which includes both endocardial and epicardial unipolar ablation, bonds together minimally invasive endoscopic surgery with electroanatomical mapping, to deliver best of

  3. Minimally invasive surgery for atrial fibrillation.

    PubMed

    Zembala, Michael O; Suwalski, Piotr

    2013-11-01

    Atrial fibrillation (AF) remains the most common cardiac arrhythmia, affecting nearly 2% of the general population worldwide. Minimally invasive surgical ablation remains one of the most dynamically evolving fields of modern cardiac surgery. While there are more than a dozen issues driving this development, two seem to play the most important role: first, there is a lack of evidence supporting a percutaneous catheter-based approach to treat patients with persistent and long-standing persistent AF. The paucity of such data offers the surgical community an unparalleled opportunity to challenge guidelines and change indications for surgical intervention. Large, multicenter prospective clinical studies are therefore of utmost importance, as well as honest, clear data reporting. Second, a collaborative methodology started a long-awaited debate on a Heart Team approach to AF, similar to the debate on coronary artery disease and transcatheter valves. Appropriate patient selection and tailored treatment options will most certainly result in better outcomes and patient satisfaction, coupled with appropriate use of always-limited institutional resources. The aim of this review, unlike other reviews of minimally invasive surgical ablation, is to present medical professionals with two distinctly different approaches. The first one is purely surgical: standalone surgical isolation of the pulmonary veins using a bipolar energy source with concomitant amputation of the left atrial appendage - the method of choice in one of the most important clinical trials on AF, the Atrial Fibrillation Catheter Ablation Versus Surgical Ablation Treatment (FAST) Trial. The second one represents the most complex approach to this problem: a multidisciplinary, combined effort of a cardiac surgeon and an electrophysiologist. The Convergent Procedure, which includes both endocardial and epicardial unipolar ablation, bonds together minimally invasive endoscopic surgery with electroanatomical mapping, to deliver best of the

  4. Minimally invasive video-assisted versus minimally invasive nonendoscopic thyroidectomy.

    PubMed

    Fík, Zdeněk; Astl, Jaromír; Zábrodský, Michal; Lukeš, Petr; Merunka, Ilja; Betka, Jan; Chovanec, Martin

    2014-01-01

    Minimally invasive video-assisted thyroidectomy (MIVAT) and minimally invasive nonendoscopic thyroidectomy (MINET) represent well accepted and reproducible techniques developed with the main goal of improving cosmetic outcome, accelerating healing, and increasing patient comfort following thyroid surgery. Between 2007 and 2011, a prospective nonrandomized study of patients undergoing minimally invasive thyroid surgery was performed to compare advantages and disadvantages of the two different techniques. There were no significant differences in the length of incision required to perform the surgical procedures. Mean duration of hemithyroidectomy was comparable in both groups, but it was more time-consuming to perform total thyroidectomy by MIVAT. There were more patients undergoing MIVAT procedures without active drainage in the postoperative course, and we could also see a trend toward less pain in the same group. This was paralleled by a statistically significant decrease in the administration of both opiate and nonopiate analgesics. We encountered two cases of recurrent laryngeal nerve palsy in the MIVAT group only. MIVAT and MINET represent safe and feasible alternatives to conventional thyroid surgery in selected cases, and this prospective study has shown minimal differences between these two techniques. PMID:24800227

  5. The New Minimal Standard Model

    SciTech Connect

    Davoudiasl, Hooman; Kitano, Ryuichiro; Li, Tianjun; Murayama, Hitoshi

    2005-01-13

    We construct the New Minimal Standard Model that incorporates the new discoveries of physics beyond the Minimal Standard Model (MSM): Dark Energy, non-baryonic Dark Matter, neutrino masses, as well as baryon asymmetry and cosmic inflation, adopting the principle of minimal particle content and the most general renormalizable Lagrangian. We base the model purely on empirical facts rather than aesthetics. We need only six new degrees of freedom beyond the MSM. It is free from excessive flavor-changing effects, CP violation, too-rapid proton decay, problems with electroweak precision data, and unwanted cosmological relics. Any model of physics beyond the MSM should be measured against the phenomenological success of this model.

  6. Technology transfer methodology

    NASA Technical Reports Server (NTRS)

    Labotz, Rich

    1991-01-01

    Information on technology transfer methodology is given in viewgraph form. Topics covered include problems in economics, technology drivers, inhibitors to using improved technology in development, technology application opportunities, and co-sponsorship of technology.

  7. In vivo minimally invasive interstitial multi-functional microendoscopy

    PubMed Central

    Shahmoon, Asaf; Aharon, Shiran; Kruchik, Oded; Hohmann, Martin; Slovin, Hamutal; Douplik, Alexandre; Zalevsky, Zeev

    2013-01-01

    Developing minimally invasive methodologies for imaging of internal organs is an emerging field in biomedical research. This paper introduces a new multi-functional microendoscope device capable of imaging internal organs with minimally invasive intervention. In addition, the developed microendoscope can also be employed as a monitoring device for measuring local hemoglobin concentration in the bloodstream when introduced into an artery. The microendoscope device has a total external diameter of only 200 μm and provides a high imaging resolution of more than 5,000 pixels. The device can detect features with a spatial resolution of less than 1 μm. The microendoscope has been tested both in vitro and in vivo in rats, presenting a promising and powerful high-resolution, minimally invasive imaging tool suitable for previously unreachable clinical modalities. PMID:23712369

  8. LLNL Waste Minimization Program Plan

    SciTech Connect

    Not Available

    1990-02-14

    This document is the February 14, 1990 version of the LLNL Waste Minimization Program Plan (WMPP). The Waste Minimization Policy field has undergone continuous changes since its formal inception in the 1984 HSWA legislation. The first LLNL WMPP, Revision A, is dated March 1985. A series of informal revisions was made on approximately a semi-annual basis. This Revision 2 is the third formal issuance of the WMPP document. EPA has issued a proposed new policy statement on source reduction and recycling. This policy reflects a preventative strategy to reduce or eliminate the generation of environmentally harmful pollutants which may be released to the air, land surface, water, or ground water. In accordance with this new policy, new guidance to hazardous waste generators on the elements of a Waste Minimization Program was issued. In response to these policies, DOE has revised and issued implementation guidance for DOE Order 5400.1, Waste Minimization Plan and Waste Reduction reporting of DOE Hazardous, Radioactive, and Radioactive Mixed Wastes, final draft January 1990. This WMPP is formatted to meet the current DOE guidance outlines. The current WMPP will be revised to reflect all of these proposed changes when guidelines are established. Updates, changes and revisions to the overall LLNL WMPP will be made as appropriate to reflect ever-changing regulatory requirements. 3 figs., 4 tabs.

  9. WASTE MINIMIZATION OPPORTUNITY ASSESSMENT MANUAL

    EPA Science Inventory

    Waste minimization (WM) is a policy specifically mandated by the U.S. Congress in the 1984 Hazardous and Solid Wastes Amendments to the Resource Conservation and Recovery Act (RCRA). The RCRA regulations require that generators of hazardous waste have a program in place to reduce...

  10. Assembly of a minimal protocell

    NASA Astrophysics Data System (ADS)

    Rasmussen, Steen

    2007-03-01

    What is minimal life, how can we make it, and how can it be useful? We present experimental and computational results towards bridging nonliving and living matter, which results in life that is different and much simpler than contemporary life. A simple yet tightly coupled catalytic cooperation between genes, metabolism, and container forms the design underpinnings of our protocell, which is a minimal self-replicating molecular machine. Experimentally, we have recently demonstrated this coupling by having an informational molecule (8-oxoguanine) catalytically control the light driven metabolic (Ru-bpy based) production of container materials (fatty acids). This is a significant milestone towards assembling a minimal self-replicating molecular machine. Recent theoretical investigations indicate that coordinated exponential component growth should naturally emerge as a result from such a catalytic coupling between the main protocellular components. A 3-D dissipative particle simulation (DPD) study of the full protocell life-cycle exposes a number of anticipated systemic issues associated with the remaining experimental challenges for the implementation of the minimal protocell. Finally we outline how more general self-replicating materials could be useful.

  11. Minimally invasive aortic valve surgery.

    PubMed

    Castrovinci, Sebastiano; Emmanuel, Sam; Moscarelli, Marco; Murana, Giacomo; Caccamo, Giuseppa; Bertolino, Emanuela Clara; Nasso, Giuseppe; Speziale, Giuseppe; Fattouch, Khalil

    2016-09-01

    Aortic valve disease is a prevalent disorder that affects approximately 2% of the general adult population. Surgical aortic valve replacement is the gold standard treatment for symptomatic patients. This treatment has demonstrably proven to be both safe and effective. Over the last few decades, in an attempt to reduce surgical trauma, different minimally invasive approaches for aortic valve replacement have been developed and are now being increasingly utilized. A narrative review of the literature was carried out to describe the surgical techniques for minimally invasive aortic valve surgery and report the results from different experienced centers. Minimally invasive aortic valve replacement is associated with low perioperative morbidity, mortality and a low conversion rate to full sternotomy. Long-term survival appears to be at least comparable to that reported for conventional full sternotomy. Minimally invasive aortic valve surgery, either with a partial upper sternotomy or a right anterior minithoracotomy provides early- and long-term benefits. Given these benefits, it may be considered the standard of care for isolated aortic valve disease. PMID:27582764

  12. A Defense of Semantic Minimalism

    ERIC Educational Resources Information Center

    Kim, Su

    2012-01-01

    Semantic Minimalism is a position about the semantic content of declarative sentences, i.e., the content that is determined entirely by syntax. It is defined by the following two points: "Point 1": The semantic content is a complete/truth-conditional proposition. "Point 2": The semantic content is useful to a theory of…

  13. Minimally invasive aortic valve surgery

    PubMed Central

    Castrovinci, Sebastiano; Emmanuel, Sam; Moscarelli, Marco; Murana, Giacomo; Caccamo, Giuseppa; Bertolino, Emanuela Clara; Nasso, Giuseppe; Speziale, Giuseppe; Fattouch, Khalil

    2016-01-01

    Aortic valve disease is a prevalent disorder that affects approximately 2% of the general adult population. Surgical aortic valve replacement is the gold standard treatment for symptomatic patients. This treatment has demonstrably proven to be both safe and effective. Over the last few decades, in an attempt to reduce surgical trauma, different minimally invasive approaches for aortic valve replacement have been developed and are now being increasingly utilized. A narrative review of the literature was carried out to describe the surgical techniques for minimally invasive aortic valve surgery and report the results from different experienced centers. Minimally invasive aortic valve replacement is associated with low perioperative morbidity, mortality and a low conversion rate to full sternotomy. Long-term survival appears to be at least comparable to that reported for conventional full sternotomy. Minimally invasive aortic valve surgery, either with a partial upper sternotomy or a right anterior minithoracotomy provides early- and long-term benefits. Given these benefits, it may be considered the standard of care for isolated aortic valve disease. PMID:27582764

  14. Minimally invasive surgical approach to pancreatic malignancies

    PubMed Central

    Bencini, Lapo; Annecchiarico, Mario; Farsi, Marco; Bartolini, Ilenia; Mirasolo, Vita; Guerra, Francesco; Coratti, Andrea

    2015-01-01

    Pancreatic surgery for malignancy is recognized as challenging for surgeons and risky for patients due to substantial perioperative morbidity and mortality. Furthermore, the long-term oncological results are largely disappointing, even for those patients who experience an uneventful hospital stay. Nevertheless, surgery still remains the cornerstone of multidisciplinary treatment for pancreatic cancer. In order to maximize the benefits of surgery, the advent of both laparoscopy and robotics has led many surgeons to treat pancreatic cancers with these new methodologies. The reduction of postoperative complications, length of hospital stay and pain, together with a shorter interval between surgery and the beginning of adjuvant chemotherapy, represent the potential advantages over conventional surgery. Lastly, a better cosmetic result, although not crucial in cancer patients, could also play a role by improving overall well-being and patient self-perception. The laparoscopic approach to pancreatic surgery is, however, difficult in inexperienced hands and requires dedicated training in both advanced laparoscopy and pancreatic surgery. The recent wide diffusion of the da Vinci® robotic platform seems to facilitate many of the technical maneuvers, such as anastomotic biliary and pancreatic reconstructions, accurate lymphadenectomy, and vascular sutures. The two main pancreatic operations, distal pancreatectomy and pancreaticoduodenectomy, are approachable by a minimally invasive path, but more limited interventions such as enucleation are also feasible. Nevertheless, a word of caution is warranted regarding the increasing costs of these new technologies, because the main concerns remain the maintenance of all oncological standards and the lack of long-term follow-up. The purpose of this review is to examine the evidence for the use of minimally invasive surgery in pancreatic cancer (and less aggressive tumors

  15. Toward a Minimal Artificial Axon.

    PubMed

    Ariyaratne, Amila; Zocchi, Giovanni

    2016-07-01

The electrophysiology of action potentials is usually studied in neurons, through relatively demanding experiments which are difficult to scale up to a defined network. Here we pursue instead the minimal artificial system based on the essential biological components, ion channels and lipid bilayers, where action potentials can be generated, propagated, and eventually networked. The fundamental unit is the classic supported bilayer: a planar bilayer patch with embedded ion channels in a fluidic environment where an ionic gradient is imposed across the bilayer. Two such units electrically connected form the basic building block for a network. The system is minimal in that we demonstrate that one kind of ion channel and correspondingly a gradient of only one ionic species is sufficient to generate an excitable system which shows amplification and threshold behavior. PMID:27049652

  16. Minimal Doubling and Point Splitting

    SciTech Connect

    Creutz, M.

    2010-06-14

    Minimally-doubled chiral fermions have the unusual property of a single local field creating two fermionic species. Spreading the field over hypercubes allows construction of combinations that isolate specific modes. Combining these fields into bilinears produces meson fields of specific quantum numbers. Minimally-doubled fermion actions present the possibility of fast simulations while maintaining one exact chiral symmetry. They do, however, introduce some peculiar aspects. An explicit breaking of hyper-cubic symmetry allows additional counter-terms to appear in the renormalization. While a single field creates two different species, spreading this field over nearby sites allows isolation of specific states and the construction of physical meson operators. Finally, lattice artifacts break isospin and give two of the three pseudoscalar mesons an additional contribution to their mass. Depending on the sign of this mass splitting, one can either have a traditional Goldstone pseudoscalar meson or a parity breaking Aoki-like phase.

  17. Anaesthesia for minimally invasive surgery

    PubMed Central

    Dec, Marta

    2015-01-01

    Minimally invasive surgery (MIS) is rising in popularity. It offers well-known benefits to the patient. However, restricted access to the surgical site and gas insufflation into the body cavities may result in severe complications. From the anaesthetic point of view MIS poses unique challenges associated with creation of pneumoperitoneum, carbon dioxide absorption, specific positioning and monitoring a patient to whom the anaesthetist has often restricted access, in a poorly lit environment. Moreover, with refinement of surgical procedures and growing experience the anaesthetist is presented with patients from high-risk groups (obese, elderly, with advanced cardiac and respiratory disease) who once were deemed unsuitable for the laparoscopic technique. Anaesthetic management is aimed at getting the patient safely through the procedure, minimizing the specific risks arising from laparoscopy and the patient's coexisting medical problems, ensuring quick recovery and a relatively pain-free postoperative course with early return to normal function. PMID:26865885

  18. Minimal universal quantum heat machine.

    PubMed

    Gelbwaser-Klimovsky, D; Alicki, R; Kurizki, G

    2013-01-01

    In traditional thermodynamics the Carnot cycle yields the ideal performance bound of heat engines and refrigerators. We propose and analyze a minimal model of a heat machine that can play a similar role in quantum regimes. The minimal model consists of a single two-level system with periodically modulated energy splitting that is permanently, weakly, coupled to two spectrally separated heat baths at different temperatures. The equation of motion allows us to compute the stationary power and heat currents in the machine consistent with the second law of thermodynamics. This dual-purpose machine can act as either an engine or a refrigerator (heat pump) depending on the modulation rate. In both modes of operation, the maximal Carnot efficiency is reached at zero power. We study the conditions for finite-time optimal performance for several variants of the model. Possible realizations of the model are discussed. PMID:23410316

  19. Principle of minimal work fluctuations.

    PubMed

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality ⟨e^(-βW)⟩ = e^(-βΔF), a change in the fluctuations of e^(-βW) may impact how rapidly the statistical average of e^(-βW) converges towards the theoretical value e^(-βΔF), where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^(-βW). In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^(-βW), where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain where the classical work protocol is realizable by an adiabatic process, then the classical adiabatic process also yields the minimal fluctuations in e^(-βW). Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014)]. PMID:26382367
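
    A hedged numerical sketch of the principle stated above (not taken from the paper): for a two-level system, the work W is defined by two energy measurements and the protocol is modeled by a doubly stochastic transition matrix; the level energies, temperature, and flip probabilities below are illustrative assumptions. The mean of e^(-βW) reproduces the Jarzynski value for any flip probability, while its variance is smallest for the adiabatic-like case with no level change.

    # Hedged sketch: two-level work statistics with hypothetical parameters.
    import numpy as np

    beta = 1.0                       # inverse temperature (arbitrary units)
    E  = np.array([0.0, 1.0])        # initial level energies
    Ep = np.array([0.0, 2.0])        # final level energies

    Zi, Zf = np.exp(-beta * E).sum(), np.exp(-beta * Ep).sum()
    p_init = np.exp(-beta * E) / Zi          # initial thermal populations
    jarzynski_rhs = Zf / Zi                  # e^(-beta * DeltaF)

    def work_statistics(p_flip):
        """Mean and variance of e^(-beta W) when levels swap with probability p_flip."""
        T = np.array([[1 - p_flip, p_flip],
                      [p_flip, 1 - p_flip]])  # doubly stochastic transition matrix
        mean = second = 0.0
        for i in range(2):
            for j in range(2):
                prob = p_init[i] * T[i, j]
                x = np.exp(-beta * (Ep[j] - E[i]))   # e^(-beta W) for this outcome
                mean += prob * x
                second += prob * x**2
        return mean, second - mean**2

    for p_flip in (0.0, 0.25, 0.5):              # p_flip = 0 mimics an adiabatic protocol
        mean, var = work_statistics(p_flip)
        print(f"p_flip={p_flip:4.2f}  mean={mean:.4f}  rhs={jarzynski_rhs:.4f}  var={var:.4f}")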

  20. Principle of minimal work fluctuations

    NASA Astrophysics Data System (ADS)

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality ⟨e^(-βW)⟩ = e^(-βΔF), a change in the fluctuations of e^(-βW) may impact how rapidly the statistical average of e^(-βW) converges towards the theoretical value e^(-βΔF), where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^(-βW). In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^(-βW), where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain where the classical work protocol is realizable by an adiabatic process, then the classical adiabatic process also yields the minimal fluctuations in e^(-βW). Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014), 10.1103/PhysRevE.90.052132].

  1. Minimizing liability during internal investigations.

    PubMed

    Morris, Cole

    2010-01-01

Today's security professional must appreciate the potential landmines in any investigative effort and work collaboratively with others to minimize liability risks, the author points out. In this article he examines six civil torts that commonly arise from unprofessionally planned or poorly executed internal investigations: defamation, false imprisonment, intentional infliction of emotional distress, assault and battery, invasion of privacy, and malicious prosecution and abuse of process. PMID:20873494

  2. Minimal absent words in four human genome assemblies.

    PubMed

    Garcia, Sara P; Pinho, Armando J

    2011-01-01

    Minimal absent words have been computed in genomes of organisms from all domains of life. Here, we aim to contribute to the catalogue of human genomic variation by investigating the variation in number and content of minimal absent words within a species, using four human genome assemblies. We compare the reference human genome GRCh37 assembly, the HuRef assembly of the genome of Craig Venter, the NA12878 assembly from cell line GM12878, and the YH assembly of the genome of a Han Chinese individual. We find the variation in number and content of minimal absent words between assemblies more significant for large and very large minimal absent words, where the biases of sequencing and assembly methodologies become more pronounced. Moreover, we find generally greater similarity between the human genome assemblies sequenced with capillary-based technologies (GRCh37 and HuRef) than between the human genome assemblies sequenced with massively parallel technologies (NA12878 and YH). Finally, as expected, we find the overall variation in number and content of minimal absent words within a species to be generally smaller than the variation between species. PMID:22220210
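
    As a hedged illustration of the underlying notion (not the authors' genome-scale pipeline, which relies on suffix-automaton or suffix-array techniques), the brute-force sketch below enumerates minimal absent words of a short DNA string: words that do not occur in the sequence although both their longest proper prefix and longest proper suffix do.

    # Hedged sketch: brute-force minimal absent words of a toy sequence.
    def kmers(seq, k):
        return {seq[i:i + k] for i in range(len(seq) - k + 1)}

    def minimal_absent_words(seq, alphabet="ACGT", max_len=6):
        # Length-1 absent words (missing letters) are ignored here for brevity.
        maws = set()
        for k in range(2, max_len + 1):
            present_k = kmers(seq, k)
            present_km1 = kmers(seq, k - 1)
            for core in present_km1:            # longest proper prefix must occur
                for a in alphabet:
                    w = core + a
                    if w not in present_k and w[1:] in present_km1:
                        maws.add(w)             # absent, but prefix and suffix occur
        return sorted(maws)

    print(minimal_absent_words("ACGTACGGACGT"))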

  3. Document Conversion Methodology.

    ERIC Educational Resources Information Center

    Bovee, Donna

    1990-01-01

    Discusses digital imaging technology and examines document database conversion considerations. Two types of document imaging systems are described: (1) a work in process system, and (2) a storage and retrieval system. Conversion methodology is outlined, and a document conversion scenario is presented as a practical guide to conversion. (LRW)

  4. Complicating Methodological Transparency

    ERIC Educational Resources Information Center

    Bridges-Rhoads, Sarah; Van Cleave, Jessica; Hughes, Hilary E.

    2016-01-01

    A historical indicator of the quality, validity, and rigor of qualitative research has been the documentation and disclosure of the behind-the-scenes work of the researcher. In this paper, we use what we call "methodological data" as a tool to complicate the possibility and desirability of such transparency. Specifically, we draw on our…

  5. Video: Modalities and Methodologies

    ERIC Educational Resources Information Center

    Hadfield, Mark; Haw, Kaye

    2012-01-01

    In this article, we set out to explore what we describe as the use of video in various modalities. For us, modality is a synthesizing construct that draws together and differentiates between the notion of "video" both as a method and as a methodology. It encompasses the use of the term video as both product and process, and as a data collection…

  6. Courseware Engineering Methodology.

    ERIC Educational Resources Information Center

    Uden, Lorna

    2002-01-01

    Describes development of the Courseware Engineering Methodology (CEM), created to guide novices in designing effective courseware. Discusses CEM's four models: pedagogical (concerned with the courseware's pedagogical aspects), conceptual (dealing with software engineering), interface (relating to human-computer interaction), and hypermedia…

  7. SCI Hazard Report Methodology

    NASA Technical Reports Server (NTRS)

    Mitchell, Michael S.

    2010-01-01

This slide presentation reviews the methodology in creating a Source Control Item (SCI) Hazard Report (HR). The SCI HR provides a system safety risk assessment for the following Ares I Upper Stage Production Contract (USPC) components: (1) Pyro Separation Systems, (2) Main Propulsion System, (3) Reaction and Roll Control Systems, (4) Thrust Vector Control System, and (5) Ullage Settling Motor System components.

  8. Temporal structure of consciousness and minimal self in schizophrenia.

    PubMed

    Martin, Brice; Wittmann, Marc; Franck, Nicolas; Cermolacce, Michel; Berna, Fabrice; Giersch, Anne

    2014-01-01

The concept of the minimal self refers to the consciousness of oneself as an immediate subject of experience. According to recent studies, disturbances of the minimal self may be a core feature of schizophrenia. They are emphasized in classical psychiatry literature and in phenomenological work. Impaired minimal self-experience may be defined as a distortion of one's first-person experiential perspective as, for example, an "altered presence" during which the sense of the experienced self ("mineness") is subtly affected, or "altered sense of demarcation," i.e., a difficulty discriminating the self from the non-self. Little is known, however, about the cognitive basis of these disturbances. In fact, recent work indicates that disorders of the self are not correlated with cognitive impairments commonly found in schizophrenia such as working-memory and attention disorders. In addition, a major difficulty with exploring the minimal self experimentally lies in its definition as being non-self-reflexive, and distinct from the verbalized, explicit awareness of an "I." In this paper, we shall discuss the possibility that disturbances of the minimal self observed in patients with schizophrenia are related to alterations in time processing. We shall review the literature on schizophrenia and time processing that lends support to this possibility. In particular we shall discuss the involvement of temporal integration windows on different time scales (implicit time processing) as well as duration perception disturbances (explicit time processing) in disorders of the minimal self. We argue that a better understanding of the relationship between time and the minimal self, as well as of issues of embodiment, requires research that looks more specifically at implicit time processing. Some methodological issues will be discussed. PMID:25400597

  9. Temporal structure of consciousness and minimal self in schizophrenia

    PubMed Central

    Martin, Brice; Wittmann, Marc; Franck, Nicolas; Cermolacce, Michel; Berna, Fabrice; Giersch, Anne

    2014-01-01

The concept of the minimal self refers to the consciousness of oneself as an immediate subject of experience. According to recent studies, disturbances of the minimal self may be a core feature of schizophrenia. They are emphasized in classical psychiatry literature and in phenomenological work. Impaired minimal self-experience may be defined as a distortion of one’s first-person experiential perspective as, for example, an “altered presence” during which the sense of the experienced self (“mineness”) is subtly affected, or “altered sense of demarcation,” i.e., a difficulty discriminating the self from the non-self. Little is known, however, about the cognitive basis of these disturbances. In fact, recent work indicates that disorders of the self are not correlated with cognitive impairments commonly found in schizophrenia such as working-memory and attention disorders. In addition, a major difficulty with exploring the minimal self experimentally lies in its definition as being non-self-reflexive, and distinct from the verbalized, explicit awareness of an “I.” In this paper, we shall discuss the possibility that disturbances of the minimal self observed in patients with schizophrenia are related to alterations in time processing. We shall review the literature on schizophrenia and time processing that lends support to this possibility. In particular we shall discuss the involvement of temporal integration windows on different time scales (implicit time processing) as well as duration perception disturbances (explicit time processing) in disorders of the minimal self. We argue that a better understanding of the relationship between time and the minimal self, as well as of issues of embodiment, requires research that looks more specifically at implicit time processing. Some methodological issues will be discussed. PMID:25400597

  10. Evidence-Based Integrated Environmental Solutions For Secondary Lead Smelters: Pollution Prevention And Waste Minimization Technologies And Practices

    EPA Science Inventory

    An evidence-based methodology was adopted in this research to establish strategies to increase lead recovery and recycling via a systematic review and critical appraisal of the published literature. In particular, the research examines pollution prevention and waste minimization...

  11. Unsupported standing with minimized ankle muscle fatigue.

    PubMed

    Mihelj, Matjaz; Munih, Marko

    2004-08-01

In the past, limited unsupported standing has been restored in patients with thoracic spinal cord injury through open-loop functional electrical stimulation of paralyzed knee extensor muscles and the support of intact arm musculature. Here an optimal control system for paralyzed ankle muscles was designed that enables the subject to stand without hand support in the sagittal plane. The paraplegic subject was conceptualized as an underactuated double inverted pendulum structure with an active degree of freedom in the upper trunk and a passive degree of freedom in the paralyzed ankle joints. Control system design is based on the minimization of a cost function that estimates the effort of the ankle joint muscles via observation of the ground reaction force position relative to the ankle joint axis. Furthermore, such a control system integrates voluntary upper trunk activity and artificial control of ankle joint muscles, resulting in a robust standing posture. Figures are shown for the initial simulation study, followed by disturbance tests on an intact volunteer and several laboratory trials with a paraplegic person. Benefits of the presented methodology are prolonged standing sessions and the fact that the subject is able to maintain voluntary control over upper body orientation in space, enabling simple functional standing. PMID:15311817

  12. Development of a flight software testing methodology

    NASA Technical Reports Server (NTRS)

    Mccluskey, E. J.; Andrews, D. M.

    1985-01-01

    The research to develop a testing methodology for flight software is described. An experiment was conducted in using assertions to dynamically test digital flight control software. The experiment showed that 87% of typical errors introduced into the program would be detected by assertions. Detailed analysis of the test data showed that the number of assertions needed to detect those errors could be reduced to a minimal set. The analysis also revealed that the most effective assertions tested program parameters that provided greater indirect (collateral) testing of other parameters. In addition, a prototype watchdog task system was built to evaluate the effectiveness of executing assertions in parallel by using the multitasking features of Ada.
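
    As a hedged illustration of what an executable assertion looks like in this kind of testing (a toy example, not the flight software studied above; the control law, variable names, and bounds are invented), range checks on inputs and on a computed command can detect seeded errors, and checking a derived output also indirectly (collaterally) tests the parameters that produced it.

    # Hedged sketch: assertion-based checking of a toy control-law step.
    def update_pitch_command(pitch_deg, pitch_rate_dps, dt):
        # Range assertions on inputs (would flag corrupted data or seeded errors).
        assert -90.0 <= pitch_deg <= 90.0, "pitch out of physical range"
        assert abs(pitch_rate_dps) <= 60.0, "pitch rate exceeds limit"
        assert 0.0 < dt <= 0.1, "unexpected integration step"

        cmd = 0.8 * pitch_deg + 0.2 * pitch_rate_dps * dt

        # Collateral assertion: checking the output indirectly tests inputs and gains.
        assert abs(cmd) <= 100.0, "command saturation violated"
        return cmd

    print(update_pitch_command(5.0, 2.0, 0.02))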

  13. A POLLUTION REDUCTION METHODOLOGY FOR CHEMICAL PROCESS SIMULATORS

    EPA Science Inventory

    A pollution minimization methodology was developed for chemical process design using computer simulation. It is based on a pollution balance that at steady state is used to define a pollution index with units of mass of pollution per mass of products. The pollution balance has be...
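
    One plausible way to write the steady-state index described above (a hedged reading of the abstract, not necessarily the exact form used in the simulator methodology) is as the ratio of total pollutant mass flow to total product mass flow:

    \[
      I \;=\; \frac{\sum_k \dot{m}^{\mathrm{pollutant}}_k}{\sum_j \dot{m}^{\mathrm{product}}_j}
      \qquad \left[\frac{\text{mass of pollutant}}{\text{mass of product}}\right]
    \]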

  14. Optimal needle design for minimal insertion force and bevel length.

    PubMed

    Wang, Yancheng; Chen, Roland K; Tai, Bruce L; McLaughlin, Patrick W; Shih, Albert J

    2014-09-01

    This research presents a methodology for optimal design of the needle geometry to minimize the insertion force and bevel length based on mathematical models of cutting edge inclination and rake angles and the insertion force. In brachytherapy, the needle with lower insertion force typically is easier for guidance and has less deflection. In this study, the needle with lancet point (denoted as lancet needle) is applied to demonstrate the model-based optimization for needle design. Mathematical models to calculate the bevel length and inclination and rake angles for lancet needle are presented. A needle insertion force model is developed to predict the insertion force for lancet needle. The genetic algorithm is utilized to optimize the needle geometry for two cases. One is to minimize the needle insertion force. Using the geometry of a commercial lancet needle as the baseline, the optimized needle has 11% lower insertion force with the same bevel length. The other case is to minimize the bevel length under the same needle insertion force. The optimized design can reduce the bevel length by 46%. Both optimized needle designs were validated experimentally in ex vivo porcine liver needle insertion tests and demonstrated the methodology of the model-based optimal needle design. PMID:24957487
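
    A hedged sketch of the optimization step follows; the insertion-force surrogate is a made-up stand-in (the paper's force model based on cutting-edge inclination and rake angles is not reproduced), and the bounds and GA settings are illustrative only.

    # Hedged sketch: genetic-algorithm search over hypothetical bevel angles.
    import random

    def insertion_force(primary_deg, secondary_deg):
        # Toy surrogate objective, not the authors' insertion-force model.
        return (0.02 * primary_deg**2 - 0.5 * primary_deg
                + 0.01 * (secondary_deg - 25.0)**2 + 5.0)

    BOUNDS = [(5.0, 45.0), (10.0, 60.0)]       # (primary, secondary) bevel angles, degrees

    def random_individual():
        return [random.uniform(lo, hi) for lo, hi in BOUNDS]

    def mutate(ind, sigma=2.0):
        return [min(max(g + random.gauss(0.0, sigma), lo), hi)
                for g, (lo, hi) in zip(ind, BOUNDS)]

    def crossover(a, b):
        return [random.choice(pair) for pair in zip(a, b)]

    def genetic_minimize(pop_size=30, generations=50):
        pop = [random_individual() for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=lambda ind: insertion_force(*ind))
            elite = pop[: pop_size // 3]                 # keep the best third
            pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                           for _ in range(pop_size - len(elite))]
        return min(pop, key=lambda ind: insertion_force(*ind))

    best = genetic_minimize()
    print("best angles:", [round(g, 1) for g in best],
          "surrogate force:", round(insertion_force(*best), 3))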

  15. Minimizing medical litigation, part 2.

    PubMed

    Harold, Tan Keng Boon

    2006-01-01

Provider-patient disputes are inevitable in the healthcare sector. Healthcare providers and regulators should recognize this and plan opportunities to enforce alternative dispute resolution (ADR) as early as possible in the care delivery process. Negotiation is often the main dispute resolution method used by local healthcare providers, failing which litigation would usually follow. The role of mediation in resolving malpractice disputes has been minimal. Healthcare providers, administrators, and regulators should therefore look toward a post-event communication-cum-mediation framework as the key national strategy to resolving malpractice disputes. PMID:16711089

  16. The minimal scenario of leptogenesis

    NASA Astrophysics Data System (ADS)

    Blanchet, Steve; Di Bari, Pasquale

    2012-12-01

    We review the main features and results of thermal leptogenesis within the type I seesaw mechanism, the minimal extension of the Standard Model explaining neutrino masses and mixing. After presenting the simplest approach, the vanilla scenario, we discuss various important developments of recent years, such as the inclusion of lepton and heavy neutrino flavour effects, a description beyond a hierarchical heavy neutrino mass spectrum and an improved kinetic description within the density matrix and the closed-time-path formalisms. We also discuss how leptogenesis can ultimately represent an important phenomenological tool to test the seesaw mechanism and the underlying model of new physics.

  17. About the ZOOM minimization package

    SciTech Connect

    Fischler, M.; Sachs, D.; /Fermilab

    2004-11-01

    A new object-oriented Minimization package is available for distribution in the same manner as CLHEP. This package, designed for use in HEP applications, has all the capabilities of Minuit, but is a re-write from scratch, adhering to modern C++ design principles. A primary goal of this package is extensibility in several directions, so that its capabilities can be kept fresh with as little maintenance effort as possible. This package is distinguished by the priority that was assigned to C++ design issues, and the focus on producing an extensible system that will resist becoming obsolete.

  18. Prepulse minimization in KALI-5000.

    PubMed

    Kumar, D Durga Praveen; Mitra, S; Senthil, K; Sharma, Vishnu K; Singh, S K; Roy, A; Sharma, Archana; Nagesh, K V; Chakravarthy, D P

    2009-07-01

    A pulse power system (1 MV, 50 kA, and 100 ns) based on Marx generator and Blumlein pulse forming line has been built for generating high power microwaves. The Blumlein configuration poses a prepulse problem and hence the diode gap had to be increased to match the diode impedance to the Blumlein impedance during the main pulse. A simple method to eliminate prepulse voltage using a vacuum sparkgap and a resistor is given. Another fundamental approach of increasing the inductance of Marx generator to minimize the prepulse voltage is also presented. Experimental results for both of these configurations are given. PMID:19655979

  19. Prepulse minimization in KALI-5000

    NASA Astrophysics Data System (ADS)

    Kumar, D. Durga Praveen; Mitra, S.; Senthil, K.; Sharma, Vishnu K.; Singh, S. K.; Roy, A.; Sharma, Archana; Nagesh, K. V.; Chakravarthy, D. P.

    2009-07-01

    A pulse power system (1 MV, 50 kA, and 100 ns) based on Marx generator and Blumlein pulse forming line has been built for generating high power microwaves. The Blumlein configuration poses a prepulse problem and hence the diode gap had to be increased to match the diode impedance to the Blumlein impedance during the main pulse. A simple method to eliminate prepulse voltage using a vacuum sparkgap and a resistor is given. Another fundamental approach of increasing the inductance of Marx generator to minimize the prepulse voltage is also presented. Experimental results for both of these configurations are given.

  20. Risk minimization through portfolio replication

    NASA Astrophysics Data System (ADS)

Ciliberti, S.; Mézard, M.

    2007-05-01

We use a replica approach to deal with portfolio optimization problems. A given risk measure is minimized using empirical estimates of asset values correlations. We study the phase transition which happens when the time series is too short with respect to the size of the portfolio. We also study the noise sensitivity of portfolio allocation when this transition is approached. We consider explicitly the cases where the absolute deviation and the conditional value-at-risk are chosen as a risk measure. We show how the replica method can study a wide range of risk measures, and deal with various types of time series correlations, including realistic ones with volatility clustering.
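
    A hedged numerical sketch of the simplest case mentioned above, minimizing the empirical absolute deviation of portfolio returns subject to the weights summing to one (this is the direct empirical optimization, not the replica calculation; the return data are synthetic, and a linear-programming reformulation would be the standard approach, SLSQP is used here only for brevity):

    # Hedged sketch: absolute-deviation portfolio minimization on synthetic data.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    T, N = 250, 5                               # time series length, portfolio size
    returns = rng.normal(0.0, 0.01, (T, N))     # synthetic asset returns

    def abs_deviation(w):
        port = returns @ w
        return np.mean(np.abs(port - port.mean()))

    res = minimize(abs_deviation, np.full(N, 1.0 / N),
                   constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
                   bounds=[(0.0, 1.0)] * N, method="SLSQP")
    print("weights:", np.round(res.x, 3), "risk:", round(res.fun, 5))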

  1. Diagnosis of minimal hepatic encephalopathy.

    PubMed

    Weissenborn, Karin

    2015-03-01

Minimal hepatic encephalopathy (mHE) has significant impact upon a liver patient's daily living and health related quality of life. Therefore a majority of clinicians agree that mHE should be diagnosed and treated. The optimal means for diagnosing mHE, however, is controversial. This paper describes the currently most frequently used methods, namely EEG, critical flicker frequency, the Continuous Reaction time Test, the Inhibitory Control Test, computerized test batteries such as the Cognitive Drug Research test battery, the psychometric hepatic encephalopathy score (PHES), and the Repeatable Battery for the Assessment of Neuropsychological Status (RBANS), together with their pros and cons. PMID:26041959

  2. Minimizing travel claims cost with minimal-spanning tree model

    NASA Astrophysics Data System (ADS)

    Jamalluddin, Mohd Helmi; Jaafar, Mohd Azrul; Amran, Mohd Iskandar; Ainul, Mohd Sharizal; Hamid, Aqmar; Mansor, Zafirah Mohd; Nopiah, Zulkifli Mohd

    2014-06-01

Travel demand necessitates substantial expenditure, as has been shown by the National Audit Department (NAD). Every year the auditing process is carried out throughout the country, involving official travel claims. This study focuses on the use of the Spanning Tree model to determine the shortest path to minimize the cost of the NAD's official travel claims. The objective is to study the possibility of running a network based in the Kluang District Health Office to eight Rural Clinics in Johor state, using Spanning Tree model applications to optimize travelling distances, and to make recommendations to the senior management of the Audit Department to analyze travelling details before an audit is conducted. The results of this study reveal savings of up to 47.4% of the original claims in terms of travel distance.
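
    As a hedged illustration of the model (the office and clinic names and the distances below are invented, not the study's data), Kruskal's algorithm builds the minimum spanning tree that connects an office to a set of clinics with the smallest total distance:

    # Hedged sketch: Kruskal's minimum spanning tree on a toy distance network.
    def kruskal_mst(nodes, edges):
        """edges: list of (distance, u, v); returns the spanning tree edges."""
        parent = {n: n for n in nodes}

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]   # path compression
                x = parent[x]
            return x

        mst = []
        for dist, u, v in sorted(edges):
            ru, rv = find(u), find(v)
            if ru != rv:                        # edge joins two separate components
                parent[ru] = rv
                mst.append((u, v, dist))
        return mst

    nodes = ["HQ", "Clinic1", "Clinic2", "Clinic3"]
    edges = [(12, "HQ", "Clinic1"), (18, "HQ", "Clinic2"), (30, "HQ", "Clinic3"),
             (10, "Clinic1", "Clinic2"), (16, "Clinic2", "Clinic3"), (25, "Clinic1", "Clinic3")]

    tree = kruskal_mst(nodes, edges)
    print(tree, "total distance:", sum(d for _, _, d in tree))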

  3. Acoustic methodology review

    NASA Technical Reports Server (NTRS)

    Schlegel, R. G.

    1982-01-01

    It is important for industry and NASA to assess the status of acoustic design technology for predicting and controlling helicopter external noise in order for a meaningful research program to be formulated which will address this problem. The prediction methodologies available to the designer and the acoustic engineer are three-fold. First is what has been described as a first principle analysis. This analysis approach attempts to remove any empiricism from the analysis process and deals with a theoretical mechanism approach to predicting the noise. The second approach attempts to combine first principle methodology (when available) with empirical data to formulate source predictors which can be combined to predict vehicle levels. The third is an empirical analysis, which attempts to generalize measured trends into a vehicle noise prediction method. This paper will briefly address each.

  4. Soft Systems Methodology

    NASA Astrophysics Data System (ADS)

    Checkland, Peter; Poulter, John

Soft systems methodology (SSM) is an approach for tackling problematical, messy situations of all kinds. It is an action-oriented process of inquiry into problematic situations in which users learn their way from finding out about the situation, to taking action to improve it. The learning emerges via an organised process in which the situation is explored using a set of models of purposeful action (each built to encapsulate a single worldview) as intellectual devices, or tools, to inform and structure discussion about a situation and how it might be improved. This paper, written by the original developer Peter Checkland and practitioner John Poulter, gives a clear and concise account of the approach that covers SSM's specific techniques, the learning cycle process of the methodology and the craft skills which practitioners develop. This concise but theoretically robust account nevertheless includes the fundamental concepts, techniques, and core tenets, described through a wide range of settings.

  5. Annual Waste Minimization Summary Report

    SciTech Connect

    Alfred J. Karns

    2007-01-01

    This report summarizes the waste minimization efforts undertaken by National Security Technologies, LLC (NSTec), for the U. S. Department of Energy (DOE) National Nuclear Security Administration Nevada Site Office (NNSA/NSO), during CY06. This report was developed in accordance with the requirements of the Nevada Test Site (NTS) Resource Conservation and Recovery Act (RCRA) Permit (No. NEV HW0021) and as clarified in a letter dated April 21, 1995, from Paul Liebendorfer of the Nevada Division of Environmental Protection to Donald Elle of the DOE, Nevada Operations Office. The NNSA/NSO Pollution Prevention (P2) Program establishes a process to reduce the volume and toxicity of waste generated by the NNSA/NSO and ensures that proposed methods of treatment, storage, and/or disposal of waste minimize potential threats to human health and the environment. The following information provides an overview of the P2 Program, major P2 accomplishments during the reporting year, a comparison of the current year waste generation to prior years, and a description of efforts undertaken during the year to reduce the volume and toxicity of waste generated by the NNSA/NSO.

  6. Less minimal supersymmetric standard model

    SciTech Connect

    de Gouvea, Andre; Friedland, Alexander; Murayama, Hitoshi

    1998-03-28

Most of the phenomenological studies of supersymmetry have been carried out using the so-called minimal supergravity scenario, where one assumes a universal scalar mass, gaugino mass, and trilinear coupling at M_GUT. Even though this is a useful simplifying assumption for phenomenological analyses, it is rather too restrictive to accommodate a large variety of phenomenological possibilities. It predicts, among other things, that the lightest supersymmetric particle (LSP) is an almost pure B-ino, and that the μ-parameter is larger than the masses of the SU(2)_L and U(1)_Y gauginos. We extend the minimal supergravity framework by introducing one extra parameter: the Fayet-Iliopoulos D-term for the hypercharge U(1), D_Y. Allowing for this extra parameter, we find a much more diverse phenomenology, where the LSP is the tau sneutrino, the stau, or a neutralino with a large higgsino content. We discuss the relevance of the different possibilities to collider signatures. The same type of extension can be done to models with the gauge mediation of supersymmetry breaking. We argue that it is not wise to impose cosmological constraints on the parameter space.

  7. Symmetry breaking for drag minimization

    NASA Astrophysics Data System (ADS)

    Roper, Marcus; Squires, Todd M.; Brenner, Michael P.

    2005-11-01

For locomotion at high Reynolds numbers drag minimization favors fore-aft asymmetric slender shapes with blunt noses and sharp trailing edges. On the other hand, in an inertialess fluid the drag experienced by a body is independent of whether it travels forward or backward through the fluid, so there is no advantage to having a single preferred swimming direction. In fact numerically determined minimum drag shapes are known to exhibit almost no fore-aft asymmetry even at moderate Re. We show that asymmetry persists, albeit extremely weakly, down to vanishingly small Re, scaling asymptotically as Re^3. The need to minimize drag to maximize speed for a given propulsive capacity gives one possible mechanism for the increasing asymmetry in the body plans seen in nature, as organisms increase in size and swimming speed from bacteria like E. coli up to pursuit predator fish such as tuna. If it is the dominant mechanism, then this signature scaling will be observed in the shapes of motile micro-organisms.

  8. Structural femtochemistry: experimental methodology.

    PubMed Central

    Williamson, J C; Zewail, A H

    1991-01-01

The experimental methodology for structural femtochemistry of reactions is considered. With the extension of femtosecond transition-state spectroscopy to the diffraction regime, it is possible to obtain in a general way the trajectories of chemical reactions (change of internuclear separations with time) on the femtosecond time scale. This method, considered here for simple alkali halide dissociation, promises many applications to more complex reactions and to conformational changes. Alignment on the time scale of the experiments is also discussed. PMID:11607189

  9. Next-to-minimal SOFTSUSY

    NASA Astrophysics Data System (ADS)

    Allanach, B. C.; Athron, P.; Tunstall, Lewis C.; Voigt, A.; Williams, A. G.

    2014-09-01

    We describe an extension to the SOFTSUSY program that provides for the calculation of the sparticle spectrum in the Next-to-Minimal Supersymmetric Standard Model (NMSSM), where a chiral superfield that is a singlet of the Standard Model gauge group is added to the Minimal Supersymmetric Standard Model (MSSM) fields. Often, a Z3 symmetry is imposed upon the model. SOFTSUSY can calculate the spectrum in this case as well as the case where general Z3 violating (denoted as =) terms are added to the soft supersymmetry breaking terms and the superpotential. The user provides a theoretical boundary condition for the couplings and mass terms of the singlet. Radiative electroweak symmetry breaking data along with electroweak and CKM matrix data are used as weak-scale boundary conditions. The renormalisation group equations are solved numerically between the weak scale and a high energy scale using a nested iterative algorithm. This paper serves as a manual to the NMSSM mode of the program, detailing the approximations and conventions used. Catalogue identifier: ADPM_v4_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADPM_v4_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 154886 No. of bytes in distributed program, including test data, etc.: 1870890 Distribution format: tar.gz Programming language: C++, fortran. Computer: Personal computer. Operating system: Tested on Linux 3.x. Word size: 64 bits Classification: 11.1, 11.6. Does the new version supersede the previous version?: Yes Catalogue identifier of previous version: ADPM_v3_0 Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 785 Nature of problem: Calculating supersymmetric particle spectrum and mixing parameters in the next-to-minimal supersymmetric standard model. The solution to the

  10. Update on designing and building minimal cells

    PubMed Central

    Jewett, Michael C.; Forster, Anthony C.

    2010-01-01

Minimal cells comprise only the genes and biomolecular machinery necessary for basic life. Synthesizing minimal and minimized cells will improve understanding of core biology, enhance development of biotechnology strains of bacteria, and enable evolutionary optimization of natural and unnatural biopolymers. Design and construction of minimal cells is proceeding in two different directions: “top-down” reduction of bacterial genomes in vivo and “bottom-up” integration of DNA/RNA/protein/membrane syntheses in vitro. Major progress in the last 5 years has occurred in synthetic genomics, minimization of the Escherichia coli genome, sequencing of minimal bacterial endosymbionts, identification of essential genes, and integration of biochemical systems. PMID:20638265

  11. Minimally invasive posterior hamstring harvest.

    PubMed

    Wilson, Trent J; Lubowitz, James H

    2013-01-01

    Autogenous hamstring harvesting for knee ligament reconstruction is a well-established standard. Minimally invasive posterior hamstring harvest is a simple, efficient, reproducible technique for harvest of the semitendinosus or gracilis tendon or both medial hamstring tendons. A 2- to 3-cm longitudinal incision from the popliteal crease proximally, in line with the semitendinosus tendon, is sufficient. The deep fascia is bluntly penetrated, and the tendon or tendons are identified. Adhesions are dissected. Then, an open tendon stripper is used to release the tendon or tendons proximally; a closed, sharp tendon stripper is used to release the tendon or tendons from the pes. Layered, absorbable skin closure is performed, and the skin is covered with a skin sealant, bolster dressing, and plastic adhesive bandage for 2 weeks. PMID:24266003

  12. Minimally Invasive Spigelian Hernia Repair

    PubMed Central

    Baucom, Catherine; Nguyen, Quan D.; Hidalgo, Marco

    2009-01-01

    Introduction: Spigelian hernia is an uncommon ventral hernia characterized by a defect in the linea semilunaris. Repair of spigelian hernia has traditionally been accomplished via an open transverse incision and primary repair. The purpose of this article is to present 2 case reports of incarcerated spigelian hernia that were successfully repaired laparoscopically using Gortex mesh and to present a review of the literature regarding laparoscopic repair of spigelian hernias. Methods: Retrospective chart review and Medline literature search. Results: Two patients underwent laparoscopic mesh repair of incarcerated spigelian hernias. Both were started on a regular diet on postoperative day 1 and discharged on postoperative days 2 and 3. One patient developed a seroma that resolved without intervention. There was complete resolution of preoperative symptoms at the 12-month follow-up. Conclusion: Minimally invasive repair of spigelian hernias is an alternative to the traditional open surgical technique. Further studies are needed to directly compare the open and the laparoscopic repair. PMID:19660230

  13. Minimally invasive radioguided parathyroidectomy (MIRP).

    PubMed

    Goldstein, R E; Martin, W H; Richards, K

    2003-06-01

    The technique of parathyroidectomy has traditionally involved a bilateral exploration of the neck with the intent of visualizing 4 parathyroid glands and resecting pathologically enlarged glands. Parathyroid scanning using technetium-99m sestamibi has evolved and can now localize 80% to 90% of parathyroid adenomas. The technique of minimally invasive radioguided parathyroidectomy (MIRP) is a surgical option for most patients with primary hyperparathyroidism and a positive preoperative parathyroid scan. The technique makes use of a hand-held gamma probe that is used intraoperatively to guide the dissection in a highly directed manner with the procedure often performed under local anesthesia. The technique results in excellent cure rates while allowing most patients to leave the hospital within a few hours after the completion of the procedure. Current data also suggest the procedure can decrease hospital charges by approximately 50%. This technique may significantly change the management of primary hyperparathyroidism. PMID:12955045

  14. [MINIMALLY INVASIVE AORTIC VALVE REPLACEMENT].

    PubMed

    Tabata, Minoru

    2016-03-01

Minimally invasive aortic valve replacement (MIAVR) is defined as aortic valve replacement avoiding full sternotomy. Common approaches include a partial sternotomy, a right thoracotomy, and a parasternal approach. MIAVR has been shown to have advantages over conventional AVR such as shorter length of stay, less blood transfusion, and better cosmesis. However, it is also known to have disadvantages such as longer cardiopulmonary bypass and aortic cross-clamp times and potential complications related to peripheral cannulation. Appropriate patient selection is very important. Since the procedure is more complex than conventional AVR, more intensive teamwork in the operating room is essential. Additionally, a team approach during postoperative management is critical to maximize the benefits of MIAVR. PMID:27295772

  15. Non-minimal Inflationary Attractors

    SciTech Connect

    Kallosh, Renata; Linde, Andrei E-mail: alinde@stanford.edu

    2013-10-01

Recently we identified a new class of (super)conformally invariant theories which allow inflation even if the scalar potential is very steep in terms of the original conformal variables. Observational predictions of a broad class of such theories are nearly model-independent. In this paper we consider generalized versions of these models where the inflaton has a non-minimal coupling to gravity with a negative parameter ξ different from its conformal value -1/6. We show that these models exhibit attractor behavior. With even a slight increase of |ξ| from |ξ| = 0, predictions of these models for n_s and r rapidly converge to their universal model-independent values corresponding to conformal coupling ξ = −1/6. These values of n_s and r practically coincide with the corresponding values in the limit ξ → −∞.
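
    For orientation, the universal attractor values referred to above are usually quoted, in terms of the number of e-folds N, as follows (this is the standard result for this class of cosmological attractors and is stated here as background, not taken verbatim from the abstract):

    \[
      n_s \simeq 1 - \frac{2}{N}, \qquad r \simeq \frac{12}{N^2}
    \]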

  16. Strategies to Minimize Antibiotic Resistance

    PubMed Central

    Lee, Chang-Ro; Cho, Ill Hwan; Jeong, Byeong Chul; Lee, Sang Hee

    2013-01-01

Antibiotic resistance can be reduced by using antibiotics prudently based on guidelines of antimicrobial stewardship programs (ASPs) and various data such as pharmacokinetic (PK) and pharmacodynamic (PD) properties of antibiotics, diagnostic testing, antimicrobial susceptibility testing (AST), clinical response, and effects on the microbiota, as well as by new antibiotic developments. The controlled use of antibiotics in food animals is another cornerstone among efforts to reduce antibiotic resistance. All major resistance-control strategies recommend education for patients, children (e.g., through schools and day care), the public, and relevant healthcare professionals (e.g., primary-care physicians, pharmacists, and medical students) regarding unique features of bacterial infections and antibiotics, prudent antibiotic prescribing as a positive construct, and personal hygiene (e.g., handwashing). The problem of antibiotic resistance can be minimized only by the concerted efforts of all members of society to ensure the continued efficacy of antibiotics. PMID:24036486

  17. Minimizing the pain on burnout

    SciTech Connect

    Billings, A.

    1985-03-01

    An investment in an oil and gas shelter warrants an additional investment to fund tax liability on burnout. A relatively liquid and low-risk investment is preferable so as to assure timely satisfaction of tax liability when burnout occurs. If an investor decides to allow the shelter to die a timely death, the investment funds could be used to fund annual tax liability. In situations where a leak develops, the fund will once again be invaluable. When a leak or burnout occurs, investors may be able to do no more than minimize their maximum losses. Relief of debt on most dispositions will be deemed receipt of cash, thus triggering gains. Ordinary income will result by operation of Code Sections 1245, 1250, and 1254. Bankruptcy or a charitable contribution will grant limited reprieve from tax losses; however, economic losses will still result.

  18. Minimal unitary (covariant) scattering theory

    SciTech Connect

    Lindesay, J.V.; Markevich, A.

    1983-06-01

In the minimal three particle equations developed by Lindesay the two body input amplitude was an on shell relativistic generalization of the non-relativistic scattering model characterized by a single mass parameter μ which in the two body (m + m) system looks like an s-channel bound state (μ < 2m) or virtual state (μ > 2m). Using this driving term in covariant Faddeev equations generates a rich covariant and unitary three particle dynamics. However, the simplest way of writing the relativistic generalization of the Faddeev equations can take the on shell Mandelstam parameter s = 4(q² + m²), in terms of which the two particle input is expressed, to negative values in the range of integration required by the dynamics. This problem was met in the original treatment by multiplying the two particle input amplitude by Θ(s). This paper provides what we hope to be a more direct way of meeting the problem.

  19. Minimally packed phases in holography

    NASA Astrophysics Data System (ADS)

    Donos, Aristomenis; Gauntlett, Jerome P.

    2016-03-01

    We numerically construct asymptotically AdS black brane solutions of D = 4 Einstein-Maxwell theory coupled to a pseudoscalar. The solutions are holographically dual to d = 3 CFTs at finite chemical potential and in a constant magnetic field, which spontaneously break translation invariance leading to the spontaneous formation of abelian and momentum magnetisation currents flowing around the plaquettes of a periodic Bravais lattice. We analyse the three-dimensional moduli space of lattice solutions, which are generically oblique, and show, for a specific value of the magnetic field, that the free energy is minimised by the triangular lattice, associated with minimal packing of circles in the plane. We show that the average stress tensor for the thermodynamically preferred phase is that of a perfect fluid and that this result applies more generally to spontaneously generated periodic phases. The triangular structure persists at low temperatures indicating the existence of novel crystalline ground states.

  20. The minimal composite Higgs model

    NASA Astrophysics Data System (ADS)

    Agashe, Kaustubh; Contino, Roberto; Pomarol, Alex

    2005-07-01

    We study the idea of a composite Higgs in the framework of a five-dimensional AdS theory. We present the minimal model of the Higgs as a pseudo-Goldstone boson in which electroweak symmetry is broken dynamically via top loop effects, all flavour problems are solved, and contributions to electroweak precision observables are below experimental bounds. Since the 5D theory is weakly coupled, we are able to fully determine the Higgs potential and other physical quantities. The lightest resonances are expected to have a mass around 2 TeV and should be discovered at the LHC. The top sector is mostly composite and deviations from Standard Model couplings are expected.

  1. Natural supersymmetric minimal dark matter

    NASA Astrophysics Data System (ADS)

    Fabbrichesi, Marco; Urbano, Alfredo

    2016-03-01

We show how the Higgs boson mass is protected from the potentially large corrections due to the introduction of minimal dark matter if the new physics sector is made supersymmetric. The fermionic dark matter candidate (a 5-plet of SU(2)_L) is accompanied by a scalar state. The weak gauge sector is made supersymmetric, and the Higgs boson is embedded in a supersymmetric multiplet. The remaining standard model states are nonsupersymmetric. Nonvanishing corrections to the Higgs boson mass only appear at three-loop level, and the model is natural for dark matter masses up to 15 TeV, a value larger than the one required by the cosmological relic density. The construction presented stands as an example of a general approach to naturalness that solves the little hierarchy problem which arises when new physics is added beyond the standard model at an energy scale around 10 TeV.

  2. Chemical basis for minimal cognition.

    PubMed

    Hanczyc, Martin M; Ikegami, Takashi

    2010-01-01

    We have developed a simple chemical system capable of self-movement in order to study the physicochemical origins of movement. We propose how this system may be useful in the study of minimal perception and cognition. The system consists simply of an oil droplet in an aqueous environment. A chemical reaction within the oil droplet induces an instability, the symmetry of the oil droplet breaks, and the droplet begins to move through the aqueous phase. The complement of physical phenomena that is then generated indicates the presence of feedback cycles that, as will be argued, form the basis for self-regulation, homeostasis, and perhaps an extended form of autopoiesis. We discuss the result that simple chemical systems are capable of sensory-motor coupling and possess a homeodynamic state from which cognitive processes may emerge. PMID:20586578

  3. A minimal little Higgs model

    NASA Astrophysics Data System (ADS)

    Barceló, Roberto; Masip, Manuel

    2008-11-01

    We discuss a little Higgs scenario that introduces below the TeV scale just the two minimal ingredients of these models, a vectorlike T quark and a singlet component (implying anomalous couplings) in the Higgs field, together with a pseudoscalar singlet η. In the model, which is a variation of Schmaltz’s simplest little Higgs model, all the extra vector bosons are much heavier than the T quark. In the Yukawa sector the global symmetry is approximate, implying a single large coupling per flavor, whereas in the scalar sector it is only broken at the loop level. We obtain the one-loop effective potential and show that it provides acceptable masses for the Higgs h and for the singlet η with no need for an extra μ term. We find that mη can be larger than mh/2, which would forbid the (otherwise dominant) decay mode h→ηη.

  4. Waste minimization in analytical methods

    SciTech Connect

Green, D.W.; Smith, L.L.; Crain, J.S.; Boparai, A.S.; Kiely, J.T.; Yaeger, J.S.; Schilling, J.B.

    1995-05-01

The US Department of Energy (DOE) will require a large number of waste characterizations over a multi-year period to accomplish the Department's goals in environmental restoration and waste management. Estimates vary, but two million analyses annually are expected. The waste generated by the analytical procedures used for characterizations is a significant source of new DOE waste. Success in reducing the volume of secondary waste and the costs of handling this waste would significantly decrease the overall cost of this DOE program. Selection of appropriate analytical methods depends on the intended use of the resultant data. It is not always necessary to use a high-powered analytical method, typically at higher cost, to obtain data needed to make decisions about waste management. Indeed, for samples taken from some heterogeneous systems, the meaning of high accuracy becomes clouded if the data generated are intended to measure a property of this system. Among the factors to be considered in selecting the analytical method are the lower limit of detection, accuracy, turnaround time, cost, reproducibility (precision), interferences, and simplicity. Occasionally, there must be tradeoffs among these factors to achieve the multiple goals of a characterization program. The purpose of the work described here is to add waste minimization to the list of characteristics to be considered. In this paper the authors present results of modifying analytical methods for waste characterization to reduce both the cost of analysis and volume of secondary wastes. Although tradeoffs may be required to minimize waste while still generating data of acceptable quality for the decision-making process, they have data demonstrating that wastes can be reduced in some cases without sacrificing accuracy or precision.

  5. Minimizing communication cost among distributed controllers in software defined networks

    NASA Astrophysics Data System (ADS)

    Arlimatti, Shivaleela; Elbreiki, Walid; Hassan, Suhaidi; Habbal, Adib; Elshaikh, Mohamed

    2016-08-01

Software Defined Networking (SDN) is a new paradigm that increases the flexibility of today's networks by making them programmable. The fundamental idea behind this new architecture is to simplify network complexity by decoupling the control plane and the data plane of the network devices, and by making the control plane centralized. Recently, controllers have been distributed to solve the problem of a single point of failure and to increase scalability and flexibility during workload distribution. Even though controllers are flexible and scalable enough to accommodate a larger number of network switches, the intercommunication cost between distributed controllers is still a challenging issue in the Software Defined Network environment. This paper aims to fill the gap by proposing a new mechanism that minimizes intercommunication cost with a graph partitioning algorithm, an NP-hard problem. The methodology proposed in this paper is the swapping of network elements between controller domains to minimize communication cost by calculating the communication gain. The swapping of elements minimizes inter- and intra-domain communication cost. We validate our work with the OMNeT++ simulation environment tool. Simulation results show that the proposed mechanism minimizes the inter-domain communication cost among controllers compared to traditional distributed controllers.
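
    A hedged sketch of the gain-driven swapping idea (in the spirit of Kernighan-Lin partition refinement; the traffic matrix, domain assignment, and greedy loop below are illustrative assumptions, not the paper's exact mechanism):

    # Hedged sketch: swap elements between two controller domains when the swap gain is positive.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 8
    traffic = rng.integers(1, 10, (n, n))
    traffic = (traffic + traffic.T) // 2            # symmetric switch-to-switch traffic
    np.fill_diagonal(traffic, 0)
    domain = np.array([0, 0, 0, 0, 1, 1, 1, 1])     # initial assignment to two controllers

    def inter_domain_cost(assign):
        """Total traffic crossing the controller boundary (the cost to minimize)."""
        mask = assign[:, None] != assign[None, :]
        return int(traffic[mask].sum()) // 2

    def swap_gain(i, j, assign):
        """Cost reduction obtained by exchanging elements i and j between domains."""
        trial = assign.copy()
        trial[i], trial[j] = trial[j], trial[i]
        return inter_domain_cost(assign) - inter_domain_cost(trial)

    improved = True
    while improved:                                  # greedy refinement until no positive gain remains
        improved = False
        for i in range(n):
            for j in range(n):
                if domain[i] != domain[j] and swap_gain(i, j, domain) > 0:
                    domain[i], domain[j] = domain[j], domain[i]
                    improved = True
    print("assignment:", domain, "inter-domain cost:", inter_domain_cost(domain))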

  6. Minimizing Variation in Outdoor CPV Power Ratings: Preprint

    SciTech Connect

    Muller, M.; Marion, B.; Rodriguez, J.; Kurtz, S.

    2011-07-01

The CPV community has agreed to have both indoor and outdoor power ratings at the module level. The indoor rating provides a repeatable measure of module performance as it leaves the factory line, while the outdoor rating provides a measure of true performance under real-world conditions. The challenge with an outdoor rating is that the spectrum, temperature, wind speed, etc. are constantly in flux, and therefore the resulting power rating varies from day to day and month to month. This work examines different methodologies for determining the outdoor power rating with the goal of minimizing variation even if data are collected under changing meteorological conditions.
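
    One common way to reduce this weather-driven scatter (a hedged sketch, not necessarily one of the specific methodologies compared in the paper) is to regress measured power on concurrent meteorological variables and then evaluate the fitted model at fixed reference conditions; the functional form, reference values, and synthetic data below are assumptions for illustration.

    # Hedged sketch: regression-based outdoor power rating at reference conditions.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 500
    dni   = rng.uniform(700, 1000, n)        # direct normal irradiance, W/m^2
    t_air = rng.uniform(10, 40, n)           # ambient temperature, deg C
    wind  = rng.uniform(0, 8, n)             # wind speed, m/s
    # Synthetic stand-in for measured module power with scatter.
    power = 0.25 * dni - 1.5 * (t_air - 20.0) + 3.0 * wind + rng.normal(0, 5, n)

    # Linear model: P = b0 + b1*DNI + b2*Tair + b3*wind
    X = np.column_stack([np.ones(n), dni, t_air, wind])
    coef, *_ = np.linalg.lstsq(X, power, rcond=None)

    reference = np.array([1.0, 900.0, 20.0, 2.0])   # assumed rating conditions
    print("outdoor power rating estimate:", round(float(coef @ reference), 1), "W")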

  7. Minimal length uncertainty and accelerating universe

    NASA Astrophysics Data System (ADS)

    Farmany, A.; Mortazavi, S. S.

    2016-06-01

In this paper, minimal length uncertainty is used as a constraint to solve the Friedmann equation. It is shown that, based on the minimal length uncertainty principle, the Hubble scale is decreasing, which corresponds to an accelerating universe.

  8. Architectural Methodology Report

    NASA Technical Reports Server (NTRS)

    Dhas, Chris

    2000-01-01

The establishment of conventions between two communicating entities in the end systems is essential for communications. Examples of the kind of decisions that need to be made in establishing a protocol convention include the nature of the data representation, the format and the speed of the data representation over the communications path, and the sequence of control messages (if any) which are sent. One of the main functions of a protocol is to establish a standard path between the communicating entities. This is necessary to create a virtual communications medium with certain desirable characteristics. In essence, it is the function of the protocol to transform the characteristics of the physical communications environment into a more useful virtual communications model. The final function of a protocol is to establish standard data elements for communications over the path; that is, the protocol serves to create a virtual data element for exchange. Other systems may be constructed in which the transferred element is a program or a job. Finally, there are special purpose applications in which the element to be transferred may be a complex structure such as all or part of a graphic display. NASA's Glenn Research Center (GRC) defines and develops advanced technology for high priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to describe the methodologies used in developing a protocol architecture for an in-space Internet node. The node would support NASA's four mission areas: Earth Science; Space Science; Human Exploration and Development of Space (HEDS); Aerospace Technology. This report presents the methodology for developing the protocol architecture. The methodology addresses the architecture for a computer communications environment. It does not address an analog voice architecture.

  9. Injector element characterization methodology

    NASA Technical Reports Server (NTRS)

    Cox, George B., Jr.

    1988-01-01

    Characterization of liquid rocket engine injector elements is an important part of the development process for rocket engine combustion devices. Modern nonintrusive instrumentation for flow velocity and spray droplet size measurement, and automated, computer-controlled test facilities allow rapid, low-cost evaluation of injector element performance and behavior. Application of these methods in rocket engine development, paralleling their use in gas turbine engine development, will reduce rocket engine development cost and risk. The Alternate Turbopump (ATP) Hot Gas Systems (HGS) preburner injector elements were characterized using such methods, and the methodology and some of the results obtained will be shown.

  10. Mini-Med School Planning Guide

    ERIC Educational Resources Information Center

    National Institutes of Health, Office of Science Education, 2008

    2008-01-01

    Mini-Med Schools are public education programs now offered by more than 70 medical schools, universities, research institutions, and hospitals across the nation. There are even Mini-Med Schools in Ireland, Malta, and Canada! The program is typically a lecture series that meets once a week and provides "mini-med students" information on some of the…

  11. Closed locally minimal nets on tetrahedra

    SciTech Connect

    Strelkova, Nataliya P

    2011-01-31

    Closed locally minimal networks are in a sense a generalization of closed geodesics. A complete classification is known of closed locally minimal networks on regular (and generally any equihedral) tetrahedra. In the present paper certain necessary and certain sufficient conditions are given for at least one closed locally minimal network to exist on a given non-equihedral tetrahedron. Bibliography: 6 titles.

  12. Methodology for Developing a Crop Yield Stability Map for a Field

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This abstract will summarize the methodology used to develop a yield stability map for a field. We proposed that there exist yield stability patterns for commercial field crop production which growers can use to optimize crop production while minimizing inputs. The methodology uses multiple years o...

  13. Relative Hazard Calculation Methodology

    SciTech Connect

    DL Strenge; MK White; RD Stenner; WB Andrews

    1999-09-07

    The methodology presented in this document was developed to provide a means of calculating the RH ratios to use in developing useful graphic illustrations. The RH equation, as presented in this methodology, is primarily a collection of key factors relevant to understanding the hazards and risks associated with projected risk management activities. The RH equation has the potential for much broader application than generating risk profiles. For example, it can be used to compare one risk management activity with another, instead of just comparing it to a fixed baseline as was done for the risk profiles. If the appropriate source term data are available, it could be used in its non-ratio form to estimate absolute values of the associated hazards. These estimated values of hazard could then be examined to help understand which risk management activities are addressing the higher hazard conditions at a site. Graphics could be generated from these absolute hazard values to compare high-hazard conditions. If the RH equation is used in this manner, care must be taken to specifically define and qualify the estimated absolute hazard values (e.g., identify which factors were considered and which ones tended to drive the hazard estimation).

  14. Regional Expansion of Minimally Invasive Surgery for Hysterectomy: Implementation and Methodology in a Large Multispecialty Group

    PubMed Central

    Andryjowicz, Esteban; Wray, Teresa

    2011-01-01

    Introduction: Approximately 600,000 hysterectomies are performed in the US each year, making hysterectomy the second most common major operation performed in women. Several methods can be used to perform this procedure. In 2009, a Cochrane Review concluded “that vaginal hysterectomy should be performed in preference to abdominal hysterectomy, where possible. Where vaginal hysterectomy is not possible, a laparoscopic approach may avoid the need for an abdominal hysterectomy. Risks and benefits of different approaches may however be influenced by the surgeon's experience. More research is needed, particularly to examine the long-term effects of the different types of surgery.” This article reviews the steps that a large multispecialty group used to teach non-open hysterectomy methods to improve the quality of care for their patients and to decrease the number of inpatient procedures and therefore costs. The percentages of each type of hysterectomy performed yearly between 2005 and 2010 were calculated, as well as the length of stay (LOS) for each method. Methods: A structured educational intervention with both didactic and hands-on exercises was created and rolled out to 12 medical centers. All patients undergoing hysterectomy for benign conditions through the Southern California Permanente Medical Group (a large multispecialty group that provides medical care to Kaiser Permanente patients in Southern California) between 2005 and 2010 were included. This amounted to 26,055 hysterectomies for benign conditions being performed by more than 350 obstetrician/gynecologists (Ob/Gyns). Results: More than 300 Ob/Gyns took the course across 12 medical centers. On the basis of hospital discharge data, the total number of hysterectomies, types of hysterectomies, and LOS for each type were identified for each year. Between 2005 and 2010, the rate of non-open hysterectomies has increased 120% (from 38% to 78%) and the average LOS has decreased 31%. PMID:22319415

  15. Analysis of drug combinations: current methodological landscape

    PubMed Central

    Foucquier, Julie; Guedj, Mickael

    2015-01-01

    Combination therapies offer the prospect of better efficacy, decreased toxicity, and reduced development of drug resistance; owing to these advantages, they have become a standard for the treatment of several diseases and continue to represent a promising approach in indications of unmet medical need. In this context, studying the effects of a combination of drugs in order to provide evidence of a significant superiority compared to the single agents is of particular interest. Research in this field has resulted in a large number of papers and revealed several issues. Here, we propose an overview of the current methodological landscape concerning the study of combination effects. First, we aim to provide the minimal set of mathematical and pharmacological concepts necessary to understand the most commonly used approaches, divided into effect-based approaches and dose–effect-based approaches, and introduced in light of their respective practical advantages and limitations. Then, we discuss six main common methodological issues that scientists have to face at each step of the development of new combination therapies. In particular, in the absence of a reference methodology suitable for all biomedical situations, the analysis of drug combinations should benefit from a collective, appropriate, and rigorous application of the concepts and methods reviewed here. PMID:26171228

  16. Differentially Private Empirical Risk Minimization

    PubMed Central

    Chaudhuri, Kamalika; Monteleoni, Claire; Sarwate, Anand D.

    2011-01-01

    Privacy-preserving machine learning algorithms are crucial for the increasingly common setting in which personal data, such as medical or financial records, are analyzed. We provide general techniques to produce privacy-preserving approximations of classifiers learned via (regularized) empirical risk minimization (ERM). These algorithms are private under the ε-differential privacy definition due to Dwork et al. (2006). First we apply the output perturbation ideas of Dwork et al. (2006), to ERM classification. Then we propose a new method, objective perturbation, for privacy-preserving machine learning algorithm design. This method entails perturbing the objective function before optimizing over classifiers. If the loss and regularizer satisfy certain convexity and differentiability criteria, we prove theoretical results showing that our algorithms preserve privacy, and provide generalization bounds for linear and nonlinear kernels. We further present a privacy-preserving technique for tuning the parameters in general machine learning algorithms, thereby providing end-to-end privacy guarantees for the training process. We apply these results to produce privacy-preserving analogues of regularized logistic regression and support vector machines. We obtain encouraging results from evaluating their performance on real demographic and benchmark data sets. Our results show that both theoretically and empirically, objective perturbation is superior to the previous state-of-the-art, output perturbation, in managing the inherent tradeoff between privacy and learning performance. PMID:21892342
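
    As a concrete illustration of the simpler of the two mechanisms described above, the sketch below trains an L2-regularized logistic regression by gradient descent and then perturbs the learned weights with noise whose norm is drawn from a Gamma distribution, following the standard 2/(nλε) sensitivity bound for 1-Lipschitz losses on unit-norm features. It is a toy rendering of the output-perturbation idea, not the authors' released code.

    ```python
    import numpy as np

    def train_logreg(X, y, lam, iters=2000, lr=0.1):
        """Gradient descent on L2-regularized logistic loss.
        y must be in {-1, +1}; rows of X are assumed scaled to norm <= 1."""
        n, d = X.shape
        w = np.zeros(d)
        for _ in range(iters):
            margins = y * (X @ w)
            grad = -(X.T @ (y / (1.0 + np.exp(margins)))) / n + lam * w
            w -= lr * grad
        return w

    def output_perturbation(w, n, d, lam, eps, rng):
        """Add noise with density proportional to exp(-(n*lam*eps/2)*||b||):
        uniform direction, Gamma(d, 2/(n*lam*eps)) norm."""
        direction = rng.normal(size=d)
        direction /= np.linalg.norm(direction)
        norm = rng.gamma(shape=d, scale=2.0 / (n * lam * eps))
        return w + norm * direction

    # toy usage on synthetic data scaled into the unit ball
    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 5))
    X /= np.maximum(1.0, np.linalg.norm(X, axis=1, keepdims=True))
    y = np.sign(X @ np.array([1.0, -1.0, 0.5, 0.0, 0.2]) + 0.1 * rng.normal(size=500))
    w_priv = output_perturbation(train_logreg(X, y, lam=0.01), n=500, d=5, lam=0.01, eps=1.0, rng=rng)
    ```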

  17. Against Explanatory Minimalism in Psychiatry

    PubMed Central

    Thornton, Tim

    2015-01-01

    The idea that psychiatry contains, in principle, a series of levels of explanation has been criticized not only as empirically false but also, by Campbell, as unintelligible because it presupposes a discredited pre-Humean view of causation. Campbell’s criticism is based on an interventionist-inspired denial that mechanisms and rational connections underpin physical and mental causation, respectively, and hence underpin levels of explanation. These claims echo some superficially similar remarks in Wittgenstein’s Zettel. But attention to the context of Wittgenstein’s remarks suggests a reason to reject explanatory minimalism in psychiatry and reinstate a Wittgensteinian notion of levels of explanation. Only in a context broader than the one provided by interventionism is the ascription of propositional attitudes, even in the puzzling case of delusions, justified. Such a view, informed by Wittgenstein, can reconcile the idea that the ascription of mental phenomena presupposes a particular level of explanation with the rejection of an a priori claim about its connection to a neurological level of explanation. PMID:26696908

  18. Against Explanatory Minimalism in Psychiatry.

    PubMed

    Thornton, Tim

    2015-01-01

    The idea that psychiatry contains, in principle, a series of levels of explanation has been criticized not only as empirically false but also, by Campbell, as unintelligible because it presupposes a discredited pre-Humean view of causation. Campbell's criticism is based on an interventionist-inspired denial that mechanisms and rational connections underpin physical and mental causation, respectively, and hence underpin levels of explanation. These claims echo some superficially similar remarks in Wittgenstein's Zettel. But attention to the context of Wittgenstein's remarks suggests a reason to reject explanatory minimalism in psychiatry and reinstate a Wittgensteinian notion of levels of explanation. Only in a context broader than the one provided by interventionism is the ascription of propositional attitudes, even in the puzzling case of delusions, justified. Such a view, informed by Wittgenstein, can reconcile the idea that the ascription of mental phenomena presupposes a particular level of explanation with the rejection of an a priori claim about its connection to a neurological level of explanation. PMID:26696908

  19. Minimalism through intraoperative functional mapping.

    PubMed

    Berger, M S

    1996-01-01

    Intraoperative stimulation mapping may be used to avoid unnecessary risk to functional regions subserving language and sensori-motor pathways. Based on the data presented here, language localization is variable in the entire population, with certainty existing only for the inferior frontal region responsible for motor speech. Anatomical landmarks such as the anterior temporal tip for temporal lobe language sites and the posterior aspect of the lateral sphenoid wing for the frontal lobe language zones are unreliable in avoiding postoperative aphasias. Thus, individual mapping to identify essential language sites has the greatest likelihood of avoiding permanent deficits in naming, reading, and motor speech. In a similar approach, motor and sensory pathways from the cortex and underlying white matter may be reliably stimulated and mapped in both awake and asleep patients. Although these techniques require additional operative time and nominally priced equipment, the result is often gratifying, as postoperative morbidity has been greatly reduced in the process of incorporating these surgical strategies. The patient's quality of life is improved in terms of seizure control, with or without antiepileptic drugs. This avoids having to perform a second costly operative procedure, which is routinely done when extraoperative stimulation and recording are performed via subdural grids. In addition, an aggressive tumor resection at the initial operation lengthens the time to tumor recurrence and often obviates the need for a subsequent reoperation. Thus, intraoperative functional mapping may be best alluded to as a surgical technique that results in "minimalism in the long term". PMID:9247814

  20. Minimally invasive medial hip approach.

    PubMed

    Chiron, P; Murgier, J; Cavaignac, E; Pailhé, R; Reina, N

    2014-10-01

    The medial approach to the hip via the adductors, as described by Ludloff or Ferguson, provides restricted visualization and incurs a risk of neurovascular lesion. We describe a minimally invasive medial hip approach providing broader exposure of extra- and intra-articular elements in a space free of neurovascular structures. With the lower limb in a "frog-leg" position, the skin incision follows the adductor longus for 6 cm and then the aponeurosis is incised. A slide plane between all the adductors and the aponeurosis is easily released by blunt dissection, with no interposed neurovascular elements. This gives access to the lesser trochanter, psoas tendon and inferior sides of the femoral neck and head, anterior wall of the acetabulum and labrum. We report a series of 56 cases, with no major complications: this approach allows treatment of iliopsoas muscle lesions and resection or filling of benign tumors of the cervical region and enables intra-articular surgery (arthrolysis, resection of osteophytes or foreign bodies, labral suture). PMID:25164350

  1. On eco-efficient technologies to minimize industrial water consumption

    NASA Astrophysics Data System (ADS)

    Amiri, Mohammad C.; Mohammadifard, Hossein; Ghaffari, Ghasem

    2016-07-01

    Purpose - Water scarcity will place further stress on available water systems and decrease water security in many areas. Innovative methods to minimize industrial water usage and waste production are therefore of paramount importance for extending fresh water resources, which are the main life-support systems in many arid regions of the world. This paper demonstrates that many industries have good opportunities to save water and decrease waste water in the softening process by substituting eco-friendly methods for traditional ones. The patented puffing method is an eco-efficient and viable technology for water saving and waste reduction in the lime softening process. Design/methodology/approach - The lime softening process (LSP) is very sensitive to chemical reactions. Optimal monitoring not only minimizes the sludge that must be disposed of but also reduces the operating costs of water conditioning. The weakness of the current (regular) control of LSP based on chemical analysis has been demonstrated experimentally and compared with the eco-efficient puffing method. Findings - This paper demonstrates that many industries have a good opportunity to save water and decrease waste water in the softening process by substituting the patented, eco-efficient puffing method for the traditional one. Originality/value - Details of the innovative work required to minimize industrial water usage and waste production are outlined in this paper. Employing the novel puffing method to monitor the lime softening process saves a considerable amount of water while reducing chemical sludge.

  2. Cancer Cytogenetics: Methodology Revisited

    PubMed Central

    2014-01-01

    The Philadelphia chromosome was the first genetic abnormality discovered in cancer (in 1960), and it was found to be consistently associated with CML. The description of the Philadelphia chromosome ushered in a new era in the field of cancer cytogenetics. Accumulating genetic data have been shown to be intimately associated with the diagnosis and prognosis of neoplasms; thus, karyotyping is now considered a mandatory investigation for all newly diagnosed leukemias. The development of FISH in the 1980s overcame many of the drawbacks of assessing the genetic alterations in cancer cells by karyotyping. Karyotyping of cancer cells remains the gold standard since it provides a global analysis of the abnormalities in the entire genome of a single cell. However, subsequent methodological advances in molecular cytogenetics based on the principle of FISH that were initiated in the early 1990s have greatly enhanced the efficiency and accuracy of karyotype analysis by marrying conventional cytogenetics with molecular technologies. In this review, the development, current utilization, and technical pitfalls of both the conventional and molecular cytogenetics approaches used for cancer diagnosis over the past five decades will be discussed. PMID:25368816

  3. Methodological Problems of Nanotechnoscience

    NASA Astrophysics Data System (ADS)

    Gorokhov, V. G.

    Recently, we have reported on the definitions of nanotechnology as a new type of NanoTechnoScience and on nanotheory as a cluster of different natural and engineering theories. Nanotechnology is not only a new type of scientific-engineering discipline, but it also evolves in a “nonclassical” way. Nanoontology, or the nano scientific world view, serves as a methodological orientation for choosing the theoretical means and methods used to solve scientific and engineering problems. This allows one to change from one explanation and scientific world view to another without any problems. Thus, nanotechnology is both a field of scientific knowledge and a sphere of engineering activity; in other words, NanoTechnoScience is similar to systems engineering, understood as the analysis and design of large-scale, complex man/machine systems, but applied to micro- and nanosystems. Nanosystems engineering, like macrosystems engineering, includes not only systems design but also complex research. The design orientation changes the priorities of this complex research and the relation to knowledge: not only “the knowledge about something”, but also knowledge as a means of activity, since control and restructuring of matter at the nanoscale is, from the beginning, a necessary element of nanoscience.

  4. Heart bypass surgery - minimally invasive - discharge

    MedlinePlus

  5. Prioritization Methodology for Chemical Replacement

    NASA Technical Reports Server (NTRS)

    Cruit, W.; Schutzenhofer, S.; Goldberg, B.; Everhart, K.

    1993-01-01

    This project serves to define an appropriate methodology for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weigh the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results are being implemented as a guideline for consideration for current NASA propulsion systems.

  6. Dosimetric methodology of the ICRP

    SciTech Connect

    Eckerman, K.F.

    1994-12-31

    Establishment of guidance for the protection of workers and members of the public from radiation exposures necessitates estimation of the radiation dose to tissues of the body at risk. The dosimetric methodology formulated by the International Commission on Radiological Protection (ICRP) is intended to be responsive to this need. While developed for radiation protection, elements of the methodology are often applied in addressing other radiation issues; e.g., risk assessment. This chapter provides an overview of the methodology, discusses its recent extension to age-dependent considerations, and illustrates specific aspects of the methodology through a number of numerical examples.

  7. Development methodology for scientific software

    SciTech Connect

    Cort, G.; Goldstone, J.A.; Nelson, R.O.; Poore, R.V.; Miller, L.; Barrus, D.M.

    1985-01-01

    We present the details of a software development methodology that addresses all phases of the software life cycle, yet is well suited for application by small projects with limited resources. The methodology has been developed at the Los Alamos Weapons Neutron Research (WNR) Facility and was utilized during the recent development of the WNR Data Acquisition Command Language. The methodology emphasizes the development and maintenance of comprehensive documentation for all software components. The impact of the methodology upon software quality and programmer productivity is assessed.

  8. Status of sonic boom methodology and understanding

    NASA Technical Reports Server (NTRS)

    Darden, Christine M.; Powell, Clemans A.; Hayes, Wallace D.; George, Albert R.; Pierce, Allan D.

    1989-01-01

    In January 1988, approximately 60 representatives of industry, academia, government, and the military gathered at NASA-Langley for a 2 day workshop on the state-of-the-art of sonic boom physics, methodology, and understanding. The purpose of the workshop was to assess the sonic boom area, to determine areas where additional sonic boom research is needed, and to establish some strategies and priorities in this sonic boom research. Attendees included many internationally recognized sonic boom experts who had been very active in the Supersonic Transport (SST) and Supersonic Cruise Aircraft Research Programs of the 60's and 70's. Summaries of the assessed state-of-the-art and the research needs in theory, minimization, atmospheric effects during propagation, and human response are given.

  9. Locus minimization in breed prediction using artificial neural network approach.

    PubMed

    Iquebal, M A; Ansari, M S; Sarika; Dixit, S P; Verma, N K; Aggarwal, R A K; Jayakumar, S; Rai, A; Kumar, D

    2014-12-01

    Molecular markers, viz. microsatellites and single nucleotide polymorphisms, have revolutionized breed identification through the use of small samples of biological tissue or germplasm, such as blood, carcass samples, embryos, ova and semen, that show no evident phenotype. Classical tools of molecular data analysis for breed identification have limitations, such as the unavailability of reference breed data (increasing the cost of collection each time), compromised computational accuracy and the complexity of the methodology used. We report here the successful use of an artificial neural network (ANN), running behind a web server, to decrease the cost of genotyping through locus minimization. The webserver is freely accessible (http://nabg.iasri.res.in/bisgoat) to the research community. We demonstrate that the machine learning (ANN) approach for breed identification offers multifold advantages, such as locus minimization, leading to a drastic reduction in cost, and web availability of reference breed data, alleviating the need for repeated genotyping each time one investigates the identity of an unknown breed. To develop this ANN-based web implementation, we used 51,850 samples of allelic data from microsatellite-marker-based DNA fingerprinting on 25 loci covering 22 registered goat breeds of India for training. By minimizing the panel to as few as nine loci with a multilayer perceptron model, we achieved 96.63% training accuracy. This server can be an indispensable tool for identification of existing breeds and new synthetic commercial breeds, leading to protection of intellectual property in case of sovereignty and bio-piracy disputes. This server can be widely used as a model for cost reduction by locus minimization for various other flora and fauna in terms of variety, breed and/or line identification, especially in conservation and improvement programs. PMID:25183434
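
    The abstract describes training a multilayer perceptron on microsatellite genotypes and then shrinking the marker panel. The sketch below mimics that workflow with scikit-learn, greedily dropping the locus whose removal hurts cross-validated accuracy least until accuracy would fall below a floor. The data encoding, the accuracy floor, and the backward-elimination loop are illustrative assumptions, not the published pipeline.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import cross_val_score

    def cv_accuracy(X, y, loci):
        """Mean 5-fold accuracy of an MLP using only the selected loci (columns).
        X is assumed to be a numeric encoding of allele calls (e.g., one-hot),
        y the breed labels."""
        clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
        return cross_val_score(clf, X[:, loci], y, cv=5).mean()

    def minimize_loci(X, y, min_accuracy=0.95):
        """Greedy backward elimination: drop loci while accuracy stays acceptable."""
        loci = list(range(X.shape[1]))
        while len(loci) > 1:
            scored = [(cv_accuracy(X, y, [l for l in loci if l != drop]), drop)
                      for drop in loci]
            best_score, best_drop = max(scored)
            if best_score < min_accuracy:
                break
            loci.remove(best_drop)
        return loci
    ```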

  10. WASTE MINIMIZATION ASSESSMENT FOR A DAIRY

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has funded a pilot project to assist small- and medium-size manufacturers who want to minimize their generation of waste but who lack the expertise to do so. n an effort to assist these manufacturers, Waste Minimization Assessment Ce...

  11. Is goal ascription possible in minimal mindreading?

    PubMed

    Butterfill, Stephen A; Apperly, Ian A

    2016-03-01

    In this response to the commentary by Michael and Christensen, we first explain how minimal mindreading is compatible with the development of increasingly sophisticated mindreading behaviors that involve both executive functions and general knowledge and then sketch 1 approach to a minimal account of goal ascription. PMID:26901746

  12. Making the Most of Minimalism in Music.

    ERIC Educational Resources Information Center

    Geiersbach, Frederick J.

    1998-01-01

    Describes the minimalist movement in music. Discusses generations of minimalist musicians and, in general, the minimalist approach. Considers various ways that minimalist strategies can be integrated into the music classroom focusing on (1) minimalism and (2) student-centered composition and principles of minimalism for use with elementary band…

  13. WASTE MINIMIZATION ASSESSMENT FOR A BOURBON DISTILLERY

    EPA Science Inventory

    The U.S.Environmental Protection Agency (EPA) has funded a pilot project to assist small and medium-size manufacturers who want to minimize their generation of waste but who lack the expertise to do so. Waste Minimization Assessment Centers (WMACs) were established at selected un...

  14. Minimizing electrode contamination in an electrochemical cell

    DOEpatents

    Kim, Yu Seung; Zelenay, Piotr; Johnston, Christina

    2014-12-09

    An electrochemical cell assembly that is expected to prevent or at least minimize electrode contamination includes one or more getters that trap a component or components leached from a first electrode and prevents or at least minimizes them from contaminating a second electrode.

  15. Methodological Pluralism and Narrative Inquiry

    ERIC Educational Resources Information Center

    Michie, Michael

    2013-01-01

    This paper considers how the integral theory model of Nancy Davis and Laurie Callihan might be enacted using a different qualitative methodology, in this case the narrative methodology. The focus of narrative research is shown to be on "what meaning is being made" rather than "what is happening here" (quadrant 2 rather than…

  16. Exploring biomolecular systems: From methodology to application

    NASA Astrophysics Data System (ADS)

    Liu, Pu

    This thesis describes new methodology development and applications in the computer simulation on biomolecular systems. To reduce the number of parallel processors in replica exchange, we deform the Hamiltonian function for each replica in such a way that the acceptance probability for the exchange of replica configurations does not depend on the number of explicit water molecules in the system. To accelerate barrier crossing in sampling of rough energy landscape, we invoke quantum tunnelling by using Feynman path-integral theory. Combined with local minimization, this new global optimization method successfully locates almost all the known classical global energy minima for Lennard-Jones clusters of size up to 100. We present a new methodology for calculating diffusion coefficients for molecules in confined space and apply it in water-vapor interface. We examine hydrogen bond dynamics of water-vapor interface and compare dynamics in polarizable and fixed charge water models. The result highlights the potential importance of polarization effect in the water-vapor interface. Finally, we discover a strong water drying transition in a biological protein system, the melittin tetramer. This is the first observation of such a strong transition in computer simulation for protein systems. The surface topology is shown to be very important for this drying transition.

  17. A non-parametric segmentation methodology for oral videocapillaroscopic images.

    PubMed

    Bellavia, Fabio; Cacioppo, Antonino; Lupaşcu, Carmen Alina; Messina, Pietro; Scardina, Giuseppe; Tegolo, Domenico; Valenti, Cesare

    2014-05-01

    We aim to describe a new non-parametric methodology to support the clinician during the diagnostic process of oral videocapillaroscopy to evaluate peripheral microcirculation. Our methodology, mainly based on wavelet analysis and mathematical morphology to preprocess the images, segments them by minimizing the within-class luminosity variance of both capillaries and background. Experiments were carried out on a set of real microphotographs to validate this approach versus handmade segmentations provided by physicians. By using a leave-one-patient-out approach, we pointed out that our methodology is robust, according to precision-recall criteria (average precision and recall are equal to 0.924 and 0.923, respectively) and it acts as a physician in terms of the Jaccard index (mean and standard deviation equal to 0.858 and 0.064, respectively). PMID:24657094
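
    The segmentation criterion quoted above, minimizing the within-class luminosity variance of capillaries versus background, is the same objective optimized by Otsu's classic threshold. The sketch below shows that criterion applied to a grayscale image and is offered as background only; it is not the authors' full wavelet-and-morphology pipeline.

    ```python
    import numpy as np

    def min_within_class_variance_threshold(gray):
        """Return the gray level that minimizes the weighted within-class variance
        of the two classes it induces (Otsu's criterion)."""
        hist, _ = np.histogram(gray, bins=256, range=(0, 256))
        prob = hist / hist.sum()
        levels = np.arange(256)
        best_t, best_var = 0, np.inf
        for t in range(1, 256):
            w0, w1 = prob[:t].sum(), prob[t:].sum()
            if w0 == 0 or w1 == 0:
                continue
            mu0 = (levels[:t] * prob[:t]).sum() / w0
            mu1 = (levels[t:] * prob[t:]).sum() / w1
            var0 = (((levels[:t] - mu0) ** 2) * prob[:t]).sum() / w0
            var1 = (((levels[t:] - mu1) ** 2) * prob[t:]).sum() / w1
            within = w0 * var0 + w1 * var1
            if within < best_var:
                best_t, best_var = t, within
        return best_t
    ```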

  18. Technology applications for radioactive waste minimization

    SciTech Connect

    Devgun, J.S.

    1994-07-01

    The nuclear power industry has achieved one of the most successful examples of waste minimization. The annual volume of low-level radioactive waste shipped for disposal per reactor has decreased to approximately one-fifth the volume about a decade ago. In addition, the curie content of the total waste shipped for disposal has decreased. This paper will discuss the regulatory drivers and economic factors for waste minimization and describe the application of technologies for achieving waste minimization for low-level radioactive waste with examples from the nuclear power industry.

  19. Minimally Invasive Cardiovascular Surgery: Incisions and Approaches

    PubMed Central

    Langer, Nathaniel B.; Argenziano, Michael

    2016-01-01

    Throughout the modern era of cardiac surgery, most operations have been performed via median sternotomy with cardiopulmonary bypass. This paradigm is changing, however, as cardiovascular surgery is increasingly adopting minimally invasive techniques. Advances in patient evaluation, instrumentation, and operative technique have allowed surgeons to perform a wide variety of complex operations through smaller incisions and, in some cases, without cardiopulmonary bypass. With patients desiring less invasive operations and the literature supporting decreased blood loss, shorter hospital length of stay, improved postoperative pain, and better cosmesis, minimally invasive cardiac surgery should be widely practiced. Here, we review the incisions and approaches currently used in minimally invasive cardiovascular surgery. PMID:27127555

  20. Minimal representations, geometric quantization, and unitarity.

    PubMed Central

    Brylinski, R; Kostant, B

    1994-01-01

    In the framework of geometric quantization we explicitly construct, in a uniform fashion, a unitary minimal representation π_o of every simply-connected real Lie group G_o such that the maximal compact subgroup of G_o has finite center and G_o admits some minimal representation. We obtain algebraic and analytic results about π_o. We give several results on the algebraic and symplectic geometry of the minimal nilpotent orbits and then "quantize" these results to obtain the corresponding representations. We assume (Lie G_o)_C is simple. PMID:11607478

  1. Imaging and minimally invasive aortic valve replacement

    PubMed Central

    Loor, Gabriel

    2015-01-01

    Cardiovascular imaging has been the most important tool allowing for innovation in cardiac surgery. There are now a variety of approaches available for treating aortic valve disease, including standard sternotomy, minimally invasive surgery, and percutaneous valve replacement. Minimally invasive cardiac surgery relies on maximizing exposure within a limited field of view. The complexity of this approach is increased as the relationship between the great vessels and the bony thorax varies between individuals. Ultimately, the success of minimally invasive surgery depends on appropriate choices regarding the type and location of the incision, cannulation approach, and cardioprotection strategy. These decisions are facilitated by preoperative imaging, which forms the focus of this review. PMID:25694979

  2. Minimization of power consumption during charging of superconducting accelerating cavities

    NASA Astrophysics Data System (ADS)

    Bhattacharyya, Anirban Krishna; Ziemann, Volker; Ruber, Roger; Goryashko, Vitaliy

    2015-11-01

    The radio frequency cavities, used to accelerate charged particle beams, need to be charged to their nominal voltage, after which the beam can be injected into them. The standard procedure for such cavity filling is to use a step charging profile. However, during the initial stages of such a filling process a substantial amount of the total energy is wasted in reflection for superconducting cavities because of their extremely narrow bandwidth. The paper presents a novel strategy to charge cavities, which reduces total energy reflection. We use variational calculus to obtain an analytical expression for the optimal charging profile. Reflected and required energies and generator peak power are compared between the charging schemes, and practical aspects (saturation, efficiency and gain characteristics) of power sources (tetrodes, IOTs and solid state power amplifiers) are also considered and analysed. The paper presents a methodology to successfully identify the optimal charging scheme for different power sources to minimize the total energy requirement.

  3. Minimizing Variation in Outdoor CPV Power Ratings (Presentation)

    SciTech Connect

    Muller, M.

    2011-04-01

    Presented at the 7th International Conference on Concentrating Photovoltaic Systems (CPV-7), 4-6 April 2011, Las Vegas, Nevada. The CPV community has agreed to have both indoor and outdoor power ratings at the module level. The indoor rating provides a repeatable measure of module performance as it leaves the factory line, while the outdoor rating provides a measure of true performance under real world conditions. The challenge with an outdoor rating is that the spectrum, temperature, wind speed, etc. are constantly in flux and therefore the resulting power rating varies from day to day and month to month. This work examines different methodologies for determining the outdoor power rating with the goal of minimizing variation even if data are collected under changing meteorological conditions.

  4. Prioritization methodology for chemical replacement

    NASA Technical Reports Server (NTRS)

    Goldberg, Ben; Cruit, Wendy; Schutzenhofer, Scott

    1995-01-01

    This methodology serves to define a system for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semi quantitative approach derived from quality function deployment techniques (QFD Matrix). QFD is a conceptual map that provides a method of transforming customer wants and needs into quantitative engineering terms. This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives.

  5. Waste minimization and pollution prevention awareness plan

    SciTech Connect

    Not Available

    1991-05-31

    The purpose of this plan is to document the Lawrence Livermore National Laboratory (LLNL) Waste Minimization and Pollution Prevention Awareness Program. The plan specifies those activities and methods that are or will be employed to reduce the quantity and toxicity of wastes generated at the site. The intent of this plan is to respond to and comply with the Department of Energy's (DOE) policy and guidelines concerning the need for pollution prevention. The Plan is composed of an LLNL Waste Minimization and Pollution Prevention Awareness Program Plan and, as attachments, Program- and Department-specific waste minimization plans. This format reflects the fact that waste minimization is considered a line management responsibility and is to be addressed by each of the Programs and Departments. 14 refs.

  6. Controlling molecular transport in minimal emulsions

    NASA Astrophysics Data System (ADS)

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of `minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions.

  7. Controlling molecular transport in minimal emulsions

    PubMed Central

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of ‘minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions. PMID:26797564

  8. Genetic algorithms for minimal source reconstructions

    SciTech Connect

    Lewis, P.S.; Mosher, J.C.

    1993-12-01

    Under-determined linear inverse problems arise in applications in which signals must be estimated from insufficient data. In these problems the number of potentially active sources is greater than the number of observations. In many situations, it is desirable to find a minimal source solution. This can be accomplished by minimizing a cost function that accounts from both the compatibility of the solution with the observations and for its ``sparseness``. Minimizing functions of this form can be a difficult optimization problem. Genetic algorithms are a relatively new and robust approach to the solution of difficult optimization problems, providing a global framework that is not dependent on local continuity or on explicit starting values. In this paper, the authors describe the use of genetic algorithms to find minimal source solutions, using as an example a simulation inspired by the reconstruction of neural currents in the human brain from magnetoencephalographic (MEG) measurements.
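
    The abstract does not give the exact cost function or GA settings, so the following is a toy sketch of the idea: candidate solutions are binary masks of active sources, fitness trades off data misfit (computed by least squares restricted to the active columns) against the number of active sources, and a simple genetic algorithm evolves the masks. All parameters and operators are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np

    def fitness(mask, A, b, sparsity_weight=0.1):
        """Misfit of the best least-squares fit restricted to active sources,
        plus a penalty on how many sources are active (the 'sparseness' term)."""
        if not mask.any():
            return np.inf
        x, *_ = np.linalg.lstsq(A[:, mask], b, rcond=None)
        misfit = np.linalg.norm(A[:, mask] @ x - b) ** 2
        return misfit + sparsity_weight * mask.sum()

    def genetic_minimal_sources(A, b, pop=40, gens=200, p_mut=0.05, rng=None):
        rng = rng or np.random.default_rng(0)
        n_src = A.shape[1]
        population = rng.random((pop, n_src)) < 0.2          # sparse initial masks
        for _ in range(gens):
            scores = np.array([fitness(ind, A, b) for ind in population])
            order = np.argsort(scores)
            parents = population[order[: pop // 2]]           # truncation selection
            cut = rng.integers(1, n_src, size=pop // 2)
            children = np.array([np.concatenate([parents[i][:c],
                                                 parents[(i + 1) % len(parents)][c:]])
                                 for i, c in enumerate(cut)]) # one-point crossover
            children ^= rng.random(children.shape) < p_mut    # bit-flip mutation
            population = np.vstack([parents, children])
        scores = np.array([fitness(ind, A, b) for ind in population])
        return population[np.argmin(scores)]
    ```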

  9. Minimally Invasive Transcatheter Aortic Valve Replacement (TAVR)

    MedlinePlus Videos and Cool Tools

    Watch a Broward Health surgeon perform a minimally invasive Transcatheter Aortic Valve Replacement (TAVR).

  10. Assembling Precise Truss Structures With Minimal Stresses

    NASA Technical Reports Server (NTRS)

    Sword, Lee F.

    1996-01-01

    Improved method of assembling precise truss structures involves use of simple devices. Tapered pins that fit in tapered holes indicate deviations from prescribed lengths. Method both helps to ensure precision of finished structures and minimizes residual stresses within structures.

  11. Minimally Invasive Treatments for Breast Cancer

    MedlinePlus

    Interventional Radiology Treatments Offer New Options and Hope ... have in the fight against breast cancer. About Breast Cancer: When breast tissue divides and grows at an ...

  12. TOWARD MINIMALLY ADHESIVE SURFACES UTILIZING SILOXANES

    EPA Science Inventory

    Three types of siloxane-based network polymers have been investigated for their surface properties towards potential applications as minimally adhesive coatings. A filled poly(dimethylsiloxane) (PDMS) elastomer, RTV it, has been studied to determine surface weldability and stabil...

  13. Waste minimization in electroplating industries: a review.

    PubMed

    Babu, B Ramesh; Bhanu, S Udaya; Meera, K Seeni

    2009-07-01

    Wastewater, spent solvent, spent process solutions, and sludge are the major waste streams generated in large volumes daily in electroplating industries. These waste streams can be significantly minimized through process modification and operational improvement. Waste minimization methods have been implemented in some of the electroplating industries. Suggestions such as practicing source reduction approaches, reduction in drag out and waste, process modification and environmental benefits, have also been adopted. In this endeavor, extensive knowledge covering various disciplines has been studied, which makes problem solving extremely easy. Moreover, available process data pertaining to waste minimization (WM) is usually imprecise, incomplete, and uncertain due to the lack of sensors, the difficulty of measurement, and process variations. In this article waste minimization techniques and its advantages on the improvement of working atmosphere and reduction in operating cost have been discussed. PMID:19657919

  14. Controlling molecular transport in minimal emulsions.

    PubMed

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of 'minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions. PMID:26797564

  15. Analysis of lipid flow on minimal surfaces

    NASA Astrophysics Data System (ADS)

    Bahmani, Fatemeh; Christenson, Joel; Rangamani, Padmini

    2016-03-01

    Interaction between the bilayer shape and surface flow is important for capturing the flow of lipids in many biological membranes. Recent microscopy evidence has shown that minimal surfaces (planes, catenoids, and helicoids) occur often in cellular membranes. In this study, we explore lipid flow in these geometries using a `stream function' formulation for viscoelastic lipid bilayers. Using this formulation, we derive two-dimensional lipid flow equations for the commonly occurring minimal surfaces in lipid bilayers. We show that for three minimal surfaces (planes, catenoids, and helicoids), the surface flow equations satisfy Stokes flow equations. In helicoids and catenoids, we show that the tangential velocity field is a Killing vector field. Thus, our analysis provides fundamental insight into the flow patterns of lipids on intracellular organelle membranes that are characterized by fixed shapes reminiscent of minimal surfaces.
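
    For readers unfamiliar with the geometries named above, the defining condition and the textbook parametrizations of the two nonplanar examples are easy to state; the formulas below are standard background, not results of the paper.

    ```latex
    % A surface is minimal when its mean curvature vanishes everywhere.
    \[
    H = \tfrac{1}{2}\,(\kappa_1 + \kappa_2) = 0
    \]
    % Standard parametrizations (scale parameter a > 0):
    \[
    \text{catenoid:}\quad \mathbf{r}(u,v) = \bigl(a\cosh v \cos u,\; a\cosh v \sin u,\; a v\bigr),
    \qquad
    \text{helicoid:}\quad \mathbf{r}(u,v) = \bigl(v\cos u,\; v\sin u,\; a u\bigr).
    \]
    ```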

  16. Degreasing of titanium to minimize stress corrosion

    NASA Technical Reports Server (NTRS)

    Carpenter, S. R.

    1967-01-01

    Stress corrosion of titanium and its alloys at elevated temperatures is minimized by replacing trichloroethylene with methanol or methyl ethyl ketone as a degreasing agent. Wearing cotton gloves reduces stress corrosion from perspiration before the metal components are processed.

  17. Mesonic spectroscopy of minimal walking technicolor

    SciTech Connect

    Del Debbio, Luigi; Lucini, Biagio; Patella, Agostino; Pica, Claudio; Rago, Antonio

    2010-07-01

    We investigate the structure and the novel emerging features of the mesonic nonsinglet spectrum of the minimal walking technicolor theory. Precision measurements in the nonsinglet pseudoscalar and vector channels are compared to the expectations for an IR-conformal field theory and a QCD-like theory. Our results favor a scenario in which minimal walking technicolor is (almost) conformal in the infrared, while spontaneous chiral symmetry breaking seems less plausible.

  18. Minimally Invasive Osteotomies of the Calcaneus.

    PubMed

    Guyton, Gregory P

    2016-09-01

    Osteotomies of the calcaneus are powerful surgical tools, representing a critical component of the surgical reconstruction of pes planus and pes cavus deformity. Modern minimally invasive calcaneal osteotomies can be performed safely with a burr through a lateral incision. Although greater kerf is generated with the burr, the effect is modest, can be minimized, and is compatible with many fixation techniques. A hinged jig renders the procedure more reproducible and accessible. PMID:27524705

  19. PWM control techniques for rectifier filter minimization

    SciTech Connect

    Ziogas, P.D.; Kang, Y-G; Stefanovic, V.R.

    1985-09-01

    Minimization of input/output filters is an essential step towards manufacturing compact low-cost static power supplies. Three PWM control techniques that yield substantial filter size reduction for three-phase (self-commutated) rectifiers are presented and analyzed. Filters required by typical line-commutated rectifiers are used as the basis for comparison. Moreover, it is shown that in addition to filter minimization two of the proposed three control techniques improve substantially the rectifier total input power factor.

  20. Minimally Invasive Forefoot Surgery in France.

    PubMed

    Meusnier, Tristan; Mukish, Prikesht

    2016-06-01

    Study groups have been formed in France to advance the use of minimally invasive surgery. These techniques are used increasingly often, and their nuances continue to evolve. The objective of this article is to raise awareness of current trends in minimally invasive surgery for common diseases of the forefoot. Percutaneous forefoot surgery, which is less developed at this time, is also discussed. PMID:27261810

  1. Alternating minimization and Boltzmann machine learning.

    PubMed

    Byrne, W

    1992-01-01

    Training a Boltzmann machine with hidden units is appropriately treated in information geometry using the information divergence and the technique of alternating minimization. The resulting algorithm is shown to be closely related to gradient descent Boltzmann machine learning rules, and the close relationship of both to the EM algorithm is described. An iterative proportional fitting procedure for training machines without hidden units is described and incorporated into the alternating minimization algorithm. PMID:18276461

  2. Future of Minimally Invasive Colorectal Surgery.

    PubMed

    Whealon, Matthew; Vinci, Alessio; Pigazzi, Alessio

    2016-09-01

    Minimally invasive surgery is slowly taking over as the preferred operative approach for colorectal diseases. However, many of the procedures remain technically difficult. This article will give an overview of the state of minimally invasive surgery and the many advances that have been made over the last two decades. Specifically, we discuss the introduction of the robotic platform and some of its benefits and limitations. We also describe some newer techniques related to robotics. PMID:27582647

  3. Mach, methodology, hysteresis and economics

    NASA Astrophysics Data System (ADS)

    Cross, R.

    2008-11-01

    This methodological note examines the epistemological foundations of hysteresis with particular reference to applications to economic systems. The economy principles of Ernst Mach are advocated and used in this assessment.

  4. Methodological Problems of Soviet Pedagogy

    ERIC Educational Resources Information Center

    Noah, Harold J., Ed.; Beach, Beatrice S., Ed.

    1974-01-01

    Selected papers presented at the First Scientific Conference of Pedagogical Scholars of Socialist Countries, Moscow, 1971, deal with methodology in relation to science, human development, sociology, psychology, cybernetics, and the learning process. (KM)

  5. Environmental probabilistic quantitative assessment methodologies

    USGS Publications Warehouse

    Crovelli, R.A.

    1995-01-01

    In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: 1) direct assessment, 2) accumulation size, 3) volumetric yield, and 4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz. TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are its computational speed and flexibility, making it ideal for a microcomputer. -from Author

  6. New Directions for Futures Methodology.

    ERIC Educational Resources Information Center

    Enzer, Selwyn

    1983-01-01

    Understanding the link between futures research and strategic planning is crucial to effective long-range planning and administration. Current trends and the latest developments in the methodology of futures research are discussed. (MLW)

  7. Minimally Invasive Surgery in Gynecologic Oncology

    PubMed Central

    Mori, Kristina M.; Neubauer, Nikki L.

    2013-01-01

    Minimally invasive surgery has been utilized in the field of obstetrics and gynecology as far back as the 1940s when culdoscopy was first introduced as a visualization tool. Gynecologists then began to employ minimally invasive surgery for adhesiolysis and obtaining biopsies but then expanded its use to include procedures such as tubal sterilization (Clyman (1963), L. E. Smale and M. L. Smale (1973), Thompson and Wheeless (1971), Peterson and Behrman (1971)). With advances in instrumentation, the first laparoscopic hysterectomy was successfully performed in 1989 by Reich et al. At the same time, minimally invasive surgery in gynecologic oncology was being developed alongside its benign counterpart. In the 1970s, Rosenoff et al. reported using peritoneoscopy for pretreatment evaluation in ovarian cancer, and Spinelli et al. reported on using laparoscopy for the staging of ovarian cancer. In 1993, Nichols used operative laparoscopy to perform pelvic lymphadenectomy in cervical cancer patients. The initial goals of minimally invasive surgery, not dissimilar to those of modern medicine, were to decrease the morbidity and mortality associated with surgery and therefore improve patient outcomes and patient satisfaction. This review will summarize the history and use of minimally invasive surgery in gynecologic oncology and also highlight new minimally invasive surgical approaches currently in development. PMID:23997959

  8. Multifunction minimization for programmable logic arrays

    SciTech Connect

    Campbell, J.A.

    1984-01-01

    The problem of minimizing two-level AND/OR Boolean algebraic functions of n inputs and m outputs for implementation on programmable logic arrays (PLA) is examined. The theory of multiple-output functions as well as the historically alternative approaches to reckoning the cost of an equation implementation are reviewed. The PLA is shown to be a realization of the least product gate equation cost criterion. The multi-function minimization is dealt with in the context of a directed tree search algorithm developed in previous research. The PLA oriented minimization is shown to alter the nature of each of the basic tenets of multiple-output minimization used in earlier work. The concept of a non-prime but selectable implicant is introduced. A new cost criterion, the quantum cost, is discussed, and an approximation algorithm utilizing this criterion is developed. A timing analysis of a cyclic resolution algorithm for PLA based functions is presented. Lastly, the question of efficiency in automated minimization algorithms is examined. The application of the PLA cost criterion is shown to exhibit intrinsic increases in computational efficiency. A minterm classification algorithm is suggested and a PLA minimization algorithm is implemented in the FORTRAN language.

  9. Economic impact of minimally invasive lumbar surgery

    PubMed Central

    Hofstetter, Christoph P; Hofer, Anna S; Wang, Michael Y

    2015-01-01

    Cost effectiveness has been demonstrated for traditional lumbar discectomy, lumbar laminectomy as well as for instrumented and noninstrumented arthrodesis. While emerging evidence suggests that minimally invasive spine surgery reduces morbidity and duration of hospitalization and accelerates return to activities of daily living, data regarding the cost effectiveness of these novel techniques are limited. The current study analyzes all available data on minimally invasive techniques for lumbar discectomy, decompression, short-segment fusion and deformity surgery. In general, minimally invasive spine procedures appear to hold promise in quicker patient recovery times and earlier return to work. Thus, minimally invasive lumbar spine surgery appears to have the potential to be a cost-effective intervention. Moreover, novel less invasive procedures are less destabilizing and may therefore be utilized in certain indications that traditionally required arthrodesis procedures. However, there is a lack of studies analyzing the economic impact of minimally invasive spine surgery. Future studies are necessary to confirm the durability and further define indications for minimally invasive lumbar spine procedures. PMID:25793159

  10. Department of Energy's waste minimization program

    SciTech Connect

    Not Available

    1991-09-01

    Waste minimization, as mandated by the Congress, requires the elimination or reduction of the generation of waste at its source, that is, before it can become waste. This audit was made to determine the adequacy of DOE's efforts to minimize the generation of waste. The audit emphasized radioactive and other hazardous waste generation at DOE's nuclear weapons production plants and design laboratories. We included waste minimization activities and actions that can be taken now, in contrast to the long-range weapons complex modernization effort. We reviewed waste minimization activities within the Office of Environmental Restoration and Waste Management (EM), the Office of the Assistant Secretary for Defense Programs (DP), the Hazardous Waste Remedial Action Program Office, and the Waste Minimization Management Group (WMMG) in the Albuquerque Field Office. Waste minimization programs were examined in detail at the three largest nuclear weapons production facilities -- the Rocky Flats plant, which manufactures plutonium parts; the Y-12 facility, which produces uranium components; and the Savannah River site, which manufactures and loads tritium -- and two of DOE's weapons design laboratories, Los Alamos and Sandia.

  11. Swords into plowshares -- Tritium waste minimization (training development project)

    SciTech Connect

    Hehmeyer, J.; Sienkiewicz, C.; Kent, L.; Gill, J.; Schmitz, W.; Mills, T.; Wurstner, R.; Adams, F.; Seabaugh, P.

    1995-12-31

    A concentrated emphasis of Mound's historical mission has been working with tritium. As the phase-out of defense work begins and the emphasis on environmental technology strengthens, so too must a shift occur in applying one's focus. Mound's longstanding efforts in Tritium Training have proven fruitful to them and the Complex. It is this emphasis for which a new generation of worker training is being developed, one which reflects a new mission: Tritium Waste Minimization. The efforts of previous training, particularly under Accreditation, have given a solid base on which to launch the Waste Minimization program. Typical operations consider the impact on the varying levels of containment and the tools and agents used to achieve those levels. D and D and system modifications are bringing new light to such things as floor tile, oils, mole sieves, and rust. Of financial interest is the amount of savings obtained through review and modification of an existing program, rather than developing a new one. The authors are learning not to reinvent the wheel. The presentation will compare and contrast the methodologies used in creating and implementing this training program. Emphasis will be placed on lessons learned, costs saved, and program enhancement.

  12. Cluster Stability Estimation Based on a Minimal Spanning Trees Approach

    NASA Astrophysics Data System (ADS)

    Volkovich, Zeev (Vladimir); Barzily, Zeev; Weber, Gerhard-Wilhelm; Toledano-Kitai, Dvora

    2009-08-01

    Among the areas of data and text mining employed today in science, economy and technology, clustering theory serves as a preprocessing step in data analysis. However, many open questions still await theoretical and practical treatment; for example, the problem of determining the true number of clusters has not been satisfactorily solved. In the current paper, this problem is addressed by the cluster stability approach. For several possible numbers of clusters we estimate the stability of partitions obtained from clustering of samples. Partitions are considered consistent if their clusters are stable. Cluster validity is measured as the total number of edges, in the clusters' minimal spanning trees, connecting points from different samples; in effect, we use the Friedman and Rafsky two-sample test statistic. The homogeneity hypothesis, of well-mingled samples within the clusters, leads to an asymptotically normal distribution of the considered statistic. Resting upon this fact, the standard score of the mentioned edge count is computed, and the partition quality is represented by the worst cluster, corresponding to the minimal standard score value. It is natural to expect that the true number of clusters can be characterized by the empirical distribution having the shortest left tail. The proposed methodology sequentially creates the described value distribution and estimates its left asymmetry. Numerical experiments, presented in the paper, demonstrate the ability of the approach to detect the true number of clusters.
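
    The core statistic described above can be sketched in a few lines. The Python fragment below is an illustrative sketch only; the choice of SciPy routines, the function name, and the toy data are assumptions, and the paper's resampling loop, standardization, and left-asymmetry estimate are omitted. It builds a minimal spanning tree over the pooled points of one cluster and counts the edges joining points from different samples, in the spirit of the Friedman and Rafsky two-sample statistic.

      # Sketch of a Friedman-Rafsky-style edge count for one cluster: build an MST
      # over the pooled points and count edges joining points from different samples.
      import numpy as np
      from scipy.spatial.distance import pdist, squareform
      from scipy.sparse.csgraph import minimum_spanning_tree

      def cross_sample_edges(points, labels):
          """points: (n, d) array of one cluster's points pooled from two samples.
          labels: length-n array of 0/1 sample labels.
          Returns the number of MST edges connecting points from different samples."""
          dists = squareform(pdist(points))
          mst = minimum_spanning_tree(dists).tocoo()
          return int(sum(labels[i] != labels[j] for i, j in zip(mst.row, mst.col)))

      rng = np.random.default_rng(0)
      a = rng.normal(0.0, 1.0, size=(30, 2))   # sample 1
      b = rng.normal(0.0, 1.0, size=(30, 2))   # sample 2, well mingled with sample 1
      pooled = np.vstack([a, b])
      labels = np.array([0] * 30 + [1] * 30)
      print(cross_sample_edges(pooled, labels))  # a large count indicates well-mingled samples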

  13. Approach to analytically minimize the LCD moiré by image-based particle swarm optimization.

    PubMed

    Tsai, Yu-Lin; Tien, Chung-Hao

    2015-10-01

    In this paper, we propose a methodology to optimize the parametric window of a liquid crystal display (LCD) system whose visual performance is degraded by pixel moiré arising between multiple periodic structures. Conventional analysis and minimization of moiré patterns are limited to a few parameters. With the proposed image-based particle swarm optimization (PSO), we enable simultaneous multivariable optimization. A series of experiments was conducted to validate the methodology. Owing to its versatility, the proposed technique should have a promising impact on fast optimization in LCD designs with more complex configurations. PMID:26479663
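
    As a rough illustration of the optimization engine, the sketch below implements a generic particle swarm optimizer in Python; the image-based moiré cost function of the paper is replaced by a placeholder quadratic, and all parameter values are assumptions rather than those used by the authors.

      # Minimal particle swarm optimization sketch (illustrative; the paper's
      # image-based moiré visibility metric is replaced by a placeholder cost).
      import numpy as np

      def pso(cost, bounds, n_particles=30, n_iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
          rng = np.random.default_rng(seed)
          lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
          x = rng.uniform(lo, hi, size=(n_particles, lo.size))   # positions
          v = np.zeros_like(x)                                   # velocities
          pbest, pbest_val = x.copy(), np.array([cost(p) for p in x])
          gbest = pbest[pbest_val.argmin()].copy()
          for _ in range(n_iters):
              r1, r2 = rng.random(x.shape), rng.random(x.shape)
              v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
              x = np.clip(x + v, lo, hi)
              vals = np.array([cost(p) for p in x])
              improved = vals < pbest_val
              pbest[improved], pbest_val[improved] = x[improved], vals[improved]
              gbest = pbest[pbest_val.argmin()].copy()
          return gbest, pbest_val.min()

      # Placeholder cost standing in for an image-based moiré visibility metric.
      moire_cost = lambda p: np.sum((p - np.array([0.3, 0.7, 1.2])) ** 2)
      best, val = pso(moire_cost, ([0, 0, 0], [2, 2, 2]))
      print(best, val)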

  14. Minimally invasive procedures on the lumbar spine.

    PubMed

    Skovrlj, Branko; Gilligan, Jeffrey; Cutler, Holt S; Qureshi, Sheeraz A

    2015-01-16

    Degenerative disease of the lumbar spine is a common and increasingly prevalent condition that is often implicated as the primary reason for chronic low back pain and the leading cause of disability in the western world. Surgical management of lumbar degenerative disease has historically been approached by way of open surgical procedures aimed at decompressing and/or stabilizing the lumbar spine. Advances in technology and surgical instrumentation have led to minimally invasive surgical techniques being developed and increasingly used in the treatment of lumbar degenerative disease. Compared to traditional open spine surgery, minimally invasive techniques require smaller incisions and decrease approach-related morbidity by avoiding muscle crush injury from self-retaining retractors, preventing the disruption of tendon attachment sites of important muscles at the spinous processes, using known anatomic neurovascular and muscle planes, and minimizing collateral soft-tissue injury by limiting the width of the surgical corridor. The theoretical benefits of minimally invasive surgery over traditional open surgery include reduced blood loss, decreased postoperative pain and narcotics use, shorter hospital length of stay, faster recovery, and quicker return to work and normal activity. This paper describes the different minimally invasive techniques that are currently available for the treatment of degenerative disease of the lumbar spine. PMID:25610845

  15. Minimal control power of the controlled teleportation

    NASA Astrophysics Data System (ADS)

    Jeong, Kabgyun; Kim, Jaewan; Lee, Soojoon

    2016-03-01

    We generalize the control power of a perfect controlled teleportation of an entangled three-qubit pure state, suggested by Li and Ghose [Phys. Rev. A 90, 052305 (2014), 10.1103/PhysRevA.90.052305], to the control power of a general controlled teleportation of a multiqubit pure state. Thus, we define the minimal control power, and calculate the values of the minimal control power for a class of general three-qubit Greenberger-Horne-Zeilinger (GHZ) states and the three-qubit W class whose states have zero three-tangles. Moreover, we show that the standard three-qubit GHZ state and the standard three-qubit W state have the maximal values of the minimal control power for the two classes, respectively. This means that the minimal control power can be interpreted as not only an operational quantity of a three-qubit quantum communication but also a degree of three-qubit entanglement. In addition, we calculate the values of the minimal control power for general n-qubit GHZ states and the n-qubit W-type states.

  16. Methodology of metal criticality determination.

    PubMed

    Graedel, T E; Barr, Rachel; Chandler, Chelsea; Chase, Thomas; Choi, Joanne; Christoffersen, Lee; Friedlander, Elizabeth; Henly, Claire; Jun, Christine; Nassar, Nedal T; Schechner, Daniel; Warren, Simon; Yang, Man-Yu; Zhu, Charles

    2012-01-17

    A comprehensive methodology has been created to quantify the degree of criticality of the metals of the periodic table. In this paper, we present and discuss the methodology, which comprises three dimensions: supply risk, environmental implications, and vulnerability to supply restriction. Supply risk differs with the time scale (medium or long) and, in its more complex form, involves several components, themselves composed of a number of distinct indicators drawn from readily available peer-reviewed indexes and public information. Vulnerability to supply restriction differs with the organizational level (i.e., global, national, and corporate). The criticality methodology, an enhancement of a United States National Research Council template, is designed to help corporate, national, and global stakeholders conduct risk evaluation and to inform resource utilization and strategic decision-making. Although we believe our methodological choices lead to the most robust results, the framework has been constructed to permit flexibility by the user. Specific indicators can be deleted or added as desired and weighted as the user deems appropriate. The value of each indicator will evolve over time, and our future research will focus on this evolution. The methodology has proven to be sufficiently robust as to make it applicable across the entire spectrum of metals and organizational levels and provides a structural approach that reflects the multifaceted factors influencing the availability of metals in the 21st century. PMID:22191617

  17. Q methodology in health economics.

    PubMed

    Baker, Rachel; Thompson, Carl; Mannion, Russell

    2006-01-01

    The recognition that health economists need to understand the meaning of data if they are to adequately understand research findings which challenge conventional economic theory has led to the growth of qualitative modes of enquiry in health economics. The use of qualitative methods of exploration and description alongside quantitative techniques gives rise to a number of epistemological, ontological and methodological challenges: difficulties in accounting for subjectivity in choices, the need for rigour and transparency in method, and problems of disciplinary acceptability to health economists. Q methodology is introduced as a means of overcoming some of these challenges. We argue that Q offers a means of exploring subjectivity, beliefs and values while retaining the transparency, rigour and mathematical underpinnings of quantitative techniques. The various stages of Q methodological enquiry are outlined alongside potential areas of application in health economics, before discussing the strengths and limitations of the approach. We conclude that Q methodology is a useful addition to economists' methodological armoury and one that merits further consideration and evaluation in the study of health services. PMID:16378531

  18. Minimal invasive treatments for liver malignancies.

    PubMed

    Orsi, Franco; Varano, Gianluca

    2015-11-01

    Minimally invasive therapies have proved useful in the management of primary and secondary hepatic malignancies. The most relevant aspects of all these therapies are their minimal toxicity profiles and highly effective tumor responses without affecting the normal hepatic parenchyma. These unique characteristics, coupled with their minimally invasive nature, provide an attractive therapeutic option for patients who previously may have had few alternatives. Combination of these therapies might extend indications to bring curative treatment to a wider selected population. The results of various ongoing combination trials of intraarterial therapies with targeted therapies are awaited to further improve survival in this patient group. This review focuses on the application of ablative and intra-arterial therapies in the management of hepatocellular carcinoma and hepatic colorectal metastasis. PMID:26050603

  19. On Equilibria for ADM Minimization Games

    NASA Astrophysics Data System (ADS)

    Epstein, Leah; Levin, Asaf

    In the ADM minimization problem, the input is a set of arcs along a directed ring. The input arcs need to be partitioned into non-overlapping chains and cycles so as to minimize the total number of endpoints, where a k-arc cycle contributes k endpoints and a k-arc chain contributes k + 1 endpoints. We study the ADM minimization problem both as a non-cooperative and as a cooperative game. In these games, each arc corresponds to a player, and the players share the cost of the ADM switches. We consider two cost allocation models, a model which was considered by Flammini et al., and a new cost allocation model, which is inspired by congestion games. We compare the price of anarchy and price of stability in the two cost allocation models, as well as the strong price of anarchy and the strong price of stability.
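
    The endpoint cost that the players share is simple arithmetic: a k-arc cycle contributes k endpoints and a k-arc chain contributes k + 1. A minimal sketch in Python follows; the helper name and the example partition are illustrative only, and the cost-sharing models themselves are not modeled.

      # Sketch of the ADM (add-drop multiplexer) endpoint cost of a partition:
      # a cycle of k arcs costs k endpoints, a chain of k arcs costs k + 1.
      def adm_cost(cycles, chains):
          """cycles, chains: lists giving the number of arcs in each cycle / chain."""
          return sum(cycles) + sum(k + 1 for k in chains)

      # Two cycles of 3 arcs each and one chain of 2 arcs:
      print(adm_cost([3, 3], [2]))   # 3 + 3 + (2 + 1) = 9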

  20. Advanced pyrochemical technologies for minimizing nuclear waste

    SciTech Connect

    Bronson, M.C.; Dodson, K.E.; Riley, D.C.

    1994-06-01

    The Department of Energy (DOE) is seeking to reduce the size of the current nuclear weapons complex and consequently minimize operating costs. To meet this DOE objective, the national laboratories have been asked to develop advanced technologies that take uranium and plutonium from retired weapons and prepare them for new weapons, long-term storage, and/or final disposition. Current pyrochemical processes generate residue salts and ceramic wastes that require aqueous processing to remove and recover the actinides. However, the aqueous treatment of these residues generates an estimated 100 liters of acidic transuranic (TRU) waste per kilogram of plutonium in the residue. Lawrence Livermore National Laboratory (LLNL) is developing pyrochemical techniques to eliminate, minimize, or more efficiently treat these residue streams. This paper will present technologies being developed at LLNL on advanced materials for actinide containment, reactors that minimize residues, and pyrochemical processes that remove actinides from waste salts.

  1. Genetic Research on Biospecimens Poses Minimal Risk

    PubMed Central

    Wendler, David S.; Rid, Annette

    2014-01-01

    Genetic research on human biospecimens is increasingly common. Yet, debate continues over the level of risk that this research poses to sample donors. Some argue that genetic research on biospecimens poses minimal risk; others argue that it poses greater than minimal risk and therefore needs additional requirements and limitations. This debate raises concern that some donors are not receiving appropriate protection or, conversely, that valuable research is being subject to unnecessary requirements and limitations. The present paper attempts to address this concern using the widely-endorsed ‘risks of daily life’ standard. The three extant versions of this standard all suggest that, with proper measures in place to protect donor confidentiality, most genetic research on human biospecimens poses minimal risk to donors. PMID:25530152

  2. One-dimensional Gromov minimal filling problem

    SciTech Connect

    Ivanov, Alexandr O; Tuzhilin, Alexey A

    2012-05-31

    The paper is devoted to a new branch in the theory of one-dimensional variational problems with branching extremals, the investigation of one-dimensional minimal fillings introduced by the authors. On the one hand, this problem is a one-dimensional version of a generalization of Gromov's minimal fillings problem to the case of stratified manifolds. On the other hand, this problem is interesting in itself and also can be considered as a generalization of another classical problem, the Steiner problem on the construction of a shortest network connecting a given set of terminals. Besides the statement of the problem, we discuss several properties of the minimal fillings and state several conjectures. Bibliography: 38 titles.

  3. [EVOLUTION OF MINIMALLY INVASIVE CARDIAC SURGERY].

    PubMed

    Fujita, Tomoyuki; Kobayashi, Junjiro

    2016-03-01

    Minimally invasive surgery is an attractive choice for patients undergoing major cardiac surgery. We review the history of minimally invasive valve surgery in this article. Due to many innovations in surgical tools, cardiopulmonary bypass systems, visualization systems, and robotic systems as well as surgical techniques, minimally invasive cardiac surgery has become standard care for valve lesion repair. In particular, aortic cross-clamp techniques and methods for cardioplegia using the Chitwood clamp and root cannula or endoballoon catheter in combination with femoro-femoral bypass systems have made such procedures safer and more practical. On the other hand, robotically assisted surgery has not become standard due to the cost and slow learning curve. However, along with the development of robotics, this less-invasive technique may provide another choice for patients in the near future. PMID:27295770

  4. Genetic research on biospecimens poses minimal risk.

    PubMed

    Wendler, David S; Rid, Annette

    2015-01-01

    Genetic research on human biospecimens is increasingly common. However, debate continues over the level of risk that this research poses to sample donors. Some argue that genetic research on biospecimens poses minimal risk; others argue that it poses greater than minimal risk and therefore needs additional requirements and limitations. This debate raises concern that some donors are not receiving appropriate protection or, conversely, that valuable research is being subject to unnecessary requirements and limitations. The present paper attempts to resolve this debate using the widely-endorsed 'risks of daily life' standard. The three extant versions of this standard all suggest that, with proper measures in place to protect confidentiality, most genetic research on human biospecimens poses minimal risk to donors. PMID:25530152

  5. Responsible gambling: general principles and minimal requirements.

    PubMed

    Blaszczynski, Alex; Collins, Peter; Fong, Davis; Ladouceur, Robert; Nower, Lia; Shaffer, Howard J; Tavares, Hermano; Venisse, Jean-Luc

    2011-12-01

    Many international jurisdictions have introduced responsible gambling programs. These programs intend to minimize negative consequences of excessive gambling, but vary considerably in their aims, focus, and content. Many responsible gambling programs lack a conceptual framework and, in the absence of empirical data, their components are based only on general considerations and impressions. This paper outlines the consensus viewpoint of an international group of researchers suggesting fundamental responsible gambling principles, roles of key stakeholders, and minimal requirements that stakeholders can use to frame and inform responsible gambling programs across jurisdictions. Such a framework does not purport to offer value statements regarding the legal status of gambling or its expansion. Rather, it proposes gambling-related initiatives aimed at government, industry, and individuals to promote responsible gambling and consumer protection. This paper argues that there is a set of basic principles and minimal requirements that should form the basis for every responsible gambling program. PMID:21359586

  6. Approximate error conjugation gradient minimization methods

    DOEpatents

    Kallman, Jeffrey S

    2013-05-21

    In one embodiment, a method includes selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, calculating an approximate error using the subset of rays, and calculating a minimum in a conjugate gradient direction based on the approximate error. In another embodiment, a system includes a processor for executing logic, logic for selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, logic for calculating an approximate error using the subset of rays, and logic for calculating a minimum in a conjugate gradient direction based on the approximate error. In other embodiments, computer program products, methods, and systems are described capable of using approximate error in constrained conjugate gradient minimization problems.
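
    The idea of evaluating the error on a subset of rays can be sketched for the model problem of minimizing ||Ax - b||^2, where each row of A corresponds to a ray. The Python fragment below is a loose illustration under that assumption, not the patented algorithm or its constraint handling; the subset fraction, the Fletcher-Reeves update, and the test problem are all invented for demonstration.

      # Loose sketch: approximate the error/gradient on a random subset of "rays"
      # (rows of A) inside a conjugate-gradient-style minimization of ||Ax - b||^2.
      import numpy as np

      def subset_cg(A, b, n_iters=50, subset_frac=0.2, seed=0):
          rng = np.random.default_rng(seed)
          x = np.zeros(A.shape[1])
          d = g_prev = None
          for _ in range(n_iters):
              rows = rng.choice(A.shape[0], max(1, int(subset_frac * A.shape[0])),
                                replace=False)                  # sampled rays
              As, bs = A[rows], b[rows]
              g = As.T @ (As @ x - bs)                          # approximate gradient
              if d is None:
                  d = -g
              else:
                  beta = (g @ g) / (g_prev @ g_prev)            # Fletcher-Reeves coefficient
                  d = -g + beta * d
              Ad = As @ d
              alpha = -(g @ d) / (Ad @ Ad + 1e-12)              # exact step on the subset
              x = x + alpha * d
              g_prev = g
          return x

      A = np.random.default_rng(1).normal(size=(200, 20))
      x_true = np.arange(20, dtype=float)
      b = A @ x_true
      # Residual should shrink toward zero (sketch only; convergence not guaranteed).
      print(np.linalg.norm(subset_cg(A, b, n_iters=200) - x_true))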

  7. The NLC Software Requirements Methodology

    SciTech Connect

    Shoaee, Hamid

    2002-08-20

    We describe the software requirements and development methodology developed for the NLC control system. Given the longevity of that project, and the likely geographical distribution of the collaborating engineers, the planned requirements management process is somewhat more formal than the norm in high energy physics projects. The short-term goals of the requirements process are to accurately estimate costs, to decompose the problem, and to determine likely technologies. The long-term goal is to enable a smooth transition from high-level functional requirements to specific subsystem and component requirements for individual programmers, and to support distributed development. The methodology covers both ends of that life cycle. It covers both the analytical and documentary tools for software engineering and project management support. This paper introduces the methodology, which is fully described in [1].

  8. Minimally invasive surgical training: challenges and solutions.

    PubMed

    Pierorazio, Phillip M; Allaf, Mohamad E

    2009-01-01

    Treatment options for urological malignancies continue to increase and include endoscopic, laparoscopic, robotic, and image-guided percutaneous techniques. This ever expanding array of technically demanding management options coupled with a static training paradigm introduces challenges to training the urological oncologist of the future. Minimally invasive learning opportunities continue to evolve, and include an intensive experience during residency, postgraduate short courses or mini-apprenticeships, and full time fellowship programs. Incorporation of large animal surgery and surgical simulators may help shorten the necessary learning curve. Ultimately, programs must provide an intense hands-on experience to trainees in all minimally invasive surgical aspects for optimal training. PMID:19285236

  9. The Parisi Formula has a Unique Minimizer

    NASA Astrophysics Data System (ADS)

    Auffinger, Antonio; Chen, Wei-Kuo

    2015-05-01

    In 1979, Parisi (Phys Rev Lett 43:1754-1756, 1979) predicted a variational formula for the thermodynamic limit of the free energy in the Sherrington-Kirkpatrick model, and described the role played by its minimizer. This formula was verified in the seminal work of Talagrand (Ann Math 163(1):221-263, 2006) and later generalized to the mixed p-spin models by Panchenko (Ann Probab 42(3):946-958, 2014). In this paper, we prove that the minimizer in Parisi's formula is unique at any temperature and external field by establishing the strict convexity of the Parisi functional.

  10. Pattern Search Methods for Linearly Constrained Minimization

    NASA Technical Reports Server (NTRS)

    Lewis, Robert Michael; Torczon, Virginia

    1998-01-01

    We extend pattern search methods to linearly constrained minimization. We develop a general class of feasible point pattern search algorithms and prove global convergence to a Karush-Kuhn-Tucker point. As in the case of unconstrained minimization, pattern search methods for linearly constrained problems accomplish this without explicit recourse to the gradient or the directional derivative. Key to the analysis of the algorithms is the way in which the local search patterns conform to the geometry of the boundary of the feasible region.
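
    For readers unfamiliar with pattern search, the sketch below shows the unconstrained coordinate-direction variant in Python: trial points on a pattern of plus/minus coordinate steps are evaluated without any gradient information, and the step size contracts when no pattern point improves. The linearly constrained algorithms analyzed in the paper additionally conform the pattern to the boundary of the feasible region, which this illustration omits; the function name and test problem are assumptions.

      # Minimal coordinate pattern search for unconstrained minimization (sketch).
      import numpy as np

      def pattern_search(f, x0, step=1.0, tol=1e-6, max_iters=10_000):
          x = np.asarray(x0, float)
          fx = f(x)
          n = x.size
          directions = np.vstack([np.eye(n), -np.eye(n)])   # +/- coordinate pattern
          for _ in range(max_iters):
              improved = False
              for d in directions:
                  trial = x + step * d
                  ft = f(trial)
                  if ft < fx:               # accept the first improving pattern point
                      x, fx, improved = trial, ft, True
                      break
              if not improved:
                  step *= 0.5               # contract the pattern
                  if step < tol:
                      break
          return x, fx

      rosenbrock = lambda z: (1 - z[0]) ** 2 + 100 * (z[1] - z[0] ** 2) ** 2
      print(pattern_search(rosenbrock, [-1.2, 1.0]))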

  11. Minimizing radiation damage in nonlinear optical crystals

    DOEpatents

    Cooke, D.W.; Bennett, B.L.; Cockroft, N.J.

    1998-09-08

    Methods are disclosed for minimizing laser induced damage to nonlinear crystals, such as KTP crystals, involving various means for electrically grounding the crystals in order to diffuse electrical discharges within the crystals caused by the incident laser beam. In certain embodiments, electrically conductive material is deposited onto or into surfaces of the nonlinear crystals and the electrically conductive surfaces are connected to an electrical ground. To minimize electrical discharges on crystal surfaces that are not covered by the grounded electrically conductive material, a vacuum may be created around the nonlinear crystal. 5 figs.

  12. Minimizing radiation damage in nonlinear optical crystals

    DOEpatents

    Cooke, D. Wayne; Bennett, Bryan L.; Cockroft, Nigel J.

    1998-01-01

    Methods are disclosed for minimizing laser induced damage to nonlinear crystals, such as KTP crystals, involving various means for electrically grounding the crystals in order to diffuse electrical discharges within the crystals caused by the incident laser beam. In certain embodiments, electrically conductive material is deposited onto or into surfaces of the nonlinear crystals and the electrically conductive surfaces are connected to an electrical ground. To minimize electrical discharges on crystal surfaces that are not covered by the grounded electrically conductive material, a vacuum may be created around the nonlinear crystal.

  13. Minimal mass design of tensegrity structures

    NASA Astrophysics Data System (ADS)

    Nagase, Kenji; Skelton, R. E.

    2014-03-01

    This paper provides a unified framework for minimal mass design of tensegrity systems. For any given configuration and any given set of external forces, we design force density (member force divided by length) and cross-section area to minimize the structural mass subject to an equilibrium condition and a maximum stress constraint. The answer is provided by a linear program. Stability is assured by a positive definite stiffness matrix. This condition is described by a linear matrix inequality. Numerical examples are shown to illustrate the proposed method.
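
    The linear-programming character of the design problem can be seen in a toy example: when every member is sized to its allowable stress, a member's mass is proportional to the absolute force density times the squared length, so minimizing mass subject to nodal equilibrium is linear in the force densities. The sketch below (Python with SciPy; a single free node held by three cables, with all numbers invented) only illustrates this idea and is not the paper's general formulation with bars, stability, or buckling considerations.

      # Toy minimal-mass LP: minimize sum_i (length_i^2 * q_i), which is proportional
      # to mass when each cable is sized to its allowable stress, subject to
      # equilibrium of one free node under an external load. q_i >= 0 (cables in tension).
      import numpy as np
      from scipy.optimize import linprog

      free_node = np.array([0.0, 0.0])
      anchors = np.array([[-1.0, 1.0], [1.0, 1.0], [0.0, 1.0]])
      f_ext = np.array([0.0, -1.0])                 # external load on the free node

      vectors = anchors - free_node                 # member vectors from the free node
      lengths2 = np.sum(vectors ** 2, axis=1)       # squared member lengths

      # Equilibrium: sum_i q_i * (anchor_i - free_node) + f_ext = 0
      A_eq = vectors.T
      b_eq = -f_ext
      res = linprog(c=lengths2, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 3)
      print(res.x, res.fun)   # expect the short vertical cable to carry the load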

  14. Navigated minimally invasive unicompartmental knee arthroplasty.

    PubMed

    Jenny, Jean-Yves; Müller, Peter E; Weyer, R; John, Michael; Weber, Patrick; Ciobanu, Eugène; Schmitz, Andreas; Bacher, Thomas; Neumann, Wolfram; Jansson, Volkmar

    2006-10-01

    Unicompartmental knee arthroplasty (UKA) is an alternative procedure to high tibial osteotomy. This study assessed the procedure using computer navigation to improve implantation accuracy and presented early radiological results of a group of patients implanted with the univation UKA (B. Braun Aesculap, Tuttlingen, Germany) with navigation instrumentation and a minimally invasive approach. The authors concluded that navigated implantation of a UKA using a nonimage-based system improved the radiologic accuracy of implantation without significant inconvenience and with minimal change to the conventional operating technique. PMID:17407935

  15. Instabilities and Solitons in Minimal Strips

    NASA Astrophysics Data System (ADS)

    Machon, Thomas; Alexander, Gareth P.; Goldstein, Raymond E.; Pesci, Adriana I.

    2016-07-01

    We show that highly twisted minimal strips can undergo a nonsingular transition, unlike the singular transitions seen in the Möbius strip and the catenoid. If the strip is nonorientable, this transition is topologically frustrated, and the resulting surface contains a helicoidal defect. Through a controlled analytic approximation, the system can be mapped onto a scalar ϕ^4 theory on a nonorientable line bundle over the circle, where the defect becomes a topologically protected kink soliton or domain wall, thus establishing their existence in minimal surfaces. Demonstrations with soap films confirm these results and show how the position of the defect can be controlled through boundary deformation.

  16. Instabilities and Solitons in Minimal Strips.

    PubMed

    Machon, Thomas; Alexander, Gareth P; Goldstein, Raymond E; Pesci, Adriana I

    2016-07-01

    We show that highly twisted minimal strips can undergo a nonsingular transition, unlike the singular transitions seen in the Möbius strip and the catenoid. If the strip is nonorientable, this transition is topologically frustrated, and the resulting surface contains a helicoidal defect. Through a controlled analytic approximation, the system can be mapped onto a scalar ϕ^{4} theory on a nonorientable line bundle over the circle, where the defect becomes a topologically protected kink soliton or domain wall, thus establishing their existence in minimal surfaces. Demonstrations with soap films confirm these results and show how the position of the defect can be controlled through boundary deformation. PMID:27419593

  17. Minimally invasive transforaminal lumbosacral interbody fusion.

    PubMed

    Chang, Peng-Yuan; Wang, Michael Y

    2016-07-01

    In minimally invasive spinal fusion surgery, transforaminal lumbar (sacral) interbody fusion (TLIF) is one of the most common procedures, providing both anterior and posterior column support without retraction or violation of the neural structures. Direct and indirect decompression can be achieved through this single approach. Preoperative plain radiographs and MR scans should be carefully evaluated. This video demonstrates a standard approach to performing a minimally invasive transforaminal lumbosacral interbody fusion. The video can be found here: https://youtu.be/bhEeafKJ370 . PMID:27364426

  18. Non-minimal inflation and SUSY GUTs

    SciTech Connect

    Okada, Nobuchika

    2012-07-27

    The Standard Model Higgs boson with a nonminimal coupling to the gravitational curvature can drive cosmological inflation. We study this type of inflationary scenario in the context of supergravity. We first point out that it is naturally implemented in the minimal supersymmetric SU(5) model, and hence in virtually any GUT model. Next we propose another scenario based on the Minimal Supersymmetric Standard Model supplemented by right-handed neutrinos. These models can be tested by new observational data from the Planck satellite experiments within a few years.

  19. Minimally invasive approach to familial multiple lipomatosis.

    PubMed

    Ronan, S J; Broderick, T

    2000-09-01

    Thirty-five abdominal wall lipomas were removed from a patient with familial multiple lipomatosis using a minimally invasive approach in a cost-effective, reliable, and cosmetically pleasing manner. The surgical technique used is described in this case report. Clinical findings and prior excisions provided the preoperative diagnosis. The abdominal wall was dissected through two small, vertical midline incisions in the suprafascial plane with the aid of a lighted breast retractor. A complete excision of all palpable lipomas was achieved with this approach. The patient had excellent cosmetic results with minimal postoperative scarring. PMID:11007403

  20. From Jack polynomials to minimal model spectra

    NASA Astrophysics Data System (ADS)

    Ridout, David; Wood, Simon

    2015-01-01

    In this note, a deep connection between free field realizations of conformal field theories and symmetric polynomials is presented. We give a brief introduction into the necessary prerequisites of both free field realizations and symmetric polynomials, in particular Jack symmetric polynomials. Then we combine these two fields to classify the irreducible representations of the minimal model vertex operator algebras as an illuminating example of the power of these methods. While these results on the representation theory of the minimal models are all known, this note exploits the full power of Jack polynomials to present significant simplifications of the original proofs in the literature.

  1. Minimal Guidelines for Authors of Web Pages.

    ERIC Educational Resources Information Center

    ADE Bulletin, 2002

    2002-01-01

    Presents guidelines that recommend the minimal reference information that should be provided on Web pages intended for use by students, teachers, and scholars in the modern languages. Suggests the inclusion of information about responsible parties, copyright declaration, privacy statements, and site information. Makes a note on Web page style. (SG)

  2. Pancreatic cancer: Open or minimally invasive surgery?

    PubMed Central

    Zhang, Yu-Hua; Zhang, Cheng-Wu; Hu, Zhi-Ming; Hong, De-Fei

    2016-01-01

    Pancreatic duct adenocarcinoma is one of the most fatal malignancies, with R0 resection remaining the most important part of its treatment. However, pancreatectomy is believed to be one of the most challenging procedures, and R0 resection remains the only chance for patients with pancreatic cancer to have a good prognosis. Some surgeons have tried minimally invasive pancreatic surgery, but the short- and long-term outcomes of open versus minimally invasive procedures for pancreatic malignancy remain controversial. We collected comparative data about minimally invasive and open pancreatic surgery. The available evidence suggests that minimally invasive pancreaticoduodenectomy (MIPD) is as safe and feasible as open PD (OPD), and shows some benefit, such as less intraoperative blood loss and shorter postoperative hospital stay. Despite the limited evidence for MIPD in pancreatic cancer, most of the available data show that the short-term oncological adequacy is similar between MIPD and OPD. Some surgical techniques, including the superior mesenteric artery-first approach and laparoscopic pancreatoduodenectomy with major vein resection, are believed to improve the rate of R0 resection. Laparoscopic distal pancreatectomy is less technically demanding and is accepted in more pancreatic centers. It is technically safe and feasible and has a similar short-term oncological prognosis compared with open distal pancreatectomy. PMID:27621576

  3. Inflation with non-minimally derivative coupling

    NASA Astrophysics Data System (ADS)

    Yang, Nan; Gao, Qing; Gong, Yungui

    2015-10-01

    We derive the second order correction to the scalar and tensor spectral tilts for the inflationary models with non-minimally derivative coupling. In the high friction limit, the quartic power law potential is consistent with the observational constraint at 95% CL because the amplitude of the primordial gravitational waves is smaller, and the inflaton excursion is sub-Planckian.

  4. Minimal Mimicry: Mere Effector Matching Induces Preference

    ERIC Educational Resources Information Center

    Sparenberg, Peggy; Topolinski, Sascha; Springer, Anne; Prinz, Wolfgang

    2012-01-01

    Both mimicking and being mimicked induces preference for a target. The present experiments investigate the minimal sufficient conditions for this mimicry-preference link to occur. We argue that mere effector matching between one's own and the other person's movement is sufficient to induce preference, independent of which movement is actually…

  5. MULTIOBJECTIVE PARALLEL GENETIC ALGORITHM FOR WASTE MINIMIZATION

    EPA Science Inventory

    In this research we have developed an efficient multiobjective parallel genetic algorithm (MOPGA) for waste minimization problems. This MOPGA integrates PGAPack (Levine, 1996) and NSGA-II (Deb, 2000) with novel modifications. PGAPack is a master-slave parallel implementation of a...

  6. Minimally Invasive Surgery for Inflammatory Bowel Disease

    PubMed Central

    Holder-Murray, Jennifer; Marsicovetere, Priscilla

    2015-01-01

    Abstract: Surgical management of inflammatory bowel disease is a challenging endeavor given infectious and inflammatory complications, such as fistula and abscess, and complex, often postoperative, anatomy, including adhesive disease from previous open operations. Patients with Crohn's disease and ulcerative colitis also bring to the table the burden of their chronic illness, with anemia, malnutrition, and immunosuppression all common and each contributing independently as a risk factor for increased surgical morbidity in this high-risk population. However, to reduce the physical trauma of surgery, technologic advances and worldwide experience with minimally invasive surgery have allowed laparoscopic management of patients to become the standard of care, with significant short- and long-term patient benefits compared with the open approach. In this review, we will describe the current state of the art of minimally invasive surgery for inflammatory bowel disease and the caveats inherent in this practice in this complex patient population. We will also review the applicability of current and future trends in minimally invasive surgical technique, such as laparoscopic "incisionless," single-incision laparoscopic surgery (SILS), robotic-assisted, and other techniques, for the patient with inflammatory bowel disease. There can be no doubt that minimally invasive surgery has been proven to decrease the short- and long-term burden of surgery for these chronic illnesses and represents high-value care for both patient and society. PMID:25989341

  7. DUPONT CHAMBERS WORKS WASTE MINIMIZATION PROJECT

    EPA Science Inventory

    In a joint U.S. Environmental Protection Agency (EPA) and DuPont waste minimization project, fifteen waste streams were selected for assessment. The intent was to develop assessments diverse in terms of process type, mode of operation, waste type, disposal needed, and relative s...

  8. Practice Enables Successful Learning under Minimal Guidance

    ERIC Educational Resources Information Center

    Brunstein, Angela; Betts, Shawn; Anderson, John R.

    2009-01-01

    Two experiments were conducted, contrasting a minimally guided discovery condition with a variety of instructional conditions. College students interacted with a computer-based tutor that presented algebra-like problems in a novel graphical representation. Although the tutor provided no instruction in a discovery condition, it constrained the…

  9. Minimally invasive pancreatic surgery – a review

    PubMed Central

    Damoli, Isacco; Ramera, Marco; Paiella, Salvatore; Marchegiani, Giovanni; Bassi, Claudio

    2015-01-01

    During the past 20 years the application of a minimally invasive approach to pancreatic surgery has progressively increased. Distal pancreatectomy is the most frequently performed procedure, because of the absence of a reconstructive phase. However, middle pancreatectomy and pancreatoduodenectomy have been demonstrated to be safe and feasible as well. Laparoscopic distal pancreatectomy is recognized as the gold standard treatment for small tumors of the pancreatic body-tail, with several advantages over the traditional open approach in terms of patient recovery. The surgical treatment of lesions of the pancreatic head via a minimally invasive approach is still limited to a few highly experienced surgeons, due to the very challenging resection and complex anastomoses. Middle pancreatectomy and enucleation are indicated for small and benign tumors and offer the maximum preservation of the parenchyma. The introduction of a robotic platform more than ten years ago increased the interest of many surgeons in minimally invasive treatment of pancreatic diseases. This new technology overcomes all the limitations of laparoscopic surgery, but actual benefits for the patients are still under investigation. The increased costs associated with robotic surgery are under debate too. This article presents the state of the art of minimally invasive pancreatic surgery. PMID:26240612

  10. Minimal Interventions in the Teaching of Mathematics

    ERIC Educational Resources Information Center

    Foster, Colin

    2014-01-01

    This paper addresses ways in which mathematics pedagogy can benefit from insights gleaned from counselling. Person-centred counselling stresses the value of genuineness, warm empathetic listening and minimal intervention to support people in solving their own problems and developing increased autonomy. Such an approach contrasts starkly with the…