Science.gov

Sample records for minimal cut-set methodology

  1. Minimal cut-set methodology for artificial intelligence applications

    SciTech Connect

    Weisbin, C.R.; de Saussure, G.; Barhen, J.; Oblow, E.M.; White, J.C.

    1984-01-01

    This paper reviews minimal cut-set theory and illustrates its application with an example. The minimal cut-set approach uses disjunctive normal form in Boolean algebra and various Boolean operators to simplify very complicated tree structures composed of AND/OR gates. The simplification process is automated and performed off-line using existing computer codes to implement the Boolean reduction on the finite but large tree structure. With this approach, on-line expert diagnostic systems whose response time is critical could directly determine whether a goal is achievable by comparing the actual system state to a concisely stored set of preprocessed critical state elements.
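
    The Boolean reduction described above can be sketched in a few lines. The tree encoding and event names below are hypothetical, not taken from the paper: gates expand to disjunctive normal form, and absorption removes any cut set that contains another.

```python
from itertools import product

# Hypothetical AND/OR fault tree: gates map to (operator, children);
# anything not in the dict is a basic event.
def cut_sets(node, tree):
    """Return the node's cut sets (DNF terms) as a set of frozensets."""
    if node not in tree:
        return {frozenset([node])}
    op, children = tree[node]
    parts = [cut_sets(c, tree) for c in children]
    if op == "OR":                       # OR: union of the children's terms
        return set().union(*parts)
    # AND: merge one term from each child (distributive expansion)
    return {frozenset().union(*combo) for combo in product(*parts)}

def minimize(sets):
    """Boolean absorption: drop any cut set that contains another."""
    return {s for s in sets if not any(t < s for t in sets)}

tree = {"TOP": ("AND", ["G1", "B1"]), "G1": ("OR", ["B1", "B2"])}
mcs = minimize(cut_sets("TOP", tree))
# TOP = (B1 OR B2) AND B1 expands to {B1} and {B1, B2}; only {B1} is minimal
```

    This kind of off-line preprocessing is what would let an on-line diagnostic system compare the current system state against the stored minimal cut sets directly.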

  2. Minimal cut-set methodology for artificial intelligence applications

    SciTech Connect

    Weisbin, C.R.; de Saussure, G.; Barhen, J.; Oblow, E.M.; White, J.C.

    1984-01-01

    This paper suggests that given the considerable (and growing) literature of expert systems for diagnostics and maintenance, consideration of the theory of minimal cut sets should be most beneficial. The minimal cut-set approach uses disjunctive normal form in Boolean algebra and various Boolean operators to simplify very complicated tree structures composed of AND/OR gates. The simplification reduces the tree to an equivalent diagram displaying the smallest combinations of independent component failures which could result in the fault symbolized by the root of the tree and called the top event. This paper reviews minimal cut-set theory and illustrates its application with an example. Using this approach, expert diagnostic systems would have a tool in which, with minimum search, the description of fault causes is made clear and explicit, contributor sequences to a top event fault are easily quantified and ranked, and the probability of the top event is easily computed. Finally, the application of minimal cut sets to planning and problem solving is developed.

  3. CUTSETS - MINIMAL CUT SET CALCULATION FOR DIGRAPH AND FAULT TREE RELIABILITY MODELS

    NASA Technical Reports Server (NTRS)

    Iverson, D. L.

    1994-01-01

    Fault tree and digraph models are frequently used for system failure analysis. Both types of models represent a failure space view of the system using AND and OR nodes in a directed graph structure. Fault trees must have a tree structure and do not allow cycles or loops in the graph. Digraphs allow any pattern of interconnection between nodes, including loops and cycles. A common operation performed on digraph and fault tree models is the calculation of minimal cut sets. A cut set is a set of basic failures that could cause a given target failure event to occur. A minimal cut set for a target event node in a fault tree or digraph is any cut set for the node with the property that if any one of the failures in the set is removed, the occurrence of the other failures in the set will not cause the target failure event. CUTSETS will identify all the minimal cut sets for a given node. The CUTSETS package contains programs that solve for minimal cut sets of fault trees and digraphs using object-oriented programming techniques. These cut set codes can be used to solve graph models for reliability analysis and identify potential single point failures in a modeled system. The fault tree minimal cut set code reads in a fault tree model input file with each node listed in a text format. In the input file the user specifies a top node of the fault tree and a maximum cut set size to be calculated. CUTSETS will find minimal sets of basic events which would cause the failure at the output of a given fault tree gate. The program can find all the minimal cut sets of a node, or minimal cut sets up to a specified size. The algorithm performs a recursive top-down parse of the fault tree, starting at the specified top node, and combines the cut sets of each child node into sets of basic event failures that would cause the failure event at the output of that gate. Minimal cut set solutions can be found for all nodes in the fault tree or just for the top node. The digraph cut set code uses the same
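
    The size-bounded recursive combination described above can be sketched as follows, under an assumed input format (a gate dictionary standing in for the program's text input files); this is an illustration of the technique, not the CUTSETS code itself.

```python
from itertools import product

# Hypothetical gate table standing in for a CUTSETS fault tree input file.
def bounded_cut_sets(node, tree, max_size):
    """Minimal cut sets of `node` with at most `max_size` basic events."""
    if node not in tree:                          # basic event
        return {frozenset([node])}
    op, children = tree[node]
    parts = [bounded_cut_sets(c, tree, max_size) for c in children]
    if op == "OR":
        out = set().union(*parts)
    else:                                         # AND gate
        out = {frozenset().union(*c) for c in product(*parts)}
    out = {s for s in out if len(s) <= max_size}  # prune oversized sets early
    return {s for s in out if not any(t < s for t in out)}

tree = {"TOP": ("AND", ["A", "G1"]),
        "G1":  ("OR",  ["B", "G2"]),
        "G2":  ("AND", ["C", "D"])}
small = bounded_cut_sets("TOP", tree, max_size=2)
# the full solution is {A, B} and {A, C, D}; the size-2 bound keeps only {A, B}
```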

  4. A new efficient algorithm generating all minimal S-T cut-sets in a graph-modeled network

    NASA Astrophysics Data System (ADS)

    Malinowski, Jacek

    2016-06-01

    A new algorithm finding all minimal s-t cut-sets in a graph-modeled network with failing links and nodes is presented. It is based on the analysis of the tree of acyclic s-t paths connecting a given pair of nodes in the considered structure. The construction of such a tree is required by many existing algorithms for s-t cut-sets generation in order to eliminate "stub" edges or subgraphs through which no acyclic path passes. The algorithm operates on the acyclic paths tree alone, i.e. no other analysis of the network's topology is necessary. It can be applied to both directed and undirected graphs, as well as partly directed ones. It is worth noting that the cut-sets can be composed of both links and nodes, while many known algorithms do not take nodes into account, which is quite restricting from the practical point of view. The developed cut-sets generation technique makes the algorithm significantly faster than most of the previous methods, as demonstrated by experiments.
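
    For very small networks, the minimal s-t cut-sets in question, with both links and nodes allowed to fail, can be enumerated by brute force. The toy graph below is made up for illustration; Malinowski's path-tree algorithm is far faster than this exhaustive check.

```python
from itertools import combinations

def connected(s, t, nodes, edges, failed):
    """Is t reachable from s in the undirected graph after `failed` elements fail?"""
    if s in failed or t in failed:
        return False
    up = nodes - failed
    live = [e for e in edges if e not in failed and e[0] in up and e[1] in up]
    seen, stack = {s}, [s]
    while stack:
        u = stack.pop()
        if u == t:
            return True
        for a, b in live:
            v = b if a == u else a if b == u else None
            if v is not None and v not in seen:
                seen.add(v)
                stack.append(v)
    return False

def minimal_st_cutsets(s, t, nodes, edges):
    """All minimal failure sets (links and internal nodes) disconnecting s and t."""
    elems = sorted(edges | (nodes - {s, t}), key=str)
    found = []
    for k in range(1, len(elems) + 1):
        for combo in combinations(elems, k):
            c = set(combo)
            if any(f <= c for f in found):
                continue                  # superset of a smaller cut set
            if not connected(s, t, nodes, edges, c):
                found.append(c)
    return found

nodes = {"s", "x", "t"}
edges = {("s", "x"), ("x", "t"), ("s", "t")}
cuts = minimal_st_cutsets("s", "t", nodes, edges)
# three minimal cut sets, e.g. {("s","t"), "x"}; node x counts as a failure
```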

  5. FTA Basic Event & Cut Set Ranking.

    Energy Science and Technology Software Center (ESTSC)

    1999-05-04

    Version 00 IMPORTANCE computes various measures of probabilistic importance of basic events and minimal cut sets to a fault tree or reliability network diagram. The minimal cut sets, the failure rates and the fault duration times (i.e., the repair times) of all basic events contained in the minimal cut sets are supplied as input data. The failure and repair distributions are assumed to be exponential. IMPORTANCE, a quantitative evaluation code, then determines the probability of the top event and computes the importance of minimal cut sets and basic events by a numerical ranking. Two measures are computed. The first describes system behavior at one point in time; the second describes sequences of failures that cause the system to fail in time. All measures are computed assuming statistical independence of basic events. In addition, system unavailability and expected number of system failures are computed by the code.
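
    As a hedged sketch of the kind of quantitative ranking such a code performs (the data, names, and the rare-event approximation below are illustrative, not the actual IMPORTANCE algorithm):

```python
# Illustrative only: top-event probability and Fussell-Vesely importance
# from minimal cut sets and basic-event unavailabilities q, assuming
# statistically independent basic events.
def cut_prob(cut, q):
    p = 1.0
    for event in cut:
        p *= q[event]
    return p

def rank(cuts, q):
    probs = {frozenset(c): cut_prob(c, q) for c in cuts}
    p_top = sum(probs.values())          # rare-event approximation
    fv = {e: sum(p for c, p in probs.items() if e in c) / p_top for e in q}
    return p_top, fv

cuts = [{"A"}, {"B", "C"}]
q = {"A": 0.01, "B": 0.1, "C": 0.05}
p_top, fv = rank(cuts, q)
# p_top is about 0.015; event A dominates the ranking with FV(A) of about 0.67
```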

  6. SIGPI. Fault Tree Cut Set System Performance

    SciTech Connect

    Patenaude, C.J.

    1992-01-14

    SIGPI computes the probabilistic performance of complex systems by combining cut set or other binary product data with probability information on each basic event. SIGPI is designed to work with either coherent systems, where the system fails when certain combinations of components fail, or noncoherent systems, where at least one cut set occurs only if at least one component of the system is operating properly. The program can handle conditionally independent components, dependent components, or a combination of component types and has been used to evaluate responses to environmental threats and seismic events. The three data types that can be input are cut set data in disjoint normal form, basic component probabilities for independent basic components, and mean and covariance data for statistically dependent basic components.

  8. Fault Tree Cut Set System Performance.

    Energy Science and Technology Software Center (ESTSC)

    2000-02-21

    Version 00 SIGPI computes the probabilistic performance of complex systems by combining cut set or other binary product data with probability information on each basic event. SIGPI is designed to work with either coherent systems, where the system fails when certain combinations of components fail, or noncoherent systems, where at least one cut set occurs only if at least one component of the system is operating properly. The program can handle conditionally independent components, dependent components, or a combination of component types and has been used to evaluate responses to environmental threats and seismic events. The three data types that can be input are cut set data in disjoint normal form, basic component probabilities for independent basic components, and mean and covariance data for statistically dependent basic components.

  9. Cut set-based risk and reliability analysis for arbitrarily interconnected networks

    DOEpatents

    Wyss, Gregory D.

    2000-01-01

    Method for computing all-terminal reliability for arbitrarily interconnected networks such as the United States public switched telephone network. The method includes an efficient search algorithm to generate minimal cut sets for nonhierarchical networks directly from the network connectivity diagram. Efficiency of the search algorithm stems in part from its consideration of link failures only. The method also includes a novel quantification scheme that likewise reduces the computational effort associated with assessing network reliability based on traditional risk importance measures. Vast reductions in computational effort are realized since combinatorial expansion and subsequent Boolean reduction steps are eliminated through analysis of network segmentations, using a technique of assuming node failures to occur on only one side of a break in the network and repeating the technique for all minimal cut sets generated with the search algorithm. The method functions equally well for planar and non-planar networks.

  10. Energy minimization in medical image analysis: Methodologies and applications.

    PubMed

    Zhao, Feng; Xie, Xianghua

    2016-02-01

    Energy minimization is of particular interest in medical image analysis. In the past two decades, a variety of optimization schemes have been developed. In this paper, we present a comprehensive survey of the state-of-the-art optimization approaches. These algorithms are mainly classified into two categories: continuous methods and discrete methods. The former include the Newton-Raphson, gradient descent, conjugate gradient, proximal gradient, coordinate descent, and genetic algorithm-based methods, while the latter cover the graph cuts, belief propagation, tree-reweighted message passing, linear programming, maximum margin learning, simulated annealing, and iterated conditional modes methods. We also discuss the minimal surface method, the primal-dual method, and the multi-objective optimization method. In addition, we review several comparative studies that evaluate the performance of different minimization techniques in terms of accuracy, efficiency, or complexity. These optimization techniques are widely used in many medical applications, for example, image segmentation, registration, reconstruction, motion tracking, and compressed sensing. We thus give an overview of those applications as well. Copyright © 2015 John Wiley & Sons, Ltd. PMID:26186171
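
    As a toy instance of the continuous class (not drawn from the survey), gradient descent minimizing a simple quadratic energy:

```python
# Minimize the toy energy E(x) = (x - 3)^2 by gradient descent.
# The function, step size, and iteration count are illustrative.
def gradient_descent(grad, x0, lr=0.1, steps=200):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)     # step against the gradient
    return x

x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
# converges to the minimizer x = 3
```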

  11. Knowledge-based and model-based hybrid methodology for comprehensive waste minimization in electroplating plants

    NASA Astrophysics Data System (ADS)

    Luo, Keqin

    1999-11-01

    The electroplating industry, with over 10,000 plating plants nationwide, is one of the major waste generators in industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are generated daily in plants, which costs the industry tremendously in waste treatment and disposal and hinders its further development. It is therefore urgent for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste while maintaining production competitiveness. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. These form the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows a thorough process analysis of bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner.
The unique contribution of this research is that it is the first time for the electroplating industry to (i) use systematically available WM strategies, (ii) know quantitatively and

  12. A methodology for formulating a minimal uncertainty model for robust control system design and analysis

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Chang, B.-C.; Fischl, Robert

    1989-01-01

    In the design and analysis of robust control systems for uncertain plants, the technique of formulating what is termed an M-delta model has become widely accepted and applied in the robust control literature. The M represents the transfer function matrix M(s) of the nominal system, and delta represents an uncertainty matrix acting on M(s). The uncertainty can arise from various sources, such as structured uncertainty from parameter variations or multiple unstructured uncertainties from unmodeled dynamics and other neglected phenomena. In general, delta is a block diagonal matrix, and for real parameter variations the diagonal elements are real. As stated in the literature, this structure can always be formed for any linear interconnection of inputs, outputs, transfer functions, parameter variations, and perturbations. However, very little of the literature addresses methods for obtaining this structure, and none of this literature addresses a general methodology for obtaining a minimal M-delta model for a wide class of uncertainty. Since having a delta matrix of minimum order would improve the efficiency of structured singular value (or multivariable stability margin) computations, a method of obtaining a minimal M-delta model would be useful. A generalized method of obtaining a minimal M-delta structure for systems with real parameter variations is given.
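
    As a sketch of the structure in question (standard robust-control notation, assumed here rather than quoted from the paper), the uncertain plant is the upper linear fractional transformation of M(s) and the block-diagonal delta:

```latex
% Upper LFT of the nominal interconnection M(s) and the uncertainty \Delta
y = F_u(M,\Delta)\,u, \qquad
F_u(M,\Delta) = M_{22} + M_{21}\,\Delta\,(I - M_{11}\Delta)^{-1}\,M_{12},
\qquad \Delta = \operatorname{diag}(\delta_1 I_{r_1}, \ldots, \delta_n I_{r_n})
```

    A minimal M-delta model is then one whose delta block has the smallest total dimension, which directly reduces the cost of the structured singular value computation mentioned above.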

  13. Using benchmarking to minimize common DOE waste streams. Volume 1, Methodology and liquid photographic waste

    SciTech Connect

    Levin, V.

    1994-04-01

    Finding innovative ways to reduce waste streams generated at Department of Energy (DOE) sites by 50% by the year 2000 is a challenge for DOE's waste minimization efforts. This report examines the usefulness of benchmarking as a waste minimization tool, specifically regarding common waste streams at DOE sites. A team of process experts from a variety of sites, a project leader, and benchmarking consultants completed the project with management support provided by the Waste Minimization Division EM-352. Using a 12-step benchmarking process, the team examined current waste minimization processes for liquid photographic waste used at their sites and used telephone and written questionnaires to find "best-in-class" industry partners willing to share information about their best waste minimization techniques and technologies through a site visit. Eastman Kodak Co. and Johnson Space Center/National Aeronautics and Space Administration (NASA) agreed to be partners. The site visits yielded strategies for source reduction, recycle/recovery of components, regeneration/reuse of solutions, and treatment of residuals, as well as best management practices. An additional benefit of the work was the opportunity for DOE process experts to network and exchange ideas with their peers at similar sites.

  14. Methodology to optimize detector geometry in fluorescence tomography of tissue using the minimized curvature of the summed diffuse sensitivity projections.

    PubMed

    Holt, Robert W; Leblond, Frederic L; Pogue, Brian W

    2013-08-01

    The dependence of the sensitivity function in fluorescence tomography on the geometry of the excitation source and detection locations can severely influence an imaging system's ability to recover fluorescent distributions. Here a methodology for choosing imaging configuration based on the uniformity of the sensitivity function is presented. The uniformity of detection sensitivity is correlated with reconstruction accuracy in silico, and reconstructions in a murine head model show that a detector configuration optimized using Nelder-Mead minimization improves recovery over uniformly sampled tomography. PMID:24323220

  15. A minimally invasive methodology based on morphometric parameters for day 2 embryo quality assessment.

    PubMed

    Molina, Inmaculada; Lázaro-Ibáñez, Elisa; Pertusa, Jose; Debón, Ana; Martínez-Sanchís, Juan Vicente; Pellicer, Antonio

    2014-10-01

    The risk of multiple pregnancy to maternal-fetal health can be minimized by reducing the number of embryos transferred. New tools for selecting embryos with the highest implantation potential should be developed. The aim of this study was to evaluate the ability of morphological and morphometric variables to predict implantation by analysing images of embryos. This was a retrospective study of 135 embryo photographs from 112 IVF-ICSI cycles carried out between January and March 2011. The embryos were photographed immediately before transfer using Cronus 3 software. Their images were analysed using the public program ImageJ. Significant effects (P < 0.05), and higher discriminant power to predict implantation were observed for the morphometric embryo variables compared with morphological ones. The features for successfully implanted embryos were as follows: four cells on day 2 of development; all blastomeres with circular shape (roundness factor greater than 0.9), an average zona pellucida thickness of 13 µm and an average of 17695.1 µm² for the embryo area. Embryo size, which is described by its area and the average roundness factor for each cell, provides two objective variables to consider when predicting implantation. This approach should be further investigated for its potential ability to improve embryo scoring. PMID:25154014
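
    One common morphometric "roundness factor" (assumed here for illustration; the study's exact definition may differ) is the isoperimetric ratio, which equals 1.0 for a perfect circle and falls below the paper's 0.9 threshold as a blastomere outline becomes elongated:

```python
import math

# Isoperimetric roundness: 4*pi*Area / Perimeter^2 (illustrative definition).
def roundness(area, perimeter):
    return 4 * math.pi * area / perimeter ** 2

# A circular blastomere of radius 75 um scores exactly 1.0.
r = 75.0
round_circle = roundness(math.pi * r ** 2, 2 * math.pi * r)
```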

  16. Ensuring transparency and minimization of methodologic bias in preclinical pain research: PPRECISE considerations.

    PubMed

    Andrews, Nick A; Latrémolière, Alban; Basbaum, Allan I; Mogil, Jeffrey S; Porreca, Frank; Rice, Andrew S C; Woolf, Clifford J; Currie, Gillian L; Dworkin, Robert H; Eisenach, James C; Evans, Scott; Gewandter, Jennifer S; Gover, Tony D; Handwerker, Hermann; Huang, Wenlong; Iyengar, Smriti; Jensen, Mark P; Kennedy, Jeffrey D; Lee, Nancy; Levine, Jon; Lidster, Katie; Machin, Ian; McDermott, Michael P; McMahon, Stephen B; Price, Theodore J; Ross, Sarah E; Scherrer, Grégory; Seal, Rebecca P; Sena, Emily S; Silva, Elizabeth; Stone, Laura; Svensson, Camilla I; Turk, Dennis C; Whiteside, Garth

    2016-04-01

    There is growing concern about lack of scientific rigor and transparent reporting across many preclinical fields of biological research. Poor experimental design and lack of transparent reporting can result in conscious or unconscious experimental bias, producing results that are not replicable. The Analgesic, Anesthetic, and Addiction Clinical Trial Translations, Innovations, Opportunities, and Networks (ACTTION) public-private partnership with the U.S. Food and Drug Administration sponsored a consensus meeting of the Preclinical Pain Research Consortium for Investigating Safety and Efficacy (PPRECISE) Working Group. International participants from universities, funding agencies, government agencies, industry, and a patient advocacy organization attended. Reduction of publication bias, increasing the ability of others to faithfully repeat experimental methods, and increased transparency of data reporting were specifically discussed. Parameters deemed essential to increase confidence in the published literature were clear, specific reporting of an a priori hypothesis and definition of the primary outcome measure. Whether power calculations, and the measurement of minimal meaningful effect sizes needed to perform them, should be a core component of the preclinical research effort provoked considerable discussion, with many but not all agreeing. Greater transparency of reporting should be driven by scientists, journal editors, reviewers, and grant funders. The conduct of high-quality science that is fully reported should not preclude novelty and innovation in preclinical pain research, and indeed, any efforts that curtail such innovation would be misguided. We believe that to achieve the goal of finding effective new treatments for patients with pain, the pain field needs to deal with these challenging issues. PMID:26683237

  17. Ensuring transparency and minimization of methodologic bias in preclinical pain research: PPRECISE considerations

    PubMed Central

    Andrews, Nick A.; Latrémolière, Alban; Basbaum, Allan I.; Mogil, Jeffrey S.; Porreca, Frank; Rice, Andrew S.C.; Woolf, Clifford J.; Currie, Gillian L.; Dworkin, Robert H.; Eisenach, James C.; Evans, Scott; Gewandter, Jennifer S.; Gover, Tony D.; Handwerker, Hermann; Huang, Wenlong; Iyengar, Smriti; Jensen, Mark P.; Kennedy, Jeffrey D.; Lee, Nancy; Levine, Jon; Lidster, Katie; Machin, Ian; McDermott, Michael P.; McMahon, Stephen B.; Price, Theodore J.; Ross, Sarah E.; Scherrer, Grégory; Seal, Rebecca P.; Sena, Emily S.; Silva, Elizabeth; Stone, Laura; Svensson, Camilla I.; Turk, Dennis C.; Whiteside, Garth

    2015-01-01

    There is growing concern about lack of scientific rigor and transparent reporting across many preclinical fields of biological research. Poor experimental design and lack of transparent reporting can result in conscious or unconscious experimental bias, producing results that are not replicable. The Analgesic, Anesthetic, and Addiction Clinical Trial Translations, Innovations, Opportunities, and Networks (ACTTION) public–private partnership with the U.S. Food and Drug Administration sponsored a consensus meeting of the Preclinical Pain Research Consortium for Investigating Safety and Efficacy (PPRECISE) Working Group. International participants from universities, funding agencies, government agencies, industry, and a patient advocacy organization attended. Reduction of publication bias, increasing the ability of others to faithfully repeat experimental methods, and increased transparency of data reporting were specifically discussed. Parameters deemed essential to increase confidence in the published literature were clear, specific reporting of an a priori hypothesis and definition of the primary outcome measure. Whether power calculations, and the measurement of minimal meaningful effect sizes needed to perform them, should be a core component of the preclinical research effort provoked considerable discussion, with many but not all agreeing. Greater transparency of reporting should be driven by scientists, journal editors, reviewers, and grant funders. The conduct of high-quality science that is fully reported should not preclude novelty and innovation in preclinical pain research, and indeed, any efforts that curtail such innovation would be misguided. We believe that to achieve the goal of finding effective new treatments for patients with pain, the pain field needs to deal with these challenging issues. PMID:26683237

  18. Towards uniform accelerometry analysis: a standardization methodology to minimize measurement bias due to systematic accelerometer wear-time variation.

    PubMed

    Katapally, Tarun R; Muhajarine, Nazeem

    2014-05-01

    Accelerometers are predominantly used to objectively measure the entire range of activity intensities - sedentary behaviour (SED), light physical activity (LPA) and moderate to vigorous physical activity (MVPA). However, studies consistently report results without accounting for systematic accelerometer wear-time variation (within and between participants), jeopardizing the validity of these results. This study describes the development of a standardization methodology to understand and minimize measurement bias due to wear-time variation. Accelerometry is generally conducted over seven consecutive days, with participants' data being commonly considered 'valid' only if wear-time is at least 10 hours/day. However, even within 'valid' data, there could be systematic wear-time variation. To explore this variation, accelerometer data of Smart Cities, Healthy Kids study (www.smartcitieshealthykids.com) were analyzed descriptively and with repeated measures multivariate analysis of variance (MANOVA). Subsequently, a standardization method was developed, where case-specific observed wear-time is controlled to an analyst specified time period. Next, case-specific accelerometer data are interpolated to this controlled wear-time to produce standardized variables. To understand discrepancies owing to wear-time variation, all analyses were conducted pre- and post-standardization. Descriptive analyses revealed systematic wear-time variation, both between and within participants. Pre- and post-standardized descriptive analyses of SED, LPA and MVPA revealed a persistent and often significant trend of wear-time's influence on activity. SED was consistently higher on weekdays before standardization; however, this trend was reversed post-standardization. Even though MVPA was significantly higher on weekdays both pre- and post-standardization, the magnitude of this difference decreased post-standardization. Multivariable analyses with standardized SED, LPA and MVPA as outcome
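
    The core of the standardization idea can be caricatured in a few lines. The scaling below is a deliberately simple linear version under assumed variable names; the paper's interpolation procedure is more elaborate.

```python
# Rescale a participant-day's activity minutes from the observed wear-time
# to an analyst-specified controlled wear-time (simplified linear version).
def standardize(minutes, observed_wear, controlled_wear):
    scale = controlled_wear / observed_wear
    return {intensity: m * scale for intensity, m in minutes.items()}

day = {"SED": 400.0, "LPA": 150.0, "MVPA": 50.0}   # 600 min of observed wear
std = standardize(day, observed_wear=600.0, controlled_wear=720.0)
# each intensity is scaled by 720/600 = 1.2, so the day totals 720 min
```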

  19. Relay chatter and operator response after a large earthquake: An improved PRA methodology with case studies

    SciTech Connect

    Budnitz, R.J.; Lambert, H.E.; Hill, E.E.

    1987-08-01

    The purpose of this project has been to develop and demonstrate improvements in the PRA methodology used for analyzing earthquake-induced accidents at nuclear power reactors. Specifically, the project addresses methodological weaknesses in the PRA systems analysis used for studying post-earthquake relay chatter and for quantifying human response under high stress. An improved PRA methodology for relay-chatter analysis is developed, and its use is demonstrated through analysis of the Zion-1 and LaSalle-2 reactors as case studies. This demonstration analysis is intended to show that the methodology can be applied in actual cases; the numerical values of core-damage frequency are not intended to be realistic. The analysis relies on SSMRP-based methodologies and databases. For both Zion-1 and LaSalle-2, assuming that loss of offsite power (LOSP) occurs after a large earthquake and that there are no operator recovery actions, the analysis finds very many combinations (Boolean minimal cut sets) involving chatter of three or four relays and/or pressure switch contacts. The number of min-cut-set combinations is so large that there is a very high likelihood (of the order of unity) that at least one combination will occur after earthquake-caused LOSP. This conclusion depends in detail on the fragility curves and response assumptions used for chatter. Core-damage frequencies are calculated, but they are probably pessimistic because assuming zero credit for operator recovery is pessimistic. The project has also developed an improved PRA methodology for quantifying operator error under high-stress conditions such as after a large earthquake. Single-operator and multiple-operator error rates are developed, and a case study involving an 8-step procedure (establishing feed-and-bleed in a PWR after an earthquake-initiated accident) is used to demonstrate the methodology.

  20. Endovascular treatment for Small Core and Anterior circulation Proximal occlusion with Emphasis on minimizing CT to recanalization times (ESCAPE) trial: methodology.

    PubMed

    Demchuk, Andrew M; Goyal, Mayank; Menon, Bijoy K; Eesa, Muneer; Ryckborst, Karla J; Kamal, Noreen; Patil, Shivanand; Mishra, Sachin; Almekhlafi, Mohammed; Randhawa, Privia A; Roy, Daniel; Willinsky, Robert; Montanera, Walter; Silver, Frank L; Shuaib, Ashfaq; Rempel, Jeremy; Jovin, Tudor; Frei, Donald; Sapkota, Biggya; Thornton, J Michael; Poppe, Alexandre; Tampieri, Donatella; Lum, Cheemun; Weill, Alain; Sajobi, Tolulope T; Hill, Michael D

    2015-04-01

    ESCAPE is a prospective, multicenter, randomized clinical trial that will enroll subjects with the following main inclusion criteria: less than 12 h from symptom onset, age > 18, baseline NIHSS > 5, ASPECTS score of > 5 and CTA evidence of carotid T/L or M1 segment MCA occlusion, and at least moderate collaterals by CTA. The trial will determine if endovascular treatment will result in higher rates of favorable outcome compared with standard medical therapy alone. Eligible patient populations include those receiving IV tPA, those ineligible for tPA, and those with unwitnessed-onset or wake-up strokes within 12 h of last seen normal. The primary end-point, based on intention-to-treat criteria, is the distribution of modified Rankin Scale scores at 90 days assessed using a proportional odds model. The projected maximum sample size is 500 subjects. Randomization is stratified under a minimization process using age, gender, baseline NIHSS, baseline ASPECTS (8-10 vs. 6-7), IV tPA treatment and occlusion location (ICA vs. MCA) as covariates. The study will have one formal interim analysis after 300 subjects have been accrued. Secondary end-points at 90 days include the following: mRS 0-1; mRS 0-2; Barthel 95-100, EuroQOL and a cognitive battery. Safety outcomes are symptomatic ICH, major bleeding, contrast nephropathy, total radiation dose, malignant MCA infarction, hemicraniectomy and mortality at 90 days. PMID:25546514

  1. Up-cycling waste glass to minimal water adsorption/absorption lightweight aggregate by rapid low temperature sintering: optimization by dual process-mixture response surface methodology.

    PubMed

    Velis, Costas A; Franco-Salinas, Claudia; O'Sullivan, Catherine; Najorka, Jens; Boccaccini, Aldo R; Cheeseman, Christopher R

    2014-07-01

    Mixed color waste glass extracted from municipal solid waste is either not recycled, in which case it is an environmental and financial liability, or it is used in relatively low value applications such as normal weight aggregate. Here, we report on converting it into a novel glass-ceramic lightweight aggregate (LWA), potentially suitable for high added value applications in structural concrete (upcycling). The artificial LWA particles were formed by rapidly sintering (<10 min) waste glass powder with clay mixes using sodium silicate as binder and borate salt as flux. Composition and processing were optimized using response surface methodology (RSM) modeling, and specifically (i) a combined process-mixture dual RSM, and (ii) multiobjective optimization functions. The optimization considered raw materials and energy costs. Mineralogical and physical transformations occur during sintering and a cellular vesicular glass-ceramic composite microstructure is formed, with strong correlations existing between bloating/shrinkage during sintering, density and water adsorption/absorption. The diametrical expansion could be effectively modeled via the RSM and controlled to meet a wide range of specifications; here we optimized for LWA structural concrete. The optimally designed LWA is sintered at comparatively low temperatures (825-835 °C), thus potentially saving costs and lowering emissions; it had exceptionally low water adsorption/absorption (6.1-7.2% w/wd; optimization target: 1.5-7.5% w/wd); while remaining substantially lightweight (density: 1.24-1.28 g·cm⁻³; target: 0.9-1.3 g·cm⁻³). This is a considerable advancement for designing effective environmentally friendly lightweight concrete constructions, and boosting resource efficiency of waste glass flows. PMID:24871934

  2. Minimal Reduplication

    ERIC Educational Resources Information Center

    Kirchner, Jesse Saba

    2010-01-01

    This dissertation introduces Minimal Reduplication, a new theory and framework within generative grammar for analyzing reduplication in human language. I argue that reduplication is an emergent property in multiple components of the grammar. In particular, reduplication occurs independently in the phonology and syntax components, and in both cases…

  3. Taxonomic minimalism.

    PubMed

Beattie, A J; Oliver, I

    1994-12-01

    Biological surveys are in increasing demand while taxonomic resources continue to decline. How much formal taxonomy is required to get the job done? The answer depends on the kind of job but it is possible that taxonomic minimalism, especially (1) the use of higher taxonomic ranks, (2) the use of morphospecies rather than species (as identified by Latin binomials), and (3) the involvement of taxonomic specialists only for training and verification, may offer advantages for biodiversity assessment, environmental monitoring and ecological research. As such, formal taxonomy remains central to the process of biological inventory and survey but resources may be allocated more efficiently. For example, if formal identification is not required, resources may be concentrated on replication and increasing sample sizes. Taxonomic minimalism may also facilitate the inclusion in these activities of important but neglected groups, especially among the invertebrates, and perhaps even microorganisms. PMID:21236933

  4. Minimal cosmography

    NASA Astrophysics Data System (ADS)

    Piazza, Federico; Schücker, Thomas

    2016-04-01

    The minimal requirement for cosmography—a non-dynamical description of the universe—is a prescription for calculating null geodesics, and time-like geodesics as a function of their proper time. In this paper, we consider the most general linear connection compatible with homogeneity and isotropy, but not necessarily with a metric. A light-cone structure is assigned by choosing a set of geodesics representing light rays. This defines a "scale factor" and a local notion of distance, as that travelled by light in a given proper time interval. We find that the velocities and relativistic energies of free-falling bodies decrease in time as a consequence of cosmic expansion, but at a rate that can be different than that dictated by the usual metric framework. By extrapolating this behavior to photons' redshift, we find that the latter is in principle independent of the "scale factor". Interestingly, redshift-distance relations and other standard geometric observables are modified in this extended framework, in a way that could be experimentally tested. An extremely tight constraint on the model, however, is represented by the blackbody-ness of the cosmic microwave background. Finally, as a check, we also consider the effects of a non-metric connection in a different set-up, namely, that of a static, spherically symmetric spacetime.

  5. Esophagectomy - minimally invasive

    MedlinePlus

    Minimally invasive esophagectomy; Robotic esophagectomy; Removal of the esophagus - minimally invasive; Achalasia - esophagectomy; Barrett esophagus - esophagectomy; Esophageal cancer - esophagectomy - laparoscopic; Cancer of the ...

  6. Regional Shelter Analysis Methodology

    SciTech Connect

    Dillon, Michael B.; Dennison, Deborah; Kane, Jave; Walker, Hoyt; Miller, Paul

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
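The combination step the abstract describes (building protection factors weighted by where people are, by posture and time of day) can be sketched as a population-weighted average of dose fractions. The protection factors and occupancy fractions below are hypothetical round numbers, not LLNL's data:

```python
# Minimal sketch of combining building protection with population distribution.
# A protection factor (PF) of 10 means occupants receive 1/10 of the outdoor
# fallout gamma dose. All numbers here are hypothetical.

def effective_dose_fraction(population_fractions, protection_factors):
    """Population-averaged fraction of the outdoor fallout dose received."""
    assert abs(sum(population_fractions) - 1.0) < 1e-9
    return sum(p / pf for p, pf in zip(population_fractions, protection_factors))

# Three building categories: light frame (PF 3), masonry (PF 20), basement/
# large building core (PF 100). Night vs. day changes where people are.
night = effective_dose_fraction([0.6, 0.3, 0.1], [3.0, 20.0, 100.0])
day = effective_dose_fraction([0.3, 0.5, 0.2], [3.0, 20.0, 100.0])
print(night, day)  # in this toy example the daytime population is better protected
```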

  7. Minimal change disease

    MedlinePlus

    Minimal change nephrotic syndrome; Nil disease; Lipoid nephrosis; Idiopathic nephrotic syndrome of childhood ... which filter blood and produce urine. In minimal change disease, there is damage to the glomeruli. These ...

  8. Minimal change disease

    MedlinePlus

    ... seen under a very powerful microscope called an electron microscope. Minimal change disease is the most common ... biopsy and examination of the tissue with an electron microscope can show signs of minimal change disease.

  9. Minimally Invasive Valve Surgery

    PubMed Central

    Pope, Nicolas H.; Ailawadi, Gorav

    2014-01-01

    Cardiac valve surgery is life saving for many patients. The advent of minimally invasive surgical techniques has historically allowed for improvement in both post-operative convalescence and important clinical outcomes. The development of minimally invasive cardiac valve repair and replacement surgery over the past decade is poised to revolutionize the care of cardiac valve patients. Here, we present a review of the history and current trends in minimally invasive aortic and mitral valve repair and replacement, including the development of sutureless bioprosthetic valves. PMID:24797148

  10. Inverse Modeling Via Linearized Functional Minimization

    NASA Astrophysics Data System (ADS)

    Barajas-Solano, D. A.; Wohlberg, B.; Vesselinov, V. V.; Tartakovsky, D. M.

    2014-12-01

    We present a novel parameter estimation methodology for transient models of geophysical systems with uncertain, spatially distributed, heterogeneous, and piece-wise continuous parameters. The methodology employs a Bayesian approach to pose an inverse modeling problem for the spatial configuration of the model parameters. The likelihood of the configuration is formulated using sparse measurements of both model parameters and transient states. We propose using total variation (TV) regularization as the prior, reflecting the heterogeneous, piece-wise continuity assumption on the parameter distribution. The maximum a posteriori (MAP) estimator of the parameter configuration is then computed by minimizing the negative Bayesian log-posterior using a linearized functional minimization approach. The computation of the MAP estimator is a large-dimensional nonlinear minimization problem with two sources of nonlinearity: (1) the TV operator, and (2) the nonlinear relation between states and parameters provided by the model's governing equations. We propose a hybrid linearized functional minimization (LFM) algorithm in two stages to efficiently treat both sources of nonlinearity. The relation between states and parameters is linearized, resulting in a linear minimization sub-problem equipped with the TV operator; this sub-problem is then minimized using the Alternating Direction Method of Multipliers (ADMM). The methodology is illustrated with a transient saturated groundwater flow application in a synthetic domain, stimulated by external point-wise loadings representing aquifer pumping, together with an array of discrete measurements of hydraulic conductivity and transient measurements of hydraulic head. We show that our inversion strategy is able to recover the overall large-scale features of the parameter configuration, and that the reconstruction is improved by the addition of transient information of the state variable.
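The TV-prior/ADMM combination the abstract describes can be illustrated on a toy problem: 1D TV-regularized denoising of a piecewise-constant signal (standing in for the spatially distributed, piece-wise continuous parameter field). This is a sketch of the generic solver pattern, not the paper's groundwater inversion:

```python
import numpy as np

# Toy TV-regularized problem solved with ADMM (scaled form):
#   min_x  0.5 * ||x - y||^2 + lam * ||D x||_1,   D = first differences.
# The z-subproblem is elementwise soft thresholding, as in standard ADMM.

def tv_denoise_admm(y, lam=0.5, rho=2.0, n_iter=200):
    n = y.size
    D = np.diff(np.eye(n), axis=0)         # (n-1, n) first-difference operator
    A = np.eye(n) + rho * D.T @ D          # fixed x-update system matrix
    z = np.zeros(n - 1)
    u = np.zeros(n - 1)                    # scaled dual variable
    for _ in range(n_iter):
        x = np.linalg.solve(A, y + rho * D.T @ (z - u))
        w = D @ x + u
        z = np.sign(w) * np.maximum(np.abs(w) - lam / rho, 0.0)  # soft threshold
        u += D @ x - z
    return x

rng = np.random.default_rng(1)
truth = np.concatenate([np.zeros(40), np.ones(40)])   # piecewise-constant "field"
y = truth + rng.normal(0, 0.1, truth.size)            # noisy observations
x = tv_denoise_admm(y)
print(np.abs(x - truth).max())   # edges preserved, noise suppressed
```

The TV prior keeps the jump at the midpoint sharp while flattening the noise, which is exactly the behavior the paper relies on to recover piece-wise continuous parameter configurations.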

  11. Prostate resection - minimally invasive

    MedlinePlus

    ... are: Erection problems (impotence) No symptom improvement Passing semen back into your bladder instead of out through ... Whelan JP, Goeree L. Systematic review and meta-analysis of transurethral resection of the prostate versus minimally ...

  12. Minimizing Shortness of Breath

    MedlinePlus


  13. Minimally invasive hip replacement

    MedlinePlus

    ... Smits SA, Swinford RR, Bahamonde RE. A randomized, prospective study of 3 minimally invasive surgical approaches in total hip arthroplasty: comprehensive gait analysis. J Arthroplasty . 2008;23:68-73. PMID: 18722305 ...

  14. Minimal Orderings Revisited

    SciTech Connect

    Peyton, B.W.

    1999-07-01

    When minimum orderings proved too difficult to deal with, Rose, Tarjan, and Lueker instead studied minimal orderings and how to compute them (Algorithmic aspects of vertex elimination on graphs, SIAM J. Comput., 5:266-283, 1976). This paper introduces an algorithm that is capable of computing much better minimal orderings much more efficiently than the algorithm in Rose et al. The new insight is a way to use certain structures and concepts from modern sparse Cholesky solvers to re-express one of the basic results in Rose et al. The new algorithm begins with any initial ordering and then refines it until a minimal ordering is obtained. It is simple to obtain high-quality, low-cost minimal orderings by using fill-reducing heuristic orderings as initial orderings for the algorithm. We examine several such initial orderings in some detail.
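The quantity a minimal ordering controls is the fill produced by vertex elimination. A short sketch of computing that fill for a given ordering (this is the objective being minimized, not the paper's refinement algorithm):

```python
# Fill computation for vertex elimination on an undirected graph:
# eliminating a vertex makes its remaining neighbors a clique; edges added
# in that step that were not already present are "fill".

def fill_in(n, edges, order):
    """Return the set of fill edges created by eliminating vertices in `order`."""
    adj = {v: set() for v in range(n)}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    fill = set()
    for v in order:
        nbrs = list(adj[v])
        for i in range(len(nbrs)):
            for j in range(i + 1, len(nbrs)):
                a, b = nbrs[i], nbrs[j]
                if b not in adj[a]:          # new edge => fill
                    adj[a].add(b)
                    adj[b].add(a)
                    fill.add((min(a, b), max(a, b)))
        for w in nbrs:                       # remove v from the graph
            adj[w].discard(v)
        del adj[v]
    return fill

# Path 0-1-2-3: eliminating interior vertices first creates fill,
# while eliminating from the ends is a perfect (zero-fill) ordering.
path = [(0, 1), (1, 2), (2, 3)]
print(fill_in(4, path, [1, 2, 0, 3]))   # two fill edges: (0,2) and (0,3)
print(fill_in(4, path, [0, 1, 2, 3]))   # no fill
```

An ordering is *minimal* when no ordering whose fill is a proper subset of this fill set exists; the paper's algorithm refines an initial ordering until that holds.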

  15. Minimalism. Clip and Save.

    ERIC Educational Resources Information Center

    Hubbard, Guy

    2002-01-01

    Provides background information on the art movement called "Minimalism" discussing why it started and its characteristics. Includes learning activities and information on the artist, Donald Judd. Includes a reproduction of one of his art works and discusses its content. (CMK)

  16. Testing methodologies

    SciTech Connect

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical "signal-to-noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational, or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes on the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.
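The "signal-to-noise" limitation can be made concrete with Poisson counting statistics: aberrations occur spontaneously at a background rate, and a low dose only raises that rate slightly, so detecting it requires scoring enough cells for the excess to exceed counting noise. A rough sketch with hypothetical rates (not measured values):

```python
import math

# Poisson counting-statistics view of the "signal-to-noise" problem.
# Rates below are hypothetical per-cell aberration frequencies.

def cells_needed(bg_rate, induced_rate, z=1.645):
    """Cells to score so the induced excess exceeds z sigma of background noise.

    Excess counts in n cells: n * induced_rate; background standard
    deviation (Poisson): sqrt(n * bg_rate). Requiring
    n * induced_rate >= z * sqrt(n * bg_rate) and solving for n gives:
    """
    return math.ceil(z**2 * bg_rate / induced_rate**2)

# Hypothetical: background of 1 aberration per 1000 cells. A small dose adding
# 0.5 per 1000 cells needs vastly more scoring than one adding 5 per 1000.
print(cells_needed(0.001, 0.0005))   # low dose: on the order of 10^4 cells
print(cells_needed(0.001, 0.005))    # higher dose: on the order of 10^2 cells
```

The quadratic dependence on the induced rate is why occupational-level exposures are detectable while doses near a disposal-site limit are problematical, as the abstract notes.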

  17. Minimally invasive procedures

    PubMed Central

    Baltayiannis, Nikolaos; Michail, Chandrinos; Lazaridis, George; Anagnostopoulos, Dimitrios; Baka, Sofia; Mpoukovinas, Ioannis; Karavasilis, Vasilis; Lampaki, Sofia; Papaiwannou, Antonis; Karavergou, Anastasia; Kioumis, Ioannis; Pitsiou, Georgia; Katsikogiannis, Nikolaos; Tsakiridis, Kosmas; Rapti, Aggeliki; Trakada, Georgia; Zissimopoulos, Athanasios; Zarogoulidis, Konstantinos

    2015-01-01

    Minimally invasive procedures, which include laparoscopic surgery, use state-of-the-art technology to reduce the damage to human tissue when performing surgery. Minimally invasive procedures require small “ports” from which the surgeon inserts thin tubes called trocars. Carbon dioxide gas may be used to inflate the area, creating a space between the internal organs and the skin. Then a miniature camera (usually a laparoscope or endoscope) is placed through one of the trocars so the surgical team can view the procedure as a magnified image on video monitors in the operating room. Specialized equipment is inserted through the trocars based on the type of surgery. There are some advanced minimally invasive surgical procedures that can be performed almost exclusively through a single point of entry—meaning only one small incision, like the “uniportal” video-assisted thoracoscopic surgery (VATS). Not only do these procedures usually provide equivalent outcomes to traditional “open” surgery (which sometimes requires a large incision), but minimally invasive procedures (using small incisions) may offer significant benefits as well: (I) faster recovery; (II) shorter hospital stays; (III) less scarring and (IV) less pain. In our current mini review we will present the minimally invasive procedures for thoracic surgery. PMID:25861610

  18. Minimally Invasive Radiofrequency Devices.

    PubMed

    Sadick, Neil; Rothaus, Kenneth O

    2016-07-01

    This article reviews minimally invasive radiofrequency options for skin tightening, focusing on describing their mechanism of action and clinical profile in terms of safety and efficacy and presenting peer-reviewed articles associated with the specific technologies. Treatments offered by minimally invasive radiofrequency devices (fractional, microneedling, temperature-controlled) are increasing in popularity due to the dramatic effects they can have without requiring skin excision, downtime, or even extreme financial burden from the patient's perspective. Clinical applications thus far have yielded impressive results in treating signs of the aging face and neck, either as stand-alone or as postoperative maintenance treatments. PMID:27363771

  19. Ways To Minimize Bullying.

    ERIC Educational Resources Information Center

    Mueller, Mary Ellen; Parisi, Mary Joy

    This report delineates a series of interventions aimed at minimizing incidences of bullying in a suburban elementary school. The social services staff was scheduled to initiate an anti-bullying incentive in fall 2001 due to the increased occurrences of bullying during the prior year. The target population consisted of third- and fourth-grade…

  20. Periodic minimal surfaces

    NASA Astrophysics Data System (ADS)

    Mackay, Alan L.

    1985-04-01

    A minimal surface is one for which, like a soap film with the same pressure on each side, the mean curvature is zero and, thus, is one where the two principal curvatures are equal and opposite at every point. For every closed circuit in the surface, the area is a minimum. Schwarz [1] and Neovius [2] showed that elements of such surfaces could be put together to give surfaces periodic in three dimensions. These periodic minimal surfaces are geometrical invariants, as are the regular polyhedra, but the former are curved. Minimal surfaces are appropriate for the description of various structures where internal surfaces are prominent and seek to adopt a minimum area or a zero mean curvature subject to their topology; thus they merit more complete numerical characterization. There seem to be at least 18 such surfaces [3], with various symmetries and topologies, related to the crystallographic space groups. Recently, glyceryl mono-oleate (GMO) was shown by Longley and McIntosh [4] to take the shape of the F-surface. The structure postulated is shown here to be in good agreement with an analysis of the fundamental geometry of periodic minimal surfaces.
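The defining condition stated above, together with a well-known level-set (nodal) approximation to the triply periodic Schwarz P surface, can be written as (standard differential-geometry facts, not taken from the abstract itself):

```latex
% Minimal surface condition: vanishing mean curvature, i.e. equal and
% opposite principal curvatures at every point of the surface.
H \;=\; \tfrac{1}{2}\,(\kappa_1 + \kappa_2) \;=\; 0
\quad\Longleftrightarrow\quad
\kappa_1 = -\kappa_2 .

% A widely used level-set (nodal) approximation to the triply periodic
% Schwarz P surface:
\cos x + \cos y + \cos z = 0 .
```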

  1. Minimally invasive pancreatic surgery.

    PubMed

    Yiannakopoulou, E

    2015-12-01

    Minimally invasive pancreatic surgery is feasible and safe. Laparoscopic distal pancreatectomy should be widely adopted for benign lesions of the pancreas. Laparoscopic pancreaticoduodenectomy, although technically demanding, has a number of advantages in the setting of pancreatic ductal adenocarcinoma, including shorter hospital stay and faster recovery, allowing patients to recover in a timelier manner and pursue adjuvant treatment options. Furthermore, it seems that progression-free survival is longer in patients undergoing laparoscopic pancreaticoduodenectomy than in those undergoing open pancreaticoduodenectomy. Minimally invasive middle pancreatectomy seems appropriate for benign or borderline tumors of the neck of the pancreas. Technological advances, including intraoperative ultrasound and intraoperative fluorescence imaging systems, are expected to facilitate the wide adoption of minimally invasive pancreatic surgery. Although the oncological outcome seems similar to that of open surgery, concerns remain, as the majority of relevant evidence comes from retrospective studies. Large multicenter randomized studies comparing laparoscopic with open pancreatectomy, as well as robot-assisted with both open and laparoscopic approaches, are needed. The robotic approach could possibly prove less invasive than the conventional laparoscopic approach through less traumatic intra-abdominal handling of tissues. In addition, the robotic approach could enable wide adoption of the technique by surgeons who are not extensively trained in advanced laparoscopic surgery. A putative clinical benefit of minimally invasive pancreatic surgery could be an attenuated surgical stress response, leading to reduced morbidity and mortality, as well as avoidance of detrimental immunosuppressive effects, especially for oncological patients. PMID:26530291

  2. The Minimal Era

    ERIC Educational Resources Information Center

    Van Ness, Wilhelmina

    1974-01-01

    Described the development of Minimal Art, a composite name that has been applied to the scattering of bland, bleak, non-objective fine arts painting and sculpture forms that proliferated slightly mysteriously in the middle 1960's as Pop Art began to decline. (Author/RK)

  3. Waste Minimization Crosscut Plan

    SciTech Connect

    Not Available

    1992-05-13

    On November 27, 1991, the Secretary of Energy directed that a Department of Energy (DOE) crosscut plan for waste minimization (WMin) be prepared and submitted by March 1, 1992. This Waste Minimization Crosscut Plan responds to the Secretary's direction and supports the National Energy Strategy (NES) goals of achieving greater energy security, increasing energy and economic efficiency, and enhancing environmental quality. It provides a DOE-wide planning framework for effective coordination of all DOE WMin activities. This Plan was jointly prepared by the following Program Secretarial Officer (PSO) organizations: Civilian Radioactive Waste Management (RW); Conservation and Renewable Energy (CE); Defense Programs (DP); Environmental Restoration and Waste Management (EM), lead; Energy Research (ER); Fossil Energy (FE); Nuclear Energy (NE); and New Production Reactors (NP). Assistance and guidance was provided by the offices of Policy, Planning, and Analysis (PE) and Environment, Safety and Health (EH). Comprehensive application of waste minimization within the Department and in both the public and private sectors will provide significant benefits and support National Energy Strategy goals. These benefits include conservation of a substantial proportion of the energy now used by industry and Government, improved environmental quality, reduced health risks, improved production efficiencies, and longer useful life of disposal capacity. Taken together, these benefits will mean improved US global competitiveness, expanded job opportunities, and a better quality of life for all citizens.

  4. Minimally invasive radioguided parathyroidectomy.

    PubMed

    Costello, D; Norman, J

    1999-07-01

    The last decade has been characterized by an emphasis on minimizing interventional techniques, hospital stays, and overall costs of patient care. It is clear that most patients with sporadic HPT do not require a complete neck exploration. We now know that a minimal approach is appropriate for this disease. Importantly, the MIRP technique can be applied to most patients with sporadic HPT and can be performed by surgeons with modest advanced training. The use of a gamma probe as a surgical tool converts the sestamibi to a functional and anatomical scan, eliminating the need for any other preoperative localizing study. Quantification of the radioactivity within the removed gland eliminates the need for routine frozen section histologic examination and obviates the need for costly intraoperative parathyroid hormone measurements. This radioguided technique allows the benefit of local anesthesia, dramatically reduces operative times, eliminates postoperative blood tests, provides a smaller scar, requires minimal time spent in the hospital, and almost assures a rapid, near pain-free recovery. This combination is beneficial to the patient while helping achieve a reduction in overall costs. PMID:10448697

  5. Minimally invasive mediastinal surgery

    PubMed Central

    Melfi, Franca M. A.; Mussi, Alfredo

    2016-01-01

    In the past, mediastinal surgery was associated with the necessity of a maximum exposure, which was accomplished through various approaches. In the early 1990s, many surgical fields, including thoracic surgery, observed the development of minimally invasive techniques. These included video-assisted thoracic surgery (VATS), which confers clear advantages over an open approach, such as less trauma, short hospital stay, increased cosmetic results and preservation of lung function. However, VATS is associated with several disadvantages. For this reason, it is not routinely performed for resection of mediastinal mass lesions, especially those located in the anterior mediastinum, a tiny and remote space that contains vital structures at risk of injury. Robotic systems can overcome the limits of VATS, offering three-dimensional (3D) vision and wristed instrumentations, and are being increasingly used. With regards to thymectomy for myasthenia gravis (MG), unilateral and bilateral VATS approaches have demonstrated good long-term neurologic results with low complication rates. Nevertheless, some authors still advocate the necessity of maximum exposure, especially when considering the distribution of normal and ectopic thymic tissue. In recent studies, the robotic approach has shown to provide similar neurological outcomes when compared to transsternal and VATS approaches, and is associated with a low morbidity. Importantly, through a unilateral robotic technique, it is possible to dissect and remove at least the same amount of mediastinal fat tissue. Preliminary results on early-stage thymomatous disease indicated that minimally invasive approaches are safe and feasible, with a low rate of pleural recurrence, underlining the necessity of a “no-touch” technique. However, especially for thymomatous disease characterized by an indolent nature, further studies with long follow-up period are necessary in order to assess oncologic and neurologic results through minimally

  6. Minimally invasive mediastinal surgery.

    PubMed

    Melfi, Franca M A; Fanucchi, Olivia; Mussi, Alfredo

    2016-01-01

    In the past, mediastinal surgery was associated with the necessity of a maximum exposure, which was accomplished through various approaches. In the early 1990s, many surgical fields, including thoracic surgery, observed the development of minimally invasive techniques. These included video-assisted thoracic surgery (VATS), which confers clear advantages over an open approach, such as less trauma, short hospital stay, increased cosmetic results and preservation of lung function. However, VATS is associated with several disadvantages. For this reason, it is not routinely performed for resection of mediastinal mass lesions, especially those located in the anterior mediastinum, a tiny and remote space that contains vital structures at risk of injury. Robotic systems can overcome the limits of VATS, offering three-dimensional (3D) vision and wristed instrumentations, and are being increasingly used. With regards to thymectomy for myasthenia gravis (MG), unilateral and bilateral VATS approaches have demonstrated good long-term neurologic results with low complication rates. Nevertheless, some authors still advocate the necessity of maximum exposure, especially when considering the distribution of normal and ectopic thymic tissue. In recent studies, the robotic approach has shown to provide similar neurological outcomes when compared to transsternal and VATS approaches, and is associated with a low morbidity. Importantly, through a unilateral robotic technique, it is possible to dissect and remove at least the same amount of mediastinal fat tissue. Preliminary results on early-stage thymomatous disease indicated that minimally invasive approaches are safe and feasible, with a low rate of pleural recurrence, underlining the necessity of a "no-touch" technique. However, especially for thymomatous disease characterized by an indolent nature, further studies with long follow-up period are necessary in order to assess oncologic and neurologic results through minimally invasive

  7. Minimally refined biomass fuel

    DOEpatents

    Pearson, Richard K.; Hirschfeld, Tomas B.

    1984-01-01

    A minimally refined fluid composition, suitable as a fuel mixture and derived from biomass material, is comprised of one or more water-soluble carbohydrates such as sucrose, one or more alcohols having less than four carbons, and water. The carbohydrate provides the fuel source; water solubilizes the carbohydrates; and the alcohol aids in the combustion of the carbohydrate and reduces the viscosity of the carbohydrate/water solution. Because less energy is required to obtain the carbohydrate from the raw biomass than to produce alcohol, an overall energy savings is realized compared to fuels employing alcohol as the primary fuel.

  8. Wake Vortex Minimization

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A status report is presented on research directed at reducing the vortex disturbances of aircraft wakes. The objective of such a reduction is to minimize the hazard to smaller aircraft that might encounter these wakes. Inviscid modeling was used to study trailing vortices and viscous effects were investigated. Laser velocimeters were utilized in the measurement of aircraft wakes. Flight and wind tunnel tests were performed on model-scale and full-scale aircraft of various designs. Parameters investigated included the effect of wing span, wing flaps, spoilers, splines and engine thrust on vortex attenuation. Results indicate that vortices may be alleviated through aerodynamic means.

  9. The ZOOM minimization package

    SciTech Connect

    Fischler, Mark S.; Sachs, D.; /Fermilab

    2004-11-01

    A new object-oriented Minimization package is available for distribution in the same manner as CLHEP. This package, designed for use in HEP applications, has all the capabilities of Minuit, but is a re-write from scratch, adhering to modern C++ design principles. A primary goal of this package is extensibility in several directions, so that its capabilities can be kept fresh with as little maintenance effort as possible. This package is distinguished by the priority that was assigned to C++ design issues, and the focus on producing an extensible system that will resist becoming obsolete.
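The package itself is C++, and its actual API is not shown in the abstract; as a language-neutral illustration of the extensible design goal it describes (a shared minimization driver with pluggable stepping strategies), here is a toy sketch in Python. All class and method names are invented, not ZOOM's API:

```python
# Toy illustration of an extensible object-oriented minimizer design:
# the base class owns the convergence loop; new algorithms subclass it
# and supply only a `step` method. Names are hypothetical, not ZOOM's.

class Minimizer:
    """Base class: subclasses supply `step`; the driver logic is shared."""

    def __init__(self, fn, grad):
        self.fn, self.grad = fn, grad

    def step(self, x):
        raise NotImplementedError

    def minimize(self, x0, tol=1e-8, max_iter=10_000):
        x = x0
        for _ in range(max_iter):
            x_new = self.step(x)
            if abs(self.fn(x_new) - self.fn(x)) < tol:
                return x_new
            x = x_new
        return x

class GradientDescent(Minimizer):
    """One concrete strategy; others plug in without touching the driver."""

    def __init__(self, fn, grad, lr=0.1):
        super().__init__(fn, grad)
        self.lr = lr

    def step(self, x):
        return x - self.lr * self.grad(x)

# Minimize (x - 3)^2, whose minimum is at x = 3.
gd = GradientDescent(lambda x: (x - 3.0) ** 2, lambda x: 2.0 * (x - 3.0))
print(gd.minimize(0.0))   # converges to approximately 3.0
```

Separating the driver from the stepping strategy is one way a package can stay extensible "with as little maintenance effort as possible", as the abstract puts it: adding an algorithm means adding a subclass, not rewriting the loop.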

  10. Minimally Invasive Parathyroidectomy

    PubMed Central

    Starker, Lee F.; Fonseca, Annabelle L.; Carling, Tobias; Udelsman, Robert

    2011-01-01

    Minimally invasive parathyroidectomy (MIP) is an operative approach for the treatment of primary hyperparathyroidism (pHPT). Currently, routine use of improved preoperative localization studies, cervical block anesthesia in the conscious patient, and intraoperative parathyroid hormone analyses aid in guiding surgical therapy. MIP requires less surgical dissection causing decreased trauma to tissues, can be performed safely in the ambulatory setting, and is at least as effective as standard cervical exploration. This paper reviews advances in preoperative localization, anesthetic techniques, and intraoperative management of patients undergoing MIP for the treatment of pHPT. PMID:21747851

  11. Minimizing hazardous waste

    SciTech Connect

    DeClue, S.C.

    1996-06-01

    Hazardous waste minimization is a broad term often associated with pollution prevention, saving the environment or protecting Mother Earth. Some associate hazardous waste minimization with saving money. Thousands of hazardous materials are used in processes every day, but when these hazardous materials become hazardous wastes, dollars must be spent for disposal. When hazardous waste is reduced, an organization will spend less money on hazardous waste disposal. In 1993, Fort Bragg reduced its hazardous waste generation by over 100,000 pounds and spent nearly $90,000 less on hazardous waste disposal costs than in 1992. Fort Bragg generates a variety of wastes: Vehicle maintenance wastes such as antifreeze, oil, grease and solvents; helicopter maintenance wastes, including solvents, adhesives, lubricants and paints; communication operation wastes such as lithium, magnesium, mercury and nickel-cadmium batteries; chemical defense wastes detection, decontamination, and protective mask filters. The Hazardous Waste Office has the responsibility to properly identify, characterize, classify and dispose of these waste items in accordance with US Environmental Protection Agency (EPA) and US Department of Transportation (DOT) regulations.

  12. Microbiological methodology in astrobiology

    NASA Astrophysics Data System (ADS)

    Abyzov, S. S.; Gerasimenko, L. M.; Hoover, R. B.; Mitskevich, I. N.; Mulyukin, A. L.; Poglazova, M. N.; Rozanov, A. Y.

    2005-09-01

    Searching for life in astromaterials to be delivered from the future missions to extraterrestrial bodies is undoubtedly related to studies of the properties and signatures of living microbial cells and microfossils on Earth. The Antarctic glacier and Earth permafrost habitats, where living microbial cells preserved viability for millennia years due to entering the anabiotic state, are often regarded as terrestrial analogs of Martian polar subsurface layers. For the future findings of viable microorganisms in samples from extraterrestrial objects, it is important to use a combined methodology that includes classical microbiological methods, plating onto nutrient media, direct epifluorescence and electron microscopy examinations, detection of the elemental composition of cells, PCR and FISH methods. Of great importance is to ensure authenticity of microorganisms (if any in studied samples) and to standardize the protocols used to minimize a risk of external contamination. Although the convincing evidence of extraterrestrial microbial life will may come from the discovery of living cells in astromaterials, biomorphs and microfossils must also be regarded as a target in search of life evidence bearing in mind a scenario that living microorganisms had not been preserved and underwent mineralization. Regarding the vital importance of distinguishing between biogenic and abiogenic signatures and between living and fossil microorganisms in analyzed samples, it is worthwhile to use previously developed approaches based on electron microscopy examinations and analysis of elemental composition of biomorphs in situ.

  13. Minimal noise subsystems

    NASA Astrophysics Data System (ADS)

    Wang, Xiaoting; Byrd, Mark; Jacobs, Kurt

    2016-03-01

    A system subjected to noise contains a decoherence-free subspace or subsystem (DFS) only if the noise possesses an exact symmetry. Here we consider noise models in which a perturbation breaks a symmetry of the noise, so that if S is a DFS under a given noise process, it is no longer so under the new perturbed noise process. We ask whether there is a subspace or subsystem that is more robust to the perturbed noise than S. To answer this question we develop a numerical method that allows us to search for subspaces or subsystems that are maximally robust to arbitrary noise processes. We apply this method to a number of examples, and find that a subsystem that is a DFS is often not the subsystem that experiences minimal noise when the symmetry of the noise is broken by a perturbation. We discuss which classes of noise have this property.
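
    The symmetry-breaking effect described above can be illustrated numerically. The sketch below is hypothetical (not the authors' method or code): it takes the standard two-qubit collective-dephasing example, whose DFS is spanned by |01⟩ and |10⟩, and scores a candidate subspace by the Frobenius norm of the noise operator's non-trivial action on it. An exact DFS scores zero; a perturbed, asymmetric noise operator does not.

```python
import numpy as np

# Two-qubit collective dephasing: the noise operator Z1 + Z2 has a DFS
# spanned by |01> and |10>. A standard textbook example, used here only
# to illustrate scoring a subspace's robustness to a noise operator.
Z = np.diag([1.0, -1.0])
I2 = np.eye(2)

def subspace_noise(L, basis):
    """Score how strongly noise operator L acts non-trivially on span(basis)."""
    dim = L.shape[0]
    P = basis @ basis.conj().T                  # projector onto the subspace
    Ls = basis.conj().T @ L @ basis             # noise restricted to the subspace
    Ls = Ls - (np.trace(Ls) / Ls.shape[0]) * np.eye(Ls.shape[0])  # drop harmless global part
    leak = (np.eye(dim) - P) @ L @ P            # leakage out of the subspace
    return np.linalg.norm(Ls) + np.linalg.norm(leak)

L_sym = np.kron(Z, I2) + np.kron(I2, Z)         # symmetric (collective) dephasing
basis = np.zeros((4, 2))
basis[1, 0] = 1.0                               # |01>
basis[2, 1] = 1.0                               # |10>
print(subspace_noise(L_sym, basis))             # 0.0: exact DFS

L_pert = np.kron(Z, I2) + 1.1 * np.kron(I2, Z)  # perturbation breaks the symmetry
print(subspace_noise(L_pert, basis))            # > 0: no longer decoherence-free
```

    A numerical search of the kind the abstract describes would minimize such a score over candidate subspaces, rather than fixing the basis in advance.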

  14. Minimal quiver standard model

    SciTech Connect

    Berenstein, David; Pinansky, Samuel

    2007-05-01

    This paper discusses the minimal quiver gauge theory embedding of the standard model that could arise from brane world type string theory constructions. It is based on the low energy effective field theory of D branes in the perturbative regime. The model differs from the standard model by the addition of one extra massive gauge boson, and contains only one additional parameter to the standard model: the mass of this new particle. The coupling of this new particle to the standard model is uniquely determined by input from the standard model and consistency conditions of perturbative string theory. We also study some aspects of the phenomenology of this model and bounds on its possible observation at the Large Hadron Collider.

  15. A perturbation technique for shield weight minimization

    SciTech Connect

    Watkins, E.F.; Greenspan, E.

    1993-01-01

    The radiation shield optimization code SWAN (Ref. 1) was originally developed for minimizing the thickness of a shield that will meet a given dose (or other) constraint, or for extremizing a performance parameter of interest (e.g., maximizing energy multiplication or minimizing dose) while maintaining a shield volume constraint. The SWAN optimization process proved to be highly effective (e.g., see Refs. 2, 3, and 4). The purpose of this work is to investigate the applicability of the SWAN methodology to problems in which the weight, rather than the volume, is the relevant shield characteristic. Such problems are encountered in shield design for space nuclear power systems. The investigation is carried out using SWAN with the coupled neutron-photon cross-section library FLUNG (Ref. 5).

  16. [Minimally invasive breast surgery].

    PubMed

    Mátrai, Zoltán; Gulyás, Gusztáv; Kunos, Csaba; Sávolt, Akos; Farkas, Emil; Szollár, András; Kásler, Miklós

    2014-02-01

    Due to developments in medical science and industrial technology, minimally invasive procedures have appeared in the surgery of benign and malignant breast diseases. In general, such interventions result in significantly reduced breast and chest wall scars, shorter hospitalization, and less pain, but they require specific, expensive devices and longer surgical times compared to open surgery. Furthermore, indications and oncological safety have not been established yet. It is quite likely that minimally invasive surgical procedures with high-tech devices - similar to other surgical subspecialties - will gradually become popular and may even form part of routine breast surgery. Vacuum-assisted core biopsy with a therapeutic indication is suitable for the removal of benign fibroadenomas leaving behind an almost invisible scar, while endoscopically assisted skin-sparing and nipple-sparing mastectomy, axillary staging, and reconstruction with a latissimus dorsi muscle flap are all feasible through the same short axillary incision. Endoscopic techniques are also suitable for the diagnosis and treatment of intracapsular complications of implant-based breast reconstructions (intracapsular fluid, implant rupture, capsular contracture) and for the biopsy of intracapsular lesions with uncertain pathology. Assessing the role of radiofrequency ablation of breast tumors requires further hands-on experience, but it is likely that it can serve as a replacement for surgical removal of a portion of primary tumors in the future, owing to developments in functional imaging and anticancer drugs. With the reduction of the price of ductoscopes, routine examination of the ductal branch system, guided microdochectomy, and targeted surgical removal of terminal ducto-lobular units or a "sick lobe" as an anatomical unit may become feasible. The paper presents the experience of the authors and provides a literature review, for the first time in the Hungarian language, on the subject. Orv. Hetil.

  17. Minimally invasive parathyroid surgery

    PubMed Central

    Noureldine, Salem I.; Gooi, Zhen

    2015-01-01

    Traditionally, bilateral cervical exploration for localization of all four parathyroid glands and removal of any that are grossly enlarged has been the standard surgical treatment for primary hyperparathyroidism (PHPT). With the advances in preoperative localization studies and greater public demand for less invasive procedures, novel targeted, minimally invasive approaches to the parathyroid glands have been described and practiced over the past 2 decades. Minimally invasive parathyroidectomy (MIP) can be done either through the standard Kocher incision, a smaller midline incision, with video assistance (purely endoscopic and video-assisted techniques), or through an ectopically placed, extracervical incision. In current practice, once PHPT is diagnosed, preoperative evaluation using high-resolution radiographic imaging to localize the offending parathyroid gland is essential if MIP is to be considered. The imaging study results suggest where the surgeon should begin the focused procedure and serve as a road map to allow tailoring of an efficient, imaging-guided dissection while eliminating the unnecessary dissection of multiple glands or a bilateral exploration. Intraoperative parathyroid hormone (IOPTH) levels may be measured during the procedure, or a gamma probe used during radioguided parathyroidectomy, to ascertain that the correct gland has been excised and that no other hyperfunctional tissue is present. MIP has many advantages over the traditional bilateral, four-gland exploration. MIP can be performed using local anesthesia, requires less operative time, results in fewer complications, and offers an improved cosmetic result and greater patient satisfaction. Additional advantages of MIP are earlier hospital discharge and decreased overall associated costs. This article aims to address the considerations for accomplishing MIP, including the role of preoperative imaging studies, intraoperative adjuncts, and surgical techniques. PMID:26425454

  18. Minimal Marking: A Success Story

    ERIC Educational Resources Information Center

    McNeilly, Anne

    2014-01-01

    The minimal-marking project conducted in Ryerson's School of Journalism throughout 2012 and early 2013 resulted in significantly higher grammar scores in two first-year classes of minimally marked university students when compared to two traditionally marked classes. The "minimal-marking" concept (Haswell, 1983), which requires…

  19. Minimal complexity control law synthesis

    NASA Technical Reports Server (NTRS)

    Bernstein, Dennis S.; Haddad, Wassim M.; Nett, Carl N.

    1989-01-01

    A paradigm for control law design for modern engineering systems is proposed: minimize control law complexity subject to the achievement of a specified accuracy in the face of a specified level of uncertainty. Correspondingly, the overall goal is to make progress towards the development of a control law design methodology which supports this paradigm. The researchers achieve this goal by developing a general theory of optimal constrained-structure dynamic output feedback compensation, where constrained-structure means that the dynamic-structure (e.g., dynamic order, pole locations, zero locations, etc.) of the output feedback compensation is constrained in some way. By applying this theory in an innovative fashion, where the indicated iteration occurs over the choice of the compensator dynamic-structure, the paradigm stated above can, in principle, be realized. The optimal constrained-structure dynamic output feedback problem is formulated in general terms. An elegant method for reducing optimal constrained-structure dynamic output feedback problems to optimal static output feedback problems is then developed. This reduction procedure makes use of star products, linear fractional transformations, and linear fractional decompositions, and yields as a byproduct a complete characterization of the class of optimal constrained-structure dynamic output feedback problems which can be reduced to optimal static output feedback problems. Issues such as operational/physical constraints, operating-point variations, and processor throughput/memory limitations are considered, and it is shown how anti-windup/bumpless transfer, gain-scheduling, and digital processor implementation can be facilitated by constraining the controller dynamic-structure in an appropriate fashion.

  20. On Modelling Minimal Disease Activity

    PubMed Central

    Jackson, Christopher H.; Su, Li; Gladman, Dafna D.

    2016-01-01

    Objective: To explore methods for statistical modelling of minimal disease activity (MDA) based on data from intermittent clinic visits. Methods: The analysis was based on a 2-state model. Comparisons were made between analyses based on "complete case" data from visits at which MDA status was known, and the use of hidden model methodology that incorporated information from visits at which only some MDA-defining criteria could be established. Analyses were based on an observational psoriatic arthritis cohort. Results: With data from 856 patients and 7,024 clinic visits, analysis was based on virtually all visits, although only 62.6% provided enough information to determine MDA status. Estimated mean times for an episode of MDA varied from 4.18 years to 3.10 years, with the smaller estimates derived from the hidden 2-state model analysis. Over a 10-year period, the estimated expected time spent in MDA episodes of longer than 1 year was 3.90 to 4.22 years, and the probability of having such an MDA episode was estimated to be 0.85 to 0.91, with longer times and greater probabilities seen with the hidden 2-state model analysis. Conclusion: A 2-state model provides a useful framework for the analysis of MDA. Use of data from visits at which MDA status cannot be determined provides more precision, and notable differences are seen in estimated quantities related to MDA episodes based on complete case and hidden 2-state model analyses. The possibility of bias, as well as loss of precision, should be recognized when complete case analyses are used. PMID:26315478
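
    As a sketch of the arithmetic behind such a 2-state model, a continuous-time Markov chain gives the mean episode length and occupancy probabilities in closed form. The rates below are hypothetical placeholders, not estimates from this cohort:

```python
import math

# Hypothetical transition rates (per year) for a 2-state model:
# state 1 = in MDA, state 2 = active disease. Illustrative values only,
# not estimates from the psoriatic arthritis cohort described above.
q12 = 0.25   # rate of leaving an MDA episode
q21 = 0.40   # rate of (re)entering MDA

# Mean length of one MDA episode is 1 / (exit rate)
mean_episode = 1.0 / q12                # 4.0 years

# Long-run fraction of time spent in MDA
pi_mda = q21 / (q12 + q21)              # ~0.615

def p11(t):
    """P(in MDA at time t | in MDA at time 0), closed form for 2 states."""
    s = q12 + q21
    return q21 / s + (q12 / s) * math.exp(-s * t)
```

    Hidden-model analyses of panel data, as in the study, estimate the rates from intermittently observed (and partially known) states rather than assuming them.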

  1. Minimal distances between SCFTs

    NASA Astrophysics Data System (ADS)

    Buican, Matthew

    2014-01-01

    We study lower bounds on the minimal distance in theory space between four-dimensional superconformal field theories (SCFTs) connected via broad classes of renormalization group (RG) flows preserving various amounts of supersymmetry (SUSY). For N = 1 RG flows, the ultraviolet (UV) and infrared (IR) endpoints of the flow can be parametrically close. On the other hand, for RG flows emanating from a maximally supersymmetric SCFT, the distance to the IR theory cannot be arbitrarily small regardless of the amount of (non-trivial) SUSY preserved along the flow. The case of RG flows from N = 2 UV SCFTs is more subtle. We argue that for RG flows preserving the full N = 2 SUSY, there are various obstructions to finding examples with parametrically close UV and IR endpoints. Under reasonable assumptions, these obstructions include: unitarity, known bounds on the c central charge derived from associativity of the operator product expansion, and the central charge bounds of Hofman and Maldacena. On the other hand, for RG flows that break N = 2 → N = 1, it is possible to find IR fixed points that are parametrically close to the UV ones. In this case, we argue that if the UV SCFT possesses a single stress tensor, then such RG flows excite of order all of the degrees of freedom of the UV theory. Furthermore, if the UV theory has some flavor symmetry, we argue that the UV central charges should not be too large relative to certain parameters in the theory.

  2. Swarm robotics and minimalism

    NASA Astrophysics Data System (ADS)

    Sharkey, Amanda J. C.

    2007-09-01

    Swarm Robotics (SR) is closely related to Swarm Intelligence, and both were initially inspired by studies of social insects. Their guiding principles are based on their biological inspiration and take the form of an emphasis on decentralized local control and communication. Earlier studies went a step further in emphasizing the use of simple reactive robots that only communicate indirectly through the environment. More recently SR studies have moved beyond these constraints to explore the use of non-reactive robots that communicate directly, and that can learn and represent their environment. There is no clear agreement in the literature about how far such extensions of the original principles could go. Should there be any limitations on the individual abilities of the robots used in SR studies? Should knowledge of the capabilities of social insects lead to constraints on the capabilities of individual robots in SR studies? There is a lack of explicit discussion of such questions, and researchers have adopted a variety of constraints for a variety of reasons. A simple taxonomy of swarm robotics is presented here with the aim of addressing and clarifying these questions. The taxonomy distinguishes subareas of SR based on the emphases and justifications for minimalism and individual simplicity.

  3. Payload training methodology study

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The results of the Payload Training Methodology Study (PTMS) are documented. Methods and procedures are defined for the development of payload training programs to be conducted at the Marshall Space Flight Center Payload Training Complex (PTC) for the Space Station Freedom program. The study outlines the overall training program concept as well as the six methodologies associated with the program implementation. The program concept outlines the entire payload training program, from initial identification of training requirements to the development of detailed design specifications for simulators and instructional material. The following six methodologies are defined: (1) the Training and Simulation Needs Assessment Methodology; (2) the Simulation Approach Methodology; (3) the Simulation Definition Analysis Methodology; (4) the Simulator Requirements Standardization Methodology; (5) the Simulator Development Verification Methodology; and (6) the Simulator Validation Methodology.

  4. Minimal Higgs inflation

    NASA Astrophysics Data System (ADS)

    Hamada, Yuta; Kawai, Hikaru; Oda, Kin-ya

    2014-02-01

    We consider a possibility that the Higgs field in the Standard Model (SM) serves as an inflaton when its value is around the Planck scale. We assume that the SM is valid up to an ultraviolet cutoff scale Λ, which is slightly below the Planck scale, and that the Higgs potential becomes almost flat above Λ. Contrary to the ordinary Higgs inflation scenario, we do not assume the huge non-minimal coupling, of O(10^4), of the Higgs field to the Ricci scalar. We find that Λ must be less than 5×10^17 GeV in order to explain the observed fluctuation of the cosmic microwave background, no matter how we extrapolate the Higgs potential above Λ. The scale 10^17 GeV coincides with the perturbative string scale, which suggests that the SM is directly connected with string theory. For this to be true, the top quark mass is restricted to around 171 GeV, with which Λ can exceed 10^17 GeV. As a concrete example of the potential above Λ, we propose a simple log-type potential. The predictions of this specific model for the e-foldings N_* = 50-60 are consistent with the current observation, namely, the scalar spectral index is n_s = 0.977-0.983 and the tensor to scalar ratio 0

  5. Microbiological Methodology in Astrobiology

    NASA Technical Reports Server (NTRS)

    Abyzov, S. S.; Gerasimenko, L. M.; Hoover, R. B.; Mitskevich, I. N.; Mulyukin, A. L.; Poglazova, M. N.; Rozanov, A. Y.

    2005-01-01

    Searching for life in astromaterials to be delivered from future missions to extraterrestrial bodies is undoubtedly related to studies of the properties and signatures of living microbial cells and microfossils on Earth. The Antarctic glacier and Earth permafrost habitats, where living microbial cells have preserved viability for millennia by entering the anabiotic state, are often regarded as model terrestrial analogs of Martian polar subsurface layers. For future findings of viable microorganisms in samples from extraterrestrial objects, it is important to use a combined methodology that includes classical microbiological methods, plating onto nutrient media, direct epifluorescence and electron microscopy examinations, detection of the elemental composition of cells, radiolabeling techniques, and PCR and FISH methods. It is of great importance to ensure the authenticity of microorganisms (if any are present in studied samples) and to standardize the protocols used to minimize the risk of external contamination. Although convincing evidence of extraterrestrial microbial life may well come from the discovery of living cells in astromaterials, biomorphs and microfossils must also be regarded as targets in the search for evidence of life, bearing in mind a scenario in which living microorganisms were not preserved and underwent mineralization. Under laboratory conditions, the processes that accompanied the fossilization of cyanobacteria were reconstructed, and artificially produced cyanobacterial stromatolites resemble, in their morphological properties, those found in natural Earth habitats. Given the vital importance of distinguishing between biogenic and abiogenic signatures and between living and fossil microorganisms in analyzed samples, it is worthwhile to use some previously developed approaches based on electron microscopy examinations and analysis of the elemental composition of biomorphs in situ and comparison with the analogous data obtained for laboratory microbial cultures and

  6. Minimizing Launch Mass for ISRU Processes

    NASA Technical Reports Server (NTRS)

    England, C.; Hallinan, K. P.

    2004-01-01

    The University of Dayton and the Jet Propulsion Laboratory are developing a methodology for estimating the Earth launch mass (ELM) of processes for In-Situ Resource Utilization (ISRU), with a focus on lunar resource recovery. ISRU may be enabling for an extended presence on the Moon, for large sample return missions, and for a human presence on Mars. To accomplish these exploration goals, the resources recovered by ISRU must offset the ELM of the recovery process. An appropriate figure of merit is the cost of the exploration mission, which is closely related to ELM. For a given production rate and resource concentration, the lowest ELM - and the best ISRU process - is achieved by minimizing capital equipment for both the ISRU process and energy production. ISRU processes incur Carnot limitations and second-law losses (irreversibilities) that ultimately determine production rate, material utilization, and energy efficiencies. Heat transfer, chemical reactions, and mechanical operations affect the ELM in ways that are best understood by examining the process's detailed energetics. Schemes for chemical and thermal processing that do not incorporate an understanding of second-law losses will be incompletely understood. Our team is developing a methodology that will aid the design and selection of ISRU processes by identifying the impact of thermodynamic losses on ELM. The methodology includes mechanical, thermal, and chemical operations and, when completed, will provide a procedure and rationale for optimizing their design and minimizing their cost. The technique for optimizing ISRU with respect to ELM draws from the work of England and Funk, which relates the cost of endothermic processes to their second-law efficiencies. Our team joins their approach for recovering resources by chemical processing with analysis of thermal and mechanical operations in space. Commercial firms provide cost inputs for ELM and planetary landing. Additional information is included in the
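
    The second-law bookkeeping the abstract refers to can be sketched in a few lines. All numbers below are hypothetical placeholders, not data from the study; the point is only the relation between minimum reversible work, actual work input, and irreversibility losses:

```python
# Illustrative second-law (exergy) bookkeeping for one endothermic ISRU step.
# All quantities are hypothetical placeholders, not data from the study.
delta_g = 480.0     # kJ/mol: minimum reversible work for the processing step
work_in = 900.0     # kJ/mol: actual work drawn from the power system

eta_second_law = delta_g / work_in      # second-law efficiency of the step
losses = work_in - delta_g              # kJ/mol lost to irreversibilities

# Smaller losses mean a smaller power plant for the same production rate,
# and hence a lower Earth launch mass (ELM).
```
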

  7. Influenza SIRS with Minimal Pneumonitis

    PubMed Central

    Erramilli, Shruti; Mannam, Praveen; Manthous, Constantine A.

    2016-01-01

    Although systemic inflammatory response syndrome (SIRS) is a known complication of severe influenza pneumonia, it has been reported very rarely in patients with minimal parenchymal lung disease. We here report a case of severe SIRS, anasarca, and marked vascular phenomena with minimal or no pneumonitis. This case highlights that viruses, including influenza, may cause vascular dysregulation causing SIRS, even without substantial visceral organ involvement.

  8. Guidelines for mixed waste minimization

    SciTech Connect

    Owens, C.

    1992-02-01

    Currently, there is no commercial mixed waste disposal available in the United States. Storage and treatment for commercial mixed waste are limited. Host state and compact region officials are encouraging their mixed waste generators to minimize their mixed wastes because of these management limitations. This document provides a guide to mixed waste minimization.

  9. Waste minimization handbook, Volume 1

    SciTech Connect

    Boing, L.E.; Coffey, M.J.

    1995-12-01

    This technical guide presents various methods used by industry to minimize low-level radioactive waste (LLW) generated during decommissioning and decontamination (D and D) activities. Such activities generate significant amounts of LLW during their operations. Waste minimization refers to any measure, procedure, or technique that reduces the amount of waste generated during a specific operation or project. Preventive waste minimization techniques implemented when a project is initiated can significantly reduce waste. Techniques implemented during decontamination activities reduce the cost of decommissioning. The application of waste minimization techniques is not limited to D and D activities; it is also useful during any phase of a facility's life cycle. This compendium will be supplemented with a second volume of abstracts of hundreds of papers related to minimizing low-level nuclear waste. This second volume is expected to be released in late 1996.

  10. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  11. Minimizing waste in environmental restoration

    SciTech Connect

    Moos, L.; Thuot, J.R.

    1996-07-01

    Environmental restoration, decontamination and decommissioning, and facility dismantlement projects are not typically known for their waste minimization and pollution prevention efforts. Typical projects are driven by schedules and milestones, with little attention given to cost or waste minimization. Conventional wisdom in these projects is that the waste already exists and cannot be reduced or minimized. In fact, however, there are three significant areas where waste and cost can be reduced. Waste reduction can occur in three ways: beneficial reuse or recycling; segregation of waste types; and reducing generation of secondary waste. This paper will discuss several examples of reuse, recycle, segregation, and secondary waste reduction at ANL restoration programs.

  12. Process waste assessment methodology for mechanical departments

    SciTech Connect

    Hedrick, R.B.

    1992-12-01

    Process waste assessments (PWAs) were performed for three pilot processes to develop a methodology for performing PWAs for all the various processes used throughout the mechanical departments. For each PWA, a material balance and a process flow diagram are defined, identifying the raw materials utilized in the process and the quantity and types of materials entering the waste streams from the process. The data and information are used to determine potential options for eliminating hazardous materials or minimizing the wastes generated.
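
    The material-balance step can be sketched as follows. Stream names and quantities are hypothetical, for illustration only; a real PWA would draw them from process records:

```python
# Minimal material-balance sketch for a process waste assessment (PWA).
# Stream names and quantities are hypothetical, for illustration only.
inputs = {"cutting oil": 120.0, "solvent": 80.0, "metal stock": 500.0}   # kg/month
outputs = {"finished parts": 430.0}                                      # kg/month

total_in = sum(inputs.values())
total_out = sum(outputs.values())
waste_generated = total_in - total_out    # kg/month entering waste streams

# Rank raw materials by quantity to flag candidates for substitution or reduction
ranked = sorted(inputs.items(), key=lambda kv: kv[1], reverse=True)
print(waste_generated, ranked[0])         # 270.0 ('metal stock', 500.0)
```
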

  13. Mitral valve surgery - minimally invasive

    MedlinePlus

    ... that does many of these procedures. Minimally invasive heart valve surgery has improved greatly in recent years. These ... WT, Mack MJ. Transcatheter cardiac valve interventions. Surg Clin North Am. 2009;89:951-66. ...

  14. Heart bypass surgery - minimally invasive

    MedlinePlus

    ... in 30-day outcomes in high-risk patients randomized to off-pump versus on-pump coronary bypass ... Thiele H, Neumann-Schniedewind P, Jacobs S, et al. Randomized comparison of minimally invasive direct coronary artery bypass ...

  15. Menopause and Methodological Doubt

    ERIC Educational Resources Information Center

    Spence, Sheila

    2005-01-01

    Menopause and methodological doubt begins by making a tongue-in-cheek comparison between Descartes' methodological doubt and the self-doubt that can arise around menopause. A hermeneutic approach is taken in which Cartesian dualism and its implications for the way women are viewed in society are examined, both through the experiences of women…

  16. Theories and Methodologies.

    ERIC Educational Resources Information Center

    Skemp, Richard R.

    Provided is an examination of the methodology used to study the problems of learning addition and subtraction skills used by developmental researchers. The report has sections on categories of theory and their methodologies, which review: (1) Behaviorist, Neo-Behaviorist and Piagetian Theories; (2) the Behaviorist and Piagetian Paradigms; (3)…

  17. The Methodology of Magpies

    ERIC Educational Resources Information Center

    Carter, Susan

    2014-01-01

    Arts/Humanities researchers frequently do not explain methodology overtly; instead, they "perform" it through their use of language, textual and historic cross-reference, and theory. Here, methodologies from literary studies are shown to add to Higher Education (HE) an exegetical and critically pluralist approach. This includes…

  18. Data Centric Development Methodology

    ERIC Educational Resources Information Center

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly been developed by adopting a software methodology, such as waterfall or the Rational Unified Process, as the framework for development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…

  19. Minimizing pollutants with multimedia strategies

    SciTech Connect

    Phillips, J.B.; Hindawi, M.A.

    1997-01-01

    A multimedia approach to pollution prevention that focuses on minimizing or eliminating the production of pollutants is one of the most advantageous strategies to adopt in preparing an overall facility environmental plan. If processes are optimized to preclude or minimize the manufacture of streams containing pollutants, or to reduce the levels of pollutants in waste streams, then the task of multimedia pollution prevention becomes more manageable simply as a result of a smaller problem needing to be addressed. An orderly and systematic approach to waste minimization can result in a comprehensive strategy to reduce the production of waste streams and simultaneously improve the profitability of a process or industrial operation. There are a number of miscellaneous strategies for waste minimization that attack the problem via process chemistry or engineering. Examples include installation of low-NOx burners, selection of valves that minimize fugitive emissions, high-level switches on storage tanks, the use of in-plant stills for recycling and reusing solvents, and using water-based products instead of hydrocarbon-based products wherever possible. Other waste minimization countermeasures can focus on operations and maintenance (O&M) issues.

  20. Specialized minimal PDFs for optimized LHC calculations

    NASA Astrophysics Data System (ADS)

    Carrazza, Stefano; Forte, Stefano; Kassabov, Zahari; Rojo, Juan

    2016-04-01

    We present a methodology for the construction of parton distribution functions (PDFs) designed to provide an accurate representation of PDF uncertainties for specific processes or classes of processes with a minimal number of PDF error sets: specialized minimal PDF sets, or SM-PDFs. We construct these SM-PDFs in such a way that sets corresponding to different input processes can be combined without losing information, specifically as regards their correlations, and that they are robust upon smooth variations of the kinematic cuts. The proposed strategy never discards information, so that the SM-PDF sets can be enlarged by the addition of new processes, until the prior PDF set is eventually recovered for a large enough set of processes. We illustrate the method by producing SM-PDFs tailored to Higgs, top-quark pair, and electroweak gauge boson physics, and we determine that, when the PDF4LHC15 combined set is used as the prior, around 11, 4, and 11 Hessian eigenvectors, respectively, are enough to fully describe the corresponding processes.
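
    As an illustration of the reduction idea (not the authors' actual SM-PDF algorithm, which works at the level of PDF linear combinations), the sketch below uses an SVD to find the few Hessian-eigenvector combinations that dominate the uncertainty of a given set of observables; all numbers are synthetic:

```python
import numpy as np

# Toy stand-in for reducing a Hessian PDF error set: X[i, k] is the shift
# of observable i along Hessian eigenvector k. The synthetic X has low
# effective rank, mimicking a few PDF directions dominating a process class.
rng = np.random.default_rng(0)
n_obs, n_eig = 20, 100
X = rng.normal(size=(n_obs, 3)) @ rng.normal(size=(3, n_eig))
X += 1e-3 * rng.normal(size=(n_obs, n_eig))    # small residual components

# Hessian PDF uncertainty: quadrature sum of shifts over all eigenvectors
full_unc = np.sqrt((X**2).sum(axis=1))

# The SVD selects the few eigenvector combinations that matter for these
# observables; keep enough to capture 99.9% of the total variance.
U, s, _ = np.linalg.svd(X, full_matrices=False)
frac = np.cumsum(s**2) / (s**2).sum()
keep = int(np.searchsorted(frac, 0.999)) + 1

Xr = U[:, :keep] * s[:keep]                    # reduced error representation
red_unc = np.sqrt((Xr**2).sum(axis=1))         # reproduces full_unc closely
```

    The SM-PDF construction adds the properties the abstract emphasizes on top of this basic idea: reduced sets for different processes can be combined without losing correlation information, and the prior set is recovered as more processes are added.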

  1. [Essential genes, minimal genome and synthetic cell of bacteria: a review].

    PubMed

    Qiu, Dongru

    2012-05-01

    Single-cell prokaryotes represent a simple and primitive cellular life form. The identification of the essential genes of bacteria and of the minimal genome for free-living cellular life could provide insights into the origin, evolution, and essence of life forms. The principles, methodology, and recent progress in the identification of essential genes and the minimal genome and in the creation of synthetic cells are reviewed; in particular, the strategies for creating the minimal genome and the potential applications are introduced. PMID:22916492

  2. Minimally invasive surgery for atrial fibrillation.

    PubMed

    Zembala, Michael O; Suwalski, Piotr

    2013-11-01

    Atrial fibrillation (AF) remains the most common cardiac arrhythmia, affecting nearly 2% of the general population worldwide. Minimally invasive surgical ablation remains one of the most dynamically evolving fields of modern cardiac surgery. While there are more than a dozen issues driving this development, two seem to play the most important role. First, there is a lack of evidence supporting a percutaneous catheter-based approach to treat patients with persistent and long-standing persistent AF. The paucity of this data offers the surgical community an unparalleled opportunity to challenge guidelines and change indications for surgical intervention. Large, multicenter prospective clinical studies are therefore of utmost importance, as well as honest, clear data reporting. Second, a collaborative methodology started a long-awaited debate on a Heart Team approach to AF, similar to the debate on coronary artery disease and transcatheter valves. Appropriate patient selection and tailored treatment options will most certainly result in better outcomes and patient satisfaction, coupled with appropriate use of always-limited institutional resources. The aim of this review, unlike other reviews of minimally invasive surgical ablation, is to present medical professionals with two distinctly different approaches. The first is purely surgical: standalone surgical isolation of the pulmonary veins using a bipolar energy source with concomitant amputation of the left atrial appendage, a method of choice in one of the most important clinical trials on AF, the Atrial Fibrillation Catheter Ablation Versus Surgical Ablation Treatment (FAST) trial. The second represents the most complex approach to this problem: a multidisciplinary, combined effort of a cardiac surgeon and an electrophysiologist. The Convergent Procedure, which includes both endocardial and epicardial unipolar ablation, bonds together minimally invasive endoscopic surgery with electroanatomical mapping to deliver the best of the

  3. Minimally invasive surgery for atrial fibrillation

    PubMed Central

    Suwalski, Piotr

    2013-01-01

    Atrial fibrillation (AF) remains the most common cardiac arrhythmia, affecting nearly 2% of the general population worldwide. Minimally invasive surgical ablation remains one of the most dynamically evolving fields of modern cardiac surgery. While there are more than a dozen issues driving this development, two seem to play the most important role. First, there is a lack of evidence supporting the percutaneous catheter-based approach in patients with persistent and long-standing persistent AF. The paucity of such data offers the surgical community an unparalleled opportunity to challenge guidelines and change indications for surgical intervention. Large, multicenter prospective clinical studies are therefore of utmost importance, as is honest, clear data reporting. Second, a collaborative methodology has started a long-awaited debate on a Heart Team approach to AF, similar to the debates on coronary artery disease and transcatheter valves. Appropriate patient selection and tailored treatment options will most certainly result in better outcomes and patient satisfaction, coupled with appropriate use of always-limited institutional resources. The aim of this review, unlike other reviews of minimally invasive surgical ablation, is to present medical professionals with two distinctly different approaches. The first is purely surgical: standalone surgical isolation of the pulmonary veins using a bipolar energy source with concomitant amputation of the left atrial appendage, a method of choice in one of the most important clinical trials on AF, the Atrial Fibrillation Catheter Ablation Versus Surgical Ablation Treatment (FAST) Trial. The second represents the most complex approach to this problem: a multidisciplinary, combined effort of a cardiac surgeon and an electrophysiologist. The Convergent Procedure, which includes both endocardial and epicardial unipolar ablation, bonds together minimally invasive endoscopic surgery with electroanatomical mapping, to deliver the best of

  4. Minimally invasive video-assisted versus minimally invasive nonendoscopic thyroidectomy.

    PubMed

    Fík, Zdeněk; Astl, Jaromír; Zábrodský, Michal; Lukeš, Petr; Merunka, Ilja; Betka, Jan; Chovanec, Martin

    2014-01-01

    Minimally invasive video-assisted thyroidectomy (MIVAT) and minimally invasive nonendoscopic thyroidectomy (MINET) represent well-accepted and reproducible techniques developed with the main goal of improving cosmetic outcome, accelerating healing, and increasing patients' comfort following thyroid surgery. Between 2007 and 2011, a prospective nonrandomized study of patients undergoing minimally invasive thyroid surgery was performed to compare the advantages and disadvantages of the two techniques. There were no significant differences in the length of incision required to perform the surgical procedures. Mean duration of hemithyroidectomy was comparable in both groups, but total thyroidectomy was more time-consuming to perform by MIVAT. More patients underwent MIVAT procedures without active drainage in the postoperative course, and we could also see a trend toward less pain in the same group. This was paralleled by a statistically significant decrease in the administration of both opiate and nonopiate analgesics. We encountered two cases of recurrent laryngeal nerve palsy, in the MIVAT group only. MIVAT and MINET represent a safe and feasible alternative to conventional thyroid surgery in selected cases, and this prospective study has shown minimal differences between the two techniques. PMID:24800227

  5. The New Minimal Standard Model

    SciTech Connect

    Davoudiasl, Hooman; Kitano, Ryuichiro; Li, Tianjun; Murayama, Hitoshi

    2005-01-13

    We construct the New Minimal Standard Model that incorporates the new discoveries of physics beyond the Minimal Standard Model (MSM): Dark Energy, non-baryonic Dark Matter, neutrino masses, as well as baryon asymmetry and cosmic inflation, adopting the principle of minimal particle content and the most general renormalizable Lagrangian. We base the model purely on empirical facts rather than aesthetics. We need only six new degrees of freedom beyond the MSM. It is free from excessive flavor-changing effects, CP violation, too-rapid proton decay, problems with electroweak precision data, and unwanted cosmological relics. Any model of physics beyond the MSM should be measured against the phenomenological success of this model.

  6. Technology transfer methodology

    NASA Technical Reports Server (NTRS)

    Labotz, Rich

    1991-01-01

    Information on technology transfer methodology is given in viewgraph form. Topics covered include problems in economics, technology drivers, inhibitors to using improved technology in development, technology application opportunities, and co-sponsorship of technology.

  7. In vivo minimally invasive interstitial multi-functional microendoscopy

    PubMed Central

    Shahmoon, Asaf; Aharon, Shiran; Kruchik, Oded; Hohmann, Martin; Slovin, Hamutal; Douplik, Alexandre; Zalevsky, Zeev

    2013-01-01

    Developing minimally invasive methodologies for imaging internal organs is an emerging field in biomedical research. This paper introduces a new multi-functional microendoscope device capable of imaging internal organs with minimal invasive intervention. In addition, the developed microendoscope can also be employed as a monitoring device for measuring local hemoglobin concentration in the bloodstream when administered into an artery. The microendoscope device has a total external diameter of only 200 μm and can provide a high imaging resolution of more than 5,000 pixels. The device can detect features with a spatial resolution of less than 1 μm. The microendoscope has been tested both in vitro and in vivo in rats, presenting a promising and powerful tool as a high-resolution and minimally invasive imaging facility suitable for previously unreachable clinical modalities. PMID:23712369

  8. Minimally invasive aortic valve surgery.

    PubMed

    Castrovinci, Sebastiano; Emmanuel, Sam; Moscarelli, Marco; Murana, Giacomo; Caccamo, Giuseppa; Bertolino, Emanuela Clara; Nasso, Giuseppe; Speziale, Giuseppe; Fattouch, Khalil

    2016-09-01

    Aortic valve disease is a prevalent disorder that affects approximately 2% of the general adult population. Surgical aortic valve replacement is the gold-standard treatment for symptomatic patients and has proven to be both safe and effective. Over the last few decades, in an attempt to reduce surgical trauma, different minimally invasive approaches for aortic valve replacement have been developed and are now being increasingly utilized. A narrative review of the literature was carried out to describe the surgical techniques for minimally invasive aortic valve surgery and to report the results from experienced centers. Minimally invasive aortic valve replacement is associated with low perioperative morbidity and mortality and a low conversion rate to full sternotomy. Long-term survival appears to be at least comparable to that reported for conventional full sternotomy. Minimally invasive aortic valve surgery, whether through a partial upper sternotomy or a right anterior minithoracotomy, provides early- and long-term benefits. Given these benefits, it may be considered the standard of care for isolated aortic valve disease. PMID:27582764

  9. LLNL Waste Minimization Program Plan

    SciTech Connect

    Not Available

    1990-02-14

    This document is the February 14, 1990 version of the LLNL Waste Minimization Program Plan (WMPP). The waste minimization policy field has undergone continuous change since its formal inception in the 1984 HSWA legislation. The first LLNL WMPP, Revision A, is dated March 1985. A series of informal revisions was made on approximately a semi-annual basis. This Revision 2 is the third formal issuance of the WMPP document. EPA has issued a proposed new policy statement on source reduction and recycling. This policy reflects a preventative strategy to reduce or eliminate the generation of environmentally harmful pollutants which may be released to the air, land surface, water, or ground water. In accordance with this new policy, new guidance was issued to hazardous waste generators on the elements of a Waste Minimization Program. In response to these policies, DOE has revised and issued implementation guidance for DOE Order 5400.1, Waste Minimization Plan and Waste Reduction reporting of DOE Hazardous, Radioactive, and Radioactive Mixed Wastes (final draft, January 1990). This WMPP is formatted to meet the current DOE guidance outlines. The current WMPP will be revised to reflect all of these proposed changes when guidelines are established. Updates, changes, and revisions to the overall LLNL WMPP will be made as appropriate to reflect ever-changing regulatory requirements. 3 figs., 4 tabs.

  10. WASTE MINIMIZATION OPPORTUNITY ASSESSMENT MANUAL

    EPA Science Inventory

    Waste minimization (WM) is a policy specifically mandated by the U.S. Congress in the 1984 Hazardous and Solid Wastes Amendments to the Resource Conservation and Recovery Act (RCRA). The RCRA regulations require that generators of hazardous waste have a program in place to reduce...

  11. Assembly of a minimal protocell

    NASA Astrophysics Data System (ADS)

    Rasmussen, Steen

    2007-03-01

    What is minimal life, how can we make it, and how can it be useful? We present experimental and computational results towards bridging nonliving and living matter, resulting in life that is different from, and much simpler than, contemporary life. A simple yet tightly coupled catalytic cooperation between genes, metabolism, and container forms the design underpinnings of our protocell, which is a minimal self-replicating molecular machine. Experimentally, we have recently demonstrated this coupling by having an informational molecule (8-oxoguanine) catalytically control the light-driven metabolic (Ru-bpy-based) production of container materials (fatty acids). This is a significant milestone towards assembling a minimal self-replicating molecular machine. Recent theoretical investigations indicate that coordinated exponential component growth should naturally emerge as a result of such a catalytic coupling between the main protocellular components. A 3-D dissipative particle dynamics (DPD) simulation study of the full protocell life-cycle exposes a number of anticipated systemic issues associated with the remaining experimental challenges for the implementation of the minimal protocell. Finally we outline how more general self-replicating materials could be useful.

  12. A Defense of Semantic Minimalism

    ERIC Educational Resources Information Center

    Kim, Su

    2012-01-01

    Semantic Minimalism is a position about the semantic content of declarative sentences, i.e., the content that is determined entirely by syntax. It is defined by the following two points: "Point 1": The semantic content is a complete/truth-conditional proposition. "Point 2": The semantic content is useful to a theory of…

  13. Minimally invasive aortic valve surgery

    PubMed Central

    Castrovinci, Sebastiano; Emmanuel, Sam; Moscarelli, Marco; Murana, Giacomo; Caccamo, Giuseppa; Bertolino, Emanuela Clara; Nasso, Giuseppe; Speziale, Giuseppe; Fattouch, Khalil

    2016-01-01

    Aortic valve disease is a prevalent disorder that affects approximately 2% of the general adult population. Surgical aortic valve replacement is the gold-standard treatment for symptomatic patients and has proven to be both safe and effective. Over the last few decades, in an attempt to reduce surgical trauma, different minimally invasive approaches for aortic valve replacement have been developed and are now being increasingly utilized. A narrative review of the literature was carried out to describe the surgical techniques for minimally invasive aortic valve surgery and to report the results from experienced centers. Minimally invasive aortic valve replacement is associated with low perioperative morbidity and mortality and a low conversion rate to full sternotomy. Long-term survival appears to be at least comparable to that reported for conventional full sternotomy. Minimally invasive aortic valve surgery, whether through a partial upper sternotomy or a right anterior minithoracotomy, provides early- and long-term benefits. Given these benefits, it may be considered the standard of care for isolated aortic valve disease. PMID:27582764

  14. Minimally invasive surgical approach to pancreatic malignancies

    PubMed Central

    Bencini, Lapo; Annecchiarico, Mario; Farsi, Marco; Bartolini, Ilenia; Mirasolo, Vita; Guerra, Francesco; Coratti, Andrea

    2015-01-01

    Pancreatic surgery for malignancy is recognized as challenging for surgeons and risky for patients due to considerable perioperative morbidity and mortality. Furthermore, the oncological long-term results are largely disappointing, even for those patients who experience an uneventful hospital stay. Nevertheless, surgery still remains the cornerstone of multidisciplinary treatment for pancreatic cancer. In order to maximize the benefits of surgery, the advent of both laparoscopy and robotics has led many surgeons to treat pancreatic cancers with these new methodologies. The reduction of postoperative complications, length of hospital stay and pain, together with a shorter interval between surgery and the beginning of adjuvant chemotherapy, represent the potential advantages over conventional surgery. Lastly, a better cosmetic result, although not crucial in cancer patients, could also play a role by improving overall well-being and patient self-perception. The laparoscopic approach to pancreatic surgery is, however, difficult in inexperienced hands and requires dedicated training in both advanced laparoscopy and pancreatic surgery. The recent wide diffusion of the da Vinci® robotic platform seems to facilitate many of the technical maneuvers, such as biliary and pancreatic anastomotic reconstructions, accurate lymphadenectomy, and vascular sutures. The two main pancreatic operations, distal pancreatectomy and pancreaticoduodenectomy, are approachable by a minimally invasive path, but more limited interventions such as enucleation are also feasible. Nevertheless, a word of caution is warranted regarding the increasing costs of these new technologies; the main concerns remain the maintenance of all oncological standards and the lack of long-term follow-up. The purpose of this review is to examine the evidence for the use of minimally invasive surgery in pancreatic cancer (and less aggressive tumors

  15. Toward a Minimal Artificial Axon.

    PubMed

    Ariyaratne, Amila; Zocchi, Giovanni

    2016-07-01

    The electrophysiology of action potentials is usually studied in neurons, through relatively demanding experiments which are difficult to scale up to a defined network. Here we pursue instead the minimal artificial system based on the essential biological components (ion channels and lipid bilayers) where action potentials can be generated, propagated, and eventually networked. The fundamental unit is the classic supported bilayer: a planar bilayer patch with embedded ion channels in a fluidic environment where an ionic gradient is imposed across the bilayer. Two such units electrically connected form the basic building block for a network. The system is minimal in that we demonstrate that one kind of ion channel and correspondingly a gradient of only one ionic species is sufficient to generate an excitable system which shows amplification and threshold behavior. PMID:27049652

  16. Minimal Doubling and Point Splitting

    SciTech Connect

    Creutz, M.

    2010-06-14

    Minimally-doubled chiral fermions have the unusual property of a single local field creating two fermionic species. Spreading the field over hypercubes allows construction of combinations that isolate specific modes. Combining these fields into bilinears produces meson fields of specific quantum numbers. Minimally-doubled fermion actions present the possibility of fast simulations while maintaining one exact chiral symmetry. They do, however, introduce some peculiar aspects. An explicit breaking of hyper-cubic symmetry allows additional counter-terms to appear in the renormalization. While a single field creates two different species, spreading this field over nearby sites allows isolation of specific states and the construction of physical meson operators. Finally, lattice artifacts break isospin and give two of the three pseudoscalar mesons an additional contribution to their mass. Depending on the sign of this mass splitting, one can either have a traditional Goldstone pseudoscalar meson or a parity breaking Aoki-like phase.

  17. Anaesthesia for minimally invasive surgery

    PubMed Central

    Dec, Marta

    2015-01-01

    Minimally invasive surgery (MIS) is rising in popularity. It offers well-known benefits to the patient. However, restricted access to the surgical site and gas insufflation into the body cavities may result in severe complications. From the anaesthetic point of view MIS poses unique challenges associated with creation of pneumoperitoneum, carbon dioxide absorption, specific positioning and monitoring a patient to whom the anaesthetist has often restricted access, in a poorly lit environment. Moreover, with refinement of surgical procedures and growing experience the anaesthetist is presented with patients from high-risk groups (obese, elderly, with advanced cardiac and respiratory disease) who once were deemed unsuitable for the laparoscopic technique. Anaesthetic management is aimed at getting the patient safely through the procedure, minimizing the specific risks arising from laparoscopy and the patient's coexisting medical problems, ensuring quick recovery and a relatively pain-free postoperative course with early return to normal function. PMID:26865885

  18. Minimal universal quantum heat machine.

    PubMed

    Gelbwaser-Klimovsky, D; Alicki, R; Kurizki, G

    2013-01-01

    In traditional thermodynamics the Carnot cycle yields the ideal performance bound of heat engines and refrigerators. We propose and analyze a minimal model of a heat machine that can play a similar role in quantum regimes. The minimal model consists of a single two-level system with periodically modulated energy splitting that is permanently, weakly, coupled to two spectrally separated heat baths at different temperatures. The equation of motion allows us to compute the stationary power and heat currents in the machine consistent with the second law of thermodynamics. This dual-purpose machine can act as either an engine or a refrigerator (heat pump) depending on the modulation rate. In both modes of operation, the maximal Carnot efficiency is reached at zero power. We study the conditions for finite-time optimal performance for several variants of the model. Possible realizations of the model are discussed. PMID:23410316

  19. Principle of minimal work fluctuations.

    PubMed

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality ⟨e^(-βW)⟩ = e^(-βΔF), a change in the fluctuations of e^(-βW) may impact how rapidly the statistical average of e^(-βW) converges towards the theoretical value e^(-βΔF), where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^(-βW). In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^(-βW), where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain, where the classical work protocol is realizable by an adiabatic process, the classical adiabatic process also yields the minimal fluctuations in e^(-βW). Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014)]. PMID:26382367
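
    The equality and the minimal-fluctuation claim can be checked exactly in a toy two-level model. The sketch below is my own illustration, not the paper's numerical setup: initial level populations are thermal, the protocol changes the energy gap, and a doubly stochastic transition matrix stands in for the dynamics, with q = 0 playing the role of the adiabatic, transition-free protocol.

```python
import math

def exp_work_stats(beta, e_i, e_f, q):
    """Two-level system with levels {0, eps}; the gap changes from e_i to e_f.
    q is the level-transition probability (q = 0: adiabatic, no transitions).
    Returns (mean, variance) of exp(-beta*W), with W defined by two energy
    measurements, one before and one after the protocol."""
    Ei, Ef = [0.0, e_i], [0.0, e_f]
    Zi = sum(math.exp(-beta * E) for E in Ei)
    P = [[1 - q, q], [q, 1 - q]]  # doubly stochastic, as for unitary dynamics
    m1 = m2 = 0.0
    for n in (0, 1):
        p_n = math.exp(-beta * Ei[n]) / Zi  # thermal occupation before the protocol
        for m in (0, 1):
            x = math.exp(-beta * (Ef[m] - Ei[n]))  # one realization of e^(-beta*W)
            m1 += p_n * P[n][m] * x
            m2 += p_n * P[n][m] * x * x
    return m1, m2 - m1 * m1

beta, e_i, e_f = 1.0, 1.0, 2.0  # illustrative parameter values
mean_adiab, var_adiab = exp_work_stats(beta, e_i, e_f, 0.0)  # adiabatic protocol
mean_fast, var_fast = exp_work_stats(beta, e_i, e_f, 0.3)    # protocol with transitions
```

    For every q the mean equals Z_f/Z_i = e^(-βΔF), so the Jarzynski equality holds regardless of the protocol, while the variance of e^(-βW) is smallest for the adiabatic case q = 0, matching the stated principle.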

  20. Principle of minimal work fluctuations

    NASA Astrophysics Data System (ADS)

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality ⟨e^(-βW)⟩ = e^(-βΔF), a change in the fluctuations of e^(-βW) may impact how rapidly the statistical average of e^(-βW) converges towards the theoretical value e^(-βΔF), where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^(-βW). In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^(-βW), where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain, where the classical work protocol is realizable by an adiabatic process, the classical adiabatic process also yields the minimal fluctuations in e^(-βW). Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014), 10.1103/PhysRevE.90.052132].

  1. Minimizing liability during internal investigations.

    PubMed

    Morris, Cole

    2010-01-01

    Today's security professional must appreciate the potential landmines in any investigative effort and work collaboratively with others to minimize liability risks, the author points out. In this article he examines six civil torts that commonly arise from unprofessionally planned or poorly executed internal investigations: defamation, false imprisonment, intentional infliction of emotional distress, assault and battery, invasion of privacy, and malicious prosecution and abuse of process. PMID:20873494

  2. Minimal absent words in four human genome assemblies.

    PubMed

    Garcia, Sara P; Pinho, Armando J

    2011-01-01

    Minimal absent words have been computed in genomes of organisms from all domains of life. Here, we aim to contribute to the catalogue of human genomic variation by investigating the variation in number and content of minimal absent words within a species, using four human genome assemblies. We compare the reference human genome GRCh37 assembly, the HuRef assembly of the genome of Craig Venter, the NA12878 assembly from cell line GM12878, and the YH assembly of the genome of a Han Chinese individual. We find the variation in number and content of minimal absent words between assemblies more significant for large and very large minimal absent words, where the biases of sequencing and assembly methodologies become more pronounced. Moreover, we find generally greater similarity between the human genome assemblies sequenced with capillary-based technologies (GRCh37 and HuRef) than between the human genome assemblies sequenced with massively parallel technologies (NA12878 and YH). Finally, as expected, we find the overall variation in number and content of minimal absent words within a species to be generally smaller than the variation between species. PMID:22220210
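
    For illustration, a minimal absent word can be computed directly from its definition: a word that does not occur in the sequence but whose two maximal proper factors both do. The brute-force sketch below (the function name is mine, and the approach is suitable only for short sequences, not genome-scale assemblies) enumerates candidates by extending each present substring by one letter:

```python
def minimal_absent_words(s):
    """Brute force: a word w (|w| >= 2) is a minimal absent word of s iff
    w does not occur in s while w[:-1] and w[1:] both do."""
    alphabet = sorted(set(s))
    n = len(s)
    # every substring of s
    present = {s[i:j] for i in range(n) for j in range(i + 1, n + 1)}
    maws = set()
    for w in present:              # candidate = w + a, so candidate[:-1] is present
        for a in alphabet:
            cand = w + a
            if cand not in present and cand[1:] in present:
                maws.add(cand)
    return sorted(maws)
```

    For example, minimal_absent_words("abb") yields ["aa", "ba", "bbb"]: each is absent from "abb", yet removing either end letter leaves a substring that does occur. Practical tools use suffix automata or suffix arrays to reach linear time, as the quadratic substring set here would be prohibitive for genomes.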

  3. Courseware Engineering Methodology.

    ERIC Educational Resources Information Center

    Uden, Lorna

    2002-01-01

    Describes development of the Courseware Engineering Methodology (CEM), created to guide novices in designing effective courseware. Discusses CEM's four models: pedagogical (concerned with the courseware's pedagogical aspects), conceptual (dealing with software engineering), interface (relating to human-computer interaction), and hypermedia…

  4. Document Conversion Methodology.

    ERIC Educational Resources Information Center

    Bovee, Donna

    1990-01-01

    Discusses digital imaging technology and examines document database conversion considerations. Two types of document imaging systems are described: (1) a work in process system, and (2) a storage and retrieval system. Conversion methodology is outlined, and a document conversion scenario is presented as a practical guide to conversion. (LRW)

  5. Complicating Methodological Transparency

    ERIC Educational Resources Information Center

    Bridges-Rhoads, Sarah; Van Cleave, Jessica; Hughes, Hilary E.

    2016-01-01

    A historical indicator of the quality, validity, and rigor of qualitative research has been the documentation and disclosure of the behind-the-scenes work of the researcher. In this paper, we use what we call "methodological data" as a tool to complicate the possibility and desirability of such transparency. Specifically, we draw on our…

  6. Video: Modalities and Methodologies

    ERIC Educational Resources Information Center

    Hadfield, Mark; Haw, Kaye

    2012-01-01

    In this article, we set out to explore what we describe as the use of video in various modalities. For us, modality is a synthesizing construct that draws together and differentiates between the notion of "video" both as a method and as a methodology. It encompasses the use of the term video as both product and process, and as a data collection…

  7. SCI Hazard Report Methodology

    NASA Technical Reports Server (NTRS)

    Mitchell, Michael S.

    2010-01-01

    This slide presentation reviews the methodology in creating a Source Control Item (SCI) Hazard Report (HR). The SCI HR provides a system safety risk assessment for the following Ares I Upper Stage Production Contract (USPC) components (1) Pyro Separation Systems (2) Main Propulsion System (3) Reaction and Roll Control Systems (4) Thrust Vector Control System and (5) Ullage Settling Motor System components.

  8. Temporal structure of consciousness and minimal self in schizophrenia.

    PubMed

    Martin, Brice; Wittmann, Marc; Franck, Nicolas; Cermolacce, Michel; Berna, Fabrice; Giersch, Anne

    2014-01-01

    The concept of the minimal self refers to the consciousness of oneself as an immediate subject of experience. According to recent studies, disturbances of the minimal self may be a core feature of schizophrenia. They are emphasized in classical psychiatry literature and in phenomenological work. Impaired minimal self-experience may be defined as a distortion of one's first-person experiential perspective as, for example, an "altered presence" during which the sense of the experienced self ("mineness") is subtly affected, or "altered sense of demarcation," i.e., a difficulty discriminating the self from the non-self. Little is known, however, about the cognitive basis of these disturbances. In fact, recent work indicates that disorders of the self are not correlated with cognitive impairments commonly found in schizophrenia such as working-memory and attention disorders. In addition, a major difficulty with exploring the minimal self experimentally lies in its definition as being non-self-reflexive, and distinct from the verbalized, explicit awareness of an "I." In this paper, we shall discuss the possibility that disturbances of the minimal self observed in patients with schizophrenia are related to alterations in time processing. We shall review the literature on schizophrenia and time processing that lends support to this possibility. In particular we shall discuss the involvement of temporal integration windows on different time scales (implicit time processing) as well as duration perception disturbances (explicit time processing) in disorders of the minimal self. We argue that a better understanding of the relationship between time and the minimal self, as well as of issues of embodiment, requires research that looks more specifically at implicit time processing. Some methodological issues will be discussed. PMID:25400597

  9. Temporal structure of consciousness and minimal self in schizophrenia

    PubMed Central

    Martin, Brice; Wittmann, Marc; Franck, Nicolas; Cermolacce, Michel; Berna, Fabrice; Giersch, Anne

    2014-01-01

    The concept of the minimal self refers to the consciousness of oneself as an immediate subject of experience. According to recent studies, disturbances of the minimal self may be a core feature of schizophrenia. They are emphasized in classical psychiatry literature and in phenomenological work. Impaired minimal self-experience may be defined as a distortion of one’s first-person experiential perspective as, for example, an “altered presence” during which the sense of the experienced self (“mineness”) is subtly affected, or “altered sense of demarcation,” i.e., a difficulty discriminating the self from the non-self. Little is known, however, about the cognitive basis of these disturbances. In fact, recent work indicates that disorders of the self are not correlated with cognitive impairments commonly found in schizophrenia such as working-memory and attention disorders. In addition, a major difficulty with exploring the minimal self experimentally lies in its definition as being non-self-reflexive, and distinct from the verbalized, explicit awareness of an “I.” In this paper, we shall discuss the possibility that disturbances of the minimal self observed in patients with schizophrenia are related to alterations in time processing. We shall review the literature on schizophrenia and time processing that lends support to this possibility. In particular we shall discuss the involvement of temporal integration windows on different time scales (implicit time processing) as well as duration perception disturbances (explicit time processing) in disorders of the minimal self. We argue that a better understanding of the relationship between time and the minimal self, as well as of issues of embodiment, requires research that looks more specifically at implicit time processing. Some methodological issues will be discussed. PMID:25400597

  10. Evidence-Based Integrated Environmental Solutions For Secondary Lead Smelters: Pollution Prevention And Waste Minimization Technologies And Practices

    EPA Science Inventory

    An evidence-based methodology was adopted in this research to establish strategies to increase lead recovery and recycling via a systematic review and critical appraisal of the published literature. In particular, the research examines pollution prevention and waste minimization...

  11. Unsupported standing with minimized ankle muscle fatigue.

    PubMed

    Mihelj, Matjaz; Munih, Marko

    2004-08-01

    In the past, limited unsupported standing has been restored in patients with thoracic spinal cord injury through open-loop functional electrical stimulation of paralyzed knee extensor muscles and the support of intact arm musculature. Here an optimal control system for paralyzed ankle muscles was designed that enables the subject to stand without hand support in the sagittal plane. The paraplegic subject was conceptualized as an underactuated double inverted pendulum structure with an active degree of freedom in the upper trunk and a passive degree of freedom in the paralyzed ankle joints. Control system design is based on the minimization of a cost function that estimates the effort of ankle joint muscles via observation of the ground reaction force position relative to the ankle joint axis. Furthermore, such a control system integrates voluntary upper trunk activity and artificial control of ankle joint muscles, resulting in a robust standing posture. Figures are shown for the initial simulation study, followed by disturbance tests on an intact volunteer and several laboratory trials with a paraplegic person. Benefits of the presented methodology are prolonged standing sessions and the fact that the subject is able to maintain voluntary control over upper body orientation in space, enabling simple functional standing. PMID:15311817
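    The cost idea in this abstract can be sketched in a few lines: ankle muscle effort is taken to grow with the distance between the ground reaction force's centre of pressure and the ankle joint axis, so control seeks to keep that lever arm small. This is an illustrative sketch with invented numbers, not the authors' controller.

```python
def ankle_effort_cost(cop_positions, ankle_x):
    """Estimate ankle muscle effort as the mean squared distance of the
    centre of pressure (CoP) from the ankle joint axis (both in metres)."""
    return sum((x - ankle_x) ** 2 for x in cop_positions) / len(cop_positions)

# Invented CoP traces: quiet standing keeps the CoP near the ankle axis,
# so its effort estimate is lower than that of a forward-leaning sway.
quiet_standing = [0.02, 0.01, -0.01, 0.00]
forward_sway = [0.10, 0.12, 0.08, 0.11]
```

    A controller minimizing a cost of this shape drives the centre of pressure toward the ankle axis, which is the criterion the abstract describes.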

  12. Development of a flight software testing methodology

    NASA Technical Reports Server (NTRS)

    Mccluskey, E. J.; Andrews, D. M.

    1985-01-01

    The research to develop a testing methodology for flight software is described. An experiment was conducted in using assertions to dynamically test digital flight control software. The experiment showed that 87% of typical errors introduced into the program would be detected by assertions. Detailed analysis of the test data showed that the number of assertions needed to detect those errors could be reduced to a minimal set. The analysis also revealed that the most effective assertions tested program parameters that provided greater indirect (collateral) testing of other parameters. In addition, a prototype watchdog task system was built to evaluate the effectiveness of executing assertions in parallel by using the multitasking features of Ada.
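    The assertion technique described above can be illustrated with a small sketch; the parameter names and limits below are invented for illustration, not taken from the flight software studied.

```python
def check_assertions(state, assertions):
    """Evaluate executable assertions against a program state and return
    the names of the assertions that fail."""
    return [name for name, predicate in assertions if not predicate(state)]

# A minimal assertion set; a range check on one parameter also provides
# indirect (collateral) testing of the parameters that feed into it.
ASSERTIONS = [
    ("altitude_nonnegative", lambda s: s["altitude"] >= 0.0),
    ("pitch_within_limits", lambda s: -30.0 <= s["pitch"] <= 30.0),
    ("airspeed_positive", lambda s: s["airspeed"] > 0.0),
]

state = {"altitude": 1200.0, "pitch": -45.0, "airspeed": 250.0}
violations = check_assertions(state, ASSERTIONS)  # flags the pitch excursion
```

    Running such checks alongside the program, as the watchdog-task prototype did, turns each assertion into a dynamic error detector.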

  13. A POLLUTION REDUCTION METHODOLOGY FOR CHEMICAL PROCESS SIMULATORS

    EPA Science Inventory

    A pollution minimization methodology was developed for chemical process design using computer simulation. It is based on a pollution balance that at steady state is used to define a pollution index with units of mass of pollution per mass of products. The pollution balance has be...
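    The pollution index defined above (mass of pollution per mass of products, from a steady-state pollution balance) reduces to a one-line computation; the stream values below are invented placeholders.

```python
def pollution_index(pollutant_mass_flows, product_mass_flows):
    """Pollution index: total pollutant mass flow divided by total product
    mass flow (kg pollutant per kg product) at steady state."""
    return sum(pollutant_mass_flows) / sum(product_mass_flows)

# Invented steady-state streams: 5 kg/h of pollutants, 100 kg/h of products.
index = pollution_index([2.0, 3.0], [60.0, 40.0])
```

    Comparing this index across candidate flowsheets in a process simulator is how such a methodology ranks design alternatives.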

  14. Risk minimization through portfolio replication

    NASA Astrophysics Data System (ADS)

    Ciliberti, S.; Mézard, M.

    2007-05-01

    We use a replica approach to deal with portfolio optimization problems. A given risk measure is minimized using empirical estimates of asset value correlations. We study the phase transition which happens when the time series is too short with respect to the size of the portfolio. We also study the noise sensitivity of portfolio allocation when this transition is approached. We consider explicitly the cases where the absolute deviation and the conditional value-at-risk are chosen as a risk measure. We show how the replica method can be applied to a wide range of risk measures and can deal with various types of time-series correlations, including realistic ones with volatility clustering.
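    The replica calculation itself is analytic, but the underlying optimization problem, minimizing an empirical risk measure over portfolio weights, is easy to sketch. The two-asset grid search and return series below are toy assumptions, not the paper's method.

```python
def mean_abs_deviation(returns):
    """Empirical absolute-deviation risk measure of a return series."""
    mu = sum(returns) / len(returns)
    return sum(abs(r - mu) for r in returns) / len(returns)

def best_weight(r1, r2, steps=100):
    """Grid-search the allocation weight w of asset 1 (1 - w in asset 2)
    that minimizes the portfolio's mean absolute deviation."""
    def risk(w):
        portfolio = [w * a + (1 - w) * b for a, b in zip(r1, r2)]
        return mean_abs_deviation(portfolio)
    return min((i / steps for i in range(steps + 1)), key=risk)

r1 = [0.02, -0.01, 0.03, -0.02]   # asset 1 returns (illustrative)
r2 = [-0.02, 0.01, -0.03, 0.02]   # asset 2: perfectly anti-correlated
w = best_weight(r1, r2)
```

    With perfectly anti-correlated assets the risk vanishes at the 50/50 split; with realistically short, noisy time series this empirical minimum becomes unstable, which is the transition the paper studies.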

  15. Diagnosis of minimal hepatic encephalopathy.

    PubMed

    Weissenborn, Karin

    2015-03-01

    Minimal hepatic encephalopathy (mHE) has a significant impact on a liver patient's daily living and health-related quality of life. Therefore a majority of clinicians agree that mHE should be diagnosed and treated. The optimal means for diagnosing mHE, however, is controversial. This paper describes the most frequently used methods and their pros and cons: EEG, critical flicker frequency, the Continuous Reaction Time Test, the Inhibitory Control Test, computerized test batteries such as the Cognitive Drug Research test battery, the psychometric hepatic encephalopathy score (PHES), and the Repeatable Battery for the Assessment of Neuropsychological Status (RBANS). PMID:26041959

  16. About the ZOOM minimization package

    SciTech Connect

    Fischler, M.; Sachs, D.; /Fermilab

    2004-11-01

    A new object-oriented Minimization package is available for distribution in the same manner as CLHEP. This package, designed for use in HEP applications, has all the capabilities of Minuit, but is a re-write from scratch, adhering to modern C++ design principles. A primary goal of this package is extensibility in several directions, so that its capabilities can be kept fresh with as little maintenance effort as possible. This package is distinguished by the priority that was assigned to C++ design issues, and the focus on producing an extensible system that will resist becoming obsolete.

  17. Prepulse minimization in KALI-5000.

    PubMed

    Kumar, D Durga Praveen; Mitra, S; Senthil, K; Sharma, Vishnu K; Singh, S K; Roy, A; Sharma, Archana; Nagesh, K V; Chakravarthy, D P

    2009-07-01

    A pulse power system (1 MV, 50 kA, and 100 ns) based on Marx generator and Blumlein pulse forming line has been built for generating high power microwaves. The Blumlein configuration poses a prepulse problem and hence the diode gap had to be increased to match the diode impedance to the Blumlein impedance during the main pulse. A simple method to eliminate prepulse voltage using a vacuum sparkgap and a resistor is given. Another fundamental approach of increasing the inductance of Marx generator to minimize the prepulse voltage is also presented. Experimental results for both of these configurations are given. PMID:19655979

  18. Prepulse minimization in KALI-5000

    NASA Astrophysics Data System (ADS)

    Kumar, D. Durga Praveen; Mitra, S.; Senthil, K.; Sharma, Vishnu K.; Singh, S. K.; Roy, A.; Sharma, Archana; Nagesh, K. V.; Chakravarthy, D. P.

    2009-07-01

    A pulse power system (1 MV, 50 kA, and 100 ns) based on Marx generator and Blumlein pulse forming line has been built for generating high power microwaves. The Blumlein configuration poses a prepulse problem and hence the diode gap had to be increased to match the diode impedance to the Blumlein impedance during the main pulse. A simple method to eliminate prepulse voltage using a vacuum sparkgap and a resistor is given. Another fundamental approach of increasing the inductance of Marx generator to minimize the prepulse voltage is also presented. Experimental results for both of these configurations are given.

  19. Minimizing medical litigation, part 2.

    PubMed

    Harold, Tan Keng Boon

    2006-01-01

    Provider-patient disputes are inevitable in the healthcare sector. Healthcare providers and regulators should recognize this and plan opportunities to enforce alternative dispute resolution (ADR) as early as possible in the care delivery process. Negotiation is often the main dispute resolution method used by local healthcare providers, failing which litigation would usually follow. The role of mediation in resolving malpractice disputes has been minimal. Healthcare providers, administrators, and regulators should therefore look toward a post-event communication-cum-mediation framework as the key national strategy for resolving malpractice disputes. PMID:16711089

  20. The minimal scenario of leptogenesis

    NASA Astrophysics Data System (ADS)

    Blanchet, Steve; Di Bari, Pasquale

    2012-12-01

    We review the main features and results of thermal leptogenesis within the type I seesaw mechanism, the minimal extension of the Standard Model explaining neutrino masses and mixing. After presenting the simplest approach, the vanilla scenario, we discuss various important developments of recent years, such as the inclusion of lepton and heavy neutrino flavour effects, a description beyond a hierarchical heavy neutrino mass spectrum and an improved kinetic description within the density matrix and the closed-time-path formalisms. We also discuss how leptogenesis can ultimately represent an important phenomenological tool to test the seesaw mechanism and the underlying model of new physics.

  1. Optimal needle design for minimal insertion force and bevel length.

    PubMed

    Wang, Yancheng; Chen, Roland K; Tai, Bruce L; McLaughlin, Patrick W; Shih, Albert J

    2014-09-01

    This research presents a methodology for optimal design of the needle geometry to minimize the insertion force and bevel length based on mathematical models of cutting edge inclination and rake angles and the insertion force. In brachytherapy, the needle with lower insertion force typically is easier for guidance and has less deflection. In this study, the needle with lancet point (denoted as lancet needle) is applied to demonstrate the model-based optimization for needle design. Mathematical models to calculate the bevel length and inclination and rake angles for lancet needle are presented. A needle insertion force model is developed to predict the insertion force for lancet needle. The genetic algorithm is utilized to optimize the needle geometry for two cases. One is to minimize the needle insertion force. Using the geometry of a commercial lancet needle as the baseline, the optimized needle has 11% lower insertion force with the same bevel length. The other case is to minimize the bevel length under the same needle insertion force. The optimized design can reduce the bevel length by 46%. Both optimized needle designs were validated experimentally in ex vivo porcine liver needle insertion tests and demonstrated the methodology of the model-based optimal needle design. PMID:24957487
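    The genetic-algorithm step in this abstract can be sketched generically. The surrogate objective below is an invented stand-in for the paper's insertion-force model (the real objective couples cutting edge inclination and rake angles along the bevel); only the optimization loop is the point here.

```python
import random

def surrogate_force(bevel_angle_deg):
    """Invented surrogate for insertion force vs. bevel angle (degrees);
    a quadratic with its minimum placed at 12 degrees for illustration."""
    return (bevel_angle_deg - 12.0) ** 2 + 5.0

def genetic_minimize(objective, lo, hi, pop_size=20, generations=40, seed=0):
    """Minimize a 1-D objective with a small elitist genetic algorithm."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=objective)
        parents = pop[: pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = 0.5 * (a + b)                # arithmetic crossover
            child += rng.gauss(0.0, 0.5)         # Gaussian mutation
            children.append(min(max(child, lo), hi))
        pop = parents + children                 # elitism: parents survive
    return min(pop, key=objective)

best = genetic_minimize(surrogate_force, 5.0, 30.0)
```

    In the paper the same loop searches needle geometry parameters against the full force and bevel-length models, under either a force or a length constraint.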

  2. Minimizing travel claims cost with minimal-spanning tree model

    NASA Astrophysics Data System (ADS)

    Jamalluddin, Mohd Helmi; Jaafar, Mohd Azrul; Amran, Mohd Iskandar; Ainul, Mohd Sharizal; Hamid, Aqmar; Mansor, Zafirah Mohd; Nopiah, Zulkifli Mohd

    2014-06-01

    Official travel necessitates considerable expenditure, as has been shown by the National Audit Department (NAD). Every year the auditing process is carried out throughout the country involving official travel claims. This study focuses on the use of the Spanning Tree model to determine the shortest paths that minimize the cost of the NAD's official travel claims. The objective is to study the possibility of running a network based in the Kluang District Health Office to eight Rural Clinics in Johor state, using Spanning Tree model applications to optimize travelling distances, and to make recommendations to the senior management of the Audit Department to analyze travelling details before an audit is conducted. Results of this study reveal savings of up to 47.4% of the original claims over the travel distances involved.
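    A minimal-spanning-tree computation of the kind this study applies can be sketched with Kruskal's algorithm; the office-to-clinic distances below are invented placeholders, not the Kluang District data.

```python
def kruskal_mst(n_nodes, edges):
    """Kruskal's algorithm. edges are (weight, u, v) tuples; returns the
    total weight and the chosen edges of a minimum spanning tree."""
    parent = list(range(n_nodes))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    total, chosen = 0, []
    for w, u, v in sorted(edges):          # cheapest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                       # edge joins two components
            parent[ru] = rv
            total += w
            chosen.append((u, v))
    return total, chosen

# Node 0 is the district office; nodes 1-3 are clinics (illustrative only).
edges = [(10, 0, 1), (4, 1, 2), (7, 0, 2), (12, 2, 3), (9, 1, 3)]
total, tree = kruskal_mst(4, edges)
```

    The tree's total weight is the cheapest distance network connecting all sites, which is the quantity the claims analysis compares against the distances actually claimed.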

  3. Acoustic methodology review

    NASA Technical Reports Server (NTRS)

    Schlegel, R. G.

    1982-01-01

    It is important for industry and NASA to assess the status of acoustic design technology for predicting and controlling helicopter external noise in order for a meaningful research program to be formulated which will address this problem. The prediction methodologies available to the designer and the acoustic engineer are three-fold. First is what has been described as a first principle analysis. This analysis approach attempts to remove any empiricism from the analysis process and deals with a theoretical mechanism approach to predicting the noise. The second approach attempts to combine first principle methodology (when available) with empirical data to formulate source predictors which can be combined to predict vehicle levels. The third is an empirical analysis, which attempts to generalize measured trends into a vehicle noise prediction method. This paper will briefly address each.

  4. Soft Systems Methodology

    NASA Astrophysics Data System (ADS)

    Checkland, Peter; Poulter, John

    Soft systems methodology (SSM) is an approach for tackling problematical, messy situations of all kinds. It is an action-oriented process of inquiry into problematic situations in which users learn their way from finding out about the situation to taking action to improve it. The learning emerges via an organised process in which the situation is explored using a set of models of purposeful action (each built to encapsulate a single worldview) as intellectual devices, or tools, to inform and structure discussion about a situation and how it might be improved. This paper, written by the original developer Peter Checkland and practitioner John Poulter, gives a clear and concise account of the approach that covers SSM's specific techniques, the learning cycle process of the methodology and the craft skills which practitioners develop. This concise but theoretically robust account nevertheless includes the fundamental concepts, techniques, and core tenets, described through a wide range of settings.

  5. Annual Waste Minimization Summary Report

    SciTech Connect

    Alfred J. Karns

    2007-01-01

    This report summarizes the waste minimization efforts undertaken by National Security Technologies, LLC (NSTec), for the U. S. Department of Energy (DOE) National Nuclear Security Administration Nevada Site Office (NNSA/NSO), during CY06. This report was developed in accordance with the requirements of the Nevada Test Site (NTS) Resource Conservation and Recovery Act (RCRA) Permit (No. NEV HW0021) and as clarified in a letter dated April 21, 1995, from Paul Liebendorfer of the Nevada Division of Environmental Protection to Donald Elle of the DOE, Nevada Operations Office. The NNSA/NSO Pollution Prevention (P2) Program establishes a process to reduce the volume and toxicity of waste generated by the NNSA/NSO and ensures that proposed methods of treatment, storage, and/or disposal of waste minimize potential threats to human health and the environment. The following information provides an overview of the P2 Program, major P2 accomplishments during the reporting year, a comparison of the current year waste generation to prior years, and a description of efforts undertaken during the year to reduce the volume and toxicity of waste generated by the NNSA/NSO.

  6. Less minimal supersymmetric standard model

    SciTech Connect

    de Gouvea, Andre; Friedland, Alexander; Murayama, Hitoshi

    1998-03-28

    Most of the phenomenological studies of supersymmetry have been carried out using the so-called minimal supergravity scenario, where one assumes a universal scalar mass, gaugino mass, and trilinear coupling at M_GUT. Even though this is a useful simplifying assumption for phenomenological analyses, it is rather too restrictive to accommodate a large variety of phenomenological possibilities. It predicts, among other things, that the lightest supersymmetric particle (LSP) is an almost pure B-ino, and that the μ-parameter is larger than the masses of the SU(2)_L and U(1)_Y gauginos. We extend the minimal supergravity framework by introducing one extra parameter: the Fayet-Iliopoulos D-term for the hypercharge U(1), D_Y. Allowing for this extra parameter, we find a much more diverse phenomenology, where the LSP is the tau sneutrino, the stau, or a neutralino with a large higgsino content. We discuss the relevance of the different possibilities to collider signatures. The same type of extension can be done to models with the gauge mediation of supersymmetry breaking. We argue that it is not wise to impose cosmological constraints on the parameter space.

  7. Symmetry breaking for drag minimization

    NASA Astrophysics Data System (ADS)

    Roper, Marcus; Squires, Todd M.; Brenner, Michael P.

    2005-11-01

    For locomotion at high Reynolds numbers, drag minimization favors fore-aft asymmetric slender shapes with blunt noses and sharp trailing edges. On the other hand, in an inertialess fluid the drag experienced by a body is independent of whether it travels forward or backward through the fluid, so there is no advantage to having a single preferred swimming direction. In fact, numerically determined minimum drag shapes are known to exhibit almost no fore-aft asymmetry even at moderate Re. We show that asymmetry persists, albeit extremely weakly, down to vanishingly small Re, scaling asymptotically as Re^3. The need to minimize drag to maximize speed for a given propulsive capacity gives one possible mechanism for the increasing asymmetry in the body plans seen in nature, as organisms increase in size and swimming speed from bacteria like E. coli up to pursuit predator fish such as tuna. If it is the dominant mechanism, then this signature scaling will be observed in the shapes of motile micro-organisms.

  8. Structural femtochemistry: experimental methodology.

    PubMed Central

    Williamson, J C; Zewail, A H

    1991-01-01

    The experimental methodology for structural femtochemistry of reactions is considered. With the extension of femtosecond transition-state spectroscopy to the diffraction regime, it is possible to obtain in a general way the trajectories of chemical reactions (change of internuclear separations with time) on the femtosecond time scale. This method, considered here for simple alkali halide dissociation, promises many applications to more complex reactions and to conformational changes. Alignment on the time scale of the experiments is also discussed. Images PMID:11607189

  9. Next-to-minimal SOFTSUSY

    NASA Astrophysics Data System (ADS)

    Allanach, B. C.; Athron, P.; Tunstall, Lewis C.; Voigt, A.; Williams, A. G.

    2014-09-01

    We describe an extension to the SOFTSUSY program that provides for the calculation of the sparticle spectrum in the Next-to-Minimal Supersymmetric Standard Model (NMSSM), where a chiral superfield that is a singlet of the Standard Model gauge group is added to the Minimal Supersymmetric Standard Model (MSSM) fields. Often, a Z3 symmetry is imposed upon the model. SOFTSUSY can calculate the spectrum in this case as well as the case where general Z3-violating terms are added to the soft supersymmetry breaking terms and the superpotential. The user provides a theoretical boundary condition for the couplings and mass terms of the singlet. Radiative electroweak symmetry breaking data along with electroweak and CKM matrix data are used as weak-scale boundary conditions. The renormalisation group equations are solved numerically between the weak scale and a high energy scale using a nested iterative algorithm. This paper serves as a manual to the NMSSM mode of the program, detailing the approximations and conventions used. Catalogue identifier: ADPM_v4_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADPM_v4_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 154886 No. of bytes in distributed program, including test data, etc.: 1870890 Distribution format: tar.gz Programming language: C++, fortran. Computer: Personal computer. Operating system: Tested on Linux 3.x. Word size: 64 bits Classification: 11.1, 11.6. Does the new version supersede the previous version?: Yes Catalogue identifier of previous version: ADPM_v3_0 Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 785 Nature of problem: Calculating supersymmetric particle spectrum and mixing parameters in the next-to-minimal supersymmetric standard model. The solution to the

  10. Update on designing and building minimal cells

    PubMed Central

    Jewett, Michael C.; Forster, Anthony C.

    2010-01-01

    Summary Minimal cells comprise only the genes and biomolecular machinery necessary for basic life. Synthesizing minimal and minimized cells will improve understanding of core biology, enhance development of biotechnology strains of bacteria, and enable evolutionary optimization of natural and unnatural biopolymers. Design and construction of minimal cells is proceeding in two different directions: “top-down” reduction of bacterial genomes in vivo and “bottom-up” integration of DNA/RNA/protein/membrane syntheses in vitro. Major progress in the last 5 years has occurred in synthetic genomics, minimization of the Escherichia coli genome, sequencing of minimal bacterial endosymbionts, identification of essential genes, and integration of biochemical systems. PMID:20638265

  11. [MINIMALLY INVASIVE AORTIC VALVE REPLACEMENT].

    PubMed

    Tabata, Minoru

    2016-03-01

    Minimally invasive aortic valve replacement (MIAVR) is defined as aortic valve replacement avoiding full sternotomy. Common approaches include a partial sternotomy, right thoracotomy, and a parasternal approach. MIAVR has been shown to have advantages over conventional AVR, such as shorter length of stay, smaller amounts of blood transfusion, and better cosmesis. However, it is also known to have disadvantages, such as longer cardiopulmonary bypass and aortic cross-clamp times and potential complications related to peripheral cannulation. Appropriate patient selection is very important. Since the procedure is more complex than conventional AVR, more intensive teamwork in the operating room is essential. Additionally, a team approach during postoperative management is critical to maximize the benefits of MIAVR. PMID:27295772

  12. Non-minimal Inflationary Attractors

    SciTech Connect

    Kallosh, Renata; Linde, Andrei E-mail: alinde@stanford.edu

    2013-10-01

    Recently we identified a new class of (super)conformally invariant theories which allow inflation even if the scalar potential is very steep in terms of the original conformal variables. Observational predictions of a broad class of such theories are nearly model-independent. In this paper we consider generalized versions of these models where the inflaton has a non-minimal coupling to gravity with a negative parameter ξ different from its conformal value -1/6. We show that these models exhibit attractor behavior. With even a slight increase of |ξ| from |ξ| = 0, predictions of these models for n_s and r rapidly converge to their universal model-independent values corresponding to conformal coupling ξ = -1/6. These values of n_s and r practically coincide with the corresponding values in the limit ξ → -∞.

  13. Strategies to Minimize Antibiotic Resistance

    PubMed Central

    Lee, Chang-Ro; Cho, Ill Hwan; Jeong, Byeong Chul; Lee, Sang Hee

    2013-01-01

    Antibiotic resistance can be reduced by using antibiotics prudently based on guidelines of antimicrobial stewardship programs (ASPs) and various data such as pharmacokinetic (PK) and pharmacodynamic (PD) properties of antibiotics, diagnostic testing, antimicrobial susceptibility testing (AST), clinical response, and effects on the microbiota, as well as by new antibiotic developments. The controlled use of antibiotics in food animals is another cornerstone among efforts to reduce antibiotic resistance. All major resistance-control strategies recommend education for patients, children (e.g., through schools and day care), the public, and relevant healthcare professionals (e.g., primary-care physicians, pharmacists, and medical students) regarding unique features of bacterial infections and antibiotics, prudent antibiotic prescribing as a positive construct, and personal hygiene (e.g., handwashing). The problem of antibiotic resistance can be minimized only by concerted efforts of all members of society for ensuring the continued efficiency of antibiotics. PMID:24036486

  14. Minimizing the pain on burnout

    SciTech Connect

    Billings, A.

    1985-03-01

    An investment in an oil and gas shelter warrants an additional investment to fund tax liability on burnout. A relatively liquid and low-risk investment is preferable so as to assure timely satisfaction of tax liability when burnout occurs. If an investor decides to allow the shelter to die a timely death, the investment funds could be used to fund annual tax liability. In situations where a leak develops, the fund will once again be invaluable. When a leak or burnout occurs, investors may be able to do no more than minimize their maximum losses. Relief of debt on most dispositions will be deemed receipt of cash, thus triggering gains. Ordinary income will result by operation of Code Sections 1245, 1250, and 1254. Bankruptcy or a charitable contribution will grant limited reprieve from tax losses; however, economic losses will still result.

  15. Minimal unitary (covariant) scattering theory

    SciTech Connect

    Lindesay, J.V.; Markevich, A.

    1983-06-01

    In the minimal three-particle equations developed by Lindesay, the two-body input amplitude was an on-shell relativistic generalization of the non-relativistic scattering model characterized by a single mass parameter μ, which in the two-body (m + m) system looks like an s-channel bound state (μ < 2m) or virtual state (μ > 2m). Using this driving term in covariant Faddeev equations generates a rich covariant and unitary three-particle dynamics. However, the simplest way of writing the relativistic generalization of the Faddeev equations can take the on-shell Mandelstam parameter s = 4(q² + m²), in terms of which the two-particle input is expressed, to negative values in the range of integration required by the dynamics. This problem was met in the original treatment by multiplying the two-particle input amplitude by Θ(s). This paper provides what we hope to be a more direct way of meeting the problem.

  16. Minimally packed phases in holography

    NASA Astrophysics Data System (ADS)

    Donos, Aristomenis; Gauntlett, Jerome P.

    2016-03-01

    We numerically construct asymptotically AdS black brane solutions of D = 4 Einstein-Maxwell theory coupled to a pseudoscalar. The solutions are holographically dual to d = 3 CFTs at finite chemical potential and in a constant magnetic field, which spontaneously break translation invariance leading to the spontaneous formation of abelian and momentum magnetisation currents flowing around the plaquettes of a periodic Bravais lattice. We analyse the three-dimensional moduli space of lattice solutions, which are generically oblique, and show, for a specific value of the magnetic field, that the free energy is minimised by the triangular lattice, associated with minimal packing of circles in the plane. We show that the average stress tensor for the thermodynamically preferred phase is that of a perfect fluid and that this result applies more generally to spontaneously generated periodic phases. The triangular structure persists at low temperatures indicating the existence of novel crystalline ground states.

  17. The minimal composite Higgs model

    NASA Astrophysics Data System (ADS)

    Agashe, Kaustubh; Contino, Roberto; Pomarol, Alex

    2005-07-01

    We study the idea of a composite Higgs in the framework of a five-dimensional AdS theory. We present the minimal model of the Higgs as a pseudo-Goldstone boson in which electroweak symmetry is broken dynamically via top loop effects, all flavour problems are solved, and contributions to electroweak precision observables are below experimental bounds. Since the 5D theory is weakly coupled, we are able to fully determine the Higgs potential and other physical quantities. The lightest resonances are expected to have a mass around 2 TeV and should be discovered at the LHC. The top sector is mostly composite and deviations from Standard Model couplings are expected.

  18. Minimally invasive posterior hamstring harvest.

    PubMed

    Wilson, Trent J; Lubowitz, James H

    2013-01-01

    Autogenous hamstring harvesting for knee ligament reconstruction is a well-established standard. Minimally invasive posterior hamstring harvest is a simple, efficient, reproducible technique for harvest of the semitendinosus or gracilis tendon or both medial hamstring tendons. A 2- to 3-cm longitudinal incision from the popliteal crease proximally, in line with the semitendinosus tendon, is sufficient. The deep fascia is bluntly penetrated, and the tendon or tendons are identified. Adhesions are dissected. Then, an open tendon stripper is used to release the tendon or tendons proximally; a closed, sharp tendon stripper is used to release the tendon or tendons from the pes. Layered, absorbable skin closure is performed, and the skin is covered with a skin sealant, bolster dressing, and plastic adhesive bandage for 2 weeks. PMID:24266003

  19. Minimally Invasive Spigelian Hernia Repair

    PubMed Central

    Baucom, Catherine; Nguyen, Quan D.; Hidalgo, Marco

    2009-01-01

    Introduction: Spigelian hernia is an uncommon ventral hernia characterized by a defect in the linea semilunaris. Repair of spigelian hernia has traditionally been accomplished via an open transverse incision and primary repair. The purpose of this article is to present 2 case reports of incarcerated spigelian hernia that were successfully repaired laparoscopically using Gortex mesh and to present a review of the literature regarding laparoscopic repair of spigelian hernias. Methods: Retrospective chart review and Medline literature search. Results: Two patients underwent laparoscopic mesh repair of incarcerated spigelian hernias. Both were started on a regular diet on postoperative day 1 and discharged on postoperative days 2 and 3. One patient developed a seroma that resolved without intervention. There was complete resolution of preoperative symptoms at the 12-month follow-up. Conclusion: Minimally invasive repair of spigelian hernias is an alternative to the traditional open surgical technique. Further studies are needed to directly compare the open and the laparoscopic repair. PMID:19660230

  20. A minimal little Higgs model

    NASA Astrophysics Data System (ADS)

    Barceló, Roberto; Masip, Manuel

    2008-11-01

    We discuss a little Higgs scenario that introduces below the TeV scale just the two minimal ingredients of these models, a vectorlike T quark and a singlet component (implying anomalous couplings) in the Higgs field, together with a pseudoscalar singlet η. In the model, which is a variation of Schmaltz’s simplest little Higgs model, all the extra vector bosons are much heavier than the T quark. In the Yukawa sector the global symmetry is approximate, implying a single large coupling per flavor, whereas in the scalar sector it is only broken at the loop level. We obtain the one-loop effective potential and show that it provides acceptable masses for the Higgs h and for the singlet η with no need for an extra μ term. We find that mη can be larger than mh/2, which would forbid the (otherwise dominant) decay mode h→ηη.

  1. Natural supersymmetric minimal dark matter

    NASA Astrophysics Data System (ADS)

    Fabbrichesi, Marco; Urbano, Alfredo

    2016-03-01

    We show how the Higgs boson mass is protected from the potentially large corrections due to the introduction of minimal dark matter if the new physics sector is made supersymmetric. The fermionic dark matter candidate (a 5-plet of SU(2)_L) is accompanied by a scalar state. The weak gauge sector is made supersymmetric, and the Higgs boson is embedded in a supersymmetric multiplet. The remaining standard model states are nonsupersymmetric. Nonvanishing corrections to the Higgs boson mass only appear at three-loop level, and the model is natural for dark matter masses up to 15 TeV, a value larger than the one required by the cosmological relic density. The construction presented stands as an example of a general approach to naturalness that solves the little hierarchy problem which arises when new physics is added beyond the standard model at an energy scale around 10 TeV.

  2. Chemical basis for minimal cognition.

    PubMed

    Hanczyc, Martin M; Ikegami, Takashi

    2010-01-01

    We have developed a simple chemical system capable of self-movement in order to study the physicochemical origins of movement. We propose how this system may be useful in the study of minimal perception and cognition. The system consists simply of an oil droplet in an aqueous environment. A chemical reaction within the oil droplet induces an instability, the symmetry of the oil droplet breaks, and the droplet begins to move through the aqueous phase. The complement of physical phenomena that is then generated indicates the presence of feedback cycles that, as will be argued, form the basis for self-regulation, homeostasis, and perhaps an extended form of autopoiesis. We discuss the result that simple chemical systems are capable of sensory-motor coupling and possess a homeodynamic state from which cognitive processes may emerge. PMID:20586578

  3. Minimally invasive radioguided parathyroidectomy (MIRP).

    PubMed

    Goldstein, R E; Martin, W H; Richards, K

    2003-06-01

    The technique of parathyroidectomy has traditionally involved a bilateral exploration of the neck with the intent of visualizing 4 parathyroid glands and resecting pathologically enlarged glands. Parathyroid scanning using technetium-99m sestamibi has evolved and can now localize 80% to 90% of parathyroid adenomas. The technique of minimally invasive radioguided parathyroidectomy (MIRP) is a surgical option for most patients with primary hyperparathyroidism and a positive preoperative parathyroid scan. The technique makes use of a hand-held gamma probe that is used intraoperatively to guide the dissection in a highly directed manner with the procedure often performed under local anesthesia. The technique results in excellent cure rates while allowing most patients to leave the hospital within a few hours after the completion of the procedure. Current data also suggest the procedure can decrease hospital charges by approximately 50%. This technique may significantly change the management of primary hyperparathyroidism. PMID:12955045

  4. Waste minimization in analytical methods

    SciTech Connect

    Green, D.W.; Smith, L.L.; Crain, J.S.; Boparai, A.S.; Kiely, J.T.; Yaeger, J.S.; Schilling, J.B.

    1995-05-01

    The US Department of Energy (DOE) will require a large number of waste characterizations over a multi-year period to accomplish the Department's goals in environmental restoration and waste management. Estimates vary, but two million analyses annually are expected. The waste generated by the analytical procedures used for characterizations is a significant source of new DOE waste. Success in reducing the volume of secondary waste and the costs of handling this waste would significantly decrease the overall cost of this DOE program. Selection of appropriate analytical methods depends on the intended use of the resultant data. It is not always necessary to use a high-powered analytical method, typically at higher cost, to obtain data needed to make decisions about waste management. Indeed, for samples taken from some heterogeneous systems, the meaning of high accuracy becomes clouded if the data generated are intended to measure a property of this system. Among the factors to be considered in selecting the analytical method are the lower limit of detection, accuracy, turnaround time, cost, reproducibility (precision), interferences, and simplicity. Occasionally, there must be tradeoffs among these factors to achieve the multiple goals of a characterization program. The purpose of the work described here is to add waste minimization to the list of characteristics to be considered. In this paper the authors present results of modifying analytical methods for waste characterization to reduce both the cost of analysis and volume of secondary wastes. Although tradeoffs may be required to minimize waste while still generating data of acceptable quality for the decision-making process, they have data demonstrating that wastes can be reduced in some cases without sacrificing accuracy or precision.

  5. Minimizing communication cost among distributed controllers in software defined networks

    NASA Astrophysics Data System (ADS)

    Arlimatti, Shivaleela; Elbreiki, Walid; Hassan, Suhaidi; Habbal, Adib; Elshaikh, Mohamed

    2016-08-01

    Software Defined Networking (SDN) is a new paradigm that increases the flexibility of today's networks by promising a programmable network. The fundamental idea behind this new architecture is to simplify network complexity by decoupling the control plane and data plane of the network devices, and by making the control plane centralized. Recently, controllers have been distributed to solve the problem of a single point of failure and to increase scalability and flexibility during workload distribution. Even though controllers are flexible and scalable enough to accommodate a greater number of network switches, the intercommunication cost between distributed controllers remains a challenging issue in the Software Defined Network environment. This paper aims to fill the gap by proposing a new mechanism that minimizes intercommunication cost via graph partitioning, an NP-hard problem. The methodology proposed in this paper swaps network elements between controller domains to minimize communication cost by calculating a communication gain. The swapping of elements minimizes inter- and intra-domain communication cost among network domains. We validate our work with the OMNeT++ simulation environment tool. Simulation results show that the proposed mechanism minimizes the inter-domain communication cost among controllers compared to traditional distributed controllers.
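
    The swapping mechanism the abstract describes (moving a switch to another controller domain when its external traffic outweighs its internal traffic) can be sketched as greedy gain-based partition refinement. The names and the edge-weight traffic model below are hypothetical illustrations, not taken from the paper:

```python
# Hypothetical sketch: gain-based swapping of switches between two
# controller domains. Edge weights model switch-to-switch traffic;
# moving a switch whose external traffic exceeds its internal traffic
# reduces the inter-domain communication cost.

def inter_domain_cost(edges, assignment):
    """Total weight of edges whose endpoints sit in different domains."""
    return sum(w for (u, v), w in edges.items()
               if assignment[u] != assignment[v])

def swap_gain(node, edges, assignment):
    """External minus internal traffic for `node` (positive => move helps)."""
    external = internal = 0
    for (u, v), w in edges.items():
        if node in (u, v):
            other = v if u == node else u
            if assignment[other] == assignment[node]:
                internal += w
            else:
                external += w
    return external - internal

def partition(edges, assignment, max_passes=10):
    """Greedily move the best-gain switch until no positive gain remains."""
    for _ in range(max_passes):
        best = max(assignment, key=lambda n: swap_gain(n, edges, assignment))
        if swap_gain(best, edges, assignment) <= 0:
            break
        assignment[best] = 1 - assignment[best]  # move to the other domain
    return assignment
```

    Iterating until no positive gain remains is the classic Kernighan-Lin-style heuristic for this NP-hard partitioning problem.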

  6. Minimizing Variation in Outdoor CPV Power Ratings: Preprint

    SciTech Connect

    Muller, M.; Marion, B.; Rodriguez, J.; Kurtz, S.

    2011-07-01

    The CPV community has agreed to have both indoor and outdoor power ratings at the module level. The indoor rating provides a repeatable measure of module performance as it leaves the factory line, while the outdoor rating provides a measure of true performance under real-world conditions. The challenge with an outdoor rating is that the spectrum, temperature, wind speed, etc., are constantly in flux, and therefore the resulting power rating varies from day to day and month to month. This work examines different methodologies for determining the outdoor power rating with the goal of minimizing variation even if data are collected under changing meteorological conditions.

  7. Minimal length uncertainty and accelerating universe

    NASA Astrophysics Data System (ADS)

    Farmany, A.; Mortazavi, S. S.

    2016-06-01

    In this paper, minimal length uncertainty is used as a constraint to solve the Friedman equation. It is shown that, based on the minimal length uncertainty principle, the Hubble scale is decreasing which corresponds to an accelerating universe.
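
    The abstract does not state the specific uncertainty relation used; the minimal-length (generalized) uncertainty principle commonly assumed in this literature has the form

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^2\right),
\qquad
\Delta x_{\min} \;=\; \hbar\sqrt{\beta},
```

    where the parameter $\beta$ sets the minimal observable length; taking such a relation as a constraint on the Friedmann equation is the step the authors describe.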

  8. Architectural Methodology Report

    NASA Technical Reports Server (NTRS)

    Dhas, Chris

    2000-01-01

    The establishment of conventions between two communicating entities in the end systems is essential for communications. Examples of the kinds of decisions that need to be made in establishing a protocol convention include the nature of the data representation, the format and the speed of the data representation over the communications path, and the sequence of control messages (if any) which are sent. One of the main functions of a protocol is to establish a standard path between the communicating entities. This is necessary to create a virtual communications medium with certain desirable characteristics. In essence, it is the function of the protocol to transform the characteristics of the physical communications environment into a more useful virtual communications model. The final function of a protocol is to establish standard data elements for communications over the path; that is, the protocol serves to create a virtual data element for exchange. Other systems may be constructed in which the transferred element is a program or a job. Finally, there are special purpose applications in which the element to be transferred may be a complex structure such as all or part of a graphic display. NASA's Glenn Research Center (GRC) defines and develops advanced technology for high priority national needs in communications technologies for application to aeronautics and space. GRC tasked Computer Networks and Software Inc. (CNS) to describe the methodologies used in developing a protocol architecture for an in-space Internet node. The node would support NASA's four mission areas: Earth Science; Space Science; Human Exploration and Development of Space (HEDS); and Aerospace Technology. This report presents the methodology for developing the protocol architecture. The methodology addresses the architecture for a computer communications environment. It does not address an analog voice architecture.

  9. Injector element characterization methodology

    NASA Technical Reports Server (NTRS)

    Cox, George B., Jr.

    1988-01-01

    Characterization of liquid rocket engine injector elements is an important part of the development process for rocket engine combustion devices. Modern nonintrusive instrumentation for flow velocity and spray droplet size measurement, and automated, computer-controlled test facilities allow rapid, low-cost evaluation of injector element performance and behavior. Application of these methods in rocket engine development, paralleling their use in gas turbine engine development, will reduce rocket engine development cost and risk. The Alternate Turbopump (ATP) Hot Gas Systems (HGS) preburner injector elements were characterized using such methods, and the methodology and some of the results obtained will be shown.

  10. Closed locally minimal nets on tetrahedra

    SciTech Connect

    Strelkova, Nataliya P

    2011-01-31

    Closed locally minimal networks are in a sense a generalization of closed geodesics. A complete classification is known of closed locally minimal networks on regular (and generally any equihedral) tetrahedra. In the present paper certain necessary and certain sufficient conditions are given for at least one closed locally minimal network to exist on a given non-equihedral tetrahedron. Bibliography: 6 titles.

  11. Mini-Med School Planning Guide

    ERIC Educational Resources Information Center

    National Institutes of Health, Office of Science Education, 2008

    2008-01-01

    Mini-Med Schools are public education programs now offered by more than 70 medical schools, universities, research institutions, and hospitals across the nation. There are even Mini-Med Schools in Ireland, Malta, and Canada! The program is typically a lecture series that meets once a week and provides "mini-med students" information on some of the…

  12. Methodology for Developing a Crop Yield Stability Map for a Field

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This abstract will summarize the methodology used to develop a yield stability map for a field. We proposed that there exist yield stability patterns for commercial field crop production which growers can use to optimize crop production while minimizing inputs. The methodology uses multiple years o...

  13. Relative Hazard Calculation Methodology

    SciTech Connect

    DL Strenge; MK White; RD Stenner; WB Andrews

    1999-09-07

    The methodology presented in this document was developed to provide a means of calculating relative hazard (RH) ratios to use in developing useful graphic illustrations. The RH equation, as presented in this methodology, is primarily a collection of key factors relevant to understanding the hazards and risks associated with projected risk management activities. The RH equation has the potential for much broader application than generating risk profiles. For example, it can be used to compare one risk management activity with another, instead of just comparing it to a fixed baseline as was done for the risk profiles. If the appropriate source term data are available, it could be used in its non-ratio form to estimate absolute values of the associated hazards. These estimated values of hazard could then be examined to help understand which risk management activities are addressing the higher hazard conditions at a site. Graphics could be generated from these absolute hazard values to compare high-hazard conditions. If the RH equation is used in this manner, care must be taken to specifically define and qualify the estimated absolute hazard values (e.g., identify which factors were considered and which ones tended to drive the hazard estimation).

  14. Regional Expansion of Minimally Invasive Surgery for Hysterectomy: Implementation and Methodology in a Large Multispecialty Group

    PubMed Central

    Andryjowicz, Esteban; Wray, Teresa

    2011-01-01

    Introduction: Approximately 600,000 hysterectomies are performed in the US each year, making hysterectomy the second most common major operation performed in women. Several methods can be used to perform this procedure. In 2009, a Cochrane Review concluded “that vaginal hysterectomy should be performed in preference to abdominal hysterectomy, where possible. Where vaginal hysterectomy is not possible, a laparoscopic approach may avoid the need for an abdominal hysterectomy. Risks and benefits of different approaches may however be influenced by the surgeon's experience. More research is needed, particularly to examine the long-term effects of the different types of surgery.” This article reviews the steps that a large multispecialty group used to teach non-open hysterectomy methods to improve the quality of care for their patients and to decrease the number of inpatient procedures and therefore costs. The percentages of each type of hysterectomy performed yearly between 2005 and 2010 were calculated, as well as the length of stay (LOS) for each method. Methods: A structured educational intervention with both didactic and hands-on exercises was created and rolled out to 12 medical centers. All patients undergoing hysterectomy for benign conditions through the Southern California Permanente Medical Group (a large multispecialty group that provides medical care to Kaiser Permanente patients in Southern California) between 2005 and 2010 were included. This amounted to 26,055 hysterectomies for benign conditions being performed by more than 350 obstetrician/gynecologists (Ob/Gyns). Results: More than 300 Ob/Gyns took the course across 12 medical centers. On the basis of hospital discharge data, the total number of hysterectomies, types of hysterectomies, and LOS for each type were identified for each year. Between 2005 and 2010, the rate of non-open hysterectomies has increased 120% (from 38% to 78%) and the average LOS has decreased 31%. PMID:22319415

  15. Analysis of drug combinations: current methodological landscape

    PubMed Central

    Foucquier, Julie; Guedj, Mickael

    2015-01-01

    Combination therapies offer the chance of better efficacy, decreased toxicity, and reduced development of drug resistance; owing to these advantages, they have become a standard for the treatment of several diseases and continue to represent a promising approach in indications of unmet medical need. In this context, studying the effects of a combination of drugs in order to provide evidence of a significant superiority compared to the single agents is of particular interest. Research in this field has resulted in a large number of papers and revealed several issues. Here, we propose an overview of the current methodological landscape concerning the study of combination effects. First, we aim to provide the minimal set of mathematical and pharmacological concepts necessary to understand the most commonly used approaches, divided into effect-based approaches and dose–effect-based approaches, and introduced in light of their respective practical advantages and limitations. Then, we discuss six main common methodological issues that scientists have to face at each step of the development of new combination therapies. In particular, in the absence of a reference methodology suitable for all biomedical situations, the analysis of drug combinations should benefit from a collective, appropriate, and rigorous application of the concepts and methods reviewed here. PMID:26171228
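
    Of the effect-based approaches such a review typically covers, Bliss independence is the simplest to state. The sketch below (illustrative numbers only, not data from the paper) computes the expected combined effect of two drugs under Bliss independence and an excess-over-Bliss synergy score:

```python
def bliss_expected(e_a, e_b):
    """Expected fractional effect of drugs A and B combined, assuming
    they act independently (Bliss independence): E = Ea + Eb - Ea*Eb."""
    return e_a + e_b - e_a * e_b

def excess_over_bliss(e_observed, e_a, e_b):
    """Observed minus expected effect: positive values suggest synergy,
    negative values suggest antagonism."""
    return e_observed - bliss_expected(e_a, e_b)

# Illustrative numbers: drug A inhibits 30%, drug B inhibits 40% alone,
# so the Bliss-expected combined inhibition is 0.3 + 0.4 - 0.12 = 0.58.
expected = bliss_expected(0.3, 0.4)
```

    Dose-effect-based approaches such as Loewe additivity instead compare doses along the single-agent dose-response curves, which requires fitted curves rather than single effect values.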

  16. Against Explanatory Minimalism in Psychiatry

    PubMed Central

    Thornton, Tim

    2015-01-01

    The idea that psychiatry contains, in principle, a series of levels of explanation has been criticized not only as empirically false but also, by Campbell, as unintelligible because it presupposes a discredited pre-Humean view of causation. Campbell’s criticism is based on an interventionist-inspired denial that mechanisms and rational connections underpin physical and mental causation, respectively, and hence underpin levels of explanation. These claims echo some superficially similar remarks in Wittgenstein’s Zettel. But attention to the context of Wittgenstein’s remarks suggests a reason to reject explanatory minimalism in psychiatry and reinstate a Wittgensteinian notion of levels of explanation. Only in a context broader than the one provided by interventionism is the ascription of propositional attitudes, even in the puzzling case of delusions, justified. Such a view, informed by Wittgenstein, can reconcile the idea that the ascription of mental phenomena presupposes a particular level of explanation with the rejection of an a priori claim about its connection to a neurological level of explanation. PMID:26696908

  17. Differentially Private Empirical Risk Minimization

    PubMed Central

    Chaudhuri, Kamalika; Monteleoni, Claire; Sarwate, Anand D.

    2011-01-01

    Privacy-preserving machine learning algorithms are crucial for the increasingly common setting in which personal data, such as medical or financial records, are analyzed. We provide general techniques to produce privacy-preserving approximations of classifiers learned via (regularized) empirical risk minimization (ERM). These algorithms are private under the ε-differential privacy definition due to Dwork et al. (2006). First we apply the output perturbation ideas of Dwork et al. (2006), to ERM classification. Then we propose a new method, objective perturbation, for privacy-preserving machine learning algorithm design. This method entails perturbing the objective function before optimizing over classifiers. If the loss and regularizer satisfy certain convexity and differentiability criteria, we prove theoretical results showing that our algorithms preserve privacy, and provide generalization bounds for linear and nonlinear kernels. We further present a privacy-preserving technique for tuning the parameters in general machine learning algorithms, thereby providing end-to-end privacy guarantees for the training process. We apply these results to produce privacy-preserving analogues of regularized logistic regression and support vector machines. We obtain encouraging results from evaluating their performance on real demographic and benchmark data sets. Our results show that both theoretically and empirically, objective perturbation is superior to the previous state-of-the-art, output perturbation, in managing the inherent tradeoff between privacy and learning performance. PMID:21892342
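
    A minimal sketch of the output perturbation step, assuming a 1-Lipschitz loss and L2 regularization as in Chaudhuri et al.; the function name and interface are hypothetical:

```python
import numpy as np

def output_perturbation(w_erm, n, lam, eps, rng=None):
    """Add noise to an ERM minimizer w_erm (output perturbation sketch).

    Under a 1-Lipschitz loss and L2 regularizer, the L2 sensitivity of the
    minimizer is 2/(n*lam), so noise with density proportional to
    exp(-(n*lam*eps/2) * ||b||) gives eps-differential privacy. Such noise
    is sampled here as a uniformly random direction with norm drawn from
    Gamma(d, 2/(n*lam*eps))."""
    rng = np.random.default_rng(rng)
    d = w_erm.shape[0]
    direction = rng.normal(size=d)
    direction /= np.linalg.norm(direction)          # uniform on the sphere
    norm = rng.gamma(shape=d, scale=2.0 / (n * lam * eps))
    return w_erm + norm * direction
```

    Objective perturbation, which the paper finds superior, instead adds a random linear term to the objective before optimizing; the noise calibration above applies only to the output perturbation variant.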

  18. Against Explanatory Minimalism in Psychiatry.

    PubMed

    Thornton, Tim

    2015-01-01

    The idea that psychiatry contains, in principle, a series of levels of explanation has been criticized not only as empirically false but also, by Campbell, as unintelligible because it presupposes a discredited pre-Humean view of causation. Campbell's criticism is based on an interventionist-inspired denial that mechanisms and rational connections underpin physical and mental causation, respectively, and hence underpin levels of explanation. These claims echo some superficially similar remarks in Wittgenstein's Zettel. But attention to the context of Wittgenstein's remarks suggests a reason to reject explanatory minimalism in psychiatry and reinstate a Wittgensteinian notion of levels of explanation. Only in a context broader than the one provided by interventionism is the ascription of propositional attitudes, even in the puzzling case of delusions, justified. Such a view, informed by Wittgenstein, can reconcile the idea that the ascription of mental phenomena presupposes a particular level of explanation with the rejection of an a priori claim about its connection to a neurological level of explanation. PMID:26696908

  19. Minimalism through intraoperative functional mapping.

    PubMed

    Berger, M S

    1996-01-01

    Intraoperative stimulation mapping may be used to avoid unnecessary risk to functional regions subserving language and sensori-motor pathways. Based on the data presented here, language localization is variable in the entire population, with certainty existing only for the inferior frontal region responsible for motor speech. Anatomical landmarks such as the anterior temporal tip for temporal lobe language sites and the posterior aspect of the lateral sphenoid wing for the frontal lobe language zones are unreliable in avoiding postoperative aphasias. Thus, individual mapping to identify essential language sites has the greatest likelihood of avoiding permanent deficits in naming, reading, and motor speech. In a similar approach, motor and sensory pathways from the cortex and underlying white matter may be reliably stimulated and mapped in both awake and asleep patients. Although these techniques require additional operative time and nominally priced equipment, the result is often gratifying, as postoperative morbidity has been greatly reduced in the process of incorporating these surgical strategies. The patient's quality of life is improved in terms of seizure control, with or without antiepileptic drugs. This avoids having to perform a second costly operative procedure, which is routinely done when extraoperative stimulation and recording are done via subdural grids. In addition, an aggressive tumor resection at the initial operation lengthens the time to tumor recurrence and often obviates the need for a subsequent reoperation. Thus, intraoperative functional mapping may best be described as a surgical technique that results in "minimalism in the long term". PMID:9247814

  20. Minimally invasive medial hip approach.

    PubMed

    Chiron, P; Murgier, J; Cavaignac, E; Pailhé, R; Reina, N

    2014-10-01

    The medial approach to the hip via the adductors, as described by Ludloff or Ferguson, provides restricted visualization and incurs a risk of neurovascular lesion. We describe a minimally invasive medial hip approach providing broader exposure of extra- and intra-articular elements in a space free of neurovascular structures. With the lower limb in a "frog-leg" position, the skin incision follows the adductor longus for 6 cm and then the aponeurosis is incised. A slide plane between all the adductors and the aponeurosis is easily released by blunt dissection, with no interposed neurovascular elements. This gives access to the lesser trochanter, psoas tendon and inferior sides of the femoral neck and head, anterior wall of the acetabulum and labrum. We report a series of 56 cases, with no major complications: this approach allows treatment of iliopsoas muscle lesions and resection or filling of benign tumors of the cervical region and enables intra-articular surgery (arthrolysis, resection of osteophytes or foreign bodies, labral suture). PMID:25164350

  1. On eco-efficient technologies to minimize industrial water consumption

    NASA Astrophysics Data System (ADS)

    Amiri, Mohammad C.; Mohammadifard, Hossein; Ghaffari, Ghasem

    2016-07-01

    Purpose - Water scarcity will put further stress on available water systems and decrease the security of water in many areas. Therefore, innovative methods to minimize industrial water usage and waste production are of paramount importance in extending fresh water resources, the main life support systems in many arid regions of the world. This paper demonstrates that there are good opportunities for many industries to save water and decrease waste water in the softening process by substituting traditional methods with eco-friendly ones. The patented puffing method is an eco-efficient and viable technology for water saving and waste reduction in the lime softening process. Design/methodology/approach - The lime softening process (LSP) is very sensitive to chemical reactions. In addition, optimal monitoring not only minimizes the sludge that must be disposed of but also reduces the operating costs of water conditioning. The weakness of the current (regular) control of LSP based on chemical analysis has been demonstrated experimentally and compared with the eco-efficient puffing method. Findings - This paper demonstrates that there is a good opportunity for many industries to save water and decrease waste water in the softening process by substituting the traditional method with the puffing method, a patented eco-efficient technology. Originality/value - Details of the innovative work required to minimize industrial water usage and waste production are outlined in this paper. Employing the novel puffing method for monitoring of the lime softening process results in saving a considerable amount of water while reducing chemical sludge.

  2. Cancer Cytogenetics: Methodology Revisited

    PubMed Central

    2014-01-01

    The Philadelphia chromosome was the first genetic abnormality discovered in cancer (in 1960), and it was found to be consistently associated with CML. The description of the Philadelphia chromosome ushered in a new era in the field of cancer cytogenetics. Accumulating genetic data have been shown to be intimately associated with the diagnosis and prognosis of neoplasms; thus, karyotyping is now considered a mandatory investigation for all newly diagnosed leukemias. The development of FISH in the 1980s overcame many of the drawbacks of assessing the genetic alterations in cancer cells by karyotyping. Karyotyping of cancer cells remains the gold standard since it provides a global analysis of the abnormalities in the entire genome of a single cell. However, subsequent methodological advances in molecular cytogenetics based on the principle of FISH that were initiated in the early 1990s have greatly enhanced the efficiency and accuracy of karyotype analysis by marrying conventional cytogenetics with molecular technologies. In this review, the development, current utilization, and technical pitfalls of both the conventional and molecular cytogenetics approaches used for cancer diagnosis over the past five decades will be discussed. PMID:25368816

  3. Methodological Problems of Nanotechnoscience

    NASA Astrophysics Data System (ADS)

    Gorokhov, V. G.

    Recently, we have reported on the definitions of nanotechnology as a new type of NanoTechnoScience and on nanotheory as a cluster of different natural and engineering theories. Nanotechnology is not only a new type of scientific-engineering discipline; it also evolves in a “nonclassical” way. Nanoontology, or the nano scientific world view, serves as a methodological orientation for choosing the theoretical means and methods for solving scientific and engineering problems. This allows one to change from one explanation and scientific world view to another without any problems. Thus, nanotechnology is both a field of scientific knowledge and a sphere of engineering activity; in other words, NanoTechnoScience is similar to Systems Engineering, the analysis and design of large-scale, complex, man/machine systems, but applied to micro- and nanosystems. Nano systems engineering, like macro systems engineering, includes not only systems design but also complex research. The design orientation influences the change of priorities in complex research and the relation to knowledge: not only “the knowledge about something”, but also knowledge as a means of activity; from the beginning, control and restructuring of matter at the nano-scale is a necessary element of nanoscience.

  4. Heart bypass surgery - minimally invasive - discharge

    MedlinePlus


  5. Prioritization Methodology for Chemical Replacement

    NASA Technical Reports Server (NTRS)

    Cruit, W.; Schutzenhofer, S.; Goldberg, B.; Everhart, K.

    1993-01-01

    This project serves to define an appropriate methodology for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weigh the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results are being implemented as a guideline for consideration for current NASA propulsion systems.

  6. Dosimetric methodology of the ICRP

    SciTech Connect

    Eckerman, K.F.

    1994-12-31

    Establishment of guidance for the protection of workers and members of the public from radiation exposures necessitates estimation of the radiation dose to tissues of the body at risk. The dosimetric methodology formulated by the International Commission on Radiological Protection (ICRP) is intended to be responsive to this need. While developed for radiation protection, elements of the methodology are often applied in addressing other radiation issues; e.g., risk assessment. This chapter provides an overview of the methodology, discusses its recent extension to age-dependent considerations, and illustrates specific aspects of the methodology through a number of numerical examples.

  7. Development methodology for scientific software

    SciTech Connect

    Cort, G.; Goldstone, J.A.; Nelson, R.O.; Poore, R.V.; Miller, L.; Barrus, D.M.

    1985-01-01

    We present the details of a software development methodology that addresses all phases of the software life cycle, yet is well suited for application by small projects with limited resources. The methodology has been developed at the Los Alamos Weapons Neutron Research (WNR) Facility and was utilized during the recent development of the WNR Data Acquisition Command Language. The methodology emphasizes the development and maintenance of comprehensive documentation for all software components. The impact of the methodology upon software quality and programmer productivity is assessed.

  8. Status of sonic boom methodology and understanding

    NASA Technical Reports Server (NTRS)

    Darden, Christine M.; Powell, Clemans A.; Hayes, Wallace D.; George, Albert R.; Pierce, Allan D.

    1989-01-01

    In January 1988, approximately 60 representatives of industry, academia, government, and the military gathered at NASA-Langley for a 2 day workshop on the state-of-the-art of sonic boom physics, methodology, and understanding. The purpose of the workshop was to assess the sonic boom area, to determine areas where additional sonic boom research is needed, and to establish some strategies and priorities in this sonic boom research. Attendees included many internationally recognized sonic boom experts who had been very active in the Supersonic Transport (SST) and Supersonic Cruise Aircraft Research Programs of the 60's and 70's. Summaries of the assessed state-of-the-art and the research needs in theory, minimization, atmospheric effects during propagation, and human response are given.

  9. Locus minimization in breed prediction using artificial neural network approach.

    PubMed

    Iquebal, M A; Ansari, M S; Sarika; Dixit, S P; Verma, N K; Aggarwal, R A K; Jayakumar, S; Rai, A; Kumar, D

    2014-12-01

    Molecular markers, viz. microsatellites and single nucleotide polymorphisms, have revolutionized breed identification through the use of small samples of biological tissue or germplasm, such as blood, carcass samples, embryos, ova and semen, that show no evident phenotype. Classical tools of molecular data analysis for breed identification have limitations, such as the unavailability of referral breed data, causing increased cost of collection each time, compromised computational accuracy and complexity of the methodology used. We report here the successful use of an artificial neural network (ANN), running in the background, to decrease the cost of genotyping by locus minimization. The webserver is freely accessible (http://nabg.iasri.res.in/bisgoat) to the research community. We demonstrate that the machine learning (ANN) approach for breed identification offers multifold advantages, such as locus minimization, leading to a drastic reduction in cost, and web availability of reference breed data, alleviating the need for repeated genotyping each time one investigates the identity of an unknown breed. To develop this ANN-based web implementation, we used 51,850 samples of allelic data from microsatellite-marker-based DNA fingerprinting on 25 loci covering 22 registered goat breeds of India for training. By minimizing the panel to nine loci with a multilayer perceptron model, we achieved 96.63% training accuracy. This server can be an indispensable tool for identification of existing breeds and new synthetic commercial breeds, leading to protection of intellectual property in cases of sovereignty and bio-piracy disputes. It can also serve widely as a model for cost reduction by locus minimization in variety, breed and/or line identification of various other flora and fauna, especially in conservation and improvement programs. PMID:25183434

  10. WASTE MINIMIZATION ASSESSMENT FOR A DAIRY

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has funded a pilot project to assist small- and medium-size manufacturers who want to minimize their generation of waste but who lack the expertise to do so. In an effort to assist these manufacturers, Waste Minimization Assessment Ce...

  11. Is goal ascription possible in minimal mindreading?

    PubMed

    Butterfill, Stephen A; Apperly, Ian A

    2016-03-01

    In this response to the commentary by Michael and Christensen, we first explain how minimal mindreading is compatible with the development of increasingly sophisticated mindreading behaviors that involve both executive functions and general knowledge, and we then sketch one approach to a minimal account of goal ascription. PMID:26901746

  12. WASTE MINIMIZATION ASSESSMENT FOR A BOURBON DISTILLERY

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has funded a pilot project to assist small- and medium-size manufacturers who want to minimize their generation of waste but who lack the expertise to do so. Waste Minimization Assessment Centers (WMACs) were established at selected un...

  13. Making the Most of Minimalism in Music.

    ERIC Educational Resources Information Center

    Geiersbach, Frederick J.

    1998-01-01

    Describes the minimalist movement in music. Discusses generations of minimalist musicians and, in general, the minimalist approach. Considers various ways that minimalist strategies can be integrated into the music classroom focusing on (1) minimalism and (2) student-centered composition and principles of minimalism for use with elementary band…

  14. Minimizing electrode contamination in an electrochemical cell

    DOEpatents

    Kim, Yu Seung; Zelenay, Piotr; Johnston, Christina

    2014-12-09

    An electrochemical cell assembly that is expected to prevent or at least minimize electrode contamination includes one or more getters that trap a component or components leached from a first electrode and prevents or at least minimizes them from contaminating a second electrode.

  15. Methodological Pluralism and Narrative Inquiry

    ERIC Educational Resources Information Center

    Michie, Michael

    2013-01-01

    This paper considers how the integral theory model of Nancy Davis and Laurie Callihan might be enacted using a different qualitative methodology, in this case the narrative methodology. The focus of narrative research is shown to be on "what meaning is being made" rather than "what is happening here" (quadrant 2 rather than…

  16. Exploring biomolecular systems: From methodology to application

    NASA Astrophysics Data System (ADS)

    Liu, Pu

    This thesis describes new methodology development and applications in the computer simulation of biomolecular systems. To reduce the number of parallel processors in replica exchange, we deform the Hamiltonian function for each replica in such a way that the acceptance probability for the exchange of replica configurations does not depend on the number of explicit water molecules in the system. To accelerate barrier crossing when sampling rough energy landscapes, we invoke quantum tunnelling using Feynman path-integral theory. Combined with local minimization, this new global optimization method successfully locates almost all the known classical global energy minima for Lennard-Jones clusters of sizes up to 100. We present a new methodology for calculating diffusion coefficients for molecules in confined spaces and apply it to the water-vapor interface. We examine hydrogen bond dynamics at the water-vapor interface and compare dynamics in polarizable and fixed-charge water models. The results highlight the potential importance of polarization effects at the water-vapor interface. Finally, we discover a strong water drying transition in a biological protein system, the melittin tetramer. This is the first observation of such a strong transition in computer simulations of protein systems. The surface topology is shown to be very important for this drying transition.
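
    The "local minimization from many starting points" ingredient of the global search can be illustrated without the path-integral machinery. The sketch below is a simplified stand-in, not the thesis's method: it finds the known global minimum of the three-atom Lennard-Jones cluster (energy -3 in reduced units) by random-restart BFGS. The function names and all parameter choices are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def lj_energy(x):
    """Total Lennard-Jones energy (reduced units) for flattened coordinates x."""
    pts = x.reshape(-1, 3)
    e = 0.0
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            inv6 = 1.0 / np.sum((pts[i] - pts[j]) ** 2) ** 3
            e += 4.0 * (inv6 * inv6 - inv6)
    return e

def random_start(n_atoms, rng, dmin=0.8):
    """Rejection-sample a start with a minimum pair separation (avoids huge forces)."""
    while True:
        pts = rng.uniform(-1.5, 1.5, size=(n_atoms, 3))
        d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
        if d[np.triu_indices(n_atoms, 1)].min() > dmin:
            return pts.ravel()

def best_minimum(n_atoms=3, restarts=20, seed=0):
    """Lowest energy found by local minimization from many random starts."""
    rng = np.random.default_rng(seed)
    return min(minimize(lj_energy, random_start(n_atoms, rng), method="BFGS").fun
               for _ in range(restarts))

print(best_minimum())  # the LJ trimer's global minimum is -3.0 in reduced units
```

    The thesis replaces the random restarts with quantum-tunnelling moves, which matter for the much larger clusters where plain restarts fail.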

  17. A non-parametric segmentation methodology for oral videocapillaroscopic images.

    PubMed

    Bellavia, Fabio; Cacioppo, Antonino; Lupaşcu, Carmen Alina; Messina, Pietro; Scardina, Giuseppe; Tegolo, Domenico; Valenti, Cesare

    2014-05-01

    We aim to describe a new non-parametric methodology to support the clinician during the diagnostic process of oral videocapillaroscopy to evaluate peripheral microcirculation. Our methodology, mainly based on wavelet analysis and mathematical morphology to preprocess the images, segments them by minimizing the within-class luminosity variance of both capillaries and background. Experiments were carried out on a set of real microphotographs to validate this approach versus handmade segmentations provided by physicians. By using a leave-one-patient-out approach, we pointed out that our methodology is robust, according to precision-recall criteria (average precision and recall are equal to 0.924 and 0.923, respectively) and it acts as a physician in terms of the Jaccard index (mean and standard deviation equal to 0.858 and 0.064, respectively). PMID:24657094
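
    The segmentation criterion described here, minimizing the within-class luminosity variance of capillaries and background, is the same criterion used by Otsu's classical thresholding method. The sketch below applies that criterion to a synthetic two-mode intensity distribution; it is not the authors' wavelet-and-morphology pipeline, and the synthetic image statistics are illustrative assumptions.

```python
import numpy as np

def otsu_threshold(pixels, levels=256):
    """Gray level minimizing the within-class intensity variance (Otsu's criterion)."""
    hist, _ = np.histogram(pixels, bins=levels, range=(0, levels))
    p = hist / hist.sum()
    lvl = np.arange(levels)
    best_t, best_within = 0, np.inf
    for t in range(1, levels):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0.0 or w1 == 0.0:
            continue
        mu0 = (lvl[:t] * p[:t]).sum() / w0
        mu1 = (lvl[t:] * p[t:]).sum() / w1
        # weighted within-class variance: w0*var0 + w1*var1
        within = (((lvl[:t] - mu0) ** 2 * p[:t]).sum()
                  + ((lvl[t:] - mu1) ** 2 * p[t:]).sum())
        if within < best_within:
            best_t, best_within = t, within
    return best_t

# Synthetic two-mode image: dark capillaries on a brighter background.
rng = np.random.default_rng(1)
img = np.clip(np.concatenate([rng.normal(60, 10, 5000),
                              rng.normal(180, 15, 15000)]), 0, 255)
print(otsu_threshold(img))  # a threshold between the two intensity modes
```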

  18. Minimally Invasive Cardiovascular Surgery: Incisions and Approaches

    PubMed Central

    Langer, Nathaniel B.; Argenziano, Michael

    2016-01-01

    Throughout the modern era of cardiac surgery, most operations have been performed via median sternotomy with cardiopulmonary bypass. This paradigm is changing, however, as cardiovascular surgery is increasingly adopting minimally invasive techniques. Advances in patient evaluation, instrumentation, and operative technique have allowed surgeons to perform a wide variety of complex operations through smaller incisions and, in some cases, without cardiopulmonary bypass. With patients desiring less invasive operations and the literature supporting decreased blood loss, shorter hospital length of stay, improved postoperative pain, and better cosmesis, minimally invasive cardiac surgery should be widely practiced. Here, we review the incisions and approaches currently used in minimally invasive cardiovascular surgery. PMID:27127555

  19. Technology applications for radioactive waste minimization

    SciTech Connect

    Devgun, J.S.

    1994-07-01

    The nuclear power industry has achieved one of the most successful examples of waste minimization. The annual volume of low-level radioactive waste shipped for disposal per reactor has decreased to approximately one-fifth of what it was a decade ago. In addition, the curie content of the total waste shipped for disposal has decreased. This paper discusses the regulatory drivers and economic factors for waste minimization and describes the application of technologies for achieving waste minimization for low-level radioactive waste, with examples from the nuclear power industry.

  20. Imaging and minimally invasive aortic valve replacement

    PubMed Central

    Loor, Gabriel

    2015-01-01

    Cardiovascular imaging has been the most important tool allowing for innovation in cardiac surgery. There are now a variety of approaches available for treating aortic valve disease, including standard sternotomy, minimally invasive surgery, and percutaneous valve replacement. Minimally invasive cardiac surgery relies on maximizing exposure within a limited field of view. The complexity of this approach is increased as the relationship between the great vessels and the bony thorax varies between individuals. Ultimately, the success of minimally invasive surgery depends on appropriate choices regarding the type and location of the incision, cannulation approach, and cardioprotection strategy. These decisions are facilitated by preoperative imaging, which forms the focus of this review. PMID:25694979

  1. Minimal representations, geometric quantization, and unitarity.

    PubMed Central

    Brylinski, R; Kostant, B

    1994-01-01

    In the framework of geometric quantization we explicitly construct, in a uniform fashion, a unitary minimal representation πo of every simply-connected real Lie group Go such that the maximal compact subgroup of Go has finite center and Go admits some minimal representation. We obtain algebraic and analytic results about πo. We give several results on the algebraic and symplectic geometry of the minimal nilpotent orbits and then "quantize" these results to obtain the corresponding representations. We assume (Lie Go)C is simple. PMID:11607478

  2. Minimization of power consumption during charging of superconducting accelerating cavities

    NASA Astrophysics Data System (ADS)

    Bhattacharyya, Anirban Krishna; Ziemann, Volker; Ruber, Roger; Goryashko, Vitaliy

    2015-11-01

    The radio frequency cavities used to accelerate charged particle beams need to be charged to their nominal voltage, after which the beam can be injected into them. The standard procedure for such cavity filling is to use a step charging profile. However, during the initial stages of such a filling process, a substantial amount of the total energy is wasted in reflection for superconducting cavities because of their extremely narrow bandwidth. The paper presents a novel strategy to charge cavities which reduces total energy reflection. We use variational calculus to obtain an analytical expression for the optimal charging profile. Reflected and required energies and generator peak power are compared between the charging schemes, and practical aspects (saturation, efficiency and gain characteristics) of power sources (tetrodes, IOTs and solid-state power amplifiers) are considered and analysed. The paper presents a methodology to successfully identify the optimal charging scheme for different power sources to minimize the total energy requirement.

  3. Minimizing Variation in Outdoor CPV Power Ratings (Presentation)

    SciTech Connect

    Muller, M.

    2011-04-01

    Presented at the 7th International Conference on Concentrating Photovoltaic Systems (CPV-7), 4-6 April 2011, Las Vegas, Nevada. The CPV community has agreed to have both indoor and outdoor power ratings at the module level. The indoor rating provides a repeatable measure of module performance as it leaves the factory line, while the outdoor rating provides a measure of true performance under real-world conditions. The challenge with an outdoor rating is that the spectrum, temperature, wind speed, etc., are constantly in flux, and therefore the resulting power rating varies from day to day and month to month. This work examines different methodologies for determining the outdoor power rating with the goal of minimizing variation even if data are collected under changing meteorological conditions.

  4. Prioritization methodology for chemical replacement

    NASA Technical Reports Server (NTRS)

    Goldberg, Ben; Cruit, Wendy; Schutzenhofer, Scott

    1995-01-01

    This methodology serves to define a system for effective prioritization of the efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semi-quantitative approach derived from quality function deployment techniques (QFD matrix). QFD is a conceptual map that provides a method of transforming customer wants and needs into quantitative engineering terms. This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives.
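
    At its core, a QFD-style prioritization reduces to a weighted scoring matrix: customer-derived weights multiply engineering scores, and candidates are ranked by weighted total. The criteria, weights, and candidate scores below are invented for illustration and do not come from the NASA methodology.

```python
# Hypothetical criteria weights (summing to 1) and candidate scores on a 1-5 scale.
criteria = {"environmental": 0.30, "cost": 0.25, "safety": 0.20,
            "reliability": 0.15, "programmatic": 0.10}

candidates = {
    "solvent A": {"environmental": 4, "cost": 3, "safety": 5,
                  "reliability": 4, "programmatic": 3},
    "solvent B": {"environmental": 2, "cost": 5, "safety": 3,
                  "reliability": 3, "programmatic": 4},
}

def weighted_score(scores):
    """Weighted total used to rank replacement-technology candidates."""
    return sum(criteria[c] * scores[c] for c in criteria)

ranking = sorted(candidates, key=lambda name: weighted_score(candidates[name]),
                 reverse=True)
for name in ranking:
    print(name, round(weighted_score(candidates[name]), 2))
# solvent A scores 3.85, solvent B scores 3.3, so A ranks first
```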

  5. Controlling molecular transport in minimal emulsions

    PubMed Central

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of ‘minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions. PMID:26797564

  6. Genetic algorithms for minimal source reconstructions

    SciTech Connect

    Lewis, P.S.; Mosher, J.C.

    1993-12-01

    Under-determined linear inverse problems arise in applications in which signals must be estimated from insufficient data. In these problems the number of potentially active sources is greater than the number of observations. In many situations, it is desirable to find a minimal source solution. This can be accomplished by minimizing a cost function that accounts both for the compatibility of the solution with the observations and for its "sparseness". Minimizing functions of this form can be a difficult optimization problem. Genetic algorithms are a relatively new and robust approach to the solution of difficult optimization problems, providing a global framework that is not dependent on local continuity or on explicit starting values. In this paper, the authors describe the use of genetic algorithms to find minimal source solutions, using as an example a simulation inspired by the reconstruction of neural currents in the human brain from magnetoencephalographic (MEG) measurements.
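
    A cost of the kind described (data compatibility plus a sparseness penalty) can be minimized by a simple genetic algorithm over binary support masks, with the source amplitudes fit by least squares on each candidate support. The sketch below is not the paper's algorithm; the problem sizes, penalty weight, and GA parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Under-determined toy problem: 8 observations, 12 potential sources, 2 truly active.
n_obs, n_src = 8, 12
A = rng.normal(size=(n_obs, n_src))
x_true = np.zeros(n_src)
x_true[[3, 7]] = [1.5, -2.0]
b = A @ x_true

def cost(mask, lam=0.5):
    """Least-squares misfit on the candidate support plus a sparseness penalty."""
    idx = np.flatnonzero(mask)
    if len(idx) == 0 or len(idx) > n_obs:
        return np.inf
    coef, *_ = np.linalg.lstsq(A[:, idx], b, rcond=None)
    resid = b - A[:, idx] @ coef
    return float(resid @ resid) + lam * len(idx)

def genetic_search(pop_size=60, generations=150, p_mut=0.05):
    pop = (rng.random((pop_size, n_src)) < 0.25).astype(int)
    for _ in range(generations):
        order = np.argsort([cost(m) for m in pop])
        elite = pop[order[: pop_size // 2]]      # truncation selection keeps fittest half
        children = []
        for _ in range(pop_size - len(elite)):
            p1, p2 = elite[rng.integers(len(elite), size=2)]
            cut = int(rng.integers(1, n_src))    # one-point crossover
            child = np.concatenate([p1[:cut], p2[cut:]])
            flip = rng.random(n_src) < p_mut     # bit-flip mutation
            children.append(np.where(flip, 1 - child, child))
        pop = np.vstack([elite, children])
    return min(pop, key=cost)

best = genetic_search()
print(sorted(np.flatnonzero(best)))  # expected to include the true sources 3 and 7
```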

  7. Minimally Invasive Transcatheter Aortic Valve Replacement (TAVR)

    MedlinePlus Videos and Cool Tools

    Watch a Broward Health surgeon perform a minimally invasive Transcatheter Aortic Valve Replacement (TAVR).

  8. Assembling Precise Truss Structures With Minimal Stresses

    NASA Technical Reports Server (NTRS)

    Sword, Lee F.

    1996-01-01

    Improved method of assembling precise truss structures involves use of simple devices. Tapered pins that fit in tapered holes indicate deviations from prescribed lengths. Method both helps to ensure precision of finished structures and minimizes residual stresses within structures.

  9. Minimally Invasive Treatments for Breast Cancer

    MedlinePlus

    Interventional Radiology: Minimally Invasive Treatments for Breast Cancer. Interventional radiology treatments offer new options and hope in the fight against breast cancer.

  10. Waste minimization and pollution prevention awareness plan

    SciTech Connect

    Not Available

    1991-05-31

    The purpose of this plan is to document the Lawrence Livermore National Laboratory (LLNL) Waste Minimization and Pollution Prevention Awareness Program. The plan specifies those activities and methods that are or will be employed to reduce the quantity and toxicity of wastes generated at the site. The intent of this plan is to respond to and comply with the Department of Energy's (DOE's) policy and guidelines concerning the need for pollution prevention. The Plan is composed of a LLNL Waste Minimization and Pollution Prevention Awareness Program Plan and, as attachments, Program- and Department-specific waste minimization plans. This format reflects the fact that waste minimization is considered a line management responsibility and is to be addressed by each of the Programs and Departments. 14 refs.

  11. Controlling molecular transport in minimal emulsions

    NASA Astrophysics Data System (ADS)

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of `minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions.

  12. Degreasing of titanium to minimize stress corrosion

    NASA Technical Reports Server (NTRS)

    Carpenter, S. R.

    1967-01-01

    Stress corrosion of titanium and its alloys at elevated temperatures is minimized by replacing trichloroethylene with methanol or methyl ethyl ketone as a degreasing agent. Wearing cotton gloves reduces stress corrosion from perspiration before the metal components are processed.

  13. Waste minimization in electroplating industries: a review.

    PubMed

    Babu, B Ramesh; Bhanu, S Udaya; Meera, K Seeni

    2009-07-01

    Wastewater, spent solvent, spent process solutions, and sludge are the major waste streams generated in large volumes daily in electroplating industries. These waste streams can be significantly minimized through process modification and operational improvement. Waste minimization methods have been implemented in some of the electroplating industries. Suggestions such as practicing source reduction approaches, reduction in drag out and waste, process modification and environmental benefits, have also been adopted. In this endeavor, extensive knowledge covering various disciplines has been studied, which makes problem solving extremely easy. Moreover, available process data pertaining to waste minimization (WM) is usually imprecise, incomplete, and uncertain due to the lack of sensors, the difficulty of measurement, and process variations. In this article waste minimization techniques and its advantages on the improvement of working atmosphere and reduction in operating cost have been discussed. PMID:19657919

  14. Controlling molecular transport in minimal emulsions.

    PubMed

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of 'minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions. PMID:26797564

  15. Analysis of lipid flow on minimal surfaces

    NASA Astrophysics Data System (ADS)

    Bahmani, Fatemeh; Christenson, Joel; Rangamani, Padmini

    2016-03-01

    Interaction between the bilayer shape and surface flow is important for capturing the flow of lipids in many biological membranes. Recent microscopy evidence has shown that minimal surfaces (planes, catenoids, and helicoids) occur often in cellular membranes. In this study, we explore lipid flow in these geometries using a `stream function' formulation for viscoelastic lipid bilayers. Using this formulation, we derive two-dimensional lipid flow equations for the commonly occurring minimal surfaces in lipid bilayers. We show that for three minimal surfaces (planes, catenoids, and helicoids), the surface flow equations satisfy Stokes flow equations. In helicoids and catenoids, we show that the tangential velocity field is a Killing vector field. Thus, our analysis provides fundamental insight into the flow patterns of lipids on intracellular organelle membranes that are characterized by fixed shapes reminiscent of minimal surfaces.

  16. TOWARD MINIMALLY ADHESIVE SURFACES UTILIZING SILOXANES

    EPA Science Inventory

    Three types of siloxane-based network polymers have been investigated for their surface properties towards potential applications as minimally adhesive coatings. A filled poly(dimethylsiloxane) (PDMS) elastomer, RTV it, has been studied to determine surface weldability and stabil...

  17. Minimally Invasive Osteotomies of the Calcaneus.

    PubMed

    Guyton, Gregory P

    2016-09-01

    Osteotomies of the calcaneus are powerful surgical tools, representing a critical component of the surgical reconstruction of pes planus and pes cavus deformity. Modern minimally invasive calcaneal osteotomies can be performed safely with a burr through a lateral incision. Although greater kerf is generated with the burr, the effect is modest, can be minimized, and is compatible with many fixation techniques. A hinged jig renders the procedure more reproducible and accessible. PMID:27524705

  18. PWM control techniques for rectifier filter minimization

    SciTech Connect

    Ziogas, P.D.; Kang, Y-G; Stefanovic, V.R.

    1985-09-01

    Minimization of input/output filters is an essential step towards manufacturing compact low-cost static power supplies. Three PWM control techniques that yield substantial filter size reduction for three-phase (self-commutated) rectifiers are presented and analyzed. Filters required by typical line-commutated rectifiers are used as the basis for comparison. Moreover, it is shown that, in addition to filter minimization, two of the three proposed control techniques substantially improve the rectifier's total input power factor.

  19. Minimally Invasive Forefoot Surgery in France.

    PubMed

    Meusnier, Tristan; Mukish, Prikesht

    2016-06-01

    Study groups have been formed in France to advance the use of minimally invasive surgery. These techniques are becoming more frequently used, and their technical nuances are continuing to evolve. The objective of this article is to raise awareness of the current trends in minimally invasive surgery for common diseases of the forefoot. Percutaneous forefoot surgery, which is less developed at this time, is also discussed. PMID:27261810

  20. Mesonic spectroscopy of minimal walking technicolor

    SciTech Connect

    Del Debbio, Luigi; Lucini, Biagio; Patella, Agostino; Pica, Claudio; Rago, Antonio

    2010-07-01

    We investigate the structure and the novel emerging features of the mesonic nonsinglet spectrum of the minimal walking technicolor theory. Precision measurements in the nonsinglet pseudoscalar and vector channels are compared to the expectations for an IR-conformal field theory and a QCD-like theory. Our results favor a scenario in which minimal walking technicolor is (almost) conformal in the infrared, while spontaneous chiral symmetry breaking seems less plausible.

  1. Alternating minimization and Boltzmann machine learning.

    PubMed

    Byrne, W

    1992-01-01

    Training a Boltzmann machine with hidden units is appropriately treated in information geometry using the information divergence and the technique of alternating minimization. The resulting algorithm is shown to be closely related to gradient descent Boltzmann machine learning rules, and the close relationship of both to the EM algorithm is described. An iterative proportional fitting procedure for training machines without hidden units is described and incorporated into the alternating minimization algorithm. PMID:18276461
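
    The information-geometric algorithm for Boltzmann machines is involved, but the underlying idea of alternating minimization (fix one block of variables, minimize exactly over the other, and repeat) can be shown on a simpler biconvex problem. The rank-1 factorization below is a stand-in illustration under that assumption, not the divergence-based algorithm of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Exactly rank-1 target M = u v^T; minimize ||M - u v^T||_F^2 by alternating
# exact minimization over u (with v fixed) and over v (with u fixed).
u_true, v_true = rng.normal(size=5), rng.normal(size=4)
M = np.outer(u_true, v_true)

u = rng.normal(size=5)
v = rng.normal(size=4)
for _ in range(50):
    u = M @ v / (v @ v)      # closed-form argmin over u with v fixed
    v = M.T @ u / (u @ u)    # closed-form argmin over v with u fixed

err = np.linalg.norm(M - np.outer(u, v))
print(round(err, 6))  # effectively zero for an exactly rank-1 target
```

    Each half-step can only decrease the objective, which is the same monotonicity property that links the Boltzmann-machine rules to the EM algorithm.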

  2. Future of Minimally Invasive Colorectal Surgery.

    PubMed

    Whealon, Matthew; Vinci, Alessio; Pigazzi, Alessio

    2016-09-01

    Minimally invasive surgery is slowly taking over as the preferred operative approach for colorectal diseases. However, many of the procedures remain technically difficult. This article will give an overview of the state of minimally invasive surgery and the many advances that have been made over the last two decades. Specifically, we discuss the introduction of the robotic platform and some of its benefits and limitations. We also describe some newer techniques related to robotics. PMID:27582647

  3. Environmental probabilistic quantitative assessment methodologies

    USGS Publications Warehouse

    Crovelli, R.A.

    1995-01-01

    In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable whereas pollution is undesirable. A methodology is defined in this paper as consisting of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: 1) direct assessment, 2) accumulation size, 3) volumetric yield, and 4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz. TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are computational speed and flexibility, making it ideal for a microcomputer. -from Author
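
    The model/method split can be made concrete with a volumetric-yield probability model solved by Monte Carlo sampling (an alternative probabilistic method to the analytic one the abstract favors). The lognormal input distributions below are illustrative assumptions, not values from TRIAGG, APRAS, or FASPU.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Volumetric-yield model: resource = productive area (km^2) x yield per km^2.
# The lognormal parameters here are illustrative assumptions only.
area = rng.lognormal(mean=3.0, sigma=0.5, size=n)
yield_per_km2 = rng.lognormal(mean=1.0, sigma=0.8, size=n)
resource = area * yield_per_km2

# Low, median, and high estimates summarize the resulting distribution.
low, median, high = np.percentile(resource, [5, 50, 95])
print(round(low, 1), round(median, 1), round(high, 1))
```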

  4. Mach, methodology, hysteresis and economics

    NASA Astrophysics Data System (ADS)

    Cross, R.

    2008-11-01

    This methodological note examines the epistemological foundations of hysteresis with particular reference to applications to economic systems. Ernst Mach's principles of the economy of science are advocated and used in this assessment.

  5. Methodological Problems of Soviet Pedagogy

    ERIC Educational Resources Information Center

    Noah, Harold J., Ed.; Beach, Beatrice S., Ed.

    1974-01-01

    Selected papers presented at the First Scientific Conference of Pedagogical Scholars of Socialist Countries, Moscow, 1971, deal with methodology in relation to science, human development, sociology, psychology, cybernetics, and the learning process. (KM)

  6. New Directions for Futures Methodology.

    ERIC Educational Resources Information Center

    Enzer, Selwyn

    1983-01-01

    Understanding the link between futures research and strategic planning is crucial to effective long-range planning and administration. Current trends and the latest developments in the methodology of futures research are discussed. (MLW)

  7. Multifunction minimization for programmable logic arrays

    SciTech Connect

    Campbell, J.A.

    1984-01-01

    The problem of minimizing two-level AND/OR Boolean algebraic functions of n inputs and m outputs for implementation on programmable logic arrays (PLAs) is examined. The theory of multiple-output functions, as well as the historically alternative approaches to reckoning the cost of an equation implementation, is reviewed. The PLA is shown to be a realization of the least product-gate equation cost criterion. The multifunction minimization is dealt with in the context of a directed tree search algorithm developed in previous research. The PLA-oriented minimization is shown to alter the nature of each of the basic tenets of multiple-output minimization used in earlier work. The concept of a non-prime but selectable implicant is introduced. A new cost criterion, the quantum cost, is discussed, and an approximation algorithm utilizing this criterion is developed. A timing analysis of a cyclic resolution algorithm for PLA-based functions is presented. Lastly, the question of efficiency in automated minimization algorithms is examined. The application of the PLA cost criterion is shown to exhibit intrinsic increases in computational efficiency. A minterm classification algorithm is suggested, and a PLA minimization algorithm is implemented in the FORTRAN language.
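
    Two-level minimization can be made concrete on a small single-output function: enumerate the implicants (product terms wholly contained in the ON-set) and search for the smallest cover. The brute-force sketch below recovers the well-known three-term cover ab + ac + bc of the 3-input majority function; it deliberately ignores the multiple-output sharing and quantum-cost criterion that the paper addresses, and the directed tree search it describes would replace the exhaustive loop.

```python
from itertools import combinations

N = 3
ON = {0b011, 0b101, 0b110, 0b111}  # majority(a, b, c): at least two inputs high

def cubes():
    """All product terms over N variables: each variable fixed to 0, 1, or don't-care."""
    out = []
    for spec in range(3 ** N):
        mask, val, s = 0, 0, spec
        for bit in range(N):
            s, d = divmod(s, 3)
            if d < 2:              # d == 2 means this variable is a don't-care
                mask |= 1 << bit
                val |= d << bit
        out.append((mask, val))
    return out

def minterms(cube):
    mask, val = cube
    return {m for m in range(2 ** N) if (m & mask) == val}

# Implicants: product terms whose minterms all lie in the ON-set.
implicants = [c for c in cubes() if minterms(c) <= ON]

# Smallest set of implicants covering the ON-set = minimal product-term count.
for k in range(1, len(implicants) + 1):
    found = next((combo for combo in combinations(implicants, k)
                  if set().union(*(minterms(c) for c in combo)) >= ON), None)
    if found:
        print(k)  # 3 product terms: ab + ac + bc
        break
```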

  8. Economic impact of minimally invasive lumbar surgery

    PubMed Central

    Hofstetter, Christoph P; Hofer, Anna S; Wang, Michael Y

    2015-01-01

    Cost effectiveness has been demonstrated for traditional lumbar discectomy and lumbar laminectomy, as well as for instrumented and noninstrumented arthrodesis. While emerging evidence suggests that minimally invasive spine surgery reduces morbidity, shortens hospitalization, and accelerates return to activities of daily living, data regarding the cost effectiveness of these novel techniques are limited. The current study analyzes all available data on minimally invasive techniques for lumbar discectomy, decompression, short-segment fusion and deformity surgery. In general, minimally invasive spine procedures appear to hold promise in quicker patient recovery times and earlier return to work. Thus, minimally invasive lumbar spine surgery appears to have the potential to be a cost-effective intervention. Moreover, novel less invasive procedures are less destabilizing and may therefore be utilized in certain indications that traditionally required arthrodesis procedures. However, there is a lack of studies analyzing the economic impact of minimally invasive spine surgery. Future studies are necessary to confirm the durability and further define indications for minimally invasive lumbar spine procedures. PMID:25793159

  9. Minimally Invasive Surgery in Gynecologic Oncology

    PubMed Central

    Mori, Kristina M.; Neubauer, Nikki L.

    2013-01-01

    Minimally invasive surgery has been utilized in the field of obstetrics and gynecology as far back as the 1940s when culdoscopy was first introduced as a visualization tool. Gynecologists then began to employ minimally invasive surgery for adhesiolysis and obtaining biopsies but then expanded its use to include procedures such as tubal sterilization (Clyman (1963), L. E. Smale and M. L. Smale (1973), Thompson and Wheeless (1971), Peterson and Behrman (1971)). With advances in instrumentation, the first laparoscopic hysterectomy was successfully performed in 1989 by Reich et al. At the same time, minimally invasive surgery in gynecologic oncology was being developed alongside its benign counterpart. In 1975, Rosenoff et al. reported using peritoneoscopy for pretreatment evaluation in ovarian cancer, and Spinelli et al. reported on using laparoscopy for the staging of ovarian cancer. In 1993, Nichols used operative laparoscopy to perform pelvic lymphadenectomy in cervical cancer patients. The initial goals of minimally invasive surgery, not dissimilar to those of modern medicine, were to decrease the morbidity and mortality associated with surgery and therefore improve patient outcomes and patient satisfaction. This review will summarize the history and use of minimally invasive surgery in gynecologic oncology and also highlight new minimally invasive surgical approaches currently in development. PMID:23997959

  10. Department of Energy's waste minimization program

    SciTech Connect

    Not Available

    1991-09-01

    Waste minimization, as mandated by the Congress, requires the elimination or reduction of the generation of waste at its source, that is, before it can become waste. This audit was made to determine the adequacy of DOE's efforts to minimize the generation of waste. The audit emphasized radioactive and other hazardous waste generation at DOE's nuclear weapons production plants and design laboratories. We included waste minimization activities and actions that can be taken now, in contrast to the long-range weapons complex modernization effort. We reviewed waste minimization activities within the Office of Environmental Restoration and Waste Management (EM), the Office of the Assistant Secretary for Defense Programs (DP), the Hazardous Waste Remedial Action Program Office, and the Waste Minimization Management Group (WMMG) in the Albuquerque Field Office. Waste minimization programs were examined in detail at the three largest nuclear weapons production facilities -- the Rocky Flats plant, which manufactures plutonium parts; the Y-12 facility, which produces uranium components; and the Savannah River site, which manufactures and loads tritium -- and two of DOE's weapons design laboratories, Los Alamos and Sandia.

  11. Cluster Stability Estimation Based on a Minimal Spanning Trees Approach

    NASA Astrophysics Data System (ADS)

    Volkovich, Zeev (Vladimir); Barzily, Zeev; Weber, Gerhard-Wilhelm; Toledano-Kitai, Dvora

    2009-08-01

    Among the areas of data and text mining which are employed today in science, economy and technology, clustering theory serves as a preprocessing step in data analysis. However, many open questions still await theoretical and practical treatment; e.g., the problem of determining the true number of clusters has not been satisfactorily solved. In the current paper, this problem is addressed by the cluster stability approach. For several possible numbers of clusters we estimate the stability of partitions obtained from clustering of samples. Partitions are considered consistent if their clusters are stable. Cluster validity is measured as the total number of edges, in the clusters' minimal spanning trees, connecting points from different samples; this is the Friedman and Rafsky two-sample test statistic. The homogeneity hypothesis, of well-mingled samples within the clusters, leads to an asymptotically normal distribution of the considered statistic. Resting upon this fact, the standard score of this edge count is set, and the partition quality is represented by the worst cluster, corresponding to the minimal standard-score value. It is natural to expect that the true number of clusters can be characterized by the empirical distribution having the shortest left tail. The proposed methodology sequentially creates the described value distribution and estimates its left-asymmetry. Numerical experiments, presented in the paper, demonstrate the ability of the approach to detect the true number of clusters.
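    The edge-count statistic described above can be sketched directly: build the minimal spanning tree of two pooled samples and count the edges joining points from different samples. A minimal pure-Python illustration (not the authors' code; the Gaussian sample data below are hypothetical):

```python
import math
import random

def mst_edges(points):
    """Prim's algorithm on the complete Euclidean graph; returns index pairs."""
    n = len(points)
    dist = lambda i, j: math.dist(points[i], points[j])
    in_tree = {0}
    best = {j: (dist(0, j), 0) for j in range(1, n)}
    edges = []
    while len(in_tree) < n:
        j = min(best, key=lambda k: best[k][0])
        d, i = best.pop(j)
        edges.append((i, j))
        in_tree.add(j)
        for k in best:
            dk = dist(j, k)
            if dk < best[k][0]:
                best[k] = (dk, j)
    return edges

def friedman_rafsky(sample_a, sample_b):
    """Count MST edges that connect points from different samples."""
    points = sample_a + sample_b
    labels = [0] * len(sample_a) + [1] * len(sample_b)
    return sum(labels[i] != labels[j] for i, j in mst_edges(points))

random.seed(0)
mingled_a = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(30)]
mingled_b = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(30)]
separated = [(random.gauss(8, 1), random.gauss(8, 1)) for _ in range(30)]
print(friedman_rafsky(mingled_a, mingled_b))   # many cross edges: well mingled
print(friedman_rafsky(mingled_a, separated))   # few cross edges: separated
```

    Well-mingled samples yield many cross-sample edges while separated ones yield very few; the standard score of this count is what the paper's stability criterion is built on.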

  12. Swords into plowshares -- Tritium waste minimization (training development project)

    SciTech Connect

    Hehmeyer, J.; Sienkiewicz, C.; Kent, L.; Gill, J.; Schmitz, W.; Mills, T.; Wurstner, R.; Adams, F.; Seabaugh, P.

    1995-12-31

    A concentrated emphasis of Mound's historical mission has been working with tritium. As the phase-out of defense work begins and the emphasis on environmental technology grows, so too must a shift occur in Mound's focus. Mound's longstanding efforts in tritium training have proven fruitful to them and the Complex. It is on this foundation that a new generation of worker training is being developed, one which reflects a new mission: Tritium Waste Minimization. The efforts of previous training, particularly under Accreditation, have given a solid base on which to launch the Waste Minimization program. Typical operations consider the impact on the varying levels of containment and the tools and agents used to achieve those levels. D&D and system modifications are bringing new light to such things as floor tile, oils, mole sieves, and rust. Of financial interest is the amount of savings which has been obtained through review and modification, rather than developing a new program. The authors are learning not to reinvent the wheel. The presentation will compare and contrast the methodologies used in creating and implementing this training program. Emphasis will be placed on lessons learned, costs saved, and program enhancement.

  13. Approach to analytically minimize the LCD moiré by image-based particle swarm optimization.

    PubMed

    Tsai, Yu-Lin; Tien, Chung-Hao

    2015-10-01

    In this paper, we proposed a methodology to optimize the parametric window of a liquid crystal display (LCD) system, whose visual performance was deteriorated by the pixel moiré arising between multiple periodic structures. Conventional analysis and minimization of moiré patterns are limited to a few parameters. With the proposed image-based particle swarm optimization (PSO), we enable simultaneous multivariable optimization. A series of experiments was conducted to validate the methodology. Due to its versatility, the proposed technique will certainly have a promising impact on fast optimization in LCD design with more complex configurations. PMID:26479663
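    The paper's image-based cost function is not reproduced here, but the PSO machinery it relies on can be sketched generically. A minimal inertia-weight particle swarm minimizer (an assumed standard formulation, not the authors' implementation; the quadratic stand-in cost is hypothetical, not a moiré model):

```python
import random

def pso(f, dim, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimize f over a box using standard inertia-weight particle swarm."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # per-particle best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in for an image-based moiré cost: quadratic with minimum at (1, 2)
cost = lambda x: (x[0] - 1) ** 2 + (x[1] - 2) ** 2
best, val = pso(cost, dim=2, bounds=(-5, 5))
print(best, val)   # converges near (1, 2)
```

    In the image-based setting, f would instead render the display configuration and score the resulting moiré pattern, which is what makes simultaneous multivariable optimization possible without a closed-form model.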

  14. Minimally invasive procedures on the lumbar spine.

    PubMed

    Skovrlj, Branko; Gilligan, Jeffrey; Cutler, Holt S; Qureshi, Sheeraz A

    2015-01-16

    Degenerative disease of the lumbar spine is a common and increasingly prevalent condition that is often implicated as the primary reason for chronic low back pain and the leading cause of disability in the western world. Surgical management of lumbar degenerative disease has historically been approached by way of open surgical procedures aimed at decompressing and/or stabilizing the lumbar spine. Advances in technology and surgical instrumentation have led to minimally invasive surgical techniques being developed and increasingly used in the treatment of lumbar degenerative disease. Compared to traditional open spine surgery, minimally invasive techniques require smaller incisions and decrease approach-related morbidity by avoiding muscle crush injury by self-retaining retractors, preventing the disruption of tendon attachment sites of important muscles at the spinous processes, using known anatomic neurovascular and muscle planes, and minimizing collateral soft-tissue injury by limiting the width of the surgical corridor. The theoretical benefits of minimally invasive surgery over traditional open surgery include reduced blood loss, decreased postoperative pain and narcotics use, shorter hospital length of stay, faster recovery and quicker return to work and normal activity. This paper describes the different minimally invasive techniques that are currently available for the treatment of degenerative disease of the lumbar spine. PMID:25610845

  15. Minimal control power of the controlled teleportation

    NASA Astrophysics Data System (ADS)

    Jeong, Kabgyun; Kim, Jaewan; Lee, Soojoon

    2016-03-01

    We generalize the control power of a perfect controlled teleportation of an entangled three-qubit pure state, suggested by Li and Ghose [Phys. Rev. A 90, 052305 (2014), 10.1103/PhysRevA.90.052305], to the control power of a general controlled teleportation of a multiqubit pure state. Thus, we define the minimal control power, and calculate the values of the minimal control power for a class of general three-qubit Greenberger-Horne-Zeilinger (GHZ) states and the three-qubit W class whose states have zero three-tangles. Moreover, we show that the standard three-qubit GHZ state and the standard three-qubit W state have the maximal values of the minimal control power for the two classes, respectively. This means that the minimal control power can be interpreted as not only an operational quantity of a three-qubit quantum communication but also a degree of three-qubit entanglement. In addition, we calculate the values of the minimal control power for general n-qubit GHZ states and the n-qubit W-type states.

  16. Methodology of metal criticality determination.

    PubMed

    Graedel, T E; Barr, Rachel; Chandler, Chelsea; Chase, Thomas; Choi, Joanne; Christoffersen, Lee; Friedlander, Elizabeth; Henly, Claire; Jun, Christine; Nassar, Nedal T; Schechner, Daniel; Warren, Simon; Yang, Man-Yu; Zhu, Charles

    2012-01-17

    A comprehensive methodology has been created to quantify the degree of criticality of the metals of the periodic table. In this paper, we present and discuss the methodology, which comprises three dimensions: supply risk, environmental implications, and vulnerability to supply restriction. Supply risk differs with the time scale (medium or long), and at its most complex involves several components, themselves composed of a number of distinct indicators drawn from readily available peer-reviewed indexes and public information. Vulnerability to supply restriction differs with the organizational level (i.e., global, national, and corporate). The criticality methodology, an enhancement of a United States National Research Council template, is designed to help corporate, national, and global stakeholders conduct risk evaluation and to inform resource utilization and strategic decision-making. Although we believe our methodological choices lead to the most robust results, the framework has been constructed to permit flexibility by the user. Specific indicators can be deleted or added as desired and weighted as the user deems appropriate. The value of each indicator will evolve over time, and our future research will focus on this evolution. The methodology has proven to be sufficiently robust as to make it applicable across the entire spectrum of metals and organizational levels and provides a structural approach that reflects the multifaceted factors influencing the availability of metals in the 21st century. PMID:22191617

  17. Q methodology in health economics.

    PubMed

    Baker, Rachel; Thompson, Carl; Mannion, Russell

    2006-01-01

    The recognition that health economists need to understand the meaning of data if they are to adequately understand research findings which challenge conventional economic theory has led to the growth of qualitative modes of enquiry in health economics. The use of qualitative methods of exploration and description alongside quantitative techniques gives rise to a number of epistemological, ontological and methodological challenges: difficulties in accounting for subjectivity in choices, the need for rigour and transparency in method, and problems of disciplinary acceptability to health economists. Q methodology is introduced as a means of overcoming some of these challenges. We argue that Q offers a means of exploring subjectivity, beliefs and values while retaining the transparency, rigour and mathematical underpinnings of quantitative techniques. The various stages of Q methodological enquiry are outlined alongside potential areas of application in health economics, before discussing the strengths and limitations of the approach. We conclude that Q methodology is a useful addition to economists' methodological armoury and one that merits further consideration and evaluation in the study of health services. PMID:16378531

  18. On Equilibria for ADM Minimization Games

    NASA Astrophysics Data System (ADS)

    Epstein, Leah; Levin, Asaf

    In the ADM minimization problem, the input is a set of arcs along a directed ring. The input arcs need to be partitioned into non-overlapping chains and cycles so as to minimize the total number of endpoints, where a k-arc cycle contributes k endpoints and a k-arc chain contributes k + 1 endpoints. We study the ADM minimization problem as both a non-cooperative and a cooperative game. In these games, each arc corresponds to a player, and the players share the cost of the ADM switches. We consider two cost allocation models, a model which was considered by Flammini et al., and a new cost allocation model, which is inspired by congestion games. We compare the price of anarchy and price of stability in the two cost allocation models, as well as the strong price of anarchy and the strong price of stability.
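    The cost model above is easy to state in code. A small sketch (the partition encoding is ours, chosen for illustration, not taken from the paper):

```python
def adm_cost(partition):
    """Total number of ADM endpoints for a partition of the ring's arcs.
    Each element is (k, is_cycle): a k-arc cycle contributes k endpoints,
    a k-arc chain contributes k + 1, per the problem definition."""
    return sum(k if is_cycle else k + 1 for k, is_cycle in partition)

# Five arcs grouped as one 3-arc cycle plus one 2-arc chain:
print(adm_cost([(3, True), (2, False)]))   # 3 + (2 + 1) = 6
# The same five arcs as a single 5-arc chain also cost 5 + 1 = 6, while a
# single 5-arc cycle (if the arcs close the ring) would cost only 5.
```

    The games studied in the paper differ in how this total is split among the arc-players, not in the total itself.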

  19. Advanced pyrochemical technologies for minimizing nuclear waste

    SciTech Connect

    Bronson, M.C.; Dodson, K.E.; Riley, D.C.

    1994-06-01

    The Department of Energy (DOE) is seeking to reduce the size of the current nuclear weapons complex and consequently minimize operating costs. To meet this DOE objective, the national laboratories have been asked to develop advanced technologies that take uranium and plutonium from retired weapons and prepare them for new weapons, long-term storage, and/or final disposition. Current pyrochemical processes generate residue salts and ceramic wastes that require aqueous processing to remove and recover the actinides. However, the aqueous treatment of these residues generates an estimated 100 liters of acidic transuranic (TRU) waste per kilogram of plutonium in the residue. Lawrence Livermore National Laboratory (LLNL) is developing pyrochemical techniques to eliminate, minimize, or more efficiently treat these residue streams. This paper will present technologies being developed at LLNL on advanced materials for actinide containment, reactors that minimize residues, and pyrochemical processes that remove actinides from waste salts.

  20. Genetic Research on Biospecimens Poses Minimal Risk

    PubMed Central

    Wendler, David S.; Rid, Annette

    2014-01-01

    Genetic research on human biospecimens is increasingly common. Yet, debate continues over the level of risk that this research poses to sample donors. Some argue that genetic research on biospecimens poses minimal risk; others argue that it poses greater than minimal risk and therefore needs additional requirements and limitations. This debate raises concern that some donors are not receiving appropriate protection or, conversely, that valuable research is being subject to unnecessary requirements and limitations. The present paper attempts to address this concern using the widely-endorsed ‘risks of daily life’ standard. The three extant versions of this standard all suggest that, with proper measures in place to protect donor confidentiality, most genetic research on human biospecimens poses minimal risk to donors. PMID:25530152

  1. One-dimensional Gromov minimal filling problem

    SciTech Connect

    Ivanov, Alexandr O; Tuzhilin, Alexey A

    2012-05-31

    The paper is devoted to a new branch in the theory of one-dimensional variational problems with branching extremals, the investigation of one-dimensional minimal fillings introduced by the authors. On the one hand, this problem is a one-dimensional version of a generalization of Gromov's minimal fillings problem to the case of stratified manifolds. On the other hand, this problem is interesting in itself and also can be considered as a generalization of another classical problem, the Steiner problem on the construction of a shortest network connecting a given set of terminals. Besides the statement of the problem, we discuss several properties of the minimal fillings and state several conjectures. Bibliography: 38 titles.
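    For three terminals, the Steiner problem mentioned above reduces to finding the Fermat point when every angle of the triangle is below 120 degrees. A sketch using the classical Weiszfeld iteration for the geometric median (an illustrative implementation of that well-known special case, not the authors' construction for stratified manifolds; the triangle below is made up):

```python
import math

def star_length(center, terminals):
    """Total length of the star network joining center to every terminal."""
    return sum(math.dist(center, t) for t in terminals)

def fermat_point(terminals, iters=200):
    """Weiszfeld iteration for the geometric median; for three terminals
    whose angles are all below 120 degrees this is the single Steiner
    point of the shortest connecting network."""
    x = [sum(t[0] for t in terminals) / len(terminals),
         sum(t[1] for t in terminals) / len(terminals)]
    for _ in range(iters):
        wsum, nx, ny = 0.0, 0.0, 0.0
        for t in terminals:
            d = math.dist(x, t)
            if d < 1e-12:          # iterate landed exactly on a terminal
                return list(t)
            wsum += 1.0 / d
            nx += t[0] / d
            ny += t[1] / d
        x = [nx / wsum, ny / wsum]
    return x

# Right triangle with legs 3 and 4: the star through the Fermat point is
# shorter than connecting the terminals through the corner (length 3 + 4 = 7)
terminals = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]
star = fermat_point(terminals)
print(star, star_length(star, terminals))
```

    A minimal filling, like a Steiner tree, is allowed such extra branching points; the generalization in the paper replaces the ambient plane with a stratified space.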

  2. Responsible gambling: general principles and minimal requirements.

    PubMed

    Blaszczynski, Alex; Collins, Peter; Fong, Davis; Ladouceur, Robert; Nower, Lia; Shaffer, Howard J; Tavares, Hermano; Venisse, Jean-Luc

    2011-12-01

    Many international jurisdictions have introduced responsible gambling programs. These programs intend to minimize negative consequences of excessive gambling, but vary considerably in their aims, focus, and content. Many responsible gambling programs lack a conceptual framework and, in the absence of empirical data, their components are based only on general considerations and impressions. This paper outlines the consensus viewpoint of an international group of researchers suggesting fundamental responsible gambling principles, roles of key stakeholders, and minimal requirements that stakeholders can use to frame and inform responsible gambling programs across jurisdictions. Such a framework does not purport to offer value statements regarding the legal status of gambling or its expansion. Rather, it proposes gambling-related initiatives aimed at government, industry, and individuals to promote responsible gambling and consumer protection. This paper argues that there is a set of basic principles and minimal requirements that should form the basis for every responsible gambling program. PMID:21359586

  3. [EVOLUTION OF MINIMALLY INVASIVE CARDIAC SURGERY].

    PubMed

    Fujita, Tomoyuki; Kobayashi, Junjiro

    2016-03-01

    Minimally invasive surgery is an attractive choice for patients undergoing major cardiac surgery. We review the history of minimally invasive valve surgery in this article. Due to many innovations in surgical tools, cardiopulmonary bypass systems, visualization systems, and robotic systems as well as surgical techniques, minimally invasive cardiac surgery has become standard care for valve lesion repair. In particular, aortic cross-clamp techniques and methods for cardioplegia using the Chitwood clamp and root cannula or endoballoon catheter in combination with femoro-femoral bypass systems have made such procedures safer and more practical. On the other hand, robotically assisted surgery has not become standard due to the cost and slow learning curve. However, along with the development of robotics, this less-invasive technique may provide another choice for patients in the near future. PMID:27295770

  4. Minimal invasive treatments for liver malignancies.

    PubMed

    Orsi, Franco; Varano, Gianluca

    2015-11-01

    Minimally invasive therapies have proved useful in the management of primary and secondary hepatic malignancies. The most relevant aspects of all these therapies are their minimal toxicity profiles and highly effective tumor responses without affecting the normal hepatic parenchyma. These unique characteristics coupled with their minimally invasive nature provide an attractive therapeutic option for patients who previously may have had few alternatives. Combination of these therapies might extend indications to bring curative treatment to a wider selected population. The results of various ongoing combination trials of intraarterial therapies with targeted therapies are awaited to further improve survival in this patient group. This review focuses on the application of ablative and intra-arterial therapies in the management of hepatocellular carcinoma and hepatic colorectal metastasis. PMID:26050603

  5. Genetic research on biospecimens poses minimal risk.

    PubMed

    Wendler, David S; Rid, Annette

    2015-01-01

    Genetic research on human biospecimens is increasingly common. However, debate continues over the level of risk that this research poses to sample donors. Some argue that genetic research on biospecimens poses minimal risk; others argue that it poses greater than minimal risk and therefore needs additional requirements and limitations. This debate raises concern that some donors are not receiving appropriate protection or, conversely, that valuable research is being subject to unnecessary requirements and limitations. The present paper attempts to resolve this debate using the widely-endorsed 'risks of daily life' standard. The three extant versions of this standard all suggest that, with proper measures in place to protect confidentiality, most genetic research on human biospecimens poses minimal risk to donors. PMID:25530152

  6. Approximate error conjugation gradient minimization methods

    DOEpatents

    Kallman, Jeffrey S

    2013-05-21

    In one embodiment, a method includes selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, calculating an approximate error using the subset of rays, and calculating a minimum in a conjugate gradient direction based on the approximate error. In another embodiment, a system includes a processor for executing logic, logic for selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, logic for calculating an approximate error using the subset of rays, and logic for calculating a minimum in a conjugate gradient direction based on the approximate error. In other embodiments, computer program products, methods, and systems are described capable of using approximate error in constrained conjugate gradient minimization problems.
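    The embodiment described above can be illustrated with a toy least-squares problem: treat matrix rows as "rays" and run conjugate gradient using only a randomly selected subset of them for the error calculation. A hedged sketch (plain linear CG on the normal equations with synthetic data, shown only to convey the subset idea, not the patented method):

```python
import random

def cg_normal(A, b, iters=50):
    """Conjugate gradient on the normal equations A^T A x = A^T b."""
    n = len(A[0])
    matvec = lambda M, v: [sum(r[j] * v[j] for j in range(len(v))) for r in M]
    At = [[A[i][j] for i in range(len(A))] for j in range(n)]
    x = [0.0] * n
    r = matvec(At, b)          # residual of the normal equations at x = 0
    p = r[:]
    rr = sum(v * v for v in r)
    for _ in range(iters):
        Ap = matvec(At, matvec(A, p))
        alpha = rr / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rr_new = sum(v * v for v in r)
        if rr_new < 1e-16:
            break
        p = [r[i] + (rr_new / rr) * p[i] for i in range(n)]
        rr = rr_new
    return x

random.seed(2)
x_true = [1.0, -2.0, 0.5]
A = [[random.gauss(0, 1) for _ in range(3)] for _ in range(200)]   # 200 "rays"
b = [sum(a * t for a, t in zip(row, x_true)) for row in A]
subset = random.sample(range(200), 40)       # error computed from 40 rays only
A_s, b_s = [A[i] for i in subset], [b[i] for i in subset]
x_full = cg_normal(A, b)
x_approx = cg_normal(A_s, b_s)
print(x_full, x_approx)
```

    With a consistent, overdetermined system the subset solve recovers the same answer at a fraction of the per-iteration cost, which is the payoff the patent's approximate-error formulation is after.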

  7. The NLC Software Requirements Methodology

    SciTech Connect

    Shoaee, Hamid

    2002-08-20

    We describe the software requirements and development methodology developed for the NLC control system. Given the longevity of that project, and the likely geographical distribution of the collaborating engineers, the planned requirements management process is somewhat more formal than the norm in high energy physics projects. The short term goals of the requirements process are to accurately estimate costs, to decompose the problem, and to determine likely technologies. The long term goal is to enable a smooth transition from high level functional requirements to specific subsystem and component requirements for individual programmers, and to support distributed development. The methodology covers both ends of that life cycle. It covers both the analytical and documentary tools for software engineering, and project management support. This paper introduces the methodology, which is fully described in [1].

  8. Minimizing radiation damage in nonlinear optical crystals

    DOEpatents

    Cooke, D.W.; Bennett, B.L.; Cockroft, N.J.

    1998-09-08

    Methods are disclosed for minimizing laser induced damage to nonlinear crystals, such as KTP crystals, involving various means for electrically grounding the crystals in order to diffuse electrical discharges within the crystals caused by the incident laser beam. In certain embodiments, electrically conductive material is deposited onto or into surfaces of the nonlinear crystals and the electrically conductive surfaces are connected to an electrical ground. To minimize electrical discharges on crystal surfaces that are not covered by the grounded electrically conductive material, a vacuum may be created around the nonlinear crystal. 5 figs.

  9. Minimizing radiation damage in nonlinear optical crystals

    DOEpatents

    Cooke, D. Wayne; Bennett, Bryan L.; Cockroft, Nigel J.

    1998-01-01

    Methods are disclosed for minimizing laser induced damage to nonlinear crystals, such as KTP crystals, involving various means for electrically grounding the crystals in order to diffuse electrical discharges within the crystals caused by the incident laser beam. In certain embodiments, electrically conductive material is deposited onto or into surfaces of the nonlinear crystals and the electrically conductive surfaces are connected to an electrical ground. To minimize electrical discharges on crystal surfaces that are not covered by the grounded electrically conductive material, a vacuum may be created around the nonlinear crystal.

  10. Minimal mass design of tensegrity structures

    NASA Astrophysics Data System (ADS)

    Nagase, Kenji; Skelton, R. E.

    2014-03-01

    This paper provides a unified framework for minimal mass design of tensegrity systems. For any given configuration and any given set of external forces, we design force density (member force divided by length) and cross-section area to minimize the structural mass subject to an equilibrium condition and a maximum stress constraint. The answer is provided by a linear program. Stability is assured by a positive definite stiffness matrix. This condition is described by a linear matrix inequality. Numerical examples are shown to illustrate the proposed method.

  11. Navigated minimally invasive unicompartmental knee arthroplasty.

    PubMed

    Jenny, Jean-Yves; Müller, Peter E; Weyer, R; John, Michael; Weber, Patrick; Ciobanu, Eugène; Schmitz, Andreas; Bacher, Thomas; Neumann, Wolfram; Jansson, Volkmar

    2006-10-01

    Unicompartmental knee arthroplasty (UKA) is an alternative procedure to high tibial osteotomy. This study assessed the procedure using computer navigation to improve implantation accuracy and presents early radiological results of a group of patients implanted with the univation UKA (B. Braun Aesculap, Tuttlingen, Germany) with navigation instrumentation and a minimally invasive approach. The authors concluded that navigated implantation of a UKA using a nonimage-based system improved radiologic implantation accuracy without significant inconvenience and with minimal change in the conventional operating technique. PMID:17407935

  12. Instabilities and Solitons in Minimal Strips

    NASA Astrophysics Data System (ADS)

    Machon, Thomas; Alexander, Gareth P.; Goldstein, Raymond E.; Pesci, Adriana I.

    2016-07-01

    We show that highly twisted minimal strips can undergo a nonsingular transition, unlike the singular transitions seen in the Möbius strip and the catenoid. If the strip is nonorientable, this transition is topologically frustrated, and the resulting surface contains a helicoidal defect. Through a controlled analytic approximation, the system can be mapped onto a scalar ϕ^4 theory on a nonorientable line bundle over the circle, where the defect becomes a topologically protected kink soliton or domain wall, thus establishing their existence in minimal surfaces. Demonstrations with soap films confirm these results and show how the position of the defect can be controlled through boundary deformation.

  13. Instabilities and Solitons in Minimal Strips.

    PubMed

    Machon, Thomas; Alexander, Gareth P; Goldstein, Raymond E; Pesci, Adriana I

    2016-07-01

    We show that highly twisted minimal strips can undergo a nonsingular transition, unlike the singular transitions seen in the Möbius strip and the catenoid. If the strip is nonorientable, this transition is topologically frustrated, and the resulting surface contains a helicoidal defect. Through a controlled analytic approximation, the system can be mapped onto a scalar ϕ^{4} theory on a nonorientable line bundle over the circle, where the defect becomes a topologically protected kink soliton or domain wall, thus establishing their existence in minimal surfaces. Demonstrations with soap films confirm these results and show how the position of the defect can be controlled through boundary deformation. PMID:27419593

  14. Minimally invasive surgical training: challenges and solutions.

    PubMed

    Pierorazio, Phillip M; Allaf, Mohamad E

    2009-01-01

    Treatment options for urological malignancies continue to increase and include endoscopic, laparoscopic, robotic, and image-guided percutaneous techniques. This ever expanding array of technically demanding management options coupled with a static training paradigm introduces challenges to training the urological oncologist of the future. Minimally invasive learning opportunities continue to evolve, and include an intensive experience during residency, postgraduate short courses or mini-apprenticeships, and full time fellowship programs. Incorporation of large animal surgery and surgical simulators may help shorten the necessary learning curve. Ultimately, programs must provide an intense hands-on experience to trainees in all minimally invasive surgical aspects for optimal training. PMID:19285236

  15. The Parisi Formula has a Unique Minimizer

    NASA Astrophysics Data System (ADS)

    Auffinger, Antonio; Chen, Wei-Kuo

    2015-05-01

    In 1979, Parisi (Phys Rev Lett 43:1754-1756, 1979) predicted a variational formula for the thermodynamic limit of the free energy in the Sherrington-Kirkpatrick model, and described the role played by its minimizer. This formula was verified in the seminal work of Talagrand (Ann Math 163(1):221-263, 2006) and later generalized to the mixed p-spin models by Panchenko (Ann Probab 42(3):946-958, 2014). In this paper, we prove that the minimizer in Parisi's formula is unique at any temperature and external field by establishing the strict convexity of the Parisi functional.

  16. Pattern Search Methods for Linearly Constrained Minimization

    NASA Technical Reports Server (NTRS)

    Lewis, Robert Michael; Torczon, Virginia

    1998-01-01

    We extend pattern search methods to linearly constrained minimization. We develop a general class of feasible point pattern search algorithms and prove global convergence to a Karush-Kuhn-Tucker point. As in the case of unconstrained minimization, pattern search methods for linearly constrained problems accomplish this without explicit recourse to the gradient or the directional derivative. Key to the analysis of the algorithms is the way in which the local search patterns conform to the geometry of the boundary of the feasible region.
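    The derivative-free polling idea behind pattern search fits in a few lines. An unconstrained coordinate-pattern sketch (the abstract treats the linearly constrained case, where poll directions must conform to the geometry of the feasible boundary; this illustration omits that and is not the authors' algorithm class):

```python
def pattern_search(f, x0, step=1.0, tol=1e-6, max_iter=10000):
    """Poll +/- step along each coordinate; accept the first improving
    point; halve the step whenever a full poll fails to improve."""
    x, fx = list(x0), f(x0)
    it = 0
    while step > tol and it < max_iter:
        it += 1
        improved = False
        for d in range(len(x)):
            for s in (step, -step):
                trial = x[:]
                trial[d] += s
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
                    break
        if not improved:
            step /= 2.0        # refine the pattern
    return x, fx

# Convex quadratic with minimizer (3, -1); no gradient is ever evaluated
quad = lambda v: (v[0] - 3) ** 2 + (v[1] + 1) ** 2
x, fx = pattern_search(quad, [0.0, 0.0])
print(x, fx)   # converges to [3.0, -1.0], fx = 0.0
```

    The convergence theory in the paper hinges on the poll directions forming a suitable generating set; near a linear constraint those directions must span the tangent cone of the feasible region, which the coordinate pattern above does not guarantee.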

  17. Minimally invasive transforaminal lumbosacral interbody fusion.

    PubMed

    Chang, Peng-Yuan; Wang, Michael Y

    2016-07-01

    In minimally invasive spinal fusion surgery, transforaminal lumbar (sacral) interbody fusion (TLIF) is one of the most common procedures that provides both anterior and posterior column support without retraction or violation of the neural structures. Direct and indirect decompression can be done through this single approach. Preoperative plain radiographs and MR scan should be carefully evaluated. This video demonstrates a standard approach for how to perform a minimally invasive transforaminal lumbosacral interbody fusion. The video can be found here: https://youtu.be/bhEeafKJ370 . PMID:27364426

  18. From Jack polynomials to minimal model spectra

    NASA Astrophysics Data System (ADS)

    Ridout, David; Wood, Simon

    2015-01-01

    In this note, a deep connection between free field realizations of conformal field theories and symmetric polynomials is presented. We give a brief introduction into the necessary prerequisites of both free field realizations and symmetric polynomials, in particular Jack symmetric polynomials. Then we combine these two fields to classify the irreducible representations of the minimal model vertex operator algebras as an illuminating example of the power of these methods. While these results on the representation theory of the minimal models are all known, this note exploits the full power of Jack polynomials to present significant simplifications of the original proofs in the literature.

  19. Non-minimal inflation and SUSY GUTs

    SciTech Connect

    Okada, Nobuchika

    2012-07-27

    The Standard Model Higgs boson with the nonminimal coupling to the gravitational curvature can drive cosmological inflation. We study this type of inflationary scenario in the context of supergravity. We first point out that it is naturally implemented in the minimal supersymmetric SU(5) model, and hence virtually in any GUT models. Next we propose another scenario based on the Minimal Supersymmetric Standard Model supplemented by the right-handed neutrinos. These models can be tested by new observational data from the Planck satellite experiments within a few years.

  20. Minimally invasive approach to familial multiple lipomatosis.

    PubMed

    Ronan, S J; Broderick, T

    2000-09-01

    Thirty-five abdominal wall lipomas were removed from a patient with familial multiple lipomatosis using a minimally invasive approach in a cost-effective, reliable, and cosmetically pleasing manner. The surgical technique used is described in this case report. Clinical findings and prior excisions provided the preoperative diagnosis. The abdominal wall was dissected through two small, vertical midline incisions in the suprafascial plane with the aid of a lighted breast retractor. A complete excision of all palpable lipomas was achieved with this approach. The patient had excellent cosmetic results with minimal postoperative scarring. PMID:11007403

  1. Minimally Invasive Surgery for Inflammatory Bowel Disease

    PubMed Central

    Holder-Murray, Jennifer; Marsicovetere, Priscilla

    2015-01-01

    Abstract: Surgical management of inflammatory bowel disease is a challenging endeavor given infectious and inflammatory complications, such as fistula and abscess, and complex, often postoperative, anatomy, including adhesive disease from previous open operations. Patients with Crohn's disease and ulcerative colitis also bring the burden of their chronic illness, with anemia, malnutrition, and immunosuppression all common and each an independent risk factor for increased surgical morbidity in this high-risk population. However, to reduce the physical trauma of surgery, technologic advances and worldwide experience with minimally invasive surgery have allowed laparoscopic management of these patients to become the standard of care, with significant short- and long-term patient benefits compared with the open approach. In this review, we describe the current state of the art for minimally invasive surgery for inflammatory bowel disease and the caveats inherent in this practice in this complex patient population. We also review the applicability of current and future trends in minimally invasive surgical technique, such as laparoscopic “incisionless,” single-incision laparoscopic surgery (SILS), robotic-assisted, and other techniques for the patient with inflammatory bowel disease. There can be no doubt that minimally invasive surgery has been proven to decrease the short- and long-term burden of surgery for these chronic illnesses and represents high-value care for both patient and society. PMID:25989341

  2. DUPONT CHAMBERS WORKS WASTE MINIMIZATION PROJECT

    EPA Science Inventory

    In a joint U.S. Environmental Protection Agency (EPA) and DuPont waste minimization project, fifteen waste streams were selected for assessment. The intent was to develop assessments diverse in terms of process type, mode of operation, waste type, disposal needed, and relative s...

  3. Practice Enables Successful Learning under Minimal Guidance

    ERIC Educational Resources Information Center

    Brunstein, Angela; Betts, Shawn; Anderson, John R.

    2009-01-01

    Two experiments were conducted, contrasting a minimally guided discovery condition with a variety of instructional conditions. College students interacted with a computer-based tutor that presented algebra-like problems in a novel graphical representation. Although the tutor provided no instruction in a discovery condition, it constrained the…

  4. Minimally invasive pancreatic surgery – a review

    PubMed Central

    Damoli, Isacco; Ramera, Marco; Paiella, Salvatore; Marchegiani, Giovanni; Bassi, Claudio

    2015-01-01

    During the past 20 years the application of a minimally invasive approach to pancreatic surgery has progressively increased. Distal pancreatectomy is the most frequently performed procedure, because of the absence of a reconstructive phase. However, middle pancreatectomy and pancreatoduodenectomy have been demonstrated to be safe and feasible as well. Laparoscopic distal pancreatectomy is recognized as the gold standard treatment for small tumors of the pancreatic body-tail, with several advantages over the traditional open approach in terms of patient recovery. The surgical treatment of lesions of the pancreatic head via a minimally invasive approach is still limited to a few highly experienced surgeons, due to the very challenging resection and complex anastomoses. Middle pancreatectomy and enucleation are indicated for small and benign tumors and offer the maximum preservation of the parenchyma. The introduction of a robotic platform more than ten years ago increased the interest of many surgeons in minimally invasive treatment of pancreatic diseases. This new technology overcomes all the limitations of laparoscopic surgery, but actual benefits for the patients are still under investigation. The increased costs associated with robotic surgery are under debate too. This article presents the state of the art of minimally invasive pancreatic surgery. PMID:26240612

  5. Minimal Interventions in the Teaching of Mathematics

    ERIC Educational Resources Information Center

    Foster, Colin

    2014-01-01

    This paper addresses ways in which mathematics pedagogy can benefit from insights gleaned from counselling. Person-centred counselling stresses the value of genuineness, warm empathetic listening and minimal intervention to support people in solving their own problems and developing increased autonomy. Such an approach contrasts starkly with the…

  6. Minimization search method for data inversion

    NASA Technical Reports Server (NTRS)

    Fymat, A. L.

    1975-01-01

    A technique has been developed for determining the values of selected subsets of independent variables in mathematical formulations. The required computation time increases with the first power of the number of variables. This is in contrast with classical minimization methods, for which computation time increases with the third power of the number of variables.
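The scaling claim can be illustrated by counting the work in a single search sweep: probing each variable costs a fixed number of function evaluations, so a sweep is linear in the number of variables, whereas a classical Newton-type step must solve an n-by-n linear system, which is cubic in n. The sketch below is illustrative only; `sweep` and the test function are not from the paper.

```python
import numpy as np

def sweep(f, x, delta):
    """One search sweep: probe +/- delta along each coordinate, keeping
    any improvement. Exactly 2*n probes plus one baseline evaluation,
    so work per sweep grows with the first power of n, versus the n**3
    cost of the linear solve inside a classical Newton step."""
    fx = f(x)
    for i in range(x.size):
        for sign in (+1.0, -1.0):
            trial = x.copy()
            trial[i] += sign * delta
            ft = f(trial)
            if ft < fx:
                x, fx = trial, ft
    return x, fx

calls = 0
def rosenbrock2(v):
    global calls
    calls += 1                      # count evaluations to expose the scaling
    return (1 - v[0])**2 + 100 * (v[1] - v[0]**2)**2

x, fx = sweep(rosenbrock2, np.zeros(2), 0.25)
# one sweep over n = 2 variables uses 1 + 2*2 = 5 evaluations
```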

  7. Pancreatic cancer: Open or minimally invasive surgery?

    PubMed Central

    Zhang, Yu-Hua; Zhang, Cheng-Wu; Hu, Zhi-Ming; Hong, De-Fei

    2016-01-01

    Pancreatic duct adenocarcinoma is one of the most fatal malignancies, with R0 resection remaining the most important part of treatment of this malignancy. However, pancreatectomy is believed to be one of the most challenging procedures and R0 resection remains the only chance for patients with pancreatic cancer to have a good prognosis. Some surgeons have tried minimally invasive pancreatic surgery, but the short- and long-term outcomes of pancreatic malignancy remain controversial between open and minimally invasive procedures. We collected comparative data about minimally invasive and open pancreatic surgery. The available evidence suggests that minimally invasive pancreaticoduodenectomy (MIPD) is as safe and feasible as open PD (OPD), and shows some benefit, such as less intraoperative blood loss and shorter postoperative hospital stay. Despite the limited evidence for MIPD in pancreatic cancer, most of the available data show that the short-term oncological adequacy is similar between MIPD and OPD. Some surgical techniques, including superior mesenteric artery-first approach and laparoscopic pancreatoduodenectomy with major vein resection, are believed to improve the rate of R0 resection. Laparoscopic distal pancreatectomy is less technically demanding and is accepted in more pancreatic centers. It is technically safe and feasible and has similar short-term oncological prognosis compared with open distal pancreatectomy. PMID:27621576

  8. Inflation with non-minimally derivative coupling

    NASA Astrophysics Data System (ADS)

    Yang, Nan; Gao, Qing; Gong, Yungui

    2015-10-01

    We derive the second order correction to the scalar and tensor spectral tilts for the inflationary models with non-minimally derivative coupling. In the high friction limit, the quartic power law potential is consistent with the observational constraint at 95% CL because the amplitude of the primordial gravitational waves is smaller, and the inflaton excursion is sub-Planckian.

  9. Minimal Mimicry: Mere Effector Matching Induces Preference

    ERIC Educational Resources Information Center

    Sparenberg, Peggy; Topolinski, Sascha; Springer, Anne; Prinz, Wolfgang

    2012-01-01

    Both mimicking and being mimicked induces preference for a target. The present experiments investigate the minimal sufficient conditions for this mimicry-preference link to occur. We argue that mere effector matching between one's own and the other person's movement is sufficient to induce preference, independent of which movement is actually…

  10. MULTIOBJECTIVE PARALLEL GENETIC ALGORITHM FOR WASTE MINIMIZATION

    EPA Science Inventory

    In this research we have developed an efficient multiobjective parallel genetic algorithm (MOPGA) for waste minimization problems. This MOPGA integrates PGAPack (Levine, 1996) and NSGA-II (Deb, 2000) with novel modifications. PGAPack is a master-slave parallel implementation of a...

  11. Using Rewards to Minimize Overdue Book Rates

    ERIC Educational Resources Information Center

    Mitchell, W. Bede; Smith, Fred W.

    2005-01-01

    For as long as libraries have charged fines for books returned after their due dates, this familiar practice has excited comment and controversy. Fines are thought by many to deter patrons from keeping materials too long. However, others believe there is little persuasive evidence that fines are more effective at minimizing overdues than are…

  12. Pancreatic cancer: Open or minimally invasive surgery?

    PubMed

    Zhang, Yu-Hua; Zhang, Cheng-Wu; Hu, Zhi-Ming; Hong, De-Fei

    2016-08-28

    Pancreatic duct adenocarcinoma is one of the most fatal malignancies, with R0 resection remaining the most important part of treatment of this malignancy. However, pancreatectomy is believed to be one of the most challenging procedures and R0 resection remains the only chance for patients with pancreatic cancer to have a good prognosis. Some surgeons have tried minimally invasive pancreatic surgery, but the short- and long-term outcomes of pancreatic malignancy remain controversial between open and minimally invasive procedures. We collected comparative data about minimally invasive and open pancreatic surgery. The available evidence suggests that minimally invasive pancreaticoduodenectomy (MIPD) is as safe and feasible as open PD (OPD), and shows some benefit, such as less intraoperative blood loss and shorter postoperative hospital stay. Despite the limited evidence for MIPD in pancreatic cancer, most of the available data show that the short-term oncological adequacy is similar between MIPD and OPD. Some surgical techniques, including superior mesenteric artery-first approach and laparoscopic pancreatoduodenectomy with major vein resection, are believed to improve the rate of R0 resection. Laparoscopic distal pancreatectomy is less technically demanding and is accepted in more pancreatic centers. It is technically safe and feasible and has similar short-term oncological prognosis compared with open distal pancreatectomy. PMID:27621576

  13. Minimization of Salmonella Contamination on Raw Poultry

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Many reviews have discussed Salmonella in poultry and suggested best practices to minimize this organism on raw poultry meat. Despite years of research and conscientious control efforts by industry and regulatory agencies, human salmonellosis rates have declined only modestly and Salmonella is stil...

  14. Minimal Competency Measurement Conference: Summary Report.

    ERIC Educational Resources Information Center

    1976

    On March 11 and 12, 1976, the National Assessment of Educational Progress (NAEP), a project of the Education Commission of the States (ECS), and the Clearinghouse for Applied Performance Testing (CAPT), a project of the Northwest Regional Educational Laboratory (NWREL), cohosted a Minimal Competency Measurement Conference in Denver, Colorado.…

  15. Challenging the minimal supersymmetric SU(5) model

    SciTech Connect

    Bajc, Borut; Lavignac, Stéphane; Mede, Timon

    2014-06-24

    We review the main constraints on the parameter space of the minimal renormalizable supersymmetric SU(5) grand unified theory. They consist of the Higgs mass, proton decay, electroweak symmetry breaking and fermion masses. Superpartner masses are constrained both from below and from above, giving hope for confirming or definitely ruling out the theory in the future. This contribution is based on Ref. [1].

  16. Minimal Guidelines for Authors of Web Pages.

    ERIC Educational Resources Information Center

    ADE Bulletin, 2002

    2002-01-01

    Presents guidelines that recommend the minimal reference information that should be provided on Web pages intended for use by students, teachers, and scholars in the modern languages. Suggests the inclusion of information about responsible parties, copyright declaration, privacy statements, and site information. Makes a note on Web page style. (SG)

  17. DUPONT CHAMBERS WORKS WASTE MINIMIZATION PROJECT

    EPA Science Inventory

    In a joint U.S. Environmental Protection Agency (EPA) and DuPont waste minimization project, fifteen waste streams were selected for assessment. The intent was to develop assessments diverse in terms of process type, mode of operation, waste type, disposal needed, and relative suc...

  18. Analytical Utility of Campylobacter Methodologies

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The National Advisory Committee on Microbiological Criteria for Foods (NACMCF, or the Committee) was asked to address the analytical utility of Campylobacter methodologies in preparation for an upcoming United States Food Safety and Inspection Service (FSIS) baseline study to enumerate Campylobacter...

  19. ALTERNATIVES TO DUPLICATE DIET METHODOLOGY

    EPA Science Inventory

    Duplicate Diet (DD) methodology has been used to collect information about the dietary exposure component in the context of total exposure studies. DD methods have been used to characterize the dietary exposure component in the NHEXAS pilot studies. NERL desired to evaluate it...

  20. INHALATION EXPOSURE-RESPONSE METHODOLOGY

    EPA Science Inventory

    The Inhalation Exposure-Response Analysis Methodology Document is expected to provide guidance on the development of the basic toxicological foundations for deriving reference values for human health effects, focusing on the hazard identification and dose-response aspects of the ...

  1. TRACE ELEMENTS - METHODOLOGY, AND LEGISLATION

    EPA Science Inventory

    This article has been requested by ASTM to be included in a special issue of Standardization News, dealing with analytical chemistry. The article traces federal water legislation as it pertains to drinking water and wastewater and how it applies to the approved methodology for me...

  2. WATER QUALITY ASSESSMENT METHODOLOGY (WQAM)

    EPA Science Inventory

    The Water Quality Assessment Methodology (WQAM) is a screening procedure for toxic and conventional pollutants in surface and ground waters and is a collection of formulas, tables, and graphs that planners can use for preliminary assessment of surface and ground water quality in ...

  3. Analyzing Media: Metaphors as Methodologies.

    ERIC Educational Resources Information Center

    Meyrowitz, Joshua

    Students have little intuitive insight into the process of thinking and structuring ideas. The image of metaphor for a phenomenon acts as a kind of methodology for the study of the phenomenon by (1) defining the key issues or problems; (2) shaping the type of research questions that are asked; (3) defining the type of data that are searched out;…

  4. A Methodological Investigation of Cultivation.

    ERIC Educational Resources Information Center

    Rubin, Alan M.; And Others

    Cultivation theory states that television engenders negative emotions in heavy viewers. Noting that cultivation methodology contains an apparent response bias, a study examined relationships between television exposure and positive restatements of cultivation concepts and tested a more instrumental media uses and effects model. Cultivation was…

  5. Philosophy, Methodology and Action Research

    ERIC Educational Resources Information Center

    Carr, Wilfred

    2006-01-01

    The aim of this paper is to examine the role of methodology in action research. It begins by showing how, as a form of inquiry concerned with the development of practice, action research is nothing other than a modern 20th century manifestation of the pre-modern tradition of practical philosophy. It then draws in Gadamer's powerful vindication of…

  6. Methodology for determining multilayered temperature inversions

    NASA Astrophysics Data System (ADS)

    Fochesatto, G. J.

    2015-05-01

    Temperature sounding of the atmospheric boundary layer (ABL) and lower troposphere exhibits multilayered temperature inversions, especially at high latitudes during extreme winters. These temperature inversion layers originate from the combined forcing of local- and large-scale synoptic meteorology. At the local scale, the thermal inversion layer forms near the surface and plays a central role in controlling surface radiative cooling and air pollution dispersion; however, depending upon the large-scale synoptic meteorological forcing, an upper-level thermal inversion can also exist, topping the local ABL. In this article a numerical methodology is reported to determine the thermal inversion layers present in a given temperature profile and deduce some of their thermodynamic properties. The algorithm extracts from the temperature profile the most important temperature variations defining thermal inversion layers. This is accomplished by a linear interpolation function of variable length that minimizes an error function. The algorithm's functionality is demonstrated on actual radiosonde profiles, deducing the multilayered temperature inversion structure with an independently set error fraction.
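The paper's algorithm fits variable-length linear segments that minimize an error function; the sketch below substitutes a much cruder rule, treating each maximal run of warming samples as an inversion layer, just to show the kind of input and output involved. `inversion_layers` and the synthetic profile are illustrative, not the published method.

```python
import numpy as np

def inversion_layers(z, T, min_depth=0.0):
    """Identify layers where temperature increases with height.
    Returns (z_base, z_top, dT) for each contiguous run with dT/dz > 0.
    A minimal stand-in for a variable-length piecewise-linear fit."""
    dT = np.diff(T)
    layers = []
    i = 0
    while i < len(dT):
        if dT[i] > 0:
            j = i
            while j < len(dT) and dT[j] > 0:
                j += 1                      # extend the warming run
            if z[j] - z[i] >= min_depth:
                layers.append((z[i], z[j], T[j] - T[i]))
            i = j
        else:
            i += 1
    return layers

# Synthetic profile: a surface-based inversion plus an elevated one.
z = np.array([0, 100, 200, 300, 400, 500, 600, 700], dtype=float)
T = np.array([-20, -15, -12, -14, -16, -13, -15, -17], dtype=float)
layers = inversion_layers(z, T)
# → [(0.0, 200.0, 8.0), (400.0, 500.0, 3.0)]
```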

  7. Feminist Methodologies and Engineering Education Research

    ERIC Educational Resources Information Center

    Beddoes, Kacey

    2013-01-01

    This paper introduces feminist methodologies in the context of engineering education research. It builds upon other recent methodology articles in engineering education journals and presents feminist research methodologies as a concrete engineering education setting in which to explore the connections between epistemology, methodology and theory.…

  8. Medical device specificities: opportunities for a dedicated product development methodology.

    PubMed

    Santos, Isa C T; Gazelle, G Scott; Rocha, Luís A; Tavares, João Manuel R S

    2012-05-01

    The medical sector, similarly to other industries such as the aviation industry, has to comply with multiple regulations, guidelines and standards. In addition, there are multiple definitions for the expression 'medical device', and before entering the market, manufacturers must demonstrate their product's safety and effectiveness. In such a complex and demanding environment, it is crucial to know the particularities surrounding the product being developed in order to minimize the chances of a commercial flop. Thus, in this paper, medical device specificities are identified, and the most relevant legislation is reviewed providing the foundations for a dedicated product development methodology. PMID:22702261

  9. Utilize common criteria methodology for secure ubiquitous healthcare environment.

    PubMed

    Yu, Yao-Chang; Hou, Ting-Wei

    2012-06-01

    RFID technology is widely used in healthcare environments to ensure patient safety. Therefore, the testing of RFID tags, such as performance tests and security evaluations, is necessary to ensure inter-operational functional compatibility with standards. A survey of the literature shows that while standards around RFID performance tests have been addressed, the same is not true for security evaluations. Therefore, in this paper, we introduce the Common Criteria security evaluation methodology, also known as ISO/IEC 15408, for the security evaluation of RFID tags and propose a framework as a minimal requirement for RFID tags to improve security assurance. PMID:21086153

  10. A Mixed Methodological Analysis of the Role of Culture in the Clinical Decision-Making Process

    ERIC Educational Resources Information Center

    Hays, Danica G.; Prosek, Elizabeth A.; McLeod, Amy L.

    2010-01-01

    Even though literature indicates that particular cultural groups receive more severe diagnoses at disproportionate rates, there has been minimal research that addresses how culture interfaces specifically with clinical decision making. This mixed methodological study of 41 counselors indicated that cultural characteristics of both counselors and…

  11. The R&D Strategy as an Alternative to Program Evaluation Methodology.

    ERIC Educational Resources Information Center

    Urban, Hugh B.

    Program evaluation methodology as an investigatory strategy has been developed to generate information as to the capabilities of human service programs to produce maximal desired effects in client populations with minimal resource use. Experience with the procedures, however, has disclosed multiple difficulties attendant upon their use, doubt as…

  12. Methodological challenges to human medical study.

    PubMed

    Zhong, Yixin; Liu, Baoyan; Qu, Hua; Xie, Qi

    2014-09-01

    With the transformation of the modern medical pattern, medical studies are confronted with methodological challenges. By analyzing the two methodologies used in the study of physical matter systems and information systems, the article points out that traditional Chinese medicine (TCM), especially treatment based on syndrome differentiation, embodies the information conception of methodological positions, while western medicine represents the matter conception. It proposes a new way of thinking about the combination of TCM and western medicine by combining the two kinds of methodological approaches. PMID:25159994

  13. The Minimal Supersymmetric Fat Higgs Model

    SciTech Connect

    Harnik, Roni; Kribs, Graham D.; Larson, Daniel T.; Murayama, Hitoshi

    2003-11-26

    We present a calculable supersymmetric theory of a composite "fat" Higgs boson. Electroweak symmetry is broken dynamically through a new gauge interaction that becomes strong at an intermediate scale. The Higgs mass can easily be 200-450 GeV along with the superpartner masses, solving the supersymmetric little hierarchy problem. We explicitly verify that the model is consistent with precision electroweak data without fine-tuning. Gauge coupling unification can be maintained despite the inherently strong dynamics involved in electroweak symmetry breaking. Supersymmetrizing the Standard Model therefore does not imply a light Higgs mass, contrary to the lore in the literature. The Higgs sector of the minimal Fat Higgs model has a mass spectrum that is distinctly different from the Minimal Supersymmetric Standard Model.

  14. Perfusion techniques for minimally invasive valve procedures.

    PubMed

    de Jong, A; Popa, B A; Stelian, E; Karazanishvili, L; Lanzillo, G; Simonini, S; Renzi, L; Diena, M; Tesler, U F

    2015-05-01

    In this paper, we present, in detail, the simplified perfusion technique that we have adopted since January 2009 and that we have utilized in 200 cases for cardiac minimally invasive valvular procedures that were performed through a right lateral mini-thoracotomy in the 3rd-4th intercostal space. Cardiopulmonary bypass was achieved by means of the direct cannulation of the ascending aorta and the insertion of a percutaneous venous cannula in the femoral vein. A flexible aortic cross-clamp was applied through the skin incision and cardioplegic arrest was obtained with the antegrade delivery of a crystalloid solution. Gravity drainage was enhanced by vacuum-assisted aspiration. There were no technical complications related to this perfusion technique that we have adopted in minimally invasive surgical procedures. PMID:25280878

  15. The Sense of Commitment: A Minimal Approach

    PubMed Central

    Michael, John; Sebanz, Natalie; Knoblich, Günther

    2016-01-01

    This paper provides a starting point for psychological research on the sense of commitment within the context of joint action. We begin by formulating three desiderata: to illuminate the motivational factors that lead agents to feel and act committed, to pick out the cognitive processes and situational factors that lead agents to sense that implicit commitments are in place, and to illuminate the development of an understanding of commitment in ontogeny. In order to satisfy these three desiderata, we propose a minimal framework, the core of which is an analysis of the minimal structure of situations which can elicit a sense of commitment. We then propose a way of conceptualizing and operationalizing the sense of commitment, and discuss cognitive and motivational processes which may underpin the sense of commitment. PMID:26779080

  16. Commercial radioactive waste minimization program development guidance

    SciTech Connect

    Fischer, D.K.

    1991-01-01

    This document is one of two prepared by the EG&G Idaho, Inc., Waste Management Technical Support Program Group, National Low-Level Waste Management Program Unit. One of several Department of Energy responsibilities stated in the Amendments Act of 1985 is to provide technical assistance to compact regions, Host States, and nonmember States (to the extent provided in appropriations acts) in establishing waste minimization program plans. Technical assistance includes, among other things, the development of technical guidelines for volume reduction options. Pursuant to this defined responsibility, the Department of Energy (through EG&G Idaho, Inc.) has prepared this report, which includes guidance on defining a program, State/compact commission participation, and waste minimization program plans.

  17. Commercial radioactive waste minimization program development guidance

    SciTech Connect

    Fischer, D.K.

    1991-01-01

    This document is one of two prepared by the EG&G Idaho, Inc., Waste Management Technical Support Program Group, National Low-Level Waste Management Program Unit. One of several Department of Energy responsibilities stated in the Amendments Act of 1985 is to provide technical assistance to compact regions, Host States, and nonmember States (to the extent provided in appropriations acts) in establishing waste minimization program plans. Technical assistance includes, among other things, the development of technical guidelines for volume reduction options. Pursuant to this defined responsibility, the Department of Energy (through EG&G Idaho, Inc.) has prepared this report, which includes guidance on defining a program, State/compact commission participation, and waste minimization program plans.

  18. Facets of the balanced minimal evolution polytope.

    PubMed

    Forcey, Stefan; Keefe, Logan; Sands, William

    2016-08-01

    The balanced minimal evolution (BME) method of creating phylogenetic trees can be formulated as a linear programming problem, minimizing an inner product over the vertices of the BME polytope. In this paper we undertake the project of describing the facets of this polytope. We classify and identify the combinatorial structure and geometry (facet inequalities) of all the facets in dimensions up to five, and classify even more facets in all dimensions. A full set of facet inequalities would allow a full implementation of the simplex method for finding the BME tree, although there are reasons to think this an unreachable goal. However, our results provide the crucial first steps for a more likely-to-be-successful program: finding efficient relaxations of the BME polytope. PMID:26714816
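Because a linear functional over a polytope attains its minimum at a vertex, the BME optimization reduces, in principle, to scanning the finitely many tree vertices for the smallest inner product with the dissimilarity vector. The sketch below shows only that reduction; the vertex vectors are placeholders, not the actual BME encoding of tree topologies, and `bme_argmin` is an illustrative name.

```python
import numpy as np

def bme_argmin(d, vertices):
    """Minimize <d, x> over a finite list of polytope vertices.
    For BME there would be one vertex per candidate tree topology;
    here the vertices are arbitrary placeholder vectors."""
    scores = [float(np.dot(d, v)) for v in vertices]
    best = int(np.argmin(scores))
    return best, scores[best]

# Toy example: three hypothetical vertex vectors and a dissimilarity vector.
d = np.array([1.0, 2.0, 3.0])
vertices = [np.array([1.0, 0.0, 1.0]),
            np.array([0.0, 1.0, 1.0]),
            np.array([1.0, 1.0, 0.0])]
idx, val = bme_argmin(d, vertices)  # vertex 2 scores 1 + 2 = 3, the minimum
```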

  19. Neurocontroller analysis via evolutionary network minimization.

    PubMed

    Ganon, Zohar; Keinan, Alon; Ruppin, Eytan

    2006-01-01

    This study presents a new evolutionary network minimization (ENM) algorithm. Neurocontroller minimization is beneficial for finding small parsimonious networks that permit a better understanding of their workings. The ENM algorithm is specifically geared to an evolutionary agents setup, as it does not require any explicit supervised training error, and is very easily incorporated in current evolutionary algorithms. ENM is based on a standard genetic algorithm with an additional step during reproduction in which synaptic connections are irreversibly eliminated. It receives as input a successfully evolved neurocontroller and aims to output a pruned neurocontroller, while maintaining the original fitness level. The small neurocontrollers produced by ENM provide upper bounds on the neurocontroller size needed to perform a given task successfully, and can provide for more efficient hardware implementations. PMID:16859448
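The pruning step described above can be sketched as follows: tentatively zero synapses one at a time and keep each deletion only if fitness stays near the original level. This is a minimal sketch under assumed names; `enm_prune`, the tolerance scheme, and the toy fitness function are illustrative, not the paper's evolutionary-agent setup or its genetic algorithm.

```python
import random

def enm_prune(weights, evaluate, baseline, tol=0.05, rng=None):
    """Sketch of ENM's extra reproduction step: try to irreversibly zero
    synapses, keeping each deletion only if fitness stays within tol of
    the evolved controller's baseline level."""
    rng = rng or random.Random(0)
    order = list(weights)
    rng.shuffle(order)                 # visit synapses in random order
    for key in order:
        saved = weights[key]
        weights[key] = 0.0
        if evaluate(weights) < baseline - tol:
            weights[key] = saved       # deletion hurt fitness: undo it
    return weights

# Toy fitness: only synapses 'a' and 'b' matter; 'c' and 'd' are redundant.
def evaluate(w):
    return 1.0 - abs(w['a'] - 1.0) - abs(w['b'] + 1.0)

w = {'a': 1.0, 'b': -1.0, 'c': 0.3, 'd': -0.7}
pruned = enm_prune(w, evaluate, baseline=evaluate(w))
# 'c' and 'd' are zeroed; 'a' and 'b' survive with their original weights
```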

  20. Minimal optical decomposition of ray transfer matrices.

    PubMed

    Liu, Xiyuan; Brenner, Karl-Heinz

    2008-08-01

    The properties of first-order optical systems are described paraxially by a ray transfer matrix, also called the ABCD matrix. Here we consider the inverse problem: an ABCD matrix is given, and we look for the minimal optical system that consists of only lenses and pieces of free-space propagation. Similar decompositions have been studied before but without the restriction to these two element types or without an attempt at minimalization. As the main results of this paper, we found that general lossless one-dimensional optical systems can be synthesized with a maximum of four elements and two-dimensional optical systems can be synthesized with six elements at most. PMID:18670547
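The two building blocks the decomposition is restricted to have standard paraxial ABCD matrices: [[1, d], [0, 1]] for free-space propagation over distance d and [[1, 0], [-1/f, 1]] for a thin lens of focal length f. The sketch below only shows how these compose into a system matrix, using a textbook 2f system as the example; it does not re-derive the paper's four- and six-element bounds.

```python
import numpy as np

def free_space(d):
    """Paraxial ray transfer (ABCD) matrix for propagation over distance d."""
    return np.array([[1.0, d], [0.0, 1.0]])

def thin_lens(f):
    """ABCD matrix for a thin lens of focal length f."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# A 2f system (space f, lens f, space f) synthesizes the ABCD matrix
# [[0, f], [-1/f, 0]] from three elements. Matrices compose right-to-left
# in the order the ray encounters the elements.
f = 100.0
system = free_space(f) @ thin_lens(f) @ free_space(f)
```

For a lossless system the determinant of the composed matrix stays 1, which is why only lenses and free-space sections are needed.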

  1. Minimal conditions for protocell stationary growth.

    PubMed

    Bigan, Erwan; Steyaert, Jean-Marc; Douady, Stéphane

    2015-01-01

    We show that self-replication of a chemical system encapsulated within a membrane growing from within is possible without any explicit feature such as autocatalysis or metabolic closure, and without the need for their emergence through complexity. We use a protocell model relying upon random conservative chemical reaction networks with arbitrary stoichiometry, and we investigate the protocell's capability for self-replication, for various numbers of reactions in the network. We elucidate the underlying mechanisms in terms of simple minimal conditions pertaining only to the topology of the embedded chemical reaction network. A necessary condition is that each moiety must be fed, and a sufficient condition is that each siphon is fed. Although these minimal conditions are purely topological, by further endowing conservative chemical reaction networks with thermodynamically consistent kinetics, we show that the growth rate tends to increase on increasing the Gibbs energy per unit molecular weight of the nutrient and on decreasing that of the membrane precursor. PMID:25951201

  2. Revisiting the minimal chaotic inflation model

    NASA Astrophysics Data System (ADS)

    Harigaya, Keisuke; Ibe, Masahiro; Kawasaki, Masahiro; Yanagida, Tsutomu T.

    2016-05-01

    We point out that the prediction of the minimal chaotic inflation model is altered if a scalar field takes a large field value close to the Planck scale during inflation due to a negative Hubble induced mass. In particular, we show that the inflaton potential is effectively flattened at a large inflaton field value in the presence of such a scalar field. The scalar field may be identified with the standard model Higgs field or super partners of standard model fermions. With such Hubble-induced flattening, we find that the minimal chaotic inflation model, especially the model with a quadratic potential, is consistent with recent observations of the cosmic microwave background fluctuation without modifying the inflation model itself.

  3. Minimal information to determine affine shape equivalence.

    PubMed

    Wagemans, J; Van Gool, L; Lamote, C; Foster, D H

    2000-04-01

    Participants judged the affine equivalence of 2 simultaneously presented 4-point patterns. Performance level (d') varied between 1.5 and 2.7, depending on the information available for solving the correspondence problem (insufficient in Experiment 1a, superfluous in Experiment 1b, and minimal in Experiments 1c, 2a, 2b) and on the exposure time (unlimited in Experiments 1 and 2a and 500 ms in Experiment 2b), but it did not vary much with the complexity of the affine transformation (rotation and slant in Experiment 1 and same plus tilt in Experiment 2). Performance in Experiment 3 was lower with 3-point patterns than with 4-point patterns, whereas blocking the trials according to the affine transformation parameters had little effect. Determining affine shape equivalence with minimal-information displays is based on a fast assessment of qualitatively or quasi-invariant properties such as convexity/concavity, parallelism, and collinearity. PMID:10811156
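
    When the point correspondence is known, the affine-equivalence judgment reduces to linear algebra: three point pairs determine the affine map, and the fourth pair tests it. A minimal sketch (an idealized check, not a model of the participants' strategy; it assumes the first three points are not collinear):

```python
import numpy as np

def affine_equivalent(P, Q, tol=1e-6):
    """P, Q: 4x2 arrays of corresponding points.  Fit x' = [x y 1] @ T
    to the first three pairs, then test whether the fourth pair obeys
    the same affine map."""
    X = np.hstack([P[:3], np.ones((3, 1))])
    T = np.linalg.solve(X, Q[:3])            # 3x2 affine parameters
    predicted = np.append(P[3], 1.0) @ T
    return bool(np.linalg.norm(predicted - Q[3]) < tol)
```

    Applying any invertible 2x2 matrix plus translation to a pattern yields an affinely equivalent one; perturbing a single point breaks the equivalence.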

  4. Minimally Invasive Surgical Therapies for Atrial Fibrillation

    PubMed Central

    Nakamura, Yoshitsugu; Kiaii, Bob; Chu, Michael W. A.

    2012-01-01

    Atrial fibrillation is the most common sustained arrhythmia and is associated with significant risks of thromboembolism, stroke, congestive heart failure, and death. There have been major advances in the management of atrial fibrillation including pharmacologic therapies, antithrombotic therapies, and ablation techniques. Surgery for atrial fibrillation, including both concomitant and stand-alone interventions, is an effective therapy to restore sinus rhythm. Minimally invasive surgical ablation is an emerging field that aims for the superior results of the traditional Cox-Maze procedure through a less invasive operation with lower morbidity, quicker recovery, and improved patient satisfaction. These novel techniques utilize endoscopic or minithoracotomy approaches with various energy sources to achieve electrical isolation of the pulmonary veins in addition to other ablation lines. We review advancements in minimally invasive techniques for atrial fibrillation surgery, including management of the left atrial appendage. PMID:22666609

  5. Time minimizing transportation of calamity fallen timber

    NASA Astrophysics Data System (ADS)

    Kolman, P.; Střelec, L.

    2013-10-01

    In transportation problems, the most decisive optimization criterion is usually the minimization of transportation costs. However, there are situations where the material must be transported quickly, in particular when the transported material is perishable. If the material is transported in time, i.e. before it deteriorates, it can be used further and brings economic profit; otherwise it degrades, with all the ensuing consequences. In this paper, the transportation of calamity timber from collection points to processing sites is optimized, with emphasis on minimizing transportation time. The problem arose from the need to transport calamity fallen timber after hurricane Kyrill from collection points in the Bohemian Forest to processing sites. The known methods for transportation problems could not be applied here; therefore, the authors propose a method that leads to an optimal solution of the described problem.
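
    A standard formulation for minimizing time rather than cost is the bottleneck transportation problem: minimize the longest travel time over the routes actually used. The abstract does not describe the authors' method, so the sketch below is a generic solver: scan candidate time thresholds in increasing order and check feasibility with a max-flow computation.

```python
from collections import deque

def max_flow(cap, s, t):
    """Edmonds-Karp max flow on an adjacency-matrix residual network."""
    n, flow = len(cap), 0
    while True:
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and cap[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            return flow
        aug, v = float("inf"), t            # bottleneck along the path
        while v != s:
            aug = min(aug, cap[parent[v]][v])
            v = parent[v]
        v = t
        while v != s:                       # augment and update residuals
            cap[parent[v]][v] -= aug
            cap[v][parent[v]] += aug
            v = parent[v]
        flow += aug

def min_bottleneck_time(supply, demand, time):
    """Smallest T such that all supply can be shipped using only routes
    with time[i][j] <= T; feasibility at each T is a max-flow check."""
    total = sum(supply)
    assert total == sum(demand), "balanced problem expected"
    m, n = len(supply), len(demand)
    for T in sorted({t for row in time for t in row}):
        N = m + n + 2                       # source, supplies, demands, sink
        cap = [[0] * N for _ in range(N)]
        for i in range(m):
            cap[0][1 + i] = supply[i]
        for j in range(n):
            cap[1 + m + j][N - 1] = demand[j]
        for i in range(m):
            for j in range(n):
                if time[i][j] <= T:
                    cap[1 + i][1 + m + j] = total
        if max_flow(cap, 0, N - 1) == total:
            return T
    return None
```

    For instance, with supplies (10, 20), demands (15, 15) and times [[1, 3], [2, 4]], no threshold below 4 lets the second collection point ship its full 20 units, so the optimal bottleneck time is 4.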

  6. [CODESIGN METHODOLOGIES: AN ENABLING RESOURCE?].

    PubMed

    Oboeuf, Alexandre; Aiguier, Grégory; Loute, Alain

    2016-01-01

    To reflect on the learning of the caring relationship, seventeen people were mobilized to participate in a codesign day. This methodology aims to foster the creativity of a group through a succession of creativity exercises. This article is primarily intended to reflect on the conditions under which such a methodology can become a resource for thinking about the learning of ethics. The role of affectivity in the success of a codesign day is questioned. This work highlights, among other things, its central place in the construction of an innovative climate and in the mechanism of divergent thinking. The article aims to open new questions on the articulation of the exercises, affectivity, and the role of the facilitator or of the patient. The research perspectives invite interdisciplinary dialogue. PMID:27305797

  7. Methodological assessment of HCC literature

    PubMed Central

    Daniele, G.; Costa, N.; Lorusso, V.; Costa-Maia, J.; Pache, I.; Pirisi, M.

    2013-01-01

    Despite the fact that the hepatocellular carcinoma (HCC) represents a major health problem, very few interventions are available for this disease, and only sorafenib is approved for the treatment of advanced disease. Of note, only very few interventions have been thoroughly evaluated over time for HCC patients compared with several hundreds in other, equally highly lethal, tumours. Additionally, clinical trials in HCC have often been questioned for poor design and methodological issues. As a consequence, a gap between what is measured in clinical trials and what clinicians have to face in daily practice often occurs. As a result of this scenario, even the most recent guidelines for treatment of HCC patients use low strength evidence to make recommendations. In this review, we will discuss some of the potential methodological issues hindering a rational development of new treatments for HCC patients. PMID:23715943

  8. Methodological Challenges in Online Trials

    PubMed Central

    Khadjesari, Zarnie; White, Ian R; Kalaitzaki, Eleftheria; Godfrey, Christine; McCambridge, Jim; Thompson, Simon G; Wallace, Paul

    2009-01-01

    Health care and health care services are increasingly being delivered over the Internet. There is a strong argument that interventions delivered online should also be evaluated online to maximize the trial’s external validity. Conducting a trial online can help reduce research costs and improve some aspects of internal validity. To date, there are relatively few trials of health interventions that have been conducted entirely online. In this paper we describe the major methodological issues that arise in trials (recruitment, randomization, fidelity of the intervention, retention, and data quality), consider how the online context affects these issues, and use our experience of one online trial evaluating an intervention to help hazardous drinkers drink less (DownYourDrink) to illustrate potential solutions. Further work is needed to develop online trial methodology. PMID:19403465
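
    Of the issues listed, randomization is the one most directly automatable when the trial runs online. A minimal blocked-randomization sketch (illustrative only; the abstract does not describe the scheme DownYourDrink actually used):

```python
import random

def block_randomize(n_participants, block_size=4,
                    arms=("control", "intervention"), seed=0):
    """Blocked allocation: each block contains every arm equally often,
    so the arms stay balanced throughout recruitment."""
    assert block_size % len(arms) == 0
    rng = random.Random(seed)                # seeded for reproducibility
    allocation = []
    while len(allocation) < n_participants:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)                   # randomize order within block
        allocation.extend(block)
    return allocation[:n_participants]
```

    Balance within each complete block holds by construction, whatever the seed; in a real online trial the allocation would be generated server-side at the moment of enrolment.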

  9. Defect reduction through Lean methodology

    NASA Astrophysics Data System (ADS)

    Purdy, Kathleen; Kindt, Louis; Densmore, Jim; Benson, Craig; Zhou, Nancy; Leonard, John; Whiteside, Cynthia; Nolan, Robert; Shanks, David

    2010-09-01

    Lean manufacturing is a systematic method of identifying and eliminating waste. Use of Lean manufacturing techniques at the IBM photomask manufacturing facility has increased the efficiency and productivity of the photomask process. Tools such as value stream mapping, 5S, and structured problem solving are widely used today. In this paper we describe a step-by-step Lean technique used to systematically decrease defects, resulting in reduced material costs, inspection costs, and cycle time. The method used consists of an 8-step approach commonly referred to as the 8D problem solving process. This process allowed us to identify both prominent issues as well as more subtle problems requiring in-depth investigation. The methodology used is flexible and can be applied to numerous situations. Advantages of the Lean methodology are also discussed.

  10. Methodological challenges in online trials.

    PubMed

    Murray, Elizabeth; Khadjesari, Zarnie; White, Ian R; Kalaitzaki, Eleftheria; Godfrey, Christine; McCambridge, Jim; Thompson, Simon G; Wallace, Paul

    2009-01-01

    Health care and health care services are increasingly being delivered over the Internet. There is a strong argument that interventions delivered online should also be evaluated online to maximize the trial's external validity. Conducting a trial online can help reduce research costs and improve some aspects of internal validity. To date, there are relatively few trials of health interventions that have been conducted entirely online. In this paper we describe the major methodological issues that arise in trials (recruitment, randomization, fidelity of the intervention, retention, and data quality), consider how the online context affects these issues, and use our experience of one online trial evaluating an intervention to help hazardous drinkers drink less (DownYourDrink) to illustrate potential solutions. Further work is needed to develop online trial methodology. PMID:19403465

  11. Holographic dark energy from minimal supergravity

    NASA Astrophysics Data System (ADS)

    Landim, Ricardo C. G.

    2016-02-01

    We embed models of holographic dark energy (HDE) coupled to dark matter (DM) in minimal supergravity plus matter, with one chiral superfield. We analyze two cases. The first one has the Hubble radius as the infrared (IR) cutoff and the interaction between the two fluids is proportional to the energy density of the DE. The second case has the future event horizon as IR cutoff while the interaction is proportional to the energy density of both components of the dark sector.

  12. Seesaw Models with Minimal Flavor Violation

    NASA Astrophysics Data System (ADS)

    He, Xiao-Gang

    In this talk, I discuss the implementation of minimal flavor violation (MFV) in seesaw models, based on work that appeared in arXiv:1401.2615, arXiv:1404.4436 and arXiv:1411.6612. Phenomenological implications for flavor-changing interactions related to leptons are studied by considering some effective dimension-six operators. We also comment on how one of the new effective operators can induce flavor-changing dilepton decays of the Higgs boson.

  13. Minimal Basis for Gauge Theory Amplitudes

    SciTech Connect

    Bjerrum-Bohr, N. E. J.; Damgaard, Poul H.; Vanhove, Pierre

    2009-10-16

    Identities based on monodromy for integrations in string theory are used to derive relations between different color-ordered tree-level amplitudes in both bosonic and supersymmetric string theory. These relations imply that the color-ordered tree-level n-point gauge theory amplitudes can be expanded in a minimal basis of (n-3)! amplitudes. This result holds for any choice of polarizations of the external states and in any number of dimensions.

  14. Combined thoracoscopic and laparoscopic minimally invasive esophagectomy

    PubMed Central

    Zeng, Fuchun; Wang, Youyu; Xue, Yang; Cong, Wei

    2014-01-01

    With the improvement in thoracoscopic and laparoscopic surgery, thoracoscopic and laparoscopic esophagectomy (TLE), a minimally invasive approach, has attracted increasing attention as an alternative to open three-field esophagectomy. From June 2012 to October 2013, 90 patients underwent laparoscopic and thoracoscopic resection of esophageal carcinoma in our department. The VATS esophagectomy technique described here is the approach currently employed in the department of thoracic surgery at Sichuan Provincial People’s Hospital of China. PMID:24605230

  15. Realization of a minimal disturbance quantum measurement.

    PubMed

    Sciarrino, F; Ricci, M; De Martini, F; Filip, R; Mista, L

    2006-01-20

    We report the first experimental realization of an "optimal" quantum device able to perform a minimal disturbance measurement on polarization encoded qubits saturating the theoretical boundary established between the classical knowledge acquired of any input state, i.e., a "classical guess," and the fidelity of the same state after disturbance due to measurement. The device has been physically realized by means of a linear optical qubit manipulation, postselection measurement, and a classical feed-forward process. PMID:16486551

  16. Nonlinear transient analysis via energy minimization

    NASA Technical Reports Server (NTRS)

    Kamat, M. P.; Knight, N. F., Jr.

    1978-01-01

    The formulation basis for nonlinear transient analysis of finite element models of structures using energy minimization is provided. Geometric and material nonlinearities are included. The development is restricted to simple one- and two-dimensional finite elements, which are regarded as the basic elements for modeling full aircraft-like structures under crash conditions. The results indicate the effectiveness of the technique as a viable tool for this purpose.
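
    The core idea, finding the structural response by minimizing total potential energy rather than integrating the equations of motion directly, can be illustrated on a single hardening spring (a toy static example, not the paper's finite-element formulation):

```python
def minimize_energy(grad, x0, lr=0.01, tol=1e-10, max_iter=100_000):
    """Plain gradient descent on a scalar energy functional; a stand-in
    for the more elaborate minimization solvers such analyses use."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        x = x - lr * g
        if abs(g) < tol:
            break
    return x

# hardening spring under load P: energy U(x) = k*x^2/2 + a*x^4/4 - P*x
k, a, P = 2.0, 1.0, 3.0
grad_U = lambda x: k * x + a * x ** 3 - P    # dU/dx; zero at equilibrium
x_eq = minimize_energy(grad_U, 0.0)
```

    The stationary point of the energy is the equilibrium: here 2x + x^3 = 3 gives x = 1, which the descent recovers because the energy is convex.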

  17. Minimally invasive aesthetic procedures in young adults

    PubMed Central

    Wollina, Uwe; Goldman, Alberto

    2011-01-01

    Age is a significant factor in modifying specific needs when it comes to medical aesthetic procedures. In this review we will focus on young adults in their third decade of life and review minimally invasive aesthetic procedures other than cosmetics and cosmeceuticals. Correction of asymmetries, correction after body modifying procedures, and facial sculpturing are important issues for young adults. The implication of aesthetic medicine as part of preventive medicine is a major ethical challenge that differentiates aesthetic medicine from fashion. PMID:21673871

  18. ISE System Development Methodology Manual

    SciTech Connect

    Hayhoe, G.F.

    1992-02-17

    The Information Systems Engineering (ISE) System Development Methodology Manual (SDM) is a framework of life cycle management guidelines that provide ISE personnel with direction, organization, consistency, and improved communication when developing and maintaining systems. These guidelines were designed to allow ISE to build and deliver Total Quality products, and to meet the goals and requirements of the US Department of Energy (DOE), Westinghouse Savannah River Company, and Westinghouse Electric Corporation.

  19. Predictive methodology for supply disruptions

    SciTech Connect

    Beller, M.; D'Acierno, J.

    1982-04-01

    Energy supply disruptions do not suddenly arise in a full-blown fashion. Lags in the energy system provide a time horizon which allows for the prediction of a possible supply problem. A simple model is described which can be used to provide a set of indicators for the possible onset of an energy emergency. The methodology was tested on the gasoline shortage of 1979, and the results are presented.
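
    The abstract does not give the model's indicator set, but the idea of exploiting lags to flag an emerging shortage can be sketched with a toy leading indicator: flag any period where supply falls more than a chosen fraction below its trailing-window average (window and threshold below are hypothetical):

```python
def disruption_indicator(series, window=4, threshold=0.1):
    """Flag periods where supply drops more than `threshold` below its
    trailing-window average; a toy indicator in the spirit of the
    described model, not its actual formulation."""
    flags = []
    for i in range(window, len(series)):
        baseline = sum(series[i - window:i]) / window
        flags.append(series[i] < (1 - threshold) * baseline)
    return flags
```

    On a series that is flat and then drops 15%, only the drop period is flagged.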

  20. Waste minimization in an autobody repair shop

    SciTech Connect

    Baria, D.N.; Dorland, D.; Bergeron, J.T.

    1994-12-31

    This work was done to document the waste minimization incorporated in a new autobody repair facility in Hermantown, Minnesota. Humes Collision Center incorporated new waste reduction techniques when it expanded its old facilities in 1992, and it was able to achieve the benefits of cost reduction and waste reduction. Humes Collision Center repairs an average of 500 cars annually and is a very small quantity generator (VSQG) of hazardous waste, as defined by the Minnesota Pollution Control Agency (MPCA). The hazardous waste consists of antifreeze, batteries, paint sludge, refrigerants, and used oil, while the nonhazardous waste consists of cardboard, glass, paint filters, plastic, sanding dust, scrap metal, and wastewater. The hazardous and nonhazardous waste output was decreased by 72%. In addition, there was a 63% reduction in the operating costs. The waste minimization includes antifreeze recovery and recycling, reduction in unused waste paint, reduction, recovery and recycling of waste lacquer thinner for cleaning spray guns and paint cups, elimination of used plastic car bags, recovery and recycling of refrigerant, reduction in waste sandpaper and elimination of sanding dust, and elimination of waste paint filters. The rate of return on the investment in waste minimization equipment is estimated to range from 37% per year for the distillation unit, 80% for vacuum sanding, 146% for computerized paint mixing, and 211% for the refrigerant recycler, to 588% per year for the gun washer. The corresponding payback time varies from 3 years to 2 months.
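
    The quoted payback times follow from simple-payback arithmetic: payback is the investment divided by the annual savings, i.e. the reciprocal of the annual rate of return.

```python
def simple_payback_months(annual_return_rate):
    """Months to recoup an investment at a given annual rate of return
    (simple payback, ignoring discounting)."""
    return 12.0 / annual_return_rate

slowest = simple_payback_months(0.37)   # distillation unit, ~32 months
fastest = simple_payback_months(5.88)   # gun washer, ~2 months
```

    A 37%-per-year return pays back in roughly 2.7 years (about the "3 years" quoted), and 588% per year in roughly 2 months, matching the abstract's range.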

  1. Minimally invasive local therapies for liver cancer

    PubMed Central

    Li, David; Kang, Josephine; Golas, Benjamin J.; Yeung, Vincent W.; Madoff, David C.

    2014-01-01

    Primary and metastatic liver tumors are an increasing global health problem, with hepatocellular carcinoma (HCC) now being the third leading cause of cancer-related mortality worldwide. Systemic treatment options for HCC remain limited, with Sorafenib as the only prospectively validated agent shown to increase overall survival. Surgical resection and/or transplantation, locally ablative therapies and regional or locoregional therapies have filled the gap in liver tumor treatments, providing improved survival outcomes for both primary and metastatic tumors. Minimally invasive local therapies have an increasing role in the treatment of both primary and metastatic liver tumors. For patients with low volume disease, these therapies have now been established into consensus practice guidelines. This review highlights technical aspects and outcomes of commonly utilized, minimally invasive local therapies including laparoscopic liver resection (LLR), radiofrequency ablation (RFA), microwave ablation (MWA), high-intensity focused ultrasound (HIFU), irreversible electroporation (IRE), and stereotactic body radiation therapy (SBRT). In addition, the role of combination treatment strategies utilizing these minimally invasive techniques is reviewed. PMID:25610708

  2. Esophageal surgery in minimally invasive era.

    PubMed

    Bencini, Lapo; Moraldi, Luca; Bartolini, Ilenia; Coratti, Andrea

    2016-01-27

    The widespread popularity of new surgical technologies such as laparoscopy, thoracoscopy and robotics has led many surgeons to treat esophageal diseases with these methods. The expected benefits of minimally invasive surgery (MIS) mainly include reductions of postoperative complications, length of hospital stay, and pain and better cosmetic results. All of these benefits could potentially be of great interest when dealing with the esophagus due to the potentially severe complications that can occur after conventional surgery. Moreover, robotic platforms are expected to reduce many of the difficulties encountered during advanced laparoscopic and thoracoscopic procedures such as anastomotic reconstructions, accurate lymphadenectomies, and vascular sutures. Almost all esophageal diseases are approachable in a minimally invasive way, including diverticula, gastro-esophageal reflux disease, achalasia, perforations and cancer. Nevertheless, while the limits of MIS for benign esophageal diseases are mainly technical issues and costs, oncologic outcomes remain the cornerstone of any procedure to cure malignancies, for which the long-term results are critical. Furthermore, many of the minimally invasive esophageal operations should be compared to pharmacologic interventions and advanced pure endoscopic procedures; such a comparison requires a difficult literature analysis and leads to some confounding results of clinical trials. This review aims to examine the evidence for the use of MIS in both malignancies and more common benign disease of the esophagus, with a particular emphasis on future developments and ongoing areas of research. PMID:26843913

  3. Minimally invasive local therapies for liver cancer.

    PubMed

    Li, David; Kang, Josephine; Golas, Benjamin J; Yeung, Vincent W; Madoff, David C

    2014-12-01

    Primary and metastatic liver tumors are an increasing global health problem, with hepatocellular carcinoma (HCC) now being the third leading cause of cancer-related mortality worldwide. Systemic treatment options for HCC remain limited, with Sorafenib as the only prospectively validated agent shown to increase overall survival. Surgical resection and/or transplantation, locally ablative therapies and regional or locoregional therapies have filled the gap in liver tumor treatments, providing improved survival outcomes for both primary and metastatic tumors. Minimally invasive local therapies have an increasing role in the treatment of both primary and metastatic liver tumors. For patients with low volume disease, these therapies have now been established into consensus practice guidelines. This review highlights technical aspects and outcomes of commonly utilized, minimally invasive local therapies including laparoscopic liver resection (LLR), radiofrequency ablation (RFA), microwave ablation (MWA), high-intensity focused ultrasound (HIFU), irreversible electroporation (IRE), and stereotactic body radiation therapy (SBRT). In addition, the role of combination treatment strategies utilizing these minimally invasive techniques is reviewed. PMID:25610708

  4. Esophageal surgery in minimally invasive era

    PubMed Central

    Bencini, Lapo; Moraldi, Luca; Bartolini, Ilenia; Coratti, Andrea

    2016-01-01

    The widespread popularity of new surgical technologies such as laparoscopy, thoracoscopy and robotics has led many surgeons to treat esophageal diseases with these methods. The expected benefits of minimally invasive surgery (MIS) mainly include reductions of postoperative complications, length of hospital stay, and pain and better cosmetic results. All of these benefits could potentially be of great interest when dealing with the esophagus due to the potentially severe complications that can occur after conventional surgery. Moreover, robotic platforms are expected to reduce many of the difficulties encountered during advanced laparoscopic and thoracoscopic procedures such as anastomotic reconstructions, accurate lymphadenectomies, and vascular sutures. Almost all esophageal diseases are approachable in a minimally invasive way, including diverticula, gastro-esophageal reflux disease, achalasia, perforations and cancer. Nevertheless, while the limits of MIS for benign esophageal diseases are mainly technical issues and costs, oncologic outcomes remain the cornerstone of any procedure to cure malignancies, for which the long-term results are critical. Furthermore, many of the minimally invasive esophageal operations should be compared to pharmacologic interventions and advanced pure endoscopic procedures; such a comparison requires a difficult literature analysis and leads to some confounding results of clinical trials. This review aims to examine the evidence for the use of MIS in both malignancies and more common benign disease of the esophagus, with a particular emphasis on future developments and ongoing areas of research. PMID:26843913

  5. Minimally invasive thyroidectomy: a ten years experience

    PubMed Central

    Viani, Lorenzo; Montana, Chiara Montana; Cozzani, Federico; Sianesi, Mario

    2016-01-01

    Background Conventional thyroidectomy is the most frequent surgical procedure for surgical thyroid disease. Minimally invasive approaches to thyroid surgery were introduced several years ago. These new procedures improved postoperative pain, cosmetic results, patients' quality of life, and postoperative morbidity. Minimally invasive video-assisted thyroidectomy (MIVAT) is a minimally invasive procedure that uses a mini-cervicotomy to treat thyroid diseases. Methods We present our experience with 497 consecutively treated patients using the MIVAT technique. We analyzed mean age, sex, mean operative time, rate of bleeding, hypocalcemia, transitory and definitive nerve palsy (6 months after the procedure), postoperative pain on a scale from 0 to 10 at 1 hour and 24 hours after surgery, and mean hospital stay. Results The indications to treat were related to preoperative diagnosis: 182 THYR 6, 184 THYR 3–4, 27 Plummer disease, 24 Basedow disease, 28 toxic goiter, 52 goiter. In 497 cases we report 1 case of bleeding (0.2%), 12 cases (2.4%) of transitory nerve palsy and 4 (0.8%) of definitive nerve palsy. The rate of serologic hypocalcemia was 24.9% (124 cases) and of clinical hypocalcemia 7.2% (36 cases); there was 1 case of hypoparathyroidism (0.2%). Conclusions MIVAT is a safe approach to surgical thyroid disease; costs and adverse events are similar to those of conventional thyroidectomy. The mini-cervicotomy permits a genuinely minimally invasive tissue dissection. PMID:27294036

  6. [Minimally Invasive Treatment of Esophageal Benign Diseases].

    PubMed

    Inoue, Haruhiro

    2016-07-01

    As a minimally invasive treatment of esophageal achalasia, per-oral endoscopic myotomy (POEM) was developed in 2008. More than 1,100 cases of achalasia-related diseases have received POEM. The success rate of the procedure was more than 95% (Eckardt score improvement of 3 points or more). No serious complication (Clavien-Dindo classification IIIb or higher) was experienced. These results suggest that POEM has become a standard minimally invasive treatment for achalasia-related diseases. As an offshoot of POEM, submucosal tumor removal through a submucosal tunnel (per-oral endoscopic tumor resection: POET) was developed and safely performed. The best indication for POET is esophageal leiomyoma of less than 5 cm. A novel endoscopic treatment of gastroesophageal reflux disease (GERD) was also developed. Anti-reflux mucosectomy (ARMS) is a nearly circumferential mucosal reduction of the gastric cardia mucosa. ARMS has been performed in 56 consecutive cases of refractory GERD, with no major complications and excellent clinical results. The best indication for ARMS is refractory GERD without a long sliding hernia. The longest follow-up exceeds 10 years. Minimally invasive treatments for benign esophageal diseases are now performed by therapeutic endoscopy. PMID:27440038

  7. Expert System Development Methodology (ESDM)

    NASA Technical Reports Server (NTRS)

    Sary, Charisse; Gilstrap, Lewey; Hull, Larry G.

    1990-01-01

    The Expert System Development Methodology (ESDM) provides an approach to developing expert system software. Because of the uncertainty associated with this process, an element of risk is involved. ESDM is designed to address the issue of risk and to acquire the information needed for this purpose in an evolutionary manner. ESDM presents a life cycle in which a prototype evolves through five stages of development. Each stage consists of five steps, leading to a prototype for that stage. Development may proceed to a conventional development methodology (CDM) at any time if enough has been learned about the problem to write requirements. ESDM produces requirements so that a product may be built with a CDM. ESDM is considered preliminary because it has not yet been applied to actual projects. It has been retrospectively evaluated by comparing the methods used in two ongoing expert system development projects that did not explicitly choose to use this methodology but which provided useful insights into actual expert system development practices and problems.

  8. Energy Efficiency Indicators Methodology Booklet

    SciTech Connect

    Sathaye, Jayant; Price, Lynn; McNeil, Michael; de la rue du Can, Stephane

    2010-05-01

    This Methodology Booklet provides a comprehensive review and methodology guiding principles for constructing energy efficiency indicators, with illustrative examples of application to individual countries. It reviews work done by international agencies and national governments in constructing meaningful energy efficiency indicators that help policy makers to assess changes in energy efficiency over time. Building on past OECD experience and best practices, and the knowledge of these countries' institutions, relevant sources of information to construct an energy indicator database are identified. A framework based on levels of hierarchy of indicators, spanning from aggregate, macro-level to disaggregated end-use-level metrics, is presented to help shape the understanding of assessing energy efficiency. In each sector of activity (industry, commercial, residential, agriculture and transport), indicators are presented and recommendations to distinguish the different factors affecting energy use are highlighted. The methodology booklet specifically addresses issues that are relevant to developing indicators where activity is a major factor driving energy demand. A companion spreadsheet tool is available upon request.
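
    The point of distinguishing activity from intensity can be shown with a simple Laspeyres-style decomposition (an illustration of the indicator hierarchy, not the booklet's exact formulas; all numbers hypothetical). Writing energy = activity x intensity, a change in energy use splits into an activity effect and an intensity (efficiency) effect:

```python
def activity_intensity_effects(a0, e0, a1, e1):
    """Split the change in energy use e1 - e0 into an activity effect
    (more/less activity at base-year intensity) and an intensity effect
    (efficiency change at final-year activity)."""
    i0, i1 = e0 / a0, e1 / a1                # intensities = energy/activity
    activity_effect = (a1 - a0) * i0
    intensity_effect = a1 * (i1 - i0)
    return activity_effect, intensity_effect
```

    With activity rising from 100 to 120 while energy rises only from 50 to 54, the activity effect (+10) is partly offset by an efficiency gain (-6), and the two effects sum exactly to the observed change (+4).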

  9. Software engineering methodologies and tools

    NASA Technical Reports Server (NTRS)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, run over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  10. Minimal entropy probability paths between genome families.

    PubMed

    Ahlbrandt, Calvin; Benson, Gary; Casey, William

    2004-05-01

    We develop a metric for probability distributions with applications to biological sequence analysis. Our distance metric is obtained by minimizing a functional defined on the class of paths over probability measures on N categories. The underlying mathematical theory is connected to a constrained problem in the calculus of variations. The solution presented is a numerical one, which approximates the true solution in the set of cases called rich paths, where none of the components of the path is zero. The functional to be minimized is motivated by entropy considerations, reflecting the idea that nature might efficiently carry out mutations of genome sequences in such a way that the increase in entropy involved in the transformation is as small as possible. We characterize sequences by frequency profiles or probability vectors; in the case of DNA, N is 4 and the components of the probability vector are the frequencies of occurrence of the bases A, C, G and T. Given two probability vectors a and b, we define a distance function as the infimum of path integrals of the entropy function H(p) over all admissible paths p(t), 0 <= t <= 1, with p(t) a probability vector such that p(0)=a and p(1)=b. If the probability paths p(t) are parameterized as y(s) in terms of arc length s and the optimal path is smooth with arc length L, then smooth and "rich" optimal probability paths may be numerically estimated by a hybrid method: iterating Newton's method on solutions of a two-point boundary value problem, with unknown distance L between the abscissas, for the Euler-Lagrange equations resulting from a multiplier rule for the constrained optimization problem, together with linear regression to improve the arc-length estimate L. Matlab code for these numerical methods is provided, which works only for "rich" optimal probability vectors. These methods motivate a definition of an elementary distance function which is easier and faster to calculate, works on non
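
    The path functional itself is easy to evaluate on any candidate path; only the minimization over paths is hard. The sketch below computes the entropy path integral along the straight segment between two probability vectors by the trapezoidal rule, which gives an upper bound on the distance defined in the abstract (the true infimum requires solving the Euler-Lagrange boundary-value problem):

```python
import math

def entropy(p, eps=1e-12):
    """Shannon entropy H(p) in nats, clipped away from log(0)."""
    return -sum(pi * math.log(max(pi, eps)) for pi in p)

def straight_line_cost(a, b, steps=1000):
    """Trapezoidal approximation of the path integral of H(p) along
    the straight segment from a to b; an upper bound on the
    minimal-entropy path distance, not the infimum itself."""
    step = math.dist(a, b) / steps           # constant-speed arc length
    H = [entropy([(1 - t) * ai + t * bi for ai, bi in zip(a, b)])
         for t in (k / steps for k in range(steps + 1))]
    return step * (H[0] / 2 + sum(H[1:-1]) + H[-1] / 2)
```

    Because H <= log N on N categories, the integral along any path is bounded by log(N) times the path length, which gives a quick sanity check on the approximation.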

  11. Prioritization methodology for chemical replacement

    NASA Technical Reports Server (NTRS)

    Cruit, Wendy; Goldberg, Ben; Schutzenhofer, Scott

    1995-01-01

    Since United States federal legislation has required ozone-depleting chemicals (Class 1 and 2) to be banned from production, the National Aeronautics and Space Administration (NASA) and industry have been required to find other chemicals and methods to replace these target chemicals. This project was initiated to develop a prioritization methodology suitable for assessing and ranking existing processes for replacement 'urgency.' The methodology was produced in the form of a workbook (NASA Technical Paper 3421). The final workbook contains two tools, one for evaluation and one for prioritization. The two tools are interconnected in that they were developed from one central theme: chemical replacement due to imposed laws and regulations. This workbook provides matrices, detailed explanations of how to use them, and a detailed methodology for prioritization of replacement technology. The main objective is to provide a GUIDELINE to help direct the research for replacement technology. The approach to prioritization called for a system which would result in a numerical rating for the chemicals and processes being assessed. A Quality Function Deployment (QFD) technique was used to determine numerical values corresponding to the concerns raised and their respective importance to the process. This workbook defines the approach and the application of the QFD matrix. This technique: (1) provides a standard database for technology that can be easily reviewed, and (2) provides a standard format for information when requesting resources for further research on chemical replacement technology. Originally, this workbook was to be used for Class 1 and Class 2 chemicals, but it was specifically designed to be flexible enough to be used for any chemical used in a process (if the chemical and/or process needs to be replaced).
The methodology consists of comparison matrices (and the smaller comparison components) which allow replacement technology
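The QFD-style numerical rating the workbook describes can be sketched as a weighted sum of concern ratings per process; the concern names, weights, scales, and process names below are invented for illustration and are not taken from NASA Technical Paper 3421.

```python
def priority_score(ratings, weights):
    """QFD-style weighted score: each concern has an importance weight,
    and each chemical/process is rated against every concern.
    Higher scores indicate greater replacement urgency (illustrative scale)."""
    assert len(ratings) == len(weights)
    return sum(r * w for r, w in zip(ratings, weights))

# Toy example: three concerns -- regulatory urgency, worker safety, cost
weights = [9, 7, 3]
processes = {
    "degreasing with CFC-113":     [9, 8, 4],
    "foam blowing with HCFC-141b": [7, 5, 6],
}
ranked = sorted(processes,
                key=lambda p: priority_score(processes[p], weights),
                reverse=True)
print(ranked)
```

Ranking by the resulting score is what makes the matrix usable as a prioritization tool: the same weights are applied to every process, so scores are directly comparable.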

  12. A versatile technique to minimize electrical losses in distribution feeders

    SciTech Connect

    Kyaruzi, A.L.

    1994-12-31

    This dissertation presents a method of minimizing electrical losses in radial distribution feeders by the use of shunt capacitors. The engineering benefits of reducing peak electrical power and energy losses are compared to the costs associated with the current engineering practice of buying, installing and servicing capacitor banks in the distribution feeders. The present analysis defines this cost-benefit problem and formulates it for nonuniform feeders with different wire gauges at various feeder sections. Standard utility capacitor bank sizes are used to give a more realistic model. An original computer solution methodology based on techniques developed for this study determines: (i) whether it is economical to install compensating capacitor banks on a particular radial distribution feeder; (ii) the locations at which capacitor banks should be installed; (iii) the types and sizes of capacitor banks to be installed; (iv) the time setting of switched capacitor banks. The techniques have been applied to a typical radial distribution feeder in Dar-es-Salaam, Tanzania. The results and the engineering implications of this work are discussed, and recommendations are made for the engineering community.

  13. Minimizing inter-microscope variability in dental microwear texture analysis

    NASA Astrophysics Data System (ADS)

    Arman, Samuel D.; Ungar, Peter S.; Brown, Christopher A.; DeSantis, Larisa R. G.; Schmidt, Christopher; Prideaux, Gavin J.

    2016-06-01

    A common approach to dental microwear texture analysis (DMTA) uses confocal profilometry in concert with scale-sensitive fractal analysis to help understand the diets of extinct mammals. One of the main benefits of DMTA over other methods is the repeatable, objective manner of data collection. This repeatability, however, is threatened by variation in the results of DMTA of the same dental surfaces yielded by different microscopes. Here we compare DMTA data of five species of kangaroos measured on seven profilers of varying specifications. Comparison between microscopes confirms that inter-microscope differences are present, but we show that deployment of a number of automated treatments to remove measurement noise can help minimize inter-microscope differences. Applying these same treatments to a published hominin DMTA dataset shows that they alter some significant differences between dietary groups. Minimizing microscope variability while maintaining interspecific dietary differences therefore requires that these factors be balanced in determining appropriate treatments. The process outlined here offers a solution for allowing comparison of data between microscopes, which is essential for ongoing DMTA research. In addition, the process undertaken, including consideration of other elements of DMTA protocols, also promises to streamline methodology, remove measurement noise and, in doing so, optimize recovery of a reliable dietary signature.

  14. Minimal selective concentrations of tetracycline in complex aquatic bacterial biofilms.

    PubMed

    Lundström, Sara V; Östman, Marcus; Bengtsson-Palme, Johan; Rutgersson, Carolin; Thoudal, Malin; Sircar, Triranta; Blanck, Hans; Eriksson, K Martin; Tysklind, Mats; Flach, Carl-Fredrik; Larsson, D G Joakim

    2016-05-15

    Selection pressure generated by antibiotics released into the environment could enrich for antibiotic resistance genes and antibiotic resistant bacteria, thereby increasing the risk for transmission to humans and animals. Tetracyclines comprise an antibiotic class of great importance to both human and animal health. Accordingly, residues of tetracycline are commonly detected in aquatic environments. To assess if tetracycline pollution in aquatic environments promotes development of resistance, we determined minimal selective concentrations (MSCs) in biofilms of complex aquatic bacterial communities using both phenotypic and genotypic assays. Tetracycline significantly increased the relative abundance of resistant bacteria at 10 μg/L, while specific tet genes (tetA and tetG) increased significantly at the lowest concentration tested (1 μg/L). Taxonomic composition of the biofilm communities was altered with increasing tetracycline concentrations. Metagenomic analysis revealed a concurrent increase of several tet genes and a range of other genes providing resistance to different classes of antibiotics (e.g. cmlA, floR, sul1, and mphA), indicating potential for co-selection. Consequently, MSCs for the tet genes of ≤ 1 μg/L suggest that current exposure levels in, e.g., sewage treatment plants could be sufficient to promote resistance. The methodology used here to assess MSCs could be applied in risk assessment of other antibiotics as well. PMID:26938321

  15. Iterative minimization algorithm for efficient calculations of transition states

    NASA Astrophysics Data System (ADS)

    Gao, Weiguo; Leng, Jing; Zhou, Xiang

    2016-03-01

    This paper presents an efficient algorithmic implementation of the iterative minimization formulation (IMF) for fast local search of transition states on a potential energy surface. The IMF is a second-order iterative scheme providing a general and rigorous description of the eigenvector-following (min-mode following) methodology. We offer a unified numerical interpretation via the IMF for existing eigenvector-following methods, such as the gentlest ascent dynamics, the dimer method and many other variants. We then propose our new algorithm based on the IMF. The main feature of our algorithm is that the translation step is replaced by solving an optimization subproblem associated with an auxiliary objective function which is constructed from the min-mode information. We show that by using an efficient scheme for the inexact solver and enforcing an adaptive stopping criterion for this subproblem, the overall computational cost is effectively reduced and a super-linear rate between the accuracy and the computational cost can be achieved. A series of numerical tests demonstrates the significant improvement in computational efficiency for the new algorithm.

  16. Calculating averted caries attributable to school-based sealant programs with a minimal data set

    PubMed Central

    Griffin, Susan O.; Jones, Kari; Crespin, Matthew

    2016-01-01

    Objectives We describe a methodology for school-based sealant programs (SBSP) to estimate averted cavities (i.e., the difference in cavities without and with SBSP) over 9 years using a minimal data set. Methods A Markov model was used to estimate averted cavities. SBSP would input estimates of their annual attack rate (AR) and 1-year retention rate. The model estimated retention 2+ years after placement with a functional form obtained from the literature. Assuming a constant AR, SBSP can estimate their AR with child-level data collected prior to sealant placement on sealant presence, number of decayed/filled first molars, and age. We demonstrate the methodology with data from the Wisconsin SBSP. Finally, we examine how sensitive the averted-cavities estimate obtained with this methodology is if an SBSP were to over- or underestimate their AR or 1-year retention. Results Demonstrating the methodology with the estimated AR (= 7 percent) and 1-year retention (= 92 percent) from the Wisconsin SBSP data, we found that placing 31,324 sealants averted 10,718 cavities. Sensitivity analysis indicated that for any AR, the magnitude of the error (percent) in estimating averted cavities was always less than the magnitude of the error in specifying the AR and equal to the error in specifying the 1-year retention rate. We also found that estimates of averted cavities were more robust to misspecifications of AR for higher- versus lower-risk children. Conclusions With Excel (Microsoft Corporation, Redmond, WA, USA) spreadsheets available upon request, SBSP can use this methodology to generate reasonable estimates of their impact with a minimal data set. PMID:24423023
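A rough sketch of the kind of calculation involved (not the paper's actual Markov model): if each retained sealant is assumed to avert the annual attack rate's worth of decay each year, averted cavities accumulate as sealants × retention(t) × AR over the 9 years. The exponential retention-decay curve and the annual_loss rate below are stand-in assumptions, not the functional form from the literature that the paper uses.

```python
def averted_cavities(sealants, attack_rate, retention_year1,
                     years=9, annual_loss=0.05):
    """Illustrative averted-cavities estimate: a retained sealant is
    assumed to avert the annual attack rate's worth of new decay on
    that tooth each year.  Retention in year t is modeled here as
    retention_year1 * (1 - annual_loss)**(t - 1); this decay curve and
    annual_loss are hypothetical stand-ins for the paper's fitted form."""
    total = 0.0
    for t in range(1, years + 1):
        retained = retention_year1 * (1 - annual_loss) ** (t - 1)
        total += sealants * retained * attack_rate
    return total

# Inputs from the abstract: 31,324 sealants, AR = 7%, 1-year retention = 92%
print(round(averted_cavities(31324, 0.07, 0.92)))
```

The structure makes the sensitivity results in the abstract intuitive: the estimate is linear in the 1-year retention rate (so its error passes through one-for-one), while errors in AR partly wash out of the relative comparison.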

  17. Optimal pulsed pumping schedule using calculus of variation methodology

    SciTech Connect

    Johannes, T.W.

    1999-03-01

    The application of a variational optimization technique has demonstrated the potential strength of pulsed pumping operations for use at existing pump-and-treat aquifer remediation sites. The optimized pulsed pumping technique has exhibited notable improvements in operational effectiveness over continuous pumping, as well as an advantage over uniform time intervals for pumping and resting cycles. The most important finding supports the potential for managing and improving pumping operations in the absence of complete knowledge of plume characteristics. An objective functional was selected to minimize the mass of water removed and the nonessential mass of contaminant removed. General forms of an essential concentration function were analyzed to determine the appropriate form required for compliance with management preferences. Third-order essential concentration functions provided optimal solutions for the objective functional, and using this form of the essential concentration function in the methodology provided optimal solutions for switching times. The methodology was applied to a hypothetical, two-dimensional aquifer influenced by specified and no-flow boundaries, injection wells and extraction wells. Flow simulations used MODFLOW, transport simulations used MT3D, and the graphical interface for obtaining concentration time-series data and flow/transport links was provided by GMS version 2.1.

  18. Minimally invasive radio-guided parathyroidectomy.

    PubMed

    Rubello, Domenico; Giannini, Sandro; Martini, Chiara; Piotto, Andrea; Rampin, Lucia; Fanti, Stefano; Armigliato, Michela; Nardi, Alfredo; Carpi, Angelo; Mariani, Giuliano; Gross, Milton D; Pelizzo, Maria Rosa

    2006-04-01

    We report here data on minimally invasive radio-guided parathyroidectomy (MIRP) in a large group of 253 patients, enrolled from a whole series of 355 consecutive patients affected by primary hyperparathyroidism (P-HPT) referred to our center. On the basis of preoperative imaging, including Sestamibi scintigraphy and neck ultrasound (US), 263 patients (74% of the whole series) with evidence of a solitary parathyroid adenoma (PA) and a normal thyroid gland were selected for MIRP, and in 253 (96%) of them this minimally invasive neck exploration was successfully performed. The MIRP protocol developed in our center consists of a very low 1 mCi Sestamibi injection in the operating room a few minutes before the start of the intervention, thus minimizing the radiation exposure of the patient and personnel. No major intraoperative complication was recorded in patients treated by MIRP, and only transient hypocalcemia occurred in 8.5% of cases. The mean duration of MIRP was 35 min and the mean hospital stay 1.2 days. Local anesthesia was used in 62 patients, 54 of whom were elderly patients with concomitant invalidating diseases contraindicating general anesthesia. No HPT relapse was observed during subsequent follow-up. The gamma probe was also used during bilateral neck exploration in the group of 92 patients excluded from MIRP. The most frequent cause of exclusion from MIRP in our series was the presence of concomitant Sestamibi-avid thyroid nodules (68.5% of cases), which can give false-positive results at radio-guided surgery. In conclusion, MIRP is an effective treatment in patients with a high likelihood of a solitary PA and a normal thyroid gland at scintigraphy and US, so accurate preoperative localizing imaging is required for MIRP. A low 1 mCi Sestamibi dose appears sufficient to perform MIRP. Patients with concomitant Sestamibi-avid thyroid nodules should be excluded from MIRP. PMID:16524690

  19. Efficient Energy Minimization for Enforcing Label Statistics.

    PubMed

    Lim, Yongsub; Jung, Kyomin; Kohli, Pushmeet

    2014-09-01

    Energy minimization algorithms, such as graph cuts, enable the computation of the MAP solution under certain probabilistic models such as Markov random fields. However, for many computer vision problems, the MAP solution under the model is not the ground truth solution. In many problem scenarios, the system has access to certain statistics of the ground truth. For instance, in image segmentation, the area and boundary length of the object may be known. In these cases, we want to estimate the most probable solution that is consistent with such statistics, i.e., satisfies certain equality or inequality constraints. The above constrained energy minimization problem is NP-hard in general, and is usually solved using Linear Programming formulations, which relax the integrality constraints. This paper proposes a novel method that directly finds the discrete approximate solution of such problems by maximizing the corresponding Lagrangian dual. This method can be applied to any constrained energy minimization problem whose unconstrained version is polynomial-time solvable, and can handle multiple constraints, whether equality or inequality, linear or non-linear. One important advantage of our method is the ability to handle second-order constraints with two-sided inequalities under a weak restriction, which is not trivial for the relaxation-based methods; we show that the restriction does not affect the accuracy in our cases. We demonstrate the efficacy of our method on the foreground/background image segmentation problem, and show that it produces impressive segmentation results with less error, and runs more than 20 times faster than the state-of-the-art LP-relaxation-based approaches. PMID:26352240
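The dual-maximization idea can be illustrated on a toy instance where the unconstrained problem is trivially solvable: binary labels with unary costs only, under an exact area (count) constraint. This is only a sketch of the principle, not the paper's full algorithm (which handles pairwise energies and second-order constraints); all data here are synthetic.

```python
import numpy as np

def primal_min(u, A):
    """Exact optimum of: min_x sum_i u_i*x_i  s.t.  sum_i x_i = A,
    x in {0,1}^n -- simply take the A smallest unary costs."""
    return float(np.sort(u)[:A].sum())

def dual(u, A, lam):
    """Lagrangian dual g(lam) = min_x sum_i (u_i + lam)*x_i - lam*A.
    The inner (unconstrained) problem decomposes per variable:
    x_i = 1 exactly when u_i + lam < 0, so
    g(lam) = sum_i min(0, u_i + lam) - lam*A.
    By weak duality, g(lam) lower-bounds the primal optimum."""
    return float(np.minimum(0.0, u + lam).sum() - lam * A)

rng = np.random.default_rng(0)
u = rng.normal(size=20)   # toy unary costs (think per-pixel label preferences)
A = 7                     # area constraint: exactly 7 variables set to 1

# g is concave and piecewise linear with breakpoints at lam = -u_i,
# so the dual maximum is attained at one of those candidate multipliers.
best = max(dual(u, A, lam) for lam in -u)
print(best, primal_min(u, A))
```

For this unary-only instance the maximized dual coincides with the primal optimum (no duality gap), which is what makes recovering the discrete solution from the dual possible.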

  20. Minimal formulation of joint motion for biomechanisms

    PubMed Central

    Seth, Ajay; Sherman, Michael; Eastman, Peter; Delp, Scott

    2010-01-01

    Biomechanical systems share many properties with mechanically engineered systems, and researchers have successfully employed mechanical engineering simulation software to investigate the mechanical behavior of diverse biological mechanisms, ranging from biomolecules to human joints. Unlike their man-made counterparts, however, biomechanisms rarely exhibit the simple, uncoupled, pure-axial motion that is engineered into mechanical joints such as sliders, pins, and ball-and-socket joints. Current mechanical modeling software based on internal-coordinate multibody dynamics can formulate engineered joints directly in minimal coordinates, but requires additional coordinates restricted by constraints to model more complex motions. This approach can be inefficient, inaccurate, and difficult for biomechanists to customize. Since complex motion is the rule rather than the exception in biomechanisms, the benefits of minimal coordinate modeling are not fully realized in biomedical research. Here we introduce a practical implementation for empirically-defined internal-coordinate joints, which we call “mobilizers.” A mobilizer encapsulates the observations, measurement frame, and modeling requirements into a hinge specification of the permissible-motion manifold for a minimal set of internal coordinates. Mobilizers support nonlinear mappings that are mathematically equivalent to constraint manifolds but have the advantages of fewer coordinates, no constraints, and exact representation of the biomechanical motion-space—the benefits long enjoyed for internal-coordinate models of mechanical joints. Hinge matrices within the mobilizer are easily specified by user-supplied functions, and provide a direct means of mapping permissible motion derived from empirical data. We present computational results showing substantial performance and accuracy gains for mobilizers versus equivalent joints implemented with constraints. Examples of mobilizers for joints from human biomechanics

  1. Minimally Invasive Approach of a Retrocaval Ureter

    PubMed Central

    Pinheiro, Hugo; Ferronha, Frederico; Morales, Jorge; Campos Pinheiro, Luís

    2016-01-01

    The retrocaval ureter is a rare congenital entity, classically managed with open pyeloplasty techniques. The experience obtained with the laparoscopic approach to other, more frequent causes of ureteropelvic junction (UPJ) obstruction has opened the way to a minimally invasive approach to the retrocaval ureter. In our paper, we describe a clinical case of a right retrocaval ureter managed successfully with laparoscopic dismembered pyeloplasty. The main steps of the procedure are described. Our results were similar to those published by other urologic centers, which demonstrates the safety and feasibility of the procedure for this condition.

  2. Minimal Hepatic Encephalopathy Impairs Quality of Life

    PubMed Central

    Agrawal, Swastik; Umapathy, Sridharan; Dhiman, Radha K.

    2015-01-01

    Minimal hepatic encephalopathy (MHE) is the mildest form of the spectrum of neurocognitive impairment in cirrhosis. It is a frequent occurrence in patients of cirrhosis and is detectable only by specialized neurocognitive testing. MHE is a clinically significant disorder which impairs daily functioning, driving performance, work capability and learning ability. It also predisposes to the development of overt hepatic encephalopathy, increased falls and increased mortality. This results in impaired quality of life for the patient as well as significant social and economic burden for health providers and care givers. Early detection and treatment of MHE with ammonia lowering therapy can reverse MHE and improve quality of life. PMID:26041957

  3. Nonunity gain minimal-disturbance measurement

    SciTech Connect

    Sabuncu, Metin; Andersen, Ulrik L.; Mista, Ladislav Jr.; Fiurasek, Jaromir; Filip, Radim; Leuchs, Gerd

    2007-09-15

    We propose and experimentally demonstrate an optimal nonunity gain Gaussian scheme for partial measurement of an unknown coherent state that causes minimal disturbance of the state. The information gain and the state disturbance are quantified by the noise added to the measurement outcomes and to the output state, respectively. We derive the optimal trade-off relation between the two noises and we show that the trade-off is saturated by nonunity gain teleportation. Optimal partial measurement is demonstrated experimentally using a linear optics scheme with feedforward.

  4. Minimal residual disease in hypopigmented mycosis fungoides.

    PubMed

    Hsiao, Pa-Fan; Hsiao, Cheng-Hsiang; Tsai, Tsen-Fang; Jee, Shiou-Hwa

    2006-05-01

    We describe the case of a 13-year-old boy with stage I hypopigmented mycosis fungoides in whom minimal residual disease was detected with T-cell receptor gamma-polymerase chain reaction after the disease was in complete clinical remission. We further cloned and sequenced the T-cell receptor gamma-polymerase chain reaction product of the lesion in remission and found that the original T-cell clone still existed in decreased amounts. The patient was followed up for 3 1/2 years without any new lesions developing. The clinical significance of this residual malignant T-cell clone in mycosis fungoides remains to be elucidated. PMID:16631939

  5. Status of the minimal supersymmetric SO(10)

    SciTech Connect

    Dorsner, Ilja

    2010-02-10

    We discuss the status of the minimal supersymmetric SO(10) in both the low and split supersymmetry regimes. To demonstrate the viability of the model we present a good fit of the fermion masses and their mixings. The solution requires strongly split supersymmetry with gauginos and higgsinos around 10² TeV, sfermions close to 10¹⁴ GeV and a GUT scale of around 6×10¹⁵ GeV. It predicts fast proton decay rates, hierarchical neutrino masses and a large leptonic mixing angle sin θ₁₃ ≈ 0.1.

  6. Learning Minimal Latent Directed Information Polytrees.

    PubMed

    Etesami, Jalal; Kiyavash, Negar; Coleman, Todd

    2016-09-01

    We propose an approach for learning latent directed polytrees as long as there exists an appropriately defined discrepancy measure between the observed nodes. Specifically, we use our approach for learning directed information polytrees where samples are available from only a subset of processes. Directed information trees are a new type of probabilistic graphical model that represents the causal dynamics among a set of random processes in a stochastic system. We prove that the approach is consistent for learning minimal latent directed trees. We analyze the sample complexity of the learning task when the empirical estimator of mutual information is used as the discrepancy measure. PMID:27391682

  7. The minimal length and quantum partition functions

    NASA Astrophysics Data System (ADS)

    Abbasiyan-Motlaq, M.; Pedram, P.

    2014-08-01

    We study the thermodynamics of various physical systems in the framework of the generalized uncertainty principle that implies a minimal length uncertainty proportional to the Planck length. We present a general scheme to analytically calculate the quantum partition function of the physical systems to first order of the deformation parameter based on the behavior of the modified energy spectrum and compare our results with the classical approach. Also, we find the modified internal energy and heat capacity of the systems for the anti-Snyder framework.

  8. The minimal geometric deformation approach extended

    NASA Astrophysics Data System (ADS)

    Casadio, R.; Ovalle, J.; da Rocha, Roldão

    2015-11-01

    The minimal geometric deformation approach was introduced in order to study the exterior spacetime around spherically symmetric self-gravitating systems, such as stars or similar astrophysical objects, in the Randall-Sundrum brane-world framework. A consistent extension of this approach is developed here, which contains modifications of both the time component and the radial component of a spherically symmetric metric. A modified Schwarzschild geometry is obtained as an example of its simplest application, and a new solution that is potentially useful to describe stars in the brane-world is also presented.

  9. Minimizing Occupational Exposure to Antineoplastic Agents.

    PubMed

    Polovich, Martha

    2016-01-01

    The inherent toxicity of antineoplastic drugs used for the treatment of cancer makes them harmful to healthy cells as well as to cancer cells. Nurses who prepare and/or administer the agents potentially are exposed to the drugs and their negative effects. Knowledge about these drugs and the precautions aimed at reducing exposure are essential aspects of infusion nursing practice. This article briefly reviews the mechanisms of action of common antineoplastic drugs, the adverse outcomes associated with exposure, the potential for occupational exposure from preparation and administration, and recommended strategies for minimizing occupational exposure. PMID:27598070

  10. Minimal relativistic three-particle equations

    SciTech Connect

    Lindesay, J.

    1981-07-01

    A minimal self-consistent set of covariant and unitary three-particle equations is presented. Numerical results are obtained for three-particle bound states, elastic scattering and rearrangement of bound pairs with a third particle, and amplitudes for breakup into states of three free particles. The mathematical form of the three-particle bound state equations is explored; constraints are set upon the range of eigenvalues and number of eigenstates of these one parameter equations. The behavior of the number of eigenstates as the two-body binding energy decreases to zero in a covariant context generalizes results previously obtained non-relativistically by V. Efimov.

  11. Strategies for minimizing nosocomial measles transmission.

    PubMed Central

    Biellik, R. J.; Clements, C. J.

    1997-01-01

    As a result of the highly contagious nature of measles before the onset of rash, nosocomial transmission will remain a threat until the disease is eradicated. However, a number of strategies can minimize its nosocomial spread. It is therefore vital to maximize awareness among health care staff that an individual with measles can enter a health facility at any time and that a continual risk of the nosocomial transmission of measles exists. The present review makes two groups of recommendations: those which are generally applicable to all countries, and certain additional recommendations which may be suitable only for industrialized countries. PMID:9342896

  12. Cigarette price minimization strategies used by adults.

    PubMed

    Pesko, Michael F; Kruger, Judy; Hyland, Andrew

    2012-09-01

    We used multivariate logistic regressions to analyze data from the 2006 to 2007 Tobacco Use Supplement of the Current Population Survey, a nationally representative sample of adults. We explored use of cigarette price minimization strategies, such as purchasing cartons of cigarettes, purchasing in states with lower after-tax cigarette prices, and purchasing on the Internet. Racial/ethnic minorities and persons with low socioeconomic status used these strategies less frequently at last purchase than did White and high-socioeconomic-status respondents. PMID:22742066

  13. Minimally Invasive Surgery Osteotomy of the Hindfoot.

    PubMed

    Vernois, Joel; Redfern, David; Ferraz, Linda; Laborde, Julien

    2015-07-01

    A minimally invasive surgical approach has been developed for hindfoot as well as forefoot procedures. Percutaneous techniques have been evolving for more than 20 years. Many conventional surgical techniques can be performed percutaneously after training. Percutaneous surgical techniques require knowledge specific to each procedure (eg, percutaneous Zadek osteotomy or percutaneous medial heel shift). In the treatment and correction of hindfoot pathology, the surgeon now has percutaneous options including medial or lateral heel shift, Zadek osteotomy, and exostectomy with/without arthroscopy. PMID:26117576

  14. Minimally Invasive Atrial Fibrillation Surgery: Hybrid Approach

    PubMed Central

    Beller, Jared P.; Downs, Emily A.; Ailawadi, Gorav

    2016-01-01

    Atrial fibrillation is a challenging pathologic process. There continues to be a great need for the development of a reproducible, durable cure when medical management has failed. An effective, minimally invasive, sternal-sparing intervention without the need for cardiopulmonary bypass is a promising treatment approach. In this article, we describe a hybrid technique being refined at our center that combines a thoracoscopic epicardial surgical approach with an endocardial catheter-based procedure. We also discuss our results and review the literature describing this unique treatment approach. PMID:27127561

  15. Minimal hepatic encephalopathy impairs quality of life.

    PubMed

    Agrawal, Swastik; Umapathy, Sridharan; Dhiman, Radha K

    2015-03-01

    Minimal hepatic encephalopathy (MHE) is the mildest form of the spectrum of neurocognitive impairment in cirrhosis. It is a frequent occurrence in patients of cirrhosis and is detectable only by specialized neurocognitive testing. MHE is a clinically significant disorder which impairs daily functioning, driving performance, work capability and learning ability. It also predisposes to the development of overt hepatic encephalopathy, increased falls and increased mortality. This results in impaired quality of life for the patient as well as significant social and economic burden for health providers and care givers. Early detection and treatment of MHE with ammonia lowering therapy can reverse MHE and improve quality of life. PMID:26041957

  16. Minimal decaying Dark Matter and the LHC

    SciTech Connect

    Arcadi, Giorgio; Covi, Laura

    2013-08-01

    We consider a minimal Dark Matter model with just two additional states, a Dark Matter Majorana fermion and a colored or electroweakly charged scalar, without introducing any symmetry to stabilize the DM state. We identify the parameter region where an indirect DM signal would be within the reach of future observations and the DM relic density generated fits the observations. We find in this way two possible regions in the parameter space, corresponding to a FIMP/SuperWIMP or a WIMP DM. We point out the different collider signals of this scenario and how it will be possible to measure the different couplings in case of a combined detection.

  17. Minimal-change disease secondary to etanercept

    PubMed Central

    Koya, Mariko; Pichler, Raimund; Jefferson, J. Ashley

    2012-01-01

    Etanercept is a soluble tumor necrosis factor alpha (TNFα) receptor which is widely used in the treatment of rheumatoid arthritis, psoriasis and other autoimmune inflammatory disorders. It is known for its relative lack of nephrotoxicity; however, there are reports on the development of nephrotic syndrome associated with the treatment with TNFα antagonists. Here, we describe a patient with psoriasis who developed biopsy-proven minimal-change disease (MCD) shortly after initiating etanercept. Our case is unique in that the MCD resolved after discontinuation of this medication, notably without the use of corticosteroids, strongly suggesting a drug-related phenomenon. PMID:26019819

  18. Minimal model for spoof acoustoelastic surface states

    SciTech Connect

    Christensen, J.; Willatzen, M.; Liang, Z.

    2014-12-15

    Similar to textured perfect electric conductors for electromagnetic waves, which sustain artificial or spoof surface plasmons, we present an equivalent phenomenon for the case of sound. Aided by a minimal model that is able to capture the complex wave interaction of elastic cavity modes and airborne sound radiation in perfectly rigid panels, we construct designer acoustoelastic surface waves that are entirely controlled by the geometrical environment. Comparisons to results obtained by full-wave simulations confirm the feasibility of the model, and we demonstrate illustrative examples, such as resonant transmission and waveguiding, a few of the many settings where spoof elastic surface waves are useful.

  19. Flocking with minimal cooperativity: The panic model

    NASA Astrophysics Data System (ADS)

    Pilkiewicz, Kevin R.; Eaves, Joel D.

    2014-01-01

    We present a two-dimensional lattice model of self-propelled spins that can change direction only upon collision with another spin. We show that even with ballistic motion and minimal cooperativity, these spins display robust flocking behavior at nearly all densities, forming long bands of stripes. The structural transition in this system is not a thermodynamic phase transition, but it can still be characterized by an order parameter, and we demonstrate that if this parameter is studied as a dynamical variable rather than a steady-state observable, we can extract a detailed picture of how the flocking mechanism varies with density.

  20. Periodical cicadas: A minimal automaton model

    NASA Astrophysics Data System (ADS)

    de O. Cardozo, Giovano; de A. M. M. Silvestre, Daniel; Colato, Alexandre

    2007-08-01

    The Magicicada spp. life cycles, with their prime-numbered periods and highly synchronized emergence, have defied reasonable scientific explanation since their discovery. During the last decade, several models and explanations for this phenomenon have appeared in the literature, along with a great deal of discussion. Despite this considerable effort, there is no final conclusion about this long-standing biological problem. Here, we construct a minimal automaton model without predation/parasitism which reproduces some of these aspects. Our results point towards competition between different strains with a limited dispersal threshold as the main factor leading to the emergence of prime-numbered life cycles.

  1. Minimally Invasive Transforaminal Lumbar Interbody Fusion.

    PubMed

    Ahn, Junyoung; Tabaraee, Ehsan; Singh, Kern

    2015-07-01

    Minimally invasive transforaminal lumbar interbody fusion (MIS TLIF) is performed via tubular dilators, thereby preserving the integrity of the paraspinal musculature. The decreased soft tissue disruption in the MIS technique has been associated with significantly decreased blood loss, shorter length of hospitalization, and an expedited return to work, while maintaining comparable arthrodesis rates compared with the open technique, particularly in the setting of spondylolisthesis (isthmic and degenerative), recurrent symptomatic disk herniation, spinal stenosis, pseudarthrosis, iatrogenic instability, and spinal trauma. The purpose of this article and the accompanying video was to demonstrate the techniques for a primary, single-level MIS TLIF. PMID:26079840

  2. Minimal energy damping in an axisymmetric flow

    NASA Astrophysics Data System (ADS)

    Sachs, Alexander

    2008-05-01

    The method of Lagrange's undetermined multipliers is used to find the velocity field which minimizes the energy damping for a viscous incompressible fluid described by the Navier-Stokes equation. The vorticity of this velocity field obeys a Helmholtz equation with an undetermined parameter. This Helmholtz equation is used to determine the axisymmetric velocity field in a cylinder. This velocity field is slightly different from the Poiseuille velocity field. The rate of energy damping per unit energy is calculated as a function of the parameter. It is a minimum when the parameter is equal to a root of a Bessel function.
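
    The abstract's closing claim, that the damping rate is minimized when the Helmholtz parameter equals a Bessel-function root, can be illustrated numerically. The sketch below locates the first zero of J0 with a stdlib-only power series and bisection; which Bessel order and normalization apply to the cylinder problem is an assumption here, not taken from the paper.

```python
import math

def j0(x):
    # Power series for the Bessel function J0:
    # sum_{k>=0} (-1)^k (x/2)^{2k} / (k!)^2, truncated once terms vanish.
    s, term = 1.0, 1.0
    for k in range(1, 40):
        term *= -(x / 2.0) ** 2 / (k * k)
        s += term
    return s

def bisect(f, a, b, tol=1e-12):
    # Standard bisection root finder; assumes f changes sign on [a, b].
    fa = f(a)
    while b - a > tol:
        m = 0.5 * (a + b)
        if fa * f(m) <= 0:
            b = m
        else:
            a, fa = m, f(m)
    return 0.5 * (a + b)

# J0 is positive at 2 and negative at 3, so its first zero lies between.
root = bisect(j0, 2.0, 3.0)
print(round(root, 4))  # 2.4048, the first zero of J0
```

    In the hypothetical reading where the cylinder mode is governed by J0, this root would fix the optimal value of the undetermined parameter.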

  3. Minimal electroweak model for monopole annihilation

    SciTech Connect

    Farris, T.H.; Kephart, T.W.; Weiler, T.J.; Yuan, T.C.

    1992-02-03

    We construct the minimal (most economical in fields) extension of the standard model implementing the Langacker-Pi mechanism for reducing the grand unified theory (GUT) monopole cosmic density to an allowed level. The model contains just a single charged scalar field in addition to the standard Higgs doublet, and is easily embeddable in any GUT. We identify the region of parameter space where monopoles annihilate in the high-temperature early Universe. A particularly alluring possibility is that the demise of monopoles at the electroweak scale is in fact the origin of the Universe's net baryon number.

  4. Minimal model for spatial coherence resonance.

    PubMed

    Perc, Matjaz; Marhl, Marko

    2006-06-01

    We show that a planar medium, locally modeled by a simple one-dimensional excitable system with a piece-wise linear potential, can serve as a minimal model for spatial coherence resonance. Via an analytical treatment of the spatially extended system, we derive the dependence of the resonant wave number on several crucial system parameters, ranging from the diffusion coefficient to the local excursion time of constitutive excitable units. Thus, we provide vital insights into mechanisms that enable the emergence of exclusively noise-induced spatial periodicity in excitable media. PMID:16906944

  5. Solar array stepping to minimize array excitation

    NASA Technical Reports Server (NTRS)

    Bhat, Mahabaleshwar K. P. (Inventor); Liu, Tung Y. (Inventor); Plescia, Carl T. (Inventor)

    1989-01-01

    Mechanical oscillations of a mechanism containing a stepper motor, such as a solar-array-powered spacecraft, are reduced and minimized by executing step movements in pairs, the interval between the two steps being equal to one-half of the period of torsional oscillation of the mechanism. Each pair of steps is repeated at the intervals needed to maintain the desired continuous movement of the element to be moved, such as the solar array of a spacecraft. To account for uncertainty, as well as slow change, in the period of torsional oscillation, a command unit may be provided for varying the interval between the steps in a pair.
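
    The timing rule described above, two steps spaced half a torsional period apart, can be checked with a toy superposition argument: the ringdown excited by the second impulse arrives in antiphase with the first and cancels it. The sketch below is a hypothetical illustration for an idealized undamped mode, not the patented control unit.

```python
import math

def residual_amplitude(spacing, period):
    # Superpose the sine ringdowns of two equal unit impulses, the second
    # delayed by `spacing`, and return the oscillation amplitude remaining
    # after both impulses (the magnitude of the complex phasor sum
    # e^{i*0} + e^{-i*w*spacing} with w = 2*pi/period).
    w = 2 * math.pi / period
    re = 1 + math.cos(w * spacing)
    im = -math.sin(w * spacing)
    return math.hypot(re, im)

T = 4.0  # torsional oscillation period (arbitrary units)
print(residual_amplitude(T / 2, T))  # ~0: half-period spacing cancels
print(residual_amplitude(T, T))      # 2.0: full-period spacing doubles it
```

    With damping or an inaccurate period estimate the cancellation is only partial, which is why the patent provides for adjusting the intra-pair interval.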

  6. Minimally invasive surgery for esophageal achalasia

    PubMed Central

    Chen, Huan-Wen

    2016-01-01

    Esophageal achalasia is a functional esophageal disorder caused by neuromuscular dysfunction of the esophagus. Its main features are the absence of esophageal peristalsis, elevated lower esophageal sphincter pressure, and impaired relaxation of the sphincter on swallowing. Dissection of the lower esophageal muscular layer is one of the main ways to treat esophageal achalasia, and complete thoracoscopic esophageal myotomy is currently one of its treatments. Drawing on our experience in minimally invasive esophageal surgery, we improved the incision and operative procedure and adopted complete thoracoscopic esophageal myotomy for the treatment of esophageal achalasia. PMID:27499977

  7. APMS SVD methodology and implementation

    SciTech Connect

    BG Amidan; TA Ferryman

    2000-04-17

    One of the main tasks within the Aviation Performance Measurement System (APMS) program uses statistical methodologies to find atypical flights. With thousands of flights a day and hundreds of parameters recorded every second for each flight, the amount of data escalates and the ability to find atypical flights becomes more difficult. The purpose of this paper is to explain the method known as singular value decomposition (SVD) employed to search for the atypical flights, and to display useful graphics that facilitate understanding the causes of atypicality for these flights. Other methods could also perform this search, and some are planned for future implementation.
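
    As a hedged sketch of how an SVD-based screen for atypical flights might work (the actual APMS features, rank, and scoring are not specified here and are assumptions), one can model "typical" flights with a low-rank subspace and score a new flight by its residual outside that subspace:

```python
import numpy as np

# Hypothetical illustration of an SVD-based atypicality screen: learn a
# low-rank "typical flight" subspace from training flights, then score a
# new flight by how far it falls outside that subspace. Feature choice,
# rank, and threshold are assumptions, not APMS specifics.
rng = np.random.default_rng(0)
train = rng.normal(size=(50, 8))      # 50 typical flights, 8 parameters each
mu = train.mean(axis=0)
_, _, Vt = np.linalg.svd(train - mu, full_matrices=False)
V = Vt[:3].T                          # rank-3 "typical" subspace

def atypicality(flight):
    # Norm of the component of the centered flight vector that the
    # typical subspace cannot explain.
    d = flight - mu
    return float(np.linalg.norm(d - V @ (V.T @ d)))

normal_flight = rng.normal(size=8)
odd_flight = normal_flight + 10.0     # every parameter shifted far off nominal
print(atypicality(odd_flight) > atypicality(normal_flight))  # True
```

    Ranking flights by this residual is one simple way such a screen could surface candidates for analyst review.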

  8. Minimal model of financial stylized facts.

    PubMed

    Delpini, Danilo; Bormetti, Giacomo

    2011-04-01

    In this work we propose a statistical characterization of a linear stochastic volatility model featuring inverse-gamma stationary distribution for the instantaneous volatility. We detail the derivation of the moments of the return distribution, revealing the role of the inverse-gamma law in the emergence of fat tails and of the relevant correlation functions. We also propose a systematic methodology for estimating the parameters and we describe the empirical analysis of the Standard & Poor's 500 index daily returns, confirming the ability of the model to capture many of the established stylized facts as well as the scaling properties of empirical distributions over different time horizons. PMID:21599119

  9. Feminist methodologies and engineering education research

    NASA Astrophysics Data System (ADS)

    Beddoes, Kacey

    2013-03-01

    This paper introduces feminist methodologies in the context of engineering education research. It builds upon other recent methodology articles in engineering education journals and presents feminist research methodologies as a concrete engineering education setting in which to explore the connections between epistemology, methodology and theory. The paper begins with a literature review that covers a broad range of topics featured in the literature on feminist methodologies. Next, data from interviews with engineering educators and researchers who have engaged with feminist methodologies are presented. The ways in which feminist methodologies shape their research topics, questions, frameworks of analysis, methods, practices and reporting are each discussed. The challenges and barriers they have faced are then discussed. Finally, the benefits of further and broader engagement with feminist methodologies within the engineering education community are identified.

  10. 76 FR 71431 - Civil Penalty Calculation Methodology

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-17

    ... TRANSPORTATION Federal Motor Carrier Safety Administration Civil Penalty Calculation Methodology AGENCY: Federal... Uniform Fine Assessment (UFA) algorithm, which FMCSA currently uses for calculation of civil penalties... methodology for calculation of certain civil penalties. To induce compliance with federal regulations,...

  11. The minimal curvaton-higgs model

    SciTech Connect

    Enqvist, Kari; Lerner, Rose N.; Takahashi, Tomo

    2014-01-01

    We present the first full study of the minimal curvaton-higgs (MCH) model, which is a minimal interpretation of the curvaton scenario with one real scalar coupled to the standard model Higgs boson. The standard model coupling allows the dynamics of the model to be determined in detail, including effects from the thermal background and from radiative corrections to the potential. The relevant mechanisms for curvaton decay are incomplete non-perturbative decay (delayed by thermal blocking), followed by decay via a dimension-5 non-renormalisable operator. To avoid spoiling the predictions of big bang nucleosynthesis, we find the "bare" curvaton mass to be m_σ ≥ 8 × 10^4 GeV. To match observational data from Planck there is an upper limit on the curvaton-higgs coupling g, between 10^−3 and 10^−2, depending on the mass. This is due to interactions with the thermal background. We find that typically non-Gaussianities are small, but that if f_NL is observed in the near future then m_σ ≲ 5 × 10^9 GeV, depending on the Hubble scale during inflation. In a thermal dark matter model, the lower bound on m_σ can increase substantially. The parameter space may also be affected once the baryogenesis mechanism is specified.

  12. Power Minimization techniques for Networked Data Centers.

    SciTech Connect

    Low, Steven; Tang, Kevin

    2011-09-28

    Our objective is to develop a mathematical model to optimize energy consumption at multiple levels in networked data centers, and to develop abstract algorithms that optimize not only individual servers, but also coordinate the energy consumption of clusters of servers within a data center and across geographically distributed data centers, to minimize the overall energy cost and brown energy consumption of an enterprise. In this project, we have formulated a variety of optimization models, some stochastic and others deterministic, and have obtained a variety of qualitative results on the structural properties, robustness, and scalability of the optimal policies. We have also systematically derived from these models decentralized algorithms to optimize energy efficiency, and analyzed their optimality and stability properties. Finally, we have conducted preliminary numerical simulations to illustrate the behavior of these algorithms. We draw the following conclusions. First, there is a substantial opportunity to minimize both the amount and the cost of electricity consumption in a network of datacenters, by exploiting the fact that traffic load, electricity cost, and availability of renewable generation fluctuate over time and across geographical locations. Judiciously matching these stochastic processes can optimize the tradeoff between brown energy consumption, electricity cost, and response time. Second, given the stochastic nature of these three processes, real-time dynamic feedback should form the core of any optimization strategy. The key is to develop decentralized algorithms that can be implemented at different parts of the network as simple, local algorithms that coordinate through asynchronous message passing.
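
    For a feel of the kind of tradeoff involved, consider a deliberately simplified deterministic instance (the project's actual stochastic models are richer, and the site names, prices, and capacities below are made up for illustration): with linear electricity prices and capacity limits, minimizing cost just means filling the cheapest sites first.

```python
# Toy sketch: route a total workload across datacenters with different
# electricity prices and capacities, minimizing total electricity cost.
# With linear costs the optimum is to fill the cheapest sites first.

def cheapest_first(total_load, sites):
    # sites: list of (name, price_per_unit, capacity)
    plan, cost = {}, 0.0
    for name, price, cap in sorted(sites, key=lambda s: s[1]):
        x = min(cap, total_load)         # take as much as this site allows
        plan[name], total_load = x, total_load - x
        cost += price * x
    if total_load > 1e-9:
        raise ValueError("demand exceeds total capacity")
    return plan, cost

sites = [("dc_east", 0.12, 60.0), ("dc_west", 0.08, 50.0), ("dc_eu", 0.15, 80.0)]
plan, cost = cheapest_first(100.0, sites)
print(plan)  # dc_west filled to 50, dc_east takes the remaining 50, dc_eu idle
print(cost)
```

    Adding response-time constraints, time-varying prices, and renewable availability turns this greedy rule into the coupled stochastic optimization the project studies.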

  13. Surgical efficacy of minimally invasive thoracic discectomy.

    PubMed

    Elhadi, Ali M; Zehri, Aqib H; Zaidi, Hasan A; Almefty, Kaith K; Preul, Mark C; Theodore, Nicholas; Dickman, Curtis A

    2015-11-01

    We aimed to determine the clinical indications and surgical outcomes for thoracoscopic discectomy. Thoracic disc disease is a rare degenerative process. Thoracoscopic approaches serve to minimize tissue injury during the approach, but critics argue that this comes at the cost of surgical efficacy. Current reports in the literature are limited to small institutional patient series. We systematically identified all English language articles on thoracoscopic discectomy with at least two patients, published from 1994 to 2013 on MEDLINE, Science Direct, and Google Scholar. We analyzed 12 articles that met the inclusion criteria, five prospective and seven retrospective studies comprising 545 surgical patients. The overall complication rate was 24% (n=129), with reported complications ranging from intercostal neuralgia (6.1%), atelectasis (2.8%), and pleural effusion (2.6%), to more severe complications such as pneumonia (0.8%), pneumothorax (1.3%), and venous thrombosis (0.2%). The average reported postoperative follow-up was 20.5 months. Complete resolution of symptoms was reported in 79% of patients, improvement with residual symptoms in 10.2%, no change in 9.6%, and worsening in 1.2%. The minimally invasive endoscopic approaches to the thoracic spine among selected patients demonstrate excellent clinical efficacy and acceptable complication rates, comparable to the open approaches. Disc herniations confined to a single level, with small or no calcifications, are ideal for such an approach, whereas patients with calcified discs adherent to the dura would benefit from an open approach. PMID:26206758

  14. Minimal size of a barchan dune.

    PubMed

    Parteli, E J R; Durán, O; Herrmann, H J

    2007-01-01

    Barchans are dunes of high mobility which have a crescent shape and propagate under conditions of unidirectional wind. However, sand dunes only appear above a critical size, which scales with the saturation distance of the sand flux [P. Hersen, S. Douady, and B. Andreotti, Phys. Rev. Lett. 89, 264301 (2002); B. Andreotti, P. Claudin, and S. Douady, Eur. Phys. J. B 28, 321 (2002); G. Sauermann, K. Kroy, and H. J. Herrmann, Phys. Rev. E 64, 31305 (2001)]. It has been suggested [P. Hersen, S. Douady, and B. Andreotti, Phys. Rev. Lett. 89, 264301 (2002)] that this flux fetch distance is itself constant. This could not, however, explain the protosize of barchan dunes, which often occur in coastal areas of high littoral drift, or the scale of dunes on Mars. In the present work, we show from three-dimensional calculations of sand transport that the size and the shape of the minimal barchan dune depend on the wind friction speed and the sand flux on the area between dunes in a field. Our results explain the common appearance of barchans a few tens of centimeters high which are observed along coasts. Furthermore, we find that the rate at which grains enter saltation on Mars is one order of magnitude higher than on Earth, and is relevant to correctly obtain the minimal dune size on Mars. PMID:17358139

  15. The thrust minimization problem and its applications

    NASA Astrophysics Data System (ADS)

    Ivanyukhin, A. V.; Petukhov, V. G.

    2015-07-01

    An indirect approach to the optimization of trajectories with finite thrust based on Pontryagin's maximum principle is discussed. The optimization is aimed at calculating the minimum thrust for a point-to-point flight completed within a given interval of time with a constant exhaust velocity and a constant power. This may help calculate the region of existence of the optimum trajectory with thrust switching: it is evident that the latter problem may be solved if minimum thrust is lower than or equal to the available thrust in the problem with switching. A technique for calculating the optimum trajectories with a finite thrust by solving the problem of minimization of the thrust acceleration with a subsequent numerical continuation with respect to the mass flow towards the thrust minimization problem is proposed. This technique offers an opportunity to detect degeneracies associated with the lack of thrust or specific impulse. In effect, it allows one to calculate the boundaries of the region of existence of trajectories with thrust switching and thus makes it possible to automate the process of solving the problem of optimization of trajectories with thrust switching.

  16. Osmosis in a minimal model system.

    PubMed

    Lion, Thomas W; Allen, Rosalind J

    2012-12-28

    Osmosis is one of the most important physical phenomena in living and soft matter systems. While the thermodynamics of osmosis is well understood, the underlying microscopic dynamical mechanisms remain the subject of discussion. Unravelling these mechanisms is a prerequisite for understanding osmosis in non-equilibrium systems. Here, we investigate the microscopic basis of osmosis, in a system at equilibrium, using molecular dynamics simulations of a minimal model in which repulsive solute and solvent particles differ only in their interactions with an external potential. For this system, we can derive a simple virial-like relation for the osmotic pressure. Our simulations support an intuitive picture in which the solvent concentration gradient, at osmotic equilibrium, arises from the balance between an outward force, caused by the increased total density in the solution, and an inward diffusive flux caused by the decreased solvent density in the solution. While more complex effects may occur in other osmotic systems, our results suggest that they are not required for a minimal picture of the dynamic mechanisms underlying osmosis. PMID:23277960

  17. Minimally invasive treatment options in fixed prosthodontics.

    PubMed

    Edelhoff, Daniel; Liebermann, Anja; Beuer, Florian; Stimmelmayr, Michael; Güth, Jan-Frederik

    2016-03-01

    Minimally invasive treatment options have become increasingly feasible in restorative dentistry, due to the introduction of the adhesive technique in combination with restorative materials featuring translucent properties similar to those of natural teeth. Mechanical anchoring of restorations via conventional cementation represents a predominantly subtractive treatment approach that is gradually being superseded by a primarily defect-oriented additive method in prosthodontics. Modifications of conventional treatment procedures have led to the development of an economical approach to the removal of healthy tooth structure. This is possible because the planned treatment outcome is defined in a wax-up before the treatment is commenced and this wax-up is subsequently used as a reference during tooth preparation. Similarly, resin-bonded FDPs and implants have made it possible to preserve the natural tooth structure of potential abutment teeth. This report describes a number of clinical cases to demonstrate the principles of modern prosthetic treatment strategies and discusses these approaches in the context of minimally invasive prosthetic dentistry. PMID:26925471

  18. Free energies for singleton minimal states

    NASA Astrophysics Data System (ADS)

    Golden, J. M.

    2016-06-01

    It is assumed that any free energy function exhibits strict periodic behavior for histories that have been periodic for all past times. This is not the case for the work function, which, however, has the usual defining properties of a free energy. Forms given in fairly recent years for the minimum and related free energies of linear materials with memory have this property. Materials for which the minimal states are all singletons are those for which at least some of the singularities of the Fourier transform of the relaxation function are not isolated. For such materials, the maximum free energy is the work function, and free energies intermediate between the minimum free energy and the work function should be given by a linear relation involving these two quantities. All such functionals, except the minimum free energy, therefore do not have strict periodic behavior for periodic histories, which contradicts our assumption. A way out of the difficulty is explored which involves approximating the relaxation function by a form for which the minimal states are no longer singletons. A representation can then be given of an arbitrary free energy as a linear combination of the minimum, maximum and intermediate free energies derived in earlier work. This representation obeys our periodicity assumption. Numerical data are presented, supporting the consistency of this approach.

  19. Gamma ray tests of Minimal Dark Matter

    SciTech Connect

    Cirelli, Marco; Hambye, Thomas; Panci, Paolo; Sala, Filippo; Taoso, Marco

    2015-10-12

    We reconsider the model of Minimal Dark Matter (a fermionic, hypercharge-less quintuplet of the EW interactions) and compute its gamma ray signatures. We compare them with a number of gamma ray probes: the galactic halo diffuse measurements, the galactic center line searches and recent dwarf galaxies observations. We find that the original minimal model, whose mass is fixed at 9.4 TeV by the relic abundance requirement, is constrained by the line searches from the Galactic Center: it is ruled out if the Milky Way possesses a cuspy profile such as NFW but it is still allowed if it has a cored one. Observations of dwarf spheroidal galaxies are also relevant (in particular searches for lines), and ongoing astrophysical progresses on these systems have the potential to eventually rule out the model. We also explore a wider mass range, which applies to the case in which the relic abundance requirement is relaxed. Most of our results can be safely extended to the larger class of multi-TeV WIMP DM annihilating into massive gauge bosons.

  20. Singlet-stabilized minimal gauge mediation

    SciTech Connect

    Curtin, David; Tsai, Yuhsin

    2011-04-01

    We propose singlet-stabilized minimal gauge mediation as a simple Intriligator, Seiberg and Shih-based model of direct gauge mediation which avoids both light gauginos and Landau poles. The hidden sector is a massive s-confining supersymmetric QCD that is distinguished by a minimal SU(5) flavor group. The uplifted vacuum is stabilized by coupling the meson to an additional singlet sector with its own U(1) gauge symmetry via nonrenormalizable interactions suppressed by a higher scale Λ_UV in the electric theory. This generates a nonzero vacuum expectation value for the singlet meson via the inverted hierarchy mechanism, but requires tuning to a precision ~(Λ/Λ_UV)^2, which is ~10^−4. In the course of this analysis we also outline some simple model-building rules for stabilizing uplifted-ISS models, which lead us to conclude that meson deformations are required (or at least heavily favored) to stabilize the adjoint component of the magnetic meson.

  1. Navy Shipboard Hazardous Material Minimization Program

    SciTech Connect

    Bieberich, M.J.; Robinson, P.; Chastain, B.

    1994-12-31

    The use of hazardous (and potentially hazardous) materials in shipboard cleaning applications has proliferated as new systems and equipments have entered the fleet to reside alongside existing equipments. With the growing environmental awareness (and additional, more restrictive regulations) at all levels/echelon commands of the DoD, the Navy has initiated a proactive program to undertake the minimization/elimination of these hazardous materials in order to eliminate HMs at the source. This paper will focus on the current Shipboard Hazardous Materials Minimization Program initiatives including the identification of authorized HM currently used onboard, identification of potential substitute materials for HM replacement, identification of new cleaning technologies and processes/procedures, and identification of technical documents which will require revision to eliminate the procurement of HMs into the federal supply system. Also discussed will be the anticipated path required to implement the changes into the fleet and automated decision processes (substitution algorithm) currently employed. The paper will also present the most recent technologies identified for approval or additional testing and analysis, including: supercritical CO₂ cleaning, high pressure blasting (H₂O + baking soda), aqueous and semi-aqueous cleaning materials and processes, solvent replacements, and dedicated parts washing systems with internal filtering capabilities, as well as automated software for solvent/cleaning process substitute selection. Along with these technological advances, data availability (from on-line databases and CD-ROM database libraries) will be identified and discussed.

  2. An Aristotelian Account of Minimal Chemical Life

    NASA Astrophysics Data System (ADS)

    Bedau, Mark A.

    2010-12-01

    This paper addresses the open philosophical and scientific problem of explaining and defining life. This problem is controversial, and there is nothing approaching a consensus about what life is. This raises a philosophical meta-question: Why is life so controversial and so difficult to define? This paper proposes that we can attribute a significant part of the controversy over life to use of a Cartesian approach to explaining life, which seeks necessary and sufficient conditions for being an individual living organism, out of the context of other organisms and the abiotic environment. The Cartesian approach contrasts with an Aristotelian approach to explaining life, which considers life only in the whole context in which it actually exists, looks at the characteristic phenomena involving actual life, and seeks the deepest and most unified explanation for those phenomena. The phenomena of life might be difficult to delimit precisely, but it certainly includes life's characteristic hallmarks, borderline cases, and puzzles. The Program-Metabolism-Container (PMC) model construes minimal chemical life as a functionally integrated triad of chemical systems, which are identified as the Program, Metabolism, and Container. Rasmussen diagrams precisely depict the functional definition of minimal chemical life. The PMC model illustrates the Aristotelian approach to life, because it explains eight of life's hallmarks, one of life's borderline cases (the virus), and two of life's puzzles.

  3. On the generalized minimal massive gravity

    NASA Astrophysics Data System (ADS)

    Setare, M. R.

    2015-09-01

    In this paper we study the Generalized Minimal Massive Gravity (GMMG) in an asymptotically AdS3 background. The generalized minimal massive gravity theory is realized by adding the CS deformation term, the higher derivative deformation term, and an extra term to pure Einstein gravity with a negative cosmological constant. We study the linearized excitations around the AdS3 background and find that at a special (tricritical) point in parameter space the two massive graviton solutions become massless and are replaced by two solutions with logarithmic and logarithmic-squared boundary behavior. So it is natural to propose that the GMMG model could also provide a holographic description for a rank-3 Logarithmic Conformal Field Theory (LCFT). We calculate the energy of the linearized gravitons in the AdS3 background, and show that the theory is free of negative-energy bulk modes. Then we obtain the central charges of the dual CFT explicitly and show that GMMG also avoids the aforementioned "bulk-boundary unitarity clash". After that we show that the General Zwei-Dreibein Gravity (GZDG) model can reduce to the GMMG model. Finally, by a Hamiltonian analysis we show that the GMMG model has no Boulware-Deser ghosts and propagates only two physical modes.

  4. Design of minimally strained nucleic acid nanotubes.

    PubMed

    Sherman, William B; Seeman, Nadrian C

    2006-06-15

    A practical theoretical framework is presented for designing and classifying minimally strained nucleic acid nanotubes. The structures are based on the double crossover motif where each double-helical domain is connected to each of its neighbors via two or more Holliday-junction-like reciprocal exchanges, such that each domain is parallel to the main tube axis. Modeling is based on a five-parameter characterization of the segmented double-helical structure. Once the constraint equations have been derived, the primary design problem for a minimally strained N-domain structure is reduced to solving three simultaneous equations in 2N+2 variables. Symmetry analysis and tube merging then allow for the design of a wide variety of tubes, which can be tailored to satisfy requirements such as specific inner and outer radii, or multiple lobed structures. The general form of the equations allows similar techniques to be applied to various nucleic acid helices: B-DNA, A-DNA, RNA, DNA-PNA, or others. Possible applications for such tubes include nanoscale scaffolding as well as custom-shaped enclosures for other nano-objects. PMID:16581842

  5. [Theory and practice of minimally invasive endodontics].

    PubMed

    Jiang, H W

    2016-08-01

    The primary goal of modern endodontic therapy is to achieve the long-term retention of a functional tooth by preventing or treating pulpitis or apical periodontitis. The long-term retention of an endodontically treated tooth is correlated with the remaining amount of tooth tissue and the quality of the restoration after root canal filling. In recent years, there has been rapid progress and development in the basic research of endodontic biology, instruments, and applied materials, making treatment procedures safer, more accurate, and more efficient. Thus, minimally invasive endodontics (MIE) has received increasing attention. MIE aims to preserve the maximum of tooth structure during root canal therapy, and the concept covers the whole process of diagnosis and treatment of teeth. This review article focuses on describing the minimally invasive concepts and operating essentials in endodontics, from diagnosis and treatment planning to access opening, pulp cavity finishing, root canal cleaning and shaping, 3-dimensional root canal filling, and restoration after root canal treatment. PMID:27511034

  6. [Minimally Invasive Open Surgery for Lung Cancer].

    PubMed

    Nakagawa, Kazuo; Watanabe, Shunichi

    2016-07-01

    Significant efforts have long been made by surgeons to reduce the invasiveness of surgical procedures. Surgeons always keep in mind that the basic principle in performing less invasive surgical procedures for malignant tumors is to decrease the invasiveness for patients without compromising oncological curability and surgical safety. Video-assisted thoracic surgery (VATS) has been used increasingly as a minimally invasive approach to lung cancer surgery. However, whether VATS lobectomy is a less invasive procedure with equivalent or better clinical effect compared with open lobectomy for patients with lung cancer remains controversial because of the absence of randomized prospective studies. The degree of difficulty of anatomical lung resection depends on the degree of fissure development, the mobility of hilar lymph nodes, and the degree of pleural adhesions. During pulmonary surgery, thoracic surgeons always have to deal with not only these difficulties but also unexpected events such as intraoperative bleeding. Recently, we have performed pulmonary resection for lung cancer with a minimally invasive open surgery (MIOS) approach. In this article, we introduce the surgical procedure of MIOS and demonstrate short-term results. Of course, the efficacy of MIOS needs to be further evaluated with long-term results. PMID:27440030

  7. MR imaging guidance for minimally invasive procedures

    NASA Astrophysics Data System (ADS)

    Wong, Terence Z.; Kettenbach, Joachim; Silverman, Stuart G.; Schwartz, Richard B.; Morrison, Paul R.; Kacher, Daniel F.; Jolesz, Ferenc A.

    1998-04-01

    Image guidance is one of the major challenges common to all minimally invasive procedures, including biopsy, thermal ablation, endoscopy, and laparoscopy. It is essential for (1) identifying the target lesion, (2) planning the minimally invasive approach, and (3) monitoring the therapy as it progresses. MRI is an ideal imaging modality for this purpose, providing high soft-tissue contrast and multiplanar imaging capability with no ionizing radiation. An interventional/surgical MRI suite has been developed at Brigham and Women's Hospital which provides multiplanar imaging guidance during surgery, biopsy, and thermal ablation procedures. The 0.5T MRI system (General Electric Signa SP) features open vertical access, allowing intraoperative imaging to be performed. An integrated navigational system permits near real-time control of imaging planes and provides interactive guidance for positioning various diagnostic and therapeutic probes. MR imaging can also be used to monitor cryotherapy as well as high-temperature thermal ablation procedures using RF, laser, microwave, or focused ultrasound. Design features of the interventional MRI system will be discussed, and techniques will be described for interactive image acquisition and tracking of interventional instruments. Applications for interactive and near-real-time imaging will be presented, as well as examples of specific procedures performed using MRI guidance.

  8. Microbial life detection with minimal assumptions

    NASA Astrophysics Data System (ADS)

    Kounaves, Samuel P.; Noll, Rebecca A.; Buehler, Martin G.; Hecht, Michael H.; Lankford, Kurt; West, Steven J.

    2002-02-01

    To produce definitive and unambiguous results, any life detection experiment must make minimal assumptions about the nature of extraterrestrial life. The only criterion that fits this definition is the ability to reproduce and, in the process, create a disequilibrium in the chemical and redox environment. The Life Detection Array (LIDA), an instrument proposed for the 2007 NASA Mars Scout Mission, and in the future for the Jovian moons, enables such an experiment. LIDA responds to minute biogenic chemical and physical changes in two identical 'growth' chambers. The sensitivity is provided by two differentially monitored electrochemical sensor arrays. Growth in one of the chambers alters the chemistry and ionic properties and results in a signal. This life detection system makes minimal assumptions: that after the addition of water, the microorganism replicates and in the process produces small changes in its immediate surroundings by consuming, metabolizing, and excreting a number of molecules and/or ionic species. The experiment begins by placing a homogenized split sample of soil or water into each chamber, adding water if soil, sterilizing via high temperature, and equilibrating. In the absence of any microorganism in either chamber, no signal will be detected. The inoculation of one chamber with even a few microorganisms which reproduce will create a sufficient disequilibrium in the system (compared to the control) to be detectable. Replication of the experiment and positive results would lead to a definitive conclusion of biologically induced changes. The split sample and the nanogram inoculation eliminate chemistry as a causal agent.

  9. Environmental projects. Volume 16: Waste minimization assessment

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The Goldstone Deep Space Communications Complex (GDSCC), located in the Mojave Desert, is part of the National Aeronautics and Space Administration's (NASA's) Deep Space Network (DSN), the world's largest and most sensitive scientific telecommunications and radio navigation network. The Goldstone Complex is operated for NASA by the Jet Propulsion Laboratory. At present, activities at the GDSCC support the operation of nine parabolic dish antennas situated at five separate locations known as 'sites.' Each of the five sites at the GDSCC has one or more antennas, called 'Deep Space Stations' (DSS's). In the course of operation of these DSS's, various hazardous and non-hazardous wastes are generated. In 1992, JPL retained Kleinfelder, Inc., San Diego, California, to quantify the various streams of hazardous and non-hazardous wastes generated at the GDSCC. In June 1992, Kleinfelder, Inc., submitted a report to JPL entitled 'Waste Minimization Assessment.' This present volume is a JPL-expanded version of the Kleinfelder, Inc. report. The 'Waste Minimization Assessment' report did not find any deficiencies in the various waste-management programs now practiced at the GDSCC, and it found that these programs are being carried out in accordance with environmental rules and regulations.

  10. Flavored dark matter beyond Minimal Flavor Violation

    SciTech Connect

    Agrawal, Prateek; Blanke, Monika; Gemmler, Katrin

    2014-10-13

    We study the interplay of flavor and dark matter phenomenology for models of flavored dark matter interacting with quarks. We allow an arbitrary flavor structure in the coupling of dark matter with quarks. This coupling is assumed to be the only new source of violation of the Standard Model flavor symmetry extended by a U(3) χ associated with the dark matter. We call this ansatz Dark Minimal Flavor Violation (DMFV) and highlight its various implications, including an unbroken discrete symmetry that can stabilize the dark matter. As an illustration we study a Dirac fermionic dark matter χ which transforms as a triplet under U(3) χ , and is a singlet under the Standard Model. The dark matter couples to right-handed down-type quarks via a colored scalar mediator Φ with a coupling λ. We identify a number of “flavor-safe” scenarios for the structure of λ which are beyond Minimal Flavor Violation. Also, for dark matter and collider phenomenology we focus on the well-motivated case of b-flavored dark matter. Furthermore, the combined flavor and dark matter constraints on the parameter space of λ turn out to be interesting intersections of the individual ones. LHC constraints on simplified models of squarks and sbottoms can be adapted to our case, and monojet searches can be relevant if the spectrum is compressed.

  12. Utilization of biocatalysts in cellulose waste minimization

    SciTech Connect

    Woodward, J.; Evans, B.R.

    1996-09-01

    Cellulose, a polymer of glucose, is the principal component of biomass and, therefore, a major source of waste that is either buried or burned. Examples of biomass waste include agricultural crop residues, forestry products, and municipal wastes. Recycling of this waste is important for energy conservation as well as waste minimization, and there is some probability that in the future biomass could become a major energy source and replace the fossil fuels currently used for fuels and chemicals production. It has been estimated that in the United States, between 100 and 450 million dry tons of agricultural waste and approximately 6 million dry tons of animal waste are produced annually; of the 190 million tons of municipal solid waste (MSW) generated annually, approximately two-thirds is cellulosic in nature and over one-third is paper waste. Interestingly, more than 70% of MSW is landfilled or burned; however, landfill space is becoming increasingly scarce. On a smaller scale, important cellulosic products such as cellulose acetate also present waste problems; an estimated 43 thousand tons of cellulose ester waste are generated annually in the United States. Biocatalysts could be used in cellulose waste minimization, and this chapter describes their characteristics and potential in bioconversion and bioremediation processes.

  13. Gamma ray tests of Minimal Dark Matter

    NASA Astrophysics Data System (ADS)

    Cirelli, Marco; Hambye, Thomas; Panci, Paolo; Sala, Filippo; Taoso, Marco

    2015-10-01

    We reconsider the model of Minimal Dark Matter (a fermionic, hypercharge-less quintuplet of the EW interactions) and compute its gamma ray signatures. We compare them with a number of gamma ray probes: the galactic halo diffuse measurements, the galactic center line searches and recent dwarf galaxies observations. We find that the original minimal model, whose mass is fixed at 9.4 TeV by the relic abundance requirement, is constrained by the line searches from the Galactic Center: it is ruled out if the Milky Way possesses a cuspy profile such as NFW, but it is still allowed if it has a cored one. Observations of dwarf spheroidal galaxies are also relevant (in particular searches for lines), and ongoing astrophysical progress on these systems has the potential to eventually rule out the model. We also explore a wider mass range, which applies to the case in which the relic abundance requirement is relaxed. Most of our results can be safely extended to the larger class of multi-TeV WIMP DM annihilating into massive gauge bosons.

  14. Information technology security system engineering methodology

    NASA Technical Reports Server (NTRS)

    Childs, D.

    2003-01-01

    A methodology is described for system engineering security into large information technology systems under development. The methodology is an integration of a risk management process and a generic system development life cycle process. The methodology is to be used by Security System Engineers to effectively engineer and integrate information technology security into a target system as it progresses through the development life cycle. The methodology can also be used to re-engineer security into a legacy system.

  15. Methodologic frontiers in environmental epidemiology.

    PubMed Central

    Rothman, K J

    1993-01-01

    Environmental epidemiology comprises the epidemiologic study of those environmental factors that are outside the immediate control of the individual. Exposures of interest to environmental epidemiologists include air pollution, water pollution, and occupational exposure to physical and chemical agents, as well as psychosocial elements of environmental concern. The main methodologic problem in environmental epidemiology is exposure assessment, a problem that extends through all of epidemiologic research but looms as a towering obstacle in environmental epidemiology. One of the most promising developments for improving exposure assessment in environmental epidemiology is to find exposure biomarkers, which could serve as built-in dosimeters that reflect the biologic footprint left behind by environmental exposures. Beyond exposure assessment, epidemiologists studying environmental exposures face the difficulty of studying small effects that may be distorted by confounding that eludes easy control. This challenge may prompt reliance on new study designs, such as two-stage designs in which exposure and disease information are collected in the first stage, and covariate information is collected on a subset of subjects in stage two. While the analytic methods already available for environmental epidemiology are powerful, analytic methods for ecologic studies need further development. This workshop outlines the range of methodologic issues that environmental epidemiologists must address so that their work meets the goals set by scientists and society at large. PMID:8206029

  16. Modeling methodologies for intelligent systems

    SciTech Connect

    Li, X.

    1988-01-01

    Attempts are made to solve real-world problems by developing problem-solving paradigms using artificial intelligence (AI) technology. An important concept permeating the dissertation is the view that considers most AI issues as modeling tasks. Based on this concept, the dissertation is organized around the notion of model: model of a physical system, model of human mental knowledge, and model of the human learning process. Thus, the problem-solving paradigms developed are called modeling methodologies. These modeling methodologies, although developed for two specific systems, i.e., (1) a Power Distribution Training System, and (2) a Statistical Process Control Advisory System, address several fundamental issues in AI. Qualitative modeling techniques are used for modeling physical systems, and a generic architecture is proposed and implemented for building qualitative simulation models for a variety of distribution networks. A complete example in the domain of power distribution systems is given. A rule-based expert system is implemented for modeling the instructor and student in the model-based Power Distribution Training System.

  17. Thoughts on an Indigenous Research Methodology.

    ERIC Educational Resources Information Center

    Steinhauer, Evelyn

    2002-01-01

    Reviews writings of Indigenous scholars concerning the need for and nature of an Indigenous research methodology. Discusses why an Indigenous research methodology is needed; the importance of relational accountability in such a methodology; why Indigenous people must conduct Indigenous research; Indigenous knowledge and ways of knowing (including…

  18. 42 CFR 441.472 - Budget methodology.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Budget methodology. 441.472 Section 441.472 Public... Self-Directed Personal Assistance Services Program § 441.472 Budget methodology. (a) The State shall set forth a budget methodology that ensures service authorization resides with the State and meets...

  19. 24 CFR 904.205 - Training methodology.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 4 2011-04-01 2011-04-01 false Training methodology. 904.205... Training methodology. Equal in importance to the content of the pre- and post-occupancy training is the training methodology. Because groups vary, there should be adaptability in the communication and...

  20. 42 CFR 441.472 - Budget methodology.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 4 2011-10-01 2011-10-01 false Budget methodology. 441.472 Section 441.472 Public... Self-Directed Personal Assistance Services Program § 441.472 Budget methodology. (a) The State shall set forth a budget methodology that ensures service authorization resides with the State and meets...

  1. 24 CFR 904.205 - Training methodology.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Training methodology. 904.205... Training methodology. Equal in importance to the content of the pre- and post-occupancy training is the training methodology. Because groups vary, there should be adaptability in the communication and...

  2. 42 CFR 441.472 - Budget methodology.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 4 2014-10-01 2014-10-01 false Budget methodology. 441.472 Section 441.472 Public... Self-Directed Personal Assistance Services Program § 441.472 Budget methodology. (a) The State shall set forth a budget methodology that ensures service authorization resides with the State and meets...

  3. 42 CFR 441.472 - Budget methodology.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 4 2013-10-01 2013-10-01 false Budget methodology. 441.472 Section 441.472 Public... Self-Directed Personal Assistance Services Program § 441.472 Budget methodology. (a) The State shall set forth a budget methodology that ensures service authorization resides with the State and meets...

  4. 42 CFR 441.472 - Budget methodology.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 4 2012-10-01 2012-10-01 false Budget methodology. 441.472 Section 441.472 Public... Self-Directed Personal Assistance Services Program § 441.472 Budget methodology. (a) The State shall set forth a budget methodology that ensures service authorization resides with the State and meets...

  5. A rigorous testing methodology for control systems

    NASA Technical Reports Server (NTRS)

    Lewin, Andrew W.

    1991-01-01

    This paper discusses the development of a generalized verification testing methodology as applied to control systems. The methodology is based upon determining inputs that rigorously test each element of a system in order to verify that it has been specified properly. The methodology was successfully applied to the testing of the Boeing 737 autoland control system.

  6. 10 CFR 436.14 - Methodological assumptions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Methodological assumptions. 436.14 Section 436.14 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION FEDERAL ENERGY MANAGEMENT AND PLANNING PROGRAMS Methodology and Procedures for Life Cycle Cost Analyses § 436.14 Methodological assumptions. (a) Each Federal Agency shall discount to present values the...

  7. 10 CFR 436.14 - Methodological assumptions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Methodological assumptions. 436.14 Section 436.14 Energy DEPARTMENT OF ENERGY ENERGY CONSERVATION FEDERAL ENERGY MANAGEMENT AND PLANNING PROGRAMS Methodology and Procedures for Life Cycle Cost Analyses § 436.14 Methodological assumptions. (a) Each Federal Agency shall discount to present values the...

  8. Methodology and the Research-Practice Gap.

    ERIC Educational Resources Information Center

    Robinson, Viviane M. J.

    1998-01-01

    Addresses the mismatch between educational research methodologies and its application to generic features of practice and proposes a problem-based methodology that better links research with problem solving. Implications of this methodology are discussed from recent research on school tracking. (GR)

  9. Methodological quality of behavioural weight loss studies: a systematic review.

    PubMed

    Lemon, S C; Wang, M L; Haughton, C F; Estabrook, D P; Frisard, C F; Pagoto, S L

    2016-07-01

    This systematic review assessed the methodological quality of behavioural weight loss intervention studies conducted among adults and associations between quality and statistically significant weight loss outcome, strength of intervention effectiveness and sample size. Searches for trials published between January 2009 and December 2014 were conducted using PUBMED, MEDLINE and PSYCINFO and identified ninety studies. Methodological quality indicators included study design, anthropometric measurement approach, sample size calculations, intent-to-treat (ITT) analysis, loss to follow-up rate, missing data strategy, sampling strategy, report of treatment receipt and report of intervention fidelity (mean = 6.3). Indicators most commonly utilized included randomized design (100%), objectively measured anthropometrics (96.7%), ITT analysis (86.7%) and reporting treatment adherence (76.7%). Most studies (62.2%) had a follow-up rate > 75% and reported a loss to follow-up analytic strategy or minimal missing data (69.9%). Describing intervention fidelity (34.4%) and sampling from a known population (41.1%) were least common. Methodological quality was not associated with reporting a statistically significant result, effect size or sample size. This review found the published literature of behavioural weight loss trials to be of high quality for specific indicators, including study design and measurement. Areas identified for improvement include the use of more rigorous statistical approaches to loss to follow-up and better fidelity reporting. PMID:27071775

  10. New Methodology for Estimating Fuel Economy by Vehicle Class

    SciTech Connect

    Chin, Shih-Miao; Dabbs, Kathryn; Hwang, Ho-Ling

    2011-01-01

    This effort, conducted for the Office of Highway Policy Information, developed a new methodology to generate annual estimates of average fuel efficiency and the number of motor vehicles registered by vehicle class for Table VM-1 of the Highway Statistics annual publication. This paper describes the new methodology and compares the results of the existing manual method and the new systematic approach. The methodology takes a two-step approach. First, preliminary fuel efficiency rates are estimated based on vehicle stock models for different classes of vehicles. Then, a reconciliation model is used to adjust the initial fuel consumption rates from the vehicle stock models to match the VMT information for each vehicle class and the reported total fuel consumption. This reconciliation model utilizes a systematic approach that produces documentable and reproducible results. The basic framework utilizes a mathematical programming formulation to minimize the deviations between the fuel economy estimates published in the previous year's Highway Statistics and the results from the vehicle stock models, subject to the constraint that fuel consumption for the different vehicle classes must sum to the total fuel consumption estimate published in Table MF-21 of the current year's Highway Statistics. The results generated from this new approach provide a smoother time series for fuel economies by vehicle class. It also utilizes the most up-to-date and best available data with sound econometric models to generate MPG estimates by vehicle class.
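
    The reconciliation step described above can be illustrated with a simplified sketch. Assuming unweighted squared deviations (the actual model may weight classes differently), minimizing the sum of squared adjustments subject to the per-class values summing to the published total has a closed form: shift every class estimate by the same amount. The class names and all numbers below are hypothetical.

```python
def reconcile_fuel(prelim, total):
    """Least-squares reconciliation: minimally adjust preliminary per-class
    fuel-consumption estimates so they sum to the published total.
    Closed form of: minimize sum((f_i - f0_i)^2) s.t. sum(f_i) = total."""
    shift = (total - sum(prelim)) / len(prelim)
    return [f + shift for f in prelim]

# Hypothetical preliminary estimates (billions of gallons) for four classes:
prelim = [60.0, 25.0, 40.0, 50.0]          # sums to 175
adjusted = reconcile_fuel(prelim, 180.0)   # each class shifted by +1.25
# Reconciled MPG per class from (hypothetical) VMT in billions of miles:
mpg = [vmt / f for vmt, f in zip([900.0, 300.0, 700.0, 600.0], adjusted)]
```

The uniform shift is exactly the Lagrange-multiplier solution of the equality-constrained least-squares problem; a weighted variant would distribute the shift in proportion to each class's weight.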

  11. The minimal and the new minimal supersymmetric Grand Unified Theories on noncommutative space-time

    NASA Astrophysics Data System (ADS)

    Martín, C. P.

    2013-08-01

    We construct noncommutative versions of both the minimal and the new minimal supersymmetric Grand Unified Theories (GUTs). The enveloping-algebra formalism is used to carry out such constructions. The beautiful formulation of the Higgs sector of these noncommutative theories is a consequence of the fact that, in the GUTs at hand, the ordinary Higgs fields can be realized as elements of the Clifford algebra Cl₁₀(ℂ). In the noncommutative supersymmetric GUTs we formulate, supersymmetry is linearly realized by the noncommutative fields; but it is not realized by the ordinary fields that define those noncommutative fields via the Seiberg-Witten map.

  12. Minimal Intervention Dentistry – A New Frontier in Clinical Dentistry

    PubMed Central

    Bajwa, NK.; Pathak, A.

    2014-01-01

    Minimally invasive procedures are the new paradigm in health care. Everything from heart bypasses to gall bladder surgeries is being performed with these dynamic new techniques. Dentistry is joining this exciting revolution as well. Minimally invasive dentistry adopts a philosophy that integrates prevention, remineralisation and minimal intervention for the placement and replacement of restorations. It reaches the treatment objective using the least invasive surgical approach, with the removal of the minimal amount of healthy tissue. This paper reviews in brief the concept of minimal intervention in dentistry. PMID:25177659

  13. Minimizing water consumption when producing hydropower

    NASA Astrophysics Data System (ADS)

    Leon, A. S.

    2015-12-01

    In 2007, hydropower accounted for only 16% of world electricity production, with other renewable sources totaling 3%. Thus, it is not surprising that when alternatives are evaluated for new energy developments, there is a strong pull toward fossil fuel or nuclear energy as opposed to renewable sources. However, as hydropower schemes are often part of a multipurpose water resources development project, they can often help to finance other components of the project. In addition, hydropower systems and their associated dams and reservoirs provide human well-being benefits, such as flood control and irrigation, and societal benefits such as increased recreational activities and improved navigation. Furthermore, hydropower, due to its associated reservoir storage, can provide flexibility and reliability for energy production in integrated energy systems. The storage capability of hydropower systems acts as a regulating mechanism by which other intermittent and variable renewable energy sources (wind, wave, solar) can play a larger role in providing electricity of commercial quality. Minimizing water consumption for producing hydropower is critical, given that overuse of water for energy production may result in a shortage of water for other purposes such as irrigation, navigation or fish passage. This paper presents a dimensional analysis for finding the optimal flow discharge and optimal penstock diameter when designing impulse and reaction water turbines for hydropower systems. The objective of this analysis is to provide general insights for minimizing water consumption when producing hydropower. The analysis is based on the geometric and hydraulic characteristics of the penstock, the total hydraulic head and the desired power production. As part of this analysis, various dimensionless relationships between power production, flow discharge and head losses were derived. These relationships were used to draw general insights on determining optimal flow discharge and
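
    The discharge/head-loss trade-off the abstract alludes to can be sketched with a textbook friction-loss model; this is not the paper's actual formulation. Assuming Darcy-Weisbach losses with a constant friction factor, delivered power P = η ρ g Q (H − h_f) is maximized at the discharge where the friction loss h_f equals one third of the gross head H, a classical result the sketch recovers numerically. All parameter values are hypothetical.

```python
import math

def hydro_power(Q, H, D, L, f=0.02, eta=0.85, rho=1000.0, g=9.81):
    """Delivered power [W] for discharge Q [m^3/s] through a penstock of
    diameter D and length L, with Darcy-Weisbach friction head loss."""
    v = 4.0 * Q / (math.pi * D**2)           # mean penstock velocity
    h_f = f * (L / D) * v**2 / (2.0 * g)     # friction head loss [m]
    return eta * rho * g * Q * (H - h_f)

# Scan discharges for a 100 m head, 1 m diameter, 500 m long penstock:
H, D, L = 100.0, 1.0, 500.0
best_Q = max((0.001 * i for i in range(1, 20000)),
             key=lambda q: hydro_power(q, H, D, L))
```

At `best_Q` the head loss sits at H/3: pushing more water through wastes head to friction, which is one way to see why maximizing power and minimizing water consumption per unit energy pull in different directions.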

  14. Minimally Informative Prior Distributions for PSA

    SciTech Connect

    Dana L. Kelly; Robert W. Youngblood; Kurt G. Vedros

    2010-06-01

    A salient feature of Bayesian inference is its ability to incorporate information from a variety of sources into the inference model, via the prior distribution (hereafter simply “the prior”). However, over-reliance on old information can lead to priors that dominate new data. Some analysts seek to avoid this by trying to work with a minimally informative prior distribution. Another reason for choosing a minimally informative prior is to avoid the often-voiced criticism of subjectivity in the choice of prior. Minimally informative priors fall into two broad classes: 1) so-called noninformative priors, which attempt to be completely objective, in that the posterior distribution is determined as completely as possible by the observed data, the most well known example in this class being the Jeffreys prior, and 2) priors that are diffuse over the region where the likelihood function is nonnegligible, but that incorporate some information about the parameters being estimated, such as a mean value. In this paper, we compare four approaches in the second class, with respect to their practical implications for Bayesian inference in Probabilistic Safety Assessment (PSA). The most commonly used such prior, the so-called constrained noninformative prior, is a special case of the maximum entropy prior. This is formulated as a conjugate distribution for the most commonly encountered aleatory models in PSA, and is correspondingly mathematically convenient; however, it has a relatively light tail and this can cause the posterior mean to be overly influenced by the prior in updates with sparse data. A more informative prior that is capable, in principle, of dealing more effectively with sparse data is a mixture of conjugate priors. A particular diffuse nonconjugate prior, the logistic-normal, is shown to behave similarly for some purposes. Finally, we review the so-called robust prior. Rather than relying on the mathematical abstraction of entropy, as does the constrained
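
    As context for the conjugate updates discussed above, here is a minimal sketch of a gamma-Poisson update, the most commonly encountered aleatory model in PSA. The event count and exposure time are hypothetical, and the Jeffreys prior Gamma(0.5, 0) is used purely as an illustration of a noninformative choice, not as the constrained noninformative prior the abstract analyzes.

```python
def gamma_poisson_posterior(alpha0, beta0, events, exposure_time):
    """Conjugate update: Gamma(alpha0, beta0) prior on a Poisson rate,
    observed `events` in `exposure_time`, yields a Gamma posterior."""
    return alpha0 + events, beta0 + exposure_time

# Jeffreys prior for a Poisson rate is Gamma(0.5, 0) (improper);
# hypothetical data: 2 events in 100 reactor-years.
a_post, b_post = gamma_poisson_posterior(0.5, 0.0, events=2, exposure_time=100.0)
post_mean = a_post / b_post   # posterior mean failure rate per reactor-year
```

With sparse data the half-count pseudo-event contributed by the prior visibly pulls the posterior mean above the raw rate 2/100, which is the kind of prior influence the abstract's comparison of heavier-tailed alternatives is about.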

  15. Simulating granular materials by energy minimization

    NASA Astrophysics Data System (ADS)

    Krijgsman, D.; Luding, S.

    2016-03-01

    Discrete element methods are extremely helpful in understanding the complex behaviors of granular media, as they give valuable insight into all internal variables of the system. In this paper, a novel discrete element method for performing simulations of granular media is presented, based on the minimization of the potential energy in the system. Contrary to most discrete element methods (i.e., the soft-particle method, the event-driven method, and non-smooth contact dynamics), the system does not evolve by (approximately) integrating Newton's equations of motion in time, but rather by searching for mechanical equilibrium solutions for the positions of all particles in the system, which is mathematically equivalent to locally minimizing the potential energy. The new method allows for the rapid creation of jammed initial conditions (to be used for further studies) and for the simulation of quasi-static deformation problems. The major advantage of the new method is that it allows for truly static deformations. The system does not evolve with time, but rather with the externally applied strain or load, so that there is no kinetic energy in the system, in contrast to other quasi-static methods. The performance of the algorithm for both types of applications is tested by examining the number of iterations required for the system to converge to a stable solution. For each single iteration, the required computational effort scales linearly with the number of particles. During the process of creating initial conditions, the required number of iterations for two-dimensional systems scales with the square root of the number of particles in the system. The required number of iterations increases for systems closer to the jamming packing fraction. For a quasi-static pure shear deformation simulation, the results of the new method are validated by regular soft-particle dynamics simulations.
The energy minimization algorithm is able to capture the evolution of the
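
    The core idea, relaxing particle positions toward a potential-energy minimum instead of integrating equations of motion, can be sketched with a toy two-disk system. The harmonic overlap potential, the plain gradient-descent minimizer, and all parameter values below are illustrative assumptions, not the authors' algorithm.

```python
import math

def pair_energy_and_grad(pos, radii, k=1.0):
    """Harmonic contact energy E = sum of 0.5*k*overlap^2 over all pairs,
    plus its gradient with respect to every 2-D particle position."""
    n = len(radii)
    grad = [[0.0, 0.0] for _ in range(n)]
    energy = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            dx = pos[i][0] - pos[j][0]
            dy = pos[i][1] - pos[j][1]
            d = math.hypot(dx, dy)
            overlap = radii[i] + radii[j] - d
            if overlap > 0 and d > 0:
                energy += 0.5 * k * overlap**2
                gx = -k * overlap * dx / d   # dE/dx_i
                gy = -k * overlap * dy / d
                grad[i][0] += gx; grad[i][1] += gy
                grad[j][0] -= gx; grad[j][1] -= gy
    return energy, grad

def minimize_energy(pos, radii, step=0.1, tol=1e-12, max_iter=10000):
    """Plain gradient descent toward a mechanical-equilibrium configuration
    (zero contact energy = no residual overlaps)."""
    for _ in range(max_iter):
        energy, grad = pair_energy_and_grad(pos, radii)
        if energy < tol:
            break
        pos = [[p[0] - step * g[0], p[1] - step * g[1]]
               for p, g in zip(pos, grad)]
    return pos, pair_energy_and_grad(pos, radii)[0]

# Two unit disks overlapping by 0.5 relax until they barely touch.
pos, energy = minimize_energy([[0.0, 0.0], [1.5, 0.0]], [1.0, 1.0])
```

A production implementation would use a quasi-Newton or conjugate-gradient search and neighbor lists, but the fixed-step descent is enough to show that equilibrium positions emerge without any time integration or kinetic energy.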

  16. Minimally invasive surgery for thyroid eye disease

    PubMed Central

    Naik, Milind Neilkant; Nair, Akshay Gopinathan; Gupta, Adit; Kamal, Saurabh

    2015-01-01

    Thyroid eye disease (TED) can affect the eye in myriad ways: proptosis, strabismus, eyelid retraction, optic neuropathy, soft tissue changes around the eye and an unstable ocular surface. TED consists of two phases: active, and inactive. The active phase of TED is limited to a period of 12–18 months and is mainly managed medically with immunosuppression. The residual structural changes due to the resultant fibrosis are usually addressed with surgery, the mainstay of which is orbital decompression. These surgeries are performed during the inactive phase. The surgical rehabilitation of TED has evolved over the years: not only the surgical techniques, but also the concepts, and the surgical tools available. The indications for decompression surgery have also expanded in the recent past. This article discusses the technological and conceptual advances of minimally invasive surgery for TED that decrease complications and speed up recovery. Current surgical techniques offer predictable, consistent results with better esthetics. PMID:26669337

  17. JSC Metal Finishing Waste Minimization Methods

    NASA Technical Reports Server (NTRS)

    Sullivan, Erica

    2003-01-01

    The paper discusses the following: Johnson Space Center (JSC) has achieved VPP Star status and is ISO 9001 compliant. The Structural Engineering Division in the Engineering Directorate is responsible for operating the metal finishing facility at JSC. The Engineering Directorate is responsible for $71.4 million of space flight hardware design, fabrication, and testing. The JSC Metal Finishing Facility processes flight hardware to support the programs, in particular schedule- and mission-critical flight hardware. The facility is operated by Rothe Joint Venture and provides the following processes: anodizing, alodining, passivation, and pickling. It was completely rebuilt in 1998 at a total cost of $366,000, with all new tanks, electrical, plumbing, and ventilation installed, and was designed to meet modern safety, environmental, and quality requirements while minimizing contamination and providing the highest quality finishes.

  18. Minimally refined biomass fuels: an economic shortcut

    SciTech Connect

    Pearson, R.K.; Hirschfeld, T.B.

    1980-07-01

    An economic shortcut can be realized if the sugars from which ethanol is made are utilized directly as concentrated aqueous solutions for fuels rather than being further refined through fermentation and distillation steps. Simple evaporation of carbohydrate solutions from sugar cane or sweet sorghum, or from hydrolysis of the starch or cellulose content of many plants, yields potential liquid fuels with energy contents (on a volume basis) comparable to highly refined liquid fuels like methanol and ethanol. The potential utilization of such minimally refined biomass-derived fuels is discussed, and the burning of sucrose-ethanol-water solutions in a small modified domestic burner is demonstrated. Other potential uses of sugar solutions or emulsions and microemulsions in fuel oils for use in diesel or turbine engines are proposed and discussed.

  19. Linear functional minimization for inverse modeling

    NASA Astrophysics Data System (ADS)

    Barajas-Solano, D. A.; Wohlberg, B. E.; Vesselinov, V. V.; Tartakovsky, D. M.

    2015-06-01

    We present a novel inverse modeling strategy to estimate spatially distributed parameters of nonlinear models. The maximum a posteriori (MAP) estimators of these parameters are based on a likelihood functional, which contains spatially discrete measurements of the system parameters and spatiotemporally discrete measurements of the transient system states. The piecewise continuity prior for the parameters is expressed via Total Variation (TV) regularization. The MAP estimator is computed by minimizing a nonquadratic objective equipped with the TV operator. We apply this inversion algorithm to estimate hydraulic conductivity of a synthetic confined aquifer from measurements of conductivity and hydraulic head. The synthetic conductivity field is composed of a low-conductivity heterogeneous intrusion into a high-conductivity heterogeneous medium. Our algorithm accurately reconstructs the location, orientation, and extent of the intrusion from the steady-state data only. Addition of transient measurements of hydraulic head improves the parameter estimation, accurately reconstructing the conductivity field in the vicinity of observation locations.
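
    A toy version of the TV-regularized MAP estimation described above, reduced to one dimension with a smoothed TV term and plain gradient descent (the regularization weight, smoothing parameter eps, and step size are illustrative assumptions, not values from the paper):

```python
import numpy as np

def tv_map_estimate(data, lam=0.5, eps=1e-2, step=0.02, n_iter=2000):
    """Gradient descent on the nonquadratic objective
    J(u) = 0.5 * ||u - data||^2 + lam * sum_i sqrt((u[i+1]-u[i])^2 + eps),
    a 1-D smoothed Total Variation functional (eps rounds off the kink
    at zero so plain gradient descent applies)."""
    u = data.astype(float).copy()
    for _ in range(n_iter):
        du = np.diff(u)
        w = du / np.sqrt(du**2 + eps)   # derivative of the smoothed |du| terms
        grad_tv = np.zeros_like(u)
        grad_tv[:-1] -= w               # each term depends on u[i] ...
        grad_tv[1:] += w                # ... and on u[i+1]
        u -= step * ((u - data) + lam * grad_tv)
    return u
```

    On a noisy piecewise-constant signal the TV prior suppresses the noise while preserving the jump, which is exactly why it suits piecewise-continuous parameter fields such as the conductivity intrusion described above.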

  20. Intravital microscopy of the lung: minimizing invasiveness.

    PubMed

    Fiole, Daniel; Tournier, Jean-Nicolas

    2016-09-01

    In vivo microscopy has recently become a gold standard in lung immunology studies involving small animals, largely benefiting from the democratization of multiphoton microscopy allowing for deep tissue imaging. This technology currently represents our only way of exploring the lungs and inferring what happens in human respiratory medicine. The interest of lung in vivo microscopy essentially relies upon its relevance as a study model, fulfilling physiological requirements in comparison with in vitro and ex vivo experiments. However, strategies developed in order to overcome movements of the thorax caused by breathing and heartbeats remain the chief drawback of the technique and a major source of invasiveness. In this context, minimizing invasiveness is an unavoidable prerequisite for any improvement of lung in vivo microscopy. This review puts into perspective the main techniques enabling lung in vivo microscopy, providing pros and cons regarding invasiveness. PMID:26846880

  1. Error minimizing algorithms for nearest neighbor classifiers

    SciTech Connect

    Porter, Reid B; Hush, Don; Zimmer, G. Beate

    2011-01-03

    Stack Filters define a large class of discrete nonlinear filters first introduced in image and signal processing for noise removal. In recent years we have suggested their application to classification problems, and investigated their relationship to other types of discrete classifiers such as Decision Trees. In this paper we focus on a continuous-domain version of Stack Filter Classifiers which we call Ordered Hypothesis Machines (OHM), and investigate their relationship to Nearest Neighbor classifiers. We show that OHM classifiers provide a novel framework in which to train Nearest Neighbor type classifiers by minimizing empirical-error-based loss functions. We use the framework to investigate a new cost-sensitive loss function that allows us to train a Nearest Neighbor type classifier for low false alarm rate applications. We report results on both synthetic data and real-world image data.
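
    The cost-sensitive loss idea can be illustrated independently of the OHM framework itself; the sketch below pairs a plain 1-nearest-neighbor predictor with an asymmetric empirical loss that penalizes false alarms more heavily (the cost values are hypothetical, not from the paper):

```python
import numpy as np

def nn_predict(X_train, y_train, X):
    """Plain 1-nearest-neighbor prediction (Euclidean distance)."""
    preds = []
    for x in X:
        i = np.argmin(np.linalg.norm(X_train - x, axis=1))
        preds.append(y_train[i])
    return np.array(preds)

def cost_sensitive_loss(y_true, y_pred, c_fa=5.0, c_miss=1.0):
    """Average asymmetric empirical loss: a false alarm (predict 1 on a
    true 0) costs c_fa, a miss (predict 0 on a true 1) costs c_miss.
    The cost values here are hypothetical."""
    fa = np.sum((y_pred == 1) & (y_true == 0))
    miss = np.sum((y_pred == 0) & (y_true == 1))
    return (c_fa * fa + c_miss * miss) / len(y_true)
```

    Training then amounts to choosing the classifier (e.g. which prototypes to keep) that minimizes this loss on held-out data, which biases the decision toward fewer false alarms.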

  2. Endoscopic navigation for minimally invasive suturing.

    PubMed

    Wengert, Christian; Bossard, Lukas; Häberling, Armin; Baur, Charles; Székely, Gábor; Cattin, Philippe C

    2007-01-01

    Manipulating small objects such as needles, screws or plates inside the human body during minimally invasive surgery can be very difficult for less experienced surgeons, due to the loss of 3D depth perception. This paper presents an approach for tracking a suturing needle using a standard endoscope. The resulting pose information of the needle is then used to generate artificial 3D cues on the 2D screen to optimally support surgeons during tissue suturing. Additionally, if an external tracking device is provided to report the endoscope's position, the suturing needle can be tracked in a hybrid fashion with sub-millimeter accuracy. Finally, a visual navigation aid can be incorporated, if a 3D surface is intraoperatively reconstructed from video or registered from preoperative imaging. PMID:18044620

  3. Minimally invasive procedures for neuropathic pain.

    PubMed

    Sdrulla, Andrei; Chen, Grace

    2016-04-01

    Neuropathic pain is "pain arising as a direct consequence of a lesion or disease affecting the somatosensory system". The prevalence of neuropathic pain ranges from 7 to 11% of the population and minimally invasive procedures have been used to both diagnose and treat neuropathic pain. Diagnostic procedures consist of nerve blocks aimed to isolate the peripheral nerve implicated, whereas therapeutic interventions either modify or destroy nerve function. Procedures that modify how nerves function include epidural steroid injections, peripheral nerve blocks and sympathetic nerve blocks. Neuroablative procedures include radiofrequency ablation, cryoanalgesia and neurectomies. Currently, neuromodulation with peripheral nerve stimulators and spinal cord stimulators are the most evidence-based treatments of neuropathic pain. PMID:26988024

  4. Minimal five dimensional supergravities and complex geometries

    SciTech Connect

    Herdeiro, Carlos A. R.

    2010-07-28

    We discuss the relation between solutions admitting Killing spinors of minimal supergravities in five dimensions, both timelike and null, and complex geometries. For the timelike solutions the results may be summarised as follows. In the ungauged case (vanishing cosmological constant, Λ = 0) the solutions are determined in terms of a hyper-Kaehler base space; in the gauged case (Λ < 0) the complex geometry is Kaehler; in the de Sitter case (Λ > 0) the complex geometry is hyper-Kaehler with torsion (HKT). For the null solutions we shall focus on the de Sitter case, for which the solutions are determined by a constrained Einstein-Weyl 3-geometry called Gauduchon-Tod space. The method for constructing explicit solutions is discussed in each case.

  5. Minimal models for axion and neutrino

    NASA Astrophysics Data System (ADS)

    Ahn, Y. H.; Chun, Eung Jin

    2016-01-01

    The PQ mechanism resolving the strong CP problem and the seesaw mechanism explaining the smallness of neutrino masses may be related in a way that the PQ symmetry breaking scale and the seesaw scale arise from a common origin. Depending on how the PQ symmetry and the seesaw mechanism are realized, one has different predictions on the color and electromagnetic anomalies which could be tested in the future axion dark matter search experiments. Motivated by this, we construct various PQ seesaw models which are minimally extended from the (non-) supersymmetric Standard Model and thus set up different benchmark points on the axion-photon-photon coupling in comparison with the standard KSVZ and DFSZ models.

  6. Injectable biomaterials for minimally invasive orthopedic treatments.

    PubMed

    Jayabalan, M; Shalumon, K T; Mitha, M K

    2009-06-01

    A biodegradable and injectable hydroxy-terminated poly(propylene fumarate) (HT-PPF) bone cement was developed. The injectable formulation, consisting of HT-PPF with the comonomer n-vinyl pyrrolidone, calcium phosphate filler, a free radical catalyst, an accelerator, and a radiopaque agent, sets rapidly to a hard mass with a low exothermic temperature. The candidate bone cement attains mechanical strength exceeding the required compressive strength of 5 MPa and compressive modulus of 50 MPa. The resin elicits cell adhesion and cytoplasmic spreading of osteoblast cells. The cured cement does not induce intracutaneous irritation or skin sensitization, and is tissue compatible without eliciting any adverse tissue reactions. It is osteoconductive and osteoinductive, allowing osteointegration and bone remodeling. HT-PPF bone cement is therefore a candidate for minimally invasive radiological procedures for the treatment of bone diseases and spinal compression fractures. PMID:19160023

  7. A minimal fate-selection switch.

    PubMed

    Weinberger, Leor S

    2015-12-01

    To preserve fitness in unpredictable, fluctuating environments, a range of biological systems probabilistically generate variant phenotypes--a process often referred to as 'bet-hedging', after the financial practice of diversifying assets to minimize risk in volatile markets. The molecular mechanisms enabling bet-hedging have remained elusive. Here, we review how HIV makes a bet-hedging decision between active replication and proviral latency, a long-lived dormant state that is the chief barrier to an HIV cure. The discovery of a virus-encoded bet-hedging circuit in HIV revealed an ancient evolutionary role for latency and identified core regulatory principles, such as feedback and stochastic 'noise', that enable cell-fate decisions. These core principles were later extended to fate selection in stem cells and cancer, exposed new therapeutic targets for HIV, and led to a potentially broad strategy of using 'noise modulation' to redirect cell fate. PMID:26611210

  8. The minimal work cost of information processing.

    PubMed

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-01-01

    Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes the development of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditional to the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics. PMID:26151678
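
    The stated result, minimal work equal to the entropy of the discarded information conditional on the output (times k_B T ln 2), can be evaluated numerically for a small example; the joint distribution below, erasure of a uniform bit, is a standard illustration rather than a case taken from the paper:

```python
import math
from collections import defaultdict

k_B = 1.380649e-23  # Boltzmann constant, J/K

def conditional_entropy(joint):
    """H(D|O) in bits, from a joint distribution {(d, o): probability}
    over the discarded information d and the computation's output o."""
    p_out = defaultdict(float)
    for (_, o), p in joint.items():
        p_out[o] += p
    H = 0.0
    for (d, o), p in joint.items():
        if p > 0:
            H -= p * math.log2(p / p_out[o])
    return H

def landauer_min_work(joint, T=300.0):
    """Minimal average work cost: k_B * T * ln(2) * H(discarded | output)."""
    return k_B * T * math.log(2) * conditional_entropy(joint)
```

    Erasing a uniform bit (both inputs mapped to output 0) gives H = 1 bit and the familiar k_B T ln 2 bound, while a logically reversible process (the output determines the discarded data) costs nothing.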

  9. Minimizing Reheat Energy Use in Laboratories

    SciTech Connect

    Frenze, David; Mathew, Paul; Morehead, Michael; Sartor, Dale; Starr Jr., William

    2005-11-29

    HVAC systems that are designed without properly accounting for equipment load variation across laboratory spaces in a facility can significantly increase simultaneous heating and cooling, particularly for systems that use zone reheat for temperature control. This best practice guide describes the problem of simultaneous heating and cooling resulting from load variations, and presents several technological and design process strategies to minimize it. This guide is one in a series created by the Laboratories for the 21st century ('Labs21') program, a joint program of the U.S. Environmental Protection Agency and U.S. Department of Energy. Geared towards architects, engineers, and facilities managers, these guides provide information about technologies and practices to use in designing, constructing, and operating safe, sustainable, high-performance laboratories.

  10. Constraints on grip-selection: minimizing awkwardness.

    PubMed

    Fischman, M G

    1998-02-01

    In picking up and manipulating an object, the selection of an initial grip (overhand versus underhand) often depends on how comfortable the hand and arm will be at the end of the movement. This effect has been called "end-state comfort" and seems to be an important constraint in grip-selection. The present experiment further explored this effect by selecting a task that would ensure a comfortable ending position regardless of the initial choice of grip. 206 undergraduates picked up a cardboard paper-towel roll from a horizontal position and placed one end down on a table. Analysis showed a clear preference for the overhand grip, as 78% of the participants chose this grip. In addition, more women preferred the overhand grip than men. The findings indicate that people may be sensitive to minimizing awkwardness in both terminal and initial positions. PMID:9530757

  11. LHC prospects for minimal decaying dark matter

    SciTech Connect

    Arcadi, Giorgio; Covi, Laura; Dradi, Federico E-mail: laura.covi@theorie.physik.uni-goettingen.de

    2014-10-01

    We study the possible signals at LHC of the minimal models of decaying dark matter. These models are characterized by the fact that DM interacts with SM particles through a renormalizable coupling with an additional heavier charged state. Such an interaction allows the production of a substantial abundance of DM in the early Universe via the decay of the charged heavy state, either in- or out-of-equilibrium. Moreover, additional couplings of the charged particle open up decay channels for the DM, which can nevertheless be sufficiently long-lived to be a good DM candidate and within reach of future Indirect Detection observations. We compare the cosmologically favored parameter regions to the LHC discovery reach and discuss the possibility of simultaneous detection of DM decay in Indirect Detection.

  12. New identities between unitary minimal Virasoro characters

    NASA Astrophysics Data System (ADS)

    Taormina, Anne

    1994-10-01

    Two sets of identities between unitary minimal Virasoro characters at levels m=3, 4, 5 are presented and proven. The first identity suggests a connection between the Ising and the tricritical Ising models, since the m=3 Virasoro characters are obtained as bilinears of m=4 Virasoro characters. The second identity gives the tricritical Ising model characters as bilinears in the Ising model characters and the six combinations of m=5 Virasoro characters which do not appear in the spectrum of the three-state Potts model. The implication of these identities for the study of the branching rules of N=4 superconformal characters into \widehat{SU(2)} × \widehat{SU(2)} characters is discussed.

  13. Convex Lower Bounds for Free Energy Minimization

    NASA Astrophysics Data System (ADS)

    Moussa, Jonathan

    We construct lower bounds on free energy with convex relaxations from the nonlinear minimization over probabilities to linear programs over expectation values. Finite-temperature expectation values are further resolved into distributions over energy. A superset of valid expectation values is delineated by an incomplete set of linear constraints. Free energy bounds can be improved systematically by adding constraints, which also increases their computational cost. We compute several free energy bounds of increasing accuracy for the triangular-lattice Ising model to assess the utility of this method. This work was supported by the Laboratory Directed Research and Development program at Sandia National Laboratories. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000.

  14. Minimally disruptive schedule repair for MCM missions

    NASA Astrophysics Data System (ADS)

    Molineaux, Matthew; Auslander, Bryan; Moore, Philip G.; Gupta, Kalyan M.

    2015-05-01

    Mine countermeasures (MCM) missions entail planning and operations in very dynamic and uncertain operating environments, which pose considerable risk to personnel and equipment. Frequent schedule repairs are needed that consider the latest operating conditions to keep the mission on target. Presently no decision support tools are available for the challenging task of MCM mission rescheduling. To address this capability gap, we have developed the CARPE system to assist operation planners. CARPE constantly monitors the operational environment for changes and recommends alternative repaired schedules in response. It includes a novel schedule repair algorithm called Case-Based Local Schedule Repair (CLOSR) that automatically repairs broken schedules while satisfying the requirement of minimal operational disruption. It uses a case-based approach to represent repair strategies and apply them to new situations. Evaluation of CLOSR on simulated MCM operations demonstrates the effectiveness of the case-based strategy. Schedule repairs are generated rapidly, ensure the elimination of all mines, and achieve required levels of clearance.

  15. Minimally invasive training in urologic oncology.

    PubMed

    Liu, Jen-Jane; Gonzalgo, Mark L

    2011-11-01

    Use of minimally invasive surgical (MIS) techniques continues to expand in the field of urologic oncology; however, proficiency in these techniques is subject to a learning curve. Current training paradigms have incorporated MIS, but in a non-standardized fashion. Residency work-hour restrictions and ethical concerns may influence efforts to deliver adequate training during a defined residency period. Post-residency fellowships or mini-courses may help urologists gain proficiency in these skills, but are time-consuming and may not provide adequate exposure. Surgical simulation with dry labs and augmentation with virtual reality are important adjuncts to operative training for MIS. The urologic oncologist must be familiar with open and MIS techniques to effectively treat cancer in the least morbid way possible and adapt to the ever-changing field of MIS with dynamic training paradigms. PMID:22155873

  16. Minimal residual method stronger than polynomial preconditioning

    SciTech Connect

    Faber, V.; Joubert, W.; Knill, E.

    1994-12-31

    Two popular methods for solving symmetric and nonsymmetric systems of equations are the minimal residual method, implemented by algorithms such as GMRES, and polynomial preconditioning methods. In this study results are given on the convergence rates of these methods for various classes of matrices. It is shown that for some matrices, such as normal matrices, the convergence rates for GMRES and for the optimal polynomial preconditioning are the same, and for other matrices such as the upper triangular Toeplitz matrices, it is at least assured that if one method converges then the other must converge. On the other hand, it is shown that matrices exist for which restarted GMRES always converges but any polynomial preconditioning of corresponding degree makes no progress toward the solution for some initial error. The implications of these results for these and other iterative methods are discussed.
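
    The minimal residual method discussed here can, in its GMRES form, be sketched in a few lines: build an orthonormal Krylov basis with Arnoldi, then minimize the residual norm over that subspace by least squares (a textbook unrestarted variant with x0 = 0 assumed, not code from the study):

```python
import numpy as np

def gmres_simple(A, b, m=None):
    """Textbook unrestarted GMRES (x0 = 0): build an orthonormal Krylov
    basis with Arnoldi, then minimize the residual norm over that subspace."""
    n = len(b)
    m = m or n
    Q = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    beta = np.linalg.norm(b)
    Q[:, 0] = b / beta
    for j in range(m):
        v = A @ Q[:, j]
        for i in range(j + 1):          # modified Gram-Schmidt
            H[i, j] = Q[:, i] @ v
            v = v - H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(v)
        if H[j + 1, j] < 1e-14:         # lucky breakdown: exact solution found
            m = j + 1
            break
        Q[:, j + 1] = v / H[j + 1, j]
    e1 = np.zeros(m + 1)
    e1[0] = beta                        # rhs of the small least-squares problem
    y, *_ = np.linalg.lstsq(H[:m + 1, :m], e1, rcond=None)
    return Q[:, :m] @ y
```

    With m = n and exact arithmetic this returns the exact solution; practical GMRES restarts or truncates, which is where the convergence-rate comparisons of the study become relevant.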

  17. Minimal Increase Network Coding for Dynamic Networks

    PubMed Central

    Wu, Yanxia

    2016-01-01

    Because of the mobility, computing power and changeable topology of dynamic networks, it is difficult for random linear network coding (RLNC) in static networks to satisfy the requirements of dynamic networks. To alleviate this problem, a minimal increase network coding (MINC) algorithm is proposed. By identifying the nonzero elements of an encoding vector, it selects blocks to be encoded on the basis of the relationship between the nonzero elements, which controls changes in the degrees of the blocks; the encoding time is thereby shortened in a dynamic network. The results of simulations show that, compared with existing encoding algorithms, the MINC algorithm provides reduced computational complexity of encoding and an increased probability of delivery. PMID:26867211

  18. Minimal Increase Network Coding for Dynamic Networks.

    PubMed

    Zhang, Guoyin; Fan, Xu; Wu, Yanxia

    2016-01-01

    Because of the mobility, computing power and changeable topology of dynamic networks, it is difficult for random linear network coding (RLNC) in static networks to satisfy the requirements of dynamic networks. To alleviate this problem, a minimal increase network coding (MINC) algorithm is proposed. By identifying the nonzero elements of an encoding vector, it selects blocks to be encoded on the basis of the relationship between the nonzero elements, which controls changes in the degrees of the blocks; the encoding time is thereby shortened in a dynamic network. The results of simulations show that, compared with existing encoding algorithms, the MINC algorithm provides reduced computational complexity of encoding and an increased probability of delivery. PMID:26867211
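
    The GF(2) encode/decode machinery underlying RLNC-style schemes such as MINC can be sketched as follows; this illustrates the role of the nonzero elements of an encoding vector, not the MINC block-selection rule itself (decoding assumes a full-rank coefficient matrix):

```python
import numpy as np

def encode_gf2(blocks, coeffs):
    """XOR-combine the byte blocks selected by a binary encoding vector;
    the nonzero elements of `coeffs` determine which blocks enter the packet."""
    out = np.zeros_like(blocks[0])
    for c, blk in zip(coeffs, blocks):
        if c:
            out = out ^ blk
    return out

def decode_gf2(coded, C):
    """Gauss-Jordan elimination over GF(2) on the coefficient matrix C,
    mirroring every row operation on the coded payloads (assumes C has
    full column rank, i.e. enough innovative packets were received)."""
    C = C.copy() % 2
    coded = [c.copy() for c in coded]
    n = C.shape[1]
    for col in range(n):
        piv = next(r for r in range(col, len(coded)) if C[r, col])
        C[[col, piv]] = C[[piv, col]]          # swap pivot row into place
        coded[col], coded[piv] = coded[piv], coded[col]
        for r in range(len(coded)):
            if r != col and C[r, col]:
                C[r] ^= C[col]                 # eliminate, mirroring on payloads
                coded[r] ^= coded[col]
    return coded[:n]
```

    A receiver that collects packets whose encoding vectors form a full-rank matrix can recover every original block this way, regardless of which combinations were sent.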

  19. The minimal work cost of information processing

    PubMed Central

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-01-01

    Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes the development of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditional to the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics. PMID:26151678

  20. Design and Demonstration of Minimal Lunar Base

    NASA Astrophysics Data System (ADS)

    Boche-Sauvan, L.; Foing, B. H.; Exohab Team

    2009-04-01

    Introduction: We propose a conceptual analysis of a first minimal lunar base, focusing on the system aspects and coordinating the different parts as part of an evolving architecture [1-3]. We justify the case for a scientific outpost allowing experiments and sample analysis in a laboratory (relevant to the origin and evolution of the Earth, geophysical and geochemical studies of the Moon, life sciences, and observation from the Moon). Research: Research activities will be conducted with this first settlement in: - science (of, from and on the Moon), - exploration (robotic mobility, rover, drilling), - technology (communication, command, organisation, automatism). Life sciences. The life sciences aspects are considered through life support for a crew of 4 (habitat) and laboratory activity with biological experiments like those performed on Earth or in LEO, but here without any magnetosphere protection and therefore with direct cosmic-ray and solar-particle effects. Moreover, the ability to study the lunar environment in the field will be a big asset before settling a permanent base [3-5]. Lunar environment. The lunar environment adds constraints to instrument specifications (vacuum, extreme temperature, regolith, seismic activity, micrometeorites). SMART-1 and other missions' data will bring geometrical, chemical and physical details about the environment (soil material characteristics, on-surface conditions …). Test bench. To assess planetary technologies and operations preparing for Mars human exploration. Lunar outpost predesign modular concept: To allow a human presence on the Moon and to carry out these experiments, we will give a pre-design of a minimal human lunar base. Through a modular concept, this base may later evolve into a long-duration or permanent base. We will analyse the possibilities of settling such a minimal base by means of current and near-term propulsion technology, such as a full Ariane 5 ME carrying 1.7 T of gross payload to the surface of the Moon.

  1. Reflections concerning triply-periodic minimal surfaces

    PubMed Central

    Schoen, Alan H.

    2012-01-01

    In recent decades, there has been an explosion in the number and variety of embedded triply-periodic minimal surfaces (TPMS) identified by mathematicians and materials scientists. Only the rare examples of low genus, however, are commonly invoked as shape templates in scientific applications. Exact analytic solutions are now known for many of the low genus examples. The more complex surfaces are readily defined with numerical tools such as Surface Evolver software or the Landau–Ginzburg model. Even though table-top versions of several TPMS have been placed within easy reach by rapid prototyping methods, the inherent complexity of many of these surfaces makes it challenging to grasp their structure. The problem of distinguishing TPMS, which is now acute because of the proliferation of examples, has been addressed by Lord & Mackay (Lord & Mackay 2003 Curr. Sci. 85, 346–362). PMID:24098851
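
    The low-genus examples mentioned above are often handled in practice through their nodal (level-set) approximations, whose zero level sets closely mimic the true minimal surfaces; two standard ones, for the Schwarz P surface and Schoen's gyroid, are:

```python
import numpy as np

def schwarz_p(x, y, z):
    """Nodal (level-set) approximation to the Schwarz P surface:
    the surface is the zero level set of this function."""
    return np.cos(x) + np.cos(y) + np.cos(z)

def gyroid(x, y, z):
    """Nodal approximation to Schoen's gyroid."""
    return (np.sin(x) * np.cos(y)
            + np.sin(y) * np.cos(z)
            + np.sin(z) * np.cos(x))
```

    Sampling either function on a 3-D grid and extracting the zero isosurface (e.g. by marching cubes) yields exactly the kind of table-top model that rapid prototyping methods can print.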

  2. Waste Minimization and Pollution Prevention Awareness Plan

    SciTech Connect

    Not Available

    1994-04-01

    The purpose of this plan is to document Lawrence Livermore National Laboratory (LLNL) projections for present and future waste minimization and pollution prevention. The plan specifies those activities and methods that are or will be used to reduce the quantity and toxicity of wastes generated at the site. It is intended to satisfy Department of Energy (DOE) requirements. This Plan provides an overview of projected activities from FY 1994 through FY 1999. The plans are broken into site-wide and problem-specific activities. All directorates at LLNL have had an opportunity to contribute input, to estimate budget, and to review the plan. In addition to the above, this plan records LLNL's goals for pollution prevention, regulatory drivers for those activities, assumptions on which the cost estimates are based, analyses of the strengths of the projects, and the barriers to increasing pollution prevention activities.

  3. Design for minimizing fracture risk of all-ceramic cantilever dental bridge.

    PubMed

    Zhang, Zhongpu; Zhou, Shiwei; Li, Eric; Li, Wei; Swain, Michael V; Li, Qing

    2015-01-01

    Minimization of the peak stresses and fracture incidence induced by mastication is considered critical in the design of all-ceramic dental restorations, especially cantilever fixed partial dentures (FPDs). The focus of this study is on developing a mechanically sound optimal design for an all-ceramic cantilever dental bridge in a posterior region. A topology optimization procedure in association with the Extended Finite Element Method (XFEM) is implemented to search for the best possible distribution of porcelain and zirconia materials in the bridge structure. Designs with different volume fractions of zirconia are considered. The results show that this new methodology is capable of improving FPD design by minimizing the incidence of cracking in comparison with the initial design. Potentially, it provides dental technicians with a new design tool to develop mechanically sound cantilever fixed partial dentures for more complicated clinical situations. PMID:26405963

  4. Indirect Lightning Safety Assessment Methodology

    SciTech Connect

    Ong, M M; Perkins, M P; Brown, C G; Crull, E W; Streit, R D

    2009-04-24

    Lightning is a safety hazard for high-explosives (HE) and their detonators. The methodology for estimating the risk from indirect lightning effects will be presented. It has two parts: a method to determine the likelihood of a detonation given a lightning strike, and an approach for estimating the likelihood of a strike. The results of these two parts produce an overall probability of a detonation. The probability calculations are complex for five reasons: (1) lightning strikes are stochastic and relatively rare, (2) the quality of the Faraday cage varies from one facility to the next, (3) RF coupling is inherently a complex subject, (4) performance data for abnormally stressed detonators is scarce, and (5) the arc plasma physics is not well understood. Therefore, a rigorous mathematical analysis would be too complex. Instead, our methodology takes a more practical approach, combining rigorous mathematical calculations where possible with empirical data when necessary. Where there is uncertainty, we compensate with conservative approximations. The goal is to determine a conservative estimate of the odds of a detonation. In Section 2, the methodology will be explained. This report will discuss topics at a high level, and the reasons for selecting an approach will be justified; for those interested in technical details, references will be provided. In Section 3, a simple hypothetical example will be given to reinforce the concepts. While the methodology will touch on all the items shown in Figure 1, the focus of this report is the indirect effect, i.e., determining the odds of a detonation from given EM fields. Professor Martin Uman from the University of Florida has been characterizing and defining extreme lightning strikes. Using Professor Uman's research, Dr. Kimball Merewether at Sandia National Laboratory in Albuquerque calculated the EM fields inside a Faraday-cage type

  5. Simulation Enabled Safeguards Assessment Methodology

    SciTech Connect

    Robert Bean; Trond Bjornard; Thomas Larson

    2007-09-01

    It is expected that nuclear energy will be a significant component of future energy supplies. New facilities, operating under a strengthened international nonproliferation regime, will be needed. There is good reason to believe that virtual engineering applied to the facility design, as well as to the safeguards system design, will reduce total project cost and improve efficiency in the design cycle. The Simulation Enabled Safeguards Assessment MEthodology (SESAME) has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag-and-drop wireframe construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed.

  6. Methodology for flammable gas evaluations

    SciTech Connect

    Hopkins, J.D., Westinghouse Hanford

    1996-06-12

    There are 177 radioactive waste storage tanks at the Hanford Site. The waste generates flammable gases. The waste releases gas continuously, but in some tanks the waste has shown a tendency to trap these flammable gases. When enough gas is trapped in a tank's waste matrix, it may be released in a way that renders part or all of the tank atmosphere flammable for a period of time. Tanks must be evaluated against previously defined criteria to determine whether they can present a flammable gas hazard. This document presents the methodology for evaluating tanks in two areas of concern in the tank headspace: the steady-state flammable-gas concentration resulting from continuous release, and the concentration resulting from an episodic gas release.
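    The steady-state part of the evaluation can be illustrated with a standard well-mixed compartment balance: for a continuous release into a ventilated headspace, the equilibrium concentration is the release flow divided by the total outflow. This is a generic sketch with hypothetical flow rates, not the document's actual criteria or tank data; the 25%-of-LFL comparison is shown only as a common style of flammability screening limit.

```python
# Hedged sketch: well-mixed steady-state concentration of a continuously
# released flammable gas in a ventilated headspace. Flow values are
# illustrative placeholders, not Hanford data.

def steady_state_fraction(release_cfm: float, ventilation_cfm: float) -> float:
    """Steady-state volume fraction of gas: release flow / total outflow."""
    return release_cfm / (release_cfm + ventilation_cfm)

LFL_H2 = 0.04  # lower flammability limit of hydrogen in air, about 4% by volume

c = steady_state_fraction(release_cfm=0.01, ventilation_cfm=10.0)
print(c < 0.25 * LFL_H2)  # True: stays below 25% of the LFL in this example
```

    For small release rates the result reduces to release rate over ventilation rate, which is why continuous releases are usually benign and episodic releases of trapped gas dominate the hazard.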

  7. Lean methodology in health care.

    PubMed

    Kimsey, Diane B

    2010-07-01

    Lean production is a process management philosophy that examines organizational processes from a customer perspective with the goal of limiting the use of resources to those processes that create value for the end customer. Lean manufacturing emphasizes increasing efficiency, decreasing waste, and using methods to decide what matters rather than accepting preexisting practices. A rapid improvement team at Lehigh Valley Health Network, Allentown, Pennsylvania, implemented a plan, do, check, act cycle to determine problems in the central sterile processing department, test solutions, and document improved processes. By using A3 thinking, a consensus building process that graphically depicts the current state, the target state, and the gaps between the two, the team worked to improve efficiency and safety, and to decrease costs. Use of this methodology has increased teamwork, created user-friendly work areas and processes, changed management styles and expectations, increased staff empowerment and involvement, and streamlined the supply chain within the perioperative area. PMID:20619772

  8. Nuclear weapon reliability evaluation methodology

    SciTech Connect

    Wright, D.L.

    1993-06-01

    This document provides an overview of those activities that are normally performed by Sandia National Laboratories to provide nuclear weapon reliability evaluations for the Department of Energy. These reliability evaluations are first provided as a prediction of the attainable stockpile reliability of a proposed weapon design. Stockpile reliability assessments are provided for each weapon type as the weapon is fielded and are continuously updated throughout the weapon stockpile life. The reliability predictions and assessments depend heavily on data from both laboratory simulation and actual flight tests. An important part of the methodology is the set of opportunities for review that occur throughout the entire process and assure a consistent approach and appropriate use of the data for reliability evaluation purposes.

  9. Towards a Minimal System for Cell Division

    NASA Astrophysics Data System (ADS)

    Schwille, Petra

    We have entered the "omics" era of the life sciences, meaning that our general knowledge about biological systems has become vast, complex, and almost impossible to fully comprehend. Consequently, the challenge for quantitative biology and biophysics is to identify appropriate procedures and protocols that allow the researcher to strip down the complexity of a biological system to a level that can be reliably modeled but still retains the essential features of its "real" counterpart. The virtue of physics has always been the reductionist approach, which allowed scientists to identify the underlying basic principles of seemingly complex phenomena, and subject them to rigorous mathematical treatment. Biological systems are obviously among the most complex phenomena we can think of, and it is fair to state that our rapidly increasing knowledge does not make it easier to identify a small set of fundamental principles of the big concept of "life" that can be defined and quantitatively understood. Nevertheless, it is becoming evident that only by tight cooperation and interdisciplinary exchange between the life sciences and quantitative sciences, and by applying intelligent reductionist approaches also to biology, will we be able to meet the intellectual challenges of the twenty-first century. These include not only the collection and proper categorization of the data, but also their true understanding and harnessing such that we can solve important practical problems imposed by medicine or the worldwide need for new energy sources. Many of these approaches are reflected by the modern buzz word "synthetic biology", therefore I briefly discuss this term in the first section. Further, I outline some endeavors of our and other groups to model minimal biological systems, with particular focus on the possibility of generating a minimal system for cell division.

  10. Managing sediment to minimize environmental impacts

    SciTech Connect

    Sherman, K.

    1995-12-31

    When considering licensing of a hydroelectric project, FERC must give equal consideration to power and nonpower values such as environmental resources. A case study is the existing Rock-Creek Cresta Project, located on the North Fork of the Feather River in northern California, which is in the process of relicensing by the Commission. This project includes two reservoirs - Rock Creek and Cresta Reservoirs, each formed by a dam that diverts water from the river into a tunnel and to a powerhouse. The watershed includes large natural and man-made sediment sources. Rock Creek Reservoir has accumulated 3.9 million cubic yards of sediments since the dam was built in 1950; Cresta Reservoir has accumulated 2.9 million cy of sediments since 1949. Operational problems began in the 1980s. As part of the relicensing process, Pacific Gas & Electric Company (PG&E) initially proposed a combination of dredging 500,000 cy of sediment from each reservoir, land disposal of dredged sediments, followed by sediment pass-through to achieve a long term net balance of sediment inflow and outflow. This proposal had substantial economic costs and environmental impact. Potential environmental effects included impacts to water quality and aquatic organisms and to terrestrial habitat from disposal of a million cy of dredged sediments. PG&E used physical and mathematical models to develop an innovative approach that minimized the amount of sediment needed to be dredged by limiting dredging to the area immediately adjacent to the intake structures. This would also tend to minimize impacts to water quality and aquatic habitat by reducing the area of disturbance within the reservoirs. PG&E proposes to keep the intake areas open and provide for long-term sediment pass-through by providing additional low-level outlet capacity. This would permit reservoir drawdown, which would increase velocities and sediment movement out of the reservoirs.

  11. Minimally invasive total hip arthroplasty: in opposition.

    PubMed

    Hungerford, David S

    2004-06-01

    At the Knee Society Winter Meeting in 2003, Seth Greenwald and I debated about whether there should be new standards (ie, regulations) applied to the release of information to the public on "new developments." I argued for the public's "right to know" prior to the publication of peer-reviewed literature. He argued for regulatory constraint or "proving by peer-reviewed publication" before alerting the public. It is not a contradiction for me to currently argue against the public advertising of minimally invasive (MIS) total hip arthroplasty as not yet being in the best interest of the public. It is hard to remember a concept that has so captured both the public's and the surgical community's fancy as MIS. Patients are "demanding" MIS without knowing why. Surgeons are offering it as the next best, greatest thing without having developed the skill and experience to avoid the surgery's risks. If you put "minimally invasive hip replacement" into the Google search engine (http://www.google.com), you get 5,170 matches. If you put the same words in PubMed (http://www.ncbi.nlm.nih.gov/entrez/query.fcgi), referencing the National Library of Medicine database, you get SEVENTEEN; none is really a peer-reviewed article. Most are one-page papers in orthopedics from medical education meetings. On the other hand, there are over 6,000 peer-reviewed articles on total hip arthroplasty. Dr. Thomas Sculco, my counterpart in this debate, wrote an insightful editorial in the American Journal of Orthopedic Surgery in which he stated: "Although these procedures have generated incredible interest and enthusiasm, I am concerned that they may be performed to the detriment of our patients." I couldn't agree with him more. Smaller is not necessarily better and, when it is worse, it will be the "smaller" that is held accountable. PMID:15190556

  12. Minimizing forced outage risk in generator bidding

    NASA Astrophysics Data System (ADS)

    Das, Dibyendu

    Competition in power markets has exposed the participating companies to physical and financial uncertainties. Generator companies bid to supply power in a day-ahead market. Once their bids are accepted by the ISO, they are bound to supply power. A random outage after acceptance of bids forces a generator to buy power from the expensive real-time hourly spot market and sell it to the ISO at the set day-ahead market clearing price, incurring losses. A risk management technique is developed to assess this financial risk associated with forced outages of generators and then minimize it. This work presents a risk assessment module which measures the financial risk of generators bidding in an open market for different bidding scenarios. The day-ahead power market auction is modeled using a Unit Commitment algorithm, and a combination of Normal and Cauchy distributions generates the real-time hourly spot market. Risk profiles are derived and VaRs are calculated at the 98 percent confidence level as a measure of financial risk. Risk profiles and VaRs help the generators to analyze the forced outage risk and the different factors affecting it. The VaRs and the estimated total earnings for different bidding scenarios are used to develop a risk minimization module. This module develops a bidding strategy for the generator company such that its estimated total earning is maximized while keeping the VaR below a tolerable limit. This general framework of a risk management technique for generating companies bidding in a competitive day-ahead market can also help them in decisions related to building new generators.
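    The VaR calculation at the heart of the risk-assessment module can be sketched as follows. This is a minimal illustration assuming a plain normal earnings distribution, not the Unit Commitment / Normal-Cauchy market simulation the abstract describes; all dollar figures are hypothetical.

```python
# Hedged sketch: estimating a 98% value-at-risk (VaR) from a simulated
# distribution of generator earnings. The normal earnings model and its
# parameters are illustrative placeholders.
import random
import statistics

random.seed(0)

# Hypothetical daily earnings ($k): mean 100, standard deviation 30.
earnings = [random.gauss(100, 30) for _ in range(10_000)]

def value_at_risk(samples, confidence=0.98):
    """Loss, relative to mean earnings, exceeded with probability 1 - confidence."""
    sorted_samples = sorted(samples)
    idx = int((1 - confidence) * len(sorted_samples))
    return statistics.mean(samples) - sorted_samples[idx]

var98 = value_at_risk(earnings)
# For a normal model this is roughly two standard deviations.
print(round(var98, 1))
```

    A bidding strategy would then be screened by recomputing this quantity for each scenario and discarding those whose VaR exceeds the tolerable limit.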

  13. Minimal distortion pathways in polyhedral rearrangements.

    PubMed

    Casanova, David; Cirera, Jordi; Llunell, Miquel; Alemany, Pere; Avnir, David; Alvarez, Santiago

    2004-02-18

    A definition of minimum distortion paths between two polyhedra in terms of continuous shape measures (CShM) is presented. A general analytical expression deduced for such pathways makes use of one parameter, the minimum distortion constant, which can be easily obtained through the CShM methodology and is herein tabulated for pairs of polyhedra having four to eight vertices. The work presented here also allows us to obtain representative model molecular structures along the interconversion pathways. Several commonly used polytopal rearrangement pathways are shown to be in fact minimum distortion pathways: the spread path leading from the tetrahedron to the square, the Berry pseudorotation that interconverts a square pyramid and a trigonal bipyramid, and the Bailar twist for the interconversion of the octahedron and the trigonal prism. Examples of applications to the analysis of the stereochemistries of several families of metal complexes are presented. PMID:14871107

  14. Proposal for a tutorial on minimal length encoding (MLE) in molecular biology

    SciTech Connect

    Milosavljevic, A.

    1994-03-01

    This paper describes a tutorial to introduce the Minimal length encoding (MLE) method to computational biologists who are designing sequence analysis algorithms, to computer scientists who are interested in learning more about macromolecular sequence analysis, and to biologists who are more advanced users of the sequence analysis programs. An emphasis of the workshop will be on the use of the MLE method as a tool for comparative analysis of inference programs in computational biology, with an ultimate purpose of providing more methodological coherence to the emerging field of computational biology.

  15. Note: A method for minimizing oxide formation during elevated temperature nanoindentation

    SciTech Connect

    Cheng, I. C.; Hodge, A. M.; Garcia-Sanchez, E.

    2014-09-15

    A standardized method to protect metallic samples and minimize oxide formation during elevated-temperature nanoindentation was adapted to a commercial instrument. Nanoindentation was performed on Al (100), Cu (100), and W (100) single crystals submerged in vacuum oil at 200 °C, while the surface morphology and oxidation were carefully monitored using atomic force microscopy (AFM) and X-ray photoelectron spectroscopy (XPS). The results were compared to room-temperature and 200 °C nanoindentation tests performed without oil, in order to evaluate the feasibility of using the oil as a protective medium. Extensive surface characterization demonstrated that this methodology is effective for nanoscale testing.

  16. EXPERIENCE WITH THE EPA MANUAL FOR WASTE MINIMIZATION OPPORTUNITY ASSESSMENT

    EPA Science Inventory

    Waste Minimization Opportunity Assessments Manual (EPA/625/7-88/003) is designed to assist those responsible for planning, managing, and implementing waste minimization activities at the waste generating operation and at all management levels. The Manual defines waste minimizatio...

  17. New methodology in biomedical science: methodological errors in classical science.

    PubMed

    Skurvydas, Albertas

    2005-01-01

    The following methodological errors are observed in the biomedical sciences: paradigmatic ones; those of an exaggerated search for certainty; science dehumanisation; those of determinism and linearity; those of drawing conclusions; errors of reductionism, or quality decomposition, as well as of exaggerated enlargement; errors connected with discarding odd, unexpected, or awkward facts; those of exaggerated mathematization; isolation of science; the error of "common sense"; the error of the Ceteris Paribus law ("other things being equal" laws); "youth" and common sense; inflexibility of the criteria of truth; errors of restricting the sources of truth and the ways of searching for truth; the error connected with wisdom gained post factum; errors of wrong interpretation of the research mission; "laziness" to repeat the experiment; as well as errors of the coordination of errors. One of the basic aims for present-day scholars of biomedicine is, therefore, mastering the new non-linear, holistic, complex way of thinking that will, undoubtedly, enable one to make fewer errors doing research. The aim of "scientific travelling" will be achieved with greater probability if the "travelling" itself is performed with great probability. PMID:15687745

  18. Minimal trellises for linear block codes and their duals

    NASA Technical Reports Server (NTRS)

    Kiely, A. B.; Dolinar, S.; Ekroot, L.; Mceliece, R. J.; Lin, W.

    1995-01-01

    We consider the problem of finding a trellis for a linear block code that minimizes one or more measures of trellis complexity for a fixed permutation of the code. We examine constraints on trellises, including relationships between the minimal trellis of a code and that of the dual code. We identify the primitive structures that can appear in a minimal trellis and relate this to those for the minimal trellis of the dual code.

  19. 12 CFR 264b.4 - Gifts of minimal value.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Office of the Secretary for valuation. (c) Disagreements over whether a gift is of minimal value will be... 12 Banks and Banking 4 2013-01-01 2013-01-01 false Gifts of minimal value. 264b.4 Section 264b.4... (CONTINUED) RULES REGARDING FOREIGN GIFTS AND DECORATIONS § 264b.4 Gifts of minimal value. (a)...

  20. 12 CFR 264b.4 - Gifts of minimal value.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... for valuation. (c) Disagreements over whether a gift is of minimal value will be resolved by an... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Gifts of minimal value. 264b.4 Section 264b.4... RULES REGARDING FOREIGN GIFTS AND DECORATIONS § 264b.4 Gifts of minimal value. (a) Board employees...

  1. 12 CFR 264b.4 - Gifts of minimal value.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Office of the Secretary for valuation. (c) Disagreements over whether a gift is of minimal value will be... 12 Banks and Banking 4 2014-01-01 2014-01-01 false Gifts of minimal value. 264b.4 Section 264b.4... (CONTINUED) RULES REGARDING FOREIGN GIFTS AND DECORATIONS § 264b.4 Gifts of minimal value. (a)...

  2. 12 CFR 264b.4 - Gifts of minimal value.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Office of the Secretary for valuation. (c) Disagreements over whether a gift is of minimal value will be... 12 Banks and Banking 4 2012-01-01 2012-01-01 false Gifts of minimal value. 264b.4 Section 264b.4... (CONTINUED) RULES REGARDING FOREIGN GIFTS AND DECORATIONS § 264b.4 Gifts of minimal value. (a)...

  3. 12 CFR 264b.4 - Gifts of minimal value.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... for valuation. (c) Disagreements over whether a gift is of minimal value will be resolved by an... 12 Banks and Banking 3 2011-01-01 2011-01-01 false Gifts of minimal value. 264b.4 Section 264b.4... RULES REGARDING FOREIGN GIFTS AND DECORATIONS § 264b.4 Gifts of minimal value. (a) Board employees...

  4. 10 CFR 20.1406 - Minimization of contamination.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Minimization of contamination. 20.1406 Section 20.1406... License Termination § 20.1406 Minimization of contamination. (a) Applicants for licenses, other than early... procedures for operation will minimize, to the extent practicable, contamination of the facility and...

  5. 10 CFR 20.1406 - Minimization of contamination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Minimization of contamination. 20.1406 Section 20.1406... License Termination § 20.1406 Minimization of contamination. (a) Applicants for licenses, other than early... procedures for operation will minimize, to the extent practicable, contamination of the facility and...

  6. Support minimized inversion of acoustic and elastic wave scattering

    SciTech Connect

    Safaeinili, A.

    1994-04-24

    This report discusses the following topics on support minimized inversion of acoustic and elastic wave scattering: minimum support inversion; forward modelling of elastodynamic wave scattering; minimum support linearized acoustic inversion; support minimized nonlinear acoustic inversion without absolute phase; and support minimized nonlinear elastic inversion.

  7. 10 CFR 20.1406 - Minimization of contamination.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Minimization of contamination. 20.1406 Section 20.1406... License Termination § 20.1406 Minimization of contamination. (a) Applicants for licenses, other than early... procedures for operation will minimize, to the extent practicable, contamination of the facility and...

  8. 10 CFR 20.1406 - Minimization of contamination.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Minimization of contamination. 20.1406 Section 20.1406... License Termination § 20.1406 Minimization of contamination. (a) Applicants for licenses, other than early... procedures for operation will minimize, to the extent practicable, contamination of the facility and...

  9. Gadamerian philosophical hermeneutics as a useful methodological framework for the Delphi technique

    PubMed Central

    Guzys, Diana; Dickson-Swift, Virginia; Kenny, Amanda; Threlkeld, Guinever

    2015-01-01

    In this article we aim to demonstrate how Gadamerian philosophical hermeneutics may provide a sound methodological framework for researchers using the Delphi Technique (Delphi) in studies exploring health and well-being. Reporting of the use of Delphi in health and well-being research is increasing, but less attention has been given to covering its methodological underpinnings. In Delphi, a structured anonymous conversation between participants is facilitated, via an iterative survey process. Participants are specifically selected for their knowledge and experience with the topic of interest. The purpose of structuring conversation in this manner is to cultivate collective opinion and highlight areas of disagreement, using a process that minimizes the influence of group dynamics. The underlying premise is that the opinion of a collective is more useful than that of an individual. In designing our study into health literacy, Delphi aligned well with our research focus and would enable us to capture collective views. However, we were interested in the methodology that would inform our study. As researchers, we believe that methodology provides the framework and principles for a study and is integral to research integrity. In assessing the suitability of Delphi for our research purpose, we found little information about underpinning methodology. The absence of a universally recognized or consistent methodology associated with Delphi was highlighted through a scoping review we undertook to assist us in our methodological thinking. This led us to consider alternative methodologies, which might be congruent with the key principles of Delphi. We identified Gadamerian philosophical hermeneutics as a methodology that could provide a supportive framework and principles. We suggest that this methodology may be useful in health and well-being studies utilizing the Delphi method. PMID:25948132

  10. Minimizing Glovebox Glove Breaches: PART II.

    SciTech Connect

    Cournoyer, M. E.; Andrade, R.M.; Taylor, D. J.; Stimmel, J. J.; Zaelke, R. L.; Balkey, J. J.

    2005-01-01

    As a matter of good business practices, a team of glovebox experts from Los Alamos National Laboratory (LANL) has been assembled to proactively investigate processes and procedures that minimize unplanned breaches in the glovebox, e.g., glove failures. A major part of this effort involves the review of glovebox glove failures that have occurred at the Plutonium Facility and at the Chemical and Metallurgy Research Facility. Information dating back to 1993 has been compiled from formal records and combined with information obtained from a baseline inventory of about 9,000 glovebox gloves. The key attributes tracked include those related to location, the glovebox glove, the type and location of breaches, the worker, and the consequences resulting from breaches. This glovebox glove failure analysis yielded results on the ease of collecting this type of data, the causes of most glove failures, the effectiveness of current controls, and recommendations to improve hazard control systems. As expected, a significant number of breaches involve high-risk operations such as grinding, hammering, using sharps (especially screwdrivers), and assembling equipment. Surprisingly, tasks such as the movement of equipment and material between gloveboxes and the opening of cans are also major contributors to breaches. Almost half the gloves fail within a year of their installation date. The greatest consequence for over 90% of glovebox glove failures is alpha contamination of protective clothing. Personnel self-monitoring at the gloveboxes continues to be the most effective way of detecting glovebox glove failures. Glove failures from these tasks can be reduced through changes in procedures and the design of remote-handling apparatus. The Nuclear Materials Technology Division management uses this information to improve hazard control systems and further reduce the number of unplanned breaches in the glovebox. As a result, excursions of contaminants …

  11. Minimally Invasive Versus Conventional Aortic Valve Replacement

    PubMed Central

    Attia, Rizwan Q.; Hickey, Graeme L.; Grant, Stuart W.; Bridgewater, Ben; Roxburgh, James C.; Kumar, Pankaj; Ridley, Paul; Bhabra, Moninder; Millner, Russell W. J.; Athanasiou, Thanos; Casula, Roberto; Chukwuemka, Andrew; Pillay, Thasee; Young, Christopher P.

    2016-01-01

    Objective Minimally invasive aortic valve replacement (MIAVR) has been demonstrated as a safe and effective option but remains underused. We aimed to evaluate outcomes of isolated MIAVR compared with conventional aortic valve replacement (CAVR). Methods Data from The National Institute for Cardiovascular Outcomes Research (NICOR) were analyzed at seven volunteer centers (2006–2012). Primary outcomes were in-hospital mortality and midterm survival. Secondary outcomes were postoperative length of stay as well as cumulative bypass and cross-clamp times. Propensity modeling with matched cohort analysis was used. Results Of 307 consecutive MIAVR patients, 151 (49%) were performed during the last 2 years of study with a continued increase in numbers. The 307 MIAVR patients were matched on a 1:1 ratio. In the matched CAVR group, there was no statistically significant difference in in-hospital mortality [MIAVR, 4/307 (1.3%); 95% confidence interval (CI), 0.4%–3.4% vs CAVR, 6/307 (2.0%); 95% CI, 0.8%–4.3%; P = 0.752]. One-year survival rates in the MIAVR and CAVR groups were 94.4% and 94.6%, respectively. There was no statistically significant difference in midterm survival (P = 0.677; hazard ratio, 0.90; 95% CI, 0.56–1.46). Median postoperative length of stay was lower in the MIAVR patients by 1 day (P = 0.009). The mean cumulative bypass time (94.8 vs 91.3 minutes; P = 0.333) and cross-clamp time (74.6 vs 68.4 minutes; P = 0.006) were longer in the MIAVR group; however, this was significant only in the cross-clamp time comparison. Conclusions Minimally invasive aortic valve replacement is a safe alternative to CAVR with respect to operative and 1-year mortality and is associated with a shorter postoperative stay. Further studies are required in high-risk (logistic EuroSCORE > 10) patients to define the role of MIAVR. PMID:26926521

  12. VISION 21 SYSTEMS ANALYSIS METHODOLOGIES

    SciTech Connect

    G.S. Samuelsen; A. Rao; F. Robson; B. Washom

    2003-08-11

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into power plant systems that meet performance and emission goals of the Vision 21 program. The study efforts have narrowed down the myriad of fuel processing, power generation, and emission control technologies to selected scenarios that identify those combinations having the potential to achieve the Vision 21 program goals of high efficiency and minimized environmental impact while using fossil fuels. The technology levels considered are based on projected technical and manufacturing advances being made in industry and on advances identified in current and future government supported research. Included in these advanced systems are solid oxide fuel cells and advanced cycle gas turbines. The results of this investigation will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

  13. Waste Package Design Methodology Report

    SciTech Connect

    D.A. Brownson

    2001-09-28

    The objective of this report is to describe the analytical methods and processes used by the Waste Package Design Section to establish the integrity of the various waste package designs, the emplacement pallet, and the drip shield. The scope of this report shall be the methodology used in criticality, risk-informed, shielding, source term, structural, and thermal analyses. The basic features and appropriateness of the methods are illustrated, and the processes are defined whereby input values and assumptions flow through the application of those methods to obtain designs that ensure defense-in-depth as well as satisfy requirements on system performance. Such requirements include those imposed by federal regulation, from both the U.S. Department of Energy (DOE) and U.S. Nuclear Regulatory Commission (NRC), and those imposed by the Yucca Mountain Project to meet repository performance goals. The report is to be used, in part, to describe the waste package design methods and techniques to be used for producing input to the License Application Report.

  14. Review of risk premium methodology

    SciTech Connect

    Bittner, C.

    1995-12-31

    Over the last few years, the Iowa Utilities Board (IUB) has used a simple risk premium method as supplemental evidence in determining the cost of equity in rate cases. The Board method has mainly been to add a 250 to 350 basis point risk premium to an A-rated utility bond average. Recently, the Board has expressed concern that an update of the risk premium range is warranted. Is an update needed? The 250 to 350 basis points, when first adopted, apparently reflected a specific record at that time. It is not cast in stone, nor invariant with time. For a number of reasons, periodic updating is reasonable and proper. First, during periods of extremely high or extremely low capital costs, commissions' desire for stability of rates and avoidance of subsequent rate cases triggered by more normal credit shifts commissions' attention from the discounted cash flow (DCF) "snap-shot" of capital costs to the broader "time-series perspective" of the risk premium method. It is essentially a judgment that the current capital cost, at times of market extremes, is not necessarily the best representative equity cost to use for prospective rates. Nevertheless, a risk premium method is a market-based method, with some tie, via the debt rate, to current market conditions. In any case, during such times when the risk premium method is given more usage, it makes sense to refine or update, if needed, the Board's risk premium methodology.

  15. Minimizing metastatic risk in radiotherapy fractionation schedules

    NASA Astrophysics Data System (ADS)

    Badri, Hamidreza; Ramakrishnan, Jagdish; Leder, Kevin

    2015-11-01

    Metastasis is the process by which cells from a primary tumor disperse and form new tumors at distant anatomical locations. The treatment and prevention of metastatic cancer remains an extremely challenging problem. This work introduces a novel biologically motivated objective function to the radiation optimization community that takes into account metastatic risk instead of the status of the primary tumor. In this work, we consider the problem of developing fractionated irradiation schedules that minimize production of metastatic cancer cells while keeping normal tissue damage below an acceptable level. A dynamic programming framework is utilized to determine the optimal fractionation scheme. We evaluated our approach on a breast cancer case using the heart and the lung as organs-at-risk (OAR). For small tumor α/β values, hypo-fractionated schedules were optimal, which is consistent with standard models. However, for relatively larger α/β values, we found the type of schedule depended on various parameters such as the time when metastatic risk was evaluated, the α/β values of the OARs, and the normal tissue sparing factors. Interestingly, in contrast to standard models, hypo-fractionated and semi-hypo-fractionated schedules (large initial doses with doses tapering off with time) were suggested even with large tumor α/β values. Numerical results indicate the potential for significant reduction in metastatic risk.
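    The fractionation trade-off underlying these schedules can be illustrated with the standard linear-quadratic biologically effective dose, BED = n·d·(1 + d/(α/β)). This is a minimal textbook sketch, not the paper's metastatic-risk objective or its dynamic program; the schedule numbers and the α/β value are illustrative assumptions.

```python
# Hedged sketch: biologically effective dose (BED) under the standard
# linear-quadratic model. Schedule parameters below are hypothetical.

def bed(n_fractions: int, dose_per_fraction: float, alpha_beta: float) -> float:
    """BED = n * d * (1 + d / (alpha/beta)) for n fractions of d Gy each."""
    return n_fractions * dose_per_fraction * (1 + dose_per_fraction / alpha_beta)

# Two schedules delivering the same 60 Gy physical dose, with an assumed
# tumor alpha/beta of 10 Gy (a commonly quoted tumor value):
conventional = bed(30, 2.0, alpha_beta=10.0)      # 30 fractions of 2 Gy
hypofractionated = bed(15, 4.0, alpha_beta=10.0)  # 15 fractions of 4 Gy

print(round(conventional, 6), round(hypofractionated, 6))  # 72.0 84.0
```

    The larger dose per fraction gives a higher BED for the same physical dose, and the effect is strongest when α/β is small, which is why the tumor and OAR α/β values drive which schedule an optimizer prefers.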

  16. Cultural change and support of waste minimization

    SciTech Connect

    Boylan, M.S.

    1991-12-31

    The process of bringing a subject like pollution prevention to top-of-mind awareness, where designing to prevent waste becomes part of business as usual, is called cultural change. With Department of Energy orders and management waste minimization commitment statements on file, the real work is just beginning at the Idaho National Engineering Laboratory (INEL): shaping the attitudes of 11,000+ employees. The difficulties of such a task are daunting. The 890-square-mile INEL site and in-town support offices mean a huge diversity of employee jobs and waste streams, from cafeteria and auto maintenance wastes to high-level nuclear waste casks. INEL is pursuing a three-component cultural change strategy: training, publicity, and public outreach. To meet the intent of DOE orders, all INEL employees are slated to receive pollution prevention orientation training. More technical training is given to targeted groups like purchasing and design engineering. To keep newly learned pollution prevention concepts top-of-mind, extensive site-wide publicity is being developed and conducted, culminating in the April Pollution Prevention Awareness Week coinciding with Earth Day 1992. Finally, news of INEL pollution prevention successes is shared with the public to increase their overall environmental awareness and their knowledge of INEL activities. An important added benefit is the sense of pride the program instills in INEL employees in having their successes displayed so publicly.

  17. Minimally Invasive Approach to Achilles Tendon Pathology.

    PubMed

    Hegewald, Kenneth W; Doyle, Matthew D; Todd, Nicholas W; Rush, Shannon M

    2016-01-01

    Many surgical procedures have been described for Achilles tendon pathology; however, no overwhelming consensus has been reached for surgical treatment. Open repair using a central or paramedian incision allows excellent visualization for end-to-end anastomosis in the case of a complete rupture and detachment and reattachment for insertional pathologies. Postoperative wound dehiscence and infection in the Achilles tendon have considerable deleterious effects on overall functional recovery and outcome and sometimes require plastic surgery techniques to achieve coverage. With the aim of avoiding such complications, foot and ankle surgeons have studied less invasive techniques for repair. We describe a percutaneous approach to Achilles tendinopathy using a modification of the Bunnell suture weave technique combined with the use of interference screws. No direct end-to-end repair of the tendon is performed, rather, the proximal stump is brought in direct proximity of the distal stump, preventing overlengthening and proximal stump retraction. This technique also reduces the suture creep often seen with end-to-end tendon repair by providing a direct, rigid suture to bone interface. We have used the new technique to minimize dissection and exposure while restoring function and accelerating recovery postoperatively. PMID:26385574

  18. Minimizing radiation exposure during percutaneous nephrolithotomy.

    PubMed

    Chen, T T; Preminger, G M; Lipkin, M E

    2015-12-01

    Given the recent trends in growing per capita radiation dose from medical sources, there have been increasing concerns over patient radiation exposure. Patients with kidney stones undergoing percutaneous nephrolithotomy (PNL) are at particular risk for high radiation exposure. There exist several risk factors for increased radiation exposure during PNL which include high Body Mass Index, multiple access tracts, and increased stone burden. We herein review recent trends in radiation exposure, radiation exposure during PNL to both patients and urologists, and various approaches to reduce radiation exposure. We discuss incorporating the principles of As Low As reasonably Achievable (ALARA) into clinical practice and review imaging techniques such as ultrasound and air contrast to guide PNL access. Alternative surgical techniques and approaches to reducing radiation exposure, including retrograde intra-renal surgery, retrograde nephrostomy, endoscopic-guided PNL, and minimally invasive PNL, are also highlighted. It is important for urologists to be aware of these concepts and techniques when treating stone patients with PNL. The discussions outlined will assist urologists in providing patient counseling and high quality of care. PMID:26354615

  19. Management options for minimal hepatic encephalopathy.

    PubMed

    Bajaj, Jasmohan S

    2008-12-01

    Minimal hepatic encephalopathy (MHE) is a neurocognitive dysfunction that is present in the majority of patients with cirrhosis. MHE has a characteristic cognitive profile that cannot be diagnosed clinically. This cognitive dysfunction is independent of sleep dysfunction or problems with overall intelligence. MHE has a significant impact on quality of life, the ability to function in daily life and progression to overt hepatic encephalopathy. Driving ability can be impaired in MHE and this may be a significant factor behind motor vehicle accidents. A crucial aspect of the clinical care of MHE patients is their driving history, which is often ignored during routine care and can add a vital dimension to the overall disease assessment. Driving history should be an integral part of the care of patients with MHE. The preserved communication skills and lack of specific signs and insight make MHE difficult to diagnose. The predominant strategies for MHE diagnosis are psychometric or neurophysiological testing. These are usually limited by financial, normative or time constraints. Studies into inhibitory control, cognitive drug research and critical flicker frequency tests are encouraging. These tests do not require a psychologist for administration and interpretation. Lactulose and probiotics have been studied for their potential use as therapies for MHE, but these are not standard-of-care practices at this time. Therapy can improve the quality of life in MHE patients but the natural history, specific diagnostic strategies and treatment options are still being investigated. PMID:19090738

  20. Linearized Functional Minimization for Inverse Modeling

    SciTech Connect

    Wohlberg, Brendt; Tartakovsky, Daniel M.; Dentz, Marco

    2012-06-21

    Heterogeneous aquifers typically consist of multiple lithofacies, whose spatial arrangement significantly affects flow and transport. The estimation of these lithofacies is complicated by the scarcity of data and by the lack of a clear correlation between identifiable geologic indicators and attributes. We introduce a new inverse-modeling approach to estimate both the spatial extent of hydrofacies and their properties from sparse measurements of hydraulic conductivity and hydraulic head. Our approach is to minimize a functional defined on the vectors of values of hydraulic conductivity and hydraulic head fields defined on regular grids at a user-determined resolution. This functional is constructed to (i) enforce the relationship between conductivity and heads provided by the groundwater flow equation, (ii) penalize deviations of the reconstructed fields from measurements where they are available, and (iii) penalize reconstructed fields that are not piece-wise smooth. We develop an iterative solver for this functional that exploits a local linearization of the mapping from conductivity to head. This approach provides a computationally efficient algorithm that rapidly converges to a solution. A series of numerical experiments demonstrates the robustness of our approach.
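
    The three-term penalized functional described in this abstract can be illustrated with a minimal 1-D sketch: (i) a flow-equation residual, (ii) a data-misfit term at sparse measurement points, and (iii) a smoothness penalty, minimized here by crude finite-difference gradient descent. The grid size, weights, measurement locations, and optimizer are assumptions for illustration; the paper's actual solver exploits a local linearization rather than plain descent.

```python
# Minimal 1-D sketch of a three-term inverse-modeling functional:
# PDE residual + data misfit + smoothness penalty. All values assumed.
N = 21
K = [1.0] * N                        # conductivity field (held fixed here)
h0 = [0.0] * N                       # initial guess for the head field
OBS = {0: 1.0, 10: 0.5, N - 1: 0.0}  # sparse head "measurements" (assumed)
W_PDE, W_DATA, W_SMOOTH = 1.0, 10.0, 0.1

def functional(h):
    # (i) residual of -(d/dx)(K dh/dx) = 0 via finite differences
    pde = sum((K[i] * (h[i + 1] - 2 * h[i] + h[i - 1])) ** 2
              for i in range(1, N - 1))
    # (ii) misfit where measurements exist
    data = sum((h[i] - v) ** 2 for i, v in OBS.items())
    # (iii) penalize rough reconstructed fields
    smooth = sum((h[i + 1] - h[i]) ** 2 for i in range(N - 1))
    return W_PDE * pde + W_DATA * data + W_SMOOTH * smooth

def descend(h, steps=500, lr=0.02, eps=1e-6):
    """Crude finite-difference gradient descent on the functional."""
    h = list(h)
    for _ in range(steps):
        base = functional(h)
        grad = []
        for i in range(N):
            h[i] += eps
            grad.append((functional(h) - base) / eps)
            h[i] -= eps
        h = [hi - lr * g for hi, g in zip(h, grad)]
    return h

if __name__ == "__main__":
    print(round(functional(h0), 3), "->", round(functional(descend(h0)), 3))
```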

  1. Wormholes minimally violating the null energy condition

    SciTech Connect

    Bouhmadi-López, Mariam; Lobo, Francisco S N; Martín-Moruno, Prado E-mail: fslobo@fc.ul.pt

    2014-11-01

    We consider novel wormhole solutions supported by a matter content that minimally violates the null energy condition. More specifically, we consider an equation of state in which the sum of the energy density and radial pressure is proportional to a constant with a value smaller than that of the inverse area characterising the system, i.e., the area of the wormhole mouth. This approach is motivated by a recently proposed cosmological event, dubbed "the little sibling of the big rip", where the Hubble rate and the scale factor blow up but the cosmic derivative of the Hubble rate does not [1]. By using the cut-and-paste approach, we match interior spherically symmetric wormhole solutions to an exterior Schwarzschild geometry, and analyse the stability of the thin-shell to linearized spherically symmetric perturbations around static solutions, by choosing suitable properties for the exotic material residing on the junction interface radius. Furthermore, we also consider an inhomogeneous generalization of the equation of state considered above and analyse the respective stability regions. In particular, we obtain a specific wormhole solution with an asymptotic behaviour corresponding to a global monopole.

  2. Minimal Technologies Application Project: Planning and installation

    SciTech Connect

    Zellmer, S.D.; Hinchman, R.R.; Severinghaus, W.D.; Johnson, D.O.; Brent, J.J.

    1989-03-01

    Intensive and continuous tactical training during the last 35 years at the Hohenfels Training Area in West Germany has caused the loss of vegetative ground cover and has accelerated soil erosion rates, resulting in extensive environmental damage, safety hazards, and unrealistic training habitats. The objectives of this project are to develop and evaluate revegetation procedures for establishing adequate vegetative cover to control erosion at minimal cost and with minimal disruption to training activities. This project involved the development and installation of 12 revegetation procedures that combined four seedbed preparation methods and seeding options with three site-closure periods. In March 1987, the four seedbed preparation/seeding options and closure periods were selected, a study site design and location chosen, and specifications for the revegetation procedures developed. A German rehabilitation contractor attempted the specified seedbed preparation and seeding on the 13.5-ha site in June, but abnormally high rainfall, unusually wet site conditions, and lack of adequate equipment prevented the contractor from completing six of the 12 planned procedures. Planning and execution of the project have nonetheless provided valuable information on the importance and use of soil analytical results, seed availability and cost data, contractor equipment requirements, and time required for planning future revegetation efforts. Continued monitoring of vegetative ground cover at the site for the next two years, combined with cost information, will provide the necessary data to determine which of the six revegetation procedures is the most effective. These data will be used in planning future rehabilitation efforts on tactical training areas.

  3. Minimal dark matter: model and results

    NASA Astrophysics Data System (ADS)

    Cirelli, Marco; Strumia, Alessandro

    2009-10-01

    We recap the main features of minimal dark matter (MDM) and assess its status in the light of recent experimental data. The theory selects an electroweak 5-plet with hypercharge Y=0 as a fully successful DM candidate, automatically stable against decay and with no free parameters: DM is a fermion with a 9.6 TeV mass. The direct detection cross-section, predicted to be 10⁻⁴⁴ cm², is within reach of next-generation experiments. DM is accompanied by a charged fermion 166 MeV heavier: we discuss how it might manifest. Thanks to an electroweak Sommerfeld enhancement of more than 2 orders of magnitude, DM annihilations into W⁺W⁻ give, in the presence of a modest astrophysical boost factor, an e⁺ flux compatible with the PAMELA excess (but not with the ATIC hint for a peak: MDM instead predicts a quasi-power-law spectrum), an antiproton (p̄) flux concentrated at energies above 100 GeV, and photon fluxes comparable with present limits, depending on the DM density profile.

  4. FPGA design for constrained energy minimization

    NASA Astrophysics Data System (ADS)

    Wang, Jianwei; Chang, Chein-I.; Cao, Mang

    2004-02-01

    The Constrained Energy Minimization (CEM) has been widely used for hyperspectral detection and classification. The feasibility of implementing the CEM as a real-time processing algorithm in systolic arrays has also been demonstrated. The main challenge of realizing the CEM in hardware architecture is the computation of the inverse of the data correlation matrix performed in the CEM, which requires a complete set of data samples. In order to cope with this problem, the data correlation matrix must be calculated in a causal manner, using only the data samples received up to the sample being processed. This paper presents a Field Programmable Gate Array (FPGA) design of such a causal CEM. The main feature of the proposed FPGA design is its use of the COordinate Rotation DIgital Computer (CORDIC) algorithm, which can convert a Givens rotation of a vector into a set of shift-add operations. As a result, the CORDIC algorithm can be easily implemented in hardware architecture, and therefore in an FPGA. Since the computation of the inverse of the data correlation matrix involves a series of Givens rotations, the CORDIC algorithm allows the causal CEM to perform real-time processing in an FPGA. In this paper, an FPGA implementation of the causal CEM is studied and its detailed architecture is described.
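
    The CORDIC idea the abstract relies on, realizing a Givens rotation as a sequence of shift-and-add micro-rotations, can be sketched as follows. This is a generic textbook CORDIC in rotation mode (floating point here, fixed-point shifts in actual hardware); the iteration count and gain handling are generic choices, not details of the paper's FPGA design.

```python
# Generic CORDIC rotation-mode sketch: rotate (x, y) by theta using only
# add/subtract and scaling by powers of two (shifts in hardware).
import math

ITER = 24
ANGLES = [math.atan(2.0 ** -i) for i in range(ITER)]
GAIN = 1.0
for a in ANGLES:
    GAIN *= math.cos(a)  # CORDIC scale correction, applied once at the end

def cordic_rotate(x, y, theta):
    """Givens rotation of (x, y) by theta via CORDIC micro-rotations."""
    for i in range(ITER):
        d = 1.0 if theta >= 0 else -1.0
        # each step rotates by ±atan(2^-i); the multiply by 2^-i is a shift
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        theta -= d * ANGLES[i]
    return x * GAIN, y * GAIN

if __name__ == "__main__":
    c, s = cordic_rotate(1.0, 0.0, math.pi / 5)
    print(round(c, 6), round(s, 6))  # approximately cos(36°), sin(36°)
```

    Rotating the unit vector (1, 0) recovers (cos θ, sin θ) to roughly 2⁻²⁴ accuracy, which is why a chain of such rotations can triangularize the correlation matrix without hardware multipliers.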

  5. Flavor mixing democracy and minimal CP violation

    NASA Astrophysics Data System (ADS)

    Gerard, Jean-Marc; Xing, Zhi-zhong

    2012-06-01

    We point out that there is a unique parametrization of quark flavor mixing in which every angle is close to the Cabibbo angle θC ≃ 13° with the CP-violating phase ϕq around 1°, implying that they might all be related to the strong hierarchy among quark masses. Applying the same parametrization to lepton flavor mixing, we find that all three mixing angles are comparably large (around π/4) and the Dirac CP-violating phase ϕl is also minimal as compared with its values in the other eight possible parametrizations. In this spirit, we propose a simple neutrino mixing ansatz which is equivalent to the tri-bimaximal flavor mixing pattern in the ϕl → 0 limit and predicts sin θ13 = (1/√2) sin(ϕl/2) for reactor antineutrino oscillations. Hence the Jarlskog invariant of leptonic CP violation, Jl = (sin ϕl)/12, can reach a few percent if θ13 lies in the range 7° ⩽ θ13 ⩽ 10°.
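
    The two relations quoted in this abstract, sin θ13 = (1/√2) sin(ϕl/2) and Jl = (sin ϕl)/12, can be checked numerically over the stated range 7° ⩽ θ13 ⩽ 10°; the helper names below are ours, not the paper's.

```python
# Numerical check of sin(theta13) = sin(phi_l/2)/sqrt(2) and
# J_l = sin(phi_l)/12 over the reactor range quoted in the abstract.
import math

def phi_l(theta13_deg):
    """Invert sin(theta13) = sin(phi_l/2)/sqrt(2) for phi_l (degrees)."""
    s = math.sqrt(2.0) * math.sin(math.radians(theta13_deg))
    return 2.0 * math.degrees(math.asin(s))

def jarlskog(theta13_deg):
    """J_l = sin(phi_l)/12 as a function of theta13."""
    return math.sin(math.radians(phi_l(theta13_deg))) / 12.0

if __name__ == "__main__":
    for t in (7.0, 10.0):
        print(f"theta13 = {t}°: phi_l ≈ {phi_l(t):.1f}°, J_l ≈ {jarlskog(t):.3f}")
```

    For θ13 between 7° and 10° this gives ϕl of roughly 20° to 28° and Jl of roughly 0.028 to 0.040, consistent with the "few percent" claim.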

  6. Design and Demonstration of Minimal Lunar Base

    NASA Astrophysics Data System (ADS)

    Boche-Sauvan, L.; Foing, B. H.; Exohab Team

    2009-04-01

    Introduction: We propose a conceptual analysis of a first minimal lunar base, focusing on the system aspects and coordinating the different elements as parts of an evolving architecture [1-3]. We justify the case for a scientific outpost allowing experiments and laboratory sample analysis (relevant to the origin and evolution of the Earth, geophysical and geochemical studies of the Moon, life sciences, and observation from the Moon). Research: Research activities will be conducted with this first settlement in: science (of, from and on the Moon); exploration (robotic mobility, rover, drilling); and technology (communication, command, organisation, automatism). Life sciences: The life sciences aspects are considered through life support for a crew of 4 (habitat) and laboratory activity with biological experiments such as are performed on Earth or in LEO, but here without any magnetosphere protection and therefore with direct cosmic-ray and solar-particle effects. Moreover, the ability to study the lunar environment in the field will be a major asset before settling a permanent base [3-5]. Lunar environment: The lunar environment adds constraints to instrument specifications (vacuum, extreme temperature, regolith, seismic activity, micrometeorites). SMART-1 and other mission data will bring geometrical, chemical and physical details about the environment (soil material characteristics, on-surface conditions …). Test bench: To assess planetary technologies and operations preparing for human Mars exploration. Lunar outpost predesign modular concept: To allow a human presence on the Moon and to carry out these experiments, we give a pre-design of a minimal human lunar base. Through a modular concept, this base can evolve into a long-duration or permanent base. We analyse the possibilities of settling such a minimal base by means of current and near-term propulsion technology, such as a full Ariane 5 ME carrying 1.7 T of gross payload to the surface of the Moon.

  7. Innovations in minimally invasive mitral valve repair.

    PubMed

    Sündermann, Simon H; Seeburger, Joerg; Scherman, Jacques; Mohr, Friedrich Wilhelm; Falk, Volkmar

    2012-12-01

    Mitral valve (MV) insufficiency is the second most common heart valve disease represented in cardiac surgery. The gold standard therapy is surgical repair of the valve. Today, most centers prefer a minimally invasive approach through a right-sided mini-thoracotomy. Despite the small access, there is still the need to use cardiopulmonary bypass (CPB), and the operation has to be performed on the arrested heart. New devices have been developed to optimize the results of surgical repair by implementing mechanisms for post-implantation adjustment on the beating heart or the avoidance of CPB. Early attempts with adjustable mitral annuloplasty rings go back to the early 1990s. Only a few devices are available on the market. Recently, a mitral valve adjustable annuloplasty ring was CE-marked and is under further clinical investigation. In addition, a sutureless annuloplasty band to be implanted on the beating heart is under preclinical and initial clinical investigation for transatrial and transfemoral transcatheter implantation. Furthermore, new neochord systems are being developed, which allow for functional length adjustment on the beating heart after implantation. Some devices were developed for percutaneous MV repair implanted into the coronary sinus to reshape the posterior MV annulus. Other percutaneous devices are directly fixed to the posterior annulus to alter its shape. Several disadvantages have been observed preventing a broad clinical use of some of these devices. There is a continuous effort to develop innovative techniques to optimize MV repair and to decrease invasiveness. PMID:23315719

  8. Subjective loudness of "minimized" sonic boom waveforms.

    PubMed

    Niedzwiecki, A; Ribner, H S

    1978-12-01

    For very long supersonic aircraft the "midfield" sonic boom signature may not have evolved fully into an N wave at ground level. Thus in current boom minimization techniques the shape of the aircraft may be tailored to optimize this midfield waveform for reduced subjective loudness. The present investigation tests a family of "flat-top" waveforms cited by Darden: all but one have a front shock height (Δp_SH) less than the peak amplitude (Δp_MAX). For equal subjective loudness, "flat-top" vs N wave (peak overpressure Δp_N), the peak amplitude of the "flat-top" signature was found to be substantially higher than that of the N wave; thus for equal peak amplitude the "flat-top" signature was quieter. The results for equal loudness were well fitted by an empirical law Δp_SH + 0.11 Δp_MAX = Δp_N; the equivalence shows how the front shock amplitude (Δp_SH) dominates the loudness. All this was found compatible with predictions by the method of Johnson and Robinson. PMID:739097
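
    The empirical equal-loudness law quoted in this abstract, Δp_SH + 0.11 Δp_MAX = Δp_N, is easy to apply directly; the numbers below are illustrative, not values from the study.

```python
# Equal-loudness law from the abstract: a flat-top signature with front
# shock dp_sh and peak dp_max is as loud as an N wave of overpressure
# dp_sh + 0.11 * dp_max. Example values are illustrative assumptions.
def equivalent_n_wave(dp_sh, dp_max):
    """N-wave overpressure judged equally loud as the flat-top waveform."""
    return dp_sh + 0.11 * dp_max

if __name__ == "__main__":
    # A flat-top wave with a 1.0 front shock and 1.5 peak is only as loud
    # as an N wave of 1.165 (same pressure units): the front shock, with
    # its coefficient of 1 versus 0.11, dominates the loudness.
    print(equivalent_n_wave(1.0, 1.5))
```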

  9. Digital breast tomosynthesis with minimal breast compression

    NASA Astrophysics Data System (ADS)

    Scaduto, David A.; Yang, Min; Ripton-Snyder, Jennifer; Fisher, Paul R.; Zhao, Wei

    2015-03-01

    Breast compression is utilized in mammography to improve image quality and reduce radiation dose. Lesion conspicuity is improved by reducing scatter effects on contrast and by reducing the superposition of tissue structures. However, patient discomfort due to breast compression has been cited as a potential cause of noncompliance with recommended screening practices. Further, compression may also occlude blood flow in the breast, complicating imaging with intravenous contrast agents and preventing accurate quantification of contrast enhancement and kinetics. Previous studies have investigated reducing breast compression in planar mammography and digital breast tomosynthesis (DBT), though this typically comes at the expense of degradation in image quality or increase in mean glandular dose (MGD). We propose to optimize the image acquisition technique for reduced compression in DBT without compromising image quality or increasing MGD. A zero-frequency signal-difference-to-noise ratio model is employed to investigate the relationship between tube potential, SDNR and MGD. Phantom and patient images are acquired on a prototype DBT system using the optimized imaging parameters and are assessed for image quality and lesion conspicuity. A preliminary assessment of patient motion during DBT with minimal compression is presented.

  10. Minimal modeling of the extratropical general circulation

    NASA Technical Reports Server (NTRS)

    O'Brien, Enda; Branscome, Lee E.

    1989-01-01

    The ability of low-order, two-layer models to reproduce basic features of the mid-latitude general circulation is investigated. Changes in model behavior with increased spectral resolution are examined in detail. Qualitatively correct time-mean heat and momentum balances are achieved in a beta-plane channel model which includes the first and third meridional modes. This minimal resolution also reproduces qualitatively realistic surface and upper-level winds and mean meridional circulations. Higher meridional resolution does not result in substantial changes in the latitudinal structure of the circulation. A qualitatively correct kinetic energy spectrum is produced when the resolution is high enough to include several linearly stable modes. A model with three zonal waves and the first three meridional modes has a reasonable energy spectrum and energy conversion cycle, while also satisfying heat and momentum budget requirements. This truncation reproduces the basic mechanisms and zonal circulation features that are obtained at higher resolution. The model performance improves gradually with higher resolution and is smoothly dependent on changes in external parameters.

  11. Minimal model for tag-based cooperation

    NASA Astrophysics Data System (ADS)

    Traulsen, Arne; Schuster, Heinz Georg

    2003-10-01

    Recently, Riolo et al. [Nature (London) 414, 441 (2001)] showed by computer simulations that cooperation can arise without reciprocity when agents donate only to partners who are sufficiently similar to themselves. One striking outcome of their simulations was the observation that the number of tolerant agents that support a wide range of players was not constant in time, but showed characteristic fluctuations. The cause and robustness of these tides of tolerance remained to be explored. Here we clarify the situation by solving a minimal version of the model of Riolo et al. It allows us to identify a net surplus of random changes from intolerant to tolerant agents as a necessary mechanism that produces these oscillations of tolerance, which segregate different agents in time. This provides a new mechanism for maintaining different agents, i.e., for creating biodiversity. In our model the transition to the oscillating state is caused by a saddle node bifurcation. The frequency of the oscillations increases linearly with the transition rate from tolerant to intolerant agents.

  12. Hibernation and daily torpor minimize mammalian extinctions

    NASA Astrophysics Data System (ADS)

    Geiser, Fritz; Turbill, Christopher

    2009-10-01

    Small mammals appear to be less vulnerable to extinction than large species, but the underlying reasons are poorly understood. Here, we provide evidence that almost all (93.5%) of 61 recently extinct mammal species were homeothermic, maintaining a constant high body temperature and thus energy expenditure, which demands a high intake of food, long foraging times, and thus exposure to predators. In contrast, only 6.5% of extinct mammals were likely heterothermic and employed multi-day torpor (hibernation) or daily torpor, even though torpor is widespread within more than half of all mammalian orders. Torpor is characterized by substantial reductions of body temperature and energy expenditure and enhances survival during adverse conditions by minimizing food and water requirements, and consequently reduces foraging requirements and exposure to predators. Moreover, because life span is generally longer in heterothermic mammals than in related homeotherms, heterotherms can employ a ‘sit-and-wait’ strategy to withstand adverse periods and then repopulate when circumstances improve. Thus, torpor is a crucial but hitherto unappreciated attribute of small mammals for avoiding extinction. Many opportunistic heterothermic species, because of their plastic energetic requirements, may also stand a better chance of future survival than homeothermic species in the face of greater climatic extremes and changes in environmental conditions caused by global warming.

  13. Minimal change glomerulopathy in a cat.

    PubMed

    Backlund, Brianna; Cianciolo, Rachel E; Cook, Audrey K; Clubb, Fred J; Lees, George E

    2011-04-01

    A 6-year-old domestic shorthair male castrated cat was evaluated for sudden onset of vomiting and anorexia. A diagnosis of hypereosinophilic syndrome (HES) was made, and the cat was treated with imatinib mesylate. The cat had an initial clinical improvement with normalization of the peripheral eosinophil count. After approximately 8 weeks of treatment, lethargy and anorexia recurred despite the normal eosinophil count, and a significant proteinuric nephropathy was identified. Treatment with imatinib was discontinued. Ultrasound-guided renal biopsies exhibited histologic, ultrastructural, and immunostaining changes indicative of a minimal change glomerulopathy (MCG), which had not previously been reported in a cat. The proteinuria and HES initially improved while the cat was treated with more traditional medications; however, both problems persisted throughout the 30 months that the cat was subsequently followed. Previous studies demonstrating the safety and efficacy of imatinib in cats do not report any glomerular injury or significant adverse drug reactions, and the exact cause of this cat's proteinuric nephropathy is uncertain. Nonetheless, the possibility of an adverse drug reaction causing proteinuria should be considered when initiating treatment with imatinib in a cat. PMID:21414552

  14. Transdermal Photopolymerization for Minimally Invasive Implantation

    NASA Astrophysics Data System (ADS)

    Elisseeff, J.; Anseth, K.; Sims, D.; McIntosh, W.; Randolph, M.; Langer, R.

    1999-03-01

    Photopolymerizations are widely used in medicine to create polymer networks for use in applications such as bone restorations and coatings for artificial implants. These photopolymerizations occur by directly exposing materials to light in "open" environments such as the oral cavity or during invasive procedures such as surgery. We hypothesized that light, which penetrates tissue including skin, could cause a photopolymerization indirectly. Liquid materials then could be injected s.c. and solidified by exposing the exterior surface of the skin to light. To test this hypothesis, the penetration of UVA and visible light through skin was studied. Modeling predicted the feasibility of transdermal polymerization, with only 2 min of light exposure required to photopolymerize an implant underneath human skin. To establish the validity of these modeling studies, transdermal photopolymerization first was applied to tissue engineering by using "injectable" cartilage as a model system. Polymer/chondrocyte constructs were injected s.c. and transdermally photopolymerized. Implants harvested at 2, 4, and 7 weeks demonstrated collagen and proteoglycan production and histology with tissue structure comparable to native neocartilage. To further examine this phenomenon and test the applicability of transdermal photopolymerization for drug release devices, albumin, a model protein, was released for 1 week from photopolymerized hydrogels. With further study, transdermal photopolymerization potentially could be used to create a variety of new, minimally invasive surgical procedures in applications ranging from plastic and orthopedic surgery to tissue engineering and drug delivery.

  15. Bacterial Stressors in Minimally Processed Food

    PubMed Central

    Capozzi, Vittorio; Fiocco, Daniela; Amodio, Maria Luisa; Gallone, Anna; Spano, Giuseppe

    2009-01-01

    Stress responses are of particular importance to microorganisms, because their habitats are subjected to continual changes in temperature, osmotic pressure, and nutrient availability. Stressors (and stress factors) may be of a chemical, physical, or biological nature. While stress to microorganisms is frequently caused by the surrounding environment, the growth of microbial cells can itself also induce some kinds of stress, such as starvation and acidity. During production of fresh-cut produce, cumulative mild processing steps are employed to control the growth of microorganisms. Pathogens on plant surfaces are already stressed, and stress may be increased during the multiple mild processing steps, potentially leading to very hardy bacteria geared towards enhanced survival. Cross-protection can occur because overlapping stress responses enable bacteria exposed to one stress to become resistant to another stress. A number of stresses have been shown to induce cross-protection, including heat, cold, acid and osmotic stress. Among other factors, adaptation to heat stress appears to provide bacterial cells with more pronounced cross-protection against several other stresses. Understanding how pathogens sense and respond to mild stresses is essential in order to design safe and effective minimal processing regimes. PMID:19742126

  16. Hazardous waste minimization report for CY 1986

    SciTech Connect

    Kendrick, C.M.

    1990-12-01

    Oak Ridge National Laboratory (ORNL) is a multipurpose research and development facility. Its primary role is the support of energy technology through applied research and engineering development and scientific research in basic and physical sciences. ORNL also is a valuable resource in the solution of problems of national importance, such as nuclear and chemical waste management. In addition, useful radioactive and stable isotopes which are unavailable from the private sector are produced at ORNL. As a result of these activities, hazardous, radioactive, and mixed wastes are generated at ORNL. A formal hazardous waste minimization program for ORNL was launched in mid 1985 in response to the requirements of Section 3002 of the Resource Conservation and Recovery Act (RCRA). During 1986, a task plan was developed. The six major tasks include: planning and implementation of a laboratory-wide chemical inventory and the subsequent distribution, treatment, storage, and/or disposal (TSD) of unneeded chemicals; establishment and implementation of a distribution system for surplus chemicals to other (internal and external) organizations; training and communication functions necessary to inform and motivate laboratory personnel; evaluation of current procurement and tracking systems for hazardous materials and recommendation and implementation of improvements; systematic review of applicable current and proposed ORNL procedures and ongoing and proposed activities for waste volume and/or toxicity reduction potential; and establishment of criteria by which to measure progress and reporting of significant achievements. 8 refs., 1 fig., 5 tabs.

  17. Minimizing or eliminating refueling of nuclear reactor

    DOEpatents

    Doncals, Richard A.; Paik, Nam-Chin; Andre, Sandra V.; Porter, Charles A.; Rathbun, Roy W.; Schwallie, Ambrose L.; Petras, Diane S.

    1989-01-01

    Demand for refueling of a liquid metal fast nuclear reactor having a life of 30 years is eliminated, or reduced to intervals of at least 10 years, by operating the reactor at a low linear-power density, typically 2.5 kW/ft of fuel rod, rather than the 7.5 or 15 kW/ft of prior art practice. So that power of the same magnitude as for prior art reactors is produced, the volume of the core is increased. In addition, the height of the core and its diameter are dimensioned so that the ratio of the height to the diameter approximates 1 to the extent practicable, considering the requirements of control and of keeping the pressure drop in the coolant from becoming excessive. The surface area of a cylinder of given volume is a minimum if the ratio of the height to the diameter is 1. By minimizing the surface area, the leakage of neutrons is reduced. By reducing the linear-power density, increasing core volume, reducing fissile enrichment and optimizing core geometry, internal-core breeding of fissionable fuel is substantially enhanced. As a result, core operational life, limited by control worth requirements and fuel burnup capability, is extended up to 30 years of continuous power operation.
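The geometric claim in the abstract (a closed cylinder of fixed volume has minimal surface area when its height equals its diameter) is easy to verify numerically. A minimal Python sketch, illustrative only and not part of the patent:

```python
import math

def cylinder_surface(h, volume):
    """Total surface area of a closed cylinder with the given height and volume."""
    r = math.sqrt(volume / (math.pi * h))  # radius implied by the fixed volume
    return 2 * math.pi * r * r + 2 * math.pi * r * h

# Grid-search the height that minimizes surface area for a unit volume.
volume = 1.0
heights = [0.01 * k for k in range(1, 500)]
best_h = min(heights, key=lambda h: cylinder_surface(h, volume))
best_r = math.sqrt(volume / (math.pi * best_h))

# At the optimum, height is approximately equal to the diameter: h / (2r) is close to 1.
print(best_h / (2 * best_r))
```

Calculus confirms the grid search: with A = 2πr² + 2V/r, setting dA/dr = 0 gives r³ = V/(2π), hence h = V/(πr²) = 2r, i.e. height equals diameter.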

  18. Minimal model for dark matter and unification

    SciTech Connect

    Mahbubani, Rakhi; Senatore, Leonardo

    2006-02-15

    Gauge coupling unification and the success of TeV-scale weakly-interacting dark matter are usually taken as evidence of low-energy supersymmetry (SUSY). However, if we assume that the tuning of the Higgs can be explained in some unnatural way, from environmental considerations for example, SUSY is no longer a necessary component of any beyond the standard model theory. In this paper we study the minimal model with a dark matter candidate and gauge coupling unification. This consists of the standard model plus fermions with the quantum numbers of SUSY Higgsinos, and a singlet. It predicts thermal dark matter with a mass that can range from 100 GeV to around 2 TeV and generically gives rise to an electric dipole moment (EDM) that is just beyond current experimental limits, with a large portion of its allowed parameter space accessible to next-generation EDM and direct detection experiments. We study precision unification in this model by embedding it in a 5D orbifold GUT where certain large threshold corrections are calculable, achieving gauge coupling and b-{tau} unification, and predicting a rate of proton decay just beyond current limits.

  19. The Design of MACs (Minimal Actin Cortices)

    PubMed Central

    Vogel, Sven K; Heinemann, Fabian; Chwastek, Grzegorz; Schwille, Petra

    2013-01-01

    The actin cell cortex in eukaryotic cells is a key player in controlling and maintaining the shape of cells, and in driving major shape changes such as in cytokinesis. It is thereby constantly being remodeled. Cell shape changes require forces acting on membranes that are generated by the interplay of membrane-coupled actin filaments and assemblies of myosin motors. Little is known about how their interaction regulates actin cell cortex remodeling and cell shape changes. Because of the vital importance of actin, myosin motors and the cell membrane, selective in vivo experiments and manipulations are often difficult to perform or not feasible. Thus, the intelligent design of minimal in vitro systems for actin-myosin-membrane interactions could pave the way for investigating actin cell cortex mechanics in a detailed and quantitative manner. Here, we present and discuss the design of several bottom-up in vitro systems accomplishing the coupling of actin filaments to artificial membranes, where key parameters such as actin densities and membrane properties can be varied in a controlled manner. Insights gained from these in vitro systems may help to uncover fundamental principles of how exactly actin-myosin-membrane interactions govern actin cortex remodeling and membrane properties for cell shape changes. © 2013 Wiley Periodicals, Inc. PMID:24039068

  20. [Advanced coronary artery surgery for minimally invasiveness].

    PubMed

    Yamaguchi, Shohjiro; Tomita, Shigeyuki; Watanabe, Go

    2008-07-01

    Since the development of drug-eluting stents, the conditions of coronary artery surgery have changed. The selection criteria for candidates for coronary artery bypass grafting (CABG) have become more stringent. In this era, surgeons should perform less invasive surgery to save such candidates. Off-pump coronary artery bypass (OPCAB) will become the gold standard surgical procedure for the treatment of ischemic heart disease. This paper describes how to perform less invasive OPCAB with some useful devices and points out the pitfalls of the standard procedure. We have also introduced robotic surgery using the DaVinci system. This procedure decreases the length of dermal incisions. Robotic surgery has other advantages compared with standard endoscopic surgery. The arm of the robot absorbs the vibrations of human hands, and the command function can scale down movement significantly. This arm has five joints, allowing the operator to manipulate the equipment easily inside the body. We have also performed awake CABG with high epidural anesthesia for minimally invasive surgery. This procedure is performed especially in patients with severe cerebrovascular disease and lung injury. In our institution, patients can be discharged only 5 days after this surgical procedure. Less invasive surgery will be the standard procedure in the future. PMID:18681162

  1. Minimal natural supersymmetry after the LHC8

    NASA Astrophysics Data System (ADS)

    Drees, Manuel; Kim, Jong Soo

    2016-05-01

    In this work, we present limits on natural supersymmetry scenarios based on searches in data taken during run 1 of the LHC. We consider a set of 22 000 model points in a six-dimensional parameter space. These scenarios are minimal in the sense of only keeping those superparticles relatively light that are required to cancel the leading quadratically divergent quantum corrections (from the top and QCD sector) to the Higgs mass in the Standard Model. The resulting mass spectra feature Higgsinos as the lightest supersymmetric particle, as well as relatively light third generation SU(2) doublet squarks, SU(2) singlet stops and gluinos, while assuming a Standard-Model-like Higgs boson. All remaining supersymmetric particles and Higgs bosons are assumed to be decoupled. We check each parameter set against a large number of LHC searches as implemented in the public code CheckMATE. These searches show a considerable degree of complementarity, i.e., in general, many searches have to be considered in order to check whether a given scenario is allowed. We delineate allowed and excluded regions in parameter space. For example, we find that all scenarios where either mt̃1 < 230 GeV or mg̃ < 440 GeV are clearly excluded, while all model points where mt̃1 > 660 GeV and mg̃ > 1180 GeV remain allowed.

  2. Disk Acceleration Experiment Utilizing Minimal Material (DAXUMM)

    NASA Astrophysics Data System (ADS)

    Biss, Matthew; Lorenz, Thomas; Sutherland, Gerrit

    2015-06-01

    A venture between the US Army Research Laboratory (ARL) and Lawrence Livermore National Laboratory (LLNL) is currently underway in an effort to characterize novel energetic material performance properties using a single, high-precision, gram-range charge. A nearly all-inclusive characterization experiment is proposed by combining LLNL's disk acceleration experiment (DAX) with the ARL explosive evaluation utilizing minimal material (AXEUMM) experiment. Spherical-cap charges fitted with a flat circular metal disk are centrally initiated using an exploding bridgewire detonator, while photonic Doppler velocimetry is used to probe the metal disk surface velocity and measure its temporal history. The metal disk's jump-off velocity measurement is combined with conservation equations, material Hugoniots, and select empirical relationships to determine performance properties of the detonation wave (i.e., velocity, pressure, particle velocity, and density). Using the temporal velocity history with the numerical hydrocode CTH, a determination of the energetic material's equation of state and material expansion energy is possible. Initial experimental and computational results for the plastic-bonded energetic formulation PBXN-5 are presented.
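The conservation step described in this abstract can be illustrated with the standard Rankine-Hugoniot momentum relation, P = ρ0·D·u_p, combined with the common free-surface approximation that the measured jump-off velocity is about twice the in-material particle velocity. This is a hedged sketch of that textbook relation, not the ARL/LLNL analysis; the input numbers are illustrative placeholders, not measured PBXN-5 data.

```python
def detonation_pressure(rho0, det_velocity, jump_off_velocity):
    """Detonation pressure (Pa) from the Rankine-Hugoniot momentum relation
    P = rho0 * D * u_p, with the free-surface approximation u_p = u_jump / 2.
    rho0 in kg/m^3, velocities in m/s."""
    u_p = jump_off_velocity / 2.0  # free-surface (velocity-doubling) approximation
    return rho0 * det_velocity * u_p

# Illustrative placeholder inputs: initial density 1800 kg/m^3,
# detonation velocity 8000 m/s, measured disk jump-off velocity 4000 m/s.
p = detonation_pressure(1800.0, 8000.0, 4000.0)
print(p / 1e9)  # pressure in GPa
```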

  3. Minimally Invasive Procedures for Nasal Aesthetics

    PubMed Central

    Redaelli, Alessio; Limardo, Pietro

    2012-01-01

    The nose has an important role in the aesthetics of the face. It is easy to understand the reason for the major interest that has revolved around the correction of its imperfections for several centuries, or even since ancient times. In the last decade, all the surgical and medical minimally invasive techniques have evolved exponentially. The techniques of rejuvenation and correction of nasal imperfections did not escape this development, which is now widespread in the medicine of the third millennium. In many cases, the techniques of surgical correction involve invasive procedures that necessitate, in the majority of cases, hospitalisation. The author, using a different approach, has developed mini-invasive techniques using botulinum toxin A (BTxA) and absorbable fillers for the correction of nasal imperfections. BTxA makes it possible to reduce the imperfections due to hypertension of muscles, while the absorbable fillers allow correction of all the imperfections of the nasal profile from the root to the tip in total safety. The correction is based on precise rules that allow the majority of side effects to be avoided. Results are long lasting and well appreciated by patients. PMID:23060706

  4. Minimally invasive knee arthroplasty: An overview

    PubMed Central

    Tria, Alfred J; Scuderi, Giles R

    2015-01-01

    Minimally invasive surgery (MIS) for arthroplasty of the knee began with surgery for unicondylar knee arthroplasty (UKA). Partial knee replacements were designed in the 1970s and were amenable to a more limited exposure. In the 1990s Repicci popularized the MIS for UKA. Surgeons began to apply his concepts to total knee arthroplasty. Four MIS surgical techniques were developed: quadriceps sparing, mini-mid vastus, mini-subvastus, and mini-medial parapatellar. The quadriceps sparing technique is the most limited one and is also the most difficult. However, it is the least invasive and allows rapid recovery. The mini-midvastus is the most common technique because it affords slightly better exposure and can be extended. The mini-subvastus technique entirely avoids incising the quadriceps extensor mechanism but is time consuming and difficult in the obese and in the muscular male patient. The mini-parapatellar technique is most familiar to surgeons and represents a good starting point for surgeons who are learning the techniques. The surgeries are easier with smaller instruments but can be performed with standard ones. The techniques are accurate and do lead to a more rapid recovery, with less pain, less blood loss, and greater motion if they are appropriately performed. PMID:26601062

  5. [Minimally Invasive Thoracoscopic Surgery for Mediastinal Lesions].

    PubMed

    Maeda, Sumiko

    2016-07-01

    This review article describes minimally invasive thoracoscopic surgery for anterior mediastinal lesions. Over the past couple of decades, the operative procedures for anterior mediastinal lesions have changed from open surgery under median sternotomy to complete thoracoscopic mediastinal surgery with sternal lifting or carbon dioxide insufflation. Carbon dioxide insufflation of the thoracic cavity or the mediastinum is now prevalent, as it improves the surgical field and facilitates the operative procedures. Surgical indications for complete thoracoscopic mediastinal surgery include benign cystic lesions, generally regardless of their size, and non-invasive anterior mediastinal tumors usually less than 50-60 mm in the greatest dimension. There are currently three surgical approaches in complete thoracoscopic surgery for anterior mediastinal lesions. One is the unilateral or bilateral transthoracic approach. The second is the combination of the subxiphoid and the transthoracic approach. The last is the subxiphoid approach. The selection of the surgical approach depends on the surgeon's preference and experience. When carbon dioxide insufflation is applied during the operation, the following complications may occur: hypercapnia, gas embolism, subcutaneous emphysema, endotracheal tube dislocation due to mediastinal shift, and hypotension. Special safety considerations are necessary during complete thoracoscopic mediastinal surgery with carbon dioxide insufflation. PMID:27440034

  6. Prochlorococcus: Advantages and Limits of Minimalism

    NASA Astrophysics Data System (ADS)

    Partensky, Frédéric; Garczarek, Laurence

    2010-01-01

    Prochlorococcus is the key phytoplanktonic organism of tropical gyres, large ocean regions that are depleted of the essential macronutrients needed for photosynthesis and cell growth. This cyanobacterium has adapted itself to oligotrophy by minimizing the resources necessary for life through a drastic reduction of cell and genome sizes. This rarely observed strategy in free-living organisms has conferred on Prochlorococcus a considerable advantage over other phototrophs, including its closest relative Synechococcus, for life in this vast yet little variable ecosystem. However, this strategy seems to reach its limits in the upper layer of the S Pacific gyre, the most oligotrophic region of the world ocean. By losing some important genes and/or functions during evolution, Prochlorococcus has seemingly become dependent on co-occurring microorganisms. In this review, we present some of the recent advances in the ecology, biology, and evolution of Prochlorococcus, which because of its ecological importance and tiny genome is rapidly imposing itself as a model organism in environmental microbiology.

  7. Prochlorococcus: advantages and limits of minimalism.

    PubMed

    Partensky, Frédéric; Garczarek, Laurence

    2010-01-01

    Prochlorococcus is the key phytoplanktonic organism of tropical gyres, large ocean regions that are depleted of the essential macronutrients needed for photosynthesis and cell growth. This cyanobacterium has adapted itself to oligotrophy by minimizing the resources necessary for life through a drastic reduction of cell and genome sizes. This rarely observed strategy in free-living organisms has conferred on Prochlorococcus a considerable advantage over other phototrophs, including its closest relative Synechococcus, for life in this vast yet little variable ecosystem. However, this strategy seems to reach its limits in the upper layer of the S Pacific gyre, the most oligotrophic region of the world ocean. By losing some important genes and/or functions during evolution, Prochlorococcus has seemingly become dependent on co-occurring microorganisms. In this review, we present some of the recent advances in the ecology, biology, and evolution of Prochlorococcus, which because of its ecological importance and tiny genome is rapidly imposing itself as a model organism in environmental microbiology. PMID:21141667

  8. Minimal flow units for magnetohydrodynamic turbulence

    NASA Astrophysics Data System (ADS)

    Orlandi, P.

    2016-08-01

    We present direct numerical simulations of two minimal flow units (MFUs) to investigate the differences between inviscid and viscous simulations, and the different behavior of the evolution for conducting fluids. In these circumstances the introduction of the Lorentz force in the momentum equation produces different scenarios. The Taylor–Green vortex was, in the past, an MFU widely considered for both conducting and non-conducting fluids. Those simulations were performed by pseudo-spectral numerical methods; they are repeated here using a second-order accurate, energy-conserving finite-difference scheme for ν = 0. Having observed that this initial condition could be inefficient for capturing the eventual occurrence of a finite-time singularity, a potentially more efficient MFU consisting of two interacting Lamb dipoles was considered. It was found that the two flows have a different time evolution in the vorticity-dominated stage. In this stage, turbulent structures of different sizes are generated, leading to spectra, under inviscid conditions, with a k^-3 range. Under real conditions the viscosity produces smaller scales characteristic of fully developed turbulence, with energy spectra exhibiting well defined exponential and inertial ranges. Under non-conducting conditions the passive vector behaves as the vorticity. The evolution is different under conducting conditions. Although the time evolution is different, both flows lead to spectra in Kolmogorov units with the same shape at high and intermediate wave numbers.

  9. Infrared dynamics of minimal walking technicolor

    SciTech Connect

    Del Debbio, Luigi; Lucini, Biagio; Patella, Agostino; Pica, Claudio; Rago, Antonio

    2010-07-01

    We study the gauge sector of minimal walking technicolor, which is an SU(2) gauge theory with n{sub f}=2 flavors of Wilson fermions in the adjoint representation. Numerical simulations are performed on lattices N{sub t}xN{sub s}{sup 3}, with N{sub s} ranging from 8 to 16 and N{sub t}=2N{sub s}, at fixed {beta}=2.25, and varying the fermion bare mass m{sub 0}, so that our numerical results cover the full range of fermion masses from the quenched region to the chiral limit. We present results for the string tension and the glueball spectrum. A comparison of mesonic and gluonic observables leads to the conclusion that the infrared dynamics is given by an SU(2) pure Yang-Mills theory with a typical energy scale for the spectrum sliding to zero with the fermion mass. The typical mesonic mass scale is proportional to and much larger than this gluonic scale. Our findings are compatible with a scenario in which the massless theory is conformal in the infrared. An analysis of the scaling of the string tension with the fermion mass toward the massless limit allows us to extract the chiral condensate anomalous dimension {gamma}{sub *}, which is found to be {gamma}{sub *}=0.22{+-}0.06.

  10. Waste minimization and pollution prevention awareness plan

    SciTech Connect

    1994-08-01

    The primary mission of DOE/NV is to manage and operate the Nevada Test Site (NTS) and other designated test locations, within and outside the United States; provide facilities and services to DOE and non-DOE NTS users; and plan, coordinate, and execute nuclear weapons tests and related test activities. DOE/NV also: (a) supports operations under interagency agreements pertaining to tests, emergencies, and related functions/activities; (b) plans, coordinates, and executes environmental restoration; (c) provides support to the Yucca Mountain Site Characterization Project Office in conjunction with DOE/HQ oversight; (d) manages the Radioactive Waste Management Sites (RWMS) for disposal of low-level and mixed wastes received from the NTS and off-site generators; and (e) implements waste minimization programs to reduce the amount of hazardous, mixed, radioactive, and nonhazardous solid waste that is generated and disposed of. The NTS, which is the primary facility controlled by DOE/NV, occupies 1,350 square miles of restricted-access, federally-owned land located in Nye County in southern Nevada. The NTS is located in a sparsely populated area, approximately 65 miles northwest of Las Vegas, Nevada.

  11. On Ramsey (P3, P6)-minimal graphs

    NASA Astrophysics Data System (ADS)

    Rahmadani, Desi; Baskoro, Edy Tri; Assiyatun, Hilda

    2016-02-01

    Finding all Ramsey (G, H)-minimal graphs for a given pair of graphs G and H is an interesting and difficult problem, even for small graphs G and H. In this paper, we determine some Ramsey (P3, P6)-minimal graphs of small order. We also characterize all such Ramsey minimal graphs of order 6 by their degree sequences. We prove that Ramsey (P3, P6)-minimal graphs have diameter at least two. We also construct an infinite class of trees [6] which provides Ramsey (P3, P6)-minimal graphs.
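The arrowing relation behind Ramsey minimality, F → (G, H), means that every red/blue coloring of the edges of F contains a red copy of G or a blue copy of H; a Ramsey (G, H)-minimal graph arrows the pair while no proper subgraph does. For small path targets this can be checked by brute force. The sketch below is an illustrative aid, not code from the paper; the example graph (P6 itself) is chosen because the answer is easy to see by hand.

```python
from itertools import product

def has_path(edges, k):
    """True if the graph given by `edges` contains a path on k vertices."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)

    def dfs(v, visited):
        if len(visited) == k:
            return True
        return any(dfs(w, visited | {w})
                   for w in adj.get(v, ()) if w not in visited)

    return any(dfs(v, {v}) for v in adj)

def arrows(edges, g_len, h_len):
    """True if every red/blue coloring of `edges` contains a red path
    on g_len vertices or a blue path on h_len vertices."""
    edges = list(edges)
    for colors in product((0, 1), repeat=len(edges)):
        red = [e for e, c in zip(edges, colors) if c == 0]
        blue = [e for e, c in zip(edges, colors) if c == 1]
        if not (has_path(red, g_len) or has_path(blue, h_len)):
            return False
    return True

# P6 itself does not arrow (P3, P6): coloring a single edge red destroys
# every blue P6 while a lone red edge is not yet a P3.
p6 = [(i, i + 1) for i in range(5)]
print(arrows(p6, 3, 6))  # → False
```

Exhaustive coloring is only feasible for small graphs (2^|E| colorings), which matches the paper's focus on minimal graphs of small order.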

  12. Methodology for Validating Building Energy Analysis Simulations

    SciTech Connect

    Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  13. Sonic Boom Mitigation Through Aircraft Design and Adjoint Methodology

    NASA Technical Reports Server (NTRS)

    Rallabhandi, Sriram K.; Diskin, Boris; Nielsen, Eric J.

    2012-01-01

    This paper presents a novel approach to the design of the supersonic aircraft outer mold line (OML) by optimizing the A-weighted loudness of the sonic boom signature predicted on the ground. The optimization process uses the sensitivity information obtained by coupling the discrete adjoint formulations for the augmented Burgers equation and the Computational Fluid Dynamics (CFD) equations. This coupled formulation links the loudness of the ground boom signature to the aircraft geometry, thus allowing efficient shape optimization for the purpose of minimizing loudness. The accuracy of the adjoint-based sensitivities is verified against sensitivities obtained using an independent complex-variable approach. The adjoint-based optimization methodology is applied to a configuration previously optimized using alternative state-of-the-art optimization methods and produces additional loudness reduction. The results of the optimizations are reported and discussed.
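The complex-variable approach mentioned in this abstract (commonly called the complex-step method) evaluates a real-analytic function at x + ih and takes Im f / h, yielding derivatives free of subtractive cancellation even for tiny step sizes. A self-contained sketch of the general technique, not the NASA implementation:

```python
import cmath
import math

def complex_step_derivative(f, x, h=1e-30):
    """f'(x) via the complex-step method: Im(f(x + i*h)) / h.
    No subtraction of nearly equal numbers, so h can be made tiny."""
    return f(complex(x, h)).imag / h

# Example: f(x) = x * sin(x), with exact derivative sin(x) + x * cos(x).
f = lambda z: z * cmath.sin(z)
x0 = 0.7
approx = complex_step_derivative(f, x0)
exact = math.sin(x0) + x0 * math.cos(x0)
print(abs(approx - exact))  # agrees to machine precision
```

Because the method is accurate to machine precision, it is a natural independent check on adjoint-based sensitivities, which is exactly the verification role it plays in the paper.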

  14. Comparing open and minimally invasive surgical procedures for oesophagectomy in the treatment of cancer: the ROMIO (Randomised Oesophagectomy: Minimally Invasive or Open) feasibility study and pilot trial.

    PubMed Central

    Metcalfe, Chris; Avery, Kerry; Berrisford, Richard; Barham, Paul; Noble, Sian M; Fernandez, Aida Moure; Hanna, George; Goldin, Robert; Elliott, Jackie; Wheatley, Timothy; Sanders, Grant; Hollowood, Andrew; Falk, Stephen; Titcomb, Dan; Streets, Christopher; Donovan, Jenny L; Blazeby, Jane M

    2016-01-01

    BACKGROUND Localised oesophageal cancer can be curatively treated with surgery (oesophagectomy) but the procedure is complex with a risk of complications, negative effects on quality of life and a recovery period of 6-9 months. Minimal-access surgery may accelerate recovery. OBJECTIVES The ROMIO (Randomised Oesophagectomy: Minimally Invasive or Open) study aimed to establish the feasibility of, and methodology for, a definitive trial comparing minimally invasive and open surgery for oesophagectomy. Objectives were to quantify the number of eligible patients in a pilot trial; develop surgical manuals as the basis for quality assurance; standardise pathological processing; establish a method to blind patients to their allocation in the first week post surgery; identify measures of postsurgical outcome of importance to patients and clinicians; and establish the main cost differences between the surgical approaches. DESIGN Pilot parallel three-arm randomised controlled trial nested within feasibility work. SETTING Two UK NHS departments of upper gastrointestinal surgery. PARTICIPANTS Patients aged ≥ 18 years with histopathological evidence of oesophageal or oesophagogastric junctional adenocarcinoma, squamous cell cancer or high-grade dysplasia, referred for oesophagectomy or oesophagectomy following neoadjuvant chemo(radio)therapy. INTERVENTIONS Oesophagectomy, with patients randomised to open surgery, a hybrid open chest and minimally invasive abdomen or totally minimally invasive access. MAIN OUTCOME MEASURE The primary outcome measure for the pilot trial was the number of patients recruited per month, with the main trial considered feasible if at least 2.5 patients per month were recruited. RESULTS During 21 months of recruitment, 263 patients were assessed for eligibility; of these, 135 (51%) were found to be eligible and 104 (77%) agreed to participate, an average of five patients per month. In total, 41 patients were allocated to open surgery, 43 to the
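The recruitment figures reported in this abstract can be reproduced with simple arithmetic (a quick check of the stated percentages against the 2.5-patients-per-month feasibility threshold):

```python
# Figures from the abstract: 263 patients assessed over 21 months of
# recruitment, 135 eligible, 104 consented.
assessed, eligible, consented, months = 263, 135, 104, 21

eligible_pct = 100 * eligible / assessed   # roughly 51%
consent_pct = 100 * consented / eligible   # roughly 77%
rate = consented / months                  # roughly 5 patients per month

# The pilot's feasibility threshold was at least 2.5 recruits per month.
print(round(eligible_pct), round(consent_pct), round(rate, 1), rate >= 2.5)
```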

  15. Phenomenology in minimal theory of massive gravity

    NASA Astrophysics Data System (ADS)

    De Felice, Antonio; Mukohyama, Shinji

    2016-04-01

    We investigate the recently introduced minimal theory of massive gravity (MTMG). After reviewing the original construction based on its Hamiltonian in the vielbein formalism, we reformulate it in terms of its Lagrangian in both the vielbein and the metric formalisms. It then becomes obvious that, unlike previous attempts in the literature of Lorentz-violating massive gravity, not only the potential but also the kinetic structure of the action is modified from the de Rham-Gabadadze-Tolley (dRGT) massive gravity theory. We confirm that the number of physical degrees of freedom in MTMG is two at fully nonlinear level. This proves the absence of various possible pathologies such as superluminality, acausality and strong coupling. Afterwards, we discuss the phenomenology of MTMG in the presence of a dust fluid. We find that on a flat homogeneous and isotropic background we have two branches. One of them (self-accelerating branch) naturally leads to acceleration without the genuine cosmological constant or dark energy. For this branch both the scalar and the vector modes behave exactly as in general relativity (GR). The phenomenology of this branch differs from GR in the tensor modes sector, as the tensor modes acquire a non-zero mass. Hence, MTMG serves as a stable nonlinear completion of the self-accelerating cosmological solution found originally in dRGT theory. The other branch (normal branch) has dynamics which depend on the time-dependent fiducial metric. For the normal branch, the scalar sector differs from that of GR, even though, as in GR, only one scalar mode is present (due to the dust fluid), and, in general, structure formation will follow a different phenomenology. The tensor modes will be massive, whereas the vector modes, for both branches, will have the same phenomenology as in GR.

  16. Planetary protection: elements for cost minimization

    NASA Astrophysics Data System (ADS)

    Debus, Andre

    2003-11-01

    In line with the UN Outer Space Treaty (Article IX of the Outer Space Treaty, London/Washington, 27 January 1967) and with COSPAR recommendations, for ethical, safety and scientific reasons, exploration of the solar system needs to comply with planetary protection constraints in order to avoid contamination of extraterrestrial bodies, particularly biological contamination by terrestrial microorganisms. It is also required to protect the Earth from possible contamination carried by returning systems or samples. The search for life in extraterrestrial samples, in situ or in the frame of sample-return missions, must be conducted so that conclusions can be stated with the maximum possible confidence, because the discovery or non-discovery of life in a sample has a direct impact on updates to planetary protection specifications for future missions. This last requirement must consequently also be implemented in order to preserve the properties of extraterrestrial samples, thereby also indirectly protecting exobiological science. These constraints impose unusual requirements on project teams involved in such solar system exploration missions: requirements based on hardware sterilization, sterile integration, organic cleanliness, microbiological and cleanliness control, the use of high-reliability systems in order to avoid crashes, the definition of specific trajectories and their control, recontamination prevention, etc. Implementation of such requirements induces costs that are difficult to estimate but can be significant, depending on the solar system target and the mission definition (fly-by, orbiter or lander). The cost impact of a planetary protection program can be important if some basic rules are not taken into account early enough; consequently, based on past experience, some recommendations are proposed here in order to manage such programs properly and to minimize their cost.

  17. Software Replica of Minimal Living Processes

    NASA Astrophysics Data System (ADS)

    Bersini, Hugues

    2010-04-01

    There is a long tradition of software simulation in theoretical biology to complement pure analytical mathematics, which is often too limited to reproduce and understand the self-organization phenomena resulting from the non-linear and spatially grounded interactions of a huge number of diverse biological objects. Since John von Neumann's and Alan Turing's pioneering work on self-replication and morphogenesis, proponents of artificial life have chosen to resolutely neglect a lot of materialistic and quantitative information deemed not indispensable, and have focused on the rule-based mechanisms making life possible, supposedly neutral with respect to their underlying material embodiment. Minimal life begins at the intersection of a series of processes which need to be isolated, differentiated and duplicated as such in computers. Only developing and running software makes it possible to understand how these processes are intimately interconnected in order for life to appear at the crossroad. In this paper, I will attempt to set out the history of life as the disciples of artificial life understand it, by placing these different lessons on a temporal and causal axis, showing which one is indispensable to the appearance of the next and how it connects to the next. I will discuss the task of artificial life as setting up experimental software platforms where these different lessons, whether taken in isolation or together, are tested, simulated, and, more systematically, analyzed. I will sketch some of these existing software platforms: chemical reaction networks, Varela's autopoietic cellular automata, and Ganti's chemoton model, whose running delivers interesting take-home messages to open-minded biologists.

  18. Software replica of minimal living processes.

    PubMed

    Bersini, Hugues

    2010-04-01

    There is a long tradition of software simulation in theoretical biology to complement pure analytical mathematics, which is often too limited to reproduce and understand the self-organization phenomena resulting from the non-linear and spatially grounded interactions of a huge number of diverse biological objects. Since John von Neumann's and Alan Turing's pioneering work on self-replication and morphogenesis, proponents of artificial life have chosen to resolutely neglect a lot of materialistic and quantitative information deemed not indispensable, and have focused on the rule-based mechanisms making life possible, supposedly neutral with respect to their underlying material embodiment. Minimal life begins at the intersection of a series of processes which need to be isolated, differentiated and duplicated as such in computers. Only developing and running software makes it possible to understand how these processes are intimately interconnected in order for life to appear at the crossroad. In this paper, I will attempt to set out the history of life as the disciples of artificial life understand it, by placing these different lessons on a temporal and causal axis, showing which one is indispensable to the appearance of the next and how it connects to the next. I will discuss the task of artificial life as setting up experimental software platforms where these different lessons, whether taken in isolation or together, are tested, simulated, and, more systematically, analyzed. I will sketch some of these existing software platforms: chemical reaction networks, Varela's autopoietic cellular automata, and Ganti's chemoton model, whose running delivers interesting take-home messages to open-minded biologists. PMID:20204519

  19. Beyond minimal lepton-flavored Dark Matter

    NASA Astrophysics Data System (ADS)

    Chen, Mu-Chun; Huang, Jinrui; Takhistov, Volodymyr

    2016-02-01

    We consider a class of flavored dark matter (DM) theories in which dark matter interacts with the Standard Model lepton fields at the renormalizable level. We allow for a general coupling matrix between the dark matter and leptons whose structure goes beyond the one permitted by the minimal flavor violation (MFV) assumption. It is assumed that this is the only new source of flavor violation in addition to the Standard Model (SM) Yukawa interactions. The setup can be described by augmenting the SM flavor symmetry by an additional SU(3)_χ, under which the dark matter χ transforms. This framework is especially phenomenologically rich, due to possible novel flavor-changing interactions which are not present within the more restrictive MFV framework. As a representative case study of this setting, which we call "beyond MFV" (BMFV), we consider Dirac fermion dark matter which transforms as a singlet under the SM gauge group and a triplet under SU(3)_χ. The DM fermion couples to the SM lepton sector through a scalar mediator ϕ. Unlike the case of quark-flavored DM, we show that there is no Z_3 symmetry within either the MFV or BMFV settings which automatically stabilizes the lepton-flavored DM. We discuss constraints on this setup from flavor-changing processes, the DM relic abundance, as well as direct and indirect detection. We find that relatively large flavor-changing couplings are possible while the dark matter mass remains within the phenomenologically interesting region below the TeV scale. Collider signatures which could be searched for at lepton and hadron colliders are discussed. Finally, we discuss the implications for decaying dark matter, which can arise if an additional stabilizing symmetry is not imposed.
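    The coupling structure this abstract describes can be written schematically as follows (an illustrative sketch in notation of my own choosing, not taken from the paper):

    ```latex
    % Generic renormalizable interaction for lepton-flavored DM:
    % \chi_j is the SU(3)_\chi triplet of Dirac DM fermions, \ell_i the SM
    % lepton, and \phi the scalar mediator; \lambda is the coupling matrix.
    \mathcal{L}_{\text{int}} \supset \lambda_{ij}\,\bar{\ell}_i\,\chi_j\,\phi + \text{h.c.}
    ```

    Under MFV the matrix λ would be restricted to combinations built from the SM lepton Yukawa couplings, while in the BMFV setting of the abstract λ is a general 3×3 matrix; its off-diagonal entries are what generate the flavor-changing processes the constraints apply to.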

  20. Minimally invasive follicular thyroid carcinomas: prognostic factors.

    PubMed

    Stenson, Gustav; Nilsson, Inga-Lena; Mu, Ninni; Larsson, Catharina; Lundgren, Catharina Ihre; Juhlin, C Christofer; Höög, Anders; Zedenius, Jan

    2016-08-01

    Although minimally invasive follicular thyroid carcinoma (MI-FTC) is regarded as an indolent tumour, treatment strategies remain controversial. Our aim was to investigate the outcome for patients with MI-FTC and to identify prognostic parameters to facilitate adequate treatment and follow-up. This retrospective follow-up study involved all cases of MI-FTC operated on at the Karolinska University Hospital between 1986 and 2009. Outcome was analysed using death from MI-FTC as the endpoint. Fifty-eight patients (41 women and 17 men) with MI-FTC were identified. The median follow-up time was 140 (range 21-308) months. Vascular invasion was observed in 36 cases and was associated with larger tumour size [median 40 (range 20-76) mm compared with 24 (10-80) mm for patients with capsular invasion only; P = 0.001] and older age [54 (20-92) vs. 44 (11-77) years; P = 0.019]. Patients with vascular invasion were more often treated with thyroidectomy (21/36 compared with 7/22 with capsular invasion only; P = 0.045). Five patients died from metastatic disease of FTC after a median follow-up of 114 (range 41-193) months; all were older than 50 years (51-72) at the time of the initial surgery, vascular invasion was present in all tumours, and all but one were treated with thyroidectomy. Univariate analysis identified combined capsular and vascular invasion (P = 0.034), age at surgery ≥50 years (P = 0.023) and male gender (P = 0.005) as factors related to the risk of death from MI-FTC. MI-FTC should not be considered a purely indolent disease. Age at diagnosis and the existence of combined capsular and vascular invasion were identified as important prognostic factors. PMID:26858184